
MikeK's software notebook

What you will find

This used to be the place where I wrote stuff I was thinking about while working on the Mozilla project.

Maybe in the near future I'll start to update it again as I'm involved in a couple of new open-source projects - updates pending...

GStreamer Fennec integration status

GStreamer Posted on 07 Jul, 2009 22:27:16

I have now been working on the GStreamer integration in Fennec for some time, and it is time for a status update on it.

The integration is going well, but it has been haunted by some issues, mainly to do with GStreamer behaving differently on the PC and on the device. It is currently unknown to me how much of this can be attributed to the fact that the version of the GStreamer library differs in the two cases, and how much is due to other factors.

My target device, the Nokia N810, comes with version 0.10.13 of the library, while the current version on my PC is 0.10.22.

The first iteration of the integration was based on work done by doublec (see Bug 22540, which holds the history of the work), using playbin as the decoder.

On the device, the result of using playbin was that the audio part of the video played back as expected, but the video lagged behind (i.e. it didn’t play back at the proper frame rate). Another issue with that solution was that playbin used the GStreamer network routines to fetch data from the Internet, whereas in Fennec/Firefox we would like to use Necko as the source of data.

So I wrote a native element that first communicated directly with Necko, and later abstracted that away from the basic Necko interface to use nsMediaStream as the source of data. This element functions as the data source in the GStreamer pipeline that is built when media content should be played.
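The shape of such a source element can be sketched without any GStreamer machinery. In this simplification (all names are hypothetical; the real element wraps nsMediaStream and hands chunks to the pipeline as GstBuffers), the element’s per-buffer callback pulls the next chunk from the stream and reports end-of-stream when the data runs out:

```cpp
#include <cstring>
#include <string>
#include <vector>

// Hypothetical stand-in for nsMediaStream: the real element reads from
// Necko/nsMediaStream, not from an in-memory string.
class FakeMediaStream {
public:
    explicit FakeMediaStream(std::string data) : mData(std::move(data)) {}
    // Read up to aCount bytes from the current position; returns bytes read.
    size_t Read(char* aBuf, size_t aCount) {
        size_t avail = mData.size() - mPos;
        size_t n = avail < aCount ? avail : aCount;
        std::memcpy(aBuf, mData.data() + mPos, n);
        mPos += n;
        return n;
    }
private:
    std::string mData;
    size_t mPos = 0;
};

// Sketch of the source element's per-buffer callback: pull one chunk
// from the stream and append it to the outgoing buffer. Returns false
// at end-of-stream (where real GStreamer 0.10 code would return a
// GstFlowReturn instead of a bool).
bool PullChunk(FakeMediaStream& stream, std::vector<char>& outBuffer,
               size_t chunkSize = 4096) {
    std::vector<char> tmp(chunkSize);
    size_t n = stream.Read(tmp.data(), chunkSize);
    if (n == 0) return false;
    outBuffer.insert(outBuffer.end(), tmp.begin(), tmp.begin() + n);
    return true;
}
```

The point of the indirection is that the pipeline never talks to the network itself; it only sees whatever the stream abstraction hands it.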

Along the way I have also moved from using decodebin to decodebin2, as the folks over at #gstreamer told me that decodebin won’t be able to handle the audio playback as needed in the version found on the N810 (as a side note, the original playbin solution also uses decodebin2 internally).


Talking about audio, let me explain some of the audio issues that I have noticed. During development I have been testing almost exclusively with MPEG clips, as these were the first ones that came up when I was looking for something to test with. There is no default GStreamer element on the N810 that can decode the audio part of these to a raw format, which means that decodebin and decodebin2, if left alone, will just send an “unknown-type” signal and leave the source bin with the audio stream dangling.

I haven’t found a way to link the “unknown-type” pads to anything – but with help from one of the guys on the #gstreamer IRC channel I got it working by using decodebin2 and the “autoplug-continue” signal.

The “autoplug-continue” signal is emitted every time a new source pad is found, and depending on the return value from your signal handler, it will either continue trying to decode the stream or link the pad to itself and inform you about this with the “new-decoded-pad” signal.

Different behavior and the problem with volume control

One difference in behavior between the PC and the target is that on the PC, decodebin2 finds elements that can decode the audio part of an MPEG stream to a raw format, while no such decoder is found on the N810.

On the N810 the decoding of the audio/mpeg stream is done by a special element, “dspmp3sink”, that also takes care of the actual audio playback – this sink isn’t considered by decodebin2 – so the trick on the target is to use the “autoplug-continue” signal as described above and abandon the autoplug process when an audio stream is detected.
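The decision the handler makes can be boiled down to a tiny pure function over the media type. This is a simplification of what an “autoplug-continue” handler would do (the real handler receives the pad’s GstCaps rather than a plain string, and the function name here is made up):

```cpp
#include <string>

// Sketch of the autoplug-continue decision on the N810: return true to
// let decodebin2 keep autoplugging, false to stop so the still-compressed
// pad can be linked straight to a sink like dspmp3sink. In the real
// handler the media type string comes from the pad's GstCaps.
bool ShouldAutoplugContinue(const std::string& mediaType) {
    // Compressed audio is handed to the DSP sink untouched, so stop
    // autoplugging as soon as an audio stream is detected.
    return mediaType.compare(0, 6, "audio/") != 0;
}
```

With this in place, decodebin2 exposes the audio/mpeg pad as-is (via “new-decoded-pad”) instead of signalling “unknown-type”.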

There is one important difference between the PC and the N810 here, though… on the PC we get a raw audio stream that can be linked to various audio-manipulating elements, like volume control; on the N810 it’s the MPEG audio stream we get out of the decodebin2 element, and you can’t link this stream to the volume control element (it expects a raw stream of samples it can scale, not a compressed stream).

I’m sure there is a way around this, but I’m also sure that I haven’t found it yet 🙂

Another thing about audio: the version of the integration that I currently have on my computer is hard-coded to use the “dspmp3sink” element; if the audio format isn’t supported by this sink element, playback will fail.

Drawing video frames

Initially I forwarded an invalidate event to the main thread for each video frame that was decoded by decodebin2 – this had an unwanted effect, as the decoding and the displaying engine ran in two different threads.

The unwanted effect was that a handful of frames might be decoded before the drawing thread started to draw; it would then invalidate the screen as many times as there were piled-up invalidate events – not the best use of CPU cycles 🙂 That said, I can’t say whether it actually resulted in the same number of redraws, as they should be coalesced until the screen is actually redrawn.

The current solution ensures that only one invalidate is ever pending, but it looks like GStreamer is still trying to decode every video frame, which in turn takes its share of the CPU cycles. It would be better to skip at least the color-space conversion for the frames that aren’t going to be shown anyway (in order to keep the CPU load low enough to keep the duration of the video correct and the video in sync with the audio).
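The “only one invalidate pending” idea can be sketched with an atomic flag (a hypothetical helper, not the actual Fennec code, which uses Mozilla’s own threading primitives): the decoder thread asks before posting, and only the first request between repaints actually posts an event.

```cpp
#include <atomic>

// Sketch: the decoder thread calls RequestInvalidate() once per frame,
// but an invalidate event is posted to the main thread only when none
// is already in flight. The main thread calls OnInvalidateHandled()
// when it has repainted.
class InvalidateCoalescer {
public:
    // Returns true if the caller should actually post an invalidate
    // event to the main thread.
    bool RequestInvalidate() {
        bool expected = false;
        return mPending.compare_exchange_strong(expected, true);
    }
    void OnInvalidateHandled() { mPending.store(false); }
private:
    std::atomic<bool> mPending{false};
};
```

Frames decoded while an invalidate is in flight simply don’t generate new events, which is exactly the coalescing behavior described above.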

One could argue that the above is expected, as using a fakesink – which is currently the way the video frames are extracted from the pipeline – is considered a “hack” in the GStreamer docs; the recommended solution is to write a dedicated element to do this, which must be my next task 🙂

This might also fix an issue I see sometimes, where the audio is playing back, but there is absolutely no update of the screen until the very last frame of the video.

Happy me ;)

GStreamer Posted on 16 Apr, 2009 22:17:06

So why am I happy? Well, first of all, I went to the dentist today, so I don’t have to think about dentists until next winter 🙂 And secondly, after several working days of going around in small circles trying to figure out the build system, GStreamer, etc., I now have my own source plugin that registers with the GStreamer framework and, most importantly, can be created by the GStreamer factory call like:

GstElement *streamerSource = gst_element_factory_make("nsGStreamSource", nsnull);

I think that with this I have passed an important step in the process 🙂

I also made some changes for improved error handling, prepared the code that I build on top of for multiple video sources running simultaneously – and, last but not least, fixed the cleanup code so it nicely releases the resources that it has acquired.

I’ll try to make a “GStreamer plugins for dummies” page while I still remember what I did 🙂


undefined reference to `gst_push_src_get_type’

GStreamer Posted on 16 Apr, 2009 09:36:20

I’m extending the GstPushSrc base class for an integration of GStreamer with the Necko network library.

Creating an extension is not as trivial as inheritance in C++, since GStreamer uses the GObject object model. Still, looking at the documentation and examples, I made an attempt – hit compile, and got the above linker error.

To investigate what went wrong, I did a project-wide search for the function; it was found in the header file gstpushsrc.h and nowhere else, and only as the prototype “GType gst_push_src_get_type(void);”

Looking some more at the code, I found that the GST_BOILERPLATE macro I was using generated the *_get_type function for my class and made a call to the base class’ *_get_type function – and voilà – there was a GST_BOILERPLATE_FULL macro in gstpushsrc.c also generating a *_get_type function.

I was now wondering: I have the prototype and the implementation, so why is the code not linking? Well, I knew that the file holding the implementation was not included in the build – so for a while I considered adding it to my build, but it felt like cheating, and indeed it is not the way to do it.

GstPushSrc is not part of the core GStreamer library; it’s part of the base plug-in library. The core library, gstreamer-0.10, was already added in the root of the build directory, so this seemed like a good place to add the base library too.

The remaining problem was then to find the name of the library… I’ll spare you the story of my search – and just note that it is gstreamer-base-0.10. Adding this to my file fixed the linking issue, and I’m now ready to continue 🙂
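For anyone hitting the same linker error: with pkg-config available, the compile and link flags for both libraries can be queried like this (a sketch – the exact place to add the flags depends on your build setup):

```shell
# Query compile and link flags for the core and base libraries (0.10 series).
pkg-config --cflags --libs gstreamer-0.10 gstreamer-base-0.10
# The link flag that resolves gst_push_src_get_type is -lgstbase-0.10.
```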

How I’m gonna do it

GStreamer Posted on 07 Apr, 2009 21:18:38

I updated the GStreamer page with a description of the current setup in the nsGStreamer.cpp file and what I am changing it into. I’ll post here when there are significant updates to that page.

GStreamer running (again)

GStreamer Posted on 03 Apr, 2009 09:08:13

After upgrading the base to TOT (top of tree) at the beginning of the week, I now have it running again with the GStreamer backend.

I tried building without debug information and with optimizations turned on, it gave a significant improvement in the speed, but I guess that was expected 😉

While playing with the device, I noticed that the audio and the video stream are out of sync – I hadn’t noticed this before because I had only tested with video-only MPEGs.

/memory/jemalloc/jemalloc.c:4511: Failed assertion: “(mapelm->bits & CHUNK_MAP_ALLOCATED) != 0”

GStreamer Posted on 01 Apr, 2009 11:11:54

So after upgrading to the latest version of the Mozilla code last week, I started to get an assertion when using GStreamer to decode video on the N810. Debugging is usually much nicer to do on the PC, so I set up building Fennec for Linux. I had to repeat most of the steps that I had done in scratchbox to make it compile with GStreamer, but since it’s all Linux it wasn’t much of an issue to just repeat them.

Sure enough, the same error appeared on the laptop as I had seen on the device (lucky me). I didn’t have much luck attaching the debugger to Fennec after it had asserted, but when I redid the test starting Fennec from the debugger, it was easy to see the call stack.

It seemed to be related to a delete of the buffer given to nsMediaDecoder::SetRGBData – diffing the old version of this function (where the code was working) with the new version showed something interesting.

Originally the buffer given to SetRGBData had been copied to an internal buffer, meaning the caller of SetRGBData kept ownership of the buffer. In the new version there is no longer a copy; the pointer to the buffer is stored in an nsAutoArrayPtr, meaning that the next time this pointer is assigned something, the previous content is deleted – ahh… we start to see a reason for the fault here, don’t we?

Previously it was the responsibility of the caller to delete the buffer; this has now been handed over to the called function, but since the GStreamer code hasn’t been updated to reflect this, there is a conflict of ownership.
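The bug pattern reduces to a few lines. Here is a sketch of the new contract, using std::unique_ptr as a stand-in for nsAutoArrayPtr and made-up names throughout: once the callee takes ownership, the caller must release its own handle instead of deleting the buffer itself (the old GStreamer code still deleted it, hence the double free that jemalloc’s assertion caught).

```cpp
#include <cstddef>
#include <memory>

// Stand-in for the decoder side: after the change, SetRGBData takes
// ownership of the buffer (nsAutoArrayPtr in the real code).
class Decoder {
public:
    void SetRGBData(unsigned char* aBuffer) {
        mBuffer.reset(aBuffer);  // deletes any previously held buffer
    }
    bool HasBuffer() const { return mBuffer != nullptr; }
private:
    std::unique_ptr<unsigned char[]> mBuffer;
};

// Correct caller under the new contract: allocate, hand over ownership,
// and do NOT delete afterwards.
void PushFrame(Decoder& decoder, size_t size) {
    auto buffer = std::make_unique<unsigned char[]>(size);
    decoder.SetRGBData(buffer.release());  // ownership transferred
    // buffer must not be touched here any more
}
```

Under the old contract the caller would have deleted its copy after the call; doing that against the new code frees the same allocation twice, which is exactly what the jemalloc assertion flags.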

Realising what the problem is, is the first step, now I need to fix it 😉

GStreamer running in Fennec

GStreamer Posted on 26 Mar, 2009 15:26:51

I have just seen the first embedded video clip in Fennec using GStreamer as the decoder. It played a small mpeg video clip – the performance is still lacking a bit but there are probably some things we can do to speed it up.

Seeing the video play at least verifies that I have done something right in the process of updating the GStreamer integration.

Things still on the ToDo list (in the order I plan to do them):

1) Update the patch to the latest version of the Mozilla source (I’m running on a base that is about 1½ weeks old)
2) Create a new GStreamer source using our own network routines – the current integration leaves it up to GStreamer itself to load the data from the network
3) Do a runtime check on the data/MIME type to identify if GStreamer is able to handle the data
4) Fix the rest of the TODOs I left around in the code
5) Get a second opinion on a couple of choices made in the integration
6) Attach the patch to the bug report
7) Optimize
8) Attach a new patch to the bug in Bugzilla
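Item 3 on the list could start out as a simple lookup against the types the backend is known to handle. A sketch with a hypothetical function name, and an illustrative (not actual) set of types – a fuller solution would query the installed GStreamer plugins instead of a fixed list:

```cpp
#include <array>
#include <string>

// Sketch of a runtime capability check: only claim MIME types the
// GStreamer backend is known to decode. The entries are illustrative.
bool CanHandleMimeType(const std::string& mimeType) {
    static const std::array<const char*, 3> kSupported = {
        "video/mpeg", "audio/mpeg", "video/mp4"
    };
    for (const char* type : kSupported)
        if (mimeType == type) return true;
    return false;
}
```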

How is the decoder initialised?

GStreamer Posted on 24 Mar, 2009 22:43:37

So there I was, getting the compile going, when I realised that I actually didn’t know what kind of framework it was that I was going to make this library work with. So I started to backtrace the code from the point where the codec is created – which resulted in these notes.

The idea is that I’ll start by documenting my understanding of each of the functions in the nsMediaDecoder interface that I need to modify to get the compile going; later I might expand this to a bigger subset of the functions in the interface. The principle is that if I find something, I’d better write it down – if it is trivial, it’s fast to write down; if it isn’t trivial, I’ll probably be happy that I wrote it down if I need to go back to the function at a later time 🙂

I made a document to keep track of the individual functions (it’s blank as I write this) that can be found here.
