in7y118@public.uni-hamburg.de
2007-Mar-22 03:06 UTC
[Swfdec] Integrating Swfdec and GStreamer
Hi,

I've recently been looking at integrating Swfdec and GStreamer. There are some issues I'd like advice on, in order of importance for me:

1) Using GStreamer codecs

Swfdec currently has a simple API for decoding data that looks like

  GstBuffer *decode_next_buffer (Decoder *dec, GstBuffer *input); [1]

It is used to decode video (VP6 and Sorenson Spark/H.263 to RGBA) and audio (MP3 and Nellymoser to 44 kHz raw audio). The decoding API can't include demuxers (like FLV), since those demuxers trigger script events [2], and I need to ensure that these events happen, and happen in the right order. Nor can the API pass the decoded data straight on to an output: video is rendered just like every other object, and as such can be rotated, scaled, affected by a clipping region, or placed below other objects, and audio can be modified by scripts or by the file itself (volume, panning). [3]

My idea for implementing this has been to spawn a pipeline like

  appsrc ! right/caps,go=here ! decodebin ! right/caps,go=here ! appsink

But as you can see above, the current Swfdec API decodes one buffer at a time and then wants all available output. Since the available output is not necessarily exactly one buffer (it is for video) but may be 0 to n buffers (in the MP3 case), how do I make sure I have collected all available output before returning from the decode function? Or even better: does anyone have an idea for how to design the Swfdec codec API so that it is better suited to APIs like GStreamer's?

2) Using GStreamer as audio output

The current audio API in Swfdec is modelled after how the official player works, and this approach has given me the best start/stop latency with ALSA. When a new audio stream [4] becomes available [5], a new ALSA PCM is opened, and when the stream is removed by the player, I close the PCM. This way (as opposed to mixing the audio myself), the open and close operations bypass the sound card's buffer and complete much faster. So I've been thinking about opening a new pipeline with appsrc !
autoaudiosink for every new stream and running it for as long as the stream plays. Since you can easily find Flash files that have 5 of these streams playing at once, I've been wondering whether this might cause problems for the various sound backends. I know it works on Linux/ALSA with dmix, and it will probably work on $SOUND_SERVER_OF_CHOICE, but what about Solaris or the BSDs? Any opinions on this?

3) Writing a Swfdec plugin for GStreamer

People have asked me why I removed the GStreamer plugin from Swfdec. The reason is that Swfdec now supports ActionScript, and ActionScript can do quite a lot to one's Flash files. This whole new can of worms would make lots of Flash files hard to get right, especially in the autoplugger. Since I don't have the time to hack on two output frontends and the GStreamer one was the problematic one, I removed the plugin. By the way, I think these problems would also apply to other formats interesting to GStreamer, for example SMIL. In no particular order:

- Flash files can load (multiple) files (at once). Currently there's no good way to request data from elsewhere. I've been told there is an event for requesting a different URI that some demuxers use, but it requires the current file to be complete and does not support more than one file.
- If Flash wants to open new files, it needs the URL of the input, both for resolving local URLs and for security management. Last I looked, the URL was not part of the (meta)data in the stream.
- Rendering a Flash movie requires a lot of drawing operations. Swfdec uses cairo for this. The only available way to use cairo in GStreamer is to render into image buffers, which is pretty much the slowest rendering method cairo offers.
- A lot of Flash files take 100% CPU and require frame dropping to get smooth audio playback and acceptable video. Back when I dropped the plugin, the QoS framework did not exist; I haven't checked whether GStreamer's QoS would be up to the task now.
- Modern Flash relies a lot on user input.
Particularly in 0.8, there was a lot of buffering inside pipelines, which made user input feel very unresponsive. I have no idea how much of that is still the case in 0.10.
- The audio issues pointed out in 2) might be interesting here too, in particular when thinking about autoplugging.

Nonetheless, I am very interested in a working GStreamer plugin for Swfdec, and I bet a lot of other people are, too. While designing the Swfdec API [6], I kept in mind that people want to use Swfdec inside GStreamer, so I hope there are no obstacles to implementing a GStreamer plugin. If there are, I'm of course open to changing the API, since it is by no means frozen. So if anyone is looking for an interesting project, this might be it. I'll gladly help anyone trying to make this happen, but I certainly don't have enough time to do it myself, since making Swfdec decode SWF files correctly already takes up 150% of my time.

Cheers,
Benjamin

[1] http://gitweb.freedesktop.org/?p=swfdec.git;a=blob;f=libswfdec/swfdec_codec.h
[2] http://brajeshwar.com/reference/as2/NetStream.html
[3] http://brajeshwar.com/reference/as2/Sound.html
[4] http://swfdec.freedesktop.org/documentation/swfdec/SwfdecAudio.html
[5] http://swfdec.freedesktop.org/documentation/swfdec/SwfdecPlayer.html#SwfdecPlayer-audio-added
[6] http://swfdec.freedesktop.org/documentation/swfdec/
On Thu, Mar 22, 2007 at 10:44:48AM +0100, in7y118@public.uni-hamburg.de wrote:

> - Flash files can load (multiple) files (at once). Currently there's
> no good way to request data from elsewhere. I've been told there is an
> event to request a different URI that is used by some demuxers, but
> that requires the current file to be complete and does not support
> more than one file.
> - If Flash wants to open new files, it needs the URL of the input for
> local URL resolving and for security management. Last I looked the URL
> was not part of the (meta)data in the stream.

These two reasons are why swfdec should really be 'swfplaybin'.

> - Rendering a Flash movie requires a lot of drawing operations. Swfdec
> uses cairo for this. The only available solution for cairo in
> GStreamer is to use image buffers. Rendering to images is pretty much
> the slowest rendering method cairo offers.

I've done some experimental work on replacing x[v]imagesink with a cairosink and passing a cairo context upstream for the decoder to draw on. It's a hack and doesn't work through queues and such, but it has the potential to morph into a decent solution. And it seamlessly used XRender as a backend.

> - A lot of Flash files take 100% CPU and require frame dropping to get
> smooth audio playback and OK video. Back when I dropped the plugin,
> the QoS framework did not exist. I haven't looked if the GStreamer QoS
> would be up to the task now.

It works quite well for video playback.

> - Modern Flash relies a lot on user input. Particular in 0.8 there was
> a lot of buffering taking place inside pipelines, which made user
> input feel very unresponsive. I have no idea how much that is still
> the case in 0.10.

Yeah, mouse control in "videotestsrc ! navigationtest ! xvimagesink" still feels a bit "sloppy". This comes from two problems in navigationtest: it waits until the next buffer arrives from videotestsrc before updating the picture with the new navigation info, and it does not flush the existing (and now stale) buffer out of xvimagesink. Thus, if buffers have timestamps 1.0, 2.0, 3.0, etc., an event at time 1.2 gets processed by navigationtest at approximately time 2.0+epsilon and is applied to buffer 3.0.

Also, GStreamer navigation still has the severe problem of not passing around enough information about keypresses. It *might* get "a" and "A" right, but probably not "#" and "?", or Tab and Shift-Tab.

dave...