I posted a while back "complaining" about the lack of a Theora player on the
iPhone. Porting the code for libtheora (and libogg/libvorbis) was
(relatively) painless, and it appears to be working. I'm up against a tougher
challenge now... rendering the video! I tried using the brand-new SoC-funded
SDL port to the iPhone to get the "player_example.c" code up and running, but
ran into a major roadblock: the SDL port to the iPhone uses an OpenGL ES
driver, and OpenGL drivers for SDL don't support YUV overlays! So the problem
(at least for now) is how I can get the YUV-encoded frames from a Theora
movie rendered on the iPhone.

The options I've considered are:
a) convert from YUV to RGB in code (this probably won't work for realtime
decoding)
b) convert from YUV to RGB using pixel shading (if it sounds like I don't
know what I'm talking about, that's correct... I've simply seen smarter
people talk about doing the conversion on the GPU using pixel shaders?)
c) convert the frames to still images in a format that the iPhone can
display natively (doesn't JPEG use the same YUV encoding that Theora does?)
and render them as a series of images (I have no clue if this is feasible,
or if the conversion/decoding would be worse than just doing a or b above)
d) something else?
e) give up!

So any feedback would be great! On a positive note, the iPhone SDK's NDA
should be lifted very soon. On a negative note, the guy who ported VLC to
the iPhone (google vlc4iphone) hasn't released any of the source... so I
can't even cheat to see how he's doing it, and asking him directly hasn't
worked.

Steve
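For reference, a minimal sketch of what option (a) looks like in plain C,
assuming the planar I420 (YUV 4:2:0) layout and per-plane strides that
libtheora's decoder returns; the function name and the integer BT.601
coefficients here are illustrative, not code from this thread:

#include <stdint.h>

/* Clamp an intermediate result into the 0..255 range. */
static uint8_t clamp_u8(int v)
{
    return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
}

/*
 * Convert one I420 frame to packed RGB24.  y/u/v point to the three
 * planes, *_stride are their row strides in bytes, and rgb must hold
 * width*height*3 bytes.  Integer BT.601 "video range" coefficients.
 */
void yuv420_to_rgb24(const uint8_t *y, int y_stride,
                     const uint8_t *u, int u_stride,
                     const uint8_t *v, int v_stride,
                     uint8_t *rgb, int width, int height)
{
    int row, col;
    for (row = 0; row < height; row++) {
        for (col = 0; col < width; col++) {
            int Y = y[row * y_stride + col] - 16;
            int U = u[(row / 2) * u_stride + (col / 2)] - 128;
            int V = v[(row / 2) * v_stride + (col / 2)] - 128;

            int r = (298 * Y + 409 * V + 128) >> 8;
            int g = (298 * Y - 100 * U - 208 * V + 128) >> 8;
            int b = (298 * Y + 516 * U + 128) >> 8;

            uint8_t *px = rgb + (row * width + col) * 3;
            px[0] = clamp_u8(r);
            px[1] = clamp_u8(g);
            px[2] = clamp_u8(b);
        }
    }
}

The three multiplies per sample and the chroma indexing in this inner loop
are exactly the work that the SIMD and table-based variants discussed in the
replies below try to cheapen.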
On Tue, Oct 7, 2008 at 12:56 PM, Steven Woolley <woobert at gmail.com> wrote:

> I posted a while back "complaining" about lack of a theora player on the
> iPhone. Porting the code for libtheora (and libogg/libvorbis) was
> (relatively) painless, and appears to be working.

Awesome!

> The SDL port to iphone uses an opengl-es driver and opengl drivers for SDL
> don't support YUV overlays!

Oops.

> a) convert from YUV to RGB in code (this probably won't work for realtime
> decoding)

There's code for this in a lot of places.
http://svn.annodex.net/liboggplay/trunk/src/liboggplay/oggplay_yuv2rgb.c
is one example. Does the iPhone's ARM have an MMX unit?

> b) convert from YUV to RGB using pixel shading (If it sounds like I don't
> know what I'm talking about, that's correct... I've simply seen smarter
> people talk about doing the conversion on the GPU using pixel shading?)

If you can do this with OpenGL ES and get GPU acceleration, this is the way
to go. Then backport your code to the SDL port so everyone else can use it.
I don't know how to do this, but there are lots of examples. You could try
asking macslow for help if you get stuck; he's done a lot of OpenGL media
player work.

> c) convert the frames to still images in a format that the iphone can
> display natively (doesn't jpeg use the same YUV encoding that theora does?)
> and render them as a series of images (I have no clue if this is feasible,
> or if the conversion/decoding would be worse than just doing a or b above)

You could use the same pixel format, but you'd still have to encode each
frame, so this isn't going to be a realtime option either. And static image
display pipelines often aren't fast enough for video.

> On a negative note, the guy that ported vlc to iphone (google vlc4iphone)
> hasn't released any of the source... so I can't even cheat to see how he's
> doing it... and asking him directly hasn't worked.

You could try dumping the link symbols from the binary to see what API it's
using. This is especially helpful if it's an undocumented interface.

HTH,
 -r
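To make the GPU idea concrete: on hardware with a programmable fragment
pipeline (OpenGL ES 2.0), the colour-space conversion can be done per
fragment, with the Y, U and V planes uploaded as three single-channel
textures. The sketch below is a hypothetical GLSL ES shader kept as a C
string; whether the current iPhone GPU exposes programmable shaders at all
is exactly the open question in this thread, so treat it as an illustration
of the technique, not a tested iPhone solution, and note that the uniform
and varying names are made up.

/* Hypothetical fragment shader for YUV (BT.601) -> RGB conversion.
 * Assumes an OpenGL ES 2.0 context with the Y, U and V planes bound as
 * three GL_LUMINANCE textures. */
static const char *yuv_to_rgb_fragment_src =
    "precision mediump float;\n"
    "uniform sampler2D tex_y;\n"
    "uniform sampler2D tex_u;\n"
    "uniform sampler2D tex_v;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    float y = 1.164 * (texture2D(tex_y, v_texcoord).r - 0.0625);\n"
    "    float u = texture2D(tex_u, v_texcoord).r - 0.5;\n"
    "    float v = texture2D(tex_v, v_texcoord).r - 0.5;\n"
    "    gl_FragColor = vec4(y + 1.596 * v,\n"
    "                        y - 0.391 * u - 0.813 * v,\n"
    "                        y + 2.018 * u,\n"
    "                        1.0);\n"
    "}\n";

If this route works, the per-frame draw path reduces to updating three
textures and drawing one textured quad, which is also what would make it
worth backporting into the SDL port as suggested above.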
On 10/7/08, Steven Woolley <woobert at gmail.com> wrote:

> I posted a while back "complaining" about lack of a theora player on the
> iPhone. Porting the code for libtheora (and libogg/libvorbis) was
> (relatively) painless, and appears to be working.

Hey, cool! You should send patches upstream.

> The SDL port to iphone uses an opengl-es driver and opengl drivers for SDL
> don't support YUV overlays!

I reckon video players on the iPhone aren't using SDL, but probably
something specific to the platform. If you can find out what it is, that
would probably be the best solution.

> a) convert from YUV to RGB in code (this probably won't work for realtime
> decoding)

Of the three ideas this one seems feasible, IMO.

> On a negative note, the guy that ported vlc to iphone (google vlc4iphone)
> hasn't released any of the source...

What the--? That's illegal; VLC is under GPLv2+.

 -Ivo
On Oct 7, 2008, at 1:56 PM, Steven Woolley wrote:

> The SDL port to iphone uses an opengl-es driver and opengl drivers
> for SDL don't support YUV overlays! So the problem (at least for
> now) is how I can get the YUV encoded frames from a theora movie
> rendered on the iPhone?
>
> The options I've considered are:
> a) convert from YUV to RGB in code (this probably won't work for
> realtime decoding)
> b) convert from YUV to RGB using pixel shading (If it sounds like I
> don't know what I'm talking about, that's correct... I've simply
> seen smarter people talk about doing the conversion on the GPU using
> pixel shading?)
> c) convert the frames to still images in a format that the iphone
> can display natively (doesn't jpeg use the same YUV encoding that
> theora does?) and render them as a series of images (I have no clue
> if this is feasible, or if the conversion/decoding would be worse
> than just doing a or b above)
> d) something else?
> e) give up!
>
> So any feedback would be great!

Conversion from YUV to RGB is probably an order of magnitude less intensive
than Theora decoding, so if you got that working in realtime and you have a
bit of CPU juice left you can easily do it. The order of preference should
be:

- Pixel shaders. If you can offload to the GPU, then by all means do it. I'm
not familiar enough with the PowerVR MBX chip that's in the iPhone to know
whether it has the chops, but I doubt it.

- A SIMD version of the standard YUV to RGB conversion algorithm. The
iPhone's version of the ARM core has SIMD instructions, but most of the code
you'll find online is written for x86 MMX, so you'll need to adapt it.

- Plain, old-school, optimized C code. For an example that uses
pre-calculated tables and only basic operations (add, OR and bit shifts),
take a look here:
http://iaxclient.svn.sourceforge.net/viewvc/iaxclient/branches/2.0/lib/video.c?revision=1199&view=markup
Look for the iaxc_YUV420_to_RGB32() function.

You might actually want to profile the last two; I'm not sure that the SIMD
version would be so much faster...

Hope this helps,
Mihai
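A rough sketch of the table-driven idea described above, precomputing the
multiplies so the per-pixel work is only lookups, adds, shifts and a clamp.
This is an illustration of the technique, not the iaxclient code, and the
table and function names are made up for the sketch:

#include <stdint.h>

/* One 256-entry table per multiply in the BT.601 conversion, filled once
 * at startup so the per-pixel path contains no multiplications. */
static int32_t tab_298y[256];   /* 298 * (Y - 16)          */
static int32_t tab_409v[256];   /* 409 * (V - 128), red    */
static int32_t tab_100u[256];   /* 100 * (U - 128), green  */
static int32_t tab_208v[256];   /* 208 * (V - 128), green  */
static int32_t tab_516u[256];   /* 516 * (U - 128), blue   */

static void init_yuv_tables(void)
{
    int i;
    for (i = 0; i < 256; i++) {
        tab_298y[i] = 298 * (i - 16);
        tab_409v[i] = 409 * (i - 128);
        tab_100u[i] = 100 * (i - 128);
        tab_208v[i] = 208 * (i - 128);
        tab_516u[i] = 516 * (i - 128);
    }
}

static uint8_t clamp8(int v)
{
    return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
}

/* Pack one pixel as 0xAARRGGBB from its Y, U, V samples using only
 * table lookups, adds, shifts and ORs. */
static uint32_t yuv_to_rgb32_pixel(uint8_t y, uint8_t u, uint8_t v)
{
    int r = (tab_298y[y] + tab_409v[v] + 128) >> 8;
    int g = (tab_298y[y] - tab_100u[u] - tab_208v[v] + 128) >> 8;
    int b = (tab_298y[y] + tab_516u[u] + 128) >> 8;
    return 0xFF000000u
         | ((uint32_t)clamp8(r) << 16)
         | ((uint32_t)clamp8(g) << 8)
         |  (uint32_t)clamp8(b);
}

Whether this beats a hand-written SIMD loop on the iPhone's ARM core is
exactly the kind of thing worth profiling rather than assuming, as Mihai
says.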