With the public release of the OpenGL drivers, what is the best way to get the BeagleBoard's DSP to decode an H.264 video into a frame buffer, so I can pull those frames into an OpenGL application and texture them onto quads?
I have previously been using the Mistral EVM, but to do this it looks like you have to use TI's DMAI API; I would rather use a more standard API, along with a mainline kernel. I have been looking through the FAQs, and it looks like there might be several ways to do this. I was just wondering which way is the most developed and stable.
I have access to TI's demo codecs, so I just need a way to load them onto the DSP and get a frame buffer working in memory.
DMAI is still the preferred way; we're currently working in OpenEmbedded to update DMAI for 2.6.28.
AFAIK the only patches submitted for upstream inclusion are the ones
for the dsp-bridge, which are sitting in a separate branch.
I would suggest using GStreamer on the EVM to start with (since you
already have the EVM), and getting the sink modified to do texturing
instead of display. Later, when DMAI gets into OE, you should be able
to use it on a mainline kernel.
In any case, here are the steps:
- You need a colour conversion step to convert the YUV data to RGB for texturing.
- Implement the texturing in the sink.
- Manage the A/V sync changes, if audio is present.
And I presume you meant triangles, not quads; OpenGL ES does not support quads as a primitive.
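Since quads are gone, each textured quad becomes two triangles sharing an edge. A sketch of the index bookkeeping (the helper name is my own, not from any TI or Khronos API): given a quad's four corners numbered 0..3 counter-clockwise, glDrawElements with GL_TRIANGLES wants six indices.

```c
/* Split a quad (corners 0,1,2,3 counter-clockwise) into the two
 * triangles that OpenGL ES can draw with GL_TRIANGLES.  `base` is
 * the index of the quad's first vertex in the vertex array; `out`
 * receives the six triangle indices. */
void quad_to_triangles(unsigned short base, unsigned short out[6])
{
    out[0] = base + 0;  /* first triangle: 0-1-2 */
    out[1] = base + 1;
    out[2] = base + 2;
    out[3] = base + 2;  /* second triangle: 2-3-0 */
    out[4] = base + 3;
    out[5] = base + 0;
}
```

Alternatively, a GL_TRIANGLE_STRIP with four vertices draws the same quad and avoids the index buffer entirely.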