Using gst-dsp Video Encoding/Decoding

I'm running PTAMM on my BeagleBoard xM-revB, and it uses MPEG encoding/
decoding to stream video from a webcam while crunching data and
displaying mappings using OpenGL. Currently the video stream is very
slow, so I want to modify the program so that the DSP handles
all of the video streaming. I have gst-dsp set up and I have verified
that the DSP is working; I'm just wondering where to start with
integrating the gst-dsp encoder/decoder into my project. Any help
would be appreciated!


If you just need to stream video from the webcam, then you can use
some of the examples mentioned here [1] (using v4l2src instead of the
videotestsrc mentioned in the example). Otherwise, if you need to feed
your own data into the encoding/streaming pipeline, then I would
suggest starting with AppSrc [2,3,4] to feed your generated data into
TI's already existing DSP-based encoder (TIVidenc1).

[1] http://processors.wiki.ti.com/index.php/Example_GStreamer_Pipelines#H.264_RTP_Streaming
[2] GstApp library documentation
[3] appsrc element documentation
[4] appsrc examples under tests/examples/app in gst-plugins-base (mirrored from https://gitlab.freedesktop.org/gstreamer/gst-plugins-base)
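For the simple streaming case, a pipeline along the lines of the TI wiki example in [1] might look like the sketch below. This is untested here: the TIVidenc1 properties (codecName, engineName) depend on how your codec server was built, and the host address and port are placeholders.

```shell
# Capture from the webcam, encode with the DSP-based TIVidenc1,
# and stream H.264 over RTP (host/port are placeholders).
gst-launch -v v4l2src device=/dev/video0 \
    ! TIVidenc1 codecName=h264enc engineName=codecServer \
    ! rtph264pay ! udpsink host=192.168.1.100 port=5000
```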

I tried a test launch using the TIViddec2 decoder like so:

gst-launch v4l2src device=/dev/video0 ! TIViddec2 ! omapdmaifbsink

but I get the error: no element "TIViddec2". I have set up gst-dsp
with the installer from http://elinux.org/BeagleBoardUbuntu#gst-dsp

Also, I have verified that the DSP is running and I have also tested
the webcam with a simple gst-launch. What is it that I'm missing?

Jeff wrote:

I tried a test launch using the TIViddec2 decoder like so:

gst-launch v4l2src device=/dev/video0 ! TIViddec2 ! omapdmaifbsink

This is wrong. Your /dev/video0 does not provide MPEG-4 or H.264
compressed video, or does it?

Also, TIViddec2 is, AFAIK, from the DSPLink/CE/DMAI-based gst-ti,
not from the bridge-based gst-dsp...

Hi,

I guess you first need to read about gstreamer:
http://gstreamer.freedesktop.org/documentation/

Now, using the shell in the beagle, you can execute gst-inspect, which
is a tool to list and introspect the gstreamer elements that you have
installed in the board.

For example, in my setup:

$ gst-inspect | grep dsp
dvdspu: dvdspu: Sub-picture Overlay
dsp: dspdummy: DSP dummy element
dsp: dspvdec: DSP video decoder
dsp: dspadec: DSP audio decoder
dsp: dsph263enc: DSP video encoder
dsp: dspmp4venc: DSP MPEG-4 video encoder
dsp: dspjpegenc: DSP video encoder
dsp: dsph264enc: DSP video encoder
dsp: dspvpp: DSP VPP filter

As you can see, I have three video encoders: dsph263enc, dsph264enc and dspmp4venc.

If you run 'gst-inspect myencoder' you will get myencoder's details
and parameters.

A simple pipeline to test would be like this:

gst-launch v4l2src device=/dev/video0 ! dspmp4venc ! filesink
location=./myfile.mp4

I don't have a camera, so I can't test this pipeline right now, but,
in theory, it should work.

Afterwards you can craft more complex pipelines for audio and video
recording and so on.

vmjl

OK, this was very helpful, but when I try to run your suggested launch I get the error: 'dsp init failed'.

I have verified that the DSP works by using the dsp-tools. All the tools work except for dsp-test, which returns the error:

create_node: dsp node allocate failed
main: dsp node creation failed

I believe that if I can get this problem fixed I'll be able to accomplish what I need. Has anyone else had this problem?

Another update: your launch does work simply by adding sudo. But when I run it, I get errors similar to those from dsp-test:

libv4lconvert: Error decompressing JPEG: unknown huffman code: 0000fffe
create_node: dsp node allocate failed
sink_setcaps: dsp node creation failed
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)

OK, well, I solved that issue as well; apparently I hadn't loaded the baseimage.dof. However, now when I try to view video such as Big Buck Bunny with playbin2, the video is really slow. Is playbin not selecting the dspvdec? And I'm also wondering: is it possible to output v4l2src to xvimagesink using the DSP?

However, now when I try to view video such as Big Buck Bunny
with playbin2, the video is really slow. Is playbin not selecting the dspvdec?

I am almost 100% sure that playbin is not using dsp-based decoder.

And I'm also wondering: is it possible to output v4l2src to xvimagesink
using the DSP?

I am not sure what you mean here but maybe TIDmaiVideoSink might be of
interest for you:
http://processors.wiki.ti.com/index.php/Example_GStreamer_Pipelines#Loopback:_Video_7
. But AFAIK it does the output direct to the framebuffer (to the
screen), not to X window.

Ok well I solved that issue as well, apparently I hadn't loaded the
baseimage.dof. However, now when I try to view video such as Big Buck Bunny
with playbin2, the video is really slow.

That depends on your GStreamer setup. playbin uses decodebin, and
decodebin has an algorithm to choose the "best" decoder available for
the stream. The "best" is based first on the decoding capabilities
(which must match the stream properties) and then on the rank. The
rank is a hard-coded number that establishes the preference of the
element over others available in the setup. So, if you have a software
decoding element with a higher rank than the DSP-based one, playbin
will use it.

Easy answer: remove the software decoding element.
Hard answer: promote the rank of the DSP element in the code.
Workaround: don't use playbin and craft the pipeline manually.
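As a sketch of that last workaround, a hand-built pipeline might look like the following. This is an untested assumption: it presumes the file is an AVI whose video stream dspvdec can handle, and the path is a placeholder for wherever your copy lives.

```shell
# Demux the AVI and decode the video with the DSP-based dspvdec,
# bypassing playbin's automatic (possibly software) decoder choice.
gst-launch filesrc location=/home/ubuntu/Downloads/big_buck_bunny_480p.avi \
    ! avidemux ! dspvdec ! xvimagesink
```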

Is playbin not selecting the dspvdec?
And I'm also wondering: is it possible to output v4l2src to xvimagesink
using the DSP?

I don't know.

vmjl

Are you using a YUV sink like xvimagesink or omapfbsink? Are you using
flags=99 (native video)?

I can get the stream working without the dsp by doing:

gstreamer-launch v4l2src device=/dev/video0 ! video/x-raw-yuv ! xvimagesink

To be honest, I'm not really sure how to use omapfbsink or flags=99. But basically, if possible, I want to use GStreamer with the DSP to display the feed from my webcam, but dspvdec says it can't link to xvimagesink. Also, the Big Buck Bunny video is still very slow when I try:

gstreamer-launch playbin2 uri=file://path

I can get the stream working without the dsp by doing:

gstreamer-launch v4l2src device=/dev/video0 ! video/x-raw-yuv ! xvimagesink

Hmm, I've never seen that gstreamer-launch command before. But I would
try this instead:

% gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)UYVY' ! xvimagesink

Because UYVY is the format that gst-dsp uses and that the omapfb accepts.

To be honest, I'm not really sure how to use omapfbsink or flags=99. But
basically, if possible, I want to use GStreamer with the DSP to display the
feed from my webcam, but dspvdec says it can't link to xvimagesink. Also, the
Big Buck Bunny video is still very slow when I try:

That's interesting... Do you have an error log?

gstreamer-launch playbin2 uri=file://path

flags=99 is meant for playbin:

gst-launch playbin2 flags=99 uri=file://path

Here's the output when running Big Buck Bunny:

sudo gst-launch playbin2 flags=99 uri=file:/home/ubuntu/Downloads/big_buck_bunny_480p_surround-fix.avi
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstPulseSinkClock
WARNING: from element /GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2686): gst_base_sink_is_too_late (): /GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
Caught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 42984252929 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

This works well for the videotestsrc:
% gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)UYVY' ! xvimagesink

But not for the webcam; it gives an error:

sudo gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480,format=(fourcc)UYVY' ! xvimagesink
Setting pipeline to PAUSED ...
libv4lconvert: warning more framesizes than I can handle
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not negotiate format
Additional debug info:
gstbasesrc.c(2755): gst_base_src_start (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Check your filtered caps, if any
Setting pipeline to NULL ...
Freeing pipeline ...

To resolve your colorspace problem, stick a ffmpegcolorspace element
in there. It will kill your cpu, but you should get some video.
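A sketch of that suggestion, assuming the camera delivers a format libv4l can hand over and that xvimagesink on this board wants UYVY (the conversion runs on the ARM, hence the CPU cost):

```shell
# Convert the webcam output to UYVY on the CPU so xvimagesink accepts it.
gst-launch v4l2src device=/dev/video0 \
    ! ffmpegcolorspace \
    ! 'video/x-raw-yuv,format=(fourcc)UYVY' ! xvimagesink
```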

-chris

That does work, but is this utilizing the DSP at all?

I don't see that the dsp would have anything to do in relation to
displaying your uncompressed webcam image to xvimagesink.

It would, however, be useful in decompressing your big buck bunny
file, assuming the codec was compatible.

-chris

Yeah, so how should I run Big Buck Bunny to use the DSP?

With Felipec's 0.9.0 release* of gst-dsp, it's taking about 10% CPU
usage for Big Buck Bunny (not full screen) on my xM B.

Using just: sudo gst-launch playbin2 file:/(big_buck)..

* Just rerun ./install-gst-dsp.sh; it'll update/build to the latest
git of gst-dsp.

Regards,

For some reason mine is still really slow. Which version of the movie should I have?