Webcam Streaming on BeagleBoard-xM using GStreamer/DSP

Hello Vladimir,
    I saw a lot of your posts and support in the BeagleBoard group. I actually
asked my question in the Google group, but I couldn't get the help I expected.
I hope you can help me out.

Still, the mailing list is the place to ask

I am using a Logitech HD 910 webcam. I have installed OpenCV on both
Angstrom and Ubuntu. I have a BeagleBoard-xM rev C.

When I run a simple webcam streaming program on either OS, it runs
successfully, but there is a lot of delay in the streaming. Based on what
I read in the forums, I planned to use the on-chip DSP for this operation.

Why do you need OpenCV at all if all you want is to stream video?
GStreamer and VLC should both be able to do that for you.

Thanks for the reply.
I want to do real-time color detection and object detection…
I am not aware of how to do that using the DSP.

Hi all,
please share a doc or procedure on how to install OpenCV on Angstrom; I am using a BeagleBoard…

N reddy

Babu, if you don’t want to do any research, then you need something like “opkg install opencv”. This instruction is probably wrong, but I did not research it either :)
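
Something along these lines might work on Angstrom (just a sketch; the exact package names depend on what the feeds provide, so check the list first):

opkg update
opkg list | grep -i opencv
opkg install opencv opencv-dev    # package names are a guess; adjust to whatever the previous command shows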

2012/1/16 babu reddy <babu.dec04@gmail.com>

Dhut C wrote:

Thanks for reply.
I want to do a real time color detection and object detection..
I am not aware of how to do it using DSP.

1) ignore openCV and your camera for now
2) look into dsplink and codec engine (CE) or maybe C6Run
3) run the dsplink/codec engine/C6Run examples (a quick sanity-check pipeline is sketched after the reading list below)
4) write your own simple dsp algorithm (or codec)
5) successfully run your simple codec/algo inside link/CE/C6Run
6) implement your "color detection and object detection" for the DSP
    Beware that making a DSP algorithm that is really fast is not simple,
    and in most cases you will end up with something that could run at a
    similar speed on the ARM unless you put a lot of effort into it.
    Of course you can use the DSP to offload work to it and keep the
    ARM free for other stuff, but that means you need a SW framework
    on the ARM that allows you to schedule DSP work in parallel with
    ARM work.

and only once that works:

7) integrate your dsp algorithm inside openCV.

recommended reading material (list is not exhaustive):

https://www.google.com/search?q=beagleboard+dsplink
https://www.google.com/search?q=beagleboard+codec+engine
http://processors.wiki.ti.com/index.php/Introduction_to_C6Run
http://elinux.org/BeagleBoard/GSoC/2010_Projects#Project:_OpenCV_DSP_Acceleration
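
As a quick sanity check for steps 3 and 5, once the DSP side is up on Angstrom, a pipeline using one of the gst-dsp encoders could look roughly like this (an untested sketch; it assumes the gst-dsp plugins and the DSP kernel driver are installed and loaded):

gst-launch videotestsrc num-buffers=300 ! dspmp4venc ! filesink location=/tmp/test.m4v

If that produces a file without errors, the encoding really ran on the DSP, so the basic setup is working.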

Vladimir has some good suggestions, but here is some additional OpenCV info:

Here is a good link for installing OpenCV. Read the comments section first, because you will probably need to add this option when running configure: --enable-pic

http://ozbots.org/opencv-installation/
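
Roughly, the build boils down to the usual autotools steps with --enable-pic added as mentioned in the comments (only a sketch; the rest of the procedure is whatever the guide says):

./configure --enable-pic
make
make install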

I have had some trouble compiling on the command line, but got it to work. See this post: https://groups.google.com/d/topic/beagleboard/UvrMRrGDOqA/discussion

Jake

small update inline

Vladimir Pantelic wrote:

Dhut C wrote:

Thanks for reply.
I want to do a real time color detection and object detection..
I am not aware of how to do it using DSP.

1) ignore openCV and your camera for now
2) look into dsplink and codec engine (CE) or maybe C6Run

2a) use Angstrom because it has dsplink and CE already

3) run dsplink/codec engine/C6Run examples
4) write your own simple dsp algorithm (or codec)
5) successfully run your simple codec/algo inside link/CE/C6Run
6) implement your "color detection and object detection" for the DSP
     Beware that making a DSP algorithm that is really fast is not simple,
     and in most cases you will end up with something that could run at a
     similar speed on the ARM unless you put a lot of effort into it.
     Of course you can use the DSP to offload work to it and keep the
     ARM free for other stuff, but that means you need a SW framework
     on the ARM that allows you to schedule DSP work in parallel with
     ARM work.

and only once that works:

7) integrate your dsp algorithm inside openCV.

8) have fun!

Thanks a lot for your great help!!
I just want to know: does anybody use the on-chip DSP on a BeagleBoard or BeagleBoard-xM? The reason I am asking is that they may know some practical challenges in an actual implementation, or some tricks to make it happen.

Speaking about GStreamer: the following page contains a great set of
examples of how to use GStreamer with a web camera, including example
pipelines which use the DSP-accelerated codecs:
http://processors.wiki.ti.com/index.php/Example_GStreamer_Pipelines#Network_Streaming_7
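
For example, a sender pipeline on the board could look something like this (just a sketch, not copied from a working setup; it assumes the gst-dsp plugin is installed, and the host/port are placeholders for your receiver):

gst-launch -v v4l2src ! 'video/x-raw-yuv,width=320,height=240' ! dspmp4venc ! rtpmp4vpay ! udpsink host=192.168.1.10 port=5000    # host/port are placeholders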
In our hobby project (a vehicle controlled over the Internet) we are using
GStreamer on a BeagleBoard running Angstrom to stream the video from the
on-board camera(s). All the sources and some documentation are available
on GitHub. For more details you can take a look at the following
pages: Home · veter-team/veter Wiki · GitHub ,
Hardware design en · veter-team/veter Wiki · GitHub and
http://veter-project.blogspot.com/

Hope this helps,
Andrey.

Andrey,
Nice work!! Thanks for sharing… lots of great ideas are coming.

I am planning to use C6Run for the implementation on the DSP. People who have used or are using C6Run, please share your thoughts and helpful links…

Awesome!

Thanks for your post with the links!

Dhut C wrote:

Thanks a lot for your great help!!
   I just want to know Does anybody use on chip DSP on any BeagleBoard or BeagleBoard XM ? The reason I am asking
because He may know some practical challenges in an actual implementation or some tricks to make it happen.

Ask a concrete question once you get to one; asking meta-questions like that
won't get you far....

Thanks for the advice.
I will do that.

Hello Vladimir,
I am trying to follow your instructions to use the DSP. I went through all these references and docs, and most of the info seems outdated. I already have a card with Angstrom running on it; do I need to make a new one to use the DSP? Can I use the Narcissus online tool to create a new Angstrom image that has the DSP-related files?

Also, the following website looks like it is under construction:
http://www.angstrom-distribution.org/building-angstrom
Do we have another website?

I have read some of the documents, and I am a little bit confused about which steps to follow first: build the codec, DSP Link, and so on. Please guide me.

Hello!

I just got my first BeagleBoard rev C and installed Ubuntu (11.10 or
11.04; I tried both). My problem is that I cannot get OpenCV to work
on it.

I noticed that you said you managed to install OpenCV; can you tell me
how? I would really appreciate it.

Thanks.
-Seham

Hey, I think I followed this link: http://linuxconfig.org/introduction-to-computer-vision-with-opencv-on-linux
I hope it will work for you.

Hi there,

I have run into a problem while trying to get webcam streaming working on my BB-xM, which has an LI5M03 (mt9p031) camera attached.

I’m currently trying to see webcam images on the screen. I’m able to do that by using mplayer as follows:
#sudo mplayer tv:// -tv driver=v4l2:device=/dev/video6:outfmt=uyvy:width=320:height=240

However, I get an error when I try something similar with GStreamer:
#sudo gst-launch -e v4l2src device=/dev/video6 ! filesink location=./test.data
I have tried various combinations of v4l2src and sink elements, but the error I am getting is:
“v4l2src: Couldn’t negotiate format”

It seems the problem is related to the output format of the LI5M03 camera, which is UYVY, and the GStreamer elements are not expecting/supporting UYVY. Does anyone have experience to confirm my findings?
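
To double-check the driver side, I suppose one could ask it to list the formats it actually reports (assuming v4l2-ctl from v4l-utils is available on the board):

v4l2-ctl --device=/dev/video6 --list-formats
v4l2-ctl --device=/dev/video6 --list-formats-ext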

Any idea?

Regards,
Ozkan.


Hi Ozkan,

I have tried various combinations of v4l2src and sink methods but the error
I am getting is:
      "v4l2src: Couldn't negotiate format"

It seems the problem is related to the output format of LI5M03 camera which
is UYVY, and the gst drivers are not expecting/supporting UYVY. Does anyone
have an experience to confirm my findings?

To check your hypothesis, you can try to insert ffmpegcolorspace
between source and sink.
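
For example, something along these lines (an untested sketch in GStreamer 0.10 syntax; it forces the camera's native UYVY on the source pad and lets ffmpegcolorspace do the conversion for the sink):

gst-launch -e v4l2src device=/dev/video6 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=320,height=240' ! ffmpegcolorspace ! ximagesink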

Regards,
Andrey.

I also tried a basic example:
# gst-launch -e v4l2src device=/dev/video6 ! dspmp4venc ! filesink location=./test.mp4
But I got an error:
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device ‘/dev/video6’ cannot capture in the specified format
Additional debug info:
gstv4l2object.c(2218): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Tried to capture in YU12, but device returned format UYVY
Setting pipeline to NULL …
Freeing pipeline …

Any help?

I have tried lots of combinations in order to get the camera image on the screen of the BB-xM, but no success so far.

ximagesink gives a different error than filesink; any comment/feedback about the reason?

root@arm:# gst-launch -e v4l2src device=/dev/video6 ! ximagesink
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not negotiate format
Additional debug info:
gstbasesrc.c(2830): gst_base_src_start (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Check your filtered caps, if any
Setting pipeline to NULL …
Freeing pipeline …

root@arm:# gst-launch -e v4l2src device=/dev/video6 ! filesink location=./test.data
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device ‘/dev/video6’ cannot capture in the specified format
Additional debug info:
gstv4l2object.c(2218): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Tried to capture in YU12, but device returned format UYVY
Setting pipeline to NULL …
Freeing pipeline …

To check your hypothesis, you can try to insert ffmpegcolorspace
between source and sink.

I also tried ffmpegcolorspace, colorspace, and autoconvert, but no success; the same error, as copied below.

root@arm:# gst-launch -e v4l2src device=/dev/video6 ! video/xraw-yuv,format=(fourcc)YU12 ! ffmpegcolorspace ! filesink location=./test.data
WARNING: erroneous pipeline: could not link v4l2src0 to ffmpegcsp0

Any help?
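
One thing to check: in that last pipeline the caps are written video/xraw-yuv (missing the hyphen; it should be video/x-raw-yuv), so no element can offer that media type and the filtered link fails. And even with the spelling fixed, asking for YU12 requests exactly the format the driver just said it cannot deliver. A corrected attempt (still only a sketch, not verified on this hardware) that asks for the camera's native UYVY and converts afterwards would be:

gst-launch -e v4l2src device=/dev/video6 ! 'video/x-raw-yuv,format=(fourcc)UYVY' ! ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)I420' ! filesink location=./test.data

The explicit I420 caps after ffmpegcolorspace force the actual conversion (I420 is what v4l2 calls YU12); if you only want to dump the raw camera data, you can drop ffmpegcolorspace and the second caps entirely.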