GStreamer usage with BeagleY-AI

Hi,

I flashed my BeagleY-AI kit using the bb-imager tool with the BeagleY-AI Debian 13 v6.1.x-ti XFCE image; the kernel version present is 6.1.83-ti-arm64-r72.

I have written a V4L2 driver for my camera sensor, using the CSI0 port, and I can now get the V4L2 stream properly.

Now I need to push the stream to the display using GStreamer. I tried the command below, but it has a lot of latency and throws the logs attached below.
GStreamer command - gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,width=1920,height=1080,format=UYVY ! videoconvert ! autovideosink
Logs - Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayX11)\ gldisplayx11-0";
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Redistribute latency…
WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstGLImageSinkBin:autovideosink0-actual-sink-glimage/GstGLImageSink:sink: A lot of buffers are being dropped.
Additional debug info:
../libs/gst/base/gstbasesink.c(3146): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstGLImageSinkBin:autovideosink0-actual-sink-glimage/GstGLImageSink:sink:
There may be a timestamping problem, or this computer is too slow.
(the same WARNING repeats several more times)

What is the proper GStreamer command to use? Kindly help me with this.

Thanks,
Naveen.

My best guess is that the videoconvert step is running on the CPU, processing all 2 megapixels of each frame without GPU or other hardware acceleration. You may be able to enable profiling to confirm this.
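As a sketch, GStreamer's built-in latency tracer can show where time is spent per element. The command below reuses the pipeline from the original post and assumes the same device node; it obviously needs the camera hardware to run, so no output is shown here:

```shell
# Illustrative only: enable GStreamer's latency tracer so per-element
# latency is logged, which should reveal whether videoconvert dominates.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" \
gst-launch-1.0 v4l2src device=/dev/video1 \
  ! video/x-raw,width=1920,height=1080,format=UYVY \
  ! videoconvert ! autovideosink
```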

For any GStreamer camera flows requiring hardware acceleration, you’re going to have more success using the Edge AI image for now as opposed to Debian, as TI maintains GStreamer support for that image.


Alternatively, if you reduce the resolution or framerate you may be able to get away with CPU processing.
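A rough sketch of that idea: drop to 720p and let the sink render frames as they arrive. Note that 1280x720 is an assumption and must be a mode your sensor actually supports, and `sync=false` trades clock-accurate presentation for fewer "buffers dropped" warnings:

```shell
# Hypothetical lower-load variant of the original pipeline.
# sync=false tells the sink not to wait on buffer timestamps,
# so slow CPU conversion no longer makes every frame "too late".
gst-launch-1.0 v4l2src device=/dev/video1 \
  ! video/x-raw,width=1280,height=720,format=UYVY \
  ! videoconvert ! autovideosink sync=false
```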


Hi, as you suggested, I have worked with the Edge AI image and successfully got the stream on CSI0, using the imx219 device tree as a reference. Now I need to do the same for CSI1, as I need two-camera support, but I couldn't find any device tree support for it. How can I add CSI1 support properly?

Thanks.

Use the existing dtso files as a reference: linux/arch/arm64/boot/dts/ti/k3-am67a-beagley-ai-csi0-imx219.dtso at ti-linux-6.12.y-bb · goat-hill/linux · GitHub
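A minimal sketch of that workflow, assuming the usual TI kernel-tree layout (the CSI1 file name and make invocation here are my assumptions, not an existing file):

```shell
# Hypothetical workflow: start the CSI1 overlay from the CSI0 one.
cd linux/arch/arm64/boot/dts/ti
cp k3-am67a-beagley-ai-csi0-imx219.dtso k3-am67a-beagley-ai-csi1-imx219.dtso
# Edit the copy so every node and label references the CSI1 RX instance
# (csi-bridge, D-PHY, and a free I2C bus) instead of the CSI0 ones,
# add the new .dtso to the ti/ Makefile, then rebuild the device trees:
cd -
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- dtbs
```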


Hi,

Thanks for the response.

As mentioned earlier, I used the same DTSO as a reference and got CSI0 working without any issues. I then applied the same approach for CSI1, renaming the relevant nodes accordingly. With this setup, my camera driver loads correctly, but the platform driver fails with the following error:

[    9.107357] genirq: Flags mismatch irq 73. 00000004 (30121000.csi-bridge) vs. 00006004 (20000000.i2c)
[    9.107372] cdns-csi2rx 30121000.csi-bridge: Unable to request interrupt: -16
[    9.107382] cdns-csi2rx 30121000.csi-bridge: probe with driver cdns-csi2rx failed with error -16

I used &main_i2c0 for the CSI1 configuration. Could this be causing the conflict? Also, is there any validated or working device tree reference for CSI1 on this platform?

Thanks in advance for your guidance.

There’s a quad-camera example for the AM67A EVM board: linux/arch/arm64/boot/dts/ti/k3-j722s-evm-csi2-quad-rpi-cam-imx219.dtso at ti-linux-6.12.y-bb · goat-hill/linux · GitHub


@babldev and @Naveen_K

Thank you very much - it’s very valuable for me to read and understand this.