I am having some trouble streaming video from a CSI0-attached IMX219 camera and was hoping for advice or insight. I am currently stuck on what appears to be either a problem with the GStreamer v4l2h264enc element or an issue with my pipeline. The kernel I am running is 5.10.153-ti-arm64-r84 (as reported by “uname -r”), and it has the version 8.2 Edge AI apps from the repository installed via the “bbai64-debian-11.6xfce-edgeai-arm64-2023-01-02-10gb.img.xz” image on an SD card. I do recall also running “/opt/edge_ai_apps/setup_script.sh”.
I am able to stream video from the camera to the monitor, although it often takes a manual modprobe of the imx219 driver to get it working from a cold boot. I copied the IMX219 tuning files “dcc_2a.bin” and “dcc_viss.bin” from the “imaging” directory of the TI 8.5 RTOS PDK into a newly created “/opt/imaging/imx219/” directory on the AI-64, and I can run the RPiV2 Camera Sensor Demo successfully with the camera (after first running /opt/edge_ai_apps/init_script.sh).
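Regarding the cold-boot issue: as a workaround I am considering having the sensor driver load automatically at boot. A minimal sketch, assuming the module is simply named imx219 (the same name I pass to modprobe) and that the standard systemd modules-load.d mechanism applies here:
echo imx219 | sudo tee /etc/modules-load.d/imx219.conf   # load the imx219 sensor driver at every boot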
I can get video from the camera to a monitor attached through a DP-to-HDMI adapter using this command:
gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-bayer, width=1920, height=1080, format=rggb ! tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name=SENSOR_SONY_IMX219_RPI dcc-isp-file=/opt/imaging/imx219/dcc_viss.bin sink_0::dcc-2a-file=/opt/imaging/imx219/dcc1/dcc_2a.bin format-msb=7 ! kmssink driver-name=tidss
What I would like to do, however, is stream the camera video as hardware-encoded H.264. I have tried the following command, but it just hangs before the streaming counter ever starts:
gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-bayer, width=1920, height=1080, format=rggb, framerate=30/1 ! tiovxisp sink_0::device=/dev/v4l-subdev2 sensor-name=SENSOR_SONY_IMX219_RPI dcc-isp-file=/opt/imaging/imx219/dcc_viss.bin sink_0::dcc-2a-file=/opt/imaging/imx219/dcc1/dcc_2a.bin format-msb=7 ! video/x-raw, format=NV12 ! v4l2h264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.XXX.XXX.XXX port=5000 -e
(Note: the XXXs stand in for the actual local IP address of a system set up to receive the stream.)
...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
...
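In case it helps, these are the checks I am planning to run next to see where it stalls; a rough sketch, assuming v4l-utils is installed for v4l2-ctl:
gst-inspect-1.0 v4l2h264enc                 # confirm the element loads and check its pad caps and properties
v4l2-ctl --list-devices                     # see which /dev/videoN node belongs to the hardware encoder
GST_DEBUG=3 gst-launch-1.0 -v v4l2src device=/dev/video2 ! ... ! v4l2h264enc ! fakesink   # rerun with verbose/debug output (the “...” is the same ISP portion as above) to see where caps negotiation stops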
Trying a videotestsrc instead, I see the streaming clock running, but I don’t receive the stream in VLC on my receiving system:
gst-launch-1.0 videotestsrc ! video/x-raw,width=1920,height=1080, framerate=60/1 ! v4l2h264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.XXX.XXX.XXX port=5000 -e
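On the receiving end I have mostly been testing with VLC. My understanding is that VLC cannot play a bare RTP/H.264 stream on a UDP port without an SDP file describing it, so as a sanity check I also plan to receive with a GStreamer pipeline on the other machine; a sketch, assuming GStreamer (with gst-libav for avdec_h264) is installed there:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
If that receiver shows video, the sender is at least putting packets on the wire and the remaining problem is on the VLC/SDP side.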
I have also tried sinking the v4l2h264enc output to a file instead, but the resulting file isn’t recognized by a video player.
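I suspect the file needs a parser and a container rather than the raw elementary stream I am writing now. A sketch of what I plan to try, assuming the encoder otherwise negotiates (videotestsrc used here just to keep the example self-contained):
gst-launch-1.0 videotestsrc num-buffers=300 ! video/x-raw,width=1280,height=720,framerate=30/1 ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=test.mp4 -e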
I did successfully receive a stream in VLC from a videotestsrc using x264enc (a software encoder from the installed GStreamer plug-ins) instead of v4l2h264enc, but it sends the CPU into the stratosphere (as seen with “top” running), so I kill it quickly when testing:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=30/1 ! x264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.XXX.XXX.XXX port=5000 -e
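If I am stuck with software encoding for now, I gather x264enc’s CPU load can be reduced somewhat with its speed and latency properties; a sketch, not something I would want long term:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=30/1 ! x264enc speed-preset=ultrafast tune=zerolatency bitrate=2000 ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.XXX.XXX.XXX port=5000 -e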
So the problem currently seems to be related to the v4l2h264enc element, which I am led to believe uses the hardware encoder. Any help or direction on how to use v4l2h264enc would be appreciated!
Thank You