CSI-2 Cameras for the BBAI-64

Hello,

I noticed two CSI-2 camera ports on the BBAI-64. Will any CSI-2 camera work, or are there particular cameras that are needed for this particular set of ports?

Seth

P.S. I found some CSI-2 cameras out there on the consumer market. So, I just want to make sure I am purchasing the correct set of cameras for use with the Edge AI SDK and tools. For instance, I found this from Leopard Imaging: https://www.leopardimaging.com/product/csi-2-mipi-modules-i-pex/csi-2-mipi-modules/global-shutter-mipi-cameras/1mp-ar0144/li-ar0144-mipi-stereo/ .

I will check those out along with other CSI-2 imaging cameras too.

Also,

I found some libraries from: SDK Components — Processor SDK Linux for Edge AI Documentation

I have a couple questions:

  1. Do those libraries work on the BBAI-64 after applying the update at BeagleBoard.org - update-ai64?
  2. Or would I need to build those libraries myself on the BBAI-64? These are the libraries in question:
    a. GitHub - TexasInstruments/edgeai-tidl-tools: Edgeai TIDL Tools and Examples - This repository contains Tools and example developed for Deep learning runtime (DLRT) offering provided by TI’s edge AI solutions.
    b. GitHub - TexasInstruments/edgeai-modelzoo: AI / Deep Neural Network Models and Examples
    c. GitHub - TexasInstruments/edgeai-gst-plugins: Repository to host GStreamer plugins for TI's EdgeAI class of devices
    d. GitHub - TexasInstruments/edgeai-tiovx-modules: Repository to host TI's OpenVx modules used in the EdgeAI SDK.

The reason I am asking is because I am currently trying the edgeai-tidl-tools from item (a) above.

I received an error before sitting down to ask these questions:

  1. The error was about the architecture of the machine in question.
  2. Good news: the library is building now!

Seth

P.S. At times…I receive this error:

./run_python_examples.sh: line 11: return: can only `return' from a function or sourced script
DEVICE not defined. Run either of below commands
export DEVICE=j7
export DEVICE=am62
./run_python_examples.sh: line 18: return: can only `return' from a function or sourced script

Okay…so! For the above error, just run the script a second time. When you hit that error, do not quit and give up; run the command once more and it works.
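The error itself explains what is going on: run_python_examples.sh uses a bare `return` to bail out when DEVICE is unset, and bash only allows `return` inside a function or a sourced script. A minimal sketch of a cleaner workaround, assuming the script is run from the repo root (`j7` is TI's family name covering the TDA4VM on the BBAI-64):

```shell
# Set DEVICE up front so the script never reaches the bare `return`
# that bash rejects in a non-sourced script:
export DEVICE=j7
./run_python_examples.sh

# Alternatively, source the script instead of executing it, which
# makes `return` legal:
# . ./run_python_examples.sh
```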

Anyway…I will wait to find the TRM (Technical Reference Manual) one day. I am sure there is a lot of work going into this processor right now. So, patiently waiting!

Okay…so,

Jupyter Notebooks work on the BBAI-64 from this effort: GitHub - TexasInstruments/edgeai-tidl-tools: Edgeai TIDL Tools and Examples - This repository contains Tools and example developed for Deep learning runtime (DLRT) offering provided by TI’s edge AI solutions.

If one looks here: edgeai-tidl-tools/examples/jupyter_notebooks at master · TexasInstruments/edgeai-tidl-tools · GitHub

There are many pretrained examples that one can use once the built Jupyter Notebook server is opened in a browser at the board's IP address (from the ip a command) on port 8888.
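A sketch of how the server can be started so it is reachable from another machine on the LAN, assuming Jupyter was installed by the edgeai-tidl-tools setup (the exact install location may differ on your image):

```shell
# Bind to all interfaces so a browser on another machine can reach the
# server at http://<board-ip>:8888; find <board-ip> with `ip a`:
jupyter notebook --ip=0.0.0.0 --port=8888 --no-browser
```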

Seth

P.S. Oh! Here is the idea so far in a photo:

[Photo: Jupyter_ONE]

Me again…so I am trying to get the CSI-2 port on the BBAI-64 to render video.

Has anyone gotten their CSI camera to show video, i.e. either on an LCD or via a server?

I have been reading the docs.beagleboard.org pages regarding the samples or demos for the BBAI-64.

I tried GStreamer:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
MESA-LOADER: failed to open tidss: /usr/lib/dri/tidss_dri.so: cannot open shared object file: No such file or directory (search paths /usr/lib/aarch64-linux-gnu/dri:\$${ORIGIN}/dri:/usr/lib/dri, suffix _dri)
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' is not a capture device.
Additional debug info:
../sys/v4l2/v4l2_calls.c(629): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Capabilities: 0x4204000
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...

That has some debug data in the output.
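The "Capabilities: 0x4204000" line in that output actually explains the failure. Decoding the mask with the V4L2 capability bit values from <linux/videodev2.h> shows that /dev/video0 is a memory-to-memory (codec/scaler) node with streaming I/O, not a capture device, so the CSI sensor must sit on a different /dev/video* node. A diagnostic sketch (v4l2-ctl comes from the v4l-utils package; the /dev/video2 at the end is a guess to be replaced with whatever --list-devices reports):

```shell
# Decode the capability mask GStreamer printed for /dev/video0.
# Bit values from <linux/videodev2.h>:
#   V4L2_CAP_VIDEO_CAPTURE    = 0x1
#   V4L2_CAP_VIDEO_M2M_MPLANE = 0x4000
#   V4L2_CAP_STREAMING        = 0x4000000
caps=$((0x4204000))
[ $((caps & 0x1)) -ne 0 ] && echo "capture device" \
                          || echo "NOT a capture device"
[ $((caps & 0x4000)) -ne 0 ] && echo "memory-to-memory (codec/scaler) node"
[ $((caps & 0x4000000)) -ne 0 ] && echo "supports streaming I/O"

# Find the node that actually belongs to the CSI sensor, then point
# v4l2src at it instead of /dev/video0:
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video2 --list-formats-ext   # /dev/video2 is a guess
```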

Would I need to use the UART connection for debugging CSI usage on the BBAI-64?

Seth

P.S. See, what I was trying to do was make my bot go after dogs or cats for testing. So…

If the camera's model detected a cat in the video feed, the bot would turn toward it, turn away from it, or at least move. Although not useful in this context of bots scaring animals or bots running from animals, I was thinking that bot movement driven by models could be done with a richer set of movements…

I replied to the other thread, but it seems that currently, if you want to run ‘edge ai’ via a CSI camera on the BB AI-64, you can only do it on Debian 11 Bullseye.

If it works on Debian 12 Bookworm, please let me know.
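A quick, generic way to confirm which Debian release an image is running before attempting the CSI pipeline (this reads the standard os-release file and is not specific to BeagleBoard images):

```shell
# Print the Debian codename; expect "bullseye" for Debian 11 or
# "bookworm" for Debian 12:
. /etc/os-release
echo "$VERSION_CODENAME"
```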


If I can configure things correctly, I will let you know.

Seth