BeagleBone Black as an Image Sensor DSP

Hello :slight_smile:

I need to get pixel data from the parallel interface (D0-D7, HSYNC, VSYNC, CLK) of an image sensor whose pixel clock runs at 25 MHz.

The image needs to be demosaiced to get a full-color image, then JPEG-compressed, and finally sent over Ethernet.

These are my doubts:

  1. Is it possible to capture the pixel data at 25 MHz using GPIO? Are there any examples of this kind of application to get started from?
  2. What about processing power? I know it will depend on how well the code is optimized, but with good optimization, could this task be done at 24 FPS in “real time”?
  3. For this application, is there any performance impact from using Linux?

I’ve been thinking about a kernel driver that reads the pixel data on each rising edge of CLK (that’s one interrupt). A second interrupt (HSYNC) would then start demosaicing, followed by row-based JPEG compression, once at least two rows of raw data are available. The compressed image would go into another buffer that is sent out over Ethernet via DMA. Will that work? What do you think?

For my application, frame latency is an issue, so the sooner the other side receives (even part of) a new image, the better.

Thanks in advance!

Take a look at the two imager capes that are available for the BBB.