Hi All,
I’m curious whether anyone has conducted (or would be interested in conducting) an experiment to measure the single-sample latency from analog input to analog output on any BeagleBone board — especially the BeagleBone AI-64.
We’re investigating the feasibility of using the board for analog-to-analog feedback control, where latency can be a limiting factor for closed-loop performance.
Experimental Concept
The idea is to quantify the latency between sampling an analog signal and reproducing it on an analog output, using only the ARM processors.
Proposed Setup
The diagram below shows the experimental setup as I envision it.
• Input: Sine wave from a function generator, sampled by the built-in ADC or a convenient external ADC.
• Output: The same signal is written to an external DAC (e.g., the MikroE DAC Click, a cape, or another DAC).
• Measurement: An oscilloscope compares the input and output signals to determine the latency; with zero latency, the two traces would lie exactly on top of each other.
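
To make the "ARM-only" part of the setup concrete, here is a minimal sketch of the passthrough loop I have in mind. It is only an illustration under assumptions: an ADC exposed through Linux IIO at the sysfs path shown (the BeagleBone Black's on-chip ADC appears this way; an external IIO ADC would look similar), and an MCP4921-style 12-bit SPI DAC on /dev/spidev0.0 (which I believe is what the DAC Click uses). Paths, channel, and the DAC command word would need adjusting for the actual hardware.

```c
/*
 * Minimal ADC -> DAC passthrough sketch (userspace, ARM only).
 * Assumptions (adjust for your hardware):
 *   - ADC exposed via Linux IIO at ADC_PATH
 *   - MCP4921-style 12-bit SPI DAC on /dev/spidev0.0
 */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/spi/spidev.h>

#define ADC_PATH "/sys/bus/iio/devices/iio:device0/in_voltage0_raw"
#define SPI_DEV  "/dev/spidev0.0"

int main(void)
{
    int adc_fd = open(ADC_PATH, O_RDONLY);
    int spi_fd = open(SPI_DEV, O_RDWR);
    if (adc_fd < 0 || spi_fd < 0) {
        perror("open");
        return 1;
    }

    /* SPI mode 0, 8-bit words, 10 MHz clock. */
    uint8_t  mode = SPI_MODE_0, bits = 8;
    uint32_t speed = 10000000;
    ioctl(spi_fd, SPI_IOC_WR_MODE, &mode);
    ioctl(spi_fd, SPI_IOC_WR_BITS_PER_WORD, &bits);
    ioctl(spi_fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    char buf[16];

    for (;;) {
        /* Read one raw ADC sample (sysfs attribute, ASCII decimal). */
        ssize_t n = pread(adc_fd, buf, sizeof(buf) - 1, 0);
        if (n <= 0)
            continue;
        buf[n] = '\0';
        uint16_t sample = (uint16_t)(atoi(buf) & 0x0FFF);

        /* MCP4921 write: 0x3000 = DAC A, unbuffered, 1x gain, active. */
        uint16_t cmd = 0x3000 | sample;
        uint8_t  tx[2] = { (uint8_t)(cmd >> 8), (uint8_t)(cmd & 0xFF) };
        struct spi_ioc_transfer xfer = {
            .tx_buf = (unsigned long)tx,
            .len    = 2,
        };
        ioctl(spi_fd, SPI_IOC_MESSAGE(1), &xfer);
    }
}
```

One caveat on this sketch: a sysfs read per sample costs a syscall plus an on-demand conversion, which may already eat a good chunk of a 10 μs budget, so I see it as a way to measure what latency is achievable rather than a guarantee; if the numbers come out too high, a triggered IIO buffer or a more direct access path would be the next thing to try.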
We’re aiming for roughly 10 μs of latency. So if we start with a low-frequency sine wave, raise the frequency to 5 kHz without the output dropping any cycles, and the phase shift stays within 1/20 of the signal period, our performance requirement is met!
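
For reference, the arithmetic behind the 1/20-of-a-period criterion at the 5 kHz test frequency:

$$
t_{\text{latency}} \le \frac{T}{20} = \frac{1}{20 f} = \frac{1}{20 \times 5\,\text{kHz}} = 10\ \mu\text{s}
$$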
Questions
• Has anyone tried something like this on a BeagleBone (especially AI-64)?
• Any suggestions for a low-latency DAC with a good voltage range and output stage (ideally no need for a custom op-amp output stage)?
Looking forward to any experiences, recommendations, or data — and happy to post results if we move forward with the test.
