Voltage on ADC, 1.83v ok?

I'm reading ADC values on the BeagleBone coming from the output of an op-amp, through a voltage divider. So far so good.

The output of the op-amp seems to be a maximum 3.67V when powered from SYS5V on the BeagleBone. If I power it from the unregulated 5V rail, it's 3.85v. It won't go higher than that, so it appears to be a diode drop from its rail, fair enough, there's the clamped maximum.

If I voltage-divide that in half (convenient, two equal resistors), I get 1.84V in theory, and 1.82V measured in practice.

So I've been using a 1/3 divider to be safe (again, I have to use standard resistor values, so I'm stuck with 20k/10k or 200k/100k), but that means my maximum voltage is only about 1.2V, so I'm throwing away ADC resolution and introducing noise. I'd rather use a 1/2 divider instead, but that would put the max voltage at 1.82V.
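For reference, here's the divider arithmetic behind those numbers as a quick sketch. The op-amp ceiling (3.67V) is from my measurements above; the 1.8V full scale and 12-bit resolution are my assumptions about the AM335x ADC, so check them against the datasheet:

```python
V_MAX = 3.67    # measured op-amp output ceiling on SYS5V, volts
V_REF = 1.8     # assumed ADC full-scale voltage
ADC_BITS = 12   # assumed ADC resolution

def divider_out(v_in, r_top, r_bottom):
    """Output of a simple resistive voltage divider."""
    return v_in * r_bottom / (r_top + r_bottom)

for r_top, r_bottom in [(10e3, 10e3), (20e3, 10e3)]:
    v_out = divider_out(V_MAX, r_top, r_bottom)
    # fraction of the ADC's code range actually used at full input swing
    used = min(v_out, V_REF) / V_REF
    print(f"{r_top/1e3:.0f}k/{r_bottom/1e3:.0f}k divider: "
          f"{v_out:.2f} V max, uses {used*100:.0f}% of {2**ADC_BITS} codes")
```

The 20k/10k (1/3) divider only reaches about 1.22V, i.e. roughly two thirds of the assumed 1.8V full scale, which is the resolution I'm throwing away.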

The datasheet says that the AM335x can handle up to 1.89V max as a supply voltage on the ADC. However, it looks like the supply is internal. Many ADCs can accept an external reference voltage instead of using the internal one, but I don't see any pins on the BeagleBone for that. And most ADCs also specify that an input voltage greater than the reference voltage is A Bad Thing, so I'm not sure.

But, overall, my question is: is ~1.82V safe to feed into the input of the ADC on a consistent basis? I do not want to fry my BeagleBone. It's a fantastic board!



I see no issue with 1.82V, assuming you have an accurate voltmeter.


Why not add a 1.8V zener just in case?

Thanks, that seems like the safest approach.