While testing the ADC on our target board, I see that the converted result often varies quite a bit, or occasionally jumps to a completely wrong value. How can this be? The circuit is bandwidth limited to 250 Hz and the prescaler is configured for a sample rate of 500 Hz.
At one input, the source is a DC power supply trimmed to output exactly 2.500 V. The other input is connected to ground through a 10k resistor (to calculate the gain and offset). Both inputs are of course bandwidth limited to 250 Hz. The variations are most of the time only a few LSBs, but sometimes the converted value jumps to a completely wrong value. After studying the graph you attached, I can see what happens: the 0 V input is measured as a value slightly below zero, and in two's complement that negative code, read back as unsigned, comes out as a large (over 2.5 V) value. I've solved the problem by doing several (100) conversions and then averaging the result. May I ask why the ADC outputs a two's-complement result? What are the benefits compared to a "normal" ADC?
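(For reference, a minimal sketch of the averaging approach. The adc_read() function and the 16-bit result width are placeholders for whatever your part's driver actually provides; the point is to accumulate the raw result as a signed value.)

#include <stdint.h>

/* Hypothetical driver call: returns the raw two's-complement
 * conversion result as a signed value. */
extern int16_t adc_read(void);

/* Average n conversions. Accumulating as signed is the key point:
 * a reading of -1 LSB averages out near zero instead of being
 * misread as a large unsigned code near full scale. */
int16_t adc_read_averaged(unsigned n)
{
    int32_t sum = 0;
    for (unsigned i = 0; i < n; i++)
        sum += adc_read();
    return (int16_t)(sum / (int32_t)n);
}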
Normally there is no big difference between a two's-complement result and a straight-binary result, but there is a design choice related to the Vref value selection. With best regards, Hich
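(To illustrate the difference: a two's-complement output spans a bipolar range around mid-scale, so a reading a fraction of an LSB below 0 V comes out as a small negative code; read those same bits as straight binary and they look like a value near full scale, which matches the jumps to over 2.5 V described above. A sketch, with VREF and the result width as assumed placeholder values:)

#include <stdint.h>

#define VREF_VOLTS 2.5   /* assumed reference voltage */
#define ADC_BITS   16    /* assumed result width */

/* Bipolar, two's-complement result: codes -2^(N-1)..2^(N-1)-1
 * span -VREF..+VREF. */
double bipolar_code_to_volts(int16_t code)
{
    return (double)code * VREF_VOLTS / (double)(1L << (ADC_BITS - 1));
}

/* "Normal" unipolar, straight-binary result: codes 0..2^N-1
 * span 0..VREF. */
double unipolar_code_to_volts(uint16_t code)
{
    return (double)code * VREF_VOLTS / (double)(1UL << ADC_BITS);
}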