How can an STM32 ADC report a higher voltage than actual? - internal temperature sensor testing
Hello
While adding the internal temperature sensor reading to the STM32H750 FW, out of curiosity I collected voltage readings at different sampling times.
Based on our ADC clock speed (37.5 MHz) and the datasheet requirement for the temperature sensor (minimum sampling time 9 us), I calculated that the minimum number of ADC sampling cycles is 337.5.
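For reference, the cycle calculation is just clock frequency times sampling time. A minimal sketch (the 37.5 MHz clock and 9 us figure are from my setup and the datasheet as stated above):

```c
#include <stdio.h>

/* Minimum ADC sampling cycles = ADC clock frequency * required sampling time.
   Example values: 37.5 MHz ADC clock, 9 us minimum sampling time for the
   temperature sensor per the STM32H750 datasheet. */
static double min_sampling_cycles(double adc_clk_hz, double t_sample_s)
{
    return adc_clk_hz * t_sample_s;
}

/* min_sampling_cycles(37.5e6, 9e-6) -> 337.5 cycles */
```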
This works out well: with sufficient sampling time provided, I obtain a plausible temperature reading of 27 C.
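In case it matters for the discussion, this is roughly how I convert the raw counts to degrees, using the usual two-point factory calibration formula. A hedged sketch: I pass the TS_CAL values in as parameters rather than hard-coding their flash addresses, and the 30 C / 110 C calibration temperatures are what I believe apply to the STM32H7 family; please verify against your own datasheet:

```c
#include <stdint.h>

/* Assumed calibration temperatures for the STM32H7 factory TS_CAL points;
   check the device datasheet before relying on these. */
#define TS_CAL1_TEMP 30.0f   /* degC at which TS_CAL1 was measured */
#define TS_CAL2_TEMP 110.0f  /* degC at which TS_CAL2 was measured */

/* Linear interpolation between the two factory calibration points.
   ts_data is the raw ADC reading of the temperature sensor channel
   (same resolution and reference voltage as the calibration values). */
static float ts_to_celsius(uint16_t ts_data, uint16_t ts_cal1, uint16_t ts_cal2)
{
    return (TS_CAL2_TEMP - TS_CAL1_TEMP)
           * (float)((int32_t)ts_data - (int32_t)ts_cal1)
           / (float)((int32_t)ts_cal2 - (int32_t)ts_cal1)
           + TS_CAL1_TEMP;
}
```

Note the casts to `int32_t`: subtracting two `uint16_t` values directly would wrap around if the reading falls below the first calibration point.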

Now it seems understandable that when not enough sampling time is provided, the internal SAR capacitors do not have enough time to charge, and as a result the converted input voltage reads lower than actual.
However, I am curious about the result for the 64.5-cycle sampling time, where the resulting voltage is slightly greater than actual (all the results are repeatable, i.e. I can run the same FW in a loop and always get roughly the same counts).
Looking at the ADC description in AN2834 section 2.1, we can see that the final SAR result depends on a series of comparisons, and the position and number of capacitors used in each approximation step depend on the input voltage and the previous step results.
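To make sure I understand the AN2834 description correctly, here is a toy model of that successive-approximation search. This deliberately ignores the charge-redistribution details (it treats each step as an ideal comparison against a binary-weighted trial voltage), but it shows how every bit decision depends on the input and on all previous decisions:

```c
/* Toy SAR model: at each step the next binary-weighted "capacitor"
   (vref/2, vref/4, ...) is tentatively added to the trial voltage and the
   comparator decides whether to keep that bit.  An ideal comparator is
   assumed; real charge-sharing effects are not modelled. */
static unsigned sar_convert(double vin, double vref, int bits)
{
    unsigned code = 0;
    double trial = 0.0;

    for (int b = bits - 1; b >= 0; b--) {
        double step = vref / (double)(1u << (bits - b)); /* vref/2, vref/4, ... */
        if (trial + step <= vin) {  /* comparator says: keep this bit */
            trial += step;
            code |= 1u << b;
        }
    }
    return code;
}

/* For ~600 mV with vref = 3.3 V at 12 bits this yields code 744,
   i.e. floor(0.6 / 3.3 * 4096). */
```

In this idealized model the result is always floor(vin/vref * 2^bits); the interesting part of my question is what happens when the tentative step voltages themselves are wrong because the capacitors did not fully charge.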
Is it simply chance that in this case the voltage reading is higher than expected, and it could go either way since any of the capacitors may not have charged as expected? And how could the ADC error be related to the lower bits (smaller capacitors) if in theory they should charge faster than the largest capacitor required to detect ~600 mV?
