STM32G0 internal temperature sensor reads too high: simple offset or full two-point calibration?
Hello,
we are seeing a systematic offset when using the internal temperature sensor of the STM32G031G8U6.
Over the years we have tested ~50 MCUs from different production batches, and all devices show a very consistent reading that is about 2–3 °C too high at room temperature compared to an external reference; higher temperatures were not tested.
Because of this, we are considering programming a per-device temperature offset during flashing. Before doing so, we would like to understand whether a single-point offset calibration is sufficient, or whether the internal slope is not accurate enough and a full two-point calibration is required.
Example:
If the MCU reports 24 °C but the real temperature is 21 °C, can we simply apply a +3 °C offset globally?
Or is the internal temperature slope not guaranteed well enough, meaning that a two-point calibration is required?
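For illustration, the single-point variant we have in mind would look roughly like this; `OFFSET_DEG_C` and `corrected_temp` are made-up names, and the offset value would be determined per device at flashing time:

```c
#include <stdint.h>

/* Hypothetical single-point correction: OFFSET_DEG_C would be measured per
 * device during production (e.g. -3 when the part reads 24 C at a true
 * 21 C) and stored in flash. Names are made up for this sketch. */
static const float OFFSET_DEG_C = -3.0f; /* subtract the observed excess */

static float corrected_temp(float reported_deg_c)
{
    return reported_deg_c + OFFSET_DEG_C;
}
```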
More specifically, regarding the factory calibration values on STM32G0:
TS_CAL1: ADC raw value measured at 30 °C, tolerance ±5 °C
TS_CAL2: ADC raw value measured at 130 °C, tolerance ±5 °C
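For reference, this is the two-point conversion we would use, following the linear interpolation described in the STM32G0 reference manual (RM0444). The calibration addresses and the 3.0 V calibration VDDA are taken from the G0 datasheet as we understand it (please correct us if they are wrong); the function itself is just a sketch:

```c
#include <stdint.h>

/* Factory calibration addresses for STM32G0 (verify against the
 * "Temperature sensor calibration values" table in your datasheet). */
#define TS_CAL1_ADDR ((const uint16_t *)0x1FFF75A8UL) /* raw at 30 C,  VDDA = 3.0 V */
#define TS_CAL2_ADDR ((const uint16_t *)0x1FFF75CAUL) /* raw at 130 C, VDDA = 3.0 V */

#define TS_CAL1_TEMP 30.0f
#define TS_CAL2_TEMP 130.0f

/* Two-point conversion: linear interpolation between the two factory
 * points. vdda_mV rescales the raw sample when the ADC reference differs
 * from the 3.0 V used during factory calibration. */
static float temp_from_raw(uint16_t ts_data, uint16_t cal1, uint16_t cal2,
                           float vdda_mV)
{
    float scaled = (float)ts_data * vdda_mV / 3000.0f; /* refer back to 3.0 V */
    return (TS_CAL2_TEMP - TS_CAL1_TEMP) / (float)(cal2 - cal1)
           * (scaled - (float)cal1) + TS_CAL1_TEMP;
}
```

On the target, `cal1`/`cal2` would be read as `*TS_CAL1_ADDR` / `*TS_CAL2_ADDR`.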
My question is about how to interpret these tolerances.
Does the ±5 °C tolerance mean that both calibration points share the same temperature offset (e.g. TS_CAL1 measured at 27 °C and TS_CAL2 measured at 127 °C)?
Or is it possible that the errors are independent (e.g. TS_CAL1 at 25 °C and TS_CAL2 at 135 °C)?
If the second case is possible, then the effective slope could be significantly off, and a user two-point calibration would be required.
If the first case is true (correlated offset), then a simple offset correction would likely be sufficient — especially since the datasheet specifies a linearity of around ±1 °C.
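To quantify what the second (independent) case would mean, here is a small arithmetic sketch with no hardware access: it computes what the standard two-point formula would report if the factory points had really been acquired at 30 + e1 °C and 130 + e2 °C.

```c
#include <stdint.h>

/* Temperature the firmware would report if the factory calibration points
 * were really acquired at (30 + e1) C and (130 + e2) C, while the standard
 * conversion formula still assumes exactly 30 C and 130 C. */
static float reported_temp(float true_T, float e1, float e2)
{
    float t1 = 30.0f + e1;  /* actual temperature behind TS_CAL1 */
    float t2 = 130.0f + e2; /* actual temperature behind TS_CAL2 */
    /* the firmware maps the raw span linearly onto [30, 130] */
    return 30.0f + (true_T - t1) * 100.0f / (t2 - t1);
}
```

With the worst independent case e1 = -5, e2 = +5, the assumed 100 °C span really covers 110 °C (about 10 % slope error), yet the absolute error stays within ±5 °C inside the calibrated range: a true 25 °C reads as 30 °C, 80 °C reads exactly, and 135 °C reads as 130 °C.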
Could someone from ST please clarify how these tolerances should be interpreted, and whether a single-point offset correction is considered acceptable for STM32G0 internal temperature sensing?
Thank you very much.
