Achieving the theoretical ADC Conversion Time (12-bit) on STM32G4
Hello,
I am using a NUCLEO-G431KB with IAR EWARM 9.20.2 and trying to achieve a 250 ns ADC conversion time on ADC1 (IN1, Single-Ended mode).
Configuration: 12-bit resolution, 2.5 cycles sampling time, ADC clock at 60 MHz (synchronous/2, SYSCLK 120 MHz, HCLK 60 MHz).
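For reference, the ADC is set up with the usual CubeMX-generated HAL code, roughly as below (a sketch matching the settings above, not my exact generated file, so field values may differ slightly):

```c
/* Sketch of the ADC1 setup matching the configuration above
   (CubeMX-style G4 HAL code; reproduced from memory). */
hadc1.Instance = ADC1;
hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV2;  /* 120 MHz HCLK / 2 = 60 MHz ADC clock */
hadc1.Init.Resolution = ADC_RESOLUTION_12B;
hadc1.Init.ContinuousConvMode = DISABLE;               /* one conversion per software start */
HAL_ADC_Init(&hadc1);

ADC_ChannelConfTypeDef sConfig = {0};
sConfig.Channel = ADC_CHANNEL_1;                       /* IN1 */
sConfig.Rank = ADC_REGULAR_RANK_1;
sConfig.SamplingTime = ADC_SAMPLETIME_2CYCLES_5;       /* 2.5 ADC clock cycles */
sConfig.SingleDiff = ADC_SINGLE_ENDED;
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
```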
Theoretical conversion time:
T_conv = T_sampling + T_12-bit = (2.5 + 12.5) cycles / 60 MHz = 15 cycles / 60 MHz = 250 ns
Measured time: 2590 ns (2.59 µs) using interrupt mode (HAL_ADC_Start_IT).
Below is the relevant part of the code: the ADC is started in a loop, and a debug pin (PA9) is toggled at the start call and in the conversion-complete callback:
while (1)
{
  HAL_ADC_Start_IT(&hadc1);                                  /* start a single conversion in interrupt mode */
  HAL_GPIO_WritePin(PA9_GPIO_Port, PA9_Pin, GPIO_PIN_SET);   /* debug pin high: conversion started */
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
  HAL_GPIO_WritePin(PA9_GPIO_Port, PA9_Pin, GPIO_PIN_RESET); /* debug pin low: conversion complete */
  dmaBuffer = HAL_ADC_GetValue(hadc);                        /* read the converted value */
}

Is the delay caused by the HAL functions? On reviewing the definition of HAL_ADC_Start_IT, it appears to involve a lengthy sequence of checks and state handling. Could this overhead be due to unnecessary steps, such as calibration or other housekeeping, being performed on every call? If so, how can I minimize this overhead and achieve the theoretical minimum conversion time?
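For context, my understanding is that after the first HAL_ADC_Start_IT the ADC remains enabled, so subsequent conversions could in principle be retriggered with a single register write instead of the full HAL call. A rough sketch of what I mean (register and bit names from the STM32G4 CMSIS headers; this assumes the ADC is already enabled, calibrated, and configured):

```c
/* Sketch only: retrigger a regular conversion without the HAL wrapper.
   Assumes ADEN is set and the channel/sampling time are already configured. */
ADC1->CR |= ADC_CR_ADSTART;                /* software start of a regular conversion */
while (!(ADC1->ISR & ADC_ISR_EOC)) { }     /* poll end-of-conversion flag */
uint16_t sample = (uint16_t)ADC1->DR;      /* reading DR clears EOC (when DMA is off) */
```

Is something along these lines the intended way to get close to the 250 ns figure, or is there a supported HAL path with less per-call overhead?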
Thank you for your time!
