ADC sampling rate checked by callback
Hello,
I am using a NUCLEO-H743ZI2.
I have read the reference manual (RM0433 Rev 8) and a number of forum topics, but didn't find what I was looking for.
I am trying to check my ADC sampling rate by looking at the timestamps of the half-full/full DMA buffer callbacks. For now I don't have an AWG to generate a sine wave, so I don't know of another way to do it.
To run the experiment, I generate a number on the microcontroller and increase it by 100 every second. It goes to the DAC, and a jumper cable connects the DAC output to ADC1 so I can read the value back.
I want a sampling rate of 10 ksamples/second with an ADC resolution of 12 bits.
I changed the PLL2 settings to get an ADC kernel clock of 1.2 MHz.

Then I tried to work out the link between the ADC frequency and the sampling rate, but didn't find a clear formula.
From what I understood, the formula is something like this:

f_sample = f_ADC / (prescaler × (nbcycle_samplingtime + nbcycle_convertingtime))

with
f_sample = 10 kHz, the rate I want
f_ADC = 1.2 MHz, which I configured in the clock configuration
prescaler = 8, which I chose in the ADC parameters
nbcycle_samplingtime = 8.5, which I chose in the ADC parameters
nbcycle_convertingtime, which depends on the ADC resolution.

Some tutorials on the internet say the conversion time is nbbits_ADC + 0.5 cycles; others say nbbits_ADC/2 + 0.5.
In the reference manual for this board it appears to be nbbits_ADC/2 + 0.5, so I set my parameters under that hypothesis.
I set a DMA buffer of 40000 points; at 10 kS/s it should be half full after 2 seconds and completely full after 4 seconds, giving a 4-second period.
This part of the code initializes the ADC, DAC, and DMA. It also produces the DAC value, which increases by 100 every second.
#define DMAsize 40000
uint16_t buffer_adc[DMAsize] = {0};
uint16_t buffer_adc2[DMAsize] = {0};
int value_dac = 0;

HAL_DAC_Start(&hdac1, DAC_CHANNEL_2);
HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED);
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)buffer_adc, DMAsize);
printf("ADC DMA TEST BEGINS\r\n");
/* USER CODE END 2 */

/* Infinite loop */
/* USER CODE BEGIN WHILE */
while (1)
{
    HAL_DAC_SetValue(&hdac1, DAC_CHANNEL_2, DAC_ALIGN_12B_R, value_dac);
    if (value_dac < 4095 - 100) {
        value_dac += 100;
    } else {
        value_dac = 0;
    }
    HAL_Delay(1000);
    /* USER CODE END WHILE */
    /* USER CODE BEGIN 3 */
}
/* USER CODE END 3 */
In this part of the code I write the callback functions: I display 8 values from the DMA buffer along with a timestamp.
Since the buffer holds 40k points and the sampling rate should be 10 kpoints/s, one quarter of the buffer should change every second: the values labelled N and N+0.5 (half a second apart, within the same DAC step) should therefore be equal.
// Called when the first half of the buffer is filled
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef* hadc) {
    RTC_TimeTypeDef sTime;
    RTC_DateTypeDef sDate;
    int indice = 0;
    for (indice = 0; indice < 8; indice++) {
        buffer_adc2[indice * 5000 + 200] = buffer_adc[indice * 5000 + 200];
    }
    HAL_RTC_GetTime(&hrtc, &sTime, RTC_FORMAT_BIN);
    HAL_RTC_GetDate(&hrtc, &sDate, RTC_FORMAT_BIN); /* must follow GetTime to unlock the shadow registers */
    printf("DMA half full \r\n");
    printf("ADC: 0: %d 0.5:%d | 1: %d 1.5: %d | 2:%d 2.5:%d | 3:%d 3.5:%d \r\n",
           buffer_adc2[200], buffer_adc2[5200], buffer_adc2[10200], buffer_adc2[15200],
           buffer_adc2[20200], buffer_adc2[25200], buffer_adc2[30200], buffer_adc2[35200]);
    printf("Time: %02d:%02d:%02d\r\n\r\n", sTime.Hours, sTime.Minutes, sTime.Seconds);
}

// Called when the buffer is completely filled
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {
    RTC_TimeTypeDef sTime;
    RTC_DateTypeDef sDate;
    int indice = 0;
    for (indice = 0; indice < 8; indice++) {
        buffer_adc2[indice * 5000 + 200] = buffer_adc[indice * 5000 + 200];
    }
    HAL_RTC_GetTime(&hrtc, &sTime, RTC_FORMAT_BIN);
    HAL_RTC_GetDate(&hrtc, &sDate, RTC_FORMAT_BIN);
    printf("DMA completely full -----------------\r\n");
    printf("ADC: 0: %d 0.5:%d | 1: %d 1.5: %d | 2:%d 2.5:%d | 3:%d 3.5:%d \r\n",
           buffer_adc2[200], buffer_adc2[5200], buffer_adc2[10200], buffer_adc2[15200],
           buffer_adc2[20200], buffer_adc2[25200], buffer_adc2[30200], buffer_adc2[35200]);
    printf("Time: %02d:%02d:%02d\r\n\r\n", sTime.Hours, sTime.Minutes, sTime.Seconds);
}
When I connect the NUCLEO board to my laptop, I read the values in a terminal:
ADC DMA TEST BEGINS<\r><\n>
ADC DMA TEST BEGINS<\r><\n>
DMA half full <\r><\n>
ADC: 0: 23 0.5:29 | 1: 101 1.5: 203 | 2:0 2.5:0 | 3:0 3.5:0 <\r><\n>
Time: 12:00:03<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 23 0.5:29 | 1: 101 1.5: 203 | 2:302 2.5:402 | 3:501 3.5:599 <\r><\n>
Time: 12:00:07<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 699 0.5:799 | 1: 899 1.5: 994 | 2:302 2.5:402 | 3:501 3.5:599 <\r><\n>
Time: 12:00:11<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 699 0.5:799 | 1: 899 1.5: 994 | 2:1098 2.5:1194 | 3:1297 3.5:1396 <\r><\n>
Time: 12:00:15<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 1496 0.5:1595 | 1: 1693 1.5: 1796 | 2:1098 2.5:1194 | 3:1297 3.5:1396 <\r><\n>
Time: 12:00:19<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 1496 0.5:1595 | 1: 1693 1.5: 1796 | 2:1897 2.5:1991 | 3:2103 3.5:2199 <\r><\n>
Time: 12:00:23<\r><\n>
As you can see, the values change every 1/8th of the DMA buffer, which means the sampling rate is not what it is supposed to be. Also, the buffer fill period is approximately 8 seconds instead of 4.
Now I suppose the hypothesis nbcycle_convertingtime = nbbits_ADC/2 + 0.5 is wrong,
and that instead nbcycle_convertingtime = nbbits_ADC + 0.5.
So I keep my previous parameters but change the sampling time to 2.5 cycles.
The results are here:
ADC DMA TEST BEGINS<\r><\n>
ADC DMA TEST BEGINS<\r><\n>
DMA half full <\r><\n>
ADC: 0: 28 0.5:29 | 1: 23 1.5: 103 | 2:0 2.5:0 | 3:0 3.5:0 <\r><\n>
Time: 12:00:02<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 28 0.5:29 | 1: 23 1.5: 103 | 2:199 2.5:203 | 3:299 3.5:303 <\r><\n>
Time: 12:00:04<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 399 0.5:501 | 1: 500 1.5: 603 | 2:199 2.5:203 | 3:299 3.5:303 <\r><\n>
Time: 12:00:07<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 399 0.5:501 | 1: 500 1.5: 603 | 2:599 2.5:698 | 3:803 3.5:802 <\r><\n>
Time: 12:00:09<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 901 0.5:901 | 1: 997 1.5: 1093 | 2:599 2.5:698 | 3:803 3.5:802 <\r><\n>
Time: 12:00:11<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 901 0.5:901 | 1: 997 1.5: 1093 | 2:1095 2.5:1194 | 3:1194 3.5:1295 <\r><\n>
Time: 12:00:14<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 1393 0.5:1396 | 1: 1495 1.5: 1492 | 2:1095 2.5:1194 | 3:1194 3.5:1295 <\r><\n>
Time: 12:00:16<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 1393 0.5:1396 | 1: 1495 1.5: 1492 | 2:1593 2.5:1696 | 3:1698 3.5:1797 <\r><\n>
Time: 12:00:19<\r><\n>
It looks better, but it is still not exactly right: the buffer fill period is closer to 5 seconds than 4. Also, values within the same quarter of the buffer sometimes differ.
So I have several questions:
- What is the correct formula linking the parameters I just mentioned?
- What is the conversion time, in ADC clock cycles, for my NUCLEO-H743ZI2?
- How can I get my 10k samples per second?
- Is PLL2 used only for the ADC clock on this board, or is it also connected to other peripherals? (I didn't find this in the clock configuration or in the reference manual.)
Thank you

