Start DAC and ADC at the same time
Hello,
I am using a NUCLEO-H743ZI2 and program in STM32CubeIDE 1.13.2.
The purpose of my project is to generate a 2 kHz sine and to sample it at 40 kHz at the same time.
I would like to have no delay between the 12-bit DAC and the 12-bit ADC: the first point of my sampled sine should be the first point of the buffer read by the DAC, i.e. a phase of 0°.
In order to start the ADC and DAC at the same time I use TIM1, running on the 32 MHz internal clock, which after a delay of 2 seconds fires a TRGO event (one-pulse mode).
TIM2 is the timer which triggers the DAC; it is started by that TRGO and makes the DMA feed the DAC at 400 kHz (prescaler = 0, counter period = 79).
TIM3 is the timer which triggers the ADC; it is also started by the TRGO and makes the ADC convert and write into the DMA buffer at 40 kHz (prescaler = 0, counter period = 799).
The ADC runs at 1.2 MHz, 12 bits, 2.5 sampling cycles, prescaler 1.
I generate the sine over 200 points to achieve the 2 kHz frequency.
I write the ADC values into a buffer of 40000 points, so one buffer is filled in one second.
I copy the values of the ADC DMA buffer into a big buffer and send all the data after acquiring the signal for 3 seconds.
buffer1time is a variable I added because the very first 2 points of the ADC values are 0 (I guess the ADC was not "ready" yet), so I start recording one second later; since the sine frequency is a multiple of 1 Hz, the phase at 1 s should be the same as at 0 s.
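For reference, the trigger wiring described above would look roughly like this in HAL terms (a sketch assuming the usual CubeMX-generated handle names htim1/htim2/htim3, and that ITR0 of TIM2/TIM3 maps to TIM1's TRGO, which should be checked against the H743 reference manual):

```c
/* Sketch only, not my exact generated code.
 * TIM1: one-pulse master, TRGO on update after the ~2 s delay.
 * TIM2/TIM3: slaves started in trigger mode on ITR0 (= TIM1 TRGO). */
TIM_MasterConfigTypeDef master = {0};
master.MasterOutputTrigger = TIM_TRGO_UPDATE;       /* TRGO fires on update */
master.MasterSlaveMode     = TIM_MASTERSLAVEMODE_ENABLE;
HAL_TIMEx_MasterConfigSynchronization(&htim1, &master);

TIM_SlaveConfigTypeDef slave = {0};
slave.SlaveMode    = TIM_SLAVEMODE_TRIGGER;         /* start counting on trigger */
slave.InputTrigger = TIM_TS_ITR0;                   /* ITR0 = TIM1 TRGO here */
HAL_TIM_SlaveConfigSynchro(&htim2, &slave);
HAL_TIM_SlaveConfigSynchro(&htim3, &slave);
```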
#define DMAsize 40000
#define nbDMAsize 3
uint16_t buffer_adc[DMAsize] = {0};
uint16_t buffer_adc2[nbDMAsize * DMAsize] = {0};
uint16_t bufferCounter = 0;
uint16_t buffer1time = 0;
#define NSsinus 200
.....
uint32_t sinusDAC[NSsinus];
uint32_t amplitude = (uint32_t)floor(powf(2, 11) - 1); /* 2047 */
uint32_t offset = (uint32_t)floor(powf(2, 11));        /* 2048 */
for (indice = 0; indice < NSsinus; indice++) {
    sinusDAC[indice] = floor(amplitude * sin(2 * M_PI * indice / NSsinus) + offset);
}
HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED);
HAL_ADC_Start_DMA(&hadc1, (uint32_t *)buffer_adc, DMAsize);
HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, (uint32_t *)sinusDAC, NSsinus, DAC_ALIGN_12B_R);
HAL_TIM_Base_Start(&htim1);
/* USER CODE BEGIN WHILE */
while (1)
{
    HAL_Delay(3000);
    if (bufferCounter >= nbDMAsize * 2) {
        int nbTotalSizeToSend = nbDMAsize * DMAsize * 2; /* bytes (16-bit samples) */
        int maxSizePerPacket = 1 << 14;                  /* 16384-byte packets     */
        int nbHeap = nbTotalSizeToSend / maxSizePerPacket;
        int restHeap = nbTotalSizeToSend % maxSizePerPacket;
        int indice = 0;
        for (indice = 0; indice < nbHeap; indice++) {
            HAL_UART_Transmit(&huart3, (uint8_t *)&buffer_adc2[indice * maxSizePerPacket / 2], (uint16_t)maxSizePerPacket, HAL_MAX_DELAY);
        }
        HAL_UART_Transmit(&huart3, (uint8_t *)&buffer_adc2[nbHeap * maxSizePerPacket / 2], (uint16_t)restHeap, HAL_MAX_DELAY);
        HAL_Delay(1000000);
    }
}
/* USER CODE END WHILE */
//callback when the ADC DMA buffer is half full
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef* hadc) {
    int indice = 0;
    if (buffer1time > 0) {
        //just fill one time
        if (bufferCounter < nbDMAsize * 2) {
            //copy into the big buffer so everything can be sent at once
            for (indice = 0; indice < DMAsize / 2; indice++) {
                buffer_adc2[indice + bufferCounter * DMAsize / 2] = buffer_adc[indice];
            }
            bufferCounter++;
        }
    }
    else {
        buffer1time++;
    }
}
//callback when the ADC DMA buffer is full
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {
    int indice = 0;
    if (buffer1time > 1) {
        if (bufferCounter < nbDMAsize * 2) {
            //copy into the big buffer so everything can be sent at once
            for (indice = 0; indice < DMAsize / 2; indice++) {
                buffer_adc2[indice + bufferCounter * DMAsize / 2] = buffer_adc[indice + DMAsize / 2];
            }
            bufferCounter++;
        }
    }
    else {
        buffer1time++;
    }
}
After downloading the data, the one-sided FFT spectrum shows a peak at 2 kHz.
However the phase of that peak (so the phase of the first point of my sine) is not 0° but 260°.
At 2 kHz this corresponds to a 361 µs delay (or a 139 µs advance), modulo the 500 µs period.

This is confirmed in the time series: the first point is at 1470 instead of 2048, since the sine is centered around 2048.

In order to know which one is delayed with respect to the other, I made another experiment.
I wrote a 1 s ramp going from the value 1000 to 3000, in order to avoid any saturation near 0 and 4095 in the 12-bit DAC.
#define NSramp 2000
uint32_t rampeDAC[NSramp];
for (indice = 0; indice < NSramp; indice++) {
    rampeDAC[indice] = indice + 1000;
}
And I changed TIM2 to a counter period of 799 so the signal lasts 1 second.
The output shows the ramp lasting 1 s.
However the value 1000 doesn't start exactly at 0, 1, 2 s but a little earlier, by 17 samples at 40 kHz, which means 425 µs.

If I keep all the same parameters for the timers but change the ADC frequency, this delay changes:
fADC = 1.2 MHz, delay = -17/40000 s = -425 µs
fADC = 1.5 MHz, delay = -7/40000 s = -175 µs
fADC = 2 MHz, delay = +1/40000 s = +25 µs
fADC = 3 MHz, delay = +10/40000 s = +250 µs
fADC = 6 MHz, delay = +20/40000 s = +500 µs
The delay also changes for the sine (DAC updated at 400 kHz):
fADC = 1.2 MHz, delay = 261° = 362 µs
fADC = 1.5 MHz, delay = 254° = 352 µs
fADC = 2 MHz, delay = 250° = 347 µs
I also tried with the sine played at a 40 kHz DAC update rate (a little ugly, because only 20 points make one period):
fADC = 1.2 MHz, delay = 216° = 300 µs
fADC = 1.5 MHz, delay = 216° = 300 µs
fADC = 2 MHz, delay = 234° = 325 µs
So I don't know how to conclude on the effect of the ADC frequency. I thought that as long as it is faster than the timer trigger frequency it would be fine, but it seems to have a huge impact on the delay between ADC and DAC.
How is it possible to make a zero delay between my DAC and ADC?
Thank you
