Inaccurate µs delay function
Hi,
I'm trying to implement a `delay(microseconds)` function on an STM32F745 running at 216 MHz.
I want to keep it dumb and simple, without involving interrupts. My strategy is as follows:
- Set up TIM2 to run at 1 MHz. Since it has a 32-bit counter, it will overflow only after a bit more than an hour, which is enough for my application.
- Implement `delay()` as follows:
```cpp
inline void delay(const uint32_t wait_time_us)
{
    volatile const uint32_t t_start = TIM2->CNT;
    while ((TIM2->CNT - t_start) < wait_time_us);
}
```

After implementing it, I test it by simply turning a GPIO pin on and off every 100 ms:
```cpp
GPIOC->BSRR = GPIO_BSRR_BR_9;
delay(100'000U);
GPIOC->BSRR = GPIO_BSRR_BS_9;
delay(100'000U);
```

However, when I inspect it with my logic analyzer (a typical 24 MHz, 8-channel one), I measure 99.8 ms instead of 100 ms. That's a difference of 200 µs, i.e. 200 timer ticks that were missed. And the CPU should be running at 216 MHz.
If I try other delay values (e.g. 100 µs) I don't get exact timing either, but the difference is not the same 200 µs as before; it's something else (smaller). So it doesn't seem to be an off-by-x kind of problem.
I have verified that TIM2 is set up correctly with this code (inside an infinite while loop):
```cpp
if (TIM2->CNT & 0x0000'0001)
{
    GPIOC->BSRR = GPIO_BSRR_BR_9;
}
else
{
    GPIOC->BSRR = GPIO_BSRR_BS_9;
}
```

In the logic analyzer I see a perfect 1 µs toggle the whole time.
So, what can be wrong with my delay function? I've also tried inlining the code to rule out function-call overhead, but the result is the same: the larger the delay, the larger the difference. I'm compiling with gcc 8.3 at -O3, C++14.
Thanks!
