HAL_Delay() inaccuracy
Hi folks,
I recently started my STM32 journey with an STM32WLE5JCxx (the LoRa-E5). After fighting with the SubGHz peripheral for a while, I managed to write a decent application, but I'm not completely familiar with all of the essentials yet.
Currently, I am using the `HAL_Delay()` function, but after some testing I found that it is highly inaccurate. My test snippet:
```c
int i = 0;
while (1) {
    printf("%d\n", i++);
    HAL_Delay(100000); // 100,000 ms = 100 s per iteration
}
```
The output I get (with each interval's deviation from the expected 100,000 ms appended in milliseconds):
```
13:26:41.925 > 0
13:28:22.323 > 1 (+398)
13:30:02.318 > 2 (-5)
13:31:42.309 > 3 (-9)
13:33:22.290 > 4 (-19)
13:35:02.236 > 5 (-54)
13:36:42.118 > 6 (-118)
13:38:21.961 > 7 (-157)
13:40:01.796 > 8 (-165)
13:41:41.623 > 9 (-173)
13:43:21.431 > 10 (-192)
```
This means that over each 100-second interval, the delay is off by anywhere from roughly -200 ms to +400 ms, i.e. an error of up to about 0.4%. A fixed offset I could compensate for, but this varying drift makes the delay unworkable for me.
I did read that `HAL_Delay()` might not be the most accurate timing source, but I am unsure how to implement a 'simple' delay function myself. If someone could help me with a snippet, or point me to an explanation of how to build an accurate delay, that would be very much appreciated.
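For reference, the direction I've been looking at is the Cortex-M4 DWT cycle counter, which I understand the WLE5's core has. Here is a rough sketch of what I have in mind; the names `dwt_delay_init()` and `dwt_delay_us()` are just my own placeholders, and I'm assuming `SystemCoreClock` reflects the actual core frequency:

```c
#include "stm32wlxx_hal.h" // pulls in the CMSIS core definitions (CoreDebug, DWT)

/* Call once at startup: enable the trace block and start the cycle counter. */
static void dwt_delay_init(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk; // enable DWT access
    DWT->CYCCNT = 0;                                // reset the counter
    DWT->CTRL |= DWT_CTRL_CYCCNTENA_Msk;            // start counting cycles
}

/* Busy-wait for roughly the given number of microseconds.
   Counter wrap-around is handled by the unsigned subtraction.
   Note: at 48 MHz this overflows for delays beyond ~89 s, so very
   long waits would need to be split into shorter chunks. */
static void dwt_delay_us(uint32_t us)
{
    uint32_t start = DWT->CYCCNT;
    uint32_t ticks = us * (SystemCoreClock / 1000000U);
    while ((DWT->CYCCNT - start) < ticks) {
        /* spin */
    }
}
```

Though I suspect any busy-wait like this can only ever be as accurate as the clock feeding it, so maybe the drift I'm seeing comes from the oscillator itself rather than from `HAL_Delay()`?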
Cheers!
