STM32G0B1 DMA Capture Delay Issue with TIM1 Input Frequency Measurement
Hello ST Team,
We are currently working with the STM32G0B1KCUX series MCU, which has 144 KB RAM and 256 KB Flash.
Our objective is to measure the frequency of an incoming signal using TIM1 in input capture mode with DMA (channel 2). We want to capture the frequency in DMA mode without any interrupt callbacks, and we need a way to synchronize with transfer completion so that the captured data is valid. The timer settings and a code snippet are attached; the key configuration and issue details are below.
Timer Configuration:
- Timer: TIM1 (16-bit)
- Clock Frequency: 64 MHz (no prescaler)
- DMA: used to capture values without interrupts
- Slave Mode: reset mode, triggered by TI1FP1
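For context, here is a minimal register-level sketch of the configuration above (bit values taken from the RM0444 reference manual; `tim_mock_t` and `tim1_ic_dma_config` are illustrative host-side stand-ins for TIM1, not our actual code):

```c
#include <stdint.h>

/* Host-side stand-in for the TIM1 registers touched below; on target these
 * writes go to the real TIM1 (CMSIS TIM_TypeDef). Bit values per RM0444. */
typedef struct { uint32_t CCMR1, CCER, SMCR, DIER, PSC, ARR; } tim_mock_t;

static void tim1_ic_dma_config(tim_mock_t *tim)
{
    tim->PSC   = 0;            /* no prescaler: counter runs at 64 MHz */
    tim->ARR   = 0xFFFF;       /* free-running 16-bit counter */
    tim->CCMR1 = (1u << 0);    /* CC1S = 01: IC1 mapped on TI1 */
    tim->CCER  = (1u << 0);    /* CC1E = 1: capture enabled, rising edge */
    tim->SMCR  = (4u << 0)     /* SMS = 100: slave reset mode */
               | (5u << 4);    /* TS = 00101: trigger input = TI1FP1 */
    tim->DIER  = (1u << 9);    /* CC1DE = 1: DMA request on capture/compare 1 */
    /* The DMA channel (channel 2 in our setup) is routed to the TIM1_CH1
     * request through DMAMUX and copies TIM1->CCR1 into a RAM buffer. */
}
```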
Input Signal:
- Frequency Range: 15.8 kHz to 16 kHz
- Signal Period: approximately 62.5 µs to 63.3 µs
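As a sanity check (not part of the attached code): with the 64 MHz clock and reset-on-TI1FP1 slave mode, each captured value should equal one signal period in timer ticks, so the expected capture range can be computed directly:

```c
#include <stdint.h>

#define TIMER_CLOCK_HZ 64000000u  /* TIM1 kernel clock, no prescaler */

/* In reset slave mode (trigger = TI1FP1) the counter restarts on every rising
 * edge, so each captured CCR1 value is one full signal period in ticks. */
static uint32_t ticks_for_freq(uint32_t freq_hz)
{
    return TIMER_CLOCK_HZ / freq_hz;
}

static uint32_t freq_from_capture(uint32_t capture_ticks)
{
    return TIMER_CLOCK_HZ / capture_ticks;
}
```

This gives 4000 ticks at 16 kHz and about 4050 ticks at 15.8 kHz, comfortably within the 16-bit counter range.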
Observation:
- When we check the DMA_ISR_TCIF2 flag before reading, the interval between frequency measurements varies inconsistently: sometimes 1 ms, 2 ms, or 3 ms.
- If we remove the transfer-complete flag check, capture reads occur at a rate matching the input signal, i.e., much faster and more consistently.
The Issue:
We do not understand why adding this DMA transfer-complete flag check:

    if (!(DMA1->ISR & DMA_ISR_TCIF2))
        return 0;
    DMA1->IFCR = DMA_ISR_TCIF2; /* clear the flag */

results in millisecond-level delays between readings, even though the input signal arrives at ~63 µs intervals. Without this check, data comes in much faster, matching expectations.
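For clarity, the polling pattern reduces to the sketch below (`dma_mock_t` and `capture_ready` are illustrative host-side stand-ins, not our actual code; on hardware the flag is cleared by writing IFCR). One point we would like confirmed: TCIF2 is set only when the entire programmed transfer count has completed, not on every individual capture.

```c
#include <stdint.h>

#define DMA_ISR_TCIF2 (1u << 5)  /* channel 2 transfer-complete flag */

/* Host-side mock of the DMA status register; on target this is DMA1->ISR,
 * and the flag is cleared by writing the corresponding IFCR bit. */
typedef struct { volatile uint32_t ISR; } dma_mock_t;
static dma_mock_t dma;

/* The non-blocking read pattern from the snippet above: return 0 until the
 * transfer-complete flag is set, then clear it and report success. */
static int capture_ready(dma_mock_t *d)
{
    if (!(d->ISR & DMA_ISR_TCIF2))
        return 0;
    d->ISR &= ~DMA_ISR_TCIF2;  /* real hardware: DMA1->IFCR = DMA_IFCR_CTCIF2 */
    return 1;
}
```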