ADC - TIMER - Sample rate confusion?
Hi, I need a little help understanding what is wrong with my reasoning.
I am using an STM32F411, reading two ADC inputs in scan mode over circular DMA, triggered by the TIM2 update event.
TIM2 triggers the ADC at 100 kHz, so I get a 50 kHz sample rate per ADC input. I connected a signal generator feeding a 1 Vrms, 20 kHz sine to both ADC channels.
Everything works OK: I printf both channels over USB-CDC and can see two sinusoidal signals on the serial plotter. I am using this serial plotter:
https://web-serial-plotter.atomic14.com/
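For context, the trigger/DMA setup looks roughly like this (a simplified sketch with CubeMX-style HAL calls; the handle names `hadc1`/`htim2` and the buffer size are placeholders, not my exact code):

```c
#include "stm32f4xx_hal.h"

extern ADC_HandleTypeDef hadc1; /* ADC1: scan mode, 2 channels, ext. trigger = TIM2 TRGO */
extern TIM_HandleTypeDef htim2; /* TIM2: update event as TRGO, period set for 100 kHz */

#define ADC_BUF_LEN 256
static uint16_t adc_buf[ADC_BUF_LEN]; /* interleaved CH0, CH1, CH0, CH1, ... */

void start_sampling(void)
{
    /* Circular DMA: each TIM2 update event triggers one scan of both channels */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, ADC_BUF_LEN);
    HAL_TIM_Base_Start(&htim2);
}
```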
Now the problem and confusion starts here:
If I change only the TIM2 frequency to 10 kHz, the sample rate also drops 10x (now 5 kHz per channel, all verified with a logic analyzer), but I can still see a clean 20 kHz sine on the serial plotter, without any degradation, and this contradicts the Nyquist theorem...
How is this possible, and what am I missing?
