Graduate
February 1, 2025
Solved

ATOMIC_SET_BIT() stopping code execution

  • February 1, 2025
  • 3 replies
  • 4081 views

Hi,

I am using the STM32G474 MCU and ran into a problem with my code stopping execution.

In my application, I have a level 0 priority interrupt that takes 1.34us to execute and occurs once every 6.66us.  In the final code, I will need 3 of these running, 1 every 2.2us (still leaving about 37% of the MCU bandwidth for other tasks).
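As a sanity check on those numbers (a small illustrative calculation, not from the original post): with three converters staggered so one 1.34us ISR fires every 2.22us, the ISRs alone consume about 60% of the CPU.

```c
#include <stdio.h>

/* CPU load of the control interrupts: each ISR runs for isr_us and one
 * fires every period_us (three converters staggered across the 6.66us
 * PWM period means one ISR every 2.22us). */
static double isr_load(double isr_us, double period_us) {
    return isr_us / period_us;
}
```

isr_load(1.34, 2.22) is roughly 0.60, leaving about 40% of the CPU before interrupt entry/exit overhead is counted, which is consistent with the "about 37%" figure above.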

When I have 2 of these interrupts every 6.66us, all seems to work fine, but when I add a third, my code stops executing.  After a lot of debugging, I have found that my code is stuck at the following line of code:

 

 ATOMIC_SET_BIT(huart->Instance->CR3, USART_CR3_TXFTIE);

in the HAL_UART_Transmit_IT() function.  Normally, when writing low-level interrupt handlers for serial I/O and I need to change the UART interrupt flags, I would disable interrupts, make the change, then enable interrupts again.  That approach is clearly safe on a single-core MCU like the STM32G474.  However, when I examined the ST driver code for this function, it uses the ATOMIC_SET_BIT() macro, which I am not familiar with.
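For reference, the traditional critical-section approach described above can be sketched like this. It is a host-compilable illustration: the primask helpers stand in for the CMSIS intrinsics `__get_PRIMASK()`/`__disable_irq()`/`__set_PRIMASK()` you would use on the Cortex-M4.

```c
#include <stdint.h>

/* Stand-ins for the CMSIS intrinsics, stubbed so the sketch compiles
 * off-target. On the MCU these map to the PRIMASK register. */
static uint32_t primask_state = 0;
static uint32_t get_primask(void)       { return primask_state; }
static void     disable_irq(void)       { primask_state = 1; }
static void     set_primask(uint32_t p) { primask_state = p; }

/* Traditional single-core approach: mask interrupts, do the
 * read-modify-write, then restore the previous mask state (so it is
 * safe to call from code that already has interrupts disabled). */
static void set_bit_critical(volatile uint32_t *reg, uint32_t bit) {
    uint32_t primask = get_primask(); /* remember current mask state */
    disable_irq();                    /* enter critical section */
    *reg |= bit;                      /* non-exclusive RMW, now safe */
    set_primask(primask);             /* restore previous mask state */
}
```

Restoring the saved PRIMASK, rather than unconditionally re-enabling interrupts, keeps the helper safe to nest.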

I have tried switching to DMA serial I/O, but it had the same problem: it worked with two 1.34us interrupts every 6.66us but not with three.

I'd like to stop using this macro for modifying the UART interrupt flags, but it is in the driver file supplied by ST, and my modifications are lost the next time I build.

Could someone explain what is going on here, and even better, how I can solve my problem?

 

    This topic has been closed for replies.
    Best answer by TDK


    3 replies

    TDK (Best answer)
    Super User
    February 1, 2025

    ATOMIC_SET_BIT uses the LDREX/STREX instructions to atomically set bits. If an interrupt occurs between the two instructions, the STREX fails and the code has to go back and do the LDREX again. Usually not a problem, but when you're interrupting every few cycles, this will prevent the loop from ever completing.
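The retry behaviour can be illustrated off-target with a toy model of the exclusive monitor (this is a simulation, not the HAL code): every "interrupt" between the load and the store forces another trip around the loop.

```c
#include <stdint.h>
#include <stdbool.h>

/* Toy model of LDREX/STREX: STREX fails (returns nonzero) whenever
 * something -- here, a simulated interrupt -- cleared the exclusive
 * monitor between the load and the store. */
static bool monitor_armed = false;
static int  pending_interrupts = 0;   /* interrupts left to simulate */

static uint32_t ldrex(volatile uint32_t *p) {
    monitor_armed = true;             /* arm the exclusive monitor */
    return *p;
}

static int strex(uint32_t v, volatile uint32_t *p) {
    if (pending_interrupts > 0) {     /* an IRQ fired since LDREX */
        pending_interrupts--;
        monitor_armed = false;        /* exception entry clears the monitor */
    }
    if (!monitor_armed) return 1;     /* store rejected, caller must retry */
    *p = v;
    monitor_armed = false;
    return 0;
}

/* Same shape as HAL's ATOMIC_SET_BIT: retry until STREX succeeds.
 * Each interruption costs one extra trip around the loop; interrupt
 * often enough and the loop never completes. Returns the try count. */
static int atomic_set_bit_sim(volatile uint32_t *reg, uint32_t bit) {
    uint32_t val;
    int tries = 0;
    do {
        val = ldrex(reg) | bit;
        tries++;
    } while (strex(val, reg) != 0);
    return tries;
}
```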

    > how I can solve my problem?

    Few options:

    • Don't use HAL.
    • Redefine ATOMIC_SET_BIT in some user code that gets included after it's defined but before it's used. Not sure if this is possible.
    • Interrupt less often.
    Graduate
    February 1, 2025

    Thank you.  I understand that I can write my own UART functions and will do that.  However, it also stops USB communications.  The 150kHz interrupts regulate the current out of 3 synchronous buck converters, whose PWM duty cycles should be updated every PWM cycle at 150kHz (a 6.67us period).  I can reduce the PWM frequency, but that is not ideal.

    Do you have any suggestions what to do about USB?  I examined the middleware source code but wasn't able to find anything that seems like it would stop USB.

     

    Super User
    February 1, 2025

    USB should work as long as the interrupt is called regularly. I'm not sure what would be holding it up. Maybe toggle a pin within the USB IRQ handler and monitor it on a scope to ensure there are no gaps of more than 1ms or so? Could also use a USB analyzer to see if incoming packets are happening and are correct.
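The pin-toggle suggestion can be sketched as follows. The GPIO call is stubbed here (on the target it would be HAL_GPIO_TogglePin() or a direct ODR write), and the handler name is illustrative.

```c
#include <stdint.h>

/* Stub for the GPIO toggle; on an STM32 this would be
 * HAL_GPIO_TogglePin() or a direct write to GPIOx->ODR. */
static uint32_t debug_pin_state = 0;
static uint32_t toggle_count = 0;
static void debug_pin_toggle(void) { debug_pin_state ^= 1u; toggle_count++; }

/* Hypothetical USB IRQ handler wrapper: toggle a spare pin on every
 * entry so a scope shows whether the interrupt keeps firing and how
 * long the gaps between invocations are. */
static void usb_irq_handler(void) {
    debug_pin_toggle();  /* scope this pin: gaps over ~1 ms mean the IRQ is starved */
    /* ... the original USB interrupt processing would go here ... */
}
```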

     

    Another solution could be to combine the three IRQs. The CCR registers are preloaded so, depending on what you're doing, you might be able to achieve the same scheme. I'm guessing you ruled out non-CPU approaches, which would of course be the best solution.
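A minimal sketch of the combined-IRQ idea, with the compare registers stubbed (on the real timer these would be TIMx->CCR1..CCR3 with preload enabled): one update ISR latches all three duty cycles, and the preloaded values take effect together at the next PWM period.

```c
#include <stdint.h>

/* Stubbed compare registers; on the target these would be
 * TIMx->CCR1..CCR3 with preload enabled, so writes are shadowed
 * until the next update event. */
static uint32_t ccr[3];

/* Hypothetical combined handler: instead of three separate 150 kHz
 * interrupts, one timer-update ISR writes all three duty cycles.
 * Because the CCRs are preloaded, the values written here are applied
 * together at the next PWM period. */
static void pwm_update_isr(const uint32_t duty[3]) {
    for (int ch = 0; ch < 3; ch++)
        ccr[ch] = duty[ch];  /* shadowed until the next update event */
}
```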

    Graduate
    February 1, 2025

    You may solve the problem by not using HAL for the UART. With an interrupt frequency over 100 kHz, the UART HAL would consume too much of your precious processor time.

    Graduate
    February 2, 2025

    Since this might be useful for someone with similar problems, I will summarize the ultimate solution, inspired by the responses from TDK, gbm, and knarfB.

    Design summary:

    1. This design is a driver for an endoscope.  It allows the color temperature (in kelvin) and the intensity of the emitted light to be specified, and adjusts the current in the three power LEDs (white, red, and blue) as required.  Regulated currents can be set from 0.1A to 15A, each being regulated by its own synchronous buck converter.

    The PWM signals for the drivers are non-overlapping (120 degrees out of phase from each other).  The PWM frequency is 150kHz.  The regulators use voltage-mode control (as opposed to peak-current-mode control) since transient response is not important.  The control algorithm uses a 3-pole, 3-zero digital filter with coefficients determined by design software from Biricha Digital Power LTD (https://biricha.com/ ).
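For readers unfamiliar with it, a 3-pole/3-zero compensator is a short IIR difference equation; a generic direct-form sketch is below. The coefficient values used here are placeholders -- the real ones come from the design tool.

```c
#include <stdint.h>

/* Generic 3-pole/3-zero compensator in direct form:
 *   u[n] = b0*e[n] + b1*e[n-1] + b2*e[n-2] + b3*e[n-3]
 *        + a1*u[n-1] + a2*u[n-2] + a3*u[n-3]
 * where e = reference - feedback and u = new duty command. */
typedef struct {
    float b[4];        /* zero coefficients b0..b3 */
    float a[3];        /* pole coefficients; a[0] maps to a1, etc. */
    float e_hist[3];   /* e[n-1], e[n-2], e[n-3] */
    float u_hist[3];   /* u[n-1], u[n-2], u[n-3] */
} ctrl_3p3z;

static float ctrl_3p3z_step(ctrl_3p3z *c, float e) {
    float u = c->b[0] * e
            + c->b[1] * c->e_hist[0] + c->b[2] * c->e_hist[1] + c->b[3] * c->e_hist[2]
            + c->a[0] * c->u_hist[0] + c->a[1] * c->u_hist[1] + c->a[2] * c->u_hist[2];
    /* shift the histories one sample back */
    c->e_hist[2] = c->e_hist[1]; c->e_hist[1] = c->e_hist[0]; c->e_hist[0] = e;
    c->u_hist[2] = c->u_hist[1]; c->u_hist[1] = c->u_hist[0]; c->u_hist[0] = u;
    return u;
}
```

This step function is what runs inside each 150kHz control interrupt, which is why its execution time dominates the CPU budget discussed above.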

    Problem:

    1. In my original firmware, only about 33% of the MCU bandwidth was left over for non-DSMPS functions (790ns every 2.2us), and USB, the UART, and writing to flash did not work when all three drivers were on at the same time.  After a lot of debugging, the problem turned out to be the ATOMIC_SET_BIT() macro used by ST, which was stuck in an endless retry loop; that was the original reason for this post.

    Solution:

    1. Moved the ADC DMA interrupt handler to CCM RAM, which increased the available MCU bandwidth to 64%.  This allowed the USB and UART to function correctly.  However, I was still not able to write to flash, which is used to store user-programmable options.
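With the GNU toolchain, moving a function into CCM SRAM is typically done with a section attribute plus a matching linker-script region that is copied to RAM at startup; the ".ccmram" section name below is an assumption and must match your linker script.

```c
#include <stdint.h>

/* Place a hot function in CCM SRAM (zero-wait-state, not stalled by
 * flash accesses). The attribute only applies on ELF/GCC targets;
 * elsewhere the macro expands to nothing so this still compiles. */
#if defined(__GNUC__) && defined(__ELF__)
#define IN_CCM_RAM __attribute__((section(".ccmram")))
#else
#define IN_CCM_RAM
#endif

static volatile uint32_t isr_count;

/* Illustrative handler name; the real project moved its ADC DMA
 * interrupt handler this way. */
IN_CCM_RAM void adc_dma_irq_handler(void) {
    isr_count++;  /* time-critical filter/PWM work would go here */
}
```

The linker script must define a ".ccmram" output section in the CCMRAM memory region, and the startup code must copy its load image from flash, the same way initialized data is copied.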

    2. Only allowed writing to flash when the endoscope is not emitting light.  When not emitting light, the time-intensive digital filter is not executing, which leaves almost 100% of the bandwidth available.
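The flash gating can be reduced to a simple guard; the names and the flash-write stub below are illustrative, not the actual firmware.

```c
#include <stdint.h>
#include <stdbool.h>

/* Flash programming stalls the bus; with the LED drivers running there
 * is no slack for it, so writes are only permitted while idle. */
static bool emitting = false;  /* true while any LED channel is on */
static int  writes_done = 0;

static bool flash_write_stub(uint32_t addr, uint32_t data) {
    (void)addr; (void)data;
    writes_done++;             /* real code would call the flash driver here */
    return true;
}

/* Gatekeeper: refuse the write unless the light output is off;
 * the caller retries after the output has been disabled. */
static bool save_option(uint32_t addr, uint32_t data) {
    if (emitting)
        return false;
    return flash_write_stub(addr, data);
}
```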

    3. A future improvement I would like to implement: write my own low-level UART driver to avoid the time-intensive HAL UART functionality.
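A register-level UART transmit of the kind such a driver would provide might look like this; the register struct is a stand-in for the real USART peripheral map (on the STM32G4, TXE is bit 7 of USART_ISR and outgoing data is written to TDR).

```c
#include <stdint.h>

#define UART_ISR_TXE (1u << 7)  /* transmit data register empty */

/* Stand-in for the USART register block; field names follow the STM32
 * convention but this is not the real peripheral layout. */
typedef struct {
    volatile uint32_t ISR;  /* status register */
    volatile uint32_t TDR;  /* transmit data register */
} uart_regs;

/* Minimal blocking transmit: wait for TDR to be free, then write the
 * next byte. On real hardware, writing TDR clears TXE until the byte
 * moves to the shift register; in this stub TXE simply stays set. */
static void uart_tx_blocking(uart_regs *u, const uint8_t *buf, uint32_t len) {
    for (uint32_t i = 0; i < len; i++) {
        while (!(u->ISR & UART_ISR_TXE))
            ;               /* spin until TDR is free */
        u->TDR = buf[i];
    }
}
```

An interrupt-driven version would instead enable the TXE interrupt and feed TDR one byte per interrupt, avoiding both the busy-wait and the HAL overhead.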

    Thanks to TDK, gbm, and knarfB for all of their help.

    -David