Graduate
April 22, 2025
Question

USART synchronization issue when transmitting a bitmap – incorrect byte reception


I'm trying to transmit a bitmap via USART. The bitmap transmits correctly in HAL. My project is register-based, written from scratch. The issue is that my USART has a speed synchronization problem—meaning:

  • When my C# terminal is set to 57600 BAUD, it reads one byte correctly, but an additional zero byte is received and written to the buffer index.

  • At equal speeds (115200 BAUD) for both the MCU and the terminal, it incorrectly reads the bitmap byte, and still receives an extra empty byte in the buffer.

I can't figure out what's wrong. I suspect it might be a misconfigured clock system. My board is Nucleo-F303RE (STM32F3xE). I’d appreciate your help!

void USART2_EXTI26_IRQHandler(void)
{
	if (USART2->ISR & USART_ISR_RXNE)
	{
		dl_znak[ind++] = USART2->RDR;
		memcpy(dl_znak + ind, temp, 1);
		if (++ind >= 20) ind = 0;
		//USART2->RQR |= USART_RQR_RXFRQ;
		if (USART2->ISR & USART_ISR_ORE)
		{
			USART2->ICR |= USART_ICR_ORECF;
		}
		if (USART2->ISR & USART_ISR_FE)
		{
			USART2->ICR |= USART_ICR_FECF;
		}
		if (USART2->ISR & USART_ISR_NE)
		{
			USART2->ICR |= USART_ICR_NCF;
		}
	}
}
void config_UART(void)
{
	RCC->AHBENR |= RCC_AHBENR_GPIOAEN;

	//PA2: TX, AF7
	GPIOA->MODER &= ~(GPIO_MODER_MODER2_0);
	GPIOA->MODER |= GPIO_MODER_MODER2_1;
	GPIOA->AFR[0] &= ~(0xF << GPIO_AFRL_AFRL2_Pos);
	GPIOA->AFR[0] |= (0x7 << GPIO_AFRL_AFRL2_Pos);

	//PA3: RX, AF7
	GPIOA->MODER &= ~(GPIO_MODER_MODER3_0);
	GPIOA->MODER |= GPIO_MODER_MODER3_1;
	GPIOA->AFR[0] &= ~(0xF << GPIO_AFRL_AFRL3_Pos);
	GPIOA->AFR[0] |= (0x7 << GPIO_AFRL_AFRL3_Pos);

	//GPIOA->OTYPER |= GPIO_OTYPER_OT_3;
	//GPIOA->PUPDR |= GPIO_PUPDR_PUPDR3_0;
	RCC->APB1ENR |= RCC_APB1ENR_USART2EN;

	RCC->CFGR |= RCC_CFGR_PPRE1_2;

	USART2->BRR = 36000000 / 115200;

	USART2->CR1 |= USART_CR1_UE | USART_CR1_TE | USART_CR1_RE;
	USART2->CR1 |= USART_CR1_RXNEIE;

	NVIC_SetPriority(USART2_IRQn, 0);
	NVIC_EnableIRQ(USART2_IRQn);
}

uint8_t data;
uint8_t dl_znak [20];
uint8_t temp[2];

 


    Graduate
    April 22, 2025

    I forgot to paste the clock configuration.

    void clock_config(void)
    {
    	RCC->CR |= RCC_CR_HSION;
    	while (!(RCC->CR & RCC_CR_HSIRDY));

    	// RCC->CR |= RCC_CFGR_PLLSRC
    	RCC->APB1ENR |= RCC_APB1ENR_PWREN;

    	RCC->CFGR |= RCC_CFGR_HPRE_DIV1;
    	RCC->CFGR |= RCC_CFGR_PPRE1_DIV2;
    	//PLL source mux

    	// RCC->CFGR |= RCC_CFGR_PLLMUL9;
    	RCC->CFGR |= (0x7 << RCC_CFGR_PLLMUL_Pos);
    	RCC->CR |= RCC_CR_PLLON;
    	while (!(RCC->CR & RCC_CR_PLLRDY));
    	RCC->CFGR |= RCC_CFGR_SW_PLL;
    	while (!(RCC->CFGR & RCC_CFGR_SWS));
    }
    Super User
    April 22, 2025

    You mean it works correctly with HAL-based code, but not with your bare-metal register code?

    Have you used an oscilloscope or logic analyser to see what's actually happening on the wire?

     


    @FilipF303RE wrote:
    • my C# terminal 


    What is that? Something you've written yourself?

    HAL code will likely have some overhead leading to gaps between characters; bare metal should achieve (nearly) zero gap - can your terminal cope with that?

    What happens if you use a standard terminal?

    Super User
    April 22, 2025

    > dl_znak[ind++] = USART2->RDR;
    > memcpy(dl_znak+ind,temp,1);
    > if(++ind >=20) ind=0;

    You increment the index twice per RXNE event but only write one byte, so the skipped slot keeps its initial zero value.

    It's unclear what temp is doing here at all.
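
    To illustrate: the index should advance exactly once per received byte. A minimal, host-testable sketch of the corrected ring-buffer logic (hardware access stripped out; `rdr` stands in for a byte read from USART2->RDR, and the `temp` copy is dropped):

    ```c
    #include <stdint.h>

    #define BUF_LEN 20

    static uint8_t dl_znak[BUF_LEN];
    static volatile uint8_t ind = 0;

    /* Store one received byte and advance the index once. */
    static void rx_store(uint8_t rdr)
    {
        dl_znak[ind] = rdr;
        if (++ind >= BUF_LEN) ind = 0;  /* single increment per byte, with wrap */
    }
    ```

    In the ISR this becomes `rx_store((uint8_t)USART2->RDR);` inside the RXNE branch.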

    Graduate
    April 22, 2025

    @TDK 

    Yes, right. I didn't see that, thank you! I thought it was messed up because of the clock. But I still have different baud rates for the terminal and the MCU. For example, I used RealTerm and it was the same situation, but only with the bare-metal code.

    Super User
    April 22, 2025

    The USART2 clock as you've set it up is at 18 MHz, not 36 MHz. CPU clock is 36 MHz, APB1 divider is 2.
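
    As a quick host-side sanity check of the arithmetic (assuming the F303's 8 MHz HSI, the PLLMUL factor of 9 from the posted config, and 16x oversampling):

    ```c
    #include <stdint.h>

    /* Clock math for the posted config (HSI = 8 MHz on the F303). */
    static uint32_t sysclk(void) { return (8000000u / 2u) * 9u; } /* HSI/2 * 9 = 36 MHz */
    static uint32_t pclk1(void)  { return sysclk() / 2u; }        /* APB1 /2   = 18 MHz */

    /* BRR value for 16x oversampling: USARTDIV = fck / baud. */
    static uint32_t brr(uint32_t fck, uint32_t baud) { return fck / baud; }

    /* Baud rate actually produced by a given divider. */
    static uint32_t actual_baud(uint32_t fck, uint32_t div) { return fck / div; }
    ```

    Note the coincidence this explains: BRR computed from the assumed 36 MHz gives 312, and with the USART really clocked at 18 MHz that divider produces 18000000/312 ≈ 57692 baud, which is why 57600 was the one rate the terminal could read. The correct value for 115200 at 18 MHz is 18000000/115200 = 156.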

    Graduate
    April 22, 2025

    @Andrew Neil 

    Yes, I have my own terminal that I wrote myself, where I import a bitmap, read it, and convert it to a .bin file (Windows couldn't do it), and then send it to the MCU. C# has libraries for the serial port; it's a Windows Forms application, and it works correctly. With HAL, my buffer reads the bitmap correctly; I confirmed this by checking every pixel byte as a string in Notepad (it's an 8-bit bitmap). The problem is bare metal. I don't have an oscilloscope at home; I can only use one at the university. Which terminal should I use to send a bitmap? I generate a .bin file, so is it possible to just import that? I have only tried small projects reading chars, and those worked correctly, but I still had different baud rates (on bare metal). This is for my CNC engineering graduation project; because of its requirements I had to build my own application, and I wouldn't have been able to complete the project without creating my own solutions.

    Super User
    April 22, 2025

    For testing purposes, send only printable text characters.

    Then you can use any standard text terminal.

    The trouble with doing custom code at both ends is always that you don't know which end has the fault - sender, receiver, or both!
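
    One way to apply that advice on the MCU side is to transmit a fixed, repeating run of printable ASCII so any off-the-shelf terminal can verify the link. A sketch of the pattern generator (host-testable; on the target you would loop over the buffer and send each byte with a blocking register-level write, e.g. wait on TXE, then write TDR):

    ```c
    #include <stddef.h>

    /* Fill buf with the repeating printable ASCII range ' '..'~' (32..126). */
    static size_t fill_test_pattern(char *buf, size_t len)
    {
        char c = ' ';
        for (size_t i = 0; i < len; i++) {
            buf[i] = c;
            c = (c == '~') ? ' ' : (char)(c + 1); /* wrap after last printable */
        }
        return len;
    }
    ```

    A baud-rate or framing problem then shows up immediately as visibly wrong characters in any standard terminal, with no custom receiver in the loop.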

     

    Graduate
    April 22, 2025

    @Andrew Neil 

    When I set the baud rate to 57600, I correctly read the transmitted and received characters in the buffer (for example, using RealTerm). However, when I change the baud rate (even to the same value configured in the BRR register), I only see garbage data. Correct data is sent and received only at 57600 baud. My own terminal is not the problem; it's possible that I just got the clock configuration wrong.

    Graduate
    April 23, 2025

    @TDK Yes, you are right. Everything works correctly when I change it to 18 MHz. I was just confused because the IOC shows 36 MHz in the clock configuration, but whether I leave the prescaler undivided or divided by 2, I still have to use 18 MHz in the BRR calculation... That pushed me to think I might have something wrong in my code...

    Clockconfig.png