Actual UART Baudrate is off from what is specified

RPape.1
Associate II

I am using UART over RS-485 on an STM32MP153AAB3.

The communication fails because the actual baudrate is off from the specified baudrate. With a specified baudrate of 9600 Bits/s, I am getting an actual baudrate of 9970 Bits/s, as you can see here (time per Bit is 100.3µs instead of 104µs):


Changing the baudrate to 115200 Bits/s results in a time per Bit of 8.33 µs, i.e. 120000 Bits/s:

I set the USART2,4 kernel clock in the .ioc file to 64 MHz and chose PCLK1 as the source.

The value in the BRR register is 0x1A0B (6667 decimal). With 16x oversampling, USARTDIV equals the BRR value on this USART, so this should correspond to a baudrate of 64,000,000/6667 ≈ 9600 Bits/s.

My settings are as follows:

static void MX_UART4_Init(void)
{
  huart4.Instance = UART4;
  huart4.Init.BaudRate = 9600;
  huart4.Init.WordLength = UART_WORDLENGTH_8B;
  huart4.Init.StopBits = UART_STOPBITS_1;
  huart4.Init.Parity = UART_PARITY_NONE;
  huart4.Init.Mode = UART_MODE_TX_RX;
  huart4.Init.HwFlowCtl = UART_HWCONTROL_RTS;
  huart4.Init.OverSampling = UART_OVERSAMPLING_16;
  huart4.Init.OneBitSampling = UART_ONE_BIT_SAMPLE_DISABLE;
  huart4.Init.ClockPrescaler = UART_PRESCALER_DIV1;
  huart4.AdvancedInit.AdvFeatureInit = UART_ADVFEATURE_NO_INIT;
  if (HAL_UART_Init(&huart4) != HAL_OK)
  {
    Error_Handler();
  }
  if (HAL_UARTEx_SetTxFifoThreshold(&huart4, UART_TXFIFO_THRESHOLD_1_8) != HAL_OK)
  {
    Error_Handler();
  }
  if (HAL_UARTEx_SetRxFifoThreshold(&huart4, UART_RXFIFO_THRESHOLD_1_8) != HAL_OK)
  {
    Error_Handler();
  }
  if (HAL_UARTEx_DisableFifoMode(&huart4) != HAL_OK)
  {
    Error_Handler();
  }
}

And I'm transmitting over the UART in interrupt mode, re-arming the transfer in the transmit-complete callback:

uint8_t uartData[6] = {0xAA, 0x55, 0xAA, 0x55, 0xAA, 0x55};
 
void HAL_UART_TxCpltCallback(UART_HandleTypeDef *huart)
{
    if (HAL_UART_Transmit_IT(&huart4, &uartData[0], sizeof(uartData)) != HAL_OK)
    {
        Error_Handler();
    }
}

Does anyone have an idea what the problem could be?

Edit: Updated the values for 115200 Bits/s.

Peter BENSCH
ST Employee

The 3-4 % deviation sounds like the use of HSI or CSI, both of which run untrimmed.

Where does the 64 MHz that you are using come from? PCLK1 is derived from APB1, which in turn comes from MCUSS_CK, which is clocked from HSI, CSI, HSE or PLL3P.

Regards

/Peter

In order to give better visibility on the answered topics, please click on Accept as Solution on the reply which solved your issue or answered your question.
RPape.1
Associate II

Yes, you're right, I am using HSI.

How do I find out whether the HSI is trimmed or not?

I checked the RCC_HSICFGR register; some bits are set in HSITRIM and HSICAL (HSICFGR is 0x34A1700), but does that mean these are correct calibration values?

Edit:

I now tried a whole new board, but the behavior is still exactly the same - the measured baud rate is still 9970 Bits/s for a specified baud rate of 9600 Bits/s.

Edit2:

I tried specifying our 25 MHz HSE as the clock source, but it also does not make a difference. I suspect it is because I am booting in production mode, so I will have to change this on the Linux side, which is what I'll try next.

Edit3:

Okay, we changed the clock source to HSE on the Linux side and now it works like a charm. Thanks for the help, Peter!

I would still be interested if anyone knew how to trim the HSI so that it works, though 🙂

RPape.1
Associate II

We now have the same issue again, but we did not change anything about the clocks. The source for UART is still HSE, but the baudrate is again off by about 4%.

In a new project I created for testing purposes, the baudrate is correct. Initialization and configuration seem to be exactly the same. The only difference is that the new project has an .ioc file which I use for the configuration, and it also has device tree files for the Cortex-A7/Linux side.

Our original project does not have an .ioc file and boots Linux from SD card, so I cannot configure the clocks the same way but instead have to edit all the device tree files manually, which is a bit confusing because there seem to be various files for the same configuration (e.g. U-Boot, TF-A) and it is not clear to me which file is actually used.

My questions are:

1) Is there a way to see what clock sources are used for the various interfaces from a running Linux?

2) If I debug the firmware through Linux core (Production mode), do the Device Tree files in my project actually overwrite the Device Tree of the Linux on the SD-card?

3) What other possibilities are there for a wrong baudrate in one project but a correct baudrate in another project, if the MX_UART_Init(), stm32mp1xx_hal_msp.c, stm32mp1xx_hal_uart.c, stm32mp1xx_it.c are identical?

Thanks in advance!

Update:

I solved the problem and can answer question 3) now:

In stm32mp1xx_hal_conf.h, HSE_VALUE was defined as 24000000, but the correct value would be 25000000 in our case, as the HSE is running at 25 MHz.

I am still very curious about the other two questions, as the relationship between the Linux side and the M4 side is something we come across quite often...

Hello @RPape.1​ ,

For question 1), I think you are looking for this chapter of the wiki: https://wiki.st.com/stm32mpu/wiki/Clock_overview#How_to_monitor_with_debugfs

The command "cat /sys/kernel/debug/clk/clk_summary" will display a complete clock tree of the clocks visible at Linux level.

clk-hsi                                  1            1    64000000          0 0  
   clk-hsi-div                           1            1    64000000          0 0  
      ck_hsi                             2            2    64000000          0 0  
         ck_mco1                         0            0    64000000          0 0  
         uart8_k                         0            0    64000000          0 0  
         uart7_k                         0            0    64000000          0 0  
         uart6_k

In this example, uart8, uart7 and uart6 use the HSI clock as their clock source.

Hope it helps,

Best Regards,

Kevin
