Delay in UART transmission at high baud rates

shafy s
Associate II
Posted on February 22, 2017 at 18:53

Hi, I am developing an application on an STM32L432 Nucleo-32 board that samples 5 ADC channels and sends the data over UART. The ADC channels are sampled using a timer, and the maximum sampling frequency is 10 kHz, so there can be up to 10K timer events per second.

But I am facing a bottleneck in the UART: even at a baud rate of 921 kbps, the UART is delaying the transmission of the samples. The 10K samples, i.e. 100,000 bytes, take around 10 s to reach the receiving end, which is a PC terminal. For testing, I removed the ADC sampling and sent hard-coded data instead, with the same result. We have also tried higher baud rates (>1.5 Mbps) with little improvement. The UART clock is 32 MHz. Is there a way to decrease this latency?

#stm32 #uart
12 REPLIES
Posted on February 23, 2017 at 18:57

Also, 921600 is the bit rate. You will get each character clocked out at 921600 bits per second. What is your observed time between characters? It can vary between zero and infin...well, not infinity, since you are getting characters out and infinity hasn't happened yet. So between zero and a big number.

Put your serial line onto an oscilloscope and see what is actually happening.

Posted on February 23, 2017 at 19:17

I have worked on 'the other side' as well, namely on x86 systems (up to Atom) under VxWorks for PLC units.

For UARTs, those systems used to rely on external chips with large FIFOs (either genuinely external chips, or compatible IP blocks on ASICs). Single-character interrupts are not feasible, because external interrupts are very slow over the external bus (PCI). Even so, it was troublesome to empty the FIFO in time at high baud rates (115200 bps) to avoid protocol corruption.

While an Intel Atom is not a 'rocket', consider that these were headless systems (no GUI or graphics stack) under a real-time OS. Windows 8/10 is not a real-time system, just 'best effort', so I would expect trouble there as well. Digging into the world of Windows drivers is not done in a few days.

Posted on February 23, 2017 at 19:45

Ok, but what is the refresh rate on the LCD, and what is human perception? You're honestly telling me you can't live with a 10 or 100 sample latency at 100 kHz?

The sample timing is separate from the transport chosen, and can be paced with a direct timer trigger of the ADC without an interrupt.

If you must use serial, can't you DMA directly into the USART transmit register, have a bit rate higher than the ADC, and trigger the ADC with a TIM?
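A minimal sketch of that approach, assuming STM32 HAL with CubeMX-generated handles `hadc1` (ADC triggered by TIM2 TRGO, DMA in circular mode), `htim2`, and `huart2`; buffer size and names are illustrative, and this fragment is not a complete project:

```c
/* Double-buffered ADC-to-UART pipeline: the timer paces conversions
 * with no per-sample interrupt; the ADC's DMA fills a circular buffer;
 * each completed half is handed to the UART's own TX DMA channel. */
#define HALF 512
static uint16_t adc_buf[2 * HALF];   /* circular double buffer */

void app_start(void)
{
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, 2 * HALF);
    HAL_TIM_Base_Start(&htim2);      /* TRGO triggers each conversion */
}

/* First half full: ship it while the ADC fills the second half */
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef *hadc)
{
    HAL_UART_Transmit_DMA(&huart2, (uint8_t *)&adc_buf[0], HALF * sizeof(uint16_t));
}

/* Second half full: ship it while the ADC wraps to the first half */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    HAL_UART_Transmit_DMA(&huart2, (uint8_t *)&adc_buf[HALF], HALF * sizeof(uint16_t));
}
```

For this to keep up, the UART byte rate must exceed the ADC byte rate, and each UART DMA transfer must finish before the opposite buffer half refills; otherwise a half would be overwritten before it is sent.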
