Using the system timer and USART receive interrupts with DMA

David Fisher
Associate II
Posted on May 11, 2017 at 17:43

I am trying to implement  '2.2.2 Method2: Using the system timer and USART receive interrupts' from the following document on an F767ZI Nucleo board.

http://www.st.com/content/ccc/resource/technical/document/application_note/d6/03/cb/dd/03/54/49/d6/CD00256689.pdf/files/CD00256689.pdf/jcr:content/translations/en.CD00256689.pdf

I want to implement a decoder for the Futaba SBUS rc protocol. It has a fixed length of 25 bytes and takes approximately 3ms to transmit, with 3-5ms between packets depending upon the receiver used. My idea was to use a USART receive interrupt to start a timer that would cause an interrupt if the DMA interrupt hadn't fired within 3.5ms. Does anyone know of an example implementation of the above method ?

S.Ma
Principal
Posted on May 12, 2017 at 05:47

I took a step back from the need and looked as a possible simple implementation and one idea came up:

The timeout delay can be generated by transmitting the right number of bytes by DMA, except that the TX GPIO is reconfigured as an input so the TX line goes nowhere. Then you get two DMA interrupts, one for receive and one for transmit.

After transmitting normally, send the dummy block by TX DMA and also set up the DMA for receiving. If the TX DMA interrupt comes first, it means timeout: abort the RX. If the RX interrupt comes first, abort the dummy TX.

You may not need any SRAM flag; using the hardware register bits to implement this scheme should be enough. I haven't tried it, it is just a guess.

Otherwise, dedicate a timer in one-shot mode with an interrupt on update/max reached. Kick off the timer and its interrupt at the end of the transmit. If the RX DMA interrupt comes first, kill the timer interrupt. If the timer interrupt comes first, kill the DMA RX and set a flag, or jump to the RX DMA interrupt handler, which can treat the disabled DMA interrupt as a timeout... Something like the sketch below.
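A rough, untested HAL sketch of that timer variant, assuming handles huart3 (the SBUS UART) and htim7 (a basic timer with a ~3.5 ms update period); both names are placeholders. The timer is simply started and stopped from the callbacks rather than put in hardware one-pulse mode, and the timeout window starts when reception is armed rather than on the first received byte:

```c
#include "stm32f7xx_hal.h"

extern UART_HandleTypeDef huart3;   /* assumed UART handle                   */
extern TIM_HandleTypeDef  htim7;    /* assumed timer handle, ~3.5 ms period  */

#define SBUS_FRAME_LEN 25
static uint8_t          sbus_buf[SBUS_FRAME_LEN];
static volatile uint8_t frame_ready;

void sbus_arm(void)
{
    __HAL_TIM_SET_COUNTER(&htim7, 0);
    __HAL_TIM_CLEAR_FLAG(&htim7, TIM_FLAG_UPDATE);
    HAL_TIM_Base_Start_IT(&htim7);                  /* kick the timeout timer */
    HAL_UART_Receive_DMA(&huart3, sbus_buf, SBUS_FRAME_LEN);
}

/* RX DMA interrupt came first: a full frame arrived, kill the timer. */
void HAL_UART_RxCpltCallback(UART_HandleTypeDef *huart)
{
    if (huart == &huart3) {
        HAL_TIM_Base_Stop_IT(&htim7);
        frame_ready = 1;
    }
}

/* Timer interrupt came first: timeout, kill the DMA RX and start over. */
void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim == &htim7) {
        HAL_TIM_Base_Stop_IT(&htim7);
        HAL_UART_DMAStop(&huart3);
        sbus_arm();
    }
}
```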

Alan Chambers
Associate II
Posted on May 12, 2017 at 09:44

I use DMA RX for USARTs as follows:

1. Configure the DMA RX channel with a circular buffer. No interrupt is necessary.

2. Use some recurring timer event (SysTick interrupt, RTOS timer, or whatever) to poll NDTR frequently to see if the value has changed, and copy out or otherwise consume the new bytes. You just need to keep track of the previous value of NDTR. Roughly as in the sketch below.
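Untested, but roughly like this; huart3, the buffer size and decoder_feed() are placeholders, and the RX DMA stream is assumed to be configured in circular mode:

```c
#include "stm32f7xx_hal.h"

extern UART_HandleTypeDef huart3;          /* assumed UART handle               */

#define RX_BUF_SIZE 64
static uint8_t  rx_buf[RX_BUF_SIZE];
static uint32_t rx_tail;                   /* index of the next byte to consume */

extern void decoder_feed(uint8_t byte);    /* hypothetical consumer             */

void uart_rx_start(void)
{
    /* The RX DMA stream must be set up in DMA_CIRCULAR mode. */
    rx_tail = 0;
    HAL_UART_Receive_DMA(&huart3, rx_buf, RX_BUF_SIZE);
}

/* Call from SysTick / a 1 kHz timer: drain whatever DMA has written so far. */
void uart_rx_poll(void)
{
    /* NDTR counts down from RX_BUF_SIZE, so the DMA write index is: */
    uint32_t head = RX_BUF_SIZE - __HAL_DMA_GET_COUNTER(huart3.hdmarx);

    while (rx_tail != head) {
        decoder_feed(rx_buf[rx_tail]);
        rx_tail = (rx_tail + 1U) % RX_BUF_SIZE;
    }
}
```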

For low power applications, you might want to kill the ticker when the RX goes idle. You might do this after N ticks with no new data received. There are various ways to wake the ticker up, such as using the USART RX interrupt: enable it when the ticker is stopped, disable it on RX. Or you might configure the USART RX pin as EXTI, so a single edge can wake up the system (you might need to send a wake-up byte before the actual message). I used the latter approach when the main crystal was disabled and the USART was unable to receive anything.

I know nothing about the particular protocol you mentioned, but advise you to completely decouple any timing or other constraints in the protocol from the business of sending and receiving bytes over the wire. The USART driver should know nothing about protocols: it just shovels data in and out.

Posted on May 12, 2017 at 13:04

>Or you might configure the USART RX pin as EXTI - a single edge can wake up the system (you might need to send a wake-up byte before the actual message). I used the latter approach when the main crystal was disabled and the USART was unable to receive anything.

I have no control over what is sent to me; these are commercial RC receiver units. The protocol has a start byte and a stop byte. Latency is important in the application.

How would I implement the timing constraint using this method ? Is there an example using NDTR you could link me to ?

Thanks.

Posted on May 12, 2017 at 16:21

SBUS looks to be a proprietary format which has been reverse engineered. I'm not clear on the details. Are packets well defined? That is, can you always recognise the start of a packet in a stream of bytes, regardless of when they were received? Or do you only know that a new packet is starting because there has been a short break between bytes? How are transmission errors detected?

For my own protocols, I always make sure that packets are well defined. I shovel bytes as they are received into a decoder: typically a one-function FSM which finds valid packets in the byte stream, along the lines of the sketch below. When I have a valid packet, I pass it to a packet handler for dispatch. This approach decouples byte transfer from the protocol, and filters out incomplete and corrupt packets.
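Purely as an illustration (I don't know the real protocol, so START_BYTE, END_BYTE and FRAME_LEN below are placeholders for whatever it actually defines):

```c
#include <stdint.h>
#include <stddef.h>

#define FRAME_LEN  25        /* placeholder values for the protocol's framing */
#define START_BYTE 0x0F
#define END_BYTE   0x00

static uint8_t frame[FRAME_LEN];
static size_t  pos;

extern void handle_packet(const uint8_t *pkt, size_t len);   /* hypothetical dispatcher */

/* Feed one received byte at a time; calls handle_packet() on each valid frame. */
void decoder_feed(uint8_t byte)
{
    if (pos == 0 && byte != START_BYTE)
        return;                              /* hunt for the start of a frame */

    frame[pos++] = byte;

    if (pos == FRAME_LEN) {
        pos = 0;
        if (frame[FRAME_LEN - 1] == END_BYTE)
            handle_packet(frame, FRAME_LEN);
        /* else: framing error, go back to hunting for the next start byte */
    }
}
```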

Going back to the timer suggestion, I typically poll the NDTR register at 1 kHz, but could go faster. What is your maximum latency?

Posted on May 12, 2017 at 18:53

25 bytes in 3 ms could be managed with an interrupt and the SysTick (1 ms / 1 kHz). This sort of protocol you'd just state-machine and recognize the end framing, without having to wait for an inter-message gap and the latency that would introduce.

Posted on May 12, 2017 at 19:02

Yeah. I did wonder about just using interrupts and forgetting the ticker, but David asked about DMA...

Posted on May 13, 2017 at 07:27

Did you have a look at the original link? I thought '2.2.2 Method2: Using the system timer and USART receive interrupts' seemed a very efficient way of implementing it. You wouldn't wait for an inter-message gap; the DMA interrupt would trigger when you had 25 bytes. If the timer had exceeded 3.5ms before you received the 25 bytes, then you would just reset, roughly as in the sketch below. It's receiving within the 3.5ms that is important, not the gap between packets.

http://www.st.com/content/ccc/resource/technical/document/application_note/d6/03/cb/dd/03/54/49/d6/CD00256689.pdf/files/CD00256689.pdf/jcr:content/translations/en.CD00256689.pdf
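Something like this is what I had in mind (untested). Instead of the USART receive interrupt from the app note, this sketch watches the DMA's NDTR from the 1 ms tick to notice that a frame has started; huart3, the 4-tick budget and the helper names are placeholders, and the RX DMA stream is assumed to be in normal (not circular) mode:

```c
#include "stm32f7xx_hal.h"

extern UART_HandleTypeDef huart3;              /* assumed SBUS UART handle          */

#define SBUS_FRAME_LEN   25
#define FRAME_TIMEOUT_MS 4                     /* ~3.5ms, rounded up to whole ticks */

static uint8_t           sbus_frame[SBUS_FRAME_LEN];
static volatile uint8_t  frame_ready;
static uint32_t          frame_start_tick;
static uint8_t           frame_in_progress;

void sbus_arm(void)
{
    frame_in_progress = 0;
    HAL_UART_Receive_DMA(&huart3, sbus_frame, SBUS_FRAME_LEN);
}

/* DMA transfer complete: all 25 bytes landed within the window. */
void HAL_UART_RxCpltCallback(UART_HandleTypeDef *huart)
{
    if (huart == &huart3) {
        frame_ready = 1;       /* real code would copy sbus_frame out, or
                                  double-buffer, before re-arming             */
        sbus_arm();
    }
}

/* Call from the SysTick / 1 ms tick handler. */
void sbus_tick(void)
{
    uint32_t remaining = __HAL_DMA_GET_COUNTER(huart3.hdmarx);

    if (!frame_in_progress && remaining < SBUS_FRAME_LEN) {
        frame_in_progress = 1;                 /* first byte of a frame seen        */
        frame_start_tick  = HAL_GetTick();
    }

    if (frame_in_progress &&
        (HAL_GetTick() - frame_start_tick) > FRAME_TIMEOUT_MS) {
        HAL_UART_DMAStop(&huart3);             /* timed out: drop the partial frame */
        sbus_arm();                            /* resync on the next frame          */
    }
}
```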

nichtgedacht
Senior
Posted on May 25, 2017 at 18:23

For Futaba SBUS, if using a Jeti Transmitter, one can adjust the frame rate from 5 to 20 ms in 1 ms steps.

Any solution that relies on the maximum elapsed time for receiving the complete data is therefore impossible. The first start of the DMA or interrupt transfer can hit anywhere within the gap between two frames and will read the data in the correct order, but you cannot decide from the maximum elapsed transfer time whether the data came from a single frame or not, in particular because the start of the next transfer follows right after the first one completes.

But one can detect, almost for sure, a missed start if the first byte gets fetched before the time for two bytes has elapsed. Then delay for one frame time before starting the next transfer and you are in sync. If not, this simply repeats a second time.

I have not implemented this solution.

SBUS further has uart_data[0] == 0x0F && uart_data[24] == 0x00.

These byte values can also occur as payload bytes, but not often. If one takes the frame as verified under this condition and it is not, the next frame will fail to be verified, at least if there are some calculations before the next start of the transfer. This would mean a short glitch only. I never observed such a glitch, even with the SRXL protocol, where I test only the start byte at uart_data[0].
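As a sketch only, the check plus the usual channel unpacking would be something like this; the 11-bit, LSB-first channel layout in bytes 1..22 is the commonly published reverse-engineered one, so verify it against your own receiver:

```c
#include <stdint.h>
#include <stdbool.h>

#define SBUS_FRAME_LEN 25

/* Verify the framing bytes discussed above, then unpack the 16 channels. */
bool sbus_decode(const uint8_t frame[SBUS_FRAME_LEN], uint16_t channels[16])
{
    if (frame[0] != 0x0F || frame[SBUS_FRAME_LEN - 1] != 0x00)
        return false;                          /* framing check failed                */

    for (int ch = 0; ch < 16; ch++) {
        int bit = ch * 11;                     /* bit offset into the channel payload */
        int idx = 1 + bit / 8;                 /* payload starts at byte 1            */
        uint32_t raw = frame[idx]
                     | ((uint32_t)frame[idx + 1] << 8)
                     | ((uint32_t)frame[idx + 2] << 16);
        channels[ch] = (raw >> (bit % 8)) & 0x07FF;
    }
    return true;
}
```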

Find my code here

https://github.com/nichtgedacht/mini-sys/

 

Suggestions for improvements are welcome.