IDLE flag in USART_SR vs. DMA

picguy
Associate II
Posted on October 23, 2008 at 20:10

IDLE flag in USART_SR vs. DMA

5 REPLIES
picguy
Associate II
Posted on May 17, 2011 at 12:42

I think I made a mistake.

I thought I had demonstrated that IDLE status came up after about one byte time of no data arriving on a serial port. But either I was wrong, or IDLE does not come up while RS-232 input DMA is still counting down the number of bytes it has been told to receive. Frankly, I would rather be wrong and discover that something about my setup is incorrect.

So ... how long should it take for IDLE status to come up after data stops flowing? And is that affected by using DMA to read?

Details-

I am using USART1 at 4.5 Mbaud (with external RS-485 differential line drivers) for a LAN. I have highly variable message lengths: one message is only 3 bytes long, another is 70 bytes, and when reading I never know what length is coming next. I almost certainly need to use DMA because bytes arrive every 160 clocks at 72 MHz. And to meet product timings I cannot sit in the max-priority ISR looping until I have received the number of bytes given by the second byte of the message.

In my testing I DMA-read 100 bytes from USART1. Every 20 milliseconds I get a burst of three bytes; the burst takes well under 2 microseconds. IDLE status interrupts me only after all 100 bytes have arrived, i.e. after the DMA remaining count winds down to zero. My hope was that IDLE status would come up a byte time or so after the third byte was delivered. I would then take the interrupt, decode the message, ignore it if it was not for my node, and restart the DMA for the next message. The message sender, another STM32, ensures that every message receiver has time to get back into read mode.
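The flow hoped for above can be sketched as follows. Bit and field names (IDLE in USART_SR, CNDTR, the channel EN bit) follow the STM32F1 reference manual, but the structs here are mocks so the sketch compiles anywhere; real code would use the CMSIS device header and the IDLE read-SR-then-read-DR clearing sequence shown in the comments.

```c
/* Sketch of IDLE-terminated DMA reception: IDLE fires roughly one byte
 * time after the last byte, the handler computes how many bytes DMA
 * delivered, hands the frame to a parser, and re-arms DMA at full length.
 * The structs are MOCKS standing in for the real STM32F1 registers. */
#include <assert.h>
#include <stdint.h>

#define RX_BUF_LEN    100u
#define USART_SR_IDLE (1u << 4)        /* IDLE line detected (RM0008) */

typedef struct { volatile uint32_t SR, DR; } usart_mock_t;
typedef struct { volatile uint32_t CCR, CNDTR; } dma_ch_mock_t;

usart_mock_t  usart1;
dma_ch_mock_t dma1_ch5;                /* USART1_RX uses DMA1 channel 5 */
uint8_t       rx_buf[RX_BUF_LEN];
uint32_t      frames_seen;
uint32_t      last_frame_len;

/* IDLE ISR: clear IDLE (read SR then DR), compute the received length,
 * then disable, reload, and re-enable the DMA channel. */
void usart1_idle_handler(void)
{
    if (usart1.SR & USART_SR_IDLE) {
        (void)usart1.SR;               /* SR read ...          */
        (void)usart1.DR;               /* ... then DR read clears IDLE */
        last_frame_len = RX_BUF_LEN - dma1_ch5.CNDTR;
        frames_seen++;                 /* parse/filter by node here */
        dma1_ch5.CCR  &= ~1u;          /* clear EN */
        dma1_ch5.CNDTR = RX_BUF_LEN;   /* reload count (only while disabled) */
        dma1_ch5.CCR  |= 1u;           /* set EN */
    }
}
```

Note the reload: CNDTR may only be written while the channel is disabled, which is why the handler clears EN first.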

On my LAN I will have as many as 50 nodes all receiving every message.

I *could* take an interrupt every byte. With 24 clocks for ISR overhead I could write the code such that I spent perhaps as few as an additional 40 clocks in my ISR code. I really don’t want to use anything close to 25% of the total processor time accepting messages.

But wait, there’s more.

If DMA and the way I had planned to use IDLE cannot work for me, I have another trick up my sleeve before I resort to interrupting on every byte: break. But that poses a problem for the sender.

The problem involves my external RS-485 hardware. It needs to be told when to enter transmit mode and when to leave it. Entering transmit mode is easy; leaving it while using DMA is less so. In my original plan this was made easy by the TC status flag: it is set when the TX shifter is empty, and that only happens when the DMA stops and the last bit has been transmitted.

Having to send a break complicates matters.

Should I “SBK” (Send BreaK) in USART_CR1 when the TXE status sets? Should I SBK in the DMA interrupt? (I don't currently interrupt on DMA done.) Will SBK zap the last byte just sent to the TX shifter? And the $64 question: does sending BREAK hold off TC until the BREAK completes? If it does, AND a DMA read allows FE (Framing Error) to set in response to break, then I will be okay. Annoyed, but okay. FWIW, I do not plan to use LIN mode.

These are details I need to make the LAN part of our product functional. Hopefully our glorious mod, STOne-32, can find the details I need before I spend a day or two trying alternatives. Getting the LAN part of our product working is very much on the critical path.

- - - -

PS: the docs on USART_BRR (Baud Rate Register) are unduly complex. The entire 16-bit register is just the number of clocks per baud time, apparently with a minimum of 0x0010. The hexadecimal-point presentation only makes it harder to understand.

PPS: what will happen if I try 0x0008? Will I get 9 Mbaud?
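The PS's reading checks out arithmetically: with 16x oversampling, USARTDIV = fPCLK / (16 × baud), and because BRR packs the 12-bit mantissa at bits 15:4 above the 4-bit fraction, the raw register value works out to fPCLK / baud, i.e. clocks per bit. A minimal helper (rounded integer divide; whether values below the documented 0x0010 minimum actually work is exactly the PPS's open question):

```c
/* USART_BRR as clocks-per-bit: mantissa at BRR[15:4], fraction at
 * BRR[3:0], so mantissa*16 + fraction == fPCLK/baud (rounded). */
#include <assert.h>
#include <stdint.h>

uint16_t usart_brr(uint32_t pclk_hz, uint32_t baud)
{
    return (uint16_t)((pclk_hz + baud / 2u) / baud);  /* rounded divide */
}
```

At 72 MHz, 4.5 Mbaud gives exactly the 0x0010 minimum the post mentions, and 9 Mbaud would indeed require 0x0008.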

lanchon
Associate II
Posted on May 17, 2011 at 12:42

I'm not familiar with the serial macrocell, but anyway...

> I would then interrupt, decode the message and ignore it if it was not for my node. I would then restart the DMA for the next message. The message sender, another STM32, ensures that every message receiver has time to get back into read mode.

since you're getting into long interframe inactivity (you said the RX ISR can't be high priority) maybe you can build a DMA circular buffer and interrupt periodically with a timer such that at least two events are triggered in each frame space. when you detect no DMA addr advance, you queue the current DMA addr for the packet parser, which looks for valid data between queued addresses.
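the idea above reduces to sampling the DMA write position (buffer length minus CNDTR on a circular channel) on each timer tick and flagging a frame boundary when it stops moving. a minimal single-boundary sketch (the full version would queue the addresses, as described):

```c
/* Timer-tick frame detection over a circular DMA buffer: if the write
 * position hasn't advanced since the previous tick, the line has been
 * quiet for at least one tick period, so [frame_end, cur_pos) is a
 * complete frame.  cur_pos would come from BUF_LEN - CNDTR in real code. */
#include <assert.h>
#include <stdint.h>

#define BUF_LEN 256u

uint32_t last_pos;    /* write position seen at the previous tick */
uint32_t frame_end;   /* position of the last detected boundary   */
int      frame_ready; /* set when a new frame boundary is found   */

void timer_tick(uint32_t cur_pos)
{
    if (cur_pos == last_pos && cur_pos != frame_end) {
        frame_end   = cur_pos;   /* no advance and new data: boundary */
        frame_ready = 1;
    }
    last_pos = cur_pos;
}
```

with the tick period chosen so at least two ticks land in each interframe gap, a boundary is detected at most two tick periods after the last byte.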

maybe you can tie the rx pin to a timer trigger in parallel, and detect inactivity using the timer.

> Should I “SBK” (Send BreaK) in USART_CR1 when the TXE status sets? Should I SBK in the DMA interrupt?

if you don't want to test (ok I agree, testing shouldn't be necessary, ST should give us better docs) I guess you could SBK in the TC handler. (then time and clear it? not that ugly given that you already need to time here with your current design before next TX.)

BTW, doesn't the macrocell have a 9-bit data mode you can use for framing?

or you could use a circular DMA buffer with in-data framing to avoid guard times and breaks: 7-bit words, or frame marker escaping.
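the frame-marker-escaping option can be sketched like this; the marker and escape values are arbitrary illustrative choices (they happen to match HDLC/PPP-style byte stuffing), not anything from this thread:

```c
/* In-data framing by byte stuffing: MARK terminates a frame; any MARK
 * or ESC inside the payload is sent as ESC followed by the byte XORed
 * with ESC_XOR.  Output needs at most 2*n + 1 bytes.  Decode (strip
 * MARK, un-XOR after ESC) is the mirror image and is omitted here. */
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

#define MARK    0x7Eu
#define ESC     0x7Du
#define ESC_XOR 0x20u

size_t frame_encode(const uint8_t *in, size_t n, uint8_t *out)
{
    size_t o = 0;
    for (size_t i = 0; i < n; i++) {
        if (in[i] == MARK || in[i] == ESC) {
            out[o++] = ESC;
            out[o++] = in[i] ^ ESC_XOR;  /* transformed copy */
        } else {
            out[o++] = in[i];
        }
    }
    out[o++] = MARK;                     /* frame terminator */
    return o;
}
```

this trades a little bandwidth (worst case doubles the payload) for a receiver that never needs guard times, breaks, or idle detection.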

picguy
Associate II
Posted on May 17, 2011 at 12:42

lanchon replied 25-08-2008 at 11:53

>since you're getting into long interframe inactivity (you said the RX ISR can't be high priority)

The time between frames is as short as possible. I see where my “I can not sit in the max priority ISR and loop...” could have been interpreted as the RX ISR having low priority. In fact, everything having to do with my serial LAN is max priority. As it is I have only 45 time slots for our longest message. (A three-byte msg prompts the sending of a 70-byte data message. My time slots for this are something like 80.1 byte times because I have to revisit all 45 time slots at 125 Hz. And then I have both waits before end of message is recognized. It’s tight. I would love to run at 12 Mbaud, but then I would support 100 time slots and have the same problem....)

>build a DMA circular buffer and interrupt periodically with a timer

Interesting thought. Given the product there is a very strict timing requirement between the receipt of the 70-byte message and that message causing an all-important action. Otherwise polling every 500 to 1000 microseconds sounds good. I would need something to recover from garbled messages. That part would be tricky but doable. (Checksums and the limited number of valid message types.)
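A minimal sketch of the checksum screen mentioned above, purely for illustration (this is not the product's actual message format): make the last byte the two's-complement of the sum of the preceding bytes, so a clean message sums to zero.

```c
/* Modular-sum checksum check: a valid message's bytes, including the
 * trailing checksum byte, sum to zero modulo 256. */
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

int msg_checksum_ok(const uint8_t *msg, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum = (uint8_t)(sum + msg[i]);
    return sum == 0;
}
```

Combined with the limited set of valid message types, this gives the parser a cheap way to discard a garbled slot and wait for the replay.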

We already have designed in a mechanism to replay lost or garbled commands. The long message is much like the USB isochronous mode – on time is more important than integrity.

>9-bit data mode

That and DMA might work with half-word mode in the DMA. But it’s so nice to move data from the receive buffer to where it is used with ldmia/stmia, 4 to 6 registers, and auto-increment of the address registers. And in addition, 9-bit mode won’t give us the fast end-of-message interrupt we need.

In all, lanchon, interesting thoughts, especially “ST should give us better docs.” Given the product it looks like: (best) IDLE, if it’s supposed to work and I have some bug somewhere; then break (not bad but with potential gotchas); then (don’t make me do this) a programmed-I/O interrupt per byte read.

dpereverzoff
Associate II
Posted on May 17, 2011 at 12:42

I am working on this very problem now and believe that I am close to a solution. I figure that I might as well use a DMA if I have one.

The Idle detector seems to work for me using DMA except in IrDA mode (I will work on that next week).

To detect end of transmission I enable the DMA transfer-complete interrupt. When it fires, the USART has still not finished sending, so I use that interrupt to enable the Transmit Complete interrupt in the USART; when that one fires the frame has actually been sent and I can turn off the transmitter.
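The two-stage handoff described above can be sketched as a tiny state machine. The state names and flag variables here are illustrative mocks; in real code the first ISR would set TCIE in USART_CR1 and the second would drop the RS-485 driver-enable GPIO.

```c
/* DMA-done -> USART-TC chaining: DMA transfer-complete means all bytes
 * were handed to the USART (last byte(s) still shifting out); USART TC
 * means the last bit has left the shifter, so the RS-485 driver can be
 * turned off without truncating the frame. */
#include <assert.h>

typedef enum { TX_IDLE, TX_DMA_RUNNING, TX_DRAINING } tx_state_t;

tx_state_t tx_state = TX_DMA_RUNNING;
int tc_irq_enabled  = 0;
int rs485_driver_on = 1;

void dma_tx_complete_isr(void)      /* stage 1: buffer fully handed over */
{
    tc_irq_enabled = 1;             /* real code: USART_CR1 |= TCIE */
    tx_state = TX_DRAINING;
}

void usart_tc_isr(void)             /* stage 2: shifter actually empty */
{
    if (tx_state == TX_DRAINING) {
        rs485_driver_on = 0;        /* real code: clear the DE GPIO */
        tc_irq_enabled  = 0;
        tx_state = TX_IDLE;
    }
}
```

Enabling TCIE only after DMA completes avoids taking a TC interrupt per byte during the body of the frame.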

I believe that I am going to have to use a circular buffer for the IrDA input (because Idle is not defined for IrDA).

picguy
Associate II
Posted on May 17, 2011 at 12:42

My use for IDLE involves variable-length messages. I set the DMA length larger than any valid message. During testing with somewhat primitive code I received multiple messages; then my DMA count expired, and only then did IDLE status come up.

If you know the length you might as well interrupt on DMA count = 0. If you wish you can respond (mailto) to my user name on this forum at my http://www.hmtown.com/ domain name.