I'm trying to implement USART Rx of variable-length, unknown-length messages at a 1 Mbps baud rate through DMA. I would like to minimize the latency from receiving the data to consuming it, so I'm trying to do this with some sort of idle detection as opposed to having my consumer code periodically drain a buffer.
I can attach my code if necessary, but for right now I'm mostly looking for high-level answers on how this is best accomplished to make sure I'm on the right track.
My first attempt was to configure a buffer in DMA and enable the USART IDLE interrupt. In the USART interrupt handler, I first disable DMA (and wait for the enable bit to clear on the DMA stream) and then clear the IDLE flag (by reading the bit and then reading the data register). I copy the data out of the DMA buffer and then re-enable DMA in the interrupt to be ready for the next message.
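For reference, this is roughly what my handler does today, simplified into a self-contained sketch. The register structs, instance names (`usart2`, `dma_rx`), and bit positions here are mock stand-ins I wrote for illustration in F4-style conventions, not taken from any device header; real code would use the vendor header (e.g. `stm32f4xx.h`):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define RX_BUF_LEN 256u

/* Mock register layouts for illustration only. */
typedef struct { volatile uint32_t SR, DR; }   usart_regs_t;
typedef struct { volatile uint32_t CR, NDTR; } dma_stream_regs_t;

#define USART_SR_IDLE (1u << 4)
#define DMA_SxCR_EN   (1u << 0)

static usart_regs_t      usart2;   /* stand-in for the real USART instance */
static dma_stream_regs_t dma_rx;   /* stand-in for the Rx DMA stream       */

static uint8_t rx_dma_buf[RX_BUF_LEN]; /* buffer the DMA writes into */
static uint8_t msg[RX_BUF_LEN];        /* copy handed to the consumer */
static size_t  msg_len;

/* Roughly what my IDLE interrupt handler does. */
void usart_idle_isr(void)
{
    /* Disable the DMA stream and wait for EN to actually read back as 0
     * (on real hardware this can take a moment; the mock clears at once). */
    dma_rx.CR &= ~DMA_SxCR_EN;
    while (dma_rx.CR & DMA_SxCR_EN) { /* spin */ }

    /* Clear IDLE with the status-register-read-then-data-register-read
     * sequence. */
    (void)usart2.SR;
    (void)usart2.DR;

    /* NDTR counts down from RX_BUF_LEN, so received = RX_BUF_LEN - NDTR. */
    msg_len = RX_BUF_LEN - dma_rx.NDTR;
    memcpy(msg, rx_dma_buf, msg_len);

    /* Re-arm the stream for the next message. */
    dma_rx.NDTR = RX_BUF_LEN;
    dma_rx.CR  |= DMA_SxCR_EN;
}
```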
Doing this gets mixed results: I do get valid packets occasionally, but most of the time the received data falls into one or more of these failure categories:
- Shifted data - The valid data starts at byte 1 instead of byte 0 of the DMA buffer. The extra byte is very often a duplicate of the start byte. Where is the extra byte coming from? Should I be disabling the UART peripheral as well?
- Truncated message - A valid message is in the buffer, but it ends abruptly. Did the IDLE interrupt fire early, or did the sender stall mid-message (which would legitimately trigger IDLE)?
- Data underrun - NDTR indicates N bytes were received, but memory inspection shows fewer than N bytes were actually written. Do I need to wait a short while after the IDLE flag for the in-flight DMA transfer to complete before disabling the stream?
I've spent a lot of time searching for this online, and the consensus seems to be that the IDLE interrupt is not the way to go, and that I should instead use a timer on the Rx pin (as described in AN3109). The reasoning behind that advice is unclear to me, though. I'd like to understand why IDLE doesn't work for this, and what the point of the IDLE interrupt is if not for this sort of application.