UART Transmission, Interrupt vs DMA

IJo.1
Associate III

I am working on a project that uses UART communication, and I have a question: what is really different between interrupt mode and DMA mode for TX, apart from their underlying mechanism?

(HAL_UART_Transmit_IT vs HAL_UART_Transmit_DMA)

I know DMA mode uses the DMA controller, which works independently of the CPU, while interrupt mode uses the UART peripheral directly,

but both work in non-blocking mode, and both raise an interrupt when all the data has been transmitted (HAL_UART_TxCpltCallback).

The only other thing I can find is that DMA can use the half-complete callback (HAL_UART_TxHalfCpltCallback). Is that the only difference besides the mechanism?
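For reference, this is roughly how I call them (a simplified sketch; huart2 and the short test message are just placeholder names from my setup):

    #include "main.h"                              // CubeMX project header, pulls in the HAL

    extern UART_HandleTypeDef huart2;              // UART handle generated by CubeMX (placeholder)
    static uint8_t msg[] = "hello from non-blocking TX\r\n";
    volatile uint8_t tx_done = 0;

    void start_tx_it(void)
    {
        tx_done = 0;
        HAL_UART_Transmit_IT(&huart2, msg, sizeof(msg) - 1);   // returns immediately
    }

    void start_tx_dma(void)
    {
        tx_done = 0;
        HAL_UART_Transmit_DMA(&huart2, msg, sizeof(msg) - 1);  // returns immediately
    }

    // Called by the HAL once the whole buffer has been sent, in both IT and DMA mode
    void HAL_UART_TxCpltCallback(UART_HandleTypeDef *huart)
    {
        if (huart->Instance == USART2)
        {
            tx_done = 1;
        }
    }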

I tried all three methods (polling, interrupt, and DMA), and they all worked fine.

The project needs to do something else while it transmits UART data, so I need to use either interrupt mode or DMA mode.

Both sound good, but I am really wondering what is different between the two.

If they produce the same result and just use different internal mechanisms or peripherals inside the MCU, when should I use DMA and when should I use interrupt mode?

I am a bit confused and need your help!

Thank You 🙂

TDK
Guru

IT interrupts for every byte that gets sent whereas DMA can send data without use of the CPU. In this way, IT is much more CPU intensive. However, DMA requires use of a DMA stream/channel and there are only so many of those.

If you feel a post has answered your question, please click "Accept as Solution".
IJo.1
Associate III

Hi TDK,

thank you for the answer!

I understand how the MCU receives UART data, but I am quite confused about what happens when it transmits.

When I try HAL_UART_Transmit_IT (note that it is not Receive_IT), it looks like the interrupt is triggered only when it finishes sending all the data, and the same thing happens when I use DMA.

And I guess that is why the UART transmit process using interrupt mode is called non-blocking: the interrupt is not triggered for every byte sent, only once when the whole job is complete. Otherwise, wouldn't it disturb the other work on the CPU?

I presume you are talking about the case where it RECEIVES data using HAL_UART_Receive_IT?

I have verified that Receive_IT triggers an interrupt for every byte received, and that loads the CPU quite a bit. In that case, DMA would be the best choice.

Am I on the right track, or am I missing something?

Thanks,

I. Jo

IJo.1
Associate III

I tested interrupt mode and DMA mode by sending a big character array while another job was running in the main while loop.
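The test looked roughly like this (a simplified sketch; big_buf, its size, and the LED pin are just placeholders):

    #include "main.h"                              // CubeMX project header, pulls in the HAL
    #include <string.h>

    extern UART_HandleTypeDef huart2;              // placeholder handle name
    static uint8_t big_buf[4096];                  // the "big character array"

    void run_tx_test(void)
    {
        memset(big_buf, 'A', sizeof(big_buf));

        // Non-blocking: returns right away, the transfer continues in the background
        HAL_UART_Transmit_DMA(&huart2, big_buf, sizeof(big_buf));
        // or: HAL_UART_Transmit_IT(&huart2, big_buf, sizeof(big_buf));

        while (1)
        {
            // the "other job": keeps running while the UART is still sending
            HAL_GPIO_TogglePin(GPIOA, GPIO_PIN_5); // placeholder LED pin
            HAL_Delay(100);
        }
    }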

Polling mode is blocking, and of course the main program doesn't run properly.

DMA makes sense for this; the main program works well.

IT mode also works as non-blocking, and the main program works perfectly.

But if interrupt mode (IT) triggers an interrupt for every byte sent, will the main work be disturbed, and would the main program be affected somehow because the interrupt fires too often?

Need your help!

Thank you 😉

Both transmitting and receiving with interrupts will interrupt on every byte.
Perhaps you are not looking at the right interrupt. The IRQ name will vary by peripheral and chip but should be something like USART1_IRQHandler. This calls HAL_UART_IRQHandler which calls HAL callbacks as appropriate.
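One quick way to see the per-byte interrupts (a sketch, assuming USART1 and a CubeMX-generated stm32f4xx_it.c; the file and IRQ names vary by family and instance):

    extern UART_HandleTypeDef huart1;
    volatile uint32_t uart_irq_count = 0;      // hypothetical counter, just for the test

    void USART1_IRQHandler(void)
    {
        uart_irq_count++;                      // in IT mode this runs roughly once per byte
                                               // (plus a final transfer-complete interrupt);
                                               // put a breakpoint here to watch it
        HAL_UART_IRQHandler(&huart1);          // HAL dispatches to the Tx/Rx/error callbacks
    }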
If you feel a post has answered your question, please click "Accept as Solution".

> But if interrupt mode (IT) triggers an interrupt for every byte sent, will the main work be disturbed, and would the main program be affected somehow because the interrupt fires too often?

Yes, that's the drawback. IT mode is more CPU intensive than DMA mode. Whether it's "too much" depends on the details.
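As a rough, made-up illustration of "depends on the details": at 115200 baud with 8N1 framing (10 bits per byte on the wire), IT mode generates about 115200 / 10 = 11520 transmit interrupts per second; if each ISR costs on the order of 1 µs, that is roughly 11520 µs, i.e. about 1.2% of the CPU spent in the UART ISR. At 9600 baud it drops to about 960 interrupts per second, which is usually negligible. DMA mode replaces all of that with one or two interrupts per buffer.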

If you feel a post has answered your question, please click "Accept as Solution".
IJo.1
Associate III

Thank you! I think I am getting what you're saying.

I think I need to check the IRQ handler itself, not just the complete callback.

And since there were only the UART interrupts (TX and RX) in my test, the interrupts didn't load the CPU much.

Did I get that right?

Thanks, it helps a lot!

best

I. Jo

Most STM32 parts interrupt for every byte in IRQ mode; the HAL callback typically only occurs once all of the data transmission is accounted for. With DMA you typically get two interrupts per transaction, at least with HAL, at the half-transfer and transfer-complete points. At that point the data will not have finished crossing the wire.

DMA is typically best for high data rates and for large blocks of data, which can be chained or be part of a scatter-gather list driving dispatch.

I'd ditch most of the HAL implementation of either the IRQ or the DMA mode, as they are relatively simple to manage at a lower level and to integrate with better code/buffering.
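A minimal sketch of the two HAL DMA-mode callbacks mentioned above (assuming a handle named huart2; in a double-buffer scheme the half-transfer point is where you would refill the first half):

    #include "main.h"                                  // CubeMX project header, pulls in the HAL

    extern UART_HandleTypeDef huart2;                  // placeholder handle name
    volatile uint8_t first_half_done = 0, buffer_done = 0;

    // DMA has pushed the first half of the buffer to the UART
    void HAL_UART_TxHalfCpltCallback(UART_HandleTypeDef *huart)
    {
        if (huart->Instance == USART2)
            first_half_done = 1;                       // e.g. refill the first half here
    }

    // The whole buffer has been handed over
    void HAL_UART_TxCpltCallback(UART_HandleTypeDef *huart)
    {
        if (huart->Instance == USART2)
            buffer_done = 1;
    }

    // If you need to be certain the last frame has actually left the wire
    // (e.g. before disabling the transmitter), check the TC flag as well:
    static inline int line_idle(void)
    {
        return __HAL_UART_GET_FLAG(&huart2, UART_FLAG_TC) != 0;
    }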

Tips, Buy me a coffee, or three.. PayPal Venmo
Up vote any posts that you find helpful, it shows what's working..

The transfer complete callback is only going to trigger when the transfer is complete. It won't trigger on every byte, but the actual UART IRQ will. So if you want to confirm that, set up your breakpoint there.
If you feel a post has answered your question, please click "Accept as Solution".
Piranha
Chief II

Just drop that broken bloatware and implement something like this:

https://github.com/MaJerle/stm32-usart-uart-dma-rx-tx