
STM32H743ZI UART Tx with DMA doesn't want to turn on

CLeo.1
Senior II

Hi guys, it's been a while and I hope everyone is well and safe.

Using an STM32H743ZI

Reference Manual

I am having issues turning on the DMA1 stream associated with the UART4 Tx, in this case DMA1_Stream1. The DMA is set up in one-shot (Normal) mode with a dynamic length.

The only way I can currently turn the specific stream on is by clearing the "DIR" field within the stream's CR register in DMA1.

I have double-checked that the memory where the buffer is located is not in a region the DMA can't access. The address is: 0x2001ff10

What I am trying to do is:

Set up a CLI to maintain communication between the STM32H7 and the PC via UART4.

The way I am planning for it to work: say I want to send the string "Hello World".

It will store "Hello World" into the Tx buffer associated with the Tx DMA, I'll enable the stream to send it off, and it will interrupt me once it's completed so I can reset the flag.

Setup of the driver:

CLI::CLI() {
 
	baudRate = CLIBaudRate;
 
	LL_APB1_GRP1_EnableClock(LL_APB1_GRP1_PERIPH_UART4);
	LL_AHB4_GRP1_EnableClock(LL_AHB4_GRP1_PERIPH_GPIOD);
	LL_AHB1_GRP1_EnableClock(LL_AHB1_GRP1_PERIPH_DMA1);
 
	//UART SETUP
	LL_USART_SetDataWidth(UART4, LL_USART_DATAWIDTH_8B);
	LL_USART_SetBaudRate(UART4, CLIClock, LL_USART_PRESCALER_DIV1, LL_USART_OVERSAMPLING_16, baudRate);
	LL_USART_SetStopBitsLength(UART4, LL_USART_STOPBITS_1);
	LL_USART_EnableDMAReq_RX(UART4);
	LL_USART_EnableDMAReq_TX(UART4);
	LL_USART_EnableDirectionRx(UART4);
	LL_USART_EnableDirectionTx(UART4);
 
	//CLI::enableRx();
	 CLI::enableTx();
 
	LL_USART_Enable(UART4);
}
uint8_t CLI::enableTx() {
 
	//GPIO Setup
	LL_GPIO_SetPinMode(GPIOD, LL_GPIO_PIN_1, LL_GPIO_MODE_ALTERNATE);
	LL_GPIO_SetPinSpeed(GPIOD, LL_GPIO_PIN_1, LL_GPIO_SPEED_FREQ_VERY_HIGH);
	LL_GPIO_SetPinOutputType(GPIOD, LL_GPIO_PIN_1, LL_GPIO_OUTPUT_PUSHPULL);
	LL_GPIO_SetPinPull(GPIOD, LL_GPIO_PIN_1, LL_GPIO_PULL_NO);
	LL_GPIO_SetAFPin_0_7(GPIOD, LL_GPIO_PIN_1, LL_GPIO_AF_8);
 
	//DMA SETUP
	LL_DMA_SetPeriphRequest(DMA1, LL_DMA_STREAM_1, LL_DMAMUX1_REQ_UART4_TX);
	LL_DMA_SetPeriphAddress(DMA1, LL_DMA_STREAM_1, (uint32_t)&UART4->TDR);
	LL_DMA_SetMemoryAddress(DMA1, LL_DMA_STREAM_1, (uint32_t)CLI_TxBuff);
	LL_DMA_SetDataTransferDirection(DMA1, LL_DMA_STREAM_1, LL_DMA_DIRECTION_MEMORY_TO_PERIPH);
	LL_DMA_SetPeriphSize(DMA1, LL_DMA_STREAM_1, LL_DMA_PDATAALIGN_BYTE);
	LL_DMA_SetMemorySize(DMA1, LL_DMA_STREAM_1, LL_DMA_PDATAALIGN_BYTE);
	LL_DMA_SetMode(DMA1, LL_DMA_STREAM_1, LL_DMA_MODE_NORMAL);
	LL_DMA_SetDataLength(DMA1, LL_DMA_STREAM_1, 0);
	LL_DMA_SetMemoryIncMode(DMA1, LL_DMA_STREAM_1, LL_DMA_MEMORY_INCREMENT);
	LL_DMA_SetStreamPriorityLevel(DMA1, LL_DMA_STREAM_1, LL_DMA_PRIORITY_MEDIUM);
	LL_DMA_EnableIT_TC(DMA1, LL_DMA_STREAM_1);
 
	NVIC_EnableIRQ(DMA1_Stream1_IRQn);
	NVIC_SetPriority(DMA1_Stream1_IRQn, 0);
 
	return SUCCESS;
}
void WHATAMI::execute(char * TxBuff) {
 
	const char * topBorder = "*********************\n\r";
 
	// strlen, not sizeof: sizeof(topBorder) is the size of the pointer,
	// not the length of the string it points to.
	size_t len = strlen(topBorder);
	memcpy(TxBuff, topBorder, len);
	LL_DMA_SetDataLength(DMA1, LL_DMA_STREAM_1, len);
	LL_DMA_EnableStream(DMA1, LL_DMA_STREAM_1);
}

5 Replies
TDK
Guru

Can you see signals on the TX/RX lines? Or is the only insight into the system through what pops up on the console?

If you disable the DMA TX request and write to UART4->TDR directly, does it work? If so, look at the DMA initialization. If not, look at UART initialization.

I don't see any cache management. Is data cache disabled? If not, disable it while you're debugging.

If you feel a post has answered your question, please click "Accept as Solution".

CLeo.1
Senior II

Thank you for the reply!

This is the only insight, and the other issue is that the interrupts never get triggered.

When you say "disable the DMA Tx request", you mean disabling DMA1_Stream1->CR, not DMAMUX1_Channel1, right?

There's no cache management; I've never had to tinker with it. But I'll make sure it's disabled.

CLeo.1
Senior II

@TDK Sorry, ignore my blunt reply. Actually it's in the DTCM region; must have slipped past me. I am trying to change the linker file, but it's still holding on to its DTCM address. Any idea?

TDK
Guru

> I have double-checked that the memory where the buffer is located is not in a region the DMA can't access. The address is: 0x2001ff10

I missed this. Edit: it's in DTCM, which you figured out. Surely this is one reason it's not working.

Here's the FAQ entry for this:

https://community.st.com/s/article/FAQ-DMA-is-not-working-on-STM32H7-devices

Basically, go into your linker script and replace usage of DTCMRAM with RAM_D1, assuming both are defined. If TxBuff is defined on the stack, you'll need the stack in RAM_D1 as well:

_estack = ORIGIN(RAM_D1) + LENGTH(RAM_D1);
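In the default CubeIDE-style script the change looks roughly like this (a sketch; section names and the `...` contents vary between script versions, so adapt to what your script actually defines):

```
/* Before: output sections placed in DTCMRAM, which DMA1 cannot reach */
.data : { ... } >DTCMRAM AT> FLASH

/* After: place them (and the stack) in RAM_D1 / AXI SRAM instead */
_estack = ORIGIN(RAM_D1) + LENGTH(RAM_D1);
.data : { ... } >RAM_D1 AT> FLASH
.bss  : { ... } >RAM_D1
```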

If you can't get it to work, post your linker script.
