NUCLEO-U575ZI DMA->DAC doesn't work (USEF)

Skfir
Associate III

Hello everybody! I have a sine-wave array that I feed to the DAC, triggered by TIM6. When the TIM6 interrupt routine supplies the samples, everything works fine, but when I try to use DMA, nothing works: the DMA keeps reporting the USEF (user setting error) flag. Please help! I have been trying to figure this out for several days already. For testing purposes I am just trying to transfer a single 16-bit variable via DMA, to no avail. Any advice would be appreciated.

void DAC_init(void){
	RCC->AHB3ENR |= RCC_AHB3ENR_DAC1EN;		//Connecting DAC to the bus
	DAC1->CR &= ~(DAC_CR_EN1);		//Disable the DAC before applying settings
	DAC1->MCR |= DAC_MCR_HFSEL_0;	//Must set this bit, since the AHB frequency is > 80 MHz
	DAC1->MCR |= DAC_MCR_SINFORMAT1 + DAC_MCR_SINFORMAT2;	//DAC now accepts signed numbers (no need to apply bias)
	TIMER_dac_init();	//Setting up the timer for triggering DAC
	DAC1->DHR12LD = 128;  //Before the DMA request is generated it is necessary to perform the first write.
	DAC1->CR |= DAC_CR_TEN1 +(5 << DAC_CR_TSEL1_Pos);	//DAC channel1 will be hardware triggered by the timer 6
	DAC1->CR |= DAC_CR_DMAEN1;	//Enable DMA for the channel 1
	DAC1->CR |= DAC_CR_EN1;		//Enabling DAC channel1
 
	static short i = 32000;
	GPDMA1_Channel14->CCR &= ~(DMA_CCR_EN);	//Disable DMA channel
	GPDMA1_Channel14->CCR |= (DMA_CCR_PRIO_1);	//Set DMA channel priority
	GPDMA1_Channel14->CTR1 |= (DMA_CTR1_DDW_LOG2_0 + DMA_CTR1_SDW_LOG2_0);  //The data is 16-bit wide (in signed mode the DAC accepts 16-bit values and ignores the lower 4 bits)
	GPDMA1_Channel14->CTR2 |= (DMA_CTR2_DREQ + (2 << DMA_CTR2_REQSEL_Pos) + DMA_CTR2_BREQ);	//Channel request is driven by destination, DMA trigger, Block transfer mode
	GPDMA1_Channel14->CDAR = (volatile long unsigned int)(&(DAC1->DHR12LD));		//Set DAC as the destination for GPDMA transfer
	GPDMA1_Channel14->CSAR = (volatile long unsigned int)&i;	//Pointing DMA at the source data location
	GPDMA1_Channel14->CBR1 |= (1 << DMA_CBR1_BRC_Pos); //Block is transferred just once
	GPDMA1_Channel14->CBR1 |= (1 << DMA_CBR1_BNDT_Pos); //Transfer 2 bytes per block (16-bit data)
	GPDMA1_Channel14->CCR |= (DMA_CCR_EN); //Enable DMA
}

Accepted Solution
Skfir
Associate III

I've cracked it!

The value DMA_CTR1_DDW_LOG2_0 has to be changed to DMA_CTR1_DDW_LOG2_1.

Apparently the DAC only accepts word-aligned (32-bit) accesses.

And there is one more mistake: "1 << DMA_CBR1_BNDT_Pos" should be "2 << DMA_CBR1_BNDT_Pos", because BNDT counts bytes and we transfer 2 bytes at a time, not one. F@^%k my old boots, haha!! Spent nearly two days figuring this out! Hopefully someone will find this story useful.

But there is one thing I still don't understand, and I would be grateful for any comments. The reference manual says: "As DAC_DHRx to DAC_DORx data transfer occurred before the DMA request, the very first data has to be written to the DAC_DHRx before the first trigger event occurs."

Does this mean that during DAC init I must write some dummy value to DAC_DHRx in order to get the DMA transfers running? I just cannot understand it... Everything seems to work without it, though.
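My reading of that note (an assumption based only on the quoted text, not verified on hardware): the trigger is what moves DHRx into DORx, and it is that transfer which raises the next DMA request, so software has to prime DHRx once before the first trigger or the first request never fires. As an init-order fragment (register names as in the code above; not a complete init):

```c
/* Prime the holding register once by software, so the very first trigger
 * has data to move into DAC_DOR1 and can then raise the first DMA request. */
DAC1->DHR12LD = 128;

/* Only afterwards arm the trigger and DMA request generation; every later
 * DHR write should then come from the GPDMA channel. */
DAC1->CR |= DAC_CR_TEN1 | (5u << DAC_CR_TSEL1_Pos);  /* trigger = TIM6 */
DAC1->CR |= DAC_CR_DMAEN1;
DAC1->CR |= DAC_CR_EN1;
```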
