
Why would an ADC sample in 8-bit mode when DMA is used, but retrieve the full 12-bit conversion without DMA? - STM32F411

TCloe.1
Associate II

I am trying to set up an example project for ADC with DMA using HAL library.

I am sampling a slow ramp function from 0-3.3V over 2500ms.

For conventional ADC setup:

When debugging the code I can see that the retrieved ADC value is the full 12 bits.

For ADC with DMA:

When debugging I see 2 possible issues.

-> Firstly, the retrieved ADC value is 8-bit: the value only reaches 255, then overflows and starts at 0 again.

-> Secondly, the buffer array does not seem to populate in either normal or circular fashion. Only the first element is populated, with the above-mentioned 8-bit value.

My ADC and DMA setup is as follows:

static void MX_ADC1_Init(void)
{
 ADC_ChannelConfTypeDef sConfig = {0};

 hadc1.Instance = ADC1;
 hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV8;
 hadc1.Init.Resolution = ADC_RESOLUTION_12B;
 hadc1.Init.ScanConvMode = DISABLE;
 hadc1.Init.ContinuousConvMode = DISABLE;
 hadc1.Init.DiscontinuousConvMode = DISABLE;
 hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE;
 hadc1.Init.ExternalTrigConv = ADC_SOFTWARE_START;
 hadc1.Init.DataAlign = ADC_DATAALIGN_RIGHT;
 hadc1.Init.NbrOfConversion = 1;
 hadc1.Init.DMAContinuousRequests = ENABLE;
 hadc1.Init.EOCSelection = ADC_EOC_SINGLE_CONV;
 if (HAL_ADC_Init(&hadc1) != HAL_OK)
 {
  Error_Handler();
 }

 sConfig.Channel = ADC_CHANNEL_0;
 sConfig.Rank = 1;
 sConfig.SamplingTime = ADC_SAMPLETIME_28CYCLES;
 if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
 {
  Error_Handler();
 }
}

static void MX_DMA_Init(void)
{
 /* DMA controller clock enable */
 __HAL_RCC_DMA2_CLK_ENABLE();

 /* DMA interrupt init */
 /* DMA2_Stream4_IRQn interrupt configuration */
 HAL_NVIC_SetPriority(DMA2_Stream4_IRQn, 0, 0);
 HAL_NVIC_EnableIRQ(DMA2_Stream4_IRQn);

 hdma_adc1.Instance = DMA2_Stream4;
 hdma_adc1.Init.Channel = DMA_CHANNEL_0;
 hdma_adc1.Init.Direction = DMA_PERIPH_TO_MEMORY;
 hdma_adc1.Init.PeriphInc = DMA_PINC_DISABLE;
 hdma_adc1.Init.MemInc = DMA_MINC_ENABLE;
 hdma_adc1.Init.PeriphDataAlignment = DMA_PDATAALIGN_WORD;
 hdma_adc1.Init.MemDataAlignment = DMA_MDATAALIGN_WORD;
 hdma_adc1.Init.Mode = DMA_CIRCULAR;
 hdma_adc1.Init.Priority = DMA_PRIORITY_HIGH;
 hdma_adc1.Init.FIFOMode = DMA_FIFOMODE_DISABLE;
}
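
For completeness, HAL_DMA_Init() and the link to the ADC handle are done in HAL_ADC_MspInit(), along these lines (standard CubeMX output, abbreviated; PA0 assumed as the channel 0 pin):

void HAL_ADC_MspInit(ADC_HandleTypeDef* hadc)
{
 GPIO_InitTypeDef GPIO_InitStruct = {0};
 if (hadc->Instance == ADC1)
 {
  /* Peripheral clock enables */
  __HAL_RCC_ADC1_CLK_ENABLE();
  __HAL_RCC_GPIOA_CLK_ENABLE();

  /* PA0 -> ADC1 channel 0, analog mode */
  GPIO_InitStruct.Pin = GPIO_PIN_0;
  GPIO_InitStruct.Mode = GPIO_MODE_ANALOG;
  GPIO_InitStruct.Pull = GPIO_NOPULL;
  HAL_GPIO_Init(GPIOA, &GPIO_InitStruct);

  /* Register the stream and attach it to the ADC handle */
  if (HAL_DMA_Init(&hdma_adc1) != HAL_OK)
  {
   Error_Handler();
  }
  __HAL_LINKDMA(hadc, DMA_Handle, hdma_adc1);
 }
}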

1 ACCEPTED SOLUTION


> S2CR register is completely clear during debugging, (0x0).

Aren't you trying to use Stream4?

JW


14 REPLIES

I don't use Cube, but this:

 hadc1.Init.Resolution = ADC_RESOLUTION_12B;

sounds to me like you are setting the ADC to 12-bit and not to 8-bit.

JW

TCloe.1
Associate II

Thanks for the quick reply,

This is exactly what I mean: I do explicitly set 12-bit mode, but I only see 8 bits when debugging.

hadc1.Init.ContinuousConvMode = DISABLE; << NO. How exactly is it being triggered?

Try clearing the buffer with an impossible value like 0xE5E5, to see if it's actually being written by DMA at all.

Look at the DMA peripheral registers: do they move, or has it thrown an error?

Check address used and alignment.
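
For example (a sketch; the adc_buf name, uint16_t type, and length of 10 are assumptions):

#include <string.h>

uint16_t adc_buf[10];

/* Prefill every byte with 0xE5 so each element reads 0xE5E5,
   a value a 12-bit ADC (max 0x0FFF) can never produce */
memset(adc_buf, 0xE5, sizeof(adc_buf));

if (HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, 10) != HAL_OK)
 Error_Handler();

/* Afterwards, any element still reading 0xE5E5 was never written by the DMA */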

TDK
Guru

There aren't any bugs in the code you've presented that would explain this issue. The bug is likely with how you're starting the ADC DMA conversion or interpreting the results.

> DMA_PDATAALIGN_WORD

This means you're transferring 32 bits on each transfer. Half of those will be wasted. It makes more sense to change this to half-word and use a uint16_t array to store results.
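
That change would look something like this (a sketch; the buffer name is an assumption):

/* Half-word transfers: one 12-bit result per 16-bit slot */
hdma_adc1.Init.PeriphDataAlignment = DMA_PDATAALIGN_HALFWORD;
hdma_adc1.Init.MemDataAlignment = DMA_MDATAALIGN_HALFWORD;

uint16_t adc_buf[10]; /* element size matches the transfer size */
HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, 10);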

> only see 8 bits when debugging

How exactly are you doing this?

TCloe.1
Associate II

Thanks for the reply.

So I initialised the buffer with an impossible value as you suggested.

The DMA does not populate the buffer at all; the buffer stays unchanged.

Looking at the DMA2 registers, the S2CR register is completely clear during debugging (0x0). I'm not sure what to look out for to tell if an error has occurred.

I know that the ADC is converting, as HAL_ADC_ConvCpltCallback() is triggered after 10 conversions (the length of the buffer).

Hi TDK,

I agree that transferring the full word is wasteful, I will change this to half-word going forward.

When debugging, I am using CubeIDE's Expressions view to look at the buffer values after each conversion; I hope this answers your question.

> S2CR register is completely clear during debugging, (0x0).

Aren't you trying to use Stream4?

JW
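
In code, Stream4's control register is DMA2_Stream4->CR (S4CR in the reference manual), and streams 4-7 report status through HISR rather than LISR. A quick check, as a minimal sketch:

uint32_t cr = DMA2_Stream4->CR; /* S4CR; the EN bit is set while the stream is running */
uint32_t err = DMA2->HISR & DMA_HISR_TEIF4; /* transfer error flag for Stream4 */
uint32_t done = DMA2->HISR & DMA_HISR_TCIF4; /* transfer complete flag for Stream4 */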

I meant more along the lines of explaining with code how you're setting up the ADC DMA conversions and how the buffer is defined (uint8_t's?).


Correct, I missed that. Thank you.

S4CR_DMEIE and S4CR_TEIE are set; additionally, S4CR_TCIE is set after a full conversion has been done.