
STM32G071 ADC sequencer in not fully configurable mode: changing the ADC channel set at run time

Vagni
Associate II

On my custom board, based on the STM32G071CBT6 MCU, I use multichannel ADC conversions with DMA and continuous DMA requests, triggered by the TIM2 trigger output event and driven with the HAL library (STM32G0 firmware package V1.6.2). I get the ADC conversion results in the conversion complete callback.
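
For reference, the relevant part of the initialization looks roughly like this (a simplified sketch with CubeMX-style names and illustrative values, not my exact generated code):

#include "main.h"   // CubeMX project header (assumption)

/* Sketch of the setup described above: a timer-triggered multichannel
   scan with circular DMA (names follow the usual CubeMX convention). */
extern ADC_HandleTypeDef hadc1;
extern TIM_HandleTypeDef htim2;
extern uint16_t aADCxConvertedData[4];

static void ADC_Setup_Sketch(void)
{
    hadc1.Instance                   = ADC1;
    hadc1.Init.Resolution            = ADC_RESOLUTION_12B;
    hadc1.Init.ScanConvMode          = ADC_SCAN_ENABLE;              // fully configurable sequencer
    hadc1.Init.NbrOfConversion       = 4;
    hadc1.Init.ExternalTrigConv      = ADC_EXTERNALTRIG_T2_TRGO;     // TIM2 trigger output event
    hadc1.Init.ExternalTrigConvEdge  = ADC_EXTERNALTRIGCONVEDGE_RISING;
    hadc1.Init.DMAContinuousRequests = ENABLE;                       // DMA continuous requests
    hadc1.Init.SamplingTimeCommon1   = ADC_SAMPLETIME_79CYCLES_5;    // illustrative value
    HAL_ADC_Init(&hadc1);
}

// the conversion results are read here
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    // aADCxConvertedData[] now holds one complete sequence of results
}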

My first application firmware version initializes the ADC sequencer in fully configurable mode. It converts four ADC channels in sequence; three of them can be changed at run time to other ADC channels in the ADC_IN1–ADC_IN14 range. When a new ADC channel selection is needed, I execute the following:

const ADC_CHAN *pt = &ChanTab[index];
ADC_ChannelConfTypeDef sConfig = {0};

// stop current acquisition
HAL_ADC_Stop_DMA( &hadc1 );
HAL_TIM_Base_Stop( &htim2 );

// set ADC channels sequence
sConfig.SamplingTime = ADC_SAMPLINGTIME_COMMON_1;
sConfig.Channel = pt->ch1;          // the variable channel 1
sConfig.Rank = ADC_REGULAR_RANK_1;
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = pt->ch2;         // the variable channel 2
sConfig.Rank = ADC_REGULAR_RANK_2;
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = pt->ch3;         // the variable channel 3
sConfig.Rank = ADC_REGULAR_RANK_3;
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = ADC_CHANNEL_0;      // the fixed channel
sConfig.Rank = ADC_REGULAR_RANK_4;
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
// Start ADC conversions
HAL_ADC_Start_DMA( &hadc1, (uint32_t *)aADCxConvertedData, 4 );
// Enable Timer (sample frequency)
HAL_TIM_Base_Start(&htim2);

The ADC conversion results are always as good as expected, whichever ADC channel set is selected at run time.

My custom board then went through a hardware revision, adding another fixed analog input on ADC_IN16. Since the fully configurable sequencer only accepts channels ADC_IN0–ADC_IN14, I had to migrate the sequencer to not fully configurable mode and handle the variable ADC channel selection in the following new way:

const ADC_CHAN *pt = &ChanTab[index];
ADC_ChannelConfTypeDef sConfig = {0};

// stop current acquisition
HAL_ADC_Stop_DMA( &hadc1 );
HAL_TIM_Base_Stop( &htim2 );

// set ADC channels (in this mode the scan order follows the channel number)
sConfig.SamplingTime = ADC_SAMPLINGTIME_COMMON_1;  // same common sampling time as before
sConfig.Rank = ADC_RANK_CHANNEL_NUMBER;
sConfig.Channel = ADC_CHANNEL_0;      // the same fixed channel
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = pt->ch1;          // the variable channel 1
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = pt->ch2;         // the variable channel 2
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = pt->ch3;         // the variable channel 3
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
sConfig.Channel = ADC_CHANNEL_16;    // the new added fixed channel
HAL_ADC_ConfigChannel(&hadc1, &sConfig);
// Start ADC conversions
HAL_ADC_Start_DMA( &hadc1, (uint32_t *)aADCxConvertedData, 5 );
// Enable Timer (sample frequency)
HAL_TIM_Base_Start(&htim2);

After the ADC initialization from reset, the conversion results are as good as before, including on the newly added analog input.

But after every subsequent change of the variable ADC channels, a kind of offset (30–50 LSB) appears on the conversion results of all the analog inputs.

It looks as if the ADC calibration is lost after the first run-time channel selection. But upon reset I initialize the ADC with HAL_ADC_Init(&hadc1) and run the ADC calibration with HAL_ADCEx_Calibration_Start(&hadc1) before starting the very first ADC conversion.
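
In outline, the startup sequence after reset is:

// startup after reset (outline of what I already do)
HAL_ADC_Init(&hadc1);                  // ADC initialization
HAL_ADCEx_Calibration_Start(&hadc1);   // ADC calibration, before any conversion
// ... configure the five channels as in the code above ...
HAL_ADC_Start_DMA(&hadc1, (uint32_t *)aADCxConvertedData, 5);
HAL_TIM_Base_Start(&htim2);            // start the trigger timer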

Clearing the CCRDY flag before setting the ADC channels and waiting for CCRDY = 1 before restarting the ADC conversions seems to have no effect.
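
For completeness, this is the kind of handshake I tried around the reconfiguration (a sketch, assuming the ADC_FLAG_CCRDY definition from the STM32G0 HAL):

// attempted CCRDY handshake (no effect on the offset)
__HAL_ADC_CLEAR_FLAG(&hadc1, ADC_FLAG_CCRDY);            // clear before changing the channels
// ... HAL_ADC_ConfigChannel() calls as above ...
while (__HAL_ADC_GET_FLAG(&hadc1, ADC_FLAG_CCRDY) == 0)
{
    // wait for the new channel configuration to be applied
}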

As a workaround, I found that completely deinitializing the ADC, then reinitializing and recalibrating it before each new channel set selection, avoids the offset:

// stop current acquisition
HAL_ADC_Stop_DMA( &hadc1 );
HAL_TIM_Base_Stop( &htim2 );

// reinit ADC
HAL_ADC_DeInit( &hadc1 );
HAL_ADC_Init(&hadc1);
// Run the ADC calibration
HAL_ADCEx_Calibration_Start( &hadc1 );

// set ADC channels sequence
[…]

Re-running only the ADC calibration, or only the ADC initialization without a prior deinitialization, before the new ADC channel setting does not solve this issue.

Why are the ADC conversion results altered after the first run-time channel change when the ADC sequencer is set to not fully configurable mode?

How should I change the ADC channel configuration at run time in that condition in order to get the same good results as in fully configurable mode?
