STM32L4 Oversampler producing odd results

Jim Kaz
Associate II
Posted on November 09, 2017 at 23:59

STM32L496

CubeMX v4.21 (edit: updated to v4.22.1, same issue)

HAL Framework: 1.8

I'm trying to reduce some of the noise in my system by using hardware oversampling. However, I've noticed that the resulting conversions are slightly lower than without it enabled. My basic software setup is: every 10 ms I read 15 ADC channels and process the data. I have code that keeps track of the minimum and maximum raw tick values I've seen, so I have a general idea of the amount of noise I'm seeing.

Note: I'm seeing this shift (though it may not be exactly the same amount) on every channel; I'm just going to use channel 1 as my sole example to make the data easier to display. If anyone really wants, I can post all of my data.

Oversampling off, 2 cycle sampling time, 1047-1051 ticks
Oversampling 8x, 2 cycle sampling time, 1027-1030 ticks
Oversampling 8x, 92 cycle sampling time, 1038-1039 ticks
Oversampling 8x, 640 cycle sampling time, 1051-1052 ticks

The example channel I'm using happens to be a voltage sensor that I'm feeding 15.9 V into. With oversampling off, those tick values result in a reading of ~15.85 V. With oversampling and a 2 cycle sample time, it reads 15.5 V (which is why I think the oversampled values are wrong).
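For reference, the way I understand 8x oversampling with a 3-bit right shift is that it should just be the average of 8 samples, still on the 12-bit scale. Here's my own little software sketch of what I think the hardware is doing (the function is just mine for illustration, not HAL code):

#include <stdint.h>

/* My mental model of Ratio = 8, RightBitShift = 3: sum eight raw 12-bit
 * conversions (up to 15 bits) and shift the sum right by 3, so the result
 * stays on the 12-bit scale and should land on top of the non-oversampled
 * average (~1049 ticks in my case). */
static uint16_t oversample_8x_shift3(const uint16_t raw[8])
{
    uint32_t sum = 0;
    for (int i = 0; i < 8; i++)
    {
        sum += raw[i];          /* 8 x 12-bit samples -> at most 15 bits */
    }
    return (uint16_t)(sum >> 3); /* divide by 8, back to 12-bit range    */
}

If that's what the silicon does, the only way I can see the result dropping ~20 ticks is if the individual back-to-back samples themselves are coming out low.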

Below is my setup for the ADC

hadc1.Instance = ADC1;
hadc1.Init.ClockPrescaler = ADC_CLOCK_ASYNC_DIV1;
hadc1.Init.Resolution = ADC_RESOLUTION_12B;
hadc1.Init.DataAlign = ADC_DATAALIGN_RIGHT;
hadc1.Init.ScanConvMode = ADC_SCAN_ENABLE;
hadc1.Init.EOCSelection = ADC_EOC_SEQ_CONV;
hadc1.Init.LowPowerAutoWait = ENABLE;
hadc1.Init.ContinuousConvMode = DISABLE;
hadc1.Init.NbrOfConversion = 15;
hadc1.Init.DiscontinuousConvMode = DISABLE;
hadc1.Init.NbrOfDiscConversion = 1;
hadc1.Init.ExternalTrigConv = ADC_SOFTWARE_START;
hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE;
hadc1.Init.DMAContinuousRequests = ENABLE;
hadc1.Init.Overrun = ADC_OVR_DATA_PRESERVED;
hadc1.Init.OversamplingMode = ENABLE;
hadc1.Init.Oversampling.OversamplingStopReset = ADC_REGOVERSAMPLING_RESUMED_MODE;
hadc1.Init.Oversampling.Ratio = ADC_OVERSAMPLING_RATIO_8;
hadc1.Init.Oversampling.RightBitShift = ADC_RIGHTBITSHIFT_3;
hadc1.Init.Oversampling.TriggeredMode = ADC_TRIGGEREDMODE_SINGLE_TRIGGER;
if (HAL_ADC_Init(&hadc1) != HAL_OK)
{
  _Error_Handler(__FILE__, __LINE__);
}

/** Configure the ADC multi-mode */
multimode.Mode = ADC_MODE_INDEPENDENT;
if (HAL_ADCEx_MultiModeConfigChannel(&hadc1, &multimode) != HAL_OK)
{
  _Error_Handler(__FILE__, __LINE__);
}

/** Configure Regular Channel */
sConfig.Channel = ADC_CHANNEL_1;
sConfig.Rank = 1;
sConfig.SamplingTime = ADC_SAMPLETIME_2CYCLES_5;
sConfig.SingleDiff = ADC_SINGLE_ENDED;
sConfig.OffsetNumber = ADC_OFFSET_NONE;
sConfig.Offset = 0;
if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
{
  _Error_Handler(__FILE__, __LINE__);
}

The rest of the channels all use the same setup. I didn't include them to keep the post size down, since they should be irrelevant.
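For the 92 and 640 cycle numbers above, the only thing I changed was the per-channel sampling time, e.g. for the 640-cycle runs:

/* Same channel config as above, only the sampling time changed
 * (this is the setup that produced the 1051-1052 tick readings). */
sConfig.SamplingTime = ADC_SAMPLETIME_640CYCLES_5;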

I start the process using this command:

HAL_ADC_Start_DMA(&hadc1, (uint32_t *)pBuf->rawData(), ConfigAdc::ADC_CHAN_NUM_CHANNELS);

Anyone have any ideas what I'm seeing and why?  Being able to use hardware oversampling would be just like, the best.

Thanks!

3 REPLIES
Posted on November 10, 2017 at 00:32

Your signal source probably has too high an impedance. Read the Analog-to-Digital converter characteristics chapter in the DS; there's an R_AIN-related subchapter with tables and formulas. Also observe the input signal directly on the input pin, using a fast enough oscilloscope (bandwidth significantly higher than the ADC clock) and a good high-impedance probe, while running the conversions - that sight might be instructional.

I am by no means an 'L4 ADC expert, but a one-off conversion after a longer delay might behave the same as one with a significantly longer sampling time.

JW

Posted on November 10, 2017 at 00:55

I believe the electrical engineer on the project specifically went and checked all the analog/impedance stuff related to the ADC channels when we were first trying to reduce ADC noise, while I worked on a software solution to the problem. I believe everything checked out... but I wasn't doing oversampling at the time. I'm not an ADC expert (or really any kind of electrical engineer, I'm just a lowly firmware guy :) ), buuuuuuut if the impedance was real close to the limit but technically valid for a single sample, I could see how repeated back-to-back readings could drain the internal sampling capacitance, and if the impedance was slightly too high it wouldn't refill completely between samples? That makes sense in my head; dunno if the physics checks out though.
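Trying to put rough numbers on that idea (all the constants below are placeholders I'd still need to pull from the DS tables JW mentioned, so treat this as back-of-envelope only):

#include <math.h>
#include <stdio.h>

/* Rough settling check: the sampling cap has to charge through
 * (R_source + R_ADC) to within ~1/2 LSB during the sampling window.
 * Every number here is a placeholder -- the real R_ADC / C_ADC values
 * and the exact formula are in the DS "ADC characteristics" section. */
int main(void)
{
    const double f_adc    = 80e6;    /* ADC clock (assumed)              */
    const double c_adc    = 5e-12;   /* sampling capacitor, placeholder  */
    const double r_adc    = 1e3;     /* internal switch R, placeholder   */
    const double r_source = 10e3;    /* my external source impedance?    */
    const int    n_bits   = 12;

    /* time constants needed to settle to within ~1/2 LSB of N bits */
    const double k   = log(pow(2.0, n_bits + 1));
    const double t_s = k * (r_source + r_adc) * c_adc;   /* seconds    */
    const double cyc = t_s * f_adc;                      /* ADC clocks */

    printf("need >= %.1f ADC clock cycles of sampling time\n", cyc);
    /* With these made-up numbers: ~9 * 11k * 5pF ~= 0.5 us ~= 40 cycles,
     * i.e. way more than the 2.5 cycles I'm currently using. */
    return 0;
}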

Looking over some of my data from running single bank, dual bank, continuous mode (taking 16 samples and averaging in software), and non-continuous mode (just a single sample)... and then doing all of the above with various levels of oversampling... I think this theory may hold water.

I'll have to check this business out tomorrow (well, Monday, since tomorrow is a holiday).

Posted on November 10, 2017 at 09:59

Try to get that oscilloscope measurement of the input signal and observe the effect of sampling on it. Firmware guys of whichever rank cannot work properly without a thorough understanding of the phenomena being processed.

JW