ADC Read: Maximizing Precision and Accuracy

taylors
Associate II

Hello community,

I'm looking for some feedback on how to optimize the ADC peripheral when using a handful of single-ended channels.

I have an STM32H735G-DK and have 4 NTC3950 type thermistors connected to 4 of the Analog pins from the Arduino headers, attached in the R2 position of a voltage divider.

Name    Port/Pin   ADC Number   ADC Channel   R1 Resistance (Ω)
temp0   PC0        ADC3         Channel 10    487100
temp1   PH2        ADC3         Channel 13    463500
temp2   PC2_C      ADC3         Channel 0     468800
temp3   PC3_C      ADC3         Channel 1     468200

The goal is to gather the 4 analog values and convert them to temperatures. The readings need to be both accurate (matching the true temperature) and consistent (the same temperature reported from sample to sample); I would very much prefer not to rely on much of a software filter to smooth the readings.

Using CubeMX, the peripheral is being initialized as follows:

static void MX_ADC3_Init(void)
{
  /* USER CODE BEGIN ADC3_Init 0 */
  /* USER CODE END ADC3_Init 0 */

  ADC_ChannelConfTypeDef sConfig = {0};

  /* USER CODE BEGIN ADC3_Init 1 */
  /* USER CODE END ADC3_Init 1 */

  /** Common config
  */
  hadc3.Instance = ADC3;
  hadc3.Init.ClockPrescaler = ADC_CLOCK_ASYNC_DIV2;
  hadc3.Init.Resolution = ADC_RESOLUTION_12B;
  hadc3.Init.DataAlign = ADC3_DATAALIGN_RIGHT;
  hadc3.Init.ScanConvMode = ADC_SCAN_DISABLE;
  hadc3.Init.EOCSelection = ADC_EOC_SINGLE_CONV;
  hadc3.Init.LowPowerAutoWait = DISABLE;
  hadc3.Init.ContinuousConvMode = DISABLE;
  hadc3.Init.NbrOfConversion = 1;
  hadc3.Init.DiscontinuousConvMode = DISABLE;
  hadc3.Init.ExternalTrigConv = ADC_SOFTWARE_START;
  hadc3.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE;
  hadc3.Init.DMAContinuousRequests = DISABLE;
  hadc3.Init.SamplingMode = ADC_SAMPLING_MODE_NORMAL;
  hadc3.Init.ConversionDataManagement = ADC_CONVERSIONDATA_DR;
  hadc3.Init.Overrun = ADC_OVR_DATA_PRESERVED;
  hadc3.Init.LeftBitShift = ADC_LEFTBITSHIFT_NONE;
  hadc3.Init.OversamplingMode = DISABLE;
  if (HAL_ADC_Init(&hadc3) != HAL_OK)
  {
    Error_Handler();
  }

  /** Configure Regular Channel
  */
  sConfig.Channel = ADC_CHANNEL_10;
  sConfig.Rank = ADC_REGULAR_RANK_1;
  sConfig.SamplingTime = ADC3_SAMPLETIME_24CYCLES_5;
  sConfig.SingleDiff = ADC_SINGLE_ENDED;
  sConfig.OffsetNumber = ADC_OFFSET_NONE;
  sConfig.Offset = 0;
  sConfig.OffsetSign = ADC3_OFFSET_SIGN_NEGATIVE;
  if (HAL_ADC_ConfigChannel(&hadc3, &sConfig) != HAL_OK)
  {
    Error_Handler();
  }
  /* USER CODE BEGIN ADC3_Init 2 */
  HAL_ADCEx_Calibration_Start(&hadc3, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED);
  /* USER CODE END ADC3_Init 2 */
}

The task used to sample and convert the raw value looks similar to this:

static uint32_t temp_getAdcRead(uint32_t adcChannel) {
	// Reselect rank 1 for the requested channel before each conversion
	ADC_ChannelConfTypeDef sConfig = {
	  .Channel = adcChannel,
	  .Rank = ADC_REGULAR_RANK_1,
	  .SamplingTime = ADC3_SAMPLETIME_24CYCLES_5,
	  .SingleDiff = ADC_SINGLE_ENDED,
	  .OffsetNumber = ADC_OFFSET_NONE,
	  .Offset = 0,
	  .OffsetSign = ADC3_OFFSET_SIGN_NEGATIVE,
	};

	if (HAL_ADC_ConfigChannel(&hadc3, &sConfig) != HAL_OK)
	{
		Error_Handler();
	}

	HAL_ADC_Start(&hadc3);

	// Block until the conversion completes (500 ms timeout)
	if (HAL_ADC_PollForConversion(&hadc3, 500) != HAL_OK)
	{
		Error_Handler();
	}

	return HAL_ADC_GetValue(&hadc3);
}

// Solve the divider for the NTC in the R2 position:
//   raw = 4095 * R_ntc / (R1 + R_ntc)  =>  R_ntc = R1 * raw / (4095 - raw)
// Note: raw == 0xFFF would divide by zero; in practice the NTC keeps the
// divider node well below full scale.
static uint32_t temp_getResistanceFromRaw(uint32_t raw, uint32_t r1) {
	return (r1 * raw) / (0xFFF - raw);
}

static float temp_getTemp(uint32_t adcChannel, uint32_t r1) {
	// Get raw ADC reading
	uint32_t raw = temp_getAdcRead(adcChannel);

	// Convert the raw code to NTC resistance via the divider equation
	uint32_t resistance = temp_getResistanceFromRaw(raw, r1);

	return temp_getTempFromResistance(resistance);
}

 The "temp get" task is will use these calls once every second to sample and determine temp approximately every 1 second. The function for converting the calculated resistance to temperature is not shown, it essentially searches through the generic NTC3950 table and finds the sampled temperature for a given resistance. Very simple.

I'm looking for some hints and guidance on how to make HAL_ADC_Start() return a more consistent and accurate value. A few things I've tried:

  • ADC clock speed
  • channel sample time
  • Changing number of conversion to 4 to sample all thermistors in one block
  • Oversampling (which doesn't seem to return the averaged value like I would imagine it should)
  • Using V_Ref_Int to help introduce an offset for the readings 

Short of changing the mode to a continuous read and having the peripheral feed a DMA buffer that I average over a massive pool of reads, I don't know how else to make this more accurate. I would appreciate any suggestions and feedback on the design.
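For what it's worth, here is a rough sketch of that continuous-DMA fallback, assuming hadc3 is reconfigured in CubeMX for continuous conversions with circular DMA (ContinuousConvMode = ENABLE, ConversionDataManagement = ADC_CONVERSIONDATA_DMA_CIRCULAR) and a DMA stream bound to the handle; the buffer size and function names are mine, for illustration:

// The ADC free-runs into adcPool in the background; the task just averages
// a snapshot of the pool once per second. With one channel in the sequence
// every slot holds that channel; with 4 ranks the data would interleave and
// the averaging loop would stride by 4.
#define ADC_POOL_LEN 256
static uint16_t adcPool[ADC_POOL_LEN];

static void temp_startBackgroundSampling(void) {
	if (HAL_ADC_Start_DMA(&hadc3, (uint32_t *)adcPool, ADC_POOL_LEN) != HAL_OK)
	{
		Error_Handler();
	}
}

static uint32_t temp_getAveragedRead(void) {
	// Averaging 256 samples cuts RMS noise by sqrt(256) = 16x. A sample may
	// be overwritten mid-loop, which is harmless for slow thermal signals.
	uint32_t sum = 0;
	for (uint32_t i = 0; i < ADC_POOL_LEN; i++)
		sum += adcPool[i];
	return sum / ADC_POOL_LEN;
}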

Thanks,

 - Taylor

AScha.3
Chief

Hi,

You want the best resolution; the H735 has 16-bit ADCs, but you are using the 12-bit ADC3 --- why?

+

Oversampling is needed here to get some averaging (noise reduction); set it in Cube. (But first test without it, to confirm you get a good resolution, as in the datasheet.)

+

Put a ceramic cap with short connections directly on every ADC input, to GND. 100 nF is a good starting point.

+

How about the precision of the reference current through the NTCs?

-> Think: if your ADC gets close to 16 bits, that is about 1 part in 65000, so the current (or the supplied voltage at the divider resistor, which sets that current) has to be at least this precise; otherwise you are just measuring the noise of the current...

Simplest and good: use the ADC reference AVDD as the supply voltage for the NTCs, so the divider is always ratiometric to the ADC reference.
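In other words: with the divider powered from VDDA and VDDA also serving as the ADC reference, the code is raw = full_scale * R_ntc / (R1 + R_ntc) (full_scale = 4095 at 12 bits), so the supply voltage cancels out of the conversion entirely and drift in VDDA never shows up as a temperature error.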

+

Important: on first run, calibrate the ADC; only then is it really at its spec.

raptorhal2
Lead

Read the Reference Manual on ADC characteristics to understand its inherent accuracy, and read AN2834, "How to optimize the ADC accuracy".

An op-amp between the ADC pin and the signal source will likely be needed to accommodate the large source impedance. Without an op-amp, the ADC3 sampling time should be set to the maximum.
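If you skip the op-amp, that is a one-line change to the channel config. The define below is assumed from the H7 HAL's ADC3 naming (the post's code already uses ADC3_SAMPLETIME_24CYCLES_5); check stm32h7xx_hal_adc.h for the exact maximum on your HAL version:

	// Longest sampling time gives the high-impedance divider time to charge
	// the sample-and-hold capacitor (define name assumed; verify in the HAL)
	sConfig.SamplingTime = ADC3_SAMPLETIME_640CYCLES_5;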

Another trick, which also compensates for NTC installation effects: measure at a known mid-range temperature and apply a correction in software. In a production and support environment that can be cumbersome.

 

taylors
Associate II

Thanks for your reply and suggestions.

 

You want the best resolution; the H735 has 16-bit ADCs, but you are using the 12-bit ADC3 --- why?


I'm using the dev kit initially, which does not have ADC1/2 on all of the Arduino header expansion pins. Since I am using 4 thermistors, I chose ADC3 so I could have them all on the same ADC. I will retest using a few of them on ADC1 and see how that improves the sampling. The pins available are as follows:

[Screenshot: available ADC pins on the Arduino headers]

 

Oversampling is needed here to get some averaging (noise reduction); set it in Cube. (But first test without it, to confirm you get a good resolution, as in the datasheet.)

I will implement oversampling. Thanks for the suggestion.

 

Put a ceramic cap with short connections directly on every ADC input, to GND. 100 nF is a good starting point.

 

I will look at adding this to our design. Makes sense to me.

 

How about the precision of the reference current through the NTCs?

-> Think: if your ADC gets close to 16 bits, that is about 1 part in 65000, so the current (or the supplied voltage at the divider resistor, which sets that current) has to be at least this precise; otherwise you are just measuring the noise of the current...


I don't understand what you mean here. I have a 1K resistor inline from the voltage divider output to the MCU pin to limit the current. I'm very much not concerned about differences of a few LSBs (using a 16-bit example, a reading of 1000 versus 1001). The reading is being converted to temperature, and after my conversion I'm not even using any decimal places. If my temperature output were consistently the same integer, I would be content.

 

Important: on first run, calibrate the ADC; only then is it really at its spec.


Do you mean that the calibration should only be run once (for example, during manufacturing) and not every time the system boots? That would contradict what I read in the reference manual about calibration.

Thank you in advance.

(Accepted solution)

Most on-chip ADCs don't achieve near their word size; the last 2 bits or so are likely garbage.

  • Good board design, ground planes
  • External reference (even a TL431 is better than on-chip)
  • Buffer the signal with a good op-amp (TLV2333 etc.)
  • Take 100 readings and divide by 100, which gives you ~10x less noise, i.e. ~3 more clean bits
  • Running the ADC slower helps a bit
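A minimal sketch of that 100-reading average, reusing temp_getAdcRead() from the original post (the count and the integer math are just one way to do it):

#define TEMP_AVG_COUNT 100

static uint32_t temp_getAveragedAdcRead(uint32_t adcChannel) {
	// sqrt(100) = 10, so RMS noise drops ~10x: roughly 3 extra clean bits.
	// 100 x 12-bit samples peaks at 409,500, far below uint32_t overflow.
	uint32_t sum = 0;
	for (uint32_t i = 0; i < TEMP_AVG_COUNT; i++)
		sum += temp_getAdcRead(adcChannel);
	return sum / TEMP_AVG_COUNT;
}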

Awesome. This is what I was looking for. Thanks for your suggestions. 

Can you clarify two things:

 

Take 100 readings and divide by 100, which gives you ~10x less noise, i.e. ~3 more clean bits


Wouldn't this be the same as running the internal hardware oversampler? Maybe set it to a ratio of 64 and a right bit shift of 6? Or even a ratio of 8 and a bit shift of 3, to trim off the 3 unreliable LSBs you mention?
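For concreteness, this is the kind of init I mean; the field names come from the H7 HAL's ADC_OversamplingTypeDef, but the Ratio encoding for ADC3 on the H73x may differ from ADC1/2, so treat it as a sketch to compare against what Cube generates:

	// Accumulate 64 conversions and shift right by 6 so the data register
	// returns the true mean (sum / 64). Verify the Ratio encoding for ADC3.
	hadc3.Init.OversamplingMode = ENABLE;
	hadc3.Init.Oversampling.Ratio = 64;
	hadc3.Init.Oversampling.RightBitShift = ADC_RIGHTBITSHIFT_6;
	hadc3.Init.Oversampling.TriggeredMode = ADC_TRIGGEREDMODE_SINGLE_TRIGGER;
	hadc3.Init.Oversampling.OversamplingStopReset = ADC_REGOVERSAMPLING_CONTINUED_MODE;

Without the matching right shift the peripheral returns the raw accumulated sum rather than the mean, which may be why oversampling "doesn't seem to return the averaged value" as noted above.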

 

Running the ADC slower helps a bit


Do you mean the system clock running into the ADC peripheral? Or the sample time? Or both?

Thank you!

- Taylor