
How to use ADC3 temperature sensor for STM32H723ZG if VDD=VDDA=VREFP=1.8V?

MStol.2
Associate II

Hi,

I'm using a Nucleo-H723ZG board with 1.8V VDD.

According to the schematic, VDDA and VREFP are the same as VDD

(I verified VREFP to be 1.8 V with a multimeter).

If I use the ADC3 analog temperature sensor, I get strange values.

I get values like -33°C while the DTS reports 38°C.

float temperature = (float)__HAL_ADC_CALC_TEMPERATURE(1800, ADCValue, ADC_RESOLUTION_16B);

If I modify the code above for 3.3 V (__VREFANALOG_VOLTAGE__ = 3300) and change the board VDD to 3.3 V, then it works fine.

(reporting similar values to DTS)

I also tried the classical temperature calculation with the tcal1 and tcal2 values: it works for 3.3 V, but no luck with 1.8 V.

Of course the formula cannot work directly with 1.8 V, since the tcal values are obtained at VREFP = 3.3 V. If I scale the ADC value obtained at 1.8 V to its 3.3 V equivalent, I get similarly negative results as with the __HAL* macro.
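For reference, the kind of scaling I mean looks roughly like this (a sketch only; the TEMPSENSOR_CAL* defines come from the STM32H7 LL ADC header, and rescaling the 1.8 V reading to the 3.3 V calibration conditions is my own assumption):

#include "stm32h7xx_ll_adc.h"

/* Sketch of the classical two-point calculation, with the reading taken at
   VREF+ = 1.8 V rescaled to the 3.3 V conditions the tcal values were acquired at.
   Note: the raw reading and the tcal values must use the same ADC resolution. */
static float TempFromTsCal(uint16_t raw, uint32_t vrefp_mV)
{
    uint16_t ts_cal1 = *TEMPSENSOR_CAL1_ADDR;   /* raw value at 30 degC, VREF+ = 3.3 V  */
    uint16_t ts_cal2 = *TEMPSENSOR_CAL2_ADDR;   /* raw value at 110 degC, VREF+ = 3.3 V */

    /* Scale the reading taken at vrefp_mV to the 3.3 V calibration reference */
    float raw_scaled = (float)raw * (float)vrefp_mV / (float)TEMPSENSOR_CAL_VREFANALOG;

    return (float)(TEMPSENSOR_CAL2_TEMP - TEMPSENSOR_CAL1_TEMP)
           * (raw_scaled - (float)ts_cal1)
           / (float)(ts_cal2 - ts_cal1)
           + (float)TEMPSENSOR_CAL1_TEMP;
}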

Has anyone an idea how to use the ADC temperature sensor with 1.8V?

Thanks!

FBL
ST Employee

Hello @Community member​,

As mentioned in the datasheet, the conversion range is between 1.7 V and 3.6 V. However, the computation uses the temperature sensor's typical values. You can try __LL_ADC_CALC_TEMPERATURE() instead; the temperature calculation should be more accurate with it.
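For example, something along these lines (a sketch only, assuming a 12-bit conversion, VREF+ = 1800 mV, and an ADC handle named hadc3; adjust the resolution parameter to your configuration):

uint16_t raw = HAL_ADC_GetValue(&hadc3);
/* Arguments: VREF+ in mV, the raw reading, and the resolution it was acquired with */
float temperature = (float)__LL_ADC_CALC_TEMPERATURE(1800, raw, ADC_RESOLUTION_12B);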

Hope this helps!


MStol.2
Associate II

Hi @F.Belaid,

thanks for your reply.

I now got a chance to try it... sorry for the delay.

float temperature = (float)__LL_ADC_CALC_TEMPERATURE(1800, ADCValue, ADC_RESOLUTION_12B);

It seems to do exactly the same as the __HAL* macro.

The raw ADC decimal value is e.g. 1042; the macro converts it to 2640°C.

(DTS is reading 36°C)

> The raw ADC decimal value is e.g. 1042

If it's a 12-bit ADC and VREF+ = 1.8 V, this corresponds to 1042/4096*1.8 V ≈ 0.458 V. That's probably incorrect, as the DS gives a typical readout of 0.62 V at 30 °C and a slope of 2 mV/°C, so 0.458 V would correspond to 30 + (458-620)/2 = -51 °C.
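In code, that sanity check is roughly (using the typical DS figures quoted above, not the device's individual calibration values):

/* Rough check from typical DS figures (0.62 V at 30 degC, 2 mV/degC slope) */
float v_sense_mV = (float)raw / 4096.0f * 1800.0f;        /* 12-bit reading, VREF+ = 1.8 V */
float temp_degC  = 30.0f + (v_sense_mV - 620.0f) / 2.0f;  /* about -51 degC for raw = 1042 */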

This indicates that you incorrectly read out the temperature sensor's voltage.

Do you calibrate the ADC before measurement? Do you observe the temperature sensor startup time and the required sampling time as given in DS?

Here are some other ideas why the temperature readout may be incorrect.

JW

Hi @Community member​ 

thank you for your kind help and explanation.

In general I tried to keep it very simple by using polled mode.

Here is the MX_ADC3_Init:

  hadc3.Instance = ADC3;
  hadc3.Init.ClockPrescaler = ADC_CLOCK_ASYNC_DIV1;
  hadc3.Init.Resolution = ADC_RESOLUTION_12B;
  hadc3.Init.DataAlign = ADC3_DATAALIGN_RIGHT;
  hadc3.Init.ScanConvMode = ADC_SCAN_DISABLE;
  hadc3.Init.EOCSelection = ADC_EOC_SINGLE_CONV;
  hadc3.Init.LowPowerAutoWait = DISABLE;
  hadc3.Init.ContinuousConvMode = DISABLE;
  hadc3.Init.NbrOfConversion = 1;
  hadc3.Init.DiscontinuousConvMode = DISABLE;
  hadc3.Init.ExternalTrigConv = ADC_SOFTWARE_START;
  hadc3.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE;
  hadc3.Init.DMAContinuousRequests = DISABLE;
  hadc3.Init.SamplingMode = ADC_SAMPLING_MODE_NORMAL;
  hadc3.Init.ConversionDataManagement = ADC_CONVERSIONDATA_DR;
  hadc3.Init.Overrun = ADC_OVR_DATA_OVERWRITTEN;
  hadc3.Init.LeftBitShift = ADC_LEFTBITSHIFT_NONE;
  hadc3.Init.OversamplingMode = DISABLE;
  if (HAL_ADC_Init(&hadc3) != HAL_OK)
  {
    Error_Handler();
  }
 
  /** Configure Regular Channel
  */
  sConfig.Channel = ADC_CHANNEL_TEMPSENSOR;
  sConfig.Rank = ADC_REGULAR_RANK_1;
  sConfig.SamplingTime = ADC3_SAMPLETIME_2CYCLES_5;
  sConfig.SingleDiff = ADC_SINGLE_ENDED;
  sConfig.OffsetNumber = ADC_OFFSET_NONE;
  sConfig.Offset = 0;
  sConfig.OffsetSign = ADC3_OFFSET_SIGN_NEGATIVE;
  if (HAL_ADC_ConfigChannel(&hadc3, &sConfig) != HAL_OK)
  {
    Error_Handler();
  }

Here is the temperature measurement function itself:

I do a calibration first, start the ADC, wait for the conversion result, read the value, and turn the ADC off.

float STM32H7_GetTemperature(ADC_HandleTypeDef *hadc){
	// ADC temperature sensor
	// TODO check code below!
	uint16_t ADCValue = 0;
 
	if(HAL_ADCEx_Calibration_Start(hadc, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED) != HAL_OK){
		return HAL_ERROR;
	}
 
	if (HAL_ADC_Start(hadc) != HAL_OK){
		Error_Handler();
		return HAL_ERROR;
	}
 
    if (HAL_ADC_PollForConversion(hadc, 100000) != HAL_OK){
        Error_Handler();
        return -255;
    }else{
        ADCValue = HAL_ADC_GetValue(hadc);
    }
 
    float temperature =(float)(__LL_ADC_CALC_TEMPERATURE (1800,ADCValue,ADC_RESOLUTION_12B));
    HAL_ADC_Stop(hadc);
    return temperature; 
}

The values I get are still off:

The raw value is e.g. 1050.

Just as an info: the tcal values are 1026 for 110 °C and 771 for 30 °C (at 3.3 V).

>> Do you observe the temperature sensor startup time

I don't see anything in what you've posted to do so, although I don't use Cube/HAL and don't understand where you switch on the temperature sensor and how you wait for the required startup time.

>> and the required sampling time as given in DS?

> sConfig.SamplingTime = ADC3_SAMPLETIME_2CYCLES_5;

This is unlikely to comply with the required sampling time.
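As a rough illustration: with an ADC3 kernel clock of, say, 25 MHz (an assumed example value, check your own clock configuration), 2.5 sampling cycles last only about 0.1 µs, whereas the DS asks for a temperature sensor sampling time (ts_temp) on the order of microseconds.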

JW

MStol.2
Associate II

The temperature sensor channel configuration is generated by CubeMX / STM32CubeIDE.

ADC3_SAMPLETIME_2CYCLES_5 is set by the CubeMX generator; there is no GUI field to change it.

One would expect that a value generated by the ST software is correct :/

But indeed, if I change this SamplingTime value to e.g. ADC3_SAMPLETIME_24CYCLES_5, the raw measurement becomes reasonable, at ~1400.

I probably need my own ADC channel init function that sets SamplingTime to a correct value.
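Something along these lines after the generated MX_ADC3_Init() should do it (a sketch only; 24.5 cycles is the value that worked above, and an even longer sampling time may be safer depending on the ADC3 clock and the ts_temp requirement in the DS):

/* Re-configure the temperature sensor channel with a longer sampling time
   after the CubeMX-generated MX_ADC3_Init(). */
ADC_ChannelConfTypeDef sConfig = {0};
sConfig.Channel      = ADC_CHANNEL_TEMPSENSOR;
sConfig.Rank         = ADC_REGULAR_RANK_1;
sConfig.SamplingTime = ADC3_SAMPLETIME_24CYCLES_5;
sConfig.SingleDiff   = ADC_SINGLE_ENDED;
sConfig.OffsetNumber = ADC_OFFSET_NONE;
sConfig.Offset       = 0;
if (HAL_ADC_ConfigChannel(&hadc3, &sConfig) != HAL_OK)
{
  Error_Handler();
}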

@Community member, thank you again for your kind help!