2024-09-13 05:44 AM
Hello,
I am using an STM32H743ZIT6 on a custom PCB, with the MCU supply and ADC reference voltage both at 3.3V.
Our intention is to use the built-in ADC to read some DC signals. One signal is our 24V power rail. In our circuit, the nominal 24V is scaled down to 1.643V through a voltage divider, fed into a buffer circuit, and then into our ADC pin (PA3, ADC_CHANNEL_15). At 16 bits we therefore expect approximately (1.643 / 3.3) * 65535 ≈ 32628 counts.
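For clarity, here is the count-to-volts conversion we are working from (the helper and macro names below are just for illustration, not from our codebase):

#include <stdint.h>

#define VREF_VOLTS      3.3f
#define ADC_FULL_SCALE  65535.0f          /* 16-bit resolution */
#define DIVIDER_RATIO   (1.643f / 24.0f)  /* divider + unity-gain buffer */

/* Convert a raw 16-bit ADC count back to the 24V rail voltage. */
static float adc_count_to_rail_volts(uint32_t count)
{
    float pin_volts = ((float)count / ADC_FULL_SCALE) * VREF_VOLTS;
    return pin_volts / DIVIDER_RATIO;     /* e.g. 32628 counts -> ~24.0 V */
}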
Below is the code we are using to configure the ADC and channel and to get a baseline reading of the 24V rail (we do not intend to use DMA):
static void MX_ADC1_Init(void)
{
  ADC_ChannelConfTypeDef sConfig = {0};

  // ADC1 basic setup: 16-bit resolution, single software-triggered conversion
  hadc1.Instance = ADC1;
  hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV4;
  hadc1.Init.Resolution = ADC_RESOLUTION_16B;
  hadc1.Init.ScanConvMode = DISABLE;
  hadc1.Init.ContinuousConvMode = DISABLE;
  hadc1.Init.DiscontinuousConvMode = DISABLE;
  hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE;
  hadc1.Init.ExternalTrigConv = ADC_SOFTWARE_START;
  hadc1.Init.NbrOfConversion = 1;
  hadc1.Init.Overrun = ADC_OVR_DATA_PRESERVED;
  hadc1.Init.OversamplingMode = DISABLE;
  if (HAL_ADC_Init(&hadc1) != HAL_OK)
  {
    Error_Handler();
  }

  // Calibration currently disabled -- enabling it lowers the reading further (see below)
  // if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED) != HAL_OK) {
  //   Error_Handler();
  // }

  // 24V rail channel (PA3, ADC_CHANNEL_15), single-ended, no offset correction
  uint32_t adc_24_value = 0;
  sConfig.Channel = ADC_CHANNEL_15;
  sConfig.Rank = ADC_REGULAR_RANK_1;
  // sConfig.SamplingTime = ADC_SAMPLETIME_64CYCLES_5;
  sConfig.SingleDiff = ADC_SINGLE_ENDED;
  sConfig.OffsetNumber = ADC_OFFSET_NONE;
  sConfig.Offset = 0;
  if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
  {
    Error_Handler();
  }

  // One blocking conversion to get a baseline reading of the 24V rail
  HAL_ADC_Start(&hadc1);
  if (HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY) == HAL_OK)
  {
    adc_24_value = HAL_ADC_GetValue(&hadc1);
  }
  HAL_ADC_Stop(&hadc1);
}
With the above code, `adc_24_value` reads 31550 to 31560, around 1000 counts too low, or about 3% off. When I change the rail voltage to 23V, 23.5V, and 24V, the count tracks linearly, which is good. However, the negative offset remains.
I've tried a few things to account for this offset. When I configure the channel sampling time to anything more than 1.5 cycles, the count drops even further, by up to several thousand, which is bizarre to us. I also find that when I calibrate using `HAL_ADCEx_Calibration_Start`, the count is usually about 1000 lower, so for now I have commented that part out; it doesn't feel right that we get closer to the expected result by not calibrating. We also originally tried converting four different channels in one scan sequence, and all the readings came out low for some reason. Doing one conversion at a time brought the readings much closer to what we expected.
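For reference, this is how we invoked calibration when we tried it (the commented-out block above, shown uncommented); we ran it right after HAL_ADC_Init() and before the first conversion, while the ADC is still disabled, as the HAL requires:

/* offset calibration: run once after HAL_ADC_Init() and before any conversion */
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED) != HAL_OK)
{
    Error_Handler();
}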
Any help or ideas are greatly appreciated. We've found this behavior to be consistent across two different boards running the same firmware. The noise out of the buffer is very low, in the microvolts. Our 3.3V rail actually measures 3.34V; that alone would lower the expected count by about 1.2%, but it does not explain the full 3%.
Regards,
David
2024-09-13 06:02 AM
Why have you commented out
sConfig.SamplingTime = ADC_SAMPLETIME_64CYCLES_5;
What's the sampling time setting then? With sConfig zero-initialized, it falls back to ADC_SAMPLETIME_1CYCLE_5, which is very short.
Have you measured the actual voltage at the pin, with a scope? Do you see disturbances at the sampling rate there (charge injection from the S&H stage)?
You need quite a strong and fast analog driver for most STM32 ADC inputs at high speed.
Do you have a good (C0G) and big enough capacitor at the pin (depending on the driver and sampling rate, but at least 100 pF I'd say)?
Increase the sampling time, and slow the ADC clock with a higher divider (which also lowers the conversion rate).
Then activate calibration.
... and don't expect wonders, esp. with Vref = VDD, and from an on-chip ADC in general.
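Putting that together, a minimal sketch of the changes inside MX_ADC1_Init() -- the divider and sampling time below are starting points to tune against your driver, not recommended values:

/* slower ADC kernel clock; ADC_CLOCK_ASYNC_DIV8 uses the asynchronous
   kernel clock, so check the ADC clock mux in your RCC setup */
hadc1.Init.ClockPrescaler = ADC_CLOCK_ASYNC_DIV8;
if (HAL_ADC_Init(&hadc1) != HAL_OK)
{
    Error_Handler();
}

/* offset calibration: once, after init and before the first conversion */
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED) != HAL_OK)
{
    Error_Handler();
}

/* longer sampling time so the S&H capacitor can charge through your buffer */
sConfig.SamplingTime = ADC_SAMPLETIME_64CYCLES_5;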