2025-07-10 4:14 AM - edited 2025-07-11 5:29 AM
I am developing a board using an STM32U535RETQ (64-pin QFP package). The part is the SMPS-capable "Q" variant - although we are only using the STM32's internal LDO... NOT the SMPS.
I am using CubeMX 6.14 and STM32CubeIDE 1.18 to generate my .ioc and build the project.
The 14-bit ADC1 is performing as expected when set to convert internal signals - Vdd and Vbatt.
The signal to be converted is a low frequency DC signal - we're measuring current via a precision sense resistor and instrumentation amplifier.
Calibration performed during programme "init" :-
ADC_ChannelConfTypeDef sConfig = {0};

hadc1.Instance = ADC1;

sConfig.Channel      = ADC_CHANNEL_3;            // Ch3 intended
sConfig.Rank         = ADC_REGULAR_RANK_1;
sConfig.SamplingTime = ADC_SAMPLETIME_12CYCLES;
sConfig.SingleDiff   = ADC_SINGLE_ENDED;
sConfig.OffsetNumber = ADC_OFFSET_NONE;
sConfig.Offset       = 0;

if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
{
    Error_Handler();
}

// Calibrate the 14-bit ADC1 - needed to look at the HAL_ADCEx driver source to determine the extra calibration-mode parameter.
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET_LINEARITY, ADC_SINGLE_ENDED) != HAL_OK)
{
    Error_Handler();    // Handle configuration error
}
We are using the ADC in a basic mode - just a single channel to convert, using software-started regular conversions.
Function to read ADC1_Ch3 :-
float Read_ADC1_Ch3(void)
{
    float adc1Val;

    HAL_ADC_Start(&hadc1);
    HAL_ADC_PollForConversion(&hadc1, HAL_MAX_DELAY);
    adc1Val = HAL_ADC_GetValue(&hadc1);    // 14 bit converter

    return adc1Val;
}
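For reference, scaling the raw count to volts is just a multiply by the reference voltage over full scale; a minimal sketch, assuming a 3.0 V VREF+ (substitute whatever your board actually uses):

#define ADC1_FULL_SCALE   16383.0f    /* 14-bit converter: 2^14 - 1 counts */
#define VREF_PLUS_VOLTS   3.0f        /* assumption - use your board's actual VREF+ */

float Read_ADC1_Ch3_Volts(void)
{
    return (Read_ADC1_Ch3() * VREF_PLUS_VOLTS) / ADC1_FULL_SCALE;   /* counts -> volts */
}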
When I convert any of the external channels still available (Ch3 - the one we want, or Ch8 & Ch16 - spare inputs), I'm getting a cyclical swing in the reading with about a 3% variation - i.e. roughly 5-bit performance - and this repeats every few seconds.
I've looked at the input capacitor - nominally 1 nF; values from 0 to 100 nF have been tried without any significant difference.
I've looked at the number of sampling clocks - 3 to 814 cycles - with very little difference.
The board has a single, solid ground plane.
The ADC clock is 4 MHz nominal - I've tried 250 kHz to 16 MHz, again without any significant difference. (The relevant HAL settings are sketched below.)
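For reference, the two settings I've been sweeping live in the CubeMX-generated init; a minimal sketch of the relevant STM32U5 HAL fields (the values shown are just the nominal ones described above, not a definitive configuration):

/* In MX_ADC1_Init() - ADC1 kernel clock prescaler (nominally ~4 MHz here) */
hadc1.Init.ClockPrescaler = ADC_CLOCK_ASYNC_DIV4;   /* swept across the available dividers */
hadc1.Init.Resolution     = ADC_RESOLUTION_14B;     /* 14-bit mode, as used throughout */

/* In the channel config - per-channel sampling time, swept from shortest to longest */
sConfig.SamplingTime = ADC_SAMPLETIME_12CYCLES;     /* nominal; also tried the longest, 814 cycles */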
The signal I'm monitoring, measured on a 6½-digit DMM, is stable. The SAME SIGNAL, converted by the 12-bit ADC4, is performing as expected for a 12-bit converter. To me that almost certainly says the issue is with the STM32 / ADC1 and not my external circuitry.
The board can be DC or battery powered - no difference on ADC1 performance.
Has anyone else encountered this issue or have any suggestions as to what the cause may be?
Thanks for any comments received.
John Haughton.
2025-07-10 8:54 PM
> The 14bit ADC1 is performing as expected when set to convert internal signals - Vdd and Vbatt.
This indicates ADC1 is working okay. It's the same circuitry after the sampling cap for all channels.
Recheck your assumptions. Missing something.
Show a plot of the bad data if you can. Might provide some clues. What frequency is the noise? FFT may also prove insightful.
2025-07-10 11:17 PM
Hello @JLH64 ,
I've edited your post to follow the ST Community rules. Next time, please use the </> button to share your code. Please review this post.
Thank you for your understanding.
2025-07-11 5:14 AM - edited 2025-07-11 5:34 AM
Update - I have confirmed that the issue is still present on other boards, two variants: my development set, with signal conditioning, as well as prototype production boards, where I currently only have a direct input to ADC1_Ch3 (the signal conditioning will be on an I/O board).
Feeding a reference voltage directly into the ADC gives me the same results as above.
I have also cut the project down so that the only peripherals initialised are ADC1 and the GPIO.
All other hardware ( pretty minimal - LCD driver, external comparator ) is powered down.
2025-07-11 5:27 AM
I've checked whether an ADC4 channel, using the same physical pin as an ADC1 channel, is performing within expectations - it is.
I've also cut down the code so that the only peripherals initialised are ADC1, GPIO, and the ICACHE :-
/* Initialize all configured peripherals */
MX_GPIO_Init();
// MX_RTC_Init();
// MX_TIM2_Init();
// MX_LPTIM1_Init();
MX_ICACHE_Init();
// MX_DAC1_Init();
MX_ADC1_Init();
/* USER CODE BEGIN 2 */
All that is left in the main loop is :-
  while (1)
  {
    adcVal = Read_ADC1_Ch3();
    HAL_Delay(5);
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
  }
  /* USER CODE END 3 */
Setting a breakpoint on the HAL_Delay, and inspecting "adcVal", confirms that internal signals are performing to expectations, whilst external signals are unusable.
In operation, the device only needs to read the input signal several times a second. To get a picture of the noise I'll capture the readings into an array and see if I can get them into Excel, or spit the values out of DAC1 (which is performing to expectations). A rough sketch of the capture loop is below.
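A minimal capture sketch, assuming a hypothetical buffer length of 1024 samples and the same 5 ms pacing as the main loop; the buffer can be inspected in the debugger or dumped to CSV for Excel or an FFT:

#include "main.h"                        /* HAL_Delay(); assumes the Read_ADC1_Ch3() prototype is visible */

#define NOISE_SAMPLES  1024u             /* hypothetical capture length */

static float noiseBuf[NOISE_SAMPLES];    /* inspect in the debugger or export to CSV */

void Capture_ADC1_Ch3_Noise(void)
{
    for (uint32_t i = 0; i < NOISE_SAMPLES; i++)
    {
        noiseBuf[i] = Read_ADC1_Ch3();   /* raw 14-bit counts */
        HAL_Delay(5);                    /* same 5 ms pacing as the main loop */
    }
}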