2021-09-06 05:54 PM
Hello.
I'm using the STM32H735G-DK and I ran into a problem with the ADC, so I made a very short CubeMX project to check it.
I started from a CubeMX-configured board with no default peripheral initialisation.
Then I enabled ADC1 IN0; by default CubeMX configures it for 16-bit conversion with software start. I changed the CPU clock to 550 MHz.
In the generated project:
HAL_ADC_Start(&hadc1);
HAL_StatusTypeDef status = HAL_ADC_PollForConversion(&hadc1, 1000);
adc_buff = HAL_ADC_GetValue(&hadc1);
The default pinout uses the Arduino A2 pin (PA0_C). I connected 3.3 V to the ADC input; the result is 65535, which is fine. What is strange is that it doesn't vary by a single LSB due to noise (at full scale the reading is railed, so noise cannot show).
Then I connected 0 V to the ADC input. The result I get is 1000 ±100 LSB instead of 0.
That basically means that on a 16-bit conversion, this error spans about 10 bits.
And most of it is a huge fixed offset, not noise. How do I get rid of it?
The most unfortunate thing is that audio recording is already not working, see
So I'm using the ADC instead, and I run into this terrible effect.
Best regards
2021-09-07 07:01 AM
You should try, before starting conversions:
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, conf_in) != HAL_OK) { Error_Handler(); }
and
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET_LINEARITY, conf_in) != HAL_OK) { Error_Handler(); }
(conf_in is ADC_SINGLE_ENDED or ADC_DIFFERENTIAL_ENDED, matching your channel configuration; call these before HAL_ADC_Start().)
2021-09-22 01:52 PM
I wasn't aware that the H7 had such an embedded feature.
I tried ADC_CALIB_OFFSET alone first, but running both ADC_CALIB_OFFSET and ADC_CALIB_OFFSET_LINEARITY gave a better result at 3.3 V.
Now I get 0-186 LSB for 0 V and 64799-65119 LSB for 3.3 V, which is much better.
Thanks a lot.