STM32 ADC DMA low raw/Voltage readings

KoSt
Associate III

Hello,
I am experiencing the following problem. When I set up an ADC with DMA for 5 channels, I occasionally get lower readings than expected. It also happens on some of our production hardware, while other units perform fine. We have inspected the components and everything seems identical. We have measured the reference voltage and all the expected voltages on the ADC input pins (some are fixed and known), and they all look good too. On the other hand, when I sample only one ADC channel in polling mode, I do not get this problem; the voltages read as expected.
Here is the ADC configuration with DMA (source code attached too):

[Four screenshots of the CubeMX ADC and DMA configuration]
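In code terms, the configuration boils down to roughly this (a sketch only, not the exact generated code; hadc1 and ADCreadings are the names used further down in this thread, and the target is the STM32G0 mentioned later):

#include "stm32g0xx_hal.h"

extern ADC_HandleTypeDef hadc1;   /* CubeMX-generated: 5 regular channels,
                                     scan mode, DMA in circular mode */
static uint16_t ADCreadings[5];   /* one slot per ranked channel */

void adc_start(void)
{
  /* Length is the number of conversions per sequence, i.e. 5;
     the DMA then refills the buffer on every scan. */
  HAL_ADC_Start_DMA(&hadc1, (uint32_t *)ADCreadings, 5);
}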

 


I said that the problem is with all the ADC channels and I used the battery voltage as an example to answer your question with some measurements. 

So, are the *raw* readouts (i.e. the values in the buffer, unmodified by any of your functions) shifted against expectation? Are they *offset* by a constant, or are they *scaled* by a constant, or anything else?

This might bring us to your "calibration" method. I don't really like it; there should be no need for it. If the precision of the regulator generating the voltage on VREF+ is not sufficient, you should calibrate against the internal VREFINT reference. Also, do you calibrate the ADC as outlined in the Calibration (ADCAL) subchapter of the ADC chapter in the Reference Manual?
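For reference, the ADCAL sequence from the RM boils down to something like this at register level (a sketch only; error handling and the ADC voltage regulator start-up delay are omitted):

/* ADCAL sequence per the RM (sketch).
   Precondition: ADC disabled (ADEN = 0) and ADC voltage regulator up. */
ADC1->CFGR1 &= ~ADC_CFGR1_DMAEN;            /* no DMA during calibration */
ADC1->CR    |=  ADC_CR_ADCAL;               /* launch the self-calibration */
while (ADC1->CR & ADC_CR_ADCAL) { }         /* hardware clears ADCAL when done */
uint32_t calfact = ADC1->CALFACT & 0x7FUL;  /* 7-bit calibration factor */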

JW

 

1) The raw readouts come directly from the buffer used by the DMA, and I pass them by value to other functions for the conversions, so nothing alters their contents.

2) I had the impression that the G0 chip does not support the extended functions from the ST library, such as calibration. I tried the following sequence in my initialisation function, and now I do not get any ADC readings at all:

 

 

    HAL_ADC_Stop(&hadc1);
    HAL_ADCEx_Calibration_Start(&hadc1);
    //HAL_Delay(2000);
    while ((HAL_ADC_GetState(&hadc1) & HAL_ADC_STATE_READY) == 0) {}
    HAL_ADC_Start_DMA(&hadc1, (uint32_t*)ADCreadings, sizeof(ADCreadings)/sizeof(ADCreadings[V_BAT]));

 

3) I was not aware of Vrefint. In my case the reading returns a raw value of 409 at 10-bit resolution. The microcontroller is powered at 3V3, hence the voltage is 1.318V (see the conversion sketch after this list). Is this expected?

4) Yes, my calibration is a workaround. Since none of the above was working yet, adjusting the whole ADC against a known voltage such as the battery was the next thing I could think of. Ideally I should not have to do that; the hardware should report readings correctly across the range.
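For clarity, the conversion behind the 1.318V figure in point 3 (a sketch; 1024 counts because of the 10-bit resolution, and VREF+ assumed to be exactly 3.3 V):

#include <stdint.h>

/* Convert a raw 10-bit ADC reading to volts, assuming VREF+ = 3.3 V. */
static float adc10_to_volts(uint16_t raw)
{
    return (float)raw * 3.3f / 1024.0f;
}

/* adc10_to_volts(409) = 1.318 V, the figure quoted above. */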

> I tried the following sequence in my initialisation function and I do not get any ADC readings

I don't use Cube/HAL, but you should be able to use the calibration as outlined in the RM; and you should run it before taking any ADC readings.

> I do not get any ADC readings

You should find out why. You may want to read this thread.

> Vrefint. In my case the reading returns raw 409 for 10-bit resolution. The power for the microcontroller is at 3V3, hence the voltage is 1.318V (is this expected?).

Provided the timing prescribed by the DS (startup, sampling) was observed, that result is too high; it should be nominally 1.212V, see the Embedded internal voltage reference table in the DS. Lack of calibration may explain this, too.
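VREFINT in particular needs a long sampling time (see tS_vrefint in the DS). With Cube/HAL, selecting the longest option might look something like this (a sketch; the G0 HAL uses two shared sampling-time slots, and the Init field must be set before HAL_ADC_Init() runs):

/* Sketch (STM32G0 HAL): set shared sampling-time slot 1 to 160.5 cycles
   and assign the VREFINT channel to that slot. */
hadc1.Init.SamplingTimeCommon1 = ADC_SAMPLETIME_160CYCLES_5;  /* before HAL_ADC_Init() */

ADC_ChannelConfTypeDef sConfig = {0};
sConfig.Channel      = ADC_CHANNEL_VREFINT;
sConfig.Rank         = ADC_REGULAR_RANK_1;
sConfig.SamplingTime = ADC_SAMPLINGTIME_COMMON_1;   /* use the long slot */
HAL_ADC_ConfigChannel(&hadc1, &sConfig);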

There's also a factory-measured value stored in the system memory; read it out and compare (adjusting for a different VREF+ if needed, and for the 10-bit resolution), see the Internal voltage reference chapter in the DS.
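Using that factory value might look like this (a sketch, assuming the VREFINT_CAL address from the DS; the formula is the standard one from the RM):

#include <stdint.h>

/* Sketch: estimate the real VREF+ from the factory VREFINT calibration value
   (VREFINT_CAL: a 12-bit reading taken at VREF+ = 3.0 V) and a fresh 10-bit
   VREFINT reading, scaled up to 12 bits for comparison. */
#define VREFINT_CAL  (*(const uint16_t *)0x1FFF75AAUL)

static float vrefplus_from_vrefint(uint16_t vrefint_raw10)
{
    uint32_t vrefint_raw12 = (uint32_t)vrefint_raw10 << 2;   /* 10 -> 12 bit */
    return 3.0f * (float)VREFINT_CAL / (float)vrefint_raw12;
}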

JW

I have got some good news:
1) Adding delays in the generated HAL calibration function between the ADC enable and disable stages, as per the thread you pointed to, fixes the calibration problem; the ADC now starts and works normally after calibration.

/**
  * @brief  Perform an ADC automatic self-calibration
  *         Calibration prerequisite: ADC must be disabled (execute this
  *         function before HAL_ADC_Start() or after HAL_ADC_Stop() ).
  * @note   Calibration factor can be read after calibration, using function
  *         HAL_ADC_GetValue() (value on 7 bits: from DR[6:0]).
  * @param  hadc       ADC handle
  * @retval HAL status
  */
HAL_StatusTypeDef HAL_ADCEx_Calibration_Start(ADC_HandleTypeDef *hadc)
{
  HAL_StatusTypeDef tmp_hal_status;
  __IO uint32_t wait_loop_index = 0UL;
  uint32_t backup_setting_cfgr1;
  uint32_t calibration_index;
  uint32_t calibration_factor_accumulated = 0;
  uint32_t tickstart;

  /* Check the parameters */
  assert_param(IS_ADC_ALL_INSTANCE(hadc->Instance));

  __HAL_LOCK(hadc);

  /* Calibration prerequisite: ADC must be disabled. */

  /* Disable the ADC (if not already disabled) */
  tmp_hal_status = ADC_Disable(hadc);

  // A minimum delay is required after enabling or disabling the ADC
  HAL_Delay(1);

  /* Check if ADC is effectively disabled */
  if (LL_ADC_IsEnabled(hadc->Instance) == 0UL)
  {
    /* Set ADC state */
    ADC_STATE_CLR_SET(hadc->State,
                      HAL_ADC_STATE_REG_BUSY,
                      HAL_ADC_STATE_BUSY_INTERNAL);

    /* Manage settings impacting calibration                                  */
    /* - Disable ADC mode auto power-off                                      */
    /* - Disable ADC DMA transfer request during calibration                  */
    /* Note: Specificity of this STM32 series: Calibration factor is          */
    /*       available in data register and also transferred by DMA.          */
    /*       To not insert ADC calibration factor among ADC conversion data   */
    /*       in array variable, DMA transfer must be disabled during          */
    /*       calibration.                                                     */
    backup_setting_cfgr1 = READ_BIT(hadc->Instance->CFGR1, ADC_CFGR1_DMAEN | ADC_CFGR1_DMACFG | ADC_CFGR1_AUTOFF);
    CLEAR_BIT(hadc->Instance->CFGR1, ADC_CFGR1_DMAEN | ADC_CFGR1_DMACFG | ADC_CFGR1_AUTOFF);

    /* ADC calibration procedure */
    /* Note: Perform an averaging of 8 calibrations for optimized accuracy */
    for (calibration_index = 0UL; calibration_index < 8UL; calibration_index++)
    {
      /* Start ADC calibration */
      LL_ADC_StartCalibration(hadc->Instance);

      /* Wait for calibration completion */
      while (LL_ADC_IsCalibrationOnGoing(hadc->Instance) != 0UL)
      {
        wait_loop_index++;
        if (wait_loop_index >= ADC_CALIBRATION_TIMEOUT)
        {
          /* Update ADC state machine to error */
          ADC_STATE_CLR_SET(hadc->State,
                            HAL_ADC_STATE_BUSY_INTERNAL,
                            HAL_ADC_STATE_ERROR_INTERNAL);

          __HAL_UNLOCK(hadc);

          return HAL_ERROR;
        }
      }

      calibration_factor_accumulated += LL_ADC_GetCalibrationFactor(hadc->Instance);
    }
    /* Compute average */
    calibration_factor_accumulated /= calibration_index;

    // A minimum delay is required after enabling or disabling the ADC
    HAL_Delay(1);
    
    /* Apply calibration factor */
    LL_ADC_Enable(hadc->Instance);
    // A minimum delay is required after enabling or disabling the ADC
    HAL_Delay(1);
    LL_ADC_SetCalibrationFactor(hadc->Instance, calibration_factor_accumulated);
    // A minimum delay is required after enabling or disabling the ADC
    HAL_Delay(1);
    LL_ADC_Disable(hadc->Instance);
    // A minimum delay is required after enabling or disabling the ADC
    HAL_Delay(1);

    /* Wait for ADC effectively disabled before changing configuration */
    /* Get tick count */
    tickstart = HAL_GetTick();

    while (LL_ADC_IsEnabled(hadc->Instance) != 0UL)
    {
      if ((HAL_GetTick() - tickstart) > ADC_DISABLE_TIMEOUT)
      {
        /* New check to avoid false timeout detection in case of preemption */
        if (LL_ADC_IsEnabled(hadc->Instance) != 0UL)
        {
          /* Update ADC state machine to error */
          SET_BIT(hadc->State, HAL_ADC_STATE_ERROR_INTERNAL);

          /* Set ADC error code to ADC peripheral internal error */
          SET_BIT(hadc->ErrorCode, HAL_ADC_ERROR_INTERNAL);

          return HAL_ERROR;
        }
      }
    }

    /* Restore configuration after calibration */
    SET_BIT(hadc->Instance->CFGR1, backup_setting_cfgr1);

    /* Set ADC state */
    ADC_STATE_CLR_SET(hadc->State,
                      HAL_ADC_STATE_BUSY_INTERNAL,
                      HAL_ADC_STATE_READY);
  }
  else
  {
    SET_BIT(hadc->State, HAL_ADC_STATE_ERROR_INTERNAL);

    /* Note: No need to update variable "tmp_hal_status" here: already set    */
    /*       to state "HAL_ERROR" by function disabling the ADC.              */
  }

  __HAL_UNLOCK(hadc);

  return tmp_hal_status;
}


2) With the calibration completed (calibration factor: 64), VREFINT now reads 377 raw, which converts to 1.215V, a much more reasonable value.

3) The embedded internal voltage reference value set during production reads as 1655; I am not sure what that means. The comment in the source code where I found the address says:

/* ADC internal channels related definitions */
/* Internal voltage reference VrefInt */
#define VREFINT_CAL_ADDR                   ((uint16_t*) (0x1FFF75AAUL)) /* Internal voltage reference,
  address of parameter VREFINT_CAL: VrefInt ADC raw data acquired at temperature 30 DegC (tolerance: +-5 DegC),
  Vref+ = 3.0 V (tolerance: +-10 mV). */



> The embedded internal voltage reference set during production reads as 1655, not sure what that means

That's the ADC reading taken from VREFINT in the factory:

1655 / 4096 * 3.0V = 1.212V

The VREFINT reference has some manufacturing spread (approx. -2.5%/+1.5%, according to the DS); the factory value gives you a better individual figure (10mV/3.0V, i.e. approx. 0.33%), all of these at room temperature. The LDO you are using is nominally +-1% at room temperature, so using VREFINT instead of relying on the LDO being precise won't gain you that much, but there are also worse LDOs out there.
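As a cross-check (a sketch, using only the numbers from this thread), the factory value predicts your 10-bit reading quite well:

/* Predicted 10-bit VREFINT reading at VREF+ = 3.3 V from the factory value
   (1655, taken at 3.0 V, 12-bit): scale by 3.0/3.3 for the different VREF+
   and divide by 4 for the 12->10 bit resolution change. */
uint32_t predicted_raw10 = (1655UL * 30UL) / (33UL * 4UL);  /* = 376, vs. 377 measured */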

JW

 

In other words I will not get much of a benefit by calibrating based on Vrefint manually, because the LDO has a higher error margin? Hence the ST HAL calibration function should be good enough in my case?

We will test tomorrow and verify whether we get better readings on those devices in China.

> In other words I will not get much of a benefit by calibrating based on Vrefint manually, because the LDO has a higher error margin?

I'm not sure we understand each other.

If you're confident the LDO output is 3.3V precisely enough, forget about VREFINT.

If you're not sure about the exact LDO output voltage but you are sure it's stable, you can somewhat increase precision by using VREFINT.

In your case, I'd start by ignoring VREFINT. I mentioned it just to get you to measure a "known fixed voltage" so that we know where the root of the problem may be. The ADC calibration as per the RM is necessary to remove a manufacturing-dependent offset; that should also remove the need for your "extra" calibration.

JW

Unfortunately the ST calibration does not solve the problem. The ADC error behaves like a wrong gain rather than a wrong offset.
The most interesting part is that we have now seen the problem on a device in a country other than China, so it is probably not hardware dependent.

Wrong gain may be caused by a different-than-expected VREF+ voltage.
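To illustrate (assumed numbers, not measurements from this thread): if the real VREF+ is higher than the 3.3 V the firmware assumes, every computed voltage reads proportionally low, which is a pure gain error:

/* Sketch: VREF+ actually 3.45 V while the firmware assumes 3.3 V. */
float actual_vref  = 3.45f;                 /* hypothetical LDO running high */
float assumed_vref = 3.3f;
float ratio        = 2.0f / actual_vref;    /* ADC ratio for a 2.0 V input */
float reported     = ratio * assumed_vref;  /* = 1.913 V instead of 2.0 V */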

What are the VREFINT readings for the problematic devices?

JW