
ADC frequency not working as expected

MRABA.1
Associate II

I am using a NUCLEO-F446RE board. It runs at a 180 MHz SYSCLK, and the ADC uses APB2 (PCLK2), which can run at a maximum of 90 MHz. I am using STM32CubeIDE and the clock configuration looks as follows:

[Clock configuration screenshot attached]

My ADC setup looks as follows:

[ADC configuration screenshot attached]

My DMA setup:

[DMA configuration screenshot attached]

Just before the while loop I start the ADC with DMA:

/* USER CODE BEGIN PV */
volatile uint16_t adcResultsDMA[2];   /* DMA destination buffer, one element per channel */
const int adcChannelCount = sizeof(adcResultsDMA) / sizeof(adcResultsDMA[0]);
/* USER CODE END PV */ 
 
......
......
......
......
int main(void)
{
  /* USER CODE BEGIN 2 */
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount);
  /* USER CODE END 2 */
 
  /* Infinite loop */
  /* USER CODE BEGIN WHILE */
  while (1)
  {
      .......
  }
}

In the conversion-complete callback function I toggle a pin and start the ADC with DMA again:

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {
    HAL_GPIO_TogglePin(GPIOA, time_meas_Pin);   /* marks the end of one conversion sequence */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount);
}

Question

When I use an ADC clock prescaler of PCLK2 divided by 8 (90 MHz / 8 = 11.25 MHz):

  • The time between the GPIO toggles is 7.6 µs (this is the time it takes for the ADC to sample and convert the 2 channels and for the DMA to transfer the data to memory).
  • In theory, a 12-bit conversion takes 15 ADC clock cycles (minimum 3-cycle sample time + 12-cycle conversion), so 2 channels should take 30 ADC cycles / 11.25 MHz ≈ 2.7 µs (see the worked calculation after this list). What I am measuring (7.6 µs) is far higher than what I expect (≈ 2.7 µs).
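
For reference, here is a minimal sketch of the arithmetic behind my expectation (the helper name is mine and only for illustration; it assumes each channel uses the minimum 3-cycle sample time):

/* Hypothetical helper: expected regular-group sequence time for an STM32F4 ADC,
 * assuming 12-bit resolution (12 conversion cycles) plus the chosen sample time. */
#include <stdio.h>

static double expected_sequence_time_us(double pclk2_hz, unsigned prescaler,
                                        unsigned channels, unsigned sample_cycles)
{
    double adc_clk_hz = pclk2_hz / prescaler;                     /* ADCCLK = PCLK2 / prescaler */
    double cycles = (double)channels * (sample_cycles + 12);      /* cycles for the whole scan  */
    return cycles / adc_clk_hz * 1e6;                             /* seconds -> microseconds    */
}

int main(void)
{
    /* 2 channels, 3-cycle sampling, PCLK2 = 90 MHz */
    printf("div8: %.2f us\n", expected_sequence_time_us(90e6, 8, 2, 3));  /* ~2.67 us */
    printf("div4: %.2f us\n", expected_sequence_time_us(90e6, 4, 2, 3));  /* ~1.33 us */
    return 0;
}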

Let's assume that what I am measuring (7.6 µs) is correct. When I then double the ADC frequency by changing the clock prescaler to PCLK2 divided by 4 (90 MHz / 4 = 22.5 MHz):

  • The time between the GPIO toggles is 7.2 µs (again, the time for the ADC to sample and convert the 2 channels and for the DMA to transfer the data to memory). I don't understand why I get almost the same time (7.6 µs at 11.25 MHz and 7.2 µs at 22.5 MHz) when I double my ADC frequency.

Please advise me on what concept I am failing to understand. I would like to start the ADC from software and not from a timer, because the code in the while() loop is started manually.

I have tried to include everything necessary with the code and the diagrams; if I have missed important aspects or have not explained my problem clearly, please let me know.

1 ACCEPTED SOLUTION: Piranha's reply below.

7 REPLIES
raptorhal2
Lead

Calculate the time spent in the HAL library and in your code: detecting and servicing the end of conversion, toggling the GPIO pin, and starting the conversion again.

Cheers, Hal
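
One way to measure that part (a minimal sketch, not from the original thread; dwt_init is a name made up here, and it assumes a Cortex-M4 where the DWT cycle counter is available) is to wrap the restart call with DWT->CYCCNT timestamps:

/* Sketch only: count the CPU cycles spent restarting the ADC from the callback.
 * DWT is the Cortex-M4 Data Watchpoint and Trace unit (CMSIS, core_cm4.h). */
volatile uint32_t hal_restart_cycles;          /* cycles spent in HAL_ADC_Start_DMA() */

void dwt_init(void)                            /* call once at startup */
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;
    DWT->CYCCNT = 0;
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;
}

/* Modified version of the callback from the question: */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc)
{
    HAL_GPIO_TogglePin(GPIOA, time_meas_Pin);

    uint32_t t0 = DWT->CYCCNT;
    HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount);
    hal_restart_cycles = DWT->CYCCNT - t0;     /* divide by 180e6 for seconds at 180 MHz */
}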

Piranha
Chief II

What you are measuring is mostly the speed of software. Of course, the HAL is broken bloatware and normal code can achieve higher speeds, but still that is just not the right approach.

> I would like to start ADC with software and not a timer

One cannot get maximum speed and an accurate sampling frequency with software. To get these, one has to use either a timer trigger or continuous conversion mode.
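
For illustration, here is a minimal sketch of the continuous-conversion variant (not from the original post; field names are from the STM32F4 HAL, the DMA stream is assumed to be switched to circular mode, and the other names are reused from the question):

/* 1. In MX_ADC1_Init() (CubeMX generated), the relevant settings are:          */
/*    hadc1.Init.ScanConvMode          = ENABLE;   convert all ranked channels  */
/*    hadc1.Init.ContinuousConvMode    = ENABLE;   hardware restarts the scan   */
/*    hadc1.Init.DMAContinuousRequests = ENABLE;   DMA requests never stop      */

/* 2. In main(), start once; ADC + DMA then free-run with no software overhead: */
/*    HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount);     */

/* 3. The callback only marks timing and must NOT restart the ADC: */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc)
{
    HAL_GPIO_TogglePin(GPIOA, time_meas_Pin);
}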

Thank you.

I tried to do that in the conversion-complete callback function. I thought that by toggling the pin I was measuring the time it takes for the ADC conversion and the DMA transfer.

How do you advise me to calculate the time spent in the HAL library?

> Of course, the HAL is broken bloatware and normal code can achieve higher speeds, but still that is just not the right approach.

Should I not use the HAL library, given that it makes programming easier?

Or do you recommend using the HAL library and implementing this with a timer that triggers an event output for the ADC to start a conversion? I can then configure that timer to whatever frequency I would like the ADC to sample at, which solves the issue of getting a fast ADC sampling rate.

I could then run the ADC at the fast PCLK2-divided-by-4 clock (90 MHz / 4 = 22.5 MHz) so that it converts the data quickly, and use DMA so that everything runs even faster?

AScha.3
Chief II

>> convert the data quickly. And implement DMA for things to run more quickly?

Right!

If you want high-speed data, you always need DMA.

And HAL is not very fast, but it is perfect for setting up the many things on the SoC.

+ The ADC runs at the speed you set; this needs no testing.

It is just that you may have chosen the wrong settings, but that is not a problem of the ADC then...


Thank you.

I have attached my ADC and DMA setup in my main question. Is there anything you can quickly spot that is wrong with my settings?

The timer triggers the ADC conversions and the DMA transfers the data from the ADC to RAM. In this scenario the sample (trigger) rate is independent of the ADC conversion time; just ensure the conversion time is shorter than the trigger period.
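
A minimal sketch of that timer-triggered arrangement (the helper name, the choice of TIM2, and the rate are illustrative assumptions; it presumes a 90 MHz TIM2 kernel clock with this clock tree and the usual CubeMX-generated handles hadc1 and htim2 plus a circular DMA stream for ADC1):

/* Hypothetical helper: trigger ADC1 from TIM2 TRGO at ~100 kHz; field names
 * are from the STM32F4 HAL. CubeMX normally generates these settings in
 * MX_ADC1_Init() and MX_TIM2_Init(). */
static void start_timer_triggered_adc(void)
{
    /* ADC: one scan of the ranked channels per external trigger. */
    hadc1.Init.ContinuousConvMode    = DISABLE;                       /* one sequence per trigger */
    hadc1.Init.ExternalTrigConv      = ADC_EXTERNALTRIGCONV_T2_TRGO;  /* trigger = TIM2 TRGO      */
    hadc1.Init.ExternalTrigConvEdge  = ADC_EXTERNALTRIGCONVEDGE_RISING;
    hadc1.Init.DMAContinuousRequests = ENABLE;
    HAL_ADC_Init(&hadc1);

    /* TIM2: route the update event to TRGO; 90 MHz / 90 / 10 = 100 kHz trigger rate. */
    TIM_MasterConfigTypeDef sMasterConfig = {0};
    htim2.Init.Prescaler = 90 - 1;                                    /* 90 MHz -> 1 MHz counter  */
    htim2.Init.Period    = 10 - 1;                                    /* 1 MHz  -> 100 kHz update */
    HAL_TIM_Base_Init(&htim2);
    sMasterConfig.MasterOutputTrigger = TIM_TRGO_UPDATE;
    sMasterConfig.MasterSlaveMode     = TIM_MASTERSLAVEMODE_DISABLE;
    HAL_TIMEx_MasterConfigSynchronization(&htim2, &sMasterConfig);

    /* Start order: DMA-backed ADC first, then the timer that triggers it. */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount);
    HAL_TIM_Base_Start(&htim2);
}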