2022-09-03 03:33 AM
I am using a NUCLEO-F446RE (STM32F446RE) board. It runs at a 180 MHz SYSCLK, and the ADC is clocked from PCLK2 (APB2), which can run at a maximum of 90 MHz. I am using STM32CubeIDE and the clock configuration looks as follows:
My ADC setup looks as follows:
My DMA setup:
Just before the while loop I start the ADC with DMA.
/* USER CODE BEGIN PV */
volatile uint16_t adcResultsDMA[2];
const int adcChannelCount=sizeof(adcResultsDMA)/sizeof(adcResultsDMA[0]);
/* USER CODE END PV */
......
......
......
......
int main(void)
{
/* USER CODE BEGIN 2 */
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount); /* start one scan; DMA writes both channel results into adcResultsDMA */
/* USER CODE END 2 */
/* Infinite loop */
/* USER CODE BEGIN WHILE */
while (1)
{
.......
}
}
And in the conversion-complete callback function I toggle a pin and start the ADC with DMA again:
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {
HAL_GPIO_TogglePin(GPIOA, time_meas_Pin); /* mark the end of conversion so it can be measured on a scope */
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcResultsDMA, adcChannelCount); /* restart the next scan from software */
}
Question
When I use an ADC clock prescaler of PCLK2 divided by 8 (90 MHz / 8 = 11.25 MHz), I measure about 7.6 µs between pin toggles. Let's assume what I am measuring (7.6 µs) is correct. When I then double the ADC frequency by changing the clock prescaler to PCLK2 divided by 4 (90 MHz / 4 = 22.5 MHz), the measured time does not drop to anywhere near half of that.
What concept am I failing to understand? I would like to start the ADC from software rather than from a timer, because the code in the while() loop is started manually.
I have tried to include everything necessary in the code and the diagrams; if I have missed anything important or something does not explain my problem clearly, please let me know.
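For reference, here is a rough estimate of what the raw conversion time should be at these settings. This is only a sketch: it assumes 12-bit resolution and the minimum 3-cycle sample time (the actual sample time is in the ADC setup screenshot), and uses the rule that on the STM32F4 one conversion takes (sample time + 12) ADC clock cycles.

#include <stdio.h>

/* Sketch: expected time for one scan of both channels.
 * Assumptions (not confirmed by the screenshots): 12-bit resolution
 * (12 cycles) and a 3-cycle sample time per channel. */
static double scan_time_us(double adcclk_mhz, int channels,
                           int sample_cycles, int resolution_cycles)
{
    double cycles_per_channel = sample_cycles + resolution_cycles; /* e.g. 3 + 12 = 15 */
    return channels * cycles_per_channel / adcclk_mhz;             /* cycles / MHz = microseconds */
}

int main(void)
{
    printf("PCLK2/8 (11.25 MHz): %.2f us per scan\n", scan_time_us(11.25, 2, 3, 12)); /* ~2.67 us */
    printf("PCLK2/4 (22.50 MHz): %.2f us per scan\n", scan_time_us(22.50, 2, 3, 12)); /* ~1.33 us */
    return 0;
}

Either way, the raw scan time comes out well below the 7.6 µs I measure between toggles.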
2022-09-03 10:10 AM
Calculate the time spent in the HAL library and in your own code: detecting and servicing the end of conversion, toggling the GPIO pin, and starting the conversion again.
Cheers, Hal
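One way to make that measurement (a sketch, not from the thread: it assumes a Cortex-M4 with the CMSIS DWT cycle counter available and reuses the CubeMX-generated hadc1 handle and the adcResultsDMA buffer from the question) is to read DWT->CYCCNT around the restart call:

#include "stm32f4xx_hal.h"

extern ADC_HandleTypeDef hadc1;            /* CubeMX-generated handle */
extern volatile uint16_t adcResultsDMA[2]; /* buffer from the question */

volatile uint32_t restartCycles;           /* CPU cycles spent restarting the ADC/DMA */

void cycle_counter_init(void)              /* call once at startup */
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk; /* enable the trace block */
    DWT->CYCCNT = 0;
    DWT->CTRL |= DWT_CTRL_CYCCNTENA_Msk;            /* start the free-running cycle counter */
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc) /* replaces the callback shown above */
{
    uint32_t start = DWT->CYCCNT;
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adcResultsDMA, 2);
    restartCycles = DWT->CYCCNT - start;   /* at 180 MHz, divide by 180 for microseconds */
}

Watching restartCycles in the debugger then shows how much of the gap between samples is the HAL restart rather than the conversion itself.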
2022-09-04 01:16 AM
What you are measuring is mostly the speed of the software. Of course, the HAL is broken bloatware and normal code can achieve higher speeds, but still that is just not the right approach.
> I would like to start the ADC from software rather than from a timer
One cannot get maximum speed and an accurate sample rate with software triggering. To get those, one has to use either a timer trigger or continuous conversion mode.
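For the continuous-conversion route, a minimal sketch (field names are from the STM32F4 HAL; the hadc1 handle, the ADC1 DMA stream and time_meas_Pin are assumed to come from CubeMX as in the question):

#include "main.h"                  /* CubeMX-generated: HAL headers, time_meas_Pin */

extern ADC_HandleTypeDef hadc1;    /* CubeMX-generated handle */
volatile uint16_t adcResultsDMA[2];

/* CubeMX / MX_ADC1_Init() settings this sketch assumes:
 *   hadc1.Init.ScanConvMode          = ENABLE;
 *   hadc1.Init.ContinuousConvMode    = ENABLE;
 *   hadc1.Init.DMAContinuousRequests = ENABLE;
 * and the ADC1 DMA stream set to Circular mode, half-word width. */

void adc_free_run_start(void)
{
    /* Started once; ADC and DMA then refill adcResultsDMA forever in the background. */
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adcResultsDMA, 2);
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    /* Nothing to restart here; the toggle rate now reflects the ADC itself. */
    HAL_GPIO_TogglePin(GPIOA, time_meas_Pin);
}

With circular DMA and DMAContinuousRequests enabled, the sample rate is set by the ADC clock and sample time alone, not by how quickly the software can call HAL_ADC_Start_DMA again.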
2022-09-04 01:39 AM
Thank you.
I tried to do that in the conversion-complete callback function. I thought that by toggling the pin I was measuring the time taken by the ADC conversion and the DMA transfer.
How would you advise me to measure the time spent in the HAL library?
2022-09-04 01:44 AM
> Of course, the HAL is broken bloatware and normal code can achieve higher speeds, but still that is just not the right approach.
Should I not use the HAL library? It makes programming easier.
Or do you recommend using the HAL library and implementing this with a timer that triggers an event output for the ADC to start a conversion? I could then configure that timer to whatever frequency I would like the ADC to sample at, which solves the problem of getting a fast, consistent ADC sampling rate.
I could then run the ADC at PCLK2 divided by 4 (90 MHz / 4 = 22.5 MHz) so that it converts the data quickly, and use DMA so that things run more quickly?
2022-09-04 02:56 AM
>> converts the data quickly, and use DMA so that things run more quickly?
Right!
If you want high-speed data, you always need DMA.
And the HAL is not very fast, but it is perfectly fine for setting up the many peripherals on the SoC.
Also, the ADC runs at whatever speed you set it to; that needs no testing.
Maybe you have chosen the wrong settings, but then that is not the ADC's fault...
2022-09-05 12:01 AM
Thank you.
I have attached my ADC and DMA setup to my main question. Is there anything obviously wrong with my settings that you can spot quickly?
2022-09-05 11:44 AM
The timer triggers the ADC conversions and the DMA transfers the data from the ADC to RAM. In this scenario the sample (trigger) rate is independent of the ADC conversion time; just make sure the conversion time is shorter than the trigger period.
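A minimal sketch of that arrangement, assuming TIM2 as the trigger source with a 90 MHz timer kernel clock, CubeMX-generated hadc1/htim2 handles, and the ADC1 DMA stream in circular mode (none of these values appear in the thread, so treat them as placeholders):

#include "main.h"                  /* CubeMX-generated: HAL headers, time_meas_Pin */

extern ADC_HandleTypeDef hadc1;    /* CubeMX-generated handles */
extern TIM_HandleTypeDef htim2;
volatile uint16_t adcResultsDMA[2];

/* CubeMX settings this sketch assumes:
 *   TIM2: Prescaler = 89, Period = 99      -> 90 MHz / 90 / 100 = 10 kHz trigger rate
 *         Trigger Event Selection (TRGO)   = Update Event
 *   ADC1: External Conversion Trigger      = Timer 2 Trigger Out event, rising edge
 *         ContinuousConvMode = DISABLE, DMAContinuousRequests = ENABLE
 *   DMA:  ADC1 stream in Circular mode, half-word width */

void adc_timer_triggered_start(void)
{
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adcResultsDMA, 2); /* arm ADC + DMA once */
    HAL_TIM_Base_Start(&htim2);                              /* every update event starts one scan */
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    /* Runs once per triggered scan, i.e. at the timer rate (10 kHz in this sketch). */
    HAL_GPIO_TogglePin(GPIOA, time_meas_Pin);
}

The pin should then toggle at exactly the timer rate, independent of the ADC clock prescaler, as long as the conversion of both channels fits inside one trigger period.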