How to get max ADC sample rate on STM32H745?

JBerry
Associate III

I am using an STM32H745 and trying to get the ADC to run at 3.333 MSPS. Currently I am prototyping on a Nucleo-144 development board, which has an LQFP144-package MCU on it. According to the datasheet (page 185) I should be able to run an ADC at up to 3.6 or 3.35 MSPS at 16-bit resolution, but I have found some information (page 15) that suggests the ADC cannot be run that fast unless I lower the bit resolution. Since the ADC was not keeping up, I tried lowering the resolution to 10 bits; the actual data quality is not critical for this prototype (our custom board will use a TFBGA package, which is better, though not the best). Even so, the ADC still seems to be running at about half the rate I want.

I am using the direct channel into ADC3 in order to get the best sample rate I can, DMA to get the samples into memory, and only the M7 core for now.

I have ADC3 set up in CubeMX with the following (a rough HAL-equivalent sketch follows the list):

Clock Prescaler: Asynchronous clock mode divided by 1

Resolution: ADC 10-bit resolution

Left Bit Shift: 6 bits shift (looking more closely, this may not actually get set by the generated code)

Conversion Data Management: DMA One Shot Mode

External Trigger Conversion Source: Timer 8 Trigger Out 2 event

External Trigger Conversion Edge: Trigger detection on the rising edge

Sampling Time: 1.5 Cycles (only using 1 channel)
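For reference, the HAL equivalent of that configuration should look roughly like the sketch below. This is a trimmed-down sketch rather than the exact generated code: hadc3 and ADC_CHANNEL_0 are placeholders for my handle and the actual direct-channel pin.

```c
#include "main.h"   /* CubeMX-generated: HAL headers + Error_Handler() */

ADC_HandleTypeDef hadc3;

static void MX_ADC3_Init(void)
{
  ADC_ChannelConfTypeDef sConfig = {0};

  hadc3.Instance = ADC3;
  hadc3.Init.ClockPrescaler         = ADC_CLOCK_ASYNC_DIV1;        /* async kernel clock / 1 */
  hadc3.Init.Resolution             = ADC_RESOLUTION_10B;
  hadc3.Init.ScanConvMode           = ADC_SCAN_DISABLE;            /* single channel */
  hadc3.Init.EOCSelection           = ADC_EOC_SINGLE_CONV;
  hadc3.Init.ContinuousConvMode     = DISABLE;                     /* each conversion waits for a trigger */
  hadc3.Init.NbrOfConversion        = 1;
  hadc3.Init.ExternalTrigConv       = ADC_EXTERNALTRIG_T8_TRGO2;   /* TIM8 TRGO2 */
  hadc3.Init.ExternalTrigConvEdge   = ADC_EXTERNALTRIGCONVEDGE_RISING;
  hadc3.Init.ConversionDataManagement = ADC_CONVERSIONDATA_DMA_ONESHOT;
  hadc3.Init.Overrun                = ADC_OVR_DATA_OVERWRITTEN;
  hadc3.Init.LeftBitShift           = ADC_LEFTBITSHIFT_6;
  hadc3.Init.OversamplingMode       = DISABLE;
  if (HAL_ADC_Init(&hadc3) != HAL_OK) { Error_Handler(); }

  sConfig.Channel      = ADC_CHANNEL_0;              /* placeholder: the direct channel actually used */
  sConfig.Rank         = ADC_REGULAR_RANK_1;
  sConfig.SamplingTime = ADC_SAMPLETIME_1CYCLE_5;
  sConfig.SingleDiff   = ADC_SINGLE_ENDED;
  sConfig.OffsetNumber = ADC_OFFSET_NONE;
  if (HAL_ADC_ConfigChannel(&hadc3, &sConfig) != HAL_OK) { Error_Handler(); }
}
```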

Clock setup (a quick throughput estimate follows the list):

M7 clock 480MHz, M4 clock 240MHz

APB1 and APB2 clocks (for timers) set to 240MHz

ADC clock set to 36MHz (I have also tried 37MHz and sampling time of 2.5 Cycles as suggested by the datasheet)
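If I am reading RM0399 correctly, the SAR phase takes roughly 8.5 ADC clock cycles at 16-bit and 5.5 cycles at 10-bit, on top of the sampling time, so at these settings the converter itself should comfortably exceed my 3.333 MSPS target. A back-of-the-envelope check (treat the cycle counts as my assumptions, not gospel):

```c
#include <stdio.h>

/* Quick sanity check of the expected conversion rate. The SAR cycle counts
 * are my reading of RM0399 (8.5 ADC clock cycles at 16-bit, 5.5 at 10-bit);
 * double-check them against the reference manual. */
int main(void)
{
    const double adc_clk_hz  = 36e6;  /* ADC kernel clock */
    const double sample_cyc  = 1.5;   /* configured sampling time */
    const double sar_16b_cyc = 8.5;
    const double sar_10b_cyc = 5.5;

    printf("16-bit: %.2f MSPS\n", adc_clk_hz / (sample_cyc + sar_16b_cyc) / 1e6); /* ~3.60 */
    printf("10-bit: %.2f MSPS\n", adc_clk_hz / (sample_cyc + sar_10b_cyc) / 1e6); /* ~5.14 */
    return 0;
}
```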

Timer 8 is set up as follows (a rough HAL sketch follows the list):

Prescaler: 0

Counter Period: 71 (240MHz / 72 = 3.333 MHz)

Trigger Event Selection TRGO2: Output Compare (OC1REF)

Channel 1: PWM Generation CH1

Pulse: 35 (for 50% duty cycle trigger)
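A rough HAL equivalent of the TIM8 setup, for reference. htim8 is a placeholder, starting the timer and the TIM4 master/slave linkage described below are omitted, and the break/dead-time setup an advanced timer normally gets from CubeMX is left out:

```c
#include "main.h"   /* CubeMX-generated: HAL headers + Error_Handler() */

TIM_HandleTypeDef htim8;

static void MX_TIM8_Init(void)
{
  TIM_MasterConfigTypeDef sMasterConfig = {0};
  TIM_OC_InitTypeDef sConfigOC = {0};

  htim8.Instance               = TIM8;
  htim8.Init.Prescaler         = 0;
  htim8.Init.CounterMode       = TIM_COUNTERMODE_UP;
  htim8.Init.Period            = 71;                    /* 240 MHz / 72 = 3.333 MHz */
  htim8.Init.ClockDivision     = TIM_CLOCKDIVISION_DIV1;
  htim8.Init.AutoReloadPreload = TIM_AUTORELOAD_PRELOAD_DISABLE;
  if (HAL_TIM_PWM_Init(&htim8) != HAL_OK) { Error_Handler(); }

  sMasterConfig.MasterOutputTrigger  = TIM_TRGO_RESET;
  sMasterConfig.MasterOutputTrigger2 = TIM_TRGO2_OC1REF;  /* TRGO2 = OC1REF drives the ADC */
  sMasterConfig.MasterSlaveMode      = TIM_MASTERSLAVEMODE_DISABLE;
  if (HAL_TIMEx_MasterConfigSynchronization(&htim8, &sMasterConfig) != HAL_OK) { Error_Handler(); }

  sConfigOC.OCMode     = TIM_OCMODE_PWM1;
  sConfigOC.Pulse      = 35;                             /* ~50% duty cycle trigger */
  sConfigOC.OCPolarity = TIM_OCPOLARITY_HIGH;
  sConfigOC.OCFastMode = TIM_OCFAST_DISABLE;
  if (HAL_TIM_PWM_ConfigChannel(&htim8, &sConfigOC, TIM_CHANNEL_1) != HAL_OK) { Error_Handler(); }
}
```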

It gets a little complicated here: I am using Timer 4 to trigger Timer 8, and a Timer 4 output to trigger setting up the DMA transfer from the ADC, but that seems to be working as expected. I am using Timer 5 to measure how long it takes from starting the DMA transfer to getting the transfer-complete interrupt. Right now the best I can do is 64 samples in about 41 us, including the DMA setup time (which itself seems to be on the order of 1-2 us). That puts me at about 0.6 us per sample instead of the 0.3 us I am aiming for.
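Stripped down, the capture and timing measurement look something like this (buffer size, handles, and cache handling are simplified placeholders for my actual code, and the TIM4/TIM8 gating is not shown):

```c
#include "stm32h7xx_hal.h"

extern ADC_HandleTypeDef hadc3;
extern TIM_HandleTypeDef htim5;

#define NUM_SAMPLES 64

static uint16_t adc_buf[NUM_SAMPLES];
volatile uint32_t conv_time_ticks;

void start_capture(void)
{
  __HAL_TIM_SET_COUNTER(&htim5, 0);      /* TIM5 free-running, used as a stopwatch */
  HAL_TIM_Base_Start(&htim5);
  HAL_ADC_Start_DMA(&hadc3, (uint32_t *)adc_buf, NUM_SAMPLES);
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
  if (hadc->Instance == ADC3)
  {
    conv_time_ticks = __HAL_TIM_GET_COUNTER(&htim5);  /* ticks for 64 samples + DMA setup */
  }
}
```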

If I change the prescaler for Timer 8 to 1 (so it runs half as fast), the timing looks the same, so it seems like the ADC is ignoring every other trigger from Timer 8 at the faster rate. The fact that I can make that one change and things work as I expect seems to imply the ADC is set up properly; it just cannot run as fast as I want it to. Adjusting the resolution between 16, 10, and 8 bits does not seem to affect the timing at all.

Is there something I am missing? How should the ADC react to being triggered too frequently? Especially in the case of running it within the specified rates, but supposedly too fast for a given package.

Edit: I did a little more digging and the generated code was not properly setting the ADC to 8-bit mode. Apparently different silicon revisions use a different register value for 8-bit mode, but the function the generated code was calling was not the one that checks for that. The generated code also was not setting the left-shift setting for some reason. Now I am wondering how many more issues I will find with the generated code before this is over... (or worse, the ones I won't find)
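In case it helps anyone else: reading the ADC registers back after init is what exposed this for me. Something like the sketch below (field names from the STM32H7 CMSIS headers; I compare the raw RES field rather than trusting the HAL define, since the 8-bit encoding differs between revisions):

```c
#include <stdio.h>
#include "stm32h7xx_hal.h"

extern ADC_HandleTypeDef hadc3;

/* Call after MX_ADC3_Init(): dump the silicon revision plus the resolution and
 * left-shift bits actually programmed, so they can be checked by hand. */
void dump_adc3_config(void)
{
  uint32_t res    = (hadc3.Instance->CFGR  & ADC_CFGR_RES)     >> ADC_CFGR_RES_Pos;
  uint32_t lshift = (hadc3.Instance->CFGR2 & ADC_CFGR2_LSHIFT) >> ADC_CFGR2_LSHIFT_Pos;

  printf("REVID=0x%04lX RES=%lu LSHIFT=%lu\r\n",
         (unsigned long)HAL_GetREVID(), (unsigned long)res, (unsigned long)lshift);
}
```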

ACCEPTED SOLUTION
TDK
Guru

Did you read this? Could be related.

https://community.st.com/s/question/0D53W00000FP7j8SAD/adc-only-half-of-the-programmed-sample-rate-specs-changed

> How should the ADC react to being triggered too frequently? Especially in the case of running it within the specified rates, but supposedly too fast for a given package.

I would imagine the package specifications are due to hardware signal limits, rather than a change in the architecture. I'd just expect the ADC readings to be poor.

If you feel a post has answered your question, please click "Accept as Solution".


JBerry
Associate III

I had not found that, thanks! I just tried doubling the ADC clock to 72 MHz and that seems to have gotten it working, even at 14 bits.