2022-03-07 09:48 AM
Hi All,
I am using the simplest mode: single conversion, software start, etc., with a 21 MHz ADC clock (/2 divisor).
I have been measuring the conversion times both in a loop and with a scope on a waggled GPIO pin.
With sampling times of 3, 56, and 480 cycles I see conversion times of 2, 4, and 24 us respectively.
Reading the RM, the data sheet, and whatever app notes I can find, I can't see why this should be.
This is the ADC read function:
// Starts a conversion on the internal ST ADC and optionally reads it.
// 'channel' selects the ADC unit: 1 = ADC1, anything else = ADC2.
// Returns the 12-bit result in 'value'.
// Mutex protected.
// This function is BLOCKING unless start_only=true, in which case it just
// starts the ADC (note the mutex remains locked).
// Assumes the ADC is already enabled.
void KDE_ADC_ST_read(uint8_t channel, uint16_t *value, bool start_only)
{
    if (channel == 1)
    {
        ADC1_ST_Lock();
        ADC1->SR = ~(ADC_FLAG_EOC | ADC_FLAG_OVR);  // clear EOC/overrun (write-0-to-clear)
        ADC1->CR2 |= (uint32_t)ADC_CR2_SWSTART;     // software start
    }
    else
    {
        ADC2_ST_Lock();
        ADC2->SR = ~(ADC_FLAG_EOC | ADC_FLAG_OVR);
        ADC2->CR2 |= (uint32_t)ADC_CR2_SWSTART;
    }

    if (!start_only)
    {
        if (channel == 1)
        {
#ifdef TIMING_DEBUG
            TopLED(true);
#endif
            // Wait for the End Of Conversion flag (bit 1 in SR).
            // This could HANG, but only if the silicon is defective or
            // perhaps if the ADC was not enabled.
            while ((ADC1->SR & ADC_FLAG_EOC) == 0) {}
#ifdef TIMING_DEBUG
            TopLED(false);
#endif
            // Clear the 'conversion started' and EOC flags - clearing EOC is
            // redundant since it is already checked above :)
            ADC1->SR = ~(ADC_FLAG_STRT | ADC_FLAG_EOC);
            *value = (uint16_t)ADC1->DR;
            ADC1_ST_Unlock();
        }
        else
        {
            while ((ADC2->SR & ADC_FLAG_EOC) == 0) {}
            ADC2->SR = ~(ADC_FLAG_STRT | ADC_FLAG_EOC);
            //__HAL_ADC_CLEAR_FLAG(ADC2, ADC_FLAG_STRT | ADC_FLAG_EOC);
            *value = (uint16_t)ADC2->DR;
            ADC2_ST_Unlock();
        }
    }
}
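For reference, a possible call pattern (a sketch; only the function itself is from the code above, the surrounding usage is illustrative):

uint16_t raw = 0;
KDE_ADC_ST_read(1, &raw, false);   // blocking: starts ADC1, waits for EOC, raw = 12-bit result
KDE_ADC_ST_read(2, NULL, true);    // start-only: kicks off ADC2 and returns immediately
// NULL is safe here since 'value' is only dereferenced in the blocking path;
// note the ADC2 mutex is still held after the start-only call.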
The other thing I have not found is a way of estimating the accuracy loss across the range of sampling times. There must be some loss between 3 and 480, otherwise nobody would use anything but 3 : - )
2022-03-07 10:51 AM
(480 + 15 ticks) / (21 MHz) = 23.57 us.
(56 + 15 ticks) / (21 MHz) = 3.38 us.
(3 + 15 ticks) / (21 MHz) = 0.85 us.
You read 24 us. Seems right to me, considering your measurements include some overhead. What values are you expecting?
Note there is a 15-tick conversion time after sampling. At the smaller sampling times, the ADC is likely done before you even check the flag, so your measurement will not be exact.
Do continuous conversions and toggle a pin on the EOC flag and you will see the exact value (assuming your code can keep up, which for 480 samples it should).
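Something along these lines (a sketch only; assumes ADC1 is already configured, and Pin_toggle() stands for whatever drives your scope pin):

ADC1->CR2 |= ADC_CR2_CONT;                     // continuous conversion mode
ADC1->CR2 |= ADC_CR2_SWSTART;                  // start once; conversions free-run
for (;;) {
    while ((ADC1->SR & ADC_FLAG_EOC) == 0) {}  // wait for end of conversion
    Pin_toggle();                              // hypothetical scope-pin toggle
    (void)ADC1->DR;                            // reading DR also clears EOC
}

The pin then toggles at the conversion rate, so half a period on the scope is one conversion time.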
2022-03-07 11:36 AM
> There must be some loss between 3 and 480 otherwise nobody would use anything but 3 : - )
Longer sampling periods are for higher input impedances. The App Note "STM32G4 ADC use tips and recommendations" gives some insight, although it is written for the G4.
hth
KnarfB
2022-03-08 02:44 AM
Thank you all.
I did a lot more digging and found this
https://controllerstech.com/stm32-adc-single-channel/
which mentions a constant of 12.5 clocks. Whether it is 12.5 or 15, my results are indeed close enough.
Regarding the sampling time, I do recall reading an app note about this being source-impedance related, but that is true for most (unbuffered) ADCs, and it is normal to drive them from a low-Zout source. The provision of the 480-clock option is weird, because that sampling period corresponds to a Zout so high that other factors will come into play.
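To put rough numbers on that, here is a back-of-envelope settling calculation (a sketch: the ~4 pF / ~6 kOhm figures are typical F4 data sheet values, not guarantees, and the ln(2^(N+2)) criterion is the usual app-note settling target):

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double c_adc = 4e-12;   // sampling capacitor, ~4 pF (typical)
    const double r_adc = 6e3;     // internal sampling switch resistance, ~6 kOhm (typical)
    const double f_adc = 21e6;    // ADC clock
    const int    nbits = 12;

    // Settle to within a fraction of an LSB:
    //   t_S >= (R_AIN + R_ADC) * C_ADC * ln(2^(N+2))
    for (double r_ain = 100.0; r_ain <= 1e6; r_ain *= 10.0) {
        double t_s = (r_ain + r_adc) * c_adc * log(pow(2.0, nbits + 2));
        printf("R_AIN = %8.0f ohm -> t_S >= %7.1f ns (%6.1f ADC clocks)\n",
               r_ain, t_s * 1e9, t_s * f_adc);
    }
    return 0;
}

On these assumed numbers, 480 clocks at 21 MHz only becomes necessary somewhere around a 600 kOhm source impedance, where leakage and noise would matter anyway.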
If this ADC can give you 12 "real" bits in 0.85us, that would be amazing : - )
And for sure you will need DMA to reach that speed - same as you need DMA to cope with 21 MHz SPI (another thread). Not because the CPU is too slow, but because these peripherals run at a slow clock (21 MHz max for the ADC) and the instructions accessing them incur a large number of wait states.
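For reference, a register-level sketch of what the DMA route might look like for ADC1 (assumptions: CMSIS device header names and the RM0090 mapping of ADC1 to DMA2 Stream0 / Channel 0; treat it as illustrative, not drop-in):

static volatile uint16_t adc_buf[256];

RCC->AHB1ENR |= RCC_AHB1ENR_DMA2EN;          // clock the DMA controller

DMA2_Stream0->CR = 0;                        // channel 0 (= ADC1), disabled
while (DMA2_Stream0->CR & DMA_SxCR_EN) {}    // wait until it is really off
DMA2_Stream0->PAR  = (uint32_t)&ADC1->DR;    // source: ADC data register
DMA2_Stream0->M0AR = (uint32_t)adc_buf;      // destination: RAM buffer
DMA2_Stream0->NDTR = 256;
DMA2_Stream0->CR = DMA_SxCR_MINC             // increment memory address
                 | DMA_SxCR_MSIZE_0          // 16-bit memory size
                 | DMA_SxCR_PSIZE_0          // 16-bit peripheral size
                 | DMA_SxCR_CIRC             // circular buffer
                 | DMA_SxCR_EN;              // go

ADC1->CR2 |= ADC_CR2_DMA | ADC_CR2_DDS       // DMA requests, keep issuing them
           | ADC_CR2_CONT;                   // continuous conversions
ADC1->CR2 |= ADC_CR2_SWSTART;                // free-running from here on

Once the DMA owns the transfers, the CPU never busy-waits on SR, so the APB2 wait states stop costing CPU time - which is exactly the cost visible in the measurement below.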
Literally,

GPIOB->BSRR = (1u << 0);                       // scope pin high (pin choice arbitrary)
while (((ADC1->SR) & ADC_FLAG_EOC) == 0) {}
GPIOB->BSRR = (1u << 16);                      // scope pin low

takes 2 us, when the ADC conversion should be 0.85 us, and this is a 168 MHz CPU. The "while" line should be 2 instructions, i.e. ~17 ns at 168 MHz.
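One way to pin that overhead down is the Cortex-M4 DWT cycle counter (a sketch; assumes CMSIS core headers and that nothing else owns the DWT):

CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;   // enable the trace block
DWT->CYCCNT = 0;
DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;             // start the cycle counter

ADC1->CR2 |= ADC_CR2_SWSTART;
uint32_t t0 = DWT->CYCCNT;
while ((ADC1->SR & ADC_FLAG_EOC) == 0) {}
uint32_t dt = DWT->CYCCNT - t0;                   // CPU cycles at 168 MHz
// dt / 168 = microseconds; note that each ADC1->SR read itself crosses the
// AHB/APB2 bridge and eats wait states, which is part of the measured 2 us.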