2024-04-25 02:03 AM - edited 2024-04-25 02:35 AM
Hi,
I am trying to investigate the behaviour of the ADC when measuring a value, with the ADC triggered from a timer source.
The manual states that when the ADC is clocked from SYSCLK or PLL "P" it is faster, but when it is clocked from the AHB clock the timing is more precise.
I can only observe a speed difference of about 150 ns, but no real timing difference. Also, the falling edge of the signal being measured shows a dip, whereas the rising edge does not.
Green is the actual signal being measured. The yellow trace shows the test signal from the interrupt, where the low time between the high pulses is proportional to the level of the signal. The red trace has no meaning.
Does anyone know more details about this, or where I can read up on it?
2024-04-26 10:23 AM
I have now made quite a few measurements.
Findings are as follows:
- With SYSCLK (170 MHz) the sampling happens about 34 ns earlier compared to AHB, and its timing has a standard deviation of 580 ps.
- With the AHB clock (1/2 prescaler -> 85 MHz) the standard deviation is 190 ps.
Not as huge a difference as I would have thought.
The measurement was done by having the interrupt output the result as a pulse. When triggering on a particular pulse width, there was a fixed level at which the sampling happened (visible with a triangular input signal, as the rising and falling edges created a crossing on the scope). A square signal was then used to get exact timing measurements.