2024-02-04 08:41 AM
I need to measure the time it takes for a component to reach a certain voltage, read out with the ADC. So far I have been measuring in microseconds, but it turned out that that isn't precise enough.
I found the following code to measure time on Stack Overflow and it works well, but I only superficially understand it and can't really modify it.
// setup
__HAL_RCC_TIM2_CLK_ENABLE();                     // enable the TIM2 peripheral clock
TIM2->PSC = HAL_RCC_GetPCLK1Freq()/1000000 - 1;  // prescale the timer clock to 1 MHz, so 1 tick = 1 µs
                                                 // (note: on many STM32s the timer clock is 2x PCLK1
                                                 //  when the APB prescaler is not 1 - check the clock tree)
TIM2->CR1 = TIM_CR1_CEN;                         // start the free-running counter
// read time in microseconds
time_start = TIM2->CNT;                          // current counter value, in µs since the timer started
Can someone explain to me how it works and whether I can modify it to measure nanoseconds? There might be a super obvious way to do it, but again, I don't understand the code (this is what I get for copying code from the internet, I guess).
Is there maybe a better solution than this?
Some additional info:
I don't need the measurement to be super accurate, whatever the chip itself can provide is probably enough for my needs.
I've set up the ADC in continuous conversion mode with circular DMA, which I think is the fastest it can sample, but if there is an even faster method, please let me know.
2024-02-05 07:04 AM
Could you share your circuit? I am interested in how you managed to get such precise measurements, it seems that you are using an approach that I haven't come across yet.
2024-02-05 07:04 AM
In this case, setting the prescaler to 0 and using the resulting tick of 10.42 ns worked to get reliable measurements, but I will probably still look into the other solutions, especially using the comparator, because they seem promising.
Thank you very much for helping me!
2024-02-05 08:18 AM
You can read about the principle here (you may need to translate it):
https://www.sprut.de/electronic/pic/projekte/lcmeter/lcmeter.htm
You can leave out the relay etc.; you only want to measure C, not L + C.
2024-02-05 08:37 AM
Thank you, I'll look into it. I don't even have to translate :)