2024-01-19 07:36 AM - edited 2024-01-19 07:37 AM
Hello.
I have a signal ranging from 0.5 V to 1.8 V at the input pin of the ADC.
I wrote this code to store the values read from the ADC into ADC_Buffer (1024 elements) every 10 ms (whenever Timer_Flag = 1):
- first, the whole buffer is filled (see the first "if", marked in blue in the picture); this happens only once
- then, once the buffer is full, the green "if" shifts every element one position to the left, freeing the last position (bottom right), which then stores the new value read from the ADC
This means that the first blue "if" takes 10 ms (timer interrupt period) × 1024 samples ≈ 10 seconds, which I confirmed with my smartphone timer. A text sketch of this logic follows.
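In text form, the two branches look roughly like this (a sketch only; ADC_ReadValue() and Process_Sample() are placeholder names, the real code is in the attached picture):

```c
#include <stdint.h>

#define BUF_SIZE 1024U

extern uint16_t ADC_ReadValue(void);     /* placeholder for the actual ADC read */

volatile uint16_t ADC_Buffer[BUF_SIZE];
volatile uint32_t Fill_Index = 0;
volatile uint8_t  Timer_Flag = 0;        /* set to 1 by the 10 ms timer interrupt */

void Process_Sample(void)                /* called from the main loop */
{
    if (Timer_Flag != 1)
        return;
    Timer_Flag = 0;

    if (Fill_Index < BUF_SIZE) {
        /* "blue" if: initial fill, one sample every 10 ms -> ~10 s in total */
        ADC_Buffer[Fill_Index++] = ADC_ReadValue();
    } else {
        /* "green" if: shift everything one position to the left and
           store the newest sample in the last (bottom-right) position */
        for (uint32_t i = 0; i < BUF_SIZE - 1U; i++)
            ADC_Buffer[i] = ADC_Buffer[i + 1U];
        ADC_Buffer[BUF_SIZE - 1U] = ADC_ReadValue();
    }
}
```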
I ask how can I change/optimize the code to avoid waiting so long! There will definitely be a solution but programming for a short time I can't think of anything else.
2024-01-19 08:07 AM
Start with a zero-filled buffer and, after each ADC sample, run a full-buffer-size FFT. The results will get increasingly accurate as the buffer fills with real samples. Once the buffer is full, do the shift and the FFT after each sample, as you do now.
If your accuracy requirement is 1024 samples, then you have to wait.
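As a rough sketch of the first idea (Read_ADC_Sample() and Run_FFT() are placeholders for your actual ADC read and FFT routines, e.g. from CMSIS-DSP):

```c
#include <stdint.h>
#include <string.h>

#define BUF_SIZE 1024U

extern float Read_ADC_Sample(void);                   /* placeholder: one ADC reading */
extern void  Run_FFT(const float *buf, uint32_t n);   /* placeholder FFT routine */

static float Sample_Buf[BUF_SIZE];                    /* static storage: starts as all zeros */

void On_Timer_Tick(void)                              /* called every 10 ms */
{
    /* Shift left and append the newest sample. Because the buffer starts
       zero-filled, the FFT below is usable from the very first sample and
       its accuracy improves as real samples replace the zeros. */
    memmove(&Sample_Buf[0], &Sample_Buf[1], (BUF_SIZE - 1U) * sizeof Sample_Buf[0]);
    Sample_Buf[BUF_SIZE - 1U] = Read_ADC_Sample();

    Run_FFT(Sample_Buf, BUF_SIZE);                    /* full-size FFT after every sample */
}
```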