
ADC and Timer take too long

luke514
Senior

Hello.

I have a signal ranging from 0.5 V to 1.8 V at the input pin of the ADC.

I wrote this code to store the values read from the ADC in ADC_Buffer (1024 elements). A new value is read every 10 ms, whenever the timer interrupt sets Timer_Flag = 1:

- first, the whole buffer is filled (see the first "if", marked in blue in the attached screenshots); this operation happens only once

- then, once the buffer is full, the green "if" shifts each element one position to the left to free the last position (bottom right), which then stores the new value read from the ADC, as in the sketch below
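
For readers who can't see the screenshots, here is a minimal sketch of the logic described above. Only ADC_Buffer, Timer_Flag and the 1024-element size come from the post; Read_ADC() and Sample_Loop() are hypothetical names standing in for whatever the real code uses:

```c
#include <stdint.h>

#define BUF_SIZE 1024

volatile uint16_t ADC_Buffer[BUF_SIZE];
volatile uint8_t  Timer_Flag;      /* set by the 10 ms timer interrupt */
static uint32_t   fill_index = 0;  /* how many samples stored so far   */

extern uint16_t Read_ADC(void);    /* hypothetical: returns one sample */

void Sample_Loop(void)
{
    if (Timer_Flag) {
        Timer_Flag = 0;

        if (fill_index < BUF_SIZE) {
            /* "blue if": initial fill, one sample per 10 ms tick, so
               the buffer is only full after 1024 * 10 ms = ~10 s */
            ADC_Buffer[fill_index++] = Read_ADC();
        } else {
            /* "green if": shift everything one position left, then
               store the newest sample in the freed last position */
            for (uint32_t i = 0; i < BUF_SIZE - 1; i++) {
                ADC_Buffer[i] = ADC_Buffer[i + 1];
            }
            ADC_Buffer[BUF_SIZE - 1] = Read_ADC();
        }
    }
}
```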

This means that the first, blue "if" takes 10 ms (the timer period) × 1024 samples ≈ 10 seconds to complete; I confirmed this with my smartphone's stopwatch.

How can I change/optimize the code to avoid waiting so long? There must be a solution, but since I have only been programming for a short time, I can't think of one.

 

[Attached screenshots: the timer and ADC code, showing the initial buffer-fill "if" (blue) and the shift "if" (green)]

1 REPLY
raptorhal2
Lead

Start with a zeroed buffer and run a buffer-size FFT after each ADC sample. The results will become increasingly accurate as the buffer fills. Once the buffer is full, do the shift and then the FFT after each sample, as in the sketch below.
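
A sketch of that idea, assuming the end goal is an FFT of the sampled signal. Read_ADC_Volts(), Do_FFT() and On_Timer_Tick() are placeholder names; on an STM32, CMSIS-DSP's arm_rfft_fast_f32 would be a natural choice for the FFT itself:

```c
#include <stdint.h>
#include <string.h>

#define BUF_SIZE 1024

static float    buffer[BUF_SIZE];  /* static storage: starts zeroed */
static uint32_t count = 0;         /* real samples received so far  */

extern float Read_ADC_Volts(void);                 /* hypothetical */
extern void  Do_FFT(const float *in, uint32_t n);  /* placeholder  */

void On_Timer_Tick(void)   /* called every 10 ms */
{
    float sample = Read_ADC_Volts();

    if (count < BUF_SIZE) {
        /* still filling: store the sample, keep the zero padding
           in the positions that have not been written yet */
        buffer[count++] = sample;
    } else {
        /* full: shift left and append, as in the original code */
        memmove(&buffer[0], &buffer[1],
                (BUF_SIZE - 1) * sizeof buffer[0]);
        buffer[BUF_SIZE - 1] = sample;
    }

    /* run the FFT on every tick: with the zero-padded buffer the
       result is usable immediately and sharpens as real samples
       replace the zeros */
    Do_FFT(buffer, BUF_SIZE);
}
```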

If your accuracy requirement really is 1024 samples, then you have to wait.