
ADC - figure out exact signal duration

Question asked by parsec67 on Nov 23, 2016
Latest reply on Nov 23, 2016 by parsec67
I've been looking for info on this subject for some time and cannot find any references, possibly because I'm not an EE, don't have a full understanding of all the ADC terminology, and am consequently using the wrong search terms.

My question: how do I define or limit the total duration of a sampled signal? I want the resulting signal (buffer) to contain exactly 512 milliseconds of data.

Details/how it is supposed to work:
1. Signal is sampled at 1 MHz, single channel continuous.
2. Sampling will be triggered by analog watchdog if signal exceeds set threshold level.
3. Use DMA, interrupt when buffer full.

It is #3 that I'm not clear on how to dimension to get the desired result. Do I calculate the total signal duration by counting the cycles/nanoseconds used for each individual sample and size the DMA buffer accordingly? Or should the watchdog interrupt start a timer that stops sampling after the desired period has elapsed? Or what is the proper/recommended way of achieving this?

I don't necessarily need source code, just trying to understand the concept. Any pointers would be welcome.