I’m trying to minimize control loop delays. A digital output is triggered when an analog input crosses an adjustable threshold. ADC values are read via DMA; the digital output is driven without DMA.
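For context, a minimal sketch of the decision step as I understand it (names like `adc_dma_buf` and `compute_do_state` are illustrative, not from any particular HAL; in the real firmware the buffer is filled by DMA and the GPIO write happens right after the comparison):

```c
#include <stdint.h>
#include <stdbool.h>

#define ADC_BUF_LEN 8

/* In the real firmware this buffer is written by the ADC's DMA channel;
 * here it is a plain array for illustration. */
static volatile uint16_t adc_dma_buf[ADC_BUF_LEN];

/* Adjustable threshold (12-bit ADC midpoint assumed as an example). */
static uint16_t threshold = 2048;

/* Pure decision step: desired digital-output state for the newest sample.
 * In the actual loop this runs in the DMA (half-)transfer-complete callback,
 * followed immediately by the GPIO write. */
bool compute_do_state(uint16_t latest_sample, uint16_t thr)
{
    return latest_sample > thr;
}
```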
Results measured by scope: minimum delay ~300 μs, maximum ~390 μs (variation ~90 μs).
Does anybody have tips/tricks/best practices for minimizing the ADC-to-DO control loop delay?