Oscilloscope With External Trigger Using STM32F429 - Is it possible while taking the interrupt latency into account?

MMoha.3
Associate III

Hello,

I'm planning to use the STM32F429 with its triple interleaved ADCs to make a 7.2Msps digital oscilloscope.

So the thing about digital oscilloscopes is that they also record samples before the trigger events.

To do so, they keep a circular buffer that is constantly being filled with samples, so that when the trigger event happens, only half of the buffer gets filled with "new" data gathered after the trigger; the other half still contains the "old" pre-trigger data.

So my idea here is to set up the ADCs and the DMA in circular (continuous) mode and use an external interrupt pin (EXTI) as the trigger input.

So when the EXTI interrupt occurs, a timer starts counting until the time required to fill half of the buffer with samples has passed, and then the ADCs are turned off. At that point the buffer is ready for processing.
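Roughly, the flow I have in mind looks like this (a HAL-based sketch only; hadc1, htim7, the trigger pin and the buffer size are placeholders, and the half-buffer timer period would be configured elsewhere to match half the buffer at 7.2 Msps):

/* Minimal sketch of the intended flow; handle names and sizes are assumptions. */
#include "stm32f4xx_hal.h"

#define BUF_WORDS     4096u                    /* 32-bit words; DMA access mode 2 packs two samples/word */
#define TRIGGER_PIN   GPIO_PIN_2               /* EXTI line used as scope trigger (assumed)              */

extern ADC_HandleTypeDef hadc1;                /* triple-interleaved master ADC                          */
extern TIM_HandleTypeDef htim7;                /* one-shot "half buffer" timer                           */

static uint32_t adc_buf[BUF_WORDS];            /* circular capture buffer filled by DMA                  */
static volatile uint32_t trig_ndtr;            /* NDTR snapshot at the moment of the trigger             */
static volatile uint8_t  triggered;

void scope_start(void)
{
    triggered = 0;
    HAL_ADCEx_MultiModeStart_DMA(&hadc1, adc_buf, BUF_WORDS);   /* circular DMA runs continuously        */
}

void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    if (GPIO_Pin == TRIGGER_PIN && !triggered) {
        triggered = 1;
        trig_ndtr = DMA2_Stream0->NDTR;        /* where in the buffer the trigger landed                 */
        HAL_TIM_Base_Start_IT(&htim7);         /* counts down half a buffer worth of samples             */
    }
}

void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance == TIM7) {
        HAL_TIM_Base_Stop_IT(htim);
        HAL_ADCEx_MultiModeStop_DMA(&hadc1);   /* freeze: half pre-trigger, half post-trigger data       */
        /* buffer is now ready for processing/plotting, starting from trig_ndtr */
    }
}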

So my question is whether this whole concept is feasible when the interrupt latency of the EXTI is taken into account. The core will be running at 144 MHz. I couldn't find any figure for the interrupt latency, so I'm asking whether anyone has had a similar experience and whether they were successful.

Also if you think there are any other obstacles to my idea feel free to share them.

Thank you in advance.

8 REPLIES

Use a timer to track the precise timestamp of the trigger signal instead of EXTI. Set up a timer/counter to count ADC samples (the counter then holds the position in the circular buffer) and use a capture event to save its value when the trigger comes. After that you can set a compare event on the same timer to stop the ADC. In general, let the hardware (timers, DMAs, etc.) do as much as possible. When using IRQs, latency is not the main problem (a known latency can be compensated for); the main problem is latency uncertainty.
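A minimal register-level sketch of the capture/compare part, assuming a timer (TIM2 here) that ticks once per ADC sample, ARR equal to the buffer length, CH1 in input capture on the trigger pin and CH2 free for the compare event:

#include "stm32f4xx.h"

#define BUF_LEN 8192u                             /* samples in the circular buffer (assumed) */

void TIM2_IRQHandler(void)
{
    if (TIM2->SR & TIM_SR_CC1IF) {                /* trigger edge captured                    */
        uint32_t trig_index = TIM2->CCR1;         /* sample index at trigger time             */
        TIM2->CCR2  = (trig_index + BUF_LEN / 2u) % BUF_LEN;  /* stop half a buffer later     */
        TIM2->DIER |= TIM_DIER_CC2IE;
        TIM2->SR    = ~TIM_SR_CC1IF;              /* write-zero-to-clear                      */
    }
    if ((TIM2->SR & TIM_SR_CC2IF) && (TIM2->DIER & TIM_DIER_CC2IE)) {
        ADC1->CR2  &= ~ADC_CR2_ADON;              /* stop the (master) ADC                    */
        TIM2->DIER &= ~TIM_DIER_CC2IE;
        TIM2->SR    = ~TIM_SR_CC2IF;
    }
}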

@Michal Dudka​ Thank you for the answer,

I can't quite see how to set up a timer to precisely count the ADC samples. Do you mean I should set up a timer running at 7.2 MHz in sync with the ADCs? Is that going to be precise enough?

In my original plan, I was reading the NDTR register of the DMA stream in the EXTI callback in order to know where in the buffer the trigger happened. But for some reason I couldn't get that to work, and my waveform kept scrolling down the screen.

TDK
Guru

This scheme will work, assuming DMA can keep up. Transferring at 7.2 Msps is going to tax the system.

Interrupt latency is 10-20 ticks, so pretty quick. Reading NDTR to determine where the trigger occurred will get you very close. If that's not good enough, you could probably set up a timer to trigger a DMA transfer to capture the NDTR value when a pin goes high.
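Something along these lines, as a register-level sketch only; the TIM1_CH1 -> DMA2 Stream1 / channel 6 pairing and the use of Stream0 for the ADC are assumptions to verify against the DMA request mapping table in RM0090:

/* A spare DMA stream copies DMA2_Stream0->NDTR into RAM on every TIM1 CH1
 * capture event, so the trigger position is latched with no CPU involvement.
 * TIM1 itself must already be set up with CH1 in input capture on the pin. */
#include "stm32f4xx.h"

static volatile uint32_t ndtr_at_trigger;

static void ndtr_capture_init(void)
{
    RCC->AHB1ENR |= RCC_AHB1ENR_DMA2EN;

    DMA2_Stream1->CR = 0;                                  /* make sure the stream is idle       */
    while (DMA2_Stream1->CR & DMA_SxCR_EN) { }

    DMA2_Stream1->PAR  = (uint32_t)&DMA2_Stream0->NDTR;    /* "peripheral" = ADC stream's NDTR   */
    DMA2_Stream1->M0AR = (uint32_t)&ndtr_at_trigger;       /* destination variable in RAM        */
    DMA2_Stream1->NDTR = 1u;                               /* one word per request               */
    DMA2_Stream1->CR   = (6u << 25)                        /* CHSEL = 6 -> TIM1_CH1 (verify!)    */
                       | DMA_SxCR_MSIZE_1                  /* 32-bit memory size                 */
                       | DMA_SxCR_PSIZE_1                  /* 32-bit peripheral size             */
                       | DMA_SxCR_CIRC                     /* re-arm after each transfer         */
                       | DMA_SxCR_EN;

    TIM1->DIER |= TIM_DIER_CC1DE;        /* TIM1 CH1 capture (trigger pin) raises the DMA request */
}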

If you feel a post has answered your question, please click "Accept as Solution".
MMoha.3
Associate III

@TDK​ Thank you,

There is also something else that I suspect is causing the problem with my NDTR method. I'm using ST's HAL functions at the moment, and I was wondering whether the HAL functions are fast enough for this application, or whether I should switch to register-level programming.

“Fast enough” very much depends on the requirements of your application, which only you can decide.
You will at the very least need to monitor NDTR in the interrupt in order to stop the DMA at the appropriate time. Doing it quickly after the start of the interrupt seems beneficial, so eliminating HAL overhead by implementing that particular IRQ handler yourself seems smart. And stopping the DMA quickly via register access seems smart for the same reasons.
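For example, the EXTI handler itself can shrink to a couple of register accesses (sketch only; it assumes the trigger sits on EXTI line 2 and the ADC DMA uses DMA2 Stream 0):

#include "stm32f4xx.h"

volatile uint32_t trigger_ndtr;

void EXTI2_IRQHandler(void)
{
    EXTI->PR = EXTI_PR_PR2;                 /* clear the pending flag (write 1 to clear)     */
    trigger_ndtr = DMA2_Stream0->NDTR;      /* trigger position inside the circular buffer   */

    /* Stopping the stream later (e.g. from the half-buffer timer) is a single
     * register write as well:  DMA2_Stream0->CR &= ~DMA_SxCR_EN;                            */
}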
I would suggest just getting a proof of concept done, at a significantly reduced data rate, especially if you don’t have experience with the relevant features. After that, you can worry about increasing speed to the limits possible.
If you feel a post has answered your question, please click "Accept as Solution".

The first ADC conversion must be triggered. You can set up the timer like this:

Set the timer prescaler to a value that matches your sampling period (one timer "tick" should take the same time as one ADC sample).

Use the timer "enable" event as the trigger for the ADC.

Set the timer "ARR" (period/top value) to the same value as your buffer size (the timer should overflow when the buffer wraps around).

When you enable/start the timer, you also start the continuous conversion, and the timer/counter will then hold the current position in the buffer.

Choose one of the timer inputs to capture the external event (your trigger signal).

When the trigger arrives, the timer saves the buffer index to its capture register, and you have an exact timestamp.

Ending the ADC conversion can be done in multiple ways and does not need to be accurate. From NDTR you can read exactly where the ADC was stopped.

If you use the EXTI IRQ to manually read NDTR, you are prone to latency variations. Other IRQs can have a higher priority (for example SysTick) and can delay your EXTI IRQ routine.
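Putting those steps together, a register-level sketch might look like this (TIM2 chosen arbitrarily; the prescaler assumes a 72 MHz timer clock and 7.2 Msps, ARR assumes a BUF_LEN-sample buffer, and the trigger signal is assumed to be on the TIM2_CH1 pin):

#include "stm32f4xx.h"

#define BUF_LEN 8192u                      /* samples per full circular buffer (assumed)      */

static void sample_counter_init(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_TIM2EN;

    TIM2->PSC = 10u - 1u;                  /* 72 MHz / 10 = 7.2 MHz: one tick per sample      */
    TIM2->ARR = BUF_LEN - 1u;              /* counter wraps together with the buffer          */

    TIM2->CR2 = TIM_CR2_MMS_0;             /* MMS = 001: TRGO on counter enable; select the   */
                                           /* matching TIMx_TRGO as the ADC external trigger  */

    TIM2->CCMR1 = TIM_CCMR1_CC1S_0;        /* CH1 = input capture mapped to TI1 (trigger pin) */
    TIM2->CCER  = TIM_CCER_CC1E;           /* capture on the rising edge of the trigger       */
    TIM2->DIER  = TIM_DIER_CC1IE;          /* interrupt (or a DMA request) on capture         */

    NVIC_EnableIRQ(TIM2_IRQn);             /* the capture handler then reads CCR1             */
    TIM2->CR1 |= TIM_CR1_CEN;              /* enabling TIM2 also starts the ADC via TRGO      */
}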

I just found the reason why my waveform kept moving on the screen.

It was because I was using HAL_NVIC_DisableIRQ & HAL_NVIC_EnableIRQ to enable/disable the EXTI interrupt. But apparently these functions introduce a lot of delay and cannot be used for anything above ~100 Hz. So I had to use the following code to mask/unmask my interrupt instead:

EXTI->IMR &= ~(1U<<2); //mask EXTI 2
EXTI->IMR |= (1U<<2); //unmask EXTI 2

So now my waveform stays still on the screen with trigger frequencies of up to ~1 MHz.

These two functions take only a few cycles. The delay introduced by calling them will be in the range of a few tens of nanoseconds...