2025-12-16 2:04 AM
I would like to implement a delayed signal output with an STM32 MCU.
So I use the original input signal as the source of an EXTI interrupt.
The EXTI interrupt's callback starts a timer.
When the timer callback fires after the desired delay, it sets a GPIO pin HIGH.
The desired amount of delay is roughly achieved, but the moment the pin goes HIGH varies each time, and it takes about 1.5 to 1.8 µs between the EXTI event and the timer.
I would like to know whether this idea can be implemented on the STM32 MCU series.
I am also posting my clock configuration picture.
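Roughly, my implementation looks like this (a simplified sketch; the input pin, output pin and timer instance are placeholders, and the exact HAL callback names can differ between STM32 families):

extern TIM_HandleTypeDef htim2; /* placeholder delay timer */

/* EXTI callback: the input edge restarts the delay timer */
void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    if (GPIO_Pin == GPIO_PIN_0) /* placeholder input pin */
    {
        __HAL_TIM_SET_COUNTER(&htim2, 0);
        HAL_TIM_Base_Start_IT(&htim2);
    }
}

/* Timer update callback: after the programmed delay, drive the output HIGH */
void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance == TIM2)
    {
        HAL_GPIO_WritePin(GPIOB, GPIO_PIN_1, GPIO_PIN_SET); /* placeholder output pin */
        HAL_TIM_Base_Stop_IT(htim);
    }
}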
2025-12-16 2:27 AM - edited 2025-12-16 6:58 AM
The MCU core seems to be running at 500 MHz and the timers at 250 MHz.
That's a 2 ns and a 4 ns clock period, respectively.
Then there is interrupt latency, synchronization between peripheral and core, flash latency, and the overhead of the I/O functions. You might be able to get a latency of about 100 CPU cycles, give or take; I doubt it can be lower than 50.
So you can get perhaps a 200 nanosecond delay using your approach.
There might be a way to use peripherals only to get from an input to an output, but you are still limited by the clock period of 4 nanoseconds.
And there is the delay of the GPIO-pin push-pull circuit. It cannot instantly switch on or off.
A 1 nanosecond delay is impossible using any clocked logic in the MCU, unless the MCU has a delay line you can use (I know there are internal programmable delay lines for quad-SPI and octo-SPI flash, but I don't think they are available externally).
You can delay a digital signal externally with some logic gates, e.g. by connecting an even number of inverters in series, or by using an RC circuit with a Schmitt trigger.
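As a rough back-of-the-envelope for the RC option (assuming the Schmitt trigger threshold sits near half the supply): t_delay ≈ R·C·ln(2) ≈ 0.7·R·C, so for example R = 100 Ω and C = 100 pF give a delay of roughly 7 ns.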
What is your use case?
2025-12-16 5:41 AM - edited 2025-12-16 5:51 AM
Hello @JinJin
If I understood your question right,
then based on Table 54 of your product's datasheet,
if you choose "Very High" as the GPIO maximum output speed, you can reach toggling times in the nanosecond range, but you will need a very clean implementation.
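For reference, that is just the Speed field of the HAL init struct, roughly like this (port and pin are placeholders):

GPIO_InitTypeDef gi = {0};
gi.Pin = GPIO_PIN_8; /* placeholder pin */
gi.Mode = GPIO_MODE_OUTPUT_PP;
gi.Pull = GPIO_NOPULL;
gi.Speed = GPIO_SPEED_FREQ_VERY_HIGH; /* "Very High" output speed */
HAL_GPIO_Init(GPIOA, &gi);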
Hope that answered your question
Gyessine
2025-12-16 6:22 AM
> A 1 nanosecond delay is impossible.
I would add "impossible to manage via core instructions". But it remains correct.
A few nanoseconds could be achieved in hardware, with more or less effort.
But I would ask the OP @JinJin what he requires this delay for, i.e. what the delayed and un-delayed signals are supposed to achieve in combination.
2025-12-16 6:49 AM
yep, the OP didn't exactly say "1 ns", so maybe he's happy with 100 ns?
But as char[] said, in the "few ns" area, I'd try with hardware gates.
With a few more ns, maybe this can be done with a timer using an external trigger / clock via pin.
2025-12-16 6:55 AM
OP needs to provide a far more detailed description of what he wants to achieve before any useful answers can be given.
2025-12-16 6:20 PM
I have already completed the code implementation using the EXTI → Timer → GPIO HIGH flow.
With the prescaler configuration shown in the attached clock configuration image, I am able to achieve a nominal delay resolution of approximately 4 ns.
However, I suspect that the latency introduced by the EXTI interrupt and the timer interrupt is not constant on each occurrence. As a result, the GPIO rising edge timing varies within a range of approximately 500 ns.
Therefore, my main question is whether there is any method to minimize this jitter within the STM32 MCU architecture, or if this behavior is an inherent limitation of STM32 when using an interrupt-based approach.
Any clarification on whether this level of jitter can be reduced further, or whether a different hardware-based approach is required, would be appreciated.
I appreciate all the efforts and suggestions to help clarify this issue.
2025-12-16 6:21 PM
PS: the GPIO output speed is already set to Very High.
2025-12-16 7:34 PM
Configure the timer in slave trigger mode with the ETRF pin as the trigger, and use one-pulse PWM. There'd be a latency of about 20-30 ns, so that's the minimum delay, with 2 ns resolution. No interrupts, everything is done in hardware.
Example code I have for an STM32G474RE:
static void tim3_config(void)
{
    TIM_HandleTypeDef htim234 = {0};            /* zero-init so HAL sees a RESET state */
    TIM_ClockConfigTypeDef sClockSourceConfig = {0};
    TIM_MasterConfigTypeDef sMasterConfig = {0};
    TIM_OC_InitTypeDef sConfigOC = {0};
    TIM_SlaveConfigTypeDef sSlaveConfig = {0};

    __HAL_RCC_TIM3_CLK_ENABLE();

    /* Base timer: full timer clock (no prescaler); Period sets where the pulse ends */
    htim234.Instance = TIM3;
    htim234.Init.Prescaler = 0;
    htim234.Init.CounterMode = TIM_COUNTERMODE_UP;
    htim234.Init.Period = 352;
    htim234.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;
    htim234.Init.AutoReloadPreload = TIM_AUTORELOAD_PRELOAD_ENABLE;
    HAL_TIM_Base_Init(&htim234);

    sClockSourceConfig.ClockSource = TIM_CLOCKSOURCE_INTERNAL;
    HAL_TIM_ConfigClockSource(&htim234, &sClockSourceConfig);

    /* One-pulse mode: the counter runs once per trigger, then stops */
    HAL_TIM_PWM_Init(&htim234);
    HAL_TIM_OnePulse_Init(&htim234, TIM_OPMODE_SINGLE);

    /* Slave trigger mode: an edge on the ETR pin (ETRF) starts the counter */
    sSlaveConfig.SlaveMode = TIM_SLAVEMODE_TRIGGER;
    sSlaveConfig.InputTrigger = TIM_TS_ETRF;
    sSlaveConfig.TriggerPolarity = TIM_TRIGGERPOLARITY_INVERTED;
    sSlaveConfig.TriggerPrescaler = TIM_TRIGGERPRESCALER_DIV1;
    sSlaveConfig.TriggerFilter = 0;
    HAL_TIM_SlaveConfigSynchro(&htim234, &sSlaveConfig);

    sMasterConfig.MasterOutputTrigger = TIM_TRGO_OC4REF; // TIM_TRGO_UPDATE;
    sMasterConfig.MasterSlaveMode = TIM_MASTERSLAVEMODE_DISABLE;
    HAL_TIMEx_MasterConfigSynchronization(&htim234, &sMasterConfig);

    /* PWM2 on CH4: output goes active once CNT reaches Pulse (i.e. after the delay) */
    sConfigOC.OCMode = TIM_OCMODE_PWM2;
    sConfigOC.Pulse = 1;
    sConfigOC.OCPolarity = TIM_OCPOLARITY_LOW;
    sConfigOC.OCFastMode = TIM_OCFAST_DISABLE;
    HAL_TIM_PWM_ConfigChannel(&htim234, &sConfigOC, TIM_CHANNEL_4);

    HAL_TIM_Base_Start(&htim234);
    HAL_TIM_PWM_Start(&htim234, TIM_CHANNEL_4);
}
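For this to actually reach the pins, the ETR input and the CH4 output also have to be mapped to the timer in GPIO alternate-function mode, roughly like below (the specific pins and AF numbers are my assumptions for the G474, please verify them against the alternate-function table in the datasheet):

static void tim3_gpio_config(void)
{
    GPIO_InitTypeDef gi = {0};

    __HAL_RCC_GPIOC_CLK_ENABLE();
    __HAL_RCC_GPIOD_CLK_ENABLE();

    /* TIM3_CH4 output, e.g. PC9 (assumed pin/AF, check the datasheet) */
    gi.Pin = GPIO_PIN_9;
    gi.Mode = GPIO_MODE_AF_PP;
    gi.Pull = GPIO_NOPULL;
    gi.Speed = GPIO_SPEED_FREQ_VERY_HIGH;
    gi.Alternate = GPIO_AF2_TIM3;
    HAL_GPIO_Init(GPIOC, &gi);

    /* TIM3_ETR trigger input, e.g. PD2 (assumed pin/AF, check the datasheet) */
    gi.Pin = GPIO_PIN_2;
    gi.Mode = GPIO_MODE_AF_PP;
    HAL_GPIO_Init(GPIOD, &gi);
}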
2025-12-16 7:38 PM - edited 2025-12-16 7:39 PM
Put code and vector table in RAM, don't use other interrupts, and the interrupt jitter should be a couple cycles max. Of course, writing to GPIO has latency of its own.
This would be better implemented on a simpler chip such as the F4 where you could eliminate jitter.
Probably there are better ways of implementing this than a CPU-based approach.
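For reference, a rough sketch of "code and vector table in RAM" (the section name, vector count and symbols are assumptions based on the usual CubeIDE startup and linker files, so adjust them to your project):

#include <string.h>
#include "main.h" /* pulls in the device header with SCB/VTOR */

/* Keep the hot ISR in RAM; the default CubeIDE linker scripts place .RamFunc in RAM */
__attribute__((section(".RamFunc"))) void EXTI0_IRQHandler(void)
{
    /* ... clear the pending flag and start the delay timer ... */
}

/* RAM copy of the vector table; VTOR needs the table aligned to its size
   rounded up to a power of two (128 words -> 512 bytes assumed here). */
__attribute__((aligned(512))) static uint32_t ram_vectors[128];

void relocate_vector_table(void)
{
    extern uint32_t g_pfnVectors[]; /* flash vector table from the startup file */
    memcpy(ram_vectors, g_pfnVectors, sizeof(ram_vectors));
    __disable_irq();
    SCB->VTOR = (uint32_t)ram_vectors; /* point the core at the RAM copy */
    __DSB();
    __enable_irq();
}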