2025-04-29 7:41 AM
Hello all:
I want to implement a generic counter to count the number of pulses on an input, but after debouncing.
I assume that mechanical switches for human-interface purposes need debouncing in the milliseconds range, say 1 ms to 20 ms. My SysTick interrupt runs every millisecond, so I can sample the input there (with an appropriate prescaler implemented in software), which should take care of debouncing.
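In the SysTick handler I would do something like this (a sketch; all names are mine, and the threshold acts as the software prescaler mentioned above):

```c
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    bool stable;        /* last accepted (debounced) level          */
    uint8_t count;      /* consecutive samples at the new level     */
    uint8_t threshold;  /* required stable samples, e.g. 20 = 20 ms */
    uint32_t pulses;    /* count of debounced rising edges          */
} debounce_t;

void debounce_init(debounce_t *d, uint8_t threshold_ms, bool initial)
{
    d->stable = initial;
    d->count = 0;
    d->threshold = threshold_ms;
    d->pulses = 0;
}

/* Call once per SysTick (1 ms) with the raw input level. */
void debounce_sample(debounce_t *d, bool raw)
{
    if (raw == d->stable) {
        d->count = 0;            /* no change pending, reset the run */
        return;
    }
    if (++d->count >= d->threshold) {
        d->stable = raw;         /* level was stable long enough: accept */
        d->count = 0;
        if (raw)
            d->pulses++;         /* count only debounced rising edges */
    }
}
```

Bounces reset the run counter, so a pulse is only counted once the input has held the new level for the full threshold.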
If the input comes from a clean, high-speed signal, I can use a hardware timer like TIM1 to count pulses on input TIM1_CH1. I can do some level of debouncing with its built-in hardware input filter, but if the signal is clean, then probably no debouncing is necessary.
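For reference, the TIM1 setup I mean would be roughly this (a register-level sketch, untested; CMSIS device header assumed, GPIO/alternate-function setup for TIM1_CH1 omitted):

```c
RCC->APB2ENR |= RCC_APB2ENR_TIM1EN;  /* enable the TIM1 clock              */
TIM1->CCMR1   = (1u << 0)            /* CC1S = 01: IC1 mapped on TI1       */
              | (0xFu << 4);         /* IC1F = 1111: filter fDTS/32, N = 8 */
TIM1->SMCR    = (5u << 4)            /* TS = 101: trigger input = TI1FP1   */
              | (7u << 0);           /* SMS = 111: external clock mode 1   */
TIM1->CR1     = (2u << 8)            /* CKD = 10: tDTS = 4 * tCK_INT       */
              | (1u << 0);           /* CEN: start counting                */
/* TIM1->CNT then increments on each (filtered) pulse on TIM1_CH1. */
```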
The trouble is at intermediate input frequencies. I am thinking of a signal source that is faster than the switches typical of human interfaces, but not completely clean (with some bouncing). Is that a plausible scenario at all?
I am using an STM32F407. I have set APB1 to its maximum speed of 42 MHz, and APB2 to 84 MHz. This way, I can drive SPI interfaces, high-resolution timers etc. at a very high speed or with great precision.
The STM32 timer input filters are ultimately clocked from tCK_INT, which derives from APB1 or APB2, depending on the timer. Unfortunately, there is no dedicated prescaler there, so tCK_INT runs at the full 84 MHz in my case. You can scale down a little with CKD, so that tDTS = 4 × tCK_INT, and then set the filter to fSAMPLING = fDTS/32 with N = 8. That means I can scale the filter sampling frequency down by a factor of at most 4 × 32 = 128, in my case 84 MHz / 128 ≈ 656 kHz, but no further. With a sample count of N = 8, this means I can filter down to a stability period of around 12 µs. But not further, say to 50 µs.
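To double-check that arithmetic (84 MHz tCK_INT as stated above, CKD divider 4, filter divider 32, N = 8; the function names are just for this calculation):

```c
#include <stdint.h>

/* fSAMPLING = tCK_INT frequency / CKD divider / filter divider
   (filter divider 32 corresponds to IC1F = 1111). */
uint32_t filter_sampling_hz(uint32_t f_ck_int_hz,
                            uint32_t ckd_div, uint32_t filter_div)
{
    return f_ck_int_hz / ckd_div / filter_div;
}

/* Minimum stability period in microseconds: the input must hold its
   level for N consecutive filter samples to pass the filter. */
double stability_period_us(uint32_t f_sampling_hz, uint32_t n_samples)
{
    return 1e6 * (double)n_samples / (double)f_sampling_hz;
}
```

With 84 MHz / 4 / 32 = 656250 Hz and N = 8, the stability period comes out at about 12.2 µs.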
So the question is: how do I debounce the input signal in the sampling-frequency range 1 kHz to 656 kHz, that is, for stability periods of 1000 µs down to 12 µs?
I could increase the SysTick frequency in order to sample the input more often in software, but if I increase it too much, general firmware performance would suffer.
I could use a dedicated timer interrupt just to sample the input more often in software, but again, if interrupts come too often, general firmware performance would suffer.
I could drastically reduce the APB1 and/or APB2 frequencies, but then other peripherals would be affected. At the very least, I would have to adjust the timing parameters throughout my firmware, which is a lot of work.
Is there any other way to sample an input quickly in hardware, at regular intervals?
Can I use a hardware timer to trigger some automatic sampling? Would I need to use DMA to store the input into a memory buffer? Can I have that buffer automatically wrap around?
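Something like this is what I have in mind for the processing side, assuming a timer update event drives a DMA stream (in circular mode, so the buffer wraps automatically) that copies snapshots of GPIOx->IDR into memory at a fixed rate. A sketch, all names illustrative:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Scan a buffer of port snapshots taken at a fixed sampling rate and
   count rising edges of `pin_mask` that stay high for at least
   `n_stable` consecutive samples (i.e. after debouncing). */
uint32_t count_debounced_pulses(const uint16_t *buf, size_t len,
                                uint16_t pin_mask, unsigned n_stable)
{
    uint32_t pulses = 0;
    unsigned run = 0;     /* consecutive samples at the candidate level */
    bool level = false;   /* currently accepted (debounced) level       */

    for (size_t i = 0; i < len; i++) {
        bool raw = (buf[i] & pin_mask) != 0;
        if (raw == level) {
            run = 0;                  /* bounce back: reset the run */
            continue;
        }
        if (++run >= n_stable) {
            level = raw;              /* stable long enough: accept */
            run = 0;
            if (level)
                pulses++;             /* count debounced rising edges */
        }
    }
    return pulses;
}
```

The CPU would then only have to walk the buffer periodically (or on the DMA half/full-transfer interrupts) instead of taking an interrupt per sample.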
Another way would be to implement the debouncing in hardware, with a simple low-pass filter, but then the debouncing time would not be configurable in software.
Thanks in advance,
rdiez
2025-04-29 9:02 AM
Do you have another free 32-bit timer?
If yes, you could configure it to tick every 1 µs, with no interrupts (except maybe the overflow, as information), and ARR set to the 32-bit maximum. Just let it run free in the background.
Each time you sample the I/O (or whatever), also save the current 1 µs tick time (= CNT), and so on...
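To make that concrete, a sketch of such timestamp-based debouncing, assuming a free-running 32-bit timer at 1 MHz (e.g. TIM2 or TIM5 on the F407) whose CNT is passed in as `now_us`; the names are illustrative:

```c
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    bool stable;           /* last accepted (debounced) level    */
    bool candidate;        /* level currently waiting to settle  */
    uint32_t change_us;    /* tick when the candidate appeared   */
    uint32_t debounce_us;  /* required stability period          */
    uint32_t pulses;       /* count of debounced rising edges    */
} ts_debounce_t;

void ts_debounce_update(ts_debounce_t *d, bool raw, uint32_t now_us)
{
    if (raw != d->candidate) {
        d->candidate = raw;      /* level changed: restart the window */
        d->change_us = now_us;
        return;
    }
    /* Unsigned subtraction handles 32-bit counter wrap-around. */
    if (raw != d->stable && (now_us - d->change_us) >= d->debounce_us) {
        d->stable = raw;
        if (raw)
            d->pulses++;         /* count debounced rising edges */
    }
}
```

The nice part is that `debounce_us` is freely configurable at run time, and no extra interrupts are needed: the input only has to be sampled often enough, at whatever irregular rate is convenient.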