F411 How to Generate 1 micro second interrupt

HDaji.1
Senior

I am using F411. The sample code in F411 package can generate timer interrupt in terms of millisecond. I tried to play with some of the parameters like Prescaler, etc, but I could not make any difference. Can anyone give some advice on the relations between system clock, timer counter, etc?

9 REPLIES
ssipa.1
Associate II

Hello @HDaji.1, try this link:

https://www.microchip.com/forums/m1091760.aspx

MM..1
Chief II

Your idea is possible, but it wastes your MCU's power, and you need a clock speed well above the interrupt rate.

To explain:

Say the MCU clock is 48 MHz.

Say the interrupt code needs 36 instruction cycles to execute.

Your CPU then spends almost all of its time in the ISR, and your normal code gets only 12 instruction cycles per microsecond.

= You degrade a 48 MIPS CPU to 12 MIPS.

People typically don't approach problems this way.

Depending on the cycle count of the interrupt code, you'll likely saturate the core at a few hundred kHz.

Do things at high rates with HARDWARE; if you need interrupts, decimate them so that you only need to attend to things every 10th or 100th iteration.

For delays, use a TIM in maximal mode, i.e. counting 0 to 0xFFFF or 0xFFFFFFFF, without interrupts. Set the prescaler to clock CNT at 1 MHz (or higher for finer resolution), then watch the count advance: spin in a loop computing the delta between the start and current values to see if X microseconds have elapsed. Other interrupts might steal time from your loop, but it will exit as quickly as possible.

On the M3, M4 and M7 there is a processor cycle counter in the DWT unit called CYCCNT: a 32-bit counter running at the CPU frequency, useful because it is fast and doesn't consume TIM resources. There is also a 24-bit down-counter in SysTick, but it is hard to use: the range tends to be sub-optimal, and it is awkward because it is normally configured to generate 1 ms tick interrupts.

S.Ma
Principal

Say your core runs at 96 MHz and you use a simple timer with a prescaler of divide-by-2 then divide-by-48, yielding a 1 MHz timer clock. Then you can set the timer overflow/update period to a 1..65000 µs delay and enable the TIM update interrupt in one-shot mode. You get the interrupt and the pending flag. If the delay is frequently 1..10 µs, it will consume CPU bandwidth. And if you use this delay in many places in the source code, be careful about reentrant calls, since it relies on a single physical timer.

HDaji.1
Senior

Thank you, guys, for your generous inputs.

In my app, the MCU needs to trigger an external ADC and wait around 1 µs before using SPI to read the result. The ADC chip has one pin that indicates ADC conversion complete (rising edge 0 --> 1), which I use as a trigger. Somehow my code sometimes misses the trigger. I thought of using a timer, but now that seems like a bad idea.

Maybe I should use some code to check the pin status.

MM..1
Chief II

That is quite different from the subject of your original question.

Here you can use two methods: offload from the MCU, or poll and wait:

  1. Define an EXTI on the signal from the ADC, and start a timer to check for a timeout (store the current TCNT when you start the ADC, or for full offload set a CCx interrupt at TCNT + delay clocks) = the SPI read is then started either on the EXTI or on the timeout
  2. Poll in a while loop after starting the conversion, waiting for the GPIO to change or a timeout counter to expire

> The ADC chip has one pin to indicate ADC conversion complete (rising edge 0 --> 1), which I

> use as a trigger. Somehow sometimes my code may miss the trigger. ...

> Maybe I should use some code to check the pin status.

Definitely work on this and check your code.

HDaji.1
Senior

Thank you, guys for your inputs. A lot for me to digest.

void delay_us() {
    SysTick->VAL = 9000;
    while (SysTick->VAL != 0) {}
}

I try the above method. However, the delay I get does not change much for VAL values 2000 to 9000. Why is that?