
Interrupt latency, schedule a recurring task, repeat a task

JDirk.1
Senior
  • Microcontroller: STM32L051R8Tx (MSI used as system clock, set to 4.194 MHz)
  • Board: custom design (no Nucleo etc.)
  • The LPTIM1 is clocked by an external LSE (32.768 kHz, ±10 ppm).
  • IDE: Keil uVision V5.36.0.0
  • Compiler: ARM Compiler “Use default compiler Version 5”

I scheduled a recurring task with the help of LPTIM1 and its IRQ. The timer period is set to 500 ms; the task's processing time is around 130 ms (measured with TIMER21). I observed a lag that becomes visible over time: after 4080 interrupts (34:00 minutes, 4080/2/60) the timer is behind by ~9 seconds (in my application, a few hundred milliseconds wouldn't be a problem at all). If this time were lost evenly across interrupts, it would mean 9 s / 4080 ≈ 2.2 ms per interrupt. That seems like an unrealistically large interrupt latency, doesn't it?

Secondly, if I replace the task with a delay function of the same duration (or simply comment the task out), there is no noticeable lag at all: 34:00 minutes on the microcontroller equal 34:00 minutes on my stopwatch.

Why does the task affect the accuracy? I think I am missing something. Any hint is welcome; many thanks in advance.

 

// LPTIM1_IRQHandler
void LPTIM1_IRQHandler(void)
{
    // ES0251 errata: flags must not be cleared outside the interrupt subroutine.
    if(LPTIM1->ISR & LPTIM_ISR_ARRM) // CNT value reached LPTIM_ARR register's value?
    {
        LPTIM1->ICR = LPTIM_ICR_ARRMCF; // Writing 1 to this bit clears the ARRM
                                        // flag in the LPTIM_ISR register.
        g_interrupt_flag = 1;
    }
} // End: void LPTIM1_IRQHandler(void)
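
Note: g_interrupt_flag is shared between the handler and main(), so it must be declared volatile, otherwise the compiler may optimize away the polling in main(). The declaration is not part of the listing above, so this is a sketch of the assumed definition:

volatile uint8_t g_interrupt_flag = 0; // Written in the ISR, cleared in main()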

 

int main(void)
{
    // Select MSI range 6 (around 4.194 MHz). MSIRANGE[2:0] are bits 15:13
    // of RCC_ICSCR; a read-modify-write is used because OR-ing 110 into
    // the reset value 101 would yield the invalid setting 111.
    RCC->ICSCR = (RCC->ICSCR & ~RCC_ICSCR_MSIRANGE) | RCC_ICSCR_MSIRANGE_6;
    
    SystemCoreClockUpdate(); // Only to set the global variable SystemCoreClock?
    
    __disable_irq(); // Global disable IRQs
    
    RCC->IOPENR |= 0x0001; // Enable GPIOA clock, USER LED (PA5)
    RCC->APB2ENR |= 0x0001; // Enable SYSCFG clock (for delay_ms())
    
    GPIOA->MODER &= ~0x00000C00; // Clear pin mode, LED PIN 21 (PA5)
    GPIOA->MODER |= 0x00000400; // Set pin to output mode, LED PIN 21 (PA5)
    GPIOA->BSRR = 0x00200000; // Turn off LED, PIN 21 (PA5)
    
    tick_count_init(TIMER21_PSC); // Needed only during debug (Dev. stage)
    
    usart2_init(); // Needed only during debug (Dev. stage)
    
    i2c2_init(); // Used for sensor measurement (task)
    
    // LPTIM1 is used to schedule measurement
    // One measurement every 500ms
    low_power_timer_init();
    
    NVIC_SetPriority(LPTIM1_IRQn, 1);
    NVIC_EnableIRQ(LPTIM1_IRQn); // Enable LPTIM1 interrupt

    __enable_irq(); // Global enable IRQs
    
    while(1)
    {
        printf("please enter zero (0) \r\n");
        scanf("%d", &g_user_input); // Wait for user input
        
        switch(g_user_input)
        {
            case 0: 
            {
                low_power_timer_disable(); // Stop (previous) measurement
                
                low_power_timer_enable(); // Start a new measurement, 
                                          // interrupt every 500ms
                
                printf("new measurement cmd received\r\n");
                
                g_datapoint_count = 0; // Restart count of measurements
                
                while(g_datapoint_count < 4080) // 4080/2/60 = 34:00 minutes
                {
                    // It takes ~143 ms after the user enters "zero" on the
                    // keyboard for execution to reach this point
                    
                    if(g_interrupt_flag) // Occurs every 500ms
                    {
//                        tick_count_start(); // Timer to measure duration
                        
                        g_interrupt_flag = 0;
                        g_datapoint_count += 1;
                        
                        print_time_on_terminal(0);
                        
                        GPIOA->ODR ^= 0x20; // blink LED
                        
                        task(g_iterations); // takes ~129ms
//                        delay_ms(129); // Needed only during debug
                        
//                        tick_count_stop(TIMER21_MULTIPLIER);
//                        //printf("counts:  %i \r\n", g_counts);
//                        //printf("time_ms: %.0i ms\r\n", g_time_ms);
                        
//                        if(g_time_ms > max_time)
//                        {
//                            max_time = g_time_ms;
//                            printf("max_time: %.0i ms \r\n", max_time);
//                        }  
                    } // End: if(g_interrupt_flag)  
                } // End: while(g_datapoint_count < 4080)
                break;
            } // End: case 0:
        } // End: switch(g_user_input) 
    } // End: while(1)
} // End: main()
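
Since low_power_timer_init() and low_power_timer_enable() are not shown, here is a minimal sketch of the kind of configuration assumed above. The values are my assumptions, not taken from the original code: LSE as LPTIM1 kernel clock, prescaler /32 for a 1024 Hz count rate, continuous mode. Note the period is ARR + 1 counts, an easy place for an off-by-one:

void low_power_timer_init(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_LPTIM1EN;  // Enable LPTIM1 peripheral clock
    RCC->CCIPR |= RCC_CCIPR_LPTIM1SEL;     // 11: LSE as LPTIM1 kernel clock
    LPTIM1->CFGR = LPTIM_CFGR_PRESC_2      // 101: prescaler /32,
                 | LPTIM_CFGR_PRESC_0;     // 32768 Hz / 32 = 1024 Hz
    LPTIM1->IER = LPTIM_IER_ARRMIE;        // ARR-match interrupt; IER may only
                                           // be written while LPTIM is disabled
}

void low_power_timer_enable(void)
{
    LPTIM1->CR = LPTIM_CR_ENABLE;          // ARR may only be written while
                                           // LPTIM is enabled
    LPTIM1->ARR = 511;                     // Period = ARR + 1 = 512 counts
                                           // = 512 / 1024 Hz = 500 ms
    LPTIM1->CR |= LPTIM_CR_CNTSTRT;        // Start counting, continuous mode
}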

 

Accepted solution: see TDK's reply below.
Sarra.S
ST Employee

Hello @JDirk.1

In your code, the ISR for LPTIM1 sets a flag and the main loop processes that flag. One suggestion is to keep the ISR limited to setting this flag and to perform the necessary processing in the main loop; this reduces the processing time within the ISR and can minimize the interrupt latency.

 


Hello Sarra,

many thanks for your quick reply. Could you please be a little more concrete? Currently, in the IRQ I basically do three things:

1) check the interrupt source, i.e. make sure the interrupt was caused by CNT == ARR

2) clear the LPTIM interrupt flag so the IRQ is not re-entered

3) set a flag to inform the main loop that 500ms passed and to execute task()

What do you suggest will remain inside the IRQ? What should be moved to main()?

How do you explain that the lag only appears when this call is executed?

task(g_iterations); // takes ~129ms

 

TDK
Guru

Your program logic seems fine, but the timing seems a bit convoluted.

Toggle a pin in the interrupt and verify that the timer frequency is sufficiently accurate. Make sure you don't have off-by-one errors in your LPTIM1 setup, which this very much sounds like. After you verify that, add in the other logic.

LPTIM1 won't lose time in the interrupt. The timer counts up at a steady rate unless you do something to change the CNT register.
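
For this test the handler can be reduced to something like the following (a sketch, reusing the PA5 LED that main() already configures):

void LPTIM1_IRQHandler(void)
{
    if(LPTIM1->ISR & LPTIM_ISR_ARRM)
    {
        LPTIM1->ICR = LPTIM_ICR_ARRMCF; // Clear ARRM
        GPIOA->ODR ^= 0x20;             // Toggle PA5: on a scope, the edges
                                        // should be exactly 500 ms apart,
                                        // independent of what main() does
    }
}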


Many thanks for your reply.

I moved the toggling of PIN 21 (LED) into the interrupt handler and took a couple of random measurements during operation with a handheld oscilloscope (see screenshots attached).

With the call task(g_iterations); included:

  • The measured period was 500 ms, but sometimes also 502 ms.

With the call task(g_iterations); commented out:

  • All measurements were exactly 500 ms.

Following your recommendation, I also checked the LPTIM1 setup again and made some modifications (trial and error).

By changing the LSE oscillator drive capability bits from "Lowest drive" to "Medium low drive", the issue disappeared. I think we have identified the cause.

I saw that the drive level calculation is described in AN2867, but I didn't spend much time on it because the timer appeared to be working well (at least I thought so).
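
In register terms, the change amounts to something like the sketch below (not the exact code from my init routine; per RM0377, LSEDRV can only be written while the LSE is off, and the RTC-domain bits in RCC_CSR require DBP access):

RCC->APB1ENR |= RCC_APB1ENR_PWREN;   // Enable PWR clock for DBP access
PWR->CR |= PWR_CR_DBP;               // Unlock writes to RTC-domain registers
RCC->CSR &= ~RCC_CSR_LSEON;          // LSE must be off to change the drive
while(RCC->CSR & RCC_CSR_LSERDY);    // Wait until the oscillator has stopped
RCC->CSR &= ~RCC_CSR_LSEDRV;         // Clear LSEDRV[1:0] (00 = lowest drive)
RCC->CSR |= RCC_CSR_LSEDRV_0;        // 01 = medium low drive
RCC->CSR |= RCC_CSR_LSEON;           // Restart the LSE
while(!(RCC->CSR & RCC_CSR_LSERDY)); // Wait for a stable clock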

For me the error was hard to identify because the issue only appeared after adding further functionality to main() -> task();

Does someone know how these things fit together?

TDK
Guru

Perhaps enable the LSE CSS to see if it's getting tripped. The details on how sensitive the LSE CSS is are lacking, however.

If the task is causing noise that affects the LSE pins, that could be causing the issue. Grasping at straws a bit here, but it could be an explanation, especially since it seems to be drive strength related.
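
Something along these lines (a sketch; RM0377 calls the bit CSSLSEON, while the CMSIS headers name it RCC_CSR_LSECSSON, and it may only be set once the LSE is ready):

while(!(RCC->CSR & RCC_CSR_LSERDY)); // CSS may only be armed with LSE ready
RCC->CSR |= RCC_CSR_LSECSSON;        // Arm the clock security system on LSE

// Later, poll for a latched failure (an interrupt via EXTI is also possible):
if(RCC->CSR & RCC_CSR_LSECSSD)       // CSSLSED: LSE failure detected?
{
    printf("LSE CSS tripped\r\n");
}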


Perhaps enable the LSE CSS to see if it's getting tripped. The details on how sensitive the LSE CSS is are lacking, however.

-> I will investigate it as soon as time allows. As it is working now, I need to recover some "lost" time.

If the task is causing noise that affects the LSE pins, that could be causing the issue. Grasping at straws a bit here, but it could be an explanation, especially since it seems to be drive strength related.

-> The task is doing I2C communication (reading a sensor). What do you mean by "LSE pins"? Are you referring to OSC32_IN and OSC32_OUT?