Will RTC accuracy decrease in the STM32L412KB if we calibrate its prescalers more frequently?

NNagp.1
Associate

We are using the RTC and its alarm in our project. We read data from a sensor at a specific interval, which can be anywhere between 10 minutes and 24 hours.

We use the LSI as the RTC clock source. We measure the LSI frequency with a timer, as described in the application note, and then set AsynchPrediv = 127 and SynchPrediv = (LSIfrequency / 128) - 1 to obtain a 1 Hz calendar clock. After reading the sensor, the MCU enters Standby mode, and the RTC alarm wakes it up for the next reading.
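In STM32Cube HAL terms, that prescaler setup looks roughly like the sketch below. lsi_freq is assumed to hold the measured LSI frequency in Hz; the remaining init fields are typical defaults, not necessarily our exact code.

```c
#include "stm32l4xx_hal.h"

/* Sketch only: configure the RTC for a 1 Hz calendar clock from the measured LSI.
 * lsi_freq is assumed to come from the timer-based LSI measurement. */
extern uint32_t lsi_freq;          /* measured LSI frequency in Hz */
RTC_HandleTypeDef hrtc;

void rtc_configure(void)
{
    hrtc.Instance            = RTC;
    hrtc.Init.HourFormat     = RTC_HOURFORMAT_24;
    hrtc.Init.AsynchPrediv   = 127;                        /* PREDIV_A + 1 = 128 */
    hrtc.Init.SynchPrediv    = (lsi_freq / 128U) - 1U;     /* PREDIV_S + 1 = LSI / 128 */
    hrtc.Init.OutPut         = RTC_OUTPUT_DISABLE;
    hrtc.Init.OutPutRemap    = RTC_OUTPUT_REMAP_NONE;
    hrtc.Init.OutPutPolarity = RTC_OUTPUT_POLARITY_HIGH;
    hrtc.Init.OutPutType     = RTC_OUTPUT_TYPE_OPENDRAIN;

    if (HAL_RTC_Init(&hrtc) != HAL_OK)
    {
        /* error handling omitted */
    }
}
```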

However, when the application runs for a long time, the RTC shows an inaccurate value, probably because of variation in the LSI frequency. We therefore added a second alarm that re-calibrates the RTC prescalers every 5 seconds, but doing this reduces the accuracy even further.

Can anybody tell us where the problem is and how to improve the RTC accuracy?

1 REPLY

I am not sure the LSI is stable enough for any kind of precision timekeeping.

Nevertheless, when entering the INIT mode of the RTC, you lose the sub-second portion of the time. So you may want to read it out before entering INIT and, after the adjustment, "store" it back using the "shift" facility of the RTC, perhaps also trying to account for the time spent in the adjustment.
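A rough sketch of that idea with the HAL, assuming hrtc is the RTC handle and the new PREDIV_S value has already been written into hrtc.Init.SynchPrediv; the arithmetic and names are illustrative, not tested code:

```c
/* Sketch (untested): preserve the sub-second fraction across a prescaler update. */
RTC_TimeTypeDef t;
RTC_DateTypeDef d;

/* Read the time before touching the prescalers: SubSeconds is the down-counting
 * SS register, SecondFraction is the currently programmed PREDIV_S value. */
HAL_RTC_GetTime(&hrtc, &t, RTC_FORMAT_BIN);
HAL_RTC_GetDate(&hrtc, &d, RTC_FORMAT_BIN);   /* must follow GetTime to unlock shadow regs */

uint32_t old_ss     = t.SubSeconds;           /* counts down from PREDIV_S to 0 */
uint32_t old_prediv = t.SecondFraction;

/* ... enter INIT mode, write the new PREDIV_A/PREDIV_S values, exit INIT ... */

/* Fraction of a second that had elapsed before the update:
 * (old_prediv - old_ss) / (old_prediv + 1), rescaled to the new PREDIV_S. */
uint32_t new_prediv = hrtc.Init.SynchPrediv;
uint32_t elapsed_fs = ((old_prediv - old_ss) * (new_prediv + 1U)) / (old_prediv + 1U);

/* The shift adds ADD1S - SUBFS/(PREDIV_S+1) seconds to the calendar, so adding
 * one second and subtracting (1 - elapsed) restores roughly the lost fraction. */
HAL_RTCEx_SetSynchroShift(&hrtc, RTC_SHIFTADD1S_SET,
                          (new_prediv + 1U - elapsed_fs) & 0x7FFFU);
```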

Given that there will be some tricky corner cases, it may be simpler not to adjust the RTC at all, but to keep track of the difference between RTC time and real time.
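For example (a minimal sketch with illustrative names): leave the prescalers fixed for a nominal LSI frequency and apply a first-order software correction, based on the periodically measured LSI frequency, whenever the RTC time is actually used.

```c
#include <stdint.h>

/* Sketch only: software correction instead of reprogramming the prescalers.
 * LSI_NOMINAL_HZ is whatever frequency the prescalers were calculated for. */
#define LSI_NOMINAL_HZ  32000

static int32_t g_ppm_error;          /* measured LSI deviation from nominal, in ppm */

/* Call after each LSI measurement (e.g. on every wake-up). */
void update_correction(uint32_t lsi_measured_hz)
{
    g_ppm_error = (int32_t)(((int64_t)lsi_measured_hz - LSI_NOMINAL_HZ) * 1000000
                            / LSI_NOMINAL_HZ);
}

/* Convert an elapsed interval read from the RTC into corrected seconds.
 * If the LSI runs fast, the RTC over-counts, so the error is subtracted. */
int64_t corrected_seconds(int64_t rtc_elapsed_s)
{
    return rtc_elapsed_s - (rtc_elapsed_s * g_ppm_error) / 1000000;
}
```

This only applies the most recent ppm estimate to the whole interval, so it is a first-order correction; it avoids touching the RTC registers and therefore never disturbs the calendar or the sub-second counter.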

JW