
Please help with a general understanding of the STM implementation of RTC calibration, synchronization, and reference detection

Associate II


I'm struggling a bit to understand the full set of possible RTC accuracy improvements.

Some background: right now I'm using an STM32L051 with an external 32.768 kHz oscillator for the RTC. It works in general, but the oscillator has a ±30 ppm tolerance, which won't be good enough. I can't change the oscillator.

In the reference manual (RM0377) and the datasheet I can find some information about calibration/synchronization, and the HAL code describes some pieces as well, but I can't combine them into a picture that makes sense to me.

If I understand it right, the calibration feature (RTC_OUT_CALIB) generates an output signal (MCU-generated, either 1 Hz or 512 Hz)? Is it correct that this can either be used as a reference for other ICs, or measured with a high-precision instrument (oscilloscope) to reveal an unwanted deviation? And if I see a deviation, must I manually load a correction value into the MCU to calibrate the RTC? In my current PCB revision I don't have access to this pin to test it.

The reference manual states (section 22.2):

"Digital calibration circuit (periodic counter correction): 0.95 ppm accuracy, obtained in a calibration window of several seconds"

So is this a high-precision internal PLL-like feature that works without any external signal measurement?

And what about the next feature: reference clock detection. Is this an input pin which can be used with a high-precision clock or another IC? Could this pin be used when I initialize the device / during commissioning to calibrate the RTC against an external reference? Is it a completely automatic feature (as soon as I set the RTC register so that RTC_REFIN is used)? I'll probably only be able to use that pin under test conditions, because I need SPI2 MOSI (and there's no alternative SPI2 MOSI pin).

Next point: is the synchronization feature just another word for reference clock detection, or is it a completely different feature?

I guess it's all described in the block diagram in section 22.4.1, but I don't really understand it. What is actually possible for calibrating the RTC, and which steps are necessary (external signal, external measurement)?


(I will use either CubeIDE-generated HAL code or direct HAL/LL register access...)


There's no fancy automatic mechanism here.

Basically, you want to measure the exact frequency of the oscillator and then feed the difference back as the fine digital tuning (smooth calibration) value. You can do this by outputting the 1 Hz calibration signal and measuring it with a precision instrument. On some STM32 parts, the LSE can also be routed internally to a timer channel, so an indirect measurement can be performed by comparing it against, e.g., the primary system clock, and indirectly against any other timer input.

Locking to the reference signal requires that reference to be present all the time; there's no magical PLL or auto-correction mechanism.
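For completeness, enabling reference clock detection in HAL is a single call. This is a hedged configuration sketch, assuming a CubeMX-style hrtc handle that has already been initialized, and a 50/60 Hz reference wired to RTC_REFIN as RM0377 requires; it is not a tested implementation:

```c
/* Assumes the device HAL header (e.g. "stm32l0xx_hal.h") is included
 * and that hrtc was set up elsewhere with the default LSE prescalers,
 * which reference clock detection requires. */
extern RTC_HandleTypeDef hrtc;

void enable_reference_clock_detection(void)
{
    /* Sets RTC_CR.REFCKON so the RTC resynchronizes its sub-second
     * counter against the 50/60 Hz signal on the RTC_REFIN pin. */
    if (HAL_RTCEx_SetRefClock(&hrtc) != HAL_OK) {
        /* Handle the error (e.g. reference not configured). */
    }
}
```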


Associate II

Okay, thanks! I will check my options. Probably I'll be able to test the features on a development board where more/all pins are accessible.

ST Employee

Hi @ARich.6,

The RTC calibration process is also described in AN3371, "Using the hardware real-time clock (RTC) in STM32 F0, F2, F3, F4 and L1 series of MCUs".

It's true that this document doesn't cover the STM32L0, but it should be applicable to this product as well.


To give better visibility to answered topics, please click Accept as Solution on the reply which solved your issue or answered your question.