
Unexpected RTC Calibration changes

konay61
Associate II
Posted on July 23, 2012 at 10:38

Hi All,

I'm using an STM8L and trying to compensate the RTC for temperature changes, but the calibration value doesn't take effect as described in the application note and reference manual.

For example, when I simply set the calibration value to +488.5 ppm (the maximum in a 32-second window), I can't see the expected difference in time. The difference I see is much less than expected.

I hope to get a solution, or at least some suggestions, from you.

Thanks in advance,

Best Regards,

Kadir

#rtc-calibration
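For scale, the expected effect of a given calibration value can be estimated with a one-line helper (a C sketch of my own, not taken from the app note):

```c
/* Expected clock offset per day for a given calibration value in ppm. */
double seconds_per_day(double ppm)
{
    return ppm * 1e-6 * 86400.0;
}
```

seconds_per_day(488.5) is about 42.2, so a full-scale +488.5 ppm calibration should make the clock visibly gain time within a few hours.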
12 REPLIES
elil
Associate III
Posted on July 25, 2012 at 06:58

Recently I used the RTC calibration source code provided by ST in my project, and it worked well for me. My MCU is an STM8L151C8T6. I made some changes, but only to make the process periodic instead of a single run.

As far as I remember, the result was quite accurate. The only thing I noticed was that the calibration values (differences) were not always exactly the same: for the same part, sometimes I got the value 2, sometimes 3 and sometimes 0. I don't worry much about the drift from 2 to 3, since it is just 1 RTC cycle, but I do worry about 0. I think I will return to this issue a bit later.

If you attach a file with an exact description of your problem, maybe I can run the same experiment with my program.

Good luck ! 

konay61
Associate II
Posted on July 25, 2012 at 08:13

Hi Yevpator, 

Firstly, thanks for your attention.

I think the process of calibrating the RTC for crystal aging and temperature is almost the same on most STM parts.

I first calculated the calibration value for the crystal used in the project by measuring the frequency on the RTC 1 Hz calibration output.

Then, according to the project requirements, the RTC has to stay well calibrated in all weather conditions (-40 °C to +85 °C). So, for example, I have to find the calibration value at -40 °C and calibrate the RTC with it, but unfortunately the time still runs ahead and drifts more than expected. (The formulas given in the app note have been applied properly.)

What could be the reason for the difference?

Many thanks,

Kadir

elil
Associate III
Posted on July 25, 2012 at 10:38

Hi Kadir,

May I ask what you mean by ''The difference I see is much less than expected''?

What is the difference between the expected and the measured values?

konay61
Associate II
Posted on July 25, 2012 at 10:52

Hi,

What I meant is: although I calibrate the RTC periodically according to the temperature (-20 °C), the real time runs ahead by about 6 seconds in 3 days, whereas I expected no more than 1 second (max).

elil
Associate III
Posted on July 25, 2012 at 11:31

Kadir,

Sorry if I'm asking you stupid questions, but why do you expect 1 sec?

6 sec in 3 days ==> 2 sec in 1 day

1 day = 24 h x 60 min x 60 s = 86400 s

2 s / 86400 s = 2.31E-05, i.e. about 23 ppm

2 sec in 1 day is a drift of about 0.0023%. It is quite good, isn't it?
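The arithmetic above can be sketched as a couple of helpers (my own C illustration, nothing from the thread):

```c
/* Observed clock drift converted to parts-per-million and percent. */
double drift_ppm(double drift_s, double interval_s)
{
    return drift_s / interval_s * 1e6;
}

double drift_percent(double drift_s, double interval_s)
{
    /* 1% = 10000 ppm */
    return drift_ppm(drift_s, interval_s) / 1e4;
}
```

drift_ppm(2.0, 86400.0) gives about 23.1 ppm, i.e. roughly 0.0023%.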

Anyway, you calibrate the RTC based on an LSI measurement, but the latter depends on the HSI frequency accuracy.

1. Thus, if you need such high accuracy, maybe you also have to perform periodic HSI calibration?

2. If there are frequent and extreme temperature changes in your environment, maybe you need shorter calibration periods?

Sorry if these tips are not helpful.

konay61
Associate II
Posted on July 25, 2012 at 12:25

hi,

Sorry, I think some information about my system was missing.

I'm using the LSE (32768 Hz) as the RTC clock source. To keep the RTC adjusted to real time I have to calibrate it periodically (15-minute period), because, as you guessed, huge temperature changes may occur in a short time.

The max tolerance is 0.5 seconds per day, which means about 0.0006% (roughly 5.8 ppm) accuracy. That's why it must be well calibrated at any temperature.

Thank you very much for your attention, and sorry that I couldn't describe the problem well.

Kadir,

elil
Associate III
Posted on July 25, 2012 at 14:23

Hi Kadir,

I think there is still some missing information about your problem.

The LSE crystal frequency tolerance is 10-30 ppm at 25 °C.

Over the entire temperature range it may reach 80-100 ppm.

In order to compensate for this possible error you need another frequency reference that is more accurate than your crystal. So, what is your reference?

konay61
Associate II
Posted on July 25, 2012 at 14:57

Hi,

The tolerance of the crystal I use is 5 ppm at 25 °C. I first measure the error of the crystal, with the whole system running, using a more accurate instrument.

After that, there is a formula for calculating the effect of temperature and finding the calibration value in ppm. The MCU measures the temperature, gets a value, then converts it into ppm (by the formula ppm = Constant_K * (Current_Temperature - T0)^2).

Eventually, the sum of both values gives the final value that is written to the RTC calibration register.

I hope it is clearer now.

Going back to the beginning: the problem is with the final calibration value; I couldn't reduce the error to an acceptable level.

elil
Associate III
Posted on July 25, 2012 at 15:43

I haven't helped you, but at least I've made the picture clearer for the real professionals (-:

One last attempt: maybe you should check the temperature measurement accuracy and the ADC accuracy.

ADC accuracy (total error vs. resolution):

12-bit: 0.125%

10-bit: 0.225%

8-bit: 0.524%

6-bit: 1.728%

The STM8 internal temperature sensor itself has its own error, which depends on temperature and Vdd and may be 1-1.5 °C.

Please take these factors into consideration too.

Good luck !