
RTC sub-second (RTC_SSR) issue

Question asked by MrStive on Feb 25, 2014
Latest reply on Aug 6, 2017 by Scott Dev
I am having trouble with the sub-second register on an STM32.

I have a datalogger which logs data with a timestamp. I am doing multiple logs (~30) per second, so wanted to log sub-seconds as well.

HOWEVER, as soon as I read the sub-second register (RTC_SSR), the seconds no longer increment properly and skip seconds!

I have not enabled anything for the SSR register; do I need to? I just store the register value as an unsigned int and use the formula
frac = (float)(PREDIV_S - fracsec) / (float)(PREDIV_S + 1);
to get the sub-second value as a fraction of a second.

Without reading the register, the seconds increment correctly, and I make about 30 logs per second,
i.e. about 30 logs with timestamp 12:00:01, then 30 logs with timestamp 12:00:02.

However, as soon as I read the SSR register, things start playing up. The log increments the sub-second from 12:00:01.01 to 12:00:01.99, then the sub-seconds start again while RTC_TR remains at the same second (i.e. 12:00:01.01 to 12:00:01.99 again), and the SSR register keeps rolling over multiple times before RTC_TR changes.

Then, all of a sudden, the RTC_TR seconds jump ahead by 3-4+ seconds and the log reads
12:00:05.01 -> 12:00:05.99 over and over.

Hope you can understand this explanation!

Why won't the seconds increment correctly when I read the SSR register? Just commenting out the read from the SSR register lets the seconds behave normally again; all other code is identical.

I see in the note section of the STM reference manual it states:
"Note: SS can be larger than PREDIV_S only after a shift operation. In that case, the correct
time/date is one second less than as indicated by RTC_TR/RTC_DR."

However, I'm seeing a 3+ second skip in the TR register, so this can't be the cause.
It also seems to be random.
I have the time output to a clock on an LCD, and the minutes match my wall clock, i.e. 1 minute on the RTC = 60 seconds in real life. The synchronous and asynchronous prescalers are at their defaults.

Any thoughts??