2024-12-17 04:58 AM
Hi everyone,
currently I am working with an STM32L431 on a project that timestamps incoming IEEE 802.15.4 frames via an AT86RF233 radio. The radio module is configured to issue an interrupt when a frame is received, and this interrupt signal is connected to PC13, which is the timestamp input (RTC_TS) on the STM32L431. To enable the basic timestamp functionality I set the TSE bit in the RTC_CR register (nothing more), and I'm able to query the timestamp time register (TSTR) to get the time at which the frame was received.
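For reference, this is roughly what my setup and read-out look like (a minimal sketch using the CMSIS register names from the STM32L4 headers; I assume here that the RTC is already initialized and its write protection unlocked, and error handling is omitted):

```c
#include <stdint.h>
#include "stm32l4xx.h"

/* Enable timestamping: capture the calendar into the TS registers
 * on an edge at the RTC_TS pin (PC13). */
static void timestamp_enable(void)
{
    RTC->CR |= RTC_CR_TSE;
}

/* Read the captured time once the timestamp flag is set.
 * The TS registers stay frozen until TSF is cleared. */
static int timestamp_read(uint32_t *ssr, uint32_t *tr, uint32_t *dr)
{
    if ((RTC->ISR & RTC_ISR_TSF) == 0)
        return -1;                 /* no timestamp event captured */
    *ssr = RTC->TSSSR;             /* sub-second counter value */
    *tr  = RTC->TSTR;              /* time, BCD-coded */
    *dr  = RTC->TSDR;              /* date, BCD-coded */
    RTC->ISR &= ~RTC_ISR_TSF;      /* clear the flag to re-arm the capture */
    return 0;
}
```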
However, I have a question regarding the timing precision of copying the Sub-Second Register (SSR), Time Register (TR), and Date Register (DR) into their respective timestamp registers. Specifically, I am concerned about the duration of this process in microseconds, as I am implementing a time synchronization mechanism. Does copying these registers take a significant amount of time? Any insights or detailed timing information would be greatly appreciated. Or are there other sources of delay between the interrupt and the moment the timestamp registers are filled with the current time?
Thank you in advance!
2024-12-17 06:29 AM
Not a direct answer to your question, but one possibility is to get the timestamp with a regular timer and combine it with the RTC to calculate the subseconds part yourself.
In the past I made a data logger that used a sub-millisecond timer to log events, and one of the events was the RTC 1-second tick (or every x seconds). I calculated the subseconds in post-processing using linear regression. This allowed jumps in time synchronization (setting the time a few seconds forward or even backwards) to be smoothed out, and it even allowed me to start logging before the RTC was valid (empty RTC battery while waiting for the GPS time signal). But you can do this calculation in real time too. This approach combines long-term accuracy with short-term high precision, as the sketch below illustrates.
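To make the idea concrete, here is a rough sketch of that fusion (hypothetical names, plain C; a real logger would use a ring buffer and guard against timer overflow):

```c
#include <stdint.h>

#define FIT_WINDOW 16  /* number of (timer, RTC) pairs in the fit */

static uint64_t ts_timer[FIT_WINDOW]; /* timer count at each RTC tick */
static uint32_t ts_rtc[FIT_WINDOW];   /* RTC seconds at that tick */
static int ts_count;

static double fit_slope;     /* fitted: rtc = slope * timer + intercept, */
static double fit_intercept; /* both relative to the first sample pair   */

/* Call from the RTC second-tick event with the captured timer count. */
void sync_on_rtc_tick(uint64_t timer_count, uint32_t rtc_seconds)
{
    if (ts_count < FIT_WINDOW) {
        ts_timer[ts_count] = timer_count;
        ts_rtc[ts_count]   = rtc_seconds;
        ts_count++;
    }
}

/* Least-squares fit over the window. Samples are referenced to the
 * first pair so the doubles stay small and precise. */
int sync_fit(void)
{
    if (ts_count < 2)
        return -1;
    double sx = 0, sy = 0, sxx = 0, sxy = 0, n = (double)ts_count;
    for (int i = 0; i < ts_count; i++) {
        double x = (double)(ts_timer[i] - ts_timer[0]);
        double y = (double)(ts_rtc[i]   - ts_rtc[0]);
        sx += x; sy += y; sxx += x * x; sxy += x * y;
    }
    fit_slope     = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    fit_intercept = (sy - fit_slope * sx) / n;
    return 0;
}

/* Map any timer capture (e.g. the radio IRQ) to RTC time in seconds. */
double sync_timestamp(uint64_t timer_count)
{
    double x = (double)(timer_count - ts_timer[0]);
    return (double)ts_rtc[0] + fit_slope * x + fit_intercept;
}
```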
2024-12-17 08:16 AM
Hi,
thanks for your answer. The problem with this is that my long-term goal is to implement a TDMA protocol, which leverages the sleep-between-slots nature of TDMA to conserve power. I don't think I can use your approach, since I use the RTC to wake up at the right time, i.e., at a slot.
In addition to my first post, I already implemented some sort of time synchronization protocol and achieved a precision of around ±30 us, which is the quantization error of the RTC. When I try to implement the same with the timestamping functionality, I get around 90-120 us of error. I'm 99% sure that I can rule out all external sources of error (interrupts, calculation delays...), except that the timestamping simply takes a very long time in the RTC itself...
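(For reference on the ±30 us figure: this assumes PREDIV_A = 0 and PREDIV_S = 32767, so the sub-second counter runs at the full 32768 Hz LSE rate and one tick is 1/32768 s ≈ 30.5 us. My conversion looks roughly like this, with that prescaler value as an assumption:)

```c
#include <stdint.h>

/* Convert a captured sub-second value to microseconds.
 * Assumes PREDIV_S = 32767 (sub-second counter at 32768 Hz). The SSR
 * counts DOWN, so the elapsed fraction of the current second is
 * (PREDIV_S - SSR) / (PREDIV_S + 1). */
static uint32_t tsssr_to_us(uint32_t tsssr)
{
    const uint32_t prediv_s = 32767u;
    return (uint32_t)(((uint64_t)(prediv_s - tsssr) * 1000000u)
                      / (prediv_s + 1u));
}
```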
2024-12-17 08:38 AM
@Lennart-Lutz wrote: Hi,
thanks for your answer. The problem with this is that my long-term goal is to implement a TDMA protocol, which leverages the sleep-between-slots nature of TDMA to conserve power. I don't think I can use your approach, since I use the RTC to wake up at the right time, i.e., at a slot.
Won't sleeping and waking up introduce latency? I'm not familiar with the timestamping feature of the peripheral or what its latency is.
@Lennart-Lutz wrote: When I try to implement the same with the timestamping functionality, I get around 90-120 us of error.
Can you share your code? This doesn't seem right. Could it be that it receives multiple packets and overwrites the timestamps with newer ones?