Setting RTC calibration output crashes STM32L1 discovery

EmbeddedChris
Associate
Posted on July 19, 2016 at 10:47

Hello, I'm working with an STM32L152C Discovery board, trying to get the RTC to output a calibration signal. When configured to output a 512 Hz signal it works as expected; however, when set to output 1 Hz, the processor crashes and appears to get stuck in a constant reset loop. I have a timer interrupt set to trigger every second and print an incremented variable over the USART; the interrupt still fires, but the variables remain at their default values. Debugging during this period isn't possible, and after a hard reset the debugger (ST-LINK via IAR) fails to read the CPUID.

I've run into this issue using both HAL and bare CMSIS register access, and I've tested two boards with the same result. Please help. Here is the code I'm using below, with PreDiv_A = 127, PreDiv_S = 255 and RTCCLK = 32.768 kHz.
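(Those prescalers give 32768 Hz / ((127 + 1) × (255 + 1)) = 1 Hz for the calendar clock, so the prescaler split itself should be correct.)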

  RTC->CR |= (1 << 19);   // COSEL = 1: select the 1 Hz calibration output

  RTC->CR |= (1 << 23);   // COE = 1: enable the output on the RTC_CALIB pin

I can confirm that the bits are being set correctly, but after that it all goes crazy.
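In case it's relevant, the register-level sequence around those two lines looks roughly like this (a sketch; the 0xCA/0x53 write-protection keys and the PWR_CR DBP bit are taken from the RM0038 reference manual, and I've left out the LSE/RTC clock setup):

  PWR->CR |= PWR_CR_DBP;   // enable write access to the RTC domain registers
  RTC->WPR = 0xCA;         // RTC write-protection unlock, key 1
  RTC->WPR = 0x53;         // RTC write-protection unlock, key 2
  RTC->CR |= (1 << 19);    // COSEL = 1: select the 1 Hz calibration output
  RTC->CR |= (1 << 23);    // COE = 1: enable the output on RTC_CALIB
  RTC->WPR = 0xFF;         // re-lock write protection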

Thanks in advance.

#stm32l152 #stm32l #discovery #stm32l1-rtc
1 REPLY
Walid FTITI_O
Senior II
Posted on July 19, 2016 at 11:56

Hi Chris152,

Try following the recommendations described in the application note "Using the STM32 hardware real-time clock (RTC)":

http://www.mcutech.net/upload/hr/0pic2_10_22_1331609905.pdf

In particular, see both of the following sections:

1.4 RTC coarse calibration

1.9.1 RTC_CALIB output

-Hannibal-