
USB is less stable when enabling CRS Automatic Trimming Mode [STM32F072]

xpp07
Senior

For some reason, when I enable CRS Automatic Trimming mode my USB connection becomes unstable: I have to unplug and replug the USB cable several times before the PC recognizes the device. When I disable Automatic Trimming and use the default HSI48CalibrationValue = 0x20, connectivity improves a lot. The clock initialization is below. I appreciate any input.

void Clocks_Init(void)
{
  static RCC_CRSInitTypeDef RCC_CRSInitStruct;
                                             
  // CLOCK CONFIGURATION REGISTERS
  RCC->CFGR      = RCC_CFGR_SW_HSI;        // Configure SYSCLK
  RCC->CFGR     |= RCC_CFGR_INIT;          // Configure PLL clk
  
  // CLOCK CONTROL REGISTERS 
  RCC->CR2      |= RCC_CR2_INIT;           // Enable HSI14 CLK                                  
  
  // AHB PERIPHERAL CLOCK ENABLE REGISTERS
  RCC->AHBENR   |= RCC_AHBENR_INIT;        // Enable DMA, CRC, and GPIO Clks                    
 
  // APB PERIPHERAL CLOCK ENABLE REGISTERS 1
  RCC->APB1ENR  |= RCC_APB1ENR_INIT;       // Enable CRS, WWDOG, & TIM3/2 CLKs                  
  
  // APB PERIPHERAL CLOCK ENABLE REGISTERS 2
  RCC->APB2ENR  |= RCC_APB2ENR_INIT;       // Enable ADC interface and SPI Clks                
  RCC->APB2ENR  |= RCC_APB2ENR_DBGMCUEN;   // Enable MCU Debug Clk 
 
  CRS->CR       |= CRS_CR_AUTOTRIMEN;      // Enable CRS Automatic Trimming Mode
    
  /*Configure the clock recovery system (CRS)**********************************/
  
  /*Enable CRS Clock*/
  __HAL_RCC_CRS_CLK_ENABLE(); 
  
  /* Default Synchro Signal division factor (not divided) */
  RCC_CRSInitStruct.Prescaler = RCC_CRS_SYNC_DIV1;
  
  /* Set the SYNCSRC[1:0] bits according to CRS_Source value */
  RCC_CRSInitStruct.Source = RCC_CRS_SYNC_SOURCE_USB;
  
  /* HSI48 is synchronized with USB SOF at 1 kHz rate */
  RCC_CRSInitStruct.ReloadValue =  __HAL_RCC_CRS_RELOADVALUE_CALCULATE(48000000, 1000);
  RCC_CRSInitStruct.ErrorLimitValue = RCC_CRS_ERRORLIMIT_DEFAULT;
  
  /* Set the TRIM[5:0] to the default value*/
  RCC_CRSInitStruct.HSI48CalibrationValue = 0x20; 
  
  /* Start automatic synchronization */ 
  HAL_RCCEx_CRSConfig (&RCC_CRSInitStruct);   
}
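
For reference, the reload value that macro produces works out to the CRS reset default. A quick check (the local variable is only illustrative, not part of the original code):

  /* RELOAD = (target frequency / SYNC frequency) - 1
     HSI48 trimmed against the 1 kHz USB SOF: 48000000 / 1000 - 1 = 47999 (0xBB7F),
     which is also the reset value of the RELOAD field in CRS_CFGR. */
  uint32_t reload = __HAL_RCC_CRS_RELOADVALUE_CALCULATE(48000000, 1000);  /* = 47999 */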

2 REPLIES
TDK
Guru

> CRS->CR |= CRS_CR_AUTOTRIMEN; // Enable CRS Automatic Trimming Mode

> __HAL_RCC_CRS_CLK_ENABLE();

You need to enable the clock before modifying registers.
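
In other words, move __HAL_RCC_CRS_CLK_ENABLE() ahead of any CRS register access. A minimal sketch of the reordering (untested):

  __HAL_RCC_CRS_CLK_ENABLE();        /* gate the CRS peripheral clock first            */
  CRS->CR |= CRS_CR_AUTOTRIMEN;      /* writes to CRS registers only take effect now   */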

Read out and report the CRS registers after initializing and enabling it.
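
Something like this after Clocks_Init() returns (a sketch; printf stands in for whatever output path you have, e.g. SWO or a UART, and needs <stdio.h> plus a retargeted _write):

  printf("CRS_CR   = 0x%08lX\r\n", (unsigned long)CRS->CR);
  printf("CRS_CFGR = 0x%08lX\r\n", (unsigned long)CRS->CFGR);
  printf("CRS_ISR  = 0x%08lX\r\n", (unsigned long)CRS->ISR);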

If you feel a post has answered your question, please click "Accept as Solution".
xpp07
Senior

Thank you. Although I think Auto Trimming is enabled again inside HAL_RCCEx_CRSConfig(&RCC_CRSInitStruct):

.......................
  /* Adjust HSI48 oscillator smooth trimming */
  /* Set the TRIM[5:0] bits according to RCC_CRS_HSI48CalibrationValue value */
  MODIFY_REG(CRS->CR, CRS_CR_TRIM, (pInit->HSI48CalibrationValue << CRS_CR_TRIM_BITNUMBER));
  
  /* START AUTOMATIC SYNCHRONIZATION*/
  
  /* Enable Automatic trimming & Frequency error counter */
  SET_BIT(CRS->CR, CRS_CR_AUTOTRIMEN | CRS_CR_CEN);