
STM32H7 64MHz HSI is off by 4%!

TDK
Guru

According to the STM32H745xI/G datasheet, the 64MHz HSI is accurate to +/- 0.3 MHz around room temperature, so +/- 0.5%.

During testing, I noticed about 10-20% of my UART characters were getting dropped. I then looked at the signal on a scope and noticed the frequency is off.

After redirecting the system clock to MCO2 (PC9) and measuring on a scope, I discovered the problem:

The HSI RC (64 MHz) on the STM32H745 chip is off by 4%!! So much for the "factory calibration".

Am I missing something here?

Clock initialization (480MHz):

 /** Initializes the CPU, AHB and APB busses clocks 
  */
 RCC_OscInitStruct.OscillatorType = RCC_OSCILLATORTYPE_HSI;
 RCC_OscInitStruct.HSIState = RCC_HSI_DIV1;
 RCC_OscInitStruct.HSICalibrationValue = RCC_HSICALIBRATION_DEFAULT;
 RCC_OscInitStruct.PLL.PLLState = RCC_PLL_ON;
 RCC_OscInitStruct.PLL.PLLSource = RCC_PLLSOURCE_HSI;
 RCC_OscInitStruct.PLL.PLLM = 32;
 RCC_OscInitStruct.PLL.PLLN = 480;
 RCC_OscInitStruct.PLL.PLLP = 2;
 RCC_OscInitStruct.PLL.PLLQ = 2;
 RCC_OscInitStruct.PLL.PLLR = 2;
 RCC_OscInitStruct.PLL.PLLRGE = RCC_PLL1VCIRANGE_1;
 RCC_OscInitStruct.PLL.PLLVCOSEL = RCC_PLL1VCOWIDE;
 RCC_OscInitStruct.PLL.PLLFRACN = 0;
 if (HAL_RCC_OscConfig(&RCC_OscInitStruct) != HAL_OK) {
  Error_Handler();
 }
 /** Initializes the CPU, AHB and APB busses clocks 
  */
 RCC_ClkInitStruct.ClockType = RCC_CLOCKTYPE_HCLK | RCC_CLOCKTYPE_SYSCLK
   | RCC_CLOCKTYPE_PCLK1 | RCC_CLOCKTYPE_PCLK2 | RCC_CLOCKTYPE_D3PCLK1
   | RCC_CLOCKTYPE_D1PCLK1;
 RCC_ClkInitStruct.SYSCLKSource = RCC_SYSCLKSOURCE_PLLCLK;
 RCC_ClkInitStruct.SYSCLKDivider = RCC_SYSCLK_DIV1;
 RCC_ClkInitStruct.AHBCLKDivider = RCC_HCLK_DIV2;
 RCC_ClkInitStruct.APB3CLKDivider = RCC_APB3_DIV2;
 RCC_ClkInitStruct.APB1CLKDivider = RCC_APB1_DIV2;
 RCC_ClkInitStruct.APB2CLKDivider = RCC_APB2_DIV2;
 RCC_ClkInitStruct.APB4CLKDivider = RCC_APB4_DIV2;
 
 if (HAL_RCC_ClockConfig(&RCC_ClkInitStruct, FLASH_LATENCY_4) != HAL_OK) {
  Error_Handler();
 }

MCO initialization:

HAL_RCC_MCOConfig(RCC_MCO2, RCC_MCO2SOURCE_SYSCLK, RCC_MCODIV_10);

Measured frequency is 45.99 MHz, which is 4.2% below the expected 48 MHz (480 MHz SYSCLK divided by the MCO divider of 10).


If I change to 400MHz, the result is the same. -4.2% from what it should be.

If you feel a post has answered your question, please click "Accept as Solution".
1 ACCEPTED SOLUTION

TDK
Guru

There's a bit of misunderstanding/misinformation going on here. The hard-coded calibration value cannot be overwritten. What __HAL_RCC_HSI_CALIBRATIONVALUE_ADJUST and similar does is adjust HSITRIM, which indirectly adjusts HSICAL. The original HSICAL can always be reset by restoring the default on-reset value of HSITRIM, which, unfortunately, differs depending on chip rev.

ST's current code has the following:

#if defined(RCC_HSICFGR_HSITRIM_6)
#define RCC_HSICALIBRATION_DEFAULT     (0x40U)         /* Default HSI calibration trimming value, for STM32H7 rev.V and above  */
#else
#define RCC_HSICALIBRATION_DEFAULT     (0x20U)         /* Default HSI calibration trimming value, for STM32H7 rev.Y */
#endif

but since RCC_HSICFGR_HSITRIM_6 is always defined, it always evaluates to 0x40. It's not like there are different include files for different chip revisions.

The solution proposed by @DWest.1​ works and is similar to what I did, as long as you call it after HAL_RCC_OscConfig. IMO, HAL_RCC_OscConfig shouldn't be touching HSITRIM at all.

CubeMX lists the calibration value it uses, so you can adjust it there if you know your chip revision ahead of time, or adjust it in code. Because the value is listed with no option for "default" or "do not change", it can't be adapted to the chip revision automatically. Consistent with the code, but not super helpful.

Presumably, as the old chip revision becomes less common, this will be less of an issue.


18 REPLIES
Thomas L.
Associate III

Yep, that's the problem. CubeMX sets HSITRIM to 0x20, which differs from its reset value of 0x40. After changing it to 0x40, the clock is now at 48.25 MHz, an error of +0.5%.

Yet another instance of the hardware being fine but CubeMX producing a bug. I had thought at least the power initialization in CubeMX would be robust.

My chip revision is V (REV_ID = 0x2003).

This is the problem statement:

#define RCC_HSICALIBRATION_DEFAULT   (0x20U)     /* Default HSI calibration trimming value */

Piranha
Chief II

> I had thought at least the power initialization in CubeMX would be robust.

Isn't it "a bit" naive to expect that the same code monkeys who produce a limited, inflexible driver architecture and bloated code full of bugs, where not a single component is of reasonable quality, could make robust GUI-generated code?

Amel NASRI
ST Employee

Hello,

RCC_HSICALIBRATION_DEFAULT should be revision dependent.

This has been reported internally to be fixed by our development team.

-Amel

PS: the problem is not on STM32CubeMX generated code, but on the header file stm32h7xx_hal_rcc.h which is copied as is from the STM32CubeH7 package.

To give better visibility on the answered topics, please click on Accept as Solution on the reply which solved your issue or answered your question.

DWest.1
Associate II

The issue is that RCC_HSICALIBRATION_DEFAULT is hardcoded as 0x40 in STM32Cube_FW_H7_V1.6.0 and STM32Cube_FW_H7_V1.7.0, but was 0x20 in STM32Cube_FW_H7_V1.5.0. The correct trim value is dependent on the chip revision.

Here is a fix that works for all chip revisions. Insert it in a user code area after SystemClock_Config() is called, so it survives CubeMX's code regeneration.

/* USER CODE BEGIN SysInit */
  if (HAL_GetREVID() <= REV_ID_Y)
  {
    /* Default HSI calibration trimming value, for STM32H7 rev.Y */
    __HAL_RCC_HSI_CALIBRATIONVALUE_ADJUST(0x20U);
  }
  /* USER CODE END SysInit */

> I had thought at least the power initialization in CubeMX would be robust.

From the same team that can't even get toggling pins right?

Piranha
Chief II

> RCC_HSICALIBRATION_DEFAULT should be revision dependent.

> The issue is that RCC_HSICALIBRATION_DEFAULT is hardcoded

https://github.com/STMicroelectronics/STM32CubeH7/blob/79196b09acfb720589f58e93ccf956401b18a191/Drivers/STM32H7xx_HAL_Driver/Inc/stm32h7xx_hal_rcc.h#L214

Am I on a different internet? =)

> the problem is not on STM32CubeMX generated code

But the problem with the HAL code is that it changes this value at all. How do you expect a calibrated value to be placed in source code and compiled? Has anyone at ST with a brain ever even thought about this for a minute?

This question is quite old. Note the post date. The library has no doubt been updated since then.

@Piranha​ I totally agree. Documentation says the value is calibrated at the factory. The first thing the official firmware driver does is erase that calibration with no way to recover the factory value. The issue isn't the value for RCC_HSICALIBRATION_DEFAULT, it's the fact that CubeMX is changing the HSICalibrationValue during init without regard to a calibrated value.

@TDK​ Still relevant as of Release v1.7.0 of the STM32CubeH7 Firmware Package. As the original poster, what was your solution?