
CRC-16 MODBUS on only the rightmost 16 bits of DR instead of the entire 32-bit register


Good morning,

I am facing an issue with the hardware CRC unit on the STM32G070RB microcontroller (user manual RM0454), specifically with computing CRC-16/MODBUS in hardware for a project currently under development at the company where I work. I have observed that when the polynomial size is set to 16 bits, the CRC unit still processes the entire 32-bit DR (Data Register). In other words:

my sequence 0xF704 is treated as 0x0000F704: the peripheral first runs the CRC over the two leading zero bytes (0x0000) and then uses that result as the initial value when processing 0xF704. Consequently, the microcontroller gives me a final CRC of 0xD747 instead of the expected 0x4346. Is there a way to apply the CRC only to 0xF704, "masking" the 0x0000 in front of it?

This issue becomes apparent when sending an odd number of 16-bit packets. Below is a draft of the code I have used:

// Enable the CRC clock (RCC_AHBENR bit 12, CRCEN)
volatile uint32_t *RCC_AHBENR = (volatile uint32_t *)0x40021038;
*RCC_AHBENR |= (1u << 12);

// Set REV_OUT (bit 7), REV_IN for byte-wise bit reversal (bit 5),
// and POLYSIZE = 01 for a 16-bit polynomial (bit 3)
volatile uint32_t *CRC_CR = (volatile uint32_t *)0x40023008;
*CRC_CR |= (1u << 7);
*CRC_CR |= (1u << 5);
*CRC_CR |= (1u << 3);

// Set the CRC-16/MODBUS polynomial (0x8005)
volatile uint32_t *CRC_POLY = (volatile uint32_t *)0x40023014;
*CRC_POLY = 0x8005;

// Set the initial CRC value (0xFFFF for MODBUS)
volatile uint32_t *CRC_INIT = (volatile uint32_t *)0x40023010;
*CRC_INIT = 0xFFFF;

// Input data to use
uint32_t input_data = 0xF704;

// Write input_data to DR (a 32-bit access, so all four bytes are processed)
volatile uint32_t *CRC_DR = (volatile uint32_t *)0x40023000;
*CRC_DR = input_data;
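One thing I have been wondering about, since RM0454 describes byte and half-word access sizes for the data register: would a 16-bit write to CRC_DR feed only two bytes into the unit? A sketch of what I mean (untested assumption on my side):

```c
#include <stdint.h>

int main(void)
{
    volatile uint32_t *CRC_DR = (volatile uint32_t *)0x40023000;

    /* Half-word write: if 16-bit accesses to DR are honored as such,
       this should process only the two bytes of 0xF704 instead of
       the four bytes of 0x0000F704. */
    *(volatile uint16_t *)CRC_DR = 0xF704;

    return 0;
}
```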

Where am I making a mistake?

Best regards,