JimFouch
Associate III
December 16, 2020
Question

Can't seem to get correct results from CRC hardware


I have a routine that I have used for years on other processors and on PCs to get a CRC16 X-25 from a string. I'd like to get the hardware CRC to use the same CRC16.

I have tried every variant in STM32CubeIDE and have not been able to get the same output.

I know X-25 applies a final XOR of 0xFFFF, which I do in software to the result from HAL_CRC_Calculate.

I'm using my name as the test string...

unsigned char crcTest[]="Jim Fouch";

and then calling the CRC in HAL like this...

crcResult = ~HAL_CRC_Calculate(&hcrc, (uint32_t *)crcTest, strlen((char *)crcTest));

Here is my crc Init...

static void MX_CRC_Init(void)
{

  /* USER CODE BEGIN CRC_Init 0 */

  /* USER CODE END CRC_Init 0 */

  /* USER CODE BEGIN CRC_Init 1 */

  /* USER CODE END CRC_Init 1 */
  hcrc.Instance = CRC;
  hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_DISABLE;
  hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_DISABLE;
  hcrc.Init.GeneratingPolynomial = 0x1021; /* 4129 decimal, the CRC-CCITT polynomial */
  hcrc.Init.CRCLength = CRC_POLYLENGTH_16B;
  hcrc.Init.InitValue = 0xFFFF;
  hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_NONE;
  hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_DISABLE;
  hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_BYTES;
  if (HAL_CRC_Init(&hcrc) != HAL_OK)
  {
    Error_Handler();
  }
  /* USER CODE BEGIN CRC_Init 2 */

  /* USER CODE END CRC_Init 2 */

}

According to my software routine and an online CRC test utility, this should return 0x0BAD. I know it's funny that my name returns "BAD" as a hex result. LOL

Here is the util I'm using... www.crccalc.com

Looking for any help. I'm sure I'm overlooking something.


3 replies

waclawek.jan
Super User
December 16, 2020

Which STM32?

Drop Cube and feed the CRC byte-wise, as a first experiment, then try halfwords with various control bit settings.

JW

JimFouch (Author)
Associate III
December 16, 2020

STM32H743

JimFouch (Author)
Associate III
December 16, 2020

Figured it out.

I needed to enable inversion on both input and output...

 hcrc.Instance = CRC;
 hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_DISABLE;
 hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_DISABLE;
 hcrc.Init.GeneratingPolynomial = 0x1021; /* 4129 decimal */
 hcrc.Init.CRCLength = CRC_POLYLENGTH_16B;
 hcrc.Init.InitValue = 0xFFFF;
 hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_BYTE;
 hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_ENABLE;
 hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_BYTES;

and then invert the result (the ~ performs the X-25 final XOR with 0xFFFF):

 crcResult = ~HAL_CRC_Calculate(&hcrc, (uint32_t *)crcTest, strlen((char *)crcTest));

KConn.1
Associate
April 14, 2021

Thank you for posting. I had been struggling to get the HW CRC calculator to produce the correct result, and this got it working.

This is what I needed for a 16-bit CRC:

hcrc.Instance = CRC;
hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_DISABLE;
hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_DISABLE;
hcrc.Init.GeneratingPolynomial = 0x1021;
hcrc.Init.CRCLength = CRC_POLYLENGTH_16B;
hcrc.Init.InitValue = 0xFFFF;
hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_HALFWORD;
hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_ENABLE;
hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_HALFWORDS;
 
if (HAL_CRC_Init(&hcrc) != HAL_OK)
{
 Error_Handler();
}

and in my case, no inversion was needed on the output:

uint8_t data[] = {0, 1, 2, 3, 4, 5, 6, 7};
uint32_t hal_crc = HAL_CRC_Calculate(&hcrc, (uint32_t *)data, 4); /* length is 4 halfwords = 8 bytes */

gets the correct CRC of 0x9DFD