Can't seem to get correct results from CRC hardware
I have a routine that I've used for years on other processors and on PCs to compute a CRC16 X-25 over a string, and I'd like the STM32 hardware CRC unit to produce the same CRC16.
I have tried every variant in STM32CubeIDE and have not been able to get the same output.
I know X-25 applies a final XOR of 0xFFFF, which I do to the result I get from HAL_CRC_Calculate.
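For reference, X-25 is the reflected CRC-16 variant: polynomial 0x1021 (0x8408 when reflected), init 0xFFFF, input and output bit-reversed, final XOR 0xFFFF. A minimal bitwise version, which should be equivalent to my routine, looks something like this:

#include <stddef.h>
#include <stdint.h>

/* CRC-16/X-25: poly 0x1021 (reflected 0x8408), init 0xFFFF,
 * reflected input/output, final XOR 0xFFFF.
 * Sanity check: "123456789" -> 0x906E. */
uint16_t crc16_x25(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    while (len--)
    {
        crc ^= *data++;
        for (int i = 0; i < 8; i++)
            crc = (crc & 1) ? (crc >> 1) ^ 0x8408 : (crc >> 1);
    }
    return crc ^ 0xFFFF;
}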
I'm using my name as the test string...
unsigned char crcTest[]="Jim Fouch";
and then calling the HAL CRC routine like this...
crcResult = ~HAL_CRC_Calculate(&hcrc, (uint32_t *)&crcTest, strlen(crcTest));

Here is my CRC init...
static void MX_CRC_Init(void)
{
  /* USER CODE BEGIN CRC_Init 0 */

  /* USER CODE END CRC_Init 0 */

  /* USER CODE BEGIN CRC_Init 1 */

  /* USER CODE END CRC_Init 1 */
  hcrc.Instance = CRC;
  hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_DISABLE;
  hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_DISABLE;
  hcrc.Init.GeneratingPolynomial = 4129; /* = 0x1021 */
  hcrc.Init.CRCLength = CRC_POLYLENGTH_16B;
  hcrc.Init.InitValue = 0xFFFF;
  hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_NONE;
  hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_DISABLE;
  hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_BYTES;
  if (HAL_CRC_Init(&hcrc) != HAL_OK)
  {
    Error_Handler();
  }
  /* USER CODE BEGIN CRC_Init 2 */

  /* USER CODE END CRC_Init 2 */
}

According to my own routine and an online CRC test utility, that call should return 0x0BAD. I know it's funny that my name returns "BAD" as a hex result. LOL
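A quick side-by-side check on the target would look something like this (reusing the crc16_x25 sketch above; printf assumes console output is retargeted):

uint16_t sw = crc16_x25(crcTest, strlen((const char *)crcTest)); /* expect 0x0BAD */
uint16_t hw = (uint16_t)~HAL_CRC_Calculate(&hcrc, (uint32_t *)crcTest, strlen((const char *)crcTest));
printf("sw=0x%04X hw=0x%04X\r\n", sw, hw);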
Here is the util I'm using... www.crccalc.com
Looking for any help. I'm sure I'm overlooking something.
