Background
I am trying to calculate CRC-16/CRC2 for a given byte array using the Boost CRC library.
Note: I am, at best, a beginner in C++ development.
#include <iostream>
#include <vector>
#include <boost/crc.hpp>
namespace APP {
class CrcUtil {
public:
    static uint16_t crc16(const std::vector<uint8_t> input) {
        boost::crc_16_type result;
        result.process_bytes(&input, input.size());
        return result.checksum();
    }

    CrcUtil() = delete;
};
}
I am using Catch2 as my test framework. Here is the test code:
#include "catch.hpp"
#include "../include/crcUtil.h"
TEST_CASE("is crc calculation correct", "[crcUtil.h TESTS]") {
    std::vector<uint8_t> bytes = {0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08};

    auto expectedCRC2 = 0x3c9d;
    auto actualCRC2 = APP::CrcUtil::crc16(bytes);

    REQUIRE(expectedCRC2 == actualCRC2);
}
Issue
Each time I run my test, the calculated CRC is different.
First run:
/.../test/crcUtilTests.cpp:10: FAILED:
REQUIRE( expectedCRC2 == actualCRC2 )
with expansion:
15517 (0x3c9d) == 63180
Second run:
/.../test/crcUtilTests.cpp:10: FAILED:
REQUIRE( expectedCRC2 == actualCRC2 )
with expansion:
15517 (0x3c9d) == 33478
Nth run:
/.../test/crcUtilTests.cpp:10: FAILED:
REQUIRE( expectedCRC2 == actualCRC2 )
with expansion:
15517 (0x3c9d) == 47016
Question
Is there something wrong with my code?
Why is the CRC16 different for the same input?
How can I reliably calculate the CRC16 for a given byte array?
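For reference, my current guess is that `&input` passes the address of the vector object itself rather than the address of its underlying buffer, so `process_bytes` hashes the vector's internal bookkeeping bytes (which change between runs). A minimal sketch of what I think the corrected function would look like under that assumption:

    #include <cstdint>
    #include <vector>

    #include <boost/crc.hpp>

    namespace APP {
    class CrcUtil {
    public:
        // Sketch of a possible fix: take the vector by const reference and give
        // process_bytes() a pointer to the first element (input.data()) instead
        // of the address of the vector object itself.
        static uint16_t crc16(const std::vector<uint8_t>& input) {
            boost::crc_16_type result;
            result.process_bytes(input.data(), input.size());
            return result.checksum();
        }

        CrcUtil() = delete;
    };
    }

Is this the right way to do it, or am I missing something else?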