In our Avro schema we have the following two decimal types:
{
  "name": "LONG",
  "type": {
    "type": "bytes",
    "scale": 12,
    "precision": 64,
    "connect.doc": "Long",
    "connect.version": 1,
    "connect.parameters": {
      "scale": "12",
      "jcoType": "JCoBCDType",
      "length": "9",
      "decimals": "12"
    },
    "connect.default": "\u0000",
    "connect.name": "org.apache.kafka.connect.data.Decimal",
    "logicalType": "decimal"
  },
  "default": "\u0000"
},
{
  "name": "LONG2",
  "type": {
    "type": "bytes",
    "scale": 12,
    "precision": 64,
    "connect.doc": "Long2",
    "connect.version": 1,
    "connect.parameters": {
      "scale": "12",
      "jcoType": "JCoBCDType",
      "length": "9",
      "decimals": "12"
    },
    "connect.default": "\u0000",
    "connect.name": "org.apache.kafka.connect.data.Decimal",
    "logicalType": "decimal"
  },
  "default": "\u0000"
}
Example values:

LONG:  55.71364
LONG2: 12.43337

For these, the Kafka broker sends the following encoded message (snippet):
"\f2��\x00\f\x0BNޚ*�\x00"
On our TypeScript backend we then try to decode it:
import { SchemaRegistry } from '@kafkajs/confluent-schema-registry';
import { AvroDecimal } from '@ovotech/avro-decimal';

// Register AvroDecimal as the handler for the "decimal" logical type.
const registry = new SchemaRegistry(
  {
    host: appConfig.kafkaSchemaUrl,
    auth: {
      username: appConfig.kafkaUser,
      password: appConfig.kafkaPassword
    }
  },
  { forSchemaOptions: { logicalTypes: { decimal: AvroDecimal } } }
);

const buffer: Buffer = Buffer.from(message);
const decoded = await registry.decode(buffer);
console.log(decoded);
It fails with "Error: expecting underlying Buffer type".
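One thing we are unsure about (an assumption on our side, not something from the library's docs): registry.decode expects the unmodified message value in the Confluent wire format, i.e. one magic byte 0x00, a 4-byte big-endian schema id, then the Avro binary payload. If the payload was ever converted to a string before reaching Buffer.from, the non-UTF-8 bytes would already be corrupted. A quick sanity check, assuming a kafkajs consumer where message.value is a Buffer:

const value = message.value as Buffer; // kafkajs delivers Buffer | null
console.log(value[0]);                 // should be 0 (the magic byte)
console.log(value.readInt32BE(1));     // should be a schema id known to the registry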
Other attempts at constructing the buffer:

const buffer = atob(message);
const buffer = Buffer.from(message, 'base64');
const buffer = Buffer.from(message, 'binary');
const buffer = Buffer.from(message, 'ascii');
We also tried extending the AvroDecimal class, and tried to skip decoding the decimal entirely by creating a resolver; a sketch of the extension direction follows.
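For illustration, a minimal sketch of such a custom logical type, assuming avsc's types.LogicalType API (which @kafkajs/confluent-schema-registry uses underneath); PlainDecimal and its internals are our own and not working project code:

import { types } from 'avsc';

// Decodes the two's-complement, big-endian unscaled bytes itself instead
// of relying on AvroDecimal. Registered the same way, via
// { forSchemaOptions: { logicalTypes: { decimal: PlainDecimal } } }.
class PlainDecimal extends types.LogicalType {
  private readonly scale: number;

  constructor(schema: any, opts?: any) {
    super(schema, opts);
    this.scale = schema.scale ?? 0; // "scale": 12 in the schema above
  }

  _fromValue(value: unknown): number {
    // Normalize first, since the error suggests the raw value is not
    // recognized as a Buffer.
    const buf = Buffer.isBuffer(value) ? value : Buffer.from(value as Uint8Array);
    let unscaled = 0n;
    for (const byte of buf) unscaled = (unscaled << 8n) | BigInt(byte);
    if (buf.length > 0 && (buf[0] & 0x80) !== 0) {
      unscaled -= 1n << BigInt(buf.length * 8); // sign extension
    }
    // Note: a plain number loses precision for very large unscaled values;
    // return a string or use a decimal library if full precision matters.
    return Number(unscaled) / 10 ** this.scale;
  }

  _toValue(): Buffer {
    throw new Error('encoding is not needed for our consumer');
  }
}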
Accepted solution: one in which the decimal type can actually be decoded, in TypeScript or JavaScript. Altering the schema is not an option.