A topic contains messages with different schemas, registered using RecordNameStrategy.

Can a consumer implemented with kafkajs (version 3.0.1) consume those messages in the same batch and parse them?

Judging from the source code, it seems it can, since decode reads the schema from the registry ID embedded in each message:
public async decode(buffer: Buffer): Promise<any> {
  if (!Buffer.isBuffer(buffer)) {
    throw new ConfluentSchemaRegistryArgumentError('Invalid buffer')
  }

  const { magicByte, registryId, payload } = decode(buffer)

  if (Buffer.compare(MAGIC_BYTE, magicByte) !== 0) {
    throw new ConfluentSchemaRegistryArgumentError(
      `Message encoded with magic byte ${JSON.stringify(magicByte)}, expected ${JSON.stringify(
        MAGIC_BYTE,
      )}`,
    )
  }

  const schema = await this.getSchema(registryId)

  return schema.fromBuffer(payload)
}
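For context, here is a minimal sketch of what that header parsing amounts to (my own illustration, not the library's code, assuming the standard Confluent wire format: one magic byte `0x00`, a 4-byte big-endian schema ID, then the serialized payload):

```typescript
// Sketch of the Confluent wire format split (assumption: magic byte 0x00,
// then a 4-byte big-endian schema ID, then the Avro-encoded body).
function splitWireFormat(buffer: Buffer): { registryId: number; payload: Buffer } {
  if (buffer.length < 5 || buffer[0] !== 0) {
    throw new Error('Not a Confluent wire-format message')
  }
  return {
    registryId: buffer.readInt32BE(1), // schema ID assigned by the registry
    payload: buffer.subarray(5),       // serialized record body
  }
}

// Example: magic byte 0, schema ID 42, 3-byte payload.
const msg = Buffer.concat([Buffer.from([0, 0, 0, 0, 42]), Buffer.from([1, 2, 3])])
const { registryId, payload } = splitWireFormat(msg)
// registryId === 42, payload.length === 3
```

Since the schema ID travels with every message, mixing schemas in one topic (or one batch) should be fine in principle.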
But we are getting this weird error:
error in message number 99 {"message":"truncated buffer","stack":"Error: truncated buffer
at RecordType.Type.fromBuffer (/my-platform/backend/node_modules/avsc/lib/types.js:601:11)
at SchemaRegistry.decode (/my-platform/backend/node_modules/@kafkajs/confluent-schema-registry/dist/SchemaRegistry.js:159:29)"}
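To narrow this down, I was thinking of wrapping the decode call to dump the header of the failing message, so we can see whether message 99 is actually shorter than its Avro schema expects, or is not wire-format encoded at all. A hedged sketch (`registry` stands for the usual SchemaRegistry instance; the wrapper is my own, not a library API):

```typescript
// Hypothetical diagnostic wrapper: on failure, log buffer length, magic byte,
// and the embedded schema ID before rethrowing, to see what message 99 holds.
async function safeDecode(
  registry: { decode(b: Buffer): Promise<any> },
  value: Buffer,
): Promise<any> {
  try {
    return await registry.decode(value)
  } catch (err) {
    console.error('decode failed', {
      length: value.length,
      magicByte: value[0],
      registryId: value.length >= 5 ? value.readInt32BE(1) : null,
    })
    throw err
  }
}
```

If the logged registryId differs from the schema the payload was actually written with, that mismatch would explain avsc reading past the end of the buffer.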