I am using ksqlDB and I have a table with the following form:
KSQL-DB Query
create table currency (id integer,name varchar) with (kafka_topic='currency',partitions=1,value_format='avro');
C# model
public class Currency
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Now I want to know how I should write/read data to/from this topic in C# using the Confluent library:
Writing
IProducer<int, Currency> producer = ...;
Currency cur = new Currency();
Message<int, Currency> message = new Message<int, Currency>
{
    Key = cur.Id,
    Timestamp = new Timestamp(DateTime.UtcNow, TimestampType.CreateTime),
    Value = cur
};
DeliveryResult<int, Currency> delivery = await producer.ProduceAsync(topic, message);
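From what I have read so far, I think the producer side also needs the Schema Registry Avro serializer for the value. Here is a rough sketch of what I mean (just my assumption, using the Confluent.SchemaRegistry.Serdes.Avro package, placeholder broker/registry addresses, and a Currency class regenerated with avrogen so that it implements ISpecificRecord, since the Avro serializer apparently does not accept arbitrary POCOs):
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Sketch only: the AvroSerializer handles ISpecificRecord types (e.g. avrogen output)
// or primitives; property names on the generated class follow the registered schema.
var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://localhost:8081" });

using var producer = new ProducerBuilder<int, Currency>(producerConfig)
    .SetKeySerializer(Serializers.Int32) // plain int key, matching ksqlDB's default KAFKA key format
    .SetValueSerializer(new AvroSerializer<Currency>(schemaRegistry))
    .Build();

var cur = new Currency { Id = 1, Name = "USD" };
var delivery = await producer.ProduceAsync("currency",
    new Message<int, Currency> { Key = cur.Id, Value = cur });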
Reading
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(Deserializers.Int32) // I assume I need to use the id from my DTO
    .SetValueDeserializer(...)               // what deserializer?
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Currency message = // what deserializer? JsonSerializer.Deserialize<Currency>(result.Message.Value);
I am not sure how to go about this, so I tried looking for a serializer. I found the AvroSerializer library, but I do not get where the author fetches the schema.
Any help on how to read/write to a specific topic so that it matches my ksqlDB models?
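If it helps, this is my current understanding of the schema lookup (it may well be wrong): the Avro serdes contact Schema Registry themselves and cache schemas per subject, so nothing has to be fetched by hand, but the value schema ksqlDB registered can be inspected under the currency-value subject (assuming the default "<topic>-value" subject naming and a placeholder registry address):
using Confluent.SchemaRegistry;

// Sketch: look up the value schema that ksqlDB registered for the topic.
var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://localhost:8081" });
var latest = await schemaRegistry.GetLatestSchemaAsync("currency-value");
Console.WriteLine(latest.SchemaString);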
Update
After some research and some answers here, I have started using the Schema Registry:
var config = new ConsumerConfig
{
    GroupId = kafkaConfig.ConsumerGroup,
    BootstrapServers = kafkaConfig.ServerUrl,
    AutoOffsetReset = AutoOffsetReset.Earliest
};
var schemaRegistryConfig = new SchemaRegistryConfig
{
    Url = kafkaConfig.SchemaRegistryUrl
};
var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(new AvroDeserializer<int>(schemaRegistry).AsSyncOverAsync())
    .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Now I am getting another error:
Expecting data framing of length 5 bytes or more but total data size is 4 bytes
As someone kindly pointed out, it seems I am retrieving only the id from the Schema Registry.
How can I just run: insert into currency (id, name) values (1, '3');
and retrieve it in C# as the POCO listed above?
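My current understanding of the framing error (which could be wrong): the AvroDeserializer expects the 5-byte Schema Registry header (magic byte plus 4-byte schema id), but the key ksqlDB writes is just a plain 4-byte int in the default KAFKA key format, so only the value should go through Avro. This is a sketch of what I am going to try next, reusing config and kafkaConfig from above and again assuming Currency is an avrogen-generated ISpecificRecord class:
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// Key: plain big-endian int (KAFKA format), so no Avro framing expected.
// Value: Avro, with the schema resolved from Schema Registry by the deserializer.
using var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = kafkaConfig.SchemaRegistryUrl });

using var consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(Deserializers.Int32)
    .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
    .Build();

consumer.Subscribe("currency");
ConsumeResult<int, Currency> result = consumer.Consume();
Currency value = result.Message.Value;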
Update 2
After finding this source program, it seems I am not able to publish messages to tables for some reason.
There is no error when sending the message, but it is not published to Kafka.
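One thing I still need to rule out (just a guess on my part): whether the broker actually acknowledged the write, since messages still buffered in the producer are dropped silently if it is disposed without flushing. I plan to check the delivery report like this:
// Confirm the broker acknowledged the write before blaming the table side.
DeliveryResult<int, Currency> delivery = await producer.ProduceAsync("currency", message);
Console.WriteLine($"Delivered to {delivery.TopicPartitionOffset}, status: {delivery.Status}");
producer.Flush(TimeSpan.FromSeconds(10));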