
I am using ksqlDB with a table of the following form:

KSQL-DB Query
create table currency (id integer,name varchar) with (kafka_topic='currency',partitions=1,value_format='avro');

C# model

public class Currency
{
    public int Id { get; set; }
    public string Name { get; set; }
}

Now I want to know how I should write/read data from this topic in C# using the Confluent library:

Writing

 IProducer<int, Currency> producer = ...;

 Currency cur = new Currency();

 Message<int, Currency> message = new Message<int, Currency>
            {
                Key = cur.Id,
                Timestamp = new Timestamp(DateTime.UtcNow, TimestampType.CreateTime),
                Value = cur
            };
 DeliveryResult<int, Currency> delivery = await this.producer.ProduceAsync(topic, message);
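For reference, a minimal producer sketch using the Confluent Avro serializer together with the Schema Registry. The broker and registry URLs are placeholders, and it assumes Currency is generated from an .avsc schema (i.e. implements ISpecificRecord); a plain hand-written POCO will not work with AvroSerializer<T>:

```csharp
// Sketch only: assumes Currency is avrogen-generated (ISpecificRecord).
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };
var registryConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };

using var schemaRegistry = new CachedSchemaRegistryClient(registryConfig);
using var producer = new ProducerBuilder<int, Currency>(producerConfig)
    .SetValueSerializer(new AvroSerializer<Currency>(schemaRegistry)) // fetches/registers the schema automatically
    .Build();

var cur = new Currency { Id = 1, Name = "USD" };
var delivery = await producer.ProduceAsync("currency",
    new Message<int, Currency> { Key = cur.Id, Value = cur });
```

The default key serializer for int writes a bare big-endian 4-byte value, which matches Kafka's standard integer serializer.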

Reading

IConsumer<int,Currency> consumer = new ConsumerBuilder<int, Currency>(config)
                .SetKeyDeserializer(Deserializers.Int32) // I assume I need to use the id from my DTO
                .SetValueDeserializer(...) // what deserializer?
                .Build();

ConsumeResult<int,Currency> result = consumer.Consume();

Currency message = // what deserializer? e.g. JsonSerializer.Deserialize<Currency>(result.Message.Value);

I am not sure how to go about this, so I tried looking for a serializer. I found this library, AvroSerializer, but I do not see where the author fetches the schema.

Any help on how to read from/write to a specific topic so that it matches my ksqlDB models?

Update

After some research and some answers here, I have started using the Schema Registry:

var config = new ConsumerConfig
            {
                GroupId = kafkaConfig.ConsumerGroup,
                BootstrapServers = kafkaConfig.ServerUrl,
                AutoOffsetReset = AutoOffsetReset.Earliest
            };
var schemaRegistryConfig = new SchemaRegistryConfig
            {
                Url = kafkaConfig.SchemaRegistryUrl
            };
var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);

IConsumer<int,Currency> consumer = new ConsumerBuilder<int, Currency>(config)
            .SetKeyDeserializer(new AvroDeserializer<int>(schemaRegistry).AsSyncOverAsync())
            .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
            .Build();

ConsumeResult<int, Currency> result = consumer.Consume();

Now I am getting another error:

Expecting data framing of length 5 bytes or more but total data size is 4 bytes

As someone kindly pointed out, it seems I am retrieving only the id from the Schema Registry.
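As the wire-format discussion in the answers suggests, the 4-byte payload points at the key: ksqlDB by default writes keys in its plain KAFKA format (a bare 4-byte big-endian int, with no Schema Registry framing), while only the value is Avro. A consumer sketch under that assumption:

```csharp
// Sketch: plain int key (no Avro framing), Avro value read via the Schema Registry.
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(Deserializers.Int32) // 4-byte key: not Avro, so no 5-byte framing expected
    .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
    .Build();
```

Here `config` and `schemaRegistry` are the objects constructed above.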

How can I simply run insert into currency (id,name) values (1,3); and retrieve it in C# as a POCO (listed above)?

Update 2

After I found this source program, it seems I am not able to publish messages to tables for some reason.

There is no error when sending the message, but it is not published to Kafka.

Bercovici Adrian

2 Answers


I found this library, AvroSerializer, but I do not see where the author fetches the schema.

It's unclear why you need a library other than the Confluent one, but it gets the schema from the Schema Registry. You can use CachedSchemaRegistryClient to get the schema string easily; however, you shouldn't need this in your code, as the deserializer will download the schema from the registry on its own.
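For illustration, fetching the registered value schema directly. This is a sketch: the registry URL is a placeholder, and the subject name assumes the default TopicNameStrategy, under which ksqlDB registers the value schema as `currency-value`:

```csharp
// Sketch: look up the latest value schema that ksqlDB registered for the topic.
using Confluent.SchemaRegistry;

var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://localhost:8081" });

var registered = await schemaRegistry.GetLatestSchemaAsync("currency-value");
Console.WriteLine(registered.SchemaString); // the Avro schema as a JSON string
```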

If you refer to the examples/ directory in the confluent-kafka-dotnet repo for specific Avro consumption, you can see they generate the User class from the User.avsc file, which seems to be exactly what you want to do here for Currency, rather than writing it yourself.
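For illustration only, an avrogen-generated class looks roughly like this. It is not the exact tool output; the uppercase field names and the inline schema JSON are assumptions based on the ksqlDB table above:

```csharp
// Rough shape of an avrogen-generated class (illustrative, not exact output).
public partial class Currency : Avro.Specific.ISpecificRecord
{
    public static Avro.Schema _SCHEMA = Avro.Schema.Parse(
        "{\"type\":\"record\",\"name\":\"Currency\",\"fields\":[" +
        "{\"name\":\"ID\",\"type\":\"int\"},{\"name\":\"NAME\",\"type\":\"string\"}]}");

    public virtual Avro.Schema Schema => _SCHEMA;
    public int ID { get; set; }
    public string NAME { get; set; }

    // ISpecificRecord members used by the Avro serdes to read/write fields by position.
    public object Get(int fieldPos) =>
        fieldPos switch { 0 => ID, 1 => NAME, _ => throw new Avro.AvroRuntimeException("Bad index") };

    public void Put(int fieldPos, object fieldValue)
    {
        if (fieldPos == 0) ID = (int)fieldValue; else NAME = (string)fieldValue;
    }
}
```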

OneCricketeer
  • Which library are you referring to as `the Confluent one`? I have not seen an example of how to serialize/deserialize a POCO. I just want an easy way to serialize my `POCO` C# model to Avro format, insert it into Kafka, transform it using `ksql`, and then consume from the output topic and deserialize from Avro to another C# model – Bercovici Adrian Jul 30 '21 at 06:50
  • [This library from Confluent](https://github.com/confluentinc/confluent-kafka-dotnet/tree/master/examples/AvroSpecific) (link to example) - Doesn't define a POCO. It generates it. That library is imported into the one linked in your question, so you'd have it installed, anyway. Also Avro in KSQL requires the Schema Registry which is **not** "plain Avro" – OneCricketeer Jul 30 '21 at 14:23
  • I have gotten past that part, and now when I am consuming messages it says that the minimum length is 5 bytes but I receive only 4 from Kafka, and then it throws, and I do not get why. – Bercovici Adrian Jul 30 '21 at 14:31
  • Like I said, data isn't "plain Avro" - At least 5 bytes are reserved for the Schema Registry data. https://docs.confluent.io/platform/current/schema-registry/serdes-develop/index.html#wire-format – OneCricketeer Jul 30 '21 at 14:33
  • Could you please be more explicit? I do not understand what I must do in order to fetch the entire message. I have updated my original post. – Bercovici Adrian Jul 30 '21 at 15:06
  • I do not use C#. I suggest following the README page there linked above first. If you still have issues - report here - https://github.com/confluentinc/confluent-kafka-dotnet/issues – OneCricketeer Jul 30 '21 at 16:20
  • I have followed the README, and now, while I can send messages to Kafka, it appears they are not published, although there is no error on the application side. – Bercovici Adrian Aug 04 '21 at 08:55
  • Did you include the `.ContinueWith(task =>` callback? And that didn't execute? I assume you flushed/closed the producer after sending the data (which is done after exiting the `using` block)? – OneCricketeer Aug 04 '21 at 14:50

I solved the problem by defining custom serializers, implementing the ISerializer<T> and IDeserializer<T> interfaces, which under the hood are just wrappers over System.Text.Json.JsonSerializer (or Newtonsoft.Json).

Serializer

public class MySerializer<T> : ISerializer<T>
{
     public byte[] Serialize(T data, SerializationContext context)
     {
          var str = System.Text.Json.JsonSerializer.Serialize(data); // you can also use Newtonsoft here
          var bytes = Encoding.UTF8.GetBytes(str);
          return bytes;
     }
}

// The IDeserializer<T> counterpart for consuming:
public class MyDeserializer<T> : IDeserializer<T>
{
     public T Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
     {
          return System.Text.Json.JsonSerializer.Deserialize<T>(Encoding.UTF8.GetString(data));
     }
}

Usage

   var config = new ConsumerConfig
                {
                    GroupId = kafkaConfig.ConsumerGroup,
                    BootstrapServers = kafkaConfig.ServerUrl,
                    AutoOffsetReset = AutoOffsetReset.Earliest
                };
    IConsumer<int,Currency> consumer = new ConsumerBuilder<int, Currency>(config)
            .SetValueDeserializer(new MyDeserializer<Currency>()) // the IDeserializer<T> counterpart of MySerializer<T>
            .Build();

ConsumeResult<int, Currency> result = consumer.Consume();
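The producer side mirrors this. A sketch, assuming the generic MySerializer<T> above and a topic named currency:

```csharp
// Sketch: produce JSON-serialized values with the custom serializer.
var producerConfig = new ProducerConfig { BootstrapServers = kafkaConfig.ServerUrl };

using var producer = new ProducerBuilder<int, Currency>(producerConfig)
    .SetValueSerializer(new MySerializer<Currency>()) // symmetric JSON serializer
    .Build();

await producer.ProduceAsync("currency",
    new Message<int, Currency> { Key = 1, Value = new Currency { Id = 1, Name = "USD" } });
```

Note that with this approach the messages on the topic are JSON, not Avro, so they will no longer be readable by a ksqlDB table declared with value_format='avro'.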

P.S

I am not even using the Schema Registry here after implementing the interfaces.

Bercovici Adrian