
I'm using the OpenTelemetry Collector agent to send logs to a Kafka streaming layer. I want to consume these log messages in a .NET Kafka consumer (for now), but I'm having a few issues. OTEL appears to use Protobuf serialization, which makes it a bit tricky. Note: eventually I want to send the logs to Elasticsearch via Kafka Connect, but one step at a time...

Firstly, is it possible to get the OpenTelemetry Kafka exporter to use JSON serialization? As mentioned, it looks like it uses Protobuf by default, and there doesn't appear to be a JSON serialization option for logs - see the OTEL Kafka exporter documentation:

[screenshot of the OTEL Kafka exporter documentation]

Alternatively, how can I consume log messages that are published to Kafka using Protobuf serialization by the OpenTelemetry Collector agent?

OTEL config is as follows:

...

exporters:
  kafka:
    brokers:
      - "kafka:9093"
    protocol_version: 2.6.2
    topic: logs

service:
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [kafka]

My consumer app is .NET Core. So far I've just got a basic producer/consumer example from the confluent-kafka-dotnet GitHub page working (fine for simple messages, but not for the Protobuf messages published by the OTEL agent). I managed to find a Protobuf consumer example here, but how would I go about generating the proto classes to deserialize the OTEL logs? I'm a bit lost here...

Ryan.Bartsch

1 Answer


The examples in the Confluent repo require that the Protobuf messages adhere to a wire format specific to the Confluent Schema Registry (the payload is prefixed with a magic byte and a schema ID).

Based on your OTEL configuration, the collector isn't registering schemas anywhere, so you'll need to write your own Deserializer implementation in C# for the raw Protobuf payloads.
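
A minimal sketch of such a deserializer, assuming you've already generated C# classes from the opentelemetry-proto repo (the `LogsData` type and the `OpenTelemetry.Proto.Logs.V1` namespace come from those generated files; the exporter's default OTLP payload should be wire-compatible with `LogsData`, since it shares the same `repeated ResourceLogs` field):

```csharp
using Confluent.Kafka;
using OpenTelemetry.Proto.Logs.V1; // generated from opentelemetry-proto

// Sketch: deserialize raw OTLP Protobuf bytes (no Confluent Schema Registry framing).
public class OtlpLogsDeserializer : IDeserializer<LogsData>
{
    public LogsData Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
    {
        if (isNull) return null;
        return LogsData.Parser.ParseFrom(data.ToArray());
    }
}

// Usage: plug it into the consumer builder from the confluent-kafka-dotnet examples.
// using var consumer = new ConsumerBuilder<Ignore, LogsData>(consumerConfig)
//     .SetValueDeserializer(new OtlpLogsDeserializer())
//     .Build();
```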

how would I go about generating the proto classes to deserialize the OTEL logs

You'd generate the objects from here, I assume

https://github.com/open-telemetry/opentelemetry-proto
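
For example, one way to generate the classes (a sketch, assuming a local clone of that repo next to your project, and the Google.Protobuf + Grpc.Tools NuGet packages — versions below are placeholders) is to let MSBuild run protoc for you via a .csproj fragment:

```xml
<!-- Hypothetical .csproj fragment: Grpc.Tools runs protoc at build time. -->
<ItemGroup>
  <PackageReference Include="Google.Protobuf" Version="3.21.0" />
  <PackageReference Include="Grpc.Tools" Version="2.50.0" PrivateAssets="All" />
</ItemGroup>
<ItemGroup>
  <!-- Point at a local clone of the opentelemetry-proto repo. -->
  <Protobuf Include="opentelemetry-proto\opentelemetry\proto\**\*.proto"
            ProtoRoot="opentelemetry-proto"
            GrpcServices="None" />
</ItemGroup>
```

The generated types (e.g. `LogsData`, `ResourceLogs`, `LogRecord`) then show up under the `OpenTelemetry.Proto.*` namespaces.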

OneCricketeer