I'm using the OpenTelemetry collector agent to send logs to a Kafka streaming layer. I want to consume these log messages in a .NET Kafka consumer (for now), but I'm having a few issues. OTEL appears to use Protobuf serialization, which makes this a bit tricky. Note: eventually I want to send the logs to Elasticsearch via Kafka Connect, but one step at a time...
Firstly, is it possible to get the OpenTelemetry Kafka exporter to use JSON serialization? As mentioned, it looks like it uses Protobuf by default and there doesn't appear to be a JSON serialization option for logs - see OTEL Kafka exporter documentation:
Alternatively, how can I consume log messages that are published to Kafka using Protobuf serialization by the OpenTelemetry collector agent?
OTEL config is as follows:
...
exporters:
  kafka:
    brokers:
      - "kafka:9093"
    protocol_version: 2.6.2
    topic: logs
service:
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [kafka]
My consumer app is .NET Core. So far I've just got a basic producer/consumer example from the confluent-kafka-dotnet GitHub page (all working for simple messages, but not for the Protobuf messages published by the OTEL agent). I managed to find a Protobuf consumer example here, but how would I go about generating the proto classes needed to deserialize the OTEL logs? I'm a bit lost here...
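In case it helps, this is the rough shape of what I'm attempting. It's only a sketch based on my current understanding: it assumes C# classes have been generated with protoc from the logs/common/resource .proto files in the opentelemetry-proto repo, plus the Confluent.Kafka and Google.Protobuf NuGet packages. I'm also assuming the Kafka payload is an OTLP Protobuf message whose wire format matches `LogsData` (a single `repeated ResourceLogs` field); the exact field/class names (e.g. `ScopeLogs`) depend on which version of the proto files you generate from.

```csharp
using System;
using Confluent.Kafka;
using Google.Protobuf;
using OpenTelemetry.Proto.Logs.V1; // generated by protoc from logs.proto (assumed namespace)

class OtlpLogConsumer
{
    static void Main()
    {
        var config = new ConsumerConfig
        {
            BootstrapServers = "kafka:9093",
            GroupId = "otlp-log-consumer",
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        // Consume the message value as raw bytes -- the OTEL collector's kafka
        // exporter publishes Protobuf-serialized OTLP payloads, not JSON.
        using var consumer = new ConsumerBuilder<Ignore, byte[]>(config).Build();
        consumer.Subscribe("logs");

        while (true)
        {
            var result = consumer.Consume();

            // Parse the raw bytes with the generated Protobuf parser.
            var logs = LogsData.Parser.ParseFrom(result.Message.Value);

            // Walk the OTLP hierarchy: ResourceLogs -> ScopeLogs -> LogRecords.
            foreach (var resourceLogs in logs.ResourceLogs)
                foreach (var scopeLogs in resourceLogs.ScopeLogs)
                    foreach (var record in scopeLogs.LogRecords)
                        Console.WriteLine(record.Body?.StringValue);
        }
    }
}
```

Is this the right general approach, or is there a ready-made NuGet package with the generated OTLP classes so I don't have to run protoc myself?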