I'm creating a sink connector for MongoDB using MSK Connect. The data I'm publishing to the Kafka topic is Protobuf encoded. My connector configuration:

connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=MongoDBMSKDemo.Source
connection.uri=<mongodb connection url>
database=MongoDBMSKDemo
collection=Sink
key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
value.converter.schemas.enable=true
max.batch.size=10

But the data is not getting stored in MongoDB; the connector fails with the error below.

org.apache.kafka.connect.errors.DataException: Could not convert value `[B@51798b7f` into a BsonDocument.

Can anyone please tell me how to get through this?
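
For context: `[B@51798b7f` is just Java's default `toString()` of a raw `byte[]`, meaning the `ByteArrayConverter` handed the Protobuf bytes to the sink untouched, and the MongoDB sink cannot map arbitrary bytes to a `BsonDocument`. As a contrast, a minimal sketch of the converter lines that could suffice if the topic held plain JSON (hypothetical; not the case here):

```
# Hypothetical contrast case: only applicable if the topic held JSON, not Protobuf
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```

Since the topic holds Protobuf, a Protobuf-aware converter is needed instead; the comments below work toward that.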

  • That is because your `value.converter` is `ByteArrayConverter` – Hossein Torabi Jun 16 '23 at 11:17
  • Can you help me to correct it? – Sandun Sameera Jun 16 '23 at 11:17
  • [Your answer here says StringConverter works](https://stackoverflow.com/a/76482286/2308683), so why use ByteArrayConverter? But for Protobuf, have you tried ProtobufConverter instead? Can you share how you're serializing the data? Are you using Schema Registry? – OneCricketeer Jun 16 '23 at 12:37
  • @OneCricketeer Yes, it worked for StringConverter, but the data in my Kafka topic is Protobuf encoded. Does Apache have a Protobuf converter? If so, where can I put my Protobuf defs? – Sandun Sameera Jun 17 '23 at 17:37
  • Apache doesn't, but Confluent and BlueApron do have Protobuf converters you can find on the web. Again, are you using Schema Registry, or not? – OneCricketeer Jun 18 '23 at 12:18
  • Here's an example using Protobuf with AWS Glue and MSK: https://aws.amazon.com/blogs/big-data/introducing-protocol-buffers-protobuf-schema-support-in-amazon-glue-schema-registry/ – OneCricketeer Jun 18 '23 at 12:28
  • @OneCricketeer Yes, I created a registry in AWS Glue, but in the sink connector what should the `value.converter` be? Is it `io.confluent.connect.protobuf.ProtobufConverter`? Since I'm in the AWS context, would it work? – Sandun Sameera Jun 19 '23 at 05:01
  • It will work if you install that library and run Confluent Schema Registry on your own; otherwise, AWS Glue has a different package name for their converters https://github.com/awslabs/aws-glue-schema-registry/blob/master/protobuf-kafkaconnect-converter/src/main/java/com/amazonaws/services/schemaregistry/kafkaconnect/protobuf/ProtobufSchemaConverter.java – OneCricketeer Jun 19 '23 at 12:41
  • @OneCricketeer I tried the above and I'm getting ```There is an issue with the connector Code: InvalidInput.InvalidConnectorConfiguration Message: The connector configuration is invalid. Message: Connector configuration is invalid and contains the following 1 error(s): Invalid value com.amazonaws.services.schemaregistry.kafkaconnect.protobuf.ProtobufSchemaConverter for configuration value.converter: Class com.amazonaws.services.schemaregistry.kafkaconnect.protobuf.ProtobufSchemaConverter could not be found.``` – Sandun Sameera Jun 19 '23 at 15:22
  • My sink connector configuration is:

    ```
    connector.class=com.mongodb.kafka.connect.MongoSinkConnector
    value.converter=com.amazonaws.services.schemaregistry.kafkaconnect.protobuf.ProtobufSchemaConverter
    value.converter.schemas.enable=true
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter.registry.name=p8-proto-defs
    value.converter.schemaName=ExecutionReport.proto
    value.converter.endpoint=https://glue.us-east-1.amazonaws.com
    value.converter.region=us-east-2
    value.converter.schemaAutoRegistrationEnabled=true
    value.converter.dataFormat=PROTOBUF
    ```

    – Sandun Sameera Jun 19 '23 at 15:23
  • `Class ... could not be found` means you're missing a JAR file. It's not a config issue – OneCricketeer Jun 19 '23 at 22:17
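
Putting the comments together, a minimal sketch of a configuration using the AWS Glue Schema Registry Protobuf converter. The class name comes from the awslabs repo linked above; the region, registry name, and schema name are placeholders, and the property keys mirror the ones already tried in the thread:

```
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=MongoDBMSKDemo.Source
connection.uri=<mongodb connection url>
database=MongoDBMSKDemo
collection=Sink
key.converter=org.apache.kafka.connect.storage.StringConverter
# AWS Glue Schema Registry Protobuf converter; its JAR must be in the plugin ZIP (see note below)
value.converter=com.amazonaws.services.schemaregistry.kafkaconnect.protobuf.ProtobufSchemaConverter
value.converter.region=<aws region>
value.converter.registry.name=<glue registry name>
value.converter.schemaName=<schema name>
value.converter.dataFormat=PROTOBUF
```

The final `Class ... could not be found` error is the missing-JAR case OneCricketeer points out: MSK Connect only loads classes bundled into the custom-plugin ZIP, so the `protobuf-kafkaconnect-converter` JAR (with its dependencies) from the aws-glue-schema-registry project has to be packaged into the same ZIP as the MongoDB connector JARs.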

0 Answers