I am trying to set up a Kafka Connect sink connector to move data from a Kafka topic into MongoDB. For this I have added the required configuration to the connect-json-standalone.properties file and to the connect-mongodb-sink.properties file in the Kafka folder. When I start the connector (roughly as shown below), I get the exception that follows.
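I launch the standalone worker with something like this (the exact script name and paths are from my setup and may differ):

./bin/connect-standalone ./etc/kafka/connect-json-standalone.properties ./etc/kafka/connect-mongodb-sink.properties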
[2019-07-23 18:07:17,274] INFO Started o.e.j.s.ServletContextHandler@76e3b45b{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:855)
[2019-07-23 18:07:17,274] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:231)
[2019-07-23 18:07:17,274] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2019-07-23 18:07:17,635] INFO Cluster created with settings {hosts=[localhost:27017], mode=MULTIPLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500} (org.mongodb.driver.cluster:71)
[2019-07-23 18:07:17,636] INFO Adding discovered server localhost:27017 to client view of cluster (org.mongodb.driver.cluster:71)
[2019-07-23 18:07:17,760] INFO Closing all connections to repracli/localhost:27017 (io.debezium.connector.mongodb.ConnectionContext:86)
[2019-07-23 18:07:17,768] ERROR Failed to create job for ./etc/kafka/connect-mongodb-sink.properties (org.apache.kafka.connect.cli.ConnectStandalone:104)
[2019-07-23 18:07:17,769] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:115)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
A value is required
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:112)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
A value is required
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.runtime.AbstractHerder.maybeAddConfigErrors(AbstractHerder.java:423)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:188)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:109)
[2019-07-23 18:07:17,782] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:66)
[2019-07-23 18:07:17,782] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:239)
[2019-07-23 18:07:17,790] INFO Stopped http_localhost8084@5f96f6a2{HTTP/1.1,[http/1.1]}{localhost:8084} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-07-23 18:07:17,790] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-07-23 18:07:17,792] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:256)
[2019-07-23 18:07:17,793] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:94)
[2019-07-23 18:07:17,793] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:185)
[2019-07-23 18:07:17,794] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-07-23 18:07:17,796] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:206)
[2019-07-23 18:07:17,799] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:111)
[2019-07-23 18:07:17,800] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:71)
I have tried to solve this by changing the connection.uri in connect-mongodb-sink.properties in several ways, but none of them worked. I have also gone through the links below, which did not solve my problem either.
Referral links: https://groups.google.com/forum/#!topic/debezium/bC4TUld5NGw and https://github.com/confluentinc/kafka-connect-jdbc/issues/334
connect-json-standalone.properties:
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schema.registry.url=http://localhost:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
connect-mongodb-sink.properties:
name=mongodb-sink-connector
connector.class=io.debezium.connector.mongodb.MongoDbConnector
tasks.max=1
topics=sample-consumerr-sink-topic
type.name=kafka-connect
mongodb.hosts=repracli/localhost:27017
mongodb.collection=conn_mongo_sink_collc
mongodb.connection.uri=mongodb://localhost:27017/conn_mongo_sink_db?w=1&journal=true
I want the sink connector to work so that the topic data is consumed into the MongoDB collection named "conn_mongo_sink_collc". Can anyone help me resolve this error?
Note: I am using a 3-member MongoDB replica set in which port 27017 is the primary and 27018 and 27019 are the secondaries.
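For reference, the error message points to the config validation endpoint, so the same failure should be reproducible against the worker's REST API (port 8084 in my setup; the connector class in the URL is the one from my sink properties):

curl -X PUT -H "Content-Type: application/json" \
  http://localhost:8084/connector-plugins/io.debezium.connector.mongodb.MongoDbConnector/config/validate \
  -d '{
        "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
        "tasks.max": "1",
        "topics": "sample-consumerr-sink-topic",
        "type.name": "kafka-connect",
        "mongodb.hosts": "repracli/localhost:27017",
        "mongodb.collection": "conn_mongo_sink_collc",
        "mongodb.connection.uri": "mongodb://localhost:27017/conn_mongo_sink_db?w=1&journal=true"
      }'

The response should list the same "A value is required" error against whichever configuration key it considers missing.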