I am using Confluent Platform 5.2.1 and the JDBC source connector to read data from a MySQL table. I have made all the required config changes, and both the JDBC source connector and the MySQL driver JAR are placed in the share/java/kafka-connect-jdbc directory. I have successfully started all the processes, i.e. ZooKeeper, Schema Registry, Kafka Connect, etc. When I try to submit the connector configuration to start reading data from MySQL, I get the error below:

{"error_code":400,"message":"Connector configuration is invalid and contains the following 2 error(s):\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/test?user=test_user&password=Welc0me! for configuration Couldn't open connection to jdbc:mysql://localhost:3306/test?user=test_user&password=Welc0me!\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/test?user=test_user&password=Welc0me! for configuration Couldn't open connection to jdbc:mysql://localhost:3306/test?user=test_user&password=Welc0me!\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"}

I have gone through many blogs, from Confluent itself as well as other websites, but could not find a concrete solution. The suggestion everywhere is to place the driver JAR in the share/java/kafka-connect-jdbc directory, which I have already done.
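For context, this is roughly how I am setting things up before starting Connect (the install path and driver JAR filename below are from my environment and may differ on yours):

```shell
# Roughly what I run before starting Connect. The install path and the
# driver JAR filename are assumptions -- adjust them to your environment.
CONFLUENT_HOME=${CONFLUENT_HOME:-/opt/confluent-5.2.1}

# CLASSPATH points at the driver JAR file itself (not just the folder),
# exported in the same shell session that then launches Connect.
export CLASSPATH="$CONFLUENT_HOME/share/java/kafka-connect-jdbc/mysql-connector-java-8.0.16.jar"
echo "$CLASSPATH"
```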

Additional information:

  1. I am using MySQL 8.0.16, and the driver JAR is also 8.0.16, to avoid compatibility issues.
  2. I have set the classpath and plugin.path variable to the same folder where the jars are installed.
  3. I am running in standalone mode.
  4. Below are the connector properties I am using:
{ 
        "name": "jdbc_source_mysql_foobar_01",
        "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "key.converter": "io.confluent.connect.avro.AvroConverter",
                "key.converter.schema.registry.url": "http://localhost:8081",
                "value.converter": "io.confluent.connect.avro.AvroConverter",
                "value.converter.schema.registry.url": "http://localhost:8081",
                "connection.url": "jdbc:mysql://localhost:3306/test?user=test_user&password=Welc0me!",
                "table.whitelist": "foobar",
                "mode": "timestamp",
                "timestamp.column.name": "update_ts",
                "validate.non.null": "false",
                "topic.prefix": "mysql-"
        }
}
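In case the special characters in the embedded credentials (the `&` and `!`) are a factor, the same config can also be written with the credentials moved out of the URL into the connector's separate `connection.user` and `connection.password` properties. This is just a sketch of that variation, not something I have confirmed fixes the error:

```
{
        "name": "jdbc_source_mysql_foobar_01",
        "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "key.converter": "io.confluent.connect.avro.AvroConverter",
                "key.converter.schema.registry.url": "http://localhost:8081",
                "value.converter": "io.confluent.connect.avro.AvroConverter",
                "value.converter.schema.registry.url": "http://localhost:8081",
                "connection.url": "jdbc:mysql://localhost:3306/test",
                "connection.user": "test_user",
                "connection.password": "Welc0me!",
                "table.whitelist": "foobar",
                "mode": "timestamp",
                "timestamp.column.name": "update_ts",
                "validate.non.null": "false",
                "topic.prefix": "mysql-"
        }
}
```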

Kindly let me know if any other input is needed from my side; I am stuck and completely blocked. Any help will be appreciated.

Robin Moffatt
Sachchidanand Singh
  • *"I have set the classpath and plugin.path variable to the same folder where the jars are installed"*. Classpath must refer to the jar file, not the folder. – Andreas Jun 01 '19 at 23:20
  • Thanks for looking in to, mistake form my side while typing, actually I am pointing to jar file in Classpath. – Sachchidanand Singh Jun 01 '19 at 23:22
  • 1
    Adjust your log levels, look if the thing is loaded https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector – OneCricketeer Jun 02 '19 at 01:43
  • @cricket_007 I tried changing the log level to DEBUG in log4j.prop, but since I am new to this I could not find where to see the logs. Could you please guide me where to look for the logs as mentioned in the blog? – Sachchidanand Singh Jun 02 '19 at 06:03
  • It depends on your log4j file. If you're not explicitly writing to a file, and instead stdout, the logs are sent directly to the terminal where you start connect-distributed. – OneCricketeer Jun 02 '19 at 15:27
  • @SachchidanandSingh did you get this working? Did you find your logs as cricket_007 suggested? – Robin Moffatt Jun 07 '19 at 09:17
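Following up on the comments about logs: in standalone mode the Connect worker writes to stdout by default, so one way to check whether the JDBC plugin (and with it the driver) was picked up is to capture stdout to a file and grep it for the plugin-loading line. A sketch, where the `Added plugin` log line is simulated for illustration (at startup the real worker prints a similar line for each plugin it discovers):

```shell
# Simulated worker log line -- in practice this comes from redirecting
# the connect-standalone stdout to connect.log when starting it.
echo "INFO Added plugin 'io.confluent.connect.jdbc.JdbcSourceConnector'" > connect.log

# Search the captured log for evidence the JDBC plugin was loaded.
grep -i "added plugin.*jdbcsourceconnector" connect.log
```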

0 Answers