My source config:

```json
{
  "name": "mysql-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "127.0.0.1",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "42",
    "database.server.name": "dbserverkk1",
    "database.dbname": "classicmodels",
    "database.include.list": "classicmodels",
    "database.allowPublicKeyRetrieval": "true",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "schema-changes.inv",
    "max.request.size": "104857600",
    "min.row.count.to.stream.results": "1000",
    "snapshot.mode": "initial"
  }
}
```
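I register the connector through the Kafka Connect REST API (assuming the worker is on the default REST port 8083; `source.json` is just the JSON above saved to a file):

```bash
curl -i -X POST -H "Content-Type: application/json" \
  --data @source.json http://localhost:8083/connectors/
```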
My sink config:

```json
{
  "name": "mysql-sink-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://127.0.0.1:3306/testmi",
    "tasks.max": "1",
    "connection.user": "debezium",
    "connection.password": "dbz",
    "insert.mode": "upsert",
    "auto.evolve": "true",
    "auto.create": "true",
    "value.converter.schemas.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "table.name.format": "offices",
    "topics": "kafka.classicmodels.offices",
    "pk.mode": "kafka"
  }
}
```
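I register the sink the same way and check its state with (again assuming the default REST port):

```bash
curl -s http://localhost:8083/connectors/mysql-sink-connector/status
```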
**The source config works fine and I can see the changes arriving on the topics, but I am unable to move the data from the topics into the MySQL DB.**

**I get the following error:**
```
Checking MySql dialect for type of TABLE "offices" (io.confluent.connect.jdbc.dialect.GenericDatabaseDialect:853)
[2022-02-02 15:07:02,915] INFO Setting metadata for table "offices" to Table{name='"offices"', type=TABLE columns=[Column{'state', isPrimaryKey=false, allowsNull=true, sqlType=VARCHAR}, Column{'postalCode', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'territory', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'country', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'addressLine2', isPrimaryKey=false, allowsNull=true, sqlType=VARCHAR}, Column{'phone', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'addressLine1', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'city', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'officeCode', isPrimaryKey=true, allowsNull=false, sqlType=VARCHAR}]} (io.confluent.connect.jdbc.util.TableDefinitions:64)
[2022-02-02 15:07:02,915] INFO Unable to find fields [SinkRecordField{schema=Schema{STRING}, name='__connect_topic', isPrimaryKey=true}, SinkRecordField{schema=Schema{kafka.classicmodels.offices.Value:STRUCT}, name='after', isPrimaryKey=false}, SinkRecordField{schema=Schema{INT64}, name='ts_ms', isPrimaryKey=false}, SinkRecordField{schema=Schema{STRING}, name='op', isPrimaryKey=false}, SinkRecordField{schema=Schema{io.debezium.connector.mysql.Source:STRUCT}, name='source', isPrimaryKey=false}, SinkRecordField{schema=Schema{INT32}, name='__connect_partition', isPrimaryKey=true}, SinkRecordField{schema=Schema{INT64}, name='__connect_offset', isPrimaryKey=true}, SinkRecordField{schema=Schema{kafka.classicmodels.offices.Value:STRUCT}, name='before', isPrimaryKey=false}, SinkRecordField{schema=Schema{STRUCT}, name='transaction', isPrimaryKey=false}] among column names [state, postalCode, territory, country, addressLine2, phone, addressLine1, city, officeCode] (io.confluent.connect.jdbc.sink.DbStructure:276)
```
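For reference, this is roughly what the records on the topic look like when I consume them. The field names (`before`, `after`, `source`, `op`, `ts_ms`, `transaction`) match the Debezium envelope fields listed in the error; the values here are only illustrative and the `schema` block is trimmed:

```json
{
  "schema": {
    "type": "struct",
    "name": "kafka.classicmodels.offices.Envelope"
  },
  "payload": {
    "before": null,
    "after": {
      "officeCode": "1",
      "city": "San Francisco"
    },
    "source": {
      "db": "classicmodels",
      "table": "offices"
    },
    "op": "c",
    "ts_ms": 1643814422915,
    "transaction": null
  }
}
```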
- The source is working perfectly.
- The topics are updated in real time.
- I was able to pull the data with the console consumer, and it comes through as JSON with both schema and payload (roughly the envelope sketched above).

How do I push this data into the new database?
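From the error, it looks like the sink is being handed the whole Debezium envelope (`before`/`after`/`op`/...) rather than a flattened row, so none of its fields match the `offices` columns. Would adding Debezium's `ExtractNewRecordState` SMT to the sink config, roughly like below, be the right fix? This is just a sketch I have not tested yet; `officeCode` is the table's primary key according to the log above:

```json
"transforms": "unwrap",
"transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
"transforms.unwrap.drop.tombstones": "false",
"pk.mode": "record_key",
"pk.fields": "officeCode"
```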