
My source config:

{
  "name": "mysql-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "127.0.0.1",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "42",
    "database.server.name": "dbserverkk1",
    "database.dbname": "classicmodels",
    "database.include.list": "classicmodels",
    "database.allowPublicKeyRetrieval": "true",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "schema-changes.inv",
    "max.request.size": "104857600",
    "min.row.count.to.stream.results": "1000",
    "snapshot.mode": "initial"
  }
}
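
The converters are not set on this connector, so the Connect worker defaults apply. As a point of reference, a minimal sketch of per-connector converter overrides, assuming plain JSON with schemas enabled (which matches the schema-and-payload JSON seen in the console consumer), would look like the following additional properties inside "config":

    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"

These are illustrative only; the actual worker defaults may already provide equivalent settings.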

My sink config:

{
  "name": "mysql-sink-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://127.0.0.1:3306/testmi",
    "tasks.max": "1",
    "connection.user": "debezium",
    "connection.password": "dbz",
    "insert.mode": "upsert",
    "auto.evolve": "true",
    "auto.create": "true",
    "value.converter.schemas.enable": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "table.name.format": "offices",
    "topics": "kafka.classicmodels.offices",
    "pk.mode": "kafka"
  }
}
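
Note that "pk.mode": "kafka" keys each row by the Kafka coordinates, so the sink expects __connect_topic, __connect_partition and __connect_offset columns in the target table, which is exactly what the error below complains about. If the intent is to key on the table's own primary key instead, a sketch of the alternative settings (assuming the Debezium record key carries officeCode) would be:

    "pk.mode": "record_key",
    "pk.fields": "officeCode"

This is only an illustration of the option, not a confirmed fix for this setup.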

**The source config works fine and I can see the changes being sent to the topics, but I am unable to move the data from the topics into the MySQL DB.**

**I get the following error:**

    Checking MySql dialect for type of TABLE "offices" (io.confluent.connect.jdbc.dialect.GenericDatabaseDialect:853)
    [2022-02-02 15:07:02,915] INFO Setting metadata for table "offices" to Table{name='"offices"', type=TABLE columns=[Column{'state', isPrimaryKey=false, allowsNull=true, sqlType=VARCHAR}, Column{'postalCode', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'territory', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'country', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'addressLine2', isPrimaryKey=false, allowsNull=true, sqlType=VARCHAR}, Column{'phone', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'addressLine1', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'city', isPrimaryKey=false, allowsNull=false, sqlType=VARCHAR}, Column{'officeCode', isPrimaryKey=true, allowsNull=false, sqlType=VARCHAR}]} (io.confluent.connect.jdbc.util.TableDefinitions:64)
    [2022-02-02 15:07:02,915] INFO Unable to find fields [SinkRecordField{schema=Schema{STRING}, name='__connect_topic', isPrimaryKey=true}, SinkRecordField{schema=Schema{kafka.classicmodels.offices.Value:STRUCT}, name='after', isPrimaryKey=false}, SinkRecordField{schema=Schema{INT64}, name='ts_ms', isPrimaryKey=false}, SinkRecordField{schema=Schema{STRING}, name='op', isPrimaryKey=false}, SinkRecordField{schema=Schema{io.debezium.connector.mysql.Source:STRUCT}, name='source', isPrimaryKey=false}, SinkRecordField{schema=Schema{INT32}, name='__connect_partition', isPrimaryKey=true}, SinkRecordField{schema=Schema{INT64}, name='__connect_offset', isPrimaryKey=true}, SinkRecordField{schema=Schema{kafka.classicmodels.offices.Value:STRUCT}, name='before', isPrimaryKey=false}, SinkRecordField{schema=Schema{STRUCT}, name='transaction', isPrimaryKey=false}] among column names [state, postalCode, territory, country, addressLine2, phone, addressLine1, city, officeCode] (io.confluent.connect.jdbc.sink.DbStructure:276)


  1. The source is working perfectly.
  2. The topics are being updated in real time.
  3. I was able to pull the data with the console consumer, and it came through with schema and payload in JSON format. How do I push it into a new database? (A possible approach is sketched below.)
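
The error above shows the sink receiving the full Debezium change-event envelope (before/after/op/source) rather than flat rows, which is also what the comment below points out. One possible fix, assuming the Debezium transforms jar is on the sink worker's classpath, is to unwrap the envelope with the ExtractNewRecordState SMT by adding something like this to the sink "config":

    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false"

With the envelope unwrapped, the record value would contain only the row columns, so the upsert should line up with the existing offices table; this is a sketch, not a verified configuration for this setup.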
  • Hi, please have a look at this thread, if it helps: https://stackoverflow.com/questions/45928768/kafka-connect-jdbc-sink-connector-not-working/45940013#45940013 – Raushan Kumar Feb 01 '22 at 06:30
  • You have not specified what converters Debezium is using. Is it also JSON? Can you share what your records actually look like and what you expect the output mysql table to be based on that data? For example, the Debezium data has an `op` field which is not originally in the table. If you're literally trying to copy the original rows, then you need to transform the Debezium event to remove extra information – OneCricketeer Feb 01 '22 at 13:33
  • I have my data in the topics with schema and payload in JSON format. Do I need anything more or less to push it to the new DB? I was able to do the bulk source and sink without any issues, but it started to create multiple tables. I want to apply the changes from the live DB to the one I am trying to replicate. – Rooban Kumar Feb 01 '22 at 23:05
