
After spending several days getting the Avro schema to resolve correctly when running the consumer, I am now stuck on the next step: writing the data to a MySQL table.

Both parts work separately: I can consume messages cleanly, and I can create my own stream and write a few test columns with values to MySQL, using Robin Moffatt's YouTube video as a template. But when I try to write the data from the external stream into MySQL, I always end up in a disconnect loop.

Maybe someone can help me out. Short version:

[2022-01-04 15:04:29,243] INFO [my_mysql_sink|task-0] Initializing writer using SQL dialect: MySqlDatabaseDialect (io.confluent.connect.jdbc.sink.JdbcSinkTask:67)
[2022-01-04 15:04:29,245] INFO [my_mysql_sink|task-0] WorkerSinkTask{id=my_mysql_sink-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:313)
[2022-01-04 15:04:29,246] INFO [my_mysql_sink|task-0] WorkerSinkTask{id=my_mysql_sink-0} Executing sink task (org.apache.kafka.connect.runtime.WorkerSinkTask:198)
[2022-01-04 15:04:29,470] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:29,726] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:29,987] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,260] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,524] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)

Long version:

  
[2022-01-04 15:04:29,159] INFO [my_mysql_sink|task-0] EnrichedConnectorConfig values:
        config.action.reload = restart
        connector.class = io.confluent.connect.jdbc.JdbcSinkConnector
        errors.deadletterqueue.context.headers.enable = false
        errors.deadletterqueue.topic.name =
        errors.deadletterqueue.topic.replication.factor = 3
        errors.log.enable = false
        errors.log.include.messages = false
        errors.retry.delay.max.ms = 60000
        errors.retry.timeout = 0
        errors.tolerance = all
        header.converter = null
        key.converter = class io.confluent.connect.avro.AvroConverter
        name = my_mysql_sink
        predicates = []
        tasks.max = 1
        topics = [my.topic]
        topics.regex =
        transforms = []
        value.converter = class io.confluent.connect.avro.AvroConverter
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2022-01-04 15:04:29,169] INFO [my_mysql_sink|task-0] ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 5000
        auto.offset.reset = earliest
        bootstrap.servers = [kafka1.pro.some.3url.net:9093, kafka2.pro.some.3url.net:9093, kafka3.pro.some.3url.net:9093]
        check.crcs = true
        client.dns.lookup = use_all_dns_ips
        client.id = connector-consumer-my_mysql_sink-0
        client.rack =
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = connect-my_mysql_sink
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        internal.throw.on.fetch.stable.offset.unsupported = false
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 45000
        socket.connection.setup.timeout.max.ms = 30000
        socket.connection.setup.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
        ssl.endpoint.identification.algorithm = https
        ssl.engine.factory.class = null
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.certificate.chain = null
        ssl.keystore.key = null
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLSv1.3
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.certificates = null
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
 (org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2022-01-04 15:04:29,216] WARN [my_mysql_sink|task-0] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2022-01-04 15:04:29,216] INFO [my_mysql_sink|task-0] Kafka version: 7.0.1-ccs (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-01-04 15:04:29,216] INFO [my_mysql_sink|task-0] Kafka commitId: 5defc2ba0be88d51 (org.apache.kafka.common.utils.AppInfoParser:120)
[2022-01-04 15:04:29,217] INFO [my_mysql_sink|task-0] Kafka startTimeMs: 1641305069216 (org.apache.kafka.common.utils.AppInfoParser:121)
[2022-01-04 15:04:29,231] INFO Created connector my_mysql_sink (org.apache.kafka.connect.cli.ConnectStandalone:109)
[2022-01-04 15:04:29,234] INFO [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Subscribed to topic(s): my.topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2022-01-04 15:04:29,235] INFO [my_mysql_sink|task-0] Starting JDBC Sink task (io.confluent.connect.jdbc.sink.JdbcSinkTask:48)
[2022-01-04 15:04:29,237] INFO [my_mysql_sink|task-0] JdbcSinkConfig values:
        auto.create = true
        auto.evolve = false
        batch.size = 3000
        connection.attempts = 3
        connection.backoff.ms = 10000
        connection.password = [hidden]
        connection.url = jdbc:mysql://localhost:3306/test
        connection.user = dbuser
        db.timezone = UTC
        delete.enabled = false
        dialect.name = MySqlDatabaseDialect
        fields.whitelist = []
        insert.mode = upsert
        max.retries = 10
        pk.fields = []
        pk.mode = none
        quote.sql.identifiers = ALWAYS
        retry.backoff.ms = 3000
        table.name.format = ${topic}
        table.types = [TABLE]
 (io.confluent.connect.jdbc.sink.JdbcSinkConfig:376)
[2022-01-04 15:04:29,243] INFO [my_mysql_sink|task-0] Initializing writer using SQL dialect: MySqlDatabaseDialect (io.confluent.connect.jdbc.sink.JdbcSinkTask:67)
[2022-01-04 15:04:29,245] INFO [my_mysql_sink|task-0] WorkerSinkTask{id=my_mysql_sink-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:313)
[2022-01-04 15:04:29,246] INFO [my_mysql_sink|task-0] WorkerSinkTask{id=my_mysql_sink-0} Executing sink task (org.apache.kafka.connect.runtime.WorkerSinkTask:198)
[2022-01-04 15:04:29,470] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:29,726] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:29,987] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,260] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,524] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,741] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:30,945] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:31,139] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:31,338] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:31,532] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:31,747] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:31,947] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:32,153] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:32,368] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:32,598] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:32,800] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:32,993] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:33,211] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:33,412] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:33,613] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:33,814] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:34,018] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:34,219] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:34,422] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:34,626] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:34,826] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka1.pro.some.3url.net:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:35,024] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka2.pro.some.3url.net:9093 (id: -2 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
^C[2022-01-04 15:04:35,174] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:67)
[2022-01-04 15:04:35,177] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:311)
[2022-01-04 15:04:35,181] INFO Stopped http_8083@4e224d5e{HTTP/1.1, (http/1.1)}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:381)
[2022-01-04 15:04:35,181] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:149)
[2022-01-04 15:04:35,222] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:328)
[2022-01-04 15:04:35,222] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:106)
[2022-01-04 15:04:35,223] WARN [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Bootstrap broker kafka3.pro.some.3url.net:9093 (id: -3 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)
[2022-01-04 15:04:35,224] INFO [my_mysql_sink|task-0] Stopping task my_mysql_sink-0 (org.apache.kafka.connect.runtime.Worker:823)
[2022-01-04 15:04:35,226] INFO [my_mysql_sink|task-0] Stopping task (io.confluent.connect.jdbc.sink.JdbcSinkTask:161)
[2022-01-04 15:04:35,226] INFO [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Resetting generation due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:966)
[2022-01-04 15:04:35,226] INFO [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Request joining group due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2022-01-04 15:04:35,227] INFO [my_mysql_sink|task-0] Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2022-01-04 15:04:35,227] INFO [my_mysql_sink|task-0] Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2022-01-04 15:04:35,227] INFO [my_mysql_sink|task-0] Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2022-01-04 15:04:35,229] INFO [my_mysql_sink|task-0] App info kafka.consumer for connector-consumer-my_mysql_sink-0 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2022-01-04 15:04:35,235] INFO [my_mysql_sink|worker] Stopping connector my_mysql_sink (org.apache.kafka.connect.runtime.Worker:376)
[2022-01-04 15:04:35,235] INFO [my_mysql_sink|worker] Scheduled shutdown for WorkerConnector{id=my_mysql_sink} (org.apache.kafka.connect.runtime.WorkerConnector:248)
[2022-01-04 15:04:35,235] INFO [my_mysql_sink|worker] Completed shutdown for WorkerConnector{id=my_mysql_sink} (org.apache.kafka.connect.runtime.WorkerConnector:268)
[2022-01-04 15:04:35,236] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:199)
[2022-01-04 15:04:35,236] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2022-01-04 15:04:35,236] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2022-01-04 15:04:35,237] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2022-01-04 15:04:35,237] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2022-01-04 15:04:35,237] INFO App info kafka.connect for 127.0.1.1:8083 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2022-01-04 15:04:35,237] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:220)
[2022-01-04 15:04:35,238] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:124)
[2022-01-04 15:04:35,238] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:72)
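For reference, the `EnrichedConnectorConfig` and `JdbcSinkConfig` values in the log above correspond to a sink connector properties file roughly like this (the schema registry URL lines are an assumption, since the converters require one but it does not appear in the logged config; the password was redacted in the log):

```properties
name=my_mysql_sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=my.topic
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
# Assumed: the Avro converters need a registry URL, but it is not shown in the log.
key.converter.schema.registry.url=http://localhost:8081
value.converter.schema.registry.url=http://localhost:8081
connection.url=jdbc:mysql://localhost:3306/test
connection.user=dbuser
# Value was logged as [hidden]
connection.password=<redacted>
auto.create=true
auto.evolve=false
insert.mode=upsert
pk.mode=none
errors.tolerance=all
```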
  • So, you do have a regular consumer that works? Do its properties match those of the `ConsumerConfig` section of Connect logs? Also, debug with getting `FileSinkConnector` to work since you're not getting errors with MySQL – OneCricketeer Jan 04 '22 at 15:30
  • Yes, I have a running avro-console-consumer, thanks at this point for your help. I have also been able to write the mysql sink connector with test data to the DB in a complete confluent environment as per the video. For the combination of avro.consumer and mysql-sink I created two properties files. The log also looks good up to the point executing sink task. See log. – Francois Jan 05 '22 at 00:52
  • As long as you're using the same bootstrap servers in the console consumer, and connector properties, it should work the same without connection issues – OneCricketeer Jan 05 '22 at 02:01
  • OK, I guess I forgot to specify the ssl setting also with the prefix consumer. Now I am one step further. Failed to deserialize data for topic ... to Avro > Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro key schema version for id 422 > Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401 – Francois Jan 05 '22 at 11:55
  • @OneCricketeer i think i'm close to get it work but something is still missing or wrong. Next to the base url for schema i have 3 subjects: some.url/subject/WorkoutKey, some.url/subject/KickValue and some.url/subject/MotionValue I tried to paste the url for key into key.converter.schema.registry.url and one of the Values to value.converter.schema.registry.url - connect is starting and in logfile i see that there is an offset. But if new message arrives i get: Error retrieving Avro key schema for id 422 ... RestClientException: HTTP 404 Not Found; error code: 404 – Francois Jan 05 '22 at 13:26
  • The registry url property cannot have any path in it, and thus the converters should not be using different addresses. If you're trying to change what subject is used, lookup the "strategy" properties for the converter / Avro (de)serializer. The default strategy requires `topic-key` and `topic-value` subjects, and this is also what the console consumer will use to find schemas. Also, the keys should only be strings or ints for mysql, not complex Avro – OneCricketeer Jan 05 '22 at 13:56
  • Well this would mean i do not have to use strategy because the console consumer works with schema.registry.url and shows the stream. But for sure i can confirm that subject for key and value is different to topic-key. How to set key.subject.name.strategy if topic is: gamerboot.gamer.master.workouts.clubs.spieleranalyse and key subject: gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKey – Francois Jan 05 '22 at 14:45
  • 1) Like I said (and the video shows), keys cannot be Avro. 2) `value.converter.value.subject.name.strategy` can be set to `RecordNameStrategy` – OneCricketeer Jan 05 '22 at 14:56
  • hmmm this confuses me. For KSQL in my own environment i think thats true but i was told that key and value are both in avro for schema. I will have to ask the producer site. – Francois Jan 05 '22 at 15:34
  • Maybe the key is Avro. I'm just saying that's not able to be written to Mysql as a primary key without some transformation procedure. In any case, `HTTP 404 Not Found` is a very clear error message that the ID isn't found, and you don't need connect to debug that since you can curl the registry directly for the ID in the error to see if it exists and for which subject its for – OneCricketeer Jan 05 '22 at 19:34
  • I thought it makes sense to create a new topic out of it with all information. https://stackoverflow.com/questions/70607570/kafka-connect-failed-to-deserialize-data-for-topic-error-retrieving-avro-key The original issue with the broken connection loop was the not defined additional consumer. prefix for the ssl connection. – Francois Jan 06 '22 at 13:11
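For later readers: the fix described in the comments amounts to repeating the broker security settings with the `consumer.` prefix in the standalone worker properties, so that the sink task's internal consumer also uses SSL instead of the `PLAINTEXT` default visible in the `ConsumerConfig` log above. A minimal sketch, where the truststore path and password are placeholders:

```properties
# Worker-level settings for Connect's own clients.
security.protocol=SSL
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=<truststore-password>
# Without these consumer.-prefixed copies, the sink task's consumer connects
# with PLAINTEXT to the SSL port 9093, producing the
# "Bootstrap broker ... disconnected" loop shown in the log.
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/path/to/truststore.jks
consumer.ssl.truststore.password=<truststore-password>
```

The follow-up "schema for id 422" error can be checked independently of Connect, as suggested above, by querying the registry directly, e.g. `curl http://<registry-host>:<port>/schemas/ids/422`.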

0 Answers