
I have created a connector for handling Avro data. I am able to publish data into the input topic, but I am not getting the data in the output topic. I have checked the logs of the connector and the REST Proxy; no errors are shown.

{
    "name": "sink-elastic_avro_2_topic",
    "config": {
        "connector.class": "io.confluent.connect.http.HttpSinkConnector",
        "headers": "Content-Type:application/vnd.kafka.json.v2+json|Accept:application/vnd.kafka.v2+json",
        "batch.max.size": "3000",
        "confluent.topic.bootstrap.servers": "broker:9092",
        "tasks.max": "3",
        "http.api.url": "http://xxx.xxx.xxx:8090/topics/avro_output_topic",
        "topics": "avro_input_topic",
        "request.method": "POST",
        "reporter.bootstrap.servers": "broker:9092",
        "regex.patterns": "^~$",
        "regex.separator": "~",
        "reporter.error.topic.name": "error-responses",
        "regex.replacements": "{\"key\" : \"${key}\" ,\"value\":~}",
        "reporter.result.topic.name": "success-responses",
        "batch.prefix": "{\"records\":[",
        "reporter.error.topic.replication.factor": "1",
        "consumer.override.auto.offset.reset": "latest",
        "confluent.topic.replication.factor": "1",
        "value.converter.schemas.enable": "false",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
        "batch.suffix": "]}",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "reporter.result.topic.replication.factor": "1"
    }
}

How do I publish Avro data to a topic using the REST Proxy?
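For reference, the REST Proxy's v2 produce API accepts Avro records directly, without any connector in between: you POST to `/topics/{topic}` with content type `application/vnd.kafka.avro.v2+json` and embed the Avro schema as an escaped JSON string. A minimal sketch of building such a request in Python (the URL, topic name, and record schema here are placeholders, not taken from the question):

```python
import json

# Hypothetical REST Proxy endpoint and topic; adjust to your deployment.
REST_PROXY_URL = "http://localhost:8082/topics/avro_output_topic"

# Example Avro record schema (placeholder fields).
value_schema = {
    "type": "record",
    "name": "Example",
    "fields": [
        {"name": "key", "type": "string"},
        {"name": "value", "type": "string"},
    ],
}

# Per the v2 produce API, the schema is embedded as an escaped JSON string
# alongside the records to publish.
payload = {
    "value_schema": json.dumps(value_schema),
    "records": [{"value": {"key": "k1", "value": "v1"}}],
}

headers = {
    "Content-Type": "application/vnd.kafka.avro.v2+json",
    "Accept": "application/vnd.kafka.v2+json",
}

body = json.dumps(payload)
# To actually send it (requires the `requests` package and a running proxy):
# requests.post(REST_PROXY_URL, data=body, headers=headers)
```

On success the proxy responds with the partition/offset of each record and a `value_schema_id` registered in Schema Registry, which can be reused in subsequent requests instead of the full schema.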

OneCricketeer

1 Answer


> publish Avro data to a topic using Rest proxy.

Using a Sink Connector to point at the Kafka REST Proxy doesn't make sense: you would be consuming from Kafka only to write back into Kafka.

You should instead be using a stream processor like Kafka Streams or ksqlDB to move data across topics within the same Kafka cluster.
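As an illustration of the ksqlDB route, the topic-to-topic copy could be expressed as two statements: one registering a stream over the existing topic, and one persistent query writing into the output topic. The stream names here are made up; the topic names come from the connector config above.

```sql
-- Register a stream over the existing input topic
-- (the Avro schema is fetched from Schema Registry).
CREATE STREAM avro_input WITH (
  KAFKA_TOPIC = 'avro_input_topic',
  VALUE_FORMAT = 'AVRO'
);

-- Persistent query that copies every record into the output topic.
CREATE STREAM avro_output WITH (
  KAFKA_TOPIC = 'avro_output_topic',
  VALUE_FORMAT = 'AVRO'
) AS SELECT * FROM avro_input;
```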

Between Kafka clusters, you can use tools like MirrorMaker.

OneCricketeer
  • Using Kafka Streams is not possible in my case. I have two environments and need to transfer data from one to the other. In the first environment the data is kept in a Kafka topic; I need to transfer it to a topic in the second environment. For this scenario I have to use the REST Proxy with a connector – Renjith K R Apr 06 '22 at 04:55
  • And you cannot use MirrorMaker because why? That's what it was designed for. https://github.com/apache/kafka/tree/trunk/connect/mirror – OneCricketeer Apr 06 '22 at 15:58
  • MirrorMaker is not a possible solution for my case. I have multiple Kafka clusters sending data, and on the receiving side only one Kafka cluster, which is hosted in a cloud environment. I think MirrorMaker works in a one-to-one cluster mode; mine is many-to-one. Please advise me if I am wrong – Renjith K R Apr 07 '22 at 06:03
  • Incorrect. Please see the documentation section on replication flows. One of them is "aggregation" (or fan-in) – OneCricketeer Apr 07 '22 at 12:54
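The fan-in topology discussed in the comments can be expressed in a single MirrorMaker 2 properties file: declare all cluster aliases, then enable a replication flow from each source cluster into the one target. The aliases and bootstrap addresses below are placeholders.

```properties
# mm2.properties - fan-in from two source clusters into one cloud cluster
# (cluster aliases and addresses are examples, not from the question)
clusters = siteA, siteB, cloud

siteA.bootstrap.servers = site-a-broker:9092
siteB.bootstrap.servers = site-b-broker:9092
cloud.bootstrap.servers = cloud-broker:9092

# One replication flow per source cluster, all targeting "cloud"
siteA->cloud.enabled = true
siteA->cloud.topics = avro_input_topic

siteB->cloud.enabled = true
siteB->cloud.topics = avro_input_topic

# For a single-broker test setup
replication.factor = 1
```

Run with `connect-mirror-maker.sh mm2.properties`. By default MM2 prefixes replicated topics with the source cluster alias (e.g. `siteA.avro_input_topic`), which also keeps the two fan-in sources from colliding on the target.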