
I am using this Kafka Connect Redis sink repository.

Explain: What I want to do is write data from Kafka topics into Redis using Docker. The repository's README explains how to submit the connector configuration:

curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors

connector.json file contains:

{
  "config" : {
    "name" : "RedisSinkConnector1",
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa"
  }
}

Problem: I know how to create a new topic in Kafka, but I don't know how to change the docker-compose file or how to test the connection. Although I have created a new topic in Kafka, nothing shows up in the Redis database!

I would be thankful if anyone could help me.

asked by Mostafa Ghadimi

2 Answers


For starters, there is no Kafka Connect container in the compose file there, so you'll need to add one, or start Kafka Connect outside of Docker on your host machine.

Then, it's not clear whether the Redis connector is properly loaded, so open http://localhost:8083/connector-plugins to check (this will also verify that you've started the Connect server).

Once that's done, you can POST your config (you will need to remove the -s flag that hides curl's output). Once posted, check the logs of the running Connect process, or go to http://localhost:8083/connectors/RedisSinkConnector1/status
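For example, each step above can be sanity-checked from the host with a few curl commands (a sketch, assuming Connect is published on localhost:8083 and the connector name from your JSON):

```shell
# Verify the Connect server is up and the Redis plugin is loaded
curl http://localhost:8083/connector-plugins | grep -i redis

# Submit the connector config (no -s, so errors are visible)
curl -X POST -H 'Content-Type: application/json' \
  --data @connector.json http://localhost:8083/connectors

# Check the connector and task state; look for RUNNING vs FAILED
curl http://localhost:8083/connectors/RedisSinkConnector1/status
```

If the first command prints nothing, the connector JAR isn't on the plugin path; if the last one returns a 404, the POST never succeeded.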

Given what you've shown, if you got this far, both of the above probably report a connection exception to localhost:6379, since that's the default connection. You'll have to provide "redis.hosts": "redis:6379" as a property (assuming your Redis container is named redis).

Then, as also mentioned in the documentation:

This connector expects records from Kafka to have a key and value that are stored as bytes or a string

So it wouldn't hurt to also add key and value converters to your properties to specify the data types. If you are using the Connect container from Confluent directly, it's probably set to use the Avro converter, not the string or bytes one.

Here's an example of a valid configuration that you can POST:

{
  "name" : "RedisSinkConnector1",
  "config" : {
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa",
    "redis.hosts": "redis:6379",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}

With those adjustments, I would expect any simple key-value message to work; then use redis-cli to run a SCAN or GET query.
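A quick end-to-end check might look like this (a sketch, assuming containers named kafka and redis, the topic mostafa, and the Apache Kafka distribution's kafka-console-producer.sh on the broker container):

```shell
# Produce a simple keyed record; parse.key/key.separator split
# "foo:bar" into key "foo" and value "bar"
docker exec -it kafka bash -c \
  "echo 'foo:bar' | /opt/kafka/bin/kafka-console-producer.sh \
     --broker-list localhost:9092 --topic mostafa \
     --property parse.key=true --property key.separator=:"

# Then confirm the key landed in Redis
docker exec -it redis redis-cli SCAN 0
docker exec -it redis redis-cli GET foo
```

The exact script path and broker address depend on your image, so adjust those to your compose file.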

– OneCricketeer
  • Would you please tell where to add `redis.hosts`? I don't know what the property really is. – Mostafa Ghadimi Sep 15 '19 at 14:47
  • 1
    In the json you posted. It's described in the README – OneCricketeer Sep 15 '19 at 14:48
  • Thanks, I try it in 2 hours. but I will give upvote for your descriptive answer. <3 – Mostafa Ghadimi Sep 15 '19 at 14:49
  • After adding the redis.host to connect.json file and then running the `curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors`, it still doesn't work properly. It even doesn't create a new topic in kafka. I have created a new topic in kafka container manually and then nothing was written in redis! – Mostafa Ghadimi Sep 17 '19 at 07:13
  • Should I map the connect.json file to the kafka container and then curl to the following address? curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors – Mostafa Ghadimi Sep 17 '19 at 07:20
  • 1
    Connect Sinks don't create topics, only read. And you'd probably want data in the topic before starting the connector anyway since it defaults to reading from the end of the topic. Meaning if you had data in the topic, then start the connector, you'd still see nothing... and it doesn't matter where the JSON file is located as long as the URL is correct – OneCricketeer Sep 17 '19 at 08:06
  • Sorry I didn't get. I have created a new topic. I have produced some data in that topic. and then run the `curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors` command. I didn't see anything in redis db using `redis-cli keys '*' ` – Mostafa Ghadimi Sep 17 '19 at 08:13
  • 1
    And there are still no errors in the logs and the /status endpoint says its `RUNNING`? – OneCricketeer Sep 17 '19 at 08:16
  • 1
    Neither `http://localhost:8083/connectors/RedisSinkConnector1/status` nor `curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors` prints nothing. – Mostafa Ghadimi Sep 17 '19 at 08:19
  • 1
    When you POST to `/connectors`, it requires both `name` and `config` at the top-level. The output should be the whole response. I don't know why you get nothing. https://docs.confluent.io/current/connect/references/restapi.html#post--connectors – OneCricketeer Sep 17 '19 at 08:22
  • 1
    Oh, you get nothing because you have `curl -s`... Remove the silence flag.`-s`... So you are probably getting an error in the POST, and the status endpoint is a 404 because connector doesn't exist – OneCricketeer Sep 17 '19 at 08:24
  • 1
    Yeah, You are right. I have get `curl: (7) Failed to connect to localhost port 8083: Connection refused`. How can I solve this problem? – Mostafa Ghadimi Sep 17 '19 at 08:32
  • 1
    Your Connect Server needs to be running – OneCricketeer Sep 17 '19 at 08:33
  • How can I run the connect server? sorry to ask so many questions. I am very new to these structure – Mostafa Ghadimi Sep 17 '19 at 08:34
  • Well... I don't know how you are testing anything... You have a docker compose file, right? Try `docker-compose up`? Or are you just doing `docker run`? – OneCricketeer Sep 17 '19 at 08:38
  • That repo you linked to doesn't have a Kafka Connect container in it, by the way – OneCricketeer Sep 17 '19 at 08:39
  • I'm using docker-compose up. then I try docker exec -it kafka bash. `cd opt/kafka/` and here I create a new topic and produce some data. finally in my local computer I run the `curl ...` – Mostafa Ghadimi Sep 17 '19 at 08:40
  • But they claimed that it does making the connection. – Mostafa Ghadimi Sep 17 '19 at 08:41
  • It will, if you 1) Actually run a Kafka Connect server 2) Build and install the Redis connector source code into that server... Seems like you skipped a lot of steps. We've used that connector at my job, with some changes, so I know it does work – OneCricketeer Sep 17 '19 at 08:43
  • I have connected so many databases to kafka and spark. But this one is very complicated and I really don't have any idea how to implement. Is it possible for you to create a repository? – Mostafa Ghadimi Sep 17 '19 at 08:45
  • I don't have a repo that uses Redis... Either you can learn how to [install the connector within a Docker container](https://docs.confluent.io/current/connect/userguide.html#connect-installing-plugins) Or you could use a different connector altogether that is already mostly configured. https://docs.lenses.io/connectors/sink/redis.html#lenses-quickstart – OneCricketeer Sep 17 '19 at 08:54
  • I have downloaded the jar file and copied to the `opt/kafka/kafka-connect-redis-1.2.2-2.1.0-all.jar`. What else I have to do? – Mostafa Ghadimi Sep 17 '19 at 09:14
  • You would need to find `connect-distributed.sh` script along with its property file, edit the property file, then run the script with it – OneCricketeer Sep 17 '19 at 09:21
  • Or you could just add another container that runs Kafka Connect (like your datagen container, or Debezium, or the one from Mongo). Each one of those is able to use the same Redis connector JAR and your connector config – OneCricketeer Sep 17 '19 at 09:23
  • Hi Jordan, I have tried the way you said but I couldn't create a proper connection. I would be thankful to guide me with a repository. Thanks for you support. <3 – Mostafa Ghadimi Sep 18 '19 at 08:55

The following configuration resolves the problem.

{
  "name" : "RedisSinkConnector1",
  "config" : {
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa",
    "redis.hosts": "redis:6379",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}

Add a kafka-connect service to the docker-compose file:

kafka-connect:
    hostname: kafka-connect
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect
    ports:
      - 8083:8083
    depends_on:
      - schema-registry
      - redis
    environment:
      CONNECT_BOOTSTRAP_SERVERS: kafka:9092
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: "quickstart-avro"
      CONNECT_CONFIG_STORAGE_TOPIC: "quickstart-avro-config"
      CONNECT_OFFSET_STORAGE_TOPIC: "quickstart-avro-offsets"
      CONNECT_STATUS_STORAGE_TOPIC: "quickstart-avro-status"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect"
      CONNECT_LOG4J_ROOT_LOGLEVEL: DEBUG
      CONNECT_PLUGIN_PATH: "/usr/share/java,/etc/kafka-connect/jars"
    volumes:
      - $PWD/jars:/etc/kafka-connect/jars

The `depends_on` entry for `redis` and the `CONNECT_REST_ADVERTISED_HOST_NAME` variable are very important in resolving this issue.
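With that service added, the stack can be brought up and the connector submitted like so (a sketch using the service names above; the REST API can take a minute to become available):

```shell
# Start (or restart) the stack with the new service
docker-compose up -d kafka-connect

# Once the REST API responds, confirm the Redis connector JAR
# mounted from ./jars was picked up on the plugin path
curl http://localhost:8083/connector-plugins

# Submit the connector configuration shown above
curl -X POST -H 'Content-Type: application/json' \
  --data @connector.json http://localhost:8083/connectors
```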

– Mehrdad Masoumi, edited by Mostafa Ghadimi