For starters, there is no Kafka Connect container in the compose file there, so you'll need to add one, or start Kafka Connect outside of Docker on your host machine.
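If you want Connect inside the same compose file, a rough sketch of such a service is below; it goes under services: alongside your existing containers. The image tag, the kafka:9092 bootstrap address, and the confluent-hub coordinates are assumptions you'd adapt to your own file.

connect:
  image: confluentinc/cp-kafka-connect:7.5.0
  depends_on:
    - kafka
    - redis
  ports:
    - "8083:8083"
  environment:
    CONNECT_BOOTSTRAP_SERVERS: kafka:9092
    CONNECT_REST_ADVERTISED_HOST_NAME: connect
    CONNECT_GROUP_ID: connect-cluster
    CONNECT_CONFIG_STORAGE_TOPIC: _connect-configs
    CONNECT_OFFSET_STORAGE_TOPIC: _connect-offsets
    CONNECT_STATUS_STORAGE_TOPIC: _connect-status
    CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
    CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
    CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.storage.StringConverter
    CONNECT_PLUGIN_PATH: /usr/share/java,/usr/share/confluent-hub-components
  command:
    - bash
    - -c
    - |
      # install the Redis sink connector, then start the Connect worker
      confluent-hub install --no-prompt jcustenborder/kafka-connect-redis:latest
      /etc/confluent/docker/run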
Then, it's not clear if you've gotten the Redis connector properly loaded, so open http://localhost:8083/connector-plugins to see if it is (this will also verify that you've started the Connect server).
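For example:

curl http://localhost:8083/connector-plugins

You should see com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector somewhere in the returned JSON if the plugin is on the worker's plugin path.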
Once that's done, you can post your config (you will need to remove the -s flag that silences the curl output). Once posted, check the logs of the running Connect process, or open http://localhost:8083/connectors/RedisSinkConnector1/status
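For example, if the Connect service ends up being named connect in your compose file (an assumption on my part), that check would look like:

# tail the worker logs
docker compose logs -f connect
# or ask the REST API for the connector/task state
curl http://localhost:8083/connectors/RedisSinkConnector1/status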
Given what you've shown, and assuming you got that far, both of the above will probably report a connection exception to localhost:6379, since that's the connector's default. You'll have to provide "redis.hosts": "redis:6379" (the hostname of your Redis service in the compose file) as a property.
Then, as also mentioned in the documentation:
This connector expects records from Kafka to have a key and value that are stored as bytes or a string
So it wouldn't hurt to also add key and value converters to your properties to specify the data types. If you are using the Connect container from Confluent directly, it's probably set to use the Avro converter, not the string or bytes ones.
Here's an example of a valid configuration that you can POST
{
  "name": "RedisSinkConnector1",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max": "1",
    "topics": "mostafa",
    "redis.hosts": "redis:6379",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}
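For example, assuming you save that JSON as redis-sink.json on the machine where you run curl:

curl -X POST -H "Content-Type: application/json" --data @redis-sink.json http://localhost:8083/connectors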
With those adjustments, I would expect sending any simple key-value message to the topic to work; you can then use redis-cli to run a SCAN or GET query against the keys.
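For instance, assuming the Redis service is named redis in the compose file:

# list keys written by the sink
docker compose exec redis redis-cli --scan
# fetch one value; replace <your-record-key> with a key you actually produced
docker compose exec redis redis-cli GET <your-record-key>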