
I am new to Kafka. I have just configured a standalone Kafka connector by following the steps in the Confluent documentation.

The connector's job is to sync data from a file, test.txt. If I update anything in test.txt, it is not automatically pushed to the topic connect-test; however, if I restart the connector, the updated messages from test.txt are detected and pushed to connect-test.

So my question is: do I have to schedule this connector to detect changes, or does it do this automatically on some internal interval?
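For reference, my standalone file source connector is configured with a properties file along these lines (a minimal sketch based on the Connect quickstart; the connector name and file path are illustrative):

```
# connect-file-source.properties (names and paths are illustrative)
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test
```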

Viacheslav Shalamov
RKP

1 Answer


Short answer: a running connector syncs automatically.
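The FileStreamSource connector tails the file while the standalone worker is running. For context, the worker is typically launched like this (the config paths below are the quickstart defaults and may differ in your setup):

```shell
$ connect-standalone.sh config/connect-standalone.properties \
    config/connect-file-source.properties
```

As long as this process stays up, lines appended to the source file are picked up without any restart.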

If I understand you correctly, you want the content of your file test.txt to be pushed to Kafka, so that new data appended to the file keeps flowing into the topic.

Start a console producer to push the current content of the file to your topic:

$ kafka-console-producer.sh --broker-list localhost:9092 --topic connect-test < test.txt

Append a new message:

$ echo "new message" >> test.txt

And this message will show up in the topic.

If your kafka-connector is running and configured properly, it will pick up the appended lines from test.txt and push them to the topic.
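You can verify what has landed in the topic with a console consumer (assuming a broker on localhost:9092; on older Kafka versions the flag is --zookeeper instead of --bootstrap-server):

```shell
$ kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic connect-test --from-beginning
```

Each line appended to test.txt while the connector is running should appear here as a new record.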

Viacheslav Shalamov
  • Let's say I have configured the JDBC connector with Kafka. Whatever events happen to the database, the JDBC connector will send them to the cluster (I assume). If OP modifies the content of a line, i.e. "new message" to "latest message", would the connector identify such an update and send it to the broker? – JR ibkr Mar 11 '19 at 14:17
  • I don't think the default file connector supports that feature, and writing a custom connector to do such a thing would be hard as well. However, if the data is streamed into a database, then Kafka's JDBC connector would be able to identify such events. – JR ibkr Mar 11 '19 at 14:19