
I am ingesting data into a Kafka cluster that should send the data to an ADX (Kusto) database using kafka-sink-azure-kusto.

I am successfully ingesting data into the Kafka cluster, but it is not being transferred to the Kusto database. How can I debug this? Are there any logs I can check?

I have checked the broker log and there are no errors there.

Ref: https://github.com/Azure/kafka-sink-azure-kusto/blob/master/README.md

king

1 Answer


Could you please provide more information about how you are running Kafka and how you set up the connector?

Debugging steps would be:

  1. The broker logs should mention that the connector was picked up properly; did you see that line in the logs?

  2. The connector logs should show more information about what is actually going on under the hood; you may see errors there. Check /var/log/connect-distributed.log, and query the worker's REST API for the connector status (see the sketch after this list).

  3. Try to ingest data via another method, such as one of the SDKs.

  4. Try running the setup according to the steps detailed under "Deploy" in the README.
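
For steps 1 and 2, a quick way to see whether the connector and its tasks were actually created and are running is the Connect worker's REST API. This is a minimal sketch, assuming the worker's REST port is 8083 and the connector is registered under the name KustoSinkConnector (both taken from the setup described in the comments below):

    # List the connectors the worker knows about -- the Kusto sink should appear here
    curl -s http://localhost:8083/connectors

    # Show the state of the connector and each of its tasks (RUNNING vs FAILED, with a stack trace on failure)
    curl -s http://localhost:8083/connectors/KustoSinkConnector/status

    # Tail the worker log from step 2 and filter for the Kusto sink
    tail -f /var/log/connect-distributed.log | grep -i kusto

If a task shows FAILED, the trace attached to the status output is usually the fastest pointer to the root cause.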

Update: more information about connector setup in general can be found in this SO question: Kafka connect cluster setup or launching connect workers

Also, Confluent has some helpful docs: https://docs.confluent.io/current/connect/userguide.html
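
As an alternative to the web UI, the same connector configuration can be registered directly against the Connect worker's REST API. A minimal sketch, assuming the worker listens on localhost:8083 and reusing the configuration values from the comments below (the remaining kusto.* settings are omitted for brevity and would need to be filled in):

    # Register the Kusto sink by POSTing its configuration as JSON
    curl -X POST http://localhost:8083/connectors \
      -H "Content-Type: application/json" \
      -d '{
            "name": "KustoSinkConnector",
            "config": {
              "connector.class": "com.microsoft.azure.kusto.kafka.connect.sink.KustoSinkConnector",
              "tasks.max": "2",
              "topics": "kafka-adx-poc",
              "kusto.sink.flush_interval_ms": "300000",
              "key.converter": "org.apache.kafka.connect.storage.StringConverter",
              "value.converter": "org.apache.kafka.connect.storage.StringConverter"
            }
          }'

A DELETE to /connectors/KustoSinkConnector removes it again, which makes it easy to iterate on the configuration.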

Daniel Dror
  • I am running Kafka like below: docker run --rm -p 3030:3030 -p 9092:9092 -p 8081:8081 -p 8083:8083 -p 8082:8082 -p 2181:2181 -e ADV_HOST=localhost -v /windows/code/test/kafka-sink-azure-kusto/target/kafka-sink-azure-kusto-0.1.0-jar-with-dependencies.jar:/connectors/kafka-sink-azure-kusto-0.1.0-jar-with-dependencies.jar landoop/fast-data-dev – king Feb 18 '19 at 07:23
  • And I am logging into http://localhost:3030/kafka-connect-ui/#/cluster/fast-data-dev/ and using the UI to add the Kusto sink connector with the configuration below. – king Feb 18 '19 at 07:25
  • name=KustoSinkConnector connector.class=com.microsoft.azure.kusto.kafka.connect.sink.KustoSinkConnector kusto.sink.flush_interval_ms=300000 tasks.max=2 topics=kafka-adx-poc – king Feb 18 '19 at 07:27
  • kusto.tables.topics_mapping=[{'topic': 'kafka-adx-poc','db': 'kafka-adx-poc', 'table': 'Alerts','format': 'csv', 'mapping':'AlertsMapping'}] kusto.auth.authority=tennant_id kusto.url=https://ingest-{kustodb}.windows.net/ kusto.auth.appid=id kusto.auth.appkey=password kusto.sink.tempdir=/var/tmp/ kusto.sink.flush_size=1000 value.converter=org.apache.kafka.connect.storage.StringConverter key.converter=org.apache.kafka.connect.storage.StringConverter – king Feb 18 '19 at 07:27
  • @king Did you try to ingest data using one of the other SDKs (just to make sure the credentials are correct and you have permissions to the database)? Also, please note that kusto.sink.flush_size=1000 is in bytes, so if you have only written a single item to Kafka, it will wait there until flush_interval_ms elapses. Did you manage to look at the connector logs? – Daniel Dror Feb 18 '19 at 07:39
  • ERROR [Correlation ID: XXX] Execution of class com.microsoft.aad.adal4j.AcquireTokenCallable failed. (com.microsoft.aad.adal4j.AuthenticationContext:54) java.net.ConnectException: Connection refused (Connection refused) – king Feb 18 '19 at 07:48
  • I am trying to ingest data like this: ./usr/local/bin/kafka-console-producer --broker-list localhost:9092 --topic kafka-adx-poc – king Feb 18 '19 at 07:49
  • The error you are getting suggests that there was a connection issue. Could you possibly try to ingest data using the Java client for comparison? https://github.com/Azure/azure-kusto-java (see the connectivity check sketched after these comments) – Daniel Dror Feb 18 '19 at 08:06
  • Please let me know if there is any other way to set up a connector instead of the web UI. – king Feb 19 '19 at 03:24
  • @king Take a look at this question: https://stackoverflow.com/questions/51335621/kafka-connect-cluster-setup-or-launching-connect-workers and also https://docs.confluent.io/current/connect/userguide.html – Daniel Dror Feb 19 '19 at 10:33
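
Regarding the Connection refused error from com.microsoft.aad.adal4j.AcquireTokenCallable quoted in the comments: it indicates that the connector could not open an outbound connection while acquiring the AAD token. A minimal sketch of checking outbound connectivity from inside the container (the container ID is whatever docker ps reports, and the second URL should be the kusto.url value from the connector configuration; curl may not be present in the image, in which case wget works too):

    # Open a shell inside the fast-data-dev container
    docker ps
    docker exec -it <container-id> bash

    # From inside the container, verify that the AAD login endpoint and the
    # Kusto ingestion endpoint (the kusto.url value from the config) are reachable at all
    curl -v https://login.microsoftonline.com/ -o /dev/null
    curl -v https://ingest-{kustodb}.windows.net/ -o /dev/null

If these requests fail only from inside the container, the problem is the container's network setup (for example a proxy or DNS issue) rather than the connector configuration.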