
(Submitting on behalf of a client)

.........................

We are running a Kafka Connect cluster on Kubernetes (deployed via Helm charts).

However, to install a plugin it is recommended to extend the provided image "cp-kafka-connect-base" with the local connector. More instructions here: https://docs.confluent.io/current/connect/managing/extending.html#create-a-docker-image-containing-local-connectors

I am unable to do so with the Snowflake Kafka connector.

Are there any recommended workarounds?

P.S.:

For a PoC, I ran Kafka Connect on my local machine and added the snowflake-kafka-connector JAR file to the plugins directory, which worked fine. But I need this Docker image for the production deployment.
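For reference, the local PoC setup looked roughly like the sketch below (the paths and property-file names are illustrative, not the exact ones used):

    # Hypothetical local layout; adjust paths to your installation
    mkdir -p /opt/connect-plugins
    cp snowflake-kafka-connector-0.5.5.jar /opt/connect-plugins/

    # In the worker config (e.g. connect-standalone.properties), point
    # plugin.path at that directory:
    #   plugin.path=/opt/connect-plugins

    # Start a standalone worker with the Snowflake sink config
    connect-standalone connect-standalone.properties snowflake-sink.properties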

Gavin Wilson
    I cross-referenced this issue in the git repository to see if we can get some expert help: https://github.com/snowflakedb/snowflake-kafka-connector/issues/64 – Rachel McGuigan Oct 17 '19 at 21:15

2 Answers

3

Have you tried mounting an external volume with Docker and mapping it to the location where the Snowflake connector JAR is stored? See: https://docs.confluent.io/current/installation/docker/operations/external-volumes.html#

For example:

Navigate to ~/cp-docker-images/examples/cp-all-in-one

Open docker-compose.yml in a text editor

Add the following lines to the end of the file, and save it:

  connect:
    image: confluentinc/kafka-connect-datagen:latest
    build:
      context: .
      dockerfile: Dockerfile
    hostname: connect
    container_name: connect
    depends_on:
      - zookeeper
      - broker
      - schema-registry
    ports:
      - "8083:8083"
    volumes:
      - ~/my-location:/etc/kafka-connect/jars

Then modify the Connect plugin path (the CONNECT_PLUGIN_PATH entry in the connect service's environment: section) to read something like this:

 CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components,/etc/kafka-connect/jars"

Volumes: this maps a local directory to a filesystem location inside the Docker container. When the container starts up, it can read whatever is in ~/my-location on the host machine (or whichever path you put here) via the mapping /etc/kafka-connect/jars.

Connect plugin path: we have simply added /etc/kafka-connect/jars as a known location for Kafka Connect to scan when loading connectors.
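Once the stack is restarted, one way to confirm the worker picked up the connector is to query the Kafka Connect REST API (the service name and grep pattern below are examples):

    # Restart the connect service so the new volume and plugin path take effect
    docker-compose up -d connect

    # List the plugins the worker has loaded; the Snowflake connector should appear
    curl -s localhost:8083/connector-plugins | grep -i snowflake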

Hope this helps.

Mike Donovan
1

As well as Mike's suggestion of mounting the connector as an external volume, I would suggest two additional options to consider.

Build a custom image

Using confluent-hub you can install the connector into a new image. For example:


FROM confluentinc/cp-kafka-connect-base

RUN echo "===> Installing Snowflake Connector ..."
RUN confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:0.5.5
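To use that image with the Kubernetes/Helm deployment described in the question, you would build it, push it to a registry your cluster can pull from, and point the chart at it. A rough sketch follows; the registry, image name, and the image/imageTag value keys are assumptions to verify against your chart version:

    # Build and push the custom image (names are placeholders)
    docker build -t my-registry.example.com/kafka-connect-snowflake:0.5.5 .
    docker push my-registry.example.com/kafka-connect-snowflake:0.5.5

    # Point the cp-kafka-connect Helm chart at the custom image
    helm upgrade --install connect confluentinc/cp-kafka-connect \
      --set image=my-registry.example.com/kafka-connect-snowflake \
      --set imageTag=0.5.5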

Install the connector at runtime

If you're using Docker Compose, you can construct a command block to install the connector at runtime:

    command: 
      - bash 
      - -c 
      - |
        echo "Installing connector plugins"
        confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:0.5.5
        #
        echo "Launching Kafka Connect worker"
        /etc/confluent/docker/run & 
        #
        sleep infinity
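Note the design of that block: the connector is installed first, then the worker is launched in the background (&), and sleep infinity keeps the container's main process running so the container doesn't exit. You can watch both stages in the logs:

    # Tail the worker logs to watch the plugin install and worker startup
    docker-compose logs -f connect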

See here for more details.

Robin Moffatt