Questions tagged [confluent-cloud]
154 questions
1
vote
0 answers
Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema
I have a pipeline in which I connect the Debezium CDC MySQL connector from Confluent Platform to Confluent Cloud, since the cloud's built-in Debezium MySQL connector is in preview. I have successfully established the connection, and the messages…

sha12br
- 11
- 2
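Schema-registration failures against Confluent Cloud are often about missing Schema Registry credentials rather than the schema itself. A minimal sketch of the converter settings a Connect worker (or producer) typically needs, assuming placeholder URL, key, and secret:

```python
# Hedged sketch: Schema Registry settings for a Connect worker registering
# Avro schemas with Confluent Cloud. URL, key, and secret are placeholders.
def schema_registry_props(sr_url: str, sr_key: str, sr_secret: str) -> dict:
    return {
        "value.converter.schema.registry.url": sr_url,
        # Confluent Cloud's Schema Registry requires basic auth:
        "value.converter.basic.auth.credentials.source": "USER_INFO",
        "value.converter.basic.auth.user.info": f"{sr_key}:{sr_secret}",
    }

props = schema_registry_props("https://sr.example.confluent.cloud", "SR_KEY", "SR_SECRET")
print(props["value.converter.basic.auth.user.info"])  # SR_KEY:SR_SECRET
```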
1
vote
1 answer
Using MongoDB Sink Connector to update existing documents by a different primary key
I'm attempting to set up a MongoDB Sink Connector via Confluent Cloud that keeps data synced between PostgreSQL and MongoDB.
I'm expecting the config below to update an existing document based on the id (int) field (not _id - ObjectId); however, it just…
user300285
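Matching on a business key instead of `_id` is usually a combination of an id strategy and a write-model strategy in the MongoDB sink connector. A hedged sketch of the relevant config keys, assuming the business key field is named `id` (class names are from the MongoDB Kafka connector; verify against your connector version):

```python
# Hedged sketch: MongoDB sink connector settings that replace documents by
# a business key ("id") rather than the generated _id. Placeholder values.
sink_config = {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    # Build the document id from a projection of the record value:
    "document.id.strategy": "com.mongodb.kafka.connect.sink.processor.id.strategy.PartialValueStrategy",
    "document.id.strategy.partial.value.projection.type": "AllowList",
    "document.id.strategy.partial.value.projection.list": "id",
    # Replace the existing document matched by that business key:
    "writemodel.strategy": "com.mongodb.kafka.connect.sink.writemodel.strategy.ReplaceOneBusinessKeyStrategy",
}
print(sink_config["document.id.strategy.partial.value.projection.list"])  # id
```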
1
vote
2 answers
Confluent Cloud Excessive Usage
We're using a pretty vanilla instance of Confluent Cloud for internal testing. Because this is cloud-based, they give you statistics on how much data you're going through as the month goes along. Unfortunately, there aren't detailed statistics -…

Charlie
- 17
- 7
1
vote
0 answers
Connecting to Confluent Cloud with AIOKafka client
I'm trying to connect to my Confluent Cloud Kafka cluster using a modified version of the AIOKafka ssl_consume_produce.py example in the AIOKafka repo at https://github.com/aio-libs/aiokafka/blob/master/examples/ssl_consume_produce.py. I've…

galen211
- 11
- 1
- 5
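The `ssl_consume_produce.py` example assumes mutual TLS, but Confluent Cloud authenticates with SASL_SSL + PLAIN using an API key and secret. A minimal sketch of the client kwargs (these are real aiokafka parameter names; the broker address and credentials are placeholders):

```python
# Hedged sketch: connection kwargs for aiokafka against Confluent Cloud.
# Bootstrap address and API key/secret below are placeholders.
def aiokafka_cloud_kwargs(bootstrap: str, api_key: str, api_secret: str) -> dict:
    return dict(
        bootstrap_servers=bootstrap,
        security_protocol="SASL_SSL",   # TLS transport, SASL auth
        sasl_mechanism="PLAIN",
        sasl_plain_username=api_key,
        sasl_plain_password=api_secret,
    )

# Usage (assumes aiokafka is installed):
# producer = AIOKafkaProducer(**aiokafka_cloud_kwargs("host:9092", "KEY", "SECRET"))
```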
1
vote
1 answer
Write to ConfluentCloud from Apache Beam (GCP Dataflow)
I am trying to write to Confluent Cloud/Kafka from Dataflow (Apache Beam), using the following:
kafkaKnowledgeGraphKVRecords.apply("Write to Kafka", KafkaIO.write()
…

Pinguin Dirk
- 1,433
- 11
- 18
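Writes to Confluent Cloud from KafkaIO generally fail without SASL producer properties. A hedged sketch of the producer settings to pass to the write transform (e.g. via `withProducerConfigUpdates` in recent Beam releases); key and secret are placeholders:

```python
# Hedged sketch: producer-side settings KafkaIO needs to authenticate
# against Confluent Cloud. API key/secret are placeholders.
def kafkaio_producer_config(api_key: str, api_secret: str) -> dict:
    jaas = (
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="{api_key}" password="{api_secret}";'
    )
    return {
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.jaas.config": jaas,
    }
```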
1
vote
1 answer
Does Confluent Cloud allow custom connector deployment?
Confluent Cloud supports only the following connectors:
GCS Sink Connector
S3 Sink Connector
Please validate whether my understanding is correct.
Can we deploy a custom connector or CDC like Debezium in Confluent Cloud?

Zamir Arif
- 341
- 2
- 13
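One common pattern for connectors the managed service doesn't host is a self-managed Kafka Connect worker that uses Confluent Cloud only as its Kafka backend. A hedged sketch of the broker-side settings such a worker would carry (address, credentials, and paths are placeholders):

```python
# Hedged sketch: broker settings for a self-managed Connect worker that
# targets Confluent Cloud. All values below are placeholders.
def connect_worker_props(bootstrap: str, api_key: str, api_secret: str) -> dict:
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="{api_key}" password="{api_secret}";'
        ),
        # Custom connector jars (e.g. Debezium) live on the worker's plugin path:
        "plugin.path": "/opt/connect/plugins",
    }
```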
1
vote
1 answer
Can't read from Kafka with KafkaIO in Beam
I have written a very simple pipeline in Apache Beam, as follows, to read data from my Kafka cluster on Confluent Cloud:
Pipeline pipeline = Pipeline.create(options);
Map<String, Object> propertyBuilder = new HashMap<>();
…

mahdi
- 25
- 4
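On the read side, KafkaIO needs the same SASL properties plus consumer-specific settings (passed via `withConsumerConfigUpdates` in recent Beam releases). A hedged sketch; key, secret, and group id are placeholders:

```python
# Hedged sketch: consumer-side settings for reading from Confluent Cloud
# with KafkaIO. API key/secret and group id are placeholders.
def kafkaio_consumer_config(api_key: str, api_secret: str, group_id: str) -> dict:
    return {
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="{api_key}" password="{api_secret}";'
        ),
        "group.id": group_id,
        "auto.offset.reset": "earliest",  # read from the start if no committed offsets
    }
```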
0
votes
1 answer
Node.js application Kafka communication over http_proxy/https_proxy getting an error
{"level":"ERROR","timestamp":"2023-08-31T15:17:48.211Z","logger":"kafkajs","message":"[Connection] Connection…
0
votes
0 answers
How to Write Streaming data to Kafka topic on Confluent Cloud?
import os
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, FloatType, DateType
sp = SparkSession.builder.config("spark.jars",
os.getcwd() +…

ARKHAN
- 401
- 2
- 5
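Writing a stream to a Confluent Cloud topic from Spark requires the Kafka options to carry the SASL settings with a `kafka.` prefix. A hedged sketch of the write-side options (broker, credentials, and topic are placeholders; on Databricks the JAAS module class may be shaded differently):

```python
# Hedged sketch: options for df.writeStream.format("kafka") against
# Confluent Cloud. Broker, key, secret, and topic are placeholders.
def spark_kafka_write_options(bootstrap: str, api_key: str, api_secret: str, topic: str) -> dict:
    return {
        "kafka.bootstrap.servers": bootstrap,
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="{api_key}" password="{api_secret}";'
        ),
        "topic": topic,
    }
```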
0
votes
0 answers
Confluent Cloud - Throughput and Partitions
As per the doc https://www.confluent.io/confluent-cloud/pricing/?tab=overview#kafka-clusters , a Confluent Standard cluster supports throughput up to 250 MBps ingress and 750 MBps egress (1 GBps total).
Question
Is the throughput mentioned a)…

user16798185
- 99
- 6
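Cluster-level throughput limits are aggregates; what a single topic can sustain also depends on per-partition throughput. A back-of-envelope sketch: the 5 MBps-per-partition figure below is an illustrative assumption, not a documented limit, so substitute your cluster's actual per-partition guidance:

```python
# Hedged sketch: partitions needed to sustain a target ingress rate,
# ASSUMING a per-partition limit (5 MBps here is illustrative only).
def partitions_needed(target_mbps: int, per_partition_mbps: int = 5) -> int:
    return -(-target_mbps // per_partition_mbps)  # ceiling division

print(partitions_needed(250))  # 50 partitions under the assumed 5 MBps limit
```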
0
votes
1 answer
How to connect Confluent Cloud to Databricks
I want to know how to connect Confluent Cloud to Databricks. I want to read data from Confluent into a Spark DataFrame.
I have used this code:
df = spark \
.readStream \
.format("kafka") \
.option("kafka.bootstrap.servers",…

MBINYALA
- 3
- 1
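Reading from Confluent Cloud into a streaming DataFrame needs the same `kafka.`-prefixed SASL options as a write, plus subscription settings. A hedged sketch (broker, credentials, and topic are placeholders; on Databricks the JAAS module class may be shaded differently):

```python
# Hedged sketch: options for spark.readStream.format("kafka") against
# Confluent Cloud. Broker, key, secret, and topic are placeholders.
def spark_kafka_read_options(bootstrap: str, api_key: str, api_secret: str, topic: str) -> dict:
    return {
        "kafka.bootstrap.servers": bootstrap,
        "kafka.security.protocol": "SASL_SSL",
        "kafka.sasl.mechanism": "PLAIN",
        "kafka.sasl.jaas.config": (
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            f'username="{api_key}" password="{api_secret}";'
        ),
        "subscribe": topic,
        "startingOffsets": "earliest",  # replay the topic from the beginning
    }
```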
0
votes
0 answers
kafka DataException: Converting byte[] to Kafka Connect data failed due to serialization error of topic
I am using Confluent to create a POC for a project. I created the schema for the topic and added a Redshift connector for that topic, so when I send an HTTP request to the Kafka cluster, data should be inserted into the Redshift DB based on the schema…

C Ekanayake
- 75
- 5
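A converter `DataException` on `byte[]` usually means the message bytes aren't in Confluent's Avro wire format: a magic byte of 0 followed by a 4-byte big-endian schema ID, then the Avro body. A small check (the helper name is hypothetical):

```python
import struct

# Hedged sketch: parse the Confluent wire-format header to see whether a
# payload was produced with a schema-registry-aware serializer.
def parse_confluent_header(payload: bytes) -> int:
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("not Confluent Avro wire format")
    (schema_id,) = struct.unpack(">I", payload[1:5])  # 4-byte big-endian id
    return schema_id

print(parse_confluent_header(b"\x00\x00\x00\x00\x2a" + b"avro-body"))  # 42
```

If the check fails, the producer (here, the HTTP request path) is likely sending raw JSON/bytes that the Avro converter cannot decode.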
0
votes
0 answers
Prometheus - How to update Prometheus config via HTTP service discovery to integrate with Confluent Cloud metrics
I have integrated confluent cloud metrics using prometheus.
I am trying to update prometheus config via http service discovery.
The JSON Format used is as below
[
{
"target": ["xyz"],
"labels": {"__metrics_path_": "abc",
…

P13S
- 11
- 5
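Two details in the JSON above differ from what Prometheus HTTP service discovery expects: the key is `targets` (plural), and the label name is `__metrics_path__` with a double trailing underscore. A sketch of the expected response shape, keeping the placeholder values from the question:

```python
import json

# Prometheus HTTP SD response shape: a JSON array of target groups, each
# with "targets" (plural) and a "labels" map. "xyz"/"abc" are the
# question's placeholders.
sd_response = [
    {
        "targets": ["xyz"],
        "labels": {"__metrics_path__": "abc"},  # note the double underscores
    }
]
encoded = json.dumps(sd_response)
```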
0
votes
0 answers
Timeout exception occurs while connecting to Confluent Cloud from an Azure Databricks notebook
The code below throws the exception below. The topic is present in the Kafka cluster. There are no network-connectivity-related issues.
Exception:
Job aborted due to stage failure: Topic spark_poc_topic not present in metadata after 60000 ms. Caused by: TimeoutException:…

191180rk
- 735
- 2
- 12
- 37
0
votes
0 answers
Does Confluent Cloud audit events for connectors?
Is there any way to log audit events for Kafka connectors (created from the Confluent UI)? I can see all the default events logged by Confluent at this URL:…

Dixit Singla
- 2,540
- 3
- 24
- 39