I have a flow where I send records from an IBM mainframe via IIDR to a Kafka topic. Both the key and the value of the messages arriving on the topic are in AVRO format. The records are pushed into the Kafka topic, and I have a stream associated with that topic, but the records are not passed into the stream.
An example record from the test_iidr topic:
rowtime: 5/30/20 7:06:34 PM UTC, key: {"col1": "A", "col2": 1}, value: {"col1": "A", "col2": 11, "col3": 2, "iidr_tran_type": "QQ", "iidr_a_ccid": "0", "iidr_a_user": " ", "iidr_src_upd_ts": "2020-05-30 07:06:33.262931000", "iidr_a_member": " "}
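For reference, the topic contents can be dumped from the ksqlDB CLI with something like the following, which is how a record like the one above can be inspected:

-- show the raw records currently on the topic
PRINT 'test_iidr' FROM BEGINNING;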
The VALUE_FORMAT of the stream is AVRO and I have checked that all the column names match. The stream creation query:
CREATE STREAM test_iidr (
col1 STRING,
col2 DECIMAL(2,0),
col3 DECIMAL(1,0),
iidr_tran_type STRING,
iidr_a_ccid STRING,
iidr_a_user STRING,
iidr_src_upd_ts STRING,
iidr_a_member STRING)
WITH (KAFKA_TOPIC='test_iidr', PARTITIONS=1, REPLICAS=3, VALUE_FORMAT='AVRO');
Is it failing to load records from the topic into the stream because the KEY is not mentioned in the WITH clause?
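As far as I know, the only key-related property in this version's WITH clause is KEY, which points at a value column that duplicates a string message key; a hypothetical sketch of that would look like the statement below (stream name test_iidr_keyed is made up). Since my actual key is an Avro record with two fields, I am not sure this is even applicable, which is part of what I am asking.

-- hypothetical sketch: KEY declares that value column col1 mirrors a string message key
CREATE STREAM test_iidr_keyed (
  col1 STRING,
  col2 DECIMAL(2,0),
  col3 DECIMAL(1,0),
  iidr_tran_type STRING,
  iidr_a_ccid STRING,
  iidr_a_user STRING,
  iidr_src_upd_ts STRING,
  iidr_a_member STRING)
WITH (KAFKA_TOPIC='test_iidr', VALUE_FORMAT='AVRO', KEY='col1');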
The schema registry has the test_iidr-value and test_iidr-key subjects registered in it.
The key.converter and value.converter in the Kafka Connect Docker container are set to org.apache.kafka.connect.json.JsonConverter. Is this JsonConverter creating the issue?
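For comparison, my understanding is that a connector producing Avro would normally be configured with the Confluent AvroConverter rather than the JsonConverter, roughly like the properties below (the schema registry URL is a placeholder for my environment):

# assumed worker/connector config; schema-registry URL is a placeholder
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081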
I created a completely different pipeline with a different stream and inserted the same data manually using INSERT INTO statements. That worked. Only the IIDR flow is not working: the records are not pushed from the topic into the stream.
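One thing I can still rule out is that my query is only reading newly arriving records; as far as I know, setting auto.offset.reset to earliest in the ksqlDB CLI makes the push query read the topic from the beginning:

-- read existing records from the start of the topic, then keep streaming
SET 'auto.offset.reset'='earliest';
SELECT * FROM test_iidr EMIT CHANGES;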
I am using Confluent Kafka version 5.5.0.