
I have a schema registered in the Schema Registry, and I need that every time I produce a message to this topic, the registered schema checks that the message matches the pattern being sent.

My schema is: [screenshot of the registered schema]

Curl post:

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Operacao\",\"namespace\":\"data.brado.operacao\",\"fields\":[{\"name\":\"id_operacao\",\"type\":\"string\"},{\"name\":\"tipo_container\",\"type\":\"string\"}, {\"name\":\"descricao_operacao\",\"type\":\"string\"},{\"name\":\"entrega\",\"type\":\"string\"},{\"name\":\"coleta\",\"type\":\"string\"},{\"name\":\"descricao_checklist\",\"type\":\"string\"},{\"name\":\"cheio\",\"type\":\"string\"},{\"name\":\"ativo\",\"type\":\"string\"},{\"name\":\"tipo_operacao\",\"type\":\"string\"} ]}"}' http://localhost:38081/subjects/teste/versions

What I need is that when I produce a message to the topic, it is rejected if it doesn't match this schema.

It should have raised an error here, because I'm not sending a record that matches the schema: [screenshot of the mismatched request]

And it should succeed in this case: [screenshot of the matching request]

Can anyone help me with how to do this schema check? I've looked everywhere I could think of and haven't found an answer.

  • I assume you are using the kafka REST proxy, or is it something else? Is that web server actually producing data using Avro producer or just forwarding through the `records` list as JSON/text (therefore ignoring any schema)? – OneCricketeer Jun 23 '22 at 18:55
  • I'm just forwarding the records as text/JSON... if I do it this way, will it ignore the Schema Registry? – souzatorquato Jun 24 '22 at 17:51
  • Does my Schema Registry only work if my producer checks the configured schema first? So I need to add this validation to my producer? – souzatorquato Jun 24 '22 at 20:41
  • If I just post to the topic in Postman, will it ignore the Schema Registry? – souzatorquato Jun 24 '22 at 20:42

1 Answer


The Schema Registry will only block producers that are configured to use it. By default, the broker does not enforce any schema; broker-side schema validation is a paid feature of Confluent Server.
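For completeness, if you do run Confluent Server, broker-side validation is enabled per topic. A sketch of how that looks, assuming the topic is named `teste` as in the question and the broker already has `confluent.schema.registry.url` configured:

```shell
# Confluent Server only: make the broker reject produce requests whose
# value was not serialized with a schema ID known to the Schema Registry.
kafka-configs --bootstrap-server localhost:9092 \
  --alter --entity-type topics --entity-name teste \
  --add-config confluent.value.schema.validation=true
```

On Apache Kafka brokers this topic config does not exist, which is why validation must happen in the producer.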

Hard to tell what your screenshots show (i.e. which REST endpoint you are using), but if it is the Confluent Kafka REST Proxy, then refer to the quick-start section on producing and consuming Avro data.

# Produce a message using Avro embedded data, including the schema which will
# be registered with schema registry and used to validate and serialize
# before storing the data in Kafka
curl -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
      -H "Accept: application/vnd.kafka.v2+json" \
      --data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}' \
      "http://localhost:8082/topics/avrotest"

Without these specifics (the `avro` content type and an embedded `value_schema`), you're just sending plain-text JSON, and no schema check is performed.

If you have direct access to the Kafka cluster, then writing an Avro producer client would be easier, since you wouldn't need to embed the key and/or value schemas every time you send an event.
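To make it concrete what such a client-side check involves, here is a minimal, stdlib-only Python sketch of the kind of validation an Avro serializer (e.g. the one used by a Confluent Avro producer) performs before a message ever reaches the broker. It is deliberately simplified to flat records of string fields, which is all the `Operacao` schema in the question uses; `validate_record` and the abbreviated `SCHEMA` are illustrative, not part of any library.

```python
import json

# Abbreviated version of the Operacao schema from the question
# (the real one lists nine string fields).
SCHEMA = json.loads("""
{
  "type": "record",
  "name": "Operacao",
  "namespace": "data.brado.operacao",
  "fields": [
    {"name": "id_operacao", "type": "string"},
    {"name": "tipo_container", "type": "string"},
    {"name": "descricao_operacao", "type": "string"}
  ]
}
""")

def validate_record(record: dict, schema: dict) -> list:
    """Return a list of validation errors; an empty list means the record conforms.

    Simplified check for flat records of string fields only. A real Avro
    serializer handles the full Avro type system and would raise before
    producing, which is exactly the rejection the question asks for.
    """
    errors = []
    expected = {f["name"]: f["type"] for f in schema["fields"]}
    for name, ftype in expected.items():
        if name not in record:
            errors.append(f"missing field: {name}")
        elif ftype == "string" and not isinstance(record[name], str):
            errors.append(f"field {name} must be a string")
    for name in record:
        if name not in expected:
            errors.append(f"unexpected field: {name}")
    return errors

good = {"id_operacao": "1", "tipo_container": "20ft", "descricao_operacao": "x"}
bad = {"id_operacao": 1, "tipo": "oops"}

print(validate_record(good, SCHEMA))  # []
print(validate_record(bad, SCHEMA))   # lists the type, missing, and extra-field errors
```

The point of the sketch: this logic lives in the producer (or in the REST Proxy when you use the Avro content type), not in the broker, so forwarding raw JSON bypasses it entirely.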

OneCricketeer