
I have the following question. I'm running a Spark Structured Streaming job that reads from one topic and writes to another topic of the same kerberized Kafka cluster. Everything works fine.

But my problem is this: how would I handle the case where each of these two topics requires a different principal and keytab?
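For reference, my current single-principal setup uses one JAAS file roughly like the following (the keytab path and principal below are placeholders, not my real values):

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/user.keytab"
  principal="user@EXAMPLE.COM";
};
```

This file is passed to the driver and executors via `-Djava.security.auth.login.config`, and both the Kafka source and sink pick up the same `KafkaClient` login context.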

  • Should I submit two JAAS files (one per principal/keytab) to Spark? If so, how can this be achieved?
  • Can I put both `KafkaClient` declarations into one JAAS file?

Thank you very much.

Neven

0 Answers