I have a Python microservice that, on each request, sends a specific message to a specific Kafka topic using a specific Kerberos principal. All of these parameters are user-controlled via the request body. How do I avoid Kafka thread-safety problems?
I know I can control the Kerberos credential cache via the `KRB5CCNAME` environment variable, but if I touch it, the setup is no longer thread safe (one Kafka thread might collide with another, so I might not reach the right topic with the right credentials). If I isolate the caches via multiprocessing instead, it isn't memory efficient, because I have to deal with potentially hundreds of such topics. And if I leave `KRB5CCNAME` alone, every `kinit` overwrites the same cache with a new ticket, which leads to a security leak (the wrong app can publish to the wrong topic).
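To make the collision concrete, this is roughly the per-request pattern I mean; the broker address, keytab path, and cache naming scheme below are placeholders (the real values come from the request body):

```python
import os
import subprocess
from confluent_kafka import Producer

def publish(principal: str, keytab: str, topic: str, message: bytes) -> None:
    # Point MIT Kerberos at a per-principal cache and obtain a ticket.
    # os.environ is process-wide, so a concurrent request can swap
    # KRB5CCNAME before librdkafka's GSSAPI handshake reads it.
    os.environ["KRB5CCNAME"] = f"/tmp/krb5cc_{principal.replace('/', '_')}"
    subprocess.run(["kinit", "-kt", keytab, principal], check=True)

    producer = Producer({
        "bootstrap.servers": "broker.example.com:9093",  # placeholder
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "GSSAPI",
        "sasl.kerberos.service.name": "kafka",
    })
    producer.produce(topic, value=message)
    producer.flush()
```

With Flask's threaded request handling, two of these calls can interleave, which is exactly the race I'm trying to avoid.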
I'm not in control of the infrastructure and cannot change the architecture's design pattern (I can't get one principal with access to many topics; it has to be a "bring your own credentials and topic" approach).
How can I achieve concurrent Kafka sessions with different principals simultaneously from a single Python process, without multiprocessing?
I'm using `confluent-kafka-python` (and therefore `librdkafka`) with vanilla Flask on RHEL 8 with MIT Kerberos.
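For reference, a per-request producer would be configured along these lines; the `/publish` route, JSON field names, and broker address are just illustrative, and as far as I understand all producers in the process still end up sharing one default credential cache:

```python
from flask import Flask, request
from confluent_kafka import Producer

app = Flask(__name__)

@app.route("/publish", methods=["POST"])
def publish():
    body = request.get_json()

    # One short-lived producer per request, built from the caller's
    # credentials. librdkafka can run kinit itself based on
    # sasl.kerberos.keytab / sasl.kerberos.principal, but that kinit
    # still writes to the process-wide default cache as far as I can tell.
    producer = Producer({
        "bootstrap.servers": "broker.example.com:9093",  # placeholder
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "GSSAPI",
        "sasl.kerberos.service.name": "kafka",
        "sasl.kerberos.principal": body["principal"],
        "sasl.kerberos.keytab": body["keytab_path"],
    })
    producer.produce(body["topic"], value=body["message"].encode())
    producer.flush()
    return {"status": "sent"}
```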