
We have a use case where we would like to have around 2,000 Kafka topics (we have 3 brokers), with 1 partition per topic. Is there any known recommendation for the maximum number of topics a single cluster can have? I went through the following URL, but it is about partitions rather than the number of topics.

https://www.confluent.io/blog/how-choose-number-topics-partitions-kafka-cluster/

Would there be any additional overhead in managing the cluster if we have more topics?

I have done a lot of searching on the web but did not find any recommendations or known limitations regarding a high number of Kafka topics.

OneCricketeer
TechEnthusiast

2 Answers


2000 is fine. Yes, there will be more and more overhead for each partition (not only each topic) that is managed: partitions are replicated and written to disk, causing network and disk I/O for each and every one of them.
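To make that overhead concrete, here is a rough back-of-the-envelope sketch for the scenario in the question. Note the replication factor of 3 is an assumption (the question does not state one; 3 is a common production default), and these are replica counts, not a capacity guarantee:

```python
# Scenario from the question: 3 brokers, ~2000 topics, 1 partition per topic.
# Assumed: replication factor of 3 (not stated in the question).
topics = 2000
partitions_per_topic = 1
replication_factor = 3
brokers = 3

# Each partition is replicated, so every replica costs disk and network I/O
# on the broker that hosts it.
total_partitions = topics * partitions_per_topic          # 2000
total_replicas = total_partitions * replication_factor    # 6000
replicas_per_broker = total_replicas / brokers            # ~2000 per broker

print(total_partitions, total_replicas, replicas_per_broker)
```

So even with a single partition per topic, each broker would host roughly 2,000 partition replicas under these assumptions, which is the number that actually drives the management overhead.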

One partition for every topic in your cluster is not recommended. (The consumer offsets topic has more partitions than that by default, anyway.)

Can I have 100s of thousands of topics in a Kafka Cluster?

OneCricketeer

For MSK there is a limit on the number of partitions per broker. [screenshot of the MSK partition limits table]

rosa