We have a Spring-based application with a multi-tier architecture. The layers, in order, are:
REST
Service
DB layer
Everything is bundled into a single .war file and deployed on Apache Tomcat 7.
We have REST APIs that perform a series of steps in the service layer and then return a response. For example:
The Policy Push API runs a compliance check, validates and pushes the data, records it in the DB, and shows a message to the user. This process takes at least about 2 seconds.
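To make the flow concrete, here is a minimal plain-Java sketch of that synchronous sequence (method names are illustrative stubs, not our real code):

```java
// Minimal sketch of the synchronous Policy Push flow; names are illustrative.
public class PolicyPushSketch {

    static String pushPolicy(String policyId) {
        complianceCheck(policyId);  // verify the policy against compliance rules
        validate(policyId);         // validate the payload
        pushToTarget(policyId);     // push data to the target system (the slow part)
        recordInDb(policyId);       // persist the result via the DB layer
        return "Policy " + policyId + " pushed";
    }

    // Stubs standing in for the real service-layer calls (~2 s in total).
    static void complianceCheck(String id) { }
    static void validate(String id) { }
    static void pushToTarget(String id) { }
    static void recordInDb(String id) { }

    public static void main(String[] args) {
        System.out.println(pushPolicy("P-1"));
    }
}
```

The request thread is blocked for the whole sequence, which is why each request holds a Tomcat worker thread for ~2 seconds.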
A few days back our server started to crash. On inspecting it we found we were receiving around 2000 requests/sec, which Tomcat couldn't handle.
To overcome this we added an Nginx load balancer in front of 3 Tomcat instances. This is stable to some extent, but it feels like just a stopgap fix.
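Our load-balancer setup is roughly the following (a minimal sketch; hostnames and ports are placeholders, not our real config):

```nginx
# Round-robin upstream over the 3 Tomcat instances (placeholder hostnames).
upstream tomcat_backend {
    server app1.internal:8080;
    server app2.internal:8080;
    server app3.internal:8080;
}

server {
    listen 80;
    location / {
        proxy_pass http://tomcat_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```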
I was looking into Kafka and RabbitMQ as options. But in that case the issue is that the request would be added to a queue and a response returned to the user before the work is actually done. We need to show the policy status in real time, and merely adding the request to a queue doesn't guarantee that the policy has been pushed.
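What I had in mind is roughly the following plain-Java sketch of the "accept the request, queue the work, let the client poll for status" idea. In a real setup the executor would be a Kafka/RabbitMQ consumer and the status map a DB table; all names here are illustrative assumptions, not our actual code:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Sketch: the REST layer returns a job id immediately (HTTP 202 semantics),
// a worker does the ~2 s push, and a status endpoint reports progress.
public class PolicyPushQueueSketch {
    enum Status { QUEUED, IN_PROGRESS, DONE, FAILED }

    private final Map<String, Status> statusById = new ConcurrentHashMap<>();
    private final ExecutorService worker = Executors.newFixedThreadPool(4);

    // Called by the REST layer: enqueue the work, return a job id right away.
    public String submit(String policyId) {
        String jobId = UUID.randomUUID().toString();
        statusById.put(jobId, Status.QUEUED);
        worker.submit(() -> {
            statusById.put(jobId, Status.IN_PROGRESS);
            try {
                pushPolicy(policyId);  // compliance check, validate, push, record
                statusById.put(jobId, Status.DONE);
            } catch (Exception e) {
                statusById.put(jobId, Status.FAILED);
            }
        });
        return jobId;
    }

    // Called by a status endpoint that the UI polls (or is pushed over WebSocket/SSE).
    public Status status(String jobId) {
        return statusById.get(jobId);
    }

    // Stub standing in for the real ~2 s service-layer work.
    private void pushPolicy(String policyId) { }

    // Stop accepting work and wait for in-flight jobs to finish.
    public boolean awaitIdle(long millis) {
        worker.shutdown();
        try {
            return worker.awaitTermination(millis, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```

The UI would then show the policy status live by polling the status endpoint (or via a WebSocket/SSE push), so "queued" is never silently presented as "pushed". My worry is whether this pattern is the right direction at all, which is what I'm asking below.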
Could someone suggest how to handle this use case? Is adding more servers to the load balancer the only option, or is there something else we should do?