
I have the following scenario in my java web-application.

Each request accepted by the application (Tomcat-based) has to wait at least 2 minutes for some data processing. The initial request in turn creates a thread pool to process the data. While the initial request thread is waiting for the processing to complete, can it be used to take up further incoming requests?

davyjones
  • Use a message queue instead of adding a sleep there. – bobs_007 Jun 04 '15 at 06:59
    Use some kind of async processing, and do not keep a connection thread idle & blocked for 2 minutes... – Tassos Bassoukos Jun 04 '15 at 07:05
  • This would require a change to how your clients wait for the request to return. Asynchronous behaviours, polling, (as suggested) messaging, etc. But I don't think we can usurp a thread to accept more connections while some long-running process is working. – dom farr Jun 04 '15 at 07:05
  • Tornado Web Server: http://www.tornadoweb.org/en/stable/ is built around asynchronous networking. This might be a better fit if this is what you need. – dom farr Jun 04 '15 at 07:09
  • *"While the initial request thread is waiting"* -> this is why Servlet 3.0 introduced async request processing, so that server threads do not have to wait. Servlet 3.0 is supported by Tomcat 7+. – Pavel Horal Jun 04 '15 at 07:33
  • Some time ago I posted an answer on async request processing using tomcat and Servlet 3.0 API. It might still work: http://stackoverflow.com/questions/7287244/tomcat-7-async-processing/7293118#7293118. But keep in mind that this approach still requires a persistent TCP connection. – home Jun 04 '15 at 07:59
  • Don't create thread pools per request. – Stefan Jun 04 '15 at 08:25
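The suggestions above boil down to one pattern: hand the long-running job to a shared worker pool and free the request thread immediately (this is what Servlet 3.0's `startAsync()`/`AsyncContext.complete()` does inside a container). Below is a minimal, self-contained sketch of that hand-off using only the JDK; the class and method names are illustrative, and the 2-minute job is shortened to 100 ms for the demo.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncHandOff {
    // One application-wide pool (daemon threads so the JVM can exit),
    // NOT a new pool per request.
    private static final ExecutorService WORKERS =
            Executors.newFixedThreadPool(4, r -> {
                Thread t = new Thread(r);
                t.setDaemon(true);
                return t;
            });

    // Stands in for the long data processing (shortened from ~2 minutes).
    static String process(String requestId) {
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "done:" + requestId;
    }

    // The "request thread" only submits the job and returns at once;
    // it never blocks on the result.
    static CompletableFuture<String> handle(String requestId) {
        return CompletableFuture.supplyAsync(() -> process(requestId), WORKERS);
    }

    public static void main(String[] args) {
        CompletableFuture<String> f = handle("42");
        System.out.println("request thread is free immediately");
        System.out.println(f.join()); // prints "done:42"; only the demo waits here
    }
}
```

In a real servlet, `handle()` would instead call `req.startAsync()`, write the result to the `AsyncContext`'s response in the worker, and finish with `ctx.complete()` to hand the connection back to Tomcat.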

1 Answer


No. It's the application server's business to accept and handle incoming HTTP connections - and you help it by occupying each connection for as short a time as possible. Taking over that job yourself would pollute your own code with appserver duties, and you should not want to get into that kind of business.

The comments on your question give some great suggestions for alternative solutions. As long as you keep sitting on a connection (or request/response pair) it will be busy and not available for the appserver.

Alternatively, given your load profile, you might be able to just increase the number of concurrently handled requests: if those threads really are mostly idle and waiting, the server can handle plenty of other requests while your long-running ones wait for the background processing. It isn't elegant, but it might be the quickest fix.
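For that quick fix, the knob on Tomcat is the HTTP connector's thread pool in `conf/server.xml` (the default `maxThreads` is 200; the values below are illustrative, not a recommendation - mostly-idle threads still cost stack memory):

```xml
<!-- conf/server.xml: raise the number of request-processing threads -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxThreads="800"
           acceptCount="200"
           redirectPort="8443" />
```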

Olaf Kock