
I have a Rails application deployed on Apache + Passenger + Rails 2.3.8 (Ruby 1.8.7) + Linux + MySQL 5.

I am trying to create an Excel report from records fetched from the database and download it.

When the report has roughly 600 records or fewer, it gets created and downloads successfully.

But when the report contains more records, it does not download.

The query and logic processing complete on the back-end/application server, but after a while the browser throws a connection time-out.

I have tried increasing the keep-alive time and modifying browser settings, but nothing works.


1 Answer


Since you didn't provide your code, I can only give a general answer.

In my opinion, a request with a very long response time is never ideal, even if you manage to avoid the browser's time-out. You have two better choices:

  1. If you don't need to return the latest data, use a cron job to pre-generate the Excel file and simply send that file when the request comes in (a Rake task sketch follows this list). Here is a good reference.

  2. If you have to return the latest data, divide it into several parts and return them separately; in this case the client has to send multiple requests (a controller sketch follows this list).
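
For option 1, a minimal sketch of the pre-generation approach, assuming the `spreadsheet` gem and a hypothetical `Report` model with `id`, `name`, and `created_at` columns; adjust the columns and output path to your schema:

```ruby
# lib/tasks/reports.rake -- a sketch only; `Report`, its columns, and the
# output path are assumptions, adjust them to your application.
require 'spreadsheet'

namespace :reports do
  desc 'Pre-generate the Excel report so the controller only has to send a file'
  task :generate => :environment do
    book  = Spreadsheet::Workbook.new
    sheet = book.create_worksheet :name => 'Report'

    # Header row, then one row per record.
    sheet.row(0).concat %w(id name created_at)
    Report.all.each_with_index do |record, i|
      sheet.row(i + 1).concat [record.id, record.name, record.created_at.to_s]
    end

    dir = File.join(RAILS_ROOT, 'public', 'reports')
    FileUtils.mkdir_p dir
    book.write File.join(dir, 'report.xls')
  end
end
```

A crontab entry such as `0 2 * * * cd /path/to/app && RAILS_ENV=production rake reports:generate` keeps the file fresh, and the download action shrinks to a `send_file` of the pre-built `.xls`, so the browser never waits on the query.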
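For option 2, a minimal sketch of the chunked approach, assuming a hypothetical `ReportsController#chunk` action and `Record` model; the client requests page 1, 2, ... until an empty part comes back and stitches the pieces together:

```ruby
# A sketch only; `ReportsController#chunk`, the `Record` model, and its
# columns are assumptions -- map them onto your real report query.
class ReportsController < ApplicationController
  PER_PAGE = 500 # rows per chunk; keep each response well under the time-out

  # GET /reports/chunk?page=1, then ?page=2, ...
  def chunk
    page    = [params[:page].to_i, 1].max
    records = Record.find(:all,
                          :order  => 'id',
                          :limit  => PER_PAGE,
                          :offset => (page - 1) * PER_PAGE)

    # Each part is sent as tab-separated text; the client concatenates
    # the parts into the final report.
    rows = records.map { |r| [r.id, r.name, r.created_at].join("\t") }
    send_data rows.join("\n"),
              :type     => 'text/tab-separated-values',
              :filename => "report_part_#{page}.tsv"
  end
end
```

Each individual response stays small, so no single request runs long enough to hit the time-out.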

  • Thanks for your answer. I was able to reduce the query execution time by adding an index, which should help. But I still can't understand why a connection time-out occurs; a request time-out would make sense. Is there any tool to debug this kind of issue? – Learn More Jan 28 '13 at 09:11
  • I don't think there is such a tool, but you can infer the issue logically. As you said, "Query and logic processing completes in back-end and application server, but browser starts throwing connection time-out after some time." I am fairly sure it is effectively a request time-out, because your application actually works; the only problem is the browser complaining. Therefore, the only thing you have to solve is avoiding that time-out. In my opinion, indexing the data in the DB may help, but it's not the best solution; with very large data you would still hit the time-out. – Brian Jan 28 '13 at 11:32