
My frontend is a Rails application, and my backend is a Padrino application.

I want to fetch a large amount of CSV data from the backend. Doing so makes the HTTP request time out in the browser.

  1. I cannot query the backend again and again, because each request regenerates the same data; there is no concept of offset and limit for the records.
  2. I tried sending the data directly from the backend to the LB, but that is not working for me.

To summarize: an array of 10000k rows is generated, and it needs to be sent to the UI or downloaded in a streaming fashion.
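
One way to avoid the timeout, assuming the backend can yield rows incrementally instead of building the whole array first, is to stream the CSV out of the Padrino route as it is generated. This is only a rough sketch: `Backend::App`, the `:reports`/`:large_csv` route, and `fetch_rows` are hypothetical stand-ins for the real app class and whatever currently produces the data.

```ruby
require 'csv'

# Padrino route that streams the CSV as it is generated, instead of
# materializing the full array in memory first.
Backend::App.controllers :reports do
  get :large_csv do
    content_type 'text/csv'

    # Sinatra's `stream` helper flushes each chunk to the client as it is
    # written, so the browser starts receiving data immediately and the
    # request no longer sits idle until the whole file is ready.
    stream do |out|
      out << CSV.generate_line(%w[id name value]) # header row

      fetch_rows do |row|            # hypothetical: yields one Array per row
        out << CSV.generate_line(row)
      end
    end
  end
end
```

On the Rails side the response can then be consumed without buffering it whole, e.g. with the block form of Net::HTTP's `read_body`, writing each chunk straight on to the browser or to a temp file as it arrives.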

  • Would you be able to cache the results of the CSV file in some data store? Whether it be a document store or a relational database? Then just have a background job reload the data at some interval that makes sense (see the caching sketch after these comments)? – Justin Wood Oct 14 '14 at 16:41
  • I want to make it scalable. Today the file is up to 10 MB, but it may go up to 100-200 MB. Can I just stream it as CSV directly? – akash Oct 14 '14 at 17:03
  • If you want scale, you shouldn't be reading directly from a CSV file. – Justin Wood Oct 14 '14 at 17:03
  • You're trying to move too much data in too short of a time, so yeah, your response times will suffer. You *really* need to figure out how to get the backend to send chunks, which isn't hard if you own the machine and code. If you can't do that, then figure out how to grab the entire file, compress it using gzip at its maximum setting, then send that and decompress it later (see the gzip sketch after these comments). – the Tin Man Oct 14 '14 at 17:26
  • Use a REST client. Easy peasy. – Max Oct 14 '14 at 18:00
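
Regarding the caching suggestion above: a minimal sketch of that approach, assuming a plain disk cache is acceptable, could look like the following. `RefreshReportCsv`, `each_report_row`, and the cache path are all hypothetical names, and the job could be scheduled with cron, Sidekiq, or whatever is already in place.

```ruby
require 'csv'
require 'fileutils'

# Background job that rebuilds the report on a schedule and caches it on
# disk, so web requests only ever serve a file that already exists.
class RefreshReportCsv
  CACHE_PATH = Rails.root.join('tmp', 'cache', 'large_report.csv')

  def perform
    FileUtils.mkdir_p(File.dirname(CACHE_PATH))

    # Write to a temp file and rename, so readers never see a half-written
    # report. `each_report_row` is a hypothetical enumerator over the data.
    tmp_path = "#{CACHE_PATH}.tmp"
    CSV.open(tmp_path, 'w') do |csv|
      csv << %w[id name value]
      each_report_row { |row| csv << row }
    end
    FileUtils.mv(tmp_path, CACHE_PATH)
  end
end

# A Rails controller can then hand the prebuilt file to the client:
#   send_file RefreshReportCsv::CACHE_PATH, type: 'text/csv', filename: 'report.csv'
```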
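
And regarding the gzip suggestion: if streaming in chunks really is not an option, compressing the generated CSV before sending it is straightforward with Ruby's standard library. `gzip_csv` is just an illustrative helper name.

```ruby
require 'zlib'
require 'stringio'

# Gzip the already-generated CSV string at the maximum compression level
# before sending it; CSV data is highly repetitive, so it usually shrinks
# by a large factor.
def gzip_csv(csv_data)
  io = StringIO.new
  gz = Zlib::GzipWriter.new(io, Zlib::BEST_COMPRESSION)
  gz.write(csv_data)
  gz.close
  io.string
end

# The receiving side can inflate the body with:
#   Zlib::GzipReader.new(StringIO.new(compressed)).read
```

If both apps are talking plain HTTP, much the same effect can often be had transparently by adding Rack::Deflater to the backend's middleware stack and sending an `Accept-Encoding: gzip` request header from the client.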

0 Answers