
I wrote a quick (hah) server that accepts a JSON payload containing a URL to download an archive from. After parsing the JSON I download the file, uncompress it, and add metadata to each line. With a large file this can take a long time.

require 'thin'
require 'sinatra'
require 'json'

set :bind, '0.0.0.0'

# Listen on port 4567 at /core/action/parse-v1/parse-file for a JSON POST
post '/core/action/parse-v1/parse-file' do
  request.body.rewind
  request_payload = JSON.parse request.body.read

  url = request_payload["s3Url"]
  compressionType = request_payload["compressionType"]
  • stuff that pulls the file from s3Url
  • stuff that unzips/untars/etc. the file based on compressionType
  • stuff that creates a JSON object for each line in the file and adds the request_payload to it
  • stuff that pushes the results to a queue for processing (see the sketch after this list)

Finally it completes, sending a 200 back to the client (which probably timed out 59 seconds ago).
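
For reference, here's roughly what that "stuff" looks like today. This is a minimal sketch: it assumes the archive is a gzipped tarball or a zip, uses URI.open from open-uri (Ruby 2.5+) for the download, and push_to_queue is a stand-in for whatever queue client is actually in use.

require 'thin'
require 'sinatra'
require 'json'
require 'open-uri'   # URI.open
require 'tmpdir'     # Dir.mktmpdir

set :bind, '0.0.0.0'

# Hypothetical helper standing in for the real queue client (SQS, Redis, etc.).
def push_to_queue(message)
end

post '/core/action/parse-v1/parse-file' do
  request.body.rewind
  request_payload = JSON.parse request.body.read

  url = request_payload["s3Url"]
  compression_type = request_payload["compressionType"]

  workdir = Dir.mktmpdir
  archive = File.join(workdir, 'archive')

  # Pull the file from s3Url into a scratch directory.
  File.open(archive, 'wb') do |f|
    URI.open(url) { |remote| IO.copy_stream(remote, f) }
  end

  # Unpack it based on compressionType.
  case compression_type
  when 'tar.gz', 'tgz' then system('tar', '-xzf', archive, '-C', workdir)
  when 'zip'           then system('unzip', '-q', archive, '-d', workdir)
  end

  # Build a JSON object for each line, fold in the request payload, enqueue it.
  Dir.glob(File.join(workdir, '**', '*')).each do |path|
    next unless File.file?(path) && path != archive
    File.foreach(path) do |line|
      push_to_queue(request_payload.merge('line' => line.chomp).to_json)
    end
  end

  status 200
end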

Currently this all takes quite a while and holds the HTTP connection open the entire time. How do I pass the "stuff" part off immediately and let the HTTP server close its connection?

I've experimented a bit but I'm stuck.

TheFiddlerWins

1 Answer


If you want to return a response to the client before starting some long-running process, you probably want a queuing system. I'm not familiar with Sinatra and I don't know your use case, but in Rails this is often accomplished with things like Delayed Job, Resque, or Sidekiq.
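
For example, with Sidekiq the route could hand the payload off to a worker and respond right away. A rough sketch (assumes Redis is running; ParseFileWorker and its perform body are placeholders for your actual download/parse logic):

require 'sinatra'
require 'json'
require 'sidekiq'

# Placeholder worker: perform runs in the Sidekiq process, not in the web request.
class ParseFileWorker
  include Sidekiq::Worker

  def perform(payload)
    # download, uncompress, annotate each line, push results onward
  end
end

post '/core/action/parse-v1/parse-file' do
  request.body.rewind
  request_payload = JSON.parse request.body.read

  # Arguments must be JSON-serializable; a hash of strings is fine.
  ParseFileWorker.perform_async(request_payload)

  # Respond immediately; the heavy work happens in the worker process.
  status 202
end

The worker runs as a separate process (e.g. started with sidekiq -r ./app.rb), so the web request only pays the cost of enqueuing the job.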

If those are too heavyweight for you, you could start another process directly from Ruby, as laid out in this answer.
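
If you go that way, a bare-bones version inside the handler might look something like this (process_file is a placeholder for your actual work; Process.fork isn't available on every platform, and you get no retries or visibility if the child dies):

require 'sinatra'
require 'json'

# Placeholder for the actual download/uncompress/annotate/enqueue work.
def process_file(payload)
end

post '/core/action/parse-v1/parse-file' do
  request.body.rewind
  request_payload = JSON.parse request.body.read

  # Fork a child to do the slow work and detach it so it doesn't become a zombie;
  # the response goes back to the client right away.
  pid = Process.fork { process_file(request_payload) }
  Process.detach(pid)

  status 202
end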

Ross Geesman