This question is really old, but the issue is still very common because of the 30-second response limit on Heroku, so I will add some code showing how I achieved it. It works with Rails 5.2 & 6.1 on Heroku with the Puma server.
I'm using the #send_stream method (at the time, present only in edge Rails; it shipped in Rails 7), so I just copied it and set the Last-Modified header manually. I put it all in a Rails concern to make it reusable.
module Streameable
  extend ActiveSupport::Concern

  include ActionController::Live

  def send_stream(filename:, disposition: 'attachment', type: nil)
    response.headers['Content-Type'] =
      (type.is_a?(Symbol) ? Mime[type].to_s : type) ||
      Mime::Type.lookup_by_extension(File.extname(filename).downcase.delete('.')) ||
      'application/octet-stream'

    response.headers['Content-Disposition'] =
      ActionDispatch::Http::ContentDisposition.format(disposition: disposition, filename: filename) # for Rails 5, use the content_disposition gem

    # extra: needed for streaming correctly; without it, middleware such as
    # Rack::ETag buffers the whole body before anything reaches the client
    response.headers['Last-Modified'] = Time.now.httpdate

    yield response.stream
  ensure
    response.stream.close
  end
end
class ExporterController < ApplicationController
  include Streameable

  def index
    respond_to do |format|
      format.html # index.html
      format.js   # index.js
      format.csv do
        send_stream(attachment_opts) do |stream|
          stream.write "email_address,updated_at\n"
          50.times do |i|
            line = "user_#{i}@acme.com,#{Time.zone.now}\n"
            stream.write line
            puts line
            sleep 1 # force a slow response to test responses longer than 30s
          end
        end
      end
    end
  end

  private

  def attachment_opts
    {
      filename: "data_#{Time.zone.now.to_i}.csv",
      disposition: 'attachment',
      type: 'text/csv'
    }
  end
end
Then, if you hit the endpoint with something like curl, you will see the output arrive second by second.
$ curl -i http://localhost:3000/exporter.csv
An important point: write your code so it iterates the data lazily with #each (anything that includes Enumerable works). And a tip for ActiveRecord: use #find_each so records are fetched from the DB in batches instead of all at once.
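For example, here is a minimal sketch of the CSV block above backed by a real table, assuming a hypothetical User model with email and updated_at columns:

  format.csv do
    send_stream(attachment_opts) do |stream|
      stream.write "email_address,updated_at\n"
      # find_each fetches records in batches (1000 by default),
      # so memory use stays flat regardless of table size
      User.find_each do |user|
        stream.write "#{user.email},#{user.updated_at}\n"
      end
    end
  end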