I was using Active Record to get my stories and then generate a CSV, the standard way as done in the RailsCast. But I have a lot of rows and it takes minutes. I think if I could get Postgres to generate the CSV directly it would be much faster.
This answer builds on the one provided by @mu-is-too-short, but avoids the temporary object by streaming the rows instead.
# X-Accel-Buffering tells nginx not to buffer the response, so rows reach
# the client as soon as they are produced.
headers['X-Accel-Buffering'] = 'no'
headers['Cache-Control'] = 'no-cache'
headers['Transfer-Encoding'] = 'chunked'
headers['Content-Type'] = 'text/csv; charset=utf-8'
headers['Content-Disposition'] = 'inline; filename="data.csv"'
# Content-Length is unknown up front and must not be sent with chunked encoding.
headers.delete('Content-Length')
# Interpolating ids into SQL is only safe when they are known integers,
# so coerce them first.
sql = "SELECT * FROM stories WHERE stories.id IN (#{story_ids.map(&:to_i).join(',')})"
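If the ids come from user input, an alternative sketch is to let Active Record build and quote the same statement (this assumes a Story model backing the stories table):

sql = Story.where(id: story_ids).to_sql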
self.response_body = Enumerator.new do |chunk|
  conn = ActiveRecord::Base.connection.raw_connection
  conn.copy_data("COPY (#{sql.chomp(';')}) TO STDOUT WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *, ESCAPE E'\\\\');") do
    while (row = conn.get_copy_data)
      # Each HTTP chunk is the payload size in hex octets, CRLF, the payload, CRLF.
      chunk << "#{row.bytesize.to_s(16)}\r\n"
      chunk << row
      chunk << "\r\n"
    end
    # A zero-length chunk terminates the chunked response.
    chunk << "0\r\n\r\n"
  end
end
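For context, everything above lives in a single controller action. A minimal skeleton, assuming a hypothetical StoriesController#export action that receives the ids as a parameter:

class StoriesController < ApplicationController
  # GET /stories/export?ids[]=1&ids[]=2 (hypothetical route)
  def export
    story_ids = Array(params[:ids]).map(&:to_i)
    # ... set the headers, build the SQL, and assign self.response_body
    # exactly as shown above ...
  end
end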
You can also gzip the stream with gz = Zlib::GzipWriter.new(Stream.new(chunk)) and gz.write(row), using a class akin to
class Stream
  # Minimal IO-like adapter: GzipWriter only needs #write. Each call emits
  # one HTTP chunk (payload size in hex octets, CRLF, payload, CRLF).
  def initialize(block)
    @block = block
  end

  def write(row)
    @block << "#{row.bytesize.to_s(16)}\r\n"
    @block << row
    @block << "\r\n"
  end
end
And remember to set headers['Content-Encoding'] = 'gzip' and to call gz.close once the loop finishes, so the gzip trailer is flushed before the terminating zero-length chunk. See also this gist.
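Put together, a minimal sketch of the gzipped variant (same assumptions as above; Stream already does the chunked framing, so the compressed bytes are not framed twice):

require 'zlib'

headers['Content-Encoding'] = 'gzip'
self.response_body = Enumerator.new do |chunk|
  conn = ActiveRecord::Base.connection.raw_connection
  gz = Zlib::GzipWriter.new(Stream.new(chunk))
  conn.copy_data("COPY (#{sql.chomp(';')}) TO STDOUT WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *, ESCAPE E'\\\\');") do
    while (row = conn.get_copy_data)
      gz.write(row) # GzipWriter buffers internally and calls Stream#write as needed
    end
  end
  gz.close             # flushes the gzip trailer through Stream#write
  chunk << "0\r\n\r\n" # then terminate the chunked response
end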