Okay, in case anyone is interested, I dove in and found a solution. I
don't claim it's the best solution, but here is one way to bypass the
memory overhead with large objects stored as bytea fields (blobs) in
PostgreSQL. The solution code is database-specific, so all you MySQL
folks won't be able to use it, sorry.
Here was my original code to retrieve the object:
@thePic = Picture.find(params[:id])
send_data(@thePic.raw.data,
          :filename    => @thePic.raw.filename,
          :type        => @thePic.image.content_type,
          :disposition => "attachment")
This wasn't usable in my situation for large files. With a large file
(>10 MB), this code would take 20 seconds or longer before I would
begin to see the download in my dev environment (17" MacBook Pro),
presumably because ActiveRecord has to materialize the entire blob as
a Ruby string before send_data can even start.
With a little research and tweaking, I replaced it with this:
# Pull the metadata and the blob in one raw query, skipping ActiveRecord
# model instantiation entirely. I quote the id so it's safe to
# interpolate into the SQL.
pic_id = ActiveRecord::Base.connection.quote(params[:id])
result = ActiveRecord::Base.connection.execute(
  "select a.content_type, a.filename, a.data " +
  "from picture_data a, pictures b " +
  "where b.raw_image_id = a.id and b.id = #{pic_id}")

@response.headers['Pragma']        = ' '
@response.headers['Cache-Control'] = ' '
@response.headers['Content-type']  = result.getvalue(0, 0)
@response.headers['Content-Disposition'] =
  "attachment; filename=#{result.getvalue(0, 1)}"
@response.headers['Accept-Ranges'] = 'bytes'
@response.headers['Content-Transfer-Encoding'] = 'binary'
@response.headers['Content-Description'] = 'File Transfer'

# getvalue(row, column) reads straight out of the PGresult
render_text result.getvalue(0, 2)
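(Side note: I suspect the same thing reads a little more cleanly with
select_one, which returns a hash keyed by column name instead of the
positional getvalue calls. Untested sketch:

row = ActiveRecord::Base.connection.select_one(
  "select a.content_type, a.filename, a.data " +
  "from picture_data a, pictures b " +
  "where b.raw_image_id = a.id and b.id = " +
  ActiveRecord::Base.connection.quote(params[:id]))

@response.headers['Content-type'] = row['content_type']
@response.headers['Content-Disposition'] =
  "attachment; filename=#{row['filename']}"
# ...remaining headers as above...
render_text row['data']

Same query, same behavior, just hash lookups instead of row/column
indexes.)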
It's longer, uglier, and not as readable, but the end result? No
noticeable delay before the download starts. There are most likely ways
to improve on this, but as it stands it gives me an order-of-magnitude
performance boost and makes it practical for me to keep large files in
the database :-)
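One further improvement I haven't actually tried: the blob above still
lands in a single Ruby string before render_text runs. If your Rails
version supports rendering with a Proc, you could pull the bytea out in
chunks via PostgreSQL's substring() and stream each piece, so memory
use stays flat no matter how big the file is. Rough, untested sketch
(same tables as above; set the response headers the same way first):

conn   = ActiveRecord::Base.connection
pic_id = conn.quote(params[:id])
chunk  = 1_048_576   # pull 1 MB of the bytea per round trip

render :text => Proc.new { |response, output|
  offset = 1         # substring() on bytea is 1-indexed
  loop do
    piece = conn.select_value(
      "select substring(a.data from #{offset} for #{chunk}) " +
      "from picture_data a, pictures b " +
      "where b.raw_image_id = a.id and b.id = #{pic_id}")
    break if piece.nil? || piece.empty?
    output.write(piece)
    offset += chunk
  end
}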
Comments, improvements, etc. welcome.
Jeff