Hi, I have written a Rails application that runs very well (that's so good!).

This application serves file downloads for about 150 users. The downloads are protected by authorization rules, precedence rules, and so on (they are not public downloads).

I have currently set up 10 FastCGI processes (I'm planning to switch to a cluster of 10 Mongrels behind mod_proxy). When 10 concurrent users are downloading files, the other 140 must wait for a free Rails process.

Now my question is: what is the best way to scale this application? I cannot run 150 processes or 150 Mongrels at about 100 MB each (a total of 15,000 MB, i.e. 15 GB of RAM) just to handle 150 concurrent users.

Does anyone know how to handle this? Thanks in advance.

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby on Rails: Talk" group. To post to this group, send email to rubyonrails-talk-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org To unsubscribe from this group, send email to rubyonrails-talk-unsubscribe-/JYPxA39Uh5TLH3MbocFFw@public.gmane.org For more options, visit this group at http://groups.google.com/group/rubyonrails-talk?hl=en
-~----------~----~----~----~------~----~------~--~---
Hello Gianluca,

I went through this same pain. I was using "send_file" in a controller action served by a Mongrel cluster proxied via nginx. Mongrel is going to hang things up until its send_file is complete.

Here's the switch to make: use your web server to send the file, not Mongrel. Think of this like using Apache/nginx for serving static files instead of loading down Mongrel with that task. For example (in my case a CSV file), in the controller you use:

    response.headers['Content-Type'] = 'text/csv'
    response.headers['Content-Disposition'] = "attachment; filename=my_file.csv"
    response.headers['X-Accel-Redirect'] = "/custom_download_path/my_file.csv"
    render :nothing => true

Now configure your web server to turn sendfile on and add a definition of the "custom_download_path" you named above:

    location /custom_download_path {
      root /the/real/path/to/the/files;
      default_type text/csv;
      expires 1h;
      add_header Cache-Control private;
      internal;
    }

This is an nginx example, but there are plenty of Apache examples online. Nginx uses X-Accel-Redirect in the header, while Apache will use X-Sendfile.

There is a Rails plug-in that avoids setting the response headers directly, as I did above, but I haven't converted over to using it yet. John Guen discusses it here: http://john.guen.in/past/2007/4/17/send_files_faster_with_xsendfile/

Good luck,

Jim
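For the Apache side mentioned above, the equivalent of Jim's snippet sets an X-Sendfile header with the real filesystem path instead of an internal URI. A minimal sketch (the paths, filename, and the helper method are hypothetical, and assume mod_xsendfile is enabled; in a real controller these headers would go on `response.headers`):

```ruby
# Sketch of the Apache (mod_xsendfile) variant of the controller action
# above, modelled as a plain method on a hash so it runs outside Rails.
def x_sendfile_headers(real_path, filename)
  {
    'Content-Type'        => 'text/csv',
    'Content-Disposition' => "attachment; filename=#{filename}",
    # Apache's mod_xsendfile expects the real filesystem path here,
    # unlike nginx's X-Accel-Redirect, which takes an internal URI.
    'X-Sendfile'          => real_path
  }
end

headers = x_sendfile_headers('/the/real/path/to/the/files/my_file.csv',
                             'my_file.csv')
headers.each { |k, v| puts "#{k}: #{v}" }
```

Either way, the Rails process finishes the request as soon as the headers are written, and the web server streams the file.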
JimCifarelli wrote:
> [full quote snipped]

Hi Jim,

Thank you very much for your solution, I'll try it ASAP!

Have a nice day!
You may want to have the web server deliver the files by setting an X-Accel-Redirect (or your web server's equivalent) HTTP header.

Additionally, take a look at Merb: http://merbivore.com

Merb is a Rails-like Ruby framework targeted at high performance.

--
-- Tom Mornini, CTO
-- Engine Yard, Ruby on Rails Hosting
-- Support, Scalability, Reliability
-- (866) 518-YARD (9273) x201

On Jan 19, 2008, at 5:46 AM, Gianluca Tessarolo wrote:
> [full quote snipped]
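Combining the suggestions in this thread, a protected download would check the authorization rules first and only then hand the transfer off to the web server via the header, so the Rails process is freed immediately. A minimal sketch (the `protected_download` method, its authorization flag, and the download path are all hypothetical, written as plain Ruby so it runs outside Rails):

```ruby
# Authorization-gated download using X-Accel-Redirect, returned as a
# [status, headers] pair instead of a real Rails response.
def protected_download(user_authorized, filename)
  # Enforce the app's authorization rules before exposing the file.
  return [403, {}] unless user_authorized

  headers = {
    'Content-Type'        => 'application/octet-stream',
    'Content-Disposition' => "attachment; filename=#{filename}",
    # nginx intercepts this header and serves the file itself from the
    # internal location block, without tying up a Rails process.
    'X-Accel-Redirect'    => "/custom_download_path/#{filename}"
  }
  [200, headers]
end

status, headers = protected_download(true, 'my_file.csv')
puts status                       # 200
puts headers['X-Accel-Redirect']  # /custom_download_path/my_file.csv
```

With this split, the 10 Mongrels only pay the cost of the authorization check per request; the long-running byte transfer happens entirely in the web server.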