I really like the idea of using Mongrel. But reading the FAQ, I noticed something that might be an issue. It stated:

"Ruby on Rails is not thread safe so there is a synchronized block around the calls to Dispatcher.dispatch. This means that everything is threaded right before and right after Rails runs. While Rails is running there is only one controller in operation at a time."

So does this mean that if someone is uploading a large file (say around 80MB), Rails will stop serving any other requests until the file is finished uploading?

Thanks,
Chris
Yes. That's why you need more than one FastCGI or Mongrel instance, behind a load-balancing proxy.

-- Tom Mornini

On May 3, 2006, at 9:42 AM, Chris Bruce wrote:
> So does this mean that if someone is uploading a large file (say around
> 80MB), Rails will stop serving any other requests until the file is
> finished uploading?
So I could run multiple Mongrel instances on one server and use a proxy load balancer?

Chris

-----Original Message-----
From: Tom Mornini
Sent: Wednesday, May 03, 2006 11:46 AM
To: rails@lists.rubyonrails.org
Subject: Re: [Rails] Mongrel + RubyOnRails + FileUploads = Problems?

Yes. That's why you need more than one FastCGI or Mongrel instance, behind a load-balancing proxy.

_______________________________________________
Rails mailing list
Rails@lists.rubyonrails.org
http://lists.rubyonrails.org/mailman/listinfo/rails
Yes.

-- Tom Mornini

On May 3, 2006, at 12:07 PM, Chris Bruce wrote:
> So I could run multiple Mongrel instances on one server and use a proxy
> load balancer?
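For reference, the setup Tom describes might look something like this with Apache 2.2's mod_proxy_balancer (a hypothetical sketch; the ports are made up, and lighttpd or Pound were equally common front ends at the time):

```apache
# Hypothetical: three Mongrel instances on one box, balanced by Apache
<Proxy balancer://mongrel_cluster>
  BalancerMember http://127.0.0.1:8000
  BalancerMember http://127.0.0.1:8001
  BalancerMember http://127.0.0.1:8002
</Proxy>
ProxyPass / balancer://mongrel_cluster/
ProxyPassReverse / balancer://mongrel_cluster/
```

Each BalancerMember is an independent Rails process, so a long upload ties up only one of them.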
Wait... this is kind of bad, if I understand this right.

One dispatcher means that if someone uploads a file, nobody else can use the site? So two dispatchers work until two people upload a file at the same time... so really, this is not going to scale well at all if you're trying to do something like Flickr. With that logic you'd need 100 dispatchers if you expected 100 users to be concurrently using your site to upload files.

YouTube gets 25,000 submissions per day... does that mean Rails could never be used to build a site like that? I just can't believe that's the case.

On 5/3/06, Tom Mornini <tmornini@infomania.com> wrote:
> Yes.
>
> On May 3, 2006, at 12:07 PM, Chris Bruce wrote:
> > So I could run multiple Mongrel instances on one server and use a
> > proxy load balancer?
Hi Chris,

On 5/3/06 12:42 PM, "Chris Bruce" <cbruce@sleeter.com> wrote:
> "Ruby on Rails is not thread safe so there is a synchronized block
> around the calls to Dispatcher.dispatch. This means that everything is
> threaded right before and right after Rails runs. While Rails is
> running there is only one controller in operation at a time."

Keep in mind that this is a Rails issue, not a Mongrel issue. All other aspects of Mongrel are as thread safe as Ruby can be.

> So does this mean that if someone is uploading a large file (say around
> 80MB), Rails will stop serving any other requests until the file is
> finished uploading?

Yes and no. If you get the pre-release (which I've got to release soon or people will kill me), then you'll get file uploads which are streamed to disk if they are too big. You can install the pre-release like so:

  gem install mongrel --source=http://mongrel.rubyforge.org/releases/

But *NOT ON WINDOWS*. I tend to exclude Windows folks from testing pre-releases since it's a more complicated platform to build on.

This will let many people upload files at the same time, but there's kind of an annoying catch. Ruby uses cgi.rb to process the multipart sections of the uploaded file. So, while Mongrel's threads crank the upload to a temp file, once this temp file is handed to Rails it has to be reprocessed all over again.

This second processing is to find the multipart boundaries and uses a lot of horrible regex and backtracking. So, with large uploads you can see pretty big CPU spikes and fill up your Rails processes fairly quickly.

So, in general, if you're doing smallish uploads--like around 1 or 2 MB--then the pre-release is good stuff. Once the upload gets above about 50MB things start to get a little slow. At around 100MB it just sucks to be you.

Zed A.
Shaw
http://www.zedshaw.com/
http://mongrel.rubyforge.org/
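To make the "second processing" concrete, here is a toy sketch (not cgi.rb's actual code) of what boundary splitting involves: another full pass over the entire body, which is why it hurts on an 80MB upload:

```ruby
# Toy multipart splitter, only to illustrate the cost: the entire body
# must be scanned again just to locate the boundary markers.
def split_multipart(body, boundary)
  marker = "--#{boundary}"
  body.split(marker)                 # full scan over the whole body
      .map(&:strip)
      .reject { |part| part.empty? || part == "--" }
end

body = "--XYZ\r\nfirst part\r\n--XYZ\r\nsecond part\r\n--XYZ--"
split_multipart(body, "XYZ")  # => ["first part", "second part"]
```

Real multipart parsing also has to handle per-part headers and quoted boundaries, which is where the regex backtracking Zed mentions comes in.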
Hey Brian,

On 5/3/06 5:12 PM, "Brian Hogan" <bphogan@gmail.com> wrote:
> 1 dispatcher means that if someone uploads a file, nobody else can use
> the site?

Yep, for the time it takes Rails to process the uploaded file with cgi.rb. If you've got an insane amount of upload content then one won't work at all.

> So 2 dispatchers works until two people upload a file at the same
> time... so really, this is not going to scale well at all if you're
> trying to do something like Flickr.

Yep, then there's a delay. But the story is a bit more complex.

> With that logic you'd need 100 dispatchers if you expected 100 users to
> be concurrently using your site to upload files.

100 concurrent users is an *insane* number of actual people in the process queue. The thing to keep in mind is that you can't measure in users, since that's behavior dependent. You could have a site that users hit rarely, where 100 concurrent means you have billions of users. You could have a chat site like Campfire where 100 concurrent could mean thousands. It all depends on user behavior.

The real way to figure out what kind of req/sec equates to a concurrency level is to use simulation. There are mathematical methods from queuing theory that work decently if you need to figure out your required concurrency before you build the system. If you have the system built already, then you need to write a simulator tool, or use an existing one, that will perform a simulation against your live site.

A key thing with this, though, is that you can't really use the results as a measurement of performance. It does tell you how well the site handles load, generally how fast that simulated process might be, and what kind of additional hardware you'd need. But to really find out how to tune a particular part of the process you need to go back to performance measurements with a tool like httperf.

> YouTube gets 25,000 submissions per day...
> does that mean Rails could never be used to build a site like that?

This is something else that's kind of weird. I'm assuming most people here have a basic understanding of math, but for some reason they throw out these kinds of measurements. Any "I get X per day" figure is really pretty useless since it doesn't include a distribution. If you assume that this 25k is steady and average it out, that's about .29 req/second. No way that's right, since there's probably some kind of statistical distribution to the requests. The measurement would have to be augmented with a peak measurement over a smaller period of time. For example, "YouTube gets 25k submissions/day with a peak of 1400/hour and .38/second." Then you start to get a bigger picture.

Anyway, hope that clears some things up. The gist is that you can usually handle a lot more concurrent connections with lower numbers of backends than you think, but you need to run simulations to really determine this.

Zed A. Shaw
http://www.zedshaw.com/
http://mongrel.rubyforge.org/
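Zed's averaging arithmetic checks out; a quick back-of-the-envelope (the 25k/day figure is Brian's, the rest is just unit conversion):

```ruby
submissions_per_day = 25_000
seconds_per_day     = 24 * 60 * 60   # 86_400

average_rate = submissions_per_day.to_f / seconds_per_day
average_rate.round(2)  # => 0.29 requests/second on average
```

As Zed says, the average alone is nearly useless: the peak-hour rate is what sizes the cluster, and that requires knowing the distribution.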
Bob Hutchison
2006-May-04 02:27 UTC
[Rails] Mongrel + RubyOnRails + FileUploads = Problems?
On May 3, 2006, at 5:13 PM, Zed Shaw wrote:
> Yes and no.
[ snip ]
> This second processing is to find the multipart boundaries and uses a
> lot of horrible regex and backtracking. So, with large uploads you can
> see pretty big CPU spikes and fill up your Rails processes fairly
> quickly.

Does the same issue exist for downloading?

This problem is tending to the uglier side of things, so maybe a bit of an ugly hack won't look so bad?

What happens if you use a Mongrel handler for the file upload, storing the file somewhere on disk, then interfere with the normal proceedings? Maybe hacking at the request object sent to Rails by adding a query parameter or HTTP header that says where the file was put? Maybe a redirect with the file name instead of the file? Would that give you the multi-threaded upload?

Bob

----
Bob Hutchison -- blogs at <http://www.recursive.ca/hutch/>
Recursive Design Inc. -- <http://www.recursive.ca/>
Raconteur -- <http://www.raconteur.info/>
xampl for Ruby -- <http://rubyforge.org/projects/xampl/>
On 5/3/06 10:26 PM, "Bob Hutchison" <hutch@recursive.ca> wrote:
> Does the same issue exist for downloading?

No, since Mongrel or a fronting web server handles those, so you get much better concurrency. Now, if your Rails app is generating the content for each request, then you're screwed.

Remember, it's not just queue length but how long each request stays in the queue. If you have 10 backend Mongrels, then you have a queue length of 10 (basically). That doesn't mean that you can only handle 10/second. Queues are really weird and kind of don't make sense until you simulate them. You have to use some statistical distribution of time for each request to get an idea of how such a queue performs. And nothing beats straight-up measurement with a tool like httperf. It's the reality bringer.

> This problem is tending to the uglier side of things, so maybe a bit
> of an ugly hack won't look so bad?

What you mention below isn't a hack, it's actually the primary advantage of using Mongrel over FastCGI. If a Rails action is slow, you can spend a bit more effort and write a Mongrel handler that will do it faster. It's a little trickier--kind of bare metal--but not impossible.

> What happens if you use a Mongrel handler for the file upload,
> storing the file somewhere on disk, then interfere with the normal
> proceedings? Maybe hacking at the request object sent to Rails by
> adding a query parameter or HTTP header that says where the file was
> put? Maybe a redirect with the file name instead of the file?

Bingo. I actually have documentation in the queue for such a thing.
What you could do is have the uploads be done with Mongrel, have even an AJAX progress thing done with Mongrel, and when it's all finished bounce it over to Rails to complete the process.

Best of all, if you did it and managed to completely avoid any "railsisms", shooting for a nice REST upload/progress/done process, you could even scale that up by writing a little Apache or lighttpd module. Of course you'd have to be REALLY desperate to do that, but the possibility is there.

Key with this is to avoid Ruby's sessions. What you can do is grab the cookie and parse out the Rails session ID. Use that as the name of a directory where the user's uploaded file is stored. Then, when Rails is run to process the file, you just have to match the current session ID to that directory. Very lightweight.

> Would that give you the multi-threaded upload?

Yep, but remember, just making something multi-threaded doesn't instantly solve all your problems. All computers eventually have a finite level of concurrency, and there's always a point where just one more request can literally break the server's back. Where this breaking point is can be 10 concurrent or 10 million.

What you should do is test your current setup and see if that meets your needs. There's no point in getting paranoid and spending the next 3 months rewriting your whole app in Mongrel only. If what you have now meets your performance requirements (you do have measurable performance requirements, right?) then don't bother.

Once you see that particular Rails actions don't meet your needs, make a plan to try something else, but don't just grab for a Mongrel handler. Figure out how far off your Rails action is from your needs (you do have measurable performance requirements, right?) and then develop a set of possible solutions that might meet those needs. Do a few small prototypes to see if your proposed solutions will meet the requirements (you do have measurable performance requirements, right?).
Then pick the solution that does the job.

Also, after you've written the solution, retest everything and continually verify. Don't believe that your own shit don't smell. I've seen people spend months making something they thought was really fast, only to find out it gave them no statistically significant improvement. I myself have saved tons of potentially lost development time by testing a potential solution before investing, and by testing as I go I avoid going down bad paths for no benefit.

Basing your decisions on evidence is always much better than just blindly reaching for what everyone else is doing.

Zed A. Shaw
http://www.zedshaw.com/
http://mongrel.rubyforge.org/
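Zed's "grab the cookie and parse out the session ID" idea might look something like this in plain Ruby (a hypothetical sketch: `_session_id` was Rails' default cookie name at the time, and the spool directory is made up):

```ruby
# Hypothetical: derive a per-session upload directory from the raw
# Cookie header, so an upload handler and Rails can agree on where the
# spooled file lives without touching the session machinery at all.
def upload_dir_for(cookie_header, base = "/var/spool/uploads")
  session_id = cookie_header[/_session_id=([0-9a-f]+)/, 1]
  return nil unless session_id     # no session cookie, no spool dir
  File.join(base, session_id)
end

upload_dir_for("_session_id=3f2a9c; path=/")
# => "/var/spool/uploads/3f2a9c"
```

The upload handler writes into that directory; when Rails later processes the request, it recomputes the same path from the same cookie and finds the file waiting.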
Bob Hutchison
2006-May-05 02:01 UTC
[Rails] Mongrel + RubyOnRails + FileUploads = Problems?
On May 4, 2006, at 8:27 PM, Zed Shaw wrote:
>> Does the same issue exist for downloading?
>
> No, since Mongrel or a fronting web server handles those, so you get
> much better concurrency. Now, if your Rails app is generating the
> content for each request, then you're screwed.

Doesn't look good for me then :-) I'll probably do something like below.

> Remember, it's not just queue length but how long each request stays
> in the queue. If you have 10 backend Mongrels, then you have a queue
> length of 10 (basically). That doesn't mean that you can only handle
> 10/second. Queues are really weird and kind of don't make sense until
> you simulate them.
>
> What you mention below isn't a hack, it's actually the primary
> advantage of using Mongrel over FastCGI. If a Rails action is slow,
> you can spend a bit more effort and write a Mongrel handler that will
> do it faster. It's a little trickier--kind of bare metal--but not
> impossible.

This is good. I'm worried about a couple of occasional things that are pretty nasty.

>> What happens if you use a Mongrel handler for the file upload,
>> storing the file somewhere on disk, then interfere with the normal
>> proceedings?
>> Maybe hacking at the request object sent to Rails by adding a query
>> parameter or HTTP header that says where the file was put? Maybe a
>> redirect with the file name instead of the file?
>
> Bingo. I actually have documentation in the queue for such a thing.
[ snip ]
> Key with this is to avoid Ruby's sessions. What you can do is grab the
> cookie and parse out the Rails session ID. Use that as the name of a
> directory where the user's uploaded file is stored. Then, when Rails
> is run to process the file, you just have to match the current session
> ID to that directory. Very lightweight.

Okay, this is in line with what I was hoping. No way am I going to be doing anything like an Apache or lighttpd module.

>> Would that give you the multi-threaded upload?
>
> Yep, but remember, just making something multi-threaded doesn't
> instantly solve all your problems. All computers eventually have a
> finite level of concurrency, and there's always a point where just one
> more request can literally break the server's back. Where this
> breaking point is can be 10 concurrent or 10 million.

I'm only worried about two concurrent users, but where one is doing something like uploading a 45-minute video. When that happens I think the number of concurrent users is going to get a bit higher :-)

The remainder of your advice is good-all-the-time optimisation advice.

Cheers,
Bob
----
Bob Hutchison -- blogs at <http://www.recursive.ca/hutch/>
Recursive Design Inc. -- <http://www.recursive.ca/>
Raconteur -- <http://www.raconteur.info/>
xampl for Ruby -- <http://rubyforge.org/projects/xampl/>
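Zed's point that 10 backends doesn't mean only 10 requests/second can be made concrete with Little's law (concurrency = throughput * time-in-system); the 200 ms figure below is an assumed average, not a measurement:

```ruby
# Little's law: L = lambda * W, so lambda = L / W.
backends            = 10    # concurrent requests the cluster can hold
avg_request_seconds = 0.2   # assumed average time a request occupies a backend

max_throughput = backends / avg_request_seconds
max_throughput  # => 50.0 requests/second sustained by just 10 Mongrels
```

The catch, and the whole subject of this thread, is that one 80MB upload pushes that per-request time from 0.2 seconds to minutes, which collapses the achievable throughput unless uploads are kept off the Rails backends.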
Zed Shaw wrote:
>> What happens if you use a Mongrel handler for the file upload,
>> storing the file somewhere on disk, then interfere with the normal
>> proceedings?
>
> Bingo. I actually have documentation in the queue for such a thing.
> Of course you'd have to be REALLY desperate to do that, but the
> possibility is there.

Call me desperate...

I've been struggling for quite a while on getting a proxy working between IIS (Windows Web Server) and Mongrel (actually, most time was spent on SCGI :-S). The thing I have been fighting was file uploads, as there is a file encoding issue between IIS and Mongrel/SCGI. This seemed to be unsolvable on the .NET part of things (don't ask), so I actually parked the whole project and decided to spend some time learning Rails rather than fighting IIS and .NET on this issue.

If you have any information on the Mongrel & Rails part this would be sweet. Especially any ideas on making this as 'transparent' as possible from a Rails point of view would be very helpful. My Ruby/Rails skills are too limited at this point to tackle this.

/Boris

--
Posted via http://www.ruby-forum.com/.
Brian Hogan
2006-May-05 12:45 UTC
[Rails] Re: Mongrel + RubyOnRails + FileUploads = Problems?
Boris:
I will have a solution to this documented very shortly... I'm just finishing up the documentation for my Rails reverse_proxy_fix plugin.

You'll be able to use IIS to receive the requests, but you'll use ISAPIRewrite (the paid version only) to forward requests to Mongrel, and my plugin to handle the URL rewriting (with some limitations... the URL rewriting only works if the URLs were written with Rails helpers.)

Docs are coming soon which include a simple setup (IIS to Mongrel) and an advanced one (IIS to lighttpd to Mongrel cluster).

I've been promising this for some time, but things keep changing so fast it's hard to keep up. However, I'm hoping to have something on my site at the end of the day.

On 5/5/06, Boris <boris@bitslapped.nl> wrote:
> Call me desperate...
>
> I've been struggling for quite a while on getting a proxy working
> between IIS (Windows Web Server) and Mongrel.
> The thing I have been fighting was file uploads, as there is a file
> encoding issue between IIS and Mongrel/SCGI.
>
> /Boris
Boris
2006-May-05 20:45 UTC
[Rails] Re: Re: Mongrel + RubyOnRails + FileUploads = Problems?
Brian Hogan wrote:
> I will have a solution to this documented very shortly... I'm just
> finishing up the documentation for my Rails reverse_proxy_fix plugin.
>
> You'll be able to use IIS to receive the requests, but you'll use
> ISAPIRewrite (the paid version only) to forward requests to Mongrel,
> and my plugin to handle the URL rewriting.

Cool stuff Brian. I was planning to create a sort of mod_proxy for IIS, to include the possibility of using multiple Mongrel instances without needing a commercial ISAPI plugin.