What kind of caches are you talking about? Are these full-page caches, the
kind that get stored into /public?

My question is: instead of adding /robot to the URL when Apache finds the
robot, can you instead change Apache's DocumentRoot? It seems to me that this
would prevent Apache from finding the cached page. Also, if you point the
alternate root at another copy of your /public, you could still get the
normal static pages.

I think, since you are talking about changing mongrel's caching behaviour,
that you aren't talking about the page caches that get stored into /public.
(Well, I'm rusty on the terminology here.)

-- 
Michael Richardson <mcr at simtone.net>
Director -- Consumer Desktop Development, Simtone Corporation, Ottawa, Canada
Personal: sandelman.ca/mcr

SIMtone Corporation fundamentally transforms computing into simple, secure,
and very low-cost network-provisioned services pervasively accessible by
everyone. Learn more at simtone.net and SIMtoneVDU.com
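The DocumentRoot-switching idea above could be sketched roughly as follows. This is an untested guess, not config from the thread: the bot pattern and the /var/www/app/public_nocache path are made-up placeholders, and in server context mod_rewrite treats a substitution as a filesystem path only when that path exists (a matching <Directory> block granting access would also be needed).

```apache
RewriteEngine On
# Hypothetical sketch: map known bots onto an alternate copy of public/
# that contains no cached pages, so every bot request reaches Rails.
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
RewriteRule ^/(.*)$ /var/www/app/public_nocache/$1 [L]
```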
I am talking about Rails' standard page-caching mechanism. Rails by default
puts full pages into public/..., and if mongrel sees them there, it serves
them (without running the Rails dispatcher et al). This is fine for normal
users but not good for the bot user agents.

Here is a new solution:

1) Set Rails to cache in public/cache.
2) Use an Apache rewrite to serve these files directly (if found).
3) If not found, pass to mongrel, which will not find the cached files
   either, since MONGREL ONLY LOOKS IN public for cached files. (Mongrel
   does not honor the config.action_controller.page_cache_directory Rails
   setting.)
4) Rails processes the request and puts the page into public/cache/...

...and on the next request, Apache serves it from the cache. I am working on
the rewrite rules etc. for this.

Mike

On Jun 10, 2009, at 6:16 PM, mcr at simtone.net wrote:
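A first cut at the Apache side of steps 1-3 might look something like this. It is an untested sketch: the balancer name is illustrative, and real rules would also need to check for the .html variant that page caching writes for extensionless URLs.

```apache
# Rails side (environment.rb, Rails 2.x), for reference:
#   config.action_controller.page_cache_directory = RAILS_ROOT + "/public/cache"

RewriteEngine On
# Serve a cached copy directly if one exists under public/cache...
RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI} -f
RewriteRule ^/(.*)$ /cache/$1 [L]
# ...otherwise punt to the mongrel cluster, which only looks in public/
# and so will run the Rails dispatcher and repopulate public/cache.
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
RewriteRule ^/(.*)$ balancer://mongrel_cluster%{REQUEST_URI} [P,QSA,L]
```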
>>>>> "Mike" == Mike Papper <bodaro at gmail.com> writes:

Mike> I am talking about Rails standard page-caching mechanism. Rails by
Mike> default puts full pages into public/... and if mongrel sees them
Mike> there, it serves them (without running rails dispatch et al). This
Mike> is fine for normal but not good for the bot user agents.

Right, so that's what I thought you were talking about. Only, it's not
mongrel that usually serves up the pages, but Apache. In your Apache config,
you have something like:

    # Rewrite all non-static requests to cluster
    RewriteCond %{DOCUMENT_ROOT}/%{REQUEST_FILENAME} !-f
    RewriteRule ^/(.*)$ balancer://spartan_cluster%{REQUEST_URI} [P,QSA,L]

which basically serves up any files found in /public and otherwise punts to
the mongrel. I thought that Rails put the files directly there for Apache to
use/see. (There are caveats if your mongrel and Apache do not share the same
file system, such as when they are on different machines.) If you are telling
me that mongrel actually does this, it's news to me.

Mike> Here is a new solution:
Mike> 1) set rails to cache in public/cache
Mike> 2) Use Apache rewrite to serve these files directly (if found)
Mike> 3) If not found, pass to mongrel which will not find the cached files
Mike> either since MONGREL ONLY LOOKS IN public for cached files. Mongrel
Mike> does not honor the config.action_controller.page_cache_directory rails
Mike> setting
Mike> 4) Rails processes the file and puts it into public/cache/...
Mike> ...on the next request, apache serves from cache.
Mike> I am working on the rewrite rules etc. for this.

So, basically, have Apache pick a different cache location when it sees a
robot.

-- 
Michael Richardson <mcr at simtone.net>
Director -- Consumer Desktop Development, Simtone Corporation, Ottawa, Canada
Personal: sandelman.ca/mcr
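Combining the two halves of the thread, "pick a different cache location when Apache sees a robot" might look roughly like this. The bot pattern, balancer name, and /cache path are illustrative assumptions, not tested config from either poster.

```apache
RewriteEngine On
# Robots skip the cache check entirely and go straight to the cluster,
# so they always get a freshly rendered page from Rails.
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
RewriteRule ^/(.*)$ balancer://mongrel_cluster%{REQUEST_URI} [P,QSA,L]
# Everyone else gets the cached copy from public/cache when it exists.
RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI} -f
RewriteRule ^/(.*)$ /cache/$1 [L]
# No cached copy: fall through to the cluster as usual.
RewriteRule ^/(.*)$ balancer://mongrel_cluster%{REQUEST_URI} [P,QSA,L]
```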
"Serving up different results based on user agent may cause your site to be
perceived as deceptive and removed from the Google index."

google.com/support/webmasters/bin/answer.py?hl=en&answer=66355

-----Original Message-----
From: mongrel-users-bounces at rubyforge.org
[mailto:mongrel-users-bounces at rubyforge.org] On Behalf Of Mike Papper
Sent: Wednesday, June 10, 2009 8:42 PM
To: mongrel-users at rubyforge.org
Subject: Re: [Mongrel] Non-Cache Handling for Bots
_______________________________________________
Mongrel-users mailing list
Mongrel-users at rubyforge.org
rubyforge.org/mailman/listinfo/mongrel-users