I've heard this is possible with iptables and was wondering how one might do this with Shorewall. Here's the situation:

I have a very popular Web site that contains thousands of news stories in real time. The problem is that people with high-speed internet access are running robots and other fetching clients which are bringing my server to a crawl. It's basically like a DoS that I can't seem to stop. I know, "just add them to your blacklist", but it's not that easy. These people's IPs change all the time. If I block them now, they are right back at it in 5 minutes. Plus, if I block them by IP, it ends up blocking legitimate new users later. I can't add dozens of IP addresses to that list every day and expect it to stay correct.

I've heard there might be a way to block greedy clients in real time using iptables: if a client (i.e., an IP) breaks a speed limit, it is blocked for some number of seconds, minutes, or hours, and once that time has elapsed you let it back in.

I know you can do something similar with mod_perl, but that requires every page on the server to be served through this mod_perl package. I'm sure there must be a better way, and I thought the experts at Shorewall may have come across the same situation. I'm sure I'm not the only one whose server is being brought to a crawl by these greedy clients.

Any help would be appreciated,
Karen
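For reference, the kind of per-IP speed limit described above can be sketched directly with iptables' "recent" match (at the time a patch-o-matic extension, so this assumes your kernel and iptables build include it). The port, list name and thresholds below are illustrative assumptions only, and raw rules like these would need to be loaded alongside the Shorewall-generated ruleset, e.g. from a Shorewall extension script, rather than being a Shorewall feature as such:

# Sketch only: per-source-IP rate limit on new HTTP connections.
# Assumes the ipt_recent / "recent" match is available.

# Record every new connection to port 80 in a list named "web".
iptables -A INPUT -p tcp --dport 80 --syn -m recent --name web --set

# Drop further connections from any source that has opened more than
# 10 connections in the last 60 seconds; --update also refreshes the
# timestamp, so a client stays blocked until it backs off for a full
# 60 seconds, after which it is let back in automatically.
iptables -A INPUT -p tcp --dport 80 --syn -m recent --name web \
         --update --seconds 60 --hitcount 10 -j DROP

Connections below the limit simply fall through to whatever ACCEPT rules already exist for the web server.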
Just wondering if you are using a robots.txt file to help prevent that activity...

> -----Original Message-----
> From: Karen Barnes
> Sent: Wednesday, September 18, 2002 10:37
> To: shorewall-users@shorewall.net
> Subject: [Shorewall-users] Stopping greedy clients!
>
> I've heard this is possible with iptables and was wondering how one
> might do this with Shorewall. Here's the situation:
>
> I have a very popular Web site that contains thousands of news stories
> in real time. The problem is that people with high-speed internet
> access are running robots and other fetching clients which are
> bringing my server to a crawl. [...]
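Note that a robots.txt only helps against well-behaved robots; abusive fetchers are free to ignore it. Still, a minimal sketch is shown below. The document root and the disallowed path are placeholder assumptions, and Crawl-delay is a non-standard extension that only some crawlers honour:

# Sketch: drop a robots.txt into the web server's document root.
# /var/www/html and /news/ are placeholders for your actual paths.
cat > /var/www/html/robots.txt <<'EOF'
User-agent: *
# Ask compliant robots to stay out of the news area entirely.
Disallow: /news/
# Non-standard: a per-request delay in seconds, honoured by some crawlers.
Crawl-delay: 30
EOF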
Hi Karen,

Karen Barnes wrote:
> I know you can do something similar with mod_perl, but that requires
> every page on the server to be served through this mod_perl package.
> I'm sure there must be a better way [...]

Did you check mod_throttle?

Regards
   Charly

P.S. No, I don't use mod_throttle myself; our university content is not so interesting :-(

--
Karl Gaissmaier, Computing Center, University of Ulm, Germany
Email: karl.gaissmaier@rz.uni-ulm.de
Network Administration, Tel.: ++49 731 50-22499
Karl Gaissmaier wrote:
> Did you check mod_throttle?

Thanks for the tip! I've now installed mod_throttle here at Shorewall.net.

-Tom
--
Tom Eastep      \ Shorewall - iptables made easy
AIM: tmeastep   \ http://www.shorewall.net
ICQ: #60745924  \ teastep@shorewall.net