> -----Original Message-----
> From: centos-bounces at centos.org [mailto:centos-bounces at centos.org] On
> Behalf Of Keith Keller
> Sent: Sunday, August 28, 2016 4:23 PM
> To: centos at centos.org
> Subject: Re: [CentOS] .htaccess file
>
> On 2016-08-28, TE Dukes <tdukes at palmettoshopper.com> wrote:
> >
> > I'm just not following or understanding. The .htaccess file works, but
> > on a slow DSL, I don't want the hits.
>
> What exactly is slow when you receive requests from remote clients that
> you don't want? Are you actually seeing problems when clients make
> requests and Apache has to read in your 2MB .htaccess on every request?
> And if so, you might also consider moving your blocking even higher, to
> iptables rules, so that Apache never even has to deal with them.
>
> > I added the following to my httpd.conf:
> >
> > <Directory "/var/www/htdocs">
> > AddType text/htdocs ".txt"
> > </Directory>
> >
> > And copied my .htaccess to /var/www/htdocs as htaccess.txt
>
> Where did you get the idea that this is how to do global Apache
> configuration? This won't actually do anything useful.
>
> > In the example from the apache website, I don't get the: AddType
> > text/example ".exm"  Where did they come up with .exm?
>
> They made it up as an example, to demonstrate how directives work in
> .htaccess files versus global Apache config files. It's not meant to
> demonstrate how to add blocking rules to the global config.
>
> Here's the main point of that page:
>
> "Any directive that you can include in a .htaccess file is better set in a
> Directory block, as it will have the same effect with better performance."
>
> So, to achieve what I think you're hoping for, take all the IPs you're
> denying in your .htaccess file, put them into a relevant Directory block
> in a config file under /etc/httpd, reload Apache, and move your .htaccess
> file out of the way. Then httpd will no longer have to read in .htaccess
> for every HTTP request.
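[For reference, a Directory-block version of such a deny list might look like the sketch below. The file name, directory path, and addresses are made up, and the syntax shown is Apache 2.4's `Require` form; Apache 2.2 would use `Order`/`Deny from` instead.]

```apache
# /etc/httpd/conf.d/blocklist.conf  (hypothetical file name)
<Directory "/var/www/html">
    # Stop httpd from looking for .htaccess here at all
    AllowOverride None

    <RequireAll>
        Require all granted
        # Example addresses only -- substitute the IPs from your .htaccess
        Require not ip 192.0.2.10
        Require not ip 198.51.100.0/24
    </RequireAll>
</Directory>
```

After editing, something like `apachectl configtest` followed by a reload applies the change without a full restart.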
> Or, alternatively, block those IPs using iptables instead. However,
> clients will still be able to make those requests, and that will still
> use bandwidth on your DSL. The only way to eliminate that altogether is
> to block those requests on the other side of your link. That's something
> you'd have to work out with your ISP, but I don't think it's common for
> ISPs to put up blocking rules solely for this purpose, or to allow home
> users to configure such blocks themselves.
>
> --keith

[Thomas E Dukes] I set up an ipset but quickly ran out of room in the set. I guess I'll have to set up multiple sets.

Right now, I'm just trying to take some load off my home server from badbots, but I am getting hit on other services as well. There's nothing on the webserver except a test site I use. Just trying to keep out the ones that ignore robots.txt.

Thanks!!
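[A side note on the "ran out of room" problem: the limit hit here is most likely ipset's `maxelem` parameter, which defaults to 65536 entries for hash-type sets and can be raised when the set is created. A sketch follows; the set name and addresses are made up.]

```shell
# Create a hash-type set that can hold up to ~1M addresses
ipset create badbots hash:ip maxelem 1048576

# Add entries (example addresses only)
ipset add badbots 192.0.2.10
ipset add badbots 198.51.100.23

# Drop anything in the set before it ever reaches Apache
iptables -I INPUT -m set --match-set badbots src -j DROP
```

Because the iptables rule references the set by name, entries can be added or removed later without touching the firewall rules.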
> There's nothing on the webserver except a test site I use. Just trying to
> keep out the ones that ignore robots.txt

If it's just a test server, then I'd be tempted to use HTTP AUTH at the top level. Most robots will be blocked by that, and you can use iptables to block the ones that try to guess your password, perhaps with fail2ban.

-- 
Kahlil (Kal) Hodgson                       GPG: C9A02289
Chief Technology Officer                   (m) +61 (0) 4 2573 0382
Direct Pricing Exchange Pty Ltd
Suite 1415
401 Docklands Drive
Docklands VIC 3008 Australia

"All parts should go together without forcing. You must remember that the
parts you are reassembling were disassembled by you. Therefore, if you
can't get them together again, there must be a reason. By all means, do
not use a hammer." -- IBM maintenance manual, 1925
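[A minimal sketch of top-level HTTP basic auth on Apache 2.4, along the lines suggested above; the paths and user name are made up.]

```apache
# Create the password file once, outside the config:
#   htpasswd -c /etc/httpd/conf/htpasswd testuser

<Directory "/var/www/html">
    AuthType Basic
    AuthName "Test site"
    AuthUserFile /etc/httpd/conf/htpasswd
    Require valid-user
</Directory>
```

Failed password guesses then show up in Apache's error log, where a fail2ban jail such as apache-auth can pick them up and ban the source address.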
On 2016-08-28, TE Dukes <tdukes at palmettoshopper.com> wrote:

> I set up an ipset but quickly ran out of room in the set. I guess I'll
> have to set up multiple sets.

I'm not familiar with ipsets, but from a quick Google search it seems like you can increase the size of an ipset (or make a new, larger one and migrate your IPs to the new one). Multiple sets looks like it'd work as well.

> Right now, I'm just trying to take some load off my home server from
> badbots but I am getting hit on other services as well.

Another possibility for you to look at is sshguard. It can protect against brute-force ssh attacks (using iptables rules, which is how I use it), but IIRC it can also protect against http attacks (I've never used it that way, so I don't know how difficult this is).

Can you be more specific about the "load" you're trying to mitigate? Is it really the load on your home system, or is it that attackers are using your bandwidth, or a combination?

--keith

-- 
kkeller at wombat.san-francisco.ca.us
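[The make-a-larger-set-and-migrate route mentioned above can be done without dropping the firewall rule, using `ipset swap` to exchange the contents of two sets of the same type atomically. A sketch, assuming the existing set is called badbots; all names are made up.]

```shell
# Build a bigger replacement set of the same type
ipset create badbots-new hash:ip maxelem 1048576

# Copy the existing entries into it (only the "add" lines)
ipset save badbots | grep '^add ' | sed 's/ badbots / badbots-new /' | ipset restore

# Atomically exchange the two sets' contents, then discard the old one
ipset swap badbots badbots-new
ipset destroy badbots-new
```

Any iptables rule matching `--match-set badbots` keeps working throughout, since the swap exchanges contents under the existing name.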
On Sun, Aug 28, 2016 at 5:23 PM, Keith Keller <kkeller at wombat.san-francisco.ca.us> wrote:

> On 2016-08-28, TE Dukes <tdukes at palmettoshopper.com> wrote:
>
>> Right now, I'm just trying to take some load off my
>> home server from badbots but I am getting hit on other services as well.
>
> Another possibility for you to look at is sshguard. It can protect
> against brute force ssh attacks (using iptables rules, which is how I
> use it) but IIRC it can also protect against http attacks (I've never
> used it that way, so I don't know how difficult this is).

I use fail2ban; it provides functionality similar to sshguard, plus Apache mod_evasive (for http DoS attacks).

-- Arun Khan
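[For the badbots case specifically, fail2ban ships an apache-badbots filter out of the box; enabling it takes only a jail entry along these lines. The log path assumes the stock CentOS httpd layout, and the ban time shown is arbitrary.]

```ini
# /etc/fail2ban/jail.local
[apache-badbots]
enabled  = true
port     = http,https
logpath  = /var/log/httpd/access_log
maxretry = 1
bantime  = 86400
```

The filter matches known bad-bot user agents in the access log, so bots that ignore robots.txt but announce themselves get banned on the first hit.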
> -----Original Message-----
> From: centos-bounces at centos.org [mailto:centos-bounces at centos.org] On
> Behalf Of Kahlil Hodgson
> Sent: Sunday, August 28, 2016 6:42 PM
> To: CentOS mailing list
> Subject: Re: [CentOS] .htaccess file
>
> > There's nothing on the webserver except a test site I use. Just trying
> > to keep out the ones that ignore robots.txt
>
> If it's just a test server, then I'd be tempted to use HTTP AUTH at the
> top level. Most robots will be blocked by that, and you can use iptables
> to block the ones that try to guess your password, perhaps with fail2ban.
>
> --

[Thomas E Dukes] I have thought about that as well. I do have fail2ban installed, as well as denyhosts.

Thanks!!
> -----Original Message-----
> From: centos-bounces at centos.org [mailto:centos-bounces at centos.org] On
> Behalf Of Keith Keller
> Sent: Sunday, August 28, 2016 8:23 PM
> To: centos at centos.org
> Subject: Re: [CentOS] .htaccess file
>
> On 2016-08-28, TE Dukes <tdukes at palmettoshopper.com> wrote:
> > I set up an ipset but quickly ran out of room in the set. I guess I'll
> > have to set up multiple sets.
>
> I'm not familiar with ipsets, but from a quick Google search it seems
> like you can increase the size of an ipset (or make a new, larger one
> and migrate your IPs to the new one). Multiple sets looks like it'd work
> as well.
>
> > Right now, I'm just trying to take some load off my home server from
> > badbots but I am getting hit on other services as well.
>
> Another possibility for you to look at is sshguard. It can protect
> against brute force ssh attacks (using iptables rules, which is how I
> use it) but IIRC it can also protect against http attacks (I've never
> used it that way, so I don't know how difficult this is).
>
> Can you be more specific about the "load" you're trying to mitigate? Is
> it really the load on your home system, or is it that attackers are
> using your bandwidth, or a combination?
>
> --keith

[Thomas E Dukes] I saw that as well, but it was a little vague on how to do that.

Thanks!!