I have an Apache server that gets hit about 100 times simultaneously every half hour with requests for URLs that match this pattern:


These URLs once had content and were valid. Now they are all 404s, which means this bot is killing performance every time it hits us.

What can I add to my .htaccess file to block it?

Note: The bot is on EC2, so blocking by IP address won't work. I need to block requests that match that pattern.

Using a mod_rewrite rule should get you where you need to be:

RewriteEngine On
RewriteCond %{REQUEST_URI} ^/neighborhood/[^/]+/feed$ [NC]
RewriteRule ^.*$ - [F,L]

The above goes into your .htaccess file, or, if you prefer, into your vhost file (because you have turned off .htaccess parsing for performance -- a good idea):

<Location />
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/neighborhood/[^/]+/feed$ [NC]
RewriteRule ^.*$ - [F,L]
</Location>

Given a URI of /neighborhood/carson/feed, you will very likely get a response such as:


You don't have permission to access /neighborhood/carson/feed on this server.

Apache/2.2.16 (Ubuntu) Server at ... Port 80

This was tested on my local VM running Apache/2.2.16 on Ubuntu 10.10.
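As a quick sanity check on the pattern itself, outside Apache, the RewriteCond regex can be reproduced in Python; the [NC] flag corresponds to case-insensitive matching. This is only a sketch of the regex semantics, not Apache's own matching engine:

```python
import re

# Same pattern as the RewriteCond; [NC] in Apache maps to
# case-insensitive matching (re.IGNORECASE) here.
PATTERN = re.compile(r"^/neighborhood/[^/]+/feed$", re.IGNORECASE)

def is_blocked(uri: str) -> bool:
    """Return True if the URI would match the blocking rule."""
    return PATTERN.match(uri) is not None

print(is_blocked("/neighborhood/carson/feed"))    # True
print(is_blocked("/NEIGHBORHOOD/Carson/FEED"))    # True (NC flag)
print(is_blocked("/neighborhood/carson/feed/x"))  # False (anchored at $)
```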

mod_rewrite? Though I doubt it can be made much faster at the Apache level. I'd take a look at nginx as a frontend; it's far more efficient at both serving 404s and evaluating rules :-)
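If you do put nginx in front, a minimal sketch of the equivalent block might look like the following. This assumes nginx proxies everything else to Apache on 127.0.0.1:8080; the port and addresses are placeholders to adjust for your setup:

```nginx
# Hypothetical frontend config; the upstream address is an assumption.
server {
    listen 80;

    # Reject the bot's pattern before it ever reaches Apache.
    # ~* makes the regex case-insensitive, like Apache's [NC].
    location ~* ^/neighborhood/[^/]+/feed$ {
        return 403;
    }

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```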

PS. You could also try returning a redirect to a 100 MB file somewhere to have a laugh at those bots :-D

Put a caching system or CDN before Apache, and allow your 404 responses to be cached.
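For a cache or CDN to actually absorb those hits, the 404 responses need cache headers, which Apache does not send on error responses by default. A sketch using mod_headers (assuming the module is loaded; the 300-second TTL is an arbitrary choice):

```apache
<LocationMatch "^/neighborhood/[^/]+/feed$">
    # "always" makes the header apply to error responses (the 404s) too.
    Header always set Cache-Control "public, max-age=300"
</LocationMatch>
```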

403s can easily be returned by mod_rewrite:

RewriteRule ^neighborhood/[^/]+/feed$ - [F]