I set a 'deny from' in my .htaccess to keep certain spam bots from crawling my website. With the code below, I noticed that my log file is filling up with lots of 'client denied by server configuration' entries, and it clutters the log files whenever a bot starts its scan. Any ideas?

Thanks, Steve

<Files *>
order allow,deny
allow from all
deny from 123.45.67.8
</Files>

I ended up using the following:

RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$
RewriteRule (.*) - [F,L]

Have a look at the conditional logging described here - I believe that will give you everything you need:

http://httpd.apache.org/docs/2.2/logs.html
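
A minimal sketch based on the conditional-logging section of those docs (paths and the IP are placeholders for your own): tag requests from the bot with SetEnvIf, then tell CustomLog to skip tagged requests. Note this only filters the *access* log - in 2.2 the 'client denied' messages go to the ErrorLog, which can't be filtered the same way.

```apache
# Tag any request coming from the bot's IP
SetEnvIf Remote_Addr "123\.45\.67\.8" dontlog
# Log everything EXCEPT tagged requests
CustomLog /var/log/apache2/access.log combined env=!dontlog
```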

Also - if you can see that the various bots always come from a specific IP, you can block them in your hosts.allow/deny files by IP, or automatically using something like blockhosts or mod_evasive; that way Apache never sees the requests and never logs them.
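
For reference, the hosts.deny syntax looks like this (IP is illustrative; note tcp-wrappers only takes effect for services linked against libwrap, so check whether your Apache build honors it - if not, use the firewall instead):

```
# /etc/hosts.deny - refuse all wrapped services to this address
ALL: 123.45.67.8
```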

-sean

UPDATE: Are you identifying the IP addresses by hand and then adding them to your .htaccess? That sounds painful. If you must do it this way, I'd recommend blocking the IP addresses in the firewall with a DROP rule, or as above in hosts.allow/deny.
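
The DROP rule would look something like this (iptables syntax, run as root; the IP is illustrative). Dropped packets never reach Apache, so nothing is logged:

```
# Silently drop all traffic from the bot's address
iptables -A INPUT -s 123.45.67.8 -j DROP
```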

SPURIOUS BROKEN RECORD UPDATE: Take a look at blockhosts - it can block IP addresses based on their 'behavior' and will eliminate the need for you to manually prune them out every day.

You can have the log file sent to a program (i.e., a script).

Perhaps implement a script that filters these entries out (or just provides a periodic summary?) and passes the rest through to the log file?
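
A minimal sketch of that idea, assuming the script name, IP, and log path are your own: Apache's piped logging (CustomLog "|...") feeds each access-log line to the program's stdin, and the program decides what to keep.

```shell
#!/bin/sh
# filter_bot.sh - hypothetical piped-log filter (name and IP illustrative).
# Apache pipes each access-log line to our stdin; we drop the bot's lines
# and emit everything else on stdout.
BOT_IP="123.45.67.8"

filter_bot() {
    # read log lines on stdin, suppress the bot, pass the rest through
    grep -v -F "$BOT_IP"
}

# Wired up in httpd.conf, something like:
#   CustomLog "|/usr/local/bin/filter_bot.sh >> /var/log/apache2/access.log" combined
```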