I've got some bad bots targeting this site, and I need to dynamically handle the IP addresses those bots come from. It's a pretty high-traffic site; we get a few million pageviews a day, which is why we use 4 servers (load-balanced). We don't do any caching (besides assets) because most of our responses are unique.

Code-wise it's a pretty small PHP website, which does no database queries and one XML request per pageview. The XML request gets a quite fast response.

I have created a script to (frequently) analyse which IP addresses are making abusive requests, and I want to handle requests from those IPs differently for a while. The IPs that are abusive change a lot, so I need to block different IPs every few minutes.

So: I see IP xx.xx.xx.xx being abusive, I record that somewhere, and then I want to give that IP special treatment for the next x minutes of requests it makes. I need to do this in a fast way, because I don't want to slow down the server and have the legitimate users suffer for it.

Solution 1: file

Writing the abusive IPs to a file and then reading that file for every request seems too slow. Would you agree?

Solution 2: PHP include

I could let my analysis script write a PHP include file that the PHP engine would then include for every request. But: I'm afraid that, while the PHP file is being written, lots of users doing a request at that exact moment would get an error because the file is in use.

I could solve that potential problem by writing the file and then doing a symlink change (which may be faster).
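A minimal sketch of that symlink trick, assuming the analysis script is the one doing the writing (paths, file names and the `publish_blocklist` helper are my own, not from the original setup). The new include is written under a fresh name first, and a symlink is renamed over the live one; `rename(2)` is atomic on POSIX, so the PHP side never includes a half-written file:

```python
import os
import tempfile

def publish_blocklist(php_source: str, directory: str) -> None:
    """Write a new PHP include under a unique name, then atomically
    repoint the 'blocked.php' symlink at it."""
    # 1. Write the new include to a uniquely named file.
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".php")
    with os.fdopen(fd, "w") as f:
        f.write(php_source)
    # 2. Create a temporary symlink pointing at the new file ...
    link_tmp = os.path.join(directory, "blocked.php.tmp")
    if os.path.lexists(link_tmp):
        os.remove(link_tmp)
    os.symlink(tmp_path, link_tmp)
    # 3. ... and rename it over the live symlink. rename(2) is atomic,
    # so include('blocked.php') on the PHP side always sees either the
    # old complete file or the new complete file, never a partial one.
    os.replace(link_tmp, os.path.join(directory, "blocked.php"))

demo_dir = tempfile.mkdtemp()
publish_blocklist("<?php $blocked = ['203.0.113.45'];\n", demo_dir)
print(os.path.islink(os.path.join(demo_dir, "blocked.php")))  # True
```

Old versions pile up under their unique names, so a real deployment would also prune stale files now and then.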

Solution 3: htaccess

Another way to separate out the abusers would be to write an htaccess file that blocks or redirects them. This may well be the best way, but I'd have to write a new htaccess file every x minutes.
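For illustration, a generated `.htaccess` for this could look something like the fragment below (Apache 2.4 `Require` syntax; the addresses are placeholders, not from the original setup):

```apache
# Regenerated every x minutes by the analysis script.
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
    Require not ip 198.51.100.0/24
</RequireAll>
```

Apache rereads `.htaccess` on every request, so the new list takes effect immediately, but that per-request read is also exactly the overhead the question is worried about.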

I'd like to hear some ideas/responses to my suggested solutions, especially concerning speed.

How about dynamically setting up iptables to block the bad IPs? I don't see any reason to do the "firewalling" in PHP...
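The analysis script could drive iptables directly along these lines (a sketch, not the answerer's actual code; the real calls need root, so the helpers below default to returning the command instead of running it):

```python
import subprocess

def block_ip(ip: str, dry_run: bool = True) -> list[str]:
    """Insert an iptables rule dropping all traffic from `ip`.
    With dry_run=True the command is only built and returned."""
    cmd = ["iptables", "-I", "INPUT", "-s", ip, "-j", "DROP"]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

def unblock_ip(ip: str, dry_run: bool = True) -> list[str]:
    """Delete the matching rule again once the punishment expires."""
    cmd = ["iptables", "-D", "INPUT", "-s", ip, "-j", "DROP"]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

print(" ".join(block_ip("203.0.113.45")))
# iptables -I INPUT -s 203.0.113.45 -j DROP
```

The script would call `unblock_ip` for each entry after its x minutes are up, and the rules would have to be applied on all 4 servers (or on the load balancer itself).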

For the record, I have finally decided to go with (my own suggested) solution number 2: generating a PHP file that is included on every page request.

The complete solution is as follows: a Python script analyses the access log every x minutes and doles out "punishments" to certain IP addresses. All currently running punishments are written into a fairly small (<1 KB) PHP file. This PHP file is included on every page request. Directly after generating the PHP file, an rsync job is started to push the new PHP file to the other 3 servers behind the load balancer.

In the Python script that creates the PHP file, I first concatenate the complete contents of the file. I then open, write and close the file sequentially, to lock the file for the shortest possible period.
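That flow can be sketched roughly as follows (the `render_php` / `write_and_push` helpers and the punishment table layout are illustrative assumptions, not the author's actual script):

```python
import subprocess
import time

# Hypothetical punishment table: IP -> unix timestamp when the block expires.
punishments = {"203.0.113.45": time.time() + 300}

def render_php(punishments: dict[str, float]) -> str:
    """Build the complete include file in memory first, so the actual
    file write below is a single short operation."""
    entries = ", ".join(
        f"'{ip}' => {int(expiry)}" for ip, expiry in punishments.items()
    )
    return f"<?php\n$blocked = array({entries});\n"

def write_and_push(path: str, mirrors: list[str]) -> None:
    source = render_php(punishments)
    # Open, write and close in one go to keep the window in which the
    # file is locked/incomplete as small as possible.
    with open(path, "w") as f:
        f.write(source)
    # Push the fresh file to the other servers behind the load balancer.
    for host in mirrors:
        subprocess.run(["rsync", "-az", path, f"{host}:{path}"], check=True)
```

On the PHP side a plain `include` of this file then gives every request the current `$blocked` array to check against.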

I would seriously consider setting up a separate server that holds the (constantly changing) block list in memory and serves the front-end servers. I implemented such a solution using Node.JS and found the implementation easy and the performance excellent. memcached could also be used, but I never tried it.
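The answerer used Node.JS; to stay in one language here, a comparable in-memory lookup service can be sketched in Python (the endpoint shape, port and `BLOCKLIST` layout are my assumptions, not the original implementation):

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# IP -> expiry timestamp. A real service would add locking and an
# endpoint for the analysis script to push new punishments.
BLOCKLIST: dict[str, float] = {"203.0.113.45": time.time() + 300}

class BlocklistHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Front-end servers ask: GET /<ip>  ->  "1" (blocked) or "0".
        ip = self.path.lstrip("/")
        blocked = BLOCKLIST.get(ip, 0) > time.time()
        body = b"1" if blocked else b"0"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

def serve(port: int = 8399) -> HTTPServer:
    """Start the blocklist service on localhost in a background thread."""
    server = HTTPServer(("127.0.0.1", port), BlocklistHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Since the list lives in one process's memory, there is nothing to rsync between the 4 front-end servers; expired entries fall out automatically because the expiry check happens at lookup time.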