This site has been getting vulnerability scans recently which I'm not responsible for. I am attempting to block any IP that requests more than a reasonable number of pages. I'm not really sure about the threshold; maybe 4-5 requests, and then block them.

I have seen this done with LiteSpeed web servers, but I have Apache. How do I limit PHP requests to my pages so that if someone exceeds the limit they're blocked from the site for half an hour? (I'm not really sure what an adequate amount is; something so a genuine user wouldn't get blocked.)

I use this to block specific IP addresses; just paste it into the script:

<?php
$deny = array("111.111.111", "222.222.222", "333.333.333");
if (in_array($_SERVER['REMOTE_ADDR'], $deny)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>

To block someone automatically, you would need to log the IPs over time:

$ip = $_SERVER['REMOTE_ADDR'];
$time = time();
$myFile = "ip_log.txt";
$fh = fopen($myFile, 'a') or die("can't open file");
$stringData = $ip . ":" . $time . "\n";
fwrite($fh, $stringData);
fclose($fh);

To automate it, let the first script build the array from the log:

$ip = $_SERVER['REMOTE_ADDR'];
$ips = file("ip_log.txt");
$count = count($ips);
for ($i = 0; $i < $count; $i++) {
    if (strpos($ips[$i], $ip) !== false) {
        $teile = explode(":", trim($ips[$i]));
        $ip_h[] = $teile[0];   // part 1: the IP address
        $time_h[] = $teile[1]; // part 2: the timestamp
    }
}

// Now you have arrays of the visitor's IP with timestamps. Set a time interval and a number of requests that you define as "spam", then check through the array with an if and deny access, similar to the first script.

Written by me without testing.
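Putting the pieces above together, here is a minimal, self-contained sketch of the log-based approach. The threshold (5 requests per 60 seconds), the log file name, and the "ip:timestamp" line format are assumptions taken from the snippets above, not tested values:

```php
<?php
// Sketch of the log-based rate check described above.
// Assumptions: one "ip:timestamp" line per request, 5 requests
// per 60 seconds as the limit.

// Append one "ip:timestamp" line to the log.
function log_request($logFile, $ip, $time)
{
    file_put_contents($logFile, $ip . ":" . $time . "\n", FILE_APPEND);
}

// Count how many requests this IP made inside the last $window seconds.
function count_recent_requests($logFile, $ip, $now, $window)
{
    if (!file_exists($logFile)) {
        return 0;
    }
    $hits = 0;
    foreach (file($logFile) as $line) {
        list($loggedIp, $loggedTime) = explode(":", trim($line));
        if ($loggedIp === $ip && ($now - (int)$loggedTime) <= $window) {
            $hits++;
        }
    }
    return $hits;
}

// true when the IP exceeded $limit requests in the last $window seconds.
function is_blocked($logFile, $ip, $now, $limit = 5, $window = 60)
{
    return count_recent_requests($logFile, $ip, $now, $window) > $limit;
}
```

On each page load you would call log_request() and then is_blocked(), sending a 403 and exiting when it returns true. For the 30-minute ban from the question you would additionally record the time of the block and keep denying until time() passes it by 1800 seconds.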

The key factor in this kind of solution is that the actual blocking is very cheap in resources. It's also important to recognise attackers correctly and not ban someone accidentally, but you also shouldn't be too lenient, or you will already have wasted plenty of resources before blocking somebody.

However, you aren't the first to run into this issue.

There's an excellent Apache module available called mod_evasive that can be configured to do just that.
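A typical mod_evasive configuration looks roughly like this; the numbers are illustrative rather than recommendations, so check the module's README for the exact directives in your version:

```apache
<IfModule mod_evasive20.c>
    DOSHashTableSize  3097
    # max requests for the same page per DOSPageInterval seconds
    DOSPageCount      5
    DOSPageInterval   1
    # max requests for the whole site per DOSSiteInterval seconds
    DOSSiteCount      50
    DOSSiteInterval   1
    # how long an offender stays blocked, in seconds
    # (1800 = the 30 minutes asked for in the question)
    DOSBlockingPeriod 1800
</IfModule>
```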

I once used a simple technique on a server (because it worked better with that particular application) which had a flow like this:

My application counts up a variable in the local APC cache, using a key made from the current hour of the day plus the user's IP address, and checks whether a limit has been reached.

If it has, the IP address is written into a job log, from which a cron job picks it up and adds a filter using iptables, so packets from that IP are simply dropped.

So they don't even reach the webserver, and the resources spent are minimal.

Once a day the added rules get removed.
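The counting step of the flow above can be sketched as follows. A plain array stands in for the APC cache here so the logic is self-contained; in production you would call apcu_inc() on the same key and let entries expire instead. The limit of 100 is a placeholder, not a value from the original setup:

```php
<?php
// Sketch of the hour+IP keyed counter described above.
// $cache is a plain array standing in for APC/APCu.

function rate_limit_key($ip, $timestamp)
{
    // One bucket per IP per hour of the day, as in the described flow.
    return "hits_" . date("H", $timestamp) . "_" . $ip;
}

// Increment the counter; return true when the IP is over the limit
// and should be written to the job log for the iptables cron job.
function over_limit(array &$cache, $ip, $timestamp, $limit = 100)
{
    $key = rate_limit_key($ip, $timestamp);
    if (!isset($cache[$key])) {
        $cache[$key] = 0;
    }
    $cache[$key]++;
    return $cache[$key] > $limit;
}
```

The cron job side can then be as simple as reading the job log and running `iptables -I INPUT -s <ip> -j DROP` for each entry, with a daily cleanup that deletes the added rules again.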

I think everyone knows what will happen next if you block after 4-5 requests (considering that every file, image, etc. is a single request).

To suggest an alternative, why not simply block the problem IP address(es) individually? You can most probably get the IP(s) in question from your traffic logs, and it is trivial to add a "Deny from" style IP block to your httpd config. (Furthermore, unlike a PHP-based approach, this will apply to images, etc. too.)
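For example, with Apache 2.2-style access control (the directory path and addresses are placeholders):

```apache
<Directory "/var/www/html">
    Order Allow,Deny
    Allow from all
    # drop requests from these attacker addresses entirely
    Deny from 111.111.111.111
    Deny from 222.222.222.222
</Directory>
```

On Apache 2.4 the equivalent uses mod_authz_core's Require directives instead (e.g. `Require not ip 111.111.111.111` inside a `<RequireAll>` block).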

Doing otherwise is likely to risk blocking legitimate website traffic, Google bots, etc.