This is a shared hosting environment. I control the server, but not necessarily the content. I have a client with a Perl script that seems to run out of control every now and then and suck down 50% of the processor until the process is killed.
With ASP scripts, I am able to restrict how long a script can run, and IIS will simply shut it down after, say, 90 seconds. This doesn't work for Perl scripts, because they run as CGI processes (IIS actually launches an external process to execute the script).
Similarly, techniques that look for excess resource consumption in a worker process won't catch this, because the resource being consumed (the processor) is being chewed up by a child process rather than the worker process itself.
Is there a way to make IIS abort a Perl script (or other CGI-type process) that's running too long? How?
On a UNIX-style system, I'd use a signal handler to trap ALRM signals, then use the alarm function to start a timer before beginning any action I expected might time out. If the action completed, I'd call alarm() again to turn off the alarm and exit normally; otherwise the signal handler would fire and shut everything down gracefully.
I haven't worked with Perl on Windows in a while, and although Windows is somewhat POSIXy, I can't guarantee this will work; you'll need to check the Perl documentation to see whether, or to what extent, signals are supported on your platform.
More detailed information on signal handling and this sort of self-destruct programming using alarm() can be found in the Perl Cookbook. Here's a brief example lifted from another post and modified a little:
```perl
eval {
    # Create the signal handler and make it local so it falls out of
    # scope outside the eval block
    local $SIG{ALRM} = sub { die "alarm\n" };

    # Set the alarm, spend the time running the routine, and switch off
    # the alarm if it completes.
    alarm(90);
    routine_that_might_take_a_while();
    alarm(0);
};
if ($@) {
    die $@ unless $@ eq "alarm\n";   # propagate unexpected errors
    # we get here if the routine timed out -- clean up gracefully
}
```
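If modifying the client's script isn't an option, another angle is to point IIS at a small wrapper that launches the real script and kills it when it overruns. This is only a sketch under stated assumptions: `run_with_timeout` is a hypothetical helper name, and the `fork`/`alarm`/`kill` behavior shown here is POSIX-flavored; on Windows perl, where fork is emulated with threads, signal and kill semantics differ and would need checking.

```perl
#!/usr/bin/perl
# Sketch of a timeout wrapper: run a command, hard-kill it if it
# exceeds a wall-clock limit. Assumes a POSIX-ish perl.
use strict;
use warnings;

sub run_with_timeout {
    my ($timeout, @cmd) = @_;

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec @cmd or exit 127;       # child: become the real script
    }

    local $SIG{ALRM} = sub {
        kill 'KILL', $pid;           # hard-kill the runaway child
        die "timeout\n";
    };

    alarm($timeout);
    my $ok = eval { waitpid($pid, 0); alarm(0); 1 };
    if (!$ok) {
        die $@ unless $@ eq "timeout\n";
        waitpid($pid, 0);            # reap the killed child
        return undef;                # undef means "timed out"
    }
    return $? >> 8;                  # child's exit status
}

# A command that finishes quickly returns its exit status...
my $fast = run_with_timeout(5, $^X, '-e', 'exit 0');
print "fast command: ", (defined $fast ? "status $fast" : "timed out"), "\n";

# ...while one that hangs gets killed after the limit.
my $slow = run_with_timeout(1, $^X, '-e', 'sleep 60');
print "slow command: ", (defined $slow ? "status $slow" : "timed out"), "\n";
```

The wrapper itself stays lightweight, so even if the wrapped script spins at 50% CPU, the parent's alarm still fires and the child gets reaped.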
The ASP script timeout applies to all scripting languages. If the script is running in an ASP page, the script timeout will shut down the offending page.
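For reference, within an ASP page that limit can also be tuned per page via the `Server.ScriptTimeout` property (in seconds); a minimal sketch, with 90 chosen just to match the example above:

```asp
<%
' Abort this page if it runs longer than 90 seconds
' (overrides the server-wide AspScriptTimeout for this page)
Server.ScriptTimeout = 90
%>
```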
An update on that one...
It turns out this particular script is apparently a little buggy, and Googlebot has the uncanny ability to "push its buttons" and drive it crazy. The script is an older, commercial calendaring application. Apparently, it displays links for "next month" and "previous month", and if you follow the "next month" link enough times, you fall off a cliff. The resulting page, however, still includes a "next month" link. Googlebot would continually beat the script to death and chew up the processor.
Strangely enough, adding a robots.txt with Disallow: / didn't solve the problem. Either Googlebot had already gotten hold of the script and wouldn't let go, or it was simply ignoring the robots.txt.
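For reference, the blanket block in robots.txt looks like this (the original post only mentions the Disallow rule; the User-agent line is the standard required form):

```
User-agent: *
Disallow: /
```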
Anyway, Microsoft's Process Explorer (http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx) was a big help, since it let me inspect the environment of the perl.exe process in detail, and from that I could determine that it was Googlebot causing my problems.
Once I knew that (and had determined that robots.txt wouldn't solve the problem), I was able to use IIS directly to block all traffic to this site from *.googlebot.com, which worked well in this case, since we don't care whether Google indexes the content.
Thanks very much for the other ideas that everyone posted!