I've got a script that takes a very long time to complete; when I run it, it hits the max execution time on my small web server and ends up timing out.

As an example, imagine I've got a for loop which performs some fairly intensive manipulation a million times. How could I split this loop's execution into a number of parts so that I don't hit the max execution time of my web server?

Thank you,

The best answer is to use http://php.net/manual/en/function.set-time-limit.php to alter the timeout. Otherwise, you can use a redirect to pass the current position along to a fresh URL before the timeout hits:

    $threshold = 10;   // seconds to run before redirecting
    $start = microtime(true);
    $i = isset($_GET['i']) ? (int) $_GET['i'] : 0;

    for (; $i < 10000000; $i++) {
        if (microtime(true) - $start > $threshold) {
            header('Location: http://www.example.com/?i=' . $i);
            exit;
        }

        // Your code
    }

The browser will only follow a limited number of redirects before it stops, so you're better off using JavaScript to force a page reload.
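A minimal sketch of the JavaScript approach (the `?i=` parameter name and the time budget are assumptions carried over from the snippet above): instead of a `Location` header, emit a tiny page that reloads itself at the next offset.

```php
<?php
// Sketch: build the self-reloading snippet that hands the current offset
// to the next request via JavaScript, since browsers cap redirect chains
// but will happily follow script-driven reloads.
function reloadSnippet(int $i): string
{
    return '<script>location.href = "?i=' . $i . '";</script>';
}

// Inside the long loop, once the time budget is spent:
// echo reloadSnippet($i); exit;
```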

If you have an application that's going to loop a known number of times (i.e. you're sure it will finish eventually), you can increase the time limit inside the loop:

    foreach ($data as $row) {
        set_time_limit(10);   // reset the timeout for each iteration
        // do your stuff here
    }

This solution will protect you from having one runaway iteration, but will let your whole script run undisturbed for as long as you need.

If you have the right permissions on your hosting server, you could use the PHP CLI interpreter to execute a script and have it run in the background.

See Asynchronous shell exec in PHP.
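A hedged sketch of that idea, assuming a POSIX host with the `php` binary on the PATH (the `worker.php` name is a placeholder):

```php
<?php
// Sketch: launch a long-running PHP script in the background so the web
// request can return immediately instead of waiting for it.
function buildBackgroundCommand(string $script): string
{
    // Discard output and detach with & so shell_exec() doesn't block.
    return 'php ' . escapeshellarg($script) . ' > /dev/null 2>&1 &';
}

// shell_exec(buildBackgroundCommand('worker.php'));
```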

I once used a technique where I split the job in one file into three parts. It was just an array of 120,000 elements with an intensive operation. I created a splitter script which stored the arrays in a database in chunks of 40,000 each. Then I created an HTML file with a redirect to the first PHP file to compute the first 40,000 elements. After computing the first 40,000 elements, I had another HTML forward to the next PHP file, and so on.

Not very elegant, but it worked :-)
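The splitter step above can be sketched like this (the 120,000 total and 40,000 chunk size come from the anecdote; everything else, including where the chunks end up, is assumed):

```php
<?php
// Sketch: break a large array into 40,000-element chunks that separate
// page loads can process one at a time, each forwarding to the next.
function splitIntoChunks(array $elements, int $chunkSize = 40000): array
{
    return array_chunk($elements, $chunkSize);
}

// 120,000 elements -> three chunks of 40,000. Each chunk would then be
// stored (e.g. in a database) and one PHP page per chunk would process
// it before forwarding to the page for the next chunk.
```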

Don't just increase max_execution_time. Setting a higher maximum execution time isn't a solution on its own. Profile your application first, find the root of the problem, and optimize that. Only then should you consider increasing the maximum execution time, a.k.a. max_execution_time.

If you're running a script that must execute for an unknown amount of time, you can use:
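The snippet itself appears to be missing here; presumably it removes the limit altogether, which PHP does when the limit is set to 0:

```php
<?php
// A limit of 0 removes the execution time cap entirely, which suits
// scripts whose runtime can't be known in advance.
set_time_limit(0);
```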


If at all possible, you can build the script so that it handles a portion of the wanted operations. Once it completes, say, 10%, you call the script again via AJAX to execute the next 10%. But there are situations where this isn't an ideal solution; it really depends on what you're doing.
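A minimal sketch of such an endpoint (the parameter names, slice size, and JSON response format are all assumptions): each call processes one slice and tells the client where to resume.

```php
<?php
// Sketch: process one slice per request; the client's AJAX code calls
// back with the returned "next" offset until "done" becomes true.
function processSlice(array $items, int $offset, int $sliceSize): array
{
    $slice = array_slice($items, $offset, $sliceSize);
    foreach ($slice as $item) {
        // ... intensive work on $item ...
    }
    $next = $offset + count($slice);
    return ['next' => $next, 'done' => $next >= count($items)];
}

// Typical endpoint usage:
// echo json_encode(processSlice($items, (int) $_GET['offset'], 100));
```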

I used this method to create a web crawler which ran only on my small computer, for instance. It would break too if it had to perform all the operations at once, so it was split into 200 "tasks", each called via AJAX when the previous one completes. It works perfectly, and it's been over a year since it started running (crawling?).