I've got a PHP script that grabs a chunk of data from the database, processes it, and then checks whether there's more data. This process runs indefinitely, and I run several of them at a time on a single server.

It looks something like this:






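A minimal self-contained sketch of the loop described above; `fetch_chunk()` and the in-memory `$queue` are stand-ins for the real database query:

```php
<?php
// Sketch of the worker loop: grab a chunk, process it, check for more.
// fetch_chunk() and $queue are stand-ins for the real LIMIT query.
function fetch_chunk(array &$queue, int $size): array {
    return array_splice($queue, 0, $size);
}

$queue = range(1, 10);   // pretend database rows
$processed = 0;

while (true) {
    $chunk = fetch_chunk($queue, 3);
    if (empty($chunk)) {
        break;           // the real worker would sleep() and poll again
    }
    foreach ($chunk as $row) {
        $processed++;    // real processing goes here
    }
}

echo $processed, "\n";
```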
The problem is that, after a while, something causes the process to stop running, and I haven't been able to debug it to determine the cause.

Here's what I'm using to gather information so far:

  • error_log - Logging all errors, but no errors show up in the error log.
  • register_shutdown_function - Registered a custom shutdown function. This does get called, so I know the process isn't being killed by the server; it's being allowed to finish. (Or at least I assume that's the case, given that this gets called?)
  • debug_backtrace - Logged a debug_backtrace() in my custom shutdown function. It shows only one call, and it's my custom shutdown function.
  • Log if it reaches the end of the script - Outside the loop, I have a function that logs that the script left the loop (and would therefore be reaching the end of the source file normally). When the script dies randomly, this isn't logged, so whatever kills it does so while it's in the middle of processing.
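The shutdown-function technique above can be sketched like this; pairing it with error_get_last() is my own suggestion, since fatal errors that bypass normal error handlers still show up there:

```php
<?php
// Record how the script ended: error_get_last() is non-null if a
// fatal error terminated the script, and null on a clean exit.
register_shutdown_function(function () {
    $err = error_get_last();
    error_log('shutdown; last error: ' . var_export($err, true));
});

echo "working\n";   // normal processing would happen here
```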

What other debugging techniques would you suggest for finding the cause?

Note: I should add that this isn't a problem with max_execution_time, which is disabled for these scripts. The time before it's killed is sporadic; it might run for ten seconds or twelve hours before it dies.

Update/Solution: Thanks, everyone, for the suggestions. By logging the output, I discovered that when a MySQL query failed, the script was set to die(). D'oh. Updated it to log the MySQL errors and then terminate. Got it working now without a hitch!
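The fix amounts to something like the following; the query result and error text are simulated here for illustration (with a real connection you'd pull the message from mysql_error() or mysqli_error()):

```php
<?php
// Instead of a bare die() on query failure, log the error, then stop.
// $result and $error simulate a failed query for illustration.
function handle_query($result, string $error): bool {
    if ($result === false) {
        error_log('MySQL query failed: ' . $error);
        return false;   // caller terminates cleanly after logging
    }
    return true;
}

echo handle_query(false, 'server has gone away') ? "ok\n" : "logged\n";
```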

Remember, PHP has a setting in the ini file that dictates how long a script may run: max_execution_time.

Make sure you're not going over that limit, or use set_time_limit() to increase the execution time. Is the program running through a web server or via the CLI?
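A quick way to check and lift the limit (under the CLI SAPI, max_execution_time already defaults to 0, i.e. unlimited):

```php
<?php
set_time_limit(0);                         // 0 = no execution time limit
echo ini_get('max_execution_time'), "\n";  // reads back as "0"
```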

Adding: my bad experiences with PHP, from looking through some background scripts I wrote a while back. Sorry, but PHP is a terrible scripting language for doing anything over long stretches of time. I see that newer PHP (which we haven't upgraded to) adds the ability to force the GC to run. The problem I've been having comes from using too much memory, because the GC rarely runs to clean up after itself. If you use structures that reference themselves recursively, they are also never freed.

Creating an array of 100,000 items consumes memory, but setting the array to an empty array, or splicing everything out, doesn't free it immediately and doesn't mark it as unused (i.e., creating a new 100,000-element array increases memory usage further).
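On 5.3+ the cycle collector can be forced by hand; a sketch (on my 5.1.6 these functions don't exist, so this is strictly for newer versions):

```php
<?php
// PHP 5.3+: self-referencing structures can be reclaimed explicitly.
$a = new stdClass();
$a->self = $a;                     // reference cycle
unset($a);                         // refcount stays above zero
$collected = gc_collect_cycles();  // run the cycle collector now
echo $collected > 0 ? "collected\n" : "nothing\n";
```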

My own solution was to write a perl script that ran forever and called system("php my_php.php") as needed, so that the interpreter would be freed completely. I'm currently stuck supporting 5.1.6; this may be fixed in 5.3+, or at the very least, they now have GC functions you can use to force the GC to clean up.

Simple script:

#!/usr/bin/perl -w

use strict;

# Relaunch the PHP worker forever; each run exits when its block is done,
# so the interpreter's memory is freed completely between runs.
while (1) {
    system("php my_php.php");
}
then in your PHP script:

// do a single block of processing

if ($moreblockstodo) {
    // process the next block
} else {
    // let's sleep for a bit until we get more
    sleep(60);
}

I'd log memory usage from within your script. Maybe it acquires too much memory, hits the memory limit, and dies?
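A sketch of that kind of checkpoint logging (the log path is arbitrary):

```php
<?php
// Append current and peak memory usage to a log at each checkpoint.
$line = sprintf("[%s] mem=%d peak=%d\n",
    date('c'), memory_get_usage(true), memory_get_peak_usage(true));
file_put_contents(sys_get_temp_dir() . '/worker-mem.log', $line, FILE_APPEND);
echo "logged\n";
```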

I'd log the state of the function to a file in a few different places in each loop iteration.

You can get the contents of most variables as a string with var_export, using the var_export($varname, true) form.

You could just log this to a particular file and keep an eye on it. The last recorded state of the function before the log ends usually provides some clues.
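For example, dumping a row's state on each iteration (the file name and $row are illustrative):

```php
<?php
// var_export with a second argument of true returns the dump as a string
// instead of printing it, so it can be written to a log file.
$row = ['id' => 42, 'status' => 'pending'];
$state = var_export($row, true);
file_put_contents(sys_get_temp_dir() . '/state.log', $state . "\n", FILE_APPEND);
echo $state, "\n";
```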