I have an index.php script which I use as a post-commit URL on my Google Code site. This script clones a directory and starts a task that can take some time. I want to avoid having this script run multiple times in parallel.

Is there a mechanism I can use to prevent that script from executing if another instance is already running?

You can use flock with LOCK_EX to acquire an exclusive lock on the file. The lock is released automatically when the script exits, so a crashed run will not leave a stale lock behind.

E.g.:

<?php
// Open (or create) the lock file; mode 'c' does not fail when the file is missing.
$fp = fopen('/tmp/php-commit.lock', 'c');
if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
    // Another instance already holds the lock, so bail out immediately.
    exit;
}

// ... do stuff

flock($fp, LOCK_UN);
fclose($fp);
?>

Alternatively, you can save the state of the running script and check, when the script starts, whether another instance is currently active.

For instance, to record whether your script is running, you could do something like this:

// Read the current state; suppress the warning if the file does not exist yet.
$state = @file_get_contents('state.txt');

if (!$state) {
   // Mark the script as running, together with its start time.
   file_put_contents('state.txt', 'RUNNING, started at '.time());

   // Do your stuff here...

   // When your stuff is finished, empty the file again.
   file_put_contents('state.txt', '');
}

How long does it take to run?
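Since the snippet above already stores the start time in state.txt, one possible refinement (a sketch only; the one-hour threshold and the exact file format are assumptions) is to treat a state file older than the longest run you expect as stale, so a crashed run does not block future runs forever:

<?php
$maxRuntime = 3600; // assumed upper bound on one run, in seconds

$state = @file_get_contents('state.txt');

// Pull the timestamp written by the snippet above ("RUNNING, started at <time>").
if ($state && preg_match('/started at (\d+)/', $state, $matches)
    && (time() - (int)$matches[1]) < $maxRuntime) {
    // A recent run is still marked as active, so do not start another one.
    exit;
}

file_put_contents('state.txt', 'RUNNING, started at '.time());

// Do your stuff here...

file_put_contents('state.txt', '');
?>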

You can use memcache:

<?php
$m = new Memcache();
$m->connect('localhost', 11211); // adjust host/port for your memcached server

// If another instance has already set the flag, stop here.
if ($m->get('job_running')) {
    exit;
}

$m->set('job_running', true);

// index code here

// at the end of the script
$m->delete('job_running');
?>

If the task fails, you will have to clear the flag from memcache yourself. Flock is a good option too... probably better, actually.
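One way to keep a crashed job from blocking everything permanently (a sketch only; the 'job_running' key comes from the answer above, while the 600-second expiry and the localhost server are assumptions) is to set the flag atomically with Memcache::add and give it an expiry time, so a stale flag disappears on its own:

<?php
$m = new Memcache();
$m->connect('localhost', 11211); // assumed memcached server

// add() only succeeds if the key does not exist yet, so check-and-set is atomic.
// The 600-second expiry makes a flag left behind by a crashed run disappear on its own.
if (!$m->add('job_running', true, 0, 600)) {
    exit;
}

// index code here

// at the end of the script
$m->delete('job_running');
?>

The expiry should be longer than the longest run you expect, otherwise a second instance could start while the first one is still working.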