I'm using PHP to access files and photos on remote servers, mostly with the file_get_contents() and copy() functions.

Sometimes fetching a small text file or photo is nearly instant, but in other cases the request gets "stuck" for a minute or so on the very same file. Occasionally it causes my script to hang entirely, and even after I stop the script, Apache remains locked up for several minutes.

I am quite prepared to accept the fact that internet connections can be flaky. My concern is that I recover gracefully and don't crash Apache; the PHP set_time_limit() function only raises a fatal error. Additionally, there's a note in the PHP manual that time spent on stream operations doesn't count toward the running time of the script.

How can I recover from such connection problems and allow my script to continue? And why would this be causing Apache to hang?

Thanks, John

$options = array( 'http' => array(
    'user_agent'    => 'Firefox wannabe',
    'max_redirects' => 1,
    'timeout'       => 10,
) );
$context = stream_context_create( $options );
$content = @file_get_contents( $url, false, $context );

Have a look at http://php.net/manual/en/function.stream-context-create.php and http://www.php.net/manual/en/context.http.php . The above code sets a timeout on the connection and permits one redirect.

This should keep the script from blocking past the timeout.
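Note that file_get_contents() returns false on failure rather than throwing, so it pays to check the result and retry a couple of times before giving up. A minimal sketch, assuming an arbitrary retry count and delay (these names and defaults are illustrative, not from the answer above):

```php
<?php
// Fetch a URL with a per-connection timeout, retrying on failure.
// $retries and $delay are illustrative defaults, not prescriptive.
function fetch_with_retry($url, $retries = 3, $delay = 2) {
    $context = stream_context_create(array('http' => array(
        'timeout'       => 10,  // seconds before the read gives up
        'max_redirects' => 1,
    )));
    for ($i = 0; $i < $retries; $i++) {
        $content = @file_get_contents($url, false, $context);
        if ($content !== false) {
            return $content;  // success
        }
        sleep($delay);  // back off before the next attempt
    }
    return false;  // caller must handle the failure
}
```

With this, a failed fetch comes back as false after at most a few bounded attempts, instead of hanging the request for minutes.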

The long delays might be caused by the network, by a firewall on the remote server denying you when you grab too many files at once, or by a flaky DNS server or router on the path to the remote host. As a suggestion, you should cache the downloaded files locally, so that subsequent requests for a file are served locally rather than over the wide internet.
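The caching suggestion above can be sketched as a thin wrapper: serve the file from a local directory when a fresh copy exists, and only hit the network otherwise. The function name, cache path, and TTL here are hypothetical assumptions, not part of the original post:

```php
<?php
// Return the contents of $url, caching it locally for $ttl seconds.
// The cache directory and TTL are illustrative assumptions.
function cached_fetch($url, $cacheDir = '/tmp/url-cache', $ttl = 3600) {
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $cacheFile = $cacheDir . '/' . md5($url);
    // Serve from cache if the local copy is newer than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    $context = stream_context_create(array('http' => array('timeout' => 10)));
    $content = @file_get_contents($url, false, $context);
    if ($content !== false) {
        file_put_contents($cacheFile, $content);  // refresh the cache
        return $content;
    }
    // Network failed: fall back to a stale cached copy if one exists.
    return is_file($cacheFile) ? file_get_contents($cacheFile) : false;
}
```

A nice side effect of the stale-copy fallback is that a flaky remote host degrades to slightly outdated content rather than a hung Apache worker.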