Not sure if this is server-related or just normal behavior, but I figured I'd ask. I run a fairly high-traffic website (200k+ uniques/week), and we serve a lot of links via a PHP file that contains redirects.

The only problem is, I have to update this file frequently, which we do via SFTP. The file is about 800k in size and takes roughly a second to transfer. However, users have told me that sometimes the links redirected by the PHP file don't work, and I've noticed this only happens while I am uploading or updating the file via SFTP.

So my real question is: Is there any way I can keep the file working for users while I upload new copies of it via SFTP? This downtime is annoying for my users. Is this some setting in Linux that I'm unaware of, or is there nothing I can do about it? In case it matters, my server is running CentOS.

Upload it as a different filename, and then rename it to clobber the old one. The upload takes significant time (anything beyond 'instantaneous' matters). A rename (or mv) is atomic and won't make your users wait. You can take as much time as you need uploading the file, and then rename it in an instant.
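A rough sketch of that final swap step, assuming the new copy has already been uploaded next to the live file as redirects.php.new (the path and filenames here are just placeholders). It's the PHP equivalent of running mv redirects.php.new redirects.php on the server:

    <?php
    // Swap the freshly uploaded copy into place. rename() is atomic when both
    // paths are on the same filesystem, so visitors always get either the old
    // file or the complete new one, never a half-written copy.
    $live = '/var/www/html/redirects.php';  // assumed location of the live file
    $new  = $live . '.new';                 // name the upload was saved under

    if (!rename($new, $live)) {
        // The old file is untouched if the rename fails.
        echo "Swap failed; the old redirects file is still in place\n";
    }

The same caveat applies to mv: both paths have to be on the same filesystem, otherwise the move falls back to a copy-and-delete and the partial-file window comes back.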

The problem is that the file transfer isn't atomic. It overwrites the file as it is being uploaded, which leaves the file incomplete if it is accessed in the meantime.

If you upload the file to another location and then move it into its correct location after the upload is done, you'll replace the file atomically.

Pmivdb really has it. Upload it as a temp file, then rename it (make sure to keep a copy of the old one). That's really your best (only?) option.
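A variation of the same sketch that keeps a dated backup of the old file before swapping, as suggested here (the paths and the .bak naming scheme are just assumptions):

    <?php
    // Preserve the current file, then atomically replace it with the new upload.
    $live   = '/var/www/html/redirects.php';
    $new    = $live . '.new';
    $backup = $live . '.' . date('Ymd-His') . '.bak';

    copy($live, $backup);   // keep the previous version around for rollback
    rename($new, $live);    // the only step visitors can notice, and it is atomic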

If you put the files in a separate folder and increment the file names, is there some PHP script that would use the latest file?
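That can work, but only if each numbered file is complete before the script can see it. A rough sketch of the idea, assuming the uploads land in their own directory as redirects-0001.php, redirects-0002.php, and so on (all the names here are made up):

    <?php
    // Load the highest-numbered redirect file from the upload directory.
    $files = glob('/var/www/redirect-data/redirects-*.php');

    if ($files) {
        natsort($files);      // natural ordering, so redirects-0010 comes after redirects-0009
        include end($files);  // use the most recent complete upload
    }

You would still want to upload each new file under a temporary name and rename it to its numbered name once the transfer finishes; otherwise the script can pick up a half-uploaded file and you are back to the original problem.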