After developing an application on my local machine with little thought about how it would perform on my shared host, I've run into a dreadful problem. I'm serving files (.pdf & .zip) through Rails' send_file so that I can log download statistics. The only issue is that when two (or more) files are downloaded concurrently, a new ruby dispatch.fcgi process has to be started to handle each one. I realize this could be avoided by using mod_xsendfile, but unfortunately my host doesn't support that Apache module.

So here's the strange part: the processes are created as expected, but for some reason they never exit. As a test, I downloaded about 10 files concurrently from a couple of different computers. About 10 processes were created, but none of them ever exited, even minutes after their invocation and long after the downloads had completed.
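For reference, the download action looks roughly like the sketch below (the controller and model names are placeholders, not the exact code):

    # Streams the file through the Rails process itself via send_file,
    # which is what ties up a dispatch.fcgi worker for the whole download.
    class DownloadsController < ApplicationController
      def show
        document = Document.find(params[:id])   # placeholder model
        document.increment!(:download_count)    # log the statistic

        send_file document.path,
                  :filename    => document.filename,
                  :type        => document.content_type,
                  :disposition => 'attachment'
      end
    end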

Why aren't these processes exiting? What can I do to avoid this problem, other than switching to a proper host that supports mod_xsendfile?

If you don't need access control on the files you're serving, you could always try placing them somewhere under /public, or at some other URL outside the Rails application.

When a user goes to download a file, you'd send them to a controller action that updates the download statistics and then redirects their browser to the path where the file is actually stored, using a meta refresh tag or a bit of javascript. That way Apache handles the file transfer without Rails... basically what xsendfile would do. A minimal sketch of the idea follows.
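Here's a rough sketch, assuming the files are copied or symlinked under public/files/ and using placeholder model and action names:

    # Hypothetical controller action: record the download, then render a
    # tiny page whose meta refresh points at the static copy of the file,
    # so Apache serves the bytes instead of the Rails process.
    class DownloadsController < ApplicationController
      def download
        @document = Document.find(params[:id])   # placeholder model
        @document.increment!(:download_count)    # the statistics logging
        # renders app/views/downloads/download.html.erb (below)
      end
    end

    <%# app/views/downloads/download.html.erb %>
    <%# Assumes the file was copied or symlinked under public/files/ %>
    <html>
      <head>
        <meta http-equiv="refresh"
              content="0; url=/files/<%= @document.filename %>" />
      </head>
      <body>
        Your download should start shortly.
        <a href="/files/<%= @document.filename %>">Click here</a> if it doesn't.
      </body>
    </html>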

That said, switching to a different host is probably worth considering if this is anything more than a toy project... FastCGI is a pretty old way to serve a Rails application at this point.