I am trying to serve large zip files to customers. With just 2 concurrent connections, the server runs out of memory (RAM). I increased the memory from 300MB to 4GB (Dreamhost VPS) and then it worked fine.

I need to allow more than 2 concurrent connections. Even the 4GB only allows about 30 concurrent connections (bad).

Well, the code I am currently using needs double the memory of the actual file size. That's bad. I would like something like "streaming" the file to the user, so that I only allocate the chunk currently being sent to the customer.

The following is the code I am using in CodeIgniter (PHP framework):

ini_set('memory_limit', '300M'); // this was the maximum amount of memory on my server
set_time_limit(0); // so the server does not terminate the connection on slow downloads
force_download("download.zip", file_get_contents("../downloads/big_file_80M.zip"));exit;

The force_download function is the following (CodeIgniter's default helper function):

function force_download($filename = '', $data = '')
{
    if ($filename == '' OR $data == '')
    {
        return FALSE;
    }

    // Try to determine if the filename includes a file extension.
    // We need it in order to set the MIME type
    if (FALSE === strpos($filename, '.'))
    {
        return FALSE;
    }

    // Grab the file extension
    $x = explode('.', $filename);
    $extension = end($x);

    // Load the mime types
    @include(APPPATH.'config/mimes'.EXT);

    // Set a default mime if we can't find it
    if ( ! isset($mimes[$extension]))
    {
        $mime = 'application/octet-stream';
    }
    else
    {
        $mime = (is_array($mimes[$extension])) ? $mimes[$extension][0] : $mimes[$extension];
    }

    // Generate the server headers
    if (strpos($_SERVER['HTTP_USER_AGENT'], "MSIE") !== FALSE)
    {
        header('Content-Type: "'.$mime.'"');
        header('Content-Disposition: attachment; filename="'.$filename.'"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header("Content-Transfer-Encoding: binary");
        header('Pragma: public');
        header("Content-Length: ".strlen($data));
    }
    else
    {
        header('Content-Type: "'.$mime.'"');
        header('Content-Disposition: attachment; filename="'.$filename.'"');
        header("Content-Transfer-Encoding: binary");
        header('Expires: 0');
        header('Pragma: no-cache');
        header("Content-Length: ".strlen($data));
    }

    exit($data);
}

I tried some chunk-based code that I found on Google, but the file always arrived corrupted. Probably because of bad code.

Could anybody help me?

There are some ideas in this thread. I don't know whether the readfile() method will save memory, but it sounds promising.
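
For what it's worth, a minimal readfile() sketch (it reuses the path from the question; the headers and the output-buffer handling are my assumptions). readfile() copies the file to the output in small internal chunks instead of building one huge string:

$filepath = '../downloads/big_file_80M.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="download.zip"');
header('Content-Length: '.filesize($filepath));

// Make sure no output buffer accumulates the whole file in memory.
while (ob_get_level() > 0)
{
    ob_end_clean();
}

readfile($filepath); // streams the file to the client in small internal chunks
exit;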

You can't use $data with the whole file contents in it. Try passing this function the path to the file instead of its contents. Then send all the headers once, and after that read part of the file using fread(), echo that chunk, call flush(), and repeat. If any other header is sent in the meantime, the transfer will end up corrupted.
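
A rough sketch of that idea as a standalone helper (the name stream_download(), the 8KB chunk size and the headers are assumptions, not CodeIgniter code):

function stream_download($filepath, $filename)
{
    if ($filepath == '' OR ! is_file($filepath))
    {
        return FALSE;
    }

    // Send all the headers once, before any file data.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="'.$filename.'"');
    header('Content-Length: '.filesize($filepath));

    // Disable output buffering so each chunk goes straight to the client.
    while (ob_get_level() > 0)
    {
        ob_end_clean();
    }

    $fp = fopen($filepath, 'rb');
    while ( ! feof($fp))
    {
        echo fread($fp, 8192); // read and send 8KB at a time
        flush();
    }
    fclose($fp);
    exit;
}

// Usage, with the file from the question:
stream_download('../downloads/big_file_80M.zip', 'download.zip');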

Symlink the large file into your document root (assuming it is not a file that only authorized users may download), then let Apache handle it. (That way you get byte ranges too.)
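
For example, a one-off sketch along these lines (both paths are assumptions, and the directory needs Options +FollowSymLinks in Apache):

// Expose the file under the document root via a symlink; Apache then serves it
// directly, including Range/resume support, with no PHP memory cost.
$target = '/home/user/downloads/big_file_80M.zip';         // real file, outside the docroot
$link   = '/home/user/public_html/files/big_file_80M.zip'; // public path, inside the docroot

if ( ! is_link($link))
{
    symlink($target, $link);
}

// The customer then downloads http://example.com/files/big_file_80M.zip straight from Apache.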

You are sending the contents ($data) of the file via PHP?

If so, each Apache process handling this will end up growing to the size of this file, as that data will be cached.

Your ONLY option is not to send the file contents/data via PHP, and instead redirect the user to a download URL on the filesystem.

Use a generated and unique symlink, or a hidden location.
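
A rough sketch of the generated/unique symlink idea (the directory names, the URL, the token scheme and the cleanup step are all assumptions):

// Create a hard-to-guess symlink for this download and redirect the user to it.
$target  = '/home/user/downloads/big_file_80M.zip';
$token   = md5(uniqid(mt_rand(), TRUE));   // unguessable per-download token
$linkdir = '/home/user/public_html/dl/';   // publicly reachable directory

symlink($target, $linkdir.$token.'.zip');

// Apache serves the download; remove the symlink later (e.g. from a cron job)
// so the URL expires.
header('Location: http://example.com/dl/'.$token.'.zip');
exit;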

Make sure to add your ini_set() before session_start().