I need to pre-compress some large HTML/XML/JSON files (large data dumps) using either gzip or deflate. I never want to serve the files uncompressed. They're so large and repetitive that compression will almost certainly work very well, and although some older browsers can't handle the decompression, my typical clients won't be using them (though it would be nice if I could generate some kind of "hey, you need to change your browser" message).

I generate the files automatically, and I can easily generate .htaccess files to accompany each file type. Basically what I want is an always-on version of mod_gunzip. Because the files are large, and because I'll be serving them frequently, I want a method that lets me compress once, very well, on the command line. For the gzip side, the one-shot command I have in mind is something like the sketch below.
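A sketch assuming GNU gzip, with -9 for maximum compression and -c to write to stdout so the original file is kept:

gzip -9 -c BIGfile.html > BIGfile.html.gz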

I've found some good information on this site and elsewhere about how to do this with gzip, but I wondered if someone could step me through how to do it with deflate. Bonus points for a complete answer that includes what my .htaccess file should look like, along with the command-line code I should use (GNU/Linux) to get optimal compression. Super bonus points for an answer that also addresses how to send a "sorry, no file for you" message to non-compliant browsers.

It would be lovely if we could create a "precompression" tag to cover questions like this.

-FT

Edit: Found AddEncoding in mod_mime

This works:

<IfModule mod_mime.c>
 <Files "*.html.gz">
  ForceType text/html
 </Files>
 <Files "*.xml.gz">
  ForceType application/xml
 </Files>
 <Files "*.js.gz">
  ForceType application/javascript
 </Files>
 <Files "*.gz">
  AddEncoding gzip .gz
 </Files>
</IfModule>

The documentation makes it sound like only the AddEncoding should be needed, but I couldn't get that to work on its own.
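On the "sorry, no file for you" front, here is an untested sketch: mod_rewrite can check the Accept-Encoding request header and refuse clients that don't advertise gzip. The 406 status and the message wording are my own choices, not anything from the Apache docs:

<IfModule mod_rewrite.c>
 RewriteEngine On
 # Clients that don't claim gzip support get a 406 instead of raw gzip bytes
 RewriteCond %{HTTP:Accept-Encoding} !gzip
 RewriteRule \.gz$ - [R=406,L]
</IfModule>
ErrorDocument 406 "Sorry, your browser does not support gzip; please upgrade."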

Also, Lighttpd's mod_compress can compress and cache (the compressed) files.
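If you go that route, the lighttpd configuration is roughly this (a sketch from memory, with a hypothetical cache path; check the mod_compress docs for your version):

server.modules += ( "mod_compress" )
compress.cache-dir = "/var/cache/lighttpd/compress/"
compress.filetype  = ( "text/html", "application/xml", "application/javascript" )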

For the command line, compile zlib's zpipe: http://www.zlib.net/zpipe.c and then

zpipe < BIGfile.html > BIGfile.htmlz

for instance.

Then, using Zash's example, set up a filter to change the header. This should give you raw deflate files, which modern browsers probably support.
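In .htaccess terms, that header change might look like this (my own untested sketch, reusing the .htmlz extension from above):

<Files "*.htmlz">
 ForceType text/html
 AddEncoding deflate .htmlz
</Files>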

For a different way to compress files, take a look at pigz with its zlib (-z) or PKWare zip (-K) output options. Test whether these work when served with Content-Encoding set.
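Concretely, something like this (a sketch assuming a reasonably recent pigz; -k keeps the original file):

pigz -9 -k -z BIGfile.html   # zlib (RFC 1950) stream -> BIGfile.html.zz
pigz -9 -k -K BIGfile.html   # PKWare zip container -> BIGfile.html.zip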

Personally, I'd look at built-in filesystem compression rather than doing this at the Apache layer.

On Solaris, ZFS has transparent compression; use zfs set compression=on to compress the filesystem. Similarly, Windows can compress folders, and Apache serves the content oblivious to the fact that it's compressed on disk. Linux has filesystems with transparent compression as well.
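For the ZFS case, the commands are along these lines (tank/www is a hypothetical dataset name; note that only data written after enabling compression gets compressed):

zfs set compression=on tank/www   # enable transparent compression
zfs get compression tank/www      # verify the setting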

A quick way to compress content without dealing directly with mod_gzip/mod_deflate is to use PHP's ob_gzhandler and modify the headers (before any output is sent to the browser).

<?php

/* Replace CHANGE_ME with the correct MIME type of your large file,
   e.g. application/json */

ob_start('ob_gzhandler');

header('Content-Type: CHANGE_ME; charset=UTF-8');
header('Cache-Control: must-revalidate');

$offset = 60 * 60 * 2; // cache expiry: two hours from now
$ExpStr = 'Expires: ' . gmdate('D, d M Y H:i:s', time() + $offset) . ' GMT';
header($ExpStr);

/* Stuff to output your large files here */
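For that output step, a minimal example would just stream the file (hypothetical path; ob_gzhandler compresses it on the way out, provided the client sent a suitable Accept-Encoding):

readfile('/path/to/BIGfile.json');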