I'm serving all content through Apache with Content-Encoding: gzip, but that compresses on the fly. A good amount of my content is static files on disk. I want to gzip the files in advance instead of compressing them every time they are requested.
This is something that, I believe, mod_gzip did automatically in Apache 1.x, just by having the file with .gz next to it. That's no longer the case with mod_deflate.
To answer my own question with the rather simple line I was missing in my configuration:
Options FollowSymLinks MultiViews
I was missing the MultiViews option. It's there in the Ubuntu default web server configuration, so don't be like me and drop it.
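For illustration only, here is roughly what MultiViews plus AddEncoding buys you: when the client advertises gzip support and a pre-compressed sibling exists, Apache serves that variant with Content-Encoding: gzip. A minimal Ruby sketch of the idea (the pick_variant helper is hypothetical, not Apache code, and Apache's real negotiation is far richer):

```ruby
# Hypothetical sketch of the MultiViews-style selection: prefer foo.css.gz
# over foo.css when the client accepts gzip and the .gz file exists on disk.
# Returns [path_to_serve, content_encoding_or_nil].
def pick_variant(path, accept_encoding)
  gz = "#{path}.gz"
  if accept_encoding.to_s.include?("gzip") && File.exist?(gz)
    [gz, "gzip"]   # serve the pre-compressed variant
  else
    [path, nil]    # fall back to the plain file
  end
end
```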
I also wrote a quick Rake task to compress all of the files.
    namespace :static do
      desc "Gzip compress the static content so Apache does not have to do it on-the-fly."
      task :compress do
        puts "Gzipping js, html and css files."
        Dir.glob("#{RAILS_ROOT}/public/**/*.{js,html,css}") do |file|
          system "gzip -c -9 #{file} > #{file}.gz"
        end
      end
    end
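If you would rather not shell out to the gzip binary (say, on a host without it), Ruby's stdlib Zlib can do the same job; a sketch, where gzip_static is a hypothetical helper and the glob pattern is assumed to match the Rake task above:

```ruby
require 'zlib'

# Pure-Ruby equivalent of `gzip -c -9 file > file.gz`: compress every file
# matching the glob at maximum compression, leaving the originals in place.
def gzip_static(glob_pattern)
  Dir.glob(glob_pattern).each do |file|
    Zlib::GzipWriter.open("#{file}.gz", Zlib::BEST_COMPRESSION) do |gz|
      gz.write(File.binread(file))
    end
  end
end
```

Called as e.g. gzip_static("public/**/*.{js,html,css}") from inside the task body.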
mod_gzip compresses content on the fly too. You can pre-compress the files by actually logging into your server and doing it from the shell.
    cd /var/www/.../data/
    for file in *
    do
      gzip -c "$file" > "$file.gz"
    done
I have an Apache 2 built from source, and I found I had to modify the following in my httpd.conf file:
Add MultiViews to Options:
Options Indexes FollowSymLinks MultiViews
Uncomment the AddEncoding lines:

    AddEncoding x-compress .Z
    AddEncoding x-gzip .gz .tgz
Comment out the AddType lines:

    #AddType application/x-compress .Z
    #AddType application/x-gzip .gz .tgz
You can use mod_cache to proxy local content in memory or on disk. I don't know if this works as expected with mod_deflate.
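For the mod_cache route, a minimal disk-cache sketch, assuming the mod_cache and disk-cache modules are loaded (the disk backend is mod_disk_cache on Apache 2.2 and mod_cache_disk on 2.4; the cache path below is a placeholder):

```apache
# Cache responses on disk; CacheRoot must be writable by the httpd user.
CacheEnable disk /
CacheRoot   "/var/cache/apache/cache-root"
```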