I wrote a JSON API in Node.js for a small project, running behind an Apache webserver. Now I'd like to improve performance by adding caching and compression. Essentially, the question is: what should be done in Node.js itself, and what is better handled by Apache?
a) The API calls have unique URLs (e.g. /api/user-id/content) and I want to cache them for at least 60 seconds.
b) I would like the output to be served gzipped (if the client supports it). Node.js's HTTP module usually delivers content as "chunked". Since I'm only writing the response in a single place, would it be enough to set the Content-Encoding header and serve the response as one piece, so that it can be compressed and cached?
a) I suggest caching, but without a timer; just let the replacement strategy evict entries. I'm not sure what you're actually serving; caching either the rendered JSON or its source data might be helpful. Here is a simple cache I wrote, together with a small unit test, to give you some inspiration.
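The answer's original snippet isn't included here; as an illustrative stand-in, a cache with a replacement strategy instead of a timer might look like this (names like `SimpleCache` are my own, not from the answer):

```javascript
// Illustrative stand-in for the cache described above: a fixed-capacity
// in-memory cache that evicts the least recently used entry instead of
// expiring entries on a timer.
class SimpleCache {
  constructor(capacity = 100) {
    this.capacity = capacity;
    this.map = new Map(); // Map preserves insertion order
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark this entry as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry (first key in insertion order).
      this.map.delete(this.map.keys().next().value);
    }
  }
}

// Small unit test in the spirit of the answer.
const cache = new SimpleCache(2);
cache.set('/api/1/content', '{"id":1}');
cache.set('/api/2/content', '{"id":2}');
cache.get('/api/1/content');             // touch entry 1
cache.set('/api/3/content', '{"id":3}'); // capacity exceeded: evicts entry 2
console.log(cache.get('/api/2/content')); // undefined
console.log(cache.get('/api/1/content')); // {"id":1}
```

Keying the cache on the request URL fits the question, since each API call has a unique URL.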
b) How large is the JSON data? You'd have to compress it yourself, and take care not to do it in a blocking way. You could stream-compress it and deliver it already compressed. I've never done that with node.
> I wrote a JSON-API in NodeJS for a small project, running behind an Apache webserver.
I'd just run the API on a different port and not behind Apache (as a proxy?). If you want to proxy, I'd advise you to use NGINX. See Ryan Dahl's slides comparing Apache and NGINX (slides 8+). NGINX can also do compression/caching (fast). Maybe you shouldn't compress all of your JSON (size? a couple of KB?). I recommend you read the "Minimize payload size" section of Google's Page Speed documentation (a good read!) explaining this, which I also quote below:
> Note that gzipping is only beneficial for larger resources. Due to the overhead and latency of compression and decompression, you should only gzip files above a certain size threshold; we recommend a minimum range between 150 and 1000 bytes. Gzipping files below 150 bytes can actually make them larger.
> Now I'd like to improve performance by adding caching and compression
You could do the compression/caching via NGINX (+ memcached), which is going to be very fast. Even more preferable would be a CDN (for static files), which is optimized for this purpose. I don't think you should be doing any gzipping in node.js, although some modules are available through NPM's search (search for gzip), for instance https://github.com/saikat/node-gzip
For caching I'd advise you to have a look at redis, which is extremely fast. It's even said to be faster than most client libraries, because node.js's fast client library (node_redis) uses hiredis (the C library). For that you also need to install hiredis via npm:
npm install hiredis redis
Some benchmarks with hiredis:
PING: 20000 ops 46189.38 ops/sec 1/4/1.082
SET: 20000 ops 41237.11 ops/sec 0/6/1.210
GET: 20000 ops 39682.54 ops/sec 1/7/1.257
INCR: 20000 ops 40080.16 ops/sec 0/8/1.242
LPUSH: 20000 ops 41152.26 ops/sec 0/3/1.212
LRANGE (10 elements): 20000 ops 36563.07 ops/sec 1/8/1.363
LRANGE (100 elements): 20000 ops 21834.06 ops/sec 0/9/2.287

> The API calls have unique URLs (e.g. /api/user-id/content) and I want to cache them for at least 60 seconds.
You can accomplish this caching easily thanks to redis's SETEX command. This is going to be extremely fast.
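SETEX stores a value under a key with a time-to-live in seconds. Its semantics can be sketched without a running redis server using a plain Map as a stand-in (class and variable names here are my own illustration, not redis API):

```javascript
// Illustration of SETEX semantics (SET with an expiry in seconds) using a
// plain Map as a stand-in for redis; no server required.
class TtlCache {
  constructor() {
    this.store = new Map();
  }

  // Like redis SETEX: store value under key, valid for ttlSeconds.
  setex(key, ttlSeconds, value) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key); // lazy expiry on access
      return null;
    }
    return entry.value;
  }
}

const ttlCache = new TtlCache();
ttlCache.setex('/api/42/content', 60, '{"id":42}');
console.log(ttlCache.get('/api/42/content')); // {"id":42} while still fresh
```

With node_redis the equivalent would be roughly `client.setex('/api/42/content', 60, json)`, using the request URL as the key and 60 seconds as the TTL; check the exact call signature against the client version you use.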