I am using Django with Apache, mod_wsgi and PostgreSQL (all on the same host), and I need to handle a lot of simple dynamic page requests (hundreds per second). I ran into the problem that the bottleneck is that Django does not have a persistent database connection and reconnects on each request (which takes about 5ms). While running a benchmark I found that with a persistent connection I can handle about 500 r/s, while without one I get only 50 r/s.

Does anybody have advice? How can I modify Django to use a persistent connection, or otherwise speed up the connection from Python to the database?

Thanks in advance.

In Django trunk, edit django/db/__init__.py and comment out the line:

signals.request_finished.connect(close_connection)

This signal handler causes Django to disconnect from the database after every request. I'm not sure what all the side effects of this will be, but it makes no sense at all to start a new connection after every request; it destroys performance, as you've observed.
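
For context, the surrounding code in django/db/__init__.py looked roughly like this at the time (paraphrased from memory, not a verbatim copy of trunk):

    # django/db/__init__.py (roughly, from Django trunk of that era)
    def close_connection(**kwargs):
        connection.close()

    # Comment this out so the connection survives across requests:
    # signals.request_finished.connect(close_connection)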

I'm using this now, but I haven't done a full set of tests to see if anything breaks.

I don't know why everyone thinks this requires a new backend or a special connection pooler or other complex solutions. It seems very simple, though I don't doubt there are some obscure gotchas that made them do this in the first place--those should be dealt with properly. 5ms of overhead for each request is quite a lot for a high-performance service, as you've observed. (It takes me 150ms--I haven't figured out why yet.)

Edit: another necessary change is in django/middleware/transaction.py: remove the two transaction.is_dirty() tests and always call commit() or rollback(). Otherwise, it won't commit a transaction if it only read from the database, which will leave locks open that should be released.
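
A sketch of what that change looks like, based on the TransactionMiddleware shipped in trunk at the time (your checkout may differ slightly):

    # django/middleware/transaction.py -- the is_dirty() guards are removed
    # so a COMMIT or ROLLBACK is always issued, even for read-only requests,
    # which releases any locks those reads acquired.
    from django.db import transaction

    class TransactionMiddleware(object):
        def process_request(self, request):
            transaction.enter_transaction_management()
            transaction.managed(True)

        def process_exception(self, request, exception):
            transaction.rollback()  # was guarded by transaction.is_dirty()
            transaction.leave_transaction_management()

        def process_response(self, request, response):
            if transaction.is_managed():
                transaction.commit()  # was guarded by transaction.is_dirty()
                transaction.leave_transaction_management()
            return response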

Disclaimer: I haven't attempted this.

I believe you need to implement a custom database backend. There are a few examples on the web that show how to implement a database backend with connection pooling.

Using a connection pool would probably be a good solution for your situation, since the network connections are kept open when connections are returned to the pool.

  • This post does it by patching Django (one of the comments points out that it's better to implement a custom backend outside of the core Django code)
  • This post is an implementation of a custom db backend

Both posts use MySQL - perhaps you can use similar techniques with PostgreSQL.
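
To illustrate the idea for PostgreSQL: psycopg2 ships its own pool module, so the core of such a backend could look something like this (the wrapper functions and connection parameters here are hypothetical, not taken from either post):

    # Minimal pooling sketch using psycopg2's bundled pool module. A real
    # custom backend would hand these connections to Django's DatabaseWrapper
    # instead of opening a fresh one per request.
    import psycopg2.pool

    _pool = psycopg2.pool.ThreadedConnectionPool(
        minconn=1, maxconn=20,
        dbname='mydb', user='myuser', host='localhost',  # placeholder settings
    )

    def get_connection():
        # Reuse an already-open connection, avoiding the ~5ms connect cost.
        return _pool.getconn()

    def release_connection(conn):
        # Return the connection to the pool; it stays open for the next caller.
        _pool.putconn(conn)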

Edit:

  • The Django Book mentions PostgreSQL connection pooling, using pgpool (tutorial); see the settings sketch after this list.
  • Someone posted a patch for the psycopg2 backend that implements connection pooling. I recommend creating a copy of the existing backend in your project and patching that copy.
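
Since pgpool runs as a separate daemon between Django and PostgreSQL, using it is mostly a configuration change: point Django at pgpool's port instead of the database's. A sketch with the old-style settings of that era (9999 is pgpool's conventional default port; the names and values are placeholders):

    # settings.py
    DATABASE_ENGINE = 'postgresql_psycopg2'
    DATABASE_NAME = 'mydb'
    DATABASE_USER = 'myuser'
    DATABASE_HOST = '127.0.0.1'
    DATABASE_PORT = '9999'  # pgpool listens here and pools the real connections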

I created a small Django patch that implements connection pooling for MySQL and PostgreSQL via SQLAlchemy pooling.

This has worked perfectly in production at http://grandcapital.net/ for a long period of time.

The patch was written after researching the topic a bit.
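
For reference, the core mechanism behind SQLAlchemy-based pooling is its DB-API proxy (this shows the general technique, not the patch itself; sqlalchemy.pool.manage existed in SQLAlchemy of that era but was later removed in 1.4):

    # Sketch: transparently pool psycopg2 connections via SQLAlchemy.
    import psycopg2 as _psycopg2
    import sqlalchemy.pool

    # connect() on the proxy draws from a QueuePool instead of dialing the
    # server each time; close() returns the connection to the pool.
    psycopg2 = sqlalchemy.pool.manage(_psycopg2, pool_size=10, max_overflow=20)

    conn = psycopg2.connect(dbname='mydb', user='myuser', host='localhost')
    # ... run queries as usual ...
    conn.close()  # the underlying connection stays open inside the pool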