Possible Duplicate:
Django cache.set() causing duplicate key error

I ran into this issue using Django core's database cache:

ERROR:  duplicate key value violates unique constraint "cache_pkey"
STATEMENT:  INSERT INTO "cache" (cache_key, value, expires) VALUES (E':1:cms-menu_nodes_en-us_1', E'gAJdcQEoY21lbnVzLmJhc2UKTmF2aW
LOG:  server process (PID 8453) was terminated by signal 9: Killed
LOG:  terminating any other active server processes
LOG:  all server processes terminated; reinitializing
FATAL:  could not create shared memory segment: Cannot allocate memory
DETAIL:  Failed system call was shmget(key=5432001, size=29278208, 03600).

I looked in the table and confirmed that there is an entry for that key, ':1:cms-menu_nodes_en-us_1'. I found a similar problem described here, but wasn't able to work out exactly what the cause is.

Anybody have ideas or suggestions? It looks like a bug in Django core, since if the key already exists, it should update the record.

edit: I should have clarified that the DB is PostgreSQL 8.4.7. Thanks lazerscience.

edit @ Jack M: I haven't been able to reproduce this error, but I believe the relevant code is in django.core.cache.backends.db.DatabaseCache, in a method called set() that calls _base_set().
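
Roughly speaking, the logic there seems to boil down to a check-then-insert. The sketch below is a simplified approximation of that pattern, not the verbatim Django source, just to show where I think the problem sits:

    from django.db import connection

    def cache_set_sketch(key, value, expires):
        # Illustrative only -- a rough approximation of the check-then-insert
        # logic in DatabaseCache._base_set(), not the actual Django code.
        cursor = connection.cursor()
        cursor.execute("SELECT cache_key FROM cache WHERE cache_key = %s", [key])
        if cursor.fetchone():
            # A row for this key already exists, so it gets updated in place.
            cursor.execute(
                "UPDATE cache SET value = %s, expires = %s WHERE cache_key = %s",
                [value, expires, key],
            )
        else:
            # No row was found -- but a concurrent request can insert the same
            # key between the SELECT above and this INSERT, which is exactly
            # when PostgreSQL raises the duplicate key error on cache_pkey.
            cursor.execute(
                "INSERT INTO cache (cache_key, value, expires) VALUES (%s, %s, %s)",
                [key, value, expires],
            )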

It looks like a bug in Django core, since if the key already exists, it should update the record.

Indeed, but I'd suggest the bug is nevertheless a concurrency problem, in which case it can be fixed at the application level. As in: two concurrent calls to the same resource/page/whatever each run an exists() check, find no row, and consequently both proceed to insert -- without taking a lock of any kind, and without wrapping the whole thing in a transaction so that the failing call could be discarded and (since it is only a cache) execution could continue.
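
As a minimal sketch of what I mean by handling it at the application level (the safe_cache_set helper is hypothetical, not something Django provides): catch the integrity error and carry on, rolling back to a savepoint so that a surrounding transaction stays usable on PostgreSQL.

    from django.core.cache import cache
    from django.db import IntegrityError, transaction

    def safe_cache_set(key, value):
        # Hypothetical wrapper: tolerate the race where a concurrent request
        # inserts the same cache_key between the backend's SELECT and INSERT.
        sid = transaction.savepoint()
        try:
            cache.set(key, value)
        except IntegrityError:
            # The other request already cached the value. Roll back to the
            # savepoint so the surrounding transaction (if any) stays usable
            # on PostgreSQL, then carry on -- it's only a cache.
            transaction.savepoint_rollback(sid)
        else:
            transaction.savepoint_commit(sid)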

It also begs the question: are you sure you should be caching in your database in the first place? The database is typically the bottleneck in a web application (especially when using an ORM), and the whole point of caching is to avoid that bottleneck. Shouldn't you be using memcached instead?
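
For example, switching to the memcached backend is a small settings change (this assumes a memcached daemon is running on 127.0.0.1:11211; the exact setting depends on your Django version):

    # settings.py -- Django 1.3+ style; older releases use the single-string
    # setting instead: CACHE_BACKEND = 'memcached://127.0.0.1:11211/'
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            'LOCATION': '127.0.0.1:11211',
        }
    }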