I'm coding a psychology experiment in Python. I need to store user information and scores somewhere, and I need it to work as a web application (and be secure).

Don't know much about this - I'm considering XML databases, BerkeleyDB, sqlite, an openoffice spreadsheet, or I'm very interested in the python "shelve" library. (most of my info coming from this thread: http://developers.slashdot.org/story/08/05/20/2150246/FOSS-Flat-File-Database)

DATA: I figure that I'm going to have maximally 1000 users. For each user I have to store...

  • Username / Pass
  • User detail fields (for a simple profile)
  • User scores on the exercise (2 datapoints: each trial gets a score (correct/incorrect/timeout) and has an associated number from 0.1 to 1.0 that I need to record)
  • Metadata about the trials (when, who, etc.)
  • Results of data analysis for the user

VERY rough estimate: each user generates 100 trials / day. So maximally 10k datapoints / day. It needs to run like that for about 3 months, so about 1m datapoints. Safety multiplier 2x gives me a target of a database that can handle 2m datapoints.
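(For concreteness, if I went the SQL route, I imagine a layout something like the sketch below; the table and column names are just my guesses, nothing final:)

    import sqlite3

    con = sqlite3.connect("experiment.db")
    con.executescript("""
        CREATE TABLE IF NOT EXISTS users (
            id       INTEGER PRIMARY KEY,
            username TEXT UNIQUE NOT NULL,
            pw_hash  TEXT NOT NULL,  -- store a hash, never the raw password
            profile  TEXT            -- simple profile detail fields
        );
        CREATE TABLE IF NOT EXISTS trials (
            id      INTEGER PRIMARY KEY,
            user_id INTEGER REFERENCES users(id),
            ran_at  TEXT,            -- metadata: when the trial ran
            outcome TEXT,            -- correct / incorrect / timeout
            score   REAL             -- the associated 0.1-to-1.0 number
        );
    """)
    con.commit()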

((Note: I could either store trial response data as individual datapoints, or group trials into Python list objects of varying length (user "sessions"). The latter would dramatically bring down the number of database records, though not the amount of data. Does it matter? How?))
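(Roughly, the difference between the two options would look like the sketch below; the pickled "session" row is just my illustration, not a settled design:)

    import pickle
    import sqlite3

    con = sqlite3.connect(":memory:")  # in-memory DB, just for the demo

    # Option A: one row per trial. Many more rows, but every datapoint
    # stays individually queryable with plain SQL (averages, filters, ...).
    con.execute("CREATE TABLE trials (user_id INTEGER, outcome TEXT, score REAL)")
    con.execute("INSERT INTO trials VALUES (?, ?, ?)", (1, "correct", 0.85))

    # Option B: one row per session. Far fewer rows, the same amount of
    # data, but the trial list becomes an opaque blob that has to be
    # unpickled in Python before it can be analyzed or filtered.
    con.execute("CREATE TABLE sessions (user_id INTEGER, trials BLOB)")
    session = [("correct", 0.85), ("timeout", 0.10)]
    con.execute("INSERT INTO sessions VALUES (?, ?)", (1, pickle.dumps(session)))
    con.commit()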

I'd like a solution that will work (at least) until I get to this 1000 users level. If my program is popular beyond that level, I'm okay with doing some work to mod in a beefier DB. Also repeating that it needs to be easily deployable as a web application.

Beyond those basic requirements, I just want the easiest thing that will make this work. I'm pretty green.

Thanks for reading.

Tr3y

SQLite can easily handle those amounts of data; it has a large userbase with a few very well-known users on practically all the platforms, it's fast and light, and there are cool GUI clients that let you browse and extract/filter data with a few clicks.

SQLite won't scale indefinitely, of course, but severe performance problems start only when simultaneous inserts are needed, which I would guess is a problem that appears several orders of magnitude above your projected load.

I've been using it for a few years, and I never had a problem with it (although for bigger sites I use MySQL). Personally I find that "Small. Fast. Reliable. Choose any three." (the tagline on SQLite's site) is quite accurate.

As for the ease of use... the SQLite3 bindings (site temporarily down) are part of the python standard library. Here you can find a small tutorial, and there's a short example below the quote. Interestingly enough, simplicity is a design criterion for SQLite. From here:

Many people like SQLite because it is small and fast. But those qualities are just happy accidents. Users also find that SQLite is very reliable. Reliability is a consequence of simplicity. With less complication, there is less to go wrong. So, yes, SQLite is small, fast, and reliable, but first and foremost, SQLite strives to be simple.
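Coming back to ease of use: to give you an idea of how little code it takes, here's a minimal, self-contained example with the standard-library bindings (the file name and table are made up, of course):

    import sqlite3

    con = sqlite3.connect("scores.db")  # the file is created if it doesn't exist
    con.execute("CREATE TABLE IF NOT EXISTS scores (user TEXT, score REAL)")
    con.execute("INSERT INTO scores VALUES (?, ?)", ("tr3y", 0.85))
    con.commit()

    # Query it back out; the connection's execute() returns an iterable cursor.
    for user, score in con.execute("SELECT user, score FROM scores"):
        print(user, score)

    con.close()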

There's a pretty spot-on discussion of when to use SQLite here. My favorite line is this:

Another way to look at SQLite is this: SQLite is not designed to replace Oracle. It is designed to replace fopen().

It seems to me that for your needs, SQLite is perfect. Indeed, it seems to me quite possible that you will never need anything else:

With the default page size of 1024 bytes, an SQLite database is limited in size to 2 terabytes (2^41 bytes).

It doesn't sound like you'll have that much data at any point. Even assuming a generous ~100 bytes per datapoint, your 2m target comes to roughly 200 MB, some four orders of magnitude below that limit.