I'm an occasional Python programmer who has only worked so far with MySQL or SQLite databases. I'm the computer person for everything in a small company, and I've been given a new project where it's about time to try new databases.
The sales department makes a CSV dump every week, and I have to make a small scripting application that lets people from other departments combine the data, mostly joining the records. I have all of this solved; my problem is speed. I'm using plain text files for all of this, and not surprisingly it's very slow.
I thought about using MySQL, but then I'd need to set up MySQL on every desktop; SQLite is easier, but it's very slow. I don't need a full relational database, just some way to work with large amounts of data in a decent time.
Update: I think I wasn't being very specific about my database usage, so I was explaining my problem badly. I'm currently reading all of the data (~900 MB or more) from the CSV into a Python dictionary and then working with it. My problem is storing and, mostly, reading the data quickly.
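Since the bottleneck is re-reading a ~900 MB CSV into a dictionary, one option is to load the weekly dump into a local SQLite file once, with an index on the lookup key, so later reads are indexed queries rather than full scans. This is only a sketch: the file name, table name, and the assumption that the first CSV column is the join key are all invented for illustration.

```python
import csv
import sqlite3

def load_csv_to_sqlite(csv_path, db_path="sales.db"):
    """Load the weekly sales dump into SQLite once; later lookups use the index."""
    conn = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        placeholders = ", ".join("?" for _ in header)
        conn.execute(f"CREATE TABLE IF NOT EXISTS sales ({cols})")
        # executemany streams the rows in; the whole file never sits in memory.
        conn.executemany(f"INSERT INTO sales VALUES ({placeholders})", reader)
    # Index the column you look records up by (here assumed to be the first one).
    conn.execute(f'CREATE INDEX IF NOT EXISTS idx_sales_key ON sales ("{header[0]}")')
    conn.commit()
    return conn
```

After this one-time load, a keyed lookup such as `SELECT * FROM sales WHERE "customer_id" = ?` hits the index instead of scanning everything, which is where an in-memory dictionary approach loses time on every run.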
You probably do need a full relational DBMS, if not right now then soon. If you start now, while your problems and data are simple and straightforward, then by the time they become complex and difficult you'll have plenty of experience with at least one DBMS to help you. You probably don't need MySQL on all the desktops; you could install it on a server, for instance, and serve data out over your network. But you probably need to provide more information about your requirements, toolset, and equipment to get better suggestions.
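One point in favour of starting with a database now: Python database drivers share the DB-API 2.0 interface, so query code written against SQLite today needs little change to talk to a MySQL server on the network later. A minimal sketch, with an invented schema and demo data; the `mysql.connector` call mentioned in the comment is from the separate `mysql-connector-python` package:

```python
import sqlite3

def open_db():
    # Today: a local SQLite database (":memory:" here for the demo). Later,
    # this one line can become something like
    #   mysql.connector.connect(host="dbserver", user=..., password=..., database=...)
    # and the query code below stays the same, since both drivers follow
    # DB-API 2.0 (the main wrinkle is the placeholder style: "?" vs "%s").
    return sqlite3.connect(":memory:")

def seed_demo(conn):
    """Create and fill a toy table (schema invented for the demo)."""
    conn.execute("CREATE TABLE sales (week TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("2024-W01", 10.0), ("2024-W01", 5.5), ("2024-W02", 7.0)])

def total_for_week(conn, week):
    """Sum one week's amounts; this function is driver-agnostic."""
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM sales WHERE week = ?", (week,)).fetchone()
    return total
```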
And, while the other DBMSes have their strengths and weaknesses too, there is nothing wrong with MySQL for large and complex databases. I don't know enough about SQLite to comment knowledgeably about it.
EDIT: @Eric, from your comments on my answer and on the other answers, I hold even more strongly the view that it's time you moved to a database. I'm not surprised that trying to do database operations on a 900 MB Python dictionary is slow. I think you first have to convince yourself, then your management, that you have reached the limits of what your current toolset can cope with, and that future developments are at risk unless you rethink matters.
If your network really can't support a server-based database, then (a) you need to make your network robust, reliable, and performant enough for that purpose, but (b) if that's not an option, or not an early option, you should be thinking of a central database server handing out digests/extracts/reports to the other users, rather than a simultaneous, full client-server RDBMS configuration.
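One concrete reading of "digests/extracts/reports" (a sketch only; the table, columns, and file names are invented) is a small script on the central machine that runs each department's canned query against the database and writes the result out as a CSV extract, which people then open with whatever they already use:

```python
import csv
import sqlite3

def export_extract(db_path, query, params, out_csv):
    """Run one canned query on the central database and dump the rows to CSV."""
    conn = sqlite3.connect(db_path)
    cur = conn.execute(query, params)
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)  # stream rows straight from the cursor
    conn.close()
```

A scheduled job could call this once per department after each weekly load, so desktops only ever read small static files instead of querying the server directly.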
The problems you are currently experiencing are the problems of not having the right tools for the job. They are only going to get worse. I wish I could suggest a magic way in which this is not the case, but I can't, and I don't think anyone else will.