I am using a SQLite db that is very convenient and seems to meet all my needs at this point.

Presently my db size is <50MB, but I now need to add a new table that will store large text blobs, which will cause the db to grow to as much as 5GB over the next year.

Would SQLite be able to cope with a 5GB db size? Any caveats to that, compared with, say, MySQL?
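For reference, a rough sketch of the kind of table I have in mind (the table and column names here are just placeholders for illustration):

```python
import sqlite3

# Planned schema sketch -- table and column names are made up for illustration.
conn = sqlite3.connect("mydatabase.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS documents (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        body TEXT          -- large text blob, potentially several MB per row
    )
""")
conn.commit()
conn.close()
```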

I'm not a huge expert on databases, but most of the DB-related work I've done has used SQLite. In my experience, making the database bigger should not, in itself, incur a large performance hit. Naturally you will have more data, so be prepared for queries over it to take longer!

Consider this thought experiment: you have a table named mydata that you use constantly in the DB. Now you add an unrelated table otherdata. Your queries against mydata don't depend on the data in otherdata. Even if you shove GBs of data into otherdata, you won't feel any real performance hit in your use of mydata.

AFAIK, the architecture of SQLite supports this claim.
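If you want to convince yourself, a minimal sketch of that experiment using Python's sqlite3 module might look like this (the table contents and data volumes are made up purely for illustration):

```python
import sqlite3
import time

conn = sqlite3.connect("test.db")
conn.execute("CREATE TABLE IF NOT EXISTS mydata (id INTEGER PRIMARY KEY, value TEXT)")
conn.execute("CREATE TABLE IF NOT EXISTS otherdata (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO mydata (value) VALUES (?)",
                 [("row %d" % i,) for i in range(10000)])
conn.commit()

def time_query():
    # Time a query that only touches mydata.
    start = time.time()
    conn.execute("SELECT COUNT(*) FROM mydata WHERE value LIKE 'row 9%'").fetchone()
    return time.time() - start

print("before bulk load:", time_query())

# Shove a couple of hundred MB of unrelated data into otherdata.
conn.executemany("INSERT INTO otherdata (payload) VALUES (?)",
                 [("x" * 1000000,) for _ in range(200)])
conn.commit()

print("after bulk load:", time_query())
conn.close()
```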

SQLite should be fine for what you want to do. Size really isn't an issue. As long as the data file can reside on the same computer that's making the calls, you should be fine. If you put it on a network share, that can work, but multi-user access is then subject to the quirks of the operating system when it comes to file locking, etc. Compared with MySQL, since you've removed the server, you've also removed the network traffic associated with data retrieval. That should speed things up.
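To make that concrete: the whole database is one ordinary file opened by the calling process, so there is no server process and no network round-trip involved. A minimal sketch, with a made-up file path:

```python
import sqlite3

# The entire database lives in this one local file -- no server process,
# no network round-trips, just ordinary file I/O on the calling machine.
conn = sqlite3.connect("/path/on/local/disk/app.db")  # path is just an example
tables = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
print(tables)
conn.close()
```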

-don

As mentioned in the SQLite FAQ, take a look at point 12: it states that the maximum size of a SQLite db can be up to roughly 140 TB!
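If you want to see how far your own file is from any ceiling, SQLite exposes the relevant numbers via PRAGMAs; a quick sketch (the database file name is just an example):

```python
import sqlite3

conn = sqlite3.connect("mydatabase.db")  # file name is just an example
page_size = conn.execute("PRAGMA page_size").fetchone()[0]       # bytes per page
page_count = conn.execute("PRAGMA page_count").fetchone()[0]     # pages currently in the file
max_pages = conn.execute("PRAGMA max_page_count").fetchone()[0]  # page limit for this connection

print("current size: %d bytes" % (page_size * page_count))
print("ceiling for this file at this page size: %d bytes" % (page_size * max_pages))
conn.close()
```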