I'm starting on a project that will likely grow into the millions of rows in the near future, so I'm researching which database to use, since that seems certain to become a problem. From what I've read, SQL in most of its incarnations has issues once you reach the 2,000,000-row mark for any table. Is there a good database suited to projects of this size?

It's a website I'm talking about, and archiving old records isn't ideal, though it can be done if it turns out to be a problem I can't overcome.


No database could call itself an SQL database if it had difficulties with two million records. You might get into trouble with some databases at 2 billion records, though.

I've had MySQL databases with more than 150 million records without trouble. You need to evaluate which features you require from a database before deciding, not worry about a couple of million rows, which isn't much at all.

I've had tables in MS SQL Server with a lot more than two million rows without trouble. Obviously, it depends on how you're using the data.

Just don't try using MySQL for something like this. At least in my experience, it simply doesn't allow enough fine-tuning to deliver sufficient performance. I've run into a few cases with large amounts of data in (almost) identically set up tables where MySQL 5 performed something like 30 times slower than SQL Server on the same hardware. An extreme example, maybe, but still.

I don't have enough experience with PostgreSQL or Oracle to judge, so I'll just stick to not recommending MySQL. Or Access ;)

First of all, a million records isn't exactly a lot where databases are concerned. Any database worth its salt should be able to handle that just fine.

Create proper indexes on your tables and just about any database will be able to handle those numbers of records. I've seen MySQL databases with millions of rows that worked just fine, and MySQL isn't a heavyweight in database land.

MS SQL Server, PostgreSQL, DB2, Progress OpenEdge: just about anything will do if you create proper indexes. Things like MS Access (and perhaps SQLite) may fall apart when you put a lot of data in them.
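To make the indexing point concrete, here's a minimal sketch using Python's standard-library `sqlite3` (an in-memory database, and made-up table/column names, purely for illustration). It times the same lookup on a few hundred thousand rows before and after adding an index; the same principle applies to any SQL engine.

```python
import sqlite3
import time

# In-memory SQLite database; the effect shown here only grows with table size.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, score INTEGER)")

# Insert a few hundred thousand rows (kept modest so the demo runs quickly).
rows = ((f"user{i}@example.com", i % 1000) for i in range(200_000))
conn.executemany("INSERT INTO users (email, score) VALUES (?, ?)", rows)
conn.commit()

def timed_lookup():
    """Look up a single row by email and return (row, elapsed seconds)."""
    start = time.perf_counter()
    row = conn.execute(
        "SELECT id FROM users WHERE email = ?", ("user199999@example.com",)
    ).fetchone()
    return row, time.perf_counter() - start

row_scan, t_scan = timed_lookup()   # no index: full table scan

conn.execute("CREATE INDEX idx_users_email ON users (email)")
row_idx, t_idx = timed_lookup()     # with index: B-tree seek

print(f"scan: {t_scan:.5f}s  indexed: {t_idx:.5f}s")
```

The indexed lookup is typically orders of magnitude faster, and the gap widens as the table grows; that is why a well-indexed table with millions of rows is a non-issue for any serious engine.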

We run plenty of databases with row counts in the hundreds of millions in MSSQL (2000, 2005, 2008). Your row count isn't where your problem will arise; it's in the patterns of access to the data. Depending on how those look, you may need to scale across separate hardware, and that's where the differences between database servers will really show up (and cost...)
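The access-pattern point can be illustrated with the query planner itself. This sketch again uses stdlib `sqlite3` with hypothetical table/column names: the same table yields an index seek for one query shape and a full scan (whose cost grows with row count) for another.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, placed_at TEXT)"
)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail text.
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Point lookup on an indexed column: the engine can seek directly.
plan_indexed = plan("SELECT * FROM orders WHERE customer_id = 42")
print(plan_indexed)

# Filter on an unindexed column: full table scan, cost proportional to row count.
plan_scan = plan("SELECT * FROM orders WHERE placed_at > '2024-01-01'")
print(plan_scan)
```

Whether your workload looks like the first query or the second matters far more than whether the table holds two million rows or twenty.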

One of the tables in my current project has 13 million rows in it. MS SQL Server handles it just fine. Really, two million rows is nothing.

But seriously, if you want a high-end database, look to Oracle, Teradata, and DB2.

Microsoft SQL Server, MySQL, Oracle, and DB2 can all handle many millions of rows with no problem.

The problem will be finding a DBA who understands how to design and maintain it properly so you get the performance characteristics you're looking for.

2,000,000 rows is really not much at all. I've seen lots of tables with > 50 million rows with acceptable performance in MS SQL.

IMHO you're still pretty far from being a 'big database'.

As others have said, any decent DB can handle that kind of load. I've used MS SQL Server and PostgreSQL for databases of this size before, and both work great. I'd recommend PostgreSQL since it's free and open. I've never done a performance comparison, but it seems very capable. I'd avoid DB2 or Oracle, as they're very difficult to use (unless you want to hire a full-time DBA, in which case you may be able to squeeze better performance out of them than any other solution, particularly with Oracle).