I have purchased a CSV United States business database with ~20 million records, divided into 51 files, each file representing one state.
I need to write an ASP.NET MVC web application that will query this database by state and other criteria. Should I create a SQL Server database and import all of the records from the 51 CSV files? Or should I query the CSV files directly? Which will be fastest? Other solutions are welcome as well.
One table, with appropriate indexes. 20 million records is peanuts.
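To make the "one table, with appropriate indexes" idea concrete, here is a minimal sketch using Python's sqlite3 as a stand-in for SQL Server; the table and column names (Business, State, Name) are assumptions, not from the original dataset:

```python
import sqlite3

# Illustrative only: SQLite standing in for SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Business (
        Id    INTEGER PRIMARY KEY,
        State TEXT NOT NULL,   -- e.g. 'NY'
        Name  TEXT NOT NULL
    )
""")
# The index on State is what makes "query by state" fast:
# the engine seeks the index instead of scanning 20M rows.
conn.execute("CREATE INDEX IX_Business_State ON Business(State)")

conn.execute("INSERT INTO Business (State, Name) VALUES ('NY', 'Acme Corp')")
rows = conn.execute(
    "SELECT Name FROM Business WHERE State = ?", ("NY",)
).fetchall()
print(rows)
```

With 20 million rows, a query filtered on an indexed column touches only the matching index entries rather than the whole table.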
Create a single database and put all of those records in it. But do it in a structured fashion, of course.
For example, you could create a table 'State' and a table called 'Business'. Create a relationship between those two tables, and normalize your database further.
A performant database starts with a good, normalized DB schema. Add the necessary indexes and you should be fine.
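A minimal sketch of the State/Business split described above, again using SQLite for illustration; the column names and the foreign-key layout are assumptions:

```python
import sqlite3

# Hypothetical normalized schema: State referenced by Business.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE State (
        Id   INTEGER PRIMARY KEY,
        Code TEXT NOT NULL UNIQUE    -- 'CA', 'NY', ...
    );
    CREATE TABLE Business (
        Id      INTEGER PRIMARY KEY,
        StateId INTEGER NOT NULL REFERENCES State(Id),
        Name    TEXT NOT NULL
    );
    -- Index the foreign key so filtering/joining by state is fast.
    CREATE INDEX IX_Business_StateId ON Business(StateId);
""")
conn.execute("INSERT INTO State (Code) VALUES ('CA')")
conn.execute(
    "INSERT INTO Business (StateId, Name) "
    "SELECT Id, 'Example LLC' FROM State WHERE Code = 'CA'"
)
result = conn.execute("""
    SELECT b.Name FROM Business b
    JOIN State s ON s.Id = b.StateId
    WHERE s.Code = ?
""", ("CA",)).fetchall()
print(result)
```

Storing the state once and referencing it by key avoids repeating the state string on every one of the 20 million rows.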
A database is designed to handle a large number of records.
I'd import the data into one large database. As long as the table is properly indexed, it will offer better query performance: instead of having to scan each file, the engine will be able to use the indexes to speed things up.
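The import itself can be sketched as follows (SQLite for illustration; in SQL Server you would typically use BULK INSERT or SSIS instead). The two-column CSV layout here is an assumption:

```python
import csv
import io
import sqlite3

# Hypothetical loader for one of the 51 state CSV files.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Business (State TEXT, Name TEXT)")

def import_csv(conn, fileobj):
    """Bulk-insert rows from an open CSV file into Business."""
    reader = csv.reader(fileobj)
    conn.executemany("INSERT INTO Business VALUES (?, ?)", reader)

# Stand-in for reading one of the real files from disk:
import_csv(conn, io.StringIO("TX,Lone Star Foods\nTX,Alamo Widgets\n"))

# Build the index after the bulk load; loading into an unindexed
# table and indexing once at the end is usually faster.
conn.execute("CREATE INDEX IX_Business_State ON Business(State)")

count = conn.execute(
    "SELECT COUNT(*) FROM Business WHERE State = ?", ("TX",)
).fetchone()[0]
print(count)
```

Run the loader once per file, then create the indexes, and every later query benefits from them.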