Assuming that best practices have been followed when designing a new database, how do you go about testing the database in a way that builds confidence in its ability to meet adequate performance standards, and that suggests performance-improving tweaks to the database structure if they are needed?

Do I need test data? What does that look like if no usage patterns exist for the database yet?

NOTE: Resources such as blogs and book titles are welcome.

I'd do a few things:

1) Simulate user/application connections to the db and test the load (load testing). I'd recommend connecting with many more users than are actually expected to use the system. You can have all of your users log in, or get third-party software that will log in many users and perform defined functions that you feel are an adequate test of your system. (A rough load-test sketch follows this list.)

2) Insert many (possibly millions of) test records and load test again (scalability testing). As tables grow you may find you need indexes where you didn't have them before, or there may be problems with views or joins used throughout the system. (See the bulk-load sketch after this list.)

3) Analyze the database. I'm referring to the technique of analyzing tables so the optimizer has up-to-date statistics. There are plenty of dry pages explaining what it is, as well as some good articles on Oracle database tuning, much of which may apply to what you're doing. (A short statistics-gathering sketch appears after this list.)

4) Run the queries generated by applications/users and run explain plans for them. This will, for instance, tell you when you have full table scans. It will help you fix a lot of your issues. (An explain-plan sketch is included after this list.)

5) Also back up and restore from those backups to demonstrate confidence in that as well.
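
To make point 1 concrete, here is a minimal load-test sketch in Python. It assumes a PostgreSQL database and the psycopg2 driver; the DSN, the orders table, the query, and the user counts are all made-up placeholders, so swap in a representative query from your own application and whatever driver matches your RDBMS.

```python
# Minimal load-test sketch: many simulated users hammering one representative
# query. DSN, table and query are hypothetical; adjust for your own database.
import threading
import time

import psycopg2

DSN = "dbname=testdb user=tester password=secret host=localhost"  # placeholder
QUERY = "SELECT * FROM orders WHERE customer_id = %s"             # placeholder
NUM_USERS = 200           # deliberately more users than you expect in production
QUERIES_PER_USER = 50

latencies = []
lock = threading.Lock()

def simulated_user(user_id):
    conn = psycopg2.connect(DSN)
    cur = conn.cursor()
    for _ in range(QUERIES_PER_USER):
        start = time.perf_counter()
        cur.execute(QUERY, (user_id,))
        cur.fetchall()
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)
    cur.close()
    conn.close()

threads = [threading.Thread(target=simulated_user, args=(i,)) for i in range(NUM_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

latencies.sort()
print(f"queries run:     {len(latencies)}")
print(f"median latency:  {latencies[len(latencies) // 2]:.4f}s")
print(f"95th percentile: {latencies[int(len(latencies) * 0.95)]:.4f}s")
```

Dedicated load-testing tools will give you better reporting, but even a script like this is enough to see whether response times degrade as you increase NUM_USERS.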
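For point 2, here's a rough sketch of bulk-loading synthetic rows so the same load test can be repeated against much larger tables. The orders table, its columns, and the connection string are again hypothetical, and a data-generation tool will do this with less effort; committing per batch just keeps the transactions manageable.

```python
# Scalability-test sketch: insert a large volume of synthetic rows in batches.
# Table name, columns and DSN are hypothetical placeholders.
import random
import string

import psycopg2

DSN = "dbname=testdb user=tester password=secret host=localhost"  # placeholder
TOTAL_ROWS = 1_000_000
BATCH_SIZE = 10_000

def random_word(n=12):
    return "".join(random.choices(string.ascii_lowercase, k=n))

conn = psycopg2.connect(DSN)
cur = conn.cursor()

inserted = 0
while inserted < TOTAL_ROWS:
    batch = [
        (
            random.randint(1, 50_000),           # customer_id
            random_word(),                       # product
            round(random.uniform(1, 500), 2),    # price
        )
        for _ in range(BATCH_SIZE)
    ]
    cur.executemany(
        "INSERT INTO orders (customer_id, product, price) VALUES (%s, %s, %s)",
        batch,
    )
    conn.commit()   # commit per batch rather than one enormous transaction
    inserted += BATCH_SIZE

cur.close()
conn.close()
print(f"inserted {inserted} test rows")
```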
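For point 3, refreshing optimizer statistics after a bulk load is usually one command per table. The sketch below uses PostgreSQL's ANALYZE; Oracle has DBMS_STATS.GATHER_TABLE_STATS and SQL Server has UPDATE STATISTICS, so treat the exact statement and the table names as placeholders for your platform.

```python
# Statistics-gathering sketch: re-analyze each table after loading test data
# so the optimizer's row estimates reflect the new volumes (PostgreSQL syntax).
import psycopg2

DSN = "dbname=testdb user=tester password=secret host=localhost"   # placeholder
TABLES = ["orders", "customers", "products"]                       # placeholder

conn = psycopg2.connect(DSN)
conn.autocommit = True   # run each ANALYZE in its own implicit transaction
cur = conn.cursor()
for table in TABLES:
    cur.execute(f"ANALYZE {table}")
    print(f"statistics refreshed for {table}")
cur.close()
conn.close()
```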
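And for point 4, a sketch of running explain plans over a list of representative queries and flagging full table scans. The queries and connection details are invented, and the plan output and the "Seq Scan" marker it looks for are specific to PostgreSQL; Oracle would use EXPLAIN PLAN FOR plus DBMS_XPLAN, and SQL Server has its SHOWPLAN options.

```python
# Explain-plan sketch: print the plan for each representative query and flag
# sequential (full table) scans. Queries and DSN are hypothetical.
import psycopg2

DSN = "dbname=testdb user=tester password=secret host=localhost"  # placeholder
QUERIES = [
    "SELECT * FROM orders WHERE customer_id = 42",
    "SELECT c.name, SUM(o.price) FROM customers c "
    "JOIN orders o ON o.customer_id = c.id GROUP BY c.name",
]

conn = psycopg2.connect(DSN)
cur = conn.cursor()
for sql in QUERIES:
    cur.execute("EXPLAIN " + sql)
    print(sql)
    for (line,) in cur.fetchall():
        marker = "   <-- full table scan" if "Seq Scan" in line else ""
        print("    " + line + marker)
    print()
cur.close()
conn.close()
```

Anything that shows up as a full scan on a large table is a candidate for a new index or a rewritten query.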

You could use something like Red Gate's Data Generator to get a good load of test data into it to see how the schema performs under load. You're right that without knowing the usage patterns it's difficult to construct a perfect test plan, but I presume you must have a rough idea of the kind of queries that will be run against it.

Adequate performance standards are really defined by the particular client applications that will consume your database. Get a SQL Profiler trace going while the applications hit your db and you should be able to quickly spot any problem areas that may need more optimising (or even de-normalising in some instances).