Are there any guidelines available for loading data into a database for use with a new install of an application? For example, for application foo to run, it needs some basic data before it can even be started. I've used a couple of options in the past:

T-SQL for each row that needs to be preloaded:

IF NOT EXISTS (SELECT * FROM Master.Site WHERE Name = @SiteName)
    INSERT INTO [Master].[Site] ([EnterpriseID], [Name], [LastModifiedTime], [LastModifiedUser])
    VALUES (@EnterpriseId, @SiteName, GETDATE(), @LastModifiedUser)

Another option is a spreadsheet. Each tab represents a table, and data is entered into the spreadsheet as we realize we need it. Then, a program can read the spreadsheet and populate the DB.
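
If it helps to picture it, the program does little more than read each tab and issue the same kind of guarded insert shown above. A rough T-SQL-only equivalent, assuming SQL Server with the ACE OLE DB provider installed and ad hoc distributed queries enabled (the workbook path and sheet name are made up):

INSERT INTO [Master].[Site] ([EnterpriseID], [Name], [LastModifiedTime], [LastModifiedUser])
SELECT src.EnterpriseID, src.Name, GETDATE(), 'install'
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Seed\SeedData.xlsx;HDR=YES',
                'SELECT * FROM [Site$]') AS src
WHERE NOT EXISTS (SELECT * FROM [Master].[Site] t WHERE t.Name = src.Name)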

There are further complicating factors, such as the relationships between tables. So, it isn't as simple as loading each table on its own. For example, if we create Security.Member rows and then want to add those members to Security.Role, we need a way of maintaining that relationship.
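
To give a concrete picture, the kind of thing we end up scripting resolves the parent rows by natural key rather than hard-coded surrogate IDs - roughly like this (Security.RoleMember, the column names, and the variables are placeholders, not our real schema):

DECLARE @MemberId INT, @RoleId INT

IF NOT EXISTS (SELECT * FROM Security.Member WHERE LoginName = @LoginName)
    INSERT INTO Security.Member (LoginName) VALUES (@LoginName)

SELECT @MemberId = MemberID FROM Security.Member WHERE LoginName = @LoginName
SELECT @RoleId = RoleID FROM Security.Role WHERE Name = @RoleName

IF NOT EXISTS (SELECT * FROM Security.RoleMember WHERE MemberID = @MemberId AND RoleID = @RoleId)
    INSERT INTO Security.RoleMember (MemberID, RoleID) VALUES (@MemberId, @RoleId)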

Another factor is that not all databases will be missing this data. Some locations will already have most of the data, while others (which might be new locations anywhere in the world) will start from scratch.

Any ideas are appreciated.

If it's not much data - the bare initialization of configuration data - we typically script it with any database creation/modification.

With scripts you have a lot of control, so you can insert only missing rows, remove rows that are known to be obsolete, avoid overriding certain columns that have been customized, etc.
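
For example, the statements in such a script tend to look like this (borrowing Master.Site from the question; DefaultTheme and the literal values are made up for illustration):

-- Insert rows that are missing
IF NOT EXISTS (SELECT * FROM Master.Site WHERE Name = 'Head Office')
    INSERT INTO Master.Site (EnterpriseID, Name, LastModifiedTime, LastModifiedUser)
    VALUES (1, 'Head Office', GETDATE(), 'install')

-- Remove rows known to be obsolete
DELETE FROM Master.Site WHERE Name = 'Old Pilot Site'

-- Refresh a shipped default, but only if nobody has customized it
-- (i.e. it still holds the old default value)
UPDATE Master.Site
SET DefaultTheme = 'Standard'
WHERE Name = 'Head Office'
  AND DefaultTheme = 'Classic'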

If it's a lot of data, then you probably want to have an external file(s) - I would avoid a spreadsheet and use plain text file(s) instead (BULK INSERT). You can load this into a staging area and still use the same techniques you would use in a script to make sure you don't clobber any special customization in the destination. And because it's under script control, you have control of the order of operations to ensure referential integrity.
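
A minimal sketch of that flow, assuming SQL Server syntax and made-up file, schema, and column names:

-- Load the raw text file into a staging table first
CREATE TABLE Staging.Site (EnterpriseID INT, Name NVARCHAR(100))

BULK INSERT Staging.Site
FROM 'C:\Seed\Site.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

-- Then apply it with the same "only if missing" guard a hand-written script would use
INSERT INTO Master.Site (EnterpriseID, Name, LastModifiedTime, LastModifiedUser)
SELECT s.EnterpriseID, s.Name, GETDATE(), 'install'
FROM Staging.Site s
WHERE NOT EXISTS (SELECT * FROM Master.Site t WHERE t.Name = s.Name)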

I would recommend a combination of the two approaches indicated in Cade's answer.

Step 1. Load all of the needed data into temp tables (on Sybase, for example, load data for table "db1..table1" into "temp..db1_table1"). In order to be able to handle large datasets, use the bulk copy mechanism (whichever one your DB server supports) without writing to the transaction log.
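
As a minimal sketch in SQL Server terms (the temp..db1_table1 naming above is Sybase's; the file path and table names here are just examples):

-- Create an empty copy of the target table's structure in tempdb
SELECT * INTO tempdb..db1_table1 FROM db1..table1 WHERE 1 = 0

-- Bulk load the file; TABLOCK into an index-free heap allows minimal logging
-- (given a suitable recovery model) instead of fully logging every row
BULK INSERT tempdb..db1_table1
FROM 'C:\Seed\table1.dat'
WITH (TABLOCK, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')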

Step 2. Run a script whose main step is to iterate over each table to be loaded, create indexes on the newly created temp tables where needed, compare the data in each temp table to the main table, and insert/update/delete the differences. Then, if needed, the script can do auxiliary tasks like the security role setup you mentioned.
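
The per-table body of that script might look roughly like this (SQL Server syntax, made-up names, with the same pattern repeated for every table in dependency order):

-- Index the staged copy so the comparisons don't scan
CREATE INDEX IX_stage_Site_Name ON tempdb..db1_Site (Name)

-- Insert rows present in the load but missing from the main table
INSERT INTO db1..Site (EnterpriseID, Name)
SELECT s.EnterpriseID, s.Name
FROM tempdb..db1_Site s
WHERE NOT EXISTS (SELECT * FROM db1..Site t WHERE t.Name = s.Name)

-- Update rows whose values differ
UPDATE t
SET t.EnterpriseID = s.EnterpriseID
FROM db1..Site t
JOIN tempdb..db1_Site s ON s.Name = t.Name
WHERE t.EnterpriseID <> s.EnterpriseID

-- Delete rows that no longer appear in the load
DELETE t
FROM db1..Site t
WHERE NOT EXISTS (SELECT * FROM tempdb..db1_Site s WHERE s.Name = t.Name)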