How do you handle source control and automated deployment (configuration management) of database tables? In a SQL Server environment it's pretty simple to script out drop-and-create files for stored procedures/triggers/functions, even jobs. It's also easy to script the creation of a new table. But if at some later point you need to modify that table, you can't always just drop it and recreate it with the new column, for fear of losing data. Is there an automated way to deal with this? Do you script out a temp table and backfill after updating the changed table? (could be rough if there's a lot of data)
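For reference, the temp-table-and-backfill approach mentioned above looks roughly like this. This is a sketch with made-up table and column names; SSMS generates essentially this shape of script when a change can't be made in place:

```sql
-- Rebuild-and-backfill pattern (hypothetical dbo.Customer table).
BEGIN TRANSACTION;

CREATE TABLE dbo.Tmp_Customer (
    CustomerId INT           NOT NULL,
    Name       NVARCHAR(100) NOT NULL,
    Region     NVARCHAR(50)  NULL    -- the new column being added
);

-- Backfill from the old table (this is the slow part on big tables).
INSERT INTO dbo.Tmp_Customer (CustomerId, Name)
SELECT CustomerId, Name FROM dbo.Customer;

DROP TABLE dbo.Customer;
EXEC sp_rename 'dbo.Tmp_Customer', 'Customer';

COMMIT;
```

The whole thing runs in one transaction, so the table is locked (and the log grows) for the duration of the copy, which is why this gets rough with a lot of data.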

Any suggestions would be greatly appreciated.

There are tools available that help you develop your schema, develop changes, version those changes, compare the differences between versions, and even generate the SQL to make the DDL changes.

For instance, take a look at Embarcadero Change Manager and the other products offered by Embarcadero.

You can automatically generate the initial creation script, but ALTER scripts really should be hand-coded on a case-by-case basis, because in practice you have to do custom stuff in them.
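As an example of the kind of custom work that makes ALTER scripts hand-coded, here is a sketch of adding a NOT NULL column to an existing table (table, column, and default value are all hypothetical; the custom part is deciding how to populate the existing rows):

```sql
-- Add the column as nullable first, so existing rows don't block the ALTER.
ALTER TABLE dbo.Orders ADD Status VARCHAR(20) NULL;
GO

-- Backfill existing rows with a business-specific value -- this step
-- is what no tool can generate for you.
UPDATE dbo.Orders SET Status = 'Legacy' WHERE Status IS NULL;
GO

-- Only now tighten the column to NOT NULL.
ALTER TABLE dbo.Orders ALTER COLUMN Status VARCHAR(20) NOT NULL;
```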

Regardless, you'll need a way of creating apply and rollback scripts for each change, and an installer script which runs them (along with a rollback which backs it out, of course). This kind of installer should probably record what version the schema is at, and run all the necessary migrations, in the right order.
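One common way to implement the "installer remembers the schema version" idea is a version table that each migration checks and bumps. This is just a sketch of the pattern, not a full framework, and all names and version numbers are illustrative:

```sql
-- One-time setup: a table recording which schema version is installed.
CREATE TABLE dbo.SchemaVersion (
    Version   INT      NOT NULL,
    AppliedAt DATETIME NOT NULL DEFAULT GETDATE()
);
INSERT INTO dbo.SchemaVersion (Version) VALUES (0);
GO

-- Each migration guards itself, so the installer can run the scripts
-- in order and any already-applied migration becomes a no-op.
IF (SELECT MAX(Version) FROM dbo.SchemaVersion) < 1
BEGIN
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(255) NULL;
    INSERT INTO dbo.SchemaVersion (Version) VALUES (1);
END
GO
```

The rollback script for each version does the reverse (drop the column, delete the version row), which is why the pair is best written at the same time.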

See my article here:

http://marksverbiage.blogspot.com/2008/07/versioning-your-schema.html

Tools like Red Gate's SQL Compare are invaluable for making sure you have a complete script. You may still need to manually adjust it to make sure the objects are scripted in the correct order. Make sure to script triggers and constraints, etc., as well as tables. Generally you'll want to use ALTER commands rather than drop-and-create, particularly if the table is at all large.
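The "correct order" issue is mostly about dependencies: an object has to be created after everything it references. A minimal example with made-up tables, where a foreign key forces the parent table to come first:

```sql
-- The parent must exist before the child that references it;
-- script these in the opposite order and the FK creation fails.
CREATE TABLE dbo.Region (
    RegionId INT NOT NULL PRIMARY KEY
);

CREATE TABLE dbo.Customer (
    CustomerId INT NOT NULL PRIMARY KEY,
    RegionId   INT NOT NULL,
    CONSTRAINT FK_Customer_Region
        FOREIGN KEY (RegionId) REFERENCES dbo.Region (RegionId)
);
```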

All of our tables and functions and stored procs are required to be under source control too, so we can go back to old versions if necessary. Also, our DBAs periodically delete whatever they find that isn't in source control, which keeps developers from forgetting to check things in.

Obviously, all development scripts being promoted to production should be run on a QA or staging server first, to ensure the script runs correctly (with no changes needed) before it is run on prod. The timing of running on prod also needs to be considered: you don't want to lock out users, especially during busy periods, and experience has shown that loading scripts to production late on a Friday afternoon is generally a bad idea.

It varies, depending on how you need to treat existing data and how extensive the schema changes are, but even in Management Studio you can get a script of all the changes before committing them.

For large amounts of data, or where there are constraints or foreign keys involved, even simple ALTER operations can take a long time.
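When a single long-running statement would hold locks for too long, backfills are often broken into batches so each transaction stays short. A sketch of the pattern, using the same hypothetical dbo.Orders table (the batch size of 5000 is just a tuning knob):

```sql
-- Backfill in small batches so each statement commits quickly
-- and other sessions aren't blocked for the whole run.
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (5000) dbo.Orders
    SET Status = 'Legacy'
    WHERE Status IS NULL;

    SET @rows = @@ROWCOUNT;
END
```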

Oh, forgot to mention: make sure you have a good set of database backups before loading schema changes to production. Better safe than sorry.