Our project has about 20 developers, but our application makes relatively light use of databases. We have a collection of about 5 databases, all very small: each has fewer than 20 tables, and none has millions of rows or anything large.
We have two options on the table for how to handle the evolution of these databases over time:
- Some kind of tool. Currently we use Visual Studio database projects, which hold the current definition of the schema and compare it against a reference database to generate a diff script. We then use this diff script to bring the reference database up to date.
- Versioned scripts that build the database up from a baseline. The scripts are placed into source control by hand. Data migrations that move data from old columns/tables to new ones would be part of these scripts. A version number would be recorded in the DB somewhere, and upgrading would run all scripts between the DB's version and the current version.
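To illustrate the second option, here is a minimal sketch of such a version-script runner in Python with SQLite. The `schema_version` table, the migration contents, and the inline script strings are all invented for the example; in practice each migration would be a `.sql` file checked into source control.

```python
import sqlite3

# Ordered migration scripts, one per version. These inline strings stand
# in for .sql files kept in source control; the contents are illustrative.
MIGRATIONS = {
    1: "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)",
    2: "ALTER TABLE customer ADD COLUMN email TEXT",
}

def get_version(conn):
    """Read the version recorded in the DB (0 for a fresh baseline)."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0

def upgrade(conn):
    """Run every script between the DB's recorded version and the latest."""
    current = get_version(conn)
    for version in sorted(v for v in MIGRATIONS if v > current):
        with conn:  # the script and the version bump commit together
            conn.execute(MIGRATIONS[version])
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))

conn = sqlite3.connect(":memory:")
upgrade(conn)  # builds from the baseline up to the latest version
upgrade(conn)  # a second run finds nothing left to do
```

The appeal is that the exact same runner and scripts execute in development, staging, and production; the only variable is each database's recorded version.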
The second option seems to be widely used, and I found an in-depth discussion of it here: http://odetocode.com/blogs/scott/archive/2008/01/31/versioning-databases-the-baseline.aspx
The problem with our current setup is that we have no connection to our Production databases. This means that to create a release package, we have to restore a backup of Production into another location, generate a diff against that reference DB, and hand the resulting script to the production DB team. So our release process for production differs from every other environment.
This makes the idea of running versioned scripts appealing, because we would use the same scripts in all environments and there would be no ad-hoc work at deployment time (e.g. the manual restore of prod into the reference DB). But given how small-scale our DB situation is, we can hardly be a difficult case for the DB tools out there. What we want is something as simple as possible that is easy to understand.
Do tools such as Red Gate's suite make sense for this kind of scenario, or should we go with versioned scripts? Cost is not much of an issue; it's more about creating a Pit of Success where maintaining and deploying the DB is as simple and automated as possible.
In my experience there is always more to it than mere schema changes. If you split a column in two, or move a column into a separate table, or whatever else, you have to migrate both the schema and the data.
No tool or script will migrate the actual data for you automatically. At most you get a diff of the schema, which your devs may find useful as a reminder/checklist when writing the DB version migration scripts (sequences of create/alter/drop and insert/update/delete executed in a single transaction).
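To make that concrete, here is what splitting a column looks like as a hand-written migration: the schema change and the data move travel together in one transaction, and a diff tool would only ever see "columns added". This is a sketch in Python with SQLite; the `person` table and its columns are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE person (id INTEGER PRIMARY KEY, full_name TEXT);
    INSERT INTO person (full_name) VALUES ('Ada Lovelace'), ('Alan Turing');
""")

# Migration: split full_name into first_name / last_name. A schema diff
# would produce the two ALTERs, but the UPDATE that moves the data can
# only be written by hand.
with conn:  # schema change and data move commit as one transaction
    conn.execute("ALTER TABLE person ADD COLUMN first_name TEXT")
    conn.execute("ALTER TABLE person ADD COLUMN last_name TEXT")
    conn.execute("""
        UPDATE person SET
            first_name = substr(full_name, 1, instr(full_name, ' ') - 1),
            last_name  = substr(full_name, instr(full_name, ' ') + 1)
    """)
    # Dropping the old column would follow here; SQLite only supports
    # ALTER TABLE ... DROP COLUMN from version 3.35 onwards.
```

If the transaction fails partway through, neither the new columns nor the moved data are committed, which is exactly the guarantee a generated diff script alone cannot give you for the data half.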
I'm the product manager at Red Gate for SQL Compare, which generates diff scripts between two databases. I'd encourage you to have a look at our SQL Source Control tool, which lets you track schema changes as they are made in development. When it comes to deployment, if you know which schema version is in production, you can generate a deployment script from your source-controlled versions. Of course, you should always test this in a staging environment before running it against production.
Scott's article makes an excellent point about migration scripts, and Denis alludes to more complex changes that comparison tools cannot reasonably second-guess and that therefore require custom migration scripts to be managed and applied properly. The next version of SQL Compare, together with SQL Source Control, will therefore manage both your schema versions and your migration scripts, giving you the best of both worlds. If you'd like to see early screenshots of this, please email me at David dot Atkinson at red-gate dot com. I'd love to discuss your requirements so we can design the tool better.