What are the pros and cons of using a dump file as the basis of data and schema migration, as opposed to a fully script-based approach or a database delta tool?

The context is that the application is in production and there is only one production database. The application and database schema are in active development. Critical user data exists in the production database and must be carried forward with the deployment of new versions or fixes.

The approaches being discussed are:

Dump file based -

  1. Begin with a reference dump file.
  2. Database alter scripts are checked into source control.
  3. Deployment involves loading the dump file and then running the alter scripts (a sketch of such a script follows this list).
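To make the idea concrete, a checked-in alter script under this approach might look roughly like the following. The file, table, and column names are made up, and the syntax is PostgreSQL-flavoured:

```sql
-- 003_add_customer_email.sql  (hypothetical alter script kept in source control)
-- Run in numbered order after the reference dump has been loaded.

ALTER TABLE customer ADD COLUMN email VARCHAR(255);

-- Backfill existing rows so a NOT NULL constraint can be added safely.
UPDATE customer SET email = '' WHERE email IS NULL;

ALTER TABLE customer ALTER COLUMN email SET NOT NULL;
```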

Schema + migration

  1. The entire schema and certain non-user configuration data are kept as DDL and DML scripts in SCM.
  2. Migration scripts from the last release's schema are kept in SCM.
  3. Deployment involves loading the schema and then migrating the data (see the sketch after this list).
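A rough sketch of how this could be laid out, assuming a hand-rolled version table rather than any particular tool (all names and versions here are hypothetical):

```sql
-- schema/baseline.sql (excerpt): full DDL for the current release, kept in SCM
CREATE TABLE schema_version (
    version    VARCHAR(20) NOT NULL,
    applied_at TIMESTAMP   NOT NULL DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE customer (
    id    INTEGER PRIMARY KEY,
    name  VARCHAR(100) NOT NULL,
    email VARCHAR(255) NOT NULL
);

-- migrations/upgrade_1.4_to_1.5.sql (excerpt): transforms a database that is
-- at the last release's schema, then records the new version.
ALTER TABLE customer ADD COLUMN email VARCHAR(255) NOT NULL DEFAULT '';
INSERT INTO schema_version (version) VALUES ('1.5');
```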

My intuition is that using a binary format as the basis is not a good idea, but I need to be able to convince others (if that is indeed the case), who argue that it is necessary.


I reworded this to make it easier to answer.

Below is the original question:

I'm working with a team on a database-driven enterprise application and looking to improve our process. Currently, there is a lot of manual work around upgrading the database on all tiers. The goal is to have a process for upgrading the database consistently and in an automated fashion (along the lines of atomic commits and closer to continuous delivery), which would bring numerous advantages.

I believe the schema (and certain data necessary for application configuration) should be represented in source control, and in addition, any scripts necessary to transform and load user data from the current production database. I've read that it's not advisable to have a dump file (.dmp) in source control, and intuitively I agree strongly. However, not everybody on the project agrees with me. The counter-argument is that in reality it is not possible, or at least is too difficult, not to start from a dump file. I'm up against the limit of my database knowledge and can't really debate this meaningfully... I'm more of a developer than a database specialist. The alternative recommended is that alter scripts be kept that bring the dump up to the latest schema.

Can someone help me understand the pros and cons of each approach a little better? Is the dump-based approach necessary, advisable, or inadvisable, and why?

A little background that might be relevant: the application is in production, so each new version must import data as part of the deployment process, and for obvious reasons this should be real data on the integration and UAT tiers. However, this application is not "shipped" and installed by customers; there is only one production instance at any point in time, maintained internally. I realize there will be details specific to my project, so the answer will need to address the general situation.

Most projects I have been on had explicit SQL scripts for schema creation and initial data insertion (and upgrade scripts with ALTER statements and data migrations).
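For example, a minimal placeholder version of the kind of scripts I mean (the table and values are invented):

```sql
-- create_schema.sql: schema creation, checked in next to the application code
CREATE TABLE app_config (
    config_key   VARCHAR(100) PRIMARY KEY,
    config_value VARCHAR(255) NOT NULL
);

-- initial_data.sql: non-user configuration data the application needs to run
INSERT INTO app_config (config_key, config_value) VALUES ('default.locale', 'en_US');
INSERT INTO app_config (config_key, config_value) VALUES ('feature.reports.enabled', 'true');
```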

There are also tools like Liquibase for managing database changes.

http://www.liquibase.org/bestpractices
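
Liquibase changelogs can also be written as plain SQL files; a minimal formatted-SQL changeset looks roughly like this (the author, id, and table are made up):

```sql
--liquibase formatted sql

--changeset jane:add-customer-email
ALTER TABLE customer ADD COLUMN email VARCHAR(255);
--rollback ALTER TABLE customer DROP COLUMN email;
```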