We are re-evaluating our database upgrade process for our application, to get away from the pain of having to generate all of the upgrade scripts for a release at the end of the release cycle. We are looking to move towards a more evolutionary process, using migrations that are checked in alongside features, with a tool such as migratordotnet, which seems like a very clean way of managing schema changes.
However, the default data that ships with our database changes quite regularly, and some of these data updates do not fit well with the migration process. For example, inserts into tables that have an identity primary key are not easily identifiable afterwards, and so cannot be reversed on downgrading.
So I am wondering how people manage the migration of default data? Do they handle it outside of the schema migration process? Or are inserts performed during migrations, but removal of the data simply not performed during downgrades?
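One common way to make default-data inserts reversible is to key them on a stable natural/business key rather than the identity primary key, so the Down step can delete exactly the rows the Up step added. A minimal sketch in migratordotnet's Up/Down style follows; the table name, column names, and values are invented for illustration, and the exact Database.Insert/Delete signatures should be checked against the version of the library you use:

```csharp
using Migrator.Framework;

// Hypothetical migration: inserts default rows in Up() and removes
// them in Down() by a natural key ("Code"), since the identity PK
// assigned at insert time is not known when downgrading.
[Migration(20240101)]
public class AddDefaultCardTypes : Migration
{
    public override void Up()
    {
        Database.Insert("CardType", new[] { "Code", "Name" },
                        new[] { "VISA", "Visa" });
        Database.Insert("CardType", new[] { "Code", "Name" },
                        new[] { "MC", "MasterCard" });
    }

    public override void Down()
    {
        // Delete by the same natural key used in Up(), so only the
        // rows this migration added are removed.
        Database.Delete("CardType", "Code", "VISA");
        Database.Delete("CardType", "Code", "MC");
    }
}
```

If user-editable rows might collide with the defaults, an extra flag column (e.g. an assumed "IsSystemDefault" bit) can distinguish shipped rows from user data on downgrade.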
Thanks in advance.
For us, DB migration is part of the daily development process. Developers have to commit either a program or a script which performs the necessary changes. This happens as soon as the related feature is implemented, never "at the end of the release cycle".
Downgrade is rarely an issue for us, but when it is, create a column with version information. When the upgrade works and the customer decides to stay with the new version, drop the column again (or keep it).
The key to success is that we have extensive test cases which create the database from scratch (using an in-memory DB like H2, or a DB that is installed on each developer machine), including all data, and then migrate it all the way up and back down again. We can import anonymized data from the production servers into the test cases to find bugs and improve the tests without compromising the privacy of our customers or getting in the way of the developers.
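The "all the way up and back down" check above can be automated as a single test. A sketch in NUnit style, assuming migratordotnet's Migrator class with MigrateToLastVersion()/MigrateTo() (the provider name, connection string, and assembly are placeholders to adapt to your setup):

```csharp
using System.Reflection;
using NUnit.Framework;

[TestFixture]
public class MigrationRoundTripTest
{
    [Test]
    public void MigratesAllTheWayUpAndBackDown()
    {
        // Build the migrator against a throwaway local database so the
        // test always starts from scratch (placeholder connection string).
        var migrator = new Migrator.Migrator(
            "SQLite",
            "Data Source=migration_test.db",
            Assembly.GetExecutingAssembly()); // assembly containing the migrations

        // Apply every migration, then undo every one of them.
        // Each step throws if any Up() or Down() fails.
        migrator.MigrateToLastVersion();
        migrator.MigrateTo(0);

        // Optionally: assert here that the schema is empty again and
        // that no default data rows were left behind by a Down() step.
    }
}
```

Running this on every commit is what catches the irreversible data migrations described in the question before they reach a release.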