My team has been getting increasingly into CMSes over the last couple of years. We have also been getting increasingly into continuous integration. Marrying the two has proven difficult. To make it worse, we do both LAMP and .NET sites, so our scripts ideally need to work with both.

We now have four environments for every site: local, integration, staging, and production. Content entry and file uploads happen regularly on the production site. Development obviously begins locally and works its way up.

What strategies or techniques can we implement on our build server to automatically push data and schema updates from the development environments into production without overwriting user-generated content? Conversely, how (and when) can we automatically pull user-generated data down into the development environments?

There are three types of things in your database you need to worry about:

1) The schema, which is defined in DDL. 2) Static or reference data, which is defined in DML. 3) Dynamic (or user) data, which is also defined in DML.

Normally, changes to (1) and (2) must go up to production together with the code they're mutually dependent on. (3) should never go up, but can be copied down to the development environments if those need to be synchronized with production at that point.
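That copy-down step can be sketched in a few lines. Below is a minimal sketch using Python's `sqlite3`, assuming a hypothetical `pages` table holds the dynamic user data; schema and reference tables are protected simply by never being listed:

```python
import sqlite3

# Hypothetical split: 'pages' holds dynamic (user) data. Reference
# tables and the schema itself are untouched because only the tables
# named here are ever copied.
USER_TABLES = ["pages"]

def pull_user_data(prod: sqlite3.Connection, dev: sqlite3.Connection) -> None:
    """Replace the dev copies of the user tables with production rows."""
    for table in USER_TABLES:
        rows = prod.execute(f"SELECT * FROM {table}").fetchall()
        dev.execute(f"DELETE FROM {table}")  # dev user data is disposable
        if rows:
            placeholders = ",".join("?" * len(rows[0]))
            dev.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dev.commit()
```

In a real multi-server setup the same idea would run over a dump/restore or a replication tool rather than two open connections, but the direction rule is the same: user data only ever flows down.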

Obviously it's a lot more complicated than that. Getting (1) up may require transforming the existing schema DDL into specific ALTER statements, and could also require data manipulation if data types or locations are changing. Getting (2) up requires synchronization with the code build, which can become hard in complex environments.
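As a concrete example of that kind of transform, here is a minimal hand-written migration (Python with `sqlite3`; the `pages` table and `slug` column are hypothetical) that turns a schema change into an ALTER plus the data manipulation to backfill existing rows:

```python
import sqlite3

def column_names(db: sqlite3.Connection, table: str) -> list:
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
    return [row[1] for row in db.execute(f"PRAGMA table_info({table})")]

def migrate_add_slug(db: sqlite3.Connection) -> None:
    """Add a 'slug' column to 'pages' and backfill it from existing titles.
    The guard makes it idempotent, so re-running on a migrated DB is safe."""
    if "slug" not in column_names(db, "pages"):
        db.execute("ALTER TABLE pages ADD COLUMN slug TEXT")
        db.execute("UPDATE pages SET slug = lower(replace(title, ' ', '-'))")
        db.commit()
```

The point is that the ALTER and the backfill travel together: shipping only the DDL would leave production rows with NULL slugs.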

There are many tools available for this, and if you want automation you'll most likely need advice from someone familiar with them.

I personally use a simple scheme where every schema change is reflected in a SQL build script, and is also added to a changes SQL script that includes all the SQL needed to perform the required migration. This works for me, but my scenario is very simple (one person, one server) and is therefore atypical.
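A minimal sketch of that scheme, using Python's `sqlite3` and inline change scripts standing in for versioned `.sql` files, might look like this:

```python
import sqlite3

# Hypothetical ordered change scripts; in a real project each entry
# would live in its own versioned .sql file checked into source control.
CHANGES = {
    1: "CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT)",
    2: "ALTER TABLE pages ADD COLUMN published INTEGER DEFAULT 0",
}

def apply_changes(db: sqlite3.Connection) -> None:
    """Apply every change script newer than the DB's recorded version."""
    db.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    current = db.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
    for version in sorted(CHANGES):
        if version > current:
            db.execute(CHANGES[version])
            db.execute("INSERT INTO schema_version VALUES (?)", (version,))
    db.commit()
```

Because the applied version is recorded in the database itself, the same runner works unchanged against local, integration, staging, and production: each environment just fast-forwards to the latest script.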

Nevertheless, the key to any success is capturing the work required for the change at the time the change is made. The naive approach of just altering the development DB and then working out the delta at deployment time is a recipe for disaster.

For the first part, the best thing is to use a layering system, where your content is the "default" and user content "overrides" it, and the two kinds of content live in two different places. That way you can update your stuff with no possibility of having to deal with any kind of merge process. Your data could be in a different directory, or flagged differently in the DB, whatever. The downside is that you can't just use basic out-of-the-box server functionality to serve the content up, unless you do the actual mapping when the links are generated.

i.e. you can either intercept /site/images/icon.png and serve either /site/images/default/icon.png or /site/images/client/icon.png, or, when you write the pages out, you can do the check there and emit the correct URLs so the server can serve them directly.
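The interception approach might be sketched like this (Python; the directory layout and names are hypothetical, mirroring the example paths above):

```python
from pathlib import Path

def resolve_asset(url_path: str, root: str) -> Path:
    """Map e.g. /site/images/icon.png to the client override if one
    exists on disk, otherwise fall back to the default copy."""
    parts = url_path.lstrip("/").split("/")
    # Insert the 'client' or 'default' layer directory before the filename.
    client = Path(root, *parts[:-1], "client", parts[-1])
    default = Path(root, *parts[:-1], "default", parts[-1])
    return client if client.exists() else default
```

The same check could instead run at page-render time to emit the layered URL directly, which keeps the web server itself dumb.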

For the second part, regarding when you can use the clients' data, that's more of a legal/IP issue that you'd need to discuss with the client: whether they're a) willing and b) even able to share their content with you. After that, the tech is easy.