I have a MySQL dump from a WordPress database with some data. The whole dump is about 20 MB and roughly 500 queries, so it doesn't take long to load, but it isn't very fast either.

I want to use that database in my other application, and to do that I need to alter the schema, i.e. rename tables and columns, drop some columns, drop tables, create new tables and import data into them, etc.

The fact is, I'm going to touch almost every single column. So the question is: how should I write such a script?

I'd like to have a single script that I can run against a fresh dump and that converts the database into the required form. That means if I simply start writing ALTER TABLE queries and executing them one by one, I'll have to re-import the database every time I want to test the whole script, which doesn't seem very efficient to me. Is there a better way to do this?

Can I somehow run a bunch of queries, look at what they did, and then hit a button to roll the database back to its original state?

I'm not 100% sure I understand your question, but I think you might be looking for

CREATE TABLE new_table LIKE other_database.old_table;

That would let you keep a "canonical" version of the old database untouched. Your ALTER TABLE statements would then run against the new tables.
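
As a sketch of the pattern, assuming the pristine import lives in a database named wp_original and the working copy in wp_work (names are hypothetical; note that CREATE TABLE ... LIKE copies only the structure, so the rows have to be copied separately):

CREATE TABLE wp_work.wp_posts LIKE wp_original.wp_posts;          -- structure and indexes only
INSERT INTO wp_work.wp_posts SELECT * FROM wp_original.wp_posts;  -- now copy the rows
-- ALTER TABLE statements then run against wp_work.*,
-- leaving wp_original untouched as the canonical copy.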

Short of the ability to run your whole script as a single transaction, there is no simple one-click UNDO for these kinds of schema changes.
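
Bear in mind that in MySQL the transaction route wouldn't help here anyway: DDL statements such as ALTER TABLE cause an implicit commit, so they cannot be rolled back. A quick illustration with a hypothetical scratch table:

START TRANSACTION;
ALTER TABLE scratch ADD COLUMN extra INT;  -- implicitly commits the transaction
ROLLBACK;                                  -- has no effect; the new column stays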

Having said all that, if a developer came to me and said "I'm going to be routinely altering almost every single column", I'd see that as a big red flag, billowing in the wind.

If modifying the actual dataset seems inefficient to you, you could build an abstraction layer. The main abstraction mechanism for tables is views. You might be able to build a set of views over the imported database that project the shape of the dataset your application requires. This approach is almost certain to perform worse at runtime than physically transforming the dataset, and you also need to be aware of the restrictions on creating and using views in MySQL. Without the specifics of your particular situation, it's hard to give a concrete answer.
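
As a rough sketch, assuming the standard WordPress tables and a hypothetical articles shape that your application expects:

CREATE VIEW articles AS
SELECT ID           AS id,
       post_title   AS title,
       post_content AS body
FROM wp_posts
WHERE post_type = 'post'
  AND post_status = 'publish';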

Nevertheless, there is nothing inherently inefficient about a big script of ALTER TABLE statements.
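
A hypothetical fragment of such a script, purely to show the shape (the names are illustrative, not taken from your schema):

ALTER TABLE wp_users  RENAME TO app_users;
ALTER TABLE app_users DROP COLUMN user_activation_key;
ALTER TABLE app_users ADD COLUMN legacy_id BIGINT UNSIGNED NULL;
DROP TABLE IF EXISTS wp_commentmeta;

Run against a fresh copy of the dump, a script like this is perfectly repeatable.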

Can I somehow run a bunch of queries, look at what they did, and then hit a button to roll the database back to its original state?

Not via transactions, I don't believe. You can simply re-restore the database.
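
For example, a minimal re-restore cycle from the mysql client, assuming the dump file is called wordpress.sql and the working database is wp_work (both names hypothetical):

DROP DATABASE IF EXISTS wp_work;
CREATE DATABASE wp_work;
USE wp_work;
SOURCE wordpress.sql;  -- mysql client command; re-imports the dump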

What you are trying to achieve would be better described as an ETL task rather than a bunch of ALTER TABLEs. Don't alter the old schema: create your new schema separately and then import the data into it, transforming the format along the way, like this:

INSERT INTO NEWTABLE (FullName)
SELECT CONCAT(Name, Surname) AS FullName FROM OLDTABLE;
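
A slightly fuller sketch of the same idea, with hypothetical names: create the new schema separately, then populate it from the old table in one repeatable script (here also inserting a space between the name parts):

CREATE TABLE people (
    id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    full_name VARCHAR(255) NOT NULL
);

INSERT INTO people (full_name)
SELECT CONCAT(Name, ' ', Surname)
FROM OLDTABLE;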