I have two Oracle tables, an old one and its replacement. The old one was poorly designed (much more so than mine, actually), but there's lots of current data in it that needs to be migrated into the new table I created.

The new table has new columns, different columns.

I thought of just writing a PHP script or something like that with a lot of string replacement... clearly this is a dumb way of doing it though.

I would love to be able to clean up the data a little on the way too. Some of it was stored with markup in it (e.g. "Name" wrapped in tags), lots of blank space, etc., so I would like to fix all of that before putting it into the new table.

Does anybody have experience doing something like this? What should I do?

Thanks :)

I do this a lot - you can migrate with a simple select statement:

create table newtable as select
 field1,
 trim(oldfield2) as field3,
 cast(field3 as number(6)) as field4,
 (select pk from lookuptable where value = field5) as field5
 -- etc.
from
 oldtable

There is really very little you can do with an intermediate language like PHP, etc. that you can't do in native SQL when it comes to cleaning and transforming data.

For more complex cleanup, you could create a SQL function that does the heavy lifting, but I have cleaned up some pretty horrible data without resorting to that. Remember that in Oracle you have DECODE, CASE expressions, etc.
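For instance, here is a sketch of the kind of inline cleanup Oracle supports; the table and column names are hypothetical, but the functions (TRIM, REGEXP_REPLACE, CASE, DECODE) are all standard Oracle SQL:

```sql
-- Hypothetical example: strip markup, collapse whitespace,
-- and map legacy codes, all in plain Oracle SQL.
select
  trim(regexp_replace(old_name, '<[^>]+>', ''))  as name,    -- remove tags like <b>...</b>
  regexp_replace(trim(old_notes), '\s+', ' ')    as notes,   -- collapse runs of blank space
  case old_status
    when 'A' then 'ACTIVE'
    when 'I' then 'INACTIVE'
    else 'UNKNOWN'
  end                                            as status,
  decode(old_flag, 'Y', 1, 0)                    as flag     -- DECODE as a compact if/else
from oldtable;
```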

I'd check out an ETL tool like Pentaho Kettle. You'll be able to query the data from the old table, transform and clean it up, and re-insert it into the new table, all with a nice WYSIWYG tool.

Here's a previous question I answered regarding data migration and manipulation with Kettle:
Using Pentaho Kettle, how do I load multiple tables from a single table while keeping referential integrity?

If the data volumes aren't massive and if you're only going to do this once, then it will be hard to beat a roll-it-yourself program. Particularly if you have some custom logic that needs implementing. The time taken to download, learn & use a tool (such as Pentaho etc.) will probably not be worth it.

Coding a select *, updating the columns in memory & doing an insert will be quickly done in PHP or any other programming language.

That being said, if you're doing this frequently, then an ETL tool might be worth learning.

I'm working on a similar project myself - moving data from one model containing a few dozen tables to a somewhat different model with a similar number of tables.

I've taken the approach of creating a MERGE statement for each target table. The source query gets all the data it needs and formats it as required, then the merge works out whether the row already exists and updates/inserts as needed. That way, I can run the statement multiple times as I develop the solution.
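A sketch of that pattern (all table and column names here are made up for illustration): the USING subquery does the formatting, and MERGE makes the load safely re-runnable.

```sql
merge into newtable t
using (
  select old_id,
         trim(old_name)                   as name,
         cast(old_amount as number(10,2)) as amount
  from   oldtable
) s
on (t.id = s.old_id)
when matched then
  update set t.name   = s.name,
             t.amount = s.amount
when not matched then
  insert (id, name, amount)
  values (s.old_id, s.name, s.amount);
```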