Wednesday, November 10, 2010

Migration time: Oracle Database Machine/Exadata



Guess what I've been doing lately??!! Taking vacations? Well, that was in September (and that's another post). Watching movies? Wish I could... none of that, but something more exciting: migrating multi-terabyte datamarts to Oracle DBMs/Exadatas!!!

My first attempt back in March was Data Pump, and it worked fine up to database sizes around 2 TB using import over the network (through a database link), which is cool, but has the following caveat: you cannot disable the estimation phase when importing over a database link, so a lot of time is spent just measuring how much data Data Pump is going to transfer.
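For reference, a network-mode import is driven entirely from the target side and moves data without intermediate dump files. Here is a minimal parameter-file sketch; the link name, schema, and directory are hypothetical, not from my actual migrations:

```
# impdp parameter file (hypothetical names) -- network-mode import
NETWORK_LINK=src_dblink        # database link pointing at the source
SCHEMAS=DM_OWNER               # schema(s) to pull across
PARALLEL=8
DIRECTORY=DATA_PUMP_DIR        # in network mode, used only for the log file
LOGFILE=dm_net_import.log
# ESTIMATE=STATISTICS can make the estimate phase cheaper than the
# default (BLOCKS), but the phase itself cannot be skipped over a link
```

You would invoke it on the target with something like `impdp system@target parfile=dm_net.par`.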

However, lately I've been using something we call O2O migration (Oracle-to-Oracle): a set of tools and procedures developed by Oracle with SAP database systems in mind. Those systems easily get into the terabyte league and, on top of that, have critical downtime windows, so the migration needs to perform super-fast... and it does!!!

On my last migration, it took about 36 hours to make a clone of a 10 TB datamart. Results may vary according to the source hardware configuration; things like the storage setup and vendor, or the number of CPUs, factor in. Of course, on the other side you also need to take into account the DBM/Exadata sizing.

This is an excellent alternative you may want to consider, but remember this tool is not included with the database: it is sold as a service by Oracle.


2 comments:

Surachart Opun said...

How can I use O2O tool?
I need to migrate... to Exadata.

Anonymous said...

Another alternative to this, and one that is included with the database, is duplicate from active database (11g onwards). This works very well: I moved a 27 TB datamart from one datacenter to another using this method. It pushes the datafiles over the network using TNS.
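For anyone curious, the active duplication the commenter mentions is driven from RMAN. A minimal sketch follows; the connect strings and database name are hypothetical, and the auxiliary instance must already be started NOMOUNT with a password file in place:

```
# Start RMAN connected to both source and destination, e.g.:
#   rman TARGET sys@srcdb AUXILIARY sys@exadb
# Then issue the duplication command (hypothetical names):
DUPLICATE TARGET DATABASE TO exadb
  FROM ACTIVE DATABASE
  SPFILE
  NOFILENAMECHECK;
```

RMAN then copies the datafiles directly from the running source over SQL*Net, which is why no staging area or dump files are needed.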
