Best Practices: Data Migration Using Oracle Data Pump

Background

I notice clients making many mistakes when designing their data migration strategy with Oracle Data Pump. In this article I will collect and share my findings and advice on planning and executing a database migration. The purpose is to help you avoid the most common pitfalls.

Migration Strategy

Even though the data migration strategy should be shaped by the specific data, workload and downtime window, I believe the migration approach should avoid creating huge data dumps: otherwise the success of the whole migration rests on importing one or a few of those huge files.

We can avoid this by doing the following on the source:

  1. Check that all constraints are validated (see the query after this list)
  2. If possible, export the schemas separately
  3. At schema level, do not export:
    1. indexes (except the PK ones)
    2. sequences
    3. DB links
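
The check in step 1 is a quick dictionary query; a minimal sketch, with the schema name as a placeholder:

SELECT owner, table_name, constraint_name, constraint_type
FROM dba_constraints
WHERE owner = 'SCHEMA_NAME'
AND validated = 'NOT VALIDATED';

Any rows returned should be validated (or consciously documented) before the export.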

And on the target, during the import:

  1. Avoid doing export/import at the database level
  2. Enable, but do not validate, the constraints
  3. Create the following via DDL scripts, which gives better control over parallelism (see the sketch after this list):
    1. the indexes
    2. the sequences
    3. the DB links
  4. Validate the constraints separately via a script after the import has completed successfully (also sketched after this list). In some cases this can even be done after the new DB system is taken into use
  5. If possible, run the job in interactive mode; this gives more options to intervene if anything goes wrong. With complex VPN setups that are prone to timing out, a utility like screen or tmux is a lifesaver (see the example after this list)
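
For step 3, a minimal sketch of such a DDL script; the object names, the parallel degree and the sequence start value are placeholders. Building the index with a high PARALLEL degree shortens the import window, and resetting it afterwards avoids leaving an unintended parallel degree on the index:

-- Build the index fast, then reset its parallel degree and logging
CREATE INDEX app.ix_orders_customer ON app.orders (customer_id) PARALLEL 8 NOLOGGING;
ALTER INDEX app.ix_orders_customer NOPARALLEL LOGGING;

-- Recreate sequences starting past the highest value used on the source
CREATE SEQUENCE app.orders_seq START WITH 1000001 CACHE 100;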
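
For steps 2 and 4, constraints can be enabled with NOVALIDATE during the import and validated later. A minimal sketch, with the schema, table and constraint names as placeholders:

-- Enable a constraint quickly without scanning the existing rows
ALTER TABLE app.orders ENABLE NOVALIDATE CONSTRAINT fk_orders_customer;

-- Later, generate the validation statements for the whole schema and run them
SELECT 'ALTER TABLE ' || owner || '.' || table_name ||
       ' ENABLE VALIDATE CONSTRAINT ' || constraint_name || ';'
FROM dba_constraints
WHERE owner = 'APP'
AND validated = 'NOT VALIDATED';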
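
For step 5, an example of running the import under tmux and using the Data Pump interactive prompt; the session name and parameter file name are assumptions:

# Start the job inside tmux so a dropped VPN session does not kill it
tmux new -s migration
impdp system parfile=import.par

# Detach with Ctrl-b d, re-attach later with: tmux attach -t migration
# Ctrl-C in the impdp client switches to the interactive prompt without
# stopping the job:
Import> status
Import> parallel=8
Import> continue_client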

Recommended Parameters

Export Parameters

cluster=N
exclude=cluster,indextype,db_link
parallel=n
schemas=schema_name
dumpfile=export%u.dmp
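
Assuming the parameters above are saved in a file named export.par (the file name is a placeholder), the export is started with:

expdp system parfile=export.par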

Import Parameters
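
A matching import parameter file could look like the following sketch. The exclude list and the log file name are assumptions in line with the advice above: indexes and sequences are skipped here and created via DDL scripts instead (DB links were already excluded at export time). Note that excluding indexes may also affect the indexes backing primary key constraints, so verify the outcome on a test import:

cluster=N
exclude=index,sequence
parallel=n
schemas=schema_name
dumpfile=export%u.dmp
logfile=import.log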
