How to import an Oracle dump from an EC2 instance to RDS in AWS

Question


I am new to RDS in AWS. I have a database dump file residing on an EC2 instance, and I have a few questions:

  1. How can I move the dump file to the RDS instance?
  2. What are the prerequisites for doing this? I mean, what client do I need to install to do it?

Any help or lead is appreciated.

Thanks


Answer 1:


I agree that the official documentation is quite confusing and not very clear when it comes to importing an Oracle dump file.

There is another document with step-by-step instructions from Amazon at http://d0.awsstatic.com/whitepapers/strategies-for-migrating-oracle-database-to-aws.pdf and also a presentation at http://fr.slideshare.net/AmazonWebServices/dat308-28616289

Basically, the process is:

  1. You need to transfer the files to the Amazon RDS DB instance. Amazon provides a predefined DATA_PUMP_DIR directory on RDS, so you need a script that moves your dump files from your EC2 instance into that DATA_PUMP_DIR.

The following Perl script will do the job for you:

use strict;
use warnings;
use DBI;

# RDS instance connection details -- fill in your own values
my $RDS_PORT  = 1521;
my $RDS_HOST  = "<my rds instance>";
my $RDS_LOGIN = "<myuser>/*******";
my $RDS_SID   = "<ORCL_LIKE>";

# The file to upload is passed as the first argument to the script;
# it is written into the predefined DATA_PUMP_DIR on the RDS side.
my $dirname = "DATA_PUMP_DIR";
my $fname   = $ARGV[0];

my $data  = "dummy";
my $chunk = 8192;   # transfer the file in 8 KB chunks

# UTL_FILE calls executed on the RDS side; the remote file handle is kept
# in a small helper package so it survives across statements.
my $sql_open   = "BEGIN perl_global.fh := utl_file.fopen(:dirname, :fname, 'wb', :chunk); END;";
my $sql_write  = "BEGIN utl_file.put_raw(perl_global.fh, :data, true); END;";
my $sql_close  = "BEGIN utl_file.fclose(perl_global.fh); END;";
my $sql_global = "create or replace package perl_global as fh utl_file.file_type; end;";

my $conn = DBI->connect('dbi:Oracle:host='.$RDS_HOST.';sid='.$RDS_SID.';port='.$RDS_PORT, $RDS_LOGIN, '')
    || die($DBI::errstr . "\n");

my $updated = $conn->do($sql_global);

# Open the remote file for writing in binary mode
my $stmt = $conn->prepare($sql_open);
$stmt->bind_param_inout(":dirname", \$dirname, 128);
$stmt->bind_param_inout(":fname", \$fname, 128);
$stmt->bind_param_inout(":chunk", \$chunk, 4);
$stmt->execute() || die($DBI::errstr . "\n");

open(my $inf, '<', $fname) || die "\nCan't open $fname for reading: $!\n";
binmode($inf);

# Stream the local file to RDS chunk by chunk; ora_type 24 (ORA_RAW)
# tells DBD::Oracle to bind the buffer as RAW data. The loop ends at
# EOF so we never call put_raw with an empty buffer.
$stmt = $conn->prepare($sql_write);
my %attrib = (ora_type => 24);
while (my $val = read($inf, $data, $chunk)) {
    $stmt->bind_param(":data", $data, \%attrib);
    $stmt->execute() || die($DBI::errstr . "\n");
}
die "Problem copying: $!\n" if $!;
close($inf) || die "Can't close $fname: $!\n";

# Close the remote file handle
$stmt = $conn->prepare($sql_close);
$stmt->execute() || die($DBI::errstr . "\n");
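To run this script you need Perl with the DBI and DBD::Oracle modules on the EC2 instance (DBD::Oracle in turn requires an Oracle client, such as Oracle Instant Client), which also answers the prerequisites question. Assuming you save it as upload_dump.pl (a name used here just for illustration), you pass the dump file name as the only argument:

perl upload_dump.pl HOST_01.dmp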

Make sure all your files arrived on the RDS instance:

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by FILENAME;

  2. Import your data from the dump files into your RDS instance.

You can run impdp from the EC2 instance (provided the Oracle client tools are installed there), or you can submit a job server-side using the DBMS_DATAPUMP PL/SQL API.
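
A minimal sketch of the impdp call, assuming the dump file name used below and placeholder connection details (adapt the host, service name, and credentials to your instance):

impdp <myuser>/<password>@<my rds instance>:1521/<service name> DIRECTORY=DATA_PUMP_DIR DUMPFILE=HOST_01.dmp LOGFILE=IMPORT.LOG FULL=Y

The equivalent job submitted through the DBMS_DATAPUMP API: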

declare
  h1    NUMBER;
begin
  -- create a full-database import job
  h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'FULL', job_name => 'IMPORT_DUMP', version => 'COMPATIBLE');

  -- register the import log file
  dbms_datapump.add_file(handle => h1, filename => 'IMPORT.LOG', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- optional: run the import with up to 8 parallel workers
  dbms_datapump.set_parallel(handle => h1, degree => 8);

  -- drop the Data Pump master table once the job completes
  dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);

  -- register each dump file to import
  dbms_datapump.add_file(handle => h1, filename => 'HOST_01.dmp', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  -- <repeat the add_file call for each of your dump files>

  dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
  dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
  dbms_datapump.set_parameter(handle => h1, name => 'REUSE_DATAFILES', value => 0);
  dbms_datapump.set_parameter(handle => h1, name => 'SKIP_UNUSABLE_INDEXES', value => 0);

  -- start the job and detach; it keeps running on the server
  dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
  dbms_datapump.detach(handle => h1);
end;
/
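
Because the block detaches from the job, it returns immediately while the import keeps running on the server. You can follow its progress by reading the log registered above with the rdsadmin file utility that RDS provides:

select * from table(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR','IMPORT.LOG'));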

Once the import is done, the data is available in your RDS database. You can then clean up your files from the DATA_PUMP_DIR; the following script will do the job:

-- remove files from DATA_PUMP_DIR once the import has finished
begin
  utl_file.fremove ('DATA_PUMP_DIR','IMPORT.LOG');
  utl_file.fremove ('DATA_PUMP_DIR','<name of file>.dmp');
end;
/
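
You can re-run the LISTDIR query from step 1 to confirm the files are gone.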


Source: https://stackoverflow.com/questions/23486198/how-to-import-oracle-dump-from-ec2-instance-to-rds-in-aws
