How to export using DATA_PUMP to S3 bucket?

Submitted by 放荡痞女 on 2019-12-10 11:43:31

Question


We have an RDS (Oracle) instance, and I need to export a specific schema into a dump file. The export works and copies the dump file into DATA_PUMP_DIR. The issue is that RDS does not provide file-system access to that directory.

I need the exported DMP file either on S3 or copied to another EC2 instance.

The article: LINK talks about copying a data dump file between two RDS instances, but not to S3 or EC2.


Answer 1:


There are several ways to solve this problem. First option:

  1. Install the free Oracle XE database on an EC2 instance (it is quick and easy).
  2. Export the schema from the RDS instance into the DATA_PUMP_DIR directory. Use the DBMS_DATAPUMP package, or run expdp user/pass@rds on the EC2 instance, to create the dump file.
  3. Create a database link on the RDS instance between the RDS DB and the Oracle XE DB.

If you are creating a database link between two DB instances inside the same VPC or peered VPCs, the two DB instances must have a valid route between them. See Adjusting Database Links for Use with DB Instances in a VPC.

  4. Copy the dump file from the RDS instance to the Oracle XE DB on EC2 using DBMS_FILE_TRANSFER.PUT_FILE over the database link.

  5. Copy the file from the DATA_PUMP_DIR directory of Oracle XE on the EC2 instance to S3.
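Steps 2 and 4 above can be sketched in a single PL/SQL block run on the RDS instance. This is only a minimal sketch: the schema name MY_SCHEMA, the file name my_schema.dmp, and the database link name TO_XE are all hypothetical placeholders.

    -- Run on the RDS instance. MY_SCHEMA, my_schema.dmp and the database
    -- link TO_XE (pointing at the Oracle XE DB on EC2) are placeholders.
    declare
       v_hdl   number;
       v_state varchar2(30);
    begin
       -- Step 2: export the schema into DATA_PUMP_DIR with DBMS_DATAPUMP
       v_hdl := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA');
       dbms_datapump.add_file(handle    => v_hdl,
                              filename  => 'my_schema.dmp',
                              directory => 'DATA_PUMP_DIR');
       dbms_datapump.metadata_filter(handle => v_hdl,
                                     name   => 'SCHEMA_EXPR',
                                     value  => 'IN (''MY_SCHEMA'')');
       dbms_datapump.start_job(v_hdl);
       dbms_datapump.wait_for_job(v_hdl, v_state);

       -- Step 4: push the dump file to the XE instance over the database link
       dbms_file_transfer.put_file(
          source_directory_object      => 'DATA_PUMP_DIR',
          source_file_name             => 'my_schema.dmp',
          destination_directory_object => 'DATA_PUMP_DIR',
          destination_file_name        => 'my_schema.dmp',
          destination_database         => 'TO_XE');
    end;
    /

After this runs, the dump file sits in the XE instance's DATA_PUMP_DIR on the EC2 host, where it can be copied to S3 with the AWS CLI.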

Second option. Use the obsolete exp utility for the export. It has restrictions on exporting certain data types and is slower.

  1. Run exp user/password@rds on the EC2 instance.
  2. Copy the files from the export directory on the EC2 instance to S3.
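On the EC2 instance the two steps amount to something like the following. The connect string, schema name, file name, and bucket name are placeholders, and the AWS CLI must be configured with credentials that can write to the bucket.

    # Run on the EC2 instance; MY_SCHEMA, my_schema.dmp and my-bucket are placeholders.
    exp user/password@rds OWNER=MY_SCHEMA FILE=my_schema.dmp LOG=my_schema.log
    aws s3 cp my_schema.dmp s3://my-bucket/my_schema.dmp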

Original Export is desupported for general use as of Oracle Database 11g. The only supported use of Original Export in 11g is backward migration of XMLType data to a database version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the newer Data Pump Export and Import utilities, except in the situations that require Original Export and Import. See: Original Export and Import.




Answer 2:


Third option (the one I am using):

  1. Take a look at the alexandria-plsql-utils project, especially the amazon_aws_auth_pkg, amazon_aws_s3_pkg, and ftp_util_pkg packages.

  2. Install the required packages and their dependencies.

  3. Take your dump, then with the example code below you can copy the dump file from Amazon RDS Oracle into an S3 bucket.

    declare
       b_blob blob;
    begin
       -- read the dump file from DATA_PUMP_DIR into a BLOB
       b_blob := file_util_pkg.get_blob_from_file('DATA_PUMP_DIR', 'my_dump.dmp');
       -- authenticate and upload the BLOB to the S3 bucket
       amazon_aws_auth_pkg.init('aws_key_id', 'aws_secret', p_gmt_offset => 0);
       amazon_aws_s3_pkg.new_object('my-bucket-name', 'my_dump.dmp', b_blob, 'application/octet-stream');
    end;
    /




Answer 3:


It's now possible to directly access an S3 bucket from an Oracle database on RDS. Please have a look at the following documentation: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html

And here is the official announcement that this is supported: https://aws.amazon.com/about-aws/whats-new/2019/02/Amazon-RDS-for-Oracle-Now-Supports-Amazon-S3-Integration/?nc1=h_ls

It seems the earlier answers were written a little too early to catch this news. But this post still lists other good solutions, such as the database link.
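With the S3 integration option enabled on the RDS instance, the upload can be done directly from SQL. A minimal sketch following the AWS documentation linked above; the bucket name my-bucket is a placeholder, and the instance needs the S3_INTEGRATION option plus an IAM role with write access to the bucket.

    -- Run on the RDS instance; 'my-bucket' is a placeholder bucket name.
    -- Uploads every file in DATA_PUMP_DIR to the bucket and returns a task id.
    select rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
              p_bucket_name    => 'my-bucket',
              p_directory_name => 'DATA_PUMP_DIR')
           as task_id
      from dual;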



Source: https://stackoverflow.com/questions/48794500/how-to-export-using-data-pump-to-s3-bucket
