Migrate 12.1 DBCS to ADW 19c using Data Pump

Architecture:

Source: DBCS Classic 12.1 Enterprise Edition

Target: OCI Gen 2 Autonomous Data Warehouse 19c


Pre-Req:
———
1. Install and configure the OCI CLI as 'root' (a minimal configuration sketch follows this list)
 Ref : https://docs.cloud.oracle.com/en-us/iaas/Content/API/SDKDocs/cliinstall.htm

 $ bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

2. Source DBCS Classic database provisioned: version 12.1.0.2 Enterprise Edition

3. Target ADW 19c instance created

4. SQL Developer installed on your local client machine to do the configuration part of ADW

5. Access to DBCS instance to run the export

6. Intermediary instance to run the impdp into Autonomous. You can install the newer client on the DBCS instance itself, but for production it is better to run the import from a separate instance.
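
Once the CLI is installed, it still needs a config file pointing at your tenancy. A minimal sketch of the setup, assuming you have your user OCID, tenancy OCID, and region handy (the setup dialog also generates an API signing key whose public part you upload to your user in the OCI console):

$ oci setup config        # interactive: prompts for user OCID, tenancy OCID, region and API key location
$ oci os ns get           # sanity check: should return your Object Storage namespace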


High-Level Steps:

—————–

  1. Export Dump File to Local Filesystem of DBCS Instance
  2. Install and configure the OCI CLI on the DBCS instance
  3. Run a multipart upload to the migration Object Storage bucket; for larger files you can explore rclone, Tsunami UDP, or even the Data Transfer Service (DTS)
  4. Install the 19c client on the intermediary instance to run the import
  5. Copy the ADW client credentials (wallet) file to the intermediary instance and run impdp into ADW


Detailed Steps:

—————–

1 — Install Oracle 19c Client with the Image Method —

Reference : https://support.oracle.com/epmos/faces/DocumentDisplay?_afrLoop=514763738647331&parent=EXTERNAL_SEARCH&sourceId=HOWTO&id=885643.1&_afrWindowMode=0&_adf.ctrl-state=o3w0ut0xp_4

Download 19c client from : https://www.oracle.com/database/technologies/oracle19c-linux-downloads.html

mkdir -p /home/opc/client/oracle/orainventory

mkdir -p /home/opc/client/oracle/19.3.0

unzip LINUX.X64_193000_client_home.zip -d /home/opc/client/oracle/19.3.0

cd /home/opc/client/oracle/19.3.0

vim ./inventory/response/client_install.rsp # add the required parameters (see the sketch after the silent install below)

chmod 600 ./inventory/response/client_install.rsp

Silent install
[opc@cloudinstance:~/client/oracle/19.3.0]$ ./runInstaller -silent -responseFile /home/opc/client/oracle/19.3.0/inventory/response/client_install.rsp
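
For reference, a hedged sketch of the kind of entries to set in client_install.rsp for this layout. The key names come from the response file template shipped with the client, so verify them against your copy before running the installer; the Administrator install type is just one option:

UNIX_GROUP_NAME=opc
INVENTORY_LOCATION=/home/opc/client/oracle/orainventory
ORACLE_HOME=/home/opc/client/oracle/19.3.0
ORACLE_BASE=/home/opc/client/oracle
oracle.install.client.installType=Administrator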



2 — With oracle user on DBCS —
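
First make sure the OS directory that will hold the dump exists; the path below matches the directory object created next:

$ mkdir -p /home/oracle/migratedump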

[oracle@Shadab-Migrate ~]$ sqlplus "/as sysdba"

SQL*Plus: Release 12.1.0.2.0 Production on Fri May 8 04:53:08 2020

Copyright (c) 1982, 2014, Oracle.  All rights reserved.


Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Oracle Label Security and Real Application Testing options

SQL> create or replace directory export_dir as '/home/oracle/migratedump';

Directory created.

SQL> grant read,write on directory export_dir to public;

Grant succeeded.

SQL> exit
Disconnected from Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production

$ expdp dumpfile=migratdumpfile.dmp logfile=migratedumpfile.log directory=export_dir full=y parallel=8
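
Two notes on the command above: expdp will prompt for a database login since no connect string is supplied, and with a single dump file the PARALLEL=8 workers cannot spread their writes across files. If you want genuinely parallel output you can use the %U substitution variable instead (each resulting piece then has to be uploaded and referenced in the impdp as well), for example:

$ expdp dumpfile=migratdumpfile_%U.dmp logfile=migratedumpfile.log directory=export_dir full=y parallel=8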



3 — With root user on DBCS —

$ oci os object put --namespace <tenancy-name> -bn <bucket-name> --file /home/oracle/migratedump/migratdumpfile.dmp --part-size 1 --parallel-upload-count 8

Upload ID: 919adb9c-**********-912d028c95fd
Split file into 3 parts for upload.
Uploading object  [####################################]  100%
{
  "etag": "309a3651-7************-e56ec9087433",
  "last-modified": "Fri, 08 May 2020 05:12:45 GMT",
  "opc-multipart-md5": "hVrtNH**************pcquPg==-3"
}
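
You can confirm the object landed in the bucket with a quick listing (same placeholders as above):

$ oci os object list -bn <bucket-name> --namespace <tenancy-name>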

Go to your bucket and create a pre-authenticated request (PAR) with read access for your dump file : https://objectstorage.ap-sydney-1.oraclecloud.com/n/<tenancy-name>/b/<bucket-name>/o/migratdumpfile.dmp

Copy the pre-authenticated request (PAR) URL:
https://objectstorage.ap-sydney-1.oraclecloud.com/p/N637dPGDlv0cnwUHGPFwMiNvJhugACKWN8qZKBmCbAU/n/<tenancy-name>/b/<bucket-name>/o/migratdumpfile.dmp
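
If you prefer the CLI over the console, the same PAR can be created with oci os preauth-request create; a sketch, where the PAR name and expiry timestamp are just examples:

$ oci os preauth-request create --namespace <tenancy-name> --bucket-name <bucket-name> --object-name migratdumpfile.dmp --access-type ObjectRead --name migrate-dump-par --time-expires 2020-06-30T00:00:00Z

The output contains an access-uri; prepend https://objectstorage.ap-sydney-1.oraclecloud.com to it to get the full PAR URL.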




4 — Import Configuration for ADW from SQL Developer —

create user migratetest IDENTIFIED by Abcde1234$## ;

grant create session to migratetest;

grant dwrole to migratetest;

GRANT UNLIMITED TABLESPACE TO migratetest;

/* I am running my import as ADMIN, but ideally you should run it as a dedicated user like the one created above */

/* Create the credential to run the import using the DBMS_CLOUD package. For Object Storage access this is typically your OCI cloud username and an auth token, not the database password. */

begin
DBMS_CLOUD.CREATE_CREDENTIAL(
credential_name=> 'ADWCS_CREDS',
username => 'admin',
password => '*********'
);
end;
/
 

 — If you need to re-create, drop the old one and add it again —

begin
DBMS_CLOUD.DROP_CREDENTIAL(credential_name=> 'ADWCS_CREDS');
end;
/
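
Before moving on, it is worth confirming that the credential exists and can actually reach the bucket. A sketch using the ALL_CREDENTIALS dictionary view and the DBMS_CLOUD.LIST_OBJECTS table function (same tenancy/bucket placeholders as earlier):

select credential_name, username, enabled from all_credentials;

select object_name, bytes from dbms_cloud.list_objects('ADWCS_CREDS', 'https://objectstorage.ap-sydney-1.oraclecloud.com/n/<tenancy-name>/b/<bucket-name>/o/');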
 




5 — On the instance from Step 1, run the import. Copy your credentials wallet file and store it in the $ORACLE_HOME/wallet directory —

$ mkdir -p  /home/opc/client/oracle/19.3.0/wallet

— Copy the wallet file from your Autonomous Database to the instance —
mv Wallet_PAYGDEV.zip /home/opc/client/oracle/19.3.0/wallet

— Set below parameters in bash profile —

export ORACLE_HOME=/home/opc/client/oracle/19.3.0
export ORACLE_BASE=/home/opc/client/oracle
export TNS_ADMIN=/home/opc/client/oracle/19.3.0/wallet
export PATH=$ORACLE_HOME/bin:$PATH

cd /home/opc/client/oracle/19.3.0/wallet

unzip Wallet_PAYGDEV.zip

$ vim sqlnet.ora

WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="/home/opc/client/oracle/19.3.0/wallet")))
SSL_SERVER_DN_MATCH=yes

$ tnsping paygdev_high

$ sqlplus admin/*********@paygdev_high

select * from database_properties where property_name='DEFAULT_CREDENTIAL';
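
If that query comes back empty and you would rather not pass the credential on every call, a default credential can be set once. This is optional here, since the impdp below passes CREDENTIAL explicitly; the example assumes the credential was created under the ADMIN schema:

ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.ADWCS_CREDS';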

Important : See MOS note before proceeding : https://support.oracle.com/epmos/faces/DocumentDisplay?_afrLoop=518062209629308&parent=EXTERNAL_SEARCH&sourceId=PROBLEM&id=2416814.1&_afrWindowMode=0&_adf.ctrl-state=9i6ti0fxv_4

impdp admin/*********@paygdev_high directory=data_pump_dir credential=ADWCS_CREDS dumpfile=https://objectstorage.ap-sydney-1.oraclecloud.com/p/N637dPGDlv0cnwUHGPFwMiNvJhugACKWN8qZKBmCbAU/n/<tenancy-name>/b/<bucket-name>/o/migratdumpfile.dmp parallel=16  encryption_pwd_prompt=yes partition_options=merge transform=segment_attributes:n transform=dwcs_cvt_iots:y exclude=cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link full=y
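
The import log is written to DATA_PUMP_DIR inside ADW (with no LOGFILE parameter the default name is import.log). To read it, push it to your bucket with DBMS_CLOUD.PUT_OBJECT and download it from there; a sketch using the same credential and placeholders:

begin
DBMS_CLOUD.PUT_OBJECT(
credential_name => 'ADWCS_CREDS',
object_uri => 'https://objectstorage.ap-sydney-1.oraclecloud.com/n/<tenancy-name>/b/<bucket-name>/o/import.log',
directory_name => 'DATA_PUMP_DIR',
file_name => 'import.log');
end;
/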


Article by: Shadab Mohammad