Replicate Schemas Between Two Autonomous Databases in Different Regions on OCI Using Oracle GoldenGate 19c Marketplace

Oracle Autonomous Database now supports capturing data with an Extract and applying it to another Autonomous Database in a different region. Previously, GoldenGate could only replicate to an Autonomous Database as a downstream (target) database; with this recently released feature you can capture transactions at the source and replicate them to another Autonomous Database, so Autonomous Databases can now act as upstream (source) databases as well. This is useful when you need to build geographically distributed applications on Autonomous Database.

Steps

1. Provision two Transaction Processing Autonomous Databases, one in Sydney and one in Ashburn (a CLI sketch follows the database list below)

Australia East (Sydney) – Source : ProjectSYD
US East (Ashburn) – Target : ProjectDR
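
If you prefer the OCI CLI to the console, a minimal sketch of the provisioning call is shown below; the compartment OCID and admin password are placeholders, and the shape values are only illustrative. Repeat the command with --db-name ProjectDR and --region us-ashburn-1 for the target.

# Sketch only: compartment OCID and admin password are placeholders
oci db autonomous-database create \
  --compartment-id ocid1.compartment.oc1..exampleuniqueid \
  --db-name ProjectSYD \
  --display-name ProjectSYD \
  --db-workload OLTP \
  --cpu-core-count 1 \
  --data-storage-size-in-tbs 1 \
  --admin-password 'YourStrongPassword#1' \
  --region ap-sydney-1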

2. Provision GoldenGate 19c Marketplace (Microservices Edition for Oracle) and select Oracle 19c for both the Source and Target deployments; do not select the Autonomous option for Deployment 2

See: https://www.youtube.com/watch?v=dQbcrH8wVDs

3. Log in to the GoldenGate compute VM and get the credentials

$ cat ~/ogg-credentials.json
{"username": "oggadmin", "credential": "**********"}

Log in to https://<public-ip> with the credentials displayed above. Under GoldenGate deployments > ServiceManager you can check the Config Home parameter, which points to /u02/deployments/ServiceManager/etc/conf.
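
If you prefer the command line, you can pull the credential from that JSON and query the ServiceManager REST API directly. A sketch, assuming jq is installed on the VM and ServiceManager answers on https://localhost with a self-signed certificate (hence -k):

# Assumes jq is available; otherwise read the value from the file manually
OGG_PASS=$(jq -r '.credential' ~/ogg-credentials.json)

# List the deployments registered with ServiceManager
curl -sk -u oggadmin:"$OGG_PASS" https://localhost/services/v2/deployments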

4. Check the deployment registry file, which shows the Source and Target directory structure

$ cat /u02/deployments/ServiceManager/etc/conf/deploymentRegistry.dat

"Source": {
  "environment": [
    {
      "name": "TNS_ADMIN",
      "value": "/u02/deployments/Source/etc"
    }
  ]
}

"Target": {
  "environment": [
    {
      "name": "TNS_ADMIN",
      "value": "/u02/deployments/Target/etc"
    }
  ]
}


5. Copy the wallets from both the Source and Target Autonomous Databases to the TNS_ADMIN directories shown in the command above

-rw-r--r--@ 1 shadab staff 20K 7 Jan 13:02 Wallet_ProjectSYD.zip
-rw-r--r--@ 1 shadab staff 20K 7 Jan 13:26 Wallet_ProjectDR.zip

$ sftp -i "mydemo_vcn.priv" opc@<public-ip-of-GoldenGateVM>

sftp> put Wallet_ProjectSYD.zip
Uploading Wallet_ProjectSYD.zip to /home/opc/Wallet_ProjectSYD.zip
Wallet_ProjectSYD.zip 100% 20KB 1.2MB/s 00:00

sftp> put Wallet_ProjectDR.zip
Uploading Wallet_ProjectDR.zip to /home/opc/Wallet_ProjectDR.zip
Wallet_ProjectDR.zip

6. Unzip the wallets into the Source and Target directories and change the WALLET_LOCATION parameter in each sqlnet.ora file to point to the respective TNS_ADMIN directory (a scripted alternative is sketched after the examples below)

$ cp -p Wallet_ProjectSYD.zip /u02/deployments/Source/etc
$ cp -p Wallet_ProjectDR.zip /u02/deployments/Target/etc

$ cd /u02/deployments/Source/etc
$ unzip Wallet_ProjectSYD.zip
Archive: Wallet_ProjectSYD.zip
inflating: README
inflating: cwallet.sso
inflating: tnsnames.ora
inflating: truststore.jks
inflating: ojdbc.properties
inflating: sqlnet.ora
inflating: ewallet.p12
inflating: keystore.jks

$ cd /u02/deployments/Target/etc
$ unzip Wallet_ProjectDR.zip
Archive: Wallet_ProjectDR.zip
inflating: README
inflating: cwallet.sso
inflating: tnsnames.ora
inflating: truststore.jks
inflating: ojdbc.properties
inflating: sqlnet.ora
inflating: ewallet.p12
inflating: keystore.jks

e.g. Source
vi /u02/deployments/Source/etc/sqlnet.ora

WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="/u02/deployments/Source/etc")))
SSL_SERVER_DN_MATCH=yes

Target
vi /u02/deployments/Target/etc/sqlnet.ora

WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="/u02/deployments/Target/etc")))
SSL_SERVER_DN_MATCH=yes
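
If you would rather script this edit than use vi, a small sketch is below. It assumes the wallet's default sqlnet.ora uses DIRECTORY="..." with double quotes, which the ADB wallets normally do; check the file first if in doubt.

# Point WALLET_LOCATION at each deployment's TNS_ADMIN directory
for dep in Source Target; do
  sed -i "s|DIRECTORY=\"[^\"]*\"|DIRECTORY=\"/u02/deployments/${dep}/etc\"|" \
    /u02/deployments/${dep}/etc/sqlnet.ora
done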

7. — Source Setup —

a. Create the schema and table that need to be replicated
$ cd /u02/deployments/Source/etc

$ /u01/app/client/oracle19/bin/sql /nolog

set cloudconfig /u02/deployments/Source/etc/Wallet_ProjectSYD.zip
show tns

connect admin/Rabbithole321#@projectsyd_high

create user goldengateusr identified by PassW0rd_#21 default tablespace DATA quota unlimited on DATA;
create table goldengateusr.accounts (id number primary key, name varchar2(100));
insert into goldengateusr.accounts values (1,'Shadab');
commit;
select * from goldengateusr.accounts;

b. Unlock ggadmin user and enable supplemental log data

alter user ggadmin identified by PassW0rd_#21 account unlock;
ALTER PLUGGABLE DATABASE ADD SUPPLEMENTAL LOG DATA;
select minimal from dba_supplemental_logging;

select to_char(current_scn) from v$database;
16767325762804
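
This SCN is reused twice, for ADD EXTRACT below and for the FLASHBACK_SCN Data Pump export later, so record it. If you want to capture it from the shell instead of copying it by hand, a sketch using the same admin credentials and client paths as the rest of this post:

# Capture the current SCN into a shell variable for reuse in later steps
export ORACLE_HOME='/u01/app/client/oracle19'
export TNS_ADMIN='/u02/deployments/Source/etc'
SCN=$(/u01/app/client/oracle19/bin/sqlplus -s admin/Rabbithole321#@projectsyd_high <<'EOF'
set heading off feedback off pagesize 0
select to_char(current_scn) from v$database;
EOF
)
echo "Current SCN: $SCN"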

c. Create extract parameter file

$ mkdir /u02/trails/dirdat
$ vi /u02/deployments/Source/etc/conf/ogg/ext1.prm

EXTRACT ext1
USERID ggadmin@projectsyd_high, PASSWORD PassW0rd_#21
EXTTRAIL ./dirdat/sy
DDL INCLUDE MAPPED
TABLE goldengateusr.*;

d. Add the extract to source
$ /u01/app/ogg/oracle19/bin/adminclient

CONNECT https://localhost/ deployment Source as oggadmin password DFA9zOjlh0GY%GpI !

ALTER CREDENTIALSTORE ADD USER ggadmin@projectsyd_high PASSWORD PassW0rd_#21 alias projectsyd_high

DBLOGIN USERIDALIAS projectsyd_high

ADD EXTRACT ext1, INTEGRATED TRANLOG, SCN 16767325762804
REGISTER EXTRACT ext1 DATABASE
ADD EXTTRAIL ./dirdat/sy, EXTRACT ext1

START EXTRACT ext1
INFO EXTRACT ext1, DETAIL

The status should be 'running'.
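
To go a bit deeper than INFO when confirming that the extract is actually capturing, the following Admin Client commands (run from the same session) are useful:

STATS EXTRACT ext1
LAG EXTRACT ext1
INFO ALL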

e. Insert rows in source table

/* Insert more rows in the source table */
insert into goldengateusr.accounts values (2,'John Doe');
insert into goldengateusr.accounts values (3,'Mary Jane');
commit;

f. Take a Data Pump export of the schema up to the captured SCN, writing to the internal directory DATA_PUMP_DIR

export ORACLE_HOME='/u01/app/client/oracle19'
export TNS_ADMIN='/u02/deployments/Source/etc'

$ /u01/app/client/oracle19/bin/expdp ADMIN/Rabbithole321#@projectsyd_high directory=DATA_PUMP_DIR dumpfile=export01.dmp logfile=export.log schemas=goldengateusr FLASHBACK_SCN=16767325762804

g. Create a bucket, an Auth Token for access, and a DBMS_CLOUD credential to copy the export dump to your Object Storage bucket
Create a bucket in your tenancy called 'datapump' and create an Auth Token for an OCI user that has read/write permissions on this bucket (a CLI sketch follows)
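
A CLI sketch for this step, in case you prefer it to the console; the compartment and user OCIDs are placeholders, and the token value is shown only once, so save it:

# Create the bucket that will hold the Data Pump dump
oci os bucket create --compartment-id ocid1.compartment.oc1..exampleuniqueid --name datapump

# Create an Auth Token for the OCI user that DBMS_CLOUD will authenticate as
oci iam auth-token create --user-id ocid1.user.oc1..exampleuniqueid --description "datapump transfer"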

$ /u01/app/client/oracle19/bin/sqlplus admin/Rabbithole321#@projectsyd_high

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'LOAD_DATA',
    username        => 'oracleidentitycloudservice/shadab.mohammad@oracle.com',
    password        => 'CR+R1#;4o5M[HJPgsn);'
  );
END;
/

-- BEGIN
--   DBMS_CLOUD.DROP_CREDENTIAL(credential_name => 'LOAD_DATA');
-- END;
-- /

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    'LOAD_DATA',
    'https://objectstorage.ap-sydney-1.oraclecloud.com/n/ocicpm/b/datapump/',
    'DATA_PUMP_DIR',
    'export01.dmp'
  );
END;
/

select object_name, bytes from dbms_cloud.list_objects('LOAD_DATA','https://objectstorage.ap-sydney-1.oraclecloud.com/n/ocicpm/b/datapump/');
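
The same check can be done from the OCI CLI, assuming your CLI config resolves the tenancy's Object Storage namespace:

# Confirm the dump file landed in the bucket
oci os object list --bucket-name datapump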

8. — Target Setup —

a. Create DBMS_CLOUD credential on target
$ cd /u02/deployments/Target/etc/

export ORACLE_HOME='/u01/app/client/oracle19'
export TNS_ADMIN='/u02/deployments/Target/etc/'

$ /u01/app/client/oracle19/bin/sqlplus admin/Rabbithole321#@projectdr_high

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'LOAD_DATA',
    username        => 'oracleidentitycloudservice/shadab.mohammad@oracle.com',
    password        => 'CR+R1#;4o5M[HJPgsn);'
  );
END;
/

select object_name, bytes from dbms_cloud.list_objects('LOAD_DATA','https://objectstorage.ap-sydney-1.oraclecloud.com/n/ocicpm/b/datapump/');

b. Unlock ggadmin user on target and enable supplemental log data

alter user ggadmin identified by PassW0rd_#21 account unlock;
ALTER PLUGGABLE DATABASE ADD SUPPLEMENTAL LOG DATA;
select minimal from dba_supplemental_logging;

c. Import the Data Pump dump from the customer bucket into the target ADB

$ /u01/app/client/oracle19/bin/impdp admin/Rabbithole321#@projectdr_high credential=LOAD_DATA schemas=goldengateusr directory=DATA_PUMP_DIR dumpfile=https://objectstorage.ap-sydney-1.oraclecloud.com/n/ocicpm/b/datapump/o/export01.dmp logfile=import.log
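
Before wiring up the replicat, it may be worth a quick sanity check (from the same admin connection) that the import landed:

-- Should return the three rows exported from the source
select count(*) from goldengateusr.accounts;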

d. Create replicat parameter file

$ vi /u02/deployments/Target/etc/conf/ogg/repl1.prm

REPLICAT repl1
USERID ggadmin@projectdr_high, PASSWORD PassW0rd_#21
MAP goldengateusr.*, TARGET goldengateusr.*;

e. Create replicat in Target ADB

$ /u01/app/ogg/oracle19/bin/adminclient

CONNECT https://localhost deployment Target as oggadmin password DFA9zOjlh0GY%GpI !
ALTER CREDENTIALSTORE ADD USER ggadmin@projectdr_high PASSWORD PassW0rd_#21 alias projectdr_high
DBLOGIN USERIDALIAS projectdr_high

ADD CHECKPOINTTABLE ggadmin.chkpt
ADD REPLICAT repl1, EXTTRAIL ./dirdat/sy, CHECKPOINTTABLE ggadmin.chkpt

START REPLICAT repl1
INFO REPLICAT repl1, DETAIL

The status should be 'running'.

9. Now that replication has started, insert a few records into the source table and you should see them in the target database. Review /u02/deployments/Target/var/log/ggserr.log for any replication errors (a quick check for errors is sketched at the end of this step).

–Source–
export ORACLE_HOME='/u01/app/client/oracle19'
export TNS_ADMIN='/u02/deployments/Source/etc/'

$ /u01/app/client/oracle19/bin/sqlplus admin/Rabbithole321#@projectsyd_high

select * from goldengateusr.accounts;

insert into goldengateusr.accounts values (4,'Foo Bar');
insert into goldengateusr.accounts values (5,'Dummy Value');
commit;

–Target —
export ORACLE_HOME='/u01/app/client/oracle19'
export TNS_ADMIN='/u02/deployments/Target/etc/'

$ /u01/app/client/oracle19/bin/sqlplus admin/Rabbithole321#@projectdr_high

select * from goldengateusr.accounts;

We should now be able to see the new records in the Target DR Database.
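
If rows do not show up, a quick sketch for spotting problems on the apply side: grep the target deployment's error log, and run STATS REPLICAT repl1 or LAG REPLICAT repl1 from the Admin Client for the replicat's own view.

# Scan the target deployment log for recent errors or warnings
grep -iE 'ERROR|WARNING' /u02/deployments/Target/var/log/ggserr.log | tail -20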

10. Since we have included DDL in the Extract (DDL INCLUDE MAPPED), we can also create a table at the source and it will automatically be replicated to the target

–Source–
create table goldengateusr.cardholder (id number primary key, cardno varchar2(30));

insert into goldengateusr.cardholder values(1,'1234-5677-9876-8765');
commit;

–Target —
desc goldengateusr.cardholder ;
select * from goldengateusr.cardholder ;

References:
————
[1] https://blogs.oracle.com/dataintegration/free-goldengate-software-on-oci-marketplace
[2] https://docs.oracle.com/en/cloud/paas/autonomous-data-warehouse-cloud/tutorial-getting-started-autonomous-db/index.html
[3] https://docs.oracle.com/en/middleware/goldengate/core/19.1/oracle-db/configure-autonomous-database-capture-replication.html#GUID-6AF0D1AC-FA05-41E8-ADA2-2F6820C68D5C
[4] https://docs.oracle.com/en/middleware/goldengate/core/19.1/oracle-db/using-ogg-autonomous-databases.html#GUID-660E754E-B9A6-48DD-AA66-0D6B66A022CD


Article by: Shadab Mohammad