Which two methods are supported for migrating your on-premises Oracle Database to Oracle Autonomous Transaction Processing?

This post is part of a series of blog posts on the Oracle Autonomous Database.

The first thing to consider when migrating to Oracle Autonomous Database is how to load and move your data into the Autonomous Database. There are several options. When it comes to loading your data, the traditional tools you are accustomed to, such as Oracle Data Pump and SQL*Loader, can be used to load data into the Autonomous Database.

When it comes to moving data into the Autonomous Database, Oracle supports many different data formats, from SQL*Loader text files, to export/import dump files, to CSV and JSON files. As for object storage, ADB supports several third-party cloud object storage platforms, such as AWS S3 and Azure Blob Storage, as well as, of course, Oracle's own Object Storage. ADB also supports moving data from an OCI application layer, as well as from virtual machines hosted in your cloud tenancy.

When you look at migrating an existing database into the Autonomous Database, it's important to realize that a physical database cannot simply be migrated as-is. It is not possible to migrate by unplugging a PDB from an on-premises container database and plugging it into an Autonomous Database; this is currently not a supported option.

Also keep in mind that any legacy features that are not supported must be removed, for example legacy-style LOBs. In addition, using Data Pump export/import is required, and it supports source Oracle Database versions 10.1 and higher.

So when we look at approaches for moving data into the Autonomous Database, data can obviously be moved with Data Pump. The nice thing about this approach is that it eliminates legacy formats, it can upgrade across database versions, and it's platform-independent. It's a very portable way to load data into a pre-created, new Autonomous Database.

Another option to keep in mind is GoldenGate. GoldenGate replication can keep the on-premises, non-autonomous database up and running and online during the process, meaning that you can have a zero-downtime migration. Keep in mind that GoldenGate does have certain restrictions, for example around ROWID usage and nested tables. Migrating third-party applications that use the Oracle database can also be supported, but it's important to ensure that the vendor supports the Autonomous Database. So while this may be technically possible, it is definitely important to work with the vendor.

Talking about Oracle GoldenGate, it is possible to set up GoldenGate between your on-premises or other source system and use the Autonomous Database as the target database. Currently, only non-integrated Replicats are supported with the Autonomous Database, and the GoldenGate software release required on the source system is Oracle GoldenGate for Oracle Database release 12.3.0.1.2 or later. For GoldenGate, the Oracle Autonomous Database cannot be a source database, only a target, so please keep this in mind for rollback purposes.

So, back to the original question: which two methods are supported for migrating your on-premises Oracle Database to Oracle Autonomous Transaction Processing?

When we look at the different methods available to us, there are many approaches: the SQL Developer Data Import Wizard, the SQL Developer Migration Workbench, Oracle Data Pump, SQL*Loader, database links, the DBMS_CLOUD package, external tables, Data Sync, as well as other ETL tools. These are all approaches that can be used, and there will be situations where one works better than another.

Often a migration involves a combination of these tools. For example, the SQL Developer Data Import Wizard is a very powerful tool for loading data or flat files into the Autonomous Database, but in order to work with large data sets, I would strongly advise that you work with Oracle Cloud Object Storage. The SQL Developer Migration Workbench is a very powerful tool for migrating third-party databases, and it can work with Object Storage as well. Oracle also provides plugins that allow you to migrate from other databases, such as SQL Server and Amazon Redshift.

Data Pump, as I mentioned previously, is a powerful and recommended tool, especially when migrating from different platforms. There will be situations where it may be the only approach, depending on the platform you're coming from and the data source. But obviously, it's an Oracle-to-Oracle solution.

So let's look at loading data into the Autonomous Database. The DBMS_CLOUD package is the preferred method, because with this tool Oracle gives you the ability to load various types of data, whether it be a CSV file, an Avro file, a JSON file, or just a regular text file. The DBMS_CLOUD package lets you perform these inserts and data loads fairly efficiently.

Basically, what's required is that you copy your files, in whatever format they are, to Oracle Object Storage, and then run the DBMS_CLOUD package from within your Autonomous Database to load that data. And yes, the volume of data this supports is effectively unlimited. You can work with very large data volumes, and the load runs inside the database itself, independently of the client you're connected through.
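
To make that concrete, here is a minimal sketch of a DBMS_CLOUD.COPY_DATA call. The table, credential name, and bucket URL below are hypothetical, and the credential itself is created with DBMS_CLOUD.CREATE_CREDENTIAL (covered at the end of this post):

-- Target table must already exist in the Autonomous Database
CREATE TABLE channels (
  channel_id   NUMBER,
  channel_desc VARCHAR2(100)
);

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'OBJ_STORE_CRED',   -- hypothetical stored credential
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/mynamespace/b/mybucket/o/channels.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1')
  );
END;
/

Each load is logged, so you can check the outcome and any rejected rows afterwards in the USER_LOAD_OPERATIONS view.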

Loading with SQL*Loader is definitely supported, and it's a good approach for loading data that's sitting on a local client machine. It works over the network, so you can transfer your data without having to set up Object Storage for the operation. But do keep in mind that because you will be going over your network and running from a client machine, the volume of data you can practically transfer is limited.

And then when it comes to the Data Import Wizard in SQL Developer, this tool is very powerful. It can import many types of files, and it's good for importing XLS files if you wish; keep in mind that this is limited by the number of rows supported by the Excel version you're running. This approach is good for loading data from your local desktop. One of the most powerful features of this utility is that it gives you a very good understanding of what your data looks like and how it is going to be mapped when it gets loaded into the Autonomous Database. So it's a very good tool for setting up a sandbox or a POC and determining how your upgrade or data migration is going to work; it helps with the mappings, and so on.

When it comes to Object Storage, there's a package called DBMS_CLOUD, and that package is a wrapper on top of Oracle's external table functionality. It simplifies the work of defining how you're going to load your data. We're going to drill into some of the procedures included in the package, COPY_DATA being the one for loading your data. So keep that in mind: when it comes time to load your data using the DBMS_CLOUD package, COPY_DATA is the procedure you will use to actually load the data into the database.

ADB APIs for Object Store Access

  • PL/SQL package, DBMS_CLOUD
  • Wrapper on top of Oracle external table functionality
  • Simplifies creating external tables on object store files
  • No need to type complicated “create external table” commands

Data loading API: DBMS_CLOUD.COPY_DATA

External table API for queries: DBMS_CLOUD.CREATE_EXTERNAL_TABLE

DBMS_CLOUD is an interface to help you work with your Object Storage from your Autonomous Database. Remember, you do not have access to the local file system in the Autonomous Database, so you rely on this package to perform these tasks.
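
For the query-in-place path, here is a minimal sketch using DBMS_CLOUD.CREATE_EXTERNAL_TABLE. The table name, credential, object URL, and column definitions are hypothetical:

BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'SALES_EXT',          -- hypothetical external table
    credential_name => 'OBJ_STORE_CRED',     -- hypothetical stored credential
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/mynamespace/b/mybucket/o/sales_*.csv',
    column_list     => 'sale_id NUMBER, sale_date DATE, amount NUMBER',
    format          => json_object('type' value 'csv', 'skipheaders' value '1', 'dateformat' value 'YYYY-MM-DD')
  );
END;
/

Once created, SALES_EXT can be queried like any other table, with the data remaining in the object store.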

In addition to DBMS_CLOUD, we have the DBMS_CLOUD_ADMIN package. The DBMS_CLOUD_ADMIN package is only available on Autonomous Database Shared, and the reason is that on Autonomous Dedicated these operations can be run natively in the database: you can type CREATE DATABASE LINK, DROP DATABASE LINK, and so on. With Autonomous Shared you do not have that ability; those capabilities are locked down from end users. So creating or dropping a database link, enabling or disabling application continuity, and granting tablespace quotas are all done through this package, as shown in the sketch below.
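
Here is a rough sketch of creating a database link this way, assuming the documented CREATE_DATABASE_LINK parameters; the link name, host, service name, and credential below are made up for illustration:

BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => 'SALES_LINK',                -- hypothetical link name
    hostname           => 'target-db.example.com',     -- hypothetical target host
    port               => 1522,
    service_name       => 'sales_svc.example.com',     -- hypothetical service name
    ssl_server_cert_dn => 'CN=target-db.example.com',  -- DN of the target's SSL certificate
    credential_name    => 'DB_LINK_CRED'               -- credential holding the remote user name/password
  );
END;
/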

The DBMS_CLOUD_ADMIN package is only available to the ADMIN user by default. To run it as a user other than ADMIN, that user must be granted EXECUTE privileges on the package procedures, e.g. CREATE_DATABASE_LINK.
Use GRANT_TABLESPACE_QUOTA to explicitly grant a quota to a user if they need to load data.
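
As a quick illustration (the user name and quota value are hypothetical, and the parameter names follow the documented GRANT_TABLESPACE_QUOTA procedure), the ADMIN user might run something like:

-- Allow a regular user to create database links via DBMS_CLOUD_ADMIN
GRANT EXECUTE ON DBMS_CLOUD_ADMIN TO app_user;
GRANT CREATE DATABASE LINK TO app_user;

-- Give the same user room to load data into its own schema
BEGIN
  DBMS_CLOUD_ADMIN.GRANT_TABLESPACE_QUOTA(
    username         => 'APP_USER',
    tablespace_quota => '10G'   -- assumed quota value; UNLIMITED is also accepted
  );
END;
/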

We also have a package available on Autonomous Shared called DBMS_CLOUD_MACADM. DBMS_CLOUD_MACADM is used for configuring Oracle Database Vault on ADB. What you basically do when you run this package is designate the users that receive the DV_OWNER and DV_ACCTMGR roles. Once those roles have been granted, those users can enable, disable, and manage Database Vault.

Now, remember, DV_OWNER cannot be the ADMIN user, so you cannot grant DV_OWNER to ADMIN; this is one way of enforcing separation of duties. Keep in mind that enabling Database Vault requires the autonomous instance to be restarted, and Database Vault itself cannot be controlled by the ADMIN user. However, the roles to perform these operations are granted by the ADMIN user to these specific users, so it's a mechanism to control and separate duties.
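
As a rough sketch of that flow on Autonomous Shared (the user names are hypothetical; the procedures are the documented CONFIGURE_DATABASE_VAULT and ENABLE_DATABASE_VAULT):

-- Run as ADMIN: designate the Database Vault owner and account manager users
BEGIN
  DBMS_CLOUD_MACADM.CONFIGURE_DATABASE_VAULT(
    dvowner_uname   => 'DBV_OWNER_USER',    -- hypothetical user; must not be ADMIN
    dvacctmgr_uname => 'DBV_ACCTMGR_USER'   -- hypothetical user
  );
END;
/

-- Then enable Database Vault and restart the Autonomous Database instance
EXEC DBMS_CLOUD_MACADM.ENABLE_DATABASE_VAULT;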

When it comes to loading data with Object Storage, Object Storage provides a fast, low-cost model for storing data, and it's very good for staging data. It's also possible to query objects in the object store directly from your Autonomous Database. The nice thing about using Object Storage is that it simplifies ETL and ELT type operations: it allows you to query in place without physically moving your data from Object Storage into the Autonomous Database, and it's possible to create hybrid partitioned tables or external tables that map to Object Storage. It is extremely inexpensive, and your data is encrypted so that only you can see it. So it's a very powerful feature. In addition, it supports Swift object URLs as well as Oracle's native Object Storage URLs.
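
To make the query-in-place idea concrete, here is a hypothetical query against the SALES_EXT external table sketched earlier; the CSV files stay in Object Storage and are read at query time:

-- Nothing is loaded into the database; the object store files are scanned on the fly
SELECT sale_date, SUM(amount) AS total_amount
FROM   sales_ext
WHERE  sale_date >= DATE '2024-01-01'
GROUP  BY sale_date
ORDER  BY sale_date;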

Object Storage Integration

  • ADB is fully integrated with major object stores: Oracle, AWS, Azure
  • Choice of access:
    • Load object store files into the database
    • Query object store files on the fly without loading
  • Supports all text file formats: CSV, delimited, JSON, etc.

DBMS_CLOUD PACKAGE

CREATE_CREDENTIAL Procedure
This procedure stores Cloud Object Storage credentials in your Autonomous Database. Use stored credentials for data loading or for querying external data residing in the Cloud.

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'X',                  -- the name you reference in COPY_DATA / CREATE_EXTERNAL_TABLE calls
    username        => '',                   -- your cloud user name
    password        => 'auth-token-string'   -- an auth token generated for that user
  );
END;
/