Manipulating Data from Oracle Object Storage to Oracle Autonomous Data Warehouse (ADW) with Oracle Data Integrator (ODI)

February 20, 2019 | 11 minute read
Sandrine Riley
Senior Principal Product Manager

Guest Authors:  Alex Kong & Weida Wang - Oracle Solution Engineers


Introduction and Prerequisites

This article presents an overview of how to use Oracle Data Integrator (ODI) to manipulate data from Oracle Cloud Infrastructure (OCI) Object Storage. The scenario presented here loads data from Object Storage in Oracle Cloud Infrastructure and moves it into Oracle Autonomous Data Warehouse (ADW).

This document can serve as a reference for customers who have data stored in different regions and want to integrate that data and feed it into a data warehouse.

Main steps are listed here:

1. Install ODI

2. Apply patch p26669648_122130_Generic to upgrade ODI to the required version

3. Set up Source Data Server/Physical Schema/Model in Object Storage.

4. Set up Target Data Server/Physical Schema/Model in ADW.

5. Create a Mapping and test it.

You should have an Object Storage bucket and an ADW instance provisioned before starting.


 1. Install ODI

ODI installation is not covered in this document; refer to the Oracle Data Integrator installation documentation for details.


 2. Apply Patch p26669648_122130_Generic

You need to patch ODI to the required version first.

3. ODI is now upgraded.

4. Create a New Data Server

Let’s set up the topology. In the Topology navigator, right-click the Oracle Object Storage technology and select New Data Server.

An overview:

The items shown above are explained below.

a. Region:

Oracle Object Storage region. A region is a localized geographic area, and an availability domain is one or more data centers located within a region. A region is composed of several availability domains. Most Oracle Cloud Infrastructure resources are either region-specific, such as a virtual cloud network, or availability domain-specific, such as a compute instance.

b. Tenant OCID:

Tenant’s Oracle Cloud ID. Every Oracle Cloud Infrastructure resource has an Oracle-assigned unique ID called an Oracle Cloud Identifier (OCID). It's included as part of the resource's information in both the Console and API. To find your tenancy's OCID, go to Administration -> Tenancy Details.
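OCIDs follow a documented dot-separated layout (roughly ocid1.&lt;resource-type&gt;.&lt;realm&gt;.[region].&lt;unique-id&gt;, with the region part empty for tenancy and user OCIDs), so a quick sanity check can catch copy-paste mistakes before the connection test fails. The helper below is an illustrative Python sketch of such a check, not an official validator:

```python
import re

# Rough pattern for the general OCID layout; the "future use" segment that
# some OCIDs carry is not covered, so treat this as a sanity check only.
OCID_PATTERN = re.compile(
    r"^ocid1\."      # version prefix
    r"[a-z0-9]+\."   # resource type, e.g. tenancy, user, instance
    r"[a-z0-9]+\."   # realm, e.g. oc1
    r"[a-z0-9-]*\."  # region (empty for tenancy/user OCIDs)
    r"[a-z0-9]+$"    # unique identifier
)

def looks_like_ocid(value: str) -> bool:
    """Return True if the string matches the general OCID layout."""
    return OCID_PATTERN.match(value) is not None
```

For example, a tenancy OCID such as ocid1.tenancy.oc1..aaaa... passes, while a pasted console path does not.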

c. User OCID:

Oracle Cloud ID of the user logging into Oracle Object Storage.

The User OCID appears in the Console on the page showing the user's details. To get to that page:

  • If you're signed in as the user, click the user icon present in the top-right corner of the Console, and then click User Settings.
  • If you're an administrator doing this for another user, instead click Identity, click Users, and then select the user from the list.

In this example, the User OCID belongs to the user api.user.


d. Private Key File:

Click the browse button to choose the location of the private key file (in PEM format). Follow the documented steps to generate the private key and fingerprint.

  • Passphrase – the password used while generating the private key

e. Fingerprint:

The fingerprint of the public key generated along with the private key.

f. Username:

Specify the user api.user; this must be the same user referenced in item c (User OCID).
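Per the OCI documentation, the fingerprint is the colon-separated MD5 digest of the public key's DER encoding (the same value openssl reports for the uploaded key). As an illustrative sketch using only the Python standard library (the function name is our own):

```python
import base64
import hashlib
import re

def key_fingerprint(pem_text: str) -> str:
    """Compute an OCI-style fingerprint for a PEM public key:
    the colon-separated MD5 digest of the DER-encoded key bytes."""
    # Strip the PEM armour lines and decode the base64 body back to DER.
    body = re.sub(r"-----[A-Z ]+-----", "", pem_text)
    der = base64.b64decode("".join(body.split()))
    digest = hashlib.md5(der).hexdigest()
    # Group the 32 hex digits into 16 colon-separated byte pairs.
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))
```

The result has the familiar aa:bb:cc:... shape that the Console displays next to each uploaded API key.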

Caution: Upload the public key for the API user in the Console.

You can upload the PEM public key in the Console. If you don't have a login and password for the Console, contact an administrator.

  1. Open the Console, and sign in.
  2. View the details for the user who will be calling the API with the key pair:
    • If you're signed in as this user, click your username in the top-right corner of the Console, and then click User Settings.
    • If you're an administrator doing this for another user, instead click Identity, click Users, and then select the user from the list.
  3. Click Add Public Key.
  4. Paste the contents of the PEM public key in the dialog box and click Add.

Finally, click Test Connection to verify the configuration.

5. Creating an Oracle Object Storage Physical Schema


Create an Oracle Object Storage physical schema using the standard procedure, as described in Administering Oracle Data Integrator.

Oracle Object Storage specific parameters are:

  • Name: Name of the physical schema created
  • Bucket (Schema): Specifies the Oracle Object Storage bucket on which the upload, download, or delete operation will happen. Select the required bucket from the Bucket Name drop-down list.
  • Directory (Work Schema): This is the temporary folder on the local system used for getting files from the Oracle Object Storage bucket during reverse engineering. If the directory does not exist, it will be created. Specify the required location on the local system.

Then create the corresponding logical schema:


6. Creating and Reverse-Engineering an Oracle Object Storage Model

Creating an Oracle Object Storage Model

An Oracle Object Storage model is a set of data stores corresponding to files stored in an Oracle Object Storage bucket. In a given context, the logical schema corresponds to one physical schema, and the bucket schema of that physical schema is the Oracle Object Storage bucket containing all the files. You can create a model from the logical schema for the Oracle Object Storage technology, then create new ODI data stores representing files in Oracle Object Storage so that they can be used in mappings.

Input the information required and Save.

Reverse-Engineering Delimited Files from Oracle Object Storage

To perform a delimited file reverse engineering:

  1. In the Models accordion, right-click your Object Storage model and select New Data store. The Data Store Editor opens.
  2. In the Definition tab, enter the following fields:
    • Name: Name of this data store
    • Resource Name: Click the Search icon to select the required file from the list of files present in Oracle Object Storage for the configured bucket.

  3. Go to the Storage tab to describe the type of file. Set the fields as follows:
    • File Format: Delimited
    • Heading (Number of Lines): Enter the number of lines of the header. Note that if there is a header, Oracle Data Integrator uses the first line of the header to name the columns in the file.
    • Select a Record Separator.
    • Select or enter the character used as a Field Separator.
    • Enter a Text Delimiter if your file uses one.
    • Enter a Decimal Separator, if your file contains decimals.
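The settings above map directly onto how a delimited file is parsed. As an illustrative sketch (using a hypothetical sample file, not ODI's actual parser), here is how a one-line heading, a comma field separator, and a double-quote text delimiter combine when reading a CSV in Python:

```python
import csv
import io

# Hypothetical sample matching the settings above: one heading line,
# ',' as the field separator, '"' as the text delimiter.
sample = 'ID,NAME,CITY\n1,"Kong, Alex",Beijing\n2,"Wang, Weida",Shanghai\n'

reader = csv.reader(io.StringIO(sample), delimiter=",", quotechar='"')
header = next(reader)  # the first line names the columns, as ODI does
rows = list(reader)

print(header)   # ['ID', 'NAME', 'CITY']
print(rows[0])  # ['1', 'Kong, Alex', 'Beijing']
```

Note how the text delimiter keeps the comma inside "Kong, Alex" from being treated as a field separator.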

  4. From the File main menu, select Save.
  5. In the Data Store Editor, go to the Attributes tab.
  6. In the editor toolbar, click Reverse Engineer.

When you click Reverse Engineer, ODI generates the metadata based on the header of the file.

  7. Verify the data type and length for the reverse-engineered attributes. Oracle Data Integrator infers the field data types and lengths from the file content, but may set default values (for example, 50 for string field lengths) or incorrect data types in this process.
  8. From the File main menu, select Save.

7. Create a Connection with ADW

Create a Data Server for ADW. Specify the credential (wallet) file and choose the connection details from the drop-down list.

The JDBC information is filled in automatically; there is no need to update it.

Then test the connection.

Next, create a new physical schema.

Finally, create the model and reverse-engineer it.

8. Create a Project and a Mapping, and Test



Set the AP (Access Point) as below:

Caution: You need to run the stored procedure that creates the credential on ADW before running the mapping.

set define off

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ODI',
    username => 'api.user',
    password => '.};rKwO6t8***'
  );
END;
/

set define on

The mapping run finished.

Review the data loaded in ADW and compare it with the source CSV file in Oracle Object Storage. Looks good.



With the Oracle Data Integrator 12c releases, Oracle introduced several new enhancements: more sources and targets are supported (Oracle Object Storage, Oracle Autonomous Data Warehouse Cloud (ADW), Oracle Autonomous Transaction Processing (ATP), Oracle Enterprise Resource Planning (ERP) Cloud, etc.). This document can help customers achieve data integration across different regions or overseas.

The ODI 12c releases continue to improve Oracle’s strategic data integration platform while preserving the key product differentiators: Declarative Design, Knowledge Modules, Hot-Pluggability, and E-LT architecture.

