Tuesday Aug 04, 2015

Simplicity in Leveraging Oracle Data Integrator for Cloud Applications

Check out last week’s A-Team Blog post… A Universal Cloud Applications Adapter for ODI

Learn about the simplicity of leveraging Oracle Data Integrator (ODI) with all emerging technologies in the world of cloud computing!

For more A-Team reads on ODI, browse through the A-Team Chronicles.

Monday Oct 27, 2014

Updated Statement of Direction for Oracle Business Intelligence Applications (OBIA)

Oracle's product strategy for Oracle Business Intelligence Applications (OBIA) was published this October in the latest Statement of Direction.

Interesting points relative to the BI Applications around data integration:

  • Oracle’s strategic development for ELT for BI Applications will focus on the Oracle Data Integrator and related technologies. Since the fielding of the ODI compatible version of BI Applications in the 11g series, customers have realized substantial financial and operational benefits from reduced time to value and improved speed of operations. Oracle continues to evolve and develop ODI, and Oracle’s BI Applications will take advantage of the latest capabilities as they become available.

  • Oracle will continue to support the 7.9.6.x product series according to the Oracle Lifetime Support policy, including certifications of databases, operating systems, and enabling 3rd-party technologies. However, Oracle will no longer develop new content for this series, nor extend the 7.9.6.x series or any series based on an ETL architecture that uses Informatica.

You can find the related blog entry with additional details from the BI Team here.

Wednesday Oct 08, 2014

A Recap of the Data Integration Track's Final Day in OpenWorld 2014

Last week during OpenWorld, my colleague Madhu provided a summary of the first 3 days of the Data Integration track in his blog post Data Integration At OOW14 – Recapping Days 1, 2, and 3. Today I would like to mention a few key sessions we presented on Thursday.

We kicked off the last day of OpenWorld with the cloud topic. In the Oracle Data Integration: A Crucial Ingredient for Cloud Integration [CON7926] session, Julien Testut from the Data Integration product management team presented with Sumit Sarkar from Progress DataDirect. Julien provided an overview of the various data integration solutions for cloud deployments, including integration between databases and applications hosted in the cloud, as well as loading cloud BI infrastructures. Sumit followed Julien with a live demo using Oracle Data Integrator and the Progress DataDirect JDBC drivers to load and transform data on Amazon Redshift and to extract data from Salesforce.com. All of us in the audience were amazed that the demo worked seamlessly using only the OpenWorld wi-fi network.

For Oracle GoldenGate we had 2 key sessions on Thursday:

Achieving Zero Downtime During Oracle Applications Upgrades and System Migrations was in the Fusion Middleware AppAdvantage track and featured Oracle GoldenGate’s zero downtime upgrade and migration solution with customer Symantec and partner Pythian. In this session, Doug Reid presented GoldenGate's 3 different deployment models for zero downtime application upgrades. In his slides, Doug highlighted GoldenGate’s certified solution for Siebel CRM, and mentioned the support for zero downtime application upgrades for JD Edwards and Billing and Revenue Management (BRM) as well. Following Doug, Symantec’s Rohit Muttepawar came to the stage and talked about the migration project for their critical licensing database. Pythian’s Gleb Otochkin and Luke Davies then presented how using Oracle GoldenGate in a 6-node active-active replication environment helped Western Union achieve application releases with zero downtime, database patches and upgrades with zero downtime, and a real-time reporting database with no impact to online users.

The other key GoldenGate session was Managing and Monitoring Oracle GoldenGate. Joe deBuzna from the GoldenGate product management team provided an overview of the new monitoring and management capabilities included in the Oracle GoldenGate Plug-in for Enterprise Manager and Oracle GoldenGate Monitor. Both of these products are included in the Oracle Management Pack for Oracle GoldenGate license. With the new 12c release, they can now control the starting, stopping, and configuration of existing Oracle GoldenGate processes, and they include many new metrics that strengthen how users can monitor their Oracle GoldenGate deployments.

The Data Integration track at OpenWorld closed with "Insight into Action: Business Intelligence Applications and Oracle Data Integrator". Jayant Mahto from the ODI development team and Gurcan Orhan from Wipro Technologies focused on the latest Oracle BI Applications release, which embeds Oracle Data Integrator for data loading and transformations and provides the option to use Oracle GoldenGate for real-time data feeds into the reporting environment. The session highlighted how the new Oracle BI Apps release provides greater strategic insight quickly, efficiently, and with a low total cost of ownership, as well as the role of Oracle Data Integrator as the data integration foundation. Jayant and Gurcan presented how Oracle Data Integrator enables users to increase IT efficiency and reduce costs by increasing data transparency, easing setup and maintenance, and improving real-time reporting.

In case you missed it, I'd like to remind you of the press announcement that went out on September 29th, which gives a summary of the key developments in the Fusion Middleware product family and Oracle's data integration offering. As mentioned in the press announcement, we now have a new offering for metadata management. An overview of this product was delivered in the “Oracle Data Integration and Metadata Management for the Seamless Enterprise” [CON7923] session. We will post a dedicated blog on this topic later in the week. Stay tuned for more on that.

Monday May 12, 2014

Check it out – BI Apps is now available!

As of May 8, 2014, Oracle Business Intelligence (BI) Applications is available on the Oracle Software Delivery Cloud (eDelivery) and on the Oracle BI Applications OTN page. This is the second major release on the 11g code line leveraging the power of Oracle Data Integrator (ODI), and it is certified with the latest version of Oracle BI Foundation. For more details on this release and what’s new – check it out!

Friday Apr 25, 2014

Long Running Jobs in Oracle Business Intelligence Applications (OBIA) and Recovery from Failures

Written by Jayant Mahto, Oracle Data Integrator Product Management

In Oracle Business Intelligence Applications (OBIA), the Data Warehouse load is performed using Oracle Data Integrator (ODI). In ODI, using packages and load plans, one can create quite a complex load job that kicks off many scenarios in a coordinated fashion. Such a complex load job may run for a long time and may fail before completing; it will then require a restart to recover from the failure and finish successfully.

This blog uses the complex load plan defined in Load Plan for Oracle Business Intelligence Applications (OBIA) to illustrate the method of recovery from failures. Similar methods can be used to recover complex load plans defined independently in Oracle Data Integrator (ODI). Note that this post does not go into the details of troubleshooting a failed load plan; it only talks about the different restart parameters that affect the behavior of a restarted job.

Failures can happen for the following reasons:

  • Access failure – Source/Target DB down, network failure etc.
  • Agent failure.
  • Problem with the Database – As in running out of space or some other DB related issue.
  • Data related failure – Exceptions not caught gracefully, like null in not null column etc.

It is important to find out the reason for the failure and address it before attempting to restart the load plan; otherwise the same failure may happen again. In order to recover from the failure successfully, the restart parameters in the load plan steps need to be selected carefully. These parameters are selected by the developers at design time of the load plan. The goal is to make the restarts robust enough that the administrator can perform a restart without knowing the details of the failed steps. This is why it is the developer’s responsibility to select restart parameters for the load plan steps in a way that guarantees the correct set of steps will be re-run during restart, ensuring that data integrity is maintained.

In the case of OBIA, the generated load plans have appropriate restart parameters set for the out-of-the-box steps. If you are adding custom steps, you need to choose similar restart parameters for them.

Now let us look at a typical load plan and the restart parameters at various steps.

Restart of a serial load plan step:

SDE Dimension Group Step highlighted above is a serial step. Let us say the Load plan failed when running the 3 SDE Dims GEO_DIM step. Since this is a serial step and it has been set to “Restart from Failure”, the load plan on restart would start from 3 SDE Dims GEO_DIM only and not run the 3 SDE Dims USER_DIM again. This parameter is widely used in the OBIA serial steps.

The other restart parameter for a serial step is “Restart all children”. This causes all the children steps to be re-run during restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to apply it.

Restart of a parallel load plan step:

The Workforce Dependant Facts step highlighted above is a parallel step with the restart parameter set to “Restart from failed children”. This means all 5 parallel steps under it are kicked off in parallel (subject to free sessions being available). Now, amongst those 5 steps, let us say 2 of them completed (indicated by the green boxes above) and then the load plan failed. When the load plan is restarted, all the steps that failed or did not complete will be started again (in this example, Learning Enrollment Fact, Payroll Facts, and Recruitment Facts). This parameter is widely used in the OBIA parallel steps.

The other restart parameter for a parallel step is “Restart all children”. This causes all the children steps to be re-run during restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to apply it.
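To make the serial and parallel behaviors above concrete, here is a small Python sketch. This is a simplified illustration only, not ODI's actual implementation; the status values and the identifier forms of the restart parameter names are hypothetical.

```python
# Simplified illustration of ODI load plan restart semantics.
# Statuses ("DONE"/"FAILED"/"NOT_RUN") and step names are hypothetical.

def steps_to_rerun(children, restart_param):
    """Return the child steps executed again on restart.

    children: list of (name, status) in execution order.
    restart_param: "restart_from_failure" (serial),
                   "restart_from_failed_children" (parallel),
                   or "restart_all_children" (either).
    """
    if restart_param == "restart_all_children":
        return [name for name, _ in children]
    if restart_param == "restart_from_failure":
        # Serial: resume at the first step that did not complete.
        for i, (name, status) in enumerate(children):
            if status != "DONE":
                return [name for name, _ in children[i:]]
        return []
    if restart_param == "restart_from_failed_children":
        # Parallel: only failed or never-run children are re-run.
        return [name for name, status in children if status != "DONE"]
    raise ValueError(restart_param)

serial = [("SDE Dims USER_DIM", "DONE"), ("SDE Dims GEO_DIM", "FAILED")]
print(steps_to_rerun(serial, "restart_from_failure"))
# ['SDE Dims GEO_DIM']

parallel = [("Learning Enrollment Fact", "FAILED"),
            ("Payroll Facts", "NOT_RUN"),
            ("Absence Fact", "DONE")]
print(steps_to_rerun(parallel, "restart_from_failed_children"))
# ['Learning Enrollment Fact', 'Payroll Facts']
```

Note how “Restart from Failure” re-runs everything from the first incomplete serial step onward, while “Restart from failed children” re-runs only the parallel children that did not complete.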

Restart of the scenario session:

At the lowest level in any load plan are the scenario steps. While the parent steps (serial or parallel) are used to set the dependencies, the scenario steps are what finally load the tables. A scenario step in turn can have one or more steps (corresponding to the number of steps inside the package).

To see where failures can occur and how the restart takes place, it is important to understand the structure of the session that gets created for the execution of a scenario step.

The following diagram illustrates different components in a session:

The restart parameters for the scenario steps in the load plan are:

  • Restart from a new session – This creates a new session for the failed scenario during restart and executes all the steps again.
  • Restart from a failed task – This uses the old session for the failed scenario during restart and starts from the failed task.
  • Restart from a failed step – This uses the old session for the failed scenario during restart and re-executes all the tasks in the failed step again. This is the most common parameter used by OBIA and is illustrated below.

In the above example, scenario step 2 failed while running. It internally has 3 steps (all under the same session in the Operator log, but identified with different step numbers: 0, 1, and 2 in this case). As per the OBIA standard setting, the scenario executes from the failed step, which is step number 2, Table_Maint_Proc (from substep 3, Initialize Variables, onwards, as shown in the diagram).

Note that the successful tasks such as “3 – Procedure – TABLE_MAINT_PROC – Initialize variables” will be executed again during restart since the scenario restart parameter is set to “Restart from failed step” in the Load Plan.
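The step-level restart behavior can be sketched in Python as follows. Again this is a simplified illustration; the session structure, step names, and task names are hypothetical, not ODI internals.

```python
# Simplified sketch of "Restart from failed step" at the session level.
# Step and task names are made up for illustration.

def tasks_to_rerun(session_steps, failed_step):
    """session_steps: ordered list of (step_name, [task, ...]).
    With "Restart from failed step", every task in the failed step is
    re-executed (even tasks that had already succeeded), and all
    subsequent steps then run as usual."""
    rerun = []
    hit_failure = False
    for step_name, tasks in session_steps:
        if step_name == failed_step:
            hit_failure = True
        if hit_failure:
            rerun.extend(tasks)
    return rerun

session = [("Step 0", ["load stage"]),
           ("Step 1", ["merge target"]),
           ("TABLE_MAINT_PROC", ["Initialize variables", "Run maintenance"])]
print(tasks_to_rerun(session, "TABLE_MAINT_PROC"))
# ['Initialize variables', 'Run maintenance']
```

This mirrors the example above: the already-successful “Initialize variables” task is included in the re-run because restart granularity is the step, not the task.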


OBIA has a coding standard for setting up the restart parameters discussed above. For serial and parallel steps, the parameters “Restart from failure” and “Restart from failed children” allow the completed steps to be skipped. For scenario steps (which actually kick off the load sessions), the restart parameter “Restart from failed step” skips the completed steps in the session and reruns all the tasks in the failed step, allowing recovery of an incomplete step.

This standard allows a hands-free approach: a failed load plan can be restarted by an administrator who has no knowledge of what the load plan is doing.

Tuesday Apr 22, 2014

Fusion Application Incremental Data Migration

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the last post we discussed how Fusion Applications uses ODI to transform and move data from the interface tables to the internal tables. In this article we will look at how Fusion Applications uses ODI to extract data from legacy applications and load it into the interface tables.

The Fusion Applications teams have created a large number of ODI interfaces and packages to address this use case. These ODI artifacts are shipped to Fusion Applications customers for performing the initial and incremental data migration from their legacy applications to the Fusion Applications interface tables. The shipped artifacts can be customized to match the customizations in the customer’s environment. The diagram below depicts the data migration from Siebel to Fusion Customer Relationship Management (CRM). The whole process is performed in two stages: first the data is extracted from the underlying database of Siebel’s production system into a staging area in an Oracle database, and then it is migrated into the Fusion Applications interface tables. ODI is used for both of these operations. Once the initial migration is done, the trickle of change data is replicated and transformed through the combination of ODI and GoldenGate.

Incremental Migration using ODI and Golden Gate

Initial Load

The initial load is the bulk data movement process in which a snapshot of the data from the legacy application is moved into the staging area in an Oracle database. Depending upon the underlying database type of the legacy application, the appropriate ODI Knowledge Module is used in the interfaces to get the best performance. For instance, for Oracle-to-Oracle data movement, a DBLINK-based Knowledge Module is used to move data natively through a DBLINK.
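As an illustration of the DBLINK approach, the statement such a Knowledge Module generates is essentially an INSERT ... SELECT over the database link. The following Python sketch builds a statement of that shape; the table, column, and dblink names are made up for illustration and are not taken from any actual Knowledge Module.

```python
# Hypothetical sketch of the INSERT ... SELECT a DBLINK-based
# Knowledge Module might generate for an Oracle-to-Oracle initial load.
# All identifiers here are invented for illustration.

def dblink_load_sql(target, source, dblink, columns):
    cols = ", ".join(columns)
    return (f"INSERT /*+ APPEND */ INTO {target} ({cols}) "
            f"SELECT {cols} FROM {source}@{dblink}")

sql = dblink_load_sql("STG_CONTACTS", "S_CONTACT", "SIEBEL_LINK",
                      ["ROW_ID", "FIRST_NAME", "LAST_NAME"])
print(sql)
```

The point of the pattern is that the data never leaves the database tier: the target Oracle instance pulls rows directly from the source over the link, which is why it tends to outperform generic client-side data movement.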


The replication process takes care of moving data incrementally after the initial load is complete. Oracle GoldenGate is leveraged for this incremental change data replication, continuously replicating the changes into the staging area. The trickle of change data is then moved from the staging area to the Fusion Applications interface tables through ODI’s change data capture processes, using ODI Journalizing Knowledge Modules.
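Conceptually, applying this trickle of change data amounts to replaying captured insert/update/delete records against the target. The following is a minimal Python sketch of that idea only; it is not GoldenGate or ODI code, and the change record format is hypothetical.

```python
# Conceptual model of incremental change replication: the target table
# is a dict keyed by primary key, and captured changes are replayed
# in order. Record format (op, key, row) is invented for illustration.

def apply_changes(target, changes):
    for op, key, row in changes:
        if op in ("INSERT", "UPDATE"):
            target[key] = row          # upsert semantics
        elif op == "DELETE":
            target.pop(key, None)
        else:
            raise ValueError(op)
    return target

table = {1: {"name": "Acme"}}
changes = [("UPDATE", 1, {"name": "Acme Corp"}),
           ("INSERT", 2, {"name": "Globex"}),
           ("DELETE", 1, None)]
print(apply_changes(table, changes))   # {2: {'name': 'Globex'}}
```

Because each change carries the key and the operation, the changes can be applied continuously and in arrival order, which is what keeps the staging area current without re-extracting the full snapshot.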

Thanks again for reading about ODI in Fusion Applications! This was the last in a series of three posts. To review, the related posts are: Oracle Data Integrator (ODI) Usage in Fusion Applications and Fusion Application Bulk Import Process.

Tuesday Apr 15, 2014

Fusion Application Bulk Import Process

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the previous blog post we looked at the Fusion Applications end-to-end bulk data integration use cases. Now let’s take a closer look at the Bulk Import process that transforms and moves data from the interface tables to the internal tables. For this use case, ODI is bundled along with Fusion Applications and gets configured transparently by the Fusion Applications provisioning process. The entire process is automated and controlled through the Fusion Applications user interface. Provisioning also seeds the ODI repository with the Fusion Applications-specific Models, Interfaces, Procedures, and Packages, which are then dynamically modified through the ODI SDK for any Fusion Applications customizations.

Fusion Application Bulk Import Process

The above diagram shows the Bulk Import process in Fusion Applications, where ODI is used for data transformation. Here the interface tables are the source tables, populated by other processes before the Bulk Import process is kicked off. The Fusion Applications internal tables are the target for these integrations, where the data needs to be loaded. These internal tables are directly used by Fusion Applications functionality, so a number of data validations are applied to load only good-quality data into the internal tables. The data validation errors are monitored and corrected through the Fusion Applications user interface. The metadata of the Fusion Applications tables is not fixed and gets modified as the application is customized for the customer’s requirements. Any change in such source or target tables requires corresponding adjustments in the ODI artifacts too; this is taken care of by AppComposer, which uses the ODI SDK to make such changes in the ODI artifacts. If auditing is enabled, then any change in the internal table data, as well as changes in the ODI artifacts, is recorded in a centralized auditing table.

Packaged ODI Artifacts

There are a large number of ODI models, interfaces, and packages seeded in the default ODI repository used for Bulk Import. These ODI artifacts are built upon the base metadata of the Fusion Applications schema.


As part of customization, Fusion Applications entities are added or modified as per the customer’s requirements. Such customizations result in changes in the underlying Fusion Applications internal tables and interface tables, and require the ODI artifacts to be updated accordingly. The Fusion Applications development team has built an extensibility framework to update the ODI artifacts dynamically along with any change in the Fusion Applications schema. It leverages the ODI SDK to perform any changes in the ODI repository. The dynamic generation of ODI artifacts is automatically kicked off as part of the patching and upgrade process. The Fusion Applications AppComposer user interface also supports explicitly triggering this process, so that administrators can regenerate the ODI artifacts whenever they make customizations.

Validation Error Reporting

The validation errors are populated in intermediate tables and are exposed through BI Publisher so that admin users can correct and recycle these error records.
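The validate-and-recycle idea can be sketched in Python as follows. This is a conceptual illustration with a made-up validation rule and in-memory "tables"; the real validations and error tables are defined by Fusion Applications.

```python
# Conceptual sketch of a validation-and-recycle loop: rows failing
# validation land in an error table; once corrected by an admin, they
# are re-submitted. The "email required" rule is invented for
# illustration.

def validate(row):
    return row.get("email") is not None

def load_with_recycle(rows, error_table):
    """Append valid rows to the result; route invalid rows to errors."""
    loaded = []
    for row in rows:
        (loaded if validate(row) else error_table).append(row)
    return loaded

errors = []
loaded = load_with_recycle(
    [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": None}], errors)
# An admin corrects the error record, and it is recycled:
errors[0]["email"] = "b@x.com"
loaded += load_with_recycle(errors, [])
print([r["id"] for r in loaded])   # [1, 2]
```

The key property is that bad rows never reach the internal tables; they wait in the intermediate error table until corrected and retried.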


The Fusion Applications auditing framework keeps track of which user performed each change and when. There are two levels of auditing captured in the Fusion Applications audit table for the Bulk Import use case: first, metadata changes made to ODI artifacts through the ODI SDK during customizations; second, transactional changes made to Fusion Applications table data during ODI interface execution. For these purposes, the ODI team has exposed some substitution APIs that the Fusion Applications development team uses to customize ODI KMs to perform such auditing during the actual data movement.

Provisioning and Upgrade

The provisioning process takes care of installing and configuring ODI for the Fusion Applications instance.

It takes care of automatically creating the ODI repository schemas, configuring topology, setting up ODI agents, setting up configurations for the ODI-ESS bridge, seeding packaged ODI artifacts, applying modifications to seeded artifacts, and creating internal users in IDM for external authentication. There is a separate process to apply patches or upgrade the environment to a newer release. Such patching or upgrade processes not only take care of importing newer ODI artifacts but also kick off a CRM extensibility process that modifies ODI artifacts as per the Fusion Applications customizations.

External Authentication

A dedicated IDM is configured with each Fusion Applications instance, and all Fusion Applications components are expected to have their users authenticated through this centralized IDM. For the Bulk Import use case, ODI is configured with external authentication, and internal users created in IDM are used for communicating with the ODI agent and kicking off ODI jobs.

Enterprise Scheduler Service (ESS) - ODI Bridge

ODI scenarios are kicked off through the ODI-ESS bridge, a separate library built for ODI-ESS integration that gets deployed along with the Enterprise Scheduler Service (ESS) in the Fusion Applications environment. It supports both synchronous and asynchronous modes of invocation for ODI jobs. In asynchronous mode, the session status is communicated through callbacks to the ESS services. A topology editor is provided to manage the ESS callback service connectivity, exclusively for Fusion Applications use cases.

Note: Use of ESS-ODI Bridge is restricted to Fusion Application use case only at the moment.
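The synchronous and asynchronous invocation modes described above can be sketched generically. This is a minimal Python model of the pattern only, not the actual ODI-ESS bridge API; the function names are invented for illustration.

```python
# Generic sketch of synchronous vs. asynchronous job invocation with a
# completion callback. Models the bridge concept only; not ODI-ESS code.
import threading

def run_job(job, on_done=None):
    """Run synchronously if on_done is None; otherwise run in the
    background and push the final status back through the callback."""
    if on_done is None:
        return job()                  # caller blocks until the job ends
    def worker():
        on_done(job())                # status delivered via callback
    t = threading.Thread(target=worker)
    t.start()
    return t                          # caller can continue immediately

# Synchronous: the status is returned directly.
print(run_job(lambda: "DONE"))        # DONE

# Asynchronous: the status arrives via the callback.
statuses = []
t = run_job(lambda: "DONE", on_done=statuses.append)
t.join()
print(statuses)                       # ['DONE']
```

The design choice mirrored here is the one in the bridge: synchronous invocation is simpler for short jobs, while asynchronous invocation with a callback frees the scheduler for long-running ODI sessions.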

High Availability

The ODI agent is deployed on a WebLogic cluster in the Fusion Applications environment to take advantage of ODI's high availability capabilities. By default there is only one managed server in the WebLogic cluster created for ODI, but as the load increases, more managed servers can be added to the cluster to distribute the execution of ODI sessions among the ODI agent instances in the cluster.

Stay tuned for the last post on this topic coming soon.  This was part two in a series of three posts.  The initial post can be found here.

Friday Apr 11, 2014

Oracle Data Integrator (ODI) Usage in Fusion Applications (FA)

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

Oracle Data Integrator (ODI) is the bulk data transformation platform for Fusion Applications (FA). ODI is used by Fusion Customer Relationship Management (CRM), Fusion Human Capital Management (HCM), Fusion Supply Chain Management (SCM), Fusion Incentive Compensation (IC), and the Fusion Financials family of products, and many other Fusion Applications teams are following suit. Among all these product families, CRM is the biggest consumer of ODI, leveraging a breadth of ODI features and functionality, some of which were developed specifically for Fusion Applications use. The ODI features they utilize include: the ODI SDK, high availability, external authentication, various out-of-the-box and customized Knowledge Modules, the ODI-ESS bridge, callbacks to ESS EJBs, auditing, open tools, etc. In this post we will first talk about the different Fusion Applications use cases at a high level and then take a closer look at the different integration points.

Figure 1 shows the data integration needs of a typical on-premise Fusion Applications deployment.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for data loads from any outside source. The bulk import process validates this data and loads it into the internal tables, which can then be used by the Fusion application.
  2. Data Migration: Extracting data from external applications, legacy applications, or any other data source and loading it into the Fusion Applications interface tables. ODI can be used for such data loads.
  3. Preparing Data Files: Converting data into Comma Separated Values (CSV) files that can be imported through the Fusion Applications file import wizard. ODI can be used to extract data into such CSV files.

Figure 1: Data Integration Needs in On-Premise Fusion Application
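As a simple illustration of the “Preparing Data Files” step above, the following Python sketch writes rows out as a CSV that a file-import wizard could consume. The column names are hypothetical, chosen only for the example.

```python
# Minimal sketch of preparing a CSV data file for an import wizard.
# Column names (PARTY_NAME, COUNTRY) are invented for illustration.
import csv
import io

def rows_to_csv(rows, fieldnames):
    """Serialize dict rows into CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

out = rows_to_csv([{"PARTY_NAME": "Acme", "COUNTRY": "US"},
                   {"PARTY_NAME": "Globex", "COUNTRY": "DE"}],
                  ["PARTY_NAME", "COUNTRY"])
print(out)
```

In practice the rows would be extracted from the source system (the ODI role), and the resulting file would then be fed to the import wizard.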

Figure 2 shows the on-demand or cloud environment requirements, which are slightly different as there is no direct connectivity available to the interface tables.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for any data load from any outside source. The bulk import process validates this data and then loads it into the internal tables, which can then be used by the application.
  2. Preparing Data Files: Converting data into CSV files that can be imported through the Fusion Applications file import wizard. ODI can be used for the creation of such CSV files.
  3. Uploading Data Files: The data files are uploaded to the tenant file repository through either the Fusion Applications file import page or the Oracle WebCenter Content Document Transfer Utility. The WebCenter utility is built using the ODI open tool framework, allowing the entire process to be orchestrated through an ODI package.
  4. Loading Interface Tables: The data files are loaded into the interface tables so that the data can be consumed by the Bulk Import process. ODI is used for loading these interface tables.

Figure 2: Data Integration Needs for On-Demand Fusion Application

Stay tuned for more blog posts on this topic coming next week. This was part one in a series of three posts.

Wednesday Sep 04, 2013

Extending the Value of Oracle Applications with Oracle Data Integration

Can you imagine a business operating without any application software? Not anymore. Transactional applications are at the center of business operations in today's digital world. With a wide range of offerings, Oracle Applications bring customers the latest technology and make IT a strategic differentiator for their business. These applications’ data is also a major asset to the organization and critical to the health of the business. There are several ways Oracle's data integration solutions help Oracle Applications customers drive more value out of their application investments.

A key method to maximize the return on business applications is to connect their data to gain a unified view. A complete and up-to-date view enables businesses to make better business decisions and meet customers’ needs more efficiently. This is where Oracle Data Integration comes in. By connecting applications at the data layer, at any latency, Oracle’s data integration products enable Oracle Applications users to find answers that take their business to the next level. Oracle Data Integrator (ODI), the flagship product for bulk data movement and transformation, has a long list of Knowledge Modules for Oracle Applications that leverage best practices in loading, transforming, or integrating data for each specific application. Oracle Data Integrator is also embedded in the Oracle BI Applications release. With its tight integration with Oracle GoldenGate, ODI enables Oracle BI Applications users to leverage GoldenGate’s real-time data integration capabilities and perform analysis with up-to-date data.

In addition to removing data silos, IT organizations need to meet users’ expectations for high performance and continuous operations. With a single real-time data replication software platform, GoldenGate addresses both of these requirements. Oracle E-Business Suite, Oracle PeopleSoft Applications, JD Edwards, Siebel CRM, and ATG Web Commerce customers can use GoldenGate to offload queries from production systems to reduce overhead and increase performance. In addition, GoldenGate is certified for enabling application upgrades for Siebel CRM, JD Edwards, and Communications Billing and Revenue Management without interrupting business operations. With bidirectional and heterogeneous real-time data replication, GoldenGate also eliminates planned downtime for database, operating system, and hardware migration projects.

You can find out how Oracle Data Integration can help your Oracle Application deployment by visiting our new resource center: Powering Oracle Applications with Next-Generation Data Integration.

I also would like to invite you to learn more about how Oracle Fusion Middleware offers incremental value to Oracle Applications customers, which we call Oracle AppAdvantage, with innovative, best-of-breed solutions that simplify IT, provide a competitive edge, and enable innovation. You can read the Oracle AppAdvantage story to discover how Oracle AppAdvantage helps align IT and business initiatives.

Friday Feb 10, 2012

Eliminating Batch Windows Using Real-Time Data Integration

When we invest in technology solutions we expect improvement in productivity, agility, performance and more. We don’t want to be limited by the technology we select. While data warehouses are designed to give us the freedom to access complete and reliable information, the underlying data integration architecture and the type of solution used can lead to significant constraints on how we manage our critical production systems.

With data warehousing solutions, one of the most common constraints is the time window available for batch extract processing on the source systems. The resource intensive extract process typically has to be done in off-business hours and restricts access to critical source systems.

A low-impact, real-time data integration solution can liberate your systems from batch windows. When the extract component uses a non-intrusive method, such as reading database transaction logs to capture only the changed data, it does not burden source systems. Hence, data extract can happen at any time of the day, and throughout the day, while all users are online.

SunGard is a great example of achieving a major transformation in a data warehousing solution by using log-based real-time data integration. The company removed the batch-window constraint by using Oracle GoldenGate to capture all of the intraday changes that take place on the selected tables. As a result, they reduced the nightly extract process from the Oracle E-Business Suite application by 9 hours. In addition, Oracle GoldenGate enables them to feed E-Business Suite data to the Oracle BI Application for Finance throughout the day. Hence, end users have access to up-to-date information on the Oracle Business Intelligence dashboards as changes occur.

To read more about how real-time data integration can free your critical systems from batch window constraints and how SunGard leveraged Oracle GoldenGate in their data warehousing implementation, I invite you to check out the SunGard case study and the article “Freedom from Batch Windows Using Real-Time Data Integration” we wrote for TDWI's What Works in Emerging Technologies publication. As always, you can find more resources on Oracle GoldenGate on our recently redesigned website.

Wednesday Jul 27, 2011

Maximize the Return on your Oracle Applications with Oracle Data Integration

Application data is the lifeline of any enterprise. Organizations rely heavily on their applications, whether supply chain, HR, HCM, payroll, or any other kind, to run their businesses. Being able to integrate the data from one application into another is a powerful thing. Siloed applications are useful, but only within that particular silo. Here's where data integration can help alleviate the situation where application data is disparate and separated.

Oracle Data Integration consists of Oracle Data Integrator and Oracle GoldenGate. ODI's bulk data movement and data transformation features provide the 'heavy lifting' needed to make application data integration a success. GoldenGate provides zero downtime upgrades as well as the capacity to offload reporting to a different database, one that could be of lower cost, smaller size, etc., thereby making application data integration cost effective as well. Join us on July 28th at 10 AM PT/1 PM ET to hear exactly how Oracle Data Integration can help increase the return on your Oracle Applications with data integration technology. Register today.

