Tuesday Jun 03, 2014

How Oracle Data Integration Customers Differentiate Their Business in Competitive Markets

With data being a central force in driving innovation and competing effectively, data integration has become a key IT approach for removing silos and ensuring work with consistent, trusted data. Especially with their 12c releases, Oracle Data Integrator and Oracle GoldenGate offer easy-to-use, high-performance solutions that help companies with their critical data initiatives, including big data analytics, moving to cloud architectures, modernizing and connecting transactional systems, and more.

In a recent press release we announced the strong momentum and analyst recognition Oracle Data Integration products have achieved in the data integration and replication market. In that press release we described some of the key new features of Oracle Data Integrator 12c and Oracle GoldenGate 12c. In addition, a few of our 4,500+ customers explained how Oracle’s data integration platform helped them achieve their business goals. In this blog post I would like to go over what these customers shared about their experience.

Land O’Lakes is one of America’s premier member-owned cooperatives, offering an extensive line of agricultural supplies as well as production and business services. Rich Bellefeuille, manager, ETL & data warehouse for Land O’Lakes, told us how GoldenGate helped them modernize their critical ERP system without impacting service, and how they are moving to new projects with Oracle Data Integrator 12c: “With Oracle GoldenGate 11g, we've been able to migrate our enterprise-wide implementation of Oracle’s JD Edwards EnterpriseOne ERP system to a new database and application server platform with minimal downtime to our business. Using Oracle GoldenGate 11g we reduced database migration time from nearly 30 hours to less than 30 minutes. Given our quick success, we are considering expansion of our Oracle GoldenGate 12c footprint. We are also in the midst of deploying a solution leveraging Oracle Data Integrator 12c to manage our pricing data to handle orders more effectively and provide a better relationship with our clients. We feel we are gaining higher productivity and flexibility with Oracle's data integration products."

ICON, a global provider of outsourced development services to the pharmaceutical, biotechnology, and medical device industries, highlighted the competitive advantage that a solid data integration foundation brings. Diarmaid O’Reilly, enterprise data warehouse manager at ICON plc, said: “Oracle Data Integrator enables us to align clinical trials intelligence with the information needs of our sponsors. It helps differentiate ICON’s services in an increasingly competitive drug-development industry." You can find more information on ICON's implementation here.

A popular use case for Oracle GoldenGate’s real-time data integration is offloading operational reporting from critical transaction processing systems. SolarWorld, one of the world’s largest solar-technology producers and the largest U.S. solar panel manufacturer, implemented Oracle GoldenGate for real-time integration of manufacturing data for fast analysis. Russ Toyama, U.S. senior database administrator for SolarWorld, told us that real-time data helps their operations and that GoldenGate supports the high performance of their manufacturing systems: “We use Oracle GoldenGate for real-time data integration into our decision support system, which performs real-time analysis for manufacturing operations to continuously improve product quality, yield and efficiency. With reliable and low-impact data movement capabilities, Oracle GoldenGate also helps ensure that our critical manufacturing systems are stable and operate with high performance." You can watch the full interview with SolarWorld's Russ Toyama here.

Starwood Hotels and Resorts is one of many customers that have found out how well Oracle Data Integration products work with Oracle Exadata. Gordon Light, senior director of information technology for Starwood Hotels, says they saw a notable performance gain when loading their Oracle Exadata reporting environment: “We leverage Oracle GoldenGate to replicate data from our central reservations systems and other OLTP databases – significantly decreasing the overall ETL duration. Moving forward, we plan to use Oracle GoldenGate to help the company achieve near-real-time reporting.” You can listen to more about Starwood Hotels' implementation here.

Many companies combine the power of Oracle GoldenGate with Oracle Data Integrator to have a single, integrated data integration platform for a variety of use cases across the enterprise. Ufone is a good example. The leading mobile communications service provider in Pakistan has improved customer service by using timely customer data in its data warehouse. Atif Aslam, head of management information systems for Ufone, says: “Oracle Data Integrator and Oracle GoldenGate help us integrate information from various systems and provide up-to-date and real-time CRM data updates hourly, rather than daily. The applications have simplified data warehouse operations and allowed business users to make faster and better informed decisions to protect revenue in the fast-moving Pakistani telecommunications market.” You can read more about Ufone's use case here.

In our Oracle Data Integration 12c launch webcast back in November we also heard from BT’s CTO Surren Parthab about their use of GoldenGate in moving to a private cloud architecture. Surren also shared his perspective on the Oracle Data Integrator 12c and Oracle GoldenGate 12c releases. You can watch the video here.

These are only a few examples of leading companies that have made data integration and real-time data access a key part of their data governance and IT modernization initiatives. They have seen real improvements in how their businesses operate and differentiate in today’s competitive markets. You can read about other customer examples in our ebook, The Path to the Future, and access resources including white papers, data sheets, podcasts, and more via our Oracle Data Integration resource kit.


Friday May 30, 2014

Looking for Cutting-Edge Data Integration: 2014 Excellence Awards

2014 Oracle Excellence Awards Data Integration

It is nomination time!!!

This year's Oracle Fusion Middleware Excellence Awards will honor customers and partners who are creatively using various products across Oracle Fusion Middleware. Think you have something unique and innovative with one or a few of our Oracle Data Integration products?

We would love to hear from you! Please submit today.

The deadline for the nomination is June 20, 2014.

What you win:

  • An Oracle Fusion Middleware Innovation trophy

  • One free pass to Oracle OpenWorld 2014

  • Priority consideration for placement in Profit magazine, Oracle Magazine, or other Oracle publications and press releases

  • Oracle Fusion Middleware Innovation logo for inclusion on your own Website and/or press release

Let us reminisce a little…

For details on the 2013 Data Integration winners, Royal Bank of Scotland’s Market and International Banking and The Yalumba Wine Company, check out this blog post: 2013 Oracle Excellence Awards for Fusion Middleware Innovation… and the Winners for Data Integration are…

And for details on the 2012 Data Integration winners, Raymond James and Morrisons, check out this blog post: And the Winners of Fusion Middleware Innovation Awards in Data Integration are…

You can also view the full list of 2013 winners (for all categories) here.

We hope to honor you!

Here's what you need to do: 

Click here to submit your nomination today.  And just a reminder: the deadline to submit a nomination is 5pm Pacific Time on June 20, 2014.

Wednesday May 21, 2014

Zero Downtime Consolidation to Oracle Database 12c: Webcast Recap

As companies move to private cloud implementations to increase agility and reduce costs, and begin their plans by consolidating their databases, a major question arises: how do we move our systems without causing major disruption to the business? In the end, the most important systems that need to move to this new architecture are the ones that cannot tolerate extensive downtime for any reason. In last week’s webcast, “Zero Downtime Consolidation to Oracle Database 12c with Oracle GoldenGate 12c,” we tackled this specific dilemma that IT organizations face. The webcast is now available on demand via the link above in case you missed it last week.

In the webcast, we started by discussing the benefits companies achieve when they consolidate to a private database cloud, the critical database capabilities needed to deliver database as a service, and a quick overview of the Oracle Database 12c features that enable private database cloud deployments. Nick Wagner, director of product management for Oracle Database High Availability, also talked about the new Global Data Services feature in the Oracle Database Maximum Availability Architecture (MAA). Global Data Services helps organizations handle the challenges involved in managing multiple database replicas across different sites. The product manages database replicas maintained by Active Data Guard and Oracle GoldenGate, and offers workload balancing to maximize performance as well as intelligent handling of outages using all available replicas. Nick also discussed the database features available for upgrading, migrating, and consolidating into Oracle Database 12c, and when it makes sense to use Oracle GoldenGate for these efforts.

After Nick, Chai Pydimukkala, senior director of product management for GoldenGate, discussed some of the key new features of Oracle GoldenGate 12c for Oracle Database and its expanding heterogeneous support, which enables consolidation from all major databases. Chai continued his presentation with GoldenGate’s solution architecture for zero downtime consolidation and explained how it minimizes risk with a failback option, as well as a phased migration option that runs the old and new environments concurrently.

Chai also gave an example of how Oracle Active Data Guard customers can seamlessly use GoldenGate for a zero downtime migration without affecting the standby system they rely on for ongoing high availability and disaster recovery. After the customer examples we had a long Q&A session. For the full presentation and Q&A with the experts, you can watch the on-demand version of the webcast via the following link.

Zero Downtime Consolidation to Oracle Database 12c with Oracle GoldenGate 12c

In the rest of this blog post, I provide answers to some of the questions we could not take in the live event.

1) What are the additional feature differences between GoldenGate 11g and 12c?

Please take a look at our previous blog posts below where we discussed many of the new features of GoldenGate 12c.

· Advanced Replication for The Masses – Oracle GoldenGate 12c for the Oracle Database.

· GoldenGate 12c - What is Coordinated Delivery?

· GoldenGate 12c - Coordinated Delivery Example

· Oracle GoldenGate 12c - Announcing Support for Microsoft and IBM

You can also listen to our podcast: Unveiling Oracle GoldenGate 12c

More comprehensive information can be found in the webcast replay Introducing Oracle GoldenGate 12c: Extreme Performance Simplified or in the GoldenGate 12c resource kit.

2) Does GoldenGate have a term license that customers can purchase for migration purposes?

We do offer a one-year term license that customers can use for this purpose.

3) Is there an Oracle tutorial on configuring and testing a zero downtime upgrade using GoldenGate?

Here is a high-level outline/tutorial for the migration. It requires some knowledge of how to set up and configure GoldenGate.

Oracle GoldenGate Best Practice: Oracle Migrations and Upgrades 9i and up

A broader discussion of this topic can be found in the following white paper: Zero Downtime Database Upgrades using Oracle GoldenGate 12c

4) If the number of updates to the database is very high, does GoldenGate provide some kind of compression when transferring the trail files, so the target can keep up to date with the changes?

Oracle GoldenGate does support compressing change data in the trail before it is sent over the network, improving transfer throughput from source to target, via the COMPRESS and COMPRESSTHRESHOLD options of the RMTHOST parameter. The relevant syntax and option descriptions follow, with an example at the end.

RMTHOST {host_name | IP_address}
{, MGRPORT port | PORT port}
[, COMPRESS]
[, COMPRESSTHRESHOLD]
[, ENCRYPT algorithm [KEYNAME key_name]]
[, PARAMS collector_parameters]
[, STREAMING | NOSTREAMING]
[, TCPBUFSIZE bytes]
[, TCPFLUSHBYTES bytes]
[, TIMEOUT seconds]

COMPRESS

This option is valid for online or batch capture processes and any Oracle GoldenGate initial-load method that uses trails. It compresses outgoing blocks of records to reduce bandwidth requirements; Oracle GoldenGate decompresses the data before writing it to the trail. COMPRESS typically results in compression ratios of at least 4:1, and sometimes better. However, compressing data can consume CPU resources.

COMPRESSTHRESHOLD

This option is valid for online or batch capture processes and any Oracle GoldenGate initial-load method that uses trails. It sets the minimum block size for which compression is to occur. Valid values are from 0 through 28000. The default is 1,000 bytes.
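As a hedged example, a data pump Extract parameter file that compresses outgoing blocks of 512 bytes or larger might look like the following sketch (the process, host, port, trail, and table names are illustrative assumptions, not taken from any particular environment):

-- Illustrative data pump configuration; names and port are assumptions
EXTRACT pmp1
-- Compress outgoing blocks of 512 bytes or more before network transfer
RMTHOST tgthost, MGRPORT 7809, COMPRESS, COMPRESSTHRESHOLD 512
RMTTRAIL ./dirdat/rt
TABLE sales.*;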

5) Can the source system use GoldenGate 12c for DML/DDL capture while the target system uses GoldenGate 11g?

Yes. You can replicate from a higher version of GoldenGate to a lower version using the FORMAT RELEASE option on the trail specification:

FORMAT RELEASE <major>.<minor>
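For example, a 12c Extract or data pump could write a trail that GoldenGate 11g processes can read with something like the following sketch (the trail path is illustrative):

-- Write the remote trail in a format readable by GoldenGate 11.2 processes
RMTTRAIL ./dirdat/rt, FORMAT RELEASE 11.2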

You will find more questions answered by our product management experts in the on-demand replay, so make sure to watch the webcast as well.

Thursday May 15, 2014

Oracle Data Integrator Webcast Archives

Have you missed some of our Oracle Data Integrator (ODI) Product Management Webcasts?

Don’t worry – we record and post these webcasts for your viewing pleasure. Recent topics include Oracle Data Integrator (ODI) and Oracle GoldenGate integration, BigDataLite, the Oracle Warehouse Builder (OWB) Migration Utility, and the Management Pack for Oracle Data Integrator (ODI), along with various other topics focused on Oracle Data Integrator (ODI) 12c. We run these webcasts monthly, so please check back regularly.

You can find the Oracle Data Integrator (ODI) Webcast Archives here.

And for a bit more detail:

The webcasts are publicized on the ODI OTN Forum if you want to view them live.  You will find the announcement at the top of the page, with the title and details for the upcoming webcast.

Thank you – and happy listening!

Monday May 12, 2014

Check it out – BI Apps 11.1.1.8.1 is now available!

As of May 8, 2014, Oracle Business Intelligence (BI) Applications 11.1.1.8.1 is available on the Oracle Software Delivery Cloud (eDelivery) and on the Oracle BI Applications OTN page. This is the second major release on the 11g code line leveraging the power of Oracle Data Integrator (ODI), and it is certified with the latest version of Oracle BI Foundation, 11.1.1.7. For more details on this release and what’s new – check it out!

Tuesday May 06, 2014

No Way Out, But to Consolidate

IT teams face pressure on multiple fronts: business groups demand new application services delivered faster, along with continuous, high-quality IT service to support highly dynamic and competitive business operations. Meanwhile, IT teams are asked to reduce costs and improve ROI from existing systems in highly complex and siloed environments.

In the midst of this conundrum, the only way out for IT leadership is to consolidate and leverage cloud architecture. Consolidation at the database level in particular, by implementing a private database-as-a-service (DBaaS) environment, is a transformative approach to delivering database functionality to end users in an agile, efficient, and scalable way. It helps organizations improve resource utilization and lower both capital and operational expenditures.

Oracle Database 12c is designed to support database consolidation, with many new features for deploying a shared database environment. The new multitenant architecture, simplified management capabilities, and improvements to security and compliance, along with the robust and flexible Maximum Availability Architecture, are only a few of the key differentiators.

While this all sounds good and logical to IT teams, when it comes to moving to a private database cloud they face yet another major challenge: how do we move our critical systems to a new environment without interrupting our operations? In today’s 24/7 world, customer-facing systems especially cannot tolerate hours or days of downtime. For some of them, even a few minutes of downtime can bring high costs to the business. And as with any major project, the risk involved in moving to a new system is a real deterrent too.

Oracle GoldenGate 12c offers the core capabilities to enable this major move to a database-as-a-service environment. With its optimized support for the Oracle Database 12c multitenant architecture and its heterogeneous data replication capabilities, Oracle GoldenGate allows organizations to avoid business disruption while systems move to a consolidated DBaaS environment. In addition, Oracle GoldenGate offers a failback option to the old environment, or a phased migration option that runs the old and new systems simultaneously in active-active mode, to minimize risk. Oracle GoldenGate’s heterogeneous support spans all major database vendors and operating systems, including SQL Server, DB2 (LUW, z/OS, iSeries), Sybase ASE, HP NonStop, and more.

In our webcast Zero Downtime Consolidation to Oracle Database 12c with Oracle GoldenGate 12c, on May 13th at 10am PT, we will present this solution with specific architecture examples and a comparison to other database upgrade options.

Zero Downtime Consolidation to Oracle Database 12c with Oracle GoldenGate 12c

May 13th, 10am PT/ 1pm ET

If your organization needs to improve agility, increase innovation, and reduce costs, I invite you to join this webcast to learn how to remove the risk and business interruption barriers to moving to an agile and cost-efficient private database cloud.

Friday May 02, 2014

3 Key Practices For Using Big Data Effectively for Enhanced Customer Experience

As organizations focus on differentiating their offerings via superior customer experience, they are looking into ways to leverage big data in this effort. A couple of weeks ago my colleague Pete Schutt and I hosted a webcast on this very topic: Turning Big Data into Real-Time Action for a Greater Customer Experience.

In this webcast we talked about 3 key practices for making the most of big data to improve customer experience:

  1. Know your customer by leveraging big data: Leverage all relevant data (internal and external; structured, semi-structured, and unstructured) to understand and predict customers' needs and preferences accurately.
  2. Capture, analyze, and act on data fast to create value: Achieve accurate insight and take the right action fast, so your action is still relevant to the customer’s situation.
  3. Empower employees and systems with insight and smarter decisions: In this step you ensure that the capability to act right and fast is not limited to a few in the organization, but extends to everyone and every system that interacts with and influences the customer experience.


After explaining why these practices are critical to improving customer experience, we discussed Oracle’s complete big data analytics and management platform, as well as the specific products and architectural approaches to execute on these 3 key areas. We focused particularly on data integration for fast and timely data acquisition and business analytics for real-time insight and action, and how they fit together in a real-time analytics architecture.

You can watch this webcast now on demand via the link below:

Turning Big Data into Real-Time Action for a Greater Customer Experience

We received many great questions in this webcast, and I have provided a few of them below along with the answers.

Is real-time action related to the Internet of Things?

Yes. More physical things will be connected to the internet, often wirelessly via RFID tags or other sensors, with Java to record where they are and what they are doing (or not doing). The IoT becomes more practical when the information process is automated, from capture to analysis to appropriate and immediate action.

What does Oracle have for real-time mobile analytics?

Oracle BI Mobile App Designer empowers business users to easily create interactive analytical applications on any device without writing a single line of code, and to take action and respond to events in the context of their day-to-day business activities.

Can these real-time systems be managed by business users?

Yes. You need the agility for business owners to be able to respond, experiment, and adapt in real time as the environment or consumer behavior changes. The systems have to be intuitive enough for users with the business content and context to easily visualize, understand, and change the patterns they're looking for and the rules that are being enforced.

Can the real-time systems use other statistical models or algorithms?

Yes. Oracle Advanced Analytics offers an enterprise version of R, and Oracle RTD can source and publish scores from other advanced analytical models such as R, SAS, or SPSS.

Where do we get more information about ODI for big data?

You can start with the Oracle Data Integrator Application Adapter for Hadoop. Also take a look at the Oracle BigDataLite Virtual Machine, a pre-built environment to get you started, reflecting the core software of Oracle's Big Data Appliance 2.4. BigDataLite is a VirtualBox VM that contains a fully configured Cloudera Hadoop distribution (CDH 4.5), Oracle Database 12c, Oracle's Big Data Connectors, Oracle Data Integrator 12.1.2, and other software. You can use this environment to see ODI 12c in action integrating big data with Oracle Database using ODI's declarative graphical design, efficient E-LT loads, and Knowledge Modules designed to optimize big data integration.

For GoldenGate, can a target be something other than a database, e.g., a queue?

Yes. GoldenGate can deliver database changes to JMS message queues and topics, as well as in flat-file format. The Oracle GoldenGate Application Adapters would need to be used for those use cases. For low-impact, real-time data integration into Hadoop systems, customers will need to use the Java Adapter within the same GoldenGate Application Adapters license as well.

What other data warehouses does Oracle support for real-time data integration?

Oracle's data integration offering is heterogeneous for both sources and targets. Both Oracle Data Integrator and Oracle GoldenGate work with non-Oracle data warehouses, including Teradata, DB2, Netezza, and Greenplum.

I invite you to watch this webcast on demand to hear the details of our solution discussion and the Q&A with the audience. For more information on big data integration and analytics, you can review Bridging Two Worlds: Big Data and Enterprise Data and Big Data @ Work: Turning Customer Interactions into Opportunities.



Friday Apr 25, 2014

Long Running Jobs in Oracle Business Intelligence Applications (OBIA) and Recovery from Failures

Written by Jayant Mahto, Oracle Data Integrator Product Management

In Oracle Business Intelligence Applications 11.1.1.7.1 (OBIA), the data warehouse load is performed using Oracle Data Integrator (ODI). In ODI, using packages and load plans, one can create quite a complex load job that kicks off many scenarios in a coordinated fashion. Such a complex load job may run for a long time, and it may fail before completing successfully, requiring restarts to recover from the failure and finish the job.

This blog post uses the complex load plan defined for Oracle Business Intelligence Applications 11.1.1.7.1 (OBIA) to illustrate the method of recovery from failures. Similar methods can be used to recover complex load plans defined independently in Oracle Data Integrator (ODI). Note that this post does not go into the details of troubleshooting a failed load plan; it only covers the different restart parameters that affect the behavior of a restarted job.

Failures can happen for the following reasons:

  • Access failure – source/target database down, network failure, etc.
  • Agent failure.
  • A problem with the database – such as running out of space or some other database-related issue.
  • Data-related failure – exceptions not caught gracefully, such as a null value in a NOT NULL column.

It is important to find out the reason for the failure and address it before attempting to restart the load plan; otherwise the same failure may happen again. To recover from the failure successfully, the restart parameters in the load plan steps need to be selected carefully. These parameters are selected at design time by the load plan developers. The goal is to make restarts robust enough that the administrator can restart the load plan without knowing the details of the failed steps. This is why it is the developer’s responsibility to select restart parameters for the load plan steps in a way that guarantees the correct set of steps is re-run during restart, ensuring data integrity is maintained.

In the case of OBIA, the generated load plans have appropriate restart parameters for the out-of-the-box steps. If you are adding custom steps, then you need to choose similar restart parameters for them.

Now let us look at a typical load plan and the restart parameters at various steps.

Restart of a serial load plan step:


The SDE Dimension Group step highlighted above is a serial step. Let us say the load plan failed while running the 3 SDE Dims GEO_DIM step. Since this is a serial step and it has been set to “Restart from Failure,” the load plan on restart would start from 3 SDE Dims GEO_DIM and would not run 3 SDE Dims USER_DIM again. This parameter is widely used in the OBIA serial steps.

The other restart parameter for serial steps is “Restart all children.” This causes all the child steps to be re-run during restart, even if only one failed and the others succeeded. This parameter can be useful in some cases, and developers decide when to apply it.

Restart of a parallel load plan step:


The Workforce Dependant Facts step highlighted above is a parallel step with the restart parameter set to “Restart from failed children.” This means all 5 parallel steps under it are kicked off in parallel (subject to free sessions being available). Now, among those 5 steps, let us say 2 of them completed (indicated by the green boxes above) and then the load plan failed. When the load plan is restarted, all the steps that did not complete or failed will be started again (in this example, Learning Enrollment Fact, Payroll Facts, and Recruitment Facts). This parameter is widely used in the OBIA parallel steps.

The other restart parameter for a parallel step is “Restart all children.” Again, this causes all the child steps to be re-run during restart, even if only one failed and the others succeeded, and developers decide when to apply it.

Restart of the scenario session:

At the lowest level of any load plan are the scenario steps. While the parent steps (serial or parallel) are used to set the dependencies, the scenario steps are what finally load the tables. A scenario step in turn can have one or more steps (corresponding to the number of steps inside the package).

It is important to understand the structure of a session that gets created for the execution of a scenario step to understand the failure points and how the restart takes place.

The following diagram illustrates different components in a session:


The restart parameters for the scenario steps in the load plan are:

  • Restart from new session – Creates a new session for the failed scenario during restart and executes all the steps again.
  • Restart from failed task – Reuses the old session for the failed scenario during restart and starts from the failed task.
  • Restart from failed step – Reuses the old session for the failed scenario during restart and re-executes all the tasks in the failed step. This is the most common parameter used by OBIA and is illustrated below.


In the above example, scenario step 2 failed while running. It internally has 3 steps (all under the same session in the Operator log, but identified with the different step numbers 0, 1, and 2 in this case). Per the setting corresponding to the OBIA standard, the scenario would execute from the failed step, which is step number 2, Table_Maint_Proc (from substep 3, Initialize Variables, onwards, as shown in the diagram).

Note that successful tasks such as “3 – Procedure – TABLE_MAINT_PROC – Initialize variables” will be executed again during restart, since the scenario restart parameter is set to “Restart from failed step” in the load plan.

Summary:

OBIA has certain coding standards for setting up restart parameters, as discussed above. For serial and parallel steps, the parameters “Restart from failure” and “Restart from failed children” allow the completed steps to be skipped. For scenario steps (which actually kick off the load sessions), the restart parameter “Restart from failed step” skips the completed steps in the session and reruns all the tasks in the failed step, allowing recovery of an incomplete step.

This standard allows a hands-free approach: a failed load plan can be restarted by an administrator who has no knowledge of what the load plan is doing.


Tuesday Apr 22, 2014

Fusion Application Incremental Data Migration

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the last post we discussed how Fusion Applications use ODI to transform and move data from their interface tables to their internal tables. In this article we will look at how Fusion Applications use ODI to extract data from legacy applications and load it into the interface tables.

Fusion Applications have created a large number of ODI interfaces and packages to address this use case. These ODI artifacts are shipped to Fusion Applications customers for performing the initial and incremental data migration from their legacy applications to the Fusion Applications interface tables. The shipped artifacts can be customized to match the customizations in the customer’s environment. The diagram below depicts the data migration from Siebel to Fusion Customer Relationship Management (CRM). The whole process is performed in two stages: first, the data is extracted from the underlying database of Siebel’s production system into a staging area in an Oracle database; then it is migrated into the Fusion Applications interface tables. ODI is used for both of these operations. Once the initial migration is done, the trickle of change data is replicated and transformed through the combination of ODI and GoldenGate.


Incremental Migration using ODI and GoldenGate

Initial Load

The initial load is the bulk data movement process where a snapshot of the data from the legacy application is moved into the staging area in an Oracle database. Depending upon the underlying database type of the legacy application, the appropriate ODI Knowledge Module is used in the interfaces to get the best performance. For instance, for Oracle-to-Oracle data movement, Knowledge Modules using DBLINK are employed to move data natively through a database link, as sketched below.
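Conceptually, what such a Knowledge Module generates boils down to an insert-as-select over a database link, so the rows move server-to-server and never pass through a middle tier. Below is a minimal, hypothetical Java/JDBC sketch of that pattern; the connection details, table name, and database link name are illustrative assumptions, not the actual Fusion Applications objects.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class InitialLoadSketch {
    public static void main(String[] args) throws Exception {
        // Connect to the staging Oracle database (URL and credentials are assumptions)
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//stagehost:1521/STAGEDB", "stg_user", "stg_pwd");
             Statement stmt = conn.createStatement()) {
            conn.setAutoCommit(false);
            // Insert-as-select over a database link: the data stays inside the
            // database tier, which is what gives the DBLINK-based Knowledge
            // Module its performance advantage
            int rows = stmt.executeUpdate(
                "INSERT INTO STG_CONTACTS SELECT * FROM S_CONTACT@SIEBEL_SRC");
            conn.commit();
            System.out.println("Rows staged: " + rows);
        }
    }
}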

Replication

The replication process takes care of moving data incrementally after the initial load is complete. Oracle GoldenGate is leveraged for this incremental change data replication, continuously replicating changes into the staging area. The trickle of change data is then moved from the staging area to the Fusion Applications interface tables through ODI’s change data capture processes, using ODI Journalizing Knowledge Modules.

Thanks again for reading about ODI in the Fusion Applications!  This was the last in a series of three posts.  To review the related posts:  Oracle Data Integrator (ODI) Usage in Fusion Applications and Fusion Application Bulk Import Process.

Wednesday Apr 16, 2014

Learn about Oracle Data Integrator (ODI) Agents

Check out two new ODI A-Team blog posts – all about the Oracle Data Integrator (ODI) Agents! Understand where to install the ODI standalone agent, and find out more about the ODI agent flavors and installation types. Which one(s) make sense for you?

Understanding Where to Install the ODI Standalone Agent

ODI Agents: Standalone, JEE and Colocated 

Happy reading!

Tuesday Apr 15, 2014

Fusion Application Bulk Import Process

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the previous blog post we looked at the Fusion Applications end-to-end bulk data integration use cases. Now let’s take a closer look at the Bulk Import process that transforms and moves data from interface tables to internal tables. For this use case, ODI is bundled along with Fusion Applications and gets configured transparently by the Fusion Applications provisioning process. The entire process is automated and controlled through the Fusion Applications user interface. Provisioning also seeds the ODI repository with the Fusion Applications-specific models, interfaces, procedures, and packages, which are then dynamically modified through the ODI SDK to reflect any Fusion Applications customizations.


Fusion Application Bulk Import Process

The above diagram shows the Bulk Import process in Fusion Applications, where ODI is used for data transformation. Here the interface tables are the source tables, populated by other processes before the Bulk Import process kicks off. The Fusion Applications internal tables are the target where the data needs to be loaded. These internal tables are used directly by Fusion Applications functionality, so a number of data validations are applied to load only good-quality data into them. The data validation errors are monitored and corrected through the Fusion Applications user interface. The metadata of Fusion Applications tables is not fixed and gets modified as the application is customized for the customer’s requirements. Any change in such source or target tables requires corresponding adjustments in the ODI artifacts too; this is taken care of by AppComposer, which uses the ODI SDK to make such changes in the ODI artifacts. If auditing is enabled, then any change in the internal table data, or any change in the ODI artifacts, is recorded in a centralized auditing table.

Packaged ODI Artifacts

There are a large number of ODI models, interfaces, and packages seeded in the default ODI repository used for Bulk Import. These ODI artifacts are built based upon the base metadata of the Fusion Applications schema.

Extensibility

As part of customization, Fusion Applications entities are added or modified per the customer’s requirements. Such customizations result in changes to the underlying Fusion Applications internal tables and interface tables, and require the ODI artifacts to be updated accordingly. The Fusion Applications development team has built an extensibility framework to update ODI artifacts dynamically along with any change in the Fusion Applications schema. It leverages the ODI SDK to perform changes in the ODI repository; a sketch of that SDK pattern follows. The dynamic generation of ODI artifacts is automatically kicked off as part of the patching and upgrade process. The Fusion Applications AppComposer user interface also supports explicitly triggering this process, so that administrators can regenerate ODI artifacts whenever they make customizations.
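To give a feel for what programmatic repository access through the ODI SDK looks like, here is a minimal, hypothetical sketch that connects to an ODI repository and looks up a project by code. The JDBC URL, repository names, credentials, and project code are placeholders, and the actual Fusion Applications extensibility code is far more involved than this finder-then-modify skeleton.

import oracle.odi.core.OdiInstance;
import oracle.odi.core.config.MasterRepositoryDbInfo;
import oracle.odi.core.config.OdiInstanceConfig;
import oracle.odi.core.config.PoolingAttributes;
import oracle.odi.core.config.WorkRepositoryDbInfo;
import oracle.odi.core.security.Authentication;
import oracle.odi.domain.project.OdiProject;
import oracle.odi.domain.project.finder.IOdiProjectFinder;

public class OdiSdkSketch {
    public static void main(String[] args) {
        // Repository connection details are placeholders, not FA's actual values
        MasterRepositoryDbInfo master = new MasterRepositoryDbInfo(
            "jdbc:oracle:thin:@//odihost:1521/ODIREP", "oracle.jdbc.OracleDriver",
            "ODI_MASTER", "master_pwd".toCharArray(), new PoolingAttributes());
        WorkRepositoryDbInfo work =
            new WorkRepositoryDbInfo("WORKREP", new PoolingAttributes());
        OdiInstance odi =
            OdiInstance.createInstance(new OdiInstanceConfig(master, work));
        try {
            // Authenticate before touching repository objects
            Authentication auth = odi.getSecurityManager()
                .createAuthentication("SUPERVISOR", "sup_pwd".toCharArray());
            odi.getSecurityManager().setCurrentThreadAuthentication(auth);

            // Look up a project by code; updates to interfaces and packages
            // would follow the same finder-then-modify pattern
            IOdiProjectFinder finder = (IOdiProjectFinder)
                odi.getTransactionalEntityManager().getFinder(OdiProject.class);
            OdiProject project = finder.findByCode("DEMO_PROJECT");
            System.out.println("Found project: "
                + (project == null ? "none" : project.getName()));
        } finally {
            odi.close();
        }
    }
}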

Validation Error Reporting

The validation errors are populated in intermediate tables and are exposed through BI Publisher, so that admin users can correct and recycle these error records.

Auditing

The Fusion Applications auditing framework keeps track of which changes were performed, by which user, and at what time. There are two levels of auditing captured in the Fusion Applications audit table for the Bulk Import use case: first, metadata changes to ODI artifacts made through the ODI SDK during customization; second, transactional data changes in the Fusion Applications tables made as part of ODI interface execution. For these purposes, the ODI team has exposed some substitution APIs that the Fusion Applications development team uses to customize ODI KMs to perform such auditing during the actual data movement.

Provisioning and Upgrade

The provisioning process takes care of installing and configuring ODI for the Fusion Applications instance.

It automatically creates the ODI repository schemas, configures topology, sets up ODI agents, sets up configurations for the ODI-ESS bridge, seeds the packaged ODI artifacts, applies modifications to seeded artifacts, and creates internal users in IDM for external authentication. There is a separate process to apply patches or upgrade the environment to a newer release. Such patching or upgrade processes not only take care of importing newer ODI artifacts but also kick off a CRM extensibility process that modifies ODI artifacts per the Fusion Applications customizations.

External Authentication

A dedicated IDM instance is configured with each Fusion Applications instance, and all Fusion Applications components are expected to have their users authenticated through this centralized IDM. For the Bulk Import use case, ODI is configured with external authentication, and internal users are created in IDM for communicating with the ODI agent and kicking off ODI jobs.

Enterprise Scheduler Service (ESS) - ODI Bridge

ODI scenarios are kicked off through the ESS-ODI bridge, a separate library built for ODI-ESS integration that gets deployed along with the Enterprise Scheduler Service (ESS) in the Fusion Applications environment. It supports both synchronous and asynchronous modes of invocation for ODI jobs. In the asynchronous mode, the session status is propagated through callbacks to the ESS services. A topology editor is provided to manage the ESS callback service connectivity, exclusively for Fusion Applications use cases.

Note: Use of the ESS-ODI bridge is restricted to the Fusion Applications use case at the moment.

High Availability

The ODI agent is deployed on a WebLogic cluster in the Fusion Applications environment to take advantage of ODI's high availability capabilities. By default there is only one managed server in the WebLogic cluster created for ODI, but as the load increases, more managed servers can be added to the cluster to distribute execution of ODI sessions among the ODI agent instances in the cluster.

Stay tuned for the last post on this topic coming soon.  This was part two in a series of three posts.  The initial post can be found here.

Friday Apr 11, 2014

Oracle Data Integrator (ODI) Usage in Fusion Applications (FA)

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

Oracle Data Integrator (ODI) is the bulk data transformation platform for Fusion Applications (FA). ODI is used by Fusion Customer Relationship Management (CRM), Fusion Human Capital Management (HCM), Fusion Supply Chain Management (SCM), Fusion Incentive Compensation (IC), and the Fusion Financials family of products, and many other Fusion Applications teams are following suit. Among these product families, CRM is the biggest consumer of ODI, leveraging a breadth of ODI features and functionality, some of which were developed specifically for Fusion Applications use. The ODI features they utilize include the ODI SDK, high availability, external authentication, various out-of-the-box and customized Knowledge Modules, the ODI-ESS bridge, callbacks to ESS EJBs, auditing, open tools, and more. In this post we will first talk about the different Fusion Applications use cases at a high level and then take a closer look at the different integration points.

Figure 1 shows the data integration needs of a typical on-premise Fusion Applications deployment.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for data loads from any outside source. The bulk import process validates this data and loads it into the internal tables, which can then be used by the Fusion application.
  2. Data Migration: Extracting data from external applications, legacy applications, or any other data source and loading it into Fusion Applications' interface tables. ODI can be used for such data loads.
  3. Preparing Data Files: Converting data into Comma Separated Values (CSV) files that can be imported through Fusion Applications' file import wizard. ODI can be used to extract data into such CSV files.

Figure 1: Data Integration Needs in On-Premise Fusion Application

Figure 2 shows the requirements of an on-demand, or cloud, environment, which are slightly different as there is no direct connectivity available to the interface tables.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for any data load from any outside source. The bulk import process validates this data and then loads it into the internal tables, which can then be used by the application.
  2. Preparing Data Files: Converting data into CSV files that can be imported through Fusion Applications' file import wizard. ODI can be used for the creation of such CSV files.
  3. Uploading Data Files: The data files are uploaded to the tenant file repository through either Fusion Applications' file import page or the Oracle WebCenter Content Document Transfer Utility. The WebCenter utility is built using the ODI open tool framework, allowing the entire process to be orchestrated through an ODI package.
  4. Loading Interface Tables: The data files are loaded into the interface tables so that they can be consumed by the Bulk Import process. ODI is used for loading these interface tables.