Tuesday Apr 22, 2014

Fusion Application Incremental Data Migration

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the last post we discussed how Fusion Applications uses ODI to transform and move data from the interface tables to the internal tables. In this article we will look at how Fusion Applications uses ODI to extract data from legacy applications and load it into the interface tables.

The Fusion Applications team has created a large number of ODI interfaces and packages to address this use case. These ODI artifacts are shipped to Fusion Applications customers for performing the initial and incremental data migration from their legacy applications to the Fusion Applications interface tables, and they can be customized to match the customizations in the customer's environment. The diagram below depicts the data migration from Siebel to Fusion Customer Relationship Management (CRM). The whole process is performed in two stages: first the data is extracted from the underlying database of Siebel's production system into a staging area in an Oracle database, and then it is migrated into the Fusion Applications interface tables. ODI is used for both of these operations. Once the initial migration is done, the trickle of change data is replicated and transformed through the combination of ODI and GoldenGate.


Incremental Migration using ODI and GoldenGate

Initial Load

The initial load is the bulk data movement process in which a snapshot of the data from the legacy application is moved into the staging area in the Oracle database. Depending upon the underlying database type of the legacy application, an appropriate ODI Knowledge Module is used in the interfaces to get the best performance. For instance, for Oracle to Oracle data movement, a DBLINK-based Knowledge Module is used to move data natively through a database link.
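To make this concrete, below is a minimal sketch in plain JDBC of the core statement such a DBLINK-based load boils down to: a single set-based INSERT ... SELECT over a database link, executed on the staging database. The connection details, table names and link name are hypothetical, and an actual Knowledge Module wraps staging, options and error handling around this statement.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    /** Sketch of the core statement behind a DBLINK-based initial load:
     *  one set-based INSERT ... SELECT over a database link, run on the
     *  staging database. All names and credentials are hypothetical. */
    public class InitialLoadSketch {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:oracle:thin:@//staginghost:1521/STAGE"; // hypothetical staging DB
            try (Connection con = DriverManager.getConnection(url, "stg_user", "secret")) {
                con.setAutoCommit(false);
                try (Statement st = con.createStatement()) {
                    // Pull the snapshot natively across the DB link in one statement.
                    int rows = st.executeUpdate(
                        "INSERT /*+ APPEND */ INTO stg_contacts " +
                        "SELECT row_id, first_name, last_name, last_upd " +
                        "FROM s_contact@SIEBEL_SRC");
                    con.commit();
                    System.out.println(rows + " rows loaded into staging");
                }
            }
        }
    }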

Replication

The replication process takes care of moving data incrementally after the initial load is complete. Oracle GoldenGate is leveraged for this incremental change data replication, continuously replicating the changes into the staging area. The trickle of change data is then moved from the staging area to the Fusion Applications interface tables through ODI's change data capture processes, using ODI Journalizing Knowledge Modules.
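On the ODI side, a Journalizing Knowledge Module tracks changed rows in journal tables (named with a J$ prefix by convention, carrying JRN_SUBSCRIBER, JRN_FLAG and JRN_DATE columns); in this architecture it is the trickle feed replicated by GoldenGate that populates them. The sketch below, with hypothetical names, shows the kind of pending-changes query a journalized interface consumes.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    /** Sketch: read pending changes for one subscriber from an ODI journal table.
     *  The J$ naming and JRN_* columns follow ODI JKM conventions; the table name,
     *  subscriber and connection details are hypothetical. */
    public class JournalPollerSketch {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:oracle:thin:@//staginghost:1521/STAGE"; // hypothetical
            try (Connection con = DriverManager.getConnection(url, "stg_user", "secret");
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT jrn_flag, jrn_date, row_id " + // flag: I = insert/update, D = delete
                     "FROM j$stg_contacts WHERE jrn_subscriber = ? ORDER BY jrn_date")) {
                ps.setString(1, "FUSION_IMPORT"); // hypothetical subscriber name
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s %s%n",
                            rs.getString("jrn_flag"), rs.getTimestamp("jrn_date"),
                            rs.getString("row_id"));
                    }
                }
            }
        }
    }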

Thanks again for reading about ODI in Fusion Applications! This was the last in a series of three posts. To review the related posts: Oracle Data Integrator (ODI) Usage in Fusion Applications and Fusion Application Bulk Import Process.

Wednesday Apr 16, 2014

Learn about Oracle Data Integrator (ODI) Agents

Check out two new ODI A-Team blog posts – all about the Oracle Data Integrator (ODI) Agents! Understand where to install the ODI standalone agent, and find out more about the ODI agent flavors and installation types. Which one(s) make sense for you?

Understanding Where to Install the ODI Standalone Agent

ODI Agents: Standalone, JEE and Colocated 

Happy reading!

Tuesday Apr 15, 2014

Fusion Application Bulk Import Process

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

In the previous blog post we looked at the Fusion Applications end-to-end bulk data integration use cases. Now let's take a closer look at the Bulk Import process that transforms and moves data from the interface tables to the internal tables. For this use case ODI is bundled with Fusion Applications and gets configured transparently by the Fusion Applications provisioning process. The entire process is automated and controlled through the Fusion Applications user interface. Provisioning also seeds the ODI repository with the Fusion Applications specific models, interfaces, procedures and packages, which are then dynamically modified through the ODI SDK for any Fusion Applications customizations.


Fusion Application Bulk Import Process

The above diagram shows the Bulk Import process in Fusion Applications, where ODI is used for data transformation. Here the interface tables are the source tables, populated by other processes before the Bulk Import process kicks off. The Fusion Applications internal tables are the target into which the data needs to be loaded. Because these internal tables are used directly by Fusion Applications functionality, a number of data validations are applied so that only good-quality data is loaded into them. The data validation errors are monitored and corrected through the Fusion Applications user interface. The metadata of the Fusion Applications tables is not fixed and gets modified as the application is customized to the customer's requirements. Any change in such source or target tables requires corresponding adjustments in the ODI artifacts as well; this is taken care of by AppComposer, which uses the ODI SDK to make the changes in the ODI artifacts. If auditing is enabled, any change in the internal table data or in the ODI artifacts is recorded in a centralized auditing table.

Packaged ODI Artifacts

There are a large number of ODI models, interfaces and packages seeded in the default ODI repository used for Bulk Import. These ODI artifacts are built against the base metadata of the Fusion Applications schema.

Extensibility

As part of customization, Fusion Applications entities are added or modified to match the customer's requirements. Such customizations result in changes to the underlying Fusion Applications internal tables and interface tables, and require the ODI artifacts to be updated accordingly. The Fusion Applications development team has built an extensibility framework to update the ODI artifacts dynamically along with any change in the Fusion Applications schema. It leverages the ODI SDK to perform the changes in the ODI repository. The dynamic generation of ODI artifacts is kicked off automatically as part of the patching and upgrade processes. The Fusion Applications AppComposer user interface also supports triggering this process explicitly, so that administrators can regenerate the ODI artifacts whenever they make customizations.
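To give a feel for the kind of ODI SDK access such a framework relies on, here is a minimal sketch following the standard ODI 11g SDK pattern: connect to the master and work repositories, authenticate, then look up artifacts through a finder before modifying them. The connection details, credentials and work repository name are hypothetical; this is illustrative boilerplate, not the actual Fusion Applications extensibility code.

    import java.util.Collection;

    import oracle.odi.core.OdiInstance;
    import oracle.odi.core.config.MasterRepositoryDbInfo;
    import oracle.odi.core.config.OdiInstanceConfig;
    import oracle.odi.core.config.PoolingAttributes;
    import oracle.odi.core.config.WorkRepositoryDbInfo;
    import oracle.odi.core.security.Authentication;
    import oracle.odi.domain.project.OdiInterface;
    import oracle.odi.domain.project.finder.IOdiInterfaceFinder;

    /** Sketch of programmatic repository access with the ODI 11g SDK.
     *  URLs, credentials and the work repository name are hypothetical. */
    public class OdiSdkSketch {
        public static void main(String[] args) {
            // Connect to the master and work repositories.
            MasterRepositoryDbInfo master = new MasterRepositoryDbInfo(
                "jdbc:oracle:thin:@//repohost:1521/ODIREP", "oracle.jdbc.OracleDriver",
                "odi_master", "secret".toCharArray(), new PoolingAttributes());
            WorkRepositoryDbInfo work = new WorkRepositoryDbInfo("WORKREP", new PoolingAttributes());
            OdiInstance odi = OdiInstance.createInstance(new OdiInstanceConfig(master, work));
            try {
                // Authenticate before touching any repository object.
                Authentication auth = odi.getSecurityManager()
                    .createAuthentication("SUPERVISOR", "secret".toCharArray());
                odi.getSecurityManager().setCurrentThreadAuthentication(auth);

                // Look up interfaces through a finder; a real framework would
                // modify them here and persist the changes in a transaction.
                IOdiInterfaceFinder finder = (IOdiInterfaceFinder)
                    odi.getTransactionalEntityManager().getFinder(OdiInterface.class);
                Collection<?> interfaces = finder.findAll();
                for (Object o : interfaces) {
                    System.out.println(((OdiInterface) o).getName());
                }
            } finally {
                odi.close();
            }
        }
    }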

Validation Error Reporting

The validation errors are populated in intermediate tables and are exposed through BI Publisher so that admin users can correct and recycle these error records.
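For background, ODI's Check Knowledge Modules route rejected rows into error tables, conventionally prefixed E$_ and carrying columns such as ERR_TYPE, ERR_MESS and CHECK_DATE; these are the kind of intermediate tables such reports are built on. A minimal sketch of reading one such table follows, with a hypothetical table name and connection.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    /** Sketch: list rejected rows from an ODI error table for review.
     *  The E$_ prefix and ERR_* columns follow ODI CKM conventions; the
     *  table name and connection details are hypothetical. */
    public class ErrorTableReportSketch {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:oracle:thin:@//fahost:1521/FUSION"; // hypothetical
            try (Connection con = DriverManager.getConnection(url, "fusion_stg", "secret");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT err_type, err_mess, check_date " +
                     "FROM e$_imp_contacts ORDER BY check_date DESC")) {
                while (rs.next()) {
                    System.out.printf("%s | %s | %s%n",
                        rs.getString("err_type"), rs.getString("err_mess"),
                        rs.getTimestamp("check_date"));
                }
            }
        }
    }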

Auditing

The Fusion Applications auditing framework keeps track of which changes were performed by which user and at what time. Two levels of auditing are captured in the Fusion Applications audit table for the Bulk Import use case: first, the metadata changes made to ODI artifacts through the ODI SDK during customizations; second, the transactional data changes made to the Fusion Applications tables as part of ODI interface executions. For these purposes the ODI team has exposed substitution APIs that the Fusion Applications development team uses to customize ODI KMs to perform such auditing during the actual data movement.

Provisioning and Upgrade

The provisioning process takes care of installing and configuring ODI for the Fusion Applications instance. It automatically creates the ODI repository schemas, configures the topology, sets up the ODI agents, configures the ODI-ESS bridge, seeds the packaged ODI artifacts, applies modifications to the seeded artifacts, and creates internal users in IDM for external authentication. There is a separate process for applying patches or upgrading the environment to a newer release. Such patching or upgrade processes not only take care of importing the newer ODI artifacts but also kick off a CRM extensibility process that modifies the ODI artifacts in line with the Fusion Applications customizations.

External Authentication

There is a dedicated IDM configured with each Fusion Applications instance, and all Fusion Applications components are expected to have their users authenticated through this centralized IDM. For the Bulk Import use case, ODI is configured with external authentication, and internal users are created in IDM for communicating with the ODI agent and kicking off ODI jobs.

Enterprise Scheduler Service (ESS) - ODI Bridge

The ODI scenarios are kicked off through the ODI-ESS bridge, a separate library built for ODI-ESS integration that gets deployed along with the Enterprise Scheduler Service (ESS) in the Fusion Applications environment. It supports both synchronous and asynchronous modes of invocation for ODI jobs; in the asynchronous mode the session status is communicated back through callbacks to the ESS services. A topology editor is provided to manage the ESS callback service connectivity, exclusively for Fusion Applications use cases.

Note: Use of the ESS-ODI bridge is restricted to the Fusion Applications use case at the moment.

High Availability

The ODI agent is deployed on a WebLogic cluster in the Fusion Applications environment to take advantage of ODI's high availability capabilities. By default there is only one managed server in the WebLogic cluster created for ODI, but as the load increases, more managed servers can be added to the cluster to distribute the execution of ODI sessions among the ODI agent instances in the cluster.

Stay tuned for the last post on this topic coming soon.  This was part two in a series of three posts.  The initial post can be found here.

Friday Apr 11, 2014

Oracle Data Integrator (ODI) Usage in Fusion Applications (FA)

Written by Ayush Ganeriwal, Oracle Data Integrator Product Management

Oracle Data Integrator (ODI) is the bulk data transformation platform for Fusion Applications (FA). ODI is used by Fusion Customer Relationship Management (CRM), Fusion Human Capital Management (HCM), Fusion Supply Chain Management (SCM), Fusion Incentive Compensation (IC) and the Fusion Financials family of products, and many other Fusion Applications teams are following suit. Among all these product families, CRM is the biggest consumer of ODI, leveraging a breadth of ODI features and functionality, some of which were developed specifically for Fusion Applications use. The ODI features they utilize include the ODI SDK, high availability, external authentication, various out-of-the-box and customized Knowledge Modules, the ODI-ESS bridge, callbacks to ESS EJBs, auditing, open tools, and more. In this post we will first talk about the different Fusion Applications use cases at a high level and then take a closer look at the different integration points.

Figure 1 shows the data integration needs of a typical on-premise Fusion Applications deployment.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for data loads from any outside source. The bulk import process validates this data and loads it into the internal tables, which can then be used by Fusion Applications.
  2. Data Migration: Extracting data from external applications, legacy applications or any other data source and loading it into the Fusion Applications interface tables. ODI can be used for such data loads.
  3. Preparing Data Files: Converting data into Comma Separated Values (CSV) files that can be imported through Fusion Applications' file import wizard. ODI can be used to extract data into such CSV files (a minimal sketch of this kind of extract follows Figure 1 below).

Figure 1: Data Integration Needs in On-Premise Fusion Application
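As a rough illustration of the third need above, the sketch below unloads a query result into a CSV file with plain JDBC — the kind of extract that an ODI interface with a file target automates, adding proper quoting, headers and error handling. The source connection, query and file name are hypothetical.

    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    /** Sketch: extract legacy data into a CSV file that the Fusion Applications
     *  file import wizard could consume. All names are hypothetical, and a real
     *  extract would also quote and escape field values. */
    public class CsvExtractSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:oracle:thin:@//legacyhost:1521/LEGACY"; // hypothetical source
            try (Connection con = DriverManager.getConnection(url, "app", "secret");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT contact_id, first_name, last_name FROM contacts");
                 PrintWriter out = new PrintWriter(
                     Files.newBufferedWriter(Paths.get("contacts.csv")))) {
                out.println("CONTACT_ID,FIRST_NAME,LAST_NAME"); // header row
                while (rs.next()) {
                    out.printf("%s,%s,%s%n",
                        rs.getString(1), rs.getString(2), rs.getString(3));
                }
            }
        }
    }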

Figure 2 shows the on-demand or cloud environment requirements, which are slightly different as there is no direct connectivity available to the interface tables.

  1. Bulk Import: Fusion Applications exposes a set of interface tables as the entry point for any data load from any outside source. The bulk import process validates this data and then loads it into the internal tables, which can then be used by the application.
  2. Preparing Data Files: Converting data into CSV files that can be imported through Fusion Applications' file import wizard. ODI can be used for the creation of such CSV files.
  3. Uploading data files: The data files are uploaded to the tenant file repository through either Fusion Applications' file import page or the Oracle WebCenter Content Document Transfer Utility. The WebCenter utility is built using the ODI Open Tool framework, allowing the entire process to be orchestrated through an ODI package.
  4. Loading Interface Tables: The data files are then loaded into the interface tables so that the data can be consumed by the Bulk Import process. ODI is used for loading these interface tables.

Figure 2: Data Integration Needs for On-Demand Fusion Application

Stay tuned for more blog posts on this topic coming next week. This was part one in a series of three posts.


Friday Apr 04, 2014

Turning Big Data into Real-Time Action for a Greater Customer Experience

The power has shifted to us, the consumers. The digital revolution allows us to access a broader set of services and communicate without boundaries. Today we demand more and better choices in a competitive market, putting pressure on businesses to catch up with our expectations.

By offering a differentiated and improved experience to their customers, organizations see that they can drive revenue growth via higher loyalty and improved brand perception. Because technology is a key enabler for delivering a superb and consistent customer experience across all touchpoints, customer experience solutions have become a top priority for CIOs in recent years. Thanks to the availability of big data analytics, organizations can now analyze a broader variety of data, rather than a few basic data points, and gain deeper insight into their customers and operations. In turn, this deeper insight helps them align their business to provide a seamless customer experience.

In our digital, fast-paced world we produce large volumes of data with unprecedented velocity. This data contains perishable value that requires fast capture, analysis, and action in order to influence operations or the interaction with the customer. Otherwise the insight or action may become irrelevant, which significantly decreases its value for the customer and the organization. To extract the maximum value from highly dynamic and perishable data, you need to process it much faster and take timely action. This is the main premise behind Oracle's Fast Data solutions, which we have discussed in previous blogs and webcasts.

Real-time data integration and analytics play a crucial role in our new world of big and fast data. Organizations looking to leverage big data to create a greater customer experience need to evaluate the analytical foundation behind their customer-facing systems and the resulting interactions, and determine whether they can improve how and when they collect, analyze, and act on their ever-growing data assets.

In our next webcast, my colleague Pete Schutt from the Oracle Business Analytics team and I will discuss how organizations can create value for their customers using real-time customer analytics, and how to leverage big data to build a solid business analytics foundation using the latest features of Oracle Data Integration and Oracle Business Analytics. We will provide multiple customer examples for different solution architectures.

Join us on April 15th 10am PT/ 1pm ET by registering via the link below.

Turning Big Data into Real-Time Action for a Greater Customer Experience

Tuesday, April 15th 10am PT/ 1pm ET

Until we meet at this webcast, please review my related article on this topic published on DBTA earlier this year:

Tuesday Apr 01, 2014

Looking for Cutting-Edge Data Integration: 2014 Excellence Awards

It is nomination time!!!

This year's Oracle Fusion Middleware Excellence Awards will honor customers and partners who are creatively using various products across Oracle Fusion Middleware. Think you have something unique and innovative with one or a few of our Oracle Data Integration products?

We would love to hear from you! Please submit today.

The deadline for the nomination is June 20, 2014.

What you win:

  • An Oracle Fusion Middleware Innovation trophy
  • One free pass to Oracle OpenWorld 2014
  • Priority consideration for placement in Profit magazine, Oracle Magazine, or other Oracle publications and press releases
  • Oracle Fusion Middleware Innovation logo for inclusion on your own Website and/or press release

Let us reminisce a little…

For details on the 2013 Data Integration Winners:

Royal Bank of Scotland’s Market and International Banking and The Yalumba Wine Company, check out this blog post: 2013 Oracle Excellence Awards for Fusion Middleware Innovation… and the Winners for Data Integration are…

and for details on the 2012 Data Integration Winners:

Raymond James and Morrisons, check out this blog post: And the Winners of Fusion Middleware Innovation Awards in Data Integration are… 

Now to view the 2013 Winners (for all categories).

We hope to honor you!

Here's what you need to do: 

Click here to submit your nomination today.  And just a reminder: the deadline to submit a nomination is 5pm Pacific Time on June 20, 2014.

Thursday Mar 27, 2014

Interested in presenting and sharing your insights around Data Integration at OpenWorld?

You have been successful in making your organization run smoother. Faster. More cost-effectively. You have come up with the perfect solution to increase your staff retention, speed up your lead to sales pipeline, or minimize your supply management costs. We want to hear your story. Submit your proposal today and share your success at OpenWorld 2014.

Send us a proposal covering your Data Integration success. If it is selected, you will share your idea, experiences, and stories with Oracle customers, developers, and partners from around the world. You will also get a complimentary full pass to the conference.

Learn more about submitting your proposal.

Conference attendees want to hear it straight from you.

Don't wait—proposals must be submitted by April 15!

Tuesday Feb 18, 2014

Recap of Oracle GoldenGate 12c Webcast with Q&A

Simply amazing! That's how I would summarize last week's webcast for Oracle GoldenGate 12c. It was a very interactive event with hundreds of live attendees and hundreds of great questions. In the presentation part, my colleagues Doug Reid and Joe deBuzna went over the key new features of Oracle GoldenGate 12c, including:

  • Integrated Delivery for Oracle Database,
  • Coordinated Delivery for non-Oracle databases,
  • Support for Oracle Database 12c multitenant architecture,
  • Enhanced high availability via integration with Oracle Data Guard Fast-Start Failover,
  • Expanded heterogeneity, i.e. support for new databases and operating systems,
  • Improved security,
  • Low-downtime database migration solutions for Oracle E-Business Suite,
  • Integration with Oracle Coherence.

We also had a nice long and live Q&A section. In the previous Oracle GoldenGate webcasts, we could not respond to all audience questions in a 10-15 minute timeframe at the end of the presentation. This time we kept the presentation part short and left more than 30 minutes for Q&A. To our surprise, we could not answer even half of the questions we received. 

If you missed this great webcast discussing the new features of Oracle GoldenGate 12c, with more than 30 minutes of Q&A with GoldenGate Product Management, you can still watch it on demand via the link below.

On Demand Webcast: Introducing Oracle GoldenGate 12c: Extreme Performance Simplified

In this blog post I would like to provide brief answers from our PM team for some of the questions that we were not able to answer during the live webcast.

1) Does Oracle GoldenGate replicate DDL statements or DML for Oracle Database?

    Oracle GoldenGate replicates DML and DDL operations for Oracle Database and Teradata.

2) Where do we get more info on how to setup integration with Data Guard Fast-Start Failover (FSFO)?

     Please see the following blog posts or documents on My Oracle Support:

Best Practice - Oracle GoldenGate and Oracle Data Guard - Switchover/Fail-over Operations for GoldenGate [My Oracle Support Article ID 1322547.1]

Best Practice - Oracle GoldenGate 11gR2 Integrated Extract and Oracle Data Guard - Switchover/Fail-over Operations [My Oracle Support Article ID 1436913.1]

3) Does GoldenGate support SQL Server 2012 extraction? In the past only apply was supported.

Yes, starting with the new 12c release, GoldenGate supports capture from SQL Server 2012 in addition to its delivery capabilities.

4) Which RDBMS does GoldenGate 12c support?

GoldenGate supports all major RDBMSs. For a full list of supported platforms, please see the Oracle GoldenGate certification matrix.

5) Could you please provide some more details on Integrated Delivery and its dynamic parallel threads on the target side?

Please check out the white papers in the Oracle GoldenGate 12c resource kit for more details on the new features and on how Oracle GoldenGate 12c works with Oracle Database.

6) What is the best way to sync partial data (based on some selection criterion) from a table between databases?

 Please refer to the article: How To Resync A Single Table With Minimum Impact To Other Tables' Replication? [Article ID 966211.1]

7) How can GoldenGate be better than database trigger to push data into custom tables?

Triggers can cause high CPU overhead, in some cases almost double compared to reading from redo or transaction logs. In addition, they are intrusive to the application and cause management overhead as the application changes. Oracle GoldenGate's log-based change data capture is not only low-impact in terms of CPU utilization, but also non-intrusive to the application, with low maintenance requirements.

8) Are there any customers in the manufacturing industry using GoldenGate and for which application?

We have many references in manufacturing. In fact, SolarWorld USA was our guest speaker in the executive video webcast last November. You can watch the interview here. RIM Blackberry uses Oracle GoldenGate for multi-master replication between its global manufacturing systems. Here is another manufacturing customer story from AkzoNobel.

9) Does GoldenGate 12c support compressed objects for replication? Also, does it support BLOB/CLOB columns?

Yes, GoldenGate 12c and GoldenGate 11gR2 both support compressed objects. GoldenGate has supported BLOB/CLOB columns since version 10.

10) Is Oracle Database 11.2.0.4 mandatory to use GoldenGate 12c Integrated Delivery? Not earlier versions?

Yes. To use GoldenGate 12c's Integrated Delivery, Oracle Database 11.2.0.4 or above is required for the target environment.

11) We have had an Oracle Streams implementation for more than 5 years. We would like to migrate to GoldenGate; however, older versions of GoldenGate did not support filtering individual transactions. Is this supported in GoldenGate 12c?

      Yes, it is supported in GoldenGate 12c.


In future blog posts I will continue to provide answers to common questions we received in the webcast. In the meantime, I highly recommend watching the Introducing Oracle GoldenGate 12c: Extreme Performance Simplified webcast on demand.

Wednesday Jan 29, 2014

ODI 12.1.2 Demo on the Oracle BigDataLite Virtual Machine

Oracle's big data team has just announced the Oracle BigDataLite Virtual Machine, a pre-built environment reflecting the core software of Oracle's Big Data Appliance 2.4. BigDataLite is a VirtualBox VM that contains a fully configured Cloudera Hadoop distribution (CDH 4.5), Oracle Database 12c, Oracle's Big Data Connectors, Oracle Data Integrator 12.1.2, and other software.

You can use this environment to see ODI 12c in action integrating big data with Oracle Database using ODI's declarative graphical design, efficient E-LT loads, and Knowledge Modules designed to optimize big data integration.

The sample data contained in BigDataLite represents the fictional Oracle MoviePlex on-line movie streaming company. The ODI sample performs the following two steps:

  • Pre-process application logs within Hadoop: All user activity on the MoviePlex web site is gathered on HDFS in Avro format. ODI reads these logs through Hive and processes the activity data by aggregating, filtering, joining and unioning the records in an ODI flow-based mapping. All processing is performed inside Hive map-reduce jobs controlled by ODI, and the resulting data is stored in a staging table within Hive (a sketch of this kind of Hive-side aggregation appears below).
  • Loading user activity data from Hadoop into Oracle: The pre-processed data is loaded from Hadoop into an Oracle 12c database, where it can be used as the basis for Business Intelligence reports. ODI uses the Oracle Loader for Hadoop (OLH) connector, which executes distributed map-reduce processes to load data in parallel from Hadoop into Oracle. ODI transparently configures and invokes this connector through the Hive-to-Oracle Knowledge Module.

Both steps are orchestrated and executed through an ODI Package workflow. 
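To give a feel for the Hive-side processing in the first step, here is a minimal sketch that runs an aggregation into a staging table over the HiveServer2 JDBC driver — similar in spirit to the Hive jobs the ODI mapping generates and controls. The host, database, and table and column names are made up for illustration and do not reflect the actual MoviePlex schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    /** Sketch: aggregate raw activity logs into a Hive staging table through
     *  HiveServer2 JDBC, akin to the Hive map-reduce jobs ODI drives.
     *  Host, database and table/column names are hypothetical. */
    public class HiveStagingSketch {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver"); // HiveServer2 driver
            try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "oracle", "");
                 Statement st = con.createStatement()) {
                // One set-based step: filter, aggregate and overwrite the staging table.
                st.execute(
                    "INSERT OVERWRITE TABLE movie_activity_stg " +
                    "SELECT cust_id, movie_id, COUNT(*) AS views " +
                    "FROM movie_activity_log " +      // Avro-backed source exposed via Hive
                    "WHERE activity = 'VIEW' " +
                    "GROUP BY cust_id, movie_id");
            }
        }
    }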

Demo Instructions

Please follow these steps to execute the ODI demo in BigDataLite:

  1. Download and install BigDataLite, following the instructions in the Deployment Guide on the download page.
  2. Start the VM and log in as user oracle, password welcome1.
  3. Start the Oracle Database 12c by double-clicking the icon on the desktop.


  4. Start ODI 12.1.2 by clicking the icon on the toolbar.


  5. Press Connect To Repository... on the ODI Studio window. 


  6. Press OK in the ODI Login dialog.


  7. Switch to the Designer tab, open the Projects accordion and expand the projects tree to Movie > First Folder > Mappings. Double-click on the mapping Transform Hive Avro to Hive Staging.


  8. Review the mapping that transforms source Avro data by aggregating, joining, and unioning data within Hive. You can also review the mapping Load Hive Staging to Oracle the same way. 


  9. In the Projects accordion expand the projects tree to Movie > First Folder > Packages. Double-click on the package Process Movie Data.


  10. The Package workflow for Process Movie Data opens. You can review the package.


  11. Press the Run icon on the toolbar. Press OK for the Run and Information: Session started dialogs. 




  12. You can follow the progress of the load by switching to the Operator tab and expanding All Executions and the upmost Process Movie Data entry. You can refresh the display by pressing the refresh button or setting Auto-Refresh. 


  13. Depending on the environment, the load can take 5-15 minutes. When the load is complete, the execution will show all green check marks. You can traverse the operator log and double-click entries to explore statistics and executed commands.

This demo shows only some of ODI's big data capabilities. You can find more information at:


Wednesday Jan 22, 2014

Deep Dive Into Oracle Data Integrator Changed Data Capture Leveraging Oracle GoldenGate

Check out the blog post below from Christophe – first covering the details of Oracle Data Integrator's (ODI) Journalizing Knowledge Modules (JKMs), then taking a deeper dive into the particulars of the seamless out-of-the-box integration between Oracle Data Integrator (ODI) and Oracle GoldenGate.

http://www.ateam-oracle.com/understanding-the-odi-jkms-and-how-they-work-with-oracle-goldengate/

Happy reading!

Monday Jan 06, 2014

Welcome Oracle Management Pack for Oracle Data Integrator! Let’s maximize the value of your Oracle Data Integrator investments!

To help you make the most of Oracle Data Integrator, and to deliver a superior ownership experience in an effort to minimize systems management costs, Oracle recently released Oracle Management Pack for Oracle Data Integrator. This new product leverages Oracle Enterprise Manager Cloud Control's advanced management capabilities to provide an integrated and top-down solution for your Oracle Data Integrator environments. Management Pack for Oracle Data Integrator supports both 11g (11.1.1.7.0 and higher) and 12c versions of Oracle Data Integrator.

Management Pack for Oracle Data Integrator provides a consolidated view of your entire Oracle Data Integrator infrastructure. This enables users to monitor and manage all their components centrally from Oracle Enterprise Manager Cloud Control.

Performance Monitoring and Management

Management Pack for Oracle Data Integrator streamlines the monitoring of the health, performance, and availability of all components of an Oracle Data Integrator environment – this includes Master and Work Repositories, Standalone and JEE agents, as well as source and target data servers.