Monday Apr 06, 2015

Announcing Oracle Data Integrator for Big Data

We are proud to announce the availability of Oracle Data Integrator for Big Data. This release is the latest in a series of advanced Big Data updates and features that Oracle Data Integration is rolling out to help customers take their Hadoop projects to the next level.

Increasing Big Data Heterogeneity and Transparency

This release brings significant additions in heterogeneity and governance for customers. Highlights include:

  • Support for Apache Spark,
  • Support for Apache Pig, and
  • Orchestration using Oozie.

Click here for a detailed list of what is new in Oracle Data Integrator (ODI).

Oracle Data Integrator for Big Data helps transform and enrich data within the big data reservoir/data lake without users having to learn the languages needed to manipulate that data. ODI for Big Data generates native code that runs directly on the underlying Hadoop platform without requiring any additional agents. ODI separates the design interface, where logic is built, from the physical implementation layer, where the code runs. This allows ODI users to build business and data mappings without having to learn HiveQL, Pig Latin or MapReduce.
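To give a sense of what this means in practice, here is a small, purely illustrative PySpark sketch of the kind of transformation a mapping designer would otherwise hand-write; the table and column names (web_logs, movie_id, activity) are hypothetical, and this is not the code ODI actually emits.

```python
# Illustrative only: the kind of Spark code a user would otherwise write by hand,
# and which an ODI mapping is designed to generate on their behalf.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("movie_activity_rollup").getOrCreate()

logs = spark.table("web_logs")                      # raw activity records in Hive (hypothetical)
rollup = (
    logs.filter(F.col("activity") == "play")        # filter to one activity type
        .groupBy("movie_id")                        # aggregate per movie
        .agg(F.count("*").alias("play_count"))
)
rollup.write.mode("overwrite").saveAsTable("movie_activity_agg")
```

In ODI the same logic would be expressed declaratively in a mapping, and the tool would generate and run the equivalent native code on the cluster.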

Oracle Data Integrator for Big Data Webcast

We invite you to join us on the 30th of April for our webcast to learn more about Oracle Data Integrator for Big Data and to get your questions about Big Data Integration answered. We will discuss how the newly announced Oracle Data Integrator for Big Data:

  • Provides advanced scale and expanded heterogeneity for big data projects,
  • Uniquely complements Hadoop’s strengths to accelerate decision making, and
  • Ensures sub-second latency with Oracle GoldenGate for Big Data.


Friday Feb 27, 2015

How to Future Proof Your Big Data Investments - An Oracle webcast with Cloudera

Cutting through the Big Data Clutter

The Big Data world is changing rapidly, giving rise to new standards, languages and architectures. Customers are unclear about which Big Data technology will benefit their business the most, and how to future-proof their Big Data investments.

This webcast helps customers sift through the changing Big Data architectures and build their own resilient Big Data platform. Oracle and Cloudera experts discuss how enterprise platforms need to provide more flexibility to handle real-time and in-memory computation for Big Data.



The speakers introduce the 4th-generation architecture for Big Data, which allows expanded and critical capabilities to exist alongside each other. Customers can now see higher returns on their Big Data investment by ingesting real-time data and improving data transformation for their Big Data analytics solutions. By choosing Oracle Data Integrator, Oracle GoldenGate and Oracle Enterprise Metadata Management, customers gain the ability to keep pace with changing Big Data technologies like Spark, Oozie, Pig and Flume without losing productivity, and they reduce risk through robust Big Data governance.

In this webcast we also discuss the newly announced Oracle GoldenGate for Big Data. With this release, customers can stream real-time data from their heterogeneous production systems into Hadoop and other Big Data systems like Apache Hive, HBase and Flume. This brings real-time capabilities to customers’ Big Data architectures, allowing them to enhance their big data analytics and ensure their Big Data reservoirs stay up to date with production systems.

Click here to mark your calendars and join us for the webcast to understand Big Data Integration and ensure that you are investing in the right Big Data Integration solutions.

Tuesday Dec 09, 2014

Big Data Governance - Balancing Big Risks and Bigger Profits

To me, nothing exemplifies the real value that Big Data brings to life better than the role it played in the last edition of the FIFA World Cup. Stephen Hawking predicted that England’s chance of winning a game drops by 60 percent every time the temperature increases by 5ºC. Moreover, he found that England plays better in stadiums situated 500 meters above sea level, and performs better if the game kicks off later than 3 PM local time. In short, England’s soccer team struggles to cope with the conditions in hot and humid countries.

We have all heard, meditated and opined about the value of Big Data, the panacea for all problems. And it is true: Big Data has started delivering real profits and wins to businesses. But as with any data management program, the benefits should be maximized while striving to minimize potential risks and costs.

Customer Data is Especially Combustible Data

The biggest lift for businesses using Big Data comes from mining customer data. By storing and analyzing seemingly disparate customer attributes and running analytic models over the whole data set (data sampling is dying a painful demise), businesses are able to accurately predict buying patterns and customer preferences and to create products and services that cater to today’s demanding consumers. But this veritable mine of customer information is combustible. By that I mean that a small leak is enough to undo any benefits hitherto extracted, through the ensuing blowback of financial restitution, regulatory restrictions and, most important of all, immense reputational damage. This is why Big Data should always be well governed. Data Governance is an aspect of data security that helps safeguard Big Data in business enterprises.

Big Data Governance

Big Data Governance is but a part (albeit a very important part) of a larger Big Data security strategy. Big Data security should involve considerations around the efficient and economical storage, retrieval and consumption of data. It should also deal with backups, disaster recovery and other traditional considerations.

When properly implemented, a good governance program serves as a crystal ball into the data flows within the organization. It answers questions about how safe the data is and who can and should be able to lay their hands on it, and it proactively prevents data leakage and misuse. Because when dealing with big reservoirs of data, small leaks can go unnoticed.

Friday Oct 10, 2014

Oracle Data Integrator Webcast Archives

Check out the recorded webcasts on Oracle Data Integrator! 

Each month the Product Management Team hosts a themed session for your viewing pleasure. Recent topics include Oracle Data Integrator (ODI) and Big Data, Oracle Data Integrator (ODI) and Oracle GoldenGate Integration, BigData Lite, the Oracle Warehouse Builder (OWB) Migration Utility, and the Management Pack for Oracle Data Integrator (ODI), along with various other topics focused on Oracle Data Integrator (ODI) 12c.

You can find the Oracle Data Integrator (ODI) Webcast Archives here.

Take a look at the individual sessions:

The webcasts are publicized on the ODI OTN Forum if you want to view them live.  You will find the announcement at the top of the page, with the title and details for the upcoming webcast.

Thank you – and happy listening!

Monday Sep 15, 2014

You are invited! See the Power of Innovation with Oracle Fusion Middleware

Are you going to be at Oracle OpenWorld? If so, don't miss the opportunity to meet the most innovative Oracle Fusion Middleware projects of 2014. Winners of the 2014 Oracle Excellence Awards for Fusion Middleware Innovation will be announced in a special Awards Ceremony on Tuesday, September 30th during Oracle OpenWorld.

Oracle Fusion Middleware Innovation Awards honor customers for their cutting-edge use of Oracle Fusion Middleware technologies to solve unique business challenges or create business value. Winners are selected based on the uniqueness of their business case, business benefits, level of impact relative to the size of the organization, complexity and magnitude of implementation, and the originality of architecture. Data Integration is among the nine award categories recognized this year, along with Oracle Exalogic Elastic Cloud, Oracle Cloud Application Foundation, Oracle Service Oriented Architecture, Oracle Business Process Management, Oracle WebCenter, Oracle Identity Management, Oracle Development Tools and Framework, and Big Data and Business Analytics.

If you are planning to attend Oracle OpenWorld in San Francisco or plan to be in the area during Oracle OpenWorld, we hope you can join us, and bring back to your organization real-life examples of Fusion Middleware in action.

   Oracle Excellence Awards Ceremony: Oracle Fusion Middleware: Meet This Year’s Most Impressive Innovators (Session ID: CON7029)

   When: Tuesday September 30, 2014

   Time: Champagne Reception 4:30 pm, Ceremony 5-5:45 pm PT

   Where: Yerba Buena Center for the Arts, YBCA Theater (next to Moscone North) 700 Howard St., San Francisco, CA, 94103


To learn more about last year’s Data Integration winners please read our blog post: 2013 Oracle Excellence Awards for Fusion Middleware Innovation… and the Winners for Data Integration are…

To attend this Award Ceremony, Oracle OpenWorld Badges are required. You can register for this session through the Schedule Builder on the Oracle OpenWorld website. If you are not attending the conference, but will be in the area and would like to join the celebration – please contact us here.

We hope to see you there!

Monday Aug 11, 2014

Oracle is a Leader in the Gartner 2014 Magic Quadrant for Data Integration Tools

Oracle maintains its position as a leader in the 2014 “Magic Quadrant for Data Integration Tools” report. This year’s report expands into the key trends of Big Data and Cloud in addition to classic data integration capabilities and visions available in the market.

On the importance of Data Integration, Gartner states, “Data integration is central to enterprises' information infrastructure. Enterprises pursuing the frictionless sharing of data are increasingly favoring technology tools that are flexible in regard to time-to-value demands, integration patterns, optimization for cost and delivery models, and synergies with information and application infrastructures.” Of the vendors it also notes, “Evolving their relevance and competitive positioning requires vendors to extend their vision and deepen their capability to harness market inertia and broaden applicability of data integration offerings. This is in line with buyer expectation for optimal functions, performance and scalability in data integration tools, so that they operate well with the same vendor's technology stack and, increasingly, interoperate across related information and application infrastructures.”

More than 4,500 organizations spanning nearly every industry are using Oracle Data Integration to cost-effectively manage and create value from ever-growing streams of structured and unstructured data, and address emerging deployment styles. Oracle continues to invest in its Data Integration offerings to help customers expand their business by:

  • Embracing and leading in Cloud Computing, Big Data and Business Analytics initiatives
  • Delivering real-time data integration solutions for maximum availability
  • Promoting cross-functional interoperability between heterogeneous systems and applications
  • Providing a complete and integrated choice of offerings across a variety of styles and deployments, including ETL/ELT, data replication, data quality, metadata management, data virtualization and data governance.

Download the Report Now

About the Magic Quadrant: Gartner does not endorse any vendor, product, or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.  

Sunday Jul 13, 2014

New Big Data Features in ODI 12.1.3

Oracle Data Integrator (ODI) 12.1.3 extends its Hadoop support with a number of exciting new capabilities. The new features include:

  • Loading data between relational databases and Hadoop using Sqoop
  • Support for Apache HBase databases
  • Support for Hive append functionality

With these new additions, ODI provides full connectivity to load, transform, and unload data in a Big Data environment.

The diagram below shows all ODI Hadoop knowledge modules with KMs added in ODI 12.1.3 in red. 

Sqoop support

Apache Sqoop is designed for efficiently transferring bulk data between Hadoop and relational databases such as Oracle, MySQL, Teradata, DB2, and others. Sqoop operates by creating multiple parallel MapReduce processes across a Hadoop cluster, connecting to an external database, and transferring data from or to Hadoop storage in a partitioned fashion. Data can be stored in Hadoop using HDFS, Hive, or HBase. ODI adds two knowledge modules: IKM SQL to Hive-HBase-File (SQOOP) and IKM File-Hive to SQL (SQOOP).
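For a feel of what such a partitioned transfer looks like outside of ODI, here is a minimal sketch of a parallel Sqoop import driven from Python; the JDBC URL, credentials, table and directory are hypothetical placeholders, and this is not the exact command the Sqoop KMs generate.

```python
# Hypothetical example of a parallel Sqoop import, invoked from Python.
# The connection string, credentials, table and target directory are placeholders.
import subprocess

sqoop_import = [
    "sqoop", "import",
    "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB",
    "--username", "movie_app",
    "--password-file", "/user/oracle/.sqoop.pwd",
    "--table", "MOVIE_SALES",
    "--target-dir", "/user/oracle/movie_sales",   # HDFS landing directory
    "--num-mappers", "4",                         # four parallel map tasks
]
subprocess.run(sqoop_import, check=True)
```

The --num-mappers setting is what gives Sqoop its partitioned, parallel character; ODI exposes the equivalent tuning through IKM properties.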

Loading from and to Sqoop in ODI is straightforward: create a mapping with the database source and Hadoop target (or vice versa) and apply any necessary transformation expressions.

In the physical design of the mapping, make sure to set the LKM of the target to LKM SQL Multi-Connect.GLOBAL and choose a Sqoop IKM, such as IKM SQL to Hive-HBase-File (SQOOP). Change the MapReduce Output Directory IKM property MAPRED_OUTPUT_BASE_DIR to an appropriate HDFS directory. Review all other properties and tune as necessary. Using these simple steps you should be able to perform a quick Sqoop load.

For more information please review the great ODI Sqoop article from Benjamin Perez-Goytia, or read the ODI 12.1.3 documentation about Sqoop.

HBase support

ODI adds support for HBase as a source and target. HBase metadata can be reverse-engineered using the RKM HBase knowledge module, and HBase can be used as source and target of a Hive transformation using LKM HBase to Hive and IKM Hive to HBase. Sqoop KMs also support HBase as a target for loads from a database. 

For more information please read the ODI 12.1.3 documentation about HBase.

Hive Append support

Prior to Hive 0.8 there was no direct way to append data to an existing table. Earlier Hive KMs emulated this by renaming the existing table and concatenating old and new data into a new table with the prior name. This emulated append operation caused major data movement, particularly when the target table was large.

Starting with version 0.8, Hive supports appending. All ODI 12.1.3 Hive KMs have been updated to use the append capability by default, but they provide backward compatibility with the old behavior through the KM property HIVE_COMPATIBLE=0.7.
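To make the difference concrete, here is an illustrative sketch (driven through the hive command-line client from Python) contrasting the native append with the pre-0.8 emulation; the sales_target and sales_staging tables are hypothetical, and this is not the code the KMs themselves generate.

```python
# Hypothetical illustration of appending to a Hive table, run via the hive CLI.
import subprocess

def run_hive(sql):
    """Execute one or more HiveQL statements with the command-line client."""
    subprocess.run(["hive", "-e", sql], check=True)

# Hive 0.8 and later (the KM default): a true append, no rewrite of existing rows.
run_hive("INSERT INTO TABLE sales_target SELECT * FROM sales_staging")

# Pre-0.8 emulation (roughly what HIVE_COMPATIBLE=0.7 falls back to):
# rebuild the target by unioning old and new data, which moves every existing row.
run_hive("""
  ALTER TABLE sales_target RENAME TO sales_target_old;
  CREATE TABLE sales_target AS
    SELECT * FROM (
      SELECT * FROM sales_target_old
      UNION ALL
      SELECT * FROM sales_staging
    ) unioned;
  DROP TABLE sales_target_old;
""")
```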

Conclusion

ODI 12.1.3 provides an optimal and easy-to-use way to perform data integration in a Big Data environment. ODI utilizes the processing power of the data storage and processing environment rather than relying on a proprietary transformation engine. This core "ELT" philosophy is a perfect match for a Hadoop environment, where ODI provides unique value through a native and easy-to-use data integration environment.

Thursday Jun 26, 2014

Announcing ODI 12.1.3 – Building on Big Data Capabilities.

Continuing the constant innovation that goes into each Oracle Data Integrator release, this release focuses on supporting Big Data standards that we have seen sweeping the market.

Big Data Heterogeneity

Recognizing the need to integrate data of various forms and from various sources without losing productivity, ODI delivers the following capabilities. Having been chosen as Oracle's strategic Extract, Load and Transform (EL-T) standard (ODI is the de facto standard for Oracle applications), it is natural that the product's latest release focuses on some of the most impactful features for our customers. Some of the new Big Data features include:

        a. JSON support,

        b. Hadoop SQOOP integration and

        c. Hadoop HBase Integration.

ODI now supports data loading from relational databases to HDFS, Hive and HBase, and from HDFS and Hive into relational databases.

All the data movement features are further enhanced through the use of ODI's Knowledge Modules. Knowledge Modules are out-of-the-box functional components that can be easily customized and reused, getting your project off the ground quicker and delivering productivity gains. As an example, there are Knowledge Modules that write to Hive and help append data to existing data files.

Enhanced Security and Performance:

All the focus on Big Data does not mean a lesser focus on classic product features. ODI 12.1.3 comes with upgraded, Federal Information Processing Standard (FIPS) compliant security for all its passwords and sensitive information.

Performance is also improved in two ways: ODI can now load target tables using multiple parallel connections, and improved control over data loading allows users to customize the concurrency of the jobs that are run.
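As a rough conceptual sketch of the parallel-load idea (not how ODI implements it internally), the snippet below splits source rows into independent slices and loads each slice over its own connection; sqlite3 stands in for the real target database, and the sales_target table is hypothetical.

```python
# Conceptual sketch: load a target table over several parallel connections.
from concurrent.futures import ThreadPoolExecutor
import sqlite3  # stand-in for the real target database driver

def load_partition(rows):
    """Load one slice of the data over its own database connection."""
    conn = sqlite3.connect("target.db")                 # one connection per worker
    with conn:                                          # commits on success
        conn.executemany(
            "INSERT INTO sales_target (id, amount) VALUES (?, ?)", rows)
    conn.close()

# One-time setup of the demo target table.
with sqlite3.connect("target.db") as setup:
    setup.execute("CREATE TABLE IF NOT EXISTS sales_target (id INTEGER, amount REAL)")

# Split the source rows into independent slices, one per parallel connection.
source_rows = [(i, i * 10.0) for i in range(1000)]
partitions = [source_rows[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:         # degree of parallelism
    list(pool.map(load_partition, partitions))
```

In ODI the degree of parallelism and the concurrency of jobs are controlled through mapping and agent settings rather than hand-written code like this.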

To learn more about ODI please visit the ODI home page. For a more in-depth technical and business scoop, the Resource Kit is an excellent starting point. You can find more customer validation here.

Monday May 12, 2014

Check it out – BI Apps 11.1.1.8.1 is now available!

As of May 8, 2014, Oracle Business Intelligence (BI) Applications 11.1.1.8.1 is available on the Oracle Software Delivery Cloud (eDelivery), and on the Oracle BI Applications OTN page. This is the second major release on the 11g code line leveraging the power of Oracle Data Integrator (ODI), and certified with the latest version of Oracle BI Foundation 11.1.1.7. For more details on this release and what’s new – check it out!

Monday Feb 24, 2014

Highlighting Oracle Data Integrator 12c (ODI12c)

During the last two months of 2013 we highlighted several of ODI12c's features with a full-length blog post for each. This was so popular that we now bring you a one-stop shop where you can browse the various entries at your convenience. Even if we say it ourselves, this is a great page to bookmark if you are using or thinking of using ODI12c.

  • Kicking off the ODI12c Blog Series: Talks about ODI12c themes and features at a high level, shedding light on the new release's focus areas.
  • ODI 12c's Mapping Designer - Combining Flow Based and Expression Based Mapping: Talks about ODI's new declarative designer combined with the familiar flow-based designer.
  • Big Data Matters with ODI12c: Talks about ODI12c enterprise solutions for the movement, translation and transformation of information and data heterogeneously and in Big Data environments.
  • ODI 12c - Aggregating Data: Looks at the aggregation component that was introduced in ODI 12c for composing data with relational-like operations such as sum, average and so forth.
  • ODI 12c - Parallel Table Load: Looks at the ODI 12c capability of parallel table load from the perspectives of the mapping developer and the knowledge module developer - two quite different viewpoints.
  • In-Session Parallelism in ODI12c: Discusses the new in-session parallelism introduced in the ODI12c release - the intelligence to concurrently execute the parts of a mapping that are independent of each other.
  • ODI 12c - Mapping SDK the ins and outs: Talks about the ODI 12c SDK, which provides a mechanism to accelerate data integration development using patterns and the APIs in the SDK.
  • ODI 12c - XML improvements: Explains ODI support for advanced XML Schema constructs including union, list, substitution groups, mixed content, and annotations.
  • ODI 12c - Components and LKMs/IKMs: Illustrates capabilities of ODI 12c's knowledge module framework in combination with the new component-based mapper.
  • Welcome Oracle Management Pack for Oracle Data Integrator! Let’s maximize the value of your Oracle Data Integrator investments!: To help you make the most of Oracle Data Integrator, and to deliver a superior ownership experience while minimizing systems management costs, Oracle recently released Oracle Management Pack for Oracle Data Integrator.
  • ODI 12c - Mapping SDK Auto Mapping: If you want to properly leverage the 12c release, the new mapping designer and SDK are the way forward.
  • ODI 12c - Table Functions, Parallel Unload to File and More: Helps you integrate an existing table function implementation into a flow.

And below is a list of "how to" and "hands on" blog posts to help you get started with ODI 12c.

  • ODI 12c - Getting up and running fast: A quick A-B-C to show you how to quickly get up and running with ODI 12c, from getting the software to creating a repository via wizard or the command line, then installing an agent for running load plans and the like.
  • Time to Get Started with Oracle Data Integrator 12c!: We would like to highlight for you a great place to begin your journey with ODI. Here you will find the Getting Started section for ODI.
  • ODI 12c - Slowly Changing Dimensions: Helps you set up a slowly changing dimension load in ODI 12c, covering everything from defining the metadata on the datastore to loading the data.
  • ODI 12c - Temporal Data Loading: The temporal validity feature in Oracle Database 12c is a great fit for any time-based data. If you are thinking of dimensional data that varies over time, the temporal validity capabilities of 12c are well worth checking out.
  • ODI 12.1.2 Demo on the Oracle BigDataLite Virtual Machine: Oracle's big data team has just announced the Oracle BigDataLite Virtual Machine, a pre-built environment to get you started on an environment reflecting the core software of Oracle's Big Data Appliance 2.4. BigDataLite is a VirtualBox VM that contains a fully configured Cloudera Hadoop distribution CDH 4.5, an Oracle DB 12c, Oracle's Big Data Connectors, Oracle Data Integrator 12.1.2, and other software.
  • Webcast - Oracle Data Integrator 12c and Oracle Warehouse Builder: If you missed the recent Oracle Data Integrator 12c and Oracle Warehouse Builder live webcast, you can catch up on the event and share your feedback here. We discuss customer examples, ODI12c new features, Big Data compatibility, the Oracle Warehouse Builder Migration Utility and Support, and a live Q&A, among other topics.

 For more information on Oracle Data Integrator visit the ODI Resource Center.

Thursday Jan 30, 2014

Webcast - Oracle Data Integrator 12c and Oracle Warehouse Builder

If you missed the recent Oracle Data Integrator 12c and Oracle Warehouse Builder live webcast, do not worry. You can still catch up on the event and share your feedback with us here. We discuss customer examples, ODI12c new features, Big Data compatibility, the Oracle Warehouse Builder Migration Utility and Support, and a live Q&A, among other topics.

Register to listen. 

ODI12c and OWB Webcast.

Here are some questions from the audience that were answered following the webcast.

Is the management pack free to ODI customers?

While the Management Pack for ODI works to consolidate and manage your ODI infrastructure, it is a separately licensed product.

You can learn more about the Management Pack for ODI in this Data Sheet.

Also visit the Oracle Management Pack for ODI homepage for more details.

Where can we download ODI 12c?

ODI 12c can be downloaded in various ways, including Virtual Machines and complete downloads from the Oracle Technology Network.

This page also gives the latest on training, patches and other happenings around Oracle Data Integrator, so keep it bookmarked to follow the latest on ODI.

Thank you.

Happy to help; please let me know if you need any other assistance. Thank you for your participation.

Could you expand on how the time reduction is effected through parallelism?

As you might have suspected, the time reduction depends on various factors, including database tuning, network capabilities and code efficiency.

ODI’s parallelism offers parallel threading and intelligent identification of parallelizable components.

This blog on the in-session parallelism functionality explains the workings and logic behind the process.

What has Oracle done to ensure any applications written today using existing KMs will work going forward for any ODI upgrades?

While we do our best to ensure consistency of Knowledge Modules across versions, any Knowledge Modules you have changed would require additional testing to make sure everything works as it should.

Customization allows your Knowledge Modules to work most efficiently, but it also means deviating from the out-of-the-box KMs provided. Unfortunately there is no simple answer other than further testing.

Could you tell more about integrations to Workday? (real-time)

Workday integration is currently provided through some of our partners, mostly with customized Knowledge Modules. For help in contacting such a partner, please reach out to your Oracle Data Integration Sales Manager.

Does Oracle Business Intelligence Application 11.1.1.7.1 offer a choice to use ODI or a third party DAC?

Oracle BI Apps version 11.1.1.7.1, which was released in May 2013, was built on Oracle Data Integrator and functions fully only with ODI.

Is ODI 12c Generally Available?

Yes, ODI 12c has been generally available since October of 2013. For more information on the product please go to the ODI homepage.


You can also download the product here.

Can you provide high-level differences between Oracle Data Integrator and GoldenGate?

At a high level, ODI is for bulk data movement and transformation, often known as ETL/ELT.

OGG is a high-volume, low-latency replication/synchronization product. For more information, please view the various datasheets and whitepapers on the two products, which can be found here.

What versions of E-Business Suite does ODI 12c require?

The certification matrix gives an overview of the compatible technologies and standards for ODI. 

You can also find more information in the E-Business Suite documentation.

Do you support connection, extraction and upload from and into QuickBase in ODI 12c?

For Intuit QuickBase there is an ODBC driver which might work in conjunction with the generic SQL KMs shipped with ODI. 

More information can be found here: http://qunect.com/. 

Please note that QuickBase is not certified with ODI. For a full set of certified products refer to the certification matrix.
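As a purely illustrative sketch, reading QuickBase data through such an ODBC driver from a Python script might look like the following; the DSN name and table are hypothetical and assume the qunect.com driver has been configured.

```python
# Hypothetical sketch of querying QuickBase over ODBC (driver from qunect.com).
# The DSN name and the projects table are placeholders, not a certified setup.
import pyodbc

conn = pyodbc.connect("DSN=QuickBase")          # DSN configured for the ODBC driver
cursor = conn.cursor()
for row in cursor.execute("SELECT record_id, status FROM projects"):
    print(row.record_id, row.status)
conn.close()
```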

Is it correct that there is no Oracle Warehouse Builder in 12c?

That is correct. From a packaging perspective, OWB is not shipped with Oracle DB 12c; however, previous OWB releases can connect to Oracle DB 12c.

The Statement of Direction around ODI and OWB explains the reasons and direction forward for both the products.

Great Webinar - would love to see more of these

Thank you! The feedback from our audiences has been great and we shall definitely find ways to bring more ODI content to you. Meanwhile, we invite you to follow our blogs and Facebook channels.

Wednesday Jan 29, 2014

ODI 12.1.2 Demo on the Oracle BigDataLite Virtual Machine

Update 4/14/2015: This blog post is outdated; instructions for the current Big Data Lite 4.1 are located here.

Oracle's big data team has just announced the Oracle BigDataLite Virtual Machine, a pre-built environment to get you started on an environment reflecting the core software of Oracle's Big Data Appliance 2.4. BigDataLite is a VirtualBox VM that contains a fully configured Cloudera Hadoop distribution CDH 4.5, an Oracle DB 12c, Oracle's Big Data Connectors, Oracle Data Integrator 12.1.2, and other software.

You can use this environment to see ODI 12c in action integrating big data with Oracle DB using ODI's declarative graphical design, efficient EL-T loads, and Knowledge Modules designed to optimize big data integration. 

The sample data contained in BigDataLite represents the fictional Oracle MoviePlex on-line movie streaming company. The ODI sample performs the following two steps:

  • Pre-process application logs within Hadoop: All user activity on the MoviePlex web site is gathered on HDFS in Avro format. ODI reads these logs through Hive and processes the activity by aggregating, filtering, joining and unioning the records in an ODI flow-based mapping. All processing is performed inside Hive MapReduce jobs controlled by ODI, and the resulting data is stored in a staging table within Hive.
  • Loading user activity data from Hadoop into Oracle: The previously pre-processed data is loaded from Hadoop into an Oracle 12c database, where it can be used as the basis for Business Intelligence reports. ODI uses the Oracle Loader for Hadoop (OLH) connector, which executes distributed MapReduce processes to load data in parallel from Hadoop into Oracle. ODI transparently configures and invokes this connector through the Hive-to-Oracle Knowledge Module.

Both steps are orchestrated and executed through an ODI Package workflow. 
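To give a flavor of the Hive-side processing in the first step, here is a small illustrative sketch of the kind of aggregation the generated mapping performs, submitted via the hive command-line client from Python; the table and column names are invented, not the actual MoviePlex schema.

```python
# Illustrative only: the sort of Hive aggregation the pre-processing step runs.
# movieapp_log_avro and movie_sessions_stage are hypothetical table names.
import subprocess

hiveql = """
  INSERT OVERWRITE TABLE movie_sessions_stage
  SELECT custid, movieid, COUNT(*) AS activity_count
  FROM movieapp_log_avro
  WHERE activity IS NOT NULL
  GROUP BY custid, movieid
"""
subprocess.run(["hive", "-e", hiveql], check=True)
```

In the demo itself this logic is expressed graphically in the Transform Hive Avro to Hive Staging mapping, and ODI generates and submits the corresponding Hive jobs.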

Demo Instructions

Please follow these steps to execute the ODI demo in BigDataLite:

  1. Download and install BigDataLite. Please follow the instructions in the Deployment Guide at the download page.
  2. Start the VM and log in as user oracle, password welcome1.
  3. Start the Oracle Database 12c by double-clicking the icon on the desktop.


  4. Start ODI 12.1.2 by clicking the icon on the toolbar.


  5. Press Connect To Repository... on the ODI Studio window. 


  6. Press OK in the ODI Login dialog.


  7. Switch to the Designer tab, open the Projects accordion and expand the projects tree to Movie > First Folder > Mappings. Double-click on the mapping Transform Hive Avro to Hive Staging.


  8. Review the mapping that transforms source Avro data by aggregating, joining, and unioning data within Hive. You can also review the mapping Load Hive Staging to Oracle the same way. 


  9. In the Projects accordion expand the projects tree to Movie > First Folder > Packages. Double-click on the package Process Movie Data.


  10. The Package workflow for Process Movie Data opens. You can review the package.


  11. Press the Run icon on the toolbar. Press OK for the Run and Information: Session started dialogs. 




  12. You can follow the progress of the load by switching to the Operator tab and expanding All Executions and the topmost Process Movie Data entry. You can refresh the display by pressing the refresh button or setting Auto-Refresh. 


  13. Depending on the environment, the load can take 5-15 minutes. When the load is complete, the execution will show all green checkboxes. You can traverse the operator log and double-click entries to explore statistics and executed commands. 

This demo shows only some of ODI's big data capabilities. You can find more information about them at:


Monday Jan 06, 2014

Welcome Oracle Management Pack for Oracle Data Integrator! Let’s maximize the value of your Oracle Data Integrator investments!

To help you make the most of Oracle Data Integrator, and to deliver a superior ownership experience in an effort to minimize systems management costs, Oracle recently released Oracle Management Pack for Oracle Data Integrator. This new product leverages Oracle Enterprise Manager Cloud Control's advanced management capabilities to provide an integrated and top-down solution for your Oracle Data Integrator environments. Management Pack for Oracle Data Integrator supports both 11g (11.1.1.7.0 and higher) and 12c versions of Oracle Data Integrator.

Management Pack for Oracle Data Integrator provides a consolidated view of your entire Oracle Data Integrator infrastructure. This enables users to monitor and manage all their components centrally from Oracle Enterprise Manager Cloud Control.

Performance Monitoring and Management

Management Pack for Oracle Data Integrator streamlines the monitoring of the health, performance, and availability of all components of an Oracle Data Integrator environment, including Master and Work Repositories, Standalone and JEE agents, as well as source and target Data Servers.


Figure 1 – Management Pack for Oracle Data Integrator - Dashboard

Through an easy-to-use graphical interface, administrators and operators are able to quickly assess the status of their Oracle Data Integrator environments. The Dashboard page provides a summary of the health of each Oracle Data Integrator component, and clearly highlights potential issues with drill-down links so that users can easily investigate further and assess more details.


Figure 2 – Management Pack for Oracle Data Integrator– Repositories Page

The Repositories page offers key database statistics and performance metrics, where users can easily track the growth of the repository tablespace and purge the Oracle Data Integrator logs when needed, which is one of Oracle Data Integrator's best practices!


Figure 3 – Management Pack for Oracle Data Integrator– Database Execution Details Page

The Load Plan Executions/Sessions page allows developers and operators to view all Oracle Data Integrator session activity and review detailed execution statistics, such as the overall duration of processes or the number of inserts, updates and deletes performed. Users can click through to access database details on Oracle, or to view the entire SQL statement generated by Oracle Data Integrator.