Tuesday Feb 02, 2016

A-Team Article: Integrating Oracle Data Integrator (ODI) On-Premise with Cloud Services

Benjamin Perez-Goytia of the Data Integration Solutions A-Team provides a concise, matter-of-fact summary of Cloud Integration with Oracle Data Integrator (ODI). For all the details: Integrating Oracle Data Integrator (ODI) On-Premise with Cloud Services.

The blog post details how to integrate Oracle Data Integrator (ODI) on-premise with cloud services. Cloud computing is a utility in high demand, and enterprises find themselves with a mix of on-premise data sources and cloud services. Ben describes how ODI on-premise can enable the integration of both on-premise data sources and cloud services.

Happy reading! For more A-Team reads on ODI, browse through the A-Team Chronicles.

Tuesday Jan 26, 2016

ODI 12c KMs are now available for Oracle Hyperion Planning and Essbase

Oracle Essbase and Oracle Hyperion Planning Knowledge Modules (KMs) are now available for Oracle Data Integrator 12.2.1. The patches containing the KMs are available for download on My Oracle Support: Patch# 20109114 contains the Oracle Essbase KMs, and Patch# 21903914 the Oracle Hyperion Planning KMs. The KMs support the latest version (11.1.2.4) of these Hyperion applications. All the features available in the corresponding 11g KMs are available in these 12c KMs as well.

Friday Nov 13, 2015

2015 Oracle Excellence Awards… and the Winners for the Big Data, Business Analytics and Data Integration Category are…

Every year at OpenWorld, Oracle announces the winners of its most prestigious awards, the Oracle Excellence Awards. This year, as in previous years, we celebrated in style! In an Oscars-like ceremony on Tuesday, October 27th, customers were highlighted for their innovative solutions. Congratulations to all winners!

Let me introduce this year’s winners in the Big Data, Business Analytics and Data Integration category:

Amazon.com, an American electronic commerce and cloud computing company, is the largest Internet-based retailer in the United States.

During its growth over the years, Amazon never invested in back-office systems and processes, believing that it could scale without impact to the business. After almost 15 years of exponential growth into a multi-national and diverse business, Amazon recognized that it needed to improve its ability to manage financial reporting.

Together with The Hackett Group, they built an EPM and BI solution to ensure they could deliver automated financial reporting and analytics to over 1,000 stakeholders across the company. Using Oracle Data Integrator Enterprise Edition, Application Adaptors for Data Integration, Oracle BI Enterprise Edition, Oracle Essbase, and various Hyperion products, the solution yields huge benefits:

  • 100% positive feedback from the user community, with leadership calling it the first successful financial systems implementation
  • 50%+ reduction in time to complete the close process, from 10+ days to 5 days
  • 1,000s of hours saved per month across 1,000 users in data analysis and mining

The automation of reporting, and the assurance that the reports and analytic capabilities are correct, has truly given Amazon.com the confidence it desired to move forward, bringing modernization and agility to future business decisions.


CaixaBank is a leading Spanish retail bank and insurer. Centering on delivering customer service through innovation and technology, CaixaBank is regarded as a leading financial institution in global innovation and has pioneered the development of new means of payment.

They set out to become a data-driven company with a Big Data architecture and Business Analytics tools: democratizing the use of data for reporting, dashboarding, and analysis; empowering end users to combine data to discover business patterns and niches; and blending structured and unstructured information. Built on top of Exadata and Exalytics, and leveraging Oracle Business Intelligence, Oracle Big Data Analytics, and Oracle Data Integration tools, CaixaBank’s solution:

  • Enabled a 360° understanding of customers to offer tailored, on-demand banking and insurance solutions
  • Saved 50,000+ hours per year on Big Data project development versus previous methods
  • Delivered 75 Big Data projects (16% drive increased economic margin, 14% address cost reduction, 27% improve processes and save time), generating 5 million+ euros

This initial success is driving CaixaBank to transition towards a real-time-driven organization, enabling the monetization of its data relative to business targets.


Serta is the #1 mattress manufacturer in the United States and one of the most recognized home furnishing brands in the marketplace. In an effort to modernize an antiquated and inefficient ERP system to support financial reporting and provide IPO readiness, they quickly implemented a scalable solution across the enterprise leveraging Oracle BI Foundation Suite, Oracle Endeca Information Discovery, various Hyperion products, Essbase Analytics Link for Hyperion Financial Management, Oracle Data Relationship Management, and Oracle Data Relationship Governance.

Together with Grant Thornton, the solution took 7 months, and yields:

  • A scalable, integrated platform delivered improved system stability thanks to consolidated, standardized data acquisition and better-quality data, which in turn improved the reliability of, and confidence in, self-service analytics
  • Standardized reporting is produced 90% faster, saving 9 business days per month
  • 275% improvement in the availability of data, allowing users to access the data and provide timely analysis

This solution has delivered some truly impressive results for Serta Simmons: overhead cost reductions in the millions thanks to higher-quality, more timely information, and a strong foundation for future expansion of the solution.


Skanska AB, founded in 1887, is one of the world’s leading project development and construction groups, with expertise in commercial and residential projects and public-private partnerships. Skanska creates sustainable solutions and aims to be a leader in quality, green construction, work safety, and business ethics.

Skanska needed to optimize its monthly management reporting, which included shipping data to a third party for augmentation. It wanted to simplify the development and testing of new information solutions and reports, and ensure that users could quickly and easily adapt the collection and analysis of information to meet new requirements.

Skanska is using Oracle BI Cloud Service as a centralized, single cloud environment for management reporting worldwide. The solution benefits include:

  • Full data load to “ready to analyze” in under 5 minutes
  • 17 days to implement

Oracle BI Cloud Service is proving reliable, remaining constantly accessible to 60,000 users via mobile devices across this global organization.


Scottish and Southern Energy (SSE) produces, distributes, and supplies electricity, gas, and other energy-related services. Within the UK, energy companies and the power-generation industry are in the spotlight, with growing regulation and joint assessment of competition in the energy market. To ensure that SSE, and subsequently the SSE Generation directorate, can continue to operate in this ever-changing environment while developing and retaining a competitive edge over its market rivals, data on its entire business, and the way it manages and uses that data, needs to be a key focus.

SSE was looking for agile and rapid delivery of mobile-accessible BI dashboards, which would otherwise have required a considerable amount of on-premise infrastructure design and provisioning from its internal IT teams. Oracle BI Cloud enabled functionality to be surfaced in an extremely short amount of time, using real Generation business data to present to senior executives.

SSE found:

  • Cloud infrastructure is more cost- and time-effective: 6 weeks of work compared to 6 months
  • Increased confidence within the internal SSE business community that the right solution is being delivered
  • The management dashboard is a key enabler of quicker, more effective decisions by the leadership team

Cloud-based solutions give SSE advantages in speed, flexibility, and agility, working in conjunction with its on-premises solutions. SSE finds it is able to integrate data from multiple sources into a “single source of truth” and, in turn, make the long-term informed decisions vital for the future sustainability of SSE.


Tampa International Airport is one of the region's most significant economic engines, with a total economic output of more than $7 billion. The airport and its tenants employ more than 7,500 people on the airport campus and support more than 81,000 jobs in the community.

Tampa International Airport set out to modernize their infrastructure and data acquisition strategy knowing better data integration and business analytics would strengthen their business and customer loyalty. Sitting on top of Exadata, Exalogic, and Exalytics, Tampa International Airport combined Oracle Data Integrator and Oracle GoldenGate with Oracle BI Foundation Suite for better analytics around passengers, parking, concessions, baggage claim, etc. In 6 months, the completed project yields major benefits:

With better analysis helping them add new routes, they have seen a 6.7% yearly passenger increase, the largest year-over-year growth in a decade, with international traffic increasing by more than 14%

4.6% increase in revenues driven by greater analysis of the Authority’s parking, concessions and rental car businesses resulting in greater spend per passenger

150% reduction in Time to Complete Planning & Forecasting Process via more efficient resource and people management

This success has set a paradigm shift in motion at Tampa International Airport, toward more proactive, strategic, and operational excellence, all in an effort to better serve its customers.

CONGRATULATIONS to our Big Data, Business Analytics and Data Integration category winners!

Monday Oct 12, 2015

Featuring Big Data Sessions at Oracle OpenWorld 2015

Oracle OpenWorld is only a few days away now, and Big Data will be front and center again this year! Many of our Oracle Data Integration sessions will speak to your Big Data needs; we hope you will come and meet us to hear how Oracle Data Integration helps everyone by simplifying access to Big Data and introducing real-time capabilities to Big Data.

I would recommend attending the following 2 key sessions on Oracle Data Integration with Big Data:

  • Enabling Real-Time Data Integration with Big Data [CON9724]
    In this session, Chai Pydimukkala from the Oracle Data Integration Product Management team will discuss GoldenGate's offering for big data environments. Along with Chai, Janardh Bantupalli from LinkedIn will present their solution that uses Oracle GoldenGate for Big Data to optimize the data warehousing environment and achieve operational insights with lower costs.
  • Oracle Data Integration Product Family: a Cornerstone for Big Data [CON9609]
    In this session, Alex Kotopoulis from the Oracle Data Integration Product Management team and Mark Rittman, Chief Technology Officer at Rittman Mead, will describe how our Data Integration platform uses a metadata-based approach to hide the complexity of the various big data technologies such as Hive, Pig, and Spark, and delivers a simplified and future-proofed investment in big data technologies.

There are many more Big Data-related sessions I’d also recommend attending.

In addition, we will be running several Hands-on Labs covering Oracle Big Data Preparation Cloud Service, Oracle Data Integrator, and Oracle GoldenGate. Space is limited, and they usually fill up quickly, so make sure to register!

Please also come to visit us at our various demo pods in Moscone South:

  • Oracle Big Data Preparation Cloud Service: Get Your Big Data Ready to Use
    Workstation ID: SBD-022 / Venue: Moscone South, Upper Right, Big Data Showcase
  • Oracle Big Data Preparation Cloud Service
    Workstation ID: SPI-023 / Venue: Moscone South, Oracle Cloud Platform and Infrastructure Showcase
  • Oracle Data Integrator Enterprise Edition and Big Data Option: High-Performance Data Integration
    Workstation ID: SLM-022 / Venue: Moscone South, Lower Left, Middleware
  • Oracle GoldenGate: Real-Time Data Integration for Heterogeneous and Big Data Environments
    Workstation ID: SLM-035 / Venue: Moscone South, Lower Left, Middleware
  • Tame Big Data with Oracle Data Integration
    Workstation ID: SBD-023 / Venue: Moscone South, Upper Right, Big Data Showcase

We hope you will join a few! Don’t forget to view the Focus on Data Integration – for a full review of Data Integration Sessions during OpenWorld. See you there!

Thursday Oct 08, 2015

Featuring Big Data Preparation Cloud Service and other Cloud Data Integration Sessions at Oracle OpenWorld 2015

Oracle OpenWorld is almost upon us! We are excited to be sharing with you some previews of what will be seen and discussed in just a few weeks in San Francisco!

One of the highlights is Oracle’s new cloud-based data preparation solution, Oracle Big Data Preparation Cloud Service, also known as BDP. This new service will revolutionize the process of importing, preparing, and publishing your complex business data, allowing you to spend more time analyzing data rather than preparing it for analysis. Users are guided through the process with intuitive, recommendation-driven interfaces. The system also provides ways to automate and operationalize the entire data preparation pipeline via the built-in scheduler or via a rich set of RESTful APIs.
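As a sketch of what such automation could look like, the snippet below builds a REST call that starts a named preparation job. The endpoint path, payload field, and authentication scheme are invented for illustration, since the post does not document BDP's actual API:

```python
import json
import urllib.request

# Hypothetical values: BDP's real REST paths, payload fields, and auth
# scheme are not given in the post, so everything here is a placeholder.
BDP_BASE_URL = "https://bdp.example.oraclecloud.com/api/v1"

def build_run_job_request(job_name, auth_token):
    """Build (but do not send) a POST request that would start a named
    data-preparation job, the kind of call a scheduler could automate."""
    payload = json.dumps({"jobName": job_name}).encode("utf-8")
    return urllib.request.Request(
        url=BDP_BASE_URL + "/jobs/run",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + auth_token,
        },
        method="POST",
    )

req = build_run_job_request("nightly_weblog_prep", "example-token")
print(req.full_url, req.get_method())
```

Sending the request (for example with `urllib.request.urlopen(req)`) from a cron job or an external scheduler is how a pipeline like this would typically be operationalized.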

During OpenWorld, Oracle’s Luis Rivas, alongside Blue Cloud Innovations’ Vinay Kumar and Pythian’s Alex Gorbachev, will discuss and demonstrate how big data promises many game-changing capabilities if tackled efficiently! You will discover how Oracle Big Data Preparation Cloud Service takes “noisy” data from a broad variety of sources and formats, both structured and unstructured, and uses a sophisticated and unique blend of machine learning and natural language processing, grounded in a vast set of linked open reference data, to ingest, prepare, enrich, and publish it into useful data streams, ready for further discovery, analysis, and reporting. Don’t miss it:

CON9615 Solving the “Dirty Secret” of Big Data with Oracle Big Data Preparation Cloud Service

Tuesday, Oct 27, 5:15 p.m. | Moscone South—310

Curious to find out more about BDP before the conference? Take a look here and view a short video: Chalk Talk: Oracle Big Data Preparation Cloud Service!

Since we are on the topic of Data Integration and the Cloud, I will also take a quick moment to remind everyone about Oracle Data Integrator’s (ODI) integration with the Oracle Storage Cloud Service, for example. But that’s not all. Here is a view into the Data Integration sessions that relate to the Cloud, in chronological order:

CON3506 Into the Cloud and Back with Oracle Data Integrator 12c

Monday, Oct 26, 5:15 p.m. | Moscone West—2022

*****

CON9614 Oracle Data Integration Solutions: the Foundation for Cloud Integration

Wednesday, Oct 28, 11:00 a.m. | Moscone South—274

*****

CON9717 Accelerate Cloud Onboarding Using Oracle GoldenGate Cloud Service

Wednesday, Oct 28, 3:00 p.m. | Moscone West—2022

*****

CON9595 Cloud Data Quality: Lessons Learned from Oracle’s Journey to the Sales Cloud

Thursday, Oct 29, 12:00 p.m. | Moscone West—2022

*****

CON9612 Oracle Enterprise Metadata Management and the Cloud

Thursday, Oct 29, 1:15 p.m. | Marriott Marquis—Salon 4/5/6


We hope you will join a few! Don’t forget to view the Focus on Data Integration – for a full review of Data Integration Sessions during OpenWorld. See you there!

Tuesday Oct 06, 2015

Catch Up on Oracle GoldenGate at Oracle OpenWorld

This year Oracle OpenWorld brings many opportunities for Oracle GoldenGate users to learn the latest features and best practices. If you will be at OpenWorld later this month, I recommend you check out the following key Oracle GoldenGate sessions:

On Monday, October 26th, Chai Pydimukkala from the GoldenGate product management team will present the Oracle GoldenGate Product Update and Strategy (CON9720) session and share the new features of Oracle GoldenGate and its strategic direction. Along with Chai, Eric Schneider and Andrew Yee from our customer TicketMaster will present their GoldenGate deployment and best practices.

On Tuesday, October 27th, in Enabling Real-Time Data Integration with Big Data (CON9724), Chai will discuss GoldenGate's offering for big data environments. Along with Chai, Janardh Bantupalli from LinkedIn will present their solution that uses Oracle GoldenGate for Big Data to optimize the data warehousing environment and achieve operational insights with lower costs.

On Wednesday, October 28th, you can get a preview of the anticipated Oracle GoldenGate Cloud Service by attending Accelerate Cloud On-Boarding using GoldenGate Cloud Service (CON9717). You will also learn about using GoldenGate in public, private, and hybrid cloud deployments, and hear Michael Pape, CTO of Database Operations at Intuit, talk about their GoldenGate best practices.

Other key GoldenGate sessions are also worth attending.

You can find the full list of Oracle GoldenGate and Oracle Data Integration sessions in our Focus On Document, and use the Schedule Builder to build your personal schedule for OpenWorld. While at OpenWorld, follow us via @OracleDI and @ORACLEBigData, and use the hashtags #ODI12c, #OGG12c, #OEMM, #OEDQ and #OBDPCS to join the conversation.

Wednesday Aug 05, 2015

Chalk Talk Video: Oracle Big Data Preparation Cloud Service

We continue our Oracle Data Integration chalk talk video series, with an overview of Oracle Big Data Preparation Cloud Service (BDP). BDP allows users to unlock the potential of their data with a non-technical, web-based tool that minimizes data preparation time. BDP provides an interactive set of services that automate, streamline, and guide the process of data ingestion, preparation, enrichment, and governance without costly manual intervention.

View this video to learn more: Chalk Talk: Oracle Big Data Preparation Cloud Service

For additional information – visit the Oracle Big Data Preparation Cloud Service page.


Tuesday Jul 07, 2015

Chalk Talk Video: Kick-Start Big Data Integration with Oracle

Next in the series of Oracle Data Integration chalk talk videos, we speak to Oracle Data Integrator (ODI) for big data. ODI allows you to become a big data developer without learning to code Java and MapReduce! ODI generates the code and optimizes it with support for Hive, Spark, Oozie, and Pig.

View this video to learn more: Chalk Talk: Kick-Start Big Data Integration with Oracle.

For additional information on Oracle Data Integrator, visit the ODI homepage and the ODI for Big Data page. This blog can be very handy also: Announcing Oracle Data Integrator for Big Data.

Thursday Jul 02, 2015

Chalk Talk Video: How to Raise Trust and Transparency in Big Data with Oracle Metadata Management

Some fun new videos are available; we call the series ‘Chalk Talk’!

The first in the series that we will share with you around Oracle Data Integration speaks to raising trust and transparency within big data. Crucial big data projects often fail due to a lack of overall trust in the data. Data is not always transparent, and governing it can become a costly overhead. Oracle Metadata Management assists in the governance and trust of all data within the enterprise, both Oracle and third-party.

View this video to learn more: Chalk Talk: How to Raise Trust and Transparency in Big Data.

For additional information on Oracle Metadata Management, visit the OEMM homepage.

Wednesday Jul 01, 2015

ODI - Integration with Oracle Storage Cloud Service

Oracle Data Integrator’s open tool framework can be leveraged to quickly get access to the Oracle Storage Cloud Service, which is gradually becoming an essential part of integrating on-premise data with many cloud services. The reference implementation of an open tool for Oracle Storage Cloud is now available in the Data Integration project on Java.net: ODI OpenTool for Oracle Storage Cloud, which can be used and modified as per your integration needs. [Read More]

Tuesday Jun 09, 2015

Oracle Data Integrator Journalizing Knowledge Module for GoldenGate Integrated Replicat Blog from the A-Team

As always, useful content from the A-Team…

Check out the most recent blog about how to modify the out-of-the-box Journalizing Knowledge Module for GoldenGate to support the Integrated Replicat apply mode.

An Oracle Data Integrator Journalizing Knowledge Module for GoldenGate Integrated Replicat

Enjoy!

Monday May 11, 2015

Oracle Big Data Preparation Cloud Service (BDP) – Coming Soon

What are your plans around Big Data and Cloud?

If your organization has already begun to explore these topics, you might be interested in a new offering from Oracle that will dramatically simplify how you use your data in Hadoop and the Cloud:

Oracle Big Data Preparation Cloud Service (BDP)

There is a perception that most of the time spent on Big Data projects is dedicated to harvesting value. The reality is that 90% of the time in Big Data projects is actually spent on data preparation. Data may be structured, but more often it will be semi-structured, such as weblogs, or fully unstructured, such as free-form text. The content is vast, inconsistent, incomplete, often off-topic, and drawn from multiple differing formats and sources. In this environment, each new dataset takes weeks or months of effort to process, frequently requiring programmers to write custom scripts. Minimizing data preparation time is the key to unlocking the potential of Big Data.

Oracle Big Data Preparation Cloud Service (BDP) addresses this very reality. BDP is a non-technical, web-based tool that sets out to minimize data preparation time in an effort to quickly unlock the potential of your data. The BDP tool provides an interactive set of services that automate, streamline, and guide the process of data ingestion, preparation, enrichment, and governance without costly manual intervention.

The technology behind this service is amazing; it intuitively guides the user with a machine learning driven recommendation engine based on semantic data classification and natural language processing algorithms. But the best part is that non-technical staff can use this tool as easily as they use Excel, resulting in a significant cost advantage for data intensive projects by reducing the amount of time and resources required to ingest and prepare new datasets for downstream IT processes.

Curious to find out more? We invite you to view a short demonstration of BDP below:

Let us know what you think!

Stay tuned as we write more about this offering… visit often here!

Wednesday Apr 15, 2015

Data Governance for Migration and Consolidation

By Martin Boyd, Senior Director of Product Management

How would you integrate millions of parts, customer, and supplier records from multiple acquisitions into a single JD Edwards instance?  This was the question facing National Oilwell Varco (NOV), a leading worldwide provider of components used in the oil and gas industry.  If they could not find an answer, many operating synergies would be lost; but they knew from experience that simply “moving and mapping” the data from the legacy systems into JDE was not sufficient, as the data was anything but standardized.

This was the problem described yesterday in a session at the Collaborate Conference in Las Vegas.  The presenters were Melissa Haught of NOV and Deepak Gupta of KPIT, their systems integrator. Together they walked through an excellent discussion of the problem and the solution they have developed:

The Problem:  It is first important to recognize that the data to be integrated from many and various legacy systems had been created over time, with different standards, by different people, according to their different needs. Thus, saying it lacked standardization would be an understatement.  So how do you “govern” data that is so diverse?  How do you apply standards to it months or years after it has been created?

The Solution:  The answer is that there is no single answer, and certainly no “magic button” that will solve the problem for you.  Instead, in the case of NOV, a small team of dedicated data stewards, or specialists, works to reverse-engineer a set of standards from the data at hand.  In the case of product data, which is usually the most complex, NOV found they could actually infer rules to recognize, parse, and extract information from ‘smart’ part numbers, even from the part-numbering schemes of acquired companies.  Once these rules are created for an entity or a category and built into their Oracle Enterprise Data Quality (EDQ) platform, the data is run through the DQ process and the results are examined.  Most often, problems surface that suggest rule refinements.  The rule-refinement and data-quality processing steps are run repeatedly until the result is as good as it can be.  The result is never 100% standardized and clean data, though; some data is always flagged into a “data dump” for future manual remediation.
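The rule-inference idea can be sketched in miniature. The part-numbering schemes and attribute names below are invented (the post does not describe NOV's actual schemes or EDQ rules); the sketch only illustrates trying each reverse-engineered rule in turn and flagging unparseable records for manual remediation:

```python
import re

# Invented schemes for illustration: each rule names a scheme and a pattern
# whose named groups are the attributes the "smart" part number encodes.
RULES = [
    ("acquired_co_a",
     re.compile(r"^(?P<family>[A-Z]{2})-(?P<size_mm>\d{3})-(?P<material>[A-Z])$")),
    ("acquired_co_b",
     re.compile(r"^(?P<family>[A-Z]{3})(?P<size_mm>\d{2})(?P<rev>\d)$")),
]

def parse_part_number(part_no):
    """Try each scheme's rule in turn; return the extracted attributes,
    or None to flag the part for manual remediation."""
    for scheme, pattern in RULES:
        m = pattern.match(part_no)
        if m:
            return {"scheme": scheme, **m.groupdict()}
    return None  # goes to the "data dump" for manual review

print(parse_part_number("VL-250-S"))
print(parse_part_number("unparseable#123"))
```

Each iteration of the real process would amount to adding or tightening entries in such a rule set, re-running the batch, and inspecting what still falls through to the remediation pile.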

Lessons Learned:

  • Although technology is a key enabler, it is not the whole solution. Dedicated specialists are required to build the rules and improve them through successive iterations
  • A ‘user friendly’ data quality platform is essential so that it is approachable and intuitive for the data specialists who are not (nor should they be) programmers
  • A rapid iteration through testing and rules development is important to keep up project momentum. In the case of NOV, specialists request rule changes, which are implemented by KPIT resources in India; in effect, changes are made and re-run overnight, which has worked very well

Technical Architecture:  Data is extracted from the legacy systems by Oracle Data Integrator (ODI), which also transforms the data into the right ‘shape’ for review in EDQ.  An Audit Team reviews these results for completeness and correctness, comparing the supplied data to the required data standards.  A secondary check is also performed using EDQ, which verifies that the data is in a valid format to be loaded into JDE.

The Benefit:  The benefit of having data that is “fit for purpose” in JDE is that NOV can mothball the legacy systems and use JDE as a complete and correct record for all kinds of purposes, from operational management to strategic sourcing.  The benefit of having a defined governance process is that it is repeatable: every time the process is run, the individuals and the governance team as a whole learn something from it and get better at executing it the next time around.  Because of this, NOV has already seen order-of-magnitude improvements in productivity as well as data quality, and is already looking for ways to expand the program into other areas.

All in all, Melissa and Deepak gave the audience great insight into how they are solving a complex integration problem, and reminded us of what we should already know: "integrating" data is not simply moving it. To be of business value, the data must be 'fit for purpose', which often means that both the integration process and the data must be governed.

Friday Apr 10, 2015

Customers Tell All: What Sets Oracle Apart in Big Data Integration

Data integration has become a critical component of many technology solutions that businesses pursue to differentiate in their markets. Instead of relying on manual coding in house, more and more businesses choose data integration solutions to support their strategic IT initiatives, from big data analytics to cloud integration.

To explore the differences among the leading data integration solutions and the impact their technologies are having on real-world businesses, Dao Research recently conducted a research study in which they interviewed IBM, Informatica, and Oracle customers. In addition, they reviewed publicly available solution information from these three vendors.

The research revealed some key findings that explain Oracle's leadership in the data integration space. For example:

  • Customers who participated in this study cite 30 to 60% greater development productivity using Oracle Data Integrator versus traditional ETL tools from Informatica and IBM. Dao's research ties Oracle's advantage to product architecture differences such as native push-down processing, the separation of logical and physical layers, and the ability to extend Oracle Data Integrator using its knowledge modules.
  • The research also showed that Oracle’s data integration cost of ownership is lower because of its unified platform strategy (versus offering multiple platforms and options), its use of source and target databases for processing, higher developer productivity, faster implementation, and the absence of a middle-tier integration infrastructure to manage.
  • In the area of big data integration, the study highlights Oracle’s advantage with its flexible and native solutions. Unlike competitors’ offerings, developed as separate solutions, Oracle’s solution is aware of the cluster environment of big data systems. Oracle enables big data integration and cloud data integration through the use of a single platform with common tooling and inherent support for big data processing environments.
  • I should add that the latest release of the Oracle Data Integrator EE Big Data Option widens the competitive gap. Oracle is the only vendor that can automatically generate Spark, Hive, and Pig transformations from a single mapping. Oracle Data Integration customers can focus on building the right architecture for driving business value, and do not have to become experts in multiple programming languages. For example, an integration architect at a large financial services provider told the research company: "As an ODI developer, I am a Big Data developer without having to understand the underpinnings of Big Data. That's pretty powerful capability."
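The push-down idea cited in the findings above can be illustrated with a toy sketch: rather than pulling rows through a middle-tier engine, an E-LT tool generates set-based SQL that the target database executes itself. The table and column names below are invented, and this is not ODI's actual code generation, just the general shape of it:

```python
# Minimal sketch of push-down (E-LT) code generation: emit one set-based
# statement for the target database instead of streaming rows through a
# middle tier. Names are illustrative only.
def pushdown_load_sql(source, target, columns, filter_sql):
    cols = ", ".join(columns)
    return (
        f"INSERT INTO {target} ({cols}) "
        f"SELECT {cols} FROM {source} WHERE {filter_sql}"
    )

sql = pushdown_load_sql(
    source="stg_orders", target="dw_orders",
    columns=["order_id", "amount"], filter_sql="amount > 0",
)
print(sql)
```

Because the generated statement runs entirely inside the target database, the integration tier needs no heavyweight transformation server of its own, which is the cost-of-ownership point the study makes.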


You can find the report of Dao's research here:

I invite you to read this research paper to understand why more and more customers trust Oracle for their strategic data integration initiatives after working with or evaluating competitive offerings.


Thursday Feb 19, 2015

Introducing Oracle GoldenGate for Big Data!

Big data systems and big data analytics solutions are becoming critical components of modern information management architectures.  Organizations realize that by combining structured transactional data with semi-structured and unstructured data, they can realize the full potential value of their data assets and achieve enhanced business insight. Businesses also recognize that in today’s fast-paced digital environment, where agility and immediate response are essential, access to low-latency data is critical. Low-latency transactional data brings additional value, especially for dynamically changing operations, that day-old data, structured or unstructured, cannot deliver.

Today we announced the general availability of the Oracle GoldenGate for Big Data product, which offers a platform for streaming real-time transactional data into big data systems. By providing easy-to-use, real-time data integration for big data systems, Oracle GoldenGate for Big Data facilitates improved business insight for a better customer experience. It also allows IT organizations to move ahead quickly with their big data projects without extensive training and management resources. Its real-time data streaming platform also allows customers to keep their big data reservoirs up to date with their production systems.

Oracle GoldenGate’s fault-tolerant, secure, and flexible architecture shines in this new big data streaming offering as well. Customers can enjoy secure and reliable data streaming with sub-second latency. Oracle GoldenGate’s core log-based change data capture capabilities enable real-time streaming without degrading the performance of the source production systems.

The new offering, Oracle GoldenGate for Big Data, provides integration for Apache Flume, Apache HDFS, Apache Hive, and Apache HBase. It also includes Oracle GoldenGate for Java, which enables customers to easily integrate with additional big data systems, such as Oracle NoSQL, Apache Kafka, Apache Storm, Apache Spark, and others.
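As a rough illustration of how one of these targets is selected, delivery in this product is driven by handler configuration properties along the lines of the following sketch; the property names and values shown here are illustrative and should be verified against the product documentation and data sheet before use:

```properties
# Illustrative handler configuration: route captured transactions to HDFS.
# Verify property names and values against the current documentation.
gg.handlerlist=hdfs
gg.handler.hdfs.type=hdfs
gg.handler.hdfs.rootFilePath=/data/goldengate
gg.handler.hdfs.format=delimitedtext
gg.handler.hdfs.mode=tx
```

Swapping the handler type (and its handler-specific properties) is what retargets the same captured change stream at Flume, Hive, HBase, or a custom Java integration.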

You can learn more about our new offering via the Oracle GoldenGate for Big Data data sheet and by registering for our upcoming webcast:

How to Future-Proof your Big Data Integration Solution

March 5th, 2015 10am PT/ 1pm ET

I invite you to join this webcast to learn from Oracle and Cloudera executives how to future-proof your big data infrastructure. The webcast will discuss:

  • Selection criteria that will drive business results with Big Data Integration 
  • Oracle's new big data integration and governance offerings, including Oracle GoldenGate for Big Data
  • Oracle’s comprehensive big data features in a unified platform 
  • How Cloudera Enterprise Data Hub and Oracle Data Integration combine to offer complementary features to store data in full fidelity, to transform and enrich the data for increased business efficiency and insights.

Hope you can join us and ask your questions to the experts.
