Wednesday Aug 05, 2015

Chalk Talk Video: Oracle Big Data Preparation Cloud Service

We continue our Oracle Data Integration chalk talk video series, with an overview of Oracle Big Data Preparation Cloud Service (BDP). BDP allows users to unlock the potential of their data with a non-technical, web-based tool that minimizes data preparation time. BDP provides an interactive set of services that automate, streamline, and guide the process of data ingestion, preparation, enrichment, and governance without costly manual intervention.

View this video to learn more: Chalk Talk: Oracle Big Data Preparation Cloud Service

For additional information, visit the Oracle Big Data Preparation Cloud Service page.


Tuesday Jul 07, 2015

Chalk Talk Video: Kick-Start Big Data Integration with Oracle

Next in the series of Oracle Data Integration chalk talk videos, we speak to Oracle Data Integrator (ODI) for big data. ODI allows you to become a big data developer without learning to code Java and MapReduce! ODI generates the code and optimizes it, with support for Hive, Spark, Oozie, and Pig.

View this video to learn more: Chalk Talk: Kick-Start Big Data Integration with Oracle.

For additional information on Oracle Data Integrator, visit the ODI homepage and the ODI for Big Data page. This blog post can also be very handy: Announcing Oracle Data Integrator for Big Data.

Thursday Jul 02, 2015

Chalk Talk Video: How to Raise Trust and Transparency in Big Data with Oracle Metadata Management

Some fun new videos are available; we call the series ‘Chalk Talk’!

The first in the series that we will share with you around Oracle Data Integration speaks to raising trust and transparency within big data. Crucial big data projects often fail due to a lack of overall trust in the data. Data is not always transparent, and governing it can become a costly overhead. Oracle Metadata Management assists in the governance and trust of all data within the enterprise, both Oracle and third-party.

View this video to learn more: Chalk Talk: How to Raise Trust and Transparency in Big Data.

For additional information on Oracle Metadata Management, visit the OEMM homepage.

Wednesday Jul 01, 2015

ODI - Integration with Oracle Storage Cloud Service

Oracle Data Integrator’s open tool framework can be leveraged to quickly get access to the Oracle Storage Cloud Service, which is gradually becoming an essential part of integrating on-premises data with many cloud services. The reference implementation of an open tool for Oracle Storage Cloud is now available on the Data Integration project on Java.net: ODI OpenTool for Oracle Storage Cloud, which can be used and modified as per your integration needs. [Read More]

Tuesday Jun 09, 2015

Oracle Data Integrator Journalizing Knowledge Module for GoldenGate Integrated Replicat Blog from the A-Team

As always, useful content from the A-Team…

Check out the most recent blog about how to modify the out-of-the-box Journalizing Knowledge Module for GoldenGate to support the Integrated Replicat apply mode.

An Oracle Data Integrator Journalizing Knowledge Module for GoldenGate Integrated Replicat

Enjoy!

Monday May 11, 2015

Oracle Big Data Preparation Cloud Service (BDP) – Coming Soon

What are your plans around Big Data and Cloud?

If your organization has already begun to explore these topics, you might be interested in a new offering from Oracle that will dramatically simplify how you use your data in Hadoop and the Cloud:

Oracle Big Data Preparation Cloud Service (BDP)

There is a perception that most of the time spent in Big Data projects is dedicated to harvesting value. The reality is that 90% of the time in Big Data projects is actually spent on data preparation. Data may be structured, but more often it will be semi-structured, such as weblogs, or fully unstructured, such as free-form text. The content is vast, inconsistent, incomplete, often off topic, and comes in many differing formats from many sources. In this environment each new dataset takes weeks or months of effort to process, frequently requiring programmers to write custom scripts. Minimizing data preparation time is the key to unlocking the potential of Big Data.

Oracle Big Data Preparation Cloud Service (BDP) addresses this very reality. BDP is a non-technical, web-based tool that sets out to minimize data preparation time in an effort to quickly unlock the potential of your data. The BDP tool provides an interactive set of services that automate, streamline, and guide the process of data ingestion, preparation, enrichment, and governance without costly manual intervention.

The technology behind this service is amazing; it intuitively guides the user with a machine learning driven recommendation engine based on semantic data classification and natural language processing algorithms. But the best part is that non-technical staff can use this tool as easily as they use Excel, resulting in a significant cost advantage for data intensive projects by reducing the amount of time and resources required to ingest and prepare new datasets for downstream IT processes.
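To make the idea of semantic data classification concrete, here is a deliberately tiny sketch of how a recommendation engine might infer a column's semantic type from sample values. The detector patterns, names, and threshold are hypothetical illustrations of the concept, not BDP internals:

```python
import re

# Illustrative detectors for a few semantic types; a real service uses far
# richer classifiers (NLP models, reference knowledge bases), not just regexes.
DETECTORS = {
    "email":    re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_phone": re.compile(r"^\d{3}-\d{3}-\d{4}$"),
    "date":     re.compile(r"^\d{4}-\d{2}-\d{2}$"),
}

def classify_column(values, threshold=0.8):
    """Recommend a semantic type when enough sample values match a detector."""
    for sem_type, pattern in DETECTORS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if values and hits / len(values) >= threshold:
            return sem_type
    return "unknown"

# A column that is mostly well-formed emails is classified as "email",
# which could then drive an enrichment recommendation (e.g. mask or validate).
classify_column(["a@b.com", "c@d.org", "not-an-email", "e@f.io", "g@h.net"])
```

Once a column is classified, the service can recommend the appropriate preparation step automatically instead of asking the user to script it.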

Curious to find out more? We invite you to view a short demonstration of BDP below:

Let us know what you think!

Stay tuned as we write more about this offering… visit often here!

Wednesday Apr 15, 2015

Data Governance for Migration and Consolidation

By Martin Boyd, Senior Director of Product Management

How would you integrate millions of parts, customer and supplier records from multiple acquisitions into a single JD Edwards instance?  This was the question facing National Oilwell Varco (NOV), a leading worldwide provider of components used in the oil and gas industry.  If they could not find an answer, many operating synergies would be lost, but they knew from experience that simply “moving and mapping” the data from the legacy systems into JDE was not sufficient, as the data was anything but standardized.

This was the problem described yesterday in a session at the Collaborate Conference in Las Vegas.  The presenters were Melissa Haught of NOV and Deepak Gupta of KPIT, their systems integrator. Together they walked through an excellent discussion of the problem and the solution they have developed:

The Problem:  It is first important to recognize that the data to be integrated from many and various legacy systems had been created over time with different standards by different people according to their different needs. Thus, saying it lacked standardization would be an understatement.  So how do you “govern” data that is so diverse?  How do you apply standards to it months or years after it has been created? 

The Solution:  The answer is that there is no single answer, and certainly no “magic button” that will solve the problem for you.  Instead, in the case of NOV, a small team of dedicated data stewards, or specialists, works to reverse-engineer a set of standards from the data at hand.  In the case of product data, which is usually the most complex, NOV found they could actually infer rules to recognize, parse, and extract information from ‘smart’ part numbers, even from the part numbering schemes of acquired companies.  Once these rules are created for an entity or a category and built into their Oracle Enterprise Data Quality (EDQ) platform, the data is run through the DQ process and the results are examined.  Most often the results expose problems, which suggest that rule refinements are required.  The rule refinement and data quality processing steps are run repeatedly until the result is as good as it can be.  The result is never 100% standardized and clean data, though: some data is always flagged into a “data dump” for future manual remediation.
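The rule-then-exception loop described above can be sketched in a few lines of Python. The part-number schemes and extracted attributes below are invented for illustration; NOV's actual rules live inside EDQ, not in code like this:

```python
import re

# Hypothetical part-numbering schemes reverse-engineered from legacy data.
# Each rule recognizes one scheme and extracts structured attributes from it.
RULES = [
    # e.g. "VLV-200-SS": category code, size, material
    (re.compile(r"^(?P<cat>[A-Z]{3})-(?P<size>\d+)-(?P<mat>[A-Z]{2})$"),
     lambda m: {"category": m["cat"], "size_mm": int(m["size"]),
                "material": m["mat"]}),
    # e.g. "P1234X": a scheme inherited from an acquired company
    (re.compile(r"^P(?P<seq>\d{4})(?P<rev>[A-Z])$"),
     lambda m: {"category": "PMP", "sequence": int(m["seq"]),
                "revision": m["rev"]}),
]

def standardize(part_numbers):
    """Run each part number through the rules; collect matches and exceptions."""
    clean, exceptions = [], []
    for pn in part_numbers:
        for pattern, extract in RULES:
            m = pattern.match(pn)
            if m:
                clean.append({"part": pn, **extract(m)})
                break
        else:
            exceptions.append(pn)  # flagged for future manual remediation
    return clean, exceptions

clean, exceptions = standardize(["VLV-200-SS", "P1234X", "??-unknown"])
```

Each iteration, the exception list tells the specialists which rules to refine next, exactly the overnight refine-and-rerun cycle the article describes.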

Lessons Learned:

  • Although technology is a key enabler, it is not the whole solution. Dedicated specialists are required to build the rules and improve them through successive iterations
  • A ‘user friendly’ data quality platform is essential so that it is approachable and intuitive for the data specialists who are not (nor should they be) programmers
  • A rapid iteration through testing and rules development is important to keep up project momentum.  In the case of NOV, specialists request rule changes, which are implemented by KPIT resources in India; in effect, changes are made and re-run overnight, which has worked very well

Technical Architecture:  Data is extracted from the legacy systems by Oracle Data Integrator (ODI), which also transforms the data into the right ‘shape’ for review in EDQ.  An Audit Team reviews these results for completeness and correctness by comparing the supplied data against the required data standards.  A secondary check is also performed using EDQ, which verifies that the data is in a valid format to be loaded into JDE.

The Benefit:  The benefit of having data that is “fit for purpose” in JDE is that NOV can mothball the legacy systems and use JDE as a complete and correct record for all kinds of purposes, from operational management to strategic sourcing.  The benefit of having a defined governance process is that it is repeatable.  This means that every time the process is run, the individuals and the governance team as a whole learn something from it and get better at executing it the next time around.  Because of this, NOV has already seen orders-of-magnitude improvements in productivity as well as data quality, and is already looking for ways to expand the program into other areas.

All in all, Melissa and Deepak gave the audience great insight into how they are solving a complex integration problem and reminded us of what we should already know: "integrating" data is not simply moving it. To be of business value, the data must be 'fit for purpose', which often means that both the integration process and the data must be governed.

Friday Apr 10, 2015

Customers Tell All: What Sets Oracle Apart in Big Data Integration

Data integration has become a critical component of many technology solutions that businesses pursue to differentiate in their markets. Instead of relying on manual coding in house, more and more businesses choose data integration solutions to support their strategic IT initiatives, from big data analytics to cloud integration.

To explore the differences among the leading data integration solutions and the impact their technologies are having on real-world businesses, Dao Research recently conducted a study in which they interviewed IBM, Informatica, and Oracle customers. In addition, they reviewed publicly available solution information from these three vendors.

The research revealed some key findings that explain Oracle's leadership in the data integration space. For example:

  • Customers who participated in this study cite 30 to 60% greater development productivity using Oracle Data Integrator versus traditional ETL tools from Informatica and IBM. Dao's research ties Oracle's advantage to product architecture differences such as native push-down processing, the separation of logical and physical layers, and the ability to extend Oracle Data Integrator using its knowledge modules.
  • The research also showed that Oracle’s data integration cost of ownership is lower because of its unified platform strategy (versus offering multiple platforms and options), its use of source and target databases for processing, higher developer productivity, faster implementation, and the absence of a middle-tier integration infrastructure to manage.
  • In the area of big data integration, the study highlights Oracle’s advantage with its flexible and native solutions. Unlike competitors’ offerings, developed as separate solutions, Oracle’s solution is aware of the cluster environment of big data systems. Oracle enables big data integration and cloud data integration through the use of a single platform with common tooling and inherent support for big data processing environments.
  • I should add that the latest release of the Oracle Data Integrator EE Big Data Option widens the competitive gap. Oracle is the only vendor that can automatically generate Spark, Hive, and Pig transformations from a single mapping. Oracle Data Integration customers can focus on building the right architecture for driving business value, and do not have to become experts in multiple programming languages.  For example, an integration architect at a large financial services provider told the research company: "As an ODI developer, I am a Big Data developer without having to understand the underpinnings of Big Data. That's pretty powerful capability."


You can find the report of Dao's research here:

I invite you to read this research paper to understand why more and more customers trust Oracle for their strategic data integration initiatives after working with or evaluating competitive offerings.


Thursday Feb 19, 2015

Introducing Oracle GoldenGate for Big Data!

Big data systems and big data analytics solutions are becoming critical components of modern information management architectures.  Organizations realize that by combining structured transactional data with semi-structured and unstructured data they can realize the full potential value of their data assets and achieve enhanced business insight. Businesses also recognize that in today’s fast-paced digital business environment, low-latency access to data is essential to being agile and responding with immediacy. Low-latency transactional data brings additional value, especially for dynamically changing operations where day-old data, structured or unstructured, cannot deliver.

Today we announced the general availability of the Oracle GoldenGate for Big Data product, which provides a platform for streaming transactional data into big data systems in real time. By providing easy-to-use, real-time data integration for big data systems, Oracle GoldenGate for Big Data facilitates improved business insight for better customer experience. It also allows IT organizations to quickly move ahead with their big data projects without extensive training and management resources. Oracle GoldenGate for Big Data's real-time data streaming platform also allows customers to keep their big data reservoirs up to date with their production systems.

Oracle GoldenGate’s fault-tolerant, secure, and flexible architecture shines in this new big data streaming offering as well. Customers can enjoy secure and reliable data streaming with subsecond latency. Oracle GoldenGate’s core log-based change data capture capabilities enable real-time streaming without degrading the performance of the source production systems.

The new offering, Oracle GoldenGate for Big Data, provides integration for Apache Flume, Apache HDFS, Apache Hive, and Apache HBase. It also includes Oracle GoldenGate for Java, which enables customers to easily integrate with additional big data systems, such as Oracle NoSQL, Apache Kafka, Apache Storm, Apache Spark, and others.
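To give a feel for how a target like HDFS is wired up, here is an illustrative sketch of the kind of handler properties involved. The property names below are representative of the product's Java-based handler configuration, but exact names and values vary by release, so treat this as a sketch and consult the Oracle GoldenGate for Big Data documentation for your version:

```properties
# Illustrative handler configuration for streaming change data into HDFS
# (representative property names; verify against your release's docs).
gg.handlerlist=hdfs
gg.handler.hdfs.type=hdfs
gg.handler.hdfs.rootFilePath=/ogg/trades
gg.handler.hdfs.format=delimitedtext
gg.handler.hdfs.mode=tx

# Hadoop client libraries the handler needs on its classpath.
gg.classpath=/usr/lib/hadoop/client/*
```

The same handler-based pattern extends to the Flume, Hive, and HBase targets, and via Oracle GoldenGate for Java to custom targets such as Kafka or Storm.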

You can learn more about our new offering via Oracle GoldenGate for Big Data data sheet and by registering for our upcoming webcast:

How to Future-Proof your Big Data Integration Solution

March 5th, 2015 10am PT/ 1pm ET

I invite you to join this webcast to learn from Oracle and Cloudera executives how to future-proof your big data infrastructure. The webcast will discuss:

  • Selection criteria that will drive business results with Big Data Integration 
  • Oracle's new big data integration and governance offerings, including Oracle GoldenGate for Big Data
  • Oracle’s comprehensive big data features in a unified platform 
  • How Cloudera Enterprise Data Hub and Oracle Data Integration combine to offer complementary features to store data in full fidelity, to transform and enrich the data for increased business efficiency and insights.

Hope you can join us and ask your questions to the experts.

Thursday Jan 22, 2015

OTN Virtual Technology Summit Data Integration Subtrack Features Big Data Integration and Governance

I am sure many of you have heard about the quarterly Oracle Technology Network (OTN) Virtual Technology Summits. These summits provide a hands-on learning experience on the latest offerings from Oracle by bringing together experts from our community and product management team.

The next OTN Virtual Technology Summit is scheduled for February 11th (9am-12:30pm PT) and will feature Oracle's big data integration and metadata management capabilities with hands-on lab content.

The Data Integration and Data Warehousing sub-track includes the following sessions and speakers:

Feb 11th 9:30am PT -- HOL: Real-Time Data Replication to Hadoop using GoldenGate 12c Adaptors

Oracle GoldenGate 12c is well known for its highly performant data replication between relational databases. With the GoldenGate Adaptors, the tool can now apply the source transactions to a Big Data target, such as HDFS. In this session, we'll explore the different options for utilizing Oracle GoldenGate 12c to perform real-time data replication from a relational source database into HDFS. The GoldenGate Adaptors will be used to load movie data from the source to HDFS for use by Hive. Next, we'll take the demo a step further and publish the source transactions to a Flume agent, allowing Flume to handle the final load into the targets.

Speaker: Michael Rainey, Oracle ACE, Principal Consultant, Rittman Mead

Feb 11th 10:30am PT -- HOL: Bringing Oracle Big Data SQL to Oracle Data Integration 12c Mappings

Oracle Big Data SQL extends Oracle SQL and Oracle Exadata SmartScan technology to Hadoop, giving developers the ability to execute Oracle SQL transformations against Apache Hive tables and extending the Oracle Database data dictionary to the Hive metastore. In this session we'll look at how Oracle Big Data SQL can be used to create ODI12c mappings against both Oracle Database and Hive tables, to combine customer data held in Oracle tables with incoming purchase activities stored on a Hadoop cluster. We'll look at the new transformation capabilities this gives you over Hadoop data, and how you can use ODI12c's Sqoop integration to copy the combined dataset back into the Hadoop environment.

Speaker: Mark Rittman, Oracle ACE Director, CTO and Co-Founder, Rittman Mead

Feb 11th 11:30am PT -- An Introduction to Oracle Enterprise Metadata Manager

This session takes a deep technical dive into the recently released Oracle Enterprise Metadata Manager. You’ll see the standard features of data lineage, impact analysis and version management applied across a myriad of Oracle and non-Oracle technologies into a consistent metadata whole, including Oracle Database, Oracle Data Integrator, Oracle Business Intelligence and Hadoop. This session will examine the Oracle Enterprise Metadata Manager "bridge" architecture and how it is similar to the ODI knowledge module. You will learn how to harvest individual sources of metadata, such as OBIEE, ODI, the Oracle Database and Hadoop, and you will learn how to create OEMM configurations that contain multiple metadata stores as a single coherent metadata strategy.

Speaker: Stewart Bryson, Oracle ACE Director, Owner and Co-founder, Red Pill Analytics
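The lineage and impact-analysis features these sessions cover boil down to walking a directed graph of harvested metadata objects. Here is a toy sketch of that idea; the object names and graph shape are invented for illustration and are not OEMM's internal model:

```python
# Toy metadata lineage graph: each edge points from a source object to the
# objects derived from it (source table -> ODI mapping -> warehouse table
# -> BI dashboard). Names are hypothetical examples.
LINEAGE = {
    "ORACLE.SALES": ["ODI.LOAD_SALES_DW"],
    "HIVE.WEB_LOGS": ["ODI.LOAD_SALES_DW"],
    "ODI.LOAD_SALES_DW": ["DW.FACT_SALES"],
    "DW.FACT_SALES": ["OBIEE.SALES_DASHBOARD"],
}

def impact(obj, graph=LINEAGE):
    """Everything downstream of obj: change obj and these objects are affected."""
    seen, stack = set(), [obj]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen
```

Impact analysis is this traversal run forward from a changed object; data lineage is the same traversal run over the reversed edges, back toward the sources.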

I invite you to register now to this free event and enjoy this feast for big data integration and governance enthusiasts.

Americas -- February 11th/ 9am to 12:30pm PT- Register Now

Please note the same OTN Virtual Technology Summit content will be presented again for EMEA and APAC audiences. You can register for them via the links below.

EMEA – February 25th / 9am to 12:30pm GMT* - Register Now

APAC – March 4th / 9:30am-1:00pm IST* - Register Now

Join us and let us know how you like the data integration sessions in this quarter's OTN event.

Monday Dec 29, 2014

Oracle Data Enrichment Cloud Service (ODECS) - Coming Soon

What are your plans around Big Data and Cloud?

If your organization has already begun to explore these topics, you might be interested in a new offering from Oracle that will dramatically simplify how you use your data in Hadoop and the Cloud:

Oracle Data Enrichment Cloud Service (ODECS)

There is a perception that most of the time spent in Big Data projects is dedicated to harvesting value. The reality is that 90% of the time in Big Data projects is actually spent on data preparation. Data may be structured, but more often it will be semi-structured, such as weblogs, or fully unstructured, such as free-form text. The content is vast, inconsistent, incomplete, often off topic, and comes in many differing formats from many sources. In this environment each new dataset takes weeks or months of effort to process, frequently requiring programmers to write custom scripts. Minimizing data preparation time is the key to unlocking the potential of Big Data.

Oracle Data Enrichment Cloud Service (ODECS) addresses this very reality. ODECS is a non-technical, web-based tool that sets out to minimize data preparation time in an effort to quickly unlock the potential of your data. The ODECS tool provides an interactive set of services that automate, streamline, and guide the process of data ingestion, preparation, enrichment, and governance without costly manual intervention.

The technology behind this service is amazing; it intuitively guides the user with a machine learning driven recommendation engine based on semantic data classification and natural language processing algorithms. But the best part is that non-technical staff can use this tool as easily as they use Excel, resulting in a significant cost advantage for data intensive projects by reducing the amount of time and resources required to ingest and prepare new datasets for downstream IT processes.

Curious to find out more? We invite you to view a short demonstration of ODECS below:


Let us know what you think!

Stay tuned as we write more about this offering…

Wednesday Dec 17, 2014

Oracle Partition Exchange Blog from the ODI A-Team

More great information from the ODI A-Team!

Check out the A-Team’s most recent blog about the Oracle Partition Exchange – it does come in two parts:

Using Oracle Partition Exchange with ODI

Configuring ODI with Oracle Partition Exchange

The knowledge module is on Java.net, and it is called “IKM Oracle Partition Exchange Load”.  To search for it, enter “PEL” or “exchange” in the Search option of Java.net.

A sample ODI 12.1.3 Repository is available as well.  The ODI sample repository has great examples of how to perform both initial and incremental data upload operations with Oracle Partition Exchange.  This repository will help users to understand how to use Oracle Partition Exchange with ODI.

Happy reading!

Thursday Dec 11, 2014

Recap of Oracle GoldenGate 12c for the Enterprise and the Cloud webcast

Last week I hosted a webcast on Oracle GoldenGate 12c's latest features and its solutions for cloud environments. For those of you who missed it, I wanted to give a quick recap and remind you that you can watch it on demand via the following link:

Oracle GoldenGate 12c for the Enterprise and the Cloud

In this webcast my colleague Chai Pydimukkala, senior director of product management for Oracle GoldenGate, and I talked about some of the key challenges in cloud deployments and how Oracle GoldenGate addresses them. We discussed examples of cloud-specific data integration use cases, such as synchronizing data between on-premises systems and Oracle Cloud or Amazon Cloud environments. We also discussed zero-downtime consolidation to the cloud using Oracle GoldenGate.

In the webcast, Chai also presented the latest features of Oracle GoldenGate 12.1.x including:

  • New database support, including Informix, SQL Server 2014, MySQL Community Edition
  • Real-time data integration between on-premises and cloud with SOCKS5 compliance
  • New features in Oracle GoldenGate Veridata especially the new data repair capabilities
  • Enhancements to Integrated Delivery, and support for capturing data from an Active Data Guard standby system
  • The new migration utility to help with the move from Oracle Streams to Oracle GoldenGate.

As with previous GoldenGate webcasts we had a very interactive Q&A session and received tons of questions. We tried to answer as many as possible in the available time but could not get to all of them.  Below are some of the most commonly asked questions we received during the webcast, with brief answers:

Question: Does GoldenGate replace ODI? When shall we use an ETL tool vs GoldenGate?

Answer: GoldenGate is designed for real-time change data capture, routing, and delivery. It performs basic, row-level transformations; for complex transformation requirements you still need ETL/E-LT solutions. Our customers augment their existing ETL/E-LT solutions by adding GoldenGate for real-time, low-impact change data capture and delivery. GoldenGate can deliver data for ETL in flat-file format, feed staging tables, or publish JMS messages. Oracle Data Integrator's E-LT architecture makes a perfect combination: GoldenGate captures changed data non-intrusively with low impact and delivers it to staging tables in the target with subsecond latency, and ODI, with its ability to perform transformations within the target (or source) database, takes this change data, performs transformations in micro-batches, and loads the user tables with high performance. Because of this natural and strategic fit between the products, we have tightly integrated ODI and GoldenGate. To learn more about how GoldenGate and ODI are integrated and work together, please watch this on-demand webcast. I also recommend reading the following white paper on real-time data warehousing best practices.

Here you can see a demo of ODI, and for customer examples you can watch the Paychex, RBS, and Raymond James videos.
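The capture-to-staging-to-micro-batch pattern described in the answer above can be sketched with an in-memory database. This is a toy model of the division of labor, with made-up table names: GoldenGate's role is played by rows pre-loaded into a staging table, and ODI's micro-batch apply is a simple function:

```python
import sqlite3

# Toy model: GoldenGate delivers change rows (insert/update/delete) to a
# staging table; a micro-batch job then applies them to the user table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute("CREATE TABLE staging (op TEXT, id INTEGER, amount REAL)")

# Changes captured since the last micro-batch: two inserts, an update, a delete.
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)",
                 [("I", 1, 100.0), ("I", 2, 50.0),
                  ("U", 1, 120.0), ("D", 2, None)])

def apply_micro_batch(conn):
    """Apply staged changes to the target in capture order, then clear staging."""
    cur = conn.cursor()
    for op, rid, amount in cur.execute(
            "SELECT op, id, amount FROM staging ORDER BY rowid").fetchall():
        if op == "D":
            cur.execute("DELETE FROM target WHERE id = ?", (rid,))
        else:  # 'I' or 'U' both become an upsert on the key
            cur.execute("INSERT OR REPLACE INTO target (id, amount) VALUES (?, ?)",
                        (rid, amount))
    cur.execute("DELETE FROM staging")
    conn.commit()

apply_micro_batch(conn)
```

After the batch, the target reflects all four changes: row 1 exists with its updated amount and row 2 has been deleted, while the staging table is empty and ready for the next capture interval.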

Question: Is there a plan to sell GoldenGate as a service soon?

Answer: Yes, it is in the plans. We are working with the Oracle Cloud team. But we are not able to give a timeline.

Question: Are Integrated Capture and Delivery only available for Oracle Database, or can they be used for non-Oracle databases?

Answer: Integrated Capture and Delivery are only available for Oracle Database and truly differentiate Oracle GoldenGate from other data integration and replication vendors. We offer Coordinated Delivery for all supported databases. Coordinated Delivery also simplifies configuration significantly, and it works with non-Oracle databases too. You can read more about Coordinated Delivery in a related blog, via the Oracle GoldenGate 12c Release 1 New Features Overview white paper, or in the documentation.

Question: Is GoldenGate available for download for trial?

Answer: Yes, you can download GoldenGate on OTN for education and development purposes: http://www.oracle.com/technetwork/middleware/goldengate/downloads/index.html.  For big data use cases, you can use the Big Data Lite virtual environment to experiment with Oracle GoldenGate.

Question: Does GoldenGate replace Active Data Guard? 

Answer: No. The products are complementary. Data Guard is a physical replication solution designed for Oracle Database disaster recovery, and it offers that with great simplicity and performance. Oracle GoldenGate offers logical/transactional data replication, which supplements Active Data Guard by eliminating downtime during planned outages (migration, consolidation, maintenance) and by enabling active-active data center synchronization for maximum availability. The license for Oracle GoldenGate for Oracle Database also includes Active Data Guard. As mentioned in the webcast, GoldenGate 12c can now capture data from Active Data Guard's standby system too.

Question: Does GoldenGate Veridata repair a subset of data instead of doing a full sync? For example, I want to repair only missed deletes.

Answer: Yes. Oracle GoldenGate Veridata can do granular repair for out-of-sync records. Please see the Oracle GoldenGate Veridata data sheet for more info.

Question: How do we use Enterprise Manager for GoldenGate? 

Answer: The Oracle Management Pack for Oracle GoldenGate license includes an Enterprise Manager plug-in that allows you to use your Oracle Enterprise Manager solution to monitor and manage Oracle GoldenGate deployments.

If you did not attend the webcast live, I highly recommend watching Oracle GoldenGate 12c for the Enterprise and the Cloud on demand and listening to the long Q&A session with Chai. During the webcast we covered many other frequently asked questions.


Wednesday Dec 10, 2014

Oracle Enterprise Metadata Management 12.1.3.0.1 is now available!

As a quick refresher, metadata management is essential to solving a wide variety of critical business and technical challenges: understanding how report figures are calculated, understanding the impact of changes to data upstream, providing reports in a business-friendly way in the browser, and providing reporting capabilities on the entire metadata of an enterprise for analysis and improvement. Oracle Enterprise Metadata Management is built to address all these pressing needs in a lightweight browser-based interface. Today, we announce the availability of Oracle Enterprise Metadata Management 12.1.3.0.1 as we continue to enhance this offering.

With Oracle Enterprise Metadata Management 12.1.3.0.1, you will find business glossary updates, user interface updates for a better experience, and improved and new metadata harvesting bridges, including Oracle SQL Developer Data Modeler, Microsoft SQL Server Integration Services, SAP Sybase PowerDesigner, Tableau, and more. There are also new dedicated web pages for tracing data lineage and impact! At a more granular level you will also find new customizable action menus per repository object type for more personalization. For a full read on new features, please read here. Additionally, view here for the certification matrix details.

Download Oracle Enterprise Metadata Management 12.1.3.0.1!

Thursday Nov 20, 2014

Let Oracle GoldenGate 12c Take You to the Cloud

If your organization is like ~80% of the global business community, you are most likely working on a cloud computing strategy, or actively implementing one. Cloud computing is growing at five times the rate of overall IT because of the clear and already proven cost savings, agility, and scalability benefits of cloud architectures.

When organizations decide to embark on their cloud journey, they find several questions and challenges to be addressed, involving data accessibility, security, availability, system management, performance, and more. Oracle GoldenGate's real-time data integration and bi-directional transactional replication technology addresses critical challenges such as:

  • How do I move my systems to the cloud without interrupting operations?
  • How do I enable timely data synchronization between systems in the cloud and on-premises to ensure access to consistent data for all end users?
  • How do I run operational reports with the data I have in cloud environments, or feed my analytical systems in cloud solutions?
  • In managed or private clouds, how do I keep the cloud platform highly available when I need to do maintenance or upgrades?

On Tuesday, December 2nd we will tackle these questions in a free webcast:

Live Webcast: Oracle GoldenGate 12c for the Enterprise and the Cloud

Tuesday, December 2nd, 2014 10am PT/ 1pm ET 

In this webcast, you will not only hear about Oracle GoldenGate's strong solutions for cloud environments, but also the latest features that strengthen its offering. The new features we will discuss include:

  • Support for Informix, SQL Server 2014, MySQL Community Edition, and big data environments
  • Real-time data integration between on premises and cloud with SOCKS5 compliance
  • New data repair functionality to help ensure database consistency across heterogeneous systems
  • Moving from Oracle Streams to GoldenGate with the new migration utility

I would like to invite you to join me and my colleague Chai Pydimukkala, Senior Director of Product Management for Oracle GoldenGate, in this session to learn the latest on GoldenGate 12c and ask your questions in a live Q&A.

Hope to see you there!

About

Learn the latest trends, use cases, product updates, and customer success examples for Oracle's data integration products, including Oracle Data Integrator, Oracle GoldenGate, and Oracle Enterprise Data Quality.
