
Welcome to All Things Data Integration: Announcements, Insights, Best Practices, Tips & Tricks, and Trend Related...

Recent Posts

Data Integration

Get Going with Oracle Cloud Infrastructure Data Integration

We hope you have been finding the Oracle Cloud Infrastructure Data Integration blogs helpful as you get started with the service:

Workspace in Oracle Cloud Infrastructure (OCI) Data Integration
Understanding VCN Configuration for Oracle Cloud Infrastructure (OCI) Data Integration
Data Asset in Oracle Cloud Infrastructure (OCI) Data Integration
Project Setup in Oracle Cloud Infrastructure (OCI) Data Integration
Data Flow Overview in Oracle Cloud Infrastructure (OCI) Data Integration
Integration Tasks in Oracle Cloud Infrastructure (OCI) Data Integration

More are coming!

We also wanted to point you to some related exploratory Oracle Cloud Infrastructure Data Integration blogs from David Allan. Thanks, David!

Oracle Cloud Infrastructure Data Integration and Python SDK – Shows the first Oracle Cloud Infrastructure Data Integration API in action; the example lists the workspaces in a compartment.

Executing Tasks using Python SDK in Oracle Cloud Infrastructure Data Integration – Uses the Oracle Cloud Infrastructure Data Integration Python SDK to execute a task that has been published to an application.

Oracle Cloud Infrastructure Data Integration and Fn – Shows how to use Functions when you want to focus on writing code to meet business needs; the example is the 'hello world' of Data Integration. This example will be extended in subsequent posts to illustrate integration with other OCI services, such as the Events Service and the Notification Service.

Automate Loading Data to a Data Lake or Data Warehouse Using OCI Data Integration and Fn – Explains how multiple Oracle Cloud Infrastructure services work together to load data into a data lake or Autonomous Data Warehouse (ADW), leveraging Oracle Cloud Infrastructure Data Integration, Fn, and the Events Service to automate the load.

Happy reading!
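In the spirit of the Python SDK post listed above, here is a minimal sketch of listing workspaces in a compartment with the OCI Python SDK. It assumes a configured ~/.oci/config profile, uses a placeholder compartment OCID, and the attribute names follow the SDK models at the time of writing, so verify them against your SDK version.

    import oci

    # Load the default profile from ~/.oci/config
    config = oci.config.from_file()

    # Client for the OCI Data Integration service
    di_client = oci.data_integration.DataIntegrationClient(config)

    # Placeholder compartment OCID -- replace with your own
    compartment_id = "ocid1.compartment.oc1..example"

    # List the workspaces in the compartment and print name and lifecycle state
    workspaces = di_client.list_workspaces(compartment_id=compartment_id).data
    for ws in workspaces:
        print(ws.display_name, ws.lifecycle_state)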


Oracle Cloud Infrastructure Data Integration

Understanding VCN Configuration for Oracle Cloud Infrastructure (OCI) Data Integration

Let's learn more about Oracle Cloud Infrastructure Data Integration. Today's blog explains Virtual Cloud Network (VCN) configuration for Oracle Cloud Infrastructure Data Integration. Check out the previous Oracle Cloud Infrastructure Data Integration blog about Workspaces.

Overview of Virtual Cloud Network (VCN)

A virtual cloud network (VCN) is a customizable, private network in Oracle Cloud Infrastructure. Just like a traditional data center network, the VCN gives you complete control over the network environment. This includes assigning your own private IP address space, creating subnets and route tables, and configuring stateful firewalls. A VCN resides within a single region but can span multiple Availability Domains. Once users, groups, and compartments are created, you can start creating the VCN. By default, there are two kinds of subnets in a VCN (region specific):

Private subnet - Instances have only private IP addresses assigned to the Virtual Network Interface Card (VNIC).
Public subnet - Contains both private and public IP addresses assigned to VNICs.

For a deeper understanding of VCNs, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Concepts/overview.htm.

Oracle Cloud Infrastructure Data Integration and Virtual Cloud Networks

Now to the main topic: understanding VCNs with Oracle Cloud Infrastructure Data Integration. Oracle Cloud Infrastructure Data Integration runs in an Oracle tenancy, which resides outside the user tenancy. For Data Integration to access the resources in the user tenancy and obtain VCN and subnet information, the following policy needs to be set at the tenancy level (that is, at the root compartment) or at the compartment level:

allow service dataintegration to use virtual-network-family in tenancy
(or)
allow service dataintegration to use virtual-network-family in compartment <vcn_compartment>
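If you prefer to script this step, the same policy can be created with the OCI CLI. The sketch below uses placeholder names and OCIDs; keep only the statement that matches the scope you want (tenancy or compartment).

    oci iam policy create \
        --compartment-id ocid1.tenancy.oc1..example \
        --name "di-network-access" \
        --description "Allow OCI Data Integration to use the virtual network family" \
        --statements '["allow service dataintegration to use virtual-network-family in tenancy"]'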
Different Options when Creating a Workspace

While creating a workspace, there are two options: Enable Private Network, or use the public network. Oracle Cloud Infrastructure Data Integration only supports regional subnets, that is, subnets that span all Availability Domains in a region; regional subnets are used for high availability. When you create a workspace with "Enable Private Network" selected, the Oracle Cloud Infrastructure Data Integration VCN is extended to the user-selected VCN. When the option is not selected, Oracle Cloud Infrastructure services such as Object Storage are accessed through the Service Gateway defined at the tenancy level, and the remaining resources, such as databases, are accessed through the public internet.

Let us consider multiple scenarios to understand Oracle Cloud Infrastructure Data Integration with a VCN, selecting a private or public subnet and accessing its resources. Before testing these scenarios, the following prerequisites were created in the environment:

Created a VCN named "VCN_DI_CONCEPTS" in the respective compartment.
Created four subnets within that VCN. Oracle Cloud Infrastructure Data Integration only supports regional subnets; for more information on regional subnets, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/managingVCNs.htm.
Created the test resources, noting the subnet and region of each.
For Autonomous Data Warehouse (ADW) to run as a private instance, a Network Security Group (NSG) needs to be defined. In the NSG, two ingress rules were defined: one for PUBLIC_SUBNET_DI (10.0.2.0/24) and one for PRIVATE_SUBNET_DI (10.0.1.0/24).
For DB Systems in the private subnet, the corresponding rules were added to the route table.
For the Service Gateway, select the option "All IAD Services in Oracle Services Network". To understand more about this option, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/servicegateway.htm.

Scenario 1 - Accessing ADW, Object Storage, and databases in the same region using a Data Integration workspace in a private subnet

The Oracle Cloud Infrastructure Data Integration workspace was created in PRIVATE_SUBNET_DI (10.0.1.0/24), and the Service Gateway is used in PRIVATE_SUBNET_DI.

Scenario 2 - Accessing ADW and Object Storage in different regions, and accessing DB Systems residing in a public subnet

To access ADW in a different region and DB Systems in a public subnet, a NAT Gateway is required. The Service Gateway is still required for Object Storage, with the NAT Gateway handling cross-region traffic. The route rules add the NAT Gateway alongside the existing Service Gateway.

Scenario 3 - Accessing ADW, Object Storage, and a database in the same region using a Data Integration workspace in a public subnet

The OCI Data Integration workspace is in the public subnet "PUBLIC_SUBNET_DI" (10.0.2.0/24). If the Oracle Cloud Infrastructure Data Integration workspace has been assigned to one VCN and needs to connect to resources residing in another VCN, whether in the same region or a different one, then local or remote peering is required accordingly. To understand more about local and remote peering, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/VCNpeering.htm. If the resources have public IPs, a NAT Gateway or Service Gateway can be used as appropriate.

Scenario 4 - ADW, Object Storage, and DB Systems residing in a public subnet, with all of these resources in a different tenancy, region, and VCN

To test this scenario, the resources were created in the Mumbai region in a different tenancy. The Oracle Cloud Infrastructure Data Integration workspace is in a public subnet (10.0.2.0/24) in the Ashburn region.

Scenario 5 - Connecting to ADW, databases, and Object Storage using a Data Integration workspace with "Enable Private Network" disabled

If the "Enable Private Network" option is not selected when creating the workspace, the public connectivity option is in effect: Oracle Cloud Infrastructure Data Integration can access all public services using the Service Gateway and NAT Gateway in the Oracle tenancy. It cannot access private resources, since no VCN is assigned to the workspace. In this example, Oracle Cloud Infrastructure Data Integration is enabled in the Ashburn region.

Scenario 6 - Connecting Oracle Cloud Infrastructure Data Integration to an on-premises database

There are two ways Oracle Cloud Infrastructure Data Integration can connect to an on-premises database:

IPSec VPN
FastConnect

Below are the details on how Oracle Cloud Infrastructure Data Integration can access the database using FastConnect. To understand more about FastConnect, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Concepts/fastconnect.htm. The Oracle Cloud Infrastructure Data Integration workspace should be in the same subnet where FastConnect is configured.
In the example below, a VCN named "####-iad.vcn" is created by Oracle as part of FastConnect, and a regional public subnet is created within the VCN. A Dynamic Routing Gateway (DRG) is configured; the DRG acts as a virtual router that provides a path for private traffic (that is, traffic that uses private IPv4 addresses) between the user VCN and networks outside the VCN's region. For more information on DRGs, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/managingDRGs.htm.

The DRG can be configured with IPSec VPN or Oracle FastConnect. Within the DRG, two virtual circuits have been configured using FastConnect. Route rules are defined in the VCN, the OCI Data Integration workspace is created in the subnet, and the connection is created and tested under Data Assets.

Summary - We can observe that Scenario 1 and Scenario 2 behave the same regardless of the subnet allocated to the workspace, since the secondary VNIC extended into the user's VCN/tenancy is always private. The scenarios above cover both cases: an Oracle Cloud Infrastructure Data Integration workspace assigned to a public or private subnet, and a workspace with no network assigned (the "Enable Private Network" option disabled).

We just recently announced the general availability of Oracle Cloud Infrastructure Data Integration. With a series of upcoming blogs, we look forward to introducing various concepts. This concludes our blog on how to use VCNs with Oracle Cloud Infrastructure Data Integration. To learn more, check out some Oracle Cloud Infrastructure Data Integration Tutorials and the Oracle Cloud Infrastructure Data Integration Documentation.


Oracle Cloud Infrastructure Data Integration

Workspace in Oracle Cloud Infrastructure (OCI) Data Integration

Oracle Cloud Infrastructure Data Integration is a fully managed, serverless, cloud-native service that helps you with common extract, transform, and load (ETL) tasks such as ingesting data from different sources; cleansing, transforming, and reshaping that data; and then efficiently loading it to a target system on Oracle Cloud Infrastructure. Before you get started, the administrator must satisfy connectivity requirements so that Oracle Cloud Infrastructure Data Integration can establish a connection to your data sources. The administrator then creates workspaces and gives you access to them. You use workspaces to stay organized and easily manage different data integration environments.

The workspace is the foundational component of Oracle Cloud Infrastructure Data Integration. It acts as the environment where a user can work on multiple projects, publish and run tasks, and define data assets. The administrator must define policies for the users and groups before they can start using the data integration solution.

Creating and Editing a Workspace

Prerequisites:

All the necessary compartments and the VCN have been created for Data Integration activities. To understand more about VCNs for Oracle Cloud Infrastructure Data Integration, refer to https://docs.cloud.oracle.com/en-us/iaas/data-integration/using/preparing-for-connectivity.htm.
A group has been created for users in charge of workspaces, and users have been added to the group.
All the policies have been set up by the administrator so that users can access Oracle Cloud Infrastructure Data Integration. If the administrator wants to limit activities within the network, "inspect" permission for VCNs and subnets within the compartment can be granted instead of "manage".

Below is the list of policies required to access Oracle Cloud Infrastructure Data Integration.

Give the group permission to manage Oracle Cloud Infrastructure Data Integration workspaces:
allow group <group_name> to manage dis-workspaces in compartment <compartment_name>

Give the group permission to manage network resources for workspaces:
allow group <group_name> to manage virtual-network-family in compartment <compartment_name>

Give the group permission to manage tag namespaces and tags for workspaces:
allow group <group_name> to manage tag-namespaces in compartment <compartment_name>

Oracle Cloud Infrastructure Data Integration is located in an Oracle tenancy, which is outside the user tenancy. Data Integration sends a request to the user tenancy, and the user must give the requestor (Data Integration) permission to use the virtual networks set up for integration. Without a policy to accept this request, data integration fails. These policies can be defined at the compartment level or at the tenancy level (that is, at the root compartment):

allow service dataintegration to use virtual-network-family in tenancy
allow service dataintegration to inspect instances in tenancy

To create a workspace:

Select the Data Integration link from the main menu of the Oracle Cloud Infrastructure Console.
Select the corresponding compartment and click "Create Workspace".
Provide the necessary information: a name for the workspace, VCN details, and other information such as DNS settings and tags.
Click Create to create the workspace in the corresponding compartment. You're returned to the Workspaces page. It may take a few minutes before your workspace is ready for you to access. After it's created, you can select the workspace from the list.
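The same creation step can also be automated with the OCI Python SDK. The sketch below uses placeholder OCIDs and a placeholder display name; the CreateWorkspaceDetails field names follow the SDK models at the time of writing and may differ slightly between SDK versions, so check the SDK reference before relying on it.

    import oci

    config = oci.config.from_file()
    di_client = oci.data_integration.DataIntegrationClient(config)

    # Placeholder OCIDs -- replace with your compartment, VCN, and regional subnet
    details = oci.data_integration.models.CreateWorkspaceDetails(
        display_name="DI_WORKSPACE_DEMO",
        compartment_id="ocid1.compartment.oc1..example",
        is_private_network_enabled=True,
        vcn_id="ocid1.vcn.oc1..example",
        subnet_id="ocid1.subnet.oc1..example",
    )

    # create_workspace is asynchronous; the returned work request ID can be polled
    response = di_client.create_workspace(details)
    print(response.headers.get("opc-work-request-id"))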
You can see the status of a workspace creation or startup using View Status, which is available while the workspace is being created or started from the Stopped state. A workspace can then be accessed in two ways, as shown in the picture below. After accessing the workspace, new projects, data assets, or applications can be created from the workspace's main console.

You can edit workspace details such as the name or description, but you can't change the identifier, compartment, VCN, or subnet selections. To edit the tags applied to a workspace, select Add Tags from the workspace's Actions (three dots) menu. In the Console, you edit a workspace from the Workspaces page: select Edit from a workspace's Actions (three dots) menu, edit the fields you want to change, and then click Save Changes.

Terminating and Stopping a Workspace

Only workspaces in Active or Stopped status can be terminated. When you terminate a workspace, all the associated objects and resources are removed, including:

Projects
Folders
Data Flows
Tasks
Applications
Task Runs
Data Assets

All executions within a workspace must be stopped before you can terminate the workspace. Any open tabs associated with the workspace you're terminating are closed upon termination. Once terminated, a workspace cannot be restored, so be sure to carefully review the workspace and its resources before you commit to a termination. To terminate a workspace, click the workspace's Actions (three dots) menu and then click Terminate.

We just recently announced the general availability of Oracle Cloud Infrastructure Data Integration. With a series of upcoming blogs, we look forward to introducing various concepts. This concludes our initial blog on how a workspace can be created and used in Oracle Cloud Infrastructure Data Integration. To learn more, check out some Oracle Cloud Infrastructure Data Integration Tutorials and the Oracle Cloud Infrastructure Data Integration Documentation.


Data Integration

Oracle Named 2019 Gartner Peer Insights Customer Choice for Data Integration Tools

We are pleased to announce that Oracle has been recognized as a 2019 Gartner Peer Insights Customers' Choice for Data Integration Tools. This distinction is especially important to Oracle because it is based on direct feedback from our customers. Thank you all for your support!

Oracle Data Integration provides an enterprise-class, fully unified solution for building, deploying, and managing real-time data-centric architectures. It combines all the elements of data integration (real-time data movement, transformation, synchronization, data quality, data management, and data services) to ensure that information is timely, accurate, and consistent across complex systems. By using Oracle Data Integration, customers can realize significant cost savings and efficiency gains, which are critical in today's challenging global economic climate. They are delivering real-time, enriched, and trusted data from disparate cloud and on-premises sources to enable insightful analytics.

“We are honored to receive Gartner Peer Insights Customers’ Choice designation for the Data Integration Tools market. We thank our customers for their support,” said Jeff Pollock, Vice President of Product Management for Oracle. “Over the past 20 years Oracle Data Integration has evolved into an industry leading platform used by thousands of companies across every industry. Working together with our customers, Oracle is committed to driving the innovation necessary to solve the industry’s most challenging data integration issues.”

Find out more! Gartner Peer Insights is an enterprise IT product and service review platform that hosts more than 300,000 verified customer reviews across 430 defined markets. In markets where there is enough data, Gartner Peer Insights recognizes up to seven vendors that are the most highly rated by their customers through the Gartner Peer Insights Customers’ Choice distinction. According to Gartner, “The Gartner Peer Insights Customers’ Choice is a recognition of vendors in this market by verified end-user professionals.” To ensure fair evaluation, Gartner maintains rigorous criteria for recognizing vendors with a high customer satisfaction rate.

We at Oracle are deeply proud to be honored as a 2019 Customers’ Choice for the Data Integration Tools market. To learn more about this distinction, or to read the reviews written about our products by the IT professionals who use them, check out the Customers’ Choice Data Integration Tools for Oracle landing page on Gartner Peer Insights.

Here are some excerpts of what Oracle customers are saying:

“Using GoldenGate, it is possible to carry out operations in high data volumes in a much faster and uninterrupted manner. It is also very easy to use. One of the most effective abilities is to manage transactional processing in complex and critical environments. It is very important that data, costs and ongoing transactions are regularly secured to bring the risk to near zero." - Software Engineer, Finance Industry

“ODI is a very good product. It is lightning fast (which really comes handy when we have to transform massive amount of data), It ability to support heterogeneous databases, big data, JMS, XML, and many other flavors.” - Senior Manager - MIS & Middleware, Service Industry

A big thank you to our wonderful customers who submitted reviews, and to those customers who continue to use our products and services and help shape the future.
The GARTNER PEER INSIGHTS CUSTOMERS’ CHOICE badge is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved. Gartner Peer Insights Customers’ Choice constitute the subjective opinions of individual end-user reviews, ratings, and data applied against a documented methodology; they neither represent the views of, nor constitute an endorsement by, Gartner or its affiliates.


GoldenGate Solutions and News

Oracle GoldenGate Plug-in for Oracle Enterprise Manager v13.2.3.0.0 is now available

We have released GoldenGate OEM Plug-in 13.2.3.0.0. The release's primary focus is to support monitoring of Oracle GoldenGate 18.1 and 19.1 Microservices (MA) instances. In the earlier GoldenGate OEM Plug-in 13.2.2.0.0 release, we started supporting our first GoldenGate 12.3 Microservices instance. In the new release, we have certified the latest GoldenGate releases, 18.1 and 19.1, for both Microservices and Classic. Along with the certification, we are supporting new metrics for coordinated and parallel Replicats, and we have added support for more services (Administration Service and Service Manager) and Deployments in the plug-in. You can discover the new targets in the discovery module and promote the targets of your choice. Finally, we have certified the plug-in with OEM 13.3 in this release.

Once you discover targets, you can select the processes (Extract, Replicat, and so on) while promoting the targets. The selected processes (targets) and their parent processes are promoted automatically. For example, if you select an Extract process under the Admin Server, the OEM Plug-in will promote the selected Extract process, the Admin Server (the Extract's parent), and the Service Manager (the Admin Server's parent). Similarly, if you select a parent process, all its children are selected by default, and you may then choose to de-select a particular child.

Once you promote the processes (targets), you will notice the changes in the dashboard user interface for Microservices processes. All the processes are shown in a tree structure: the Service Manager is the parent process, which shows one or many GoldenGate Deployments, and all the Extracts and Replicats are part of the Admin Server. You can see the status of all services on the screen. Along with Microservices instances, you can monitor Classic instances on the same dashboard. We have given each process (target) a specific type name following GoldenGate terminology, which is helpful when you want to know what type of Extract or Replicat you are monitoring (Classic or Integrated Extract, Coordinated or Parallel Replicat).

When you click the Service Manager on the dashboard, it directs you to a page that shows all the Deployments and the details of their services, such as port and status. In the future, you should be able to search across a particular Deployment. The Admin Server page shows all the Extract and Replicat processes and their detailed metrics. When you click an individual Extract or Replicat, you can see the Metrics, Logs, and Configuration of the process.

For Parallel and Coordinated Replicats (PR/CR), you can see the accumulated metrics in the parent process. The child processes of a PR/CR are not visible on the screen. In the future, we will provide options to select the child processes so that you can monitor them as well.

The GoldenGate OEM Plug-in infrastructure has been upgraded to be compatible with the newer version of Enterprise Manager (EM), 13.3.0.0.0. As mentioned earlier, we have certified GoldenGate 18.1 and 19.1 Classic and Microservices and added a few metrics related to Parallel Replicat and Coordinated Replicat.
To recap how the EM Agent communicates with GoldenGate MA and Classic instances: you do not need to set up the GoldenGate jAgent (Monitor Agent) for the GoldenGate OEM Plug-in to communicate with GoldenGate Microservices instances. The GoldenGate MA architecture provides RESTful APIs to monitor and manage GoldenGate MA instances, and the GoldenGate OEM Plug-in uses these RESTful APIs to communicate with them. For your GoldenGate Classic instances, you still need to set up GoldenGate jAgent 12.2.1.2.0+ for communication. The latest Monitor Agent (12.2.1.2.190530) was released in May 2019.

You can get more details about the release from the documentation. We are working to add more features around monitoring the GoldenGate Microservices and Classic architectures in future releases. Please stay tuned for further updates.
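For a sense of what those RESTful APIs look like, here is a hedged sketch of querying a Service Manager directly with curl. The host, port, and credentials are placeholders, and the endpoint path is assumed from the GoldenGate Microservices REST API reference at the time of writing; verify it against the documentation for your release.

    # List the deployments registered with a Service Manager
    # (endpoint path assumed; confirm in the GoldenGate REST API reference)
    curl -s -k -u oggadmin:<password> \
        https://<servicemanager-host>:<port>/services/v2/deployments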


Release Announcement for Oracle GoldenGate 19.1

This post was authored by Bobby Curtis, Director of Product Management, Oracle.

What’s New in Oracle GoldenGate 19.1

To succeed in today’s competitive environment, you need real-time information. This requires a platform that can unite information from disparate systems across your enterprise without compromising availability and performance. Oracle GoldenGate 19c is a high-performance software application for real-time transactional change data capture, transformation, and delivery, offering bidirectional data replication. The application enables you to ensure that your critical systems are operational 24/7, and that the associated data is distributed across the enterprise to optimize decision-making.

GoldenGate 19.1 Platform New Features

For the Oracle Database
➢ Oracle Database 19c Support - Capture and delivery support for Oracle Database 19c, cloud and on-premises.
➢ Centralized Key Management Service - Use Oracle Key Vault to centralize and manage encryption keys for the replication environment.
➢ Target-Initiated Paths - Distribution paths can be initiated from the Receiver Service to pull trail files.
➢ New REST API Endpoints - Retrieve active transactions and current system change number (SCN) details using REST API endpoints.
➢ New Heartbeat Table Command - The UPGRADE HEARTBEATTABLE command upgrades the heartbeat table from prior versions of Oracle GoldenGate to the 19.1 version (see the short GGSCI sketch after this list).
➢ Cross-Endian Support for Remote Integrated Extract - Automatically enabled when the server where the Integrated Extract is running is different from the server where the Oracle Database is running.

For MySQL
➢ MySQL 8.0 Support for Capture and Delivery - Capture and delivery support for MySQL 8.0 has been added.
➢ MySQL SSL Connection Support - Extract and Replicat can now connect to a MySQL database via SSL.

For DB2 for i
➢ Enhanced TIMESTAMP Support - Supports all valid TIMESTAMP precisions.
➢ New Datatype Support - Support for the DECFLOAT datatype.
➢ New DBOPTIONS USEDATABASEENCODING Parameter - Allows Extract to store all text data in the trail file in the native character encoding.
➢ Improved Extract Performance - Enhanced throughput while reducing overall processing.
➢ Security Improvements - Availability of AES encryption, the credential store, and Oracle Wallet.
➢ Long Running Transaction (LRT) Support - Support for the LRT features showtrans, skiptrans, and forcetrans.

For DB2 z/OS
➢ Enhanced TIMESTAMP Support - Supports all valid TIMESTAMP precisions.
➢ Online Schema Change Support - Support for online CREATE TABLE and DROP TABLE, and ADD, ALTER, and DROP COLUMN commands.
➢ Long Running Transaction (LRT) Support - Support for the LRT features showtrans, skiptrans, and forcetrans.

For DB2 LUW
➢ Enhanced TIMESTAMP Support - Supports all valid TIMESTAMP precisions.
➢ New Datatype Support - Support for the DECFLOAT datatype.
➢ Long Running Transaction (LRT) Support - Support for the LRT features showtrans, skiptrans, and forcetrans.

Other Information
➢ In the initial release, OGG 19.1.0.0.0, Linux builds will be available for most supported database/OS combinations, followed by tiered releases for other supported platforms.
➢ GoldenGate for SQL Server will be released for both Windows and Linux soon, in a 19.1.x release.
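As a small illustration of the new heartbeat table command, upgrading an existing heartbeat table after installing 19.1 might look like the following GGSCI session. The credential alias is a placeholder; run this after DBLOGIN against each database that already has a heartbeat table from an earlier release.

    GGSCI> DBLOGIN USERIDALIAS ggadmin
    GGSCI> UPGRADE HEARTBEATTABLE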
Docs, Downloads, and Certification:
• Documentation is available at: https://docs.oracle.com/en/middleware/goldengate/core/19.1/index.html
• Downloads are available through OTN at: https://www.oracle.com/middleware/technologies/goldengate.html
• Certification Matrix (19.1 cert matrix to be posted soon): https://www.oracle.com/technetwork/middleware/ias/downloads/fusion-certification-100350.html

Join us in upcoming events:
• Stay up to date by visiting our Data Integration Blog for the latest news and articles.
• Save the date! Oracle OpenWorld is September 16th through the 19th. Don’t hesitate to contact us about any special topics that you might like to discuss.


GoldenGate Solutions and News

Zero Down Time (ZDT) Patching for Oracle GoldenGate

This document explains how to apply a patch to, or upgrade, an OGG environment without taking any downtime. It assumes that OGG is already up and running, and that the user is already very familiar with how OGG works and with the actual upgrade process. As with any mission-critical, 24x7 environment, the expectation is that the user takes the necessary precautions to test this process prior to implementing it in production, and is aware of any differences between versions. All of these items are covered in other documents.

Terminology

"New" - Refers to the new OGG installation. This "new" environment is where everything will be running once you have completed the procedure.
"Old" - Refers to the old OGG installation. This "old" environment is the existing OGG installation that you want to upgrade. After the process is completed, you will remove this installation.

Patching OGG homes where Extract(s) are running

1. Install the new OGG version into a new directory. This location will be referred to as the "new" OGG installation.
2. In the new installation:
   a. Apply any necessary patches to bring the release to the most recent bundle patch, and then apply any required one-off patches on top of that.
   b. Create new Extract process(es) with different names than in the old OGG environment.
   c. Create a new set of trail files (with different names than in the old OGG installation).
   d. Copy the parameter files from the old installation into the new one. Modify them to account for new directories and names, and address any deprecated or modified parameters.
3. On the target, create a new Replicat to receive data from the new OGG installation.
4. In the new installation:
   a. Start the Extract.
   b. Start the Extract pump (if necessary).
5. In the old installation:
   a. Wait. How long? It depends. The new Extract started in step 4a did not process any transactions that were open when it started, so wait until any transactions that were open at that time are closed. SEND EXTRACT ... SHOWTRANS may help in this case.
   b. Stop the Extract.
6. On the target:
   a. If the old Replicat is not using a checkpoint table, add one for it.
   b. Once the Replicat from the old installation is at EOF (SEND REPLICAT ... GETLAG), stop the old Replicat.
   c. Start the new Replicat using START REPLICAT ... AFTERCSN [scn], where [scn] is the log_cmplt_csn column value from the checkpoint table for the old Replicat. This tells the new Replicat to pick up right where the old Replicat left off.
7. In the old installation:
   a. Stop the Extract pump (optional).
   b. Clean up the old installation and remove it.

Patching OGG homes where Replicat(s) are running

1. Install the new OGG version into a new directory. This location will be referred to as the "new" OGG installation.
2. In the new installation:
   a. Apply any necessary patches to bring the release to the most recent bundle patch, and then apply any required one-off patches on top of that.
   b. Create new Replicat process(es) with different names than in the old OGG environment. The new Replicat will read from the existing trail files.
   c. Copy the parameter files from the old installation into the new one. Modify them to account for new directories and names, and address any deprecated or modified parameters.
3. In the old installation:
   a. If the old Replicat is not using a checkpoint table, add one for it.
   b. Stop the Replicat when it is at EOF (SEND REPLICAT ... GETLAG).
4. In the new installation, start the new Replicat using START REPLICAT ... AFTERCSN [scn], where [scn] is the log_cmplt_csn column value from the checkpoint table for the old Replicat. This tells the new Replicat to pick up right where the old Replicat left off.
5. In the old installation, clean up the old Replicat and remove it.
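To make the Replicat cutover steps concrete, here is a hedged sketch of the handover commands. The group names (OLDREP, NEWREP) and the checkpoint table name (ggadmin.oggckpt) are hypothetical; substitute your own names and the CSN value returned by the query.

    -- In the old installation (GGSCI): confirm the old Replicat is at EOF, then stop it
    SEND REPLICAT OLDREP, GETLAG
    STOP REPLICAT OLDREP

    -- In the database: read the completed CSN recorded in the old Replicat's checkpoint table
    SELECT log_cmplt_csn FROM ggadmin.oggckpt WHERE group_name = 'OLDREP';

    -- In the new installation (GGSCI): start the new Replicat just past that point
    START REPLICAT NEWREP, AFTERCSN <csn_from_query>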


Demystifying Oracle Cloud Infrastructure

Oracle has a longstanding reputation for providing technologies that empower enterprises to solve demanding business problems. Oracle has built a cloud infrastructure platform that delivers unmatched reliability, scalability, and performance for mission-critical databases, applications, and workloads.

Oracle Cloud Infrastructure is the first cloud built specifically for the enterprise. With the latest high-end components, support for open standards and multi-cloud strategies, and an unwavering commitment to protecting sensitive business data, Oracle Cloud Infrastructure is perfectly suited to meet the needs, and exceed the expectations, of today's enterprise IT teams. Oracle Cloud Infrastructure represents a fundamentally new public cloud architecture and serves as the foundational layer for Oracle Cloud. The infrastructure is designed to provide the performance predictability, core-to-edge security, and governance required for enterprise workloads. Oracle supports traditional, mission-critical, and performance-intensive workloads typically found in on-premises environments, including artificial intelligence (AI), machine learning (ML), and high-performance computing (HPC), as well as cloud-native applications. Oracle Cloud Infrastructure combines the benefits of public cloud (on-demand, self-service, scalability, pay-for-use) with the benefits usually associated with on-premises environments (governance, predictability, control) into a single offering.

Here is a good example: Alliance Data Saves $1 Million Annually Running Critical Applications on Oracle Cloud Infrastructure. Learn more about Oracle Cloud Infrastructure here.


Enabling Analytics with Oracle data integration and Oracle Analytics Cloud on Oracle Autonomous Database

Enabling global analytics is one of the most common use cases among customers who build and maintain a data store. In this post, we identify the critical components required for an end-to-end analytics solution, the characteristics of a great analytics solution, and how Oracle Analytics Cloud, Oracle data integration, and Oracle Autonomous Database combine to provide a platform for great analytics.

Any chosen analytics solution should bring together and balance the requirements of two major stakeholders: those in Information Technology (IT) departments and those in line-of-business functions.

Fig 1: IT and business dictate priorities that need to be balanced in an analytics solution

Achieving this balance between the scalability requirements of IT and the user-experience focus of a visual tool is critical to the success of any visualization solution.

Oracle Data Integration - The IT Component

Oracle data integration solutions help solve key requirements for a successful IT deployment of an analytics solution. Oracle data integration:

Provides the latest data, both in real time and in bulk, from various sources, delivered into the data warehouse built on Oracle Autonomous Database to power analytics,
Helps govern data and provides transparency into the data that underpins the analytics visualizations, for easy lineage and impact analysis, thus increasing trust in the data, and
Enables true global analytics by making data available both on-premises and in the cloud.

Oracle Analytics Cloud - The Business Component

Oracle Analytics Cloud provides the features and benefits that satisfy the requirements of a business user. Oracle Analytics Cloud:

Provides powerful data flows and enrichment features to enable sharable and traceable business data transformations,
Avoids Excel clutter and empowers analysts to enhance data with no coding skills required, and
Enables augmented data enrichment through machine-learning-driven enrichment and data transformations.

Oracle Autonomous Database - The Platform

Oracle Autonomous Database forms the third component of this analytics solution, along with Oracle data integration and Oracle Analytics Cloud. It provides a robust self-driving, self-securing, and self-repairing data store with autonomous data warehousing capabilities.

Watch this video to understand how these three components come together to provide end-to-end analytics on an Oracle platform.

Fig 2: Oracle data integration video

Oracle data integration, along with Oracle Autonomous Data Warehouse and Oracle Analytics, accelerates speed to insight and innovation, enables fast access to a sophisticated set of analytics, and accelerates data preparation and enrichment. Watch this webcast to learn more about how to focus on growing your business and drive innovation with an end-to-end analytics solution.


Data Integration

Loading Data Into Oracle Autonomous Data Warehouse Cloud with Oracle data integration

Oracle offers the world’s first autonomous database. Oracle also offers tools that help customers get data into the autonomous database. In this blog, we will go through what an Autonomous Database is and the capabilities that Oracle data integration provides to help you adopt the Autonomous Data Warehouse Cloud Service (ADWCS).

What is an Autonomous Database?

An autonomous database is a cloud database that uses machine learning to eliminate the human labor associated with database tuning, security, backups, updates, and other routine management tasks traditionally performed by database administrators (DBAs). Autonomous database technology requires that enterprise databases be stored in the cloud, using a cloud service. Being autonomous in the cloud allows the organization to leverage cloud resources to more effectively deploy databases, manage database workloads, and secure the database. A database cloud service makes database capabilities available online, when and where those capabilities are needed. Watch Senior Vice President Juan Loaiza introduce the Oracle Autonomous Database for a deeper insight into the technology.

What is Oracle data integration?

Oracle data integration encompasses a portfolio of cloud-based and on-premises solutions and services that help with moving, enriching, and governing data. Oracle data integration has the following capabilities that make it the logical choice when looking to migrate and move data to Oracle Cloud. Oracle data integration:

Has integrated APIs that allow easy access to Oracle's underlying tables without affecting source system performance, providing real-time data access through change data capture,
Can automate repeated data delivery into the Autonomous Data Warehouse Cloud Service by easily surfacing ADWCS as a target system, and
Brings together real-time data replication, data streaming, bulk data movement, and data governance into a cohesive set of products that are seamlessly integrated for performance.

Watch this video to get a quick glimpse of our latest product and how it functions with Oracle Autonomous Data Warehouse Cloud and Oracle Analytics Cloud.

Moving Data Into Oracle Data Warehouse Cloud Service

Oracle data integration solutions bring together some key technological and user benefits for customers:

Managed by Oracle – Engineered and built by teams with a shared vision, the different solutions and technologies incorporate the best of scientific advances as well as seamless integration between the solutions.
Unified Data Integration – Provides a single pane of glass to control the various components of data integration, such as bulk data movement, real-time data, data quality, and data governance.
Simplify Complex Integration Tasks – Groups together functions that build up to a business or technology pattern, so that often-repeated scenarios can be executed efficiently.
Flexible Universal Credit Pricing – Oracle's pricing tracks usage and can be applied across technologies, allowing customers access to all participating Oracle cloud services, freeing customers from procurement woes and providing a truly agile and nimble set of solutions.

Here are some scenarios that Oracle data integration helps to solve:

Extraction & Transformation – Execute bulk data movement, transformation, and load scenarios,
Data Replication – Change data capture helps replicate data into Oracle Autonomous Data Warehouse and Kafka, for data migration and high-availability architectures,
Data Lake Builder – Create a comprehensive, fully governed, repeatable data pipeline to your big data lakes,
Data Preparation – Ingest and harvest metadata for better data transparency and audits, and
Synchronize Data – Seamlessly synchronize two databases.

Fig 1: A sample architecture for moving data from source to analytics

For a deeper understanding of moving data into Oracle Autonomous Data Warehouse Cloud, watch the webcast below.


GoldenGate Solutions and News

Oracle GoldenGate for SQL Server supports SQL Server 2017 and Delivery to Microsoft Azure SQL Database

The Oracle GoldenGate Product Management team is pleased to announce that Oracle GoldenGate 12.3 for SQL Server has added new functionality to support capture and delivery from/to SQL Server 2017 Enterprise Edition and has added certification to deliver to Microsoft’s Azure SQL Database.

SQL Server 2017

Using patch release 12.3.0.1.181228 of Oracle GoldenGate for SQL Server (CDC Extract), which is available on support.oracle.com under Patches & Updates, customers now have the ability to both capture from and deliver to SQL Server 2017 Enterprise Edition.

Azure SQL Database

Also, using the same patch release as for SQL Server 2017 support, remote delivery to Azure SQL Database is now supported. You can install the Oracle GoldenGate patch on a supported Windows server (see the Certification Matrix) and configure a remote Replicat to deliver data to your Azure SQL Database.

Documentation

For more information, please review the Oracle GoldenGate documentation as well as a quick start tutorial, which is available here: https://apexapps.oracle.com/pls/apex/f?p=44785:24:111923811479624::NO:24:P24_CONTENT_ID,P24_PREV_PAGE:21869,1
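Referring back to the remote Replicat mentioned above: delivery to Azure SQL Database is configured much like any SQL Server Replicat connecting over ODBC. The sketch below is illustrative only; the Replicat name, DSN, credential alias, and schema mapping are placeholders, and the exact parameters should be checked against the Oracle GoldenGate for SQL Server documentation.

    REPLICAT razure
    -- ODBC DSN pointing at the Azure SQL Database; credentials held in the credential store
    TARGETDB azuresql_dsn, USERIDALIAS azure_gg_alias
    MAP dbo.*, TARGET dbo.*;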


Data Integration

Integration: Heart of the Digital Economy Podcast Series – Moving Data to the Cloud and Autonomous Data Warehouse

Authored by Steve Quan, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if you want to thrive in today’s economy. We've heard about how application and data integration play a central role in business transformations. Since data has become a valuable commodity, integration plays a critical role in sharing data with applications in hybrid cloud environments and in populating data lakes for analytics. In these two podcasts you can learn how easy it is to seamlessly integrate data for these use cases.

Successful digital businesses rely on data warehouses for contextual information to identify customer intent and remain one step ahead of the competition. With growing data volumes, you need to easily acquire and prepare data in the right format for business intelligence and analysis. Listen to Integrating Data for Oracle and Autonomous Data Warehouse and learn how easy it is to move your data and keep it synchronized.

Data is also moving to hybrid cloud environments so you can use data on-premises and in the cloud, enabling your organization to be more agile and react quickly to changes. Moving data to the cloud is not just copying initial blocks of data; you need to move the data and keep it synchronized. Listen to Moving Data into the Cloud and learn how Oracle Data Integration makes this easier.

Learn more about Oracle’s Application Integration solution here. Learn more about Oracle’s Data Integration solution here. Dive into Oracle Cloud with a free trial available here.

Oracle Cloud Café Podcast Channel - check out the Oracle Cloud Café, where you can listen to conversations with Oracle Cloud customers, partners, thought leaders, and experts to get the latest information about cloud transformation and what the cloud means for your business.


Data Integration

Data Replication to AWS Kinesis Data Stream Using Oracle GoldenGate

Contributed by: Shrinidhi Kulkarni, Staff Solutions Engineer, Oracle

Use case: Replicate data trails present on an AWS AMI Linux instance into a Kinesis Data Stream (AWS Cloud) using Oracle GoldenGate for Big Data.

Architecture:
GoldenGate for Big Data: Oracle GoldenGate 12.3.2.1
AWS EC2 Instance: AMI Linux
Amazon Kinesis

Highlights:
How to configure GoldenGate for Big Data (12.3.2.1)
How to configure the GoldenGate Big Data target handlers
How to create an AWS Kinesis Data Stream

Connecting to your Linux instance from Windows using PuTTY

Please refer to the following link for instructions on how to connect to your instance using PuTTY, and on how to transfer files to your instance using WinSCP: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/putty.html

Download the GoldenGate for Big Data binaries, Java (JDK or JRE) version 1.8, and the Amazon Kinesis Java SDK

Download and install GoldenGate for Big Data 12.3.2.1.1 from: http://www.oracle.com/technetwork/middleware/goldengate/downloads/index.html

Oracle GoldenGate for Big Data is certified for Java 1.8. Before installing and running Oracle GoldenGate 12.3.2.1.1, you must install Java (JDK or JRE) version 1.8 or later. Either the Java Runtime Environment (JRE) or the full Java Development Kit (which includes the JRE) may be used.

The Oracle GoldenGate Kinesis Streams Handler uses the AWS Kinesis Java SDK to push data to Amazon Kinesis. The Kinesis Streams Handler was designed and tested with the AWS Kinesis Java SDK version 1.11.429, which is also used for creating streams and shards. See: https://docs.oracle.com/goldengate/bd123110/gg-bd/GADBD/using-kinesis-handler.htm#GADBD-GUID-3DE02CFE-8A38-4407-86DF-81437D0CC4E2

Create a Kinesis data stream (not included under the Free Tier) on your AWS account; follow this link for reference: https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html (an AWS CLI sketch covering this step and the user policy below appears at the end of this section).

It is strongly recommended that you do not use the AWS account root user or ec2-user for your everyday tasks, even the administrative ones. Create a new user with an access key and secret key for AWS; use the following link as a reference: https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html

Attach the following policies to the newly created user to allow access and Get/Put operations on the Kinesis data stream:

AWSLambdaKinesisExecutionRole - a predefined policy in AWS

You also need to attach the following inline policy as JSON:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "kinesis:*",
      "Resource": [
        "arn:aws:kinesis:<your-aws-region>:<aws-account-id>:stream/<kinesis-stream-name>"
      ]
    }
  ]
}

Unzip the GoldenGate for Big Data (12.3.2.1) zip file, then extract the GoldenGate 12.3.2.1.1 .tar file using the "tar -xvf" command. After the "tar -xvf" operation finishes, the Big Data target handlers are extracted. Review the directory structure (the extracted files) and go to the "AdapterExamples" directory to make sure the Kinesis Streams handler has been extracted. The Kinesis_Streams directory under big-data contains the Kinesis Replicat parameter file (kinesis.prm) and the Kinesis properties file (kinesis.props).
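If you would rather script the stream and user-policy steps described above instead of using the AWS console, here is a hedged AWS CLI sketch; the stream name, user name, and policy file name are placeholders.

    # Create the Kinesis data stream (name and shard count are placeholders)
    aws kinesis create-stream --stream-name ogg-demo-stream --shard-count 1

    # Attach the inline policy shown above (saved as kinesis-policy.json) to the new user
    aws iam put-user-policy \
        --user-name ogg-kinesis-user \
        --policy-name ogg-kinesis-access \
        --policy-document file://kinesis-policy.json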
Before you log in to the GoldenGate instance using GGSCI, set JAVA_HOME and LD_LIBRARY_PATH to the Java 1.8 directories; otherwise GGSCI reports an error. Export JAVA_HOME and LD_LIBRARY_PATH as shown below:

export JAVA_HOME=<path-to-your-Java-1.8>/jre1.8.0_181
export LD_LIBRARY_PATH=<path-to-your-Java-1.8>/lib/amd64/server:$JAVA_HOME/lib

Once you're done, log in to the GoldenGate instance using the ./ggsci command and issue the CREATE SUBDIRS command to create the GoldenGate-specific directories. Then configure the Manager parameter file and add an open port to it, for example:

edit param mgr
PORT 1080

Traverse back to the GoldenGate directory, execute ./ggsci, and add the Replicat in the GoldenGate instance using the following command:

add replicat kinesis, exttrail AdapterExamples/trail/tr

[NOTE: A demo trail is already present at the location AdapterExamples/trail/tr]

Copy the parameter file of the Replicat (mentioned above) to the ./dirprm directory of the GoldenGate instance, and copy the properties file (kinesis.props) to the dirprm folder after making the desired changes.

Replicat parameter file (kinesis.prm):

REPLICAT kinesis
-- Trail file for this example is located in the "AdapterExamples/trail" directory
-- Command to add REPLICAT
-- add replicat kinesis, exttrail AdapterExamples/trail/tr
TARGETDB LIBFILE libggjava.so SET property=dirprm/kinesis.props
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1
MAP QASOURCE.*, TARGET QASOURCE.*;

Kinesis properties file (kinesis.props):

gg.handlerlist=kinesis
gg.handler.kinesis.type=kinesis_streams
gg.handler.kinesis.mode=op
gg.handler.kinesis.format=json
gg.handler.kinesis.region=<your-aws-region>
#The following resolves the Kinesis stream name as the short table name
gg.handler.kinesis.streamMappingTemplate=<Kinesis-stream-name>
#The following resolves the Kinesis partition key as the concatenated primary keys
gg.handler.kinesis.partitionMappingTemplate=QASOURCE
#QASOURCE is the schema name used in the sample trail file
gg.handler.kinesis.deferFlushAtTxCommit=true
gg.handler.kinesis.deferFlushOpCount=1000
gg.handler.kinesis.formatPerOp=true
#gg.handler.kinesis.proxyServer=www-proxy-hqdc.us.oracle.com
#gg.handler.kinesis.proxyPort=80
goldengate.userexit.writers=javawriter
javawriter.stats.display=TRUE
javawriter.stats.full=TRUE
gg.log=log4j
gg.log.level=DEBUG
gg.report.time=30sec
gg.classpath=<path-to-your-aws-java-sdk>/aws-java-sdk-1.11.429/lib/*:<path-to-your-aws-java-sdk>/aws-java-sdk-1.11.429/third-party/lib/*

##Use one of the following two bootoptions lines:
##Configured with access id and secret key configured elsewhere
#javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar
##Configured with access id and secret key configured here
javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar -Daws.accessKeyId=<access-key-of-newly-created-user> -Daws.secretKey=<secret-key-of-newly-created-user>

Make sure you edit the classpath, accessKeyId, and secretKey (of the newly created user) correctly. After making all the necessary changes you can start the Kinesis Replicat, which replicates the trail data to the Kinesis data stream. Cross-check the Kinesis Replicat's status, RBA, and stats. Once you have the stats, you can view kinesis.log in the ./dirrpt directory, which gives information about the data sent to the Kinesis data stream and the operations performed. You can also monitor the data that has been pushed into the Kinesis data stream through AWS CloudWatch.
Amazon Kinesis Data Streams and Amazon CloudWatch are integrated so that you can collect, view, and analyze CloudWatch metrics for your Kinesis data streams. For example, to track shard usage, you can monitor the following metrics:

IncomingRecords: The number of records successfully put to the Kinesis stream over the specified time period.
IncomingBytes: The number of bytes successfully put to the Kinesis stream over the specified time period.
PutRecord.Bytes: The number of bytes put to the Kinesis stream using the PutRecord operation over the specified time period.
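As an example of pulling one of these metrics from the command line, the following AWS CLI sketch sums IncomingRecords for a placeholder stream name over a one-hour window; adjust the stream name and time range for your environment.

    aws cloudwatch get-metric-statistics \
        --namespace AWS/Kinesis \
        --metric-name IncomingRecords \
        --dimensions Name=StreamName,Value=ogg-demo-stream \
        --start-time 2019-06-01T00:00:00Z \
        --end-time 2019-06-01T01:00:00Z \
        --period 300 \
        --statistics Sum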


Data Integration

Data Integration Platform Cloud (DIPC) 18.4.3 is Now Available

Data Integration Platform Cloud (DIPC) 18.4.3 is now available! Do you know what DIPC is? If not, check out this short two-minute video!

Data Integration Platform Cloud (DIPC) is a re-imagination of how various best-of-breed data integration solutions can come together and work seamlessly, finding synergies in their features and elevating smaller piecemeal tasks and projects into a solution-based approach. For example, DIPC introduces the concepts of "elevated tasks" and "atomic tasks". The latter, atomic tasks, are equivalent to point tasks that are used to accomplish smaller data requirements and logic, while the former, elevated tasks, consist of end-goal-oriented groupings (for example, building a data lake, or prepping data) that bring together often-encountered technological requirements into simple and logical task groupings.

Let's explore some of the new features in DIPC 18.4.3:

A major enhancement in this release is the added support for Autonomous Data Warehouse (ADW), Oracle's easy-to-use, fully autonomous database that delivers fast query performance. You can now create a connection to ADW and harvest metadata that can be used in our elevated tasks.

In a recent blog article we explored the Data Lake Builder task. This task helps with data lake automation, enabling an intuitive instantiation and copy of data into a data lake, in an effort to help reduce some of the existing data engineer/data scientist friction. You can quickly create a comprehensive, end-to-end, repeatable data pipeline to your data lake. The Add Data to Data Lake task now supports Autonomous Data Warehouse as a target, and you can also ingest from Amazon S3. Additionally, task execution is supported through the remote agent.

The Replicate Data task includes advanced Kafka support with Avro and sub-types. The user experience has been enhanced to support many varied replication patterns in the future. You also have the option to encrypt data within the task.

The ODI Execution task adds support for Autonomous Data Warehouse (ADW) and Oracle Object Storage, empowering users to bulk load into ADW and run ETL/ELT workloads to transform their data.

You'll also find that DIPC scheduling is offered, allowing you to create scheduling policies to run jobs. Additionally, heterogeneous support has been expanded, with GoldenGate for SQL Server now available through the DIPC agent.

Learn more by checking out this documentation page for details on how to create and run tasks.


Oracle Open World 2018 - A Recap

The Annual Oracle Tech Bonanza

The first time I attended Oracle’s Open World, in 2013, was when I truly understood the scale of innovation and expertise that Oracle brings to its customers. Over the years I have only become more amazed at the breadth of technologies, success stories, and incredible innovation that Oracle and our customers combine to change the way businesses operate. This year was no different. Oracle Open World 2018 had some of the most relevant and thought-provoking ideas, cutting-edge product showcases, and real-world use cases on display. Here is a statistic that might shed light on the scale of the event: “This week Oracle OpenWorld 2018 has hosted more than 60,000 customers and partners from 175 countries and 19 million virtual attendees. Oracle OpenWorld is the world’s most innovative cloud technology conference and is slated to contribute $195 million in positive economic impact to the City of San Francisco in 2018.” For a quick overview of the entire conference, you can read the full press release here.

Key Notes

Starting off each day of the conference were keynotes that set the tone for the technology fest. Larry Ellison, Executive Chairman and Chief Technology Officer, kicked off the conference diving deep into two mainstays, among others, of Oracle’s focus: Oracle Fusion Cloud Applications and Oracle’s Autonomous Database services. With an increased focus on intelligent platforms and a rich cloud ecosystem, Oracle’s Cloud is a critical component that glues together the dizzying array of services and solutions that Oracle offers. Some of the other keynote speakers, in no particular order, included Mark Hurd, Chief Executive Officer, Steve Miranda, Executive Vice President, and Judith Sim, Chief Marketing Officer. In case you missed it, or want to revisit these sessions, you can watch them all here. My personal favorite was the session on The Role of Security and Privacy in a Globalized Society. Listening to the heads of high-performing teams discuss real-world problems using our technologies drives home the importance of our products outside the development labs.

Integration Deep Dives

Throughout the conference this year, the focus was on helping customers migrate to the cloud quickly and seamlessly. Artificial intelligence embedded in the technologies that Oracle delivers helps customers automate and innovate more quickly, more securely, and with the least disruption to their existing operations. Application integration and data integration, two areas that have consistently contributed to Oracle customers’ success in the move to the cloud, had their own set of sessions. Here is a list of the different sessions, topics, and labs that OOW18 hosted around integration. There were customer sessions, product roadmap sessions, thought leadership sessions, and demos to cover all the information that our current and prospective customers would need to make the best decision to partner with us on their journey to autonomous data warehousing and the cloud.

Innovation Awards

No Oracle Open World is complete without the signature red-carpet event, The Oracle Excellence Awards. This year the award winners included major companies and organizations such as American Red Cross, Deloitte, Gap, Hertz, National Grid, Sainsbury's, and Stitch Fix. While the winners no doubt showcase the best use of Oracle technologies, they represent only a small fraction of the innovative ways Oracle technologies are used in the real world. It is always a matter of pride to watch our customers describe, in their own words, the difference they in turn make to their end customers using Oracle technologies.

Even as the rumbles of this year’s Open World die down, we at Oracle’s Integration camp are gearing up for exciting new releases and features. Oracle Data Integration is taking on more and more capabilities, baking them into a seamlessly integrated platform. Existing customers will notice how rapidly the changes are coming, without needing to learn new skills, as a single data integration platform bridges various roles. New customers will be delighted by the easy-to-use packaging and pricing models. Oracle Application Integration, meanwhile, is bridging the requirements that arise from needing applications to be connected. With out-of-the-box connectors, ERP integrations, and features that seamlessly utilize artificial intelligence, Oracle’s Application Integration brings process automation and self-service integration to our customers. Here are just some of the commendations and accolades that Oracle Data Integration and Oracle Application Integration have received from analysts recently. This has been an exceptional Open World, once again reminding me of Oracle’s technologies, deep technical and business expertise, and customer commitment. I am already looking forward to the next OpenWorld.


GoldenGate Solutions and News

Release Announcement for Oracle GoldenGate 18.1

What’s New in Oracle GoldenGate 18.1

To succeed in today’s competitive environment, you need real-time information. This requires a platform that can unite information from disparate systems across your enterprise without compromising availability and performance. Oracle GoldenGate 18c is a high-performance software application for real-time transactional change data capture, transformation, and delivery, offering bidirectional data replication. The application enables you to ensure that your critical systems are operational 24/7, and the associated data is distributed across the enterprise to optimize decision-making.

GoldenGate 18.1 Platform Features

For the Oracle Database:
Oracle Database 18c Support - Capture and delivery support for Oracle Database 18c, cloud and on-premises
Autonomous Data Warehouse Cloud (ADWC) and Autonomous Transaction Processing (ATP) Support - Easily connect to ADWC and ATP to deliver transactions
Identity Column Support - Simplified support for handling identity columns in the Oracle Database
AutoCDR Improvements - Support for tables with unique keys (UK)
Composite Sharding - Support for multiple shardspaces of data using consistent partitioning
In-Database Row Archival Support - Support for compressed invisible rows

For MySQL:
MySQL Remote Capture Support - Capture MySQL DML transactions from a remote Linux hub. Use for remote capture against MySQL, Amazon RDS for MySQL, and Amazon Aurora MySQL Database running on Linux or Windows.

For DB2 z/OS:
DB2 12.1 Support, TIMESTAMP w/TIMEZONE, and Configurable Schema for Extract’s Stored Procedure

For DB2 LUW:
Cross Endian Support for Remote Capture and PureScale Support

For Teradata:
Teradata 16.20 Support for Delivery

Join us in upcoming events: Stay up to date by visiting our Data Integration Blog for up-to-date news and articles. Save the Date! Oracle OpenWorld is October 22nd through the 25th.


Data Integration

2018 Oracle OpenWorld Data Integration Sessions, Labs and Demos

With OpenWorld 2018 just days away, we can’t wait to welcome you to San Francisco. As you begin thinking of ways your company fits into a data-driven economy, you’ll need to think about how all your business data and cloud data can work together to provide meaningful insights and support better decisions. As our industry continues to roll out new technologies like AI and machine learning, you’ll want to think about how your data can work with machine learning systems to get insights from patterns. Learn from the experts how a unified data infrastructure helps you migrate data to a data warehouse, process IoT data with Apache Kafka, and manage the data lifecycle for greater transparency.

There are over 30 data integration sessions, labs, and demos that showcase Oracle’s data integration technologies. Make room in your schedule to learn from experts who have helped their organizations successfully transform in this digital age. We want to highlight a few sessions here, but there are plenty more that you should plan on attending. Scan the QR code or click on this link to explore all the sessions that may interest you.

Oracle’s Data Platform Roadmap: Oracle GoldenGate, Oracle Data Integrator, Governance [PRM4229]
Jeff Pollock, Vice President of Product, Oracle
Monday, Oct 22, 11:30 a.m. - 12:15 p.m. | Moscone West - Room 2002
This session explores the range of solutions in Oracle’s data platform. Get an overview and roadmap for each product, including Oracle Data Integrator, Oracle GoldenGate, Oracle Metadata Management, Oracle Enterprise Data Quality, and more. Learn how each solution plays a role in important cloud and big data trends, and discover a vision for data integration now and into the future.

Oracle’s Data Platform in the Cloud: The Foundation for Your Data [PRO4230]
Denis Gray, Senior Director - Data Integration Cloud, Oracle
Monday, Oct 22, 12:30 p.m. - 1:15 p.m. | Marriott Marquis (Golden Gate Level) - Golden Gate C3
The rapid adoption of enterprise cloud-based solutions brings with it a new set of challenges, but the age-old goal of maximizing value from data does not change. Oracle’s data platform ensures that your data solution is built from the ground up on a foundation of best-of-breed data integration, big data integration, data governance, data management, and data automation technologies. As customers move more of their enterprise applications to the cloud, they realize a cloud-based enterprise data platform is key to their success. Join this session led by Oracle product management to see how Oracle’s data platform cloud solutions can solve your data needs, and learn the product overview, roadmap, and vision, as well as customer use cases.

Oracle Data Integration Platform Cloud: The Foundation for Cloud Integration [THT6793]
Denis Gray, Senior Director - Data Integration Cloud, Oracle
Tuesday, Oct 23, 5:00 p.m. - 5:20 p.m. | The Exchange @ Moscone South - Theater 5
The rapid adoption of enterprise cloud-based solutions brings with it a new set of challenges. However, the age-old goal of maximizing value from data does not change. Powered by Oracle GoldenGate, Oracle Data Integrator, and Oracle Data Quality, Oracle Data Integration Platform Cloud ensures that your data solution is built from the ground up on a foundation of best-of-breed data integration, big data integration, data governance, data management, and data automation technologies. Join this mini theater presentation to see the power and simplicity of Oracle Data Integration Platform Cloud. See how it utilizes machine learning and artificial intelligence to simplify data mapping, data transformation, and overall data integration automation.

After all the sessions, join us and unwind at our Oracle Integration & Data Integration Customer Appreciation Event @OOW18, Thursday, October 25, 2018, 6pm-10pm, at the Barbarossa Lounge, 714 Montgomery Street, San Francisco, CA! Pass code needed to register: #OOW18. Registration link: https://www.eventbrite.com/e/oracle-integration-and-data-integration-customer-appreciation-event-oow2018-registration-51070929525

You can start exploring App Integration and Data Integration sessions in the linked pages. We are also sharing #OOW18 updates on Twitter: App Integration and Data Integration. Make sure to follow us for all the most up-to-date information before, during, and after OpenWorld!


Data Integration

Integration Podcast Series: #1 - The Critical Role Integration Plays in Digital Transformation

Authored by Madhu Nair, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if organizations are looking to thrive in today’s economy. With technologies churning out new features based on cutting-edge research in Artificial Intelligence (AI), Machine Learning (ML), and Natural Language Processing (NLP), business models need to change to adopt and adapt to these new offerings. In the first podcast of our “Integration: Heart of the Digital Economy” podcast series, we discuss, among other questions: What is digital transformation? What is the role of integration in digital transformation? What roles do application and data integration play in this transformation?

Businesses, small and big, are not able to convert every process into a risk-reducing act or a value-adding opportunity on their own. Integration plays a central role in the digital transformation of a business. Businesses and technologies run on data. Businesses also run applications and processes. Integration helps supercharge these critical components of a business. For example, cloud platforms now offer tremendous value with their Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) offerings. Adopting and moving to the cloud helps companies take advantage of the best technologies to run their businesses without having to worry about the costs of building and maintaining these sophisticated solutions.

A good data integration solution should allow you to harness the power of data, work with big and small data sets easily and cost-effectively, and make data available where it is needed. A good application integration solution should allow businesses to quickly and easily connect applications, orchestrate processes, and even monetize applications with the greatest efficiency and lowest risk. Piecemeal cobbling together of such critical elements of digital transformation would undermine the larger goal of efficiency that such a strategic initiative aims to achieve. Digital transformation positions businesses to better re-evaluate their existing business models, allowing organizations to focus on their core reason for existence.

Learn more about Oracle’s Data Integration Solution here. Learn more about Oracle’s Application Integration Solution here.

Oracle Cloud Café Podcast Channel: Be sure to check out the Oracle Cloud Café, where you can listen to conversations with Oracle Cloud customers, partners, thought leaders, and experts to get the latest information about cloud transformation and what the cloud means for your business.


Webcast: Data Integration Platform Cloud with Autonomous Capabilities - Building an Intelligent Data Integration Platform

Oracle Data Integration Platform Cloud, also referred to as DIPC, brings together years of expertise and vision into a single platform that delivers on the many requirements of a data integration solution. DIPC is ambitious in scope and rich in features, and it now includes platform features underpinned by artificial intelligence, machine learning, and natural language processing.

In this webcast, I was joined by Jeff Pollock, Vice President of Product Management for Data Integration Cloud products at Oracle, and Kalyan Villuri, Senior Database Manager at Veritas Technologies LLC. We cover quite a lot of ground, not just about the product, but about best practices for integrating data.

Watch this webcast if:
You are looking for a solution that can bring scalability and trust to your analytics solutions,
You are looking to adopt the Oracle Cloud and autonomous data warehousing,
You are considering, or are in the middle of, any big data projects,
You want to see a real-life example of how customers are using Oracle data integration.

DIPC unifies a number of Oracle’s flagship technologies under a modern and intuitively designed interface. For replication, change data capture, or real-time data streaming capabilities, DIPC relies on the expertise built and expanded in Oracle’s GoldenGate technology as its foundation. Oracle Data Integrator, Oracle’s flagship ETL and bulk data transformation engine, provides the robust capabilities that serve as the starting point for DIPC's ETL features. The Oracle Enterprise Data Quality and Oracle Stream Analytics engines provide data quality and data streaming capabilities within DIPC.

DIPC, however, is not just a repackaging of these mature products. It is a re-imagination of how various best-of-breed data integration solutions can come together and work seamlessly, finding synergies in their features and elevating smaller piecemeal tasks and projects into a solution-based approach. For example, DIPC introduces the concepts of “elevated tasks” and “atomic tasks”. The latter, atomic tasks, are equivalent to point tasks used to accomplish smaller data requirements and logic, while the former, elevated tasks, consist of end-goal-oriented groupings (e.g. building a data lake, or prepping data) that bring together often-encountered technological requirements into simple and logical task groupings.

We are excited to bring DIPC to market at a juncture where data integration is gaining more relevance to our customers as they engage in business transformations and other strategic initiatives. To learn more about DIPC, watch the webcast here.
