
Welcome to All Things Data Integration: Announcements, Insights, Best Practices, Tips & Tricks, and Trends.

Recent Posts

Data Integration

Get Going with Oracle Cloud Infrastructure Data Integration

We hope you have been finding the Oracle Cloud Infrastructure Data Integration blogs helpful as you get started with the service:

- Workspace in Oracle Cloud Infrastructure (OCI) Data Integration
- Understanding VCN Configuration for Oracle Cloud Infrastructure (OCI) Data Integration
- Data Asset in Oracle Cloud Infrastructure (OCI) Data Integration
- Project Setup in Oracle Cloud Infrastructure (OCI) Data Integration
- Data Flow Overview in Oracle Cloud Infrastructure (OCI) Data Integration
- Integration Tasks in Oracle Cloud Infrastructure (OCI) Data Integration

More are coming!

We also wanted to point you to some related exploratory Oracle Cloud Infrastructure Data Integration blogs from David Allan. Thanks, David!

- Oracle Cloud Infrastructure Data Integration and Python SDK: Explores the first Oracle Cloud Infrastructure Data Integration API in action; the example shows how to list the workspaces in a compartment (a minimal sketch follows below).
- Executing Tasks using Python SDK in Oracle Cloud Infrastructure Data Integration: Uses the Oracle Cloud Infrastructure Data Integration Python SDK to execute a task that has been published to an application.
- Oracle Cloud Infrastructure Data Integration and Fn: Shows how to use Functions when you want to focus on writing code to meet business needs; the example is the 'hello world' for Data Integration. This example will be extended in subsequent posts illustrating integration with other OCI services such as the Events Service and the Notification Service.
- Automate Loading Data to a Data Lake or Data Warehouse Using OCI Data Integration and Fn: Explains how multiple Oracle Cloud Infrastructure services work together to load data into a data lake or ADW, leveraging Oracle Cloud Infrastructure Data Integration, Fn, and the Events Service to automate the load.

Happy reading!
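If you want to try the first example from David's Python SDK post right away, here is a minimal, hedged sketch that lists workspaces in a compartment with the OCI Python SDK. It assumes a valid ~/.oci/config profile; the compartment OCID is a placeholder, and attribute names should be checked against the SDK's WorkspaceSummary reference.

```python
import oci

# Assumes a valid ~/.oci/config profile; the compartment OCID is a placeholder.
config = oci.config.from_file()
client = oci.data_integration.DataIntegrationClient(config)

compartment_id = "ocid1.compartment.oc1..example"
workspaces = client.list_workspaces(compartment_id=compartment_id).data
for ws in workspaces:
    # display_name / lifecycle_state follow the SDK's WorkspaceSummary model
    print(ws.display_name, ws.lifecycle_state)
```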


Oracle Cloud Infrastructure Data Integration

Understanding VCN Configuration for Oracle Cloud Infrastructure (OCI) Data Integration

Let's learn more about Oracle Cloud Infrastructure Data Integration. Today's blog will help you understand Virtual Cloud Network (VCN) configuration for Oracle Cloud Infrastructure Data Integration. Check out the previous Oracle Cloud Infrastructure Data Integration blog about Workspaces.

Overview of Virtual Cloud Network (VCN)

A virtual cloud network (VCN) is a customizable, private network in Oracle Cloud Infrastructure. Just like a traditional data center network, the VCN gives you complete control over your network environment: you can assign your own private IP address space, create subnets and route tables, and configure stateful firewalls. A VCN resides within a single region but can span multiple Availability Domains. Once users, groups, and compartments are created, you can start with VCN creation. A VCN can contain two kinds of subnets:

- Private subnet - instances receive only private IP addresses, assigned to each Virtual Network Interface Card (VNIC)
- Public subnet - contains both private and public IP addresses assigned to VNICs

For more on VCNs, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Concepts/overview.htm

Oracle Cloud Infrastructure Data Integration and Virtual Cloud Networks

Now coming to the main topic: understanding VCNs with Oracle Cloud Infrastructure Data Integration. Oracle Cloud Infrastructure Data Integration runs in the Oracle tenancy, which resides outside the user tenancy. For Data Integration to access resources in the user tenancy and retrieve VCN and subnet information, one of the following policies needs to be set at the tenancy level (that is, at the default root compartment) or at the compartment level:

allow service dataintegration to use virtual-network-family in tenancy
(or)
allow service dataintegration to use virtual-network-family in compartment <vcn_compartment>

Different Options when Creating a Workspace

While creating a workspace, two options are provided: Enable Private Network, or use the public network. Oracle Cloud Infrastructure Data Integration only supports regional subnets (subnets that span all Availability Domains in a region); regional subnets are used for high availability. When a workspace is created with "Enable Private Network", the Oracle Cloud Infrastructure Data Integration VCN is extended with the user-selected VCN. When the option is not selected, Oracle Cloud Infrastructure services like Object Storage are accessed through a Service Gateway defined at the tenancy level, and other resources, such as databases, are accessed through the public internet.

Let us consider multiple scenarios to understand Oracle Cloud Infrastructure Data Integration with a VCN, by selecting a private or public subnet and accessing its resources. Before testing these scenarios, the following prerequisites were created in the environment:

- A VCN named "VCN_DI_CONCEPTS" in the respective compartment.
- Four subnets within that VCN. Oracle Cloud Infrastructure Data Integration only supports regional subnets; for more information, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/managingVCNs.htm
- A set of resources, per subnet and region, used while testing.
- For Autonomous Data Warehouse (ADW) to be a private instance, a Network Security Group (NSG) needs to be defined.
In the NSG, two ingress rules were defined, for PUBLIC_SUBNET_DI (10.0.2.0/24) and PRIVATE_SUBNET_DI (10.0.1.0/24). For DB Systems in the private subnet, the corresponding rules were included in the route table. For the Service Gateway, select the option "All IAD Services in Oracle Services Network"; to understand more about this option, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/servicegateway.htm

Scenario 1 - Accessing ADW, Object Storage, and databases in the same region using a DI workspace in a private subnet

- The Oracle Cloud Infrastructure Data Integration workspace was created in PRIVATE_SUBNET_DI (10.0.1.0/24).
- A Service Gateway is used in PRIVATE_SUBNET_DI.

Scenario 2 - Accessing ADW and Object Storage in different regions, and accessing DB Systems residing in a public subnet

- To access ADW in a different region and DB Systems in a public subnet, a NAT Gateway is required.
- A Service Gateway is required for Object Storage, along with the NAT Gateway for cross-region traffic. (In the route rules, a NAT Gateway was added alongside the existing Service Gateway.)

Scenario 3 - Accessing ADW, Object Storage, and a database in the same region using a DI workspace in a public subnet

- The OCI DI workspace is in the public subnet "PUBLIC_SUBNET_DI" (10.0.2.0/24).
- If the Oracle Cloud Infrastructure Data Integration workspace is assigned to one VCN and needs to connect to resources residing in another VCN, in the same or a different region, then local or remote peering is required accordingly. To understand more about local and remote peering, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/VCNpeering.htm. If the resources have public IPs, a NAT Gateway or Service Gateway can be used instead.

Scenario 4 - ADW, Object Storage, and DB Systems residing in a public subnet, with all of these resources in a different tenancy, region, and VCN

- To test this scenario, we created the resources in the Mumbai region in a different tenancy. The Oracle Cloud Infrastructure Data Integration workspace is in a public subnet (10.0.2.0/24) in the Ashburn region.

Scenario 5 - Connecting to ADW, databases, and Object Storage using a DI workspace with "Enable Private Network" disabled

- If the "Enable Private Network" option is not selected when creating the workspace, the public connectivity option is in effect: Oracle Cloud Infrastructure Data Integration can access all public services using the Service Gateway and NAT Gateway in the Oracle tenancy. It cannot access private resources, since no VCN is assigned to the workspace. In this example, Oracle Cloud Infrastructure Data Integration is enabled in the Ashburn region.

Scenario 6 - Connecting Oracle Cloud Infrastructure Data Integration to an on-premises database

There are two methods by which Oracle Cloud Infrastructure Data Integration can connect to an on-premises database:

- IPSec VPN
- FastConnect

Below are the details of how Oracle Cloud Infrastructure Data Integration can access the database using FastConnect. To understand more about FastConnect, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Concepts/fastconnect.htm. The Oracle Cloud Infrastructure Data Integration workspace should be in the same subnet where FastConnect is configured.
In this example, a VCN named "####-iad.vcn" was created by Oracle as part of FastConnect, and a regional public subnet was created within it. A Dynamic Routing Gateway (DRG) is configured; the DRG is a virtual router that provides a path for private traffic (that is, traffic that uses private IPv4 addresses) between the user VCN and networks outside the VCN's region. For more information on DRGs, refer to https://docs.cloud.oracle.com/en-us/iaas/Content/Network/Tasks/managingDRGs.htm. The DRG can be configured with IPSec VPN or Oracle FastConnect; here, two virtual circuits were configured within the DRG using FastConnect. With the route rules defined in the VCN and the OCI DI workspace created in the subnet, you can then create and test the connection under Data Assets.

Summary - Scenario 1 and Scenario 2 behave the same irrespective of the subnet allocated to the workspace, since the secondary VNIC extended into the user's VCN/tenancy is always private. A workspace is either assigned to a public or private subnet, or assigned no network at all (with the "Enable Private Network" option disabled).

We just recently announced the general availability of Oracle Cloud Infrastructure Data Integration. With a series of upcoming blogs, we look forward to introducing various concepts. This concludes our blog on how to use a VCN with Oracle Cloud Infrastructure Data Integration. To learn more, check out some Oracle Cloud Infrastructure Data Integration Tutorials and the Oracle Cloud Infrastructure Data Integration Documentation.
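As a programmatic footnote to the service policy discussed earlier in this post, here is a minimal, hedged sketch that creates the policy with the OCI Python SDK instead of the Console. The policy name is hypothetical, the statement is taken verbatim from above, and field names should be verified against the SDK reference.

```python
import oci

# Assumes a valid ~/.oci/config; attaching at the tenancy (root compartment).
config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)

policy = identity.create_policy(
    oci.identity.models.CreatePolicyDetails(
        compartment_id=config["tenancy"],   # root compartment = tenancy OCID
        name="di-network-access",           # hypothetical policy name
        description="Allow OCI Data Integration to use VCNs in the tenancy",
        statements=[
            "allow service dataintegration to use virtual-network-family in tenancy"
        ],
    )
).data
print(policy.id, policy.lifecycle_state)
```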


Oracle Cloud Infrastructure Data Integration

Workspace in Oracle Cloud Infrastructure (OCI) Data Integration

Oracle Cloud Infrastructure Data Integration is a fully managed, serverless, cloud-native service that helps you with common extract, transform, and load (ETL) tasks, such as ingesting data from different sources; cleansing, transforming, and reshaping that data; and then efficiently loading it to a target system on Oracle Cloud Infrastructure. Before you get started, the administrator must satisfy connectivity requirements so that Oracle Cloud Infrastructure Data Integration can establish a connection to your data sources. The administrator then creates workspaces and gives you access to them. You use workspaces to stay organized and easily manage different data integration environments.

The workspace is the primary component of Oracle Cloud Infrastructure Data Integration. It acts as an environment where users can work on multiple projects, publish and run tasks, and define data assets. The administrator must define policies for users and groups before they can start with this data integration solution.

Creating and Editing a Workspace

Prerequisites:

- All the necessary compartments and VCNs have been created for Data Integration activities. To understand more about VCNs for Oracle Cloud Infrastructure Data Integration, refer to https://docs.cloud.oracle.com/en-us/iaas/data-integration/using/preparing-for-connectivity.htm
- Create a group for users in charge of workspaces, and then add users to the group.
- All the policies have been set up by the administrator so that users can access Oracle Cloud Infrastructure Data Integration. If the administrator wants to limit activities within the network, grant "inspect" permission for VCNs and subnets within the compartment instead of "manage".

Below is the list of policies required to access Oracle Cloud Infrastructure Data Integration:

- Give the group permission to manage Oracle Cloud Infrastructure Data Integration:
  allow group <group_name> to manage dis-workspaces in compartment <compartment_name>
- Give the group permission to manage network resources for workspaces:
  allow group <group_name> to manage virtual-network-family in compartment <compartment_name>
- Give the group permission to manage tag namespaces and tags for workspaces:
  allow group <group_name> to manage tag-namespaces in compartment <compartment_name>

Oracle Cloud Infrastructure Data Integration is located in the Oracle tenancy, which is outside the user tenancy. Data Integration sends a request to the user tenancy; in return, the user must give the requestor (Data Integration) permission to use the virtual networks set up for integration. Without a policy to accept this request, data integration fails. These policies can be defined at the compartment level or at the tenancy level (that is, at the root compartment):

allow service dataintegration to use virtual-network-family in tenancy
allow service dataintegration to inspect instances in tenancy

To create a workspace:

1. Select the Data Integration link from the main menu of the Oracle Cloud Infrastructure Console.
2. Select the corresponding compartment and click "Create Workspace".
3. Provide the necessary information: a name for the workspace, VCN details, and other information such as DNS settings and tags.
4. Click Create to create the workspace in the corresponding compartment. You're returned to the Workspaces page. It may be a few minutes before your workspace is ready for you to access. After it's created, you can select the workspace from the list.

You can see the status of a workspace creation or startup using View Status.
View Status is available while creating a workspace or starting one from Stopped status. A workspace can be accessed in two ways; after accessing it, new projects, data assets, or applications can be created through the workspace's main console.

You can edit workspace details, such as the name or description; you can't change the identifier, compartment, VCN, or subnet selections. To edit the tags applied to a workspace, select Add Tags from the workspace's Actions (three dots) menu. In the Console, you edit a workspace from the Workspaces page: select Edit from the workspace's Actions (three dots) menu, edit the fields you want to change, and then click Save Changes.

Terminating/Stopping a Workspace

Only workspaces in Active or Stopped status can be terminated. When you terminate a workspace, all the associated objects and resources are removed, including:

- Projects
- Folders
- Data Flows
- Tasks
- Applications
- Task Runs
- Data Assets

All executions within a workspace must be stopped before you can terminate the workspace. Any open tabs associated with the workspace you're terminating are closed upon termination. Once terminated, a workspace cannot be restored, so be sure to carefully review the workspace and its resources before you commit to a termination. To terminate a workspace, click the workspace's Actions (three dots) menu and then click Terminate.

We just recently announced the general availability of Oracle Cloud Infrastructure Data Integration. With a series of upcoming blogs, we look forward to introducing various concepts. This concludes our initial blog on how a workspace can be created and used in Oracle Cloud Infrastructure Data Integration. To learn more, check out some Oracle Cloud Infrastructure Data Integration Tutorials and the Oracle Cloud Infrastructure Data Integration Documentation.
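As an appendix for readers who prefer scripting over the Console steps above, here is a minimal, hedged sketch of workspace creation with the OCI Python SDK. All OCIDs and names are placeholders, the private-network fields mirror the "Enable Private Network" option described above, and the model's field names should be verified against the SDK reference.

```python
import oci

# Assumes a valid ~/.oci/config; all OCIDs are placeholders.
config = oci.config.from_file()
di = oci.data_integration.DataIntegrationClient(config)

details = oci.data_integration.models.CreateWorkspaceDetails(
    display_name="di-workspace-demo",                 # hypothetical name
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    is_private_network_enabled=True,                  # "Enable Private Network"
    vcn_id="ocid1.vcn.oc1..example",                  # placeholder OCID
    subnet_id="ocid1.subnet.oc1..example",            # regional subnet OCID
)

# Workspace creation is asynchronous; the response carries a work request ID.
response = di.create_workspace(details)
print(response.headers.get("opc-work-request-id"))
```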


Data Integration

Oracle Named 2019 Gartner Peer Insights Customers' Choice for Data Integration Tools

We are pleased to announce that Oracle has been recognized as a 2019 Gartner Peer Insights Customers' Choice for Data Integration Tools. This distinction is especially important to Oracle because it is based on direct feedback from our customers. Thank you all for your support!

Oracle Data Integration provides an enterprise-class, fully unified solution for building, deploying, and managing real-time data-centric architectures. It combines all the elements of data integration - real-time data movement, transformation, synchronization, data quality, data management, and data services - to ensure that information is timely, accurate, and consistent across complex systems. By using Oracle Data Integration, customers can realize significant cost savings and efficiency gains, which are critical in today's challenging global economic climate. They are delivering real-time, enriched, and trusted data from disparate cloud and on-premises sources to enable insightful analytics.

"We are honored to receive the Gartner Peer Insights Customers' Choice designation for the Data Integration Tools market. We thank our customers for their support," said Jeff Pollock, Vice President of Product Management at Oracle. "Over the past 20 years Oracle Data Integration has evolved into an industry leading platform used by thousands of companies across every industry. Working together with our customers, Oracle is committed to driving the innovation necessary to solve the industry's most challenging data integration issues."

Find out more! Gartner Peer Insights is an enterprise IT product and service review platform that hosts more than 300,000 verified customer reviews across 430 defined markets. In markets where there is enough data, Gartner Peer Insights recognizes up to seven vendors that are the most highly rated by their customers through the Gartner Peer Insights Customers' Choice distinction. According to Gartner, "The Gartner Peer Insights Customers' Choice is a recognition of vendors in this market by verified end-user professionals." To ensure fair evaluation, Gartner maintains rigorous criteria for recognizing vendors with a high customer satisfaction rate. We at Oracle are deeply proud to be honored as a 2019 Customers' Choice for the Data Integration Tools market. To learn more about this distinction, or to read the reviews written about our products by the IT professionals who use them, check out the Customers' Choice Data Integration Tools for Oracle landing page on Gartner Peer Insights.

Here are some excerpts of what Oracle customers are saying:

"Using GoldenGate, it is possible to carry out operations in high data volumes in a much faster and uninterrupted manner. It is also very easy to use. One of the most effective abilities is to manage transactional processing in complex and critical environments. It is very important that data, costs and ongoing transactions are regularly secured to bring the risk to near zero." - Software Engineer, Finance Industry

"ODI is a very good product. It is lightning fast (which really comes handy when we have to transform massive amount of data), It ability to support heterogeneous databases, big data, JMS, XML, and many other flavors." - Senior Manager, MIS & Middleware, Service Industry

A big thank you to our wonderful customers who submitted reviews, and to those customers who continue to use our products and services and help shape the future.
The GARTNER PEER INSIGHTS CUSTOMERS’ CHOICE badge is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved. Gartner Peer Insights Customers’ Choice constitute the subjective opinions of individual end-user reviews, ratings, and data applied against a documented methodology; they neither represent the views of, nor constitute an endorsement by, Gartner or its affiliates.


GoldenGate Solutions and News

Oracle GoldenGate Plug-in for Oracle Enterprise Manager v13.2.3.0.0 is now available

We have released GoldenGate OEM Plug-in 13.2.3.0.0. The release's primary focus was to support monitoring of Oracle GoldenGate 18.1 and 19.1 Microservices Architecture (MA) instances. In the earlier GoldenGate OEM Plug-in 13.2.2.0.0 release, we started supporting our first GoldenGate Microservices instance, for GoldenGate 12.3. In the new release, we have certified the latest GoldenGate releases, 18.1 and 19.1, in both Microservices and Classic architectures. Along with that certification, we now support new metrics for coordinated and parallel Replicats. We have added support for more services (Administration Service and Service Manager) and for Deployments in the plug-in: you can discover the new targets in the discovery module and promote the targets of your choice. Finally, we have certified OEM 13.3 in this release.

Once you discover targets, you can select the processes (Extract, Replicat, etc.) while promoting the targets. The selected processes (targets) and their parent processes are promoted automatically. For example, if you select an Extract process under the Admin Server, the OEM plug-in will promote the selected Extract process, the Admin Server (the Extract's parent), and the Service Manager (the Admin Server's parent). Similarly, if you select a parent process, all of its children are selected by default, and you may then choose to de-select particular children.

Once you promote the processes, you will notice changes in the Dashboard user interface for Microservices processes. All the processes are shown in a tree structure: the Service Manager is the parent process, which shows one or more GoldenGate Deployments, and all the Extracts and Replicats are part of the Admin Server. You can see the status of all services on the screen. Alongside a Microservices instance, you can monitor Classic instances on the same dashboard. We have given each process (target) a specific type name following GoldenGate terminology, which helps when you want to know what type of Extract or Replicat you are monitoring (Classic or Integrated Extract, Coordinated or Parallel Replicat).

When you click the Service Manager on the dashboard, it directs you to a page that shows all the Deployments and the details of their services, such as port and status. In the future, you should be able to search across a particular Deployment. The Admin Server page shows all the Extract and Replicat processes and their detailed metrics. When you click an individual Extract or Replicat, you can see the process's metrics, logs, and configuration.

For Parallel and Coordinated Replicats (PR/CR), you can see the accumulated metrics in the parent process. The child processes of the PR/CR are not visible on the screen; in the future, we will provide options to select child processes so that you can monitor them as well.

The GoldenGate OEM Plug-in infrastructure has been upgraded to be compatible with the newer version of Enterprise Manager (EM), 13.3.0.0.0. As mentioned earlier, we have certified GoldenGate 18.1 and 19.1 Classic and Microservices, and added a few metrics related to Parallel and Coordinated Replicats.

To recap the communication between the EM Agent and GoldenGate MA and Classic instances: you do not need to set up the GoldenGate jAgent (Monitor Agent) for the GoldenGate OEM Plug-in to communicate with GoldenGate Microservices instances. The GoldenGate MA architecture provides RESTful APIs to monitor and manage GoldenGate MA instances, and the GoldenGate OEM Plug-in uses these RESTful APIs to communicate with them. For GoldenGate Classic instances, you still need to set up GoldenGate jAgent 12.2.1.2.0 or later for communication. The latest Monitor Agent was released in May 2019 (12.2.1.2.190530).

You can get more details of the release from the documentation. We are working on more features around monitoring the GoldenGate Microservices and Classic architectures in future releases. Please stay tuned for further updates.
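As background on the Microservices REST interface that the plug-in relies on, here is a small, hedged sketch that queries a GoldenGate MA deployment directly with Python. The host, port, credentials, endpoint path, and response layout are all illustrative assumptions, not plug-in internals; check them against your MA deployment's REST documentation.

```python
import requests
import urllib3

# Lab setups often use self-signed certificates; verify properly in production.
urllib3.disable_warnings()

BASE = "https://gghost.example.com:9001"   # hypothetical Service Manager URL
AUTH = ("oggadmin", "password")            # placeholder credentials

# /services/v2/deployments is assumed here as the deployments-listing endpoint.
resp = requests.get(f"{BASE}/services/v2/deployments", auth=AUTH, verify=False)
resp.raise_for_status()

# The JSON envelope below is an assumption about the response shape.
for item in resp.json().get("response", {}).get("items", []):
    print(item.get("name"))
```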


Release Announcement for Oracle GoldenGate 19.1

This post was authored by Bobby Curtis, Director of Product Management, Oracle.

What's New in Oracle GoldenGate 19.1

To succeed in today's competitive environment, you need real-time information. This requires a platform that can unite information from disparate systems across your enterprise without compromising availability and performance. Oracle GoldenGate 19c is a high-performance software application for real-time transactional change data capture, transformation, and delivery, offering bidirectional data replication. The application enables you to ensure that your critical systems are operational 24/7, and that the associated data is distributed across the enterprise to optimize decision-making.

GoldenGate 19.1 Platform New Features

For the Oracle Database:
- Oracle Database 19c Support - capture and delivery support for Oracle Database 19c, cloud and on-premises.
- Centralized Key Management - use Oracle Key Vault to centralize and manage encryption keys for the replication environment.
- Target-Initiated Paths - distribution paths can be enabled from the Receiver Service to pull trail files.
- New REST API Endpoints - retrieve active transactions and current system change number (SCN) details using REST API endpoints.
- New Heartbeat Table Command - the UPGRADE HEARTBEATTABLE command upgrades the heartbeat table from prior versions of Oracle GoldenGate to the 19.1 version.
- Cross-Endian Support for Remote Integrated Extract - automatically enabled when the server where the Integrated Extract is running differs from the server where the Oracle Database is running.

For MySQL:
- MySQL 8.0 Support for Capture and Delivery - capture and delivery support for MySQL 8.0 has been added.
- MySQL SSL Connection Support - Extract and Replicat can now connect to a MySQL database via SSL.

For DB2 for i:
- Enhanced TIMESTAMP Support - supports all valid TIMESTAMP precisions.
- New Datatype Support - support for the DECFLOAT datatype.
- New DBOPTIONS USEDATABASEENCODING Parameter - allows Extract to store all text data in the trail file in native character encoding.
- Improved Extract Performance - enhanced throughput while reducing overall processing.
- Security Improvements - availability of AES encryption, credential store, and Oracle Wallet.
- Long Running Transaction (LRT) Support - support for the LRT features showtrans, skiptrans, and forcetrans.

For DB2 z/OS:
- Enhanced TIMESTAMP Support - supports all valid TIMESTAMP precisions.
- Online Schema Change Support - support for online TABLE CREATE, DROP and ADD, ALTER, DROP COLUMN commands.
- Long Running Transaction (LRT) Support - support for the LRT features showtrans, skiptrans, and forcetrans.

For DB2 LUW:
- Enhanced TIMESTAMP Support - supports all valid TIMESTAMP precisions.
- New Datatype Support - support for the DECFLOAT datatype.
- Long Running Transaction (LRT) Support - support for the LRT features showtrans, skiptrans, and forcetrans.

Other Information:
- In the initial release, OGG 19.1.0.0.0, Linux builds will be available for most supported database/OS combinations, followed by tiered releases for other supported platforms.
- GoldenGate for SQL Server will be released for both Windows and Linux soon, in a 19.1.x release.
Docs, Downloads, and Certification:
- Documentation is available at: https://docs.oracle.com/en/middleware/goldengate/core/19.1/index.html
- Downloads are available through OTN at: https://www.oracle.com/middleware/technologies/goldengate.html
- Certification Matrix (19.1 matrix to be posted soon): https://www.oracle.com/technetwork/middleware/ias/downloads/fusion-certification-100350.html

Join us in upcoming events:
- Stay up to date by visiting our Data Integration Blog for the latest news and articles.
- Save the date! Oracle OpenWorld is September 16th through 19th. Don't hesitate to contact us about any special topics that you might like to discuss.


GoldenGate Solutions and News

Zero Down Time (ZDT) Patching for Oracle GoldenGate

This document explains how to apply a patch or upgrade an OGG environment without taking any downtime. It assumes that OGG is already up and running, and that the user is already very familiar with how OGG works and with the actual upgrade process. As with any mission-critical, 24x7 environment, the expectation is that the user takes the necessary precautions to test this process prior to implementing it in production, and is aware of any differences between versions. All of these items are covered in other documents.

Terminology

- "New" - the new OGG installation. This "new" environment is where everything will be running once you have completed the procedure.
- "Old" - the old OGG installation. This "old" environment is the existing OGG installation that you want to upgrade. After the process is completed, you will remove this installation.

Patching OGG homes where there are Extracts running

1. Install the new OGG version into a new directory. This location will be referred to as the "new" OGG installation.
2. In the new installation:
   a. Apply any necessary patches to bring the release to the most recent bundle patch, then apply any required one-off patches on top of that.
   b. Create new Extract process(es) with different names than in the old OGG environment.
   c. Create a new set of trail files (with different names than in the old OGG installation).
   d. Copy the parameter files from the old installation into the new one. Modify them to account for new directories and names, and address any deprecated or modified parameters.
3. On the target, create a new Replicat to receive data from the new OGG installation.
4. In the new installation:
   a. Start the Extract.
   b. Start the Extract pump (if necessary).
5. In the old installation:
   a. Wait. How long? It depends. The new Extract started in step 4a will not process any transactions that were open when it was started, so you will want to wait until all transactions that were open at that time are closed. SEND EXTRACT … SHOWTRANS may help here.
   b. Stop the Extract.
6. On the target:
   a. If the old Replicat is not using a checkpoint table, add one for it.
   b. Once the Replicat from the old installation is at EOF (SEND REPLICAT … GETLAG), stop the old Replicat.
   c. Start the new Replicat using START REPLICAT … AFTERCSN [scn], where [scn] is the log_cmplt_csn column from the checkpoint table for the old Replicat. This tells the new Replicat to pick up right where the old one left off.
7. In the old installation:
   a. Stop the Extract pump (optional).
   b. Clean up the old installation and remove it.

Patching OGG homes where there are Replicats running

1. Install the new OGG version into a new directory. This location will be referred to as the "new" OGG installation.
2. In the new installation:
   a. Apply any necessary patches to bring the release to the most recent bundle patch, then apply any required one-off patches on top of that.
   b. Create new Replicat process(es) with different names than in the old OGG environment. The new Replicat will read from the existing trail files.
   c. Copy the parameter files from the old installation into the new one. Modify them to account for new directories and names, and address any deprecated or modified parameters.
3. In the old installation:
   a. If the old Replicat is not using a checkpoint table, add one for it.
   b. Stop the Replicat when it is at EOF (SEND REPLICAT … GETLAG).
4. In the new installation, start the new Replicat using START REPLICAT … AFTERCSN [scn], where [scn] is the log_cmplt_csn column from the checkpoint table for the old Replicat. This tells the new Replicat to pick up right where the old one left off. A consolidated sketch of this cutover follows below.
5. In the old installation, clean up the old Replicat and remove it.
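To make the Replicat cutover concrete, here is a consolidated sketch of the commands described above; the group names repold/repnew, the checkpoint table owner and name, and the CSN value are illustrative placeholders.

```
-- Old installation (GGSCI): confirm the old Replicat is at EOF, then stop it
SEND REPLICAT repold GETLAG
STOP REPLICAT repold

-- In the database: read the CSN where the old Replicat completed
-- (checkpoint table owner/name are placeholders)
-- SELECT log_cmplt_csn FROM ggadmin.checkpoints WHERE group_name = 'REPOLD';

-- New installation (GGSCI): start the new Replicat just after that CSN
START REPLICAT repnew AFTERCSN 123456789
```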


Demystifying Oracle Cloud Infrastructure

Oracle has a longstanding reputation for providing technologies that empower enterprises to solve demanding business problems. Oracle has built a cloud infrastructure platform that delivers unmatched reliability, scalability, and performance for mission-critical databases, applications, and workloads.

Oracle Cloud Infrastructure is the first cloud built specifically for the enterprise. With the latest high-end components, support for open standards and multi-cloud strategies, and an unwavering commitment to protecting sensitive business data, Oracle Cloud Infrastructure is perfectly suited to meet the needs, and exceed the expectations, of today's enterprise IT teams.

Oracle Cloud Infrastructure represents a fundamentally new public cloud architecture and serves as the foundational layer for Oracle Cloud. The infrastructure is designed to provide the performance predictability, core-to-edge security, and governance required for enterprise workloads. Oracle supports traditional, mission-critical, and performance-intensive workloads typically found in on-premises environments, including artificial intelligence (AI), machine learning (ML), and high-performance computing (HPC), as well as cloud-native applications. Oracle Cloud Infrastructure combines the benefits of public cloud (on-demand, self-service, scalability, pay-for-use) with the benefits usually associated with on-premises environments (governance, predictability, control) in a single offering.

Here is a good example: Alliance Data Saves $1 Million Annually Running Critical Applications on Oracle Cloud Infrastructure. Learn more about Oracle Cloud Infrastructure here.


Enabling Analytics with Oracle data integration and Oracle Analytics Cloud on Oracle Autonomous Database

Enabling global analytics is one of the most common use cases among customers who build and maintain a data store. In this post, we shall identify the critical components required for an end-to-end analytics solution, the characteristics of a great analytics solution, and how Oracle Analytics Cloud, Oracle data integration, and Oracle Autonomous Database combine to provide a platform for great analytics.

Any chosen analytics solution should bring together and balance the requirements of two major stakeholders: those in Information Technology (IT) departments and those in line-of-business functions.

Fig 1: IT and business dictate priorities that need to be balanced in an analytics solution

Achieving this balance between the scalability requirements of IT and the user-experience focus of a visual tool is critical to the success of any visualization solution.

Oracle Data Integration - The IT Component

Oracle data integration solutions help solve key requirements for a successful IT deployment of an analytics solution. Oracle data integration:

- Provides the latest data, both in real time and in bulk, from various sources, delivered into the data warehouse built on Oracle Autonomous Database to power analytics,
- Helps govern data and provides transparency into the data that underpins the analytics visualizations, enabling easy lineage and impact analysis and thus increasing trust in the data, and
- Enables true global analytics by making data available both on-premises and in the cloud.

Oracle Analytics Cloud - The Business Component

Oracle Analytics Cloud provides the features and benefits that satisfy the requirements of a business user. Oracle Analytics Cloud:

- Provides powerful data flows and enrichment features to enable sharable and traceable business data transformations,
- Avoids Excel clutter and empowers analysts to enhance data with no coding skills required, and
- Enables augmented data enrichment through machine-learning-driven enrichment and data transformations.

Oracle Autonomous Database - The Platform

Oracle Autonomous Database forms the third component of this analytics solution, along with Oracle data integration and Oracle Analytics Cloud. Oracle Autonomous Database provides a robust self-driving, self-securing, and self-repairing data store with autonomous data warehousing capabilities.

Watch this video to understand how these three components come together to provide end-to-end analytics on an Oracle platform.

Fig 2: Oracle data integration video

Oracle data integration, along with Oracle Autonomous Data Warehouse and Oracle Analytics, accelerates speed to insight and innovation while enabling fast access to a sophisticated set of analytics, and accelerates data preparation and enrichment. Watch this webcast to learn more about how to focus on growing your business and drive innovation with an end-to-end analytics solution.


Data Integration

Loading Data Into Oracle Autonomous Data Warehouse Cloud with Oracle data integration

Oracle offers the world's first autonomous database. Oracle also offers tools that help customers get data into the autonomous database. In this blog, we will go through what an Autonomous Database is and the capabilities Oracle data integration provides that help you adopt the Autonomous Data Warehouse Cloud Service (ADWCS).

What is an Autonomous Database?

An autonomous database is a cloud database that uses machine learning to eliminate the human labor associated with database tuning, security, backups, updates, and other routine management tasks traditionally performed by database administrators (DBAs). Autonomous database technology requires that enterprise databases be stored in the cloud, using a cloud service. Being autonomous in the cloud allows the organization to leverage cloud resources to more effectively deploy databases, manage database workloads, and secure the database. A database cloud service makes database capabilities available online, when and where those capabilities are needed. Watch Senior Vice President Juan Loaiza introduce the Oracle Autonomous Database for deeper insight into the technology.

What is Oracle Data Integration?

Oracle's data integration encompasses a portfolio of cloud-based and on-premises solutions and services that help with moving, enriching, and governing data. Oracle data integration has the following capabilities that make it the logical choice when looking to migrate and move data to Oracle Cloud. Oracle data integration:

- Has integrated APIs that allow easy access to Oracle's underlying tables, without affecting source-system performance, for real-time data access through change data capture,
- Can automate repeated data delivery into Oracle Data Warehouse Cloud Service by easily surfacing ADWCS as a target system, and
- Brings together real-time data replication, data streaming, bulk data movement, and data governance into a cohesive set of products that are seamlessly integrated for performance.

Watch this video to get a quick glimpse of our latest product and how it functions with Oracle Data Warehouse Cloud and Oracle Analytics Cloud.

Moving Data Into Oracle Data Warehouse Cloud Service

Oracle data integration solutions bring together some key technological and user benefits for customers:

- Managed by Oracle - engineered and built by teams that share a vision, the different solutions and technologies incorporate the best of scientific advances, as well as seamless integration between the solutions.
- Unified Data Integration - provides a single pane of glass to control the various components of data integration, such as bulk data movement, real-time data, data quality, and data governance.
- Simplified Complex Integration Tasks - groups together functions that build up to a business or technology pattern, so that often-repeated scenarios can be executed efficiently.
- Flexible Universal Credit Pricing - Oracle's pricing tracks usage and can be applied across technologies, giving customers access to all participating Oracle Cloud services, freeing customers from procurement woes, and providing a truly agile and nimble set of solutions.

Here are some scenarios that Oracle data integration helps to solve:
- Extraction & Transformation - execute bulk data movement, transformation, and load scenarios,
- Data Replication - change data capture helps replicate data into Oracle Autonomous Data Warehouse and Kafka, for data migration and high-availability architectures,
- Data Lake Builder - create a comprehensive, fully governed, repeatable data pipeline to your big data lakes,
- Data Preparation - ingest and harvest metadata for better data transparency and audits, and
- Synchronize Data - seamlessly keep two databases synchronized.

Fig 1: A sample architecture for moving data from source to analytics

For a deeper understanding of moving data into Oracle Autonomous Data Warehouse Cloud, watch the webcast below.


GoldenGate Solutions and News

Oracle GoldenGate for SQL Server supports SQL Server 2017 and Delivery to Microsoft Azure SQL Database

The Oracle GoldenGate Product Management team is pleased to announce that Oracle GoldenGate 12.3 for SQL Server has added new functionality to support capture and delivery from/to SQL Server 2017 Enterprise Edition, and has added certification for delivery to Microsoft's Azure SQL Database.

SQL Server 2017

Using patch release 12.3.0.1.181228 of Oracle GoldenGate for SQL Server (CDC Extract), which is available on support.oracle.com under Patches & Updates, customers now have the ability to both capture from and deliver to SQL Server 2017 Enterprise Edition.

Azure SQL Database

Also, using the same patch release as for SQL Server 2017 support, remote delivery to Azure SQL Database is now supported. You can install the Oracle GoldenGate patch on a supported Windows server (see the Certification Matrix) and configure a remote Replicat to deliver data to your Azure SQL Database.

Documentation

For more information, please review the Oracle GoldenGate documentation as well as a quick-start tutorial, which is available here: https://apexapps.oracle.com/pls/apex/f?p=44785:24:111923811479624::NO:24:P24_CONTENT_ID,P24_PREV_PAGE:21869,1


Data Integration

Integration: Heart of the Digital Economy Podcast Series – Moving Data to the Cloud and Autonomous Data Warehouse

Authored by Steve Quan, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if you want to thrive in today's economy. We've heard about how application and data integration play a central role in business transformations. Since data has become a valuable commodity, integration plays a critical role in sharing data with applications in hybrid cloud environments and in populating data lakes for analytics. In these two podcasts you can learn how easy it is to seamlessly integrate data for these use cases.

Successful digital businesses rely on data warehouses for contextual information to identify customer intent and remain one step ahead of the competition. With growing data volumes, you need to easily acquire and prepare data in the right format for business intelligence and analysis. Listen to Integrating Data for Oracle and Autonomous Data Warehouse and learn how easy it is to move your data and keep it synchronized.

Data is also moving to hybrid cloud environments so you can use data on-premises and in the cloud, enabling your organization to be more agile and react quickly to changes. Moving data to the cloud is not just copying initial blocks of data; you need to move the data and keep it synchronized. Listen to Moving Data into the Cloud and learn how Oracle Data Integration makes this easier.

Learn more about Oracle's Application Integration solution here. Learn more about Oracle's Data Integration solution here. Dive into Oracle Cloud with a free trial available here.

Oracle Cloud Café Podcast Channel - check out the Oracle Cloud Café, where you can listen to conversations with Oracle Cloud customers, partners, thought leaders, and experts to get the latest information about cloud transformation and what the cloud means for your business.


Data Integration

Data Replication to AWS Kinesis Data Stream Using Oracle GoldenGate

Contributed by: Shrinidhi Kulkarni, Staff Solutions Engineer, Oracle

Use case: Replication of data trails present on an AWS AMI Linux instance into a Kinesis Data Stream (AWS Cloud) using Oracle GoldenGate for Big Data.

Architecture:
- GoldenGate for Big Data: Oracle GoldenGate 12.3.2.1
- AWS EC2 Instance: AMI Linux
- Amazon Kinesis

Highlights:
- How to configure GoldenGate for Big Data (12.3.2.1)
- How to configure GoldenGate for Big Data target handlers
- How to create an AWS Kinesis Data Stream

Connecting to your Linux instance from Windows using PuTTY

Please refer to the following link and the instructions in it, which explain how to connect to your instance using PuTTY, and also how to transfer files to your instance using WinSCP: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/putty.html

Download the GoldenGate for Big Data binaries, Java (JDK or JRE) version 1.8, and the Amazon Kinesis Java SDK

Download and install GoldenGate for Big Data 12.3.2.1.1; here is the link: http://www.oracle.com/technetwork/middleware/goldengate/downloads/index.html

Oracle GoldenGate for Big Data is certified for Java 1.8. Before installing and running Oracle GoldenGate 12.3.2.1.1, you must install Java (JDK or JRE) version 1.8 or later. Either the Java Runtime Environment (JRE) or the full Java Development Kit (which includes the JRE) may be used.

The Oracle GoldenGate Kinesis Streams Handler uses the AWS Kinesis Java SDK to push data to Amazon Kinesis. The Kinesis Streams Handler was designed and tested with the latest AWS Kinesis Java SDK, version 1.11.429, and is used for creating streams/shards. See: https://docs.oracle.com/goldengate/bd123110/gg-bd/GADBD/using-kinesis-handler.htm#GADBD-GUID-3DE02CFE-8A38-4407-86DF-81437D0CC4E2

Create a Kinesis data stream (not included under the free tier) on your AWS instance, following this link for reference (a minimal boto3 sketch of this step follows at the end of this section): https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html

It is strongly recommended that you not use the AWS account root user or ec2-user for your everyday tasks, even the administrative ones. Create a new user with an access key and secret key for AWS, using the following link as reference: https://docs.aws.amazon.com/general/latest/gr/managing-aws-access-keys.html

Attach the following policies to the newly created user to allow access and Get/Put operations on the Kinesis data stream:

- AWSLambdaKinesisExecutionRole - a predefined policy in AWS
- The following inline policy, as JSON:

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": "kinesis:*",
        "Resource": [
          "arn:aws:kinesis:<your-aws-region>:<aws-account-id>:stream/<kinesis-stream-name>"
        ]
      }
    ]
  }

Unzip the GoldenGate for Big Data (12.3.2.1) zip file. After you unzip the downloaded GoldenGate for Big Data binary, extract the GoldenGate 12.3.2.1.1 .tar file using the "tar -xvf" command. After the "tar -xvf" operation finishes, the Big Data target handlers are extracted. Review the directory structure of the extracted files and then go to the "AdapterExamples" directory to make sure the Kinesis Streams handler is extracted. The kinesis_streams directory under big-data contains the Kinesis Replicat parameter file (kinesis.prm) and the Kinesis properties file (kinesis.props).
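As a hedged illustration of the create-stream step above, here is a small boto3 sketch. The stream name, region, and shard count are placeholders; credentials are assumed to come from your AWS config/credentials files or environment variables for the newly created IAM user.

```python
import boto3

# All values are placeholders; adjust region and stream name for your setup.
kinesis = boto3.client("kinesis", region_name="us-east-1")

# Create a stream with a single shard (shard count drives throughput and cost).
kinesis.create_stream(StreamName="ogg-demo-stream", ShardCount=1)

# Wait until the stream becomes ACTIVE before pointing GoldenGate at it.
waiter = kinesis.get_waiter("stream_exists")
waiter.wait(StreamName="ogg-demo-stream")

summary = kinesis.describe_stream_summary(StreamName="ogg-demo-stream")
print(summary["StreamDescriptionSummary"]["StreamStatus"])
```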
Before you log into the GoldenGate instance using GGSCI, set JAVA_HOME and LD_LIBRARY_PATH to the Java 1.8 directory; otherwise GGSCI will report an error. Export JAVA_HOME and LD_LIBRARY_PATH as shown below:

  export JAVA_HOME=<path-to-your-Java-1.8>/jre1.8.0_181
  export LD_LIBRARY_PATH=<path-to-your-Java-1.8>/lib/amd64/server:$JAVA_HOME/lib

Once you're done, log into the GoldenGate instance using the ./ggsci command and issue the CREATE SUBDIRS command to create the GoldenGate-specific directories. Configure the manager parameter file and add an open port to it, for example:

  edit param mgr
  PORT 1080

Traverse back to the GoldenGate directory, execute ./ggsci, and add the Replicat in the GoldenGate instance using the following command:

  add replicat kinesis, exttrail AdapterExamples/trail/tr

[NOTE: A demo trail is already present at the location AdapterExamples/trail/tr.]

Copy the parameter file of the Replicat (mentioned above) to the ./dirprm directory of the GoldenGate instance, and copy the properties file (kinesis.props) to the dirprm folder after making the desired changes.

Replicat parameter file (kinesis.prm):

  REPLICAT kinesis
  -- Trail file for this example is located in the "AdapterExamples/trail" directory
  -- Command to add REPLICAT
  -- add replicat kinesis, exttrail AdapterExamples/trail/tr
  TARGETDB LIBFILE libggjava.so SET property=dirprm/kinesis.props
  REPORTCOUNT EVERY 1 MINUTES, RATE
  GROUPTRANSOPS 1
  MAP QASOURCE.*, TARGET QASOURCE.*;

Kinesis properties file (kinesis.props), with one of the two javawriter.bootoptions variants commented out so that only one is active:

  gg.handlerlist=kinesis
  gg.handler.kinesis.type=kinesis_streams
  gg.handler.kinesis.mode=op
  gg.handler.kinesis.format=json
  gg.handler.kinesis.region=<your-aws-region>
  #The following resolves the Kinesis stream name as the short table name
  gg.handler.kinesis.streamMappingTemplate=<kinesis-stream-name>
  #The following resolves the Kinesis partition key as the concatenated primary keys
  gg.handler.kinesis.partitionMappingTemplate=QASOURCE
  #QASOURCE is the schema name used in the sample trail file
  gg.handler.kinesis.deferFlushAtTxCommit=true
  gg.handler.kinesis.deferFlushOpCount=1000
  gg.handler.kinesis.formatPerOp=true
  #gg.handler.kinesis.proxyServer=www-proxy-hqdc.us.oracle.com
  #gg.handler.kinesis.proxyPort=80
  goldengate.userexit.writers=javawriter
  javawriter.stats.display=TRUE
  javawriter.stats.full=TRUE
  gg.log=log4j
  gg.log.level=DEBUG
  gg.report.time=30sec
  gg.classpath=<path-to-your-aws-java-sdk>/aws-java-sdk-1.11.429/lib/*:<path-to-your-aws-java-sdk>/aws-java-sdk-1.11.429/third-party/lib/*

  ##Variant with access id and secret key configured elsewhere
  #javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar

  ##Variant with access id and secret key configured here
  javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar -Daws.accessKeyId=<access-key-of-newly-created-user> -Daws.secretKey=<secret-key-of-newly-created-user>

Make sure you edit the classpath, accessKeyId, and secretKey (of the newly created user) correctly. After making all the necessary changes, you can start the Kinesis Replicat, which will replicate the trail data to the Kinesis data stream. Cross-check the Kinesis Replicat's status, RBA, and stats. Once you have the stats, you can view kinesis.log in the ./dirrpt directory, which gives information about the data sent to the Kinesis data stream and the operations performed. You can also monitor the data that has been pushed into the Kinesis data stream through AWS CloudWatch.
Amazon Kinesis Data Streams and Amazon CloudWatch are integrated so that you can collect, view, and analyze CloudWatch metrics for your Kinesis data streams. For example, to track shard usage, you can monitor the following metrics:

- IncomingRecords: the number of records successfully put to the Kinesis stream over the specified time period.
- IncomingBytes: the number of bytes successfully put to the Kinesis stream over the specified time period.
- PutRecord.Bytes: the number of bytes put to the Kinesis stream using the PutRecord operation over the specified time period.
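To pull one of these metrics programmatically, here is a hedged boto3 sketch; the region, stream name, and time window are placeholders.

```python
import boto3
from datetime import datetime, timedelta, timezone

# Placeholders: adjust region, stream name, and time window for your setup.
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

now = datetime.now(timezone.utc)
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/Kinesis",
    MetricName="IncomingRecords",
    Dimensions=[{"Name": "StreamName", "Value": "ogg-demo-stream"}],
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,            # 5-minute buckets
    Statistics=["Sum"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Sum"])
```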


Data Integration

Data Integration Platform Cloud (DIPC) 18.4.3 is Now Available

Data Integration Platform Cloud (DIPC) 18.4.3 is now available! Do you know what DIPC is? If not, check out this short two-minute video!

Data Integration Platform Cloud (DIPC) is a re-imagination of how various best-of-breed data integration solutions can come together and work seamlessly, finding synergies in their features and elevating smaller piecemeal tasks and projects into a solution-based approach. For example, DIPC introduces the concepts of "elevated tasks" and "atomic tasks". The latter, atomic tasks, are equivalent to point tasks used to accomplish smaller data requirements and logic, while the former, elevated tasks, consist of end-goal-oriented groupings (e.g., building a data lake, or prepping data) that bring together often-encountered technological requirements into simple and logical task groupings.

Let's explore some of the new features in DIPC 18.4.3:

- A major enhancement in this release is added support for Autonomous Data Warehouse (ADW), Oracle's easy-to-use, fully autonomous database that delivers fast query performance. You can now create a connection to ADW and harvest metadata that can be used in our elevated tasks.
- In a recent blog article we explored the Data Lake Builder task. This task helps with data lake automation, enabling an intuitive instantiation and copy of data into a data lake, helping reduce some of the existing data engineer/data scientist friction. You can quickly create a comprehensive, end-to-end, repeatable data pipeline to your data lake. The Add Data to Data Lake task now supports Autonomous Data Warehouse as a target, and you can also ingest from Amazon S3. Additionally, task execution is supported through the remote agent.
- The Replicate Data task includes advanced Kafka support with Avro and sub-types. The user experience has been enhanced to support many varied replication patterns in the future. You also have the option to encrypt data within the task.
- The ODI Execution task adds support for Autonomous Data Warehouse (ADW) and Oracle Object Storage, empowering users to bulk load into ADW and run ETL/ELT workloads to transform their data.
- DIPC scheduling is now offered, allowing you to create scheduling policies to run jobs. Additionally, heterogeneous support has been expanded, with GoldenGate for SQL Server now available through the DIPC agent.

Learn more by checking out this documentation page for details on how to create and run tasks.


Oracle Open World 2018 - A Recap

The Annual Oracle Tech Bonanza

The first time I attended Oracle’s Open World, in 2013, was when I truly understood the scale of innovation and expertise that Oracle brings to its customers. Over the years, I have only been more amazed at the breadth of technologies, success stories, and incredible innovation that Oracle and our customers combine to bring change to the way businesses operate. This year was no different. Oracle Open World 2018 had some of the most relevant and thought-provoking ideas, cutting-edge product showcases, and real-world use cases on display. Here is a statistic that might shed light on the scale of the event: “This week Oracle OpenWorld 2018 has hosted more than 60,000 customers and partners from 175 countries and 19 million virtual attendees. Oracle OpenWorld is the world’s most innovative cloud technology conference and is slated to contribute $195 million in positive economic impact to the City of San Francisco in 2018.” For a quick overview of the entire conference, you can read the full press release here.

Keynotes

Starting off each day of the conference were keynotes that set the tone for the technology fest. Larry Ellison, Executive Chairman and Chief Technology Officer, kicked off the conference diving deep into two mainstays, among others, of Oracle’s focus: Oracle Fusion Cloud Applications and Oracle’s Autonomous Database services. With an increased focus on intelligent platforms and a rich cloud ecosystem, Oracle’s Cloud is a critical component that glues together the dizzying array of services and solutions that Oracle offers. Some of the other keynote speakers, in no particular order, included Mark Hurd, Chief Executive Officer, Steve Miranda, Executive Vice President, and Judith Sim, Chief Marketing Officer. In case you missed it, or want to revisit these sessions, you can watch them all here. My personal favorite was the session on The Role of Security and Privacy in a Globalized Society. Listening to the heads of highly performant teams discuss real-world problems using our technologies drives home the importance of our products outside the development labs.

Integration Deep Dives

Throughout the conference this year, the focus was on helping customers migrate to the cloud quickly and seamlessly. Artificial Intelligence embedded in the technologies that Oracle delivers helps customers automate and innovate more quickly, more securely, and with the least disruption to their existing operations. Application Integration and Data Integration, two areas that have consistently contributed to the growth of Oracle customers’ success in the move to the cloud, had their own set of sessions. Here is a list of the different sessions, topics, and labs that OOW18 hosted around integration. There were customer sessions, product roadmap sessions, thought leadership sessions, and demos to provide all the information that our current and prospective customers would need to make the best decision to partner with us on their journey to autonomous data warehousing and the cloud.

Innovation Awards

No Oracle Open World is complete without the signature red-carpet event, The Oracle Excellence Awards. This year the award winners included major companies and organizations such as American Red Cross, Deloitte, Gap, Hertz, National Grid, Sainsbury's, and Stitch Fix.
While the winners no doubt showcase the best use of Oracle technologies, they represent only a small fraction of the innovative uses of Oracle technologies in the real world. It is always a matter of pride to watch our customers describe, in their own words, the difference they make in turn to their end customers using Oracle technologies.

Even as the rumbles of this year’s Open World die down, we at Oracle’s Integration camp are gearing up for exciting new releases and features. Oracle Data Integration is taking on more and more capabilities, baking them into a seamlessly integrated platform. Existing customers will notice how rapidly the changes are coming, without needing to learn new skills, as a single data integration platform bridges various roles. New customers will be delighted by the easy-to-use packaging and pricing models. Oracle Application Integration, meanwhile, is addressing the requirements that arise from needing applications to be connected. With out-of-the-box connectors, ERP integrations, and features that seamlessly utilize artificial intelligence, Oracle’s Application Integration brings process automation and self-service integration to our customers. Here are just some of the commendations and accolades that Oracle Data Integration and Oracle Application Integration received from analysts recently. This has been an exceptional Open World, once again reminding me of Oracle’s technologies, deep technical and business expertise, and customer commitment. I am already looking forward to the next OpenWorld.


GoldenGate Solutions and News

Release Announcement for Oracle GoldenGate 18.1

What’s New in Oracle GoldenGate 18.1

To succeed in today’s competitive environment, you need real-time information. This requires a platform that can unite information from disparate systems across your enterprise without compromising availability and performance. Oracle GoldenGate 18c is a high-performance software application for real-time transactional change data capture, transformation, and delivery, offering bidirectional data replication. The application enables you to ensure that your critical systems are operational 24/7, and that the associated data is distributed across the enterprise to optimize decision-making.

GoldenGate 18.1 Platform Features

For the Oracle Database:
Oracle Database 18c Support - Capture and delivery support for Oracle Database 18c, cloud and on-premises
Autonomous Data Warehouse Cloud (ADWC) and Autonomous Transaction Processing (ATP) Support - Easily connect to ADWC and ATP to deliver transactions
Identity Column Support - Simplified support for handling identity columns in the Oracle Database
AutoCDR Improvements - Support for tables with unique keys (UK)
Composite Sharding - Support for multiple shardspaces of data using consistent partitioning
In-Database Row Archival Support - Support for compressed invisible rows

For MySQL:
MySQL Remote Capture Support - Capture MySQL DML transactions from a remote Linux hub. Use for remote capture against MySQL, Amazon RDS for MySQL, and Amazon Aurora MySQL Database running on Linux or Windows.

For DB2 z/OS:
DB2 12.1 Support - TIMESTAMP with TIMEZONE and configurable schema for Extract’s stored procedure

For DB2 LUW:
Cross-endian support for remote capture and PureScale support

For Teradata:
Teradata 16.20 support for delivery

Join us in upcoming events: Stay up to date by visiting our Data Integration Blog for news and articles. Save the date! Oracle OpenWorld is October 22nd through the 25th.


Data Integration

2018 Oracle OpenWorld Data Integration Sessions, Labs and Demos

With OpenWorld 2018 just days away, we can’t wait to welcome you to San Francisco. As you begin thinking of ways your company fits into a data-driven economy, you’ll need to think about how all your business data and cloud data can work together to provide meaningful insights and enable better decisions. As our industry continues to roll out new technologies like AI and machine learning, you’ll want to think about how your data can work with machine learning systems to get insights from patterns. Learn from the experts how a unified data infrastructure helps you migrate data to a data warehouse, process IoT data with Apache Kafka, and manage the data lifecycle for greater transparency. There are over 30 data integration sessions, labs, and demos that showcase Oracle’s data integration technologies. Make room in your schedule to learn from experts who have helped their organizations successfully transform in this digital age. We want to highlight a few sessions here, but there are plenty more that you should plan on attending. Scan the QR code or click on this link to explore all the sessions that may interest you.

Oracle’s Data Platform Roadmap: Oracle GoldenGate, Oracle Data Integrator, Governance [PRM4229]
Jeff Pollock, Vice President of Product, Oracle
Monday, Oct 22, 11:30 a.m. - 12:15 p.m. | Moscone West - Room 2002
This session explores the range of solutions in Oracle’s data platform. Get an overview and roadmap for each product, including Oracle Data Integrator, Oracle GoldenGate, Oracle Metadata Management, Oracle Enterprise Data Quality, and more. Learn how each solution plays a role in important cloud and big data trends, and discover a vision for data integration now and into the future.

Oracle’s Data Platform in the Cloud: The Foundation for Your Data [PRO4230]
Denis Gray, Senior Director - Data Integration Cloud, Oracle
Monday, Oct 22, 12:30 p.m. - 1:15 p.m. | Marriott Marquis (Golden Gate Level) - Golden Gate C3
The rapid adoption of enterprise cloud-based solutions brings with it a new set of challenges, but the age-old goal of maximizing value from data does not change. Oracle’s data platform ensures that your data solution is built from the ground up on a foundation of best-of-breed data integration, big data integration, data governance, data management, and data automation technologies. As customers move more of their enterprise applications to the cloud, they realize a cloud-based enterprise data platform is key to their success. Join this session led by Oracle product management to see how Oracle’s data platform cloud solutions can solve your data needs, and learn the product overview, roadmap, and vision, as well as customer use cases.

Oracle Data Integration Platform Cloud: The Foundation for Cloud Integration [THT6793]
Denis Gray, Senior Director - Data Integration Cloud, Oracle
Tuesday, Oct 23, 5:00 p.m. - 5:20 p.m. | The Exchange @ Moscone South - Theater 5
The rapid adoption of enterprise-cloud-based solutions brings with it a new set of challenges. However, the age-old goal of maximizing value from data does not change. Powered by Oracle GoldenGate, Oracle Data Integrator, and Oracle Data Quality, Oracle Data Integration Platform Cloud ensures that your data solution is built from the ground up on a foundation of best-of-breed data integration, big data integration, data governance, data management, and data automation technologies.
Join this mini theater presentation to see the power and simplicity of Oracle Data Integration Platform Cloud. See how it utilizes machine learning and artificial intelligence to simplify data mapping, data transformation, and overall data integration automation.

After all the sessions, JOIN US and unwind at our Oracle Integration & Data Integration Customer Appreciation Event @OOW18, Thursday, October 25, 2018, 6pm-10pm, at the Barbarossa Lounge, 714 Montgomery Street, San Francisco, CA! Passcode needed to register: #OOW18. Registration link: https://www.eventbrite.com/e/oracle-integration-and-data-integration-customer-appreciation-event-oow2018-registration-51070929525

You can start exploring App Integration and Data Integration sessions in the linked pages. We are also sharing #OOW18 updates on Twitter: App Integration and Data Integration. Make sure to follow us for all the most up-to-date information before, during, and after OpenWorld!


Data Integration

Integration Podcast Series: #1 - The Critical Role Integration Plays in Digital Transformation

Authored by Madhu Nair, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if organizations are looking to thrive in today’s economy. With technologies churning out new features based on cutting-edge research, like those based on Artificial Intelligence (AI), Machine Learning (ML), and Natural Language Processing (NLP), business models need to change to adopt and adapt to these new offerings. In the first podcast of our “Integration: Heart of the Digital Economy” podcast series, we discuss, among other questions: What is digital transformation? What is the role of integration in digital transformation? What roles do Application and Data Integration play in this transformation? Businesses, small and big, are not able to convert every process into a risk-reducing act or a value-adding opportunity on their own. Integration plays a central role in the digital transformation of a business. Businesses and technologies run on data. Businesses also run applications and processes. Integration helps supercharge these critical components of a business. For example, cloud platforms now offer tremendous value with their Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) offerings. Adopting and moving to the cloud helps companies take advantage of the best technologies to run their businesses on, without having to worry about the costs of building and maintaining these sophisticated solutions. A good data integration solution should allow you to harness the power of data, work with big and small data sets easily and cost-effectively, and make data available wherever it is needed. A good application integration solution should allow businesses to quickly and easily connect applications, orchestrate processes, and even monetize applications with the greatest efficiency and lowest risk. Piecemeal cobbling together of such critical elements of digital transformation would undermine the larger goal of efficiency that such a strategic initiative aims to achieve. Digital transformation positions businesses to better re-evaluate their existing business models, allowing organizations to focus on their core reason for existence. Learn more about Oracle’s Data Integration Solution here. Learn more about Oracle’s Application Integration Solution here.

Oracle Cloud Café Podcast Channel

Be sure to check out the Oracle Cloud Café, where you can listen to conversations with Oracle Cloud customers, partners, thought leaders, and experts to get the latest information about cloud transformation and what the cloud means for your business.


Webcast: Data Integration Platform Cloud with Autonomous Capabilities - Building an Intelligent Data Integration Platform

Oracle Data Integration Platform Cloud, also referred to as DIPC, brings together years of expertise and vision into a single platform that delivers on the many requirements of a data integration solution. DIPC is ambitious in scope and rich in features. DIPC now includes within its platform features underpinned by artificial intelligence, machine learning, and natural language processing. In this webcast, I was joined by Jeff Pollock, Vice President of Product Management for Data Integration Cloud products at Oracle, and Kalyan Villuri, Senior Database Manager at Veritas Technologies LLC. We cover quite a lot of ground, not just about the product, but about best practices for integrating data. Watch this webcast if:

You are looking for a solution that can bring scalability and trust to your analytics solutions,
You are looking to adopt the Oracle Cloud and autonomous data warehousing,
You are considering, or are in the middle of, any big data projects,
You want to see a real-life example of how customers are using Oracle data integration.

DIPC unifies a number of Oracle’s flagship technologies under a modern and intuitively designed interface. For replication, change data capture, and real-time data streaming capabilities, DIPC relies on expertise built and expanded in Oracle’s GoldenGate technology as a foundation. Oracle Data Integrator, Oracle’s flagship ETL and bulk data transformation engine, provides robust capabilities that are used as a starting point for DIPC's ETL capabilities. The Oracle Enterprise Data Quality and Oracle Stream Analytics engines provide data quality and data streaming capabilities within DIPC. DIPC, however, is not just a repackaging of these mature products. It is a re-imagination of how various best-of-breed data integration solutions can come together and work seamlessly, finding synergies in their features and elevating smaller piecemeal tasks and projects into a solution-based approach. For example, DIPC introduces the concepts of “elevated tasks” and “atomic tasks”. The latter, atomic tasks, are equivalent to point tasks used to accomplish smaller data requirements and logic, while the former, elevated tasks, consist of end-goal-oriented groupings (e.g. building a data lake, or prepping data) that bring together often-encountered technological requirements into simple and logical task groupings. We are excited to bring DIPC to market at a juncture where data integration is gaining more relevance for our customers as they engage in business transformations and other strategic initiatives. To learn more about DIPC, watch the webcast here.


Data Integration

#OOW18 Executive Keynotes and Sessions You Won’t Want to Miss

With Oracle OpenWorld 2018 less than two weeks away, you are probably busy crafting an agenda to fit in all the sessions you want to see. We want to make sure your experience is tailored to perfection. In a couple of days, we will share our full list of integration sessions and highlight a few special events just for integration folks. In the meantime, let’s start our planning with a bang by introducing you to some of the executive keynotes and sessions we are most excited about:

CLOUD PLATFORM & CLOUD INFRASTRUCTURE EXECUTIVE KEYNOTES AND SESSIONS

Cloud Platform Strategy and Roadmap (PKN5769) – Amit Zavery
Mon Oct 22, 9-9:45am | Yerba Buena Theater
In this session, learn about the strategy and vision for Oracle’s comprehensive and autonomous PaaS solutions. See demonstrations of some of the new and autonomous capabilities built into Oracle Cloud Platform, including a trust fabric and data science platform. Hear how Oracle’s application development, integration, systems management, and security solutions leverage artificial intelligence to drive cost savings and operational efficiency for hybrid and multi-cloud ecosystems.

Oracle Cloud: Modernize and Innovate on Your Journey to the Cloud (GEN1229) – Steve Daheb
Tue Oct 23, 12:30-1:15pm | Moscone West 2002
Companies today have three sometimes conflicting mandates: modernize, innovate, AND reduce costs. The right cloud platform can address all three, but migrating isn’t always as easy as it sounds because everyone’s needs are unique, and cookie-cutter approaches just don’t work. Oracle Cloud Platform makes it possible to develop your own unique path to the cloud however you choose: SaaS, PaaS, or IaaS. Learn how Oracle Autonomous Cloud Platform Services automatically repair, secure, and drive themselves, allowing you to reduce cost and risk while at the same time delivering greater insights and innovation for your organization. In this session, learn from colleagues who found success building their own unique paths to the cloud.

Autonomous Platform for Big Data and Data Science (PKN3898) – Greg Pavlik
Tue Oct 23, 5:45-6:30pm | Yerba Buena Theater
Data science is the key to exploiting all your data. In this general session, learn Oracle’s strategy for data science: building, training, and deploying models to uncover the hidden value in your data. Topics covered include ingestion, management, and access to big data, the raw material for data science, and integration with autonomous PaaS services.

The Next Big Things for Oracle’s Autonomous Cloud Platform (PKN5770) – Amit Zavery
Wed Oct 24, 11:15-12pm | The Exchange @ Moscone South - The Arena
Attend this session to learn about cutting-edge solutions that Oracle is developing for its autonomous cloud platform. With pervasive machine learning embedded into all Oracle PaaS offerings, see the most exciting capabilities Oracle is developing, including speech-based analytics, trust fabric, automated application development (leveraging AR and VR), and digital assistants. Find out how Oracle is innovating to bring you transformational PaaS solutions that will enhance productivity, lower costs, and accelerate innovation across your enterprise.

You can start exploring App Integration and Data Integration sessions in the linked pages. We are also sharing #OOW18 updates on Twitter: App Integration and Data Integration. Make sure to follow us for all the most up-to-date information before, during, and after OpenWorld!


GoldenGate Solutions and News

Oracle GoldenGate Plug-in for Oracle Enterprise Manager v13.2.2.0.0 is now available

We have released GoldenGate OEM Plug-in 13.2.2.0.0. The release's primary focus was to support the monitoring of Oracle GoldenGate 12.3 Microservices Architecture (MA) instances. You can now discover and monitor GoldenGate 12.3 Classic and MA instances through the GoldenGate OEM Plug-in 13.2.2.0.0. When you discover GoldenGate MA instances, you need to specify a few details such as the Service Manager name, password, hostname, and port number. We have introduced a new property, GoldenGate Classic or Microservices, that needs to be specified based on which kind of GoldenGate instance, Classic or Microservices, you want to discover. You can see the status of the services that you have deployed in your GoldenGate MA instances. The Extract and Replicat processes show the common metrics available for both the Classic and MA architectures. In a future release, we will add MA-specific metrics and detailed monitoring of MA services. You need to provide the monitoring credentials for Microservices so that you can view the configuration tab and log data and start/stop the GoldenGate processes.

The GoldenGate OEM Plug-in infrastructure has been upgraded to be compatible with the newer version of Enterprise Manager (EM), 13.2.2.0.0. You no longer need to set up the GoldenGate jAgent for the plug-in to communicate with GoldenGate 12.3 MA instances. The GoldenGate MA architecture provides RESTful APIs to monitor and manage GoldenGate MA instances, and the GoldenGate OEM Plug-in uses these RESTful APIs to communicate with them. For your GoldenGate 12.3 Classic instances, you still need to set up GoldenGate jAgent 12.2.1.2.0+ for communication purposes.

We have also made it possible to edit Big Data handler properties files from the Oracle GoldenGate OEM Plug-in.

You can get more details of the release from the documentation, and you may download the software from OSDC, OTN, and the EM Store.

We are working on more features around monitoring the GoldenGate Microservices architecture in future releases. Let me know if you have any questions.
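One more note: since the plug-in talks to MA instances over their RESTful APIs, you can exercise the same API family yourself. The sketch below is illustrative only; the host, port, credentials, and exact endpoint path and response shape depend on your Service Manager deployment and release, so verify them against the GoldenGate MA REST API documentation.

     # Illustrative only: list GoldenGate MA deployments through the
     # Service Manager REST API (the same API family the OEM Plug-in uses).
     # Host, port, and credentials are placeholders; verify the endpoint
     # path and response shape against your GoldenGate MA REST API docs.
     import requests

     SERVICE_MANAGER = "https://gg-ma-host:9001"   # placeholder Service Manager URL

     response = requests.get(
         f"{SERVICE_MANAGER}/services/v2/deployments",
         auth=("oggadmin", "<password>"),          # monitoring credentials
         verify=False,                             # only if using self-signed certs
     )
     response.raise_for_status()

     # The response shape may vary by release; .get() keeps this tolerant
     for deployment in response.json().get("response", {}).get("items", []):
         print(deployment.get("name"), deployment.get("status"))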


Data Integration

New Whitepaper: Oracle GoldenGate - Innovations for Another 20 Years

Over the past 20 years, the GoldenGate data replication platform has evolved from a startup technology targeted at ATM bank networks to a global phenomenon used in every industry by thousands of businesses on every continent. By most measures, GoldenGate has become the most successful data integration product in the history of enterprise software. What started it all was an intense focus on solving the most demanding business continuity challenges that demand zero downtime of databases and constant availability of important business data. As the technology advanced, it became widely used for high-end analytic data warehouses and decision support scenarios across most of the Global 2000 industrial base. After 20 years of being on top, there is a whole new set of innovations that will propel the GoldenGate technology for another two decades of market dominance. These recent innovations include:

Non-Relational Data Support - for SaaS applications, Big Data, and cloud
Kernel Integration with Oracle Database - far better performance than any other vendor
Remote Capture for Non-Oracle Databases - reduced workloads and simpler administration
Simplification, Automation and Self-Service - no need for DBAs for most actions
Microservices Core Foundation - more secure, more modular, and easier to work with
Simplified, Open Framework for Monitoring - more choices for DevOps
Containers, Kubernetes and Docker - faster and easier to deploy GoldenGate
Stream Processing and Stream Analytics - added value with event processing
Autonomous Cloud - let Oracle Cloud do the patching and optimizing for you
Low-Cost (Pay As You Go) Subscriptions - GoldenGate for the cost of a cup of coffee

The remainder of the paper provides more details on these innovations and explains how they will drive business results for the kind of modern digital transformation that IT and business leaders are seeking today. Click here to read the full whitepaper!


Data Integration

GoldenGate for Big Data 12.3.2.1.1 Release Update

Date: 05-Sep-2018

I am pleased to announce the release of Oracle GoldenGate for Big Data 12.3.2.1.1. Major features in this release include the following:

New Target - Google BigQuery: Oracle GoldenGate for Big Data release 12.3.2.1.1 can deliver CDC data to Google BigQuery cloud data storage from all the supported GoldenGate data sources.

New Target - Oracle Cloud Infrastructure Object Storage: Oracle GoldenGate for Big Data can now directly upload CDC files in different formats to Oracle Object Storage on both Oracle Cloud Infrastructure (OCI) and Oracle Cloud Infrastructure Classic (OCI-C). The integration with Object Storage is provided by the File Writer Handler.

New Target - Azure Data Lake: You can connect to Microsoft Azure Data Lake to process big data jobs with Oracle GoldenGate for Big Data.

Other improvements:

Extended S3 Targets: Oracle GoldenGate for Big Data can now officially write to third-party object storage that is compatible with the S3 API, such as Dell ECS Storage.

Support for Kafka REST Proxy API V2: You can now use either Kafka REST Proxy API V1 or V2, specified in the Big Data properties file.

Security: Support for Cassandra SSL capture.

Length Delimited Value Formatter: The Length Delimited Value Formatter is a row-based formatter. It formats database operations from the source trail file into length-delimited value output.

Timestamp with Timezone Property: You can standardize the format of timestamps with this timezone property.

Avro Formatter Improvements: You can write the Avro decimal logical type and the Oracle NUMBER type.

Newer Certifications: Apache HDFS 2.9, 3.0, 3.1; Hortonworks 3.0; CDH 5.15; Confluent 4.1, 5.0; Kafka 1.1, 2.0; and many more!

More information on Oracle GoldenGate for Big Data:
Learn more about Oracle GoldenGate for Big Data 12c
Download Oracle GoldenGate for Big Data 12.3.2.1.1
Documentation for Oracle GoldenGate for Big Data 12.3.2.1.1
Certification Matrix for Oracle GoldenGate for Big Data 12.3.2

Prior releases:
May 2018 Release: Oracle GoldenGate for Big Data 12.3.2.1 is released
Aug 2017 Release: What Everybody Ought to Know About Oracle GoldenGate Big Data 12.3.1.1 Features
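If you are targeting BigQuery, a quick way to confirm delivery is to query the target table with the BigQuery client library. This is a minimal, hypothetical sketch: it assumes the google-cloud-bigquery package is installed, application default credentials are set up, and mydataset.qasource_tcustmer is a placeholder for whatever table your handler writes to.

     # Spot-check rows delivered to Google BigQuery by the handler.
     # Assumes google-cloud-bigquery is installed and application default
     # credentials are configured; the dataset/table name is hypothetical.
     from google.cloud import bigquery

     client = bigquery.Client()

     query = """
         SELECT *
         FROM `mydataset.qasource_tcustmer`
         LIMIT 10
     """
     # client.query() submits the job; .result() waits and returns rows
     for row in client.query(query).result():
         print(dict(row))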


Oracle Named a Leader in 2018 Gartner Magic Quadrant for Data Integration Tools

Oracle has been named a Leader in Gartner’s 2018 “Magic Quadrant for Data Integration Tools” report based on its ability to execute and completeness of vision. Oracle believes that this recognition is a testament to Oracle’s continued leadership and focus in its data integration solutions. The Magic Quadrant positions vendors within a particular quadrant based on their ability to execute and completeness of vision. According to Gartner’s research methodologies, “A Magic Quadrant provides a graphical competitive positioning of four types of technology providers, in markets where growth is high and provider differentiation is distinct: Leaders execute well against their current vision and are well positioned for tomorrow. Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well. Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others. Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.” Gartner shares that “the data integration tools market is composed of tools for rationalizing, reconciling, semantically interpreting and restructuring data between diverse architectural approaches, specifically to support data and analytics leaders in transforming data access and delivery in the enterprise.” The report adds, “This integration takes place in the enterprise and beyond the enterprise — across partners and third-party data sources and use cases — to meet the data consumption requirements of all applications and business processes.” Download the full 2018 Gartner “Magic Quadrant for Data Integration Tools” here.

Oracle recently announced autonomous capabilities across its entire Oracle Cloud Platform portfolio, including application and data integration. Autonomous capabilities include self-defining integrations that help customers rapidly automate business processes across different SaaS and on-premises applications, as well as self-defining data flows with automated data lake and data prep pipeline creation for ingesting data (streaming and batch).

A Few Reasons Why Oracle Data Integration Platform Cloud is Exciting

Oracle Data Integration Platform Cloud accelerates business transformation by modernizing technology platforms and helping companies adopt the cloud through a combination of machine learning, an open and unified data platform, prebuilt data and governance solutions, and autonomous features. Here are a few key features:

Unified data migration, transformation, governance and stream analytics - Oracle Data Integration Platform Cloud merges data replication, data transformation, data governance, and real-time streaming analytics into a single unified integration solution to shrink the time to complete end-to-end business data lifecycles.

Autonomous - Oracle Data Integration Platform Cloud is self-driving, self-securing, and self-repairing, providing recommendations and data insights, removing risks through machine-learning-assisted data governance, and automating platform upkeep by predicting and correcting for downtime and data drift.

Hybrid Integration - Oracle Data Integration Platform Cloud enables data access across on-premises, Oracle Cloud, and 3rd-party cloud solutions so businesses have ubiquitous, real-time data access.
Integrated Data Lake and Data Warehouse Solutions - Oracle Data Integration Platform Cloud has solution-based “elevated” tasks that automate data lake and data warehouse creation and population to modernize customer analytics and decision-making platforms.

Discover DIPC for yourself by taking advantage of this limited-time offer to start for free with Oracle Data Integration Platform Cloud. Check here to learn more about Oracle Data Integration Platform Cloud.

Gartner Magic Quadrant for Data Integration Tools, Mark A. Beyer, Eric Thoo, Ehtisham Zaidi, 19 July 2018. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


Data Integration

New Whitepaper: Leverage the Oracle Data Integration Platform Inside Azure and Amazon Cloud

In this whitepaper, find out how you can leverage Oracle Data Integration Platform Cloud to move your on-premises data onto Azure and Amazon Web Services.

Oracle Data Integration Platform Cloud (DIPC) is a highly innovative data integration cloud service. A series of industry-first capabilities have been rolled out since its inception in 2017, including streaming data replication, pushdown processing for data transformations (no engine required), and first-class big data ingestion capabilities that support a wide variety of Apache open source projects such as Hive, HBase, Flume, Cassandra, and Kafka.

One of the most innovative architectural patterns that Oracle Data Integration Platform Cloud supports is the ability to push workloads to compute resources of the customer’s choice while preserving the capability for customers to keep their physical records behind their own firewalls and within their own security zones.

While you can move data to both Oracle and non-Oracle environments, this paper focuses on moving data to the Azure and Amazon clouds. There is absolutely no DIPC requirement for customers to put any of their business data into Oracle networks or any cloud resources at all. Oracle DIPC allows customers to keep their data within any of the following:

On-Premises Data Centers - which could include any regional or geographically distributed data centers that customers operate themselves or lease from 3rd-party operators
Amazon Cloud Data Centers - supporting IaaS and PaaS integrations with Amazon services in any AWS regional data centers
Azure Cloud Data Centers - supporting IaaS and PaaS integrations with Microsoft Azure data centers across regions
Or any other data center that needs to support their workloads

The remainder of the paper provides specific details about supported use cases for Oracle DIPC to support innovative next-generation data flows within the Amazon and Azure clouds.

Click here to read the full whitepaper.


Data Integration

Data Integration Platform Cloud (DIPC) 18.3.3 New Tasks

Just a few days ago, we wrote about the newest release of Data Integration Platform Cloud (DIPC) 18.3.3. This is all very exciting! Now, here are a few more details on the two newest Elevated Tasks and the inclusion of Stream Analytics in DIPC.

This release of DIPC helps with data lake automation, enabling an intuitive instantiation and copy of data into a data lake, in an effort to help reduce some of the existing data engineer/data scientist friction, through a new Data Lake Builder task. You can quickly create a comprehensive, end-to-end repeatable data pipeline to your data lake. And note that nothing is moved to the data lake without being fully governed! When you add data to the data lake, DIPC follows a repeatable pattern to harvest, profile, ingest, shape, copy, and catalog this data. Data can be ingested from a variety of sources, including relational sources, flat files, etc. Harvested metadata is stored in the DIPC Catalog, and the data is transformed and secured within the target data lake for downstream activities. For more information, see Adding Data to Data Lake.

The Replicate Data Task helps address high availability. Replicate into Oracle… or Kafka! And bring that together with Stream Analytics, whereby event processing is made possible on real-time data streams, including Spatial, Machine Learning, and queries on the data stream or cubes. With Stream Analytics, you can analyze complex event data streams that DIPC consumes, using sophisticated correlation patterns, enrichment, and machine learning to provide insights and real-time business decisions. Very simply, the Replicate Data Task delivers changes from your source data to the target. You set up connections to your source and target, and from the moment you run this task, any new transaction in the source data is captured and delivered to the target. This task doesn't perform an initial copy of the source (for the initial load, see Setting up a Synchronize Data Task), so you'll get all the changes from the point in time that you started your job. This task is especially ideal for streaming data to Kafka targets (a quick way to spot-check a Kafka target follows at the end of this post). For more information, see Setting up a Replicate Data Task.

For more tutorials, videos, and other resources on DIPC, please visit the Documentation, as well as the A-Team Chronicles for interesting Data Integration technical know-how.
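As promised, here is that spot-check. Once the Replicate Data Task is streaming changes into Kafka, any Kafka client can read the topic back. A minimal sketch using the kafka-python package; the broker address and topic name are placeholders for your environment:

     # Spot-check changes delivered to a Kafka topic by the Replicate Data Task.
     # Illustrative only: broker and topic names are placeholders, and
     # kafka-python (pip install kafka-python) is assumed.
     from kafka import KafkaConsumer

     consumer = KafkaConsumer(
         "<replication-topic>",                  # placeholder topic name
         bootstrap_servers="kafka-host:9092",    # placeholder broker
         auto_offset_reset="earliest",           # read from the start of the topic
         consumer_timeout_ms=10000,              # stop when no new messages arrive
     )

     for message in consumer:
         # Each message payload is one replicated change record (bytes)
         print(message.partition, message.offset, message.value[:200])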


Data Integration Platform Cloud (DIPC) 18.3.3 is available!

Data Integration Platform Cloud (DIPC) 18.3.3 is now available! Do you know what DIPC is? Check out this short two-minute video!

DIPC is innovative! With an Elevated-Task-driven approach that guides users through their data integration journey, DIPC seeks to simplify and revolutionize data integration! An Elevated Task is a simple, pre-defined set of steps that assists in creating a specific, useful, and common data integration job. These tasks result in simpler solutions, such as data migrations or data warehouse/data lake automation, and projects that are delivered more quickly yet are well designed and effective. Let’s cover some of the brand new and exciting features and tasks in this release!

This release helps with data lake automation, enabling an intuitive instantiation and copy of data into a data lake, in an effort to help reduce some of the existing data engineer/data scientist friction, through a new Data Lake Builder task! You can quickly create a comprehensive, end-to-end repeatable data pipeline to your data lake. And note that nothing is moved to the data lake without being fully governed!

The Replicate Data Task helps address high availability. Replicate into Oracle… or Kafka! And bring that together with Stream Analytics, whereby event processing is made possible on real-time data streams, including Spatial, Machine Learning, and queries on the data stream or cubes. With Stream Analytics, you can analyze complex event data streams that DIPC consumes, using sophisticated correlation patterns, enrichment, and machine learning to provide insights and real-time business decisions.

Additionally, you have already heard us mention the Synchronize Data Task, the Data Preparation Task, and the ODI Execution Task, and this release features enhancements to many of these! For a quick recap:

Synchronize Data: The Synchronize Data task enables you to copy your selected data from a source to a target, and then keeps both databases in sync. You can also use filter rules to include or exclude specific data entities in your job. Any change in the source schema is captured and replicated in the target and vice versa. After you create a synchronize data task and set it up to synchronize all data or specific data entities, you can run the job. If you set up policies for your job, you'll receive notifications for your specified criteria. For more information, see Creating a Synchronize Data Task.

Data Preparation: The Data Preparation task enables you to harvest data from a source File or Oracle database Connection, and then cleanse, organize, and consolidate that data, saving it to a target Oracle database. For more information, see Setting up a Data Preparation Task.

ODI Execution: Invoke an existing ODI Scenario to perform bulk data transformations. For more information, see Setting up an ODI Execution Task.

This release also provides updated components under the covers, such as Enterprise Data Quality 12.2.1.3.0, Oracle Data Integrator 12.2.1.3.1, GoldenGate for Oracle 11g/12c 12.3.0.1.0, GoldenGate for Big Data 12.3.2.1.0, and GoldenGate for MySQL 12.2.0.1.1. Oracle Stream Analytics is part of DIPC as well, and is available only for user-managed Data Integration Platform Cloud instances.

Want to learn more? Visit the DIPC site and check out some of our archived webcasts HERE!


Data Integration

Oracle Integration Day is Coming to a City near You

Are you able to innovate quickly in the new digital world? Are you looking for ways to integrate systems and data faster using a modern cloud integration platform? Is your data integration architecture allowing you to meet your uptime, replication, and analytics/reporting needs? Is your organization able to achieve differentiation and disruption? Join Oracle product managers and application/data integration experts to hear about best practices for the design and development of application integrations, APIs, and data pipelines with Oracle Autonomous Integration Cloud and Data Integration Platform Cloud. Hear real-world stories about how Oracle customers are able to adopt new digital business models and accelerate innovation through integration of their cloud, SaaS, and on-premises applications and databases, and Big Data systems. Learn about Oracle’s support for emerging trends such as Blockchain, Visual Application Development, Self-Service Integration, and Stream Analytics to deliver competitive advantage.

Tampa Integration Day

With interactive sessions, deep-dive demos, and hands-on labs, Oracle Integration Day will help you to:

Understand how Oracle’s Data Integration Platform Cloud (DIPC) can help derive business value from enterprise data, getting data to the right place at the right time reliably and ensuring high availability
Understand Oracle's industry-leading use of Machine Learning/AI in its Autonomous Integration Cloud and how it can significantly increase speed and improve delivery of IT projects
Quickly create integrations using Oracle’s simple but powerful Integration Platform as a Service (iPaaS)
Secure, manage, govern, and grow your APIs using Oracle API Platform Cloud Service
Understand how to leverage and integrate with Oracle’s new Blockchain Cloud Service for building new value chains and partner networks

Integration Day begins on August 8 in Tampa. Register now to reserve your spot! Click the links to learn more about your local Integration Day:

August 8, 2018 – Tampa
August 15, 2018 – Los Angeles
August 22, 2018 – Denver
September 6, 2018 – San Francisco
September 19, 2018 – New York City
September 26, 2018 – Toronto
October 3, 2018 – Boston
December 5, 2018 – Chicago
January 23, 2019 – Atlanta
January 30, 2019 – Dallas
February 6, 2019 – Washington DC
February 20, 2019 – Santa Clara


New Whitepaper: EU GDPR as a Catalyst for Effective Data Governance and Monetizing Data Assets

The European Union (EU) General Data Protection Regulation (GDPR) was adopted on the 27th of April 2016 and came into force on the 25th of May 2018. Although many of the principles of GDPR have been present in country-specific legislation for some time, there are a number of new requirements which impact any organization operating within the EU. As organizations implement changes to processes, organization, and technology as part of their GDPR compliance, they should consider how a broader Data Governance strategy can leverage their regulatory investment to offer opportunities to drive business value. This paper reviews some of the Data Governance challenges associated with GDPR and considers how investment in GDPR Data Governance can be used for broader business benefit. It also reviews the part that Oracle’s data governance technologies can play in helping organizations address GDPR. The following Oracle products are discussed in this paper:

Oracle Enterprise Metadata Manager (OEMM) - metadata harvesting and data lineage
Oracle Enterprise Data Quality (EDQ) - for operational data policies and data cleansing
Oracle Data Integration Platform Cloud - Governance Edition (DIPC-GE) - for data movement, cloud-based data cleansing, and subscription-based data governance

Read the full whitepaper here.


Data Integration

Data Integration Platform Cloud for SaaS Applications

Customers generate enormous amounts of data in SaaS applications, data which is critical to business decisions such as reducing procurement spend or maximizing workforce utilization. With most customers using multiple SaaS applications, many of these decisions are made in analytical engines outside of SaaS, or need external data to be brought into SaaS to make decisions within it. In this blog we shall examine common data movement and replication needs in the SaaS ecosystem and how Oracle’s Data Integration Platform Cloud (DIPC) enables access to SaaS data and helps with decision making.

Data Integration Challenges for SaaS

As applications moved from on-premises to SaaS, they provided a number of benefits, but a number of pre-existing assumptions and architectures changed. Let us examine a few changes in the enterprise landscape here, which are by no means comprehensive. First, on-premises applications in most cases provided access to applications at a database level, typically read-only. This has changed, with hardly any SaaS vendor providing database access. Customers now work with REST APIs (or earlier versions of SOAP APIs) to extract and load bulk data. While APIs have many advantages, including removing dependency on the application schema, they are no match for SQL queries and have pre-set data throttling limitations defined by the SaaS vendor. Second, most customers have multiple SaaS applications, which makes it imperative to merge data from different pillars for any meaningful analysis or insight: Sales with Product, Leads with Contacts, Orders with Inventory, and the list goes on. While each of the SaaS applications provides some analytical capability, most customers would prefer modern best-of-breed tools and open architectures for analytical processing of their data. This could range from traditional relational databases with Business Intelligence to modern Data Lakes with Spark engines. Third, most enterprise customers have either an application or an analytical/reporting platform on-premises, which necessitates data movement between cloud and on-premises, i.e., a hybrid cloud deployment. Fourth, semi-structured and unstructured data sources are increasingly used in decision making. Emails, Twitter feeds, Facebook and Instagram posts, log files, and device data all provide context for transactional data in relational systems. And finally, decision-making timelines are shrinking, with a need for real-time data analysis more often than not. While most SaaS applications provide batch architectures and REST APIs, they struggle to provide robust streaming capability for real-time analysis. Customers need SaaS applications to be part of both Kappa- and Lambda-style architectures. Let us take a peek into how Oracle Data Integration Platform Cloud addresses these issues.

Mitigating SaaS Data Integration Challenges with DIPC

Data Integration Platform Cloud (DIPC) is a cloud-based platform for data transformation, integration, replication, and governance. DIPC provides batch and real-time data integration among cloud and on-premises environments and brings together the best-of-breed Oracle data integration products, Oracle GoldenGate, Oracle Data Integrator, and Oracle Enterprise Data Quality, within one unified cloud platform. You can find more information on DIPC here. For Oracle’s Fusion applications, such as ERP Cloud, HCM Cloud, and Sales Cloud, DIPC supports a number of load and extract methods with out-of-the-box connectors. These include BI Publisher, BI Cloud Connector, and other standard SOAP/REST interfaces.
The choice of interface depends on the specific use case. For example, to extract large datasets for a given subject area (say Financials -> Accounts), BI Cloud Connector (BICC) is ideal, with its incremental extract setup in Fusion. BICC provides access to Fusion Cloud data via Public View Objects (PVOs). These PVOs are aggregated into Subject Areas (Financials, HCM, CRM, etc.), and BICC can be set up to pull full or incremental extracts manually or programmatically. DIPC integrates with BI Cloud Connector to kick off an extract, download the PVO data files in chunks, unzip and decrypt them, extract data from CSV formats, read metadata formats from mdcsv files, and finally load them to any target such as Database Cloud Service or Autonomous Data Warehouse Cloud Service. For smaller datasets, DIPC can call existing or custom-built BI Publisher reports and load data to any target. For other SaaS applications, DIPC has drivers for Salesforce, Oracle Service Cloud, Oracle Sales Cloud, and Oracle Marketing Cloud. These drivers provide a familiar JDBC-style interface for data manipulation while accessing SaaS applications over REST/SOAP APIs. In addition, other SaaS applications that provide JDBC-style drivers, such as NetSuite, can become a source and target for ELT-style processing in DIPC. DIPC also has generic REST and SOAP support allowing access to any SaaS REST APIs. You can find the list of sources and targets supported by DIPC here. DIPC simplifies data integration tasks using Elevated Tasks, and users can expect more wizards and recipes for common SaaS data load and extract tasks in the future. The DIPC Catalog is populated with metadata and sample data harvested from SaaS applications. In the DIPC Catalog, users can create Connections to SaaS applications, after which a harvest process is kicked off to populate the Catalog with SaaS Data Entities. From this Catalog, users can create Tasks with Data Entities as sources and targets, and wire together a pipeline data flow including JOINs, FILTERs, and standard transformation actions. Elevated Tasks can also be built to feed SaaS data to a Data Lake or Data Warehouse such as Oracle Autonomous Data Warehouse Cloud (ADWCS). In addition, there is a full-featured Oracle Data Integrator embedded inside for existing ODI customers to build out Extract, Load, and Transform scenarios for SaaS data integration. Customers can also bring their existing ODI scenarios to DIPC using ODITask. An ODITask is an ODI scenario exported from ODI and imported into DIPC for execution, and it can be wired to SaaS sources and targets.

The figure above shows the DIPC Catalog populated with ERP Cloud View Objects.

The figure above shows details for the Work Order View Object in the DIPC Catalog.

For hybrid cloud architectures, DIPC provides a remote agent that includes connectors to a wide number of sources and targets. Customers who wish to move or replicate data from on-premises sources can deploy the agent and have data pushed to DIPC in the cloud for further processing, or vice versa for data being moved to on-premises applications. The remote agent can also be deployed on non-Oracle clouds for integration with databases running on 3rd-party clouds. For real-time and streaming use cases from SaaS applications, DIPC includes Oracle GoldenGate, the gold standard in data replication. When permissible, SaaS applications can deploy GoldenGate to stream data to external databases, data lakes, and Kafka clusters.
GoldenGate can either be deployed to read directly from the SaaS production database instance, mining the database redo log files, or run on a standby/backup copy of the SaaS database using the cascading redo log transmission mechanism. This mechanism incurs minimal latency and delivers Change Data Capture of specific SaaS transaction tables to an external database or data warehouse, providing real-time transaction data for business decisions. Using these comprehensive features in DIPC, we are seeing customers sync end-of-day/end-of-month batches of Salesforce Account information into E-Business Suite. Fusion Applications customers are able to extract from multiple OTBI subject areas and merge/blend Sales, Financials, and Service objects to create custom data marts. And in retail, we have customers using GoldenGate’s change data capture to sync store data to Retail SaaS apps at corporate in real time. In summary, DIPC provides a comprehensive set of features for SaaS customers to integrate data into Data Warehouses, Data Lakes, Databases, and other SaaS applications, in both real time and batch. You can learn more about DIPC here.
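As a footnote to the API discussion above: DIPC's drivers and Elevated Tasks handle this plumbing for you, but the underlying extract pattern that SaaS REST APIs impose is worth seeing. Below is a minimal, hypothetical sketch of a paginated, rate-limited REST extract; the endpoint, parameters, and field names are placeholders purely to illustrate the pagination and throttling constraints described earlier, and this is not a DIPC API.

     # Hypothetical example of the paginated, rate-limited REST extraction
     # pattern that SaaS APIs impose (this is NOT a DIPC API; the endpoint,
     # parameters, and fields are placeholders for illustration).
     import csv
     import time

     import requests

     BASE_URL = "https://example-saas.invalid/api/v1/accounts"  # placeholder
     PAGE_SIZE = 200          # SaaS vendors typically cap page sizes
     PAUSE_SECONDS = 1.0      # stay under the vendor's throttling limits

     with open("accounts.csv", "w", newline="") as out:
         writer = None
         offset = 0
         while True:
             page = requests.get(
                 BASE_URL,
                 params={"limit": PAGE_SIZE, "offset": offset},
                 headers={"Authorization": "Bearer <token>"},
                 timeout=30,
             )
             page.raise_for_status()
             items = page.json().get("items", [])
             if not items:
                 break
             if writer is None:
                 # First page establishes the CSV header from the record keys
                 writer = csv.DictWriter(out, fieldnames=sorted(items[0]))
                 writer.writeheader()
             writer.writerows(items)
             offset += len(items)
             time.sleep(PAUSE_SECONDS)  # respect pre-set throttling limits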


Oracle GoldenGate Veridata 12.2+ Bundle Patch: New Enhancements

Last week, we released the GoldenGate Veridata bundle patch (12.2.1.2.180615). The release contains two significant improvements along with a few bug fixes. GoldenGate Veridata now certifies High Availability (HA) for the Veridata Server and Veridata Agents, leveraging the high availability support provided by WebLogic Server (WLS). We have officially certified it and documented the detailed steps for you to harness it. You can find the details in the Oracle By Example created by Anuradha Chepuri.

When the primary Veridata server fails, the other Veridata server (the backup server) serves the requests of the connected Veridata agents. Any in-flight requests need to be re-initiated by users. Both Veridata servers are connected to a shared repository so that all metadata is available and up to date on both servers. Veridata Agent HA support has also been tested: when the primary Veridata agent fails, the backup Veridata agent takes over. New requests are diverted to the backup Veridata agent, and existing requests need to be re-initiated by users. The VIP address needs to be added to the Veridata Agent configuration file so that seamless failover can happen.

The other major feature allows Active Directory (AD) users to access the GoldenGate Veridata product. We have created new roles in Veridata for Active Directory. The following AD Veridata roles have been added:

ExtAdministrator
ExtPowerUser
ExtDetailReportViewer
ExtReportViewer
ExtRepairOperator

You need to import these roles by creating them in your WebLogic server. In the screen below, I have shown how to create the ExtAdministrator role; all other roles can be created similarly. Once all the required roles are imported in the WebLogic server, you may assign these roles to your AD users or groups of users. Here, I am editing the ExtAdministrator role. For the ExtAdministrator role, I want to add a group, so I am adding the existing AD group called "DIQAdmin" to it. The AD users who are part of DIQAdmin can then access the Veridata product.

You may see my blog on the earlier GoldenGate Veridata release here. Let me know if you have any questions.

Data Integration

Walkthrough: Oracle Autonomous Data Integration Platform Cloud Provisioning

We recently launched the Oracle Autonomous Data Integration Platform Cloud (ADIPC) Service, a brand-new autonomous cloud-based platform for all your data integration needs that helps migrate and extract value from data by bringing together complete Data Integration, Data Quality, and Data Governance capabilities. You can get more information about it in Introducing Oracle Autonomous Data Integration Platform Cloud (ADIPC). In this article, I will focus on the provisioning process and walk you through how to provision an Autonomous Data Integration Platform Cloud (ADIPC) Instance in the Oracle Cloud. In a previous blog, my colleague Julien Testut walked you through how to provision the Data Integration Platform Cloud - User Managed cloud service. From here onward, I will refer to "Autonomous Data Integration Platform Cloud" as ADIPC, "Autonomous Data Integration Platform Cloud Instance" as ADIPC Instance, and "Data Integration Platform Cloud" as DIPC User Managed.

First, you will need to access your Oracle Cloud Dashboard. You can do so by following the link you received after subscribing to the Oracle Cloud, or you can go to cloud.oracle.com and Sign In from there. The service does not have any prerequisites; you can directly create an ADIPC Instance. A built-in Database Cloud Service (DBCS) Instance is provided for storing the ADIPC Repository content for all Editions. This DBCS Instance is not accessible to the user; it is self-managed and used for internal ADIPC purposes.

Go to the Dashboard, click Create Instance, click All Services, and scroll down to find Data Integration Platform under Integration. Click Create next to it. This will get you to the ADIPC Service Console page.

You can create a service by clicking either QuickStarts or Create Instance. The QuickStarts page provides ready-to-use templates for the different Editions. Click QuickStarts at the top-right corner. The page displays the Governance Edition template; upcoming releases will add new templates. The Instance name is automatically generated for you, and you may change it if required. Click Create and the ADIPC Instance will be created for you.

If you want to select various input parameters while provisioning, click Create Instance on the Service Console page to navigate to the provisioning screens. In the Details screen, under Instance Details, enter the Service Name, Description, Notification Email, Tags, and On Failure Retain Resources. In the Configuration section, you may select the Region and Availability Domain where you want to deploy your ADIPC Instance. In the Service section, select the Data Throughput (Data Volume), which has four choices. ADIPC has a new data-volume-based metering model, where you choose the option based on the data volume in your Data Integration environment; your Instance will have compute resources matching the selected data volume. If you want to utilize your on-premises licenses, you may choose the Bring Your Own License option. In this example, I have selected the Governance Edition, which includes Data Governance capabilities in addition to everything included with the ADIPC Standard and Enterprise Editions. When done, click Next to review the configuration summary.

Finally, click Confirm to start the ADIPC Instance creation. You can see the status of the new Instance being provisioned in the Oracle Cloud Stack Dashboard. You can also check the Stack Create and Delete History at the bottom of the page, which has more detailed information.
You can go back to the Dashboard by clicking the Action Menu in the top-left corner and then clicking Dashboard. Next, let's customize your dashboard to show ADIPC in the Oracle Cloud Dashboard. From the Dashboard, click the minus (-) button in the top-right corner, then click Customize Dashboard. Scroll down in the list and click Autonomous Data Integration Platform Cloud under the Integration section; Autonomous Data Integration will then start appearing on the Dashboard. Click the Autonomous Data Integration Platform Cloud (ADIPC) Action Menu to see more details about your subscription, click Open Service Console to view your ADIPC instances, and click View Account Usage Details to find out how much data you have already consumed. You can also see the Instance status through the ADIPC Service Console, where you can click the instance name to get more details about it. When the provisioning process is over, the ADIPC Instance will show as ‘Ready’. Congratulations! We now have a new Autonomous Data Integration Platform Cloud Instance to work with. You can get more information on the product page: Data Integration Platform. In future articles, we will cover more DIPC autonomous capabilities. In the meantime, please write your comments if you have any questions.

Looking for Cutting-Edge Data Integration & Governance Solutions: 2018 Cloud Platform Excellence Awards

It is nomination time!!!  This year's Oracle Excellence Awards: Oracle Cloud Platform Innovation will honor customers and partners who are on the cutting edge, creatively and innovatively using various products across Oracle Cloud Platform to deliver unique business value. Do you think your organization is unique and innovative and is using Oracle Data Integration and Governance? Are you using Data Integration Platform Cloud, GoldenGate Cloud Service, Oracle Data Integrator Cloud Service, Oracle Stream Analytics, etc.? And are you addressing mission-critical challenges? Is your solution around heterogeneous and global data high availability, database migrations to cloud, data warehouse and data lake automation, low-latency streaming and integration, or data governance for the business or IT, for example? Tell us how Oracle Data Integration is impacting your business! We would love to hear from you! Please submit today in the Data Integration and Governance category. The deadline for the nomination is July 20, 2018. Win a free pass to Oracle OpenWorld 2018!!

Here are a few more details on the nomination criteria:
Solution shows innovative and/or visionary use of these products
There is a measurable level of impact such as ROI or other business benefit (or projected ROI)
Solution should have a specific use case identified
Nominations for solutions which are not yet running in production will also be considered
Nominations will be accepted from Oracle employees, Oracle Partners, third parties or the nominee company

We hope to honor you! Click here to submit your nomination today! And just a reminder: the deadline to submit a nomination is 5pm Pacific Time on July 20, 2018.

Data Integration

Oracle GoldenGate for Big Data 12.3.2.1 is released

What’s new in Oracle GoldenGate for Big Data 12.3.2.1?

New Source - Cassandra
Starting with Oracle GoldenGate for Big Data release 12.3.2.1, GoldenGate can read from NoSQL data stores. With this release, you will be able to capture changes from Cassandra, a columnar NoSQL data store. It can also capture from the beginning of the source data, also known as Initial Capture.

New Target - Kafka REST Proxy
Oracle GoldenGate for Big Data can now natively write Logical Change Records (LCR) data to a Kafka topic in real time using the REST Proxy interface. It supports DDL changes, and operations such as Insert, Update, Delete, and Primary Key Update can all be handled. It supports templates and formatters, provides encoding formats such as AVRO and JSON, and supports HTTPS/SSL layer security. (See the sketch at the end of this post for what a REST Proxy produce call looks like.)

New Target - Oracle NoSQL
Oracle GoldenGate for Big Data can now officially write to Oracle NoSQL data stores. It can handle Oracle NoSQL data types, mapping between tables and columns, DDL changes to be replicated, and primary key updates. It supports both Basic and Kerberos methods of authentication.

New Target - Flat Files
Oracle GoldenGate for Big Data has a new flat file writer. It is designed to write to a local file system and then load completed files to another location such as HDFS. This means analytical tools will not try to access half-processed real-time files, and it enables post-processing capabilities such as transform and merge, for example by calling a native function.

New Target - Amazon Web Services S3 Storage
Oracle GoldenGate for Big Data can write to a local file system and then load completed files to another location such as AWS S3. The S3 handler can write to pre-created AWS S3 buckets or create new buckets, using the AWS OAUTH authentication method.

New Data Formats - ORC & Parquet
Oracle GoldenGate for Big Data can write newer data formats such as ORC and Parquet using the new flat file handler.

Newer certifications include MapR, Hortonworks 2.7, CDH 5.14, Confluent 4.0, MongoDB 3.6, DataStax Cassandra 5.1, Elasticsearch 6.2, Kafka 1.0, and many more!!!

More information on Oracle GoldenGate for Big Data
Learn more about Oracle GoldenGate for Big Data 12c
Download Oracle GoldenGate for Big Data 12.3.2.1
Documentation for Oracle GoldenGate for Big Data 12.3.2.1
Certification Matrix for Oracle GoldenGate for Big Data 12.3.2.1
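As promised above, here is a minimal sketch in Python of the kind of produce call the Kafka REST Proxy interface accepts. This illustrates the Confluent REST Proxy v2 produce API that the new handler targets; it is not GoldenGate's internal code, and the proxy URL, topic name, and record payload are illustrative assumptions.

import json
import requests

REST_PROXY = "http://localhost:8082"  # assumed REST Proxy address
TOPIC = "gg_orders"                   # hypothetical target topic

# One JSON-encoded change record; a real handler payload would carry the
# formatted logical change record that GoldenGate produces.
payload = {"records": [{"value": {"op_type": "I", "table": "SALES.ORDERS", "order_id": 42}}]}

resp = requests.post(
    f"{REST_PROXY}/topics/{TOPIC}",
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    data=json.dumps(payload),
)
resp.raise_for_status()
print(resp.json())  # per-record partition/offset information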

Data Integration

New Releases for Oracle Stream Analytics: Data Sheet Now Available

More than ever before, companies across most industries are challenged with handling large volumes of complex data in real time. The quantity and speed of both raw infrastructure and business events is growing exponentially in IT environments. Mobile data, in particular, has surged due to the explosion of mobile devices and high-speed connectivity. High-velocity data brings high value, so companies are expected to process all their data quickly and flexibly, creating a need for the right tools to get the job done.

In order to address this need, we have released a new version 18.1 of Oracle Stream Analytics (OSA). The product is now available in three forms:
In the cloud as part of Oracle Data Integration Platform Cloud
On premises as Oracle Stream Analytics
On premises as part of Oracle GoldenGate for Big Data

To try OSA for yourself, you can download it here.

The OSA product allows users to process and analyze large-scale real-time information using sophisticated correlation patterns, enrichment, and machine learning. It offers real-time actionable business insight on streaming data and automates action to drive today’s agile businesses.

The Oracle Stream Analytics platform targets a broad variety of industries and functions. A few examples include:
Supply Chain and Logistics: OSA provides the ability to track shipments in real time, alerts for any possible delivery delays, and helps to control inventory based on demand and shipping predictions.
Financial Services: OSA performs real-time risk analysis, monitoring and reporting of financial securities trading, and calculates foreign exchange prices.
Transportation: OSA can create passenger alerts and detect the location of baggage, mitigating some common difficulties associated with weather delays, ground crew operations, airport security issues, and more.

One of the most compelling capabilities of OSA is how it democratizes stream analysis with its Interactive Designer user interface. It allows users to explore real-time data through live charts, maps, visualizations, and graphically built streaming pipelines without any hand coding. Data can be viewed and manipulated in a spreadsheet-like tabular view, allowing users to add, remove, rename, or filter columns to obtain the desired result. Perhaps most importantly, users can get immediate feedback on how patterns applied to the live data create actionable results.

A few other notable capabilities include the ability to analyze and correlate geospatial information in streams and graphically define and introspect location data and rules, the predictive analytics available based on a wide range of machine learning models, and the reusable Business Solution Patterns from which users can select a familiar solution analysis.

Other Data Integration products complement OSA to process information in real time, including Oracle GoldenGate and Oracle Data Integration Platform Cloud. In particular, Oracle Stream Analytics is integrated with the GoldenGate change data capture platform to process live transaction feeds from transactional sources such as OLTP databases, detect patterns in real time, and prepare and enrich data for analytical stores.

To get a deeper look at the features and functionalities available, check out this new data sheet. You can learn even more about Oracle Stream Analytics at the product’s Help Center page.

Data Integration

Data Integration Platform Cloud (DIPC) 18.2.3 is Available!

Data Integration Platform Cloud (DIPC) 18.2.3 is now available! Here are some of the highlights:

DIPC boasts an expanded, intuitive, enhanced user experience, which includes the Data Preparation and Data Synchronization Tasks and the ODI Execution Task, providing better management, execution, and monitoring. There is also a continued focus on overall data integration productivity and ease of use on one cloud platform, all within one single pane of glass.

The Synchronize Data Task brings new features to enable schema-to-schema data synchronization in 3 clicks! This allows for more control during an initial load and/or during ongoing synchronizations. Better monitoring is also possible.

The new Data Preparation Task provides out-of-the-box, end-to-end data wrangling for better data. The task allows for simple ingest and harvesting of metadata for easy data wrangling, including integrated data profiling.

The new ODI Execution Task provides the ability to easily execute and monitor Oracle Data Integrator scenarios! This task supports a hybrid development mode where one can develop or design on-premises, then import into DIPC and integrate with DIPC Tasks. This is all executed and monitored in the DIPC Console.

Additionally, there are also enhancements to the Remote Agent, enabling on-premises to on-premises use cases such as custom Oracle-to-Oracle replication. And the Data Catalog provides new data entities and new connections as well, furthering its underpinnings for governance at every enterprise!

Want to learn more? Visit the DIPC site and check out this short 2-minute video on DIPC!

Oracle Named a Leader in 2018 Gartner Magic Quadrant for Enterprise Integration Platform as a Service for the Second Year in a Row

Oracle announced in a press release today that it has been named a Leader in Gartner’s 2018 “Magic Quadrant for Enterprise Integration Platform as a Service” report for the second consecutive year. Oracle believes that the recognition is a testament to the continued momentum and growth of Oracle Cloud Platform in the past year.

As explained by Gartner, the Magic Quadrant positions vendors within a particular quadrant based on their ability to execute and completeness of vision, separating them into the following four categories:
Leaders execute well against their current vision and are well positioned for tomorrow.
Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well.
Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others.
Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.

Gartner views integration platform as a service (iPaaS) as having the “capabilities to enable subscribers (aka "tenants") to implement data, application, API and process integration projects involving any combination of cloud-resident and on-premises endpoints.” The report adds, “This is achieved by developing, deploying, executing, managing and monitoring integration processes/flows that connect multiple endpoints so that they can work together.”

“GE leverages Oracle Integration Cloud to streamline commercial, fulfilment, operations and financial processes of our Digital unit across multiple systems and tools, while providing a seamless experience for our employees and customers,” said Kamil Litman, Vice President of Software Engineering, GE Digital. “Our investment with Oracle has enabled us to significantly reduce time to market for new projects, and we look forward to the autonomous capabilities that Oracle plans to soon introduce.”

Download the full 2018 Gartner “Magic Quadrant for Enterprise Integration Platform as a Service” here.

Oracle recently announced autonomous capabilities across its entire Oracle Cloud Platform portfolio, including application and data integration. Autonomous capabilities include self-defining integrations that help customers rapidly automate business processes across different SaaS and on-premises applications, as well as self-defining data flows with automated data lake and data prep pipeline creation for ingesting data (streaming and batch).

Oracle also recently introduced Oracle Self-Service Integration, enabling business users to improve productivity and streamline daily tasks by connecting cloud applications to automate processes. Thousands of customers use Oracle Cloud Platform, including global enterprises, along with SMBs and ISVs, to build, test, and deploy modern applications and leverage the latest emerging technologies such as blockchain, artificial intelligence, machine learning, and bots to deliver enhanced experiences.

A Few Reasons Why Oracle Autonomous Integration Cloud is Exciting

Oracle Autonomous Integration Cloud accelerates the path to digital transformation by eliminating barriers between business applications through a combination of machine learning, embedded best-practice guidance, and prebuilt application integration and process automation.
Here are a few key features:
Pre-Integrated with Applications – A large library of pre-built integrations with Oracle and third-party SaaS and on-premises applications, through application adapters, eliminates the slow and error-prone process of configuring and manually updating Web service and other styles of application integration.
Pre-Built Integration Flows – Instead of recreating the most commonly used integration flows, such as between sales applications (CRM) and configure, price, quoting (CPQ) applications, Oracle provides pre-built integration flows between applications spanning CX, ERP, HCM, and more to take the guesswork out of integration.
Unified Process, Integration, and Analytics – Oracle Autonomous Integration Cloud merges the solution components of application integration, business process automation, and the associated analytics into a single, seamlessly unified business integration solution to shrink the time to complete end-to-end business process lifecycles.
Autonomous – It is self-driving, self-securing, and self-repairing, providing recommendations and best next actions, removing security risks resulting from manual patching, and sensing application integration connectivity issues for corrective action.

Discover OAIC for yourself by taking advantage of this limited-time offer to start for free with Oracle Autonomous Integration Cloud. Check here for Oracle Autonomous Cloud Integration customer stories.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Data Integration

How to Increase Productivity with Self-Service Integration

By Kellsey Ruppel, Principal Product Marketing Director, Oracle

One of the most exciting innovations in integration over the last decade is arriving just in time to address the surge of productivity apps that need to be integrated into enterprises, including small-to-medium-size businesses (SMBs). On a general scale, there are approximately 2,300 Software-as-a-Service (SaaS) apps in use at SMBs that need to be integrated. Line-of-business (LOB) users such as marketing campaign managers and sales managers are looking to perform quick and simple self-service integration of these apps themselves, without the need for IT involvement – a huge benefit for SMBs, who likely might not have a large IT department to lean on.

Oracle Self-Service Integration Cloud Service (SSI) provides the right tools for anyone who wants to connect productivity apps such as Slack or Eventbrite in their SMB. For example, perhaps you are a marketing campaign manager and want to receive an alert each time a new digital asset is ready for your campaign. Or you are a customer support representative trying to automate the deployment of survey links when an incident is closed. Or maybe you are a sales manager who wants to feed your event attendees and survey respondents into your CRM. SSI has the tools to address all these needs and more for your SMB. For a comprehensive overview of Oracle Self-Service Integration Cloud Service, take a look at our ebook: Make Your Cloud Work for You.

Oracle Self-Service Integration is solving these business challenges by:
Connecting productivity with enterprise apps - Addressing the quick growth of social and productivity apps that need to be integrated with enterprise apps.
Enabling self-service integration - Providing line-of-business (LOB) users the ability to connect applications themselves, with no coding, to automate repetitive tasks.
Recipe-based integration - Making it easier to work faster and smarter with modern cloud apps via an easy-to-use interface, a library of cloud application connectors, and ready-to-use recipes.

To learn more, we invite you to attend the webcast, Introducing Oracle Self-Service Integration, on April 18th at 10:00am PT. Vikas Anand, Oracle Vice President of Product Management, will discuss:
Integration trends such as self-service, blockchain, and artificial intelligence
The solutions available in Oracle Self-Service Integration Cloud Service
Register today!

Data Integration

How to Use Oracle Data Integrator Cloud Service (ODI-CS) to Manipulate Data from Oracle Cloud Infrastructure Object Storage

Guest Author: Ioana Stama - Oracle, Sales Consultant

Introduction
This article presents an overview of how to use Oracle Data Integrator Cloud Service (ODI-CS) to manipulate data from Oracle Cloud Infrastructure Object Storage. The scenario presented here loads the data in an object stored in Oracle Cloud Infrastructure into a table in Database Cloud Service (DBCS), and then moves the object to another storage container. We are going to showcase how ODI-CS can connect to Object Storage Classic with both RESTful Services and cURL commands.

About Object Storage by Oracle Cloud Infrastructure
Oracle Object Storage is an internet-scale, high-performance, durable storage platform. Developers and IT administrators can use this storage service to store an unlimited amount of data at a very low cost. With Oracle Object Storage, you can safely and securely use the web-based console to store or retrieve data directly from the internet or from within the cloud platform, at any time. Oracle Object Storage is agnostic to the data content type, which enables a wide variety of use cases. You can send your backup and archive data offsite, store data for Big Data Analytics to generate business insights, or simply build a scale-out web application. The elasticity of the service enables you to start small and scale your application as needed, and you always pay only for what you use. Oracle Object Storage provides a native REST API, along with OpenStack Swift API compatibility and an HDFS plug-in. Oracle Object Storage also currently offers a Java SDK, as well as Console and Python CLI access for management.

We start by looking at the source container, the target container, and the target table in their initial state. Then we prepare the topology for the REST services, the File technology, and the database. We begin with the REST services, creating one connection for the source Object Storage container and one for the destination.

Preparing the RESTful Services Topology
Go to the Topology tab in ODI Studio, right-click RESTful Services, and select New Data Server. Give it a name, e.g., TestRest. In the REST Service endpoint URL field, enter the endpoint URL of the cloud container. This URL can be built according to the Oracle documentation, or taken from the cloud dashboard, where it is already available. To connect, use the cloud account user. After we save this connection, we create two new physical schemas: one for the source Object Storage container and another for the destination Object Storage container. We create them under the same data server because both containers are created on the same Oracle Cloud Infrastructure.

Right-click the newly created data server and select New Physical Schema. The first one is for the TEXTFILE object in the ODICS container, the source one. The resource URL is also available in the Cloud Dashboard. Now we go to the Operations tab, where we define the operations used to manipulate the object, picking a method from the list for each operation. We defined operations for deleting, getting, and putting the object in the container. Let's test the service by pressing the Test Restful Service button. A pop-up window opens where, by selecting the desired operation, the effective URL is built.
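To make these operations concrete, here is a minimal Python sketch of what the GET operation amounts to over the wire, using the requests library. This is an illustration rather than anything ODI generates; the identity domain, container, object name, and credentials are the same placeholders used by the cURL commands later in this article.

import requests

# Placeholder endpoint: identity_domain, the ODICS container, and the
# TEXTFILE object are illustrative values from this article's scenario.
BASE = "https://identity_domain.storage.oraclecloud.com/v1/Storage-identity_domain"

resp = requests.get(f"{BASE}/ODICS/TEXTFILE",
                    auth=("cloud.admin", "account_password"))
resp.raise_for_status()

# Equivalent of the Response File setting: save the object content locally.
with open("test_data.txt", "wb") as f:
    f.write(resp.content)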
Back in the test window, we can add and modify the parameters. The Save Request Content button opens another pop-up that lets you select the location where you want to save the content retrieved from the object. We do the same for the other container: a new physical schema is created with another resource URL, and in its Operations tab we only defined operations for the GET and PUT methods. The operations defined for both objects are for the purpose of this demonstration.

Preparing the File Topology
In the Topology tab in ODI Studio, right-click the File technology and select New Data Server. The host here is the name of the host where we are saving the file. Give it a name, e.g., JCS File. Leave the JDBC connection in its default state and press the Save button. Right-click the File data server you created and create a schema. Here we specify the path to the folder where the files we are going to use will be stored, as well as the path to the folder where ODI will create its log and error files.

Preparing the Database Topology
In the Topology tab in ODI Studio, right-click the Oracle technology and select New Data Server. Give the connection a name and specify the database user you want to connect with; in this case, storage is our database user. Go to the JDBC tab and select the Oracle JDBC driver, then modify the URL according to your details (for example, a typical thin-driver URL has the form jdbc:oracle:thin:@//host:port/service_name).

The next step is to reverse-engineer the database table and the file, after which we will create a mapping to load data from the file to the table. Go to the Designer tab and then to the Models tab. Click the arrow and select Create New Model. We choose the parameters according to the technology used. For the first model, select Oracle as the technology and DB_Storage_logical as the logical schema, then press Save. Next, reverse-engineer the tables: click Selective Reverse-Engineering, select the FILE_STORAGE table, and press the Reverse Engineer button.

Now we have to reverse-engineer the file. Because the content is the same, we already have a file created. Create a new model, selecting File as the technology and the logical schema created in the Topology tab, then press Save. Next, right-click the new model and select New Datastore. Give it a name, e.g., TEXT_DATA, and in the Resource Name field click the magnifying glass, then go to the path where we saved the file (the one mentioned in the physical schema). The next step is to go to the Files tab, where we specify the type of file and the delimiters:
File format: Delimited
Header (if needed)
Record separator: e.g., Unix
Field separator: e.g., Tab
Press Save and go to the Attributes tab, where you click the Reverse Engineer button.

Preparing the Mapping and the Packages
Create a new project folder and go to Mappings. Right-click and select New Mapping, and give it a name, e.g., File_to_Oracle. In the canvas, drag and drop the reverse-engineered file and the table, then connect them with the file as the source and the table as the target, and press Save. The next step is to create the packages. We will have two packages: one where we call the RESTful services and one where we call the cURL commands.

RESTful Services
Right-click the package folder and select New Package. Give it a name, e.g., Flow.
Here we use the OdiInvokeRESTfulService component from the tool panel, three times: once to get the data and save it to a file, once to put the file into the second Object Storage container, and once to delete the file from the source container. The flow is simple:
OdiInvokeRESTfulService to save the data in the object to a file.
The mapping that loads the data into the table.
OdiInvokeRESTfulService to put the object in the other container.
OdiInvokeRESTfulService to delete the object from the source container.
The OdiInvokeRESTfulService tool has several parameters in the General tab. Here we select the operation we want to use, and in the Response File parameter we specify the location and file where we want to save the content of the object. In the Command tab we can see the command that will be executed when we run the flow. The same applies for the other OdiInvokeRESTfulService steps. Let's run the workflow. We can see that the execution was successful: the data appears in the table, and checking the containers shows that the object has been moved from the source container to the target container.

cURL Commands
We create a new package as we did with the previous one, but from the toolbox we select the OdiOSCommand tool. In the Command tab we write the cURL commands below (an illustrative Python equivalent of these calls is sketched after the conclusion):
GET: curl -u cloud.admin:account_password https://identity_domain.storage.oraclecloud.com/v1/Storage-identity_domain/ODICS/TEXTFILE_CURL --output /u01/app/oracle/tools/home/oracle/files_storage/test_data.txt
PUT: curl -X PUT -F 'data=@/u01/app/oracle/tools/home/oracle/files_storage/test_data.txt' -u cloud.admin:account_password https://identity_domain.storage.oraclecloud.com/v1/Storage-identity_domain/ODICS_ARCHIVE/TEXTFILE_CURL
DELETE: curl -X DELETE -u cloud.admin:account_password https://identity_domain.storage.oraclecloud.com/v1/Storage-identity_domain/ODICS/TEXTFILE_CURL
The steps are:
OdiOSCommand to use cURL to get the content of the object.
The mapping to load data into the table.
OdiOSCommand to use cURL to put the new object in the second container.
OdiOSCommand to use cURL to delete the object from the source container.

Conclusion
Oracle Data Integrator Cloud Service (ODI-CS) is able to manipulate objects in Oracle Cloud Infrastructure Classic. You can leverage ODI-CS capabilities using RESTful Services, as well as commands written in any language, to integrate all your data.
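To round out the example, here is an illustrative Python equivalent of the PUT and DELETE calls above (the GET call was sketched earlier, in the RESTful Services topology section). The identity domain, credentials, and paths are the same placeholders used throughout this article; note that this sketch performs a direct object PUT, whereas the cURL example uses a multipart form upload (-F).

import requests

AUTH = ("cloud.admin", "account_password")  # placeholder credentials
BASE = "https://identity_domain.storage.oraclecloud.com/v1/Storage-identity_domain"
LOCAL = "/u01/app/oracle/tools/home/oracle/files_storage/test_data.txt"

# PUT: upload the local file into the archive container as TEXTFILE_CURL.
with open(LOCAL, "rb") as f:
    requests.put(f"{BASE}/ODICS_ARCHIVE/TEXTFILE_CURL",
                 data=f, auth=AUTH).raise_for_status()

# DELETE: remove the object from the source container.
requests.delete(f"{BASE}/ODICS/TEXTFILE_CURL", auth=AUTH).raise_for_status()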
