

Oracle a Leader in The Forrester Wave™: Enterprise Data Fabric, Q2 2020

Steve Quan
SR. PRINCIPAL DIRECTOR, PRODUCT MARKETING

Oracle is honored to be recognized as a “Leader” in The Forrester Wave™: Enterprise Data Fabric, Q2 2020, and to have received the highest score of all vendors in the “Strategy” category (roadmap, vision, strategy execution, and professional services and support).

As the global workforce has become increasingly distributed in 2020, so has the information that data specialists need to transform, curate, and analyze for better decision making. Yesterday’s data integration emphasized moving data from traditional sources to targets; this no longer meets today’s needs for multiple clouds, multiple architectures, and many types of data. The solution to these multi-dimensional requirements can be visualized as a data fabric, where the distribution of data forms the fabric. Delivering value rapidly requires a unified solution that combines innovative self-service, process automation, data streaming, embedded AI/ML, new graph engine features, and more.

The Oracle solution is a portfolio of data management, movement, and streaming products that customers use to deploy data fabrics. Use cases include customer 360, data science, fraud detection, healthcare insights, real-time analysis, IoT analytics, and more.

“We are excited to be named a Leader in The Forrester Wave™: Enterprise Data Fabric, Q2 2020,” said Jeff Pollock, Vice President Product Management, Oracle. “We thank our customers for their partnership and collaboration in helping Oracle receive the highest score of any vendor in the ‘Strategy’ category. Over the past 20 years, Oracle has invested in a comprehensive data fabric portfolio that empowers our customers to successfully deliver on digital transformation initiatives. We will continue to invest in market-leading innovations that help our customers achieve more with their data.”

Modern data integration

Today’s businesses need a modern data platform to find answers in the data available across legacy systems and the cloud to guide their operations and strategy. Some customers use data to report business results periodically; others need event data to identify fraudulent activity before it occurs. Analysts do not just use the information in the data; they also analyze its context, where it comes from, and how efficiently it moves between systems.

Data engineers prepare and curate available data into pipelines that distribute information to users across an organization. Sales needs to understand what is causing customer churn, production needs to make supply chain decisions, and finance needs to report business results to management.

Traditional data integration approaches are rigid, point-to-point connections that don’t scale. Data sources and targets are hard-wired together using fixed ETL designs. Changing integration flows takes a long time because the designs are often poorly documented or the ETL developer has moved on to another job.

Understanding the data fabric

Effective solutions for building a data fabric need three key capabilities. The first is tooling that discovers, captures, and extracts data from heterogeneous sources and multiple clouds. The second is automatic ingestion of that data into real-time pipelines. The third is processing, analyzing, and distributing all data, including graph data, with the help of AI and machine-learning technologies.

Capturing operational and live data  

Data scientists look for deeper insights in the relationships between different data sources. Information from a single source alone may not be very meaningful; capturing information from multiple sources enriches the data and improves prediction results. Oracle’s end-to-end integration capability discovers data quickly, whether on-premises or in the cloud. Oracle Data Catalog searches data sources and harvests their metadata. Oracle ETL solutions come with a broad set of connectors for loading massive data volumes, and Oracle GoldenGate works with Kafka to stream operational data and events from popular databases.
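As an illustration of the consuming side of such a stream, a downstream application might subscribe to the Kafka topic that a change-data-capture tool publishes to. The sketch below uses the open-source kafka-python client; the topic name, broker address, and record layout are illustrative assumptions, not part of any Oracle product API.

```python
# Minimal sketch: consume change-data-capture records from a Kafka topic.
# Topic name, broker address, and record layout are illustrative assumptions.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders_cdc",                          # hypothetical topic fed by a CDC tool
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    change = message.value                 # e.g. {"op": "U", "table": "ORDERS", "after": {...}}
    print(change.get("op"), change.get("table"), change.get("after"))
```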

Data pipelines from ETL, CDC, and streaming 

Organizations need data from operational systems for data warehousing and reporting. Data-driven companies want to improve business outcomes by collecting and processing data from transaction systems, message queues, and IoT devices in real time. One of the early commercial applications of IoT was supply chain optimization, with sensors placed in inventory bins to detect low inventory levels. Another early business application was monitoring transaction event streams to detect possible fraud in a financial system. No matter when data is needed, modern data integration must be able to ingest and handle any workload promptly.
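To make the inventory example concrete, here is a minimal sketch of the kind of per-event check a streaming pipeline might apply. The field names, threshold, and sample events are assumptions for illustration only.

```python
# Minimal sketch: flag low-inventory readings as sensor events arrive.
# Field names and the threshold are illustrative assumptions.
from typing import Iterable, Iterator

LOW_STOCK_THRESHOLD = 20  # assumed reorder point, in units

def flag_low_inventory(events: Iterable[dict]) -> Iterator[dict]:
    """Yield a reorder alert for every reading below the reorder point."""
    for event in events:
        if event["units_remaining"] < LOW_STOCK_THRESHOLD:
            yield {
                "bin_id": event["bin_id"],
                "units_remaining": event["units_remaining"],
                "action": "reorder",
            }

# Example usage with a static batch standing in for a live stream:
readings = [
    {"bin_id": "A-17", "units_remaining": 12},
    {"bin_id": "B-03", "units_remaining": 140},
]
for alert in flag_low_inventory(readings):
    print(alert)
```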

Distributed graph knowledge

Modern data must be trustworthy. When data scientists analyze, visualize, and build prediction models, they need to know where the data comes from, how it is stored, and whether there are relationships between datasets. Lineage diagrams display such dependencies. Modern data platforms also need to work with operational data and events. Oracle Stream Analytics includes patented stream processing, correlation, and geo-fencing capabilities for building real-time pipelines automatically. The solution leverages machine learning and spatial analysis to score and predict outcomes based on streamed data. Live maps and diagrams help visualize the results to accelerate decision making. For example, you can show live location data on a map as events are processed and mark each point red for “violation” or green for “compliant.” This kind of visualization turns streamed data into actionable insight and accelerates analysts’ decision making.
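For intuition, the core of a geo-fence check can be as simple as a distance test against an allowed zone. The sketch below is a generic illustration and does not use Oracle Stream Analytics APIs; the fence center, radius, and event fields are assumptions.

```python
# Minimal sketch of a circular geo-fence check: events outside the fence
# are marked "violation", events inside are "compliant".
# Fence center, radius, and event fields are illustrative assumptions.
import math

FENCE_CENTER = (37.5300, -122.2650)  # assumed lat/lon of the allowed zone
FENCE_RADIUS_KM = 5.0                # assumed radius

def haversine_km(a: tuple, b: tuple) -> float:
    """Great-circle distance between two (lat, lon) points in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def classify(event: dict) -> str:
    """Return 'compliant' if the event lies inside the fence, else 'violation'."""
    distance = haversine_km((event["lat"], event["lon"]), FENCE_CENTER)
    return "compliant" if distance <= FENCE_RADIUS_KM else "violation"

print(classify({"lat": 37.5310, "lon": -122.2600}))  # compliant
print(classify({"lat": 37.7749, "lon": -122.4194}))  # violation
```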

The Oracle difference 

Oracle has a history of delivering powerful data management solutions that work well together and help turn the data fabric concept into reality. The portfolio includes industry-leading ETL, data preparation, replication, and streaming capabilities. These are complex processes that are otherwise prone to human error, so Oracle has invested heavily to simplify and automate them to:

  • Transform data without impacting systems
  • Cleanse and repair data to make it trustworthy
  • Ingest data and events with zero-downtime for operational systems
  • Replicate data or recover data after any replication failures
  • Apply machine learning algorithms on streaming data pipelines

As part of our commitment to innovation through close collaboration with customers like you, we look forward to supporting your data-driven digital transformation initiatives as you modernize legacy data management and integration into a data fabric that helps you visualize actionable insights and make better decisions.

Download the report and read more.  

 
