Thursday Oct 16, 2014

Why Not Data Quality?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

Big data, business intelligence, analytics, data governance, data relationship management, the list of data oriented topics goes on. Lots of people are talking about all of these and yet very few people talk about poor data quality, let alone do anything about it. Why is that?

We think it’s because Data Quality suffers from the “Do I have to?” (DIHT) syndrome. Anyone with kids or anyone who was a kid will recognize this syndrome, as in, “Clean up your room.” “Do I have to?” Dealing with poor quality data is not glamorous and it doesn’t get the headlines. Installing business intelligence systems or setting up data governance gets the headlines and makes careers. But what good is better reporting and structure if the underlying data is junk?

Recently we were in a half-day planning session for an existing customer. The customer wanted to know what they could do better using the existing software they had already purchased as Phase 1 of the project, and what they would need to acquire to do things better in Phase 2. Reviews like this are critically important, as people change on both sides of the customer/vendor relationship, to ensure knowledge transfer and reaffirmation of goals. The customer provided access to numerous departments across their company for interviews and focus groups. All of this information was gathered, reviewed, and summarized, and suggestions were made. Excel spreadsheets and PowerPoints ensued.

Even though the Aberdeen Group and others have shown significant performance increases in established ERP and other business systems through the use of Data Quality and Master Data Management, no emphasis was put on data quality as a way of improving the customer’s processes and results with their existing software packages, because the customer did not directly say they had a data issue (and very few customers ever admit this, because poor data is just standard operating procedure). What is it about data quality that makes it the option of last resort, the go-to when all else fails? It’s got to be the belief that the data in underlying systems and sources is good by default. I mean, really, who would keep bad data around? Well, pretty much everyone, because if you don’t know that it’s bad, you end up keeping it around.

Let’s admit it, DQ is not glamorous. There are no DQ-er of the year awards. People in DQ don’t typically have their names on a parking spot right up front in the corporate lot. And besides not being glamorous, it’s hard. Very rarely do we see someone “own” data quality. After all, since bad data affects multiple people across multiple functions, no one really has the right incentives to drive data quality improvements where the resulting benefits accrue to multiple constituencies. Nobody really wants to spend their functional budgets fixing enterprise-wide data problems. Some of the very early DQ-adopting companies have teams of people, representing a cross-section of processes and functions, who spend their days manually inspecting data and creating internal systems to meet their specific data quality needs. They are very effective at what they do, but not as efficient as they could be, because the whole is greater than the sum of its parts. Also, most of the data knowledge is in their heads, and that’s really hard to replicate and subject to loss through job switching, retirement, or the possible run-in with the proverbial bus.

So, given the underwhelming desire to fix poor data, how do you get the powers that be in your company to see the light? In our last article, “Data Quality, Is it worth it? How do you know?”, we examined the value of data quality based on units of measure that were meaningful to the given organization. To paraphrase Field of Dreams, if you build the ROI, they will come. The first step to building the ROI is understanding how poor your data is and what impact that has on your organization. Typically that starts with a Data Quality Health Check.

A DQ Health Check takes a sample of your data and looks at varying aspects to determine its quality level. The aspects examined include: Consistency, Completeness, Accuracy, and Validity. These measures attempt to answer the question, “Is your data fit for purpose?” Consistency looks at the validation of the data within a variable. For example, if the variable in question only allows for Ys and Ns, any Ms and Ts will lower the consistency rating. Completeness is just that: how complete is the data in your database? Using our previous example, if Ys and Ns are only present 20% of the time, your data for that variable is fairly incomplete. Accuracy looks at a number of things, but mostly whether the data falls within the bounds of expectations. And Validity looks at usefulness. For example, telephone numbers are typically 10 digits. Phone numbers without area codes or with letters, while complete and possibly consistent, are not valid.
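The checks above are simple enough to sketch in a few lines of code. This is a minimal illustration only, not Oracle Enterprise Data Quality; the field name, the allowed Y/N values, and the 10-digit phone rule are assumptions made for the example.

```python
import re

def health_check(records, field, allowed=frozenset({"Y", "N"})):
    """Score completeness and consistency for a Y/N flag field."""
    total = len(records)
    # Completeness: how often a value is present at all
    present = [r[field] for r in records if r.get(field) not in (None, "")]
    # Consistency: of the present values, how many are in the allowed set
    consistent = [v for v in present if v in allowed]
    return {
        "completeness": len(present) / total if total else 0.0,
        "consistency": len(consistent) / len(present) if present else 0.0,
    }

def valid_phone(number):
    """Validity example: exactly 10 digits, no letters or punctuation."""
    return bool(re.fullmatch(r"\d{10}", number or ""))

records = [
    {"opt_in": "Y"}, {"opt_in": "N"}, {"opt_in": "M"},  # "M" is inconsistent
    {"opt_in": ""},  {"opt_in": None},                   # incomplete
]
scores = health_check(records, "opt_in")
```

Running this over the five sample records scores the field 60% complete (three values present) and two-thirds consistent (the stray "M" fails), which is exactly the kind of per-variable reading a health check reports back.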

In another recent customer engagement, we looked at customer records for data anomalies specifically for consistency, completeness, accuracy, and validity. We found that fixing these records resulted in improvements not only in marketing (campaign effectiveness), but also improved service (customer experience), higher collections in finance (lower receivables), and improved reporting. In today’s data rich, integrated, system-driven processes, improving data quality in one part of the organization (whether it be customer data, supplier data, financial data) benefits multiple organizational functions and processes.

So while data quality will never be glamorous for individuals, with a little insight providing a strong ROI for DQ we can move this from Do I Have To? to Let’s Do This.

Friday Sep 12, 2014

Data Quality, Is it worth it? How do you know?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & 

Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

You might think that the obvious answer to the title question would be to find and fix every data quality problem, but not so fast. As heretical as this might be to write, not all data quality problems are worth fixing. While the data purists will tell you that every data point is worth making sure it is a correct data point, we believe you should only spend money fixing data that has a direct value impact on your business. In other words, what’s the cost of bad data?

What’s the cost of bad data? That’s a question that is not asked often enough. When you don’t understand the value of your data and the costs associated with poor data quality, you tend to ignore the problem, which tends to make matters worse, specifically for initiatives like data consolidation, big data, customer experience, and data mastering. The ensuing negative impact has wider ramifications across the organization, primarily for the processes that rely on good quality data. All of the business operations systems that businesses run on, like ERP, CRM, HCM, SCM, and EPM, assume that the underlying data is good.

Then what’s the best approach for data quality success? Paraphrasing Orwell’s Animal Farm, “all data is equal, some is just more equal than others”. What data is important and what data is not so important is a critical input to data quality project success. Using the Pareto rule, 20% of your data is most likely worth 80% of your effort. For example, it can be easily argued that financial data have a greater value as they are the numbers that run your business, get reported to investors and government agencies, and can send people to jail if they’re wrong. The CFO, who doesn’t like jail, probably considers this valuable data. Likewise, a CMO understands the importance of capturing and complying with customer contact and information sharing preferences. Negligent marketing practices, due to poor customer data, can result in non-trivial fines and penalties, not to mention bad publicity. Similarly, a COO may deem up-to-date knowledge of expensive assets as invaluable information, along with description, location, and maintenance schedule details. Any lapses here could mean significant revenue loss due to unplanned downtime. Clearly, data value is in the eye of the beholder. But prioritizing which data challenges should be tackled first needs to be a ‘value-based’ discussion.

How do you decide what to focus on? We suggest you focus on understanding the costs of poor data quality and management and then establishing a metric that is meaningful to your business. For example, colleges might look at the cost of poor data per student, utilities the cost of poor data per meter, manufacturers the cost of poor data per product, retailers the cost of poor data per customer, or oil producers the cost of poor data per well. Doing so makes it easy to communicate the value throughout your organization and allows anyone who understands the business to size the cost of bad data. For example, our studies show that on-campus data quality problems can cost anywhere from $70 to $480 per student per year. Let’s say your school has 7,500 students and we take a conservative $100 per student, near the low end of that range. That’s a $750,000 per year data quality problem. As another example, our engagement with a utility customer estimated that data quality problems can cost between $5 and $10 per meter. Taking the low value of $5 against 400,000 meters quantifies the data quality problem at $2,000,000 annually. Sizing the problem lets you know just how much attention you should be paying to it. But this is the end result of your cost of poor data quality analysis. Now that we know the destination, how do we get there?
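The sizing arithmetic above is deliberately simple, which is the point: anyone who knows the business can redo it. A sketch of the calculation (the per-unit cost ranges come from the examples above; the helper function itself is ours):

```python
def annual_dq_cost(units, cost_per_unit):
    """Size an annual data quality problem from a per-unit cost estimate."""
    return units * cost_per_unit

# College: 7,500 students at a conservative $100/student (range: $70-$480)
college = annual_dq_cost(7_500, 100)    # a $750,000/year problem
# Utility: 400,000 meters at the low estimate of $5/meter (range: $5-$10)
utility = annual_dq_cost(400_000, 5)    # a $2,000,000/year problem
```

Swapping in your own unit (per product, per customer, per well) and a defensible per-unit cost gives you the headline number that makes the ROI conversation possible.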

To achieve these types of metrics you have to assess the impact of bad data on your enterprise by engaging all of the parties that are involved in attempting to get the data right, and all of the parties that are negatively affected when it is wrong. You will need to go beyond the creators, curators and users of the data and also involve IT stakeholders and business owners to estimate: impact on revenues; cost of redundant efforts in either getting the data or cleaning it up; the number of systems that will be impacted by high quality data; cost of non-compliance; and cost of rework. Only through this type of analysis can you gain the insight necessary to cost-justify a data quality and master data management effort.

The scope of this analysis is determined by the focus of your data quality efforts. If you are taking an enterprise-wide approach then you will need to deal with many departments and constituencies. If you are taking a Business Unit, functional or project focus for your data quality efforts, your examination will only need to be done on a departmental basis. For example, if customer data is the domain of analysis, you will need to involve subject matter experts across marketing, sales, and service. Alternatively, if supplier data is your focus, you will need to involve experts from procurement, supply-chain, and reporting functions.

Regardless of data domain, your overall approach may look something like this:

  1. Understanding business goals and priorities
  2. Documenting key data issues and challenges
  3. Assessing current capabilities and identifying gaps in your data
  4. Determining data capabilities and identifying needs
  5. Estimating and applying benefit improvement ranges
  6. Quantifying potential benefits and establishing your “cost per” metric
  7. Developing your data strategy and roadmap
  8. Developing your deployment timeline and recommendations

Going through this process ensures executive buy-in for your data quality efforts, gets the right people participating in the decisions that will need to be made, and provides a plan with an ROI, which will be necessary to gain the approvals to go ahead with the project.

Be sure to focus on: Master Data Management @ OpenWorld

Friday May 23, 2014

Supercharge Your Web Storefront with Superior Data Quality

By Neela Chaudhari (Compiled from Profit Magazine - May 2014)

A targeted, comprehensive data management strategy can differentiate your business from the competition. Providing a great online customer experience starts with giving consumers the information and research tools they need to make informed buying decisions. All too often, the product research capabilities that already exist on a website are diminished by poor data quality behind the scenes.

Companies like Ace Hardware are adopting these data management strategies, and many retailers are seeing significant improvements in the quality of their guided navigation, product spec presentation, and product comparison strategies.

So while modern web storefront capabilities are critical to online sales success, be sure to remember that it is the data that makes it work with your customers and prospects!

There are three key areas where product data quality is impacting the customer experience. What are they? Visit our latest Profit article to find out.

Friday May 16, 2014

Master Data Management and Service-Oriented Architecture: Better Together

By Neela Chaudhari

Many companies are struggling to keep up with constant shifts in technology and at the same time address rapid changes in the business. As organizations strive to create greater efficiency and agility with the aid of new technologies, each new business-led project may further fragment IT systems and result in information inconsistencies across the organization. Because data is an essential input for all processes and business objects, these irregularities can undermine the original business objectives of the technology initiatives.

Combining the use of master data management (MDM) on the business side and service-oriented architecture (SOA) on the IT side can counteract the problem of information inconsistency. SOA is a practice that uses technology to decouple services, transactions, events, and processes to enhance data availability for business applications across a range of use cases. But the underlying data is often overlooked or treated as an afterthought when it comes to business processes, leading to poor data quality characteristics for your business applications. Without MDM, the data made available to business applications by an SOA approach might be less than accurate and more widespread throughout an organization. That can lead to a situation where lower quality data is consumed by more business users—ultimately thwarting the objectives of efficiency and agility.

MDM can add value to SOA efforts because it improves the quality and trustworthiness of the data that is being integrated and consumed. MDM aids the tricky issue of upstream and downstream systems integration by ensuring the systems access a data hub containing accurate, consistent master data. It also assists SOA by providing consistent visibility and a technical foundation for master data use. MDM delivers the necessary data services to ensure the quality and timeliness of the enterprise objects the SOA will consume.
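As a deliberately simplified illustration of the pattern described above, consider a service layer that resolves entities through a master data hub instead of each application keeping its own copy. The hub API and the naive "non-empty values win" survivorship rule below are our own assumptions for the sketch, not Oracle product interfaces:

```python
class MasterDataHub:
    """Single source of cleansed, deduplicated master ("golden") records."""

    def __init__(self):
        self._golden = {}   # master_id -> golden record
        self._xref = {}     # (source_system, local_id) -> master_id

    def register(self, source, local_id, master_id, record):
        """Cross-reference a source system's record to a master identity."""
        self._xref[(source, local_id)] = master_id
        # Naive survivorship: merge fields, letting non-empty values win
        golden = self._golden.setdefault(master_id, {})
        golden.update({k: v for k, v in record.items() if v})

    def resolve(self, source, local_id):
        """Any SOA service can map its local ID to the golden record."""
        return self._golden.get(self._xref.get((source, local_id)))

hub = MasterDataHub()
hub.register("CRM", "c-42", "M1", {"name": "Acme Corp", "phone": ""})
hub.register("ERP", "e-99", "M1", {"name": "Acme Corp", "phone": "4155551234"})

# A service consuming the hub sees one consistent view, regardless of
# which upstream system's identifier it holds:
customer = hub.resolve("CRM", "c-42")
```

Note how the CRM record's missing phone number is filled from the ERP source: each consuming service gets the merged, consistent view rather than its own system's partial copy, which is the "better together" point of pairing MDM with SOA.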

To learn more about the importance of MDM to SOA investments, read the in-depth technical article, MDM and SOA: Be Warned!

And don't miss the new Oracle MDM resource center. Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site, and financial data.

Friday May 02, 2014

Register Now! Product Data Management Weekly Cloudcast

Don't miss out: Product Data Management Weekly Cloudcast

Every Thursday at 10:00 a.m. PT (1:00 p.m. ET).

The North America Master Data Management (MDM) and Enterprise Data Quality (EDQ) team will present a series of weekly webcasts that give an inside look at how Oracle Product Data Management Cloud modernizes complex data management processes, allowing customers to focus on strategic opportunities and on delivering value to the business. These webcasts will run throughout FY14, with regular updates being distributed.

These sessions are designed for customers and prospects who are interested in learning more about Product Data Management Cloud. Customer executives and managers with responsibility for Data Management, Data Quality, Commerce, Manufacturing, IT, or other data management responsibilities are encouraged to attend.

Remember, data that is not managed properly degrades at 27% per year!

Please click HERE to view a complete schedule and register for the demo.

Friday Mar 21, 2014

Master Data Management: How to Avoid Big Mistakes in Big Data


The paradigm-changing potential benefits of big data can't be overstated—but big changes can deliver big risks as well. For example, exploding data volumes naturally create a corresponding increase in data correlations, but as leading experts warn, correlations should not be mistaken for causes.

To avoid drawing the wrong conclusions from big data, organizations first need a way to assemble reliable master data to analyze. Then they need a way to put those conclusions and that data to work operationally, in the systems that govern and facilitate their day-to-day operations.

Master data management (MDM) helps deliver insightful information in context to aid decision-making. It can be used to filter big data, isolating and identifying key entities and shrinking the dataset to a manageable size for parsing, tagging, and associating with operational system records. And it provides the key intersecting point that enables organizations to map big data results to operational systems that are built on relational databases and structured information.

Adopting master data management capabilities helps organizations create consolidated, consistent, and authoritative master data across the enterprise, enabling the distribution of master information to all operational and analytical applications, including those that contain customer, product, supplier, site, and financial information.

Oracle Master Data Management drives results by delivering the ability to cleanse, govern, and manage the quality and lifecycle of master data.

To learn more about the importance of MDM as an underlying technology that facilitates big data initiatives, read an in-depth Oracle C-Central article, "Masters of the Data: CIOs Tune into the Importance of Data Quality, Data Governance, and Master Data Management."

And don't miss the new Oracle MDM resource center. Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site and financial data.

Tuesday Aug 20, 2013

Catch up on Enterprise Data Quality

The Oracle Data Integration and Master Data Management Newsletter is now available. In this edition of the quarterly newsletter, we highlight six use cases for Oracle Enterprise Data Quality. You will also find information on upcoming webcasts and customer buzz, among other things.

Also, if you want to subscribe to the newsletter, you can do so through the link in the newsletter. Happy reading!

You can learn more about our Oracle Enterprise Data Quality multitool in our upcoming webcast, Putting Data to Work Using Oracle Enterprise Data Quality Solutions, on Tuesday, August 27 at 10:00 a.m. PT. As Dain Hansen, Director of Product Marketing, says, "Unlike Swiss Army knives, it is guaranteed never to rust or stop you in an airport metal detector."

Wednesday Aug 14, 2013

Master Data—and Deliver a Great Customer Experience

In the fast-paced world of the connected consumer, expectations run high. Every time customers interact with a company, they want a positive, relevant, and personalized experience. If they don’t get it, today’s empowered customers won’t hesitate to leave. Yet many companies can’t deliver great personal experiences to their customers because they are struggling with siloed information systems and processes that fail to provide complete and accurate data to sales, support, and marketing teams. 

In the new white paper by Harvard Business Review Analytic Services, “Delivering on the Promise of Great Customer Experiences,” learn from several forward-thinking organizations—in industries ranging from travel to telecommunications—how to use Master Data Management (MDM) to collect and integrate all types of internal and external data and create the consistent, connected, and personalized experiences that customers want. Oracle Master Data Management offers the most complete product line on the market, enabling organizations to cleanse, centralize, and govern to create a “master” version of customer and business data—and the foundation for an improved customer experience strategy. Find out how your organization can enrich the customer experience.

Read the white paper today!

For more information on Master Data Management, visit us online.

Monday Aug 05, 2013

Data Quality: Project, Program or Way of Life?

What are the benefits and methods of developing a great Data Quality and Governance program? The longest journey begins with a single step, and so it is with data quality.

Tune in to this AppCast session with Martin Boyd, Senior Director, Product Strategy, Oracle, and Tamer Chavusholu, Managing Partner, KAYGEN, as they discuss the various approaches seen around data quality programs and how data quality maturity builds over time. You will learn best practices and where the best project ROI comes from.

Visit MDM AppCasts to listen to other AppCasts. For more information on Master Data Management, visit us online.

Sunday Jun 16, 2013

Church Pension Group Leverages Fusion Customer Hub and Delivers Critical Data to the Business

Hear how Fusion Customer Hub is exactly what Church Pension Group needed to provide the right data in a timely manner to the line of business. Church Pension Group used Fusion Customer Hub to provide best-in-class MDM capabilities across their multiple source systems. Check out the latest video below.

Wednesday Jun 12, 2013

Allianz Group Turns to Oracle Master Data Management for Customer Insight

Allianz Group's challenge was to get a single, consolidated view in order to better serve their customers. They chose Oracle as the best fit for their existing applications. Customer segmentation, behavior, and preferences were all critical to Allianz. Oracle MDM connected to many of their applications for a comprehensive, trusted, relevant view of their customer data. For more information on Oracle Master Data Management, click here. Take a look at this video testimonial from Allianz.

Wednesday Jun 05, 2013

Elsevier Gains Customer Insight and More with Oracle Customer Hub

Oracle's Customer Experience solutions and use cases work hand in hand with Oracle Customer Hub (a key product of Oracle Master Data Management). Now, don't just take our word for it; listen to Elsevier, the world's leading information and content provider for the medical, technical, and scientific markets. Hear how Elsevier leveraged Customer Hub to gain better customer insight, why they chose Oracle, and how they can serve customers better across all touchpoints.

Monday Apr 29, 2013

Upcoming Webcast: Enriching the Customer Experience with Oracle Enterprise Data Quality and Oracle Commerce


Imagine a commerce customer experience driven by relevant product and customer data. A data-driven strategy powered by Enterprise Data Quality offers the commerce customer or potential customer information that is cleansed, standardized, and, most importantly, relevant. Knowing the correct information about a customer, their purchasing history, and their orders, as well as details about the product they are considering buying, all contributes to a satisfying, repeatable commerce experience. Tune in to this webcast, “Data Quality – Driving More Personalized Commerce Experiences,” and you will learn how Oracle Commerce and Oracle Enterprise Data Quality are the perfect combination to cleanse and standardize product and customer data from multiple sources and automatically optimize product and customer information to support your commerce channel. This webcast is hosted by Michael Hylton, Senior Principal Product Marketing Director, CX and CRM, and Mala Narasimharajan, Principal Product Marketing Director, Fusion Middleware, of Oracle. Click here to register.

Monday Apr 22, 2013

Latest MDM Screencast Now Available: Masters of the Data

Oracle Master Data Management recently had a great opportunity to be a part of Oracle Fusion Middleware's Screencast program titled The New Business Imperative: Social, Mobile, Cloud. Each week this screencast series features a different Middleware offering and the series currently features MDM.  The title of the screencast is Masters of the Data: CIOs Tune into Data Quality and Master Data Management. For more information on Oracle MDM click here.


Saturday Mar 16, 2013

Ready for the Gartner MDM Summit Next Week? We Are...


The Gartner MDM Summit is almost here! Oracle is a PLATINUM sponsor of this event and is geared up for a great show next week. We have an information-packed session planned with SONY PlayStation as our featured customer. I strongly urge those of you attending the conference next week to make it a point to attend; you won't regret it.

Our session next week is centered on a customer case study; hear what worked for them, why they chose Oracle, and the value of MDM.

  • Oracle Customer Case Study Session: "From Strategy to Operational Excellence," Sree Vaidyanathan, SONY PlayStation. March 21, 2013, 4-5 p.m., Room: Texas A

In addition to the case study session, don't miss visiting our demo pod for live demos of MDM and Enterprise Data Quality. For more information on Oracle's session, click here.


Get the latest on all things related to Oracle Master Data Management. Join Oracle's MDM Community today.

Follow us on Twitter | Catch us on YouTube

