Friday Sep 12, 2014

Data Quality: Is It Worth It? How Do You Know?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, and Murad Fatehali – Senior Director with Oracle’s Insight team, leading the Integration practice in North America.

You might think that the obvious response to a data quality problem is to fix it, but not so fast. As heretical as this might be to write, not all data quality problems are worth fixing. While the data purists will tell you that every data point is worth getting right, we believe you should only spend money fixing data that has a direct value impact on your business. In other words, what’s the cost of bad data?

What’s the cost of bad data? That’s a question that is not asked often enough. When you don’t understand the value of your data and the costs associated with poor data quality, you tend to ignore the problem, which tends to make matters worse, particularly for initiatives like data consolidation, big data, customer experience, and data mastering. The ensuing negative impact has wider ramifications across the organization, primarily for the processes that rely on good-quality data. All of the business operations systems that companies run on (ERP, CRM, HCM, SCM, EPM) assume that the underlying data is good.

Then what’s the best approach for data quality success? Paraphrasing Orwell’s Animal Farm: “all data is equal, but some is more equal than others.” Knowing which data is important and which is not is a critical input to data quality project success. Applying the Pareto rule, 20% of your data is most likely worth 80% of your effort. For example, it can easily be argued that financial data has greater value: it comprises the numbers that run your business, gets reported to investors and government agencies, and can send people to jail if it’s wrong. The CFO, who doesn’t like jail, probably considers this valuable data. Likewise, a CMO understands the importance of capturing and complying with customer contact and information-sharing preferences. Negligent marketing practices, due to poor customer data, can result in non-trivial fines and penalties, not to mention bad publicity. Similarly, a COO may deem up-to-date knowledge of expensive assets invaluable, along with description, location, and maintenance schedule details. Any lapses here could mean significant revenue loss due to unplanned downtime. Clearly, data value is in the eye of the beholder. But prioritizing which data challenges to tackle first needs to be a value-based discussion.

How do you decide what to focus on? We suggest you first understand the costs of poor data quality and management, and then establish a metric that is meaningful to your business. For example, colleges might look at the cost of poor data per student, utilities the cost per meter, manufacturers the cost per product, retailers the cost per customer, and oil producers the cost per well. Doing so makes it easy to communicate the value throughout your organization and allows anyone who understands the business to size the cost of bad data. For example, our studies show that on-campus data quality problems can cost anywhere from $70 to $480 per student per year. Say your school has 7,500 students and we assume a conservative $100 per student, near the low end of that range. That’s a $750,000-per-year data quality problem. As another example, our engagement with a utility customer estimated that data quality problems cost between $5 and $10 per meter. Taking the low value of $5 against 400,000 meters quantifies the problem at $2,000,000 annually. Sizing the problem tells you how much attention to pay to it. But this is the end result of your cost-of-poor-data-quality analysis. Now that we know the destination, how do we get there?
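
This “cost per” arithmetic is easy to put into a spreadsheet or a few lines of code. A minimal sketch, using the illustrative figures above (the helper function is ours, not part of any product):

```python
# Hypothetical back-of-the-envelope "cost per" calculator using the
# illustrative figures from the text (not real benchmarks).

def annual_dq_cost(units, cost_per_unit_low, cost_per_unit_high):
    """Return the (low, high) annual cost range of poor data quality."""
    return units * cost_per_unit_low, units * cost_per_unit_high

# A campus with 7,500 students at $70-$480 per student per year:
low, high = annual_dq_cost(7_500, 70, 480)
print(f"Campus: ${low:,.0f} - ${high:,.0f} per year")   # $525,000 - $3,600,000

# A utility with 400,000 meters at $5-$10 per meter:
low, high = annual_dq_cost(400_000, 5, 10)
print(f"Utility: ${low:,.0f} - ${high:,.0f} per year")  # $2,000,000 - $4,000,000
```

Even this crude range is enough to start the “how much attention should we pay?” conversation with an executive sponsor.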

To achieve these types of metrics you have to assess the impact of bad data on your enterprise by engaging all of the parties that are involved in attempting to get the data right, and all of the parties that are negatively affected when it is wrong. You will need to go beyond the creators, curators and users of the data and also involve IT stakeholders and business owners to estimate: impact on revenues; cost of redundant efforts in either getting the data or cleaning it up; the number of systems that will be impacted by high quality data; cost of non-compliance; and cost of rework. Only through this type of analysis can you gain the insight necessary to cost-justify a data quality and master data management effort.

The scope of this analysis is determined by the focus of your data quality efforts. If you are taking an enterprise-wide approach then you will need to deal with many departments and constituencies. If you are taking a Business Unit, functional or project focus for your data quality efforts, your examination will only need to be done on a departmental basis. For example, if customer data is the domain of analysis, you will need to involve subject matter experts across marketing, sales, and service. Alternatively, if supplier data is your focus, you will need to involve experts from procurement, supply-chain, and reporting functions.

Regardless of data domain, your overall approach may look something like this:

  1. Understanding business goals and priorities
  2. Documenting key data issues and challenges
  3. Assessing current data capabilities and identifying gaps
  4. Determining needed data capabilities and prioritizing requirements
  5. Estimating and applying benefit improvement ranges
  6. Quantifying potential benefits and establishing your “cost per” metric
  7. Developing your data strategy and roadmap
  8. Developing your deployment timeline and recommendations

Going through this process ensures executive buy-in for your data quality efforts, gets the right people participating in the decisions that will need to be made, and provides a plan with an ROI, which will be necessary to gain the approvals to go ahead with the project.

Be sure to check out: Master Data Management @ OpenWorld

Thursday Aug 28, 2014

How Do You Know if You Have a Data Quality Issue?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, and Murad Fatehali – Senior Director with Oracle’s Insight team, who leads the Integration practice in North America.

Big Data, Master Data Management, and Analytics are all topics and buzzwords getting big play in the press. And they’re all important, as today’s storage and computing capabilities allow for automated decision making that gives customers experiences more tailored to them and provides better information upon which business decisions can be made. The whole idea being: the more you know, the more you know.

Lots of companies know they should be doing Big Data, Master Data Management, and Analytics, but don’t really know where or with what to start. My two favorite questions to ask any prospective customer while discussing these topics are: 1) Do you have data you care about? And 2) Does it have issues? If the answers come back “Yes” and “Yes,” then you can have the discussion about what it takes to get the data ready for Big Data, Master Data Management, and Analytics. If you try any of these with lousy data, you’re simply going to get lousy results.

But how do I know if I’ve got less-than-stellar data? All you have to do is listen to the different departments in your company, and they will tell you. Here is a guide to the types of things you might hear.

You know you have poor data quality if MARKETING says:

1. We have issues with privacy management and customer preferences
2. We can’t do the types of data matching and enhancement we want to do
3. There’s no way to do data matching with internal or external files
4. We have missing data but we don’t know how much or which variables
5. There’s no standardization or data governance
6. We don’t know who our customer is
7. We’ve got compliance issues

You know you have poor data quality if SALES says:

1. The data in the CRM is outdated or wrong and needs to be re-entered
2. I have to go through too many applications to find the right customer answers
3. Average call times are too long due to poor data and manual data entry
4. I’m spending too much time fixing data instead of selling

You know you have poor data quality if BUSINESS INTELLIGENCE says:

1. No one trusts the data so we have lots of Excel spreadsheets and none of the numbers match
2. It’s difficult to find data and there are too many sources
3. We have no data variables with consistent definitions
4. There’s nothing to clean the data with
5. Even data we can agree on, like telephone number, has multiple formats
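
The last symptom, one phone number in many formats, is easy to see in code. A minimal sketch of the kind of standardization a data quality tool automates (the function, formats, and rules are illustrative assumptions; real tools handle far more cases):

```python
import re
from typing import Optional

def normalize_us_phone(raw: str) -> Optional[str]:
    """Strip formatting and return a canonical 10-digit US number, or None."""
    digits = re.sub(r"\D", "", raw)          # keep digits only
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                  # drop the leading country code
    return digits if len(digits) == 10 else None

# The same customer, four ways, from four different systems:
variants = ["(650) 555-0123", "650.555.0123", "+1 650 555 0123", "6505550123"]
print({normalize_us_phone(v) for v in variants})  # {'6505550123'} -- one number
```

Until every system agrees on one canonical format, every match, merge, and report has to re-solve this problem.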

You know you have poor data quality if OPERATIONS or FINANCE says:

1. The Billing report does not match the BI report
2. Payment information and address information do not match the information in the Account Profile
3. Accounts closed in financial systems show up as still open in the CRM system, or vice versa, and customers get billed for services already terminated
4. Billing inaccuracies are caught during checks because there are no up-front governance rules
5. Agents enter multiple orders for the same service or product on an account
6. Service technicians show up on site with wrong parts and equipment which then requires costly repeat visits and negatively impacts customer satisfaction
7. Inventory systems show items as OK to sell that suppliers may have marked obsolete or recalled
8. We have multiple GLs and not one single version of financial truth

You know you have poor data quality if IT says:

1. It’s difficult to keep data in sync across many sources and systems
2. Data survivorship rules don't exist
3. Customer data types (B2B, end user in B2B, customer in B2C, account owner in B2C) and statuses (active, trial, cancelled, etc.) change for the same customer over time, and it’s difficult to keep track without herculean manual effort

You know you have poor data quality if HUMAN RESOURCES says:

1. We first have to wait for the data; then, once it is gathered and delivered, we have to work to fix it
2. Ten percent of our time is wasted waiting on things or on re-work cycles
3. Employee frustration with searching, finding, and validating data results in churn, and definitely delays the re-hiring of employees
4. Incorrect competency data results in: a) productivity loss in terms of looking at the wrong skilled person; b) possible revenue loss due to lack of skills needed; and c) additional hires when none are needed

You know you have poor data quality if PROCUREMENT says:

1. Not knowing our suppliers impacts efficiencies and costs
2. FTEs in centralized sourcing spend up to 20% of their time fixing bad data and related process issues
3. Data in our vendor master, material master, and pricing information records is currently synced manually because it is not accurate across systems. We end up sending orders to the wrong suppliers
4. Supplier management takes too much time
5. The new product creation form contains wrong inputs, rendering many fields unusable
6. Multiple entities (Logistics, Plants, Engineering, Product Management) enter or create Material Master information, so we cannot get spend analytics
7. We have no good way of managing all of the products we buy and use

You know you have poor data quality if PRODUCT MANAGEMENT says:

1. Product development and life-cycle management efforts take longer and cost more
2. We have limited standards and rules for product dimensions, so we have to manually search for missing information that is available elsewhere
3. Our product data clean-up occurs in pockets across different groups; the end result of these redundant efforts is duplication of standards
4. We make status changes to the product lifecycle that don’t get communicated to Marketing and Engineering in a timely manner, so our customers don’t know what the product now does

All of these areas suffer either individually or together due to poor data quality.  All of these issues impact corporate performance which impacts stakeholders which impacts corporate management.  If you’re hearing any of these statements from any of these departments you have a data quality issue that needs to be addressed.  And that is especially true if you’re considering any type of Big Data, Master Data Management or Analytics initiative.

Thursday Aug 07, 2014


Author: John Siegman 

How do you know if you have a Master Data Management (MDM) or Data Quality (DQ) issue on your campus? One of the ways is to listen to the concerns of your campus constituents. While none of them are going to come out and tell you that they have a master data issue directly, by knowing what to listen for you can determine where the issues are and the best way to address them.

What follows are some of the key on-campus domains and what to listen for to determine if there is a MDM or DQ issue that needs to be resolved.

Student: Disconnected processes lacking coordination

· Fragmented data across disparate systems, disconnected across groups for:

- data collection efforts (duplicate/inconsistent student/faculty surveys)

- data definitions, rules, governance

- data access, security, and analysis

· Lack of training around security/access further complicated due to number of sources

· No information owner/no information strategy

· Student attributes maintained across many systems

Learning: Does not capture interactions

· Cannot identify students at risk because we do not capture interactions between students and faculty, faculty interactions for research support, etc.

· No way to track how many undergraduates are interested in research

· Don't do any consistent analytics for course evaluations

· Difficult and time consuming to gather information because of the federated nature of the data – for example, job descriptions in HR are different than what is really being used

· There is no view of Student experience

HR: Process inconsistencies, lack of data standards complicates execution

· Faculty not paid by the university are not in the HCM system, while students receiving payments from the university are in the HCM system

· Disconnected processes for issuing IDs and keys; duplicate issues

· Given multiplicity of data sources, accessing the data is a challenge

· Data analytics capabilities and available reports are not properly advertised, so people do not know what is available. As a consequence an inordinate amount of time is spent generating reports

· Faculty/Staff information collection is inconsistent, sometimes paper-based. Implication: lose applicants because it is too difficult to complete the application process

Research: Getting from data to insight is a challenge

· Very time consuming to determine: Which proposals were successful? What type of awards are we best at winning?

· Difficult to understand: number of proposals, dollar value, by school, by department, by agency, by time period

· Challenges extracting data out of the system for grants and faculty, and making it centrally available

Deans & Officers: Reporting is a challenge

· Significant use of Excel; reporting is becoming unstable because of the amount of data in the files

· An information charter and a common retention policy do not exist

· A lot of paper is generated for the domains we are covering. Converting paper to digital is a challenge

· Collecting information on faculty activity (publications) is a challenge. Data in documents requires validation

· Data requests result in garbage; donors receive the wrong information

Finance: Has little trust in data

· Do not have workflow governance processes. Implication: information goes into the system without being reviewed, so errors can make it into the records

· Systems connected to ERP systems do not always give relevant or requested info

· Closing the month or quarter takes too long as each school and each department has its own set of GLs.

Facilities: Efficiencies are hampered due to data disconnects

· Do not have accurate space metrics due to an outdated system; schools are not willing to share their info with Research Administrators and Proposal Investigators

· Do not have utility consumption, building by building

· No clear classroom assignment policy (a large room may be assigned to a small number of students)

· Not all classes are under the registrar's control

· No tool showing actual space for planning purposes

· Difficult to determine research costs, without accurate access to floor plans and utilization

· Cannot effectively schedule and monitor classrooms

If your campus has data, you have data issues. As the push for students becomes more competitive, being able to understand your current data, mine your social data, target your alumni, make better use of your facilities, improve your supplier relationships, and increase your student success will depend on better data. The tools exist to turn data from a problem into a distinct competitive advantage. The sooner campuses adopt these tools, the sooner they will receive the benefits of doing so.

Wednesday Aug 14, 2013

Master Data—and Deliver a Great Customer Experience

In the fast-paced world of the connected consumer, expectations run high. Every time customers interact with a company, they want a positive, relevant, and personalized experience. If they don’t get it, today’s empowered customers won’t hesitate to leave. Yet many companies can’t deliver great personal experiences to their customers because they are struggling with siloed information systems and processes that fail to provide complete and accurate data to sales, support, and marketing teams. 

In the new white paper by Harvard Business Review Analytic Services, “Delivering on the Promise of Great Customer Experiences,” learn from several forward-thinking organizations—in industries ranging from travel to telecommunications—how to use Master Data Management (MDM) to collect and integrate all types of internal and external data and create the consistent, connected, and personalized experiences that customers want. Oracle Master Data Management offers the most complete product line on the market, enabling organizations to cleanse, centralize, and govern to create a “master” version of customer and business data—and the foundation for an improved customer experience strategy. Find out how your organization can enrich the customer experience.

Read the whitepaper today!

For more information on Master Data Management, visit us online.

Sunday Jun 16, 2013

Church Pension Group Leverages Fusion Customer Hub and Delivers Critical Data to the Business

Hear about how Fusion Customer Hub is exactly what Church Pension Group needed to provide the right data in a timely manner to the line of business.  Church Pension Group used Fusion Customer Hub to provide best-in-class capabilities around MDM and their multiple source systems. Check out the latest video below. 

Wednesday Jun 12, 2013

Allianz Group Turns to Oracle Master Data Management for Customer Insight

Allianz Group's challenge was to get a single, consolidated view in order to better serve their customers. They chose Oracle as the best fit for their existing applications. Customer segmentation, behavior, and preferences were all critical to Allianz. Oracle MDM connected to many of their applications to provide a comprehensive, trusted, relevant view of their customer data. For more information on Oracle Master Data Management, click here. Take a look at this video testimonial from Allianz...

Wednesday Jun 05, 2013

Elsevier Gains Customer Insight and More with Oracle Customer Hub

Oracle's Customer Experience solutions and use cases work hand-in-hand with Oracle Customer Hub (a key product of Oracle Master Data Management). Now, don't just take our word for it; listen to Elsevier, the world's leading information and content provider for the Medical, Technical, and Scientific markets. Hear how Elsevier leveraged Customer Hub to gain better customer insight, why they chose Oracle, and how they can now serve customers better across all touchpoints.

Monday Apr 22, 2013

Latest MDM Screencast Now Available: Masters of the Data

Oracle Master Data Management recently had a great opportunity to be a part of Oracle Fusion Middleware's Screencast program titled The New Business Imperative: Social, Mobile, Cloud. Each week this screencast series features a different Middleware offering and the series currently features MDM.  The title of the screencast is Masters of the Data: CIOs Tune into Data Quality and Master Data Management. For more information on Oracle MDM click here.


Saturday Mar 16, 2013

Ready for the Gartner MDM Summit Next Week? We Are...


The Gartner MDM Summit is almost here! Oracle is a PLATINUM sponsor of this event and is geared up for a great show next week. We have an information-packed session planned with SONY PlayStation as our featured customer. I strongly urge those of you attending the conference next week to make it a point to attend -- you won't regret it.

Our session next week is centered around a customer case study; hear what worked for them, why they chose Oracle, and the value of MDM.

  • Oracle Customer Case Study Session: "From Strategy to Operational Excellence", Sree Vaidyanathan - SONY PlayStation   March 21, 2013 4-5PM Room: Texas A                       

In addition to the case study session, don't miss visiting our demo pod for live demos of MDM and Enterprise Data Quality. For more information on Oracle's session, click here.

Thursday Jan 17, 2013

Why Should CIOs Care About MDM, Data Governance and Data Quality?




Master Data Management, Data Quality and Data Governance are more important than ever when it comes to consolidation, standardization and accountability for data. A growing number of C-level executives are seeing that enterprises need these solutions to supplement the multitude of applications already installed and deployed in their organization. Take a look at this article on the C-Central site and read about the MDM, Data Quality and Data Governance trifecta.


Friday Dec 07, 2012

Reference Data Management and Master Data: Are they Related ?

Submitted By:  Rahul Kamath 

Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise Master Data Management (MDM) solution for managing changes to master data in ways that influence enterprise structure: mastering the chart of accounts to enable financial transformation, revamping organization structures to drive business transformation and operational efficiencies, restructuring sales territories to distribute leads equitably after acquiring new products, or adding cost centers to enable fine-grained control over expenses. Increasingly, DRM is also being used by Oracle customers for reference data management, an emerging solution space that deserves some explanation.

What is reference data? How does it relate to Master Data?

Reference data is a close cousin of master data. While master data is challenged with problems of unique identification, may be more rapidly changing, requires consensus building across stakeholders and lends structure to business transactions, reference data is simpler, more slowly changing, but has semantic content that is used to categorize or group other information assets – including master data – and gives them contextual value. In fact, the creation of a new master data element may require new reference data to be created. For example, when a European company acquires a US business, chances are that they will now need to adapt their product line taxonomy to include a new category to describe the newly acquired US product line. Further, the cross-border transaction will also result in a revised geo hierarchy. The addition of new products represents changes to master data while changes to product categories and geo hierarchy are examples of reference data changes.1

The following is an illustrative list of reference data examples, grouped by type: types and codes, business taxonomies, complex relationships and cross-domain mappings, and standards.

Types & Codes:

· Transaction Codes
· Lookup Tables (e.g., Gender, Marital Status)
· Status Codes
· Role Codes
· Domain Values

Business Taxonomies:

· Industry Classification Categories and Codes, e.g., the North America Industry Classification System (NAICS)
· Product Categories
· Sales Territories (e.g., Geo, Industry Verticals, Named Accounts, Federal/State/Local/Defense)
· Market Segments
· Universal Standard Products and Services Classification (UNSPSC), eCl@ss

Relationships / Mappings:

· Product / Segment; Product / Geo
· City → State → Postal Codes
· Customer / Market Segment; Business Unit / Channel
· Country Codes / Currency Codes / Financial Accounts
· International Classification of Diseases (ICD) mappings, e.g., ICD-9 → ICD-10

Standards:

· Calendars (e.g., Gregorian, Fiscal, Manufacturing, Retail, ISO 8601)
· Currency Codes (e.g., ISO 4217)
· Country Codes (e.g., ISO 3166, UN)
· Date/Time, Time Zones (e.g., ISO 8601)
· Tax Rates

Why manage reference data?

Reference data carries contextual value and meaning and therefore its use can drive business logic that helps execute a business process, create a desired application behavior or provide meaningful segmentation to analyze transaction data. Further, mapping reference data often requires human judgment.

Sample Use Cases of Reference Data Management

Healthcare: Diagnostic Codes

The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, etc. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since both code sets ICD-9 and ICD-10 offer diagnosis codes of very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders much in the same way as does master data management. Moreover, to build reports to understand utilization, frequency and quality of diagnoses, medical practitioners may need to “cross-walk” mappings -- either forward to ICD-10 or backwards to ICD-9 depending upon the reporting time horizon.
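
At its core, a crosswalk like this is a governed mapping table plus its inverse. A minimal sketch, with a forward map whose entries are illustrative (real mappings come from the CMS General Equivalence Mappings and require clinical review):

```python
# Illustrative ICD-9 -> ICD-10 crosswalk; real mappings are many-to-many
# and come from the CMS General Equivalence Mappings (GEMs).
FORWARD_MAP = {
    "250.00": ["E11.9"],     # type 2 diabetes without complications
    "493.90": ["J45.909"],   # asthma, unspecified
}

# Build the backward crosswalk for reports that span the transition.
BACKWARD_MAP = {}
for icd9, icd10_list in FORWARD_MAP.items():
    for icd10 in icd10_list:
        BACKWARD_MAP.setdefault(icd10, []).append(icd9)

def crosswalk(code, direction="forward"):
    """Map a code across the ICD-9/ICD-10 boundary; [] if unmapped."""
    table = FORWARD_MAP if direction == "forward" else BACKWARD_MAP
    return table.get(code, [])

print(crosswalk("250.00"))               # ['E11.9']
print(crosswalk("E11.9", "backward"))    # ['250.00']
```

The hard part is not the lookup; it is governing who decides each mapping, and versioning the table as both code sets evolve.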

Spend Management: Product, Service & Supplier Codes

Similarly, as an enterprise looks to rationalize suppliers and leverage their spend, conforming supplier codes, as well as product and service codes requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager’s time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card) as well as ACH and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value added tasks.
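
Conforming spend lines to a standard taxonomy is, at bottom, another mapping exercise. A minimal keyword-based sketch (the codes and matching rule are illustrative assumptions; production spend classification is far more sophisticated):

```python
# Hypothetical mapping of free-text spend descriptions to UNSPSC-style
# commodity codes. The entries shown are illustrative, not authoritative.
TAXONOMY_MAP = {
    "laptop": "43211503",    # notebook computers
    "toner": "44103103",     # printer toner
    "courier": "78102200",   # postal and courier services (family level)
}

def classify_spend_line(description):
    """Return the first taxonomy code whose keyword appears, else None."""
    text = description.lower()
    for keyword, code in TAXONOMY_MAP.items():
        if keyword in text:
            return code
    return None  # route to a commodity manager for manual review

lines = ["Dell laptop x2", "HP toner cartridge", "Team offsite catering"]
print([classify_spend_line(l) for l in lines])
```

Every line that falls through to manual review is exactly the 12-15% of the sourcing cycle the Aberdeen figures describe.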

Change Management: Point of Sales Transaction Codes and Product Codes

In the specialty finance industry, enterprises are confronted with usury laws – governed at the state and local level – that regulate financial product innovation as it relates to consumer loans, check cashing and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time of booking the sale. Since new products are being released at a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes with the appropriate product and GL codes to comply with the changing regulations.

Multi-National Companies: Industry Classification Schemes

As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries use variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to distribute reference data changes to downstream applications and to assess which applications were impacted by a given change.
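
Reconciling scheme variants is again a concordance-table problem. A minimal sketch (the NAICS/NACE pairs shown are illustrative assumptions, not an official concordance):

```python
# Illustrative concordance between NAICS codes and NACE Rev. 2 codes.
# Real concordances come from official tables (e.g., US Census, Eurostat)
# and are often many-to-many, which is why stewardship is needed.
NAICS_TO_NACE = {
    "541511": ["J62.01"],   # custom computer programming services
    "311811": ["C10.71"],   # retail bakeries -> manufacture of bread/pastry
}

def to_nace(naics_code):
    """Translate a NAICS code into NACE codes; [] when no mapping exists."""
    return NAICS_TO_NACE.get(naics_code, [])

print(to_nace("541511"))   # ['J62.01']
print(to_nace("000000"))   # [] -> flag for stewardship review
```

A change management application layers workflow on top of tables like this one: who proposed the mapping, who approved it, and which downstream systems consumed it.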

1 Master Data versus Reference Data, Malcolm Chisholm, April 1, 2006.

Friday Nov 30, 2012

Oracle Product Leader Named a Leader in Gartner MQ for MDM of Product Data Solutions

Gartner recently named Oracle a Leader in its MQ report for MDM of Product Data Solutions, citing the following key points:

  • Strong MDM portfolio covering multiple data domains, industries and use cases
  • Oracle PDH can be a good fit for Oracle EBS customers and can form part of a multidomain solution
  • Deep MDM of product data functionality
  • Evolving support for information stewardship
For more information on the report, visit Oracle's Analyst Relations blog. To learn more about Oracle's product information solutions for master data management, click here.

Figure 1. Magic Quadrant for Master Data Management of Product Data Solutions

Saturday Nov 10, 2012

Oracle - A Leader in Gartner's MQ for Master Data Management for Customer Data


The Gartner MQ report for Master Data Management of Customer Data Solutions has been released, and we're proud to say that Oracle is in the Leaders' quadrant. Here's a snippet from the report itself:

“Oracle has a strong, though complex, portfolio of domain-specific MDM products that include prepackaged data models. Gartner estimates that Oracle now has over 1,500 licensed MDM customers, including 650 customers managing customer data. The MDM portfolio includes three products that address MDM of customer data solution needs: Oracle Fusion Customer Hub (FCH), Oracle CDH and Oracle Siebel UCM. These three MDM products are positioned for different segments of the market and Oracle is progressively moving all three products onto a common MDM technology platform...” (Gartner, Oct 18, 2012)

Figure 1. Magic Quadrant for Master Data Management of Customer Data Solutions

For more information on Oracle's solutions for customer data in Master Data Management, click here.  

Friday Nov 02, 2012

Enterprise Data Quality - New and Improved on Oracle Technology Network

Looking for Enterprise Data Quality technical and developer resources for your projects? Wondering where the best place is to find the latest documentation, downloads, and even code samples and libraries? Check out the new and improved Oracle Technology Network pages for Oracle Enterprise Data Quality. This section also features developer forums for EDQ and Master Data Management, so you can connect with other technical professionals who have posted questions, tips, and tricks, and learn from them. Here are the links to bookmark:



Tuesday Oct 30, 2012

Master Data Management - The Trend Towards Multi-Domain and Other Realities

In my quest to keep my fingers on the pulse of MDM, I recently found a pretty interesting article.  The article was published in Information Week and provides some interesting statistics from a recent survey conducted by the analyst firm, The Information Difference.  Let's take a look:

  • Of the 130 organizations surveyed, 53% have live operational MDM implementations
  • 81% of those with live operational MDM implementations report broad success - a huge improvement over 2011's 54%
  • 64% developed a business case prior to their MDM deployment, while a daring 32% went ahead without a business case.   
The article goes on to talk about the shift in vendors from focusing on customer data and product information management to one that is oriented around multi-domain master data management as well as other realities around MDM.  Take a look at the article. For more information on Oracle's master data management suite, click here


Get the latest on all things related to Oracle Master Data Management. Join Oracle's MDM Community today.


