Thursday Oct 16, 2014

Why Not Data Quality?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

Big data, business intelligence, analytics, data governance, data relationship management: the list of data-oriented topics goes on. Lots of people are talking about all of these, and yet very few talk about poor data quality, let alone do anything about it. Why is that?

We think it’s because Data Quality suffers from the “Do I have to?” (DIHT) syndrome. Anyone with kids, or anyone who was a kid, will recognize this syndrome, as in: “Clean up your room.” “Do I have to?” Dealing with poor quality data is not glamorous and it doesn’t get the headlines. Installing business intelligence systems or setting up data governance gets the headlines and makes careers. But what good is better reporting and structure if the underlying data is junk?

Recently we were in a half-day planning session with an existing customer. The customer wanted to know what they could do better using the software they had already purchased as Phase 1 of the project, and what they would need to acquire to do things better in Phase 2. Reviews like this are critically important, as people change on both sides of the customer/vendor relationship, to ensure knowledge transfer and reaffirmation of goals. The customer provided access to numerous departments across their company for interviews and focus groups. All of this information was gathered, reviewed, and summarized, and suggestions were made. Excel spreadsheets and PowerPoints ensued. Even though the Aberdeen Group and others have shown significant performance increases in established ERP and other business systems through the use of Data Quality and Master Data Management, no emphasis was put on data quality as a way of improving the customer’s processes and results with their existing software, because the customer never directly said they had a data issue (and very few customers ever admit this, because poor data is just standard operating procedure).

What is it about data quality that makes it the option of last resort, the go-to only when all else fails? It’s got to be the belief that the data in underlying systems and sources is good by default. I mean, really, who would keep bad data around? Well, pretty much everyone. Because if you don’t know that it’s bad, you end up keeping it around.

Let’s admit it, DQ is not glamorous. There are no DQ-er of the year awards. People in DQ don’t typically have their names on a parking spot right up front in the corporate lot. And besides not being glamorous, it’s hard. Very rarely do we see someone ‘own’ data quality – after all, since bad data affects multiple people across multiple functions, no one really has the right incentives to drive data quality improvements when the resulting benefits accrue to multiple constituencies. Nobody really wants to spend their functional budgets fixing enterprise-wide data problems. Some of the very early DQ-adopting companies have teams of people, representing a cross-section of processes and functions, who spend their days manually inspecting data and creating internal systems to meet their specific data quality needs. They are very effective at what they do, but not as efficient as they could be, because the whole is greater than the sum of its parts. Also, most of the data knowledge is in their heads, and that’s really hard to replicate and subject to loss through job switching, retirement, or the possible run-in with the proverbial bus.

So, given the underwhelming desire to fix poor data, how do you get the powers that be in your company to see the light? In our last article, “Data Quality, Is it worth it? How do you know?”, we examined the value of data quality based on units of measure that were meaningful to the given organization. To paraphrase Field of Dreams: if you build the ROI, they will come. The first step to building the ROI is understanding how poor your data is and what impact that has on your organization. Typically that starts with a Data Quality Health Check.

A DQ Health Check takes a sample of your data and examines several aspects of it to determine your data’s quality level: Consistency, Completeness, Accuracy, and Validity. These measures attempt to answer the question: is your data fit for purpose? Consistency looks at the validation of the data within a variable. For example, if the variable in question only allows for Ys and Ns, any Ms and Ts will lower the consistency rating. Completeness is just that: how complete is the data in your database? Using our previous example, if Ys and Ns are only present 20% of the time, your data for that variable is fairly incomplete. Accuracy looks at a number of things, but mostly whether the data falls within the bounds of expectations. And Validity looks at usefulness. For example, telephone numbers are typically 10 digits. Phone numbers without area codes, or with letters in them, may be complete and possibly consistent, but they are not valid.
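
To make these measures concrete, here is a minimal sketch in Python of how such checks might be scored against a sample of records. The column names, allowed values, and rules are hypothetical; a real health check would profile far more dimensions:

```python
import re

# Sample records from a hypothetical customer table: "opt_in" should be Y/N,
# "phone" should be a 10-digit number.
records = [
    {"opt_in": "Y", "phone": "3125551212"},
    {"opt_in": "M", "phone": "555-HELP"},   # inconsistent flag, invalid phone
    {"opt_in": None, "phone": "5551212"},   # missing flag, no area code
]

def health_check(records, field, allowed=None, pattern=None):
    """Score completeness, consistency, and validity for one field."""
    values = [r.get(field) for r in records]
    present = [v for v in values if v not in (None, "")]
    completeness = len(present) / len(values)
    # Consistency: value is drawn from the allowed domain (e.g., only Y or N).
    consistency = (sum(v in allowed for v in present) / len(present)
                   if allowed else None)
    # Validity: value is fit for use (e.g., exactly 10 digits).
    validity = (sum(bool(re.fullmatch(pattern, v)) for v in present) / len(present)
                if pattern else None)
    return completeness, consistency, validity

print(health_check(records, "opt_in", allowed={"Y", "N"}))  # ~(0.67, 0.5, None)
print(health_check(records, "phone", pattern=r"\d{10}"))    # (1.0, None, ~0.33)
```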

In another recent customer engagement, we looked at customer records for data anomalies, specifically for consistency, completeness, accuracy, and validity. We found that fixing these records resulted in improvements not only in marketing (campaign effectiveness), but also in service (customer experience), finance (higher collections and lower receivables), and reporting. In today’s data-rich, integrated, system-driven processes, improving data quality in one part of the organization (whether it be customer data, supplier data, or financial data) benefits multiple organizational functions and processes.

So while data quality will never be glamorous, a little insight providing a strong ROI for DQ can move this from “Do I have to?” to “Let’s do this.”

Friday Sep 12, 2014

Data Quality, Is it worth it? How do you know?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

You might think that the obvious answer to a data quality problem is to fix it, but not so fast. As heretical as this might be to write, not all data quality problems are worth fixing. While the data purists will tell you that every data point is worth making correct, we believe you should only spend money fixing data that has a direct value impact on your business. In other words, what’s the cost of bad data?

What’s the cost of bad data? That’s a question that is not asked often enough. When you don’t understand the value of your data and the costs associated with poor data quality, you tend to ignore the problem, which makes matters worse, specifically for initiatives like data consolidation, big data, customer experience, and data mastering. The ensuing negative impact has wider ramifications across the organization, primarily for the processes that rely on good quality data. All of the business operations systems that companies run on, like ERP, CRM, HCM, SCM, and EPM, assume that the underlying data is good.

Then what’s the best approach for data quality success? Paraphrasing Orwell’s Animal Farm: “all data is equal, some is just more equal than others.” Knowing what data is important and what data is not so important is a critical input to data quality project success. Using the Pareto rule, 20% of your data is most likely worth 80% of your effort. For example, it can easily be argued that financial data has greater value, as it comprises the numbers that run your business, get reported to investors and government agencies, and can send people to jail if they’re wrong. The CFO, who doesn’t like jail, probably considers this valuable data. Likewise, a CMO understands the importance of capturing and complying with customer contact and information-sharing preferences. Negligent marketing practices, due to poor customer data, can result in non-trivial fines and penalties, not to mention bad publicity. Similarly, a COO may deem up-to-date knowledge of expensive assets invaluable, along with description, location, and maintenance schedule details. Any lapses here could mean significant revenue loss due to unplanned downtime. Clearly, data value is in the eye of the beholder. But prioritizing which data challenges should be tackled first needs to be a ‘value-based’ discussion.

How do you decide what to focus on? We suggest you focus on understanding the costs of poor data quality and management, and then establishing a metric that is meaningful to your business. For example, colleges might look at the cost of poor data per student, utilities the cost of poor data per meter, manufacturers the cost of poor data per product, retailers the cost of poor data per customer, and oil producers the cost of poor data per well. Doing so makes it easy to communicate the value throughout your organization and allows anyone who understands the business to size the cost of bad data. For example, our studies show that on-campus data quality problems can cost anywhere from $70 to $480 per student per year. Let’s say your school has 7,500 students and we take a figure toward the low end of that range, $100 per student. That’s a $750,000-per-year data quality problem. As another example, our engagement with a utility customer estimated that data quality problems can cost between $5 and $10 per meter. Taking the low value of $5 against 400,000 meters quantifies the data quality problem at $2,000,000 annually. Sizing the problem lets you know just how much attention you should be paying to it. But this is the end result of your cost-of-poor-data-quality analysis. Now that we know the destination, how do we get there?
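
The sizing arithmetic itself is trivial, which is part of its communicative power. A quick sketch in Python, using the figures from the examples above:

```python
def annual_dq_cost(units, cost_per_unit):
    """Size a data quality problem: unit count times per-unit annual cost."""
    return units * cost_per_unit

# Conservative ends of the ranges cited above.
print(annual_dq_cost(7_500, 100))   # campus: 750000  ($750K per year)
print(annual_dq_cost(400_000, 5))   # utility: 2000000 ($2M per year)
```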

To arrive at these metrics, you have to assess the impact of bad data on your enterprise by engaging all of the parties involved in attempting to get the data right, and all of the parties negatively affected when it is wrong. You will need to go beyond the creators, curators, and users of the data and also involve IT stakeholders and business owners to estimate: impact on revenues; cost of redundant efforts in either getting the data or cleaning it up; the number of systems that will be impacted by high quality data; cost of non-compliance; and cost of rework. Only through this type of analysis can you gain the insight necessary to cost-justify a data quality and master data management effort.

The scope of this analysis is determined by the focus of your data quality efforts. If you are taking an enterprise-wide approach then you will need to deal with many departments and constituencies. If you are taking a Business Unit, functional or project focus for your data quality efforts, your examination will only need to be done on a departmental basis. For example, if customer data is the domain of analysis, you will need to involve subject matter experts across marketing, sales, and service. Alternatively, if supplier data is your focus, you will need to involve experts from procurement, supply-chain, and reporting functions.

Regardless of data domain, your overall approach may look something like this:

  1. Understanding business goals and priorities
  2. Documenting key data issues and challenges
  3. Assessing current capabilities and identifying gaps in your data
  4. Determining data capabilities and identifying needs
  5. Estimating and applying benefit improvement ranges
  6. Quantifying potential benefits and establishing your “cost per” metric
  7. Developing your data strategy and roadmap
  8. Developing your deployment timeline and recommendations

Going through this process ensures executive buy-in for your data quality efforts, gets the right people participating in the decisions that will need to be made, and produces a plan with an ROI, which you will need in order to gain approval to go ahead with the project.

Be sure to check out: Master Data Management @ OpenWorld

Thursday Aug 28, 2014

How Do You Know if You Have a Data Quality Issue?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team, leading the Integration practice in North America.

Big Data, Master Data Management, and Analytics are all topics and buzzwords getting big play in the press. And they’re all important, as today’s storage and computing capabilities allow for automated decision making that provides customers with experiences more tailored to them, as well as better information upon which business decisions can be made. The whole idea being: the more you know, the more you know.

Lots of companies think they should be doing Big Data, Master Data Management, and Analytics, but don’t really know where to start or what to start with. My two favorite questions to ask any prospective customer while discussing these topics are: 1) Do you have data you care about? And, 2) Does it have issues? If the answers come back “Yes” and “Yes,” then you can have the discussion on what it takes to get the data ready for Big Data, Master Data Management, and Analytics. If you try any of these with lousy data, you’re simply going to get lousy results.

But how do you know if you’ve got less-than-stellar data? All you have to do is listen to the different departments in your company, and they will tell you. Here is a guide to the types of things you might hear.

You know you have poor data quality if MARKETING says:

1. We have issues with privacy management and customer preferences
2. We can’t do the types of data matching and enhancement we want to do
3. There’s no way to do data matching with internal or external files
4. We have missing data but we don’t know how much or which variables
5. There’s no standardization or data governance
6. We don’t know who our customer is
7. We’ve got compliance issues

You know you have poor data quality if SALES says:

1. The data in the CRM is wrong or outdated and needs to be re-entered
2. I have to go through too many applications to find the right customer answers
3. Average call times are too long due to poor data and manual data entry
4. I’m spending too much time fixing data instead of selling

You know you have poor data quality if BUSINESS INTELLIGENCE says:

1. No one trusts the data so we have lots of Excel spreadsheets and none of the numbers match
2. It’s difficult to find data and there are too many sources
3. We have no data variables with consistent definitions
4. There’s nothing to clean the data with
5. Even data we can agree on, like telephone numbers, has multiple formats (see the normalization sketch after this list)
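
That last complaint is one of the easiest to make concrete. Below is a minimal sketch in Python, assuming North American 10-digit numbers, of the kind of standardization rule a data quality tool would apply; the input formats are hypothetical examples:

```python
import re

def normalize_phone(raw):
    """Reduce assorted phone formats to a canonical 10-digit string."""
    digits = re.sub(r"\D", "", raw)          # strip punctuation and letters
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                  # drop a leading country code
    return digits if len(digits) == 10 else None  # None = not valid

for raw in ["(312) 555-1212", "312.555.1212", "1-312-555-1212", "555-1212"]:
    print(raw, "->", normalize_phone(raw))
```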

You know you have poor data quality if OPERATIONS or FINANCE says:

1. The Billing report does not match the BI report
2. Payment information and address information do not match the information in the Account Profile
3. Accounts closed in Financial Systems show up as still open in the CRM system, or vice versa, so customers get billed for services they have terminated
4. Billing inaccuracies are caught during checks because there are no up-front governance rules
5. Agents enter multiple orders for the same service or product on an account
6. Service technicians show up on site with the wrong parts and equipment, which requires costly repeat visits and negatively impacts customer satisfaction
7. Inventory systems show items sales deemed OK to sell while suppliers may have marked them obsolete or recalled
8. We have multiple GLs and no single version of financial truth

You know you have poor data quality if IT says:

1. It’s difficult to keep data in sync across many sources and systems
2. Data survivorship rules don't exist
3. Customer data types (B2B, end user in B2B, customer in B2C, account owner in B2C) and statuses (active, trial, cancelled, etc.) change for the same customer over time, and it’s difficult to keep track without exerting herculean manual effort

You know you have poor data quality if HUMAN RESOURCES says:

1. First we have to wait for the data, and then when it is gathered and delivered we have to work to fix it
2. Ten percent of our time is wasted waiting on things or on re-work cycles
3. Employee frustration with searching, finding, and validating data results in churn, and will definitely delay the re-hiring of employees
4. Incorrect competency data results in: a) productivity loss in terms of looking at the wrong skilled person; b) possible revenue loss due to lack of skills needed; and c) additional hires when none are needed

You know you have poor data quality if PROCUREMENT says:

1. Not knowing our suppliers impacts efficiencies and costs
2. FTEs in centralized sourcing spend up to 20% of their time fixing bad data and related process issues
3. Data in our vendor master, material master, and pricing information records is currently synched manually, since the data is not accurate across systems.  We end up sending orders to the wrong suppliers
4. Supplier management takes too much time
5. The new product creation form contains wrong inputs, rendering many fields unusable
6. Multiple entities – 1) Logistics, 2) Plants, 3) Engineering, 4) Product Management – enter or create Material Master information.  We cannot get spend analytics
7. We have no good way of managing all of the products we buy and use

You know you have poor data quality if PRODUCT MANAGEMENT says:

1. Product development and life-cycle management efforts take longer and cost more
2. We have limited standards and rules for product dimensions.  We have to manually search for missing information that is available elsewhere
3. Our product data clean-up occurs in pockets across different groups; the end result of these redundant efforts is duplication of standards
4. We make status changes to the product lifecycle that don't get communicated to Marketing and Engineering in a timely manner.  Our customers don’t know what the product now does

All of these areas suffer either individually or together due to poor data quality.  All of these issues impact corporate performance which impacts stakeholders which impacts corporate management.  If you’re hearing any of these statements from any of these departments you have a data quality issue that needs to be addressed.  And that is especially true if you’re considering any type of Big Data, Master Data Management or Analytics initiative.

Thursday Aug 07, 2014

MDM CHALLENGES BY HIGHER EDUCATION DOMAIN

Author: John Siegman 

How do you know if you have a Master Data Management (MDM) or Data Quality (DQ) issue on your campus? One of the ways is to listen to the concerns of your campus constituents. While none of them are going to come out and tell you that they have a master data issue directly, by knowing what to listen for you can determine where the issues are and the best way to address them.

What follows are some of the key on-campus domains and what to listen for to determine if there is a MDM or DQ issue that needs to be resolved.

Student: Disconnected processes lacking coordination

· Fragmented data across disparate systems, disconnected across groups for:

- data collection efforts (duplicate/inconsistent student/faculty surveys)

- data definitions, rules, governance

- data access, security, and analysis

· Lack of training around security/access further complicated due to number of sources

· No information owner/no information strategy

· Student attributes maintained across many systems

Learning: Does not capture interactions

· Cannot identify students at risk, because interactions with students and faculty, and faculty interactions for research support, etc., are not captured

· No way to track how many undergraduates are interested in research

· Don't do any consistent analytics for course evaluations

· Difficult and time-consuming to gather information because of the federated nature of the data – for example, job descriptions in HR are different from what is really being used

· There is no view of Student experience

HR: Process inconsistencies, lack of data standards complicates execution

· Faculty not paid by the university are not in the HCM system, while students receiving payments from the university are in the HCM system

· Disconnected process to issue IDs, keys, duplicate issues

· Given multiplicity of data sources, accessing the data is a challenge

· Data analytics capabilities and available reports are not properly advertised, so people do not know what is available. As a consequence an inordinate amount of time is spent generating reports

· Faculty/Staff information collection is inconsistent, sometimes paper-based. Implication: lose applicants because it is too difficult to complete the application process

Research: Getting from data to insight is a challenge

· Very time consuming to determine: Which proposals were successful? What type of awards are we best at winning?

· Difficult to understand: number of proposals, dollar value, by school, by department, by agency, by time period

· Data challenges in extracting data out of the system for grants, faculty, and making it centrally available

Deans & Officers: Reporting is a challenge

· Significant use of Excel; reporting is becoming unstable because of the amount of data in the files

· An information charter and a common retention policy do not exist

· A lot of paper is generated for the domains we are covering. Converting paper to digital is a challenge

· Collecting information on faculty activity (publications) is a challenge. Data in documents requires validation

· Data requests result in garbage; donors receive the wrong information

Finance: Has little trust in data

· Do not have workflow governance processes. Implication: information goes into the system without being reviewed, so errors can make it into the records

· Systems connected to ERP systems do not always give relevant or requested info

· Closing the month or quarter takes too long as each school and each department has its own set of GLs.

Facilities: Efficiencies are hampered due to data disconnects

· Do not have accurate space metrics due to an outdated system; schools are not willing to share their info with Research Administrators and Proposal Investigators

· Do not have utility consumption, building by building

· No clear classroom assignment policy (a large room may be assigned to a small number of students)

· Not all classes are under the registrar's control

· No tool showing actual space for planning purposes

· Difficult to determine research costs, without accurate access to floor plans and utilization

· Cannot effectively schedule and monitor classrooms

If your campus has data, you have data issues. As the competition for students intensifies, being able to understand your current data, mine your social data, target your alumni, make better use of your facilities, improve your supplier relationships, and increase your student success will depend on better data. The tools exist to take data from a problem-filled issue to a distinct competitive advantage. The sooner campuses adopt these tools, the sooner they will receive the benefits of doing so.

Friday May 16, 2014

Master Data Management and Service-Oriented Architecture: Better Together


By Neela Chaudhari


Many companies are struggling to keep up with constant shifts in technology and at the same time address rapid changes in the business. As organizations strive to create greater efficiency and agility with the aid of new technologies, each new business-led project may further fragment IT systems and result in information inconsistencies across the organization. Because data is an essential input for all processes and business objects, these irregularities can undermine the original business objectives of the technology initiatives.

Combining the use of master data management (MDM) on the business side and service-oriented architecture (SOA) on the IT side can counteract the problem of information inconsistency. SOA is a practice that uses technology to decouple services, transactions, events, and processes to enhance data availability for business applications across a range of use cases. But the underlying data is often overlooked or treated as an afterthought when it comes to business processes, leading to poor data quality for your business applications. Without MDM, the data made available to business applications by an SOA approach might be less than accurate, and the SOA will spread it more widely throughout the organization. That can lead to a situation where lower-quality data is consumed by more business users, ultimately thwarting the objectives of efficiency and agility.

MDM can add value to SOA efforts because it improves the quality and trustworthiness of the data that is being integrated and consumed. MDM aids the tricky issue of upstream and downstream systems integration by ensuring the systems access a data hub containing accurate, consistent master data. It also assists SOA by providing consistent visibility and a technical foundation for master data use. MDM delivers the necessary data services to ensure the quality and timeliness of the enterprise objects the SOA will consume.

To learn more about the importance of MDM to SOA investments, read an in-depth technical article, MDM and SOA Be Warned! (http://www.oracle.com/technetwork/articles/soa/ind-soa-mdm-2090170.html)

And don't miss the new Oracle MDM resource center (http://www.oracle.com/webapps/dialogue/ns/dlgwelcome.jsp?p_ext=Y&p_dlg_id=11125359&src=7319909&Act=42). Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site, and financial data.

Friday May 02, 2014

Register Now! Product Data Management Weekly Cloudcast

Don't miss out: Product Data Management Weekly Cloudcast

Every Thursday at 10:00 a.m. PST (1:00 p.m. EST).

The North America Master Data Management (MDM) and Enterprise Data Quality (EDQ) team will present a series of weekly webcasts that give an inside look at how Oracle Product Data Management Cloud modernizes complex data management processes, allowing customers to focus on strategic opportunities and on delivering value to the business. These webcasts will run throughout FY14, with regular updates being distributed.

These sessions are designed for customers and prospects who are interested in learning more about Product Data Management Cloud. Customer executives and managers with responsibility for Data Management, Data Quality, Commerce, Manufacturing, IT, or other data management functions are encouraged to attend.

Remember, data that is not managed properly degrades at 27% per year!

Please click HERE to view a complete schedule and register for the demo.

Friday Mar 21, 2014

Master Data Management: How to Avoid Big Mistakes in Big Data


The paradigm-changing potential benefits of big data can't be overstated—but big changes can deliver big risks as well. For example, exploding data volumes naturally create a corresponding increase in data correlations, but as leading experts warn, correlations should not be mistaken for causes.

To avoid drawing the wrong conclusions from big data, organizations first need a way to assemble reliable master data to analyze. Then they need a way to put those conclusions and that data to work operationally, in the systems that govern and facilitate their day-to-day operations.

Master data management (MDM) helps deliver insightful information in context to aid decision-making. It can be used to filter big data, isolating and identifying key entities and shrinking the dataset to a manageable size for parsing, tagging, and associating with operational system records. And it provides the key intersecting point that enables organizations to map big data results to operational systems that are built on relational databases and structured information.
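
As a rough sketch of that filtering idea, here is a minimal example in Python. The identifiers, events, and matching rule are hypothetical, and this illustrates the pattern rather than Oracle MDM's actual interface:

```python
# Master data: the authoritative entities (a hypothetical hub extract).
MASTER_CUSTOMERS = {"C-001": "Acme Corp", "C-002": "Globex"}

# Big data: a large, noisy event stream with unreliable identifiers.
events = [
    {"source": "weblog", "customer_id": "C-001", "action": "viewed pricing"},
    {"source": "weblog", "customer_id": None, "action": "anonymous visit"},
    {"source": "social", "customer_id": "C-002", "action": "mentioned brand"},
]

# Filter to events that resolve to a mastered entity, tagging each with the
# authoritative name so operational systems can join on it.
resolved = [
    dict(event, customer_name=MASTER_CUSTOMERS[event["customer_id"]])
    for event in events
    if event["customer_id"] in MASTER_CUSTOMERS
]
print(resolved)  # two resolved events; the anonymous one is filtered out
```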

Adopting master data management capabilities helps organizations create consolidated, consistent, and authoritative master data across the enterprise, enabling the distribution of master information to all operational and analytical applications, including those that contain customer, product, supplier, site, and financial information.

Oracle Master Data Management drives results by delivering the ability to cleanse, govern, and manage the quality and lifecycle of master data.

To learn more about the importance of MDM as an underlying technology that facilitates big data initiatives, read an in-depth Oracle C-Central article, "Masters of the Data: CIOs Tune into the Importance of Data Quality, Data Governance, and Master Data Management."

And don't miss the new Oracle MDM resource center. Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site and financial data.

Thursday Jan 30, 2014

Oracle Value Chain Summit - February 3rd-5th, 2014

Are you based out of the Bay Area? 

Join more than 1,000 of your peers at the Value Chain Summit to learn how smart companies are transforming their supply chains into information-driven value chains. This unparalleled experience will give you the tools you need to drive innovation and maximize revenue. As you plan for the Summit, don’t miss our two Product Information Management sessions, where you will hear how customers today address the pressure points of getting their products to market, achieving operational excellence, and staying competitive!

Date: February 3-5, 2014
Location: San Jose McEnery Convention Center

Click here to learn more

Product Information Management: Effective Multi-Channel Commerce with Product Information Management

Featuring: Pampered Chef

Speakers: Shaibal Talukder, Pampered Chef; Bruck Assefa and Dhiman Bhatacharjee, Oracle

Customers today face many pressure points in getting their products to market through multiple sales and distribution channels. In today’s multi-channel commerce environment, it is more critical than ever to provide relevant, accurate, and timely product data to end consumers to increase sales and provide a superior customer experience. Achieving these goals requires robust tools and governance processes, including a central repository for self-service product data onboarding, embedded data quality for cleansing and standardizing, and governance workflows to orchestrate product definitions and change management processes. In this session you will learn how Oracle Product Hub and Enterprise Data Quality can help you master your product information effectively for your multi-channel commerce initiatives.

Fusion Product Hub: The Foundation for your Enterprise Product Information

Featuring: CDW and Kaygen

Speakers: Sachin Patel and Milan Bhatia, Oracle; Tamer Chavusholu, Kaygen; Karl Schulz, CDW

How much do you trust your product data to give you a competitive advantage and operational excellence in your enterprise? In today’s dynamic and competitive business environment, where faster product launches, cost efficiency, and regulatory compliance are a necessity, many enterprises struggle to achieve consistent, high-quality product data that provides significant business value. This session shows how to build a solid Product Information Management foundation with Oracle Fusion Product Hub and the Oracle Enterprise Data Quality platform. Learn about best practices and differentiated capabilities for Fusion Product Hub, with a case study from a marquee customer.

Wednesday Aug 14, 2013

Master Data—and Deliver a Great Customer Experience

In the fast-paced world of the connected consumer, expectations run high. Every time customers interact with a company, they want a positive, relevant, and personalized experience. If they don’t get it, today’s empowered customers won’t hesitate to leave. Yet many companies can’t deliver great personal experiences to their customers because they are struggling with siloed information systems and processes that fail to provide complete and accurate data to sales, support, and marketing teams. 

In the new white paper by Harvard Business Review Analytic Services, “Delivering on the Promise of Great Customer Experiences,” learn from several forward-thinking organizations—in industries ranging from travel to telecommunications—how to use Master Data Management (MDM) to collect and integrate all types of internal and external data and create the consistent, connected, and personalized experiences that customers want. Oracle Master Data Management offers the most complete product line on the market, enabling organizations to cleanse, centralize, and govern to create a “master” version of customer and business data—and the foundation for an improved customer experience strategy. Find out how your organization can enrich the customer experience.

Read the whitepaper today!

For more information on Master Data Management, visit us on oracle.com - www.oracle.com/mdm.

Monday Jul 15, 2013

MDM & Data Governance Summit - San Francisco

Are you based in the Bay Area, or do you plan to be there on July 17-18? If so, we would love to see you at the MDM & Data Governance Summit at the Hyatt, Fisherman's Wharf. Hear Doug Cosby, Vice President, Research & Development, present on Thursday, July 18th, from 10:50 to 11:40 AM, with Jeff Slatter, Associate Vice President at TD Bank, the sixth-largest bank in North America. Doug will be introducing the addition of Oracle Data Relationship Governance to the Oracle MDM & Data Governance portfolio. Learn how your organization can benefit from Oracle’s latest innovations in MDM and Data Governance to construct an agile enterprise.

To register at a discounted rate of $495 (save $300), use this registration link. To access full conference agenda and logistics, visit the conference site.

We hope to see you there!


Monday Apr 22, 2013

Latest MDM Screencast Now Available: Masters of the Data

Oracle Master Data Management recently had a great opportunity to be a part of Oracle Fusion Middleware's screencast program titled The New Business Imperative: Social, Mobile, Cloud. Each week this screencast series features a different Middleware offering, and MDM is currently featured. The title of the screencast is Masters of the Data: CIOs Tune into Data Quality and Master Data Management. For more information on Oracle MDM, click here.

Thursday Jan 17, 2013

Why Should CIOs Care About MDM, Data Governance and Data Quality?

Master Data Management, Data Quality and Data Governance are more important than ever when it comes to consolidation, standardization, and accountability for data.  A growing number of C-level executives are seeing that enterprises need these solutions to supplement the multitude of applications already installed and deployed in their organizations.  Take a look at this article on oracle.com's C-Central site http://www.oracle.com/us/c-central/cio-solutions/information-matters/importance-of-data/index.html and read about the MDM, Data Quality and Data Governance trifecta.

Friday Dec 07, 2012

Reference Data Management and Master Data: Are They Related?

Submitted By: Rahul Kamath

Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise Master Data Management (MDM) solution that can help manage changes to master data in ways that influence enterprise structure: mastering the chart of accounts to enable financial transformation, revamping organization structures to drive business transformation and operational efficiencies, restructuring sales territories to enable equitable distribution of leads to sales teams following the acquisition of new products, or adding cost centers to enable fine-grained control over expenses. Increasingly, DRM is also being utilized by Oracle customers for reference data management, an emerging solution space that deserves some explanation.

What is reference data? How does it relate to Master Data?

Reference data is a close cousin of master data. While master data is challenged with problems of unique identification, may change more rapidly, requires consensus building across stakeholders, and lends structure to business transactions, reference data is simpler and more slowly changing, but has semantic content that is used to categorize or group other information assets – including master data – and gives them contextual value. In fact, the creation of a new master data element may require new reference data to be created. For example, when a European company acquires a US business, chances are that they will now need to adapt their product line taxonomy to include a new category to describe the newly acquired US product line. Further, the cross-border transaction will also result in a revised geo hierarchy. The addition of new products represents a change to master data, while the changes to product categories and the geo hierarchy are examples of reference data changes. [1]
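
To illustrate the distinction, here is a minimal sketch in Python based on the acquisition example above; all names and values are hypothetical:

```python
# Reference data: slowly changing codes and taxonomies that give context.
PRODUCT_CATEGORIES = {
    "EU-APPAREL", "EU-FOOTWEAR",
    "US-APPAREL",  # new taxonomy node added after the US acquisition
}
GEO_HIERARCHY = {"EMEA": ["DE", "FR", "UK"], "NA": ["US", "CA"]}  # revised

# Master data: the business entities that the reference data categorizes.
new_product = {
    "sku": "US-1001",
    "name": "Denim Jacket",
    "category": "US-APPAREL",  # valid only once the reference data exists
}

# Creating the master record depends on the reference data being in place.
assert new_product["category"] in PRODUCT_CATEGORIES
```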

The following table contains an illustrative list of examples of reference data by type. Reference data types may include types and codes, business taxonomies, complex relationships & cross-domain mappings or standards.

| Types & Codes | Taxonomies | Relationships / Mappings | Standards |
| --- | --- | --- | --- |
| Transaction Codes | Industry Classification Categories and Codes, e.g., North American Industry Classification System (NAICS) | Product / Segment; Product / Geo | Calendars (e.g., Gregorian, Fiscal, Manufacturing, Retail, ISO 8601) |
| Lookup Tables (e.g., Gender, Marital Status, etc.) | Product Categories | City → State → Postal Codes | Currency Codes (e.g., ISO) |
| Status Codes | Sales Territories (e.g., Geo, Industry Verticals, Named Accounts, Federal/State/Local/Defense) | Customer / Market Segment; Business Unit / Channel | Country Codes (e.g., ISO 3166, UN) |
| Role Codes | Market Segments | Country Codes / Currency Codes / Financial Accounts | Date/Time, Time Zones (e.g., ISO 8601) |
| Domain Values | Universal Standard Products and Services Classification (UNSPSC), eCl@ss | International Classification of Diseases (ICD), e.g., ICD-9 → ICD-10 mappings | Tax Rates |

Why manage reference data?

Reference data carries contextual value and meaning and therefore its use can drive business logic that helps execute a business process, create a desired application behavior or provide meaningful segmentation to analyze transaction data. Further, mapping reference data often requires human judgment.

Sample Use Cases of Reference Data Management

Healthcare: Diagnostic Codes

The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, etc. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since the ICD-9 and ICD-10 code sets offer diagnosis codes at very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders, much as master data management does. Moreover, to build reports that track the utilization, frequency, and quality of diagnoses, medical practitioners may need to “cross-walk” mappings, either forward to ICD-10 or backward to ICD-9, depending upon the reporting time horizon.
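
A minimal sketch of what such a crosswalk might look like in Python; the code fragments below are hypothetical examples, not an authoritative mapping, and the review flag stands in for the human-judgment step described above:

```python
# Hypothetical ICD-9 -> ICD-10 forward map. Because the code sets differ in
# granularity, one ICD-9 code may map to several ICD-10 candidates.
FORWARD_MAP = {
    "250.00": ["E11.9"],               # one-to-one: no judgment needed
    "493.90": ["J45.901", "J45.909"],  # one-to-many: a steward must choose
}

def crosswalk(icd9_code):
    """Return ICD-10 candidates and whether human review is required."""
    candidates = FORWARD_MAP.get(icd9_code, [])
    needs_review = len(candidates) != 1
    return candidates, needs_review

print(crosswalk("493.90"))  # (['J45.901', 'J45.909'], True)
```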

Spend Management: Product, Service & Supplier Codes

Similarly, as an enterprise looks to rationalize suppliers and leverage their spend, conforming supplier codes, as well as product and service codes requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager’s time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card) as well as ACH and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value added tasks.

Change Management: Point of Sales Transaction Codes and Product Codes

In the specialty finance industry, enterprises are confronted with usury laws – governed at the state and local level – that regulate financial product innovation as it relates to consumer loans, check cashing, and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time the sale was booked. Since new products are released in a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes to the appropriate product and GL codes to comply with the changing regulations.
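
An effective-dated lookup is the natural shape for this rule. Here is a minimal sketch in Python; the product codes and dates are hypothetical:

```python
from datetime import date

# Hypothetical effective-dated product reference data: code -> (start, end).
PRODUCT_CODES = {
    "LOAN-30D": (date(2012, 1, 1), date(2014, 6, 30)),  # retired mid-2014
    "LOAN-14D": (date(2014, 7, 1), None),               # currently on offer
}

def valid_at_sale(product_code, sale_date):
    """A POS transaction must reference a product on offer when booked."""
    window = PRODUCT_CODES.get(product_code)
    if window is None:
        return False
    start, end = window
    return start <= sale_date and (end is None or sale_date <= end)

print(valid_at_sale("LOAN-30D", date(2014, 8, 15)))  # False: code retired
print(valid_at_sale("LOAN-14D", date(2014, 8, 15)))  # True
```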

Multi-National Companies: Industry Classification Schemes

As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico, and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries use different variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to address the challenge of distributing reference data changes to downstream applications and assessing which applications were impacted by a given change.

References
[1] Malcolm Chisholm, “Master Data versus Reference Data,” April 1, 2006.

Wednesday Oct 24, 2012

Oracle and ROLTA: Collaboration for Analytical Master Data Management

Oracle and ROLTA have joined forces to put together an educational webinar series on best practices for maximizing data integrity using analytical master data management.  Hear replays of webcasts by Gartner, as well as the customer success story at Navistar, and learn how Master Data Management in the enterprise is the right choice for dealing with heterogeneity and data degradation and for improving analysis of your business. For more information on this collaboration, click here. For additional information on Oracle's solution suite for MDM, click here

Monday Oct 08, 2012

The MDM Journey: From the Customer Perspective

Master Data Management is about more than just a single version of the truth or a 360-degree view of the customer.  It spans multiple domains, ranging from customers to suppliers to products and beyond.  MDM is pivotal to providing a solid customer experience, one that results in repeat business, continued loyalty, and, last but not least, high customer satisfaction.  Customer experience is defined not only as accurate information about the customer for the enterprise, but also as presenting the customer with the right information about products, orders, product availability, etc.  Let's take a look at a couple of customer use cases with Oracle MDM.

[Picture from a recent customer panel]

Oracle MDM is a key platform for increasing upsell/cross-sell opportunities, improving customer targeting, uncovering new sales opportunities, reducing inaccuracies in mailing marketing materials to prospects, and more accurately tapping into the full value of a customer across business units.  A leading investment and private bank leverages Oracle MDM to do a better job of identifying clients and their levels of investment, and to manage them consistently through areas such as credit, risk, and new accounts. Ultimately, they are looking to understand client investments and touchpoints across the company's offerings.  Another use case for Oracle MDM is with a major financial and insurance services company with clients worldwide, looking to resolve customer data inaccuracies and client information stored differently across multiple systems.

For more information on Oracle Master Data Management, click here.  
