Thursday Oct 16, 2014

Why Not Data Quality?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

Big data, business intelligence, analytics, data governance, data relationship management, the list of data-oriented topics goes on. Lots of people are talking about all of these, and yet very few people talk about poor data quality, let alone do anything about it. Why is that?

We think it's because Data Quality suffers from the "Do I have to?" (DIHT) syndrome. Anyone with kids or anyone who was a kid will recognize this syndrome, as in, "Clean up your room." "Do I have to?" Dealing with poor quality data is not glamorous and it doesn't get the headlines. Installing business intelligence systems or setting up data governance gets the headlines and makes careers. But what good is better reporting and structure if the underlying data is junk?

Recently we were in a half-day planning session for an existing customer. The customer wanted to know what they could do better using the existing software they had already purchased as Phase 1 of the project, and what they would need to acquire to do things better in Phase 2. Reviews like this are critically important, as people change on both sides of the customer/vendor relationship, to ensure knowledge transfer and reaffirmation of goals. The customer provided access to numerous departments across their company for interviews and focus groups. All of this information was gathered, reviewed, and summarized, and suggestions were made. Excel spreadsheets and PowerPoints ensued. Even though the Aberdeen Group and others have shown significant performance increases in established ERP and other business systems through the use of Data Quality and Master Data Management, no emphasis was put on data quality as a way of improving the customer's processes and results with their existing software packages, because the customer did not directly say they had a data issue (and very few customers ever admit this, because poor data is just standard operating procedure). What is it about data quality that makes it the option of last resort, the go-to when all else fails? It's got to be the belief that the data in underlying systems and sources is good by default. I mean, really, who would keep bad data around? Well, pretty much everyone, because if you don't know that it's bad, you end up keeping it around.

Let's admit it, DQ is not glamorous. There are no DQ-er of the year awards. People in DQ don't typically have their names on a parking spot right up front in the corporate lot. And besides not being glamorous, it's hard. Very rarely do we see someone 'own' data quality. After all, since bad data affects multiple people across multiple functions, no one really has the right incentives to drive data quality improvements where the resulting benefits accrue to multiple constituencies. Nobody really wants to spend their functional budgets fixing enterprise-wide data problems. Some of the very early DQ-adopting companies have teams of people, representing a cross-section of processes and functions, who spend their days manually inspecting data and creating internal systems to meet their specific data quality needs. They are very effective at what they do, but not as efficient as they could be, because the whole is greater than the sum of its parts. Also, most of the data knowledge is in their heads, and that's really hard to replicate and subject to loss through job switching, retirement, or the possible run-in with the proverbial bus.

So, given the underwhelming desire to fix poor data, how do you get the powers that be in your company to see the light? In our last article, "Data Quality, Is it worth it? How do you know?", we examined the value of data quality based on units of measure that were meaningful to the given organization. To paraphrase Field of Dreams, if you build the ROI, they will come. The first step to building the ROI is understanding how poor your data is and what impact that has on your organization. Typically that starts with a Data Quality Health Check.

A DQ Health Check takes a sample of your data and looks at varying aspects to determine the quality level of your data. The aspects examined include: Consistency, Completeness, Accuracy, and Validity. These measures attempt to answer the question: is your data fit for purpose? Consistency looks at the validation of the data within a variable. For example, if the variable in question only allows for Ys and Ns, any Ms and Ts will lower the consistency rating. Completeness is just that: how complete is the data in your database? Using our previous example, if Ys and Ns are only present 20% of the time, your data for that variable is fairly incomplete. Accuracy looks at a number of things, but mostly whether the data falls within the bounds of expectations. And Validity looks at usefulness. For example, telephone numbers are typically 10 digits. Phone numbers without area codes or with letters, while complete and possibly consistent, are not valid.
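
To make those measures concrete, here is a minimal sketch of how a first pass of a health check might score a couple of columns. This is illustrative Python only, not Oracle EDQ; the sample records, field names, and phone-format rule are invented for the example, and Accuracy is omitted because it needs a trusted reference source to compare against.

```python
import re

# Hypothetical sample: a Y/N opt-in flag and a phone number from a customer table.
records = [
    {"opt_in": "Y", "phone": "415-555-0123"},
    {"opt_in": "M", "phone": "555-0124"},       # bad flag value; phone missing area code
    {"opt_in": None, "phone": "41A-555-0125"},  # missing flag; letter inside the phone
    {"opt_in": "N", "phone": "650-555-0126"},
]

ALLOWED_FLAGS = {"Y", "N"}
PHONE_RE = r"\d{3}-\d{3}-\d{4}"  # assumed rule: 10-digit North American format

def completeness(values):
    """Share of records where a value is present at all."""
    return sum(v is not None and v != "" for v in values) / len(values)

def consistency(values, allowed):
    """Share of populated values that fall inside the allowed domain."""
    populated = [v for v in values if v]
    return sum(v in allowed for v in populated) / len(populated)

def validity(values, pattern):
    """Share of populated values that match the expected format."""
    populated = [v for v in values if v]
    return sum(bool(re.fullmatch(pattern, v)) for v in populated) / len(populated)

flags = [r["opt_in"] for r in records]
phones = [r["phone"] for r in records]

print(f"opt_in completeness: {completeness(flags):.0%}")               # 75%
print(f"opt_in consistency:  {consistency(flags, ALLOWED_FLAGS):.0%}")  # 67%
print(f"phone validity:      {validity(phones, PHONE_RE):.0%}")         # 50%
```

Run over a representative sample, scores like these become the baseline numbers that the ROI discussion below is built on.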

In another recent customer engagement, we looked at customer records for data anomalies, specifically for consistency, completeness, accuracy, and validity. We found that fixing these records resulted in improvements not only in marketing (campaign effectiveness), but also in service (customer experience), in finance (higher collections, lower receivables), and in reporting. In today's data-rich, integrated, system-driven processes, improving data quality in one part of the organization (whether it be customer data, supplier data, or financial data) benefits multiple organizational functions and processes.

So while data quality will never be glamorous, with a little insight providing a strong ROI for DQ, we can move this from "Do I have to?" to "Let's do this."

Friday Sep 12, 2014

Data Quality, Is it worth it? How do you know?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle's Insight team leading the Integration practice in North America.

You might think that the obvious answer to the title question would be to fix every problem you find, but not so fast. As heretical as this might be to write, not all data quality problems are worth fixing. While the data purists will tell you that every data point is worth verifying, we believe you should only spend money fixing data that has a direct value impact on your business. In other words, what's the cost of bad data?

What's the cost of bad data? That's a question that is not asked often enough. When you don't understand the value of your data and the costs associated with poor data quality, you tend to ignore the problem, which tends to make matters worse, specifically for initiatives like data consolidation, big data, customer experience, and data mastering. The ensuing negative impact has wider ramifications across the organization, primarily for the processes that rely on good quality data. All of the business operations systems that businesses run on, like ERP, CRM, HCM, SCM, and EPM, assume that the underlying data is good.

Then what’s the best approach for data quality success? Paraphrasing Orwell’s Animal Farm, “all data is equal, some is just more equal than others”. What data is important and what data is not so important is a critical input to data quality project success. Using the Pareto rule, 20% of your data is most likely worth 80% of your effort. For example, it can be easily argued that financial data have a greater value as they are the numbers that run your business, get reported to investors and government agencies, and can send people to jail if they’re wrong. The CFO, who doesn’t like jail, probably considers this valuable data. Likewise, a CMO understands the importance of capturing and complying with customer contact and information sharing preferences. Negligent marketing practices, due to poor customer data, can result in non-trivial fines and penalties, not to mention bad publicity. Similarly, a COO may deem up-to-date knowledge of expensive assets as invaluable information, along with description, location, and maintenance schedule details. Any lapses here could mean significant revenue loss due to unplanned downtime. Clearly, data value is in the eye of the beholder. But prioritizing which data challenges should be tackled first needs to be a ‘value-based’ discussion.

How do you decide what to focus on? We suggest you focus on understanding the costs of poor data quality and management, and then establishing a metric that is meaningful to your business. For example, colleges might look at the cost of poor data per student, utilities the cost of poor data per meter, manufacturers the cost of poor data per product, retailers the cost of poor data per customer, or oil producers the cost of poor data per well. Doing so makes it easy to communicate the value throughout your organization and allows anyone who understands the business to size the cost of bad data. For example, our studies show that on-campus data quality problems can cost anywhere from $70 to $480 per student per year. Let's say your school has 7,500 students and we take a conservative $100 per student, near the low end of that range. That's a $750,000 per year data quality problem. As another example, our engagement with a utility customer estimated that data quality problems can cost between $5 and $10 per meter. Taking the low value of $5 against 400,000 meters quantifies the data quality problem at $2,000,000 annually. Sizing the problem lets you know just how much attention you should be paying to it. But this is the end result of your cost of poor data quality analysis. Now that we know the destination, how do we get there?
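
The arithmetic itself is deliberately simple; the hard part is establishing a defensible cost-per-unit figure. A trivial sketch of the two examples above, in Python:

```python
def annual_dq_cost(units: int, cost_per_unit: float) -> float:
    """Annual cost of poor data quality, sized in the business's own unit of measure."""
    return units * cost_per_unit

# The two worked examples from the text: a campus and a utility.
print(annual_dq_cost(7_500, 100))   # 750000  -> a $750K-per-year problem
print(annual_dq_cost(400_000, 5))   # 2000000 -> a $2M-per-year problem
```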

To achieve these types of metrics you have to assess the impact of bad data on your enterprise by engaging all of the parties that are involved in attempting to get the data right, and all of the parties that are negatively affected when it is wrong. You will need to go beyond the creators, curators and users of the data and also involve IT stakeholders and business owners to estimate: impact on revenues; cost of redundant efforts in either getting the data or cleaning it up; the number of systems that will be impacted by high quality data; cost of non-compliance; and cost of rework. Only through this type of analysis can you gain the insight necessary to cost-justify a data quality and master data management effort.

The scope of this analysis is determined by the focus of your data quality efforts. If you are taking an enterprise-wide approach then you will need to deal with many departments and constituencies. If you are taking a Business Unit, functional or project focus for your data quality efforts, your examination will only need to be done on a departmental basis. For example, if customer data is the domain of analysis, you will need to involve subject matter experts across marketing, sales, and service. Alternatively, if supplier data is your focus, you will need to involve experts from procurement, supply-chain, and reporting functions.

Regardless of data domain, your overall approach may look something like this:

  1. Understanding business goals and priorities
  2. Documenting key data issues and challenges
  3. Assessing current capabilities and identifying gaps in your data
  4. Determining data capabilities and identifying needs
  5. Estimating and applying benefit improvement ranges
  6. Quantifying potential benefits and establishing your “cost per” metric
  7. Developing your data strategy and roadmap
  8. Developing your deployment timeline and recommendations

Going through this process ensures executive buy-in for your data quality efforts, gets the right people participating in the decisions that will need to be made, and provides a plan with an ROI, which you will need in order to gain the approvals to go ahead with the project.

Be sure to focus on: Master Data Management @ OpenWorld

Thursday Aug 28, 2014

How Do You Know if You Have a Data Quality Issue?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle's Insight team leading the Integration practice in North America.

Big Data, Master Data Management, and Analytics are all topics and buzzwords getting big play in the press. And they're all important, as today's storage and computing capabilities allow for automated decision making that provides customers with experiences more tailored to them, as well as better information upon which business decisions can be made. The whole idea being: the more you know, the more you know.

Lots of companies think they know that they should be doing Big Data, Master Data Management, and Analytics, but don't really know where to start or what to start with. My two favorite questions to ask any prospective customer while discussing these topics are: 1) Do you have data you care about? And 2) Does it have issues? If the answers come back "Yes" and "Yes," then you can have the discussion on what it takes to get the data ready for Big Data, Master Data Management, and Analytics. If you try any of these with lousy data, you're simply going to get lousy results.

But how do I know if I've got less-than-stellar data? All you have to do is listen to the different departments in your company, and they will tell you. Here is a guide to the types of things you might hear.

You know you have poor data quality if MARKETING says:

1. We have issues with privacy management and customer preferences
2. We can’t do the types of data matching and enhancement we want to do
3. There’s no way to do data matching with internal or external files
4. We have missing data but we don’t know how much or which variables
5. There’s no standardization or data governance
6. We don’t know who our customer is
7. We’ve got compliance issues

You know you have poor data quality if SALES says:

1. The data in the CRM is outdated or wrong and needs to be re-entered
2. I have to go through too many applications to find the right customer answers
3. Average call times are too long due to poor data and manual data entry
4. I’m spending too much time fixing data instead of selling

You know you have poor data quality if BUSINESS INTELLIGENCE says:

1. No one trusts the data so we have lots of Excel spreadsheets and none of the numbers match
2. It’s difficult to find data and there are too many sources
3. We have no data variables with consistent definitions
4. There’s nothing to clean the data with
5. Even data we can agree on, like telephone number, has multiple formats

You know you have poor data quality if OPERATIONS or FINANCE says:

1. The Billing report does not match the BI report
2. Payment information and address information do not match the information in the Account Profile
3. Accounts closed in the financial systems show up as still open in the CRM system, or vice versa, so customers get billed for services that have been terminated
4. Billing inaccuracies are caught during checks because there are no up-front governance rules
5. Agents enter multiple orders for the same service or product on an account
6. Service technicians show up on site with the wrong parts and equipment, which then requires costly repeat visits and negatively impacts customer satisfaction
7. Inventory systems show items as OK to sell that suppliers have marked obsolete or recalled
8. We have multiple GLs and not one single version of financial truth

You know you have poor data quality if IT says:

1. It's difficult to keep data in sync across many sources and systems
2. Data survivorship rules don't exist
3. Customer data types (B2B, end user in B2B, customer in B2C, account owner in B2C) and statuses (active, trial, cancelled, etc.) change for the same customer over time, and it's difficult to keep track without herculean manual effort

You know you have poor data quality if HUMAN RESOURCES says:

1. We first have to wait for data; then, once it is gathered and delivered, we have to work to fix it
2. Ten percent of our time is wasted waiting on things or on re-work cycles
3. Employee frustration with searching, finding, and validating data results in churn, and will definitely delay the re-hiring of former employees
4. Incorrect competency data results in: a) productivity loss from looking at the wrong skilled person; b) possible revenue loss due to lack of needed skills; and c) additional hires when none are needed

You know you have poor data quality if PROCUREMENT says:

1. Not knowing our suppliers impacts efficiencies and costs
2. FTEs in centralized sourcing spend up to 20% of their time fixing bad data and related process issues
3. Currently, data in our vendor master, material master, and pricing information records is manually synced, since the data is not accurate across systems. We end up sending orders to the wrong suppliers
4. Supplier management takes too much time
5. New product creation form contains wrong inputs rendering many fields unusable
6. Multiple entities (Logistics, Plants, Engineering, Product Management) enter or create Material Master information. We cannot get spend analytics
7. We have no good way of managing all of the products we buy and use

You know you have poor data quality if PRODUCT MANAGEMENT says:

1. Product development and life-cycle management efforts take longer and cost more
2. We have limited standards and rules for product dimensions.  We need to manually search missing information available elsewhere
3. Our product data clean-up occurs in pockets across different groups; the end result of these redundant efforts is duplicated standards
4. We make status changes to the product lifecycle that don't get communicated to Marketing and Engineering in a timely manner.  Our customers don’t know what the product now does

All of these areas suffer either individually or together due to poor data quality. All of these issues impact corporate performance, which impacts stakeholders, which impacts corporate management. If you're hearing any of these statements from any of these departments, you have a data quality issue that needs to be addressed. And that is especially true if you're considering any type of Big Data, Master Data Management, or Analytics initiative.

Thursday Aug 07, 2014

MDM Challenges by Higher Education Domain

Author: John Siegman 

How do you know if you have a Master Data Management (MDM) or Data Quality (DQ) issue on your campus? One of the ways is to listen to the concerns of your campus constituents. While none of them are going to come out and tell you that they have a master data issue directly, by knowing what to listen for you can determine where the issues are and the best way to address them.

What follows are some of the key on-campus domains and what to listen for to determine if there is a MDM or DQ issue that needs to be resolved.

Student: Disconnected processes lacking coordination

· Fragmented data across disparate systems, disconnected across groups for:

- data collection efforts (duplicate/inconsistent student/faculty surveys)

- data definitions, rules, governance

- data access, security, and analysis

· Lack of training around security/access, further complicated by the number of sources

· No information owner/no information strategy

· Student attributes maintained across many systems

Learning: Does not capture interactions

· Cannot identify students at risk. Interactions with students and faculty, and faculty interactions for research support, etc., are not captured

· No way to track how many undergraduates are interested in research

· Don't do any consistent analytics for course evaluations

· Difficult and time-consuming to gather information because of the federated nature of the data – for example, job descriptions in HR are different from what is really being used

· There is no view of the student experience

HR: Process inconsistencies, lack of data standards complicates execution

· Faculty not paid by the university are not in the HCM system, while students receiving payments from the university are in the HCM system

· Disconnected process to issue IDs, keys, duplicate issues

· Given the multiplicity of data sources, accessing the data is a challenge

· Data analytics capabilities and available reports are not properly advertised, so people do not know what is available. As a consequence, an inordinate amount of time is spent generating reports

· Faculty/staff information collection is inconsistent, sometimes paper-based. Implication: we lose applicants because it is too difficult to complete the application process

Research: Getting from data to insight is a challenge

· Very time-consuming to determine: Which proposals were successful? What type of awards are we best at winning?

· Difficult to understand: number of proposals, dollar value, by school, by department, by agency, by time period

· Challenges in extracting grant and faculty data out of the system and making it centrally available

Deans & Officers: Reporting is a challenge

· Significant use of Excel; reporting is becoming unstable because of the amount of data in the files

· An information charter and a common retention policy do not exist

· A lot of paper is generated for the domains we are covering. Converting paper to digital is a challenge

· Collecting information on faculty activity (publications) is a challenge. Data in documents requires validation

· Data requests result in garbage; donors receive the wrong information

Finance: Has little trust in data

· Do not have workflow governance processes. Implication: information goes into the system without being reviewed, so errors can make it into the records

· Systems connected to ERP systems do not always give relevant or requested info

· Closing the month or quarter takes too long as each school and each department has its own set of GLs.

Facilities: Efficiencies are hampered due to data disconnects

· Do not have accurate space metrics, due to an outdated system and schools not being willing to share their info with Research Administrators and Proposal Investigators

· Do not have utility consumption data, building by building

· No clear classroom assignment policy (a large room may be assigned to a small number of students)

· Not all classes are under the registrar's control

· No tool showing actual space for planning purposes

· Difficult to determine research costs without accurate access to floor plans and utilization

· Cannot effectively schedule and monitor classrooms

If your campus has data, you have data issues. As the push for students becomes more competitive, being able to understand your current data, mine your social data, target your alumni, make better use of your facilities, improve your supplier relationships, and increase your student success will all depend on better data. The tools exist to take data from a problem-filled liability to a distinct competitive advantage. The sooner campuses adopt these tools, the sooner they will receive the benefits of doing so.

Friday May 23, 2014

Supercharge Your Web Storefront with Superior Data Quality

By Neela Chaudhari (Compiled from Profit Magazine - May 2014)

A targeted, comprehensive data management strategy can differentiate your business from the competition. Providing a great online customer experience starts with giving consumers the information and research tools they need to make informed buying decisions. All too often, product research capabilities that already exist on a website are diminished by poor data quality behind the scenes.

Companies like Ace Hardware and Shop.com are adopting these data management strategies, and many retailers are seeing significant improvements in the quality of their guided navigation, product spec presentation, and product comparison strategies.

So while modern web storefront capabilities are critical to online sales success, be sure to remember that it is the data that makes it work with your customers and prospects!

There are three key areas where product data quality is impacting the customer experience. What are they? Visit our latest Profit article to find out:

http://www.oracle.com/us/corporate/profit/big-ideas/051514-mstevens-2203458.html

Friday May 16, 2014

Master Data Management and Service-Oriented Architecture: Better Together


By Neela Chaudhari


Many companies are struggling to keep up with constant shifts in technology and at the same time address rapid changes in the business. As organizations strive to create greater efficiency and agility with the aid of new technologies, each new business-led project may further fragment IT systems and result in information inconsistencies across the organization. Because data is an essential input for all processes and business objects, these irregularities can undermine the original business objectives of the technology initiatives.

Combining the use of master data management (MDM) on the business side and service-oriented architecture (SOA) on the IT side can counteract the problem of information inconsistency. SOA is a practice that uses technology to decouple services, transactions, events, and processes to enhance data availability for business applications across a range of use cases. But the underlying data is often overlooked or treated as an afterthought when it comes to business processes, leading to poor data quality characteristics for your business applications. Without MDM, the data made available to business applications by an SOA approach might be less than accurate and more widespread throughout an organization. That can lead to a situation where lower quality data is consumed by more business users—ultimately thwarting the objectives of efficiency and agility.

MDM can add value to SOA efforts because it improves the quality and trustworthiness of the data that is being integrated and consumed. MDM aids the tricky issue of upstream and downstream systems integration by ensuring the systems access a data hub containing accurate, consistent master data. It also assists SOA by providing consistent visibility and a technical foundation for master data use. MDM delivers the necessary data services to ensure the quality and timeliness of the enterprise objects the SOA will consume.
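
To picture the pattern, here is a minimal sketch of a data service of the kind an SOA layer might expose, with an MDM hub as the single source of mastered customer records. This is purely illustrative Python; none of the names are Oracle APIs, and the cross-reference scheme is an assumption made for the example.

```python
from dataclasses import dataclass

@dataclass
class MasterCustomer:
    master_id: str
    name: str
    email: str

# The hub keeps one golden record per customer, keyed by master ID.
MDM_HUB = {
    "C-001": MasterCustomer("C-001", "Acme Corp", "ap@acme.example"),
}

# Source systems keep only a cross-reference from their local IDs to the master ID.
CRM_XREF = {"crm-4711": "C-001"}

def get_customer(crm_id: str) -> MasterCustomer:
    """SOA-style service call: resolve a CRM-local ID to the mastered record,
    so every consumer sees the same cleansed, deduplicated data."""
    master_id = CRM_XREF.get(crm_id)
    if master_id is None:
        raise LookupError(f"no master record cross-referenced for {crm_id}")
    return MDM_HUB[master_id]

print(get_customer("crm-4711"))
```

The design point is the cross-reference: source systems keep their local keys, but the golden record lives in one place, so a fix made in the hub propagates to every service consumer.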

To learn more about the importance of MDM to SOA investments, read an in-depth technical article, MDM and SOA Be Warned! (http://www.oracle.com/technetwork/articles/soa/ind-soa-mdm-2090170.html)

And don't miss the new Oracle MDM resource center (http://www.oracle.com/webapps/dialogue/ns/dlgwelcome.jsp?p_ext=Y&p_dlg_id=11125359&src=7319909&Act=42). Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site, and financial data.

Friday May 02, 2014

Register Now! Product Data Management Weekly Cloudcast

Don't miss out: Product Data Management Weekly Cloudcast

Every Thursday at 10:00 a.m. PST (1:00 p.m. EST).

The North America Master Data Management (MDM) and Enterprise Data Quality (EDQ) team will present a series of weekly webcasts that give an inside look at how Oracle Product Data Management Cloud modernizes complex data management processes, allowing customers to focus on strategic opportunities and deliver value to the business. These webcasts will run throughout FY14, with regular updates being distributed.

These sessions are designed for customers and prospects who are interested in learning more about Product Data Management Cloud. Customer executives and managers with responsibility for Data Management, Data Quality, Commerce, Manufacturing, IT, or other data management responsibilities are encouraged to attend.

Remember, data that is not managed properly degrades at 27% per year!

Please click HERE to view a complete schedule and register for the demo.

Friday Apr 25, 2014

Big Data Challenges & Considerations


By: Murad Fatehali

While 'Big Data' is dominating a lot of media and executive attention (it's a Top-5 Initiative according to IDC Retail Insights 2014 Predictions), the underlying considerations & challenges of Big Data are unfortunately getting trivialized.  As corporations and people continue to make the Internet a more fundamental way of life (from social and transactional perspectives), the scope of the underlying data that gets created, stored, retrieved, and analyzed will commensurately and exponentially grow as well. Although many factors and assumptions go into Big Data discussions, outlined below are five pivotal dimensions:

1. Size: This is the most obvious and most talked about factor. Everybody seems to understand that given the rise of the Internet, coupled with the plethora of apps in the hands of consumers and users, growth in the scale and volume of data is truly astonishing (some say quantity is doubling every 2 years). However, what gets neglected are the challenges related to storage and analysis of vast volumes of data, captured in a variety of formats, for example: text-based (tweets, SMSs), graphically illustrated (pictures and drawings), and audio-visually represented (podcasts, movies, etc.). For anyone interested in making sense of Big Data, how efficiently the data gets stored and retrieved will be key factors - no wonder we are seeing an increase in the number and capacity of purpose-built storage systems and lightning-fast data-crunching machines.

2. Sources: Usually Big Data discussions refer to 'external' sources like the Internet (Facebook, Twitter) and media (digital or otherwise), but often overlooked are the 'internal' data sources that can be scraped for insight - these would be the transactional and operational systems supporting multiple marketing, sales, fulfillment, and service systems, in addition to the company's own BI/reporting systems. Ask any CMO and they will explain to you the widening gaps in data due to lack of integration and data governance. Many companies struggle when answering how many customers they have, not to mention the difficulties in identifying those most profitable or most loyal. Since the data opportunities inside the company have not been fully explored, diving into external sources of Big Data without adequate forethought can add not only costs, but also complexity and confusion. In our engagement with customers, while ERP systems and POS data remain good inputs into a company's Business Intelligence and reporting efforts, more and more we see executives asking for data from other sources to be able to develop 'personalized' offers and solutions, for example:

a. Machine usage data about performance and diagnostics using sensors (vehicle-to-smartphone connectivity devices for smarter driving from: dash.by, automatic.com, metromile.com, etc.)

b. Medical data from patients/public records and research studies (fitbit and wearable technologies, research findings, etc.)

c. Geographic and telemetric data from devices, maps, GPS signals, user tags and flags (Google maps, in-vehicle trackers, etc.)

d. Smartphone check-ins (Foursquare, etc.)

e. Social networks that capture trends (Twitter, etc.)

f. Content sites (Wikipedia, etc.)

3. Speed: The flow of data is something that can no longer go unnoticed - it is instant and constant. According to Google CEO Eric Schmidt, "There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created every 2 days, and the pace is increasing." It is clear that the rate of incoming data can quickly alter any trends captured off historical data and render meaningless any campaigns that don't take into account real-time response tracking. For example, without compute-intensive servers tied into the large-scale infrastructure hosting Facebook posts, Twitter feeds, and Google searches, it would be impossible to capitalize on fads, keep pace with up-to-the-minute news, understand critical events, and predict emerging trends (case in point: who knew that tracking the number of flu queries on Google could help predict localized outbreaks, or that analyzing global weather patterns could help predict harvest outputs or crop failures). The implications are huge for those who can robustly and quickly find data relationships: commonality, lineage, and correlation are going to be big differentiators.

4. Standards: Unlike organizational transactions designed to meet compliance standards for financial reporting, Big Data does not need to conform to any such rules or conventions. The vast majority (some say as much as 90%) of data coming from uploads, social network chatter, comments, likes, or tweets is neither structured nor precise, to say nothing of its reliability (accuracy). The sheer variety of posts, representing wide-ranging interests and agendas across multiple sites and sources, being repeated and re-circulated, requires significant analytical prowess, speed, and talent to make intelligible.

5. Strategy: Big Data has the potential to help a company truly cross channels (understand personal profiles and preferences) and deliver world-class experiences to its customers; it can also help inform a company's strategy by making sense of widespread data coming from public and private networks, internal and external systems, and individual and institutional sources. How Big Data plays into a company's success will depend on the priorities of its leadership - and their willingness to make data-driven decisions a reality and turn Big Data into more than just a buzzword. LinkedIn, for example, is analyzing vast quantities of data to generate billions of personalized recommendations every week – no small feat when looking through the siloed and antiquated lens of yesterday's IT landscape. According to Baseline, a 10% increase in data accessibility translates into an additional $65.7M in net income for a typical Fortune 1000 company.

Big Data challenges and considerations outlined above are further impacted by availability of skilled talent - people who understand data in all its forms and can help govern and master it.  Additionally, ethical and legal challenges further compound the problem since perspectives differ on what is personal, private, and sensitive information.  To be sure, none of these Big Data challenges are expected to get resolved anytime soon – all the more reason, therefore, for executives to be cognizant of the five Big Data considerations (size, sources, speed, standards, and strategy) as they chart new ground in their Big Data journey to unlock its value.

For additional details on the Insight program, please visit: www.oracle.com/insight

Murad Fatehali is a Senior Director with the Insight & Customer Strategy Team at Oracle Corporation; he focuses on helping Oracle customers solve Data Strategy & Integration challenges.

Friday Apr 11, 2014

Thank you for Joining us at Gartner Enterprise Information and Master Data Summit

Thank you for stopping by our Oracle Booth at Gartner and attending our session on Feet on the Ground, Head in the Cloud! We hope your time with Gartner and your fellow attendees was productive, fun and above all inspiring for the year to come.

Oracle has been recognized as a leader in this year's Gartner Magic Quadrant for MDM solutions, and our strategy is to make the full spectrum of deployment options available to our customers. With a mixture of on-premise and cloud-based applications integrated together to deliver best-in-class customer experience, we have wrestled with these problems and created solutions that can be implemented today. The decisions you make on how to design and deploy MDM will have a magnified impact as the data explosion continues to unfold.


Oracle MDM solutions are built on the same application engines that drive our enterprise software, with thousands of customers in deployment managing billions of records. We have out-of-the-box integrations to a number of Customer Experience applications, industry-standard data feeds, and even third-party applications and data sources. Our EDQ and MDM applications are specifically engineered to be part of a heterogeneous application and technology ecosystem. That's critically important as the MDM footprint gets extended and your need to operate across these different environments expands.


It’s an exciting time to be in the enterprise data management space. We would welcome the chance to discuss what we’re doing in more detail, and to share our experience and our recommendations for solving your most pressing data problems. 


For more information, please visit our MDM Solutions Factory at:

http://launch.oracle.com/?oramdm

Friday Mar 21, 2014

Master Data Management: How to Avoid Big Mistakes in Big Data


The paradigm-changing potential benefits of big data can't be overstated—but big changes can deliver big risks as well. For example, exploding data volumes naturally create a corresponding increase in data correlations, but as leading experts warn, correlations should not be mistaken for causes.

To avoid drawing the wrong conclusions from big data, organizations first need a way to assemble reliable master data to analyze. Then they need a way to put those conclusions and that data to work operationally, in the systems that govern and facilitate their day-to-day operations.

Master data management (MDM) helps deliver insightful information in context to aid decision-making. It can be used to filter big data, isolating and identifying key entities and shrinking the dataset to a manageable size for parsing, tagging, and associating with operational system records. And it provides the key intersecting point that enables organizations to map big data results to operational systems that are built on relational databases and structured information.
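
As a minimal sketch of that filtering idea (illustrative Python only; the event fields and the matching rule are invented for the example), the big data set is reduced to the events that resolve to a known master entity, each tagged with the master key that lets it join back to operational systems:

```python
# Master product data: the authoritative entities, keyed by SKU.
master_products = {"SKU-100": "Cordless Drill", "SKU-200": "Circular Saw"}

# Raw big-data events, e.g. clickstream: high-volume, only partly relevant.
clickstream = [
    {"user": "u1", "sku": "SKU-100"},
    {"user": "u2", "sku": "SKU-999"},  # resolves to no master entity: filtered out
    {"user": "u3", "sku": "SKU-200"},
]

# Keep only events that match a mastered entity, enriched with master attributes
# so they can be parsed, tagged, and associated with operational records.
matched = [
    {**event, "product_name": master_products[event["sku"]]}
    for event in clickstream
    if event["sku"] in master_products
]

print(matched)  # two events survive, now keyed to mastered product records
```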

Adopting master data management capabilities helps organizations create consolidated, consistent, and authoritative master data across the enterprise, enabling the distribution of master information to all operational and analytical applications, including those that contain customer, product, supplier, site, and financial information.

Oracle Master Data Management drives results by delivering the ability to cleanse, govern, and manage the quality and lifecycle of master data.

To learn more about the importance of MDM as an underlying technology that facilitates big data initiatives, read an in-depth Oracle C-Central article, "Masters of the Data: CIOs Tune into the Importance of Data Quality, Data Governance, and Master Data Management."

And don't miss the new Oracle MDM resource center. Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site and financial data.

Thursday Mar 13, 2014

Join Us for a Webcast: Enterprise Best Practices for Complex Multi-Chart/Multi-Ledger Organizations


Wednesday, March 19, 2014
2:00 PM – 3:00 PM EST
View your local time
Live Webcast
Register to watch at your desk!
Join Us for a webcast

Enterprise Best Practices for Complex Multi-Chart/Multi-Ledger Organizations

Like many industry leaders, most organizations do a good job closing their books and managing change in their structures. Unfortunately, many of the processes put in place are manual and require many hours of planning, spreadsheet updates, and rework to address inconsistencies and additional change while the manual processes are being executed.

Diverse organizations that are planning for record growth in 2014 and that support many different general ledgers and/or multiple charts of accounts face a complex challenge in managing corporate reporting hierarchies and financial consolidation mappings. In this session you will learn how to:
  • Automate key manual processes to drive growth
  • Handle hierarchical change and re-organizations
  • Understand Data Governance concerns and strategies

Join Oracle, AdvancedEPM and General Dynamics for this informative webcast as we discuss key issues around how to automate change in your business. General Dynamics achievements included:
  • Reduced Planning Cycles by 30%
  • Reduced Re-Organizations, and What-If Modeling by 25%
  • Delegated maintenance responsibility to business experts most familiar with change events
  • Consolidated & rationalized corporate and divisional hierarchies across many different sources
  • Pushed hierarchies and other reference data (in multiple formats) to many downstream apps
  • Standardized hierarchies, business rules, and data validations
  • Provided better analysis tools to support Re-Orgs, M&A activity, etc.

About the Speaker

Ed Cody is a seasoned IT business manager with over 10 years of experience working with Hyperion EPM and Oracle BI applications. Ed is the author of two books on Hyperion products and has consulted for both commercial and government organizations. Ed was instrumental in the development of General Dynamics' "BI/EPM Center of Excellence" and award-winning "BI Collaborative," and he has leveraged DRM in concert with Oracle E-Business, Oracle BI, and EPM solutions to provide easy-to-use and effective master data management solutions across General Dynamics.

Register Now!




Friday Mar 07, 2014

Master Data Management and Big Data: Perfect Together!

By Gino Fortunato


The 'hot' button around gathering customer insight is Big Data.  And justifiably so.  Using Big Data is a great way to harness previously unusable data to look for patterns in the data crumbs that customers leave behind.  By rapidly processing this data in real time, Big Data allows customer insight that was previously impossible. 

Much of this insight is statistical. Customers have similar patterns. They abandon shopping carts when something is out of stock or when they see the final price, at least more often than at other points of the buying process. It's just human nature. By using statistical and other techniques to drive insight about what the customer is doing, or might do next, you can drive a lot of value.

But wouldn't it be great to use that Big Data insight along with what you already know about that customer? That's where MDM comes in. MDM is the spot to operationalize what you already know about the customer. By using what you already know, plus the insight you have gleaned from Big Data, you can make informed decisions about how to react to the customer's next click. And do it in real time. To properly use the insight, it is necessary to properly identify the customer. Again, this is an area where master data management can help. With its built-in identity resolution capabilities, MDM can help in two ways. One is to add to what is being derived from the Big Data source. The other is to prevent mistakes when the statistical analysis is wrong. For example, the customer surfing the gaming site may be grouped into a category that has a number of traits. One of those traits might be an expected age range. But if the organization knows the birthdate of the person is outside that age range, it can propose different cross-sell/upsell possibilities, and perhaps discover a new subcategory that further opens the market.
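
A minimal sketch of that override logic (illustrative Python; the segment name, age range, and offer labels are invented for the example):

```python
from datetime import date
from typing import Optional, Tuple

# Assumed trait attached to a statistically derived segment: an expected age range.
SEGMENT_AGE_RANGE = {"gaming-site-visitor": (16, 30)}

def age_on(birthdate: date, today: date) -> int:
    """Whole years between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def choose_offers(segment: str, known_birthdate: Optional[date],
                  today: date) -> Tuple[str, Optional[str]]:
    """Prefer mastered facts over statistical inference when they conflict."""
    low, high = SEGMENT_AGE_RANGE[segment]
    if known_birthdate is not None and not (low <= age_on(known_birthdate, today) <= high):
        # The golden record contradicts the inferred trait: adjust the offer
        # and flag a possible new subcategory worth exploring.
        return ("age-appropriate cross-sell/upsell", "candidate new subcategory")
    return ("segment default offers", None)

# A known 49-year-old browsing the gaming site: the mastered birthdate wins.
print(choose_offers("gaming-site-visitor", date(1964, 6, 1), date(2014, 3, 7)))
```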

To learn more about the importance of MDM as an underlying technology that facilitates big data initiatives, read an in-depth Oracle C-Central article, "Masters of the Data: CIOs Tune into the Importance of Data Quality, Data Governance, and Master Data Management."
Monday Feb 10, 2014

Data Relationship Governance: Effectively Managing Change and Creating a Nimble Enterprise

In an agile world, enterprises should be able to react to change at a fast pace. However, managing changes across systems can be very challenging. Data Relationship Governance offers best-of-breed data governance capabilities to align people and processes and to maximize information asset quality.

Join us for an Executive forum to discover strategies and tools that are unlocking business potential, driving impact across all business lines and allowing organizations to manage for growth.

Learn directly from organizations succeeding with Data Relationship Governance strategies:

  • Cisco Systems will discuss how their solution supports key financial transformation programs, has improved business agility, and has created a consistent framework for growth.
  • NetApp Inc will discuss how they are utilizing their investment within the Finance organization and how DRM has been a strategic game changer for Sales and Marketing.

This session guarantees an in-depth opportunity to understand why Data Relationship Governance is at the forefront of every finance executive's priority list.

Register today and learn from your peers on how they are unlocking business potential.

Register Now

February 13th, 8:30 a.m. – 1:00 p.m. PST

Four Seasons

2050 University Ave
East Palo Alto, CA 94303



Agenda

8:30 a.m. – 9:00 a.m.

Networking and Breakfast

9:00 a.m. – 9:45 a.m.

Effectively Manage Change with Enterprise Data Governance
- Doug Cosby, Vice President, Engineering, Oracle

10:00 a.m. – 10:30 a.m.

Data Governance Strategies at Cisco Systems
- Finance Business Manager, Cisco Systems

10:30 a.m. – 11:00 a.m.

Enforcing Governance with EPM and BI Solutions
- Robin Peel and Dianne Paulus, Oracle

11:15 a.m. – 11:45 a.m.

NetApp Inc extends outside of Finance to Support Sales Territory Maintenance
- Sr. Finance Manager, Finance Systems Group, NetApp Inc.

11:45 a.m.

Lunch and Data Governance Panel

If you are an employee or official of a government organization, please click here for important ethics information regarding this event.

Thursday Feb 06, 2014

Improve Your Customer Experience Through World-Class Data Quality

Webcast: Improve Your Customer Experience with World-Class Data Quality

Now Available! - Webcast Replay

As an Oracle customer, you know that Oracle is the premier provider of Commerce Solutions. Now you can take your customer experience to the next level by enhancing the data in your systems, adding EDQ (Enterprise Data Quality) to further boost the effectiveness and ROI of the systems you already have. If you are like most organizations, the quality, completeness, and consistency of the customer, partner, and/or product data in your systems is lower than you may realize, and that may be undermining the effective operation of those systems. But how do you know? Do you have quality metrics? Do you have a quality governance program? Do you know how much any of this may be undermining the expected ROI from your Commerce Solutions? Learn how you can develop a data quality program, reduce support costs, improve customer satisfaction, and drive future growth.

See how you can enhance your CX applications to take your customers' experience to the next level. You'll discover the importance of data quality in providing a rich and engaging experience from a customer's first interaction on your website, learn how to create a data governance program to optimize customer experience, and see how Oracle customers have transformed their online commerce experience by leveraging Enterprise Data Quality.

This one-hour webcast replay features Martin Boyd, Senior Director of Strategy for Oracle Enterprise Data Quality, and Tamer Chavusholu, Managing Consultant at Kaygen.

Learn how to leverage a data quality program to dramatically improve your Customer Experience!

Webcast Replay

Tuesday Aug 20, 2013

Catch up on Enterprise Data Quality


The Oracle Data Integration and Master Data Management Newsletter is now available. In this edition of the quarterly newsletter, we highlight six use cases for Oracle Enterprise Data Quality. You will also find information on upcoming webcasts and customer buzz, among other things.

Also, if you want to subscribe to our newsletter, you can do so through the link in the newsletter. Happy Reading!



You can learn more about our Oracle Enterprise Data Quality multitool in our upcoming webcast, Putting Data to Work Using Oracle Enterprise Data Quality Solutions, on Tuesday, August 27 at 10:00 a.m. PT. As Dain Hansen, Director of Product Marketing, says, "Unlike Swiss Army knives, it is guaranteed never to rust or stop you in an airport metal detector."
