Thursday Sep 20, 2012

Oracle Enterprise Data Quality - Geared Up and Ready for OpenWorld 2012

10 days and counting till Oracle OpenWorld 2012 is upon us.  Enterprise data quality is key to every information integration and consolidation initiative. At this year's OpenWorld, hear how Oracle Enterprise Data Quality provides the critical piece to achieving trusted, reliable master data and increases the value of data integration initiatives. Here are the different ways you can learn and experience Enterprise Data Quality at OpenWorld: 

Conference sessions:

  • Oracle Enterprise Data Quality: Product Overview and Roadmap - Monday 10/1/12, 1:45-2:45 PM - Moscone West - 3006
  • Data Preparation and Ongoing Governance with the Oracle Enterprise Data Quality Platform - Wednesday 10/3/2012, 1:15-2:15 PM - Moscone West - 3000 
  • Data Acquisition, Migration and Integration with the Oracle Enterprise Data Quality Platform - Thursday 10/4/2012, 12:45-1:45 PM - Moscone West - 3005 
  • Hands-on Lab: Introduction to Oracle Enterprise Data Quality Platform - Monday 10/2/2012, 4:45-5:45 PM - Marriott Marquis - Salon 1/2

Demos:

  • Trusted Data with Oracle Enterprise Data Quality - Moscone South, Right - S-243 (note: proceed to the Middleware demo grounds)

For a full list of Master Data Management and Data Quality sessions and other events, click here.

Wednesday Sep 19, 2012

Oracle MDM Panel at OOW 12: Best practices, Lessons Learned and More...

By Narayana Machiraju 

We are less than two weeks out from the start of Oracle OpenWorld 2012. The MDM team has built up a solid lineup of product and customer sessions for you to attend this year, in addition to the hands-on labs and numerous demonstration pods in Moscone West.

This year we will be hosting a customer panel session dedicated to Oracle Customer Hub at Oracle OpenWorld. An esteemed panel of Oracle Customer Hub customers from different industries (Credit Suisse, Allianz, and Elsevier) will provide insight into the Customer MDM journey, from building a business case and MDM vision, through establishing and sustaining governance and choosing implementation strategies, to realizing the benefits. You will also hear about implementation challenges, phasing strategies, and lessons learned from real-life experiences.

If you are already implementing Customer MDM or evaluating the benefits of MDM and you would like to hear directly from our customers then I highly recommend you attend this session:

Customer MDM Panel: Discussion and Q&A on Implementation Best Practices, Data Quality, Data Governance and ROI

Wednesday, October 3rd, 5:00 PM - 6:00 PM

Westin Market Street Hotel - Metropolitan 1

The MDM track at Oracle OpenWorld covers a variety of topics related to MDM. In addition to the product management team presenting product updates and the roadmap, we have several customer panels, conference sessions, and customer round-table sessions featuring many marquee customers. You can see an overview of MDM sessions here.

We hope to see you at OpenWorld, and stay in touch via our future blog posts.

Tuesday Sep 18, 2012

Enterprise MDM: Rationalizing Reference Data in a Fast Changing Environment

By Rahul Kamath

Enterprises must move at a rapid pace to establish and retain global market leadership by continuously focusing on operational efficiency, customer intimacy and relentless execution.

Reference Data Management

For multinational companies with a presence in multiple industry categories, market segments, and geographies, the ability to proactively manage changes and harness them to align front-office operations with back-office operations and performance management initiatives is critical to making the proverbial elephant dance.

Managing reference data, including types and codes, business taxonomies, complex relationships, and mappings, represents a key component of the broader agenda for enabling flexibility and agility without sacrificing enterprise-level consistency, regulatory compliance, and control.

Financial Transformation

Periodically, companies find that processes implemented a decade or more ago no longer mirror the way they do business, and they seek to proactively transform their operations and underlying processes.

Financial transformation often begins with the redesign of one’s chart of accounts. The ability to model and redesign the chart of accounts collaboratively, quickly validate it against historical transactions, and secure business buy-in across multiple line-of-business stakeholders, all while continuing to manage changes in legacy general ledger systems and downstream analytical applications during the in-flight transformation, can mean the difference between controlled success and project failure.

Attend the session CON8275 - Oracle Hyperion Data Relationship Management: Enabling Enterprise Transformation at Oracle OpenWorld on Monday, October 1, 2012, at 4:45 PM in Ballroom A of the InterContinental Hotel to learn how Oracle’s Data Relationship Management solution can help you stay ahead of the competition and proactively harness master (and reference) data changes to transform your enterprise. Hear in-depth customer testimonials from GE Healthcare and Old Mutual South Africa to learn how others have harnessed this technology effectively to build enduring competitive advantage through business process innovation and investments in master data governance.

Hear GE Healthcare discuss how DRM has enabled financial transformation, ERP consolidation, mergers and acquisitions, and the alignment of reference data across financial and management reporting applications. Also, learn how Old Mutual SA has upgraded to EBS R12 Financials and is transforming the management of its chart of accounts for corporate reporting.

Separately, an esteemed panel of DRM customers, including Cisco Systems, Nationwide Insurance, Ralcorp Holdings, and Mentor Graphics, will discuss how DRM has helped them address business challenges associated with enterprise MDM, including major change management initiatives such as financial transformations, corporate restructuring, mergers and acquisitions, and the rationalization of financial and analytical master reference data to support alternate business perspectives for the alignment of EPM/BI initiatives.

Attend the session CON9377 - Customer Showcase: Success with Oracle Hyperion Data Relationship Management at OpenWorld on Thursday, October 4, 2012, at 12:45 PM in the Ballroom of the InterContinental Hotel to interact with our esteemed speakers firsthand.

Wednesday Sep 12, 2012

The Top 5 MDM Sessions You Can’t Miss at OpenWorld

Sessions, demo pods, hands-on labs, and much more – but where should you focus? MDM has some excellent sessions planned for OOW – here is a top 5 list of the sessions you just can’t afford to miss.

October 3, 2012  1:15 PM - 2:15 PM    Moscone West - 3002/3004    
What's There to Know About Oracle’s Master Data Management Portfolio and Roadmap?
Hear about product strategy, our vision for the future, and how Oracle MDM is positioned to excel in helping organizations make the most of their customer, partner, supplier, or product data.

October 3, 2012  5:00 PM - 6:00 PM   Westin San Francisco – Metropolitan I
Oracle Customer MDM Applications: Implementation Best Practices, Data Governance, and ROI

Customer successes provide solid examples of technology at work and of how organizations derive value from it. Attend this session and hear from our customers how they built a business case, established governance, and are realizing the benefits of Oracle Customer Hub.

October 2, 2012  10:15 AM - 11:15 AM   Moscone West – 3001
Mastering Product Data: Strategies for Effective Product Information Management

Product data is vital for any enterprise that needs to provide a consolidated representation of its products to partners, customers, and suppliers. Hear how our customers leverage product information to lead in their respective areas and how Oracle is critical to achieving this.

October 2, 2012  11:45 AM - 12:45 PM   Moscone West – 2022
Enabling Trusted Enterprise Product Data with Oracle Fusion Product Hub

Learn how Oracle Fusion Product Hub is paving the way for providing organizations with trusted product data as well as helping organizations make the most of the information and infrastructure they already possess.

October 1, 2012  4:45 PM – 5:45 PM   InterContinental - Ballroom A
Oracle Hyperion Data Relationship Management: Enabling Enterprise Transformation

Hear how Data Relationship Management drives enterprise transformation and why any organization embarking on a master data management initiative needs it, plus hear best practices and lessons learned directly from our customers.

Check out the Master Data Management Focus On document for all our sessions at OpenWorld 2012. 

Tuesday Sep 04, 2012

Wondering What to Expect from Master Data Management at OpenWorld 2012? Hold On to Your Seats…

Oracle OpenWorld 2012, September 30 - October 4, 2012, San Francisco

The countdown begins: just 23 days till OpenWorld hits San Francisco. Oracle OpenWorld 2012 for MDM promises to be chock full of interesting sessions, specifically focused on our customers. We’ve made sure that our sessions are balanced between product information, strategy, real-world stories, and, last but certainly not least, lessons learned straight from our customers.

Stay on top of all that’s OpenWorld when it comes to MDM. We’ll be posting not-to-miss sessions and blogs on what our customer lineup will be like at the big show. We look forward to seeing you at OOW, and in case you didn’t get approval to attend, take advantage of our virtual on-demand conference. See you at OpenWorld 2012!

Wednesday Jun 06, 2012

Oracle Enterprise Data Quality: Ever Integration-ready

It is closing in on a year now since Oracle’s acquisition of Datanomic, and the addition of Oracle Enterprise Data Quality (EDQ) to the Oracle software family. The big move has caused some big shifts in emphasis and some very encouraging excitement from the field.  To give an illustration, combined with a shameless promotion of how EDQ can help to give quick insights into your data, I did a quick Phrase Profile of the subject field of emails to the Global EDQ mailing list since it was set up last September. The results revealed a very clear theme:

 

Integration, Integration, Integration!

As well as the important Siebel and Oracle Data Integrator (ODI) integrations, we have been asked about integration with a huge variety of Oracle applications, including EBS, PeopleSoft, CRM On Demand, Fusion, DRM, Endeca, RightNow, and more - and we have not stood still! While it would not have been possible to develop specific pre-integrations with all of the above within a year, we have developed a package of feature-rich out-of-the-box web services and batch processes that can be plugged into any application or middleware technology with ease. And with Siebel, they work out of the box.

Oracle Enterprise Data Quality version 9.0.4 includes the Customer Data Services (CDS) pack – a ready set of standard processes with standard interfaces, to provide integrated:

  • Address verification and cleansing
  •  Individual matching
  • Organization matching

The services are suitable for either batch or real-time processing, and are enabled for international data, with simple configuration options driving the set of locale-specific dictionaries that are used. For example, large dictionaries are provided to support international name transcription and variant matching, including highly specialized handling for Arabic, Japanese, Chinese, and Korean data. In total, across all locales, CDS includes well over a million dictionary entries.
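As a simple illustration of how an application might call one of these real-time services, here is a minimal Python sketch. The endpoint URL, field names, and JSON response shape are hypothetical stand-ins (the CDS processes are exposed as web services, but the exact interface is not documented here), so treat this as a pattern rather than a definitive API.

  # Illustrative only: the endpoint URL, payload fields, and response shape
  # below are hypothetical, not the documented EDQ CDS interface.
  import requests

  EDQ_REALTIME_URL = "http://edq.example.com/cds/individual-match"  # hypothetical

  def check_for_duplicates(first_name, last_name, address_line, country):
      """Send one individual record to a real-time matching service and
      return the candidate matches with their scores."""
      payload = {
          "firstName": first_name,
          "lastName": last_name,
          "addressLine": address_line,
          "country": country,
      }
      response = requests.post(EDQ_REALTIME_URL, json=payload, timeout=10)
      response.raise_for_status()
      # Assume the service returns a JSON list of {"id": ..., "score": ...}
      return response.json()

  matches = check_for_duplicates("Jane", "Smyth", "10 Main St", "GB")
  for match in sorted(matches, key=lambda m: m["score"], reverse=True):
      print(match["id"], match["score"])

The same pattern applies to the address verification and organization matching services; only the payload fields change.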

 

Excerpt from EDQ’s CDS Individual Name Standardization Dictionary

CDS has been developed to replace the OEM of Informatica Identity Resolution (IIR) for attached Data Quality on the Oracle price list, but does this in a way that creates a ‘best of both worlds’ situation for customers, who can harness not only the out-of-the-box functionality of pre-packaged matching and standardization services, but also the flexibility of OEDQ if they want to customize the interfaces or the process logic, without having to learn more than one product. From a competitive point of view, we believe this stands us in good stead against our key competitors, including Informatica, who have separate ‘Identity Resolution’ and general DQ products, and IBM, who provide limited out-of-the-box capabilities (with a steep learning curve) in both their QualityStage data quality and Initiate matching products.

Here is a brief guide to the main services provided in the pack:

Address Verification and Standardization

EDQ’s CDS Address Cleaning Process

The Address Verification and Standardization service uses EDQ Address Verification (an OEM of Loqate software) to verify and clean addresses in either real-time or batch. The Address Verification processor is wrapped in an EDQ process – this adds significant capabilities over calling the underlying Address Verification API directly, specifically:

  • Country-specific thresholds to determine when to accept the verification result (and therefore to change the input address) based on the confidence level of the API
  • Optimization of address verification by pre-standardizing data where required
  • Formatting of output addresses into the input address fields normally used by applications
  • Adding descriptions of the address verification and geocoding return codes

The process can then be used to provide real-time and batch address cleansing in any application, such as a simple web page that calls address cleaning and geocoding as part of a check on individual data.
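To make the first capability in the list above concrete, here is a small, hedged sketch of country-specific acceptance logic. The threshold values, country codes, and return statuses are invented for illustration; a real deployment would drive this from the confidence levels and verification codes returned by the address verification engine.

  # Sketch of country-specific acceptance logic; all values are illustrative.
  COUNTRY_THRESHOLDS = {"US": 80, "GB": 85, "DE": 90}  # hypothetical confidence cut-offs
  DEFAULT_THRESHOLD = 90

  def choose_address(input_address, verified_address, country, confidence):
      """Return the cleaned address only when the verification confidence meets
      the threshold configured for that country; otherwise keep the address
      exactly as it was entered."""
      threshold = COUNTRY_THRESHOLDS.get(country, DEFAULT_THRESHOLD)
      if confidence >= threshold:
          return verified_address, "CORRECTED"
      return input_address, "UNVERIFIED"

  address, status = choose_address(
      {"line1": "10 downing st", "city": "london"},
      {"line1": "10 Downing Street", "city": "London", "postcode": "SW1A 2AA"},
      country="GB",
      confidence=92,
  )
  print(status, address)  # CORRECTED {'line1': '10 Downing Street', ...}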


 

Duplicate Prevention

Unlike Informatica Identity Resolution (IIR), EDQ uses stateless services for duplicate prevention to avoid issues caused by complex replication and synchronization of large volume customer data. When a record is added or updated in an application, the EDQ Cluster Key Generation service is called, and returns a number of key values. These are used to select other records (‘candidates’) that may match in the application data (which has been pre-seeded with keys using the same service). The ‘driving record’ (the new or updated record) is then presented along with all selected candidates to the EDQ Matching Service, which decides which of the candidates are a good match with the driving record, and scores them according to the strength of match. In this model, complex multi-locale EDQ techniques can be used to generate the keys and ensure that the right balance between performance and matching effectiveness is maintained, while ensuring that the application retains control of data integrity and transactional commits.

The process is explained below:

EDQ Duplicate Prevention Architecture

Note that where the integration is with a hub, there may be an additional call to the Cluster Key Generation service if the master record has changed due to merges with other records (and therefore needs to have new key values generated before commit).
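The call sequence described above can be sketched in a few lines of Python. The two services are represented by placeholder functions and the application data store by a toy in-memory stub; in a real integration these would be the EDQ Cluster Key Generation and Matching web services and the application’s own candidate query.

  # Placeholder stand-in for the Cluster Key Generation service: derive one or
  # more keys from the record (real keys use multi-locale EDQ techniques).
  def generate_cluster_keys(record):
      return {record["last_name"][:4].upper() + record["postcode"][:3].upper()}

  # Placeholder stand-in for the Matching service: score each candidate
  # against the driving record.
  def match_candidates(driving_record, candidates):
      return [(c, 95) for c in candidates if c["last_name"] == driving_record["last_name"]]

  class InMemoryStore:
      """Toy stand-in for application data that has been pre-seeded with keys."""
      def __init__(self, records):
          self.records = [(generate_cluster_keys(r), r) for r in records]

      def find_by_cluster_keys(self, keys):
          return [r for ks, r in self.records if ks & keys]

  def on_record_saved(record, store):
      keys = generate_cluster_keys(record)           # 1. keys for the new/updated record
      candidates = store.find_by_cluster_keys(keys)  # 2. application selects candidates
      scored = match_candidates(record, candidates)  # 3. matching service scores them
      # 4. the application decides what to do: block the save, prompt the user,
      #    or queue the potential duplicates for review.
      return sorted(scored, key=lambda pair: pair[1], reverse=True)

  store = InMemoryStore([{"last_name": "Smith", "postcode": "SW1A 2AA"},
                         {"last_name": "Jones", "postcode": "EC1A 1BB"}])
  print(on_record_saved({"last_name": "Smith", "postcode": "SW1A 9ZZ"}, store))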

Batch Matching

To allow customers to use different match rules in batch than in real time, separate matching templates are provided for batch matching. For example, some customers want to minimize intervention in key user flows (such as adding new customers) in front-end applications, but to conduct a more exhaustive match on a regular basis in the back office. The batch matching jobs are also used when migrating data between systems; in this case a more precise (and automated) type of matching is normally required, in order to minimize the review work performed by data stewards. In batch matching, data is captured into EDQ using its standard interfaces, and records are standardized, clustered, and matched in an EDQ job before matches are written out. As with all EDQ jobs, batch matching may be called from Oracle Data Integrator (ODI) if required.

When working with Siebel CRM (or master data in Siebel UCM), Siebel’s Data Quality Manager is used to instigate batch jobs, and a shared staging database is used to write records for matching and to consume match results. The CDS batch matching processes automatically adjust to Siebel’s ‘Full Match’ (match all records against each other) and ‘Incremental Match’ (match a subset of records against all of their selected candidates) modes.
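For illustration only, the difference between the two modes can be reduced to how candidate pairs are formed. The sketch below is a deliberate simplification with hypothetical function names; real EDQ jobs cluster records first rather than building a true cross-product.

  from itertools import combinations

  def full_match_pairs(records):
      """'Full Match' mode: every record is compared against every other record.
      (In practice records are clustered first to avoid a true cross-product.)"""
      return list(combinations(records, 2))

  def incremental_match_pairs(changed_records, candidate_lookup):
      """'Incremental Match' mode: only new or changed records are compared,
      and only against their selected candidates."""
      return [(record, candidate)
              for record in changed_records
              for candidate in candidate_lookup(record)]

  all_records = ["r1", "r2", "r3"]
  print(full_match_pairs(all_records))                      # 3 pairs
  print(incremental_match_pairs(["r3"], lambda r: ["r1"]))  # 1 pair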

The Future

The Customer Data Services Pack is an important part of the Oracle strategy for EDQ, offering a clear path to making Data Quality Assurance an integral part of enterprise applications, and providing a strong value proposition for adopting EDQ. We are planning various additions and improvements, including:

  • An out-of-the-box Data Quality Dashboard
  • Even more comprehensive international data handling
  • Address search (suggesting multiple results)
  • Integrated address matching

The EDQ Customer Data Services Pack is part of the Enterprise Data Quality Media Pack, available for download at http://www.oracle.com/technetwork/middleware/oedq/downloads/index.html.

Tuesday Mar 27, 2012

Oracle Enterprise Data Quality: A Leader in Customer Satisfaction

It’s always good to hear feedback from practitioners, the ones in the trenches who have experienced both the good and the bad sides of enterprise software. Gartner recently released a report that surveyed 260 data quality professionals from around the world and found that most expressed considerable overall satisfaction with their data quality tool vendors. A couple of key findings stand out, however, including Datanomic (acquired by Oracle) leading the pack in overall customer satisfaction among data quality tools. Read all about it here: http://bit.ly/Ay45SG


Tuesday Dec 06, 2011

All the Ingredients for Success: Data Governance, Data Quality, Master Data Management

Oracle recently sponsored a study by independent analyst Dan Power of Hub Designs: All the Ingredients for Success: Data Governance, Data Quality and Master Data Management. Please click to download this latest paper from our Oracle Master Data Management resource kit. The paper discusses the role data quality plays as an equal component of an enterprise information management or master data management initiative, typical business considerations for data quality, and best practices for integrating multi-domain MDM with customer/party data quality and product data quality.

Excerpt from the paper:

Today, businesses need to have the Single View of their master data, and to be able to “Create Once, Use Many” in order to be lean, and indeed, to survive. The silos that are still so prevalent do nothing but increase costs, hurt business and IT agility, and result in bad decisions being made (because of low quality, inaccurate, inconsistent data). If we implement Enterprise Data Quality tools, master data management solutions, and remember that these technologies must be driven by business-led data governance organizations, we’ll be able to break down those silos, and have enterprise-wide master data repositories that lead to streamlined processes, lower costs, better decisions, and more agile organizations that can compete better in global markets.

A quick example: A UK travel company restated its 2009 financial results due to issues arising from using two separate IT systems after the 2007 merger of two companies, in the amount of $191.8 million US dollars (£117m). This type of persistent operating silo is all too common in the companies I see, but this is probably a worst case scenario. But look for those costs that MDM, data governance and data quality can help your enterprise avoid, the revenue increases they can help you realize, and the compliance improvement and agility increases they can bring to your company. Those reductions, improvements and increases are very real. I’ve worked with enough companies now to know firsthand that MDM is not a fad or a trend, but a real force in the marketplace. And it’s experiencing growth of 18% at a time when the enterprise software market as a whole is only growing at about 5.6%.

Download the entire paper today.

 Or learn more about Oracle’s MDM strategy by going to www.oracle.com/goto/mdm

Saturday Nov 19, 2011

Reduce ERP Consolidation Risks with Oracle Master Data Management

Reducing the risk of ERP consolidation starts first and foremost with your data. This is nothing new; companies with multiple misaligned ERP systems are often putting inordinate risk on their business. It can translate to too much inventory, long lead times, and shipping issues from poorly organized and specified goods. And don’t forget the finance side! When goods are shipped and promises are kept (or not kept), there’s the issue of accounting. No single chart of accounts translates to no accountability.

So – I’ve decided. I need to consolidate!

Well, you can’t consolidate ERP applications (or, for that matter, any of your applications) without first considering your data. This means looking at how your data is being integrated by these ERP systems, how it is being synchronized, and what information is being shared or not shared. Most importantly, it means making sure that the data is mastered. What is the best way to do this? In the recent webcast Reduce ERP Consolidation Risks with Oracle Master Data Management, we outlined three key guidelines:

#1: Consolidate your Product Data

#2: Consolidate your Customer, Supplier (Party Data)

#3: Consolidate your Financial Data

Together these help customers achieve reduced risk, better customer intimacy, lower inventory levels, elimination of product variations, and, finally, a single master chart of accounts. Oracle's customer Zebra Technologies, for example, was able to consolidate over 140 applications by mastering its data, ultimately achieving 60% cost savings on its IT spend for the year.

Oracle’s Solution for ERP Consolidation: Master Data Management

Oracle's enterprise master data management (MDM) can play a big role in ERP consolidation. It includes a set of products that consolidates and maintains complete, accurate, and authoritative master data across the enterprise and distributes this master information to all operational and analytical applications as a shared service.

It’s optimized to work with any application source (not just Oracle’s) and can integrate using technology from Oracle Fusion Middleware (e.g., GoldenGate for data synchronization and real-time replication, or ODI with its E-LT optimized bulk data and transformation capability). In addition, especially for ERP consolidation use cases, it’s important to leverage the AIA and SOA capabilities of Fusion Middleware to connect these multiple applications together and relay the data into the correct hub.

Oracle’s MDM strategy is a unique offering in the industry, one that has common elements across the stack, from Middleware and BI/DW to Engineered Systems, combined with Enterprise Data Quality to enable comprehensive data governance at all levels. In addition, Oracle MDM provides best-in-class capabilities to master all variations of data, including customer, supplier, product, and financial data. But ultimately at the center of Oracle MDM is your data: making it more trusted, making it secure and accessible as part of a role-based approach, and getting it to make sense to you in any situation, whether it’s a specific ERP process like the one we discussed or something custom to your organization.

To learn more about these techniques for ERP consolidation, watch our webcast or go to our Oracle MDM website at www.oracle.com/goto/mdm.

Friday Nov 11, 2011

Bad Data is Really the Monster


“Bad Data Is Really the Monster” is an article written by Bikram Sinha, from whom I borrowed the title and the inspiration for this blog post. Sinha writes:

“Bad or missing data makes application systems fail when they process order-level data. One of the key items in the supply-chain industry is the product (aka SKU). Therefore, it becomes the most important data element to tie up multiple merchandising processes including purchase order allocation, stock movement, shipping notifications, and inventory details… Bad data can cause huge operational failures and cost millions of dollars in terms of time, resources, and money to clean up and validate data across multiple participating systems.”

Yes, bad data really is the monster. So what do we do about it? Close our eyes and hope it stays in the closet?

We’ve tackled this problem at Oracle for some years now, and our latest introduction of Oracle Enterprise Data Quality, along with our integrated Oracle Master Data Management products, provides a complete, best-in-class answer to the bad data monster.

What’s unique about it?

Oracle Enterprise Data Quality combines powerful data profiling, cleansing, matching, and monitoring capabilities while offering unparalleled ease of use. What makes it unique is that it has dedicated capabilities to address the distinct challenges of both customer and product data quality (different monsters have different needs, of course!).

The ability to profile data is just as important, both to identify and measure poor-quality data and to identify new rules and requirements. Semantic and pattern-based recognition is included to accurately parse and standardize data that is poorly structured. Finally, all of the data quality components are integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM.
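As a generic illustration of what pattern-based standardization means in practice (this is not EDQ’s parser, just a hypothetical example), the short sketch below pulls a pack quantity, size, and unit out of free-text product descriptions so that differently formatted values end up in one consistent shape:

  import re

  # Hypothetical, generic illustration of pattern-based standardization.
  # It extracts a quantity and unit from free-text product descriptions so
  # that "2x500 ML" and "2 X 500ml" end up in the same standardized shape.
  PACK_PATTERN = re.compile(
      r"(?P<count>\d+)\s*[xX]\s*(?P<size>\d+(?:\.\d+)?)\s*(?P<unit>ml|l|g|kg)",
      re.IGNORECASE,
  )

  def standardize_pack(description):
      match = PACK_PATTERN.search(description)
      if not match:
          return None  # flag for review rather than guessing
      return {
          "count": int(match.group("count")),
          "size": float(match.group("size")),
          "unit": match.group("unit").lower(),
      }

  print(standardize_pack("Spring water 2x500 ML bottles"))  # {'count': 2, 'size': 500.0, 'unit': 'ml'}
  print(standardize_pack("SPRING WATER 2 X 500ml"))         # {'count': 2, 'size': 500.0, 'unit': 'ml'}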

Want to learn more? On Tuesday, Nov 15th, I invite you to listen to our webcast, Reduce ERP Consolidation Risks with Oracle Master Data Management. I’ll be joined by our partner iGate Patni, and we’ll be talking about one specific way to deal with the bad data monster: ERP consolidation. I look forward to seeing you there!

Tuesday May 17, 2011

Boiling the Ocean

When MDM comes up in conversations around the planning table, it is not uncommon for someone to raise the objection that they aren’t interested in trying to “boil the ocean.” What they are saying is that solving all the data quality issues across the enterprise is a job so big that it has little chance of success. The fear is that anyone who takes on such a project is doomed to fail. To overcome this objection, MDM proponents must point out that the best practice is to start small by solving a specific Data Quality (DQ) issue, then use the ROI to fund additional MDM projects, growing into a full information architecture over time.

It might interest you to know that we have the same discussions inside development. Ten years ago, suggesting that we invest in MDM to solve the world’s DQ problems was met with the exact same “boiling the ocean” objection. And as it is with MDM deployments, so it is with MDM development. Oracle embarked on a long-term strategic initiative that started with key products focused on tangible, realizable goals.

Home Grown Beginning

Oracle’s phased approach started with a customer data quality problem inside our own E-Business Suite of applications. We built a program called Oracle Customers Online to manage the data in our customer model. Another early initiative was to create a program to standardize product data in the E-Business Suite item master. Its first release was called Oracle Advanced Product Catalog. Adding data quality, source system management, and application integration capabilities, these two products grew into the Oracle Customer Data Hub and the Oracle Product Hub.

The Acquisition Phase

Oracle took a large step in its march to MDM market leadership with the Siebel acquisition. Siebel’s Universal Customer Master brought Oracle a business-to-consumer orientation with significant MDM capabilities. These included key support for customer-facing business processes such as ‘Customer Loyalty’ and ‘Customer Privacy’. The second major MDM acquisition was Hyperion Data Relationship Management (DRM). This gave Oracle a powerful analytical MDM capability as well as the ability to manage master financial reference data.

The Expansion Phase

With a solid foundation for customer, product, and financial data MDM, Oracle entered its MDM expansion phase with significant growth along multiple dimensions. The depth of functionality in each existing hub was increased. Industry-specific versions were introduced for Communications, Retail, and Higher Education. New MDM hubs were released with the Site Hub and Supplier Hub. Application integration was substantially enhanced by leveraging Oracle’s Fusion Middleware offerings for SOA and Data Integration: SOA enabled ‘MDM Aware’ applications, and Data Integration provided maps into Oracle Business Intelligence (OBI) applications. And data governance was addressed with the Data Governance Manager.

Fusion Master Data Management

Oracle is now on a path to create Fusion Applications versions of all our MDM hubs. The Fusion Product Hub and the Fusion Customer Hub are already available. With recent data quality acquisitions providing the full range of DQ capabilities needed across the MDM dimensions, now being consolidated into the Fusion Middleware platform for all MDM hubs, Oracle will possess the world’s only truly multi-dimensional MDM suite on a single, lowest-cost-of-ownership platform.

The Oracle MDM investment has been significant and sustained over time because of the benefits MDM brings to our customers. We can solve the enterprise data quality problem. We boiled the ocean.

About

Get the latest on all things related to Oracle Master Data Management. Join Oracle's MDM Community today.

Follow us on Twitter | Catch us on YouTube
