Friday May 31, 2013

Improving Customer Experience for Segment of One Using Big Data

Customer experience has been one of the top focus areas for CIOs in recent years. A key requirement for improving customer experience is understanding the customer: their past and current interactions with the company, their preferences, demographic information, and so on. This capability helps the organization tailor its services and products for different customer segments to maximize their satisfaction. This is not a new concept. However, there have been two parallel changes in how we approach and execute on this strategy.

The first is the big data phenomenon, which brought the ability to obtain a much deeper understanding of customers, especially by bringing in social data. As the Forbes article "Six Tips for Turning Big Data into Great Customer Experiences" mentions, big data has especially transformed online marketing. With the large volume and variety of data now available, companies can run more sophisticated analyses at a more granular level. This leads to the second change: the size of customer segments. It is shrinking down to one, where each individual customer is offered a personalized experience based on their individual needs and preferences. This notion brings more relevance into day-to-day interactions with customers, and takes customer satisfaction and loyalty to a level that was not possible before.

One of the key technology requirements for improving customer experience at such a granular level is obtaining a complete and up-to-date view of the customer. That requires integrating data across disparate systems in a timely manner. The data integration solution should move and transform large data volumes stored in heterogeneous systems in geographically dispersed locations. Moving data with very low latency to the customer data repository or a data warehouse enables companies to have relevant and actionable insight for each customer. Instead of relying on yesterday's data, which may no longer be pertinent, the solution should analyze the latest information and turn it into a deeper understanding of that customer. With that knowledge the company can formulate real opportunities to drive higher customer satisfaction.

Real-time data integration is a key enabling technology for real-time analytics. Oracle GoldenGate's real-time data integration technology has been used by many leading organizations to get the most out of their big data and build a closer relationship with customers. One good example in the telecommunications industry is MegaFon, Russia's top provider of mobile internet solutions. The company deployed Oracle GoldenGate 11g to capture billions of monthly transactions from eight regional billing systems. The data was integrated and centralized onto Oracle Database 11g and distributed to business-critical subsystems. The unified and up-to-date view into customers enabled more sophisticated analysis of mobile usage information and facilitated more targeted customer marketing. As a result, the company increased revenue generated from its current customer base. Many other telecommunications industry leaders, including DIRECTV, BT, TataSky, SK Telecom, and Ufone, have improved customer experience by leveraging real-time data integration.

Telecommunications is not the only industry where a single view of the customer drives more personalized interaction. Woori Bank implemented Oracle Exadata and Oracle GoldenGate. In the past, it had been difficult for them to revise and incorporate changes to marketing campaigns in real time because they were working with the previous day’s data. Now, users can immediately access and analyze transactions for specific trends in the data mart access layer and adjust campaigns and strategies accordingly. Woori Bank can also send tailored offers to customers.

This is just one example of how real-time data integration can transform business operations and the way a company interacts with its customers. I would like to invite you to learn more about data integration facilitating improved customer experience by reviewing our free resources here and following us on Facebook, Twitter, YouTube, and LinkedIn.

Image courtesy of jscreationzs at FreeDigitalPhotos.net

Friday May 24, 2013

Fast Data and the Furious 6 Ideas

 

If it’s hot cars, fast action, and bad acting you want, you are definitely in the wrong movie theater. But if it’s real-time insights and high value from high-velocity data you are looking for, check out our own little action movie below. Fast data is the latest term that has hit the streets, and it is ripping a hole in the traditional pavement of how we think about our information architecture. It’s about embracing the element of real time and high velocity when it comes to all forms of data. It's about finding new ways to act on that elusive data across machines, applications, and our customers. This data is constantly in flux, changing every minute.

In the last few weeks, we’ve seen 6 new ideas come across that really unveil some new thinking around Fast Data.

1) Real-time data insights are the most critical for making decisions (especially for employees). A recent study by the Economist Intelligence Unit, In Search of Insight and Foresight: Getting More out of Big Data, found that 48% of surveyed employees cited real-time data (i.e., customer interactions) as the most critical for making decisions. This ranked above all other forms of data insight, including qualitative, historical, trend, and predictive analytics. For CEOs, CIOs, and CFOs, however, this real-time information ranked much lower in importance. What this tells us is that the way data is treated in the organization needs to be carefully planned and considered.

2) Machines need to be smarter if they are going to keep up with us – In the recent article by Oracle Profit Magazine, Data at the Speed of Life Ed Zou, vice president of Oracle Fusion Middleware product management, writes, “the right technology needs to be put into place not only to harvest machine-to-machine (M2M) fast data but also to filter it in some way so that only the most important information is available to be acted on in real time. Otherwise, tens of thousands of devices capturing data every second and all the fast data being sent unfiltered could cause not only a deluge at the data center, but an overwhelming amount of traffic on the network”.

3) Fast Data requires an event decision architecture - Pete Utzschneider, Vice President of Product Management at Oracle, suggests in this article that combining fast data technologies with M2M makes it possible to create an "event decision architecture" that can be used to drive changes in operations and customer service. It is possible to collect the data from a massive number of devices as it becomes available, run it through Oracle Event Processing on a server, and make decisions about large and small issues that affect profitability.

4) Fast processes dictate a Fast Data approach to applications – In the article by Tony Baer, Ovum Principal Analyst, Oracle puts applications in the Fast Data lane, Baer talks about the advantages of putting in-memory technologies to work for business applications. These in-memory applications keep all of their logic in memory and write transactions straight to Flash storage, using disk only for ‘cold data’. Oracle’s in-memory applications include selected offerings from the PeopleSoft, JD Edwards, Oracle E-Business Suite, Siebel, and Hyperion portfolios. These new applications can run 20x faster using this technology.

5) In-memory is not just a passing fad - Take a look at Stephanie Mann’s (@StephMannTT) recent article, Look out, Big Data: In-memory data grids start to go mainstream, in which she writes, “the rise of in-memory data grids (IMDGs) -- or in-memory distributed caches -- is part of a larger trend in big data technology. Gartner reports that markets aligned to data integration and data quality tools are on an upward swing, set to push IT spending to $3.8 trillion by 2014”. And adding to Mann's point: Oracle Coherence is a great example of this technology at work, and is part of the overall Oracle Fast Data Solution, which works especially well with Oracle Event Processing.

6) Big Data acceleration needed - In his recent article Big Data Good, Fast Big Data Better, Adrian Bridgewater talks about how ‘speed’ is what he calls the ‘unloved second cousin of Big Data’. He goes on to say that “Analytics without real-time analytics is like a car at full throttle without a steering wheel, i.e., we need to be able to react to data in the real world and navigate through it without crashing.” Thank you, Adrian, for this wonderful car analogy; it goes perfectly with Fast Data and the Furious 6 Ideas. I couldn't agree more with the headline that big data and fast data are better together.

To learn more about fast data go to www.oracle.com/Fastdata or follow me on www.twitter.com/@dainsworld


Thursday May 16, 2013

Sabre Holdings Case Study Webcast Recap

Last week at Oracle we had a very important event. In addition to the visit by the Roaming Gnome, who really enjoyed posing for pictures on our campus, I had the privilege to host a webcast with guest speaker Amjad Saeed from Sabre Holdings. We focused on Sabre's data integration solution leveraging Oracle GoldenGate and Oracle Data Integrator for their enterprise travel data warehouse (ETDW).

Amjad, who leads the development effort for Sabre's enterprise data warehouse, presented how they approached various data integration challenges, such as a growing number of sources and data volumes, and what results they were able to achieve. He shared with us how using Oracle's data integration products in heterogeneous environments enabled right-time market insights, reduced complexity, and decreased time to market by 40%. Sabre was also able to standardize development for its global DW development team, achieve a real-time view into the execution of the integration processes, and gain the ability to manage data warehouse and BI performance on demand. I would like to thank Amjad again very much for taking the time to share their data integration best practices with us on this webcast.

In this webcast my colleague Sandrine Riley and I provided an overview of Oracle Data Integration products' differentiators. We explained the architectural strengths that deliver a complete and integrated platform with high performance, fast time to value, and low cost of ownership. If you did not have a chance to attend the live event, the webcast is now available on demand via this link for you to watch at your convenience:

Webcast Replay: Sabre Holdings Case Study: Accelerating Innovation using Oracle Data Integration

There were many great questions from our audience. Unfortunately we did not have enough time to respond to all of them. While we are individually following up with the attendees, I also want to post answers to some of the most commonly asked questions here.

    Question: How do I learn Oracle Data Integrator or GoldenGate? Is training the only option?

    Answer: We highly recommend training through Oracle University. The courses will cover all the foundational components needed to get up and running with ODI or GoldenGate. Oracle University offers instructor-led and online training. You can go to http://education.oracle.com for a complete listing of the available courses. Additionally – but not as a replacement for training – you can get started with a guided ‘getting started’ tutorial, which you can find in the ‘Getting Started’ section of our OTN page. Also, there are some helpful ‘Oracle by Example’ exercises/videos on the same page.

    For Oracle GoldenGate, we recommend watching the instructional videos on its YouTube channel: Youtube/oraclegoldengate. A good example is here.

    Last but not least, at Oracle OpenWorld there are opportunities to learn in depth by attending our hands-on labs, even though they do not compare to or replace formal training.

    Question: Compare and contrast Oracle Data Integrator to the Oracle Warehouse Builder ETL process. Is ODI repository-driven and based on creation of "maps" when creating ETL modules?

    Answer: ODI has been built from the ground up to be heterogeneous, so it excels on both Oracle and non-Oracle platforms. OWB has been a more Oracle-centric product. ODI mappings are developed with a declarative, design-based approach, and processes are executed via an agent, which orchestrates and delegates the workload in a set-based manner to the database. OWB deploys packages of code on the database and produces more procedural code. For more details, please read our ODI architecture white paper.

    Question: Is the metadata navigator for ODI flexible and comprehensive enough that it could be used as a documentation tool for the ETL?

    Answer: Oracle Data Integrator's metadata navigator has been renamed; it is now called ODI Console. The ODI Console is a web-based interface for viewing, in a more visual manner, what is inside the ODI repository. It could be used as documentation. Beyond the ODI Console, ODI provides full documentation from ODI Studio. For any given process, project, etc., you are able to right-click, and there is an option that says ‘print’, which will provide you with a PDF document including details down to the transformations. These documents may be a more appropriate method of documentation. Also, please check out our white paper on Managing Metadata with ODI.

If you would like to learn more about Oracle Data Integration products, please check out our free resources here. I also would like to remind you to follow us on social media if you do not already. You can find us on Facebook, Twitter, YouTube, and LinkedIn.



Monday May 13, 2013

ODI Joins Oracle BI Applications

 

Today Oracle announced: New and Enhanced Oracle Business Intelligence Applications Help Organizations Extract Strategic Insights from Business Data.

The new release of Oracle BI Applications 11g, release 11.1.1.7.1, has been designed to add new in-memory analytic applications, expand functional content across existing front- and back-office Oracle BI Applications, and leverage the advantages of Oracle Data Integrator Enterprise Edition (ODI). Oracle Business Intelligence (BI) Applications provide complete, real-time, and enterprise-wide insight for all users, enabling fact-based actions and intelligent interaction. Designed for rapid deployment at a low cost of ownership, Oracle Business Intelligence Applications are prebuilt solutions that start with the customer, embrace any existing corporate data source, and are seamlessly integrated with Oracle’s transactional solutions to increase effectiveness across the entire customer life cycle.

ODI Advantages for BI Applications

Leveraging the power of ODI, the new release of BI Applications enables customers to increase IT efficiency and reduce costs with a comprehensive data integration platform that covers all data integration requirements – including big data, application integration, as well as BI / data warehousing.

ODI helps to integrate data end-to-end across the full BI Applications architecture, supporting capabilities such as data lineage, which helps business users trace report data back to its sources. And while Oracle has re-architected the solution to use ODI, the major concepts remain the same. For example, mappings and the data warehouse still support multiple sources, full and incremental loads, and slowly changing dimensions, and can still be customized.

In addition, customers can choose the option to replicate their data in real-time using Oracle GoldenGate. Oracle GoldenGate products offer real-time data integration, transactional data replication, and data comparison across heterogeneous systems. Oracle GoldenGate enables real-time business intelligence for improved business insight, query offloading to maximize OLTP performance, zero-downtime data migration, disaster recovery, and active-active database synchronization for continuous availability.

There are five key elements that differentiate Oracle Data Integration from other data integration offerings and make it the logical strategic choice as the foundation for Oracle BI Applications.

  1. Completeness. Oracle offers a comprehensive set of data integration capabilities that provides real-time and bulk data movement, data transformation, data services, data federation, data quality, real-time data replication and bidirectional synchronization, big data integration, and cloud data integration capabilities to support virtually any data integration needs.
  2. Best in class performance, availability, and reliability. Both ODI and Oracle GoldenGate are architected for maximum performance, availability, and reliability. ODI’s unique extract, load, transform (E-LT) architecture delivers multifold performance improvements over traditional extract, transform, load (ETL) solutions, and Oracle GoldenGate’s log-based real-time change data capture functionality enables thousands of transactions to be processed per second with transactional integrity. And Oracle GoldenGate has automatic recoverability in case of network outages or other interruptions.
  3. Fast time to value. ODI’s declarative design approach shortens implementation times. Its Oracle JDeveloper–based integrated development environment (IDE), called Oracle Data Integrator Enterprise Edition Studio, is designed to dramatically increase developer productivity. Oracle GoldenGate has a simple yet robust architecture that allows deployment in a matter of weeks.
  4. Open and standards-based. A key strength for Oracle Data Integration products is the way they integrate heterogeneous sources and targets. Oracle’s products offer streamlined, low-impact data extract and delivery capabilities for all major platforms and databases. ODI also provides Hadoop-based transformation and loading capabilities and integrates data from a variety of sources such as JMS queues, XML files, and NoSQL. The products are optimized to support Oracle and non-Oracle technologies and applications.
  5. Low total cost of ownership. Many third-party technologies have low acquisition costs but cannot scale, are not reliable, or involve high maintenance costs. Oracle’s data integration offerings provide future-ready products that are very easy to manage and scale, and have a lower total cost of ownership. Notably for BI Applications, the number of intermediary servers can be reduced significantly by ODI’s unique E-LT architecture, because ODI does not depend on agents running on separate servers.
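The set-based E-LT idea mentioned in point 2 can be made concrete with a small sketch. The snippet below uses SQLite in Python purely for illustration (the table names, columns, and conversion rate are invented for this example; ODI itself generates native set-based SQL for whatever the target platform is): instead of pulling every row out to a middle-tier server, transforming it in application code, and pushing it back, the transformation is expressed as one SQL statement that runs entirely inside the database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE target (id INTEGER, amount_usd REAL)")
cur.executemany("INSERT INTO staging VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])

# Row-by-row ETL style (the pattern E-LT avoids) would fetch each staging
# row into the client, convert it in application code, and issue one
# INSERT per row - paying a network/marshalling cost for every record.

# Set-based E-LT style: a single SQL statement; the transformation
# (here a hypothetical currency conversion) runs inside the database.
cur.execute("""
    INSERT INTO target (id, amount_usd)
    SELECT id, amount * 1.1 FROM staging
""")
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 3
```

On an engineered system such as Exadata this matters even more, since the set-based statement can exploit the database's own parallelism rather than a separate transformation server.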

Even Greater Benefits on Oracle Engineered Systems

Oracle BI Applications are optimized to work especially well with two Oracle Engineered Systems: Oracle Exalytics and the Oracle Exadata Database Machine. It is Oracle Exadata in particular where there is an even greater advantage to using Oracle Data Integration products with Oracle BI Applications. Because of ODI’s E-LT architecture, query performance is up to 7-10x faster, and consequently transformations are faster and more efficient. Only an E-LT architecture can deliver this type of performance advantage, where set-based transformations run natively on Oracle Exadata. Secondly, only Oracle GoldenGate delivers advanced real-time data integration capabilities such as the Integrated Capture process for low-impact change data capture, and support for compression and Oracle Advanced Security. In addition, GoldenGate offers zero-downtime migration to Oracle Exadata with its active-active database replication feature.

Get More Resources on Oracle Business Intelligence Applications and Oracle Data Integration:

Friday May 10, 2013

ODI 11g – XML Model Expert Accelerator

The blog post here provided a model creation accelerator for relationally oriented technologies; I have uploaded another one for XML-oriented data. The expert can be found on the java.net site here.

This expert greatly simplifies getting up and running with XML in ODI - well, I think so anyway. It asks the basic questions (what's the file name, and so on), then allows additional configuration if you want. You can skip the advanced and external database storage options and you will have a model with the topology objects built, so you can get right into reverse engineering the datastores.

The tool tips come straight from the ODI documentation.

If the advanced option was selected, after pressing OK above you will be shown some common advanced properties; you can change these, and they will be used when building the URL for the XML data server (see the full doc here). The actual root element in your document will be displayed in the root element entry.

The external database dialog has the details of the relational system you wish to use. This covers the basic properties; there are many more potential configuration options, but my goal here is to illustrate the expert, which will get you up and running. Enter the ODI-encoded password.

Once you have done this you can reverse-engineer your model... very simple!

The expert is written in Groovy using the ODI SDK and SwingBuilder, which is very simple to use for creating these types of accelerators. Let us know what you think.

Tuesday May 07, 2013

ODI - Integrating more social data

Carrying on from the earlier post on integrating social data from the Facebook Graph API, the functions you need to integrate JSON are here for download. I have wrapped the logic into two functions; there is a readme in the zip describing what you need to do:

  1. JSON_to_XML
  2. Zipped_JSON_to_XML

After you have imported the ODI user functions, you can easily call them from your code, in ODI procedures for example. I have a procedure with a Groovy task that simply calls the function as follows:

  • JSON_to_XML("http://graph.facebook.com/search?q=exadata&type=post", "D:\\output\\fb_exadata.xml", "", "facebook");

The first parameter is a URI (not a regular filesystem path) representing the input data to be processed, and the second parameter is a filesystem path for the generated file. The 3rd and 4th parameters configure the generated XML: the unnamed element and root node names.

Here is an example of a zipped input taken from the filesystem:

  • Zipped_JSON_to_XML("file:///D:/input/fb_exadata.zip", "D:\\output\\fb_exadata.xml", "", "facebook");

The download is in the ODI user function samples download on the java.net site. It's a community area, so try it out, make changes, and let me know how it goes. There are a few challenges in mapping names from JSON to XML, so the code has some support for that, but it could be better.

The 3rd and 4th parameters let us handle JSON with arrays of unnamed elements:

  • [{"id":"345", "name":"bob"},{"id":"123", "name":"jim"}]

so we can generate the following XML, passing company as the parameter for the array name and emp as the unnamed element parameter; most commonly only the 4th parameter needs a value:

  <?xml version='1.0' encoding='UTF-8'?>
  <company>
  <emp><id>345</id><name>bob</name></emp>
  <emp><id>123</id><name>jim</name></emp>
  </company>
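For readers who want to see the shape of this JSON-to-XML mapping outside of ODI, here is a minimal Python sketch. It is my own illustration of the transformation the user functions perform (the function name, parameter order, and element handling below mirror the example above, not the actual Groovy implementation in the download, which also handles URIs, zips, and name-mapping edge cases):

```python
import json
from xml.etree import ElementTree as ET

def json_to_xml(json_text, root_name, item_name):
    """Convert a JSON array of objects into XML, naming the enclosing
    array element 'root_name' and each unnamed object 'item_name'."""
    data = json.loads(json_text)
    root = ET.Element(root_name)
    for obj in data:
        item = ET.SubElement(root, item_name)  # unnamed array element
        for key, value in obj.items():         # each field becomes a child
            child = ET.SubElement(item, key)
            child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml = json_to_xml('[{"id":"345","name":"bob"},{"id":"123","name":"jim"}]',
                  "company", "emp")
print(xml)
# <company><emp><id>345</id><name>bob</name></emp><emp><id>123</id><name>jim</name></emp></company>
```

The real user functions add the XML declaration and write the result to the output file path given in the second parameter.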

There are a few other posts I will share when I get a moment, including one on using named pipes and one on an expert for specific technologies (such as XML). I did this post on fast model creation, which is very useful for relationally oriented systems, but for XML we can have a much more specific expert incorporating many of the sensible defaults and an option for using a system other than in-memory.

Wednesday May 01, 2013

When The Roaming Gnome Conquers Data Integration

It is always fascinating to see how our customers turn Oracle Data Integration products into a major force for their critical initiatives. I particularly like the success stories that tie back to the products or services that I use in my personal life. A little gnome that travels around the world is my new hero when it comes to seeing Oracle Data Integration in action in day-to-day life. Well, of course, I am referring to the Travelocity Gnome that we are familiar with from TV ads. And we know that behind this little gnome is a great, innovative IT team serving Sabre Holdings, which owns Travelocity. They deserve the praise for supporting business innovation with cutting-edge data warehousing/BI solutions.

the Traveling Gnome in the office (Photo thanks to Flickr user Ian Kershaw, available under by-nc-sa v2.0)


Sabre Holdings demonstrated its ability to excel in implementing data integration solutions by winning the Oracle Excellence Award for Fusion Middleware Innovation in the Data Integration category in 2011. We now have the privilege to hear directly from Sabre how they used Oracle Data Integrator and Oracle GoldenGate for the critical enterprise data warehouse that drives all kinds of innovative products and services for Sabre employees, partners, and customers. Sabre partnered with Oracle to achieve major improvements: reduced complexity, better handling of growing data volumes, and a 40% decrease in time to market.

Next week on May 8th we will host a free webcast where you can hear Sabre's Amjad Saeed, who manages development for their enterprise data warehouse, present how they leveraged advanced data integration approaches in achieving their data warehouse solution goals.

Sabre Holdings Case Study: Accelerating Innovation with Oracle Data Integration

May 8th 1pm ET/ 10am PT

If you have not seen Oracle Data Integration in action, this is a must-see event to attend. I also would like to remind you that this year's Oracle Excellence Awards for Oracle Fusion Middleware Innovation are open for submissions. You can submit your nomination by June 18th here.


About

Learn the latest trends, use cases, product updates, and customer success examples for Oracle's data integration products -- including Oracle Data Integrator, Oracle GoldenGate, and Oracle Enterprise Data Quality.
