Monday May 12, 2014

Check it out – BI Apps 11.1.1.8.1 is now available!

As of May 8, 2014, Oracle Business Intelligence (BI) Applications 11.1.1.8.1 is available on the Oracle Software Delivery Cloud (eDelivery), and on the Oracle BI Applications OTN page. This is the second major release on the 11g code line leveraging the power of Oracle Data Integrator (ODI), and certified with the latest version of Oracle BI Foundation 11.1.1.7. For more details on this release and what’s new – check it out!

Friday Apr 25, 2014

Long Running Jobs in Oracle Business Intelligence Applications (OBIA) and Recovery from Failures

Written by Jayant Mahto, Oracle Data Integrator Product Management

In Oracle Business Intelligence Applications (OBIA) 11.1.1.7.1, the data warehouse load is performed using Oracle Data Integrator (ODI). In ODI, using packages and load plans, one can create quite a complex load job that kicks off many scenarios in a coordinated fashion. Such a job may run for a long time, and if it fails before completing, it will require a restart to recover from the failure and finish successfully.

This blog uses the complex load plan defined in Oracle Business Intelligence Applications (OBIA) 11.1.1.7.1 to illustrate the method of recovery from failures. Similar methods can be used to recover complex load plans defined independently in Oracle Data Integrator (ODI). Note that this post does not go into the details of troubleshooting a failed load plan; it only discusses the different restart parameters that affect the behavior of a restarted job.

Failures can happen for the following reasons:

  • Access failure – source or target database down, network failure, etc.
  • Agent failure.
  • Database problem – running out of space or some other DB-related issue.
  • Data-related failure – exceptions not handled gracefully, such as a null value in a NOT NULL column.

It is important to find out the reason for the failure and address it before attempting to restart the load plan; otherwise the same failure may happen again. To recover from a failure successfully, the restart parameters in the load plan steps need to be selected carefully. Developers select these parameters at design time. The goal is to make restarts robust enough that an administrator can restart the load plan without knowing the details of the failed steps. It is therefore the developer's responsibility to choose restart parameters that guarantee the correct set of steps is re-run during restart, so that data integrity is maintained.

In the case of OBIA, the generated load plans already have appropriate restart parameters for the out-of-the-box steps. If you are adding custom steps, you need to choose similar restart parameters for them.

Now let us look at a typical load plan and the restart parameters at various steps.

Restart of a serial load plan step:


The SDE Dimension Group step highlighted above is a serial step. Let us say the load plan failed while running the 3 SDE Dims GEO_DIM step. Since this is a serial step and it has been set to “Restart from Failure”, on restart the load plan would start from 3 SDE Dims GEO_DIM and not run 3 SDE Dims USER_DIM again. This parameter is widely used in the OBIA serial steps.

The other restart parameter for a serial step is “Restart all children”. This causes all child steps to be re-run during restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to use it. The two serial modes are contrasted in the sketch below.
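To make the two serial modes concrete, here is a minimal, purely illustrative Python sketch. It is not the ODI engine or SDK; it just models which children of a serial step would be re-run under each setting, using the USER_DIM and GEO_DIM step names from the example above.

```python
# Purely illustrative sketch of serial-step restart semantics; it does not
# call the ODI engine or SDK, it only models which child steps re-run.

def serial_restart(children, failed_index, mode="restart_from_failure"):
    """Return the child steps that would run again on restart."""
    if mode == "restart_from_failure":
        # Children that completed before the failure are skipped.
        return children[failed_index:]
    if mode == "restart_all_children":
        # Every child is re-run, even the ones that already succeeded.
        return list(children)
    raise ValueError(f"unknown restart mode: {mode}")

sde_dims = ["3 SDE Dims USER_DIM", "3 SDE Dims GEO_DIM"]  # names from the example above

print(serial_restart(sde_dims, failed_index=1))
# ['3 SDE Dims GEO_DIM'] -- USER_DIM is not re-run

print(serial_restart(sde_dims, failed_index=1, mode="restart_all_children"))
# ['3 SDE Dims USER_DIM', '3 SDE Dims GEO_DIM'] -- everything re-runs
```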

Restart of a parallel load plan step:


The Workforce Dependant Facts step highlighted above is a parallel step with its restart parameter set to “Restart from failed children”. This means all five parallel steps under it would be kicked off in parallel (subject to free sessions being available). Now, among those five steps, let us say two completed (indicated by the green boxes above) and then the load plan failed. When the load plan is restarted, all the steps that failed or did not complete are started again (in this example Learning Enrollment Fact, Payroll Facts, and Recruitment Facts). This parameter is widely used in the OBIA parallel steps.

The other restart parameter for a parallel step is “Restart all children”. As with serial steps, this causes all child steps to be re-run during restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to use it. A sketch of the parallel modes follows.
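A similar sketch for the parallel case, again purely illustrative and not the ODI engine: the two completed step names below are placeholders, while the three remaining fact names come from the Workforce Dependant Facts example above.

```python
# Purely illustrative sketch of parallel-step restart semantics; the set of
# completed children stands in for what the engine recorded before the failure.

def parallel_restart(children, completed, mode="restart_from_failed_children"):
    """Return the child steps that would run again on restart."""
    if mode == "restart_from_failed_children":
        # Only children that failed or never completed are re-run.
        return [c for c in children if c not in completed]
    if mode == "restart_all_children":
        # Every child is re-run, including the ones that already succeeded.
        return list(children)
    raise ValueError(f"unknown restart mode: {mode}")

# "Fact A" and "Fact B" are placeholders for the two completed steps (green boxes).
facts = ["Fact A", "Fact B", "Learning Enrollment Fact",
         "Payroll Facts", "Recruitment Facts"]

print(parallel_restart(facts, completed={"Fact A", "Fact B"}))
# ['Learning Enrollment Fact', 'Payroll Facts', 'Recruitment Facts']
```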

Restart of the scenario session:

At the lowest level in any load plan are the scenario steps. While the parent steps (serial or parallel) set the dependencies, the scenario steps are what finally load the tables. A scenario step in turn can have one or more steps (corresponding to the number of steps inside the package).

To understand the failure points and how a restart takes place, it is important to understand the structure of the session that gets created to execute a scenario step.

The following diagram illustrates different components in a session:


The restart parameters for the scenario steps in the load plan are:

  • Restart from a new session – This creates a new session for the failed scenario during restart and executes all the steps again.
  • Restart from a failed task – This uses the old session for the failed scenario during restart and starts from the failed task.
  • Restart from a failed step – This uses the old session for the failed scenario during restart and re-executes all the tasks in the failed step again. This is the most common parameter used by OBIA and is illustrated below.


In the above example, scenario step 2 failed while running. It internally has three steps (all under the same session in the Operator log, but identified with step numbers 0, 1, and 2 in this case). Per the OBIA standard setting, the scenario restarts from the failed step, which is step number 2, Table_Maint_Proc (from substep 3, Initialize Variables, onwards, as shown in the diagram).

Note that successful tasks such as “3 – Procedure – TABLE_MAINT_PROC – Initialize variables” will be executed again during restart, since the scenario restart parameter is set to “Restart from failed step” in the load plan. The sketch below contrasts this with the “Restart from a failed task” behavior.
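The difference between the two session-level modes can be modeled the same way as before. This is a toy, purely illustrative sketch: only the TABLE_MAINT_PROC step and its “Initialize variables” task come from the example above; the other step and task names are placeholders.

```python
# Purely illustrative model of a scenario session: an ordered mapping of
# steps to their ordered tasks. It shows why "Restart from failed step"
# replays tasks that had already succeeded inside the failed step.

def tasks_to_rerun(session, failed_step, failed_task, mode):
    """Return (step, task) pairs that would execute again on restart."""
    steps = list(session)                       # step names in session order
    rerun = []
    for step in steps[steps.index(failed_step):]:
        tasks = session[step]
        if step == failed_step and mode == "restart_from_failed_task":
            tasks = tasks[tasks.index(failed_task):]   # skip completed tasks
        rerun.extend((step, task) for task in tasks)
    return rerun

session = {
    "TABLE_MAINT_PROC": ["Initialize variables", "Run maintenance"],  # "Run maintenance" is a placeholder
    "Load fact table":  ["Insert rows"],                              # placeholder downstream step
}

print(tasks_to_rerun(session, "TABLE_MAINT_PROC", "Run maintenance",
                     mode="restart_from_failed_step"))
# Includes ('TABLE_MAINT_PROC', 'Initialize variables') again, as in the example.

print(tasks_to_rerun(session, "TABLE_MAINT_PROC", "Run maintenance",
                     mode="restart_from_failed_task"))
# Resumes at the failed task; 'Initialize variables' is not replayed.
```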

Summary:

OBIA has coding standards for setting up restart parameters, as discussed above. For serial and parallel steps, the parameters “Restart from failure” and “Restart from failed children” allow completed steps to be skipped. For scenario steps (which actually kick off the load sessions), the restart parameter “Restart from failed step” skips the completed steps in the session and reruns all the tasks in the failed step, allowing recovery of an incomplete step.

This standard allows an administrator to restart a failed load plan hands-free, without needing to know the details of what the load plan is doing.


Friday Apr 04, 2014

Turning Big Data into Real-Time Action for a Greater Customer Experience

The power has shifted to us, the consumers. The digital revolution allows us to access a broader set of services and communicate without boundaries. Today we demand more and better choices in a competitive market, putting pressure on businesses to catch up with our expectations.

By offering a differentiated and improved experience to their customers, organizations see that they can drive revenue growth via higher loyalty and improved brand perception. Because technology is a key enabler for delivering a superb and consistent customer experience across all touchpoints, customer experience solutions have become a top priority for CIOs in recent years. Thanks to the availability of big data analytics, organizations can now analyze a broader variety of data, rather than a few basic data points, and gain deeper insight into their customers and operations. In turn, this deeper insight helps them align their business to provide a seamless customer experience.

In our digital, fast-paced world we produce large volumes of data with unprecedented velocity. This data contains perishable value that requires fast capture, analysis, and action in order to influence operations or the interaction with the customer. Otherwise the insight or action may become irrelevant, which significantly decreases its value for the customer and the organization. To extract the maximum value from highly dynamic and perishable data, you need to process it much faster and take timely action. This is the main premise behind Oracle's Fast Data solutions, which we have discussed in previous blogs and webcasts.

Real-time data integration and analytics play a crucial role in our new world of big and fast data. Organizations that look into leveraging big data to create a greater customer experience need to evaluate the analytical foundation behind their customer-facing systems and the resulting interactions, and determine whether they can improve how and when they collect, analyze, and act on their ever-growing data assets.

In our next webcast my colleague Pete Schutt in the Oracle Business Analytics team and I will discuss how organizations can create value for their customers using real-time customer analytics, and how to leverage big data to build a solid business analytics foundation using the latest features of Oracle Data Integration and Oracle Business Analytics. We will provide multiple customer examples for different solution architectures.

Join us on April 15th 10am PT/ 1pm ET by registering via the link below.

Turning Big Data into Real-Time Action for a Greater Customer Experience

Tuesday, April 15th 10am PT/ 1pm ET

Until we meet at this webcast, please review my related article on this topic published on DBTA earlier this year:

Friday May 31, 2013

Improving Customer Experience for Segment of One Using Big Data

Customer experience has been one of the top focus areas for CIOs in recent years. A key requirement for improving customer experience is understanding the customer: their past and current interactions with the company, their preferences, demographic information, etc. This capability helps organizations tailor their services or products for different customer segments to maximize satisfaction. This is not a new concept. However, there have been two parallel changes in how we approach and execute on this strategy.

The first is the big data phenomenon, which brought the ability to obtain a much deeper understanding of customers, especially by bringing in social data. As the Forbes article "Six Tips for Turning Big Data into Great Customer Experiences" mentions, big data has especially transformed online marketing. With the large volume and variety of data now available, companies can run more sophisticated analysis in a more granular way. That leads to the second change: the size of customer segments. It is shrinking down to one, where each individual customer is offered a personalized experience based on their own needs and preferences. This brings more relevance into day-to-day interactions with customers and takes customer satisfaction and loyalty to a level that was not possible before.

One of the key technology requirements for improving customer experience at such a granular level is obtaining a complete and up-to-date view of the customer. That requires integrating data across disparate systems, and in a timely manner. The data integration solution should move and transform large data volumes stored in heterogeneous systems across geographically dispersed locations. Moving data with very low latency to the customer data repository or a data warehouse enables companies to have relevant and actionable insight for each customer. Instead of relying on yesterday's data, which may not be pertinent anymore, the solution should analyze the latest information and turn it into a deeper understanding of that customer. With that knowledge the company can formulate real opportunities to drive higher customer satisfaction.

Real-time data integration is a key enabling technology for real-time analytics. Oracle GoldenGate's real-time data integration technology has been used by many leading organizations to get the most out of their big data and build a closer relationship with customers. One good example in the telecommunications industry is MegaFon, Russia's top provider of mobile internet solutions. The company deployed Oracle GoldenGate 11g to capture billions of monthly transactions from eight regional billing systems. The data was integrated and centralized onto Oracle Database 11g and distributed to business-critical subsystems. The unified and up-to-date view of customers enabled more sophisticated analysis of mobile usage information and facilitated more targeted customer marketing. As a result, the company increased revenue generated from its current customer base. Many other telecommunications industry leaders, including DIRECTV, BT, TataSky, SK Telecom, and Ufone, have improved customer experience by leveraging real-time data integration.

Telecommunications is not the only industry where a single view of the customer drives more personalized interactions. Woori Bank implemented Oracle Exadata and Oracle GoldenGate. In the past, it had been difficult for the bank to revise and incorporate changes to marketing campaigns in real time because it was working with the previous day’s data. Now, users can immediately access and analyze transactions for specific trends in the data mart access layer and adjust campaigns and strategies accordingly. Woori Bank can also send tailored offers to customers.

This is just one example of how real-time data integration can transform business operations and the way a company interacts with its customers. I would like to invite you to learn more about data integration facilitating improved customer experience by reviewing our free resources here and following us on Facebook, Twitter, YouTube, and LinkedIn.

Image courtesy of jscreationzs at FreeDigitalPhotos.net

Thursday May 16, 2013

Sabre Holdings Case Study Webcast Recap

Last week at Oracle we had a very important event. In addition to the visit by the roaming Gnome, who really enjoyed posing for pictures on our campus, I had the privilege of hosting a webcast with guest speaker Amjad Saeed from Sabre Holdings. We focused on Sabre's data integration solution leveraging Oracle GoldenGate and Oracle Data Integrator for their enterprise travel data warehouse (ETDW).

Amjad, who leads the development effort for Sabre's enterprise data warehouse, presented how they approached various data integration challenges, such as a growing number of sources and data volumes, and what results they were able to achieve. He shared with us how using Oracle's data integration products in heterogeneous environments enabled right-time market insights, reduced complexity, and decreased time to market by 40%. Sabre was also able to standardize development for its global DW development team, gain a real-time view into the execution of its integration processes, and manage data warehouse and BI performance on demand. I would like to thank Amjad again for taking the time to share Sabre's data integration best practices with us on this webcast.

In this webcast, my colleague Sandrine Riley and I provided an overview of the Oracle Data Integration products' differentiators. We explained the architectural strengths that deliver a complete and integrated platform offering high performance, fast time to value, and low cost of ownership. If you did not have a chance to attend the live event, the webcast is now available on demand via this link for you to watch at your convenience:

Webcast Replay: Sabre Holdings Case Study: Accelerating Innovation using Oracle Data Integration

There were many great questions from our audience. Unfortunately we did not have enough time to respond to all of them. While we are individually following up with the attendees, I also want to post the answers to some of the commonly asked questions here.

    Question: How do I learn Oracle Data Integrator or GoldenGate? Is training the only option?

    Answer: We highly recommend training through Oracle University. The courses cover all the foundational components needed to get up and running with ODI or GoldenGate. Oracle University offers instructor-led and online training. You can go to http://education.oracle.com to get a complete listing of the courses available. Additionally – but not as a replacement for training – you can get started with a guided ‘getting started’ tutorial, which you can find in the ‘Getting Started’ section of our OTN page. Also, there are some helpful ‘Oracle by Example’ exercises and videos on the same page.

    For Oracle GoldenGate, we recommend watching the instructional videos on its YouTube channel: Youtube/oraclegoldengate. A good example is here.

    Last but not least, at Oracle OpenWorld there are opportunities to learn in depth by attending our hands-on labs, even though that does not compare to or replace formal training.

    Question: Compare and contrast Oracle Data Integrator to the Oracle Warehouse Builder ETL process. Is ODI repository-driven and based on the creation of "maps" when creating ETL modules?

    Answer: ODI has been built from the ground up to be heterogeneous, so it excels on both Oracle and non-Oracle platforms. OWB has been a more Oracle-centric product. ODI mappings are developed with a declarative, design-based approach, and processes are executed via an agent, which orchestrates and delegates the workload to the database in a set-based manner. OWB deploys packages of code on the database and produces more procedural code. For more details, please read our ODI architecture white paper.

    Question:  Is the metadata navigator for ODI flexible and comprehensive so that it could be used as a documentation tool for the ETL?

    Answer: Oracle Data Integrator's metadata navigator has been renamed; it is now called ODI Console. The ODI Console is a web-based interface for viewing, in a more graphical manner, what is inside the ODI repository, and it could be used as documentation. Beyond the ODI Console, ODI provides full documentation from ODI Studio. For any given process, project, etc., you can right-click and choose the ‘print’ option, which provides a PDF document including details down to the transformations. These documents may be a more appropriate method of documentation. Also, please check out the whitepaper on Managing Metadata with ODI.

If you would like to learn more about Oracle Data Integration products please check out our free resources here. I also would like to remind you to follow us on social media if you do not already. You can find us on Facebook, Twitter, YouTube, and LinkedIn.



Thursday Apr 11, 2013

Why Real Time?

Continuing on the five key data integration requirements topic, this time we focus on real-time data for decision making. 


Monday Feb 25, 2013

Connecting Velocity to Value: Introducing Oracle Fast Data

To understand fast data, one must first look at one of the most compelling new breakthroughs in data management: big data. Big data solutions address the challenge today’s businesses face in managing the increasing volume, velocity, and variety of all data - not just data within the organization, but also data about it. Much of the buzz around big data has centered on Hadoop and NoSQL technologies, but little has been said about velocity. Velocity is about the speed at which this data is generated. In many cases the economic value of this data diminishes fast as well. As a result, companies need to process large volumes of data in real time and make decisions more rapidly to create value from highly perishable, high-volume data in business operations.

This is where fast data comes in. Fast data solutions help manage the velocity (and scale) of any type of data and any type of event to enable precise action for real-time results.

Fast data solutions come from multiple technologies, and some of the concepts, such as complex event processing and business activity monitoring, have been in use in areas such as the financial services industry for years. But often, the pieces were used in isolation: a complex event processing engine as a standalone application to apply predefined business rules to filter data, for example. When these concepts are tied to analytics, capabilities expand to allow improved real-time insights. By tying together these strands, companies can filter, correlate, move, transform, analyze, and finally act on information from big data sources quickly and efficiently, enabling both real-time analysis and further business intelligence work once the information is stored.

Oracle’s Fast Data solutions offer multiple technologies that work hand in hand to create value out of high-velocity, high-volume data. They are designed to optimize efficiency and scale for processing high-volume events and transactions.


Wednesday Oct 31, 2012

Ameristar Wins with Oracle GoldenGate’s Heterogeneous Real-Time Data Integration

Today we announced a press release about another successful project with Oracle GoldenGate, this time at Ameristar. Ameristar is a casino gaming company that needed a single data integration solution to connect multiple heterogeneous systems to its Teradata data warehouse.

The project involves integration of Ameristar’s promotional and gaming data from 14 data sources across its 7 casino hotel properties in real time into a central Teradata data warehouse. The source systems include the Aristocrat gaming and MGT promotional management platforms running on Microsoft SQL Server 2000 databases.

As you may notice, there was no Oracle Database involved in this project, but Ameristar’s IT leadership knew that GoldenGate’s strong heterogeneous, real-time data integration capabilities were the right technology for its data warehousing project. With GoldenGate, Ameristar was able to reduce data latency to the enterprise data warehouse and give its marketing teams real-time customer information for improving the overall customer experience. Ameristar customers receive more targeted and timely campaign offers, and the company has more up-to-date visibility into its financial metrics.

Another key benefit the company experienced with GoldenGate is lower operational costs. The previous data capture solution Ameristar used was trigger-based and required a lot of effort to manage, including dedicated IT staff to maintain it. With GoldenGate, the solution runs seamlessly without needing fully dedicated staff, giving the IT team at Ameristar more resources for other IT projects.

If you want to learn more about GoldenGate and the latest features for Oracle Database and non-Oracle databases, please watch our on-demand webcast about Oracle GoldenGate 11g Release 2.

About

Learn the latest trends, use cases, product updates, and customer success examples for Oracle's data integration products, including Oracle Data Integrator, Oracle GoldenGate, and Oracle Enterprise Data Quality.
