Monday May 12, 2014

Check it out – BI Apps 11.1.1.8.1 is now available!

As of May 8, 2014, Oracle Business Intelligence (BI) Applications 11.1.1.8.1 is available on the Oracle Software Delivery Cloud (eDelivery), and on the Oracle BI Applications OTN page. This is the second major release on the 11g code line leveraging the power of Oracle Data Integrator (ODI), and certified with the latest version of Oracle BI Foundation 11.1.1.7. For more details on this release and what’s new – check it out!

Friday Apr 25, 2014

Long Running Jobs in Oracle Business Intelligence Applications (OBIA) and Recovery from Failures

Written by Jayant Mahto, Oracle Data Integrator Product Management

In Oracle Business Intelligence Applications (OBIA) 11.1.1.7.1, the data warehouse load is performed using Oracle Data Integrator (ODI). In ODI, using packages and load plans, one can create quite a complex load job that kicks off many scenarios in a coordinated fashion. Such a load job may run for a long time, and if it fails before completing, it requires a restart to recover from the failure and finish successfully.

This blog uses the complex load plan defined for Oracle Business Intelligence Applications (OBIA) 11.1.1.7.1 to illustrate the method of recovery from failures. Similar methods can be used to recover complex load plans defined independently in Oracle Data Integrator (ODI). Note that this post does not go into the details of troubleshooting a failed load plan; it only discusses the different restart parameters that affect the behavior of a restarted job.

The failures can happen due to the following reasons:

  • Access failure – source or target database down, network failure, etc.
  • Agent failure.
  • Problem with the database – running out of space or some other database-related issue.
  • Data-related failure – exceptions not caught gracefully, such as a null value in a NOT NULL column.

It is important to find the reason for the failure and address it before attempting to restart the load plan; otherwise the same failure may occur again. To recover from a failure successfully, the restart parameters on the load plan steps need to be chosen carefully. These parameters are set by the developers at design time. The goal is to make restarts robust enough that an administrator can restart the load plan without knowing the details of the failed steps. It is therefore the developer's responsibility to select restart parameters that guarantee the correct set of steps is re-run on restart, so that data integrity is maintained.

In the case of OBIA, the generated load plans already have appropriate restart parameters on the out-of-the-box steps. If you add custom steps, you need to choose similar restart parameters for them.

Now let us look at a typical load plan and the restart parameters at various steps.

Restart of a serial load plan step:


The SDE Dimension Group step highlighted above is a serial step. Let us say the load plan failed while running the 3 SDE Dims GEO_DIM step. Since this is a serial step and it has been set to “Restart from Failure”, the load plan on restart starts from 3 SDE Dims GEO_DIM only and does not run 3 SDE Dims USER_DIM again. This parameter is widely used in the OBIA serial steps.

The other restart parameter for a serial step is “Restart all children”. This causes all child steps to be re-run on restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to use it.
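
To make the serial restart behavior concrete, here is a minimal Python sketch that simulates the semantics described above. It is not ODI or OBIA code; the function name and the step statuses are illustrative assumptions.

    def serial_steps_to_rerun(children, restart_param):
        """Return the child steps of a serial step that run on restart.

        children: ordered list of (step_name, status) tuples, where status is
                  "done", "failed", or "not run" (steps after the failure
                  point have not run yet).
        restart_param: "Restart from failure" or "Restart all children".
        """
        if restart_param == "Restart all children":
            return [name for name, _ in children]
        # "Restart from failure": completed steps are skipped; execution
        # resumes at the failed step and continues with everything after it.
        to_run, resume = [], False
        for name, status in children:
            if resume or status != "done":
                to_run.append(name)
                resume = True
        return to_run

    # Example from the post: USER_DIM completed, then GEO_DIM failed.
    steps = [("3 SDE Dims USER_DIM", "done"), ("3 SDE Dims GEO_DIM", "failed")]
    print(serial_steps_to_rerun(steps, "Restart from failure"))
    # -> ['3 SDE Dims GEO_DIM']  (USER_DIM is not run again)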

Restart of a parallel load plan step:


The Workforce Dependant Facts step highlighted above is a parallel step with its restart parameter set to “Restart from failed children”. This means all five parallel steps under it are kicked off in parallel (subject to free sessions being available). Now say two of those five steps completed (indicated by the green boxes above) and then the load plan failed. When the load plan is restarted, all the steps that did not complete or that failed are started again (in this example, Learning Enrollment Fact, Payroll Facts, and Recruitment Facts). This parameter is widely used in the OBIA parallel steps.

The other restart parameter for a parallel step is “Restart all children”. This causes all child steps to be re-run on restart, even if only one failed and the others succeeded. This parameter can be useful in some cases; developers decide when to use it.
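
The same idea applies to a parallel step. The sketch below is again only a simulation of the described behavior, not ODI code, and the two completed fact names are made-up placeholders: with “Restart from failed children” only the children that failed or never completed are launched again.

    def parallel_steps_to_rerun(children, restart_param):
        """Return the child steps of a parallel step that run on restart.

        children: dict mapping step_name -> status ("done", "failed", "not run").
        restart_param: "Restart from failed children" or "Restart all children".
        """
        if restart_param == "Restart all children":
            return sorted(children)
        # "Restart from failed children": completed children are skipped.
        return sorted(name for name, status in children.items() if status != "done")

    # Example from the post: two of the five facts completed before the failure.
    # The two completed step names below are placeholders, not actual OBIA steps.
    facts = {
        "Completed Fact A": "done",
        "Completed Fact B": "done",
        "Learning Enrollment Fact": "failed",
        "Payroll Facts": "not run",
        "Recruitment Facts": "not run",
    }
    print(parallel_steps_to_rerun(facts, "Restart from failed children"))
    # -> ['Learning Enrollment Fact', 'Payroll Facts', 'Recruitment Facts']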

Restart of the scenario session:

At the lowest level of any load plan are the scenario steps. While the parent steps (serial or parallel) are used to set the dependencies, the scenario steps are what finally load the tables. A scenario step in turn can have one or more steps (corresponding to the number of steps inside the package).

To understand the failure points and how a restart takes place, it is important to understand the structure of the session that gets created to execute a scenario step.

The following diagram illustrates different components in a session:


The restart parameters for the scenario steps in the load plan are:

  • Restart from a new session – This creates a new session for the failed scenario during restart and executes all the steps again.
  • Restart from a failed task – This uses the old session for the failed scenario during restart and starts from the failed task.
  • Restart from a failed step – This uses the old session for the failed scenario during restart and re-executes all the tasks in the failed step again. This is the most common parameter used by OBIA and is illustrated below.


In the above example, scenario step 2 failed while running. It internally has three steps (all under the same session in the Operator log, but identified with different step numbers 0, 1, and 2 in this case). Per the setting corresponding to the OBIA standard, the scenario would execute from the failed step, that is, from step number 2, Table_Maint_Proc (and from substep 3, Initialize Variables, onwards, as shown in the diagram).

Note that successful tasks within the failed step, such as “3 – Procedure – TABLE_MAINT_PROC – Initialize variables”, will be executed again during restart, since the scenario restart parameter is set to “Restart from failed step” in the load plan.
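
As a rough model of the three scenario-level settings (a simplified sketch, not ODI internals; the step and task names other than TABLE_MAINT_PROC and Initialize variables are invented for the example), the function below returns which tasks would execute on restart.

    def session_tasks_on_restart(session_steps, restart_param):
        """Return (step_name, task_name) pairs executed when a scenario restarts.

        session_steps: ordered list of (step_name, step_status, tasks), where
                       tasks is an ordered list of (task_name, task_status).
        restart_param: "Restart from a new session", "Restart from a failed task",
                       or "Restart from a failed step".
        """
        work = []
        for step_name, step_status, tasks in session_steps:
            # Completed steps are skipped unless a brand-new session is created.
            if restart_param != "Restart from a new session" and step_status == "done":
                continue
            for task_name, task_status in tasks:
                # Only "Restart from a failed task" skips tasks that already succeeded.
                if restart_param == "Restart from a failed task" and task_status == "done":
                    continue
                work.append((step_name, task_name))
        return work

    # Example from the post: step 2 (TABLE_MAINT_PROC) failed; its first task,
    # "Initialize variables", had already succeeded. Other names are placeholders.
    steps = [
        ("Step 0", "done", [("Task A", "done")]),
        ("Step 1", "done", [("Task B", "done")]),
        ("Step 2 TABLE_MAINT_PROC", "failed",
         [("Initialize variables", "done"), ("Table maintenance task", "failed")]),
    ]
    print(session_tasks_on_restart(steps, "Restart from failed step"))
    # -> [('Step 2 TABLE_MAINT_PROC', 'Initialize variables'),
    #     ('Step 2 TABLE_MAINT_PROC', 'Table maintenance task')]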

Summary:

OBIA has a coding standard for setting up restart parameters, as discussed above. For serial and parallel steps, the parameters “Restart from failure” and “Restart from failed children” allow the completed steps to be skipped. For scenario steps (which actually kick off the load sessions), the restart parameter “Restart from failed step” skips the completed steps in the session and reruns all the tasks in the failed step, allowing recovery of an incomplete step.

This standard allows a hands-free approach in which an administrator can restart a failed load plan without knowing the details of what the load plan is doing.


Friday Apr 04, 2014

Turning Big Data into Real-Time Action for a Greater Customer Experience

The power has shifted to us, the consumers. The digital revolution allows us to access a broader set of services and communicate without boundaries. Today we demand more and better choices in a competitive market, putting pressure on businesses to catch up with our expectations.

By offering a differentiated and improved experience to their customers, organizations see that they can drive revenue growth through higher loyalty and improved brand perception. Because technology is a key enabler for delivering a superb and consistent customer experience across all touchpoints, customer experience solutions have become a top priority for CIOs in recent years. Thanks to the availability of big data analytics, organizations can now analyze a broader variety of data, rather than a few basic data points, and gain deeper insight into their customers and operations. In turn, this deeper insight helps them align their business to provide a seamless customer experience.

In our digital, fast-paced world we produce large volumes of data with unprecedented velocity. This data contains perishable value that requires fast capture, analysis, and action in order to influence operations or the interaction with the customer. Otherwise the insight or action may become irrelevant, which significantly decreases its value for the customer and the organization. To extract the maximum value from highly dynamic and perishable data, you need to process it much faster and take timely action. This is the main premise behind Oracle's Fast Data solutions, which we have discussed in previous blogs and webcasts.

Real-time data integration and analytics play a crucial role in our new world of big and fast data. Organizations that look to leverage big data to create a greater customer experience need to evaluate the analytical foundation behind their customer-facing systems and the resulting interactions, and determine whether they can improve how and when they collect, analyze, and act on their ever-growing data assets.

In our next webcast my colleague Pete Schutt in the Oracle Business Analytics team and I will discuss how organizations can create value for their customers using real-time customer analytics, and how to leverage big data to build a solid business analytics foundation using the latest features of Oracle Data Integration and Oracle Business Analytics. We will provide multiple customer examples for different solution architectures.

Join us on April 15th 10am PT/ 1pm ET by registering via the link below.

Turning Big Data into Real-Time Action for a Greater Customer Experience

Tuesday, April 15th 10am PT/ 1pm ET

Until we meet at this webcast, please review my related article on this topic published on DBTA earlier this year:

Thursday Jun 13, 2013

Near Zero Downtime Upgrades for Oracle Communications Billing and Revenue Management

The billing application is one of the key systems for a communications company. It is critical not only for revenue generation but also for customer experience. Oracle Communications Billing and Revenue Management (BRM) offers a flexible and reliable solution that helps service providers maximize customer value and profitability and enhance business agility. Oracle BRM supports new business models including cloud, machine-to-machine (M2M), and mobile virtual network operators (MVNO). Many of the world's leading and innovative service providers, including France Telecom, Vodafone, Swisscom, Savvis, Sirius XM Satellite Radio, and Nintendo, rely on the BRM application for their critical billing operations.

When it comes to upgrading a major system such as BRM, things may get a bit tricky. Until recently there were only two approaches to upgrading BRM systems: the in-place upgrade and the parallel system run. With the in-place approach, the current BRM system is shut down and brought back up after it is upgraded to the latest release of BRM. In addition to downtime, this puts heavy risk on the business. The parallel system run approach attempts to run two BRM systems live in parallel. Unfortunately, this approach can keep only a portion of the data in sync between the two systems, as not all data feeds can be applied to both databases. Both approaches result in downtime, which is not desirable for a system that drives revenue and impacts customer experience directly.

Oracle GoldenGate offers a third approach to upgrading BRM systems, and this new solution delivers near zero downtime and minimized risk. Using GoldenGate's real-time data replication capability, IT teams can create a full copy of the new BRM system on the latest software release without impacting the current BRM system. End users can continue using the old version while the parallel system is upgraded to the newest release, kept in sync in real time with the active database, and tested thoroughly without time pressure before end users switch over to the upgraded BRM instance. GoldenGate's failback option also mitigates the risk to the business during the upgrade process. Oracle GoldenGate's unique software platform does not require any changes to the source BRM system and uses minimal system resources. The solution supports clients running BRM 7.2 or higher.

This is a very powerful solution for BRM users, and we decided to explain it in depth via a screencast to help them understand the new method better. If you have a mission-critical BRM system, I highly recommend watching the screencast below to learn the detailed implementation steps.

You can also watch the screencast on Oracle Media Network or on Oracle's YouTube channel. We also have a white paper, Near Zero Downtime Upgrade for Oracle Communications Billing and Revenue Management (BRM) Application, that you can download for more information.

Friday May 31, 2013

Improving Customer Experience for Segment of One Using Big Data

Customer experience has been one of the top focus areas for CIOs in recent years. A key requirement for improving customer experience is understanding the customer: their past and current interactions with the company, their preferences, demographic information, and so on. This capability helps the organization tailor its services or products for different customer segments to maximize their satisfaction. This is not a new concept. However, there have been two parallel changes in how we approach and execute on this strategy.

The first is the big data phenomenon, which brought the ability to obtain a much deeper understanding of customers, especially by bringing in social data. As the Forbes article "Six Tips for Turning Big Data into Great Customer Experiences" mentions, big data has especially transformed online marketing. With the large volume and different types of data now available, companies can run more sophisticated analysis in a more granular way. That leads to the second change: the size of customer segments. It is shrinking down to one, where each individual customer is offered a personalized experience based on their individual needs and preferences. This notion brings more relevance into day-to-day interactions with customers, and takes customer satisfaction and loyalty to a new level that was not possible before.

One of the key technology requirements for improving customer experience at such a granular level is obtaining a complete and up-to-date view of the customer. That requires integrating data across disparate systems in a timely manner. The data integration solution should move and transform large data volumes stored in heterogeneous systems in geographically dispersed locations. Moving data with very low latency to the customer data repository or data warehouse enables companies to have relevant and actionable insight for each customer. Instead of relying on yesterday's data, which may no longer be pertinent, the solution should analyze the latest information and turn it into a deeper understanding of that customer. With that knowledge the company can formulate real opportunities to drive higher customer satisfaction.

Real-time data integration is a key enabling technology for real-time analytics. Oracle GoldenGate's real-time data integration technology has been used by many leading organizations to get the most out of their big data and build a closer relationship with customers. One good example in the telecommunications industry is MegaFon. MegaFon is Russia's top provider of mobile internet solutions. The company deployed Oracle GoldenGate 11g to capture billions of monthly transactions from eight regional billing systems. The data was integrated and centralized onto Oracle Database 11g and distributed to business-critical subsystems. The unified and up-to-date view into customers enabled more sophisticated analysis of mobile usage information and facilitated more targeted customer marketing. As a result, the company increased revenue generated from its current customer base. Many other telecommunications industry leaders, including DIRECTV, BT, TataSky, SK Telecom, and Ufone, have improved customer experience by leveraging real-time data integration.

Telecommunications is not the only industry where a single view of the customer drives more personalized interaction. Woori Bank implemented Oracle Exadata and Oracle GoldenGate. In the past, it had been difficult for the bank to revise and incorporate changes to marketing campaigns in real time because it was working with the previous day's data. Now, users can immediately access and analyze transactions for specific trends in the data mart access layer and adjust campaigns and strategies accordingly. Woori Bank can also send tailored offers to customers.

This is just one example of how real-time data integration can transform business operations and the way a company interacts with its customers. I would like to invite you to learn more about data integration facilitating improved customer experience by reviewing our free resources here and following us on Facebook, Twitter, YouTube, and LinkedIn.

Image courtesy of jscreationzs at FreeDigitalPhotos.net

Thursday May 16, 2013

Sabre Holdings Case Study Webcast Recap

Last week at Oracle we had a very important event. In addition to the visit by the Roaming Gnome, who really enjoyed posing for pictures on our campus, I had the privilege of hosting a webcast with guest speaker Amjad Saeed from Sabre Holdings. We focused on Sabre's data integration solution leveraging Oracle GoldenGate and Oracle Data Integrator for its enterprise travel data warehouse (ETDW).

Amjad, who leads the development effort for Sabre's enterprise data warehouse, presented how his team approached various data integration challenges, such as a growing number of sources and data volumes, and what results they were able to achieve. He shared how using Oracle's data integration products in heterogeneous environments enabled right-time market insights, reduced complexity, and decreased time to market by 40%. Sabre was also able to standardize development for its global DW development team, achieve a real-time view into the execution of its integration processes, and gain the ability to manage data warehouse and BI performance on demand. I would like to thank Amjad again for taking the time to share Sabre's data integration best practices with us on this webcast.

In this webcast my colleague Sandrine Riley and I also provided an overview of the Oracle Data Integration products' differentiators. We explained the architectural strengths that deliver a complete and integrated platform with high performance, fast time to value, and low cost of ownership. If you did not have a chance to attend the live event, the webcast is now available on demand via this link for you to watch at your convenience:

Webcast Replay: Sabre Holdings Case Study: Accelerating Innovation using Oracle Data Integration

There were many great questions from our audience. Unfortunately we did not have enough time to respond to all of them. While we are following up with the attendees individually, I also want to post the answers to some of the most commonly asked questions here.

    Question: How do I learn Oracle Data Integrator or GoldenGate? Is training the only option?

    Answer: We highly recommend training through Oracle University. The courses cover all the foundational components needed to get up and running with ODI or GoldenGate. Oracle University offers instructor-led and online training; you can go to http://education.oracle.com for a complete listing of the available courses. Additionally, though not a replacement for training, you can get started with a guided ‘getting started’ tutorial, which you can find in the ‘Getting Started’ section of our OTN page. There are also some helpful ‘Oracle by Example’ exercises and videos on the same page.

    For Oracle GoldenGate, we recommend watching the instructional videos on its YouTube channel: youtube.com/oraclegoldengate. A good example is here.

    Last but not least, at Oracle OpenWorld there are opportunities to learn in depth by attending our hands-on labs, though these do not replace formal training.

    Question: Compare and contrast Oracle Data Integrator with the Oracle Warehouse Builder ETL process. Is ODI repository-driven and based on the creation of "maps" when creating ETL modules?

    Answer: ODI has been built from the ground up to be heterogeneous, so it excels on both Oracle and non-Oracle platforms; OWB has been a more Oracle-centric product. ODI mappings are developed with a declarative design approach, and processes are executed via an agent, which orchestrates and delegates the workload to the database in a set-based manner. OWB deploys packages of code on the database and produces more procedural code. For more details, please read our ODI architecture white paper.

    Question: Is the metadata navigator for ODI flexible and comprehensive enough that it could be used as a documentation tool for the ETL?

    Answer: Oracle Data Integrator's Metadata Navigator has been renamed and is now called ODI Console. ODI Console is a web-based interface for viewing, in a more graphical manner, what is inside the ODI repository, and it could be used as documentation. Beyond ODI Console, ODI provides full documentation from ODI Studio: for any given process, project, and so on, you can right-click and choose the ‘Print’ option, which generates a PDF document with details down to the transformations. These documents may be a more appropriate method of documentation. Also, please check out the white paper on Managing Metadata with ODI.

If you would like to learn more about Oracle Data Integration products, please check out our free resources here. I also would like to remind you to follow us on social media if you do not already. You can find us on Facebook, Twitter, YouTube, and LinkedIn.



Thursday Apr 11, 2013

Why Real Time?

Continuing with the five key data integration requirements, this time we focus on real-time data for decision making.


Monday Feb 25, 2013

Connecting Velocity to Value: Introducing Oracle Fast Data

To understand fast data, one must first look at one of the most compelling new breakthroughs in data management: big data. Big data solutions address the challenge today's businesses face in managing the increasing volume, velocity, and variety of all data, within as well as about the organization. Much of the buzz around big data has centered on Hadoop and NoSQL technologies, but little has been said about velocity. Velocity is about the speed at which this data is generated. In many cases the economic value of this data also diminishes fast. As a result, companies need to process large volumes of data in real time and make decisions more rapidly to create value from highly perishable, high-volume data in their business operations.

This is where fast data comes in. Fast data solutions help manage the velocity (and scale) of any type of data and any type of event to enable precise action for real-time results.

Fast data solutions come from multiple technologies, and some of the concepts, such as complex event processing and business activity monitoring, have been in use in areas such as the financial services industry for years. But often the pieces were used in isolation, for example a complex event processing engine deployed as a standalone application to apply predefined business rules to filter data. When these concepts are tied to analytics, capabilities expand to allow improved real-time insights. By tying together these strands, companies can filter and correlate, move and transform, analyze, and finally act on information from big data sources quickly and efficiently, enabling both real-time analysis and further business intelligence work once the information is stored.

Oracle's Fast Data solutions offer multiple technologies that work hand in hand to create value out of high-velocity, high-volume data. They are designed to optimize efficiency and scale for processing high-volume events and transactions.


Wednesday Oct 31, 2012

Ameristar Wins with Oracle GoldenGate’s Heterogeneous Real-Time Data Integration

Today we announced a press release about another successful project with Oracle GoldenGate, this time at Ameristar. Ameristar is a casino gaming company that needed a single data integration solution to connect multiple heterogeneous systems to its Teradata data warehouse.

The project involves real-time integration of Ameristar's promotional and gaming data from 14 data sources across its 7 casino hotel properties into a central Teradata data warehouse. The source systems include the Aristocrat gaming and MGT promotional management platforms running on Microsoft SQL Server 2000 databases.

As you may notice, there was no Oracle Database involved in this project, but Ameristar's IT leadership knew that GoldenGate's strong heterogeneous and real-time data integration capabilities made it the right technology for their data warehousing project. With GoldenGate, Ameristar was able to reduce data latency to the enterprise data warehouse and put this real-time customer information in the hands of marketing teams to improve the overall customer experience. Ameristar customers receive more targeted and timely campaign offers, and the company has more up-to-date visibility into its financial metrics.

One other key benefit the company experienced with GoldenGate is lower operational cost. The previous data capture solution Ameristar used was trigger-based and required a lot of effort to manage, including dedicated IT staff to maintain it. With GoldenGate, the solution runs seamlessly without needing fully dedicated staff, giving the IT team at Ameristar more resources for other IT projects.

If you want to learn more about GoldenGate and the latest features for Oracle Database and non-Oracle databases, please watch our on demand webcast about Oracle GoldenGate 11g Release 2.

Friday Jan 13, 2012

Value of Standardizing on a Single Data Integration Platform

Fragmented data stores are a reality, and so are fragmented data architectures. Many companies still leverage custom code, scripts, or even SQL in addition to SOA or BI/DW-related integration solutions. By using many different data integration tools and methods, they inadvertently create complex, hard-to-manage data architectures. Unfortunately this approach diminishes the tangible benefits of data integration by limiting the ways they can leverage data assets and by slowing down efforts to remove data silos.

Standardizing on a single data integration platform for enterprise-wide needs enables organizations to maximize the value of their data integration projects. The benefits include:
  • Simplified project management
  • Increased developer productivity and shorter implementation times
  • Reduced development, maintenance, training and licensing cost
  • Improved visibility into how data is used
  • Reduced errors in integration and more accurate data
  • Increased agility to respond to business needs

Oracle's Data Integration product family offers a comprehensive, flexible, and integrated platform for enterprise-wide data integration needs. Next week, on January 18th, we will talk about how using a single data integration platform for all major DI projects delivers value, and discuss in detail the Oracle GoldenGate and Oracle Data Integrator capabilities that make Oracle the right solution for a standardized approach to data integration across the enterprise.

In this webcast you will hear how Oracle GoldenGate provides capture and delivery capabilities for different databases, including Oracle Database, HP NonStop (Enscribe, SQL/MX, SQL/MP), DB2 (LUW and System z), SQL Server, Teradata, Sybase, and MySQL. You will also learn about Oracle Data Integrator's highly flexible knowledge module architecture, which provides extended connectivity to support enterprise-wide data integration needs.

Please mark your calendars and register for this free, live webcast:

“Enterprise Data Integration for Heterogeneous Environments” on January 18th, 2012, at 10am PT / 1pm ET.

Thursday Sep 22, 2011

Achieving Business Continuity for EMR Environments: UPMC's Case Study

Electronic Medical Records (EMR) are revolutionizing the healthcare industry by delivering improved patient care and operational efficiencies. To make sure the initiative lives up to its potential, the move to EMR requires a robust IT infrastructure. In particular, when everything is moved to the digital world, a key requirement for uninterrupted patient care is maintaining high levels of availability for the clinical systems that contain patient data.

UPMC is a national leader in the adoption of EMRs and has empowered its diverse health care network, which includes 20 hospitals and 400 doctors’ offices and outpatient sites, with clinical applications from Cerner. To enable business continuity for its Cerner applications, UPMC has implemented a system that uses Oracle GoldenGate and Cerner’s 724Access® software to give clinicians continuous and secure access to the most up-to-date patient and order data.

In less than two weeks, at OpenWorld, we will have the opportunity to hear directly from UPMC how they implemented Oracle GoldenGate for their Cerner applications and what results they have seen so far.


UPMC Case Study: Achieving Business Continuity with Oracle GoldenGate

Tuesday October 4th 10:15 am

Intercontinental Hotel- Telegraph Hill Room


For me personally, GoldenGate’s use for business continuity in a healthcare setting is one of the most exciting and rewarding discussions. It shows how real-time data integration technology can help improve patient care and prevent service interruptions that can cost lives. It makes me proud to be part of a technology solution that, even if indirectly, helps improve the quality of care we receive.

I am truly looking forward to hearing Bill Costantini from UPMC present their project for us at OpenWorld. Hope you can join me in this valuable session.

You can find out about other Data Integration Track sessions here and you can follow our updates and reminders for the Data Integration track via our Facebook and Twitter accounts.

Tuesday Jun 07, 2011

Zero Downtime Migration to Oracle Exadata with Oracle GoldenGate

System upgrades, migrations, consolidations, and maintenance activities are a fact of life in today’s IT environment, as companies strive to achieve higher performance, scalability, and flexibility. These activities typically require a “planned outage,” and for systems that support global operations or require 24/7 availability, even a planned outage can be disruptive to the business.

Currently we see great demand from customers for consolidating their databases onto Oracle Exadata. As they embark on these projects, they also seek a solution that avoids interrupting the availability of their business-critical systems, since an interruption can significantly impact revenue, productivity, and customer satisfaction. Oracle GoldenGate’s real-time, heterogeneous data replication capabilities come in very handy in these migration and consolidation projects. GoldenGate synchronizes the old environment, for example DB2, SQL Server, or an earlier version of Oracle Database, with the new Oracle Exadata system in real time. While the new environment is prepared and tested, Oracle GoldenGate captures every new transaction from the production system and delivers it to Oracle Exadata as soon as it is ready. Once the systems are in sync, applications and users can be pointed to the Exadata system immediately, without any downtime.

One recent customer that chose Oracle GoldenGate for migration to Exadata is e-Dialog. Part of GSI Commerce Inc., e-Dialog provides integrated digital marketing solutions to businesses worldwide. When its message volumes for multichannel marketing and campaign analytics services started to increase rapidly, the company decided to move to Exadata to achieve higher performance. At the same time it had to ensure business continuity during the migration, as any outage would impact its customers’ access to its services. e-Dialog used Oracle GoldenGate to complete the migration in phases over six months. Oracle GoldenGate’s bidirectional replication capabilities allowed e-Dialog to run the legacy system and Exadata concurrently. I invite you to read more about the e-Dialog success story to learn how Oracle GoldenGate enables successful migration to Oracle Exadata.

About

Learn the latest trends, use cases, product updates, and customer success examples for Oracle's data integration products, including Oracle Data Integrator, Oracle GoldenGate, and Oracle Enterprise Data Quality.
