Tuesday Nov 26, 2013

Introduction and Tips for Oracle BI Apps Variable Refresh

Author: Chuan Shi

Introduction

The ETL logic in BI Apps uses parameters in packages, interfaces, load plans, and knowledge modules (KMs) to control ETL behavior. Parameters can be configured by customers, pre-seeded in the ETL code, or maintained internally:

  • Data Load parameters that can be configured by customers are maintained at product line (PLV) and fact/dimension group levels in the BI Apps Configuration Manager (BIACM).
  • Some parameters are pre-seeded in the ETL code OOTB, e.g. DATASOURCE_NUM_ID.
  • Other parameters are used internally by the load plan to control the execution of the load plan, e.g. EXECUTION_ID.

ETL parameters are handled using ODI variables. The purpose of this blog is to explain how ODI variables are refreshed, so that the correct values are used in Oracle BI Apps ETL.

ODI Variable Classification

To ensure that correct values are used in ETL, variables are refreshed at run time. A variable can be refreshed in either a package or a load plan. Variables can be classified into four categories based on whether and where they are refreshed in ODI (if a variable needs to be refreshed, a refreshing logic must be specified for it, as discussed later).


The four categories are:

  1. Not refreshed

Definition: Generally, internal parameters used by the load plan to control the ETL execution; therefore, they are not refreshed.

Examples: DATASOURCE_NUM_ID, EXECUTION_ID

  2. Refreshed in the package

Definition: Variables that cannot feasibly be refreshed in load plans. (The refreshing logic of such a variable depends on ETL run history: it sources from DW tables and uses the QUALIFY user-defined function (UDF), and ODI fails to interpret QUALIFY at the load plan level in some cases.)

Examples: IS_INCREMENTAL, LAST_EXTRACT_DATE

  3. Hardcoded in the load plan

Definition: Variables whose values are hardcoded by overwriting them in load plans; the variables take the hardcoded values during ETL.

Examples: DOMAIN_CODE, LOOKUP_TYPE

  4. Refreshed in the load plan

Definition: Variables whose values are configured in BIACM by customers. (In other words, the values of these variables come from BIACM. The refreshing logic of such a variable uses a UDF to extract its value from BIACM.)

Examples: UPDATE_ALL_HISTORY, TYPE2_FLG

For a complete list of ODI variables, please refer to this post.

Refreshing Variables in a Package

Refreshing variables in a package is straightforward. One needs to create Refresh Variable steps for the variables to be refreshed. The screenshot below shows examples of refreshing IS_INCREMENTAL.
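Conceptually, the refreshing logic behind IS_INCREMENTAL checks whether the warehouse already records a prior load for the package's target table. A minimal sketch of the general shape of such a refresh query is shown below; W_ETL_LOAD_DATES and its columns are stand-ins for the actual DW control tables the OOTB logic sources from (the real logic also relies on the QUALIFY UDF mentioned above).

SELECT CASE WHEN COUNT(*) > 0 THEN 'Y' ELSE 'N' END
FROM W_ETL_LOAD_DATES                       -- DW control table of prior ETL runs (illustrative)
WHERE TARGET_TABLE_NAME = 'W_EXAMPLE_F'     -- table loaded by this package (illustrative)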

Hard-coding Variables in a Load Plan

Some variables, such as DOMAIN_CODE and LOOKUP_TYPE, are hardcoded in load plan components. To do this, go to the load plan component where the variables are to be hardcoded, select the Overwrite checkbox, and provide the hardcoded values for the variables. The screenshot below shows examples of hardcoding DOMAIN_CODE and LOOKUP_TYPE.

Refreshing Variables from BIACM

BIACM is a central UI where customers can define the values of data load parameters (i.e., ODI variables), among many other features. ODI variables that are refreshed in load plans have their values extracted from BIACM; we also refer to such variables as BIACM variables.

BIACM Variables are classified into truly global variables, PLV specific variables, and fact/dimension group level variables.

  • A truly global variable (e.g., 13P_CALENDAR_ID) is a variable that has the same value across all product lines (e.g., EBS11510, PSFT90, etc.) and all fact/dimension groups. Truly global variables are refreshed centrally in the load plan system components.
  • A PLV specific variable (e.g., LANGUAGE_BASE) is a variable that takes the same value for all fact/dimension groups within a product line, but can take different values in different product lines. They are refreshed individually in the consuming load plan dev components.
  • A fact/dimension group level variable (e.g., UPDATE_ALL_HISTORY) is group specific. It can take different values in different fact/dimension groups within the same PLV and across different PLVs. They are refreshed individually in consuming load plan dev components.

From a variable value overriding perspective:

  • A truly global variable has a unique value. PLVs and fact/dimension groups cannot override its value.
  • A PLV variable has product line specific values (e.g., LANGUAGE_BASE takes the value of US in EBS product lines but ENG in PSFT product lines). The value is the same for all fact/dimension groups within that product line.
  • A fact/dimension group level variable has group specific values (e.g., TYPE2_FLG has the value of Yes in Position Dimension Hierarchy, while it has the value of No in Asset Dimension). Also, such a variable has a global default value. If a fact/dimension group does not specify the value of such a variable for its use, then the global default value will be used whenever this variable is called by that group (e.g., the global default value of TYPE2_FLG is No).

These variables are defined in ODI. To ensure that variable refreshing works correctly, there are some rules on the definitions of ODI variables:

  • Set the ‘Keep History’ option to ‘No History’;
  • Always provide a default value (the default value is picked up if refreshing from BIACM does not return a value for some reason; otherwise the ETL will fail). As a good practice, set the ODI default value of the variable to the same as the variable's global value in BIACM.

(Whenever the Keep History option or the Default Value of a variable is changed, the scenarios that use this variable need to be regenerated.)

Once ODI variables are defined, a refreshing logic is needed to refresh them from BIACM. In this regard,

  • The ODI UDF GET_CM_PARAM is used.
  • To return the correct value for a variable, the refreshing logic must specify:
    • the variable name;
    • the product line;
    • the fact/dimension group.
  • Syntax: getcmparam($(param_code),$(DATASOURCE_NUM_ID))
    • $(param_code) is the name of the variable (e.g., TYPE2_FLG);
    • $(DATASOURCE_NUM_ID) is used to specify the product line.

For PLV/group level variables, we pass #DATASOURCE_NUM_ID as $(DATASOURCE_NUM_ID);

e.g., getcmparam('TYPE2_FLG','#DATASOURCE_NUM_ID')

For truly global variables, we pass #WH_DATASOURCE_NUM_ID as a pseudo-PLV ID.

e.g., getcmparam('13P_CALENDAR_ID','#WH_DATASOURCE_NUM_ID')

  • Do not pass the fact/dimension group directly into the syntax; it is determined by where the variable is refreshed.

BIACM variables are refreshed in load plans. To refresh a variable in a load plan, the following three steps are required (they have been done OOTB):

Step 1: Specify the correct logical schema and refreshing logic in the refreshing tab of the variable definition.

The logical schema has to be CM_BIAPPS11G.

The refreshing logic should be getcmparam() with appropriate inputs, e.g.,

getcmparam('13P_CALENDAR_ID','#WH_DATASOURCE_NUM_ID')

Step 2: Update the variable definition in the variables tab of the load plan.

Go to the load plan component where you want to refresh the variables. In the Variables tab, right-click the variables and select ‘Refresh Variable Definition’ so that the variable definition in the LP is synchronized with its real definition. Once this is done, verify that the logical schema shows CM_BIAPPS11G and that the select statement shows the SQL embedded in the getcmparam() function.

Step 3: Check the refreshing checkbox at the appropriate LP step.

For truly global variables, Step 3 is done centrally, in the load plan system components:


The logic behind getcmparam() guarantees that the appropriate value of the variable is returned from BIACM, given the name of the variable, the DATASOURCE_NUM_ID passed in, and the load plan component (LPC) step where it is refreshed.
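The exact repository objects behind getcmparam() are internal to BIACM, but conceptually the embedded select behaves like the hypothetical sketch below (the table and column names here are illustrative, not the real BIACM schema):

SELECT PARAM_VALUE
FROM CM_PARAM_STORE                          -- hypothetical BIACM parameter table
WHERE PARAM_CODE = 'TYPE2_FLG'               -- $(param_code)
AND DATASOURCE_NUM_ID = #DATASOURCE_NUM_ID   -- product line passed in
AND GROUP_CODE = '<fact/dim group>'          -- derived from the LP step being refreshed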

Values stored in BIACM are strings, so all ODI variables refreshed from BIACM come in as strings. Each piece of consuming code (where a variable is used) must convert the data type accordingly. For example, dates are returned as strings in the format yyyy-mm-dd hh:mi:ss; the TO_DATE_VAR UDF is used to convert the returned string to DATE format. Number values are returned as strings as well.
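For example, a consuming interface that needs a refreshed date variable as a real DATE converts it along these lines (plain Oracle TO_DATE/TO_NUMBER is shown for illustration; the OOTB code wraps this in the TO_DATE_VAR UDF):

TO_DATE('#LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')   -- string to DATE
TO_NUMBER('#13P_CALENDAR_ID')                            -- string to NUMBER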

Checklist when Things Go Wrong

What can go wrong?

  • The value of a variable used in ETL is not in line with expectations.
  • A refreshed variable has no value returned, which fails the ETL run.

Overriding Rule (1)

  • In a load plan, when a variable is refreshed in a parent step (e.g., the root step), its value is inherited by all of its child steps, unless the variable is refreshed/overwritten in a child step.

  • However, if a variable is refreshed and/or overwritten in a child step, the value from that step overrides the value refreshed in the parent step. Other child steps at the same level are NOT affected; they still inherit the value refreshed in the parent step.

Overriding Rule (2) (unlikely to happen but it exists)

If a variable is refreshed both in a package and in a load plan, then the value refreshed from the package will override the value refreshed from the load plan.

When the value of a variable returned from BIACM is not in line with expectations:

  • Confirm where the variable is refreshed, e.g., from BIACM, from ETL tables in the DW, etc.
  • For BIACM PLV or group level variables:
    • Check its value(s) in the BIACM UI. For PLV variables, check the value in each product line; for group level variables, check the group specific values as well as the global default value.
    • Check whether the variable is refreshed in a root step of a load plan (refresh checkbox checked), and whether that root step is named after a fact/dimension group.
    • Check whether the variable is incorrectly refreshed or hardcoded in a child step belonging to that root step (to avoid overriding rule 1).
    • Check the ODI default value of the variable. If BIACM returns (null), i.e., nothing, for the variable, its ODI default value is used. Also, if we check the overwrite box (but not the refresh checkbox) of a variable in a load plan step but forget to provide a value, the ODI default value is used.
    • Note: overriding rule (2) is unlikely to happen to BIACM variables.
  • For variables refreshed from ETL tables in the DW, an incorrect value likely indicates an ETL run issue; check the run of that specific task.

When a variable has no value:

  • Confirm where the variable should be refreshed, e.g., from BIACM, from ETL tables in the DW, etc.
  • In rare cases, a variable may have no value returned when it is refreshed, and this leads to ETL failures.
  • ODI behaves as follows: it first refreshes the variable from its designated source (e.g., BIACM). If the source returns (null), i.e., nothing, for the variable, the ODI default value is used in ETL. However, if no ODI default value is provided, the variable will have no value.

Monday Nov 26, 2012

Currency Conversion in Oracle BI applications

Authored by Vijay Aggarwal and Hichem Sellami

A typical data warehouse contains star and/or snowflake schemas made up of dimensions and facts. The facts store various numerical information, including amounts, for example, Order Amount, Invoice Amount, etc.

With the truly global nature of business nowadays, end users want to view reports in their own currency or in a global/common currency as defined by their business.

This presents a unique opportunity in BI to provide converted amounts, either by pre-storing them or by converting on the fly when displaying reports to users.

Source Systems

OBIA caters to various source systems such as EBS, PSFT, Siebel, JDE, Fusion, etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions, and presenting them to OLTP users.

For example, EBS stores conversion rates between currencies classified by rate types, such as Corporate rate, Spot rate, and Period rate. Siebel stores exchange rates by rate types such as Daily. EBS/Fusion store conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT stores a Rate Multiplication Factor and a Rate Division Factor from which the rate must be calculated, whereas other source systems store the currency exchange rate directly.

OBIA Design

Consolidating data from various disparate source systems poses the challenge of conforming the various currencies, rate types, exchange rates, etc., and of designing the best way to present the amounts to users without affecting performance.

When consolidating the data for reporting in OBIA, we designed mechanisms in the common dimensions to allow users to report in their required currencies.

OBIA Facts store amounts in various currencies:

Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies.

Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.

Global Currencies: OBIA provides five global currencies. Three are used across all modules; the last two are for CRM only. A global currency is very useful for reports where data is viewed enterprise-wide. For example, a US-based multinational would want to see reports in USD, so the company would choose USD as one of its global currencies. OBIA allows users to define up to five global currencies during the initial implementation.

The term Currency Preference designates the set of values Document Currency, Local Currency, Global Currency 1, Global Currency 2, and Global Currency 3, which are shared among all modules. There are four more currency preferences specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5, which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics.

When choosing Local Currency as the currency preference, the data shows in the currency of the Ledger (or Business Unit) in the prompt, so it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section Changing Currency Preferences in the Dashboard.

Design Logic

When extracting the fact data, the OOTB mappings extract and load the document amount and the local amount into the target tables. They also load the exchange rates required to convert the document amount into the corresponding global amounts.

If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the local currency code and the local exchange rate. The load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the global currencies and looks up the corresponding exchange rates.

The lookup of exchange rates is done via the Exchange Rate Dimension provided as a Common/Conforming Dimension in OBIA.

The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and rate type. Two physical tables, W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G, are used to provide the lookups and conversions between currencies; the data is loaded from the source system’s ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. W_GLOBAL_EXCH_RATE_G, on the other hand, stores the currency conversions between the document currency and the five pre-defined global currencies for each day. Based on the requirements, the fact mappings can use one or both tables to do the conversion.
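As an illustration, a fact load that derives the Global Currency 1 amount performs a range lookup roughly like the sketch below; the fact staging table and the currency/rate-type literals are representative, not the exact OOTB definitions:

SELECT f.DOC_AMT * r.EXCH_RATE AS GLOBAL1_AMT
FROM W_SAMPLE_FACT_FS f                    -- representative fact staging table
JOIN W_EXCH_RATE_G r
  ON r.FROM_CURCY_CD = f.DOC_CURR_CODE     -- document currency of the transaction
 AND r.TO_CURCY_CD = 'USD'                 -- configured Global Currency 1 (illustrative)
 AND r.RATE_TYPE = 'Corporate'             -- configured global rate type (illustrative)
 AND f.EXCHANGE_DT BETWEEN r.START_DT AND r.END_DT   -- date-range match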

Currency design in OBIA also taps into the MLS and Domain architecture, allowing users to map currencies to a universal domain at implementation time. This is especially important for companies deploying OBIA with multiple source adapters.

Some Gotchas to Look for

It is necessary to think through the currencies during the initial implementation.

1) Identify the various types of currencies that are used by your business. Understand what your Local (or Base) and Document currencies will be. Identify the various global currencies in which your users will want to view reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time.

2) If you have a multi-source system, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, there is a source domain mapping already done.

Technical Section

This section briefly describes the technical scenarios employed in the OBIA adapters to extract data from each source system.

As explained in the previous sections, OBIA has two main tables that store the currency rate information: W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G.

W_EXCH_RATE_G stores all the currency conversions present in the source system and captures data for a date range. W_GLOBAL_EXCH_RATE_G has global currency conversions stored at a daily level; the challenge here is to store all 5 Global Currency Exchange Rates in a single record for each From Currency. Let’s look further into the source system extraction logic for each of these tables and understand the flow briefly.

EBS: In EBS, currency data is stored in the GL_DAILY_RATES table. As the name indicates, the GL_DAILY_RATES table has data at a daily level. However, in our warehouse we store the data with a date range and insert a new range record only when the exchange rate changes for a particular From Currency, To Currency, and Rate Type. Below are the main logical steps that we employ in this process.

  0. (Incremental flow only) Clean up the data in W_EXCH_RATE_G:
    1. Delete the records whose Start Date > minimum conversion date.
    2. Update the End Date of the existing records.
  1. Compress the daily data from the GL_DAILY_RATES table into range records (the incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter):
    1. Generate the Previous Rate, Previous Date, and Next Date for each daily record from the OLTP.
    2. Filter out the records whose Conversion Rate is the same as the Previous Rate, or whose Conversion Date lies within a single-day range.
    3. Mark the records as ‘Keep’ or ‘Filter’ and also get the final End Date for the single range record (unique combination of From Date, To Date, Rate, and Conversion Date).
    4. Filter out the records marked as ‘Filter’ in the INFA map.
  2. The above steps load W_EXCH_RATE_GS (step 0 updates/deletes W_EXCH_RATE_G directly).
  3. The SIL map then inserts/updates the GS data into W_EXCH_RATE_G.

These steps convert the daily records in GL_DAILY_RATES into range records in W_EXCH_RATE_G.
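In SQL terms, the compression of step 1 can be sketched with analytic functions, as below. This is a simplified illustration of the mapping logic, using the GL_DAILY_RATES column names:

SELECT FROM_CURRENCY,
       TO_CURRENCY,
       CONVERSION_TYPE,
       CONVERSION_DATE AS START_DT,
       -- the range ends the day before the next rate change
       LEAD(CONVERSION_DATE, 1, DATE '9999-12-31')
         OVER (PARTITION BY FROM_CURRENCY, TO_CURRENCY, CONVERSION_TYPE
               ORDER BY CONVERSION_DATE) - 1 AS END_DT,
       CONVERSION_RATE
FROM (SELECT t.*,
             LAG(CONVERSION_RATE)
               OVER (PARTITION BY FROM_CURRENCY, TO_CURRENCY, CONVERSION_TYPE
                     ORDER BY CONVERSION_DATE) AS PREV_RATE
      FROM GL_DAILY_RATES t)
WHERE PREV_RATE IS NULL OR CONVERSION_RATE <> PREV_RATE;   -- keep only rate changes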

We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G, since it stores data at a daily granular level. However, we do need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW. We use GROUP BY and CASE logic to achieve this.
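The pivot is essentially the following pattern; the five currency codes stand in for whatever global currencies are configured, and the source table name is illustrative:

SELECT FROM_CURCY_CD, EXCH_DT, RATE_TYPE,
       MAX(CASE WHEN TO_CURCY_CD = 'USD' THEN EXCH_RATE END) AS GLOBAL1_EXCH_RATE,
       MAX(CASE WHEN TO_CURCY_CD = 'EUR' THEN EXCH_RATE END) AS GLOBAL2_EXCH_RATE,
       MAX(CASE WHEN TO_CURCY_CD = 'GBP' THEN EXCH_RATE END) AS GLOBAL3_EXCH_RATE,
       MAX(CASE WHEN TO_CURCY_CD = 'JPY' THEN EXCH_RATE END) AS GLOBAL4_EXCH_RATE,
       MAX(CASE WHEN TO_CURCY_CD = 'AUD' THEN EXCH_RATE END) AS GLOBAL5_EXCH_RATE
FROM DAILY_RATES            -- illustrative source: one row per From/To currency and day
GROUP BY FROM_CURCY_CD, EXCH_DT, RATE_TYPE;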

Fusion: Fusion has extraction logic very similar to EBS. The only difference is that the cleanup logic mentioned in step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion, we bring in all the exchange rates during incremental runs as well and do the cleanup; the SIL then takes care of inserts/updates accordingly.

PeopleSoft: PeopleSoft does not have From Date and To Date explicitly in the source tables. Let’s look at an example. (Note that this is achieved from PS1 onwards only.)

1 Jan 2010 – USD to INR – 45

31 Jan 2010 – USD to INR – 46

PSFT stores records in the above fashion, meaning that the exchange rate of 45 for USD to INR is applicable from 1 Jan 2010 to 30 Jan 2010. We need to store data in this fashion in the DW.

PSFT also stores the exchange rate as RATE_MULT and RATE_DIV; we need to compute RATE_MULT/RATE_DIV to get the correct exchange rate.

We generate the From Date and To Date while extracting data from the source, and this has certain assumptions:

If a record gets updated/inserted in the source, it is extracted in the incremental run. If this updated/inserted record falls between other dates, we also extract the preceding and succeeding records (based on dates), because we need to generate range records and there are now 3 records whose ranges have changed. Taking the same example as above, if a new record gets inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we still extract them because their ranges are affected. A sketch of this range generation is shown below.
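Here is a condensed sketch of both PSFT specifics, deriving the rate from RATE_MULT/RATE_DIV and generating From/To dates from the effective-dated rows (the source table and column names are representative):

SELECT FROM_CUR,
       TO_CUR,
       RT_TYPE,
       RATE_MULT / RATE_DIV AS EXCH_RATE,   -- PSFT stores the rate as two factors
       EFFDT AS FROM_DT,
       -- each range runs until the day before the next effective-dated row
       LEAD(EFFDT, 1, DATE '9999-12-31')
         OVER (PARTITION BY FROM_CUR, TO_CUR, RT_TYPE ORDER BY EFFDT) - 1 AS TO_DT
FROM PS_RT_RATE_TBL;                        -- representative PSFT rate table

Applied to the example above, the 1 Jan 2010 row gets a To Date of 30 Jan 2010, and the 31 Jan 2010 row remains open-ended until the next rate appears.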

Similar logic is used for the Global Exchange Rate extraction. We create the range records and load them into a temporary table, then join to the Day Dimension, create individual daily records, and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date, and Rate Type.

Siebel: Siebel facts depend heavily on Global Exchange Rates, and almost none of them use individual exchange rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used for Siebel from the PS1 release onwards.

As of January 2002, the Euro triangulation method for converting between currencies belonging to EMU members is no longer needed for present and future currency exchanges. However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries: if a country is a member of the European Monetary Union (EMU), its currency should be converted to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked.
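As a worked example of triangulation, converting 100 DEM to FRF goes through the Euro using the official fixed EMU conversion rates (1 EUR = 1.95583 DEM, 1 EUR = 6.55957 FRF):

SELECT 100 / 1.95583 * 6.55957 AS AMOUNT_FRF   -- DEM -> EUR -> FRF, ~335.39 FRF
FROM DUAL;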

Because of this triangulation requirement, there are multiple extraction flows in SEBL, i.e., EUR to EMU, EUR to NonEMU, EUR to DMC, and so on. We load W_EXCH_RATE_G through these multiple flows. This has been kept the same as in previous versions of OBIA.

W_GLOBAL_EXCH_RATE_G, being a new table, has no such needs. However, like PSFT, SEBL does not have From Date and To Date columns in the source tables, so we use extraction logic similar to that explained in the PSFT section.

What if all 5 Global Currencies configured are the same?

As mentioned in the previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns, using the CASE and GROUP BY logic described above. This approach poses a unique problem when all 5 chosen Global Currencies are the same. For example, if the user configures all 5 Global Currencies as ‘USD’, the extract logic will not be able to generate a record for From Currency = USD, because not all source systems will have a USD->USD conversion record.

We have _Generated mappings to take care of this case: they generate a record with Conversion Rate = 1 for such cases, conceptually as sketched below.
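A simplified sketch of the fabricated same-currency row (the real mapping covers each configured global currency and rate type):

SELECT 'USD'    AS FROM_CURCY_CD,   -- the currency all 5 globals were set to
       'USD'    AS TO_CURCY_CD,
       d.CAL_DT AS EXCH_DT,
       1        AS EXCH_RATE        -- a same-currency conversion is always 1
FROM W_DAY_D d;                     -- day dimension supplies one row per date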

Reusable Lookups

Before PS1, we had a mapplet for currency conversions. In PS1, we only have reusable lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various fact mappings. Anyone who wants to do a lookup on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G must use these lookups; a direct join or lookup on the tables might return wrong data.

Changing Currency Preferences in the Dashboard

In the 7.9.6.x series, all amount metrics in OBIA showed the Global Currency 1 amount, and customers needed to change the metric definitions to show them in another currency preference. Project Analytics, though, has supported currency preferences since the 7.9.6 release, and a tech note was published so that customers of other modules could add toggling between currency preferences to the solution.

List of Currency Preferences

Starting from the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. The new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the XML file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder, under <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml

This file contains the list of currency preferences, like “Local Currency”, “Global Currency 1”, etc., which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic. In Static mode, all users see the full list as in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the RPD, so that the list of currency preferences is based on the user’s roles. BI Applications built a subject area, “Dynamic Currency Preference”, to run this query and give every user only the list of currency preferences required by their application roles.

Adding Currency to an Amount Field

When the user selects one of the items from the currency prompt, all the amounts on that page show in the currency corresponding to that preference. For example, if the user selects “Global Currency 1” from the prompt, all data shows in Global Currency 1 as specified in the Configuration Manager. If the user selects “Local Currency”, all amount fields show in the currency of the Business Unit selected in the BU filter of the same page. If no particular Business Unit is selected in that filter, and the data selected by the query contains amounts in more than one currency (for example, one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field), and depending on the setup (see the next paragraph), the user may receive an error.

There are two ways to add the Currency field to an amount metric:

  1. In the form of a currency code, like USD, EUR, etc. For this, the user needs to add the field “Apps Common Currency Code” to the report. This field is in every subject area, usually under the table “Currency Tag” or “Currency Code”.
  2. In the form of a currency symbol ($ for USD, € for EUR, etc.). For this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option of the Column Actions drop-down list. Typically this column should be the “BI Common Currency Code” available in every subject area. To do so:
    1. Select Column Properties option in the Edit list of a metric.
    2. In the Data Format tab, select Custom as Treat Number As.
    3. Enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column], where column is the “BI Common Currency Code” column, defined to take the currency code value based on the currency preference chosen by the user in the currency preference prompt. A concrete example is sketched after this list.
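For example, with an illustrative subject area and table name, the entry would look like:

[$:currencyTagColumn="Sales - CRM Pipeline"."Currency"."BI Common Currency Code"]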

Thursday Sep 13, 2012

BI Applications overview

Welcome to the Oracle BI applications blog! This blog will talk about various features, the general roadmap, and descriptions of functionality and implementation steps related to Oracle BI applications. In this first post we start with an overview of the BI apps, and we will delve deeper into some of the topics below in the upcoming weeks and months. If there are other topics you would like us to talk about, please feel free to provide feedback.

The Oracle BI applications are a set of pre-built applications that enable pervasive BI by providing role-based insight for each functional area, including sales, service, marketing, contact center, finance, supplier/supply chain, HR/workforce, and executive management. For example, Sales Analytics includes role-based applications for sales executives, sales management, and front-line sales reps, each of whom has different needs.

The applications integrate and transform data from a range of enterprise sources—including Siebel, Oracle, PeopleSoft, SAP, and others—into actionable intelligence for each business function and user role.

This blog starts with the key benefits and characteristics of Oracle BI applications. In a series of subsequent posts, each of these points will be explained in detail.

Why BI apps?

  • Demonstrate the value of BI to business users by showing reports, dashboards, and models that can answer their business questions as part of the sales cycle.
  • Demonstrate the technical feasibility of a BI project, significantly lowering risk and improving the chance of success.
  • Build-vs.-buy benefit
  • No need to start with a blank sheet of paper.
  • Help consolidate disparate systems
  • Data integration in M&A situations
  • Insulate BI consumers from changes in the OLTP
  • Present OLTP data and highlight issues of poor data / missing data – and improve data quality and accuracy

Prebuilt Integrations

BI apps support prebuilt integrations against leading ERP sources: Fusion Applications, E-Business Suite, PeopleSoft, JD Edwards, Siebel, and SAP.

  • Co-developed with input from functional experts on the BI and Applications teams.
  • Out-of-the-box dimensional model to source model mappings
  • Multi-source and multi-instance support

Rich Data Model

BI apps have a very rich dimensional data model, built over 10 years, that incorporates best practices from a BI modeling perspective and reflects source system complexities.

  • Conformed dimensional model across all business subject areas allows cross functional reporting, e.g. customer / supplier 360
  • Over 360 fact tables across 7 product areas
    • CRM – 145, SCM – 47, Financials – 28, Procurement – 20, HCM – 27, Projects – 18, Campus Solutions – 21, PLM – 56
  • Supported by 300 physical dimensions
  • Support for extensive calendars: Gregorian, enterprise, and ledger-based
  • Conformed data model and metrics for real time vs warehouse based reporting
  •  Multi-tenant enabled

Extensive BI related transformations

BI apps ETL and data integration support the various transformations required for dimensional models and reporting requirements. All of these have been distilled into common patterns and abstracted logic that can be readily reused across different modules.

  • Slowly Changing Dimension support
  • Hierarchy flattening support
  • Row / Column Hybrid Hierarchy Flattening
  • As Is vs. As Was hierarchy support
  • Currency conversion: support for 3 corporate, CRM, ledger, and transaction currencies
  • UOM conversion
  • Internationalization / Localization
  • Dynamic Data translations
  • Code standardization (Domains)
  • Historical Snapshots
  • Cycle and process lifecycle computations
  • Balance Facts
  • Equalization of GL accounting chartfields/segments
  • Standardized values for categorizing GL accounts
  • Reconciliation between GL and subledgers to track accounted/transferred/posted transactions to GL
  • Materialization of data only available through costly and complex APIs e.g. Fusion Payroll, EBS / Fusion Accruals
  • Complex event interpretation of source data, e.g.:
    • What constitutes a transfer
    • Deriving supervisors via position hierarchy
    • Deriving primary assignment in PSFT
    • Categorizing and transposing payroll balances into specific metrics to support side-by-side comparison of measures such as fixed salary, variable salary, tax, bonus, and overtime payments.
    • Counting of events, e.g. converting events into fact counters so that, for example, the number of hires can easily be added up and compared alongside total transfers and terminations.
    • Multi-pass processing of multiple sources, e.g. headcount, salary, promotion, and performance, to allow side-by-side comparison.
    • Adding value to data to aid analysis through banding, additional domain classifications, and groupings to allow higher-level analytical reporting and data discovery.
    • Calculation of complex measures, for example:
      • COGS, DSO, DPO, inventory turns, etc.
      • Transfers within a hierarchy, or out of/into a hierarchy, relative to a viewpoint in the hierarchy.

Configurability and Extensibility support 

BI apps support extensibility for various entities, either as automated extensibility or as part of the extension methodology

  • Key flexfield and descriptive flexfield support
  • Extensible attribute support (JDE)
  • Conformed Domains

ETL Architecture

BI apps offer a modular adapter architecture that allows multiple product lines to be supported in a single conformed model

  • Multi Source
  • Multi Technology
  • Orchestration – creates a load plan that takes into account task dependencies and the customer's deployment, generating a plan from multiple complex ETL tasks
  • Plan optimization allowing parallel ETL tasks
  • Oracle: bitmap indexes and partition management
  • High availability support
  • Follow the sun support

TCO

BI apps provide several utilities and capabilities that help with overall total cost of ownership and ensure a rapid implementation

  • Improved cost of ownership – lower cost to deploy
  • On-going support for new versions of the source application
  • Task-based setup flows
  • Data Lineage
  • Functional setup performed in Web UI by Functional person
  • Configuration
  • Test to Production support

Security

BI apps support both data and object security, enabling implementations to quickly configure the application per the reporting security needs

  • Fine-grained object security at the report/dashboard and presentation catalog level
  • Data Security integration with source systems
  • Extensible to support external data security rules

Extensive Set of KPIs

  • Over 7000 base and derived metrics across all modules
  • Time series calculations (YoY, % growth etc)
  • Common Currency and UOM reporting
  • Cross subject area KPIs (analyzing HR vs GL data, drill from GL to AP/AR, etc)

Prebuilt reports and dashboards

  • 3000+ prebuilt reports supporting a large number of industries
  • Hundreds of role based dashboards
  • Dynamic currency conversion at dashboard level

Highly tuned Performance

The BI apps have been tuned over the years for both very performant ETL and dashboard performance. The applications use best practices and advanced database features to enable the best possible performance.

  • Optimized data model for BI and analytic queries
  • Prebuilt aggregates and the ability for customers to easily create their own aggregates on warehouse facts, allowing scalable end user performance
  • Incremental extracts and loads
  • Incremental Aggregate build
  • Automatic table index and statistics management
  • Parallel ETL loads
  • Source system deletes handling
  • Low latency extract with Golden Gate
  • Micro ETL support
  • Bitmap Indexes
  • Partitioning support
  • Modularized deployment, start small and add other subject areas seamlessly

Source-Specific Staging and Real-Time Schema

  • Support for source specific operational reporting schema for EBS, PSFT, Siebel and JDE

Application Integrations

The BI apps also allow for integration with source systems as well as other applications, providing added value through BI and enabling BI consumption during operational decision making

  • Embedded dashboards for Fusion, EBS and Siebel applications
  • Action Link support
  • Marketing Segmentation
  • Sales Predictor Dashboard
  • Territory Management

External Integrations

The BI apps data integration choices include support for loading external data

  • External data enrichment choices: UNSPSC, item class, etc.; extensible
  • Spend Classification

Broad Deployment Choices

  • Exalytics support
  • Databases: Oracle, Exadata, Teradata, DB2, MSSQL
  • ETL tool of choice: ODI (coming), Informatica

Extensible and Customizable

  • Extensible architecture and Methodology to add custom and external content
  • Upgradable across releases

Thanks for reading this long post, and be on the lookout for future posts. We look forward to your valuable feedback on these topics, as well as suggestions on what other topics you would like us to cover.
