Monday Feb 17, 2014

How to Compare RPDs

Author: Vivian(Weibei) Li

Introduction

Comparing RPDs is a necessary and useful step in RPD development. Consider doing it in the following cases:

  • You have a customized RPD and want to find out the differences between the customized RPD and the OOTB RPD.
  • You did some customization on the RPD and want to check if the customization is done as expected.
  • An administrator wants to check the differences between a newer RPD and older RPDs.
  • You want to upgrade your RPD and are not sure whether there are any upgrade issues from one release to another. You can compare the RPDs for the two releases to catch issues before upgrading.

There are many more cases where comparing RPDs is useful. This blog describes how to do it.

There are two ways to compare RPDs: one is to use the compare utilities from the command line, and the other is to use the compare dialogue in the admin tool.

You can choose either way to compare your RPDs. Here is a comparison of the two methods and some tips:

  • The RPD compare utility is executed from the command line, so it does not need the admin tool to be open.
  • The RPD compare dialogue is used directly in the admin tool.
  • When two local RPDs are compared, especially when they require different OBIEE versions, the RPD compare utility is recommended.
  • When modifications are performed on the RPD with the admin tool and it is to be compared with a benchmark RPD (for example, the original RPD), the RPD compare dialogue is recommended. See the details in the next sections.

Compare RPDs with Utility

OBIEE ships with utilities for RPD comparison. You can call them with a few commands from the Windows command prompt. The steps to compare RPDs with this option are listed below.

Step 1 - Choose your OBIEE version

If the two RPDs you are comparing belong to releases that require different versions of OBIEE, use the later OBIEE version to do the comparison.
For example, say there are two releases, release 1 and release 2, and release 2 uses the later version of OBIEE. To compare the two releases, you need to use the OBIEE version for release 2.

Step 2 – Check out the subset of RPDs

It is recommended to extract, from each RPD you want to compare, only the subset containing the projects you are interested in. Comparing the entire RPDs is inefficient and time-consuming.

Step 3 – Equalize the RPDs

Before comparing the RPDs, you should first equalize them with the equalizerpds utility. The equalizerpds utility equalizes the Upgrade IDs of objects in two separate repositories. If objects have the same Upgrade ID, they are considered to be the same object. The utility compares Upgrade IDs from the first repository (typically the original repository) with Upgrade IDs from the second repository (typically the modified repository). Then, the utility equalizes the Upgrade IDs of objects with the same name, using the Upgrade ID from the original repository.

equalizerpds.exe can be found under <ORA_HOME>\Oracle_BI1\bifoundation\server\bin.

· Syntax

The equalizerpds utility takes the following parameters:

equalizerpds [-B original_repository_password] -C original_repository_name
             [-E modified_repository_password] -F modified_repository_name
             [-J rename_map_file] [-O output_repository_name] [-Y equalStringSet]

· Explanation

Original and modified repository – Use the base repository as the ‘original’ repository and the repository you want to compare as the ‘modified’ repository.

For example, if you want to compare the release 1 and release 2 RPDs to find upgrade issues when upgrading from release 1 to release 2, use the release 1 RPD as the original repository and the release 2 RPD as the modified repository.

rename_map_file

When you equalize objects, you can lose track of object renames because legitimate object renames become different objects. In other words, intentional renames you did in the repository might be changed to different Upgrade IDs, so subsequent merges erroneously treat the renamed object as a new object. To avoid this situation, enter the before and after names of intentionally renamed objects in a rename map file that you then pass to the utility. The equalizerpds utility uses the information in the file to ensure that the original IDs are used in the renamed current objects.

rename_map_file is a text file containing a list of objects that were renamed and that you want to equalize. The format is a tab-separated file with the following columns:

TypeName     Name1     Name2

For example, the logical column "Core"."Dim - Customer"."Address4" is renamed to "Core"."Dim - Customer"."Address 4" from release 1 to release 2. The corresponding line in the file is:

Logical Column     "Core"."Dim - Customer"."Address4"     "Core"."Dim - Customer"."Address 4"

Tip: How to find out the TypeName value?

Query your object with the Query Repository tool in the admin tool, and you will find the TypeName value in the result.

1. Open the admin tool and go to Tools -> Query Repository.
2. In the popup dialogue, query your object.
3. You will find the Type value in the result.

You can put this file in any folder on your machine and give its absolute path in the rename_map_file parameter. See the example below.

· An equalization command example

equalizerpds -B pwd123 -C C:\rpdcomparison\release1\release1.rpd -E pwd123 -F C:\rpdcomparison\release2\release2.rpd -J C:\rpdcomparison\rename-map-file.txt -O C:\rpdcomparison\release2\equalizedrpd.rpd

Step 4 – Compare the RPDs

Now you can compare the RPDs with the comparerpd utility. comparerpd.exe can be found under <ORA_HOME>\Oracle_BI1\bifoundation\server\bin.

· Syntax

The comparerpd utility takes the following parameters:

comparerpd [-P modified_rpd_password] -C modified_rpd_pathname
           [-W original_rpd_password] -G original_rpd_pathname
           {-O output_csv_file_name | -D output_patch_file_name | -M output_mds_xml_directory_name}
           -E -8

· Explanation

Original and modified repository – Use the base repository as the ‘original’ repository and the repository you want to compare as the ‘modified’ repository. The ‘modified’ repository should be the equalized RPD obtained in Step 3.

-O output_csv_file_name is the name and location of a CSV file where you want to store the repository object comparison output.

-D output_patch_file_name is the name and location of an XML patch file where you want to store the differences between the two repositories.

-M output_mds_xml_directory_name is the top-level directory where you want to store diff information in MDS XML format.

Note: You can specify an output CSV file using -O, an XML patch file using -D, or an MDS XML directory tree using -M. You cannot specify more than one output type at the same time.

· A comparison command example

comparerpd -P pwd123 -C C:\rpdcomparison\release2\equalizedrpd.rpd -W pwd123 -G C:\rpdcomparison\release1\release1.rpd -O C:\rpdcomparison\results.csv

Compare RPDs with Compare Dialogue

In this section, I will describe how to compare RPDs with the compare dialogue. The compare dialogue must be used with the admin tool open.

Compare the MUD RPD Before and After the Modification

· Open your modified MUD RPD (the one containing your fixes).

· Go to ‘File’->‘Multiuser’->’Compare with Original’.

It will compare the currently modified RPD with the local copy of the original RPD.

Compare the local RPDs

You can also use the compare dialogue to compare local RPDs.

Note: It is recommended to use the compare utility to compare the local RPDs. Remember to extract the subset of RPDs before comparing. Comparing the entire RPDs will be time-consuming.

· Open your local RPD, let’s say RPD1, with the admin tool. This RPD is the base RPD in your comparison.

· Go to ‘File’->’Compare’.

· Select the repository or XML file in the popup compare dialogue.

· Enter the password for the repository which you want to compare to, let’s say RPD2. RPD2 will be the modified repository.

· The compare dialogue will open RPD2, equalize it, compare it with RPD1, and finally show the results.

· You can see what the object looks like in RPD1 and RPD2 by clicking ‘View 1’ and ‘Edit 2’.

View 1 – the object in RPD1

Edit 2 – the object in RPD2

· Save the diff as a .CSV file on your local machine.

In summary, RPD comparison is a valuable tool for RPD development. Using it appropriately will give you more confidence in your RPD modifications, fixes, and upgrades. Adding RPD comparison as a standard step makes RPD development smoother and less error-prone.

Tuesday Nov 26, 2013

Introduction and Tips of Oracle BI Apps Variable Refresh

Author: Chuan Shi

Introduction

The ETL logic in BI Apps uses parameters in packages, interfaces, load plans, and knowledge modules (KM) to control the ETL behaviors. Parameters can be configured by customers, pre-seeded in the ETL code, or maintained internally:

  • Data Load parameters that can be configured by customers are maintained at product line (PLV) and fact/dimension group levels in the BI Apps Configuration Manager (BIACM).
  • Some parameters are pre-seeded in the ETL code OOTB, e.g. DATASOURCE_NUM_ID.
  • Other parameters are used internally by the load plan to control the execution of the load plan, e.g. EXECUTION_ID.

ETL parameters are handled by using ODI variables. The purpose of this blog is to explain ODI variable refreshing to ensure that correct values of variables are used in Oracle BI Apps ETL.

ODI Variable Classification

To ensure that correct values are used in ETL, variables are refreshed at run time. A variable can be refreshed in either a package or a load plan. Variables can be classified into four categories based on whether and where they are refreshed in ODI (if a variable needs to be refreshed, there must be a refreshing logic specified for it, which is discussed later).


The four categories are:

  1. Not refreshed

Definition: Generally, internal parameters used by the load plan to control the ETL execution, and therefore they shall not be refreshed.

Examples: DATASOURCE_NUM_ID, EXECUTION_ID

  2. Refreshed in the package

Definition: Variables that are not feasible to refresh in load plans. (The refreshing logic of such a variable depends on ETL run facts. It sources from DW tables and uses the QUALIFY user-defined function (UDF). ODI fails to interpret QUALIFY at the load plan level in some cases.)

Examples: IS_INCREMENTAL, LAST_EXTRACT_DATE

  3. Hardcoded in the load plan

Definition: Variables whose values are hardcoded by overwriting them in load plans. The variables take the hardcoded values in ETL.

Examples: DOMAIN_CODE, LOOKUP_TYPE

  4. Refreshed in the load plan

Definition: Variables whose values are configured in BIACM by customers. (In other words, the values of these variables come from BIACM. The refreshing logic of such a variable uses a UDF to extract its value from BIACM.)

Examples: UPDATE_ALL_HISTORY, TYPE2_FLG

For a complete list of ODI variables, please refer to this post.

Refreshing Variables in a Package

Refreshing variables in a package is straightforward. One needs to create Refresh Variable steps for the variables to be refreshed. The screenshot below shows examples of refreshing IS_INCREMENTAL.

Hard-coding Variables in a Load Plan

Some variables, such as DOMAIN_CODE and LOOKUP_TYPE, are hardcoded in load plan components. To do that, go to the load plan component where the variables are to be hardcoded, select the Overwrite checkbox, and provide the hardcoded values for the variables. The screenshot below shows examples of hardcoding DOMAIN_CODE and LOOKUP_TYPE.

Refreshing Variables from BIACM

BIACM is a central UI where customers can define the values of data load parameters (i.e., ODI variables), along with many other features it offers. ODI variables that are refreshed in load plans have their values extracted from BIACM. We also refer to such variables as BIACM variables.

BIACM Variables are classified into truly global variables, PLV specific variables, and fact/dimension group level variables.

  • A truly global variable (e.g., 13P_CALENDAR_ID) is a variable that has the same value in all product lines (i.e., EBS11510, PSFT90, etc) and for all fact/dimension groups. Truly global variables are refreshed centrally in the load plan system components.
  • A PLV specific variable (e.g., LANGUAGE_BASE) is a variable that takes the same value for all fact/dimension groups within a product line, but different values can be used in different product lines. They are refreshed individually in consuming load plan dev components.
  • A fact/dimension group level variable (e.g., UPDATE_ALL_HISTORY) is group specific. It can take different values in different fact/dimension groups within the same PLV and across different PLVs. They are refreshed individually in consuming load plan dev components.

From a variable value overriding perspective:

  • A truly global variable has a unique value. PLVs and fact/dimension groups cannot override its value.
  • A PLV variable has product line specific values (e.g., LANGUAGE_BASE takes the value of US in EBS product lines but ENG in PSFT product lines). The value is the same for all fact/dimension groups within that product line.
  • A fact/dimension group level variable has group specific values (e.g., TYPE2_FLG has the value of Yes in Position Dimension Hierarchy, while it has the value of No in Asset Dimension). Also, such a variable has a global default value. If a fact/dimension group does not specify the value of such a variable for its use, then the global default value will be used whenever this variable is called by that group (e.g., the global default value of TYPE2_FLG is No).

These variables are defined in ODI. To ensure that variable refreshing works correctly, there are some rules on the definitions of ODI variables:

  • Set the ‘Keep History’ option to ‘No History’;
  • Always provide a default value (the default value will be used if refreshing from BIACM does not return a value for some reason; otherwise the ETL will fail). As a good practice, the ODI default value of the variable can be set to the same as the global value of the variable in BIACM.

(Whenever the Keep History option or the Default Value of a variable is changed, the scenarios that use this variable need to be regenerated.)

Once ODI variables are defined, a refreshing logic is needed to refresh them from BIACM. In this regard,

  • The ODI UDF GET_CM_PARAM is used.
  • To return the correct value for a variable, we need to specify the following in the refreshing logic:
    • the variable name;
    • the product line;
    • the fact/dimension group.
  • Syntax: getcmparam($(param_code), $(DATASOURCE_NUM_ID))
    • $(param_code) is the name of the variable (e.g., TYPE2_FLG).
    • $(DATASOURCE_NUM_ID) is used to specify the product line.

For PLV/group level variables, we pass #DATASOURCE_NUM_ID as $(DATASOURCE_NUM_ID);

e.g., getcmparam('TYPE2_FLG','#DATASOURCE_NUM_ID')

For truly global variables, we pass #WH_DATASOURCE_NUM_ID as a pseudo-PLV ID.

e.g., getcmparam('13P_CALENDAR_ID','#WH_DATASOURCE_NUM_ID')

  • Do not pass the fact/dimension group directly into the syntax; it is determined by where the variable is refreshed.

BIACM variables are refreshed in load plans. To refresh a variable in a load plan, the following three steps are required (they have been done OOTB):

Step 1: Specify the correct logical schema and refreshing logic in the refreshing tab of the variable definition.

The logical schema has to be CM_BIAPPS11G.

The refreshing logic should be getcmparam() with appropriate inputs, e.g.,

getcmparam('13P_CALENDAR_ID','#WH_DATASOURCE_NUM_ID')

Step 2: Update the variable definition in the variables tab of the load plan.

Go to the load plan component where you want to refresh the variables. In the Variables tab, right click on the variables and select ‘Refresh Variable Definition’ so that the variable definition in the LP is synchronized with its real definition. Once this is done, verify that the logical schema is showing CM_BIAPPS11G, and the select statement is showing the embedded SQL in the getcmparam() function.

Step 3: Check the refreshing checkbox at the appropriate LP step.

For truly global variables, Step 3 becomes:


The logic behind getcmparam() guarantees that appropriate value of the variable is returned from BIACM given the name of the variable, the DATASOURCE_NUM_ID passed in, and the LPC step where it is refreshed.

Values stored in BIACM are strings; therefore all ODI variables refreshed from BIACM come in as strings. Each piece of consuming code (where a variable is used) must convert the data type accordingly. For example, dates are returned as strings in the format yyyy-mm-dd hh:mi:ss, and the TO_DATE_VAR UDF is used to convert the returned string to DATE format. Number values are returned as strings as well.
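For example, a BIACM value returned as the string '2013-11-26 00:00:00' could be converted to a date in Oracle SQL roughly as shown below. This is a minimal illustration only; the format mask is an assumption based on the string format above, and it is not the actual TO_DATE_VAR implementation.

-- Convert the string returned from BIACM into an Oracle DATE (illustrative only)
SELECT TO_DATE('2013-11-26 00:00:00', 'YYYY-MM-DD HH24:MI:SS') AS refresh_date
FROM dual;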

Checklist when Things Go Wrong

What can go wrong?

  • The value of a variable used in ETL is not in line with expectation.
  • A refreshed variable has no value returned, and this fails the ETL run.

Overriding Rule (1)

  • In a load plan, when a variable is refreshed in a parent step (e.g., the root step), its value is inherited by all of its child steps, unless the variable is refreshed/overwritten in a child step.
  • However, if a variable is refreshed and/or overwritten in a child step, the value refreshed from that step overrides the value refreshed from the parent step. Other child steps at the same level are NOT affected; they still inherit the value refreshed in the parent step.

Overriding Rule (2) (unlikely to happen but it exists)

If a variable is refreshed both in a package and in a load plan, then the value refreshed from the package will override the value refreshed from the load plan.

When the value of a variable returned from BIACM is not in line with expectation:

  • Confirm where the variable is refreshed, e.g., BIACM, ETL tables in DW, etc.
  • For BIACM PLV or group level variables:
    • Check its value(s) in the BIACM UI. For PLV variables, check the value in each product line; for group level variables, check the group specific values as well as the global default value.
    • Check whether the variable is refreshed in a root step of a load plan (refresh checkbox checked). At the same time, check whether the root step is named after a fact/dimension group.
    • Check whether the variable is incorrectly refreshed or hardcoded in a child step belonging to that root step (see overriding rule 1).
    • Check the ODI default value of this variable. If BIACM returns (null), i.e., nothing, for this variable, its ODI default value will be used. Also, if we check the overwrite box (but not the refresh checkbox) of a variable in a load plan step, but forget to provide the value, then the ODI default value will be used.
    • Note: overriding rule (2) is unlikely to happen to BIACM variables.
  • For variables refreshed from ETL tables in DW, an incorrect value likely indicates a run issue; check the run of that specific task.

When a variable has no value:

  • Confirm where the variable should be refreshed, e.g., BIACM? ETL tables in DW? etc.
  • In rare cases, a variable may not have a value returned when it is refreshed, and this leads to ETL failures.
  • ODI behaves like this: it first refreshes the variable from its designated source (e.g., BIACM). If its source returns (null), i.e., nothing, for this variable, the ODI default value of this variable will be used in ETL. However, if the ODI default value is not provided, then this variable will not have a value.

Wednesday Nov 13, 2013

How to handle BIAPPS ETL failures due to duplicate data issues in Oracle DW

Author: Amit Kothari

Sometimes there are duplicate rows in the source tables. Because of this, while running an ODI Load Plan (LP) in BI Applications 11.1.1.7.1, the ODI sessions may error out while creating unique indexes. It is always recommended to fix the source data issues first, but that may not always be feasible, or you may simply want to fix the data in the warehouse and continue with the LP run. This blog shows one way to do that: first we find the table and the indexed columns for the failed index, and then we substitute them into a delete SQL statement to remove the duplicate rows.

Sample Error:

ODI-1217: Session EXEC_TABLE_MAINT_PROC (12383500) fails with return code 20000.
ODI-1226: Step TABLE_MAINT_PROC fails after 1 attempt(s).
ODI-1232: Procedure TABLE_MAINT_PROC execution fails.
ODI-1228: Task TABLE_MAINT_PROC (Procedure) fails on the target ORACLE connection BIAPPS_DW.
Caused By: java.sql.SQLException: ORA-20000: Error creating Index/Constraint : W_EMPLOYEE_D_U1=> ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
ORA-06512: at line 152

Solution:

From SQL Developer, run this SQL in your ODI repository schema, using the failed index name from the error above. It returns the indexed columns for the failed index, for example:

SELECT st.table_name, sc.col_name
FROM snp_table st, snp_key sk, snp_key_col skc, snp_col sc
WHERE sk.key_name = 'W_EMPLOYEE_D_U1'
  AND sk.i_table = st.i_table
  AND skc.i_key = sk.i_key
  AND sc.i_col = skc.i_col
  AND st.i_table = sc.i_table;


When we run the above SQL, we get the table name W_EMPLOYEE_D and the columns EFFECTIVE_FROM_DT, DATASOURCE_NUM_ID, and INTEGRATION_ID.

These are the columns with duplicate values that caused the unique index creation to fail. Now we can delete the duplicate rows via a delete SQL statement; make sure you back up the table data before issuing the delete.
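Before deleting anything, it can help to confirm which key combinations are duplicated and to take a backup copy of the table. A sketch along these lines (the backup table name W_EMPLOYEE_D_BKP is just an example; the columns are the ones identified above):

-- Take a backup copy of the table before deleting anything
CREATE TABLE W_EMPLOYEE_D_BKP AS SELECT * FROM W_EMPLOYEE_D;

-- List the key combinations that appear more than once
SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, EFFECTIVE_FROM_DT, COUNT(*) AS row_count
FROM   W_EMPLOYEE_D
GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID, EFFECTIVE_FROM_DT
HAVING COUNT(*) > 1;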

From SQL Developer, run this SQL in your Oracle target schema and then simply restart the ODI load plan. A sample delete SQL based on the above index is shown below; just substitute the columns and the table name for your use case.

DELETE FROM W_EMPLOYEE_D A
WHERE ROWID > (SELECT MIN(ROWID)
               FROM W_EMPLOYEE_D B
               WHERE A.INTEGRATION_ID = B.INTEGRATION_ID
                 AND A.DATASOURCE_NUM_ID = B.DATASOURCE_NUM_ID
                 AND A.EFFECTIVE_FROM_DT = B.EFFECTIVE_FROM_DT);
COMMIT;

Thursday Sep 12, 2013

Cleaning up ODI I$ and C$ work tables

ODI uses I$ and C$ work tables for E-LT processing. If there are failures, sometimes these work tables can be left over. This post explains how BI Apps provides a utility to help clean these up.

Monday Apr 22, 2013

BI Warehouse - Analytics enabled by warehouse

A question that often comes up is what value a warehouse-based solution brings. Below are a few use cases that exemplify the need for persistent storage. Real-time and warehouse-based analytics are complementary and serve different use cases. In a follow-up post we will also cover the use cases best served by a real-time rather than a warehouse-based solution.

  • Currency Conversion – 4 corporate-level and functional currencies available in the warehouse. Sample use case: show my worldwide bookings for last week in US dollars, e.g. a Bookings / Billings / Backlog report.
  • UOM Conversion – Common UOM reporting. Sample use case: show me units shipped worldwide in Sales UOM for a product.
  • Snapshots / Trend Reports – Balance / snapshot tables providing trend reporting. Sample use cases: Inventory Balance, Headcount Balance, and Opportunity Revenue trend views; “What changed” reports.
  • Time Series Calculations – Enable usage of time series calculations. Sample use cases: Year Ago, Qtr Ago, Month Ago; YTD, QTD, MTD, and % Growth metrics.
  • Event Interpretation – Assignment records having multiple transitions. Sample use case: # of hires, # of transfers, # of promotions, and # of resignations require event interpretation and comparison to prior records.
  • Hierarchies / SCD / As Is vs As Was Reports – Report on various versions of hierarchies / historical views. Sample use case: show me revenue numbers with the current sales org over the last 3 years.

Also, performance is one of the key values of the warehouse, due to the following transformations and services it provides.

  • Denormalization – Most entities are denormalized in the warehouse, resulting in 2–3x fewer joins in queries.
  • Symmetrical queries – All queries have the same shape, so there is one pattern to tune. Normalized schemas result in unpredictable query shapes requiring case-by-case tuning.
  • Indexing – All columns are (bitmap) indexed and can be freely used in prompts, filters, group by, etc.
  • Outer Joins – OLTP tables have optional relationships; the resulting large number of outer joins causes tuning and modeling issues.
  • Class Tables – Overloaded tables containing multiple entities sometimes cause query plan distortions.
  • Transpositions – Using column A or column B depending on transaction type requires a CASE statement, which cannot be indexed or used in filters, etc.
  • Date Arithmetic – Timestamp-to-date joins require date functions, which suppress index usage.
  • Aggregates – There are no aggregates in OLTP; warehouse aggregates enable Exalytics usage and other performance tweaks.

Monday Nov 26, 2012

Currency Conversion in Oracle BI applications

Authored by Vijay Aggarwal and Hichem Sellami

A typical data warehouse contains Star and/or Snowflake schemas, made up of Dimensions and Facts. The facts store various numerical information, including amounts, for example Order Amount, Invoice Amount, etc.

With the true global nature of business now-a-days, the end-users want to view the reports in their own currency or in global/common currency as defined by their business.

This presents a unique opportunity in BI to provide the amounts in converted rates either by pre-storing or by doing on-the-fly conversions while displaying the reports to the users.

Source Systems

OBIA caters to various source systems like EBS, PSFT, Siebel, JDE, Fusion, etc. Each source has its own unique and intricate way of defining and storing currency data, doing currency conversions, and presenting them to OLTP users.

For example, EBS stores conversion rates between currencies classified by rate type, such as Corporate rate, Spot rate, Period rate, etc. Siebel stores exchange rates by rate types like Daily. EBS/Fusion store the conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT has a Rate Multiplication Factor and a Rate Division Factor from which the rate has to be calculated, whereas other source systems store the currency exchange rate directly.

OBIA Design

Consolidating data from various disparate source systems poses the challenge of conforming the various currencies, rate types, exchange rates, etc., and designing the best way to present the amounts to the users without affecting performance.

When consolidating the data for reporting in OBIA, we have designed the mechanisms in the Common Dimension, to allow users to report based on their required currencies.

OBIA Facts store amounts in various currencies:

Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies.

Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.

Global Currencies: OBIA provides five Global Currencies. Three are used across all modules; the last two are for CRM only. A Global Currency is very useful when creating reports where the data is viewed enterprise-wide. For example, a US-based multinational would want to see reports in USD, so the company would choose USD as one of the global currencies. OBIA allows users to define up to five global currencies during the initial implementation.

The term Currency Preference is used to designate the set of values: Document Currency, Local Currency, Global Currency 1, Global Currency 2, Global Currency 3; which are shared among all modules. There are four more currency preferences, specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5 which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics.

When choosing Local Currency for Currency preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt. So it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section: Toggling Currency Preferences in the Dashboard.

Design Logic

When extracting the fact data, the OOTB mappings extract and load the document amount, and the local amount in target tables. It also loads the exchange rates required to convert the document amount into the corresponding global amounts.

If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the Local currency code, and the Local exchange rate. The Load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates.

The lookup of exchange rates is done via the Exchange Rate Dimension provided as a Common/Conforming Dimension in OBIA.

The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and Rate Type. Two physical tables W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G are used to provide the lookups and conversions between currencies. The data is loaded from the source system’s Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. On the other hand, W_GLOBAL_EXCH_RATE_G stores the currency conversions between the document currency and the pre-defined five Global Currencies for each day. Based on the requirements, the fact mappings can decide and use one or both tables to do the conversion.
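To give a feel for the kind of lookup involved, here is a simplified sketch of a document-to-target-currency conversion against W_EXCH_RATE_G. It is illustrative only: the fact_staging table and column names such as FROM_CURCY_CD, TO_CURCY_CD, RATE_TYPE, START_DT, END_DT, and EXCH_RATE are assumptions, and the OOTB mappings use the reusable lookups described later rather than a direct join.

-- Simplified illustration: convert a document amount using a date-range rate row
-- (table and column names are assumptions, not the actual OBIA mapping logic)
SELECT f.doc_amount,
       f.doc_amount * r.exch_rate AS converted_amount
FROM   fact_staging f
JOIN   W_EXCH_RATE_G r
  ON   r.from_curcy_cd = f.doc_curcy_cd
 AND   r.to_curcy_cd   = 'USD'
 AND   r.rate_type     = 'Corporate'
 AND   f.transaction_dt BETWEEN r.start_dt AND r.end_dt;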

Currency design in OBIA also taps into the MLS and Domain architecture, thus allowing the users to map the currencies to a universal Domain during the implementation time. This is especially important for companies deploying and using OBIA with multiple source adapters.

Some Gotchas to Look for

It is necessary to think through the currencies during the initial implementation.

1) Identify the various types of currencies used by your business. Understand what your Local (or Base) and Document currencies will be. Identify the various global currencies in which your users will want to see reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time.

2) If you have multiple source systems, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, a source Domain mapping is already in place.

Technical Section

This section will briefly mention the technical scenarios employed in the OBIA adaptors to extract data from each source system.

In OBIA, we have two main tables which store the Currency Rate information as explained in previous sections. W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G are the two tables.

W_EXCH_RATE_G stores all the Currency Conversions present in the source system. It captures data for a Date Range. W_GLOBAL_EXCH_RATE_G has Global Currency Conversions stored at a Daily level. However the challenge here is to store all the 5 Global Currency Exchange Rates in a single record for each From Currency. Let’s voyage further into the Source System Extraction logic for each of these tables and understand the flow briefly.

EBS: In EBS, currency data is stored in the GL_DAILY_RATES table. As the name indicates, this EBS table has data at a daily level. However, in our warehouse we store the data with a date range and insert a new range record only when the exchange rate changes for a particular From Currency, To Currency, and Rate Type. Below are the main logical steps we employ in this process.

  0. (Incremental flow only) Clean up the data in W_EXCH_RATE_G:
    1. Delete the records whose Start Date is greater than the minimum conversion date.
    2. Update the End Date of the existing records.
  1. Compress the daily data from the GL_DAILY_RATES table into range records (the incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter):
    1. Generate the Previous Rate, Previous Date, and Next Date for each daily record from the OLTP.
    2. Filter out the records whose Conversion Rate is the same as the Previous Rate, or whose Conversion Date lies within a single-day range.
    3. Mark the records as ‘Keep’ and ‘Filter’ and also get the final End Date for the single range record (unique combination of From Date, To Date, Rate, and Conversion Date).
  2. Filter out the records marked as ‘Filter’ in the INFA map.
  3. The above steps load W_EXCH_RATE_GS. Step 0 updates/deletes W_EXCH_RATE_G directly.
  4. The SIL map then inserts/updates the GS data into W_EXCH_RATE_G.

These steps convert the daily records in GL_DAILY_RATES to Range records in W_EXCH_RATE_G.
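To give a feel for the change-detection part of this compression (the first two sub-steps under step 1 above), a simplified sketch in plain SQL might look like the following. The GL_DAILY_RATES column names follow the standard EBS layout but should be treated as assumptions, and the real logic lives in the ODI mappings, not in a standalone query.

-- Keep only the daily rows where the rate actually changed from the previous day
SELECT from_currency, to_currency, conversion_type, conversion_date, conversion_rate
FROM (
  SELECT r.*,
         LAG(conversion_rate) OVER (PARTITION BY from_currency, to_currency, conversion_type
                                    ORDER BY conversion_date) AS prev_rate
  FROM gl_daily_rates r
)
WHERE prev_rate IS NULL OR conversion_rate <> prev_rate;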

We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G, since that table stores data at a daily granular level. However, we need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW. We use GROUP BY and CASE logic to achieve this, as sketched below.
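Conceptually, the pivot works along the lines of the following sketch. It is simplified, with assumed staging table and column names; the five configured global currencies are shown as bind variables.

-- Pivot one row per (from currency, date, global currency) into a single row
-- with five global exchange rate columns, using CASE inside aggregates.
SELECT from_curcy_cd,
       conversion_dt,
       MAX(CASE WHEN to_curcy_cd = :global_curcy1 THEN exch_rate END) AS global1_exch_rate,
       MAX(CASE WHEN to_curcy_cd = :global_curcy2 THEN exch_rate END) AS global2_exch_rate,
       MAX(CASE WHEN to_curcy_cd = :global_curcy3 THEN exch_rate END) AS global3_exch_rate,
       MAX(CASE WHEN to_curcy_cd = :global_curcy4 THEN exch_rate END) AS global4_exch_rate,
       MAX(CASE WHEN to_curcy_cd = :global_curcy5 THEN exch_rate END) AS global5_exch_rate
FROM   daily_rates_staging
GROUP BY from_curcy_cd, conversion_dt;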

Fusion: Fusion has extraction logic very similar to EBS. The only difference is that the cleanup logic mentioned in step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring in all the exchange rates in incremental runs as well and do the cleanup. The SIL then takes care of inserts/updates accordingly.

PeopleSoft: PeopleSoft does not have From Date and To Date explicitly in the source tables. Let’s look at an example. Please note that this is achieved from PS1 onwards only.

1 Jan 2010 – USD to INR – 45

31 Jan 2010 – USD to INR – 46

PSFT stores records in above fashion. This means that Exchange Rate of 45 for USD to INR is applicable for 1 Jan 2010 to 30 Jan 2010. We need to store data in this fashion in DW.

Also PSFT has Exchange Rate stored as RATE_MULT and RATE_DIV. We need to do a RATE_MULT/RATE_DIV to get the correct Exchange Rate.

We generate the From Date and To Date while extracting data from the source, and this has certain assumptions:

If a record gets updated or inserted in the source, it is extracted in the incremental run. If this updated or inserted record falls between other dates, we also extract the preceding and succeeding records (based on dates). This is required because we need to generate range records, and there are now three records whose ranges have changed. Taking the same example as above, if a new record is inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we still extract them because their ranges are affected.
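As an illustration of this range generation, a simplified sketch in SQL might look like the following. The staging table and column names are assumptions, the real extract is an ODI mapping rather than a single query, and the rate itself comes from RATE_MULT/RATE_DIV as described above.

-- Derive From/To date ranges from daily PSFT-style rate rows (names assumed).
-- The last range stays open (NULL To Date) until the next available date.
SELECT from_cur,
       to_cur,
       rate_type,
       rate_mult / rate_div AS exch_rate,
       effdt AS from_date,
       LEAD(effdt) OVER (PARTITION BY from_cur, to_cur, rate_type
                         ORDER BY effdt) - 1 AS to_date
FROM   psft_rate_staging;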

Similar logic is used for Global Exchange Rate Extraction. We create the Range records and get it into a Temporary table. Then we join to Day Dimension, create individual records and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type.

Siebel: Siebel facts depend heavily on Global Exchange Rates, and almost none of them use individual exchange rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from the PS1 release onwards.

As of January 2002, the Euro Triangulation method for converting between currencies belonging to EMU members is not needed for present and future currency exchanges. However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries. If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked.

Because of this, there are multiple extraction flows in SEBL, i.e., EUR to EMU, EUR to NonEMU, EUR to DMC, and so on. We load W_EXCH_RATE_G through multiple flows with this data. This has been kept the same as in previous versions of OBIA.

W_GLOBAL_EXCH_RATE_G, being a new table, does not have such needs. However, like PSFT, SEBL does not have From Date and To Date columns in the source tables, so we use the same extraction logic for SEBL as explained in the PSFT section.

What if all 5 configured Global Currencies are the same?

As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns, using CASE and GROUP BY logic. This approach poses a unique problem when all 5 chosen Global Currencies are the same. For example, if the user configures all 5 Global Currencies as ‘USD’, the extract logic will not be able to generate a record for From Currency = USD, because not all source systems have a USD->USD conversion record.

We have _Generated mappings to take care of this case. We generate a record with Conversion Rate=1 for such cases.
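Conceptually, the generated record looks like the result of the following sketch. It is illustrative only; the staging table and column names are assumptions, and the real logic lives in the _Generated ODI mappings.

-- Synthesize a self-conversion row (rate = 1) for each date, so the pivot can
-- produce a From Currency = USD record even when the source has no USD -> USD rate.
SELECT DISTINCT :global_curcy1 AS from_curcy_cd,
       :global_curcy1          AS to_curcy_cd,
       1                       AS exch_rate,
       conversion_dt
FROM   daily_rates_staging;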

Reusable Lookups

Before PS1, we had a Mapplet for currency conversions. In PS1, we only have reusable lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various fact mappings. Anyone who wants to do a lookup on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G must use these lookups. A direct join or lookup on the tables might return wrong data.

Changing Currency preferences in the Dashboard:

In the 7.9.6.x series, all amount metrics in OBIA showed the Global1 amount, and customers needed to change the metric definitions to show them in another currency preference. Project Analytics started supporting currency preferences with the 7.9.6 release, though, and published a Tech Note so customers of other modules could add toggling between currency preferences to the solution.

List of Currency Preferences

Starting with the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. The new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the XML file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder under <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml

This file contains the list of currency preferences, like “Local Currency”, “Global Currency 1”, etc., which customers can rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic. In Static mode, all users see the full list as in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the RPD, so the list of currency preferences is based on the user’s roles. BI Applications built a subject area, “Dynamic Currency Preference”, to run this query and give every user only the list of currency preferences required by his or her application roles.

Adding Currency to an Amount Field

When the user selects one of the items from the currency prompt, all the amounts on that page will show in the currency corresponding to that preference. For example, if the user selects “Global Currency 1” from the prompt, all data will show in Global Currency 1 as specified in the Configuration Manager. If the user selects “Local Currency”, all amount fields will show in the currency of the Business Unit selected in the BU filter of the same page. If no particular Business Unit is selected in that filter, and the data selected by the query contains amounts in more than one currency (for example, one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field), and depending on the setup (see the next paragraph), the user may receive an error.

There are two ways to add the Currency field to an amount metric:

  1. In the form of a currency code, like USD or EUR. For this, the user needs to add the field “Apps Common Currency Code” to the report. This field is in every subject area, usually under the “Currency Tag” or “Currency Code” table.
  2. In the form of a currency symbol ($ for USD, € for EUR, etc.). For this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option of the Column Actions drop-down list. Typically this column should be the “BI Common Currency Code” available in every subject area.
    1. Select the Column Properties option in the Edit list of a metric.
    2. In the Data Format tab, select Custom as “Treat Number As”.
    3. Enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column], where column is the “BI Common Currency Code” column defined to take the currency code value based on the currency preference chosen by the user in the currency preference prompt.