Easily Integrate Planning & Budgeting Cloud Service (PBCS) with Oracle Data Integrator (ODI)

Author: Thejas B Shetty (Oracle SSI)

  1. Safe Harbor Statement:

The following article represents the views of the author only & Oracle does not endorse it.

It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle’s products remains at the sole discretion of Oracle.

  2. Introduction:

In the On-Premise version of Hyperion Planning, it is possible to use the pre-packaged ODI Knowledge Modules (KMs) for Essbase to integrate data with the plan types. These knowledge modules use the Essbase Java API to connect and communicate directly with the underlying Essbase cubes (plan types), with additional options to execute calculation scripts, MaxL commands, etc.

In the Cloud version of Hyperion Planning, namely Planning and Budgeting Cloud Service (PBCS)/Enterprise Planning and Budgeting Cloud Service (EPBCS), customers do not have direct access to the underlying Essbase cubes and hence cannot communicate/integrate with them directly.

Oracle has no plans to provide native access to the underlying Essbase cubes of PBCS/EPBCS applications and hence recommends using alternative methods like EPM Automate / REST APIs to communicate with PBCS application components & perform various integration operations. These integration operations are pre-defined as jobs within the Simplified Interface, which are then invoked from either EPM Automate or direct REST API calls.
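For readers new to this interface, the sketch below illustrates the general pattern rather than any KM's actual code: a job pre-defined in the Simplified Interface is launched with a POST to the jobs endpoint and then polled until it finishes. It is plain Python using the requests library; the host, credentials, and job names are placeholders, and the endpoint version should be verified against the EPM REST API documentation for your pod.

    # Minimal sketch: launching a pre-defined PBCS job via the REST API and
    # polling it to completion. Host, credentials, job names are placeholders.
    import time
    import requests

    BASE = "https://mycompany.pbcs.us2.oraclecloud.com/HyperionPlanning/rest/v3"
    AUTH = ("mydomain.jdoe@example.com", "password")  # domain.username format

    def run_job(app, job_type, job_name, parameters=None):
        url = "%s/applications/%s/jobs" % (BASE, app)
        body = {"jobType": job_type, "jobName": job_name}
        if parameters:
            body["parameters"] = parameters
        resp = requests.post(url, json=body, auth=AUTH)
        resp.raise_for_status()
        job_id = resp.json()["jobId"]
        while True:  # status -1 = in progress; 0 = success; > 0 = failure
            status = requests.get("%s/%s" % (url, job_id), auth=AUTH).json()
            if status["status"] != -1:
                return status
            time.sleep(5)

    # e.g. run_job("PLANAPP", "CLEAR_CUBE", "ClearAll")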

Many customers who have used ODI extensively in the past to integrate with Hyperion Planning applications would still like to use ODI with PBCS the same way, without investing significant time/effort in building new interfaces using FDMEE/Data Management. It would also help customers migrating their On-Prem Planning applications to PBCS/EPBCS to move their historical data from the source Essbase cubes to the new PBCS/EPBCS application. Any transformation/re-mapping of data can be managed in the ODI layer instead of manipulating the data in either the source Planning application or the new PBCS/EPBCS application.

However, the lack of direct access to the underlying Essbase cube of a PBCS/EPBCS application makes it impossible to use the existing Essbase Knowledge Modules. The ODI developer must instead either utilize an additional command line utility called EPM Automate within ODI OS Commands/procedures, or learn a new technology, namely the REST API, & leverage it to perform the various jobs within PBCS/EPBCS. Neither approach fits in a single ODI Mapping; both require sequencing multiple components in an ODI package.

I have created ODI KMs for PBCS/EPBCS utilizing the PBCS REST APIs in ODI 12c. The KM options are similar to the On-Prem Essbase KM options & the KM code, written in Jython, invokes PBCS REST APIs to perform the equivalent jobs. Additional KM options specific to PBCS have been added to make the integration process simpler, yet very robust.

This way, any data in the source, regardless of the technology/platform, can be integrated with PBCS/EPBCS using the data stores on the ODI Mapping, by treating it as if it were an on-premise Essbase data store. For an ODI developer, this KM code decouples the underlying technology (REST APIs) from the standard knowledge/skills needed to load data into PBCS, letting the developer concentrate only on the core ODI components & PBCS functionalities. All other processes, like file generation, uploading the data file to the cloud, clearing data, etc., are done within the KM code by invoking REST APIs in the background. The required end-point URLs for the REST API calls are also hardcoded within the KM, so the ODI developer need not know them.
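To give a feel for what the KM hides, here is a hedged sketch of just the upload step, again in plain Python rather than the KM's Jython. The interop endpoint path and API version are taken from the public EPM REST API documentation but may differ by release, and large files would need the chunked-upload variant described there.

    # Sketch of the background file-upload step (inbox/outbox upload).
    # Endpoint path/version are assumptions; verify against the EPM REST docs.
    import requests

    BASE = "https://mycompany.pbcs.us2.oraclecloud.com"
    AUTH = ("mydomain.jdoe@example.com", "password")

    def upload_data_file(local_path, cloud_name):
        url = ("%s/interop/rest/11.1.2.3.600/applicationsnapshots/%s/contents"
               % (BASE, cloud_name))
        with open(local_path, "rb") as f:
            resp = requests.post(url, data=f, auth=AUTH,
                                 headers={"Content-Type": "application/octet-stream"})
        resp.raise_for_status()

    # e.g. upload_data_file("/tmp/odi_data_12345.txt", "odi_data_12345.txt")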

These KMs are presently not officially supported by Oracle and hence no SRs can be raised for them. However, by sharing them with a wider community, I encourage people to use them, modify them, and contribute their valuable input and feedback. You can find these KMs in the ODI Exchange within ODI Studio as well as on this website: link.

  3. Pre-Requisites to use the Knowledge Modules:

No pre-requisite .jar files are required to use the PBCS Knowledge Modules. All the Java libraries used within the KM code should be present in the ODI agent/studio by default after ODI installation.

However, a new ODI Technology named EPM Cloud Service needs to be added to the ODI repository before the KMs can be used. The ODI technology can be downloaded from here.

  4. Additional Setup before using the KMs

Make sure that the machine where the ODI Agent (or ODI Studio) is installed has the required network connectivity to the PBCS application URL, and that no firewall restrictions block access to it.
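A quick way to confirm both connectivity and credentials from that machine is a single REST call; a minimal sketch follows (URL and credentials are placeholders):

    # Connectivity smoke test from the ODI Agent/Studio machine.
    import requests

    url = ("https://mycompany.pbcs.us2.oraclecloud.com"
           "/HyperionPlanning/rest/v3/applications")
    resp = requests.get(url, auth=("mydomain.jdoe@example.com", "password"))
    print(resp.status_code)  # 200 means the pod is reachable and the login works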

  5. Setting up the Topology for PBCS/EPBCS Applications

Create a new Data Server under the EPM Cloud Service technology.

Choose a name for the Data Server.

Update the Topology as below for the Data Server by entering the PBCS Cloud URL in the ServiceURL (Data Server) field.

Please note that this Service URL should end with oraclecloud.com & must not be suffixed with anything else, including special characters and/or spaces.

*Oracle employees using the PBCS/EPBCS VM on their laptops can use the Service URL as http://192.168.56.101:9000 (Please modify the IP address as per your VM configuration)

Update the Username / Password used for connecting to PBCS/EPBCS application.

The Username should be of the format: domain.username for all PBCS/EPBCS GSE/Test/Production pods.

*Oracle employees using the PBCS/EPBCS VM on their laptop can use just the username, without the domain. Eg: epm_default_cloud_admin
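For the curious, these credentials travel as a standard HTTP Basic Authorization header on every REST call; a small sketch of how a client would build it (values are placeholders):

    # How domain.username credentials become an HTTP Basic auth header.
    import base64

    username = "mydomain.jdoe@example.com"  # just "epm_default_cloud_admin" on the VM
    password = "Welcome1"
    token = base64.b64encode(("%s:%s" % (username, password)).encode()).decode()
    headers = {"Authorization": "Basic " + token}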

Create a physical schema under this Data Server to represent the ApplicationType and ApplicationName.

PBCS should be used as Application Type for both PBCS/EPBCS type of applications. The KM has not been tested with FCCS/TRCS & hence may or may not work with FCCS/TRCS applications.

  6. Using the Reverse Engineering KM (RKM)

Use the RKM EPM Cloud Service-PBCS Knowledge Module to reverse engineer the PBCS/EPBCS Application Datastores into your ODI Models, pointing to an EPM Cloud Service type of logical schema.

A separate datastore is fetched for each of the BSO/ASO plan types within the PBCS/EPBCS application:

  • <AppName>_<CubeName>_Data: Used for loading/extracting data into/from PBCS/EPBCS.

The datastore will have each of the dimensions, excluding attribute dimensions, as ODI column attributes. The numeric column attribute for the data amount is added at the end.

  7. Using IKM SQL to EPM Cloud Service-PBCS (DATA)

Use any datastore that contains source data on the left-hand side of an ODI Mapping, and connect it to a PBCS/EPBCS datastore on the right-hand side. If the source datastore is an RDBMS technology (Oracle / MS SQL Server / Generic SQL), no staging area is required. To load data from other technologies, including text files, use an appropriate LKM (eg: LKM File to SQL) to extract data into a SQL staging area (eg: SUNOPSIS_MEMORY_ENGINE, an Oracle staging area, etc). The source datastore need not be in the same format as the PBCS/EPBCS one and can have different column names / column counts. The IKM applies the defined mappings before loading data into PBCS/EPBCS.

Map the target expressions from the source datastore.

The below image shows an example of how a text file can be integrated with PBCS/EPBCS using an Oracle Database as staging area.

Click on the PBCS/EPBCS datastore in Target_Group on the Physical tab and choose IKM SQL to EPM Cloud Service-PBCS (DATA) as the IKM.

The IKM Options are categorized into 3 groups:

  • General

  • Pre-Load Operations

  • Post-Load Operations

General:

  • DATA_LOAD_JOB_NAME

The name of the job of type IMPORT_DATA defined in the PBCS UI. This job should have Essbase as its Source Type & the same Cube name as the plan type of the target datastore. The Source File name can be any arbitrary name ending with a .txt extension; the actual file name used for the data import is passed dynamically from ODI, using the ODI session number as a suffix, & overrides any static filename defined in the job (see the sketch after this list).

  • LOG_ENABLED

Flag to indicate whether logging should be performed during the load process. If set to Yes, logging is written to the file specified by the LOG_FILENAME option.

  • LOG_FILENAME

The fully qualified name of the file into which logging is to be done. Do not use backslashes (\) in the filename.
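To make the dynamic file name concrete, here is a hedged sketch of how the IMPORT_DATA call could be issued; the application and job names are placeholders, and the importFileName parameter key should be verified against the EPM REST API documentation for your release.

    # Sketch: submitting the IMPORT_DATA job with the session-suffixed file name.
    import requests

    BASE = "https://mycompany.pbcs.us2.oraclecloud.com/HyperionPlanning/rest/v3"
    AUTH = ("mydomain.jdoe@example.com", "password")

    def run_import(app, job_name, session_no):
        cloud_file = "odi_data_%s.txt" % session_no  # overrides the static name
        body = {"jobType": "IMPORT_DATA",
                "jobName": job_name,                 # DATA_LOAD_JOB_NAME option
                "parameters": {"importFileName": cloud_file}}
        resp = requests.post("%s/applications/%s/jobs" % (BASE, app),
                             json=body, auth=AUTH)
        resp.raise_for_status()
        return resp.json()["jobId"]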

Pre-Load Operations:

  • PRE_LOAD_CLEAR_CUBE

Boolean flag to indicate whether to run a CLEAR_CUBE job on the cloud before data is loaded. This requires a job of type CLEAR_CUBE to be set up one time in the PBCS UI. The name of the CLEAR_CUBE job must be referenced in the CLEAR_CUBE_JOB_NAME option. The definition of the clear-cube operation has to be defined in the PBCS UI using the various options as shown below & cannot be controlled from the ODI option.

  • CLEAR_CUBE_JOB_NAME

Name of the CLEAR_CUBE job defined in the PBCS UI.

  • PRE_LOAD_RUN_RULE

Determines whether to run a business rule before the data load. This is useful if you want to selectively clear an intersection of the cube by passing run-time parameters; the CLEAR_CUBE job does not let you choose the selective slice of the cube to be cleared.

  • PRE_LOAD_RULE_NAME

Name of the pre-load business rule as defined/deployed from Calc Manager.

  • PRE_LOAD_RULE_PARAMS

Run-time parameters for the pre-load business rule.

Format: varname:value

In case of multiple parameters, separate each parameter with a comma.

eg: var1:value1,var2:value2 (see the sketch after this list)
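As an illustration of how the varname:value string could be turned into a rule launch, a hedged sketch follows; the RULES job type is the documented way to run a business rule over REST, while the application and rule names here are placeholders. The post-load options described next work the same way.

    # Sketch: parsing PRE_LOAD_RULE_PARAMS ("var1:value1,var2:value2") into
    # the parameters payload of a RULES job. Names are placeholders.
    import requests

    BASE = "https://mycompany.pbcs.us2.oraclecloud.com/HyperionPlanning/rest/v3"
    AUTH = ("mydomain.jdoe@example.com", "password")

    def run_rule(app, rule_name, params_str=""):
        params = dict(p.split(":", 1) for p in params_str.split(",")) if params_str else {}
        body = {"jobType": "RULES", "jobName": rule_name, "parameters": params}
        resp = requests.post("%s/applications/%s/jobs" % (BASE, app),
                             json=body, auth=AUTH)
        resp.raise_for_status()
        return resp.json()["jobId"]

    # e.g. run_rule("PLANAPP", "ClearRegion", "Scenario:Actual,Year:FY18")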

Post-Load Operations:

  • POST_LOAD_RUN_RULE

Determines whether to run a business rule after the data load. This is useful if one intends to aggregate or copy data after it is loaded, by passing run-time parameters.

  • POST_LOAD_RULE_NAME

Name of the post-load business rule as defined/deployed from Calc Manager.

  • POST_LOAD_RULE_PARAMS

Run-time parameters for the post-load business rule, in the same format as the pre-load parameters.

Format: varname:value

In case of multiple parameters, separate each parameter with a comma.

eg: var1:value1,var2:value2

Sample Process Logging:

Conclusion:

This concludes the initial setup and will get you up and running with the ODI 12c Knowledge Module for PBCS/EPBCS to load data. In the next blog, I will include more Knowledge Modules with features to extract data.

For any troubleshooting/queries/suggestions, do not hesitate to contact me: thejas.b.shetty@oracle.com

 


Comments (5)
  • David Hecksel Wednesday, May 16, 2018
    Well done !
  • Sumit Matoo Wednesday, June 13, 2018
    When I try to Import the Technology "EPM Cloud Service" to our ODI_11.1.1.9.0_GENERIC_150427.0937, it gives the below Import error-

    ***********************************
    cvc-complex-type.2.4.a invalid content was found starting with element 'Encryption'. One of '{SmartExportList, Object}' is expected
    ***********************************
  • Julien Wednesday, June 13, 2018
    Hi Sumit,

    The technology requires ODI 12.2.1.x, you will need to upgrade to be able to use it.

    Thanks,
    Julien
  • Mark Scott Wednesday, August 29, 2018
    Hello Julien - thanks for sharing this. I am probably misunderstanding part of this - so I have a simple question: Can someone use the REST API to get the data OUT of Hyperion PBCS and back on-prem?
  • Julien Friday, September 7, 2018
    Hi Mark,

    I am not a PBCS expert but it does provide some REST API support. I'd recommend to reach out to PBCS support team or a PBCS expert for more info. The following link may help: https://docs.oracle.com/cd/E60665_01/epm-common/INTGR/int_cloud_plan_budget_rest_api_client.htm#INTGR-pbcs_rest_api_book_31

    Thanks,
    Julien