
A Simple Guide to Hyperion Financial Management (HFM) Knowledge Modules for Oracle Data Integrator 12c

Author: Thejas B Shetty (Oracle SSI)


Before the Hyperion Financial Management (HFM) PS4 release, it was possible to use the pre-packaged Oracle Data Integrator (ODI) Knowledge Modules (KMs) to integrate data and metadata with HFM. These knowledge modules used the HFM drivers (HFMDriver.dll) and Visual Basic (VB) APIs to connect and communicate with HFM.

The HFM APIs in this version were completely rewritten in Java. The old VB APIs and HFMDriver.dll are obsolete and hence can no longer be used to communicate with HFM.

Oracle has not released compatible HFM Knowledge Modules for this version using the latest Java API, and now recommends alternative methods to integrate with HFM (tools such as FDMEE).

Many customers who have extensively used ODI in the past to integrate with HFM would still want to use ODI with HFM.

Hence, I have recreated the ODI KMs for this version using the HFM Java API in ODI 12c. These KMs are presently not officially supported by Oracle, and hence no SRs can be raised for them. However, by sharing them with the wider community, I encourage people to use them, modify them, and contribute their valuable inputs and feedback. You can find the KMs in the ODI Exchange within ODI Studio, as well as on this website: link.

These KMs have almost the same functionality and options as the previous KMs; however, new capabilities have been added to make the integration process simpler yet robust.

Pre-Requisites to use the Knowledge Modules

Presently, the KM code is written such that it works only if the ODI Agent (or the Local Agent/No Agent of ODI Studio) runs on the same physical machine where HFM is installed and configured.

It may or may not work if the ODI agent is located on a different physical machine; this has not been tested as of today.

The KMs will work if HFM is installed on Exalytics; however, the ODI agent must also be installed on the same Exalytics host.

In cases where HFM is clustered (load balanced) and the ODI agent is installed on one of the nodes of the HFM cluster, the KMs are not guaranteed to work; this has not been tested as of today.

Adding Drivers to the ODI Agent

Standalone Agent / Standalone Collocated Agent:

  1. Set the environment variable ODI_ADDITIONAL_CLASSPATH to locate the additional jars before starting the agent. The 3 jar files listed below are required for the HFM Knowledge Modules to work.
  2. Alternatively, do one of the following:
    • Copy the 3 additional libraries to the DOMAIN_HOME/lib directory and restart the agent. The ODI standalone and standalone collocated agents will automatically add these libraries to the agent’s classpath.
    • Edit the DOMAIN_HOME/config/fmwconfig/components/ODI/<Agent Name>/bin/instance.sh (or instance.cmd) script to add the libraries to the ODI_POST_CLASSPATH variable.

The 3 additional libraries required are listed below. These files are found under the common/jlib subdirectory of EPM_MIDDLEWARE_HOME.

  • D:\Oracle\Middleware\EPMSystem11R1\common\jlib\\epm_j2se.jar
  • D:\Oracle\Middleware\EPMSystem11R1\common\jlib\\epm_thrift.jar
  • D:\Oracle\Middleware\EPMSystem11R1\common\jlib\\epm_hfm_server.jar
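As a sketch, the ODI_ADDITIONAL_CLASSPATH approach for a standalone agent on Linux might look like the following. The EPM_MW_HOME value and the exact jlib location are assumptions; adjust them to your own installation.

```shell
# Sketch: expose the 3 HFM jars to a standalone ODI agent via
# ODI_ADDITIONAL_CLASSPATH before invoking the agent startup script.
# EPM_MW_HOME and the jlib path are illustrative, not canonical.
EPM_MW_HOME=/u01/Oracle/Middleware/EPMSystem11R1
JLIB="$EPM_MW_HOME/common/jlib"
ODI_ADDITIONAL_CLASSPATH="$JLIB/epm_j2se.jar:$JLIB/epm_thrift.jar:$JLIB/epm_hfm_server.jar"
export ODI_ADDITIONAL_CLASSPATH
echo "$ODI_ADDITIONAL_CLASSPATH"
```

Set this in the shell (or startup script) that launches the agent, so the variable is inherited by the agent process.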

ODI Studio (Local Agent / No Agent):

On Windows operating systems, place the jar files in the ODI Studio userlib folder.
Alternatively, instead of copying the 3 jar files into the userlib folder, edit the additional_path.txt file located inside the userlib folder and include the full path of each of the 3 jar files.

On Linux/Unix operating systems, place the jar files in the ODI Studio userlib folder.
Alternatively, edit the additional_path.txt file located inside the userlib folder and include the full path of each of the 3 jar files, as on Windows.
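The additional_path.txt approach can be sketched as below. The userlib location varies by ODI version and platform, so USERLIB here is a stand-in path, and the EPM home is illustrative.

```shell
# Sketch: list the 3 HFM jars in ODI Studio's additional_path.txt
# instead of copying them into userlib.
# USERLIB and JLIB are stand-in paths; substitute your real locations.
USERLIB="${USERLIB:-/tmp/odi_userlib_demo}"
JLIB=/u01/Oracle/Middleware/EPMSystem11R1/common/jlib
mkdir -p "$USERLIB"
for jar in epm_j2se.jar epm_thrift.jar epm_hfm_server.jar; do
  echo "$JLIB/$jar"
done > "$USERLIB/additional_path.txt"
cat "$USERLIB/additional_path.txt"
```

One absolute jar path per line is the expected shape of the file; ODI Studio reads it on startup, which is why a restart is required afterwards.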

Close and re-open ODI Studio.

Additional Setup before using the KMs

Copy the reg.properties file

On the server where HFM (and the ODI Agent) is installed, copy the reg.properties file from the $ORACLE_MIDDLEWARE/user_projects/config/foundation/ folder to the $ORACLE_MIDDLEWARE/user_projects/epmsystemX/config/foundation/ folder.

Modify epmsystemX as per your environment in the above path.
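The copy step amounts to something like the sketch below. All paths are illustrative (MW and epmsystem1 are stand-ins), and a placeholder file is created here only so the sketch runs end-to-end; in a real environment the reg.properties file already exists.

```shell
# Sketch: copy reg.properties from the shared foundation folder into
# the EPM instance's foundation folder.
# MW and the epmsystem1 instance name are stand-ins for your environment.
MW="${MW:-/tmp/mw_demo}"
SRC="$MW/user_projects/config/foundation"
DST="$MW/user_projects/epmsystem1/config/foundation"
mkdir -p "$SRC" "$DST"
: > "$SRC/reg.properties"    # placeholder; use the real reg.properties
cp "$SRC/reg.properties" "$DST/"
ls "$DST"
```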

Setting up the Topology for HFM Integration

Under Topology -> Technology, double-click Hyperion Financial Management.

Update the Naming Rules as shown in the screenshot below to represent EPMOracleInstance.

Update the Topology as below for the HFM Data Server by entering the HFM cluster name in the Cluster (Data Server) field.

Update the Shared Services username/password used for connecting to the HFM application.

Update the physical schema for the HFM Data Server to represent the Application and EPMOracleInstance of the HFM application/server.

Using the Reverse Engineering KM (RKM)

Use the RKM Hyperion Financial Management PS4 Knowledge Module to reverse engineer the HFM Datastores into the Models.

By default, 2 Datastores are fetched from HFM:

  • EnumMemberList is used for extracting members (without properties) from HFM.
  • HFMData is used for loading/extracting data into HFM.

I will post an updated RKM later to include more Datastores to integrate multi-period data, metadata (with properties), Journals etc.

Using LKM Hyperion Financial Management PS4 Members To SQL

Use the EnumMemberList Datastore as a source and connect it to an RDBMS Datastore on the right-hand side. To extract members into a file, first extract them into an RDBMS staging table, and then use IKM SQL to File to transfer the contents into a text file.

Click on the EnumMemberList_AP Access Point on the Physical tab and choose the LKM.

On the LKM options, choose:

  • Dimension Name: name of the dimension whose members are to be extracted.
  • Member List Name: name of the member list, e.g. [Base], [Hierarchy], [Custom List].
  • Top Member: specifies the top member; members that are ancestors of this specified member are not extracted. (Optional)

Using IKM SQL to Hyperion Financial Management PS4 Data

Use any RDBMS Datastore that contains data as a source and connect it to the HFMData Datastore on the right-hand side. To load data from a file, first transfer the file contents into an RDBMS staging table using LKM File to SQL. The source Datastore need not be in the same format as HFM and can have different column names and column counts; the IKM will apply the mappings defined before loading into HFM.

Click on the HFMData Datastore in the Target_Group on the Physical tab and choose the IKM.

On the IKM options, choose:

  • Import Mode: Specifies one of the following load options:
    • DATALOAD_MERGE - to overwrite data in the application
    • DATALOAD_REPLACE - to replace data in the application
    • DATALOAD_ACCUMULATE - to add data to existing data in the application
  • Accumulate within File: Flag to indicate whether to accumulate values in the load data. If set to Yes, it indicates that multiple values for the same cells in the load data must be added before loading them into the application.
  • File Contains Share Data: Flag to indicate whether the data contains ownership data. If set to Yes, it indicates that the load data contains ownership data, such as shares owned.
  • Consolidate After Load: Flag to indicate whether consolidate should be performed after the data load. If set to Yes, it indicates that consolidate will be performed after the data load with the parameters provided by option CONSOLIDATE_PARAMETERS.
  • Consolidate Parameters: Specifies the parameters for the consolidate operation as comma-separated values in the order Scenario, Year, Period, Parent.Entity, and Type (represented by a letter as given below). Consolidation types and the letters that represent them are listed below:
    •  I = Consolidate
    •  D = Consolidate All with Data
    •  A = Consolidate All
    •  C = Calculate Contribution
    •  F = Force Calculate Contribution
    • e.g. Actual,1999,2,EastRegion.EastSales,A
  • Log Enabled: Flag to indicate if logging should be performed during the load process. If set to Yes, logging would be done to the file specified by the LOG_FILE_NAME option.
  • Log File Name: The fully qualified name of the file into which logging is to be done.
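To illustrate how a Consolidate Parameters value decomposes, the sketch below splits the example string from the option description into its five fields. This only demonstrates the documented comma-separated format; it is not part of the KM itself.

```shell
# Sketch: split a CONSOLIDATE_PARAMETERS value into its five fields:
# Scenario, Year, Period, Parent.Entity, and consolidation Type.
PARAMS="Actual,1999,2,EastRegion.EastSales,A"
IFS=, read -r SCENARIO YEAR PERIOD PARENT_ENTITY TYPE <<EOF
$PARAMS
EOF
echo "Scenario=$SCENARIO Year=$YEAR Period=$PERIOD Parent.Entity=$PARENT_ENTITY Type=$TYPE"
```

Here Type A means Consolidate All, so this example consolidates all data for Actual, year 1999, period 2, for the EastSales entity under EastRegion.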


That concludes the first part and should get you up and running with the ODI Knowledge Modules for HFM. In the next update, I will include more Knowledge Modules with features to load/extract Metadata and Journals.

For any troubleshooting /queries/suggestions, do not hesitate to contact me: thejas.b.shetty@oracle.com
