Author: Thejas B Shetty (Oracle SSI)
Until Hyperion Financial Management (HFM) version 11.1.2.3.x, it was possible to use the pre-packaged Oracle Data Integrator (ODI) Knowledge Modules (KMs) to integrate data and metadata with HFM. These knowledge modules used the HFM drivers (HFMDriver.dll) and Visual Basic (VB) APIs to connect and communicate with HFM.
The HFM APIs were completely rewritten in Java in version 11.1.2.4. The old VB APIs and HFMDriver.dll are obsolete and therefore cannot be used to communicate with HFM 11.1.2.4.
Oracle has not released compatible HFM Knowledge Modules for version 11.1.2.4 using the new Java API, and now recommends alternative methods (tools like FDMEE, etc.) to integrate with HFM.
Many customers who have extensively used ODI in the past to integrate with HFM would still want to use ODI with HFM 11.1.2.4.
Hence, I have recreated the ODI KMs for version 11.1.2.4 using the HFM Java API in ODI 12c. These KMs are presently not officially supported by Oracle, and hence no SRs can be raised for them. However, I am sharing them with the wider community, and I encourage people to use them, modify them, and contribute their valuable inputs and feedback. You can find the KMs in the ODI Exchange within ODI Studio as well as on this website: link.
These KMs have almost the same functionality and options as the previous KMs. However, new capabilities have been added to make the integration process simpler yet more robust.
Presently, the KM code is written in such a way that it will work only if the ODI Agent (or the Local Agent/No Agent of ODI Studio) is physically on the same machine where HFM is installed and configured.
It may or may not work if the ODI agent is located on a different physical machine. The same has not been tested as of today.
The KMs will work if HFM is installed on Exalytics. However, the ODI agent should also be installed on the same Exalytics host.
In cases where HFM is clustered (load balanced) and ODI agent is installed on one of the nodes of the HFM cluster, it is not guaranteed that the KMs will work. This has not been tested as of today.
The 3 additional libraries required are listed below. These files are found under a subdirectory of EPM_MIDDLEWARE_HOME.
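Because the exact subdirectory varies between releases, a small helper can confirm where the jars actually sit before you copy them. This is an illustrative sketch, not part of the KMs; `find_required_jars` is a hypothetical helper, and the jar names you pass in should be the three libraries listed above.

```python
from pathlib import Path

def find_required_jars(epm_middleware_home, jar_names):
    """Search EPM_MIDDLEWARE_HOME recursively and return a dict mapping
    each requested jar name to the first matching path found.
    Jars that are not found are simply absent from the result."""
    root = Path(epm_middleware_home)
    found = {}
    for name in jar_names:
        matches = sorted(root.rglob(name))  # recursive search for the jar
        if matches:
            found[name] = matches[0]
    return found
```

Running it once against your middleware home and checking that all three jars appear in the result avoids a ClassNotFoundException later when ODI Studio loads the KMs.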
On Windows operating systems, place the jar files in:
Alternatively, instead of copying the 3 jar files into the userlib folder, edit the additional_path.txt file located inside the userlib folder and include the paths of the 3 jar files as shown below:
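As an illustration, the additional_path.txt entries take one full jar path per line. The jar names and the middleware home below are placeholders; substitute the three libraries listed above and your own EPM_MIDDLEWARE_HOME:

```
C:\Oracle\Middleware\EPMSystem11R1\common\jlib\11.1.2.0\example1.jar
C:\Oracle\Middleware\EPMSystem11R1\common\jlib\11.1.2.0\example2.jar
C:\Oracle\Middleware\EPMSystem11R1\common\jlib\11.1.2.0\example3.jar
```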
On Linux/Unix operating systems, place the jar files in:
Alternatively, instead of copying the 3 jar files into the userlib folder, edit the additional_path.txt file located inside the userlib folder and include the paths of the 3 jar files as shown above.
Close and re-open ODI Studio.
Copy the reg.properties file
On the server where HFM (and the ODI Agent) is installed, copy the $ORACLE_MIDDLEWARE/user_projects/config/foundation/11.1.2.0/reg.properties file to the $ORACLE_MIDDLEWARE/user_projects/epmsystemX/config/foundation/11.1.2.0 folder.
Modify epmsystemX in the above path as per your environment.
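The copy step above can be sketched as a small script. This is a minimal illustration assuming the standard EPM directory layout; `copy_reg_properties` is a hypothetical helper, and the "11.1.2.0" path segment may differ in your environment.

```python
import shutil
from pathlib import Path

def copy_reg_properties(oracle_middleware, epm_instance):
    """Copy reg.properties from the shared foundation config folder into
    the given EPM instance's foundation folder, creating it if needed.
    Returns the destination path."""
    base = Path(oracle_middleware) / "user_projects"
    src = base / "config" / "foundation" / "11.1.2.0" / "reg.properties"
    dest_dir = base / epm_instance / "config" / "foundation" / "11.1.2.0"
    dest_dir.mkdir(parents=True, exist_ok=True)  # create the target folder if absent
    return shutil.copy2(src, dest_dir / "reg.properties")
```

For example, `copy_reg_properties("/u01/Oracle/Middleware", "epmsystem1")` would perform the copy for an instance named epmsystem1; adjust both arguments for your installation.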
Under Topology -> Technology, double-click Hyperion Financial Management.
Update the Naming Rules as shown in the screenshot below to represent the EPMOracleInstance.
Update the Topology for the HFM Data Server as shown below by entering the HFM cluster name in the Cluster (Data Server) field.
Update the Shared Services username/password used for connecting to the HFM application.
Update the physical schema for the HFM Data Server to represent the Application and EPMOracleInstance of the HFM application/server.
Use the RKM Hyperion Financial Management PS4 Knowledge Module to reverse engineer the HFM Datastores into the Models.
By default, 2 Datastores are fetched from HFM: HFMData and EnumMemberList.
I will post an updated RKM later to include more Datastores to integrate multi-period data, metadata (with properties), Journals, etc.
Use the EnumMemberList Datastore as a source and connect it to an RDBMS Datastore on the right-hand side. To extract members into a file, first extract them into an RDBMS staging table and then use IKM SQL to File to transfer the contents into a text file.
Click the EnumMemberList_AP Access Point on the Physical tab and choose the LKM.
On the LKM options, choose:
Use any RDBMS Datastore that contains data as a source and connect it to an HFMData Datastore on the right-hand side. To load data from a file, first load the file contents into an RDBMS staging table using LKM File to SQL. The source datastore need not be in the same format as HFM and can have different column names and column counts; the IKM will apply the mappings defined before loading into HFM.
Click HFMData in the Target_Group on the Physical tab and choose the IKM.
On the IKM options, choose:
That concludes the first part and should get you up and running with the ODI Knowledge Modules for HFM 11.1.2.4. In the next update, I will include more Knowledge Modules with features to load/extract Metadata/Journals. For any troubleshooting, queries, or suggestions, do not hesitate to contact me: firstname.lastname@example.org