
An Oracle blog about HCM Cloud

Recent Posts

HCM Data Loader (HDL)

HCM Data Loader (HDL) - What is the Scope Parameter?

Overview: Setting the scope for HDL is one of the steps in preparing your Oracle Fusion HCM environment for data conversion or integration using HDL. In this article we discuss the available options and evaluate the pros and cons of each.

Where to specify the HCM Data Loader scope? You specify the scope parameter on the Functional Setup Manager task – Configure HCM Data Loader.

Available Options:

Limited mode: Only business objects not supported by HCM File-Based Loader (FBL) can be loaded using HCM Data Loader. Limited mode was designed for early adopters of HDL in R9; it allowed customers to test HDL for newly supported objects without disrupting their use of FBL for existing objects. For example, you can continue to use FBL for existing integrations and adopt HDL for newly supported objects such as Document of Records, Security Objects, Area of Responsibility, Role Mapping, etc.

Full mode: HCM Data Loader is used for bulk loading data into all supported business objects. HCM File-Based Loader and the FBL-based HCM Spreadsheet Data Loader are disabled. With R10 and the general availability of HDL, Full mode is the default for all new customers. Existing customers who upgrade to R10 will not be forced to switch to Full mode; however, they must start planning for this move.

Switching from Limited to Full is a one-time switch, and customers and their partners must convert their data files to work with HDL before they make the switch to Full mode. Important: there is NO switching back and forth between these modes – once you switch to Full mode, there is no going back to FBL. The user interface will not allow you to switch back to Limited mode once the scope has been set to Full. Full mode also disables FBL and the FBL-based spreadsheet loaders. (The Payroll Batch Loader, Compensation spreadsheets, and Benefits Enrollment spreadsheets continue to work as is.)

HCM Spreadsheet Loaders: You can access the HCM spreadsheet loaders via the Data Exchange UI.
To avoid confusion, the following image clearly identifies FBL-based spreadsheet loaders vs. HDL-based spreadsheet loaders.

FBL Spreadsheet Loaders: FBL-based spreadsheet loaders will no longer work once you use HDL Full mode. As you can see, this does not include specialty loaders such as the Payroll Batch Loader, Compensation spreadsheet loaders, or Benefits Enrollment Upload, as those spreadsheet loaders are not dependent on the FBL engine.

HDL Spreadsheet Loaders (HSDL): Once you switch to HDL Full mode, you should plan on using the HDL-based spreadsheet loaders.

Conclusion: Whether you are implementing Oracle HCM Cloud (Fusion HCM) for the very first time or simply switching from FBL to HDL, you must understand the implications of the HDL scope parameter before you get started. Hope this article helps you plan your implementations proactively. Good luck.


Living in the Cloud

HCM Common Feature Release 13 Using the Transaction Console

The Release 13 Transaction Console comprises three work areas:
1. Transaction Summary
2. Analytics
3. Approval Rules

Instead of using the BPM Worklist, administrators can use the Transaction Console to easily configure, monitor, and troubleshoot HCM approval processes. For example, you can:
• Check the latest status of approval transactions and take necessary actions, such as Withdraw (Pending, Failed), Recover (Failed), Reassign (Pending, Failed), and Send Back to Initiator (Failed).
• Search existing transactions based on user-defined criteria, and repeat queries by creating saved searches.
• Access the simple and straightforward process details page.
• Use the relevant analytics dashboards for tracking and monitoring processes.
• Manage approval rules, including Alert Initiator on Error.

Note: BPM transactions migrated from R12 will get a Preupgrade status:
• Only pending transactions have a Withdraw option.
• Auto recovery is not applicable to those failed transactions.

Click Doc ID 2430452.1 to download the white paper. It covers everything that a system administrator should know. I would recommend this copy to anyone who wants to learn the basics or try something new – a Modern HR leading practice.

Reference: LCM Archive and Purge Processes for Release 12 Oracle HCM Cloud


Extensibility

Contextual Analysis in Talent Management

The use of contextual information that takes into account the holistic view of a worker is essential to foster a more engaged workforce and management team. Users expect the underlying data to give insight into the individual’s contributions and the overall impact on the company’s bottom line. Application context passing to embedded reports is a unified approach for better information sharing within the Fusion Applications, promoting equality of opportunity in the workplace. Contextual analysis provides immediately accessible, relevant, contextualized reporting that can transform performance reviews. Organizations need to enable greater connectivity between real-time reporting and other corporate information. Use of the simplest and most cost-effective analytics leads to solid career conversations that drive top performance.

Questions employees care about most:
• How am I doing?
• How do I fit into the bigger picture? What is next for me? An opportunity to reaffirm strengths and align individual goals with purpose and direction.
• What and how should I develop to be more successful?

Key result areas managers are most concerned with:
• Is my team engaged?
• Are we working towards vital priorities?
• Am I bringing out the best in my direct reports? The ability to give feedback in the moment and surmount the limits to growth.

Problems organizational leaders face:
• The gender imperative.
• How work gets done and how performance is measured and rewarded.
• Adaptive challenges in talent development as well as culture change.
This white paper (Doc ID 2378216.1) is a guide to leading practices, with sample use cases:
• Individual Potential to Improve and Grow
• Long-term Goals
• Rating History
• Sales Performance Goals
• Pay for Performance Metric
• Personal Qualities and Attributes


UCM

Oracle HCM Cloud- Introduction to UCM (WebCenter Content Server)

There is a lot of information available about UCM and content management in general, but in this article I will explain its meaning and usage specifically for Oracle HCM Cloud. Oracle UCM is Oracle’s Universal Content Management application, now referred to as Oracle WebCenter Content Server. To keep it simple, let’s just call it UCM – the application that Oracle HCM Cloud uses as its content management system. UCM is the preferred method of file transfer for inbound and outbound integrations, i.e. data loading as well as extracts. UCM provides improved and secure file management capabilities and is an integral part of HCM Cloud.

Accessing UCM
Since UCM is an integral part of Oracle Cloud applications, it is available (at no extra cost) as part of HCM Cloud. There are two ways to access the UI:

File Import/Export UI: Go to Navigator - Tools - File Import and Export to search, download, or upload files from UCM.

Traditional UCM UI: You can also log in or navigate directly to the UCM application – simply append /cs to the application URL, e.g. https://hxyz-test.fs.us2.oraclecloud.com/cs or https://fs-aufsn4x0POD.oracleoutsourcing.com/cs – and it should display the UCM login form. You can then sign in and locate documents using the Search or Browse functionality.

Web Services/Programmatic Access
You may also use web services to automate file transfer to and from UCM. UCM supports both RIDC and the Generic SOAP Port; we recommend the Generic SOAP Port service for UCM automation. The web service URLs look similar to:

Generic SOAP Port (recommended): https://xyz-test.fs.us.oraclecloud.com/idcws/ or https://xyz-test.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl
RIDC (limitations with SSO): https://xyz-test.fs.us2.oraclecloud.com/cs/idcplg

We will discuss technical details, sample programs, etc. in a separate article.
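Before building automation against the Generic SOAP Port, it can help to verify the endpoint URL derived from your pod host. A minimal shell sketch, assuming the placeholder host name from the example URLs above (substitute your own environment's host):

```shell
#!/bin/sh
# The pod host below is a placeholder taken from the article's examples.
POD_HOST="xyz-test.fs.us2.oraclecloud.com"

# The Generic SOAP Port WSDL lives at /idcws/GenericSoapPort?wsdl on the pod.
WSDL_URL="https://${POD_HOST}/idcws/GenericSoapPort?wsdl"
echo "$WSDL_URL"

# An authenticated reachability check could then look like (not executed here):
# curl -u "HCM.USER:<password>" "$WSDL_URL"
```

Fetching the WSDL in a browser or via curl is a quick way to confirm the service is reachable before writing any client code.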
Options for Automation
There may be multiple ways to automate UCM file transfer, but in general I have seen the following two methods:

Option 1: Command line – Use the WebCenter Content Document Transfer Utility from command-line, shell, or batch scripts to automate the file transfer. The utility is available from the Oracle OTN web page; see this article for installation instructions. Command-line samples are discussed in this article.

Option 2: Java programs – Use the WebCenter Content Document Transfer Utility and invoke it from Java, or use the API wrapper described in this article as a starting point to simplify UCM automation.
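Option 1 typically means composing the utility invocation inside a script. A minimal sketch that only builds the UploadTool command string (the jps-config.xml path, host, and file names are placeholders; actually running the command requires the utility jars on the CLASSPATH, as covered in the installation article):

```shell
#!/bin/sh
# Compose (but do not execute) an UploadTool invocation from parameters.
build_upload_cmd() {
  url="$1"; user="$2"; file="$3"; title="$4"
  cmd="java -Doracle.security.jps.config=D:/UCM/config/jps-config.xml"
  cmd="$cmd oracle.ucm.idcws.client.UploadTool --url=${url} --username=${user}"
  cmd="$cmd --primaryFile=${file} --dDocTitle=${title}"
  cmd="$cmd --dSecurityGroup=FAFusionImportExport --dDocAccount=hcm/dataloader/import"
  echo "$cmd"
}

build_upload_cmd "https://xyz-test.fs.us.oraclecloud.com/idcws/" HCM.USER worker.zip WorkerLoad
```

Wrapping the invocation in a function like this keeps credentials and environment-specific paths in one place when the same upload runs against multiple pods.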


Invoking Document Transfer Utility Using Java Program

The WebCenter Content Document Transfer Utility is a set of command-line tools written in Java providing content import and export capabilities. The UploadTool creates a new content item in Oracle WebCenter Content based on contents streamed from a local file. The SearchTool locates content items within Oracle WebCenter Content matching specific query criteria. The DownloadTool retrieves a content item from Oracle WebCenter Content and saves its contents to a local file.

SOAP-Based Transfer Utility
The preferred generic SOAP-based transfer utility (oracle.ucm.fa_genericclient_11.1.1.jar) requires the Oracle JRF Web Service supporting libraries and uses JAX-WS over HTTPS to communicate with the Oracle WebCenter Content Server. It accesses the Content Server through its GenericSoapPort web service (/idcws/GenericSoapPort) and requires the client to specify a suitable UsernameToken-based Oracle WSM security client policy that matches the server's configured service policy.

Services available in the SOAP-based utility:

Java Library File Name | Service Type | IDC Service Name | Class
oracle.ucm.fa_genericclient_11.1.1.jar | Upload | CHECKIN_UNIVERSAL | oracle.ucm.idcws.client.UploadTool
oracle.ucm.fa_genericclient_11.1.1.jar | Search | GET_SEARCH_RESULTS | oracle.ucm.idcws.client.SearchTool
oracle.ucm.fa_genericclient_11.1.1.jar | Download | GET_FILE | oracle.ucm.idcws.client.DownloadTool

Automating Document Transfer
If you plan to leverage the services provided by the delivered Oracle WebCenter Content Document Transfer Utility, document transfer automation can be achieved using:
1) Commands to invoke the services provided by the Document Transfer Utility.
Please refer to this blog for more details on this approach.
2) Java programs to invoke the services provided by the Document Transfer Utility.
3) Oracle HCM COE – UCM Utility – an API wrapper on top of #2 that can be used as a starting point for automation.

Oracle HCM COE – UCM Utility
The Oracle HCM COE – UCM Utility is a package containing a sample Java API that interacts with the WebCenter Content Document Transfer Utility to facilitate Upload, Search, and Download actions on UCM. The sample API is also designed to handle the Delete action on UCM. This API is not production ready; please use it as a template or working model and modify it to fit your requirements. With the help of this utility, users can:
i) Search files in UCM based on multiple attributes such as Title, Author, Content ID, Dates, Comments, etc.
ii) Search and download multiple files in a single execution of the program
iii) Search, download, and delete multiple files in a single execution of the program
iv) Search and delete multiple files in a single execution of the program

The app that is part of this utility package provides the user interface to perform Upload, Search, and Delete actions. Sample testing programs are provided in the package, with the help of which the UCM Utility API can be invoked by passing the required parameters. Once the transaction is successful, the API returns the respective attribute values back to the testing programs. The return values depend on the type of action performed by the API. You can download the Oracle HCM COE – UCM Utility package from here.

Prerequisite
Oracle Java SE 6 release 1.6.0_20 is the minimum Java version required for a successful document transfer.

Contents of Oracle HCM COE – UCM Utility Package


Fusion HCM Center of Excellence

Invoking WebCenter Content Document Transfer Utility using Windows Commands

The WebCenter Content Document Transfer Utility is a set of command-line tools written in Java providing content import and export capabilities. The UploadTool creates a new content item in Oracle WebCenter Content based on contents streamed from a local file. The SearchTool locates content items within Oracle WebCenter Content matching specific query criteria. The DownloadTool retrieves a content item from Oracle WebCenter Content and saves its contents to a local file.

Forms of Document Transfer
Form 1: SOAP-Based Transfer Utility (Recommended)
The preferred generic SOAP-based transfer utility (oracle.ucm.fa_genericclient_11.1.1.jar) requires the Oracle JRF Web Service supporting libraries and uses JAX-WS over HTTPS to communicate with the Oracle WebCenter Content Server. It accesses the Content Server through its GenericSoapPort web service (/idcws/GenericSoapPort) and requires the client to specify a suitable UsernameToken-based Oracle WSM security client policy that matches the server's configured service policy.

Form 2: RIDC-Based Transfer Utility (Not Recommended)
The original RIDC-based transfer utility (oracle.ucm.fa_client_11.1.1.jar) is a feature-set Java library that encapsulates Oracle WebCenter Content RIDC and uses standard HTTPS to communicate with the Oracle WebCenter Content Server. The RIDC tool faces impending deprecation in FA environments due to authentication obstacles, which currently cannot be handled universally and programmatically. Customers who have deviated from standard Oracle Access Manager (OAM) web single sign-on for access to their Fusion Applications should use the generic SOAP-based transfer utility (or the underlying GenericSoapPort web service directly) to access the Content Server. The RIDC-based transfer utility supports "Basic" authentication and has restricted support for OAM 11g form-based authentication.
Note: This document covers only the SOAP-based transfer utility. Detailed steps for installing the WebCenter Content Document Transfer Utility are documented here.

Services available in the SOAP-based utility:

Java Library File Name | Service Type | IDC Service Name | Class
oracle.ucm.fa_genericclient_11.1.1.jar | Upload | CHECKIN_UNIVERSAL | oracle.ucm.idcws.client.UploadTool
oracle.ucm.fa_genericclient_11.1.1.jar | Search | GET_SEARCH_RESULTS | oracle.ucm.idcws.client.SearchTool
oracle.ucm.fa_genericclient_11.1.1.jar | Download | GET_FILE | oracle.ucm.idcws.client.DownloadTool

Program Options
We can invoke the document transfer services (upload/download/search) using Windows/Unix commands. The standard program options used in these commands are described below.

--url : Content Server protocol-specific connection URL ending in /idcws/. For example, if the HCM Cloud URL is https://zzzz-test.fs.us.oraclecloud.com/homePage/faces/AtkHomePageWelcome, then this URL value is https://zzzz-test.fs.us.oraclecloud.com/idcws/
--username : Oracle HCM Cloud user name
--password : Oracle HCM Cloud password
--policy : JAX-WS client policy, e.g. oracle/wss_username_token_over_ssl_client_policy
-Doracle.security.jps.config : The path where jps-config.xml is saved. In our case this file is present in ‘D:\UCM\config’. You can set properties such as minimum file size, maximum file size, and other audit properties in this jps-config.xml file. These values are pre-set; you can choose to edit them or leave them as is.
--primaryFile : Location of the file to be uploaded
--dDocTitle : Title of the document
--dSecurityGroup : Destination security group; for FA import/export use cases this is set by default to ‘FAFusionImportExport’
--dDocAccount : Destination security account, e.g. ‘hcm/dataloader/import’
--proxyHost : Proxy host name
--proxyPort : Proxy port value
--allDocs : Search all documents including old revisions [true/false (default)]; by default only the latest released revision is searched
--dID : Unique ID (number) assigned to each content item in UCM
--dDocName : Content item identifier (string). This is the Content ID attribute in the UCM UI.
--dOriginalName : Document original name/filename (string)
--dDocAuthor : Document author
--dExtension : Document extension (string)
--SortField : Sort search query results based on the specified metadata field
--SortOrder : The sort order: ASC (ascending) / DESC (descending); defaults to ASC while searching
--StartRow : The row at which to begin the search results display (after applying any sort); defaults to 1
--ResultCount : Maximum number of search results to return to the client; defaults to 20
--defaultFields : Output core metadata fields (dID, dDocName, dDocTitle, dDocLastModifiedDate, dDocLastModifier) [true (default)/false] while searching
--moreFields : Output a more detailed set of metadata fields (the fields above plus dOriginalName, VaultFileSize, etc.) [true/false (default)] while searching
--fields : User-specified fields to render while searching, e.g. --fields=dID,dOriginalName
--RevisionSelectionMethod : Which revision to download; valid values: Latest / LatestReleased. Defaults to Latest
--outputFile : Output/destination local file to write; if not provided, the dOriginalName of the file provided at check-in time is used
--outputDir : Can be used in place of --outputFile to stream contents to a local file in the specified output directory, using the dOriginalName provided at check-in time as the file name
Here dOriginalName is the name of the file when it was uploaded into UCM.

Sample Commands

Upload a File into UCM
Sample 1: Command for file upload to UCM (without proxy):
java -Doracle.security.jps.config="D:\UCM\config\jps-config.xml" oracle.ucm.idcws.client.UploadTool --url=https://zzzz-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --primaryFile="C:\worker.zip" --dDocTitle="SampleFile1" --dSecurityGroup=FAFusionImportExport --dDocAccount=hcm/dataloader/import

Sample 2: Command for file upload to UCM (with proxy enabled):
java -Doracle.security.jps.config="D:\UCM\config\jps-config.xml" oracle.ucm.idcws.client.UploadTool --url=https://zzzz-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --primaryFile="C:\worker.zip" --dDocTitle="SampleFile2" --dSecurityGroup=FAFusionImportExport --dDocAccount=hcm/dataloader/import --verbose --proxyHost=www-proxy.us.zzzzz.com --proxyPort=80

Search a File in UCM
Sample 1: Command to fetch a specific number of files in the order of their upload datetime, starting with the latest file:
java oracle.ucm.idcws.client.SearchTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --allDocs=true --ResultCount=40

Sample 2: Command to fetch a specific file based on Document ID:
java oracle.ucm.idcws.client.SearchTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --dID=6327

Sample 3: Command to fetch a specific file based on Content ID:
java oracle.ucm.idcws.client.SearchTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --dDocName=UCMFA00005759
If you know only a partial value of the Content ID attribute, replace --dDocName=UCMFA00005759 with --dDocName%=UCM in the above command.

Sample 4: Command to fetch a specific file based on the title:
java oracle.ucm.idcws.client.SearchTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --dDocTitle%=work

Sample 5: Command to fetch a specific file based on Document Author, sorting the result by Document ID in descending order:
java oracle.ucm.idcws.client.SearchTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --dOriginalName=worker% --dDocAuthor=HCM% --SortField=dID --SortOrder=DESC

Download a File from UCM
Sample 1: Command to download a file to a local directory using Document ID:
java -Doracle.security.jps.config="D:\UCM\config\jps-config.xml" oracle.ucm.idcws.client.DownloadTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --outputFile=C:\worker.zip --dID=6328

Sample 2: Command to download a file to a local directory using Content ID:
java -Doracle.security.jps.config="D:\UCM\config\jps-config.xml" oracle.ucm.idcws.client.DownloadTool --url=https://xxxx-test.fs.us.oraclecloud.com/idcws/ --username=HCM.USER --password=Welcome1 --policy=oracle/wss_username_token_over_ssl_client_policy --outputFile=C:\worker.zip --dDocName=UCMFA00005220 --RevisionSelectionMethod=Latest
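The download commands above can be scripted for batch retrieval of several documents. A minimal sketch that composes one DownloadTool command per dID (host, credentials, and paths are placeholders; the commands are echoed rather than executed so the sketch stays self-contained, and a real run needs the utility jars on the CLASSPATH):

```shell
#!/bin/sh
# Echo a DownloadTool command for each document ID passed in.
download_by_id() {
  for dID in "$@"; do
    echo "java -Doracle.security.jps.config=D:/UCM/config/jps-config.xml" \
         "oracle.ucm.idcws.client.DownloadTool" \
         "--url=https://xxxx-test.fs.us.oraclecloud.com/idcws/" \
         "--username=HCM.USER --password=***" \
         "--policy=oracle/wss_username_token_over_ssl_client_policy" \
         "--dID=${dID} --outputDir=D:/UCM/downloads"
  done
}

download_by_id 6327 6328
```

Note the use of --outputDir rather than --outputFile, so each file is written under its original check-in name instead of overwriting a single target file.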


UCM

Installing Oracle Webcenter Content Document Transfer Utility

Components Required for Document Transfer
The WebCenter Content Document Transfer Utility is a set of command-line tools providing content import and export capabilities. In this document we discuss downloading and configuring the various libraries (depicted in the screenshot below) that facilitate document exchange between UCM and a local directory.

Before Downloading the Libraries
Before downloading the libraries we need to:
i) Verify that the Java runtime requirements are met
ii) Create a folder structure to logically segregate the libraries after downloading

i) Verify that the Java runtime requirements are met
Oracle Java SE 6 release 1.6.0_20 is the minimum Java version required for a successful document transfer. Verify the currently available version on your local system with the java -version command. If the version is less than recommended, download the latest version from:
http://www.oracle.com/technetwork/java/javase/downloads/jre8-downloads-2133155.html
Accept the license agreement and download the version suitable for your operating system. For example, if you have a Windows 64-bit OS, download the .exe file and install the latest Java SE JRE.
Note: Please do not download any version of Java SE 9, since the utility has not been tested and certified with that version of Java.

ii) Create the empty folder structure below to logically segregate the libraries after downloading
1) Create a folder UCM under a directory (let's say D:\UCM)
2) Create a folder ‘mw_lib’ under D:\UCM. Here (D:\UCM\mw_lib) we will place the Middleware PS6 libraries.
3) Create a folder ‘lib’ under D:\UCM. Here (D:\UCM\lib) we will place the UCM generic client library.
4) Create a folder ‘config’ under D:\UCM. Here (D:\UCM\config) we will place the JPS configuration file.
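The folder structure above can also be created in a single step from a script. A sketch using Unix-style paths (the article itself uses D:\UCM on Windows, where the equivalent would be md D:\UCM\mw_lib and so on):

```shell
#!/bin/sh
# Create the UCM working directories described above in one command.
# BASE is a placeholder root; pass a different path as the first argument.
BASE="${1:-./UCM}"
mkdir -p "$BASE/mw_lib" "$BASE/lib" "$BASE/config"
ls "$BASE"
```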
5) After the folder creation, the empty directory D:\UCM should look like this.

Downloading the Middleware PS6 Libraries
The generic SOAP-based transfer utility (oracle.ucm.fa_genericclient_11.1.1.jar) has dependencies on the JRF JAX-WS libraries. These libraries can be extracted from Fusion Apps 11g Middleware. Below are the steps for extracting the required middleware libraries.
Download Fusion Apps 11g JDeveloper from:
http://www.oracle.com/technetwork/developer-tools/jdev/downloads/jdeveloer111171-2183166.html
Accept the license agreement and download JDeveloper version 11.1.1.7.1; the file jdevstudio11117install.exe will be downloaded. Install the downloaded .exe file to a directory, let's say D:\Middleware\. After successful installation, copy the following 47 PS6 libraries from the specified Middleware folders to the D:\UCM\mw_lib directory:
D:\Middleware\modules\com.bea.core.apache.commons.lang_2.1.0.jar
D:\Middleware\modules\com.bea.core.stax2_1.0.0.0_3-0-1.jar
D:\Middleware\modules\com.bea.core.woodstox_1.0.0.0_4-0-5.jar
D:\Middleware\modules\glassfish.jaxb_1.0.0.0_2-1-12.jar
D:\Middleware\modules\javax.ejb_3.0.1.jar
D:\Middleware\modules\javax.mail_1.1.0.0_1-4-1.jar
D:\Middleware\modules\javax.management.j2ee_1.0.jar
D:\Middleware\modules\javax.servlet_1.0.0.0_2-5.jar
D:\Middleware\modules\javax.xml.rpc_1.2.1.jar
D:\Middleware\modules\ws.api_1.1.0.0.jar
D:\Middleware\oracle_common\modules\oracle.dms_11.1.1\dms.jar
D:\Middleware\oracle_common\modules\oracle.fabriccommon_11.1.1\fabric-common.jar
D:\Middleware\oracle_common\modules\oracle.http_client_11.1.1.jar
D:\Middleware\oracle_common\modules\oracle.iau_11.1.1\fmw_audit.jar
D:\Middleware\oracle_common\modules\oracle.idm_11.1.1\identitystore.jar
D:\Middleware\oracle_common\modules\oracle.jmx_11.1.1\jmxframework.jar
D:\Middleware\oracle_common\modules\oracle.jmx_11.1.1\jmxspi.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-api.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-ee.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-audit.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-common.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-internal.jar
D:\Middleware\oracle_common\modules\oracle.jps_11.1.1\jps-unsupported-api.jar
D:\Middleware\oracle_common\modules\oracle.jrf_11.1.1\jrf-api.jar
D:\Middleware\oracle_common\modules\oracle.logging-utils_11.1.1.jar
D:\Middleware\oracle_common\modules\oracle.odl_11.1.1\ojdl.jar
D:\Middleware\oracle_common\modules\oracle.osdt_11.1.1\osdt_core.jar
D:\Middleware\oracle_common\modules\oracle.osdt_11.1.1\osdt_saml.jar
D:\Middleware\oracle_common\modules\oracle.osdt_11.1.1\osdt_wss.jar
D:\Middleware\oracle_common\modules\oracle.osdt_11.1.1\osdt_xmlsec.jar
D:\Middleware\oracle_common\modules\oracle.pki_11.1.1\oraclepki.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\orasaaj-rt.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\orawsdl.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\orawsrm.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\wsclient-rt.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\wssecurity.jar
D:\Middleware\oracle_common\modules\oracle.webservices_11.1.1\wsserver.jar
D:\Middleware\oracle_common\modules\oracle.wsm.agent.common_11.1.1\wsm-agent-core.jar
D:\Middleware\oracle_common\modules\oracle.wsm.agent.common_11.1.1\wsm-agent-fmw.jar
D:\Middleware\oracle_common\modules\oracle.wsm.agent.common_11.1.1\wsm-pap.jar
D:\Middleware\oracle_common\modules\oracle.wsm.common_11.1.1\wsm-pmlib.jar
D:\Middleware\oracle_common\modules\oracle.wsm.common_11.1.1\wsm-policy-core.jar
D:\Middleware\oracle_common\modules\oracle.wsm.common_11.1.1\wsm-secpol.jar
D:\Middleware\oracle_common\modules\oracle.wsm.policies_11.1.1\wsm-seed-policies.jar
D:\Middleware\oracle_common\modules\oracle.xdk_11.1.0\xml.jar
D:\Middleware\oracle_common\modules\oracle.xdk_11.1.0\xmlparserv2_sans_jaxp_services.jar
D:\Middleware\oracle_common\modules\org.jaxen_1.1.1.jar

Downloading the WebCenter Content Document Transfer Utility
The WebCenter Content Document Transfer Utility can be downloaded from:
http://www.oracle.com/technetwork/middleware/webcenter/content/downloads/wcc-11g-downloads-2734036.html
Accept the license agreement, expand the section “Individual components download”, and choose the client component “WebCenter Content Document Transfer Utility”. Clicking the download link downloads the zip file 150521-REL10-oracle.ucm.fa_client_11.1.1.zip. Download this zip file to any location, say D:\FusionHCM\UCM, and extract its contents into the folder D:\FusionHCM\UCM\150521-REL10-oracle.ucm.fa_client_11.1.1.
Open the extracted folder and go to the ‘generic’ folder (D:\FusionHCM\UCM\150521-REL10-oracle.ucm.fa_client_11.1.1\generic).
Copy oracle.ucm.fa_genericclient_11.1.1.jar from the ‘generic’ folder to the D:\UCM\lib folder that we created earlier.
Open the ‘config’ folder (D:\FusionHCM\UCM\150521-REL10-oracle.ucm.fa_client_11.1.1\generic\config) under the ‘generic’ folder.
Copy audit-store.xml and jps-config.xml from the ‘config’ folder to the D:\UCM\config folder that we created earlier.
This step concludes the downloading and logical segregation of the libraries.

Setting the Classpath
The Middleware PS6 libraries and the UCM generic client library (‘oracle.ucm.fa_genericclient_11.1.1.jar’) are essential for UCM file transfer using commands.
Hence the paths where these libraries are saved must be added to the CLASSPATH environment variable. The classpath value to add is:
D:\UCM\mw_lib\*;D:\UCM\lib\*;
(selecting all libraries from the D:\UCM\mw_lib and D:\UCM\lib folders).

Below are the steps to add the classpath variable on Windows 7:
1) Right-click on Computer and click the Properties option.
2) Click the Advanced system settings link.
3) Click the Environment Variables button in System Properties.
4) If a CLASSPATH variable already exists under User variables, double-click that variable and append D:\UCM\mw_lib\*;D:\UCM\lib\*; to it. Click the ‘OK’ buttons for all the windows. This concludes the classpath configuration.
5) If a CLASSPATH variable doesn't exist, click the New button in the User variables section. In the New User Variable dialog, provide the variable name (CLASSPATH) and value as shown in the screenshot below. Click the ‘OK’ buttons for all the windows. This concludes the classpath configuration.

For more information on the classpath, please refer to:
https://docs.oracle.com/javase/tutorial/essential/environment/paths.html
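Instead of setting a permanent environment variable through the system dialogs, the classpath can also be set for just the current session. A sketch in Unix shell syntax with placeholder paths (on Windows the equivalent is set CLASSPATH=D:\UCM\mw_lib\*;D:\UCM\lib\*; with ; as the separator):

```shell
#!/bin/sh
# Session-scoped classpath covering both library folders; note that on
# Unix the separator is ':' where Windows uses ';'.
export CLASSPATH="/opt/ucm/mw_lib/*:/opt/ucm/lib/*"
echo "$CLASSPATH"
```

A session-scoped setting is often preferable for scripted transfers, since it avoids polluting the machine-wide classpath used by unrelated Java programs.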

Components required for Document Transfer The WebCenter Content Document Transfer Utility is a set of command line interface tools providing content import and export capabilities. In this document we...

Living in the Cloud

Oracle HCM Cloud: Release 13 Update Planning & Opting Into New Features

Oracle HCM Cloud provides a straightforward upgrade and update experience. Get prepared in advance for a successful transition to the latest release updates.

Oracle HCM Cloud Checklists for Upgrading from Release 12 to Release 13 (Doc ID 2354604.1): this white paper helps ensure that you complete all the tasks for a successful upgrade. Follow the step-by-step instructions in the upgrade planning checklists to track and plan the Release 13 upgrade. These checklists are designed to provide advance planning information. Release 12 customers may choose to use this information to plan future solutions as they are enhanced. For some localization customers, the upgrade is essential to meet compliance requirements. It is crucial to develop a release plan to ensure a smooth transition from old technology to new.

Release 13 Upgrade Approaches

You must weigh your business priorities against resource availability in order to determine how to plan your upgrade. Common approaches are:

- Technical Upgrade Only. Strictly an upgrade. No feature uptake is included; you validate that the business processes deployed in Release 12 continue to work after the upgrade. The intention is to defer adoption of new features and products to post-upgrade project initiatives.
- Technical Upgrade and Light Feature Uptake. Upgrade, plus minimal feature uptake, for example, localization changes or an additional language install.
- Technical Upgrade and Feature Uptake. Upgrade, plus complete uptake of key product features, for example, the brand new user experience including the new Default Landing Page.

To Do Lists

If your project is part of a pilot program live on a prior release and has a few features to uptake in Release 13, focus on the following To Do Lists to fast-track your upgrade.
R12 PRE-UPG: Key Actions and Helpful Tips

1. Choose dates for the R13 upgrade (first come, first served)
   Key actions:
   1) Assess feasibility of a Technical Upgrade only
   2) Self-service request via My Services
   3) Schedule P2T/T2T via My Services
   Helpful tips: use the Release 13 Upgrade Planning Checklist for resource planning:
   - New Features (see column B)
   - Release Schedules (see columns C/E)
   - Upgrade Impact (see column D) *** the action is on the partner and customer to assess the impacts; Oracle can only suggest potential impacts
   Stage upgrades run on Tuesdays whereas Production (Prod)/Parallel Prod upgrades run on weekends (replay: Scheduling the Rel13 Upgrade on Customer Connect). If the project is in the midst of an implementation phase, consider upgrading Stage and Pre-Prod in the same week. If the customer is live and has one Stage and one Parallel Prod, consider upgrading Prod/Parallel Prod in the same week, thereafter P2T.

2. Scope of upgrade testing
   Key actions: plan and prepare new test scripts.
   Helpful tips: review functional known issues/workarounds; see Doc ID 1554838.1 for Release 13. Use the HCM Cloud Release 13 Upgrade Planning Checklist for uptake actions:
   - Business flow (see column N)
   - Profile option values (see column R)
   - Security roles and function privileges (see column S)
   - Navigation path (see column T)
   - Product training and white papers (see columns V-Z)
   - Decisions on Priority Order (AA), Reimplement (AB), Must Uptake (AC), Consider Hiding (Q) based on columns F/G/L

3. Enable offerings
   Key actions:
   1) Enabling Offerings and Functional Areas
   2) Enabling Features
   Helpful tips: select the 'Enable for Implementation' checkbox; Configure Features and Feature Choices.

4. Configuration freeze
   Key actions:
   1) Backup
   2) Change freeze
   Helpful tips: before P2T or T2T, if a Gold instance:
   - Freeze production changes, for example customization, configuration, and data fix scripts, prior to P2T/T2T
   - Publish sandboxes; exit dummy sandboxes
   - Never attempt to import R12 packages into R13
   - Prepare BI backups in both target and source instances
   -
Cease all ESS jobs in the source instance before the upgrade starts.

R13 POST-UPG: Key Actions and Helpful Tips

1. SaaS-PaaS immediate action
   Key actions: switch to using the consolidated (server) URL prior to upgrade regression testing.
   Helpful tips: product-family-specific nodes like 'fs', 'crm', 'hcm' etc. are replaced by 'fa', example: https://.fa..oraclecloud.com/

2. Perform R13 post-upgrade functional steps
   Key actions: configuration changes.
   Helpful tips: use the Release 13 Upgrade Planning Checklist for resource planning:
   - FSM Opt-in Feature (see column K)
   - Profile Options (see column R)
   - Security Roles and Function Privileges (see column S)

3. Tools
   Key actions: Download Desktop Integration Installer.
   Helpful tips: re-install the correct ADFdi version on the client: (Navigator) Tools > Download Desktop Integration Installer.

4. On-going uptakes
   Key actions: Announcements: (Month) Advisory - Visibility into New Cloud Apps Functionality. Specifically, this communication lists the products that release new functionality in the forthcoming schedule.
   Helpful tips: review the updated What's New for the recent changes. Read the Revision History first.


Fusion HCM Center of Excellence

LCM Archive and Purge Processes for Release 12 Oracle HCM Cloud

Archiving and purging is a data-growth control method that Oracle HCM Cloud customers can adopt as part of a strategy for information life cycle management (LCM).

Purge Load Batch Data (customers)
- Maintaining the stage and interface tables in all environments is the responsibility of customers; make this a regular task.
- Customers have full control of when and how frequently to delete stage table data.

Archive then Purge Completed BPM Tasks (CloudOps/customers)
- BPM tasks in a terminal state (STATE = COMPLETED, ERRORED, EXPIRED, WITHDRAWN) will be archived.
- In-progress tasks that are still outstanding and active (STATE = ASSIGNED, ALERTED, INFO_REQUESTED, OUTCOME_UPDATED, SUSPENDED) will not be archived.

Workflow Tasks Automatically Dismissed or Withdrawn (CloudOps)
- FYI notifications 7 days old are subject to auto-dismissal (driven by profile option FND_NOTIFICATION_EXP_PERIOD).
- WIP transactions 180 days old are subject to auto-withdrawal; see Approvals & Notifications FAQs, Doc ID 1987850.1, for expiration and escalation configuration options.

Human Capital Management - Approval Notification Archive Real Time (customers)
- Run the Archive Workflow Tasks (post-R12.17.11) scheduled process on demand for real-time reporting.

Read LCM Archive and Purge Processes for Release 12 Oracle HCM Cloud (Doc ID 2323993.1) for full details.
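The dismissal and withdrawal rules above can be sketched as a small illustrative model. This is not Oracle code; the 7-day and 180-day thresholds come from the text, and the exact boundary behavior (at-or-after vs. strictly after) is an assumption.

```python
# Illustrative model of the auto-dismiss / auto-withdraw rules:
# FYI notifications are dismissed once older than the
# FND_NOTIFICATION_EXP_PERIOD profile option (7 days by default),
# and in-progress (WIP) transactions are withdrawn after 180 days.

FND_NOTIFICATION_EXP_PERIOD_DAYS = 7   # profile option default
WIP_AUTO_WITHDRAW_DAYS = 180

def auto_action(task_type, age_days):
    """Return the automatic action ('DISMISSED' or 'WITHDRAWN') a task of
    the given age would receive, or None if it is left alone."""
    if task_type == "FYI" and age_days >= FND_NOTIFICATION_EXP_PERIOD_DAYS:
        return "DISMISSED"
    if task_type == "WIP" and age_days >= WIP_AUTO_WITHDRAW_DAYS:
        return "WITHDRAWN"
    return None
```

For example, a 30-day-old WIP approval is untouched, while a 30-day-old FYI notification would long since have been dismissed.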


File Based Loader (FBL)

Calling FBLdi from Peoplesoft

FBL Desktop Integrator (FBLdi) is a standalone desktop application that can assist you with various steps involved in FBL and data conversion, without the need for developing complicated Java programs or going through the lengthy process of posting a file to Oracle Fusion HCM manually. It's been years since I did any coding for PeopleSoft, but I am still daring to put together some sample code showing how to automate the integration from PeopleSoft. As you can see, the sample code uses hardcoded values for the parameters; you will have to write smarter code than what I have below to make sure it picks up the values from run-control parameters rather than hardcoded inputs. Good luck with the implementation.

A sample batch file is attached here (please rename it as Fusion_Upload.bat).

/* Sample PeopleCode: invoke FBLdi through a batch file and echo its output. */
Local string &fbl_autoload = "Y";
Local string &fbl_mode = "ucm";
Local string &input_file = "D:\2.zip";
Local string &batch_name = "PB_Test123";
/* Unused in this minimal command line, but FBLdi can also take a profile file and an object list: */
Local string &fbl_prop = "D:\FBLdi2.3.6\profiles\ERF_UCM.tm";
Local string &object_list = "Department,Establishment,Grade";
/* Mode, then -f input file, -i invoke loader, -al auto-load flag, -b batch name */
Local string &commandline = &fbl_mode | " -f " | &input_file | " -i " | "-al " | &fbl_autoload | " -b " | &batch_name;

Local JavaObject &runtime = GetJavaClass("java.lang.Runtime").getRuntime();
Local JavaObject &process = &runtime.exec("D:\FBLdi2.3.6\Fusion_Upload.bat " | &commandline);

/* Read the process output and display up to 50 lines. */
Local JavaObject &inputStreamReader = CreateJavaObject("java.io.InputStreamReader", &process.getInputStream());
Local JavaObject &bufferedReader = CreateJavaObject("java.io.BufferedReader", &inputStreamReader);
Local any &inputLine;
Local number &i = 0;
While True
   &inputLine = &bufferedReader.readLine();
   If (&inputLine <> Null And
         &i < 50) Then
      MessageBox(0, "", 0, 0, &inputLine);
      &i = &i + 1;
   Else
      Break;
   End-If;
End-While;


File Based Loader (FBL)

File Based Loader (FBL)- SFTP or UCM

Oracle HCM Cloud File-Based Loader (FBL) was the tool of choice for data loading and integrations from Release 4 onwards. Starting with Release 10, when HCM Data Loader (HDL) was introduced, FBL was deprecated. We still have customers using FBL, either manually or in an automated fashion, and for that reason I am writing this article. Its purpose is to educate the audience on FBL automation as it relates to SFTP vs. UCM usage, and to provide enough information that we can move away from using SFTP for file transfer.

Oracle WebCenter Content server (UCM) is the preferred method of file transfer for both the FBL and HDL data loading tools. It is the replacement infrastructure for staging data files for FBL and supersedes the SFTP option provided with Oracle HCM Cloud in prior releases. It provides improved and secured file management capabilities and is being used by other Oracle Cloud applications. UCM is an integral part of HCM Cloud and can be accessed directly within the application via the File Import/Export UI.

Automating FBL-based integrations is a two-step process: first you stage the data file to either SFTP or UCM, then you make a web service call to invoke FBL. Here are the key differences between using SFTP and UCM for automating inbound integrations.

SFTP
Step 1: The inbound file is sent to the cloud SFTP server (sftp.cloud.oracle.com) using the FTP credentials for the application.
Step 2: A web service call invokes FBL. In this case the FBL auto-invoke URL is for the loader composite, for example:
https://hxyz-test.hcm.us2.oraclecloud.com/soa-infra/services/default/HcmCommonBatchLoaderCoreInboundLoaderComposite/inboundloaderprocess_client_ep

UCM
Step 1: The inbound file is sent to UCM (WebCenter Content server) using HCM credentials, via either the generic SOAP port web service or the UCM RIDC web service.
- UCM (SOAP, recommended), e.g. https://hxyz-test.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl
- UCM (RIDC, does not support SSO), e.g. https://hxyz-TEST.fs.us2.oraclecloud.com/cs/idcplg
Step 2: A web service call invokes FBL. In this case the FBL auto-invoke URL is for the integration web service, for example:
https://hcm-aufsn4x0POD.oracleoutsourcing.com/hcmCommonBatchLoader/LoaderIntegrationService?wsdl

A detailed technical write-up is available via Oracle Support, document ID 1595283.1.

Starting Dec 1st, 2017, the Oracle-hosted SFTP server will no longer be available. If you are using an Oracle-hosted SFTP server for inbound integrations, then you must switch to UCM; if you are using it for BI/extracts/outbound integrations, then you can switch to UCM or a different SFTP server. Please review the FBL vs. HDL article if you are considering the uptake of HDL, and review support article 2312867.1 for additional information about the SFTP deprecation announcement.
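For illustration, here is a hedged sketch of building a GenericSoapPort request body in Python. The CHECKIN_UNIVERSAL service and field names (dDocTitle, dDocType, dSecurityGroup, dDocAccount, primaryFile) are the usual WebCenter Content check-in fields, but verify the exact payload, security group, and account values against Doc ID 1595283.1 before relying on this; the file content itself must also be attached (for example as an MTOM attachment), which is omitted here.

```python
# Hedged sketch: builds the SOAP body for a UCM GenericSoapPort check-in.
# Field names are the common WebCenter Content check-in fields; the exact
# metadata your pod expects is an assumption to verify against Doc 1595283.1.
from xml.sax.saxutils import escape

def checkin_envelope(doc_title, security_group, account, file_name):
    fields = {
        "dDocTitle": doc_title,
        "dDocType": "Document",
        "dSecurityGroup": security_group,
        "dDocAccount": account,
        "primaryFile": file_name,
    }
    field_xml = "".join(
        '<ucm:Field name="{0}">{1}</ucm:Field>'.format(k, escape(v))
        for k, v in fields.items()
    )
    return (
        '<soapenv:Envelope'
        ' xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"'
        ' xmlns:ucm="http://www.oracle.com/UCM">'
        "<soapenv:Body>"
        '<ucm:GenericRequest webKey="cs">'
        '<ucm:Service IdcService="CHECKIN_UNIVERSAL">'
        "<ucm:Document>{0}</ucm:Document>"
        "</ucm:Service></ucm:GenericRequest>"
        "</soapenv:Body></soapenv:Envelope>"
    ).format(field_xml)
```

The resulting envelope would be POSTed to the GenericSoapPort endpoint shown above with HCM credentials and the data file attached.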


HCM Data Loader (HDL)

HDL- Updating Source Keys (SourceKey.dat)

When you load data using HCM Data Loader, you can provide a source key. The source key is a value, usually generated in a legacy environment (PeopleSoft/EBS), that identifies the record uniquely in that environment. If you specify no source key (for example, when converting data using user keys or via online data entry), then a default source key is generated. You can update both default and locally defined source keys for integration-enabled objects. This topic describes how to update source keys. (As I write this article, the current release for Oracle HCM Cloud is Release 12.)

SourceKey.dat

To update the source key associated with any record, you load a SourceKey.dat file. In the file, you supply both a reference to the record to update and the new source-key value. For example:

SourceKey.dat
METADATA|SourceKey|BusinessObject|Component|OldSourceSystemOwner|OldSourceSystemId|NewSourceSystemOwner|NewSourceSystemId
MERGE|SourceKey|Worker|PersonPhone|FUSION|300000001572671|STUDENT1|PH1001_W
MERGE|SourceKey|Worker|PersonAddress|FUSION|300000002451147|STUDENT1|ADDR1001_HOME

Easy-to-read format:

METADATA | SourceKey | BusinessObject | Component     | OldSourceSystemOwner | OldSourceSystemId | NewSourceSystemOwner | NewSourceSystemId
MERGE    | SourceKey | Worker         | PersonPhone   | FUSION               | 300000001572671   | STUDENT1             | PH1001_W
MERGE    | SourceKey | Worker         | PersonAddress | FUSION               | 300000002451147   | STUDENT1             | ADDR1001_HOME

Please review this article if you are looking for more information on HDL keys and the integration key map. Hope you find it useful.
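As a sketch, the SourceKey.dat lines above could be generated programmatically from a key mapping extracted from the legacy system. This is illustrative Python, not an Oracle utility; the column layout follows the METADATA line shown above.

```python
# Minimal sketch: generate SourceKey.dat content from tuples mapping
# Fusion-generated default source keys to legacy (e.g. PeopleSoft) keys.
METADATA = ("METADATA|SourceKey|BusinessObject|Component|OldSourceSystemOwner|"
            "OldSourceSystemId|NewSourceSystemOwner|NewSourceSystemId")

def source_key_file(rows):
    """rows: iterable of (business_object, component, old_owner, old_id,
    new_owner, new_id) tuples. Returns the full .dat file content."""
    lines = [METADATA]
    for r in rows:
        lines.append("MERGE|SourceKey|" + "|".join(str(v) for v in r))
    return "\n".join(lines)
```

Calling it with the PersonPhone mapping from the example reproduces the MERGE line shown in the article.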


HCM Data Loader (HDL)

HCM Data Loader (HDL) Keys

In this article we will discuss the various key types supported by Oracle HCM Data Loader, a.k.a. HDL. Whether you are planning to implement coexistence with ongoing data sync, a one-time conversion, or ongoing integrations, you must select the right key type to identify records uniquely. Selecting the wrong key type may result in a huge amount of rework down the line, so please pay special attention to the various key types and see what makes most sense in your situation. I do recommend using the system keys where possible.

Supported Key Types: HCM Data Loader (HDL) supports 4 different types of keys, listed below in the order of the key resolution sequence:

1. GUID - Oracle Fusion Global Unique ID
2. Oracle Fusion Surrogate ID
3. Source Keys
4. User Keys

These key types are explained below.

Oracle Fusion GUID (integration key, generated by Fusion)
- Generated in Oracle Fusion when a record is created
- Hexadecimal value
- Unique across all objects
- Held in the Integration Key Map

Oracle Fusion Surrogate ID (Fusion-generated unique ID)
- Generated in Oracle Fusion when the record is created
- Numeric value
- Unique only for the object type
- Held on the object

Source Keys (source system key information)
- Two values combined: SourceSystemOwner and SourceSystemId
- Held in the Integration Key Map

User Keys (user-readable and user-generated keys)
- Natural values, one or many attributes
- Sometimes alternatives, sometimes updateable
- Held on the object definition

Key Type     | Create | Update            | Held on Object | Type         | Generated Automatically
GUID         | No     | Yes               | No             | Hexadecimal  | Yes
Surrogate ID | No     | Yes (see note #1) | Yes            | Numeric      | Yes
Source Key   | Yes    | Yes               | No             | Alphanumeric | Conditionally (see note #2)
User Key     | Yes    | Yes (see note #3) | Yes            | Alphanumeric | No

Notes:
1. You can use surrogate IDs when updating objects, but the IDs may not be readily available to Oracle HCM Cloud users.
2. Default source keys are generated only if you don't supply a source key when creating an object.
3.
You can't use user keys alone when updating some objects because their values are subject to change.
4. Keys that aren't held on the object exist in the Integration Key Map table.

Integration Key Map Table

Keys that aren't held on the object are stored in the HDL integration key map table, HRC_INTEGRATION_KEY_MAP. You should be able to use BIP to run a SQL statement and view the contents of this table, e.g.:

select OBJECT_NAME, SOURCE_SYSTEM_ID, SOURCE_SYSTEM_OWNER, SURROGATE_ID, RAWTOHEX(GUID) guid
from fusion.HRC_INTEGRATION_KEY_MAP
WHERE SOURCE_SYSTEM_OWNER = 'STUDENT1'

Example: Business Object = Location
- Fusion GUID: 25DD4078E961A23BE053A697480AFB92
- Source Key: STUDENT1_LOC1
- Surrogate ID: 300000001572671
- User Key (set code and location code): COMMON, HQ1

Fusion GUID: system-generated GUID.
Source Key: the Source System Owner is the reference to the source application, like PeopleSoft or EBS; the Source System ID is the actual key/ID provided in the Location.dat file.
Surrogate ID: system-generated. In this case it is the primary key from the locations record, e.g.:

select * from PER_LOCATION_DETAILS_F_VL where location_code = 'STUDENT1 Location1'
(Result: Location ID = 300000001572671)

User Key: the best way to get this info is the business object documentation from MOS; the other option is the UI, as shown below. The online page should highlight user keys with *.

Supplying Keys in an HDL File

CREATE: You can supply source keys or user keys (or both) when creating new objects. You can't supply a surrogate ID or GUID, because those values are auto-generated when the data load is successful.
Creating an Object Using Source Keys:
METADATA|Job|JobCode|EffectiveStartDate|EffectiveEndDate|Name|SetCode|SourceSystemOwner|SourceSystemId
MERGE|Job|SE|2010/01/01|4712/12/31|Software Engineer|COMMON|EBS-UK|12349

Creating an Object Using a User Key:
METADATA|Job|JobCode|EffectiveStartDate|EffectiveEndDate|Name|SetCode
MERGE|Job|SE|2010/01/01|4712/12/31|Software Engineer|COMMON

UPDATE: You can supply any of the 4 key types when doing updates.

Updating an Object Using Source Keys:
METADATA|Job|EffectiveStartDate|EffectiveEndDate|Name|SourceSystemOwner|SourceSystemId
MERGE|Job|2010/01/01|4712/12/31|Software Engineer - Java|EBS-UK|12349

Updating an Object Using a User Key:
METADATA|Job|JobCode|EffectiveStartDate|EffectiveEndDate|Name|SetCode
MERGE|Job|SE|2010/01/01|4712/12/31|Software Engineer - Java|COMMON

Updating an Object Using the Fusion GUID:
METADATA|Job|GUID|EffectiveStartDate|EffectiveEndDate|Name
MERGE|Job|2342UJHFI2323|2010/01/01|4712/12/31|Software Engineer - Java

Updating an Object Using the Fusion Surrogate ID:
METADATA|Job|JobId|EffectiveStartDate|EffectiveEndDate|Name
MERGE|Job|13413|2010/01/01|4712/12/31|Software Engineer - Java

Specifying Foreign Keys

Using the Fusion Surrogate ID as a Foreign Key:
METADATA|Assignment|JobId|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|SourceSystemId|SourceSystemOwner
MERGE|Assignment|13413|2010/01/01|4712/12/31|5232|EBS-UK|234234

Using the User Key as a Foreign Key:
METADATA|Assignment|JobCode|SetCode|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|SourceSystemId|SourceSystemOwner
MERGE|Assignment|SE|COMMON|2010/01/01|4712/12/31|5232|EBS-UK|234234

Using the Fusion GUID as a Foreign Key (supply the value using the attribute hint):
METADATA|Assignment|JobId(GUID)|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|SourceSystemId|SourceSystemOwner
MERGE|Assignment|2342UJHFI2323|2010/01/01|4712/12/31|5232|EBS-UK|234234

Using the Source Key as a Foreign Key (supply the value using the attribute hint):
METADATA|Assignment|JobId(SourceSystemId)|EffectiveStartDate|EffectiveEndDate|AssignmentNumber|SourceSystemId|SourceSystemOwner
MERGE|Assignment|13143|2010/01/01|4712/12/31|5232|EBS-UK|234234

System Integration Considerations - What not to do:
- Do not include the effective date or hire date as part of the unique key/source key.
- Do not include the first name or last name as part of the unique key/source key for a person name.

Bottom line: please do not design keys from values which can be easily changed in the source application.
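The key resolution sequence described at the top of this article (GUID, then surrogate ID, then source key, then user key) can be modeled as a small sketch. This is not HDL's actual implementation, just a way to reason about which key wins when several are supplied on one record.

```python
# Illustrative model of HDL's key resolution order: when more than one
# key type is present on a record, the earliest type in this sequence
# is the one used to identify the record.
KEY_RESOLUTION_ORDER = ["GUID", "SurrogateId", "SourceKey", "UserKey"]

def resolve_key(supplied):
    """supplied: dict of key-type -> value for the keys present on a
    record. Returns the (key_type, value) pair that takes precedence."""
    for key_type in KEY_RESOLUTION_ORDER:
        if supplied.get(key_type):
            return key_type, supplied[key_type]
    raise ValueError("no key supplied for record")
```

For example, a record carrying both a source key and a user key resolves by the source key, which is why a stale user key on such a record is harmless for matching.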


HCM Data Loader (HDL)

HDL Thread Count

This article provides general guidelines for the HCM Data Loader (HDL) thread count parameters. It is one of the frequently discussed topics, as the use of multithreading for conversion can drastically improve performance. (As I write this article, the current HCM Cloud release is Release 12.)

HDL Process Flow: the HCM Data Loader (HDL) process is mainly divided into three sub-processes: transfer, import, and load. For simplicity's sake we will consider transfer-import as the IMPORT process and LOAD as the final load process that inserts data into the main application tables. Here are the key differences between IMPORT and LOAD.

Import
- Import is a SQL-based program. It reads the input file, inserts data into stage records, does basic validations, and constructs logical business objects.
- Processing is driven by the number of physical rows in the file.
- Concurrency is determined by the Number of Import Processes parameter. Since this is a SQL-based program, the "Job Queue Processes" value set at the database level determines how many concurrent import jobs one can run.
- Concurrency is further controlled by the number of ESS servers configured in the instance.

Load
- Load is a Java-based program. It validates logical business objects and inserts data into HCM Cloud.
- Processing is driven by the number of objects constructed by the import process, as opposed to physical rows; for example, 500 rows may still correspond to 1 worker object.
- Concurrency is determined by the Number of Load Processes parameter. Since this is a Java-based program, the number of servers configured in the instance determines how many concurrent load jobs one can run. For example, 2 Core Setup servers translate into 2*8=16 maximum load threads for a worker load.
- Concurrency is further controlled by the number of ESS servers configured in the instance.

HDL run control parameters drive the bulk data loading process; you can default these parameters via the configuration pages or specify suitable values during the program run as shown below.
Maximum Concurrent Threads for Import: this parameter controls the number of threads that will share the work during the import phase of data loading.

Maximum Concurrent Threads for Load: the maximum number of threads that can be used for loading a business object depends on the number of servers your environment has been configured with.

Example: an environment could have:

Server                | # of Servers | Concurrency
CoreSetupServer       | 3            | 3 * 8 = 24 threads
PayrollServer         | 2            | 2 * 8 = 16 threads
CompensationServer    | 3            | 3 * 8 = 24 threads
HCM Domain ess_server | 2            | 2 * 25 = 50 ESS processes
job_queue_processes   | 20           | 20 database processes

In general, one ESS server can handle 25 concurrent programs, one application server can handle 8 concurrent threads, and the job_queue_processes parameter for the database controls concurrency for SQL-based programs. Since this environment/pod has 3 Core Setup servers, the maximum number of threads available is 24 (3*8). And it has 2 ESS servers, so one can run a maximum of 50 (2*25) batch programs. This includes HDL, PBL, etc., as well as any other scheduled or ad hoc programs.

Now that we know how servers and thread counts work, the next piece of the puzzle is to sort out which server runs a specific business object.

Business Objects & Servers: HDL supports many business objects, and there are a few ways to sort out which object belongs to which server.

Method 1: the article on My Oracle Support - 2020600.1.
Method 2: the Initiate Data Load page (i.e., the place where you get templates per object) mentions the module name for each object. You can equate module names to server names as below:

Module                 | Server
Global HR - Core HR    | Core Process server
Global HR - Core Setup | Core Setup server
Compensation           | Compensation server
Global Payroll         | Payroll server
Talent                 | Talent server
Workforce Management   | Workforce Management server

Here is a worked example showing server configurations and the maximum number of threads that can be used for data loading.
POD Name: COEDEV-TEST (Application Usage: Development Instance)

Server                | Configuration    | Concurrency
CoreSetupServer       | 8 with 8GB heap  | Worker load threads: 8 * 8 = 64
PayrollServer         | 2 with 4GB heap  | Element load threads: 2 * 8 = 16
CoreProcessesServer   | 2 with 4GB heap  | DoR load threads: 2 * 8 = 16
CompensationServer    | 3 with 8GB heap  | Salary load threads: 3 * 8 = 24
WorkforceMgmt         | 1 with 8GB heap  | Absence load threads: 1 * 8 = 8
HCM Domain ess_server | 2 with 8GB heap  | Concurrent programs: 2 * 25 = 50
job_queue_processes   | 20               | (max number of concurrent SQL jobs)

Case 1: A user kicked off a large batch of workers with 64 import and 64 load threads.
Result: Import will run with a maximum concurrency of 20, as it is controlled by the job queue parameter. Load will run with a maximum concurrency of 50; even though 64 threads are available, the instance has only 2 ESS servers, so you can run only 50 (out of 64) concurrent load programs.

Case 2: User A ran some random ESS program; at the same time, user B kicked off a large HDL batch of workers with 64 import and 64 load threads.
Result: Import will run with a maximum concurrency of 20, as in case 1. Load will run with a maximum concurrency of 49 (50 ESS capacity - 1, as there is another program running in this instance).

** Note: It is always a smart idea to find out how your instance was sized and configured so you can make proper decisions on the number of threads/concurrency to use during data loading. This is critical if you are loading a mass amount of data. If your instance is not configured to handle the data volume, then follow MOS note 2004494.1 to submit a resizing request. Hope you find this useful. If you are looking for more info on HDL batch size, then review this article.
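The arithmetic in the two cases above can be captured in a small sketch. This is illustrative only; the 8-threads-per-server and 25-programs-per-ESS-server figures are the rules of thumb stated earlier in this article.

```python
# Sketch of HDL's effective concurrency: load concurrency is capped by
# the object's application servers (8 threads per server) and by ESS
# capacity (25 programs per ESS server, minus programs already running);
# import concurrency is capped by the database job_queue_processes value.

def max_load_concurrency(requested, object_servers, ess_servers, other_running=0):
    server_cap = object_servers * 8          # threads per app server
    ess_cap = ess_servers * 25 - other_running  # concurrent ESS programs
    return min(requested, server_cap, ess_cap)

def max_import_concurrency(requested, job_queue_processes):
    # Import is SQL-based, so job_queue_processes is the hard ceiling.
    return min(requested, job_queue_processes)
```

Plugging in the COEDEV-TEST numbers reproduces both cases: 64 requested load threads against 8 Core Setup servers and 2 ESS servers yields 50, or 49 when one other program is running, while 64 import threads against job_queue_processes=20 yields 20.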


HCM Data Loader (HDL)

HDL Batch Size

Very often I get questions about the concurrency or thread count to use, or the recommended batch size, while loading data using HCM Data Loader. I find it challenging to provide a generic answer, simply because no two customers are the same. Recommendations provided to one customer for loading, say, 1,000 employees don't have to match recommendations given to another who is also loading 1,000 employees.

Let's say there are 4 customers who are trying to load 1,000 employees using HDL, but each of them wants to bring a different amount of historical information per employee, and hence the number of physical rows differs:

Customer A: 1,000 employees, 1 year of history, physical rows = 1,000
Customer B: 1,000 employees, 3 years of history, physical rows ~ 10,000
Customer C: 1,000 employees, 10 years of history, physical rows ~ 50,000
Customer D: 1,000 employees, 20 years of history, physical rows ~ 100,000

Naturally, if everyone follows the same rule while loading their data, it's not going to be efficient. If each runs the program using 50 load threads and a batch size (chunk size) of 100, here is what may happen. In every case the number of batches is 10 (1,000 employee objects / 100), to be divided across a maximum of 50 threads:

Customer A (1,000 rows): 10 threads are used; excellent system performance.
Customer B (~10,000 rows): 10 threads are used; expect reasonable performance.
Customer C (~50,000 rows): 10 threads are used; expect performance issues.
Customer D (~100,000 rows): 10 threads are used; expect worst-case system performance.

What went wrong? Notice that in all the above cases no one is using all of the 50 threads, because there are only 10 batches to be processed. We need to adjust the chunk size parameter to load this data in a reasonable amount of time:

Customer A: 1,000 rows; chunk size = 100, load threads = 50. Number of batches: 1,000/100 = 10, so 10 threads are used, each processing 100 objects (employees).
Customer B: ~10,000 rows; chunk size = 60, load threads = 50. Number of batches: 1,000/60 = 17, so 17 threads are used, each processing 60 objects.
Customer C: ~50,000 rows; chunk size = 40, load threads = 50. Number of batches: 1,000/40 = 25, so 25 threads are used, each processing 40 objects.
Customer D: ~100,000 rows; chunk size = 10, load threads = 50. Number of batches: 1,000/10 = 100, so all 50 threads are used, each processing 2 batches of 10 objects.

This second setup should provide far better performance compared to the prior use case, where the chunk size was defaulted to 100 irrespective of the amount of historical data being converted. So remember, there is no one solution that fits all, but hopefully this article gives you some insight into the batch size, a.k.a. chunk size, for HDL data loading. There are several system-level parameters which affect data loading performance; I will discuss those in the next article.
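The batch arithmetic for the four customers above can be sketched as follows (illustrative Python; 1,000 employee objects throughout):

```python
import math

# Sketch of HDL batch-size arithmetic: the number of load batches is the
# object count divided by the chunk size, and the threads actually used
# is capped by that batch count.

def batches(total_objects, chunk_size):
    return math.ceil(total_objects / chunk_size)

def threads_used(total_objects, chunk_size, max_threads):
    return min(batches(total_objects, chunk_size), max_threads)
```

With 1,000 objects and 50 threads, a chunk size of 100 engages only 10 threads, while chunk sizes of 60, 40, and 10 engage 17, 25, and all 50 threads respectively, matching the table above.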


HCM Data Loader (HDL)

HCM Data Loader (HDL) - Loading Activities & Interest Card

Overview: Loading the Person Profile (Talent Profile) using HDL is a well-documented process and something Fusion HCM customers have been doing since the FBL days. This article addresses a special use case of the talent profile load as it relates to the Activities and Interests card. It's one of those special cases because the portrait card that we are trying to load is not really a part of the Person Profile definition; in other words, it's not a content section on the Manage Profile Types setup task. Our goal is to load the highlighted information from the screenshot below.

Talent Profile Object: Talent Profile is one of the supported business objects for HDL, and you can use it to load Person Profiles as well as Model Profiles (such as Job or Position). Employees can manage their own careers by keeping their talent profiles current, so that their skills, qualifications, accomplishments, and career preferences reflect their current performance and future career goals. You can create:
- Person profile: a collection of a worker's skills, qualifications, education background, and so on.
- Model profile: a collection of the work requirements and required skills and qualifications of a workforce structure, such as a job or position.

There is an excellent article on loading talent data using HDL on MOS: 2022627.1.

Data Loading Steps

Loading the About Me Section: the About Me section is similar to the LinkedIn profile summary section. You can load this information using the Summary field on the profile object.
METADATA|TalentProfile|SourceSystemOwner|SourceSystemId|Description|Summary|ProfileTypeCode|ProfileCode|ProfileStatusCode|ProfileUsageCode|PersonId(SourceSystemId)
MERGE|TalentProfile|STUDENT7|STUDENT7_PERPROFILE100|HDLdi Demo Person Profile|About Me Student 7 100|PERSON|STUDENT7_PERPROFILE100|A|P|STUDENT7_PER100

Loading the Area of Interest Section: the Area of Interest section can include keywords or a summary of the employee's areas of interest. You can load this information using the ProfileKeyword child object of the Talent Profile. The profile keyword holds keywords for areas of expertise and areas of interest for person-based profiles. The discriminator ProfileKeyword is used to load profile keyword records using HCM Data Loader. The following ProfileKeyword attributes are commonly used when loading a new profile keyword; other optional attributes may be available.

METADATA|ProfileKeyword|SourceSystemOwner|SourceSystemId|ProfileId(SourceSystemId)|KeywordType|Keywords
MERGE|ProfileKeyword|STUDENT7|STUDENT7_100KeyAOI|STUDENT7_PERPROFILE100|AOI|Accounting, Operations, Marketing

The KeywordType field in this section has 2 valid values:
AOI - Area of Interest
AOE - Area of Expertise

Here is the complete sample file for the Talent Profile object. Hope you find this information useful. Good luck with the implementation.


HCM Data Loader (HDL)

HCM Data Loader (HDL) - Objects Supporting Deletes

Here is a list of the most commonly used objects during HDL conversion and whether they support deletes. You can review the HDL user guide for the complete list of supported objects.

Delete command - You can use the DELETE action as part of the HDL (.dat) file for objects supporting deletes. The command identifies business-object components to be purged from Oracle Fusion HCM. You cannot delete individual date-effective records, so you need to delete the complete logical business object. For example:

METADATA|Grade|SetCode|GradeCode|GradeName
DELETE|Grade|COMMON|STUDENT5_GRADE1|STUDENT5 Grade1

Important: Please note that FBL delete diagnostics cannot be used to purge data converted using HDL.

Important: You can use the DELETE command as part of an HDL input file to purge data from a stage or production instance. Since the delete operation is non-reversible, be careful when executing these commands in a production database.

List of commonly used business objects (Delete/Purge supported?):

Global HR
  Action - Yes
  ActionReason - Yes
  Location - No
  Job - No
  Department - No
  Grade - Yes
  GradeRate - Yes
  Grade Ladder - Yes
  JobFamily - Yes
  Department Tree - Yes
  Organization Tree - Yes
  Position - Yes
  Allocated Checklist - Yes
  Areas of Responsibility - Yes
  Document Record - Yes
  Document Record Delivery Preference - Yes
  Extended Lookup - Yes
  Legislative Data Group - Yes
  Person Type - Yes
Talent
  Education Establishment - Yes
  RatingModel - Yes
  Content Item - Yes
  Talent Profile - Yes
  Goals - Yes
  Goal Plan - Yes
  Goal Plan Set - Yes
Comp
  SalaryBasis - Yes
  Salary - Yes
Absence Management
  Person Accrual Detail - Yes
  Person Entitlement Detail - Yes
  Person Absence Entry - Yes
  AbsenceCase - Yes
Payroll
  Element Entry - Yes
Time and Labor
  Time Record Group - Yes
Employee Data
  Worker - *Some child objects; you cannot delete the entire Worker object

Worker Object:
Child Object - Delete Supported? - Comments
PersonAddress - Yes - You can delete addresses, but not the primary address.
PersonCitizenship - Yes
PersonDeliveryMethod - Yes
PersonDriversLicence - Yes
PersonEmail - Yes - You can delete all email addresses, regardless of the email type or primary status. However, if you delete just the primary email address and another email address is to be retained, you must first update one of the remaining email addresses to be the primary.
PersonEthnicity - Yes
PersonLegislativeData - Only if multiple - There must always be one legislative data record for a worker; additional records can be deleted.
PersonName - No - You cannot delete a worker's name.
PersonNationalIdentifier - Yes
PersonPassport - Yes
PersonPhone - Yes
PersonReligion - Yes
PersonVisa - Yes
PersonUserInformation - No - This is only available when creating new workers, and is not applicable for deleting users.
PersonUserManualRoles - No - This is only available when creating new workers, and is not applicable for removing roles. Refer to the Loading Update User Requests using HCM Data Loader white paper to remove roles from existing users.
WorkRelationship - Cancel - When deleting a work relationship, you must also supply the CancelWorkRelationshipFlag with a value of Y.
WorkTerms - No - You cannot delete an individual work terms record.
Assignment - No - You cannot delete an individual assignment record.
AssignmentGradeSteps - Yes
AssignmentSupervisor - No - You can end-date assignment supervisor records, but you cannot delete them.
AssignmentWorkMeasure - Yes
AssignmentExtraInfo - Yes
Contract - No
WorkTermsGradeSteps - Yes
WorkTermsSupervisor - No - You can end-date work terms supervisor records, but you cannot delete them.
WorkTermsWorkMeasure - Yes
WorkTermsExtraInfo - Yes
WorkerExtraInfo - Yes
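Because a DELETE file has the same pipe-delimited shape as any other HDL file, the lines can be generated mechanically. Below is a minimal sketch (not an Oracle utility) that builds the Grade DELETE example from this article:

```python
# Illustrative sketch: build HDL DELETE lines for a .dat file. The object name,
# attributes, and values follow the Grade example shown earlier in this post.

def build_delete_file(object_name, attributes, rows):
    """Return HDL .dat content: a METADATA line and one DELETE line per row."""
    lines = ["|".join(["METADATA", object_name] + attributes)]
    for row in rows:
        lines.append("|".join(["DELETE", object_name] + [row[a] for a in attributes]))
    return "\n".join(lines)

# Grade delete, matching the example in this article.
grade_dat = build_delete_file(
    "Grade",
    ["SetCode", "GradeCode", "GradeName"],
    [{"SetCode": "COMMON", "GradeCode": "STUDENT5_GRADE1", "GradeName": "STUDENT5 Grade1"}],
)
print(grade_dat)
```

The same pattern applies to Worker child objects that support deletes; for WorkRelationship you would additionally include the CancelWorkRelationshipFlag attribute with a value of Y, as noted in the table above.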


HDL Desktop Integrator (HDLdi)

Introduction
HDL Desktop Integrator (HDLdi) is a standalone desktop application that can assist you with the various steps involved in HDL and data conversion, without the need for developing complicated Java programs or going through the lengthy process of posting a file to Oracle Fusion HCM manually. HDLdi can assist you with the file transfer and registration involved in HDL conversion, and can provide sample files specifically tailored for your own application. It can be a great learning tool if you are starting a new project, or conducting a workshop or a training class. It can also be used to spot-check your Oracle Fusion HCM pod to ensure that it is configured properly for HDL.

HDLdi Installation
HDLdi installation is really simple - download the application, unzip/extract the file, and start using it! (You must have Java installed on your computer.) My Oracle Support link to download the app: Note 2056538.1.

HDLdi Demo Database
HDLdi can provide sample files specifically tailored for your own application, so that you can upload these files to a Fusion HCM pod and get comfortable with the HDL data loading process. Some of the highlights of the HDLdi demo data:
Sample files for 30+ business objects, with more being added in new releases of the tool
Demo data for 10 employees
Core transactions, e.g. Hire, Promotion, Transfer, Termination, Rehire, etc.
Multiple assignments
Multiple managers
Historical information
Salary information
Work structures
Flexfields
Talent Profile
Area of responsibility
Photos and attachments
Several use cases for the Worker object

Automating the HDL conversion using HDLdi
The HDLdi user interface is very useful if you are posting the files manually, especially during the implementation or learning phase, but post go-live or in coexistence scenarios you will want to automate the file posting and HDL invocation.
There are two options to do this.

Option 1: Command Line
To view the usage, use:
> upload ucm -h
usage: upload ucm [-prop <arg>] -f <arg> [-i [-ime <arg>] [-lme <arg>] [-lct <arg>] [-lgs <arg>] [-et <arg>] [-dsf <arg>]] [-proxy -host <host> -port <port>] [-output <xml_file>]

Example:
upload ucm -prop D:\Prasanna\HDLdi2.3.6\profiles\MXF_HDL.TM -f D:\temp\demo.zip -i -ime 100 -lme 100 -lct 1 -lgs 200 -et NONE -dsf Y -output result.xml

Parameters:
-h,--help : Print the help message.
-prop,--property-file <arg> : Optional. The setup file for the UCM server configuration. The default properties file is profiles/ucmcli.tm.
-f,--upload-file <arg> : File to upload.
-i,--invoke : Optional. Invoke the HDL service after upload.
-dsf,--delete-source-file <arg> : Delete source file (Y/N). Effective only when invoke is enabled.
-et,--encrypt-type <arg> : Encrypt types are PGPSIGNED, PGPUNSIGNED, PGPX509SIGNED, PGPX509UNSIGNED. Use NONE or leave blank for no encryption. Effective only when invoke is enabled.
-ime,--import-maximum-err <arg> : Import maximum errors. Numeric value; default is 100. Effective only when invoke is enabled.
-lct,--load-concurrent-thread <arg> : Load concurrent threads. Numeric value; default is 1. Effective only when invoke is enabled.
-lgs,--load-group-size <arg> : Load group size. Numeric value; default is 200. Effective only when invoke is enabled.
-lme,--load-maximum-err <arg> : Load maximum errors. Numeric value; default is 100. Effective only when invoke is enabled.
-output,--output-file <arg> : XML file to output the result.
-host,--proxy-host <arg> : HTTP proxy host.
-port,--proxy-port <arg> : Proxy port.
-proxy : A flag to use an HTTP proxy to connect to the server.

Option 2: Java API
You can invoke HDLdi from a Java program by simply using the oracle.hcm.cx.app.CommandInvoker class.

Example 1:
public String invoke(String[] args) throws Exception
Usage:
CommandInvoker ci = new CommandInvoker();
String result = ci.invoke(new String[]{"ucm", "-prop", "D:/ucmdata/MXF_HDL.TM", "-f", "D:/ucmdata/person1.zip", "-i", "-ime", "100", "-lme", "100", "-lct", "1", "-lgs", "200", "-et", "PGPSIGNED", "-dsf", "Y"});

Example 2:
public String invokeUCM(String fileName, String profileName, String invoke, String importMaxErr, String loadMaxErr, String loadConThread, String loadGroupSize, String encryptType, String deleteSourceFile) throws Exception
Usage:
CommandInvoker ci = new CommandInvoker();
String result = ci.invokeUCM("D:/ucmdata/person1.zip", "D:/ucmdata/MXF_HDL.TM", "-i", "100", "100", "1", "200", "NONE", "Y");

Conclusion
As part of an Oracle Fusion HCM Cloud implementation - full HR or the coexistence model - you are likely to face the situation where you need to automate file transfer and data loading tasks for HDL. You can accomplish these tasks without investing heavily in programming, by using the tools and methods described in this article. To summarize, here are the various options for using HDLdi for file transfer and registration tasks:
· HDLdi user interface
· HDLdi command line
· HDLdi function call as part of a custom Java program

Important Information: HDL Desktop Integrator (HDLdi) is not a supported product of Oracle. You will not be able to enter an SR or bug for this app, but you can report issues or enhancement requests via My Oracle Support Communities. Your opinion is very important to us. Please provide feedback which can help us to support you even better.
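When scripting the command-line option from another language, the main work is assembling a correct argument list. The sketch below (an assumption about how one might wrap the tool, not part of HDLdi itself; the profile and zip paths are placeholders) builds the documented `upload ucm` arguments:

```python
# Illustrative sketch: assemble HDLdi's "upload ucm" command-line arguments
# from Python before handing them to the tool (for example via subprocess).

def build_upload_args(profile, data_file, invoke=True, threads=1, group_size=200):
    """Assemble the argument list for HDLdi's command-line uploader."""
    args = ["upload", "ucm", "-prop", profile, "-f", data_file]
    if invoke:
        # Mirror the documented defaults: 100 max errors, NONE for no encryption.
        args += ["-i", "-ime", "100", "-lme", "100",
                 "-lct", str(threads), "-lgs", str(group_size), "-et", "NONE"]
    return args

# Placeholder paths for illustration only.
print(" ".join(build_upload_args("profiles/MXF_HDL.TM", "demo.zip")))
```

Building the list programmatically keeps per-environment values (profile file, thread count, group size) in one place instead of scattering them across shell scripts.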


Business Intelligence Publisher (BIP)

BI Publisher to query record

Oracle BI Publisher is a powerful reporting tool that allows you to separate data sources (data models) from data formats (report layouts). The BIP engine can format any well-formed XML data, allowing integration with any system that can generate XML, and BI Publisher can merge multiple data sources into a single output document. In this article you can see how easy it is to use BIP to query any record from a Fusion HCM Cloud instance. In the SaaS model you don't have access to SQL tools such as SQL Developer to query the database, so tools such as BIP come in very handy for troubleshooting issues or data reconciliation, where you can write simple SQL statements or reports to look at the data in the database records.

Use case: Run a SQL statement against the PER_ALL_PEOPLE_F table.

Navigation: There are two different ways to log in to Oracle Fusion HCM BI Publisher.

Method 1: If you know the direct URL for BIP, use that; it is the simplest way to get to BIP. Here are a few sample BIP URLs so that you can derive the URL for your application based on your server details:
https://HXYZ-TEST.bi.us2.oraclecloud.com/analytics/
https://bi-aufsn4x0POD.oracleoutsourcing.com/analytics/ (POD - pod name)

Method 2: Log in to the Fusion HCM application, go to Tools -> Reports and Analytics, and click the Browse Catalog icon. This should open the BIP UI in a new window.

Create a new data model, and select the data source type SQL Query.
Select the data source ApplicationDB_HCM, and then provide the SQL statement that you want to run.
View results / create a new report - If you only want to view the data for troubleshooting, you don't really need to create a report or format; you can simply click the View Data button and see the results. You can optionally save the data model in a shared folder so other team members can access it, rather than having everyone create their own copy.

Results
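For the PER_ALL_PEOPLE_F use case above, the main wrinkle is that the table is date-effective, so a query usually filters to the row effective today. The sketch below just holds a sample statement of that shape as a string (the column list is a minimal, commonly used subset chosen for illustration):

```python
# A sample of the kind of SQL you might paste into the BIP data model described
# above. PER_ALL_PEOPLE_F is date-effective, so we keep only the row whose
# effective date range contains today's date.
sql = """
SELECT papf.person_id,
       papf.person_number,
       papf.effective_start_date,
       papf.effective_end_date
FROM   per_all_people_f papf
WHERE  TRUNC(SYSDATE) BETWEEN papf.effective_start_date
                          AND papf.effective_end_date
"""
print(sql.strip())
```

Without the effective-date filter, the query would return one row per date-effective change for each person, which is usually not what you want when reconciling data.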


HCM Data Loader (HDL)

FBL vs HDL

This article compares FBL and HDL - two commonly used data loading tools in Fusion HCM - to highlight key differences and similarities. It also provides some useful information on documentation and user tools.

General:
FBL (File Based Loader): Release 4+. Flat-file based data load utility; data from any source. Supports ~21 business objects. Partial data sets for incremental updates, except work relationship and salary. Delete is not supported. Limited support for flexfields.
HDL (HCM Data Loader): Release 10+. The recommended data loading tool for all new customers. Flat-file based data load utility; data from any source. Supports ~90 business objects. Partial data sets for incremental updates, except salary (you can now send a partial row set for work relationship!). Delete is supported for certain business objects. Supports all flexfields, including Extensible Flexfields and the PeopleGroup Flexfield.

Supported file transfer:
FBL: Supports both SFTP and UCM, although UCM is the recommended method for file transfer.
HDL: Supports only UCM as the method for file transfer.

UCM file transfer URLs (examples), the same for both tools:
UCM (SOAP - recommended): https://hxyz-test.fs.us2.oraclecloud.com/idcws/GenericSoapPort?wsdl
UCM (RIDC - does not support SSO): https://hxyz-TEST.fs.us2.oraclecloud.com/cs/idcplg

Auto-invoke URLs (examples):
FBL: The URL format differs depending on the file transfer protocol (SFTP or UCM).
SFTP: If you are using SFTP as the file transfer protocol, the FBL URL will be for the loader composite:
https://hxyz-test.hcm.us2.oraclecloud.com/soa-infra/services/default/HcmCommonBatchLoaderCoreInboundLoaderComposite/inboundloaderprocess_client_ep
https://hcm-aufsn4x0POD.oracleoutsourcing.com/soa-infra/services/default/HcmCommonBatchLoaderCoreInboundLoaderComposite/inboundloaderprocess_client_ep
UCM: If you are using UCM as the file transfer protocol for FBL, use the loader integration service URLs:
https://hcm-aufsn4x0POD.oracleoutsourcing.com/hcmCommonBatchLoader/LoaderIntegrationService?wsdl
https://hxyz-test.hcm.us2.oraclecloud.com/hcmCommonBatchLoader/LoaderIntegrationService?wsdl
HDL:
https://hxyz-test.hcm.us2.oraclecloud.com/hcmCommonDataLoader/HCMDataLoader?wsdl
https://hcm-aufsn4x0POD.oracleoutsourcing.com/hcmCommonDataLoader/HCMDataLoader?wsdl

Data files:
FBL: Component data is delivered in separate data files. Files are grouped into a named folder.
HDL: All data for an object is in a single data file. The data file has the name of the object and is included at the top level within a zip file.

Processing instructions:
FBL: No explicit instructions - the interpretation of how to process data is embedded in the FBL processing engine.
HDL: You can indicate whether the data should be MERGEd or DELETEd. The interpretation of what MERGE means is embedded in the object processing code.

Column headings:
FBL: Column headings are defined in the FBL Columns spreadsheet.
HDL: Column headings are different from FBL (e.g. PERSON_NUMBER becomes PersonNumber) and are described in the business object template files.

Date format:
FBL: YYYY-MM-DD
HDL: YYYY/MM/DD

Keys:
FBL: Uses GUIDs (source system references) to identify primary, parent, and foreign keys. Accordingly, there is just one column for each key reference.
HDL: Can use any of four different types of key to identify primary, parent, and foreign keys. It uses annotations in the column heading metadata line to indicate the type of key being passed, if not an explicit attribute. The supported keys are: Oracle Fusion GUID, Oracle Fusion surrogate ID, source keys, and user keys.

Attachments and images:
FBL: Not supported.
HDL: Attachments for specific business objects, and images, are supported. These are provided in a special folder in the zip file and are referenced in the corresponding data files.

Flexfields:
FBL: Supports a limited set of flexfields. The column headers are provided with generic names, e.g. Attribute1, Attribute2. DFF support: PER_ASG_DF, PER_CITIZENSHIPS_DFF, PER_ETHNICITIES_DFF, PER_GRADES_DF, PER_JOBS_DFF, PER_LOCATIONS_DF, PER_ORGANIZATION_UNIT_DFF, PER_PERSONS_DFF.
HDL: Supports all flexfields, including Extensible Flexfields and the PeopleGroup Flexfield. The column headers use the flexfield segment names rather than generic column names.

Object coverage:
FBL: Supports around 21 objects.
HDL: Supports around 90 objects. (Note - Business Unit was supported in FBL but is not supported via HDL.)

User guide:
FBL: 1595283.1
HDL: 1664133.1 (business object documentation - 2020600.1)

Overall integration guide:
FBL: Rel10 Integration Guide
HDL: Rel10 Integration Guide

Desktop tools for automation:
FBL: FBLdi - 1915774.1; automation white paper - 1955064.1
HDL: HDLdi - 2056538.1; there is another white paper on MOS to automate HDL without HDLdi - 1664133.1

Offline data validators:
FBL: Basic
HDL: Advanced - MySQL-based validator - 2022617.1

Diagnostics:
FBL: HR2HR Batch Error Analysis - 1600353.1; Batch Loader Diagnostics - 1961599.1; some useful BIP/SQL reports - MOS community
HDL: HCM Data Loader Data-Set Status Diagnostics - 1664133.1

Coexistence tool-kits (delivered toolkits must be used only as a reference point; customers must implement/customize their extracts to fit their business needs):
FBL: PeopleSoft coexistence using FBL - 1667423.1 (you may raise an SR for PeopleSoft (not Fusion) for your PS and Tools release to get up-to-date info on the PS patch for coexistence); EBS toolkit for coexistence using FBL - 1556687.1
HDL: PeopleSoft toolkit for coexistence using HDL - 2111641.1; EBS toolkit for coexistence using HDL - 1942763.1

Hope you find it helpful.
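Two of the mechanical differences called out above - the date format (YYYY-MM-DD vs. YYYY/MM/DD) and the column heading style (PERSON_NUMBER vs. PersonNumber) - lend themselves to simple conversion helpers when migrating files from FBL to HDL. The sketch below illustrates both; note that real HDL attribute names are defined in the business object template files and are not always a straight re-casing, so the heading helper is only a first approximation:

```python
# Illustrative FBL-to-HDL conversion helpers (a sketch, not a migration tool).
from datetime import datetime

def fbl_date_to_hdl(value):
    """Convert an FBL date (YYYY-MM-DD) to HDL's format (YYYY/MM/DD)."""
    return datetime.strptime(value, "%Y-%m-%d").strftime("%Y/%m/%d")

def fbl_heading_to_hdl(heading):
    """Re-case an FBL heading such as PERSON_NUMBER to PersonNumber.
    Approximation only: verify the real name in the HDL template files."""
    return "".join(part.capitalize() for part in heading.split("_"))

print(fbl_date_to_hdl("2016-01-31"))        # 2016/01/31
print(fbl_heading_to_hdl("PERSON_NUMBER"))  # PersonNumber
```

Parsing the date before reformatting (rather than just swapping separators) also catches malformed values in the source file early.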

