
An Oracle blog about HCM Cloud

HDL Post Conversion Processes

Prasanna Borse
Center of Excellence at Oracle

Overview:

This article provides best practices for post-conversion processing in Oracle HCM Cloud. After a successful data migration or incremental updates using HCM Data Loader (HDL), you must run a set of processes that either create supplemental data or optimize indexes for better performance. It is critical to understand when and how to run these programs. As of this writing, 19C is the current release of the Oracle HCM Cloud application.

Post Conversion Processes

1. Synchronize Person Records

Notifies consuming Oracle Cloud applications, such as Oracle Fusion Trading Community Model, of changes to person and assignment details since the last data load.

2. Refresh Manager Hierarchy

For performance reasons, the complete manager hierarchy for each person is extracted from active data tables. The hierarchy is stored in a separate manager hierarchy table, known as the de-normalized manager hierarchy (PER_MANAGER_HRCHY_DN).

3. Update Person Search Keywords

Several attributes of person, employment, and profile records are used as person-search keywords. This process copies keyword values in all installed languages from the originating records to the PER_KEYWORDS table, where they're indexed to improve search performance.

4. Optimize Person Search Keywords Index

Optimizes the index of the PER_KEYWORDS table to improve search performance.

5. Autoprovision Roles for All Users

Grants or removes roles based on current role-provisioning rules. The process evaluates ALL users in the system against the role-provisioning setup, so it is resource intensive and may create a large number of LDAP requests.

6. Send Pending LDAP Requests

Sends user-account requests to the LDAP directory. Run this process only when you want user accounts to be created or updated.

7. Send Personal Data for Multiple Users to LDAP

Ensures that personal data held in your LDAP directory matches that held by Oracle HCM Cloud after bulk updates. The synchronized fields are first name, last name, email, and manager.

8. Synchronize Person Assignments from Position

Run this process if you are using position management and PositionOverrideFlag is set to Y for assignment records in Worker.dat. If the line manager is also synchronized, you may want to run this process before the Refresh Manager Hierarchy process. (Full HCM implementations only; not applicable to coexistence.)

9. Calculate Seniority Dates

You cannot create V3 seniority records using HDL; only updates are allowed. After loading the Worker object via HDL, you must run this process to create default seniority records for workers based on the configured seniority date rules.

 

Auto-trigger Processes

For the Worker object, by default, these two processes run automatically when HDL completes:

  • Refresh Manager Hierarchy
  • Update Person Search Keywords

You can prevent either or both of these processes from running automatically by using a SET instruction in the Worker.dat file. This is especially useful when you have many batches for employee data conversion: disabling the automatic run after every single batch allows you to run these programs only once, after all the batches are complete.
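As a hedged sketch only, the overall shape of a Worker.dat file with a SET instruction is shown below. The instruction name here is a placeholder, not a real parameter — look up the exact SET instruction for suppressing these processes in the HDL documentation for your release. The COMMENT, METADATA, and MERGE lines follow standard HDL file structure, with the attribute list abbreviated for illustration.

```
COMMENT Placeholder sketch only: replace <SUPPRESS_INSTRUCTION> with the
COMMENT documented SET instruction name for your release before loading.
SET <SUPPRESS_INSTRUCTION> Y
METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate
MERGE|Worker|VISION|W_100|100|2019/01/01
```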

When to run these post conversion programs? 

This is a fun subject for discussion, and you may find forums with slightly different flavors of recommendations. Here is one good way to plan your post-conversion schedule.

Synchronize Person Records
  • One-time conversion: Yes. Run after loading person data.
  • Incremental/ongoing updates: Yes, if there are changes to person records on a daily basis.

Refresh Manager Hierarchy
  • One-time conversion: Yes (HDL will auto-trigger).
  • Incremental/ongoing updates: Yes (HDL will auto-trigger).
  • Comments: Run the program manually if the HDL auto-trigger is disabled.

Update Person Search Keywords
  • One-time conversion: Yes (HDL will auto-trigger).
  • Incremental/ongoing updates: Yes (HDL will auto-trigger).
  • Comments: Run the program manually if the HDL auto-trigger is disabled.

Optimize Person Search Keywords Index
  • One-time conversion: Yes.
  • Incremental/ongoing updates: Yes.

Autoprovision Roles for All Users
  • One-time conversion: No.
  • Incremental/ongoing updates: No.
  • Comments: Auto role-provisioning rules should be configured BEFORE running HDL Worker loads. This lets user accounts be created along with requests to provision roles automatically. Do not schedule this process; instead, run it manually when role-provisioning rules are modified.

Send Pending LDAP Requests
  • One-time conversion: Yes.
  • Incremental/ongoing updates: Yes.
  • Comments: Run this job after bulk loading workers via HDL. It is also a best practice to schedule this job on a daily basis to take care of ongoing user access requests as well as to process future-dated LDAP requests.

Send Personal Data for Multiple Users to LDAP
  • One-time conversion: Yes.
  • Incremental/ongoing updates: Depends.
  • Comments: Required only if personal data (name, email, manager) is updated in bulk (via HDL, spreadsheet loaders, and so on). It is best to run the process once after the one-time conversion, so you don't need to worry about data-loading dependencies such as an initial load for basic information and a separate batch for line-manager updates.

Synchronize Person Assignments from Position
  • One-time conversion: Depends.
  • Incremental/ongoing updates: Depends.
  • Comments: Applicable only if you are using position management and full HR (not coexistence). Given that changes can happen regularly within a position, you'll want this process to run on a regular basis. If you are synchronizing the line manager, then it's recommended to run this process daily as well.

Calculate Seniority Dates
  • One-time conversion: Yes.
  • Incremental/ongoing updates: Yes.
  • Comments: Run this job after the initial conversion as well as after ongoing worker loads via HDL. Review the section below to set an appropriate batch-size value to avoid performance issues.

 

How to run these post conversion programs?

It's time to deep dive into each of these programs and understand run control parameters and other considerations. 

Synchronize Person Records

This job is resource intensive because it publishes SOA events for consuming applications, so run it after hours and with proper input parameters. Also note that this job processes only effective-dated transactions; it will pick up future-dated hires only when that hire date becomes effective.

There are 3 input parameters for this ESS job. 

  1. From Date: Start of the date range to process.

  2. To Date: End of the date range to process.

  3. After Batch Load: <Yes, No, or blank>. Set this parameter to Yes when you are running the process after an HDL conversion.

The job can be run in two ways.

Daily
Pass the system date in both date parameters and set the "After Batch Load" parameter to "No".

For a specific period
Pass the date range in the parameters. Example: From Date 12-Aug-2019; To Date 18-Aug-2019.

Note: The maximum date range between the From Date and To Date parameters is 7 days. Only for the very first run is a date range of more than 7 days accepted. On all subsequent runs with a date range of more than 7 days, the job ends in Warning state and doesn't raise any events.
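The date-range rule above can be sketched as a small validation. This is an illustration of the documented behavior only; the function name and return values are mine, not part of the product.

```python
from datetime import date

MAX_RANGE_DAYS = 7

def validate_date_range(from_date: date, to_date: date, first_run: bool) -> str:
    """Mimic the documented rule: a range over 7 days is accepted only
    on the very first run; later runs end in Warning and raise no events."""
    if to_date < from_date:
        raise ValueError("To Date must not precede From Date")
    span = (to_date - from_date).days
    if span <= MAX_RANGE_DAYS or first_run:
        return "SUCCEEDED"
    return "WARNING"  # job completes in Warning state, no events raised

# The 12-Aug to 18-Aug example above spans 6 days, so it is always accepted.
print(validate_date_range(date(2019, 8, 12), date(2019, 8, 18), first_run=False))  # SUCCEEDED
```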

 

Refresh Manager Hierarchy

A person's manager hierarchy could be derived from active data tables, but the impact of that approach on performance is unpredictable. Therefore, the complete manager hierarchy for each person is extracted from data tables and stored in a separate manager hierarchy table. This table is known as the denormalized manager hierarchy. Whenever a change is made to a person's manager hierarchy through the application pages, the change is reflected automatically in the denormalized manager hierarchy table. You use the Refresh Manager Hierarchy process to populate the denormalized manager hierarchy table when person records are updated using data loaders (e.g. HDL). 

There is 1 input parameter for this ESS job. 

1. Updated Within the Last N Days: e.g. 1 day if the job is scheduled on a daily basis; leave it blank for the initial run or for a full-refresh reconciliation.

 

Update Person Search Keywords

There are 3 input parameters for this ESS job. 

1. Batch Id: <number or blank>. To execute the job for a particular HDL batch, enter only the Batch Id and leave all other parameters blank. It creates or recreates person keywords for all people successfully loaded by that HDL batch.
2. Name: <person name LOV or blank>. To execute the job for a specific person, pass only the person name and leave other fields blank.
3. After Batch Load: <Yes, No, or blank>. To execute the job in delta/incremental mode, pass only the After Batch Load parameter as Yes. It creates or recreates person keywords for all modified or new people in the system. This option works only when the delta (change) size is less than 20K; for a delta larger than 20K, run the job in All People mode (that is, do not pass any parameters).
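The parameter choice for a bulk refresh can be sketched as follows. The 20K threshold comes from the note above; the function name and dictionary representation are illustrative, not an Oracle API.

```python
def keywords_job_params(delta_size: int) -> dict:
    """Pick Update Person Search Keywords parameters for a bulk refresh:
    delta mode ('After Batch Load' = Yes) works only below 20,000 changed
    people; otherwise run in All People mode (leave every parameter blank)."""
    DELTA_LIMIT = 20_000
    if delta_size < DELTA_LIMIT:
        return {"After Batch Load": "Yes"}  # delta/incremental mode
    return {}  # All People mode: no parameters passed

print(keywords_job_params(5_000))   # delta mode
print(keywords_job_params(50_000))  # All People mode
```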

 

Optimize Person Search Keywords Index

Run this job only when system load is minimal; generally it is best to run it once daily during the maintenance window. If the 'Update Person Search Keywords' ESS job is also scheduled to run daily during the maintenance window, it is recommended to schedule this job right after it.

There are 2 input parameters for this ESS job.

1. Maximum Optimization Time: <number of minutes or blank>; the default is 180 minutes.
2. Optimization Level: <Full Optimization, Rebuild the Index, or blank>; the default is Full Optimization. If you are rebuilding the index, the optimization time is ignored.

Note: Recommendations will vary based on the employee population, system usage, database usage, data loaders used, index fragmentation, 'Update Person Search Keywords' ESS job runs/schedules, and so on.

Autoprovision Roles for All Users

Please do not schedule this process for a daily or regular run; instead, run it manually as necessary. You need to run this process only if the auto-provisioning rules setup has changed. You must then run the Send Pending LDAP Requests ESS job to actually process all the LDAP requests generated by this job.

There is 1 input parameter for this ESS job

1. Process Generated Role Requests: <Yes or No>. Set this parameter to No to defer the processing. Deferring is better for performance, especially when thousands of role requests may be generated.

 

Send Pending LDAP Requests

Most changes to users and roles are shared automatically by Oracle HCM Cloud with Oracle Identity Management, but you may need to run this ESS job after mass updates (such as an HDL worker load or auto-provisioning changes). It is best to schedule this process on a daily basis, so it sends pending requests, as well as future-dated requests that have become current, to Oracle Identity Management.

There are 2 input parameters for this ESS job. 

1. User Type: <All, Party, or Person>; the default is All.
2. Batch Size: <number or A>. A means auto-calculate the batch size; for example, if you have 1,000 requests to be processed in 10 batches, the batch size is 100.
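The auto-calculated batch size in that example is simple ceiling division. This is a sketch of the arithmetic only; the function name is mine, not Oracle's.

```python
import math

def auto_batch_size(total_requests: int, batches: int) -> int:
    """Split pending LDAP requests evenly across batches: 1,000 requests
    in 10 batches gives a batch size of 100. Ceiling division covers
    totals that do not divide evenly."""
    return math.ceil(total_requests / batches)

print(auto_batch_size(1000, 10))  # 100
```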

Synchronize Person Assignments from Position

You must schedule this process to run on a regular basis. If you are synchronizing the manager, then it's recommended to run this process daily.

There are 3 input parameters for this ESS job. 
1. Past Period to Be Considered in Days - Number of days in the past to be considered for updating the attribute in the assignments. The default value is 30 days. For example, if you set this parameter to 60 days, then any assignment records with start dates during the previous 60 days are synchronized from positions.   
2. Run at Enterprise Level <Yes, No> - Select Yes to run the process for the enterprise, or No to run it for a specific legal entity.
3. Legal Employer <Legal employer name LOV> - Legal entity for which you want to run the process.
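The effect of the Past Period to Be Considered in Days parameter can be sketched as a date-window filter over assignment records. This is an illustration of the window semantics only; the function, field names, and sample data are hypothetical.

```python
from datetime import date, timedelta

def assignments_in_window(assignments, past_days=30, today=None):
    """Keep assignment records whose start date falls within the past
    'past_days' days — the window the parameter controls (default 30)."""
    today = today or date.today()
    cutoff = today - timedelta(days=past_days)
    return [a for a in assignments if cutoff <= a["start_date"] <= today]

sample = [
    {"id": 1, "start_date": date(2019, 8, 10)},  # within the last 30 days
    {"id": 2, "start_date": date(2019, 5, 1)},   # outside the window
]
print(assignments_in_window(sample, past_days=30, today=date(2019, 8, 20)))
```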
 

Calculate Seniority Dates

There are 6 input parameters for this ESS job. 
1. Person Number: Comma-separated list of person numbers you want the ESS job to run for; leave it blank to run it for all employees.
2. Past Period in Days: The default is 1 day. This determines how many changes are picked up by the process.
3. Include Terminated Work Relationships: Determines whether seniority data for terminated workers is generated.
4. Legal Employer: <Legal employer name LOV> - Legal entity for which you want to run the process.
5. Union: <Union LOV> - Union name for which you want to run the process.
6. Selected Seniority Date Rules: List of rules you want to populate for the workers.

Note: The Past Period in Days parameter determines how many records are picked up for deriving seniority dates. If some records have not been modified for a long time, run the ESS job once with a larger number as the period. However, running with a large past period impacts the performance of the ESS job. Multi-threading is one option to improve performance; it is enabled by setting a batch (chunk) size using a profile option. The PER_EMP_SD_MAX_PROCESS_REC profile option sets the number of records processed per thread, which in turn controls the number of threads used, depending on the total number of records. You need to create this profile option through the following steps.

  • Navigate to the task "Manage Profile Options".
  • Create a profile option with the following details:
    • Profile Option Code: PER_EMP_SD_MAX_PROCESS_REC
    • Profile Option Display Name: PER_EMP_SD_MAX_PROCESS_REC
    • Application: Global Human Resources
    • Module: Employment
    • Start Date: 01/01/1951
  • Once the profile option is created, make it Enabled and Updatable at the Site level.
  • Now navigate to the task "Manage Administrator Profile Values":
    • Search for the newly created profile option (PER_EMP_SD_MAX_PROCESS_REC).
    • At the Site level, set a desired value, e.g. 10000 or 2000. This is the chunk size, which eventually determines the number of threads. (This value is to be decided by the customer as per their requirements.)
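The relationship between the chunk size and the resulting thread count can be sketched as follows. This is illustrative arithmetic only; the function name and the worker count are mine.

```python
import math

def thread_count(total_records: int, chunk_size: int) -> int:
    """Each thread processes up to PER_EMP_SD_MAX_PROCESS_REC records,
    so the thread count is the record total divided by the chunk size,
    rounded up."""
    return math.ceil(total_records / chunk_size)

# e.g. 45,000 workers with a chunk size of 10,000 would need 5 threads
print(thread_count(45_000, 10_000))  # 5
```

A smaller chunk size means more threads (more parallelism, more overhead); a larger one means fewer, longer-running threads.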
