The Integration blog covers the latest in product updates, best practices, customer stories, and more.

  • October 10, 2020

A Simple Guide to Oracle HCM Data Loader (HDL) Job Support in Oracle HCM Cloud Adapter

Prakash Masand and Asawari Pawar

Oracle Integration continues to simplify integrations with Oracle HCM Cloud by adding native support for more and more Oracle HCM integration touch points. Oracle Integration now supports Oracle HCM Data Loader (HDL) jobs, a powerful tool for bulk loading data through integrations. Using Oracle HDL, you can load business objects for most Oracle HCM Cloud products into Oracle HCM Cloud. For example, you can load new hires from Oracle Talent Acquisition Cloud (Taleo EE) as workers into Oracle HCM Cloud using an Oracle HDL job. To learn more about Oracle HDL jobs, refer to this blog.

The Oracle HCM Cloud Adapter in Oracle Integration simplifies the way an integration specialist invokes an Oracle HDL job process and monitors the status of the job. The Oracle HDL job can load data into business objects from delimited data (.dat) files. An integration architect can generate the delimited data files in Oracle Integration using business object template files provided by Oracle HCM Cloud. Because the business object template file contains every attribute, including flexfields, it can be further simplified and personalized by removing the unneeded attributes. The business object template file must be associated with a stage file action in Oracle Integration to generate the delimited data file. This greatly simplifies generation of delimited data files for Oracle HCM business objects through Oracle Integration. To learn more about how to obtain the business object template file from Oracle HCM Cloud and use it for delimited data file generation, refer to this post on the Oracle Cloud Customer Connect portal.

At a high level, an Oracle HDL job pattern can be implemented in three steps:
1) Generate the HCM HDL compliant delimited data (.dat) file.
2) Submit the Oracle HDL job.
3) Monitor the job status until completion.
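For illustration, an HDL-compliant .dat file pairs a METADATA header line with one data line per record. The sketch below is a hypothetical, heavily trimmed Worker example; the real business object template file from Oracle HCM Cloud defines the full attribute set, and the values shown are made up:

```text
METADATA|Worker|SourceSystemOwner|SourceSystemId|EffectiveStartDate|PersonNumber
MERGE|Worker|VISION|WRK_001|2020/01/01|300100
```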


The Oracle HDL job is a bulk data load process that runs in batch mode. To support this, the Oracle HCM Cloud Adapter provides two Oracle HDL operations:

  1. Submit the Oracle HDL job: The Oracle HCM Cloud Adapter uploads a ZIP file containing a .dat file to Oracle Universal Content Management (UCM) and invokes the Oracle HDL importAndLoad operation. This operation returns the Oracle HDL process ID. Note that the ZIP file can contain multiple business object .dat files, as supported by the Oracle HDL job.

  2. Get Oracle HDL process status: The Oracle HCM Cloud Adapter invokes the HDL getDataSetStatus operation to get the status of the specific Oracle HDL process.
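As a rough sketch of how these two operations compose, the submit-then-poll pattern looks like the following Python. Here `submit_hdl_job` and `get_dataset_status` are hypothetical stand-ins for the adapter's importAndLoad and getDataSetStatus invokes, and the status names are assumptions — compare against the actual values your adapter returns:

```python
import time

# Hypothetical terminal statuses; check the actual values returned
# by the Get Oracle HDL process status operation.
TERMINAL_STATUSES = {"COMPLETED", "CANCELLED", "ERROR"}

def run_hdl_job(submit_hdl_job, get_dataset_status, zip_bytes,
                interval=30.0, max_attempts=20):
    """Submit an HDL job and poll its status until a terminal state.

    submit_hdl_job(zip_bytes) -> process_id   (stands in for importAndLoad)
    get_dataset_status(process_id) -> status  (stands in for getDataSetStatus)
    """
    process_id = submit_hdl_job(zip_bytes)
    for _ in range(max_attempts):
        status = get_dataset_status(process_id)
        if status in TERMINAL_STATUSES:
            return process_id, status
        time.sleep(interval)  # wait before polling again
    raise TimeoutError(f"HDL process {process_id} did not finish in time")
```

In Oracle Integration itself, the same loop is modeled with a while action around the status invoke, as shown in the design-time flow below.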

Design-Time Flow for the Oracle HDL Process

This part consists of the following steps:

  1. Create an integration that reads source files staged on an FTP server containing the information that needs to be loaded into Oracle HCM Cloud using an HDL job. The business object used in this example is Worker.

  2. Generate the worker delimited data file using the business object template file obtained from Oracle HCM Cloud.

  • Create a stage file action using the Write File operation. Specify the file name as “Worker.dat” and the output directory as “WorkerOutput”.

  • Click Next.


  • Select the XML Schema (XSD) document option in the wizard, as we will be providing the business object template file.

  • Click Browse and upload the business object template file obtained from Oracle HCM Cloud. 

  • Select the “WorkerFileData” schema element. This is the main element for the delimited data file.


 Refer to the Oracle Cloud Customer Connect post mentioned above for steps on retrieving the Oracle HCM Cloud business objects in the respective schema.

  • Click Done to save the stage file invoke action.

  3. Map the elements from the source CSV file to the business object template file. Use the mapper to map the source file elements to the target elements depicted in the schema.

    • The Sources panel on the left shows all of the available values and fields that can be used in this mapping. The Target panel on the right illustrates the Write hierarchy, a representation of the basic structure of a Worker.dat file.

    • For the label fields on the Target panel, enter the corresponding header title. For example, under the WorkerLabel parent, EffectiveStartDateLabel takes the value EffectiveStartDate. These values correspond to the header columns in the final .dat file being generated.

    • For the data sections of the Target panel, map the values from the Sources panel or enter default values.

    • Additionally, map the repeating element recordName value to the repeating element of each parent-specific data section. In our example, NewHires must be mapped to parent data sections such as “Worker”, “PersonLegislativeData”, “PersonName”, and so on.
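The mapping described above — a header line built from the label fields, then one data line per repeating source record — can be sketched outside the mapper as a small Python transformation. The attribute names below are a hypothetical subset of the Worker template, chosen only for illustration:

```python
import csv
import io

# Hypothetical subset of Worker attributes; the real business object
# template file from Oracle HCM Cloud defines the full set.
ATTRIBUTES = ["EffectiveStartDate", "PersonNumber", "LastName", "FirstName"]

def new_hires_to_worker_dat(csv_text):
    """Turn a NewHires CSV extract into HDL-style Worker.dat lines:
    one METADATA header line, then one MERGE line per source row."""
    lines = ["METADATA|Worker|" + "|".join(ATTRIBUTES)]
    for row in csv.DictReader(io.StringIO(csv_text)):
        lines.append("MERGE|Worker|" + "|".join(row[a] for a in ATTRIBUTES))
    return "\n".join(lines)
```

In the integration itself, the stage file Write File action performs this generation declaratively from the schema and the mappings.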

  4. Create a stage file action using the ZIP File operation to generate the ZIP file to send to the Submit an HCM Data Loader job operation.
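What the ZIP File stage action produces can be approximated in Python as follows. The file names are illustrative, and, as noted earlier, the ZIP may contain multiple business object .dat files:

```python
import io
import zipfile

def bundle_dat_files(dat_files):
    """Bundle .dat file contents into the ZIP payload expected by the
    Submit an HCM Data Loader job operation.

    dat_files: mapping of file name -> content, e.g. {"Worker.dat": "..."}
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in dat_files.items():
            zf.writestr(name, content)
    return buf.getvalue()
```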

  5. Configure the Oracle HCM Cloud Adapter in the Adapter Endpoint Configuration Wizard to use the Submit an HCM Data Loader job operation.

  • Select the Import Bulk Data using HCM Data Loader (HDL) action.

  • Select the Submit an HCM Data Loader job operation.

  • Select the security group and doc account configuration parameters for submitting the Oracle HDL job.


  • Map the OIC file reference. If you want to send additional parameters to the importAndLoad operation, they can be sent through the Parameters element.

  6. Configure a second Oracle HCM Cloud Adapter in the Adapter Endpoint Configuration Wizard to use the Query the status of HCM Data Loader Job (HDL) operation. This action can run inside a loop until the HDL job status is identified as Started or any other status you want to check for.

  • Select Query the status of HCM Data Loader Job (HDL).


  • Map Process ID to getStatus Request (Oracle HCM Cloud) > Process ID.

  7. Map the response from the Query the status of HCM Data Loader Job (HDL) operation.


Oracle Integration provides extensive visibility into the job status by providing status information at various stages, namely the job, load, and import stages. Compare the status of the job against your business requirement. The overall status can be one of the following values:





  • The process has not started yet; it is waiting or ready. If this value is returned, poll again after a short wait.

  • The process is running, but the data set has not been processed yet.

  • The process is running.

  • The data set completed successfully. The job is complete, and you can fetch the output.

  • Either the data set load or the data set import was cancelled. The job is cancelled.

  • The data set or process is in error. The job has ended in error.

This concludes the blog. You can see how Oracle Integration streamlines the process of submitting an HDL job by natively supporting generation of the HDL .dat file, submission of the HDL job, and querying of the HDL job status. To learn more about the feature, refer to the Oracle HCM Cloud Adapter documentation here.

Join the discussion

Comments (2)
  • Akhil Monday, January 11, 2021
    Thank you for the blog. Its a very nice blog for HCM adapter. I have used the same and need to get the error details. I saw point 7 to pass Message and Error code in REst API. Can you please share the details for step 7 ( means link of REST API)
  • Prakash Masand Friday, February 19, 2021

    One does not need to call any additional API to get the error information; it is available in the response to the Query status invoke itself. Please note that the error node will be populated only when a job has ended in error.
