Oracle HCM Data Loader (HDL) Process
- Oracle HCM Data Loader (HDL) jobs are a powerful tool for bulk loading data through integrations.
- Using Oracle HDL, you can load business objects for most Oracle HCM Cloud products from Oracle Integration. For example, you can load new hires from Oracle Talent Acquisition Cloud (Taleo EE) as workers into Oracle HCM Cloud using an Oracle HDL job.
- Integration architects can generate the delimited data files in Oracle Integration using business object template files provided by Oracle HCM Cloud. The business object template file must be associated with a stage file action in Oracle Integration to generate the delimited data file.
- To learn more about how to obtain the business object template file from Oracle HCM Cloud and use it for delimited data file generation, refer to this post on the Oracle Cloud Customer Connect portal.
⭐ At a high level, an Oracle HDL job pattern can be implemented in three steps:
1) Generate the HDL-compliant delimited data (.dat) file.
2) Submit the Oracle HDL job.
3) Monitor the job status until completion.
Hands-on Practice:
Use Case:
In this article, we will see the steps to develop an integration that loads Person Email details from OIC into Oracle HCM Cloud via the Worker business object, using an Oracle HDL job.
I have divided the complete article into two parts:
In PART - 1, we will submit the HDL process (import and load) directly in the HCM Cloud application, i.e. we will see the steps to submit the HDL process from the front end. In PART - 2, we will develop an integration that submits the same HDL process from OIC.
PART - 1
Create an HCM HDL File for the Worker Business Object:
Save the below box data as a 'Worker.dat' file. Then compress (zip) Worker.dat into a file with a name of your choice; it must have a .zip extension. This file contains the email details of two employees. Please cross-check whether the employee numbers below exist in your HCM application instance; if they don't, use any other employee numbers that do exist.
```
METADATA|PersonEmail|DateFrom|DateTo|PersonNumber|EmailType|EmailAddress|PrimaryFlag
MERGE|PersonEmail|1951/01/01|2022/06/22|69658|H1|test@gmail.com|Y
MERGE|PersonEmail|1951/01/01|2022/06/22|65507|H1|test@gmail.com|Y
```
We have the HCM Data Loader file ready for bulk loading Person Email. Now let's follow the next step to import it into the HDL staging tables and load the data into the application tables (i.e., submit the HDL job - Import and Load).
Knowledge:
All files must include METADATA lines, to explain which attributes are included in the file and the order in which their values are supplied.
All attribute names and values are delimited by the pipe '|' character by default.
The string immediately after the METADATA instruction identifies the record type the attributes are for, in our case 'PersonEmail'. The values that follow are the names of the attributes available on the PersonEmail record, that you want to supply data for.
The MERGE instruction tells HDL to create the record if it doesn't already exist, or update it if it does.
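If you would rather generate the .dat file programmatically, the way an integration eventually will, here is a minimal Python sketch (not Oracle-provided code) that builds the same PersonEmail file and zips it. The rows and file names mirror the example above; adjust them for your own data.

```python
# Minimal sketch: build an HDL-compliant PersonEmail .dat file and zip it.
# The METADATA header and MERGE rows mirror the example in this article.
import zipfile

HEADER = "METADATA|PersonEmail|DateFrom|DateTo|PersonNumber|EmailType|EmailAddress|PrimaryFlag"

# Each tuple mirrors one MERGE line:
# (date_from, date_to, person_number, email_type, email_address, primary_flag)
rows = [
    ("1951/01/01", "2022/06/22", "69658", "H1", "test@gmail.com", "Y"),
    ("1951/01/01", "2022/06/22", "65507", "H1", "test@gmail.com", "Y"),
]

lines = [HEADER] + ["MERGE|PersonEmail|" + "|".join(r) for r in rows]

with open("Worker.dat", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

# HDL expects the .dat file inside a .zip archive; the archive name is your choice.
with zipfile.ZipFile("Worker.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("Worker.dat")
```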
IMPORTING AND LOADING YOUR FILE (HDL Job):
- Log in to the HCM Cloud application. On the home page ⌂, click My Client Groups > Data Exchange
- On the Data Exchange page, click Import and Load Data
- Click Import File (at top right)
- Click the Choose File button and select the .zip file we saved above, then click Review Parameters. You could click the Submit Now button directly, but I intentionally want you to check the parameters as well, because we will map these parameter names and values while developing the OIC integration.
- You don't need to change the parameter values. Just click Submit
- Click OK on the Submitted confirmation page
- The Import and Load Data process will start. Click Refresh until the Import and Load status shows completed. Once the Import and Load process has completed successfully, you can verify the HDL process by checking the employee records.
Knowledge:
The Import Status indicates whether the business object .dat files in your zip file were imported into the staging tables correctly. Here you can see that the import was successful.
The Load Status indicates whether the data was successfully loaded into the Oracle HCM Cloud application tables. The clock icon indicates that the load is still in progress.
Let's verify the HDL process by checking the employee records:
- Go to the HCM application home page, click My Client Groups > Person Management
- Search with any of the employee numbers that you uploaded in the .zip file above.
- Then click the Tasks icon and select Person
- You can see the Home email has been updated with the values we passed in the .zip file. We have now successfully loaded a file with HCM Data Loader. Next we will develop an integration capable of doing the same process we just did manually.
PART - 2
OIC INTEGRATION TO SUBMIT HCM DATA LOADER (HDL) JOB:
⭐ Prerequisite: FTP and HCM connections. If you don't have these connections available in your OIC instance, you can follow these blogs for the configuration steps (click to configure the FTP Connection, click to configure the HCM Connection)
Let's create a schedule integration that triggers an HCM Data Loader (HDL) job to upload the Worker data file from OIC into HCM Cloud.
STEP 1:
Before we start integration development, save the below box data as a 'Worker.dat' file on your local system. Then compress (zip) Worker.dat into a file with a name of your choice; it must have a .zip extension.
```
METADATA|PersonEmail|DateFrom|DateTo|PersonNumber|EmailType|EmailAddress|PrimaryFlag
MERGE|PersonEmail|1951/01/01|2022/06/22|69658|H1|oic_test@gmail.com|Y
MERGE|PersonEmail|1951/01/01|2022/06/22|65507|H1|oic_test@gmail.com|Y
```
Once you have saved the above file, log in to your OIC instance and follow the steps below:
STEP 2:
- Create a schedule integration with a meaningful name.
- Search for the FTP connection and select it. The FTP adapter configuration wizard will open.
- Enter any meaningful name for the endpoint and then click Next
- Select the 'Download File' operation, set *Transfer Mode to Binary, enter the Download Directory path, and then click Next
- FTP adapter configuration is complete. Click Done.
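For reference, here is a rough Python sketch of what the FTP adapter's 'Download File' operation does behind the scenes: fetch Worker.zip in binary mode from a remote directory. The host, credentials, and paths below are placeholders, and if your server speaks SFTP rather than plain FTP you would use an SFTP client (e.g. paramiko) instead.

```python
# Rough equivalent of the FTP adapter's 'Download File' operation.
# Host, credentials, and directory paths are placeholders.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:          # placeholder host
    ftp.login("oic_user", "password")        # placeholder credentials
    ftp.cwd("/upload/hdl")                   # the 'Directory' you map in OIC
    with open("Worker.zip", "wb") as f:      # local copy = OIC's download directory
        ftp.retrbinary("RETR Worker.zip", f.write)
```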
- Now search for the HCM connection and select it. The HCM adapter configuration window will open.
- Enter any meaningful name for the endpoint and then click Next
- Select Import Bulk Data using HCM Data Loader (HDL) and then click Next
- Select *Security Group = FAFusionImportExport and *Doc Account = hcm$/dataloader$/import$, then click Next
- HCM adapter configuration for HDL is complete. Click Done.
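For context, the HCM adapter's 'Import Bulk Data using HCM Data Loader (HDL)' action is commonly described as being backed by HCM's data-loader SOAP integration service. The sketch below is an assumption-laden illustration of such a call, not the adapter's actual internals: the host, credentials, ContentId, and namespace are placeholders taken from commonly published examples, so verify everything against the WSDL exposed by your instance (e.g. https://<host>/hcmService/HCMDataLoader?WSDL) before relying on it.

```python
# Hedged sketch of submitting an HDL job via the HCM Data Loader SOAP service.
# Assumes the .zip file has already been uploaded to UCM (WebCenter Content)
# and that 'UCMFA00012345' is its content id -- all values are placeholders.
import requests

HOST = "https://your-hcm-instance.oraclecloud.com"   # placeholder
AUTH = ("integration.user", "password")               # placeholder credentials

ENVELOPE = """<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:typ="http://xmlns.oracle.com/apps/hcm/common/dataLoader/core/dataLoaderIntegrationService/types/">
  <soapenv:Body>
    <typ:importAndLoadData>
      <typ:ContentId>UCMFA00012345</typ:ContentId>  <!-- placeholder UCM id of Worker.zip -->
      <typ:Parameters/>
    </typ:importAndLoadData>
  </soapenv:Body>
</soapenv:Envelope>"""

resp = requests.post(
    f"{HOST}/hcmService/HCMDataLoader",
    data=ENVELOPE,
    headers={"Content-Type": "text/xml; charset=utf-8"},
    auth=AUTH,
)
resp.raise_for_status()
# Per common examples, the response carries the HDL process id that the
# status-polling step later in this article queries.
print(resp.text)
```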
- Now let's complete the mapping:
- Open the FTP Adapter mapping
- Map the File Name, Directory, and DownloadDirectory. For this article, please pass the details below:
- File Name = Worker.zip (in the current case we are uploading Person Email for the Worker business object, so the zip must contain the Worker.dat file with exactly that name)
- Directory = enter the FTP directory path where the Worker.zip file will be present during testing
- DownloadDirectory = you can enter '/'
Please refer to the image below
- Add an Assign action, enter a meaningful name, and click Create.
- Click the ➕ icon and declare a variable named 'varHDLFlowStatus', initializing its value to '' (empty string) by clicking the edit icon. Then click Validate and close the Assign window.
- Now we will add a While action to the flow, which will keep executing in a loop until the submitted HDL process completes:
- Search for the While action and select it. Enter a meaningful name (e.g., HDLFlowLoop) and then click Create.
- Click 'Expression Builder', build the expression shown below, then click Validate and close the expression window.
$varHDLFlowStatus !='CANCELLED' and $varHDLFlowStatus !='COMPLETED'
- Now let's configure the HCM adapter to invoke the operation that queries the status of the submitted HCM Data Loader job:
- Search for the HCM connection and select it. Enter any meaningful name for the endpoint, click Next
- Open the mapper by clicking its icon and map the Process Id. Click Validate and then Close (please refer to the image below)
- Now we will add one more Assign action, in which we will update the value of the variable we declared before the While action (in the step above)
- Search for the Assign action and select it. Enter any meaningful name (like 'updateHDLFlowStatus') and click Create.
- Click the ➕ icon, select the variable we declared before the while loop from the drop-down menu (in my case 'varHDLFlowStatus'), and then click the edit icon
- Drag and drop the *STATUS element into the expression builder (refer to the image below). Then click Validate and Close.
- Again click Validate and Close.
- Now, at the end of the development, add one Wait action of 10 seconds, so that the while loop executes at 10-second intervals. Once the while condition is no longer satisfied, the flow will exit the While loop.
- Search for the Wait action.
- Enter a meaningful name for the Wait action. In the seconds column, enter 10. Click Create.
Knowledge:
Wait action: the maximum wait time is 6 hours.
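To make the control flow concrete, here is a minimal Python sketch of the While + Wait pattern we just built: poll the job status every 10 seconds until it reaches a terminal state. get_hdl_status is a hypothetical placeholder for the HCM adapter's status query, and the CANCELLED/COMPLETED values match the while expression above.

```python
# Minimal sketch of the While + Wait polling pattern built in this article.
import time

def get_hdl_status(process_id: str) -> str:
    """Placeholder: query the HDL job status for process_id
    (e.g. via the HCM adapter's status operation)."""
    raise NotImplementedError  # wire this to your actual status call

def wait_for_hdl(process_id: str, poll_seconds: int = 10) -> str:
    status = ""  # mirrors varHDLFlowStatus initialized to an empty string
    # Mirrors: $varHDLFlowStatus != 'CANCELLED' and $varHDLFlowStatus != 'COMPLETED'
    while status not in ("CANCELLED", "COMPLETED"):
        time.sleep(poll_seconds)        # mirrors the 10-second Wait action
        status = get_hdl_status(process_id)
    return status
```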
- Integration development is complete. Enable tracking by clicking the hamburger icon (top right)
Let's TEST the integration:
- Upload the Worker.zip file (which we saved in PART 2, STEP 1) to the FTP folder (make sure you upload the file to the same directory path that you entered in the FTP mapping in PART 2, STEP 2)
- Wait for the integration run to complete. You can check the status on the Monitoring page (navigate to the Monitoring window by clicking the Home icon (top left), then click Monitoring --> Integrations --> Tracking)
- Once the integration run has completed, you can log in to the HCM Cloud application and verify that the PersonEmail of the employees was updated successfully with the email details we passed in the Worker.zip file, following the same verification steps as in PART 1 of this article.
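If you prefer to verify programmatically rather than through the UI, below is a hedged sketch using the HCM Cloud workers REST resource. The version path, query syntax, and child resource/attribute names are assumptions based on commonly published HCM REST documentation; confirm them against your instance's REST API docs before use.

```python
# Hedged verification sketch: fetch a worker by person number and print
# the email records. Host, credentials, version path, and attribute names
# are placeholders/assumptions -- verify against your instance.
import requests

HOST = "https://your-hcm-instance.oraclecloud.com"   # placeholder
AUTH = ("integration.user", "password")               # placeholder credentials

resp = requests.get(
    f"{HOST}/hcmRestApi/resources/11.13.18.05/workers",  # assumed version path
    params={"q": "PersonNumber='69658'", "expand": "emails"},
    auth=AUTH,
)
resp.raise_for_status()
for worker in resp.json().get("items", []):
    for email in worker.get("emails", {}).get("items", []):
        print(email.get("EmailType"), email.get("EmailAddress"))
```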
TADA!