Tuesday, April 21, 2020

How to give access only to the OTBI Data Model for BI Developers, without Admin access, in an Oracle HCM Cloud environment

Create a Role with the below details. This Role will give access only to the Data Model, to query the Oracle database tables. Users will not be able to create Reports with this approach.


Step 1 : Navigate to Home --> Tools --> Security Console

Step 2 : Click on the "Create Role" button.

Step 3 : Provide the details as below in the "Basic Information" section. Details can be changed as
             per requirement. This is just a sample.

             Role Name         : BI Developer Custom
             Role Code          : BI_DEVELOPER_CUSTOM
             Role Category    : BI - Abstract Roles
             Description         : A custom BI Abstract Role for BI Developers

Step 4 : Navigate to the "Function Security Policies" section and add a Function Security Policy named
             "Manage Reports and Analytics".

Step 5 : Navigate to the "Data Security Policies" section; nothing needs to be added here.

Step 6 : Now navigate to the "Role Hierarchy" section and add the role
             "BI Publisher Data Model Developer" here.

             Note : "BI Publisher Data Model Developer" provides access only to
                        Data Models, not to Reports. To have access to Reports as well, add the
                        "BI Platform Author Role" in the "Role Hierarchy" section. Users will not
                        have access to BI Administration from OTBI with this approach.

                        If a user needs access to everything in OTBI, then simply add the single role
                        "BI Administrator Role" in the "Role Hierarchy" section.

                        Modify accordingly as per requirement.

Step 7 : Click the "Next" button to navigate to the "Users" section. The new role can be
             assigned to users' User Profiles in this step (check the "Add User" button). Otherwise,
             we can assign this role to the users later from the Security Console "Users" tab.

Step 8 : Click the "Next" button again and check the Summary of the new role. Then click
             "Save and Close". The new custom role is created in this step.


Step 9 : If users were added in step 7, this step is optional. Otherwise, navigate to
             Home --> Tools --> Security Console --> Users,
             search for a user who should have this role, and assign the role (BI Developer Custom)
             to the user.

Step 10 : Now navigate to Home --> Tools --> Reports and Analytics --> Browse Catalog.
               In OTBI, select the folder where this role should have full access, or create a new folder
               in OTBI.

Step 11 : Select the folder, go to "Permission" (More --> Permission, or check at the bottom left),
                add the new role with Permission as "Full Control", and also tick the check boxes to
                grant access to the sub-folders as well.

Step 12 : Now log in as the user, navigate to "Reports and Analytics --> Browse Catalog --> OTBI"
               and try to access the "Data Model" and save it to that particular folder.

              Note : After logging in as the user, if you are not able to see the "Reports and Analytics"
                         icon in the UI, then log in as (or take help from) an Admin to run the process
                         "Import User and Role Application Security Data" from
                         "Home --> Tools --> Scheduled Processes". Once the process is completed, try
                         again to log in as the user and test the Data Model access.
                         





Saturday, April 18, 2020

Some of the useful ESS Process names in Oracle HCM Cloud environment


Please use the ESS processes below as per the Business requirement :



Navigation : Home --> Tools --> Scheduled Processes



1> Delete Master Geography Data for the Selected Country

Deletes master geography data for a selected country if the country data is not associated with the territory setup


2> Run Convert Pending Workers Automatically Process
PWKConversionJob
Schedules or runs process that automatically quick converts pending workers matching the conversion criteria

3> ESS process for value set values upload

4> Retrieve Latest LDAP Changes
SyncRolesJob
Synchronizes users, roles and role grants with definitions in LDAP

5> Import User and Role Application Security Data
AseImportUsersAndRolesJob
Import user and role data from LDAP and store in Applications Security tables

6> Import User Login History
AseInactiveUsersDataLoadJob
Imports information that the Inactive Users Report uses to identify inactive users

7> Regenerate Data Security Grants
RegenDataSecurityGrants
Manages processes to create or update grants for job, data and abstract roles

8> Regenerate Data Security Profiles
Regenerates all security profiles or a set of security profiles based on a selected security profile type

9> User Role Membership Report
Lists roles assigned to each user.




Friday, April 17, 2020

How to migrate Fast Formulas using HDL file in Oracle HCM Cloud environment



Step 1 : Save the whole Fast Formula text in a .txt file. In this example I have saved the
             Fast Formula in a .txt file named "SM_TEST_FF_Text_1.txt".

Step 2 : Create a folder named "ClobFiles" and put the SM_TEST_FF_Text_1.txt file
             in the "ClobFiles" folder.
 

            Note : Repeat the same steps if you have more than 1 Fast Formula to migrate.
            For example, if you have 3 different Fast Formulas to migrate, save the 3 different
            Fast Formulas in 3 different .txt files and put all 3 .txt files in the "ClobFiles" folder.

Step 3 : A sample HDL file for the Fast Formula load is below. The HDL file name should
              be "FastFormula.dat". Save the HDL details in the FastFormula.dat file.

             METADATA|FastFormula|FormulaCode|FormulaName|FormulaTypeCode
             |LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|FormulaText
            MERGE|FastFormula|SM_TEST_FF_1|SM_TEST_FF_1|Extract Rule|#NULL|1901/01/01
            |4712/12/31|SM_TEST_FF_Text_1.txt

            Of the 2 HDL lines above, the 1st line is the Header (starting with METADATA) and the
            2nd line is the Data line (starting with MERGE).
            Pass the Fast Formula details accordingly in the Data line. But in the "FormulaText"
            attribute, pass the name of the .txt file where the exact Fast Formula has been
            written (example : for Fast Formula SM_TEST_FF_1, the Fast Formula text has been saved
            in the "SM_TEST_FF_Text_1.txt" file and placed in the "ClobFiles" folder.
            In the HDL Data line, this file name has been passed in the FormulaText attribute).

Step 4 : Now create a single .zip file with both FastFormula.dat file and ClobFiles folder.

Step 5 : Load the .zip file in the HCM Data Loader in target environment.
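The packaging in Steps 1 to 4 can also be scripted. Below is a minimal Python sketch (file and folder names are taken from this example; the formula text is a placeholder) that builds the .zip with FastFormula.dat at the root and the formula text file inside the ClobFiles folder:

```python
import zipfile

# HDL header and data lines from the FastFormula.dat sample above
hdl_lines = [
    "METADATA|FastFormula|FormulaCode|FormulaName|FormulaTypeCode"
    "|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|FormulaText",
    "MERGE|FastFormula|SM_TEST_FF_1|SM_TEST_FF_1|Extract Rule|#NULL|1901/01/01"
    "|4712/12/31|SM_TEST_FF_Text_1.txt",
]
formula_text = "/* the actual Fast Formula text goes here */"

with zipfile.ZipFile("FastFormula.zip", "w") as z:
    # .dat file at the zip root, formula .txt file inside the ClobFiles folder
    z.writestr("FastFormula.dat", "\n".join(hdl_lines) + "\n")
    z.writestr("ClobFiles/SM_TEST_FF_Text_1.txt", formula_text)
```

The resulting FastFormula.zip can then be uploaded through HCM Data Loader as in Step 5.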



Additional Steps : Load the Fast Formula description
Step 6 : A sample HDL file for the Fast Formula description load is below. The HDL file name
              should be "FastFormulaTranslation.dat".
              Save the HDL details in the FastFormulaTranslation.dat file and pass the details accordingly.

              METADATA|FastFormulaTranslation|FormulaId|Language|FormulaCode|FormulaName
              |Description|FormulaTypeCode|LegislativeDataGroupName|SourceLang
              MERGE|FastFormulaTranslation|XXXXXXXXXX|US|SM_TEST_FF_1|SM_TEST_FF_1
              |Test Fast Formula Load 1 using HDL|Extract Rule|#NULL|US

              <get the Formula Id value from database and replace the Formula Id value here
                for XXXXXXXXXX>

Step 7 : Now create a single .zip file with FastFormulaTranslation.dat file.

Step 8 : Load the .zip file in the HCM Data Loader in target environment.


Tuesday, April 14, 2020

What is the Connection Pool command for accessing Direct Database Query in OTBI in Oracle HCM Cloud environment

Please pass the below details in Connection Pool :

"HCM_OLTP"."Connection Pool"



Note : As of now, by default this option is hidden in the latest release (I think from 19B).

How to Create an Inbound Interface process using HCM Extract in Oracle HCM Cloud environment


An example is provided here. Please modify accordingly as per the Business requirement.

Assumption : Viewer is aware of how to create an HCM Extract in Oracle HCM Cloud environment using BI Report.

Using the Inbound Interface process, we can extract data from the Oracle HCM Cloud environment and load the data back into it using HDL.
Here is a scenario showing how to achieve this.

Suppose, in the Person External Identifier section, the Username details each Person uses to log in to the Oracle HCM Cloud environment are also present. We will be checking whether there are any differences between the Person External Identifier Username details and the User Profile Username details.
If yes, we will correct the User Profile Username details with the Person External Identifier Username details.

Our main data source will be a BI Report, and we will be creating a pipe-delimited ".dat" file from it.
That is why we are creating this dummy HCM Extract.

We need an HCM Extract because only an HCM Extract has the functionality to call HDL using the Inbound Interface option, via the "Delivery Type" in the "Extract Delivery Options" section of the HCM Extract.

Using the below SQL, a Data Model needs to be created first.
Then an eText template needs to be created.
Using this Data Model and eText template, create the Report that produces the User.dat file. Pass this Report path in the dummy HCM Extract's Delivery Option.

SQL to get the details for the User.dat file (Modify as required):
----------------------------------------------------
SELECT 'LINK' AS LINK  --(This Link attribute will be used to link with the delivered Global Data Model Report)
,'N' AS SUSPENDED
,A.PERSON_NUMBER
,A.EXT_IDENTIFIER_NUMBER
,B.USERNAME
FROM
 (SELECT DISTINCT PAPF.PERSON_NUMBER
 ,PEAI.EXT_IDENTIFIER_NUMBER
 FROM PER_ALL_PEOPLE_F PAPF,PER_EXT_APP_IDENTIFIERS PEAI
 WHERE PAPF.PERSON_ID  = PEAI.PERSON_ID
 AND PEAI.EXT_IDENTIFIER_TYPE  = <pass the External Identifier Type code here>
 AND TO_CHAR(SYSDATE, 'YYYY/MM/DD') BETWEEN TO_CHAR(PEAI.DATE_FROM,'YYYY/MM/DD') AND NVL(TO_CHAR(PEAI.DATE_TO,'YYYY/MM/DD'),'4712/12/31')
 ORDER BY PAPF.PERSON_NUMBER
 ) A,
 (SELECT DISTINCT P.PERSON_NUMBER
 ,U.USERNAME
 FROM PER_USERS U, PER_ALL_PEOPLE_F P
 WHERE U.PERSON_ID   = P.PERSON_ID
 ORDER BY P.PERSON_NUMBER
 ) B
WHERE  A.PERSON_NUMBER   =  B.PERSON_NUMBER
AND A.EXT_IDENTIFIER_NUMBER  <>  B.USERNAME
AND EXISTS
  (SELECT 'X'
  FROM PER_ALL_PEOPLE_F PAPF, PER_EXT_APP_IDENTIFIERS PEAI
  WHERE  PAPF.PERSON_ID    =  PEAI.PERSON_ID
  AND PEAI.EXT_IDENTIFIER_TYPE =  <pass the External Identifier Type code here>
  AND     TO_CHAR(SYSDATE,'YYYY/MM/DD') BETWEEN TO_CHAR(PEAI.DATE_FROM,'YYYY/MM/DD') AND NVL(TO_CHAR(PEAI.DATE_TO,'YYYY/MM/DD'),'4712/12/31')
  )

Sample User.dat file details below :
------------------------------------------------
METADATA|User|Suspended|PersonNumber|Username
MERGE|User|N|<SQL will pass the Person Number here through eText template>|<SQL will pass the Username here through eText template>
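As a sketch, the eText output above amounts to one MERGE line per row returned by the SQL. A hypothetical Python rendering (the row values are invented for illustration):

```python
# Build User.dat content from (PersonNumber, Username) pairs,
# mirroring the METADATA/MERGE layout shown above.
rows = [("12345", "jdoe"), ("67890", "asmith")]  # hypothetical SQL results

lines = ["METADATA|User|Suspended|PersonNumber|Username"]
for person_number, username in rows:
    lines.append(f"MERGE|User|N|{person_number}|{username}")

user_dat = "\n".join(lines) + "\n"
print(user_dat)
```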

Now follow the steps in the HCM Extract :
------------------------------------------------
Step 1 : Create a dummy HCM Extract.
             In the Parameters section add a parameter called Auto Load (Tag Name = Auto_Load)
             with Data Type "Text" and Default Value "Y".
             For the Data Group I have taken "PER_EXT_ASG_STATUS_DETAILS_UE" as the
             User Entity, as it returns a very small row count.

Step 2 : In the Data Group filter criteria I have put the condition "1=2" so that it doesn't return
             any data set.

Step 3 : In the Data Group, a Data Record has been created and a dummy attribute called
             "Dummy Field" has been added.
             Its Data Type is "Text", Type is "String", and the string value passed is "Demo".

Step 4 : Now for the Extract Delivery Option, you can pass the details as below :
              Start Date                         : 01/01/1901
              End Date                          : 12/31/4712
              Delivery Option Name    : HDLOutput
              Output Type                     : Text
             Report                               : <pass the BI Report path here ;
                                                        example : /Custom/..../SM_USERNAME_UPDATE_RPT.xdo>
             Template Name                : <pass the eText template name here;
                                                        example : SM_USERNAME_UPDATE_TMPLT>
             Output Name                   : User
             Delivery Type                  : Inbound Type
             Required Bursting Node  : ticked
 
 
            In the "Additional Details" section please pass the details as below :
            Encryption Mode                    : None
            Override File                           : .dat
            Run Time File Name               : blank
            Integration Name                    : UserNameUpdate
            Integration Type                      : Data Loader
            Integration Parameters            : blank
            Key                                         : blank
            Locale                                     : blank
            Time Zone                              : blank
            Compress                                : blank
            Compressed Delivery Group  : User.zip
 
Step 5 : Navigate to Home --> My Client Groups --> Data Exchange --> Refine Extracts
             (Under HCM Extracts)
             Search for the dummy HCM Extract which has been created. Select the HCM Extract
             and click on the "Edit" button.

Step 6 : Once the HCM Extract is opened, in the "Tasks" tab, the HCM Extract name will be
             present under Flow Task.
             Now click the "Actions" button and select the option "Select and Add".
             A search popup will appear; search for the task named
             "Initiate HCM Data Loader" (Generate HCM Data Loader file and optionally
             perform a data load). Now select the task and click the "Done" button.

Step 7 : Now select the task "Initiate HCM Data Loader" and click the "Go to Task" button.
             2 options will be there :
                    a> Data Loader Archive Action
                    b> Data Loader Configuration

Step 8 : Now select the 1st option, "Data Loader Archive Action". Click the "Edit" button.
             In the "Parameter Basis" dropdown, select the option "Bind to Flow Task".
             In the "Basis Value" dropdown, select the option "Extract Name , Submit , Payroll Process".

Step 9 : Now select the 2nd option "Data Loader Configuration". Click on the "Edit" button.
              In the "Parameter Basis" dropdown, select the option "Constant Bind".
              In the "Basis Value" multiline text box, please provide the below details :
              ImportMaximumErrors=100,LoadMaximumErrors=100,LoadConcurrentThreads=8
              ,LoadGroupSize=100

Step 10 : Now click on the "Next" button and then click on the "Submit" button.

Now submit the HCM Extract and check the Inbound Interface process.

Monday, April 13, 2020

SQL for getting current session User Person Number in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------


SELECT DISTINCT PAPF.PERSON_NUMBER
FROM PER_ALL_PEOPLE_F PAPF
WHERE PAPF.PERSON_ID = (SELECT NVL(HRC_SESSION_UTIL.GETUSER_PERSONID,-1) FROM DUAL)

SQL for getting the environment name in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT UPPER(ADDMNS.EXTERNAL_VIRTUAL_HOST)
FROM ASK_DEPLOYED_DOMAINS ADDMNS
WHERE UPPER(ADDMNS.DEPLOYED_DOMAIN_NAME) = 'FADOMAIN'

Friday, April 10, 2020

How to update/reset User profile password in bulk mode using Rest API webservice in Oracle HCM Cloud environment


I am using Postman and Rest API webservice to update/reset the User Profile password.

Step 1 : Run the below SQL to get the USER_GUID for the Users :
             SELECT USERNAME, USER_GUID
             FROM PER_USERS
             WHERE USERNAME IN
             (<pass Username details as comma-separated values within single quotes>)

Step 2 : Open Postman

Step 3 : Select the Action as "POST" from the Action dropdown.

Step 4 : Pass the URL as below :
             https://<pass your oracle cloud url>/hcmRestApi/scim/Bulk

Step 5 : In the Authorization tab, select "Basic Authorization" and pass the Username
             and Password details.

Step 6 : In the Header tab, add the below parameter :
              Content-Type = application/json

Step 7 : Now in the Body tab pass the details as below format as Request Payload:
 
        {
    "Operations":
     [
   {
    "method":"PATCH",
    "path":"/Users/<Enter GUID for your first User>",
    "bulkId":"clientBulkId1",
    "data":{
      "schemas":[
        "urn:scim:schemas:core:2.0:User"
           ],
      "password": "<Enter the password value here>"
     }
   },
   {
    "method":"PATCH",
    "path":"/Users/<Enter GUID for your second User>",
    "bulkId":"clientBulkId1",
    "data":{
      "schemas":[
        "urn:scim:schemas:core:2.0:User"
           ],
      "password": "<Enter the password value here>"
     }
   }
   ,{
    <pass json value set structure accordingly for other Users>
   }
      ]
 }

Step 8 : Check the status of the Response. If the status is 200 OK then the update was successful.
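The bulk payload in Step 7 is plain JSON, so it can be assembled programmatically for any number of users. A small Python sketch (stdlib only; the GUIDs and passwords below are placeholders) that builds the same structure:

```python
import json

def build_bulk_payload(users):
    """users: list of (user_guid, new_password) tuples; returns the SCIM Bulk body."""
    operations = []
    for i, (guid, password) in enumerate(users, start=1):
        operations.append({
            "method": "PATCH",
            "path": f"/Users/{guid}",
            "bulkId": f"clientBulkId{i}",
            "data": {
                "schemas": ["urn:scim:schemas:core:2.0:User"],
                "password": password,
            },
        })
    return {"Operations": operations}

# Placeholder GUIDs/passwords; substitute the USER_GUID values from Step 1
payload = build_bulk_payload([("GUID1", "Welcome1"), ("GUID2", "Welcome2")])
print(json.dumps(payload, indent=2))
```

The printed JSON can be pasted into the Postman Body tab, or sent directly with any HTTP client using Basic authentication as in Step 5.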

How to compile Fast Formulas in bulk mode in Oracle HCM Cloud environment

Step 1 : Navigate to Home --> Payroll --> Checklist

Step 2 : In the right hand side, click on the Task pane and under "Payroll Flows" section,
             click on the link called "Submit a Process or Report"

Step 3 : Now select the "Legislative Data Group" as "US Legislative Data Group" and under
             "Process or Report" search text box, search the process "Compile Formula".

Step 4 : Select the process and click Next.

Step 5 : Now provide the details as below :
             a> Payroll Flow : Pass a value to identify the process run. It can be anything.
             b> Formula : Pass the Fast Formula name here. If we want to compile multiple Fast
                  Formulas at once, we can use a wildcard character like '%'. Example : there are 3 Fast
                  Formulas which start with "SM". So to compile the 3 Fast Formulas at once, we can pass
                  "SM%" in the Formula value.
             c> Formula Type : Pass the Fast Formula type value here. If all the Fast Formulas are HCM
                  Extract rule type, then we have to pass "Extract Rule". Pass this value accordingly.

Step 6 : Submit the process. Once the process is successful, you can validate the compilation of the
              Fast Formulas from the below navigation :
              Setup and Maintenance --> Tasks --> Fast Formulas


Note : If you are not able to see the process "Compile Formula" under the Payroll Checklist, then you should have the Payroll Manager data role assigned to your User profile.

How to configure the FTP in Oracle HCM Cloud environment

Step 1 : Navigate to Home --> Tools --> Reports and Analytics --> Browse Catalog

Step 2 : Once we are in Oracle BI Publisher, please click on the Oracle BI
              Administration link (you should have the BI Admin role assigned to your User Profile).

Step 3 : Then search for the BI Publisher section and click on the "Manage BI Publisher" link.

Step 4 : Once we are inside "Manage BI Publisher", search for the section called "Delivery"
              and click on the "FTP" link.

Step 5 : You will see the option called "Add Server". Click on it.

Step 6 : Put the details accordingly.

Thursday, April 9, 2020

How to load Value Set values using File Import/Export or ESS process in Oracle HCM Cloud environment

Step 1 : Create a text file (example : ABC_Value_Set_Value_Load.txt) with the details as below :
Note : The 1st line below is the Header; from the next line onwards (Data Lines), pass the details of the values for the Value Set Code. File details are as below :


ValueSetCode|Value|TranslatedValue|Description|EnabledFlag|StartDateActive|EndDateActive|SortOrder
ABC|9999 9999 9999 9999||Test ABC1|N|1901-01-01|4712-12-31|1
ABC|9999 9999 9999 9998||Test ABC2|N|1901-01-01|4712-12-31|2
ABC|9999 9999 9999 9997||Test ABC3|N|1901-01-01|4712-12-31|3
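Generating this file can also be scripted. A quick Python sketch (values copied from the example above):

```python
# Write the pipe-delimited value set file shown above.
header = ("ValueSetCode|Value|TranslatedValue|Description|EnabledFlag"
          "|StartDateActive|EndDateActive|SortOrder")
values = [
    ("ABC", "9999 9999 9999 9999", "", "Test ABC1", "N", "1901-01-01", "4712-12-31", "1"),
    ("ABC", "9999 9999 9999 9998", "", "Test ABC2", "N", "1901-01-01", "4712-12-31", "2"),
    ("ABC", "9999 9999 9999 9997", "", "Test ABC3", "N", "1901-01-01", "4712-12-31", "3"),
]
with open("ABC_Value_Set_Value_Load.txt", "w") as f:
    f.write(header + "\n")
    for row in values:
        f.write("|".join(row) + "\n")
```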

Step 2 : Now navigate to Home --> Tools --> File Import and Export

Step 3 : Under the Search Results section click on the "+" button and the Upload File popup will appear.

Step 4 : In the popup, click on the "Choose File" button and select the Value Set value file (ABC_Value_Set_Value_Load.txt) from the correct location.

Step 5 : In the popup, click on the "Account" dropdown and select the UCM location where the file should be uploaded.
         Generally I select "hcm/dataloader/import" location as we are going to import the values in our HCM Cloud environment.
Step 6 : Once the file and the Account details have been selected, click on "Save and Close" button.

Step 7 : Now we have 2 options to load the Value Set values in our HCM Cloud environment.

Option 1 : Using ESS process
 a> Navigate to Home --> Tools --> Scheduled Processes
 b> Click on the "Schedule New Process" button and search for the ESS job named "ESS process for value set values upload".
 c> Select the process and click on the "OK" button. Then you will find 2 input parameters.
  1> File Name : Pass the name of the file that has been uploaded to UCM (ABC_Value_Set_Value_Load.txt)
  2> Account : Pass the name of the UCM Account (hcm/dataloader/import)
 d> Now click on the "Submit" button to submit the process.
 e> Once the process is completed, please check the log file for the load details.


Option 2 : Using "Manage Value Sets" navigation
 a> Navigate to Setup and Maintenance --> Tasks --> Manage Value Sets
 b> Under the "Search Results" section, click on the "Action" button.
 c> Once the "Action" button is clicked, click on the "Import" button.
 d> Pass the File Name (ABC_Value_Set_Value_Load.txt) and Account information (hcm/dataloader/import) as requested.
    Then click the "Upload" button.
 e> Now the import process will start and once the process is completed, check the log file for load details.

How to include all the HDL transactions in Event Object table in Oracle HCM Cloud environment

At the top of the HDL .dat file, please add the below SET command with value Yes (Y).

This command will include all the transactions performed using HDL in the Event Object table. If we don't use the below command in the HDL file, then the transactions are not captured in the Event Object table.



SET ENABLE_INCREMENTAL_LOAD_EVENTS Y

How to schedule a HCM Extract using a Flow Schedule Fast Formula in Oracle HCM Cloud environment

Here in this example I am going to schedule an HCM Extract using a Flow Schedule Fast Formula.
I have tried to make the schedule of the HCM Extract dynamic using a Common Lookup and a Value Set.
As of now, I have created a separate Flow Schedule Fast Formula for each and every HCM Extract.
I am still trying to use a single Flow Schedule Fast Formula to schedule multiple HCM Extracts.
The reason I am not able to use a single Flow Schedule Fast Formula for multiple HCM Extracts is that,
while submitting the HCM Extracts, I am not able to pass the HCM Extract name as a parameter to the Flow Schedule Fast Formula.
If anyone is able to do it, please share the details in the comment section.

Now the question arises: what have we made dynamic here? You can change the schedule of the Extract at any time, as per the Business requirement, from the
Common Lookup section. You don't need to change the Fast Formula every time to change the schedule timing.

This Flow Schedule Fast Formula is still in development and I am trying to add more functionality to it.
As of now, please consider this a sample Flow Schedule Fast Formula.



Step 1 : Create a Common Lookup. Here we will pass the HCM Extract name and the schedule details of HCM Extract.
 a> Navigate to Setup and Maintenance --> Tasks --> Manage Common Lookups

 b> Create a new Common Lookup. Provide the details as below :
  1> Lookup Type : SM_HCM_FLOW_SCHEDULE_CL (an example)
  2> Meaning : Common Lookup for Flow Schedule Details
  3> Description : Common Lookup for Flow Schedule Details
  4> Module : Global Human Resources
  5> Lookup Code : SM_HCM_ABC_INTEGRATION_V1 (passing the HCM Extract name here; an example)
  6> Display Sequence : 1 (pass accordingly)
  7> Enabled : mark it checked
  8> Start Date : 01/01/1901 (pass accordingly)
  9> End Date : 12/31/4712 (pass accordingly)
  10> Meaning : Daily1 (I am using this value to control how frequently the process will run)
  11> Description : Hourly (I am using this value for the mode in which the process will run)
  12> Tag : 4 (I am using this value for the time span in which the process will run ; example : every 4 hours)

 c> Save the Common Lookup.

Note : Here in the Common Lookup, I pass the details of the schedule, i.e. how the HCM Extract is going to be run.
 The Flow Schedule Fast Formula should be written to match this configuration.



Step 2 : Create a Table-type Value Set. This Value Set will pass the scheduling details of the HCM Extract to the Flow Schedule Fast Formula
   to set the next scheduled date.
 a> Navigate to Setup and Maintenance --> Tasks --> Manage Value Sets

 b> Create a new Value Set. Provide the details as below :
  1> Value Set Code : SM_HCM_FLOW_SCHEDULE_VS (an example)
  2> Description : Value Set for ABC Integration
  3> Module : Global Human Resources
  4> Validation Type : Table
  5> Value Data Type : Character
  6> From Clause : FND_LOOKUP_VALUES_TL FLVT,FND_LOOKUP_VALUES_B FLVB
  7> Value Column Name : FLVB.LOOKUP_TYPE||'*'||FLVB.LOOKUP_CODE||'*'||FLVT.MEANING||'*'||FLVT.DESCRIPTION||'*'||FLVB.TAG
  8> Value Column Type : VARCHAR2
  9> ID Column Name : FLVB.LOOKUP_TYPE||'*'||FLVB.LOOKUP_CODE||'*'||FLVT.MEANING||'*'||FLVT.DESCRIPTION||'*'||FLVB.TAG
  10> ID Column Type : VARCHAR2
  11> WHERE Clause :      FLVT.LOOKUP_TYPE   = FLVB.LOOKUP_TYPE
     AND FLVT.LOOKUP_CODE  = FLVB.LOOKUP_CODE
     AND FLVT.VIEW_APPLICATION_ID = FLVB.VIEW_APPLICATION_ID
     AND FLVB.ENABLED_FLAG   = 'Y'
     AND FLVT.LANGUAGE  = USERENV('LANG')
     AND FLVT.LOOKUP_TYPE  = :{PARAMETER.LOOKUP_TYPE}
     AND FLVB.LOOKUP_CODE LIKE    :{PARAMETER.LOOKUP_CODE}

 c> Save the Value Set.



Step 3 :
Sample Flow Schedule Fast Formula : (I have written the full Fast Formula here; there may be some typos. Please correct accordingly if any, and consider this an example.)
/****************************************************************************************************************************************
     
Formula Name   : Schedule HCM Extract using Flow Schedule Fast Formula
Formula Type   : Flow Schedule 
Formula Description : Flow Schedule Formula to return a date time from Saturday 6 pm to Friday 6 pm (as an example)
Returns   : NEXT_SCHEDULED_DATE
     
Formula Results  : NEXT_SCHEDULED_DATE; This will be a date time value with yyyy-MM-dd HH:mm:ss format
Effective Date  : 01/01/1901 
     
Name                          Date             Version        Description
-----------------------     ------------          ---------        ------------------ 
Sourav Mazumder     01/01/2020      1.0            Initial Version
     
****************************************************************************************************************************************/
INPUTS ARE SUBMISSION_DATE(DATE), SCHEDULED_DATE(DATE)
SM_GET_HCM_ABC_UPD_SCHED_DETAILS = ' '
SM_VS_LOOKUP_TYPE = 'SM_HCM_FLOW_SCHEDULE_CL'
SM_VS_LOOKUP_CODE = 'SM_HCM_ABC_INTEGRATION_%'
SM_GET_HCM_ABC_UPD_SCHED_DETAILS = GET_VALUE_SET('SM_HCM_FLOW_SCHEDULE_VS','|=LOOKUP_TYPE='''||SM_VS_LOOKUP_TYPE||''''||'|LOOKUP_CODE='''||SM_VS_LOOKUP_CODE||'''')
SM_SCHEDULE_TYPE = SUBSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,2)+1),((INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,3))-((INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,2)+1))))
SM_SCHEDULE_TIME_TYPE = SUBSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,3)+1),((INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,4))-((INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,3)+1))))
SM_SCHEDULE_TIME = SUBSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,4)+1),((LENGTH(SM_GET_HCM_ABC_UPD_SCHED_DETAILS)+1)-((INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,4)+1))))

SM_PRINT_VAR_0 = ESS_LOG_WRITE('SM_GET_HCM_ABC_UPD_SCHED_DETAILS :' + SM_GET_HCM_ABC_UPD_SCHED_DETAILS)
SM_PRINT_VAR_1 = ESS_LOG_WRITE('SM_SCHEDULE_TYPE :' + SM_SCHEDULE_TYPE + '-' + TO_CHAR(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,2)+1))
SM_PRINT_VAR_2 = ESS_LOG_WRITE('SM_SCHEDULE_TIME_TYPE :' + SM_SCHEDULE_TIME_TYPE + '-' + TO_CHAR(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,3)+1))
SM_PRINT_VAR_3 = ESS_LOG_WRITE('SM_SCHEDULE_TIME :' + SM_SCHEDULE_TIME + '-' + TO_CHAR(INSTR(SM_GET_HCM_ABC_UPD_SCHED_DETAILS,'*',1,4)+1))

SM_GET_SUBMISSION_DATE = TO_CHAR(SUBMISSION_DATE)
SM_GET_SCHEDULED_DATE  = TO_CHAR(SCHEDULED_DATE)
SM_PRINT_SUBMISSION_DATE = ESS_LOG_WRITE('SM_GET_SUBMISSION_DATE :' + SM_GET_SUBMISSION_DATE)
SM_PRINT_SCHEDULED_DATE  = ESS_LOG_WRITE('SM_GET_SCHEDULED_DATE  :' + SM_GET_SCHEDULED_DATE)

SM_GET_DAY_END = TO_NUMBER(SUBSTR(TO_CHAR(SCHEDULED_DATE,'YYYY-MM-DD HH24:MI:SS'),12,2))
SM_GET_DAY_NUMBER = TO_CHAR(SCHEDULED_DATE,'D')
SM_PRINT_DAY_END = ESS_LOG_WRITE('SM_GET_DAY_END :' + TO_CHAR(SM_GET_DAY_END))
SM_PRINT_DAY_NUMBER = ESS_LOG_WRITE('SM_GET_DAY_NUMBER :' + SM_GET_DAY_NUMBER)
SM_GET_SUNDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-05','YYYY-MM-DD'),'D')
SM_GET_MONDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-06','YYYY-MM-DD'),'D')
SM_GET_TUESDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-07','YYYY-MM-DD'),'D')
SM_GET_WEDNESDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-08','YYYY-MM-DD'),'D')
SM_GET_THURSDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-09','YYYY-MM-DD'),'D')
SM_GET_FRIDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-10','YYYY-MM-DD'),'D')
SM_GET_SATURDAY_NUMBER  = TO_CHAR(TO_DATE('2020-01-11','YYYY-MM-DD'),'D')

IF UPPER(SM_SCHEDULE_TIME_TYPE) = 'HOURLY' THEN
 NEW_SCHEDULED_DATE = ADD_DAYS(SCHEDULED_DATE,((1/24)*TO_NUMBER(SM_SCHEDULE_TIME)))
IF UPPER(SM_SCHEDULE_TIME_TYPE) = 'MINUTELY' THEN
 NEW_SCHEDULED_DATE = ADD_DAYS(SCHEDULED_DATE,((1/24)*(1/60)*TO_NUMBER(SM_SCHEDULE_TIME)))
IF (SM_GET_DAY_NUMBER = SM_GET_FRIDAY_NUMBER AND SM_GET_DAY_END > 23) THEN
 NEXT_SCHEDULED_DATE = ADD_DAYS(SCHEDULED_DATE,1)
ELSE
 NEXT_SCHEDULED_DATE = NEW_SCHEDULED_DATE
SM_PRINT_NEXT_SCHEDULED_DATE = ESS_LOG_WRITE('NEXT_SCHEDULED_DATE :' + TO_CHAR(NEXT_SCHEDULED_DATE))
RETURN NEXT_SCHEDULED_DATE
/***************************************************************************************************************************************/
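The SUBSTR/INSTR slicing in the formula above simply splits the '*'-delimited string returned by the value set, then adds the TAG value as hours (or minutes) to the scheduled date. A Python sketch of the same logic, assuming the example lookup values from Step 1:

```python
from datetime import datetime, timedelta

# The value set returns LOOKUP_TYPE*LOOKUP_CODE*MEANING*DESCRIPTION*TAG
details = "SM_HCM_FLOW_SCHEDULE_CL*SM_HCM_ABC_INTEGRATION_V1*Daily1*Hourly*4"
lookup_type, lookup_code, schedule_type, schedule_time_type, schedule_time = details.split("*")

# Next run: add the TAG value as hours or minutes to the scheduled date,
# matching the ADD_DAYS((1/24)*N) arithmetic in the Fast Formula.
scheduled = datetime(2020, 4, 1, 6, 0, 0)  # example SCHEDULED_DATE
if schedule_time_type.upper() == "HOURLY":
    next_run = scheduled + timedelta(hours=int(schedule_time))
elif schedule_time_type.upper() == "MINUTELY":
    next_run = scheduled + timedelta(minutes=int(schedule_time))
else:
    next_run = scheduled + timedelta(days=1)  # fallback, as in the formula's ELSE branch
print(next_run)
```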


Step 4 : Now navigate to My Client Groups --> Data Exchange --> Submit Extract. Search for the HCM Extract you want to submit using the Flow Schedule.
   In the Schedule section, click the dropdown and change the value to "Using a schedule". Now in Frequency, pass the Flow Schedule Fast
          Formula name. Put the Start Date when the HCM Extract should run for the first time. Pass the End Date up to when the HCM Extract should
   run using the Flow Schedule.

Wednesday, April 1, 2020

How to give access to other Users to view the Output of HCM Extract submitted by another User in Oracle HCM Cloud environment


Please follow the below steps to let other Users view the Output of an HCM Extract submitted by another User. (Option names or other minor details may change as per Oracle Release updates.)

Step 1 : Create a Role from Security Console with the following information :
         a> Navigate to Home --> Tools --> Security Console
         b> Go to "Roles" tab and search for option "Create Role" and click on it.
         c> Provide the Role Name and Role Code. Choose the Role Category as 'HCM - Job Roles'.
              Provide a Description (optional).
         d> Now click on the "Next" button and go to the next tab named "Function Security Policies".
         e> Under "Function Security Policies" tab, click on the "Add Function Security Policy" button.
              A search popup will appear and search for the Privilege named "View Output Document",
              code "CMK_VIEW_OUTPUT_DOC_PRIV". Now select the Privilege and click on the        
             "Add Privilege to Role" button.
        f> Once the Privilege is added, click on the "Next" button and go to the next tab named
             "Data Security Policies".
       g> Under "Data Security Policies" tab, click on the "Create Data Security Policy" button.
            A popup will appear. Provide a name for the "Policy Name" option. Pass the Start Date
            value. Now click on the search button of "Database Resource" and search for the Database
            Resource named as "Payroll Checklist for Table PAY_CHECKLISTS". Select the Database
            Resource name and click on "OK" button. Set the "Data Set" value as "All values".
            Set the "Actions" as "View Payroll Flow" and click on "OK".
        h> Now continue through the remaining tabs one by one by clicking the "Next" buttons until you
             reach the last tab, called "Summary". Click on the "Save and Close" button to create the new Role.

Step 2 : Assign the newly created Role to the required Users.
        a> Navigate to Home --> Tools --> Security Console
        b> Go to "Users" tab and search for the User who needs access to view the Output of an
             HCM Extract submitted by another User.
        c> Now open the User and assign the newly created Role to the User.

Step 3 : Set Owner for HCM Extract
        a> Navigate to Home --> My Client Groups --> Data Exchange --> HCM Extracts
        b> Under HCM Extracts, click on the "Refine Extracts" link and the "Payroll Flow Patterns"
             section will appear. In the "Flow Pattern" field, search for the name of the HCM Extract
             whose output we want to make accessible.
        c> In the search result select the HCM Extract and click on the "Edit" button. The "Tasks" tab
             will appear. In the "Tasks" tab select the Flow Task Name with the same name as the HCM
             Extract name and click on the "Go to Task" button.
        d> Now the "Task Details" tab will appear and click on the "Next" button.
        e> Now under the "Task Details", the "Owner" section will appear. Choose "Owner Type"
             as "Group" from the Dropdown List. From the "Owner" Dropdown List, choose the Role
             Name which has been created at Step 1.
        f> Submit the changes.

Step 4 : Navigate to Home --> Tools --> Scheduled Processes and submit the Process named
             "Import User and Role Application Security Data". Wait until the process run completes
             successfully.

Step 5 : Now submit the HCM Extract which has been refined at Step 3. But make sure the HCM
             Extract is run by a User who does not have the newly created Role from Step 1
             assigned.

Step 6 : Now login with the User who has the newly created Role assigned to his/her User Profile.
             Navigate to Home --> Payroll --> Checklist and search for the "Payroll Process Flow Name"
             with which the HCM Extract was submitted. Now select the Payroll Process Flow Name
             and navigate to the Task Details page through the "Go to Task" button. The Output will now
             be visible to the User having the newly created Role assigned to his/her User Profile.

SQL for getting the Delivery Option details for HCM Extracts in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT 
                PEDT.DEFINITION_NAME
               ,PEDT.DESCRIPTION
               ,PEDOB.BASE_DELIVERY_OPTION_NAME
               ,PEDOB.DATE_TO
               ,PEDOB.DATE_FROM
               ,PEDOB.DELIVERY_TYPE
               ,PEDOB.BIP_REPORT_NAME
               ,PEDOB.BIP_TEMPLATE_NAME
               ,PEDOB.OUTPUT_TYPE
               ,PEDOB.OUTPUT_NAME
               ,PEDOB.OUTPUT_DIRECTORY_NAME
               ,PEDOB.FILTER_NODE
               ,PEDOB.BURSTING_NODE
               ,PEDOB.LEGISLATION_CODE
               ,PEDOB.MANDATORY_FLAG
               ,PEDODP.OPTION_TYPE
               ,PEDODP.OPTION_VALUE
FROM PER_EXT_DEFINITIONS_TL PEDT, PER_EXT_DELIVERY_OPTIONS_B PEDOB
 , PER_EXT_DELIVERY_OPTION_DTLS PEDODP
WHERE PEDT.DEFINITION_NAME LIKE '<pass the HCM Extract Name>'
AND       PEDT.EXT_DEFINITION_ID                   =    PEDOB.EXT_DEFINITION_ID
AND       PEDOB.EXT_DELIVERY_OPTION_ID   =    PEDODP.EXT_DELIVERY_OPTION_ID

SQL for getting the ID value of Payroll Configuration Group (Threading Name) for running HCM Extracts in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT 
                TO_CHAR(ACTION_PARAM_GROUP_ID) , ACTION_PARAM_GROUP_ID
FROM FUSION.PAY_ACTION_PARAM_GROUPS
WHERE ACTION_PARAM_GROUP_NAME = '<pass the Threading Name here>'

SQL for getting Person's User Role details in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT 
               P.PERSON_NUMBER
              ,U.USERNAME
              ,U.SUSPENDED
              ,RD.ROLE_COMMON_NAME
              ,RDT.ROLE_NAME
FROM PER_USERS U, PER_ALL_PEOPLE_F P
 ,PER_USER_ROLES UR, PER_ROLES_DN RD, PER_ROLES_DN_TL RDT
WHERE   U.PERSON_ID          =    P.PERSON_ID
AND         U.USER_ID               =    UR.USER_ID
AND         UR.ROLE_ID             =    RD.ROLE_ID
AND         UR.ROLE_ID             =    RDT.ROLE_ID
AND         UR.ACTIVE_FLAG    =    'Y'
ORDER BY U.USERNAME
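Once the result set of the query above is exported (for example, to CSV via a BI Publisher report), the rows can easily be grouped per user. A minimal Python sketch with hypothetical sample rows (the usernames and role names are illustrative only):

```python
from collections import defaultdict

# Hypothetical (USERNAME, ROLE_NAME) rows as returned by the query above
rows = [
    ("JDOE", "Employee"),
    ("JDOE", "BI Consumer Role"),
    ("MSMITH", "Employee"),
]

# Group the role names under each username
roles_by_user = defaultdict(list)
for username, role_name in rows:
    roles_by_user[username].append(role_name)
```

This gives one list of role names per user, which is often the shape needed for access reviews.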


SQL for getting all the Role names present in Oracle HCM Cloud environment and also assigned to Users

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT 
               DISTINCT RDT.ROLE_NAME
FROM PER_USERS U, PER_ALL_PEOPLE_F P
 ,PER_USER_ROLES UR, PER_ROLES_DN RD, PER_ROLES_DN_TL RDT
WHERE   U.PERSON_ID          =    P.PERSON_ID
AND         U.USER_ID               =    UR.USER_ID
AND         UR.ROLE_ID             =    RD.ROLE_ID
AND         UR.ROLE_ID             =    RDT.ROLE_ID

SQL to get HDL file details using UCM Content Id or HDL zip File name in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT
               HDDS.DATA_SET_ID
              ,HDDS.REQUEST_ID
              ,HDDS.UCM_CONTENT_ID
              ,HDDS.DATA_SET_NAME
              ,HDDS.DATA_SET_STATUS
              ,HDDSBO.DATA_SET_BUS_OBJ_ID
              ,HDDSBO.BUSINESS_OBJECT_ID
              ,HDDSBO.DATA_FILE_NAME
              ,HDFL.LINE_ID
              ,HDFL.SEQ_NUM
              ,HDFL.TEXT
FROM HRC_DL_DATA_SETS HDDS, HRC_DL_DATA_SET_BUS_OBJS HDDSBO,
 HRC_DL_FILE_LINES HDFL
WHERE HDDS.DATA_SET_ID                                 =   HDDSBO.DATA_SET_ID
AND      HDDS.REQUEST_ID                                   =   HDDSBO.REQUEST_ID
AND      HDDS.REQUEST_ID                                   =   HDFL.REQUEST_ID
AND      HDDSBO.DATA_SET_BUS_OBJ_ID         =   HDFL.DATA_SET_BUS_OBJ_ID
AND      HDDS.UCM_CONTENT_ID                       =   '<pass UCM Content Id here>'   --use this filter
AND      HDDS.DATA_SET_NAME                          =   '<pass Zip file name here>'    --or this one, as required
ORDER BY HDDSBO.DATA_FILE_NAME, HDFL.SEQ_NUM
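The TEXT column above returns the raw lines of the HDL .dat file, which are pipe-delimited. A minimal Python sketch of splitting such lines into fields (the sample METADATA line is hypothetical):

```python
def parse_hdl_line(line):
    """Split one pipe-delimited HDL file line into its fields."""
    return line.rstrip('\n').split('|')

# A hypothetical METADATA line from a Worker.dat file
sample = "METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber"
fields = parse_hdl_line(sample)
# fields[0] is the line type (METADATA/MERGE/DELETE),
# fields[1] is the business object discriminator
```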
