Tuesday, April 21, 2020

How to give access only in OTBI Data Model for BI Developers without having Admin access in Oracle HCM Cloud environment

Create a role with the details below. This role gives access only to the Data Model, allowing queries against the Oracle database tables. With this approach the user will not be able to create reports.


Step 1 : Navigate to Home --> Tools --> Security Console

Step 2 : Click on the "Create Role" button.

Step 3 : Provide the details below in the "Basic Information" section. The details can be changed
             as per requirement. This is just a sample.

             Role Name       : BI Developer Custom
             Role Code       : BI_DEVELOPER_CUSTOM
             Role Category   : BI - Abstract Roles
             Description     : A custom BI Abstract Role for BI Developers

Step 4 : Navigate to the "Function Security Policies" section and add a Function Security Policy
             named "Manage Reports and Analytics".

Step 5 : Navigate to the "Data Security Policies" section; nothing needs to be added here.

Step 6 : Now navigate to the "Role Hierarchy" section and add the role
             "BI Publisher Data Model Developer" here.

             Note : The "BI Publisher Data Model Developer" role provides access only to
                        Data Models, not to Reports. To grant access to Reports as well, add the
                        "BI Platform Author Role" in the "Role Hierarchy" section. The user will still
                        not have access to BI Administration from OTBI in this approach.

                        If the user should have access to everything in OTBI, then add only one role,
                        "BI Administrator Role", in the "Role Hierarchy" section.

                        Modify as per requirement here.

Step 7 : Click the "Next" button to navigate to the "Users" section. The new role can be
             assigned to users' profiles in this step (check the "Add User" button). Otherwise, the
             role can be assigned to users later from the Security Console "Users" tab.

Step 8 : Click the "Next" button again and check the summary of the new role. Then click
             "Save and Close". The new custom role is created in this step.


Step 9 : If users were already added in step 7, this step is optional. Otherwise, navigate to
             Home --> Tools --> Security Console --> Users.
             Search for the user who should have this role and assign the role (BI Developer Custom)
             to the user.

Step 10 : Now navigate to Home --> Tools --> Reports and Analytics --> Browse Catalog.
               In OTBI, select the folder where this role should have full access, or create a new folder
               in OTBI.

Step 11 : Select the folder, go to "Permission" (More --> Permission, or check at the bottom left),
                add the new role with the permission "Full Control", and also tick the check boxes to
                grant access to the sub-folders as well.

Step 12 : Now log in as the user, navigate to "Reports and Analytics --> Browse Catalog --> OTBI",
               and try to access the "Data Model" and save it to that particular folder.

              Note : After logging in as the user, if the "Reports and Analytics" icon is not
                         visible in the UI, then run (or ask an Admin to run) the process
                         "Import User and Role Application Security Data" from
                         "Home --> Tools --> Scheduled Processes". Once the process is completed, try
                         logging in as the user again and test the Data Model access.
                         





Saturday, April 18, 2020

Some of the useful ESS Process names in Oracle HCM Cloud environment


Please use the following ESS processes as per business requirement :



Navigation : Home --> Tools --> Scheduled Processes



1> Delete Master Geography Data for the Selected Country

Deletes master geography data for a selected country if the country data is not associated with the territory setup


2> Run Convert Pending Workers Automatically Process
PWKConversionJob
Schedules or runs process that automatically quick converts pending workers matching the conversion criteria

3> ESS process for value set values upload

4> Retrieve Latest LDAP Changes
SyncRolesJob
Synchronizes users, roles and role grants with definitions in LDAP

5> Import User and Role Application Security Data
AseImportUsersAndRolesJob
Import user and role data from LDAP and store in Applications Security tables

6> Import User Login History
AseInactiveUsersDataLoadJob
Imports information that the Inactive Users Report uses to identify inactive users

7> Regenerate Data Security Grants
RegenDataSecurityGrants
Manages processes to create or update grants for job, data and abstract roles

8> Regenerate Data Security Profiles
Regenerates all security profiles or a set of security profiles based on a selected security profile type

9> User Role Membership Report
Lists roles assigned to each user.




Friday, April 17, 2020

How to migrate Fast Formulas using HDL file in Oracle HCM Cloud environment



Step 1 : Save the whole Fast Formula text in a .txt file. In this example I have saved the
             Fast Formula in a .txt file named "SM_TEST_FF_Text_1.txt".

Step 2 : Create a folder named "ClobFiles" and put the SM_TEST_FF_Text_1.txt file
             in the "ClobFiles" folder.
 

            Note : Repeat the same steps if you have more than one Fast Formula to migrate.
            For example, if you have 3 different Fast Formulas to migrate, save them in
            3 different .txt files and put all 3 .txt files in the "ClobFiles" folder.

Step 3 : A sample HDL file for the Fast Formula load is shown below. The HDL file name must
              be "FastFormula.dat". Save the HDL details in the FastFormula.dat file.

             METADATA|FastFormula|FormulaCode|FormulaName|FormulaTypeCode
             |LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|FormulaText
            MERGE|FastFormula|SM_TEST_FF_1|SM_TEST_FF_1|Extract Rule|#NULL|1901/01/01
            |4712/12/31|SM_TEST_FF_Text_1.txt

            Of the two HDL lines above, the first line is the header (starting with METADATA) and
            the second line is the data line (starting with MERGE).
            Pass the Fast Formula details accordingly in the data line. In the "FormulaText"
            attribute, however, pass the name of the .txt file that contains the actual Fast Formula
            text (example : for Fast Formula SM_TEST_FF_1, the Fast Formula text has been saved
            in the "SM_TEST_FF_Text_1.txt" file and placed in the "ClobFiles" folder.
            In the HDL data line, this file name has been passed in the FormulaText attribute).

Step 4 : Now create a single .zip file containing both the FastFormula.dat file and the ClobFiles folder.

Step 5 : Load the .zip file in the HCM Data Loader in target environment.
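Steps 2 to 4 above can also be scripted. Below is a minimal sketch in Python (not part of the official tooling; the generated .txt file names follow a `<FormulaCode>_Text.txt` pattern, slightly different from the example above, and the formula text is placeholder content):

```python
import zipfile

def build_ff_zip(formulas, zip_name="FastFormulaLoad.zip"):
    """Build an HDL zip containing FastFormula.dat plus a ClobFiles
    folder with one .txt file per Fast Formula.

    formulas: list of (formula_code, formula_type_code, formula_text) tuples.
    """
    header = ("METADATA|FastFormula|FormulaCode|FormulaName|FormulaTypeCode"
              "|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|FormulaText")
    lines = [header]
    with zipfile.ZipFile(zip_name, "w") as zf:
        for code, type_code, text in formulas:
            txt_name = f"{code}_Text.txt"
            # Each formula's full text goes into the ClobFiles folder.
            zf.writestr(f"ClobFiles/{txt_name}", text)
            # The data line references the .txt file in the FormulaText attribute.
            lines.append(f"MERGE|FastFormula|{code}|{code}|{type_code}"
                         f"|#NULL|1901/01/01|4712/12/31|{txt_name}")
        zf.writestr("FastFormula.dat", "\n".join(lines) + "\n")
    return zip_name

# Example: one Extract Rule formula, as in the post above.
build_ff_zip([("SM_TEST_FF_1", "Extract Rule", "/* formula text here */")])
```

The resulting .zip has the same layout as the manually built one and can be loaded via HCM Data Loader as in Step 5.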



Additional Steps : Load the Fast Formula description
Step 6 : A sample HDL file for the Fast Formula description load is shown below. The HDL file
              name must be "FastFormulaTranslation.dat".
              Save the HDL details in the FastFormulaTranslation.dat file and pass the details accordingly.

              METADATA|FastFormulaTranslation|FormulaId|Language|FormulaCode|FormulaName
              |Description|FormulaTypeCode|LegislativeDataGroupName|SourceLang
              MERGE|FastFormulaTranslation|XXXXXXXXXX|US|SM_TEST_FF_1|SM_TEST_FF_1
              |Test Fast Formula Load 1 using HDL|Extract Rule|#NULL|US

              <get the Formula Id value from database and replace the Formula Id value here
                for XXXXXXXXXX>

Step 7 : Now create a single .zip file with FastFormulaTranslation.dat file.

Step 8 : Load the .zip file in the HCM Data Loader in target environment.


Tuesday, April 14, 2020

What is the Connection Pool command for accessing Direct Database Query in OTBI in Oracle HCM Cloud environment

Please pass the below details in Connection Pool :

"HCM_OLTP"."Connection Pool"



Note : As of now, this option is hidden by default in the latest releases (I think from 19B onwards).

How to Create an Inbound Interface process using HCM Extract in Oracle HCM Cloud environment


An example is provided here. Please modify it as per business requirement.

Assumption : The reader is aware of how to create an HCM Extract in the Oracle HCM Cloud environment using a BI Report.

Using an Inbound Interface process, we can extract data from the Oracle HCM Cloud environment and load the data back into the same environment using HDL.
Here is a scenario showing how to achieve it.

Suppose the Person External Identifier section also holds the username each person uses to log in to the Oracle HCM Cloud environment. We will check whether there are any differences between the Person External Identifier username and the User Profile username.
If there are, we will correct the User Profile username with the Person External Identifier username.

Our main data source will be a BI Report, and we will create a pipe-delimited ".dat" file from it.
That is why we are creating this dummy HCM Extract.

We need an HCM Extract because only HCM Extracts can call HDL via the "Inbound Interface" option, selected as the "Delivery Type" in the "Extract Delivery Options" section of the HCM Extract.

Using the SQL below, a Data Model needs to be created first.
Then an eText template needs to be created.
Using this Data Model and eText template, create the Report that produces the User.dat file. Pass this Report path in the dummy HCM Extract's Delivery Option.

SQL to get the details for the User.dat file (Modify as required):
----------------------------------------------------
SELECT 'LINK' AS LINK  --(This Link attribute will be used to link with the delivered Global Data Model Report)
,'N' AS SUSPENDED
,A.PERSON_NUMBER
,A.EXT_IDENTIFIER_NUMBER
,B.USERNAME
FROM
 (SELECT DISTINCT PAPF.PERSON_NUMBER
 ,PEAI.EXT_IDENTIFIER_NUMBER
 FROM PER_ALL_PEOPLE_F PAPF,PER_EXT_APP_IDENTIFIERS PEAI
 WHERE PAPF.PERSON_ID  = PEAI.PERSON_ID
 AND PEAI.EXT_IDENTIFIER_TYPE  = <pass the External Identifier Type code here>
 AND TO_CHAR(SYSDATE, 'YYYY/MM/DD') BETWEEN TO_CHAR(PEAI.DATE_FROM,'YYYY/MM/DD') AND NVL(TO_CHAR(PEAI.DATE_TO,'YYYY/MM/DD'),'4712/12/31')
 ORDER BY PAPF.PERSON_NUMBER
 ) A,
 (SELECT DISTINCT P.PERSON_NUMBER
 ,U.USERNAME
 FROM PER_USERS U, PER_ALL_PEOPLE_F P
 WHERE U.PERSON_ID   = P.PERSON_ID
 ORDER BY P.PERSON_NUMBER
 ) B
WHERE  A.PERSON_NUMBER   =  B.PERSON_NUMBER
AND A.EXT_IDENTIFIER_NUMBER  <>  B.USERNAME
AND EXISTS
  (SELECT 'X'
  FROM PER_ALL_PEOPLE_F PAPF, PER_EXT_APP_IDENTIFIERS PEAI
  WHERE  PAPF.PERSON_ID    =  PEAI.PERSON_ID
  AND PEAI.EXT_IDENTIFIER_TYPE =  <pass the External Identifier Type code here>
  AND     TO_CHAR(SYSDATE,'YYYY/MM/DD') BETWEEN TO_CHAR(PEAI.DATE_FROM,'YYYY/MM/DD') AND NVL(TO_CHAR(PEAI.DATE_TO,'YYYY/MM/DD'),'4712/12/31')
  )
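The core comparison in the SQL above can be sketched in Python with in-memory sample data (the person numbers and usernames here are hypothetical; the real query runs against PER_EXT_APP_IDENTIFIERS and PER_USERS):

```python
# Hypothetical sample rows standing in for the two inline views:
# ext_identifiers = A (person_number -> ext_identifier_number),
# usernames       = B (person_number -> User Profile username).
ext_identifiers = {"101": "jdoe", "102": "asmith", "103": "bjones"}
usernames       = {"101": "jdoe", "102": "asmith_old", "103": "bjones2"}

def find_mismatches(ext_ids, users):
    """Return person numbers whose external-identifier username differs from
    the User Profile username (the SQL's A.EXT_IDENTIFIER_NUMBER <> B.USERNAME)."""
    return sorted(pn for pn, ext in ext_ids.items()
                  if pn in users and users[pn] != ext)

print(find_mismatches(ext_identifiers, usernames))  # -> ['102', '103']
```

Persons 102 and 103 are flagged because their User Profile usernames no longer match the external identifier; these are the rows the HDL load will correct.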

Sample User.dat file details below :
------------------------------------------------
METADATA|User|Suspended|PersonNumber|Username
MERGE|User|N|<SQL will pass the Person Number here through eText template>|<SQL will pass the Username here through eText template>
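In the actual flow the eText template produces this content; just to make the format concrete, here is a small Python sketch that builds the same User.dat lines from hypothetical query results:

```python
def build_user_dat(rows):
    """rows: iterable of (person_number, username) pairs, e.g. mismatched
    users found by the SQL above. Returns pipe-delimited User.dat content
    for the HDL User business object."""
    lines = ["METADATA|User|Suspended|PersonNumber|Username"]
    for person_number, username in rows:
        # Suspended is hard-coded to 'N', matching the SQL's 'N' AS SUSPENDED.
        lines.append(f"MERGE|User|N|{person_number}|{username}")
    return "\n".join(lines) + "\n"

print(build_user_dat([("102", "asmith"), ("103", "bjones")]))
```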

Now follow the steps in the HCM Extract :
------------------------------------------------
Step 1 : Create a dummy HCM Extract.
             In the Parameters section, add a parameter called Auto Load (Tag Name = Auto_Load)
             with Data Type "Text" and Default Value "Y".
             For the Data Group I have taken "PER_EXT_ASG_STATUS_DETAILS_UE" as the
             User Entity, as it returns a very small row count.

Step 2 : In the Data Group filter criteria I have put the condition "1=2" so that it doesn't return
             any data set.

Step 3 : In the Data Group a Data Record has been created, and a dummy attribute
             named "Dummy Field" has been added.
             Its Data Type is "Text", its Type is "String", and the string value passed is "Demo".

Step 4 : Now for the Extract Delivery Option, you can pass the details as below :
              Start Date                         : 01/01/1901
              End Date                          : 12/31/4712
              Delivery Option Name    : HDLOutput
              Output Type                     : Text
             Report                               : <pass the BI Report path here ;
                                                        example : /Custom/..../SM_USERNAME_UPDATE_RPT.xdo>
             Template Name                : <pass the eText template name here;
                                                        example : SM_USERNAME_UPDATE_TMPLT>
             Output Name                   : User
             Delivery Type                  : Inbound Type
             Required Bursting Node  : ticked
 
 
            In the "Additional Details" section please pass the details as below :
            Encryption Mode                    : None
            Override File                           : .dat
            Run Time File Name               : blank
            Integration Name                    : UserNameUpdate
            Integration Type                      : Data Loader
            Integration Parameters            : blank
            Key                                         : blank
            Locale                                     : blank
            Time Zone                              : blank
            Compress                                : blank
            Compressed Delivery Group  : User.zip
 
Step 5 : Navigate to Home --> My Client Groups --> Data Exchange --> Refine Extracts
             (Under HCM Extracts).
             Search for the dummy HCM Extract which has been created. Select the HCM Extract
             and click the "Edit" button.

Step 6 : Once the HCM Extract is opened, in the "Tasks" tab, under Flow Task the
             HCM Extract name will be present.
             Now click the "Actions" button and select the option "Select and Add".
             A search popup will appear; search for the task named
             "Initiate HCM Data Loader" (Generate HCM Data Loader file and optionally
             perform a data load). Now select the task and click the "Done" button.

Step 7 : Now select the task "Initiate HCM Data Loader" and click the "Go to Task" button.
             2 options will be there :
                    a> Data Loader Archive Action
                    b> Data Loader Configuration

Step 8 : Now select the 1st option, "Data Loader Archive Action", and click the "Edit" button.
             In the "Parameter Basis" dropdown, select the option "Bind to Flow Task".
             In the "Basis Value" dropdown, select the option "Extract Name , Submit , Payroll Process".

Step 9 : Now select the 2nd option "Data Loader Configuration". Click on the "Edit" button.
              In the "Parameter Basis" dropdown, select the option "Constant Bind".
              In the "Basis Value" multiline text box, please provide the below details :
              ImportMaximumErrors=100,LoadMaximumErrors=100,LoadConcurrentThreads=8
              ,LoadGroupSize=100

Step 10 : Now click on the "Next" button and then click on the "Submit" button.

Now submit the HCM Extract and check the Inbound Interface process.

Monday, April 13, 2020

SQL for getting current session User Person Number in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------


SELECT DISTINCT PAPF.PERSON_NUMBER
FROM PER_ALL_PEOPLE_F PAPF
WHERE PAPF.PERSON_ID = (SELECT NVL(HRC_SESSION_UTIL.GETUSER_PERSONID,-1) FROM DUAL)

SQL for getting the environment name in Oracle HCM Cloud environment

Sample SQL Query (You can modify as required):
-----------------------------------

SELECT UPPER(ADDMNS.EXTERNAL_VIRTUAL_HOST)
FROM ASK_DEPLOYED_DOMAINS ADDMNS
WHERE UPPER(ADDMNS.DEPLOYED_DOMAIN_NAME) = 'FADOMAIN'

How to get rid of "Exception during RestAction" error from Employee Self Service functionality while submitting Absence

 After Redwood is applied, while navigating to the Add Absence functionality under the Me tab, if it throws a REST error, then follow th...