
Creating Absence Management Usage Report In Oracle HCM Cloud Application


Introduction

One area of particular interest in a typical ERP implementation is understanding user engagement and user acceptance. While one way of getting this kind of information is via a survey in which users provide their inputs, another efficient way of gauging whether a particular piece of functionality is being used is to track a usage report. For custom BIP Reports, one can make use of this article here.

One can also enable the Audit Framework.

In this article we will try to understand how to find the list of employees who have not used the Absence Management functionality, using an OTBI Analysis. The assumption we make is that anyone who has no absence recorded in the application has not used the functionality.

Here we will make use of the OTBI feature where one analysis is used as the filter criteria for another analysis (complete article here).

For this example, we will perform the following steps:

  1. Create an OTBI Analysis which lists all the person numbers along with the total number of absences taken by each individual.

  2. Create an OTBI Analysis which lists all person numbers and then filters out the ones which have absences recorded against them.

  3. Verification

 

First, we will create an OTBI Analysis which will have all person numbers along with the total number of absences recorded against them. We would use the “Workforce Management – Absence Real Time” subject area for this analysis.

PersonNumberWithAbsencesTaken

Field Name | Data Source
Person Number | "Worker"."Person Number"
# Of Absences | "Assignment Absences"."# Of Absences"

 

 

Next, we will create an OTBI Analysis using the subject area “Workforce Management – Assignment Real Time” to find all the person numbers, and then apply a filter condition to exclude any person numbers that have at least one absence recorded.
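Although both analyses are built by drag and drop, the combined logic is equivalent to the SQL sketch below. This is a minimal illustration only: it assumes the standard PER_ALL_PEOPLE_F person table and the ANC_PER_ABS_ENTRIES absence entries table rather than the actual OTBI subject areas.

SELECT papf.person_number
FROM   per_all_people_f papf
WHERE  TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
AND    papf.person_id NOT IN
       (SELECT pae.person_id   -- anyone with at least one absence entry
        FROM   anc_per_abs_entries pae)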

 

 

 

Verification

From the above screenshots we can see that person number 10 does not have any absence recorded. We will verify this by navigating to the “Manage Person Records” screen and checking whether any absences are seen there.

 

We can see that there are no absences recorded for this employee.


Performing Mathematical Calculations on an Extract Rule Attribute


Introduction

 

In one of the previous articles <link here> we saw how to make use of an “Extract Rule” type Fast Formula to derive values using:

 

  1. Extract Input Parameters

  2. Other Extract Attributes

However, one major limitation of an Extract Rule type Fast Formula is that it always returns a text output. This means that even if a numeric value was produced by the calculations performed earlier, the formula returns the corresponding text value as its output. This also implies that one cannot perform any mathematical operations directly on the extract rule output value. One alternative is to use another Extract Rule Fast Formula in which the TO_NUMBER function is applied.

 

Let us demonstrate this with an example.

 

Pre-requisite

 

We already have an existing attribute (‘LOSInDays’) defined which is based upon an Extract Rule Fast Formula. We will now make use of this attribute, apply the TO_NUMBER function to it, perform a mathematical comparison, and then return a text value.

Fast Formula Text

/* Extract rule attribute values arrive in the DATA_ELEMENTS context */
DEFAULT FOR DATA_ELEMENTS IS EMPTY_TEXT_TEXT

INPUTS ARE DATA_ELEMENTS (TEXT_TEXT)

/* Convert the text output of the LOSInDays extract rule attribute into a number */
ln_los_in_days = TO_NUMBER(DATA_ELEMENTS['LOSINDAYS'])

/* Read the numeric extract parameter NewHireCriteriaDays */
ln_newhire_criteria_days = GET_PARAMETER_VALUE_NUMBER('NEWHIRECRITERIADAYS')

RULE_VALUE = 'N'

/* Flag the worker when the length of service is within the new-hire criteria */
IF (ln_los_in_days <= ln_newhire_criteria_days)
THEN
(
     RULE_VALUE = 'Y'
)

RETURN RULE_VALUE

Creating a Waterfall Graph In Oracle ERP Cloud Application


Introduction

 

While most of us are probably aware of the various types of graphs and charts that can be used to display data metrics visually, there are still many types which remain unexplored.

One such unexplored (or at least less commonly used) type is the Waterfall Graph.

In this article, we will create a very simple OTBI Analysis and build a Waterfall Graph from it.

OTBI Analysis

For this example, we will use the “Workforce Management - Person Real Time” subject area and choose the following two fields:

 

Field Name | Data Source
Person Count | "Person"."Person Count"
Person Type | "Person Type"."Person Type"

 

The criteria tab should appear as:

 

Also, one would need to add a view of Graph type and choose the following values:

 

And the Results Tab would appear as:

 

 

And if we place both the graph and the data (tabular format) side by side in the compound layout, the analysis would appear as:

 

Loading Data into Oracle HCM Cloud Using Inbound Interface Delivery Option


Introduction

One of the major challenges consultants have faced over the last couple of years with respect to HCM Data Loader is automating the entire process of downloading data from the HCM Cloud application, transforming it (updates/edits), and reloading it back into the application using HCM Data Loader.

Things are relatively easy if one is loading data from a legacy application into HCM Cloud, as in most cases one has access to a database where PL/SQL programs can first produce the data in HDL format, transformations can then be applied, and finally the HDL file can be uploaded into the Oracle HCM Cloud application.

Imagine you are asked to do the entire activity in HCM Cloud itself. Say your application has already gone live, and there is then a requirement to change the location of all employees in the system from Location A to Location B. One would need to create a new assignment record which starts one day after the most recent assignment record. In this scenario one would have to perform the following steps:

 

  1. Extract All Eligible Records in HDL format

  2. Make Changes to the record

  3. Upload the new record into the application

 

In this example we will demonstrate the same. For simplicity's sake we will use only one worker record (Person #898453) and modify the value of just one attribute, “WorkAtHomeFlag”: if the value of “Work from Home” is No or null, the new record should have a value of Yes, and if the value is Yes, it should be changed to No.

So, without further ado, let's get started.

 

Extract All Eligible Records in HDL Format

In one of the previous articles (Generating EText Output From HCM Extract In Oracle Fusion HCM Cloud Application) we saw how to create eText output.

So, we will create an HCM Extract which will fetch the following fields:

Business Object / Entity: WorkRelationship
Data Fields: PeriodOfServiceId, PersonId, LegalEntityId, DateStart, PrimaryFlag, WorkerNumber, WorkerType

Business Object / Entity: WorkTerms
Data Fields: AssignmentId, PeriodOfServiceId, EffectiveStartDate, EffectiveEndDate, EffectiveSequence, EffectiveLatestChange, AssignmentName, AssignmentNumber, ReasonCode, WorkAtHomeFlag

 

*The field highlighted in green (WorkAtHomeFlag) is the one to be changed. All other fields are basic fields required for any update/correction action.

Business Object / Entity: Assignment
Data Fields: AssignmentId, WorkTermsAssignmentId, EffectiveStartDate, EffectiveEndDate, EffectiveSequence, EffectiveLatestChange, AssignmentName, AssignmentNumber, PrimaryAssignmentFlag, PrimaryFlag, ReasonCode, WorkAtHomeFlag

 

*The field highlighted in green (WorkAtHomeFlag) is the one to be changed. All other fields are basic fields required for any update/correction action.

We would have two data groups created, namely WorkerDataGroup and WorkTermsDataGroup; WorkerDataGroup is the root data group.

 

 

Some details related to the data groups are in the table below:

 

DataGroupName | UserEntity | DataGroupFilterCriteria | ExtractRecords
WorkerDataGroup | PER_EXT_WORK_RELATIONSHIP_ALL_UE | (pps.person_id=300000173638388) | WorkRelationship
WorkTermsDataGroup | PER_EXT_SEC_ASSIGNMENT_UE | (asg.primary_flag='Y') | WorkTerms, Assignment

 

Also, we should have the following connection between the root data group (WorkerDataGroup) and WorkTermsDataGroup:

Parent Data Group | Parent Data Group Database Item | Child Data Group | Child Data Group Database Item
WorkerDataGroup | Extract Relationship Period Of Service Id | WorkTermsDataGroup | Extract Assignment Period of service

 

Once we have all these details populated we would need to define the extract delivery options, details of which are in the table below:

Attribute Name | Attribute Value
Start Date | 1/1/2000
End Date | 12/31/4712
*Delivery Option Name | WorkerHDLData
*Output Type | Text
Report | /Custom/Practice Samples/UpdateAssignmentExtractReport.xdo
Template Name | UpdateWorkAtHomeFlag
*Output Name | Worker
*Delivery Type | Inbound Interface
Required | Checked
Encryption Mode | None
Override File Extension | .dat
Integration Name | Worker
Integration Type | Data Loader
Compressed Delivery Group | UpdateAssignment.zip

 

 

Make Changes to the Record

Now that we are all set with extracting the data from the application, let us focus on the transformation logic. We apply the logic in the eText template, for the following reasons:

  1. We do not want to use any transformation in the HCM Extract

  2. All transformations are done in the eText template

This, however, is the approach I followed; one may choose to apply a different rule.

As discussed, we would be creating a new record and only modifying the value of “WorkAtHomeFlag”. This means that the Effective Start Date value is changed to the current value + 1 for both the WorkTerms and Assignment records, and the value of “WorkAtHomeFlag” is flipped based on the current value (i.e. from ‘N’ to ‘Y’, ‘Y’ to ‘N’, or even null to ‘Y’).

Business Object/Entity | Data Field | Transformation Logic
WorkTerms | EffectiveStartDate | INCREASE_DATE(EffectiveStartDate,1)
WorkTerms | WorkAtHomeFlag | If WorkAtHomeFlag = ‘N’ or WorkAtHomeFlag is NULL then ‘Y’; else if WorkAtHomeFlag = ‘Y’ then ‘N’
Assignment | EffectiveStartDate | INCREASE_DATE(EffectiveStartDate,1)
Assignment | WorkAtHomeFlag | If WorkAtHomeFlag = ‘N’ or WorkAtHomeFlag is NULL then ‘Y’; else if WorkAtHomeFlag = ‘Y’ then ‘N’
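Purely as an illustration of the rule above (the actual logic lives in the eText template; the bind variables below are hypothetical), the flag flip and the date shift expressed in SQL would be:

SELECT CASE
         WHEN :work_at_home_flag = 'Y' THEN 'N'
         ELSE 'Y'   -- both 'N' and NULL become 'Y'
       END AS new_work_at_home_flag,
       TO_DATE(:effective_start_date, 'YYYY/MM/DD') + 1 AS new_effective_start_date
FROM   dual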

 

Upload the New Record into Application

Once we are done with the above setup we will get a Worker.txt file which contains all the data, but HDL only supports the .dat file extension. To ensure we get the file with the .dat extension, and that the “HCM Data Loader” process is triggered, we have to perform the following steps:

  1. Add a parameter named “Auto Load” to the UpdateAssignmentExtract

  2. Add “Initiate HCM Data Loader” to UpdateAssignmentExtract payroll flow

 

Add “Auto Load” Parameter to UpdateAssignmentExtract

We would need to add a new parameter; details of the parameter are in the table below:

Attribute Name | Attribute Value
Sequence | 100
Name | Auto Load
Tag Name | Auto_Load
Data Type | Text
Display | Yes

 

 

Add Initiate HCM Data Loader Payroll Flow to UpdateAssignmentExtract Payroll Flow

As a next step we would also need to add the “Initiate HCM Data Loader” payroll flow as part of the UpdateAssignmentExtract payroll flow. We can either navigate to Payroll -> Checklist -> Search for the UpdateAssignmentExtract payroll flow pattern, or use Refine Extracts -> Search for UpdateAssignmentExtract.

Once the search results are retrieved we need to click on “Edit” (pencil icon) and choose the following task:

Name | Initiate HCM Data Loader
Description | Generate HCM Data Loader File and optionally perform a Data Load
Task Type | Standard Process

*Please make sure to select the payroll flow task which has the description “Generate HCM Data Loader File and optionally perform a Data Load”, as the other payroll flow task with a similar name will not serve the purpose here (the two have different definitions).

Once we add the payroll flow task we should click on “Go To Task” and add the following details:

Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Archive Action)

Name | Data Loader Archive Action
Execution Mode | Submit
Data Type | Text
Parameter Basis | Bind to Flow Task
Basis Value | UpdateAssignmentExtract, Submit, Payroll Process
Usage | Input Parameter



Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Configuration)

Name | Data Loader Configuration
Execution Mode | Submit
Data Type | Text
Parameter Basis | Constant Bind
Basis Value | ImportMaximumErrors=100, LoadMaximumErrors=100, LoadConcurrentThreads=8, LoadGroupSize=100
Usage | Input Parameter

 

 

Now the setup part is complete.

 

Verification

In order to confirm that the automatic load process is working as expected we need to perform the following steps:

 

  1. Verify Application Data Before Load

  2. Run UpdateAssignmentExtract

  3. Check the Worker.dat file which is zipped inside UpdateAssignment.zip

  4. Verify Application Data After Load

 

Verify Application Data Before Load

We would navigate to Person Management -> Search for Person Number 898453 and check the Assignment Record.

 

We can see that the most recent assignment record has an Effective Start Date of 10/14/18 (i.e. 2018/10/14 in YYYY/MM/DD format) and the value of the Working at Home field is No.

This means that the new assignment record should have an Effective Start Date of 10/15/18 (i.e. 2018/10/15 in YYYY/MM/DD format) and the value of Working at Home should be Yes.

 

Run UpdateAssignmentExtract

We would now submit the UpdateAssignmentExtract with the Auto Load parameter value as ‘Y’.

 

When we click on “Ok and View Checklist” it would take us to the checklist page where we can check the status of the tasks.

We can see that the payroll task has successfully completed.

Also, when we check the extract run details for “SampleRun10” we can see that a content ID has been generated (DL_WORKER_1416194).

 

 

Check the Worker.dat file zipped inside UpdateAssignment.zip

Once we click on the download icon next to DL_WORKER_1416494, the UpdateAssignment.zip file gets downloaded. The zip file contains the Worker.dat file, which holds the HDL data as prepared by the combination of the UpdateAssignmentExtract and the UpdateWorkAtHomeFlag eText template. We can see that the transformation rules have been applied: the effective start date is 2018/10/15 and the WorkAtHomeFlag value is ‘Y’.
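For orientation, an HDL Worker.dat of this shape consists of pipe-delimited METADATA and MERGE lines per business object. The sketch below is illustrative only: the attribute list is abbreviated and the ID values are made up.

METADATA|WorkTerms|AssignmentId|PeriodOfServiceId|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|AssignmentNumber|WorkAtHomeFlag
MERGE|WorkTerms|300000000000001|300000000000002|2018/10/15|4712/12/31|1|Y|ET898453|Y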

 

Verify Application Data After Load

Before we look at the Assignment Detail for Person# 898453 let us have a look at the “Import and Load Data” screen

 

And now if we quickly look at the assignment record for 898453, we can see that a new assignment record has been created effective 10/15/2018, and the Work From Home value has changed to Yes.

 

Summary

So, this is how one can automate the HCM data load in the Oracle HCM Cloud application. While I have only used a single attribute, “WorkAtHomeFlag”, to demonstrate how this feature works, one can update any other attribute or combination of attributes in the same way.

You may download the Extract Definition, eText RTF and the sample Worker.dat file from the links below:

UpdateAssignmentExtract (Extract Definition)

UpdateWorkAtHomeFlag (eText RTF File)

Worker.dat (Sample File Generated by HCM Extract in this example)

With this I have come to the end of the article, and I hope it will be of some use to you all.

Thanks all for your time and have a nice day!


Creating BI Publisher Data Model Using Query Builder In Oracle ERP Cloud


Introduction

Most of us by now are probably aware of how to create a simple BI Publisher report from a data model using a SQL query; in case you are not, feel free to refer to this article here and it will get you started.

If one has some basic knowledge of SQL then this is fine, but a newbie, or an individual working primarily with OTBI reports (also referred to as drag-and-drop reports), may face challenges. In such scenarios one may make use of the “Query Builder” feature. This feature allows one to simply pick and choose fields from the list of available fields and prepares a basic SQL query from them. One may then add conditional clauses, and the complete query is ready.

We will illustrate this in the worked example below.

 

Worked Out Example

In this example we will fetch a few details from the person and assignment tables. At this point we know that we will be using “Standard SQL” and that the data source is “ApplicationDB_HCM”.

 

Once we click on Query Builder, a new page opens where we have to choose the schema name (“FUSION” for this example) and the table names from which we want to fetch the column details.

 

Once we select the tables we need to select the columns (using the checkboxes). One may even use the "Check All" option, but this only works if the total number of columns is less than 60; the tables we are using here have more than 60 columns, so we have to choose manually. The application screen (post selection) would appear as below:

 

The moment we do this, a basic SQL query is ready, which can be seen in the "SQL" tab.

 

The SQL query which gets created is shown below:

SQL Query Created from Query Builder

select "PER_ALL_PEOPLE_F"."PERSON_ID" as "PERSON_ID",

"PER_ALL_PEOPLE_F"."EFFECTIVE_START_DATE" as "EFFECTIVE_START_DATE",

"PER_ALL_PEOPLE_F"."EFFECTIVE_END_DATE" as "EFFECTIVE_END_DATE",

"PER_ALL_PEOPLE_F"."BUSINESS_GROUP_ID" as "BUSINESS_GROUP_ID",

"PER_ALL_PEOPLE_F"."START_DATE" as "START_DATE",

"PER_ALL_PEOPLE_F"."APPLICANT_NUMBER" as "APPLICANT_NUMBER",

"PER_ALL_PEOPLE_F"."PERSON_NUMBER" as "PERSON_NUMBER",

"PER_ALL_PEOPLE_F"."WAIVE_DATA_PROTECT" as "WAIVE_DATA_PROTECT",

"PER_ALL_ASSIGNMENTS_F"."ACTION_CODE" as "ACTION_CODE",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_ID" as "ASSIGNMENT_ID",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_NAME" as "ASSIGNMENT_NAME",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_NUMBER" as "ASSIGNMENT_NUMBER",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_SEQUENCE" as "ASSIGNMENT_SEQUENCE",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_STATUS_TYPE" as "ASSIGNMENT_STATUS_TYPE",

"PER_ALL_ASSIGNMENTS_F"."ASSIGNMENT_TYPE" as "ASSIGNMENT_TYPE",

"PER_ALL_ASSIGNMENTS_F"."BARGAINING_UNIT_CODE" as "BARGAINING_UNIT_CODE",

"PER_ALL_ASSIGNMENTS_F"."EFFECTIVE_START_DATE" as "EFFECTIVE_START_DATE_1",

"PER_ALL_ASSIGNMENTS_F"."EFFECTIVE_END_DATE" as "EFFECTIVE_END_DATE_1",

"PER_ALL_ASSIGNMENTS_F"."EFFECTIVE_SEQUENCE" as "EFFECTIVE_SEQUENCE",

"PER_ALL_ASSIGNMENTS_F"."EMPLOYEE_CATEGORY" as "EMPLOYEE_CATEGORY",

"PER_ALL_ASSIGNMENTS_F"."EMPLOYMENT_CATEGORY" as "EMPLOYMENT_CATEGORY",

"PER_ALL_ASSIGNMENTS_F"."NORMAL_HOURS" as "NORMAL_HOURS",

"PER_ALL_ASSIGNMENTS_F"."NOTICE_PERIOD" as "NOTICE_PERIOD",

"PER_ALL_ASSIGNMENTS_F"."NOTICE_PERIOD_UOM" as "NOTICE_PERIOD_UOM",

"PER_ALL_ASSIGNMENTS_F"."PERIOD_OF_SERVICE_ID" as "PERIOD_OF_SERVICE_ID",

"PER_ALL_ASSIGNMENTS_F"."PERSON_ID" as "PERSON_ID_1"

from"FUSION"."PER_ALL_ASSIGNMENTS_F" "PER_ALL_ASSIGNMENTS_F",

"FUSION"."PER_ALL_PEOPLE_F" "PER_ALL_PEOPLE_F"

 

One may add an additional WHERE clause as required. For this example we will use the following WHERE clause:

Where Clause

WHERE  "PER_ALL_PEOPLE_F"."PERSON_ID" = "PER_ALL_ASSIGNMENTS_F"."PERSON_ID"

AND TRUNC(SYSDATE) BETWEEN  "PER_ALL_PEOPLE_F"."EFFECTIVE_START_DATE" AND "PER_ALL_PEOPLE_F"."EFFECTIVE_END_DATE"

AND TRUNC(SYSDATE) BETWEEN  "PER_ALL_ASSIGNMENTS_F"."EFFECTIVE_START_DATE" AND "PER_ALL_ASSIGNMENTS_F"."EFFECTIVE_END_DATE"

AND "PER_ALL_PEOPLE_F"."PERSON_NUMBER" = '481'

 

And once done we can see that the Data Model is created

 

Finally, if we click on the “Data” tab we will get the results.

 

 

Summary

So this is how one can make use of the Query Builder feature and prepare a simple SQL-query-based data model in the Oracle ERP Cloud application. This might serve as a good starting point for non-technical users to build a very basic BI report.

Hope this is useful and that you will be able to utilize this for self-learning or for your project deliverables.

That’s all from my side. Thanks for your time and have a nice day!

Verifying Workforce Structures Position Row In Oracle HCM Cloud Application


Introduction

Many a time there can be issues with several data records, and it can be really difficult to find the records as well as the associated issues. Oracle HCM Cloud makes life simpler with the “Diagnostic Framework”. We have already seen a couple of worked-out examples in the following posts:

 

Person Validations Diagnostic Test In Oracle HCM Cloud Application

An Introduction of Person Setup Validations Diagnostic Test In Oracle HCM Cloud Application

 

In this post we will try to run “HCM WS Positions Data Diagnostic” and check the results.

 

Worked Out Example

We would need to log in to the application with a user having appropriate access (HCM_IMPL), navigate to “Run Diagnostic Tests”, and search for the “HCM WS Positions Data Diagnostic” test.

If we click on the Details tab we can find the detailed description of the diagnostic test.

 

And if we click on “Parameters” we can see the following details

By default, we pass % as the position code (this fetches details for all positions in the system) and Print Details as ‘N’ (which does not print the input values passed).

For our example, we will pass three different position codes and verify the results:

Position Code: HEUSPOSE001

Print Detail : Y

 

Position Code: Analyst_CN

Print Detail: Y

 

Position Code: POS021

Print Details: Y

 

Summary

So, this is how one can validate whether a specific position has all the correct details associated with it. In case of any discrepancy, one can take the necessary actions and resolve the issues.

Generating .dat files from HCM Extracts In Oracle HCM Cloud Application


Introduction

In one of the previous articles (Generating EText Output From HCM Extracts In Oracle Fusion HCM Cloud Application) we saw how to generate eText data from the HCM Cloud application. One can even get eText output from a BI Publisher report. But the major limitation of both approaches is that the output file extension is .TXT, and one would need other means to get a .DAT file.

If the generated file is prepared with the intent of loading the data back into the Oracle HCM Cloud application using HDL, one can do so as described in Loading Data into Oracle HCM Cloud Using Inbound Interface Delivery Option.

In this post we will demonstrate how one can generate an eText file with a .DAT extension from the Oracle HCM Cloud application. The only limitation at this point is that the .DAT file will be inside a zip folder.

For this example too we will use the same “UpdateAssignmentExtract”.

We would have two data groups created, namely WorkerDataGroup and WorkTermsDataGroup; WorkerDataGroup is the root data group.

Some Details related to Data Group are in table below:

DataGroupName | UserEntity | DataGroupFilterCriteria | ExtractRecords
WorkerDataGroup | PER_EXT_WORK_RELATIONSHIP_ALL_UE | (pps.person_id = (select papf.person_id from per_all_people_f papf where trunc(sysdate) between papf.effective_start_date and papf.effective_end_date and papf.person_number = '4177')) | WorkRelationship
WorkTermsDataGroup | PER_EXT_SEC_ASSIGNMENT_UE | (asg.primary_flag='Y') | WorkTerms, Assignment

 

Also, we should have the following connection between the root data group (WorkerDataGroup) and WorkTermsDataGroup:

Parent Data Group | Parent Data Group Database Item | Child Data Group | Child Data Group Database Item
WorkerDataGroup | Extract Relationship Period Of Service Id | WorkTermsDataGroup | Extract Assignment Period of service

 

Once we have all these details populated we would need to define the extract delivery options, details of which are in the table below:

Attribute Name | Attribute Value
Start Date | 1/1/2000
End Date | 12/31/4712
*Delivery Option Name | WorkerHDLData
*Output Type | Text
Report | /Custom/Practice Samples/UpdateAssignmentExtractReport.xdo
Template Name | UpdateWorkAtHomeFlag
*Output Name | Worker
*Delivery Type | Inbound Interface
Required | Checked
Encryption Mode | None
Override File Extension | .dat
Integration Name | Worker
Integration Type | Data Loader
Compressed Delivery Group | UpdateAssignment.zip

 

 

Once we are done with the above setup we will get the output file (Worker.txt for this example), but we are trying to generate a .DAT file, so to ensure we get the file with the .dat extension we have to perform the following steps:

  1. Add a parameter named “Auto Load” to the UpdateAssignmentExtract

  2. Add “Initiate HCM Data Loader” to the UpdateAssignmentExtract payroll flow

 

Add “Auto Load” Parameter to UpdateAssignmentExtract

We would need to add a new parameter; details of the parameter are in the table below:

Attribute Name | Attribute Value
Sequence | 100
Name | Auto Load
Tag Name | Auto_Load
Data Type | Text
Display | Yes

 

*Note: We always have to use ‘N’ as the value of this parameter for this specific scenario, or else it will try to load the data back into the application (which is not the aim here).


 

 

Add Initiate HCM Data Loader Payroll Flow to UpdateAssignmentExtract Payroll Flow

As a next step we would also need to add the “Initiate HCM Data Loader” payroll flow as part of the UpdateAssignmentExtract payroll flow. We can either navigate to Payroll -> Checklist -> Search for the UpdateAssignmentExtract payroll flow pattern, or use Refine Extracts -> Search for UpdateAssignmentExtract.

Once the search results are retrieved we need to click on “Edit” (pencil icon) and choose the following task:

 

*Please make sure to select the payroll flow task which has the description “Generate HCM Data Loader File and optionally perform a Data Load”, as the other payroll flow task with a similar name will not serve the purpose here (the two have different definitions).

Once we add the payroll flow task we should click on “Go To Task” and add the following details:

Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Archive Action)

Name | Data Loader Archive Action
Execution Mode | Submit
Data Type | Text
Parameter Basis | Bind to Flow Task
Basis Value | UpdateAssignmentExtract, Submit, Payroll Process
Usage | Input Parameter

 

Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Configuration)

Name | Data Loader Configuration
Execution Mode | Submit
Data Type | Text
Parameter Basis | Constant Bind
Basis Value | ImportMaximumErrors=100, LoadMaximumErrors=100, LoadConcurrentThreads=8, LoadGroupSize=100
Usage | Input Parameter

 

 

Now the setup part is complete, and we will try running “UpdateAssignmentExtract” and check whether a .DAT file gets generated.

 

If we go to the “Extract Delivery Options” tab we can see all the output files generated.

 

If we click on “DL_WORKER_1457667” a zip file will get downloaded

And when we unzip the file we can see a .dat file

Summary

So this is how one can get a .dat file from the Oracle HCM Cloud application making use of HCM Extracts.

Hope this was useful.

Have a nice day.


Creating ESS Job For a BIP Report having multiple user input parameters In Oracle HCM Cloud Application


Introduction

In one of the previous articles (Creating a Parameterised ESS Job in Oracle HCM Cloud Application) we saw how a parameterized ESS job is created in the Oracle HCM Cloud application, but in that example we used only a single parameter and there was no specific linking between the ESS job and the BI report to uniquely identify the parameter. Since we only dealt with one parameter we didn't face many problems, but when there are multiple parameters one may have trouble mapping the ESS job parameters to the corresponding BI report parameters. There is no concept of a token in Oracle Cloud (which, you may remember, was used in EBS while defining concurrent program parameters); the entire mapping is done based on the parameter sequence.

So, we need to arrange the parameters in the same order in the BIP report and the ESS job: parameter 1 of the BIP report becomes argument 1 of the ESS job, parameter 2 becomes argument 2, and so on.

We would demonstrate the same in this post here.

We would need to perform the following steps:

  1. Create a SQL query based BIP data model which has multiple parameters

  2. Create a BIP report from the SQL data model

  3. Create a custom ESS job for the BIP report

  4. Run the custom ESS job and verify the results

 

Create a SQL Query Based BIP Data Model which has multiple parameters

We will create a SQL Query which will fetch person number, hire date, primary flag and worker type. The SQL query is mentioned below:

SQL Query

select papf.person_number,

         ppos.date_start hiredate,

         ppos.period_type,

         ppos.primary_flag

from per_all_people_f papf,

        per_periods_of_service ppos

where papf.person_id = ppos.person_id

and ppos.date_start between :p_from_date and :p_to_date

and trunc(sysdate) between papf.effective_start_date and papf.effective_end_date

and ppos.primary_flag = nvl(:p_flag,ppos.primary_flag)

and ppos.period_type = nvl(:p_period_type,ppos.period_type)

 

The details of the various parameters used are mentioned below:

P_FROM_DATE

Attribute Name | Attribute Value
*Name | p_from_date
Data Type | Date
Default Value |
Parameter Type | Date
Row Placement | 1
Display Label | From Date
Text Field Size | 10
Options (Ignore User Timezone) | Unchecked
Date Format String | yyyy-MM-dd

 

 

P_TO_DATE

Attribute Name | Attribute Value
*Name | p_to_date
Data Type | Date
Default Value |
Parameter Type | Date
Row Placement | 2
Display Label | To Date
Text Field Size | 10
Options (Ignore User Timezone) | Unchecked
Date Format String | yyyy-MM-dd

 

 

P_FLAG

Attribute Name | Attribute Value
*Name | p_flag
Data Type | String
Default Value |
Parameter Type | Text
Row Placement | 3
Display Label | Primary Flag
Text Field Size | 1

 

P_PERIOD_TYPE

Attribute Name | Attribute Value
*Name | p_period_type
Data Type | String
Default Value |
Parameter Type | Text
Row Placement | 4
Display Label | Worker Type
Text Field Size | 1

 

Create a BIP Report from the SQL Data Model

We will create a BIP Report from the above Data Model, and when we run the report the output will be as shown below:

 

Creating Custom ESS Job

We now need to create a custom ESS job. Search for the “Manage Enterprise Scheduler Job Definitions and Job Sets for Human Capital Management and Related Applications” task under Setup and Maintenance, then click on the (+) button and populate the following details:

ESS Job Details

Attribute Name | Attribute Value
*Display Name | Person Details ESS Job
Name | XX_PERSONDETAILS
Path | /ah/
Application | Global Human Resources
Description | Custom ESS Job created to demonstrate how to run BI Report having multiple parameters
Retries |
Job Category |
Timeout Period |
*Job Application Name | EarHcmEss
Enable Submission from Enterprise Manager | Unchecked
Job Type | BIPJobType
Bursting Report | Unchecked
Class Name | oracle.xdo.service.client.scheduler.BIPJobExecutable
Default Output Format | PDF
*Report ID | /Custom/Practice Samples/PersonDetailsReport
Priority |
Allow Multiple Pending Submissions | False
Enable submission from Scheduled Processes | Checked

 

Once populated the UI will appear as below:

Now, we will create the parameters using the following details

From Date

Attribute Name | Attribute Value
*Parameter Prompt | From Date
*Data Type | Date
Read Only | Unchecked
Page Element | Date Picker
Show | Date Only
Default Date Format | yyyy-MM-dd
Default Value |
Tooltip Text |
Required | Checked
Do not Display | Unchecked

 

To Date

Attribute Name | Attribute Value
*Parameter Prompt | To Date
*Data Type | Date
Read Only | Unchecked
Page Element | Date Picker
Show | Date Only
Default Date Format | yyyy-MM-dd
Default Value |
Tooltip Text |
Required | Checked
Do not Display | Unchecked

 

 

Primary Flag

Attribute Name | Attribute Value
*Parameter Prompt | Primary Flag (Y/N)
*Data Type | String
Read Only | Unchecked
Page Element | Text Box
Default Value | Y
Tooltip Text |
Required | Checked
Do not Display | Unchecked

 

Worker Type

Attribute Name | Attribute Value
*Parameter Prompt | Worker Type (E/C/N/O)
*Data Type | String
Read Only | Unchecked
Page Element | Text Box
Default Value | E
Tooltip Text | E-Employee, C-Contingent Worker, N-Non-Worker and O for Others.
Required | Checked
Do not Display | Unchecked

 

 

Once all these details are filled in, the parameter section would appear as below:

*Note: One can use the up/down arrow buttons (highlighted in yellow above) to re-order the parameters. One should ensure that the parameter order is exactly the same as the BIP report parameter sequencing.

 

Running the Custom ESS Job and Verifying Results

Now, as a last step, we will run the ESS job and verify the results. For this we need to navigate to Navigator -> Tools -> Scheduled Processes -> Schedule New Process (search for Person Details ESS Job and submit it).

And when the process completes, it displays the information below:

 

Summary

So, from the above screenshot we can conclude that if we keep the ordering/sequencing of the parameters the same in the BIP report and the ESS job, the application automatically maps each parameter to the corresponding ESS job argument, and no explicit linking is required.

Generating Password Protected PDF Reports from Simple Click on Springboard Icon


Introduction

In one of the previous articles (Creating a Password Protected PDF Output In Oracle Fusion Cloud Application) we saw how to create confidential reports/letters (appointment letter, experience certificate, employment certificate, compensation letter etc.) in the Oracle HCM Cloud application.

Also, in another previous article (Running Parameterized Reports From Navigator Menu and Springboard Icon In Oracle Fusion HCM Cloud) we saw how to run a report from the navigator menu or a springboard icon.

Now, say we want a password protected PDF report to be displayed/downloaded by a simple click on a springboard icon; we would need to make use of both of the above features.

We would try to demonstrate the same in this article here.

Worked Out Example

For this example we would need to perform the following steps:

  1. Create a Password Protected PDF Report

  2. Create a Page Entry under a Group (say “Compensation Related Reports” )

  3. Verify that the report gets displayed when we click on the page entry (it asks for a password to open the PDF), and that once the correct password is entered the data is displayed.

 

Create Password Protected PDF Report

We would use the following SQL. (We have hard-coded our own data instead of fetching the details from the database to keep the SQL simple, but one can very well fetch the details from the database too.)

SQL

SELECT FT.*

FROM

(

SELECT T1.*,

      T1.NID||T1.DOB password

FROM

(SELECT 1 as EmployeeNumber,

               'Alfred Thomas' as EmployeeName,

                10000 as AnnualSalaryinUSD,

                10 as BonusPercentage,

    'QY7865W' NID,

    10151972 DOB

FROM dual

) T1

UNION

SELECT T2.*,

      T2.NID||T2.DOB password

FROM

(SELECT 2 as EmployeeNumber,

               'Barack Alfonso' as EmployeeName,

                20000 as AnnualSalaryinUSD,

                20 as BonusPercentage,

   'SW864E' NID,

   11171995 DOB

FROM      dual

) T2

UNION

SELECT T3.*,

      T3.NID||T3.DOB password

FROM

(SELECT 3 as EmployeeNumber,

               'Chris Matthew' as EmployeeName,

                30000 as AnnualSalaryinUSD,

                30 as BonusPercentage,

   'GT603J' NID,

12241987 DOB

FROM      dual

) T3

UNION

SELECT T4.*,

      T4.NID||T4.DOB password

FROM

(SELECT 4 as EmployeeNumber,

               'Derek Laker' as EmployeeName,

                40000 as AnnualSalaryinUSD,

                40 as BonusPercentage,

   'JK273T' NID,

   10291998 DOB

FROM      dual

) T4

) FT

WHERE FT.EmployeeNumber = :p_employeenumber

 

The Data Model Diagram will appear as below:


 

Next, we need to create an RTF template (it will look as below):


In order to ensure that the user is prompted for a password, we need to define the following properties under Info -> Properties -> Advanced Properties -> Custom:

Name | Value | Type
xdo-pdf-security | true | Text
xdo-pdf-open-password | {/DATA_DS/G_1/PASSWORD} | Text
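The xdo-pdf-open-password property is an XPath reference into the report's XML data. Assuming the default DATA_DS data set name and the G_1 group used in the XPath above, the fragment for the first record would look roughly like this (a sketch, not captured output):

<DATA_DS>
  <G_1>
    <EMPLOYEENUMBER>1</EMPLOYEENUMBER>
    <EMPLOYEENAME>Alfred Thomas</EMPLOYEENAME>
    <PASSWORD>QY7865W10151972</PASSWORD> <!-- NID concatenated with DOB -->
  </G_1>
</DATA_DS>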

 


 

Now when we upload this RTF template against the data model we can create the report (let's call it Reward Letter). When we run the report it asks for an input employee number (we will pass 1 for now), and once we do so it asks for a password. The password for opening the PDF for this record (employee number 1) is the combination of the NID (National Identifier) and the person's date of birth in MMDDYYYY format, i.e. QY7865W10151972.

 


Creating Page Entry

We need to create a group and a page entry. For this we need to create a sandbox and activate it.

Navigation is Navigator -> Configuration->Structure->Create Group

Details for the same are in the table below:

 

Compensation Related Reports (Group)

Attribute Name | Attribute Value
*Name | Compensation Related Reports
Icon | <Choose Any Icon>
Show on Navigator | Yes

 


 

We also need to create a page entry; details are in the table below:

 

Reward Letter (Page Entry)

Attribute Name | Attribute Value
*Name | Reward Letter
*Icon | <Any Icon>
Group | Compensation Related Reports
Show on Navigator | Yes
Show on Springboard | Yes
Link Type | Static URL
Secure Destination | Unchecked
*Destination | <Link of the Custom Report>

 


Once we populate the values we can save the page entry details and publish the sandbox.

 

Verification

Now that all the setup is complete and the sandbox is published we should be able to see an icon titled “Reward Letter” under “Compensation Related Reports” on the Springboard.


 

Next, we click on the “Reward Letter” link, which takes us to the report screen where we need to provide a user input (employee number). Once we provide the employee number, another dialog box opens asking for the password for the PDF document.


 

And once we provide the password we should be able to see the report. (Reward Letter for this example)


 

When we click on submit, we would be able to see the details.


Creating .dat files from BI Publisher Report In Oracle HCM Cloud Application


Introduction

In one of the previous articles (Generating EText Output From HCM Extracts In Oracle Fusion HCM Cloud Application) we saw how to generate eText data from the HCM Cloud application. One can even get eText output from a BI Publisher report. But the major limitation of both approaches is that the output file extension is .TXT, and one would need other means to get a .DAT file.

If the generated file is prepared with the intent of loading the data back into the Oracle HCM Cloud application using HDL, one can do so as described in Loading Data into Oracle HCM Cloud Using Inbound Interface Delivery Option.

In this post we will demonstrate how one can generate an eText file with a .DAT extension from the Oracle HCM Cloud application. The only limitation at this point is that the .DAT file will be inside a zip folder.

For this example, we have to create a BI data model, which acts as the data source, and an HCM Extract. While the data is fetched from the BI data model, we still need the HCM Extract because the “Inbound Interface” delivery option is what assigns the .dat extension to the generated output file.

We would create a Dummy Extract named “DummyExtract” which will have just one attribute.

We would have one data group, namely DummyDataGroup, which is also the root data group.

Some details related to the data group are in the table below:

DataGroupName | UserEntity | DataGroupFilterCriteria | ExtractRecords
DummyDataGroup | PER_EXT_PAY_EMPLOYEES_UE |  | PersonRecord

 

Once we have all these details populated we would need to define the extract delivery options, details of which are in the table below:

Attribute Name | Attribute Value
Start Date | 1/1/2000
End Date | 12/31/4712
*Delivery Option Name | PersonData
*Output Type | Text
Report | /Custom/Practice Samples/PersonDataReport.xdo
Template Name | PersonDataTemplate
*Output Name | Worker
*Delivery Type | Inbound Interface
Required | Checked
Encryption Mode | None
Override File Extension | .dat
Integration Name | PersonData
Integration Type | Data Loader
Compressed Delivery Group | PersonData.zip

 

 

Once we are done with the above setup we will get the output file (a .txt file for this example), but we are trying to generate a .DAT file, so to ensure we get the file with the .dat extension we have to perform the following steps:

  1. Add a parameter named “Auto Load” to the DummyExtract

  2. Add “Initiate HCM Data Loader” to DummyExtract payroll flow

 

Add “Auto Load” Parameter to DummyExtract

We would need to add a new parameter; details of the parameter are in the table below:

Attribute Name | Attribute Value
Sequence | 10
Name | Auto Load
Tag Name | Auto_Load
Data Type | Text
Display | Yes

 

*Note: We always have to use ‘N’ as the value of this parameter for this specific scenario, or else it will try to load the data back into the application (which is not the aim here).

 

Add Initiate HCM Data Loader Payroll Flow to DummyExtract Payroll Flow

As a next step we would also need to add the “Initiate HCM Data Loader” payroll flow as part of the DummyExtract payroll flow. We can either navigate to Payroll -> Checklist -> Search for the DummyExtract payroll flow pattern, or use Refine Extracts -> Search for DummyExtract.

Once the search results are retrieved we need to click on “Edit” (pencil icon) and choose the following task:

 

*Please make sure to select the payroll flow task which has the description “Generate HCM Data Loader File and optionally perform a Data Load”, as the other payroll flow task with a similar name will not serve the purpose here (the two have different definitions).

Once we add the payroll flow task we should click on “Go To Task” and add the following details:

Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Archive Action)

Name | Data Loader Archive Action
Execution Mode | Submit
Data Type | Text
Parameter Basis | Bind to Flow Task
Basis Value | DummyExtract, Submit, Payroll Process
Usage | Input Parameter

 

Initiate HCM Data Loader Task Definition: Basic Information (Data Loader Configuration)

Name | Data Loader Configuration
Execution Mode | Submit
Data Type | Text
Parameter Basis | Constant Bind
Basis Value | ImportMaximumErrors=100, LoadMaximumErrors=100, LoadConcurrentThreads=8, LoadGroupSize=100
Usage | Input Parameter

 

 

Now the setup on the HCM Extract side is complete.

*Note: One point to note here is that the report we have used, /Custom/Practice Samples/PersonDataReport.xdo, is not based on globalReportsDataModel but on a custom data model.

Details of PersonDataReport (Data Source)

Contrary to the normal approach, where we use globalReportsDataModel in an HCM Extract, here we have to create a report based on a custom data model. The SQL query for the custom data model is mentioned below:

 

SQL Query for PersonData Data Model

select papf.person_number personnumber,

          to_char(papf.start_date, 'YYYY/MM/DD') hiredate,

         to_char(papf.effective_start_date, 'YYYY/MM/DD') effectivestartdate,

         to_char(papf.effective_end_date, 'YYYY/MM/DD') effectiveenddate

from per_all_people_f papf

where trunc(sysdate) between papf.effective_start_date and papf.effective_end_date

and papf.person_number IN ('7','10','13','39')

 

We would create a report from this data model which will have the following eText template attached.

And when we run the report we can see data as below:
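The exact layout depends on the eText template. As an illustration only (this is not the template's actual output; the attribute list is abbreviated and the dates are made up), HDL-style Worker lines built from this query could look like:

METADATA|Worker|PersonNumber|StartDate|EffectiveStartDate|EffectiveEndDate
MERGE|Worker|7|2001/04/16|2001/04/16|4712/12/31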

 

Running Dummy Extract

Now, as a last step, we will run the dummy extract and check whether we get the output in a .dat file.

 

If we go to the “Extract Delivery Options” tab we can see all the output files generated.

 

If we click on “DL_PERSONDA_1460063” a zip file will get downloaded

And when we unzip the file we can see a .dat file


Summary

So this is how one can get a .dat file from the Oracle HCM Cloud application using a BI Publisher data model as the data source. One can use any SQL query as the data source (no matter whether the application module is HCM, Finance or something else). The only limitation at this point is that the report needs to be static (that is, it cannot accept user input), but we will probably get a workaround for that too in the near future. Till then we may use this technique to generate files with the .DAT extension from BI Publisher.

Oracle PIM Cloud - Item Creation using FBDI


FBDI is one of the many ways to do conversions in Oracle Fusion Cloud. FBDI stands for File Based Data Import. In a cloud environment, FBDI is the best way to get mass conversions done in the shortest time. Not all entities are currently provided with FBDI in Oracle Cloud. During the planning phase itself, time and budget are calculated based on the availability of FBDI, ADFDI or web services. This is important during the plan and design phases because write access to the database is restricted for SaaS (Software as a Service) users.

Pre-Requisites for FBDI:

1. The machine should have Excel or equivalent software which can execute Excel macros

2. For Excel, users should make sure that macros are enabled

3. FBDI for the component should exist on docs.oracle.com

4. It is better to download the FBDI template which is of the same version as the cloud environment.

Stages:

1. Downloading the template

2. Preparing data for the template

3. Generating the .zip files

4. Uploading the file to oracle

5. Move data to Interface tables

6. Move data to Base tables

 

Stage I – Download the Template

Download the template from the Oracle website.

Navigation: docs.oracle.com -> Cloud -> Applications -> Product Management (select your version here from the drop down) -> Books -> File-Based Data Import for Oracle Supply Chain Management Cloud -> Expand All -> Select Item -> Download the template

It is essential to match the version of the SaaS environment and the version of the FBDI template downloaded, to avoid issues.

 

Downloading the right FBDI template:

 

Stage II – Preparing the Data

Preparing data for the template

This is the crucial part of the entire process, and the only part that differs from other conversions; the rest of the process is generic for any FBDI upload. Basic instructions are provided in the first worksheet of every FBDI template.

 

There are multiple worksheets for Item creation, but this demo is restricted to EGP_SYSTEM_ITEMS_INTERFACE only.

In EGP_SYSTEM_ITEMS_INTERFACE there are multiple fields, of which only a few are mandatory; the rest can be configured based on the specific requirements.

 

 

  • Transaction Type – Create (for creation of Items)

  • Batch ID – Derived from Item Batches

  • Batch Number – Derived from Item Batches

  • Item Number – Name of the Item (In Oracle Cloud Item Name is referred as Item Number)

  • Organization Code – Refers to the Item Master organization code (not the name)

  • Description – Refers to Item Description

  • Template Name – This is the Item Template Name

  • Source System Code – Refers to the Source system from “Manage Trading Community Source Systems”

  • Item Class Name – Refers to the name of the item class to which the items are uploaded

  • Primary Unit of Measure—Refers to Unit of Measure Name (not the code)

    • Lifecycle Phase – Refers to the lifecycle phase of the item

    • Item Status – Refers to the status of the item

Deriving Batch ID and Batch Number from Item Batches:

Only users who have the role of “Product Data Steward” can view Item Batches in their Item Workbench Tasks panel.

Navigation -> Home -> Product Management -> Product Information Management -> Tasks Panel

 

Check for the available batches by clicking on Search (create one if no item batch is available).

 

 

 

  • Note down the Batch Number and Batch ID from here. If the Batch ID is not visible on the screen, get it from the columns view as shown in the above screenshot.

  • The source system can be obtained from the FSM setup task “Manage Trading Community Source Systems”

 

 

The primary unit of measure name can be obtained from the FSM setup task “Manage Units of Measure”. Refer to the screenshot below.

 

 

 

Lifecycle phases can be obtained from the FSM setup task “Manage Lifecycle Phases”. Production is one of the phases present in the instance by default.

 

 

Save the details in the worksheet.

Stage III – Generate .zip File

  • Once all the required data is filled in, click on the “Generate CSV File” button present on the first worksheet of the template

  • The macro runs and generates a series of CSV files zipped into one file. Name the file appropriately for easy reference.

Stage IV- Upload the files to Oracle Cloud

  • The zip file generated should be uploaded to the UCM File server location

  • Navigation: Home-> Navigator-> File Import and Export

 

Choose the location of the zip file and provide the correct Account. Here it is scm/item/import

 

 

Stage V- Move data to Interface Tables

This stage involves moving the prepared data from the UCM server to the interface tables. Oracle Cloud performs basic validations on the data and moves the process to success/error accordingly.

Navigation: Home -> Navigator-> Scheduled Processes -> Schedule New Process -> Load Interface File for Import

It is very important to provide the right Import Process and the Data File.

 

 

Stage VI- Move data to Base Tables

The data uploaded is now in the interface tables; it will be moved to the base tables if the validated data meets all the criteria for being created as an item.

Navigation: Home-> Navigator -> Scheduled Processes-> Item Import

Enter the Batch Id and other parameters as needed.

 

The process should end in Succeeded status. In case of any errors, refer to the log file for the failure reason.

 

Verify the Item Creation by searching for the item manually.

 

 

Thank you for watching this space.

 

About the Author:

 

Krishna Kumar is an Oracle ERP consultant with 12 years of experience. He has implemented many ERP projects in the high-tech and retail industries. He is a cloud enthusiast and an early bird in implementing Cloud SCM projects.


 

Solution for error "ORA-01012 not logged on" while trying to start the Oracle Database


Problem:

 

When trying to start the database you may get the error ORA-01012: not logged on.


$ sqlplus / as sysdba

SQL*Plus: Release 11.2.0.4.0 Production on Tue Dec 11 07:21:30 2018

Copyright (c) 1982, 2011, Oracle. All rights reserved.

Connected.

SQL> startup

ORA-01012: not logged on

 


 

Solution:

To resolve this error you need to remove the orphaned shared memory segment using the sysresv utility. I was able to resolve the issue successfully with this approach.

 

$ sysresv

 

The sysresv command lists the IPC resources (shared memory and semaphores) currently allocated to the instance; you can then remove the shared memory segment using the ipcrm -m command, as you can see in the screenshot below. You can ignore the semaphores.
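A sketch of the sequence (the shared memory ID below is illustrative; use the ID that sysresv reports for your instance):

$ sysresv
# ... output lists the shared memory segment ID(s) and semaphore IDs for the ORACLE_SID ...
$ ipcrm -m 123456789   # remove the orphaned shared memory segment by its ID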


 

Once done, you can start the database and it should start successfully.


Oracle Integration Cloud Service (OICS) Training


 

The Oracle Integration Cloud Service (OICS) product has integration capabilities (ICS), process automation (PCS) and visual application building (VBCS), and is customer managed. We will cover OICS, Visual Builder and MFTCS in this course.

{tab Course Contents | orange}

{tab-l1 Day 1 | orange}

Tour to OIC
Integrations
Integration Patterns/Orchestrations
FBDI, web service, REST API
Connections - ERP, REST, SOAP, DB, FTP, FILE and other adapters
Lookups

{tab-l1 Day 2 | green}

Inbound and Outbound Integration with Oracle Fusion
Mapper, payload handler Functions, Scope, Fault Handler, Stage File, and other Actions
ERP Integration Outbound - BIP Report
Building the Chart of Accounts Structure

{tab-l1 Day 3 | red}

Build Event based integration
ERP Integration Inbound- Tips & Best Practices
Built Bulk import ERP Inbound integration

{tab-l1 Day 4 | blue}

ERP CallBack and Reconciliation
Migration, Monitoring, Debugging
VBCS
Overview
REST Services and Apex(ORDS)

{tab-l1 Day 5 | grey}

VBCS Page Build and deploy

{tab-l1 Day 6 | info}

Process cloud service
MFT

{tab-l1 Day 7 | orange}

Compute Classic Procuring and initial setups
Role Management, agent installation, Admin steps

{tab-l1 Day 8 | green}

Future Releases and Additional topics. (Bouncing DB server)

{/tabs}
{tab Enroll | grey}

 
 
 
 
 


{tab Training Hours | red}

Start Date: 02nd March 2019

Training Schedule: 02, 03, 09, 10, 16, 17, 23 & 24th March 2019

Timing: 12:00 NOON GMT | 07:00AM EST | 4:00AM PST | 6:00AM CST | 5:00AM MST | 5:30PM IST  | 01:00PM GMT+1

This training will run for 8 days over weekends

{/tabs}
{jcomments off}

 

 

Oracle Enterprise Planning and Budgeting Cloud Service (EPBCS) Training

$
0
0

 

Enterprise Planning and Budgeting Cloud Service training will show you how to implement, administer and effectively use the application to meet your business needs. This training will explain how to create goal-oriented plans, reports and dashboards, show you how to import and export metadata, and describe how to use dashboards and reports to get an immediate sense of the state of your business plans.

{tab Course Contents | orange}

{tab-l1 Day 1 | orange}

Overview of EPBCS
Navigating the Application
Creating a Planning Application
Entering and Retrieving Data using Smartview
Hands-on Session

{tab-l1 Day 2 | green}

Data Management: Importing and Exporting Data and Metadata
Introduction to EPM Automate
Hands-on session

{tab-l1 Day 3 | red}

Introduction to Business Process/Frameworks
Financials Business Process

Revenue, Expenses, Balance Sheet, Income Statement

Hands-on session

{tab-l1 Day 4 | grey}

Capital Assets Planning
Workforce Planning

Projects Module

Hands-on session

{tab-l1 Day 5 | blue}

Overview of EPBCS
Strategic Modelling
Administration tasks
Security: Users, Groups, Migration of Artifacts
Creating Task Lists
Creating Data Forms

Hands-on Session

{tab-l1 Day 6 | orange}

Designing Business Rules
Approval Process

Creating User Defined Elements

Setting up Exchange Rates

Hands-on session

{/tabs}
{tab Enroll | grey}

 
 
 
 
 


{tab Training Hours | red}

Start Date: 02nd March 2019

Training Schedule: 02, 03, 09, 10, 16, 17, 23 & 24th March 2019

Timing: 12:00 NOON GMT | 07:00AM EST | 4:00AM PST | 6:00AM CST | 5:00AM MST | 5:30PM IST  | 01:00PM GMT+1

This training will run for 8 days over weekends

{/tabs}
{jcomments off}

 

 

 


Oracle Enterprise Planning and Budgeting Cloud Service (EPBCS) Training

$
0
0

 

Enterprise Planning and Budgeting Cloud Service training will show you how to implement, administer and effectively use the application to meet your business needs. This training will explain how to create goal-oriented plans, reports and dashboards. It shows you how to import and export metadata, and describes how to use dashboards and reports to get an immediate sense of the state of your business plans.

{tab Course Contents | orange}

{tab-l1 Day 1 | orange}

Overview of EPBCS
Navigating the Application
Creating a Planning Application
Entering and Retrieving Data using Smartview
Hands-on Session

{tab-l1 Day 2 | green}

Data Management: Importing and Exporting Data and Metadata
Introduction to EPM Automate
Hands-on session

{tab-l1 Day 3 | red}

Introduction to Business Process/Frameworks
Financials Business Process

Revenue, Expenses, Balance Sheet, Income Statement

Hands-on session

{tab-l1 Day 4 | grey}

Capital Assets Planning
Workforce Planning

Projects Module

Hands-on session

{tab-l1 Day 5 | blue}

Overview of EPBCS
Strategic Modelling
Administration tasks
Security: Users, Groups, Migration of Artifacts
Creating Task Lists
Creating Data Forms

Hands-on Session

{tab-l1 Day 6 | orange}

Designing Business Rules
Approval Process

Creating User Defined Elements

Setting up Exchange Rates

Hands-on session

{/tabs}
{tab Enroll | grey}

 
 
 
 
 


{tab Training Hours | red}

Start Date: 06th April 2019

Training Schedule: 06, 07, 13, 14, 20 & 21st Apr 2019

Timing: 12:00 NOON GMT | 08:00AM EST | 5:00AM PST | 7:00AM CST | 6:00AM MST | 5:30PM IST  | 01:00PM GMT+1

This training will run for 6 days over weekends

{/tabs}
{jcomments off}

 

 

 

Oracle IDCS

$
0
0

Security can now be a reason to move to the cloud; Oracle's stated vision is that the cloud can be more secure than on-premise. Investments in proactive security are made at every layer and stage of the technology stack, from silicon up through IAAS, PAAS and SAAS. Oracle's portfolio includes award-winning database security solutions such as Audit Vault and Database Firewall, and Oracle offers comprehensive Identity and Access Management solutions for directory, governance and access. The Oracle Cloud incorporates the best of Oracle's security products, technology and processes, and Oracle's security cloud services are built on this deep intellectual property and security foundation.

 

This brings us to the focus of this article: the Identity Cloud Service (IDCS), a module of Oracle's security offering.

 

IDCS is a secure, on-demand identity service in the Oracle public cloud. It provides native cloud security through an identity and access management platform designed to integrate with the enterprise security fabric, rapidly bringing modern identity for modern applications into your identity management systems from the cloud. Since it is built on standards, it eases integration of your applications and facilitates interoperation. Not only does the design of the service provide security, it also inherits security from the Oracle Cloud. IDCS helps customers move to the cloud quickly and adopt it easily.

 

Do you want training on Oracle IDCS?

Contact us now

 

Identity challenges faced with SAAS applications

There are several challenges users face with SAAS applications. The three most glaring, commonly faced by SAAS users, are listed below.

  1. Fragmentation

  2. Weak Security

  3. Lack of Governance

 

Let’s look at the reasons for these challenges.

 

1. Fragmentation - For a while, SAAS applications were deployed on the basis of need, so the range of applications an organisation uses may have been deployed from various vendors. Often, in the rush to market, these applications were never wired up to on-premise identities, resulting in users having to log into these SAAS applications with different credentials instead of their on-premise credentials, and hence a lack of synchronisation.

The scatter of these SAAS applications has cost enterprises huge amounts of money in terms of resetting multiple passwords for users and other challenges that arise from fragmentation of applications.

2. Weak Security - SAAS applications are not built with security inbuilt. For instance, if we want intelligent capabilities within SAAS applications that prompt the user for multi-factor authentication or a One Time Passcode (OTP), or that step up authentication depending on risk, these options are not facilitated within the application. For many enterprises, the lack of these facilities has created an increasing need for an integrated, centralised solution that can enforce this across their entire cloud-based portfolio.

 

3. Lack of Governance - Governance is an area where enterprises have excellent coverage for on-premise applications, because on-premise identity management, including Oracle's identity and governance portfolio, has excellent governance capabilities for on-premise applications; however, these capabilities are not extended to the cloud, because no identity-as-a-service vendor provides them in the cloud.

 

The Identity Cloud Service is designed to differentiate in 3 key areas

 

  1. Hybrid

  2. Open & Standards - based

  3. Secure Defense - in Depth

 

Why Oracle Identity Cloud Service when it comes to the above mentioned differentiators?

 

  

 

Hybrid means identities can be managed for both cloud-based applications and on-premise applications, integrating with Oracle's on-premise portfolio in a manner that standalone Identity-as-a-Service providers cannot match. Oracle's IDCS Hybrid Identity features permit us to manage identities for cloud and on-premise applications with enterprise-grade hybrid deployments.

 

  1. There are flexible provisions to manage identities in the cloud

      1. Synchronize identities directly with AD or OIG

      2. If IDCS is configured for a SAAS application, all the identities from OIG, or directly from AD, can be synchronised using an identity bridge (software that can be deployed on-premises), or authentication can be federated to OAM or ADFS, for instance, which permits authentication to be delegated to an external provider

  2. Centralised governance workflows for Cloud applications

      1. Access review certification extended from OIG: an OIM connector is provided, with which IDCS enables the administrator to perform a set of capabilities for applications that are protected by IDCS

      2. Audit compliance to extend OIG SoD to IDCS

      3. IDCS applications and Access Control to include external reporting in OIG

          

         

Do you want training on Oracle IDCS?

Contact us now

Open & Standards - based

 

 

IDCS conforms to 4 key standards

1. OAuth

2. SCIM

3. SAML

4. OpenID

 

Using the above standards we can integrate with pretty much any application, as long as those standards are conformed to. Proprietary integration is no longer necessary; standards are the basis of everything.

 

  1. OAuth and SAML can be used to integrate with almost any application that we have

  2. SCIM is used to manage all forms of identity. It is a comprehensive open standard used to manage all identities in the cloud directory; hence, if customers want to provision users into the cloud directory in their instance, they can do that directly using the SCIM standard (see the sketch after this list)

  3. OpenID Connect is used for authentication workflows. Oracle is a sustaining member of the board of the OpenID Foundation

  4. Native IDCS support for SAML, SCIM, OpenID Connect and OAuth

  5. The FastFed Working Group helps accelerate and simplify application integration
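
As an illustrative sketch only - the tenant URL, bearer token and user details below are placeholders, and the payload follows the standard SCIM 2.0 core user schema - provisioning a user into the cloud directory over SCIM could look like this:

curl -X POST "https://<tenant>.identity.oraclecloud.com/admin/v1/Users" \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/scim+json" \
  -d '{
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": "jdoe",
        "name": {"givenName": "Jane", "familyName": "Doe"},
        "emails": [{"value": "jdoe@example.com", "type": "work", "primary": true}]
      }'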

 

Secure Defense - in Depth

 

 

IDCS is designed with security in mind. It is built with several security capabilities to encrypt identities at rest, besides leveraging security capabilities from the Oracle Cloud platform.

The key differentiator is that many of these capabilities are leveraged from the Oracle Cloud platform itself.

 

  1. Oracle public cloud layers of defense

      1. Administrative controls for fraud detection, alerting, blocking and behaviour-based strong authentication

      2. Restriction of admin access: roles, policies and real-time variables

      3. Schema isolation and Transparent Data Encryption

  2. Contextual user access control implementation in IDCS

      1. Time of day, device, network, geo-location etc.

  3. Third-party integration - ready with open apps

      1. Policies and risk scores from SIEM, CASB and UEBA vendors

 

Capabilities of Oracle Identity Cloud Service

 

The Oracle Identity Cloud Service is not just another SSO and provisioning service in the cloud; it is a comprehensive identity management solution that provides all of the features mentioned below. The service can integrate not only with Oracle cloud applications such as Oracle's SAAS and PAAS offerings, but also with third-party applications like Workday and Office 365, as well as with on-premise applications.

 

One thing that differentiates IDCS is that it enables customers to protect not just the IDCS APIs but also their own custom APIs using the IDCS server. Once applications move to the cloud, customers can continue to get capabilities like governance, segregation of duties and audit/compliance reports using the OIM connector for IDCS, so they can keep using all these capabilities from OIG even after moving the application policies to IDCS.
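
As a hedged illustration of that model - the endpoint pattern follows the documented IDCS OAuth token URL, while the client credentials and scope are placeholders - a backend client could obtain an access token for a protected custom API like this:

curl -X POST "https://<tenant>.identity.oraclecloud.com/oauth2/v1/token" \
  -H "Authorization: Basic <base64 of client_id:client_secret>" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials&scope=<protected_api_scope>"

The returned bearer token is then presented to the custom API, which relies on IDCS to validate it.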

 

Practical Applications in the cloud

 

Let’s look at a few practical applications of IDCS in the cloud and its advantages.

 

1. Modernising custom applications in the cloud

 

Why should we modernize?

 

  1. Maintaining legacy applications is quite expensive

  2. Proprietary Integrations

  3. Integration with AD/OIG

 

Moving on-premise applications to IAAS/PAAS

 

How does IDCS facilitate modernisation?

  1. Rich API support

  2. Flexible User/Group/Role based access control policies

  3. Ability to secure custom app APIs

 

Key Features of IDCS

 

  1. Easy to integrate Apps with IDCS

  2. Use OAuth to protect app APIs in addition to user access

  3. The SCIM compliant Cloud Directory is fully featured

  4. App roles and groups are supported

  5. Inter-op with 3rd party tokens for services that span multiple apps/services

  6. Audit Logs are available in detail

 

2. Integration with any application

 

With IDCS we can integrate with any application be it:

 

  1. Oracle PAAS/SAAS service

  2. Oracle on-premise applications

  3. 3rd Party SAAS applications

 

Key features of IDCS in terms of integration with applications

 

  1. It helps to integrate with 3rd-party apps using SAML/OIDC/OAuth for SSO & access management functions

  2. IDCS can act as an Identity Provider in this scenario

  3. Profile and password management functions performed by users and administrators

  4. Accomplish Hybrid Identity capabilities (existing OIM customers)

  5. Third party apps to target - Salesforce,Box,Office 365,Google etc.

 

3. Manage external identities

In many cases customers are trying to upgrade legacy applications to modernise them and to incorporate social identities and auto-scaling capabilities.

Some of these applications were written decades ago, when there was no concept of social identity; they were deployed on premises and, in many cases, hosted on custom hardware. When they experience higher demand, these applications require manual scaling, and they then need to be scaled back when demand reduces. By moving these applications to the cloud we can leverage auto-scaling capabilities and simplify their management and administration. For IDCS in particular, the very strong data security in the cloud provides capabilities like Transparent Data Encryption and schema isolation for the consumer identities that will be stored in the identity cloud.

Why do customers upgrade to external-facing apps?

 

  1. To consume Social Identities

  2. Auto scaling is more reliable in the cloud

  3. To move apps to SAAS/PAAS

 

Why IDCS for these apps?

  1. Strong data security in the Oracle Cloud

  2. Rich APIs for integration with custom Applications

Key features of IDCS in terms of managing external identities

  1. A fully-functional cloud directory that can house identities

  2. Self-service functions for end users and ID admin functions for administrators

  3. Easy access to applications without the need for a VPN or on-premise gateways

  4. Extensive APIs that allow customers to integrate identity in a coherent manner

Oracle Identity Cloud Service (IDCS) Training

$
0
0

Oracle Identity Cloud Service (IDCS) integrates directly with existing directories and identity management systems, making it easier for users to access applications. It provides a robust and secure platform that allows users to access, develop and deploy their applications.

{tab Preview | green}

Advantages with IDCS:

Identity Cloud Service is designed to be an integral part of the enterprise security fabric. Oracle Identity Cloud Service provides identity management, single sign-on (SSO) and identity governance for on-premise, cloud and mobile applications. The benefits of implementing Oracle Identity Cloud Service are improved business responsiveness, enhanced user productivity and experience, hybrid multi-channel access and, finally, simplified IT and reduced cost.

 

{tab Course Contents | orange}

{tab-l1 Day 1 | orange}

CLOUD SECURITY FUNDAMENTALS

Cloud Service Models: SaaS, PaaS, IaaS and cloud deployment models
Introduction to Oracle Cloud Security
Cloud Security Benefits


A DEEPER LOOK INTO ORACLE SECURITY

Oracle Cloud Security Architecture
Oracle Cloud Security Service
Introduction to Oracle IDCS & CASB
Identity Centric Security

{tab-l1 Day 2 | green}

INTRODUCTION TO IDCS

On-Premise Identity & Access Management Overview
Introduction to IDCS, Overview & Deep Dive
IDCS Pricing Model
Concepts and Terminology for SAML, OAuth, OpenID Connect
Registering for Cloud & Accessing IDCS Console

MANAGING IDCS USERS, GROUPS, APPLICATIONS (IDCS ADMINISTRATION)

Administering Users
Administering Groups
Administering Applications
User, Role, Application Assignment
Bulk Load Users

{tab-l1 Day 3 | red}

IDCS CUSTOMIZATION & BRANDING

Setting Overview
Customize default settings and notifications
Customize Password Policies
Email Notifications
Multi-Factor Authentication

 

MANAGING ORACLE IDENTITY CLOUD SERVICE APPLICATIONS

IDCS Integration to other Applications
Understanding Cloud-based Oracle Applications
On-Premise Applications
Token Based SSO Auditing
IDCS Reports type and Run IDCS Reports

{tab-l1 Day 4 | blue}

USING REST APIs

Introduction of REST APIs
API Catalog for IDCS

MONITORING IDCS

Monitor IDCS Performance
Tune IDCS Instance
Troubleshooting IDCS

{/tabs}
{tab Enroll | grey}

 
 
 
 
 


{tab Training Hours | red}

Start Date: 11th May 2019

Training Schedule: 11, 12, 18 & 19th May 2019

Timing: 12:00 NOON GMT | 08:00AM EST | 5:00AM PST | 7:00AM CST | 6:00AM MST | 5:30PM IST  | 01:00PM GMT+1

This training will run for 4 days over weekends

{/tabs}
{jcomments off}

 

BIP - Multiple File Bursting with Ora_Hash Function

$
0
0


Business Need/Requirement: to extract a huge data set into multiple chunks, i.e., multiple extract files instead of one single extract file with all the records.

This can be achieved by performing the below core activities:

1.  Using the ORA_HASH function, we can split the records into balanced data sets

2.  Using bursting, we can write each balanced data set to a separate file

3.  Using a custom ESS job to run the BIP program, pass the chunk size and move the generated files to an SFTP location

Please refer to the below example, which illustrates the process to follow, along with screenshots.

- First, we create a data model and report using BIP (refer to the BIP Report Creation blog)

          

- Then we add the bursting query to the report

          

- We should then create an input parameter for the report, say 'File_count', which is used by the ORA_HASH and BIP bursting logic to split the records and create multiple files.

1.  As we know, the ORA_HASH function (part of the data model) will split rows into balanced data sets. The number of data sets depends on the value passed to the BIP report input parameter.

Example: ORA_HASH(column, File_count) -> ORA_HASH(Trx_number, 2) - this will split the records into three sets (0, 1, 2), as shown in the screenshot below, because ORA_HASH buckets run from 0 up to the maximum bucket value inclusive.

       
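
For reference, a minimal sketch of such a data model query is shown below. The table xxvr_customer_trx and its columns are hypothetical stand-ins, and :File_count represents the report input parameter:

SELECT trx_number
     , account_number
       -- ORA_HASH(expr, max_bucket) returns a bucket between 0 and max_bucket
       -- inclusive, so :File_count = 2 yields the three sets (0, 1, 2)
     , ORA_HASH(trx_number, :File_count) AS HASH
FROM   xxvr_customer_trx                 -- hypothetical source table
ORDER  BY HASH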

2.  We have split the records into three data sets and now we want to create three files, one per data set. Inside the BIP bursting query we use the alias name HASH defined for the ORA_HASH function in the data model query. Since ORA_HASH created three data sets, BIP bursting will also create three files, with each data set in its own file.

        

Below is the bursting query, where we also need to provide the SFTP details of where to place the files once the BIP program is executed.

SELECT DISTINCT
       1                                 AS "KEY"
     , 'XXVR_CUST_ACCOUNT'               AS TEMPLATE        -- data template name
     , 'en-US'                           AS LOCALE          -- English
     , 'TEXT'                            AS OUTPUT_FORMAT   -- output format
     , 'true'                            AS SAVE_OUTPUT     -- whether to save the output
     , 'FTP'                             AS DEL_CHANNEL     -- delivery channel type
     , 'XXVR_CUST_ACCOUNT'               AS OUTPUT_NAME     -- output name (give any name)
     , 'SFTP Server'                     AS PARAMETER1      -- SFTP server name
     , 'em257008'                        AS PARAMETER2      -- username
     , 'Integrat!0n'                     AS PARAMETER3      -- password
     , '/E_1/ftp_inbox/upload/INT001'    AS PARAMETER4      -- file path on the server
     , 'XXVR_CUST_ACCOUNTS_' || TO_CHAR(SYSDATE, 'DDMMYYHH24MISS') || '.csv'
                                         AS PARAMETER5      -- output file name
     , 'TRUE'                            AS PARAMETER6      -- use secure FTP (TRUE or FALSE)
  -- (the FROM clause against the data model source is elided in the original)

Note: we should also enable bursting from the Properties tab, as shown below.

Custom ESS Job Execution for the BIP Report

We can create a custom ESS job to run the BIP program. Create a parameter prompt in the custom job, which we will use in the ORA_HASH and bursting query logic.

 

From the Navigation tab, click on Scheduled Processes -> Schedule New Process -> select the custom ESS job and pass the parameter value (in our example we used a File Count value of 2).

 

As shown above, the job status is Succeeded.

Log in to the SFTP server via WinSCP or any other tool, and we can see that 3 files have been placed in the SFTP location via BIP bursting, with an almost equal split of records in each file.

