Channel: Apps2Fusion Articles

How to Manage Catalogs in Fusion Cloud Procurement- Demo


A Procurement catalog is a maintained list of items, including details about each item such as its price. Maintaining this list allows requisitioning users to easily browse for what they need and add items to their requisitions. Fusion accommodates three types of Procurement catalogs:
Administrators can define partitions of the local catalog using inclusion and exclusion rules for agreements and categories
Administrators can set up a punchout to an Oracle Exchange marketplace, such as exchange.oracle.com, or to a supplier web store to access their catalogs
Administrators can define informational catalogs, which contain instructions or links for ordering other items or services at your company

 

How to Manage Catalogs in Fusion Cloud Procurement

Enrol Now for Fusion Procurement Training


How to Create Procurement Agent in Oracle Fusion Cloud- Demo


Use the Create Procurement Agent page to control a procurement agent's access to procurement activities for a  business unit. You can implement document security for individual document types. You can also control a procurement agent's access to activities such as suppliers, approved supplier list entries, and business intelligence spend data through the settings.

 

How to Create Procurement Agent in Oracle Fusion Cloud

 

Enrol Now for Oracle Fusion Cloud Procurement Training

An Overview of HR Helpdesk Feature in Oracle HCM Cloud Release 13


Introduction

HR Helpdesk is probably one of the most sought-after features that HR Administrators, Support Engineers, Business Analysts, and end users have been requesting for a long time. Thankfully, it is delivered in Release 13.

Prior to the availability of a delivered Helpdesk suite, individual users had to route their HR-related business requests either via some custom application or via email. Neither approach was foolproof, as it was difficult to track requests and their status. With this new feature, the HR Helpdesk is integrated with the application, so a single application both tracks the request and retrieves its real-time status.

Advanced features such as diverse communication channels (chat support, email, phone, web), interaction history, links to related articles, links to associated Service Requests, and many more make HR Helpdesk a very useful and attractive addition to the new release.

Let us now quickly walk through how to create a simple Service Request in the application.

Creating a Service Request

Navigation -> Navigator -> HR Helpdesk -> HR Service Requests

 

One may even choose to navigate to the HR Service Requests page via the Springboard Icon

Once you click on the ‘HR Service Request’ link, it takes you to a new page where one can search for existing requests or create new ones.

For this example, we will create a new service request (by clicking on “Create Service Request” Button)

This will open a new page where we would need to enter specific details like Title, Point of Contact, Severity, Category etc.

A typical filled in form would appear as below:

One may also add attachments to the service request as required (we haven’t attached any for this specific example).

Once you click on “Save and Continue” it will take you to a new screen which will appear as below:

All the other tabs namely (Messages, Contacts, Social and Related Service Requests) would appear as shown in screen-shot below:

 

Now we can click on “Save and Close” and the Service Request will be created. Pressing the “Save and Close” button takes us to the Service Request Homepage, where we can view our newly created request.

Scheduling Extract In Oracle HCM Cloud – A Worked Example


Introduction

I hope that most of you are familiar with HCM Extracts by now; if not, feel free to read some of the following articles:

Creating Custom HCM Extract

Migrate Custom HCM Extract

One Extract for All HCM Outbound Integrations

Delta Changes HCM Extract

HCM Extract Components

User Defined Parameter as HCM Extract Filter Criteria

Using Extract Rule Type Attribute for HCM Extract

Using Record Calculation Extract Attribute for HCM Extract

and

Notifying Changes Using HCM Extract

In this article we would like to demonstrate how to schedule a Custom HCM Extract. For this example, we would create a very simple HCM Extract (PersonName) which will have Person Number, Person First Name, Person Last Name, Person Full Name, Person Display Name, Person Name Start Date and Person Name End Date as extract attributes.

We would then schedule this extract to run every 5 minutes (not a real-life scenario, but enough to demonstrate how scheduling works). The delivered scheduling options available for an HCM Extract are Once, Daily, Weekly, Monthly, and PAY_SAMPLE_FLOW_SCHEDULE (a delivered Fast Formula which gives the next scheduled date).

Since there is no delivered schedule option which submits the extract at an interval of 5 minutes, we have to create a custom Fast Formula of the Flow Schedule type.

Pre-requisite

We need two things in place before we can schedule an HCM Extract:

  1. Custom HCM Extract which is to be scheduled

  2. Custom Fast Formula which would schedule the Extract after every 5 minutes.

We would navigate to Data Exchange-> Manage Extract Definitions -> Search for (PersonName Extract)

Creating a Custom Fast Formula (Flow Schedule)

As there is no delivered Fast Formula which can be used to schedule the extract at this interval, we need to create one.

Fast Formula Text

/*****************************************************************************

FORMULA NAME: FiveMinFlowSchedule

FORMULA TYPE: Flow Schedule

DESCRIPTION:  Formula to return a date time.

             Returns NEXT_SCHEDULED_DATE;

Formula Results:

NEXT_SCHEDULED_DATE     This will be a date time value with yyyy-MM-dd HH:mm:ss format.

*******************************************************************************/


/* Inputs */

INPUTS ARE        SUBMISSION_DATE(DATE), SCHEDULED_DATE(DATE)


/* Calculations */

NEXT_SCHEDULED_DATE = ADD_DAYS(SCHEDULED_DATE, 0.0035) /* 0.0035 days is roughly 5 minutes (5/1440 = 0.00347) */

  

/* Returns */

RETURN NEXT_SCHEDULED_DATE


/* End Formula Text */
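The interval arithmetic behind the formula is worth a sanity check: ADD_DAYS takes a fraction of a day, and 5 minutes is 5/1440 of a day. A quick Python check of the values involved (illustrative only, not part of the Fast Formula):

```python
MINUTES_PER_DAY = 24 * 60  # 1440

# Exact day-fraction for a 5-minute interval
exact = 5 / MINUTES_PER_DAY
print(round(exact, 6))  # 0.003472

# The formula uses 0.0035, which is about 5.04 minutes
print(round(0.0035 * MINUTES_PER_DAY, 2))  # 5.04
```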

 

When created, the Fast Formula appears in the UI as below:

 

Now that we have the Extract and Flow Schedule Fast Formula in place we will go ahead and schedule the extract and verify the results.

 

Scheduling Extract

In order to schedule the extract, we need to choose the “Submit Extract” option and then the “Using a schedule” option under flow submission. Here we have chosen the newly defined fast formula FiveMinFlowSchedule and given a start and end time too.

Verifying Results

As a last step, we should choose “View Extract Results”, where we should be able to see all the extracts for this flow.

Notice that the Recurring Flag is set to “Yes”, which means this flow gets resubmitted. If we calculate the gap between the start time and end time, it is 25 minutes, which means there should be 5 submissions, each at an interval of 5 minutes. That’s exactly what you will observe if you pay attention to the Submitted On and Completed On fields in the snapshot above.
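The count can be sanity-checked with a one-liner. Assuming the first submission happens at the start time and each subsequent one fires 5 minutes later, a 25-minute window holds exactly 5 submissions:

```python
window_minutes = 25
interval_minutes = 5

# Submission offsets from the start time: 0, 5, 10, 15, 20
submissions = list(range(0, window_minutes, interval_minutes))
print(len(submissions))  # 5
```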

And with this I have come to the end of the article. I hope this was a new learning and will be useful in your real-time projects.

That’s all for now; have a nice day!

Post Promotion Evaluation in Oracle Demantra PTP- Demo


At its core, Demantra Predictive Trade Planning is a sales and promotion planning system that enables account managers to develop a highly accurate, account-level sales forecast and event plan by simply going through their daily sales planning activities. With Oracle Demantra Trade Promotion Optimization, sales organizations can go beyond volume forecasting and implement best business practices such as advanced promotion evaluation, simulation, and optimization. Account managers can measure and predict base sales, net lift, and indirect effects such as cannibalization and consumer pantry loading. Oracle Demantra Trade Promotion Optimization can take advantage of retailer point-of-sale (POS) and syndicated data to measure, predict, and optimize promotion lift, taking into account effects such as cannibalization and pantry loading. It also automates the process of promotion analysis as it can be done on-the-fly at the level of individual products and accounts. With this capability, account managers and trade marketers can be more effective and efficient with trade funds while providing precision forecasts to the entire company.

 

Post Promotion Evaluation in Oracle Demantra

 

Enrol Now for Oracle Demantra Predictive and Planning Training

 

Canceling Scheduled Extract in Oracle HCM Cloud Application


Introduction

In the previous article, Scheduling Extract in Oracle HCM Cloud Application - A Worked Example, we discussed how to schedule an HCM Extract at an interval of 5 minutes. But at times we might have to make changes to the existing extract, and the schedule then needs to be cancelled.

In this article we would try to demonstrate how to cancel a scheduled extract.

Pre-requisite

As a pre-requisite, we need an existing scheduled extract, which we will then cancel.

For this example, I have scheduled the PersonName extract to re-run every 5 minutes (the screenshot below confirms the same).

We would submit the extract with Flow Name as “5January2018PersonNameFlowSubmission_DemonstrateCancellation”  using the FiveMinFlowSchedule Fast formula (which ensures that the next scheduled date is current date and time + 5 minutes).

We could see that there is a 2 hour difference between the start and end dates.

 

We can cancel the schedule by using following navigation:

Scheduled Processes -> HCM Flow Secured, then select “Cancel Process”; the status will change from Wait to Canceled.

Notice that the scheduled time of this process is 1/5/18 6:37 AM. If we search for all extract runs on or after this date in the “View Extract Results” section, we will find only one scheduled process, which will be in the ‘Scheduled’ state. No further schedules are displayed either.

 

So this is how we can cancel a scheduled extract.

Oracle Fusion HCM Cloud Approval Management (AMX)- Demo


Use approval management to determine the policies that apply to approval workflows for particular business objects, such as expense reports. You can use the BPM Worklist to review and configure approval policies for any HCM task; however, it is recommended that you configure approval policies for the Hire an Employee, Promote, Transfer, and Terminate tasks using the Manage Approval Rules and Notifications for Human Capital Management interface.
Predefined approval rules exist for many Oracle Fusion Global Human Resources tasks. In most cases, approval by the first-level and second-level managers of the person who submits the transaction is required; however, you can create different approval rules for any task. The product documentation identifies the Global Human Resources tasks that have predefined approval rules, as well as the attributes enabled for use in custom approval rules; attributes that occur in both employment terms and assignments are enabled in both. When using the Manage Approval Rules and Notifications interface, you can specify one or more approval rules for each approver type. To create additional approval rules, you either add a new rule or duplicate a selected rule and edit it as appropriate. When you create multiple approval rules for an approver, they are evaluated individually, in an undefined order.

Oracle Fusion HCM Cloud Approval Management (AMX): 

 

 

Enrol now for Oracle Fusion HCM Trainings

Creating BI Publisher Parameterized Report using Oracle BI EE as Data Source


Introduction

By now I assume most of us are aware of how to create a BI Publisher Report in Oracle HCM Cloud using a SQL Query, but at times a BI Publisher Report needs to be created from an OTBI Subject Area.

 

Advantages of using OTBI Subject Area in Fusion BI Report:

  1. An OTBI Analysis takes care of security internally (meaning only the records which an individual is supposed to view are displayed)

  2. OTBI is comparatively simple as one can create reports by drag-drop feature (in case of simple reports)

  3. Creating a BI Publisher Report from an OTBI Analysis gives us the flexibility to design the Report Output as per specific business needs

  4. Business Users often want to schedule/run all kinds of reports, be it BIP or OTBI, from a single work-area. There are two ways to accomplish this, namely:

  1. Create Custom Payroll Flow Pattern

  2. Create Custom ESS Jobs

 

Both of the above options (at this point of time, as per my limited knowledge) are supported only if there exists a BI Publisher Report (.xdo). So for all the above reasons it does make sense to explore whether we can have a BI Publisher report created using Oracle BI EE as a data source.

 

Creating a Simple SQL Query Based Data Model using Oracle BI EE as Data-source

For this example we would create a very simple report comprising the following fields:

  1. Person Number

  2. Display Name

  3. Full Name

  4. Known As

 

We would use the “Workforce Management – Person Real Time” subject area and apply one condition to ensure that we only pick the Person Name records which have Name Type “GLOBAL”.

Additionally, we would configure a mandatory parameter (Person Number) with a Person Number LOV attached to it.

 

So let’s get started.

 

First Step

As a first step, we need to log in to the application with valid credentials (the HCM_IMPL2 user for this example) and navigate to the following location:

 

Navigator-> Tools -> Reports and Analytics -> BI Catalog -> Create New Data Model

Once there, we need to choose “SQL Query” as the Data Set option and select “Oracle BI EE” as the data source.

One may use the Query Builder option to pick and choose the fields of interest:

Once done, the Data Set Query appears as shown in the snapshot below (we have added a where clause on Name Type ‘GLOBAL’ and also a bind variable named personnumber).

 

 

The SQL Query used is below:

SQL Query

select "Person Details"."Person Number" as "Person Number",
       "Person Names"."Display Name" as "Display Name",
       "Person Names"."Full Name" as "Full Name",
       "Person Names"."Known As" as "Known As"
from "Workforce Management - Person Real Time"."Person Names" "Person Names",
     "Workforce Management - Person Real Time"."Person Details" "Person Details"
where "Person Names"."Name Type" = 'GLOBAL'
and "Person Details"."Person Number" = :personnumber

Also we have a parameter which has a LOV attached to it.

LOV Query for Person Number

select "Person Details"."Person Number" as "Person Number"
from "Workforce Management - Person Real Time"."Person Details" "Person Details"
order by "Person Details"."Person Number" asc

 

Creating a BI Publisher Report in Fusion for OBIEE Model

While I hope that most of you (or at least those already acquainted with BI Publisher Report creation from one of the previous articles) would be able to create the below BIP Report from the SQL query, those who would like a quick refresher may read the article here.

 

We would next need to create a BI Report using this Data Model and the final output would appear as below:


OTBI Reporting in Oracle Fusion Cloud Financials- Demo


OTBI is a set of pre-seeded yet customizable analysis structures that Oracle Applications users can access to create ad hoc reports, dashboards, and alerts to aid daily decision-making. It offers OBIEE features such as saving reports, sending alerts, and the ability to trigger certain business process actions within the Oracle Applications.

 

OTBI Reporting in Oracle Fusion Cloud Financials

 

Enrol now for Fusion Financials Cloud Reporting Training

Using Image and Static Text in OTBI Analysis


We can use images, logos, static text, headers, footers, etc. in an RTF Template attached to a BI Data Model, which gives a professional look to our reports. You will be delighted to know that most of these features can be applied to an OTBI Analysis too.

Recently I was trying to build a Birthday Alerts Notification Mailer using OTBI and I did use a Logo and Static Text to give the email a pleasant look.

 

Let’s see how I accomplished the same.

 

Creating the OTBI Analysis

I created a simple Analysis using “Workforce Management - Worker Assignment Real Time” Subject Area and chose the Worker Folder to fetch the data attributes required.

The data attribute details are as follows:

Attribute Name: Person Number
Data Source: "Worker"."Person Number"

Attribute Name: Display Name
Data Source: "Worker"."Employee Display Name"

Attribute Name: BirthDay
Data Source:

CONCAT(CONCAT(CAST(DAYOFMONTH("Worker"."Employee Date Of Birth") AS CHAR(2)), '-'),
CASE MONTH("Worker"."Employee Date Of Birth")
WHEN 1 THEN 'Jan'
WHEN 2 THEN 'Feb'
WHEN 3 THEN 'Mar'
WHEN 4 THEN 'Apr'
WHEN 5 THEN 'May'
WHEN 6 THEN 'Jun'
WHEN 7 THEN 'Jul'
WHEN 8 THEN 'Aug'
WHEN 9 THEN 'Sep'
WHEN 10 THEN 'Oct'
WHEN 11 THEN 'Nov'
ELSE 'Dec' END)
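The same day-month formatting can be sketched in Python (purely illustrative, outside of OTBI; the function name is mine):

```python
from datetime import date

MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
          'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']

def birthday_label(d: date) -> str:
    """Return a 'day-Mon' label, e.g. 5-Jan, mirroring the expression above."""
    return f"{d.day}-{MONTHS[d.month - 1]}"

print(birthday_label(date(1990, 1, 5)))    # 5-Jan
print(birthday_label(date(1985, 12, 25)))  # 25-Dec
```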

Adding an Image

 

Next, we need to add an image. For this we click the “Edit” option (Pencil icon) on the title page, which takes us to the following screen:

 

 

Notice that we have selected the Logo checkbox, which then gives an option to upload an image. I already had an image saved on my local machine and have uploaded it here (please ensure the file size is less than 50 KB). Once done, we can save the analysis and return to the layout section.

 

Adding Static Text

 

In order to add Static Text to an Analysis, one needs to create a new View of the “Static Text” type and add it to the layout. The static text details are populated and the screen appears as below:

Viewing the Analysis

 

Once both the Logo image and the Static Text section have been added, we can save the analysis and open it. The OTBI analysis appears as below:

Uploading BI Report Output to Content Server


Introduction

One of the major requirements of an ERP application is to facilitate data transfer to and from third-party applications. While some organizations are fine with an Export/Import feature, the recommended approach is to transfer the extracted data (a flat file) to a Unix server, and then, either via a push mechanism (from the source system) or a pull mechanism (from the consumer application), transfer the flat file to the target application system.

While HCM Extracts (the preferred outbound tool) have long supported uploading files to the Content Server (also referred to as the Universal Content Management, or UCM, server), starting with Release 13 we can upload the output of a BI Report to the UCM server too.

In this article we demonstrate how to upload the file to the content server.

So let’s get started.

Uploading BI Report Data to Content Server

For this example we would use an existing report (BirthdayLetterReport), use the schedule option, and fill in the following details in the Destination section:

 

Destination Type: Content Server
Output: All
*Server: FA_UCM_PROVISIONED
Security Group: FAFusionImportExport
Account: (left blank)
Author: HCM_IMPL
Title: BirthdayAlert
File Name: BirthdayAlertFile
Comments: Sample Schedule to Upload BI Report Output to Content Server
Include Custom Metadata: Checked

 

Once populated the UI screen would appear as below:

 

We need to check that the job has completed successfully (Job Name was “UploadJob”)

 

Verification

Now that we have seen the BI Report job complete successfully, we would validate whether the file actually got uploaded to the content server. We can perform this verification in two ways:

  1. From File Import and Export Link

 

We would need to navigate to the following menu item:

Navigator -> Tools -> File Import and Export, and then search for the file “BirthdayAlertFile”.

 

 

  2. From Content Server Link

We would need to type the content server link in the web browser and then search for the file.

The content server link is https://host:port/cs

 

Loading Worker Record Using HDL Template


Introduction

Data migration is one of the most important yet most challenging tasks in any ERP implementation project. Different ERP systems provide different tools for loading data. Oracle HCM Cloud also provides multiple ways to load data into the system. The available options to date (as of Application Release 13) are:

  1. Manual Data Entry via UI

  2. HDL (HCM Data Loader)

  3. HSDL (HCM Spreadsheet Data Loader)

  4. Web-Service

Of all the above options, HDL is the recommended way of loading a huge volume of data and is used for inbound integration. HDL also supports data load via web services (one web service loads the HDL files into the UCM server, and a second loads the data using the HCM Data Loader service).

In this article I will try to explain how to load some worker records into the HCM System. We would need to perform certain steps in order to achieve the same. They are:

  1. Configure Source System Owner

  2. Identify a pre-existing Legal Employer and Business Unit to which all the worker records would be associated.

  3. Load specific workforce structure components (namely Location, Organization, Job, Grade and Position) which would be associated with Worker’s Assignment Details

  4. Load Worker Records without Assignment Supervisor Details

  5. Load Assignment Supervisor Details.

Let-us get started then:

1. Configure Source System Owner

For this example, we would add a Source System Owner (HA) which will be active from 01-Jan-2018 to 31-Dec-4712. We need to log in to the application with a user having appropriate privileges and then follow the navigation below:

Navigator -> Setup and Maintenance -> (Click on Task List and click on “Search”)

Once you click on the “Search” link, search for “Manage Common Lookups”, then search for the Lookup Type “HRC_SOURCE_SYSTEM_OWNER” and add a new entry as shown:

Once added we can save the record.

2. Identifying Pre-Existing Legal Employer and Business Unit

We assume that the basic configuration has already been done in the HCM system and at least one Legal Employer and Business Unit have been configured (if not, one may configure them via the UI). In this example we have checked and found that there exists a Legal Employer named “US1 Legal Entity” and a Business Unit named “US1 Business Unit”. We can verify the existence of the Legal Employer by searching for it (US1 Legal Entity) using the Manage Legal Entity HCM Information task under Setup and Maintenance.

In order to verify the existence of the Business Unit, we search for it (US1 Business Unit) using the Manage Business Units task under Setup and Maintenance.

And with this we have successfully verified the existence of Legal Employer and Business Unit details which would be used in the subsequent steps.

3. Loading Basic Work-Structure Components (Location, Organization, Job, Grade and Position)

In this step we would manually create HDL templates and load them into the application.

We will have five different files, namely:

  1. Location.dat

  2. Organization.dat

  3. Job.dat

  4. Grade.dat

  5. Position.dat

We will zip all these files into WorkstructureComponents.zip. One point to note: the files need to be zipped directly at the root of the archive; if you keep the files inside a folder, you will encounter an error during the file load.
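The flat-zip requirement above can be sketched in Python. The file names are the ones listed earlier; the key point is writing each file at the archive root, with no folder component in its archive name:

```python
import os
import zipfile

def zip_flat(zip_path, file_paths):
    """Zip files at the archive root (no folders), as HDL expects."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in file_paths:
            # arcname strips any directory part, so the entry sits at the root
            zf.write(path, arcname=os.path.basename(path))

dat_files = ["Location.dat", "Organization.dat", "Job.dat",
             "Grade.dat", "Position.dat"]
# zip_flat("WorkstructureComponents.zip", dat_files)
```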

Next we need to click on “Data Exchange” and then on “Import and Load Data”.

Verification

We can see that the load has been successful, but it still makes sense to search for these records for verification (navigate to Navigator -> My Workforce -> Workforce Structures).

We can clearly see that all the work-structure components have been successfully uploaded into the application and we could view the same from the UI too.

4. Load Worker Records without Assignment Supervisor Details

Now we continue with the next step, where we load the worker records. The file we are trying to load contains many metadata sections:

a) Metadata (Worker)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_PER<SEQ> (example HA_PER1, HA_PER2)
PersonNumber: HA<SEQ> (example HA1, HA2...)
ActionCode: HIRE

b) Metadata (PersonName)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_PERNAME<SEQ> (example HA_PERNAME1, HA_PERNAME2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
NameType: GLOBAL

c) Metadata (PersonNationalIdentifier)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_NID<SEQ> (example HA_NID1, HA_NID2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
NationalIdentifierType: SSN

d) Metadata (PersonEmail)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_EMAIL<SEQ> (example HA_EMAIL1, HA_EMAIL2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
EmailType: W1

e) Metadata (PersonPhone)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_WPH<SEQ> / HA_HPH<SEQ> (example HA_WPH1, HA_HPH1)
PhoneType: W1 / H1 (W1 for Work Phone, H1 for Home Phone)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)

f) Metadata (PersonAddress)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_ADDR<SEQ> (example HA_ADDR1, HA_ADDR2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
AddressType: HOME

g) Metadata (PersonCitizenship)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_CTZSHIP<SEQ> (example HA_CTZSHIP1, HA_CTZSHIP2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)

h) Metadata (PersonEthnicity)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_ETHNIC<SEQ> (example HA_ETHNIC1, HA_ETHNIC2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
Ethnicity: 1, 2, 8, 6, 7 (check for valid codes in the application)

i) Metadata (PersonLegislativeData)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_LEG<SEQ> (example HA_LEG1, HA_LEG2)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
LegislationCode: US

j) Metadata (WorkRelationship)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_PDSERVICE<SEQ> (example HA_PDSERVICE1)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
LegalEmployerName: US1 Legal Entity
WorkerNumber: HA<SEQ> (example HA1, HA2...)

k) Metadata (WorkTerms)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_WRKTERM<SEQ> (example HA_WRKTERM1)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
PeriodOfServiceId(SourceSystemId): HA_PDSERVICE<SEQ> (example HA_PDSERVICE1)
ActionCode: HIRE
AssignmentNumber: HA_WRKTERM<SEQ> (example HA_WRKTERM1)
AssignmentType: ET / CT (ET for Employee, CT for Contingent Worker)
BusinessUnitShortCode: US1 Business Unit
LegalEmployerName: US1 Legal Entity
SystemPersonType: EMP / CWK (EMP for Employee, CWK for Contingent Worker)

l) Metadata (Assignment)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_ASSIGN<SEQ> (example HA_ASSIGN1)
PersonId(SourceSystemId): HA_PER<SEQ> (example HA_PER1, HA_PER2...)
ActionCode: HIRE
AssignmentName: E-HA<SEQ> (example E-HA1, E-HA2)
AssignmentNumber: HA_ASSIGN<SEQ> (example HA_ASSIGN1)
AssignmentType: E / C (E for Employee, C for Contingent Worker)
BusinessUnitShortCode: US1 Business Unit
LegalEmployerName: US1 Legal Entity
SystemPersonType: EMP / CWK (EMP for Employee, CWK for Contingent Worker)
JobId(SourceSystemId): HA_JOB<SEQ> (example HA_JOB1)
LocationId(SourceSystemId): HA_LOC<SEQ> (example HA_LOC1)
OrganizationId(SourceSystemId): HA_DEPT<SEQ> (example HA_DEPT1)
PositionId(SourceSystemId): HA_POS<SEQ> (example HA_POS1)
GradeId(SourceSystemId): HA_GRADE<SEQ> (example HA_GRADE1)
PeriodOfServiceId(SourceSystemId): HA_PDSERVICE<SEQ> (example HA_PDSERVICE1)
WorkTermsAssignmentId(SourceSystemId): HA_WRKTERM<SEQ> (example HA_WRKTERM1)

m) Metadata (Assignment Supervisor)

We would not load the Assignment Supervisor at the first instance of worker load (as already mentioned in the beginning). Details about important attributes of this metadata would be discussed later when we load the assignment supervisor details.

n) Metadata (AssignmentWorkMeasure)

Details about important fields:

SourceSystemOwner: HA
SourceSystemId: HA_WRKFTE<SEQ> / HA_WRKHEAD<SEQ> (example HA_WRKFTE1 / HA_WRKHEAD1)
AssignmentId(SourceSystemId): HA_ASSIGN<SEQ> (example HA_ASSIGN1, HA_ASSIGN2...)
ActionCode: HIRE / ADD_CWK (HIRE for Employee, ADD_CWK for Contingent Worker)
Unit: FTE / HEAD (FTE for Full Time Equivalent, HEAD for Headcount)

o) Metadata (PersonUserInformation)

Details about important fields:

PersonNumber: HA<SEQ> (example HA1, HA2, ...)
UserName: <FirstName>.<LastName>@<sourcesystemowner>.com (example: Adam.Baro@ha.com)
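To make the key-field mappings above concrete, here is a minimal, hypothetical Worker.dat fragment generator in Python. The pipe-delimited METADATA/MERGE layout is standard HDL, but the attribute set shown is deliberately trimmed for illustration; a real load needs effective dates and the other components discussed above:

```python
def worker_dat_lines(num_workers, owner="HA"):
    """Build a minimal Worker.dat fragment: one METADATA line naming the
    attributes, then one MERGE line per worker using the HA_PER<n>/HA<n>
    surrogate-ID convention from the mapping tables above."""
    lines = ["METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|ActionCode"]
    for n in range(1, num_workers + 1):
        lines.append(f"MERGE|Worker|{owner}|{owner}_PER{n}|{owner}{n}|HIRE")
    return lines

for line in worker_dat_lines(2):
    print(line)
# METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|ActionCode
# MERGE|Worker|HA|HA_PER1|HA1|HIRE
# MERGE|Worker|HA|HA_PER2|HA2|HIRE
```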

Now that we have an idea of the various business entities and their key data fields (along with the mapping logic used for them), we can proceed with creating a zip file (InitialWorkerLoad.zip) and uploading it to the application.

We can see that we uploaded the files twice; during the first run we encountered some errors and only 7 user records got created. In the second run, the remaining 3 user records, along with the entities which failed during the first run, were loaded.

Verification

In this case we had to run the process multiple times and none of the runs were 100% successful. We will hence verify the load results by searching for person records using Person Management.

Now we can see that all 10 records are present, along with the National ID, Business Unit, Country, Department, and Location details. At this point it also makes sense to check that the Assignment Supervisor records are not populated. We check Becky Camelo’s Employment page to verify this.

Also, as per the data we have loaded, Adam Baro is the topmost manager of all the employee records. But as we have not loaded the Supervisor details yet, we should not see any employee reporting to Adam in the Organization Chart (navigation: Navigator -> Directory -> Directory, then search for Adam Baro; once the name appears, click on it and then on “View in Organization Chart”).

5. Load Assignment Supervisor Details

Now in this step we load the Assignment Supervisor details. This business entity is loaded using the Worker.dat object, and the metadata is AssignmentSupervisor.

Metadata (AssignmentSupervisor)

Details about important fields:

SourceSystemOwner: HA

SourceSystemId: HA_MANAGER<SEQ> (example: HA_MANAGER1)

AssignmentId(SourceSystemId): The unique assignment record for which the supervisor is being assigned. HA_ASSIGN2 means that the assignment record with AssignmentId value HA_ASSIGN2 is getting a supervisor associated.

ManagerAssignmentId(SourceSystemId): AssignmentId of the employee who is being assigned as supervisor.

ManagerId(SourceSystemId): PersonId of the employee who is being assigned as supervisor.

ManagerType: Type of supervisor (Line Manager, HR Manager, etc.). LINE_MANAGER for this example.

PersonId(SourceSystemId): PersonId of the employee who is getting a supervisor assigned.
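Putting these fields together, a minimal AssignmentSupervisor section of Worker.dat might look as below (the METADATA attribute order is illustrative, and the EffectiveStartDate value and the HA_PERSON*/HA_ASSIGN1 source IDs are hypothetical placeholders for this example):

```
METADATA|AssignmentSupervisor|SourceSystemOwner|SourceSystemId|EffectiveStartDate|AssignmentId(SourceSystemId)|PersonId(SourceSystemId)|ManagerAssignmentId(SourceSystemId)|ManagerId(SourceSystemId)|ManagerType
MERGE|AssignmentSupervisor|HA|HA_MANAGER1|2018/01/01|HA_ASSIGN2|HA_PERSON2|HA_ASSIGN1|HA_PERSON1|LINE_MANAGER
```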

Now we will try to load this file and check the load status.

From the screenshot we can infer that the load has been successful. We would try to verify the results now.

Verification

We can verify the results in two ways:

  1. Verify Manager Details from Manage Employment

We can navigate to the Manage Employment for Person Number HA2 and now we should see the Manager Details populated (HA1 should be visible as the manager)

Alternatively, one may check the Organization Chart for Adam Baro, where all the reportees would be visible too.

2.  Verify Manager Details Using Organization Chart

We may navigate to Navigator -> Directory -> Directory, select Adam Baro as the Person Name, and then click on “View in Organization Chart”. Next click on the Print button and choose the appropriate configuration (levels to display, orientation, and fields to display). For this example I chose Levels to Display as 3, Orientation as Vertical, and Fields to Display as All, and the following organization chart appears:

From this it is clear that while no individuals reported to Adam before the supervisor assignments were uploaded, that is no longer the case once we load the AssignmentSupervisor file. This confirms that the supervisor details have been successfully associated with the worker records.

Inference / Summary

With this we have come to the end of the article. I hope I was able to explain the basic steps required to load worker records into the system and associate supervisor details with those assignments.

I hope this was a good learning and you guys had a great time during the session.

Thank You all for your time and have a nice day ahead!

Creating BI Publisher Data Model Using Secured List Views


Introduction

While creating a Business Intelligence Publisher data model with physical SQL, we have two options (listed below):

1. Select data directly from a database table, in which case the data returned isn't subject to data-security restrictions, because data models can be created on unsecured data.

2. Join to a secured list view in your select statements. The data returned is determined by the security profiles that are assigned to the roles of the user who's running the report.

While at times there is a need to fetch the complete details from the database (option 1), at other times the requirement is that users should only fetch the data they are entitled to view. In such cases, using a secured list view comes in handy.

In this article, we would try to understand the impact of the Result Set returned by a SQL query if we use a Secured List View as compared to the database table.

Most of the commonly used tables have a corresponding secured list view available. The complete details can be found here.

For this example, we would create a very simple SQL query making use of a database table (PER_ALL_PEOPLE_F) and the secured list view (PER_PERSON_SECURED_LIST_V) corresponding to it.

Since we are trying to demonstrate the impact of using a secured list view in a SQL data model, the easiest way of showcasing this is to compare the total number of records returned through the secured list view with that returned by the direct database table.

So at broad level we would perform the following steps:

  1. Create a simple Data Model which would display the Table Name and Record Count for PER_ALL_PEOPLE_F and also for PER_PERSON_SECURED_LIST_V

  2. Next we would run this data model as an implementation user (say HCM_IMPL) and we expect the record counts to be the same (HCM_IMPL is a user who has all the roles available in the HCM area)

  3. Next we would run the same data model as a named user (say JAMES.AARON) and we expect the record count of the data row corresponding to PER_PERSON_SECURED_LIST_V to be lower. The record count of the data row corresponding to PER_ALL_PEOPLE_F should return the same value as in step 2.
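Conceptually, a secured list view behaves like the base table filtered by the data-security predicate of the session user. The steps above can be sketched as a toy Python model (the person IDs and profile contents are made up for illustration):

```python
# Unsecured base table: every row is visible to a direct query.
PER_ALL_PEOPLE_F = [{"person_id": i} for i in range(1, 11)]

# Hypothetical security profiles: which person_ids each user's roles expose.
SECURITY_PROFILES = {
    "HCM_IMPL": set(range(1, 11)),  # implementation user sees everyone
    "JAMES.AARON": {1, 2, 3},       # named user sees only a subset
}

def secured_list_view(table, username):
    """Rows visible through the secured view for the given session user."""
    allowed = SECURITY_PROFILES.get(username, set())
    return [row for row in table if row["person_id"] in allowed]
```

A COUNT(*) through the view then differs per user, which is exactly what the record-count comparison demonstrates.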

So without much delay let’s get started.

Creating a Simple Data Model

We would be creating a simple data model which would comprise of the following fields:

  1. Table Name

  2. Record Count

The SQL query used is:

SQL Query:

SELECT 'PER_ALL_PEOPLE_F' TABLENAME,
       (SELECT COUNT(*) FROM PER_ALL_PEOPLE_F) recordcount
FROM   PER_ALL_PEOPLE_F
UNION
SELECT 'PER_PERSON_SECURED_LIST_V' TABLENAME,
       (SELECT COUNT(*) FROM PER_PERSON_SECURED_LIST_V) recordcount
FROM   PER_PERSON_SECURED_LIST_V

We would save this Data Model in the Shared Folder as RecCount_dm

Running the BI Data Model with logged in user as HCM_IMPL

In this step we would log in as the HCM_IMPL user and run the Data Model. We expect to see the same RecordCount values for the PER_ALL_PEOPLE_F and PER_PERSON_SECURED_LIST_V database objects.

We can see that the RecordCount for both data rows is the same (3690 records).

Running the BI Data Model with logged in user as JAMES.AARON

We would log in to the application as the JAMES.AARON user, navigate to the BI Data Model and view the data. The results are captured in the screenshot below:

We can see that the data row corresponding to PER_ALL_PEOPLE_F has a RecordCount value of 3690 (unchanged from the previous run), while the row corresponding to PER_PERSON_SECURED_LIST_V returns 11.

Inference

So now we have seen that, using the same SQL query and the same tables, we get different results depending on the roles assigned to the logged-in user.

While the admin user, who has all the roles, gets the same record count from both the database table and the secured list view, the user who has access to only specific data gets a much lower record count from the secured list view row. This is in line with secured list view behaviour, where the data returned is determined by the security profiles assigned to the roles of the user who's running the report.

I hope the above post clearly establishes this.

While I have tried this with one database table and its secured list view, one can try any other table and secured list view pair and the results would be similar.

With this, I have come to the end of my article and I hope I was able to explain the concept clearly.

Thanks for your time and have a nice day!

Project Budgeting Overview in Oracle PPM Cloud- Demo


A budget or forecast version represents a specific planning scenario created with a financial plan type. For example, a cost budget that is based on a set of proposed contract terms, or a cost forecast that is based on an engineering estimate.

Select one of the following methods to create budget or forecast versions.

Generate amounts based on quantity from another financial plan or the project plan.

Copy amounts from another budget or forecast version.

Manually enter amounts for budget or forecast lines.

Project Budgeting Overview in Oracle PPM Cloud

 

Enrol Now for Oracle Fusion PPM Cloud Training

Displaying Total Rows Fetched By OTBI Analysis, Logged In User, Current Date and Time In Oracle Fusion Cloud Application


Introduction

I assume most of us are aware that we may use the COUNT(<unique_attribute>) expression in either the SQL query or the BI template to display the total number of records fetched by a SQL query in a BI Report. Things, however, are not that simple if one tries to display the same in an OTBI Analysis; one needs to perform a series of steps to achieve this. We would also display the currently logged-in username along with the current date and time.

On a high level the steps are:

  1. Add a custom attribute to get MAX(RCOUNT(1)) in the OTBI Analysis

  2. Add a custom attribute to get username using USER() function

  3. Add a custom attribute to get currentdate using CURRENT_DATE function

  4. Add a custom attribute to get currenttime using CURRENT_TIME function

  5. Add a Narrative View and choose explicit settings in the same (described in detail in this article later)

Add Custom Attribute MAX(RCOUNT(1))

We already have a custom report created, and we would just add the MAX(RCOUNT(1)) field to the analysis. RCOUNT(1) assigns a running count to each row returned, so MAX(RCOUNT(1)) evaluates to the total number of rows.

One additional thing to note is the column sequence (the position at which we have placed the column in the OTBI Analysis).

That is, if I have two columns, say PersonNumber and MAX(RCOUNT(1)), and I have placed MAX(RCOUNT(1)) in the second place, then its column sequence is 2.

For my example the column sequence for MAX(RCOUNT(1)) is 16.

Add Custom Attribute USER()

We already have a custom report, and we would add a new column with the USER() expression.

The column sequence is 12

Add Custom Attribute CURRENT_DATE

We would need to add a new column CURRENT_DATE

The column sequence for this data field is 11.

Add Custom Attribute CURRENT_TIME

We will add a new attribute with the CURRENT_TIME expression.

The column sequence for this data field is 10.

Add a Narrative View and make special settings

Now that we have added the total count column, we need to add a Narrative view.

Important properties to set:

Use @<column_sequence> which will display the actual value stored in the column.

Set the Rows to Display value to 1. (This ensures that this field would be displayed just once even though the same actually gets repeated for each of the data row).
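For instance, with the column sequences used in this example (@10 = current time, @11 = current date, @12 = username, @16 = MAX(RCOUNT(1))), the narrative text could read as below (the wording is illustrative):

```
Report run by @12 on @11 at @10
Total Number of Rows: @16
```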

The application screenshot once we make the changes appears as below:

Verification

We would quickly try to run the analysis for different values and try to verify the results.

And now we will change the parameter values and the Total Number of Rows would change too.

Inference / Summary

So this is how we can display the total number of rows fetched by an OTBI Analysis, the logged-in username, and the current date and time by making use of a Narrative view. One should, however, make sure to set the Rows to Display property to 1; otherwise multiple records will be displayed. And with this I have come to the end of the article.

Hope this was a good read and will be useful.

Thanks all for your time and have a nice day ahead!


Passing Multiple Values from Report Parameter


Introduction

So far we have created a large number of BI data model reports (SQL as data source), passing either all parameter values or a single specific value, but at times there is a need to pass multiple values.

In this article we will try to explore this specific feature where-in we can pass multiple values to a Report parameter.

Worked Out Example

We would create a very simple BI Report to demonstrate this. The SQL query will fetch the person number field, and the same would be used as a parameter (Menu type).

SQL Query

SELECT papf.person_number
FROM   per_all_people_f papf
WHERE  TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
AND    (papf.person_number IN (:personnumber) OR 'All' IN (:personnumber || 'All'))

The Data Model once created would appear as below:

Creating Menu Type Parameter

We would need to create a Menu Type Parameter and for this we would need to perform the below steps:

  1. Create a List of Values

The following details would be required for the same:

*Name: PersonNumberLOV
Type: SQL Query
Data Source: ApplicationDB_HCM
SQL Query:
SELECT papf.person_number
FROM   per_all_people_f papf
WHERE  TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date

Once the details are populated in the Application the screenshot would appear as below:

  2. Associate the List of Values with the Parameter

Now that we have created the List of Values, we need to associate it with the personnumber parameter. The following details need to be added in the parameter section:

*Name: personnumber
Data Type: String
Default Value: (blank)
Parameter Type: Menu
Row Placement: 1
Display Label: PersonNumber
List of Values: PersonNumberLOV
Number of Values to Display in List: 100
Options: Multiple Selection: Checked
Options: Can select all: Null Values Passed: Checked
Options: Can select all: All Values Passed: (blank)
Refresh Other Parameters on Change: Checked

The above details when captured in screenshot would appear as below:

Verification

Now we have completed all the required setups and as a last step we would try to run the report for the below three cases (which will cover all scenarios):

  1. Pass a Single Value

Although we have configured the parameter to allow multiple values, the report should still work fine when we pass a single value.

  2. Pass Multiple Values

For this specific scenario we would pass multiple values and check how the report behaves.

  3. Pass All Values

In this scenario we would run the report for the “All” value.

One important thing to note for this option is that one would need to change the value of the Rows attribute from the default of 5 to a higher value (I chose 100 for this example); otherwise it would display only 5 rows and one might feel that the report is not working as intended.

Inference / Summary

With this I have come to the end of the article, and I hope it was a nice feature to get familiar with. Now, instead of running a report for a specific value, we may run it for multiple values and analyze the results in one go.

Thanks for your time, have a nice day!

Uploading Data Files to UCM Content Server of Oracle Fusion Cloud Using a Standalone Java Application


Introduction

In a previous article we have seen how to load worker records into the Oracle Cloud Application using the user interface (Navigator -> Tools -> Import and Export), but many a time the source system (the legacy application from which we are loading the source data) wants to load the files directly into the server without having an individual manually log in to the application.

Also, for security reasons, some organizations prefer not to give access to the application pages and instead prefer some other way of performing the file transfer.

It may sound odd at first, but there are reasons, such as:

  1. Organizations have a separate task force for maintaining the ERP application and at times do not want to give application access to the technical team (to ensure they cannot view sensitive data)

  2. Technical team members at times also prefer a simple Java-based UI where they only populate the required fields and get a confirmation

In such situations one may either build a new Java application or use an existing one provided by the Oracle HCM Centre of Excellence. The tool is named HDLdi (HCM Data Loader Desktop Integrator). Detailed documentation can be found in the below MOSC article:

HDLdi

You may even wish to download the associated files from link below:

<Link to Download HDLdi Application>

Worked Out Example

Once we download the application, we click on HDLdi2.3.11.

The below page appears:

Next we need to click on the “Setup” link and populate the following details:

Protocol: Soap
UCM Server URL: https://<hostname>:<port>/idcws/GenericSoapPort?wsdl
UCM Username: <UserName>
UCM Password: <UserPassword>
Inbound Folder: hcm/dataloader/import
Outbound Folder: hcm/dataloader/export
WSDL URL: https://<hostname>:<port>/hcmCommonDataLoader/HCMDataLoader?wsdl
HCM Username: <UserName>
HCM Password: <UserPassword>
Sequence: 1

 

The application page would appear as below:

Next we need to click on “File Upload” and point to the zip file (which contains the Worker object).

Once we have provided the location of the file, we would need to click on the “UCM” button and then on “Refresh”. We would get the details from the server:
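Under the hood, the check-in performed against GenericSoapPort is a CHECKIN_UNIVERSAL service call. The sketch below renders a plausible envelope for it; treat the element names, the FAFusionImportExport security group, and the hcm$/dataloader$/import$ account as assumptions to be confirmed against the WSDL, and note that the zip itself travels as an MTOM attachment, which is omitted here:

```python
# Sketch of a GenericSoapPort CHECKIN_UNIVERSAL envelope (structure assumed;
# verify against https://<hostname>:<port>/idcws/GenericSoapPort?wsdl).
ENVELOPE_TEMPLATE = """<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:ucm="http://www.oracle.com/UCM">
  <soapenv:Body>
    <ucm:GenericRequest webKey="cs">
      <ucm:Service IdcService="CHECKIN_UNIVERSAL">
        <ucm:Document>
          <ucm:Field name="dDocTitle">{title}</ucm:Field>
          <ucm:Field name="dDocType">Document</ucm:Field>
          <ucm:Field name="dSecurityGroup">{security_group}</ucm:Field>
          <ucm:Field name="dDocAccount">{account}</ucm:Field>
          <!-- the zip file itself is attached as the primaryFile MTOM part -->
        </ucm:Document>
      </ucm:Service>
    </ucm:GenericRequest>
  </soapenv:Body>
</soapenv:Envelope>"""

def build_checkin_envelope(title,
                           security_group="FAFusionImportExport",
                           account="hcm$/dataloader$/import$"):
    """Render the check-in envelope for a given document title."""
    return ENVELOPE_TEMPLATE.format(title=title,
                                    security_group=security_group,
                                    account=account)
```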

Next we may click on the “File Registration” button, which would invoke the Import and Load process.

Sample Payload:

Response:

So, this is how we can upload files to the UCM server and even invoke the HDL loader service. In the next step we would verify the existence of the file as well as whether the HDL program was invoked in the Oracle Cloud Application.

Verify Existence of File in UCM Server

We can verify whether the file (“SampleFileForUCMUpload.zip”) has been loaded in the application by navigating to Navigator -> Tools -> File Import and Export and performing a search. The results are captured in the screenshot below:

One may also search for the file using the Content Server link, searching for the Title “SampleFileForUCMUpload.zip”. The Content Server link is of the following format:

https://<host>:<port>/cs/idcplg

Verify whether HCM Data Loader Process was Invoked

As a last step we would navigate to Navigator -> My Workforce (Data Exchange) -> Import and Load Data, where a quick search reveals that a data load process was invoked.

Inference / Summary

So, this is how we can load a file to the UCM server and invoke the web service to load data into the application. One can load as many files as required using the tool and may even automate the entire process.

The HDLdi tool gives the end user a simple interface to perform a load and get acquainted with the data load process.

Hopefully this was a good read and will help you in your projects.

Do try this out and share your experience.

That’s it from me for now, have a nice day ahead!

Order to Cash Cycle in Oracle Fusion SCM Cloud- Demo


Modern order-to-cash business flows must operate in a multi-faceted, dynamic environment. These business flows must support:

Multiple sources of order capture
Multiple methods of order fulfillment
Consistency of process governance and order promising
Streamlined customer experience
Seamless integration with billing and finance
To facilitate these business flows, Oracle provides a multichannel order management capability with a central order hub at its core. This hub includes capabilities for order capture and fulfillment orchestration, pricing, product configuration, inventory management, and order promising.

Order to Cash Cycle in Oracle Fusion SCM Cloud:

 

Enrol Now for Oracle Fusion SCM Cloud Training

Order Management In Fusion SCM Cloud- Demo


Order Management Cloud is a Supply Chain Management application that improves order fulfilment for your business processes. It includes predefined integration, centrally managed orchestration policies, global availability, and fulfilment monitoring that can help increase customer satisfaction and order profitability.
You can use Order Management to capture customer demand and fulfil sales orders in the following ways:

Capture customer demand. Capture demand from various channels, such as web, mobile, call centre, direct sales, and partners. Provide the functionality that your users can use during order capture, such as pricing items, determining availability, getting the order status, and so on.

Orchestrate and monitor fulfilment across channels. Coordinate with other Oracle Supply Chain Management Cloud applications during fulfilment , such as Purchasing, Manufacturing, Inventory Management, and other solutions.

Use web services. Integrate Order Management with various systems that reside outside of the Order Management solution. For details, see the Integrating Order Management chapter in the guide titled Oracle SCM Cloud, Implementing Order Management. 

Order Management In Fusion SCM Cloud:


Enrol Now for Oracle Fusion Cloud SCM Training

Comparing Output of a BI Report with OTBI Report In Oracle Fusion Cloud Application


Introduction

One of the most common (and probably the most difficult) questions asked of a technical consultant is whether one should choose an OTBI or a BIP report while building a report in Oracle Cloud Application.

Honestly speaking, there is no straight answer to this. Many a time one would prefer building a BIP report, while at other times an OTBI report would suffice.

Each of the above tools has its own set of pros and cons, but one significant feature (one may call it a disadvantage of using an OTBI report, if one wishes) is that an OTBI analysis always performs an equijoin among the various attribute fields. This means that if we join multiple folders and at any point join a data attribute that doesn't hold data, the entire data row is skipped.

We can illustrate this very well with the help of an example.

Worked Out Example

In this example we will create a very simple report (we will name it the “Benefit Setup Report”) which would contain the following fields:

  1. Program Name

  2. Plan Type

  3. Plan Name

  4. Option Name

Program Name can be considered the topmost data attribute, which can have multiple Plan Types associated with it. Each Plan Type can have multiple Plans, and each Plan may have multiple Plan Options.

Now, there could be some plans which do not have any plan options configured. In such a scenario, if we build an OTBI report consisting of all four fields mentioned above, we would only get records which have data in all the fields; all the rows where Option Name is not configured will not be shown in the OTBI report. However, in this scenario we may make use of a BI report and fetch data even for those records which do not have any plan options configured.
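The behaviour can be illustrated outside the application with a small Python toy (the plan rows are made up; a None option stands for a plan with no options configured):

```python
# Toy result set: one plan has no option configured (None).
rows = [
    {"plan": "Medical Plan", "option": "Employee Only"},
    {"plan": "Medical Plan", "option": "Employee Plus Family"},
    {"plan": "AU Lifestyle Allowance", "option": None},
]

def equijoin(rows):
    """OTBI-style equijoin: any row missing a joined attribute is dropped."""
    return [r for r in rows if all(v is not None for v in r.values())]

def outer_join(rows):
    """BIP-style outer join: every row survives; missing values stay blank."""
    return [{k: ("" if v is None else v) for k, v in r.items()} for r in rows]
```

The equijoin keeps only the two fully populated rows, while the outer join keeps all three, with the missing option shown as blank.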

Creating a BI Report

As a first step, we would create a simple SQL-query-based data set to fetch the program name, plan type, plan name and option name.

SQL Query

SELECT program.name ProgramName,
       plantype.name PlanType,
       planname.name PlanName,
       planoption.name OptionName
FROM   ben_pgm_f program,
       ben_plip_f planprocessing,
       ben_pl_typ_f plantype,
       ben_pl_f planname,
       ben_opt_f planoption,
       ben_oipl_f processingoption
WHERE  program.pgm_id = planprocessing.pgm_id
AND    planprocessing.pl_id = planname.pl_id
AND    plantype.pl_typ_id = planname.pl_typ_id
AND    planname.pl_id = processingoption.pl_id
AND    planoption.opt_id = processingoption.opt_id
AND    TRUNC(SYSDATE) BETWEEN program.effective_start_date AND program.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN planprocessing.effective_start_date AND planprocessing.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN plantype.effective_start_date AND plantype.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN planname.effective_start_date AND planname.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN planoption.effective_start_date AND planoption.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN processingoption.effective_start_date AND processingoption.effective_end_date
UNION
-- Second branch: plans with no options configured; OptionName is returned as NULL
SELECT program.name ProgramName,
       plantype.name PlanType,
       planname.name PlanName,
       NULL OptionName
FROM   ben_pgm_f program,
       ben_plip_f planprocessing,
       ben_pl_typ_f plantype,
       ben_pl_f planname
WHERE  program.pgm_id = planprocessing.pgm_id
AND    planprocessing.pl_id = planname.pl_id
AND    plantype.pl_typ_id = planname.pl_typ_id
AND    planname.pl_id NOT IN (SELECT boiplf.pl_id
                              FROM   ben_oipl_f boiplf
                              WHERE  TRUNC(SYSDATE) BETWEEN boiplf.effective_start_date AND boiplf.effective_end_date)
AND    TRUNC(SYSDATE) BETWEEN program.effective_start_date AND program.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN planprocessing.effective_start_date AND planprocessing.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN plantype.effective_start_date AND plantype.effective_end_date
AND    TRUNC(SYSDATE) BETWEEN planname.effective_start_date AND planname.effective_end_date

The catalog folders used can be downloaded from below link:

<Data Model Catalog Folder>

<BI Report Catalog Folder>

The output of the BI Report would look as shown below:

 

Note that for some records the Option Name field is blank.

Creating OTBI Analysis

Once we are done creating the BI report, we can create an OTBI report using the “Benefits - Setup Real Time” subject area.

Issued SQL

SET VARIABLE PREFERRED_CURRENCY='User Preferred Currency 1';
SELECT
  0 s_0,
  "Benefits - Setup Real Time"."- Plan Basic Details"."Plan Name" s_1,
  "Benefits - Setup Real Time"."- Program Basic Details"."Program Name" s_2,
  "Benefits - Setup Real Time"."Plan Options"."Option Name" s_3,
  "Benefits - Setup Real Time"."Plan Type"."Plan Type Name" s_4
FROM "Benefits - Setup Real Time"
ORDER BY 1, 3 ASC NULLS LAST, 5 ASC NULLS LAST, 2 ASC NULLS LAST, 4 ASC NULLS LAST
FETCH FIRST 75001 ROWS ONLY

The catalog folders can be downloaded from below link:

<OTBI Catalog Folder>

 

Note: We can see that all four fields, namely Program Name, Plan Type Name, Plan Name and Option Name, are populated. This also implies that any record in which any of the above fields had no data (NULL) has been skipped.

Comparing BI Report Data with OTBI Report Data

In this step we would keep a snapshot of both result sets side by side and understand the differences:

 

Inference / Summary

We can clearly see from the above screenshot that while the BIP report fetched records with no Option Name (for plan names “AU Lifestyle Allowance” and “AU SGC Superannuation”), those records did not appear in the corresponding OTBI report.

So we can infer from the above example that while the BIP report allows us to use an outer-join approach to display records (even when some of the data fields hold no data), an OTBI analysis completely skips such records (it works on the equijoin concept).

And with this I have come to the end of this article.

I hope this was an interesting read and you had a good time going through it.

Thanks for all your time, have a great day ahead!


