High Level Overview of Data Integration from Fusion Cloud Applications to Oracle Warehouse

Santosh Raviteja
Jun 12, 2021

BIAPPS 11.1.1.10.2

Overview:

This document gives a high-level overview of how data is extracted from Oracle Fusion Applications in the cloud (SaaS) and loaded into an Oracle warehouse installed in an on-premises environment.

Extracting data from the cloud data source and loading it into the Oracle warehouse is a two-step process.

Step 1: Oracle Fusion HCM cloud data extraction to an external storage service.

Step 2: Data consumption from UCM to Oracle Warehouse.

Abbreviations:

BIACM — Business Intelligence Applications Configuration Manager

BICCC — Business Intelligence Cloud Connector Console

FA — Fusion Application

FTS — Fusion Technology Source

HCM — Human Capital Management

HTTPS — Hyper Text Transfer Protocol Secure

OBIA — Oracle Business Intelligence Applications

OBIEE — Oracle Business Intelligence Enterprise Edition

ODI — Oracle Data Integrator

OTBI — Oracle Transactional Business Intelligence

PLP — Post Load Process

SaaS — Software as a Service

SDE — Source Dependent Extract

SDS — Source Dependent Store

SIL — Source Independent Load

SOA — Service Oriented Architecture

UCM — Universal Content Management Server

VO — View Objects

Data flow from Fusion Cloud — HCM to Oracle Warehouse

Oracle Fusion HCM cloud data extraction to an external storage service:

i. BICCC is used to extract data from the Oracle Fusion HCM cloud to an external storage service, which can be either Oracle Cloud Storage Service or the UCM server.

In Oracle Fusion Applications Release 10 and above, Oracle BI Cloud Connector Console (BICCC) is deployed automatically, and no separate service request is required to deploy it on the Fusion pod.

ii. BICCC can be accessed through an HTTP URL in the following format:

http://<FA OHS Host>:<FA OHS Port>/BIACM

Alternatively, Oracle Cloud support can provide a web link with login details.

iii. BICCC is used to schedule a one-time or recurring data load as per customer requirements. It allows users to schedule a cloud extract for the required VOs based on the extract date provided by the user. It also allows the last extract date to be reset so that a full data load, rather than an incremental load, is performed for the selected offerings (see the sketch below).
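
To make the full-versus-incremental behaviour concrete, the Python sketch below shows how a last extract date typically drives an extract filter. This is a conceptual illustration only; the function name and the LAST_UPDATE_DATE column are assumptions and do not reflect BICCC internals.

from datetime import datetime

def build_extract_filter(last_extract_date):
    # Conceptual sketch: with no last extract date (e.g. after a reset),
    # every row of the VO is extracted (full load); otherwise only rows
    # changed since the last run are extracted (incremental load).
    if last_extract_date is None:
        return "1 = 1"  # full load: no restriction
    return ("LAST_UPDATE_DATE > TIMESTAMP '"
            f"{last_extract_date:%Y-%m-%d %H:%M:%S}'")

print(build_extract_filter(None))                  # full extract
print(build_extract_filter(datetime(2021, 6, 1)))  # incremental extract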

iv. If Fusion cloud is configured as a source system in BI Apps version 11.1.1.10.2 (the current version deployed / to be deployed at PFG), then UCM is the only standard external storage service supported. Users should not select any other option when using BI Apps 11.1.1.10.2.

v. While configuring UCM as the external storage in BICCC, provide the necessary host, port, download folder location, and upload folder location details.

vi. Data is exported to UCM as compressed csv (data), mdcsv (metadata), and pecsv (primary extract) files. Along with the data files, a summary file called MANIFEST.MF is generated; it is renamed to <timestamp>_MANIFEST_<TIMESTAMP>.MF after a successful download via the ODI load plan. The manifest file contains the list of files that make up an extract. The data exported for the enabled offerings can be viewed from the UCM console. A small sketch of inspecting a downloaded extract follows.
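
As a rough illustration of what a downloaded extract looks like on the warehouse side, the Python sketch below reads the manifest and checks that each listed data file is present in the download folder. The download folder path and the manifest line layout (semicolon-separated fields, file name first) are assumptions for illustration; in practice the ODI load plan handles these files.

import os
import zipfile

DOWNLOAD_DIR = "/u01/biapps/ucm_download"  # hypothetical on-prem download folder

def list_extract_files(manifest_path):
    # Assumption: each non-empty manifest line references one data file,
    # with fields separated by semicolons and the file name first.
    with open(manifest_path) as mf:
        return [line.split(";")[0].strip() for line in mf if line.strip()]

for name in list_extract_files(os.path.join(DOWNLOAD_DIR, "MANIFEST.MF")):
    path = os.path.join(DOWNLOAD_DIR, name)
    if name.endswith(".zip") and os.path.exists(path):
        with zipfile.ZipFile(path) as zf:
            print(name, "->", zf.namelist())  # csv / mdcsv / pecsv members
    else:
        print("missing or not a zip archive:", name)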

vii. There is no support for encrypting only the extracted content on UCM. UCM is part of the FA SaaS databases, and all Fusion data can be encrypted using the FA advanced encryption options.

Data consumption from UCM to Oracle warehouse:

i. Once the data is exported to UCM, the Fusion application source details have to be registered in the BIACM console under “Define Business Intelligence Applications Instance” in order to proceed with the data extract from UCM through the ODI load plan.

ii. Configure the UCM user name and password in the credential store of the BI domain. In Enterprise Manager, create a credential key FUSION_APPS_OBIA_UCM_USER-KEY with the user name/password under the map oracle.apps.security.

The user name and password belong to the user that has the OBIAExport security group assigned in the FA SaaS environment. The same credential can also be created from a WLST script, as sketched below.
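
Below is a minimal WLST (Jython) sketch for creating the credential, as an alternative to the Enterprise Manager UI. The Admin Server URL, WebLogic login, and UCM user values are placeholders.

# Run with: $MW_HOME/oracle_common/common/bin/wlst.sh create_ucm_cred.py
# Placeholders: replace the admin URL, WebLogic credentials and the UCM user.
connect('weblogic', '<admin_password>', 't3://adminhost:7001')
createCred(map='oracle.apps.security',
           key='FUSION_APPS_OBIA_UCM_USER-KEY',
           user='<fa_ucm_user>',          # user with the OBIAExport group in FA SaaS
           password='<fa_ucm_password>',
           desc='UCM user for BI Apps cloud extract')
disconnect()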

iii. If HTTPS is enabled for the UCM Server, configure SSL.

Certificates on FA SaaS PODs are CA signed, so there is no need to import the certificates as long as DemoTrust.jks is defined as the trust store in WebLogic Server.

Add this Java option in setDomainEnv.sh:

-Dweblogic.security.SSL.trustedCAKeyStore=${WL_HOME}/server/lib/DemoTrust.jks

iv. The ODI load plan first copies the files from the UCM server to the download folder (on premises) and then moves them to the cloud files folder. After that, ODI runs the FTS mappings to populate the SDS schema, with the csv files acting as the source (see the sketch below).
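
The Python sketch below only illustrates the file flow that the ODI load plan automates: move the extracted files from the download folder to the cloud files folder, then read each csv as the source for the SDS tables. The folder paths are hypothetical, and the compressed-file handling is simplified.

import csv
import shutil
from pathlib import Path

download_dir = Path("/u01/biapps/ucm_download")    # hypothetical on-prem download folder
cloud_files_dir = Path("/u01/biapps/cloud_files")  # hypothetical cloud files folder

cloud_files_dir.mkdir(parents=True, exist_ok=True)
for data_file in download_dir.glob("*.csv"):
    # Step 1: move the extracted file, mirroring the load plan's copy/move step.
    target = cloud_files_dir / data_file.name
    shutil.move(str(data_file), target)
    # Step 2: the csv now acts as the source for the SDS schema load.
    with open(target, newline="") as f:
        for row in csv.reader(f):
            pass  # an FTS mapping would load these rows into the SDS table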

v. After the SDS schema is loaded, ODI provides extensive out-of-the-box code in the form of SDE, SIL, and PLP interfaces/packages to transform the data into dimension, fact, and aggregate tables, with the SDS schema as the source.

vi. Once all the target tables in the Oracle data warehouse are populated, the built-in OBIEE dashboards and reports can be used to render the data.

References:

https://docs.oracle.com/applications/biapps102/install/GUID-DEAA66C3-40C3-4833-A9DB-63124D3A732E.htm#BIAIN-GUID-DEAA66C3-40C3-4833-A9DB-63124D3A732E

https://docs.oracle.com/cd/E53672_01/doc.111191/e53673/GUID-2B9E0FE1-6781-4286-8134-8EE20BF270B3.htm

https://docs.oracle.com/cd/E53672_01/doc.111191/e53673/GUID-978A67EE-BD7B-4114-AE88-DE785DD31BAC.htm
