
Wednesday, June 22, 2022

Unable to load Dates to EPBCS application using Oracle EPM Cloud Data Management

Issue Description 

There might be a scenario where you try to load dates into an EPBCS application and the dates do not get loaded; the date column ends up with a blank value.

However, the Data Load Workbench shows that the data was loaded successfully.

The Import and Export processes also show success.

Root Cause

The format of the column that holds the date information is not correct.

Solution

Data Management supports loading dates in two formats:

MM-DD-YYYY     &      DD-MM-YYYY

Select the date format that matches the one set in your EPBCS application.

MM-DD-YYYY is the most common choice.

So while designing the source file, which is a CSV file, ensure you select the entire date column, then:

Right-click > Format Cells > select Custom > and in the Type: section enter MM-DD-YYYY or DD-MM-YYYY, based on what is defined in your application.

Note: 

Ensure the value entered in the Type section is in CAPS. Data Management will not throw an error if you use lowercase, but the date information won't get loaded to the EPBCS application.
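If the source file is produced by a script rather than by hand in Excel, the date column can be normalized before loading. A minimal sketch: the file name, the column position (third), and the incoming format (DD/MM/YYYY) are all assumptions to adjust to your own layout.

```shell
# Sample input file, created here only so the sketch is self-contained
printf 'Entity,Account,HireDate\nE01,Salary,22/06/2022\n' > source.csv

# Rewrite the 3rd column from DD/MM/YYYY to MM-DD-YYYY
awk -F',' 'BEGIN { OFS = "," }
NR == 1 { print; next }                      # keep the header row as-is
{
  split($3, d, "/")                          # d[1]=DD, d[2]=MM, d[3]=YYYY
  $3 = sprintf("%02d-%02d-%s", d[2], d[1], d[3])
  print
}' source.csv > source_fixed.csv

cat source_fixed.csv                         # E01,Salary,06-22-2022
```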

Monday, April 25, 2022

Cloning Applications between environments using Clone Environment option in Oracle EPM Cloud


Click on Navigate > under the Tools section > click on Clone Environment.


Enter the details of the target environment to which you need to clone this application:

Target environment URL, target environment username, and target environment password.

Roles Required

You must have either the Service Administrator predefined role, or the Service Administrator role together with the Identity Domain Administrator predefined role.

For Oracle Cloud Classic to OCI cloning, you must sign in as a Service Administrator who also has the Identity Domain Administrator predefined role, to ensure that users and their roles are cloned to the OCI environment.


For cloning artifacts, you can choose the options as per your requirements



  1. Optional: Select the Users and Predefined Roles check box to clone users and their predefined role assignments.

If you are performing an Oracle Cloud Classic to OCI migration, you must select this check box.

  2. Optional (for environments other than Oracle Enterprise Data Management Cloud and Narrative Reporting): Deselect the Data Management check box if you do not want to clone Data Management records. Cloning Data Management records may take a long time if the staging tables contain a very large number of records.

Clone Data Management records only if both the source and target environments are on the same monthly update, or the target environment is one update newer than the source. For example, you can clone 22.01 Data Management records only to another 22.01 environment or to a 22.02 environment.

  3. Optional (for Planning, Planning Modules, Free Form, Financial Consolidation and Close, and Tax Reporting only): Deselect the Job Console check box if you do not want to clone Job Console records.
  4. Optional (for Planning, Planning Modules, and Free Form only): Deselect the Application Audit check box if you do not want to clone application audit records.

Note: Application audit data of Financial Consolidation and Close, and Tax Reporting is included in the snapshot by default.

  5. Optional: Select the Stored Snapshots and Files check box if you want to clone the contents of the inbox and outbox, and stored snapshots. This process may take a long time depending on the number and size of stored snapshots and files in the inbox and outbox.
  6. Click Clone to initiate the process.
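The same clone can be scripted with the EPM Automate cloneEnvironment command, run after logging in to the source environment. The sketch below only prints the command sequence (the `run` helper echoes instead of executing); all URLs and credentials are placeholders, and the option names should be verified against the EPM Automate command reference for your version.

```shell
# Dry-run: `run` echoes each command instead of executing it.
# Remove the helper (or have it invoke "$@") to run for real.
run() { echo "+ $*"; }

# Log in to the SOURCE environment
run epmautomate login srcAdmin SrcPassword https://source-pbcs.example.oraclecloud.com

# Clone to the target: keep users/roles, skip Data Management records
run epmautomate cloneEnvironment tgtAdmin TgtPassword \
    https://target-pbcs.example.oraclecloud.com \
    UsersAndPreDefinedRoles=true DataManagement=false

run epmautomate logout
```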

Tasks to Perform After Cloning Environments

1.       Daily Maintenance Start Time

The daily maintenance start time of the target environment is automatically reset to that of the source environment from which the snapshot was cloned. Ensure you change the time in the target environment accordingly.

2.       Update EPM Automate Scripts

If you have written any scripts to perform an activity, ensure the scripts are updated with the target environment URL and identity domain name.

Also, if you are using native accounts, ensure the passwords are updated in the scripts.

3.       Smart View URLs

Modify the public and private connection Smart View URLs so that they point to the cloned environment. Also, update connection URLs as needed.

4.       Integrated Business Process

a.       Single Sign On

If you are moving multiple instances in an integrated business process to a new domain, SSO between the instances will not work until all instances are migrated.

If you had set up SSO to authenticate users and the migration resulted in a change in the identity domain being used (for example, after an Oracle Cloud Classic to OCI migration), you must reconfigure SSO.

When cloning within the same domain, SSO need not be reconfigured; instead, update the PROD application in the list.

b.      Updating Integrations and Connections details

There are places where you need to update connection details, such as Data Management target application connections and source system connections; ensure the URLs, usernames, and passwords are updated accordingly.


Thursday, March 10, 2022

EPM Cloud Data management / FDMEE / Troubleshooting steps

 


Required Roles for Loading Data

If you have an issue with loading data to Oracle Enterprise Performance Management Cloud using Data Integration, Data Management, EPM Automate or REST APIs, make sure that the user loading the data has one of the following roles:

  • Service Administrator predefined role
  • Power User predefined role and Run Integration application role

How to trace issues using the ERROR log

  • The process details log in Data Management is the key log for tracing any issue
  • Ensure the log level is set to 5 under the System Settings tab (select Profile Type: All)
  • Open the process details log in Notepad / Notepad++
  • Find (Ctrl+F) the keyword "error"
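The keyword search in the last step can also be done from a terminal with grep. The log file name and its contents below are made up purely for illustration:

```shell
# A tiny sample process log, created here only so the sketch is self-contained
cat > process_details.log <<'EOF'
2022/03/10 10:01:02 INFO  Import started
2022/03/10 10:01:05 ERROR Error: 3303 Member not found in database
2022/03/10 10:01:06 INFO  Export finished
EOF

# Case-insensitive search with line numbers, same as Ctrl+F in Notepad++
grep -n -i 'error' process_details.log
```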

How to determine the load status from the DATA LOAD WORKBENCH

In the Workflow tab > Data Load Workbench > there are four icons (fish icons): Import, Validate, Export, Check.

If the Import fish is grey, the import step failed at the source system level:

  • Ensure the source system connection is setup correctly 
  • Ensure the file name is defined correctly in the data load rule 
  • Ensure the data is available in the source combination in which it is extracting 
  • Ensure the source filters are defined properly 

If the Import fish is orange and the Validate fish is grey:

  • Go to the Data Load Workbench > click Validation Errors
  • You will find the details of the unmapped members
  • Ensure the mappings are defined correctly for those members and re-run the Validate step

If the Import and Validate fish are green but the Export fish is grey:

  • Ensure the connection is available and the target system properties are defined properly
  • In the error log you will find error codes such as "Error: 3303" or "Error: 3335", which indicate that a member is not available in the target
  • In those scenarios, manually create the members in the target system if they are valid

Known error codes and solutions

3303 Member not found in database.

3304 Insufficient access to store data.

3333 Bad data value supplied.

3335 Record rejected because of duplicate member names.

3336 Member/Data unknown.

3337 Record rejected because of dimension conflicts with Header Name
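The codes above can be folded into a small lookup for use in log-scanning scripts; the messages come straight from the list above.

```shell
# Map an Essbase data load error code to its meaning
describe_error() {
  case "$1" in
    3303) echo "Member not found in database" ;;
    3304) echo "Insufficient access to store data" ;;
    3333) echo "Bad data value supplied" ;;
    3335) echo "Record rejected because of duplicate member names" ;;
    3336) echo "Member/Data unknown" ;;
    3337) echo "Record rejected because of dimension conflicts with Header Name" ;;
    *)    echo "Unknown error code: $1" ;;
  esac
}

describe_error 3303   # prints: Member not found in database
```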


How to Verify that Microsoft Active Directory Federation Service is Working

 

1.  Access the Microsoft Active Directory Federation Service Sign In page: https://adfs.example.com/adfs/ls/IdpInitiatedSignOnPage (replace adfs.example.com with your Microsoft Active Directory Federation Service hostname)

2.  If required, select Sign in to this site and click Sign In.

3.  Enter the Microsoft Active Directory credentials for a user that exists on both Microsoft Active Directory Federation Service and Oracle Identity Cloud Service (in this example, csaladna@example.com) and click Sign In.

  4. Confirm that the message You are signed in is displayed.

EPM SSO Configuration - IDP metadata File and Setting Attributes

 

EPM Cloud supports only Service Provider (SP) initiated SSO; it does not support Identity Provider (IdP) initiated SSO.

Reference : https://docs.oracle.com/en/cloud/get-started/subscriptions-cloud/csimg/configuring-oracle-cloud-service-provider.html

 

  1. SSO Protocol: available options are HTTP POST and HTTP Artifact - ___________________
  2. User Identifier: available options are User ID and User Email Address - _____________________
  3. Contained in: _________ (If the User Identifier value is the user ID, the Contained In field must be a SAML attribute, and you must specify the name of that SAML attribute, such as SamAccountName in the case of Microsoft Active Directory Federation Services. If User Email Address is selected, Contained In defaults to Name ID.)
  4. Provider Metadata: you can export the provider metadata file, which by default uses SHA-2.0; an SHA-256 version can also be obtained from Oracle Support. The partner Active Directory admin team has to import that file into MSAD and provide back the "Federation.xml" file.
  5. This federation file is then imported into Oracle My Services for further setup and configuration.

Configuring SSO (Single Sign On) : From Cloud ADFS server to Oracle EPM applications (Old generation)

 


Login to my services console and click on Navigate icon>Users




Click on SSO configuration
Under - Configure your Identity Provider Information - click on EXPORT METADATA


Create an SR with Oracle if you require the SHA-256 metadata file.
If you require the SHA-2.0 file, you can download it directly using the download option.
Provide the exported metadata file to Oracle.
Get the modified SHA-256 file from Oracle (confirm the requirement with the MSAD / AD team).
Submit the file to the MSAD team,
and get the modified federation file back from the MSAD team.



Then import the file into My Services on the Configure SSO tab:
Click EDIT
and upload the file provided by the MSAD team.

Click TEST SSO.
On the next screen, click START SSO.
It will redirect to a page confirming that the connection to MSAD is successful.
Click ENABLE SSO.


SSO is now configured.

Sunday, February 9, 2020

Submitting a UDR from an EPM Cloud environment along with a snapshot to Oracle Support

From the Simplified Interface, open Settings and Actions by clicking your username (displayed at the top right corner of the screen) > select Provide Feedback > enter a brief description > click Submit > set Confirm Application Snapshot Submission to ON > click Submit again.

How to restart the services of an EPM Cloud Environment


Using EPM Automate, users can restart their EPM Cloud environment.

The command to be used is as follows:

Example: epmautomate resetservice "Users experience unacceptably slow connections"


Working with EPM Automate

1. Download the latest version of EPM Automate
    Login to your EPM Cloud Service
    Tools -> Install -> EPM Automate

2. Install EPM Automate
    Run the EPM Automate.exe downloaded in step 1

3. Use EPM Automate from a command prompt

    - Open Command Prompt
    - Navigate to the install location
      Example: CD :\Oracle\"EPM Automate"
    - Log in to EPM Automate
      epmautomate login / identitydomain
      Example: epmautomate login https://.pbcs.us2.oraclecloud.com mydomain
    - Run the reset service command
      epmautomate resetservice "Users experience unacceptably slow connections"
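The login-and-reset sequence can be wrapped into one script. The sketch below only echoes each call (the `run` helper prints instead of executing); the URL, identity domain, and credentials are placeholders.

```shell
# Dry-run: `run` echoes each command instead of executing it.
run() { echo "+ $*"; }

run epmautomate login serviceAdmin MyPassword \
    https://example-pbcs.us2.oraclecloud.com exampleDomain
run epmautomate resetservice "Users experience unacceptably slow connections"
run epmautomate logout
```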

Saturday, February 8, 2020

Is it Possible to Disable EPM Cloud Passwords from Expiring?

This is not possible, as it would violate password policies.

The only workaround is to enable SSO with your own Identity Provider (IdP), so that you can manage your own password policies.

Please see the documentation below:
https://docs.oracle.com/en/cloud/paas/process-cloud/cprcw/configuring-federated-sso-and-authentication.html

An enhancement request was raised in this regard and was rejected.

How is the data size calculated for an EPM Cloud service instance?

The total size of data used by your EPM Cloud service instance is the sum of the following:
  • Application data stored in Essbase
  • Artifact Snapshot created by the daily maintenance window
  • All snapshots that you have exported via EPM Automate or the Migration user interface
  • All snapshots that you have uploaded via EPM Automate or the Migration user interface
  • All data files that you have uploaded via EPM Automate or the FDMEE user interface

How to overcome the blank page issue during drill-through via Smart View in EPM Cloud

In EPM Cloud (PBCS, EPBCS), users might face an issue while drilling through to Data Management or to source values:
they might get a blank page during the process.

This happens when the Smart View browser extension (add-in) has not been installed.

To install the add-in, refer to the steps below.

Chrome: 

Download the Smart View for Office extension from the Chrome web store.

Firefox:


Information about the Firefox add-in -
https://docs.oracle.com/applications/smartview/720/SVNST/firefox_extension_100x0867c8ee.htm#SVNST-GUID-DA412821-E7CD-40E4-9CA0-9739D8CAC0D8

For IE11, the following settings changes have to be completed:


Go to Tools > Internet Options > General > Tabs

Choose Open links from other programs : The current tab or window


Thursday, February 6, 2020

New and missing features in Cloud Data Management in comparison to FDMEE on-premise



Significance of Data Management in Cloud




Cloud Data Management is intended to allow an organization to adopt a pure Cloud solution for Oracle EPM deployments.

  • Cloud Data Management is a module within the Oracle EPM Cloud services.
  • It is built using the same code line as on-premises FDMEE.
  • Cloud Data Management can integrate flat files.
  • It includes all the on-premises FDMEE transformation capabilities, including SQL mapping, which can accommodate complex transformations.
  • It includes prebuilt logic to natively load data to each of the Oracle EPM Cloud service offerings.
  • Cloud Data Management can integrate with other Oracle Cloud service offerings, including the ability to source data from and load data back to Fusion G/L Cloud.
  • It can also source data from other Oracle EPM Cloud services.

Monday, December 16, 2019

Error "Active Time Limit Exceeded - Call Aborted" When Exporting Data to PBCS Application Through Data Management

In Data Management, the Load Method was not set properly for the Oracle Planning and Budgeting Cloud Service (PBCS) application, especially for multi-period loads.

Kindly refer to the steps below:
  • Click Navigate and go to Data Management
  • In Target Application, select the application and click Application Options
  • Define the Load Method as "All data types with security" and save the changes
  • Go to Data Load Rule > Target Options > ensure the Load Method is also set to "All data types with security" and save the changes
  • Re-run the export task for multiple periods; it should now work fine
Note:
The above steps apply to loads with a small number of rows.
If the file is huge (i.e., millions of rows), the above steps will not fix the issue.
An enhancement request has been raised with development, but a fix is not yet available:

Enh 26834089 - PBCS: UNABLE TO IMPORT LARGE DATA FILES DUE TO DB RESOURCE CONSTRAINTS

Monday, July 15, 2019

LOADING SMARTLISTS IN DATA MANAGEMENT USING MULTI-COLUMN FUNCTIONALITY


For quite a long time we have been getting issues from customers who were unable to load HR data because the accounts are in the columns.
This whitepaper will help you resolve those issues.
Data Management now has the capability to load non-numeric values and to load accounts in the columns.
In this source file there is a combination of percentage, date, text, and Smart List values.

 

There are some predefined or default Smart Lists in PBCS; they are visible when you click the Smart Lists option under Create and Manage.
Apart from the default Smart Lists, you can see the Smart Lists you created.

 

Assigning the Smartlists to the Dimension

Refer to the steps mentioned in "Defining Properties for a Smart List" and assign the data types to the dimension.
In this example, all the data types are assigned to the Account dimension as members.
 



Define an Import Format in Data Management by setting the file type to Multi Column - All Data Type.

 


Click the Add Expression editor and choose the expression type Driver and the dimension Account, as we have assigned the data types to the Account dimension.
Then, in Columns, choose the data type columns as 1, 4.


Create a new location by assigning an Import Format.
 
Set up a global mapping by defining the Prior Period Key and the Period Key, and define the target year.
You can also set up a source mapping or application mapping, set it to Explicit, and define the same in the data load rule.

 
Define the category mapping accordingly.


Create a new data load rule for the multi-period load.
Since we have defined the expression "Driver=Account" in the Import Format, you will find a new column called "Column Headers".

In the column headers you can define the column number for each data type, and if required you can execute them standalone.

 



Make sure that while loading Smart Lists, the Load Method in the target application is set to "All data types with security".


Define the Mappings for all the dimensions in the Source Value and the Target Value.
  
To execute a data load rule, go to Data Load Rule, click Execute, and enter the details as shown below.
Select the start and end periods according to the data that needs to be loaded and click Run. Data will be imported, validated, and exported for all periods within the start and end period range.
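The same multi-period execution can also be triggered without the UI via the EPM Automate rundatarule command. The rule name, periods, and modes below are placeholders, and the argument order should be verified against the EPM Automate command reference for your version; the sketch only echoes the call.

```shell
# Dry-run: `run` echoes the command instead of executing it.
run() { echo "+ $*"; }

# rundatarule RULE_NAME START_PERIOD END_PERIOD IMPORT_MODE EXPORT_MODE
run epmautomate rundatarule HR_Multi_Load "Jan-22" "Mar-22" REPLACE STORE_DATA
```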


You can see the process being executed and succeeding in the Process Details tab.
   

Once the data is loaded successfully, you can validate the data in the Target Data column.
 
 

To check whether the data loaded to the target successfully, we can validate it in the target application.
Go to the Navigator and click the Forms option.
Create a new simple form using the Add icon. Define a name for the form, click the Layout column, and set the point of view accordingly.


 


Once the POV is set, click Save and then click Preview to see the data displayed on the form.

Loaded Data Preview

 





