Lights Out DRM


November 8th, 2016


Oracle offers a powerful scripting tool called Batch Client. This tool makes it possible to fully automate many tasks in Data Relationship Management (DRM) and Data Relationship Governance (DRG) applications. As you will notice when reading my other blog posts, I'm a big fan of automation. My favorite kind of automation is a task that kicks off on a fixed schedule with absolutely no need for human interaction. In this post, I will describe such an automated task I recently created for a client.

“Did you create a new account in SAP?”

The client, a large manufacturer of laboratory equipment, was using SAP to maintain their chart of accounts. While SAP was the system of record for both GL data and metadata, the client used Oracle Planning and Budgeting Cloud Service (PBCS) for financial forecasting and Hyperion Financial Management (HFM) to consolidate and close their books. The client wanted to use DRM to manage the hierarchies for both PBCS and HFM, while keeping SAP as the source for the metadata.

This is a common situation in many companies. Downstream systems are not always notified of changes made upstream, so their users scramble to identify new nodes when data loads fail due to unsynchronized metadata.

Ideally, DRM would serve as the source system for all metadata, including the metadata in SAP. Keeping SAP as the metadata source instead presented a few challenges:

  1. How and when can DRM learn that new nodes have been added in SAP?
  2. How can duplication of new SAP nodes in DRM be prevented?
  3. How and when can PBCS and HFM users learn that new SAP nodes have been added?
  4. How do new SAP nodes added in DRM receive the PBCS- and HFM-specific properties?

DRM Batch Client Saves the Day!

Users of DRM and DRG already know how flexible these tools are. But the Batch Client really takes things to the next level. Using it, developers can create fully automated processes that reduce the need for users to interact with the application manually. These processes include refreshing, opening, closing, and deleting versions, as well as launching existing Exports, Imports, and Blends in DRM. You can also generate version backups, restore versions from backup files, create workflow requests, and launch action scripts.

By putting together a few batch files, I created a "lights out" process that:

  1. Opens a simple metadata export file from SAP
  2. Converts the file into a DRM-consumable format
  3. Imports the DRM-consumable file into temporary versions in DRM
  4. Compares the temporary versions to the working version in DRM to identify new nodes
  5. Generates DRG workflows that call on PBCS and HFM users to enrich the new nodes with the properties for their respective applications

Each of the above-listed steps is contained in a simple DOS batch executable file (.bat). There is a set of files for each of the four dimensions that are being updated by the process. All in all, there are 24 batch files (6 for each dimension). The files are called sequentially by a master batch file that is scheduled to execute on a daily basis by Windows Scheduler.
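The daily schedule can be registered from the command line rather than through the Windows Scheduler UI. A minimal sketch, assuming the master batch file lives at C:\DRM\LightsOut\master.bat (the path, task name, and run time are assumptions):

```bat
rem Register the master batch file to run daily at 2:00 AM.
rem Path, task name, and schedule below are assumptions -- adjust to your environment.
schtasks /create /tn "DRM_LightsOut" ^
         /tr "C:\DRM\LightsOut\master.bat" ^
         /sc daily /st 02:00 ^
         /ru SYSTEM
```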

The master batch file looks something like this:
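A sketch of such a master file, assuming the per-dimension scripts follow the naming conventions described in this post (the folder and script names are assumptions):

```bat
@echo off
rem Master script: runs each step of the lights out process for each dimension.
rem Folder and script names are assumptions -- adjust to your setup.
cd /d C:\DRM\LightsOut

for %%D in (Account CostCenter Entity ProfitCenter) do (
    call Step1_ReadSAPFile_%%D.bat
    call Step2_ConvertFile_%%D.bat
    call Step3_Import_%%D.bat
    call Step4_Compare_%%D.bat
    call Step5_CreateWorkflows_%%D.bat
)
```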


The SAP administrator is responsible for creating the SAP metadata export files for each dimension. In this case, there are four files:

  • SAP_Accounts.txt
  • SAP_CostCenter.txt
  • SAP_Entity.txt
  • SAP_ProfitCenter.txt

The SAP administrator saves the files in a shared folder. The automated task finds and reads each SAP file on every run, whether it has changed or not. If a file has not changed since the last execution, the process does nothing. If the file is new and contains new nodes, the task identifies them and creates a new DRG workflow for each one.
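The "has the file changed?" check can be implemented with a plain binary compare against a copy archived on the previous run. A minimal sketch for one dimension (the folder names are assumptions):

```bat
@echo off
rem Compare today's SAP export against the copy archived on the last run.
rem Folder names are assumptions -- adjust to your environment.
set SHARE=\\fileserver\sap_exports
set ARCHIVE=C:\DRM\LightsOut\archive

fc /b "%SHARE%\SAP_Accounts.txt" "%ARCHIVE%\SAP_Accounts.txt" > nul
if %errorlevel% equ 0 (
    echo SAP_Accounts.txt unchanged - nothing to do.
    goto :eof
)

rem The file changed: archive the new copy, then continue with the import steps.
copy /y "%SHARE%\SAP_Accounts.txt" "%ARCHIVE%\SAP_Accounts.txt" > nul
```

The fc /b command sets errorlevel 0 when the two files are byte-identical, which makes it a cheap change detector without any database state.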

Where the Magic Happens

The key for this automatic task is the DRM Batch Client. The following is an example of how one such batch file looks:
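An import step for the Account dimension might look like the following sketch. The /op, /log, and /implog switches are the ones discussed below; the connection and object-name switches shown here are assumptions (check the Batch Client documentation for your DRM version):

```bat
@echo off
rem Import the SAP Account file into a temporary version in DRM.
set VersionName=TEMP_Account

rem Switch names other than /op, /log, and /implog are assumptions.
"C:\Oracle\DRM\client\batch-client\drm-batch-client.exe" ^
    /op=Import ^
    /url=net.tcp://drmserver:5210/Oracle/Drm/ProcessManager ^
    /u=drm_batch_user /pwd=******** ^
    /import=IMP_Account ^
    /importversion=%VersionName% ^
    /log=DRM_Import.log ^
    /implog=zImporter_%VersionName%.log
```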


The above batch file calls the DRM Batch Client using the Import operation, defined by the /op switch (/op=Import). The batch file generates two log files: one for the Batch Client (/log=DRM_Import.log) and one for the import operation (/implog=zImporter_%VersionName%.log). The Batch Client log shows whether the batch command executed successfully or generated errors; the import log contains the specific errors encountered during the import process.

For this to work, the batch file must run on a computer that has the DRM Batch Client installed. This can be a user's desktop, an application server, or the DRM server itself. In the lights out process I created for this client, we ran the entire task on the DRM server.

Do the Ground Work First!

As you can imagine, the key to creating the automated task is to first create all the components in DRM. In this case, I had to create a separate import for each dimension that needed to be updated by SAP. The Imports in DRM are prefixed with "IMP_". For example:

  • IMP_Account
  • IMP_CostCenter
  • IMP_Entity
  • IMP_ProfitCenter

The nodes from SAP are loaded into temporary versions prefixed with “TEMP_”. So after the import steps are complete, there are four new temporary versions in DRM:

  • TEMP_Account
  • TEMP_CostCenter
  • TEMP_Entity
  • TEMP_ProfitCenter

To find the new nodes in the temporary versions, I created a compare export (SAP_COMP_Leaf) that evaluates the differences between the temporary version created by the import step and the working version. The export compares only the leaf nodes of the two versions and writes the differences (the new nodes) to a temporary file. The temporary files are called:

  • New_SAP_Accounts.txt
  • New_SAP_CostCenter.txt
  • New_SAP_Entity.txt
  • New_SAP_ProfitCenter.txt
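The compare export can be launched from a batch file in the same way as the import, using the Export operation. A sketch (again, switch names other than /op and /log are assumptions, as are the version names):

```bat
@echo off
rem Export the leaf-level differences between the temporary and working versions.
rem Switch names other than /op and /log are assumptions.
"C:\Oracle\DRM\client\batch-client\drm-batch-client.exe" ^
    /op=Export ^
    /url=net.tcp://drmserver:5210/Oracle/Drm/ProcessManager ^
    /u=drm_batch_user /pwd=******** ^
    /export=SAP_COMP_Leaf ^
    /sourceversion=TEMP_Account ^
    /targetversion=Working ^
    /outfile=New_SAP_Accounts.txt ^
    /log=DRM_Export.log
```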

After the batch client executes the exports and the temporary files are created, the batch client consumes these files to create a new workflow for each new node. To do this, I first had to create the workflow models in DRG. In this case:

  • Maint_Account
  • Maint_Entity
  • Maint_ProfitCenter
  • Maint_CostCenter

Each of these models is designed to let the users enrich the PBCS and HFM properties of each new node as they see fit. Once a workflow is created, the assigned users are notified via e-mail and work through the steps of updating the required fields. Of course, DRG keeps track of the changes made by each user.

After the PBCS and HFM users update all the properties and the workflow is completed, the DRM administrator commits the changes to the working version in DRM. At this point, the PBCS and HFM administrators can update the dimension metadata in their respective applications from the working version in DRM.

Although this last step can also be automated using MaxL or EPM Automate, the client decided to keep this process manual.
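If that last step were ever automated, the PBCS side could be scripted with EPM Automate. A sketch, assuming a metadata import job named IMPORT_DRM_METADATA has already been defined in PBCS (the job name, file names, and URL are assumptions):

```bat
rem Push the DRM export into PBCS with EPM Automate.
rem Job name, file names, and service URL are assumptions.
call epmautomate login planning_admin %PASSWORD% https://planning-domain.pbcs.us2.oraclecloud.com
call epmautomate uploadfile C:\DRM\Exports\Account_Metadata.csv
call epmautomate importmetadata IMPORT_DRM_METADATA Account_Metadata.csv
call epmautomate logout
```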




About TopDown Team

The TopDown Team includes members of TopDown Consulting who want to let the community know about webcasts, conferences, and other events. The team also conducts interviews on various EPM industry topics.
