Dynamic Backups for Essbase in UNIX

By Paul Hoch

August 4th, 2015


We’ve had a number of clients that need a dynamic backup process that allows them to recover from any situation where they need to restore data (e.g., a catastrophic loss or even user errors). Also important is having the most up-to-date data. However, to avoid interruption during the workday, many companies only do backups on weekends or at night. This means if you have a crash in the middle of the day, you may lose hours or even days of users’ data.

There are a number of ways you can do Essbase backups, but many are intrusive, meaning the users must be out of the system while the backup is running. This becomes a real problem when you have a global company because around the world potentially means around-the-clock users, or at the very least, little downtime. So how do you do backups that don’t interrupt the workday?

We found that using the DATAEXPORT calculation command would allow us to output a backup of the data without requiring the users to be out of the system. By using this command we could do backups that had a minimal impact on the users in the middle of the day but would also provide the most current data. The next challenge was: how do we create a script that dynamically exports data for all of the databases in our environment? Our client had more than 25 different databases, and we didn’t want to write 25 different scripts that would all require maintenance and updates!

Using the examples below, let's walk through the iterations that took us to the final solution. Note: This solution is specific to UNIX environments.

Script 1 started as:

LOGIN $1 $2 on $5;

alter APPLICATION $3 load database $4;

execute calculation
'
SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat OFF;
DATAEXPORTOVERWRITEFILE ON;
};

DATAEXPORT "File" " " $3.$4.Lev0.txt ;
'
on $3.$4;
EXIT;

What we are doing is writing a MaxL script that takes parameters. Look at the first line: $1, $2, and $5 are all parameters passed when the script is executed. There is no hardcoded username, password, or server name; anything that starts with $ is a variable supplied to the script at run time.

Three lines down we have: alter application $3 load database $4. Here $3 is the application and $4 is the database we want to back up.

For example:

$3 = Sample
$4 = Basic
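For context, a parameterized MaxL script like this is typically launched with the MaxL shell, essmsh, which maps command-line arguments to $1, $2, and so on. Here is a minimal sketch of the invocation; the script name, credentials, and server below are placeholders, not values from the original post:

```shell
# Hypothetical call (essmsh is the MaxL shell; all argument values are placeholders):
#   essmsh dynamic_backup.msh admin mypassword Sample Basic essbase01
# MaxL maps the arguments positionally:
#   $1 = user, $2 = password, $3 = application, $4 = database, $5 = server
# Plain sh follows the same positional convention, demonstrated here:
show_params() {
  echo "user=$1 app=$3 db=$4 server=$5"
}
show_params admin mypassword Sample Basic essbase01   # -> user=admin app=Sample db=Basic server=essbase01
```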

The meat of this script is the line:

DATAEXPORT "File" " " $3.$4.Lev0.txt ;

What I want to do here is export to a file called ‘Sample.Basic.Lev0.txt’. The point is that the same script that runs for Sample Basic ($3, $4) can run for another database called Demo Basic and write to ‘Demo.Basic.Lev0.txt’, and so on for each of my 25 databases. I can run this one script for all 25 databases, just passing in the parameters each one needs. If something changes, I don’t have to update it 25 times – I change the script in one place and it runs for all 25.
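A small UNIX wrapper can then drive that single MaxL script for every application/database pair. This is a sketch, not from the original post: the database list, script name, and credential variables are hypothetical, and the essmsh call is left commented out since those details vary by site:

```shell
# Hypothetical list of app/database pairs; extend it to all 25 databases.
for pair in "Sample Basic" "Demo Basic"; do
  set -- $pair                      # $1 = application, $2 = database
  echo "backing up $1.$2 -> $1.$2.Lev0.txt"
  # essmsh dynamic_backup.msh "$ESSUSER" "$ESSPASS" "$1" "$2" "$ESSSERVER"
done
```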

Sounds good, right? The problem is this script doesn’t work. The line:

DATAEXPORT "File" " " $3.$4.Lev0.txt ;

should accept a parameter, but it doesn’t. Parameters work in every other part of the script, but there is no way to pass one in the middle of this line: the DATAEXPORT file name simply won’t take a variable, despite numerous attempts to trick it into working.

So now my problem was: if I can’t dynamically name the export file and just call it “Lev0.txt”, each run overwrites the previous file. When I run it for Sample Basic it outputs “Lev0.txt”, and when I then run it for Demo Basic it overwrites that same “Lev0.txt”. Not a great solution!

So I asked the question again: “How do I write one dynamic script that works for 25 different databases, without writing 25 different scripts with the filenames hardcoded in?”

Here is Script number 2:

LOGIN $1 $2 on $5;

alter APPLICATION $3 load database $4;

execute calculation
'

SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat OFF;
DATAEXPORTOVERWRITEFILE ON;
};
DATAEXPORT "File" " " "Lev0TEMP.txt"

'
on $3.$4;

set filepath=/data01/EssbaseServer/essbaseserver1/app;

shell mv $filepath/Lev0TEMP.txt $filepath/$3.$4.Lev0.txt;

EXIT;

The meat of this script is still:

DATAEXPORT "File" " " "Lev0TEMP.txt"

Since I can’t dynamically name the file as part of the export, I output it to a temporary file and rename it afterward like this:

shell mv $filepath/Lev0TEMP.txt $filepath/$3.$4.Lev0.txt;

In this case I used a shell command to rename the file to the application and database name; MaxL passes the values of $3 and $4 through to the shell! This allowed me to run the one script against multiple databases: by combining MaxL’s shell escape with UNIX commands, the script exports and renames the files appropriately for each database.
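The same export-then-rename pattern can be tried outside MaxL with plain shell commands. The sketch below simulates it in a temporary directory; the file contents and variable names are illustrative stand-ins, not the real Essbase output:

```shell
# Simulate the export-then-rename pattern from the MaxL script above.
filepath=$(mktemp -d)                 # stand-in for the Essbase app directory
APP=Sample; DB=Basic                  # stand-ins for MaxL's $3 and $4
echo "level-0 data" > "$filepath/Lev0TEMP.txt"   # stand-in for the DATAEXPORT output
mv "$filepath/Lev0TEMP.txt" "$filepath/$APP.$DB.Lev0.txt"
ls "$filepath"                        # -> Sample.Basic.Lev0.txt
rm -r "$filepath"
```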

This worked, but only sort of. The combination of:

DATAEXPORT "File" " " "Lev0TEMP.txt"

and

shell mv $filepath/Lev0TEMP.txt $filepath/$3.$4.Lev0.txt;

will work, except when your export exceeds Essbase’s 2 GB file size limit. Beyond that, Essbase generates multiple export files of up to 2 GB each. In my example I got the first file, “Lev0TEMP.txt”, but because of the size of the export I also got “Lev0TEMP1.txt”. My “mv” command renamed only the first file, leaving the second (and any subsequent) files out of the rename and the backup. This means I wasn’t getting a complete backup set.

So I looked at it yet another way and finally came up with the solution that works.

The final script is:

LOGIN $1 $2 on $5;

alter APPLICATION $3 load database $4;

execute calculation
'
SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat OFF;
DATAEXPORTOVERWRITEFILE ON;
};
DATAEXPORT "File" " " "Lev0TEMP.txt"

'
on $3.$4;

set filepath=/data01/EssbaseServer/essbaseserver1/app;

shell "for f in $filepath/Lev0TEMP*.txt; do mv \$f \$(echo \$f | sed 's/Lev0TEMP/$3\.$4\.Lev0/g'); done" ;

EXIT;

This is where the serious work occurs:

shell "for f in $filepath/Lev0TEMP*.txt; do mv \$f \$(echo \$f | sed 's/Lev0TEMP/$3\.$4\.Lev0/g'); done" ;

In this one line, I dynamically rename every file with Lev0TEMP in its name to the application and database name, keeping any numeric suffix and the extension intact.

This renames all of the files correctly and allows one script to work for all 25 databases.
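The loop can be exercised outside MaxL to confirm it catches a split export. In this self-contained sketch, shell variables stand in for MaxL’s $3 and $4, and the temporary directory and file names are illustrative:

```shell
# Simulate an export that Essbase split at the 2 GB limit.
filepath=$(mktemp -d)
APP=Sample; DB=Basic                      # stand-ins for MaxL's $3 and $4
touch "$filepath/Lev0TEMP.txt" "$filepath/Lev0TEMP1.txt"
# Rename every piece, exactly as the MaxL shell line does:
for f in "$filepath"/Lev0TEMP*.txt; do
  mv "$f" "$(echo "$f" | sed "s/Lev0TEMP/$APP.$DB.Lev0/")"
done
ls "$filepath"   # lists Sample.Basic.Lev0.txt and Sample.Basic.Lev01.txt
rm -r "$filepath"
```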

One line of cool UNIX code saves the day!

Ultimately, the solution was the DATAEXPORT command, which is non-intrusive: it doesn’t require kicking users out of the system. You can use DATAEXPORT on both Windows and UNIX to export data from databases while users are working, but in a UNIX environment you can add a few shell commands to really make it shine.



About Paul Hoch

Paul Hoch has over 18 years of EPM and Hyperion experience, specializing in Essbase and Planning. He has successfully completed large-scale EPM implementations for clients in industries ranging from retail to government and financial services to health care. Paul has served as solution architect and team lead for implementations and he has developed and delivered advanced team training. He also blogs and presents regularly on EPM topics, including Essbase, Planning, Hybrid Cloud implementations, PBCS, EPBCS, Smart View, Calc Manager, migrating to the cloud, and more.

2 comments on “Dynamic Backups for Essbase in UNIX”

  1. Jose Jaimes says:

    Great job!

    You could have used a sub variable to set the export location. Just need to make sure you properly set the quotes on the sub variable value.

    Jose

  2. Antun says:

    Please add some information about data import. You mentioned little downtime; maybe I am wrong, but restoring the data via the DataImport function and recalculating it takes much more time.
