Purging the Cache Using iBots


Purging the Cache Using iBots — Tuesday, 19 August 2008

Just as you can create (seed) the cache using iBots, you can also purge it with them. This option is especially helpful when you do not have the option of the event polling tables.

While creating the iBot, schedule it for whenever you need the cache purged. Then, in the 'Advanced' tab, click 'Add Action' under the section 'Execute these actions when iBot conditions are satisfied'. Select the 'Custom Script' option in this 'Add Action' menu and specify the name of the JavaScript file that contains the script to purge the cache.

The following sample JavaScript needs to be placed in this location: {drive}\OracleBI\server\Scripts\Common

// purgeSASCache.js
//
// Purges the cache on SAS.
// Parameter(0) - The user name to pass in to NQCMD.
// Parameter(1) - The password for the aforementioned user.
/////////////////////////////////////////////////////////

// The full path to nqcmd.exe
var nqCmd = "C:\\OracleBI\\server\\Bin\\nQCmd.exe";

// The data source name
var dsn = "AnalyticsWeb";

// The user to execute the queries
var user = "Administrator";

// The password of the aforementioned user
var pswd = "Administrator";

// The ODBC procedure call for purging the cache
var sqlStatement = "{call SAPurgeAllCache()};";

//////////////////////////////////////////////////////////
// Returns a string from the file name
//////////////////////////////////////////////////////////
function GetOutput(fso, fileName)
{
   var outStream = fso.OpenTextFile(fileName, 1);
   var output = outStream.ReadAll();
   outStream.Close();
   return output;
}

//////////////////////////////////////////////////////////
// Get WshShell object and run nqCmd. Capture the output
// so that we can handle erroneous conditions.
var wshShell = new ActiveXObject("WScript.Shell");

// Create a temp file to input the SQL statement.
var fso = new ActiveXObject("Scripting.FileSystemObject");
var tempFolder = fso.GetSpecialFolder(2);
var tempInFileName = fso.GetTempName();
var tempOutFileName = fso.GetTempName();
tempInFileName = tempFolder + "\\" + tempInFileName;
tempOutFileName = tempFolder + "\\" + tempOutFileName;
var tempInFile = fso.CreateTextFile(tempInFileName, true);
tempInFile.WriteLine(sqlStatement);
tempInFile.Close();

try
{
   // execute
   var dosCmd = nqCmd + " -d \"" + dsn + "\" -u \"" + user
      + "\" -p \"" + pswd + "\" -s \"" + tempInFileName + "\"" +
      " -o \"" + tempOutFileName + "\"";
   wshShell.Run(dosCmd, 0, true);
   var output = GetOutput(fso, tempOutFileName);

   // Remove the temp files
   fso.DeleteFile(tempInFileName);
   if (fso.FileExists(tempOutFileName)) {
      fso.DeleteFile(tempOutFileName);
   }

   // Check the output for any errors
   if (output.indexOf("Processed: 1 queries") == -1) {
      ExitCode = -1;
      Message = output;
   }
   else if (output.indexOf("Encountered") != -1) {
      ExitCode = -2;
      Message = output;
   }
   else {
      ExitCode = 0;
   }
} catch (e) {
   if (fso.FileExists(tempInFileName)) {
      fso.DeleteFile(tempInFileName);
   }
   if (fso.FileExists(tempOutFileName)) {
      fso.DeleteFile(tempOutFileName);
   }
   throw e;
}
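Note that although the header comments describe Parameter(0) and Parameter(1) as the user name and password passed in from the iBot, this version of the script simply hardcodes them in the user and pswd variables; adjust the script or the iBot's script parameters so that the two agree.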

OBI EE Automated Cache Purging and iBots Trigger

Filed under: OBI EE — dipayansengupta @ 10:25 AM

OBI EE cache purging is time based; there is no direct way to automate cache purging based on ETL completion. The workaround is to write a shell script that purges the cache once the ETL process has completed and then immediately runs the iBots.

The script logic is as follows (a rough sketch of such a script appears after the steps):

1. Check the CTRL table (where data-loading completion is logged) or the ETL metadata table every 15-20 minutes.

2. If the table has been loaded, purge the cache only for that table:

Call SAPurgeCacheByTable( 'DBName', 'CatName', 'SchName', 'TabName' );

3. Once the cache is purged, run only those iBots which use the table, using the command below:

./saschinvoke -u userid/password -j $jobid
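A rough sketch of such a script is shown here. It assumes a Unix-style OracleBI installation; the installation paths, the ETL control table and its columns, the database credentials, the physical database/schema/table names and the scheduler job id are all placeholders that you would replace with your own values.

#!/bin/sh
# Sketch only: paths, credentials, table names and the job id are placeholders.
NQCMD=/u01/OracleBI/server/Bin/nqcmd                 # assumed path to nqcmd
SASCHINVOKE=/u01/OracleBI/server/Bin/saschinvoke     # assumed path to saschinvoke
DSN=AnalyticsWeb
BIUSER=Administrator
BIPWD=Administrator
JOBID=12                                             # scheduler job id of the dependent iBot

while true
do
  # 1. Check a hypothetical ETL control table for a completed load of W_SALES_F
  LOADED=$(sqlplus -s etl_user/etl_pwd@ETLDB <<EOF
set heading off feedback off
select count(*) from etl_ctrl
where table_name = 'W_SALES_F' and load_status = 'COMPLETE';
EOF
)
  LOADED=$(echo $LOADED)    # strip any whitespace around the returned count

  if [ "$LOADED" -gt 0 ]
  then
    # 2. Purge the BI Server cache for that table only
    echo "Call SAPurgeCacheByTable('OrclDW', '', 'DW_SCHEMA', 'W_SALES_F');" > /tmp/purge.sql
    $NQCMD -d $DSN -u $BIUSER -p $BIPWD -s /tmp/purge.sql -o /tmp/purge.out

    # 3. Re-run only the iBots that report on this table
    $SASCHINVOKE -u $BIUSER/$BIPWD -j $JOBID
    break
  fi

  # Wait 15 minutes before checking the control table again
  sleep 900
done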


Automating cache purging without using OBIEE Scheduler

My team had a requirement to automate the cache purging feature in OBIEE without using the Scheduler services. Here is why this scenario came up.

We had around 100 iBots scheduled in the test environment. The developers did not want to bring the Scheduler services up, as it would send wrong results to users, and we did not want to disable all the iBots by opening them one by one manually just so that we could bring the Scheduler services up.

So I had to schedule the cache purging feature without using the Scheduler feature of OBIEE.


Here is how we can do this using the Windows Task Scheduler.

1. Go to Control Panel > System and Security > Administrative Tools > Schedule tasks.

2. Click on Schedule tasks. It will open the Task Scheduler window.


3. On the right-hand side, click on Create task. It will open a new window to create the task.


4. Now give the task a name. Go to the Triggers tab and click the New button to open the New Trigger window.


5. Change the trigger settings as required. Here I have set the task to repeat every 1 hour.

6. Go to the Actions tab and click the New button to open the New Action window. Select the .exe file you want to schedule; here we select nQcmd.exe, which is present in the c:\oraclebi\server\Bin folder. Add the arguments required to run nQcmd.exe and purge the cache. Here the arguments are:

-d AnalyticsWeb -u Administrator -p Administrator -s c:\oraclebi\server\Bin\SQL.txt

-d is the Data Source Name, i.e. AnalyticsWeb
-u is the user name, i.e. Administrator
-p is the password, i.e. Administrator
-s is the path of the file where the purge cache command is written

Create a file SQL.txt and write the purge command in it:

{call SAPurgeAllCache()};


Place the file in the Bin folder. An equivalent command line you can test by hand is shown below.
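For reference, this action is equivalent to running the following from a command prompt (the -o argument is optional and simply captures nQcmd's output to an assumed file name):

C:\OracleBI\server\Bin\nQCmd.exe -d AnalyticsWeb -u Administrator -p Administrator -s C:\OracleBI\server\Bin\SQL.txt -o C:\OracleBI\server\Bin\purge_output.txt

Running it once by hand is a quick way to confirm the arguments before handing them over to Task Scheduler.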

7. Now, in the Conditions tab, you can set the conditions as per your choice.


8. Once you finish, your scheduled task will appear in the Task Scheduler library, and it will trigger automatically and purge the cache in OBIEE.

I would like to thank my friend Diptesh for helping me in this scenario.


OBIEE – iBot Tracking

Posted on December 2, 2010 by John Lamont Watson

We have Delivers functionality in OBIEE to schedule report delivery, and we also have a model provided for tracking application usage. What we don't seem to have is a model for tracking iBot usage.

I have a requirement to deliver an email to an administrative group once the daily data load is complete, or an error report should the load fail. This is fine, out-of-the-box Delivers functionality. However, the client does not want to receive the report more than once per day; we can do this as a conditional request, but how do we know whether a report has already been emailed out today?

Below are the steps that I have completed in order to provide a Subject Area for Usage Tracking – iBots. I make no effort to cover out-of-the-box usage tracking or out-of-the-box Delivers functionality; this information is already available in abundance, but please ask if you are unsure.

In The Physical Layer

If you have already deployed Usage Tracking then the following tables will be available in your database; import them into the Physical Layer.

S_NQ_JOB
S_NQ_JOB_PARAM
S_NQ_INSTANCE

These tables are shown below, among others. Create an Alias for each of these tables to represent your dimensions, and create an additional Alias for the table S_NQ_INSTANCE to represent your fact table.

Please ignore the table and alias for S_NQ_ERR_MSG; it is not necessary for my requirement, but is an optional extra.


Create the following Physical Tables and Aliases.

Primary Keys for these Alias tables should be created as below:

Dim_S_NQ_JOB = JOB_ID
Dim_S_NQ_JOB_PARAM = JOB_ID
Dim_S_NQ_INSTANCE = JOB_ID, INSTANCE_ID
Fact_S_NQ_INSTANCE = JOB_ID, INSTANCE_ID

The screenshot below shows how these Alias tables should relate to one another; the joins themselves are listed below the screenshot.

Create physical joins as shown:

Dim_S_NQ_JOB.JOB_ID = Fact_S_NQ_INSTANCE.JOB_ID
Dim_S_NQ_JOB.JOB_ID = Dim_S_NQ_JOB_PARAM.JOB_ID
Dim_S_NQ_INSTANCE.INSTANCE_ID = Fact_S_NQ_INSTANCE.INSTANCE_ID AND Dim_S_NQ_INSTANCE.JOB_ID = Fact_S_NQ_INSTANCE.JOB_ID


In The Logical Layer

We have two dimensions to create and one fact table. I'll go through these three tables individually below; if a change is not detailed then you don't need to make it.

Dim – iBot Agent

We'll start with the Job dimension. Drag the Alias Dim_S_NQ_JOB into the Usage Tracking Business Model and open the table source for this logical table; edit it as below.

Create the Job Dimension Table Source as shown

You can see that we have added an Inner Join to the Alias Dim_S_NQ_JOB_PARAM, and that's it. We can now map the columns that we need as below. Please remove all other columns from the logical table.

Map the Columns as shown


We also need to restrict the rows returned by this table; navigate to the Content tab. In particular, we are restricting S_NQ_JOB and S_NQ_JOB_PARAM to a 1:1 mapping.

Configure the Where Clause for the table

Dim – iBot Instance

I have called this dimension Dim – iBot Instance. Drag the Alias Dim_S_NQ_INSTANCE into the Usage Tracking Business Model. This Logical Table requires only a single source, so we can move straight to the Column Mapping tab. Please map the columns as below and remove all other columns.


Map Columns as shown

Again, we need to restrict the rows returned for this logical table; navigate

to the Content tab and complete as below.

Add the Where Clause to the table

I found it useful to add four additional columns to this dimension. The four columns are listed below, and a rough sketch of possible expressions for them follows the list.

Dim – iBot Instance: Additional Start Date Column

Start Date column is useful

Dim – iBot Instance: Additional End Date Column 


As is this one

Dim – iBot Instance: Amend Status Column

Make the Status Column more useful

Dim – iBot Instance: Additional Action Flag Column

'Y' If Email Sent, 'N' Otherwise
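As a purely illustrative sketch of these expressions (assuming the instance alias exposes BEGIN_TS, END_TS, STATUS and EXIT_CODE columns, and with invented status-code mappings that you would replace with the values used in your own scheduler schema), they might look something like:

Start Date:  CAST(Dim_S_NQ_INSTANCE.BEGIN_TS AS DATE)
End Date:    CAST(Dim_S_NQ_INSTANCE.END_TS AS DATE)
Status:      CASE Dim_S_NQ_INSTANCE.STATUS WHEN 0 THEN 'Completed' WHEN 1 THEN 'Running' ELSE 'Failed' END
Action Flag: CASE WHEN Dim_S_NQ_INSTANCE.EXIT_CODE = 0 THEN 'Y' ELSE 'N' END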

Fact – iBot Instances


The fact table uses a single source; drag the Alias Fact_S_NQ_INSTANCE into the Usage Tracking Business Model, navigate to the Column Mapping tab and configure the facts. No WHERE clause needs to be added to this Logical Table.

Map the Fact Columns

I have added a Fact measure for the Elapsed Execution Time of an iBot; the expression for the column is shown in the screenshot. The column aggregates as a sum.

This additional fact column is likely to be useful
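A minimal sketch of that expression, assuming BEGIN_TS and END_TS timestamp columns on the fact source, would be:

TIMESTAMPDIFF(SQL_TSI_SECOND, Fact_S_NQ_INSTANCE.BEGIN_TS, Fact_S_NQ_INSTANCE.END_TS)

with the measure's default aggregation rule set to Sum.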

In The Presentation Layer

In the Presentation Layer I have exposed the majority of the columns that exist in the Logical Layer. I have added a couple more columns other than those detailed above, and no doubt you will come up with your own useful additions.

Create your Presentation Catalog

The End Report

That's about it for the RPD. In my case I finished up creating the report below. I added this report as a condition for an iBot: the email is only sent if this report shows no results. This should mean that the iBot only sends the email once per day. (A sketch of what the condition logic might look like follows the caption below.)

Create your condition Report
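As an illustration only (the actual column names will depend on how you mapped them), such a condition report might select Job Name and Start Date from the new subject area, filtered on the delivery iBot's Job Name, Start Date = CURRENT_DATE and Action Flag = 'Y'; the delivery iBot is then configured to run only when this conditional request returns no rows.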


There are lots of extensions to this schema that could enhance it greatly. I have thought of a couple this morning, but as they are not required I'll not have the time to implement them. Hopefully you found this helpful/interesting; let me know if you extend it in an interesting way. Good luck!

OBIEE: Seeding the Dashboards for End Users and Purging on a Nightly Basis

Many dashboards on most systems can be seeded with an iBot that runs overnight and caches the reports to the dashboards, for quick access and fewer queries to the database. We created a series of iBots that are triggered by an initial iBot that purges the old cache and starts the process of reseeding the data. First we created the iBot Purge Cache.

This iBot is run at 5:00 am every night, after the ETL completes, by the Administrator user. The delivery content is the first report we needed to cache. The destination for the iBot is the Oracle BI Server Cache.


Notice the JavaScript purgeSASCache.js in the Advanced section; below are the parameters that were set in the Advanced tab.

The JavaScript file purgeSASCache.js should be placed in the folder OracleBI/Server/Scripts/Common. It is the same purgeSASCache.js script listed at the start of this document.


OBIEE – Multi User Development

Posted on February 2, 2011 by John Lamont Watson

The concept of the MUDE, or Multi User Development Environment, was

introduced way back with Siebel Analytics 7.7. In theory it is really quite

simple; in practice, although useful, it can be tedious.

Setting up the MUD

Set up is very easy. Primarily all you need is a shared folder on a filesystem,

accessible to all of the relevant developers.

Open the Administration Tool


Select Options from the Tools Menu, as in the screenshot below.

Select Options from Tools Menu

Select the Multi-User Tab

Browse to the Folder containing your Multi User RPD File

Enter your Name and Click OK

Complete the Options' Multiuser tab

And actually that's it; you're set up. Each developer that requires access must follow the same process.

Using the MUD

For the rest of this article we will use the samplesales.rpd file as an

example.  The RPD is organized into 2 Projects.  When we want to work on

the RPD in a shared environment we check out the project that we want to


work on; make our changes; merge our code back into the shared repository

and then, once totally happy, publish our merged repository to replace the

current version.

Checking Out a Project

Select Multiuser, Checkout from the File Menu

Select Multiuser Checkout from the File Menu

You will be prompted for a Username and Password for the MUD RPD

Provide Username and Password

From the Project List, select the Project(s) that you would like to

Checkout and Click OK

Select Project(s) to Checkout

A subset of the Current Repository will be created locally, containing

only the project(s) selected.  Choose a name and location for the file; I

would just accept defaults; and click Save.


Save your Modified, or subset, Repository File

The subset Repository, known as the Modified version, is now stored locally

and should be open in the Admin Tool.  This is the point at which you should

generally make changes.

Merging the Project back into the MUDE

Select Multiuser, Merge Local Changes from the File Menu

Select Multiuser Merge Local Changes from the File Menu

You should receive a request for Lock information, as below. If you don't, you will probably receive information that the Repository is already locked; the information may tell you when it is likely to be available, or you may have to contact the person currently locking it. You don't need to complete the Lock Information, but it may help your fellow developers. Click OK.


Enter Lock Information and Click OK

You will get a Merge Repositories dialog box. Any conflicts between your Modified RPD and the Current RPD will be detailed here; for each conflict you must specify which RPD is correct, and the Merge button will not be available until all conflicts are resolved. Resolve any conflicts and click Merge.

Note: When you Checkout, a copy of the project(s) checked out will be contained in your Modified Repository,

stored locally; this will contain any changes that you have made during Checkout.  At the point of Checkout a copy

of the whole MUD Repository is also taken and this is referred to as the Original Repository.  The Current

Repository is the MUD, containing any changes made since your Checkout.  The 3 versions of the RPD are

required for the 3 Way Merge.

At this point you should check the consistency of the Merged Repository; click Yes.

Click Yes to Check Consistency


All going well, you should receive a blank sheet as below; click to close the Consistency Check.

Click to Close the Consistency Check

You should now have a copy of the Current Repository with your changes

merged.  At this point we have not committed to the merge; we can still

Discard Local Changes; although we would lose our changes, we would not

impact the MUD.  We have also locked the MUD so no other changes can be

made until we either discard these changes or publish them to overwrite

the Current Repository.

Publishing the Merged Repository

Select Multiuser, Publish to Network from the File Menu

Select Multiuser - Publish to Network

At this point you will be asked again whether you would like a Consistency

Check.  If you have made no further changes since the Merge then there is

no reason to.  However, you are able to make changes post merge, whilst

the Repository is still locked; this second Consistency Check should be used

in this case.


When to Configure the Merged RPD

We should configure a locked RPD, post Merge, when our changes could be

considered global to the RPD, such as changing the Administrator

password.  It is not necessary, but may be useful for example if we

introduced a new Dimension that joins to every Fact.  And as you start to

use the MUD you will come to learn there are a couple of strange ‘features’

that can be overcome by editing in the locked RPD.

You should remember that this is not a risky approach as you are not

committed until you Publish your changes; but locking the RPD for longer

periods may impact other developers.

OBIEE Project Management

So far I’ve not mentioned how you organize your RPD into Projects, but you

will need to know how to do this and how to make changes.  You can

manage projects in either the Checked Out or Locked RPD.

Select Projects from the Manage Menu

Select Project from the Manage Menu

The Project Manager will open; double click to view or edit an existing

Project


Double Click on an existing Project to View it

Note: To create a new Project, select New Project from the Action Menu

In the Project Definition Window the applet on the left shows Objects

available in the RPD and the applet on the right shows those Objects

included in this Project.  Use the Add and Remove buttons to manage

the Objects contained within this Project and click OK.

Choose the Objects contained in the Project

Note: An object can be contained in more than one Project; and indeed the same Project can be checked out by

many different users at the same time.

The next time you Checkout, any new projects should be displayed; and

edited projects should reflect any changes made.

The Oracle BI Server Event Polling Table

The Oracle BI Server does not populate the event polling table. The event

table is populated by inserting rows into it each time that a table is updated.

This process is normally configured by the database administrator, who


typically modifies the load process to insert a row into the polling table each

time a table is modified. This can be done from the load script, using

database triggers (in databases that support triggers), from an application,

or manually. If the process of populating the event table is not done

correctly, then the Oracle BI Server cache purging is affected, because the

server assumes the information in the polling table is correct and up to date.
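As a minimal sketch of what that might look like, assuming an event table created from Oracle's sample DDL (commonly named S_NQ_EPT, with UPDATE_TYPE, UPDATE_TS, DATABASE_NAME, CATALOG_NAME, SCHEMA_NAME and TABLE_NAME columns) and a hypothetical fact table W_SALES_F, the end of the load script could run:

insert into S_NQ_EPT (UPDATE_TYPE, UPDATE_TS, DATABASE_NAME, CATALOG_NAME, SCHEMA_NAME, TABLE_NAME)
values (1, CURRENT_TIMESTAMP, 'OrclDW', NULL, 'DW_SCHEMA', 'W_SALES_F');
commit;

The database, schema and table names must match the names used in the Physical Layer. On its next polling cycle the Oracle BI Server purges the cache entries that reference W_SALES_F and then deletes the processed row from the event table.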