Friday 18 December 2015

Set Yesterday or Some Date Value as default in OBIEE Dashboard Prompt

The aim is to set the default value of a dashboard prompt to yesterday, or to some other date you desire, so that the prompt is pre-populated when you open the dashboard page.

This can be done in many ways. Here I am going to introduce three ways of doing the same:
  1. Using advanced SQL in the prompt (from the Answers side)
  2. Using a repository variable (from the RPD side)
  3. Using a derived column in the RPD (again using a repository variable)
We will discuss these one by one.

Using Advanced SQL in Prompt (From Answer side):

  1. Create a prompt on a date column as per your requirement.
  2. In the Default to drop-down, select SQL Results.
  3. Click on the ellipsis button provided under the SQL Results option.
  4. Write the following query in the editor and click OK.
(Here I am setting the default value to yesterday.)

SELECT CASE WHEN 1=0 THEN Times."Time Id" ELSE TIMESTAMPADD(SQL_TSI_DAY, -1, CURRENT_DATE) END FROM SH

In general, the query will be:

SELECT CASE WHEN 1=0 THEN "any date column" ELSE TIMESTAMPADD(SQL_TSI_DAY, -1, CURRENT_DATE) END FROM "SubjectAreaName"
  5. Now add this prompt to a dashboard page, and you will see the prompt filled with yesterday's date by default.
Note: If you want to set some other date as the default, simply change the ELSE part of the SQL.
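Other common defaults can be produced by changing only the ELSE expression. The following are sketches in OBIEE logical SQL; "any date column" and "SubjectAreaName" are placeholders for your own column and subject area:

```sql
-- Default to seven days ago
SELECT CASE WHEN 1=0 THEN "any date column"
       ELSE TIMESTAMPADD(SQL_TSI_DAY, -7, CURRENT_DATE) END
FROM "SubjectAreaName"

-- Default to the first day of the current month
SELECT CASE WHEN 1=0 THEN "any date column"
       ELSE TIMESTAMPADD(SQL_TSI_DAY, 1 - DAYOFMONTH(CURRENT_DATE), CURRENT_DATE) END
FROM "SubjectAreaName"
```

The CASE WHEN 1=0 branch never executes; it only forces the result to the date column's datatype so the prompt accepts it.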

Using repository variable (From rpd side):


1.   Open the RPD and go to Manage > Variables…. To create a dynamic repository variable, select the New Initialization Block option.
2.   Click on Edit Data Source…, write the query, and choose the corresponding connection pool.
The query is:
SELECT CURRENT_DATE - 1 FROM DUAL
(or)
SELECT SYSDATE - 1 FROM DUAL
3.   Click OK to confirm the query (you can test the query output here by clicking the Test button).
4.   Click on the Edit Data Target… option and click New… to create a new repository variable.
5.   Give the repository variable a name; here I am using YstrDay as the variable name. Click OK.
6.   Click OK, and OK again, to confirm creation of the initialization block.
7.   Save the RPD, then go to Answers and reload the server metadata.
8.   Go to the prompt you created earlier, and this time select Server Variable from the Default to option.

9.   Click on the ellipsis button under the Server Variable option and enter the repository variable name in the editor, i.e. YstrDay in my case.
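Once created, the repository variable can also be referenced in logical SQL (for example, in an analysis filter) using the VALUEOF() function. A minimal sketch, assuming a YstrDay variable; the subject area and column names are placeholders:

```sql
-- Restrict a report to yesterday via the repository variable
SELECT Times."Time Id", Sales."Amount Sold"
FROM SH
WHERE Times."Time Id" = VALUEOF("YstrDay")
```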

Conversion of date datatype to timestamp in OBIEE

In OBIEE, a direct date-to-timestamp conversion is not possible unless we use the EVALUATE function to call the database's own functions.

One way I found to convert a date column to its timestamp is:
First, cast the date column to CHAR, then cast that to TIMESTAMP.
Ex: CAST(CAST(CURRENT_DATE AS CHAR) AS TIMESTAMP)
  
You may wonder why we need to convert to a timestamp at all.
Well, when using TIMESTAMPDIFF with SQL_TSI_HOUR as the interval, we cannot pass date datatype columns, as it throws an error:
Ex: TIMESTAMPDIFF(SQL_TSI_HOUR, TimeID, CURRENT_DATE) will not work. It gives you the following error:

[nQSError: 10058] A general error has occurred. [nQSError: 22025] Function TimestampDiff is called with an incompatible type. (HY000)

Now, try it this way:
TIMESTAMPDIFF(SQL_TSI_HOUR, TimeID, CAST(CAST(CURRENT_DATE AS CHAR) AS TIMESTAMP))
It gives you proper results without any error.

Year Ago Calculation without Time Series Functions

This article explains how to calculate year-ago measures without using a time-series function.
Usually, for a requirement like showing a measure alongside its year-ago value, we would create two measures: the measure itself, and a YearAgoMeasure calculated through a time-series function.
This approach also avoids creating an extra column for the year-ago measure.
Here is the procedure:
I'm assuming you have a Year dashboard prompt assigned to the presentation variable var_Year.
1. Create a filter on the Year column in this way:
https://bischool.files.wordpress.com/2010/10/filter.jpg?w=652
2. The filter should look like this:
https://bischool.files.wordpress.com/2010/10/advanced-filter.jpg?w=652
3.  Go to the Pivot Table view, arrange the columns in this fashion, and view the results.
You will notice that the results show values for both the current (selected) year and the previous year.
Note: Apply descending order on the Year column.
You may not like the labels showing the year values; instead, you may want labels like Current and Prior.
This can be done by writing the following expression for the Year column:
CASE WHEN Time."Fiscal Year" = @{var_Year} THEN 'Current' WHEN Time."Fiscal Year" = @{var_Year}-1 THEN 'Prior' END
The limitation of this approach is that we are using an advanced filter, so dynamically changing years will not appear in the filters view (if you add a filters view to the compound layout).
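The filter screenshots are not reproduced here, but to match the CASE expression above, the advanced filter would need to keep the selected year and the year before it. A plausible sketch of that condition, assuming the same Time."Fiscal Year" column and var_Year presentation variable:

```sql
Time."Fiscal Year" BETWEEN @{var_Year}-1 AND @{var_Year}
```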

Tuesday 15 December 2015

What is Surrogate key


A surrogate key is a substitute for the natural primary key.
It is simply a unique identifier or number for each row that can be used as the primary key of the table. The only requirement for a surrogate primary key is that it is unique for each row in the table.
Data warehouses typically use a surrogate key (also known as an artificial or identity key) for dimension table primary keys. It can be generated using an Informatica sequence generator, an Oracle sequence, or SQL Server identity values.

It is useful because the natural primary key (e.g. Customer Number in a Customer table) can change, and this makes updates more difficult.

Some tables have columns such as AIRPORT_NAME or CITY_NAME which are stated as the primary keys (according to the business users), but not only can these change, indexing on a numerical value is usually better. You could therefore consider creating a surrogate key called, say, AIRPORT_ID. This would be internal to the system, and as far as the client is concerned you may display only the AIRPORT_NAME.

Another benefit you get from surrogate keys (SIDs) is tracking Slowly Changing Dimensions (SCDs).

Example:
On the 1st of January 2002, employee 'E1' belongs to business unit 'BU1' (that is what would be in your Employee dimension). This employee has turnover allocated to him on business unit 'BU1'. But on the 2nd of June, employee 'E1' is moved from business unit 'BU1' to business unit 'BU2'. All the new turnover has to belong to the new business unit 'BU2', but the old turnover should still belong to business unit 'BU1'.

If you used the natural business key 'E1' for the employee within your data warehouse, everything would be allocated to business unit 'BU2', even what actually belongs to 'BU1'.

If you use surrogate keys, you could create, on the 2nd of June, a new record for employee 'E1' in your Employee dimension with a new surrogate key.

This way, in your fact table, your old data (before the 2nd of June) carries the SID of employee 'E1' + 'BU1', and all new data (after the 2nd of June) takes the SID of employee 'E1' + 'BU2'.

You could consider the Slowly Changing Dimension as an enlargement of your natural key: the natural key of the employee was the employee code 'E1', but for you it becomes

Employee Code + Business Unit: 'E1' + 'BU1' or 'E1' + 'BU2'. The difference from a natural-key enlargement, however, is that you might not have every part of the new key within your fact table, so you might not be able to join on the enlarged key; hence you need another ID.
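The example above can be sketched in plain SQL. This is an illustration only; the table and column names are made up for the sake of the example:

```sql
-- Hypothetical SCD Type 2 employee dimension: the natural key 'E1'
-- appears twice, once per business unit, each with its own surrogate key.
CREATE TABLE dim_employee (
    employee_sid   NUMBER PRIMARY KEY,   -- surrogate key (SID)
    employee_code  VARCHAR2(10),         -- natural key
    business_unit  VARCHAR2(10),
    effective_from DATE,
    effective_to   DATE
);

INSERT INTO dim_employee VALUES (1, 'E1', 'BU1', DATE '2002-01-01', DATE '2002-06-01');
INSERT INTO dim_employee VALUES (2, 'E1', 'BU2', DATE '2002-06-02', DATE '9999-12-31');

-- Fact rows store the SID, not the natural key, so turnover booked before
-- the 2nd of June rolls up to BU1 and turnover booked after rolls up to BU2.
SELECT d.business_unit, SUM(f.turnover) AS total_turnover
FROM   fact_turnover f
JOIN   dim_employee d ON d.employee_sid = f.employee_sid
GROUP  BY d.business_unit;
```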

Implicit Fact Column

An implicit fact column is used when we have multiple fact tables and a report is generated using only dimension columns.
A user may request a report that has only dimensions and no fact columns. In this case, the BI Server may be unsure which fact table it should join through to fetch the data, so it joins through the nearest fact table and pulls the data through it. The report output may therefore be different from what the user expects. To avoid this kind of error, we need to set an implicit fact column.

The goal of this is to guide the BI Server to make the best choice between two possible query paths.
We can set a fact attribute (measure) as an implicit fact column.
We can also create a dummy implicit fact column and assign any numeric value to it.

We can set implicit fact column in presentation catalog properties.

1.  Go to the properties of the presentation catalog in the Presentation layer.
2.  In the Implicit Fact Column section, click Set and select any measure column from the fact table.
3.  Click OK.
4.  Save your work.

Instead of selecting any fact measure column as implicit fact column, we can also define a dummy implicit fact.
1.  Create a physical column in the fact table in the Physical layer.
2.  Name it Implicit_Column.
3.  Drag this column into the fact table in the BMM layer.
4.  Double-click the logical table source of the fact table.
5.  In the Content tab, assign any numeric value to Implicit_Column.

Once this is done we can set this column as Implicit Fact Column in Presentation catalog as mentioned above.

Wednesday 25 November 2015

load plan error: Caused By: oracle.odi.jdbc.datasource.ConnectionTimeoutException: A login timeout occured while connecting to the database

Error: The load plan failed due to a login timeout error.
Sol:
  1. Add the below parameter in listener.ora:
CONNECT_TIMEOUT_LISTENER=0

  2. Add the below parameters in sqlnet.ora:
SQLNET.EXPIRE_TIME = 0
SQLNET.OUTBOUND_CONNECT_TIMEOUT = 0
SQLNET.INBOUND_CONNECT_TIMEOUT = 0

Tuesday 24 November 2015

ORA-32001: write to SPFILE requested but no SPFILE specified at startup


Sol: This error occurs when the instance was started using a pfile (no spfile), so ALTER SYSTEM changes cannot be written to an spfile. Create an spfile from the pfile and restart:
SQL> show parameter spfile

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
spfile                               string

SQL> show parameter pfile

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
spfile                               string

SQL> show parameter dump

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
background_core_dump                 string      partial
background_dump_dest                 string      /slot/oracle/11.2.0/log/diag/rdbms/oracledb1/oracledb1/trace
core_dump_dest                       string      /slot/oracle/11.2.0/log/diag/rdbms/oracledb1/oracledb1/cdump
max_dump_file_size                   string      10M
shadow_core_dump                     string      PARTIAL
user_dump_dest                       string      /slot/oracle/11.2.0/log/diag/rdbms/oracledb1/oracledb1/trace
SQL> CREATE SPFILE FROM PFILE;
File created.
SQL> shutdown immediate
Database closed.
Database dismounted.
ORACLE instance shut down.

SQL> startup

ORACLE instance started.
Total System Global Area 2154590208 bytes
Fixed Size 2230112 bytes
Variable Size 1325402272 bytes
Database Buffers 805306368 bytes
Redo Buffers 21651456 bytes
Database mounted.
Database opened.

SQL> show parameter pfile

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
spfile                               string      /slot/oracle/11.2.0/dbs/spfileoracledb1.ora
SQL> ALTER SYSTEM SET processes = 5000 SCOPE=SPFILE;

System altered.

Friday 20 November 2015

Redeploy the Java EE ODI Agent in WebLogic


    1. In the WLS Console, navigate to Deployments and select oraclediagent from the list. Click the Delete button at the top of the table to delete the deployment of the oraclediagent enterprise application. Click Activate Changes.
    2. Still on the Summary of Deployments page, click Lock & Edit and then click the Install button.
    3. On the Application Install Assistant, provide the path to the oraclediagent.ear file. This path should be <ODI Home>/setup/manual/oracledi-agent. Select oraclediagent.ear using the radio button displayed.
    4. Click Next. Accept the default selections on the screen.
    5. Click Next. Select the ODI Managed Server as the deployment target.
    6. Click Next. Accept the defaults on the Option settings page.
    7. Click Next. Review the choices and then click Finish.
    8. Click Activate Changes.
    9. On the Summary of Deployments page, select the oraclediagent enterprise application and then click the Start button: Servicing All Requests. (Click Lock & Edit, if necessary.)
The oraclediagent should successfully start.

Wednesday 18 November 2015

Configure Initial Extract Date in obia 11g

Initial Extract Date is required when you extract data for a full load. It reduces the volume of data in the initial load. The specified initial extract date will be used as a filter on the creation date of the transactional data in the selected full extract mapping. The date format is YYYY-MM-DD, for example, 2014-12-31. The default date is January 01, 1970.

  1. Select Manage Implementation Projects in FSM.
  2. Select Configure Initial Extract Date.
  3. Click the Go to Task button.
  4. Click on the date in the Global Parameter Value column. A warning window will appear; click OK.
  5. Set the required date, then Save and Close.

Tuesday 17 November 2015

Configuring Global Currencies

Currency conversions are required because your business might have transactions involving multiple currencies. To create a meaningful report, you have to use a common currency.

The Oracle Business Analytics Warehouse stores amounts in the following currencies:
1.      Document currency: the currency of the transaction.
a.      Ex: if you purchase a chair from a supplier in Mexico, the document currency is probably the Mexican peso. Or, if you made a business trip to the United Kingdom and filed an expense report for meal expenses there, the document currency of the expense report would most likely be GBP.
2.      Local currency: the base currency of your ledger, that is, the currency in which your accounting entries are recorded.
3.      Global currencies: Oracle BI Applications provides three global currencies, which are the common currencies used by Oracle Business Analytics Warehouse.
a.      Ex: if your organization is a multinational enterprise headquartered in the United States, you probably want to choose US dollars (USD) as one of the three global currencies. The global currency is useful when creating enterprise-wide analyses; for example, a user might want to view enterprise-wide data in other currencies.
For every monetary amount extracted from the source, the load mapping loads the document and local amounts into the target table. It also loads the exchange rates required to convert the document amount into each of the three global currencies. For fact tables, there are two amount columns covering the local currency amount and the document currency amount. In addition, there are three columns covering the global currencies (for example, global_amount1) and their corresponding exchange rate columns.
In most cases, the source system provides the document currency amount, which is the default setting for handling currency. If the source system provides only the document currency amount, the source adapter performs lookups to identify the local currency codes based on the source system to which the appropriate currencies are assigned. After the lookups occur, the extract mapping provides the load mapping with the document currency amount and the document and local currency codes. The load mapping then uses the provided local currency codes to perform currency conversion and derive the local amount. The load mapping also fetches the global currency setup and looks up the corresponding exchange rates for each of the three global currencies.
b.      To specify global currencies, use the parameters GLOBAL1_CURR_CODE, GLOBAL2_CURR_CODE, and GLOBAL3_CURR_CODE.

Before you configure global currencies using parameters GLOBAL1_CURR_CODE, GLOBAL2_CURR_CODE and so on, you must configure the Currency domain, as follows:

1.      In Oracle BI Applications Configuration Manager, select the Manage Externally Conformed Domains link in the Tasks pane to display the "Manage Externally Conformed Domains dialog".
2.      In the Product Line drop down list, select Not Specified Product Line.
3.      In the Externally Conformed Domains list, select Currency, then click Configure Domain to start the configuration wizard.
4.      Follow the on-screen instructions on the configuration wizard.
5.      Click Save.