Problem with logical column calculation

Hi,
I have a logical table with two data sources, DS1 and DS2, defined as follows:
DS1: Physical table TABLE_A, where condition: TABLE_A.A_YEAR = EXTRACT(YEAR FROM CURRENT_DATE)
DS2: Physical table TABLE_A, where condition: TABLE_A.A_YEAR = EXTRACT(YEAR FROM CURRENT_DATE) - 1
Columns:
Field1: DS1.VAL1
Field2: DS2.VAL1
Field3: CASE WHEN EXTRACT(MONTH FROM CURRENT_DATE) > 11 THEN DS1.VAL1 ELSE DS2.VAL1 END
Field1 and Field2 show the correct values in Answers, but Field3 always shows the value of DS1.VAL1.
Looking at the properties of the field, "Data Type derives from physical sources:" shows the following:
"case when extract(month from DATE '2008-10-15') > 10 then sum(TABLE_A.VAL1) else sum(TABLE_A.VAL1) end"
So nothing indicates that the returned columns come from different logical data sources...
What do I have to do to get the correct value?
Thanks
chrissy

Hi user649490,
that was only an example, so it's just a typing error...
Because the month is extracted in the condition, I have to do it for all months...
Anyway, I have found my error... I copied data source one and thought I had unmapped all the columns that were mapped by the first data source. Looking at the column mapping again, I noticed that the columns were mapped twice, by both data source 1 and data source 2. I changed this and it's working fine now...
So sorry for bothering you with my stupidity :-)
chrissy

Similar Messages

  • Logical Column(s) a.k.a. Pre-calculated Measures

    I am looking for best practices around logical columns, either in the Presentation Layer or the Business Layer. Specifically, I want to know:
    1) Is it advisable to have logical columns?
    2) How many are good to have? Should one create logical columns for all frequent calculations which are done on dashboards?
    3) Are there any performance implications?
    4) Can we use Time Series function with logical columns? Like AGO etc.
    In short I am looking for intelligent pros and cons of such implementations.
    PS: Logical columns are derived from physical columns, e.g. Profit = Income - Expense.
    Regards.

    Hi
    1) All complex logic should be in the BMM layer of the RPD. Yes, it's advisable to have them in the BMM, although another good practice is to delegate complex calculations to the ETL if possible.
    2) As many as deemed necessary. Yes.
    3) No, not really. But if you build calculated columns in Answers instead, you have to be creative in propagating them to other reports.
    4) Yes you can. But again, this is something that's better to have in the RPD.
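    For illustration, a simple pre-calculated measure in the BMM could be a logical column whose derivation references existing logical columns (a sketch; the "Finance" business model and the Income/Expense names are made up, not from this thread):
        "Finance"."Fact - GL"."Income" - "Finance"."Fact - GL"."Expense"
    The expression lives once in the RPD, so every analysis that pulls the Profit column reuses the same calculation instead of repeating it per report.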

  • Logical Column for calculation

    Hi guru,
    My requirement is to display a report with the total count of Opportunities, the count of new Opportunities (created within 30 days from today), the count of old Opportunities (created more than 30 days ago), and Overdue (where the closed date is > current date).
    For this requirement I created one logical column, and I get an error whenever I use TIMESTAMPDIFF.
    Could you please let me know how I can create these three logical columns in the metadata?
    I really appreciate your help.

    Here is what you need to do:
    1. Create a Session Variable in your RPD
    2. Check the "Enable any user to set the value" checkbox in the Session Variable properties
    3. In your dashboard prompt, select "Request Variable" in the Set Variable section
    4. Enter the name of your Session Variable in the box that appears
    5. Now you can reference the session variable in the formula of your logical column. Syntax: VALUEOF(NQ_SESSION.YOUR_SESSION_VARIABLE)
    Note: You cannot reference Presentation Variables in the RPD
    -Dave
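    As a rough sketch (not from this thread; the "Sales"."Opportunity" table and its column names are assumptions), the three conditional counts could be modelled as derived columns with a Count Distinct aggregation rule:
        New Opptys: CASE WHEN TIMESTAMPDIFF(SQL_TSI_DAY, "Sales"."Opportunity"."Created Date", CURRENT_DATE) <= 30 THEN "Sales"."Opportunity"."Opportunity ID" END
        Old Opptys: CASE WHEN TIMESTAMPDIFF(SQL_TSI_DAY, "Sales"."Opportunity"."Created Date", CURRENT_DATE) > 30 THEN "Sales"."Opportunity"."Opportunity ID" END
        Overdue: CASE WHEN "Sales"."Opportunity"."Closed Date" > CURRENT_DATE THEN "Sales"."Opportunity"."Opportunity ID" END
    With the session variable approach above, CURRENT_DATE can be replaced by VALUEOF(NQ_SESSION.YOUR_SESSION_VARIABLE).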

  • BMM derived logical column - bounced visit

    In Web Metrics, a "bounced visit" is when a session views only one page on a website and then leaves. In my fact table, I capture the "total pages" viewed for each session and I define this metric in my Business Model with an aggregation rule of "sum". I'm trying to use this metric to derive the "bounced visit" metric but I'm running into issues.
    Session ID Total Pages
    1179860475     5
    1179861625     1 <= This is a bounced visit
    1179861920     7
    1179866260     2
    1179868693     13
    If I define "bounced visits" as
    CASE WHEN "total pages" = 1 THEN 1 ELSE 0 END
    What I see in the session logs is:
    CASE WHEN sum("total pages") = 1 THEN 1 ELSE 0 END
    The aggregation of the "total pages" is being done first and then the derived metric is being calculated. This leads to incorrect results. Is there any way of solving this in the business model? I know that I can go back to the ETL, calculate a "bounced visit" metric, store it in the fact, build out aggregates, etc. I was looking for a short-term solution.
    Other things I've tried:
    1) Make a copy of the "total pages" column and turn off the aggregation, calling it "total pages - no aggregation"
    this leads to queries of the form:
    select distinct T22583.TOTAL_PAGES as c1
    from
    WEB_SESSIONS_A1 T22583
    order by c1
    2) Create a logical column based on "total pages - no aggregation"
    bounced visit = CASE WHEN EnterpriseWarehouse."Web Sessions"."Total Pages - no aggregation" = 1 THEN 1 ELSE 0 END
    This leads to [nQSError: 14020] None of the fact tables are compatible with the query request Web Sessions.bounced visit.

    Cool. I now have two approaches to solve the problem. Thanks for your help. Using your technique, the new logical column shows up in the Logical Table Source screen and I can define an expression for that logical column. The second approach leaves the table type as a Physical table. This has some benefits as I've noticed that the queries that are generated when the table type is defined as a Select statement end up retrieving all of the columns, even though I only needed to act on one of them.
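    For reference, the Select-based physical table mentioned above could be as simple as the following sketch (SESSION_ID is assumed to be the session key of WEB_SESSIONS_A1). The bounce flag is computed per row before any aggregation, so a Sum rule on the new measure then counts bounced visits correctly:
        SELECT SESSION_ID,
               TOTAL_PAGES,
               CASE WHEN TOTAL_PAGES = 1 THEN 1 ELSE 0 END AS BOUNCED_VISIT
        FROM WEB_SESSIONS_A1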

  • Derived logical column using Row values in OBIEE11g

    Hi Experts
    I have a requirement whereby I need to create, say, 100 new logical columns in OBIEE using the row values of one dimension.
    For example, say I have dimension values like A and B in one of the tables; I need to create logical columns like C = A+B, D = A/B and E = A*B, so that when the user drags them into an analysis the calculation happens automatically.
    The problem is that they are rows and not columns, so I can't do a plain calculation like Sales per Unit = Sales/Units, where Sales and Units are two different measure columns.
    How can I achieve this in the RPD?
    Regds

    Hi
    Please find an example below:
    You have a dimension called KPI which has some KPIs.
    KPI           Measure Values
    KPI1          10
    KPI2          20
    KPI3          50           
    etc
    Now I need to calculate a derived KPI, i.e. KPI10 = KPI2/KPI3, and I need to do that in the RPD.
    I know I can do it in the front-end analytics using pivots or selection steps, but I want to do it in the RPD.
    Regds
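    One common way to model this in the RPD (a sketch, not a confirmed answer from this thread; the business model, table and column names are assumptions) is to derive the new measure from filtered copies of the base measure, using a logical column based on existing logical columns:
        KPI10 = FILTER("Sales"."Fact KPI"."Measure Value" USING "Sales"."KPI"."KPI Name" = 'KPI2')
                / FILTER("Sales"."Fact KPI"."Measure Value" USING "Sales"."KPI"."KPI Name" = 'KPI3')
    Each FILTER term aggregates independently before the division, which is what makes the row-to-column calculation work.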

  • "On time pick rate" logical column defined as INT instead of DOUBLE

    Hi,
    I noticed that there is a problem with the indicator "Top 10 Plants by Pick Rate". The calculation of this formula results in a decimal number indicating the number of picks made on time in relation to the total number of picks. However, in the "Business Model and Mapping" layer in the Oracle BI Administration Tool, the logical column "On time pick rate" (Core.Fact - Sales - Pick Line) is defined as INT (integer). This truncates the value, and the indicator always displays 0 (zero).
    Do you think this is a bug, or is it the expected behavior?
    Regards
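    As a rough sketch of the usual workaround (not verified against this particular formula; the operand column names below are assumptions), changing the logical column's data type to DOUBLE, or casting the operands before the division, avoids the integer truncation:
        CASE WHEN "Core"."Fact - Sales - Pick Line"."Total Picks" <> 0
             THEN CAST("Core"."Fact - Sales - Pick Line"."On Time Picks" AS DOUBLE PRECISION)
                  / CAST("Core"."Fact - Sales - Pick Line"."Total Picks" AS DOUBLE PRECISION)
        END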

    I have the same problem. I'm using VC++ 6.0 and ADO 2.71 to perform queries with an Oracle Client 9.2.0.1. The same query on the same table but on a different server version returns different results. With the production server, which is an 8.1.7.4 server, a NUMBER column returns a LONG value, while on the test server, which is a 9.2.0.5.0 server, the same column returns a DOUBLE value. The problem is that using ADO I don't check data type/size when getting values, but I expect a LONG value because I created the column with the NUMBER specification. Could this be a different default from the previous server version when creating a NUMBER column without any SCALE and PRECISION attributes?

  • At max how many logical columns can be created in RPD

    Hi All,
    At most, how many logical columns can be created in the RPD? I have a requirement to create 200 columns. Will there be any problem?
    Is there any predefined limit on the number of columns in the RPD?
    Please help.

    Hi Annapurna,
    There's no limit that I'm aware of or that is mentioned anywhere. Just as an example: I have a logical fact table with around 750 logical columns (>500 original measures & 250 derived measures). No issue whatsoever. Opening the presentation table through Answers takes about 2-3 seconds (the NQSQL command has a lot to retrieve), but that's about it.
    Cheers,
    C.

  • Display values for logical columns with several physical sources

    Hi all,
    I'm encountering some strange behaviour with the values displayed for a column (when we want to add it as a filter and, in the dialogue box, select 'Show All' values).
    Basically, the logical column is mapped against several physical columns, as the base fact table is aggregated to different levels. Additionally, one column in the fact tables has an attribute value, and these vary between the aggregated and non-aggregated tables. This is not a problem.
    In the production environment, when we display all the values in the filter prompt for this column, the values are taken from one fact table. Naturally this means that not all the possible values for this column are shown (i.e. values from the aggregated fact tables are missing). Now, in one of our test environments where those fact tables have additional data loaded, the values are taken from one of the other fact tables. Unfortunately it is not necessarily the fact table with less data.
    My questions are:
    a) What dictates which fact table the server will use when the query in Answers doesn't use any dimensions (i.e. we are selecting just this attribute and listing all the possible values by which it can be filtered)?
    b) Is there a way of obtaining all the values from the different physical columns, i.e. so that the displayed values show the values of that logical column across all the physical fact tables? Maybe we would need to model that attribute as a logical dimension just for that attribute? I'm not really sure this would work, as at the physical level the join between the dimension and fact would still have to go to only one particular fact table.
    Any info or help is much appreciated.

    Hi,
    Aggregate tables exist at the physical level and are created by ETL procedures. Although I am aware that the dimensions come into play when the application needs to decide which source aggregate table to use, that is not the issue here, as our query does not involve dimensions. We select just this one column, then the filter option on that column, and then in the dialogue box we select 'All Choices'; only the values from one of the source tables are shown.
    This is not a problem within reports; I think it is a product limitation, in that the option to list all values a user can filter by cannot display all the values from the different fact tables to which that column is mapped.
    Has anyone else encountered this behaviour?

  • Changes to Derived Logical Column not reflected in Answers

    Before I go off my rocker and stab myself with my IPhone lightsabre... :-)
    I created a new logical column for an existing table in my BM, deriving from an existing column (a datetime field called ESTIMATEDCLOSE) as follows:
    MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE) || ' - ' || CAST(Year(Sales.OPPORTUNITY.ESTIMATEDCLOSE) AS CHAR(4))
    ...which should give me something like "Jan - 2008". The problem is that it returned something like "2008/01/31 00:00:00" in Answers - completely unexpected.
    I decided to start out small and try something simpler, changing my derived column to be only Sales.OPPORTUNITY.ESTIMATEDCLOSE. This returned "2008/01/31 00:00:00" in Answers - sort of expected.
    Then I changed it to MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE). This also returned "2008/01/31 00:00:00" in Answers!!!
    It almost seems as if a derived logical column is cached in some weird way and changes to the formula are not reflected in Answers. Thinking that something so simple must work and that I had made a simple mistake somewhere, I have spent a good part of the past day playing around with this and trying every possible solution I can think of - no luck. The odd thing is that if I rename my logical column, that is reflected in Answers, but changes to the derived expression are not.
    - I deleted the column and started all over - no luck
    - tried creating derived columns using different datetime expressions or even just concatenating strings to each other - no luck.
    - Restarted all the Oracle services, thinking it's a caching issue - no luck.
    - I rebooted the server - no luck.
    Once, when I started out with only MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE) I got back correct values in Answers (Jan, Feb, Mar..) and I thought the problem was finally gone. So I added || ' - ' || CAST(Year(Sales.OPPORTUNITY.ESTIMATEDCLOSE) AS CHAR(4)) to the derived formula. Guess what, Answers kept on giving me back only the Month names, completely ignoring my updated formula.
    Anybody else ran into this problem? Something so simple should surely work...

    John,
    Yes, cleared the cache (from Dashboard Administration) and also used nqcmd to call SAPurgeAllCache(). Restarted the BI Server. Still no luck.
    I tried your formula and then only got back "2008/01/31" (just the date without the timestamp) - very strange.
    Then I noticed the following amazing thing (by accident):
    I started with a request in Answers that does not contain my derived column and then add my derived column to the report. The text would be all wrong in this column (Jan, Feb, March). Then I add the same column again to the report and the second added column will magically show the correct strings (Jan - 2008) !!! Even stranger is that the moment I added the same column again to the report, the text on the first column would also magically jump to the correct values (Jan - 2008).
    I did a bit more testing and created a blank request and added my derived column as the only column - this time it will show the correct values immediately. It does seem like it's tied to the content already in the request. I went a step further and saved my buggy request and added it to a Dashboard - it worked perfectly in the Dashboard.
    Must be something buggy here in Answers when working with derived columns. I'm on 10.1.3.4...

  • Logical Column based on expression leads to nQSError: 14020

    Hello everybody,
    I've got two dimensions D01, D02 and a fact table F.
    In Answers I have previously created analyses containing D01.A and D02.A without problems.
    Now, in the RPD file, I added another logical column to dimension D01, say D01.B, which is based on an expression of the type "SOME_FUNCTIONS(D01.A)".
    If I drag D01.B and D02.A into an analysis I get the error message: nQSError: 14020 "None of the fact tables are compatible with the query request..."
    What is the problem here? I don't quite understand what is going wrong, as I only applied some functions to a column that is working...
    Thanks for any help.
    Best regards
    Matt

    Exactly. As I wrote, B contains some functions (mainly SUBSTR and CAST) that depend on other columns from the same dimension (all the other columns are physical).
    Funny thing is, when I drag D01.B into the analysis alone, it all works fine... Adding a measure also leads to the said error.

  • Create two logical columns with same LTS mapping but diff filter conditions

    Hi,
    Problem:
    How to create two logical columns within same logical table mapped to same physical column but different filter conditions?
    I have a scenario where in,
    Physical layer columns
         - table1.employee
         - table1.emp_city
    I need a columns in logical layer:
    Logical layer - lt1.count_emp_delhi (counts distinct employees whose emp_city = 'Delhi')
                  - lt1.count_emp_mumbai (counts distinct employees whose emp_city = 'Mumbai')
    My approach:
    For Delhi column
    1. Create a logical column lt1.count_emp_delhi mapped to the physical column table1.employee
    2. Set the aggregation to Count Distinct on the Aggregation tab.
    3. Edit the mapping condition
         3.1. Use the where clause and set table1.emp_city='Delhi'.
    For Mumbai column.
    I followed the same approach as above, but when I change the condition in 3.1 to 'Mumbai', even the Delhi column is populated with the Mumbai count, which is ERRONEOUS.
    Could some one please help?

    Hi,
    1. Create two alias tables for table1 in the Physical Layer. Let's say TB_Mumbai and TB_Delhi.
    2. Create a logical table in BMM layer (D1 Employee Cities )
    3. Drag and drop the employee & emp_city columns from both alias tables (TB_Mumbai and TB_Delhi ) into your newly created logical table.
    4. Now you can see two Logical Table Sources (TB_Mumbai and TB_Delhi )
    5. Now, using the Where condition of each logical table source, write the filter for each alias (see the sketch after the note below).
    NOTE: Don't write any condition on the Physical table Table1.
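    For illustration (the emp_city column name is carried over from table1 and inherited by the aliases), the Where clause of each logical table source would look like:
        TB_Delhi LTS:  "TB_Delhi"."EMP_CITY" = 'Delhi'
        TB_Mumbai LTS: "TB_Mumbai"."EMP_CITY" = 'Mumbai'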
    Hope it helps you.
    Regards,
    Kalyan Chukkapalli
    http://123obi.com

  • Logical Column Aggregation

    I have CurrentDollars and CurrentUnits in my measures table, and I've created PriorDollars and PriorUnits logical columns using the AGO function. Since 'Use existing logical columns as the source' is checked, I don't have access to the Aggregation tab, and everything I've read leads me to believe that this is OBIEE's way of forcing inheritance of the aggregation rule of the 'source' column.
    Problem is, the Prior* columns don't aggregate (sum). If I create a report with Year, CurrentDollars, CurrentUnits, I get three rows... one row for each year that has data. That's what I want. But when I add either of the Prior columns, I get 52 rows per year (since my data is at the week level). If I wrap either of the Prior columns in SUM, it sums all Prior* column values. If I check the box that says I want it to use Sum aggregation, nothing changes. I get similar bogus results if I add a SUM function around the original AGO function.
    This is a real drag. Am I missing something obvious? Some blogs lead me to think that this is a known bug. Does anyone have a workaround?
    Edited by: Tom G. on Aug 26, 2009 11:30 AM

    Hi,
    What is the source for AGO measures PriorDollars and PriorUnits?
    AGO(CurrentUnits, '?',?)
    AGO(CurrentDollars, '?',?)
    Have you set up the hierarchy correctly (you said that the granularity of your fact table is at the week grain)? It should be like this:
    Fiscal Year > Fiscal Quarter > Fiscal Month > Fiscal Week
    It depends on how you defined your TIME dimension.
    Try maybe like this:
    PriorYEARDollars = AGO(CurrentDollars, 'YEAR', 1)
    and the same for
    PriorYEARUnits = AGO(CurrentUnits, 'YEAR', 1)
    Now, if you put in the report
    YEAR
    CurrentUnits
    CurrentDollars
    PriorYEARUnits
    PriorYEARDollars
    you'll get only three rows (for 3 years) with PRIOR YEAR measures.
    Try it.
    A good reference for understanding AGO:
    http://obiee101.blogspot.com/2008/11/obiee-ago-and-todate-series.html
    Also, you need to set the chronological key correctly for the TIME dimension.
    Regards
    Goran
    http://108obiee.blogspot.com

  • OBIEE 11.1.1.6.9 Logical Column Derived from existing columns

    Hello,
    I've just installed the latest version 11.1.1.6.9 in my test environment and I already have a problem with the RPD.
    I have some logical columns derived from existing columns using an expression (so far, so good).
    With the Administration Tool (latest version 11.1.1.6.9), when I launch the Consistency Check Manager I get the following error (I use the SampleAppLite RPD in order to reproduce the problem):
    ERRORS:
    SampleApp Lite :
    [nQSError: 46008] Internal error: File server\objectmodel\Src\SOSecureRpGateway.cpp, line 479.
    [nQSError: 23013] An error occurred when extracting the metadata definition for the Attribute '"SampleApp Lite"."D3 Orders (Facts Attributes)"."test"'.
    The logical column "test" on "D3 Orders (Facts Attributes)" is:
    CASE WHEN "SampleApp Lite"."D3 Orders (Facts Attributes)"."Order Date" > "SampleApp Lite"."D0 Time"."Calendar Date" THEN 1 ELSE 0 END
    The problem appears only when I use multiple source tables in the column expression.
    Does anyone have the same problem? I think it's easy to reproduce (I searched Oracle Support but didn't find anything yet).
    Benjamin

    I know that there is no relationship between "SampleApp Lite"."D3 Orders (Facts Attributes)"."Order Date" and "SampleApp Lite"."D0 Time"."Calendar Date"; it's the same in my own RPD.
    But as it's working with the 11.1.1.6.2 BP1 version I don't understand why it's not working with 11.1.1.6.9.
    The implicit fact column is not set in my repository, but I don't have any request with only dimensional columns, so if my understanding is correct I don't need it. Also, the problem appears during the repository consistency check, not in Answers.
    thanks anyway

  • Create a logical column with more than one data source

    I'm having a problem creating a logical column with more than one data source in Siebel 7.8.
    What I want to do is the union of 2 physical tables in one logical table.
    For example, I have a "local_clients" table and an "abroad_clients" table. What I want is a logical table "clients" with the client data from the two tables.
    What I've tried is dragging the data sources I need onto the logical column.
    However, this isn't working because it only retrieves the data from the first data source.

    Hi!
    I think it is not possible to do this just by dragging the columns to the logical table. A logical table can have more than one source, but I think each column must have just one direct source column.
    I'm not sure, but maybe you should do a UNION in SQL to get the data from the two tables. In the physical layer, when you create a new physical table, it's possible to set the "Table Type" to "Select". I didn't try that, but it seems possible to have the union table in the physical layer.
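    Something along these lines for the Select-based physical table (a sketch; the column list is an assumption and must match in both tables):
        SELECT CLIENT_ID, CLIENT_NAME FROM LOCAL_CLIENTS
        UNION ALL
        SELECT CLIENT_ID, CLIENT_NAME FROM ABROAD_CLIENTS
    UNION ALL keeps duplicate rows; use plain UNION if the same client can appear in both tables and should only be returned once.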
    Bye.
    Message was edited by:
    user578388

  • Logical column - Fact measure using Dimension value

    Hello all, I have a fact table that has a metric (Value) where I have set the aggregation rule to Sum. Now I'd like to create a metric based on the value of a Product dimension that joins to this fact table. So I create a logical column with the syntax: CASE WHEN Product_Desc = 'A' THEN Value ELSE 0 END.
    The issue is that in Answers, when I bring this new metric in, it doesn't return the correct value. Are there steps I am missing when creating this metric?
    My assumption is that if I bring the new metric in by itself, it would return one record: a sum of Value where Product = 'A'.
    Thank you
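    One common way to model this (a sketch, not a confirmed answer from this thread; the folder and column names are assumptions) is to base the conditional measure on the existing logical columns using FILTER, so that the condition is applied per row before the Sum aggregation:
        FILTER("Sales"."Fact"."Value" USING "Sales"."Product"."Product_Desc" = 'A')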

    Hi BRizzle,
    In your scenario, you create a calculated measure using the employee dimension, and then this measure shows "#VALUE" in the pivot table, right? It seems that this is a limitation of calculated measures in SQL Server Analysis Services. Calculated measures cannot be secured using dimension security in a straightforward manner; in fact they won't be listed at all in the Dimension tab of the role where we define the dimension security. When such measures are browsed in client tools like Excel, the value that is displayed is an error value like #VALUE. For detailed information, please see:
    Limitations / Disadvantages of using Calculated Measures / Calculated Members in SSAS
    Regards,
    Charlie Liao
    TechNet Community Support
