Logical column to sum the number of occurrences

Hi,
In Discoverer I have a column "Number of Employees" defined with an expression like this: DECODE(Polisaid, Polisaid, 1). I tried the equivalent in the OBI repository: CASE WHEN "Number of Employees" IS NOT NULL THEN 1 END. But for the query "How many employees are in the HR department?" OBI gives me only the number 1 (I cannot use an aggregate function), while for the same query Discoverer gives the correct answer.
How to solve the problem?
Thanks in advance.

Hi stanisa,
DECODE(Polisaid,Polisaid,1) - here DECODE and CASE work the same way: the expression returns 1 for each row. In the repository try:
CASE WHEN "Number of Employees" IS NOT NULL THEN "Number of Employees" ELSE 1 END;
Hope it helps you.
By,
KK
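
For context, a plain-SQL sketch of what both tools are effectively computing (the EMPLOYEES table and the DEPARTMENT and POLISAID columns are hypothetical stand-ins for the real source):

-- Count occurrences per department by summing a per-row flag of 1
SELECT department,
       SUM(CASE WHEN polisaid IS NOT NULL THEN 1 ELSE 0 END) AS number_of_employees
FROM   employees
GROUP  BY department;

In the OBI repository the same effect usually comes from keeping the CASE expression and giving the logical column an aggregation rule of SUM instead of leaving it unaggregated.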

Similar Messages

  • Logical Columns - Running Sum & 3-Month Rollover

    Hi All,
    Need to build a logical column.
    I have a column with number of units (count distinct) in RPD.
    I need to build a new logical column in the repository that holds running-sum values, so that when I pick this #units column and the months column in Answers, each month rolls up all previous values (for the first month, everything before it).
    Say Jan ---> XXXX units ( summation of all previous available units - few years)
    Feb----> YYYY units ( sum of till jan values & Feb units)
    Mar----> ZZZZ ...etc ( sum of till Feb values & Mar units) so on.
    Based on this newly built column I then need to build another "3 month rollover" column.
    Replies appreciated.
    Thanks in advance.

    Hi user11939829m
    So help me understand your new measures a little better. For the sake of this post, let's say your data is like so
    Month Year -- Units
    Jan 2010 -- 1
    Feb 2010 -- 2
    Mar 2010 -- 3
    Apr 2010 -- 4
    May 2010 -- 5
    Jun 2010 -- 6
    Then let's say you have a report with the above columns and the new running sum columns.
    Month Year -- Units -- Running Sum Units
    Jan 2010 -- 1 -- 1
    Feb 2010 -- 2 -- 3
    Mar 2010 -- 3 -- 6
    Apr 2010 -- 4 -- 10
    May 2010 -- 5 -- 15
    Jun 2010 -- 6 -- 21
    Now what exactly would your three month rollover be? Would the 3 month rolling sum = running sum for the current month + running sum for last month + running sum for the month before that?
    i.e.
    Month Year -- Units -- Running Sum Units -- 3 Month Rolling Sum
    Jan 2010 -- 1 -- 1 -- 1
    Feb 2010 -- 2 -- 3 -- 4
    Mar 2010 -- 3 -- 6 -- 10
    Apr 2010 -- 4 -- 10 -- 19
    May 2010 -- 5 -- 15 -- 31
    Jun 2010 -- 6 -- 21 -- 46
    Not sure what value such a measure would add. Or do you mean the 3 month rolling sum would be the sum of just the last three months (like below)? This makes more sense, but in your description you indicated that you'd build the 3 month rolling sum off of the running sum, which confused me a bit.
    Month Year -- Units -- Running Sum Units -- 3 Month Rolling Sum
    Jan 2010 -- 1 -- 1 -- 1
    Feb 2010 -- 2 -- 3 -- 3
    Mar 2010 -- 3 -- 6 -- 6
    Apr 2010 -- 4 -- 10 -- 9
    May 2010 -- 5 -- 15 -- 12
    Jun 2010 -- 6 -- 21 -- 15
    Is that what you are going for? Please elaborate.
    Best regards,
    -Joe
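
    For reference, the two measures discussed above can be sketched in plain SQL with analytic functions (the MONTHLY_UNITS table and its columns are hypothetical):

    -- Running sum and 3 month rolling sum over monthly unit counts
    SELECT month_start,
           units,
           SUM(units) OVER (ORDER BY month_start
                            ROWS UNBOUNDED PRECEDING)                 AS running_sum_units,
           SUM(units) OVER (ORDER BY month_start
                            ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS rolling_3_month_units
    FROM   monthly_units
    ORDER  BY month_start;

    The second column matches Joe's second table (1, 3, 6, 9, 12, 15). In the repository itself, the running sum is more commonly built with TODATE against the time hierarchy, or with RSUM at the Answers level.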

  • How to create a logical column for last year to date

    Hi All,
    I have to calculate last year to date in a logical column.
    For example: I have a Time dimension, a fact column and dimension columns, covering Jan 2011 to Oct 2012.
    Please Let me know.
    Thanks,
    Abhi

    Looks like you are not reading my messages.
    You are getting this error because you are building it on logical columns.
    Try this using physical columns instead, then set COUNT on the Aggregation tab:
    CASE WHEN "Oracle Data Warehouse"."Catalog"."dbo"."Fact_W_SRVREQ_F_Open_Date"."OPEN_DT_WID" > 20110101 THEN "Oracle Data Warehouse"."Catalog"."dbo"."Fact_W_SRVREQ_F_Open_Date"."SR_WID" END
    You can do the same with your variable.
    Or you can do the same by duplicating the existing "Fact - CRM - Service Request"."# of SRs" and just adding the code above.
    Or just use the expression below - don't wrap it in SUM, since aggregation has already happened for "# of SRs":
    CASE WHEN "Core"."Dim - Date"."Year" = VALUEOF("Warehouse Refresh Date Last Year"."CURRENT_CALENDAR_YEAR_LAST_YEAR") AND "Core"."Dim - Date"."Date" < VALUEOF("Warehouse Refresh Date Last Year"."LAST_REFRESH_DATE_LAST_YEAR") THEN "Core"."Fact - CRM - Service Request"."# of SRs" END
    Hope this works; mark as correct if it does.
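
    For comparison, a plain-SQL sketch of a "last year to date" filter, assuming a hypothetical FACT_SR table with an OPEN_DATE column and using SYSDATE rather than a warehouse refresh variable as the reference date:

    -- Rows from 1 January of last year up to the same calendar day last year
    SELECT COUNT(*) AS sr_count_last_year_to_date
    FROM   fact_sr
    WHERE  open_date >= TRUNC(ADD_MONTHS(SYSDATE, -12), 'YYYY')   -- 01-Jan of last year
    AND    open_date <  ADD_MONTHS(TRUNC(SYSDATE) + 1, -12);      -- through this day last year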

  • Logical Column for calculation

    Hi guru,
    My requirement is to display a report with the total count of opportunities, the count of new opportunities (created within 30 days from today), the count of old opportunities (created more than 30 days ago), and the count of overdue opportunities (whose close date is greater than the current date).
    For this requirement I created one logical column, and I get an error whenever I use TIMESTAMPDIFF.
    Could you please let me know how I can create these 3 logical columns in the metadata?
    I really appreciate your help.

    Here is what you need to do:
    1. Create a Session Variable in your RPD
    2. Check the "Enable any user to set the value" checkbox in the Session Variable properties
    3. In your dashboard prompt, select "Request Variable" in the Set Variable section
    4. Enter the name of your Session Variable in the box that appears
    5. Now you can reference the session variable in the formula of your logical column. Syntax: VALUEOF(NQ_SESSION.YOUR_SESSION_VARIABLE)
    Note: You cannot reference Presentation Variables in the RPD
    -Dave
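
    For the three buckets in the original requirement, a sketch of the logical column expressions (the "CRM"."Opportunity" columns are hypothetical names; each column would typically get a COUNT aggregation rule in the RPD):

    -- New opportunities: created within the last 30 days
    CASE WHEN TIMESTAMPDIFF(SQL_TSI_DAY, "CRM"."Opportunity"."Created Date", CURRENT_DATE) <= 30
         THEN "CRM"."Opportunity"."Opportunity Id" END

    -- Old opportunities: created more than 30 days ago
    CASE WHEN TIMESTAMPDIFF(SQL_TSI_DAY, "CRM"."Opportunity"."Created Date", CURRENT_DATE) > 30
         THEN "CRM"."Opportunity"."Opportunity Id" END

    -- Overdue opportunities: close date past the current date
    CASE WHEN "CRM"."Opportunity"."Close Date" > CURRENT_DATE
         THEN "CRM"."Opportunity"."Opportunity Id" END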

  • How to create logical columns for current period and prior period

    Hello all.
    Is there any way in OBIEE to create new logical columns in the BMM layer that say "Current Period" and "Prior Period", or one single column that says "Period"?
    What I need in those columns is: the current period column should have 03/01/2012 - 03/31/2012 (this month's date range),
    and the prior period column should have 02/01/2012 - 02/29/2012 (the previous month's date range). I will be using these columns in my reports.
    Please let me know whether we can create such columns with these conditions/requirements.

    Hi, I have already created the dynamic variables, but I can't work out how to use those variables to create the new logical columns in the BMM layer.
    This is what I am trying:
    case when VALUEOF("Current Month begin date"."Current Month begin date") = '..' and VALUEOF("Current Month end date"."Current Month end date") = '..' then 'current period' end
    I don't really understand what I should write in place of VALUEOF("Current Month begin date"."Current Month begin date") = '..'.
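
    One common pattern is to compare the date column itself against the variables rather than comparing a variable to a literal. A sketch, assuming a hypothetical "Time"."Date" logical column and "Prior Month" variables defined alongside the existing "Current Month" ones (adjust the VALUEOF references to however your repository variables are actually named):

    CASE
      WHEN "Time"."Date" BETWEEN VALUEOF("Current Month begin date") AND VALUEOF("Current Month end date")
        THEN 'Current Period'
      WHEN "Time"."Date" BETWEEN VALUEOF("Prior Month begin date") AND VALUEOF("Prior Month end date")
        THEN 'Prior Period'
    END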

  • Poor Response Time- Total on a column for n number of rows

    I have a table with column cost on my custom OAF page.
    When I query for a configuration it returns many rows, so I have set the default rows for the table to 10, and I have a Next button to page through the other rows.
    I have enabled totalling on the cost column.
    The Total row appears showing the sum of the costs only for that page (only 10 rows). When I click Next from the drop-down, I see the total of the costs for the second page.
    Ex:
    table has 17 rows and
    page 1 :
    total row at the end saying 1000.00
    page 2 :
    total = 1500.00
    I want to display Total Cost by summing up the costs for, say, 300 items returned by the query across all the pages - in the above case 2500.00.
    I thought of a way to do it:
    I added a sumVO with the query "select sum(...) from table".
    Added a new region to my page, added a messageStyleText based on the sumVO, and pulled the total cost in.
    It shows me the right result, but my problem is performance.
    It is getting very slow. I am using the same query as for displaying the results in table, but summing on cost column.
    Can I avoid writing the sum query and do it programmatically in OAF ??
    Thanks in advance.

    Even if you use a programmatic approach, what do you think the program will do?
    It has to fetch all the rows into the middle tier and sum them in a for loop - that is not going to solve your problem.
    First find out the reason for the slow performance using the Trace option, and fix the query.
    If you are not able to fix it, try a materialized view for the summation query.
    To take a SQL trace for an OAF page, refer to this link: http://prasanna-adf.blogspot.com/2009/01/sql-trace.html
    --Prasanna
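
    A minimal sketch of the materialized-view approach, assuming a hypothetical COST_ITEMS table with a COST column (the OAF sumVO would then simply select from the materialized view):

    -- Materialized view log first, so the aggregate MV can be fast-refreshed on commit
    CREATE MATERIALIZED VIEW LOG ON cost_items
      WITH SEQUENCE, ROWID (cost) INCLUDING NEW VALUES;

    -- Pre-aggregate the total once, instead of re-summing on every page load
    CREATE MATERIALIZED VIEW total_cost_mv
      BUILD IMMEDIATE
      REFRESH FAST ON COMMIT
    AS
    SELECT SUM(cost)   AS total_cost,
           COUNT(cost) AS cnt_cost,   -- COUNT columns are required for fast refresh of SUM
           COUNT(*)    AS cnt_all
    FROM   cost_items;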

  • After Waves install, Logic asking for serial number unless xsKey plugged in

    I installed Logic Studio and have been using it without dongle. After installing Waves Gold bundle and going through super paranoid copy protection maneuver, Logic now needs the xsKey dongle to run. Otherwise it asks for new serial number, then the original SN, which is of course the xsKey you must insert to proceed.
    Does anyone know what needs to be done to correct this, short of dumping the Waves stuff and starting all over again?
    Help!

    It's unlikely to be Waves related I think. I'm running iLok Waves plugins here in LP8 without my XSkey connected. Waves has nothing to do with the XSkey, it either uses hard disk authorisation (old plugs) or iLoks (current plugs).
    It sounds to me more like the Logic installation is screwy, if it's forgotten it's authorised. Try running some disk utils and verifying permissions, and if no joy, rerunning the Logic install and reauthorise.

  • Logic asks for serial number every time I start it...

    Hi,
    I've just reinstalled Logic Studio and it asks me for a serial number every time I start up Logic Pro if I don't have the old XS key inserted.
    It's the upgrade version, and in the past I've just needed to insert the XS Key at installation, not every time I use it.
    I'm using Logic Pro (latest update) with Lion, which Apple says is supported.
    Can anyone help?

    I had this issue when I upgraded. I resolved it by inserting the key before I started the upgrade, leaving it in, and then validating with the serial. After that it never asked for the key again. If this does not work, you might want to contact support; they can probably provide a new serial key.

  • Since downloading the new iTunes, my column for the number of plays no longer works

    Hi
    Apart from the fact that I don't like the new set-up, I also find that the column which keeps count of the number of times a music track is played no longer responds. I tried to re-download my old version of iTunes but received a message saying the files for my music could not be found, so I had to go back to the new version. Can anyone help?
    PS. I'm quite dismayed reading some of the comments, especially that those who do finally get through on the phone find that the people who are supposed to help have not received adequate training.

    Try a force restart by holding the power and home button down at the same time. One of two things will PROBABLY happen; 1) It will boot into iOS as normal. 2) You will get an iTunes logo with a USB cable at the bottom. If the second happens you will need to plug it into iTunes to do a restore. If neither of these happen, force restart again. Once the phone powers off, let go of both buttons and push and hold the home button and plug the phone into iTunes. This will force the phone into recovery mode. Proceed with a restore at that point.

  • Rows to column for huge number of records

    My database version is 10gR2.
    I want to transpose rows to columns. I have seen examples for a small number of records, but how can it be done if there are more than 1000 records in a table?
    Here is the sample data that I would like to change to columns:
    SQL> /
    NE              RAISED                         CLEARED                        RTTS_NO        RING                                                                              
    10100000-1LU    22-FEB-2011 22:01:04/28-FEB-20 22-FEB-2011 22:12:27/28-FEB-20                SR-10/ ER-16/ CR-25/ CR-29/ CR-26/ RIDM-1/ NER5/ CR-31/ RiC600-1                  
                    11 01:25:22/                   11 02:40:06/
    10100000-2LU    01-FEB-2011 12:15:58/06-FEB-20 05-FEB-2011 10:05:48/06-FEB-20                RIMESH/ RiC342-1/ 101/10R#10/ RiC558-1/ RiC608-1                                  
                    11 07:00:53/18-FEB-2011 22:04: 11 10:49:18/18-FEB-2011 22:15:
                    56/19-FEB-2011 10:36:12/19-FEB 17/19-FEB-2011 10:41:35/19-FEB
                    -2011 11:03:13/19-FEB-2011 11: -2011 11:08:18/19-FEB-2011 11:
                    16:14/28-FEB-2011 01:25:22/    21:35/28-FEB-2011 02:40:13/
    10100000-3LU    19-FEB-2011 20:18:31/22-FEB-20 19-FEB-2011 20:19:32/22-FEB-20                INR-1/ ISR-1                                                                      
                    11 21:37:32/22-FEB-2011 22:01: 11 21:48:06/22-FEB-2011 22:12:
                    35/22-FEB-2011 22:20:03/28-FEB 05/22-FEB-2011 22:25:14/28-FEB
                    -2011 01:25:23/                -2011 02:40:20/
    10100000/10MU   06-FEB-2011 07:00:23/19-FEB-20 06-FEB-2011 10:47:13/19-FEB-20                101/IR#10                                                                         
                    11 11:01:50/19-FEB-2011 11:17: 11 11:07:33/19-FEB-2011 11:21:
                    58/28-FEB-2011 02:39:11/01-FEB 30/28-FEB-2011 04:10:56/05-FEB
                    -2011 12:16:21/18-FEB-2011 22: -2011 10:06:10/18-FEB-2011 22:
                    03:27/                         13:50/
    10100000/11MU   01-FEB-2011 08:48:45/22-FEB-20 02-FEB-2011 13:15:17/22-FEB-20 1456129/       101IR11 RIMESH                                                                    
                    11 21:59:28/22-FEB-2011 22:21: 11 22:08:49/22-FEB-2011 22:24:
                    52/01-FEB-2011 08:35:46/       27/01-FEB-2011 08:38:42/
    10100000/12MU   22-FEB-2011 21:35:34/22-FEB-20 22-FEB-2011 21:45:00/22-FEB-20                101IR12 KuSMW4-1                                                                  
                    11 22:00:04/22-FEB-2011 22:21: 11 22:08:21/22-FEB-2011 22:22:
                    23/28-FEB-2011 02:39:53/       26/28-FEB-2011 02:41:07/
    10100000/13MU   22-FEB-2011 21:35:54/22-FEB-20 22-FEB-2011 21:42:58/22-FEB-20                LD MESH                                                                           
                    11 22:21:55/22-FEB-2011 22:00: 11 22:24:52/22-FEB-2011 22:10:

    Could you do something like this?
    with t as (select '10100000-1LU' NE,   '22-FEB-2011 22:01:04/28-FEB-2011 01:25:22/' raised ,  '22-FEB-2011 22:12:27/28-FEB-2011 02:40:06/' cleared from dual union
                  select '10100000-2LU', '01-FEB-2011 12:15:58/06-FEB-2011 07:00:53/18-FEB-2011 22:04:56/19-FEB-2011 10:36:12/19-FEB-2011 11:03:13/19-FEB-2011 11:16:14/28-FEB-2011 01:25:22/',
                  '05-FEB-2011 10:05:48/06-FEB-2011 10:49:18/18-FEB-2011 22:15:17/19-FEB-2011 10:41:35/19-FEB-2011 11:08:18/19-FEB-2011 11:21:35/28-FEB-2011 02:40:13/' from dual)
    select * from(
    select NE,   regexp_substr( raised,'[^/]+',1,1) raised, regexp_substr( cleared,'[^/]+',1,1) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,2) , regexp_substr( cleared,'[^/]+',1,2) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,3) , regexp_substr( cleared,'[^/]+',1,3) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,4) , regexp_substr( cleared,'[^/]+',1,4) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,5) , regexp_substr( cleared,'[^/]+',1,5) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,6) , regexp_substr( cleared,'[^/]+',1,6) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,7) , regexp_substr( cleared,'[^/]+',1,7) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,8) , regexp_substr( cleared,'[^/]+',1,8) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,9) , regexp_substr( cleared,'[^/]+',1,9) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,10) , regexp_substr( cleared,'[^/]+',1,10) cleared  from t
    union
    select NE,   regexp_substr( raised,'[^/]+',1,11) , regexp_substr( cleared,'[^/]+',1,11) cleared  from t)
    where nvl(raised,cleared) is not null
    order by ne
    NE     RAISED     CLEARED
    10100000-1LU     28-FEB-2011 01:25:22     28-FEB-2011 02:40:06
    10100000-1LU     22-FEB-2011 22:01:04     22-FEB-2011 22:12:27
    10100000-2LU     28-FEB-2011 01:25:22     28-FEB-2011 02:40:13
    10100000-2LU     19-FEB-2011 10:36:12     19-FEB-2011 10:41:35
    10100000-2LU     19-FEB-2011 11:03:13     19-FEB-2011 11:08:18
    10100000-2LU     19-FEB-2011 11:16:14     19-FEB-2011 11:21:35
    10100000-2LU     06-FEB-2011 07:00:53     06-FEB-2011 10:49:18
    10100000-2LU     01-FEB-2011 12:15:58     05-FEB-2011 10:05:48
    10100000-2LU     18-FEB-2011 22:04:56     18-FEB-2011 22:15:17
    You should be able to do it without all those unions using a connect by, but I can't quite get it to work.
    the following doesn't work but maybe someone can answer.
    select NE,   regexp_substr( raised,'[^/]+',1,level) raised, regexp_substr( cleared,'[^/]+',1,level) cleared from t
    connect by  prior  NE = NE and   regexp_substr( raised,'[^/]+',1,level) = prior regexp_substr( raised,'[^/]+',1,level + 1)
    Edited by: pollywog on Mar 29, 2011 9:38 AM
    Here it is with the MODEL clause, which gets rid of all the unions.
    WITH t
            AS (SELECT '10100000-1LU' NE,
                       '22-FEB-2011 22:01:04/28-FEB-2011 01:25:22/' raised,
                       '22-FEB-2011 22:12:27/28-FEB-2011 02:40:06/' cleared
                  FROM DUAL
                UNION
                SELECT '10100000-2LU',
                       '01-FEB-2011 12:15:58/06-FEB-2011 07:00:53/18-FEB-2011 22:04:56/19-FEB-2011 10:36:12/19-FEB-2011 11:03:13/19-FEB-2011 11:16:14/28-FEB-2011 01:25:22/',
                       '05-FEB-2011 10:05:48/06-FEB-2011 10:49:18/18-FEB-2011 22:15:17/19-FEB-2011 10:41:35/19-FEB-2011 11:08:18/19-FEB-2011 11:21:35/28-FEB-2011 02:40:13/'
                  FROM DUAL)
      SELECT *
        FROM (SELECT NE, raised, cleared
                FROM t
              MODEL RETURN UPDATED ROWS
                 PARTITION BY (NE)
                 DIMENSION BY (0 d)
                 MEASURES (raised, cleared)
                 RULES
                    ITERATE (1000) UNTIL raised[ITERATION_NUMBER] IS NULL
                    (raised [ITERATION_NUMBER + 1] =
                          REGEXP_SUBSTR (raised[0],
                                         '[^/]+',
                                         1,
                                         ITERATION_NUMBER + 1),
                    cleared [ITERATION_NUMBER + 1] =
                          REGEXP_SUBSTR (cleared[0],
                                         '[^/]+',
                                         1,
                                         ITERATION_NUMBER + 1)))
       WHERE raised IS NOT NULL
    ORDER BY NE
    Edited by: pollywog on Mar 29, 2011 10:34 AM
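
    A working variant of the connect by approach attempted above is sketched below. It assumes each list ends with a trailing '/', so the number of tokens equals the number of '/' characters, and it uses the PRIOR SYS_GUID() trick so that each row is split independently:

    SELECT ne,
           REGEXP_SUBSTR(raised,  '[^/]+', 1, LEVEL) AS raised,
           REGEXP_SUBSTR(cleared, '[^/]+', 1, LEVEL) AS cleared
    FROM   t
    CONNECT BY LEVEL <= LENGTH(raised) - LENGTH(REPLACE(raised, '/'))  -- count of '/' = token count
           AND PRIOR ne = ne
           AND PRIOR SYS_GUID() IS NOT NULL   -- keeps the split local to each row
    ORDER  BY ne, LEVEL;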

  • Logical Column(s) a.k.a. Pre-calculated Measures

    I am looking for best practices around logical columns, either in the Presentation layer or the Business Model layer. Specifically I want to know:
    1) Is it advisable to have logical columns?
    2) How many are good to have? Should one create logical columns for all frequent calculations which are done on dashboards?
    3) Are there any performance implications?
    4) Can we use Time Series function with logical columns? Like AGO etc.
    In short I am looking for intelligent pros and cons of such implementations.
    PS: Logical columns are derived from physical columns, e.g. Profit = Income - Expense.
    Regards.

    Hi
    1) All complex logic should be in the BMM layer in the RPD. Yes, it's advisable to have them in the BMM, although another good practice is to delegate all complex calculations to ETL if possible.
    2) As many as deemed necessary. Yes.
    3) No, not really. But if you use Answers' logical columns - you have to be creative in propagating them to other reports.
    4) Yes you can. But again, this is something that's better to have in the RPD.
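
    As an illustration (the names here are hypothetical, not from the thread), a derived logical column and a time-series logical column in the BMM might look like:

    -- Derived measure, computed from two existing logical measures
    "Finance"."Fact - GL"."Income" - "Finance"."Fact - GL"."Expense"

    -- Time-series measure using AGO: the same measure one year back,
    -- which requires a time dimension with a chronological key
    AGO("Finance"."Fact - GL"."Profit", "Finance"."Time Dim"."Year", 1)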

  • Changes to Derived Logical Column not reflected in Answers

    Before I go off my rocker and stab myself with my IPhone lightsabre... :-)
    I created a new logical column for an existing table in my BM, deriving from an existing column (a datetime field called ESTIMATEDCLOSE) as follows:
    MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE) || ' - ' || CAST(Year(Sales.OPPORTUNITY.ESTIMATEDCLOSE) AS CHAR(4))
    ...which should give me something like "Jan - 2008". The problem is that it returned something like "2008/01/31 00:00:00" in Answers - completely unexpected.
    I decided to start out small and try something simpler, changing my derived column to be only Sales.OPPORTUNITY.ESTIMATEDCLOSE. This returned "2008/01/31 00:00:00" in Answers - sort of expected.
    Then I changed it to MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE). This also returned "2008/01/31 00:00:00" in Answers!
    It almost seems as if a derived logical column is cached in some weird way and changes to the formula are not reflected in Answers. Thinking that something so simple must work and that I had made a simple mistake somewhere, I have spent a good part of the past day playing around with this and trying every possible solution I can think of - no luck. The odd thing is that renaming my logical column is reflected in Answers, but changes to the derived expression are not.
    - I deleted the column and started all over - no luck
    - tried creating derived columns using different datetime expressions or even just concatenating strings to each other - no luck.
    - Restarted all the Oracle services, thinking it's a caching issue - no luck.
    - I rebooted the server - no luck.
    Once, when I started out with only MONTHNAME(Sales.OPPORTUNITY.ESTIMATEDCLOSE) I got back correct values in Answers (Jan, Feb, Mar..) and I thought the problem was finally gone. So I added || ' - ' || CAST(Year(Sales.OPPORTUNITY.ESTIMATEDCLOSE) AS CHAR(4)) to the derived formula. Guess what, Answers kept on giving me back only the Month names, completely ignoring my updated formula.
    Anybody else ran into this problem? Something so simple should surely work...

    John,
    Yes, cleared the cache (from Dashboard Administration) and also used nqcmd Call SAPurgeAllCache. Restarted BI Server. Still no luck.
    I tried your formula and then only got back "2008/01/31" (just the date without the timestamp) - very strange.
    Then I noticed the following amazing thing (by accident):
    I started with a request in Answers that does not contain my derived column and then add my derived column to the report. The text would be all wrong in this column (Jan, Feb, March). Then I add the same column again to the report and the second added column will magically show the correct strings (Jan - 2008) !!! Even stranger is that the moment I added the same column again to the report, the text on the first column would also magically jump to the correct values (Jan - 2008).
    I did a bit more testing and created a blank request and added my derived column as the only column - this time it will show the correct values immediately. It does seem like it's tied to the content already in the request. I went a step further and saved my buggy request and added it to a Dashboard - it worked perfectly in the Dashboard.
    Must be something buggy here in Answers when working with derived columns. I'm on 10.1.3.4...

  • YTD Actual and Budget Formula in Logical column

    Hello,
    I'm trying to replicate a report from Oracle EBS in OBIEE.
    I have to create logical columns for YTD Actual, YTD Budget, Current Month Actual and Current Month Budget using the GL_BALANCES table. The formulas I'm currently using are:
    YTD Actual:
    CASE WHEN GL_BALANCES.ACTUAL_FLAG = 'A' THEN (GL_BALANCES.PERIOD_NET_DR - GL_BALANCES.PERIOD_NET_CR)+ (GL_BALANCES.BEGIN_BALANCE_DR - GL_BALANCES.BEGIN_BALANCE_CR) ELSE 0 END
    YTD Budget :
    CASE WHEN GL_BALANCES.ACTUAL_FLAG = 'B' THEN (GL_BALANCES.PERIOD_NET_DR - GL_BALANCES.PERIOD_NET_CR) + (GL_BALANCES.BEGIN_BALANCE_DR - GL_BALANCES.BEGIN_BALANCE_CR) ELSE 0 END
    Current Month actual :
    TODATE(GL_BALANCES."YTD Actual", "Time Dim"."Month")
    Current Month Budget :
    TODATE(GL_BALANCES."YTD Budget", "Time Dim"."Month")
    Please let me know, if my calculations are right.
    Thanks

    Hi,
    I have only one KF, and after closing the month I would load actuals into the Budget KF itself.
    Right now this column is the formula below:
    ( 1 <= 'Last Actual' ) * 'Jan / Actual' + ( 1 > 'Last Actual' ) * 'Jan / &0T_VERS&'
    And the column heading is "January Act/Budget"
    At the moment it is confusing for users to know which figures are Actuals and which are Budget.
    Thanks
    Priya

  • Logical column not in hierarchy is shown as being drillable?

    I have a hierarchy created, let's say Customer. The logical column "customer zipcode" is not added to any of the levels. By default, this column is then associated with the detail (lowest) level. When I create a report in Answers with just this column, it shows up as drillable, and it drills to the lowest level it belongs to by default. So my question is: why is this logical column drillable if it is not explicitly associated with the hierarchy?

    Thanks for the reply. Later yesterday I figured out the problem. It took some unit testing and trial and error, but this is what I gathered during testing.
    1. Logical columns from the hierarchy's table source that are not explicitly added to a level in the hierarchy are by default put in the lowest level, and they are not drillable because of that.
    2. If you add a logical column which is derived from other logical columns through an expression (for example concatenating col_nm||'('||col_cd||')') to the lowest level, add it as a key and set the key to "use for drilling", then this too will make all logical columns drillable to that level. I don't know why, but it could be a bug.
    3. If you set a preferred drill path for the lowest level to another hierarchy level, then those logical columns will be drillable, probably because they are not at the lowest level any more since there is another level down due to the preferred drill path.
    So in the end, since the lowest-level drillable column has to be a logical column built through an expression, and since this lowest level needs to drill to another level in another hierarchy due to business requirements, I am stuck with all columns not in the hierarchy being drillable. The workaround is to go to all these columns that are now drillable, set the interaction to 'no interaction' and save it as the system-wide default.

  • Cross tab report-dyanamic columns for months and quarterly sum

    Hi all,
    I work on report creation in BI Publisher. I need to display values in a cross-tab report in a way that shows data for 3 months and then a column for the quarterly sum.
    For ex:-
    Market  -- Jan -- Feb -- Mar -- Q1_sum -- Apr -- May -- Jun -- Q2_sum   (and so on for n months)
    Market1 -- 100 -- 80  -- 30  -- 210    -- 10  -- 80  -- 90  -- 210
    Market2 -- 120 -- 90  -- 40  -- 250    -- 100 -- 70  -- 30  -- 200
    Market3 -- 130 -- 70  -- 60  -- 260    -- 140 -- 0   -- 40  -- 180
    ('--' is used here only as a column separator)
    The number of months (the date range) and the market names are derived dynamically.
    My SQL query currently returns the months range (Jan, Feb, Mar, ...), the quarterly sum to be displayed, the market names etc. as child elements of a repeating Main Query group:
    <Main Query>
      <Market>abc</Market>
      <Region>abc</Region>
      <Months-Range>abc</Months-Range>
      <Quarterly_sum>abc</Quarterly_sum>
    </Main Query>
    <Main Query>
      ...
    </Main Query>
    Please guide me on the RTF template code for the same.
    Thanks
    Edited by: user9061488 on Jul 13, 2010 1:32 AM
    Edited by: user9061488 on Jul 13, 2010 3:48 AM

    Hi,
    Do you have a time dimension in your metadata?
    If not, create a time dimension with year, quarter, month and day levels:
    http://lh4.ggpht.com/_rhCtHYLiamQ/S7PQvxYBbzI/AAAAAAABZXI/ef_Ur9AmyUo/s800/04_year_quarter_bmm.jpg
    After that, build the fact measures using the AGO/TODATE functions, pull the respective columns into the pivot table columns section (the date column) and the measure columns into the measures section, and enable the grand total in the columns section so that it shows a grand total for every quarter.
    thanks,
    saichand.v
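
    If the quarterly sum is to come from the data query itself rather than from pivot totals, a sketch with an analytic SUM (the SALES_FACT table and columns are hypothetical; SALE_MONTH is assumed to be a month-level date) would be:

    SELECT market,
           TO_CHAR(sale_month, 'Mon') AS month_name,
           SUM(amount)                AS month_amount,
           SUM(SUM(amount)) OVER (PARTITION BY market, TRUNC(sale_month, 'Q'))
                                      AS quarter_sum   -- repeated on each month of the quarter
    FROM   sales_fact
    GROUP  BY market, sale_month
    ORDER  BY market, sale_month;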
