Cube Totals Presentation

I have written this query, which gives me output based on year, season, country and model series. I am grouping with CUBE, but I am not able to get the grouping I need.
I need totals at these levels:
1: year, season, country total
2: year, country total
3: year, season, series total
4: year, season total
5: year total
Any help would be highly appreciated.
   WITH SERIES
          AS (  SELECT   MAX(TO_NUMBER (WEEK)) WEEK,
                         YEAR YEAR,
                         SEASON SEASON,
                         MODEL_SERIES MODEL_SERIES,
                         COUNTRY COUNTRY,
                         MAX (FAILURES) FAILURES,
                         MAX (SELL_IN_NUMBER) SELL_IN_NUMBER,
                         MAX (SELL_THRU_NUMBER) SELL_THRU_NUMBER,
                         MAX (temp_af_summary) ANNUALIZED_FAILURE,
                         MAX (AFR_SELL_IN) AFR_SELL_IN,
                         MAX (AFR_SELL_THRU) AFR_SELL_THRU
                  FROM   QAR_PLAN_SERIES_QTY
              GROUP BY   YEAR,
                         SEASON,
                         MODEL_SERIES,
                         COUNTRY
              ORDER BY   MODEL_SERIES),
       MDL
          AS (SELECT   *
                FROM   (SELECT   DISTINCT MODEL_SERIES FROM   QAR_PLAN_SERIES_QTY),
                       (SELECT   DISTINCT COUNTRY FROM QAR_PLAN_SERIES_QTY)),
    TREND_MV AS
        (SELECT
            MAX(TO_NUMBER (WEEK)) WEEK,
            DECODE(GROUPING(SERIES.YEAR), 0, SERIES.YEAR, 1, NULL) YEAR,
            DECODE(GROUPING(SERIES.SEASON), 0, SERIES.SEASON, 1, 'WW') SEASON,
            DECODE(GROUPING(SERIES.MODEL_SERIES), 0, SERIES.MODEL_SERIES, 1, NULL) SERIES,
            DECODE(GROUPING(SERIES.COUNTRY), 0, SERIES.COUNTRY, 1, 'COUNTRY') COUNTRY,
            SUM (FAILURES) FAILURES,
             SUM (SELL_IN_NUMBER) SELL_IN_NUMBER,
             SUM (SELL_THRU_NUMBER) SELL_THRU_NUMBER,
             SUM (ANNUALIZED_FAILURE) ANNUALIZED_FAILURE,
             SUM (ANNUALIZED_FAILURE)/SUM (SELL_IN_NUMBER) AFR_SELL_IN,
             SUM (ANNUALIZED_FAILURE)/SUM (SELL_THRU_NUMBER) AFR_SELL_THRU
        FROM
            SERIES,MDL
        WHERE   SERIES.MODEL_SERIES(+) = MDL.MODEL_SERIES
              AND SERIES.COUNTRY(+) = MDL.COUNTRY           
        GROUP BY
            CUBE(SERIES.YEAR, SERIES.SEASON, SERIES.COUNTRY, SERIES.MODEL_SERIES))
     SELECT   TREND_MV.WEEK,
              TREND_MV.YEAR,
              TREND_MV.SEASON,
              TREND_MV.SERIES,
              TREND_MV.COUNTRY,
              TREND_MV.FAILURES,
              TREND_MV.SELL_IN_NUMBER,
              TREND_MV.SELL_THRU_NUMBER,
              TREND_MV.ANNUALIZED_FAILURE,
              TREND_MV.AFR_SELL_IN,
              TREND_MV.AFR_SELL_THRU
        FROM   TREND_MV;

Edited by: BluShadow on 10-Jan-2012 14:52
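Since only five specific totals are needed, an alternative to filtering the output of a full CUBE is GROUPING SETS, which produces exactly the listed combinations. This is only a sketch against the base table from the query above (the MAX/SUM measure handling is simplified, and column names are assumed from the posted query):

```sql
-- Sketch: GROUPING SETS yields only the five requested rollups,
-- so no GROUPING()-based post-filtering is needed.
SELECT   year,
         season,
         country,
         model_series,
         SUM(failures)       AS failures,
         SUM(sell_in_number) AS sell_in_number
FROM     qar_plan_series_qty
GROUP BY GROUPING SETS (
         (year, season, country),        -- 1: year, season, country total
         (year, country),                -- 2: year, country total
         (year, season, model_series),   -- 3: year, season, series total
         (year, season),                 -- 4: year, season total
         (year)                          -- 5: year total
);
```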

Hi,
I'm not sure I fully understand the requirements.
Maybe something like this?
with t as (
-- start of sample data
select '1999' year, 'Autumn' season, 'France' country, 'Pro' series, 10 amt from dual
union all select '1999' , 'Autumn' , 'France' , 'Pro' , 15 from dual
union all select '1999' , 'Summer' , 'France' , 'Pro' , 30 from dual
union all select '2000' , 'Winter' , 'France' , 'Am' , 70 from dual
union all select '2000' , 'Winter' , 'Norway' , 'Am' , 30 from dual
union all select '2000' , 'Spring' , 'Norway' , 'Pro' , 90 from dual
-- end of sample data
)
select year, season, country, series, total
from (
  select year, season, country, series, sum(amt) total,
  case
    when (grouping(year)=0 and grouping(season)=0 and grouping(country)=0 and grouping(series)=1) then 1
    when (grouping(year)=0 and grouping(season)=1 and grouping(country)=0 and grouping(series)=1) then 2
    when (grouping(year)=0 and grouping(season)=0 and grouping(country)=1 and grouping(series)=0) then 3
    when (grouping(year)=0 and grouping(season)=0 and grouping(country)=1 and grouping(series)=1) then 4
    when (grouping(year)=0 and grouping(season)=1 and grouping(country)=1 and grouping(series)=1) then 5
    else 0
  end gp
  from t
  group by cube(year, season, country, series)
)
where gp > 0
order by gp, year, season, country, series;
YEAR SEASON COUNTR SER      TOTAL
1999 Autumn France             25
1999 Summer France             30
2000 Spring Norway             90
2000 Winter France             70
2000 Winter Norway             30
1999        France             55
2000        France             70
2000        Norway            120
1999 Autumn        Pro         25
1999 Summer        Pro         30
2000 Spring        Pro         90
2000 Winter        Am         100
1999 Autumn                    25
1999 Summer                    30
2000 Spring                    90
2000 Winter                   100
1999                           55
2000                          190
18 rows selected.
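An equivalent way to pick those rows out of the CUBE is GROUPING_ID, which packs the four GROUPING() flags into a single bitmask (here year = 8, season = 4, country = 2, series = 1), so each wanted level is one number instead of four conditions. A sketch on the same sample data CTE as above:

```sql
-- Sketch: the five wanted levels correspond to these GROUPING_ID values:
--   (year, season, country) -> series aggregated              -> 1
--   (year, season, series)  -> country aggregated             -> 2
--   (year, season)          -> country + series aggregated    -> 3
--   (year, country)         -> season + series aggregated     -> 5
--   (year)                  -> season + country + series agg. -> 7
select year, season, country, series, sum(amt) total,
       grouping_id(year, season, country, series) gid
from t
group by cube(year, season, country, series)
having grouping_id(year, season, country, series) in (1, 2, 3, 5, 7)
order by gid, year, season, country, series;
```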

Similar Messages

  • 0vtype =20 (plan data)  not populating to cube but present in psa

    Hello Experts,
    I have a requirement in BW 3.5 where I need to create a report on a MultiProvider that should give the actual data, the plan data, and the difference between actual and plan for some key figures.
    The data flow is like this:
    There are two cubes, one for actuals and one for plan. I am able to get the actual data (value type 10) into the actual cube from R/3.
    But the problem is with the plan cube: I am not able to get the plan data (value type 20) into the plan cube (field 0vtype), even though the data reaches the PSA from R/3. There is no routine in the update rule either, yet no data shows up for 0vtype in the plan cube.
    I have tried loading selectively by setting 020 as a constant in the update rule, but still no luck.
    The DataSource fetching the plan data is 1_CO_PA_PLN1000.
    Could you please suggest how to proceed?
    Thanks,

    Hi Sashi,
    Check the mapping between 0vtype and the source fields.
    First, check in the transfer rules whether it is mapped or not.
    If the transfer rules are not mapped to the source fields, correct that.
    Hope this helps.
    Regards,
    Reddy

  • SSAS Cube Total Amount showing correctly but monthly split up is wrong

    Dear All,
    I have created two dimensions, "Detail Fields" and "Time Period", from a named query. Report Amount is the measure group.
    Our report is generated by dropping the Prime Supplier in the row fields and the Report Year in the column fields, with Report Amount in the data region. Please see the output below.
                                Report Year
    Prime Supplier    2011        2012        2013        2014        Grand Total
    (Report Amount in each cell)
    Test01            $1500.00    $2500.00    $1520.50    $2540.00    $10,000.00
    Test02            $2980.00    $1240.00    $5210.50    $5410.00    $15,000.00
    The issue here is that the Grand Total for every supplier is correct (i.e. $10,000 and $15,000), but the split per year is incorrect (i.e. 1500, 2500, 1520, 2540 for supplier Test01). I don't understand why it shows the wrong split. Maybe it is the use of a complex named query, or a relationship that has to be created when using a named query, or some other setting.
    Also, when the record count is small we don't face any issue; the amount splits perfectly for every year. But in production we have millions of records, and there the amount splits wrongly, possibly because of the complex named query. If needed, I can paste the query used.

    Okay guys, let me share what went wrong. The problem was the cube-dimension relationship. After fixing it, the values show correctly.

  • How do you know how long the total presentation is?

    I'm new to making Keynote presentations, and mine can only be so long. How can I tell its total length?

    Did you record it to set the timing? If you did, the easiest way is to go to the Finder, right-click (Control-click) the file icon, and choose Show Package Contents (see below if you don't have this option). Then look for the file titled narrationTrack0.m4a and open it in QuickTime; it will show you how long your Keynote will be.
    If you have not recorded the slideshow and are just manually advancing the slides as you talk over them, go to Play > Rehearse Slideshow; it has a timer, so you can see how long it takes you to click through.
    If you don't have the option to Show Package Contents, follow these steps:
    Go to the Finder, find your Keynote file, and make sure it has the .key extension. If it doesn't, click it, press Command-I to view its info, and uncheck the box that says Hide Extension.
    Delete the .key, change it to .zip, and click Use .zip in the dialog box that comes up.
    Double-click the new .zip file and it will change into a folder.
    Open the folder and you will see all the files that make up your Keynote.
    When you have done what you need with the files, close the folder, add .key back to the end of the name, and click Add in the dialog box (this changes it back to a Keynote file).

  • G4 Cube total sound loss

    Hi Folks,
    I was sat quite happily working away, listening to BBC Radio 2's great Steve Wright in the Afternoon over the Internet via a Widget on my Cube, when all of a sudden, the sound just stopped, along with a 'disconnection'-type noise.
    Having had the situation many a time that the Internet connection had been lost, I went to the Widget to reconnect to the signal, only to see that the signal was still streaming into the Widget. At the time, I was clearing out some duplicates in iTunes (a job that seemingly is endless), and I then further noticed that System sounds were not coming through to the speakers either.
    Finally, I checked by playing from iTunes itself, and there's very definitely no sound coming from my Mac.
    The speakers are Harman Kardon SoundSticks II, which are connected to a Griffin switch that takes the line input to the speakers and converts it to USB so that I can plug them into the Cube. They've got power and have worked happily for around eighteen months. If I unplug the line cable from the Griffin switch, I get a sound from the speakers, so they're definitely okay.
    So far, I've repaired permissions, verified my hard drive, rebooted, logged out & in, checked System Preferences & Apple System Profiler, but still no sound emanates from my Mac.
    Can anyone help, please?

    Hi,
    Thanks for the response and for the article link.
    I can't mark this as answered or resolved, but the Soundsticks are pumping sound out as I type. I'm just not sure I could repeat the process, nor understand which bit did what exactly. Talk about some odd behaviours...
    Your suggestion is sound: isolate the Griffin iMic. I plugged in my earbuds - no sound.
    Then, I thought, get out the old Cube speakers: I unplugged the iMic & plugged in the old faithfuls. On my desktop came an OS warning script, of the type seen in OS installations (black square with white text in four languages, explaining the only thing this here CPU can do is switch off, thank you very much).
    After the restart, Cube speakers working: so the sound card etc are all good. Next, I hot swapped the Cube speakers for the Soundsticks - no output, no connection noises, no power hum.
    So I plugged the Soundsticks into my iPod via line - no sound. Then I plugged the Soundsticks into my PowerBook via line - no sound. So I tried the iMic with the PowerBook & the Soundsticks - no sound.
    Even though the Soundsticks power light was on, I checked its transformer for warmth, checked all connections, unplugged power & plugged back in (a la 'Have you tried turning it off and on again?'). I hot swapped the iMic one more time into the Cube, and noticed again no connection noises or power hum, but after unplugging and plugging the line input to the iMic a couple of times, noticed the return of the power whine & connection noise. As I hovered around the Cube USB port, I got similar connection noises...
    Unfortunately, the hot swap again caused a problem, and the Cube shut down. After reboot, I went through your instructions again for the cuda reset, as I'm sure this Mac is showing the right kinds of distress that reset solves. I haven't done it yet though - can't bring myself to crack the case straight away now that the Soundsticks are working again.

  • Error while transporting the cube

    Hi all,
    While I was transporting my cube from the Development box to the Acceptance box, the following error occurred.
       Execution of programs after import (XPRA)
       Transport request   : GCDK905220
       System              : NCA
       tp path             : /usr/sap/NCA/DVEBMGS01/exe/tp
       Version and release: 372.04.10 700
       Post-import methods for change/transport request: GCDK905220
          on the application server: bdhp4684
       Post-import method RS_AFTER_IMPORT started for CUBE L, date and time: 20080802045222
       Program terminated (job: RDDEXECL, no.: 04522200)
          See job log
       Execution of programs after import (XPRA)
       End date and time : 20080802045237
       Ended with return code:  ===> 12 <===
    The transport actually failed, but the cube was present in the Acceptance box in an inactive state.
    Please help me with this.
    With thanks and regards
    Mukul Singhal

    Hi Mukul,
    This is an RC=12 issue. Contact the Basis team, or raise a ticket stating that your transport request was terminated. Once the Basis team resolves the issue, you can import the transport into your Acceptance system.
    I hope it will help you.
    Regards
    Suresh B.G.

  • End Routine and test on a field not present in target system

    Hello,
    I have an issue testing a field that is present in the ODS source system, using a transformation.
    More exactly, I cannot test the field I need in my end routine because it is not in the RESULT_PACKAGE structure (because it is not present in the target cube). In BW 3.x I solved this problem with an enhancement in the communication structure of my ODS and a start routine.
    Any idea for a Transformation?
    Many thanks in advance for your kind support.
    Regards,
         Giovanni

    Hello and thanks for your reply.
    I have two conditions to implement:
    1) Fill some fields of the cube that are not present in my ODS.
    2) A condition on one field of my ODS (a field not present in the cube) for filling some fields of the cube.
    In the tys_SC_1 structure of the start routine I find all the fields of my ODS, including the field I need to check. But at one point in my process I need to check that field and then set a field in my cube (a field not present in my ODS), which is not in the tys_SC_1 structure of the start routine.
    I hope to have described in a better way may issue.
    Any idea?
    Thanks.
          Giovanni

  • Key figure present in Infocube but not in Transformation

    Hi All,
    I have added two new quantity key figures to my cube, but I am not able to find them in the transformation from the underlying DSO to the said cube.
    Although I can see the key figures if I search for them in the cube, they are not present in the transformation.
    NOTE: I have 252 characteristics and key figures in my cube at present (this number I got from the serial number field in the transformation).

    Hi,
    Check the consistency of the cube and activate it again; the related transformations should become inactive.
    Add the same 2 KFs to the underlying DSO as well and activate it.
    Then the 2 KFs should appear in the transformations; map them and activate.
    Regards
    ReddY A

  • Authorization issue to view cube contents

    Hi Gurus,
    I am getting an authorization issue when trying to view cube contents on the Production server. When I execute the cube, it shows me the following message:
    "You do not have sufficient authorization for the infoprovider ZMMG_C05".
    Please provide me a possible solution for this.
    Thanks,
    Jackie.

    Hi,
    Two things to be checked with respect to authorization for this one.
    1) Functional Roles: Check whether Info cube is present in the functional roles that are assigned to you.
                                  If not you need to get the functional role in which the Infocube is assigned.
    2) Data Access Roles: Check in the data access roles assigned to you, whether you have the access
                                      to the selection that you are using to see the data in the info cube. Else, request
                                      BASIS team to assign the appropriate data access roles to you.
    Hope this helps.
    Regards,
    Bharat

  • Error in Update rules while defining CKF at cube level

    Hi,
    I want to create a calculated key figure at cube level using the formula
    <b>Unit Cube = Item Cube / Units</b>
    <b>Definition</b>
    Item Cube - total cube for the line
    Units - actual destination quantity in the alternative unit
    <b>Units</b>
    The unit for 'Item Cube' is a volume unit, and
    the unit for 'Units' is the alternative unit of measure for the stock-keeping unit.
    When I tried to create the CKF at cube level in the update rules using a formula, it displayed an error that no unit is available in the source system for the CKF Unit Cube.
    How can I proceed with this?
    Thanks

    Hi
    Have you transferred the global settings from the context menu of the source system, so that all the units from the source are replicated to the BW system?
    Regards
    N Ganesh

  • Reduce snapshots in the cube

    Hi gurus,
    I need to run a monthly job that deletes the snapshot dates of a cube that are older than 90 days, excluding weekends. The selection condition for the cube is the present day as the snapshot date.
    Is any function module available to delete the requests for the particular snapshot dates?
    Can any one help me on this issue.
    Quick response is appreciated.
    Thanks,
    Subhakar.

    Have you thought about running an InfoPackage with a selection date from the snapshot cube back into itself, multiplying by -1?
    Then, with compression and zero elimination, it does the same as a physical delete, but it is auditable.

  • OBIEE 11g - How to display Grand Total in graphs

    Hi,
    I am using OBIEE 11.1.1.6. In a report I have a requirement to display the grand total (present in the table) in the graph as well. Kindly help.
    Regards,
    Niv

    Hi Niv,
    One quick question here: how are you creating the Grand Total?
    I have created it by row-summing a grand total and inserting that value with the Group By function.
    Attached the screen shot for your reference.
    GT - Download - 4shared
    Mark if helps
    Thanks,

  • Number of records in Report and Cube don't match

    Hello,
    I have a small query. In the cube, through the 'Manage' context menu option, on the 'Requests' tab I find the number of records loaded = 23767.
    In the report on this cube I see the number of records = 23369.
    I have not put any restriction or condition in the report. It is a simple report created to see all the records loaded.
    Could you please help me understand the reason for the difference?
    Thanks for you help.
    Regards,
    Sumita

    To compare the records in the cube against the BW query,
    try displaying records for a small range (monthly), by cost center, or by a similar key field, and then compare the cube totals with the report totals.
    Hope this helps
    Praveen

  • How do I clear out the measure data in a 10g cube?

    Does anyone know of an OLAP feature that I could call to zero out the measures stored in the cube before I refresh from the source data?
    I'm new to Oracle OLAP, but I have some experience with Essbase and I have quite a bit of experience with Oracle PL/SQL. The database is 10g.
    Here's my problem. My dimensions and cube are mapped to Oracle tables. I've built an Oracle procedure, using DBMS_LOB / xml_clob functionality, to refresh my cube. The procedure worked successfully for the initial load, and it works when dollar amounts change in the source data. But it's possible for a record in my source data to be removed entirely. When that happens, the dollars from the removed record still show up in the cube total after the procedure executes. I had assumed the CleanMeasures="true" parameter would take care of this, but it does not.
    thanks,
    Nancy
    Here's the parameters I'm passing to BuildDatabase in my Oracle procedure:
    ' <BuildDatabase Id="Action2" AWName="BUDGET.BUDGETS" BuildType="EXECUTE"
    RunSolve="true" CleanMeasures="true" CleanAttrs="true" CleanDim="true"
    TrackStatus="false" MaxJobQueues="0">';

    If you are doing a full reload every time, you can issue the following commands to clear data from the cube:
    lmt name to all
    allstat
    clear all from <cubename>prttopvar
    You can wrap the above commands in a PL/SQL procedure using the dbms_aw.execute package and run it before the cube load starts. Instead of clearing the whole cube, you can also clear just one partition. Take a look at the CLEAR command in the OLAP DML 10.2 reference.
    Thanks,
    Brijesh
    Edited by: Brijesh Gaur on Aug 10, 2010 6:47 AM
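    Those OLAP DML commands can be wrapped in PL/SQL roughly like this. This is a hedged sketch only: the analytic workspace name (BUDGET.BUDGETS, from the BuildDatabase parameters above) and the measure variable name (budgets_prttopvar) are assumptions to be replaced with your own.

    ```sql
    -- Sketch (assumed names: AW is BUDGET.BUDGETS, and the cube's
    -- measure variable is budgets_prttopvar -- adjust to your own).
    BEGIN
      dbms_aw.execute('aw attach BUDGET.BUDGETS rw');
      dbms_aw.execute('lmt name to all');
      dbms_aw.execute('allstat');
      dbms_aw.execute('clear all from budgets_prttopvar');
      dbms_aw.execute('update');
      dbms_aw.execute('commit');
      dbms_aw.execute('aw detach BUDGET.BUDGETS');
    END;
    /
    ```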

  • UCCX presented, Abandoned & handled calls

    Hi All,
    I need to submit exact counts of UCCX presented, abandoned and handled calls to my management.
    But in UCCX Historical Reporting I find different totals for presented, abandoned and handled calls over the same duration.
    Kindly suggest which report I should refer to for the exact count for a particular CSQ, so that the total UCCX presented, abandoned and handled calls match the total calls handled by agents.
    Thanks,
    Deeps

    Sorry, but that would be impossible, given that scripting changes reporting metrics.
    Take for example a script where you queue a caller in two queues simultaneously, and the caller is then handled by one of them. Both queues will mark the call as presented, only one as handled, and the other as dequeued. Is that two calls or one? You wouldn't know based on counts alone.
    You have to understand your scripts in order to understand your reporting. Unless you are willing to share your scripts and call flows with all of us, the best we can do is guess.
    Sorry for the bad news, but that's how it goes.
    Anthony Holloway
    Please use the star ratings to help drive great content to the top of searches.
