Sum in Query

Hi All,
I have a query where I want to sum the line total and the freight break-up, if any.
I have the query that gives the break-up, but when I sum this column, it returns a NULL value.
The relevant expressions are below:
T6.Linetotal  and
STR((Select (PCH3.[LineTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) from PCH3  where PCH3.docentry=T7.Docentry and PCH3. ExpnsCode = '1' and (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum])<> '0.00'),19,2) as 'Freight'
I would like to get a sum of these columns. How can I achieve that?
Thanks,
Joseph

Hi Gordon,
The full query is below:
SELECT Distinct  T7.DOCNUM AS ' INVOICE No.',T7.DocDate as 'Invoice Date',T7.DocDueDate,T7.[DocCur],T5.DOCNUM as 'Excise Invoice' ,T5.[NumAtCard] as 'Excise Reference',T1.DOCNUM AS 'PO', T1.Docdate as 'PO Date',T7.CardCode as 'Vendor Code',  T7.CARDNAME as 'Vendor Name',T7.NUMATCARD as 'Vendor PO No.',
T7.SHIPTOCODE as 'Ship To Code',T6.ITEMCODE,T6.DSCRIPTION,T6.Width1, cast(T15.UserText as VarChar) as 'Item Remarks', T6.QUANTITY,T6.WhsCode as 'Wh Name',T6.UnitMsr as 'UoM',T6.PRICE as 'Basic Price',T6.LineTotal as 'Basic Amount',
T10.[TransCat] as 'Purchase Type',T6.AcctCode as 'GL Code',
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=-90 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )BED,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=-60 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )Cess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=-55 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )HeCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=21 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )ServiceFreight,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=5 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )ServiceTax,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=6 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )ServiceCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=7 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )ServiceHeCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=-40 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CVD,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=17 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CVDCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=18 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CVDHeCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=14 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )Custom,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=15 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CustomCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=16 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CustomHeCess,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=7 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )EVAT,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=8 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )EAVAT,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=9 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )ECST,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=10 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )EACST,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=-80 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )AED,
(Select PCH2.LineTotal from PCH2  where PCH2.docentry=T7.Docentry and PCH2.linenum = T6.Linenum and PCH2. ExpnsCode = '5' ) as 'AED Reversal',
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=1 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )VAT,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=4 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )CST,
(Select distinct isnull(Sum(PCH4.taxsum),0) from PCH4 where PCH4.statype=20 and PCH4.docentry=T7.Docentry and PCH4.linenum = T6.Linenum )TCS,
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) when 0 then 0 else (PCH3.[LineTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end from PCH3  where PCH3.docentry=T7.Docentry and PCH3. ExpnsCode = '1' )
,19,2) as 'Freight',
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) when 0 then 0 else (PCH3.[LineTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end from PCH3  where PCH3.docentry=T7.Docentry and PCH3. ExpnsCode = '2' )
,19,2) as 'Insurance',
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) when 0 then 0 else (PCH3.[LineTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end from PCH3  where PCH3.docentry=T7.Docentry and PCH3. ExpnsCode = '3' )
,19,2) as 'Packing & Forwarding',
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) when 0 then 0 else (PCH3.[LineTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end from PCH3  where PCH3.docentry=T7.Docentry and PCH3. ExpnsCode = '4' )
,19,2) as 'CHA',
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]+ T7.WTSum) when 0 then 0 else (T7.[WTSum]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end  )
,19,2) as 'TDS',
STR ((Select case (T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) when 0 then 0 else (T7.[DocTotal]*T6.[LineTotal])/(T7.[Doctotal]-T7.[TotalExpns]-T6.[VatSum]) end  )
,19,2) as 'Doc Total',
--T7.[WTSum] as 'TDS',T7.Doctotal as 'Document Total',
T8.[SlpName],(T9.[firstName]+' ' + T9.[lastName]) as 'Owner',
T7.Address As 'Bill To Address',T7.Address2 As 'Ship To Address',
(Select Distinct CRD1.State from CRD1  where CRD1.CardCode = T7.CardCode and CRD1.Address = T7.ShipToCode) as 'Vendor State',T3.DocNum As ' GRPO Ref #',
T3.DocDate as 'Date Of Receipt'
FROM         dbo.POR1 T0 INNER JOIN
                      dbo.OPOR T1 ON T1.DocEntry = T0.DocEntry  LEFT JOIN
                      dbo.PDN1 T2 ON T2.BaseEntry = T1.DocEntry LEFT JOIN
                      dbo.OPDN T3 ON T2.DocEntry = T3.DocEntry  LEFT JOIN
                      dbo.IEI1 T4 ON T4.BaseEntry = T2.DocEntry LEFT JOIN
                      dbo.OIEI T5 ON T5.DocEntry = T4.DocEntry  LEFT JOIN
                      dbo.PCH1 T6 ON T6.BaseEntry = T3.DocEntry LEFT JOIN
                      dbo.OPCH T7 ON T7.DocEntry = T6.DocEntry  LEFT JOIN
                      dbo.OITM T15 ON T15.ItemCode = T6.ItemCode  LEFT JOIN
                      dbo.OSLP T8 on T1.SLPCode = T8.SlpCode    LEFT JOIN
                      dbo.OHEM T9 on T1.OwnerCode = T9.empID    LEFT JOIN
                      dbo.PCH12 T10 on T7.Docentry = T10.DocEntry
WHERE (T4.BaseType = '20' or T6.Excisable ='N') and T7.Doctype = 'I'
Thanks,
Joseph
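One way to total these computed columns - a sketch only, not taken from the thread: keep the expense allocation numeric (drop the STR, which turns the value into a string), guard the division with NULLIF and the missing-row case with ISNULL, and SUM everything in an outer query over a derived table. The table and column names below come from the query above; the simplified join is only illustrative.
SELECT
    SUM(D.BasicAmount) AS TotalLineAmount,
    SUM(D.Freight)     AS TotalFreight
FROM
(
    SELECT
        T6.LineTotal AS BasicAmount,
        -- allocated freight per line; 0 when there is no ExpnsCode '1' row or the denominator is 0
        ISNULL((SELECT SUM(PCH3.LineTotal * T6.LineTotal /
                           NULLIF(T7.DocTotal - T7.TotalExpns - T6.VatSum, 0))
                  FROM PCH3
                 WHERE PCH3.DocEntry = T7.DocEntry
                   AND PCH3.ExpnsCode = '1'), 0) AS Freight
    FROM dbo.PCH1 T6
         INNER JOIN dbo.OPCH T7 ON T7.DocEntry = T6.DocEntry
    WHERE T7.DocType = 'I'
) D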

Similar Messages

  • Avoid #MULTIVALUE, Sum from Query webi 3.1

    All,
    I have a query that returns multiple rows in the following format
    Region   Sales
    US         800
    UK         500
    APAC     400
    I have no control over the query (it's MDX generated through a cube universe).
    My requirement is to present this as a single line item on the report after the header.
    Total Sales: 1700.
    Since I don't want to render a table and hide all three rows just to get the sum, I am wondering if there is a way to achieve this using a formula - maybe access the query results directly and apply the calculation on top.
    I have tried a few things, but I'm afraid my expertise is rather limited, hence no success.
    Any advice would be much appreciated.

    Bashir,
    I might be misunderstanding this, but this is my data provider (my query) result; consider it with one dimension and one measure:
    Region | Sales
    US     | 800
    UK     | 500
    APAC   | 400
    Now if I use a simple summation formula for that single column cell,
    it would be like
    =Sum([My Query].[Sales])
    This will work if I have a table and sum the total at the end of it.
    My aim is to get this output in a single cell, and this formula doesn't work there.
    I am not sure if I understand your recommendation correctly.

  • Sum sales query last 6 months

    Hi people......
    Using DocDate and SUM(Quantity), how can I sum the last 6 months from today?
    select itemcode, sum(quantity) FROM oinv, inv1, oitm, oitb ... WHERE getdate() ?????
    I hope I explained it correctly.

    Hi,
    Better use: DateDiff(DD,T0.DocDate,getdate()) <= 180
    Thanks,
    Gordon
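    A rough sketch of the full statement, assuming SAP Business One's OINV/INV1 invoice tables (the joins to OITM and OITB are left out here for brevity):
    SELECT T1.ItemCode, SUM(T1.Quantity) AS Qty
    FROM OINV T0
         INNER JOIN INV1 T1 ON T1.DocEntry = T0.DocEntry
    WHERE T0.DocDate >= DATEADD(MONTH, -6, GETDATE())   -- last 6 months; DateDiff(DD, T0.DocDate, GETDATE()) <= 180 works too
    GROUP BY T1.ItemCode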

  • Crosstab SUM behaviour query

    Hello,
    I have a crosstab report in Discoverer Plus which does not display the expected results.
    There are organisation areas down the side with their associated staff below (outline style). The data point is a calculation which shows their average monthly hours, so it's a numeric field with a multiply and division performed on it.
    When the report is expanded everything is fine. When it is collapsed, the summed amounts for the organisation levels with all the staff below them are incorrect. Instead of showing the summed amount of all the staff's average monthly hours, it shows a "sum distinct" amount.
    E.g. Expanded:
                   Average Hours
    Procurement      680
    A Smith         160
    B Jones         200
    C Walker        160
    D Brown         160
    Collapsed:
                         Average Hours
    Procurement          360
    In Discoverer Desktop the collapsed amount is correct.
    I have tried setting different aggregation behaviour for the worksheet in Disco Plus which did not make a difference. When I edit the calculation and try to re-add the numeric field, it does not give me the "detail" option when I expand it; there's only AVG, COUNT, MAX, MIN, SUM.
    Do you have any ideas what might be causing my problem?
    Thanks
    Hazel

    Hi,
    If this is working in Desktop but not in Plus, then I would try changing the aggregation settings in the pref.txt for Discoverer Plus. In particular, try changing AllowAggregationOverRepeatedValues to 1.
    Rod West

  • Cumulative sum's query

    Hello!
    I've got a problem with cumulative sums.
    I have a table like this one:
    ENDDATE CURRENCY RATE AMOUNT
    31.1.2006 USD 0.3 10
    28.2.2006 USD 0.3 20
    31.3.2006 USD 0.3 30
    30.4.2006 USD 0.2 40
    31.5.2006 USD 0.2 50
    30.6.2006 USD 0.2 60
    31.7.2006 USD 0.3 70
    31.8.2006 USD 0.3 80
    30.9.2006 USD 0.3 90
    31.10.2006 USD 0.3 100
    30.11.2006 USD 0.5 110
    31.12.2006 USD 0.5 120
    31.1.2006 EUR 0.4 130
    28.2.2006 EUR 0.4 140
    I want to get MAX(ENDDATE), CURRENCY, RATE and SUM(AMOUNT) grouped by CURRENCY and RATE, but here is the problem:
    for USD I have a period with rate 0.3, then a period with rate 0.2, and then again a period with rate 0.3. And I must calculate these two 0.3 groups separately.
    So I want to get this result:
    ENDDATE CURRENCY RATE TOGETHER
    31.3.2006 USD 0.3 60
    30.6.2006 USD 0.2 150
    31.10.2006 USD 0.3 340
    31.12.2006 USD 0.5 230
    28.2.2006 EUR 0.4 270
    Any idea how? I have tried different GROUP BYs and analytic functions, but with no result.
    Thanks in advance,
    Rok
    P.S.: Sorry for my poor English :)

    A try :
    SQL> select * from rate_tbl;
    ENDDATE  CUR       RATE     AMOUNT
    31/01/06 USD         ,3         10
    28/02/06 USD         ,3         20
    31/03/06 USD         ,3         30
    30/04/06 USD         ,2         40
    31/05/06 USD         ,2         50
    30/06/06 USD         ,2         60
    31/07/06 USD         ,3         70
    31/08/06 USD         ,3         80
    30/09/06 USD         ,3         90
    31/10/06 USD         ,3        100
    30/11/06 USD         ,5        110
    31/12/06 USD         ,5        120
    31/01/06 EUR         ,4        130
    28/02/06 EUR         ,4        140
    14 rows selected.
    SQL>
    SQL> select max(enddate) enddate,
      2         currency,
      3         rate,
      4         sum(amount) together
      5  from (select enddate,
      6               currency,
      7               rate,
      8               amount,
      9               row_number() over (order by currency,enddate)-row_number() over (partition by currency,rate order by currency, enddate) rk
    10        from rate_tbl)
    11  group by currency,rate,rk
    12  order by currency desc,enddate;
    ENDDATE  CUR       RATE   TOGETHER
    31/03/06 USD         ,3         60
    30/06/06 USD         ,2        150
    31/10/06 USD         ,3        340
    31/12/06 USD         ,5        230
    28/02/06 EUR         ,4        270
    SQL>
    Nicolas.

  • SUM(CASE) Problem

    Can't seem to get the ELSE element of this SUM(CASE) query to
    work - any suggestions gratefully received.
    SUM(CASE WHEN MonthID = 1 THEN CASE WHEN COS_State = 1 THEN
    COS_Value * (dbo.Value_Calc.Probability / 100) END ELSE COS_Value
    END)
    This doesn't bring back the correct COS_Value in the ELSE part
    - it brings back the total.
    If I try
    SUM(CASE WHEN MonthID = 1 THEN CASE WHEN COS_State = 1 THEN
    COS_Value * (dbo.Value_Calc.Probability / 100) END ELSE CASE WHEN
    MonthID = 1 THEN COS_Value END)
    this gives an incorrect syntax at )
    help please....

    Thanks
    Still not there - trying an AND condition, but it doesn't like the
    syntax at 'WHEN':
    SUM(CASE WHEN MonthID = 1 AND COS_State = 1 THEN COS_Value *
    (dbo.Value_Calc.Probability / 100) ELSE WHEN MonthID = 1 AND
    COS_State = 2 THEN COS_Value ELSE 0 END) AS N1,
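    For what it's worth, a sketch of a searched CASE that avoids the ELSE WHEN error - additional conditions are simply further WHEN branches before the single ELSE (the column names are taken from the post above):
    SUM(CASE
            WHEN MonthID = 1 AND COS_State = 1 THEN COS_Value * (dbo.Value_Calc.Probability / 100)
            WHEN MonthID = 1 AND COS_State = 2 THEN COS_Value
            ELSE 0
        END) AS N1,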

  • Link to an another report

    Hi,
    I have a report with a link to another detail report. Currently the links display on all rows, but I want to display the links only on rows having certain values.
    Thanks

    My question was how to display certain values of a column as a link instead of all the values, i.e. in the scott.dept example all the dept numbers (10, 20, ...) display as links. But I just want to display deptno 10 as a link and the rest as plain text, meaning the second report should come up only if the user clicks on 10; otherwise nothing.
    Anyway, I was looking for this code too. Thanks very much. That code is working fine, but there is another issue: if we are using a Report from SQL Query, the customization page doesn't have many options like the where clause, sum columns, query options etc. that QBE Reports have. Is there any way to include those too?
    Can you tell me where I can find documents on such code? I am new to Portal (and to Oracle). I have a book explaining the basics, but I do want to learn more than that.
    Thanks

  • Getting the desired Output

    Hi All,
    I have a table as Listed below.
    SQL> CREATE TABLE T1 (c1 varchar2(10), c2 varchar2(10), c3 varchar2(10), c4 varchar2(10), audit1 number);
    Table created.
    SQL> DESC T1
    Name                                      Null?    Type
    C1                                                 VARCHAR2(10)
    C2                                                 VARCHAR2(10)
    C3                                                 VARCHAR2(10)
    C4                                                 VARCHAR2(10)
    AUDIT1                                             NUMBER
    SQL> INSERT INTO T1 VALUES('A','B',NULL,2,1);
    1 row created.
    SQL> INSERT INTO T1 VALUES('A','B',1,NULL,1);
    1 row created.
    SQL> COMMIT;
    Commit complete.
    SQL> SELECT * FROM T1;
    C1         C2         C3         C4             AUDIT1
    A          B                     2                   1
    A          B          1                              1
    SQL>
    I need the output from the above table as a single row, treating the duplicate rows as one, and I also need to ignore the NULL values and take the other available value instead.
    C1         C2         C3         C4             AUDIT1
    A          B           1          2                   1

    Suppose a new row(below) is added, what do you expect as o/p?
    INSERT INTO T1 VALUES('A','B',1,3,2);
    Purvesh's query will give you MAX(c4), but if you use SUM(c4) it will also give you a single row with NULLs removed, only with the values summed.
    -- "Query 1"
    select c1, c2, max(c3) c3, max(c4) c4, max(audit1) audit1
      from t1
    group by c1, c2;
    A     B     1     3     2
    -- "Query 2"
    select c1, c2, SUM(c3) c3, SUM(c4) c4, SUM(audit1) audit1
      from t1
    group by c1, c2;
    A     B     3     7     6
    Please clarify this.

  • Dao and sql

    Dear All,
    In SQL we can compute the SUM of some data, or we can fetch the data with a SELECT query and do the sum in the DAO classes using a StringBuffer.
    I need to know:
    if I write SUM in the query, will processing be slower? Or is it faster to do the sum in the DAO classes? Which gives better performance - summing in the DAO classes, or SUM in the SQL query?
    Thanks a lot in Adv

    It is entirely dependent on your design. If you fetch the records from two different tables via JDBC with no cache, the performance would be slower than doing the sum in the DB engine over large intervals based on the complexity of the query etc. This does not merit doing it with SUM though, it is entirely a design decision.

  • Summary Column

    Dear All
    I have a block with amount and type columns.
    I want to create two summary columns that sum the amount column according to the type.
    example :
    amount    type
    1000      1
    2000      2
    500       1
    600       2
    1st summary column for type 1 = 1500
    2nd summary column for type 2 = 2600
    Can any one help me ?
    Thanks in advance

    Hi Slava,
    I did the following and it works:
    1. Create 2 queries (just for this example)
    select * from employees
    select * from departments
    2. Create a sum_salary summary column outside any group in the Data Model (Reset At "Report") - this is sum from query 1
    3. Create a sum_department_id summary column outside any group in the Data Model (Reset At "Report") - this is sum from query 2
    4. Create a Formula Column, again outside any group in the Data Model. PL/SQL Formula is:
    begin
    return :sum_salary + :sum_department_id;
    end;
    5. Place all these 3 columns in separate frames in the Paper Layout.
    Let us know if you face any problems.
    Navneet.
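    As an aside on the original question, plain SQL can produce the two per-type totals with conditional aggregation. A minimal sketch, assuming the block is based on a hypothetical table named my_block with amount and type columns (the table name is illustrative, not from the post):
    select sum(case when type = 1 then amount else 0 end) as sum_type_1,
           sum(case when type = 2 then amount else 0 end) as sum_type_2
      from my_block;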

  • MTD problems

    I have a PowerPivot report that shows the 12 months of each year. A sales column is compared to a sales column from the previous year. Everything works great except for the current month: it is showing a month-end total for January 2013,
    and I need it to show the same time period as the current month. Any ideas on how to get the current MTD and LY MTD to work across all months? I need it to appear as the latter example below for the current month.
    ty            ly
    $9,132.72     $350,900.23
    TY            LY
    $9,132.72     $13,180.67
    MTD:=CALCULATE(SUM([AMT]),Query[ACCOUNTNUM], DATESBETWEEN(Dim_Time[DateValue], FIRSTDATE(Dim_Time[DateValue]), TODAY()))
    MTD LY:=if(SUM([AMT])=0,BLANK(),CALCULATE(SUM([AMT]),Query[ACCOUNTNUM],SAMEPERIODLASTYEAR( DATESBETWEEN(Dim_Time[DateValue], FIRSTDATE(Dim_Time[DateValue]), TODAY()))))

    Hello,
    Please also take a look at the following articles regarding how to calculate MTD in DAX:
    http://powerpivotfaq.com/Lists/TGPPF/DispForm.aspx?ID=71
    Time Intelligence Functions in DAX:
    http://blogs.msdn.com/b/analysisservices/archive/2010/04/12/time-intelligence-functions-in-dax.aspx
    Regards,
    Elvis Long
    TechNet Community Support

  • Converting number from 0 to 1

    Hi,
    I am generating a query in which the following fields are used:
    date1 and date2, of data type character. I am calculating the difference between the two dates and applying the SUM function. The query is as follows:
    select sum(translate((To_Date((date2),'yyyymmdd') - To_date((date1),'yyyymmdd')),0,1)) from disdate
    Problem 1: if I don't use translate, then when date1 and date2 are equal it gives the value 0, so I used translate to convert 0 to 1 and applied the SUM function over the result. But it converts even the number 10 to 11.
    Can anyone suggest a solution?
    thankyou
    satyanag

    I've found in the past that if you want to calculate the period of time for something then you want to add 1 onto your calculation.
    for example:
    with t as (select sysdate as fromdate, sysdate as todate from dual
              union all select sysdate-1, sysdate from dual)
    select todate - fromdate from t
    0
    1
    Let's say we're talking about policies. The first record has a duration of one day (it starts and ends on the same day), so you should add 1 to the calculation.
    The second one is quite obviously 2 days in duration, so you should add 1 to make it reflect the true state of affairs:
    with t as (select sysdate as fromdate, sysdate as todate from dual
              union all select sysdate-1, sysdate from dual)
    select todate - fromdate + 1 from t
    1
    2

  • Production Cube painfully slow when resolving partitioned data

    Hi,
    We have a few 10g cubes in our production environment on Unix servers. For the most part the users are very satisfied with the response time. Most are of a similar design Global Composites, compressed around 9 dimensions.
    Recently we introduced a new cube, took advantage of the parallel processing
    feature, and partitioned it on the month level of time. It has 9 dimensions (see the portion of the load log below for specifics). All dimensions are sparse, global composites, with compression and 15 measures. Most dimensions are aggregated fully or to one level below the top, with the exception of product, which is aggregated only to the leaf level.
    The load works fine: 9 million data rows loaded and aggregated in 2 hours. The problem we are having is the response time to a query. It's fine when only 1 month is selected. When you select quarter and year levels the query becomes painfully slow. When selecting 1 measure for 1 month it takes about 15 seconds to return. When selecting the same measure for the year or quarter it takes about 6 minutes to return. The same is true when I run a report from the OLAP Worksheet using OLAP DML, so I know that it has nothing to do with the front end or network.
    Any ideas on what could be causing the bottleneck in summing the partitions, or on how to diagnose the problem? The version is 10.2.0.2.
    There are only 7 months in the cube and it's 25gb.
    thx
    Mike
    SQL> select xml_loadid, xml_recordid, xml_date, xml_message
    2 from OLAPSYS.xml_load_log
    3 where xml_loadid = 1006;
    XML_LOADID XML_RECORDID XML_DATE XML_MESSAGE
    1006 1 12-AUG-07 14:17:30 Job# AWXML$_1006 to Build(Refresh) Analytic Workspace RLCUBEMGR.ACP_MTH Submitted to the Queue.
    1006 11 12-AUG-07 14:17:31 Started Build(Refresh) of RLCUBEMGR.ACP_MTH Analytic Workspace.
    1006 12 12-AUG-07 14:17:31 Attached AW RLCUBEMGR.ACP_MTH in RW Mode.
    1006 13 12-AUG-07 14:17:31 Started Loading Dimensions.
    1006 14 12-AUG-07 14:17:33 Started Loading Dimension Members.
    1006 15 12-AUG-07 14:17:33 Started Loading Dimension Members for A_FICO_MTH.DIMENSION (1 out of 8 Dimensions).
    1006 16 12-AUG-07 14:17:34 Finished Loading Members for A_FICO_MTH.DIMENSION. Added: 27. No Longer Present: 0.
    1006 17 12-AUG-07 14:17:34 Started Loading Dimension Members for A_GEN4_MTH.DIMENSION (2 out of 8 Dimensions).
    1006 18 12-AUG-07 14:17:34 Finished Loading Members for A_GEN4_MTH.DIMENSION. Added: 11. No Longer Present: 0.
    1006 19 12-AUG-07 14:17:34 Started Loading Dimension Members for A_LOC_MTH.DIMENSION (3 out of 8 Dimensions).
    1006 20 12-AUG-07 14:17:44 Finished Loading Members for A_LOC_MTH.DIMENSION. Added: 22,887. No Longer Present: 0.
    1006 21 12-AUG-07 14:17:44 Started Loading Dimension Members for A_MAKE_MTH.DIMENSION (4 out of 8 Dimensions).
    1006 22 12-AUG-07 14:17:45 Finished Loading Members for A_MAKE_MTH.DIMENSION. Added: 59. No Longer Present: 0.
    1006 23 12-AUG-07 14:17:45 Started Loading Dimension Members for A_PROD_MTH.DIMENSION (5 out of 8 Dimensions).
    1006 24 12-AUG-07 14:17:45 Finished Loading Members for A_PROD_MTH.DIMENSION. Added: 31. No Longer Present: 0.
    1006 25 12-AUG-07 14:17:45 Started Loading Dimension Members for A_SUBV_MTH.DIMENSION (6 out of 8 Dimensions).
    1006 26 12-AUG-07 14:17:45 Finished Loading Members for A_SUBV_MTH.DIMENSION. Added: 3. No Longer Present: 0.
    1006 27 12-AUG-07 14:17:45 Started Loading Dimension Members for A_TERM_MTH.DIMENSION (7 out of 8 Dimensions).
    1006 28 12-AUG-07 14:17:45 Finished Loading Members for A_TERM_MTH.DIMENSION. Added: 12. No Longer Present: 0.
    1006 29 12-AUG-07 14:17:45 Started Loading Dimension Members for A_TIME_MTH.DIMENSION (8 out of 8 Dimensions).
    1006 30 12-AUG-07 14:17:45 Finished Loading Members for A_TIME_MTH.DIMENSION. Added: 11. No Longer Present: 0.
    1006 31 12-AUG-07 14:17:45 Finished Loading Dimension Members.
    1006 32 12-AUG-07 14:17:45 Started Loading Hierarchies.
    1006 33 12-AUG-07 14:17:45 Started Loading Hierarchies for A_FICO_MTH.DIMENSION (1 out of 8 Dimensions).
    1006 34 12-AUG-07 14:17:46 Finished Loading Hierarchies for A_FICO_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 35 12-AUG-07 14:17:46 Started Loading Hierarchies for A_GEN4_MTH.DIMENSION (2 out of 8 Dimensions).
    1006 36 12-AUG-07 14:17:47 Finished Loading Hierarchies for A_GEN4_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 37 12-AUG-07 14:17:47 Started Loading Hierarchies for A_LOC_MTH.DIMENSION (3 out of 8 Dimensions).
    1006 38 12-AUG-07 14:18:03 Finished Loading Hierarchies for A_LOC_MTH.DIMENSION. 5 hierarchy(s) DLRGRP, RGNDLRGRP, RGNSTATE, STANDAR
    D, STATE Processed.
    1006 39 12-AUG-07 14:18:03 Started Loading Hierarchies for A_MAKE_MTH.DIMENSION (4 out of 8 Dimensions).
    1006 40 12-AUG-07 14:18:03 Finished Loading Hierarchies for A_MAKE_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 41 12-AUG-07 14:18:03 Started Loading Hierarchies for A_PROD_MTH.DIMENSION (5 out of 8 Dimensions).
    1006 42 12-AUG-07 14:18:03 Finished Loading Hierarchies for A_PROD_MTH.DIMENSION. 2 hierarchy(s) NEWUSED, STANDARD Processed.
    1006 43 12-AUG-07 14:18:03 Started Loading Hierarchies for A_SUBV_MTH.DIMENSION (6 out of 8 Dimensions).
    1006 44 12-AUG-07 14:18:03 Finished Loading Hierarchies for A_SUBV_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 45 12-AUG-07 14:18:03 Started Loading Hierarchies for A_TERM_MTH.DIMENSION (7 out of 8 Dimensions).
    1006 46 12-AUG-07 14:18:04 Finished Loading Hierarchies for A_TERM_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 47 12-AUG-07 14:18:04 Started Loading Hierarchies for A_TIME_MTH.DIMENSION (8 out of 8 Dimensions).
    1006 48 12-AUG-07 14:18:04 Finished Loading Hierarchies for A_TIME_MTH.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    1006 49 12-AUG-07 14:18:04 Finished Loading Hierarchies.
    1006 50 12-AUG-07 14:18:04 Started Loading Attributes.
    1006 51 12-AUG-07 14:18:04 Started Loading Attributes for A_FICO_MTH.DIMENSION (1 out of 8 Dimensions).
    1006 52 12-AUG-07 14:18:04 Finished Loading Attributes for A_FICO_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 53 12-AUG-07 14:18:04 Started Loading Attributes for A_GEN4_MTH.DIMENSION (2 out of 8 Dimensions).
    1006 54 12-AUG-07 14:18:04 Finished Loading Attributes for A_GEN4_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 55 12-AUG-07 14:18:04 Started Loading Attributes for A_LOC_MTH.DIMENSION (3 out of 8 Dimensions).
    1006 56 12-AUG-07 14:18:11 Finished Loading Attributes for A_LOC_MTH.DIMENSION. 11 attribute(s) A_DEALER_AMERICREDIT_IND, A_DEALER_E
    CONTRACTING_IND, A_DEALER_MARKET_SEGMENT, A_DEALER_ONEGAP_IND, A_DEALER_RESERVE_PLAN, A_DEALER_RETAIL_SALES_AGR_APV, A_D
    EALER_SEGMENT, A_DEALER_STATUS_CD, A_FLOOR_PLAN_PROVIDER_NAME, LONG_DESCRIPTION, SHORT_DESCRIPTION Processed.
    1006 57 12-AUG-07 14:18:11 Started Loading Attributes for A_MAKE_MTH.DIMENSION (4 out of 8 Dimensions).
    1006 58 12-AUG-07 14:18:11 Finished Loading Attributes for A_MAKE_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 59 12-AUG-07 14:18:11 Started Loading Attributes for A_PROD_MTH.DIMENSION (5 out of 8 Dimensions).
    1006 60 12-AUG-07 14:18:11 Finished Loading Attributes for A_PROD_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 61 12-AUG-07 14:18:11 Started Loading Attributes for A_SUBV_MTH.DIMENSION (6 out of 8 Dimensions).
    1006 62 12-AUG-07 14:18:11 Finished Loading Attributes for A_SUBV_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 63 12-AUG-07 14:18:11 Started Loading Attributes for A_TERM_MTH.DIMENSION (7 out of 8 Dimensions).
    1006 64 12-AUG-07 14:18:11 Finished Loading Attributes for A_TERM_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 65 12-AUG-07 14:18:11 Started Loading Attributes for A_TIME_MTH.DIMENSION (8 out of 8 Dimensions).
    1006 66 12-AUG-07 14:18:12 Finished Loading Attributes for A_TIME_MTH.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, SHORT_DESCRIPTION
    Processed.
    1006 67 12-AUG-07 14:18:12 Finished Loading Attributes.
    1006 68 12-AUG-07 14:18:12 Finished Loading Dimensions.
    1006 69 12-AUG-07 14:18:12 Started Updating Partitions.
    1006 70 12-AUG-07 14:18:12 Finished Updating Partitions.
    1006 71 12-AUG-07 14:18:26 Detached AW RLCUBEMGR.ACP_MTH.
    1006 72 12-AUG-07 14:18:26 Starting Parallel Processing.
    1006 10073 12-AUG-07 14:18:27 Attached AW RLCUBEMGR.ACP_MTH in MULTI Mode.
    1006 10074 12-AUG-07 14:18:27 Started Load of Measures: V_APP_RECEIVED, V_APP_APPROVED, V_APP_COUNTERED, V_APP_CANCELLED, V_APP_DECISIONE
    D, V_BKD_UNIT, V_BKD_DOLLAR, V_OS_UNIT, V_OS_DOLLAR, V_DELQ_UNIT, V_DELQ_DOLLAR, V_CHGOFF_UNIT, V_CHGOFF_DOLLAR, V_REPO_
    UNIT, V_DLRRSVUNEARN from Cube ACPMTH.CUBE. DEFAULT Partition.
    1006 10075 12-AUG-07 14:18:42 Finished Load of Measures: V_APP_RECEIVED, V_APP_APPROVED, V_APP_COUNTERED, V_APP_CANCELLED, V_APP_DECISION
    ED, V_BKD_UNIT, V_BKD_DOLLAR, V_OS_UNIT, V_OS_DOLLAR, V_DELQ_UNIT, V_DELQ_DOLLAR, V_CHGOFF_UNIT, V_CHGOFF_DOLLAR, V_REPO
    UNIT, VDLRRSVUNEARN from Cube ACPMTH.CUBE. DEFAULT Partition. Processed 0 Records. Rejected 0 Records.
    1006 10076 12-AUG-07 14:18:42 Started Auto Solve for Measures: V_APP_APPROVED, V_APP_CANCELLED, V_APP_COUNTERED, V_APP_DECISIONED, V_APP_
    RECEIVED, V_BKD_DOLLAR, V_BKD_UNIT, V_CHGOFF_DOLLAR, V_CHGOFF_UNIT, V_DELQ_DOLLAR, V_DELQ_UNIT, V_DLRRSVUNEARN, V_OS_DOL
    LAR, V_OS_UNIT, V_REPO_UNIT from Cube ACPMTH.CUBE. DEFAULT Partition.
    1006 90112 12-AUG-07 16:09:33 Running Jobs: AWXML$_1006_577, AWXML$_1006_578. Waiting for Tasks to Finish...
    1006 90113 12-AUG-07 16:18:57 Finished Parallel Processing.
    1006 90114 12-AUG-07 16:18:57 Completed Build(Refresh) of RLCUBEMGR.ACP_MTH Analytic Workspace.
    SQL> COLUMN DBMS_LOB.GETLENGTH(AWLOB) HEADING "Bytes";
    SQL> SELECT EXTNUM, SUM(DBMS_LOB.GETLENGTH(AWLOB)) FROM AW$acp_MTH GROUP BY EXTNUM;
    EXTNUM SUM(DBMS_LOB.GETLENGTH(AWLOB))
    1 5715538444
    3 3758960128
    2 3758960128
    0 1.1173E+10
    4 2295138076

    Watrost makes some very valid points and you should review the suggestions he has made.
    In addition I would like to add the following. When deciding on how and when to use partitioning and parallel processing I would suggest considering these points:
    1) The partition dimension - most people partition on time, probably because all the examples I have seen in this area show this as if it were the "normal" method. Using time is a good idea if you want to perform some form of rolling time window and need to quickly and easily drop data for a specific time period. However, partitioning does impact on query performance and you need to balance the needs of "ease of maintenance" against query performance for your users. Once you have selected a level to use as the partition key you will notice on the Summary tab that levels above this key are in fact grayed out and cannot be selected. As Watrost points out, the partition key in effect becomes your top level of pre-aggregation. All other levels are summed at query time. This is normally ok, but for a 9 dimensional model it means you have no summary levels computed across any of your other 8 dimensions for levels above the partition key. This will have a serious impact on your query performance.
    2) Parallel processing - many people seem to implement partitioning in the expectation that their cubes will aggregate more quickly. In most cases this can be true but there are many factors to consider. To be really effective you need to have multiple CPUs, and you should set the number of parallel jobs to a maximum of No of CPUs-1. For some reason I have seen many DBAs just randomly pick a number for this value that greatly exceeds the number of CPUs and then wonder why their cube build takes such a long time. I would start by using the No of CPUs/2 and scaling up to No of CPUs-1 to determine the sweet spot for your system based on resources. Don't forget it is not just CPUs that are important in this situation you also need fast disks as I/O contention can occur very easily.
    3) Partial vs Full aggregation - Upgrade to the latest 10.2.0.3 patchset and apply the OLAP A Patch as well (available from Metalink); this will allow you to take advantage of the partial aggregation features. When loading data into a cube you should only aggregate data for incoming values rather than re-aggregating data for the whole cube. Although AWM does provide a radio button to control this on the second page of the maintenance wizard, it is only with the above patchsets that this actually works.
    4) Sortarea size : check your sortarea size. Typically, this is set very small in most instances and OLAP operations are very sort intensive. Therefore, increasing this to 1M or higher can have a significant impact on load performance.
    5) Monitoring Builds - There are some free performance views you can download from OTN that will help you monitor what is happening during a build. See the link for SQL Scripts for OLAP DBAs and the associated readme at:
    http://www.oracle.com/technology/products/bi/olap/OLAP_DBA_scripts.ZIP
    http://www.oracle.com/technology/products/bi/olap/OLAP_DBA_README.htm
    I would look at the views AW_WAIT_EVENTS, OLAP_PGA_USE and AW_LONGOPS as well as generally monitoring your instance via OEM using the ADDM reports that can be generated. These will help you discover performance bottlenecks in your system
    Now for the dilemma - parallel processing vs partitioning. Most people would agree that partial aggregation is a good idea, as there is no need to re-aggregate a complete cube when there have been no changes to the dimensions/hierarchies and you are loading data for just the current month. But if you partition by month you will never get parallel processing, because you will only be aggregating data within a single partition, and for parallel processing to occur aggregation must occur across multiple partitions. What it does mean is that the time taken for each load and partial aggregation should be consistent, which makes it easy to plan.
    If this is proving to be a significant problem then it is possible to use partitioning and still generate a fully solved cube. In 11g this will be resolved using a new aggregation feature. For 10gR2 it is possible to create this type of solution. In fact I have just re-designed a customer's physical data model to reduce the query time for one of their main user reports from 14 minutes to 10 seconds. The problem was that the query accessed data within the partition that was not aggregated.
    It is quite a simple process and does not change your current logical data model. Quite simply, you create a surrogate partition dimension with only two levels - All and "Level 1". Partition on "Level 1". If you are using time, which most people do, level 1 contains all the values for all months, all quarters and all years. Next, create a source table using GROUPING SETS to compute totals for months, quarters and years, only at the base levels across all your other dimensions. This will create an embedded-total fact table for your surrogate time dimension.
    In the mapping editor, map the surrogate time dimension column in the fact table to your surrogate dimension "Level 1". In the Rules tab within AWM set the aggregation for this surrogate dimension to "Do not Aggregate". Load your data.
    In your existing cube you replace your existing stored measures with calculated measures, using the same name, that point to the new physical cubes via a QDR from the standard time dimension to the surrogate time dimension.
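    A sketch of such a GROUPING SETS source table, using purely hypothetical names (fact_sales, year_id, qtr_id, month_id, product_id, amount - none of these come from the original post). It computes month, quarter and year totals in one pass while keeping the other dimensions at their base level, which is the embedded-total fact table the surrogate time dimension is mapped to:
    CREATE TABLE fact_embedded_totals AS
    SELECT year_id,
           qtr_id,
           month_id,
           product_id,                -- repeat for each remaining base-level dimension key
           SUM(amount) AS amount
      FROM fact_sales
     GROUP BY GROUPING SETS ((year_id, product_id),
                             (year_id, qtr_id, product_id),
                             (year_id, qtr_id, month_id, product_id));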
    Hope this helps.
    Keith Laker
    Oracle EMEA Consulting
    BI Blog: http://oraclebi.blogspot.com/
    DM Blog: http://oracledmt.blogspot.com/
    BI on Oracle: http://www.oracle.com/bi/
    BI on OTN: http://www.oracle.com/technology/products/bi/
    BI Samples: http://www.oracle.com/technology/products/bi/samples/

  • SQL Query Help: Average of Sums !

    Hi Folks !
    I've been tasked with writing a program that queries an Oracle DB, and I'm not used to SQL a lot. Maybe my question is very basic; please bear with me.
    I've simplified the table structure for this question.
    The table Order has 4 fields:
    1. Order_id
    2. Trial_No.
    3. Response_Time.
    4. Order_Time
    Now the first two fields (Order_id, Trial_No) together make up the primary key (it is a composite key). My goal is to find the average response time for each Order ID in the past 10 minutes. Order_Time is the time at which the order was placed. An order can take more than one trial to complete. An order's response time is the sum of all its trial response times.
    A basic query like this
    select AVG(Response_Time) from Order where Order_Time > sysdate - 10/1440
    gives me the average of the response time, but here is the catch: it produces the average no matter how many trials the order had. An order's response time is the SUM of all the trial response times it had.
    Consider the following data in the table
    Order_id Trial_No Response_Time (ms)
    1 1 1000
    2 1 1000
    2 2 1000
    2 3 1000
    If I just use "select AVG(Response_Time)", it will result in 1000ms (4000/4).
    But in reality, Order 2 took 3000ms to complete (3 trials), and the average order response time should be (1000+3000)/2 = 2000ms. Can you see it?
    In other words, how do I calculate the average of the sums of the rows, instead of the average of all the rows? I've gone through some docs about the GROUP BY clause and subqueries, but don't seem to find the answer.
    Thanks very much for reading such a long post. Looking forward to some guidance.
    - K

    You'll want some analytics on this:
    SYS%xe> --select sum(sum_resp_time)/sum(rn) avg
    SYS%xe> --from (
    SYS%xe> with dataset as
      2  (select  1 Order_id, 1 Trial_No, 1000 Response_Time from dual union all
      3   select  2         , 1         , 1000 from dual union all
      4   select  2         , 2         , 1000 from dual union all
      5   select  2         , 3         , 1000 from dual
      6  )
      7  select order_id
      8  ,      response_time
      9  ,      row_number() over ( partition by order_id  order by order_id) rn
    10  ,      sum(response_time) over ( partition by order_id  order by order_id) sum_resp_time
    11  from dataset
    12  -- ) where rn = 1;
      ORDER_ID RESPONSE_TIME         RN SUM_RESP_TIME
             1          1000          1          1000
             2          1000          1          3000
             2          1000          2          3000
             2          1000          3          3000
    4 rows selected.
    Elapsed: 00:00:02.15
    SYS%xe> select sum(sum_resp_time)/sum(rn) avg
      2  from (
      3  with dataset as
      4  (select  1 Order_id, 1 Trial_No, 1000 Response_Time from dual union all
      5   select  2         , 1         , 1000 from dual union all
      6   select  2         , 2         , 1000 from dual union all
      7   select  2         , 3         , 1000 from dual
      8  )
      9  select order_id
    10  ,      response_time
    11  ,      row_number() over ( partition by order_id  order by order_id) rn
    12  ,      sum(response_time) over ( partition by order_id  order by order_id) sum_resp_time
    13  from dataset
    14  ) where rn = 1;
           AVG
          2000
    1 row selected.
    But you could do without them, as Raj showed.
    ( I guess this is one of my weirdest queries ;-) )
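    A sketch of the non-analytic alternative alluded to above: sum per order in an inline view, then average the per-order totals. ORDER is a reserved word, so the table is written here as ORDERS - substitute the real name - and the 10-minute filter from the original post goes in the inner query:
    select avg(order_resp_time) as avg_response_time
      from (select order_id,
                   sum(response_time) as order_resp_time
              from orders
             where order_time >= sysdate - 10/1440
             group by order_id);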

  • Can not use SUM in a simple query because of a syntax error?

    Hi
    Thank you for reading my post.
    I am trying to execute the following query:
    q = em.createQuery("SELECT SUM((NE.pipeLength - 0.6)+((NE.networkDepth-NE.initialDepth)*0.41)+ NE.standLenght)  FROM NExpansion NE  where (( NE.contract.contractor=:contractor) AND (NE.pDiameter=:diameter))");
                  q.setParameter("contractor", contractor);
    q.setParameter("diameter", diameter);
         l = (Long) q.getSingleResult();
    And I get the following error:
    Caused by: Exception [TOPLINK-8025] (Oracle TopLink Essentials - 2.0.1 (Build b09d-fcs (12/06/2007))): oracle.toplink.essentials.exceptions.EJBQLException
    Exception Description: Syntax error parsing the query [SELECT SUM((NE.pipeLength - 0.6)+((NE.networkDepth-NE.initialDepth)*0.41)+ NE.standLenght)  FROM NExpansion NE  where (( NE.contract.contractor=:contractor) AND (NE.pDiameter=:diameter))], line 1, column 12: unexpected token [(].
    Internal Exception: line 1:12: unexpected token: (
            at oracle.toplink.essentials.exceptions.EJBQLException.unexpectedToken(EJBQLException.java:389)
            at oracle.toplink.essentials.internal.parsing.ejbql.EJBQLParser.handleANTLRException(EJBQLParser.java:350)
            at oracle.toplink.essentials.internal.parsing.ejbql.EJBQLParser.addError(EJBQLParser.java:278)
            at oracle.toplink.essentials.internal.parsing.ejbql.EJBQLParser.reportError(EJBQLParser.java:378)
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.aggregateExpression(EJBQLParser.java:1416)
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.selectExpression(EJBQLParser.java:1158)
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.selectClause(EJBQLParser.java:403)
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.selectStatement(EJBQLParser.java:178)
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.document(EJBQLParser.java:135)
            at oracle.toplink.essentials.internal.parsing.ejbql.EJBQLParser.parse(EJBQLParser.java:166)
            at oracle.toplink.essentials.internal.parsing.ejbql.EJBQLParser.buildParseTree(EJBQLParser.java:127)
            at oracle.toplink.essentials.internal.ejb.cmp3.base.EJBQueryImpl.buildEJBQLDatabaseQuery(EJBQueryImpl.java:215)
            at oracle.toplink.essentials.internal.ejb.cmp3.base.EJBQueryImpl.buildEJBQLDatabaseQuery(EJBQueryImpl.java:189)
            at oracle.toplink.essentials.internal.ejb.cmp3.base.EJBQueryImpl.buildEJBQLDatabaseQuery(EJBQueryImpl.java:153)
            at oracle.toplink.essentials.internal.ejb.cmp3.base.EJBQueryImpl.<init>(EJBQueryImpl.java:114)
            at oracle.toplink.essentials.internal.ejb.cmp3.base.EJBQueryImpl.<init>(EJBQueryImpl.java:99)
            at oracle.toplink.essentials.internal.ejb.cmp3.EJBQueryImpl.<init>(EJBQueryImpl.java:86)
            at oracle.toplink.essentials.internal.ejb.cmp3.EntityManagerImpl.createQuery(EntityManagerImpl.java:204)
            ... 30 more
    Caused by: line 1:12: unexpected token: (
            at oracle.toplink.essentials.internal.parsing.ejbql.antlr273.EJBQLParser.aggregateExpression(EJBQLParser.java:1365)
            ... 43 more
    Please let me know what I am doing wrong.
    Thanks.

    Hello,
    From the grammar in the JPA spec, SUM only takes a state_field_path_expression which is defined as
    state_field_path_expression := {identification_variable | single_valued_association_path_expression}.state_field
    Please feel free to file an enhancement to have this expanded upon.
    Best Regards,
    Chris
