Expression Grouping and Summing

I am working on a report where I have an expression that gets the Max value of a field in the detail row of my report. I have four levels of grouping, and I need this expression to be summed as each group rolls up. The expression is:
=Max(IIF(LEFT(Fields!JobNum.Value,1)="S" AND Fields!Week1STol.Value<>0, CDBL(Fields!Week1STol.Value), 0))
I am using Max because I need one of the non-zero values from the result set. The values will all be the same, or 0, so if there is a different way to get the non-zero value without using an aggregate, I can change my expression to do that, but I have not found a way yet. Since I am grouping this, I need the sum of the subgroups' max values as each group rolls up. What I get instead is the Max value of the new group.
I have tried various ways to get this to sum: creating total rows, adding custom code to collect running totals, and so on. I have also tried wrapping the whole thing in a Sum aggregate; that works in the detail row but will not roll up into the groupings.
Any help as to how this can be done will be greatly appreciated.
Thank you,
Chad 

OK, after continuing to search the internet, I finally found the extra piece I was missing that gave me the results I needed. The new expression looks like this:
=RunningValue(Sum(Max(IIF(LEFT(Fields!JobNum.Value,1)="S" AND Fields!Week1STol.Value<>0, CDbl(Fields!Week1STol.Value), 0), "JobItem"), "JobItem"), SUM, "JobItem")
Here I wrapped the original Max expression in both a Sum and a RunningValue, both at the JobItem scope, to get the rollup value. Now when I expand the grouping I get the correct running value at each level.
What this really gives me is a running total at each of the four groupings, even with a "max" value at the detail level. It also lets me total a max value inline on the report, without needing a hidden row or a report footer.
Thank you to everyone who looked at this, I hope it helps someone else. If this answer is not clear enough, please don't hesitate to add to the comments and I will try to clarify.
Thank you, Chad
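
An alternative worth noting, not from the thread: if the dataset query can be changed, the per-item representative value can be computed in SQL so the report only needs a plain Sum to roll up. A minimal sketch, assuming a hypothetical source table Jobs(JobNum, JobItem, Week1STol); none of these names come from the original post:

    -- Hypothetical sketch: resolve the "one non-zero value per JobItem" in SQL,
    -- so every SSRS group can aggregate it with an ordinary Sum().
    SELECT JobItem,
           MAX(CASE WHEN JobNum LIKE 'S%' AND Week1STol <> 0
                    THEN Week1STol
                    ELSE 0
               END) AS Week1STolPerItem
    FROM   Jobs
    GROUP  BY JobItem;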

Similar Messages

  • Sql grouping and summing impossible?

    I want to create an SQL query to sum up some data, but I'm starting to think it is impossible with SQL alone. The data I have are of the following form:
    TRAN_DT   TRAN_RS   DEBT   CRED
    10-Jan    701       100    0
    20-Jan    701       150    0
    21-Jan    701       250    0
    22-Jan    705       0      500
    23-Jan    571       100    0
    24-Jan    571       50     0
    25-Jan    701       50     0
    26-Jan    701       20     0
    27-Jan    705       0      300

    The data are ordered by TRAN_DT and then by TRAN_RS. The grouping and summing of the data is based on TRAN_RS, but only when it changes. So in the table above I do not want to see the first 3 records individually, but only one record with DEBT equal to the sum of those 3, i.e. 100+150+250=500. The table above, after grouping, would be like the one below:
    TRAN_DT   TRAN_RS   DEBT   CRED
    21-Jan    701       500    0
    22-Jan    705       0      500
    24-Jan    571       150    0
    26-Jan    701       70     0
    27-Jan    705       0      300

    The TRAN_DT is the last value of the summed records. I understand that the TRAN_DT may not be selectable. What I have tried so far is the following query:
    select tran_dt,
           tran_rs,
           sum(debt) over (partition by tran_rs order by tran_dt rows unbounded preceding),
           sum(cred) over (partition by tran_rs order by tran_dt rows unbounded preceding)
      from that_table

    Is this even possible with SQL alone, any thoughts?
    The report I am trying to create is in BI Publisher. Maybe it is possible to group the data in the template, so should I ask my question there?

    915218 wrote:
    Is this even possible with SQL alone, any thoughts?

    It sure is...
    WITH that_table AS (select to_date('10/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 100 debt, 0 cred from dual union all
                        select to_date('20/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 150 debt, 0 cred from dual union all
                        select to_date('21/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 250 debt, 0 cred from dual union all
                        select to_date('22/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 500 cred from dual union all
                        select to_date('23/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 100 debt, 0 cred from dual union all
                        select to_date('24/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 50 debt, 0 cred from dual union all
                        select to_date('25/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 50 debt, 0 cred from dual union all
                        select to_date('26/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 20 debt, 0 cred from dual union all
                        select to_date('27/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 300 cred from dual)
    , brk AS (
      SELECT tran_dt,
             tran_rs,
             debt,
             cred,
             CASE WHEN Nvl (Lag (tran_rs) OVER (ORDER BY tran_dt, tran_rs), 0) != tran_rs
                  THEN tran_rs || tran_dt
             END brk_tran_rs
        FROM that_table
    )
    , grp AS (
      SELECT tran_dt,
             tran_rs,
             debt,
             cred,
             Last_Value (brk_tran_rs IGNORE NULLS) OVER (ORDER BY tran_dt, tran_rs) grp_tran_rs
        FROM brk
    )
    SELECT Max (tran_dt),
           Max (tran_rs),
           Sum (debt),
           Sum (cred)
      FROM grp
     GROUP BY grp_tran_rs
     ORDER BY 1, 2;
    Boneist:

    MAX(TRAN_    TRAN_RS       DEBT       CRED
    21-JAN-12        701        500          0
    22-JAN-12        705          0        500
    24-JAN-12        571        150          0
    26-JAN-12        701         70          0
    27-JAN-12        705          0        300

    Me:

    MAX(TRAN_ MAX(TRAN_RS)  SUM(DEBT)  SUM(CRED)
    21-JAN-12          701        500          0
    22-JAN-12          705          0        500
    24-JAN-12          571        150          0
    26-JAN-12          701         70          0
    27-JAN-12          705          0        300

    Edited by: BrendanP on 17-Feb-2012 04:05
    Test data courtesy of Boneist, and fixed a bug.
    Edited by: BrendanP on 17-Feb-2012 04:29
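
    A note beyond the thread: the start-of-group flag above can also be turned into a group number with a running sum, which avoids the concatenated-string key. A sketch of that variant, using the same table and column names as the thread:

    -- Sketch: number each consecutive run of the same tran_rs with a running
    -- sum of "value changed" flags, then aggregate per run.
    WITH flagged AS (
      SELECT t.*,
             CASE WHEN tran_rs = LAG (tran_rs) OVER (ORDER BY tran_dt)
                  THEN 0 ELSE 1
             END AS chg
        FROM that_table t
    ), grouped AS (
      SELECT f.*,
             SUM (chg) OVER (ORDER BY tran_dt) AS grp
        FROM flagged f
    )
    SELECT MAX (tran_dt) AS tran_dt,
           MAX (tran_rs) AS tran_rs,
           SUM (debt)    AS debt,
           SUM (cred)    AS cred
      FROM grouped
     GROUP BY grp
     ORDER BY 1;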

  • LINQ Group and Sum table

    I can't figure out how to group these records:
    EmpID   ClientId   HourIn    HourOut    Date         TotalHours
    12345   9999       8:00 AM   12:00 PM   01/05/2015   4
    12345   9999       1:00 PM   3:00 PM    01/05/2015   2
    12345   9999       3:30 PM   6:00 PM    01/05/2015   2.5
    12345   9999       8:00 AM   5:00 PM    01/06/2015   9
    12345   9999       8:00 AM   5:00 PM    01/07/2015   9
    12345   9999       8:00 AM   5:00 PM    01/08/2015   9
    12345   9999       8:00 AM   5:00 PM    01/09/2015   9
    I want to group by date, sum the total hours, take HourIn from the first record, and take HourOut from the last one.
    FYI, Date can be a DateTime (e.g. I can make Date be 1/18/2014 12:00:00 AM).
    EmpID   ClientId   HourIn    HourOut   Date         TotalHours
    12345   9999       8:00 AM   6:00 PM   01/05/2015   8.5
    12345   9999       8:00 AM   5:00 PM   01/06/2015   9
    12345   9999       8:00 AM   5:00 PM   01/07/2015   9
    12345   9999       8:00 AM   5:00 PM   01/08/2015   9
    12345   9999       8:00 AM   5:00 PM   01/09/2015   9
    Thanks in advance

    Hope this helps. Do a group by and then a select; change the below accordingly:
    DataTable objtable = new DataTable();
    objtable.Columns.Add("EmpID", typeof(int)); objtable.Columns.Add("ClientId", typeof(int));
    objtable.Columns.Add("HourIn", typeof(DateTime)); objtable.Columns.Add("HourOut", typeof(DateTime));
    objtable.Columns.Add("Date", typeof(DateTime)); objtable.Columns.Add("TotalHours", typeof(double));
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 8:00 AM"), Convert.ToDateTime("01/05/2015 12:00 PM"), Convert.ToDateTime("01/05/2015"), 4);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 1:00 PM"), Convert.ToDateTime("01/05/2015 3:00 PM"), Convert.ToDateTime("01/05/2015"), 2);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 3:30 PM"), Convert.ToDateTime("01/05/2015 6:00 PM"), Convert.ToDateTime("01/05/2015"), 2.5);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/06/2015 8:00 AM"), Convert.ToDateTime("01/06/2015 5:00 PM"), Convert.ToDateTime("01/06/2015"), 9);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/07/2015 8:00 AM"), Convert.ToDateTime("01/07/2015 5:00 PM"), Convert.ToDateTime("01/07/2015"), 9);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/08/2015 8:00 AM"), Convert.ToDateTime("01/08/2015 5:00 PM"), Convert.ToDateTime("01/08/2015"), 9);
    objtable.AcceptChanges();
    // Group by Date: earliest HourIn, latest HourOut, and the summed TotalHours per day.
    var result = objtable.AsEnumerable().GroupBy(x => x.Field<DateTime>("Date")).Select(g => new
    {
        empId = g.First().Field<int>("EmpID"),
        clientID = g.First().Field<int>("ClientId"),
        hoursIn = g.Min(e => e.Field<DateTime>("HourIn")),
        hourOut = g.Max(e => e.Field<DateTime>("HourOut")),
        totalhours = g.Sum(e => e.Field<double>("TotalHours")) // Sum, not First: the requirement is the daily total
    });
    foreach (var row in result)
    {
        Console.WriteLine("{0} {1} {2} {3} {4}", row.empId, row.clientID, row.hoursIn.ToString("h:mm tt"), row.hourOut.ToString("h:mm tt"), row.totalhours);
    }

  • Grouping and sum values in XI

    Hello,
    I'm working with invoices and I have this source structure as XI input:
    Invoice
    --|....
    --|....
    --|Item1
        |taxcode=01
        |Amount=12
    --|Item2
        |taxcode=08
        |Amount=10
    --|Item3
        |taxcode=01
        |Amount=24
    Now my goal is to map these fields to the IDoc segment E1EDP04, grouping by tax code (MWSKZ) and putting in the sum of the group's amounts (MWSBT).
    IDOC:
    --|...
    --|...
    --|E1EDP01
        |...
        |...
        |E1EDP04
            |MWSKZ=01
            |MWSBT=36
        |...
    --|E1EDP01
        |...
        |...
        |E1EDP04
            |MWSKZ=08
            |MWSBT=10
        |...
    How can I group by a field in XI?
    Thank you
    Corrado

    Hi Corrado,
    If you want to do it in graphical mapping, then I would do it this way:
    1. Sort by taxcode.
    2. taxcode --> splitByValue (valueChanged) --> formatByExample (Amount, and the split value) --> sum(Amount) --> MWSBT
    I can send you a screenshot of something similar if you need it.
    best regards
    Dawid

  • Group and Sum in Webi

    Hi,
    I have three objects: Business Unit, Department, and Revenue.
    I would like to group by Business Unit and Department, with the sum of Revenue.
    How do I do this at the report level?
    Please let me know?
    Thanks and Regards,
    Manjunath N Jogin

    Hi Pumpactionshotgun,
    I did the same thing in WebI, but the revenue field is displaying all records.
    It is not aggregating (sum) based on Business Unit and Department.
    How do I do the 'aggregation setting for Revenue in the universe'?
    I am waiting for your reply.
    Thanks and Regards,
    Manjunath N Jogin

  • Select query with group and sum

    Friends, I have a table which has a list of items that are sold in many provinces, and their selling prices.
    EMP_TABLE
    item_code
    item_desc
    item_province
    item_selling_price
    Now I want a query which returns a row containing:
    distinct item code, item desc, province, the sum of item_selling_price for Ontario, the sum of item_selling_price for British Columbia, and the sum of item_selling_price for Quebec.
    Can anyone please tell me how to do it?
    thx
    m

    Hello
    It's always useful to provide some test data, create table scripts, etc., but does this do what you're after?
    create table dt_test_t1
    (item_code                     varchar2(3),
    item_desc                    varchar2(10),
    item_province               varchar2(20),
    item_selling_price          number(3)
    ) tablespace av_datas;
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province2',10);
    SELECT
        item_code,
        item_desc,
        SUM(DECODE(item_province, 'Province1', item_selling_price, 0)) province_1_total,
        SUM(DECODE(item_province, 'Province2', item_selling_price, 0)) province_2_total
    FROM
        dt_test_t1
    GROUP BY
        item_code,
        item_desc;
    ITE ITEM_DESC  PROVINCE_1_TOTAL PROVINCE_2_TOTAL
    ABC Item1                    10               20
    ABC Item2                    30                0
    ABC Item3                     0               30
    ABC Item4                    20               10
    ABC Item5                    10               20

    HTH
    David
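
    A side note, not from the thread: DECODE is Oracle-specific, and the same conditional aggregation can be written with ANSI CASE, which some readers find clearer. A sketch against the same dt_test_t1 table:

    -- ANSI CASE version of the DECODE pivot above.
    SELECT item_code,
           item_desc,
           SUM(CASE WHEN item_province = 'Province1' THEN item_selling_price ELSE 0 END) AS province_1_total,
           SUM(CASE WHEN item_province = 'Province2' THEN item_selling_price ELSE 0 END) AS province_2_total
      FROM dt_test_t1
     GROUP BY item_code, item_desc;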

  • How to Group and Sum rtf elements

    Dear Friends,
    I have limited knowledge of XML, so I am looking for your help.
    I am trying to create an external template for Bill Presentment Architecture, which is in .rtf format. The template has 3 sections: Invoice Header, Lines, and Tax Summary.
    The Invoice Header and Lines sections are OK, but I am facing some issues with the Tax section.
    LINE SECTION
    LINE_ID   ITEM    RATE   QTY   BASE_AMT   TAX_RATE   TAX_AMT   TOTAL_AMT
    1         P0001   20     10    200        21%        42        242
    3         P0003   40     30    1200       21%        252       1452
    4         P0004   20     100   2000       10%        420       2200
    5         P0005   50     10    500        10%        105       550
    2         P0002   50     20    1000       6%         210       1060
    EXPECTED RESULT IN TAX SUMMARY SECTION WHICH I AM LOOKING FOR
    TAX_RATE   BASE_AMT   TAX_AMT   TOTAL_AMT
    21%        1400       294       1694
    10%        2500       525       2750
    6%         1000       210       1060
    Looking for your help, much appreciated.
    Regards,
    Jay
    Edited by: 992820 on Mar 9, 2013 5:20 AM

    >
    Tax Code   Taxable amount
    VAT 6%     1200
    VAT 6%     1400
    VAT 7%     1000
    VAT 7%     2000
    I used your code
    <?for-each-group:LINE;./tax_code?> G tag
    and I am getting the following result
    VAT 6%     1200
    VAT 7%     1000
    >
    That is because you have a loop by tax_code but no inner loop and no aggregation function (e.g. sum) for the amount, so the amount is taken from the first row of each group. Try adding a sum function for the amount (in BI Publisher syntax, something like <?sum(current-group()/amount)?>).
    Also see http://docs.oracle.com/cd/E18727_01/doc.121/e13532/T429876T431325.htm#I_bx2Dcontent
    "Summary Lines: Check this box if you want to display summarized transaction lines. Grouped lines are presented in the Lines and Tax area of the primary bill page."
    So it may not be a template problem but a settings issue?

  • Group and Sum/Average

    Hello.
    I have an Excel structure here; this data is extracted from our company's software. The records were recorded from Dec 5, 2013 to Dec 11, 2013.
    Job Operator           Added      Job Opened   Job Revenue Recognition Date   Shipment ID
    Joebeth Cañete         05-Dec-13  19-Dec-13    IMM                            S00038408
    Joebeth Cañete         05-Dec-13  19-Dec-13    IMM                            S00038412
    Joebeth Cañete         05-Dec-13  19-Dec-13    IMM                            S00038414
    Joebeth Cañete         05-Dec-13  10-Dec-13    IMM                            S00038440
    Sharlah Faye Solomon   09-Dec-13  09-Dec-13    IMM                            S00038501
    Rachel Rosales         09-Dec-13  09-Dec-13    IMM                            S00038502
    Sharlah Faye Solomon   09-Dec-13  09-Dec-13    IMM                            S00038503
    Sharlah Faye Solomon   09-Dec-13  09-Dec-13    IMM                            S00038504
    Sharlah Faye Solomon   09-Dec-13  09-Dec-13    IMM                            S00038506
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038508
    May Jane Herrera       09-Dec-13  09-Dec-13    IMM                            S00038509
    Joebeth Cañete         09-Dec-13  17-Dec-13    IMM                            S00038510
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038512
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038513
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038514
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038515
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038516
    Sheena Dagandan        09-Dec-13  09-Dec-13    IMM                            S00038518
    Joebeth Cañete         10-Dec-13  10-Dec-13    IMM                            S00038523
    Sharlah Faye Solomon   10-Dec-13  10-Dec-13    IMM                            S00038524
    May Jane Herrera       10-Dec-13  10-Dec-13    IMM                            S00038525
    May Jane Herrera       10-Dec-13  10-Dec-13    IMM                            S00038526
    Rachel Rosales         10-Dec-13  10-Dec-13    IMM                            S00038528
    May Jane Herrera       10-Dec-13  10-Dec-13    IMM                            S00038530
    May Jane Herrera       10-Dec-13  10-Dec-13    IMM                            S00038531
    May Jane Herrera       10-Dec-13  10-Dec-13    IMM                            S00038532
    Joebeth Cañete         10-Dec-13  10-Dec-13    IMM                            S00038533
    Rachel Rosales         10-Dec-13  10-Dec-13    IMM                            S00038534
    Sharlah Faye Solomon   11-Dec-13  11-Dec-13    IMM                            S00038541
    Sharlah Faye Solomon   11-Dec-13  11-Dec-13    IMM                            S00038542
    Sharlah Faye Solomon   11-Dec-13  11-Dec-13    IMM                            S00038543
    Sharlah Faye Solomon   11-Dec-13  11-Dec-13    IMM                            S00038544
    May Jane Herrera       11-Dec-13  11-Dec-13    IMM                            S00038548
    May Jane Herrera       11-Dec-13  11-Dec-13    IMM                            S00038549
    Rachel Rosales         11-Dec-13  11-Dec-13    IMM                            S00038554
    Rachel Rosales         11-Dec-13  11-Dec-13    IMM                            S00038555
    Rachel Rosales         11-Dec-13  11-Dec-13    IMM                            S00038557
    Rachel Rosales         11-Dec-13  11-Dec-13    IMM                            S00038558
    Now, my boss wants to have a matrix like this:

    Job Operator      Added      Job Opened      Shipment ID
    Employee A
        Total for (n) weeks [of Employee A]
        Average Shipment Daily:
        Average Shipment Weekly:
    Employee B
        Total for (n) weeks [of Employee B]
        Average Shipment Daily:
        Average Shipment Weekly:

    In short, he wants to group the records by Job Operator and get the average number of shipments daily and weekly. Is it possible to do this job?

    What is the definition of the 'week' you mentioned?
    It would help us find the solution if you can provide the output you want, e.g.:

    Job Operator     Total for (n) weeks   Average Shipment Daily   Average Shipment Weekly
    Joebeth Cañete   2                     1                        3.5

    But in any case, you can try to create a pivot table to analyze the data.
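
    For comparison only, not part of the thread: if the sheet were loaded into a database table, the grouping the boss asks for is a one-pass aggregate. A hypothetical sketch, assuming a made-up table shipments(operator, added_dt, shipment_id); all names here are invented:

    -- Hypothetical: total shipments per operator, plus the average per distinct
    -- day and per distinct ISO week present in the data.
    SELECT operator,
           COUNT(*)                                         AS total_shipments,
           COUNT(*) / COUNT(DISTINCT TRUNC(added_dt))       AS avg_daily,
           COUNT(*) / COUNT(DISTINCT TRUNC(added_dt, 'IW')) AS avg_weekly
      FROM shipments
     GROUP BY operator;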

  • Sum of LineCount Including Groups and Detail Data On Each Page Used To Generate New Page If TotalPageLineCount > 28

    Post Author: tadj188#
    CA Forum: Formula
    Needed: the sum of the line count, including groups and detail data, on each page, used to generate a new page if TotalPageLineCount > 28.
    Background:
    1) The report SQL is created with unions so that detail lines continue on a page until they reach the page footer or report footer, rather than using subreports. Each former subreport is now essentially a group1a, group1b, etc. (containing column headers and other data within the report, with their respective detail lines). I had multiple subreports, and each subreport became one union.
    Created and tested already:
    1) I have calculated @TotalLineForEachOfTheSameGroup; now I need to sum the individual same-group totals to get the total line count on a page.
    Issue:
    1) I need this to create a break on a certain line, before it dribbles into a pre-printed area.
    Other ideas appreciated:
    1) Groups/detail lines break inconveniently (dribble) into the pre-printed area; I am looking for alternatives to the above situation.
    Thank you.
    Tadj

    To export all images of each page, try something like this:
    var myDoc = app.activeDocument;
    var myFolder = myDoc.filePath;
    var myImages = myDoc.allGraphics;
    for (var i = 0; myImages.length > i; i++) {
        app.select(myImages[i]);
        var myImageName = myImages[i].itemLink.name;
        app.jpegExportPreferences.jpegQuality = JPEGOptionsQuality.high;
        app.jpegExportPreferences.exportResolution = 300;
        app.selection[0].exportFile(ExportFormat.JPG, File(myFolder + "/" + myImageName + ".JPEG"), false);
        alert(myImages[i].itemLink.name);
    }

  • A better way to differentiate positive vs. negative numbers and sum them ?

    Hi, I wonder if there is a better or easier way to differentiate the positive numbers from the negative ones and sum them respectively.
    I came up with the below:
    create table t1 (num number, id varchar2(3));
    insert into t1 values  (-100, 1);
    insert into t1 values (50, 2);
    insert into t1 values  (-10, 3);
    insert into t1 values  (-20, 4);
    insert into t1 values  (50, 5);
    insert into t1 values  (60, 6);
    select sum(decode(sign(num), 1, num, 0)) plus, sum(decode (sign(num), -1, num, 0)) minu from t1;
    PLUS   MINU
     160   -130

    Any suggestion would be appreciated! Thanks

    > if there is a better or easier way to differentiate the positive numbers from negative ones and sum them respectively?

    Maybe you want them in different rows rather than in different columns:

    SQL> select sign(num), sum(num) from t1 group by sign(num);

    SIGN(NUM)   SUM(NUM)
             1        160
            -1       -130

    2 rows selected.
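
    A note beyond the thread: the original two-column result can also be written with ANSI CASE instead of the Oracle-specific DECODE/SIGN combination. A sketch against the same t1 table:

    -- CASE-based version of the original query: positives and negatives in one pass.
    SELECT SUM(CASE WHEN num > 0 THEN num ELSE 0 END) AS plus,
           SUM(CASE WHEN num < 0 THEN num ELSE 0 END) AS minu
      FROM t1;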

  • Building a template - groups and formulas

    CR2008
    I'm trying to build a template to standardize some reports. I can get the header and the formatting fine; however, when I created a formula for Page M of N, the formula disappeared, and the field was replaced with one of the header fields. Does this just never work, or am I missing something?
    The documentation says "If an existing report is used as a template, these report objects will be applied to the new report: Fields, Groups, Summary fields". However, when I tried to save a grouping and apply it to an existing report, nothing happened - only the formatting came across.
    Thanks for any and all help!
    Edit: I added a TemplateField to my template and grouped on it, and applied it -- didn't work. However, when I then added grouping to the report, and re-applied the template, the new groups picked up the formatting from last time. When I added more groups than there were in the template, the newest groups were formatted the same as the last group I had specified. The Custom Group Name, which I had specified to be TemplateField2, (the same as the second grouping) still wasn't picked up, though. When I only specified a single TemplateField in the detail line, the Template Expert split the detail section into seven stacked detail sections, so that won't be much use.
    So, the other question remains: am I completely out of luck with formulas? And functions, for that matter? Thanks again!
    Edit again: I tried adding a Summary field to the template. While combining a text field and a summary field didn't work, dropping them next to each other did. Also, it only worked for the same summary: when I accidentally did a Sum instead of a DistinctCount, as the template was expecting, the formatting was ignored.
    Edited by: Garrett Fitzgerald on May 7, 2009 3:03 PM
    Edited by: Garrett Fitzgerald on May 7, 2009 3:34 PM

    Hello Rashmi,
    I'm not sure if I understood your question, but it seems you have to put your key figures in columns, together with time, if, for example, you want to show information like this:

                                          06.2008        07.2008 ...
                                          kf1  kf2 ...   kf1  kf2 ...
    Org -> Channel -> customer -> product

    Please let me know if this is what you are looking for.
    Best regards,
    Sue

  • Group and country charts of accounts (COA)

    Hello
    We have this situation:
    Our company (which is part of an international group) has two branch offices abroad. So we have three company codes (e.g. 0001 - home company, 0002 - 1st branch, 0003 - 2nd branch): one for our home company, the other two for the branches. We are temporarily using only one COA, from our home company.
    This year we will be moving to the international group COA, so we will have a country COA for all three company codes (which means that everyday booking will be on the group COA). But how can we aggregate the two branch offices' COAs with the home company COA in balance reports?
    The "alternative account number" field in FS00 will be used in the country COAs to aggregate at the group COA level, but we need to aggregate at the home company level too, e.g.:
    0002 - balance reporting only for the branch office country
    0003 - balance reporting only for the branch office country
    0001 - balance reporting for 0001 plus the sum of 0002 and 0003.
    Is that even possible?
    Is there any good SAP documentation on moving to a new COA? What should be taken care of, etc.?
    Best regards

    Hi,
    You can use an alternative chart of accounts in the 2 branches (for external reporting) and use one unique chart of accounts for all 3 company codes.
    It will satisfy your requirement:
    0002 - balance reporting only for the branch office country
    0003 - balance reporting only for the branch office country
    0001 - balance reporting for 0001 plus the sum of 0002 and 0003
    If you want to use a different chart of accounts for each company code, then you cannot have cross-company-code controlling. It is advisable to use the country chart of accounts (the alternative account number in the company code) for reporting purposes and better controlling.
    P.S.: The alternative account number is only issued in the financial statement if you explicitly assign it to the relevant items in the financial statement version.
    Regards,
    Anand Raichura

  • GROUP and MAX Query

    Hi
    I have two tables that store following information
    CREATE TABLE T_FEED (FEED_ID NUMBER, GRP_NUM NUMBER);
    CREATE TABLE T_FEED_RCV (FEED_ID NUMBER, RCV_DT DATE);
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, 2);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED VALUES (5, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    I join these tables using the following query to return all the feeds and check when each feed was received:
    SELECT
    F.FEED_ID,
    F.GRP_NUM,
    FR.RCV_DT
    FROM T_FEED F
    LEFT OUTER JOIN T_FEED_RCV FR
    ON F.FEED_ID = FR.FEED_ID
    ORDER BY GRP_NUM, RCV_DT DESC;
    Output:

    FEED_ID   GRP_NUM   RCV_DT
    1         1
    2         1         5/1/2009
    3         2         2/1/2009
    5
    4                   5/12/2009
    Actually I want the maximum date of when we received each feed. GRP_NUM tells which feeds are grouped together; a NULL GRP_NUM means a feed is not grouped, so treat it as an individual group. In the example, feeds 1 and 2 are in one group and any one of those feeds is required. Feeds 3, 4, and 5 are individual groups and all three are required.
    I need a single query that returns the maximum date for the feeds. For this example the result should be NULL, because out of feeds 1 and 2 the max date is 5/1/2009, for feed 3 the max date is 2/1/2009, for feed 4 it is 5/12/2009, and for feed 5 it is NULL. Since one of the required feeds is null, the result should be null.
    DELETE FROM T_FEED;
    DELETE FROM T_FEED_RCV;
    COMMIT;
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, NULL);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    For the above inserts, the result should be: for feeds 1 and 2 - 5/1/2009, for feed 3 - 2/1/2009, and for feed 4 - 5/12/2009. So the max of these dates is 5/12/2009.
    I tried using the MAX function grouped by GRP_NUM and also tried using DENSE_RANK, but I have been unable to resolve the issue. I am not sure how I can use the same query to return a non-null value for feeds in the same group and null (if any) for those that don't belong to any group. I'd appreciate it if anyone can help me.

    Hi,
    Kuul13 wrote:
    Thanks Frank!
    Appreciate your time and solution. I tweaked your earlier solution, which was cleaner and simpler, and built the following query to resolve the problem:

    SELECT * FROM (
        SELECT NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY')) AS GRP_ID
        ,      MAX (FR.RCV_DT) AS MAX_DT
        FROM   T_FEED F
        LEFT OUTER JOIN T_FEED_RCV FR ON F.FEED_ID = FR.FEED_ID
        GROUP BY NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY'))
        ORDER BY MAX_DT DESC NULLS FIRST)
    WHERE ROWNUM=1;

    I hope there are no hidden issues with this query compared to the later one provided by you.

    Actually, I can see 4 issues with this. I admit that some of them are unlikely, but why take any chances?
    (1) The first argument to NVL is a NUMBER; the second (being the result of ||) is a VARCHAR2. That means one of them will be implicitly converted to the type of the other. This is just the kind of thing that behaves differently in different versions of Oracle, so it may work fine for a year or two and then, when you change to another version, mysteriously quit working. When you have to convert from one type of data to another, always do an explicit conversion, using TO_CHAR (for example).
    (2) F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY') will produce a key like '123405202009'. GRP_NUM is a NUMBER with no restriction on the number of digits, so it could conceivably be 123405202009. The made-up grp_ids must never be the same as any real GRP_NUM.
    (3) The combination (carr_id, feed_id, eff_dt) is unique, but using TO_CHAR(EFF_DT, 'MMDDYYYY') assumes that the combination (carr_id, feed_id, TRUNC (eff_dt)) is unique. Even if eff_dt is always entered as (say) midnight (00:00:00) now, you may decide to start using the time of day sometime in the future. What are the chances that you'll remember to change this query when you do? Not very likely. If multiple rows from the same day are relatively rare, this is the kind of error that could go on for months before you even realize that there is an error.
    (4) Say you have this data in t_feed:

    carr_id   feed_id   eff_dt        grp_num
    1         234       20-May-2009   NULL
    12        34        20-May-2009   NULL
    123       4         20-May-2009   NULL

    All of these rows will produce the same grp_id: 123405202009.
    Using NVL, as you are doing, allows you to get by with just one sub-query, which is nice.
    You can do that and still address all the problems above:
    SELECT  *
    FROM    (
            SELECT  NVL2 ( f.grp_num
                         , 'A' || TO_CHAR (f.grp_num)
                         , 'B' || TO_CHAR (f.carr_id) || ':' ||
                                  TO_CHAR (f.feed_id) || ':' ||
                                  TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                         ) AS grp_id
            ,       MAX (fr.rcv_dt) AS max_dt
            FROM            t_feed      f
            LEFT OUTER JOIN t_feed_rcv  fr  ON  f.feed_id = fr.feed_id
            GROUP BY NVL2 ( f.grp_num
                          , 'A' || TO_CHAR (f.grp_num)
                          , 'B' || TO_CHAR (f.carr_id) || ':' ||
                                   TO_CHAR (f.feed_id) || ':' ||
                                   TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                          )
            ORDER BY max_dt DESC NULLS FIRST
            )
    WHERE   ROWNUM = 1;

    I would still use two sub-queries, adding one to compute grp_id, so we don't have to repeat the NVL2 expression. I would also use a WITH clause rather than in-line views.
    Do you find it easier to read the query above, or the simpler query you posted in your last message?
    Please make things easy on yourself and the people who want to help you. Always format your code so that the way the code looks on the screen makes it clear what the code is doing.
    In particular, the formatting should make clear
    (a) where each clause (SELECT, FROM, WHERE, ...) of each query begins
    (b) where sub-queries begin and end
    (c) what each argument to functions is
    (d) the scope of parentheses
    When you post formatted text on this site, type these 6 characters:
    before and after the formatted text, to preserve spacing.
    The way you post the DDL (CREATE TABLE ...) and DML (INSERT ...) statements is great: I wish more people were as helpful as you.
    There's no need to format the DDL and DML. (If you want to, then go ahead: it does help a little.)

  • Group and sort in group2

    CREATE TABLE [dbo].[Table_1](
    [group] [int] NULL,
    [sort] [int] NULL,
    [Item] [nchar](10) NULL
    ) ON [PRIMARY]
    GO
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (1, 1, N'Car ')
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (2, 1, N'Bus ')
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (1, 0, N'Bike ')
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (2, 0, N'Lorry ')
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (1, 1, N'Train ')
    INSERT [dbo].[Table_1] ([group], [sort], [Item]) VALUES (1, 1, N'Ship ')
    Data:
    My dataset query is in an SSRS table.
    Group2 expression: group on [group], sort on [sort].
    Group1 expression: sort on [sort].
    First(Item.Value) is not working. I need to fix this in SSRS, since this is sample data from a stored procedure with more columns.
    My problem is that once Group2 is grouped, it is not taking the sort and is just displaying the first Item rather than the first one in sort order.
    ShanmugaRaj

    I beg your pardon, but in your example's insert statements you have 1, 1, Ship, whereas in your report you display 1, 0, Ship.
    Why not avoid the problem at the source, by letting T-SQL do the work:
    with a ([Group], [FirstSort])
    AS (SELECT [GROUP], MIN(sort) As FirstSort from dbo.Table_1 group by [group])
    select  a.[GROUP], a.FirstSort, c.[Item] AS FirstItem , b.[sort], b.[item]
    from a
    inner join  dbo.table_1 as b on a.[Group] = b.[group]
    inner join dbo.Table_1 as c on a.[Group] = c.[group] and a.[FirstSort] = c.[Sort]
    order by a.[group], b.[sort] -- LEAVE OUT the order BY in the report dataset
    Gives this result (I took the insert statements from your example)
    GROUP FirstSort FirstItem sort item
    1 0 Bike       0 Bike     
    1 0 Bike       1 Car      
    1 0 Bike       1 Train    
    1 0 Bike       1 Ship     
    2 0 Lorry      0 Lorry    
    2 0 Lorry      1 Bus      
    Jan D'Hondt - SQL server BI development
    sorry for that.. will take care of it here
    ShanmugaRaj
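
    An aside, not from the thread: on SQL Server 2012 or later, the same FirstItem column can come from a window function instead of the two self-joins. A sketch against the same dbo.Table_1:

    -- Sketch: FIRST_VALUE picks the item with the lowest sort per group,
    -- without joining the table to itself.
    SELECT [group],
           MIN([sort]) OVER (PARTITION BY [group]) AS FirstSort,
           FIRST_VALUE([Item]) OVER (PARTITION BY [group] ORDER BY [sort]) AS FirstItem,
           [sort],
           [Item]
    FROM   dbo.Table_1
    ORDER  BY [group], [sort];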

  • GroupFilter and sum(); filtered records should not be included in the sum()

    Hi All,
    I am working on a report that has the following requirements:
    1. Query records from the database (multiple queries).
    2. Filter records based on certain criteria.
    3. Sum() the rows.
    4. A where clause cannot be used because of the complexity of the report.
    I have achieved the above using a data-template, with one exception: my sum() is including all the filtered rows in the summation. I DO NOT want the filtered rows to be included in the sum(). I know I could do the sum() in the layout-template using XSLT.
    Is there a way to do this in the data-template?
    My data-template looks something like
    <group name="g_dept" source="q_dept">
        <element name="dname" value="dname"/>
        <element name="d_salary" value="g_emp.salary" function="sum()"/>
        <group name="g_emp" source="q_emp" groupFilter="filterEmployees(:empno)">
           <element name="ename" value="ename"/>
           <element name="empno" value="empno"/>
           <element name="salary" value="salary"/>
        </group>
    </group>
    My data looks like
    10    Scott1    7865    $100
    10    Scott2    6875    $200
    10    Scott3    5678    $300
    10    Scott4    8657    $500 <-- filtered with pl/sql package
    My output should look like
    Dept: 10
    Scott1    7865    $100
    Scott2    6875    $200
    Scott3    5678    $300
    Total for 10: $600
    Instead, my output looks like:
    Total for 10: $1100 <-- including the filtered row
    Steve
    Is this a feature or a bug in the data-template? I am not sure of the order of events: querying, filtering, and summation. If anyone could answer, I'd really appreciate it.
    Thanks

    Was there ever an answer to the original question about the groupFilter and sum()?
    I have a GL report (BI Publisher 5.6.3 with E-Business Suite 11.5.10.2) that uses oracle.apps.fnd.flex.kff.select (with security as the output type). In my lowest group (lines) I have a group filter that eliminates the row if the security flag is Y (to eliminate it from the XML and the report). However, the sums that I'm doing are including all rows, not just the ones included in the data file.
    I too have the same issue as Kalagara, where I can't use the where clause to exclude the rows.
    Thanks - Jennifer
