SQL grouping and summing impossible?

I want to create an SQL query to sum up some data, but I'm starting to think it is impossible with SQL alone. The data I have are of the following form:
TRAN_DT     TRAN_RS     DEBT     CRED
10-Jan      701         100      0
20-Jan      701         150      0
21-Jan      701         250      0
22-Jan      705         0        500
23-Jan      571         100      0
24-Jan      571         50       0
25-Jan      701         50       0
26-Jan      701         20       0
27-Jan      705         0        300
The data are ordered by TRAN_DT and then by TRAN_RS. The grouping and summing of the data should be based on TRAN_RS, but only when it changes. So in the table above I do not want to see the first 3 records individually, but only one record whose DEBT is the sum of those 3, i.e. 100+150+250=500. So the table above, after grouping, would look like the one below:
TRAN_DT     TRAN_RS     DEBT     CRED
21-Jan      701         500      0
22-Jan      705         0        500
24-Jan      571         150      0
26-Jan      701         70       0
27-Jan      705         0        300
The TRAN_DT shown is that of the last record in each summed group. I understand that TRAN_DT may not be selectable. What I have tried so far is the following query:
select tran_dt,
       tran_rs,
       sum(debt) over (partition by tran_rs order by tran_dt rows unbounded preceding),
       sum(cred) over (partition by tran_rs order by tran_dt rows unbounded preceding)
  from that_table
Is this even possible with SQL alone, any thoughts?
The report I am trying to create is in BI Publisher. Maybe it is possible to group the data in the template instead, and I should ask my question there?

915218 wrote:
Is this even possible with SQL alone, any thoughts?
It sure is...
WITH that_table as (select to_date('10/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 100 debt, 0 cred from dual union all
                     select to_date('20/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 150 debt, 0 cred from dual union all
                     select to_date('21/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 250 debt, 0 cred from dual union all
                     select to_date('22/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 500 cred from dual union all
                     select to_date('23/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 100 debt, 0 cred from dual union all
                     select to_date('24/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 50 debt, 0 cred from dual union all
                     select to_date('25/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 50 debt, 0 cred from dual union all
                     select to_date('26/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 20 debt, 0 cred from dual union all
                     select to_date('27/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 300 cred from dual)
, brk AS (
SELECT  tran_dt,
        tran_rs,
        debt,
        cred,
        CASE WHEN Nvl (Lag (tran_rs) OVER (ORDER BY tran_dt, tran_rs), 0) != tran_rs THEN tran_rs || tran_dt END brk_tran_rs
  FROM that_table
), grp AS (
SELECT  tran_dt,
        tran_rs,
        debt,
        cred,
        Last_Value (brk_tran_rs IGNORE NULLS) OVER (ORDER BY tran_dt, tran_rs) grp_tran_rs
  FROM brk
)
SELECT  Max (tran_dt),
        Max (tran_rs),
        Sum (debt),
        Sum (cred)
  FROM grp
 GROUP BY grp_tran_rs
 ORDER BY 1, 2
Boneist
MAX(TRAN_    TRAN_RS       DEBT       CRED
21-JAN-12        701        500          0
22-JAN-12        705          0        500
24-JAN-12        571        150          0
26-JAN-12        701         70          0
27-JAN-12        705          0        300
Me
MAX(TRAN_ MAX(TRAN_RS)  SUM(DEBT)  SUM(CRED)
21-JAN-12          701        500          0
22-JAN-12          705          0        500
24-JAN-12          571        150          0
26-JAN-12          701         70          0
27-JAN-12          705          0        300
Edited by: BrendanP on 17-Feb-2012 04:05
Test data courtesy of Boneist, and fixed bug.
Edited by: BrendanP on 17-Feb-2012 04:29
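An alternative way to express the same group-on-change logic, offered only as a sketch against Boneist's that_table test data above (not a tested drop-in), is to flag each row where TRAN_RS changes and turn the running count of those flags into a group number:
-- Sketch: chg = 1 whenever TRAN_RS differs from the previous row (ordered by TRAN_DT, TRAN_RS),
-- so the running SUM of chg gives each consecutive run of TRAN_RS its own group number.
WITH flagged AS (
  SELECT tran_dt, tran_rs, debt, cred,
         CASE WHEN tran_rs = LAG(tran_rs) OVER (ORDER BY tran_dt, tran_rs)
              THEN 0 ELSE 1 END chg
    FROM that_table
), grouped AS (
  SELECT tran_dt, tran_rs, debt, cred,
         SUM(chg) OVER (ORDER BY tran_dt, tran_rs) grp
    FROM flagged
)
SELECT Max(tran_dt), Max(tran_rs), Sum(debt), Sum(cred)
  FROM grouped
 GROUP BY grp
 ORDER BY 1, 2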

Similar Messages

  • LINQ Group and Sum table

    I can't figure out how to group these records:
    EmpID   ClientId   HourIn    HourOut    Date         TotalHours
    12345   9999       8:00 AM   12:00 PM   01/05/2015   4
    12345   9999       1:00 PM   3:00 PM    01/05/2015   2
    12345   9999       3:30 PM   6:00 PM    01/05/2015   2.5
    12345   9999       8:00 AM   5:00 PM    01/06/2015   9
    12345   9999       8:00 AM   5:00 PM    01/07/2015   9
    12345   9999       8:00 AM   5:00 PM    01/08/2015   9
    12345   9999       8:00 AM   5:00 PM    01/09/2015   9
    I want to group by Date and sum TotalHours, with HourIn taken from the first record of the day and HourOut from the last one.
    FYI, Date can be a datetime (e.g. I can make Date be 1/18/2014 12:00:00 AM).
    EmpID   ClientId   HourIn    HourOut   Date         TotalHours
    12345   9999       8:00 AM   6:00 PM   01/05/2015   8.5
    12345   9999       8:00 AM   5:00 PM   01/06/2015   9
    12345   9999       8:00 AM   5:00 PM   01/07/2015   9
    12345   9999       8:00 AM   5:00 PM   01/08/2015   9
    12345   9999       8:00 AM   5:00 PM   01/09/2015   9
    Thanks in advance

    Hope this helps... do a group by and then select; change the below accordingly:
    DataTable objtable = new DataTable();
    objtable.Columns.Add("EmpID", typeof(int)); objtable.Columns.Add("ClientId", typeof(int));
    objtable.Columns.Add("HourIn", typeof(DateTime)); objtable.Columns.Add("HourOut", typeof(DateTime));
    objtable.Columns.Add("Date", typeof(DateTime)); objtable.Columns.Add("TotalHours", typeof(double));
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 8:00 AM"), Convert.ToDateTime("01/05/2015 12:00 PM"), Convert.ToDateTime("01/05/2015"), 4);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 1:00 PM"), Convert.ToDateTime("01/05/2015 3:00 PM"), Convert.ToDateTime("01/05/2015"), 2);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/05/2015 3:30 PM"), Convert.ToDateTime("01/05/2015 6:00 PM"), Convert.ToDateTime("01/05/2015"), 2.5);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/06/2015 8:00 AM"), Convert.ToDateTime("01/06/2015 5:00 PM"), Convert.ToDateTime("01/06/2015"), 9);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/07/2015 8:00 AM"), Convert.ToDateTime("01/07/2015 5:00 PM"), Convert.ToDateTime("01/07/2015"), 9);
    objtable.Rows.Add(12345, 9999, Convert.ToDateTime("01/08/2015 8:00 AM"), Convert.ToDateTime("01/08/2015 5:00 PM"), Convert.ToDateTime("01/08/2015"), 9);
    objtable.AcceptChanges();
    // Group by Date: HourIn = earliest, HourOut = latest, TotalHours = sum for the day.
    var result = objtable.AsEnumerable().GroupBy(x => x.Field<DateTime>("Date")).Select(g => new
    {
        empId = g.First().Field<int>("EmpID"),
        clientID = g.First().Field<int>("ClientId"),
        hoursIn = g.Min(e => e.Field<DateTime>("HourIn")),
        hourOut = g.Max(e => e.Field<DateTime>("HourOut")),
        totalhours = g.Sum(e => e.Field<double>("TotalHours"))
    });
    foreach (var row in result)
    {
        Console.WriteLine("{0} {1} {2} {3} {4}", row.empId, row.clientID, row.hoursIn.ToString("hh:mm tt"), row.hourOut.ToString("hh:mm tt"), row.totalhours);
    }
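    For comparison, if the same rows lived in a database table the daily roll-up is a plain GROUP BY. This is only a sketch: Timesheet and WorkDate are hypothetical names standing in for the DataTable and its Date column above.
    -- Sketch only: Timesheet(EmpID, ClientId, HourIn, HourOut, WorkDate, TotalHours) is hypothetical.
    SELECT EmpID,
           ClientId,
           MIN(HourIn)     AS HourIn,      -- earliest clock-in of the day
           MAX(HourOut)    AS HourOut,     -- latest clock-out of the day
           WorkDate,
           SUM(TotalHours) AS TotalHours
      FROM Timesheet
     GROUP BY EmpID, ClientId, WorkDate
     ORDER BY WorkDate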

  • Select query with group and sum

    Friends, I have a table which has a list of items that are sold in many provinces, and their selling prices.
    EMP_TABLE
    item_code
    item_desc
    item_province
    item_selling_price
    Now I want a query which returns a row containing:
    distinct item code, item desc, province, sum of item_selling_price for Ontario, sum of item_selling_price for British Columbia, sum of item_selling_price for Quebec.
    Can anyone please tell me how to do it?
    thx
    m

    Hello
    It's always useful to provide some test data and create table scripts etc., but does this do what you're after?
    create table dt_test_t1
    (item_code                     varchar2(3),
    item_desc                    varchar2(10),
    item_province               varchar2(20),
    item_selling_price          number(3)
    ) tablespace av_datas;
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item1','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item2','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item3','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item4','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province2',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province1',10);
    INSERT INTO dt_test_t1 VALUES('ABC','Item5','Province2',10);
    SQL> SELECT
      2     item_code,
      3     item_desc,
      4     SUM(DECODE(item_province,'Province1',item_selling_price,0)) province_1_total,
      5     SUM(DECODE(item_province,'Province2',item_selling_price,0)) province_2_total
      6  FROM
      7     dt_test_t1
      8  GROUP BY
      9     item_code,
    10     item_desc;
    ITE ITEM_DESC  PROVINCE_1_TOTAL PROVINCE_2_TOTAL
    ABC Item1                    10               20
    ABC Item2                    30                0
    ABC Item3                     0               30
    ABC Item4                    20               10
    ABC Item5                    10               20
    HTH
    David
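    Adapting David's DECODE pivot to the actual province names from the question would look something like the sketch below. It assumes item_province stores the names exactly as 'Ontario', 'British Columbia' and 'Quebec'; adjust to whatever values the column really holds.
    SELECT item_code,
           item_desc,
           SUM(DECODE(item_province, 'Ontario',          item_selling_price, 0)) ontario_total,
           SUM(DECODE(item_province, 'British Columbia', item_selling_price, 0)) bc_total,
           SUM(DECODE(item_province, 'Quebec',           item_selling_price, 0)) quebec_total
      FROM emp_table
     GROUP BY item_code, item_desc;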

  • Grouping and sum values in XI

    Hello,
    I'm working with invoices and I have this source structure as XI input:
    Invoice
    -- |....
    -- |....
    -- |Item1
    |taxcode=01
    |Amount=12
    --|Item2
    |taxcode=08
    |Amount=10
    --|Item3
    |taxcode=01
    |Amount=24
    Now my goal is to map these fields to the IDoc segment E1EDP04, grouping by tax code (MWSKZ) and putting in the sum of each group's amounts (MWSBT).
    IDOC:
    --|...
    --|...
    --|E1EDP01
    |...
    |...
    |EIEDP04
    |MWSKZ=01
    |MWSBT=36
    |...
    --|E1EDP01
    |...
    |...
    |EIEDP04
    |MWSKZ=08
    |MWSBT=10
    |...
    How can I group by a field in XI?
    Thank you
    Corrado

    Hi Corrado,
    If you want to do it in graphical mapping, then I would do it this way:
    1. Sort by taxcode.
    2. (taxcode) --> splitByValue (valueChanged) --> formatByExample (Amount, and the split value) --> sum(Amount) --> MWSBT
    I can send you a screenshot of something similar if you need it.
    best regards
    Dawid

  • PL SQL Grouping and Edit the Oldest Row in a Group

    Event Table
    CREATE TABLE "EVENT" ("EVENT_ID" VARCHAR2(36), "EVENTTYPE" VARCHAR2(110), "WHEN_OCCURRED" TIMESTAMP (6),"DESCRIPTION" VARCHAR2(255), "CURRENCY" VARCHAR2(3), "AMOUNT" NUMBER(38,8), "TRANSACTION_ID" VARCHAR2(255) )
    Insert into EVENT (EVENT_ID,EVENTTYPE,WHEN_OCCURRED,DESCRIPTION,CURRENCY,AMOUNT,TRANSACTION_ID) values ('1','paymentEvent',to_timestamp('15-AUG-11 10.11.30.165000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Payment Transaction Confirmed','USD',10,'3');
    Insert into EVENT (EVENT_ID,EVENTTYPE,WHEN_OCCURRED,DESCRIPTION,CURRENCY,AMOUNT,TRANSACTION_ID) values ('2','paymentEvent',to_timestamp('15-AUG-11 12.31.23.162000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Payment Transaction Confirmed','USD',10,'3085c51d-6425-4974-b2fb-f5bdc4d08137');
    Transaction table
    CREATE TABLE "TRANSACTION" ("TRANSACTION_ID" VARCHAR2(36 BYTE), "LAST_UPDATED" TIMESTAMP (6), "DESCRIPTION" VARCHAR2(255 BYTE), "TYPE" VARCHAR2(255 BYTE), "DEBIT_CREDIT" NUMBER(38,8), "ORIGINAL_ID" VARCHAR2(36 BYTE), "TYPE" VARCHAR2(255 BYTE))
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('c8ef1e7c-4134-45d8-9568-5da2fc41e137',to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed','SETTLEMENT',1,'3','1', 'debit');
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('8243cc1d-614d-4cfd-9c3b-b5cd44b3a67f',to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed','SETTLEMENT',1,'3','1', 'credit');
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('3',to_timestamp('15-AUG-11 10.11.30.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'testCharge','PAYMENT',10,null,null,null);
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('609f264b-1cef-4615-8f28-d8dae391dbba',to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed','SETTLEMENT',4,'3','1', 'debit');
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('3085c51d-6425-4974-b2fb-f5bdc4d08137',to_timestamp('15-AUG-11 12.31.23.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'testCharge','PAYMENT',10,null,null,null);
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('bafc4f83-d3b7-45f2-b35d-ba4101eebf3c',to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Settle: Payment Transaction Confirmed','SETTLEMENT',2,'3085c51d-6425-4974-b2fb-f5bdc4d08137','2','debit');
    Insert into TRANSACTION (TRANSACTION_ID,LAST_UPDATED,DESCRIPTION,TYPE,DEBIT_CREDIT,ORIGINAL_ID,EVENT_ID) values ('bbb6123b-8b70-45f2-b593-5b6bb2afd6b2',to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed','SETTLEMENT',2,'3','1','credit');
    Result
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,TRX_DESC,GROSS,NET) values ('609f264b-1cef-4615-8f28-d8dae391dbba',to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),0,4,'3','Payment Transaction Confirmed',10,6);
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,TRX_DESC,GROSS,NET) values ('bbb6123b-8b70-45f2-b593-5b6bb2afd6b2',to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),2,0,'3','Correction: Payment Transaction Confirmed',0,2);
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,TRX_DESC,GROSS,NET) values ('c8ef1e7c-4134-45d8-9568-5da2fc41e137',to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,1,'3','Correction: Payment Transaction Confirmed',0,-1);
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,TRX_DESC,GROSS,NET) values ('8243cc1d-614d-4cfd-9c3b-b5cd44b3a67f',to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),1,0,'3','Correction: Payment Transaction Confirmed',0,1);
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,TRX_DESC,GROSS,NET) values ('bafc4f83-d3b7-45f2-b35d-ba4101eebf3c',to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,2,'3085c51d-6425-4974-b2fb-f5bdc4d08137','Payment Transaction Confirmed',10,8);
    Explanation
    * Each event table insert has a corresponding type='PAYMENT' insert in the transaction table.
    * My result is only interested in transactions with type='SETTLEMENT'.
    * One type='PAYMENT' transaction should have one type='SETTLEMENT' by default, but in some cases (an error) there may be many type='SETTLEMENT' rows.
    * Take transaction_id = '3': it has 4 'SETTLEMENT' transactions.
    All four (4) settlements go into the result, and out of the four the oldest transaction should have a different presentation
    (See result number 01)
    If payment transaction has only one settlement transaction then the result description = Payment Transaction Confirmed
    gross = respective event table amount
    net = gross + credit - debit
    if a payment transaction has more than one settlement transaction then:
    * Oldest transaction (order by LAST_UPDATED)
    description = Payment Transaction Confirmed
    gross = respective event table amount
    net = gross + credit - debit
    * All other transactions
    description = Correction: Payment Transaction Confirmed
    gross = 0
    net = if type = debit then net = -(debit) else credit
    Appreciate your help to write a PL/SQL query that satisfies the above.
    I am thinking to:
    Group all related settlement transactions into one unit.
    Find the oldest among them.
    Apply different logic for the oldest.
    Apply common logic for the others.
    Still need some help from you guys to convert my thoughts into SQL.
    Thanks

    Yes, that is what I was initially trying to do, but with no success.
    Please refer below my effort
    Transaction table
    CREATE TABLE "TRANSACTION" ("TRANSACTION_ID" VARCHAR2(36 BYTE), "OBJECT_TYPE" VARCHAR2(255 BYTE), "VERSION" NUMBER(19,0), "CREATED" TIMESTAMP (6), "LAST_UPDATED" TIMESTAMP (6), "DESCRIPTION" VARCHAR2(255 BYTE), "EXTERNAL_ID" VARCHAR2(255 BYTE), "TYPE" VARCHAR2(255 BYTE), "STATE" VARCHAR2(255 BYTE), "LAST_STATE_CHANGE" TIMESTAMP (6), "CHANNEL" VARCHAR2(255 BYTE), "MANDATE_ID" VARCHAR2(255 BYTE), "SUBSCRIPTION_ID" VARCHAR2(255 BYTE), "BULK_REFUND_TASK_ID" VARCHAR2(255 BYTE), "GROSS_CURRENCY" VARCHAR2(3 BYTE), "GROSS_AMOUNT" NUMBER(38,8), "TAX_CURRENCY" VARCHAR2(3 BYTE), "TAX_AMOUNT" NUMBER(38,8), "PAYER_ID" VARCHAR2(36 BYTE), "PAYEE_ID" VARCHAR2(36 BYTE), "ON_BEHALF_OF_PAYEE" VARCHAR2(255 BYTE), "ORIGINAL_ID" VARCHAR2(36 BYTE), "TRANSFER_ID" VARCHAR2(36 BYTE), "EVENT_ID" VARCHAR2(36 BYTE), "TERM_ID" VARCHAR2(36 BYTE), "CHARGE" NUMBER(1,0) DEFAULT 0, "TAGS_AND_VALUES" VARCHAR2(4000 BYTE), "LARGE_TAGS_AND_VALUES" CLOB, "EXPORT_PENDING" VARCHAR2(4 BYTE) DEFAULT 'TRUE')
    Data
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('c8ef1e7c-4134-45d8-9568-5da2fc41e137','com.valista.trxe.model.Transaction',1,to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed',null,'SETTLEMENT','CONFIRMED',to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',1,'USD',0,'9f285bf7-46a1-4621-b165-dc782bf1d1ae','1c887a16-4344-4a80-8a11-f26d80824e4a',null,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14','f24be53d-7b78-455c-b1ac-f9fc242ec5ff','a2f09cbb-1555-4c8d-b401-0233982cfa7e','c6b70c01-7f9a-4d96-93a6-e8e1b568cf59',1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('8243cc1d-614d-4cfd-9c3b-b5cd44b3a67f','com.valista.trxe.model.Transaction',1,to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed',null,'SETTLEMENT','CONFIRMED',to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',1,'USD',0,'1c887a16-4344-4a80-8a11-f26d80824e4a','9f285bf7-46a1-4621-b165-dc782bf1d1ae',null,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14','ac5ae000-7ca8-4542-bf83-8d99e085e706','a2f09cbb-1555-4c8d-b401-0233982cfa7e','b70382c8-a05e-49c4-9919-ad765c5c5046',1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('61db88ec-ade8-4cc3-90f3-e49fa26dcf14','com.valista.trxe.model.Transaction',5,to_timestamp('15-AUG-11 10.11.30.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'testCharge',null,'PAYMENT','CONFIRMED',to_timestamp('15-AUG-11 10.11.30.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',10,'USD',0,'e8a58763-9faa-4632-a2b3-747aa00fe9c4','9f285bf7-46a1-4621-b165-dc782bf1d1ae',null,null,'ac45ebc4-f817-42e7-8d4a-96e41fb40ef8',null,null,1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('609f264b-1cef-4615-8f28-d8dae391dbba','com.valista.trxe.model.Transaction',1,to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed',null,'SETTLEMENT','CONFIRMED',to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',4,'USD',0,'9f285bf7-46a1-4621-b165-dc782bf1d1ae','1c887a16-4344-4a80-8a11-f26d80824e4a',null,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14','c0e866f2-0389-4fe3-9b11-f9ed6f49dd66','a2f09cbb-1555-4c8d-b401-0233982cfa7e','64103b9f-b760-4add-8c43-dbc7ac501fda',1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('3085c51d-6425-4974-b2fb-f5bdc4d08137','com.valista.trxe.model.Transaction',2,to_timestamp('15-AUG-11 12.31.23.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'testCharge',null,'PAYMENT','CONFIRMED',to_timestamp('15-AUG-11 12.31.23.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',10,'USD',0,'e8a58763-9faa-4632-a2b3-747aa00fe9c4','9f285bf7-46a1-4621-b165-dc782bf1d1ae',null,null,'db78d04e-c23d-4601-9b0e-76e826e85e29',null,null,1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('bafc4f83-d3b7-45f2-b35d-ba4101eebf3c','com.valista.trxe.model.Transaction',1,to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),'Settle: Payment Transaction Confirmed',null,'SETTLEMENT','CONFIRMED',to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',2,'USD',0,'9f285bf7-46a1-4621-b165-dc782bf1d1ae','1c887a16-4344-4a80-8a11-f26d80824e4a',null,'3085c51d-6425-4974-b2fb-f5bdc4d08137','9b5e384c-7ce9-4d7a-871e-5f418e737205','9c5cb90a-2035-490f-9ca7-c70f24a9010c','b70382c8-a05e-49c4-9919-ad765c5c5046',1,null,'TRUE');
    Insert into TRANSACTION (TRANSACTION_ID,OBJECT_TYPE,VERSION,CREATED,LAST_UPDATED,DESCRIPTION,EXTERNAL_ID,TYPE,STATE,LAST_STATE_CHANGE,CHANNEL,MANDATE_ID,SUBSCRIPTION_ID,BULK_REFUND_TASK_ID,GROSS_CURRENCY,GROSS_AMOUNT,TAX_CURRENCY,TAX_AMOUNT,PAYER_ID,PAYEE_ID,ON_BEHALF_OF_PAYEE,ORIGINAL_ID,TRANSFER_ID,EVENT_ID,TERM_ID,CHARGE,TAGS_AND_VALUES,EXPORT_PENDING) values ('bbb6123b-8b70-45f2-b593-5b6bb2afd6b2','com.valista.trxe.model.Transaction',1,to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),'Correction: Payment Transaction Confirmed',null,'SETTLEMENT','CONFIRMED',to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),null,null,null,null,'USD',2,'USD',0,'1c887a16-4344-4a80-8a11-f26d80824e4a','9f285bf7-46a1-4621-b165-dc782bf1d1ae',null,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14','bef6cea1-d188-44e6-a5cf-6282480e13c2','a2f09cbb-1555-4c8d-b401-0233982cfa7e','e4a0bb0b-7273-4f28-8681-04fc067cb80d',1,null,'TRUE');
    My Query
    select transaction_id as trx_id,
           last_updated,
           nvl((case when payee_id = '9f285bf7-46a1-4621-b165-dc782bf1d1ae' then gross_amount end), 0) as credit,
           nvl((case when payer_id = '9f285bf7-46a1-4621-b165-dc782bf1d1ae' then gross_amount end), 0) as debit,
           connect_by_root transaction_id root,
           rownum rn,
           description as trx_desc
      from (select * from transaction order by last_updated)
     where type = 'SETTLEMENT'
     start with original_id is null
     connect by original_id = prior transaction_id
     order by last_updated
    Result I got ( this is not a table just result set )
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('609f264b-1cef-4615-8f28-d8dae391dbba',to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),0,4,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',1,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('bbb6123b-8b70-45f2-b593-5b6bb2afd6b2',to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),2,0,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',2,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('c8ef1e7c-4134-45d8-9568-5da2fc41e137',to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,1,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',3,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('8243cc1d-614d-4cfd-9c3b-b5cd44b3a67f',to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),1,0,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',4,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('bafc4f83-d3b7-45f2-b35d-ba4101eebf3c',to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,2,'3085c51d-6425-4974-b2fb-f5bdc4d08137',5,'Settle: Payment Transaction Confirmed');
    Result I want to get
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('609f264b-1cef-4615-8f28-d8dae391dbba',to_timestamp('15-AUG-11 10.17.43.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),0,4,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',1,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('bbb6123b-8b70-45f2-b593-5b6bb2afd6b2',to_timestamp('15-AUG-11 10.18.56.000000000 AM','DD-MON-RR HH.MI.SS.FF AM'),2,0,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',2,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('c8ef1e7c-4134-45d8-9568-5da2fc41e137',to_timestamp('15-AUG-11 12.01.42.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,1,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',3,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('8243cc1d-614d-4cfd-9c3b-b5cd44b3a67f',to_timestamp('15-AUG-11 12.29.45.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),1,0,'61db88ec-ade8-4cc3-90f3-e49fa26dcf14',4,'Correction: Payment Transaction Confirmed');
    Insert into EXPORT_TABLE (TRX_ID,LAST_UPDATED,CREDIT,DEBIT,ROOT,RN,TRX_DESC) values ('bafc4f83-d3b7-45f2-b35d-ba4101eebf3c',to_timestamp('15-AUG-11 12.31.44.000000000 PM','DD-MON-RR HH.MI.SS.FF AM'),0,2,'3085c51d-6425-4974-b2fb-f5bdc4d08137',1,'Settle: Payment Transaction Confirmed');
    Explanation
    Refer to the 5th record of the result set. It has RN = 5, and I want to get RN = 1 for it, since it has a different ROOT and it is the oldest member of its tree.
    So I want to get RN as:
    date, root, row_number (or something that just finds the oldest member in a tree):
    15-AUG-11 10.17.43.000000000 AM     61db88ec-ade8-4cc3-90f3-e49fa26dcf14     1
    15-AUG-11 10.18.56.000000000 AM     61db88ec-ade8-4cc3-90f3-e49fa26dcf14     2
    15-AUG-11 12.01.42.000000000 PM     61db88ec-ade8-4cc3-90f3-e49fa26dcf14     3
    15-AUG-11 10.25.56.000000000 AM     55db88ec-ade8-4cc3-90f3-e49fa26dcf14     1
    15-AUG-11 10.27.56.000000000 AM     55db88ec-ade8-4cc3-90f3-e49fa26dcf14     2
    15-AUG-11 10.55.56.000000000 AM     85db88ec-ade8-4cc3-90f3-e49fa26dcf14     1
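    One way to get that numbering (a sketch only, layered over your own hierarchical query rather than a tested solution) is to expose CONNECT_BY_ROOT in an inline view and then apply ROW_NUMBER partitioned by that root and ordered by LAST_UPDATED, so the oldest member of each tree gets RN = 1:
    -- Sketch: number each SETTLEMENT row within its own tree, oldest first.
    select t.*,
           row_number() over (partition by t.root order by t.last_updated) rn
      from (select transaction_id as trx_id,
                   last_updated,
                   description as trx_desc,
                   connect_by_root transaction_id root
              from transaction
             where type = 'SETTLEMENT'
             start with original_id is null
             connect by original_id = prior transaction_id) t
     order by t.last_updated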

  • Group and Sum in Webi

    Hi,
    I have three objects Business Unit, Department and Revenue
    I would like to group by Business Unit and Department, and sum Revenue.
    How to do this in report level?
    Please let me know?
    Thanks and Regards,
    Manjunath N Jogin

    Hi Pumpactionshotgun ,
    I did the same thing in WebI but the revenue field is displaying all records.
    It is not aggregating (summing) based on Business Unit and Department.
    How do I set the aggregation for Revenue in the universe?
    I am waiting for your reply.
    Thanks and Regards,
    Manjunath N Jogin

  • Expression Grouping and Summing

    I am working on a report where I have an expression that gets the Max value of a field in the detail row of my report.  I have four levels of grouping that I then need this expression to sum.  The expression is: 
    =Max(IIF(LEFT(Fields!JobNum.Value,1)="S" AND Fields!Week1STol.Value<>0, CDBL(Fields!Week1STol.Value), 0))
    I am using Max because I need one of the non-zero values from the result set. They will all be the same, or 0, so if there is a different way to get the non-zero value without using an aggregate I can change my expression to do that, but I have not found a way to do that yet. Now, since I am grouping this, as each group rolls up I need the sum of the max values in the subgroup. What I get instead is the Max value of the new group.
    I have tried various ways to get this to sum. I have tried creating total rows, adding custom code to collect running totals, etc. I have also tried wrapping the whole thing in a SUM aggregate, and that will work in the detail, but will not roll up into the groupings.
    Any help as to how this can be done will be greatly appreciated.
    Thank you,
    Chad 

    Ok, after continuing to search the internet I finally found the extra piece that I was missing that gave me the results I needed. The new expression looks like this:
    =runningvalue(Sum(Max(IIF(LEFT(Fields!JobNum.Value,1)="S" AND Fields!Week1STol.Value<>0, CDBL(Fields!Week1STol.Value), 0),"JobItem"),"JobItem"),SUM, "JobItem")
    In this I wrapped the original expression of Max in both a Sum and a runningvalue both at the JobItem level to get this rollup value. Now when I open the grouping I get the correct running value at each level.
    What this really gives to me is a running total at each of four groupings, even with a "max" value at the detail level. This also allows me to total a max value inline on the report, without needing a hidden row, or report footer.
    Thank you to everyone who looked at this, I hope it helps someone else. If this answer is not clear enough, please don't hesitate to add to the comments and I will try to clarify.
    Thank you, Chad

  • How to Group and Sum rtf elements

    Dear Friends,
    I have limited knowledge of XML, so I am looking for your help.
    I am trying to create an external template for Bill Presentment Architecture, which is in .rtf. The template has 3 sections: Invoice header, Lines and Tax summary.
    The Invoice Header and Lines sections are OK, but I am facing some issues with the Tax section.
    LINE SECTION
    LINE_ID     ITEM     RATE     QTY     BASE_AMT     TAX_RATE     TAX_AMT     TOTAL_AMT
    1     P0001     20     10     200     21%     42     242
    3     P0003     40     30     1200     21%     252     1452
    4     P0004     20     100     2000     10%     420     2200
    5     P0005     50     10     500     10%     105     550
    2     P0002     50     20     1000     6%     210     1060
    EXPECTED RESULT IN TAX SUMMARY SECTION WHICH I AM LOOKING FOR
    TAX RATE           BASE_AMT          TAX_AMT          TOTAL_AMT
    21%          1400          294          1694
    10%          2500          525          2750
    6%          1000          210          1060
    Looking for your help, much appriciated.
    Regards,
    Jay
    Edited by: 992820 on Mar 9, 2013 5:20 AM

    >
    Tax Code Taxable amount
    VAT 6% 1200
    VAT 6% 1400
    VAT 7% 1000
    VAT 7% 2000
    I used your code
    <?for-each-group:LINE;./tax_code?> G tag
    and I am getting the following result
    VAT 6% 1200
    VAT 7% 1000
    >
    because you loop by tax_code but have no inner loop and no aggregation function (e.g. sum) for the amount, so the amount is taken from the first row of each group
    try adding a sum function for the amount
    also http://docs.oracle.com/cd/E18727_01/doc.121/e13532/T429876T431325.htm#I_bx2Dcontent
    Summary Lines: Check this box if you want to display summarized transaction lines. Grouped lines are presented in the Lines and Tax area of the primary bill page.
    So it may not be a template problem but a settings issue?
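    For reference, the aggregation the Tax summary section needs to produce is just a group-by over the line data. Expressed in SQL terms purely for illustration (LINES here is a hypothetical table holding the line section shown above, not part of the BPA data model):
    SELECT tax_rate,
           SUM(base_amt)  AS base_amt,
           SUM(tax_amt)   AS tax_amt,
           SUM(total_amt) AS total_amt
      FROM lines
     GROUP BY tax_rate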

  • Group and Sum/Average

    Hello.
    I have an Excel structure here; this data is extracted from our company's software. The records were recorded from Dec 5, 2013 to Dec 11, 2013.
    Job Operator           Added       Job Opened   Job Revenue Recognition Date   Shipment ID
    Joebeth Cañete         05-Dec-13   19-Dec-13    IMM                            S00038408
    Joebeth Cañete         05-Dec-13   19-Dec-13    IMM                            S00038412
    Joebeth Cañete         05-Dec-13   19-Dec-13    IMM                            S00038414
    Joebeth Cañete         05-Dec-13   10-Dec-13    IMM                            S00038440
    Sharlah Faye Solomon   09-Dec-13   09-Dec-13    IMM                            S00038501
    Rachel Rosales         09-Dec-13   09-Dec-13    IMM                            S00038502
    Sharlah Faye Solomon   09-Dec-13   09-Dec-13    IMM                            S00038503
    Sharlah Faye Solomon   09-Dec-13   09-Dec-13    IMM                            S00038504
    Sharlah Faye Solomon   09-Dec-13   09-Dec-13    IMM                            S00038506
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038508
    May Jane Herrera       09-Dec-13   09-Dec-13    IMM                            S00038509
    Joebeth Cañete         09-Dec-13   17-Dec-13    IMM                            S00038510
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038512
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038513
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038514
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038515
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038516
    Sheena Dagandan        09-Dec-13   09-Dec-13    IMM                            S00038518
    Joebeth Cañete         10-Dec-13   10-Dec-13    IMM                            S00038523
    Sharlah Faye Solomon   10-Dec-13   10-Dec-13    IMM                            S00038524
    May Jane Herrera       10-Dec-13   10-Dec-13    IMM                            S00038525
    May Jane Herrera       10-Dec-13   10-Dec-13    IMM                            S00038526
    Rachel Rosales         10-Dec-13   10-Dec-13    IMM                            S00038528
    May Jane Herrera       10-Dec-13   10-Dec-13    IMM                            S00038530
    May Jane Herrera       10-Dec-13   10-Dec-13    IMM                            S00038531
    May Jane Herrera       10-Dec-13   10-Dec-13    IMM                            S00038532
    Joebeth Cañete         10-Dec-13   10-Dec-13    IMM                            S00038533
    Rachel Rosales         10-Dec-13   10-Dec-13    IMM                            S00038534
    Sharlah Faye Solomon   11-Dec-13   11-Dec-13    IMM                            S00038541
    Sharlah Faye Solomon   11-Dec-13   11-Dec-13    IMM                            S00038542
    Sharlah Faye Solomon   11-Dec-13   11-Dec-13    IMM                            S00038543
    Sharlah Faye Solomon   11-Dec-13   11-Dec-13    IMM                            S00038544
    May Jane Herrera       11-Dec-13   11-Dec-13    IMM                            S00038548
    May Jane Herrera       11-Dec-13   11-Dec-13    IMM                            S00038549
    Rachel Rosales         11-Dec-13   11-Dec-13    IMM                            S00038554
    Rachel Rosales         11-Dec-13   11-Dec-13    IMM                            S00038555
    Rachel Rosales         11-Dec-13   11-Dec-13    IMM                            S00038557
    Rachel Rosales         11-Dec-13   11-Dec-13    IMM                            S00038558
    Now, my boss wants to have a matrix like this.
    Job Operator      Added      Job Opened        Shipment ID
    Employee A
    Total for (n) weeks [of Employee A]
    Average Shipment Daily: 
    Average Shipment Weekly:
    Employee B
    Total for (n) weeks [of Employee B]
    Average Shipment Daily: 
    Average Shipment Weekly:
    In short, he wants to group the records by Job Operator and get the average number of shipments daily and weekly. Is it possible to do this?

    What is the definition of the 'week' you mentioned?
    It would help us find a solution if you can provide the output you want, e.g.:
    Job Operator     Total for (n) weeks   Average Shipment Daily   Average Shipment Weekly
    Joebeth Cañete   2                     1                        3.5
    But in any case, you can try to create a pivot table to analyze the data. 
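    If the same rows were loaded into a database table, the per-operator totals and daily/weekly averages would be a plain group-by. This is only a sketch: shipments(operator, added_date, shipment_id) is a hypothetical table standing in for the worksheet, and the weekly figure assumes ISO weeks via TRUNC(..., 'IW'). An Excel pivot table grouped the same way is the spreadsheet equivalent.
    -- Sketch only, against a hypothetical shipments table.
    SELECT operator,
           COUNT(*)                                           AS total_shipments,
           COUNT(*) / COUNT(DISTINCT added_date)              AS avg_shipments_per_day,
           COUNT(*) / COUNT(DISTINCT TRUNC(added_date, 'IW')) AS avg_shipments_per_week
      FROM shipments
     GROUP BY operator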

  • Sum of LineCount Including Groups and Detail Data On Each Page Used To Generate New Page If TotalPageLineCount 28

    Post Author: tadj188#
    CA Forum: Formula
    Needed: Sum of LineCount Including Groups and Detail Data On Each Page Used To Generate New Page If TotalPageLineCount > 28
    Background:
    1) The report SQL is created with unions to have detail lines continue on a page until they reach the page footer or report footer, rather than using subreports. A subreport is now essentially a group1a, group1b, etc. (containing column headers and other data within the report with their respective detail lines). I had multiple subreports and each subreport became one union.
    Created and tested, already:
    1) I have calculated @TotalLineForEachOfTheSameGroup; now I need to sum the individual group totals to get the total line count on a page.
    Issue:
    1) I need this to create a break on a certain line before it dribbles into a pre-printed area.
    Other Ideas Appreciated:
    1) Groups/detail lines break inconveniently (dribble) into the pre-printed area; I am looking for alternatives to the above situation.
    Thank you.
    Tadj

    Export all images of each page; try something like this:
    var myDoc = app.activeDocument;
    var myFolder = myDoc.filePath;
    var myImage = myDoc.allGraphics;
    for (var i = 0; myImage.length > i; i++) {
        app.select(myImage[i]);
        var myImageName = myImage[i].itemLink.name;   // name of the placed image file
        app.jpegExportPreferences.jpegQuality = JPEGOptionsQuality.high;
        app.jpegExportPreferences.exportResolution = 300;
        app.selection[0].exportFile(ExportFormat.JPG, File(myFolder + "/" + myImageName + ".JPEG"), false);
        alert(myImage[i].itemLink.name);
    }

  • Difference of sql query in SQL Workshop and SQL Report (case when sum(...

    Hi ,
    I need some help, pls.
    The following report runs in SQL Workshop, but gives errors in the SQL-query report in HTML DB:
    SQL-Workshop:
    select
    case
    when SUM(S.POTENTIAL_GROWTH)> 1 and SUM(S.POTENTIAL_GROWTH) < 4 then 'Ok'
    when SUM(S.POTENTIAL_GROWTH)> 4 and SUM(S.POTENTIAL_GROWTH) < 8 then 'SoSo'
    when SUM(S.POTENTIAL_GROWTH)> 8 and SUM(S.POTENTIAL_GROWTH) < 12 then 'Very Good'
    else 'too much'
    end
    from "SEGMENTATION" S
    In an HTML DB SQL-Query Report I added aliases (as requested for HTML DB), but it still gives the error:
    <<Query cannot be parsed, please check the syntax of your query. (ORA-00920: invalid relational operator) >>
    the code:
    select
    case
    when SUM(S.POTENTIAL_GROWTH) pot_growth > 1 and SUM(S.POTENTIAL_GROWTH)pot_growth < 4 then 'Ok'
    when SUM(S.POTENTIAL_GROWTH)pot_growth > 4 and SUM(S.POTENTIAL_GROWTH)pot_growth < 8 then 'SoSo'
    when SUM(S.POTENTIAL_GROWTH)pot_growth > 8 and SUM(S.POTENTIAL_GROWTH)pot_growth < 12 then 'Very Good'
    else 'too much'
    end
    from "SEGMENTATION" S
    Any help from the gurus would be very much appreciated.
    TIA
    Bernhard

    You still did not give an alias for the CASE WHEN column; that is the only place you should give the alias. The aliases you put inside the CASE comparisons are what cause the parse error:
    select
    case
    when SUM(S.POTENTIAL_GROWTH) > 1 and SUM(S.POTENTIAL_GROWTH) < 4 then 'Ok'
    when SUM(S.POTENTIAL_GROWTH) > 4 and SUM(S.POTENTIAL_GROWTH) < 8 then 'SoSo'
    when SUM(S.POTENTIAL_GROWTH) > 8 and SUM(S.POTENTIAL_GROWTH) < 12 then 'Very Good'
    else 'too much'
    end "GROWTH"
    from "SEGMENTATION" S

  • How to Perform Forced Manual Failover of Availability Group (SQL Server) and WSFC (Windows Server Failover Cluster)

    I have a scenario with three nodes running Windows Server 2012 Standard, each running an instance of SQL Server 2012 Enterprise, participating in a
    single Windows Server Failover Cluster (WSFC) that spans two data centers.
    If the nodes in the primary data center are unavailable due to a data center outage, how can I fail over to a node of the WSFC (Windows Server Failover Cluster) in the secondary disaster recovery data center automatically with a script?
    I want to write a script that checks the primary data center by pinging some IP every 5 or 10 minutes.
    If that IP does not respond, the script should perform a forced manual failover of the Availability Group (SQL Server) and WSFC (Windows Server Failover Cluster).
    Can you please guide me in writing a script for automatic failover in case of a primary data center outage?

    Please post your question on failover clusters in the cluster forum. They will explain how this works and point you at scripts.
    You should also look in the Gallery for cluster management scripts.
    ¯\_(ツ)_/¯

  • How to Perform Forced Manual Failover of Availability Group (SQL Server) and WSFC (Windows Server Failover Cluster) with scripting

    I have a scenario with three nodes running Windows Server 2012 Standard, each running an instance of SQL Server 2012 Enterprise, participating in a
    single Windows Server Failover Cluster (WSFC) that spans two data centers.
    If the nodes in the primary data center are unavailable due to a data center outage, how can I fail over to a node of the WSFC (Windows Server Failover Cluster) in the secondary disaster recovery data center automatically with a script?
    I want to write a script that checks the primary data center by pinging some IP every 5 or 10 minutes.
    If that IP does not respond, the script should perform a forced manual failover of the Availability Group (SQL Server) and WSFC (Windows Server Failover Cluster).
    Can you please guide me in writing a script for automatic failover in case of a primary data center outage?

    You are trying to implement manually what should be happening automatically in the cluster. If the primary SQL Server becomes unavailable in the data center, it should fail over to the secondary SQL Server automatically.  Is that not working?
    You also might want to run this configuration by some SQL experts.  I am not a SQL expert, but if you have both hosts in the data center in a cluster, there is no need for replication between those two nodes as they would be accessing
    the database from some form of shared storage.  Then it looks like you are trying to implement Always On to the DR site.  I'm not sure you can mix both types of failover in a single configuration.
    FYI, it would make more sense to establish a file share witness in your DR site instead of placing a third node in the data center for Node Majority quorum.
    . : | : . : | : . tim

  • SQL Query to get All AD Groups and its users in Active Directory

    Hi,
    Is there any query to get all AD groups and their users in an instance of SQL Server?

    Check this blog.
    http://www.mikefal.net/2011/04/18/monday-scripts-%E2%80%93-xp_logininfo/
    It will give you more than what is required. If you don't want the extra information, then you can try this: I took the query and removed the bits that you might not require.
    -- Table variable to hold the output of xp_logininfo for each Windows group login
    declare @winlogins table
    (acct_name sysname,
     acct_type varchar(10),
     act_priv varchar(10),
     login_name sysname,
     perm_path sysname)
    declare @group sysname
    -- Cursor over all Windows group logins (type 'G'), excluding built-in NT groups
    declare recscan cursor for
    select name from sys.server_principals
    where type = 'G' and name not like 'NT%'
    open recscan
    fetch next from recscan into @group
    while @@FETCH_STATUS = 0
    begin
        -- List the members of each group
        insert into @winlogins
        exec xp_logininfo @group, 'members'
        fetch next from recscan into @group
    end
    close recscan
    deallocate recscan
    -- Join the group principals back to their members
    select u.name,
           u.type_desc,
           wl.login_name,
           wl.acct_type
    from sys.server_principals u
    inner join @winlogins wl on u.name = wl.perm_path
    where u.type = 'G'
    order by u.name, wl.login_name
    Regards, Ashwin Menon My Blog - http:\\sqllearnings.com

  • EA1 - SQL Formatter issues (JOINs and GROUPs and ORDER BY oh my ;)

    Great job with improving the SQL Formatter, but it still has some bugs that need to be worked out.
    The keywords JOIN and its modifiers INNER, LEFT, RIGHT and FULL OUTER are not recognized as master keywords. As such they end up flush against the left margin. Also, when GROUP BY and/or ORDER BY keywords are present in an outermost select statement, the other keywords are not indented far enough to be right-aligned with the end of the word BY, and are indented too far to be right-aligned with the word GROUP or ORDER. In subqueries, GROUP and ORDER BY are correctly right-aligned with their respective SELECT statements.

    We're picking up and collating the Formatter issues. I'll add these.
    Specific bug for these #7013462
    Sue
