Analytical function question?

Hello all. I have an analytical-function question. I have found a couple of good examples of how to concatenate several rows into one, but I have not been able to get it to work in one specific way.
Oracle 9.2.0.6.0
Given the following data:
Table FRED:
ORDER_NUMBER_HW   ORDER_NUMBER_SVC   SERIAL_NUMBER
11                123                bb
11                123                aa
11                456                bb

I would like to see the following output:

ORDER_NUMBER_HW   SERVICE_ORDERS   SERIAL_NUMBERS
11                123,456          aa,bb

I have used the following SQL to come up with the results below, but I do not want to see the duplicate values in the concatenated strings.
This is what I get now. Any suggestions on how to fix this so I don't get the duplicate info in the concatenated strings?

ORDER_NUMBER_HW   SERVICE_ORDERS   SERIAL_NUMBERS
11                123,123,456      bb,aa,bb

SELECT     order_number_hw
          ,LTRIM(SYS_CONNECT_BY_PATH(order_number_svc, ','), ',') service_orders
          ,LTRIM(SYS_CONNECT_BY_PATH(serial_number, ','), ',') serial_numbers
      FROM (SELECT order_number_hw
                  ,order_number_svc
                  ,serial_number
                  ,ROW_NUMBER() OVER(PARTITION BY order_number_hw ORDER BY order_number_svc) rn
                  ,COUNT(*) OVER(PARTITION BY order_number_hw) cnt
              FROM fred)
     WHERE rn = cnt
START WITH rn = 1
CONNECT BY PRIOR order_number_hw = order_number_hw AND PRIOR rn = rn - 1
  ORDER BY serial_number;
Table creation:
CREATE TABLE FRED
(
  ORDER_NUMBER_HW   NUMBER                      NOT NULL,
  ORDER_NUMBER_SVC  NUMBER                      NOT NULL,
  SERIAL_NUMBER     VARCHAR2(30 BYTE)
);
SET DEFINE OFF;
Insert into FRED
   (ORDER_NUMBER_HW, ORDER_NUMBER_SVC, SERIAL_NUMBER)
Values
   (11, 123, 'bb');
Insert into FRED
   (ORDER_NUMBER_HW, ORDER_NUMBER_SVC, SERIAL_NUMBER)
Values
   (11, 123, 'aa');
Insert into FRED
   (ORDER_NUMBER_HW, ORDER_NUMBER_SVC, SERIAL_NUMBER)
Values
   (11, 456, 'bb');
COMMIT;

SQL> select order_number_hw,
  2         replace(replace(replace(LTRIM(SYS_CONNECT_BY_PATH(col1, ','), ','),',',',@'),'@,'),'@') service_orders,
  3         replace(replace(replace(LTRIM(SYS_CONNECT_BY_PATH(col2, ','), ','),',',',@'),'@,'),'@') serial_numbers
  4    from (select t.*,
  5                 lag(null, 1, order_number_svc) over(partition by order_number_hw, order_number_svc order by 1) col1,
  6                 lag(null, 1, serial_number) over(partition by order_number_hw, serial_number order by 1) col2,
  7                 row_number() over(partition by order_number_hw order by order_number_svc) rn,
  8                 COUNT(*) OVER(PARTITION BY order_number_hw) cnt
  9            from fred t)
10   where rn = cnt
11   START WITH rn = 1
12  CONNECT BY PRIOR order_number_hw = order_number_hw
13         AND PRIOR rn = rn - 1
14  /
ORDER_NUMBER_HW SERVICE_ORDERS      SERIAL_NUMBERS
             11 123,456             aa,bb
SQL>
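
For reference, the dedup trick above works in two parts. The three-argument LAG(NULL, 1, value) returns the column value only on the first row of each (order_number_hw, value) partition and NULL on the repeats, so duplicates become empty slots in SYS_CONNECT_BY_PATH. The nested REPLACE calls then collapse the runs of commas those slots leave behind. A minimal runnable sketch of just the comma cleanup (the input string is made up):

-- Step 1 tags every comma with '@', step 2 removes '@,' pairs
-- (collapsing the empty slots), step 3 removes the leftover '@'s.
SELECT REPLACE(REPLACE(REPLACE('123,,,456', ',', ',@'), '@,'), '@') AS cleaned
  FROM dual;   -- returns '123,456'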

Similar Messages

  • Analytic function question

    I have been trying to fix this since yesterday and I am close. Here is the question:
    CREATE TABLE P_X_STG
    (
      PID  NUMBER,
      EID  VARCHAR2(10 BYTE)
    );
    CREATE TABLE TAB_C
    (
      EID  VARCHAR2(10 BYTE),
      X    NUMBER
    );
    SET DEFINE OFF;
    Insert into P_X_STG
       (PID, EID)
    Values
       (1, 'e3');
    Insert into P_X_STG
       (PID, EID)
    Values
       (1, 'e1');
    Insert into P_X_STG
       (PID, EID)
    Values
       (1, 'e2');
    Insert into P_X_STG
       (PID, EID)
    Values
       (2, 'e3');
    Insert into P_X_STG
       (PID, EID)
    Values
       (2, 'e4');
    Insert into P_X_STG
       (PID, EID)
    Values
       (3, 'e5');
    COMMIT;
    SET DEFINE OFF;
    Insert into TAB_C
       (EID, X)
    Values
       ('e1', 100);
    Insert into TAB_C
       (EID, X)
    Values
       ('e3', 300);
    Insert into TAB_C
       (X)
    Values
       (400);
    COMMIT;
    We match both tables by eid:
    if the eid matches, get the corresponding x information;
    if a pid has multiple different eids, then for the matching eids get the corresponding x information, but for the non-matching ones simply put a NULL in x;
    for matching eids print "ematch", for non-matching eids print "pmatch".
    In the query below, for non-matching eids we copy the x information from the matching ones; can someone help me substitute that with NULL?
    SELECT pid,
                eid,
                x,
                ematch
           FROM (  SELECT p.pid,
                          p.eid,
                          CASE
                             WHEN p.eid = c.eid THEN c.x
                             ELSE LAG (x) OVER (ORDER BY pid)
                          END
                             x,
                          CASE WHEN p.eid = c.eid THEN 'ematch' ELSE 'pmatch' END
                             ematch
                     FROM p_x_stg p, tab_c c
                    WHERE p.eid = c.eid(+)
                 ORDER BY pid, eid)
          WHERE x IS NOT NULL
       ORDER BY pid;
    The result is:
    1     e1     100     ematch
    1     e2     100     pmatch
    1     e3     300     ematch
    2     e3     300     ematch
    2     e4     300     pmatch
    I want the result below:
    1     e1     100     ematch
    1     e2     null    pmatch
    1     e3     300     ematch
    2     e3     300     ematch
    2     e4     null    pmatch
    For (1, e2) and (2, e4), instead of copying the value, just put null.

    Hi,
    user650888 wrote:
    thanks, can this be combined into one query?
    Do you mean without a sub-query? Why? What's wrong with a sub-query?
    If using a sub-query is the clearest and most efficient way to get the results you want, why wouldn't you want to use one?
    Suppose there was a way that involved some other technique. How would we know that you didn't object to that technique as well?
    At any rate, I don't see how to do it without a sub-query. Analytic functions are computed after the WHERE clause is applied. If you want to use the results of an analytic function in a WHERE clause (as here, we want to use the results of the analytic COUNT function in a WHERE clause), then you have to compute the analytic function separately, in another (sub-) query.
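
    A minimal sketch of that sub-query approach, using the P_X_STG and TAB_C tables above (the match_cnt alias is invented here): the outer join already yields NULL in x for non-matching rows, and the analytic COUNT, computed in the inner query, lets the outer WHERE keep only pids that have at least one match.
    SELECT pid, eid, x, ematch
    FROM  (SELECT p.pid,
                  p.eid,
                  c.x,   -- NULL for non-matching rows, thanks to the outer join
                  CASE WHEN c.eid IS NULL THEN 'pmatch' ELSE 'ematch' END AS ematch,
                  COUNT(c.eid) OVER (PARTITION BY p.pid) AS match_cnt
           FROM   p_x_stg p, tab_c c
           WHERE  p.eid = c.eid(+))
    WHERE  match_cnt > 0
    ORDER BY pid, eid;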

  • Decode Function Question

    Hi,
    I am trying to run the SELECT query below, and the IF-THEN-ELSE statement is not working. I am basically trying to change the CUSTOMER.CUST_OPT_IN_RENTAL_IND column from the 'N' flag to 'Y' WHERE cust_classif_tr3_FULL_NM = 'VIP Program'
    AND CUST_OPT_IN_RENTAL_IND = 'N'
    AND CUST_EMAIL_ADDR_IND = 'Y'
    AND CUST_STAT_TR2_FULL_NM = 'Active', with that logic included in the query below. Can anyone help, please?
    SELECT
    CUSTOMER.CUSTOMER_SSKEY CUSTOMER_sskey,
    TO_CHAR(( CUSTOMER.DATE_MODIFIED ),'MM/DD/YYYY') date_modified,
    NVL(( CUSTOMER.SRC_CUST_ID ),( 'UNKNOWN' )) SRC_CUST_ID,
    NVL(( CUSTOMER.EXTNL_CUST_ID ),( 'UNKNOWN' )) EXTNL_CUST_ID,
    NVL(( TO_CHAR(( CUSTOMER.LAST_UPDATE_DT ),'MM/DD/YYYY') ),( '01/01/1900' )) LAST_UPDATE_DT,
    NVL(( CUSTOMER.SUB_CONTNT_CD ),( 'UNKNO' )) SUB_CONTNT_CD,
    NVL(( CUSTOMER.SUB_CONTNT_NM ),( 'UNKNOWN' )) SUB_CONTNT_NM,
    DECODE((CUST_EMAIL_ADDR_IND ),( 'N' ),( '-')
    , ( 'Y' ), ( 'EPITEST'|| LPAD(MOD(ROWNUM,100),3,'0')||'@INTERVALINTL.COM')
    ) email_address,
    IF CUSTOMER.CUST_OPT_IN_RENTAL_IND :='N' THEN result :='Y';
    ELSIF CUSTOMER.CUST_OPT_IN_RENTAL_IND :='N' THEN result := 'VIP Program' cust_classif_tr3_FULL_NM;
    ELSIF CUSTOMER.CUST_OPT_IN_RENTAL_IND :='N' THEN result := 'Y' CUST_EMAIL_ADDR_IND;
    ELSIF CUSTOMER.CUST_OPT_IN_RENTAL_IND :='N' THEN result := 'Active' CUST_STAT_TR2_FULL_NM;
    ELSIF
    END IF;
    NVL(( CUSTOMER.EMAIL_FORMAT_PREF ),( 'UNKNOWN' )) email_format_pref,
    NVL(( TRUNC( CUSTOMER.EMAIL_UNDELIVERABLE_COUNT ) ),( -1 )) email_undeliverable_count,
    NVL(( CUSTOMER.EMAIL_HTML_CAPABILITY ),( 'UNKNOWN' )) email_html_capability,
    NVL(( CUSTOMER.EMAIL_UNSUBSCRIBE ),( 'UNKNOWN' )) email_unsubscribe,
    NVL(( CUSTOMER.CUST_ID ),( 0 )) CUST_ID,
    NVL(( CUSTOMER.ONLINE_PROFILE_FIRST_NAME ),( 'UNKNOWN' )) ONLINE_PROFILE_FIRST_NAME,
    NVL(( CUSTOMER.ONLINE_PROFILE_LAST_NAME ),( 'UNKNOWN' )) ONLINE_PROFILE_LAST_NAME,
    NVL(( CUSTOMER.GOLD_BILL_TYPE_TR1_CD ),( 'UNKNO' )) GOLD_BILL_TYPE_TR1_CD,
    NVL(( CUSTOMER.GOLD_BILL_TYPE_TR1_FULL_NM ),( 'UNKNOWN' )) GOLD_BILL_TYPE_TR1_FULL_NM,
    NVL(( CUSTOMER.GOLD_BILL_TYPE_TR2_CD ),( 'UNKNO' )) GOLD_BILL_TYPE_TR2_CD
    FROM
         CUSTOMERSTAGE CUSTOMER
    Thanks,
    Soph

    Hi, Soph,
    Welcome to the forum!
    Whenever you have a question, post a little sample data (CREATE TABLE and INSERT statements) and the results you want from that data.
    In the case of a DML problem (such as UPDATE), the sample data should show what the tables are like before the change, and the results will be the contents of the changed table after it.
    Always say what version of Oracle you're using.
    Without that, people will still try to help you as much as they can, but that won't be much.
    IF-THEN-ELSE is PL/SQL. You can't use PL/SQL statements in the middle of a SQL statement, such as SELECT, even if the SQL statement is being done within PL/SQL.
    Instead of IF-THEN-ELSE, you can use CASE in SQL or, in some special situations, DECODE.
    Here's an example of using CASE:
    Say you have a column in your table called cust_opt_in_rental, and it could have the values 'N', 'N1', 'N2' or 'N3'.
    You want to produce a column in your result set called result, which will be derived from cust_opt_in_rental.
    If cust_opt_in_rental is 'N', then result should be 'Y'.
    If cust_opt_in_rental is 'N1', then result should be 'VIP Program' followed by the column cust_classif_tr3_FULL_NM.
    If cust_opt_in_rental is 'N2', then result should be 'Y' followed by the column CUST_EMAIL_ADDR_IND.
    If cust_opt_in_rental is 'N3', then result should be 'Active' followed by the column CUST_STAT_TR2_FULL_NM.
    You can do that using CASE like this:
    SELECT  CASE     customer.cust_opt_in_rental_ind
              WHEN  'N'   THEN  'Y'
              WHEN  'N1'  THEN  'VIP Program '  || cust_classif_tr3_full_nm
              WHEN  'N2'  THEN  'Y '            || cust_email_addr_ind
              WHEN  'N3'  THEN  'Active '       || cust_stat_tr2_full_nm
            END             AS result
    FROM     ...
    This is one situation where DECODE would work as well.
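
    For completeness, a runnable sketch of the DECODE version (the inline view and its literal values are invented just so the statement runs on its own):
    SELECT  DECODE (cust_opt_in_rental_ind,
                    'N',  'Y',
                    'N1', 'VIP Program ' || cust_classif_tr3_full_nm,
                    'N2', 'Y '           || cust_email_addr_ind,
                    'N3', 'Active '      || cust_stat_tr2_full_nm
                   ) AS result
    FROM   (SELECT 'N1'     AS cust_opt_in_rental_ind,
                   'Gold'   AS cust_classif_tr3_full_nm,
                   'Y'      AS cust_email_addr_ind,
                   'Active' AS cust_stat_tr2_full_nm
            FROM   dual);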

  • Analytic sql question

    Dear All,
    I have the following problem of deriving the output table from the input table below.
    Do you have any idea how to do this with the help of analytic SQL?
    P.S. I have done this using a pure PL/SQL block, which is too slow with a high volume of data. The data below is just a sample; in reality I have millions of rows.
    Input table:
    TIME     USER     VALUE     
    1     A     X
    2     A     X
    3     B     Y
    4     B     Y
    5     A     X
    5     B     X
    6     A     Y
    7     B     Y
    7     A     Y
    Output table:
    START_TIME     END_TIME     USER     VALUE
    1          2          A     X
    5          5          A     X
    6          7          A     Y
    3          4          B     Y
    5          5          B     X
    7          7          B     Y

    I feel sure I've over-complicated things here, and that there's an easier way of doing it, but here's one solution:
    with my_tab as (select 1 col1, 'A' col2, 'X' col3 from dual union all
                    select 2 col1, 'A' col2, 'X' col3 from dual union all
                    select 3 col1, 'B' col2, 'Y' col3 from dual union all
                    select 4 col1, 'B' col2, 'Y' col3 from dual union all
                    select 5 col1, 'A' col2, 'X' col3 from dual union all
                    select 5 col1, 'B' col2, 'X' col3 from dual union all
                    select 6 col1, 'A' col2, 'Y' col3 from dual union all
                    select 7 col1, 'B' col2, 'Y' col3 from dual union all
                    select 7 col1, 'A' col2, 'Y' col3 from dual)
    select distinct start_col1,
                    end_col1,
                    first_value(col2) over (partition by col2, start_col1, end_col1 order by col1, start_col1, end_col1) col2,
                    first_value(col3) over (partition by col2, start_col1, end_col1 order by col1, start_col1, end_col1) col3
    from   (select col1,
                   col2,
                   col3,
                   last_value(start_col1 ignore nulls) over (order by col1, col2) start_col1,
                   last_value(end_col1 ignore nulls) over (order by col1 desc, col2 desc) end_col1
            from   (select col1,
                           col2,
                           col3,
                           case when lag(col2, 1, '{NULL}') over (order by col1, col2) <> col2
                                     then col1
                           end start_col1,
                           case when lead(col2, 1, '{NULL}') over (order by col1, col2) <> col2
                                     then col1
                           end end_col1
                    from   my_tab))
    order by col2,
             start_col1,
             end_col1;
    START_COL1   END_COL1 C C
             1          2 A X
             5          5 A X
             6          7 A Y
             3          4 B Y
             5          5 B X
             7          7 B Y

  • SQL - Analytical Query Question

    Hi All,
    I have a requirement for which I am trying to generate the output, and I am not able to come up with good logic to solve it. I have been trying to solve this for some time now and am not able to figure out how.
    I posted a similar question some time back, but this one is different from the original and a little more complex. I have listed below the script to create the tables and insert data.
    DROP TABLE ITEMTABLE;
    CREATE TABLE ITEMTABLE
    (
      ITEMTABLEID1           NUMBER(9) NOT NULL,
      ITEMTABLEID2           NUMBER(9) NOT NULL,
      PARENTTABLEID          NUMBER(9),
      PARENTINFO             VARCHAR2(20),
      CONSTRAINT ITEMTABLE_PK PRIMARY KEY (ITEMTABLEID1, ITEMTABLEID2)
    );
    Insert into ITEMTABLE values (19217,10245,19216,'PARENTINFO-1');
    Insert into ITEMTABLE values (19217,10315,19216,'PARENTINFO-2' );
    Insert into ITEMTABLE values (19217,10336,19216,'PARENTINFO-2' );
    DROP TABLE FINANCE;
    CREATE TABLE FINANCE
    (
      FINANCEKEY          NUMBER(9) NOT NULL,
      PARENTID1           NUMBER(9) NOT NULL,
      PARENTID2           NUMBER(9) NOT NULL,
      CONSTRAINT FINANCE_PK PRIMARY KEY (FINANCEKEY)
    );
    Insert into FINANCE values (8332, 19217,10245);
    Insert into FINANCE values (8404, 19217, 10315);
    Insert into FINANCE values (8425, 19217, 10336);
    DROP TABLE ACCT;
    CREATE TABLE ACCT
    (
      ACCTKEY             NUMBER(9) NOT NULL,
      FINANCEKEY          NUMBER(9),
      FLAG                VARCHAR2(1),
      SOURCEKEY           NUMBER(9),
      CONSTRAINT ACCT_PK PRIMARY KEY (ACCTKEY)
    );
    Insert into ACCT values (9874, 8332, 'N',0);
    Insert into ACCT values (9875, 8332, 'N',0 );
    Insert into ACCT values (9982, 8404, 'Y', 9874);
    Insert into ACCT values (9983, 8404, 'Y', 9875);
    Insert into ACCT values (10008, 8425, 'N', 9982);
    Insert into ACCT values (10009, 8425, 'Y', 9983);
    SQL> With tempacct1 as
      2    (Select  I.ITEMTABLEID1,I.ITEMTABLEID2, AC.SOURCEKEY, NVL(AC.FLAG,'N') AS FLAG, AC.ACCTKEY
      3     FROM ITEMTABLE I,FINANCE F,ACCT AC
      4    where I.ITEMTABLEID1 = F.PARENTID1
      5      and I.ITEMTABLEID2 =  F.PARENTID2
      6    and F.FINANCEKEY = AC.FINANCEKEY
      7        and I.PARENTTABLEID = 19216
      8         ORDER BY  acctkey ASC
      9        )
    10     SELECT  ITEMTABLEID1,ITEMTABLEID2,acctkey, flag ,SOURCEKEY
    11     FROM    tempacct1;
    ITEMTABLEID1 ITEMTABLEID2    ACCTKEY F  SOURCEKEY
           19217        10245       9874 N          0
           19217        10245       9875 N          0
           19217        10315       9982 Y       9874
           19217        10315       9983 Y       9875
           19217        10336      10008 N       9982
           19217        10336      10009 Y       9983
    6 rows selected.
    Desired Output -
    ITEMTABLEID1 ITEMTABLEID2    ACCTKEY F  SOURCEKEY
           19217        10336      10008 N       9982
           19217        10336      10009 Y       9983
    The solution by Frank for my previous post a few weeks back looks like this:
    SQL>    SELECT  sourcekey
      2  , flag
      3  , acctkey
      4  FROM (
      5       SELECT  ac.sourcekey
      6       ,     NVL (ac.flag, 'N') AS flag
      7       ,     ac.acctkey
      8       ,     RANK () OVER ( PARTITION BY  CASE
      9                         WHEN  sourcekey = 0
    10             THEN  acctkey
    11             ELSE  sourcekey
    12                     END
    13         ORDER BY      CASE
    14                              WHEN  ac.flag = 'Y'
    15                    THEN  1
    16             ELSE  2
    17                   END
    18         ,   SIGN (sourcekey)
    19                    ) AS rnk
    20          FROM    itemtable i
    21       ,     finance f
    22       ,     acct ac
    23         WHERE   i.itemtableid1  = f.parentid1
    24         AND     i.itemtableid2  = f.parentid2
    25       AND     f.financekey  = ac.financekey
    26         AND     i.parenttableid  = 19216
    27   )
    28  WHERE rnk = 1;
    SOURCEKEY F    ACCTKEY
          9874 Y       9982  -- Needs to be removed
          9875 Y       9983  -- Needs to be removed
          9982 N      10008  
          9983 Y      10009
    Output Desired would be
    ITEMTABLEID1 ITEMTABLEID2    ACCTKEY F  SOURCEKEY
           19217        10336      10008 N       9982
           19217        10336      10009 Y       9983
    The slight change to the requirement is: when we have a sourcekey that is the same as an acctkey, only display the row which has the max acctkey. So in this case, the last two rows have sourcekeys of 9982 and 9983, which equal the acctkeys of the first two rows. So we look for MAX(acctkey), which would be 10008 and 10009, and display only those.
    This logic needs to be added on top of the existing logic. So I am not sure how it could be done.
    I would really appreciate any help.
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Release 10.2.0.4.0 - 64bit Production
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production

    Hi,
    This gets the right results from your sample data.
    SELECT  ac.sourcekey
    ,       NVL (ac.flag, 'N') AS flag
    ,       ac.acctkey
    FROM    itemtable  i
    ,       finance    f
    ,       acct       ac
    WHERE   i.itemtableid1   = f.parentid1
    AND     i.itemtableid2   = f.parentid2
    AND     f.financekey     = ac.financekey
    AND     i.parenttableid  = 19216
    AND     ac.acctkey  NOT IN ( SELECT  sourcekey
                                 FROM    acct
                                 WHERE   sourcekey IS NOT NULL     -- If needed
                               );
    I'm a little uncertain of your requirements, so I'm not sure how it will work on your real data.
    At least in this new version of the problem, it looks like rows can be chained together, where the sourcekey of one row is the acctkey of the next row. If you want only the first row in each such chain, just look for the ones where the acctkey does not relate back to any sourcekey.
    NOT IN is never TRUE if the subquery returns any NULLs. Unless sourcekey has a NOT NULL constraint, you'd better check for it in the NOT IN sub-query.
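
    A quick runnable illustration of that NOT IN trap:
    -- NOT IN is never TRUE when the list (or subquery) contains a NULL:
    SELECT 'found' AS result FROM dual WHERE 1 NOT IN (2, NULL);
    -- no rows selected, because 1 <> NULL evaluates to UNKNOWN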

  • Analytical function question

    Hi,
    I'm looking for the best way to count the number of orders since a particular order that had a particular status. The ordernumbers are sequentially numbered, the key field is a sequence value and there is an orderdate field.
    I've tried the usual "x = select max xxx" and that works well for a small number of records, but not for updating 51M.
    Suggestions would be appreciated.
    Victor

    I solved this problem with an inner select with two FIRST_VALUE calls and an outer select with a >= comparison between the two.
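
    That reply is terse, so here is a hedged sketch of the general shape (the ORDERS table, its columns, and the 'HOLD' status are all invented; MAX ... OVER () stands in for the poster's FIRST_VALUE): carry the key of the most recent order with the target status alongside every row, then count the rows at or after it.
    SELECT COUNT(*) AS orders_since
    FROM  (SELECT o.order_seq,
                  MAX(CASE WHEN o.status = 'HOLD' THEN o.order_seq END)
                      OVER () AS last_hold_seq
           FROM   orders o)
    WHERE  order_seq >= last_hold_seq;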

  • Understanding row_number() and using it in an analytic function

    Dear all;
    I have been playing around with ROW_NUMBER and trying to understand how to use it, and yet I still can't figure it out.
    I have the following code below
    create table Employee(
        ID                 VARCHAR2(4 BYTE)         NOT NULL,
        First_Name         VARCHAR2(10 BYTE),
        Last_Name          VARCHAR2(10 BYTE),
        Start_Date         DATE,
        End_Date           DATE,
        Salary             NUMBER(8,2),
        City               VARCHAR2(10 BYTE),
        Description        VARCHAR2(15 BYTE)
    );
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary,  City,       Description)
                 values ('01','Jason',    'Martin',  to_date('19960725','YYYYMMDD'), to_date('20060725','YYYYMMDD'), 1234.56, 'Toronto',  'Programmer');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary,  City,       Description)
                 values ('02','Alison',   'Mathews', to_date('19760321','YYYYMMDD'), to_date('19860221','YYYYMMDD'), 6661.78, 'Vancouver','Tester');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary,  City,       Description)
                 values ('03','James',    'Smith',   to_date('19781212','YYYYMMDD'), to_date('19900315','YYYYMMDD'), 6544.78, 'Vancouver','Tester');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary,  City,       Description)
                 values ('04','Celia',    'Rice',    to_date('19821024','YYYYMMDD'), to_date('19990421','YYYYMMDD'), 2344.78, 'Vancouver','Manager');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary,  City,       Description)
                 values ('05','Robert',   'Black',   to_date('19840115','YYYYMMDD'), to_date('19980808','YYYYMMDD'), 2334.78, 'Vancouver','Tester');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary, City,        Description)
                 values ('06','Linda',    'Green',   to_date('19870730','YYYYMMDD'), to_date('19960104','YYYYMMDD'), 4322.78,'New York',  'Tester');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary, City,        Description)
                 values ('07','David',    'Larry',   to_date('19901231','YYYYMMDD'), to_date('19980212','YYYYMMDD'), 7897.78,'New York',  'Manager');
    insert into Employee(ID,  First_Name, Last_Name, Start_Date,                     End_Date,                       Salary, City,        Description)
                 values ('08','James',    'Cat',     to_date('19960917','YYYYMMDD'), to_date('20020415','YYYYMMDD'), 1232.78,'Vancouver', 'Tester');
    I did a simple select statement:
    select * from Employee e;
    and it returns this below:
    ID   FIRST_NAME LAST_NAME  START_DAT END_DATE      SALARY CITY       DESCRIPTION
    01   Jason      Martin     25-JUL-96 25-JUL-06    1234.56 Toronto    Programmer
    02   Alison     Mathews    21-MAR-76 21-FEB-86    6661.78 Vancouver  Tester
    03   James      Smith      12-DEC-78 15-MAR-90    6544.78 Vancouver  Tester
    04   Celia      Rice       24-OCT-82 21-APR-99    2344.78 Vancouver  Manager
    05   Robert     Black      15-JAN-84 08-AUG-98    2334.78 Vancouver  Tester
    06   Linda      Green      30-JUL-87 04-JAN-96    4322.78 New York   Tester
    07   David      Larry      31-DEC-90 12-FEB-98    7897.78 New York   Manager
    08   James      Cat        17-SEP-96 15-APR-02    1232.78 Vancouver  Tester
    I wrote another select statement with ROW_NUMBER; see below:
    SELECT first_name, last_name, salary, city, description, id,
       ROW_NUMBER() OVER(PARTITION BY description ORDER BY city desc) "Test#"
    FROM employee;
    and I get this result:
    First_name  Last_name  Salary   City       Description  ID  Test#
    Celia       Rice       2344.78  Vancouver  Manager      04  1
    David       Larry      7897.78  New York   Manager      07  2
    Jason       Martin     1234.56  Toronto    Programmer   01  1
    Alison      Mathews    6661.78  Vancouver  Tester       02  1
    James       Cat        1232.78  Vancouver  Tester       08  2
    Robert      Black      2334.78  Vancouver  Tester       05  3
    James       Smith      6544.78  Vancouver  Tester       03  4
    Linda       Green      4322.78  New York   Tester       06  5
    I understand the PARTITION BY: for each group, a fresh numbering is assigned, so since Tester is one group, Manager is another group, and Programmer is another, each group gets its own sequence of row numbers. What is throwing me off is the ORDER BY and how these numbers are assigned. Why is 1 assigned to Alison Mathews in the Tester group, 2 assigned to James Cat, and 3 assigned to Robert Black?
    I apologize if this is a stupid question; I have tried reading about it online and looking at the Oracle documentation, but I still don't fully understand why.

    user13328581 wrote:
    understanding row_number() and using it in an analytic function
    ROW_NUMBER() IS an analytic function. Are you trying to use the results of ROW_NUMBER in another analytic function? If so, you need a sub-query. Analytic functions can't be nested within other analytic functions.
    ...I have the following code below
    ... I did a simple select statement
    Thanks for posting all that! It's really helpful.
    ... and I get this result
    First_name  Last_name  Salary   City       Description  ID  Test#
    Celia       Rice       2344.78  Vancouver  Manager      04  1
    David       Larry      7897.78  New York   Manager      07  2
    Jason       Martin     1234.56  Toronto    Programmer   01  1
    Alison      Mathews    6661.78  Vancouver  Tester       02  1
    James       Cat        1232.78  Vancouver  Tester       08  2
    Robert      Black      2334.78  Vancouver  Tester       05  3
    James       Smith      6544.78  Vancouver  Tester       03  4
    Linda       Green      4322.78  New York   Tester       06  5
    ... What is throwing me off is the order by and how these numbers are assigned. Why is 1 assigned to Alison Mathews for the tester group, 2 assigned to James Cat, and 3 assigned to Robert Black?
    That's determined by the analytic ORDER BY clause. You said "ORDER BY city desc", so a row where city='Vancouver' will get a lower number than one where city='New York', since 'Vancouver' comes after 'New York' in alphabetic order.
    If you have several rows that all have the same city, then you can be sure that ROW_NUMBER will assign them consecutive numbers, but it's arbitrary which one of them will be lowest and which highest. For example, you have 5 'Tester's: 4 from Vancouver and 1 from New York. There's no particular reason why the one with first_name='Alison' got assigned #1 and 'James' got #2. If you run the same query again, without changing the table at all, then 'Robert' might be #1. It's certain that the 4 Vancouver rows will be assigned numbers 1 through 4, but there's no way of telling which of those 4 rows will get which of those 4 numbers.
    Similar to a query's ORDER BY clause, the analytic ORDER BY clause can have two or more expressions. The N-th one will only be considered if there was a tie on all (N-1) earlier ones. For example, "ORDER BY city DESC, last_name, first_name" would mean 'Vancouver' comes before 'New York', but, if multiple rows all have city='Vancouver', last_name would determine the order: 'Black' would get a lower number than 'Cat'. If you had multiple rows with city='Vancouver' and last_name='Black', then the order would be determined by first_name.
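
    For example, re-running the query with those tie-breakers makes the Tester numbering deterministic:
    SELECT first_name, last_name, salary, city, description, id,
           ROW_NUMBER() OVER (PARTITION BY description
                              ORDER BY city DESC, last_name, first_name) "Test#"
    FROM   employee;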

  • Question about GROUP BY and HAVING

    Good afternoon,
    I have the following query which returns the desired result (set of students who take CS112 or CS114 but not both). I wanted to "condense" it into a single SELECT statement (if that is at all possible - DDL to execute the statement is provided at the end of this post):
    -- is this select distinct * and its associated where clause absolutely
    -- necessary to obtain the result ?
    select distinct *
      from (
            select s.sno,
                   s.sname,
                   s.age,
                   sum(case when t.cno in ('CS112', 'CS114')
                            then 1
                            else 0
                       end)
                     over (partition by s.sno) as takes_either_or_both
              from student s join take t
                               on (s.sno = t.sno)
           )
     where takes_either_or_both = 1
    ;
    The following looked reasonable, but was unfortunately unsuccessful:
      -- Window functions not allowed here (in the HAVING clause):
    select max(s.sno),
           max(s.sname),
           max(s.age),
           sum(case when t.cno in ('CS112', 'CS114')
                    then 1
                    else 0
               end)
             over (partition by s.sno) as takes_either_or_both
      from student s join take t
                       on (s.sno = t.sno)
    group by s.sno
    having sum(case when t.cno in ('CS112', 'CS114')
                    then 1
                    else 0
               end)
             over (partition by s.sno) = 1
    -- Invalid identifier in the HAVING clause:
    select s.sno,
           s.sname,
           s.age,
           sum(case when t.cno in ('CS112', 'CS114')
                    then 1
                    else 0
               end)
             over (partition by s.sno) as takes_either_or_both
      from student s join take t
                       on (s.sno = t.sno)
    group by s.sno, s.sname, s.age
    having takes_either_or_both = 1
    ;
    I have searched for a document that completely defines the sequence in which the clauses of a statement are executed. I have found tidbits here and there, but nothing complete. I realize that my running into problems like this one is due to my lack of understanding of the sequence and scope of the clauses that make up a statement. Because of this, I cannot even tell whether the above query can be written as a single SELECT statement. Pardon my bit of frustration...
    Thank you for your help,
    John.
    DDL follows.
        /* drop any preexisting tables */
        drop table student;
        drop table courses;
        drop table take;
        /* table of students */
        create table student
        ( sno integer,
          sname varchar(10),
          age integer
        );
        /* table of courses */
        create table courses
        ( cno varchar(5),
          title varchar(10),
          credits integer
        );
        /* table of students and the courses they take */
        create table take
        ( sno integer,
          cno varchar(5)
        );
            insert into student values (1,'AARON',20);
            insert into student values (2,'CHUCK',21);
            insert into student values (3,'DOUG',20);
            insert into student values (4,'MAGGIE',19);
            insert into student values (5,'STEVE',22);
            insert into student values (6,'JING',18);
            insert into student values (7,'BRIAN',21);
            insert into student values (8,'KAY',20);
            insert into student values (9,'GILLIAN',20);
            insert into student values (10,'CHAD',21);
            insert into courses values ('CS112','PHYSICS',4);
            insert into courses values ('CS113','CALCULUS',4);
            insert into courses values ('CS114','HISTORY',4);
            insert into take values (1,'CS112');
            insert into take values (1,'CS113');
            insert into take values (1,'CS114');
            insert into take values (2,'CS112');
            insert into take values (3,'CS112');
            insert into take values (3,'CS114');
            insert into take values (4,'CS112');
            insert into take values (4,'CS113');
            insert into take values (5,'CS113');
            insert into take values (6,'CS113');
            insert into take values (6,'CS114');

    Hi, John,
    Just use the aggregate SUM function.
            select s.sno,
                   s.sname,
                   s.age,
                   sum(case when t.cno in ('CS112', 'CS114')
                            then 1
                            else 0
                       end) as takes_either_or_both
              from student s join take t
                               on (s.sno = t.sno)
           GROUP BY  s.sno,
                     s.sname,
                     s.age
           HAVING  sum(case when t.cno in ('CS112', 'CS114')
                            then 1
                            else 0
                       end)  = 1;
    Analytic functions are computed after the WHERE and HAVING clauses have been applied. To use the results of an analytic function in a WHERE or HAVING clause, you have to compute it in a sub-query, and then you can use it in the WHERE or HAVING clause of a super-query.

  • Understanding sum() over(order by) analytic function

    Could you please explain Having_order_by column values computation for below query?
    I understand that No_Partition column has been computed over entire result set
    select level
    ,sum(level) over(order by level) Having_order_by
    ,sum(level) over() No_Partition
    from dual
    connect by level < 6

    Hi,
    ActiveSomeTimes wrote:
    Could you please explain Having_order_by column values computation for below query?
    I understand that No_Partition column has been computed over entire result set
    select level
    ,sum(level) over(order by level) Having_order_by
    ,sum(level) over() No_Partition
    from dual
    connect by level < 6
    When you have an ORDER BY clause, the function only operates on a window, that is, a subset of the result set, relative to the current row.
    When you say "ORDER BY LEVEL", it will only operate on LEVELs less than or equal to the current LEVEL, so on
    LEVEL = 1, the analytic function will only look at LEVEL <= 1, that is, just 1; on
    LEVEL = 2, the analytic function will only look at LEVEL <= 2, that is, 1 and 2; on
    LEVEL = 3, the analytic function will only look at LEVEL <= 3, that is, 1, 2 and 3; and so on, up to
    LEVEL = 5, where the analytic function will only look at LEVEL <= 5, that is, 1, 2, 3, 4 and 5.
    In the function call without the ORDER BY clause, the function looks at the entire result set, regardless of what value LEVEL has on the current row.
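
    Concretely, the query returns the following (running total in Having_order_by, grand total in No_Partition):
    LEVEL  HAVING_ORDER_BY  NO_PARTITION
        1                1            15
        2                3            15
        3                6            15
        4               10            15
        5               15            15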

  • Hierarchy - having longest paths for a value

    Hi there, I have a query which needs to pull hierarchical data from our chart of accounts (segment3). This parent-child relationship involves many branches and leaves. I would like to be able to pull out the longest path for each of them, and not all the values. Here's the case.
    Value 1121 is the lowest level, and its hierarchy looks like this:
    1121
    ├── 4CQU
    │   └── 3CQU
    │       ├── 2DES
    │       │   └── 1VTI
    │       ├── LQLO
    │       └── LT02
    │           └── LT01
    └── TOUS
    or
    1121 - 4CQU - 3CQU - 2DES - 1VTI
    1121 - 4CQU - 3CQU - LQLO
    1121 - 4CQU - 3CQU - LT02 - LT01
    1121 - TOUS
    With the following query
    SELECT SYS_CONNECT_BY_PATH(parent_flex_value, '.')||'.'||f1.flex_value path,level
    FROM fnd_flex_value_children_v f1
    WHERE f1.flex_value_set_id = 1005215
    AND f1.flex_value = '1121'
    CONNECT BY PRIOR flex_value = parent_flex_value;
    I get the following:
    PATH                           LEVEL
    .1VTI.2DES.3CQU.4CQU.1121          4
    .2DES.3CQU.4CQU.1121               3
    .3CQU.4CQU.1121                    2
    .4CQU.1121                         1
    .LQLO.3CQU.4CQU.1121               3
    .LT01.LT02.3CQU.4CQU.1121          4
    .LT02.3CQU.4CQU.1121               3
    .TOUS.1121                         1
    I would like to pull out only the following:
    .1VTI.2DES.3CQU.4CQU.1121          4
    .LQLO.3CQU.4CQU.1121               3
    .LT01.LT02.3CQU.4CQU.1121          4
    .TOUS.1121                         1
    Does anyone know how this can be done? I'm using a 10g database along with 11.5.10.2 apps.
    Thanks a lot.

    Hi,
    Apparently, you're doing something like this now:
    SELECT  SYS_CONNECT_BY_PATH (id, ' - ')     AS path
    FROM     ...
    ;
    but you only want to get one row of results; in this case
    1121 - 4CQU - 3CQU - 2CQU - 2DES - 1VTI
    because it's the longest (6 levels).
    Is that right?
    Do something like this instead:
    WITH    connect_by_results  AS
    (
         SELECT  SYS_CONNECT_BY_PATH (id, ' - ')     AS path
         ,       LEVEL                               AS lvl
         FROM    ...     -- The rest of your original CONNECT BY query goes here
    )
    ,       got_rnk             AS
    (
         SELECT  path
         ,       RANK () OVER (ORDER BY  lvl  DESC)  AS rnk
         FROM    connect_by_results
    )
    SELECT  path
    FROM    got_rnk
    WHERE   rnk  = 1
    ;
    If there happens to be a tie (two or more rows with the same longest-length path), the query above will display all of the rows with the longest path.
    If that's not what you want, add tie-breakers to the end of the analytic ORDER BY clause, or use ROW_NUMBER instead of RANK.
    Analytic functions usually interfere with CONNECT BY when they're in the same query. That's why I used two sub-queries:
    (1) connect_by_results has all the CONNECT BY stuff, but no analytics
    (2) got_rnk has all the analytics, but no CONNECT BY stuff.
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements) and the results you want from that data.
    This answers the original question, or at least what I remember as being on this site when I started writing. It looks like the question has changed.
    Definitely post CREATE TABLE and INSERT statements for the sample data, and explain how you get those results.

  • Writing single query with conflicting WHERE statements

    How do I run the following as a single query that produces one column pertaining to the first block of code and another column pertaining to the second block? The issue is that I need two different WHERE clauses. So the final output will be patient_id, complete, incomplete.
    SELECT patient_id, COUNT(*) AS complete
    FROM STATUS
    WHERE status = 1
    GROUP BY patient_id
    ORDER BY patient_id;
    SELECT patient_id, COUNT(*) AS incomplete
    FROM STATUS
    WHERE status = 2
    GROUP BY patient_id
    ORDER BY patient_id;
    Thanks!
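
    A standard way to fold the two WHERE clauses into one query is conditional aggregation; the reply below builds on the same COUNT(CASE ...) device:
    SELECT   patient_id,
             COUNT(CASE WHEN status = 1 THEN 1 END) AS complete,
             COUNT(CASE WHEN status = 2 THEN 1 END) AS incomplete
    FROM     status
    GROUP BY patient_id
    ORDER BY patient_id;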

    Hi,
    apex wrote:
    Thanks for all of the help.
    I would like to add another column with decile and am struggling as to how to do it. Since I can't reference something in a calculation in the same step, I think I will need a 3rd nesting, but what I haven't figured out is how to get the number of subjects whose ratio is less than that subject's ratio.
    Right: you can't assign an alias (such as complete or ratio) to a calculated column and use that alias in the same sub-query.
    If the calculation isn't very complicated, then you might find it simpler just to repeat the calculation. For example, I think this is what you want:
    WITH  got_complete  AS
    (
         SELECT    patient_id
         ,         COUNT ( CASE WHEN status = 1 THEN 1 END )     AS complete
         ,         COUNT ( CASE WHEN status = 2 THEN 1 END )     AS incomplete
         FROM      pt_status
         WHERE     status IN (1, 2)
         GROUP BY  patient_id
    )
    SELECT    patient_id, complete, incomplete
    ,         complete / (complete + incomplete)     AS ratio
    ,         RANK () OVER ( ORDER BY  complete / (complete + incomplete) ) - 1
                                                     AS decile
    FROM      got_complete
    ORDER BY  complete
    ,         incomplete     DESC
    ;
    RANK numbers rows 1, 2, 3, ... If I understand your requirements, you want the numbering to start with 0 (meaning "there are 0 other patients with a lower ratio"), so that's why I subtracted 1.
    Depending how you want to handle ties, you may need to add some tie-breaker expressions to the analytic ORDER BY clause, and/or use ROW_NUMBER instead of RANK.
    Here, I used the calculation "complete / (complete + incomplete)" in the ratio column, then repeated it in the decile column.
    If you want, you can add another sub-query, called got_ratio, which would add the ratio column, but do nothing about the decile column. Then, in the main query, you could use RANK as shown above (or, as you suggested, a scalar sub-query referencing got_ratio) to get the number of other patient_ids with lower ratios.
    On the other hand, you could do this whole job without any sub-queries, using AVG as I did earlier, and then repeating that same AVG expression in the ORDER BY clause for RANK. Aggregate functions are computed before analytic functions, so the analytic RANK can reference the aggregate AVG.
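    For example, that single-query version might look like this (a sketch, reusing the pt_status table from the sub-query above; the AVG expression equals complete / (complete + incomplete) because it averages 1s and 0s over the same rows):
    SELECT    patient_id
    ,         COUNT (CASE WHEN status = 1 THEN 1 END)         AS complete
    ,         COUNT (CASE WHEN status = 2 THEN 1 END)         AS incomplete
    ,         AVG (CASE WHEN status = 1 THEN 1 ELSE 0 END)    AS ratio
    ,         RANK () OVER (ORDER BY AVG (CASE WHEN status = 1 THEN 1 ELSE 0 END)) - 1
                                                              AS decile
    FROM      pt_status
    WHERE     status IN (1, 2)
    GROUP BY  patient_id
    ;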
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only), and also post the results you want from that data. Include examples of decile ties (2 or more patients with the same ratio).
    Explain, using specific examples, how you get those results from that data.

  • Bucket query help

    create table rangespendbucket(rangespend varchar2(40), id number);
    insert into rangespendbucket values('100-200',1);
    insert into rangespendbucket values('200-500',2);
    insert into rangespendbucket values('500-1000',3);
    insert into rangespendbucket values('1000-',4);
    commit;
    create table spend(supplier varchar2(40), cy_spend number);
    insert into spend values('A',100);
    insert into spend values('B',25);
    insert into spend values('C',30);
    insert into spend values('D',1000);
    insert into spend values('E',10);
    insert into spend values('A',200);
    insert into spend values('F',0);
    insert into spend values('E',20);
    insert into spend values('C',540);
    insert into spend values('B',300);
    insert into spend values('A',300);
    insert into spend values('C',10);
    insert into spend values('B',0);
    insert into spend values('E',0);
    insert into spend values('G',90);
    insert into spend values('H',0);
    insert into spend values('A',0);
    insert into spend values('P',7000);
    commit;
    I am new in this forum. Someone in my company gave me the following query/task.
    I want to find out all of these in a single query (1-8), except 1.1 (separate query).
    We are using Oracle 10g Release 2.
    1) no of customers/suppliers in the spend bucket.
    1.1) If anybody clicks on that particular bucket, it will show the no of suppliers.
    2) total no of suppliers for all buckets (sum)
    3) % of suppliers for each bucket (each bucket supp cnt * 100 / total supp cnt)
    3) total spend for each bucket
    4) total spend for all buckets combined
    5) % of spend for each bucket against the total (each bucket supp spend * 100 / total supp spend)
    6) how many no of suppliers make 80% of total spend (with respect to all buckets)
    7) how many no of suppliers make 20% of total spend (with respect to all buckets)
    8) top 3 suppliers make how much % of spend (with respect to all buckets)
    I am eagerly requesting all of you: please help me make this query.
    This query is required for building a dashboard.
    Column names should be like this: totalsupplierscnt__all_bucket, 'cnt suppliers 100-200', '%cnt suppliers 100-200', 'cnt supplier 200-500', '%cnt supplier 200-500',
    'cnt supplier 500-1000', '%cnt supplier 500-1000', 'cnt suppliers 1000-', '%cnt suppliers 1000-',
    totalsuppliersspend_all_bucket, 'spend for 100-200', '%spend for 100-200', 'spend for 200-500', '%spend for 200-500',
    'spend for 500-1000', '%spend for 500-1000', 'spend for 1000-', '%spend for 1000-',
    'no of suppliers 80% of total spend' (calculation: suppose total spend is 100; with spend sorted descending, 80% of total spend may cover 1-2 suppliers),
    'no of suppliers 20% of total spend' (calculation: total no of suppliers - no of suppliers making 80% of spend),
    'top 3 suppliers spend' (calculation: spend sorted descending; given the total spend, we calculate top 3 spend * 100 / total spend).
    If you want much more clarification, I will give it to you.

    Hi,
    Welcome to the forum!
    949497 wrote:
    create table rangespendbucket(rangespend varchar2(40), id number) ...
    Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful!
    I am new in this forum ...
    You're way ahead of some people, who have been using the forum for years but still haven't learned how to post their data.
    Don't forget to post the exact results you want from that data.
    How are the two tables related? Do you have to parse rangespendbucket.rangespend to extract the NUMBERs 100 and 200 from the VARCHAR2 '100-200'? It would be simpler to do it the other way around: store the NUMBERs 100 and 200 in two separate NUMBER columns, and derive the label '100-200' from them (or store the label in a separate column, as it is now, in addition to rangebegin and rangeend NUMBER columns).
    Whatever number is related to these ranges, what happens if that number is exactly 200? What if it is less than 100?
    I want to find out all of these in a single query (1-8), except 1.1 (separate query).
    We are using Oracle 10g Release 2.
    Thanks! That's another thing that's always important to post (and another thing some people haven't learned to do).
    1) no of customers/suppliers in the spend bucket.
    1.1) If anybody clicks on that particular bucket, it will show the no of suppliers.
    This is the SQL and PL/SQL forum. What do you mean by "click", and how does it involve SQL or PL/SQL?
    2) total no of suppliers for all buckets (sum)
    3) % of suppliers for each bucket (each bucket supp cnt * 100 / total supp cnt)
    3) total spend for each bucket
    4) total spend for all buckets combined
    5) % of spend for each bucket against the total (each bucket supp spend * 100 / total supp spend)
    6) how many no of suppliers make 80% of total spend (with respect to all buckets)
    I'm not certain I understand what any of the outputs are, but I'm especially unsure of 6) and 7).
    Do you want the smallest possible number of suppliers, such that their spend totals account for at least 80% of all spend total?
    Do you want the largest possible number of suppliers, such that their spend total just barely exceeds 80% of the overall total?
    Do you want something else entirely?
    When you post the results you want from the given sample data, explain this part very carefully.
    7) how many no of suppliers make 20% of total spend (with respect to all buckets)
    8) top 3 suppliers make how much % of spend (with respect to all buckets)
    I suspect you'll need to use aggregate functions to get the totals by supplier, and then use analytic functions to answer questions such as "Is the running total greater than 80% yet?". I'm sure this will be clearer after you post the results you want.
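    For what it's worth, here is a sketch of that running-total approach for 6), under one possible reading (the smallest number of suppliers whose combined spend reaches 80% of the total), using the spend table posted above:
    WITH    supplier_totals    AS
    (
        SELECT    supplier
        ,         SUM (cy_spend)    AS total_spend
        FROM      spend
        GROUP BY  supplier
    )
    ,       running    AS
    (
        SELECT  supplier
        ,       total_spend
        ,       SUM (total_spend) OVER (ORDER BY total_spend DESC, supplier)    AS running_spend
        ,       SUM (total_spend) OVER ()                                       AS grand_total
        FROM    supplier_totals
    )
    SELECT  COUNT (*)    AS suppliers_making_80_pct
    FROM    running
    WHERE   running_spend - total_spend < 0.8 * grand_total    -- cumulative spend before this supplier is still under 80%
    ;
    A supplier is counted as long as the running total before it is still below 80% (with the sample data above, that gives 2: P and D). Question 7) would then just be the total supplier count minus this number, as the poster's own calculation note says.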

  • MAX function

    Hi Everyone,
    I am aware of the following flavors of the MAX function:
    1) choose MAX from the folders/fields list (selected items tab)
    2) create calculation using: MAX keep dense
    3) create calculation using: MAX analytic function
    Questions, please:
    ===========
    a) with MAX regular, MAX keep dense, MAX - analytic function
    is it necessary to sort it using tools/sort - choose fields to sort by?
    or does the data get sorted due to the ORDER BY clause in MAX used in a calculation?
    b) how to understand the difference between MAX keep dense and the MAX analytic function
    1) I understand that analytic functions are applied after detail row processing.
    Does the MAX keep dense calculation happen during detail row processing?
    2) how did you know to advise when to use MAX keep dense, and when to use MAX - analytic function?
    Thanks for your ideas and assistance, Sandra

    Hi,
    a) with MAX regular, MAX keep dense, MAX - analytic function: is it necessary to sort it using tools/sort - choose fields to sort by? Or does the data get sorted due to the ORDER BY clause in MAX used in a calculation?
    It is only necessary to use a sort if you want the rows returned in a specific order. The ORDER BY in the MAX calculation defines the maximum within the group or window. It may affect the order in which the rows are returned, but this is not guaranteed, and you should use a sort on the main query.
    b) how to understand the difference between MAX keep dense and the MAX analytic function
    1) I understand that analytic functions are applied after detail row processing. Does the MAX keep dense calculation happen during detail row processing?
    Yes.
    2) how did you know to advise when to use MAX keep dense, and when to use MAX - analytic function?
    In general, if you want the result on a single row, so that you have one row for each group, then you should use the aggregate MAX. If you want the same MAX on all the rows in the window (defined by the partition), then use the analytic MAX.
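    For example, assuming the standard SCOTT.EMP demo table is available, the two flavors might be contrasted like this (a sketch):
    SELECT    deptno
    ,         MAX (sal)                                          AS max_sal
    ,         MAX (ename) KEEP (DENSE_RANK LAST ORDER BY sal)    AS top_paid_ename
    FROM      scott.emp
    GROUP BY  deptno
    ;    -- aggregate: one row per department, with the name belonging to the top salary
    SELECT  ename
    ,       sal
    ,       MAX (sal) OVER (PARTITION BY deptno)    AS max_sal_in_dept
    FROM    scott.emp
    ;    -- analytic: every detail row kept, each one showing its department's top salary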
    Rod West

  • Compare the current value with the previous value in the same column

    Hi all,
    I have to include a statement in a query which allows comparing the current value of column A with the previous value of column A (same column). From there, I need to add a condition in order to get the expected result.
    Let's take an example to illustrate what I want to achieve:
    I have the following columns in a table called 'Charges':
    Ship_id batch_nr Order_nr Price
    SID1111 9997 MD5551 50
    SID1111 9998 MD5552 50
    SID1111 9999 MD5553 50
    SID2222 8887 MD6661 80
    SID2222 8887 MD6662 80
    SID2222 8887 MD6663 80
    SID3333 6666 MD7771 90
    I want to check whether the ship_id of rows 2 and 3 (and more, if available) is equal to the ship_id of row 1.
    If that is the case, then show 'together with' plus the first batch_nr in rows 2 and 3 in the Price column. If not, keep the original value of the Price column.
    Please see the expected result below:
    Ship_id batch_nr Order_nr Price
    SID1111 9997 MD5551 50
    SID1111 9998 MD5552 together with 9997
    SID1111 9999 MD5553 together with 9997
    SID2222 8887 MD6661 80
    SID2222 8887 MD6662 together with 8887
    SID2222 8887 MD6663 together with 8887
    SID3333 6666 MD7771 90
    Thanks in advance for your help; it is really urgent.
    Imco20030

    Hi,
    user11961002 wrote:
    Hi,
    Here is the query that I use:
    [ select
    sl.ship_id,
    o.ordnum,
    o.reffld_5 "BatchNR",
    sum(tc1.chrg_amt) "FreightPRC",
    sum(tc2.chrg_amt) "FuelPRC",
    sum (tc1.chrg_amt + tc2.chrg_amt + tc3.chrg_amt) "Total Price"
    from ord_line ol
    join ord o on (ol.ordnum = o.ordnum and ol.client_id = o.client_id)
    join shipment_line sl on (ol.ordnum = sl.ordnum and ol.client_id = sl.client_id and ol.ordlin = sl.ordlin)
    join adrmst a2 on (o.rt_adr_id = a2.adr_id)
    left join tm_chrg tc1 on (tc1.chargetype = 'FREIGHT' and tc1.chrg_role = 'PRICE' and tc1.ship_id = sl.ship_id)
    left join tm_chrg tc2 on (tc2.chargetype = 'FUELSURCHARGE'and tc2.chrg_role = 'PRICE' and tc2.ship_id = sl.ship_id)
    where sl.ship_id = 'SID0132408'
    group by o.client_id, o.ordnum, o.reffld_2, sl.ship_id, a2.adrnam, a2.adrln1, a2.adrpsz, a2.adrcty, a2.ctry_name,
    o.reffld_5, ol.early_shpdte
    order by ship_id
    ]
    That looks like the query you were using before you started this thread.
    Modify it, using the analytic functions FIRST_VALUE and LAG, like I showed you.
    I see that you did simplify the problem quite a bit, and it's good that you did that.
    It doesn't matter that your real problem involves joins or GROUP BY. Analytic functions are calculated on the results after all joins and GROUP BYs are done. Just substitute your real expressions for the simplified ones.
    For example, in your simplified problem, there was a column called order_nr, but I see now that it's really called o.ordnum. Where the solution I posted earlier says "ORDER BY order_nr", you should say "ORDER BY o.ordnum".
    Here's a less obvious example: in your simplified problem, there was a column called price, but I see now that it's really SUM (tc1.chrg_amt + tc2.chrg_amt + tc3.chrg_amt). Where the solution I posted earlier says "TO_CHAR (price)", you should say "TO_CHAR (SUM (tc1.chrg_amt + tc2.chrg_amt + tc3.chrg_amt))". (You can't use an alias, like "Total Price", in the same SELECT clause where it is defined.)
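    In terms of the simplified Charges example, that advice might look like this (a sketch only; the earlier post isn't shown in this thread extract, so this version uses FIRST_VALUE plus ROW_NUMBER to spot each ship_id's first row):
    SELECT  ship_id
    ,       batch_nr
    ,       order_nr
    ,       CASE
                WHEN  ROW_NUMBER () OVER ( PARTITION BY  ship_id
                                           ORDER BY      order_nr ) = 1
                THEN  TO_CHAR (price)                        -- first row keeps its own price
                ELSE  'together with ' || FIRST_VALUE (batch_nr)
                                              OVER ( PARTITION BY  ship_id
                                                     ORDER BY      order_nr )
            END    AS price
    FROM    charges
    ;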
    I removed some columns from the select as they are not relevant for the wanted action, like address details or other references.
    Now here is the result:
    Shipment ID     Order Number     WMS Batch     Freight Price     Fuel Price     Order Total Price
    SID0132408     MDK-000014-05602649     04641401     110     10     120
    SID0132408     MDK-000014-05602651     04641402     110     10     120
    SID0132408     MDK-000014-05602652     04641363     110     10     120
    as you can see, the 3 orders have the same shipment ID.
    The expected result should be shown under column 'Total Price' as follows:
    Shipment ID     Order Number     WMS Batch     Freight Price     Fuel Price     Order Total Price
    SID0132408     MDK-000014-05602649     04641401     110     10     120
    SID0132408     MDK-000014-05602651     04641402     110     10     tog with 04641401
    SID0132408     MDK-000014-05602652     04641363     110     10     tog with 04641401
    Okay, so those are the correct results that I asked for, plus the incorrect results you're getting now. Thanks; extra information doesn't hurt.
    But where is the raw data that you're starting with?
    It looks like you tried to format the code (but not the results) by typing this 1 character
    [
    before the formatted section, and this different character
    ]
    after the formatted section. To post formatted text on this site, type these 6 characters
    {code}
    before the formatted section, and the exact same 6 characters again after the formatted section.

  • Partition By Query Problem

    Hi everyone,
    I'm trying to print out, for every 5 seconds, the average value of each stock's price for the last 10 seconds. The query should look like this:
    select price, avg(price) from stockInputChannel [partition by stockName range 10 slide 5]
    BUT, OCEP's CQL does not support this kind of query. It warns me to use "partition by $partname rows $rownumber range $rangevalue slide $slidevalue". As I mentioned above, I do not need the row-related parameter. Why does CQL force me to use that?
    Thanks.

    Hi,
    Peter is right. The best way to learn how a function works is to do some experiments. The SQL Language Reference manual describes all the built-in functions, including ROW_NUMBER, and has a good general description of how analytic functions work, under "Analytic functions".
    ROW_NUMBER assigns an integer 1, 2, 3, ... to each row.
    When you say "partition by customer_no", a separate set of numbers (starting with 1) will be used for each distinct value of customer_no.
    When you say "order by customer_type desc,rowid desc", that means the lower numbers will be assigned to the higher values of customer_type. If there is a tie (two or more rows with exactly the same customer_type), then lower numbers will be assigned to the higher values of rowid.
    In the main query, the WHERE clause includes only the rows that were assigned 1 by the ROW_NUMBER function. The result is that you get one row of data for each customer_no (the PARTITION BY column). Which row will that be? The one with the highest customer_type (the ORDER BY column). This is called a Top-N Query, because you are picking N items (N=1 here) from the top of a sorted list. Notice that the analytic function is used in a sub-query, and its value is used in the WHERE clause of the main query. That's necessary, because analytic functions in a (sub)query are evaluated after the WHERE clause.
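    Putting those pieces together, a minimal Top-N query of that shape might look like this (a sketch; the customers table and its columns are hypothetical names):
    SELECT  customer_no
    ,       customer_type
    FROM   (
             SELECT  customer_no
             ,       customer_type
             ,       ROW_NUMBER () OVER ( PARTITION BY  customer_no
                                          ORDER BY      customer_type DESC
                                          ,             ROWID         DESC )    AS rnum
             FROM    customers
           )
    WHERE   rnum = 1    -- keep only the top row of each partition
    ;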
