Rank - no null

Declare @test table (
calldate date,
calllogid int,
id int)
Insert into @test select '2013-12-02', 52713, 48287
Insert into @test select '2013-12-02', 52713, 271
Insert into @test select '2014-05-01', 71817,NULL
Insert into @test select '2014-05-01', 71817, 62805
Insert into @test select '2014-02-28', 63995, NULL
Select * from @test
order by calllogid
Above is a table variable with sample data, and below is the output I'm looking for:
calldate calllogid id
2013-12-02 52713 48287
2013-12-02 52713 271
2014-02-28 63995 NULL
2014-05-01 71817 62805
Suppose a calllogid has multiple id's; I'm interested in displaying both. If one of them is NULL, I just want to show the non-NULL record for that calllogid; in the above, for calllogid 71817 I'm only interested in seeing the non-NULL record. And if the calllogid has only one record, with a NULL id, I would like to show that as well.
I can achieve this with the RANK function, but dealing with NULL is giving me issues. Help is appreciated.
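For what it's worth, a runnable sketch of the usual trick (shown here in SQLite via Python, since the RANK() logic is portable to SQL Server): order each calllogid partition so that NULL ids rank after non-NULL ones, then keep only rank 1. Ties on rank 1 keep every non-NULL id for the calllogid.

```python
# A sketch, not the accepted answer: rank NULL ids after non-NULL ones
# within each calllogid, then keep rank 1. Assumes SQLite >= 3.25
# (window-function support); the RANK() expression itself is standard SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE test (calldate TEXT, calllogid INT, id INT)")
con.executemany("INSERT INTO test VALUES (?, ?, ?)", [
    ("2013-12-02", 52713, 48287),
    ("2013-12-02", 52713, 271),
    ("2014-05-01", 71817, None),
    ("2014-05-01", 71817, 62805),
    ("2014-02-28", 63995, None),
])

rows = con.execute("""
    SELECT calldate, calllogid, id
    FROM (
        SELECT calldate, calllogid, id,
               -- NULL ids sort after non-NULL ids inside each partition
               RANK() OVER (PARTITION BY calllogid
                            ORDER BY CASE WHEN id IS NULL THEN 1 ELSE 0 END) AS rnk
        FROM test
    ) AS t
    WHERE rnk = 1
    ORDER BY calldate, calllogid, id
""").fetchall()
for r in rows:
    print(r)
```

A calllogid with several non-NULL ids keeps them all (they tie at rank 1), one with a lone NULL id keeps that row, and a mixed one keeps only the non-NULL row.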

Please do not post those proprietary table variables. We need to know keys, constraints, DRI, etc. to give you answers. 
What little you did post is a mess. What math do you do on the call_log_id? That is the only reason to make it a numeric data type. Why do you use a vague, generic “id”? Let's call it foobar, so we will know it is not defined and cannot be a key. But that means this is not a table!
CREATE TABLE Tests 
(call_date DATE DEFAULT CURRENT_TIMESTAMP NOT NULL,
 call_log_id CHAR(5) NOT NULL
   CHECK(call_log_id LIKE '[0-9][0-9][0-9][0-9][0-9]'),
 foobar INTEGER,
 PRIMARY KEY (??????????) );
You also got the syntax for insertion wrong. 
INSERT INTO Tests
VALUES 
('2013-12-02', '52713', 48287),
('2013-12-02', '52713', 271),
('2014-05-01', '71817', NULL),
('2014-05-01', '71817', 62805),
('2014-02-28', '63995', NULL);
What is foobar? Why is it NULL? What about multiple NULLs? 
Do you want to try again and give us valid DDL? 
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL

Similar Messages

  • Need help with RANK() on NULL data

    Hi All
I am using Oracle 10g and running a query with RANK(), but it is not returning the desired output. Please HELP!!
    I have a STATUS table that shows the history of order status.. I have a requirement to display the order and the last status date (max). If there is any NULL date for an order then show NULL.
    STATUS
    ORD_NO | STAT | DT
    1 | Open |
    1 | Pending |
    2 | Open |
    2 | Pending |
    3 | Open |1/1/2009
    3 | Pending |1/6/2009
    3 | Close |
    4 | Open |3/2/2009
    4 | Close |3/4/2009
    Result should be (max date for each ORD_NO otherwise NULL):
    ORD_NO |DT
    1 |
    2 |
    3 |
    4 |3/4/2009
    CREATE TABLE Status (ORD_NO NUMBER, STAT VARCHAR2(10), DT DATE);
    INSERT INTO Status VALUES(1, 'Open', NULL);
    INSERT INTO Status VALUES(1, 'Pending', NULL);
    INSERT INTO Status VALUES(2, 'Open', NULL);
    INSERT INTO Status VALUES(2, 'Pending',NULL);
    INSERT INTO Status VALUES(3, 'Open', '1 JAN 2009');
    INSERT INTO Status VALUES(3,'Pending', '6 JAN 2009');
    INSERT INTO Status VALUES(3, 'Close', NULL);
    INSERT INTO Status VALUES(4, 'Open', '2 MAR 2009');
    INSERT INTO Status VALUES(4, 'Close', '4 MAR 2009');
    COMMIT;
I tried using the RANK function to rank all the orders by date. So I used an ORDER BY clause on the date in descending order, thinking that the null dates would be on top and would be grouped together for each ORD_NO.
    SELECT ORD_NO, DT, RANK() OVER (PARTITION BY ORD_NO ORDER BY DT DESC)
    FROM Status;
...but the result was something like this:
ORD_NO |DT |RANKING
1 | | 1
1 | | 1
2 | | 1
2 | | 1
3 | | 1
3 |1/6/2009 | 2
3 |1/1/2009 | 3
4 |3/4/2009 | 1
4 |3/2/2009 | 2
I am not sure why the first two ORD_NOs didn't group together and why a ranking of 1 was assigned to all of their rows. I was assuming something like:
    ORD_NO |DT |RANKING
1 | | 1
1 | | 2
2 | | 1
2 | | 1
    3 | | 1
    3 |1/6/2009 | 2
    3 |1/1/2009 | 3
    4 |3/4/2009 | 1
    4 |3/2/2009 | 2
    Please guide me if I am missing something here?
    Regards
    Sri

    Hi,
If I understood correctly, you don't need RANK:
SELECT ord_no, MAX(dt) KEEP (DENSE_RANK LAST ORDER BY dt) dt
    FROM status
GROUP BY ord_no
    SQL> select * from status;
        ORD_NO STAT       DT
             1 Open
             1 Pending
             2 Open
             2 Pending
             3 Open       2009-01-01
             3 Pending    2009-01-06
             3 Close
             4 Open       2009-03-02
             4 Close      2009-03-04
9 rows selected.
    SQL> SELECT   ord_no, MAX (dt)KEEP (DENSE_RANK LAST ORDER BY dt) dt
      2      FROM status
      3  GROUP BY ord_no;
        ORD_NO DT
             1
             2
             3
             4 2009-03-04
    SQL>
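Not the KEEP (DENSE_RANK LAST) form itself, but a runnable check of the same "max date, or NULL if any date is missing" rule, sketched in SQLite via Python: COUNT(dt) skips NULLs, so comparing it with COUNT(*) detects a NULL in the group.

```python
# A portable sketch of the rule above (SQLite has no KEEP clause):
# return NULL for an ord_no when any of its dates is NULL, else MAX(dt).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE status (ord_no INT, stat TEXT, dt TEXT)")
con.executemany("INSERT INTO status VALUES (?, ?, ?)", [
    (1, "Open", None), (1, "Pending", None),
    (2, "Open", None), (2, "Pending", None),
    (3, "Open", "2009-01-01"), (3, "Pending", "2009-01-06"), (3, "Close", None),
    (4, "Open", "2009-03-02"), (4, "Close", "2009-03-04"),
])

result = con.execute("""
    SELECT ord_no,
           -- COUNT(dt) < COUNT(*) means the group contains a NULL date
           CASE WHEN COUNT(*) <> COUNT(dt) THEN NULL ELSE MAX(dt) END AS dt
    FROM status
    GROUP BY ord_no
    ORDER BY ord_no
""").fetchall()
for r in result:
    print(r)
```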

  • How do I run query per report page?

I have a report with two subreports which both need queries to run on the fly per supplier (per report page). All the other subreports on the page, as well as the main report queries, are loaded in advance before the report and include all records needed for all suppliers. However, these two queries that I'm trying to have run per report page can only be loaded one at a time because of the way they are formatted (each allows a bar graph to show a label for only the current supplier on the x-axis; all other suppliers display a "" for the x-axis label, effectively showing only one legend label per report). I already have the necessary parameter for the report (Pm-Supplier), so how can I run an Oracle query on the fly per report page?
    Eric
    Query 1:
    select vendor,supplierloc,commodity,ip from (select c.Vendor, q.supplierloc, c.commodity,SUM (q.indexpoints+q.lateindexpoints) AS ip
    FROM qa_occ q, glovia_prod.c_vencom@GL7TEST c
    where q.occdate BETWEEN TO_DATE ('4/1/2006', 'mm/dd/yyyy')
    AND TO_DATE ('3/9/2007', 'mm/dd/yyyy')
    and c.vendor=q.supplier
    and (upper(trim(foundby)) not like '%MARKET%' or foundby is null)
    and (upper(trim(foundby)) not like '%TRIAL%' or foundby is null)
    and (upper(trim(rank)) not like '%MARKET%' or rank is null)
    and (upper(trim(rank)) not like '%TRIAL%' or rank is null)
    and nvl(void,'N') = 'N'
    and q.supplier= @SUPPLIER
    GROUP BY C.vendor,q.supplierloc,c.commodity) qa
    union
    select '' as vendor,supplierloc,commodity,ip from (select c.Vendor, q.supplierloc, c.commodity,SUM (q.indexpoints+q.lateindexpoints) AS ip
    FROM qa_occ q, glovia_prod.c_vencom@GL7TEST c
    where q.occdate BETWEEN TO_DATE (@BegDate, 'mm/dd/yyyy')
    AND TO_DATE (@EndDate, 'mm/dd/yyyy')
    and c.vendor=q.supplier
    and (upper(trim(foundby)) not like '%MARKET%' or foundby is null)
    and (upper(trim(foundby)) not like '%TRIAL%' or foundby is null)
    and (upper(trim(rank)) not like '%MARKET%' or rank is null)
    and (upper(trim(rank)) not like '%TRIAL%' or rank is null)
    and nvl(void,'N') = 'N'
    and q.supplier<> @SUPPLIER
    GROUP BY C.vendor,q.supplierloc,c.commodity) qa
    order by commodity,ip desc
    Query 2:
    select c.name as name,c.vendor,c.ven_loc, SUM(q.indexpoints+q.lateindexpoints) as ip
    from qa_occ q,glovia_prod.ven_loc c
    WHERE q.occdate BETWEEN TO_DATE (@BegDate, 'mm/dd/yyyy')
    AND TO_DATE (@EndDate, 'mm/dd/yyyy')
    and (upper(trim(q.foundby)) not like '%MARKET%' or q.foundby is null)
    and (upper(trim(q.foundby)) not like '%TRIAL%' or q.foundby is null)
    and (upper(trim(q.rank)) not like '%MARKET%' or q.rank is null)
    and (upper(trim(q.rank)) not like '%TRIAL%' or q.rank is null)
    and q.supplier is not null
    and nvl(q.void,'N') = 'N'
    and q.supplier=@SUPPLIER
    and q.supplier=c.vendor
    and q.supplierloc=c.ven_loc
    GROUP BY c.name,c.vendor,c.ven_loc
    union
    select '' as name,c.vendor,c.ven_loc, SUM(q.indexpoints+q.lateindexpoints) as ip
    from qa_occ q,glovia_prod.ven_loc c
    WHERE q.occdate BETWEEN TO_DATE (@BegDate, 'mm/dd/yyyy')
    AND TO_DATE (@EndDate, 'mm/dd/yyyy')
    and (upper(trim(q.foundby)) not like '%MARKET%' or q.foundby is null)
    and (upper(trim(q.foundby)) not like '%TRIAL%' or q.foundby is null)
    and (upper(trim(q.rank)) not like '%MARKET%' or q.rank is null)
    and (upper(trim(q.rank)) not like '%TRIAL%' or q.rank is null)
    --and q.supplier is not null
    and nvl(q.void,'N') = 'N'
    and q.supplier=c.vendor
    and q.supplierloc=c.ven_loc
    and q.supplier <> @SUPPLIER
    GROUP BY c.name,c.vendor,c.ven_loc
    ORDER BY ip DESC

You can't; CR considers each subreport a separate report, and therefore it makes a new connection and runs the SQL for each subreport.

  • Rollup/Cube Operation in Oracle 10g

    I'm using Oracle 10g as my database.
    Suppose I have a table that has data:
    ID SEMESTER SUBJECT MARKS
    9 4 Maths 50
    9 4 Science 45
    9 4 English 42
    10 5 Maths 56
    10 5 History 23
    Now the output should look like this
    ID SEMESTER SUBJECT MARKS RANK
    9 4 Maths 50
    Science 45
    English 42
    Total 137 1
    10 5 Maths 56
    History 23
    Total 79 2
    Can anybody please help me out.
    Thanx in advance

    Select * from tmp_sp_marks
SEMESTER  SUBJECT   MARKS  STUDID
3         Maths     40     8
3         English   52     8
3         Gujarati  40     8
4         Science   45     9
4         English   42     9
5         Maths     43     10
5         English   44     10
4         Maths     50     9
select case when rn = 1 then studid else null end studid,
       case when rn = 1 then semester else null end semester,
       subject,
       marks,
       case when subject = 'Total' then rank else null end rank
from (
  select studid, semester, subject, marks,
         max(rank) over (partition by studid, semester) rank,
         row_number() over (partition by studid, semester order by marks) rn
  from (
    select studid, semester, subject, marks, 0 rank from tmp_sp_marks a
    union
    select studid, semester, 'Total', sm, dense_rank() over (order by sm desc) rank
    from (
      select studid, semester, sum(marks) sm
      from tmp_sp_marks
      group by rollup(studid, semester) )
    where studid is not null and semester is not null ) )
order by max(rank) over (partition by studid, semester) desc, marks asc
    OUTPUT:
STUDID  SEMESTER  SUBJECT   MARKS  RANK
8       3         Maths     40
                  Gujarati  40
                  English   52
                  Total     132    2
9       4         English   42
                  Science   45
                  Maths     50
                  Total     137    1
10      5         Maths     43
                  English   44
                  Total     87     3

  • Inconsistent datatypes: expected NUMBER got CHAR error

    Hi,
    I have the following table
create GLOBAL TEMPORARY TABLE br_total_rtn_data_tmp (
code                     varchar(50) NOT NULL,
name                     varchar(255),
cum_ytd_rtn_amt          varchar(255),
cum_one_mon_rtn_amt      varchar(255),
cum_thr_mon_rtn_amt      varchar(255),
cum_six_mon_rtn_amt      varchar(255),
cum_nine_mon_rtn_amt     varchar(255),
cum_one_yr_rtn_amt       varchar(255),
cum_thr_yr_rtn_amt       varchar(255),
cum_five_yr_rtn_amt      varchar(255),
cum_ten_yr_rtn_amt       varchar(255),
cum_lof_rtn_amt          varchar(255),
avg_anl_one_yr_rtn_amt   varchar(255),
avg_anl_thr_yr_rtn_amt   varchar(255),
avg_anl_five_yr_rtn_amt  varchar(255),
avg_anl_ten_yr_rtn_amt   varchar(255),
avg_anl_lof_rtn_amt      varchar(255),
cum_prev_1m_month_end    varchar(255),
cum_prev_2m_month_end    varchar(255)
) ON COMMIT PRESERVE ROWS;
    I have a case statement
CASE
    WHEN code = 'MDN' THEN
        max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.mdn / 100 else null end)
    WHEN code = 'QRT' THEN
        max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.quartile else null end)
    WHEN code = 'PCT' THEN
        max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.pct_beaten / 100 else null end)
    WHEN code = 'RNK' THEN
        case when (p.m_date = v_prev2_yr_mon and p.period_type = '1M' and p.rank is null and p.cnt is null)
        then
            p.rank
        else
            p.rank || '/' || p.cnt
        end
    ELSE NULL
END
The output for code = 'RNK' should be something like 3/5, which is rank/count,
but I get the error "Inconsistent datatypes: expected NUMBER got CHAR" when I put p.rank||'/'||p.cnt.
How can that be solved?
Oracle version is 10g.

    Taken from the documentation of the CASE expression:
    "For a simple CASE expression, the expr and all comparison_expr values must either have the same datatype (CHAR, VARCHAR2, NCHAR, or NVARCHAR2, NUMBER, BINARY_FLOAT, or BINARY_DOUBLE) or must all have a numeric datatype. If all expressions have a numeric datatype, then Oracle determines the argument with the highest numeric precedence, implicitly converts the remaining arguments to that datatype, and returns that datatype.
    For both simple and searched CASE expressions, all of the return_exprs must either have the same datatype (CHAR, VARCHAR2, NCHAR, or NVARCHAR2, NUMBER, BINARY_FLOAT, or BINARY_DOUBLE) or must all have a numeric datatype. If all return expressions have a numeric datatype, then Oracle determines the argument with the highest numeric precedence, implicitly converts the remaining arguments to that datatype, and returns that datatype."
    You need to use the same data type for all your expressions. If you want to return a string, then you need to convert the remaining numbers explicitly to strings. E.g. you could try something like this:
CASE
    WHEN code = 'MDN' THEN
        to_char(max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.mdn / 100 else null end), 'TM')
    WHEN code = 'QRT' THEN
        to_char(max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.quartile else null end), 'TM')
    WHEN code = 'PCT' THEN
        to_char(max(case when p.m_date = v_prev2_yr_mon and p.period_type = '1M' then p.pct_beaten / 100 else null end), 'TM')
    WHEN code = 'RNK' THEN
        case when (p.m_date = v_prev2_yr_mon and p.period_type = '1M' and p.rank is null and p.cnt is null)
        then
            to_char(p.rank, 'TM')
        else
            p.rank || '/' || p.cnt
        end
    ELSE NULL
END
I see another potential issue: you're mixing aggregate functions with non-aggregate expressions. This can only work if those non-aggregate expressions are part of the GROUP BY clause, but you haven't posted the complete statement, so I can only guess.
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle:
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • To Update a table

    HI all,
    I have to move values from a column of data type varchar2 to a column of data type number.
    How could I do that?
It throws an error:
ERROR at line 3:
ORA-01722: invalid number
The statement:
UPDATE edr_rpt_by_ranges_output
   SET range_label = range_low
 WHERE range_low = (SELECT status_short_message
                      FROM status_message
                     WHERE edr_rpt_by_ranges_output.range_low = status_message.status_code);

edr_rpt_by_ranges_output table structure:
Name                      Null      Type
RANK                      NOT NULL  NUMBER(3)
ROW_TYPE                  NOT NULL  VARCHAR2(30)
LANE_ID                   NOT NULL  NUMBER(3)
DIRECTION_ID              NOT NULL  NUMBER(3)
INTERVAL_START_DATE_TIME  NOT NULL  DATE
INTERVAL_END_DATE_TIME    NOT NULL  DATE
RANGE_LABEL                         VARCHAR2(20)
RANGE_LOW                 NOT NULL  NUMBER(12)
RANGE_HIGH                NOT NULL  NUMBER(12)
desc status_message:
Name                      Null      Type
STATUS_CODE               NOT NULL  NUMBER(12)
STATUS_ABBREVIATION       NOT NULL  VARCHAR2(20)
STATUS_SHORT_MESSAGE      NOT NULL  VARCHAR2(10)
STATUS_MESSAGE            NOT NULL  VARCHAR2(50)
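The root cause is the implicit TO_NUMBER applied to the VARCHAR2 side of the comparison (range_low is NUMBER, status_short_message is VARCHAR2): any non-numeric value raises ORA-01722. A small Python sketch of the defensive idea, with a made-up helper name for illustration:

```python
# Sketch of a "safe TO_NUMBER": convert only values that really are numeric,
# and return None for the rest so they can be skipped or flagged before the
# comparison/join against the NUMBER column. (Helper name is hypothetical.)
def to_number_or_none(s):
    """Return s as a float, or None when s is not a valid number."""
    try:
        return float(s)
    except (TypeError, ValueError):
        return None

values = ["42", "3.5", "N/A", None, "100"]
converted = [to_number_or_none(v) for v in values]
# Only rows that converted cleanly would be compared with status_code.
print(converted)
```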

  • Regarding case statement

    Hi guys,
I am facing a very strange problem. I have a score column. Based on this column I sort the data, and after that I assign a rank to it. But for one employee it is not giving a rank; it gives NULL. So please, any suggestions?
I am applying a case statement for it.

It can be because of the case statement... What's the purpose of the case statement here? Please elaborate.
Try removing the case and check whether you are getting the same error.

  • Calculate serial no according to groups

    Name   ID
    Prime    12   
    Prime    12  
    Prime    12   
    Prime    12   
    Prime    12   
    Prime    16   
    Prime    18   
    Prime    18   
    Prime    18 
I have 2 columns, say Name and ID. I want the result of the SQL as:
    Name   ID      Order
    Prime    12    1
    Prime    12    2
    Prime    12    3
    Prime2    12    1
    Prime2    12     2
    Prime    16    1
    Prime    18    1
    Prime    18    2
    Prime    18    3
That is, a serial number for the rows within each Name/ID group.

    >>Prime2
How are you getting Prime2? I am not seeing Prime2 in your input table.
In case it's a typo, then try this:
    declare @tab table (Name varchar(50),id int, [Rank] int NULL)
    Insert into @tab (name,ID) Values
    ('Prime',12),
    ('Prime',12),
    ('Prime',12),
    ('Prime2',12),
    ('Prime2',12),
    ('Prime',16),
    ('Prime',18),
    ('Prime',18),
    ('Prime',18);
    with cte
    as(
Select name, id, rank, row_number() Over (partition by Name, Id Order by Id asc) RowNum From @tab
    )update cte set [rank]= RowNum
    select * from @tab
    Satheesh
    My Blog
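A quick runnable check of the ROW_NUMBER() logic from the answer above, using SQLite via Python (names follow the post; the Order column is spelled `ord` here to avoid the reserved word):

```python
# Verifies that ROW_NUMBER() OVER (PARTITION BY name, id) yields the serial
# numbers the poster asked for. Assumes SQLite >= 3.25 for window functions.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tab (name TEXT, id INT)")
con.executemany("INSERT INTO tab VALUES (?, ?)", [
    ("Prime", 12), ("Prime", 12), ("Prime", 12),
    ("Prime2", 12), ("Prime2", 12),
    ("Prime", 16),
    ("Prime", 18), ("Prime", 18), ("Prime", 18),
])

rows = con.execute("""
    SELECT name, id,
           ROW_NUMBER() OVER (PARTITION BY name, id ORDER BY id) AS ord
    FROM tab
    ORDER BY id, name, ord
""").fetchall()
for r in rows:
    print(r)
```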

  • Sql query slowness due to rank and columns with null values:

    I have the following table in database with around 10 millions records:
    Declaration:
create table PropertyOwners (
[Key] int not null primary key,
PropertyKey int not null,
BoughtDate DateTime,
OwnerKey int null,
GroupKey int null
)
go
    [Key] is primary key and combination of PropertyKey, BoughtDate, OwnerKey and GroupKey is unique.
    With the following index:
CREATE NONCLUSTERED INDEX [IX_PropertyOwners] ON [dbo].[PropertyOwners] (
[PropertyKey] ASC,
[BoughtDate] DESC,
[OwnerKey] DESC,
[GroupKey] DESC
)
go
    Description of the case:
For a single BoughtDate one property can belong to multiple owners or a single group; for a single record there can be either an OwnerKey or a GroupKey but not both, so one of them will be NULL in each record. I am trying to retrieve the data from the table using the following query for the OwnerKey. If there are rows for the same property for both owners and a group at the same time, then the rows having an OwnerKey will be preferred; that is why I am using "OwnerKey desc" in the RANK function.
    declare @ownerKey int = 40000   
    select PropertyKey, BoughtDate, OwnerKey, GroupKey   
    from (    
    select PropertyKey, BoughtDate, OwnerKey, GroupKey,       
    RANK() over (partition by PropertyKey order by BoughtDate desc, OwnerKey desc, GroupKey desc) as [Rank]   
    from PropertyOwners   
    ) as result   
    where result.[Rank]=1 and result.[OwnerKey]=@ownerKey
It is taking 2-3 seconds to get the records, which is too slow; it takes a similar time when I get the records using the GroupKey. But when I tried to get the records for the PropertyKey with the same query, it executed in 10 milliseconds.
Maybe the slowness is because OwnerKey/GroupKey in the table can be NULL and SQL Server is unable to index them. I have also tried an indexed view to pre-rank the rows, but I can't use it in my query, as the RANK function is not supported in indexed views.
Please note this table is updated once a day, and I am using SQL Server 2008 R2. Any help will be greatly appreciated.

create table #result (PropertyKey int not null, BoughtDate datetime, OwnerKey int null, GroupKey int null, [Rank] int not null)
Create index idx ON #result(OwnerKey, [Rank])
    insert into #result(PropertyKey, BoughtDate, OwnerKey, GroupKey, [Rank])
    select PropertyKey, BoughtDate, OwnerKey, GroupKey,
    RANK() over (partition by PropertyKey order by BoughtDate desc, OwnerKey desc, GroupKey desc) as [Rank]
    from PropertyOwners
    go
    declare @ownerKey int = 1
    select PropertyKey, BoughtDate, OwnerKey, GroupKey
    from #result as result
    where result.[Rank]=1
    and result.[OwnerKey]=@ownerKey
    go
Best Regards, Uri Dimant, SQL Server MVP,
    http://sqlblog.com/blogs/uri_dimant/
    MS SQL optimization: MS SQL Development and Optimization
    MS SQL Consulting:
    Large scale of database and data cleansing
    Remote DBA Services:
    Improves MS SQL Database Performance
    SQL Server Integration Services:
    Business Intelligence

  • How to use "Rank" function  in Oracle?

    I need to display Top 15 records by using rank function.
Here is my query... I need to pull the top 15 FAQ's using the below query. How can I use the RANK function to display the top 15 FAQ's in the list?
Select distinct Sub1.FAQ, Sub1.FAQ_Hits, GU.display_Name_FMLS as displayname, ev.ParentLinkrecordid, ev.userid
from User GU
Join Event ev
ON LOWER(ev.userid) IN (LOWER(GU.lanid), LOWER(GU.racfid))
Join (Select distinct sm.stem as FAQ, Sum(ev.Eventresults) as FAQ_Hits, ev.ParentLinkrecordid as Topic_ID
from Event ev
Join SubjectMatter sm
ON (TO_CHAR(sm.smrecordid) = ev.eventdetail1) AND ev.eventdetail1 IS NOT NULL AND sm.smtype = 1
Where (Upper(ev.eventsubtype) in (Upper('FAQ'), Upper('OPENFAQ')))
AND TO_DATE(eventdatetime, 'yyyy-mm-dd hh24:mi:ss') >= TO_DATE('20100601', 'yyyymmdd')
and TO_DATE(eventdatetime, 'yyyy-mm-dd hh24:mi:ss') <= TO_DATE('20100831', 'yyyymmdd')
Group by sm.stem, ev.Parentlinkrecordid
order by FAQ) sub1
ON Sub1.Topic_ID = ev.ParentLinkrecordid

A few bits that I noticed in the query ...
in (Upper('FAQ'), Upper('OPENFAQ'))
1) Do you really need UPPER for a string which is already in upper case?
Select distinct sm.stem as FAQ, Sum(ev.Eventresults) as FAQ_Hits, ev.ParentLinkrecordid as Topic_ID
2) Do you need a DISTINCT when you are using a GROUP function, viz. SUM?
Your rank query is as follows; I am not very good at the ANSI-style JOIN, so I changed it slightly ... :-)
Also notice the usage of the rank function in the "sub1" query.
    select distinct sub1.faq,
                    sub1.faq_hits,
                    gu.display_name_fmls as displayname,
                    ev.parentlinkrecordid,
                    ev.userid
    from user gu, event ev,
      (select rank() over (order by sum(ev.eventresults) desc) rnk,
              sum(ev.eventresults) as faq_hits,
              sm.stem as faq,         
              ev.parentlinkrecordid as topic_id
         from event ev, subjectmatter sm
        where (to_char(sm.smrecordid) = ev.eventdetail1)
          and ev.eventdetail1 is not null
          and sm.smtype = 1
          AND upper(ev.eventsubtype) in ('FAQ', 'OPENFAQ')
          and to_date(eventdatetime, 'yyyy-mm-dd hh24:mi:ss') >= to_date('20100601', 'yyyymmdd')
          and to_date(eventdatetime, 'yyyy-mm-dd hh24:mi:ss') <= to_date('20100831', 'yyyymmdd')
        group by sm.stem, ev.parentlinkrecordid
        order by faq) sub1
    where lower(ev.userid) in (lower(gu.lanid), lower(gu.racfid))
  and sub1.topic_id = ev.parentlinkrecordid
  and sub1.rnk <= 15;
Like mentioned above, some sample data would have helped.

  • Rank function taking too long

    I am running the below query. This query runs fine. However, if I uncomment the
    "rank() over(partition by CONCAT_DATE,VARIABLE_ID order by VARIABLE_VALUE) RANK" and
    "B.rank=1" , the query takes a very long time to execute...takes about 6-7 minutes
    instead of 20 seconds(when the rank part is commented out). Is there any other way to speed
    this up as I needed the one with the lowest rank.
    Thanks
    SELECT
    EXAMCODE,
    STARTDATE,
    REVDATE,
    ENDDATE,
    VARIATION ,
    GROUPNAME,
    PLAN,
    STDPLAN,
    CORPPLAN,
    PRODUCT,
    PES,
    CONCAT_DATE,
    NOTE_ID,
    MAJ_HDG_ID,
    MAJ_HDG_TXT,
    MIN_HDG_ID,
    MIN_HDG_TXT,
    VARIABLE_ID,
    VARIABLE_DESC,
    PROVIDERCODE,
    VARFORMAT,
    NOTE_NAME,
    MAJHEADNGNOTE,
    MINHEADNGNOTE,
    VARIABLENOTE,
    VARIABLE_VALUE
    FROM(
    SELECT
    EXAMCODE,
    STARTDATE,
    REVDATE,
    ENDDATE,
    VARIATION ,
    GROUPNAME,
    PLAN,
    STDPLAN,
    CORPPLAN,
    PRODUCT,
    PES,
    CONCAT_DATE,
    NOTE_ID,
    MAJ_HDG_ID,
    MAJ_HDG_TXT,
    MIN_HDG_ID,
    MIN_HDG_TXT,
    VARIABLE_ID,
    VARIABLE_DESC,
    PROVIDERCODE,
    VARFORMAT,
    NOTE_NAME,
    MAJHEADNGNOTE,
    MINHEADNGNOTE,
    VARIABLENOTE,
    VARIABLE_VALUE --,
    -- rank() over(partition by CONCAT_DATE,VARIABLE_ID order by VARIABLE_VALUE) RANK
FROM (
SELECT
    EXAM_DIM2.EXAM_CODE EXAMCODE,
    to_char(START_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') STARTDATE,
    to_char(REV_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') REVDATE,
    to_char(END_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') ENDDATE,
    VARIATION_DIM2.VARIATION_ID VARIATION ,
    EXAM_DIM2.GROUP_NAME GROUPNAME,
    EXAM_DIM2.BENEFIT_PLAN_NAME PLAN,
    EXAM_DIM2.STANDARD_PLAN_NAME STDPLAN,
    EXAM_DIM2.CORPORATE_PLAN_NAME CORPPLAN,
    EXAM_DIM2.PRODUCT_NAME PRODUCT,
    STRUCTURE_DIM2.STRUCTURE_NAME PES,
    EXAM_DIM2.EXAM_CODE || ' - ' || to_char(START_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') || ' - ' || nvl(to_char(REV_DATE_DIM2.FULL_DATE,'MM/DD/YYYY'),'N/A')|| ' - ' || to_char(END_DATE_DIM2.FULL_DATE ,'MM/DD/YYYY') CONCAT_DATE,
    NOTES_DIM2.NOTE_ID NOTE_ID,
    DECODE (MAJOR_HEADING_DIM2.HEADING_ID ,null,HEADING_DIM2.HEADING_ID,MAJOR_HEADING_DIM2.HEADING_ID) MAJ_HDG_ID ,
    DECODE (MAJOR_HEADING_DIM2.HEADING_ID ,null,HEADING_DIM2.HEADING_TEXT,MAJOR_HEADING_DIM2.HEADING_TEXT) MAJ_HDG_TXT ,
    DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,'',HEADING_DIM2.HEADING_ID) MIN_HDG_ID,
    DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,'',HEADING_DIM2.HEADING_TEXT) MIN_HDG_TXT,
    VARIABLE_DIM2.VARIABLE_ID VARIABLE_ID,
    VARIABLE_DIM2.VARIABLE_SHORT_DESCRIPTION VARIABLE_DESC,
    VARIABLE_DIM2.PROVIDER_ARRANGEMENT_CODE PROVIDERCODE,
    VARIABLE_DIM2.VARIABLE_FORMAT_CODE VARFORMAT,
    NOTES_DIM2.NOTE_NAME NOTE_NAME,
    '' as MAJHEADNGNOTE,
    '' as MINHEADNGNOTE,
    DBMS_LOB.SUBSTR(NOTES_DIM2.NOTE_TEXT,DBMS_LOB.GETLENGTH(NOTES_DIM2.NOTE_TEXT) ,1) VARIABLENOTE,
    EXAM_INFO_FACT2.VARIABLE_VALUE VARIABLE_VALUE
    FROM
    MED_DM.DATE_DIM START_DATE_DIM2,
    MED_DM.DATE_DIM END_DATE_DIM2,
    MED_DM.DATE_DIM REV_DATE_DIM2,
    MED_DM.EXAM_DIM EXAM_DIM2,
    MED_DM.STRUCTURE_DIM STRUCTURE_DIM2,
    MED_DM.NOTES_FACT NOTES_FACT2,
    MED_DM.HEADING_DIM MAJOR_HEADING_DIM2,
    MED_DM.HEADING_DIM HEADING_DIM2,
    MED_DM.VARIABLE_DIM VARIABLE_DIM2,
    MED_DM.NOTES_DIM NOTES_DIM2,
    MED_DM.EXAM_INFO_FACT EXAM_INFO_FACT2,
    MED_DM.VARIATION_DIM VARIATION_DIM2
    WHERE
    ( EXAM_INFO_FACT2.EXAM_DIM_ID = EXAM_DIM2.EXAM_DIM_ID )
    AND ( VARIATION_DIM2.VARIATION_DIM_ID (+)= EXAM_INFO_FACT2.VARIATION_DIM_ID )
    AND ( EXAM_INFO_FACT2.STRUCTURE_DIM_ID = STRUCTURE_DIM2.STRUCTURE_DIM_ID)
    AND ( EXAM_DIM2.END_DATE_DIM_ID=END_DATE_DIM2.DATE_DIM_ID )
    AND ( EXAM_DIM2.REVISION_DATE_DIM_ID=REV_DATE_DIM2.DATE_DIM_ID )
    AND ( EXAM_DIM2.START_DATE_DIM_ID=START_DATE_DIM2.DATE_DIM_ID )
    AND ( HEADING_DIM2.HEADING_DIM_ID= EXAM_INFO_FACT2.HEADING_DIM_ID )
    AND ( MAJOR_HEADING_DIM2.HEADING_ID(+)=HEADING_DIM2.PARENT_HEADING_ID )
    AND ( EXAM_INFO_FACT2.VARIABLE_DIM_ID = VARIABLE_DIM2.VARIABLE_DIM_ID )
    AND ( EXAM_INFO_FACT2.EXAM_DIM_ID = NOTES_FACT2.EXAM_DIM_ID (+) )
    AND ( EXAM_INFO_FACT2.VARIABLE_DIM_ID = NOTES_FACT2.VARIABLE_DIM_ID (+))
    AND ( NOTES_FACT2.NOTE_DIM_ID = NOTES_DIM2.NOTE_DIM_ID (+))
    UNION ALL
    SELECT
    EXAM_DIM2.EXAM_CODE EXAMCODE,
    to_char(START_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') STARTDATE,
    to_char(REV_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') REVDATE,
    to_char(END_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') ENDDATE,
    '' as VARIATION,
    EXAM_DIM2.GROUP_NAME GROUPNAME,
    EXAM_DIM2.BENEFIT_PLAN_NAME PLAN,
    EXAM_DIM2.STANDARD_PLAN_NAME ,
    EXAM_DIM2.CORPORATE_PLAN_NAME CORPPLAN,
    EXAM_DIM2.PRODUCT_NAME PRODUCT,
    '' as PES,
    EXAM_DIM2.EXAM_CODE || ' - ' || to_char(START_DATE_DIM2.FULL_DATE,'MM/DD/YYYY') || ' - ' || nvl(to_char(REV_DATE_DIM2.FULL_DATE,'MM/DD/YYYY'),'N/A')|| ' - ' || to_char(END_DATE_DIM2.FULL_DATE ,'MM/DD/YYYY') CONCAT_DATE,
    MED_DM.NOTES_DIM.NOTE_ID NOTE_ID,
    DECODE (MAJOR_HEADING_DIM2.HEADING_ID ,null,HEADING_DIM2.HEADING_ID,MAJOR_HEADING_DIM2.HEADING_ID) MAJ_HDG_ID ,
    DECODE (MAJOR_HEADING_DIM2.HEADING_ID ,null,HEADING_DIM2.HEADING_TEXT,MAJOR_HEADING_DIM2.HEADING_TEXT) MAJ_HDG_TXT ,
    DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,'',HEADING_DIM2.HEADING_ID) MIN_HDG_ID,
    DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,'',HEADING_DIM2.HEADING_TEXT) MIN_HDG_TXT,
    VARIABLE_DIM2.VARIABLE_ID VARIABLE_ID,
    VARIABLE_DIM2.VARIABLE_SHORT_DESCRIPTION VARIABLE_DESC,
    VARIABLE_DIM2.PROVIDER_ARRANGEMENT_CODE PROVIDERCODE,
    VARIABLE_DIM2.VARIABLE_FORMAT_CODE VARFORMAT,
    MED_DM.NOTES_DIM.NOTE_NAME NOTE_NAME,
    (CASE WHEN ((HEADING_DIM2.PARENT_HEADING_ID is null) AND (NOTES_FACT.VARIABLE_DIM_ID is null))
    THEN DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1)
    ELSE NULL
    END) as MAJHEADNGNOTE,
    -- DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1),'') MAJHEADNGNOTE,
    -- DECODE(HEADING_DIM2.PARENT_HEADING_ID,null,'',DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1)) MINHEADNGNOTE,
    (CASE WHEN ((HEADING_DIM2.PARENT_HEADING_ID is not null) AND (NOTES_FACT.VARIABLE_DIM_ID is null))
    THEN DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1)
    ELSE NULL
    END) as MINHEADNGNOTE,
    (CASE WHEN (NOTES_FACT.VARIABLE_DIM_ID is not null)
    THEN DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1)
    ELSE NULL
    END) as VARIABLENOTE,
    --DECODE(NOTES_FACT.VARIABLE_DIM_ID,null,DBMS_LOB.SUBSTR(MED_DM.NOTES_DIM.NOTE_TEXT,DBMS_LOB.GETLENGTH(MED_DM.NOTES_DIM.NOTE_TEXT),1),'') VARIABLENOTE,   
    '' as VARIABLE_VALUE
    FROM
    MED_DM.DATE_DIM START_DATE_DIM2,
    MED_DM.DATE_DIM END_DATE_DIM2,
    MED_DM.DATE_DIM REV_DATE_DIM2,
    MED_DM.EXAM_DIM EXAM_DIM2,
    MED_DM.NOTES_FACT,
    MED_DM.HEADING_DIM MAJOR_HEADING_DIM2,
    MED_DM.HEADING_DIM HEADING_DIM2,
    MED_DM.VARIABLE_DIM VARIABLE_DIM2,
    MED_DM.NOTES_DIM
    WHERE
    ( MED_DM.NOTES_DIM.NOTE_DIM_ID=MED_DM.NOTES_FACT.NOTE_DIM_ID )
    AND ( MED_DM.NOTES_FACT.VARIABLE_DIM_ID=VARIABLE_DIM2.VARIABLE_DIM_ID (+) )
    AND ( MED_DM.NOTES_FACT.EXAM_DIM_ID=EXAM_DIM2.EXAM_DIM_ID )
    AND ( HEADING_DIM2.HEADING_DIM_ID=MED_DM.NOTES_FACT.HEADING_DIM_ID )
    AND ( EXAM_DIM2.END_DATE_DIM_ID=END_DATE_DIM2.DATE_DIM_ID )
    AND ( EXAM_DIM2.REVISION_DATE_DIM_ID=REV_DATE_DIM2.DATE_DIM_ID )
    AND ( EXAM_DIM2.START_DATE_DIM_ID=START_DATE_DIM2.DATE_DIM_ID )
    AND ( MAJOR_HEADING_DIM2.HEADING_ID(+)=HEADING_DIM2.PARENT_HEADING_ID )
    AND ( MED_DM.NOTES_FACT.HEADING_DIM_ID is not null)
    )B
    WHERE B.EXAMCODE ='G971' and B.STARTDATE = '10/01/2002'
    --and B.RANK =1

    There are probably lots of things you could do to improve the performance of this query. The first thing I would try is applying the filter criteria to each query in the "union all" rather than to the final set of results. The way it is working now is that the inner queries are ranking and returning all rows for all exam codes and all start dates (likely thousands if not millions of rows). Only after all rows have been ranked are the results filtered for the desired examcode.
    Try something like:
    select * from (
      select ...rank() over(...) rnk from (
        select ... from ...
          where examcode='G971' and startdate=to_date('10/01/2002','MM/DD/YYYY') ...
        union all
        select ... from ...
          where examcode='G971' and startdate=to_date('10/01/2002','MM/DD/YYYY') ...
      )
    ) where rnk = 1
    Also, indexes on examcode and/or startdate might help, if they don't already exist.

  • Most efficient way to strip nulls from a Double[] array?

    I'm trying to optimize the performance of some code that needs to accept a Double[] array (or a List<Double>) and fetch the median of the non-null values. I'm using org.apache.commons.math.stat.descriptive.rank.Median to fetch the median after converting to double[]. My question is: how can I most efficiently make this conversion?
    public class MathStatics {
         private static Median median = new Median();
         public static Double getMedian(Double[] doubles) {
              // first pass: count the non-null values
              int numNonNull = 0;
              for (int i = 0; i < doubles.length; i++) {
                   if (doubles[i] != null) numNonNull++;
              }
              // second pass: copy the non-null values into a primitive array
              double[] ds = new double[numNonNull];
              int j = 0;
              for (int i = 0; i < doubles.length; i++) {
                   if (doubles[i] != null) ds[j++] = doubles[i];
              }
              return median.evaluate(ds);
         }
         public static void main(String[] args) {
              Double[] test = new Double[] {null, null, -1.1, 2.2, 5.8, null};
              System.out.println(MathStatics.getMedian(test));
         }
    }
    I'm sure that the code I wrote above is clunky and amateurish, so I'd really appreciate some insight into how to make improvements. FWIW, the arrays will typically range in size from ~1-15,000 doubles.
    Thanks!
    Thanks!

    There's no need to loop over the array twice:
              int numNonNull = 0;
              double[] ds = new double[numNonNull];
              for (int i = 0; i < doubles.length; i++) {
                   if (doubles[i] != null) {
                        numNonNull++;
                        ds[i] = doubles[i];
                   }
              }
    Except that ds will have length zero, so you'll get an ArrayIndexOutOfBoundsException every time you use it.
    As you're using Doubles rather than doubles, you can add the non-null values to an ArrayList and then convert it to an array at the end.
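    A minimal sketch of that ArrayList approach. To keep it self-contained it computes the median directly with the standard library instead of Commons Math's Median (the class name here is illustrative); sorting is O(n log n), which is perfectly adequate at ~15,000 elements:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MedianNoNulls {
    // Collect the non-null values, then sort and take the middle element(s).
    public static Double getMedian(Double[] doubles) {
        List<Double> nonNull = new ArrayList<>();
        for (Double d : doubles) {
            if (d != null) nonNull.add(d);
        }
        if (nonNull.isEmpty()) return null; // no data, no median

        // Unbox into a primitive array for sorting
        double[] ds = new double[nonNull.size()];
        for (int i = 0; i < ds.length; i++) ds[i] = nonNull.get(i);
        Arrays.sort(ds);

        int n = ds.length;
        return (n % 2 == 1) ? ds[n / 2]
                            : (ds[n / 2 - 1] + ds[n / 2]) / 2.0;
    }

    public static void main(String[] args) {
        Double[] test = new Double[] {null, null, -1.1, 2.2, 5.8, null};
        System.out.println(getMedian(test)); // median of {-1.1, 2.2, 5.8}
    }
}
```

    If you still want Commons Math to do the work, you can hand `ds` to `median.evaluate(ds)` at the end instead of computing the middle element yourself.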

  • Implementing Rank / Partitioned Index

    Has anyone implemented a partitioned RANK in Power Query? I understand I can Sort and then create an Index, but how can I define partitions for the index? Is there some way I can make it restart with each new partition value?
    For instance, I want to rank company performance in each sport:
    let
    Source = Table.FromRecords(
    {[Company = "Company A", Event = "Swimming", Points = 8],
    [Company = "Company A", Event = "Running", Points = 5],
    [Company = "Company B", Event = "Swimming", Points = 10],
    [Company = "Company B", Event = "Running", Points = 2],
    [Company = "Company C", Event = "Swimming", Points = 2],
    [Company = "Company C", Event = "Running", Points = 4] })
    in
    #"Source"
    Is there a way to do this, besides manually filtering to a single sport, doing the Sort + Index, repeating this step for each sport, and then appending these steps?

    Since there is a built-in rank function in Power Pivot, and you have more control over filtering etc., Power Pivot is the better option if you have the choice. On the other hand, Curt provides a fine solution if you need to do the equivalent in Power Query.
    It appears, though, that you'd rather not have an index column and the lot. An alternative solution involves a couple of portable custom functions.
    ListRankEqual:
    (inputValue as any, inputSeries as list, optional orderDescending as nullable logical) as number =>
    let
        SortedSeries = if orderDescending = null or not orderDescending then
                        List.Sort(inputSeries,Order.Descending)
                     else
                        List.Sort(inputSeries),
        RankEqual = List.PositionOf(SortedSeries,inputValue)+1
    in
        RankEqual
    The function has two mandatory parameters and one optional parameter. In your example, inputValue would be the current value in the Points column, specified as [Points] when passed to the function. inputSeries would be the entire Points column, or in this
    case, a subset of the Points column. In the body of the function, we sort inputSeries in
    descending order to get the rank in ascending order and vice versa.
    The second function I call GroupList. I tend to use this function when I want to group a column, but retain the original columns in a table. In your example, we want to rank Swimming and Running independently. Therefore, we need
    to create a Swimming points list and a Running points list for the ListRankEqual function.
    The code for GroupList:
    (keyColumnList as list, keyColumnValue as any, associatedColumnList as list)=>
    let
        FilteredKeyColumnPositions = List.PositionOf(keyColumnList, keyColumnValue, Occurrence.All),
        FilteredGroupList = List.Transform(FilteredKeyColumnPositions, each associatedColumnList{_})
    in
       FilteredGroupList
    keyColumnList is the list of values in the grouping column. In this case, it would be specified as
    Source[Event]. keyColumnValue is the current value in the Event column, specified as [Event]. associatedColumnList is the column with the values you want to rank (Source[Points]).
    GroupList first finds all the positions of the current value in the Event column. It then uses List.Transform to produce a corresponding Point list. The Swimming event will produce a list containing all of the points for Swimming, and the Running event
    will produce a list containing all of the points for Running.
    The complete custom formula would be:
    ListRankEqual([Points],GroupList(Source[Event],[Event],Source[Points]))
    You just toss the formula into a new custom column.

  • How to use the RANK function?

    Hello everyone,
    here is the query I run in SQL Developer using the RANK function.
    SELECT Empno, Ename, Job, Mgr, Hiredate, Sal
    FROM
    (SELECT Empno, Ename, Job, Mgr, Hiredate, Sal,
    RANK() OVER
    (ORDER BY SAL Desc NULLS LAST) AS Emp_Rank
    FROM Emp
    ORDER BY SAL Desc NULLS LAST)
    WHERE Emp_Rank < 6;
    How can I use this query in my report in OBIEE, or is there any replacement for the RANK() function in OBIEE that I can use to get the same result?
    Thanks

    Kuldip wrote:
    Thanks, you are absolutely correct. However, by doing this I am getting output which I was not expecting.
    Students Marks Rank
    student1 95 1
    student2 95 1
    student3 93 3
    student4 93 3
    student5 91 5
    The output should be as instead
    Students Marks Rank
    student1 95 1
    student2 95 1
    student3 93 2
    student4 93 2
    student5 91 3
    Can It be done like this ?
    Thanks.
    Edited by: Kuldip on Mar 15, 2012 11:51 PM
    Hi Boss,
    I think you copied the above scenario from this site..
    http://oracle-bi.siebelunleashed.com/articles/rank-and-dense-rank-functionsobiee/
    Then why are you asking how to do this? Are you testing us? Doesn't that site say how to achieve this?
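    For reference, the no-gaps numbering the poster asks for is exactly what SQL's DENSE_RANK() provides, whereas RANK() leaves gaps after ties. The difference can be sketched outside the database (class and method names here are illustrative, and the scores are assumed to be pre-sorted descending):

```java
import java.util.Arrays;

public class RankDemo {
    // RANK: ties share a rank, and the next distinct value skips ahead (1,1,3).
    public static int[] rank(int[] scoresDesc) {
        int[] r = new int[scoresDesc.length];
        for (int i = 0; i < scoresDesc.length; i++) {
            r[i] = (i > 0 && scoresDesc[i] == scoresDesc[i - 1]) ? r[i - 1] : i + 1;
        }
        return r;
    }

    // DENSE_RANK: ties share a rank, and the next distinct value increments by 1 (1,1,2).
    public static int[] denseRank(int[] scoresDesc) {
        int[] r = new int[scoresDesc.length];
        for (int i = 0; i < scoresDesc.length; i++) {
            r[i] = (i == 0) ? 1
                 : (scoresDesc[i] == scoresDesc[i - 1]) ? r[i - 1] : r[i - 1] + 1;
        }
        return r;
    }

    public static void main(String[] args) {
        int[] marks = {95, 95, 93, 93, 91}; // already sorted descending
        System.out.println(Arrays.toString(rank(marks)));      // [1, 1, 3, 3, 5]
        System.out.println(Arrays.toString(denseRank(marks))); // [1, 1, 2, 2, 3]
    }
}
```

    In SQL terms, swapping `RANK() OVER (ORDER BY Marks DESC)` for `DENSE_RANK() OVER (ORDER BY Marks DESC)` produces the second output.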

  • One-to-many self-join removing records with the same ranking or with a substitute

    Sorry for my bad choice of discussion title; feel free to suggest a more pertinent one.
    I've rewritten the post for clarity, following the FAQ.
    DB Version
    I'm using Oracle Enterprise 10g 10.2.0.1.0 64bit
    Tables involved
    CREATE TABLE wrhwr (
    wr_id INTEGER PRIMARY KEY,
    eq_id VARCHAR2(50) NULL,
    date_completed DATE NULL,
    status VARCHAR2(20) NOT NULL,
    pmp_id VARCHAR2(20) NOT NULL,
    description VARCHAR2(20) NULL);
    Sample data
    INSERT INTO wrhwr VALUES (1,'MI-EXT-0001',date'2013-07-16','Com','VER-EXC','Revisione');
    INSERT INTO wrhwr VALUES (2,'MI-EXT-0001',date'2013-07-01','Com','VER-EXC','Verifica');
    INSERT INTO wrhwr VALUES (3,'MI-EXT-0001',date'2013-06-15','Com','VER-EXC','Revisione');
    INSERT INTO wrhwr VALUES (4,'MI-EXT-0001',date'2013-06-25','Com','VER-EXC','Verifica');
    INSERT INTO wrhwr VALUES (5,'MI-EXT-0001',date'2013-04-14','Com','VER-EXC','Revisione');
    INSERT INTO wrhwr VALUES (6,'MI-EXT-0001',date'2013-04-30','Com','VER-EXC','Verifica');
    INSERT INTO wrhwr VALUES (7,'MI-EXT-0001',date'2013-03-14','Com','VER-EXC','Collaudo');
    Query used
    SELECT *
      FROM (SELECT eq_id,
                   date_completed,
                   RANK ()
                   OVER (PARTITION BY eq_id
                         ORDER BY date_completed DESC NULLS LAST)
                      rn
              FROM wrhwr
             WHERE     status != 'S'
                   AND pmp_id LIKE 'VER-EX%'
                   AND description LIKE '%Verifica%') table1,
           (SELECT eq_id,
                   date_completed,      
                   RANK ()
                   OVER (PARTITION BY eq_id
                         ORDER BY date_completed DESC NULLS LAST)
                      rn
              FROM wrhwr
             WHERE     status != 'S'
                   AND pmp_id LIKE 'VER-EX%'
                   AND description LIKE '%Revisione%') table2,
           (SELECT eq_id,
                   date_completed,           
                   RANK ()
                   OVER (PARTITION BY eq_id
                         ORDER BY date_completed DESC NULLS LAST)
                      rn
              FROM wrhwr
             WHERE     status != 'S'
                   AND pmp_id LIKE 'VER-EX%'
                   AND description LIKE '%Collaudo%') table3
    WHERE     table1.eq_id = table3.eq_id
           AND table2.eq_id = table3.eq_id
           AND table1.eq_id = table2.eq_id
    The purpose of the above query is to self-join the wrhwr table 3 times in order to have, for every row:
    eq_id;
    completion date of a work request of type Verifica for this eq_id (table1 alias);
    completion date of a wr of type Revisione (table2 alias) for this eq_id;
    completion date of a wr of type Collaudo (table3 alias) for this eq_id;
    A distinct eq_id:
    can have many work requests (wrhwr records) with different completion dates or without a completion date (date_completed column NULL);
    in a date range can have all the types of wrhwr ('Verifica', 'Revisione', 'Collaudo') or some of them (e.g. Verifica and Revisione but not Collaudo, Collaudo but not Verifica and Revisione, etc.);
    substrings in description shouldn't repeat;
    (eq_id,date_completed) aren't unique but (eq_id,date_completed,description,pmp_id) should be unique;
     Expected output
     Using the sample data above I expect this output:
     eq_id, table1.date_completed, table2.date_completed, table3.date_completed
     MI-EXT-001,2013-07-01,2013-07-16,2013-03-14 <--- for this eq_id table3 has only 1 row, not 3. I want to repeat the top-ranked value of table3 for every result row
     MI-EXT-001,2013-07-01,2013-06-15,2013-03-14 <-- I don't want this row, because table1 and table2 both have 3 rows, so the match must be in rank terms: (1st,1st), (2nd,2nd), (3rd,3rd)
     MI-EXT-001,2013-06-25,2013-06-15,2013-03-14 <-- 2nd rank of table1 joins 2nd rank of table2
    MI-EXT-001,2013-04-30,2013-04-14,2013-03-14 <-- 1st rank table1, 1st rank table2, 1st rank table3
     In vector-style syntax, the expected tuple output must be:
     ix = i-th ranking of tableX
     (i1, i2, i3) IF there EXISTS an i-th ranking row in every table
     ELSE
     (i1, b, b)
     where b is the first available lower ranking of table2, OR NULL if there isn't any row of lower ranking.
     Any clues?
     With the query above I'm unable to remove the "spurious" rows.
     I'm thinking of a solution based on analytic functions like LAG() and LEAD(), or using ROLLUP() or CUBE(), or a nested query, but I would prefer a solution that is elegant, fast, and easy to maintain.
    Thanks
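One way to kill the spurious rows is to rank each work-request type separately and then align the three streams on the rank itself, instead of cross-joining three copies of the table. The sketch below is illustrative only: it runs on SQLite via Python's stdlib rather than Oracle, the sample data is invented from the expected output above, and it collapses the three self-joins into a single pass with a conditional aggregation keyed on (eq_id, rn). It does not yet repeat the last Collaudo date down the lower ranks; in Oracle that final step could be done with LAST_VALUE(... IGNORE NULLS), which SQLite lacks.

```python
# Align the three ranked streams on (eq_id, rn) with one conditional
# aggregation instead of a triple self-join. Data invented for the demo.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wrhwr (eq_id TEXT, date_completed TEXT, description TEXT);
INSERT INTO wrhwr VALUES
  ('MI-EXT-001', '2013-07-01', 'Verifica'),
  ('MI-EXT-001', '2013-06-25', 'Verifica'),
  ('MI-EXT-001', '2013-04-30', 'Verifica'),
  ('MI-EXT-001', '2013-07-16', 'Revisione'),
  ('MI-EXT-001', '2013-06-15', 'Revisione'),
  ('MI-EXT-001', '2013-04-14', 'Revisione'),
  ('MI-EXT-001', '2013-03-14', 'Collaudo');
""")

rows = con.execute("""
WITH ranked AS (
  SELECT eq_id, description, date_completed,
         RANK() OVER (PARTITION BY eq_id, description
                      ORDER BY date_completed DESC) AS rn
    FROM wrhwr)
SELECT eq_id,
       MAX(CASE WHEN description = 'Verifica'  THEN date_completed END) AS verifica,
       MAX(CASE WHEN description = 'Revisione' THEN date_completed END) AS revisione,
       MAX(CASE WHEN description = 'Collaudo'  THEN date_completed END) AS collaudo
  FROM ranked
 GROUP BY eq_id, rn
 ORDER BY eq_id, rn
""").fetchall()
for r in rows:
    print(r)
```

Only one row per (eq_id, rank) comes out, so the (1st,2nd)-style mismatches never appear; the missing Collaudo dates stay NULL until a fill-down step is added.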

     FrankKulash wrote:
    About duplicate dates: I was most interested in what you wanted when 2 (or more) rows with the same eq_id and row type (e.g. 'Collaudo') had exactly the same completed_date.
     In the new results, did you get the columns mixed up? It looks like the row with eq_id='MI-EXT-0002' has 'Collaudo' in the description, but the date appears in the verifica column of the output, not the collaudo column.
     Why don't you want 'MI-EXT-0001' in the results? Is it related to the non-unique date?
     For all optimization questions, see the forum FAQ: https://forums.oracle.com/message/9362003
    If you can explain what you need to do in the view (and post some sample data and output as examples) then someone might help you find a better way to do it.
    It looks like there's a lot of repetition in the code.  Whatever you're trying to do, I suspect there's a simpler, more efficient way to do it.
     About duplicate dates: the query must show ONLY one date_completed and ignore duplicates. Those records are "bad data". You can't have 2 Collaudos with the same date_completed.
     Collaudo stands for equipment check. A craftsperson does an equipment check once a day and, with a mobile app, updates the work request related to the equipment and to the preventive-maintenance procedure, so it is impossible by design to complete more than one check (Collaudo) in a day.
     In the new results, it's my fault: while typing I swapped the columns.
     With "I don't want 'MI-EXT-0001'" I mean: "I don't want to show MI-EXT-0001 AGAIN." In the previous post the output including MI-EXT-0001 was correct.
    Regarding optimisation...
     The repetition of
     LAST_VALUE (
         MIN (CASE WHEN r_type = <value> THEN column_name END) IGNORE NULLS)
         OVER (PARTITION BY eq_id ORDER BY r_num) AS alias_column_name
     (where <value>, column_name and alias_column_name are placeholders) is because I don't know another feasible way to get all the needed columns of the wrhwr table into the main query while maintaining the correct order. So I select them in got_r_type and propagate them through all the subqueries.
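As an aside, the LAST_VALUE(... IGNORE NULLS) fill-down can also be written with a running COUNT() of the non-NULL values, which avoids IGNORE NULLS entirely (handy on engines that don't support it). A toy illustration with invented table and column names, runnable on SQLite via Python's stdlib:

```python
# Emulating LAST_VALUE(val IGNORE NULLS) OVER (PARTITION BY eq_id ORDER BY r_num):
# COUNT(val) increments only at a non-NULL row, so all rows in the same "grp"
# share exactly one non-NULL value, which MAX() then picks for the whole group.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE t (eq_id TEXT, r_num INTEGER, val TEXT);
INSERT INTO t VALUES ('E1', 1, 'A'), ('E1', 2, NULL),
                     ('E1', 3, NULL), ('E1', 4, 'B');
""")

rows = con.execute("""
WITH marked AS (
  SELECT eq_id, r_num, val,
         COUNT(val) OVER (PARTITION BY eq_id ORDER BY r_num) AS grp
    FROM t)
SELECT eq_id, r_num,
       MAX(val) OVER (PARTITION BY eq_id, grp) AS filled
  FROM marked
 ORDER BY eq_id, r_num
""").fetchall()
print(rows)  # the NULL gaps carry the last preceding non-NULL value forward
```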
     In the main query I join the eq table (which contains all the information about a specific equipment) with the "correct" dates and columns of the wrhwr table.
     I filter the eq table on the specific equipment standard (eq_std column).
     The efm_eq_tablet table and the WHERE clause
     AND e.eq_id = e2.eq_id
               AND e2.is_active = 'S';
     mean: show only rows of the eq table that have a matching row in the efm_eq_tablet table AND are active (represented by the value 'S' in the is_active column).
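That "show only rows that have a matching active row" condition is the textbook case for EXISTS, which also avoids accidental duplication of eq rows when efm_eq_tablet happens to hold more than one matching row per eq_id. A minimal sketch with invented sample data (SQLite via Python's stdlib, not the real schema):

```python
# Filter eq rows by the existence of a matching active efm_eq_tablet row,
# instead of joining (a join would duplicate eq rows when several tablet
# rows match the same eq_id).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE eq            (eq_id TEXT);
CREATE TABLE efm_eq_tablet (eq_id TEXT, is_active TEXT);
INSERT INTO eq VALUES ('MI-EXT-001'), ('MI-EXT-002'), ('MI-EXT-003');
INSERT INTO efm_eq_tablet VALUES
  ('MI-EXT-001', 'S'),  -- active: kept
  ('MI-EXT-002', 'N');  -- inactive: dropped, as is MI-EXT-003 (no row at all)
""")

rows = con.execute("""
SELECT e.eq_id
  FROM eq e
 WHERE EXISTS (SELECT 1
                 FROM efm_eq_tablet e2
                WHERE e2.eq_id = e.eq_id
                  AND e2.is_active = 'S')
""").fetchall()
print(rows)
```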
    About the tables v2, r2 and c2
              (SELECT doc_data, doc_data_rinnovo, eq_id 
                 FROM efm_doc_rep edr 
                WHERE edr.csi_id = '1011503' AND edr.doc_validita_temp = 'LIM') v2, 
              (SELECT doc_data, doc_data_rinnovo, eq_id 
                 FROM efm_doc_rep edr 
                WHERE     eq_id = edr.eq_id 
                      AND edr.csi_id = '1011504' 
                      AND edr.doc_validita_temp = 'LIM') r2, 
              (SELECT doc_data, doc_data_rinnovo, eq_id 
                 FROM efm_doc_rep edr 
                WHERE     edr.csi_id IN ('1011505', '1011507') 
                      AND edr.doc_validita_temp = 'LIM' 
                      AND adempimento_ok = 'SI') c2, 
     Those tables contain "alternate" completion dates, to be used when there is no wrhwr row for an eq_id OR when all the date_completed values are NULL.
     The NVL() and NVL2() functions are used in the main query to implement this.
     The CASE/WHEN blocks inside the main query implement the selection of the correct date based on the above conditions.
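The fallback described above (use the wrhwr date when present, otherwise the "alternate" date from efm_doc_rep) is exactly what NVL/COALESCE expresses after an outer join. A tiny sketch with made-up table names and data (SQLite via Python's stdlib; Oracle's NVL(a, b) behaves like COALESCE(a, b) here):

```python
# COALESCE picks the work-request completion date when it exists and
# falls back to the document date otherwise -- the NVL() pattern
# described for the main query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wr_dates  (eq_id TEXT, date_completed TEXT);
CREATE TABLE doc_dates (eq_id TEXT, doc_data TEXT);
INSERT INTO wr_dates  VALUES ('E1', '2013-07-01'), ('E2', NULL);
INSERT INTO doc_dates VALUES ('E1', '2013-01-01'), ('E2', '2013-02-01');
""")

rows = con.execute("""
SELECT w.eq_id,
       COALESCE(w.date_completed, d.doc_data) AS best_date
  FROM wr_dates w
  LEFT JOIN doc_dates d ON d.eq_id = w.eq_id
 ORDER BY w.eq_id
""").fetchall()
print(rows)
```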
