Lag Function

Hi everyone,
I am trying to use the LAG function for a particular purpose, but I am not able to achieve what I want. Below is an example of what I need to achieve using the LAG function.
Account_number  cat_desc        pr_code  product_desc       datetime
999999          Freshmilk       1001     Meiji Fresh milk   02/08/2011
999999          Drinking water  1002     DRINKINGWATER1.5L  03/08/2011
999999          Freshmilk       1001     Meiji Fresh milk   10/08/2011
I would like to find, in a separate column, the difference in days between the dates on which he bought "Meiji Fresh milk". I would like to get the result as below:
Account_number  cat_desc        pr_code  product_desc       datetime    Days
999999          Freshmilk       1001     Meiji Fresh milk   02/08/2011  null
999999          Drinking water  1002     DRINKINGWATER1.5L  03/08/2011  0
999999          Freshmilk       1001     Meiji Fresh milk   10/08/2011  8
Could you please help?
Thanks in advance

WITH t AS
     (SELECT 999999 a, 'Freshmilk' b, 1001 c, 'Meiji Fresh milk' d,
             DATE '2011-08-02' dif
        FROM DUAL
      UNION ALL
      SELECT 999999, 'Drinking water', 1002, 'DRINKINGWATER1.5L',
             DATE '2011-08-03'
        FROM DUAL
      UNION ALL
      SELECT 999999, 'Freshmilk', 1001, 'Meiji Fresh milk', DATE '2011-08-10'
        FROM DUAL)
SELECT a, b, c, d,
       -- subtracting DATE values gives the difference in days directly;
       -- subtracting TO_CHAR(...,'DD') values breaks across month boundaries
       dif - LAG(dif) OVER (PARTITION BY d ORDER BY dif) AS days
  FROM t;
     A  B               C     D                  DAYS
999999  Drinking water  1002  DRINKINGWATER1.5L
999999  Freshmilk       1001  Meiji Fresh milk
999999  Freshmilk       1001  Meiji Fresh milk   8
Do you want zero for the second row?
The LAG function finds no prior record for that group, hence NULL; if you want 0, you can hard-code it.
Edited by: user2361373 on Nov 11, 2011 1:47 PM
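If a 0 is wanted where no prior purchase exists, LAG's optional third argument supplies a default. A sketch using the sample data above (defaulting the previous date to the current one, so the first purchase of each product comes out as 0 rather than NULL):

```sql
-- Sketch: LAG(expr, offset, default) -- the third argument is used for
-- rows that have no preceding row in their partition.
WITH t AS
     (SELECT 999999 a, 'Freshmilk' b, 1001 c, 'Meiji Fresh milk' d,
             DATE '2011-08-02' dif
        FROM DUAL
      UNION ALL
      SELECT 999999, 'Drinking water', 1002, 'DRINKINGWATER1.5L',
             DATE '2011-08-03'
        FROM DUAL
      UNION ALL
      SELECT 999999, 'Freshmilk', 1001, 'Meiji Fresh milk', DATE '2011-08-10'
        FROM DUAL)
SELECT a, b, c, d, dif,
       dif - LAG(dif, 1, dif) OVER (PARTITION BY d ORDER BY dif) AS days
  FROM t;
```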

Similar Messages

  • Using the LAG function

    Hi all,
    using: Oracle9i Enterprise Edition Release 9.0.1.3.
    I have a table that looks basically like this:
    SQL> select labour_rate_id, organisation_id, commence_date_time, termination_date_time, labour_rate
      2    from labour_rate
      3  where organisation_id = 3265022
      4  order by commence_date_time;
    LABOUR_RATE_ID ORGANISATION_ID COMMENCE_ TERMINATI LABOUR_RATE
              7095         3265022 20-APR-05 30-OCT-07        43.5
              9428         3265022 01-JAN-06 31-DEC-07        43.5
             10762         3265022 01-NOV-07 31-DEC-08       55.52
    As you can see with the first organisation, some of the dates overlap - i.e. the second-last termination date is between the start and end dates of the last record.
    I'm writing an update statement to make the previous termination date, one day less than the current commence date.
    I cannot rely on the labour rate Id to provide any sequence, it's just a coincidence that these are in order. I can rely on the commence date to be accurate.
    here's what I have so far:
    select * from (select organisation_id,
                          labour_rate_id,
                          commence_date_time,
                          termination_date_time,              
                          lag(labour_rate_id) over (order by organisation_id, commence_date_time) as prev_labour_rate_id,
                          lag(termination_date_time) over(order by organisation_id, commence_date_time) as prev_term_date_time
                     from labour_rate)      
    where prev_term_date_time between commence_date_time and termination_date_time
    This should select all those where the previous termination date is between the start and end date of the current one. The only obvious problem with this is:
    LABOUR_RATE_ID ORGANISATION_ID COMMENCE_ TERMINATI LABOUR_RATE
             10742         4406709 01-NOV-06 31-DEC-07          40
             10743         4406711 18-DEC-06 31-DEC-07          46
             10750         4415820 31-OCT-07 31-DEC-08        4.75
    Most of the data is like this. I only want to get the non-contiguous dates for each organisation ID; the above query will pick up rows like these because the lag function doesn't take distinct organisation IDs into account.
    this works:
    select  lrp.labour_rate_id,
            lrp.organisation_id,       
            lr.commence_date_time,
            lr.termination_date_time,
            lrp.labour_rate_id as prev_labour_rate_id,
            lrp.termination_date_time as prev_term_date_time
       from labour_rate lr,
            labour_rate lrp
      where lrp.organisation_id = lr.organisation_id
        and lrp.commence_date_time = (select max(commence_date_time)
                                    from labour_rate lrs
                                   where lrs.organisation_id = lr.organisation_id
                                         and lrs.commence_date_time < lr.commence_date_time)
        and lrp.termination_date_time > lr.commence_date_time
    But it makes me cringe. There surely is a more elegant way of doing this?
    ultimately I want to be able to set the termination date equal to the day before the subsequent commencement date for the same organisation where the termination date is between the subsequent commencement and end dates.
    would someone be able to point me in the right direction?

    Well, for a start,
    lag(labour_rate_id) over (order by organisation_id, commence_date_time)
    should be
    lag(labour_rate_id) over (partition by organisation_id order by commence_date_time)
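    Folding that correction into the original query (a sketch, assuming the labour_rate table as described above):

    ```sql
    -- Sketch: with the lag partitioned per organisation, the "previous" row
    -- is always the previous row of the same organisation_id, so unrelated
    -- organisations no longer leak into the comparison.
    select *
      from (select organisation_id,
                   labour_rate_id,
                   commence_date_time,
                   termination_date_time,
                   lag(labour_rate_id) over
                       (partition by organisation_id
                        order by commence_date_time) as prev_labour_rate_id,
                   lag(termination_date_time) over
                       (partition by organisation_id
                        order by commence_date_time) as prev_term_date_time
              from labour_rate)
     where prev_term_date_time between commence_date_time
                                   and termination_date_time;
    ```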

  • Lag function in oracle

    Hi,
    I am using lag function to display values like below:
    order details date starttime
    main order 1 07/10/12 06:00am
    line 1 07/10/12 06:21am
    line 2 07/10/12 06:31am
    main order 2 07/11/12 07:00am
    line 1 07/11/12 07:01am
    line 2 07/11/12 07:02am
    The data displays correctly when I use the lag function, except that the line 1 details never get displayed, i.e. the first line under every order does not get displayed.
    Is using the lag function correct in this case? Where is the mistake? Any help would be appreciated.
    Thanks

    Hi,
    880122 wrote:
    One caveat if I use union like you suggested above is I need to join v_orders and v_ord_line by order_number. Can you please suggest how I can achieve the above result. How should my query look like?
    When talking about relational databases, JOIN means to create a row of output with elements from 2 or more different rows, usually from 2 or more different tables. It doesn't look like you need to do that for this Master-Detail report. All the output you want for each master row comes from a single row of v_orders, and all the output you need for each detail row comes from a single row of v_ord_line. Why do you think you need to join tables? Are you using "join" to mean something else? If so, what?
    What's wrong with the query I posted?
    Point out where it's producing the wrong results, and explain how you get the right results in those places. (That means you'll have to post the right results.) Post new sample data, if you need to.
    Please use \ tags to make your messages readable. See the forum FAQ {message:id=9360002}

  • Lag function not working in calculated measure

    I am facing a strange problem while using "Lag" function in calculated measure.
    I have a time dimension which consists of date, workday, financial week, Financial Month and Financial Year.
    The concept of a workday is that it is an integer number which represents a date. In the case of holidays this number does not change, and remains constant until the next working day.
    Here the workday number remains 460 because 20120708 is a holiday; 27 and 28 denote the FinancialWeek.
    Now basically I want to find the previous workday's count.
    with member [Measures].[Prev Count]
    AS
    sum(
              existing [Date].[Workdayno].[Workdayno].Members,
     ([Date].[Workdayno].lag(1),[Measures].[Fact Count]))
    Select { [Measures].[Prev Count]} on 0,
    {[Date].[Calendar].Members} on 1 from [My Cube]
    Where([Date].[Calendar].&[20120709])
    What I expect is that it will give me the sum of counts where the workday number is 460.
    But it is showing a null value.
    with member [Measures].[PrevCount]
    As
    SUM(EXISTING [Processing Date].[Workday].[Workday].MEMBERS,([Processing Date].[Workday].LAG(1),[Measures].[Sequenced Records]))
    SELECT {[Measures].[PrevCount]} ON 0 FROM --[E2Eitem]
    WHERE([Processing Date].[E2E Calendar].[Date].&[20120709])
    It looks like the EXISTING function is failing to filter the members related to the date?
    Any suggestions?

    Try using
    with member [Measures].[Prev Count]
    AS
    sum(
              existing [Date].[Workdayno].[Workdayno].Members,
     ([Date].[Workdayno].currentmember.lag(1),[Measures].[Fact Count]))
    Victor Rocca

  • SQL Query - Using Lag function

    Hi All,
    I need to select the previous and next record whenever the value of a column changes, for each account.
    My table structure is as follows:
    Account_No number(10)
    Payment_type number(5)
    Installment_Type number(5)
    Date_change date
    Sample record:
    Account_No Payment_Type Installment_Type Date_change
    70539 ** 1 ** 2 ** 01-OCT-83
    70539 ** 1 ** 2 ** 03-FEB-01
    70539 ** 1 ** 2 ** 26-APR-02
    70539 ** 1 ** 1 ** 21-JUN-02
    70539 ** 1 ** 2 ** 12-JUL-02
    185562 ** 1 ** 2 ** 23-APR-02
    185562 ** 2 ** 2 ** 10-MAY-02
    In the above sample data, the value of Installment_Type changed on 21-JUN-02 and 12-JUL-02 for account 70539. Also, the value of Payment_Type changed on 23-APR-02 for account 185562.
    So, my output should be like this.
    Account_No Payment_Type Installment_Type Date_change
    70539 ** 1 ** 2 ** 26-APR-02
    70539 ** 1 ** 1 ** 21-JUN-02
    70539 ** 1 ** 2 ** 12-JUL-02
    185562 ** 1 ** 2 ** 23-APR-02
    185562 ** 2 ** 2 ** 10-MAY-02
    I tried with the lag function, but I couldn't succeed. I'm using Oracle version 8.1.6.
    Can anyone help me to achieve this?
    ** To distinguish the value for each column.
    Thanks and regards,
    Vijay R.

    With the information you have provided, I've come up with the following.
    SELECT A.ACCOUNT_NO, A.PAYMENT_TYPE, A.INSTALLMENT_TYPE, A.DATE_CHANGE
    FROM
        (SELECT account_no, payment_type, installment_type, date_change,
                LEAD( (payment_type), 1)
                     over (partition by account_no order by account_no, DATE_CHANGE)  LEAD_PAY,
                LEAD( (installment_type), 1)
                     over (partition by account_no order by account_no, DATE_CHANGE)  LEAD_INST
          from T_ACCNTS ) A
    WHERE A.PAYMENT_TYPE <> NVL(A.LEAD_PAY,99)
       OR A.INSTALLMENT_TYPE <> NVL(A.LEAD_INST,99)
    ORDER BY 1, 4;
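    For reference, the query above can be tried self-contained against the sample rows from the question (a sketch; the WITH clause stands in for T_ACCNTS and needs 9i or later, so on 8.1.6 an inline view would be used for the sample data instead):

    ```sql
    WITH t_accnts AS (
      SELECT 70539 account_no, 1 payment_type, 2 installment_type,
             DATE '1983-10-01' date_change FROM DUAL UNION ALL
      SELECT 70539,  1, 2, DATE '2001-02-03' FROM DUAL UNION ALL
      SELECT 70539,  1, 2, DATE '2002-04-26' FROM DUAL UNION ALL
      SELECT 70539,  1, 1, DATE '2002-06-21' FROM DUAL UNION ALL
      SELECT 70539,  1, 2, DATE '2002-07-12' FROM DUAL UNION ALL
      SELECT 185562, 1, 2, DATE '2002-04-23' FROM DUAL UNION ALL
      SELECT 185562, 2, 2, DATE '2002-05-10' FROM DUAL)
    SELECT account_no, payment_type, installment_type, date_change
      FROM (SELECT account_no, payment_type, installment_type, date_change,
                   LEAD(payment_type)     OVER (PARTITION BY account_no
                                                ORDER BY date_change) lead_pay,
                   LEAD(installment_type) OVER (PARTITION BY account_no
                                                ORDER BY date_change) lead_inst
              FROM t_accnts)
     WHERE payment_type     <> NVL(lead_pay, 99)
        OR installment_type <> NVL(lead_inst, 99)
     ORDER BY 1, 4;
    ```

    Each kept row is either immediately before a change (its LEAD differs) or the last row of its account (LEAD is NULL, and NVL makes it differ), which matches the desired output above.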

  • LAG Function not accepting a default value

    Hi,
    I'm trying to create a calculation using the LAG function, and it won't allow me to enter a default value. Currently the condition is LAG(Work Order Headers WOI.Service Dt,1) OVER(PARTITION BY Work Order Headers WOI.Equipment Ident ORDER BY Work Order Headers WOI.Service Dt), but I want/need it to be LAG(Work Order Headers WOI.Service Dt,1,trunc(sysdate)) OVER(PARTITION BY Work Order Headers WOI.Equipment Ident ORDER BY Work Order Headers WOI.Service Dt), i.e. adding the 'default' value of trunc(sysdate). I have tried other examples not using dates, and I get the same problem: a "Too many arguments supplied to function LAG" error.
    We are using Discoverer 10.1.2.45.46c.
    Is this a known problem with our version ?

    Further: I did try to create a custom folder in the Admin layer, and it allows the 'default' value.
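    For reference, in plain Oracle SQL the three-argument form of LAG is accepted, so the limitation appears to be in Discoverer's expression parser rather than the database. A minimal sketch (using the classic SCOTT demo table emp for illustration):

    ```sql
    -- Sketch: LAG(value, offset, default); the third argument is returned
    -- for rows that have no preceding row in their partition.
    SELECT deptno, ename, hiredate,
           LAG(hiredate, 1, TRUNC(SYSDATE))
               OVER (PARTITION BY deptno ORDER BY hiredate) AS prev_service_dt
      FROM emp;
    ```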

  • Misconception about the LAG function's ... functionality?

    So I want to look at the value of a column on a current row, and the value of the same column on the most recently entered row, prior to the current. The problem I'm encountering is that for the current row, I'm only interested in those added last month. So, if the record that I'm seeking in the LAG function is prior to last month, it gets rejected from the resultset.
    create table check_lag
    (filenr number, code varchar2(2), create_date date);
    insert into check_lag values (1,'02',to_date('9/5/2008','MM/DD/YYYY')); -- current
    insert into check_lag values (1,'01',to_date('9/1/2008','MM/DD/YYYY')); --lag record, same month
    insert into check_lag values (2,'02',to_date('9/10/2008','MM/DD/YYYY'));-- current
    insert into check_lag values (2,'01',to_date('8/10/2008','MM/DD/YYYY'));-- lag record, prior month
    So this query's results made sense:
    SELECT FILENR, CODE,
           LAG( CODE ) OVER( PARTITION BY FILENR ORDER BY FILENR, CREATE_DATE ) AS PRIOR_CODE,
           CREATE_DATE
    FROM   CHECK_LAG;
    FILENR CODE PRIOR_CODE CREATE_DATE
    1      01              9/1/2008
    1      02   01         9/5/2008
    2      01              8/10/2008
    2      02   01         9/10/2008
    But as soon as I add a WHERE clause which sets a boundary around last month, I exclude a LAG record:
    SELECT FILENR, CODE,
           LAG( CODE ) OVER( PARTITION BY FILENR ORDER BY FILENR, CREATE_DATE ) AS PRIOR_CODE,
           CREATE_DATE
    FROM   CHECK_LAG
    WHERE  CREATE_DATE BETWEEN TO_DATE( '09/01/2008', 'MM/DD/YYYY' )
                           AND TO_DATE( '09/30/2008 23:59:59', 'MM/DD/YYYY HH24:MI:SS' );
    FILENR CODE PRIOR_CODE CREATE_DATE
    1      01              9/1/2008
    1      02   01         9/5/2008
    2      02              9/10/2008
    I know that I could push this into an inline view, and provide the WHERE clause with the date range after the inline view is processed, but this is a huge table with an index on CREATE_DATE, and so the following forces a table scan:
    SELECT *
    FROM   ( SELECT FILENR, CODE,
                    LAG( CODE ) OVER( PARTITION BY FILENR ORDER BY FILENR, CREATE_DATE ) AS PRIOR_CODE,
                    CREATE_DATE
            FROM   CHECK_LAG )
    WHERE  CREATE_DATE BETWEEN TO_DATE( '09/01/2008', 'MM/DD/YYYY' )
                           AND TO_DATE( '09/30/2008 23:59:59', 'MM/DD/YYYY HH24:MI:SS' )
    AND    PRIOR_CODE IS NOT NULL;
    FILENR CODE PRIOR_CODE CREATE_DATE
    1      02   01         9/5/2008
    2      02   01         9/10/2008
    Is that just the way things are, or am I missing out on another approach?
    Thanks,
    Chuck

    Hi, Chuck,
    Thanks for including the CREATE TABLE and INSERT statements.
    When you use "ORDER BY <single_column>" in an analytic function, you can restrict it to a range of values relative to the ordering value on the current row.
    For example, you could say
    ORDER  BY create_date
    RANGE  BETWEEN  60 PRECEDING
               AND  30 PRECEDING
    if you only wanted to consider rows that were no more than 60 days earlier, but at least 30 days earlier.
    It's a little more complicated for your problem, because you can't just hard-code numbers like 60 and 30; you have to compute them for every row.
    SELECT  filenr
    ,       code
    ,       LAST_VALUE (code) OVER
            ( PARTITION BY  filenr
              ORDER BY      create_date
              RANGE BETWEEN create_date - ADD_MONTHS ( TRUNC (create_date, 'MM')
                                                     , -1
                                                     ) PRECEDING
                        AND create_date - TRUNC (create_date, 'MM') PRECEDING
            ) AS prior_code
    ,       create_date
    FROM    check_lag;
    You could probably get the same results using LAG, but this is exactly what LAST_VALUE was designed for.
    Windowing is described in the [SQL Language Reference manual|http://download.oracle.com/docs/cd/B28359_01/server.111/b28286/functions001.htm#sthref972] in the section for "analytic functions" in general.
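    Another option worth noting (a sketch): keep an index-friendly predicate inside the inline view, but widen it by one month so LAG can still see the prior-month row, and apply the exact boundary in the outer query:

    ```sql
    -- Sketch: the inner WHERE keeps the index on create_date usable, while
    -- the extra month gives LAG the rows it needs; the outer WHERE restores
    -- the intended September-only result.
    SELECT filenr, code, prior_code, create_date
      FROM (SELECT filenr, code,
                   LAG(code) OVER (PARTITION BY filenr
                                   ORDER BY create_date) AS prior_code,
                   create_date
              FROM check_lag
             WHERE create_date >= TO_DATE('08/01/2008', 'MM/DD/YYYY')
               AND create_date <  TO_DATE('10/01/2008', 'MM/DD/YYYY'))
     WHERE create_date >= TO_DATE('09/01/2008', 'MM/DD/YYYY')
       AND prior_code IS NOT NULL;
    ```

    This assumes the prior row, when one exists, is never more than a month back; if it can be older, the inner window has to be widened accordingly.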

  • Poor performance of LAG function in view

    Hi,
    I have a query, containing a LAG function, that runs super fast. I've created a view based on the exact query, but when I select from it the performance is very slow. If I exclude the column that gets its value from the LAG function in the underlying query, the performance is the same as the original query. Does anybody know how I can get the view's performance to be the same as the query's, and why the view is so much slower when it uses the exact same SQL as the query?
    Many thanks,
    Johan

    Thanks Rob,
    The sql in the view is not very complicated (I removed some of the columns to make it more readable). When I select all the columns from the view except prev_credit_balance the performance is great, but when I add the prev_credit_balance column performance is slow. If I run the query from the create view statement below, it is fast and includes the LAG function. I do select from a rather large table (but then, why doesn't the query give the same problems as the view?)
    CREATE OR REPLACE VIEW CALL_MONITOR_VIEW AS
    select TT.Name Transaction_Type,
    SUBSTR('27'||TD.Subscriber_UID,1,11) Msisdn,
    TD.Subscriber_UID,
         IN_PLATFORM_PREFIX,
         TD.End_Of_Call,
         TD.Called_Party Other_Party,
         TD.Call_Duration,
         TD.Value,
    (TD.Value*TT.Display_Sign) DISPLAY_VALUE,
         TD.Credit_Balance,
    LAG(TD.credit_balance,1) over (order by TD.Subscriber_UID asc, TD.end_of_call asc) prev_credit_balance,
         TD.File_UID,
         TD.File_Type_UID,
         FT.Name File_Type
    from Transaction_Detail TD,
    Transaction_Type TT,
         File_Types FT,
         In_Platform ip,
    TC_Unitization_Map UM
    where TD.Transaction_Type_UID = TT.Transaction_Type_UID(+)
    and TD.IN_PLATFORM_UID = IP.IN_PLATFORM_UID(+)
    and TD.File_Type_Uid = FT.File_Type_uid
    and TD.Provider_ID = UM.Provider_ID (+);

  • LAG function doesn't work correctly when changing aggregation level

    Hello there,
    I'm trying to use the function in Discoverer Desktop but I'm experiencing some problems.
    First of all I will explain what I'm trying to do.
    I have to treat data from a typical business data warehouse, and I want to show in the same worksheet the totals, year fixed (calculated partitioning by months and other dimension fields), for both the current year and the previous year.
    I've done this by making a summary table that calculates SUM(import) grouping by some fields, and a LAG(SUM(Import),1) over (partition by <fields> order by year ASC).
    This last expression effectively returns the sum for the previous year.
    The problem starts here: if I roll up over a hierarchy, I noticed that some of the previous-year totals differ from the import calculated by the simple SUM (of course changing the year... :-) ). Mining the data, I discovered that some rows were filled in the SUM column but empty in the LAG column. This is because, when running the query, if no row results from aggregating on a certain year and month, no LAG can be calculated for it, and the result is an empty entry in the LAG column (which is not correct).
    Does someone know a workaround or some function like LAG that is not affected by this problem?
    Many thanks in advance for any kind of help
    Greetings to all

    You might want to use NVL() before SUM()
    Ex:
    LAG(SUM(NVL(Import,0)),1) over()
    Use NVL wherever you use SUM, because:
    1+NULL = NULL
    Hope this helps :)
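    A quick illustration of the NULL-arithmetic point (a sketch):

    ```sql
    -- NULL propagates through arithmetic, so a single NULL poisons a result;
    -- NVL substitutes 0 before the addition.
    SELECT 1 + NULL         AS with_null,   -- NULL
           1 + NVL(NULL, 0) AS with_nvl     -- 1
      FROM DUAL;
    ```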

  • DML question about LAG function

    Hello,
    I am trying to get a month-to-date number on a value that is stored as YTD in the cube. That is, for today's month-to-date number, I want to subtract today's value, from last month's value. I am trying to do this with the following statement:
    data - lag(chgdims(data limit lmt(time to parents using time_parentrel)),1,time)
    I'm pretty new to DML, but I know that this is clearly not the correct formula. Does anyone have any ideas on how to do this?
    Thanks

    Dear Fred-san
    Thank you very much for your support on this.
    But, may I double check about what you mentioned above?
    So, what you were mentioning was that if some user executes the query with
    the function module (RFC_READ_TABLE), under the following conditions, he can access to
    the HR data even when he does not have the authorizations for HR transactions?
    <Conditions>
    1. the user has the authorization for HR database tables themselves
    2. RFC_READ_TABLE is called to retrieve the data from HR database
    <example>
    Data: LF_HR_TABLE like  DD02L-TABNAME value 'PA0000'.
    CALL FUNCTION 'RFC_READ_TABLE'
       EXPORTING
        query_table                = LF_HR_TABLE
      TABLES
       OPTIONS                    =
       fields                     =
       data                       =    .
    But then, as long as we call this function module for non-critical tables such as
    VBAP (sales orders) or EKKO (purchase orders) within our query, it wouldn't seem to be
    such a security risk to use RFC_READ_TABLE...
    Besides, each query (infoset query) has got the concept of user groups, which limits
    the access to the queries within the user group.
    ※If someone does not belong to the user group, he cannot execute the queries within that
       user group, etc
    So my feeling is that even infoset queries do have an authorization concept...
    Would you give me your thought on this?
    I also thank you for your information for SCU0.
    That is an interesting transaction
    Kind regards,
    Takashi

  • Lag function ...need help

    I have a table and I am calculating the difference between the rows. I need to select the max difference and the corresponding column values.
    table structure
    ll_id act_date upb
    1 1/1/2008 20
    1 2/2/2008 30
    1 3/3/2008 50
    So when I run the query below I will get 1, 20 (the max difference, 50 - 30) for ll_id = 1.
    The problem: I want the corresponding act_date also.
    SELECT ll_id, max(upbdiff)
    from
    (select a.ll_id, a.act_date, a.upb,
    lag(upb,1,0)
    over (partition by a.ll_id order by a.act_date asc) as prev_upb,
    a.upb - lag(upb,1,0)
    over (partition by a.ll_id order by a.act_date asc) as upbdiff
    FROM table1 a)
    group by ll_id
    how can i get the corresponding date also with the result set?
    Thanks
    RD

    Hi,
    In Ranjeeth's query:
    with table01 as(
    select '1' field01, to_date( '1/1/2008', 'DD/MM/YYYY') field02, 10 field03 from dual union
    select '1' field01, to_date( '1/2/2008', 'DD/MM/YYYY') field02, 20 field03 from dual union
    select '1' field01, to_date( '1/3/2008', 'DD/MM/YYYY') field02, 50 field03 from dual union
    select '2' field01, to_date( '1/4/2008', 'DD/MM/YYYY') field02, 20 field03 from dual union
    select '2' field01, to_date( '1/5/2008', 'DD/MM/YYYY') field02, 30 field03 from dual union
    select '2' field01, to_date( '1/6/2008', 'DD/MM/YYYY') field02, 50 field03 from dual union
    select '3' field01, to_date( '1/7/2008', 'DD/MM/YYYY') field02, 20 field03 from dual)
    select table03.field01, table03.field02, table03.field04
    from (select field01, max(field04) field05
    from (select field01, field02, field03,
    field03 - lag( field03, 1) over (partition by field01 order by field01) field04
    from table01)
    group by field01) table02,
    (select field01, field02, field03,
    field03 - lag( field03, 1) over (partition by field01 order by field01) field04
    from table01) table03
    where table02.field01 = table03.field01 and table02.field05 = table03.field04;
    The only table mentioned is dual. Everything else is an alias for some result set based on that one table. (If you were to adapt this query for your application, of course, you would not use dual: you would use your real table name where the query above has table01, in which case that would be the only table referenced.) Often, especially when using analytic functions such as LAG, it is necessary to use sub-queries. The result sets of these sub-queries are treated like tables in super-queries. A typical reason is that analytic functions are computed after the WHERE clause is evaluated, so to use the results of an analytic function in a WHERE clause, you must compute it in a sub-query (such as table03, above) and then you can use the column from the sub-query (field04, above) in the WHERE clause of another query.
    Regarding your other query:
    select id, max(val_diff), max(date_val) keep(dense_rank first order by val_diff desc) date_val1 from
    (
    select id, date_val, val - nvl(lag(val) over(partition by id order by date_val asc),val) val_diff from t
    ) group by id
    I'll try to re-create the problem.

  • Using LAG function to find a previous record

    Hey,
    I tried searching the forum for this, but I actually didn't even know what to search for, so I'm creating a new thread.
    The SQL below displays a list of prices:
    WITH T AS (
      SELECT 1  AS ID, 'PERM' AS TYPE, 100 AS PRICE, SYSDATE + 1 AS START_DATE FROM DUAL UNION
      SELECT 3  AS ID, 'TEMP' AS TYPE, 90  AS PRICE, SYSDATE + 2 AS START_DATE FROM DUAL UNION
      SELECT 7  AS ID, 'TEMP' AS TYPE, 80  AS PRICE, SYSDATE + 3 AS START_DATE FROM DUAL UNION
      SELECT 8  AS ID, 'PERM' AS TYPE, 75  AS PRICE, SYSDATE + 4 AS START_DATE FROM DUAL UNION
      SELECT 16 AS ID, 'TEMP' AS TYPE, 70  AS PRICE, SYSDATE + 5 AS START_DATE FROM DUAL UNION
      SELECT 20 AS ID, 'TEMP' AS TYPE, 60  AS PRICE, SYSDATE + 6 AS START_DATE FROM DUAL UNION
      SELECT 34 AS ID, 'TEMP' AS TYPE, 50  AS PRICE, SYSDATE + 7 AS START_DATE FROM DUAL)
      SELECT T.ID
           , T.TYPE
           , T.PRICE
           , TRUNC (T.START_DATE) AS START_DATE
           , CASE
               WHEN T.TYPE = 'PERM'
                 THEN T.ID
                 ELSE LAG (T.ID, 1, NULL) OVER (PARTITION BY NULL ORDER BY T.ID)
             END AS BASE_ID
        FROM T
    ORDER BY T.START_DATE
    The challenge is to produce this output:
    ID TYPE PRICE BASE_ID
    1 PERM   100       1
    3 TEMP    90       1
    7 TEMP    80       1
    8 PERM    75       8
    16 TEMP    70       8
    20 TEMP    60       8
    34 TEMP    50       8
    What I want to achieve is to bring in a column with the ID of the most recent PERM price for TEMP prices, and its own ID for PERM prices.
    My attempt uses LAG to navigate back on the record set, but it uses 1 statically. If there was a way to come up with a number
    for each TEMP price saying how far it was from the most recent PERM, then I could use that number instead of 1.
    Something like:
    ID TYPE PRICE DISTANCE_FROM_PREV_PERM
    1 PERM   100                       0
    3 TEMP    90                       1
    7 TEMP    80                       2
    8 PERM    75                       0
    16 TEMP    70                       1
    20 TEMP    60                       2
    34 TEMP    50                       3
    Any help will be greatly appreciated.
    Thanks.

    Maybe
    select id,type,price,
           last_value(base_id) ignore nulls over (order by the_row) base_id
    /*     last_value(base_id ignore nulls) over (order by the_row) base_id  -- old way */
      from (select id,type,price,start_date,
                   case type when 'PERM' then id end base_id,
                   row_number() over (order by start_date) the_row
              from t
             )
    Regards
    Etbin
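    Combining that with the sample data from the question gives a self-contained version (a sketch; LAST_VALUE ... IGNORE NULLS carries the most recent PERM id forward over the TEMP rows):

    ```sql
    WITH t AS (
      SELECT 1  AS id, 'PERM' AS type, 100 AS price, SYSDATE + 1 AS start_date FROM DUAL UNION ALL
      SELECT 3,  'TEMP', 90, SYSDATE + 2 FROM DUAL UNION ALL
      SELECT 7,  'TEMP', 80, SYSDATE + 3 FROM DUAL UNION ALL
      SELECT 8,  'PERM', 75, SYSDATE + 4 FROM DUAL UNION ALL
      SELECT 16, 'TEMP', 70, SYSDATE + 5 FROM DUAL UNION ALL
      SELECT 20, 'TEMP', 60, SYSDATE + 6 FROM DUAL UNION ALL
      SELECT 34, 'TEMP', 50, SYSDATE + 7 FROM DUAL)
    SELECT id, type, price,
           LAST_VALUE(base_id) IGNORE NULLS
               OVER (ORDER BY the_row) AS base_id
      FROM (SELECT id, type, price,
                   CASE type WHEN 'PERM' THEN id END AS base_id,
                   ROW_NUMBER() OVER (ORDER BY start_date) AS the_row
              FROM t)
     ORDER BY the_row;
    ```

    base_id is NULL for TEMP rows in the inline view, so IGNORE NULLS makes the running last value the id of the latest preceding PERM row, which is exactly the desired output.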

  • HELP!!! is the lag function?

    Why am I not getting the right results here when I do the insert into the table? I get the right results when I run the SELECT statement alone, but not when I try to insert the results into the table. I am in the process of writing a PL/SQL package and I am stuck here. I tried to put this in a cursor; it did not work either: same problem, right results in the SELECT statement of the cursor but not when I update the table.
    INSERT INTO TEMP_IND
    (temp_pidm,
    temp_term_code,
    temp_bldg_code,
    temp_room_number,
    temp_indicator)
    SELECT
    s.szslife_pidm,
    s.szslife_slrrasg_term_code,
    s.szslife_building_code,
    s.szslife_room_number,
    DECODE (lag(srr.repeat_room_number)
              over (PARTITION BY srr.repaat_bldg_code
                                ,srr.repeat_room_number
                    ORDER BY srr.repeat_term_code)
           ,NULL, 'N'
           ,srr.repeat_room_number, 'Y'
           ,'N'
           ) indicator2
    FROM SZSLIFE_REPEAT_ROOM srr
    ,SZSLIFE s
    WHERE srr.repeat_pidm = s.szslife_pidm
    AND srr.repaat_bldg_code = s.szslife_building_code
    AND srr.repeat_room_number = s.szslife_room_number
    AND srr.repeat_term_code = s.szslife_slrrasg_term_code
    --AND szslife_pidm = 1862
    GROUP BY
    s.szslife_pidm,
    s.szslife_slrrasg_term_code,
    s.szslife_building_code,
    s.szslife_room_number,
    srr.repaat_bldg_code,
    srr.repeat_room_number,
    srr.repeat_term_code
    right results
    1862 200290 BLA 003 N
    1862 200310 BLA 003 Y
    1862 200390 BLA 207 N
    1862 200410 BLA 207 Y
    1862 200590 SMI 216 N
    1862 200610 SMI 216 Y
    1862 200490 UNI 4 N
    1862 200510 UNI 4 Y
    When I insert into the table, it inserts all the rows with a Y.

    LAG has an offset parameter which you can set to 3.
    See more at the oracle help page here
    http://download.oracle.com/docs/cd/B10501_01/server.920/a96540/functions56a.htm

  • Recusive lag function

    Hi, if I have a table containing
    id, value
    and I have a simplified calculation formula, e.g. (2 * PI * 12354.35 / 6548),
    in pseudo-code I would like to retrieve an X value:
    if (ordering of x = 1) {
        x = 2343
    } else {
        x value = calculate(previous_row_x value)
    }
    How can that be achieved?

    I don't have 11g so I can't try the recursive WITH,
    but I think you can accomplish this with the MODEL clause.
    I was not sure what number you wanted to start with; I picked .00001 (the end of the NVL clause).
    I have ITERATE set to 100, but you can change it to a larger number.
    Anyway:
    SELECT x
      FROM DUAL
    MODEL
       DIMENSION BY (0 d)
       MEASURES (cast (null as number) x)
       RULES
          ITERATE (100) until   x[ITERATION_NUMBER] =  1
          (x [ITERATION_NUMBER] =  nvl(   x [ITERATION_NUMBER-1] * (2 * 3.14 * 12354.35/6548),.00001));
    X
    1E-5
    0.00011848704642639
    0.00140391801708494
    0.0166346099269189
    0.197098579869572
    2.33536285835815
    27.6710247420748
    327.865799328
    3884.78501866021
    46029.6702862535
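    The MODEL rule above is just the recurrence x(n) = x(n-1) * (2 * 3.14 * 12354.35 / 6548), seeded with .00001. The same series can be reproduced procedurally (a sketch of the arithmetic, not Oracle-specific):

    ```python
    # Reproduce the MODEL-clause recurrence: each value is the previous
    # one multiplied by a constant factor, seeded with .00001.
    def recurrence(seed=0.00001, steps=10):
        factor = 2 * 3.14 * 12354.35 / 6548
        xs = [seed]
        for _ in range(steps - 1):
            xs.append(xs[-1] * factor)
        return xs

    series = recurrence()   # first value 1e-5, tenth value ~46029.67
    ```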

  • How to program the function lag

    Hi,
    I am trying to write a query that reproduces the LAG function. I cannot use LAG directly because I want to compare a week of one year to the same week of year n-1, and a year can have 52 or 53 weeks.
    I am having a lot of trouble writing a query that puts the data from the two years on the same result line.
    Any suggestions are welcome.
    Regards

    Below are the rows returned from the above query.
    FY_START     FY_START_1     WEEK_START     WEEK_START_1     WEEK_END     WEEK_END_1     FY_END          FY_END_1     WEEK_NUM     WEEK_NUM_1
    9/26/2004     9/28/2003     9/26/2004     9/28/2003     10/2/2004     10/4/2003     9/24/2005     9/25/2004     1          1
    9/26/2004     9/28/2003     10/3/2004     10/5/2003     10/9/2004     10/11/2003     9/24/2005     9/25/2004     2          2
    9/26/2004     9/28/2003     10/10/2004     10/12/2003     10/16/2004     10/18/2003     9/24/2005     9/25/2004     3          3
    9/26/2004     9/28/2003     10/17/2004     10/19/2003     10/23/2004     10/25/2003     9/24/2005     9/25/2004     4          4
    9/26/2004     9/28/2003     10/24/2004     10/26/2003     10/30/2004     11/1/2003     9/24/2005     9/25/2004     5          5
    9/26/2004     9/28/2003     10/31/2004     11/2/2003     11/6/2004     11/8/2003     9/24/2005     9/25/2004     6          6
    9/26/2004     9/28/2003     11/7/2004     11/9/2003     11/13/2004     11/15/2003     9/24/2005     9/25/2004     7          7
    9/26/2004     9/28/2003     11/14/2004     11/16/2003     11/20/2004     11/22/2003     9/24/2005     9/25/2004     8          8
    9/26/2004     9/28/2003     11/21/2004     11/23/2003     11/27/2004     11/29/2003     9/24/2005     9/25/2004     9          9
    9/26/2004     9/28/2003     11/28/2004     11/30/2003     12/4/2004     12/6/2003     9/24/2005     9/25/2004     10          10
    9/26/2004     9/28/2003     12/5/2004     12/7/2003     12/11/2004     12/13/2003     9/24/2005     9/25/2004     11          11
    9/26/2004     9/28/2003     12/12/2004     12/14/2003     12/18/2004     12/20/2003     9/24/2005     9/25/2004     12          12
    9/26/2004     9/28/2003     12/19/2004     12/21/2003     12/25/2004     12/27/2003     9/24/2005     9/25/2004     13          13
    9/26/2004     9/28/2003     12/26/2004     12/28/2003     1/1/2005     1/3/2004     9/24/2005     9/25/2004     14          14
    9/26/2004     9/28/2003     1/2/2005     1/4/2004     1/8/2005     1/10/2004     9/24/2005     9/25/2004     15          15
    9/26/2004     9/28/2003     1/9/2005     1/11/2004     1/15/2005     1/17/2004     9/24/2005     9/25/2004     16          16
    9/26/2004     9/28/2003     1/16/2005     1/18/2004     1/22/2005     1/24/2004     9/24/2005     9/25/2004     17          17
    9/26/2004     9/28/2003     1/23/2005     1/25/2004     1/29/2005     1/31/2004     9/24/2005     9/25/2004     18          18
    9/26/2004     9/28/2003     1/30/2005     2/1/2004     2/5/2005     2/7/2004     9/24/2005     9/25/2004     19          19
    9/26/2004     9/28/2003     2/6/2005     2/8/2004     2/12/2005     2/14/2004     9/24/2005     9/25/2004     20          20
    9/26/2004     9/28/2003     2/13/2005     2/15/2004     2/19/2005     2/21/2004     9/24/2005     9/25/2004     21          21
    9/26/2004     9/28/2003     2/20/2005     2/22/2004     2/26/2005     2/28/2004     9/24/2005     9/25/2004     22          22
    9/26/2004     9/28/2003     2/27/2005     2/29/2004     3/5/2005     3/6/2004     9/24/2005     9/25/2004     23          23
    9/26/2004     9/28/2003     3/6/2005     3/7/2004     3/12/2005     3/13/2004     9/24/2005     9/25/2004     24          24
    9/26/2004     9/28/2003     3/13/2005     3/14/2004     3/19/2005     3/20/2004     9/24/2005     9/25/2004     25          25
    9/26/2004     9/28/2003     3/20/2005     3/21/2004     3/26/2005     3/27/2004     9/24/2005     9/25/2004     26          26
    9/26/2004     9/28/2003     3/27/2005     3/28/2004     4/2/2005     4/3/2004     9/24/2005     9/25/2004     27          27
    9/26/2004     9/28/2003     4/3/2005     4/4/2004     4/9/2005     4/10/2004     9/24/2005     9/25/2004     28          28
    9/26/2004     9/28/2003     4/10/2005     4/11/2004     4/16/2005     4/17/2004     9/24/2005     9/25/2004     29          29
    9/26/2004     9/28/2003     4/17/2005     4/18/2004     4/23/2005     4/24/2004     9/24/2005     9/25/2004     30          30
    9/26/2004     9/28/2003     4/24/2005     4/25/2004     4/30/2005     5/1/2004     9/24/2005     9/25/2004     31          31
    9/26/2004     9/28/2003     5/1/2005     5/2/2004     5/7/2005     5/8/2004     9/24/2005     9/25/2004     32          32
    9/26/2004     9/28/2003     5/8/2005     5/9/2004     5/14/2005     5/15/2004     9/24/2005     9/25/2004     33          33
    9/26/2004     9/28/2003     5/15/2005     5/16/2004     5/21/2005     5/22/2004     9/24/2005     9/25/2004     34          34
    9/26/2004     9/28/2003     5/22/2005     5/23/2004     5/28/2005     5/29/2004     9/24/2005     9/25/2004     35          35
    9/26/2004     9/28/2003     5/29/2005     5/30/2004     6/4/2005     6/5/2004     9/24/2005     9/25/2004     36          36
    9/26/2004     9/28/2003     6/5/2005     6/6/2004     6/11/2005     6/12/2004     9/24/2005     9/25/2004     37          37
    9/26/2004     9/28/2003     6/12/2005     6/13/2004     6/18/2005     6/19/2004     9/24/2005     9/25/2004     38          38
    9/26/2004     9/28/2003     6/19/2005     6/20/2004     6/25/2005     6/26/2004     9/24/2005     9/25/2004     39          39
    9/26/2004     9/28/2003     6/26/2005     6/27/2004     7/2/2005     7/3/2004     9/24/2005     9/25/2004     40          40
    9/26/2004     9/28/2003     7/3/2005     7/4/2004     7/9/2005     7/10/2004     9/24/2005     9/25/2004     41          41
    9/26/2004     9/28/2003     7/10/2005     7/11/2004     7/16/2005     7/17/2004     9/24/2005     9/25/2004     42          42
    9/26/2004     9/28/2003     7/17/2005     7/18/2004     7/23/2005     7/24/2004     9/24/2005     9/25/2004     43          43
    9/26/2004     9/28/2003     7/24/2005     7/25/2004     7/30/2005     7/31/2004     9/24/2005     9/25/2004     44          44
    9/26/2004     9/28/2003     7/31/2005     8/1/2004     8/6/2005     8/7/2004     9/24/2005     9/25/2004     45          45
    9/26/2004     9/28/2003     8/7/2005     8/8/2004     8/13/2005     8/14/2004     9/24/2005     9/25/2004     46          46
    9/26/2004     9/28/2003     8/14/2005     8/15/2004     8/20/2005     8/21/2004     9/24/2005     9/25/2004     47          47
    9/26/2004     9/28/2003     8/21/2005     8/22/2004     8/27/2005     8/28/2004     9/24/2005     9/25/2004     48          48
    9/26/2004     9/28/2003     8/28/2005     8/29/2004     9/3/2005     9/4/2004     9/24/2005     9/25/2004     49          49
    9/26/2004     9/28/2003     9/4/2005     9/5/2004     9/10/2005     9/11/2004     9/24/2005     9/25/2004     50          50
    9/26/2004     9/28/2003     9/11/2005     9/12/2004     9/17/2005     9/18/2004     9/24/2005     9/25/2004     51          51
    9/26/2004     9/28/2003     9/18/2005     9/19/2004     9/24/2005     9/25/2004     9/24/2005     9/25/2004     52          52
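    Rather than LAG, the rows above pair each fiscal week with the same week number of the prior year, which is effectively a self-join on week number. A minimal sketch of that pairing (the field layout and the week-53 dates below are illustrative, not taken from the output):

    ```python
    # Pair each week of the current fiscal year with the same week number
    # of the prior year -- the join-on-week-number approach the result set
    # above uses instead of LAG. A week that exists in only one year
    # (e.g. week 53 of a 53-week year) pairs with None.
    def pair_by_week(curr, prev):
        """curr, prev: dicts mapping week_num -> week_start date string."""
        return {wk: (start, prev.get(wk)) for wk, start in curr.items()}

    curr = {1: '9/26/2004', 2: '10/3/2004', 53: '9/25/2005'}
    prev = {1: '9/28/2003', 2: '10/5/2003'}
    pair_by_week(curr, prev)[53]   # ('9/25/2005', None)
    ```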
