Use of analytic function

I'm trying to sort a series of rows and look for the latest (based on time). I only want the latest row, because that's the one I want to update. I'm trying to figure out a query, and it looks like an analytic function is my best avenue, but the syntax stymies me. Can someone help or suggest a correction?
select KEY_ID, to_char(START_DATETIME,'HH24:MI'), DESCRIPTION,
       FIRST_VALUE(KEY_ID) OVER (PARTITION by KEY_ID ORDER BY START_DATETIME DESC) LIST
  from TS_TIMECARD_DETAIL
 where TIMECARD_ID = 60412
 order by START_DATETIME DESC
/
Thanks.
PS. Key_ID is unique/primary key.

That was close enough. It's actually part of a trigger designed to match up pickup and delivery information. Here's what I finally came up with:
select KEY_ID, DESCRIPTION into x_key, x_desc
        from (select KEY_ID, DESCRIPTION, START_DATETIME, MAX(START_DATETIME)
                OVER (PARTITION by KEY_ID) MAX_START_DATETIME
                from TS_TIMECARD_DETAIL
               where TIMECARD_ID = x_tc
                 and START_DATETIME < v_dttm
               order by START_DATETIME DESC)
       WHERE rownum = 1;
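Since Key_ID is the primary key, PARTITION BY KEY_ID puts every row in its own partition, so the analytic never compares rows. A more direct way to keep just the latest row (a sketch against the same hypothetical TS_TIMECARD_DETAIL columns and bind variables) is to rank the whole timecard by START_DATETIME and keep rank 1:

```sql
select KEY_ID, DESCRIPTION
  from (select KEY_ID, DESCRIPTION,
               -- rank all rows of the timecard, newest first
               ROW_NUMBER() OVER (ORDER BY START_DATETIME DESC) rn
          from TS_TIMECARD_DETAIL
         where TIMECARD_ID = :x_tc
           and START_DATETIME < :v_dttm)
 where rn = 1;
```

Because ROW_NUMBER ranks across all selected rows rather than within a one-row partition, WHERE rn = 1 returns exactly the most recent detail row, with no dependence on rownum being applied after the sort.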

Similar Messages

  • Can I use an analytic function instead of a GROUP BY clause?

    Can I use an analytic function instead of a GROUP BY clause? Will this give any performance improvement?

    Analytic functions can sometimes avoid scanning the table more than once:
    SQL> select ename,  sal, (select sum(sal) from emp where deptno=e.deptno) sum from emp e;
    ENAME             SAL        SUM
    SMITH             800      10875
    ALLEN            1600       9400
    WARD             1250       9400
    JONES            2975      10875
    MARTIN           1250       9400
    BLAKE            2850       9400
    CLARK            2450       8750
    SCOTT            3000      10875
    KING             5000       8750
    TURNER           1500       9400
    ADAMS            1100      10875
    JAMES             950       9400
    FORD             3000      10875
    MILLER           1300       8750
    14 rows selected.
    Execution Plan
    Plan hash value: 3189885365
    | Id  | Operation          | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT   |      |    14 |   182 |     3   (0)| 00:00:01 |
    |   1 |  SORT AGGREGATE    |      |     1 |     7 |            |          |
    |*  2 |   TABLE ACCESS FULL| EMP  |     5 |    35 |     3   (0)| 00:00:01 |
    |   3 |  TABLE ACCESS FULL | EMP  |    14 |   182 |     3   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       2 - filter("DEPTNO"=:B1)
    which could be rewritten as
    SQL> select ename, sal, sum(sal) over (partition by deptno) sum from emp e;
    ENAME             SAL        SUM
    CLARK            2450       8750
    KING             5000       8750
    MILLER           1300       8750
    JONES            2975      10875
    FORD             3000      10875
    ADAMS            1100      10875
    SMITH             800      10875
    SCOTT            3000      10875
    WARD             1250       9400
    TURNER           1500       9400
    ALLEN            1600       9400
    JAMES             950       9400
    BLAKE            2850       9400
    MARTIN           1250       9400
    14 rows selected.
    Execution Plan
    Plan hash value: 1776581816
    | Id  | Operation          | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT   |      |    14 |   182 |     4  (25)| 00:00:01 |
    |   1 |  WINDOW SORT       |      |    14 |   182 |     4  (25)| 00:00:01 |
    |   2 |   TABLE ACCESS FULL| EMP  |    14 |   182 |     3   (0)| 00:00:01 |
    ---------------------------------------------------------------------------
    Well, there is no GROUP BY and no visible performance gain in my example, but in Oracle7 you would have had to write the query as:
    SQL> select ename, sal, sum from emp e,(select deptno,sum(sal) sum from emp group by deptno) s where e.deptno=s.deptno;
    ENAME             SAL        SUM
    SMITH             800      10875
    ALLEN            1600       9400
    WARD             1250       9400
    JONES            2975      10875
    MARTIN           1250       9400
    BLAKE            2850       9400
    CLARK            2450       8750
    SCOTT            3000      10875
    KING             5000       8750
    TURNER           1500       9400
    ADAMS            1100      10875
    JAMES             950       9400
    FORD             3000      10875
    MILLER           1300       8750
    14 rows selected.
    Execution Plan
    Plan hash value: 2661063502
    | Id  | Operation            | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT     |      |    14 |   546 |     8  (25)| 00:00:01 |
    |*  1 |  HASH JOIN           |      |    14 |   546 |     8  (25)| 00:00:01 |
    |   2 |   VIEW               |      |     3 |    78 |     4  (25)| 00:00:01 |
    |   3 |    HASH GROUP BY     |      |     3 |    21 |     4  (25)| 00:00:01 |
    |   4 |     TABLE ACCESS FULL| EMP  |    14 |    98 |     3   (0)| 00:00:01 |
    |   5 |   TABLE ACCESS FULL  | EMP  |    14 |   182 |     3   (0)| 00:00:01 |
    -----------------------------------------------------------------------------
    So maybe this helps.

  • How to use the SUM analytic function in ADF

    Hi,
    JDev 11.1.1.5
    Oracle 11g R2
    I want to use analytic functions (SUM, COUNT, AVG, and so on).
    I saw [url http://andrejusb.blogspot.co.uk/2013/02/oracle-analytic-functions-for-total-and.html]Oracle Analytic Functions for Total and Average Calculation in ADF BC
    and used it in my VO and JSF page. My VO has a great many records, and for performance reasons I want the sum in the table footer computed only on demand: if the user does not ask for the sum in the footer, it should not be calculated.
    What is your idea?

    Before I read that blog I used a separate VO for the sums, but after reading it I decided to use analytic functions. We have pages with too many DVT graphs and tables; right now each has its own VO, which performs poorly and runs too many queries against the database. I want one VO with some analytic functions to drive both the graphs and the tables.

  • Getting the first and last records of a SQL query using an analytic function

    Hi all!
    Thanks in advance for looking at my problem! I have a query that runs against a table that keeps all records of changes on another table (a journal table of sorts). I wrote a SQL statement that shows what the status of a requisition was, what came next, and the respective dates. However, it brings back a lot of rows; I only need the first row and the last few, say the last 3. How could I achieve that?
    SELECT ano yr,
                    numero id,
                    jn_datetime,
                    status_siafi status,
                    lead(status_siafi) over(PARTITION BY ano, numero ORDER BY jn_datetime) next_status,
                    lead(jn_datetime) over(PARTITION BY ano, numero ORDER BY jn_datetime) date_next_status,
                    MAX(jn_datetime) over(PARTITION BY ano, numero) last_update,
                    MIN(jn_datetime) over(PARTITION BY ano, numero) first_update
    FROM   nl_compensado_jn
    WHERE  ano = '08'
    AND    numero = '113747'
    GROUP  BY ano,
                             numero,
                             jn_datetime,
                             status_siafi
    YR ID     JN_DATETI S N DATE_NEXT LAST_UPDA FIRST_UPD
    08 113747 11-SEP-08 1 2 11-SEP-08 20-NOV-08 11-SEP-08
    08 113747 11-SEP-08 2 3 12-SEP-08 20-NOV-08 11-SEP-08
    08 113747 12-SEP-08 3 2 12-SEP-08 20-NOV-08 11-SEP-08
    08 113747 12-SEP-08 2 3 15-SEP-08 20-NOV-08 11-SEP-08
    08 113747 15-SEP-08 3 2 15-SEP-08 20-NOV-08 11-SEP-08
    08 113747 15-SEP-08 2 3 16-SEP-08 20-NOV-08 11-SEP-08
    08 113747 16-SEP-08 3 2 16-SEP-08 20-NOV-08 11-SEP-08
    08 113747 16-SEP-08 2 3 17-SEP-08 20-NOV-08 11-SEP-08
    08 113747 17-SEP-08 3 2 17-SEP-08 20-NOV-08 11-SEP-08
    08 113747 17-SEP-08 2 3 18-SEP-08 20-NOV-08 11-SEP-08
    08 113747 18-SEP-08 3 2 18-SEP-08 20-NOV-08 11-SEP-08
    08 113747 18-SEP-08 2 3 19-SEP-08 20-NOV-08 11-SEP-08
    08 113747 19-SEP-08 3 2 19-SEP-08 20-NOV-08 11-SEP-08
    08 113747 19-SEP-08 2 3 23-SEP-08 20-NOV-08 11-SEP-08
    08 113747 23-SEP-08 3 2 24-SEP-08 20-NOV-08 11-SEP-08
    08 113747 24-SEP-08 2 3 25-SEP-08 20-NOV-08 11-SEP-08
    08 113747 25-SEP-08 3 2 25-SEP-08 20-NOV-08 11-SEP-08
    08 113747 25-SEP-08 2 3 26-SEP-08 20-NOV-08 11-SEP-08
    08 113747 26-SEP-08 3 2 26-SEP-08 20-NOV-08 11-SEP-08
    08 113747 26-SEP-08 2 3 29-SEP-08 20-NOV-08 11-SEP-08
    08 113747 29-SEP-08 3 2 29-SEP-08 20-NOV-08 11-SEP-08
    08 113747 29-SEP-08 2 3 02-OCT-08 20-NOV-08 11-SEP-08
    08 113747 02-OCT-08 3 2 02-OCT-08 20-NOV-08 11-SEP-08
    08 113747 02-OCT-08 2 3 03-OCT-08 20-NOV-08 11-SEP-08
    08 113747 03-OCT-08 3 2 03-OCT-08 20-NOV-08 11-SEP-08
    08 113747 03-OCT-08 2 3 06-OCT-08 20-NOV-08 11-SEP-08
    08 113747 06-OCT-08 3 2 06-OCT-08 20-NOV-08 11-SEP-08
    08 113747 06-OCT-08 2 3 07-OCT-08 20-NOV-08 11-SEP-08
    08 113747 07-OCT-08 3 2 07-OCT-08 20-NOV-08 11-SEP-08
    08 113747 07-OCT-08 2 3 08-OCT-08 20-NOV-08 11-SEP-08
    08 113747 08-OCT-08 3 2 08-OCT-08 20-NOV-08 11-SEP-08
    08 113747 08-OCT-08 2 3 09-OCT-08 20-NOV-08 11-SEP-08
    08 113747 09-OCT-08 3 2 09-OCT-08 20-NOV-08 11-SEP-08
    08 113747 09-OCT-08 2 3 10-OCT-08 20-NOV-08 11-SEP-08
    08 113747 10-OCT-08 3 2 14-OCT-08 20-NOV-08 11-SEP-08
    08 113747 14-OCT-08 2 3 15-OCT-08 20-NOV-08 11-SEP-08
    08 113747 15-OCT-08 3 2 15-OCT-08 20-NOV-08 11-SEP-08
    08 113747 15-OCT-08 2 3 16-OCT-08 20-NOV-08 11-SEP-08
    08 113747 16-OCT-08 3 2 16-OCT-08 20-NOV-08 11-SEP-08
    08 113747 16-OCT-08 2 3 17-OCT-08 20-NOV-08 11-SEP-08
    08 113747 17-OCT-08 3 2 17-OCT-08 20-NOV-08 11-SEP-08
    08 113747 17-OCT-08 2 3 21-OCT-08 20-NOV-08 11-SEP-08
    08 113747 21-OCT-08 3 2 21-OCT-08 20-NOV-08 11-SEP-08
    08 113747 21-OCT-08 2 3 22-OCT-08 20-NOV-08 11-SEP-08
    08 113747 22-OCT-08 3 2 22-OCT-08 20-NOV-08 11-SEP-08
    08 113747 22-OCT-08 2 3 23-OCT-08 20-NOV-08 11-SEP-08
    08 113747 23-OCT-08 3 2 23-OCT-08 20-NOV-08 11-SEP-08
    08 113747 23-OCT-08 2 3 27-OCT-08 20-NOV-08 11-SEP-08
    08 113747 27-OCT-08 3 2 27-OCT-08 20-NOV-08 11-SEP-08
    08 113747 27-OCT-08 2 3 28-OCT-08 20-NOV-08 11-SEP-08
    08 113747 28-OCT-08 3 2 28-OCT-08 20-NOV-08 11-SEP-08
    08 113747 28-OCT-08 2 3 29-OCT-08 20-NOV-08 11-SEP-08
    08 113747 29-OCT-08 3 2 29-OCT-08 20-NOV-08 11-SEP-08
    08 113747 29-OCT-08 2 3 30-OCT-08 20-NOV-08 11-SEP-08
    08 113747 30-OCT-08 3 2 30-OCT-08 20-NOV-08 11-SEP-08
    08 113747 30-OCT-08 2 3 31-OCT-08 20-NOV-08 11-SEP-08
    08 113747 31-OCT-08 3 2 31-OCT-08 20-NOV-08 11-SEP-08
    08 113747 31-OCT-08 2 3 03-NOV-08 20-NOV-08 11-SEP-08
    08 113747 03-NOV-08 3 2 03-NOV-08 20-NOV-08 11-SEP-08
    08 113747 03-NOV-08 2 3 06-NOV-08 20-NOV-08 11-SEP-08
    08 113747 06-NOV-08 3 2 06-NOV-08 20-NOV-08 11-SEP-08
    08 113747 06-NOV-08 2 3 07-NOV-08 20-NOV-08 11-SEP-08
    08 113747 07-NOV-08 3 2 07-NOV-08 20-NOV-08 11-SEP-08
    08 113747 07-NOV-08 2 3 10-NOV-08 20-NOV-08 11-SEP-08
    08 113747 10-NOV-08 3 2 10-NOV-08 20-NOV-08 11-SEP-08
    08 113747 10-NOV-08 2 3 12-NOV-08 20-NOV-08 11-SEP-08
    08 113747 12-NOV-08 3 2 12-NOV-08 20-NOV-08 11-SEP-08
    08 113747 12-NOV-08 2 3 13-NOV-08 20-NOV-08 11-SEP-08
    08 113747 13-NOV-08 3 2 13-NOV-08 20-NOV-08 11-SEP-08
    08 113747 13-NOV-08 2 3 14-NOV-08 20-NOV-08 11-SEP-08
    08 113747 14-NOV-08 3 2 14-NOV-08 20-NOV-08 11-SEP-08
    08 113747 14-NOV-08 2 3 17-NOV-08 20-NOV-08 11-SEP-08
    08 113747 17-NOV-08 3 2 17-NOV-08 20-NOV-08 11-SEP-08
    08 113747 17-NOV-08 2 3 18-NOV-08 20-NOV-08 11-SEP-08
    08 113747 18-NOV-08 3 2 18-NOV-08 20-NOV-08 11-SEP-08
    08 113747 18-NOV-08 2 2 18-NOV-08 20-NOV-08 11-SEP-08
    08 113747 18-NOV-08 2 3 19-NOV-08 20-NOV-08 11-SEP-08
    08 113747 19-NOV-08 3 2 19-NOV-08 20-NOV-08 11-SEP-08
    08 113747 19-NOV-08 2 4 20-NOV-08 20-NOV-08 11-SEP-08
    08 113747 20-NOV-08 4             20-NOV-08 11-SEP-08
    80 rows selected.
    Thanks!
    gleisson henrique

    Sorry! I didn't notice that major detail. Here is the test data:
    insert into nl_compensado_jn values ('INS','LETICIA','11-SEP-08 15:08:27','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','1','');
    insert into nl_compensado_jn values ('UPD','BELLA','19-SEP-08 07:43:20','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','18-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','15-SEP-08 07:45:54','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','12-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','11-SEP-08 15:34:30','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','11-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','16-SEP-08 13:48:38','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','16-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','18-SEP-08 07:44:12','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','17-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','16-SEP-08 07:38:29','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','15-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','19-SEP-08 16:13:20','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','19-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','18-SEP-08 15:33:59','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','18-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','15-SEP-08 15:35:52','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','15-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','02-OCT-08 07:51:38','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','29-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','26-SEP-08 08:11:04','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','25-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','29-SEP-08 15:46:31','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','29-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','29-SEP-08 12:12:29','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','26-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','08-OCT-08 07:44:06','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','07-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','09-OCT-08 07:44:43','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','08-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','03-OCT-08 07:44:57','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','02-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','06-OCT-08 07:41:19','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','03-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-SEP-08 07:35:00','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','16-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','03-OCT-08 15:17:09','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','03-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','23-SEP-08 16:05:01','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','19-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','25-SEP-08 07:37:44','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','24-SEP-08');
    insert into nl_compensado_jn values ('UPD','GERENTE','26-SEP-08 15:57:35','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','26-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','24-SEP-08 15:31:40','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','24-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','12-SEP-08 08:02:34','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','11-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','09-OCT-08 15:04:27','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','09-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-SEP-08 15:31:46','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','17-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','07-OCT-08 07:51:57','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','06-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','07-OCT-08 15:04:54','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','07-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','02-OCT-08 15:49:48','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','02-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','25-SEP-08 15:36:45','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','25-SEP-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','06-OCT-08 15:00:08','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','06-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','08-OCT-08 14:57:23','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','08-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','12-SEP-08 15:31:47','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','12-SEP-08');
    insert into nl_compensado_jn values ('UPD','BELLA','06-NOV-08 10:04:08','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','03-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','10-NOV-08 14:11:55','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','07-NOV-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','23-OCT-08 15:08:23','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','23-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','31-OCT-08 14:59:36','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','31-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','28-OCT-08 10:33:59','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','27-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','16-OCT-08 08:01:41','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','15-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','29-OCT-08 11:04:35','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','28-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-OCT-08 07:58:07','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','16-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','22-OCT-08 10:36:15','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','21-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','21-OCT-08 13:08:38','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','17-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','23-OCT-08 10:49:52','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','22-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','27-OCT-08 10:12:47','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','23-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','16-OCT-08 15:36:47','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','16-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','14-OCT-08 15:19:24','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','14-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','03-NOV-08 09:10:26','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','31-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','15-OCT-08 07:59:37','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','14-OCT-08');
    insert into nl_compensado_jn values ('UPD','ANTUNES','10-OCT-08 11:25:23','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','09-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','03-NOV-08 16:01:49','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','03-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','29-OCT-08 15:13:36','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','29-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','22-OCT-08 15:25:48','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','22-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','30-OCT-08 10:22:24','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','29-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','30-OCT-08 15:15:47','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','30-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-OCT-08 15:19:19','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','17-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','06-NOV-08 16:08:43','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','06-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','31-OCT-08 10:42:10','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','30-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','07-NOV-08 16:01:50','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','07-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','21-OCT-08 15:34:07','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','21-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','27-OCT-08 15:22:24','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','27-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','28-OCT-08 15:16:19','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','28-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','15-OCT-08 15:15:54','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','15-OCT-08');
    insert into nl_compensado_jn values ('UPD','BELLA','07-NOV-08 09:39:43','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','06-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-NOV-08 09:29:29','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','14-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','12-NOV-08 09:40:53','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','10-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','18-NOV-08 09:49:53','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','17-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','19-NOV-08 15:29:15','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084810','16660','2','19-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','10-NOV-08 15:25:03','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','10-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','13-NOV-08 09:10:07','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','12-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','14-NOV-08 10:33:24','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','3','13-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','12-NOV-08 15:32:54','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','12-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','17-NOV-08 15:37:10','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','17-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','19-NOV-08 09:14:38','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084810','16660','3','18-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','20-NOV-08 09:06:16','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084810','16660','4','19-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','14-NOV-08 15:19:03','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','14-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','18-NOV-08 15:47:14','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','18-NOV-08');
    insert into nl_compensado_jn values ('UPD','BELLA','13-NOV-08 15:29:06','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084110','16660','2','13-NOV-08');
    insert into nl_compensado_jn values ('UPD','HEBER','18-NOV-08 18:41:45','','TRIGER25','','08','113747','00','00003','84110','08','DV','540638','11-SEP-08','WX01208201001760084810','16660','2','18-NOV-08');
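    A sketch of one way to keep only the first row and the last three: compute ROW_NUMBER twice, once ascending and once descending, in an inline view (column aliases follow the original query):

    ```sql
    SELECT yr, id, jn_datetime, status, next_status, date_next_status
    FROM  (SELECT ano yr, numero id, jn_datetime, status_siafi status,
                  LEAD(status_siafi) OVER (PARTITION BY ano, numero ORDER BY jn_datetime) next_status,
                  LEAD(jn_datetime)  OVER (PARTITION BY ano, numero ORDER BY jn_datetime) date_next_status,
                  -- position counting from the oldest change
                  ROW_NUMBER() OVER (PARTITION BY ano, numero ORDER BY jn_datetime)      rn_first,
                  -- position counting from the newest change
                  ROW_NUMBER() OVER (PARTITION BY ano, numero ORDER BY jn_datetime DESC) rn_last
           FROM   nl_compensado_jn
           WHERE  ano = '08'
           AND    numero = '113747')
    WHERE rn_first = 1   -- the first change
       OR rn_last <= 3   -- the last three changes
    ORDER BY jn_datetime;
    ```

    The analytics are evaluated over all journal rows of the requisition before the outer WHERE filters, so LEAD and the min/max dates stay correct while only four rows come back.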

  • Error using Analytic function in reports

    Hi,
    I am trying to use the Oracle analytic function LAG in a report, but I am getting the error below:
    Encountered the symbol "(" when expecting one of the following:
    ,from into bulk
    This is the code in the formula column:
    function extend_lifeFormula return VARCHAR2 is
    l_extend_life VARCHAR2(80);
    l_life_in_months VARCHAR2(80);
    l_asset_id NUMBER;
    begin
    SRW.REFERENCE(:P_BOOK_NAME);
    SRW.REFERENCE(:ASSET_ID);
    SELECT asset_id,
         lag(life_in_months,1,0) over (PARTITION BY asset_id
                   ORDER BY transaction_header_id_in) Extend_Life
    INTO l_asset_id,
    l_life_in_months
    FROM fa_books
    WHERE book_type_code = 'US GAAP'
    AND asset_id = 1;
    return l_life_in_months;
    end;
    Has anyone seen this error before? Does the client-side PL/SQL engine not support analytic functions? The query above runs fine in SQL.
    Thanks,
    Ashish

    From our version of 6i Reports Builder Help, I got ...
    Oracle ORACLE PL/SQL V8.0.6.3.0 - Production
    You may check yours.
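    If the trouble is that the old client-side PL/SQL parser in Reports 6i (the 8.0.6 engine above) predates analytic syntax, a common workaround is to move the OVER clause into a server-side view so the formula column only issues plain SQL. A sketch (the view name is made up):

    ```sql
    -- The database parses the analytic; the Reports PL/SQL engine sees only a plain SELECT.
    CREATE OR REPLACE VIEW fa_books_prior_life AS
    SELECT asset_id,
           book_type_code,
           transaction_header_id_in,
           LAG(life_in_months, 1, 0)
             OVER (PARTITION BY asset_id
                   ORDER BY transaction_header_id_in) extend_life
    FROM   fa_books;
    ```

    The formula column can then SELECT extend_life FROM fa_books_prior_life with an ordinary WHERE clause, which the old parser accepts.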

  • Using analytical functions

    I need to categorize the entries in col c1 as follows:
    1. if the entry in col c1 has only positive values in col c2, then that entry in col c1 gets a category of PRD.
    2. if the entry in col c1 has only negative values in col c2, then that entry in col c1 gets a category of RMT.
    3. if the entry in col c1 has both positive and negative values in col c2, then that entry in col c1 gets a category of INT.
    Here's the data table:
    CREATE TABLE K_TST (
    C1 VARCHAR2(10),
    C2 NUMBER
    );
    c1     c2
    a     1
    a     -1
    a     -3
    a     2
    a     -2
    b     1
    c     -1
    d     1
    d     2
    d     3
    So I use the following query:
    select a.c1,
         a.init_category,
         count(*) over (partition by a.c1) category_count,
         decode(count(*) over (partition by a.c1), 1, a.init_category, 'INT') final_category
    from (
         select distinct
              c1,
              decode(instr(to_char(c2), '-'), 0, 'PRD', 'RMT') init_category
         from k_tst
    ) a
    The idea is that I assign only categories PRD and RMT in the sub-query and then use the analytical function to count if an entry in col c1 belongs to more than one category. If it belongs to more than one category in the subquery, then the category is reassigned to INT.
    No problem, the result is:
    C1     INIT_CATEGORY     CATEGORY_COUNT     FINAL_CATEGORY
    a     PRD          2          INT
    a     RMT          2          INT
    b     PRD          1          PRD
    c     RMT          1          RMT
    d     PRD          1          PRD
    The report that I want is without INIT_CATEGORY col and only distinct values, so I modify the query from above to this:
    select distinct
         a.c1,
    --     a.init_category,
         count(*) over (partition by a.c1) category_count,
         decode(count(*) over (partition by a.c1), 1, a.init_category, 'INT') final_category
     from (
         select distinct
              c1,
              decode(instr(to_char(c2), '-'), 0, 'PRD', 'RMT') init_category
         from k_tst
    ) a
    The result is:
    C1     CATEGORY_COUNT     FINAL_CATEGORY
    a     5          INT
    b     1          PRD
    c     1          RMT
    d     3          INT
    Note that the CATEGORY_COUNT for 'a' changed to 5 and the CATEGORY_COUNT for 'd' changed to 3. So 'd' is now categorized as INT instead of the desired category PRD (all entries in col c2 for 'd' are positive).
    Why did the results of CATEGORY_COUNT change by merely adding the 'distinct' to the outer query?
    Thanks for the time to answer this question.
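The categorization itself does not need analytics or DISTINCT at all: a plain GROUP BY with MIN and MAX decides PRD/RMT/INT per c1 directly, which sidesteps whatever query transformation changed the counts. A sketch using Python's sqlite3 as a stand-in for Oracle, with the posted data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE k_tst (c1 TEXT, c2 INTEGER)")
rows = [("a", 1), ("a", -1), ("a", -3), ("a", 2), ("a", -2),
        ("b", 1), ("c", -1), ("d", 1), ("d", 2), ("d", 3)]
conn.executemany("INSERT INTO k_tst VALUES (?, ?)", rows)

# MIN/MAX per c1 decide the category directly: all positive -> PRD,
# all negative -> RMT, mixed -> INT. No DISTINCT, no analytics.
sql = """
SELECT c1,
       CASE WHEN MIN(c2) > 0 THEN 'PRD'
            WHEN MAX(c2) < 0 THEN 'RMT'
            ELSE 'INT'
       END AS final_category
FROM k_tst
GROUP BY c1
ORDER BY c1
"""
print(conn.execute(sql).fetchall())
# → [('a', 'INT'), ('b', 'PRD'), ('c', 'RMT'), ('d', 'PRD')]
```

The same GROUP BY form should work unchanged in Oracle.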

  • Using Analytic functions in Cursors

    Hi,
    I am trying to use an analytic function as part of my SELECT statement in a cursor declaration. However, I get a compilation error.
    Any idea?

  • Date ranges - possible to use analytic functions?

    The following data structure needs to be converted to a date-range structure.
    START_DATE END_DATE      AMMOUNT
    01-01-2010 28-02-2010         10
    01-02-2010 31-03-2010         20
    01-03-2010 31-05-2010         30
    01-09-2010 31-12-2010         40
    Working solution:
    with date_ranges
    as   ( select to_date('01-01-2010','dd-mm-yyyy') start_date
           ,      to_date('28-02-2010','dd-mm-yyyy') end_date
           ,      10                                 ammount
           from   dual
           union all
           select to_date('01-02-2010','dd-mm-yyyy') start_date
           ,      to_date('31-03-2010','dd-mm-yyyy') end_date
           ,      20                                 ammount
           from   dual
           union all
           select to_date('01-03-2010','dd-mm-yyyy') start_date
           ,      to_date('31-05-2010','dd-mm-yyyy') end_date
           ,      30                                 ammount
           from   dual
           union all
           select to_date('01-09-2010','dd-mm-yyyy') start_date
           ,      to_date('31-12-2010','dd-mm-yyyy') end_date
           ,      40                                 ammount
           from   dual
         )
    select   rne.start_date
    ,        lead (rne.start_date-1,1)  over (order by rne.start_date) end_date
    ,        ( select sum(dre2.ammount)
               from   date_ranges dre2
               where  rne.start_date >= dre2.start_date
               and    rne.start_date <= dre2.end_date
             ) range_ammount
    from     ( select dre.start_date
               from   date_ranges dre
               union -- implicit distinct
               select dre.end_date + 1
               from   date_ranges dre
             ) rne
    order by rne.start_date
    /
    Output:
    START_DATE END_DATE   RANGE_AMMOUNT
    01-01-2010 31-01-2010            10
    01-02-2010 28-02-2010            30
    01-03-2010 31-03-2010            50
    01-04-2010 31-05-2010            30
    01-06-2010 31-08-2010
    01-09-2010 31-12-2010            40
    01-01-2011
    7 rows selected.

    However, I would like to use an analytic function to calculate the range_ammount. Is this possible?
    Edited by: user5909557 on Jul 29, 2010 6:19 AM

    Hi,
    Welcome to the forum!
    Yes, you can replace the scalar sub-query with an analytic SUM, like this:
    WITH  change_data   AS
    (
         SELECT  start_date     AS change_date
         ,       ammount        AS net_amount
         FROM    date_ranges
        UNION
         SELECT  end_date + 1   AS change_date
         ,       -ammount       AS net_amount
         FROM    date_ranges
    )
    ,     got_range_amount     AS
    (
         SELECT  change_date    AS start_date
         ,       LEAD (change_date) OVER (ORDER BY change_date) - 1
                                AS end_date
         ,       SUM (net_amount) OVER (ORDER BY change_date)
                                AS range_amount
         FROM    change_data
    )
    ,     got_grp     AS
    (
         SELECT  start_date
         ,       end_date
         ,       range_amount
         ,       ROW_NUMBER () OVER (ORDER BY start_date, end_date)
               - ROW_NUMBER () OVER (PARTITION BY range_amount
                                     ORDER BY start_date, end_date
                                    )        AS grp
         FROM    got_range_amount
    )
    SELECT    MIN (start_date)     AS start_date
    ,         MAX (end_date)       AS end_date
    ,         range_amount
    FROM      got_grp
    GROUP BY  grp
    ,         range_amount
    ORDER BY  grp
    ;

    This should be much more efficient.
    The code is longer than what you posted. That's largely because it consolidates consecutive groups with the same amount.
    For example, if we add this row to the sample data:
           union all
           select to_date('02-01-2010','dd-mm-yyyy') start_date
           ,      to_date('30-12-2010','dd-mm-yyyy') end_date
           ,      0                                 ammount
           from   dual

    The query you posted produces:
    START_DAT END_DATE  RANGE_AMMOUNT
    01-JAN-10 01-JAN-10            10
    02-JAN-10 31-JAN-10            10
    01-FEB-10 28-FEB-10            30
    01-MAR-10 31-MAR-10            50
    01-APR-10 31-MAY-10            30
    01-JUN-10 31-AUG-10             0
    01-SEP-10 30-DEC-10            40
    31-DEC-10 31-DEC-10            40
    01-JAN-11

    I assume you only want a new row of output when the range_amount changes, that is:
    START_DAT END_DATE  RANGE_AMOUNT
    01-JAN-10 31-JAN-10           10
    01-FEB-10 28-FEB-10           30
    01-MAR-10 31-MAR-10           50
    01-APR-10 31-MAY-10           30
    01-JUN-10 31-AUG-10            0
    01-SEP-10 31-DEC-10           40
    01-JAN-11                      0

    Of course, you could modify the original query so that it did this, but it would end up about as complex as the query above, and less efficient.
    Conversely, if you prefer the longer output, then you don't need the sub-query got_grp in the query above.
    Thanks for posting the CREATE TABLE and INSERT statements; that's very helpful.
    There are some people who have been using this forum for years who still have to be begged to do that.
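The change-point idea behind change_data (+amount at a range start, -amount the day after it ends, then a running SUM over the change points) can be sketched with Python's sqlite3 as a stand-in for Oracle (window functions need SQLite 3.25+). Integer day numbers replace the dates here, and duplicate change points are netted out with a GROUP BY first so each day appears once:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE date_ranges (start_d INTEGER, end_d INTEGER, amount INTEGER)")
# day-number stand-ins for the 01-01-2010 .. 31-12-2010 style ranges
conn.executemany("INSERT INTO date_ranges VALUES (?,?,?)",
                 [(1, 59, 10), (32, 90, 20), (60, 151, 30), (244, 365, 40)])

# Each range start contributes +amount, each day after a range end -amount;
# the running SUM over the change points is the amount in force per interval.
sql = """
WITH change_data AS (
    SELECT start_d AS change_d,  amount AS net FROM date_ranges
    UNION ALL
    SELECT end_d + 1,           -amount        FROM date_ranges
),
net_per_day AS (
    SELECT change_d, SUM(net) AS net
    FROM   change_data
    GROUP  BY change_d
)
SELECT change_d                                    AS start_d,
       LEAD(change_d) OVER (ORDER BY change_d) - 1 AS end_d,
       SUM(net)       OVER (ORDER BY change_d)     AS range_amount
FROM   net_per_day
ORDER  BY change_d
"""
for row in conn.execute(sql):
    print(row)
```

The last interval has no upper bound, so LEAD returns NULL for its end, matching the open-ended final row in the thread's output.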

  • Want to use analytical function as a Virtual column

    I am wondering if I can use an analytic function as a virtual column to a table?
    The table contains a field named BUSINESS_RUN_DATE, which becomes the EXPIRY_DATE of the previous record. So we want to add this value right into the table without resorting to a view.
    This is what I tried to add the column to the table:
    alter table stg_xref_test_virtual
    ADD (expiry_date2 date generated always AS (max(business_run_date) over
        (PARTITION BY ntrl_src_sys_key order by business_run_date
         rows between 1 preceding and 1 following)));
    It gives me an error that GROUP BY is not allowed.
    Can someone help out?
    Thanks,
    Ian

    From the documentation.
    [Column Expressions|http://download.oracle.com/docs/cd/B28359_01/server.111/b28286/expressions005.htm#BABIGHHI]
    A column expression, which is designated as column_expr in subsequent syntax diagrams, is a limited form of expr. A column expression can be a simple expression, compound expression, function expression, or expression list, but it can contain only the following forms of expression:
    * Columns of the subject table — the table being created, altered, or indexed
    * Constants (strings or numbers)
    * Deterministic functions — either SQL built-in functions or user-defined functions
    No other expression forms described in this chapter are valid. In addition, compound expressions using the PRIOR keyword are not supported, nor are aggregate functions.
    You can use a column expression for these purposes:
    * To create a function-based index.
    * To explicitly or implicitly define a virtual column. When you define a virtual column, the defining column_expr must refer only to columns of the subject table that have already been defined, in the current statement or in a prior statement.
    The combined components of a column expression must be deterministic. That is, the same set of input values must return the same set of output values.
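Since windowed expressions are not deterministic per row, they cannot back a virtual column; the usual workaround is a view that exposes the analytic result as if it were a column. A sketch with Python's sqlite3 as a stand-in for Oracle (window functions need SQLite 3.25+); it uses LEAD to take the next BUSINESS_RUN_DATE per key as the expiry, an assumption about the intended rule rather than the windowed MAX from the post, and the data is made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stg_xref (ntrl_src_sys_key INTEGER,
                                       business_run_date INTEGER)""")
conn.executemany("INSERT INTO stg_xref VALUES (?,?)",
                 [(1, 10), (1, 20), (1, 30), (2, 5)])

# A view can hold the analytic expression that a virtual column cannot:
# each row's expiry is the next run date within the same key.
conn.execute("""
CREATE VIEW stg_xref_v AS
SELECT ntrl_src_sys_key,
       business_run_date,
       LEAD(business_run_date)
           OVER (PARTITION BY ntrl_src_sys_key
                 ORDER BY business_run_date) AS expiry_date2
FROM stg_xref
""")
print(conn.execute(
    "SELECT * FROM stg_xref_v ORDER BY ntrl_src_sys_key, business_run_date"
).fetchall())
```

The latest row per key gets a NULL expiry, which is usually the desired open-ended behaviour.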

  • Lag analytical function and controlling the offset

    Can you help me on that? Small challenge. At least I gave up in half a day.
    Data
    ACCOUNT_NUMBER     BATCH_ID     TRANSACTION_DATE     TRANSACTION_TYPE     TRANSACTION_NUMBER     PARENT_TRANSACTION_NUMBER
    124680     ZY000489     1/11/2011     62     377     NULL
    124680     ZY000489     1/11/2011     1     378     NULL
    124680     ZY000489     1/11/2011     1     379     NULL
    124680     ZY000489     1/11/2011     1     380     NULL
    124680     ZY000489     1/11/2011     62     381     NULL
    124680     ZY000489     1/11/2011     1     382     NULL
    124681     ZY000490     1/11/2011     350     4000     NULL
    124681     ZY000490     1/11/2011     1     4001     NULL
    124681     ZY000490     1/11/2011     1     4002     NULL
    I want to identify parent Transaction Number for each row in above data.
    The rules for identifying the parent transaction are:
    -     All child transactions have transaction type 1.
    -     One main transaction can have multiple line items.
    -     Any transaction type can have related child transactions (transaction type 1).
    -     Each logical group of transactions has the same account number, batch id, and transaction date, and consecutive transaction numbers (like 377, 378, 379, 380 in the example above).
    The data should look like below once I identified parent transaction columns:
    ACCOUNT_NUMBER     BATCH_ID     TRANSACTION_DATE     TRANSACTION_TYPE     TRANSACTION_NUMBER     PARENT_TRANSACTION_NUMBER
    124680     ZY000489     1/11/2011     62     377     377
    124680     ZY000489     1/11/2011     1     378     377
    124680     ZY000489     1/11/2011     1     379     377
    124680     ZY000489     1/11/2011     1     380     377
    124680     ZY000489     1/11/2011     62     381     381
    124680     ZY000489     1/11/2011     1     382     381
    124681     ZY000490     1/11/2011     350     4000     4000
    124681     ZY000490     1/11/2011     1     4001     4000
    124681     ZY000490     1/11/2011     1     4002     4000
    I tried using the LAG analytic function, trying to lag dynamically with an offset, but had difficulty expanding the offset dynamically. It's a control-break kind of functionality that I want to achieve in a single SQL statement.
    I know we can do it using a PL/SQL construct, but the challenge is to do it in a single SQL statement. Please help.
    Please let me know if you are able to do it in single SQL.
    Thanks

    Can probably pretty this up ... I just went for functional code for the moment.
    TUBBY_TUBBZ?with
      2     data (acc_no, batch_id, trans_date, trans_type, trans_no) as
      3  (
      4    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 62   , 377   from dual union all
      5    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 378   from dual union all
      6    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 379   from dual union all
      7    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 380   from dual union all
      8    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 62   , 381   from dual union all
      9    select 124680, 'ZY000489', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 382   from dual union all
    10    select 124681, 'ZY000490', to_date('1/11/2011', 'mm/dd/yyyy'), 350  , 4000  from dual union all
    11    select 124681, 'ZY000490', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 4001  from dual union all
    12    select 124681, 'ZY000490', to_date('1/11/2011', 'mm/dd/yyyy'), 1    , 4002  from dual
    13  )
    14  select
    15    acc_no,
    16    batch_id,
    17    trans_date,
    18    trans_type,
    19    trans_no,
    20    case when trans_type != 1
    21    then
    22      trans_no
    23    else
    24      lag
    25      (
    26        case when trans_type = 1
    27        then
    28           null
    29        else
    30           trans_no
    31        end
    32        ignore nulls
    33      ) over (partition by acc_no, batch_id, trans_date order by trans_no asc)
    34    end as parent_trans_no
    35  from data;
                ACC_NO BATCH_ID                 TRANS_DATE                         TRANS_TYPE           TRANS_NO    PARENT_TRANS_NO
                124680 ZY000489                 11-JAN-2011 12 00:00                       62                377                377
                124680 ZY000489                 11-JAN-2011 12 00:00                        1                378                377
                124680 ZY000489                 11-JAN-2011 12 00:00                        1                379                377
                124680 ZY000489                 11-JAN-2011 12 00:00                        1                380                377
                124680 ZY000489                 11-JAN-2011 12 00:00                       62                381                381
                124680 ZY000489                 11-JAN-2011 12 00:00                        1                382                381
                124681 ZY000490                 11-JAN-2011 12 00:00                      350               4000               4000
                124681 ZY000490                 11-JAN-2011 12 00:00                        1               4001               4000
                124681 ZY000490                 11-JAN-2011 12 00:00                        1               4002               4000
    9 rows selected.
    Elapsed: 00:00:00.01
    TUBBY_TUBBZ?
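The key trick in the session above is LAG with IGNORE NULLS: carry the last non-child transaction number forward within each partition. Engines without IGNORE NULLS (SQLite, for one) need the carry-forward done by hand; a minimal Python sketch of the same logic over the thread's rows, with shortened column names:

```python
rows = [  # (acc_no, batch_id, trans_type, trans_no)
    (124680, "ZY000489", 62, 377), (124680, "ZY000489", 1, 378),
    (124680, "ZY000489", 1, 379),  (124680, "ZY000489", 1, 380),
    (124680, "ZY000489", 62, 381), (124680, "ZY000489", 1, 382),
    (124681, "ZY000490", 350, 4000), (124681, "ZY000490", 1, 4001),
    (124681, "ZY000490", 1, 4002),
]

def assign_parents(rows):
    """Within each (acc_no, batch_id) group, sorted by trans_no, a
    non-type-1 row starts a new parent; type-1 rows inherit the most
    recent parent seen. This mirrors LAG(... IGNORE NULLS)."""
    out, last_parent, last_key = [], None, None
    for acc_no, batch_id, trans_type, trans_no in sorted(
            rows, key=lambda r: (r[0], r[1], r[3])):
        key = (acc_no, batch_id)
        if key != last_key:                 # new partition: reset the carry
            last_parent, last_key = None, key
        if trans_type != 1:                 # a main transaction is its own parent
            last_parent = trans_no
        out.append((acc_no, batch_id, trans_type, trans_no, last_parent))
    return out

for r in assign_parents(rows):
    print(r)
```

The parent column comes out as 377, 377, 377, 377, 381, 381, 4000, 4000, 4000, matching the desired output in the thread.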

  • Analytic functions in expression

    Hi list,
    is there a way of using an analytic function in an OWB expression? Just like with DECODE it doesn't seem possible, because OWB creates something like x := analytic_function(), I think for tracing or error handling purposes. If I want to have, say, a rank() in the outcome of a join (i.e. the SELECT part), I have to create a view and integrate that into the mapping, is that correct? Or is there another way around?
    TIA,
    Bjoern

    Not sure what you are trying to accomplish - can you please elaborate? You should be able to invoke functions that accept scalars as parameters and return scalars in most cases. Another solution is (as mentioned by you) a view.
    Regards:
    Igor

  • Having clause with Analytic function

    Can you please let me know if we can use a HAVING clause with an analytic function?
    select eid,empno,sum(sal) over(partition by year)
    from employee
    where dept = 'SALES'
    having sum(sal) > 10000

    I'm getting an error while using the above.
    Is it possible to use a HAVING clause with PARTITION BY?
    Thanks in advance

    Your HAVING clause isn't using an analytic function; it's using a regular aggregate function.
    You also can't use analytical functions in the where clause or having clause like that as they are windowing functions and belong at the top of the query.
    You would have to wrap the query to achieve what you want e.g.
    SQL> ed
    Wrote file afiedt.buf
      1  select deptno, total_sal
      2  from (
      3        select deptno,sum(sal) over (partition by deptno) as total_sal
      4        from   emp
      5       )
      6  group by deptno, total_sal
      7* having total_sal > 10000
    SQL> /
        DEPTNO  TOTAL_SAL
            20      10875
    SQL>
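The wrap-the-analytic pattern from the reply, computing the window sum in an inner query and filtering in the outer one, can be reproduced with Python's sqlite3 as a stand-in for Oracle (window functions need SQLite 3.25+); the emp data here is made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (deptno INTEGER, sal INTEGER)")
conn.executemany("INSERT INTO emp VALUES (?,?)",
                 [(10, 5000), (10, 2450), (20, 6000), (20, 4875), (30, 9400)])

# The analytic runs in the inner query; the filter that HAVING could not
# express on it goes in the outer WHERE.
sql = """
SELECT DISTINCT deptno, total_sal
FROM (
    SELECT deptno, SUM(sal) OVER (PARTITION BY deptno) AS total_sal
    FROM emp
)
WHERE total_sal > 10000
"""
print(conn.execute(sql).fetchall())
```

Only department 20 (6000 + 4875 = 10875) clears the 10000 threshold.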

  • Oracle Analytic Function Issue

    Hi, I have created a simple table contaning 3 columns and 3 records.
    Insert into MYTABLE (ID, AMOUNT, RESULT) Values (1, 1, 1);
    Insert into MYTABLE (ID, AMOUNT, RESULT) Values (2, 4, 1);
    Insert into MYTABLE (ID, AMOUNT, RESULT) Values (3, 7, 0);
    COMMIT;
    I can SUM the AMOUNT using the analytic functions as
    SELECT ID, AMOUNT, RESULT, SUM(AMOUNT) OVER() as TOTAL
    FROM MYTABLE;
    ID   AMOUNT   RESULT   TOTAL
    1    1        1        12
    2    4        1        12
    3    7        0        12
    What I want to be able to do is sum the AMOUNTs by RESULT, in this case 0 and 1.
    To get the following result, how should I rewrite the query?
    ID   AMOUNT   RESULT   TOTAL   RESULT_0_TOTAL   RESULT_1_TOTAL
    1    1        1        12      7                5
    2    4        1        12      7                5
    3    7        0        12      7                5

    SELECT ID, AMOUNT, RESULT
    , SUM(AMOUNT) OVER() as TOTAL
    , SUM(CASE WHEN RESULT = 0 THEN AMOUNT ELSE 0 END) OVER() as RESULT_0_TOTAL
    , SUM(CASE WHEN RESULT = 1 THEN AMOUNT ELSE 0 END) OVER() as RESULT_1_TOTAL
    FROM MYTABLE;
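The CASE-inside-SUM pattern from the reply can be checked end to end; a sketch using Python's sqlite3 as a stand-in for Oracle (window functions need SQLite 3.25+), with the posted MYTABLE data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (id INTEGER, amount INTEGER, result INTEGER)")
conn.executemany("INSERT INTO mytable VALUES (?,?,?)",
                 [(1, 1, 1), (2, 4, 1), (3, 7, 0)])

# An empty OVER() windows over all rows; the CASE restricts which AMOUNTs
# each conditional sum sees, so every row carries all three totals.
sql = """
SELECT id, amount, result,
       SUM(amount) OVER ()                                  AS total,
       SUM(CASE WHEN result = 0 THEN amount ELSE 0 END) OVER () AS result_0_total,
       SUM(CASE WHEN result = 1 THEN amount ELSE 0 END) OVER () AS result_1_total
FROM mytable
ORDER BY id
"""
for row in conn.execute(sql):
    print(row)
```

Every row repeats 12 / 7 / 5, matching the desired output table above.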

  • Query in analytic function

    Hi,
    I am using below query
    select * from
    (
        SELECT FLAG, S_DATE,
               ROW_NUMBER() OVER (PARTITION BY flag ORDER BY S_DATE, FLAG) as d
        FROM table_name
    )
    ORDER BY S_DATE
    which gives below output
    Flag     | S_DATE      | D
    Y     | 2/27/2012 5:33     |     1
    Y     | 2/27/2012 5:34     |     2
    Y     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:34     |     1
    N     | 2/27/2012 5:34     |     2
    N     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:35     |     4
    N     | 2/27/2012 5:35     |     5
    Y     |  2/27/2012 5:36     |     4
    Y     |  2/27/2012 5:36     |     5
    Y     |  2/27/2012 5:36     |     6
    But I want the output to be in the order below; there is a change in the last 3 rows:
    Flag     | S_DATE      | D
    Y     | 2/27/2012 5:33     |     1
    Y     | 2/27/2012 5:34     |     2
    Y     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:34     |     1
    N     | 2/27/2012 5:34     |     2
    N     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:35     |     4
    N     | 2/27/2012 5:35     |     5
    Y     |  2/27/2012 5:36     |     1
    Y     |  2/27/2012 5:36     |     2
    Y     |  2/27/2012 5:36     |     3
    I have used the analytic function.
    Edited by: user8858890 on Feb 27, 2012 2:00 AM

    Hi,
    user8858890 wrote:
    ... But i want the output to be in below order there is change in last 3 rows
    Flag     | S_DATE      | D
    Y     | 2/27/2012 5:33     |     1
    Y     | 2/27/2012 5:34     |     2
    Y     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:34     |     1
    N     | 2/27/2012 5:34     |     2
    N     | 2/27/2012 5:34     |     3
    N     | 2/27/2012 5:35     |     4
    N     | 2/27/2012 5:35     |     5
    Y     |  2/27/2012 5:36     |     1
    Y     |  2/27/2012 5:36     |     2
    Y     |  2/27/2012 5:36     |     3
    Why do you want the last 3 rows (which have flag = 'Y') to be numbered 1, 2, 3, when the first 3 rows (which also have flag = 'Y') already have numbers 1, 2 and 3? Do you want a separate #1 whenever there is a group of consecutive rows (when ordered by s_date) that have the same flag? If so, then you have to identify the groups, like this:
    WITH got_grp_id AS
    (
        SELECT  flag
        ,       s_date
        ,       ROWID        AS r_id
        ,       ROW_NUMBER () OVER (ORDER BY s_date, ROWID)
              - ROW_NUMBER () OVER (PARTITION BY flag
                                    ORDER BY s_date, ROWID
                                   )  AS grp_id
        FROM    table_name
    )
    SELECT    flag
    ,         s_date
    ,         ROW_NUMBER () OVER (PARTITION BY flag, grp_id
                                  ORDER BY s_date, r_id
                                 )  AS d
    FROM      got_grp_id
    ORDER BY  s_date
    ,         grp_id
    ,         d
    ;

    This assumes that each row can be uniquely identified, so that the order is unambiguous. In your sample data, there are completely identical rows, so I used ROWID to uniquely identify the rows. Using ROWID assumes that table_name is a real table, not just a result set.
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
    Explain, using specific examples, how you get those results from that data.
    Always say what version of Oracle you're using.
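The difference-of-ROW_NUMBERs grouping used above (often called the Tabibitosan method) can be demonstrated with Python's sqlite3 as a stand-in for Oracle (window functions need SQLite 3.25+); a seq column stands in for the s_date/ROWID tiebreak:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (seq INTEGER PRIMARY KEY, flag TEXT)")
flags = ["Y", "Y", "Y", "N", "N", "N", "N", "N", "Y", "Y", "Y"]
conn.executemany("INSERT INTO t VALUES (?,?)", list(enumerate(flags, 1)))

# The difference of the two ROW_NUMBERs is constant within each run of
# consecutive equal flags, so it serves as a group id that restarts the
# numbering at every change of flag.
sql = """
WITH g AS (
    SELECT seq, flag,
           ROW_NUMBER() OVER (ORDER BY seq)
         - ROW_NUMBER() OVER (PARTITION BY flag ORDER BY seq) AS grp
    FROM t
)
SELECT flag, seq,
       ROW_NUMBER() OVER (PARTITION BY flag, grp ORDER BY seq) AS d
FROM g
ORDER BY seq
"""
print([row[2] for row in conn.execute(sql)])
```

The d column restarts at 1 for the second run of 'Y' rows, exactly the numbering the questioner asked for.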

  • Using the Ago Function

    Hi,
    I have a time dimension with levels AllTime, Year, Month, Week, Day. I set the chronological key to the day and created an Ago Measure in my fact. The repository global consistency was successful. However when I used it in answers, I got the following error message:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 22040] To use AGO function, the storage level of the query ('[DIM_DATE.DIMENSION_KEY]') must be a static level. (HY000)
    SQL Issued: SELECT DIM_DATE.CALENDAR_YEAR_NAME saw_0, F_TEST.MEASURE saw_1, F_TEST.YearAgoMeasures saw_2 FROM WMS ORDER BY saw_0
    Does anyone have any idea please? Note that the DIM_DATE.DIMENSION_KEY is the primary key of the time dimension table.
    Thanks a lot
    Marija

    Hi Wildmight,
    I restarted everything and got the "must be a static level. (HY000)" error fixed. Then I checked the chronological key in the level "year" but it seems not to be working. It's taking a very long time to show the results (I finally cancelled it).
    Reviewing the nqquery log file, I don't really get how OBI retrieves the TODATE info; it uses the analytic function ROW_NUMBER() OVER (PARTITION BY ...). Do you get the same?
    Thanks again.
