Having clause with Analytic function

Can you please let me know if we can use the HAVING clause with an analytic function?
select eid,empno,sum(sal) over(partition by year)
from employee
where dept = 'SALES'
having sum(sal) > 10000
I'm getting an error while using the above.
Is it possible to use the HAVING clause with PARTITION BY?
Thanks in advance

Your HAVING clause isn't using an analytic function; it's using a regular aggregate function.
You also can't use analytic functions in the WHERE clause or HAVING clause like that, as they are windowing functions: they are evaluated after those clauses, in the SELECT part of the query.
You would have to wrap the query to achieve what you want e.g.
SQL> ed
Wrote file afiedt.buf
  1  select deptno, total_sal
  2  from (
  3        select deptno,sum(sal) over (partition by deptno) as total_sal
  4        from   emp
  5       )
  6  group by deptno, total_sal
  7* having total_sal > 10000
SQL> /
    DEPTNO  TOTAL_SAL
        20      10875
SQL>
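For what it's worth, a sketch of an equivalent alternative against the same emp table: filter the wrapped query with WHERE instead of GROUP BY/HAVING, since the analytic SUM is already attached to every row.
select distinct deptno, total_sal
from  (
       select deptno, sum(sal) over (partition by deptno) as total_sal
       from   emp
      )
where total_sal > 10000;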

Similar Messages

  • Need valuable guidance to make a performance-oriented query, trying to replace unions with analytical function

    Hi,
       Please find below the table structure and insert scripts. Requesting your valuable help.
    create table temp2 (col1 number,col2 varchar2(10),col3 number,col4 varchar2(20));
    insert into temp2 values (1,'a',100,'vvv');
    insert into temp2 values (2,'b',200,'www'); 
    insert into temp2 values (3,'c',300,'xxx');
    insert into temp2 values (4,'d',400,'yyy');   
    insert into temp2 values (5,'e',500,'zzz');
    insert into temp2 values (6,'f',600,'aaa');
    insert into temp2 values (7,'g',700,'bbb'); 
    insert into temp2 values (8,'h',800,'ccc');
    I am trying to get the same output as the UNION query below, but using an ANALYTIC function.
    select * from temp2 where col1 in (1,2,3,4,5)
    union
    select * from temp2 where col1 in (1,2,5,6)
    union
    select * from temp2 where col1 in (1,2,7,8);
    I am using this dummy example to understand the concept: how can we use analytic functions in place of UNION or OUTER JOINs?
    In my actual query I use the same table three times, combined with UNION, so temp2 is scanned three times; for bulky tables that repeated scanning hampers the query's performance.
    Three scans of the same table is not performance oriented, so once I understand the concept above, I will try to remove the UNIONs from my actual query.
    Thanks!!
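    As a rough sketch for the dummy temp2 example (assuming the branches really do differ only in their WHERE predicates), the three scans can be collapsed into one by OR-ing the conditions; add DISTINCT if temp2 could hold duplicate rows, since UNION removes duplicates:
    select *
    from   temp2
    where  col1 in (1,2,3,4,5)
       or  col1 in (1,2,5,6)
       or  col1 in (1,2,7,8);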

    Thanks for your time, BluShadow, and sorry that I couldn't make my query clear; let me try again. Below are three queries. All three use the same tables and differ only in a few conditions. I know you can't run them in your database, but I think they will convey my doubt. I have noted the number of rows returned by each branch, and in total I get 67 rows (probably because the first and third queries' result sets are subsets of the second query's result set).
    So I want all the common rows, plus any additional rows present in any of the queries. That is easily done with the UNION clause, but I want to do it another way, because here the same table gets scanned again and again.
    SELECT
             START_TX.FX_TRAN_ID START_FX_TRAN_ID
            ,END_TX.FX_TRAN_ID END_FX_TRAN_ID
            ,START_TX.ENTERED_DT_TS
            ,USER
            ,START_TX.TRADE_DT
            ,START_TX.DEAL_NUMBER
            ,START_TX.FX_DEAL_TYPE
            ,START_TX.ORIENTATION_BUYSELL
            ,START_TX.BASE_CCY
            ,START_TX.BASE_CCY_AMT
            ,START_TX.SECONDARY_CCY
            ,START_TX.SECONDARY_CCY_AMT
            ,START_TX.MATURITY_DT
            ,START_TX.TRADE_RT
            ,START_TX.FORWARD_PTS              
            ,START_TX.CORPORATE_PIPS           
            ,START_TX.DEAL_OWNER_INITIALS      
            ,START_TX.CORPORATE_DEALER         
            ,START_TX.PROFIT_CENTER_CD
            ,START_TX.COUNTERPARTY_NM
            ,START_TX.COUNTERPARTY_NUMBER
      FROM
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS >=  TO_DATE('20-Nov-2013 4:00:01 AM','DD-Mon-YYYY HH:MI:SS AM')) START_TX
           INNER JOIN
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS <=  TO_DATE('20-Nov-2013 4:59:59 PM','DD-Mon-YYYY HH:MI:SS AM'))  END_TX
       ON START_TX.COUNTERPARTY_NM        = END_TX.COUNTERPARTY_NM         AND
          START_TX.COUNTERPARTY_NUMBER    = END_TX.COUNTERPARTY_NUMBER     AND
          START_TX.FX_DEAL_TYPE           = END_TX.FX_DEAL_TYPE            AND
          START_TX.BASE_CCY               = END_TX.BASE_CCY                AND
          START_TX.SECONDARY_CCY          = END_TX.SECONDARY_CCY           AND
          NVL(START_TX.CORPORATE_DEALER,'nullX')=NVL(END_TX.CORPORATE_DEALER,'nullX')       AND
          START_TX.ORIENTATION_BUYSELL='B'                                 AND 
          END_TX.ORIENTATION_BUYSELL='S'                                  AND
          START_TX.FX_TRAN_ID = 1850718                                  AND
         (START_TX.BASE_CCY_AMT           = END_TX.BASE_CCY_AMT          
          OR
          START_TX.SECONDARY_CCY_AMT      = END_TX.SECONDARY_CCY_AMT)        -- 10 Rows
    UNION
    SELECT
             START_TX.FX_TRAN_ID START_FX_TRAN_ID
            ,END_TX.FX_TRAN_ID END_FX_TRAN_ID
            ,START_TX.ENTERED_DT_TS
            ,USER
            ,START_TX.TRADE_DT
            ,START_TX.DEAL_NUMBER
            ,START_TX.FX_DEAL_TYPE
            ,START_TX.ORIENTATION_BUYSELL
            ,START_TX.BASE_CCY
            ,START_TX.BASE_CCY_AMT
            ,START_TX.SECONDARY_CCY
            ,START_TX.SECONDARY_CCY_AMT
            ,START_TX.MATURITY_DT
            ,START_TX.TRADE_RT
            ,START_TX.FORWARD_PTS              
            ,START_TX.CORPORATE_PIPS           
            ,START_TX.DEAL_OWNER_INITIALS      
            ,START_TX.CORPORATE_DEALER         
            ,START_TX.PROFIT_CENTER_CD
            ,START_TX.COUNTERPARTY_NM
            ,START_TX.COUNTERPARTY_NUMBER
      FROM
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS >=  TO_DATE('20-Nov-2013 4:00:01 AM','DD-Mon-YYYY HH:MI:SS AM')) START_TX
           INNER JOIN
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS <=  TO_DATE('20-Nov-2013 4:59:59 PM','DD-Mon-YYYY HH:MI:SS AM'))  END_TX
       ON START_TX.COUNTERPARTY_NM        = END_TX.COUNTERPARTY_NM         AND
          START_TX.COUNTERPARTY_NUMBER    = END_TX.COUNTERPARTY_NUMBER     AND
          START_TX.FX_DEAL_TYPE           = END_TX.FX_DEAL_TYPE            AND
          START_TX.BASE_CCY               = END_TX.BASE_CCY                AND
          START_TX.SECONDARY_CCY          = END_TX.SECONDARY_CCY           AND
          NVL(START_TX.CORPORATE_DEALER,'nullX')=NVL(END_TX.CORPORATE_DEALER,'nullX')  AND
          START_TX.FX_TRAN_ID = 1850718                                  AND
          START_TX.ORIENTATION_BUYSELL='B'                                 AND 
          END_TX.ORIENTATION_BUYSELL='S'                        --                                   67 Rows
    UNION 
    SELECT
             START_TX.FX_TRAN_ID START_FX_TRAN_ID
            ,END_TX.FX_TRAN_ID END_FX_TRAN_ID
            ,START_TX.ENTERED_DT_TS
            ,USER
            ,START_TX.TRADE_DT
            ,START_TX.DEAL_NUMBER
            ,START_TX.FX_DEAL_TYPE
            ,START_TX.ORIENTATION_BUYSELL
            ,START_TX.BASE_CCY
            ,START_TX.BASE_CCY_AMT
            ,START_TX.SECONDARY_CCY
            ,START_TX.SECONDARY_CCY_AMT
            ,START_TX.MATURITY_DT
            ,START_TX.TRADE_RT
            ,START_TX.FORWARD_PTS              
            ,START_TX.CORPORATE_PIPS           
            ,START_TX.DEAL_OWNER_INITIALS      
            ,START_TX.CORPORATE_DEALER         
            ,START_TX.PROFIT_CENTER_CD
            ,START_TX.COUNTERPARTY_NM
            ,START_TX.COUNTERPARTY_NUMBER
      FROM
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS >=  TO_DATE('20-Nov-2013 4:00:01 AM','DD-Mon-YYYY HH:MI:SS AM')) START_TX
           INNER JOIN
          (SELECT * FROM FX_TRANSACTIONS WHERE GMT_CONV_ENTERED_DT_TS <=  TO_DATE('20-Nov-2013 4:59:59 PM','DD-Mon-YYYY HH:MI:SS AM'))  END_TX
       ON START_TX.COUNTERPARTY_NM        = END_TX.COUNTERPARTY_NM         AND
          START_TX.COUNTERPARTY_NUMBER    = END_TX.COUNTERPARTY_NUMBER     AND
          START_TX.FX_DEAL_TYPE           = END_TX.FX_DEAL_TYPE            AND
          START_TX.BASE_CCY               = END_TX.BASE_CCY                AND
          START_TX.SECONDARY_CCY          = END_TX.SECONDARY_CCY           AND
          NVL(START_TX.CORPORATE_DEALER,'nullX')=NVL(END_TX.CORPORATE_DEALER,'nullX') AND
          START_TX.ORIENTATION_BUYSELL='B'                                 AND 
          END_TX.ORIENTATION_BUYSELL='S'                                   AND
          START_TX.FX_TRAN_ID = 1850718                                  AND
          (END_TX.BASE_CCY_AMT BETWEEN (START_TX.BASE_CCY_AMT - (START_TX.BASE_CCY_AMT * :PERC_DEV/100)) AND (START_TX.BASE_CCY_AMT + (START_TX.BASE_CCY_AMT * :PERC_DEV/100))
            OR
            END_TX.SECONDARY_CCY_AMT BETWEEN (START_TX.SECONDARY_CCY_AMT - (START_TX.SECONDARY_CCY_AMT*:PERC_DEV/100) ) AND (START_TX.SECONDARY_CCY_AMT + (START_TX.SECONDARY_CCY_AMT*:PERC_DEV/100))
        );                                                       ---                              10 Rows

  • EVALUATE in OBIEE with Analytic function LAST_VALUE

    Hi,
    I'm trying to use EVALUATE with analytic function LAST_VALUE but it is giving me error below:
    [nQSError: 17001] Oracle Error code: 30483, message: ORA-30483: window functions are not allowed here at OCI call OCIStmtExecute. [nQSError: 17010] SQL statement preparation failed. (HY000)
    Thanks
    Kumar.

    Hi Kumar,
    The ORA error tells me that this is something conveyed by the oracle database but not the BI Server. In this case, the BI server might have fired the incorrect query onto the DB and you might want to check what's wrong with it too.
    The LAST_VALUE is an analytic function which works over a set/partition of records. Request you to refer to the semantics at http://docs.oracle.com/cd/B19306_01/server.102/b14200/functions073.htm and see if it is violating any rules here. You may want to post the physical sql here too to check.
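    As a rough illustration of those semantics (a sketch against the standard EMP table, not your actual model): LAST_VALUE is only legal in the SELECT list or ORDER BY of a query, so a physical SQL that pushes it into a WHERE, GROUP BY or HAVING clause is exactly what raises ORA-30483.
    SELECT empno, deptno, sal,
           LAST_VALUE(sal) OVER (PARTITION BY deptno
                                 ORDER BY hiredate
                                 ROWS BETWEEN UNBOUNDED PRECEDING
                                          AND UNBOUNDED FOLLOWING) AS last_sal
    FROM   emp;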
    Hope this helps.
    Thank you,
    Dhar

  • For users having problems with Mail functionality...

    Hello! I have been having problems with Mail functionality as well, so, I took a few steps to see if I could solve the problem on my own.
    My problems were:
    Mail would freeze when sending mail, or, when I pressed quit, it would freeze.
    If I toggled between inboxes too much it would freeze up.
    Even after trying the many techniques others offered on this board, Mail still froze up. So.. here's what I did:
    1.) Turn off the auto-retrieve mail function in the preferences/general menu. Set it to a manual check. Just check it yourself every once in a while. It may not be as convenient, but it works, trust me.
    2.) Under the fonts and colors menu, set all three fonts to the SAME font (mine's Century Schoolbook, but anything works) and turn off fixed-width font. Sometimes Mail has a tough time decoding and working with text, and that causes it to freeze.
    3.) Clean out your main inbox! Place the emails that you want to keep, but that don't need to be in your inbox, in a new folder on your inbox bar (mine's aptly named 'Saved Messages'). That way, when you open up Mail to your default account, it doesn't have 1,000+ messages to index. Also, try to keep the inboxes from having more than 100-200 messages each. It seems to frown on any more than that.
    After trying these three easy steps, restart Mail and see if things run smoother. I have had no problems since these changes, and Mail runs a ton better. Plus you don't have to worry about deleting your Library/Mail folder anymore. Let me know if things work out so I can see if my theory is a universally proven thing. Thanks and good luck!
    iMac G5 20"   Mac OS X (10.4.5)  

    Good question. I have worked very little with IMAP other than as a testing device, so, on a daily routine I cannot say how it would function. However, I can only expect that by having the option to download all emails for offline viewing selected, you are right in it being a lag creator. Another suggestion would be, if you are running a single-computer setup, to have the account delete emails from the server after receiving them on your computer. Thus, you reduce the risk of repeat downloads or unnecessary build-up on the server end. DO NOT select this option if you run multiple-computer setups on one Apple Mail account system or you will cause some discrepancies and/or data loss from system to system. Or, if you have emails redirected from a separate server to Mail (i.e. I redirect mail from my university email account into Mail so that I don't have to go to the web-mail site to check it, just like Gmail or Yahoo etc.), you won't be able to check them from the web-based email system if you have the system delete them. Ok, that sounded confusing. Solution: have the system remove emails from the server after a week (or any period of time, just not immediately) so that, in the off-chance you need to use the online system, you will have access, and it will give you a chance to check the emails on other systems without a threat of data corruption (by corruption I mean that frustrating feeling you get when you can't find that super important email that was in your inbox just 10 seconds ago on your desktop etc.)
    I will see if I can learn anything more about IMAP, Ernie. It's definitely the biggest problem source for Mail; POP and .Mac are simple and functional as long as you don't play with the settings too much. As far as the issue of Offline viewing goes, try turning it off and seeing how Mail functions.
    BEFORE you turn it off, however, I recommend saving all of your emails in your inbox as a text file or something so, in case the server burps and has a brain flatulence, you don't lose anything. Been there, done that, thrown the books against the wall in a frustrated rage.
    My only reservation about the Offline Viewing is whether or not Mail will, when you click "receive mail," have to re-download the emails from the server each time. If so, then it may end up taking longer and lagging more. The issue here is obviously minimizing the amount of activity between Mail and the email server without jeopardizing information on the server or multiple Mail programs on the same account. When we can find that balance, I think Mail's potential will be realized. Keep the questions coming.

  • Help needed with analytical function

    I want to get the employee details of the highest and 2nd highest salaried employee in a particular department. But also the department should have more than 1 employee.
    I tried the query below and it gave me proper results, but I'm wondering if there is an alternative to using the subquery.
    Here is the table and the result query :
    with t as
    (
    select 1 emp_id,3 mgr_id,'Rajesh' emp_name,3999 salary,677 bonus,'HR' dpt_nme from dual union
    select 2 ,3 ,'Gangz',4500,800,'Finance' from dual  union
    select 3 ,4 ,'Sid',8000,12000,'IT' from dual  union
    select 4 ,null,'Ram',5000,677,'HR' from dual  union
    select 5 ,4,'Shyam',6000,677,'IT' from dual union
    select 6 ,4 ,'Ravi',9000,12000,'IT' from dual
    )
    select * from
    (select emp_id, mgr_id, emp_name, dpt_nme, salary, row_number() over (partition by dpt_nme order by salary desc) rn from t where dpt_nme in
    (select dpt_nme from t group by dpt_nme having count(*) > 1)) where rn < 3

    Hi,
    You need a sub-query, but you don't need more than that.
    Here's one way to eliminate the extra sub-query:
    WITH    got_analytics   AS
    (
            SELECT  emp_id, mgr_id, emp_name, dpt_nme, salary
            ,       ROW_NUMBER () OVER ( PARTITION BY  dpt_nme
                                         ORDER BY      salary  DESC
                                       )    AS rn
            ,       COUNT (*)     OVER ( PARTITION BY  dpt_nme
                                       )    AS dpt_cnt
            FROM    t
    )
    SELECT  emp_id, mgr_id, emp_name, dpt_nme, salary
    ,       rn
    FROM    got_analytics
    WHERE   rn      < 3
    AND     dpt_cnt > 1
    ;
    Analytic functions are computed after the WHERE clause is applied. Since we need to use the results of the analytic ROW_NUMBER function in a WHERE clause, that means we'll have to compute ROW_NUMBER in a sub-query, and use the results in the WHERE clause of the main query. We can call the analytic COUNT function in the same sub-query, and use its results in the same WHERE clause of the main query.
    What results would you want if there's a tie for the 2nd highest salary in some department? For example, if you add this row to your sample data:
    select 7 ,3 ,'Sunil',8000,12000,'IT' from dual union
    You may want to use RANK instead of ROW_NUMBER.
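    A sketch of that variant, on the same sample data, so that employees tied on salary share a position instead of one being dropped arbitrarily:
    WITH    got_analytics   AS
    (
            SELECT  emp_id, mgr_id, emp_name, dpt_nme, salary
            ,       RANK ()   OVER ( PARTITION BY  dpt_nme
                                     ORDER BY      salary  DESC
                                   )    AS rnk
            ,       COUNT (*) OVER ( PARTITION BY  dpt_nme )    AS dpt_cnt
            FROM    t
    )
    SELECT  emp_id, mgr_id, emp_name, dpt_nme, salary, rnk
    FROM    got_analytics
    WHERE   rnk     < 3
    AND     dpt_cnt > 1;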

  • Help with analytical function

    I successfully use the following analytical function to sum all net_movement of a position (key for a position: bp_id, prtfl_num, instrmnt_id, cost_prc_crncy) from first occurrence until current row:
    SELECT SUM (net_movement) OVER (PARTITION BY bp_id, prtfl_num, instrmnt_id, cost_prc_crncy ORDER BY TRUNC (val_dt) RANGE BETWEEN UNBOUNDED PRECEDING AND 0 FOLLOWING) holding,
    What I need is another column to sum net_movement of a position, but only for the current date; all my approaches fail:
    - add the date (val_dt) to the 'partition by' clause and therefore sum only values with same position and date
    SELECT SUM (net_movement) OVER (PARTITION BY val_dt, bp_id, prtfl_num, instrmnt_id, cost_prc_crncy ORDER BY TRUNC (val_dt) RANGE BETWEEN UNBOUNDED PRECEDING AND 0 FOLLOWING) today_net_movement
    - take the holding for the last date and subtract it from the current holding afterwards
    SELECT SUM (net_movement) OVER (PARTITION BY bp_id, prtfl_num, instrmnt_id, cost_prc_crncy ORDER BY TRUNC (val_dt) RANGE BETWEEN UNBOUNDED PRECEDING AND -1 FOLLOWING) last_holding,
    - using lag on the analytical function which calculates holding fails too
    I also want to avoid creating a table which stores the last holding..
    Does anyone sees where I make a mistake or knows an alternative to get this value?
    It would help me much!
    Thanks in advance!

    Thank you,
    but I already tried that, and it returns strange values which are for sure not the correct ones.
    It is always the same value for each row, if it's not 0, and a very high one (500500 for example), even if the sum of all net_movement for that date is 0 (and the statement for holding returns 0 too).
    I also tried with trunc(val_dt,'DDD'), with the same result (without trunc it is the same issue).
    please help if you can, thanks in advance!
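    One hedged guess at the missing piece (column names from the post; the table name position_movements is just a placeholder): partition on the truncated date as well and drop the windowing clause entirely, so each row gets the plain sum for its position on that calendar day.
    SELECT val_dt, bp_id, prtfl_num, instrmnt_id, cost_prc_crncy, net_movement,
           SUM (net_movement) OVER (PARTITION BY TRUNC (val_dt), bp_id, prtfl_num,
                                                 instrmnt_id, cost_prc_crncy) AS today_net_movement
    FROM   position_movements;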

  • Speed up query with analytic function

    Hi
    how can I speed up the query below ?
    All time is in analytic function (WINDOW SORT)
    Thanks for your help
    11.2.0.1
    Rows     Row Source Operation
      28987  HASH UNIQUE (cr=12677 pr=155778 pw=109730 time=25010 us cost=5502 size=3972960 card=14880)
    1668196   WINDOW SORT (cr=12677 pr=155778 pw=109730 time=890411840 us cost=5502 size=3972960 card=14880)
    1668196    HASH JOIN RIGHT OUTER (cr=12677 pr=0 pw=0 time=1069165 us cost=3787 size=3972960 card=14880)
      30706     TABLE ACCESS FULL FLO_FML_EVENT (cr=270 pr=0 pw=0 time=7420 us cost=56 size=814158 card=30154)
    194733     HASH JOIN RIGHT OUTER (cr=12407 pr=0 pw=0 time=571145 us cost=3730 size=3571200 card=14880)
        613      VIEW  (cr=342 pr=0 pw=0 time=489 us cost=71 size=23840 card=745)
        613       HASH UNIQUE (cr=342 pr=0 pw=0 time=244 us cost=71 size=20115 card=745)
        745        WINDOW SORT (cr=342 pr=0 pw=0 time=1736 us cost=71 size=20115 card=745)
        745         MAT_VIEW ACCESS FULL MVECRF_CUR_QUERY (cr=342 pr=0 pw=0 time=1736 us cost=69 size=20115 card=745)
    194733      HASH JOIN  (cr=12065 pr=0 pw=0 time=431813 us cost=3658 size=3095040 card=14880)
         43       MAT_VIEW ACCESS FULL MVECRF_VISIT_REVS (cr=3 pr=0 pw=0 time=0 us cost=2 size=946 card=43)
    194733       HASH JOIN OUTER (cr=12062 pr=0 pw=0 time=292098 us cost=3656 size=2767680 card=14880)
    194733        HASH JOIN OUTER (cr=10553 pr=0 pw=0 time=234394 us cost=2962 size=2574240 card=14880)
    194733         HASH JOIN  (cr=9999 pr=0 pw=0 time=379996 us cost=2570 size=2380800 card=14880)
      30076          MAT_VIEW ACCESS FULL MVECRF_ACTIVATED_FORMS (cr=1817 pr=0 pw=0 time=28411 us cost=361 size=2000285 card=29855)
    194733          HASH JOIN  (cr=8182 pr=0 pw=0 time=209061 us cost=1613 size=9026301 card=97057)
        628           MAT_VIEW ACCESS FULL MVECRF_STUDYVERSION_FORMS (cr=19 pr=0 pw=0 time=250 us cost=6 size=18212 card=628)
    194733           MAT_VIEW ACCESS FULL MVECRF_FORMITEMS (cr=8163 pr=0 pw=0 time=80733 us cost=1606 size=12462912 card=194733)
    132342         MAT_VIEW ACCESS FULL MVECRF_ITEM_SDV (cr=554 pr=0 pw=0 time=23678 us cost=112 size=1720446 card=132342)
    221034        MAT_VIEW ACCESS FULL MVECRF_ITEMDATA (cr=1509 pr=0 pw=0 time=46459 us cost=299 size=2873442 card=221034)
    SELECT              
          DISTINCT
             'CL238093011' AS ETUDE,
             FI.STUDYID,
             FI.STUDYVERSIONID,
             FI.SITEID,
             FI.SUBJECTID,
             FI.VISITID,
             VR.VISITREFNAME,
             FI.SUBJECTVISITID,
             FI.FORMID,
             FI.FORMINDEX,
             SVF.FORMREFNAME,
             SVF.FORMMNEMONIC AS FMLNOM,
             EVENT_ITEM.EVENT AS EVENUM,
             EVENT_ITEM.EVENT_ROW AS LIGNUM,
             NULL AS CODVISEVE,
             MIN(DID.MINENTEREDDATE)
                OVER (
                   PARTITION BY FI.SUBJECTID, FI.VISITID, FI.FORMID, FI.FORMINDEX)
                AS ATTDAT1ERSAI,
             MIN(IFSDV.ITEMFIRSTSDV)
                OVER (
                   PARTITION BY FI.SUBJECTID, FI.VISITID, FI.FORMID, FI.FORMINDEX)
                AS ATTDAT1ERSDV,
             MAX(IFSDV.ITEMFIRSTSDV)
                OVER (
                   PARTITION BY FI.SUBJECTID, FI.VISITID, FI.FORMID, FI.FORMINDEX)
                AS ATTDATDERSDV,
             DECODE (AF.SDVCOMPLETESTATE,
                     0,
                     'N',
                     1,
                     'Y')
                AS ATTINDSDVCOP,
             AF.FMINSDVCOMPLETESTATE AS ATTDAT1ERSDVCOP,
             DECODE (AF.SDVPARTIALSTATE,
                     0,
                     'N',
                     1,
                     'Y')
                AS ATTINDSDVPTL,
             EVENT_ITEM.EVENT_RELECT AS ATTINDRVUMEDCOP,
             DECODE (QUERY.NBQSTFML, NULL, 'N', 'Y') AS ATTINDQST,
             DECODE (AF.MISSINGITEMSSTATE,
                     0,
                     'N',
                     1,
                     'Y')
                AS ATTINDITMABS,
             DECODE (AF.FROZENSTATE,
                     0,
                     'N',
                     1,
                     'Y')
                AS ATTINDETACON,
             AF.FMINFROZENSTATE AS ATTDAT1ERCON,
             AF.FMAXFROZENSTATE AS ATTDATDERCON,
             DECODE (AF.DELETEDSTATE,
                     0,
                     'N',
                     1,
                     'Y')
                AS ATTINDETASPR,
             EVENT_ITEM.ROW_DELETED AS ATTINDLIGSPR
      FROM   CL238093011.MVECRF_FORMITEMS FI,
             CL238093011.MVECRF_STUDYVERSION_FORMS SVF,
             CL238093011.MVECRF_ACTIVATED_FORMS AF,
             CL238093011.MVECRF_ITEM_SDV IFSDV,
             CL238093011.MVECRF_VISIT_REVS VR,
             CL238093011.MVECRF_ITEMDATA DID,
             (SELECT   DISTINCT
                       SUBJECTID,
                       VISITID,
                       FORMID,
                       FORMINDEX,
                        COUNT (DISTINCT QUERYID)
                           OVER (
                              PARTITION BY SUBJECTID, VISITID, FORMID, FORMINDEX)
                           AS NBQSTFML
                FROM   CL238093011.MVECRF_CUR_QUERY
               WHERE   QUERYSTATE IN (0, 1, 2)) QUERY,
             CL238093011.FLO_FML_EVENT EVENT_ITEM
    WHERE   (AF.VISITDELETED IS NULL OR AF.VISITDELETED = 0)
             AND AF.FORMTYPE NOT IN (4, 5, 6, 7, 8, 103)
             AND (AF.DELETEDDYNAMICFORMSTATE IS NULL
                  OR AF.DELETEDDYNAMICFORMSTATE = 0)
             AND FI.SUBJECTVISITID = AF.SUBJECTVISITID
             AND FI.FORMID = AF.FORMID
             AND FI.FORMREV = AF.FORMREV
             AND FI.FORMINDEX = AF.FORMINDEX
             AND FI.VISITID = VR.VISITID
             AND FI.VISITREV = VR.VISITREV
             AND FI.CONTEXTID = IFSDV.CONTEXTID(+)
             AND FI.CONTEXTID = DID.CONTEXTID(+)
             AND FI.SUBJECTID = QUERY.SUBJECTID(+)
             AND FI.VISITID = QUERY.VISITID(+)
             AND FI.FORMID = QUERY.FORMID(+)
             AND FI.FORMINDEX = QUERY.FORMINDEX(+)
             AND FI.STUDYVERSIONID = SVF.STUDYVERSIONID
             AND FI.FORMID = SVF.FORMID
             AND FI.VISITID = SVF.VISITID
             AND FI.SUBJECTID = EVENT_ITEM.SUBJECTID(+)
             AND FI.VISITID = EVENT_ITEM.VISITID(+)
             AND FI.FORMID = EVENT_ITEM.FORMID(+)
             AND FI.FORMINDEX = EVENT_ITEM.FORMINDEX(+)

    user12045475 wrote:
    Hi
    how can I speed up the query below ?
    All time is in analytic function (WINDOW SORT)
    Thanks for your help
    11.2.0.1
    Do you have the license for parallel query (may/may not help)? PQO can help with sorts ...
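    If it is licensed, a minimal sketch of enabling it (hint syntax only, shown against the standard emp table rather than the poster's materialized views):
    ALTER SESSION ENABLE PARALLEL QUERY;
    SELECT /*+ PARALLEL(t, 4) */
           deptno,
           SUM (sal) OVER (PARTITION BY deptno) AS dept_sal
    FROM   emp t;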

  • SQL Query With analytical function

    Hi
    Below is the scenario I am looking to solve in a SQL query using analytic functions.
    I/p
    Col1
    50
    0
    -150
    -200
    300
    -100
    -300
    500
    -100
    O/p
    Col1      Col2
    50           0
    0            0
    -150      -100
    -200      -200
    300          0
    -100         0
    -300      -100
    500        400
    -100         0
    Any help is really appreciated.
    Thanks in advance
    Edited by: unique on Aug 10, 2010 4:53 AM
    Edited by: unique on Aug 10, 2010 4:55 AM
    Edited by: unique on Aug 10, 2010 4:55 AM

    Oh, in this case there is an OLAP solution ;-)
    OLAP samples on my homepage: http://www.geocities.jp/oraclesqlpuzzle/oracle-sql1-olap.html
    with work(SK,Val) as(
    select  1,  50 from dual union
    select  2,   0 from dual union
    select  3,-150 from dual union
    select  4,-200 from dual union
    select  5, 300 from dual union
    select  6,-100 from dual union
    select  7,-300 from dual union
    select  8, 500 from dual union
    select  9,-100 from dual)
    select SK,Val,GID,
    case when Val > 0
         then greatest(0,sum(Val) over(partition by GID))
         else Least(0,Val+greatest(0,sum(Val) over(partition by GID
                                     order by SK rows between unbounded preceding
                                                          and 1 preceding)))
         end as COL3
    from (select SK,Val,
          sum(greatest(0,sign(Val))) over(order by SK) as GID
          from work)
    order by SK;
    SK   VAL  GID  COL3
    1    50    1     0
    2     0    1     0
    3  -150    1  -100
    4  -200    1  -200
    5   300    2     0
    6  -100    2     0
    7  -300    2  -100
    8   500    3   400
    9  -100    3     0

  • Tuning sql with analytic function

    Dear friends, I've developed one SQL statement:
    with REP as
    (select /*+ MATERIALIZE */ branch_code,
       row_number() over(partition by branch_code, account order by bkg_date desc  ) R,
             account,
             bkg_date,
             lcy_closing_bal
        from history t
    )
    select REP1.branch_code,
           REP1.account,
           REP1.bkg_date,
           REP1.lcy_closing_bal,
             NULL  AS second,
           REP2.bkg_date        bkg_date2,
           REP2.lcy_closing_bal lcy_closing_bal2,
            NULL  AS third,
           REP3.bkg_date        bkg_date3,
           REP3.lcy_closing_bal lcy_closing_bal3
      from (SELECT * FROM REP WHERE R=1) REP1, (SELECT * FROM REP WHERE R=2) REP2, (SELECT * FROM REP WHERE R=3) REP3
    where
           (REP1.BRANCH_CODE = REP2.BRANCH_CODE(+) AND REP1.ACCOUNT = REP2.ACCOUNT(+)) AND
           (REP1.BRANCH_CODE = REP3.BRANCH_CODE(+) AND REP1.ACCOUNT = REP3.ACCOUNT(+))
    The point is that I want to restrict (tune) REP before it is used because, as you can see, I need at most three values from REP (where R=1, R=2, R=3). Which analytic function, and with which options, do I have to use to receive only 3 values in each branch_code, account group at the time it is materialized?

    Radrigez wrote:
    Dear friends I've developed one sql :
    with REP as
    from (SELECT * FROM REP WHERE R=1) REP1,
    (SELECT * FROM REP WHERE R=2) REP2,
    (SELECT * FROM REP WHERE R=3) REP3
    where
    (REP1.BRANCH_CODE = REP2.BRANCH_CODE(+) AND REP1.ACCOUNT = REP2.ACCOUNT(+)) AND
    (REP1.BRANCH_CODE = REP3.BRANCH_CODE(+) AND REP1.ACCOUNT = REP3.ACCOUNT(+))
    The first step is to put your subquery (which doesn't need to be materialized) into an inline view and restrict the result set on r in (1,2,3) as suggested by thtsang - you don't need to query the same result set three times.
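    A sketch of that restricted inline view, using the column names from the original post:
    select branch_code, account, bkg_date, lcy_closing_bal, r
    from  (
           select branch_code, account, bkg_date, lcy_closing_bal,
                  row_number() over (partition by branch_code, account
                                     order by bkg_date desc) as r
           from   history
          )
    where r <= 3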
    Then you're looking at a simple pivot operation (assuming the number of rows you want per branch and account is fixed). If you're on 11g search the manuals for PIVOT, on earlier versions you can do this with a decode() or case() operator.
    Step 1 (which could go into another factored subquery) would be something like:
    select
            branch_code, account,
            case when r = 1 then bkg_date end bkg_date,
            case when r = 1 then lcy_closing_bal end lcy_closing_bal,
            case when r = 2 then bkg_date end bkg_date2,
            case when r = 2 then lcy_closing_bal end lcy_closing_bal2,
            case when r = 3 then bkg_date end bkg_date3,
            case when r = 3 then lcy_closing_bal end lcy_closing_bal3
    from
            rep
    This gives you the eight necessary columns, but still (up to) three rows per branch/account.
    Then you aggregate this (call it rep1) on branch and account.
    select
            branch_code, account,
            max(bkg_date),
            max(lcy_closing_bal),
            max(bkg_date2),
            max(lcy_closing_bal2),
            max(bkg_date3),
            max(lcy_closing_bal3)
    from
            rep1
    group by
            branch_code, account
    order by
            branch_code, account
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    Author: <b><em>Oracle Core</em></b>

  • I need help with Analytic Function

    Hi,
    I have this little problem that I need help with.
    My datafile has thousands of records that look like...
    Client_Id Region Countries
    [1] [1] [USA, Canada]
    [1] [2] [Australia, France, Germany]
    [1] [3] [China, India, Korea]
    [1] [4] [Brazil, Mexico]
    [8] [1] [USA, Canada]
    [9] [1] [USA, Canada]
    [9] [4] [Argentina, Brazil]
    [13] [1] [USA, Canada]
    [15] [1] [USA]
    [15] [4] [Argentina, Brazil]
    etc
    My task is is to create a report with 2 columns - Client_Id and Countries, to look something like...
    Client_Id Countries
    [1] [USA, Canada, Australia, France, Germany, China, India, Korea, Brazil, Mexico]
    [8] [USA, Canada]
    [9] [USA, Canada, Argentina, Brazil]
    [13] [USA, Canada]
    [15] [USA, Argentina, Brazil]
    etc.
    How can I achieve this using Analytic Function(s)?
    Thanks.
    BDF

    Hi,
    That's called String Aggregation , and the following site shows many ways to do it:
    http://www.oracle-base.com/articles/10g/StringAggregationTechniques.php
    Which one should you use? That depends on which version of Oracle you're using, and your exact requirements.
    For example, is order important? You said the results should include:
    CLIENT_ID  COUNTRIES
    1          USA, Canada, Australia, France, Germany, China, India, Korea, Brazil, Mexico
    but would you be equally happy with
    CLIENT_ID  COUNTRIES
    1          Australia, France, Germany, China, India, Korea, Brazil, Mexico, USA, Canada
    or
    CLIENT_ID  COUNTRIES
    1          Australia, France, Germany, USA, Canada, Brazil, Mexico, China, India, Korea
    ?
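    For instance, on 11gR2 or later, LISTAGG lets you control that order directly (a sketch only, assuming a hypothetical table client_regions(client_id, region, countries) holding the rows shown above):
    SELECT   client_id,
             LISTAGG (countries, ', ') WITHIN GROUP (ORDER BY region) AS countries
    FROM     client_regions
    GROUP BY client_id;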
    Mwalimu wrote:
    ... How can I achieve this using Analytic Function(s)?
    The best solution may not involve analytic functions at all. Is that okay?
    If you'd like help, post your best attempt, a little sample data (CREATE TABLE and INSERT statements), the results you want from that data, and an explanation of how you get those results from that data.
    Always say which version of Oracle you're using.
    Edited by: Frank Kulash on Aug 29, 2011 3:05 PM

  • Having trouble with put_line function in block statement..

    I'm a student, and the request does not appear to be in my book anywhere... nothing on put_line or block statements that I can see.
    I need to write a block which will bring back via put_line, for department 110: the department_id (departments table), the department_name (departments table), and the city in which the department is located (which must be pulled from the locations table... can be joined by department_id). I can write the select statement with the join, just not sure how to turn it into a block with the put_line function.
    Any advice appreciated. Thanks.

    Hi,
    You need to format your code, so that it's easy to see what statements are in each section (DECLARE, BEGIN, etc.).
    The compiler doesn't care about this, but it will help anyone who tries to read your code, including yourself.
    Type {code} before and after formatted sections when posting messages on this site.
    All statements (including each individual variable declaration) end with a semicolon.
    If a query will return no more than one row, you can capture the results using an INTO-clause, right after the SELECT-clause. For every column in the SELECT-clause, there will be one variable in the INTO-clause.
    Lists, including lists of table names in a query's FROM-clause, are delimited by commas.
    In a query's FROM-clause, the real name of the table comes first, optionally followed by the alias you're using in the query.
    I think this is what you're trying to do:
    SET   SERVEROUTPUT  ON  SIZE 50000
    Declare
        dep_id    NUMBER;
        Dep_string VarChar (100);
    Begin
        select  d.department_id, d.department_name -- , l.city
          into  dep_id,          dep_string
          from  departments  d
             ,  locations    l
          where d.location_id = l.location_id
            and d.department_id = 110;
        DBMS_OUTPUT.PUT_LINE (Dep_string);
    END;
    /
    No one should ever have this many errors at one time. Write code in much smaller increments, and test after each one. For example, you know you'll be using dbms_output.put_line, so start with something like:
    SET   SERVEROUTPUT  ON  SIZE 50000
    BEGIN
        dbms_output.put_line ('Hello, world!');
    END;
    /
    You may have problems if you forget semicolons, or SET SERVEROUTPUT. Solve those before you go any further.
    Once the program above is working, add a little (and I mean a little) to it.
    For example, you know you'll be printing a variable, not a literal, so change it to a variable:
    SET   SERVEROUTPUT  ON  SIZE 50000
    DECLARE
        dep_string  VARCHAR2 (100) := 'Hello, world!';
    BEGIN
        dbms_output.put_line (dep_string);
    END;
    /
    And so on.
    Edited by: Frank Kulash on Dec 15, 2008 12:36 PM

  • Help with analytical function   (ora 9...)

    Hi everyone, is there a way to fill in some missing numbers based on what comes before and after the missing number (ordered by starttime) and on how many missing values lie in between? ... by a "simple" select? I know how to do it theoretically with simple math, but is there a way to apply that in SQL (analytic functions)?
    Thanks in advance for any ideas !
    The missing number on line 17 could be calculated as 339+(1/2)*(356-339) = 347,5
    The missing number on line 23 could be calculated as 355+(1/3)*(292-355) = 334
    The missing number on line 24 could be calculated as 355+(2/3)*(292-355) = 313
    rownumber + temp_table (starttime,data_column)
    15     23.5.2007 16:15     ,     258
    16     23.5.2007 16:30     ,     339
    17     23.5.2007 16:45     ,     
    18     23.5.2007 17:00     ,     356
    19     23.5.2007 17:15     ,     373
    20     23.5.2007 17:30     ,     355
    21     23.5.2007 17:45     ,     363
    22     23.5.2007 18:00     ,     355
    23     23.5.2007 18:15     ,     
    24     23.5.2007 18:30     ,     
    25     23.5.2007 19:00     ,     292
    26     23.5.2007 19:15     ,     295
    THANKS
    Message was edited by:
    dusoo

    Way too late, but I wouldn't let my effort go unpublished ;-)
    SQL> create table temp_table
      2  as
      3  select 15 rownumber, to_date('23.5.2007 16:15','dd.mm.yyyy hh24:mi') starttime, 258 data_column from dual union all
      4  select 16, to_date(' 23.5.2007 16:30','dd.mm.yyyy hh24:mi'), 339 from dual union all
      5  select 17, to_date(' 23.5.2007 16:45','dd.mm.yyyy hh24:mi'), null from dual union all
      6  select 18, to_date(' 23.5.2007 17:00','dd.mm.yyyy hh24:mi'), 356 from dual union all
      7  select 19, to_date(' 23.5.2007 17:15','dd.mm.yyyy hh24:mi'), 373 from dual union all
      8  select 20, to_date(' 23.5.2007 17:30','dd.mm.yyyy hh24:mi'), 355 from dual union all
      9  select 21, to_date(' 23.5.2007 17:45','dd.mm.yyyy hh24:mi'), 363 from dual union all
    10  select 22, to_date(' 23.5.2007 18:00','dd.mm.yyyy hh24:mi'), 355 from dual union all
    11  select 23, to_date(' 23.5.2007 18:15','dd.mm.yyyy hh24:mi'), null from dual union all
    12  select 24, to_date(' 23.5.2007 18:30','dd.mm.yyyy hh24:mi'), null from dual union all
    13  select 25, to_date(' 23.5.2007 19:00','dd.mm.yyyy hh24:mi'), 292 from dual union all
    14  select 26, to_date(' 23.5.2007 19:15','dd.mm.yyyy hh24:mi'), 295 from dual
    15  /
    Table created.
    SQL> with t as
      2  ( select t.*
      3         , max(case when data_column is not null then rownumber end) over (order by rownumber) lowerbound
      4         , last_value(data_column ignore nulls) over (order by rownumber) prevvalue
      5         , min(case when data_column is not null then rownumber end) over (order by rownumber desc) upperbound
      6         , last_value(data_column ignore nulls) over (order by rownumber desc) nextvalue
      7      from temp_table t
      8  )
      9  select rownumber
    10       , starttime
    11       , case
    12         when data_column is not null then data_column
    13         else   prevvalue * ((upperbound - rownumber) / (upperbound - lowerbound))
    14              + nextvalue * ((rownumber - lowerbound) / (upperbound - lowerbound))
    15         end data_column
    16    from t
    17   order by rownumber
    18  /
                                 ROWNUMBER STARTTIME                                      DATA_COLUMN
                                        15 23-05-2007 16:15:00                                    258
                                        16 23-05-2007 16:30:00                                    339
                                        17 23-05-2007 16:45:00                                  347,5
                                        18 23-05-2007 17:00:00                                    356
                                        19 23-05-2007 17:15:00                                    373
                                        20 23-05-2007 17:30:00                                    355
                                        21 23-05-2007 17:45:00                                    363
                                        22 23-05-2007 18:00:00                                    355
                                        23 23-05-2007 18:15:00                                    334
                                        24 23-05-2007 18:30:00                                    313
                                        25 23-05-2007 19:00:00                                    292
                                        26 23-05-2007 19:15:00                                    295
    12 rows selected.
    Regards,
    Rob.

  • Having problem with to_lob function

    Hi,
    Oracle DB version: 10.2.0.1.0
    Tools being used: SQL Developer and SQLPlus (I have tried the query on both)
    Create table test
    (
      col_xml LONG
      ...
    )
    Note: We cannot convert this column to CLOB permanently because we just don't have the right to do that. :-(
    Now, the problem is, I'm trying to convert this LONG into a CLOB using the TO_LOB function but still getting this error:
    select TO_LOB(col_xml) from test;
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected - got LONG
    By the way, our main goal is to query this column so that we could filter the rows that we need.
    Sort of:
    select TO_LOB(col_xml) from test where TO_LOB(col_xml) like '%criteria%';
    Have I missed something, or is there any idea on how to make my goal possible? The TO_LOB function is pretty simple, but I don't know why it doesn't work in our environment.
    Suggestions/ideas/links to learning materials are highly appreciated (Oracle sites preferred; message boards and forums are blocked by the firewall :-( ).
    Thanks.

    Spongebob wrote:
    Follow-up question: This TO_LOB function looks simple and easy to use, but as shown above in my post it throws an error message. Have I missed something, or is there something in our environment that affected the behavior of this function?
    Have a look at the Oracle® Database SQL Language Reference.
    There are several limitations with this function. It boils down to the following:
    "<i>Before using this function, you must create a LOB column to receive the converted LONG values. To convert LONG values, create a CLOB column. To convert LONG RAW values, create a BLOB column.</i>"
    So in other words, this function needs a destination LOB to write the LONG content into. It does not dynamically create the destination LOB/CLOB.
    So you cannot do this as there is no explicit LOB definition to receive the converted LONG content:
    SQL> select TO_LOB(text) from all_views where owner = 'SYSTEM' and view_name = 'PRODUCT_PRIVS';
    select TO_LOB(text) from all_views where owner = 'SYSTEM' and view_name = 'PRODUCT_PRIVS'
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected - got LONG
    But if we create a destination LOB, we can use the function. E.g.
    SQL> create table view_text( view_name varchar2(30), view_sql clob );
    Table created.
    SQL> insert into view_text select view_name, TO_LOB(text) from all_views where owner = 'SYSTEM' and view_name = 'PRODUCT_PRIVS';
    1 row created.
    SQL>
    The bottom line to this is: LONG IS AN EVIL DATA TYPE.
    It should not be used. It dates back to Oracle 7. We are now at 11g. Several versions later. Still using the LONG data type in applications should be dealt with swiftly and harshly using the old lead pipe. Even a legacy app would have undergone several updates since Oracle v7. And that would have provided ample opportunity to convert from the horrible LONG data type to LOB.

  • Help required with  Analytic function

    Hi, I have a table like the following:
    column1 column2 column3 column4 start_date
    1 601 A B 10-jan-2007
    2 601 A B 11-jan-2007
    1 602 A B 12-jan-2007
    1 603 A C 12-jan-2007
    There is no UK (unique key) on this table.
    Now I have to group by column2, column3, column4.
    I finally need to get the row that has the highest start date. Effectively meaning
    there will be three groups based on the above data:
    1st is:
    1 601 A B 10-jan-2007
    2 601 A B 11-jan-2007
    2nd is:
    1 602 A B 12-jan-2007
    3rd is:
    1 603 A C 12-jan-2007
    Now I need to get the second row from the first group only:
    2 601 A B 11-jan-2007
    I don't need data from the other groups since they have less than 2 rows.
    How can I achieve this using analytic functions?
    Please help!!

    Hi,
    Can you please provide an example? I am sorry, this is my first day with analytics :(
    I figured as much; that's why I gave you detailed instructions to gain some experience by building on the
    query that Forms already gave you (slightly prettied up below):
    select  *
    from    (   -- Begin sub-query
            SELECT  column1,column2,column3,column4
            ,        row_number () over
                        ( partition by column2,column3,column4
                          order by start_date desc
                        ) as rnk  -- "rank" is not a good name
            from   table
            )   -- End sub-query
    where   rnk = 1;
    Follow the instructions I gave you earlier.
    Add the analytic "COUNT (*) OVER (...) AS cnt" function next to the analytic ROW_NUMBER function in Forms' query.
    Add the test for "cnt >= 2" next to the test for "rnk = 1".
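    Putting those two instructions together, the finished query would look something like this (column names as in the original post; your_table is a placeholder for the real table name):
    select  *
    from    (   -- Begin sub-query
            SELECT  column1, column2, column3, column4, start_date
            ,        row_number () over
                         ( partition by column2, column3, column4
                           order by start_date desc
                         ) as rnk
            ,        count (*) over
                         ( partition by column2, column3, column4
                         ) as cnt
            from   your_table
            )   -- End sub-query
    where   rnk = 1
    and     cnt >= 2;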

  • Grapher having trouble with trig functions

    Well I thought that I could make some cool visuals for my math class by using grapher instead of just making charts and graphs in Excel on my pc, but no dice with that idea. I created all the different variables (and how they related to x, ie x^2*(5-x6)/x=m) and it was pretty cool how it would all come together and I could use those variables in other equations instead of writing it all out (stuff like y=4x^m-m/5). (Those are examples, not anything I used)
    Everything seemed fine until I tried to use a cosine function in one of the equations. It seemed really intuitive and recognized cos as cosine and not the variables c, o, and s multiplied, so I thought "great!" Then when I entered it the graph didn't appear anything like it should have. It gave me a Richter scale instead of a slightly curved line. I spent about half an hour trying to find my own error but then realized maybe there was something wrong with the program. I tried just plain old cos(x) and found that it was completely off. Instead of traveling from 0,1 to 180,-1 like it should have, it went from 0,1 to 3.1475,-1. This was way off, so I tried a sine function and that did the same exact thing (having a period of roughly 6.295 instead of 360!). These functions were way off, and I'm using an Intel MacBook from my school (with 10.5), so I thought maybe someone changed the settings, so I tried them on my aunt's G4 Mac mini (10.4.11 I think) and it gave me the same problem!
    Anyway what's wrong with the trig functions for Grapher and how can I fix them?

    Emzz, I'm running 10.5.5 (Intel) and Grapher v2.0. My version of Grapher supports changing Trigonometric Mode, so the absence of this option seems to have nothing to do with Leopard.
    But let's do some Math now. As you surely know the cosine has (besides others) a root at 90 Degrees in the unit circle. Now let f : (angle in Degrees) -> (length of the corresponding arc of the unit circle) be a mapping. For 90 Degrees f yields 0.5*Pi. That means cosine has a root at 0.5*Pi in Radian unit.
    Now, if you switch Grapher's Trigonometric Mode from Degrees to Radian, cosine will no longer have a root at 90 but at approximately 1.57. Hope that explains a bit...
    By the way, Radian is the standard unit of angular measurement when it comes to trigonometric functions. For more details about Radian see [this article at Wikipedia|http://en.wikipedia.org/wiki/Radian] (click the link to be redirected).
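    Spelled out as simple arithmetic (not from the thread), the mapping and the two values in question are
    \[
    f(\theta) = \theta \cdot \frac{\pi}{180}, \qquad
    f(90^\circ) = \frac{\pi}{2} \approx 1.571, \qquad
    f(360^\circ) = 2\pi \approx 6.283
    \]
    which matches the root near 1.57 and the period just above 6.28 that were measured in Radian mode.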
    I suggest you close this topic because, as you already noted yourself, your initial question is answered.
    Good computing.
    floba
    (MN576)
    Message was edited by: floba
