How to use analytical functions inside a mapping

Hello everybody. Here I send you a trick that we have been using for two years.
If you want to use a window function (for instance ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)) inside a mapping, you must create an expression operator with the fields the window function uses in the INGRP1 and the function you want in the OUTGRP1. Create an out-attribute with the expression and link it to a DISTINCT operator (by using a DISTINCT you encapsulate the SQL, so you will be able to use the function's result inside a filter, i.e. in the WHERE clause). Note that the DISTINCT can eliminate some rows (it depends on the function). If you validate the expression an error will appear (don't worry about that, the mapping will be OK).
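To make the trick concrete, this is roughly the shape of SQL such a mapping produces (a hedged sketch with made-up table and column names, not actual OWB output):

SELECT *
FROM  (SELECT DISTINCT                     -- the DISTINCT operator wraps the expression in an inline view
              src.customer_id,
              src.order_date,
              ROW_NUMBER() OVER (PARTITION BY src.customer_id
                                 ORDER BY src.order_date DESC) AS rn
       FROM   orders src)
WHERE  rn = 1;                             -- the window result is now usable in the WHERE clause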
But there is a limitation: you will not be able to use SUM OVER, MIN OVER or MAX OVER (the validator detects them as aggregate functions). Another limitation: the debugger doesn't run with this kind of function.
Please publish this information on "The Warehouse Builder Utility Exchange". My email is [email protected] (if you need more information).

It is also possible to add the SUM, MIN and MAX functions in OWB releases prior to Paris - you have to put them in double quotes, i.e. write "MIN", "MAX", "SUM", etc. The rest of the rules (adding a cut-off operator after the expression, such as DISTINCT or UNION ALL) are analogous to the ROW_NUMBER() case.
That means you can create a mapping with the following functionality:
SELECT
sum (salary) over (partition by DEPARTMENT) department_salary,
salary,
employee_id,
employee_name
FROM employees_salaries
writing it this way in OWB:
SELECT
"SUM" (salary) over (partition by DEPARTMENT) department_salary,
salary,
employee_id,
employee_name
FROM employees_salaries
Regards,
Martin

Similar Messages

  • How to use analytic function with aggregate function

    Hello,
    Can we use an analytic function and an aggregate function in the same query? I tried to find an example on the net, but could not find any example of how these two kinds of functions work together. Please share any link or example with me.
    Edited by: Oracle Studnet on Nov 15, 2009 10:29 PM

    select
    t1.region_name,
    t2.division_name,
    t3.month,
    t3.amount mthly_sales,
    max(t3.amount) over (partition by t1.region_name, t2.division_name)
    max_mthly_sales
    from
    region t1,
    division t2,
    sales t3
    where
    t1.region_id=t3.region_id
    and
    t2.division_id=t3.division_id
    and
    t3.year=2004
    Source:http://www.orafusion.com/art_anlytc.htm
    Here MAX (aggregate) and OVER (PARTITION BY ...) (analytic) are in the same query. So it means we can use aggregate and analytic functions in the same query, and also more than one analytic function in the same query.
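    For instance, an aggregate can even be nested inside an analytic function. A minimal sketch, assuming the standard SCOTT.EMP demo table:
    select deptno,
           sum(sal) dept_sal,
           round(100 * sum(sal) / sum(sum(sal)) over (), 2) pct_of_total  -- analytic over the aggregated rows
    from emp
    group by deptno;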
    Hth
    Girish Sharma

  • How to use Analytic functions in Forms10g

    Hi,
    Can we use analytic functions like LEAD and LAG in Forms 10g?
    Thanks & Regards,

    Use a db view as a data source of your form block ....
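    For example, a minimal sketch of such a view (assuming the standard SCOTT.EMP demo table; adapt the names to your own schema):
    create or replace view emp_lag_v as
    select empno,
           ename,
           sal,
           lag(sal)  over (order by hiredate) prev_sal,   -- previous row's salary
           lead(sal) over (order by hiredate) next_sal    -- next row's salary
    from emp;
    Then base the form block on EMP_LAG_V instead of the table.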
    Greetings...
    Sim

  • How to use node functions in Message mapping !!

    Hi  Gurus,
    I have got one issue in message mapping; please can anyone put some ideas on this!!
    Source Structure
    <Group_ZA>  0..unbound
         <D02_ZA>           0.. unbound
             ZA_01             0..1   - QA
             ZA_02             0..1      20
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1     QD
             ZA_02             0..1     40
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QN
             ZA_02             0..1     12
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QP
             ZA_02             0..1    60
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
          ...
         <D02_ZA>          
             ZA_01             0..1     QA
             ZA_02             0..1      20
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1     QD
             ZA_02             0..1     40
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QN
             ZA_02             0..1     12
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QP
             ZA_02             0..1    60
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
    ...
    <D02_ZA>          
             ZA_01             0..1    QN
             ZA_02             0..1     12
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QP
             ZA_02             0..1    60
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
    ...
    <D02_ZA>          
             ZA_01             0..1     QA
             ZA_02             0..1      20
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1     QD
             ZA_02             0..1     40
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QN
             ZA_02             0..1     12
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
         <D02_ZA>          
             ZA_01             0..1    QP
             ZA_02             0..1    60
             ZA_03             0..1
             ZA_04             0..1
         </D02_ZA>
    </Group_ZA>
    Target Structure
    ProductActivityNotification                                                       0..unbound
                          ProductActivity                                                             1..1
                                        Item                                                                 1..unbound
                                              Inventory                                                   0..1
                                                     UnrestrictedUseQuantity                    0..1
    Group_ZA comes 'n' number of times, and D02_ZA comes sometimes 5 times, sometimes 6 times, etc.
    The ZA_01 field can come with 5 to 6 different values like 'QA', 'QD', 'QN', etc.; sometimes ZA_01 comes only 3 times (QA, QD, QN).
    Only if ZA_01 = 'QA' do we need to pass the ZA_02 value on to 'UnrestrictedUseQuantity' (target side).
    I mapped the fields, and the ZA_02 value passes properly on to 'UnrestrictedUseQuantity' when D02_ZA comes 5 or 6 times and the 'QA' value comes into ZA_01 every time. But in case the QA value is missing or doesn't come from the source, the values in the target 'UnrestrictedUseQuantity' field shift up: the last value moves into the last-but-one position.
    Ex: the QA value comes 3 times, but I have 4 source messages on top. The values come into the target side like
    20
    20
    20
    but they are supposed to come like
    20
    20
    -- (space)
    20
    Please provide your valuable inputs; it is a bit urgent!
    How should I map this at field level, please?
    Many Thanks in Advance
    Kind Regards
    San


  • How to use or function in Message Mapping?

    Hi! I was wondering if anyone can show me how to use the OR function.
    I'm trying to match the current date to three possible values, so I need three OR functions.
    E.g. if ( currentDate == 01.02  || currentDate == 02.02  || currentDate == 03.02 ) {
              do something;
    }

    Petre:
    If you want to use standard functions, then you can try this:
    If-->currentdate -OR- Constant(01.02)
    currentdate -OR- Constant(02.02)   --> OR -->
    currentdate -OR- Constant(03.02)
    Then give some output
    Else give some output
    So give the output of the first two conditions to another OR, and feed the result of the third into the same OR. Then, whenever any of the conditions is true you will get the THEN value; otherwise you will get the ELSE value.
    ---Satish

  • How to use mathematical functions in XSL mapping

    Hi,
    I am using JDeveloper 10.1.3.3. I need to use mathematical functions like multiply, divide, power, etc. in my mapping, but in the XSL I am getting all the string functions and very few math functions for numbers.
    I am a newbie in JDev. Please, can anyone share how this can be done?
    Thanks

    Hi,
    The RfcAdapter tries to find a Sender Agreement for this RFC call, but the lookup fails. The values used for this lookup are:
    Sender Party/Sender Service: the values from Party and Service belonging to the sender channel.
    Sender Interface: the name of the RFC function module.
    Sender Namespace: the fixed RFC namespace urn:sap-com:document:sap:rfc:functions
    Receiver Party/Receiver Service: these fields are empty. This will match the wildcard.
    Regards,
    Suryanarayana

  • How to use analytical function in this case

    SELECT COUNT (rms.status_code) rms_status_count,
    rms.status_name rms_status_name,
    TO_CHAR (rtd.add_date, 'MON')|| ' '|| TO_CHAR (rtd.add_date, 'YYYY') month_year,
    MAX (rtd.add_date) date_for_sort
    FROM ri_mast_status rms, ri_tran_data rtd
    WHERE rtd.status_code = rms.status_code
    AND TRUNC (MONTHS_BETWEEN (SYSDATE, rtd.add_date)) < 36
    AND NVL (rtd.delete_flg, '0') = '0'
    GROUP BY TO_CHAR (rtd.add_date, 'MON')|| ' '|| TO_CHAR (rtd.add_date, 'YYYY'),
    rms.status_name
    ORDER BY MAX (rtd.add_date);
    it gives output for the last 3 years based on month and year.

    Are you trying this?
    select * from
    (
    select rms.*,
    row_number() over(partition by TO_CHAR (rtd.add_date, 'MON')|| ' '|| TO_CHAR (rtd.add_date, 'YYYY'), rms.status_name order by rtd.add_date) RN,
    MAX(rtd.add_date) over(partition by TO_CHAR (rtd.add_date, 'MON')|| ' '|| TO_CHAR (rtd.add_date, 'YYYY'), rms.status_name order by rtd.add_date) date_for_sort,
    COUNT(rms.status_code) over(partition by TO_CHAR (rtd.add_date, 'MON')|| ' '|| TO_CHAR (rtd.add_date, 'YYYY'), rms.status_name order by rtd.add_date) rms_status_count
    FROM ri_mast_status rms, ri_tran_data rtd
    WHERE rtd.status_code = rms.status_code
    AND TRUNC (MONTHS_BETWEEN (SYSDATE, rtd.add_date)) < 36
    AND NVL (rtd.delete_flg, '0') = '0'
    )
    where rn=1

  • Using analytic function to get the right output.

    Dear all;
    I have the following sample date below
    create table temp_one
    (      id number(30),
           placeid varchar2(400),
           issuedate  date,
           person varchar2(400),
           failures number(30),
           primary key(id)
    );
    insert into temp_one values (1, 'NY', to_date('03/04/2011', 'MM/DD/YYYY'), 'John', 3);
    insert into temp_one values (2, 'NY', to_date('03/03/2011', 'MM/DD/YYYY'), 'Adam', 7);
    insert into temp_one values (3, 'Mexico', to_date('03/04/2011', 'MM/DD/YYYY'), 'Wendy', 3);
    insert into temp_one values (4, 'Mexico', to_date('03/14/2011', 'MM/DD/YYYY'), 'Gerry', 3);
    insert into temp_one values (5, 'Mexico', to_date('03/15/2011', 'MM/DD/YYYY'), 'Zick', 9);
    insert into temp_one values (6, 'London', to_date('03/16/2011', 'MM/DD/YYYY'), 'Mike', 8);
    This is the output I desire:
    placeid       issueperiod                               failures
    NY              02/28/2011 - 03/06/2011          10
    Mexico       02/28/2011 - 03/06/2011           3
    Mexico        03/14/2011 - 03/20/2011          12
    London        03/14/2011 - 03/20/2011          8
    All help is appreciated. I will post my query as soon as I am able to think of a good logic for this...

    Hi,
    user13328581 wrote:
    ... Kindly note, I am still learning how to use analytic functions.
    That doesn't matter; analytic functions won't help in this problem. The aggregate SUM function is all you need.
    But what do you need to GROUP BY? What is each row of the result set going to represent? A placeid? Yes, each row will represent only one placedid, but it's going to be divided further. You want a separate row of output for every placeid and week, so you'll want to GROUP BY placeid and week. You don't want to GROUP BY the raw issuedate; that would put March 3 and March 4 into separate groups. And you don't want to GROUP BY failures; that would mean a row with 3 failures could never be in the same group as a row with 9 failures.
    This gets the output you posted from the sample data you posted:
    SELECT       placeid
    ,            TO_CHAR ( TRUNC (issuedate, 'IW')
                         , 'MM/DD/YYYY'
                         ) || ' - ' || TO_CHAR ( TRUNC (issuedate, 'IW') + 6
                                               , 'MM/DD/YYYY'
                                               )     AS issueperiod
    ,            SUM (failures)                      AS sumfailures
    FROM         temp_one
    GROUP BY     placeid
    ,            TRUNC (issuedate, 'IW')
    ;
    You could use a sub-query to compute TRUNC (issuedate, 'IW') once. The code would be about as complicated, efficiency probably won't improve noticeably, and the results would be the same.
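    For reference, a minimal sketch of that sub-query variant (computing TRUNC (issuedate, 'IW') once; the output is the same):
    SELECT    placeid
    ,         TO_CHAR (week_start, 'MM/DD/YYYY') || ' - '
              || TO_CHAR (week_start + 6, 'MM/DD/YYYY')  AS issueperiod
    ,         SUM (failures)                             AS sumfailures
    FROM     (SELECT placeid
              ,      failures
              ,      TRUNC (issuedate, 'IW') AS week_start   -- computed once here
              FROM   temp_one)
    GROUP BY  placeid
    ,         week_start
    ;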

  • How can I rewrite the query using analytical functions?

    Hi,
    I have the SQL script as shown below ,
    SELECT cd.cardid, cd.cardno, tt.transactiontypecode, tt.transactiontypedesc description,
    SUM (NVL (CASE tt.transactiontypecode WHEN 'LOAD_ACH' THEN th.transactionamount END, 0)) AS load_ach,
    SUM (NVL (CASE tt.transactiontypecode WHEN 'FUND_TRANSFER_RECEIVED' THEN th.transactionamount END, 0)) AS transfersin,
    (SUM (NVL (CASE tt.transactiontypecode WHEN 'FTRNS' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'SEND_MONEY' THEN th.transactionamount END, 0))) AS transferout,
    SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_ACH' THEN th.transactionamount END, 0)) AS withdrawal_ach,
    SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_CHECK' THEN th.transactionamount END, 0)) AS withdrawal_check,
    (SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_CHECK_FEE' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'REJECTED_ACH_LOAD_FEE' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_ACH_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_CHECK_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'WITHDRAWAL_CHECK_FEE_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'REJECTED_ACH_LOAD_FEE_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'OVERDRAFT_FEE_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'STOP_CHECK_FEE_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'LOAD_ACH_REV' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'OVERDRAFT_FEE' THEN th.transactionamount END, 0))
     + SUM (NVL (CASE tt.transactiontypecode WHEN 'STOP_CHECK_FEE' THEN th.transactionamount END, 0))) AS fee,
    th.transactiondatetime
    FROM carddetail cd,
    transactionhistory th,
    transactiontype tt,
    (SELECT rmx_a.cardid, rmx_a.endingbalance prev_balance, rmx_a.numberofdays
     FROM rmxactbalreport rmx_a,
     (SELECT cardid, MAX (reportdate) reportdate
      FROM rmxactbalreport
      GROUP BY cardid) rmx_b
     WHERE rmx_a.cardid = rmx_b.cardid AND rmx_a.reportdate = rmx_b.reportdate) a
    WHERE th.transactiontypeid = tt.transactiontypeid
    AND cd.cardid = th.cardid
    AND cd.cardtype = 'P'
    AND cd.cardid = a.cardid (+)
    AND cd.cardno = '7116734387812758335'
    --AND tt.transactiontypecode = 'FUND_TRANSFER_RECEIVED'
    GROUP BY cd.cardid, cd.cardno, numberofdays, th.transactiondatetime, tt.transactiontypecode, tt.transactiontypedesc
    Output of the above query is:
    CARDID     CARDNO     TRANSACTIONTYPECODE     DESCRIPTION     LOAD_ACH     TRANSFERSIN     TRANSFEROUT     WITHDRAWAL_ACH     WITHDRAWAL_CHECK     FEE     TRANSACTIONDATETIME
    6005     7116734387812758335     FUND_TRANSFER_RECEIVED     Fund Transfer Received     0     3.75     0     0     0     0     21/09/2007 11:15:38 AM
    6005     7116734387812758335     FUND_TRANSFER_RECEIVED     Fund Transfer Received     0     272     0     0     0     0     05/10/2007 9:12:37 AM
    6005     7116734387812758335     WITHDRAWAL_ACH     Withdraw Funds via ACH     0     0     0     300     0     0     24/10/2007 3:43:54 PM
    6005     7116734387812758335     SEND_MONEY     Fund Transfer Sent     0     0     1     0     0     0     19/09/2007 1:17:48 PM
    6005     7116734387812758335     FUND_TRANSFER_RECEIVED     Fund Transfer Received     0     1     0     0     0     0     18/09/2007 7:25:23 PM
    6005     7116734387812758335     LOAD_ACH     Prepaid Deposit via ACH     300     0     0     0     0     0     02/10/2007 3:00:00 AM
    I want the output such that for LOAD_ACH there should be only one record, etc.
    Can anyone help me rewrite the above query using analytical functions?
    Sekhar

    Not sure of your requirements, but this may help reduce your code;
    <untested>
    SUM (
       CASE
       WHEN tt.transactiontypecode IN
          ('WITHDRAWAL_CHECK_FEE', 'REJECTED_ACH_LOAD_FEE', 'WITHDRAWAL_ACH_REV', 'WITHDRAWAL_CHECK_REV',
           'WITHDRAWAL_CHECK_FEE_REV', 'REJECTED_ACH_LOAD_FEE_REV', 'OVERDRAFT_FEE_REV', 'STOP_CHECK_FEE_REV',
           'LOAD_ACH_REV', 'OVERDRAFT_FEE', 'STOP_CHECK_FEE')
       THEN th.transactionamount
       ELSE 0
       END) AS fee
    Also, you might want to edit your post and use [pre] and [/pre] tags around your code for formatting.
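    As a hedged sketch of the "one record per type" idea (not tested against your data): if the extra rows come from grouping by th.transactiondatetime, dropping that column from the SELECT list and the GROUP BY collapses each bucket into a single row, e.g.
    SELECT   cd.cardid, cd.cardno,
             SUM (CASE WHEN tt.transactiontypecode = 'LOAD_ACH'
                       THEN th.transactionamount ELSE 0 END) AS load_ach
             -- ... remaining buckets as above ...
    FROM     carddetail cd, transactionhistory th, transactiontype tt
    WHERE    th.transactiontypeid = tt.transactiontypeid
    AND      cd.cardid = th.cardid
    GROUP BY cd.cardid, cd.cardno;   -- no transactiondatetime here, so one row per card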

  • How to use email function in crystal report ?

    Post Author: kudo
    CA Forum: .NET
    Hi. I'm a novice, having touched .NET for not more than 2 months. Can somebody guide me on how to use the email function provided in the Crystal Reports components? (Better put in some sample code so that I can understand well.) PS: I'm using VS2005 VB.NET. Thanks.

    Post Author: mewdied
    CA Forum: .NET
    'EXPORT to EMAIL
    ''' Code for exporting the report to Mapi (.NET Windows application)
    ''' *For a Web application you must export to disk as a PDF file first.
    crReportDocument.Load(Application.StartupPath + "\World Sales Report.rpt")
    crMicrosoftMailDestinationOptions = New MicrosoftMailDestinationOptions
    With crMicrosoftMailDestinationOptions
        .MailCCList = "[email protected]"
        .MailToList = "[email protected]"
        .MailSubject = "Attached exported report"
        .UserName = "admin"
        .Password = "password"
    End With
    crExportOptions = crReportDocument.ExportOptions
    With crExportOptions
        .DestinationOptions = crMicrosoftMailDestinationOptions
        .ExportDestinationType = ExportDestinationType.MicrosoftMail
        .ExportFormatType = ExportFormatType.PortableDocFormat
    End With
    'Add some error handling
    Try
        crReportDocument.Export()
        MsgBox("Report exported successfully.")
    Catch err As Exception
        MessageBox.Show(err.ToString())
    End Try
    Hope this helps

  • How to use Aggregate Functions during Top N analysis?

    Say I want to find the top 5 highest salaries and their total and average. In that case, how do I use aggregate functions? Please give me an example of this.
    Regards,
    Renu
    Message was edited by:
    user642387

    Hi,
    Yes, you can do that with aggregate functions.
    First, do a sub-query to retrieve all the salaries (in descending order), then say "WHERE ROWNUM <= 5" in the main query. Use the aggregate SUM and AVG functions in the main query.
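    A minimal sketch of that approach, assuming the standard SCOTT.EMP demo table:
    SELECT SUM (sal) AS total_top5
    ,      AVG (sal) AS avg_top5
    FROM  (SELECT sal
           FROM   emp
           ORDER BY sal DESC)   -- all salaries, highest first
    WHERE  ROWNUM <= 5;         -- keep only the top 5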
    Analytic functions are easier to use for jobs like this, once you get familiar with them. If you're not leaving the field this month, then it's probably worthwhile for you to get familiar with analytic functions.

  • Using Analytic Functions

    Hi all,
    I am using ODI 11g(11.1.1.3.0) and I am trying to make an interface using analytic functions in the column mapping, something like below.
    sum(salary) over (partition by .....)
    The problem is that when ODI sees SUM it assumes it is an aggregate function and puts in a GROUP BY. Is there any way to make ODI understand that it is not an aggregate function?
    I tried creating an option to specify whether it is analytic or not and updated the IKM, with no luck.
    <%if ( odiRef.getUserExit("ANALYTIC").equals("1") ) { %>
    <% } else { %>
    <%=odiRef.getGrpBy(i)%>
    <%=odiRef.getHaving(i)%>
    <% } %>
    Thanks in advance

    Thanks for the reply.
    But I think that in ODI 11g the getFrom() function is behaving differently; that is why it is not working.
    When I check the A.2.18 getFrom() Method section of the Substitution API Reference document, it says:
    "Allows the retrieval of the SQL string of the FROM in the source SELECT clause for a given dataset. The FROM statement is built from tables and joins (and according to the SQL capabilities of the technologies) that are used in this dataset."
    I think getFrom() also retrieves the GROUP BY clause: I created a step in the IKM with just *<%=odiRef.getFrom(0)%>* and I can see that even that generated query has a GROUP BY clause.

  • Problem in using aggregate functions inside case statement

    Hi All,
    I am facing a problem while using aggregate functions inside a CASE statement:
    CASE WHEN PSTYPE='S' THEN MAX(DECODE(POS.PBS,1,ABS(POS.PPRTQ),0)) ELSE SUM(DECODE(POS.PBS,1,ABS(POS.PPRTQ),0)) END
    How can I achieve the above requirement? Can anyone help me?
    Thanks and Regards
    DG
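    A minimal sketch of the usual pattern (assuming PSTYPE can be added to the GROUP BY; the POSITIONS_WRK table name is borrowed from the fuller query below purely for illustration):
    SELECT pstype,
           CASE WHEN pstype = 'S'
                THEN MAX (DECODE (pbs, 1, ABS (pprtq), 0))   -- one aggregate per group
                ELSE SUM (DECODE (pbs, 1, ABS (pprtq), 0))
           END AS pprtq_val
    FROM   positions_wrk
    GROUP BY pstype;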

    Hi All,
    Below is my query:
            SELECT
            CASE WHEN p_reportid IN ('POS_RV_SN','POS_PB') THEN POS.PACCT
            ELSE POS.PACCT || '-' || DECODE(POS.SYSTEMCODE,'GMI1','1', 'GMI2','2', 'GMI3','4', 'GMI4','3', '0') ||POS.PFIRM|| NVL(POS.POFFIC,'000') END,
            CASE WHEN p_reportid IN ('POS_RV_SN','POS_PB') THEN POS.PACCT||POS.PCUSIP||DECODE(POS.PBS,1,'+',2,'-')
            ELSE POS.PFIRM||POS.POFFIC||POS.PACCT||POS.PCUSIP||DECODE(POS.PBS,1,'+',2,'-') END,POS.SYSTEMCODE,CASE WHEN POS.PSTYPE='S' THEN POS.PSYMBL ELSE POS.PFC END,POS.PEXCH||DECODE(POS.PSUBEX,'<NULL>',''),
            POS.PCURSY,
            CASE WHEN POS.PSBCUS IS NULL THEN SUBSTR(POS.PCTYM,5,2) || SUBSTR(POS.PCTYM,1,4) ELSE POS.PSBCUS || SUBSTR(POS.PCTYM,5,2) || SUBSTR(POS.PCTYM,1,4) END ,
            NVL(POS.PSUBTY,'F') ,POS.PSTRIK,*SUM(DECODE(POS.PBS,1,ABS(POS.PPRTQ),0)) ,SUM(DECODE(POS.PBS,2,ABS(POS.PPRTQ),0))* ,
            POS.PCLOSE,SUM(POS.PMKVAL) ,
            TO_CHAR(CASE WHEN INSTR(POS.PUNDCP,'.') > 0 OR LENGTH(POS.PUNDCP) < 15 THEN POS.PUNDCP ELSE TO_CHAR(TO_NUMBER(POS.PUNDCP) / 100000000) END),
            POS.UBS_ID,POS.BBG_EXCHANGE_CODE,POS.BBG_TICKER ,POS.BBG_YELLOW_KEY,POS.PPCNTY,POS.PMULTF,TO_CHAR(POS.BUSINESS_DATE,'YYYYMMDD'),
            POS.SOURCE_GMI_LIB,
            --DECODE(POS.SYSTEMCODE,'GMI1','euro','GMI2','namr','GMI3','aust','GMI4','asia','POWERBASE','aust','SINACOR','namr',POS.SYSTEMCODE),
            DECODE(p_reportid,'RVPOS_SING','euro','RVPOS_AUSTDOM','aust','RVPOS_AUSTEOD','euro','RVPOS_GLBLAPAC','asia','POS_RV_SN','namr','POS_PB','aust',POS.SYSTEMCODE),
            POS.RIC,
            CASE WHEN PSUBTY = 'S' THEN POS.TYPE ELSE NULL END,
            DECODE(POS.UBS_ID,NULL,POS.PCUSP2,POS.ISIN),POS.UNDERLYING_BBG_TICKER,POS.UNDERLYING_BBG_EXCHANGE,POS.PRODUCT_CLASSIFICATION,
            CASE WHEN PSUBTY = 'S' THEN POS.PSDSC2 ELSE NULL END,
            CASE WHEN PSUBTY = 'S' THEN C.SSDSC3 ELSE NULL END,
            NVL(C.SSECID,POS.PCUSIP),
            NULL,
            POS.PYSTMV,
            POS.PMINIT,
            POS.PEXPDT,
            CASE WHEN POS.PSUBTY='S' THEN  SUBSTR(C.ZDATA2,77,1) ELSE NULL END,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL,
            NULL
            FROM POSITIONS_WRK POS LEFT OUTER JOIN
            (SELECT * FROM CDS_PRODUCTS CP INNER JOIN FUTURE_MASTER FM ON
            (CP.STRXCH=FM.ZEXCH AND CP.SFC=FM.ZFC AND CP.BUSINESS_DATE = FM.BUSINESS_DATE )) C ON POS.PCUSIP = C.SCUSIP
            AND NVL(POS.PCUSP2,'X') = NVL(C.SCUSP2,'X')
            WHERE
            POS.PEXCH NOT IN ('A1','A2','A3','B1','B3','C2','D1','H1','K1','L1','M1','M3','P1','S1')
            AND (POS.PSBCUS IS NOT NULL OR POS.PCTYM IS NOT NULL OR POS.PSTYPE ='S')
            AND POS.BUSINESS_DATE = run_date_char
            GROUP BY
            POS.UBS_ID,POS.SYSTEMCODE,POS.RECIPIENTCODE,POS.BUSINESS_DATE,POS.PACCT,POS.PFIRM,POS.POFFIC,POS.PCUSIP,POS.PBS,CASE WHEN POS.PSTYPE='S' THEN POS.PSYMBL ELSE POS.PFC END,
            POS.PEXCH,POS.PSUBEX,POS.PCURSY,
            CASE WHEN POS.PSBCUS IS NULL THEN SUBSTR(POS.PCTYM,5,2) || SUBSTR(POS.PCTYM,1,4) ELSE POS.PSBCUS || SUBSTR(POS.PCTYM,5,2)  || SUBSTR(POS.PCTYM,1,4) END,
            NVL(POS.PSUBTY,'F') ,POS.PSTRIK,POS.PCLOSE,TO_CHAR(CASE WHEN INSTR(POS.PUNDCP,'.') > 0 OR LENGTH(POS.PUNDCP) < 15 THEN POS.PUNDCP ELSE TO_CHAR(TO_NUMBER(POS.PUNDCP) / 100000000) END),
            POS.BBG_EXCHANGE_CODE,POS.BBG_TICKER,POS.BBG_YELLOW_KEY,POS.PPCNTY,POS.PMULTF,POS.PSUBTY,POS.SOURCE_GMI_LIB,RIC,
            CASE WHEN PSUBTY = 'S' THEN POS.TYPE ELSE NULL END,
            DECODE(POS.UBS_ID,NULL,POS.PCUSP2,POS.ISIN),POS.UNDERLYING_BBG_TICKER,POS.UNDERLYING_BBG_EXCHANGE,POS.PRODUCT_CLASSIFICATION,
            CASE WHEN PSUBTY = 'S' THEN POS.PSDSC2 ELSE NULL END,
            CASE WHEN PSUBTY = 'S' THEN C.SSDSC3 ELSE NULL END,
            NVL(C.SSECID,POS.PCUSIP),
            POS.PYSTMV,
            POS.PMINIT,
            POS.PEXPDT,
        CASE WHEN PSUBTY = 'S'  THEN  SUBSTR(C.ZDATA2,77,1) ELSE NULL END;
    Now, could you please help me replace the bold text in the query with the requirement?
    Thanks and Rgds
    DG
    Edited by: BluShadow on 16-May-2011 09:39
    added {noformat}{noformat} tags. Please read: {message:id=9360002} for details on how to post code/data

  • Using analytical function - value with highest count

    Hi
    i have this table below
    CREATE TABLE table1
    ( cust_name VARCHAR2 (10)
    , txn_id NUMBER
    , txn_date DATE
    , country VARCHAR2 (10)
    , flag number
    , CONSTRAINT key1 UNIQUE (cust_name, txn_id)
    );
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9870,TO_DATE ('15-Jan-2011', 'DD-Mon-YYYY'), 'Iran', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9871,TO_DATE ('16-Jan-2011', 'DD-Mon-YYYY'), 'China', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9872,TO_DATE ('17-Jan-2011', 'DD-Mon-YYYY'), 'China', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9873,TO_DATE ('18-Jan-2011', 'DD-Mon-YYYY'), 'Japan', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9874,TO_DATE ('19-Jan-2011', 'DD-Mon-YYYY'), 'Japan', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9875,TO_DATE ('20-Jan-2011', 'DD-Mon-YYYY'), 'Russia', 1);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9877,TO_DATE ('22-Jan-2011', 'DD-Mon-YYYY'), 'China', 0);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9878,TO_DATE ('26-Jan-2011', 'DD-Mon-YYYY'), 'Korea', 0);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9811,TO_DATE ('17-Jan-2011', 'DD-Mon-YYYY'), 'China', 0);
    INSERT INTO table1 (cust_name, txn_id, txn_date,country,flag) VALUES ('Peter', 9854,TO_DATE ('13-Jan-2011', 'DD-Mon-YYYY'), 'Taiwan', 0);
    The requirement is to create an additional column in the resultset with country name where the customer has done the maximum number of transactions
    (with transaction flag 1). In case we have two or more countries tied with the same count, then we need to select the country (among the tied ones)
    where the customer has done the last transaction (with transaction flag 1)
    e.g. The count is 2 for both 'China' and 'Japan' for transaction flag 1 ,and the latest transaction is for 'Japan'. So the new column should contain 'Japan'
    CUST_NAME TXN_ID TXN_DATE COUNTRY FLAG country_1
    Peter 9811 17-JAN-11 China 0 Japan
    Peter 9854 13-JAN-11 Taiwan 0 Japan
    Peter 9870 15-JAN-11 Iran 1 Japan
    Peter 9871 16-JAN-11 China 1 Japan
    Peter 9872 17-JAN-11 China 1 Japan
    Peter 9873 18-JAN-11 Japan 1 Japan
    Peter 9874 19-JAN-11 Japan 1 Japan
    Peter 9875 20-JAN-11 Russia 1 Japan
    Peter 9877 22-JAN-11 China 0 Japan
    Peter 9878 26-JAN-11 Korea 0 Japan
    Please let me know how to accomplish this using analytical functions
    Thanks
    -Learnsequel

    Does this work (I have not spent much time checking it)?
    WITH ana AS (
    SELECT cust_name, txn_id, txn_date, country, flag,
            Sum (flag)
                OVER (PARTITION BY cust_name, country)      n_trx,
            Max (CASE WHEN flag = 1 THEN txn_date END)
                OVER (PARTITION BY cust_name, country)      l_trx
      FROM cnt_trx
    )
    SELECT cust_name, txn_id, txn_date, country, flag,
            First_Value (country) OVER (PARTITION BY cust_name ORDER BY n_trx DESC, l_trx DESC) top_cnt
      FROM ana;
    CUST_NAME      TXN_ID TXN_DATE  COUNTRY          FLAG TOP_CNT
    Fred             9875 20-JAN-11 Russia              1 Russia
    Fred             9874 19-JAN-11 Japan               1 Russia
    Peter            9873 18-JAN-11 Japan               1 Japan
    Peter            9874 19-JAN-11 Japan               1 Japan
    Peter            9872 17-JAN-11 China               1 Japan
    Peter            9871 16-JAN-11 China               1 Japan
    Peter            9811 17-JAN-11 China               0 Japan
    Peter            9877 22-JAN-11 China               0 Japan
    Peter            9875 20-JAN-11 Russia              1 Japan
    Peter            9870 15-JAN-11 Iran                1 Japan
    Peter            9878 26-JAN-11 Korea               0 Japan
    Peter            9854 13-JAN-11 Taiwan              0 Japan
    12 rows selected.

  • Should I use Analytic functions ?

    Hello,
    I have a table rci_dates with the following structure (rci_id,visit_id,rci_name,rci_date).
    A sample of data in this table is as given below.
    1,101,'FIRST VISIT', '2010-MAY-01',
    2,101,'FIRST VISIT', '2010-MAY-01'
    3,101,'FIRST VISIT', '2010-MAY-01'
    4,101,'FIRST VISIT', '2010-MAY-01'
    5,102,'SECOND VISIT', '2010-JUN-01',
    6,102,'SECOND VISIT', '2010-JUN-01'
    7,102,'SECOND VISIT', '2010-JUN-01'
    8,102,'SECOND VISIT', '2010-JUL-01'
    I want to write a query which returns records like the one with rci_id = 8, since its rci_date differs within visit_id 102. Whereas in visit_id 101 the rci_dates are all the same, so that visit should not be displayed in the output returned by my query.
    How can I do this? Should I be using analytic functions? Can someone please let me know.
    Thanks
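    A minimal sketch of one analytic approach, assuming the rci_dates table described above (count the distinct dates per visit, then keep only the visits where the date varies):
    SELECT *
    FROM  (SELECT r.*,
                  COUNT (DISTINCT rci_date)
                      OVER (PARTITION BY visit_id) AS date_cnt   -- distinct dates per visit
           FROM   rci_dates r)
    WHERE  date_cnt > 1;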

    OK, I have created the table and inserted the data, but it appears that the data are already the output you are expecting; they all have the same visit_id.
    SQL> CREATE TABLE RCI
      2  (RCI_ID NUMBER(10) NOT NULL,
      3   VISIT_ID NUMBER(10) NOT NULL,
      4   RCI_NAME VARCHAR2(20 BYTE) NOT NULL,
      5   DCI_DATE VARCHAR2(8 BYTE));
    Table created
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876540, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876640, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876740, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876840, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876940, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877040, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877140, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877240, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877240, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877640, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877740, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877840, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877940, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878040, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878140, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878240, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878340, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878440, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878540, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877640, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14877740, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878340, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878540, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 17418240, 12140, 'SCREENING', '20000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 17418340, 12140, 'SCREENING', '20000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 17418440, 12140, 'SCREENING', '20000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14878240, 12140, 'SCREENING', '20000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 18790240, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 21724540, 12140, 'SCREENING', '19000101');
    1 row inserted
    SQL> INSERT INTO RCI ( RCI_ID, VISIT_ID, RCI_NAME, DCI_DATE ) VALUES ( 14876540, 12140, 'SCREENING', '20091015');
    1 row inserted
    SQL> commit;
    Commit complete
    SQL> select * from rci;
         RCI_ID    VISIT_ID RCI_NAME             DCI_DATE
       14876540       12140 SCREENING            19000101
       14876640       12140 SCREENING            19000101
       14876740       12140 SCREENING            19000101
       14876840       12140 SCREENING            19000101
       14876940       12140 SCREENING            19000101
       14877040       12140 SCREENING            19000101
       14877140       12140 SCREENING            19000101
       14877240       12140 SCREENING            19000101
       14877240       12140 SCREENING            19000101
       14877640       12140 SCREENING            19000101
       14877740       12140 SCREENING            19000101
       14877840       12140 SCREENING            19000101
       14877940       12140 SCREENING            19000101
       14878040       12140 SCREENING            19000101
       14878140       12140 SCREENING            19000101
       14878240       12140 SCREENING            19000101
       14878340       12140 SCREENING            19000101
       14878440       12140 SCREENING            19000101
       14878540       12140 SCREENING            19000101
       14877640       12140 SCREENING            19000101
       14877740       12140 SCREENING            19000101
       14878340       12140 SCREENING            19000101
       14878540       12140 SCREENING            19000101
       17418240       12140 SCREENING            20000101
       17418340       12140 SCREENING            20000101
       17418440       12140 SCREENING            20000101
       14878240       12140 SCREENING            20000101
       18790240       12140 SCREENING            19000101
       21724540       12140 SCREENING            19000101
       14876540       12140 SCREENING            20091015
    30 rows selected
    SQL> -- using sample code similar to what I previously posted, it returned all the rows.
    SQL> select rci.*
      2    from rci
      3   where rci.visit_id in (select r1.visit_id
      4                            from (select rci.visit_id,
      5                                         count(*) over (partition by rci.visit_id, rci.dci_date order by rci.visit_id) rn
      6                                    from rci) r1
      7                            where r1.rn = 1)
      8  order by rci.rci_id;
         RCI_ID    VISIT_ID RCI_NAME             DCI_DATE
       14876540       12140 SCREENING            20091015
       14876540       12140 SCREENING            19000101
       14876640       12140 SCREENING            19000101
       14876740       12140 SCREENING            19000101
       14876840       12140 SCREENING            19000101
       14876940       12140 SCREENING            19000101
       14877040       12140 SCREENING            19000101
       14877140       12140 SCREENING            19000101
       14877240       12140 SCREENING            19000101
       14877240       12140 SCREENING            19000101
       14877640       12140 SCREENING            19000101
       14877640       12140 SCREENING            19000101
       14877740       12140 SCREENING            19000101
       14877740       12140 SCREENING            19000101
       14877840       12140 SCREENING            19000101
       14877940       12140 SCREENING            19000101
       14878040       12140 SCREENING            19000101
       14878140       12140 SCREENING            19000101
       14878240       12140 SCREENING            19000101
       14878240       12140 SCREENING            20000101
       14878340       12140 SCREENING            19000101
       14878340       12140 SCREENING            19000101
       14878440       12140 SCREENING            19000101
       14878540       12140 SCREENING            19000101
       14878540       12140 SCREENING            19000101
       17418240       12140 SCREENING            20000101
       17418340       12140 SCREENING            20000101
       17418440       12140 SCREENING            20000101
       18790240       12140 SCREENING            19000101
       21724540       12140 SCREENING            19000101
    30 rows selected
    SQL> Just as Frank has said, it would be helpful if you post a sample output based on the original posting, that is, on the first posting you have.
