Urgent need a SQL Query

Hi all
I have a variable that stores consecutive ';' characters, like 'abc;;;;def;;;gh;;;;lm'.
I want to replace all the consecutive ';' with a single ';'.
e.g. output like 'abc;def;gh;lm'
send the sql or pl/sql ASAP. its very urgent for me.
Thanks in advance

"send the sql or pl/sql ASAP. its very urgent for me."
Nobody here cares if something is urgent for you. AT ALL. It is not urgent for us. It is certainly not any more important than other threads on this forum. It IS, however, completely ignorant for you to assume that your issue is any more important than someone else's issue. Please NEVER use the word urgent in any of your future posts (unless it is relevant to the question).
On top of that, you have provided very little information, such as your database version, where you are using this value, etc.
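
For what it's worth, a minimal sketch of the replacement being asked for, assuming Oracle 10g or later where REGEXP_REPLACE is available:
-- Collapse each run of consecutive semicolons into a single semicolon.
SELECT REGEXP_REPLACE('abc;;;;def;;;gh;;;;lm', ';+', ';') AS collapsed
FROM   dual;
-- Result: abc;def;gh;lm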

Similar Messages

  • Need a SQL query(Pls Its urgent)

    Hi,
    I want a SQL query for the following.
    Details are below.
    Table AFPFTRAN
    BU      TransactionDate   Amt
    13202   10-04-05          10,000
    13203   11-05-05          20,000
    13202   20-04-05           5,000
    Table AFGENCOD
    BU      Clerk Name   Clerk Code
    13202   Amit         TFBG
    13203   Anand        TFMG
    I want a query to get data as below:
    Tr Date   TFBG     TFMG
    Apr       15,000   0
    May       0        20,000
    Jun       0        0
    Jul       0        0
    Aug
    Sep
    Oct
    Nov
    Dec
    Jan
    Feb
    Mar
    I want to achieve this using a single query. How can I achieve this?
    Please give the solution...
    Adios...
    Prashanth Deshmukh

    Does it look like what you need?
    SQL> select * from t;
            BU TDATE           AMT
         13202 10.04.05      10000
         13203 11.05.05      20000
         13202 20.04.05       5000
    SQL> select * from t1;
            BU NAME  CODE
         13202 Amit  TFBG
         13203 Anand TFMG
    SQL> SELECT pivot_month.mname, tops.TFBG, tops.TFMG
      2  from
      3  (
      4  select to_char(t.tdate,'MM') month_no,
      5  sum(decode(t1.code,'TFBG',t.amt,0)) TFBG,
      6  sum(decode(t1.code,'TFMG',t.amt,0)) TFMG
      7  from t1, t
      8  where t.bu=t1.bu
      9  and to_char(t.tdate,'yyyy') = 2005
    10  GROUP BY t1.code, to_char(t.tdate,'MM')
    11  ) tops,
    12  (select rownum mn, to_char(to_date(rownum,'MM'),'MON') mname,
    13  case when rownum between 1 and 3 then rownum + 9
    14  else rownum - 3 end rn
    15  from dict where rownum < 13) pivot_month
    16  where pivot_month.mn = tops.month_no (+)
    17  order by pivot_month.rn
    18  /
    MNA       TFBG       TFMG
    APR      15000          0
    MAY          0      20000
    JUN
    JUL
    AUG
    SEP
    OCT
    NOV
    DEC
    JAN
    FEB
    MAR
    12 rows selected.
    Rgds.
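
    A side note on the pivot_month inline view above: the same fiscal-ordered month list can be generated without reading from the DICT view. A hedged alternative sketch, assuming Oracle 10g or later:
    -- Generate the 12 pivot months (fiscal order Apr..Mar) without relying on DICT.
    SELECT LEVEL AS mn,
           TO_CHAR(TO_DATE(LEVEL, 'MM'), 'MON') AS mname,
           CASE WHEN LEVEL BETWEEN 1 AND 3 THEN LEVEL + 9
                ELSE LEVEL - 3 END              AS rn
    FROM   dual
    CONNECT BY LEVEL <= 12;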

  • Proxy to JDBC scenario: need dynamic SQL query for sender

    Hi Experts,
    I am developing a proxy to JDBC scenario, in which I need to pass a dynamic SQL query. Currently we use the classical method described below:
    we pass the SELECT statement in a constant and map it to the access field, and the key field is mapped to the key field.
    My requirement is that, instead of passing the SELECT statement in a constant, I want to generate it dynamically, pass it in one field, and map that to the access field.

    Hi Ravinder,
    A simple UDF or use of graphical mapping functions in most cases should provide you everything you need to construct a dynamic SQL statement for your requirement.
    Regards,
    Ryan Crosby

  • Urgent: Help regarding SQL Query

    Hi ,
    I need help regarding an sql query.
    Sample Data:
    ITEM_TYPE  ITEM_NUM  UNIT_PRICE  QUANTITY  LINE_TOTAL
    ITEM       1         5           10        50
    ITEM       2         10          5         50
    ITEM       1         5           5         25
    ITEM                 2           10        20
    TAX                                        16.5
    TAX                                        -3.5
    I would like to display the data as:
    ITEM_TYPE  ITEM_NUM  UNIT_PRICE  QUANTITY  LINE_TOTAL
    ITEM       1         5           15        145
               2         10          5
                         2           10
    TAX                                        13.0
    Line_total = unit_price * Quantity
    Thanks in Advance
    G.Vamsi Krishna
    Edited by: user10733211 on Aug 5, 2009 7:42 AM
    Edited by: user10733211 on Aug 5, 2009 7:49 AM
    Edited by: user10733211 on Aug 5, 2009 8:12 AM
    Edited by: user10733211 on Aug 5, 2009 8:22 AM
    Edited by: user10733211 on Aug 5, 2009 8:24 AM

    Hi,
    Try this, use some analytics:
    SQL> with t as (
      2  select 'item' item_type, 1 item_num, 5 unit_price, 10 quantity, 50 linetotal from dual union all
      3  select 'item', 2, 10, 5, 50 from dual union all
      4  select 'item', 1, 5, 5, 25 from dual union all
      5  select 'item', null, 2, 10, 20 from dual union all
      6  select 'tax', null, null, null, 16.5 from dual union all
      7  select 'tax', null, null, null, -3.5 from dual
      8  ) -- actual query starts here:
      9  select item_type
    10  ,      item_num
    11  ,      unit_price
    12  ,      sum_qty
    13  ,      case when sum_lt = lag(sum_lt) over ( order by item_type, item_num )
    14              then null
    15              else sum_lt
    16         end  sum_lt
    17  from ( select item_type
    18         ,      item_num
    19         ,      unit_price
    20         ,      quantity
    21         ,      sum(quantity) over  ( partition by item_type, item_num ) sum_qty
    22         ,      sum(linetotal) over ( partition by item_type )           sum_lt
    23         ,      row_number() over ( partition by item_type, item_num  order by item_type, item_num ) rn
    24         from   t
    25       )
    26  where rn=1;
    ITEM   ITEM_NUM UNIT_PRICE    SUM_QTY     SUM_LT
    item          1          5         15        145
    item          2         10          5
    item                     2         10
    tax                                           13
    4 rows selected.
    edit
    And please use the code tag, instead of kludging with concats.
    Read:
    http://forums.oracle.com/forums/help.jspa
    Edited by: hoek on Aug 5, 2009 5:15 PM
    edit2
    Also nulls for item_type:
    ops$xmt%OPVN> with t as (
      2  select 'item' item_type, 1 item_num, 5 unit_price, 10 quantity, 50 linetotal from dual union all
      3  select 'item', 2, 10, 5, 50 from dual union all
      4  select 'item', 1, 5, 5, 25 from dual union all
      5  select 'item', null, 2, 10, 20 from dual union all
      6  select 'tax', null, null, null, 16.5 from dual union all
      7  select 'tax', null, null, null, -3.5 from dual
      8  ) -- actual query starts here:
      9  select case when item_type = lag(item_type) over ( order by item_type, item_num )
    10              then null
    11              else sum_lt
    12         end  item_type
    13  ,      item_num
    14  ,      unit_price
    15  ,      sum_qty
    16  ,      case when sum_lt = lag(sum_lt) over ( order by item_type, item_num )
    17              then null
    18              else sum_lt
    19         end  sum_lt
    20  from ( select item_type
    21         ,      item_num
    22         ,      unit_price
    23         ,      quantity
    24         ,      sum(quantity) over  ( partition by item_type, item_num ) sum_qty
    25         ,      sum(linetotal) over ( partition by item_type )           sum_lt
    26         ,      row_number() over ( partition by item_type, item_num  order by item_type, item_num ) rn
    27         from   t
    28       )
    29  where rn=1;
    ITEM_TYPE   ITEM_NUM UNIT_PRICE    SUM_QTY     SUM_LT
           145          1          5         15        145
                        2         10          5
                                   2         10
            13                                          13
    4 rows selected.
    If you really need a space instead of nulls, then simply replace the nulls with a space....
    Edited by: hoek on Aug 5, 2009 5:18 PM
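
    Following up on the "space instead of nulls" remark, a minimal self-contained sketch (with hypothetical literal values): NVL needs a character value, so convert the number to a string first and then substitute the space.
    -- A NULL group total displays as a space; a real total passes through unchanged.
    SELECT NVL(TO_CHAR(NULL), ' ') AS blanked_value,
           NVL(TO_CHAR(145),  ' ') AS kept_value
    FROM   dual;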

  • VERY URGENT: problem in sql query with long datatype in weblogic

    I have a problem while trying to retrieve a column value with a LONG datatype using a servlet and the OCI driver; the server is WebLogic 5.1. I have used a prepared statement, and the problem occurs in
    preparedStatement.executeQuery().
    The SQL query is a simple query that runs well in all cases and fails only when the LONG datatype column is included in the query.
    The exception that comes on the WebLogic server is:
    AN UNEXPECTED EXCEPTION DETECTED IN THE NATIVE CODE OUTSIDE THE VM.

    Did you try changing the driver then?
    Please use Oracle's thin driver instead of the OCI driver.
    There are many advantages to using the type 4 driver, the first and foremost being that it does not require Oracle client-side software on your machine. Therefore no entries need to be made in tnsnames.ora.
    The thin driver is available in a jar called classes112.zip; the class which implements the thin driver is oracle.jdbc.driver.OracleDriver.
    The connection string is
    jdbc:oracle:thin:@<machine name>:1521:<sid>
    please try out with the thin driver and let me know.
    regards,
    Abhishek.

  • Need a SQL Query (URGENT)

    Hi Folks,
    I have 2 tables: the 1st one has 200 columns and the 2nd has 2 columns. There is one common column in both tables, but there is a small difference in the schemas: the common column in the 2nd table is a primary key, while the same column in the 1st table is an ordinary one. The data type of the common column is the same.
    Now I need to write a query to select the 199 columns (all except the common column) from the 1st table, plus the other (2nd) column from the 2nd table, for "table1.commoncolumn=table2.commoncolumn".
    I tried a natural join but it is not working in my Informix SQL database. I also tried explicitly listing the column names, like "select column1,column2....column199,table2.column2 from table1,table2 where table1.commoncolumn=table2.commoncolumn", but it has a severe performance impact.
    Can someone please suggest a query for the above? Thanks in advance.

    Please give me possible solutions and suggestions regarding the above query. The Informix forum is very slow...
    The problem is, you labeled your question badly. As this is a forum of volunteers, people tend to react badly to the use of the word "urgent" in the subject line. Nobody's question matters more than anybody else's. In fact some regulars won't answer questions with "urgent" in the strapline as a matter of principle.
    Then when we get to actually read your question it turns out to be a question about Informix. I guess not many people here use Informix so your potential pool of responders is pretty small (for instance I'm not even sure how to spell it).
    It's not our fault the Informix forums are so lame.
    Anyway, what I suggest is you repost your question with a new title: (Off topic) Need help with an INFORMIX query.
    At least that will attract people who might be able to answer your question. Then you need to include the actual query you're running and all the supporting details necessary for people to understand the nature of the performance impact.
    You need to ask the right people and you need to ask the right question. This is standard etiquette (and indeed common sense) regardless of which forum you're using.
    Cheers, APC
    Blog : http://radiofreetooting.blogspot.com/
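
    For reference, a hedged sketch of the join being described, using short table aliases; the table and column names are the placeholders from the post, and the comment marks the remaining columns that still have to be listed out by hand, since SQL has no "all columns except one" shorthand:
    SELECT t1.column1, t1.column2, /* ... remaining table1 columns ... */ t1.column199,
           t2.column2
    FROM   table1 t1
    JOIN   table2 t2 ON t1.commoncolumn = t2.commoncolumn;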

  • Help needed for SQL query

    hello ,
    I am a beginner in terms of writing sql queries. I hope some body can help me out.
    I have two tables
    mysql> desc user_group_t;
    +---------------+---------+------+-----+---------+-------+
    | Field         | Type    | Null | Key | Default | Extra |
    +---------------+---------+------+-----+---------+-------+
    | userAccountId | char(8) |      | PRI |         |       |
    | groupId       | char(8) |      | PRI |         |       |
    +---------------+---------+------+-----+---------+-------+
    2 rows in set (0.00 sec)
    mysql> desc group_t;
    +-------------+-----------+------+-----+---------+-------+
    | Field       | Type      | Null | Key | Default | Extra |
    +-------------+-----------+------+-----+---------+-------+
    | id          | char(8)   |      | PRI |         |       |
    | name        | char(50)  | YES  |     | NULL    |       |
    | email       | char(100) | YES  |     | NULL    |       |
    | description | char(254) | YES  |     | NULL    |       |
    | parentId    | char(8)   | YES  |     | NULL    |       |
    | creatorId   | char(8)   | YES  |     | NULL    |       |
    | createDate  | char(20)  | YES  |     | NULL    |       |
    | updateDate  | char(20)  | YES  |     | NULL    |       |
    | updatorId   | char(8)   | YES  |     | NULL    |       |
    +-------------+-----------+------+-----+---------+-------+
    9 rows in set (0.00 sec)
    what I want is list of all groups with id,name and #of members(which is the # of rows in the user_group_t for any given id). Importantly I need the groups with 0 members also to be listed. In short my output should contain exactly the same number of rows as in group_t table with an additional column indicating # of members for that group.
    Any help would be greatly appreciated.
    Thanks in Advance.
    -Vasanth

    Thanks Donald,
    Actually I figured it out, with the following query:
    select id,name,sum(if(groupid is not null,1,0)) as members from group_t left join user_group_t on id=groupid group by id;
    I tried your solution, but mysql says there is an error at '+' . Anyway I modified your solution to the one below and it worked.
    select a.id, a.name, count(b.groupid) from group_t a left join user_group_t b on a.id=b.groupid group by a.id, a.name;
    I tried that before, but I used COUNT(*) instead of COUNT on groupid. Your solution is elegant and I will go with yours.
    Thanks again.
    Vasanth
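
    A note on why the accepted query works (a hedged aside, reusing the table names from the thread): COUNT(b.groupid) ignores the NULLs produced by the LEFT JOIN, so a group with no members counts as 0, whereas COUNT(*) would count the unmatched row itself and report 1.
    -- Minimal sketch contrasting COUNT(b.groupid) with COUNT(*) after a LEFT JOIN.
    SELECT a.id,
           a.name,
           COUNT(b.groupid) AS members,   -- 0 for groups with no user_group_t rows
           COUNT(*)         AS row_count  -- 1 even when there is no match
    FROM   group_t a
    LEFT JOIN user_group_t b ON a.id = b.groupid
    GROUP BY a.id, a.name;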

  • Very Urgent Help , ABAP SQL Query

    Guys,
    Please suggest. I have a table (custom_table1) with a field, say A, which is of type c and length 9, and I want to query this table. The following is the query:
    Select substr(A,0,5) B C into itab From table custom_table1
    where b = ( select b from cusstom_table2 )
    and substr(A,0,5) = Input_A.
    That is, I want to compare Input_A (which is 5 characters long) with field A (only the first 5 characters of the 9). But it seems the query is wrong. Kindly help, it is very urgent.
    Thanks

    Thanks guys, you have helped me to fill up the WHERE condition as
    CONCATENATE  srch_str '%' INTO srch_str.  -- because I want it to be 'InputA%'
    Select Substr(A,0,5) , B C Into Itab From CustomTable1 where B = (Select B from customtable2) and A Like SrchStr
    But what about column A in the select query? I need only the first 5 characters of the field populated into itab.
    Is there any means by which I can populate only the 5 characters from the select field A into itab without using Substr (because it doesn't work)? Please help.

  • Need the sql query for IN clause.

    Hi All,
    I have item data like
    IBM 200 and IBM 500
    and the present code is like this: select * from emp where in (IBM 200 and IBM 500);
    so I need to change or replace it to look like ('IBM 200','IBM 500').
    Can anybody please help me with this?
    I need to get the output 'IBM 200','IBM 500' from the data IBM 200 and IBM 500.
    Edited by: anbarasan on Oct 17, 2011 10:30 PM

    Please consider the following when you post a question. This will help us help you better.
    1. New features keep coming in every Oracle version, so please provide your Oracle DB version to get the best possible answer.
       You can use the following query and copy and paste the output:
       select * from v$version
    2. This forum has a very good Search feature. Please use it before posting your question, because for most of the questions that are asked, the answer is already there.
    3. We don't know your DB structure or how your data looks, so you need to let us know. The best way is to give some sample data like this:
       I have the following table called sales
       with sales
       as
       (
             select 1 sales_id, 1 prod_id, 1001 inv_num, 120 qty from dual
             union all
             select 2 sales_id, 1 prod_id, 1002 inv_num, 25 qty from dual
       )
       select *
         from sales
    4. Rather than telling what you want in words, it is easier when you give your expected output.
       For example, in the above sales table, I want to know the total quantity and the number of invoices for each product.
       The output should look like this:
       Prod_id   sum_qty   count_inv
       1         145       2
    5. Whenever you get an error message, post the entire error message, with the error number, the message, and the line number.
    6. The next thing is very important to remember: please post only well-formatted code. Unformatted code is very hard to read.
       Your code format gets lost when you post it in the Oracle Forum, so in order to preserve it you need to use the {noformat}{noformat} tags.
       The usage of the tag is like this:
       {noformat}
       <place your code here>
       {noformat}
    7. If you are posting a *Performance Related Question*, please read
       {thread:id=501834} and {thread:id=863295}.
       Following those guides will be very helpful.
    8. Please keep in mind that this is a public forum. Here no question is URGENT.
       So the use of words like *URGENT* or *ASAP* (As Soon As Possible) is considered rude.
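
    As for the actual question, a hedged sketch of one way to turn the stored string into individual values that can feed an IN clause. It assumes Oracle 11g or later (for REGEXP_COUNT) and a single-row source, so treat it as a starting point rather than a ready-made answer:
    -- Split 'IBM 200 and IBM 500' into one row per value.
    WITH src AS (
      SELECT REPLACE('IBM 200 and IBM 500', ' and ', ',') AS item_list FROM dual
    )
    SELECT TRIM(REGEXP_SUBSTR(item_list, '[^,]+', 1, LEVEL)) AS item
    FROM   src
    CONNECT BY LEVEL <= REGEXP_COUNT(item_list, ',') + 1;
    -- The resulting rows (IBM 200 / IBM 500) can then be used as a subquery inside an IN clause.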

  • Need of SQL query in selecting distinct values from two tables

    hi,
    I need a query for selecting distinct values from two tables with one condition.
    For example:
    there are two tables, a and b.
    Table a has columns like age, sex, name, empno, and table b has columns such as age, salary, DOJ, empno.
    Using empno as the unique field, I need to select the distinct values from the two tables, i.e. everything except age.
    Can anybody please help me?
    Thanks in advance,
    Ratheesh

    Not sure what you mean either, but perhaps this will start a dialog:
    SELECT DISTINCT a.empno,
                    a.name,
                    a.sex,
                    b.salary,
                    b.doj
    FROM    a,
            b
    WHERE   a.empno = b.empno;
    Greg

  • Help Needed In SQL Query

    HI All,
    Oracle sql clarification required
    Sample Table:
    empno  empname  Job      mgr_id  hire_date  salary  deptno
    7788   SCOTT    ANALYST  7566    19-APR-87  3000    20
    7902   FORD     ANALYST  7566    03-DEC-81  3000    20
    7934   MILLER   CLERK    7782    23-JAN-82  1300    10
    7900   JAMES    CLERK    7698    03-DEC-81  950     30
    7369   SMITH    CLERK    7902    17-DEC-80  800     20
    7876   ADAMS    CLERK    7788    23-MAY-87  1100    20
    Need "single / one" sql for this requirement statement:
    There will be 2 drop down boxes (1st - Job list, 2nd - empno) in the form in which the following result set is expected
    1) When user selects value from 1st drop down box (job) as "ANALYST" leaving the second drop down unselected, the result expected is 2 (no. of rows for that job)
    2) When user selects value from 1st drop down box (job) as "ANALYST" and the value from 2nd drop down box as 7902, the result expected is 1 (no of rows for that job and empno)
    The SQL I have tried (given below) didn't give the expected result; please help me correct it.
    select count(1) from scott.emp where job='ANALYST' and ( empno = :empno or empno is null ) ;
    Please help for this requirement. Any help is deeply appreciated.
    Thanks
    Zaheer

    Hi,
    welcome to the forum.
    Please read SQL and PL/SQL FAQ
    When you put some code or output, please enclose it between two lines containing {noformat}, i.e.:
    {noformat}
    SELECT ...
    {noformat}
    For your question, the following will both work:
    SQL> select * from emp
    where job='ANALYST' and (empno =:empno or :empno is null)
    EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
    7788 SCOTT ANALYST 7566 19/04/1987 00:00:00 3000 20
    7902 FORD ANALYST 7566 03/12/1981 00:00:00 3000 20
    2 rows selected.
    SQL> select * from emp
    where job='ANALYST' and empno =NVL(:empno, empno)
    EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
    7788 SCOTT ANALYST 7566 19/04/1987 00:00:00 3000 20
    7902 FORD ANALYST 7566 03/12/1981 00:00:00 3000 20
    2 rows selected.
    Regards.
    Al
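
    A hedged sketch of the COUNT version the original poster actually asked for, reusing the optional-bind predicate shown above (SCOTT.EMP and the :empno bind variable are as given in the thread):
    -- Returns 2 when :empno is left NULL, and 1 when :empno = 7902.
    SELECT COUNT(*)
    FROM   scott.emp
    WHERE  job = 'ANALYST'
    AND    (empno = :empno OR :empno IS NULL);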

  • Need a sql query to get the difference between two timestamp in the format of hh:mm:ss.msec

    I have a database table that keeps a record of each transaction: when it starts (StartTime) and when it ends (EndTime). Both entries are timestamps. Say, for example, I have a row where 'Transaction A' starts at '2014-05-07 20:55:03.170' and ends at '2014-05-08 08:56:03.170'. I need to find the difference between these two timestamps, and my expected output is 12:01:00.000. Let me know how to achieve this.

    Hi,
    You can use the script below, which calculates the difference as DD:HH:MM:SS. You can modify it as needed:
    DECLARE @startTime DATETIME
    DECLARE @endTime DATETIME
    SET @startTime = '2013-11-05 12:20:35'
    SET @endTime = '2013-11-10 01:22:30'
    SELECT [DD:HH:MM:SS] =
    CAST((DATEDIFF(HOUR, @startTime, @endTime) / 24) AS VARCHAR)
    + ':' +
    CAST((DATEDIFF(HOUR, @startTime, @endTime) % 24) AS VARCHAR)
    + ':' +
    CASE WHEN DATEPART(SECOND, @endTime) >= DATEPART(SECOND, @startTime)
    THEN CAST((DATEDIFF(MINUTE, @startTime, @endTime) % 60) AS VARCHAR)
    ELSE
    CAST((DATEDIFF(MINUTE, DATEADD(MINUTE, -1, @endTime), @endTime) % 60)
    AS VARCHAR)
    END
    + ':' + CAST((DATEDIFF(SECOND, @startTime, @endTime) % 60) AS VARCHAR),
    [StringFormat] =
    CAST((DATEDIFF(HOUR , @startTime, @endTime) / 24) AS VARCHAR) +
    ' Days ' +
    CAST((DATEDIFF(HOUR , @startTime, @endTime) % 24) AS VARCHAR) +
    ' Hours ' +
    CASE WHEN DATEPART(SECOND, @endTime) >= DATEPART(SECOND, @startTime)
    THEN CAST((DATEDIFF(MINUTE, @startTime, @endTime) % 60) AS VARCHAR)
    ELSE
    CAST((DATEDIFF(MINUTE, DATEADD(MINUTE, -1, @endTime), @endTime) % 60)
    AS VARCHAR)
    END +
    ' Minutes ' +
    CAST((DATEDIFF(SECOND, @startTime, @endTime) % 60) AS VARCHAR) +
    ' Seconds '
    Reference:
    http://sqlandme.com/2013/12/23/sql-server-calculating-elapsed-time-from-datetime/
    - Vishal
    SqlAndMe.com
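
    Since the question asked for hh:mm:ss.msec specifically, here is an additional hedged sketch (SQL Server 2008 or later, and assuming the elapsed time is under 24 hours as in the example): add the millisecond difference back onto datetime zero and format only the time part.
    DECLARE @startTime DATETIME = '2014-05-07 20:55:03.170'
    DECLARE @endTime   DATETIME = '2014-05-08 08:56:03.170'
    -- Style 114 prints hh:mi:ss:mmm (a ':' rather than '.' before the milliseconds);
    -- for the sample timestamps it yields 12:01:00:000.
    SELECT CONVERT(VARCHAR(12),
                   DATEADD(MILLISECOND,
                           DATEDIFF(MILLISECOND, @startTime, @endTime),
                           0),
                   114) AS elapsed_time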

  • Need a sql query to get multiple dates in rows

    Hi All,
    I need a query to get the dates for the last 7 days, with each date on its own row,
    but select sysdate from dual gives only one row...
    Expexcted Output
    Dates:
    01-oct-2013
    30-sep-2013
    29-sep-2013
    28-sep-2013
    27-sep-2013
    26-sep-2013

    Hi,
    Do you mean that you want all 7 dates together on 1 row?
    Here's one way:
    SELECT LISTAGG ( TO_CHAR ( SYSDATE + 1 - LEVEL
                             , 'DD-Mon-YYYY'
                             )
                   ) WITHIN GROUP (ORDER BY LEVEL)    AS txt
    FROM    dual
    CONNECT BY LEVEL <= 7
    This is an example of String Aggregation, that is, taking a column on multiple rows and concatenating all the values (however many there happen to be) into 1 big string column on 1 row.
    Like everything else, exactly how to do it depends on your Oracle version.
    For more on String Aggregation, including different techniques for different versions, see http://www.oracle-base.com/articles/10g/StringAggregationTechniques.php
    Message was edited by: FrankKulash
    Sorry, I mis-read the question.
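
    For the layout the original poster described (one date per row), a minimal sketch using the same CONNECT BY LEVEL row generator (Oracle):
    -- Today and the previous six days, one row per date.
    SELECT TO_CHAR(TRUNC(SYSDATE) + 1 - LEVEL, 'DD-Mon-YYYY') AS dates
    FROM   dual
    CONNECT BY LEVEL <= 7
    ORDER BY TRUNC(SYSDATE) + 1 - LEVEL DESC;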

  • Small modification needed in sql query.

    Hi OTN,
    I have the query below, in which I am fetching data from the Target Aggregation table.
    select
    TARGET_EMPLOYEE.FIRST_NAME||' ' ||TARGET_EMPLOYEE.LAST_NAME as Requestername,
    TARGET_EMPLOYEE.GE_ID as requesterGEID,
    TARGET_ATTENDEE.FIRST_NAME||' ' ||TARGET_ATTENDEE.LAST_NAME as AttendeeName,
    TARGET_ATTENDEE.ATTENDEE_TYPE_FLAG as Attendeetyflg,
    TARGET_ATTENDEE.US_GO_ATTENDEE_FLAG as usgoflg,
    TARGET_ATTENDEE.COMP_GOVT_AGENCY_DEPT as Atcomp,
    TARGET_ATTENDEE.COUNTRY as atcountry,
    TARGET_AGGREGATION_ATTENDEE.REQUEST_TYPE as requesttype,
    TARGET_PA_REQUEST.PA_REQUEST_ID as parequest,
    TARGET_PA_REQUEST.EVENT_DATE as eventdate,
    TARGET_PA_REQUEST.EVENT_DESCRIPTION as eventdesc,
    TARGET_PA_REQUEST.CREATED_DATE as requestdate,
    TARGET_PA_REQUEST.STATUS as status,
    TARGET_AGGREGATION_ATTENDEE.status,
    TARGET_AGGREGATION_ATTENDEE.be_aggregation_approved_amount as approved_amount,
    TARGET_AGGREGATION_ATTENDEE.level_id,
    TARGET_AGGREGATION_ATTENDEE.be_aggregation_pend_amount as req_amount
    from TARGET_ATTENDEE,
    TARGET_EMPLOYEE,
    TARGET_PA_REQUEST,
    TARGET_AGGREGATION_ATTENDEE
    where TARGET_ATTENDEE.ATTENDEE_ID=TARGET_AGGREGATION_ATTENDEE.ATTENDEE_ID
    and TARGET_EMPLOYEE.GE_ID=TARGET_AGGREGATION_ATTENDEE.REQUESTOR_ID
    and TARGET_AGGREGATION_ATTENDEE.PA_REQUEST_ID=TARGET_PA_REQUEST.PA_REQUEST_ID
    and to_char(TARGET_AGGREGATION_ATTENDEE.CREATED_DATE,'YYYY')=to_char(sysdate,'YYYY')
    and TARGET_AGGREGATION_ATTENDEE.pa_request_id='NAM1421';
    When fetching a particular request id, NAM1421,
    I am getting this output:
    REQUESTERNAME     REQUESTERGEID     ATTENDEENAME     ATTENDEETYFLG     USGOFLG     ATCOMP     ATCOUNTRY     REQUESTTYPE     PAREQUEST     EVENTDATE     EVENTDESC     REQUESTDATE     STATUS     STATUS_1     APPROVED_AMOUNT     LEVEL_ID     REQ_AMOUNT
    Danny Leung     1000650742     Mano hariram     Y     Y     BSNL     India     BE     NAM1421     13-AUG-12     fdg     13-AUG-12     APD               15     3
    Danny Leung     1000650742     Mano hariram     Y     Y     BSNL     India     BE     NAM1421     13-AUG-12     fdg     13-AUG-12     APD     Approved     3     70     0
    Danny Leung     1000650742     Mano hariram     Y     Y     BSNL     India     BE     NAM1421     13-AUG-12     fdg     13-AUG-12     APD               20     3
    But I need the output like this:
    Requested Amount, Approved Amount, and Approved Status in a single row.
    The Requested Amount comes from the TARGET AGGREGATION table.
    Logic: the Requested Amount should be taken from the row with the minimum level_id.
    For the Approved Amount: where the status is 'Approved', the corresponding amount should go into the Approved Amount column.
    Thanks

    Hi Matthew,
    I need the output of the above query like this:
    the Requested Amount column (from the Target Aggregation table) should give the value corresponding to the minimum level_id,
    and the Approved Amount column (from the Target Aggregation table) should give the value corresponding to the Approved status.
    Please suggest how to achieve it.
    Table Data of Target Aggregation Table:
    PA_REQUEST_ID     STATUS     APPROVED_AMOUNT     LEVEL_ID     REQ_AMOUNT
    APAC1031     CAN          0     0.61336
    APAC1031     CAN          0     8.17212
    APAC1031     CAN          30     0
    APAC1031     CAN          30     0     
    APAC3873               0     
    APAC3873               80     
    APAC3873               0     
    APAC3873               80     
    APAC3873               30     
    APAC3873               70     
    APAC3873               30     
    APAC3873     Approved          80     
    APAC3873               60          
    APAC3997     Approved     0.08     80     0
    APAC3997     Approved          80     
    APAC3997     Approved          80     
    APAC3997               30     
    APAC3997               30     0.08
    APAC3997               30     
    APAC3997               0     
    APAC3997               0     0.08
    APAC3997               0     
    APAC3997               60     
    APAC3997               60     0.08
    APAC3997               60     
    APAC3998               0     
    APAC3998               30     
    APAC3998               0     
    APAC3998               0     
    APAC3998               30     
    APAC3998               30     
    APAC4001               0     
    APAC4004               30     
    APAC4004               30     
    APAC4004               0     
    APAC4004               0     
    APAC4004               80     
    APAC4004               0     
    APAC4004               80     
    APAC4004               0     
    APAC4004               60     
    APAC4006               30     
    APAC4006               30     
    APAC4006               0     
    APAC4006               0     
    APAC4007               0     
    APAC4007               0     0.08
    APAC4008               30     0.06
    APAC4008     Approved     0.06     80     0
    APAC4008               60     0.06
    APAC4008               0     0.06
    APAC4009               0     10
    APAC4009               0     10
    APAC4010               0     
    APAC4010               60     
    APAC4010               30     
    APAC4010     Approved          80     
    APAC4011               30     
    APAC4011               30     0.68
    APAC4011               0     
    APAC4011               0     0.68
    APAC4013               30     
    APAC4013               30     
    APAC4013               0     
    APAC4013               30     
    APAC4013               0     
    APAC4013               0     
    APAC4014               30     
    APAC4014               30     0.9
    APAC4014               0     
    APAC4014               0     0.9
    APAC4016               0     
    APAC4016               0     
    APAC4016               30     
    APAC4016               30     
    APAC4017     CAN          30     
    APAC4017     CAN          30     
    APAC4017     CAN          60     
    APAC4017     CAN          60     
    APAC4017     CAN          60     
    APAC4017     CAN          70     
    APAC4017     CAN          70     
    APAC4017     CAN          70     
    APAC4017     CAN          80     
    APAC4017     CAN          80     
    APAC4017     CAN          80     
    APAC4017     CAN          30     
    APAC4017     CAN          0     
    APAC4017     CAN          0     
    APAC4017     CAN          0     
    APAC4019               60     0.08
    APAC4019     Approved          80     
    APAC4019     Approved     0.08     80     0
    APAC4019               30     
    APAC4019               60     
    APAC4019               30     0.08
    APAC4019               0     0.08
    APAC4019               0     
    APAC4023     Rejected          0     0.12
    APAC4025     CAN          80     
    APAC4025     CAN          80     
    APAC4025     CAN          0     
    APAC4025     CAN          0     
    APAC4025     CAN          0     
    APAC4025     CAN          30     
    APAC4025     CAN          30     
    APAC4025     CAN          30     
    APAC4025     CAN          60     
    APAC4025     CAN          60     
    APAC4025     CAN          60     
    APAC4025     CAN          70     
    APAC4025     CAN          70     
    APAC4025     CAN          70     
    APAC4025     CAN          80     
    APAC4027               80     
    APAC4027               80     
    APAC4027               80     
    APAC4027               70     
    APAC4027               70     
    APAC4027               70     
    APAC4027               60     
    APAC4027               60     
    APAC4027               60     
    APAC4027               30     
    APAC4027               30     
    APAC4027               30     
    APAC4027               0     
    APAC4027               0     
    APAC4027               0     
    APAC4028               60     
    APAC4028               0     
    APAC4028               0     
    APAC4028               70     
    APAC4028               70     
    APAC4028               70     
    APAC4028               60     
    APAC4028               60     
    APAC4028               30     
    APAC4028               30     
    APAC4028               30     
    APAC4028               0     
    APAC4029     Rejected          0
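
    A hedged sketch of the per-request collapse described above, using the column names from the posted query (untested against the real schema; if several rows share the minimum level_id, MIN breaks the tie):
    -- One row per request: requested amount from the lowest level_id row,
    -- approved amount and status from the row whose status is 'Approved'.
    SELECT pa_request_id,
           MIN(be_aggregation_pend_amount)
             KEEP (DENSE_RANK FIRST ORDER BY level_id)        AS requested_amount,
           MAX(CASE WHEN status = 'Approved'
                    THEN be_aggregation_approved_amount END)  AS approved_amount,
           MAX(CASE WHEN status = 'Approved' THEN status END) AS approved_status
    FROM   target_aggregation_attendee
    GROUP BY pa_request_id;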

  • Need the sql query

    Hi All,
    I have data like this in a table:
    ename dept sal
    anbu ece 2000
    anbu ece 3000
    anbu csc 4000
    raj csc 5000
    I need as output only the rows with the same ename and same dept but a different sal:
    ename dept sal
    anbu ece 2000
    anbu ece 3000
    Please give me the output...

    select ename
         , dept
         , sal
      from (
    select ename
         , dept
         , sal
         , count(*) over (partition by ename, dept) cnt
      from test
           )
     where cnt > 1
    like in
    SQL> with test as
      2  (
      3  select 'anbu' ename, 'ece' dept, 2000 sal from dual union all
      4  select 'anbu' ename, 'ece' dept, 3000 sal from dual union all
      5  select 'anbu' ename, 'csc' dept, 4000 sal from dual union all
      6  select 'raj'  ename, 'csc' dept, 5000 sal from dual
      7  )
      8  select ename
      9       , dept
    10       , sal
    11    from (
    12  select ename
    13       , dept
    14       , sal
    15       , count(*) over (partition by ename, dept) cnt
    16    from test
    17    )
    18   where cnt > 1
    19  /
    ENAM DEP        SAL
    anbu ece       2000
    anbu ece       3000
    SQL>
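
    An alternative hedged sketch that produces the same rows using GROUP BY ... HAVING to find the duplicated (ename, dept) pairs and joining back for the detail rows (same sample data as above):
    WITH test AS (
      SELECT 'anbu' ename, 'ece' dept, 2000 sal FROM dual UNION ALL
      SELECT 'anbu', 'ece', 3000 FROM dual UNION ALL
      SELECT 'anbu', 'csc', 4000 FROM dual UNION ALL
      SELECT 'raj',  'csc', 5000 FROM dual
    )
    SELECT t.ename, t.dept, t.sal
    FROM   test t
    JOIN  (SELECT ename, dept
           FROM   test
           GROUP BY ename, dept
           HAVING COUNT(*) > 1) d
      ON   d.ename = t.ename
     AND   d.dept  = t.dept;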
