Help required for a query

Hi all,
I need help with one more query.
I have a table with data like this:
Cust_id Transaction_no
111 1
111 2
111 3
111 4
111 5
111 6
222 7
222 8
333 9
333 10
333 11
333 12
I wrote the following query:
select cust_id, ntile(3) over (order by cust_id) "Bucket" from trans_detls
The output looks like this:
Cust_id Bucket
111 1
111 1
111 1
111 1
111 2
111 2
222 2
222 2
333 3
333 3
333 3
333 3
The problem is that I don't want a cust_id to overlap buckets; that is, each cust_id should be present in only one bucket.
Is this possible?
Thanks in advance.
Ameya

Or something like this:
SQL> select * from test;
        ID         NO
       111          1
       111          2
       111          3
       111          4
       111          5
       111          6
       222          7
       222          8
       333          9
       333         10
       333         11
       333         12
12 rows selected.
SQL> select id, ntile(3) over (order by rn) "Bucket"
  2  from(
  3      select id,row_number() over(partition by id order by no) rn
  4      from test);
        ID     Bucket
       111          1
       222          1
       333          1
       111          1
       222          2
       333          2
       111          2
       333          2
       111          3
       333          3
       111          3
       111          3
12 rows selected.
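
Note that both queries above still split a single cust_id across several buckets. If each cust_id must land in exactly one bucket, one option (a sketch, not tested against the real table) is to NTILE the distinct customers first and join the assignment back, accepting that the buckets may then hold unequal row counts:
select t.cust_id, d.bucket "Bucket"
from   trans_detls t
join  (select cust_id,
              ntile(3) over (order by cust_id) bucket
       from  (select distinct cust_id from trans_detls)) d
on     t.cust_id = d.cust_id
order  by d.bucket, t.cust_id;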

Similar Messages

  • Require help for a query

    Hi all,
    I need some help writing a query.
    My table looks like this:
    Cust_id     Cust_type Del_status Incharge
    111     Gold     HD          
    222     Gold     
    333     Gold     HD
    444     Gold
    123     Gold     HD          
    456     Gold          
    789     Gold     HD
    987     Gold          
    555     Silver     HD
    666     Silver     HD
    777     Silver
    888     Silver
    I want a query to generate this output
    Cust_id     Cust_type Del_status Incharge
    111     Gold     HD     1
    222     Gold          1
    333     Gold     HD      1
    444     Gold          1
    555     Silver     HD     1
    777     Silver          1
    123     Gold     HD     2
    456     Gold          2
    789     Gold     HD     2
    987     Gold          2
    666     Silver     HD     2
    888     Silver          2
    The query basically allocates the customers to incharges, based on cust_type and del_status.
    There are 3 categories: Gold customers, Silver customers and HD customers.
    It should divide these three equally amongst the 2 incharges.
    Also, this is just sample data; the actual table has around 3 lakh (300,000) customers and 12 incharges.
    Sorry if this is an incorrect post.
    Thanks in advance.

    Is there a way to find the value for ntile(2) dynamically?
    I tried something like
    Connected to:
    Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.8.0 - Production
    SQL> with mytable as (select 111 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      2  select 222 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      3  select 333 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      4  select 444 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      5  select 123 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      6  select 456 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      7  select 789 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      8  select 897 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      9  select 555 cust_id, 'Silver' cust_type, 'HD' del_status from dual UNION ALL
    10  select 666 cust_id, 'Silver' cust_type, 'HD' del_status from dual UNION ALL
    11  select 777 cust_id, 'Silver' cust_type, null del_status from dual UNION ALL
    12  select 888 cust_id, 'Silver' cust_type, null del_status from dual UNION ALL
    13  select 1001 cust_id, 'Copper' cust_type, null del_status from dual UNION ALL
    14  select 1002 cust_id, 'Copper' cust_type, 'HD' del_status from dual UNION ALL
    15  select 1003 cust_id, 'Copper' cust_type, null del_status from dual
    16  )
    17  select t1.cust_id
    18  , t1.cust_type
    19  , t1.del_status
    20  --, ntile(3) over (partition by nvl(t1.del_status,t1.cust_type) order by t1.cust_id)
    21  , ntile((select count(distinct nvl(t2.del_status,t2.cust_type))-1 from mytable t2)) over (partition by nvl(del_status,cust_type) order by cust_id)
    22  incharge
    23  from mytable t1
    24  order by incharge, t1.cust_type, t1.cust_id
    25  /
    , ntile((select count(distinct nvl(t2.del_status,t2.cust_type))-1 from mytable t2)) over (partition
    ERROR at line 21:
    ORA-30488: argument should be a function of expressions in PARTITION BY
    SQL>
    The number of incharges could change over time.
    Message was edited by:
    Sven Weller
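
    NTILE's bucket count must effectively be a constant expression, hence the ORA-30488 above. One workaround (a sketch against the mytable test data above, with the incharge count hard-coded where a lookup against the real incharge table would go) is to round-robin with ROW_NUMBER and MOD, which has no such restriction on its arguments:
    select cust_id,
           cust_type,
           del_status,
           mod(row_number() over (partition by nvl(del_status, cust_type)
                                  order by cust_id) - 1,
               n.cnt) + 1 incharge
    from   mytable t1
    cross join (select 2 cnt from dual) n  -- assumed stand-in for the incharge count
    order  by incharge, cust_type, cust_id;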

  • Please Help for the Query

    Hi friends, please help me with the query below. What I want to do is pull the data from the following table:
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/18/2008 3IINFOTECH -4
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3
    as such :-
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3
    Here I want to find the trend, i.e. ascending order starting from the lowest indicator.
    In the sample data above the indicators are -8, -4, -5, -3; I want the ascending-order data -8, -5, -3 and want to exclude the -4 row, because keeping -4 would break the ascending order -8, -5, -3.
    So I want the data
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3

    SQL> CREATE TABLE BORRAME(FECHA DATE, INDICA VARCHAR2(100));
    Table created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/13/2008','MM/DD/YYYY'), '3IINFOTECH -8');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/18/2008','MM/DD/YYYY'), '3IINFOTECH -4');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/25/2008','MM/DD/YYYY'), '3IINFOTECH -5');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/27/2008','MM/DD/YYYY'), '3IINFOTECH -3');
    1 row created.
    SQL> COMMIT;
    Commit complete.
    SQL>
    SQL> SELECT FECHA, INDICA
      2  FROM BORRAME
      3  WHERE SUBSTR(INDICA,INSTR(INDICA,'-',1)+1,LENGTH(INDICA)) <> '4'
      4  ORDER BY SUBSTR(INDICA,INSTR(INDICA,'-',1)+1,LENGTH(INDICA)) DESC;
    FECHA     INDICA
    13/03/08  3IINFOTECH -8
    25/03/08  3IINFOTECH -5
    27/03/08  3IINFOTECH -3
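
    The filter above hard-codes the exclusion of '4'. A more general sketch (assuming the rule is "keep a row only if every later indicator is higher", which reproduces -8, -5, -3 from the sample) extracts the number from the string and compares it with an analytic MIN over the following rows:
    with t as (
      select fecha,
             indica,
             to_number(substr(indica, instr(indica, ' ') + 1)) ind_val,
             min(to_number(substr(indica, instr(indica, ' ') + 1)))
                 over (order by fecha
                       rows between 1 following and unbounded following) min_later
      from   borrame
    )
    select fecha, indica
    from   t
    where  min_later is null      -- the last row is always kept
       or  ind_val < min_later
    order  by fecha;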

  • Help for a query to add columns

    Hi,
    I need a query that adds each TableC value as an additional column.
    Please suggest...
    I have 3 tables (TableA, TableB, TableC). TableB stores the TableA id, and TableC stores the TableB id.
    The starting point is an id in TableA.
    Sample data
    TableA     :          
    ID     NAME     TABLENAME     ETYPE
    23     Name1     TABLE NAMEA     Etype A
    TableB     :          
    ID     A_ID     RTYPE     RNAME
    26     23     RTYPEA     RNAMEA
    61     23     RTYPEB     RNAMEB
    TableC     :          
    ID     B_ID     COMPNAME     CONC
    83     26     Comp Name AA     1.5
    46     26     Comp Name BB     2.2
    101     61     Comp Name CC     4.2
    Scenario 1: as per the sample data above, put each TableC value in an additional column.
    For id 23 in TableA, TableB contains 2 records with A_ID 23 (ids 26 and 61), and TableC contains 2 records for B_ID 26 and 1 record for B_ID 61.
    Output required: put each TableC value in an additional column.
    TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME     TableC_1_CONC TableC_2_COMPNAME     TableC_2_CONC     
    Name1 EtypeA RTypeA Comp Name AA 1.5 Comp Name BB 2.2     so on..
    Name1 EtypeA RTypeB Comp Name CC 4.2 NULL NULL     
    Scenario 2: if TableC contains only 1 row for each id in TableB, the output should be somewhat like:
    Output:
    TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME TableC_1_CONC
    value     value     value     value               value

    Hi,
    Welcome to the forum!
    Do you want the data from TableC presented
    (1) in one column, or
    (2) in several columns (a different column of results for each row in the original TableC)?
    (1) Is called String Aggregation and is easier than (2).
    The best way to do this is with a user-defined aggregate function (STRAGG) which you can copy from asktom.
    Ignoring TableA for now, you could get what you want by saying
    SELECT    b.rtype
    ,         STRAGG (   c.compname
                     || ' '
                     || c.conc
                     )  AS c_data
    FROM      TableB  b
    JOIN      TableC  c  ON b.id  = c.b_id
    GROUP BY  b.rtype;
    (2) Presenting N rows of TableC as if they were N columns of the same row is called a pivot. Search for "pivot" or "rows to columns" to find examples of how to do this.
    The number of columns in a result set is hard-coded into the query. If you don't know ahead of time how many rows in TableC will match a row in TableB, you can:
    (a) guess high (for example, hard-code 20 columns and let the ones that never contain a match be NULL) or,
    (b) use Dynamic SQL to write a query for you, which has exactly as many columns as you need.
    The two scripts below contain basic information on pivots.
    This first script is similar to what you would do for case (a):
    --     How to Pivot a Result Set (Display Rows as Columns)
    --     For Oracle 10, and earlier
    --     Actually, this works in any version of Oracle, but the
    --     "SELECT ... PIVOT" feature introduced in Oracle 11
    --     is better.  (See Query 2, below.)
    --     This example uses the scott.emp table.
    --     Given a query that produces three rows for every department,
    --     how can we show the same data in a query that has one row
    --     per department, and three separate columns?
    --     For example, the query below counts the number of employees
    --     in each department that have one of three given jobs:
    PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
    SELECT     deptno
    ,     job
    ,     COUNT (*)     AS cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ,          job;
    Output:
        DEPTNO JOB              CNT
            20 CLERK              2
            20 MANAGER            1
            30 CLERK              1
            30 MANAGER            1
            10 CLERK              1
            10 MANAGER            1
            20 ANALYST            2
    PROMPT     ==========  1. Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno;
    --     Output:
        DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
            30           0          1           1
            20           2          2           1
            10           0          1           1
    --     Explanation
    (1) Decide what you want the output to look like.
         (E.g. "I want a row for each department,
         and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt.")
    (2) Get a result set where every row identifies which row
         and which column of the output will be affected.
         In the example above, deptno identifies the row, and
         job identifies the column.
         Both deptno and job happened to be in the original table.
         That is not always the case; sometimes you have to
         compute new columns based on the original data.
    (3) Use aggregate functions and CASE (or DECODE) to produce
         the pivoted columns. 
         The CASE statement will pick
         only the rows of raw data that belong in the column.
         If each cell in the output corresponds to (at most)
         one row of input, then you can use MIN or MAX as the
         aggregate function.
         If many rows of input can be reflected in a single cell
         of output, then use SUM, COUNT, AVG, STRAGG, or some other
         aggregate function.
         GROUP BY the column that identifies rows.
    PROMPT     ==========  2. Oracle 11 PIVOT  ==========
    WITH     e     AS
    (     -- Begin sub-query e to SELECT columns for PIVOT
         SELECT     deptno
         ,     job
         FROM     scott.emp
    )     -- End sub-query e to SELECT columns for PIVOT
    SELECT     *
    FROM     e
    PIVOT     (     COUNT (*)
              FOR     job     IN     ( 'ANALYST'     AS analyst
                             , 'CLERK'     AS clerk
                             , 'MANAGER'     AS manager
                             )
              );
    NOTES ON ORACLE 11 PIVOT:
    (1) You must use a sub-query to select the raw columns.
    An in-line view (not shown) is an example of a sub-query.
    (2) GROUP BY is implied for all columns not in the PIVOT clause.
    (3) Column aliases are optional. 
    If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).
    The second script, below, shows one way of doing a dynamic pivot in SQL*Plus:
    How to Pivot a Table with a Dynamic Number of Columns
    This works in any version of Oracle
    The "SELECT ... PIVOT" feature introduced in Oracle 11
    is much better for producing XML output.
    Say you want to make a cross-tab output of
    the scott.emp table.
    Each row will represent a department.
    There will be a separate column for each job.
    Each cell will contain the number of employees in
         a specific department having a specific job.
    The exact same solution must work with any number
    of departments and columns.
    (Within reason: there's no guarantee this will work if you
    want 2000 columns.)
    Case 0 "Basic Pivot" shows how you might hard-code three
    job types, which is exactly what you DON'T want to do.
    Case 1 "Dynamic Pivot" shows how to get the right results
    dynamically, using SQL*Plus. 
    (This can be easily adapted to PL/SQL or other tools.)
    PROMPT     ==========  0. Basic Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ORDER BY     deptno;
    PROMPT     ==========  1. Dynamic Pivot  ==========
    --     *****  Start of dynamic_pivot.sql  *****
    -- Suppress SQL*Plus features that interfere with raw output
    SET     FEEDBACK     OFF
    SET     PAGESIZE     0
    SPOOL     p:\sql\cookbook\dynamic_pivot_subscript.sql
    SELECT     DISTINCT
         ',     COUNT (CASE WHEN job = '''
    ||     job
    ||     ''' '     AS txt1
    ,     'THEN 1 END)     AS '
    ||     job
    ||     '_CNT'     AS txt2
    FROM     scott.emp
    ORDER BY     txt1;
    SPOOL     OFF
    -- Restore SQL*Plus features suppressed earlier
    SET     FEEDBACK     ON
    SET     PAGESIZE     50
    SPOOL     p:\sql\cookbook\dynamic_pivot.lst
    SELECT     deptno
    @@dynamic_pivot_subscript
    FROM     scott.emp
    GROUP BY     deptno
    ORDER BY     deptno;
    SPOOL     OFF
    --     *****  End of dynamic_pivot.sql  *****
    EXPLANATION:
    The basic pivot assumes you know the number of distinct jobs,
    and the name of each one.  If you do, then writing a pivot query
    is simply a matter of writing the correct number of ", COUNT ... AS ..."
    lines, with the name entered in two places on each one.  That is easily
    done by a preliminary query, which uses SPOOL to write a sub-script
    (called dynamic_pivot_subscript.sql in this example).
    The main script invokes this sub-script at the proper point.
    In practice, .SQL scripts usually contain one or more complete
    statements, but there's nothing that says they have to.
    This one contains just a fragment from the middle of a SELECT statement.
    Before creating the sub-script, turn off SQL*Plus features that are
    designed to help humans read the output (such as headings and
    feedback messages like "7 rows selected."), since we do not want these
    to appear in the sub-script.
    Turn these features on again before running the main query.

  • Require help on the Query related to where condition

    Hi all,
    I am a Java developer and need to write a query to get some data from an Oracle database.
    I am stuck at one point; please help me.
    I have a table with 3 columns.
    ID | Name | Value
    1 | ser | x
    1 | Bus | a
    1 | pol | b
    2 | ser | x
    2 | Bus | a
    2 | pol | c
    3 | ser | y
    3 | Bus | a
    3 | pol | b
    4 | ser | y
    4 | Bus | d
    4 | pol | e
    I have written the query below:
    Select ID from table where Name in ('ser','Bus','pol') and Value in ('x','a','b');
    I am getting the output ID = 1, 2, 3,
    but I should get the output ID = 1.
    I should get only the IDs that satisfy all the given name/value pairs.
    I cannot use AND for each Name and Value here, as the number of values in the Name or Value fields may increase, and they will be passed dynamically from my Java class.
    Immediate help is appreciated.
    thanks a lot

    Something like this will deal with multiple values and treat the name/value pairs together...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select 1 as ID, 'ser' as Name, 'x' as Value from dual union all
      2             select 1, 'Bus', 'a' from dual union all
      3             select 1, 'pol', 'b' from dual union all
      4             select 2, 'ser', 'x' from dual union all
      5             select 2, 'Bus', 'a' from dual union all
      6             select 2, 'pol', 'c' from dual union all
      7             select 3, 'ser', 'y' from dual union all
      8             select 3, 'Bus', 'a' from dual union all
      9             select 3, 'pol', 'b' from dual union all
    10             select 4, 'ser', 'y' from dual union all
    11             select 4, 'Bus', 'd' from dual union all
    12             select 4, 'pol', 'e' from dual)
    13  --
    14  -- end of test data
    15  --
    16      ,i as (select 'ser|x,Bus|a,pol|b' as r from dual)
    17  --
    18  -- end of input
    19  --
    20      ,split as (select regexp_substr(r,'[^,]+',1,rn) as pair
    21                 from   i
    22                        cross join (select rownum rn
    23                                    from dual
    24                                    connect by rownum <= (select length(regexp_replace(r,'[^,]'))+1 from i))
    25                )
    26  select id
    27  from t, split
    28  where name = regexp_substr(pair,'[^|]+',1,1)
    29  and   value = regexp_substr(pair,'[^|]+',1,2)
    30  group by id
    31* having count(*) = (select count(*) from split)
    SQL> /
            ID
         1
    It's then just a case of providing the input in the correct format.

  • Help for TSQL query

    Hi,
    I want to design a query. Below is what is required.
    Existing table:
    Code   Val1   Val2   Val3
    A       A11      -         -
    B        -       B22        -
    C        -       -         C33
    Output required:  
    Code  Col1  Col2
    A       A11   B22
    B       B22   C33
    C       C33  -
    Can anyone help with the query? 

    Why don't you normalize your data?
    E.g.
    DECLARE @Sample TABLE
    (
        Code CHAR(1) ,
        Val1 CHAR(3) ,
        Val2 CHAR(3) ,
        Val3 CHAR(3)
    );
    INSERT INTO @Sample
    VALUES ( 'A', 'A11', NULL, NULL ),
           ( 'B', NULL, 'B22', NULL ),
           ( 'C', NULL, NULL, 'C33' );
    -- SQL Server 2012+
    WITH Normalized
    AS ( SELECT U.Code ,
                U.Attribute ,
                U.Value
         FROM @Sample S UNPIVOT ( Value FOR Attribute IN ( Val1, Val2, Val3 ) ) U
       )
    SELECT N.Code ,
           N.Value ,
           LEAD(N.Value, 1, NULL) OVER ( ORDER BY N.Code ASC )
    FROM Normalized N;
    -- SQL Server 2008+
    WITH Normalized
    AS ( SELECT U.Code ,
                U.Attribute ,
                U.Value ,
                ROW_NUMBER() OVER ( ORDER BY U.Code ) AS RN
         FROM @Sample S UNPIVOT ( Value FOR Attribute IN ( Val1, Val2, Val3 ) ) U
       )
    SELECT L.Code ,
           L.Value ,
           R.Value
    FROM Normalized L
    LEFT JOIN Normalized R ON L.RN = R.RN - 1;
    By the way, please post concise and complete examples in future. Include table DDL and sample data INSERT statements as runnable T-SQL scripts.

  • SQL experts please help for a query

    I have the following table1.
    What query can give the result shown below? SQL experts, please help with this.
    TABLE1
    Event DATETIME
    in 2/JAN/2010
    out 2/JAN/2010
    in 13/JAN/2010
    out 13/JAN/2010
    in 5/JAN/2010
    out 5/JAN/2010
    RESULT REQUIRED FROM THE SQL QUERY
    COL1_IN COL2_OUT
    2/JAN/2010 2/JAN/2010
    13/JAN/2010 13/JAN/2010
    5/JAN/2010 5/JAN/2010

    I tried to help, but this puzzles me.
    Why is this not returning a pre-selected set of rows; why is it doing a Cartesian merge join?
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Prod
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    SQL> select * from table1;
    EVENT      DATETIME
    in         2/JAN/2010
    out        2/JAN/2010
    in         13/JAN/2010
    out        13/JAN/2010
    in         5/JAN/2010
    out        5/JAN/2010
    6 rows selected.
    SQL> explain plan for
      2  with a as
    (select datetime from table1 where event='in'),
    b as
    (select datetime from table1 where event='out')
    select  a.datetime COL1_IN ,b.datetime COL2_OUT from a,b ;
    Explained.
    SQL> set wrap off
    SQL> set linesize 200
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 185132177
    | Id  | Operation            | Name   | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT     |        |     9 |   288 |     8   (0)| 00:00:01 |
    |   1 |  MERGE JOIN CARTESIAN|        |     9 |   288 |     8   (0)| 00:00:01 |
    |*  2 |   TABLE ACCESS FULL  | TABLE1 |     3 |    48 |     3   (0)| 00:00:01 |
    |   3 |   BUFFER SORT        |        |     3 |    48 |     5   (0)| 00:00:01 |
    |*  4 |    TABLE ACCESS FULL | TABLE1 |     3 |    48 |     2   (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
       2 - filter("EVENT"='in')
       4 - filter("EVENT"='out')
    Note
       - dynamic sampling used for this statement
    21 rows selected.
    SQL> with a as
    (select datetime from table1 where event='in'),
    b as
    (select datetime from table1 where event='out')
    select  a.datetime COL1_IN ,b.datetime COL2_OUT from a,b ;
    COL1_IN         COL2_OUT
    2/JAN/2010      2/JAN/2010
    2/JAN/2010      13/JAN/2010
    2/JAN/2010      5/JAN/2010
    13/JAN/2010     2/JAN/2010
    13/JAN/2010     13/JAN/2010
    13/JAN/2010     5/JAN/2010
    5/JAN/2010      2/JAN/2010
    5/JAN/2010      13/JAN/2010
    5/JAN/2010      5/JAN/2010
    9 rows selected.
    SQL>
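
    The Cartesian product happens because the two WITH branches are joined with no join condition. A sketch of one way to get the requested pairing (assuming the Nth 'in' belongs with the Nth 'out', and that DATETIME sorts the way you need):
    select a.datetime col1_in,
           b.datetime col2_out
    from  (select datetime, row_number() over (order by datetime) rn
           from   table1
           where  event = 'in') a
    join  (select datetime, row_number() over (order by datetime) rn
           from   table1
           where  event = 'out') b
    on     a.rn = b.rn;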

  • Help for the query

    hello,
    I have a question again. The tables for the query are "Patient" and "Station".
    Station-Table:
    s_id, station
    Patient-Table:
    p_id, name, s_id, gender
    I want to know how many patients are male and female for each station. That means the output should be:
    Station Male Female
    S1 12 10
    S2 6 4

    I assume the values in gender are 'M' for Male and 'F' for Female:
    select s.station, sum(decode(p.gender, 'M', 1, 0)) Male , sum(decode(p.gender, 'F', 1, 0)) Female
    from station s, patient p
    where s.s_id=p.s_id
    group by s.station;

  • Required help for java web url of BI Query

    Hello experts,
    Here is my scenario: I have a BI query, let's say xxxx. I have built a URL and passed the query parameters in the URL. Still, the query is not taking those variable values and is showing the parameter screen. My requirement is that the query should execute with the parameters passed in the URL; it should not show the parameter screen.
    Help will be really appreciated.
    Thanks in advance.

    Where is the DLL?
    You need to include it in a jar, and reference that jar with
    <nativelib href="<url to the jar>"/>
    and request all-permissions in your JNLP file (and sign both jars).
    /Andy

  • Require help for query

    Hi all,
    I have 3 tables:
    1) SERV_AUTH_DETAIL WITH COLUMNS
    SERV_AUTH_DETAIL_ID
    NUMBER_OF_UNITS
    UNIT_COST
    SERV_ID
    2) PUR_ORDER_DETAIL WITH COLUMNS
    PUR_ORDER_DETAIL_ID
    PUR_ORDER_ID
    NUMBER_OF_UNITS
    UNIT_COST
    SERV_AUTH_DETAIL_ID
    3) PUR_ORDER WITH COLUMNS
    PUR_ORDER_ID
    SERV_AUTH_ID
    PUR_ORDER_STATUS
    When I issue the following update statement, I get an error stating that a
    column is ambiguously defined:
    update t_serv_auth_detail
    set number_of_units = (select number_of_units
    from t_pur_order_detail pod
    join t_serv_auth_detail sa
    on
    pod.serv_auth_detail_id = sa.serv_auth_detail_id
    and pod.pur_order_id = 1)
    where serv_auth_detail_id = (select s.serv_auth_detail_id
    from t_pur_order_detail p
    join t_pur_order po
    on
    p.pur_order_id = po.pur_order_id
    and po.pur_order_id = 1)
    ERROR at line 2:
    ORA-00918: column ambiguously defined
    Could anyone give guidelines to correct this error?
    thanks in advance.
    rampa

    In your UPDATE statement:
    UPDATE t_SERV_AUTH_DETAIL SA
       SET NUMBER_OF_UNITS = (SELECT NUMBER_OF_UNITS
                                FROM t_PUR_ORDER_DETAIL POD
                               WHERE POD.SERV_AUTH_DETAIL_ID = SA.SERV_AUTH_DETAIL_ID
                                 AND POD.PUR_ORDER_ID = 1)
    WHERE SERV_AUTH_DETAIL_ID = (SELECT P.SERV_AUTH_DETAIL_ID
                                    FROM t_PUR_ORDER_DETAIL P
                                    JOIN t_PUR_ORDER PO
                                      ON P.PUR_ORDER_ID = PO.PUR_ORDER_ID
                                      AND PO.PUR_ORDER_ID = 1);
    The ORA-00918 comes from the unqualified NUMBER_OF_UNITS in the first subquery of your original statement; it exists in both joined tables. Note also that the 2nd subquery (SELECT P.SERV_AUTH_DETAIL_ID ...) can return more than one row, and you have used the "=" operator to match it, which would raise its own error (ORA-01427).
    pratz
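
    A sketch of the same statement with IN in place of "=", so the second subquery may return several rows (assuming every matching detail row should be updated):
    UPDATE t_SERV_AUTH_DETAIL SA
       SET NUMBER_OF_UNITS = (SELECT POD.NUMBER_OF_UNITS
                                FROM t_PUR_ORDER_DETAIL POD
                               WHERE POD.SERV_AUTH_DETAIL_ID = SA.SERV_AUTH_DETAIL_ID
                                 AND POD.PUR_ORDER_ID = 1)
     WHERE SERV_AUTH_DETAIL_ID IN (SELECT P.SERV_AUTH_DETAIL_ID
                                     FROM t_PUR_ORDER_DETAIL P
                                     JOIN t_PUR_ORDER PO
                                       ON P.PUR_ORDER_ID = PO.PUR_ORDER_ID
                                      AND PO.PUR_ORDER_ID = 1);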

  • Help for a query

    Hi,
    Oracle 10g r2.
    I have a page with a listbox item, and a "cart" that can contain values from the listbox (the user can add values from the listbox to the cart).
    I want to filter a report depending on the values in the cart and the value selected in the listbox.
    Only one value can be selected in the listbox, but the cart can contain several values.
    When no value is selected in the listbox, it returns '%'; otherwise it returns the selected value.
    Here is what I need :
    listbox = '%', cart = empty ==> return all records
    listbox = '%', cart = 'value1, value2' ==> returns records where spat_area_name in ('value1','value2')
    listbox = 'value1', cart = 'value2, value3' ==> return records where spat_area_name in ('value1','value2','value3')
    listbox = 'value1', cart = empty ==> return records where spat_area_name in ('value1')
    For example (this doesn't work):
    select *
    from
         spatial_points
    where
         spat_area_name like :p3_filtre_area_name
    or
         spat_area_name in (
              select usa_area_name from user_selected_areas where usa_loggus_id = 591
          )
    :p3_filtre_area_name is the listbox value
    (select usa_area_name from user_selected_areas where usa_loggus_id = 591) returns the values stored in the cart
    I tried a few things (using CASE or DECODE) but I can't manage to make it work.
    So any help would be much appreciated.
    Thanks.
    Yann.

    Hi,
    Here are some create/insert statements if you want to test :
    create table accelerator_lines (
         accl_name varchar2(7),
         accl_description varchar2(50),
         constraint accl_lines_pk primary key (accl_name)
    );
    create table areas (
         area_name varchar2(7),
         area_description varchar2(50),
         constraint areas_pk primary key (area_name)
    );
    create table spatial_points (
         spat_id integer,
         spat_accl_name varchar2(7),
         spat_area_name varchar2(7) not null,
         spat_class varchar2(6) not null,
         spat_number varchar2(6) not null,
         spat_pt varchar2(1),
         spat_type varchar2(1) not null,
         constraint spatial_pk primary key (spat_id),
         constraint spat_type check (spat_type in ('P','S','B','T','U','C')),
         constraint spat_pt check (spat_pt in ('E','S','A','B','C','D')),
         constraint spatial_accl_fk foreign key (spat_accl_name) references accelerator_lines(accl_name),
         constraint spatial_area_fk foreign key (spat_area_name) references areas(area_name)
    );
    create table user_selected_areas (
         usa_id integer,
         usa_area_name varchar2(7),
         constraint usa_id_pk primary key (usa_id),
         constraint usa_area_name_fk foreign key (usa_area_name) references areas(area_name)
    );
    create table user_selected_accl_lines (
         usal_id integer,
         usal_accl_name varchar2(7),
         constraint usal_id_pk primary key (usal_id),
         constraint usal_accl_name_fk foreign key (usal_accl_name) references accelerator_lines(accl_name)
    );
    insert into accelerator_lines values ('LHC','LHC description');
    insert into accelerator_lines values ('LINAC4','LINAC4 description');
    insert into accelerator_lines values ('SPS','SPS description');
    insert into accelerator_lines values ('TI12','TI12 description');
    insert into accelerator_lines values ('TI18','TI18 description');
    insert into accelerator_lines values ('LEP','LEP description');
    insert into areas values ('TT81','TT81 description');
    insert into areas values ('PDV3','PDV3 description');
    insert into areas values ('PDV4','PDV4 description');
    insert into areas values ('193','193 description');
    insert into areas values ('EHW1','EHW1 description');
    insert into areas values ('TCC2','TCC2 description');
    insert into spatial_points values (1,'LHC','PDV4','MB2M1','22586','E','A');
    insert into spatial_points values (2,'LHC','PDV4','MB2M1','22586','S','A');
    insert into spatial_points values (3,'LHC','PDV4','MBC4','sr555','E','B');
    insert into spatial_points values (4,'TI12','TT81','RD433','22','E','A');
    insert into spatial_points values (5,'TI12','TT81','ESD8C5','564','S','A');
    insert into spatial_points values (6,'LEP','PDV3','MBRRM1','22586','E','B');
    insert into spatial_points values (7,'LEP','PDV3','MBRRM1','22586','S','B');
    insert into spatial_points values (8,'LEP','PDV3','FFZ55','2266','B','C');
    insert into spatial_points values (9,'LEP','PDV3','YEFH8','18992','E','B');
    insert into spatial_points values (10,'LEP','PDV4','YEFH8','18992','S','B');
    insert into spatial_points values (11,'LEP','PDV4','YEFH8','18995','E','B');
    insert into spatial_points values (12,'LEP','PDV4','YEFH8','18995','S','B');
    insert into spatial_points values (13,'LEP','PDV4','YEFH8','18996','E','B');
    insert into spatial_points values (14,'LEP','PDV4','YEFH8','18996','S','B');
    insert into spatial_points values (15,'LEP','PDV4','YEFH8','18999','D','U');  
    insert into spatial_points values (16,'LINAC4','193','QASM1','4255','E','B');
    insert into spatial_points values (17,'LINAC4','193','QASM1','4255','S','B');
    insert into spatial_points values (18,'LINAC4','193','QASM1','4264','E','B');
    insert into spatial_points values (19,'LINAC4','TCC2','FFEPO','4264','S','B');
    insert into spatial_points values (20,'LINAC4','TCC2','QASM1','4255','D','P');
    insert into spatial_points values (21,'SPS','EHW1','LMRDE','22586','E','B');
    insert into spatial_points values (22,'SPS','EHW1','LMRDE','22586','S','B');
    insert into spatial_points values (23,'SPS','EHW1','X8PE5','22587','E','B');
    insert into spatial_points values (24,'SPS','EHW1','X8PE5','22587','S','B');
    insert into spatial_points values (25,'SPS','EHW1','X8PE5','22590','C','A');
    insert into spatial_points values (26,'SPS','TCC2','DDFFR9','22590','C','A');
    insert into spatial_points values (27,'TI18','PDV4','94FFG4E','22586','E','B');
    insert into spatial_points values (28,'TI18','PDV4','94FFG4E','22586','S','B');
    insert into spatial_points values (29,'TI18','193','94FFG4E','22589','E','T');
    insert into spatial_points values (30,'TI18','TCC2','NO55D','22589','S','T');
    insert into user_selected_areas values (1,'PDV4');
    insert into user_selected_areas values (2,'193');
    insert into user_selected_accl_lines values (1,'TI18');
    Currently, my query is the following:
    select
         spat_id,
         spat_accl_name,
         spat_area_name,
         spat_class,
         spat_number,
         spat_pt,
         spat_type
    from
         spatial_points
    where
         spat_area_name like :p3_filtre_area_name
         and spat_class like nvl(:p3_filtre_spatial_class,'%')
         and spat_number like nvl(:p3_filtre_number,'%')
         and instr(:p3_filtre_spatial_point_values,nvl(spat_pt,' ')) > 0
         and instr(:p3_filtre_spatial_type_values,spat_type) > 0
         and (
              (:p3_filtre_accl_name is null and spat_accl_name is null)
              or decode(:p3_filtre_accl_name,'%',nvl(spat_accl_name,'_null_'),:p3_filtre_accl_name) = nvl(spat_accl_name,'_null_')
         )
    It works, but it takes care only of the item values, not of the values contained in the cart (USER_SELECTED_AREAS and USER_SELECTED_ACCL_LINES).
    There will be a USER_SELECTED_ table for spat_class, spat_number, spat_pt, spat_type too.
    :p3_filtre_area_name is a select list that contains AREAS table values
    :p3_filtre_accl_name is a select list that contains ACCELERATOR_LINES table values (+ "_null_" value).
    :p3_filtre_spatial_class and :p3_filtre_number are textfield items
    :p3_filtre_spatial_point_values and :p3_filtre_spatial_type_values are textfield items containing a comma-separated list of values (e.g. "E,A,B,C").
    select lists return '%' when nothing is selected
    textfields return null when nothing is entered
    I hope this is understandable.
    Thanks.
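
    One sketch of the area-name filter (untested, and assuming the cart rows for the current user are the ones with usa_loggus_id = 591, as in the first post; the same pattern would repeat for the other USER_SELECTED_ tables) that covers all four listbox/cart combinations:
    where ( spat_area_name like :p3_filtre_area_name
            and ( :p3_filtre_area_name <> '%'
                  or not exists (select null
                                 from   user_selected_areas
                                 where  usa_loggus_id = 591) ) )
       or spat_area_name in (select usa_area_name
                             from   user_selected_areas
                             where  usa_loggus_id = 591)
    The first branch keeps the listbox behaviour (matching everything only when the cart is empty), and the second branch adds the cart values.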

  • Need help for a query

    Dear all,
    I have data in a table with 3 columns.
    Table name is test, and the columns are:
    productioncode varchar2(5);
    revisionno varchar2(3);
    dateadopted char(8);
    SELECT PRODUCTIONCODE,
    REVISIONNO,
    dateadopted
    FROM test
    where
    productioncode ='CI50E';
    output should be
    productioncode revisionno dateadopted
         CI50E     004     20110125     
         CI50E     001     20101104     
         CI50E     003     20110320     
         CI50E     002     20101214     
    My requirement is:
    I want to display the records whose dateadopted is > sysdate, plus
    one record which is less than sysdate, namely the max of those < sysdate.
    from the data above, the output should be
    CI50E     003     20110320     
    CI50E     004     20110125     
    Please help me get this output; it needs to be written as a select query.

    Sorry, I don't have a database available to test - I missed joining by productioncode :( and grouping by revisionno caused more than one row to be returned :(
    No need to send data. I added a subquery providing test data. I will take a shot when I get to the database, but I might not be able to post from the office.
    with
    /* generating test data */
    test as
    (select 'CI50E' productioncode,'004' revisionno,'20110125' dateadopted from dual union all /* 003 if increasing by date */
    select 'CI50E','001','20101104' from dual union all
    select 'CI50E','003','20110320' from dual union all                                        /* 004 if increasing by date */
    select 'CI50E','002','20101214' from dual union all
    select 'XI50Y','001','20110220' from dual union all
    select 'XI50Y','002','20110220' from dual union all
    select 'XI50Y','003','20110304' from dual
    )
    /* end of test data generation */
    SELECT productioncode,revisionno,dateadopted
      FROM (select productioncode,revisionno,dateadopted
              from test a
             where (dateadopted > to_char(sysdate,'yyyymmdd'))
                or dateadopted = (select max(dateadopted)
                                    from test
                                   where productioncode = a.productioncode
                                         and dateadopted < to_char(sysdate,'yyyymmdd')))
    ORDER BY productioncode
    Regards
    Etbin
    Edited by: Etbin on 20.2.2011 21:58
    did some alignment
    Edited by: Etbin on 21.2.2011 7:39
    missing right parenthesis added, group by clause removed since there is just a single product
    Edited by: Etbin on 21.2.2011 7:52
    some more data edit

  • Required help for badi for GL tab in MIRO transaction

    Hi,
    I am working with the MIRO transaction.
    I have a requirement as below:
    While creating invoices, in the GL tab, when we enter a GL account number the system should populate the tax jurisdiction code by default.
    I am using BADI EXTENSION_US_TAXES, method MM_ITEM_TAX_MODIFY, but it populates the tax jurisdiction in the PO reference tab, not in the GL tab.
    Please advise.
    Regards,
    Suvarna Nandi

    Hi
    Have you tried with enhancement FYTX0002? See also Note 302998 - Collecting fields for user-exit. It's an idea.
    I hope this helps you
    Regards
    Eduardo

  • Help for a query with minus clause

    I want to write a query like this:
    select a.field
    from table_1 a
    minus
    select b.field
    from table_2 b
    where not exists (select 1
    from table_3 c
    where c.field = a.field);
    Unfortunately I get:
    ERROR at line 8:
    ORA-00904: invalid column name
    Can you help me?
    Thanks!

    Hi,
    Mark1970 wrote:
    I want to write a query like this:
    select a.field
    from table_1 a
    minus
    select b.field
    from table_2 b
    where not exists (select 1
    from table_3 c
    where c.field = a.field);
    Unfortunately I get:
    ERROR at line 8:
    ORA-00904: invalid column name
    Can you help me?
    Thanks!
    Table_1 and its alias, a, are only in scope in the first branch of the query, before the keyword MINUS. The second branch, after MINUS, must be a complete self-contained query.
    There are many ways to re-write the query, including:
    select     a1.field
    from      table_1          a1
        minus
    select     b.field
    from     table_2          b
    where      not exists      (
                           select     1
                       from      table_3     c
                       JOIN     table_1     a     ON     c.field = a.field
                   );
    It's suspicious that the EXISTS sub-query does not depend on anything in table_2. Are you sure this is what you want to do?
    If you'd like help, post a little sample data (CREATE TABLE and INSERT statements for all three tables) and the results you want from that data.
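
    If the intent was instead to remove the table_3 matches from the table_1 side (a guess at the goal, not something the post confirms), a sketch would be:
    select a.field
    from   table_1 a
    where  not exists (select 1
                       from   table_3 c
                       where  c.field = a.field)
        minus
    select b.field
    from   table_2 b;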

  • Pl. help for my query report . I am stuck at output section.....

    Dear all,
    I want output like the one below. I am giving part of my query here, and the output I
    actually got is also given below.
    Required Output:
    ===============
    Brand wise sales in % for 16/06/2007
    Variety Leaf Dust Fann Total Prev.day cumu
    Wagh Bakri 46.14% 27.52% 14.88% 88.53% 85.98%
    Mili 6.80% 1.50% 1.09% 9.39% 11.20%
    Navchetan 1.94% 1.94% 1.85%
    Others 0.14% 0.14% 0.97%
    Waghbakri - Organic [D'ling] 0.00% 0.00% 0.00%
    Nilgiri 100 gms Jar 0.00% 0.00% 0.00%
    Msc Leaf 100/250 Pouch 0.00% 0.00% 0.00%
    Total --> 55.02% 29.02% 29.02% 100.00% 100.00%
    Prev.day% 78.23% 16.64% 5.13% 100.00% 100.00%
    I got this output:
    ===============
    Brand Wise Sales in % For 15-MAY-07
    Variety Leaf Dust Fann Total Prev.Cumu
    Mili 11.57 % 1.39 % 1.48 % 14.43 % 14.66 % 0 %
    Navchetan 1.95 % 0.00 % 0.00 % 1.95 % 1.87 % 0 %
    Nilgiri 100gms Jar 0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 0 %
    Others 1.40 % 0.00 % 0.00 % 1.40 % 0.72 % 0 %
    Wagh Bakri 57.06 % 17.09 % 8.07 % 82.22 % 82.71 % 0 %
    Waghbakri-Organic 0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 0 %
    0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 60.59 %
    Total-->in % 71.98 18.47 9.55 100.00 99.96
    I don't get the previous-day total in the row area after 'Total-->in %'. It should appear above the last row, before the 'Total-->in %' area. You cannot see it here because the structure is disturbed.
    Sample of the query:
    ===============
    Please note it is only part of the query.
    SET PAGESIZE 15
    SET LINE 300
    SET VERIFY OFF
    SET FEEDBACK OFF
    SET HEADING OFF
    DEFINE SDT='01-MAY-07'
    DEFINE DT='15-MAY-07'
    SPOOL C:\VIPUL\SALES\S.TXT
    TTITLE LEFT " Brand Wise Sales in % For "DT" " SKIP 1 LEFT "-------------------------------------------------------------------------------------" SKIP 1 "Variety Leaf Dust Fann Total Prev.Cumu" SKIP 1 LEFT "-------------------------------------------------------------------------------------"
    COLUMN BRAND FORMAT A25
    COLUMN Leaf FORMAT 990.99
    COLUMN Dust FORMAT 990.99
    COLUMN Fann FORMAT 990.99
    COLUMN Total FORMAT 9999990.99
    COLUMN PrvCumu FORMAT 9999990.99
    BREAK ON REPORT
    COMPUTE SUM LABEL "Total-->in %" OF Leaf ON report
    COMPUTE SUM LABEL "Total-->in %" OF Dust ON report
    COMPUTE SUM LABEL "Total-->in %" OF Fann ON report
    COMPUTE SUM LABEL "Total-->in %" OF TOTAL ON report
    COMPUTE SUM LABEL "Total-->in %" OF PRVCUMU ON report
    BREAK ON REPORT SKIP 2
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF LPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF DPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF FPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF PRVTOT ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF CUMUTOT ON REPORT
    SELECT A.BRAND BRAND,ROUND ((Leaf*100)/C.INVPERT,3) Leaf,'%',ROUND((Dust*100)/C.INVPERT,3) Dust,'%',
    ROUND((Fann)*100/C.INVPERT,3) Fann,'%',ROUND((TOT)*100/C.INVPERT,3) TOTAL,'%',
    ROUND((CTOT)*100/D.CUMUPER,3) PRVCUMU,'%',ROUND ((LPRV*100)/E.PRVPER,2) LPRV,'%',ROUND((DPRV*100)/E.PRVPER,2) DPRV,'%',
    ROUND((FPRV)*100/E.PRVPER,2) FPRV,'%',ROUND((PRV)*100/E.PRVPER,2) PRVTOT,'%',
    ROUND((CPRV)*100/E.PRVPER,2) CUMUTOT,'%'
    FROM (
    SELECT BRAND,SUM (LEAF) Leaf,SUM (DUST) Dust,SUM(FANN) Fann,SUM(LEAF+DUST+FANN) TOT,
    SUM(LC) LC,SUM(DC) DC,SUM(FC) FC,SUM(LC+DC+FC) CTOT, SUM (LF) LPRV,SUM (DF) DPRV,SUM(FF) FPRV,
    SUM(LF+DF+FF) PRV,SUM(LF+DF+FF) CPRV
    FROM (
    SELECT DECODE(A.BRANDCD ,'WB','Wagh Bakri',
    'WIS','Wagh Bakri',
    'WTM','Wagh Bakri',
    'ML', 'Mili',
    '02', 'Others',
    'DL', 'Others',
    'GM', 'Others',
    'GMD','Others',
    'TQ', 'Others',
    'WOD','Waghbakri-Organic[Dling]',
    'WOG','Waghbakri-Organic[Dling]',
    'WOC','Waghbakri-Organic[Dling]',
    'NC', 'Navchetan',
    'NG', 'Nilgiri 100gms Jar',
    'MSC','Msc Leaf 100/250 Pouch') BRAND ,
    SUM(C.INVQTY) LEAF,0 DUST,0 FANN,0 LC,0 DC,0 FC,
    0 LF,0 DF,0 FF
    FROM
    WB.WBPRODUCTDETAILS A,DSP.DSPINVA B,DSP.DSPINVB C
    WHERE A.COMPCODE = C.COMPCODE AND A.P_UNIQUEID = C.P_UNIQUEID AND
    B.COMPCODE = C.COMPCODE AND B.INVYEAR = C.INVYEAR AND
    B.FACTORYCODE = C.FACTORYCODE AND B.REFINV = C.REFINV AND B.INVNO = C.INVNO AND B.INVDATE = C.INVDATE AND B.PARTYCD <> 'A0101G0999' AND
    B.INVDATE ='&DT' AND A.VARIETY = 1 GROUP BY A.BRANDCD,A.VARIETY
    UNION ALL
    SELECT DECODE(A.BRANDCD ,'WB','Wagh Bakri',
    'WIS','Wagh Bakri',
    'WTM','Wagh Bakri',
    'ML', 'Mili',
    '02', 'Others',
    'DL', 'Others',
    'GM', 'Others',
    'GMD','Others',
    'TQ', 'Others',
    'WOD','Waghbakri-Organic[Dling]',
    'WOG','Waghbakri-Organic[Dling]',
    'WOC','Waghbakri-Organic[Dling]',
    'NC', 'Navchetan',
    'NG', 'Nilgiri 100gms Jar',
    'MSC','Msc Leaf 100/250 Pouch') BRAND,
    0 LEAF,SUM(C.INVQTY) DUST,0 FANN,0 LC,0 DC,0 FC,0 LF,0 DF,0 FF
    FROM
    WB.WBPRODUCTDETAILS A,DSP.DSPINVA B,DSP.DSPINVB C
    WHERE A.COMPCODE = C.COMPCODE AND A.P_UNIQUEID = C.P_UNIQUEID AND
    B.COMPCODE = C.COMPCODE AND B.INVYEAR = C.INVYEAR AND
    B.FACTORYCODE = C.FACTORYCODE AND B.REFINV = C.REFINV AND B.INVNO = C.INVNO AND B.INVDATE = C.INVDATE AND B.PARTYCD <> 'A0101G0999' AND
    B.INVDATE ='&DT' AND A.VARIETY = 3 GROUP BY A.BRANDCD,A.VARIETY
    and so on ........................

    Dear Satyaki_De,
    Thanks for the prompt reply.
    I need the output to look like this:
    Total in %
    Prev. Day %
    The values in the two sections are different: the Total in % variables are different from
    the Prev. Day % variables.
    When the query executes, the Prev. Day % result appears one line early, at the right-hand end of the Total in % row.
    That is why I gave the front part of my query.
    Regards
    Vipul Patel
    Ahmedabad
    India
