Help for a query to add columns

Hi,
I need help with a query that adds each TableC value as an additional column.
Please suggest...
I have 3 tables (TableA, TableB, TableC). TableB stores the TableA Id, and TableC stores the TableB Id.
We are considering one Id of TableA.
Sample data
TableA     :          
ID     NAME     TABLENAME     ETYPE
23     Name1     TABLE NAMEA     Etype A
TableB     :          
ID     A_ID     RTYPE     RNAME
26     23     RTYPEA     RNAMEA
61     23     RTYPEB     RNAMEB
TableC     :          
ID     B_ID     COMPNAME     CONC
83     26     Comp Name AA     1.5
46     26     Comp Name BB     2.2
101     61     Comp Name CC     4.2
Scenario 1: As per the above sample data, put each TableC value as an additional column.
For Id 23 in TableA, TableB contains 2 records with A_ID = 23 (Ids 26 and 61), and TableC contains 2 records for B_ID 26 and 1 record for B_ID 61.
Output required: put each TableC value as an additional column
TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME TableC_1_CONC TableC_2_COMPNAME TableC_2_CONC
Name1 EtypeA RTypeA Comp Name AA 1.5 Comp Name BB 2.2     and so on...
Name1 EtypeA RTypeB Comp Name CC 4.2 NULL NULL
Scenario 2: If TableC contains only 1 row for each Id in TableB, the output should be something like:
Output:
TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME TableC_1_CONC
value value value value value

Hi,
Welcome to the forum!
Do you want the data from TableC presented
(1) in one column, or
(2) in several columns (a different column of results for each row in the original TableC)?
(1) is called String Aggregation, and is easier than (2).
The best way to do this is with a user-defined aggregate function (STRAGG), which you can copy from AskTom.
Ignoring TableA for now, you could get what you want by saying
SELECT    b.rtype
,         STRAGG (   c.compname
                 || ' '
                 || c.conc
                 )  AS c_data
FROM      TableB  b
JOIN      TableC  c  ON b.id  = c.b_id
GROUP BY  b.rtype;

(2) Presenting N rows of TableC as if they were N columns of the same row is called a pivot. Search for "pivot" or "rows to columns" to find examples of how to do this.
The number of columns in a result set is hard-coded into the query. If you don't know ahead of time how many rows in TableC will match a row in TableB, you can:
(a) guess high (for example, hard-code 20 columns and let the ones that never contain a match be NULL) or,
(b) use Dynamic SQL to write a query for you, which has exactly as many columns as you need.
The two scripts below contain basic information on pivots.
This first script is similar to what you would do for case (a):
--     How to Pivot a Result Set (Display Rows as Columns)
--     For Oracle 10, and earlier
--     Actually, this works in any version of Oracle, but the
--     "SELECT ... PIVOT" feature introduced in Oracle 11
--     is better.  (See Query 2, below.)
--     This example uses the scott.emp table.
--     Given a query that produces three rows for every department,
--     how can we show the same data in a query that has one row
--     per department, and three separate columns?
--     For example, the query below counts the number of employees
--     in each department that have one of three given jobs:
PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
SELECT     deptno
,     job
,     COUNT (*)     AS cnt
FROM     scott.emp
WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
GROUP BY     deptno
,          job;
Output:
    DEPTNO JOB              CNT
        20 CLERK              2
        20 MANAGER            1
        30 CLERK              1
        30 MANAGER            1
        10 CLERK              1
        10 MANAGER            1
        20 ANALYST            2
PROMPT     ==========  1. Pivot  ==========
SELECT     deptno
,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
FROM     scott.emp
WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
GROUP BY     deptno;
--     Output:
    DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
        30           0          1           1
        20           2          2           1
        10           0          1           1
--     Explanation
(1) Decide what you want the output to look like.
     (E.g. "I want a row for each department,
     and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt.")
(2) Get a result set where every row identifies which row
     and which column of the output will be affected.
     In the example above, deptno identifies the row, and
     job identifies the column.
     Both deptno and job happened to be in the original table.
     That is not always the case; sometimes you have to
     compute new columns based on the original data.
(3) Use aggregate functions and CASE (or DECODE) to produce
     the pivoted columns. 
     The CASE expression will pick
     only the rows of raw data that belong in the column.
     If each cell in the output corresponds to (at most)
     one row of input, then you can use MIN or MAX as the
     aggregate function.
     If many rows of input can be reflected in a single cell
     of output, then use SUM, COUNT, AVG, STRAGG, or some other
     aggregate function.
     GROUP BY the column that identifies rows.
PROMPT     ==========  2. Oracle 11 PIVOT  ==========
WITH     e     AS
(     -- Begin sub-query e to SELECT columns for PIVOT
     SELECT     deptno
     ,     job
     FROM     scott.emp
)     -- End sub-query e to SELECT columns for PIVOT
SELECT     *
FROM     e
PIVOT     (     COUNT (*)
          FOR     job     IN     ( 'ANALYST'     AS analyst
                         , 'CLERK'     AS clerk
                         , 'MANAGER'     AS manager
                         )
          );
NOTES ON ORACLE 11 PIVOT:
(1) You must use a sub-query to select the raw columns.
An in-line view (not shown) is an example of a sub-query.
(2) GROUP BY is implied for all columns not in the PIVOT clause.
(3) Column aliases are optional. 
If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).
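If you don't know the list of values in advance, Oracle 11 also allows "PIVOT XML" with the ANY keyword; a minimal sketch (note this returns the pivoted data as a single XMLTYPE column, not as separate relational columns, which is why it is mainly useful for the XML output mentioned below):
SELECT     *
FROM     (
     SELECT     deptno
     ,     job
     FROM     scott.emp
     )
PIVOT XML
     (     COUNT (*)     AS cnt
     FOR     job     IN     (ANY)
     );
--     The result has a DEPTNO column and a JOB_XML column of type XMLTYPE.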
The second script, below, shows one way of doing a dynamic pivot in SQL*Plus:
--     How to Pivot a Table with a Dynamic Number of Columns
--     This works in any version of Oracle.
--     The "SELECT ... PIVOT" feature introduced in Oracle 11
--     is much better for producing XML output.
--     Say you want to make a cross-tab output of
--     the scott.emp table.
--     Each row will represent a department.
--     There will be a separate column for each job.
--     Each cell will contain the number of employees in
--     a specific department having a specific job.
--     The exact same solution must work with any number
--     of departments and columns.
--     (Within reason: there's no guarantee this will work if you
--     want 2000 columns.)
--     Case 0 "Basic Pivot" shows how you might hard-code three
--     job types, which is exactly what you DON'T want to do.
--     Case 1 "Dynamic Pivot" shows how to get the right results
--     dynamically, using SQL*Plus.
--     (This can be easily adapted to PL/SQL or other tools.)
PROMPT     ==========  0. Basic Pivot  ==========
SELECT     deptno
,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
FROM     scott.emp
WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
GROUP BY     deptno
ORDER BY     deptno;
PROMPT     ==========  1. Dynamic Pivot  ==========
--     *****  Start of dynamic_pivot.sql  *****
-- Suppress SQL*Plus features that interfere with raw output
SET     FEEDBACK     OFF
SET     PAGESIZE     0
SPOOL     p:\sql\cookbook\dynamic_pivot_subscript.sql
SELECT     DISTINCT
     ',     COUNT (CASE WHEN job = '''
||     job
||     ''' '     AS txt1
,     'THEN 1 END)     AS '
||     job
||     '_CNT'     AS txt2
FROM     scott.emp
ORDER BY     txt1;
SPOOL     OFF
-- Restore SQL*Plus features suppressed earlier
SET     FEEDBACK     ON
SET     PAGESIZE     50
SPOOL     p:\sql\cookbook\dynamic_pivot.lst
SELECT     deptno
@@dynamic_pivot_subscript
FROM     scott.emp
GROUP BY     deptno
ORDER BY     deptno;
SPOOL     OFF
--     *****  End of dynamic_pivot.sql  *****
EXPLANATION:
The basic pivot assumes you know the number of distinct jobs,
and the name of each one.  If you do, then writing a pivot query
is simply a matter of writing the correct number of ", COUNT ... AS ..."
lines, with the name entered in two places on each one.  That is easily
done by a preliminary query, which uses SPOOL to write a sub-script
(called dynamic_pivot_subscript.sql in this example).
The main script invokes this sub-script at the proper point.
In practice, .SQL scripts usually contain one or more complete
statements, but there's nothing that says they have to.
This one contains just a fragment from the middle of a SELECT statement.
Before creating the sub-script, turn off SQL*Plus features that are
designed to help humans read the output (such as headings and
feedback messages like "7 rows selected."), since we do not want these
to appear in the sub-script.
Turn these features on again before running the main query.

Similar Messages

  • Please Help for the Query

    Hi friends, please help me with the query below. What I want to do is pull out the data from the table below:
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/18/2008 3IINFOTECH -4
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3
    as such :-
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3
    Here I want to find the trend, i.e. an ascending order starting from the lowest indicator.
    In the above sample data (-8, -4, -5, -3), I want only the ascending-order data -8, -5, -3 and want to exclude the -4, because with it the sequence -8, -5, -3 would not be ascending.
    So I want the data
    date ticker indicator
    03/13/2008 3IINFOTECH -8
    03/25/2008 3IINFOTECH -5
    03/27/2008 3IINFOTECH -3

    SQL> CREATE TABLE BORRAME(FECHA DATE, INDICA VARCHAR2(100));
    Table created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/13/2008','MM/DD/YYYY'), '3IINFOTECH -8');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/18/2008','MM/DD/YYYY'), '3IINFOTECH -4');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/25/2008','MM/DD/YYYY'), '3IINFOTECH -5');
    1 row created.
    SQL> INSERT INTO BORRAME VALUES(TO_DATE('03/27/2008','MM/DD/YYYY'), '3IINFOTECH -3');
    1 row created.
    SQL> COMMIT;
    Commit complete.
    SQL>
    SQL> SELECT FECHA, INDICA
      2  FROM BORRAME
      3  WHERE SUBSTR(INDICA,INSTR(INDICA,'-',1)+1,LENGTH(INDICA)) <> '4'
      4  ORDER BY SUBSTR(INDICA,INSTR(INDICA,'-',1)+1,LENGTH(INDICA)) DESC;
    FECHA        INDICA
    13/03/08     3IINFOTECH -8
    25/03/08     3IINFOTECH -5
    27/03/08     3IINFOTECH -3

  • Need help for the query columns to rows

    Hi everyone,
    I have two tables TABLE1 and TABLE2; TABLE1 is Master table,TABLE2 is child table.
    The key for TABLE1 is C1 column
    The key for TABLE2 is C1,C_MONTH Columns
    The sample data is as follows
    TABLE1
    ======
    C1 C2
    1 A
    2 B
    3 C
    4 D
    TABLE2
    ======
    C1 C_MONTH C3
    1 JAN AAA
    1 FEB BBB
    1 MAR CCC
    1 APR DDD
    2 JAN ZZZ
    2 FEB YYY
    2 MAR XXX
    2 APR UUU
    I want to display the data as follows
    1 A JAN AAA FEB BBB MAR CCC APR DDD
    2 B JAN ZZZ FEB YYY MAR XXX APR UUU
    Can any one help me how to write this query?
    Thanks in advance

    [email protected] wrote:
    Thanks for the update
    but I want the output as column values rather than one column, as follows:
    C1 C2 J_MONTH J_VALUE F_MONTH F_VALUE M_MONTH M_VALUE A_MONTH A_VALUE
    1 A JAN AAA FEB BBB MAR CCC APR DDD
    2 B JAN ZZZ FEB YYY MAR XXX APR UUU
    This is a standard pivot.
    In 10g or below you can do something like...
    SQL> ed
    Wrote file afiedt.buf
      1  with table1 as (
      2                  select 1 c1, 'A' c2 from dual union all
      3                  select 2, 'B' from dual union all
      4                  select 3, 'C' from dual union all
      5                  select 4, 'D' from dual
      6                 ),
      7       table2 as (
      8                  select 1 c1, 'JAN' C_MONTH,'AAA' C3 from dual union all
      9                  select 1,'FEB','BBB' C3 from dual union all
    10                  select 1,'MAR','CCC' C3 from dual union all
    11                  select 1,'APR','DDD' C3 from dual union all
    12                  select 2,'JAN','ZZZ' C3 from dual union all
    13                  select 2,'FEB','YYY' C3 from dual union all
    14                  select 2,'MAR','XXX' C3 from dual union all
    15                  select 2,'APR','UUU' C3 from dual
    16                 )
    17  -- end of test data
    18  select table1.c1, table1.c2
    19        ,max(decode(c_month, 'JAN', c_month)) as jan_month
    20        ,max(decode(c_month, 'JAN', c3)) as jan_value
    21        ,max(decode(c_month, 'FEB', c_month)) as feb_month
    22        ,max(decode(c_month, 'FEB', c3)) as feb_value
    23        ,max(decode(c_month, 'MAR', c_month)) as mar_month
    24        ,max(decode(c_month, 'MAR', c3)) as mar_value
    25        ,max(decode(c_month, 'APR', c_month)) as apr_month
    26        ,max(decode(c_month, 'APR', c3)) as apr_value
    27  from table1 join table2 on (table1.c1 = table2.c1)
    28* group by table1.c1, table1.c2
    SQL> /
            C1 C JAN JAN FEB FEB MAR MAR APR APR
             1 A JAN AAA FEB BBB MAR CCC APR DDD
             2 B JAN ZZZ FEB YYY MAR XXX APR UUU
    SQL>
    From 11g upwards you can use the new PIVOT keyword.
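    For what it's worth, a sketch of the 11g version against the same table1/table2 test data (the PIVOT FOR column cannot also be referenced in the aggregates, so c_month is duplicated as month_label):
    select *
    from  (select t1.c1, t1.c2, t2.c_month, t2.c_month as month_label, t2.c3
           from   table1 t1 join table2 t2 on t1.c1 = t2.c1)
    pivot (max(month_label) as month, max(c3) as value
           for c_month in ('JAN' as jan, 'FEB' as feb, 'MAR' as mar, 'APR' as apr));
    This produces columns C1, C2, JAN_MONTH, JAN_VALUE, FEB_MONTH, FEB_VALUE, and so on.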

  • Required help for a query

    Hi All....
    Required help for one more query.
    I have a table with data like this:
    Cust_id Transaction_no
    111 1
    111 2
    111 3
    111 4
    111 5
    111 6
    222 7
    222 8
    333 9
    333 10
    333 11
    333 12
    I wrote the following query :
    select cust_id, ntile(3) over (order by cust_id) "Bucket" from trans_detls
    The output is like this :
    Cust_id Bucket
    111 1
    111 1
    111 1
    111 1
    111 2
    111 2
    222 2
    222 2
    333 3
    333 3
    333 3
    333 3
    The problem is that I don't want the cust_id to overlap across buckets. That is, one cust_id should be present in only one bucket.
    Is this possible?
    Thanks in advance.
    Ameya

    Or Something like..
    SQL> select * from test;
            ID         NO
           111          1
           111          2
           111          3
           111          4
           111          5
           111          6
           222          7
           222          8
           333          9
           333         10
           333         11
           333         12
    12 rows selected.
    SQL> select id, ntile(3) over (order by rn) "Bucket"
      2  from(
      3      select id,row_number() over(partition by id order by no) rn
      4      from test);
            ID     Bucket
           111          1
           222          1
           333          1
           111          1
           222          2
           333          2
           111          2
           333          2
           111          3
           333          3
           111          3
           111          3
    12 rows selected.
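    If the requirement is that no id may span two buckets, one sketch (an assumption about the intent: bucket the distinct ids first, then join back, so bucket sizes in rows will only be approximately equal):
    with ids as (
         select id, ntile(3) over (order by id) as bucket
         from  (select distinct id from test)
    )
    select t.id, t.no, i.bucket
    from   test t
    join   ids  i on t.id = i.id
    order by i.bucket, t.id;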

  • Help for TSQL query

    Hi,
    I want to design a query. Below are the requirements:
    Existing Table :
    Code   Val1   Val2   Val3
    A       A11      -         -
    B        -       B22        -
    C        -       -         C33
    Output required:  
    Code  Col1  Col2
    A       A11   B22
    B       B22   C33
    C       C33  -
    Can anyone help with the query? 

    Why don't you normalize your data?
    E.g.
    DECLARE @Sample TABLE
        (
          Code CHAR(1) ,
          Val1 CHAR(3) ,
          Val2 CHAR(3) ,
          Val3 CHAR(3)
        );
    INSERT INTO @Sample
    VALUES ( 'A', 'A11', NULL, NULL ),
           ( 'B', NULL, 'B22', NULL ),
           ( 'C', NULL, NULL, 'C33' );
    -- SQL Server 2012+
    WITH Normalized
    AS ( SELECT U.Code ,
                U.Attribute ,
                U.Value
         FROM @Sample S UNPIVOT ( Value FOR Attribute IN ( Val1, Val2, Val3 ) ) U
       )
    SELECT N.Code ,
           N.Value ,
           LEAD(N.Value, 1, NULL) OVER ( ORDER BY N.Code ASC )
    FROM Normalized N;
    -- SQL Server 2008+
    WITH Normalized
    AS ( SELECT U.Code ,
                U.Attribute ,
                U.Value ,
                ROW_NUMBER() OVER ( ORDER BY U.Code ) AS RN
         FROM @Sample S UNPIVOT ( Value FOR Attribute IN ( Val1, Val2, Val3 ) ) U
       )
    SELECT L.Code ,
           L.Value ,
           R.Value
    FROM Normalized L
    LEFT JOIN Normalized R ON L.RN = R.RN - 1;
    By the way, in future please post concise and complete examples. Include table DDL and sample data INSERT statements as runnable T-SQL scripts.

  • How to implement "Application Help" for activated Business Functions & Add Ons?

    Dear all
    I am setting up a SAP NetWeaver System with Business Function INSURANCE activated (for the use of FS-CD) and the reinsurance Add On FS-RI installed.
    I am able to set up "Application Help" in transaction SR13 for the "ERP/ECC part", meaning: when I open some "normal" ECC/ERP transaction, like MM03 or PA20, and click on ("Help" in the menu bar >) "Application Help" in SAP GUI, a browser opens and shows me the context-sensitive online help page for the corresponding transaction. (For PA20, this is "Displaying HR Master Data"; for MM03 it's "Displaying Material Master Records".)
    If I open an FS-RI transaction (like "/MSG/H_BORD1" or "/MSG/R_V3"), I am only forwarded to the general SAP Library. But there is online documentation available ("SAP for Industries" > "SAP Insurance Management" > ...)
    So, how to link the Business Functions and Add Ons with the Online Help?
    Is there a simple way to set up "Application Help" for activated Business Functions and installed Add Ons?
    I found some (online) documentation, but it is all more confusing than helpful...
    Thanks for any help!
    Frank

    Second, when you create a Core Data Document-Based application, Xcode generates the MyDocument class, deriving from NSPersistentDocument. This class is dedicated to maintaining the Managed Object Model and the Managed Object Context of each document of your application.
    There is only one Context and generally one Model for each Document.
    In Interface Builder, the Managed Object Context must be bound to the managedObjectContext of the MyDocument instance: it's the File's owner of the myDocument.xib Nib file.
    It's not the case in your Nib File where the Managed Object Context is bound to the invoiceID of the Invoice Controller.
    It's difficult to help you without overall knowledge of your application; perhaps you could create an InvoiceItem Controller and bind its Content Set to the relationship of your invoice.

  • Help for the query

    hello,
    I have a question again. The tables for the query are "Patient" and "Station".
    Station-Table:
    s_id, station
    Patient-Table:
    p_id, name, s_id, gender
    I want to know how many patients are male and female for each station. That means that the output should be:
    Station Male Female
    S1 12 10
    S2 6 4

    I assume the values in gender are 'M' for Male and 'F' for Female.
    select s.station, sum(decode(p.gender, 'M', 1, 0)) Male , sum(decode(p.gender, 'F', 1, 0)) Female
    from station s, patient p
    where s.s_id=p.s_id
    group by s.station;
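    An equivalent sketch with an ANSI join and CASE, in case you prefer that style:
    select s.station
    ,      count(case when p.gender = 'M' then 1 end) as male
    ,      count(case when p.gender = 'F' then 1 end) as female
    from   station s
    join   patient p on s.s_id = p.s_id
    group by s.station;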

  • Help for a query

    Hi,
    Oracle 10g r2.
    I have a page with a listbox item, and a "cart" that can contain values from the listbox (the user can add values from the listbox to the cart).
    I want to filter a report depending on the values in the cart and the value selected in the listbox!
    Only one value can be selected in the listbox, but the cart can contain several values.
    When no value is selected in the listbox, it returns '%'; otherwise it returns the value.
    Here is what I need:
    listbox = '%', cart = empty ==> return all records
    listbox = '%', cart = 'value1, value2' ==> return records where spat_area_name in ('value1','value2')
    listbox = 'value1', cart = 'value2, value3' ==> return records where spat_area_name in ('value1','value2','value3')
    listbox = 'value1', cart = empty ==> return records where spat_area_name in ('value1')
    For example (doesn't work):
    select
         *
    from
         spatial_points
    where
         spat_area_name like :p3_filtre_area_name
    or
         spat_area_name in (
              select usa_area_name from user_selected_areas where usa_loggus_id = 591
         );
    :p3_filtre_area_name is the listbox value
    (select usa_area_name from user_selected_areas where usa_loggus_id = 591) returns the values stored in the cart
    I tried a few things (using CASE or DECODE) but I can't manage to make it work.
    So any help would be much appreciated.
    Thanks.
    Yann.

    Hi,
    Here are some create/insert statements if you want to test :
    create table accelerator_lines (
         accl_name varchar2(7),
         accl_description varchar2(50),
         constraint accl_lines_pk primary key (accl_name)
    );
    create table areas (
         area_name varchar2(7),
         area_description varchar2(50),
         constraint areas_pk primary key (area_name)
    );
    create table spatial_points (
         spat_id integer,
         spat_accl_name varchar2(7),
         spat_area_name varchar2(7) not null,
         spat_class varchar2(6) not null,
         spat_number varchar2(6) not null,
         spat_pt varchar2(1),
         spat_type varchar2(1) not null,
         constraint spatial_pk primary key (spat_id),
         constraint spat_type check (spat_type in ('P','S','B','T','U','C')),
         constraint spat_pt check (spat_pt in ('E','S','A','B','C','D')),
         constraint spatial_accl_fk foreign key (spat_accl_name) references accelerator_lines(accl_name),
         constraint spatial_area_fk foreign key (spat_area_name) references areas(area_name)
    );
    create table user_selected_areas (
         usa_id integer,
         usa_area_name varchar2(7),
         constraint usa_id_pk primary key (usa_id),
         constraint usa_area_name_fk foreign key (usa_area_name) references areas(area_name)
    );
    create table user_selected_accl_lines (
         usal_id integer,
         usal_accl_name varchar2(7),
         constraint usal_id_pk primary key (usal_id),
         constraint usal_accl_name_fk foreign key (usal_accl_name) references accelerator_lines(accl_name)
    );
    insert into accelerator_lines values ('LHC','LHC description');
    insert into accelerator_lines values ('LINAC4','LINAC4 description');
    insert into accelerator_lines values ('SPS','SPS description');
    insert into accelerator_lines values ('TI12','TI12 description');
    insert into accelerator_lines values ('TI18','TI18 description');
    insert into accelerator_lines values ('LEP','LEP description');
    insert into areas values ('TT81','TT81 description');
    insert into areas values ('PDV3','PDV3 description');
    insert into areas values ('PDV4','PDV4 description');
    insert into areas values ('193','193 description');
    insert into areas values ('EHW1','EHW1 description');
    insert into areas values ('TCC2','TCC2 description');
    insert into spatial_points values (1,'LHC','PDV4','MB2M1','22586','E','A');
    insert into spatial_points values (2,'LHC','PDV4','MB2M1','22586','S','A');
    insert into spatial_points values (3,'LHC','PDV4','MBC4','sr555','E','B');
    insert into spatial_points values (4,'TI12','TT81','RD433','22','E','A');
    insert into spatial_points values (5,'TI12','TT81','ESD8C5','564','S','A');
    insert into spatial_points values (6,'LEP','PDV3','MBRRM1','22586','E','B');
    insert into spatial_points values (7,'LEP','PDV3','MBRRM1','22586','S','B');
    insert into spatial_points values (8,'LEP','PDV3','FFZ55','2266','B','C');
    insert into spatial_points values (9,'LEP','PDV3','YEFH8','18992','E','B');
    insert into spatial_points values (10,'LEP','PDV4','YEFH8','18992','S','B');
    insert into spatial_points values (11,'LEP','PDV4','YEFH8','18995','E','B');
    insert into spatial_points values (12,'LEP','PDV4','YEFH8','18995','S','B');
    insert into spatial_points values (13,'LEP','PDV4','YEFH8','18996','E','B');
    insert into spatial_points values (14,'LEP','PDV4','YEFH8','18996','S','B');
    insert into spatial_points values (15,'LEP','PDV4','YEFH8','18999','D','U');  
    insert into spatial_points values (16,'LINAC4','193','QASM1','4255','E','B');
    insert into spatial_points values (17,'LINAC4','193','QASM1','4255','S','B');
    insert into spatial_points values (18,'LINAC4','193','QASM1','4264','E','B');
    insert into spatial_points values (19,'LINAC4','TCC2','FFEPO','4264','S','B');
    insert into spatial_points values (20,'LINAC4','TCC2','QASM1','4255','D','P');
    insert into spatial_points values (21,'SPS','EHW1','LMRDE','22586','E','B');
    insert into spatial_points values (22,'SPS','EHW1','LMRDE','22586','S','B');
    insert into spatial_points values (23,'SPS','EHW1','X8PE5','22587','E','B');
    insert into spatial_points values (24,'SPS','EHW1','X8PE5','22587','S','B');
    insert into spatial_points values (25,'SPS','EHW1','X8PE5','22590','C','A');
    insert into spatial_points values (26,'SPS','TCC2','DDFFR9','22590','C','A');
    insert into spatial_points values (27,'TI18','PDV4','94FFG4E','22586','E','B');
    insert into spatial_points values (28,'TI18','PDV4','94FFG4E','22586','S','B');
    insert into spatial_points values (29,'TI18','193','94FFG4E','22589','E','T');
    insert into spatial_points values (30,'TI18','TCC2','NO55D','22589','S','T');
    insert into user_selected_areas values (1,'PDV4');
    insert into user_selected_areas values (2,'193');
    insert into user_selected_accl_lines values (1,'TI18');
    Currently, my query is the following:
    select
         spat_id,
         spat_accl_name,
         spat_area_name,
         spat_class,
         spat_number,
         spat_pt,
         spat_type
    from
         spatial_points
    where
         spat_area_name like :p3_filtre_area_name
         and spat_class like nvl(:p3_filtre_spatial_class,'%')
         and spat_number like nvl(:p3_filtre_number,'%')
         and instr(:p3_filtre_spatial_point_values,nvl(spat_pt,' ')) > 0
         and instr(:p3_filtre_spatial_type_values,spat_type) > 0
         and (
              (:p3_filtre_accl_name is null and spat_accl_name is null)
              or decode(:p3_filtre_accl_name,'%',nvl(spat_accl_name,'_null_'),:p3_filtre_accl_name) = nvl(spat_accl_name,'_null_')
         )
    It works, but it only takes the item values into account, not the values contained in the cart (USER_SELECTED_AREAS and USER_SELECTED_ACCL_LINES).
    There will be a USER_SELECTED_ table for spat_class, spat_number, spat_pt and spat_type too.
    :p3_filtre_area_name is a select list that contains AREAS table values
    :p3_filtre_accl_name is a select list that contains ACCELERATOR_LINES table values (+ "_null_" value).
    :p3_filtre_spatial_class and :p3_filtre_number are textfield items
    :p3_filtre_spatial_point_values and :p3_filtre_spatial_type_values are textfield items containing a comma-separated list of values (i.e. "E,A,B,C").
    select lists return '%' when nothing is selected
    textfields return null when nothing is entered
    I hope I'm understandable.
    Thanks.
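    For the area-name logic alone, a minimal sketch (assuming the user_selected_areas rows shown are the current cart; in the real tables you would also filter on usa_loggus_id, and the other filters from the full query would be ANDed around this block):
    select *
    from   spatial_points
    where  (    :p3_filtre_area_name != '%'
            and spat_area_name = :p3_filtre_area_name )
       or  spat_area_name in (select usa_area_name from user_selected_areas)
       or  (    :p3_filtre_area_name = '%'
            and not exists (select 1 from user_selected_areas) );
    This returns everything when the listbox is '%' and the cart is empty, only the cart areas when the listbox is '%' and the cart has values, and the union of both otherwise.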

  • Help for a query with minus clause

    I want to write a query like this:
    select a.field
    from table_1 a
    minus
    select b.field
    from table_2 b
    where not exists (select 1
    from table_3 c
    where c.field = a.field);
    Unfortunately I get:
    ERROR at line 8:
    ORA-00904: invalid column name
    Can you help me?
    Thanks!

    Hi,
    Mark1970 wrote:
    I want to write a query like this:
    select a.field
    from table_1 a
    minus
    select b.field
    from table_2 b
    where not exists (select 1
    from table_3 c
    where c.field = a.field);
    Unfortunately I get:
    ERROR at line 8:
    ORA-00904: invalid column name
    Can you help me?
    Thanks!
    Table_1 and its alias, a, are only in scope in the first branch of the query, before the keyword MINUS. The second branch, after MINUS, must be a complete self-contained query.
    There are many ways to re-write the query, including:
    select     a.field
    from       table_1          a
        minus
    select     b.field
    from       table_2          b
    where      not exists      (
                               select     1
                               from       table_3     c
                               JOIN       table_1     a     ON     c.field = a.field
                               );
    It's suspicious that the EXISTS sub-query does not depend on anything in table_2. Are you sure this is what you want to do?
    If you'd like help, post a little sample data (CREATE TABLE and INSERT statements for all three tables) and the results you want from that data.

  • Need help for a query

    Dear all,
    I have data in a table with 3 columns.
    Table name is test & the columns are:
    Productioncode varchar2(5);
    revisionno varchar2(3);
    dateadopted char(8);
    SELECT PRODUCTIONCODE,
    REVISIONNO,
    dateadopted
    FROM test
    where
    productioncode ='CI50E';
    output should be
    productioncode revisionno dateadopted
         CI50E     004     20110125     
         CI50E     001     20101104     
         CI50E     003     20110320     
         CI50E     002     20101214     
    My requirement is:
    I want to display the records whose dateadopted is > sysdate, plus
    one record which is less than sysdate, namely the max of those < sysdate.
    from the data above, the output should be
    CI50E     003     20110320     
    CI50E     004     20110125     
    Please help me get this output. It needs to be written as a select query.

    Sorry, I don't have a database available to test - I forgot to join by productioncode :( and grouping by revisionno caused more than one row to be returned :(
    No need to send data. I added a data-providing subquery. I will take a shot when I get to the database, but I might not be able to post from the office.
    with
    /* generating test data */
    test as
    (select 'CI50E' productioncode,'004' revisionno,'20110125' dateadopted from dual union all /* 003 if increasing by date */
    select 'CI50E','001','20101104' from dual union all
    select 'CI50E','003','20110320' from dual union all                                        /* 004 if increasing by date */
    select 'CI50E','002','20101214' from dual union all
    select 'XI50Y','001','20110220' from dual union all
    select 'XI50Y','002','20110220' from dual union all
    select 'XI50Y','003','20110304' from dual
    )
    /* end of test data generation */
    SELECT productioncode,revisionno,dateadopted
      FROM (select productioncode,revisionno,dateadopted
              from test a
             where (dateadopted > to_char(sysdate,'yyyymmdd'))
                or dateadopted = (select max(dateadopted)
                                    from test
                                   where productioncode = a.productioncode
                                     and dateadopted < to_char(sysdate,'yyyymmdd')))
    ORDER BY PRODUCTIONCODE;
    Regards
    Etbin
    Edited by: Etbin on 20.2.2011 21:58
    did some alignment
    Edited by: Etbin on 21.2.2011 7:39
    missing right parenthesis added, group by clause removed since there is just a single product
    Edited by: Etbin on 21.2.2011 7:52
    some more data edit

  • Can anyone help with an SDO_WITHIN_DISTANCE query?

    I tried to use SDO_WITHIN_DISTANCE to find all records that fall in a 5-mile radius from the location I searched. The results return records within 5 of longitude instead of within 5 miles of the location.
    My query was:
    SELECT *
    FROM BH_ORDER_T
    WHERE SDO_WITHIN_DISTANCE(od_pair_geometry,mdsys.SDO_GEOMETRY(2001, 8307, mdsys.SDO_POINT_TYPE(-81.4373367,41.3875542,null), null,null),'DISTANCE=5 UNIT=MILE') = 'TRUE'
    This location is SOLON City Hall, SOLON, OH
    The output results are all records whose od_pair_geometry column longitude is between -81 and -84. For example:
    Sandusky OH (2002,8307,Null,(1,2,1),(-75.46,41.24,-82.71,41.45))
    Elyria OH (2002,8307,Null,(1,2,1),(-75.46,41.24,-82.12,41.38))
    Cleveland OH (2002,8307,Null,(1,2,1),(0,0,-81.7436,41.4336))
    What was wrong with my Query?
    Thanks in advance

    Hi,
    Can you describe what you are doing?
    The HSQL demo environment has been started, and when you run a job, it closes the DB?
    Regards,
    MA
    I want to do data integration from Oracle to HSQLDB, but when connecting to HSQLDB, something went wrong, just as I described.
    I fixed it by using the demo environment of HSQLDB 1.7 instead of 1.8.0; it runs OK.
    But I still don't know why v1.8.0 goes wrong.

  • Require help for a query

    Hi all..
    I require some help in writing a query..
    My table is like this
    Cust_id     Cust_type Del_status Incharge
    111     Gold     HD          
    222     Gold     
    333     Gold     HD
    444     Gold
    123     Gold     HD          
    456     Gold          
    789     Gold     HD
    987     Gold          
    555     Silver     HD
    666     Silver     HD
    777     Silver
    888     Silver
    I want a query to generate this output
    Cust_id     Cust_type Del_status Incharge
    111     Gold     HD     1
    222     Gold          1
    333     Gold     HD      1
    444     Gold          1
    555     Silver     HD     1
    777     Silver          1
    123     Gold     HD     2
    456     Gold          2
    789     Gold     HD     2
    987     Gold          2
    666     Silver     HD     2
    888     Silver          2
    The query basically allocates the customers to incharges, based on cust_type and del_status.
    There are 3 categories: Gold customers, Silver customers and HD customers.
    It should divide these three equally amongst the 2 incharges...
    Also, this is just sample data... the actual table consists of around 3 lakh (300,000) customers and 12 incharges.
    Sorry if this is an incorrect post..
    Thanks in advance..

    Is there a way to find the value for ntile(2) dynamically?
    I tried something like
    Connected to:
    Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.8.0 - Production
    SQL> with mytable as (select 111 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      2  select 222 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      3  select 333 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      4  select 444 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      5  select 123 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      6  select 456 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      7  select 789 cust_id, 'Gold' cust_type, 'HD' del_status from dual UNION ALL
      8  select 897 cust_id, 'Gold' cust_type, null del_status from dual UNION ALL
      9  select 555 cust_id, 'Silver' cust_type, 'HD' del_status from dual UNION ALL
    10  select 666 cust_id, 'Silver' cust_type, 'HD' del_status from dual UNION ALL
    11  select 777 cust_id, 'Silver' cust_type, null del_status from dual UNION ALL
    12  select 888 cust_id, 'Silver' cust_type, null del_status from dual UNION ALL
    13  select 1001 cust_id, 'Copper' cust_type, null del_status from dual UNION ALL
    14  select 1002 cust_id, 'Copper' cust_type, 'HD' del_status from dual UNION ALL
    15  select 1003 cust_id, 'Copper' cust_type, null del_status from dual
    16  )
    17  select t1.cust_id
    18  , t1.cust_type
    19  , t1.del_status
    20  --, ntile(3) over (partition by nvl(t1.del_status,t1.cust_type) order by t1.cust_id)
    21  , ntile((select count(distinct nvl(t2.del_status,t2.cust_type))-1 from mytable t2)) over (parti
    tion by nvl(del_status,cust_type) order by cust_id)
    22  incharge
    23  from mytable t1
    24  order by incharge, t1.cust_type, t1.cust_id
    25  /
    , ntile((select count(distinct nvl(t2.del_status,t2.cust_type))-1 from mytable t2)) over (partition
    ERROR at line 21:
    ORA-30488: argument should be a function of expressions in PARTITION BY
    SQL>
    The number of incharges could change over time.
    Message was edited by:
    Sven Weller
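    One way around ORA-30488 is to avoid NTILE altogether and round-robin the rows with ROW_NUMBER against a precomputed count. A sketch; the incharges table is hypothetical (any source for the current number of incharges would do):
    select m.cust_id
    ,      m.cust_type
    ,      m.del_status
    ,      mod(row_number() over (partition by nvl(m.del_status, m.cust_type)
                                  order by     m.cust_id) - 1
              , x.cnt) + 1 as incharge
    from   mytable m
    cross join (select count(*) as cnt from incharges) x  -- hypothetical table, one row per incharge
    order by incharge, m.cust_type, m.cust_id;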

  • Needs help for the query

    Hi All,
    I have a table containing 3 columns: ID, Grade and DEPTID.
    ID Grade DEPTID
    1 A 10
    2 E 20
    3 D 20
    4 C 30
    5 H 10
    6 K 10
    7 B 30
    8 L 30
    9 R 20
    I need the output as:
    DEPTID Employees and grades in the DEPT
    10 1 A , 5 H , 6 K
    20 2 E , 3 D , 9 R
    30 4 C , 7 B , 8 L
    Please, can anyone give me a query for this?
    thanks in advance.
    rampa

    STRAGG for string aggregation. One of the most frequently asked questions over here.
    Here is a forum search for you.
    Of course, the options depend on the version you are on.
    Nicolas.
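    On 11.2 and later, the built-in LISTAGG function does the same job without installing STRAGG. A sketch; the table name emp_grades is made up, since the post doesn't give one:
    select deptid
    ,      listagg(id || ' ' || grade, ' , ')
               within group (order by id) as employees_and_grades
    from   emp_grades
    group by deptid;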

  • SQL experts please help for a query

    I have the following table1.
    What query can give the result shown below? SQL experts, please help with this.
    TABLE1
    Event DATETIME
    in 2/JAN/2010
    out 2/JAN/2010
    in 13/JAN/2010
    out 13/JAN/2010
    in 5/JAN/2010
    out 5/JAN/2010
    RESULT REQUIRED FROM THE SQL QUERY
    COL1_IN COL2_OUT
    2/JAN/2010 2/JAN/2010
    13/JAN/2010 13/JAN/2010
    5/JAN/2010 5/JAN/2010

    I tried to help, but this puzzles me.
    Why is this not returning a pre-selected set of rows? Why is it doing a merge join Cartesian?
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Prod
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    SQL> select * from table1;
    EVENT      DATETIME
    in         2/JAN/2010
    out        2/JAN/2010
    in         13/JAN/2010
    out        13/JAN/2010
    in         5/JAN/2010
    out        5/JAN/2010
    6 rows selected.
    SQL> explain plan for
      2  with a as
    (select datetime from table1 where event='in'),
    b as
    (select datetime from table1 where event='out')
    select  a.datetime COL1_IN ,b.datetime COL2_OUT from a,b ;
    Explained.
    SQL> set wrap off
    SQL> set linesize 200
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 185132177
    | Id  | Operation            | Name   | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT     |        |     9 |   288 |     8   (0)| 00:00:01 |
    |   1 |  MERGE JOIN CARTESIAN|        |     9 |   288 |     8   (0)| 00:00:01 |
    |*  2 |   TABLE ACCESS FULL  | TABLE1 |     3 |    48 |     3   (0)| 00:00:01 |
    |   3 |   BUFFER SORT        |        |     3 |    48 |     5   (0)| 00:00:01 |
    |*  4 |    TABLE ACCESS FULL | TABLE1 |     3 |    48 |     2   (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
       2 - filter("EVENT"='in')
       4 - filter("EVENT"='out')
    Note
       - dynamic sampling used for this statement
    21 rows selected.
    SQL> with a as
    (select datetime from table1 where event='in'),
    b as
    (select datetime from table1 where event='out')
    select  a.datetime COL1_IN ,b.datetime COL2_OUT from a,b ;
    COL1_IN         COL2_OUT
    2/JAN/2010      2/JAN/2010
    2/JAN/2010      13/JAN/2010
    2/JAN/2010      5/JAN/2010
    13/JAN/2010     2/JAN/2010
    13/JAN/2010     13/JAN/2010
    13/JAN/2010     5/JAN/2010
    5/JAN/2010      2/JAN/2010
    5/JAN/2010      13/JAN/2010
    5/JAN/2010      5/JAN/2010
    9 rows selected.
    SQL>
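    For what it's worth, the Cartesian join is exactly what the query asks for: the final SELECT joins a and b with no join condition, so every 'in' date is paired with every 'out' date (3 x 3 = 9 rows). A sketch of one way to pair the Nth 'in' with the Nth 'out' instead, assuming pairing by position is what is wanted (if DATETIME is a string rather than a DATE, the sort is alphabetical, but as long as the 'in' and 'out' sets match one-for-one the pairing still lines up):
    WITH a AS
    (   SELECT  datetime
        ,       ROW_NUMBER () OVER (ORDER BY datetime)  AS rn
        FROM    table1
        WHERE   event = 'in'
    )
    , b AS
    (   SELECT  datetime
        ,       ROW_NUMBER () OVER (ORDER BY datetime)  AS rn
        FROM    table1
        WHERE   event = 'out'
    )
    SELECT  a.datetime  AS col1_in
    ,       b.datetime  AS col2_out
    FROM    a
    JOIN    b  ON  a.rn = b.rn;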

  • Pl. help for my query report. I am stuck at the output section...

    Dear All,
    I want output like the one below. I am giving part of my query here; the output I actually got is also shown below the required output.
    Required Output:
    ===============
    Brand wise sales in % for 16/06/2007
    Variety Leaf Dust Fann Total Prev.day cumu
    Wagh Bakri 46.14% 27.52% 14.88% 88.53% 85.98%
    Mili 6.80% 1.50% 1.09% 9.39% 11.20%
    Navchetan 1.94% 1.94% 1.85%
    Others 0.14% 0.14% 0.97%
    Waghbakri - Organic [D'ling] 0.00% 0.00% 0.00%
    Nilgiri 100 gms Jar 0.00% 0.00% 0.00%
    Msc Leaf 100/250 Pouch 0.00% 0.00% 0.00%
    Total --> 55.02% 29.02% 29.02% 100.00% 100.00%
    Prev.day% 78.23% 16.64% 5.13% 100.00% 100.00%
    I got this output:
    ==============
    Brand Wise Sales in % For 15-MAY-07
    Variety Leaf Dust Fann Total Prev.Cumu
    Mili 11.57 % 1.39 % 1.48 % 14.43 % 14.66 % 0 %
    Navchetan 1.95 % 0.00 % 0.00 % 1.95 % 1.87 % 0 %
    Nilgiri 100gms Jar 0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 0 %
    Others 1.40 % 0.00 % 0.00 % 1.40 % 0.72 % 0 %
    Wagh Bakri 57.06 % 17.09 % 8.07 % 82.22 % 82.71 % 0 %
    Waghbakri-Organic 0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 0 %
    0.00 % 0.00 % 0.00 % 0.00 % 0.00 % 60.59 %
    Total-->in % 71.98 18.47 9.55 100.00 99.96
    I don't get the previous-day total row after the 'Total in %' row; it should appear just above the last row, before the 'Total in %' area. You cannot see it here because the layout is disturbed.
    Sample of Query :
    =============== (Please note: this is only part of the query)
    SET PAGESIZE 15
    SET LINE 300
    SET VERIFY OFF
    SET FEEDBACK OFF
    SET HEADING OFF
    DEFINE SDT='01-MAY-07'
    DEFINE DT='15-MAY-07'
    SPOOL C:\VIPUL\SALES\S.TXT
    TTITLE LEFT " Brand Wise Sales in % For "DT" " SKIP 1 LEFT "-------------------------------------------------------------------------------------" SKIP 1 "Variety Leaf Dust Fann Total Prev.Cumu" SKIP 1 LEFT "-------------------------------------------------------------------------------------"
    COLUMN BRAND FORMAT A25
    COLUMN Leaf FORMAT 990.99
    COLUMN Dust FORMAT 990.99
    COLUMN Fann FORMAT 990.99
    COLUMN Total FORMAT 9999990.99
    COLUMN PrvCumu FORMAT 9999990.99
    BREAK ON REPORT
    COMPUTE SUM LABEL "Total-->in %" OF Leaf ON report
    COMPUTE SUM LABEL "Total-->in %" OF Dust ON report
    COMPUTE SUM LABEL "Total-->in %" OF Fann ON report
    COMPUTE SUM LABEL "Total-->in %" OF TOTAL ON report
    COMPUTE SUM LABEL "Total-->in %" OF PRVCUMU ON report
    BREAK ON REPORT SKIP 2
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF LPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF DPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF FPRV ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF PRVTOT ON REPORT
    COMPUTE SUM LABEL 'Prev.Day-->in %' OF CUMUTOT ON REPORT
    SELECT A.BRAND BRAND,ROUND ((Leaf*100)/C.INVPERT,3) Leaf,'%',ROUND((Dust*100)/C.INVPERT,3) Dust,'%',
    ROUND((Fann)*100/C.INVPERT,3) Fann,'%',ROUND((TOT)*100/C.INVPERT,3) TOTAL,'%',
    ROUND((CTOT)*100/D.CUMUPER,3) PRVCUMU,'%',ROUND ((LPRV*100)/E.PRVPER,2) LPRV,'%',ROUND((DPRV*100)/E.PRVPER,2) DPRV,'%',
    ROUND((FPRV)*100/E.PRVPER,2) FPRV,'%',ROUND((PRV)*100/E.PRVPER,2) PRVTOT,'%',
    ROUND((CPRV)*100/E.PRVPER,2) CUMUTOT,'%'
    FROM (
    SELECT BRAND,SUM (LEAF) Leaf,SUM (DUST) Dust,SUM(FANN) Fann,SUM(LEAF+DUST+FANN) TOT,
    SUM(LC) LC,SUM(DC) DC,SUM(FC) FC,SUM(LC+DC+FC) CTOT, SUM (LF) LPRV,SUM (DF) DPRV,SUM(FF) FPRV,
    SUM(LF+DF+FF) PRV,SUM(LF+DF+FF) CPRV
    FROM (
    SELECT DECODE(A.BRANDCD ,'WB','Wagh Bakri',
    'WIS','Wagh Bakri',
    'WTM','Wagh Bakri',
    'ML', 'Mili',
    '02', 'Others',
    'DL', 'Others',
    'GM', 'Others',
    'GMD','Others',
    'TQ', 'Others',
    'WOD','Waghbakri-Organic[Dling]',
    'WOG','Waghbakri-Organic[Dling]',
    'WOC','Waghbakri-Organic[Dling]',
    'NC', 'Navchetan',
    'NG', 'Nilgiri 100gms Jar',
    'MSC','Msc Leaf 100/250 Pouch') BRAND ,
    SUM(C.INVQTY) LEAF,0 DUST,0 FANN,0 LC,0 DC,0 FC,
    0 LF,0 DF,0 FF
    FROM
    WB.WBPRODUCTDETAILS A,DSP.DSPINVA B,DSP.DSPINVB C
    WHERE A.COMPCODE = C.COMPCODE AND A.P_UNIQUEID = C.P_UNIQUEID AND
    B.COMPCODE = C.COMPCODE AND B.INVYEAR = C.INVYEAR AND
    B.FACTORYCODE = C.FACTORYCODE AND B.REFINV = C.REFINV AND B.INVNO = C.INVNO AND B.INVDATE = C.INVDATE AND B.PARTYCD <> 'A0101G0999' AND
    B.INVDATE ='&DT' AND A.VARIETY = 1 GROUP BY A.BRANDCD,A.VARIETY
    UNION ALL
    SELECT DECODE(A.BRANDCD ,'WB','Wagh Bakri',
    'WIS','Wagh Bakri',
    'WTM','Wagh Bakri',
    'ML', 'Mili',
    '02', 'Others',
    'DL', 'Others',
    'GM', 'Others',
    'GMD','Others',
    'TQ', 'Others',
    'WOD','Waghbakri-Organic[Dling]',
    'WOG','Waghbakri-Organic[Dling]',
    'WOC','Waghbakri-Organic[Dling]',
    'NC', 'Navchetan',
    'NG', 'Nilgiri 100gms Jar',
    'MSC','Msc Leaf 100/250 Pouch') BRAND,
    0 LEAF,SUM(C.INVQTY) DUST,0 FANN,0 LC,0 DC,0 FC,0 LF,0 DF,0 FF
    FROM
    WB.WBPRODUCTDETAILS A,DSP.DSPINVA B,DSP.DSPINVB C
    WHERE A.COMPCODE = C.COMPCODE AND A.P_UNIQUEID = C.P_UNIQUEID AND
    B.COMPCODE = C.COMPCODE AND B.INVYEAR = C.INVYEAR AND
    B.FACTORYCODE = C.FACTORYCODE AND B.REFINV = C.REFINV AND B.INVNO = C.INVNO AND B.INVDATE = C.INVDATE AND B.PARTYCD <> 'A0101G0999' AND
    B.INVDATE ='&DT' AND A.VARIETY = 3 GROUP BY A.BRANDCD,A.VARIETY
    and so on ........................

    Dear Satyaki_De,
    Thanks for the prompt reply.
    I need the output to contain two separate summary rows:
    Total in %
    Prev. Day %
    The two sections use different values: the variables behind 'Total in %' differ from the variables behind 'Prev. Day %'.
    When the query executes, it prints the 'Prev. Day %' result one line early, at the right-hand end of the 'Total in %' line.
    That is why I posted only the front part of my query.
    Regards
    Vipul Patel
    Ahmedabad
    India
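    For what it's worth, part of the problem is in the SQL*Plus commands themselves: each new BREAK command replaces the previous one, so the second BREAK ON REPORT SKIP 2 wipes out the first, and all of the COMPUTE ... ON REPORT sums are printed in a single summary section, so the 'Total-->in %' and 'Prev.Day-->in %' lines cannot be positioned independently that way. A sketch of the usual alternative, producing the summary rows in the query itself so their position is fully under your control (brand_pct and brand_prev_pct are hypothetical views standing in for the large inline views above):
    SELECT    brand, leaf, dust, fann, total, prvcumu
    ,         1  AS sort_key              -- detail rows come first
    FROM      brand_pct                   -- hypothetical: the detail query above
    UNION ALL
    SELECT    'Total-->in %', SUM (leaf), SUM (dust), SUM (fann), SUM (total), SUM (prvcumu)
    ,         2                           -- then the current-day totals
    FROM      brand_pct
    UNION ALL
    SELECT    'Prev.Day-->in %', SUM (lprv), SUM (dprv), SUM (fprv), SUM (prvtot), SUM (cumutot)
    ,         3                           -- then the previous-day totals
    FROM      brand_prev_pct              -- hypothetical: the previous-day percentages
    ORDER BY  sort_key, brand;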
