Help on Pivot Query - 11g

I am working on Oracle 11g database.
I am trying to join two tables having the following data
tab1
contact_id
full_name
email_id
tel_no
tab2
scr_id
org_id (foreign key to org table)
contact_id (foreign key to contact_id in tab1)
contact_type (will take values 'ID', 'IDE', 'DC')
So an organization can have multiple contact types, like 'ID', 'IDE' and 'DC'.
I want them to be displayed as
org_id
id_full_name
id_email_id
id_tel_no
ide_full_name
ide_email_id
ide_tel_no
dc_full_name
dc_email_id
dc_tel_no
I am trying to use code similar to this (this only produces full_name values for the 3 contact types, whereas I would need email_id and tel_no as columns too):
SELECT  *
FROM    (
        SELECT  org_id,
                full_name,
                contact_type
        FROM    tab1,
                tab2
        WHERE   tab1.contact_id = tab2.contact_id
        )
PIVOT   (
        MAX(full_name)
        FOR contact_type
        IN ('ID','IDE','DC')
        )
Any suggestions?

Hi,
Instead of a single aggregate function at the beginning of the PIVOT clause, use a comma-delimited list of aggregate functions.
Since you didn't post CREATE TABLE and INSERT statements for any sample data, I'll use scott.emp to illustrate.
Say we want to see the ename, sal and hiredate for the first 3 employees (in alphabetic order) in each job:
WITH     got_r_num     AS
(
     SELECT     ename, job, sal, hiredate
     ,     ROW_NUMBER () OVER ( PARTITION BY  job
                                ORDER BY      ename
                              )         AS r_num
     FROM    scott.emp
)
SELECT       *
FROM       got_r_num
PIVOT       (     MAX (ename)     AS ename
       ,     MAX (sal)       AS sal
       ,     MAX (hiredate)  AS hiredate
       FOR      r_num IN (1, 2, 3)
       )
ORDER BY  job
;
Output:
JOB       1_ENAM 1_SAL 1_HIREDAT 2_ENAM 2_SAL 2_HIREDAT 3_ENAM 3_SAL 3_HIREDAT
ANALYST   FORD    3000 03-DEC-81 SCOTT   3000 19-APR-87
CLERK     ADAMS   1100 23-MAY-87 JAMES    950 03-DEC-81 MILLER  1300 23-JAN-82
MANAGER   BLAKE   2850 01-MAY-81 CLARK   2450 09-JUN-81 JONES   2975 02-APR-81
PRESIDENT KING    5000 17-NOV-81
SALESMAN  ALLEN   1600 20-FEB-81 MARTIN  1250 28-SEP-81 TURNER  1500 08-SEP-81 
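Applied to the two tables you posted, the same multi-aggregate pattern might look like the sketch below (untested, since there's no sample data to run it against; it assumes at most one contact row per contact_type for each org_id):
SELECT    *
FROM      (
          SELECT  t2.org_id
          ,       t1.full_name
          ,       t1.email_id
          ,       t1.tel_no
          ,       t2.contact_type
          FROM    tab1  t1
          ,       tab2  t2
          WHERE   t1.contact_id = t2.contact_id
          )
PIVOT     (     MAX (full_name)  AS full_name
          ,     MAX (email_id)   AS email_id
          ,     MAX (tel_no)     AS tel_no
          FOR   contact_type IN ('ID' AS id, 'IDE' AS ide, 'DC' AS dc)
          )
;
The column aliases after each aggregate and after each IN value combine to give ID_FULL_NAME, ID_EMAIL_ID, ID_TEL_NO and so on, which is the layout you asked for.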
I hope this answers your question.
If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables, and also post the results you want from that data.
Explain, using specific examples, how you get those results from that data.

Similar Messages

  • Help in Pivot Query- To change the column data to rows data!

Hello Gurus -
I have to change the rows to columns.
When I use the query
select NVL (T2.NAME, ' Grand Total') AS State, count(T2.NAME) as Total
from Defect T1, statedef T2, repoproject T3
WHERE T1.STATE = T2.ID
AND T1.repoproject = T3.dbid
AND T3.name like '%Compass Juice'
GROUP BY ROLLUP (T2.NAME)
then I get the following data -
    STATE          TOTAL
    Analysis     17
    Closed          1302
    Development     9
    Duplicate     24
    Failed          2
    OnHold          4
    Opened          146
    QA          1
    ReadyForQA     1
    Withdrawn      335
    Grand Total     1841
But I want the data in the following format -
    State Analysis     Closed     Development      Duplicate     Failed     OnHold     Opened     QA     ReadyForQA     Withdrawn     GrandTotal
    Total 17     1302     9          24          2     4     146     1     1          335          1841
Kindly help me with this. I searched the forum and saw the usage of MAX, NVL and DECODE, but I am unable to understand how to use them in my query.

    Hi,
    In 11g you can use pivot.
    [http://www.oracle.com/technology/pub/articles/oracle-database-11g-top-features/11g-pivot.html]
    example
    SQL> desc customers
    Name                                      Null?    Type
    CUST_ID                                            NUMBER(10)
    CUST_NAME                                          VARCHAR2(20)
    STATE_CODE                                         VARCHAR2(2)
    TIMES_PURCHASED                                    NUMBER(3)
    When this table is selected:
    select cust_id, state_code, times_purchased
    from customers
    order by cust_id;
    The output is:
    CUST_ID STATE_CODE TIMES_PURCHASED
          1 CT                       1
          2 NY                      10
          3 NJ                       2
          4 NY                       4
    ... and so on ...
    Note how the data is represented as rows of values: For each customer, the record shows the customer's home state and how many times the customer purchased something from the store. As the customer purchases more items from the store, the column times_purchased is updated.
Now consider a case where you want to have a report of the purchase frequency of each state: that is, how many customers bought something only once, twice, thrice and so on, from each state. In regular SQL, you can issue the following statement:
    select state_code, times_purchased, count(1) cnt
    from customers
    group by state_code, times_purchased;
    Here is the output:
    ST TIMES_PURCHASED        CNT
    CT               0         90
    CT               1        165
    CT               2        179
    CT               3        173
    CT               4        173
    CT               5        152
    ... and so on ...
This is the information you want but it's a little hard to read. A better way to represent the same data may be through the use of crosstab reports, in which you can organize the data vertically and states horizontally, just like a spreadsheet:
    Times_purchased
                 CT           NY         NJ      ... and so on ...
    1             0            1          0      ...
    2            23          119         37      ...
    3            17           45          1      ...
    ... and so on ...
    Prior to Oracle Database 11g, you would do that via some sort of a decode function for each value and write each distinct value as a separate column. The technique is quite nonintuitive however.
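For example, a pre-11g version of the crosstab above might look like the following sketch, hard-coding the same five states with COUNT and DECODE (each DECODE returns 1 only for its own state, so each COUNT counts only that state's rows):
select   times_purchased
,        count(decode(state_code, 'NY', 1)) as "NY"
,        count(decode(state_code, 'CT', 1)) as "CT"
,        count(decode(state_code, 'NJ', 1)) as "NJ"
,        count(decode(state_code, 'FL', 1)) as "FL"
,        count(decode(state_code, 'MO', 1)) as "MO"
from     customers
group by times_purchased
order by times_purchased;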
    Fortunately, you now have a great new feature called PIVOT for presenting any query in the crosstab format using a new operator, appropriately named pivot. Here is how you write the query:
select * from (
   select times_purchased, state_code
   from customers t
)
pivot
(
   count(state_code)
   for state_code in ('NY','CT','NJ','FL','MO')
)
order by times_purchased;
    Here is the output:
    . TIMES_PURCHASED       'NY'       'CT'       'NJ'       'FL'       'MO'
                  0      16601         90          0          0          0
                  1      33048        165          0          0          0
                  2      33151        179          0          0          0
                  3      32978        173          0          0          0
                  4      33109        173          0          1          0
    ... and so on ...

  • Help in Pivot query

    Hi All,
For the data given below, I want the query result to show all months in the output, even if there is no data for that month.
    If there is no data for a month it should display 0 as the amount.
    Please help me in building this query.
version : Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    create table ledger
    (id number, month varchar2(3), year number, amount real);
    begin
    insert into ledger values (1,'jan',1980,100);
    insert into ledger values (2,'jan',1980,200);
    insert into ledger values (3,'mar',1980,300);
    insert into ledger values (4,'mar',1980,1000);
    insert into ledger values (5,'jul',1980,400);
    insert into ledger values (6,'jun',1980,500);
    insert into ledger values (7,'oct',1980,700);
    insert into ledger values (8,'oct',1980,600);
    insert into ledger values (9,'oct',1980,200);
    insert into ledger values (1,'jan',1981,100);
    insert into ledger values (2,'jan',1981,200);
    insert into ledger values (3,'mar',1981,400);
    insert into ledger values (4,'mar',1981,100);
    insert into ledger values (5,'jul',1981,300);
    insert into ledger values (6,'jun',1981,200);
    insert into ledger values (7,'oct',1981,100);
    insert into ledger values (8,'oct',1981,100);
    insert into ledger values (9,'oct',1980,100);
    insert into ledger values (1,'jan',1982,100);
    insert into ledger values (2,'jan',1982,200);
    insert into ledger values (3,'mar',1982,400);
    insert into ledger values (4,'mar',1982,100);
    insert into ledger values (5,'jul',1982,300);
    insert into ledger values (6,'jun',1982,200);
    insert into ledger values (7,'oct',1982,100);
    insert into ledger values (8,'oct',1982,100);
    insert into ledger values (9,'oct',1982,100);
    insert into ledger values (1,'jan',1984,100);
    insert into ledger values (2,'jan',1984,200);
    insert into ledger values (3,'mar',1984,300);
    insert into ledger values (4,'mar',1984,1000);
    insert into ledger values (5,'jul',1984,400);
    insert into ledger values (6,'jun',1984,500);
    insert into ledger values (7,'oct',1984,700);
    insert into ledger values (8,'oct',1984,600);
    insert into ledger values (9,'oct',1984,200);
    end;
    select  Initcap(month) "Year",
           sum(case  when year = 1980 then amount else 0 end ) "1980",
           sum(case  when year = 1981 then amount else 0 end ) "1981",
           sum(case  when year = 1982 then amount else 0 end ) "1982",
           sum(case  when year = 1983 then amount else 0 end ) "1983",
           sum(case  when year = 1984 then amount else 0 end ) "1984",
           sum(case  when year = 1985 then amount else 0 end ) "1985"
    from ledger
    group by  month;
    Expected output
    Year     1980       1981        1982        1983       1984         1985
    Jan         amount    amount   amount   amount  amount   amount
    Feb        amount    amount   amount   amount  amount   amount
    Mar        amount    amount   amount   amount  amount   amount
    Apr        amount    amount   amount   amount  amount   amount
    May       amount    amount   amount   amount  amount   amount
    Jun       amount    amount   amount   amount  amount   amount
    Jul        amount    amount   amount   amount  amount   amount
    Aug      amount    amount   amount   amount  amount   amount
    Sep      amount    amount   amount   amount  amount   amount
    Oct      amount    amount   amount   amount  amount   amount
    Nov      amount    amount   amount   amount  amount   amount
Dec      amount    amount   amount   amount  amount   amount
Thanks
    Raghu

    and in case you want a proper sorting order...
    SQL> with months as
      2  (
      3       select level l, to_char(add_months(trunc(sysdate, 'yy'), level-1), 'mon') m
      4       from dual
      5       connect by level <= 12
      6  )
      7  select max(Initcap(m.m)) "Year",
      8         sum(case  when year = 1980 then amount else 0 end ) "1980",
      9         sum(case  when year = 1981 then amount else 0 end ) "1981",
    10         sum(case  when year = 1982 then amount else 0 end ) "1982",
    11         sum(case  when year = 1983 then amount else 0 end ) "1983",
    12         sum(case  when year = 1984 then amount else 0 end ) "1984",
    13         sum(case  when year = 1985 then amount else 0 end ) "1985"
    14  from ledger l, months m
    15  where l.month (+) = m.m
    16  group by m.l
    17  order by m.l;
    Year       1980       1981       1982       1983       1984       1985
    Jan         300        300        300          0        300          0
    Feb           0          0          0          0          0          0
    Mar        1300        500        500          0       1300          0
    Apr           0          0          0          0          0          0
    May           0          0          0          0          0          0
    Jun         500        200        200          0        500          0
    Jul         400        300        300          0        400          0
    Aug           0          0          0          0          0          0
    Sep           0          0          0          0          0          0
    Oct        1600        200        300          0       1500          0
    Nov           0          0          0          0          0          0
    Dec           0          0          0          0          0          0
    12 rows selected

  • Help with Pivot Query

    I'm trying to write a pivot (by day) query that includes a count of events that happen based on a field not being null. In the same query I'd like to include the total events (null or not) so that I can get a ratio.
i.e. this is what I have; how can I add the total count as a final column (whether g is null or not, for all days)?
select x, y, z,
sum (decode (to_char (timestamp, 'YYYY-MM-DD'), '2012-06-09', count)) JUNE_09,
sum (decode (to_char (timestamp, 'YYYY-MM-DD'), '2012-06-10', count)) JUNE_10
from ( select x, y, z, timestamp, count(*) count
       from ab
       where g is not null
       group by x, y, z, timestamp )
group by x, y, z
    This is 10g.
    Thanks!

    Hi,
    user12100488 wrote:
I'm sorry I don't have DDL or data statements to post on this network.
Write some and post them. I'm not saying this is trivial, I'm just saying that it's necessary.
    If you prefer, you can post a WITH clause that contains your sample data, like GVR did above.
    See the forum FAQ {message:id=9360002} for examples of both.
Either way, post the results you want from that sample data. Include at least one row where the date is outside the range in which you're interested.
Thanks, Frank. I need the g_cnt as a count where g is NOT NULL for every distinct x,y,z per day. I have this now in my original statement. I need to add a total count for every distinct x,y,z whether it is null or not, to get a ratio. This is currently written in PL/SQL but I believe it can be done in one statement.
Right; there's no need for PL/SQL just to get a ratio. In fact, even if you needed to do this query in PL/SQL for some other reason, you would almost certainly want to compute the ratios in the query itself.
    I'm looking for output like
X Y Z June 9 June 10 June 11 Total (g) Total (all days)
g_cnt g_cnt total whether null or not
Once again, you need to post some sample data and the results you want from that data.
If you want numbers in the output, then post numbers, not "g_cnt". It's great to include comments such as "total whether null or not", but do so in addition to (not instead of) posting the results you actually want.
    If you want ratios, include them in the results you post.
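For illustration only, computing both counts and the ratio in a single pass might look like the sketch below (untested; it assumes the columns x, y, z, g and timestamp from the posted query, and only the two dates shown there):
select   x, y, z
,        count (case when g is not null
                 and  trunc (timestamp) = date '2012-06-09'
                then 1 end)                             as june_09_g_cnt
,        count (case when g is not null
                 and  trunc (timestamp) = date '2012-06-10'
                then 1 end)                             as june_10_g_cnt
,        count (case when g is not null then 1 end)     as total_g_cnt
,        count (*)                                      as total_cnt
,        round ( count (case when g is not null then 1 end)
              / count (*)
              , 3 )                                     as g_ratio
from     ab
group by x, y, z;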

  • Need help on pivot query

    Hi All,
      I have data like this
sdate       sday        Empname   empprofile
2014-07-06  Sunday      xxx       7a7p - RN
2014-07-06  Sunday      xxx       7a7p - RN
2014-07-06  Sunday      xxx       7a7p - RN
2014-07-06  Sunday      xxx       7a7p - RN
2014-07-07  Monday      xxx       7a7p - RN
2014-07-07  Monday      xxx       7a7p - RN
2014-07-07  Monday      xxx       7a7p - RN
2014-07-07  Monday      xxx       7a7p - RN
2014-07-08  Tuesday     xxx       7a7p - RN
2014-07-08  Tuesday     xxx       7a7p - RN
2014-07-08  Tuesday     xxx       7a7p - RN
2014-07-08  Tuesday     xxx       7a7p - RN
2014-07-09  Wednesday   xxx       7a7p - RN
2014-07-09  Wednesday   xxx       7a7p - RN
2014-07-09  Wednesday   xxx       7a7p - RN
2014-07-09  Wednesday   xxx       7a7p - RN
2014-07-10  Thursday    xxx       7a7p - RN
2014-07-10  Thursday    xxx       7a7p - RN
2014-07-10  Thursday    xxx       7a7p - RN
2014-07-11  Friday      xxx       7a7p - RN
2014-07-11  Friday      xxx       7a7p - RN
2014-07-11  Friday      xxx       7a7p - RN
2014-07-12  Saturday    xxx       7a7p - RN
2014-07-12  Saturday    xxx       7a7p - RN
2014-07-12  Saturday    xxx       7a7p - RN
2014-07-12  Saturday    xxx       7a7p - RN
I need to display the data for 7 days as:
    2014-07-06                         2014-07-07                2014-07-08
    Sunday                                Monday                      Tuesday    
    Empname   Empprofile    Empname  Empprofile    ..................
    xxx        7a7p - RN           xxx            7a7p - RN     .................
    xxx        7a7p - RN           xxx            7a7p - RN     ................
    xxx        7a7p - RN           xxx            7a7p - RN    ................
    Waiting for valuable replies

One way would be filtering [sdate] for a specific range and letting the reporting tool do the pivoting. Another way would be using the traditional way to pivot data by grouping (group by), spreading (case expression) and aggregating (min / max / count / etc.).
If the date range is not static then you will have to use dynamic pivoting, and the code can be ugly.
    Here is an example for the static and the dynamic versions.
    SET NOCOUNT ON;
    USE tempdb;
    GO
    CREATE TABLE #T (
    sdate date,
    sday varchar(15),
    Empname varchar(50),
empprofile varchar(50)
);
    INSERT INTO #T
    (sdate, sday, Empname, empprofile)
    VALUES
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN');
    GO
WITH C1 AS (
SELECT
*,
ROW_NUMBER() OVER(PARTITION BY sdate ORDER BY empprofile) AS rn
FROM
#T
)
SELECT
    MAX(CASE WHEN sdate = '20140706' THEN sdate END) AS [sdate - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN sday END) AS [sday - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN Empname END) AS [Empname - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN empprofile END) AS [empprofile - 0140706],
    MAX(CASE WHEN sdate = '20140707' THEN sdate END) AS [sdate - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN sday END) AS [sday - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN Empname END) AS [Empname - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN empprofile END) AS [empprofile - 0140707],
    MAX(CASE WHEN sdate = '20140708' THEN sdate END) AS [sdate - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN sday END) AS [sday - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN Empname END) AS [Empname - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN empprofile END) AS [empprofile - 0140708],
    MAX(CASE WHEN sdate = '20140709' THEN sdate END) AS [sdate - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN sday END) AS [sday - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN Empname END) AS [Empname - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN empprofile END) AS [empprofile - 0140709],
    MAX(CASE WHEN sdate = '20140710' THEN sdate END) AS [sdate - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN sday END) AS [sday - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN Empname END) AS [Empname - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN empprofile END) AS [empprofile - 0140710],
    MAX(CASE WHEN sdate = '20140711' THEN sdate END) AS [sdate - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN sday END) AS [sday - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN Empname END) AS [Empname - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN empprofile END) AS [empprofile - 0140711],
    MAX(CASE WHEN sdate = '20140712' THEN sdate END) AS [sdate - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN sday END) AS [sday - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN Empname END) AS [Empname - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN empprofile END) AS [empprofile - 0140712]
    FROM
    C1
    GROUP BY
    rn;
    GO
    DECLARE
    @sdt date = '20140706',
    @edt date = '20140712';
DECLARE @sql nvarchar(MAX) = N'
WITH C1 AS (
SELECT
*,
ROW_NUMBER() OVER(PARTITION BY sdate ORDER BY empprofile) AS rn
FROM
#T
)
SELECT ';
SET @sql = @sql + STUFF((
SELECT
    ', MAX(CASE WHEN CONVERT(char(8), sdate, 112) + '' - '' + sday = ''' + CONVERT(char(8), sdate, 112) + ' - ' + sday + N''' THEN Empname END) AS ' + QUOTENAME(CONVERT(char(8), sdate, 112) + ' - ' + sday + ' - Empname') +
    ', MAX(CASE WHEN CONVERT(char(8), sdate, 112) + '' - '' + sday = ''' + CONVERT(char(8), sdate, 112) + ' - ' + sday + N''' THEN empprofile END) AS ' + QUOTENAME(CONVERT(char(8), sdate, 112) + ' - ' + sday + ' - empprofile')
    FROM
    #T
    WHERE
    sdate BETWEEN @sdt AND @edt
    GROUP BY
    sdate,
    sday
    ORDER BY
    sdate
    FOR XML PATH('')
    ), 1, 1, '');
    SET @sql = @sql + N' FROM C1 GROUP BY rn;';
    PRINT @sql;
    EXEC sp_executesql @sql;
    GO
    DROP TABLE #T;
    GO
I would suggest sticking to the reporting tool for this kind of work.
    AMB
    Some guidelines for posting questions...

  • 11G Pivot Query with Oracle EBS

    Hello all,
    We are trying to use the 11G pivot query function with data from Oracle E-Business Suite. We have an 11G database installed with our Oracle APEX. We cannot seem to get the pivot function to work. At a glance, would anyone be able to see any glaring errors in our syntax. I am not certain it is possible to provide test data so...
    We are trying to have column headings with the Period Names SEP-08 OCT-08 NOV-08, with rows of segment2 007751 and accounted_dr as the dataset.
    When we run the sql we get an error ORA-00904: "PERIOD_NAME": invalid identifier.
    Any help or insight would be greatly appreciated.
select * from (
select segment2, accounted_dr, period_name
from gl_je_lines a, gl_code_combinations b
where b.code_combination_id = a.code_combination_id
and segment2 = '007751')
pivot
(
sum(accounted_dr)
for period_name in ('SEP-08','OCT-08','NOV-08')
)
group by segment2, period_name

    lilhelp wrote:
    Hello all,
We are trying to use the 11G pivot query function with data from Oracle E-Business Suite. We have an 11G database installed with our Oracle APEX. We cannot seem to get the pivot function to work. At a glance, would anyone be able to see any glaring errors in our syntax. I am not certain it is possible to provide test data.
Why not?
    >
    We are trying to have column headings with the Period Names SEP-08 OCT-08 NOV-08, with rows of segment2 007751 and accounted_dr as the dataset.
    When we run the sql we get an error ORA-00904: "PERIOD_NAME": invalid identifier.
    Any help or insight would be greatly appreciated.
    select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 = '007751')
pivot
(
sum(accounted_dr)
for period_name in ('SEP-08','OCT-08','NOV-08')
)
group by segment2, period_name
Don't use GROUP BY. When you use PIVOT, the grouping is implied by what is in the PIVOT clause and what is not.
    Try this:
select    *
from      (
          select  segment2
          ,       accounted_dr
          ,       period_name
          from    gl_je_lines          a
          ,       gl_code_combinations b
          where   b.code_combination_id = a.code_combination_id
          and     segment2 = '007751'
          )
pivot     (
          sum (accounted_dr)
          for period_name in ('SEP-08','OCT-08','NOV-08')
          )
;
which is just your posted query without the GROUP BY clause.

  • 11G Pivot Query with parameters

    Hello all,
    I would like to find some way, any way to pass parameters to a pivot query. The following pivot query works, but I would like segment2 to be a variable as well as the period names so....
select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 >='007611' and segment2 <='007751' AND period_name in ('SEP-08','OCT-08','NOV-08'))
pivot
(
sum(accounted_dr)
for period_name in ('SEP-08','OCT-08','NOV-08') )
    ....would be something like....
select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
and segment2 >= :P4_OBJECT_FROM AND segment2 <= :P4_OBJECT_TO AND period_name in &P4_EPSB_PERIOD_HOLD.)
pivot
(
sum(accounted_dr)
for period_name in (&P4_EPSB_PERIOD_HOLD.) )
    It is our understanding that we have to hardcode period names and objects, but we would like to get around that. Does anyone have any ideas or tricks?
    Thanks


  • Dynamic SQL and Pivot Query in 11G

    Hello all,
I am using APEX and 11G. I am trying to create a report based on the results of a pivot query. Below is the code to build the query string. The :P4_EPSB_PERIOD_HOLD holds data like ('SEP-08'), for example.
    declare
    q varchar2(4000);
    begin
    q:=q ||' select * FROM';
    q:=q ||' ( ';
    q:=q ||' select segment2, ';
    q:=q ||' accounted_dr, ';
    q:=q ||' period_name ';
    q:=q ||' from gl_je_lines a, ';
    q:=q ||' gl_code_combinations b';
    q:=q ||' where b.code_combination_id = a.code_combination_id';
    q:=q ||' and segment2 >= :P4_EPSB_OBJECT_FROM';
    q:=q ||' and segment2 <= :P4_EPSB_OBJECT_TO';
    q:=q ||' and period_name IN :P4_EPSB_PERIOD_HOLD';
    q:=q ||' and segment4 >= :P4_EPSB_LOCATION_FROM';
    q:=q ||' and segment4 <= :P4_EPSB_LOCATION_TO';
    q:=q ||' )';
    q:=q ||' PIVOT';
    q:=q ||' (';
    q:=q ||' sum(accounted_dr)';
    q:=q ||' for period_name IN :P4_EPSB_PERIOD_HOLD';
    q:=q ||' )';
    return q;
    end;
I get the following error: failed to parse SQL query:
    ORA-00906: missing left parenthesis
If I print the SQL statement that the query generates, I get the following code, which, if the variables are hard-coded, works fine.
    select * FROM ( select segment2, accounted_dr, period_name from gl_je_lines a, gl_code_combinations b where b.code_combination_id = a.code_combination_id and segment2 >= :P4_EPSB_OBJECT_FROM and segment2 <= :P4_EPSB_OBJECT_TO and period_name IN :P4_EPSB_PERIOD_HOLD and segment4 >= :P4_EPSB_LOCATION_FROM and segment4 <= :P4_EPSB_LOCATION_TO ) PIVOT ( sum(accounted_dr) for period_name IN :P4_EPSB_PERIOD_HOLD )
Any advice as to how to tackle this would be most welcome and appreciated.
    Thanks

Does :P4_EPSB_PERIOD_HOLD come with single quotes? Like 'SEP-08' or SEP-08?
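If it does, one possible workaround (only a sketch of the "PL/SQL function body returning SQL query" source, trimmed to the relevant parts) is to concatenate the item's value into the string instead of embedding the bind name, because a single bind variable can't stand in for a whole comma-separated list, and the PIVOT ... FOR ... IN list in particular has to contain literal values:
declare
    q   varchar2 (4000);
begin
    q := q || ' select * from (';
    q := q || '   select segment2, accounted_dr, period_name';
    q := q || '   from   gl_je_lines a, gl_code_combinations b';
    q := q || '   where  b.code_combination_id = a.code_combination_id';
    q := q || '   and    segment2 between :P4_EPSB_OBJECT_FROM and :P4_EPSB_OBJECT_TO';
    q := q || '   and    period_name in (' || :P4_EPSB_PERIOD_HOLD || ')';  -- item value concatenated as a literal list
    q := q || ' )';
    q := q || ' pivot';
    q := q || ' (';
    q := q || '   sum (accounted_dr)';
    q := q || '   for period_name in (' || :P4_EPSB_PERIOD_HOLD || ')';     -- same quoted list again
    q := q || ' )';
    return q;
end;
Concatenating a page item straight into the statement does expose the query to SQL injection, so the item should come from a select list or be validated before it is used this way.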

  • Pivot query, phantom behavior, possible bug?

    Hi experts,
    Oracle 11g
    We are trying to construct a binary type table using a pivot query. (I know there will be no 0's, I will fill that in later.)
    The query to create this table is as follows:
    create table TEMP_T1 as
                select * from (
                select master_id, mbr_id
                        sub_id,
                        NVL(max(den_flag),0) DF,
                        NVL(max(num_flag),0) NF,
                        NVL(sum(den_flag),0) DXF,
                        NVL(sum(num_flag),0) NXF
                from MEM_SUM
                group by master_id, mbr_id,
                        sub_id )
                pivot ( max(DF) D_FLAG,
                          max(NF) N_FLAG,
                          sum(DXF) D_COUNT,
                          sum(NXF) N_COUNT
                          FOR sub_id in
                    ( 'MEAS-1' AS "MEAS1",'MEAS-2' AS "MEAS2",'MEAS-3' AS "MEAS3",'MEAS-4' AS "MEAS4"))
I am seeing unusual results when I run this query, and am unsure why. Wanted to get some thoughts.
    First issue:
    Although the query selects master_id and mbr_id, I only get master_id with the pivoted results (when it should be master_id, mbr_id & pivot results). Not sure why.
    Second issue:
    And this is where it gets even more strange, if I
    1) do something to the code to make it have a syntax problem, re-run, it will fail, naturally.
    then
    2) fix the syntax back to the original query above, it will run and return master_id, mbr_id & the pivoted results.
    Has anyone encountered such a strange issue before? Any suggestions on a solution welcome.
    Edited by: chris001 on Feb 22, 2013 8:09 AM

    I've experienced a similar problem with my FaderPort controller but only under odd circumstances.
    If I accidentally unplugged my FaderPort and then plugged it back in, the Control Surfaces preferences that I had assigned it would get deleted and the MIDI messages would come through as note data and other really weird stuff depending on what track I had selected in the arrange window...
    What I noticed was that when I just went in and re-did my Control Surfaces stuff for that controller it would reset and work perfectly.
Eventually I had to contact PreSonus and let them know what was happening. Turns out it was a problem with the driver, and the driver had to be updated to accommodate 8.0.2.
I can only imagine that this is still the case for many companies (including the ones who made your volume pedal) since Logic 9 came so suddenly, without even an 8.1 ever being released.
Try contacting that company and seeing if there are any known issues with the driver for the device and Logic 9... might help, can't hurt.

  • Power Pivot Bug: Can't Save Edits to Power Pivot Query without checking Teradata Connection's "Save my password" Option

Can't edit and re-save a Power Pivot query with a Teradata connection (using the LDAP security mechanism) unless the "Save my password" option was checked. Without this option I receive this error upon attempting to save changes made to the query ...
    The following exception occurred while the managed IDbConnection interface was being used: [TeraGSS Security Library] [115022] Exception occurred in TERAGSS layer.  See inner exception for details.;TdgssAuthenticationTokenExchange delegate threw an exception.
     See the inner exception for details.
    ErrorCode: -452984642 Severity: Error Facility: DotNet
    [Teradata Database] [8017] The UserId, Password or Account is invalid..
    A connection could not be made to the data source with the DataSourceID of '6a06124c-496f-4887-bf39-d2582173d499', Name of 'Teradata fsltdprd'.
    An error occurred while processing table 'Query'.
    The current operation was cancelled because another operation in the transaction failed.

Sorry, you're right, the Office category isn't currently accepting bug reports, in which case Olaf's suggestion to use the smiley face is the way to go. In Excel please go to File > Options > Trust Centre > 'Trust Centre Settings...' and check
that the Feedback Tool is enabled.
    If the option is greyed out like this...
    ... then you should be able to enable it by changing the value of the 'Enabled' registry key from a 0 to a 1 which you will find under: HKEY_CURRENT_USER\Software\Microsoft\Office\15.0\Common\Feedback. You will then need to close all Office applications
    and re-launch Excel 2013.
Once the Feedback Tool has been enabled, you should see the smiley face(s) in the top right hand corner of the Excel window and be able to send your feedback.
    Regards,
    Michael
    Please remember to mark a post that answers your question as an answer...If a post doesn't answer your question but you've found it helpful, please remember to vote it as helpful :)
    Website: nimblelearn.com, Blog:
    nimblelearn.com/blog, Twitter:
    @nimblelearn

  • Setting Column Names in Dynamic Pivot Query

    Hi all,
    I'm having trouble setting column names in a dynamic pivot query and was wondering if someone could please help me figure out what I need to do.
    To help you help me, I've setup an example scenario in my hosted account. Here's the login info for my hosted site at [http://apex.oracle.com]
    Workspace: MYHOSTACCT
    Username : DEVUSER1
Password : MYDEVACCT
And, here is my test application info:
    ID     : 42804
    Name   : dynamic query test
    Page   : 1
    Table 1: PROJECT_LIST         (Alias = PL...  Listing of Projects)
    Table 2: FISCAL_YEAR          (Alias = FY...  Lookup table for Fiscal Years)
    Table 3: PROJECT_FY           (Alias = PF...  Intersection table containing project fiscal years)
Table 4: PROJECT_FY_HEADCOUNT (Alias = PFH... Intersection table containing headcount per project and fiscal year)
Please forgive the excessive normalization for this example, as I wanted to keep the table structure similar to my real application, which has much more going on.
    In my sample, I have the "Select Criteria" region, where the user specifies the project and fiscal year range that he or she would like to report. Click the Search button, and the report returns the project headcount in a pivoted fashion for the fiscal year range specified.
    I've got it working using a hard-coded query, which is displayed in the "Hardcoded Query" region. In this query, I basically return all years, and set conditions on each column which determines whether that column should be displayed or not based on the range selected by the user. While this works, it is not ideal, as there could be many more fiscal years to account for, and this is not very dynamic at all. Anytime a fiscal year is added to the FISCAL_YEAR table, I'd have to update this page.
    So, after reading all of the OTN SQL pivot forums and "Ask Tom" pivot thread, I've been able to create a second region labeled "Dynamic Query" in which I've created a dynamic query to return the same results. This is a much more savvy solution and works great; however, the column names are generic in the report.
    I had to set the query to parse at runtime since the column selection list is dynamic, which violates SQL rules. Can anyone please help me figure out how I can specify my column names in the dynamic query region to get the same column values I'm getting in the hardcoded region?
    Please let me know if you need anymore information, and many thanks in advance!
    Mark

    Hi Tony,
    Thanks so much for your response. I've had to study up on the dbms_sql package to understand your function... first time I've used it. I've fed my dynamic query to your function and see that it returns a colon delimited list of the column names; however, I think I need a little more schooling on how and where exactly to apply the function to actually set the column names in APEX.
    From my test app, here is the code for my dynamic query. I've got it in a "PL/SQL function body returning sql query" region:
    DECLARE 
      v_query      VARCHAR2(4000);
      v_as         VARCHAR2(4);
      v_range_from NUMBER;
      v_range_to   NUMBER;         
    BEGIN
      v_range_from := :P1_FY_FROM;
      v_range_to   := :P1_FY_TO;
      v_query      := 'SELECT ';
      -- build the dynamic column selections by looping through the fiscal year range.
      -- v_as is meant to specify the column name as (FY10, FY11, etc.), but it's not working.
      FOR i IN v_range_from.. v_range_to  LOOP
        v_as    := 'FY' || SUBSTR(i, 3, 4);
        v_query := v_query || 'MAX(DECODE(FY_NB,' || i || ',PFH_HEADCOUNT,0)) '
          || v_as || ',';
      END LOOP;
      -- add the rest of the query to the dynamic column selection
      v_query := rtrim(v_query,',') || ' FROM ('
        || 'SELECT FY_NB, PFH_HEADCOUNT FROM ('
        || 'SELECT FY_ID, FY_NB FROM FISCAL_YEAR) A '
        || 'LEFT OUTER JOIN ('
        || 'SELECT FY_ID, PFH_HEADCOUNT '
        || 'FROM PROJECT_FY_HEADCOUNT '
        || 'JOIN PROJECT_FY USING (PF_ID) '
        || 'WHERE PL_ID = ' || :P1_PROJECT || ') B '
        || 'ON A.FY_ID = B.FY_ID)';
      RETURN v_query;
END;
I need to invoke GET_QUERY_COLS(v_query) somewhere to get the column names, but I'm not sure where I need to call it and how to actually set the column names after getting the returned colon-delimited list.
    Can you (or anyone else) please help me get a little further? Once again, feel free to login to my host account to see it first hand.
    Thanks again!
    Mark

  • APEX 3.2 -ORACLE 10G - PIVOT QUERY

Hello, I searched around the forum and I couldn't find an answer to this specific matter, although I saw some replies that were close...
I need to create a form based on a pivot query, but Oracle 10g doesn't support that feature, so I hope someone can help me.
My problem is that the number of columns will be variable. Here's an example:
    ORIGINAL TABLE
    NAME     KMS     VEHICLE
    Joe     100     AUDI
    Tom     300     VW
    Mark     150     FORD
    Ann     250     FORD
    Joe     200     VW
    Tom     123     AUDI
    Mark     345     AUDI
    Ann     45     VW
    Joe     6     FORD
    Tom     67     FORD
    Mark     46     VW
    Ann     99     AUDI
    DESIRED RESULT
    Joe     Tom     Mark     Ann     Vehicle
    100     123     345     99     AUDI
    6     67     150     250     FORD
    200     300     46     45     VW
The new columns will be the values in the old NAME column. BUT these values are variable: today it's Joe, Tom, Mark and Ann; tomorrow it could be Silvia, Tony, Richard, Harry, William and Jane. This means the usual replies I saw, using MAX and DECODE, will not apply, because I never know what values, or how many values, are in this column. With PIVOT I can get this done... How can I do this in Oracle 10g? Is there a way to create a user-defined pivot function somehow? Ideas?
    thanks!
    Mark Pereira
    Edited by: 899716 on Jul 18, 2012 12:02 PM

    This is the Oracle Forms forum. Post your question in the SQL forum.
    Tip: check the latest Oracle Magazine (July/August 2012). There is an article by Tom Kyte about the same question.
    http://www.oracle.com/technetwork/oramag/magazine/home/index.html
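For reference, the static 10g technique mentioned in the question looks like the sketch below (the table name original_table is only a placeholder, since the real name wasn't posted, and the four names from the sample data are hard-coded, which is exactly the limitation a variable name list runs into; a variable column list still needs dynamically built SQL):
select   vehicle
,        max (decode (name, 'Joe',  kms)) as joe
,        max (decode (name, 'Tom',  kms)) as tom
,        max (decode (name, 'Mark', kms)) as mark
,        max (decode (name, 'Ann',  kms)) as ann
from     original_table
group by vehicle
order by vehicle;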

  • Oracle 11 Pivot Query

    Hi,
    I have 2 questions on the pivot query below:
1 - I want to have totals by Country and Year
2 - I want to allow the use of a dynamic year range (like year between 1990 and 2000)
    Thanks,
    Ribhi
select * from (
select CNTRY_NAME, MYYEAR, AM_AMOUNT
from yearly_lending t
)
pivot
(
sum(AM_AMOUNT)
for MYYEAR in (1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998,1999,2000,2001,2002,2003,2004,2005,2006,2007,2008,2009)
)
order by CNTRY_NAME

    Hi all,
    Thank you for your help.
The query is working as expected, but I want to add totals by country and year:
    Country 1974 1975 1976 1977 1977 1978...2009 Total
    Jordan 10 5 0 3 5 1 24
    Egypt 5 0 0 0 10 0 15
    Syria 8 2 10 20
    Total 23 7 0 3 15 11 59
I would also like to select a year range rather than entering it year by year, for example from year 2000 to year 2009.
I created a view from the query below and then ran my pivot query against it.
    select
    rtrim(cntry.short_name_e) cntry_name,
    to_char(e.date0,'yyyy') myYear,
    am.value / 1000000 am_amount,
    a.fk_countrycode a_cntry
    from event e,
    agreement_amount am,
    agreement a, in_country cntry
    where
    a.number0 = e.fk_agreementnumber
    AND A.SUB_NUMBER <>'P'
    AND a.fk_countrycode = cntry.intl_code
    and a.sub_number = e.fk_agreementsub_nu
    and a.type = e.fk_agreementtype
    /*and rtrim(a.status) <> 'CANCELLED' */
    and rtrim(e.type) = 'SIGNING'
    and rtrim(e.fk_agreementtype) <> 'GRANT'
    and e.fk_agreementnumber = am.fk_agreementnumber
    and e.fk_agreementsub_nu = am.fk_agreementsub_nu
    and e.fk_agreementtype = am.fk_agreementtype
    and am.serial_number = 1
    order by rtrim(cntry.short_name_e) ,to_char(e.date0,'yyyy')
    Best regards,
    Ribhi
    Edited by: Ribhi on Apr 29, 2009 7:20 PM
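One way to get both kinds of totals without PIVOT is conditional aggregation plus ROLLUP; a sketch for a 2000-2009 range is below (it assumes the columns CNTRY_NAME, MYYEAR and AM_AMOUNT used in the posted pivot query, and since MYYEAR comes from TO_CHAR it is compared as a string):
select   nvl (cntry_name, 'Total')                          as cntry_name
,        sum (case when myyear = '2000' then am_amount end) as "2000"
,        sum (case when myyear = '2001' then am_amount end) as "2001"
         -- ... one CASE column per year in the chosen range ...
,        sum (case when myyear = '2009' then am_amount end) as "2009"
,        sum (am_amount)                                    as total
from     yearly_lending
where    myyear between '2000' and '2009'
group by rollup (cntry_name)
order by grouping (cntry_name), cntry_name;
The ROLLUP row supplies the per-year totals, the extra SUM(am_amount) column supplies the per-country total, and the year range appears only in the WHERE clause and the column list; a true PIVOT version would still need the year list written out, or generated with dynamic SQL as discussed in the other threads here.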

  • Pivot query problem

Hi, I am trying to do a pivot report. First I did a simple pivot, hard-coding the columns that I was about to use, but it is not convenient, so I was asked to do a dynamic report where the columns are selected by a sub-query. I followed the solution from this post:
Pivot query using XML option in APEX, but I get this error: ORA-00932: inconsistent datatypes: expected - got CLOB. Now I know that by default that query is set to return CLOB, but the thing is that when I specify the return value as varchar2 my browser freezes and I cannot go any further than that; I need to reboot my browser. I'm using Oracle 11g and APEX 3.2.1. Here is my query:
    SELECT
         xmlserialize(CONTENT DEPARTMENT_XML as varchar2(4000)) XML
FROM
            (
            SELECT
               DEPARTMENT,
               S_GROUP AS "S GROUP",
               S_GROUP
            FROM   MYTABLE  where PER = 'BNA' and department not like 'SI%'
            )
   PIVOT XML(
                   COUNT(S_GROUP)
                   FOR DEPARTMENT      
                   IN  (select distinct department from MYTABLE )          )
    ORDER BY 1
    Thank you.

Because executing the query without the ROW_NUMBER gives the results in milliseconds but the row_number takes it 7+ min.
I think you mean that executing the query without ROW_NUMBER gives the first rows in milliseconds. However, if you wait until the last row, you may find that it takes several minutes as well. If this is not the case, please post some more details.
And, as BluShadow already said, an additional ordering takes some time, so it will be a little slower, but probably not this much. If you really want to know where time is being spent, then I suggest taking a look at [url http://forums.oracle.com/forums/thread.jspa?threadID=501834&tstart=0]this thread, and start measuring.
    Regards,
    Rob.

  • Help for a query to add columns

    Hi,
I need a query where I add each TableC value as an additional column.
    Please suggest...
    I have 3 tables (TableA, TableB, TableC). TableB stores TableA Id and TableC stores TableB Id
    Considering Id of TableA.
    Sample data
    TableA     :          
    ID     NAME     TABLENAME     ETYPE
    23     Name1     TABLE NAMEA     Etype A
    TableB     :          
    ID     A_ID     RTYPE     RNAME
    26     23     RTYPEA     RNAMEA
    61     23     RTYPEB     RNAMEB
    TableC     :          
    ID     B_ID     COMPNAME     CONC
    83     26     Comp Name AA     1.5
    46     26     Comp Name BB     2.2
    101     61     Comp Name CC     4.2
Scenario 1: As per the above sample data, put each TableC value as an additional column.
    For an Id in TableA(23) where TableB contains 2 records of A_ID (26, 61) and TableC contains 2 records for 26 and 1 record for 61.
    Output required: Put each TABLEC value as an additional column
    TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME     TableC_1_CONC TableC_2_COMPNAME     TableC_2_CONC     
    Name1 EtypeA RTypeA Comp Name AA 1.5 Comp Name BB 2.2     so on..
    Name1 EtypeA RTypeB Comp Name CC 4.2 NULL NULL     
Scenario 2: If TableC contains ONLY 1 row for each Id in TableB, the output should be something like:
    Output:
TableA.NAME TableA.ETYPE TableB.RTYPE TableC_1_COMPNAME TableC_1_CONC
value       value        value        value             value

    Hi,
    Welcome to the forum!
    Do you want the data from TableC presented
    (1) in one column, or
    (2) in several columns (a different column of results for each row in the original TableC)?
    (1) Is called String Aggregation and is easier than (2).
    The best way to do this is with a user-defined aggregate function (STRAGG) which you can copy from asktom.
    Ignoring TableA for now, you could get what you want by saying
    SELECT    b.rtype
    ,         STRAGG (   c.compname
                     || ' '
                     || c.conc
                     )  AS c_data
    FROM      TableB  b
    JOIN      TableC  c  ON b.id  = c.b_id
GROUP BY  b.rtype;
(2) Presenting N rows of TableC as if they were N columns of the same row is called a pivot. Search for "pivot" or "rows to columns" to find examples of how to do this.
    The number of columns in a result set is hard-coded into the query. If you don't know ahead of time how many rows in TableC will match a row in TableB, you can:
    (a) guess high (for example, hard-code 20 columns and let the ones that never contain a match be NULL) or,
    (b) use Dynamic SQL to write a query for you, which has exactly as many columns as you need.
    The two scripts below contain basic information on pivots.
    This first script is similar to what you would do for case (a):
    --     How to Pivot a Result Set (Display Rows as Columns)
    --     For Oracle 10, and earlier
    --     Actually, this works in any version of Oracle, but the
    --     "SELECT ... PIVOT" feature introduced in Oracle 11
    --     is better.  (See Query 2, below.)
    --     This example uses the scott.emp table.
    --     Given a query that produces three rows for every department,
    --     how can we show the same data in a query that has one row
    --     per department, and three separate columns?
--     For example, the query below counts the number of employees
--     in each department that have one of three given jobs:
    PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
    SELECT     deptno
    ,     job
    ,     COUNT (*)     AS cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ,          job;
    Output:
        DEPTNO JOB              CNT
            20 CLERK              2
            20 MANAGER            1
            30 CLERK              1
            30 MANAGER            1
            10 CLERK              1
            10 MANAGER            1
            20 ANALYST            2
    PROMPT     ==========  1. Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno;
    --     Output:
        DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
            30           0          1           1
            20           2          2           1
            10           0          1           1
    --     Explanation
    (1) Decide what you want the output to look like.
         (E.g. "I want a row for each department,
         and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt)
    (2) Get a result set where every row identifies which row
         and which column of the output will be affected.
         In the example above, deptno identifies the row, and
         job identifies the column.
         Both deptno and job happened to be in the original table.
         That is not always the case; sometimes you have to
         compute new columns based on the original data.
    (3) Use aggregate functions and CASE (or DECODE) to produce
         the pivoted columns. 
         The CASE statement will pick
         only the rows of raw data that belong in the column.
         If each cell in the output corresponds to (at most)
         one row of input, then you can use MIN or MAX as the
         aggregate function.
         If many rows of input can be reflected in a single cell
         of output, then use SUM, COUNT, AVG, STRAGG, or some other
         aggregate function.
         GROUP BY the column that identifies rows.
    PROMPT     ==========  2. Oracle 11 PIVOT  ==========
    WITH     e     AS
    (     -- Begin sub-query e to SELECT columns for PIVOT
         SELECT     deptno
         ,     job
         FROM     scott.emp
    )     -- End sub-query e to SELECT columns for PIVOT
    SELECT     *
    FROM     e
PIVOT     (     COUNT (*)
          FOR     job     IN     ( 'ANALYST'     AS analyst
                         , 'CLERK'     AS clerk
                         , 'MANAGER'     AS manager
                         )
          )
;
    NOTES ON ORACLE 11 PIVOT:
    (1) You must use a sub-query to select the raw columns.
    An in-line view (not shown) is an example of a sub-query.
    (2) GROUP BY is implied for all columns not in the PIVOT clause.
    (3) Column aliases are optional. 
    If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).
    The second script, below, shows one way of doing a dynamic pivot in SQL*Plus:
    How to Pivot a Table with a Dynamic Number of Columns
    This works in any version of Oracle
    The "SELECT ... PIVOT" feature introduced in Oracle 11
    is much better for producing XML output.
    Say you want to make a cross-tab output of
    the scott.emp table.
    Each row will represent a department.
    There will be a separate column for each job.
    Each cell will contain the number of employees in
         a specific department having a specific job.
    The exact same solution must work with any number
    of departments and columns.
    (Within reason: there's no guarantee this will work if you
    want 2000 columns.)
    Case 0 "Basic Pivot" shows how you might hard-code three
    job types, which is exactly what you DON'T want to do.
    Case 1 "Dynamic Pivot" shows how get the right results
    dynamically, using SQL*Plus. 
    (This can be easily adapted to PL/SQL or other tools.)
    PROMPT     ==========  0. Basic Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
ORDER BY     deptno;
    PROMPT     ==========  1. Dynamic Pivot  ==========
    --     *****  Start of dynamic_pivot.sql  *****
    -- Suppress SQL*Plus features that interfere with raw output
    SET     FEEDBACK     OFF
    SET     PAGESIZE     0
    SPOOL     p:\sql\cookbook\dynamic_pivot_subscript.sql
    SELECT     DISTINCT
         ',     COUNT (CASE WHEN job = '''
    ||     job
    ||     ''' '     AS txt1
    ,     'THEN 1 END)     AS '
    ||     job
    ||     '_CNT'     AS txt2
    FROM     scott.emp
    ORDER BY     txt1;
    SPOOL     OFF
    -- Restore SQL*Plus features suppressed earlier
    SET     FEEDBACK     ON
    SET     PAGESIZE     50
    SPOOL     p:\sql\cookbook\dynamic_pivot.lst
    SELECT     deptno
    @@dynamic_pivot_subscript
    FROM     scott.emp
    GROUP BY     deptno
ORDER BY     deptno;
    SPOOL     OFF
    --     *****  End of dynamic_pivot.sql  *****
    EXPLANATION:
    The basic pivot assumes you know the number of distinct jobs,
    and the name of each one.  If you do, then writing a pivot query
is simply a matter of writing the correct number of ", COUNT ... AS ..."
    lines, with the name entered in two places on each one.  That is easily
    done by a preliminary query, which uses SPOOL to write a sub-script
    (called dynamic_pivot_subscript.sql in this example).
    The main script invokes this sub-script at the proper point.
    In practice, .SQL scripts usually contain one or more complete
    statements, but there's nothing that says they have to.
    This one contains just a fragment from the middle of a SELECT statement.
    Before creating the sub-script, turn off SQL*Plus features that are
    designed to help humans read the output (such as headings and
feedback messages like "7 rows selected."), since we do not want these
to appear in the sub-script.
    Turn these features on again before running the main query.
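As a sketch of the PL/SQL adaptation mentioned above: the same column list can be built in a loop, and the finished statement handed to whatever will run it (here it is only printed with DBMS_OUTPUT):
DECLARE
    v_sql     VARCHAR2 (4000);
BEGIN
    v_sql := 'SELECT deptno';
    FOR r IN (SELECT DISTINCT job FROM scott.emp ORDER BY job)
    LOOP
        -- one ", COUNT (CASE ...) AS <job>_cnt" term per distinct job
        v_sql := v_sql
              || ', COUNT (CASE WHEN job = '''
              || r.job
              || ''' THEN 1 END) AS '
              || r.job
              || '_cnt';
    END LOOP;
    v_sql := v_sql || ' FROM scott.emp GROUP BY deptno ORDER BY deptno';
    DBMS_OUTPUT.PUT_LINE (v_sql);    -- or OPEN a ref cursor FOR v_sql
END;
/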
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     
