Help in Pivot query

Hi All,
For the data given below, I want the query result to show all months in the output, even if there is no data for that month.
If there is no data for a month it should display 0 as the amount.
Please help me in building this query.
Version: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
create table ledger
(id number, month varchar2(3), year number, amount real);
begin
insert into ledger values (1,'jan',1980,100);
insert into ledger values (2,'jan',1980,200);
insert into ledger values (3,'mar',1980,300);
insert into ledger values (4,'mar',1980,1000);
insert into ledger values (5,'jul',1980,400);
insert into ledger values (6,'jun',1980,500);
insert into ledger values (7,'oct',1980,700);
insert into ledger values (8,'oct',1980,600);
insert into ledger values (9,'oct',1980,200);
insert into ledger values (1,'jan',1981,100);
insert into ledger values (2,'jan',1981,200);
insert into ledger values (3,'mar',1981,400);
insert into ledger values (4,'mar',1981,100);
insert into ledger values (5,'jul',1981,300);
insert into ledger values (6,'jun',1981,200);
insert into ledger values (7,'oct',1981,100);
insert into ledger values (8,'oct',1981,100);
insert into ledger values (9,'oct',1980,100);
insert into ledger values (1,'jan',1982,100);
insert into ledger values (2,'jan',1982,200);
insert into ledger values (3,'mar',1982,400);
insert into ledger values (4,'mar',1982,100);
insert into ledger values (5,'jul',1982,300);
insert into ledger values (6,'jun',1982,200);
insert into ledger values (7,'oct',1982,100);
insert into ledger values (8,'oct',1982,100);
insert into ledger values (9,'oct',1982,100);
insert into ledger values (1,'jan',1984,100);
insert into ledger values (2,'jan',1984,200);
insert into ledger values (3,'mar',1984,300);
insert into ledger values (4,'mar',1984,1000);
insert into ledger values (5,'jul',1984,400);
insert into ledger values (6,'jun',1984,500);
insert into ledger values (7,'oct',1984,700);
insert into ledger values (8,'oct',1984,600);
insert into ledger values (9,'oct',1984,200);
end;
/
select  Initcap(month) "Year",
       sum(case  when year = 1980 then amount else 0 end ) "1980",
       sum(case  when year = 1981 then amount else 0 end ) "1981",
       sum(case  when year = 1982 then amount else 0 end ) "1982",
       sum(case  when year = 1983 then amount else 0 end ) "1983",
       sum(case  when year = 1984 then amount else 0 end ) "1984",
       sum(case  when year = 1985 then amount else 0 end ) "1985"
from ledger
group by  month;
Expected output
Year    1980      1981      1982      1983      1984      1985
Jan     amount    amount    amount    amount    amount    amount
Feb     amount    amount    amount    amount    amount    amount
Mar     amount    amount    amount    amount    amount    amount
Apr     amount    amount    amount    amount    amount    amount
May     amount    amount    amount    amount    amount    amount
Jun     amount    amount    amount    amount    amount    amount
Jul     amount    amount    amount    amount    amount    amount
Aug     amount    amount    amount    amount    amount    amount
Sep     amount    amount    amount    amount    amount    amount
Oct     amount    amount    amount    amount    amount    amount
Nov     amount    amount    amount    amount    amount    amount
Dec     amount    amount    amount    amount    amount    amount
Thanks,
Raghu

and in case you want a proper sorting order...
with months as
(
     select level l, to_char(add_months(trunc(sysdate, 'yy'), level-1), 'mon') m
     from dual
     connect by level <= 12
)
select max(Initcap(m.m)) "Year",
       sum(case when year = 1980 then amount else 0 end) "1980",
       sum(case when year = 1981 then amount else 0 end) "1981",
       sum(case when year = 1982 then amount else 0 end) "1982",
       sum(case when year = 1983 then amount else 0 end) "1983",
       sum(case when year = 1984 then amount else 0 end) "1984",
       sum(case when year = 1985 then amount else 0 end) "1985"
from ledger l, months m
where l.month (+) = m.m
group by m.l
order by m.l;
Year       1980       1981       1982       1983       1984       1985
Jan         300        300        300          0        300          0
Feb           0          0          0          0          0          0
Mar        1300        500        500          0       1300          0
Apr           0          0          0          0          0          0
May           0          0          0          0          0          0
Jun         500        200        200          0        500          0
Jul         400        300        300          0        400          0
Aug           0          0          0          0          0          0
Sep           0          0          0          0          0          0
Oct        1600        200        300          0       1500          0
Nov           0          0          0          0          0          0
Dec           0          0          0          0          0          0
12 rows selected
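For readers on 11g or later, the same report can also be written with the PIVOT clause. This is only a sketch (the poster's 10.2 database does not have PIVOT), and note that months with no data come back as NULL here unless you wrap the pivoted columns in NVL in an outer query:
with months as
(
     select level l,
            to_char(add_months(trunc(sysdate, 'yy'), level - 1), 'mon') m
     from dual
     connect by level <= 12
)
select *
from  (
       select m.l, initcap(m.m) mon, l.year, nvl(l.amount, 0) amount
       from   ledger l, months m
       where  l.month (+) = m.m
      )
pivot (
       sum(amount)
       for year in (1980, 1981, 1982, 1983, 1984, 1985)
      )
order by l;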

Similar Messages

  • Help in Pivot Query- To change the column data to rows data!

    Hello Gurus -
    I have to change the row to Column -
    When I use the query -
    select NVL (T2.NAME, ' Grand Total') AS State,count(T2.NAME) as Total
    from Defect T1,statedef T2,repoproject T3
    WHERE T1.STATE=T2.ID AND T1.repoproject = T3.dbid AND T3.name like '%Compass Juice' GROUP BY ROLLUP (T2.NAME)
    Then I have got the following data -
    STATE          TOTAL
    Analysis     17
    Closed          1302
    Development     9
    Duplicate     24
    Failed          2
    OnHold          4
    Opened          146
    QA          1
    ReadyForQA     1
    Withdrawn      335
    Grand Total     1841
    But I want the data in the following format -
    State Analysis     Closed     Development      Duplicate     Failed     OnHold     Opened     QA     ReadyForQA     Withdrawn     GrandTotal
    Total 17     1302     9          24          2     4     146     1     1          335          1841
    Kindly help me with this. I searched the forum and saw the usage of MAX, NVL and DECODE, but I am unable to understand how to use them in my query. Kindly help me with this.

    Hi,
    In 11g you can use pivot.
    [http://www.oracle.com/technology/pub/articles/oracle-database-11g-top-features/11g-pivot.html]
    example
    SQL> desc customers
    Name                                      Null?    Type
    CUST_ID                                            NUMBER(10)
    CUST_NAME                                          VARCHAR2(20)
    STATE_CODE                                         VARCHAR2(2)
    TIMES_PURCHASED                                    NUMBER(3)
    When this table is selected:
    select cust_id, state_code, times_purchased
    from customers
    order by cust_id;
    The output is:
    CUST_ID STATE_CODE TIMES_PURCHASED
          1 CT                       1
          2 NY                      10
          3 NJ                       2
          4 NY                       4
    ... and so on ...
    Note how the data is represented as rows of values: For each customer, the record shows the customer's home state and how many times the customer purchased something from the store. As the customer purchases more items from the store, the column times_purchased is updated.
    Now consider a case where you want to have a report of the purchase frequency for each state, that is, how many customers bought something only once, twice, thrice and so on, from each state. In regular SQL, you can issue the following statement:
    select state_code, times_purchased, count(1) cnt
    from customers
    group by state_code, times_purchased;
    Here is the output:
    ST TIMES_PURCHASED        CNT
    CT               0         90
    CT               1        165
    CT               2        179
    CT               3        173
    CT               4        173
    CT               5        152
    ... and so on ...
    This is the information you want but it's a little hard to read. A better way to represent the same data may be through the use of crosstab reports, in which you can organize the data vertically and states horizontally, just like a spreadsheet:
    Times_purchased
                 CT           NY         NJ      ... and so on ...
    1             0            1          0      ...
    2            23          119         37      ...
    3            17           45          1      ...
    ... and so on ...
    Prior to Oracle Database 11g, you would do that via some sort of a decode function for each value and write each distinct value as a separate column. The technique is quite nonintuitive however.
    Fortunately, you now have a great new feature called PIVOT for presenting any query in the crosstab format using a new operator, appropriately named pivot. Here is how you write the query:
    select * from (
       select times_purchased, state_code
       from customers t
    )
    pivot
    (
       count(state_code)
       for state_code in ('NY','CT','NJ','FL','MO')
    )
    order by times_purchased;
    Here is the output:
    . TIMES_PURCHASED       'NY'       'CT'       'NJ'       'FL'       'MO'
                  0      16601         90          0          0          0
                  1      33048        165          0          0          0
                  2      33151        179          0          0          0
                  3      32978        173          0          0          0
                  4      33109        173          0          1          0
    ... and so on ...
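    For comparison, the pre-11g approach mentioned above comes down to conditional aggregation. A minimal sketch against the same customers table, with the state list hard-coded (an illustration, not the article's exact code):
    select times_purchased,
           count(case when state_code = 'NY' then 1 end) as ny,
           count(case when state_code = 'CT' then 1 end) as ct,
           count(case when state_code = 'NJ' then 1 end) as nj,
           count(case when state_code = 'FL' then 1 end) as fl,
           count(case when state_code = 'MO' then 1 end) as mo
    from   customers
    group by times_purchased
    order by times_purchased;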

  • Help with Pivot Query

    I'm trying to write a pivot (by day) query that includes a count of events that happen based on a field not being null. In the same query I'd like to include the total events (null or not) so that I can get a ratio.
    i.e. this is what I have; how can I add the total count as a final column (whether g is null or not, for all days)?
    select x, y, z,
    sum (decode (to_char (timestamp, 'YYYY-MM-DD'), '2012-06-09', count)) JUNE_09,
    sum (decode (to_char (timestamp, 'YYYY-MM-DD'), '2012-06-10', count)) JUNE_10
    from ( select x, y, z, timestamp, count(*) count
           from ab
           where g is not null
           group by x, y, z, timestamp )
    group by x, y, z;
    This is 10g.
    Thanks!

    Hi,
    user12100488 wrote:
    I'm sorry I don't have DDL or data statements to post on this network.
    Write some and post them. I'm not saying this is trivial, I'm just saying that it's necessary.
    If you prefer, you can post a WITH clause that contains your sample data, like GVR did above.
    See the forum FAQ {message:id=9360002} for examples of both.
    Either way, post the results you want from that sample data. Include at least one row where the date is outside the the range in which you're interested.
    Thanks, Frank. I need the g_cnt as a count where g is NOT NULL for every distinct x,y,z per day. I have this now in my original statement. I need to add a total count for every distinct x,y,z, whether it is null or not, to get a ratio. This is currently written in PL/SQL but I believe it can be done in one statement.
    Right; there's no need for PL/SQL just to get a ratio. In fact, even if you needed to do this query in PL/SQL, for some other reason, you would almost certainly want to compute the ratios in the query itself.
    I'm looking for output like
    X Y Z June 9 June10 June 11 Total (g) Total (all days)
    g_cnt g_cnt total whether null or not
    Once again, you need to post some sample data and the results you want from that data.
    If you want numbers in the output, then post numbers, not "g_cnt". It's great to include comments such as "total whether null or not", but do so in addition to (not instead of) posting the results you actually want.
    If you want ratios, include them in the results you post.
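    For what it is worth, here is a hedged sketch of the kind of single statement being described, using only the table and column names mentioned in the post (untested, since no sample data was supplied):
    select x, y, z,
           count(case when to_char(timestamp, 'YYYY-MM-DD') = '2012-06-09'
                       and g is not null then 1 end)                        as june_09_g,
           count(case when to_char(timestamp, 'YYYY-MM-DD') = '2012-06-10'
                       and g is not null then 1 end)                        as june_10_g,
           count(case when g is not null then 1 end)                        as total_g,
           count(*)                                                         as total_all,
           count(case when g is not null then 1 end) / nullif(count(*), 0)  as g_ratio
    from   ab
    group by x, y, z;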

  • Help on Pivot Query - 11g

    I am working on Oracle 11g database.
    I am trying to join two tables having the following data
    tab1
    contact_id
    full_name
    email_id
    tel_no
    tab2
    scr_id
    org_id (foreign key to org table)
    contact_id (foreign key to contact_id in tab1)
    contact_type (will take values 'ID','IDE','DC')
    So an Organization can have multiple contact types like 'ID','IDE','DC'.
    I want them to be displayed as
    org_id
    id_full_name
    id_email_id
    id_tel_no
    ide_full_name
    ide_email_id
    ide_tel_no
    dc_full_name
    dc_email_id
    dc_tel_no
    I am trying to use code similar to this (this only produces full name values for the 3 contact types, whereas I would need to produce email id and tel no as columns too):
            SELECT *
            FROM  (
                   SELECT  org_id,
                           full_name,
                           contact_type
                   FROM    tab1,
                           tab2
                   WHERE   tab1.contact_id = tab2.contact_id
                  )
            PIVOT (
                   MAX(full_name)
                   FOR contact_type
                   IN ('ID','IDE','DC')
                  )
    Any suggestions?

    Hi,
    Instead of a single aggregate function at the beginning of the PIVOT clause, use a comma-delimited list of aggregate functions.
    Since you didn't post CREATE TABLE and INSERT statements for any sample data, I'll use scott.emp to illustrate.
    Say we want to see the ename, sal and hiredate for the first 3 employees (in alphabetic order) in each job:
    WITH got_r_num AS
    (
         SELECT  ename, job, sal, hiredate
         ,       ROW_NUMBER () OVER ( PARTITION BY  job
                                      ORDER BY      ename
                                    )  AS r_num
         FROM    scott.emp
    )
    SELECT    *
    FROM      got_r_num
    PIVOT     (    MAX (ename)     AS ename
              ,    MAX (sal)       AS sal
              ,    MAX (hiredate)  AS hiredate
              FOR  r_num IN (1, 2, 3)
              )
    ORDER BY  job
    ;
    Output:
    JOB       1_ENAM 1_SAL 1_HIREDAT 2_ENAM 2_SAL 2_HIREDAT 3_ENAM 3_SAL 3_HIREDAT
    ANALYST   FORD    3000 03-DEC-81 SCOTT   3000 19-APR-87
    CLERK     ADAMS   1100 23-MAY-87 JAMES    950 03-DEC-81 MILLER  1300 23-JAN-82
    MANAGER   BLAKE   2850 01-MAY-81 CLARK   2450 09-JUN-81 JONES   2975 02-APR-81
    PRESIDENT KING    5000 17-NOV-81
    SALESMAN  ALLEN   1600 20-FEB-81 MARTIN  1250 28-SEP-81 TURNER  1500 08-SEP-81 
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables, and also post the results you want from that data.
    Explain, using specific examples, how you get those results from that data.
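    Applied to the tables described in the question, the same multi-aggregate PIVOT might look like the sketch below (column names are taken from the post and it is untested). It returns one row per org_id, with generated columns such as ID_FULL_NAME, IDE_EMAIL_ID and DC_TEL_NO:
    SELECT  *
    FROM   (
            SELECT  t2.org_id,
                    t1.full_name,
                    t1.email_id,
                    t1.tel_no,
                    t2.contact_type
            FROM    tab1 t1,
                    tab2 t2
            WHERE   t1.contact_id = t2.contact_id
           )
    PIVOT  (
            MAX (full_name)  AS full_name,
            MAX (email_id)   AS email_id,
            MAX (tel_no)     AS tel_no
            FOR contact_type IN ('ID' AS id, 'IDE' AS ide, 'DC' AS dc)
           );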

  • Need help on pivot query

    Hi All,
      I have data like this
    sdate       sday        Empname  empprofile
    2014-07-06  Sunday      xxx      7a7p - RN
    2014-07-06  Sunday      xxx      7a7p - RN
    2014-07-06  Sunday      xxx      7a7p - RN
    2014-07-06  Sunday      xxx      7a7p - RN
    2014-07-07  Monday      xxx      7a7p - RN
    2014-07-07  Monday      xxx      7a7p - RN
    2014-07-07  Monday      xxx      7a7p - RN
    2014-07-07  Monday      xxx      7a7p - RN
    2014-07-08  Tuesday     xxx      7a7p - RN
    2014-07-08  Tuesday     xxx      7a7p - RN
    2014-07-08  Tuesday     xxx      7a7p - RN
    2014-07-08  Tuesday     xxx      7a7p - RN
    2014-07-09  Wednesday   xxx      7a7p - RN
    2014-07-09  Wednesday   xxx      7a7p - RN
    2014-07-09  Wednesday   xxx      7a7p - RN
    2014-07-09  Wednesday   xxx      7a7p - RN
    2014-07-10  Thursday    xxx      7a7p - RN
    2014-07-10  Thursday    xxx      7a7p - RN
    2014-07-10  Thursday    xxx      7a7p - RN
    2014-07-11  Friday      xxx      7a7p - RN
    2014-07-11  Friday      xxx      7a7p - RN
    2014-07-11  Friday      xxx      7a7p - RN
    2014-07-12  Saturday    xxx      7a7p - RN
    2014-07-12  Saturday    xxx      7a7p - RN
    2014-07-12  Saturday    xxx      7a7p - RN
    2014-07-12  Saturday    xxx      7a7p - RN
    I need to display the data for 7 days as:
    2014-07-06                         2014-07-07                2014-07-08
    Sunday                                Monday                      Tuesday    
    Empname   Empprofile    Empname  Empprofile    ..................
    xxx        7a7p - RN           xxx            7a7p - RN     .................
    xxx        7a7p - RN           xxx            7a7p - RN     ................
    xxx        7a7p - RN           xxx            7a7p - RN    ................
    Waiting for valuable replies

    One way would be to filter [sdate] for a specific range and let the reporting tool do the pivoting. Another way would be to use the traditional way to pivot data by grouping (GROUP BY), spreading (CASE expressions) and aggregating (MIN / MAX / COUNT / etc.).
    If the date range is not static then you will have to use dynamic pivoting and the code can get ugly.
    Here is an example for the static and the dynamic versions.
    SET NOCOUNT ON;
    USE tempdb;
    GO
    CREATE TABLE #T (
    sdate date,
    sday varchar(15),
    Empname varchar(50),
    empprofile varchar(50)
    );
    INSERT INTO #T
    (sdate, sday, Empname, empprofile)
    VALUES
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-06', 'Sunday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-07', 'Monday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-08', 'Tuesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-09', 'Wednesday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-10', 'Thursday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-11', 'Friday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN'),
    ('2014-07-12', 'Saturday', 'xxx ', '7a7p - RN');
    GO
    WITH C1 AS (
    SELECT
    sdate, sday, Empname, empprofile,
    ROW_NUMBER() OVER(PARTITION BY sdate ORDER BY empprofile) AS rn
    FROM
    #T
    )
    SELECT
    MAX(CASE WHEN sdate = '20140706' THEN sdate END) AS [sdate - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN sday END) AS [sday - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN Empname END) AS [Empname - 20140706],
    MAX(CASE WHEN sdate = '20140706' THEN empprofile END) AS [empprofile - 20140706],
    MAX(CASE WHEN sdate = '20140707' THEN sdate END) AS [sdate - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN sday END) AS [sday - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN Empname END) AS [Empname - 20140707],
    MAX(CASE WHEN sdate = '20140707' THEN empprofile END) AS [empprofile - 20140707],
    MAX(CASE WHEN sdate = '20140708' THEN sdate END) AS [sdate - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN sday END) AS [sday - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN Empname END) AS [Empname - 20140708],
    MAX(CASE WHEN sdate = '20140708' THEN empprofile END) AS [empprofile - 20140708],
    MAX(CASE WHEN sdate = '20140709' THEN sdate END) AS [sdate - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN sday END) AS [sday - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN Empname END) AS [Empname - 20140709],
    MAX(CASE WHEN sdate = '20140709' THEN empprofile END) AS [empprofile - 20140709],
    MAX(CASE WHEN sdate = '20140710' THEN sdate END) AS [sdate - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN sday END) AS [sday - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN Empname END) AS [Empname - 20140710],
    MAX(CASE WHEN sdate = '20140710' THEN empprofile END) AS [empprofile - 20140710],
    MAX(CASE WHEN sdate = '20140711' THEN sdate END) AS [sdate - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN sday END) AS [sday - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN Empname END) AS [Empname - 20140711],
    MAX(CASE WHEN sdate = '20140711' THEN empprofile END) AS [empprofile - 20140711],
    MAX(CASE WHEN sdate = '20140712' THEN sdate END) AS [sdate - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN sday END) AS [sday - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN Empname END) AS [Empname - 20140712],
    MAX(CASE WHEN sdate = '20140712' THEN empprofile END) AS [empprofile - 20140712]
    FROM
    C1
    GROUP BY
    rn;
    GO
    DECLARE
    @sdt date = '20140706',
    @edt date = '20140712';
    DECLARE @sql nvarchar(MAX) = N'
    WITH C1 AS (
    SELECT
    sdate, sday, Empname, empprofile,
    ROW_NUMBER() OVER(PARTITION BY sdate ORDER BY empprofile) AS rn
    FROM
    #T
    )
    SELECT ';
    SET @sql = @sql + STUFF((
    SELECT
    ', MAX(CASE WHEN CONVERT(char(8), sdate, 112) + '' - '' + sday = ''' + CONVERT(char(8), sdate, 112) + ' - ' + sday + N''' THEN Empname END) AS ' + QUOTENAME(CONVERT(char(8), sdate, 112) + ' - ' + sday + ' - Empname') +
    ', MAX(CASE WHEN CONVERT(char(8), sdate, 112) + '' - '' + sday = ''' + CONVERT(char(8), sdate, 112) + ' - ' + sday + N''' THEN empprofile END) AS ' + QUOTENAME(CONVERT(char(8), sdate, 112) + ' - ' + sday + ' - empprofile')
    FROM
    #T
    WHERE
    sdate BETWEEN @sdt AND @edt
    GROUP BY
    sdate,
    sday
    ORDER BY
    sdate
    FOR XML PATH('')
    ), 1, 1, '');
    SET @sql = @sql + N' FROM C1 GROUP BY rn;';
    PRINT @sql;
    EXEC sp_executesql @sql;
    GO
    DROP TABLE #T;
    GO
    I would suggest sticking to the reporting tool for this kind of work.
    AMB
    Some guidelines for posting questions...

  • 11G Pivot Query with Oracle EBS

    Hello all,
    We are trying to use the 11G pivot query function with data from Oracle E-Business Suite. We have an 11G database installed with our Oracle APEX. We cannot seem to get the pivot function to work. At a glance, would anyone be able to see any glaring errors in our syntax. I am not certain it is possible to provide test data so...
    We are trying to have column headings with the Period Names SEP-08 OCT-08 NOV-08, with rows of segment2 007751 and accounted_dr as the dataset.
    When we run the sql we get an error ORA-00904: "PERIOD_NAME": invalid identifier.
    Any help or insight would be greatly appreciated.
    select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 = '007751')
    pivot
    sum(accounted_dr)
    for period_name in ('SEP-08','OCT-08','NOV-08')
    group by segment2, period_name

    lilhelp wrote:
    Hello all,
    We are trying to use the 11G pivot query function with data from Oracle E-Business Suite. We have an 11G database installed with our Oracle APEX. We cannot seem to get the pivot function to work. At a glance, would anyone be able to see any glaring errors in our syntax. I am not certain it is possible to provide test data
    Why not?
    >
    We are trying to have column headings with the Period Names SEP-08 OCT-08 NOV-08, with rows of segment2 007751 and accounted_dr as the dataset.
    When we run the sql we get an error ORA-00904: "PERIOD_NAME": invalid identifier.
    Any help or insight would be greatly appreciated.
    select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 = '007751')
    pivot
    sum(accounted_dr)
    for period_name in ('SEP-08','OCT-08','NOV-08')
    group by segment2, period_name
    Don't use GROUP BY. When you use PIVOT, the grouping is implied by what is in the PIVOT clause and what is not.
    Try this:
    select  *
    from    (
            select  segment2
            ,       accounted_dr
            ,       period_name
            from    gl_je_lines          a
            ,       gl_code_combinations b
            where   b.code_combination_id = a.code_combination_id
            and     segment2 = '007751'
            )
    pivot   (
            sum (accounted_dr)
            for period_name in ('SEP-08','OCT-08','NOV-08')
            )
    ;
    which is just your posted query without the GROUP BY clause.

  • Power Pivot Bug: Can't Save Edits to Power Pivot Query without checking Teradata Connection's "Save my password" Option

    Can't edit and re-Save Power Pivot query with Teradata connection (using LDAP security mechanism) unless "Save my password" option was checked.  Without this option I receive this error upon attempt to Save changes made to the query ...
    The following exception occurred while the managed IDbConnection interface was being used: [TeraGSS Security Library] [115022] Exception occurred in TERAGSS layer.  See inner exception for details.;TdgssAuthenticationTokenExchange delegate threw an exception.
     See the inner exception for details.
    ErrorCode: -452984642 Severity: Error Facility: DotNet
    [Teradata Database] [8017] The UserId, Password or Account is invalid..
    A connection could not be made to the data source with the DataSourceID of '6a06124c-496f-4887-bf39-d2582173d499', Name of 'Teradata fsltdprd'.
    An error occurred while processing table 'Query'.
    The current operation was cancelled because another operation in the transaction failed.

    Sorry, you're right, the Office category isn't currently accepting bug reports, in which case Olaf's suggestion to use the smiley face is the way to go. In Excel please go to File > Options > Trust Centre > 'Trust Centre Settings...' and check that the Feedback Tool is enabled.
    If the option is greyed out like this...
    ... then you should be able to enable it by changing the value of the 'Enabled' registry key from a 0 to a 1 which you will find under: HKEY_CURRENT_USER\Software\Microsoft\Office\15.0\Common\Feedback. You will then need to close all Office applications
    and re-launch Excel 2013.
    Once the Feedback Tool has been enabled, you should see the smiley face(s) in the top right hand corner of the Excel window and be able to send your feedback.
    Regards,
    Michael
    Please remember to mark a post that answers your question as an answer...If a post doesn't answer your question but you've found it helpful, please remember to vote it as helpful :)
    Website: nimblelearn.com, Blog:
    nimblelearn.com/blog, Twitter:
    @nimblelearn

  • Setting Column Names in Dynamic Pivot Query

    Hi all,
    I'm having trouble setting column names in a dynamic pivot query and was wondering if someone could please help me figure out what I need to do.
    To help you help me, I've setup an example scenario in my hosted account. Here's the login info for my hosted site at [http://apex.oracle.com]
    Workspace: MYHOSTACCT
    Username : DEVUSER1
    Password : MYDEVACCT
    And here is my test application info:
    ID     : 42804
    Name   : dynamic query test
    Page   : 1
    Table 1: PROJECT_LIST         (Alias = PL...  Listing of Projects)
    Table 2: FISCAL_YEAR          (Alias = FY...  Lookup table for Fiscal Years)
    Table 3: PROJECT_FY           (Alias = PF...  Intersection table containing project fiscal years)
    Table 4: PROJECT_FY_HEADCOUNT (Alias = PFH... Intersection table containing headcount per project and fiscal year)
    Please forgive the excessive normalization for this example, as I wanted to keep the table structure similar to my real application, which has much more going on.
    In my sample, I have the "Select Criteria" region, where the user specifies the project and fiscal year range that he or she would like to report. Click the Search button, and the report returns the project headcount in a pivoted fashion for the fiscal year range specified.
    I've got it working using a hard-coded query, which is displayed in the "Hardcoded Query" region. In this query, I basically return all years, and set conditions on each column which determines whether that column should be displayed or not based on the range selected by the user. While this works, it is not ideal, as there could be many more fiscal years to account for, and this is not very dynamic at all. Anytime a fiscal year is added to the FISCAL_YEAR table, I'd have to update this page.
    So, after reading all of the OTN SQL pivot forums and "Ask Tom" pivot thread, I've been able to create a second region labeled "Dynamic Query" in which I've created a dynamic query to return the same results. This is a much more savvy solution and works great; however, the column names are generic in the report.
    I had to set the query to parse at runtime since the column selection list is dynamic, which violates SQL rules. Can anyone please help me figure out how I can specify my column names in the dynamic query region to get the same column values I'm getting in the hardcoded region?
    Please let me know if you need anymore information, and many thanks in advance!
    Mark

    Hi Tony,
    Thanks so much for your response. I've had to study up on the dbms_sql package to understand your function... first time I've used it. I've fed my dynamic query to your function and see that it returns a colon delimited list of the column names; however, I think I need a little more schooling on how and where exactly to apply the function to actually set the column names in APEX.
    From my test app, here is the code for my dynamic query. I've got it in a "PL/SQL function body returning sql query" region:
    DECLARE 
      v_query      VARCHAR2(4000);
      v_as         VARCHAR2(4);
      v_range_from NUMBER;
      v_range_to   NUMBER;         
    BEGIN
      v_range_from := :P1_FY_FROM;
      v_range_to   := :P1_FY_TO;
      v_query      := 'SELECT ';
      -- build the dynamic column selections by looping through the fiscal year range.
      -- v_as is meant to specify the column name as (FY10, FY11, etc.), but it's not working.
      FOR i IN v_range_from.. v_range_to  LOOP
        v_as    := 'FY' || SUBSTR(i, 3, 4);
        v_query := v_query || 'MAX(DECODE(FY_NB,' || i || ',PFH_HEADCOUNT,0)) '
          || v_as || ',';
      END LOOP;
      -- add the rest of the query to the dynamic column selection
      v_query := rtrim(v_query,',') || ' FROM ('
        || 'SELECT FY_NB, PFH_HEADCOUNT FROM ('
        || 'SELECT FY_ID, FY_NB FROM FISCAL_YEAR) A '
        || 'LEFT OUTER JOIN ('
        || 'SELECT FY_ID, PFH_HEADCOUNT '
        || 'FROM PROJECT_FY_HEADCOUNT '
        || 'JOIN PROJECT_FY USING (PF_ID) '
        || 'WHERE PL_ID = ' || :P1_PROJECT || ') B '
        || 'ON A.FY_ID = B.FY_ID)';
      RETURN v_query;
    END;
    I need to invoke GET_QUERY_COLS(v_query) somewhere to get the column names, but I'm not sure where I need to call it and how to actually set the column names after getting the returned colon-delimited list.
    Can you (or anyone else) please help me get a little further? Once again, feel free to login to my host account to see it first hand.
    Thanks again!
    Mark
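    One approach that is often suggested for this situation is to leave the report on generic column names and supply the headings from a headings function, if your APEX version offers the "PL/SQL function body returning colon-delimited list" headings type. A minimal sketch, built with the same loop and page items as the query above (an assumption on my part, untested):
    DECLARE
      v_headings VARCHAR2(4000);
    BEGIN
      -- one heading per fiscal year in the selected range, e.g. FY10:FY11:FY12
      FOR i IN TO_NUMBER(:P1_FY_FROM) .. TO_NUMBER(:P1_FY_TO) LOOP
        v_headings := v_headings || 'FY' || SUBSTR(i, 3, 4) || ':';
      END LOOP;
      RETURN RTRIM(v_headings, ':');
    END;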

  • APEX 3.2 -ORACLE 10G - PIVOT QUERY

    Hello, I searched around the forum and I couldn't find an answer to this specific matter, although I saw some replies that were close...
    I need to create a form based on a pivot query, but Oracle 10g doesn't support that feature, so I hope someone can help me.
    My problem is that the number of columns will be variable. Here's an example:
    ORIGINAL TABLE
    NAME    KMS    VEHICLE
    Joe     100    AUDI
    Tom     300    VW
    Mark    150    FORD
    Ann     250    FORD
    Joe     200    VW
    Tom     123    AUDI
    Mark    345    AUDI
    Ann     45     VW
    Joe     6      FORD
    Tom     67     FORD
    Mark    46     VW
    Ann     99     AUDI
    DESIRED RESULT
    Joe     Tom     Mark    Ann     Vehicle
    100     123     345     99      AUDI
    6       67      150     250     FORD
    200     300     46      45      VW
    The new columns will be the values in the old NAME column, BUT these values are variable: today it's Joe, Tom, Mark and Ann; tomorrow it could be Silvia, Tony, Richard, Harry, William and Jane. This means the usual replies I saw, using MAX and DECODE, will not apply, because I never know what values, or how many values, are in this column. With PIVOT I can get this done... how can I do this in Oracle 10g? Is there a way to create a user function to pivot somehow? Ideas?
    thanks!
    Mark Pereira
    Edited by: 899716 on Jul 18, 2012 12:02 PM

    This is the Oracle Forms forum. Post your question in the SQL forum.
    Tip: check the latest Oracle Magazine (July/August 2012). There is an article by Tom Kyte about the same question.
    http://www.oracle.com/technetwork/oramag/magazine/home/index.html
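    For reference, the MAX/DECODE style the poster mentions looks like this for a fixed list of names (a sketch only). For a truly variable list of names the statement has to be generated dynamically, which is what the referenced Tom Kyte article discusses:
    select max(case when name = 'Joe'  then kms end) as joe,
           max(case when name = 'Tom'  then kms end) as tom,
           max(case when name = 'Mark' then kms end) as mark,
           max(case when name = 'Ann'  then kms end) as ann,
           vehicle
    from   original_table  -- hypothetical name for the table shown above
    group by vehicle
    order by vehicle;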

  • Oracle 11 Pivot Query

    Hi,
    I have 2 questions on the pivot query below:
    1 - I want to have total by Country and Year
    2 - I want to allow the use a dynamic year range (like year between 1990 and 2000)
    Thanks,
    Ribhi
    select * from (
    select CNTRY_NAME, MYYEAR, AM_AMOUNT
    from yearly_lending t
    )
    pivot
    (
    sum(AM_AMOUNT)
    for MYYEAR in (1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998,1999,2000,2001,2002,2003,2004,2005,2006,2007,2008,2009)
    )
    order by CNTRY_NAME;

    Hi all,
    Thank you for your help.
    The query is working as expected, but I want to add totals by country and year:
    Country 1974 1975 1976 1977 1977 1978...2009 Total
    Jordan 10 5 0 3 5 1 24
    Egypt 5 0 0 0 10 0 15
    Syria 8 2 10 20
    Total 23 7 0 3 15 11 59
    I would also like to select a year range rather than entering it year by year, for example from year 2000 to year 2009.
    I created a view from the query below and then built my pivot query on it:
    select
    rtrim(cntry.short_name_e) cntry_name,
    to_char(e.date0,'yyyy') myYear,
    am.value / 1000000 am_amount,
    a.fk_countrycode a_cntry
    from event e,
    agreement_amount am,
    agreement a, in_country cntry
    where
    a.number0 = e.fk_agreementnumber
    AND A.SUB_NUMBER <>'P'
    AND a.fk_countrycode = cntry.intl_code
    and a.sub_number = e.fk_agreementsub_nu
    and a.type = e.fk_agreementtype
    /*and rtrim(a.status) <> 'CANCELLED' */
    and rtrim(e.type) = 'SIGNING'
    and rtrim(e.fk_agreementtype) <> 'GRANT'
    and e.fk_agreementnumber = am.fk_agreementnumber
    and e.fk_agreementsub_nu = am.fk_agreementsub_nu
    and e.fk_agreementtype = am.fk_agreementtype
    and am.serial_number = 1
    order by rtrim(cntry.short_name_e) ,to_char(e.date0,'yyyy')
    Best regards,
    Ribhi
    Edited by: Ribhi on Apr 29, 2009 7:20 PM
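    As a sketch of the totals part: pre-aggregating with CUBE and mapping the rolled-up year to a sentinel value gives both a Total row and a Total column that the PIVOT can pick up. This assumes MYYEAR is numeric, as the working query suggests, and is untested. The year range can be filtered in the WHERE clause, but the PIVOT IN list itself stays static unless the statement is built dynamically:
    select *
    from  (
           select case grouping(cntry_name) when 1 then 'Total' else cntry_name end as cntry_name,
                  case grouping(myyear)     when 1 then 9999    else myyear     end as myyear,
                  sum(am_amount) as am_amount
           from   yearly_lending
           where  myyear between 2000 and 2009
           group by cube (cntry_name, myyear)
          )
    pivot (
           sum(am_amount)
           for myyear in (2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 9999 as total)
          )
    order by case cntry_name when 'Total' then 2 else 1 end, cntry_name;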

  • 11G Pivot Query with parameters

    Hello all,
    I would like to find some way, any way to pass parameters to a pivot query. The following pivot query works, but I would like segment2 to be a variable as well as the period names so....
    select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 >= '007611' and segment2 <= '007751' AND period_name in ('SEP-08','OCT-08','NOV-08'))
    pivot (
    sum(accounted_dr)
    for period_name in ('SEP-08','OCT-08','NOV-08') )
    ....would be something like....
    select * from (
    select segment2, accounted_dr, period_name
    from gl_je_lines a, gl_code_combinations b
    where b.code_combination_id = a.code_combination_id
    and segment2 >= :P4_OBJECT_FROM AND segment2 <= :P4_OBJECT_TO AND period_name in (&P4_EPSB_PERIOD_HOLD.))
    pivot (
    sum(accounted_dr)
    for period_name in (&P4_EPSB_PERIOD_HOLD.) )
    It is our understanding that we have to hardcode period names and objects, but we would like to get around that. Does anyone have any ideas or tricks?
    Thanks

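    The reason the hard-coding is needed is that the PIVOT IN list must be a list of literal values; bind variables and subqueries are not allowed there (only PIVOT XML accepts a subquery or the ANY keyword). One workaround in APEX is a region of type "PL/SQL function body returning SQL query" that assembles the statement, roughly as in the sketch below. The page item names follow the post, :P4_EPSB_PERIOD_HOLD is assumed to hold an already-quoted list such as 'SEP-08','OCT-08','NOV-08', and the item values should be validated before being concatenated into SQL:
    DECLARE
      v_sql VARCHAR2(4000);
    BEGIN
      v_sql := 'select * from ('
            || ' select segment2, accounted_dr, period_name'
            || ' from gl_je_lines a, gl_code_combinations b'
            || ' where b.code_combination_id = a.code_combination_id'
            || ' and segment2 between ''' || :P4_OBJECT_FROM || ''' and ''' || :P4_OBJECT_TO || ''''
            || ' ) pivot ( sum(accounted_dr)'
            || ' for period_name in (' || :P4_EPSB_PERIOD_HOLD || ') )';
      RETURN v_sql;
    END;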

  • Pivot query, phantom behavior, possible bug?

    Hi experts,
    Oracle 11g
    We are trying to construct a binary type table using a pivot query. (I know there will be no 0's, I will fill that in later.)
    The query to create this table is as follows:
    create table TEMP_T1 as
                select * from (
                select master_id, mbr_id
                        sub_id,
                        NVL(max(den_flag),0) DF,
                        NVL(max(num_flag),0) NF,
                        NVL(sum(den_flag),0) DXF,
                        NVL(sum(num_flag),0) NXF
                from MEM_SUM
                group by master_id, mbr_id,
                        sub_id
                )
                pivot ( max(DF) D_FLAG,
                        max(NF) N_FLAG,
                        sum(DXF) D_COUNT,
                        sum(NXF) N_COUNT
                        FOR sub_id in
                        ( 'MEAS-1' AS "MEAS1",'MEAS-2' AS "MEAS2",'MEAS-3' AS "MEAS3",'MEAS-4' AS "MEAS4"));
    I am seeing unusual results when I run this query, and am unsure why. Wanted to get some thoughts.
    First issue:
    Although the query selects master_id and mbr_id, I only get master_id with the pivoted results (when it should be master_id, mbr_id & pivot results). Not sure why.
    Second issue:
    And this is where it gets even more strange, if I
    1) do something to the code to make it have a syntax problem, re-run, it will fail, naturally.
    then
    2) fix the syntax back to the original query above, it will run and return master_id, mbr_id & the pivoted results.
    Has anyone encountered such a strange issue before? Any suggestions on a solution welcome.
    Edited by: chris001 on Feb 22, 2013 8:09 AM

    I've experienced a similar problem with my FaderPort controller but only under odd circumstances.
    If I accidentally unplugged my FaderPort and then plugged it back in, the Control Surfaces preferences that I had assigned it would get deleted and the MIDI messages would come through as note data and other really weird stuff depending on what track I had selected in the arrange window...
    What I noticed was that when I just went in and re-did my Control Surfaces stuff for that controller it would reset and work perfectly.
    Eventually I had to contact PreSonus and let them know what was happening. Turns out it was a problem with the driver, and the driver had to be updated to accomodate 8.0.2.
    I can only imagine that this is still the case for many companies (including the ones who made your volume pedal) since Logic 9 came so suddenly, without even a 8.1 ever being released....Try contacting that company and seeing if there are any known issues with the driver for the device and Logic 9... might help, can't hurt.

  • Help for a query to add columns

    Hi,
    I am looking for a query where each TableC value is added as an additional column.
    Please suggest...
    I have 3 tables (TableA, TableB, TableC). TableB stores TableA Id and TableC stores TableB Id
    Considering Id of TableA.
    Sample data
    TableA:
    ID    NAME     TABLENAME       ETYPE
    23    Name1    TABLE NAMEA     Etype A
    TableB:
    ID    A_ID    RTYPE     RNAME
    26    23      RTYPEA    RNAMEA
    61    23      RTYPEB    RNAMEB
    TableC:
    ID     B_ID    COMPNAME         CONC
    83     26      Comp Name AA     1.5
    46     26      Comp Name BB     2.2
    101    61      Comp Name CC     4.2
    Scenario 1: AS PER ABOVE SAMPLE DATA Put each TableC value as an additional column.
    For an Id in TableA(23) where TableB contains 2 records of A_ID (26, 61) and TableC contains 2 records for 26 and 1 record for 61.
    Output required: Put each TABLEC value as an additional column
    TableA.NAME  TableA.ETYPE  TableB.RTYPE  TableC_1_COMPNAME  TableC_1_CONC  TableC_2_COMPNAME  TableC_2_CONC  ...
    Name1        EtypeA        RTypeA        Comp Name AA       1.5            Comp Name BB       2.2            and so on
    Name1        EtypeA        RTypeB        Comp Name CC       4.2            NULL               NULL
    Scenario 2: If TableC contains ONLY 1 row for each Id in TableB, the output should be something like:
    Output:
    TableA.NAME  TableA.ETYPE  TableB.RTYPE  TableC_1_COMPNAME  TableC_1_CONC
    value        value         value         value              value

    Hi,
    Welcome to the forum!
    Do you want the data from TableC presented
    (1) in one column, or
    (2) in several columns (a different column of results for each row in the original TableC)?
    (1) Is called String Aggregation and is easier than (2).
    The best way to do this is with a user-defined aggregate function (STRAGG) which you can copy from asktom.
    Ignoring TableA for now, you could get what you want by saying
    SELECT    b.rtype
    ,         STRAGG (   c.compname
                     || ' '
                     || c.conc
                     )  AS c_data
    FROM      TableB  b
    JOIN      TableC  c  ON b.id  = c.b_id
    GROUP BY  b.rtype;
    (2) Presenting N rows of TableC as if they were N columns of the same row is called a pivot. Search for "pivot" or "rows to columns" to find examples of how to do this.
    The number of columns in a result set is hard-coded into the query. If you don't know ahead of time how many rows in TableC will match a row in TableB, you can:
    (a) guess high (for example, hard-code 20 columns and let the ones that never contain a match be NULL) or,
    (b) use Dynamic SQL to write a query for you, which has exactly as many columns as you need.
    The two scripts below contain basic information on pivots.
    This first script is similar to what you would do for case (a):
    --     How to Pivot a Result Set (Display Rows as Columns)
    --     For Oracle 10, and earlier
    --     Actually, this works in any version of Oracle, but the
    --     "SELECT ... PIVOT" feature introduced in Oracle 11
    --     is better.  (See Query 2, below.)
    --     This example uses the scott.emp table.
    --     Given a query that produces three rows for every department,
    --     how can we show the same data in a query that has one row
    --     per department, and three separate columns?
    --     For example, the query below counts the number of employess
    --     in each departent that have one of three given jobs:
    PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
    SELECT     deptno
    ,     job
    ,     COUNT (*)     AS cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ,          job;
    Output:
        DEPTNO JOB              CNT
            20 CLERK              2
            20 MANAGER            1
            30 CLERK              1
            30 MANAGER            1
            10 CLERK              1
            10 MANAGER            1
            20 ANALYST            2
    PROMPT     ==========  1. Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno;
    --     Output:
        DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
            30           0          1           1
            20           2          2           1
            10           0          1           1
    --     Explanation
    (1) Decide what you want the output to look like.
         (E.g. "I want a row for each department,
         and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt)
    (2) Get a result set where every row identifies which row
         and which column of the output will be affected.
         In the example above, deptno identifies the row, and
         job identifies the column.
         Both deptno and job happened to be in the original table.
         That is not always the case; sometimes you have to
         compute new columns based on the original data.
    (3) Use aggregate functions and CASE (or DECODE) to produce
         the pivoted columns. 
         The CASE statement will pick
         only the rows of raw data that belong in the column.
         If each cell in the output corresponds to (at most)
         one row of input, then you can use MIN or MAX as the
         aggregate function.
         If many rows of input can be reflected in a single cell
         of output, then use SUM, COUNT, AVG, STRAGG, or some other
         aggregate function.
         GROUP BY the column that identifies rows.
    PROMPT     ==========  2. Oracle 11 PIVOT  ==========
    WITH     e     AS
    (     -- Begin sub-query e to SELECT columns for PIVOT
         SELECT     deptno
         ,     job
         FROM     scott.emp
    )     -- End sub-query e to SELECT columns for PIVOT
    SELECT     *
    FROM     e
    PIVOT     (     COUNT (*)
              FOR     job     IN     ( 'ANALYST'     AS analyst
                                     , 'CLERK'       AS clerk
                                     , 'MANAGER'     AS manager
                                     )
              )
    ;
    NOTES ON ORACLE 11 PIVOT:
    (1) You must use a sub-query to select the raw columns.
    An in-line view (not shown) is an example of a sub-query.
    (2) GROUP BY is implied for all columns not in the PIVOT clause.
    (3) Column aliases are optional. 
    If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).
    The second script, below, shows one way of doing a dynamic pivot in SQL*Plus:
    How to Pivot a Table with a Dynamic Number of Columns
    This works in any version of Oracle
    The "SELECT ... PIVOT" feature introduced in Oracle 11
    is much better for producing XML output.
    Say you want to make a cross-tab output of
    the scott.emp table.
    Each row will represent a department.
    There will be a separate column for each job.
    Each cell will contain the number of employees in
         a specific department having a specific job.
    The exact same solution must work with any number
    of departments and columns.
    (Within reason: there's no guarantee this will work if you
    want 2000 columns.)
    Case 0 "Basic Pivot" shows how you might hard-code three
    job types, which is exactly what you DON'T want to do.
    Case 1 "Dynamic Pivot" shows how get the right results
    dynamically, using SQL*Plus. 
    (This can be easily adapted to PL/SQL or other tools.)
    PROMPT     ==========  0. Basic Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ORDER BY     deptno
    ;
    PROMPT     ==========  1. Dynamic Pivot  ==========
    --     *****  Start of dynamic_pivot.sql  *****
    -- Suppress SQL*Plus features that interfere with raw output
    SET     FEEDBACK     OFF
    SET     PAGESIZE     0
    SPOOL     p:\sql\cookbook\dynamic_pivot_subscript.sql
    SELECT     DISTINCT
         ',     COUNT (CASE WHEN job = '''
    ||     job
    ||     ''' '     AS txt1
    ,     'THEN 1 END)     AS '
    ||     job
    ||     '_CNT'     AS txt2
    FROM     scott.emp
    ORDER BY     txt1;
    SPOOL     OFF
    -- Restore SQL*Plus features suppressed earlier
    SET     FEEDBACK     ON
    SET     PAGESIZE     50
    SPOOL     p:\sql\cookbook\dynamic_pivot.lst
    SELECT     deptno
    @@dynamic_pivot_subscript
    FROM     scott.emp
    GROUP BY     deptno
    ORDER BY     deptno
    ;
    SPOOL     OFF
    --     *****  End of dynamic_pivot.sql  *****
    EXPLANATION:
    The basic pivot assumes you know the number of distinct jobs,
    and the name of each one.  If you do, then writing a pivot query
    is simply a matter of writing the correct number of ", COUNT ... AS ..."
    lines, with the name entered in two places on each one.  That is easily
    done by a preliminary query, which uses SPOOL to write a sub-script
    (called dynamic_pivot_subscript.sql in this example).
    The main script invokes this sub-script at the proper point.
    In practice, .SQL scripts usually contain one or more complete
    statements, but there's nothing that says they have to.
    This one contains just a fragment from the middle of a SELECT statement.
    Before creating the sub-script, turn off SQL*Plus features that are
    designed to help humans read the output (such as headings and
    feedback messages like "7 rows selected.") since we do not want these
    to appear in the sub-script.
    Turn these features on again before running the main query.
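    Applied to the tables in the original question, the hard-coded case (a) might look something like the sketch below (column names are taken from the post, it guesses at two TableC columns, and it is untested):
    SELECT    a.name,
              a.etype,
              b.rtype,
              MAX (CASE WHEN c.rn = 1 THEN c.compname END)  AS tablec_1_compname,
              MAX (CASE WHEN c.rn = 1 THEN c.conc     END)  AS tablec_1_conc,
              MAX (CASE WHEN c.rn = 2 THEN c.compname END)  AS tablec_2_compname,
              MAX (CASE WHEN c.rn = 2 THEN c.conc     END)  AS tablec_2_conc
    FROM      TableA  a
    JOIN      TableB  b  ON  b.a_id = a.id
    LEFT JOIN (
               SELECT  tc.*,
                       ROW_NUMBER () OVER (PARTITION BY b_id ORDER BY id) AS rn
               FROM    TableC  tc
              )        c  ON  c.b_id = b.id
    GROUP BY  a.name, a.etype, b.id, b.rtype;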
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     

  • Having a problem with writing pivoting query.

    Hi All,
    I have a query with one input parameter. It gives different results based on the input provided.
    For example
    1. If I give input as date1, the output would be as follows.
    Col1     Col2     Col3
    name1     v1     10
    name1     v2     14
    name2     v1     15
    name3     v3     20
    2. If I give input as date2, the output would be as follows.
    Col1     Col2     Col3
    name2     v1     14
    name2     v2     10
    name3     v1     8
    name3     v2     14
    name1     v1     10
    name1     v2     34
    name1     v4     23
    name1     v5     10
    name4     v1     12
    name4     v2     14
    name4     v4     18
    name4     v5     20
    name5     v1     14
    name5     v2     10
    and so on; for different inputs, I get different output.
    Now, I am trying to write a query which would give me the pivot data on the outputs shown above.
    For Example
    1. For the first output on the top, the pivot query should return the data as follows:
          name1     name2     name3
    v1     10     15     0
    v2     14     0     0
    v3     0     0     20
    2. For the second output on the top, the pivot query should return the data as follows:
          name1     name2     name3     name4     name5
    v1     10     14     8     12     14
    v2     34     10     14     14     10
    v3     0     0     0     0     0
    v4     23     0     0     18     0
    v5     10     0     0     20     0
    and so on...
    I would be greatly thankful for any kind of input.
    Regards
    Sapan
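
    For a fixed, known set of Col1 values, this pivot can be written with plain conditional aggregation, along the lines of the month/year query at the top of this page. A minimal sketch, assuming the output of the parameterised query is exposed as base_data(col1, col2, col3) - a hypothetical view or inline subquery standing in for the real query:

        select col2,
               -- one column per known Col1 value; 0 where that name has no row for this Col2
               sum(case when col1 = 'name1' then col3 else 0 end) "name1",
               sum(case when col1 = 'name2' then col3 else 0 end) "name2",
               sum(case when col1 = 'name3' then col3 else 0 end) "name3"
          from base_data
         group by col2
         order by col2;

    Col2 values with no rows at all (such as v3 in the second example) would additionally need an outer join to the full list of Col2 values, since GROUP BY alone cannot invent them. The catch, as the follow-up below notes, is that the column list itself changes with the input, which is what pushes this towards dynamic SQL.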

    Hi Frank,
    Thanks for your response. I did have a look at the thread, which had options for both static and dynamic pivoting.
    But as you said, this scenario needs dynamic SQL. The only constraint is that I need to plug the SQL into HTML-DB, and spooling would not be feasible. That is why I am trying to frame a query around the current query that gives me the base data to be pivoted. I am also keeping my options open on coming up with a PL/SQL block that can integrate well with HTML-DB.
    Any thoughts/suggestions would be helpful.
    Thank you.
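
    One way to keep the dynamic version inside HTML-DB without any spooling is a PL/SQL function that assembles the pivot statement as a string, which a report region can then use as a "function returning SQL query" source. A rough, untested sketch - base_data(run_date, col1, col2, col3) is a made-up view name and filter column standing in for the real base query:

        create or replace function pivot_sql (p_date in date)
          return varchar2
        is
          l_sql varchar2(32767);
        begin
          l_sql := 'select col2';
          -- add one conditional-aggregation column per distinct col1 value for this date
          for r in (select distinct col1
                      from base_data
                     where run_date = p_date
                     order by col1)
          loop
            l_sql := l_sql
                  || ', sum(case when col1 = ''' || r.col1
                  || ''' then col3 else 0 end) "' || r.col1 || '"';
          end loop;
          l_sql := l_sql
                || ' from base_data'
                || ' where run_date = to_date(''' || to_char(p_date, 'dd-mon-yyyy')
                || ''', ''dd-mon-yyyy'')'
                || ' group by col2 order by col2';
          return l_sql;
        end pivot_sql;
        /

    The same string could equally be opened as a ref cursor from a PL/SQL block if a report region is not the right fit.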

  • SQL Pivot Query

    Hi Experts
    I recently purchased a copy of Gordon's book to help me create a pivot query, based on the examples at the end, which I hoped would show me a set of data from my Sales Order tables. Unfortunately I have found it a little too complex.
    In my ORDR table I have 10 UDFs which relate to various statuses and dates in the sales order process. 
    What I would like to see is a pivot table query of the summed quantities of items at the different stages (as rows) against the dates in weeks.
    In the rows would be the status UDF fields:
    U_DES_STAGE
    U_PRN_STAGE
    U_PRS_STAGE
    U_SEW_STAGE
    U_EMB_STAGE
    For each status UDF there is an associated date UDF:
    U_DES_STAGE_DATE
    U_PRN_STAGE_DATE
    U_PRS_STAGE_DATE
    U_SEW_STAGE_DATE
    U_EMB_STAGE_DATE
    The columns I require in the pivot table would be the UDF stage dates which are always input for the week ending Friday.
    The data in the pivot should be the sum of the quantities in the sales orders rows RDR1.
    Stage     W/E 14/3/2014     W/E 21/3/2014     W/E 28/3/2014     W/E 4/4/2014     W/E 11/4/2014     W/E 18/4/2014
    DES       Sum of Qty        Sum of Qty        Sum of Qty        Sum of Qty       Sum of Qty        Sum of Qty
    PRN       Sum of Qty        Sum of Qty        Sum of Qty        Sum of Qty       Sum of Qty        Sum of Qty
    PRS       Sum of Qty        Sum of Qty        Sum of Qty        Sum of Qty       Sum of Qty        Sum of Qty
    SEW       Sum of Qty        Sum of Qty        Sum of Qty        Sum of Qty       Sum of Qty        Sum of Qty
    EMB       Sum of Qty        Sum of Qty        Sum of Qty        Sum of Qty       Sum of Qty        Sum of Qty
    The problem is that I am not sure how to add the WHERE clause to limit the results to only the 5 items (DES%, PRN%, PRS%, SEW% or EMB%) - a sketch of one possible WHERE clause follows this post.
    I would be very grateful for any assistance.
    Regards
    Geoff
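
    If the restriction is on the item codes in the order rows, a handful of LIKE conditions in the WHERE clause is enough. A minimal sketch, assuming the codes are in RDR1.ItemCode and the quantities in RDR1.Quantity (verify the column names against your own schema before relying on it):

        SELECT T1.ItemCode, SUM(T1.Quantity) AS "Sum of Qty"
          FROM ORDR T0
          INNER JOIN RDR1 T1 ON T1.DocEntry = T0.DocEntry
         WHERE T1.ItemCode LIKE 'DES%'
            OR T1.ItemCode LIKE 'PRN%'
            OR T1.ItemCode LIKE 'PRS%'
            OR T1.ItemCode LIKE 'SEW%'
            OR T1.ItemCode LIKE 'EMB%'
         GROUP BY T1.ItemCode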

    Hi Gordon - you're very welcome.   The content of the book is very helpful to an SQL novice!
    Essentially the 10 UDFs are for recording each stage of the processes involved in the sales order. There are 5 processes, so each process has 2 fields - one records the initials of the staff member who is responsible and the other records the date the process is scheduled for.
    We want to be able to produce a schedule to show us how many garments are scheduled on a sales order in a particular week, effectively turning this table into the one below it:
    Document Number     Quantity     Scheduled dates (Design / Dye Print / Dye Press / Dye Sewing / Embroidery / Print)
    298835              315          12.07.10, 06.09.10
    300058              144          22.07.10, 06.09.10
    300921              29           01.10.10
    302330              15           30.09.10, 30.09.10
    302820              460          05.10.10
    302833              55           20.09.10, 22.09.10
    303476              86           06.10.10, 06.10.10
    303948              13           11.08.10, 11.08.10
    303982              106          26.10.10, 27.10.10
    304012              99           25.11.10, 25.11.10
    304186              6            27.08.10, 27.08.10
    304331              10           07.09.10, 07.09.10
    304382              16           29.09.10, 29.09.10
    304399              15           19.08.10
    304556              85           01.10.10, 22.10.10
    304557              11           29.09.10, 29.09.10
    304563              8            29.09.10, 29.09.10
    304567              7            29.09.10
    304570              19           01.10.10, 22.10.10
    304571              113          01.10.10, 22.10.10
    304576              11           29.09.10, 29.09.10
    304603              72           22.09.10, 23.09.10
    304604              86           22.09.10, 23.09.10
    304606              107          22.09.10, 23.09.10
    304608              107          22.09.10, 23.09.10
    304613              107          22.09.10, 23.09.10
    304693              12           29.09.10, 29.09.10
    304710              5            29.09.10
    304760              6            29.09.10, 29.09.10
    304765              4            29.09.10
    304899              42           15.09.10
    304963              100          22.09.10, 24.09.10
    304974              719          27.09.10, 29.09.10
    304975              401          28.09.10, 29.09.10
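
    Because each stage has its own date UDF, one workable shape for the weekly schedule is one SELECT per stage joined with UNION ALL, each using conditional aggregation over the week-ending dates. A rough sketch for two stages and two weeks only, assuming SQL Server syntax and that the stage date UDFs already hold the week-ending Friday as a plain date:

        SELECT 'DES' AS Stage,
               SUM(CASE WHEN T0.U_DES_STAGE_DATE = '20140314' THEN T1.Quantity ELSE 0 END) AS [W/E 14/3/2014],
               SUM(CASE WHEN T0.U_DES_STAGE_DATE = '20140321' THEN T1.Quantity ELSE 0 END) AS [W/E 21/3/2014]
          FROM ORDR T0
          INNER JOIN RDR1 T1 ON T1.DocEntry = T0.DocEntry
        UNION ALL
        SELECT 'PRN',
               SUM(CASE WHEN T0.U_PRN_STAGE_DATE = '20140314' THEN T1.Quantity ELSE 0 END),
               SUM(CASE WHEN T0.U_PRN_STAGE_DATE = '20140321' THEN T1.Quantity ELSE 0 END)
          FROM ORDR T0
          INNER JOIN RDR1 T1 ON T1.DocEntry = T0.DocEntry

    The remaining stages (PRS, SEW, EMB) and weeks follow the same pattern, and the LIKE conditions from the earlier post can be added to each branch if the item-code restriction is also needed.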
