Comparison query

I have a table called vehicle_density. Following are the sample data of this table.
Period          Vehicle       Location
Jan-2009     300     Jayanagar
Feb-2009     245     Jayanagar
Mar-2009     236     Jayanagar
Apr-2009     298     Jayanagar
May-2009     325     Jayanagar
Jun-2009     204     Jayanagar
Jan-2009     568     BTM
Feb-2009     585     BTM
Mar-2009     401     BTM
Apr-2009     565     BTM
May-2009     621     BTM
Jun-2009     425     BTM
Jan-2009     145     RT Nagar
Feb-2009     200     RT Nagar
Mar-2009     254     RT Nagar
Apr-2009     120     RT Nagar
May-2009     282     RT Nagar
Jun-2009     96     RT Nagar

Now I need a query to compare vehicle density between different locations for a given time period.
For example, here is sample output of the vehicle density comparison between BTM and Jayanagar from Jan-2009 to Mar-2009:
Period       Jayanagar   BTM
Jan-2009     300     568
Feb-2009     245     585
Mar-2009     236     401

Similarly, the density comparison between Jayanagar, BTM and RT Nagar from Jan-2009 to Apr-2009 is below:
Period     Jayanagar     BTM     RT Nagar
Jan-2009     300     568     145
Feb-2009     245     585     200
Mar-2009     236     401     254
Apr-2009     298     565     120

Can I do it in a single SQL query?
Thanks,
Sujnan

Hi, Sujnan,
Yes, you can do that in one query. It's called a pivot.
One way is shown below. This example uses the COUNT function; for what you want, use SUM instead.
Search for "pivot" or "rows to columns" for more examples.
--     How to Pivot a Result Set (Display Rows as Columns)
--     For Oracle 10, and earlier
--     Actually, this works in any version of Oracle, but the
--     "SELECT ... PIVOT" feature introduced in Oracle 11
--     is better.  (See Query 2, below.)
--     This example uses the scott.emp table.
--     Given a query that produces three rows for every department,
--     how can we show the same data in a query that has one row
--     per department, and three separate columns?
--     For example, the query below counts the number of employees
--     in each department that have one of three given jobs:
PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
SELECT     deptno
,     job
,     COUNT (*)     AS cnt
FROM     scott.emp
WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
GROUP BY     deptno
,          job;
Output:
    DEPTNO JOB              CNT
        20 CLERK              2
        20 MANAGER            1
        30 CLERK              1
        30 MANAGER            1
        10 CLERK              1
        10 MANAGER            1
        20 ANALYST            2
PROMPT     ==========  1. Pivot  ==========
SELECT     deptno
,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
FROM     scott.emp
WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
GROUP BY     deptno;
--     Output:
    DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
        30           0          1           1
        20           2          2           1
        10           0          1           1
--     Explanation
(1) Decide what you want the output to look like.
     (E.g. "I want a row for each department,
     and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt.")
(2) Get a result set where every row identifies which row
     and which column of the output will be affected.
     In the example above, deptno identifies the row, and
     job identifies the column.
     Both deptno and job happened to be in the original table.
     That is not always the case; sometimes you have to
     compute new columns based on the original data.
(3) Use aggregate functions and CASE (or DECODE) to produce
     the pivoted columns. 
     The CASE expression will pick
     only the rows of raw data that belong in the column.
     If each cell in the output corresponds to (at most)
     one row of input, then you can use MIN or MAX as the
     aggregate function.
     If many rows of input can be reflected in a single cell
     of output, then use SUM, COUNT, AVG, STRAGG, or some other
     aggregate function.
     GROUP BY the column that identifies rows.
PROMPT     ==========  2. Oracle 11 PIVOT  ==========
WITH     e     AS
(     -- Begin sub-query e to SELECT columns for PIVOT
     SELECT     deptno
     ,     job
     FROM     scott.emp
)     -- End sub-query e to SELECT columns for PIVOT
SELECT     *
FROM     e
PIVOT     (     COUNT (*)
          FOR     job     IN     ( 'ANALYST'     AS analyst
                         , 'CLERK'     AS clerk
                         , 'MANAGER'     AS manager
                         )
          );
NOTES ON ORACLE 11 PIVOT:
(1) You must use a sub-query to select the raw columns.
An in-line view (not shown) is an example of a sub-query.
(2) GROUP BY is implied for all columns not in the PIVOT clause.
(3) Column aliases are optional. 
If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).
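
Applying the pivot pattern from Query 1 to the original vehicle_density table might look like the sketch below. It is untested, and it assumes period is stored as a VARCHAR2 in 'Mon-YYYY' format, so it is converted with TO_DATE for range filtering and ordering:

```sql
--  Sketch only: pivot vehicle_density by location with SUM + CASE
SELECT    period
,         SUM (CASE WHEN location = 'Jayanagar' THEN vehicle END)  AS jayanagar
,         SUM (CASE WHEN location = 'BTM'       THEN vehicle END)  AS btm
FROM      vehicle_density
WHERE     TO_DATE (period, 'Mon-YYYY')
              BETWEEN TO_DATE ('Jan-2009', 'Mon-YYYY')
              AND     TO_DATE ('Mar-2009', 'Mon-YYYY')
GROUP BY  period
ORDER BY  TO_DATE (period, 'Mon-YYYY');
```

Adding RT Nagar (or any other location) is just one more CASE column; if period is really a DATE column, drop the TO_DATE calls.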

Similar Messages

  • Daily Sales Total Comparison Query

    Hi Experts,
    I'm trying to make a query to get daily sales total for week and wish to make a graph.
    However, if there are no figures in credit notes, down payment invoices or invoices for a particular date, the query does not show any figures for that date.
    I would appreciate it if anyone could help with this.
    SELECT DISTINCT
    GetDate(),
    SUM (DISTINCT T0.DocTotal) AS 'Daily INV Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE DateDiff(D,T0.DocDate,GetDate())=0 AND DateDiff(D,T1.DocDate,GetDate())=0 AND DateDiff(D,T2.DocDate,GetDate())=0
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 1, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 1, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 1, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 1, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 2, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 2, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 2, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 2, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 3, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 3, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 3, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 3, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 4, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 4, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 4, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 4, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 5, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 5, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 5, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 5, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 6, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 6, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 6, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 6, 0))
    UNION ALL
    SELECT
    DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 7, 0)),
    SUM (DISTINCT T0.DocTotal) AS 'Daily Sales Sum',
    SUM (DISTINCT T2.DocTotal) AS 'Daily DT INV Sum',
    SUM (DISTINCT T1.DocTotal*-1) AS 'Daily CR Sum',
    SUM (DISTINCT T0.DocTotal) + SUM (DISTINCT T2.DocTotal) - SUM (DISTINCT T1.DocTotal) AS 'Daily Sales Total'
    FROM OINV T0, ORIN T1, ODPI T2
    WHERE T0.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 7, 0)) AND T1.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 7, 0)) AND T2.DocDate = DATEADD(dd, 0, DATEADD(dd, DATEDIFF(dd, 0, GetDate()) - 7, 0))

    Could you let me know how to make a pivot query?
                        AR INV TOTAL  |  AR Down Payment Total  | AR Credit Total  | (AR INV TOTAL+AR DP TOTAL-AR CREDIT TOTAL)
    Today's Sales
    Yesterday
    Until Week Before
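
One way to avoid both the SUM (DISTINCT ...) cross-join problem and the repeated UNION ALL branches is to aggregate each document table per day first and only then combine the daily totals. This is an untested sketch (table and column names as in the post; driving the date list from OINV over the last 8 days is an assumption, so days with only credit notes would need a different calendar source):

```sql
-- Untested sketch: per-day totals from each table, joined on the day
SELECT d.SalesDate,
       ISNULL(i.InvTotal, 0) AS [AR INV Total],
       ISNULL(p.DpTotal,  0) AS [AR Down Payment Total],
       ISNULL(c.CrTotal,  0) AS [AR Credit Total],
       ISNULL(i.InvTotal, 0) + ISNULL(p.DpTotal, 0) - ISNULL(c.CrTotal, 0)
           AS [Daily Sales Total]
FROM (SELECT DISTINCT CONVERT(DATE, DocDate) AS SalesDate
      FROM   OINV
      WHERE  DocDate >= DATEADD(dd, -7, GETDATE())) d
LEFT JOIN (SELECT CONVERT(DATE, DocDate) AS SalesDate, SUM(DocTotal) AS InvTotal
           FROM OINV GROUP BY CONVERT(DATE, DocDate)) i ON i.SalesDate = d.SalesDate
LEFT JOIN (SELECT CONVERT(DATE, DocDate) AS SalesDate, SUM(DocTotal) AS DpTotal
           FROM ODPI GROUP BY CONVERT(DATE, DocDate)) p ON p.SalesDate = d.SalesDate
LEFT JOIN (SELECT CONVERT(DATE, DocDate) AS SalesDate, SUM(DocTotal) AS CrTotal
           FROM ORIN GROUP BY CONVERT(DATE, DocDate)) c ON c.SalesDate = d.SalesDate
ORDER BY d.SalesDate DESC;
```

Each subquery sums one table independently, so no row is double-counted the way it is when OINV, ORIN and ODPI are cross-joined.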

  • Time comparison query help

    Hi All,
    Please help me to write a query to compare the timestamps and filter the data with the below intervals.
    From current date @3.30 AM to 2.30 PM
    From current date @2.30 PM to (current date+1) 3.30 AM
    Input data
    2012-08-13 03:30:00.000
    2012-08-13 04:10:49.954
    2012-08-13 08:10:49.972
    2012-08-13 11:29:33.095
    2012-08-13 14:29:33.112
    2012-08-13 17:29:33.128
    2012-08-14 02:29:33.128

    with testdata as (
    select to_timestamp('2012-08-13 03:30:00.00','yyyy-mm-dd hh24:mi:ss:ff') d from dual union all
    select to_timestamp('2012-08-13 04:10:49.95','yyyy-mm-dd hh24:mi:ss:ff') from dual union all
    select to_timestamp('2012-08-13 08:10:49.97','yyyy-mm-dd hh24:mi:ss:ff') from dual union all
    select to_timestamp('2012-08-13 11:29:33.09','yyyy-mm-dd hh24:mi:ss:ff') from dual union all
    select to_timestamp('2012-08-13 14:29:33.11','yyyy-mm-dd hh24:mi:ss:ff') from dual union all
    select to_timestamp('2012-08-13 17:29:33.12','yyyy-mm-dd hh24:mi:ss:ff') from dual union all
    select to_timestamp('2012-08-14 02:29:33.12','yyyy-mm-dd hh24:mi:ss:ff') from dual
    )
    select
    d
    ,case
    when d >= to_date('2012-08-13','YYYY-MM-DD') + interval '3:30' HOUR to MINUTE
      and d <  to_date('2012-08-13','YYYY-MM-DD') + interval '14:30' HOUR to MINUTE
    then 'First'
    when d >= to_date('2012-08-13','YYYY-MM-DD') + interval '14:30' HOUR to MINUTE
      and d <  to_date('2012-08-13','YYYY-MM-DD') + 1 + interval '3:30' HOUR to MINUTE
    then 'Second'
    else 'None'
    end "Date-Interval"
    from testdata
    D Date-Interval
    "13/08/2012 03:30:00,000000000" "First"
    "13/08/2012 04:10:49,950000000" "First"
    "13/08/2012 08:10:49,970000000" "First"
    "13/08/2012 11:29:33,090000000" "First"
    "13/08/2012 14:29:33,110000000" "First"
    "13/08/2012 17:29:33,120000000" "Second"
    "14/08/2012 02:29:33,120000000" "Second"
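
To make the bucketing above relative to the current date rather than the literal '2012-08-13', TRUNC (SYSDATE) can be substituted for the TO_DATE call. An untested sketch against the same testdata:

```sql
-- Untested sketch: same two intervals, anchored to the current date
SELECT d
,      CASE
         WHEN d >= TRUNC (SYSDATE) + INTERVAL '3:30'  HOUR TO MINUTE
          AND d <  TRUNC (SYSDATE) + INTERVAL '14:30' HOUR TO MINUTE
           THEN 'First'
         WHEN d >= TRUNC (SYSDATE) + INTERVAL '14:30' HOUR TO MINUTE
          AND d <  TRUNC (SYSDATE) + 1 + INTERVAL '3:30' HOUR TO MINUTE
           THEN 'Second'
         ELSE 'None'
       END   AS "Date-Interval"
FROM   testdata;
```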

  • Rate Comparison query

    Table Structure
    Po_no number(12)
    Item_code varchar2(10),
    rate number(12,3)

    I want output like this:
    Item_code         Current Rate            Previous Rate
    where Current Rate = rate of MAX(po_no) and Previous Rate = rate of the 2nd MAX(po_no).

    Maybe NOT TESTED!
    select item_code "Item_code",max(max_rate) "Current Rate",min(max_rate) "Previous Rate"
      from (select item_code,max(rate) max_rate
              from (select po_no,item_code,rate,
                           dense_rank() over (partition by item_code order by rate desc) r
                      from your_table)
             where r in (1,2)
             group by item_code,r)
     group by item_code
    Regards
    Etbin
    Edited by: Etbin on 21.2.2011 17:33
    returns null Previous Rate when having one value only
    select item_code "Item_code",max(max_rate) "Current Rate",min(case r when 2 then max_rate end) "Previous Rate"
      from (select item_code,max(rate) max_rate,r
              from (select po_no,item_code,rate,
                           dense_rank() over (partition by item_code order by rate desc) r
                      from (select 10 po_no,'1' item_code,12.345 rate from dual union all
                            select 10,'1',12.678 from dual union all
                            select 10,'1',12.234 from dual union all
                            select 10,'1',12.789 from dual union all
                            select 10,'2',10.678 from dual union all
                            select 10,'2',10.234 from dual union all
                            select 10,'2',10.789 from dual union all
                            select 10,'3',10.001 from dual))
             where r in (1,2)
             group by item_code,r)
     group by item_code
    Edited by: Etbin on 21.2.2011 17:39
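
If the requirement is taken literally (Current Rate = rate on the highest po_no, Previous Rate = rate on the second-highest po_no, rather than the highest and second-highest rates), a ROW_NUMBER variant may be closer. Untested sketch, with your_table as a placeholder; note ROW_NUMBER picks one row arbitrarily if several rows share the same po_no:

```sql
-- Untested sketch: rates of the two most recent po_no values per item
SELECT   item_code
,        MAX (CASE rn WHEN 1 THEN rate END)   AS "Current Rate"
,        MAX (CASE rn WHEN 2 THEN rate END)   AS "Previous Rate"
FROM    (SELECT item_code, rate,
                ROW_NUMBER () OVER (PARTITION BY item_code ORDER BY po_no DESC) rn
         FROM   your_table)
WHERE    rn <= 2
GROUP BY item_code;
```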

  • Schema Table Comparison

    Hi All,
    I've got 2 schemas with identical tables.
    I want to do a MINUS on the tables, but would like to do it with a procedure that reports the differences into a <table_name>_diff table for each: this table should show records that are in schema 1 but not in schema 2, and records that are in schema 2 but not in schema 1.
    There are about 40 tables in total, so a procedure rather than doing it all manually would be superb.
    Any ideas ?

    Hi,
    I found the following code somewhere on the net:
    REM
    REM Edit the following three DEFINE statements to customize this script
    REM to suit your needs.
    REM
    REM Tables to be compared:
    DEFINE table_criteria = "table_name = table_name" -- all tables
    REM DEFINE table_criteria = "table_name != 'TEST'"
    REM DEFINE table_criteria = "table_name LIKE 'LOOKUP%' OR table_name LIKE 'C%'"
    REM Columns to be compared:
    DEFINE column_criteria = "column_name = column_name" -- all columns
    REM DEFINE column_criteria = "column_name NOT IN ('CREATED', 'MODIFIED')"
    REM DEFINE column_criteria = "column_name NOT LIKE '%_ID'"
    REM Database link to be used to access the remote schema:
    DEFINE dblink = "remote_db"
    SET SERVEROUTPUT ON SIZE 1000000
    SET VERIFY OFF
    DECLARE
      CURSOR c_tables IS
        SELECT   table_name
        FROM     user_tables
        WHERE    &table_criteria
        ORDER BY table_name;
      CURSOR c_columns (cp_table_name IN VARCHAR2) IS
        SELECT   column_name, data_type
        FROM     user_tab_columns
        WHERE    table_name = cp_table_name
        AND      &column_criteria
        ORDER BY column_id;
      TYPE t_char80array IS TABLE OF VARCHAR2(80) INDEX BY BINARY_INTEGER;
      v_column_list     VARCHAR2(32767);
      v_total_columns   INTEGER;
      v_skipped_columns INTEGER;
      v_count1          INTEGER;
      v_count2          INTEGER;
      v_rows_fetched    INTEGER;
      v_column_pieces   t_char80array;
      v_piece_count     INTEGER;
      v_pos             INTEGER;
      v_length          INTEGER;
      v_next_break      INTEGER;
      v_same_count      INTEGER := 0;
      v_diff_count      INTEGER := 0;
      v_error_count     INTEGER := 0;
      v_warning_count   INTEGER := 0;
      -- Use dbms_sql instead of native dynamic SQL so that Oracle 7 and Oracle 8
      -- folks can use this script.
      v_cursor          INTEGER := dbms_sql.open_cursor;
    BEGIN
      -- Iterate through all tables in the local database that match the
      -- specified table criteria.
      FOR r1 IN c_tables LOOP
        -- Build a list of columns that we will compare (those columns
        -- that match the specified column criteria). We will skip columns
        -- that are of a data type not supported (LOBs and LONGs).
        v_column_list := NULL;
        v_total_columns := 0;
        v_skipped_columns := 0;
        FOR r2 IN c_columns (r1.table_name) LOOP
          v_total_columns := v_total_columns + 1;
          IF r2.data_type IN ('BLOB', 'CLOB', 'NCLOB', 'LONG', 'LONG RAW') THEN
            -- The column's data type is one not supported by this script (a LOB
            -- or a LONG). We'll enclose the column name in comment delimiters in
            -- the column list so that the column is not used in the query.
            v_skipped_columns := v_skipped_columns + 1;
            IF v_column_list LIKE '%,' THEN
              v_column_list := RTRIM (v_column_list, ',') ||
                               ' /*, "' || r2.column_name || '" */,';
            ELSE
              v_column_list := v_column_list || ' /* "' || r2.column_name ||'" */ ';
            END IF;
          ELSE
            -- The column's data type is supported by this script. Add the column
            -- name to the column list for use in the data comparison query.
            v_column_list := v_column_list || '"' || r2.column_name || '",';
          END IF;
        END LOOP;
        -- Compare the data in this table only if it contains at least one column
        -- whose data type is supported by this script.
        IF v_total_columns > v_skipped_columns THEN
          -- Trim off the last comma from the column list.
          v_column_list := RTRIM (v_column_list, ',');
          BEGIN
            -- Get a count of rows in the local table missing from the remote table.
            dbms_sql.parse (
              v_cursor,
              'SELECT COUNT(*) FROM (' ||
              'SELECT ' || v_column_list || ' FROM "' || r1.table_name || '"' ||
              ' MINUS ' ||
              'SELECT ' || v_column_list || ' FROM "' || r1.table_name ||'"@&dblink)',
              dbms_sql.native);
            dbms_sql.define_column (v_cursor, 1, v_count1);
            v_rows_fetched := dbms_sql.execute_and_fetch (v_cursor);
            IF v_rows_fetched = 0 THEN
              RAISE NO_DATA_FOUND;
            END IF;
            dbms_sql.column_value (v_cursor, 1, v_count1);
            -- Get a count of rows in the remote table missing from the local table.
            dbms_sql.parse (
              v_cursor,
              'SELECT COUNT(*) FROM (' ||
              'SELECT ' || v_column_list || ' FROM "' || r1.table_name ||'"@&dblink'||
              ' MINUS ' ||
              'SELECT ' || v_column_list || ' FROM "' || r1.table_name || '")',
              dbms_sql.native);
            dbms_sql.define_column (v_cursor, 1, v_count2);
            v_rows_fetched := dbms_sql.execute_and_fetch (v_cursor);
            IF v_rows_fetched = 0 THEN
              RAISE NO_DATA_FOUND;
            END IF;
            dbms_sql.column_value (v_cursor, 1, v_count2);
            -- Display our findings.
            IF v_count1 = 0 AND v_count2 = 0 THEN
              -- No data discrepancies were found. Report the good news.
              dbms_output.put_line (
                r1.table_name || ' - Local and remote table contain the same data');
              v_same_count := v_same_count + 1;
              IF v_skipped_columns = 1 THEN
                dbms_output.put_line (
                  r1.table_name || ' - Warning: 1 LOB or LONG column was omitted ' ||
                  'from the comparison');
                v_warning_count := v_warning_count + 1;
              ELSIF v_skipped_columns > 1 THEN
                dbms_output.put_line (
                  r1.table_name || ' - Warning: ' || TO_CHAR (v_skipped_columns) ||
                  ' LOB or LONG columns were omitted from the comparison');
                v_warning_count := v_warning_count + 1;
              END IF;
            ELSE
              -- There is a discrepancy between the data in the local table and
              -- the remote table. First, give a count of rows missing from each.
              IF v_count1 > 0 THEN
                dbms_output.put_line (
                  r1.table_name || ' - ' ||
                  LTRIM (TO_CHAR (v_count1, '999,999,990')) ||
                  ' rows on local database missing from remote');
              END IF;
              IF v_count2 > 0 THEN
                dbms_output.put_line (
                  r1.table_name || ' - ' ||
                  LTRIM (TO_CHAR (v_count2, '999,999,990')) ||
                  ' rows on remote database missing from local');
              END IF;
              IF v_skipped_columns = 1 THEN
                dbms_output.put_line (
                  r1.table_name || ' - Warning: 1 LOB or LONG column was omitted ' ||
                  'from the comparison');
                v_warning_count := v_warning_count + 1;
              ELSIF v_skipped_columns > 1 THEN
                dbms_output.put_line (
                  r1.table_name || ' - Warning: ' || TO_CHAR (v_skipped_columns) ||
                  ' LOB or LONG columns were omitted from the comparison');
                v_warning_count := v_warning_count + 1;
              END IF;
              -- Next give the user a query they could run to see all of the
              -- differing data between the two tables. To prepare the query,
              -- first we'll break the list of columns in the table into smaller
              -- chunks, each short enough to fit on one line of a telnet window
              -- without wrapping.
              v_pos := 1;
              v_piece_count := 0;
              v_length := LENGTH (v_column_list);
              LOOP
                EXIT WHEN v_pos = v_length;
                v_piece_count := v_piece_count + 1;
                IF v_length - v_pos < 72 THEN
                  v_column_pieces(v_piece_count) := SUBSTR (v_column_list, v_pos);
                  v_pos := v_length;
                ELSE
                  v_next_break :=
                    GREATEST (INSTR (SUBSTR (v_column_list, 1, v_pos + 72),
                                     ',"', -1),
                              INSTR (SUBSTR (v_column_list, 1, v_pos + 72),
                                     ',/* "', -1),
                              INSTR (SUBSTR (v_column_list, 1, v_pos + 72),
                                     ' /* "', -1));
                  v_column_pieces(v_piece_count) :=
                    SUBSTR (v_column_list, v_pos, v_next_break - v_pos + 1);
                  v_pos := v_next_break + 1;
                END IF;
              END LOOP;
              dbms_output.put_line ('Use the following query to view the data ' ||
                                    'discrepancies:');
              dbms_output.put_line ('(');
              dbms_output.put_line ('SELECT ''Local'' "LOCATION",');
              FOR i IN 1..v_piece_count LOOP
                dbms_output.put_line (v_column_pieces(i));
              END LOOP;
              dbms_output.put_line ('FROM "' || r1.table_name || '"');
              dbms_output.put_line ('MINUS');
              dbms_output.put_line ('SELECT ''Local'' "LOCATION",');
              FOR i IN 1..v_piece_count LOOP
                dbms_output.put_line (v_column_pieces(i));
              END LOOP;
              dbms_output.put_line ('FROM "' || r1.table_name || '"@&dblink');
              dbms_output.put_line (') UNION ALL (');
              dbms_output.put_line ('SELECT ''Remote'' "LOCATION",');
              FOR i IN 1..v_piece_count LOOP
                dbms_output.put_line (v_column_pieces(i));
              END LOOP;
              dbms_output.put_line ('FROM "' || r1.table_name || '"@&dblink');
              dbms_output.put_line ('MINUS');
              dbms_output.put_line ('SELECT ''Remote'' "LOCATION",');
              FOR i IN 1..v_piece_count LOOP
                dbms_output.put_line (v_column_pieces(i));
              END LOOP;
              dbms_output.put_line ('FROM "' || r1.table_name || '"');
              dbms_output.put_line (');');
              v_diff_count := v_diff_count + 1;
            END IF;
          EXCEPTION
            WHEN OTHERS THEN
              -- An error occurred while processing this table. (Most likely it
              -- doesn't exist or has fewer columns on the remote database.)
              -- Show the error we encountered on the report.
              dbms_output.put_line (r1.table_name || ' - ' || SQLERRM);
              v_error_count := v_error_count + 1;
          END;
        END IF;
      END LOOP;
      -- Print summary information.
      dbms_output.put_line ('-------------------------------------------------');
      dbms_output.put_line ('Tables examined: ' ||
                            TO_CHAR (v_same_count + v_diff_count + v_error_count));
      dbms_output.put_line ('Tables with data discrepancies: ' ||
                            TO_CHAR (v_diff_count));
      IF v_warning_count > 0 THEN
        dbms_output.put_line ('Tables with warnings: ' ||
                              TO_CHAR (v_warning_count));
      END IF;
      IF v_error_count > 0 THEN
        dbms_output.put_line ('Tables that could not be checked due to errors: ' ||
                              TO_CHAR (v_error_count));
      END IF;
      dbms_sql.close_cursor (v_cursor);
    END;
    I hope it'll help you!
    Regards,
    Simon
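    The PL/SQL above prints a symmetric-difference query (local MINUS remote, UNION ALL, remote MINUS local). As a rough illustration of the query shape it emits, here is a small Python sketch that builds the same text for a given table, column list, and database link; the names passed in are placeholders, not from any real schema.

```python
# Sketch: build the same local-vs-remote comparison query the PL/SQL
# script prints. Table, column, and dblink names are illustrative.
def build_comparison_query(table, columns, dblink):
    cols = ", ".join('"%s"' % c for c in columns)

    def sel(label, remote):
        # Each arm selects a literal 'Local'/'Remote' tag plus the columns.
        frm = '"%s"@%s' % (table, dblink) if remote else '"%s"' % table
        return "SELECT '%s' \"LOCATION\", %s FROM %s" % (label, cols, frm)

    # Rows only on the local side, plus rows only on the remote side.
    return ("(\n%s\nMINUS\n%s\n) UNION ALL (\n%s\nMINUS\n%s\n);"
            % (sel("Local", False), sel("Local", True),
               sel("Remote", True), sel("Remote", False)))
```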

  • SNMP, Query dot1dStpPortState on Catalyst 2960-S

    Hi Community,
    I would like to be able to query the dot1dStpPortState object on the Catalyst 2960-S on our LAN. I'm running firmware
    c2960s-universalk9-mz.122-55.SE2.bin, and according to the Cisco SNMP Object Navigator the object is supported (via the BRIDGE-MIB).
    However, when I query using snmpwalk from my workstation:
    snmpwalk -v 2c -c bic-zua-ro 10.u.y.x 1.3.6.1.2.1.17.2.15.1.3
    I receive an error:
    SNMPv2-SMI::mib-2.17.2.15.1.3 = No Such Instance currently exists at this OID
    For the sake of comparison, querying our 4700:
    snmpwalk -v 2c -c bic-zua-ro 10.u.y.x 1.3.6.1.2.1.17.2.15.1.3
    returns (as expected, cropped)
    SNMPv2-SMI::mib-2.17.2.15.1.3.1 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.3 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.40 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.67 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.104 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.257 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.258 = INTEGER: 5
    SNMPv2-SMI::mib-2.17.2.15.1.3.259 = INTEGER: 5
    Is there some special configuration I need to do on our 2960s? The only SNMP-related setting I can see in the running config is snmp-server community. In this case:
    snmp-server community bic-zua-ro RO
    Thanks in advance for any comments/assistance.
    Rgds
    Ian

    Hi Vinod,
    Wow, thanks for your prompt reply. Output from the filtered running config is pasted below:
    TVS-Stack17#sh run | inclu snmp
    snmp-server community bic-zua-ro RO
    Interestingly, when I walk the entire dot1dBridge (1.3.6.1.2.1.17) I receive lots of data from both dot1dBase (1) and dot1dTp (4), but nothing from dot1dStp (2).
    I tried portAdditionalOperStatus and did not receive any response, but got lots of data from its parent portEntry (1).
    Running show spann on the 2960 stack, I can see various ports in forwarding and blocking state, as I would expect.
    Rgds,
    Ian
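    To make sense of walks that do return data, the integer values of dot1dStpPortState map to the port states defined in the BRIDGE-MIB (RFC 4188). Here is a small Python sketch (a hypothetical helper, not Cisco tooling) that decodes snmpwalk lines like the 4700 output above:

```python
# Decode dot1dStpPortState integers from snmpwalk output lines.
# State codes per BRIDGE-MIB (RFC 4188): 1=disabled, 2=blocking,
# 3=listening, 4=learning, 5=forwarding, 6=broken.
STP_STATES = {1: "disabled", 2: "blocking", 3: "listening",
              4: "learning", 5: "forwarding", 6: "broken"}

def parse_stp_walk(lines):
    states = {}
    for line in lines:
        oid_part, _, value_part = line.partition(" = INTEGER: ")
        if not value_part:
            continue  # skip "No Such Instance" and other non-integer rows
        port = int(oid_part.rsplit(".", 1)[1])  # last sub-identifier = bridge port
        states[port] = STP_STATES.get(int(value_part), "unknown")
    return states
```

    So the `INTEGER: 5` rows in the 4700 output mean those bridge ports are forwarding.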

  • How to create a template for a Workflow

    I have created a workflow that has a couple of dataflows inside it.
    Now, I need to create another 20 workflows which have similar logic to that of the first workflow, except for a couple of changes such as the source file, etc. Instead of creating the whole logic from scratch for all the workflows, is there any way of using the first workflow as a template and creating the 20 workflows quickly?
    As per our requirement, I need to have different names corresponding to the different source files that I have.
    If I go with the option of replicating the dataflows, I face the issue below:
    -> A change in the name of the dataflow, or a change in any of the objects within the dataflow (e.g. changing the source file / target table within a dataflow), is also reflected in all the other dataflows.
    I am looking for a way wherein, if we make a change in the workflow/dataflow, it shouldn't change the original (template) workflow.
    A more detailed scenario:
    I have a workflow, say WF_Microsoft, to be created for pulling data from a source file "Microsoft".
    WF_Microsoft has two dataflows, DF_Microsoft1 -> DF_Microsoft2 (DF_Microsoft1 connected to DF_Microsoft2).
    DF_Microsoft1 in turn has SourceTable_Microsoft as input and StagingTable_Microsoft as output.
    DF_Microsoft2 in turn has StagingTable_Microsoft as input and TargetTable_Microsoft as output, with some table comparison and query transformations in between them.
    Now, I need to create workflows for a few other source files, e.g. "SAP", "Oracle" and so on, with the same logic, the difference being that the source file is different and the inputs/outputs within the dataflows are different tables. I am looking for a solution where I need not create the dataflows and drag the objects again and again for all the 20 files, but instead can do it without much effort by just replacing the source and target tables.
    If I replicate, any change in the naming of the dataflow or a change in an object is also reflected in the original. But I need the original to remain the same.
    -Thanks,
    Naveen

  • Text Variable for a Restricted Key Figure

    Hi,
    I have a version comparison query with three Restricted Key Figures based on 0QUANTITY, each of which is restricted by Version, Value Type, and Fiscal Year. All three are in my query columns. The users select 3 sets of version, value type, and fiscal year in the selection for comparison.
    They want to see the text of the version they have selected as the description of the column. How can I achieve this with text variables on these 3 Restricted Key Figures, which have multiple restrictions? Please advise.
    Thanks
    Rashmi.

    Hi Rashmi,
    As you said, this can be achieved with a text variable.
    I have a doubt here: are there THREE columns for Quantity, or just ONE column with three restrictions?
    In case it is three columns, then you can go with text variables; otherwise you may have to populate it in three columns and, by using a workbook, combine them into one column, i.e. one column with all the restrictions' text.
    And here is how you create a text variable, for example on 0FISCALYEAR:
    1) Right-click on 0FISCALYEAR, create an input variable, say YFISCYEAR, and save.
    2) If you are using 0FISCALYEAR in the RKF, then drag both 0FISCALYEAR and the KF, say 0QUANTITY, into the new selection.
    3) After the description of the RKF there is a = sign; click on that to VIEW and CREATE the text variable.
    4) Create a new variable, enter the technical name (say YFYEAR) and description, and on the same screen choose the type Replacement Path, after which you will get a set of InfoObjects.
    5) Choose 0FISCALYEAR.
    6) On the next screen choose the technical name of the input variable (YFISCYEAR) and save.
    7) In the description of the RKF, type 0FISCALYEAR &YFYEAR&.
    Check this too .....
    /people/surendrakumarreddy.koduru/blog/2009/02/26/using-text-variables-with-customer-exits-in-report-headings
    rgds
    SVU123

  • Creating query on BEx - Quarterly comparison for statistical & Actual

    Hi All,
    I would like to create a query for 'Quarterly comparison for statistical & Actual periods'.
    My Key Figures should be
    1) Plan 1st Qtr (Fiscal year, Period: 1 to 3, Value type : 1(Plan), Version : 0(Plan/actual), Valuation View: actual Value).
    2)1st Qtr (Fiscal year, Period: 1 to 3, "Value type : 4(Actual),11(Actual statistical)", Version : 0(Plan/actual), Valuation View: actual Value).
    3)Var 1st Qt (Plan 1st qtr - 1st Qtr)
    Same thing for all 4 quarters, finally with
    4)Plan Year (Fiscal year, Period: 1 to 12, Value type : 1(Plan), Version : 0(Plan/actual), Valuation View: actual Value).
    I created a structure and created key figures with selections and formulas as required. But I did not see any data when I ran this query.
    The report was generated with 'no applicable data'.
    I need to create this query with the key figures Plan 1st Qtr, 1st Qtr, Var 1st Qtr, Plan 2nd Qtr, 2nd Qtr, Var 2nd Qtr, Plan 3rd Qtr, 3rd Qtr, Var 3rd Qtr, Plan 4th Qtr, 4th Qtr, Var 4th Qtr, and Plan Year.
    Please let me know how I can create this query with these Key Figures.
    Any help would be appreciated. Please respond with a reply.
    Thanks,
    Aparna.

    Hi
    The best way is to run a report with your KF without any restriction, and the different characteristics in the drilldown: Fiscal Year, Period, Value Type, Version, Valuation View.
    Then you can check that you have some information for the combination of values of your characteristics:
    Fiscal year, Period: 1 to 3, Value type: 1 (Plan), Version: 0 (Plan/actual), Valuation View: actual value.
    If you find an actual value in the fiscal period you are looking at, for periods 1 to 3, for value type 1, for version 0, then create a restricted KF by adding the restrictions one at a time. You might discover why you do not get the results.
    PY

  • BEx Query Designer 2 years comparison by month

    Hi experts,
    I'm drafting a report which compares cost variance across 2 years. I want to have 2 columns of information: Cost & Variance % (which is (Current Period - Previous Period) * 100 / Previous Period).
    I created the following fields & defined a value range of 'CalYear/Month' in the Filter of Characteristic Restrictions with a variable offset value of -12, which accepts input From mm.yyyy & To mm.yyyy.
    1. Characteristic of 'Cost'
    2. 'Cost -12', which is Cost with a restriction of 'CalYear/Month' in 'From mm.yyyy' -12 to 'To mm.yyyy' -12 (HIDDEN FIELD)
    3. 'Variance %', which uses the function ['Cost' % 'Cost -12']
    After I input From Value 04.2011 & To Value 06.2011, it shows 6 columns of the cost in Apr 2010, May 2010, Jun 2010, Apr 2011, May 2011, Jun 2011. Showing 0 under 'Variance %' for Apr 2010, May 2010 & Jun 2010 is correct, but it can't calculate any value of Variance % under Apr 2011, May 2011 & Jun 2011.
    Please advise.
    Thanks & Regards
    Jack
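    The variance formula in the question is plain arithmetic; here is a minimal Python sketch of it (the zero-guard is an assumption, since a missing previous-period value would otherwise divide by zero):

```python
# Variance % as defined in the question:
# (Current Period - Previous Period) * 100 / Previous Period.
def variance_pct(current, previous):
    if previous == 0:
        return None  # no previous-period value to compare against
    return (current - previous) * 100.0 / previous
```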

    Create a CKF/formula - then use Data Functions.
    Use the SUMGT operator.
    SUMCT:
    Returns the result of the operand to all rows or columns.
    SUMGT:
    Returns the overall result of the operand.
    SUMRT:
    Returns the query result of the operand.
    How to use SUMCT, SUMGT and SUMRT:
    http://help.sap.com/saphelp_nw04/helpdata/en/03/17f13a2f160f28e10000000a114084/frameset.htm
    Regards,
    rvc

  • Query of queries date comparison

    Cut to the basics, I'm trying to run the following code:
    <cfset qData = QueryNew("dataDate,ID")>
    <cfset padDate = "#DateFormat(Now(), 'dd mmm yy')# 23:59">
    <cfset queryAddRow(qData)>
    <cfset QuerySetCell(qData, "ID", 1)>
    <cfset QuerySetCell(qData, "dataDate", padDate)>
    <cfset delDate = "#DateFormat(Now(), 'dd mmm yy')# 00:00">
    <cfquery name="qZero" dbtype="query">
    SELECT ID
    FROM qData
    WHERE dataDate = '#delDate#'
    </cfquery>
    This works fine in MX7, but I need to put it on a server running
    MX6.1. It appears that in 6.1, query of queries considers the
    dataDate field to be a date but will not accept a date on the right-hand
    side of the equals sign in the WHERE clause, so it comes up with
    'Unsupported type comparison'. Is there any way round this?

    I just ran into this problem actually.
    What appears to be happening is QoQ has trouble comparing SQL
    date types and DB date types. I had to perform an LSDateFormat on
    the data to get it to process.
    I am also looking into a problem where QoQ is switching my
    dates to strings. CF has yet to impress me. For every kind of cool
    thing they do there are 25 lame things.
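    The underlying issue, sketched in Python rather than CFML: comparing a parsed date against a raw string is a type mismatch, and the usual workaround (as with LSDateFormat above) is to normalize both sides to the same representation before comparing. The format string here just mirrors the "dd mmm yy" plus time pattern from the question.

```python
from datetime import datetime

FMT = "%d %b %y %H:%M"  # mirrors the "dd mmm yy" + time pattern above

def same_moment(stored, literal):
    # Parse both sides to datetime before comparing, instead of letting
    # one side remain a string.
    return datetime.strptime(stored, FMT) == datetime.strptime(literal, FMT)
```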

  • Query Builder - Where Clause - Could not format error using date comparison

    We've come across a bug in the Query Builder. Under the Create Where Clause tab, if you select a column of Date type plus one of the comparison operators =, !=, <, >, <=, >=, BETWEEN or NOT BETWEEN, it displays an error in the Logging Page:
    Level: Severe
    Source: o.d.r.queryBuilder.SQLGenerator
    Message: Could not format :2010-09-02 16:20:31.0
    Then under the Show SQL tab it doesn't display the date(s) you selected, e.g.
    WHERE LAST_UPDATE BETWEEN AND
    Also the View Results tab does not display any results.
    You can still press Apply to add the SQL as is to the editor window and from there you have to manually code in the date parameters.
    We're using the latest version of SQL Developer 2.1.1.64.45 Windows 32bit version with JDK

    Hi Gordon,
    When I add the following lines:
    declare @refdt1 date
    set @refdt =
    /*select 1 from jdt1 t0 where t0.RefDate*/ '[%1]'
    declare @refdt2 date
    set @refdt =
    /*select 1 from jdt2 t0 where t0.RefDate*/ '[%2]'
    WHERE T0.RefDate >= @refdt1 and T0.RefDate <= @refdt2
    ... the error message is now:
    Must declare the scalar variable @refdt1
    Note: Before adding these lines, the query works perfectly, and returns totals from the whole database (with dynamically generated column headings!)
    Thanks
    Leon Lai
    AMENDED QUERY:
    declare @refdt1 date
    set @refdt1 =
    /*select 1 from jdt1 t0 where t0.RefDate*/ '[%1]'
    declare @refdt2 date
    set @refdt2 =
    /*select 1 from jdt1 t0 where t0.RefDate*/ '[%2]'
    --------- I inserted the 6 lines above ---------------------
    DECLARE @listCol VARCHAR(2000)
    DECLARE @query VARCHAR(4000)
    SELECT @listCol =
    STUFF
    ( SELECT DISTINCT   '],['    +    CAST(month(T0.RefDate) AS varchar)
    FROM  JDT1 T0
    FOR XML PATH('')
    ), 1, 2, ' ') +   ']'
    SET @query =
    'SELECT * FROM
    (SELECT Account, month (T0.RefDate) Month , Debit
    FROM JDT1 T0
    ------------------- I add the WHERE clause below --------------------
    WHERE T0.RefDate >= @refdt1 and T0.RefDate <= @refdt2
    GROUP BY Account, RefDate, Debit
    ) S
    PIVOT
    (Sum(Debit)
    FOR Month IN ('+@listCol+')
    ) AS pvt'
    EXECUTE (@query)
    Edited by: LEONLAI on Oct 21, 2011 2:36 PM
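    The "Must declare the scalar variable" error happens because variables declared outside EXECUTE are not visible inside the dynamic SQL string. As a language-neutral illustration of the same month-pivot-with-date-range idea, here is a sqlite3 sketch (Python, not SAP Business One or SQL Server) that passes the two dates as bound parameters and pivots via conditional aggregation; the table and column names echo the question, but the data is made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jdt1 (Account TEXT, RefDate TEXT, Debit REAL)")
conn.executemany("INSERT INTO jdt1 VALUES (?, ?, ?)", [
    ("100000", "2011-01-15", 50.0),
    ("100000", "2011-02-10", 70.0),
    ("200000", "2011-01-20", 30.0),
])

# Month pivot via conditional aggregation; the date range is bound as
# parameters, so no variable needs to be visible inside dynamic SQL.
sql = """
SELECT Account,
       SUM(CASE strftime('%m', RefDate) WHEN '01' THEN Debit END) AS Jan,
       SUM(CASE strftime('%m', RefDate) WHEN '02' THEN Debit END) AS Feb
FROM jdt1
WHERE RefDate BETWEEN ? AND ?
GROUP BY Account
ORDER BY Account
"""
rows = conn.execute(sql, ("2011-01-01", "2011-12-31")).fetchall()
```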

  • Query for Balance Sheet comparison Actual/Plan

    Hello,
    I'm trying to create a query in BEx using the cube 0FIGL_V10 to display the balance sheet and P&L statement in BI. It works fine with an actual/actual comparison, but I don't know how to make a comparison between actual and plan figures.
    I am using as filter criteria the dimension value type (0VTYPE 10, which is actual, and 0VTYPE 20, which is plan). But how is it possible to display it in the right lines and columns?
    I need to fix this by the end of next week, so please answer as soon as you can.
    Thanks a lot for your help.
    Regards
    Robert

    Hello Sam,
    Thanks a lot for your very good answer. It works perfectly, but there is one thing left; maybe you can answer that too:
    I want to display 3 actual periods but only one plan period. How can I realize this?
    Thanks again for your prompt help.
    Regards
    Robert

  • Query about date comparison

    Hi,
    I have a database table with a date field called 'ENDDAT'.
    How do I write a SELECT query that selects all records whose difference between sy-datum and 'ENDDAT' is greater than 7?
    If I do the normal date comparison, e.g.
    select * from <databasetable> where enddat - sy-datum > 7
    or
    select * from <databasetable> where (enddat - sy-datum) > 7
    it throws an error that '-' is not a valid comparison operator.
    Can anyone help?

    I've never seen that done before. You can do as Joseph has suggested. Even if you could do it in the select statement, I'm not really sure that you'd want to put that kind of processing on the database.
    data: begin of itab occurs 0,
            enddat type sy-datum,
          end of itab.
    data: days type i.
    select * into corresponding fields of table itab
           from <databasetable>.
    loop at itab.
      days = itab-enddat - sy-datum.
    * Keep only records where the difference is greater than 7.
      if days <= 7.
        delete itab.
      endif.
    endloop.
    Regards,
    Rich Heilman
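    The same keep-if-more-than-7-days filter as the ABAP loop, sketched in Python; the record layout is a made-up stand-in for the database table.

```python
from datetime import date

def filter_enddat(records, today):
    # Keep only records where ENDDAT minus today's date exceeds 7 days,
    # mirroring the enddat - sy-datum > 7 condition from the question.
    return [r for r in records if (r["enddat"] - today).days > 7]
```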

  • Is there a speed comparison between XPath queries

    Is there a paper with a speed comparison of XPath queries on XML stored with each of Oracle's storage techniques?
    Thank you, and sorry for my English.

    Not currently. Since there are no industry standards for XML benchmarks that are accepted by all major vendors, Oracle policy prevents us from publishing numbers. However, if you have examples of what you are looking for, I can give guidance.
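    Since no vendor numbers are published, a rough way to get your own is a micro-benchmark. The sketch below uses plain Python/ElementTree (not Oracle XML DB) just to show the shape of such a harness: time each query form against the same document and compare. Swap in your own fetch-and-query calls for the storage options you care about.

```python
import timeit
import xml.etree.ElementTree as ET

# Build a small test document with 200 rows.
doc = ET.fromstring(
    "<root>" + "".join("<row><id>%d</id></row>" % i for i in range(200)) + "</root>")

# Two equivalent path queries to compare.
queries = {
    "child path": lambda: doc.findall("./row/id"),
    "descendant path": lambda: doc.findall(".//id"),
}
timings = {name: timeit.timeit(fn, number=50) for name, fn in queries.items()}
```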
