Dynamically replacing pattern using SQL only

Hi,
File name table:
CREATE TABLE file_name_table AS
( SELECT 'FILE_A_[PATTERN1]_[PETTERN2].txt' AS file_name FROM DUAL
UNION
SELECT 'FILE_B_[PATTERN3].txt' FROM DUAL
UNION
SELECT 'FILE_B_[PATTERN3]_[PATTERN4].txt' FROM DUAL);
Pattern table:
CREATE TABLE pattern_table AS
( SELECT '[PATTERN1]' AS pattern, 'P00191' AS pattern_value  FROM DUAL
UNION
SELECT '[PATTERN2]' AS pattern, 'P00293' AS pattern_value  FROM DUAL
UNION
SELECT '[PATTERN3]' AS pattern, 'p567' AS pattern_value  FROM DUAL
UNION
SELECT '[PATTERN4]' AS pattern, 'p879' AS pattern_value  FROM DUAL
UNION
SELECT '[PATTERN5]' AS pattern, 'p005' AS pattern_value FROM DUAL);
Now I need a view which shows the following output:
'FILE_A_P00191_P00293.txt'
'FILE_B_p567.txt'
'FILE_B_p567_p879.txt'
Basically, in the output each [PATTERN] token in the file name should be matched against the pattern table and replaced by its pattern value. Hope I am clear. Please help.
Edited by: Mr Lonely on May 22, 2013 1:25 PM -- Fixed the output.

Hi Jeneesh,
This is working excellent.
However I have a small problem.
If the file name contains something in brackets, for example filename_[FIXED], and that pattern does not exist in the pattern table, then it is being replaced with null.
For example.
FILE_NAME                        NEW_FNAME
FILE_B_[PATTERN3].txt            FILE_B_p567.txt
TEST_[TEST_PATTERN].txt          TEST_.txt
TEST_[TEST_PAT].txt              TEST_LOL.txt
FILE_A_[PATTERN1]_[PETTERN2].txt FILE_A_P00191_.txt
FILE_B_[PATTERN3]_[PATTERN4].txt FILE_B_p567_p879.txt

Edited by: Mr Lonely on May 22, 2013 3:21 PM
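Jeneesh's actual query is not quoted in this thread, so for reference here is a minimal sketch of one way to build such a view, assuming Oracle 11.2 or later (recursive subquery factoring). It applies the rows of pattern_table one at a time with plain REPLACE, so a token that has no row in pattern_table (such as [TEST_PATTERN] above) is simply left in the name instead of being blanked out. The view name is made up for the example.
CREATE OR REPLACE VIEW file_name_resolved_vw AS
WITH p AS (
  SELECT pattern, pattern_value,
         ROW_NUMBER() OVER (ORDER BY pattern) AS rn
    FROM pattern_table
),
rep (file_name, new_fname, rn) AS (
  -- start from the raw file names ...
  SELECT file_name, file_name, 0
    FROM file_name_table
  UNION ALL
  -- ... and apply one pattern per recursion step
  SELECT r.file_name,
         REPLACE(r.new_fname, p.pattern, p.pattern_value),
         r.rn + 1
    FROM rep r
    JOIN p ON p.rn = r.rn + 1
)
SELECT file_name, new_fname
  FROM rep
 WHERE rn = (SELECT MAX(rn) FROM p);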

Similar Messages

  • Count rows from multiple tables using SQL only

    Hi, I know this has probably been answered before, but I couldn't find the answer anywhere. Please help.
    I'd like count(*) [rows] for all tables in database using SQL only - no PL/SQL
    The result should be something like:
    Table RowCount
    DBA_TABLES 1000
    DBA_USERS 50
    etc.
    Thanks!

    Of course; write this script:
    create or replace procedure count_tables (ip_schema VARCHAR2)
    is
      lv_owner         VARCHAR2(100);
      lv_table_name    VARCHAR2(100);
      lv_sql_statement VARCHAR2(2000);
      lv_count_table   NUMBER;
      CURSOR c1 IS
        SELECT owner, table_name
          FROM all_tables
         WHERE owner = ip_schema
         ORDER BY table_name;
    begin
      dbms_output.put_line ('+--------------------------------------------------------------------+');
      dbms_output.put_line ('¦ | | ¦');
      dbms_output.put_line ('¦ Schema Name | Table Name | Number of Rows ¦');
      dbms_output.put_line ('¦ | | ¦');
      dbms_output.put_line ('¦------------------------------------------------------------------¦');
      OPEN c1;
      LOOP
        FETCH c1 INTO lv_owner, lv_table_name;
        EXIT WHEN c1%NOTFOUND;
        lv_sql_statement := 'SELECT count(*) FROM ' || lv_owner || '.' || lv_table_name;
        EXECUTE IMMEDIATE lv_sql_statement INTO lv_count_table;
        IF lv_count_table > 0 THEN
          dbms_output.put_line ('| '||rpad(lv_owner, 14, ' ')||'| '|| rpad(lv_table_name, 32, ' ')||'| '|| rpad(lv_count_table, 16, ' ')||' |');
          -- dbms_output.put_line ('|---------------|---------------------------------|------------------|');
        END IF;
      END LOOP;
      CLOSE c1;
      dbms_output.put_line ('+--------------------------------------------------------------------+');
    exception
      WHEN OTHERS THEN
        dbms_output.put_line ('owner: '||lv_owner||' - table: '||lv_table_name||' - '||sqlerrm);
    end count_tables;
    /
    set serveroutput on size 1000000
    exec count_tables(USER)
    drop procedure count_tables;
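    Since the original question asked for SQL only (no PL/SQL), the usual alternative is the DBMS_XMLGEN trick sketched below; it assumes the package is available and that every table in USER_TABLES can actually be queried (you may need to filter out special segments):
    SELECT table_name,
           TO_NUMBER(
             EXTRACTVALUE(
               XMLTYPE(DBMS_XMLGEN.GETXML('SELECT COUNT(*) c FROM "' || table_name || '"')),
               '/ROWSET/ROW/C')) AS row_count
      FROM user_tables
     ORDER BY table_name;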

  • Dynamic header creation using SQL

    Hi Gurus,
    I need your help again. I have a query which uses a date parameter to populate a report. The report pulls data from the user-entered date back eleven months. The report counts total calls registered each month. I have the query working fine but I need help in populating the header.
    For example, suppose I run the query on today's date (18-Jan-2012).
    The report header will be -
    Jan Feb Mar Apr ....Dec
    But I want the header to be populated in the following format -
    Jan 2012 Feb 2011 Mar 2011 Apr 2011....Dec 2011
    And, for example, if I run the report for a future date, say 21-May-2012,
    The header will be in the following format -
    Jan 2012....May 2012 Jun 2011 Jul 2011 .......Dec 2011.
    Please let me know if I can populate the header using SQL. Any help is greatly appreciated.

    Hi Tenacious,
    You wrote:
    I want the header to be populated in the dynamic format with the year value concatenated to the Month column.
    My script does that; you can look one more time at the output in my first post, and also in my second post.
    And if you want another example, when we replace 18-Jan-2012 by 21-May-2012, we have the following:
    SQL> select to_char(col,'Monyyyy') col_date
      2  from
      3  (select
      4         add_months(to_date('21-May-2012','dd-Mon-yyyy'),
      5                      -level + 1) col
      6  from dual
      7  connect by level <= extract(month from to_date('21-May-2012','dd-Mon-yyyy'))
      8  union
      9  select
    10         add_months(to_date('21-May-2012','dd-Mon-yyyy')
    11                    , level - 12 )
    12  from dual
     13  connect by level <= 12 - extract(month from to_date('21-May-2012','dd-Mon-yyyy')))
    14  order by extract(year from col) desc, extract(month from col);
    COL_DAT
    Jan2012
    Feb2012
    Mar2012
    Apr2012
    May2012
    Jun2011
    Jul2011
    Aug2011
    Sep2011
    Oct2011
    Nov2011
    COL_DAT
    Dec2011
    12 rows selected.
    SQL>
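    As a side note, the same twelve headers can be produced with a single row generator instead of a UNION; this is only a sketch, assuming the run date arrives as a DATE bind variable :run_date:
    SELECT TO_CHAR(
             ADD_MONTHS( TRUNC(:run_date, 'YYYY'),
                         LEVEL - 1
                         - CASE WHEN LEVEL > EXTRACT(MONTH FROM :run_date) THEN 12 ELSE 0 END ),
             'Mon YYYY') AS col_header
      FROM dual
    CONNECT BY LEVEL <= 12
     ORDER BY LEVEL;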

  • Dynamic Dimension Building using SQL Interface

    I am attempting to use SQL Interface in Essbase 6.5 to dynamically build the account dimension from a PeopleSoft tree. The view works fine in SQL worksheet, but the last UNION statement does not work using the SQL interface. Here is the SQL that is generated: [SELECT * FROM ps_n_pre_acc_vw WHERE 1 = 1 order by 6,4,5,3]. Here is the view:
    SELECT B.TREE_NODE AS PARENT
         , A.TREE_NODE AS CHILD
         , (A.TREE_NODE||' : '||C.DESCR)
         , b.tree_node_num AS num
         , a.tree_node_num AS c_tree_node_num
         , 'A' AS ord
      FROM SYSADM.PSTREENODE A
         , SYSADM.PSTREENODE B
         , SYSADM.PS_TREE_NODE_TBL C
     WHERE A.TREE_NAME = 'PRE_CUBE'
       AND A.SETID = 'NW'
       AND A.EFFDT = ( SELECT MAX(EFFDT) FROM PSTREELEAF WHERE TREE_NAME = 'PRE_CUBE')
       AND A.TREE_NAME = B.TREE_NAME
       AND A.SETID = B.SETID
       AND A.EFFDT = B.EFFDT
       AND A.PARENT_NODE_NUM = B.TREE_NODE_NUM
       AND A.TREE_NODE = C.TREE_NODE
       AND A.SETID = C.SETID
       AND C.EFF_STATUS = 'A'
       AND C.EFFDT = ( SELECT MAX(EFFDT) FROM SYSADM.PS_TREE_NODE_TBL X WHERE C.TREE_NODE = X.TREE_NODE AND C.SETID = X.SETID)
    UNION
    SELECT B.TREE_NODE AS PARENT
         , (A.RANGE_FROM||' : '||DESCR) AS child
         , A.RANGE_FROM
         , b.tree_node_num AS num
         , a.tree_node_num AS c_tree_node_num
         , 'B' AS ord
      FROM SYSADM.PSTREELEAF A
         , SYSADM.PSTREENODE B
         , SYSADM.PS_GL_ACCOUNT_TBL C
     WHERE A.TREE_NAME = 'PRE_CUBE'
       AND A.SETID = 'NW'
       AND A.EFFDT = ( SELECT MAX(EFFDT) FROM PSTREELEAF WHERE TREE_NAME = 'PRE_CUBE')
       AND A.TREE_NAME = B.TREE_NAME
       AND A.EFFDT = B.EFFDT
       AND A.SETID = B.SETID
       AND A.TREE_NODE_NUM = B.TREE_NODE_NUM
       AND C.SETID = A.SETID
       AND A.RANGE_FROM = C.ACCOUNT
       AND C.EFFDT = ( SELECT MAX(EFFDT) FROM SYSADM.PS_GL_ACCOUNT_TBL X WHERE C.ACCOUNT = X.ACCOUNT AND c.setid = x.setid)
    UNION
    SELECT B.TREE_NODE AS PARENT
         , (D.ACCOUNT||' : '||DESCR) AS child
         , D.ACCOUNT
         , b.tree_node_num AS num
         , a.tree_node_num AS c_tree_node_num
         , 'B' AS ord
      FROM pstreeleaf a
         , pstreenode b
         , PS_GL_ACCOUNT_TBL D
     WHERE a.tree_name = 'PRE_CUBE'
       AND a.effdt = '01-JAN-2001'
       AND A.SETID = 'NW'
       AND A.RANGE_FROM <> A.RANGE_TO
       AND A.TREE_NAME = B.TREE_NAME
       AND A.SETID = B.SETID
       AND A.EFFDT = B.EFFDT
       AND A.TREE_NODE_NUM = B.TREE_NODE_NUM
       AND D.ACCOUNT BETWEEN A.RANGE_FROM AND A.RANGE_TO
       AND D.SETID = 'NW'
       AND D.EFFDT = ( SELECT MAX(EFFDT) FROM PS_GL_ACCOUNT_TBL X WHERE D.ACCOUNT = X.ACCOUNT AND D.SETID = X.SETID)
    I created a view from the last union section of the SQL and Essbase returns a zero rows found error, although when I ran the view in SQL worksheet I was able to get rows returned. Any ideas on why the view does not return the expected results in Essbase?

    before your query do:
    <cfsavecontent
    variable="emailmessagebody"><cfinclude
    template="#mancbPath#/mancb_body.cfm"></cfsavecontent>
    then in your query instead of using '<cfinclude ...>'
    use
    '#emailmessagebody#'
    Azadi Saryev
    Sabai-dee.com
    http://www.sabai-dee.com/

  • SQL: Can you print the numbers 1 up to 10 using SQL only?

    Hi Friends,
    I want to know whether it is possible to print all the numbers from 1 up to 10 using a SQL statement only.
    An earlier reply is appreciated.
    Thanks in Advance
    Sriram

    You may try something like this:
    select rownum from all_objects where rownum<=10;
    This should normally work, for I can't imagine how one may have a DB with fewer than 10 objects. Still, it's not too elegant.
    Why not use pl/sql?
    declare
      n number;
    begin
      dbms_output.enable;
      for n in 1..10 loop
        dbms_output.put_line(to_char(n, '9990'));
      end loop;
    end;
    /
    This would seem the most natural approach.
    Regards,
    BD.
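    For completeness, the row-generator idiom that is usually suggested for this (it does not depend on how many objects happen to exist) is a hierarchical query against DUAL:
    SELECT LEVEL AS n
      FROM dual
    CONNECT BY LEVEL <= 10;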

  • Copy hierarchical data using SQL only.

    Is there any way to copy hierarchical data that uses IDs without using a PL/SQL loop procedure or a global temporary table?
    Example
    CREATE TABLE DUCK_TREE
    (
      FIRST_NAME VARCHAR2(100 BYTE),
      LAST_NAME  VARCHAR2(100 BYTE),
      FAMILY_ID  NUMBER,
      PARENT_ID  NUMBER
    );
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Louie', 'Duck', 207, 203);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Dewey', 'Duck', 206, 203);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Huey', 'Duck', 205, 203);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Donald', 'Duck', 204, 201);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Thelma', 'Duck', 203, 201);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Daphne', 'Duck', 202, 200);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Quackmore', 'Duck', 201, 200);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Grandma', 'Duck', 200, NULL);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Hortense', 'McDuck', 103, 100);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Scrooge', 'McDuck', 102, 100);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Matilda', 'McDuck', 101, 100);
    INSERT INTO DUCK_TREE ( FIRST_NAME, LAST_NAME, FAMILY_ID, PARENT_ID ) VALUES (
    'Scotty', 'McDuck', 100, NULL);
    COMMIT;
    Now, how would you copy the hierarchical data starting, say, at Grandma and all her children?

    I can't really see a way to get away from iterative PL/SQL or the use of temporary tables (global or otherwise).
    The best solution I've managed to come up with so far is use at least 1 GTT to store a mapping of current FAMILY_IDs to new FAMILY_IDs, then use the map table to translate the old IDs to new IDs during the copy phase.
    DROP TABLE id_map;
    CREATE global TEMPORARY TABLE ID_MAP
        (   old_id NUMBER,
            new_id NUMBER
        )
        ON COMMIT DELETE ROWS;
    DROP SEQUENCE duck_tree_seq;
    CREATE SEQUENCE duck_tree_seq INCREMENT BY 1
        START WITH 208;
    create or replace procedure ADD_DUCK_WITH_DESCENDENTS
          (fName varchar2, lName varchar2, pid number, RootID number) is
        new_id NUMBER;
    begin
        -- Create new Ancestor
        insert into duck_tree values (fname, lname, duck_tree_seq.nextval, pid)
        returning family_id into new_id;
        -- Add root mapping
        insert into id_map values (rootid, new_id);
        -- Map lineage
        INSERT INTO id_map
        WITH t AS
            (SELECT d.*
               FROM duck_tree d
              WHERE level <> 1 -- exclude root ancestor
            CONNECT BY d.parent_id = PRIOR d.family_id
              START WITH d.first_name = 'Grandma'
            )
         SELECT family_id, DUCK_TREE_SEQ.nextval newid FROM t;
        -- Copy descendents
        insert into duck_tree (
            first_name,
            last_name,
            family_id,
            parent_id)
        WITH t AS
            (SELECT d.*
               FROM duck_tree d
              WHERE level <> 1 -- exclude root ancestor
            CONNECT BY d.parent_id = PRIOR d.family_id
              START WITH d.first_name = 'Grandma'
            )
         SELECT t.first_name
              , t.last_name
              , m1.NEW_ID
              , m2.NEW_ID
           FROM t
        JOIN id_map m1
             ON t.family_id = m1.old_id
        LEFT JOIN id_map m2
             ON t.parent_id = m2.old_id;
        -- Clear out the map table
        delete from id_map;
    end;
    call add_duck_with_descendents('Grandpa','Duck',null,200);
    /
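    If the new keys do not have to come from a sequence, the subtree can actually be copied in one SQL statement. This is just a sketch, assuming a fixed offset (1000 here, purely illustrative) keeps the new FAMILY_IDs unique; adding the same offset to both key columns preserves the parent/child links without any mapping table:
    INSERT INTO duck_tree (first_name, last_name, family_id, parent_id)
    SELECT first_name,
           last_name,
           family_id + 1000,                               -- new key = old key + offset
           CASE WHEN LEVEL = 1 THEN NULL                   -- detach the copied root
                ELSE parent_id + 1000 END
      FROM duck_tree
     START WITH first_name = 'Grandma'
    CONNECT BY PRIOR family_id = parent_id;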

  • Variable in regex replace pattern

    Hi,
    I need to use a variable in a regex replace pattern - how can I do it? Specifically, I need to pass arguments to a shell script that then uses that argument in a replace pattern:
    #!/bin/bash
    #$1 now holds the argument
    sed 's/searchpattern/replace_pattern_with_variable$1/g' file1 > file2
    when I run this, the replace pattern uses $1 as a literal string "$1" and not the variable value. How can I change this?
    Thanks!
    Ingo

    Hi Ingo,
       As Vid points out, the issue is that single quotes protect strings from shell interpretation. You need to have the dollar sign, '$', visible to the shell or it won't read what follows as a variable name. Using double quotes works because the shell "reads through" those.
       However, complex regular expressions can contain lots of characters that the shell interprets. These can be quoted individually by backslashes but the use of backslashes in regular expressions is complex enough without the addition of shell interpretation. I find it easiest to keep the single quotes and only expose the part of the string that the shell needs to interpret.
       The shell doesn't have a special string concatenation character. All you have to do is to put the strings beside each other with nothing in between and the shell will concatenate them. Therefore it's possible to write your example as:
    sed 's/searchpattern/replace_pattern_with_variable'${1}'/g' file1 > file2
    That is, one closes the single quote right before the variable and then resumes it immediately afterward. The shell will put these quoted strings together with the contents of the variable just as it would with double quotes but you still enjoy the protection of single quotes around the rest of the string!
    Gary
    ~~~~
       P.S. Perl's master plan (or what passes for one) is to take
       over the world like English did. Er, as English did...
          -- Larry Wall in <[email protected]>

  • How to query using 3 optional inputs and case insensitive using SQL?

    Hi Folks,
    I am having trouble with the following query:
    select *
    from t1
    where (1=1)
    and lower(fname) like lower ('%mary%')
    and lower(lname) like lower ('%smith%')
    and lower(status) like lower ('%%')
    I need all three conditions in the where clause to be completely optional and case insensitive.
    if I just write the following:
    (1=1)
    and lower(fname) like lower ('%mary%')
    and lower(lname) like lower ('%%') <-- Need to ignore this line
    and lower(status) like lower ('%%') <-- need to ignore this line
    nothing is returned. How do I ignore the 2nd and 3rd lines using SQL only? I know about the Ask Tom article using procedures, but I need to do this using SQL only.
    thanks in advance
    Edited by: user2184537 on Oct 16, 2009 9:40 AM
    Edited by: user2184537 on Oct 16, 2009 10:10 AM

    Hi,
    Is this query generated dynamically? (That's the only reason I can see for saying "WHERE 1 = 1".)
    If so, test the parameters for NULL, and only add them if a value was given.
    Failing that, you can explicitly test for NULL parameters:
    where   (     lower(fname)  = '%' || lower (:p_fname) || '%'
             OR   :p_fname      IS NULL
            )
    and     (     lower(lname)  = '%' || lower (:p_lname) || '%'
             OR   :p_lname      IS NULL
            )
    and     (     lower(status) = '%' || lower (:p_status) || '%'
             OR   :p_status     IS NULL
            )
    Did you really mean to have all those '%'s? '%' is a wildcard in LIKE operations, but not when using =.
    Perhaps you should be saying:
    where   (     lower(fname)  = lower (:p_fname)
             OR   :p_fname      IS NULL
            )
    and     (     lower(lname)  = lower (:p_lname)
             OR   :p_lname      IS NULL
            )
    and     (     lower(status) = lower (:p_status)
             OR   :p_status     IS NULL
            )
    You're already handling case-sensitivity by using LOWER in all the comparisons.
    Unfortunately, you can't just say something like:
    WHERE   LOWER (fname) = LOWER (NVL (:p_fname, fname))
    because that would discard rows where fname IS NULL when :p_fname is also NULL.
    Edited by: Frank Kulash on Oct 16, 2009 12:54 PM
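    Putting the two points together for the original requirement (optional, case-insensitive, partial matches), one possible shape of the final query is the sketch below, where :p_fname, :p_lname and :p_status are the three optional bind values:
    select *
      from t1
     where (:p_fname  is null or lower(fname)  like '%' || lower(:p_fname)  || '%')
       and (:p_lname  is null or lower(lname)  like '%' || lower(:p_lname)  || '%')
       and (:p_status is null or lower(status) like '%' || lower(:p_status) || '%');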

  • Crosstab query using pure SQL only

    Hi all,
    Found a lot of threads on crosstab, but none seems to address what I need. I need to perform a crosstab query using pure SQL only, and the number of columns is dynamic. From a query, I obtained the below table:
    Name Date Amount
    Alex 2005-06-10 1000
    Alex 2005-06-20 1000
    Alex 2005-07-10 1000
    Alex 2005-07-20 1000
    Alex 2005-08-10 1000
    Alex 2005-08-20 1000
    John 2005-06-10 2000
    John 2005-06-20 2000
    John 2005-07-10 2000
    John 2005-07-20 2000
    John 2005-08-10 2000
    John 2005-08-20 2000
    And I need to transform it into:
    Name 06-2005 07-2005 08-2005
    Alex 2000 2000 2000
    John 4000 4000 4000
    The reason the columns are dynamic is that there'll be a limit on the date ranges to select the data from. I'd have a lower and upper bound date, say June-2005 to August-2005, which explains how I got the data in the above table.
    Please advise.
    Thanks!

    Hi,
    I couldn't resist the intellectual challenge of a pure SQL solution for a pivot table with a dynamic number of columns. As Laurent pointed out, a SQL query can only have a fixed number of columns. You can fake a dynamic number of columns, though, by selecting a single column containing data at fixed positions.
    If it were me, I'd use a PL/SQL solution, but if you must have a pure SQL solution, here is an admittedly gruesome one. It shows the sum of all EMP salaries per department over a date range defined by start and end date parameters (which I've hardcoded for simplicity). Perhaps some of the techniques demonstrated may help you in your situation.
    set echo off
    set heading on
    set linesize 100
    select version from v$instance ;
    set heading off
    column sort_order noprint
    column sal_sums format a80
    select -- header row
      1        as sort_order,
      'DEPTNO' as DEPTNO ,
      sys_connect_by_path
        ( rpad
            ( to_char(month_column),
              10
            ),
          ' | '
        ) as sal_sums
    from
      ( select
          add_months( first_month, level - 1 ) as month_column
        from
          ( select
              date '1981-01-01' as first_month,
              date '1981-03-01' as last_month,
              months_between( date '1981-03-01', date '1981-01-01' ) + 1 total_months
            from dual
          )
        connect by level < total_months + 1
      ) months
    where
      connect_by_isleaf = 1
    connect by
      month_column = add_months( prior month_column, 1 )
    start with
      month_column = date '1981-01-01'
    union all
    select -- data rows
      2 as sort_order,
      deptno,
      sys_connect_by_path( sum_sal, ' | ' ) sal_sums
    from
      ( select
          dept_months.deptno,
          dept_months.month_column,
          rpad( to_char( nvl( sum( emp.sal ), 0 ) ), 10 ) sum_sal
        from
          ( select
              dept.deptno,
              reporting_months.month_column
            from
              dept,
              ( select
                  add_months( first_month, level - 1 ) as month_column
                from
                  ( select
                      date '1981-01-01' as first_month,
                      date '1981-03-01' as last_month,
                      months_between( date '1981-03-01', date '1981-01-01' ) + 1 total_months
                    from
                      dual
                  )
                connect by level < total_months + 1
              ) reporting_months
          ) dept_months,
          emp
        where
          dept_months.deptno = emp.deptno (+) and
          dept_months.month_column = trunc( emp.hiredate (+), 'MONTH' )
        group by
          dept_months.deptno,
          dept_months.month_column
      ) dept_months_sal
    where
      month_column = date '1981-03-01'
    connect by
      deptno = prior deptno and
      month_column = add_months( prior month_column, 1 )
    start with
      month_column = date '1981-01-01'
    order by
      1, 2
    VERSION
    10.1.0.3.0
    DEPTNO      | 81-01-01   | 81-02-01   | 81-03-01
    10          | 0          | 0          | 0
    20          | 0          | 0          | 0
    30          | 0          | 2850       | 0
    40          | 0          | 0          | 0
    Now, if we substitute '1981-03-01' with '1981-06-01', we see 7 columns instead of 4
    DEPTNO      | 81-01-01   | 81-02-01   | 81-03-01   | 81-04-01   | 81-05-01   | 81-06-01
    10          | 0          | 0          | 0          | 0          | 0          | 2450
    20          | 0          | 0          | 0          | 2975       | 0          | 0
    30          | 0          | 2850       | 0          | 0          | 2850       | 0
    40          | 0          | 0          | 0          | 0          | 0          | 0
    To understand the solution, start by running the innermost subquery by itself and then work your way outward.
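    On later releases this gets much easier when the column set can be fixed for each run: a sketch assuming Oracle 11g or later, where my_amounts stands in for the poster's Name/Date/Amount query:
    SELECT *
      FROM ( SELECT name, TO_CHAR(dt, 'MM-YYYY') AS mon, amount
               FROM my_amounts )
     PIVOT ( SUM(amount)
             FOR mon IN ('06-2005' AS "06-2005", '07-2005' AS "07-2005", '08-2005' AS "08-2005") );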

  • Using SQL to update Translation Patterns in CUCM 9.1

    Hello,
    I have approximately 400 translation patterns that I have exported from one CUCM 9.1 system that I want to import into another CUCM 9.1 system as this will save me lots of work.
    I have done this and the import was successful but when I look at the Translation Patterns in CUCM admin the value for
    Discard Digits under Called Party Transformations is shown as < None > where it should be PreDot.
    Oddly, when I export the Translation Patterns from the CUCM cluster to which I have just imported them, the value shown in the CSV file is PreDot.
    I was wondering whether it would be possible to change the value for Discard Digits using SQL. Despite reading Bill Bell's helpful blog series I am struggling to work out how to do this.
    The basic operation I want to do is described below:
    Update Discard Digits to PreDot for all Translation Patterns that start with 90.
    Can anyone help me with this?

    Thank you for the answer, in my case I only need to distribute LSCs to the phones for 802.1x authentication. As far as I understand, it is possible to just update the CTL file with Cisco CTL Client utility.

  • How to send an email with attachment to dynamic email addresses using PL/SQL

    Hi,
    I want to send an automated email with an attachment every day to different people, so the number of people is not static.
    So, is there any way to do this using PL/SQL?
    thanks for your support!

    I want to send an automated email with an attachment every day to different people, so the number of people is not static.
    Why? Explain it.
    You can create a table and store your email id through front-end application day to day.
    The table should look like this ->
    satyaki>
    satyaki>select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod
    PL/SQL Release 10.2.0.3.0 - Production
    CORE    10.2.0.3.0      Production
    TNS for 32-bit Windows: Version 10.2.0.3.0 - Production
    NLSRTL Version 10.2.0.3.0 - Production
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>
    satyaki>create table email_master
      2    (
      3       email_grp_header         varchar2(30) not null,
      4       craete_time                  timestamp,
      5       constraints pk_header primary key(email_grp_header)
      6    );
    Table created.
    Elapsed: 00:00:02.12
    satyaki>
    satyaki>create table email_chld
      2    (
      3       email_grp_header          varchar2(30) not null,
      4       email_recepient             varchar2(100),
      5       craete_time                   timestamp,
      6       constraint fk_header foreign key(email_grp_header) references email_master(email_grp_header)
      7    );
    Table created.
    Elapsed: 00:00:00.09
    satyaki>
    satyaki>
    satyaki>insert into email_master values('GRP_INVENTORY',systimestamp);
    1 row created.
    Elapsed: 00:00:00.07
    satyaki>
    satyaki>
    satyaki>insert into email_master values('GRP_PURCHASE',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>commit;
    Commit complete.
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>select * from email_master;
    EMAIL_GRP_HEADER               CRAETE_TIME
    GRP_INVENTORY                  24-OCT-08 08.55.36.190000 PM
    GRP_PURCHASE                   24-OCT-08 08.55.54.481000 PM
    Elapsed: 00:00:00.18
    satyaki>
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.07
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>insert into email_chld values('GRP_INVENTORY','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>insert into email_chld values('GRP_PURCHASE','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.03
    satyaki>
    satyaki>insert into email_chld values('GRP_PURCHASE','[email protected]',systimestamp);
    1 row created.
    Elapsed: 00:00:00.11
    satyaki>commit;
    Commit complete.
    Elapsed: 00:00:00.05
    satyaki>
    satyaki>select * from email_chld;
    EMAIL_GRP_HEADER               EMAIL_RECEPIENT                                                                                      CRAETE_TIME
    GRP_INVENTORY                  [email protected]                                                                                      24-OCT-08 08.56.46.107000 PM
    GRP_INVENTORY                  [email protected]                                                                                         24-OCT-08 08.57.03.551000 PM
    GRP_INVENTORY                  [email protected]                                                                                    24-OCT-08 08.57.36.277000 PM
    GRP_PURCHASE                   [email protected]                                                                                      24-OCT-08 08.58.06.129000 PM
    GRP_PURCHASE                   [email protected]                                                                                    24-OCT-08 08.58.26.900000 PM
    Elapsed: 00:00:00.10
    satyaki>
    And then, based on the group header, you can get the list of recipients and use it dynamically inside your PL/SQL application.
    Regards.
    Satyaki De.
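    To close the loop on the sending step itself (not shown above), here is a minimal sketch, assuming UTL_MAIL is installed and SMTP_OUT_SERVER is configured; the sender address, attachment content, and file name are placeholders:
    DECLARE
      l_recipients VARCHAR2(4000);
    BEGIN
      -- build a comma-separated recipient list for one group
      FOR r IN (SELECT email_recepient
                  FROM email_chld
                 WHERE email_grp_header = 'GRP_INVENTORY') LOOP
        l_recipients := l_recipients ||
                        CASE WHEN l_recipients IS NOT NULL THEN ',' END ||
                        r.email_recepient;
      END LOOP;
      UTL_MAIL.SEND_ATTACH_VARCHAR2(
        sender       => '[email protected]',                -- placeholder address
        recipients   => l_recipients,
        subject      => 'Daily report',
        message      => 'Please find the report attached.',
        attachment   => 'col1,col2' || CHR(10) || 'a,b',   -- placeholder attachment body
        att_filename => 'daily_report.csv');
    END;
    /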

  • Error ORA-06502 When using function REPLACE in PL/SQL

    Hi,
    I have a PL/SQL procedure which gives error 'Error ORA-06502 When using function REPLACE in PL/SQL' when the string value is quite long (I noticed this with a string 9K in length)
    variable var_a is of type CLOB
    and the assignment statement where it gives the error is
    var_a := REPLACE(var_a, '^', ''',''');
    Can anyone please help!
    Thanks

    Even then that shouldn't do so:
    SQL> select overload, position, argument_name, data_type, in_out
      2  from all_arguments
      3  where package_name = 'STANDARD'
      4  and object_name = 'LPAD'
      5  order by 1,2
      6  /
    OVERLOAD   POSITION ARGUMENT_NAME                  DATA_TYPE                      IN_OUT
    1                 0                                VARCHAR2                       OUT
    1                 1 STR1                           VARCHAR2                       IN
    1                 2 LEN                            BINARY_INTEGER                 IN
    1                 3 PAD                            VARCHAR2                       IN
    2                 0                                VARCHAR2                       OUT
    2                 1 STR1                           VARCHAR2                       IN
    2                 2 LEN                            BINARY_INTEGER                 IN
    3                 0                                CLOB                           OUT
    3                 1 STR1                           CLOB                           IN
    3                 2 LEN                            NUMBER                         IN
    3                 3 PAD                            CLOB                           IN
    4                 0                                CLOB                           OUT
    4                 1 STR1                           CLOB                           IN
    4                 2 LEN                            NUMBER                         IN
    I wonder what happened?

  • Could not use color replacement tool because it only works in full color mode: Elements 11

    I have a PSD file open and I added a new layer via the copy command.  I have added a Layer Mask to the new layer and I'm trying to use the Brush Tool with black forground to cover part of the new layer.  I'm following an online tutorial step by step, but when I try to use the Brush Tool I get "could not use color replacement tool because it only works in full color mode".  The Image>Mode is set to RGB.  I'm stuck!

    First, show the tool options at the bottom of the screen (shortcut F4).
    Now look at which of the three options you can select: brush, impressionist brush, or replacement tool. I suspect you have to select the brush.
    Edit:
    The reason of the message is that when you are painting on a mask you are in greyscale mode for the mask. For instance, if you paste a coloured layer into a mask, it will be converted to greyscale.

  • Burned CDs will not sync to my iPod. I just replaced my computer with a different computer, unauthorized all previous computers, authorized the one I am now using, but only purchased content will transfer to my iPod.

    Burned CDs will not sync to my iPod. I just replaced my computer with a different computer, unauthorized all previous computers, authorized the one I am now using, but only purchased content will transfer to my iPod.

    Hi ballen13!
    I have an article here for you that may be able to address your question and provide some insight into this issue:
    Some songs in iTunes won't copy to iPod
    http://support.apple.com/kb/TS1420
    Take care, and thanks for visiting the Apple Support Communities.
    -Braden

  • Is there a solution for dynamic reports and using Denes' Export to Excel?

    Oracle 10.2.0.4.0
    Application Express 3.2.1.00.10
    Hello all!
    I am using Denes Kubicek's Export_Excel_Pkg in my application and I'm having trouble exporting reports based on report regions created using a PL/SQL function body returning SQL query. I realize this is not an Oracle supported package, but was hoping someone here could shed some light on it. When I open up the Excel file, I get an error such as: Report Values Error: ORA-06550: line 22, column 5: PL/SQL: ORA-00907: missing right parenthesis.
    I've searched the forum and already have done as others suggested by modifying the REPLACE on the v_sql variable in Export_Excel_Pkg.Get_Usable_SQL, but it did not work. My assumption is that there is an issue with the value being passed to the wwv_flow_utilities.get_binds function. I could not find documentation on this function, but I'm thinking that it cannot extract the bind variables within a PL/SQL block. The report only works when I just use SQL with bind variables; it doesn't work for PL/SQL. Nor does it work for dynamic SQL reports that use a "lexical" parameter (e.g. using WHERE &p_and_condition.) to build the WHERE clause.
    Has anyone come up with a work-around to this? I somehow need to be able to extract reports based on dynamic SQL (or PL/SQL) to Excel.
    Help is appreciated!
    This is my example of a report based on PL/SQL function:
    DECLARE
      v_sql VARCHAR2(4000);
    BEGIN
      v_sql := q'[SELECT UPPER(t1.olo_name) agency_title,
           t1.class_code,
           UPPER(t1.class_title) class_title,
           t1.pay_plan,
           t1.pay_grade_code,
           COUNT(t1.appt_fte) total_employees,
           SUM(t1.appt_fte) filled_fte,
           AVG(DECODE(t2.pay_cycle_code,
                      'UB',((t1.wage_type1_amt_for_pay * 26)/t1.appt_fte),
                      'UM',((t1.wage_type1_amt_for_pay * 12)/t1.appt_fte),
                       0)) avg_annual_rate
       FROM my_schema.table1 t1,
                my_schema.table2 t2,
                my_schema.table3 pro
      WHERE t1.pos_wk = t2.pos_wk
        AND t2.pos_rate_active_flag = 'Y'
        AND t1.ops_ind = 'N'
        AND t1.employee_type IN ('1','2')
        AND pro.ROLE_CODE = :F101_DW_ROLE
        AND pro.pos_role_orgs_active_flag = 'Y']';
      IF :P_MULTI_OLO IS NOT NULL THEN
        v_sql := v_sql || q'[ AND INSTR(':'||']' || v('P_MULTI_OLO') || q'['||':', ':'||t1.olo_code||':') > 0]';     
      END IF;
      v_sql := v_sql || q'[GROUP BY UPPER(t1.olo_name), t1.class_code, UPPER(t1.class_title), t1.pay_plan, t1.pay_grade_code ORDER BY t1.class_code ASC, avg_annual_rate]';
      RETURN v_sql;
    END;
    This is my example using a SQL statement with a lexical parameter:
    SELECT UPPER(t1.olo_name) agency_title,
           t1.class_code,
           UPPER(t1.class_title) class_title,
           t1.pay_plan,
           t1.pay_grade_code,
           COUNT(t1.appt_fte) total_employees,
           SUM(t1.appt_fte) filled_fte,
           AVG(DECODE(t2.pay_cycle_code,
                      'UB',((t1.wage_type1_amt_for_pay * 26)/t1.appt_fte),
                      'UM',((t1.wage_type1_amt_for_pay * 12)/t1.appt_fte),
                       0)) avg_annual_rate
       FROM my_schema.table1 t1,
                my_schema.table2 t2,
                my_schema.table3 pro
      WHERE t1.pos_wk = t2.pos_wk
        AND t2.pos_rate_active_flag = 'Y'
        AND t1.ops_ind = 'N'
        AND t1.employee_type IN ('1','2')
        AND pro.ROLE_CODE = :F101_DW_ROLE
        AND pro.pos_role_orgs_active_flag = 'Y'
        &P63_AND_CONDITION.
      GROUP BY UPPER(t1.olo_name),
               t1.class_code,
               UPPER(t1.class_title),
               t1.pay_plan,
               t1.pay_grade_code   
    ORDER BY t1.class_code ASC, avg_annual_rate
    The *&P63_AND_CONDITION.* value is populated based on a "Before Header" computation under Page Rendering, using the logic below. It is then used by the SQL query defined in the reports region at run time.
    DECLARE
      v_sql VARCHAR2(4000) := NULL;
    BEGIN
      v_sql := ' ';
      IF :P_MULTI_OLO IS NOT NULL THEN
        v_sql := v_sql || q'[ AND INSTR(':'||']' || v('P_MULTI_OLO') || q'['||':', ':' || t1.olo_code || ':') > 0]';     
      END IF;
      RETURN v_sql;
    END;

    Did you get an answer for this?

Maybe you are looking for

  • Error while creating the service entry sheet in PRD

    Hi All, We are in SAP R/3 4.7 Version. OF39 settings are 58 & 61 Statistical update only ! Now my case is i assigned Rs.100 Budget agaisnt one FC & Comitmetn item & created one service Purchase order Rs.80 & Make down payment against same PO Rs.80. N

  • Insert a row in ALV

    Hi , I need to insert a row in ALV output. On click of a Inser Row button a pop up should show with a parameter.Once the value in entered in the parameter enter is hit , the first column of the new row should have value put in that parameter as non e

  • The network path was not found *DNS*

    Hi all, Some background... I only have one DC which is running AD and DNS.  I have a file/share server that I want to add to the domain and along with a few workstations.  I was getting an error before this and it was 'dns name does not exist'... but

  • Problem with file.listFiles() of File class??

    Hi all friends, Iam facing with one peculiar problem,Iam using [file.listfiles()] method of File class in my program and this method of file class introduced in Java2.Now when Iam runing on Mac OS classic(8 to 9)[it is my client requirement they can'

  • Importing Menus from other Desktop Environments

    I'm on Openbox. I feel my GNOME menu is complete but I don't use GNOME. I want to know if there is away, a program or anything, to import the menu from other Desktop Environments into my OpenBox right click menu. Thank you.