Uni-directional table replication using Streams

Hi,
I recently started using Streams and configured table-level replication with it. I have configured all the required processes using the "local capture" method, and the capture and propagation processes are running fine (their status shows as fine). The apply process at the target database is running fine too. According to the alert log at the source, LogMiner is mining the redo logs. But I do not see any changes applied at the target: if I insert a row into the source table, it should appear at the target after some time, and it does not.
I am trying to find out whether anything is not satisfied at the apply level, and I am looking for some direction on the questions below. Please don't just point me to links, as I have already been through several sites.
Whatever is captured at the source is formed into an LCR. How can I see an LCR at the source database (through a query or procedure) when an insert or update is done there?
How can I see whether the LCRs are propagated to the target database or not?
If the LCRs are propagated properly, what query or object helps me find out what is in an LCR at the target site?
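For anyone who lands here with the same questions: the usual way to watch LCRs move is through the Streams dictionary and V$ views rather than by reading LCRs directly. A sketch (these are standard 10g/11g views, though the exact column lists vary a little by release):
-- At the source: how many LCRs capture has created and enqueued
SELECT capture_name, state, total_messages_captured, total_messages_enqueued
FROM v$streams_capture;
-- At the source: LCRs sitting in (or spilled from) the buffered queue
SELECT queue_schema, queue_name, num_msgs, spill_msgs
FROM v$buffered_queues;
-- At the target: how many LCRs the apply reader has dequeued
SELECT apply_name, state, total_messages_dequeued
FROM v$streams_apply_reader;
-- At the target: LCRs that failed to apply end up in the error queue
SELECT apply_name, local_transaction_id, error_message
FROM dba_apply_error;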
You will probably ask me questions about my Streams configuration; I will check whether I have done each piece properly.
Regards,
sree

Hi,
Thanks for your reply. Yes, it is 11g.
In my later research I found the following error message in the alert log at the apply site.
knllgobjinfo: MISSING Streams multi-version data dictionary!!!
knlldmm: gdbnm=STRM2.ORDB.NET
knlldmm: objn=16914
knlldmm: objv=1
knlldmm: scn=17541786
When I checked on the source site, the SCN I retrieved was:
At Capture Site:
==========
SQL> SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER FROM DUAL;
GET_SYSTEM_CHANGE_NUMBER
17550440
At Apply Site:
=========
SQL> SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER FROM DUAL;
GET_SYSTEM_CHANGE_NUMBER
17550779
When I looked at the Metalink note, I found that it might be an instantiation issue or an issue with propagation. I am trying to get help from the Oracle documents, but that looks like it might take a little more time. Can you quickly guide me?
How do I make sure that propagation is going fine? It is understood that this error occurs when the Streams data dictionary information for the specified object is not available in the apply database.
I instantiated my table at the source using DBMS_STREAMS_ADM.ADD_TABLE_RULES and added propagation rules with DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES as well. Later I stopped capture and apply, executed DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION at the source (nothing was done at the destination), and restarted the apply and capture processes. I believed that would redo the instantiation.
How do I make sure that the instantiation is done properly?
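For the record, instantiation has a destination-side half as well: the apply site must have an instantiation SCN recorded for the table, or apply ignores its LCRs. A sketch of checking and, if needed, setting it by hand (the object and global database names below are placeholders for your own):
-- At the apply site: is the table instantiated at all?
SELECT source_object_owner, source_object_name, instantiation_scn
FROM dba_apply_instantiated_objects;
-- If no row comes back, take the current SCN at the source...
SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER FROM dual;
-- ...and record it at the apply site so apply accepts changes from that SCN on
BEGIN
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name   => 'SCOTT.MYTAB',      -- placeholder
    source_database_name => 'STRM1.ORDB.NET',   -- placeholder global name
    instantiation_scn    => 17550440);          -- the SCN taken at the source
END;
/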
Thank you for your patience.
Regards
Sree

Similar Messages

  • Replicate to a different schema using streams

    All,
I'm performing replication using Streams; I have read the docs and the step-by-step notes on Metalink and got the replication working successfully.
But now I want to do the same across different schemas, i.e.
    source DB: Schema= A Table=test
    Destination DB: Schema= B Table=test
On both databases the schemas are basically the same, but with different names.
What changes or additional steps do I need to perform in order to complete the replication?
Thanks in advance!

Here is the procedure:
1) Follow the instructions in the documentation to create the Streams administrator user on both source and target (e.g. streams_adm).
    2) Connect to the streams administrator user at the TARGET (10.2) database
--Create a queue
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_name  => 'APPLY_Q',
    queue_table => 'APPLY_Q_QT');
END;
/
--Create a rule for the apply process
BEGIN
  DBMS_RULE_ADM.CREATE_RULE(
    rule_name => 'APPLY_RULE',
    condition => ':dml.get_object_owner() = ''<put your source schema owner>''');
END;
/
BEGIN
  DBMS_RULE_ADM.CREATE_RULE_SET(
    rule_set_name      => 'streams_adm.apply_ruleset',
    evaluation_context => 'SYS.STREAMS$_EVALUATION_CONTEXT');
  DBMS_RULE_ADM.ADD_RULE(
    rule_set_name => 'APPLY_RULESET',
    rule_name     => 'APPLY_RULE');
END;
/
--Set the declarative transformation that renames the schema
BEGIN
  DBMS_STREAMS_ADM.RENAME_SCHEMA(
    rule_name        => 'APPLY_RULE',
    from_schema_name => '<put the source schema>',
    to_schema_name   => '<put the target schema>');
END;
/
--Create the apply process
BEGIN
  DBMS_APPLY_ADM.CREATE_APPLY(
    apply_name      => 'APPLY_CHANGES',
    rule_set_name   => 'APPLY_RULESET',
    queue_name      => 'APPLY_Q',
    source_database => '<source database name>',
    apply_captured  => TRUE);
END;
/
BEGIN
  DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'APPLY_CHANGES',
    parameter  => 'DISABLE_ON_ERROR',
    value      => 'N');
END;
/
--Start apply
BEGIN
  DBMS_APPLY_ADM.START_APPLY('APPLY_CHANGES');
END;
/
3) Connect as the Streams administrator on the SOURCE (9.2) database:
--Create a database link to the TARGET database, connecting as streams_adm. The name of the dblink must match the target database's global name, since you have to set global_names=true in the init files of both databases.
--Create a queue
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_name  => 'CAPTURE_Q',
    queue_table => 'CAPTURE_Q_QT');
END;
/
--Create the rule and rule set
--*** Replace the schema owner and the list of tables you'd like to capture changes from ***
BEGIN
  DBMS_RULE_ADM.CREATE_RULE(
    rule_name => 'FILTER_TABLES',
    condition => ':dml.get_object_owner() = ''<put your schema owner>'' and :dml.get_object_name() in (''TAB1'',''TAB2'')');
END;
/
BEGIN
  DBMS_RULE_ADM.CREATE_RULE_SET(
    rule_set_name      => 'streams_adm.capture_ruleset',
    evaluation_context => 'SYS.STREAMS$_EVALUATION_CONTEXT');
  DBMS_RULE_ADM.ADD_RULE(
    rule_set_name => 'CAPTURE_RULESET',
    rule_name     => 'FILTER_TABLES');
END;
/
--Prepare the schema for instantiation
--*** Replace the schema name with your schema name ***
BEGIN
  DBMS_CAPTURE_ADM.PREPARE_SCHEMA_INSTANTIATION(
    schema_name => '<put your schema name here>');
END;
/
--Create the propagation
BEGIN
  DBMS_PROPAGATION_ADM.CREATE_PROPAGATION(
    propagation_name   => 'streams_propagation',
    source_queue       => 'streams_adm.capture_q',
    destination_queue  => 'streams_adm.apply_q',
    destination_dblink => '<database link>',
    rule_set_name      => 'streams_adm.capture_ruleset');
END;
/
--Create the capture process
BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name    => 'streams_adm.capture_q',
    capture_name  => 'capture_tables',
    rule_set_name => 'streams_adm.capture_ruleset',
    first_scn     => NULL);
END;
/
--Start capture
BEGIN
  DBMS_CAPTURE_ADM.START_CAPTURE('capture_tables');
END;
/
    Let me know if you have any questions
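One quick sanity check after running the above (a sketch; DBA_STREAMS_TRANSFORMATIONS is a standard view in 10.2 and later, run it as the Streams admin at the target): confirm the declarative rename transformation really got attached to the apply rule.
SELECT rule_owner, rule_name, transform_type, from_schema_name, to_schema_name
FROM dba_streams_transformations;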

  • Replicate using streams to a different schema

    All,
I'm performing replication using Streams; I have read the docs and the step-by-step notes on Metalink and got the replication working successfully.
But now I want to do the same across different schemas, i.e.
    source DB: Schema= A Table=test
    Destination DB: Schema= B Table=test
On both databases the schemas are basically the same, but with different names.
What changes or additional steps do I need to perform in order to complete the replication?
Thanks in advance!

There are demos of both Change Data Capture (which is better suited to what you appear to be doing) and Streams in Morgan's Library at www.psoug.org. Some of them duplicate exactly what you describe.
PS: There is a specific Streams forum. In the future, please post Streams-related requests there. Thank you.

  • Partial replication of tables using streams

I need to replicate a subset of a master table (a few columns) to another database; the destination tables are read-only. But this needs to happen in real time. I was using Oracle Streams to replicate the entire table. To make it partial, can we have something like Streams replicating materialized views from the master? Any clue on how this can be achieved with good performance?

The data needs to be replicated within a few seconds; if we fast-refresh materialized views on all the tables every few seconds, won't that have a performance impact on the source database? Since some tables need to be replicated entirely in real time, we are using Streams for those tables. Now, to get a partial one, I am trying to create a rule set with a combination of the DELETE_COLUMN, RENAME_COLUMN, etc. declarative transformations in the DBMS_STREAMS_ADM package.
Any links that could help with this construction?
Do you still feel that a materialized view would be a better alternative for replicating a few columns of a lot of tables in real time?
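For reference, this is the shape of the declarative column-subsetting call (a sketch; DBMS_STREAMS_ADM.DELETE_COLUMN is the real procedure in 10.2+, but the rule, table, and column names below are placeholders):
BEGIN
  DBMS_STREAMS_ADM.DELETE_COLUMN(
    rule_name   => 'STRMADMIN.ORDERS_RULE',  -- placeholder capture rule
    table_name  => 'SCOTT.ORDERS',           -- placeholder table
    column_name => 'CREDIT_CARD_NO',         -- column to strip from every LCR
    value_type  => '*',                      -- remove both old and new values
    step_number => 0,
    operation   => 'ADD');
END;
/
Attaching it to the capture rule keeps the unwanted columns out of the LCRs altogether, which is cheaper than filtering them at apply time.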

  • Can we capture changes made to the objects other than tables using streams

    Hello All,
I have set up schema-level Streams replication using a local capture process. I can capture all the DML changes on tables but have some issues capturing DDL. Even though Streams is used for sharing data between (or within) databases, I was wondering if we can replicate changes made to objects like views, procedures, functions, and triggers at the source database. I am not able to replicate changes made to views in my setup.
Also, when I run "select source_database, source_object_type, instantiation_scn from dba_apply_instantiated_objects", under the SOURCE_OBJECT_TYPE column I just see TABLE in all the rows selected.
    Thanks,
    Sunny boy

    Hello
This could be a problem with the rules configured for your capture, propagation, or apply process, or a problem with your instantiation.
You can replicate functions, views, procedures, triggers, etc. using Streams schema-level replication or by configuring the rules accordingly.
Please note that objects like functions, views, procedures, and triggers will not appear in the DBA_APPLY_INSTANTIATED_OBJECTS view. The reason is that when you do a schema-level instantiation, these objects are covered only by the INSTANTIATION_SCN in DBA_APPLY_INSTANTIATED_SCHEMAS. At the same time, tables get recursively instantiated, so each table does get an entry in DBA_APPLY_INSTANTIATED_OBJECTS.
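As a sketch, schema-level rules that include DDL can be created in a single call; this is the kind of call the rule conditions below came from (the names here are placeholders):
BEGIN
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name  => 'TEST',                 -- placeholder schema
    streams_type => 'CAPTURE',
    streams_name => 'STREAMS_CAPTURE',      -- placeholder capture name
    queue_name   => 'STRMADMIN.CAPTURE_Q',  -- placeholder queue
    include_dml  => TRUE,
    include_ddl  => TRUE);                  -- include_ddl is what picks up views etc.
END;
/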
It works fine for me. Please see the output below from my database (10.2.0.3):
On the capture site:
    SQL> connect strmadmin/strmadmin
    Connected.
    SQL> select capture_name,rule_set_name,status from dba_capture;
    CAPTURE_NAME RULE_SET_NAME STATUS
    STREAMS_CAPTURE RULESET$_33 ENABLED
    SQL> select rule_name from dba_rule_set_rules where rule_set_name='RULESET$_33';
    RULE_NAME
    TEST41
    TEST40
    SQL> set long 100000
    SQL> select rule_condition from dba_rules where rule_name='TEST41';
    RULE_CONDITION
((:ddl.get_object_owner() = 'TEST' or :ddl.get_base_table_owner() = 'TEST') and :ddl.is_null_tag() = 'Y' and :ddl.get_source_database_name() = 'SOURCE.WORLD')
    SQL> select rule_condition from dba_rules where rule_name='TEST40';
    RULE_CONDITION
((:dml.get_object_owner() = 'TEST') and :dml.is_null_tag() = 'Y' and :dml.get_source_database_name() = 'SOURCE.WORLD')
    SQL> select * from global_name;
    GLOBAL_NAME
    SOURCE.WORLD
    SQL> conn test/test
    Connected.
    SQL> select object_name,object_type,status from user_objects;
    OBJECT_NAME OBJECT_TYPE STATUS
    TEST_NEW_TABLE TABLE VALID
    TEST_VIEW VIEW VALID
    PRC1 PROCEDURE VALID
    TRG1 TRIGGER VALID
    FUN1 FUNCTION VALID
    5 rows selected.
On the apply site:
    SQL> connect strmadmin/strmadmin
    Connected.
    SQL> col SOURCE_DATABASE for a22
    SQL> select source_database,source_object_owner,source_object_name,source_object_type,instantiation_scn
    2 from dba_apply_instantiated_objects;
    SOURCE_DATABASE SOURCE_OBJ SOURCE_OBJECT_NAME SOURCE_OBJE INSTANTIATION_SCN
    SOURCE.WORLD TEST TEST_NEW_TABLE TABLE 9886497863438
    SQL> select SOURCE_DATABASE,SOURCE_SCHEMA,INSTANTIATION_SCN from
    2 dba_apply_instantiated_schemas;
    SOURCE_DATABASE SOURCE_SCHEMA INSTANTIATION_SCN
    SOURCE.WORLD TEST 9886497863438
    SQL> select * from global_name;
    GLOBAL_NAME
    TARGET.WORLD
    SQL> conn test/test
    Connected.
    SQL> select object_name,object_type,status from user_objects;
    OBJECT_NAME OBJECT_TYPE STATUS
    TEST_VIEW VIEW VALID
    PRC1 PROCEDURE VALID
    TRG1 TRIGGER VALID
    FUN1 FUNCTION VALID
    TEST_NEW_TABLE TABLE VALID
    5 rows selected.
These functions, views, procedures, and triggers were created on the source and got replicated automatically to the target site TARGET.WORLD. And note that none of these objects appear in the DBA_APPLY_INSTANTIATED_OBJECTS view.
I used the rules given above for capture. For propagation I don't have a rule set at all, and for apply I have the same rules as the capture rules.
    Please verify your environment and let me know if you need further help.
    Thanks,
    Rijesh

  • How to populate the table using streaming data

    Dear folks
I'm about to create a MIDlet application in which I need to populate a table with current stock exchange values. Since these values change dynamically, I planned to use streaming. Since I'm new to streaming, could someone please guide me down the right path? I'm using a Canvas application. I appreciate all the replies.

That's fine and funny...
Let me tell you something: in the GCF, everything you fetch through GPRS is bound to come in as an InputStream, and whatever request you send is passed as an OutputStream. So if you have worked with the GCF, you have already worked with streaming, haven't you?
    SD

• Updating the ADRT table without using a direct UPDATE statement

Hi,
Can anyone guide me on how to update the REMARK field in the ADRT table without using a direct UPDATE statement? It would be helpful if someone could tell me the BAPI or function module to use, with some sample code.

Hi,
Transaction SZA0 (Business Address Services, w/o dialog) is the relevant area. These four function modules update Business Address Services data:
ADDR_PERSONAL_UPDATE
ADDR_PERSON_UPDATE
ADDR_PERS_COMP_UPDATE
ADDR_UPDATE
Check whether any of these are helpful for your case. Reward if useful.

  • Can I stream things directly from the time capsule to my AppleTv without using my computer?

I have an older MacBook and I'm worried about storage, so I want to move things to a Time Capsule. I was just wondering if I can stream things directly from the Time Capsule to the Apple TV without having to store them in iTunes on my computer and play them from there.

You cannot stream from a TC; it is not a media device. You can store movie files on the TC, but you still need to run the computer to play them.
Depending on how you connect, this can cause dramatic slowdowns.
A WD TV Live can play raw movie files from a TC because it is a full media player, not just a media streamer, which is all the ATV is.

• How can I use Streams?

I have one source database (RAC) and four destination databases.
Currently I use materialized views to replicate tables.
I want to change to Oracle Streams.
How can I do that?
Please advise me.

    Surachart,
    A basic setup would consist of a capture process and four propagation processes on the source database, and four apply processes at the targets.
Uni-directional Streams has fewer moving parts and is thus easier to manage than bi-directional.
You should experiment by replicating a dummy table containing a few columns until you are clear about the roles of the various processes and the commands to manipulate them.
Check out the "WeDoStreams" blog for some easy-to-follow tutorials on Streams setup, at various levels of complexity.
    Rgds
    Mark Teehan
    Singapore

  • How do we use "streams" functionality in Oracle 8i?

    Hello Gurus,
Oracle Streams is only available as of Oracle 9i. I need to replicate data from an Oracle 8i database to another database. The tables don't match exactly, and I want a solution that can mimic Streams. How can that be done? Is there any sample code available to do this type of streaming between Oracle 8i databases?
    Thanks.

You can use materialized views (which used to be called snapshots) or Advanced Replication. But I think AR is an extra option not included in Enterprise Edition.
You could also develop trigger-based replication using database links, but I advise you not to go that way. Don't reinvent the wheel, and think of what happens when one db goes down: you don't want the other to sink with it...
You should give snapshots a try. Look for fast snapshot refresh: it uses an incremental log, so it is a lot faster in most cases.
You need at least 9i R2 to use Streams, but you're better off with 10.2.0.3.
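A minimal fast-refresh setup, for reference (8i-era snapshot syntax; the table, dblink, and one-minute interval below are placeholders):
-- On the master: record row-level changes so fast refresh can work
CREATE SNAPSHOT LOG ON scott.emp;
-- On the replica: pull only the changed rows over a dblink every minute
CREATE SNAPSHOT emp_copy
  REFRESH FAST
  START WITH SYSDATE NEXT SYSDATE + 1/1440
  AS SELECT * FROM scott.emp@master_db;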

  • Problem in Update using STREAMS in Ora 10g EE

    Hi
I am using Oracle 10g Enterprise Edition to replicate data after a failover occurs in the DB.
I followed the steps in the URL below:
    http://blogs.ittoolbox.com/oracle/guide/archives/oracle-streams-configuration-change-data-capture-13501
The replication is achieved using Streams.
INSERT and DELETE operations are working fine without any issue, but UPDATE operations are not replicated immediately.
I did an UPDATE and issued a COMMIT: no replication. I did another UPDATE and issued a COMMIT, and only then did it update the Streams table.
Why does every UPDATE need to happen twice? How do I solve this?
Alternatively, is there any other URL that gives working examples of Streams in 10g?
Any help regarding this is highly appreciated.

Thanks for your reply.
There is no specific reason to use Streams.
I have to replicate the data when my database comes back up after downtime.
We already discussed this topic in:
Data Replication in after downtime
Instead of using triggers or procedures, I have to use one of the technologies available in Oracle 10g for data replication.
Can you give me more info, such as the advantages of using Data Guard or other technologies in Oracle 10g EE?

  • Oracle Replication and SQL*Load direct Path

We are setting up Oracle replication and have a few tables which are loaded using SQL*Loader direct path. I have the following questions:
1. Can multi-master replication replicate direct-path SQL*Loaded data?
2. If the answer to the above question is no, should we set up a new snapshot replication group? All the other tables are in the multi-master replication.
3. Another question is on the number of replication groups: how many of these should we create? We have a total of about 500 tables and the database size is about 450 GB. We plan to replicate all the tables, and we want the refresh interval to be one minute for all of them. Does having more or fewer replication groups help?
Thanks for your help

1. Can multi-master replication replicate direct-path SQL*Loaded data?
Yes. I don't think it should matter how the table is getting updated.
2. If the answer to the above question is no, should we set up a new snapshot replication group?
See above.
3. Does having more or fewer replication groups help?
I believe what Oracle recommends is that you split your tables into replication groups transactionally. I personally have 6 replication groups supporting 700+ tables. They are broken up more by business function than by number and size.
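For reference, the shape of a replication-group definition (a sketch using DBMS_REPCAT; the group, schema, and table names are placeholders):
BEGIN
  -- One master replication group per business function
  DBMS_REPCAT.CREATE_MASTER_REPGROUP(gname => 'SALES_RG');
  -- Add a table to the group
  DBMS_REPCAT.CREATE_MASTER_REPOBJECT(
    gname => 'SALES_RG',
    type  => 'TABLE',
    sname => 'SCOTT',
    oname => 'ORDERS');
END;
/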

  • Need table replication help

Hello. I'm trying to create a simple table replication (Windows servers, 11gR2), unfortunately with no luck. When I run DBMS_STREAMS_ADM.MAINTAIN_TABLES(...), everything goes without errors, but I don't see any changes in the destination table. Below are all the steps.
    The first DB:
    ----sys----
ALTER SYSTEM SET job_queue_processes=2;
ALTER SYSTEM SET aq_tm_processes=1;
ALTER SYSTEM SET parallel_max_servers=10;
ALTER SYSTEM SET streams_pool_size=200M;
ALTER SYSTEM SET undo_retention=900;
ALTER SYSTEM SET job_queue_interval=1;
    create user STRMADMIN identified by STRMADMIN;
    ALTER USER STRMADMIN DEFAULT TABLESPACE USERS
    TEMPORARY TABLESPACE TEMP
    QUOTA UNLIMITED ON USERS;
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
    GRANT EXECUTE ON DBMS_AQADM TO strmadmin;
    Set all databases in archivelog mode.
    GRANT EXECUTE ON DBMS_APPLY_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_CAPTURE_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_PROPAGATION_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_STREAMS TO strmadmin;
    GRANT EXECUTE ON DBMS_STREAMS_ADM TO strmadmin;
    execute DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE('STRMADMIN');
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_SET_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    CREATE PUBLIC DATABASE LINK LOGM_LINK_STREAM2
    CONNECT TO STRMADMIN IDENTIFIED BY STRMADMIN
    USING '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=bio-db1)(PORT=1521)))(CONNECT_DATA=(SID=BIO)(SERVER=DEDICATED)))';
    CREATE DIRECTORY db_files_directory AS 'C:\oracle\oradata\db_files';
    The second DB:
Same steps as in the first DB; only the DB link is different:
    CREATE PUBLIC DATABASE LINK LOGM_LINK_STREAM1
    CONNECT TO STRMADMIN IDENTIFIED BY STRMADMIN
    USING '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=bio-db)(PORT=1521)))(CONNECT_DATA=(SID=BIO)(SERVER=DEDICATED)))';
And then connect as strmadmin to the first DB:
    ----strmadmin------
DECLARE
  tables DBMS_UTILITY.UNCL_ARRAY;
BEGIN
  tables(1) := 'BIODATA.AUDIT_DATA';
  DBMS_STREAMS_ADM.MAINTAIN_TABLES(
    table_names                  => tables,
    source_directory_object      => NULL,
    destination_directory_object => NULL,
    source_database              => 'LOGM_LINK_STREAM1',
    destination_database         => 'LOGM_LINK_STREAM2',
    perform_actions              => TRUE,
    script_name                  => 'configure_rep.sql',
    script_directory_object      => 'db_files_directory',
    bi_directional               => FALSE,
    include_ddl                  => TRUE,
    instantiation                => DBMS_STREAMS_ADM.INSTANTIATION_TABLE_NETWORK);
END;
    anonymous block completed

Below is the generated script. I added ALTER SYSTEM SET global_names=true. The source DB SID is BIO1, the destination is BIO.
    SET ECHO ON
    SET VERIFY OFF
    WHENEVER SQLERROR EXIT SQL.SQLCODE;
    -- get TNSNAME and streams admin user details for both the databases
    PROMPT
    PROMPT 'Enter TNS Name of site 1 as parameter 1:'
    DEFINE db1 = &1
    PROMPT
    PROMPT 'Enter streams admin username for site 1 as parameter 2:'
    DEFINE strm_adm_db1 = &2
    PROMPT
    PROMPT 'Enter streams admin password for site 1 as parameter 3:'
    DEFINE strm_adm_pwd_db1 = &3
    PROMPT
    PROMPT 'Enter TNS Name of site 2 as parameter 4:'
    DEFINE db2 = &4
    PROMPT
    PROMPT 'Enter streams admin username for site 2 as parameter 5:'
    DEFINE strm_adm_db2 = &5
    PROMPT
    PROMPT 'Enter streams admin password for site 2 as parameter 6:'
    DEFINE strm_adm_pwd_db2 = &6
    -- connect as streams administrator to site 1
    PROMPT Connecting as streams administrator to site 1
    CONNECT &strm_adm_db1/&strm_adm_pwd_db1@&db1
    -- Set up queue "STRMADMIN"."BIO1$CAPQ"
    BEGIN
    dbms_streams_adm.set_up_queue(
    queue_table => '"STRMADMIN"."BIO1$CAPQT"',
    storage_clause => NULL,
    queue_name => '"STRMADMIN"."BIO1$CAPQ"',
    queue_user => '');
    END;
    -- PROPAGATE changes for table "BIODATA"."AUDIT_DATA1"
    DECLARE
    version_num NUMBER := 0;
    release_num NUMBER := 0;
    pos NUMBER;
    initpos NUMBER;
    q2q BOOLEAN;
    stmt VARCHAR2(100);
    ver VARCHAR2(30);
    compat VARCHAR2(30);
    BEGIN
    BEGIN
    stmt := 'BEGIN dbms_utility.db_version@BIO(:ver, :compat); END;';
    EXECUTE IMMEDIATE stmt USING OUT ver, OUT compat;
    -- Extract version number
    initpos := 1;
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    version_num := TO_NUMBER(SUBSTR(compat, initpos, pos - initpos));
    initpos := pos + 1;
    -- Extract release number
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    release_num := TO_NUMBER(SUBSTR(compat, initpos,
    pos - initpos));
    initpos := pos + 1;
    ELSE
    release_num := TO_NUMBER(SUBSTR(compat, initpos));
    END IF;
    ELSE
    version_num := TO_NUMBER(SUBSTR(compat, initpos));
    END IF;
    -- use q2q propagation if compatibility >= 10.2
    IF version_num > 10 OR
    (version_num = 10 AND release_num >=2) THEN
    q2q := TRUE;
    ELSE
    q2q := FALSE;
    END IF;
    EXCEPTION WHEN OTHERS THEN
    q2q := FALSE;
    END;
    dbms_streams_adm.add_table_propagation_rules(
    table_name => '"BIODATA"."AUDIT_DATA1"',
    streams_name => '',
    source_queue_name => '"STRMADMIN"."BIO1$CAPQ"',
    destination_queue_name => '"STRMADMIN"."BIO1$APPQ"@BIO',
    include_dml => TRUE,
    include_ddl => TRUE,
    include_tagged_lcr => TRUE,
    source_database => 'BIO1',
    inclusion_rule => TRUE,
    and_condition => NULL,
    queue_to_queue => q2q);
    END;
    -- Disable propagation. Enable after destination has been setup
    DECLARE
    q2q VARCHAR2(10);
    destn_q VARCHAR2(65);
    BEGIN
    SELECT queue_to_queue INTO q2q
    FROM dba_propagation
    WHERE source_queue_owner = 'STRMADMIN' AND
    source_queue_name = 'BIO1$CAPQ' AND
    destination_queue_owner = 'STRMADMIN' AND
    destination_queue_name = 'BIO1$APPQ' AND
    destination_dblink = 'BIO';
    IF q2q = 'TRUE' THEN
    destn_q := '"STRMADMIN"."BIO1$APPQ"';
    ELSE
    destn_q := NULL;
    END IF;
    dbms_aqadm.disable_propagation_schedule(
    queue_name => '"STRMADMIN"."BIO1$CAPQ"',
    destination => 'BIO',
    destination_queue => destn_q);
    EXCEPTION WHEN OTHERS THEN
    IF sqlcode = -24065 THEN NULL; -- propagation already disabled
    ELSE RAISE;
    END IF;
    END;
    -- CAPTURE changes for table "BIODATA"."AUDIT_DATA1"
    DECLARE
    compat VARCHAR2(512);
    initpos NUMBER;
    pos NUMBER;
    version_num NUMBER;
    release_num NUMBER;
    compat_func VARCHAR2(65);
    get_compatible VARCHAR2(4000);
    BEGIN
    SELECT value INTO compat
    FROM v$parameter
    WHERE name = 'compatible';
    -- Extract version number
    initpos := 1;
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    version_num := TO_NUMBER(SUBSTR(compat, initpos, pos - initpos));
    initpos := pos + 1;
    -- Extract release number
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    release_num := TO_NUMBER(SUBSTR(compat, initpos, pos - initpos));
    initpos := pos + 1;
    ELSE
    release_num := TO_NUMBER(SUBSTR(compat, initpos));
    END IF;
    END IF;
    IF version_num < 10 THEN
    compat_func := 'dbms_streams.compatible_9_2';
    ELSIF version_num = 10 THEN
    IF release_num < 2 THEN
    compat_func := 'dbms_streams.compatible_10_1';
    ELSE
    compat_func := 'dbms_streams.compatible_10_2';
    END IF;
    ELSIF version_num = 11 THEN
    IF release_num < 2 THEN
    compat_func := 'dbms_streams.compatible_11_1';
    ELSE
    compat_func := 'dbms_streams.compatible_11_2';
    END IF;
    ELSE
    compat_func := 'dbms_streams.compatible_11_2';
    END IF;
    get_compatible := ':lcr.get_compatible() <= '||compat_func;
    dbms_streams_adm.add_table_rules(
    table_name => '"BIODATA"."AUDIT_DATA1"',
    streams_type => 'CAPTURE',
    streams_name => '"BIO1$CAP"',
    queue_name => '"STRMADMIN"."BIO1$CAPQ"',
    include_dml => TRUE,
    include_ddl => TRUE,
    include_tagged_lcr => TRUE,
    source_database => 'BIO1',
    inclusion_rule => TRUE,
    and_condition => get_compatible);
    END;
    -- connect as streams administrator to site 2
    PROMPT Connecting as streams administrator to site 2
    CONNECT &strm_adm_db2/&strm_adm_pwd_db2@&db2
    -- Datapump TABLE MODE IMPORT (NETWORK)
    DECLARE
    h1 NUMBER := NULL; -- data pump job handle
    name_expr_list VARCHAR2(32767); -- for metadata_filter
    object_name dbms_utility.uncl_array; -- object names
    cnt NUMBER;
    object_owner VARCHAR2(30); -- owner
    job_state VARCHAR2(30); -- job state
    status ku$_Status; -- data pump status
    job_not_exist exception;
    pragma exception_init(job_not_exist, -31626);
    local_compat v$parameter.value%TYPE;
    remote_compat v$parameter.value%TYPE;
    min_compat v$parameter.value%TYPE;
    ind NUMBER;
    le ku$_LogEntry; -- For WIP and error messages
    js ku$_JobStatus; -- The job status from get_status
    jd ku$_JobDesc;
    -- The job description from get_status
    BEGIN
    object_name(1) := 'AUDIT_DATA1';
    object_owner := 'BIODATA';
    FOR idx IN 1..1 LOOP
    SELECT COUNT(1) INTO cnt FROM all_tables@BIO
    WHERE owner = object_owner AND table_name = object_name(idx);
    -- table does not exist locally, need instantiation
    IF cnt = 0 THEN
    IF name_expr_list IS NULL THEN
    name_expr_list := '(';
    ELSE
    name_expr_list := name_expr_list ||',';
    END IF;
    name_expr_list := name_expr_list||''''||object_name(idx)||'''';
    END IF;
    END LOOP;
    IF name_expr_list IS NOT NULL THEN
    name_expr_list := name_expr_list || ')';
    ELSE
    COMMIT;
    RETURN;
    END IF;
    select value into local_compat from v$parameter
    where name = 'compatible';
    select value into remote_compat from v$parameter@BIO
    where name = 'compatible';
    IF TO_NUMBER(REPLACE(local_compat, '.', '0')) > TO_NUMBER(REPLACE(remote_compat, '.', '0')) THEN
    min_compat := remote_compat;
    ELSE
    min_compat := local_compat;
    END IF;
    h1 := dbms_datapump.open(operation=>'IMPORT',job_mode=>'TABLE',
    remote_link=>'BIO1',
    job_name=>NULL, version=> min_compat);
    dbms_datapump.metadata_filter(
    handle=>h1,
    name=>'NAME_EXPR',
    value=>'IN'||name_expr_list);
    dbms_datapump.metadata_filter(
    handle=>h1,
    name=>'SCHEMA_EXPR',
    value=>'IN'||'(''BIODATA'')');
    dbms_datapump.start_job(h1);
    job_state := 'UNDEFINED';
    BEGIN
    WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
    status := dbms_datapump.get_status(
    handle => h1,
    mask => dbms_datapump.ku$_status_job_error +
    dbms_datapump.ku$_status_job_status +
    dbms_datapump.ku$_status_wip,
    timeout => -1);
    job_state := status.job_status.state;
    dbms_lock.sleep(10);
    END LOOP;
    EXCEPTION WHEN job_not_exist THEN
    dbms_output.put_line('job finished');
    END;
    COMMIT;
    EXCEPTION WHEN OTHERS THEN
    dbms_output.put_line('Exception, sql code = ' || SQLCODE || ', error message: ' || SQLERRM);
    dbms_output.put_line( dbms_utility.format_error_stack);
    dbms_output.put_line( dbms_utility.format_error_backtrace);
    IF h1 IS NOT NULL THEN
    BEGIN
    dbms_datapump.get_status(
    handle => h1,
    mask => dbms_datapump.ku$_status_job_error +
    dbms_datapump.ku$_status_job_status +
    dbms_datapump.ku$_status_wip,
    timeout => -1,
    job_state => job_state,
    status => status );
    dbms_output.put_line('Data pump job status: ' || job_state);
    le := status.wip;
    IF le IS NULL THEN
    dbms_output.put_line('WIP info is NULL');
    ELSE
    dbms_output.put_line('WIP info:');
    ind := le.FIRST;
    WHILE ind IS NOT NULL LOOP
    dbms_output.put_line(le(ind).LogText);
    ind := le.NEXT(ind);
    END LOOP;
    END IF;
    le := status.error;
    IF le IS NULL THEN
    dbms_output.put_line('Error info is NULL');
    ELSE
    dbms_output.put_line('Error info:');
    ind := le.FIRST;
    WHILE ind IS NOT NULL LOOP
    dbms_output.put_line(le(ind).LogText);
    ind := le.NEXT(ind);
    END LOOP;
    END IF;
    EXCEPTION
    WHEN job_not_exist THEN
    dbms_output.put_line('Data pump job finished');
    WHEN OTHERS THEN RAISE;
    END;
    END IF;
    ROLLBACK;
    RAISE;
    END;
    -- Set up queue "STRMADMIN"."BIO1$APPQ"
    BEGIN
    dbms_streams_adm.set_up_queue(
    queue_table => '"STRMADMIN"."BIO1$APPQT"',
    storage_clause => NULL,
    queue_name => '"STRMADMIN"."BIO1$APPQ"',
    queue_user => '');
    END;
    -- APPLY changes for table "BIODATA"."AUDIT_DATA1"
    DECLARE
    compat VARCHAR2(512);
    initpos NUMBER;
    pos NUMBER;
    version_num NUMBER;
    release_num NUMBER;
    compat_func VARCHAR2(65);
    get_compatible VARCHAR2(4000);
    BEGIN
    SELECT value INTO compat
    FROM v$parameter
    WHERE name = 'compatible';
    -- Extract version number
    initpos := 1;
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    version_num := TO_NUMBER(SUBSTR(compat, initpos, pos - initpos));
    initpos := pos + 1;
    -- Extract release number
    pos := INSTR(compat, '.', initpos, 1);
    IF pos > 0 THEN
    release_num := TO_NUMBER(SUBSTR(compat, initpos, pos - initpos));
    initpos := pos + 1;
    ELSE
    release_num := TO_NUMBER(SUBSTR(compat, initpos));
    END IF;
    END IF;
    IF version_num < 10 THEN
    compat_func := 'dbms_streams.compatible_9_2';
    ELSIF version_num = 10 THEN
    IF release_num < 2 THEN
    compat_func := 'dbms_streams.compatible_10_1';
    ELSE
    compat_func := 'dbms_streams.compatible_10_2';
    END IF;
    ELSIF version_num = 11 THEN
    IF release_num < 2 THEN
    compat_func := 'dbms_streams.compatible_11_1';
    ELSE
    compat_func := 'dbms_streams.compatible_11_2';
    END IF;
    ELSE
    compat_func := 'dbms_streams.compatible_11_2';
    END IF;
    get_compatible := ':lcr.get_compatible() <= '||compat_func;
    dbms_streams_adm.add_table_rules(
    table_name => '"BIODATA"."AUDIT_DATA1"',
    streams_type => 'APPLY',
    streams_name => '',
    queue_name => '"STRMADMIN"."BIO1$APPQ"',
    include_dml => TRUE,
    include_ddl => TRUE,
    include_tagged_lcr => TRUE,
    source_database => 'BIO1',
    inclusion_rule => TRUE,
    and_condition => get_compatible);
    END;
    -- Get tag value to be used for Apply
    DECLARE
    found BINARY_INTEGER := 0;
    tag_num NUMBER;
    apply_nm VARCHAR2(30);
    apply_nm_dqt VARCHAR2(32);
    BEGIN
    SELECT apply_name INTO apply_nm
    FROM dba_apply_progress
    WHERE source_database = 'BIO1';
    apply_nm_dqt := '"' || apply_nm || '"';
    -- Use the apply object id as the tag
    SELECT o.object_id INTO tag_num
    FROM dba_objects o
    WHERE o.object_name= apply_nm AND
    o.object_type='APPLY';
    LOOP
    BEGIN
    found := 0;
    SELECT 1 INTO found FROM dba_apply
    WHERE apply_name != apply_nm AND
    apply_tag = hextoraw(tag_num);
    EXCEPTION WHEN no_data_found THEN
    EXIT;
    END;
    EXIT WHEN (found = 0);
    tag_num := tag_num + 1;
    END LOOP;
    -- alter apply
    dbms_apply_adm.alter_apply(
    apply_name => apply_nm_dqt,
    apply_tag => hextoraw(tag_num));
    END;
    -- Start apply process applying changes from BIO1
    DECLARE
    apply_nm VARCHAR2(32);
    apply_nm_dqt VARCHAR2(32);
    BEGIN
    SELECT apply_name INTO apply_nm
    FROM dba_apply_progress
    WHERE source_database = 'BIO1';
    apply_nm_dqt := '"' || apply_nm || '"';
    dbms_apply_adm.start_apply(
    apply_name => apply_nm_dqt);
    EXCEPTION WHEN OTHERS THEN
    IF sqlcode = -26666 THEN NULL; -- APPLY process already running
    ELSE RAISE;
    END IF;
    END;
    -- connect as streams administrator to site 1
    PROMPT Connecting as streams administrator to site 1
    CONNECT &strm_adm_db1/&strm_adm_pwd_db1@&db1
    -- Enable propagation schedule for "STRMADMIN"."BIO1$CAPQ"
    -- to BIO
    DECLARE
    q2q VARCHAR2(10);
    destn_q VARCHAR2(65);
    BEGIN
    SELECT queue_to_queue INTO q2q
    FROM dba_propagation
    WHERE source_queue_owner = 'STRMADMIN' AND
    source_queue_name = 'BIO1$CAPQ' AND
    destination_queue_owner = 'STRMADMIN' AND
    destination_queue_name = 'BIO1$APPQ' AND
    destination_dblink = 'BIO';
    IF q2q = 'TRUE' THEN
    destn_q := '"STRMADMIN"."BIO1$APPQ"';
    ELSE
    destn_q := NULL;
    END IF;
    dbms_aqadm.enable_propagation_schedule(
    queue_name => '"STRMADMIN"."BIO1$CAPQ"',
    destination => 'BIO',
    destination_queue => destn_q);
    EXCEPTION WHEN OTHERS THEN
    IF sqlcode = -24064 THEN NULL; -- propagation already enabled
    ELSE RAISE;
    END IF;
    END;
    -- Start capture process BIO1$CAP
    BEGIN
    dbms_capture_adm.start_capture(
    capture_name => '"BIO1$CAP"');
    EXCEPTION WHEN OTHERS THEN
    IF sqlcode = -26666 THEN NULL; -- CAPTURE process already running
    ELSE RAISE;
    END IF;
    END;
    /
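When a MAINTAIN_TABLES run completes without errors but nothing replicates, a quick way to find where the stream stalls is to check each component's status in turn (standard dictionary views; run as strmadmin on the site indicated):
-- On the source
SELECT capture_name, status, error_number, error_message FROM dba_capture;
SELECT propagation_name, status, error_message FROM dba_propagation;
-- On the destination
SELECT apply_name, status, error_number, error_message FROM dba_apply;
SELECT apply_name, local_transaction_id, error_message FROM dba_apply_error;
An ABORTED status or a row in dba_apply_error usually points straight at the failing piece.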

  • Creation of External table by using XML files.

I am in the process of loading XML file data into a database table, and I want to use the external table feature for this. Though we have the external table feature for plain/text files, is there any way to load XML file data into a database table using an external table?
I am using Oracle 9i.
I appreciate your responses.
    Regards

    Hi,
The XML file which you posted works fine, which proves that an external table can be created using XML files.
Now my problem is that my XML files are not like book.xml; my file has a somewhat different format. Below is an extract of the file:
<?xml version="1.0" encoding="UTF-8" ?>
<PM-History deviceIP="172.20.7.50">
  <Error Reason="" />
  <Interface IntfName="otu2-1-10B-3">
    <TS Type="15-MIN">
      <Error Reason="" />
      <PM-counters TimeStamp="02/13/2008:12:15">
        <Item Name="BBE-S" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="BBE-SFE" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="ES-S" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="ES-SFE" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="SES-S" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="SES-SFE" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="CSES-S" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="CSES-SFE" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="UAS-S" Direction="Received" Validity="ADJ" Value="135" />
        <Item Name="UAS-SFE" Direction="Received" Validity="ADJ" Value="0" />
        <Item Name="SEF-S" Direction="Received" Validity="ADJ" Value="135" />
      </PM-counters>
      <PM-counters TimeStamp="03/26/2008:12:30">
        <Item Name="BBE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="BBE-FE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="ES" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="ES-FE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="SES" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="SES-FE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="CSES" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="CSES-FE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="UAS" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="UAS-FE" Direction="Received" Validity="OFF" Value="0" />
        <Item Name="PSC" Direction="Received" Validity="OFF" Value="0" />
      </PM-counters>
    </TS>
  </Interface>
</PM-History>
My problem is that each Item's Name plus Direction (e.g. PSC/Received or UAS/Received) should become a column name of the table, and '0' would be the value of that column. I am confused about how to write the external table creation program for that.
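For what it's worth, one way to shred attribute-centric XML like this into rows on 9i is XMLSEQUENCE over an XMLType (a sketch; it assumes the document has already been loaded into an XMLType column doc of a staging table xml_stage, both placeholders):
SELECT EXTRACTVALUE(VALUE(i), '/Item/@Name')      AS item_name,
       EXTRACTVALUE(VALUE(i), '/Item/@Direction') AS direction,
       EXTRACTVALUE(VALUE(i), '/Item/@Value')     AS item_value
FROM   xml_stage s,
       TABLE(XMLSEQUENCE(EXTRACT(s.doc,
              '/PM-History/Interface/TS/PM-counters/Item'))) i;
Pivoting those name/value rows into one column per Name+Direction pair can then be done with the usual MAX(DECODE(...)) GROUP BY trick.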
    I would really appreciate your responses.
    Regards

• Data is not transferred to a table view using Signal Out

    Hi all,
we recently upgraded to SP16, and I'm now facing a problem in a model that worked fine when we were on SP15.
I have a data service returning data to my model. The data is transferred to a Signal Out element and then displayed in a table view using a Signal In element. Now the table view remains empty.
I have checked that the data service returns data, because if I connect my table view directly to the data service the data is displayed. But my requirement is to use Signal In/Out elements.
This error did not appear in SP15.
    Any ideas?
    Regards
    Panos

After doing some tests I found out that if you have a model with more than 2 Signal In/Out pairs, the 3rd one will not work. This is definitely a bug.
Hope that soon there will be an SP without any new bugs!
