Oracle Streams - Table instantiation

Hi,
I have included DDL rules in the capture process. Everything goes fine except when I create a new table in the source. It gives the error below.
ORA-26687: no instantiation SCN provided for "PCAT_NT"."" in source database "PCAT"
My DDL rules are as below:
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'PCAT_NT',
streams_type => 'CAPTURE',
streams_name => 'PCAT_NT_CAPTURE',
queue_name => 'STRMADMIN.STRMPCAT_QUEUE',
include_dml => false,
include_ddl => true,
source_database => 'PCAT',
inclusion_rule => TRUE,
and_condition => '(:ddl.get_command_type() = ''TRUNCATE TABLE'')');
END;
/
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'PCAT_NT',
streams_type => 'CAPTURE',
streams_name => 'PCAT_NT_CAPTURE',
queue_name => 'STRMADMIN.STRMPCAT_QUEUE',
include_dml => false,
include_ddl => true,
source_database => 'PCAT',
inclusion_rule => TRUE,
and_condition => '(:ddl.get_command_type() = ''ALTER TABLE'')');
END;
/
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'PCAT_NT',
streams_type => 'CAPTURE',
streams_name => 'PCAT_NT_CAPTURE',
queue_name => 'STRMADMIN.STRMPCAT_QUEUE',
include_dml => false,
include_ddl => true,
source_database => 'PCAT',
inclusion_rule => TRUE,
and_condition => '(:ddl.get_command_type() = ''CREATE INDEX'')');
END;
/
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'PCAT_NT',
streams_type => 'CAPTURE',
streams_name => 'PCAT_NT_CAPTURE',
queue_name => 'STRMADMIN.STRMPCAT_QUEUE',
include_dml => false,
include_ddl => true,
source_database => 'PCAT',
inclusion_rule => TRUE,
and_condition => '(:ddl.get_command_type() = ''CREATE TABLE'')');
END;
/
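For what it's worth, ORA-26687 on a table created after instantiation usually means the apply site has no instantiation SCN recorded for that table. A minimal sketch of the usual fix, run at the apply site; the new table's name and the database link back to the source are assumptions, not values from this setup:
DECLARE
  v_scn NUMBER;
BEGIN
  -- read the current SCN at the source over a database link (link name assumed)
  v_scn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER@PCAT;
  -- record it so the apply process knows where replication starts for the new table
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name   => 'PCAT_NT.NEW_TABLE',  -- hypothetical new table
    source_database_name => 'PCAT',
    instantiation_scn    => v_scn);
END;
/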
Regards

I followed document ID 733691.1 for the downstream setup and couldn't see the DBMS_STREAMS_ADM.MAINTAIN_SCHEMAS procedure there.
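For reference, a hedged sketch of what a MAINTAIN_SCHEMAS call looks like (10gR2 and later); the destination name and the network instantiation mode here are assumptions, not values taken from note 733691.1:
BEGIN
  DBMS_STREAMS_ADM.MAINTAIN_SCHEMAS(
    schema_names                 => 'PCAT_NT',
    source_directory_object      => NULL,
    destination_directory_object => NULL,
    source_database              => 'PCAT',
    destination_database         => 'PCAT_DST',  -- hypothetical destination
    capture_name                 => 'PCAT_NT_CAPTURE',
    include_ddl                  => TRUE,
    instantiation                => DBMS_STREAMS_ADM.INSTANTIATION_SCHEMA_NETWORK);
END;
/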
Regards

Similar Messages

  • How to load a base64 stream (always a Word document) in large XML files (~1 MB) into an Oracle 10 table

    On our Oracle server there are multiple XML files that I have to read, putting the data into an Oracle 10 table.
    I have to process these XML files one by one.
    Each XML also contains a base64 stream (always a Word document); this base64 stream consists of about 1,000,000 characters.
    How can I read this base64 stream from the XML file into a BLOB column of my Oracle 10 table?
    ORACLE 10
    1 XML = +/- 1 MB
    PL/SQL
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <slo xmlns="http://www.myschema.com/gasdedectie">
    <LekKey>1999_036371_509627</LekKey>
    <HuiNum>46</HuiNum>
    <Res></Res>
    <InfLig>TEST STRUI AFGESTORVEN - PLAANSTRAAT 46</InfLig>
    <XWGS>3.637028</XWGS>
    <YWGS>50.962667</YWGS>
    <Pei>KESTENS</Pei>
    <DatPei>1999-11-30T10:17:36.000+01:00</DatPei>
    <Kan> </Kan>
    <Doc>UEsDBBQABgAIAAAAIQB5gHbnswEAAHcGAAATAAgCW0NvbnRlbnRfVHlwZXNdLnhtbCCiBAIooAACAAAA
    [base64-encoded Word document payload, roughly 1,000,000 characters, omitted for brevity]
    pQQAALNCCQAAAA==</Doc>
    </slo>

    If I understand your question correctly, the discussion in this thread
    ORA-31167: 64k size limit for XML node
    shows you do not have a lot of options.
    As you are needing to extract the contents of a node from an XML document, and that node is > 64K, you are going to need to treat the entire XML as a CLOB and use CLOB functionality (aka INSTR, SUBSTR) to find and extract what you need.
    Once your DB version is 11.1 or greater, then you can use the built-in that Billy mentions.
    Note: mdrake is Mark Drake from Oracle.
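    A minimal sketch of that CLOB approach, assuming the XML is already loaded into a CLOB column; the staging and target table names are hypothetical, and the chunk size is kept a multiple of 4 so each base64 chunk decodes independently:
    DECLARE
      v_xml   CLOB;
      v_b64   CLOB;
      v_blob  BLOB;
      v_raw   RAW(32767);
      v_start PLS_INTEGER;
      v_end   PLS_INTEGER;
      v_pos   PLS_INTEGER := 1;
      c_chunk CONSTANT PLS_INTEGER := 19200;  -- multiple of 4: keeps base64 groups aligned
    BEGIN
      SELECT xml_doc INTO v_xml FROM xml_staging WHERE id = 1;  -- hypothetical staging table
      -- locate the payload with the LOB-aware INSTR
      v_start := DBMS_LOB.INSTR(v_xml, '<Doc>') + LENGTH('<Doc>');
      v_end   := DBMS_LOB.INSTR(v_xml, '</Doc>');
      -- copy the payload and strip line breaks so fixed-size chunks stay 4-aligned
      DBMS_LOB.CREATETEMPORARY(v_b64, TRUE);
      DBMS_LOB.COPY(v_b64, v_xml, v_end - v_start, 1, v_start);
      v_b64 := REPLACE(REPLACE(v_b64, CHR(13)), CHR(10));
      -- decode chunk by chunk into a BLOB
      DBMS_LOB.CREATETEMPORARY(v_blob, TRUE);
      WHILE v_pos <= DBMS_LOB.GETLENGTH(v_b64) LOOP
        v_raw := UTL_ENCODE.BASE64_DECODE(
                   UTL_RAW.CAST_TO_RAW(DBMS_LOB.SUBSTR(v_b64, c_chunk, v_pos)));
        DBMS_LOB.WRITEAPPEND(v_blob, UTL_RAW.LENGTH(v_raw), v_raw);
        v_pos := v_pos + c_chunk;
      END LOOP;
      INSERT INTO word_docs (id, doc) VALUES (1, v_blob);  -- hypothetical target table
      COMMIT;
    END;
    /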

  • Oracle stream - Downstream new table setup

    Hi,
    I want to add a new table to my existing Oracle Streams setup. Below are the steps. Are they OK?
    1) stop apply/capture
    2) Add a new rule to the existing capture, which internally will call DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION too (I guess; see the sketch after these steps).
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'NIG.BUILD_VIEWS',
    streams_type => 'CAPTURE',
    streams_name => 'NIG_CAPTURE',
    queue_name => 'STRMADMIN.NIG_Q',
    include_dml => true,
    include_ddl => true,
    source_database => 'PNID.LOUDCLOUD.COM');
    END;
    /
    3) Import (which will instantiate the table):
    impdp system DIRECTORY=DBA_WORK_DIRECTORY DUMPFILE=nig_part2_srm_expdp_%U.dmp exclude=grant,statistics,ref_constraint
    4) start apply/capture
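    On the "I guess" in step 2: ADD_TABLE_RULES does prepare the table for instantiation, but it is cheap to make that explicit and then verify it from the dictionary. A hedged sketch:
    BEGIN
      DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(
        table_name => 'NIG.BUILD_VIEWS');
    END;
    /
    -- verify: the table should now appear here with a preparation SCN
    SELECT table_owner, table_name, scn
      FROM dba_capture_prepared_tables
     WHERE table_owner = 'NIG';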

    Have you applied it this way? What was the result?
    regards

  • Oracle Streaming in same database - 10g

    Oracle Streaming in 10g database
    Posted: Sep 21, 2006 4:25 AM
    Hello,
    I am trying to do streaming at table level for all DML changes from one schema (source) to another schema (target), controlled by an admin schema (stream_admin). I have all these schemas in the same database (10g Enterprise Edition).
    As given in the documentation, I created
    1. two queues (in_Q and out_Q) in stream_admin,
    2. two processes (capture and apply) in stream_admin,
    3. a propagation rule for propagation between in_Q and out_Q,
    4. did instantiation,
    5. started both the capture and apply processes.
    Having done that, I insert into the source table and check for the same row in the target, but nothing happens. I fail to achieve streaming.
    I am not getting any error: neither in the processes, nor in the propagation or queues. And all queues, rules and processes are enabled.
    Please help.
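    A hedged first checklist for a silent setup like this, using the standard 10g Streams views (exact columns can vary slightly by release); for intra-database Streams the propagation between in_Q and out_Q is a common failure point:
    -- is the capture actually capturing and enqueuing?
    SELECT capture_name, state, total_messages_captured, total_messages_enqueued
      FROM v$streams_capture;
    -- has the propagation between in_Q and out_Q recorded an error?
    SELECT propagation_name, error_message, error_date
      FROM dba_propagation;
    -- is the apply enabled, and has it trapped anything?
    SELECT apply_name, status FROM dba_apply;
    SELECT apply_name, error_message FROM dba_apply_error;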

    datapump uses dbms_metadata extensively.
    The problem is twofold:
    - the amount of data
    - why on earth do you need to 'copy' these 1.2 Tb a 'number of times'?
    One would expect you would have tested the upgrade on a smaller test database, and that you wouldn't need to fix bugs in a 1.2 Tb database.
    A better idea would be to duplicate the complete database using RMAN. This doesn't perform conventional INSERTs and doesn't create redo.
    Sybrand Bakker
    Senior Oracle DBA

  • Oracle Streams 'ORA-25215: user_data type and queue type do not match'

    I am trying replication between two databases (10.2.0.3) using Oracle Streams.
    I have followed the instructions at http://www.oracle.com/technology/oramag/oracle/04-nov/o64streams.html
    The main steps are:
    1. Set up ARCHIVELOG mode.
    2. Set up the Streams administrator.
    3. Set initialization parameters.
    4. Create a database link.
    5. Set up source and destination queues.
    6. Set up supplemental logging at the source database.
    7. Configure the capture process at the source database.
    8. Configure the propagation process.
    9. Create the destination table.
    10. Grant object privileges.
    11. Set the instantiation system change number (SCN).
    12. Configure the apply process at the destination database.
    13. Start the capture and apply processes.
    For step 5, I have used 'set_up_queue' in the 'dbms_streams_adm' package. This procedure creates a queue table and an associated queue.
    The problem is that, in the propagation process, I get this error:
    'ORA-25215: user_data type and queue type do not match'
    I have checked it, and the queue table and its associated queue are created as shown:
    sys.dbms_aqadm.create_queue_table (
    queue_table => 'CAPTURE_SFQTAB'
    , queue_payload_type => 'SYS.ANYDATA'
    , sort_list => ''
    , COMMENT => ''
    , multiple_consumers => TRUE
    , message_grouping => DBMS_AQADM.TRANSACTIONAL
    , storage_clause => 'TABLESPACE STREAMSTS LOGGING'
    , compatible => '8.1'
    , primary_instance => '0'
    , secondary_instance => '0');
    sys.dbms_aqadm.create_queue(
    queue_name => 'CAPTURE_SFQ'
    , queue_table => 'CAPTURE_SFQTAB'
    , queue_type => sys.dbms_aqadm.NORMAL_QUEUE
    , max_retries => '5'
    , retry_delay => '0'
    , retention_time => '0'
    , COMMENT => '');
    The capture process is 'capturing changes' but it seems that these changes cannot be enqueued into the capture queue because the data type is not correct.
    As far as I know, 'sys.anydata' payload type and 'normal_queue' type are the right parameters to get a successful configuration.
    I would be really grateful for any idea!

    Hi
    You need to run a VERIFY to make sure that the queues are compatible. At least on my 10.2.0.3/4 I need to do it.
    DECLARE
    rc BINARY_INTEGER;
    BEGIN
    DBMS_AQADM.VERIFY_QUEUE_TYPES(
    src_queue_name => 'np_out_onlinex',
    dest_queue_name => 'np_out_onlinex',
    rc => rc,
    destination => 'scnp.pfa.dk',
    transformation => 'TransformDim2JMS_001x');
    DBMS_OUTPUT.PUT_LINE('Compatible: '||rc);
    END;
    /
    If you don't have transformations and/or a remote destination, then delete those params.
    Check the table SYS.AQ$_MESSAGE_TYPES; there you can see what has been verified and what hasn't.
    regards
    Mette

  • Doubt in Oracle streams

    Doubt in Oracle Streams. Can you please help me in understanding the terms:
    1. Message
    2. User-defined event
    3. Event
    4. Rules
    5. Oracle-supplied PL/SQL packages
    6. Subscriber, consumer

    Hi
    Message
    A message is the smallest unit of information that is inserted into and retrieved from a queue.
    Queue
    A queue is a repository for messages. Queues are stored in queue tables.
    Enqueue
    To place a message in a queue.
    Dequeue
    To consume a message.
    Agent
    An agent is an end user or an application that uses a queue.
    Thanks
    Venkat
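    To make "message", "enqueue" and "queue" concrete, a minimal sketch that enqueues one ANYDATA message; the queue strmadmin.demo_q is hypothetical, and its creation and subscribers are not shown:
    DECLARE
      enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
      msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
      msg_id    RAW(16);
    BEGIN
      -- enqueue: place one message (a VARCHAR2 wrapped in ANYDATA) into the queue
      DBMS_AQ.ENQUEUE(
        queue_name         => 'strmadmin.demo_q',  -- hypothetical ANYDATA queue
        enqueue_options    => enq_opts,
        message_properties => msg_props,
        payload            => SYS.ANYDATA.CONVERTVARCHAR2('hello, queue'),
        msgid              => msg_id);
      COMMIT;
    END;
    /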

  • Help on Oracle streams 11g configuration

    Hi Streams experts
    Can you please validate the following creation process steps?
    What I need Streams to do is a one-way replication of the AR
    schema from one database to another. Both DML and DDL changes shall
    be replicated. I would also need your help
    on the maintenance steps, controls and procedures.
    2 databases:
    1 src as the source database
    1 dst as the destination database
    Replication type: one-way, of the entire schema FaeterBR.
    Step 1. Set all databases in archivelog mode.
    Step 2. Change initialization parameters for Streams. The Streams pool
    size and NLS_DATE_FORMAT require a restart of the instance.
    SQL> alter system set global_names=true scope=both;
    SQL> alter system set undo_retention=3600 scope=both;
    SQL> alter system set job_queue_processes=4 scope=both;
    SQL> alter system set streams_pool_size= 20m scope=spfile;
    SQL> alter system set NLS_DATE_FORMAT=
    'YYYY-MM-DD HH24:MI:SS' scope=spfile;
    SQL> shutdown immediate;
    SQL> startup
    Step 3. Create Streams administrators on the src and dst databases,
    and grant required roles and privileges. Create default tablespaces so
    that they are not using SYSTEM.
    ---at the src:
    SQL> create tablespace streamsdm datafile
    '/u01/product/oracle/oradata/orcl/strepadm01.dbf' size 100m;
    ---at the replica:
    SQL> create tablespace streamsdm datafile
    '/u02/oracle/oradata/str10/strepadm01.dbf' size 100m;
    ---at both sites:
    SQL> create user streams_adm
    identified by streams_adm
    default tablespace streamsdm
    temporary tablespace temp;
    SQL> grant connect, resource, dba, aq_administrator_role to
    streams_adm;
    SQL> BEGIN
    DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE (
    grantee => 'streams_adm',
    grant_privileges => true);
    END;
    Step 4. Configure the tnsnames.ora at each site so that a connection
    can be made to the other database.
    Step 5. With the tnsnames.ora squared away, create a database link for
    the streams_adm user at both SRC and DST. With the init parameter
    global_name set to True, the db_link name must be the same as the
    global_name of the database you are connecting to. Use a SELECT from
    the table global_name at each site to determine the global name.
    SQL> select * from global_name;
    SQL> connect streams_adm/streams_adm@SRC
    SQL> create database link DST
    connect to streams_adm identified by streams_adm
    using 'DST';
    SQL> select sysdate from dual@DST;
    SQL> connect streams_adm/streams_adm@DST
    SQL> create database link SRC
    connect to streams_adm identified by streams_adm
    using 'SRC';
    SQL> select sysdate from dual@SRC;
    Step 6. Control what schema shall be replicated
    FaeterBR is the schema to be replicated
    Step 7. Add supplemental logging to the FaeterBR schema on all the
    tables?
    SQL> Alter table FaeterBR.tb1 add supplemental log data
    (ALL) columns;
    SQL> alter table FaeterBR.tb2 add supplemental log data
    (ALL) columns;
    etc...
    Step 8. Create Streams queues at the primary and replica database.
    ---at SRC (primary):
    SQL> connect streams_adm/streams_adm@ORCL
    SQL> BEGIN
    DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'streams_adm.FaeterBR_src_queue_table',
    queue_name => 'streams_adm.FaeterBR_src_queue');
    END;
    ---At DST (replica):
    SQL> connect streams_adm/streams_adm@STR10
    SQL> BEGIN
    DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'streams_adm.FaeterBR_dst_queue_table',
    queue_name => 'streams_adm.FaeterBR_dst_queue');
    END;
    Step 9. Create the capture process on the source database (SRC).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name =>'FaeterBR',
    streams_type =>'capture',
    streams_name =>'FaeterBR_src_capture',
    queue_name =>'FaeterBR_src_queue',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database => NULL,
    inclusion_rule => true);
    END;
    Step 10. Instantiate the FaeterBR schema at DST by doing export/
    import. Can I use Data Pump to do that? (A hedged Data Pump sketch follows below.)
    ---AT SRC:
    exp system/superman file=FaeterBR.dmp log=FaeterBR.log
    object_consistent=y owner=FaeterBR
    ---AT DST:
    ---Create FaeterBR tablespaces and user:
    create tablespace FaeterBR_ datafile
    '/u02/oracle/oradata/str10/FaeterBR_01.dbf' size 100G;
    create tablespace ws_app_idx datafile
    '/u02/oracle/oradata/str10/ws_app_idx_01.dbf' size 100G;
    create user FaeterBR identified by FaeterBR_
    default tablespace FaeterBR_
    temporary tablespace temp;
    grant connect, resource to FaeterBR;
    imp system/123db file=FaeterBR.dmp log=FaeterBR.log fromuser=FaeterBR
    touser=FaeterBR streams_instantiation=y
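    On the Data Pump question in step 10: yes. A hedged sketch of the equivalent commands; the directory object is an assumption and <scn> is a placeholder for the SCN captured at export time. FLASHBACK_SCN plays the role of OBJECT_CONSISTENT=Y, and a Data Pump import of such a consistent export should record the instantiation SCN itself, so no STREAMS_INSTANTIATION flag is needed:
    ---AT SRC (consistent export as of the current SCN):
    expdp system SCHEMAS=FaeterBR DIRECTORY=DP_DIR FLASHBACK_SCN=<scn> DUMPFILE=FaeterBR.dmp LOGFILE=FaeterBR_exp.log
    ---AT DST:
    impdp system SCHEMAS=FaeterBR DIRECTORY=DP_DIR DUMPFILE=FaeterBR.dmp LOGFILE=FaeterBR_imp.log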
    Step 11. Create a propagation job at the source database (SRC).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
    schema_name =>'FaeterBR',
    streams_name =>'FaeterBR_src_propagation',
    source_queue_name =>'streams_adm.FaeterBR_src_queue',
    destination_queue_name=>'streams_adm.FaeterBR_dst_queue@dst',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database =>'SRC',
    inclusion_rule =>true);
    END;
    Step 12. Create an apply process at the destination database (DST).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name =>'FaeterBR',
    streams_type =>'apply',
    streams_name =>'FaeterBR_Dst_apply',
    queue_name =>'FaeterBR_dst_queue',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database =>'SRC',
    inclusion_rule =>true);
    END;
    Step 13. Create substitution key columns for all the tables of the
    FaeterBR schema on DST that don't have a primary key.
    The column combination must provide a unique value for Streams.
    SQL> BEGIN
    DBMS_APPLY_ADM.SET_KEY_COLUMNS(
    object_name =>'FaeterBR.tb2',
    column_list =>'id1,names,toys,vendor');
    END;
    Step 14. Configure conflict resolution at the replication db (DST).
    Is there an easier method applicable to the whole schema? (See the sketch after this step.)
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'id';
    cols(2) := 'names';
    cols(3) := 'toys';
    cols(4) := 'vendor';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name =>'FaeterBR.tb2',
    method_name =>'OVERWRITE',
    resolution_column=>'FaeterBR',
    column_list =>cols);
    END;
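    On the "easier method for the whole schema" question: there is no single schema-level call for SET_UPDATE_CONFLICT_HANDLER, but a loop over the dictionary gets close. A hedged sketch that registers an OVERWRITE handler for every table in the schema; for OVERWRITE the resolution_column is not used for resolution, but it must name a column in the list:
    BEGIN
      FOR t IN (SELECT table_name FROM dba_tables WHERE owner = 'FAETERBR') LOOP
        DECLARE
          cols DBMS_UTILITY.NAME_ARRAY;
          n    PLS_INTEGER := 0;
        BEGIN
          -- build the column list for this table from the dictionary
          FOR c IN (SELECT column_name FROM dba_tab_columns
                     WHERE owner = 'FAETERBR' AND table_name = t.table_name
                     ORDER BY column_id) LOOP
            n := n + 1;
            cols(n) := c.column_name;
          END LOOP;
          DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
            object_name       => 'FAETERBR.' || t.table_name,
            method_name       => 'OVERWRITE',
            resolution_column => cols(1),
            column_list       => cols);
        END;
      END LOOP;
    END;
    /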
    Step 15. Enable the capture process on the source database (SRC).
    BEGIN
    DBMS_CAPTURE_ADM.START_CAPTURE(
    capture_name => 'FaeterBR_src_capture');
    END;
    Step 16. Enable the apply process on the replication database (DST).
    BEGIN
    DBMS_APPLY_ADM.START_APPLY(
    apply_name => 'FaeterBR_DST_apply');
    END;
    Step 17. Test streams propagation of rows from source (src) to
    replication (DST).
    AT ORCL:
    insert into FaeterBR.tb2 values (
    31000, 'BAMSE', 'DR', 'DR Lejetoej');
    AT STR10:
    connect FaeterBR/FaeterBR
    select * from FaeterBR.tb2 where vendor= 'DR Lejetoej';
    Any other test that can be made?

    Check the metalink doc 301431.1 and validate
    How To Setup One-Way SCHEMA Level Streams Replication [ID 301431.1]
    Oracle Server Enterprise Edition - Version: 10.1.0.2 to 11.1.0.6
    Cheers.

  • Oracle streams configuration problem

    Hi all,
    I'm trying to configure Oracle Streams on my source database (Oracle 9.2), and when I execute DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS') I get the error below:
    ERROR at line 1:
    ORA-01353: existing Logminer session
    ORA-06512: at "SYS.DBMS_LOGMNR_D", line 2238
    ORA-06512: at line 1
    When checking some docs, they say I have to destroy all LogMiner sessions, but when I look at the v$session view I cannot identify any LogMiner session. If someone can help me: I need these Streams tools for schema synchronization between my production database and my data warehouse database.
    What I want to know is how to destroy or stop the LogMiner session.
    Thanks for your help
    regards
    raitsarevo

    Thanks Werner, it's OK now, my problem is solved; below is the output of your script.
    While I'm at it: if you have some docs or advice for database schema synchronisation, is using Oracle Streams the best option, or can I use anything else? Not the Data Guard concept or a standby database, because I only want to apply DML changes, not DDL. I'm after docs for Oracle Streams, especially for schema (not table) synchronization.
    Many thanks again, and please send them to my email address [email protected] if needed.
    ABILLITY>DELETE FROM system.logmnr_uid$;
    1 row deleted.
    ABILLITY>DELETE FROM system.logmnr_session$;
    1 row deleted.
    ABILLITY>DELETE FROM system.logmnrc_gtcs;
    0 rows deleted.
    ABILLITY>DELETE FROM system.logmnrc_gtlo;
    13 rows deleted.
    ABILLITY>EXECUTE DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
    PL/SQL procedure successfully completed.
    regards
    raitsarevo

  • Oracle Streams 9i Database Link Error

    Can anyone help me regarding Oracle Streams 9i?
    Oracle Version: 9.2.0.5.0
    Initialization parameters:
    global_names=true
    aq_tm_processes=1
    job_queue_processes=10
    log_parallelism=1 scope=spfile
    The database is in archivelog mode.
    I executed the following SQL statements:
    CONNECT sys/xxxxx@db1 AS SYSDBA
    CREATE USER strmadmin IDENTIFIED BY strmadminpw
    DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
    GRANT CONNECT, RESOURCE, SELECT_CATALOG_ROLE TO strmadmin;
    GRANT EXECUTE ON DBMS_AQADM TO strmadmin;
    GRANT EXECUTE ON DBMS_CAPTURE_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_PROPAGATION_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_STREAMS_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_APPLY_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_FLASHBACK TO strmadmin;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_SET_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    CONNECT strmadmin/strmadminpw@db1
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
    CREATE DATABASE LINK db2 CONNECT TO strmadmin IDENTIFIED BY strmadminpw USING 'DB2';
    CONNECT sys/xxxxx@db2 AS SYSDBA
    CREATE USER strmadmin IDENTIFIED BY strmadminpw
    DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
    GRANT CONNECT, RESOURCE, SELECT_CATALOG_ROLE TO strmadmin;
    GRANT EXECUTE ON DBMS_AQADM TO strmadmin;
    GRANT EXECUTE ON DBMS_CAPTURE_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_PROPAGATION_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_STREAMS_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_APPLY_ADM TO strmadmin;
    GRANT EXECUTE ON DBMS_FLASHBACK TO strmadmin;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_SET_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_OBJ,
    grantee => 'strmadmin',
    grant_option => FALSE);
    END;
    CONNECT strmadmin/strmadminpw@db2
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
    CONNECT sys/xxxxx@db2 as sysdba
    GRANT ALL ON scott.dept TO strmadmin;
    CONNECT sys/xxxxx@db1 AS SYSDBA
    CREATE TABLESPACE logmnr_ts DATAFILE 'D:\oracle\oradata\db1\logmnr01.dbf'
    SIZE 25 M REUSE AUTOEXTEND ON MAXSIZE UNLIMITED;
    EXECUTE DBMS_LOGMNR_D.SET_TABLESPACE('logmnr_ts');
    CONNECT sys/xxxxx@db1 AS SYSDBA
    ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP log_group_dept_pk (deptno) ALWAYS;
    CONNECT strmadmin/strmadminpw@db1
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
    table_name => 'scott.dept',
    streams_name => 'db1_to_db2',
    source_queue_name => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@db2',
    include_dml => true,
    include_ddl => true,
    source_database => 'db1');
    END;
    CONNECT strmadmin/strmadminpw@db1
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'scott.dept',
    streams_type => 'capture',
    streams_name => 'capture_simp',
    queue_name => 'strmadmin.streams_queue',
    include_dml => true,
    include_ddl => true);
    END;
    Then I did this exp/imp:
    exp userid=scott/tiger@db1 FILE=dept_instant.dmp TABLES=dept OBJECT_CONSISTENT=y ROWS=n
    imp userid=scott/tiger@db2 FILE=dept_instant.dmp IGNORE=y COMMIT=y LOG=import.log STREAMS_INSTANTIATION=y
    and then
    CONNECT sys/xxxxx@db2 AS SYSDBA
    ALTER TABLE scott.dept DROP SUPPLEMENTAL LOG GROUP log_group_dept_pk;
    CONNECT strmadmin/strmadminpw@db2
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'scott.dept',
    streams_type => 'apply',
    streams_name => 'apply_simp',
    queue_name => 'strmadmin.streams_queue',
    include_dml => true,
    include_ddl => true,
    source_database => 'db1');
    END;
    CONNECT strmadmin/strmadminpw@db2
    BEGIN
    DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'apply_simp',
    parameter => 'disable_on_error',
    value => 'n');
    DBMS_APPLY_ADM.START_APPLY(
    apply_name => 'apply_simp');
    END;
    CONNECT strmadmin/strmadminpw@db1
    BEGIN
    DBMS_CAPTURE_ADM.START_CAPTURE(
    capture_name => 'capture_simp');
    END;
    Here is my problem:
    After I configured it, Streams still cannot replicate. And when I checked the EM GUI for Streams monitoring, it displays a broken database link.
    It also says that the database link is not active.
    Can anyone please help?
    Thanks.
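    A few hedged checks for the broken link; note that with global_names=true the link's name must match the remote database's global name (exact DBA_PROPAGATION columns can vary by release):
    -- as strmadmin on db1: does the link itself work?
    SELECT sysdate FROM dual@db2;
    -- what links exist, and does the link name match the remote global_name?
    SELECT db_link, username, host FROM dba_db_links;
    SELECT * FROM global_name@db2;
    -- has the propagation recorded an error?
    SELECT propagation_name, destination_dblink, error_message, error_date
      FROM dba_propagation;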

    I don't know what Datacomp is, but you should be able
    to connect to any non-Oracle database if there is
    an ODBC driver for it. You would use Generic
    Heterogeneous Services to do this. See the following
    link to the documentation for details:
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96544/gencon.htm#1656
    There are also several articles on the Net. Do a google
    search on oracle generic heterogeneous services /
    connectivity.
    Hope this helps.
    Kailash.

  • Oracle Streams Update conflict handler not working

    Hello,
    I've been working on Oracle Streams, and this time we have to come up with an update conflict handler.
    We are using Oracle 11g on Solaris10 env.
    So far, we have implemented bi-directional Oracle Streams Replication and it is working fine.
    Now, when I try to implement the update conflict handler, it executes successfully but it does not fulfil the desired functionality.
    Here are the steps I performed:
    Step 1:
    create table jas23.test73 (first_name varchar2(20), last_name varchar2(20), salary number(7));
    ALTER TABLE jas23.test73 ADD (time TIMESTAMP WITH TIME ZONE);
    insert into jas23.test73 values ('gugg','qwer',2000,SYSTIMESTAMP);
    insert into jas23.test73 values ('papa','sdds',2050,SYSTIMESTAMP);
    insert into jas23.test73 values ('jaja','xzxc',2075,SYSTIMESTAMP);
    insert into jas23.test73 values ('kaka','cvdxx',2095,SYSTIMESTAMP);
    insert into jas23.test73 values ('mama','rfgy',1900,SYSTIMESTAMP);
    insert into jas23.test73 values ('tata','jaja',1950,SYSTIMESTAMP);
    commit;
    Step 2:
    Connect as strmadmin/strmadmin to server1:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    Step 3:
    SQL>
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'first_name';
    cols(2) := 'last_name';
    cols(3) := 'salary';
    cols(4) := 'time';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name => 'jas23.test73',
    method_name => 'MAXIMUM',
    resolution_column => 'time',
    column_list => cols);
    END;
    Step 4:
    Connect as strmadmin/strmadmin to server2:
    SQL>
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'first_name';
    cols(2) := 'last_name';
    cols(3) := 'salary';
    cols(4) := 'time';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name => 'jas23.test73',
    method_name => 'MAXIMUM',
    resolution_column => 'time',
    column_list => cols);
    END;
    Step 5:
    And now, if I try to update the value of salary, it is not handled by the update conflict handler.
    update jas23.test73 set salary = 1500,time=SYSTIMESTAMP where first_name='papa'; --server1
    update jas23.test73 set salary = 2500,time=SYSTIMESTAMP where first_name='papa'; --server2
    commit; --server1
    commit; --server2
    Note: both servers are in different time zones (I hope that won't be a problem).
    Now, after performing all these steps, the data is not the same at both sites.
    Error (from DBA_APPLY_ERROR):
    ORA-26787: The row with key ("FIRST_NAME", "LAST_NAME", "SALARY", "TIME") = (papa, sdds, 2000, 23-DEC-10 05.46.18.994233000 PM +00:00) does not exist in table JAS23.TEST73
    ORA-01403: no data found
    Please help.
    Thanks.
    Edited by: gags on Dec 23, 2010 12:30 PM

    Hi,
    When I tried to do it on server 2:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    it throws me an error:
    ERROR at line 1:
    ORA-32588: supplemental logging attribute all column exists
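    ORA-32588 here just means that the ALTER TABLE from step 2 had already been run on server 2, so supplemental logging is already in place. A hedged dictionary check to confirm it:
    -- table-level supplemental log groups for the replicated table
    SELECT log_group_name, log_group_type, always
      FROM dba_log_groups
     WHERE owner = 'JAS23' AND table_name = 'TEST73';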

  • BLOB in Oracle Streams

    Oracle 10.2.0.4:
    I am new to Oracle Streams and just reading the docs at this point. I read in the http://download.oracle.com/docs/cd/B19306_01/server.102/b14229.pdf doc that BLOBs are not supported by Streams. I am just looking for a basic Streams configuration with some rule processing which will send LCRs from a source queue to a destination queue. And as I understand it, I can do that by using an ANYDATA payload.
    We have some tables with BLOB data.

    It's all a balancing act. If you absolutely need both data centers processing transactions simultaneously, you'll need Streams.
    Lets start with the simplest possible case of this, two data centers A and B, with databases 1 and 2. Database 1 is in data center A, database 2 is in data center B. If database 1 fails, would you be able to shift traffic to database 2 relatively easily? Assuming that you're building in functionality to shift load between databases, which is normally the case when you're building this sort of distributed application, it may be easier to do this sort of shift regardless of the reason that database 1 fails.
    If you have a standby database in each data center (1A as the standby for database 1, 2A as the standby for database 2), when 1 fails, you have to figure out whether whatever caused 1 to fail will also cause 1A to fail. If data center A is having connectivity or power issues, for example, you would have to shift traffic to 2 rather than failing 1 over to 1A. On the other hand, if it was an isolated server failure, you could either shift traffic to 2 or fail over to 1A. There is some risk that having a more complex failure scenario makes it more likely that someone makes a mistake-- there will be a number of failover steps that you'd do only if you're failing from 1 to 1A and other steps that you'd do if you were shifting traffic from 1 to 2 and some steps that are common-- and makes it more difficult to fully test all the scenarios. On the other hand, there may well be benefits to having more options to respond to different sorts of failures. And politics/ reporting structure as well as geography plays a role here-- if the data centers are on different continents, shifting traffic is probably much less desirable than if you have two US data centers.
    If, rather than having standbys 1A and 2A, database 1 and 2 were really multi-node RAC clusters, both database 1 and database 2 would be able to survive most sorts of localized hardware failure (i.e. one node can fail on database 1 without affecting whether database 1 is up and processing transactions). If there was a data center wide failure, you'd still have to shift traffic. But one server dying in a pile wouldn't be an issue. Of course, there would be a handful of events that could take down the entire RAC cluster without affecting the data center where a standby could potentially be used (i.e. the SAN for the cluster fails but the standby is using a different SAN). Those may not be particularly likely, however, so it may make sense not to bother with contingency planning for them and just assume that anything that knocks out all the nodes of the cluster forces traffic to be shifted to 2 and that it wouldn't be worth trying to maintain a standby for those scenarios.
    There are lots of trade-offs here. You have simplicity of setup, you have simplicity of failover, you have robustness, etc. And there are going to be cases where you realistically need to take a stab at predicting how likely various events are, which gets pretty deeply into hardware, setup, and politics (i.e. how likely a server is to fail depends on whether you've bought a high-end server with doubly-redundant-everything or a commodity linux box, how likely a data center is to fail depends on the data center's redundancy measures and your level of confidence in those measures, etc.)
    Justin

  • Complex Oracle Streams issue - Update conflicts

    This is for Oracle Streams replication on 11g R2.
    I am facing update conflicts in a table. The conflicts arise from technical and business-logic issues. Rows affected by the business-logic issue will pass through the replication/apply process successfully, but we want to arrest and resolve them before replication for our requirements. These are typically somewhat complex cases, and we are exploring the possibility of having both DML handlers and error handlers: the DML handler would take care of business-logic conflicts, and the error handler of technical issues, before Streams pushes them to the error queue. Based on our understanding and verification, we found a limitation: a procedure DML handler and an error handler cannot be configured together for the same table operation.
    Statement handlers cannot be used for our conflict scenarios.
    Following are my questions:
    1. Has anyone implemented or faced such a scenario in their real-time application? If yes, can you please share some insights or inputs?
    2. Is there a custom way to handle this complex problem of configuring both DML and Error handler?
    3. Is there any alternative possible way to resolve this situation at Oracle streams environment with other handlers?
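    For concreteness, both registrations discussed here go through the same call, and the error_handler flag is what distinguishes them, which is why the same table/operation pair can't carry both at once (per the thread, the second registration effectively replaces the first). A hedged sketch with hypothetical table and procedure names:
    BEGIN
      -- procedure DML handler: runs instead of the default apply for UPDATEs
      DBMS_APPLY_ADM.SET_DML_HANDLER(
        object_name    => 'APP.ORDERS',                     -- hypothetical table
        object_type    => 'TABLE',
        operation_name => 'UPDATE',
        error_handler  => FALSE,
        user_procedure => 'strmadmin.orders_dml_handler');  -- hypothetical procedure
      -- error handler for the same pair: replaces the registration above
      DBMS_APPLY_ADM.SET_DML_HANDLER(
        object_name    => 'APP.ORDERS',
        object_type    => 'TABLE',
        operation_name => 'UPDATE',
        error_handler  => TRUE,
        user_procedure => 'strmadmin.orders_error_handler');
    END;
    /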

    Dear All
    I too have a similar requirement. Could anyone help with this one?
    We can handle the erroring transactions via error handler procedures.
    But we cannot configure a DML handler procedure for transactions that are successfully replicated; Streams does not allow us to configure a handler for this. Is there any other handler, procedure or hook in Streams where we can implement the desired functionality, which includes changing the values in the LCR before invoking lcr.execute()? We should be able to discard the LCR as well, if required.
    Regards
    Velmurugan
    Edited by: 982387 on Jan 16, 2013 11:25 PM
    Edited by: 982387 on Jan 16, 2013 11:27 PM

  • Capturing data of a previous time interval with Oracle Streams (HotLog)

    I read in the 10g Oracle manual that Oracle Streams can capture data within a specified time interval via the begin_date and end_date options.
    For example :
    BEGIN
    DBMS_CDC_PUBLISH.CREATE_CHANGE_SET(
    change_set_name => 'set_cns',
    description => 'set_cns...',
    change_source_name => 'HOTLOG_SOURCE',
    stop_on_ddl => 'y',
    begin_date => sysdate,
    end_date => sysdate + 1);
    END;
    However, if I set begin_date to a time before now, Oracle doesn't capture data anymore in this case (HotLog method).
    (I set begin_date => sysdate - 1/24,
    end_date => sysdate + 1/24.)
    Does anybody know how to capture a previous time interval with Oracle Streams?

    Change C2 to:
    cursor c2(passing_date IN date) IS
      SELECT MONITOR_ID, SAMPLE_ID, COLL_TIME, DEW_POINT
        FROM ARCHIVE_DATA
       WHERE COLL_TIME < passing_date
       ORDER BY COLL_TIME desc;
    And rather than populating a table with the three records, you could just select the three records using:
    where COLL_TIME between Prev3_time and Prev1_time

  • Execution of Row level trigger in Oracle Streams.

    Hi All,
    Oracle Database version : 9.2.0.4 on windows NT/2000 environment.
    We managed to install and configure the Oracle Streams technology.
    Oracle Streams seems to be working fine for replication of DML and DDL changes from the source database to the target database.
    Following is detail at source end.
    Source Sid = acc
    Source Schema = stream
    Source Table = dept
    structure of dept table.
    Name Null? Type
    DEPTNO NOT NULL NUMBER(5)
    DNAME NOT NULL VARCHAR2(10)
    LOC NOT NULL VARCHAR2(10)
    Streamadmin user = strmadmin
    Following is detail at target end.
    Target Sid = fin
    Target Schema = stream
    Target Table = dept
    structure of dept table.
    Name Null? Type
    DEPTNO NOT NULL NUMBER(5)
    DNAME NOT NULL VARCHAR2(10)
    LOC NOT NULL VARCHAR2(10)
    TRAN_DATE                    NULL DATE DEFAULT SYSDATE
    I checked on insert/update/delete of rows into dept table at source database, changes are correctly replicated to target table dept.
    I wrote a simple trigger which is as follows on dept table at target database.
    create or replace trigger dept_upd_del
    before delete or update of dname, loc on stream.dept
    for each row
    begin
    dbms_output.put_line('Inside Trigger');
    if updating then
    dbms_output.put_line('Update');
    insert into stream.dept_change values (:old.deptno, 'U', sysdate);
    end if;
    if deleting then
    dbms_output.put_line('Delete');
    insert into stream.dept_change values (:old.deptno, 'D', sysdate);
    end if;
    end;
    /
    I expect this trigger to execute whenever DML changes are propagated from the source and applied to the dept table at the target database. However, I found that the above trigger is not executed at all.
    I was further surprised because, when I update/delete rows in the target table dept directly, the trigger executes correctly.
    Can someone please let me know about this?
    I believed the Streams technology uses INSERT, UPDATE and DELETE statements when changes are applied at the target table, but this doesn't seem to be the case.
    Thanks in Advance.
    Regards,
    Vidyanand

    The trigger at the destination will not fire because it already has at the source site. Read about that in the streams documentation on page 4-25. To change the "fire once" property of the trigger, use the procedure SET_TRIGGER_FIRING_PROPERTY in the DBMS_DDL package.
    Hope this helps.
    Claudine
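    A minimal sketch of that call, using the owner and trigger name from the post:
    BEGIN
      -- fire_once => FALSE makes the trigger fire for apply-process changes too
      DBMS_DDL.SET_TRIGGER_FIRING_PROPERTY(
        trig_owner => 'STREAM',
        trig_name  => 'DEPT_UPD_DEL',
        fire_once  => FALSE);
    END;
    /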

  • Granularity of change rules in CDC (using Oracle Streams) at column level

    Is it possible to implement granularity of change rules in CDC (using Oracle Streams) at column level?
    E.g. table abc with columns a1, b1, c1, where c1 is some kind of auditing column. I want to use CDC on table abc but want to ignore changes to c1.
    Is it possible? My other option is to split table abc into a child table that has only the primary key and c1, but that needs an additional table and joins.
    Thanks
    Shyam

    The requirement can be implemented by a simple trigger.
    You seem to plan to kill a mosquito by using a nuclear bomb.
    Sybrand Bakker
    Senior Oracle DBA
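    For what it's worth, a minimal sketch of that trigger-based alternative; the change-log table is hypothetical, and a1 is assumed to be the key:
    -- fires only for changes to a1/b1; updates touching only the audit column c1 are ignored
    CREATE OR REPLACE TRIGGER abc_change_trg
    AFTER INSERT OR DELETE OR UPDATE OF a1, b1 ON abc
    FOR EACH ROW
    BEGIN
      INSERT INTO abc_change_log (key_val, new_a1, new_b1, change_time)  -- hypothetical log table
      VALUES (NVL(:NEW.a1, :OLD.a1), :NEW.a1, :NEW.b1, SYSDATE);
    END;
    /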
