Oracle Streams - first_scn and start_scn

Hi,
My first_scn is 7669917207423 and start_scn is 7669991182403 in the DBA_CAPTURE view.
Once I start the capture, from which SCN will it begin capturing changes from the archived logs?
Regards,

I am using Oracle Streams on Oracle 10.2.0.4. It's an Oracle downstream capture setup; both the capture and the apply run on the target database.
Regards,
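As I understand the docs, the capture process needs archived logs going back to first_scn in order to mine, but it only enqueues changes from start_scn onward. Both values can be checked with a query like this (a minimal sketch, using the capture name from the setup below):
SELECT capture_name, first_scn, start_scn, required_checkpoint_scn
FROM dba_capture
WHERE capture_name = 'NIG_CAPTURE';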
Below is the setup doc.
1.1 Create the Streams Queue
conn STRMADMIN
BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'NIG_Q_TABLE',
queue_name => 'NIG_Q',
queue_user => 'STRMADMIN');     
END;
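To confirm the queue and queue table were created, a quick check might look like this (a sketch, assuming the STRMADMIN schema):
SELECT owner, name, queue_table
FROM dba_queues
WHERE owner = 'STRMADMIN';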
1.2 Create apply process for the Schema
BEGIN
DBMS_APPLY_ADM.CREATE_APPLY(
queue_name => 'NIG_Q',
apply_name => 'NIG_APPLY',
apply_captured => TRUE);
END;
1.3 Setting up parameters for Apply
exec dbms_apply_adm.set_parameter('NIG_APPLY' ,'disable_on_error','n');
exec dbms_apply_adm.set_parameter('NIG_APPLY' ,'parallelism','6');
exec dbms_apply_adm.set_parameter('NIG_APPLY' ,'_dynamic_stmts','Y');
exec dbms_apply_adm.set_parameter('NIG_APPLY' ,'_hash_table_size','1000000');
exec dbms_apply_adm.set_parameter('NIG_APPLY' ,'_TXN_BUFFER_SIZE',10);
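To verify the apply parameters took effect, something like this can be run (a sketch):
SELECT parameter, value, set_by_user
FROM dba_apply_parameters
WHERE apply_name = 'NIG_APPLY';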
/********** STEP 2.- Downstream capture process *****************/
2.1 Create the downstream capture process
BEGIN
DBMS_CAPTURE_ADM.CREATE_CAPTURE (
queue_name => 'NIG_Q',
capture_name => 'NIG_CAPTURE',
rule_set_name => null,
start_scn => null,
source_database => 'PNID.LOUDCLOUD.COM',
use_database_link => true,
first_scn => null,
logfile_assignment => 'IMPLICIT');
END;
2.2 Setting up parameters for Capture
exec DBMS_CAPTURE_ADM.ALTER_CAPTURE (capture_name=>'NIG_CAPTURE',checkpoint_retention_time=> 2);
exec DBMS_CAPTURE_ADM.SET_PARAMETER ('NIG_CAPTURE','_SGA_SIZE','250');
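Similarly, the capture parameters can be verified with a query along these lines (a sketch):
SELECT parameter, value, set_by_user
FROM dba_capture_parameters
WHERE capture_name = 'NIG_CAPTURE';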
2.3 Add the table level rule for capture
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'NIG.BUILD_VIEWS',
streams_type => 'CAPTURE',
streams_name => 'NIG_CAPTURE',
queue_name => 'STRMADMIN.NIG_Q',
include_dml => true,
include_ddl => true,
source_database => 'PNID.LOUDCLOUD.COM');
END;
/**** Step 3 : Initializing SCN on Downstream database—start from here *************/
import
=================
impdp system DIRECTORY=DBA_WORK_DIRECTORY DUMPFILE=nig_part1_srm_expdp_%U.dmp table_exists_action=replace exclude=grant,statistics,ref_constraint logfile=NIG1.log status=300
/********** STEP 4.- Start the Apply process ********************/
sqlplus STRMADMIN
exec DBMS_APPLY_ADM.START_APPLY(apply_name => 'NIG_APPLY');
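The doc stops at starting the apply; presumably the capture is started afterwards with something like:
exec DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'NIG_CAPTURE');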

Similar Messages

  • Oracle Streams Update conflict handler not working

    Hello,
    I've been working on Oracle Streams, and this time we have to come up with an update conflict handler.
    We are using Oracle 11g in a Solaris 10 environment.
    So far, we have implemented bi-directional Oracle Streams replication and it is working fine.
    Now, when I try to implement the update conflict handler, it executes successfully but does not fulfill the desired functionality.
    Here are the steps I performed:
    Step 1:
    create table jas23.test73 (first_name varchar2(20), last_name varchar2(20), salary number(7));
    ALTER TABLE jas23.test73 ADD (time TIMESTAMP WITH TIME ZONE);
    insert into jas23.test73 values ('gugg','qwer',2000,SYSTIMESTAMP);
    insert into jas23.test73 values ('papa','sdds',2050,SYSTIMESTAMP);
    insert into jas23.test73 values ('jaja','xzxc',2075,SYSTIMESTAMP);
    insert into jas23.test73 values ('kaka','cvdxx',2095,SYSTIMESTAMP);
    insert into jas23.test73 values ('mama','rfgy',1900,SYSTIMESTAMP);
    insert into jas23.test73 values ('tata','jaja',1950,SYSTIMESTAMP);
    commit;
    Step-2:
    conn to strmadmin/strmadmin to server1:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    Step-3
    SQL>
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'first_name';
    cols(2) := 'last_name';
    cols(3) := 'salary';
    cols(4) := 'time';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name => 'jas23.test73',
    method_name => 'MAXIMUM',
    resolution_column => 'time',
    column_list => cols);
    END;
    Step-4
    conn to strmadmin/strmadmin to server2
    SQL>
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'first_name';
    cols(2) := 'last_name';
    cols(3) := 'salary';
    cols(4) := 'time';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name => 'jas23.test73',
    method_name => 'MAXIMUM',
    resolution_column => 'time',
    column_list => cols);
    END;
    Step-5
    And now, if I try to update the value of salary, it is not getting handled by the update conflict handler.
    update jas23.test73 set salary = 1500,time=SYSTIMESTAMP where first_name='papa'; --server1
    update jas23.test73 set salary = 2500,time=SYSTIMESTAMP where first_name='papa'; --server2
    commit; --server1
    commit; --server2
    Note: Both the servers are in different time zones (I hope that won't be a problem).
    Now, after performing all these steps, the data is not the same at both sites.
    Error(DBA_APPLY_ERROR) -
    ORA-26787: The row with key ("FIRST_NAME", "LAST_NAME", "SALARY", "TIME") = (papa, sdds, 2000, 23-DEC-10 05.46.18.994233000 PM +00:00) does not exist in table JAS23.TEST73
    ORA-01403: no data found
    Please help.
    Thanks.
    Edited by: gags on Dec 23, 2010 12:30 PM

    Hi,
    When I tried to do it on Server 2:
    SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
    it throws me an error,
    Error -
    ERROR at line 1:
    ORA-32588: supplemental logging attribute all column exists
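    To see which supplemental log groups already exist on the table before running the ALTER, a check like this may help (a sketch):
    SELECT log_group_name, log_group_type
    FROM dba_log_groups
    WHERE owner = 'JAS23' AND table_name = 'TEST73';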

  • Help with Oracle Streams. How to uniquely identify LCRs in queue?

    We are using Streams for data replication in our shop.
    When an error occurs in our processing procedures, the LCR is moved to the error queue.
    The problem we are facing is that we don't know how to uniquely identify LCRs in that queue, so we can run them again once we think the error is corrected.
    LCRs contain SCN, but as I understand it, the SCN is not unique.
    What is the easy way to keep track of LCRs? Any information is helpful.
    Thanks

    Hi,
    When you correct the data, you will have to execute the failed transactions in order.
    To see what information the apply process has tried to apply, you have to print that LCR. Depending on the size (MESSAGE_COUNT) of the transaction that has failed, it could be interesting to print the whole transaction or a single LCR.
    To do this print you can make use of procedures print_transaction, print_errors, print_lcr and print_any documented on :
    Oracle Streams Concepts and Administration
      Chapter - Monitoring Streams Apply Processes
         Section - Displaying Detailed Information About Apply Errors
    These procedures are also available through Note 405541.1 - Procedure to Print LCRs
    To print the whole transaction you can use the print_transaction procedure, to print the errors on the error queue you can use print_errors, and to print a single LCR you can do it as follows:
    SET SERVEROUTPUT ON;
    DECLARE
       lcr SYS.AnyData;
    BEGIN
        lcr := DBMS_APPLY_ADM.GET_ERROR_MESSAGE
                    (<MESSAGE_NUMBER>, <LOCAL_TRANSACTION_ID>);
        print_lcr(lcr);
    END;
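    To find the <MESSAGE_NUMBER> and <LOCAL_TRANSACTION_ID> values to plug in, DBA_APPLY_ERROR can be queried first (a sketch):
    SELECT apply_name, local_transaction_id, message_number, message_count, error_message
    FROM dba_apply_error;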
    Thanks

  • Oracle Streams b/w MS-Access 2007 and Oracle 10g.

    Can we set up Oracle Streams between MS-Access 2007 and Oracle 10g, with MS-Access as the source and Oracle 10g as the destination database? If so, can anyone please give me a little heads-up with supporting docs or any source of info?

    Help Help....!!!

  • Performance tuning and Periodic Maintenance in Oracle Streams Environment

    We have set up bi-directional Oracle Streams replication between two remote sites, each a 2-node RAC database.
    This is our environment summary:
    Database: Oracle 10g R2, version 10.2.0.4.0
    OS: Solaris[tm] OE (64-bit)
    Currently Oracle Streams is working successfully and I monitor it daily.
    As mentioned in the Master Note for Streams Performance Recommendations [ID 335516.1]:
    Purging Streams Checkpoints
    10.2: Alter the capture parameter CHECKPOINT_RETENTION_TIME from the default retention of 60 days to a realistic value for your database.
    A typical setting might be to retain 7 days' worth of checkpoint metadata:
    exec dbms_capture_adm.alter_capture(capture_name=>'your_capture', checkpoint_retention_time=> 7);
    My query:
    ===> Currently in my environment CHECKPOINT_RETENTION_TIME is at the default of 60 days.
    I want to change it to checkpoint_retention_time => 7, so what should I take care of before executing the above command in my live Streams environment?
    Edited by: user8171787 on Apr 13, 2011 11:00 PM

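    Before changing it, the current retention and the lowest SCN the capture still requires can be checked with something like this (a sketch):
    SELECT capture_name, checkpoint_retention_time, required_checkpoint_scn
    FROM dba_capture;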

  • Oracle Streams 'ORA-25215: user_data type and queue type do not match'

    I am trying replication between two databases (10.2.0.3) using Oracle Streams.
    I have followed the instructions at http://www.oracle.com/technology/oramag/oracle/04-nov/o64streams.html
    The main steps are:
    1. Set up ARCHIVELOG mode.
    2. Set up the Streams administrator.
    3. Set initialization parameters.
    4. Create a database link.
    5. Set up source and destination queues.
    6. Set up supplemental logging at the source database.
    7. Configure the capture process at the source database.
    8. Configure the propagation process.
    9. Create the destination table.
    10. Grant object privileges.
    11. Set the instantiation system change number (SCN).
    12. Configure the apply process at the destination database.
    13. Start the capture and apply processes.
    For step 5, I have used 'set_up_queue' in the 'dbms_streams_adm' package. This procedure creates a queue table and an associated queue.
    The problem is that, in the propagation process, I get this error:
    'ORA-25215: user_data type and queue type do not match'
    I have checked it, and the queue table and its associated queue are created as shown:
    sys.dbms_aqadm.create_queue_table (
    queue_table => 'CAPTURE_SFQTAB'
    , queue_payload_type => 'SYS.ANYDATA'
    , sort_list => ''
    , COMMENT => ''
    , multiple_consumers => TRUE
    , message_grouping => DBMS_AQADM.TRANSACTIONAL
    , storage_clause => 'TABLESPACE STREAMSTS LOGGING'
    , compatible => '8.1'
    , primary_instance => '0'
    , secondary_instance => '0');
    sys.dbms_aqadm.create_queue(
    queue_name => 'CAPTURE_SFQ'
    , queue_table => 'CAPTURE_SFQTAB'
    , queue_type => sys.dbms_aqadm.NORMAL_QUEUE
    , max_retries => '5'
    , retry_delay => '0'
    , retention_time => '0'
    , COMMENT => '');
    The capture process is 'capturing changes' but it seems that these changes cannot be enqueued into the capture queue because the data type is not correct.
    As far as I know, 'sys.anydata' payload type and 'normal_queue' type are the right parameters to get a successful configuration.
    I would be really grateful for any idea!

    Hi
    You need to run a VERIFY to make sure that the queues are compatible. At least on my 10.2.0.3/4 I need to do it.
    DECLARE
    rc BINARY_INTEGER;
    BEGIN
    DBMS_AQADM.VERIFY_QUEUE_TYPES(
    src_queue_name => 'np_out_onlinex',
    dest_queue_name => 'np_out_onlinex',
    destination => 'scnp.pfa.dk',
    rc => rc,
    transformation => 'TransformDim2JMS_001x');
    DBMS_OUTPUT.PUT_LINE('Compatible: '||rc);
    END;
    If you don't have transformations and/or a remote destination, then omit those parameters.
    Check the table SYS.AQ$_MESSAGE_TYPES; there you can see what has been verified.
    regards
    Mette

  • Long transactions and Oracle Streams

    In an Oracle Streams replication,
    let's say you have a long-running transaction that started at SCN 10;
    in the meantime, 100 new transactions have occurred and committed, so the current SCN is now 110.
    The transaction that started at SCN 10 still has not committed. Will all of the subsequent transactions wait for this long-running transaction to commit before they can be applied to the target database?
    Thanks!
    Patrick
    www.renaps.com

    If the transactions executed in other sessions have committed, then you should see those changes in your target. Also, you should see a message in the alert log regarding the long-running/large transaction.
    But the order of commits on the apply side is controlled by a parameter called COMMIT_SERIALIZATION. If set to the default, which is FULL, the commit order is preserved. If set to NONE, then commit order is not preserved. You can get the COMMIT_SCN of a transaction on the target using one of the LCR member procedures.
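    For example, to set the default behavior explicitly, a call along these lines should work (a sketch, with a placeholder apply name):
    exec DBMS_APPLY_ADM.SET_PARAMETER('YOUR_APPLY', 'commit_serialization', 'FULL');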

  • Oracle Streams and Oracle Apps  11i

    Hi,
    I am looking for an Oracle solution to build a reporting instance off an E-Business Suite 11i database and offload Discoverer reporting to the reporting instance. I have tried a Data Guard logical standby, but there were too many issues and I cannot use it. I am wondering if anyone has tried using Oracle Streams to build a reporting instance from an Oracle Apps instance?
    Thanks

    Hi,
    Is Streams supported in an E-Business Suite database? We once had a scenario where some parameters for Streams conflicted with parameters required for E-Business Suite performance improvement. I could not find any documents from Oracle verifying whether Streams on an E-Business Suite database is a supported configuration or not.
    So if you have any such document, could you send the link to me?
    Also please provide the link to your white paper/presentation.
    Regards,
    Sujoy

  • Setup between Oracle streams and MQ

    Hi All,
    I am trying to create a setup between Oracle Streams and Message Queuing (MQ). I have already installed the MQ client on my machine and the messaging gateway is working fine. Following the setup docs, I created a user with all the grants they specify and created the queue table, queue, DML handler, and table rules, and started the queue. After that we set up the capture and apply processes for that user (in other words, the entire setup), but our data is not propagating messages to MQ.
    So if anyone has specific setup code, please provide it; any suggestions are welcome.
    Thanks & regards,
    Sanjeev

    Hi Sanjeev,
    There is a special forum dedicated to Oracle Streams: Streams
    You may want to try there.
    Extra tip: post the code you used. That way it is easier for people to see what you might have done wrong.
    Regards,
    Rob.

  • Difference between Oracle Streams and Oracle Replication

    Hi all,
    Can anyone tell me the difference between Oracle Streams and Oracle Replication?
    Regards

    Refer to this thread: Difference Between Oracle Replication & Oracle Streams.
    Oracle Replication is designed to replicate exact copies of data sets to various databases. Oracle Streams is designed to propagate individual data changes to various databases. Thus, Replication is probably easier if the end goal is to maintain identical copies of data, where Streams is easier if the end goal is to allow different databases to react differently to data changes.
    Oracle Replication is a significantly more mature product; it is quite usable with older databases. Oracle Streams is a much newer technology and is only usable among 9i databases. Most competent Oracle developers and DBAs are familiar with Oracle Replication, while many fewer have any real experience with Streams.
    The Streams architecture strikes me as a lot more flexible than Oracle Replication's. This leads me to suspect that Oracle will be pushing Streams over Replication in subsequent releases, so I would expect new features in Streams, like DDL changes, that aren't in Oracle Replication. Realistically, though, I don't expect any serious movement away from Replication for at least a few releases, so I wouldn't tend to be overly concerned on this front.
    answered by Justin
    Distributed Database Consulting, Inc.
    reference from forum thread:
    Difference Between Oracle Replication & Oracle Streams.

  • Oracle Streams and Dataguard

    Does anyone know how to configure Oracle Streams and Data Guard to work together? I've set up an environment which successfully captures and applies records in a Streams environment using
    log_archive_config='SEND, RECEIVE, NODG_CONFIG'
    As soon as I try to introduce the Streams setup into a database which already has Data Guard in place, it gets convoluted: setting DG_CONFIG requires you to then set db_unique_name, and it seems that the Streams log mining will not work without being fully incorporated into the Data Guard configuration.
    Has anyone set up streams in a dataguard environment?

    Let's see what's missing from your request:
    1. Oracle version number?
    2. Physical or logical Data Guard?
    3. Which Data Guard mode? Synch or Asynch?
    4. Which Data Guard protection level?
    5. ARCH or LGWR?
    6. Which streams mode? Synch or Asynch?
    7. Hotlog or Autolog?
    8. Streams on the production server or the standby server?
    I'm all out of guesses tonight. Please provide enough information for someone to help you.

  • Oracle Streams and CLOB column

    Hi there,
    We are using "Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit". My question is "Does Oracle Streams captures, propagates (source capture method) and applies CLOB column changes?"
    If yes, is this default behavior? Can we tell Streams to exclude the CLOB column from the whole (capture-propage-apply) process?
    Thanks in advance!

    You can exclude columns via a rule (dbms_streams_adm.delete_column).
    CLOBs are captured.
    http://download.oracle.com/docs/cd/E11882_01/server.112/e17069/strms_capture.htm#i1006263
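    A rough sketch of such a rule-based change (note that DELETE_COLUMN was introduced in 11g; the object and capture names here are placeholders, and the exact signature should be verified for your release):
    BEGIN
    DBMS_STREAMS_ADM.DELETE_COLUMN(
    object_name => 'SCOTT.DOC_TABLE', -- hypothetical table
    column_name => 'DOC_CLOB', -- CLOB column to strip from captured LCRs
    streams_type => 'CAPTURE',
    streams_name => 'YOUR_CAPTURE'); -- placeholder capture name
    END;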

  • Oracle Streams and Database Encryption

    I am looking for an encryption method for an OLTP database.
    Oracle Streams will be used to synchronize the data between the source and remote databases.
    The business process wants all remote databases to be fully encrypted.
    What can I use?
    - TDE is not supported with Oracle Streams,
    - I can't use DBMS_CRYPTO since it needs big database modifications.
    Are there any other Oracle-supported techniques, or am I going to need to consider hardware encryption (like hard disk encryption) or OS-level encryption?
    Thanks,

    TDE is not supported with LogMiner-based technologies such as Streams and Data Guard (logical standby). Therefore you cannot use TDE to replicate encrypted content; you will get an "Unsupported data type" exception.
    But you might replicate decrypted data into an encrypted destination. This means your destination can have TDE-encrypted columns.
    Unofficially, Oracle will support TDE with LogMiner-based technologies in the next database version, 11.
    I am waiting for this.
    Regards,
    Mike

  • Oracle Streams - Downstream

    I am now getting the error below on the downstream database. This constraint doesn't exist on the downstream side because I am excluding constraints when importing.
    ORA-00001: unique constraint (PCAT_NT.PK01_DCS_CAT_CATINFO) violated
    Is it mandatory to include constraints in an Oracle Streams setup?
    My steps are as below:
    1) Create the downstream capture process
    BEGIN
    DBMS_CAPTURE_ADM.CREATE_CAPTURE (
    queue_name => 'STRMPCAT_QUEUE',
    capture_name => 'DOWNSTRMPCAT_CAPTURE',
    rule_set_name => null,
    start_scn => null,
    source_database => 'PCAT',
    use_database_link => true,
    first_scn => null,
    logfile_assignment => 'IMPLICIT');
    END;
    2) Add schema rule on Downstream database
    BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name => 'PCAT_NT',
    streams_type => 'CAPTURE',
    streams_name => 'DOWNSTRMPCAT_CAPTURE',
    queue_name => 'STRMPCAT.STRMPCAT_QUEUE',
    include_dml => true,
    include_ddl => false,
    source_database => 'PCAT');
    END;
    3) Initializing SCN on Downstream database
    --on source
    expdp system SCHEMAS=PCAT_NT DIRECTORY=DATA_PUMP_DIR DUMPFILE=PCAT_expdp.dmp exclude=user,grant,statistics,synonyms,procedure,constraint FLASHBACK_SCN=7620529799615
    --on target
    impdp system SCHEMAS=PCAT_NT DIRECTORY=DATA_PUMP_DIR DUMPFILE=PCAT_expdp.dmp
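    Since the export used FLASHBACK_SCN, the schema instantiation SCN can also be set explicitly on the downstream side (a sketch reusing the SCN from the expdp command; verify the parameter names for your release):
    BEGIN
    DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
    source_schema_name => 'PCAT_NT',
    source_database_name => 'PCAT',
    instantiation_scn => 7620529799615,
    recursive => TRUE); -- assumption: also sets the table-level instantiation SCNs
    END;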
    Regards

    Are you stating that the constraint PCAT_NT.PK01_DCS_CAT_CATINFO does not exist on the destination database and that you are getting a constraint violation error on the destination database? That strikes me as exceptionally unlikely.
    How do you know that the constraint was not created on the destination database? I would tend to suspect that you created the constraint potentially unintentionally.
    I would be interested in why you're getting duplicate rows as well. If there are no duplicate rows in the source database, duplicates in the destination would imply that you're doing something wrong in the setup (i.e. the SCN in your export is incorrect) or that you have something in the Streams config which is causing duplicates (i.e. a custom DML handler).
    Justin

  • Instantiation and start_scn of capture process

    Hi,
    We are working on Streams replication, and I have a doubt about the behavior of Streams.
    During setup, we have to instantiate the database objects whose data will be transferred. This instantiation creates the objects at the destination DB and sets the SCN beyond which changes from the source DB will be accepted. When the capture process is created, it is assigned a specific start_scn value; it starts capturing changes beyond that value and puts them in the capture queue.
    If the capture process aborts and we have no alternative other than re-creating it, what happens to the data created while the capture process is being dropped and re-created? Do I need to physically extract that data and import it at the destination DB? Since the objects at the destination DB are already instantiated, why isn't there some mechanism by which the new capture process starts capturing changes from the lowest instantiation SCN among all instantiated tables? Is there any workaround other than exp/imp when the source and destination schemas are out of sync because of a capture process failure? We did face this problem, and could find only the one workaround of exp/imp of the data.
    Thanks,
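    For reference, the per-object instantiation SCNs mentioned above can be inspected on the destination with a query like this (a sketch):
    SELECT source_object_owner, source_object_name, instantiation_scn
    FROM dba_apply_instantiated_objects;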

    Thanks, Mr. SK.
    The following query gives some kind of confirmation:
    source DB
    SELECT SID, SERIAL#, CAPTURE#,CAPTURE_MESSAGE_NUMBER, ENQUEUE_MESSAGE_NUMBER, APPLY_NAME, APPLY_MESSAGES_SENT FROM V$STREAMS_CAPTURE
    target DB
    SELECT SID, SERIAL#, APPLY#, STATE,DEQUEUED_MESSAGE_NUMBER, OLDEST_SCN_NUM FROM V$STREAMS_APPLY_READER
    One more question:
    Is there a maximum limit on the number of DBs involved in Oracle Streams?
    Thanks,
    SM. Kumar
