Streams 9i to 10g

Can I use Oracle 9.2.0.1.0 as the source for implementing Streams across 9i-->10g, or do I need 9.2.0.6 or higher?

The original question was about replication to 10g (R1 or R2 was not specified).
For replication to 10gR2 you need to use 9.2.0.6, 9.2.0.7 or 9.2.0.8, depending on the platform, and you need to apply the patch for this bug. You can find the backport for this patch on Metalink.
I think you can use 9.2.0.5 and up for 10gR1 replication. There is a patch for 9.2.0.5 and Streams (I do not remember the number); you have to apply it to 9.2.0.5.
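Not part of the original thread, but before deciding whether the 9i source needs patching it is worth confirming its exact release and patch level from the data dictionary; a minimal check, for example:
-- Exact release/patch level of the source database
SELECT banner FROM v$version;
-- Component-level versions (catalog, catproc, etc.)
SELECT comp_name, version, status FROM dba_registry;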

Similar Messages

  • Problem in Update using STREAMS in Ora 10g EE

    Hi
    I am using Oracle 10g Enterprise Edition for replicating data after a failover occurs in the DB.
    I followed the steps in the URL below:
    http://blogs.ittoolbox.com/oracle/guide/archives/oracle-streams-configuration-change-data-capture-13501
    The replication is achieved using Streams.
    INSERT and DELETE operations work fine without any issue, but the UPDATE operation is not replicated immediately.
    I did an UPDATE and a COMMIT: no replication. I did another UPDATE and COMMIT, and only then was the Streams table updated.
    Does every UPDATE need to be done twice? Why? How can I solve this?
    Or is there any other URL that gives working examples of Streams in 10g?
    Any help regarding this is highly appreciated.

    Thanks for your reply.
    There is no specific reason to use Streams.
    I have to replicate the data once my database is started back up (after downtime).
    We already discussed this in the topic:
    Data Replication in after downtime
    Instead of using triggers or procedures, I have to use one of the technologies available in Oracle 10g for data replication.
    Can you give me more info, such as the advantages of using Data Guard or other technologies in Oracle 10g EE?
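    Not from the original exchange, but when an UPDATE appears to need two commits before it replicates, a reasonable first check is whether the apply side is erroring or lagging rather than dropping changes; a rough sketch against the standard 10g Streams views (column lists may vary slightly by release):
    -- Any apply errors queued at the destination?
    SELECT apply_name, source_transaction_id, error_message FROM dba_apply_error;
    -- Is capture keeping up with the redo stream?
    SELECT capture_name, state, total_messages_captured FROM v$streams_capture;
    -- Are transactions actually being assigned and applied?
    SELECT apply_name, state, total_assigned, total_applied FROM v$streams_apply_coordinator;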

  • Applying Streams for oracle 10g 10.2

    Dear all, I'm new to Streams but I wish to deploy it in our environment. I have one 10g 10.2 instance (source), small in size (9.5 GB), which I want to replicate using Streams to another standby instance (destination) of the same version; both run on Windows 2003 Server. I've read some documents on the subject but unfortunately couldn't find a comprehensive, step-by-step example from scratch of how to do it for the whole DB rather than one schema. I would highly appreciate it if you could provide me with such a document.

    Streams is NOT Data Guard and cannot be used to manage a standby database. In other words, it is the wrong tool for the job.
    To use Data Guard you MUST have Enterprise Edition, and the directions for creating a physical standby, which is what you appear to want, are well documented in numerous locations including Metalink.
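    If the goal really is whole-database Streams replication (rather than a standby), 10gR2 does provide a single procedure, DBMS_STREAMS_ADM.MAINTAIN_GLOBAL, that generates the entire configuration. The following is only a heavily abbreviated sketch: the directory objects, global names and file names are made-up placeholders, and the parameter list should be checked against the 10.2 PL/SQL Packages reference before relying on it.
    -- Sketch: generate (not run) a script that sets up whole-database replication
    BEGIN
      DBMS_STREAMS_ADM.MAINTAIN_GLOBAL(
        source_directory_object      => 'SOURCE_DIR',     -- placeholder directory object
        destination_directory_object => 'DEST_DIR',       -- placeholder directory object
        source_database              => 'SRCDB',          -- placeholder global name
        destination_database         => 'DSTDB',          -- placeholder global name
        dump_file_name               => 'whole_db.dmp',
        perform_actions              => FALSE,            -- just write the script
        script_name                  => 'maintain_global.sql',
        script_directory_object      => 'SOURCE_DIR',
        bi_directional               => FALSE);
    END;
    /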

  • How can I view the Streams processes through 10g OEM

    I have successfully configured OEM on Oracle 10.1.0.3
    My question is:
    How can I view the Streams processes and queues so that I can monitor their functionality?
    In the 9i Java Console OEM I was able to monitor Streams using the <Distributed> tab, but I can't find any such option in 10g. Please help.
    Thanks in Advance.
    Regards,
    Raj

    There's a message on the Database Administration page of EM10g.
    "TIP      Use the Enterprise Manager 10g Java Console to manage Streams, Advanced Replication, Advanced Queues, XML Database, Spatial and Workspace."
    At present there is no way to do this through the EM10g web interface, and I understand the Java console which comes with 10g doesn't contain any 10g new features.
    So I am still using OEM 9.2 in addition to EM10g.
    Thanks,
    Alan...
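    A note beyond the original replies: even without the Java console, the Streams components can be monitored directly with SQL, which is often enough for day-to-day checks; a rough sketch (exact columns vary a little between 10.1 and 10.2):
    -- Configured capture, propagation and apply processes
    SELECT capture_name, status FROM dba_capture;
    SELECT propagation_name, destination_dblink FROM dba_propagation;
    SELECT apply_name, status FROM dba_apply;
    -- Runtime state of capture and apply
    SELECT capture_name, state FROM v$streams_capture;
    SELECT apply_name, state FROM v$streams_apply_coordinator;
    -- Messages waiting (or spilled) in the buffered queues
    SELECT queue_schema, queue_name, num_msgs, spill_msgs FROM v$buffered_queues;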

  • Need help, implementing streams between oracle 10g databases

    Hello all.
    Please help me and give me instructions if anybody has implemented Streams between databases (Oracle 10g).
    I have implemented Streams between schema tables on 10g and the result was a success.
    First, I want to know a few things, such as:
    1) Is it possible to set up Streams with conditions? I know I can capture only DML or only DDL, but can I capture DML without the DELETE operation, i.e. just INSERT and UPDATE operations on the table?
    2) After the changes are applied on the target database, can I delete the records that were copied (replicated) from the source database?
    I have 2 databases and one of them is for archive purposes (I want to use it as the target database for Streams). The other one is the PROD database, on which applications perform DML operations:
    I) a record is inserted with null status
    II) processing (status 1)
    III) on success (status 2), on failure (status 3)
    For now, I have a cron script (on a Linux host) containing a PL/SQL anonymous block that runs a couple of times during the day. This script does the archiving.
    My task is to do this via Oracle Streams.
    Thank you in advance.

    For conditions on the type of operation (e.g. INSERT only), check the documentation for apply handlers; you can associate whatever code and conditions you want. You can also choose to work with subset rules, but there are some restrictions, such as no LOBs:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28321/strms_rules.htm#insertedID7
    For a complete list of the restrictions:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28321/ap_restrictions.htm#i1012965
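    Beyond the links above: one approach commonly described for keeping DELETEs out of a table-level rule is the and_condition parameter of DBMS_STREAMS_ADM.ADD_TABLE_RULES, where :lcr stands for the captured change. Treat the following as an illustrative sketch to verify against your release's documentation (the object and process names are just placeholders in the style of this thread):
    -- Capture rule that excludes DELETE LCRs, so only INSERTs and UPDATEs are captured
    BEGIN
      DBMS_STREAMS_ADM.ADD_TABLE_RULES(
        table_name    => 'SCOTT.DEPT',
        streams_type  => 'CAPTURE',
        streams_name  => 'STRMADMIN_CAPTURE',
        queue_name    => 'STRMADMIN.STREAMS_QUEUE',
        include_dml   => TRUE,
        include_ddl   => FALSE,
        and_condition => ':lcr.get_command_type() != ''DELETE''');
    END;
    /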

  • ORA-04031: capture aborted in Streams in Oracle 10gR2 on Windows 2003

    Hi Experts,
    Error Message ORA-04031: unable to allocate 116 bytes of shared memory ("streams pool","unknown object","streams pool","kolcalm coll")
    I got this message when the capture process ABORTED. My sga_target and sga_max_size are 1880M, streams_pool_size is 880M, and shared_pool_reserved_size is 14M. Do I need to increase their sizes?
    The Oracle documentation for this error message says:
    Action: If the shared pool is out of memory, either use the dbms_shared_pool package to pin large packages, reduce your use of shared memory, or increase the amount of available shared memory by increasing the value of the INIT.ORA parameters "shared_pool_reserved_size" and "shared_pool_size". If the large pool is out of memory, increase the INIT.ORA parameter "large_pool_size".
    Key point: we use automatic SGA management (SGA_TARGET), which covers the shared pool and large pool. Do we need to increase the shared pool size?
    Or do we need to take other steps to handle this?
    Thanks for the help!
    regards
    Jim
    Edited by: user589812 on Nov 24, 2008 9:02 AM

    Sorry, I use Oracle 10gR2 on Windows 2003.
    Here are the results of your SQL:
    STREAMS_POOL_SIZE_FOR_ESTIMATE STREAMS_POOL_SIZE_FACTOR ESTD_SPILL_COUNT ESTD_SPILL_TIME ESTD_UNSPILL_COUNT ESTD_UNSPILL_TIME
    272 .1828 66034094 49261 72443892 114732
    424 .2849 56899900 42916 69761439 110546
    576 .3871 29490067 17904 56631824 87657
    728 .4892 23989724 13951 55330937 85704
    880 .5914 20168762 11552 54131928 83935
    1032 .6935 16931737 9668 52998667 82269
    1184 .7957 14147340 8134 51948431 80724
    1336 .8978 11706951 6797 51052796 79408
    1488 1 9679293 5687 50314492 78322
    1640 1.1022 8051240 4800 49768913 77520
    1792 1.2043 6747253 4087 49361033 76921
    1944 1.3065 5699093 3515 49064019 76483
    2096 1.4086 4829584 3040 48850903 76171
    2248 1.5108 4089840 2638 48682591 75923
    2400 1.6129 3425461 2278 48542520 75717
    2552 1.7151 2815554 1949 48417853 75535
    2704 1.8172 2267859 1648 48297317 75357
    2856 1.9194 1865779 1427 48178756 75183
    3008 2.0215 1678078 1315 48063018 75014
    3160 2.1237 1637930 1277 47943171 74838
    I am waiting for your help!
    Jim
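    A note beyond the thread: those figures look like V$STREAMS_POOL_ADVICE output, and with SGA_TARGET set you can still give the Streams pool an explicit floor, which is a usual remedy for spill-driven ORA-04031 in the streams pool; a short sketch (the size below is a placeholder, not a recommendation):
    -- Review the advisor's spill/unspill estimates
    SELECT streams_pool_size_for_estimate, streams_pool_size_factor,
           estd_spill_count, estd_spill_time
    FROM   v$streams_pool_advice
    ORDER  BY streams_pool_size_for_estimate;
    -- With SGA_TARGET in use, a manually set STREAMS_POOL_SIZE acts as a minimum
    ALTER SYSTEM SET streams_pool_size = 1488M SCOPE = BOTH;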

  • Oracle Stream Replication 10G

    Hi.
    We are planning to use 10g Streams replication for our production environment, but we are concerned about a few things before we move ahead with this option. Our prior experience with 9i Streams replication was not that good.
    9i doesn't provide the functionality for some advanced maintenance tasks that can be required in a production environment. We are hoping 10g will be a better option for Streams.
    However, the following are a few questions we have regarding this product. We would appreciate your input on Oracle 10g Streams replication.
    --> How effective is this tool compared to the prior version, 9i?
    --> How stable is Streams in 10g? Are there still any known critical issues with Streams replication in 10g, e.g. bugs or patches under review?
    --> For replication of a production environment, would it be an ideal choice compared to third-party vendors like GoldenGate or SharePlex?
    Your response would be greatly appreciated.
    Thanks
    Ratheesh

    Hi Ratheesh,
    --> How effective is this tool compared to the prior version, 9i?
    Very good. I use Streams in many production environments, even for bidirectional replication. Streams is very stable in 10g.
    --> For replication of a production environment, would it be an ideal choice compared to third-party vendors like GoldenGate or SharePlex?
    It depends on several factors. It's better than GG and SharePlex by far (IMHO).
    Madhu Tumma has some great Streams notes on this in his book "Oracle Streams":
    http://www.rampant-books.com/book_2004_2_streams.htm
    Hope this helps. . . .
    Donald K. Burleson
    Oracle Press author

  • Sample scripts for Streams setup: source 9i --> destination 10g

    I need to set up Streams from 9i to 10g (both on Windows).
    I have successfully set it up for 9i-->9i (using OEM, with the sample provided by Oracle) and for
    10g-->10g (http://www.oracle.com/technology/obe/obe10gdb/integrate/streams/streams.htm#t6, which uses scripts).
    I need to implement Streams from 9i to 10g. The problem is:
    the packages used in the 10g demo are not available in 9i.
    Is there a sample script to implement Streams across 9i-->10g?

    Thanks Arvind, that would be really great. I am trying to build a demo, so I am running the demo scripts on the DEPT table; I have been trying for a month now. I moved my 9.2.0.1.0 source up to 9.2.0.7 and then applied patchset 3 for 9.2.0.7 to fix the bug, as I learned there was a bug with Streams between 9i and 10g:
    bug no. 4285404 - PROPAGATION FROM 9.2 AND 10.1 TO 10.2
    Note: I executed the same script using step 4.2.2 and not 4.2.1 (it is optional), because when I tried export then import and then tried to delete the supplemental log group from the target, it said "trying to drop non-existent group".
    Also, when I query the capture process it shows LCRs getting queued, and propagation shows that data is propagated from the source; apply has no errors but shows 0 for transactions assigned as well as applied.
    It looks like the destination queue is not getting populated, even though propagation at the source is successful.
    Please find below:
    1. scripts
    2. init parameters of 9i (source)
    3. init parameters of 10g (target)
    SCRIPT:
    2.1 Create Streams Administrator :
    connect SYS/password as SYSDBA
    create user STRMADMIN identified by STRMADMIN;
    2.2 Grant the necessary privileges to the Streams Administrator :
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE to STRMADMIN;
    GRANT SELECT ANY DICTIONARY TO STRMADMIN;
    GRANT EXECUTE ON DBMS_AQ TO STRMADMIN;
    GRANT EXECUTE ON DBMS_AQADM TO STRMADMIN;
    GRANT EXECUTE ON DBMS_FLASHBACK TO STRMADMIN;
    GRANT EXECUTE ON DBMS_STREAMS_ADM TO STRMADMIN;
    GRANT EXECUTE ON DBMS_CAPTURE_ADM TO STRMADMIN;
    GRANT EXECUTE ON DBMS_APPLY_ADM TO STRMADMIN;
    GRANT EXECUTE ON DBMS_RULE_ADM TO STRMADMIN;
    GRANT EXECUTE ON DBMS_PROPAGATION_ADM TO STRMADMIN;
    BEGIN
    DBMS_AQADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => 'ENQUEUE_ANY',
    grantee => 'STRMADMIN',
    admin_option => FALSE);
    END;
    BEGIN
    DBMS_AQADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => 'DEQUEUE_ANY',
    grantee => 'STRMADMIN',
    admin_option => FALSE);
    END;
    BEGIN
    DBMS_AQADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => 'MANAGE_ANY',
    grantee => 'STRMADMIN',
    admin_option => TRUE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_EVALUATION_CONTEXT_OBJ,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_SET_OBJ,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_RULE_OBJ,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_ANY_RULE_SET,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.ALTER_ANY_RULE_SET,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.EXECUTE_ANY_RULE_SET,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.CREATE_ANY_RULE,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.ALTER_ANY_RULE,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.EXECUTE_ANY_RULE,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    END;
    BEGIN
    DBMS_RULE_ADM.GRANT_SYSTEM_PRIVILEGE(
    privilege => DBMS_RULE_ADM.EXECUTE_ANY_EVALUATION_CONTEXT,
    grantee => 'STRMADMIN',
    grant_option => TRUE);
    END;
    2.3 Create streams queue :
    connect STRMADMIN/STRMADMIN
    BEGIN
    DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'STREAMS_QUEUE_TABLE',
    queue_name => 'STREAMS_QUEUE',
    queue_user => 'STRMADMIN');
    END;
    2.4 Add apply rules for the table at the destination database :
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'SCOTT.DEPT',
    streams_type => 'APPLY',
    streams_name => 'STRMADMIN_APPLY',
    queue_name => 'STRMADMIN.STREAMS_QUEUE',
    include_dml => true,
    include_ddl => true,
    source_database => 'str1');
    END;
    2.5 Specify an 'APPLY USER' at the destination database:
    This is the user who would apply all DML statements and DDL statements.
    The user specified in the APPLY_USER parameter must have the necessary
    privileges to perform DML and DDL changes on the apply objects.
    BEGIN
    DBMS_APPLY_ADM.ALTER_APPLY(
    apply_name => 'STRMADMIN_APPLY',
    apply_user => 'SCOTT');
    END;
    2.6 If you do not wish the apply process to abort for every error that it
    encounters, you can set the parameter below.
    The default value is 'Y', which means the apply process will abort on
    any error.
    When set to 'N', the apply process will not abort for any error that it
    encounters; the error details will be logged in DBA_APPLY_ERROR.
    BEGIN
    DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'STRMADMIN_APPLY',
    parameter => 'DISABLE_ON_ERROR',
    value => 'N' );
    END;
    2.7 Start the Apply process :
    BEGIN
    DBMS_APPLY_ADM.START_APPLY(apply_name => 'STRMADMIN_APPLY');
    END;
    Section 3
    Steps to be carried out at the Source Database (V920.IDC.ORACLE.COM)
    3.1 Move LogMiner tables from SYSTEM tablespace:
    By default, all LogMiner tables are created in the SYSTEM tablespace.
    It is a good practice to create an alternate tablespace for the LogMiner
    tables.
    CREATE TABLESPACE LOGMNRTS DATAFILE 'logmnrts.dbf' SIZE 25M AUTOEXTEND ON
    MAXSIZE UNLIMITED;
    BEGIN
    DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
    END;
    3.2 Turn on supplemental logging for DEPT table :
    connect SYS/password as SYSDBA
    ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP dept_pk
    (deptno) ALWAYS;
    3.3 Create Streams Administrator and Grant the necessary privileges :
    Repeat steps 2.1 and 2.2 for creating the user and granting the required
    privileges.
    3.4 Create a database link to the destination database :
    connect STRMADMIN/STRMADMIN
    CREATE DATABASE LINK str2 connect to
    STRMADMIN identified by STRMADMIN using 'str2' ;
    -- The DB link is working fine; I tested it.
    3.5 Create streams queue:
    BEGIN
    DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_name => 'STREAMS_QUEUE',
    queue_table =>'STREAMS_QUEUE_TABLE',
    queue_user => 'STRMADMIN');
    END;
    3.6 Add capture rules for the table at the source database:
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name => 'SCOTT.DEPT',
    streams_type => 'CAPTURE',
    streams_name => 'STRMADMIN_CAPTURE',
    queue_name => 'STRMADMIN.STREAMS_QUEUE',
    include_dml => true,
    include_ddl => true,
    source_database => 'str1');
    END;
    3.7 Add propagation rules for the table at the source database.
    This step will also create a propagation job to the destination database.
    BEGIN
    DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
    table_name => 'SCOTT.DEPT',
    streams_name => 'STRMADMIN_PROPAGATE',
    source_queue_name => 'STRMADMIN.STREAMS_QUEUE',
    destination_queue_name => 'STRMADMIN.STREAMS_QUEUE@str2',
    include_dml => true,
    include_ddl => true,
    source_database => 'str1');
    END;
    Section 4
    Export, import and instantiation of tables from Source to Destination Database
    4.1 If the objects are not present in the destination database, perform an
    export of the objects from the source database and import them into the
    destination database
    Export from the Source Database:
    Specify the OBJECT_CONSISTENT=Y clause on the export command.
    By doing this, an export is performed that is consistent for each
    individual object at a particular system change number (SCN).
    exp [email protected] TABLES=SCOTT.DEPT FILE=tables.dmp
    GRANTS=Y ROWS=Y LOG=exportTables.log OBJECT_CONSISTENT=Y
    INDEXES=Y STATISTICS = NONE
    Import into the Destination Database:
    Specify STREAMS_INSTANTIATION=Y clause in the import command.
    By doing this, the streams metadata is updated with the appropriate
    information in the destination database corresponding to the SCN that
    is recorded in the export file.
    imp [email protected] FULL=Y CONSTRAINTS=Y
    FILE=tables.dmp IGNORE=Y GRANTS=Y ROWS=Y COMMIT=Y LOG=importTables.log
    STREAMS_INSTANTIATION=Y
    4.2 If the objects are already present in the destination database, there are
    2 ways of instantiating the objects at the destination site.
    1. By means of Metadata-only export/import :
    Export from the Source Database by specifying ROWS=N
    exp USERID=SYSTEM@str1 TABLES=SCOTT.DEPT FILE=tables.dmp
    ROWS=N LOG=exportTables.log OBJECT_CONSISTENT=Y
    Import into the destination database using IGNORE=Y
    imp USERID=SYSTEM@str2 FULL=Y FILE=tables.dmp IGNORE=Y
    LOG=importTables.log STREAMS_INSTANTIATION=Y
    2. By manually instantiating the objects
    Get the Instantiation SCN at the source database:
    connect STRMADMIN/STRMADMIN@source
    set serveroutput on
    DECLARE
    iscn NUMBER; -- Variable to hold instantiation SCN value
    BEGIN
    iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
    DBMS_OUTPUT.PUT_LINE ('Instantiation SCN is: ' || iscn);
    END;
    Instantiate the objects at the destination database with this SCN value.
    The SET_TABLE_INSTANTIATION_SCN procedure controls which LCRs for a table
    are to be applied by the apply process.
    If the commit SCN of an LCR from the source database is less than or
    equal to this instantiation SCN , then the apply process discards the LCR.
    Else, the apply process applies the LCR.
    connect STRMADMIN/STRMADMIN@destination
    BEGIN
    DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name => 'SCOTT.DEPT',
    source_database_name => 'str1',
    instantiation_scn => &iscn);
    END;
    Enter value for iscn:
    <Provide the value of SCN that you got from the source database>
    Finally start the Capture Process:
    connect STRMADMIN/STRMADMIN@source
    BEGIN
    DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'STRMADMIN_CAPTURE');
    END;
    INIT.ora at 9i
    # Copyright (c) 1991, 2001, 2002 by Oracle Corporation
    # Archive
    log_archive_dest_1='LOCATION=D:\oracle\oradata\str1\archive'
    log_archive_format=%t_%s.dbf
    log_archive_start=true
    # Cache and I/O
    db_block_size=8192
    db_cache_size=25165824
    db_file_multiblock_read_count=16
    # Cursors and Library Cache
    open_cursors=300
    # Database Identification
    db_domain=""
    db_name=str1
    # Diagnostics and Statistics
    background_dump_dest=D:\oracle\admin\str1\bdump
    core_dump_dest=D:\oracle\admin\str1\cdump
    timed_statistics=TRUE
    user_dump_dest=D:\oracle\admin\str1\udump
    # File Configuration
    control_files=("D:\oracle\oradata\str1\CONTROL01.CTL", "D:\oracle\oradata\str1\CONTROL02.CTL", "D:\oracle\oradata\str1\CONTROL03.CTL")
    # Instance Identification
    instance_name=str1
    # Job Queues
    job_queue_processes=10
    # MTS
    dispatchers="(PROTOCOL=TCP) (SERVICE=str1XDB)"
    # Miscellaneous
    aq_tm_processes=1
    compatible=9.2.0.0.0
    # Optimizer
    hash_join_enabled=TRUE
    query_rewrite_enabled=FALSE
    star_transformation_enabled=FALSE
    # Pools
    java_pool_size=33554432
    large_pool_size=8388608
    shared_pool_size=100663296
    # Processes and Sessions
    processes=150
    # Redo Log and Recovery
    fast_start_mttr_target=300
    # Security and Auditing
    remote_login_passwordfile=EXCLUSIVE
    # Sort, Hash Joins, Bitmap Indexes
    pga_aggregate_target=25165824
    sort_area_size=524288
    # System Managed Undo and Rollback Segments
    undo_management=AUTO
    undo_retention=10800
    undo_tablespace=UNDOTBS1
    firstspare_parameter=50
    jobqueue_interval=1
    aq_tm_processes=1
    transaction_auditing=TRUE
    global_names=TRUE
    logmnr_max_persistent_sessions=5
    log_parallelism=1
    parallel_max_servers=2
    open_links=5
    INIT.ora at 10g (target)
    # Copyright (c) 1991, 2001, 2002 by Oracle Corporation
    # Archive
    log_archive_format=ARC%S_%R.%T
    # Cache and I/O
    db_block_size=8192
    db_cache_size=25165824
    db_file_multiblock_read_count=16
    # Cursors and Library Cache
    open_cursors=300
    # Database Identification
    db_domain=""
    db_name=str2
    # Diagnostics and Statistics
    background_dump_dest=D:\oracle\product\10.1.0\admin\str2\bdump
    core_dump_dest=D:\oracle\product\10.1.0\admin\str2\cdump
    user_dump_dest=D:\oracle\product\10.1.0\admin\str2\udump
    # File Configuration
    control_files=("D:\oracle\product\10.1.0\oradata\str2\control01.ctl", "D:\oracle\product\10.1.0\oradata\str2\control02.ctl", "D:\oracle\product\10.1.0\oradata\str2\control03.ctl")
    db_recovery_file_dest=D:\oracle\product\10.1.0\flash_recovery_area
    db_recovery_file_dest_size=2147483648
    # Job Queues
    job_queue_processes=10
    # Miscellaneous
    compatible=10.1.0.2.0
    # Pools
    java_pool_size=50331648
    large_pool_size=8388608
    shared_pool_size=83886080
    # Processes and Sessions
    processes=150
    sessions=4
    # Security and Auditing
    remote_login_passwordfile=EXCLUSIVE
    # Shared Server
    dispatchers="(PROTOCOL=TCP) (SERVICE=str2XDB)"
    # Sort, Hash Joins, Bitmap Indexes
    pga_aggregate_target=25165824
    sort_area_size=65536
    # System Managed Undo and Rollback Segments
    undo_management=AUTO
    undo_tablespace=UNDOTBS1
    sga_target=600000000
    parallel_max_servers=2
    global_names=TRUE
    open_links=4
    logmnr_max_persistent_sessions=4
    REMOTE_ARCHIVE_ENABLE=TRUE
    streams_pool_size=300000000
    undo_retention=1000
    Thanks a lot...
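    Beyond what was posted: for the symptom described (propagation reports success at the source but apply shows 0 transactions assigned), a few queries at each end usually narrow down where the LCRs stop; a sketch, to be adapted to the queue and process names used above:
    -- Source: is the propagation schedule enabled and free of errors?
    SELECT qname, destination, schedule_disabled, failures, last_error_msg
    FROM   dba_queue_schedules
    WHERE  qname = 'STREAMS_QUEUE';
    -- Destination: has the apply reader dequeued anything at all?
    SELECT apply_name, state, total_messages_dequeued FROM v$streams_apply_reader;
    -- Destination: anything sitting (or spilled) in the buffered queue?
    SELECT queue_schema, queue_name, num_msgs, spill_msgs FROM v$buffered_queues;
    One detail worth double-checking in this particular setup is that the source_database value used in the rules ('str1') matches the source database's global name exactly, since GLOBAL_NAMES=TRUE is set on both sides; a mismatch is a common reason for LCRs never being applied.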

  • Streams on Standard Edition

    Hello list,
    I am trying to configure Oracle Streams on an SE 10g database; this master database will replicate data to many Personal or XE databases.
    My question is: is this configuration possible under SE 10g? I have read that SE supports only the message queuing and apply features, but then how will I capture the changes on the master database? Does this mean I have to buy EE?
    Can someone kindly guide me to a link that discusses the configuration of Streams under an SE 10g database?
    Thank you.

    Yes, that's what is written in the licensing guide. Technically, this is worth a test. If you have a Personal Edition installed, that should not take you more than 1 or 2 hours; make especially sure you can propagate buffered messages between AQ Personal and AQ XE or Personal and Personal. Note that since XE is based on 10.2.0.1 and has no support, you can expect a bunch of uncorrected bugs on Streams on that side.
    However, since Personal Edition is a one-named-user-only license and Streams is by nature a multi-user/multi-device approach, I wonder how you can mix those sorts of things together (from a contract perspective). I would suggest, as Dan did, that you either (1) contact your Oracle Sales Rep and get confirmation that you can do that, or (2) read the OLSA, work with a good lawyer and provision a few bucks for the lawsuit.
    Note also that if you are looking for ways to manage single users with connected/disconnected applications that synchronize together, there used to be a product for that called Oracle Lite, and it manages very advanced synchronization scenarios, including application upgrades. http://www.oracle.com/technology/products/lite/index.html . Other than that, there is the iPhone!
    My 2 cents
    Grégory

  • Problem in configuring Oracle Streams

    Hi Experts,
    I am trying to configure Oracle Streams on Oracle 10g through OEM. Of the 5 steps, all went smoothly except the very last one, which asks for the host username and host password and gives the error "Login Error - oracle.sysman.emSDK.admObj.AdminObjectException: ERROR: Wrong password for user".
    I want one complete schema to be replicated...
    Source :
    Oracle : 10.2.0.1
    OS : Windows
    User : administrator/passwd
    Dest :
    Oracle : 10.2.0.4
    OS : RHEL 4.7
    User : oracle/oracle
    please help me, thanks

    Hello, please tell me how you reset your password. I have changed my password but it is still not working. Please give me the steps to do that.
    Waiting for your reply.
    Thanks

  • ORACLE 10G: Grid or not?

    The following is an excerpt from Ian Foster's article on What is a Grid?.
    "...coordinates resources that are not subject to centralized control ...
    (A Grid integrates and coordinates resources and users that live
    within different control domains—for example, the user’s desktop
    vs. central computing; different administrative units of the same
    company; or different companies; and addresses the issues of
    security, policy, payment, membership, and so forth that arise in
    these settings. Otherwise, we are dealing with a local management
    system.)"
    My main point of confusion is that, going by Oracle 10g's capabilities, especially new features like the 10g Database's ASM, Streams and Transportable Tablespaces, the 10g Application Server's APM and Workload Management, and the whole array of 10g Enterprise Manager Grid Control features, systems built on this infrastructure can't be referred to as a grid, at least by the foregoing excerpt.
    Please can someone put me right if I am getting this wrong. I am researching grid technology and where Oracle technology fits.
    Thank you.

    Hi,
    you shouldn't be confused - Oracle Grid is largely hype,
    but seriously, this is enterprise grid computing, or
    computing on demand, or utility computing.
    It's different from the academic (classic) grid.
    The main goal of the enterprise grid is to reduce the TCO of datacenters.
    It's based on the usual "idiots" report from Gartner (or maybe
    IDC) that servers' CPUs are underloaded (idle).
    So you should dynamically distribute the workload between a
    few servers in a RAC cluster (now renamed a Grid).
    And of course, it will all work automagically,
    so the bosses can fire all the datacenter personnel
    and manage everything by just pushing one red button.
    SL

  • ORA-23460: missing value for column column name in resolution method

    Hi All,
    We have implemented bi-directional replication via Oracle Streams (Oracle 10g, 10.2.0.4).
    We are getting the following error in conflict resolution: ORA-23460: missing value for column "ASSIGNED_APPL_USER_ID" in resolution method "<method>" for
    "TMADMIN"."PATIENT_VISITS"."REP_UPDATE"; ORA-01403: no data found.
    The column ASSIGNED_APPL_USER_ID above is a nullable column. We are using the MAXIMUM resolution method for the update conflict handler. We have tried the same scenarios many times in Toad and they work fine.
    We have set supplemental logging on the primary and unique key columns.
    I found many docs on the internet and they all said to add supplemental logging on all columns.
    We set it up like this:
    DECLARE
    COLS DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'STUDY_ID';
    cols(2) := 'KENDLE_STUDY_SUB_ID';
    cols(3) := 'SITE_ID';
    cols(4) := 'PATIENT_ID';
    cols(5) := 'VISIT_NUMBER';
    cols(6) := 'VISIT_TYPE';
    cols(7) := 'VISIT_DATE';
    cols(8) := 'CRF_COLLECTION_DATE';
    cols(9) := 'DT_CREATED';
    cols(10) := 'DT_MODIFIED';
    cols(11) := 'MODIFIED_BY';
    cols(12) := 'MONITOR_VISIT_DATE';
    cols(13) := 'UNSCHEDULED_VISIT_REASON';
    cols(14) := 'LAST_SCHEDULED_VISIT_NUMBER';
    cols(15) := 'VISIT_DATE_IN_TRIALBASE';
    cols(16) := 'PROJECTED_VISIT_DATE';
    cols(17) := 'SEND_CERTIFIED_LETTER_YN';
    cols(18) := 'ASSIGNED_APPL_USER_ID';
    cols(19) := 'DATE_CERTIFIED_LETTER_SENT';
    cols(20) := 'DURATION_OF_CALL_IN_MINUTES';
    cols(21) := 'COMPLETED_APPL_USER_ID';
    cols(22) := 'CALL_STATUS';
    cols(23) := 'VISIT_HYPERLINK';
    cols(24) := 'NEXT_CALL_DATE';
    cols(25) := 'DT_PMNT_REQUESTED_TO_SPONSOR';
    cols(26) := 'VISIT_DATE_NAP_YN';
    cols(27) := 'MONITOR_VISIT_DATE_NAP_YN';
    cols(28) := 'CRF_COLLECTION_DATE_NAP_YN';
    cols(29) := 'IMPORTED_SUBJECT_VISIT_DATE';
    cols(30) := 'REMINDER_SENT_DATE';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    OBJECT_NAME =>'TMADMIN.PATIENT_VISITS',
    METHOD_NAME =>'MAXIMUM',
    RESOLUTION_COLUMN =>'DT_MODIFIED',
    COLUMN_LIST =>cols);
    END;
    DT_MODIFIED is our resolution column.
    Any help would be appreciated.
    Thanks in advance.
    Thanks,
    Nick.

    You should post this question in the Oracle Server forums.
    Good luck!
    Warm regards.
    ashok
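    A note beyond the reply above: ORA-23460 means the MAXIMUM handler needed a column value that was not present in the LCR, which is why the documents the poster found recommend supplemental logging of all columns. On both databases that is typically done with a statement like the following sketch, using the table from this thread:
    -- Log all columns so update LCRs always carry the values the conflict
    -- handler's column list needs (run on both sites of the bi-directional setup)
    ALTER TABLE tmadmin.patient_visits ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;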

  • Set SGA maximum size larger than its components

    Dear all,
    I want to activate DISM for Oracle 10g in a Solaris container; therefore I need to set the parameter sga_max_size larger than its components.
    From my understanding, below are the SGA component parameters and their current values on my system:
    SHARED_POOL_SIZE (560M)
    LARGE_POOL_SIZE (0)
    JAVA_POOL_SIZE (32M)
    DB_CACHE_SIZE (560M)
    STREAMS_POOL_SIZE (0)
    The thing is, my sga_max_size parameter (1168M) is already larger than the above parameters.
    Is there another SGA component I've missed? Please advise. Thanks.

    Hi,
    SGA: the size is determined indirectly from the sizes of the contained memory areas:
    1) Buffer pool: DB_BLOCK_BUFFERS (unit: blocks), or DB_CACHE_SIZE when you use the dynamic SGA
    2) Shared pool: SHARED_POOL_SIZE
    3) Java pool: JAVA_POOL_SIZE
    4) Large pool: LARGE_POOL_SIZE
    5) Streams pool (Oracle 10g or later): STREAMS_POOL_SIZE
    6) Redo buffer: LOG_BUFFER
    In addition, in the context of the dynamic SGA, you can define the parameter
    SGA_MAX_SIZE, which sets an upper limit for the total size of the SGA. In
    general, you can only increase the size of parameters such as DB_CACHE_SIZE
    or SHARED_POOL_SIZE up to the size defined by SGA_MAX_SIZE.
    Thanks
    Sunny
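    Beyond the list above, the question "is there an SGA component I've missed?" can be answered directly from the instance: v$sgainfo lists every SGA area, including the redo log buffer and the fixed SGA that are not set through the pool parameters. A small sketch:
    -- Every SGA area with its current size in bytes (10g)
    SELECT name, bytes, resizeable FROM v$sgainfo ORDER BY bytes DESC;
    -- Current sizes of the dynamically resizable components
    SELECT component, current_size FROM v$sga_dynamic_components;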

  • Call Function from rman

    Please, I need to call a function from RMAN. For example, the idea is the following:
    I want to delete the archived logs from an initial SCN; this SCN is calculated by the function
    sfInitSCN, because I am working with LogMiner and I need to delete the archived logs that have
    already been processed.
    When I process an archived log, its last SCN is written to a table, and I read from this table to decide which archived logs to delete.
    RMAN> run
    2> {
    3> backup archivelog from SCN= sql 'begin Scott.sfInitSCN;end;' until SCN=2263545 delete input;}
    Please, any ideas?

    Any ideas, please, on how Streams replication interacts with archived logs?
    For example, with Streams replication in 10g, when you back up the database and delete archived logs, does it prevent deleting archived logs that are still needed for Streams replication? I am implementing this idea; any suggestions on how Oracle handles this?
    Please, any ideas...
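    Not from the thread itself, but since RMAN cannot call a PL/SQL function inline inside a BACKUP command, a common workaround is to let SQL*Plus fetch the SCN into a substitution variable and spool a small RMAN command file. A rough sketch, assuming sfInitSCN is (or can be wrapped as) a function returning the starting SCN:
    -- Generate run_backup.rman with the SCN substituted, then run it with RMAN
    SET VERIFY OFF FEEDBACK OFF HEADING OFF PAGESIZE 0
    COLUMN init_scn NEW_VALUE v_init_scn
    SELECT scott.sfInitSCN AS init_scn FROM dual;
    SPOOL run_backup.rman
    SELECT 'backup archivelog from scn ' || &v_init_scn ||
           ' until scn 2263545 delete input;'
    FROM dual;
    SPOOL OFF
    -- then: rman target / cmdfile=run_backup.rman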

  • Oracle Streams b/w MS-Access 2007 and Oracle 10g.

    Can we set up Oracle Streams between MS-Access 2007 and Oracle 10g, with MS-Access as the source and Oracle 10g as the destination database? If so, can anyone please give me a little heads-up with supporting docs or any source of info?

    Help Help....!!!
