Conflict issue in Oracle Streams

Hi,
I have a Streams replication environment set up on scott.emp. The setup is bidirectional replication between two servers, one in the UK and one in the US. I want the UK users to be unable to update the US records, and likewise the US users to be unable to update the UK records. Is there anything I can do to achieve this? I am using 10gR2 for the replication.
Thanks
Kapil

Hi,
But I think dbms_rls provides row-level security at the schema level, whereas here I have different application users. I need to restrict the application users from updating each other's records. For example, I have a schema called admin in the US holding records from both the US application users and the UK ones. I am not sure whether dbms_rls will work in this scenario.
Please correct me if my understanding is wrong somewhere.
Thanks
Kapil
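A sketch of the DBMS_RLS route, under assumptions not stated above: a SITE column on scott.emp recording which side owns each row, and an application context app_ctx whose user_site attribute is set at login. The policy attaches a predicate to UPDATE and DELETE only, so both sites can still read everything:
CREATE OR REPLACE FUNCTION scott.site_update_policy (
  p_schema IN VARCHAR2,
  p_object IN VARCHAR2
) RETURN VARCHAR2 IS
BEGIN
  -- Only rows whose SITE matches the session's site may be modified.
  RETURN 'site = SYS_CONTEXT(''app_ctx'', ''user_site'')';
END;
/
BEGIN
  DBMS_RLS.ADD_POLICY(
    object_schema   => 'scott',
    object_name     => 'emp',
    policy_name     => 'emp_site_update',
    function_schema => 'scott',
    policy_function => 'site_update_policy',
    statement_types => 'UPDATE,DELETE');  -- leave SELECT unrestricted
END;
/
Because the predicate reads a session context rather than the logged-in schema name, it can distinguish application users even when they share one database account, which is the concern raised above. Note also that the Streams apply process itself must not be blocked by such a policy; granting the apply user EXEMPT ACCESS POLICY, or having the policy function return NULL for that user, are common ways to keep replicated changes flowing.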

Similar Messages

  • Complex Oracle Streams issue - Update conflicts

This is for Oracle Streams replication on 11gR2.
I am facing update conflicts in a table. The conflicts arise from both technical and business-logic issues. The business-logic conflicts will pass through the replication/apply process successfully, but for our requirements we want to catch and resolve them before replication. These are typically somewhat complex cases, and we are exploring the possibility of having both DML handlers and error handlers: the DML handlers would take care of the business-logic conflicts, and the error handler would take care of technical issues before Streams pushes them to the error queue. Based on our understanding and verification, we found a limitation: a procedure DML handler and an error handler cannot be configured together for the same table operation.
Statement handlers cannot be used for our conflict scenarios.
Following are my questions:
1. Has anyone implemented or faced such a scenario in a real application? If yes, can you please share some insights or inputs?
2. Is there a custom way to handle this complex problem of configuring both a DML handler and an error handler?
3. Is there any alternative way to resolve this situation in an Oracle Streams environment with other handlers?

    Dear All
I too have a similar requirement. Could anyone help with this?
We can handle the erroring transactions via error handler procedures.
But we cannot configure a DML handler procedure for transactions that replicate successfully; Streams does not allow us to configure a handler for this. Is there any other handler/procedure/hook in Streams where we can implement the desired functionality, which includes changing the values in the LCR before invoking lcr.execute(), and being able to discard the LCR if required?
    Regards
    Velmurugan
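For reference, registering a procedure DML handler that can rewrite or discard LCRs looks roughly like the sketch below (the schema, table, and column names are illustrative only; error_handler => FALSE is what registers the procedure as a DML handler rather than an error handler):
CREATE OR REPLACE PROCEDURE strmadmin.emp_dml_handler (in_any IN ANYDATA) IS
  lcr SYS.LCR$_ROW_RECORD;
  rc  PLS_INTEGER;
BEGIN
  rc := in_any.GETOBJECT(lcr);
  -- Rewrite a column value before applying (hypothetical column and value):
  lcr.SET_VALUE('NEW', 'SALARY', ANYDATA.ConvertNumber(0));
  lcr.EXECUTE(TRUE);  -- apply the modified LCR; omitting this call discards it
END;
/
BEGIN
  DBMS_APPLY_ADM.SET_DML_HANDLER(
    object_name    => 'scott.emp',
    object_type    => 'TABLE',
    operation_name => 'UPDATE',
    error_handler  => FALSE,
    user_procedure => 'strmadmin.emp_dml_handler');
END;
/
Whether such a handler can coexist with an error handler on the same table operation is exactly the limitation discussed above, so treat this as a sketch of the single-handler case only.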

  • Oracle Streams Update conflict handler not working

    Hello,
I've been working on Oracle Streams, and this time we have to come up with an update conflict handler.
We are using Oracle 11g in a Solaris 10 environment.
So far, we have implemented bi-directional Oracle Streams replication and it is working fine.
Now, when I try to implement the update conflict handler, it executes successfully but does not fulfill the desired functionality.
Here are the steps I performed:
Step 1:
create table jas23.test73 (first_name varchar2(20),last_name varchar2(20), salary number(7));
    ALTER TABLE jas23.test73 ADD (time TIMESTAMP WITH TIME ZONE);
    insert into jas23.test73 values ('gugg','qwer',2000,SYSTIMESTAMP);
    insert into jas23.test73 values ('papa','sdds',2050,SYSTIMESTAMP);
    insert into jas23.test73 values ('jaja','xzxc',2075,SYSTIMESTAMP);
    insert into jas23.test73 values ('kaka','cvdxx',2095,SYSTIMESTAMP);
    insert into jas23.test73 values ('mama','rfgy',1900,SYSTIMESTAMP);
    insert into jas23.test73 values ('tata','jaja',1950,SYSTIMESTAMP);
    commit;
Step 2:
connect as strmadmin/strmadmin on server 1:
SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
Step 3:
    SQL>
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'first_name';
  cols(2) := 'last_name';
  cols(3) := 'salary';
  cols(4) := 'time';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'jas23.test73',
    method_name       => 'MAXIMUM',
    resolution_column => 'time',
    column_list       => cols);
END;
/
Step 4:
connect as strmadmin/strmadmin on server 2:
    SQL>
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'first_name';
  cols(2) := 'last_name';
  cols(3) := 'salary';
  cols(4) := 'time';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'jas23.test73',
    method_name       => 'MAXIMUM',
    resolution_column => 'time',
    column_list       => cols);
END;
/
Step 5:
Now, if I try to update the value of salary, it is not handled by the update conflict handler.
update jas23.test73 set salary = 1500,time=SYSTIMESTAMP where first_name='papa'; --server1
update jas23.test73 set salary = 2500,time=SYSTIMESTAMP where first_name='papa'; --server2
commit; --server1
commit; --server2
Note: the two servers are in different time zones (I hope that won't be a problem).
Now, after performing all these steps, the data is not the same at both sites.
    Error(DBA_APPLY_ERROR) -
ORA-26787: The row with key ("FIRST_NAME", "LAST_NAME", "SALARY", "TIME") = (papa, sdds, 2000, 23-DEC-10 05.46.18.994233000 PM +00:00) does not exist in table JAS23.TEST73
    ORA-01403: no data found
    Please help.
    Thanks.
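As a first sanity check, sketched here against the standard dictionary view rather than offered as a diagnosis, it is worth confirming on both sites that the handler really was registered for the table:
SQL> select object_owner, object_name, method_name, resolution_column, column_name
     from dba_apply_conflict_columns
     where object_owner = 'JAS23' and object_name = 'TEST73';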

    Hi,
When I tried to do it on server 2:
SQL> ALTER TABLE jas23.test73 ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
it threw an error:
Error -
    ERROR at line 1:
    ORA-32588: supplemental logging attribute all column exists

• Oracle Streams and Oracle Apps 11i

    Hi,
I am looking for an Oracle solution to build a reporting instance off an E-Business Suite 11i instance and offload Discoverer reporting to the reporting instance. I have tried Data Guard logical standby, but there were too many issues and I cannot use it. I am wondering if anyone has tried using Oracle Streams to build a reporting instance from an Oracle Apps instance?
    Thanks

    Hi,
Is Streams supported in an E-Business Suite database? We once had a scenario where some parameters for Streams conflicted with parameters required for E-Business Suite performance improvement. I could not find any documents from Oracle verifying whether Streams on an E-Business Suite database is a supported configuration or not.
    So if you have any such document, could you send the link to me?
    Also please provide the link to your white paper/presentation.
    Regards,
    Sujoy

• Help on Oracle Streams 11g configuration

    Hi Streams experts
Can you please validate the following creation process steps?
What I need Streams to do is a one-way replication of the AR schema
from one database to another. Both DML and DDL changes shall be
replicated.
I would also need your help on the maintenance steps, controls, and
procedures.
2 databases:
1 src as the source database
1 dst as the destination database
Replication type: one-way, of the entire schema FaeterBR
    Step 1. Set all databases in archivelog mode.
    Step 2. Change initialization parameters for Streams. The Streams pool
    size and NLS_DATE_FORMAT require a restart of the instance.
    SQL> alter system set global_names=true scope=both;
    SQL> alter system set undo_retention=3600 scope=both;
    SQL> alter system set job_queue_processes=4 scope=both;
    SQL> alter system set streams_pool_size= 20m scope=spfile;
    SQL> alter system set NLS_DATE_FORMAT=
    'YYYY-MM-DD HH24:MI:SS' scope=spfile;
    SQL> shutdown immediate;
    SQL> startup
    Step 3. Create Streams administrators on the src and dst databases,
    and grant required roles and privileges. Create default tablespaces so
    that they are not using SYSTEM.
---at the src:
SQL> create tablespace streamsdm datafile
'/u01/product/oracle/oradata/orcl/strepadm01.dbf' size 100m;
---at the replica:
SQL> create tablespace streamsdm datafile
'/u02/oracle/oradata/str10/strepadm01.dbf' size 100m;
---at both sites:
SQL> create user streams_adm
identified by streams_adm
default tablespace streamsdm
temporary tablespace temp;
    SQL> grant connect, resource, dba, aq_administrator_role to
    streams_adm;
    SQL> BEGIN
    DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE (
    grantee => 'streams_adm',
    grant_privileges => true);
    END;
    Step 4. Configure the tnsnames.ora at each site so that a connection
    can be made to the other database.
Step 5. With the tnsnames.ora squared away, create a database link for
the streams_adm user at both SRC and DST. With the init parameter
global_names set to true, the db link name must be the same as the
global_name of the database you are connecting to. Use a SELECT from
the view global_name at each site to determine the global name.
    SQL> select * from global_name;
    SQL> connect streams_adm/streams_adm@SRC
    SQL> create database link DST
    connect to streams_adm identified by streams_adm
    using 'DST';
    SQL> select sysdate from dual@DST;
SQL> connect streams_adm/streams_adm@DST
SQL> create database link SRC
connect to streams_adm identified by streams_adm
using 'SRC';
    SQL> select sysdate from dual@SRC;
Step 6. Decide which schema shall be replicated:
FaeterBR is the schema to be replicated.
Step 7. Add supplemental logging to all the tables of the FaeterBR
schema?
    SQL> Alter table FaeterBR.tb1 add supplemental log data
    (ALL) columns;
    SQL> alter table FaeterBR.tb2 add supplemental log data
    (ALL) columns;
    etc...
Step 8. Create Streams queues at the primary and replica databases.
---at SRC (primary):
SQL> connect streams_adm/streams_adm@ORCL
SQL> BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'streams_adm.FaeterBR_src_queue_table',
queue_name => 'streams_adm.FaeterBR_src_queue');
END;
---at DST (replica):
SQL> connect streams_adm/streams_adm@STR10
SQL> BEGIN
DBMS_STREAMS_ADM.SET_UP_QUEUE(
queue_table => 'streams_adm.FaeterBR_dst_queue_table',
queue_name => 'streams_adm.FaeterBR_dst_queue');
END;
    Step 9. Create the capture process on the source database (SRC).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name =>'FaeterBR',
    streams_type =>'capture',
    streams_name =>'FaeterBR_src_capture',
    queue_name =>'FaeterBR_src_queue',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database => NULL,
    inclusion_rule => true);
    END;
Step 10. Instantiate the FaeterBR schema at DST by doing export/import.
Can I use Data Pump to do that now? (See the sketch at the end of this step.)
    ---AT SRC:
    exp system/superman file=FaeterBR.dmp log=FaeterBR.log
    object_consistent=y owner=FaeterBR
    ---AT DST:
---Create FaeterBR tablespaces and user:
create tablespace FaeterBR_ datafile
'/u02/oracle/oradata/str10/FaeterBR_01.dbf' size 100G;
create tablespace ws_app_idx datafile
'/u02/oracle/oradata/str10/ws_app_idx_01.dbf' size 100G;
create user FaeterBR identified by FaeterBR_
default tablespace FaeterBR_
temporary tablespace temp;
grant connect, resource to FaeterBR;
imp system/123db file=FaeterBR.dmp log=FaeterBR.log fromuser=FaeterBR
touser=FaeterBR streams_instantiation=y
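On the Data Pump question in step 10: yes, Data Pump can be used for the instantiation instead of exp/imp. A minimal sketch, assuming a DPUMP_DIR directory object exists at both sites and taking a consistent SCN from DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER at SRC:
---at SRC:
expdp system/superman schemas=FaeterBR directory=DPUMP_DIR dumpfile=FaeterBR.dmp flashback_scn=<SCN captured at SRC>
---at DST:
impdp system/123db schemas=FaeterBR directory=DPUMP_DIR dumpfile=FaeterBR.dmp
If the instantiation SCN is not picked up automatically on import, it can be set explicitly with DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN.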
    Step 11. Create a propagation job at the source database (SRC).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
    schema_name =>'FaeterBR',
    streams_name =>'FaeterBR_src_propagation',
    source_queue_name =>'stream_admin.FaeterBR_src_queue',
    destination_queue_name=>'stream_admin.FaeterBR_dst_queue@dst',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database =>'SRC',
    inclusion_rule =>true);
    END;
    Step 12. Create an apply process at the destination database (DST).
    SQL> BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name =>'FaeterBR',
    streams_type =>'apply',
    streams_name =>'FaeterBR_Dst_apply',
    queue_name =>'FaeterBR_dst_queue',
    include_dml =>true,
    include_ddl =>true,
    include_tagged_lcr =>false,
    source_database =>'SRC',
    inclusion_rule =>true);
    END;
Step 13. Create substitution key columns for all the tables of the
FaeterBR schema on DST that don't have a primary key.
    The column combination must provide a unique value for Streams.
    SQL> BEGIN
    DBMS_APPLY_ADM.SET_KEY_COLUMNS(
    object_name =>'FaeterBR.tb2',
    column_list =>'id1,names,toys,vendor');
    END;
Step 14. Configure conflict resolution at the replica database (DST).
Is there any easier method applicable to the whole schema?
    DECLARE
    cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
    cols(1) := 'id';
    cols(2) := 'names';
    cols(3) := 'toys';
    cols(4) := 'vendor';
    DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name =>'FaeterBR.tb2',
    method_name =>'OVERWRITE',
resolution_column=>'id',
    column_list =>cols);
    END;
    Step 15. Enable the capture process on the source database (SRC).
    BEGIN
    DBMS_CAPTURE_ADM.START_CAPTURE(
    capture_name => 'FaeterBR_src_capture');
    END;
    Step 16. Enable the apply process on the replication database (DST).
    BEGIN
    DBMS_APPLY_ADM.START_APPLY(
    apply_name => 'FaeterBR_DST_apply');
    END;
Step 17. Test Streams propagation of rows from the source (SRC) to the
replica (DST).
    AT ORCL:
    insert into FaeterBR.tb2 values (
    31000, 'BAMSE', 'DR', 'DR Lejetoej');
    AT STR10:
    connect FaeterBR/FaeterBR
    select * from FaeterBR.tb2 where vendor= 'DR Lejetoej';
    Any other test that can be made?
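A few more checks, sketched against the standard Streams dictionary views, to verify each component is up and nothing is stuck in the error queue:
SQL> select capture_name, state from v$streams_capture;
SQL> select propagation_name, status from dba_propagation;
SQL> select apply_name, status from dba_apply;
SQL> select apply_name, error_message from dba_apply_error;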

Check the Metalink doc 301431.1 and validate it:
    How To Setup One-Way SCHEMA Level Streams Replication [ID 301431.1]
    Oracle Server Enterprise Edition - Version: 10.1.0.2 to 11.1.0.6
    Cheers.

• Xbox/WiFi Comcast conflict issue

Let me start this by saying I really don't know much; I'm trying to learn. OK, now that's out of the way: I've noticed there's a post from a Comcast employee about a conflict between Xbox Live and Comcast that's not allowing some people to play over a WiFi network. My question is: if Comcast or ISPs are not able to affect or control WiFi speeds, only hardline, how can one game or certain games be blocked while other devices or even other games have no issues? I'm truly trying to understand here. I have bad WiFi problems and have been through all the techs and ideas anyone can suggest. No one has an answer; they all say my equipment should have no issues. I think it's something on Comcast's side, similar to this Xbox issue, as in there's something in or associated with my account that's affecting my WiFi. Everyone I talk to says there is no way to affect, slow, or even stop my WiFi without affecting my hardline, but this issue with Comcast and Microsoft admits there is some form of WiFi control beyond the hardline. I'm not looking for a debate, just really curious here.

Cool, thanks for the response! I have been through all possible steps to reduce interference, literally all. I can tell you what equipment I have, seeing as how this post is getting more help than my actual help request. I currently do not have a Comcast gateway, but Comcast is basically telling me it's the only way to get my WiFi speeds, so I ordered one to try; the local office only has old units. I have been through 3 complete upgrades trying to resolve this issue. I started with my own Motorola/Arris sb6850; after a month, a phone tech through Comcast suggested it might be a bad unit, so, willing to try anything, I took it back and got a replacement. I used it for a week to make sure there was no change; WiFi was still 50% of direct connect, 2 ft away from the modem/router. Then I gave in, knowing gateways don't have the WiFi power of a separate router. I returned the gateway and picked up a Netgear CM400 modem and a Netgear AC-band R6300 router. I again got everything set up, checked direct connect, all good; WiFi still 50% of direct. This time I tried Netgear support; after they had me change some internal settings and try for the usual 48 hrs, still no change. Netgear then told me the WiFi was bad in the unit. Reluctant to believe I had yet a 3rd bad WiFi unit in my stuff, I still listened. I then returned the R6300 router and picked up a Netgear Nighthawk, went through the same process, waited a week, called support, etc. Netgear then told me I had another bad WiFi radio in the router. Keep with me now: that's the 4th supposedly bad or defective unit. After many hours of Netgear support, I decided that rather than just return the "defective" router (I don't think it actually was), I would return the modem as well. This time I got a Motorola/Arris SB6121 modem and an ASUS RT-AC68P router (not a cheap unit). After getting it all set up through Comcast and ASUS, still no change: on direct connect I'm getting 100 Mbps, on WiFi 50 or under, usually under. Again, these tests are within 2 ft of the router with no interference and nothing else connected or running on the direct connection. I called Comcast "tech support" again; they say I need a gateway unit, that it's the only way to get my WiFi speeds and rule out supposed conflict issues between my own equipment, even though I fully checked that. I even tried Comcast WiFi "support" a few days ago; this person suggested it was just a speed-test site being incorrect, even though I use the Xfinity speed test and Ookla (speedtest.net), both of which are powered by Ookla. Plus, I told the person it's not just numbers: if I have one streaming device running, I get slow, lagging issues on the other devices. If I truly were getting 105 or 100 Mbps, or even the 50ish I'm getting according to the tests, this should not be happening. This is my annoying, frustrating, true story. Tomorrow or the next day my gateway unit from Comcast will be here, even though it's nothing compared to my current equipment. Oh, I forgot to mention I've had multiple techs come to my home, at least 6, probably 10. I have irritated many people trying to get this issue resolved through Comcast. I'm polite but definitely frustrated.

  • BLOB in Oracle Streams

    Oracle 10.2.0.4:
I am new to Oracle Streams and just reading the docs at this point. I read in http://download.oracle.com/docs/cd/B19306_01/server.102/b14229.pdf that BLOBs are not supported by Streams. I am just looking for a basic Streams configuration with some rule processing that will send LCRs from the source queue to the destination queue. And as I understand it, I can do that by using an ANYDATA payload.
We have some tables with BLOB data.

    It's all a balancing act. If you absolutely need both data centers processing transactions simultaneously, you'll need Streams.
Let's start with the simplest possible case of this: two data centers A and B, with databases 1 and 2. Database 1 is in data center A; database 2 is in data center B. If database 1 fails, would you be able to shift traffic to database 2 relatively easily? Assuming that you're building in functionality to shift load between databases, which is normally the case when you're building this sort of distributed application, it may be easier to do this sort of shift regardless of the reason that database 1 fails.
    If you have a standby database in each data center (1A as the standby for database 1, 2A as the standby for database 2), when 1 fails, you have to figure out whether whatever caused 1 to fail will also cause 1A to fail. If data center A is having connectivity or power issues, for example, you would have to shift traffic to 2 rather than failing 1 over to 1A. On the other hand, if it was an isolated server failure, you could either shift traffic to 2 or fail over to 1A. There is some risk that having a more complex failure scenario makes it more likely that someone makes a mistake-- there will be a number of failover steps that you'd do only if you're failing from 1 to 1A and other steps that you'd do if you were shifting traffic from 1 to 2 and some steps that are common-- and makes it more difficult to fully test all the scenarios. On the other hand, there may well be benefits to having more options to respond to different sorts of failures. And politics/ reporting structure as well as geography plays a role here-- if the data centers are on different continents, shifting traffic is probably much less desirable than if you have two US data centers.
    If, rather than having standbys 1A and 2A, database 1 and 2 were really multi-node RAC clusters, both database 1 and database 2 would be able to survive most sorts of localized hardware failure (i.e. one node can fail on database 1 without affecting whether database 1 is up and processing transactions). If there was a data center wide failure, you'd still have to shift traffic. But one server dying in a pile wouldn't be an issue. Of course, there would be a handful of events that could take down the entire RAC cluster without affecting the data center where a standby could potentially be used (i.e. the SAN for the cluster fails but the standby is using a different SAN). Those may not be particularly likely, however, so it may make sense not to bother with contingency planning for them and just assume that anything that knocks out all the nodes of the cluster forces traffic to be shifted to 2 and that it wouldn't be worth trying to maintain a standby for those scenarios.
There are lots of trade-offs here: simplicity of setup, simplicity of failover, robustness, etc. And there are going to be cases where you realistically need to take a stab at predicting how likely various events are, which gets pretty deeply into hardware, setup, and politics (i.e. how likely a server is to fail depends on whether you've bought a high-end server with doubly-redundant everything or a commodity Linux box; how likely a data center is to fail depends on the data center's redundancy measures and your level of confidence in those measures, etc.)
    Justin

  • Oracle stream not working as Logminer is down

    Hi,
The Oracle Streams capture process is not capturing any updates made on the table for which the capture and apply processes are configured.
The capture and apply processes are running fine, showing ENABLED as their status and no errors. But no new records are captured in 'streams_queue_table' when I update a record in the table that is configured for capturing changes.
This setup was working until I got 'ORA-01341: LogMiner out-of-memory' in the alert.log file. I guess LogMiner is not capturing the updates from the redo log.
    Current Alert log is showing following lines for logminer init process
    LOGMINER: Parameters summary for session# = 1
    LOGMINER: Number of processes = 3, Transaction Chunk Size = 1
    LOGMINER: Memory Size = 10M, Checkpoint interval = 10M
    But same log was like this before
    LOGMINER: Parameters summary for session# = 1
    LOGMINER: Number of processes = 3, Transaction Chunk Size = 1
    LOGMINER: Memory Size = 10M, Checkpoint interval = 10M
    LOGMINER: session# = 1, reader process P002 started with pid=18 OS id=5812
    LOGMINER: session# = 1, builder process P003 started with pid=36 OS id=3304
LOGMINER: session# = 1, preparer process P004 started with pid=37 OS id=1496
We can clearly see that the reader, builder, and preparer processes are not starting after I got the out-of-memory exception in LogMiner.
To allocate more space to LogMiner, I tried to set up a tablespace for it, and I got 2 exceptions that contradicted each other.
    SQL> exec DBMS_LOGMNR.END_LOGMNR();
    BEGIN DBMS_LOGMNR.END_LOGMNR(); END;
*
ERROR at line 1:
ORA-01307: no LogMiner session is currently active
ORA-06512: at "SYS.DBMS_LOGMNR", line 76
ORA-06512: at line 1
    SQL> EXECUTE DBMS_LOGMNR_D.SET_TABLESPACE('logmnrts');
    BEGIN DBMS_LOGMNR_D.SET_TABLESPACE('logmnrts'); END;
*
ERROR at line 1:
ORA-01356: active logminer sessions found
ORA-06512: at "SYS.DBMS_LOGMNR_D", line 232
ORA-06512: at line 1
When I tried stopping LogMiner, the exception was 'no LogMiner session is currently active', but when I tried to set up the tablespace, the exception was 'active logminer sessions found'. I am not sure how to resolve this.
    Please let me know how to resolve this issue.
    Thanks
    siva

    The Logminer session associated with a capture process is a special kind of session which is called a "persistent session". You will not be able to stop it using DBMS_LOGMNR. This package controls only non-persistent sessions.
    To stop the persistent LogMiner session you must stop the capture process.
However, I think your problem is more related to a lack of RAM space than tablespace (i.e., disk) space. Try to increase the size of the SGA allocated to LogMiner by setting the capture parameter _SGA_SIZE. I can see you are using the default of 10M, which may not be enough in your case. Of course, you will have to increase the values of the init parameters streams_pool_size and sga_target/sga_max_size accordingly, to avoid other memory problems.
To set the _SGA_SIZE parameter, use the PL/SQL procedure DBMS_CAPTURE_ADM.SET_PARAMETER. The example below would set it to 100 MB:
begin
DBMS_CAPTURE_ADM.set_parameter('<name of capture process>','_SGA_SIZE','100');
end;
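And, purely as an illustration of the accompanying init-parameter change mentioned above (the 200M figure is an arbitrary example, not a recommendation):
SQL> alter system set streams_pool_size=200M scope=both;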
    I hope this helps.
    Ilidio.

  • Oracle Streams - First Load

    Hi,
I have an Oracle Streams environment working well. I replicate and transform the data.
My problem is:
Initially I have a source database with 3 million records and a destination database with no records.
I have to equalize the source and destination databases before starting to synchronize them.
Do you know how I can replicate (and transform) this data for my first load?
It's not only copying all the data; it's copying and transforming it.
Is it possible to use the same transformation process for this first load?
If I didn't have to transform the data, I would use the Data Pump tool (for example). But I have to transform the data for my destination database.
    Thanks

I am in DAC, trying to run the Informatica ETL for one of the prebuilt execution plans (HR - Oracle R12). I built the project plan and ran it. I got a Failed status for all of the ETL steps (starting from the first one, 'Load Row into Run table'). I have attached the error log for 'Load Row into Run table' below.
I took a closer look at all the steps, and it seems like they all have this common "fail parent if this task fails" error message.
    Error log for
    pmcmd startworkflow -u Administrator -p **** -s SOBI:4006 -f SILOS -lpf C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles\SILOS.SIL_InsertRowInRunTable.txt SIL_InsertRowInRunTable
    Status Desc : Failed
    WorkFlowMessage :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.1.1 SP5], build [186.0822], Windows 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu May 07 14:46:04 2009
    Connected to Integration Service at [SOBI:4006]
    Folder: [SILOS]
    Workflow: [SIL_InsertRowInRunTable] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SIL_InsertRowInRunTable] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SIL_InsertRowInRunTable] will be failed.]
    Start time: [Thu May 07 14:45:43 2009]
    End time: [Thu May 07 14:45:47 2009]
    Workflow log file: [C:\Informatica\PowerCenter8.1.1\server\infa_shared\WorkflowLogs\SIL_InsertRowInRunTable.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Integration Service: [Oracle_BI_DW_Base_Integration_Service]
    Disconnecting from Integration Service
    Completed at Thu May 07 14:46:04 2009
    =====================================
    ERROR OUTPUT
    =====================================
    Error Message : Unknown reason for error code 36331
    ErrorCode : 36331
    If you have any input on how to fix this issue, please let me know.

  • Global_names in Oracle Streams?

Oracle recommends GLOBAL_NAMES=TRUE when setting up an Oracle Streams environment. Are there any reasons why Oracle recommends this setting? Actually, I've used GLOBAL_NAMES=false without any issues so far...

An excerpt from the Oracle documentation:
    GLOBAL_NAMES
    Specifies whether a database link is required to have the same name as the database to which it connects.
    To use Streams to share information between databases, set this parameter to true at each database that is participating in your Streams environment.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14229/strms_mprep.htm#i1006278
    Regards,
    Sabdar Syed.
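To make the excerpt concrete: with GLOBAL_NAMES=true, a database link can only be used if its name matches the remote database's global name. A sketch, with illustrative names and credentials:
SQL> select * from global_name;  -- suppose the remote site reports DST.EXAMPLE.COM
SQL> create database link dst_any connect to streams_adm identified by streams_adm using 'DST';
SQL> select sysdate from dual@dst_any;
-- fails with ORA-02085: database link DST_ANY connects to DST.EXAMPLE.COM
SQL> create database link DST.EXAMPLE.COM connect to streams_adm identified by streams_adm using 'DST';
SQL> select sysdate from dual@DST.EXAMPLE.COM;  -- works: the link name matches the global name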

  • Oracle Streams - Error ORA-26786,ORA-26787

    Good day to all,
I hope someone can help me with Oracle Streams; I am trying to identify the following error at the target, found in dba_apply_error:
ORA-26786: A row with key ("RECKEY") = (1) exists but has conflicting column(s) "ULT_NRO" in table PROD2.NUM_TRANSACCIONES
ORA-01403: no data found
This leaves many of my tables at the destination desynchronized. I also have cases where I get ORA-26787, which has the following definition:
ORA-26787: "The row with key %s does not exist in table %s"
The source database is 11gR2 EE and the target is 11gR2 Standard Edition One; the target is passive only and the source database is active.
Hopefully someone can help me;
I look forward to your comments.
    Thank you.

    Hi,
Well, I have to tell you that I tried implementing Streams with parallelism 1; however, I keep getting the same error.
    ORA-26787: The row with key ("ANO", "CAPITAL_AMORTIZADO", "CAPITAL_APROBADO", "CAPITAL_DESEMBOLSADO", "CAPITAL_VENCIDO", "CREDITO", "CUOTAS_ATRAZADAS", "DIAS_MOROSIDAD_HOY", "DIAS_MOROSIDAD_PROY", "FECHA_ULTIMA_CAP", "FECHA_ULT_MOV", "FECHA_VENCIMIENTO", "FLAG_ESTADO_ANT", "FLAG_ESTADO_CRED", "GARANTIA_CANT", "GASTO_COBRANZA", "ID_AGENCIA", "INTERES_ACUM", "INTERES_COMP_PROY", "INTERES_MOR_PROY", "INTERES_PROY", "MES", "NRO_PROCESO", "OTRO_CARGO", "PAGO_INTERES", "PRODUCTO", "SALDO_GARANTIA", "SECTORISTA", "TASA") = (2012, 857508.04, 45000, 858078.39, 570.35, 2002000321, 1, 2858, 2880, 2004-04-13:00:00:00, 2004-04-13:00:00:00, 2004-01-14:00:00:00, 1, 7, , 230, 0002, 0, 4481.53, 651.6, 0, 2, 1, 0, 13144.13, 268 , 0, dfigue, 31.37) does not exist in table PROD2.CREDITO_SALDO_PRYCTDO
    ORA-01403: no data found
    ORA-26786: A row with key ("ITEM") = (2) exists but has conflicting column(s) "FCHA_ULT_PROC", "FLAG_PRCSO", "HORA_FIN", "HORA_INI" in table PROD2.CIE_PROYCCION_PRVSION
    ORA-01403: no data found
It's also worth mentioning that once the error occurs, I stop the apply process, delete the errors, and then start apply again at the destination; the errors gradually disappear and the destination data resynchronizes, but this happens constantly.
Is this due to some configuration I'm missing?
    Thanks for your opinions.
    Regards.
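Incidentally, the clear-and-restart cycle described above can be scripted with DBMS_APPLY_ADM once the underlying row differences have been repaired; a sketch (the apply process name APPLY_PROD2 is hypothetical):
BEGIN
  -- retry everything currently in the error queue for this apply process
  DBMS_APPLY_ADM.EXECUTE_ALL_ERRORS(apply_name => 'APPLY_PROD2');
  -- or discard the queued errors instead, matching the manual cleanup above:
  -- DBMS_APPLY_ADM.DELETE_ALL_ERRORS(apply_name => 'APPLY_PROD2');
END;
/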

• Oracle Streams Replication 10g

    Hi.
We are planning to use 10g Streams replication for our production environment, but we are concerned about a few things before we move ahead with this option. Our prior experience with 9i Streams replication was not that good:
9i didn't have the functionality to do some advanced maintenance tasks that could be required in a production environment. We are hoping 10g will be a better option for Streams.
However, the following are a few questions that we have regarding this product. We request your valuable input on Oracle 10g Streams replication.
--> How effective is this tool compared to the prior version, 9i?
--> How stable is Streams in 10g? Are there still any known critical issues with Streams replication in 10g, e.g. bugs or patches that are under review?
--> For replication of a production environment, would it be an ideal choice compared to third-party vendors like GoldenGate or SharePlex?
Your response would be greatly appreciated.
Thanks
Ratheesh

Hi Ratheesh,
--> How effective is this tool compared to the prior version, 9i?
Very good. I use Streams in many production environments, even for bidirectional replication. Streams is very stable in 10g.
--> For replication of a production environment, would it be an ideal choice compared to third-party vendors like GoldenGate or SharePlex?
It depends on several factors. It's better than GG and SharePlex by far (IMHO).
Madhu Tumma has some great Streams notes on this in his book "Oracle Streams":
    http://www.rampant-books.com/book_2004_2_streams.htm
    Hope this helps. . . .
    Donald K. Burleson
    Oracle Press author

  • Oracle Streams 10g

    Dear all,
I am trying to find a full example of setting up Oracle Streams 10g. I found a lot of good examples on Metalink, but for Oracle 9i, not 10g.
    I highly appreciate your help.
    Wessam

Hi: Besides the initial setup, Streams has a lot of "gotchas." We have spent a lot of time researching performance issues with Streams and how to avoid them. We provide a free PowerPoint presentation on the issues. We also documented some best practices. Here are the links:
    http://oraclemagician.com/white_papers/streams_perf.pdf
    http://oraclemagician.com/white_papers/streams.pps
    Let me know if this proves to be helpful.
    Best,
    Chris Lawson
    OracleMagician.com

• Reconfigure Oracle Streams in case of server move from 192 to 191

    Hi All,
We have a bidirectional Oracle Streams setup between two databases.
Recently, we moved our server from cin192 to cin191. After the server move, we checked all the Streams processes.
The capture process shows 'Waiting for Dictionary Redo: First SCN XXXXXXXXX' on both databases.
When I checked this SCN, I found it in a cin192 archive log.
Can you please help me resolve this issue?
Do we need to reconfigure Streams, or can we assign a new SCN to the capture process, without dropping anything, using the cin191 server's archive log files?
That is, how do we point the capture process at the new server's archive log files?
Any help would be appreciated.
It's urgent... please help.
    Thanks,
    Singh

    Hi Singh,
If I knew what cin191 and cin192 are, I would probably be able to redirect you to the right forum.
If you are looking for Oracle Streams, I suggest you try the Database - General forum: General Database Discussions
If it is Oracle replication you are looking for, please check here: Replication
    This is the Berkeley DB High Availability forum ( http://www.oracle.com/technology/documentation/berkeley-db/db/ref/rep/intro.html )
    Bogdan

  • RESETLOGS using Oracle Streams

    Hi all,
I have lost all my redo log files. I need to open the database, and I think I will need to use the RESETLOGS option. I'm using Oracle Streams; do you think Streams will work fine after that if I use RESETLOGS?
    Tks,
    Paulo.

Yes, it is. You may read my article (Note ID: 431430.1) on Metalink; it addresses a similar issue.
