Sequence in Transaction

Hi, I want to put a sequence in a transaction, meaning that if I roll back the transaction I want the value of the sequence to roll back as well. Is that possible?
Regards,
Sovon.

No.
If you're trying to create a gap-free sequence, you're not going to be able to do it this way. Aside from being unable to roll back a sequence increment, there are a number of other issues that will eventually cause you trouble.
Justin
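To illustrate Justin's point, here is a quick SQL sketch of the behaviour (the sequence name is made up for the example):

```sql
-- Sequence increments in Oracle are never transactional.
CREATE SEQUENCE demo_seq START WITH 1 INCREMENT BY 1;

SELECT demo_seq.NEXTVAL FROM dual;  -- returns 1
ROLLBACK;                           -- does NOT undo the increment
SELECT demo_seq.NEXTVAL FROM dual;  -- returns 2; the rolled-back value leaves a gap
```

This is by design: making NEXTVAL transactional would serialize every transaction that touches the sequence.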

Similar Messages

  • Oracle Receivables -Document sequence for transaction

    Hi Guys,
I have one requirement: a separate invoice series is maintained for domestic and international customers.
How can we maintain separate document sequences for the transaction document number for domestic and international customers? Please provide a solution.
    Thanks
    Guru Prasad.

    Hello.
Create two Transaction Types and two Sequences, and assign each sequence to its own transaction type.
    Octavio

  • Sequences and Transactions

    I'm not quite clear about the advantages/disadvantages of protecting sequences with transactions. I know that using an explicit transaction for sequence accesses will reduce concurrency, but I'm not sure to what extent. Could someone please elaborate on this?
    Also, if I have two threads accessing a single sequence concurrently, each with their own transaction, is it possible that they will obtain the same element due to the isolation restriction? What happens if one thread aborts its transaction, are the sequence elements that it obtained returned to the queue of available elements, or discarded?
    Thanks,
    Patrick

    Hello,
    Are the questions unclear? Should I try to clarify?
    Thanks,
    Patrick

  • Weblogic Eclipselink Sequence Table Connection Pool Sequence Separate transaction while JTA on main transaction

    Hi,
    And thanks in advance for your support.
In WebLogic 12, getting the EclipseLink connection sequencing mechanism to work when tables are used for sequencing entity IDs seems to be complicated.
    QUICK REFERENCE:
    http://www.eclipse.org/eclipselink/api/2.5/org/eclipse/persistence/config/PersistenceUnitProperties.html
The concept:
While EJBs, MDBs, etc. run on a JEE container, be it GlassFish or WebLogic, it should be possible to have the main thread's transaction managed as part of JTA global transactions by the container.
Namely, pumping messages to JMS queues, persisting entities, etc.
Meanwhile, it should also be possible, while the transaction is ongoing, to fetch and update entity IDs from sequencing tables.
For this very purpose, EclipseLink provides persistence.xml properties, such as the now-deprecated eclipselink.jdbc.sequence-connection-pool set to "true".
This option largely avoids deadlocks by allowing EclipseLink to fetch a non-JTA-managed connection, do its pseudo two-phase "lock, read table, update table" against the database, and fetch a new sequence value.
The same mechanism under JTA is a disaster. A transaction that creates ten different entities might do ten reads and updates on this table, while a competing transaction might be trying to do the same. It is a guaranteed deadlock with minimal stress on the environment.
    Under glassfish, for example, tagging a persistence.xml with :
    <persistence-unit name="MY_PU" transaction-type="JTA">
            <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
            <jta-data-source>jdbc/DERBY_DS</jta-data-source>
            <non-jta-data-source>jdbc/DERBY_DS</non-jta-data-source>       
            <properties>           
                <property name="eclipselink.jdbc.sequence-connection-pool" value="true" />
            </properties>
    </persistence-unit>
works wonders when entities use TABLE sequencing.
Under WebLogic, say you are using the Derby embedded XA driver with two-phase commit, deploying the application immediately leads to:
    Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.3.3.v20120629-r11760): org.eclipse.persistence.exceptions.DatabaseException
    Internal Exception: java.sql.SQLException: Cannot call commit when using distributed transactions
    Error Code: 0
      at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:324)
      at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicCommitTransaction(DatabaseAccessor.java:426)
      at org.eclipse.persistence.internal.databaseaccess.DatasourceAccessor.commitTransaction(DatasourceAccessor.java:389)
      at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.commitTransaction(DatabaseAccessor.java:409)
      at org.eclipse.persistence.internal.sequencing.SequencingManager$Preallocation_Transaction_Accessor_State.getNextValue(SequencingManager.java:579)
      at org.eclipse.persistence.internal.sequencing.SequencingManager.getNextValue(SequencingManager.java:1067)
      at org.eclipse.persistence.internal.sequencing.ClientSessionSequencing.getNextValue(ClientSessionSequencing.java:70)
      at org.eclipse.persi
While WebLogic is right that there might be a distributed transaction ongoing, it is mistaken in assuming that the connection requested by EclipseLink for generating the ID should be part of the global transaction.
EclipseLink provides other ways to attempt to configure the sequencing mechanism, for example by stating a non-JTA transaction.
I have also attempted using these properties, both with the original DERBY_DS data source that uses the XA driver, and later with a new data source I created on purpose to try to work around the sequencing issue.
    For example:
    <!--property name="eclipselink.jdbc.sequence-connection-pool.nonJtaDataSource" value="jdbc/DERBY_SEQUENCING_NON_JTA" /-->
                <!--property name="eclipselink.connection-pool.sequence.nonJtaDataSource" value="jdbc/DERBY_SEQUENCING_NON_JTA" /-->
    This new DERBY_SEQUENCING_NON_JTA is explicitly configured to use a NON_XA driver with global transactions flag set to disabled.
Regardless, the only thing I get out of this is that the application deploys and is super fast, up to the point where I stress it with a system test that introduces some degree of concurrency, and then I see the deadlocks on the sequencing table.
    Meaning that the ongoing transactions are holding tight to their locks on the sequencing table.
    Is this a known issue?
    Is there something I am missing in the configuration?
It really should not be this difficult to get EclipseLink to run its sequence reads and updates in a transaction separate from the main JTA transaction, but so far it looks impossible.
    Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.3.3.v20120629-r11760): org.eclipse.persistence.exceptions.DatabaseException
    Internal Exception: java.sql.SQLTransactionRollbackException: A lock could not be obtained within the time requested
    Error Code: 30000
    Call: UPDATE ID_GEN SET SEQ_VALUE = SEQ_VALUE + ? WHERE SEQ_NAME = ?
      bind => [2 parameters bound]
    Query: DataModifyQuery(name="MyEntity_Gen" sql="UPDATE ID_GEN SET SEQ_VALUE = SEQ_VALUE + ? WHERE SEQ_NAME = ?")
    Many thanks for your help.

Are you calling the CMP bean code and your new SQL code under the same transactional context?
The following setting
"rollbackLocalTxUponConnClose=true"
will make the connection pool call the rollback method on the connection object before returning it to the pool. In your SQL code, if you call connection.close(), then your entire transaction will be rolled back.
A CMP bean requires a transactional connection while communicating with the database.
What is the sequence of code execution?
I think you must be calling the SQL code first and then the CMP bean code later.
You may avoid this problem in the following way (this is my guess based on my understanding of your code execution):
1. Set rollbackLocalTxUponConnClose=false.
2. Execute the SQL code and CMP code in a single transaction (in a single session bean method with a CMT or BMT transaction). Call tx.rollback() if it is BMT, or tx.setRollbackOnly() if it is CMT. This way you will have control over rolling back the transactions.
    Hope this helps you.
    bmt-> bean managed transaction
    cmt-> container managed transaction.
    Regards,
    Seshi.
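For reference, here is a minimal persistence.xml sketch that combines the settings the poster describes: JTA for the main work, with sequencing pointed at the separate non-XA data source. The JNDI names are taken from the thread; whether this behaves correctly on WebLogic is exactly what the thread is asking.

```xml
<persistence-unit name="MY_PU" transaction-type="JTA">
    <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
    <jta-data-source>jdbc/DERBY_DS</jta-data-source>
    <!-- sequence table reads/updates go through this non-XA data source -->
    <non-jta-data-source>jdbc/DERBY_SEQUENCING_NON_JTA</non-jta-data-source>
    <properties>
        <property name="eclipselink.jdbc.sequence-connection-pool" value="true"/>
    </properties>
</persistence-unit>
```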

  • How to change focus sequence in transaction VA01?

    Hi all,
after upgrading our systems from 4.6c to ECC 6.0, the focus sequence in the table control of transaction VA01 has changed.
After entering a material and pressing Enter, the focus switched to the Order Quantity column of the same row in our 4.6c system.
In ECC 6.0, pressing Enter causes the focus to go to Material in the next row.
Is there any way to change the focus sequence without making a modification?
Using the Tab key is no alternative for our users.
    Regards,
    Hubert

    I think that I've found some usable solution.
There is the input check "module vbap_bearbeiten_ende on chain-request." for the fields rv45a-mabnr and rv45a-kwmeng of dynpro 4900/SAPMV45A.
The form vbap_bearbeiten_ende, which is called by the PAI module of the same name, provides several enhancement points.
When I create a warning message in an implementation of an applicable enhancement point, the desired behaviour should come back for our users.
    SAP Note 0001309393 will also solve my problem.
    Edited by: Hubert Heitzer on Jul 15, 2009 9:48 AM

  • Problem: PL/SQL + Stored Procedure + Sequence + Trigger + Transaction + Violation Key

I get a key violation when I insert data from a stored procedure. Why?
    This is my script :
    1) The script of the table :
    CREATE TABLE OLLMULTI (
         IDO int NOT NULL ,
         IDL int NOT NULL ,
         T1 varchar2 (2) NOT NULL ,
         IDR int NOT NULL ,
         Constraint pk_outilsllmulti PRIMARY KEY (IDO, IDL, T1) ,
         Constraint u_outilsllmulti unique (IDR) );
2) Now, I want to manage an auto-increment field on IDR:
    I create a sequence :
    create sequence OLLMULTI_sequence increment by 1 start with 1;
    I create the trigger :
    create or replace trigger OLLMULTI_trigger
    before insert on OLLMULTI
    for each row when (new.IDR is null)
    begin
    select OLLMULTI_sequence.nextval into :new.IDR from dual;
    end;
3) Now I create my stored procedure; in the procedure I want to insert 6 rows:
Procedure Insert_OLLMULTI( i_IDO in OLLMULTI.IDO%type)
is
pragma AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 2, 'GJ');
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 5, 'ND');
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 12, 'AC');
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 120, 'AH');
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 10, 'ZG');
  INSERT INTO OLLMULTI (IDO, IDL, T1) VALUES (i_IDO, 5, 'RB');
  commit;
EXCEPTION
  WHEN OTHERS THEN
    rollback;
    raise;
END;
    4) The problem :
The key violation on the constraint u_outilsllmulti sometimes appears on a random insert, never the same one!
Why? I think the sequence is the problem... Or is the problem the "pragma AUTONOMOUS_TRANSACTION" statement?
Can anyone help me?

Two ideas:
- Is it possible that there are already records in the table that were created without using the sequence? A sequence initially starts at 1; if you already have data, you first have to increment it past the existing values.
- Might some inserts happen outside your procedure? The trigger allows IDs to be created without using the sequence.

Correct, there are already records in my table! Thanks, that's right.
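Given that diagnosis, one common fix is to move the sequence past the values already in the table. A hedged sketch using the thread's own object names (the START WITH value is a placeholder; use MAX(IDR)+1 from your own data):

```sql
-- Find the highest IDR already present in the table.
SELECT MAX(IDR) FROM OLLMULTI;

-- Recreate the sequence so NEXTVAL starts above that maximum
-- (substitute the value returned above; 1001 is just an example).
DROP SEQUENCE OLLMULTI_sequence;
CREATE SEQUENCE OLLMULTI_sequence INCREMENT BY 1 START WITH 1001;
```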

  • Get error from SQL AGENT: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024

I am running a SQL Agent job that executes an SSIS process from SQL Server server1. The SSIS process executes its SQL/tables/stored procedures against another SQL Server, server2.
I get an error after adding data flow tasks with TransactionOption set to Supported inside a sequence container with TransactionOption set to Required. The error: "The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 'The transaction manager has disabled its support for remote/network transactions'".
Prior to adding this sequence container everything was working from SQL Agent, and there were other sequence containers with OLE DB destinations.
Everything works when running within the SSIS package.
I found this article on a similar issue:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/0bfa2569-8849-4884-8f68-8edf98a9b4fe/problem-to-execute-a-package-and-tasks-with-a-certain-transactionoption-property-i-need-help?forum=sqlintegrationservices
"I had a similar issue and solved it by setting the following on both machines: Allow Remote Clients, Allow Remote Administration, Allow Inbound Clients, Allow Outbound Clients, and TIP are enabled in Component Services/My Computer/Properties/MSDTC/Security Configuration."
I don't want to remove Transaction Required from the "Remove Duplicates from Staging" sequence container.
Has anyone seen this?
    Greg Hanson

DTC was running on the remote computer. The problem was that it was no longer accepting transactions from remote servers. This was in SSIS, so I had to set TransactionOption to Supported for all data flow tasks.
    Greg Hanson

  • What are some best practices for Effective Sequences on the PS job record?

    Hello all,
I am currently working on an implementation of PeopleSoft 9.0, and our team has come up against a debate about how to handle effective sequences on the job record. We want to fully grasp the best way to leverage this feature from a functional point of view. I consider it a process-related topic: we should establish rules for the sequence in which multiple actions are inserted into the job record with the same effective date, and we then have to train our HR and Payroll staff to correctly sequence these transactions.
    My questions therefore are as follows:
    1. Do you agree with how I see it? If not, why, and what is a better way to look at it?
    2. Is there any way PeopleSoft can be leveraged to automate the sequencing of actions if we establish a rule base?
    3. Are there best practice examples or default behavior in PeopleSoft for how we ought to set up our rules about effective sequencing?
    All input is appreciated. Thanks!

As you probably know by now, many PeopleSoft configuration/data (not transaction) tables are effective dated. This allows you to associate a dated transaction on one day with a specific configuration description for that date, and a different transaction on a different date with a different configuration description. Effective dates are part of the key structure of effective dated configuration data.

Because the effective date is usually the last part of the key structure, it is not possible to maintain history for effective dated values when the data for those configuration values changes multiple times in the same day. This is where effective sequences enter the scene: they allow you to maintain history regarding changes in configuration data when there are multiple changes in a single day.

You don't really choose how to handle effective sequencing. If you have multiple changes to a single setup/configuration record on a single day and that record has an effective sequence, then your only decision is whether or not to maintain that history by adding a new effective sequenced row or updating the existing row. Logic within the delivered PeopleSoft application will use either the last effective sequence for a given day or the sequence stored on the transaction; which value is used depends on whether the transaction also stores the effective sequence. You don't have to make any implementation design decisions to make this happen, and you don't determine the values or how to sequence transactions. Sequencing is automatic: each new row for a given effective date gets the next available sequence number, and if there is only one row for an effective date, that row has a sequence number of 0 (zero).

  • System did not take VK11 call transaction if I have many call transaction

    Hi, experts, please help,
My program calls transaction VK11 repeatedly (one by one). The calls all use the same condition table (e.g. A005). When I run using mode = 'N', each call transaction returns sy-subrc = 0, but the condition records are not created. When I change to mode = 'D', the condition records are created.
Since VK11 is a very simple transaction, my feeling is that the VK11 BDC process runs too fast and the backend SAP system is not able to handle the calls one by one.
    Any one has same problem?  Any idea?
    Thanks.

    Hi,
After calling the transaction, check the messages table. If there are no errors, then do this:
DO 5 TIMES.
  COMMIT WORK AND WAIT.
  CHECK sy-subrc IS INITIAL.
  EXIT.
ENDDO.
If this does not work either, add an explicit wait statement:
COMMIT WORK AND WAIT.
WAIT UP TO 4 SECONDS.
We also had a similar problem when we tried to call a sequence of transactions, i.e. if the first transaction succeeds then call the second.
But even when the first transaction succeeded, the second one was failing because of the time lag in the database update. Even the DO loop did not work for one transaction. After that we used both COMMIT WORK AND WAIT and WAIT UP TO 5 SECONDS. Now it is working fine.
Hope this will resolve your issue.
    Thanks,
    Vinod.

  • Free goods by header net value not Quantity

    Regards
My client wants to run a sales promotion using SD free goods. The difference is that the free goods should not be granted based on the purchased quantity of a certain material; they should be granted based on the header net value of the order.
For example:
If a customer has an order with a header net value of $100, made up only of materials that have a certain material pricing group, they should receive one free good. (All the materials in the order have the same material pricing group.)
If a customer has an order with a header net value of $200 from such materials, they should receive two free goods.
If there is a way to make this happen, please let me know; I would really appreciate it.
    Thanks
    Edited by: Ronald Caroli on Feb 13, 2008 9:13 PM

    Hi ronald,
The user exit option is very tough to do.
You are using the right user exit, but how are you coding it? What is your design and flow?
BDCs are programmable recordings of a standard transaction. Hence you will be able to record the way in which you add a line item to a sales order. This recording is automatically converted into lines of code by SAP. Then you insert your checks and balances between these lines of code to make the sequence of transactions behave the way you want.
My thought was to make a recording of adding a new line item, in the change sales order transaction (VA02), with item category Free of Charge (FOC). Then, in the code, I would check the material pricing group; if correct, I would find the net value of the sales document. If the value is above the limit, calculate the target quantity, then follow the rest of the recording and add a new line with FOC as the item category.
But this would be a separate transaction, applied separately rather than automatically in the sales order. That way, if you switch off the promotion tomorrow, you will not have to worry about removing the functionality from the sales order; you just need to deactivate the transaction.
There would still be the issue of hard-coding the net value limits and free goods quantity in the user exit, which is not at all recommended.
    Hope this helps.
    Abhishek

  • Issue WHT Certificate-reg

    Dear Friends,
I have an issue with TDS certificate printing.
I have configured WHT in our company.
I performed the following sequence of transactions:
1. Vendor down payment via F-48 for Rs. 50000/-; TDS is deducted at 10% and SC at 5%.
2. Challan update via J1INCHLN.
3. Bank update via J1INBANK.
Then:
4. I posted a vendor invoice for Rs. 500000/-; TDS and SC at the required rates are deducted and posted.
5. I cleared the down payment of Rs. 50000/- via t-code F-54.
6. Then I ran J1INCHLN and J1INBANK for the invoice document.
The issue is this:
The TDS certificate shows two line items, one for Rs. 50000/- with TDS and surcharge, the other for Rs. 500000/- with TDS and surcharge to the extent of Rs. 450000/-.
My query: the TDS and surcharge shown here are correct, but instead of showing Rs. 450000/- in the second line item, the certificate shows the full value of Rs. 500000/- (which is the invoice amount).
It seems misleading that the vendor has received Rs. 550000/- in total instead of Rs. 500000/-.
Can anybody clarify what went wrong?
    Points will be awarded.
    Thanks and Regards,
    Sathish

    Hi SSQ,
Once you post the invoice for 15000/-, the system will reverse the 10000/- down payment TDS.
For this you need to go to the "Central Invoice" tab and select the "Central Invoice & First Payment" radio button in the withholding tax type for payment.
    Central invoice & 1st partial payment
    If you set this indicator, the "central invoicing" concept applies to this withholding tax type.
    In the case of partial payments, the full withholding tax amount is deducted from the first partial payment.
    "Central invoicing" is a special method for dealing with line items that are linked to other, dependent documents like customer or vendor memos (credit memos, debit memos, down payment clearings or partial payments). This link is entered when the dependent documents are posted by specifying the invoice reference fields document number, fiscal year and line item (to which the reference is made).
    Note
    This field should only be used if you selected "Minimum check at item level" since otherwise the base minimum amount check takes place automatically for all line items in the invoice document.
    Use
    Applies only in Argentina for withholding tax on tax on sales/purchases.
Best Regards
    Ashish Jain

  • Credit memo and MAP issue

    Hello Guru's
I have two issues, and I am unable to understand the system logic for them.
Issue 1:
We have an intercompany scenario.
The sequence of transactions is as follows:
1. Create a return purchase order (document type NB), intercompany purchase.
2. Post the return goods receipt, movement type 161.
3. Post a credit memo with respect to the return purchase order.
4. Cancel the movement type 161 document, using movement type 161.
The system allows me to cancel the document created using movement type 161.
Step 4 should not be allowed, but the system is allowing me to do it; please advise how to stop this.
Issue 2:
I have a strange MAP issue; the MAP is fluctuating very drastically.
The scenario is as follows:
Store stock is 10 ea, stock value is 1000, MAP is 100.
DC MAP is 25.
When we make a return STO from store to DC, the stock is issued at DC MAP (condition type P101).
Assume we issue 9 quantity from the store; then
store stock is 9 ea, stock value is (1000-(25*9)) = 775, MAP is 775/1 = 775.
This transaction causes a major fluctuation in the store MAP (100 changed to 775).
Please advise if there is any way we can control this behaviour.
I understand that there is a setting which, in case of major MAP changes, will post the amount to the PRD account.
I'd appreciate it if you can guide me.
    Regards
    Amit

    Hi,
For your first scenario:
For a return PO of IC NB type, after creating the PO you need to deliver it via VL10G. For the delivery, the system will allow you to create the PGR. Prior to these steps you need to return the goods using movement type 161 in MIGO. Please check that this 161 stock is posted to stock in transit; after PGR the system will clear the stock in transit.
Now, after clearing the stock from transit, try cancelling the MIGO 161 document.
For the second query:
Please review your question: "When we make return STO from store to DC, the stocks are issued at DC MAP (condition type P101). Assume we issue 9 quantity from Store, then store stock is 9 ea, stock value is (1000-(25*9)) = 775, MAP is 775/1 = 775. This transaction is causing major fluctuation in store MAP (100 changed to 775)."
Please review the third line.
    Regards,

  • Account is blocked for posting, while releasing the billing doc to FI

    Hi,
While releasing the document to FI, I am getting the error "Account is blocked for posting".
The GL account does not contain any entry for that company code. What can be the reason for this error?
    Regards
    Rudra

    HI,
It might sound a bit different to you, but even so, have a look at the following settings:
Transaction code VKOA: check the GL account that the system determines on the basis of the access sequence.
Transaction code FS00: check whether the GL account is blocked for posting.
    Regards
    Jaydip

  • SM35 Re-processing of Incorrect/Error session in Background Mode

    Hi all,
I am trying to reprocess a session in SM35 which is in Error status. Here I am facing an issue: if I reprocess the session in background mode, it says that batch input data is not available for a screen which was already processed. Ideally it should start from the screen where it threw the error in the first run. If I select mode Foreground or Display Errors, the session is processed correctly.
Note:
If the session is in Ready to Process state, then processing in background works perfectly fine. Has anyone come across this problem?
Has anyone tried reprocessing an incorrect session in background mode?
Regards,
    Antony

    Thanks Sandra,
Indeed I understood the problem. The problem is that transaction KEBC is always executed correctly, and since it is processed correctly it is removed from the BDC queue.
Transaction KEBC sets a memory parameter, and transaction KEU2 first checks whether that memory parameter is initial; if it is, KEU2 pops up a screen and makes the user enter it. In my case KEBC always runs successfully.
As you rightly suggested, when the session is in error, transaction KEBC is not called again to set the memory parameter, so the pop-up from transaction KEU2 has to be entered by the user. (The BDCDATA currently doesn't handle this pop-up.)
I could have omitted the KEBC transaction, but handling this pop-up from transaction KEU2 is tricky since it only appears if the memory parameter is initial. Unfortunately, the pop-up is the first screen in the sequence for transaction KEU2, and nothing can be done at the coding level to call KEU2 with the pop-up in all scenarios (even when the memory variable is set)!
So I am in a dilemma about how to handle this. What I am suggesting to the functional consultant is to let the session be run only in background mode (as it will always be a new internal session, the memory variable will always be initial), and I will record this pop-up in my BDC, omitting transaction KEBC.
Do you have any other solution for this?
Or is there any option to reprocess even the successful transactions?

  • Performance tuning in BDB Replication

    Hi,
I have set up Berkeley DB replication with 1 master and 5 replicas for session management in our web application, with web servers behind a load balancer. Somehow I get stale session data from some of the replicas. Our application cannot live with stale data. Currently our environment is configured with the following settings:
cur_env.SetEventNotification(new Env.EventNotifyFcn(event_callback));
cur_env.RepMgrAckPolicy = Env.RepAckPolicy.NONE;
//cur_env.SetVerbose(Env.DbVerb.Replication, true);
cur_env.RepPriority = _config.priority;
cur_env.RepMgrSetLocalSite(_config.listener.host, _config.listener.port);
foreach (RepHostInfoObj hostentry in _config.hosts)
    cur_env.RepMgrAddRemoteSite(hostentry.host, hostentry.port, Env.RepSiteFlags.None);
/*
 * We can now open our environment, although we're not ready to
 * begin replicating. However, we want to have a dbenv around
 * so that we can send it into any of our message handlers.
 */
cur_env.CacheSize = new CacheSize(0, 500, 0); // 500MB cache
cur_env.SetFlags(EnvFlags.TxnNoSync, true);
SHARED_DATA _data = new SHARED_DATA();
_data.ismaster = 1;
cur_env.SetPrivateData(_data);
cur_env.Open(_config.home, Env.OpenFlags.Create | Env.OpenFlags.Recover |
    Env.OpenFlags.ThreadSafe | Env.OpenFlags.InitRep | Env.OpenFlags.InitLock | Env.OpenFlags.InitLog |
    Env.OpenFlags.InitMPool | Env.OpenFlags.InitTxn, 0);
cur_env.RepMgrStart(3, _config.startpolicy);
Please check whether the above configuration is good.
My application needs good performance with no stale data.
    Kindly help.
    Thanks,
    Karthik

    Data appearing on replica sites naturally lags behind the master,
    simply because it takes some time to transmit across a network, and
    apply updates to the database.
    Would you rather have read operations on the replica wait until they
    "catch up" to some certain point in the sequence of transactions
    generated by the master?
Your ack policy of NONE allows the master to race ahead of the
replicas without bound. If you were to use the ALL ack policy, then
commit operations at the master would wait until all replica
sites had received the transaction, and you would get closer to having
all sites progress in sync. Of course, that slows down the master.
    Is this the kind of "stale" data you're talking about, or something
    more long-term?
    Alan Bram
    Oracle
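The ack-policy change Alan suggests would look like this in the poster's C# API style (a sketch, not verified against a specific Berkeley DB .NET binding version; it trades master commit throughput for tighter replica consistency):

```csharp
// Wait for all replicas to acknowledge each commit at the master
// (contrast with the NONE policy in the original configuration).
cur_env.RepMgrAckPolicy = Env.RepAckPolicy.ALL;
```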
