Need suggestion for Oracle data replication/integration/transfer strategies

The client's OLTP database (Oracle 10g on Solaris) is connected to a web-based 'Point of Sale' system where some of their insurance-related transactions (around 2000 per day) are stored. Most of their insurance-related transactions are stored in a remote central system (accessible over the intranet).
Based on the net worth of a transaction, there is a requirement to either transfer the transaction in real time or batch it, send it across to a remote Oracle 10g (on Solaris) staging area (part of the remote central system), and fire some stored procedures on that remote staging area to process the transferred records. Some amount of configurable data massaging would also be needed before firing the stored procedures. The outcome of the whole process also needs to be tracked.
The client is interested in automating this process to the extent possible using the various possible replication/data transfer strategies (dblink, Data Pump, ETL, replication kit, MQ, XML/HTTP etc., with their pros and cons), taking into consideration the standard security and bandwidth related constraints.
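For illustration, the simplest dblink-based variant of this flow might look like the sketch below; the link name STAGING_LINK, the tables POS_TXN and TXN_STAGE, the TRANSFER_STATUS column and the remote procedure PROCESS_STAGED_TXNS are all hypothetical placeholders, not part of the actual system.

CREATE DATABASE LINK staging_link
  CONNECT TO stage_owner IDENTIFIED BY stage_pwd
  USING 'REMOTE_STAGE_TNS';

CREATE OR REPLACE PROCEDURE push_txn_batch IS
BEGIN
  -- copy transactions that have not been transferred yet to the remote staging table
  INSERT INTO txn_stage@staging_link (txn_id, txn_date, net_worth, payload)
  SELECT txn_id, txn_date, net_worth, payload
    FROM pos_txn
   WHERE transfer_status = 'PENDING';

  -- mark them as sent on the local side
  UPDATE pos_txn
     SET transfer_status = 'SENT'
   WHERE transfer_status = 'PENDING';

  -- fire the remote stored procedure that massages and processes the batch
  process_staged_txns@staging_link;

  -- distributed transaction: both databases commit or roll back together
  COMMIT;
END push_txn_batch;
/

The same shape can serve the near-real-time case by calling the procedure from a frequently scheduled job instead of a nightly batch.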

Hi,
There are lots of solutions available.
You can evaluate the following possible solutions for your scenario:
1.) Data Pump Network Exports/Imports (see the sketch after this list)
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php#NetworkExportsImports
2.) Data replication using DBMS_COMPARISON
http://nadvi.blogspot.in/2011/11/data-replication-using-dbmscomparison.html
http://www.bash-dba.com/2012/02/data-synchronization-replication-using.html
3.) Complete Schema Refresh using Data Pump
http://www.in-oracle.com/Oracle-DBA/DBA-I/schema-refresh-Data-Pump.php
4.) Oracle GoldenGate
5.) Oracle Streams
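To make option 1 concrete, a network-mode import can be driven entirely from PL/SQL with DBMS_DATAPUMP; no dump file is written because the rows are pulled straight over the database link. This is only a rough sketch, and the schema name POS_OWNER, the link name SRC_LINK and the job name are assumptions:

DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- open a network-mode import job that pulls data directly over the db link
  l_handle := DBMS_DATAPUMP.open(
                operation   => 'IMPORT',
                job_mode    => 'SCHEMA',
                remote_link => 'SRC_LINK',
                job_name    => 'POS_NET_IMPORT');

  -- restrict the job to the schema that owns the POS transactions
  DBMS_DATAPUMP.metadata_filter(
    handle => l_handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''POS_OWNER'')');

  DBMS_DATAPUMP.start_job(l_handle);
  DBMS_DATAPUMP.wait_for_job(l_handle, l_state);  -- blocks until the job finishes
  DBMS_DATAPUMP.detach(l_handle);
END;
/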
Let me know if you have any queries on the same :)
Thanks
Hitgon

Similar Messages

  • I need options for data replication within production db and dimensional db

    Hi,
    I'm looking for options on how to solve this issue. We've 2 databases, one is our production, operative database, used by around 400 users at a time, and another one, which is our dimensional model of the same info, used to obtain reports. We also have a lot of ETL's (extract, transform and load) processes running every night to update the dim model.
    My problem is that we have some online reports, and at the moment we're getting their data from the operational database, causing a performance issue in online operations. We want to migrate these reports to the dimensional model, and we're trying to find the best options for doing this.
    Options that we're considering are ETL's process running continuously every XX minutes, materialized views, ETL's on demand, and others.
    Our objective is to minimize performance issues on transactional database.
    We're using Oracle 8i (yes, the oldie one) and Reporting Services as report engine (reports just run a pkg to get data).
    Any option is welcome.
    Thx in advance.
    Regards,
    Adrian.

    "The best option for you if the performance is the most important is ORACLE STREAMS. It is also the most complex, but the final results are very good."
    Agreed. As User12345 points out, though, that requires Oracle 9.2 or higher.
    "Another option is materialized views with fast refresh, which need materialized view logs on the master site. The first load is expensive, but if you refresh every 15 minutes the cost is not high."
    I'd be careful about making that sort of statement. The overhead of both maintaining materialized view logs (which have to be written synchronously with the OLTP transactions and which impose an overhead roughly equivalent to a trigger on the underlying table) and doing fast refreshes every 15 minutes can be extensive, depending on the source system. One of the reasons that Streams came about was to limit this overhead.
    "For refresh I execute a cron shell that runs DBMS_MVIEW.REFRESH. My experience with refresh groups was not good."
    What was your negative experience with refresh groups? I've used them regularly without serious problems. Manual refreshes of individual materialized views against an OLTP system would scare the pants off me, because you'd inevitably end up with transactionally inconsistent views of the data (i.e. child records present with no parent record, updates that affect multiple tables partially replicated until the next refresh, etc.). Trying to move that sort of inconsistent data into a different data model, and trying to run reports off that data, would seem highly challenging at a minimum. Throwing everything into a single refresh group so that all the materialized views are transactionally consistent, or choosing a handful of refresh groups for tables that are related to each other, seems like a far easier way to build a system.
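    For illustration only, a minimal sketch of the refresh-group approach, assuming two hypothetical master tables ORDERS and ORDER_ITEMS on the OLTP side reached through a db link called OLTP_LINK:

    -- on the OLTP (master) site: logs required for fast refresh
    CREATE MATERIALIZED VIEW LOG ON orders      WITH PRIMARY KEY;
    CREATE MATERIALIZED VIEW LOG ON order_items WITH PRIMARY KEY;

    -- on the reporting site: fast-refreshable materialized views
    CREATE MATERIALIZED VIEW orders_mv
      REFRESH FAST ON DEMAND
      AS SELECT * FROM orders@oltp_link;

    CREATE MATERIALIZED VIEW order_items_mv
      REFRESH FAST ON DEMAND
      AS SELECT * FROM order_items@oltp_link;

    -- one refresh group, so parent and child are always refreshed to the same point in time
    BEGIN
      DBMS_REFRESH.make(
        name      => 'rep_group',
        list      => 'orders_mv, order_items_mv',
        next_date => SYSDATE,
        interval  => 'SYSDATE + 15/1440');  -- every 15 minutes
    END;
    /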
    Justin

  • Refresh System Data in Integration Directory PI 7.11

    Hey,
    I have set up a new technical system and a business system in the SLD, but I still can't find them in the Integration Directory.
    I know from a later PI version that there was an option to transfer SLD data to the Integration Directory, but now I can't find it anymore.
    Can anybody help?
    Thanks!
    Michael

    "I have set up a new technical system and a business system in the SLD, but I still can't find them in the Integration Directory. I know from a later PI version that there was an option to transfer SLD data to the Integration Directory, but now I can't find it anymore."
    Tools (or right-click in ID) --> Assign Business System --> follow the wizard... it will assign the business system from SLD to ID.

  • Please Help: PI Data Dependent Integration Builder Authorizations NOT Working

    Dear Friends / Experts,
    I have spent many days exploring all the weblogs and links on this website and have implemented all the steps required to achieve data-dependent Integration Builder security, but I am not successful so far. I am just giving up now - please help me.
    As I said, I have already read all the important forum links and SAP web links and followed each and every step - service.sap.com/instguidesNW04 -> Installation -> SAP XI
    Security Requirement - Data Dependent/Object Level Authorizations in XI / PI
    In distributed teams or in a shared PI environment it might be necessary to limit authorization for a developer or a group of developers to only one Software Component or objects within a Software Component or to specific Configuration Objects.
    Our Environment - PI 7.0 SP 16
    - Created a new role in the Integration Builder Design
    - Added object types of any Software Component and Namespace
    - Enabled usage of Integration Builder roles in the Exchange Profile (Integration Builder - Integration Builder Repository): parameter com.sap.aii.util.server.auth.activation set to true
    - Assigned users to the newly created Integration Builder roles
    - Created dummy roles in Web AS ABAP; these roles are then available as groups in Web AS Java
    - Assigned users to these roles
    - Assigned the Integration Builder roles to the above groups in Web AS Java
    - Assigned unrestricted roles to Super Users
    Please help - How to validate whether Data Dependent Authorizations are Activated?
    I am working with the XI developers and the Basis team, and we have updated all the required exchange profile parameters.
    Per the document "User Authorizations in Integration Builder Tools": do we need to update server.lockauth.activation in the exchange profile? When we updated it, it removed edit access for all XI developers in PI.
    In both the Integration Repository and the Integration Directory, you can define more detailed authorizations that restrict access to design and configuration objects.
    In both tools, you define such authorizations by choosing Tools -> User Roles from the menu bar. The authorization for this menu option is provided by role SAP_XI_ADMINISTRATOR_J2EE. Of course, this role should only be granted to a very restricted number of administrators. To activate these more detailed authorizations, you must set exchange profile parameter com.sap.aii.ib.server.lockauth.activation to true.
    The access authorizations themselves can be defined at the object-type level only (possibly restricted by a selection path), where you can specify each access action either individually as Create, Modify, or Delete for each object type, or as an overall access granting all three access actions.
    http://help.sap.com/saphelp_nw04/helpdata/en/f7/c2953fc405330ee10000000a114084/frameset.htm
    I was able to control display and maintain access via ABAP roles, but have completely failed to implement Integration Builder security.
    Is there any way to check whether data-dependent authorization or J2EE authorizations are activated?
    Thanks a lot
    Satish

    Hello,
    So, to give you the status of our issue:
    We were able to export the missing business component.
    But we also exported some interfaces after that and got some return code 8 errors, due to objects still present in the change list on the quality system (it seems that after the previous failed transports the change list was not cleared completely...).
    So now we have checked that no objects are present in the change list of the quality system, and we plan to export our developments to the quality system again.
    Hopefully after that there will be no more return code 8 errors during imports and all developments will be transported correctly to the quality system.
    I also recommend reading the following, which is pretty good.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/7078566c-72e0-2e10-2b8a-e10fcf8e1a3d?overridelayout=t…
    Thanks all,
    S.N

  • Need help with Desktop Office Integration (DOI)

    Hi all,
    I need help reading an Excel sheet into an internal table.
    It's the first time that I have used SAP DOI. I copied various pieces of code into my report to get a connection to an existing Excel sheet.
    Here is my Coding:
    * first get the SAP DOI i_oi_container_control interface
      CALL METHOD c_oi_container_control_creator=>get_container_control
                        IMPORTING control = gr_control
                                  error   = gr_errors.
      APPEND gr_errors.
    * create a control container as defined in dynpro 100
      CREATE OBJECT gr_container
                EXPORTING container_name = 'CONTAINER'.
    * initialize the SAP DOI Container, tell it to run in the container
    * specified above and tell it to run Excel in-place
      CALL METHOD gr_control->init_control
        EXPORTING
          r3_application_name      =    'Data'
          inplace_enabled          = ' '
          inplace_scroll_documents = 'X'
          parent                   = gr_container
          register_on_close_event  = 'X'
          register_on_custom_event = 'X'
          no_flush                 = 'X'
        IMPORTING
          error                    = gr_errors.
    * save error object in collection
      APPEND gr_errors.
    * ask the SAP DOI container for a i_oi_document_proxy for Excel
      CALL METHOD gr_control->get_document_proxy
                           EXPORTING document_type = 'Excel.Sheet'
    *                       EXPORTING document_type = 'Word.Document'
                                    no_flush = 'X'
    *                                REGISTER_CONTAINER = 'X'
                          IMPORTING document_proxy = gr_document
                                    error          = gr_errors.
      APPEND gr_errors.
    Then I open the document from the local PC.
    CALL METHOD gr_document->open_document
          EXPORTING
    *      document_title   = ld_filenc
            document_url     = ld_verzc
    *      NO_FLUSH         = ' '
    *      OPEN_INPLACE     = ' '
    *      open_readonly    = ' '
    *      PROTECT_DOCUMENT = ' '
    *      STARTUP_MACRO    = ''
    *      USER_INFO        =
    *      ONSAVE_MACRO     =
          IMPORTING
            error            = gr_errors.
    *      RETCODE          =
      APPEND gr_errors.
    Now I start the spreadsheet interface:
    *...check if our document proxy can serve a spreadsheet interface  data:
      DATA: pd_has TYPE i.
      CALL METHOD gr_document->has_spreadsheet_interface
                        EXPORTING no_flush = 'X'
                        IMPORTING is_available = pd_has
                                  error = gr_errors.
      APPEND gr_errors.
      CALL METHOD gr_document->get_spreadsheet_interface
                        EXPORTING no_flush = ' '
                        IMPORTING
                                  sheet_interface = gr_spreadsheet
                                  error = gr_errors.
      APPEND gr_errors.
    * now loop through error collection because
    * Get_spreadsheet_interface flushed and synchronized
    * the automation queue !
      LOOP AT gr_errors.
        CALL METHOD gr_errors->raise_message
                        EXPORTING  type     = 'I'
                        EXCEPTIONS message_raised = 1
                                   OTHERS         = 2.
        IF sy-subrc = 1.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                  WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
          pd_kz_fehler = 'X'.
          EXIT.
        ENDIF.
      ENDLOOP.
      FREE gr_errors.
    OK, now I can open an Excel sheet and I can mark a range in the sheet (rows is a constant).
    CALL METHOD gr_spreadsheet->set_selection
          EXPORTING
            left     = 1
            top      = 2
            rows     = rows
            columns  = 18
    *    NO_FLUSH = ' '
    *    UPDATING = -1
        IMPORTING
          error    = gr_errors.
    *    RETCODE  =
    My first problem: Excel actually opens and the user can see the sheet. I don't want Excel to be visible... is there a way to start Excel in an invisible mode?
    Second problem: the sheet has macros... when Excel starts there is a popup asking whether or not to activate macros... I don't want this popup... is there a way to tell Excel from ABAP that macros should always be activated?
    Third problem: I see that the content table has these components:
    TYPES: BEGIN OF SOI_GENERIC_ITEM,
             ROW(4) TYPE C,
             COLUMN(4) TYPE C,
             VALUE(256) TYPE C,
           END OF SOI_GENERIC_ITEM.
    But my Excel sheet has more than 10,000 lines....
    Fourth (and biggest) problem: I need the selected data in an internal table. The table has the components:
    ROW
    COL
    VALUE
    I don't know how I can do this...
    Please help me.
    Sorry for my bad English.
    Greetings
    Markus

    Hi,
    Maybe this link is useful:
    /people/thomas.jung3/blog/2005/05/11/using-classic-activex-controls-in-the-abap-control-framework
    Also Check out report SAPRDEMOEXCELINTEGRATION2.
    check the links
    http://www.esnips.com/doc/741a848e-f49a-4436-bec4-e21950f6c94c/desktop-office-integration.pdf
    http://www.esnips.com/doc/2080a9ec-64f9-49c4-bd03-d9f56bc2437c/MSWord--Excel-with-ABAP.pdf
    Regards,
    Raj.

  • Problems with HowTo "Creation of BI Master Data in Integrated Planning..."

    Hi,
    one of our customers needs a planning application in which he can create master data for different attributes, insert some key figures for these attributes, and change already-planned key figures.
    For the last two points I've already found a working solution; the first point (creation of master data in a web template) is still a problem. After a little searching, I found a how-to guide, "Creation of BI Master Data in Integrated Planning (IP) through Web Layouts", in which the necessary steps for this are described (please see the following link: http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/607193d5-cdd5-2b10-c699-8ff04c3124f6)
    Does anyone of you know this guide and was able to build a working solution?
    I have different questions, maybe you could help me:
    - what kind of variable is he using?
    - do I need a DataProvider in Web Template?
    - how do I get the connection to the used variable in the button group?
    Thanks for your help.
    Regards
    Tim

    Hi Timo,
    you require two howtos to implement the solution.
    First, please take a look at the first document:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10d2b273-0e12-2c10-fab3-a34bde559f92?QuickLink=index&…
    In chapter 3 you see the prerequisites (note 1101726 and notes 1384495, 1387004 and 136772). With these, you change the system behaviour so that it does not check whether the newly inserted master data is valid (it can't be valid, because the entries are new!).
    And after this please implement the solution from the how to.
    Furthermore you can take a look at this how to, which might help you too:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/607193d5-cdd5-2b10-c699-8ff04c3124f6?QuickLink=index&…
    Even though my first post is a bit old, my customer still uses this solution; at the moment we are on 7.3 SP06, so it works in newer releases too.
    Hope this helps.
    Regards
    Tim

  • Need to insert data in 2 tables thro stored procedure

    I need to create a stored procedure which will insert data into two tables. The procedure will get its inputs from an Oracle Developer Form, and these will be inserted into the tables.
    The 2 tables structure:
    1.FEES_MASTER
    Name Null? Type
    FEES_ID NOT NULL NUMBER(8) -- Primary Key
    CS_ID NOT NULL NUMBER(8) -- Class Student ID; An enrolled student
    REC_DATE NOT NULL DATE -- Fees receipt date
    REC_AMOUNT NOT NULL NUMBER(6) -- Fees receipt amount
    2.FEES_DETAIL
    Name Null? Type
    FEES_ID NOT NULL NUMBER(8) -- Foreign Key
    MONTH NOT NULL DATE -- First of each month to identify fee month
    Scenario:
    A student submits fees for 3 months through Master/Detail related blocks in a Developer Form as
    Under:
    Fees Master
    Fees ID : 11002
    Class Student ID : 356
    Receipt Date : 06-JAN-2001
    Receipt Amount : 1500
    Fees Detail
    Fees ID Fees Month
    11002 01-JAN-2001
    11002 01-FEB-2001
    11002 01-MAR-2001
    I need to check each fees detail record for fees month duplication as well before inserting new records.
    How can this be achieved?
    Thanks in advance.

    Originally posted by Fan Liu ([email protected]):
    "create primary key in the detail table. i think it's FEES_ID + MONTH. then mark the columns in Forms as primary key property true, then call check_record_uniqueness built-in in on-check-unique trigger."
    Thanks very much. But what I need is to ensure transaction integrity through a stored procedure. The system allows a certain CS_ID (i.e. Class Student ID, which is newly assigned to every student annually) to submit fees for 12 months only, because a student stays in a class for a year. The 2 columns in the FEES_DETAIL table form a composite primary key, which only ensures that a certain FEES_ID will not be repeated for the same month. But suppose:
    1)this data already exist in the tables:
    Fees Master
    Fees ID : 11002
    Class Student ID : 356
    Receipt Date : 06-JAN-2001
    Receipt Amount : 1500
    Fees Detail
    Fees ID Fees Month
    11002 01-JAN-2001
    11002 01-FEB-2001
    11002 01-MAR-2001
    2)And this data is currently being inserted:
    Fees Master
    Fees ID : 11300
    Class Student ID : 356
    Receipt Date : 04-FEB-2001
    Receipt Amount : 1500
    Fees Detail
    Fees ID Fees Month
    11300 01-JAN-2001
    11300 01-FEB-2001
    11300 01-MAR-2001
    The data in the 2nd case is perfectly valid, but the application can't allow a student to submit fees for a month for which he has already submitted them. Only a stored procedure can make sure, after checking, that the same student doesn't pay fees for duplicate months. Another reason for my emphasis on a stored procedure is: what if a user tries to insert data through a SQL*Plus session instead of the Form?
    PROBLEM: The problem I am having is that I don't know how the procedure will take input for multiple records from the FEES_DETAIL block in the Form.
    Please assist in this regard. Thanks.
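    For what it's worth, here is one hedged sketch of such a procedure on a reasonably current Oracle release, using a SQL collection type for the detail months; the type and procedure names are hypothetical, and the Form would populate the collection from the FEES_DETAIL block before calling it:

    CREATE OR REPLACE TYPE month_tab AS TABLE OF DATE;
    /

    CREATE OR REPLACE PROCEDURE insert_fees (
      p_fees_id    IN NUMBER,
      p_cs_id      IN NUMBER,
      p_rec_date   IN DATE,
      p_rec_amount IN NUMBER,
      p_months     IN month_tab)
    IS
      l_dup NUMBER;
    BEGIN
      -- reject the whole batch if this student has already paid any of these months
      SELECT COUNT(*)
        INTO l_dup
        FROM fees_master m, fees_detail d
       WHERE d.fees_id = m.fees_id
         AND m.cs_id   = p_cs_id
         AND d.month IN (SELECT column_value FROM TABLE(p_months));

      IF l_dup > 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Fees already submitted for one or more of these months.');
      END IF;

      INSERT INTO fees_master (fees_id, cs_id, rec_date, rec_amount)
      VALUES (p_fees_id, p_cs_id, p_rec_date, p_rec_amount);

      INSERT INTO fees_detail (fees_id, month)
      SELECT p_fees_id, column_value FROM TABLE(p_months);

      -- commit is left to the caller (the Form), so everything stays in one transaction
    END insert_fees;
    /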

  • I have an old Mac Tower.  I need to move data from the unit to an external hard drive. What kind of hard drive do I purchase.  There is not a lot of data to be transferred.

    I have an old Mac tower. I need to move data from the unit to an external hard drive. What kind of hard drive must I purchase so that it works with the tower? There is not a lot to transfer.

    MG,
    What we might need to do is to change the question.  What computer do you plan to use the data on in the future?  For example, if you have a newer tower, just move the hard drive to the new tower.  If you want to use the data, what about an ethernet LAN to get the data over?  If you want to archive the data, how do you store the second drive?  What are your plans for the first drive?  If the new software will not recognize the old file type, the data must be exported.  Just getting files to another hard drive will not finish your journey.
    As Allan said, knowing the exact model and year of your tower is important.  My suggestion is to find a user group near you.  Please post back with more information, including the name of a large city near you.
    For example, our user group just experimented with a 1983 Apple IIc that predates the Mac.  It started right up, read files from 1984  and saved them from a 5.25" floppy to a 3.5" floppy.  That 3.5 floppy will go into a platinum G3 which will read PRODOS files using Apple File Exchange.  That G3, with a USB card, will allow saving the files to a USB flash drive.  But, somewhere along the way, some software has to read the file and convert it to data that is useable by current software.
    Ji~m

  • Failure in the SOAP Runtime: Employee Master Data Replication Using IDoc OTM_EMPL

    During employee master data replication from ECC to CfTE, we encountered the following error:
    Unexpected element -el=OTM_EMPL ns=urn:sap-com:document:sap:idoc:soap:messages
    We are using the 1402 WSDL for Replication of Employee Master Data with Employment Details (humancapitalmanagementmasterd2). The payload was not received in CfTE.
    Can anybody please advise how to resolve this issue?
    Thank you.

    Dear Joselito,
    This issue needs deeper investigation in the customer system.
    Kindly report an incident to SAP Support for further help.
    Regards,
    Rahul Mishra

  • Data replication and synchronization in Oracle 10g XE.

    We are trying to do data replication and synchronization for all our servers. We are using Oracle 10g XE. I guess there are already some features in Oracle for replication, but I am not very sure about them.
    To explain it more clearly: we will have individual database servers in our sub-divisions, then divisions and centers, and then a main server. We need to synchronize at various levels. So if anybody is aware of any techniques, please let me know.

    Hi,
    Could you tell me exactly what kind of synchronisation you are talking about?
    "we will have individual database servers in our sub-divisions, then divisions and centers, and then a main server"
    If you have multiple DB servers, you can connect them by DB links. Also, if you are talking about DB synchronisation, then you can use triggers and materialized views.
    We also have two independent servers which are synchronised (at least at schema level).
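    For example, a minimal sketch of the materialized-view route that works on XE (the link, user and table names below are just placeholders): each sub-division server keeps a periodically refreshed read-only copy of a table on its parent server.

    -- on each sub-division server: a link to its parent server
    CREATE DATABASE LINK parent_link
      CONNECT TO repl_user IDENTIFIED BY repl_pwd
      USING 'PARENT_TNS';

    -- read-only copy of a parent table, automatically refreshed every hour
    CREATE MATERIALIZED VIEW customers_mv
      REFRESH FORCE
      START WITH SYSDATE NEXT SYSDATE + 1/24
      AS SELECT * FROM customers@parent_link;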
    Regards!

  • Data Replication Between Sqlserver and Oracle11g using materialized view.

    I have SQL Server 2005 as my source and Oracle 11g as my target. I need to populate the target daily with change data from the source.
    For that we have created a dblink between SQL Server and Oracle and replicated that table as a materialized view in Oracle.
    The problem we are getting is that the fast refresh option is not available; each day it picks up the full data from the source.
    Is there any way to use fast refresh in this scenario?
    Thanks in advance.
    Regards,
    Balaram.

    Please do not post duplicates - Data Replication Between Sqlserver and Oracle11g using materialized view.
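    For reference: fast refresh needs materialized view logs on the master table, and a non-Oracle master reached through a gateway/ODBC link cannot have them, so a scheduled complete refresh is the usual fallback. A rough sketch (the link and table names are placeholders):

    -- complete-refresh copy of the SQL Server table
    CREATE MATERIALIZED VIEW src_table_mv
      REFRESH COMPLETE ON DEMAND
      AS SELECT * FROM src_table@sqlserver_link;

    -- refresh it once a day at 02:00 via the scheduler
    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'REFRESH_SRC_TABLE_MV',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN DBMS_MVIEW.REFRESH(''SRC_TABLE_MV'', ''C''); END;',
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /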

  • Z field data to be transferred from SO to Purchase Requisition

    Hi
    I am facing a problem.
    I have created a Z field in the sales order and have created the same field in the purchase requisition.
    Now how can I transfer the Z field data from the SO to the PR? What procedure and configuration do I have to do?
    Pls confirm.
    Regards
    Vicky

    Hi, please reply with the user exit I should apply for transferring the Z field data from SO to PR.
    This is a third-party process and we have added a Z field, but we need to transfer its data from SO to PR.
    Which user exit should be used to perform this process?

  • I need to create a bridge to transfer SQL Server query results to my local system

    I need an interim solution. My SQL Server DB is being populated and the code for the primary application is nearing completion.
    I need to collect data from the system before  the formal reporting processes are complete. How would you recommend that I do that? Right now I'm planning to run queries on the server and download them. Is that a good strategy? Is there a better way?
    What would you recommend?

    You may use any of the below options,
    1. Using Linked Server
    http://msdn.microsoft.com/en-us/library/ms188279.aspx
    2. Using OpenRowSet
    http://msdn.microsoft.com/en-us/library/ms190312.aspx
    3. Using SSIS
    http://technet.microsoft.com/en-us/library/ms141134.aspx
    4. Using Replication
    http://msdn.microsoft.com/en-us/library/ms151198.aspx
    Regards, RSingh

  • Attempt to fetch cache data from Integration Directory failed

    HI,
    while checking cache connectivity testing, the status is:
    green: Integration Repository
    green: Integration Directory
    green: Integration Server - JAVA
    red: Adapter Engine af.axd.aipid
    yellow: Integration Server - ABAP
    Jun 30, 2007 1:16:08 PM - Cache notification from Integration Directory received successfully
    Attempt to fetch cache data from Integration Directory failed; cache could not be updated
    [Fetch Data]: Unable to find an associated SLD element (source element: SAP_XIIntegrationServer, [CreationClassName, SAP_XIIntegrationServer, string, Name, is.00.aipid, string], target element type: SAP_BusinessSystem)
    [Data Evaluation]: GlobalError
    what to do?
    and there is nothing under Integration Server and Integration Engine, but there is a green status under Non-Central Adapter Engines; from there I am doing send-message testing from XI to BI,
    send message to: http://aibid:8000/sap/xi/engine?type=entry
    payload:
    <?xml version="1.0" encoding="utf-8"?>
    <ns1:MI_VCNdatatoBI
    xmlns:ns1="http://bi.sap.com"
    xmlns:xsi="http://www.w3.org/2001/XMLSchemainstance">
    <DATA>
    <item>
    </BIC/ZG_CWW010>1000<//BIC/ZG_CWW010>
    </BIC/ZVKY_CHK>1<//BIC/ZVKY_CHK>
    </item>
    </DATA>
    </ns1:MI_VCNdatatoBI>
    I can send a message from there (Component Monitoring > Non-Central Adapter Engines) but am unable to see it in message monitoring or on the BI side.
    dushyant.

    thanks,
    but I have adapter type XI,
    and I am following the steps of this link; according to it there is no need to create a further adapter type. It is almost done, but while sending a message through the configuration monitor in RWB it goes out but does not arrive in message monitoring or on the BI side.
    See topics 4.5 > 3 and 4, and 4.6 > 3, 4, 5.
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f027dde5-e16e-2910-97a4-f231046429f2
    now what to do?
    dushyant,

  • Issue in data replication for one particular table

    Hi,
    We have implemented Streams in our test environment and are testing the business functionality. We have an issue with data replication for only one custom table; for all other tables data replication is working properly with no issues. When we do a 100-row update, data replication does not happen for that particular table.
    Issue to simulate
    Update one row -- replication successful.
    Update 100 rows -- after 3-4 hrs nothing has happened.
    Please let me know did any of you have come across similar issue.
    Thanks,
    Anand

    Extreme slowness on the apply site is usually due to locks, library cache locks, or oversized segments in the Streams technical tables left after a failure during a heavy insert. Those tables are scanned with full table scans, and scanning millions of empty blocks hundreds of times results in a very big loss of performance, but not to the extent you are describing. In your case it sounds more like a form of lock.
    We need more info on this table: LOB segments? Tablespace in ASSM?
    Is the table partitioned, and do you have a job that performs drop partition? Most interesting are the system waits and, above all, the apply server session waits. Given the time frame, I would rather look for a lock or a library cache lock due to a drop partition or concurrent updates. While you are performing the update, you may query DBA_DDL_LOCKS, DBA_KGLLOCK and DBA_LOCK_INTERNAL to check that you are not caught in a library cache lock.
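    For example, while the slow update is running you might check for such locks with queries along these lines (DBA_DDL_LOCKS is created by catblock.sql; the schema and table names are placeholders):

    -- sessions holding or waiting for library cache / DDL locks on the table
    SELECT session_id, owner, name, type, mode_held, mode_requested
      FROM dba_ddl_locks
     WHERE owner = 'APP_SCHEMA'
       AND name  = 'CUSTOM_TABLE';

    -- what the Streams apply server sessions are currently waiting on
    SELECT s.sid, s.event, s.seconds_in_wait
      FROM v$session s, v$streams_apply_server a
     WHERE s.sid = a.sid;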
