JDBC XMLType Insert/Update Performance

I am writing a backend Java EE application that inserts or updates XML documents in a table backed by XMLType. I'm using Oracle 11.2.0.3 with the matching drivers and XDK, with connections obtained from an oracle.jdbc.pool.OracleDataSource. Unfortunately, I can't get my application to perform faster than 7 transactions per second, with or without JDBC batching. I've tried several different table structures; here are the two that have shown the most promise:
CREATE TABLE mrbook_raw (
  id  RAW(16),
  created_date TIMESTAMP,
  last_modified_ts TIMESTAMP,
  bk_typ_cd VARCHAR2(16),
  bk_invt XMLType,
  PRIMARY KEY(id),
  FOREIGN KEY (id) REFERENCES mrbook(id)
) XMLTYPE COLUMN bk_invt ELEMENT "http://www.mrbook.com/BookData/Vers1#BK_INVT";
-- Also tried the following:
CREATE TABLE book_master OF XMLTYPE
XMLTYPE STORE AS SECUREFILE BINARY XML
VIRTUAL COLUMNS (
  isbn_nbr AS ( XmlCast(
                XmlQuery('declare namespace plh="http://www.mrbook.com/BookData/Vers1/Header";
                          declare namespace bk_invt="http://www.mrbook.com/BookData/Vers1/InventoryData";
                          /bk_invt:BK_INVT/plh:HDR/plh:ISBN_NBR'
                          PASSING object_value RETURNING CONTENT) AS VARCHAR2(64)) ),
  book_id AS ( XmlCast(
                XmlQuery('declare namespace plh="http://www.mrbook.com/BookData/Vers1/Header";
                          declare namespace bk_invt="http://www.mrbook.com/BookData/Vers1/InventoryData";
                          /bk_invt:BK_INVT/plh:HDR/plh:BOOK_ID'
                          PASSING object_value RETURNING CONTENT) AS VARCHAR2(64)) )
);
Here is the code that inserts into and updates the first table:
    private static final String INS_MRBOOK_RAW =
            "INSERT INTO MRBOOK_RAW " +
                    "(id, bk_invt, last_modified_ts, created_date) " +
            "VALUES (?, ?, ?, ?)";
    private static final String UPD_MRBOOK_RAW =
            "UPDATE MRBOOK_RAW " +
            "SET " +
                "bk_invt = ?, " +
                "last_modified_ts = ? " +
            "WHERE id = ?";
    protected void updateBookRaw(BookRecord record, Connection con) {
        PreparedStatement stmt = null;
        SQLXML sqlxml = null;
        Timestamp now = new Timestamp(System.currentTimeMillis());
        try {
            stmt = con.prepareStatement(UPD_MRBOOK_RAW);
            // Write the existing inventory DOM into a SQLXML value
            sqlxml = con.createSQLXML();
            DOMResult result = sqlxml.setResult(DOMResult.class);
            result.setNode(record.getExistingInvtData());
            stmt.setSQLXML(1, sqlxml);
            stmt.setTimestamp(2, now);
            stmt.setBytes(3, record.getId());
            stmt.executeUpdate();
        } catch (SQLException e) {
            throw new RuntimeException(e);
        } finally {
            if (sqlxml != null) {
                try {
                    sqlxml.free();
                } catch (SQLException e) {
                    log.error("Unable to free SQLXML!", e);
                }
            }
            if (stmt != null) {
                try {
                    stmt.close();
                } catch (SQLException e) {
                    log.error("Unable to close statement!", e);
                }
            }
        }
    }
    protected void insertBookRaw(BookRecord record, Connection con) {
        PreparedStatement stmt = null;
        SQLXML sqlxml = null;
        Timestamp now = new Timestamp(System.currentTimeMillis());
        XMLDocument bookDoc = parser.getInvtDataDoc(record.getNewBook());
        try {
            stmt = con.prepareStatement(INS_MRBOOK_RAW);
            // Write the freshly parsed inventory DOM into a SQLXML value
            sqlxml = con.createSQLXML();
            DOMResult domResult = sqlxml.setResult(DOMResult.class);
            domResult.setNode(bookDoc);
            stmt.setBytes(1, record.getId());
            stmt.setSQLXML(2, sqlxml);
            stmt.setTimestamp(3, now);
            stmt.setTimestamp(4, now);
            stmt.executeUpdate();
        } catch (SQLException e) {
            throw new RuntimeException(e);
        } finally {
            if (sqlxml != null) {
                try {
                    sqlxml.free();
                } catch (SQLException e) {
                    log.error("Unable to free SQLXML object!", e);
                }
            }
            if (stmt != null) {
                try {
                    stmt.close();
                } catch (SQLException e) {
                    log.error("Unable to close statement!", e);
                }
            }
        }
    }
I've tried every storage method possible: CLOBs, binary XMLType, and structured storage, but I just cannot get the application to go faster than 7 records per second.
I understand that this may or may not be an XMLType question, but I don't know where to start. From everything above, it looks like I should be getting good performance when inserting and updating XMLType records; I do indeed get pretty good performance on retrieval, but not on insert or update. Does anyone have suggestions on what I might try, or a reference I could use to get started?

Perhaps a more specific question: should I use PreparedStatements with SQLXML, or should I use the OracleXMLSave class? Are SQLXML types batchable under Oracle?
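In case it helps frame that last question, below is a minimal sketch of the batched variant I have been experimenting with. It binds the document text with setString instead of SQLXML (I am assuming a string bind is accepted for an XMLType column and that very large documents would need a CLOB bind instead); the serializeDoc helper and the batch size of 50 are placeholders of mine, not anything from the Oracle documentation.

    // Sketch only: batch the inserts, binding the XML as plain text instead of SQLXML.
    // serializeDoc(...) is a hypothetical helper that returns the document as a String.
    protected void insertBooksBatched(List<BookRecord> records, Connection con) throws SQLException {
        Timestamp now = new Timestamp(System.currentTimeMillis());
        try (PreparedStatement stmt = con.prepareStatement(INS_MRBOOK_RAW)) {
            int pending = 0;
            for (BookRecord record : records) {
                stmt.setBytes(1, record.getId());
                // assumption: the database parses a string bind into the XMLType column
                stmt.setString(2, serializeDoc(parser.getInvtDataDoc(record.getNewBook())));
                stmt.setTimestamp(3, now);
                stmt.setTimestamp(4, now);
                stmt.addBatch();
                if (++pending == 50) {            // arbitrary batch size
                    stmt.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                stmt.executeBatch();
            }
            con.commit();                         // assuming auto-commit is off on the pooled connection
        }
    }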

Similar Messages

  • Merge vs Insert & Update Performance Comparison

I am trying to do an insert and an update to a target table from a source table (Oracle 11g). It's OLTP. Will MERGE be better, or separate INSERT and UPDATE statements?

    Hi,
    Welcome to the forum!
    user3893088 wrote:
I am trying to do an insert and an update to a target table from a source table (Oracle 11g). It's OLTP. Will MERGE be better, or separate INSERT and UPDATE statements?
No one can say whether one will necessarily be better than the other, especially without seeing exactly what you're doing.
    For performance questions, see the forum FAQ
    SQL and PL/SQL FAQ
    #3 "How to improve the performance"
I would guess that MERGE would be faster, because it only needs to scan the source table once. MERGE will probably be simpler to code, and easier to read and understand as well, so maintenance will take less time.
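To make the comparison concrete, a MERGE issued through JDBC might look something like the sketch below; the table and column names (src_books, tgt_books, id, title) are invented for the example.

    // Sketch only: upsert rows from a source table into a target table with a single MERGE.
    String mergeSql =
        "MERGE INTO tgt_books t " +
        "USING src_books s ON (t.id = s.id) " +
        "WHEN MATCHED THEN UPDATE SET t.title = s.title " +
        "WHEN NOT MATCHED THEN INSERT (id, title) VALUES (s.id, s.title)";
    try (Statement stmt = con.createStatement()) {
        int rows = stmt.executeUpdate(mergeSql);  // the source table is scanned once
        con.commit();
    }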

JDBC: delete, insert, update

hi
If there is one feed from SAP, we can insert or update database records (insert if not already present, otherwise update). But can we also delete, or is a separate interface needed for deleting a record from the database table?
Thanks

The Statement node is mapped to the repeating segment from the source structure.
The action comes as a separate field from the source structure, since inserts and updates can be done implicitly with the key logic, but deletes have to be specified explicitly by a source field indicating whether a particular record needs to be deleted.
So suppose a flag in the source structure comes as DELETE: does another receiving Statement node need to be created, given that the table for insert, update, or delete is the same and only the action differs, with the action coming as a flag from the source structure?
What mapping logic do we need at the Statement level to create this conditional node?
    Thanks
    Edited by: sap_logic on Sep 20, 2010 9:54 PM

  • How to perform insert, update and delete in a table component

    hi all,
I am using a table component in my page. I want to retrieve data from multiple tables as well as perform insert, update, and delete operations. The changes should be reflected in the corresponding tables. Can anyone provide a solution for my problem?
    Thanks in advance
    regards,
    prasant

There is a great tutorial on inserting, updating and deleting records in a table.
    http://developers.sun.com/prodtech/javatools/jscreator/learning/tutorials/2/inserts_updates_deletes.html
    Hope it helps.
    Thanks,
    Moumita

  • Oracle RAC - Not getting performance(TPS) as we expect on insert/update

    Hi All,
We have a problem while executing insert/update and delete queries on an Oracle RAC system: we are not getting the TPS we expected from Oracle RAC. The TPS of Oracle RAC (for insert/update and delete) is lower than that of a single Oracle system.
But while executing select queries, we are getting almost double the TPS of a single Oracle system.
We have done server-side and client-side load balancing.
Does anyone know how to solve this strange behaviour? Do we need to change any other settings in ASM or on the Oracle nodes for better performance on insert/update and delete queries?
    The following is the Oracle RAC configuration
OS & Hardware: Windows 2008 R2, Core 2 Duo 2.66 GHz, 4 GB
Software: Oracle 11g R2 64-bit, Oracle Clusterware & ASM, Microsoft iSCSI initiator.
Storage simulation: Xeon, 4 GB, 240 GB, Windows 2008 R2, Microsoft iSCSI Target
    Please help me to solve this. We are almost stuck with this situation.
    Thanks
    Roy

    Load Profile Per Second Per Transaction Per Exec Per Call
    ~~~~~~~~~~~~ ------------------ ----------------- ----------- -----------
    DB time(s): 48.3 0.3 0.26 0.10
    DB CPU(s): 0.1 0.0 0.00 0.00
    Redo size: 523,787.9 3,158.4
    Logical reads: 6,134.6 37.0
    Block changes: 3,247.1 19.6
    Physical reads: 3.5 0.0
    Physical writes: 50.7 0.3
    User calls: 497.6 3.0
    Parses: 182.0 1.1
    Hard parses: 0.1 0.0
    W/A MB processed: 0.1 0.0
    Logons: 0.1 0.0
    Executes: 184.0 1.1
    Rollbacks: 0.0 0.0
    Transactions: 165.8
    Instance Efficiency Indicators
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Buffer Nowait %: 93.74 Redo NoWait %: 99.96
    Buffer Hit %: 99.99 Optimal W/A Exec %: 100.00
    Library Hit %: 100.19 Soft Parse %: 99.96
    Execute to Parse %: 1.09 Latch Hit %: 99.63
    Parse CPU to Parse Elapsd %: 16.44 % Non-Parse CPU: 84.62
    Shared Pool Statistics Begin End
    Memory Usage %: 75.89 77.67
    % SQL with executions>1: 71.75 69.88
    % Memory for SQL w/exec>1: 75.63 71.38

  • Array Insert/Update/Select in JDBC

I've been using array processing in ODBC, OCI, and Pro*C for years and desperately need to do so in JDBC. I've thus far been unsuccessful and am beginning to doubt that it's supported, which to me is unfathomable. I've likewise found many similar inquiries on the net, none of which were addressed. There have been many respondents who don't understand what array processing is and mistake it for batched SQL statements. They are not the same. Batched SQL statements are separate statements which are executed in a single call to the database engine. What I'm talking about is using a prepared statement and binding primitive arrays to the statement. A single statement is executed and the contents of the arrays are passed to the database. I conducted tests years ago, and array insert/update is many times faster than batched statements. Does anyone know whether or not JDBC supports array processing? If anyone's interested, I can provide snippets via email which illustrate how this is done in ODBC and other APIs.
    Thanks.

You referred me to http://java.sun.com/products/jdbc/download.html. The only reference I found to arrays was SQL Arrays - not what I'm talking about. See the prior C/ODBC snippet. Do you know how to do this in JDBC?
You are talking about passing an array as a single parameter to and from the underlying database, correct? (And that has nothing to do with batch processing.)
If so, see section 16.4 "Array Object"; looking at that section gives the following reference:
    The Array object returned to an application by the ResultSet.getArray and
    CallableStatement.getArray methods is a logical pointer to the SQL ARRAY
    value in the database; it does not contain the contents of the SQL ARRAY value.
    The above has nothing to do with batch processing (although presumably one could use it in a batch process but then one can use String as well.)
    Of course perhaps there is something in there that says that only applies to batch processing. If so could you please point out the section and quote the text.
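To be clear about the terminology, the sketch below is what batching a single prepared statement looks like in standard JDBC; whether the driver ships the queued bind sets as a true array operation in one round trip is driver-specific, so treat that as an assumption. The emp table and the parallel arrays are invented for the example.

    // Sketch: one PreparedStatement, many bind sets, one executeBatch call.
    void insertEmployees(Connection con, int[] ids, String[] names) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement("INSERT INTO emp (id, name) VALUES (?, ?)")) {
            for (int i = 0; i < ids.length; i++) {
                ps.setInt(1, ids[i]);          // bind one row's values
                ps.setString(2, names[i]);
                ps.addBatch();                 // queue this bind set client-side
            }
            int[] counts = ps.executeBatch();  // single call; the driver may send the binds as arrays
            con.commit();
        }
    }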

  • Cannot perform insert/update on tabular form, because of dynamic action

    Hello all,
I have created a dynamic action which computes a value from several tabular form cells.
This functionality works nicely: when I change the value in an associated cell, the computed value is updated by the dynamic action.
But I am not able to insert or update the row in the tabular form when the dynamic action is enabled. When I set its condition to "Never", the row is inserted or updated without any problems.
Any guess where the problem is?
    Apex version: 4.1.1.00.23
    Jiri

    Ok, below are details related to my DA. Sorry for the poor description in previous posts.
Triggered by: Change event
    Selection type: jQuery Selector
    jQuery Selector
td[headers='SPECIESCODE'] input,td[headers='MEASURETYPE'] input,td[headers='TRUNKLOGID'] input,td[headers='LOGCLASS'] input,td[headers='LENGTH'] input,td[headers='PIECES'] input,td[headers='DIAMETER'] input
True Actions:
    *1.*
    //get element id which was changed by user
    var v_elementid = (jQuery(this.triggeringElement).attr('id'));
switch(v_elementid.substr(0,3)) {
      case "f05":
       //assign value from triggered element to hidden field     
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).val());   
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val()); 
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val()); 
       break;
      case "f06":
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val()); 
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val());      
       break;
      case "f07":
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val()); 
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val());
       break;
}
*2.*
    //get element id which was changed by user
    var v_elementid = (jQuery(this.triggeringElement).attr('id'));
switch(v_elementid.substr(0,3)) {
      case "f09":
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val());      
       break;
      case "f12":
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val());
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val());
       break;
      case "f13":
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val());
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='DIAMETER'] input").val());  
       break;
      case "f14":
       $s("P210_HIDDEN_DIAMETER",jQuery(this.triggeringElement).val());
       //assign actual values also to other hidden elements     
       $s("P210_HIDDEN_MEASURE",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='MEASURETYPE'] input").val());
       $s("P210_HIDDEN_TRUNKLOGID",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='TRUNKLOGID'] input").val());
       $s("P210_HIDDEN_SPECIES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='SPECIESCODE'] input").val());
       $s("P210_HIDDEN_LENGTH",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LENGTH'] input").val()); 
       $s("P210_HIDDEN_PIECES",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='PIECES'] input").val()); 
       $s("P210_HIDDEN_LOGCLASS",jQuery(this.triggeringElement).parents("tr:first").find("td[headers='LOGCLASS'] input").val());
       break;
}
*3.*
    call PL/SQL function (values from hidden fields are used as input params)
    begin
    :P210_HIDDEN_VOLUME :=  function(:P210_HIDDEN_SPECIES,:P210_HIDDEN_MEASURE,:P210_HIDDEN_TRUNKLOGID,:P210_HIDDEN_LOGCLASS,:P210_HIDDEN_LENGTH,:P210_HIDDEN_DIAMETER,:P210_HIDDEN_PIECES, :G_USERLANG);
end;
*4.*
Here I want to set the computed value from the hidden field into the VOLUME cell in the tabular form. And it seems there is some problem, because not only the corresponding VOLUME cell is changed, but also several other cells. I realized that several cells which are hidden from the user have the same value as the computed VOLUME field, including ROWID, and then the insert/update operations are not done.
    // find the VOLUME-Field and set it to the computed value
jQuery(this.triggeringElement).parents("tr:first").find("td[headers='VOLUME'] input").val($v("P210_HIDDEN_VOLUME"));
I hope I have provided more information and somebody will be able to help me. Maybe the problem is obvious, but with my poor jQuery knowledge I am not able to find it.
    Thanks in advance!
    -Jiri
    Edited by: Jiri N. on Aug 10, 2012 3:24 AM

  • How can perform insert /update /delete in one single mapping.

    Hi,
I want to know whether there is any logic by which we can create 2-3 pipelines in a mapping, such that the pipelines handle insert / update / delete or store some rejected data according to a conditional flag.
I tried it in a mapping, but the problem is that when the target load order is insert, then update, then delete/reject: if a new record comes, control passes through the insert target, but if a record needs to be updated or deleted, control still goes to the insert target, not to the update / delete target.
We have already set all the conditional flags in a filter after the lookup and before the target.
We checked all possibilities but did not have any success.
The last option is to separate the mappings for insert / update / delete, etc.
Is there any solution for this type of problem?
Please reply if anybody has a solution.
    ---Umesh

    Hi Umesh,
I understand from your query that you want to load the target with insert, update and delete rows after running the mapping...
If that is what you are looking for, then you can use one of the Oracle features, Oracle Streams Change Data Capture.
The URL is:
    http://www.oracle.com/technology/products/bi/db/10g/pdf/twp_cdc_cookbook_0206.pdf
    If any other help required do reply.
    Regards
    Tarang Jain

  • JDBC and Hung Update Operations

    Hi,
I am learning the JDBC interface in Java with an Oracle database and have noticed a strange problem when I perform a simple update (i.e. insert) operation (using bind variables of a PreparedStatement).
I am currently running Oracle database version 9.2.0.1.0 on Solaris 10, with my client program running on Sun JVM 1.5 with the JDBC ojdbc5_g.jar. The database instance runs on Solaris, and the JDBC thin driver runs on either Windows or Linux in an internal LAN (using TCP). I can connect and perform queries from my JDBC code just fine; I can also perform the usual operations from SQL*Plus on the database host as well as remotely from the machines where I run the JDBC code, using the SQL*Plus client.
Now, if the JDBC code attempts to perform a modification operation AND any user connected to the database through SQL*Plus also performs some sort of update operation on the "same" table that the JDBC code is trying to modify, then the JDBC code hangs. The two operations are being performed at different times. Just to give you a scenario in which the code does or does not remain responsive:
If I log in via SQL*Plus and connect to the database but perform solely "select" statements on my designated table, and then run the JDBC updating code, everything works fine. The insert is done accordingly, even on subsequent updates via JDBC, as long as the SQL*Plus user performs only "select" queries.
As soon as I perform a modification operation (e.g. delete the row inserted by the JDBC code) in SQL*Plus, any subsequent attempt to, say, insert a row from the JDBC code causes the operation to hang (just to note, I can acquire the connection in this scenario, but the actual execution of the operation from JDBC hangs, no response) -- even if I wait half an hour and run my JDBC program, the operation hangs.
The strange thing is, to remedy this, the user that has performed some sort of modification operation on the table via SQL*Plus has to "disconnect" from the database. As soon as the SQL*Plus user disconnects, the JDBC operation goes through and the program successfully performs the update.
By the way, I am also using a DataSource in my code (without JNDI) to establish connections, but I've tried DriverManager in my code with the same result. One last remark is that the isolation level has been set to READ COMMITTED, if that has anything to do with it (but neither of the updating operations from JDBC or SQL*Plus is being done at the same time -- as I mentioned, I even waited a few minutes between each run). Perhaps SQL*Plus holds a lock on the rows of the table for some reason, and once the user disconnects, the JDBC program goes through (shrug).
    Any hint would be appreciated.

    The issue is that by default, a SQL-PLUS session operates in
    'autocommit false' mode, so when you do any sort of update,
    your SQL-PLUS session has a transaction and will hold an
    exclusive lock on anything you've updated, until you commit,
    or disconnect. The JDBC session, or any second separate
    concurrent SQL-PLUS session will also be blocked.
    Joe Weinstein at BEA Systems
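The JDBC side follows the same rule: with auto-commit off, an uncommitted update holds its row locks until commit or rollback. A minimal sketch (the emp table is invented for the example):

    // Sketch: commit (or roll back) promptly so row locks are released for other sessions.
    con.setAutoCommit(false);
    try (PreparedStatement ps = con.prepareStatement("UPDATE emp SET name = ? WHERE id = ?")) {
        ps.setString(1, "Smith");
        ps.setInt(2, 42);
        ps.executeUpdate();   // the updated row is now locked by this transaction
        con.commit();         // releases the lock
    } catch (SQLException e) {
        con.rollback();       // also releases the lock
        throw e;
    }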

  • Insert / update data to a table through DBLINK (oracle)

I try to insert / update a table from one instance of an Oracle database to another through an Oracle dblink, and get the following error:
    java.sql.SQLException: ORA-01008: not all variables bound
    ORA-02063: preceding line from MYLINK
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
    at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:582)
    at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1986)
    at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1144)
    at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2152)
    at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:2035)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2876)
    at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:609)
The same code to insert / update the exact same table in the local instance works fine, with no binding problem. So I am pretty sure all ? markers in the SQL are set to some value before the statement is sent to Oracle.
Can someone please advise what the possible problem is? Is the DB link not set up correctly, or can we not update a remote table over a dblink?
By the way, I can insert / update the remote table through the DBLINK from TOAD. The problem happens only in the Java code.
    thanks!
    Gary

A dblink links one database instance to another.
So it is certainly a possible source of problems when the code works on one database and not another.
You should start by looking at the dblink and, if possible, testing it in the database, not via Java.
Note as well that the error suggests it is coming from the Oracle database. I believe that if you had a bind parameter problem in your Java code, the error would come from the driver. But that is a guess on my part.
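For reference, ORA-01008 generally means at least one placeholder never received a value before execution. Below is a minimal sketch of a fully bound insert over a dblink; remote_books and its columns are invented, and MYLINK is just the link name quoted in the error above.

    // Sketch: every ? must be bound before executeUpdate, including for statements
    // that target a remote table through a database link.
    String sql = "INSERT INTO remote_books@MYLINK (id, title) VALUES (?, ?)";
    try (PreparedStatement ps = con.prepareStatement(sql)) {
        ps.setInt(1, 101);             // bind #1
        ps.setString(2, "Dubliners");  // bind #2 -- leaving any bind unset raises ORA-01008
        ps.executeUpdate();
        con.commit();
    }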

  • Can you create nested condition in merge(e.g. insert/update) using owb

    Hi,
Does OWB9iR2 allow you to build a nested condition into a merge? Such as:
If no match on col1 and col2 then
  if col3 matches then no new sequence insert <---
  else insert a new sequence;
else (there is a match on col1 and col2)
  update col3 and the sequence.
I have an incremental load for a lookup table, where insert/update is used. There are two match columns and a surrogate key is used. When there is no match, it shall not insert a sequence if there is a match on the third column. I cannot use the 3rd column in the original match, because it shall be updated when there is a match on the two match columns.
I am trying to avoid using a transformation because of the performance impact. Thanks

Hi, I think the misleading thing is that in PL/SQL you can use booleans, which is not possible in SQL. So in a PL/SQL transformation, this is OK:
    a:= case when not orgid_lkup( INGRP1.ORG_ID )
    then get_supid(..)
    else ...
    but, the following SQL does not work:
    select case when not orgid_lkup( INGRP1.ORG_ID )
    then get_supid(..)
    else ...
    into a
    from dual;
    I ended up using only 0/1 as boolean return values for these reasons;
    so I can have:
    select
    case when orgid_lkup( INGRP1.ORG_ID ) = 0 then ...
    though true booleans are better if you don't have to embed them in SQL.
    Antonio

  • Insert,update and delete data in a table using webdynpro for abap

    Dear All,
I have a requirement to create a table allowing the user to add rows to it, update a row, and delete a row from that table. To do this I guess I have to make use of an ALV, but using an ALV I am not able to enter data into the table, whereas I can make a column editable, delete a row, etc. Please guide me on how to perform these operations (insert, update and delete) on the table.
    Thanks,
    Reddy.

    Hi Sridhar,
By using ALV you can do all the insert, delete, etc. operations. If you want to edit, I mean you can enter data in the ALV.
    Check this...
    http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3133474a-0801-0010-d692-81827814a5a1
    Editing alv in web dynpro
    editing rows in alv reports
    Re: editing rows and columns in alv reports in webdynpro abap
    Cheers,
    Kris.

  • Insert, update and delete trigger over multiple Database Links

    Hello guys,
    first of all I'll explain my environment.
    I've got a Master DB and n Slave Databases. Insert, update and delete is only possible on the master DB (in my opinion this was the best way to avoid Data-inconsistencies due to locking problems) and should be passed to slave databases with a trigger. All Slave Databases are attached with DBLinks. And, additional to this things, I'd like to create a job that merges the Master DB into all Slave DB's every x minutes to restore consistency if any Error (eg Network crash) occurs.
    What I want to do now, is to iterate over all DB-Links in my trigger, and issue the insert/update/delete for all attached databases.
This is possible with the command "execute immediate", but it requires me to create textual strings with textually coded field values for the above-mentioned commands.
What I would like to know is whether there are any better ways to provide these functions. It is important to me that all DB links are read dynamically from a table and that I don't have to do unnecessary string generation, which might affect performance.
    I'm thankful for every Idea.
    Thank you in advance,
    best regards
    Christoph

Well, I've been using MySQL for a long time, yes, but I thought that this approach would be the best for my requirements.
Materialized views don't work for me, because I need real-time updates of the slaves.
So, sorry for asking so generally, but what would be the best technology for the following problem:
I've got n globally spread systems. Each of them can update records in the database. The easiest way would be to provide one central DB, but that doesn't work for me, because when the WAN connection fails, the system is no longer available. So I need to provide core information locally at every system (connected via LAN).
It is very important to me that data remain consistent. That means it must not happen that 2 systems update the same record in 2 different databases at the same time.
    I hope you understand what I'd need.
    Thank you very much for all your replies.
    best regards
    Christoph
PS: I forgot to mention that the databases won't be very large, just about 20k records and about 10 queries per second during peak times, and only 1 table needs to be synced.
    Edited by: 907142 on 10.01.2012 23:14

  • Oracle 11g: Oracle insert/update operation is taking more time.

    Hello All,
    In Oracle 11g (Windows 2008 32 bit environment) we are facing following issue.
1) We are inserting/updating data in some tables (4-5 tables) and we are firing queries at a very high rate.
2) After some time (say 15 days under the same load) we notice that the Oracle operations (insert/update) are taking more time.
Query 1: How do we find out whether Oracle is actually taking more time on insert/update operations?
Query 2: How do we rectify the problem?
    We are having multithread environment.
    Thanks
    With Regards
    Hemant.

    Liron Amitzi wrote:
    Hi Nicolas,
    Just a short explanation:
    If you have a table with 1 column (let's say a number). The table is empty and you have an index on the column.
When you insert a row, the value of the column will be inserted into the index. Inserting 1 value into an index with 10 values in it will be fast. It will take longer to insert 1 value into an index with 1 million values in it.
My second example was: take the same table and say I insert 10 rows and delete the previous 10 from the table. I always have 10 rows in the table, so the index should stay small. But this is not correct. If I insert values 1-10, then delete 1-10 and insert 11-20, then delete 11-20 and insert 21-30, and so on, then because the index is sorted, where 1-10 were stored I'll now have empty spots. Oracle will not fill them up. So the index will become larger and larger as I insert more rows (even though I delete the old ones).
The solution here is simply to rebuild the index once in a while.
    Hope it is clear.
    Liron Amitzi
    Senior DBA consultant
    [www.dbsnaps.com]
[www.orbiumsoftware.com]
Hmmm, index space not reused? Index rebuild once in a while? That was what I understood from your previous post, but nothing is less sure.
    This is a misconception of how indexes are working.
I would suggest reading the following interesting doc; there are a lot of nice examples (including index space reuse) to understand, and in conclusion:
    http://richardfoote.files.wordpress.com/2007/12/index-internals-rebuilding-the-truth.pdf
    "+Index Rebuild Summary+
    +•*The vast majority of indexes do not require rebuilding*+
    +•Oracle B-tree indexes can become “unbalanced” and need to be rebuilt is a myth+
    +•*Deleted space in an index is “deadwood” and over time requires the index to be rebuilt is a myth*+
    +•If an index reaches “x” number of levels, it becomes inefficient and requires the index to be rebuilt is a myth+
    +•If an index has a poor clustering factor, the index needs to be rebuilt is a myth+
    +•To improve performance, indexes need to be regularly rebuilt is a myth+"
    Good reading,
    Nicolas.

  • JDBC adapter - no UPDATE to database

    Hi,
I have the scenario ABAP Proxy to XI to JDBC, to read some data from a database and return some info back to BW. Is it mandatory to UPDATE the selected records in some way, or can the UPDATE part of the adapter be disabled by putting something in the UPDATE parameter on the communication channel?
    Cheers
    Colin.

    Hi Colin,
    if you are talking about a sender JDBC adapter: you can put <TEST> in the update statement.
    See this snippet from the help:
Update SQL Statement
    You have the following options:
    Enter a valid SQL statement that is to be applied to the database once the data (determined from the Query SQL Statement) has been successfully sent to the Integration Server/PCK.
    It must be an INSERT, UPDATE, or DELETE statement.
    In place of the SQL statement, you can also enter <TEST>. Once the data determined from Query SQL Statement has been successfully sent, the data in the database remains unaltered.
This is recommended if the data has not only been read, but also changed by a stored procedure entered under Query SQL Statement.
    http://help.sap.com/saphelp_nw04/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm
If you want to synchronously call a receiver JDBC adapter, I think you can use action=SELECT without an update.
    Regards
    Christine
