PreparedStatement batch updates

I'm trying to use a batch update with a PreparedStatement, but when my code
executes addBatch() I receive an error stating "The JDBC 2.0 method is not
implemented". Does JRockit have any API documentation stating what is and is
not implemented? Any suggestions on a workaround? I'm assuming there is a
WebLogic-specific class that will accomplish this, but I would rather not
implement such a class unless there is no other alternative.

Hi,
This seems to be a JDBC driver-specific question (it may not be related to
JRockit). You may want to post it in the WebLogic JDBC newsgroups.
Sathish Santhanam
Developer Relations Engineer
BEA Support
"frank" <[email protected]> wrote in message
news:3f01921a$[email protected]..
>
I'm using JRockit 7.0 sp2rp1
"frank" <[email protected]> wrote:
I'm trying to use a batch update with a PreparedStatement, but when my
code executes
I receive an error stating The JDBC 2.0 method is not implemented, when
the code
executes addBatch(). Does JRockit have any api documentation stating
what is
and is not implemented. Any suggestions on a workaround? I'm assuming
there
is a Weblogic specific class that will accomplish this, but I would
rather
not
implement such a class unless there is no other alternative.
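
For what it's worth, a minimal workaround sketch (not WebLogic- or JRockit-specific; the table name is a placeholder): ask the driver through DatabaseMetaData.supportsBatchUpdates() whether JDBC 2.0 batching is available, and fall back to row-by-row executeUpdate() calls when it is not.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchOrFallback {
    // Inserts the given ids with addBatch()/executeBatch() when the driver
    // reports JDBC 2.0 batch support, otherwise one executeUpdate() per row.
    public static void insertIds(Connection con, int[] ids) throws SQLException {
        boolean useBatch = con.getMetaData().supportsBatchUpdates();
        String sql = "INSERT INTO my_table (id) VALUES (?)"; // placeholder table
        PreparedStatement ps = con.prepareStatement(sql);
        try {
            for (int i = 0; i < ids.length; i++) {
                ps.setInt(1, ids[i]);
                if (useBatch) {
                    ps.addBatch();
                } else {
                    ps.executeUpdate(); // row-by-row fallback
                }
            }
            if (useBatch) {
                ps.executeBatch();
            }
        } finally {
            ps.close();
        }
    }
}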

Similar Messages

  • JDBC Batch Updates & PreparedStatement problems (Oracle 8i)

    Hi,
    we're running into problems when trying to use JDBC Batch Updates with PreparedStatement in Oracle8i.
    First of all, Oracle throws a SQLException if one of the statements in the batch fails (e.g. because of a unique constraint violation). As far as I understand, a BatchUpdateException should be thrown?
    The next problem is much worse:
    Consider this table:
    SQL> desc oktest
     Name                          Null?     Type
     ----------------------------- --------- -------------
     ID                            NOT NULL  NUMBER(10)
     VALUE                         NOT NULL  VARCHAR2(20)
    The primary key is ID.
    When inserting through batch updates with a PreparedStatement, I can pass 'blah' as a value for ID without getting an exception (Oracle silently converts 'blah' to zero). Only when the offending statement is the last statement in the batch does Oracle throw an exception (again, a SQLException instead of a BatchUpdateException).
    Any comments/suggestions are appreciated.
    E-mail me if you want to see the code...
    Thanks,
    Oliver
    Oracle version info:
    Enterprise Edition Release 8.1.6.0.0, JServer Release 8.1.6.0.0, Oracle JDBC driver 8.1.6.0.0 and 8.1.7.0.0 (we're using the 'thin' driver)
    CLASSPATH=/opt/oracle/product/8.1.7/jdbc/lib/classes12.zip:...

    Please refer to
    http://www.oracle.com/technology/products/oracle9i/daily/jun07.html
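    One way to sidestep the silent 'blah'-to-zero conversion described above is to bind each parameter with the type of the underlying column, so a non-numeric ID is rejected in the Java code rather than coerced by the driver. A small sketch against the oktest table (assuming an open Connection named conn):
    // conn is an open java.sql.Connection; catch BatchUpdateException/SQLException as needed
    PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO oktest (id, value) VALUES (?, ?)");
    ps.setInt(1, 1);               // ID is NUMBER(10): bind an int, not a String
    ps.setString(2, "first row");  // VALUE is VARCHAR2(20)
    ps.addBatch();
    ps.setInt(1, 2);
    ps.setString(2, "second row");
    ps.addBatch();
    int[] counts = ps.executeBatch();
    ps.close();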

  • More than 1 preparedStatement  object using batch update

    Hey, how can I execute more than one PreparedStatement object using batch updates? Please explain with Java code.
    Thanks

    // turn off autocommit
    con.setAutoCommit(false);
    PreparedStatement stmt = con.prepareStatement(
         "INSERT INTO employees VALUES (?, ?)");
    stmt.setInt(1, 2000);
    stmt.setString(2, "Kelly Kaufmann");
    stmt.addBatch();
    stmt.setInt(1, 3000);
    stmt.setString(2, "Bill Barnes");
    stmt.addBatch();
    // submit the batch for execution
    int[] updateCounts = stmt.executeBatch();
    // with auto-commit off, nothing is persisted until commit()
    con.commit();
    Search Google for more information.
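    If the goal is literally to batch through more than one PreparedStatement object, each statement keeps its own batch and is executed separately; a sketch along the same lines (the UPDATE, its column names, and the extra rows are made up for illustration):
    PreparedStatement insertStmt = con.prepareStatement(
         "INSERT INTO employees VALUES (?, ?)");
    PreparedStatement updateStmt = con.prepareStatement(
         "UPDATE employees SET name = ? WHERE id = ?");
    insertStmt.setInt(1, 4000);
    insertStmt.setString(2, "Dana Diaz");
    insertStmt.addBatch();
    updateStmt.setString(1, "Kelly A. Kaufmann");
    updateStmt.setInt(2, 2000);
    updateStmt.addBatch();
    // each PreparedStatement carries its own batch
    int[] insertCounts = insertStmt.executeBatch();
    int[] updateCounts = updateStmt.executeBatch();
    con.commit();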

  • Batch updates with callable/prepared statement?

    The document http://edocs.bea.com/wls/docs70/oracle/advanced.html#1158797 states
    that "Using Batch updates with the callableStatement or preparedStatement is
    not supported". What does that actually mean? We have used both callable and prepared
    statements with the batch update in our current project (with the Oracle 817 db).
    It seems to run ok anyway.

    So the documentation should state that batch updates do not work in old versions
    of the JDriver for Oracle but do work correctly with newer versions. Additionally,
    do batch updates work when used with the Oracle-supplied JDBC drivers?
    "Stephen Felts" <[email protected]> wrote:
    Support for addBatch and executeBatch in the WLS Jdriver for Oracle was
    added in 7.0SP2.
    It was not available in 6.X or 7.0 or 7.0SP1.
    "Janne" <[email protected]> wrote in message news:3edb0cdc$[email protected]..
    The document http://edocs.bea.com/wls/docs70/oracle/advanced.html#1158797
    states
    that "Using Batch updates with the callableStatement or preparedStatementis
    not supported". What does that actually mean? We have used both callableand prepared
    statements with the batch update in our current project (with the Oracle817 db).
    It seems to run ok anyway.

  • Oracle 10G AS, Batch Update doesn't commit.

    Hello,
    We are using a batch update to commit a set of records, but to our surprise one of the batches commits in the DB and the other doesn't.
    The pseudo code below explains our scenario:
    String str1="INSERT INTO TBL1 (COL1,COL2) VALUES (?,?) ";
    String str2="UPDATE TBL2 SET COL1=? WHERE COL2=?";
    PreparedStatement pstmt1=conn.prepareStatement(str1);
    PreparedStatement pstmt2=conn.prepareStatement(str2);
    for(500 recs)
    pstmt1.addbatch();
    pstmt2.addbatch();
    On 500 Recs
    //This batch updates the DB and commits too
    pstmt1.executeBatch();
    pstmt1.clearBatch();
    con.commit();
    //The Batch executes successfully, but updates are not reflected in DB,
    //which may mean that it must not be committing in DB
    pstmt2.executeBatch();
    pstmt2.clearBatch();
    con.commit();
    - In the above, pstmt1 is an INSERT and pstmt2 is an UPDATE.
    - It so happens that at the end, the DB reflects the inserts done by pstmt1, but it doesn't reflect the updates of pstmt2.
    - We have also fired a SELECT statement immediately after the pstmt2 executeBatch, but the updated values are not reflected.
    - Also, the above scenario works absolutely fine if we use a Statement instead of a PreparedStatement for pstmt2.
    Set up Details:
    App Server :: Oracle 10G AS for Linux 2.1,
    Database Driver :: Oracle 10G Driver which is bundled with Oracle 10G AS
    Database :: Oracle 9i
    Any ideas in this regards would be highly appreciated.
    Thanks !
    - Khyati

    I think this is not the right forum for this question; probably a JDBC forum or one of the Oracle AS forums.
    Kuassi
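    One quick diagnostic, for what it's worth: inspect the array returned by pstmt2.executeBatch() before committing. An UPDATE whose WHERE clause matches no rows "succeeds" with an update count of 0, which looks exactly like a missing commit. A sketch using the names from the pseudo code above:
    int[] counts = pstmt2.executeBatch();
    int updatedRows = 0;
    for (int i = 0; i < counts.length; i++) {
        // Statement.SUCCESS_NO_INFO (-2) means the command ran but reported
        // no row count; anything >= 0 is an actual count.
        if (counts[i] > 0) {
            updatedRows += counts[i];
        }
    }
    System.out.println("batched updates: " + counts.length
            + ", rows actually updated: " + updatedRows);
    pstmt2.clearBatch();
    conn.commit();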

  • HELP!!! Strange Error with Batch Update

    While trying to do a batch update with the following prepared statement:
    INSERT INTO TAB1
    (DATETIME,CALC_ID,SITE_ID,APPL_ID,VAL)
    values (
    {ts '2003-06-24 08:49:14'},
    (SELECT CALC_ID FROM TAB2 WHERE CALC_NAME=?),
    2,
    7,
    Notice the parameter in the subquery. This works fine for a single executeUpdate. But if I do two preparedStatement.addBatch calls and then executeBatch, I see a unique constraint violation on (DATETIME, CALC_ID, SITE_ID). CALC_ID is the only key that varies in my batch. It is as if the subquery is not being resolved and is written literally to the field. If I try the first update by itself with addBatch, it works. If I try the second by itself, it works. If I try them together in the batch, it fails. The unique constraint is not violated when I execute them singly.
    Any Ideas?
    Thanks in advance.

    I've found that DB2, PreparedStatements and batch updating do not work very well together. I've had to use dynamic SQL and build the update statement myself. Actually, I wrote a query parser that builds the statement so I can pass it like a PreparedStatement.
    I'm not sure if you're on DB2 version 7, but if so, make sure you're using FixPak 9; there was a lot of bad prepared statement behavior before it.
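    A possible workaround, if the driver mishandles the subquery only when batching: resolve CALC_ID with a separate query first and bind the plain value into the batched INSERT. The sketch below keeps the table and column names from the post; the calcNames/vals arrays and the hard-coded SITE_ID/APPL_ID are illustrative assumptions.
    PreparedStatement lookup = conn.prepareStatement(
        "SELECT CALC_ID FROM TAB2 WHERE CALC_NAME = ?");
    PreparedStatement insert = conn.prepareStatement(
        "INSERT INTO TAB1 (DATETIME, CALC_ID, SITE_ID, APPL_ID, VAL) VALUES (?, ?, 2, 7, ?)");
    for (int i = 0; i < calcNames.length; i++) {
        lookup.setString(1, calcNames[i]);
        ResultSet rs = lookup.executeQuery();
        if (rs.next()) {
            insert.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
            insert.setInt(2, rs.getInt(1)); // CALC_ID resolved up front
            insert.setDouble(3, vals[i]);   // VAL
            insert.addBatch();
        }
        rs.close();
    }
    insert.executeBatch();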

  • Batch Updates in 6.0sp2 w/ JDriver

    Ok, I know this subject has been posted multiple times in this group but I still
    haven't been able to find a definitive answer. Does the OCI JDriver shipped with
    WLS6.0sp2 for HPUX support batch updates (specifically the 817_8 so's)? I'm getting
    the "This JDBC 2.0 method is not implemented for PreparedStatements" error so
    I am assuming not. I've seen posts claiming this was implemented in sp1 but it's
    not working for our sp2 implementation. Any feedback would be appreciated.
    -Erik

    Erik,
    Based on the error message you quoted, you can indeed conclude that the
    required functionality is not available in the OCI jDriver.
    Have you already had a look at the SequeLink driver?
    This JDBC 2.0 compliant driver does support batch updates.
    A free eval is available on www.merant.com/datadirect.
    Cheers,
    Dimitri
    "Erik" <[email protected]> wrote in message
    news:3b57c513$[email protected]..
    >
    Ok, I know this subject has been posted multiple times in this group but Istill
    haven't been able to find a definitive answer. Does the OCI JDrivershipped with
    WLS6.0sp2 for HPUX support batch updates (specifically the 817_8 so's)?I'm getting
    the "This JDBC 2.0 method is not implemented for PreparedStatements" errorso
    I am assuming not. I've seen posts claiming this was implemented in sp1but it's
    not working for our sp2 implementation. Any feedback would be appreciated.
    -Erik

  • Statement caching and batch update

    Can these 2 JDBC features work together ?
    Is it possible for a cached statement to be (soft) reparsed when it is used in a batch update?
    I am asking this question because I have a situation where an insert is cached using implicit statement caching and then put in a batch to execute batch updates. From Statspack reports I find that a third of the statements are reparsed, even if only softly.

    Statement caching and batch update work fine together. The most common cause of unexpected soft parses is changing the type of some parameters. If you first bind one type, setInt(1, ...), do addBatch, then bind another type to the same parameter, setString(1, ...), and do addBatch, you will get a soft reparse. There is nothing the JDBC driver can do about this, the RDBMS requires it.
    In general, whatever parse behavior you see with statement caching you would also see without it.
    Douglas
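    A small illustration of the point above, assuming an open Connection conn with implicit statement caching enabled and a hypothetical table t: keep the bind type of each parameter constant across addBatch() calls and the extra soft parses go away.
    PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO t (id, label) VALUES (?, ?)");
    // consistent types for parameter 1 (int) and parameter 2 (String)
    ps.setInt(1, 1);
    ps.setString(2, "one");
    ps.addBatch();
    ps.setInt(1, 2);
    ps.setString(2, "two");
    ps.addBatch();
    // binding e.g. ps.setString(1, "3") on a later row would change the type
    // of parameter 1 and trigger the soft reparse described above
    ps.executeBatch();
    ps.close();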

  • Error while running batch update statement

    Hi
    We are experiencing the error below while running a batch update statement in which the IN clause has more than 80,000 entries. The IN clause is already split to respect the 1,000-value limit, so the statement has multiple OR'd IN clauses,
    like update ... where id in (1,2,...,999) OR id in (1000,1001,...) OR id in (...)
    Error at Command Line:1 Column:0
    Error report:
    SQL Error: ORA-00603: ORACLE server session terminated by fatal error
    ORA-00600: internal error code, arguments: [kghfrh:ds], [0x2A9C5ABF50], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [kkoitbp-corruption], [], [], [], [], [], [], []
    00603. 00000 - "ORACLE server session terminated by fatal error"
    *Cause: An ORACLE server session is in an unrecoverable state.
    *Action: Login to ORACLE again so a new server session will be created
    Is this an Oracle limitation or some bug?
    Thanks

    http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/limits003.htm
    The limit on how long a SQL statement can be depends on many factors, including database configuration, disk space, and memory. I think you're over this limit.
    The only way is to create a temporary table with all the values and use
    IN (select ...)
    Max
    http://oracleitalia.wordpress.com
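    A sketch of that workaround from a JDBC client, assuming a global temporary table is acceptable (tmp_ids, target_table, the status column, and the ids array are made up for illustration): batch-insert the ids into the temporary table, then run a single UPDATE with IN (SELECT ...).
    Statement ddl = conn.createStatement();
    // one-time DDL; ON COMMIT PRESERVE ROWS keeps the rows for the session
    ddl.execute("CREATE GLOBAL TEMPORARY TABLE tmp_ids (id NUMBER) ON COMMIT PRESERVE ROWS");
    ddl.close();
    PreparedStatement fill = conn.prepareStatement("INSERT INTO tmp_ids (id) VALUES (?)");
    for (int i = 0; i < ids.length; i++) {
        fill.setLong(1, ids[i]);
        fill.addBatch();
    }
    fill.executeBatch();
    fill.close();
    Statement upd = conn.createStatement();
    upd.executeUpdate("UPDATE target_table SET status = 'DONE' "
            + "WHERE id IN (SELECT id FROM tmp_ids)");
    upd.close();
    conn.commit();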

  • Inbound IDoc SHPCON - Batch update issue

    Hi all,
    I would like to use SHPCON.DELVRY03 idoc in order to update Outbound delivery document.
    Scope is :
    - picking
    - good issue
    - update batches
    - update serial numbers
    - update volume and weights
    We hit an issue on updating batches as soon as a document item already has batch information before receiving the IDoc.
    2 cases :
    - no batch splitting => the high-level item already has a linked batch (LIPS-POSNR = 000010 and LIPS-CHARG not empty).
    Is there a way to update the batch information if the external warehouse confirms another batch number?
    - batch splitting => the high-level item already has 2 sublines, POSNR = 900001 & 900002. Both have batch numbers.
    If I want to confirm it, I can send E1EDL19 with BAS qualifier but sublines are added...
    Must I delete existing sublines with E1EDL19 DEL ?
    Thanks a lot for your help.
    J.C.

    Hi,
    We are having the same issue as yours:
    i.e. updating the batch via IDoc in the delivery when the delivery already contains a batch is not happening, and the IDoc is not throwing any error either.
    Were you able to resolve this?
    Thanks
    Rajesh

  • Problem with Batch Updates

    Hi All,
    I have a requirement wherein I have to use batch updates. But if there's a problem with one of the updates, I want the rest of the updates to ignore it and continue with the next statement in the batch. Is this not one of the features of batch updates in JDBC 2? I have been trying to accomplish this for 2 days now. Can anyone please help me with this?
    FYI, I have tried the following.
    I have 3 test updates in my batch.
    The 2nd statement is deliberately erroneous; the other 2 statements are fine. It is appropriately throwing a BatchUpdateException, but the array of ints returned by executeBatch() as well as BatchUpdateException.getUpdateCounts() both have size 0. If I remove the erroneous statement, the behaviour is as expected.
    Thanks in advance,
    Bharani

    The next paragraph of the same API doc:
    If the driver continues processing after a failure, the array returned by the method BatchUpdateException.getUpdateCounts will contain as many elements as there are commands in the batch, and at least one of the elements will be the following:
    3. A value of -3 -- indicates that the command failed to execute successfully and occurs only if a driver continues to process commands after a command fails
    A driver is not required to implement this method.
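    In other words, whether the driver stops or keeps going after a failed command is driver-specific. A portable sketch (stmt and batchSize are assumed to come from the surrounding code) that inspects BatchUpdateException.getUpdateCounts() and tells the two behaviours apart:
    try {
        int[] counts = stmt.executeBatch();
        // every command executed; counts[i] is the update count of command i
    } catch (BatchUpdateException bue) {
        int[] counts = bue.getUpdateCounts();
        if (counts.length == batchSize) {
            // the driver kept processing: entries equal to
            // Statement.EXECUTE_FAILED (-3) mark the failed commands
            for (int i = 0; i < counts.length; i++) {
                if (counts[i] == Statement.EXECUTE_FAILED) {
                    System.out.println("command " + i + " failed");
                }
            }
        } else {
            // the driver stopped at the first failure; counts covers only the
            // commands that ran before it, so re-run the rest individually
            // if they are independent of the failed one
            System.out.println("driver stopped after " + counts.length + " commands");
        }
    }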

  • How can I avoid using rollback segment for batch updates.

    I am currently trying to avoid allocating a large amount of space to a rollback segment, as it only fills up during the nightly batch updates. All that space will never be used during the day. Hence I want to know whether there is any way of avoiding the use of a rollback segment at the session level.
    Rajesh

    No, but what you can do is create a large rollback segment to use with your batch job: at the start of the batch job bring the segment online, then use SET TRANSACTION to use that rollback segment; when the batch job is finished and committed, you can bring the rollback segment offline again.
    If you are really pressed for space, as an alternate plan, you could actually create the large segment before the batch job and drop it after.
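    A sketch of that sequence from JDBC, assuming a pre-created rollback segment named BATCH_RBS, the required privileges, and auto-commit turned off:
    Statement s = conn.createStatement();
    s.execute("ALTER ROLLBACK SEGMENT batch_rbs ONLINE");
    // SET TRANSACTION must be the first statement of the transaction
    s.execute("SET TRANSACTION USE ROLLBACK SEGMENT batch_rbs");
    // ... run the nightly batch updates on this connection ...
    conn.commit();
    s.execute("ALTER ROLLBACK SEGMENT batch_rbs OFFLINE");
    s.close();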

  • Block Batch updates from QM

    Hi Guru's,
    I have a question regarding batch updates through QM (inspection lots).
    Is it possible to block a batch from getting updates from QM? Meaning that we can take the UD, but the results are no longer copied to the batch. The solution should also make sure that we can still do a few logistical movements (like transfer postings, goods issue, ...).
    Does anybody have experience with a custom-built solution?

    Sounds like you don't get along with your QM staff?
    The whole point of inspection lots and QM status controls is to ensure quality. If you want to circumvent quality control measures, you can always let them disposition the batch and move it to blocked status, then scrap the blocked stock (move it into the 999 storage type and clear it via LI21) and then pull the inventory back out of 999 in unrestricted status and clear it.
    If you are not fighting with your QC staff, then perhaps you should stop creating inspection lots to begin with.
    Just seems strange to want to do this.

  • Best way to do batch update of an online database

    We have a periodic process which does a batch update of the online database. I would like some advice on the best way to avoid this operation affecting the cache.
    We have multiple environments using a shared cache, so the total cache size is pretty big. Each environment on average refreshes once a day. Some are pretty big in DB size but do not necessarily have high traffic, and thus don't have a high count in the cache. I am wondering: if I do a refresh (basically lots of puts) to the DB, will that affect the cache? If so, is there a way to not affect the cache?
    If there is a way to disable caching of the LN at the environment level, that should also work for us, as we have a 1st-tier object cache.
    Also, I would like to clean up the log at the end of the batch update. Will the cleanup process block the reader threads, which are set at read_uncommitted?
    Thanks!

    Josh,
    It is possible to use a base API Cursor (with the CacheMode configured) to write entities in a destination EntityStore after reading them from a source EntityStore. The idea is to read the entities using a DPL cursor from the source store, and then use the binding of the destination store to convert the entities to DatabaseEntry objects, and finally store the DatabaseEntry objects using a base API cursor from the destination store.
    import com.sleepycat.bind.EntityBinding;
    import com.sleepycat.je.CacheMode;
    import com.sleepycat.je.Cursor;
    import com.sleepycat.je.DatabaseEntry;
    import com.sleepycat.je.OperationStatus;
    import com.sleepycat.persist.PrimaryIndex;
    import com.sleepycat.persist.EntityCursor;
        PrimaryIndex srcIndex = ...  // index of source store
        PrimaryIndex destIndex = ... // index of destination store
        EntityCursor<MyEntity> srcCursor = srcIndex.entities();
        Cursor destCursor = destIndex.getDatabase().openCursor(null, null);
        // leave the cache untouched when writing to the destination store
        destCursor.setCacheMode(CacheMode.UNCHANGED);
        EntityBinding<MyEntity> destBinding = destIndex.getEntityBinding();
        DatabaseEntry key = new DatabaseEntry();
        DatabaseEntry data = new DatabaseEntry();
        for (MyEntity e : srcCursor) {
            destBinding.objectToKey(e, key);
            destBinding.objectToData(e, data);
            OperationStatus status = destCursor.put(key, data);
            assert status == OperationStatus.SUCCESS;
        }
        destCursor.close();
        srcCursor.close();
    Please let me know if you have questions.
    Obviously, this would be a lot easier if there were an EntityCursor.put method, so that the CacheMode could be set on the EntityCursor. I'll file an enhancement request to add such a method.
    --mark

  • How to update bulk no of records using  Batch Update

    I am trying to insert and update records in multiple tables using the Statement addBatch() and executeBatch() methods. But sometimes it does not execute properly: records are inserted into some tables and not into others. I want all the records to execute and commit, or, if there is any error, all records to roll back.
    This is the code i am using.
    public String addBatchQueryWithDB(StringBuffer quries, Connection conNew, Statement stmtNew) throws Exception {
        String success = "0";
        try {
            conNew.setAutoCommit(false);
            String[] splitquery = quries.toString().trim().replace("||", "##").split("\\|");
            // add every query to the batch, restoring the escaped "||" separators
            for (int i = 0; i < splitquery.length; i++) {
                stmtNew.addBatch(splitquery[i].trim().replace("##", "||"));
            }
            // execute the whole batch once, then count the successful statements
            int[] updCnt = stmtNew.executeBatch();
            int executed = 0;
            for (int k = 0; k < updCnt.length; k++) {
                if (updCnt[k] >= 0) {
                    executed++;
                }
            }
            if (updCnt.length == executed) {
                success = "1";
            }
        } catch (BatchUpdateException be) {
            // handle batch update exception
            int[] counts = be.getUpdateCounts();
            for (int i = 0; i < counts.length; i++) {
                System.out.println("DB::addBatchQuery:Statement[" + i + "] :" + counts[i]);
            }
            success = "0";
            conNew.rollback();
            throw new Exception(be.getMessage());
        } catch (SQLException ee) {
            success = "0";
            System.out.println("DB::addBatchQuery:SQLExc:" + ee);
            conNew.rollback();
            throw new Exception(ee.getMessage());
        } catch (Exception ee) {
            success = "0";
            System.out.println("DB::addBatchQuery:Exc:" + ee);
            conNew.rollback();
            throw new Exception(ee.getMessage());
        } finally {
            // determine operation result
            if (success.equalsIgnoreCase("1")) {
                System.out.println("committing......");
                conNew.commit();
            } else {
                System.out.println("rolling back......");
                conNew.rollback();
            }
            conNew.setAutoCommit(true);
        }
        return success;
    }

    Koteshwar wrote:
    Thank you for your reply,
    I am using a single connection only, but different schemas. I pass a StringBuffer to a method, first set con.setAutoCommit(false), then in that method I split the queries in the StringBuffer and add them with stmt.addBatch(). After that I execute the batch, and after executing the batch I commit the connection.
    If I am reading that right, then you should stop using Batch.
    The intent of Batch is that you have one statement and you are going to be using it over and over again with different data.
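    To make that concrete, the intended batch pattern looks roughly like this (one parameterized statement per table, reused with different bind values; the table, columns, and rows array are placeholders):
    conNew.setAutoCommit(false);
    PreparedStatement ps = conNew.prepareStatement(
        "INSERT INTO some_table (id, name) VALUES (?, ?)");
    try {
        for (int i = 0; i < rows.length; i++) {
            ps.setInt(1, rows[i].id);
            ps.setString(2, rows[i].name);
            ps.addBatch();          // same SQL, different data each time
        }
        ps.executeBatch();
        conNew.commit();            // all rows commit together
    } catch (BatchUpdateException be) {
        conNew.rollback();          // ... or none of them
        throw be;
    } finally {
        ps.close();
        conNew.setAutoCommit(true);
    }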
