Batch Updates in 6.0sp2 w/ JDriver

Ok, I know this subject has been posted multiple times in this group but I still
haven't been able to find a definitive answer. Does the OCI JDriver shipped with
WLS6.0sp2 for HPUX support batch updates (specifically the 817_8 so's)? I'm getting
the "This JDBC 2.0 method is not implemented for PreparedStatements" error so
I am assuming not. I've seen posts claiming this was implemented in sp1 but it's
not working for our sp2 implementation. Any feedback would be appreciated.
-Erik

Erik,
Based on the error message you quoted, you can indeed conclude that the
required functionality is not available in the OCI jDriver.
Have you already had a look at the SequeLink driver?
This JDBC 2.0-compliant driver does support batch updates.
A free eval is available at www.merant.com/datadirect.
Cheers,
Dimitri
"Erik" <[email protected]> wrote in message
news:3b57c513$[email protected]..
>
Ok, I know this subject has been posted multiple times in this group but Istill
haven't been able to find a definitive answer. Does the OCI JDrivershipped with
WLS6.0sp2 for HPUX support batch updates (specifically the 817_8 so's)?I'm getting
the "This JDBC 2.0 method is not implemented for PreparedStatements" errorso
I am assuming not. I've seen posts claiming this was implemented in sp1but it's
not working for our sp2 implementation. Any feedback would be appreciated.
-Erik

Similar Messages

  • Batch updates with callable/prepared statement?

    The document http://edocs.bea.com/wls/docs70/oracle/advanced.html#1158797 states
    that "Using Batch updates with the callableStatement or preparedStatement is
    not supported". What does that actually mean? We have used both callable and prepared
    statements with the batch update in our current project (with the Oracle 817 db).
    It seems to run ok anyway.

    So the documentation should state that batch updates do not work in older versions
    of the JDriver for Oracle but do work correctly in newer versions. Additionally, do
    batch updates work when used with Oracle-supplied JDBC drivers?
    "Stephen Felts" <[email protected]> wrote:
    Support for addBatch and executeBatch in the WLS Jdriver for Oracle was
    added in 7.0SP2.
    It was not available in 6.X or 7.0 or 7.0SP1.
    "Janne" <[email protected]> wrote in message news:3edb0cdc$[email protected]..
    The document http://edocs.bea.com/wls/docs70/oracle/advanced.html#1158797
    states
    that "Using Batch updates with the callableStatement or preparedStatementis
    not supported". What does that actually mean? We have used both callableand prepared
    statements with the batch update in our current project (with the Oracle817 db).
    It seems to run ok anyway.

  • Statement caching and batch update

    Can these two JDBC features work together?
    Is it possible for a cached statement to be reparsed (soft) when it is used in a batch update?
    I am asking because I have a situation where an insert is cached using implicit statement caching and then put in a batch to execute batch updates. From Statspack reports I find that a third of the statements are reparsed, even if only softly.

    Statement caching and batch update work fine together. The most common cause of unexpected soft parses is changing the type of some parameters. If you first bind one type, setInt(1, ...), do addBatch, then bind another type to the same parameter, setString(1, ...), and do addBatch, you will get a soft reparse. There is nothing the JDBC driver can do about this, the RDBMS requires it.
    In general, whatever parse behavior you see with statement caching you would also see without it.
    Douglas

  • Error while running batch update statement

    Hi
    We are experiencing the error below while running a batch update statement whose IN clause has more than 80,000 entries. The IN clause is already split to respect the maximum of 1,000 values, so the statement has multiple OR'd clauses,
    like: update ... where id in (1,2,...,999) OR id in (1000,1001,...) OR id in (...)
    Error at Command Line:1 Column:0
    Error report:
    SQL Error: ORA-00603: ORACLE server session terminated by fatal error
    ORA-00600: internal error code, arguments: [kghfrh:ds], [0x2A9C5ABF50], [], [], [], [], [], []
    ORA-00600: internal error code, arguments: [kkoitbp-corruption], [], [], [], [], [], [], []
    00603. 00000 - "ORACLE server session terminated by fatal error"
    *Cause: An ORACLE server session is in an unrecoverable state.
    *Action: Login to ORACLE again so a new server session will be created
    Is this a limitation of Oracle or some bug?
    Thanks

    http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/limits003.htm
    The limit on how long a SQL statement can be depends on many factors, including database configuration, disk space, and memory. I think you're over this limit.
    The only way is creating a temporary table with all the values and using IN (select ...).
    Max
    http://oracleitalia.wordpress.com
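
    For moderate-sized lists, the OR'd IN-list construction the poster describes can be sketched in plain Java; the class and method names here are illustrative, not from the original code, and for 80,000 values the temporary-table approach is still the safer route:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class InClauseChunker {
        // Split a list of ids into groups of at most 1000 (Oracle's per-IN-list
        // limit) and build one "id IN (...)" clause per chunk, joined with OR.
        static String buildPredicate(List<Long> ids, int chunkSize) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < ids.size(); i += chunkSize) {
                if (i > 0) sb.append(" OR ");
                sb.append("id IN (");
                List<Long> chunk = ids.subList(i, Math.min(i + chunkSize, ids.size()));
                for (int j = 0; j < chunk.size(); j++) {
                    if (j > 0) sb.append(',');
                    sb.append(chunk.get(j));
                }
                sb.append(')');
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            List<Long> ids = new ArrayList<>();
            for (long i = 1; i <= 5; i++) ids.add(i);
            System.out.println("update MY_TABLE set flag = 1 where "
                    + buildPredicate(ids, 2));
        }
    }
    ```

    Even when chunked correctly, the total statement length is what hits the server-side limit, which is why the temporary-table workaround scales better.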

  • Inbound IDoc SHPCON - Batch update issue

    Hi all,
    I would like to use SHPCON.DELVRY03 idoc in order to update Outbound delivery document.
    Scope is :
    - picking
    - good issue
    - update batches
    - update serial numbers
    - update volume and weights
    We hit an issue updating batches as soon as a document item already has batch information before the IDoc is received.
    2 cases:
    - no batch splitting => the high-level item already has a linked batch (LIPS-POSNR = 000010 and LIPS-CHARG not empty).
    Is there a way to update the batch information if the external warehouse confirms another batch number?
    - batch splitting => the high-level item already has 2 sublines, POSNR = 900001 & 900002. Both have batch numbers.
    If I want to confirm it, I can send E1EDL19 with the BAS qualifier, but sublines are added...
    Must I delete the existing sublines with E1EDL19 DEL?
    Thanks a lot for your help.
    J.C.
    and to post good issue.
    All is OK except

    Hi,
    We are having the same issue as yours,
    i.e. updating the batch via IDoc is not happening if the delivery already contains a batch, and the IDoc does not throw any error either.
    Were you able to resolve this?
    Thanks
    Rajesh

  • Problem with Batch Updates

    Hi All,
    I have a requirement where I have to use batch updates. But if there's a problem in one of the updates, I want the rest of the updates to ignore the failure and continue with the next statement in the batch. Isn't this one of the features of batch updates in JDBC 2? I have been trying to accomplish this for two days now. Can anyone please help me with this?
    FYI, I have tried the following.
    I have 3 test updates in my batch.
    The 2nd statement is an erroneous statement (deliberate); the other two statements are fine. It is appropriately throwing a 'BatchUpdateException', but when I check them, both the array of ints returned by executeBatch() and BatchUpdateException.getUpdateCounts() are arrays of size 0. If I remove the erroneous statement, the behaviour is as expected.
    Thanks in advance,
    Bharani

    The next paragraph of the same API doc:
    If the driver continues processing after a failure, the array returned by the method BatchUpdateException.getUpdateCounts will contain as many elements as there are commands in the batch, and at least one of the elements will be the following:
    3. A value of -3 -- indicates that the command failed to execute successfully and occurs only if a driver continues to process commands after a command fails
    A driver is not required to implement this method.
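
    The sentinel values quoted from the API doc are exposed as named constants on java.sql.Statement, so code inspecting update counts can compare against those instead of magic numbers. A minimal sketch (the describe helper is illustrative):

    ```java
    import java.sql.Statement;

    public class UpdateCountSentinels {
        // Interpret a single entry from executeBatch() or
        // BatchUpdateException.getUpdateCounts().
        static String describe(int count) {
            if (count == Statement.EXECUTE_FAILED) {         // -3
                return "failed";
            } else if (count == Statement.SUCCESS_NO_INFO) { // -2
                return "succeeded, row count unknown";
            } else {
                return count + " rows affected";
            }
        }

        public static void main(String[] args) {
            System.out.println(describe(Statement.EXECUTE_FAILED));
            System.out.println(describe(1));
        }
    }
    ```

    Whether you ever see EXECUTE_FAILED depends on the driver: as the doc says, a driver may stop at the first failure instead of continuing, in which case getUpdateCounts() only covers the commands that ran before the failure.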

  • Oracle 10G AS, Batch Update doesn't commit.

    Hello,
    We are using batch update to commit a set of records, but to our surprise one of the batches commits in the DB and the other doesn't.
    The pseudo code below explains our scenario:
    String str1 = "INSERT INTO TBL1 (COL1,COL2) VALUES (?,?)";
    String str2 = "UPDATE TBL2 SET COL1=? WHERE COL2=?";
    PreparedStatement pstmt1 = conn.prepareStatement(str1);
    PreparedStatement pstmt2 = conn.prepareStatement(str2);
    // for each of 500 records:
    pstmt1.addBatch();
    pstmt2.addBatch();
    // once 500 records have been accumulated:
    // This batch updates the DB and commits too
    pstmt1.executeBatch();
    pstmt1.clearBatch();
    conn.commit();
    // This batch executes successfully, but the updates are not reflected in the DB,
    // which may mean it is not committing
    pstmt2.executeBatch();
    pstmt2.clearBatch();
    conn.commit();
    - In the above, pstmt1 is an INSERT and pstmt2 is an UPDATE.
    - At the end, the DB reflects the inserts done by pstmt1, but it does not reflect the updates of pstmt2.
    - We have also fired a SELECT statement immediately after the pstmt2 executeBatch(), but the updated values are not reflected.
    - The above scenario works absolutely fine if we use a Statement instead of a PreparedStatement for pstmt2.
    Set up Details:
    App Server :: Oracle 10G AS for Linux 2.1,
    Database Driver :: Oracle 10G Driver which is bundled with Oracle 10G AS
    Database :: Oracle 9i
    Any ideas in this regards would be highly appreciated.
    Thanks !
    - Khyati
    Message was edited by:
    user473157

    I think this is not the right forum for this question; probably a JDBC forum or one of the Oracle AS forums.
    Kuassi

  • How can I avoid using rollback segment for batch updates.

    I am currently trying to avoid allocating a large amount of space for the rollback segment, as it gets filled only during the nightly batch updates; all that space is never used during the day. Hence I want to know if there is any way of avoiding the use of a rollback segment at the session level.
    Rajesh

    No, but what you can do is create a large rollback segment for your batch job: at the start of the batch job bring the segment online, then use SET TRANSACTION to use that rollback segment; when the batch job is finished and committed, you can bring the rollback segment offline.
    If you are really pressed for space, as an alternate plan, you could actually create the large segment before the batch job and drop it after.
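
    The sequence described above can be sketched in classic manual-undo SQL; the segment name batch_rbs and the storage sizes are placeholders, and note that SET TRANSACTION must be the first statement of the transaction:

    ```sql
    -- Create once (sizing is illustrative)
    CREATE ROLLBACK SEGMENT batch_rbs
      STORAGE (INITIAL 100M NEXT 100M);

    -- At the start of the nightly job
    ALTER ROLLBACK SEGMENT batch_rbs ONLINE;
    SET TRANSACTION USE ROLLBACK SEGMENT batch_rbs;

    -- ... batch DML ...
    COMMIT;

    -- After the job
    ALTER ROLLBACK SEGMENT batch_rbs OFFLINE;
    ```

    This only applies to databases using manually managed rollback segments; with automatic undo management the equivalent lever is sizing the undo tablespace.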

  • JDBC Batch Updates & PreparedStatement problems (Oracle 8i)

    Hi,
    we're running into problems when trying to use JDBC Batch Updates with PreparedStatement in Oracle8i.
    First of all, Oracle throws a SQLException if one of the statements in the batch fails (e.g. because of a unique constraint violation). As far as I understand, a BatchUpdateException should be thrown?
    The next problem is much worse:
    Consider this table:
    SQL> desc oktest
    ID NOT NULL NUMBER(10)
    VALUE NOT NULL VARCHAR2(20)
    primary key is ID
    When inserting through batch updates with a PreparedStatement, I can pass 'blah' as a value for ID without getting an exception (Oracle silently converts 'blah' to zero). Only when the offending statement is the last statement in the batch does Oracle throw an exception (again, a SQLException instead of a BatchUpdateException).
    Any comments/suggestions are appreciated.
    E-mail me if you want to see the code...
    Thanks,
    Oliver
    Oracle version info:
    Enterprise Edition Release 8.1.6.0.0, JServer Release 8.1.6.0.0, Oracle JDBC driver 8.1.6.0.0 and 8.1.7.0.0 (we're using the 'thin' driver)
    CLASSPATH=/opt/oracle/product/8.1.7/jdbc/lib/classes12.zip:...

    Please refer
    http://www.oracle.com/technology/products/oracle9i/daily/jun07.html

  • Block Batch updates from QM

    Hi Guru's,
    I have a question regarding batch updates through QM (inspection lots).
    Is it possible to block a batch from getting updates from QM? Meaning that we can take the UD, but the results are not copied to the batch anymore. The solution should also make sure that we can still do a few logistical movements (like transfer postings, goods issue, ...).
    Maybe anybody has experience with a custom-built solution?

    Sounds like you don't get along with your QM staff?
    The whole point of inspection lots and QM status controls is to ensure quality. If you want to circumvent quality control measures, you can always let them disposition the batch and move it to blocked status, then scrap the blocked stock (move into 999 storage type and clear via LI21) and then pull the inventory back out of 999 in unrestricted status/clear.
    If you are not fighting with your QC staff then perhaps you should stop creating inspection lots to begin with.
    Just seems strange to want to do this.

  • Best way to do batch update of an online database

    We have a periodic process which does a batch update of the online database. I would like some advice on the best way to avoid this operation affecting the cache.
    We have multiple environments using a shared cache, so the total cache size is pretty big. Each environment on average refreshes once a day. Some are pretty big in DB size but do not necessarily have high traffic, and thus don't have a high count in the cache. I am wondering, if I do a refresh (basically lots of puts) to the DB, will that affect the cache? If so, is there a way to not affect the cache?
    If there is a way to disable caching of the LN at the environment level, that should also work for us, as we have a 1st-tier object cache.
    Also, I would like to clean up the log at the end of the batch update. Will the cleanup process block the reader threads, which are set at read_uncommitted?
    Thanks!
    Edited by: JoshWo on Mar 9, 2010 5:20 PM

    Josh,
    It is possible to use a base API Cursor (with the CacheMode configured) to write entities into a destination EntityStore after reading them from a source EntityStore. The idea is to read the entities using a DPL cursor from the source store, then use the binding of the destination store to convert the entities to DatabaseEntry objects, and finally store the DatabaseEntry objects using a base API cursor from the destination store.
    import com.sleepycat.bind.EntityBinding;
    import com.sleepycat.je.CacheMode;
    import com.sleepycat.je.Cursor;
    import com.sleepycat.je.DatabaseEntry;
    import com.sleepycat.je.OperationStatus;
    import com.sleepycat.persist.EntityCursor;
    import com.sleepycat.persist.PrimaryIndex;
        PrimaryIndex srcIndex = ...  // index of source store
        PrimaryIndex destIndex = ... // index of destination store
        EntityCursor<MyEntity> srcCursor = srcIndex.entities();
        Cursor destCursor = destIndex.getDatabase().openCursor(null, null);
        destCursor.setCacheMode(CacheMode.UNCHANGED);
        EntityBinding<MyEntity> destBinding = destIndex.getEntityBinding();
        DatabaseEntry key = new DatabaseEntry();
        DatabaseEntry data = new DatabaseEntry();
        for (MyEntity e : srcCursor) {
            destBinding.objectToKey(e, key);
            destBinding.objectToData(e, data);
            OperationStatus status = destCursor.put(key, data);
            assert status == OperationStatus.SUCCESS;
        }
        destCursor.close();
        srcCursor.close();
    Please let me know if you have questions.
    Obviously, this would be a lot easier if there were an EntityCursor.put method, so that the CacheMode could be set on the EntityCursor. I'll file an enhancement request to add such a method.
    --mark

  • How to update bulk no of records using  Batch Update

    I am trying to insert and update records in multiple tables using the Statement addBatch() and executeBatch() methods. But sometimes it does not execute properly: records are inserted into some tables but not into others. I want all the records to execute and commit, or, if there is any error, all records to roll back.
    This is the code I am using:
    public String addBatchQueryWithDB(StringBuffer queries, Connection conNew, Statement stmtNew) throws Exception {
        String success = "0";
        try {
            conNew.setAutoCommit(false);
            // Queries are separated by '|'; literal "||" is escaped as "##" first
            String[] splitquery = queries.toString().trim().replace("||", "##").split("\\|");
            for (int i = 0; i < splitquery.length; i++) {
                stmtNew.addBatch(splitquery[i].trim().replace("##", "||"));
            }
            int[] updCnt = stmtNew.executeBatch();
            int ok = 0;
            for (int k = 0; k < updCnt.length; k++) {
                if (updCnt[k] >= 0) {
                    ok++;
                }
            }
            if (ok == updCnt.length) {
                success = "1";
            }
        } catch (BatchUpdateException be) {
            // handle batch update exception
            int[] counts = be.getUpdateCounts();
            for (int i = 0; i < counts.length; i++) {
                System.out.println("DB::addBatchQuery:Statement[" + i + "] :" + counts[i]);
            }
            success = "0";
            conNew.rollback();
            throw new Exception(be.getMessage());
        } catch (SQLException ee) {
            success = "0";
            System.out.println("DB::addBatchQuery:SQLExc:" + ee);
            conNew.rollback();
            throw new Exception(ee.getMessage());
        } finally {
            // determine operation result
            if ("1".equalsIgnoreCase(success)) {
                System.out.println("committing......");
                conNew.commit();
            } else {
                System.out.println("rolling back......");
                success = "0";
                conNew.rollback();
            }
            conNew.setAutoCommit(true);
        }
        return success;
    }

    Koteshwar wrote:
    Thank you for your reply.
    I am using a single connection only, but different schemas. I pass a StringBuffer to a method; first I set con.setAutoCommit(false), then in that method I split my queries from the StringBuffer and add them with stmt.addBatch(). After that I execute the batch, and after executing the batch I commit the connection.

    If I am reading that right, then you should stop using Batch.
    The intent of Batch is that you have one statement and you are going to be using it over and over again with different data.

  • Db batch update problem

    I have a real problem with doing a batch update in SOA Suite and have tried a number of different approaches but they have all failed at some point. My requirement is really quite simple.
    I have a simple table - MY_TABLE:
    Columns:
    PK_ID - primary key
    ATTR_1 - not a primary key but selection criteria for on which to update.
    ATTR_2 - value to update
    In oracle the above update statement would look something like:
    update MY_TABLE set ATTR_2=<some value> where ATTR_1=<some value>
    However I don't want to update a single row, I want to update a number of rows at the same time based on a number of different ATTR_1 values. So my approaches so far have been:
    a) use the Update functionality of the DB Adapter. The problem with this is that the DB adapter insists on updating by primary key. I do not have the primary key; I have another column which I want to use as the criteria. So this won't work.
    b) Create a two stage process whereby I retrieve the primary keys based on ATTR_1 and feed this into approach a). Problem is that now I need to do a select based on a list of ATTR_1 values. The Select feature of the DB adapter does not allow selection based on a number of attributes, just one. i.e. the SQL that it creates for you is something like SELECT VALUES FROM TABLE WHERE CRITERIA = <some value>. What you would really need is SELECT VALUES FROM TABLE WHERE CRITERIA IN <some value>, but the Select feature won't allow this.
    c) This brings me on to the next approach. So I am going to try and use the DB adapter in Pure SQL mode. The SQL I define is something like:
    update MY_TABLE set ATTR_2=#myValue2 where ATTR_1 in #myValue1
    To get this to work I have had to write some XSLT to convert my repeating element of ATTR_1 into a concatenated string, e.g. ('myValue1','myValue1','myValue1'). The XSLT to do this sort of broke the XSLT GUI in SOA Suite, but it seems to work at runtime and creates the correct XML request to the DB adapter; however no rows are being updated and there is no error message either.
    This simple DB update is seemingly beyond my grasp. Can anyone help?
    Also I have no intention of looping through BPEL doing a single update at a time. The performance hit for a large number of records would be prohibitive.
    Edited by: user10103872 on 28-Nov-2010 10:42
    Edited by: user10103872 on 28-Nov-2010 10:43
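
    Outside the DB adapter, one likely reason approach c) silently updates nothing is that a single bind variable is bound as one literal value, so ATTR_1 IN #myValue1 compares ATTR_1 against the whole concatenated string rather than against each value. In plain JDBC the standard workaround is to generate one placeholder per value and bind each value separately; a minimal sketch (the class name is illustrative):

    ```java
    public class InListPlaceholders {
        // Build "(?,?,...,?)" with n placeholders, one per IN-list value,
        // so each value can be bound individually instead of as one string.
        static String placeholders(int n) {
            StringBuilder sb = new StringBuilder("(");
            for (int i = 0; i < n; i++) {
                if (i > 0) sb.append(',');
                sb.append('?');
            }
            return sb.append(')').toString();
        }

        public static void main(String[] args) {
            String sql = "update MY_TABLE set ATTR_2 = ? where ATTR_1 in "
                       + placeholders(3);
            System.out.println(sql);
        }
    }
    ```

    The resulting statement is then prepared once and each ATTR_1 value is bound to its own parameter index.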

    >
    I am thinking to write a procedure call to loop the sql statement "update table_A set version_no='1';"
    >
    Please don't do that. That will result in other complex issues and is not the solution to this problem.
    When you try to do any insert, update or delete, Oracle generates UNDO (which basically has all the data to revert back the changes, in case you decide to do a rollback).
    In your case, the UNDO tablespace is not sized appropriately for this large update that you are trying to run. So you need to fix that, by sizing it appropriately or by using auto-extensible tablespaces.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/undo.htm#BABJCFFE

  • HELP!!! Strange Error with Batch Update

    While trying to do a batch update with the following preparedstatment:
    INSERT INTO TAB1
    (DATETIME,CALC_ID,SITE_ID,APPL_ID,VAL)
    values (
    {ts '2003-06-24 08:49:14'},
    (SELECT CALC_ID FROM TAB2 WHERE CALC_NAME=?),
    2,
    7,
    Notice the parameter in the subquery. This works fine for a single executeUpdate(). But if I do two preparedStatement.addBatch() calls and then executeBatch(), I see a unique constraint violation on (DATETIME, CALC_ID, SITE_ID). CALC_ID is the only key that varies in my batch. It is as if the subquery is not being resolved and is written literally to the field. If I try the first update by itself in addBatch() it works; the second works by itself too. If I try them together in the batch it fails. The unique constraint is not violated when I execute them singly.
    Any Ideas?
    Thanks in advance.

    I've found that DB2, PreparedStatements and batch updating do not work very well together. I've had to use dynamic SQL and build the update statement myself. Actually I wrote a query parser that builds the statement, so I can use it like a PreparedStatement.
    I'm not sure if you're on DB2 v7, but if so make sure you're using FixPak 9; earlier levels had a lot of bad prepared statement behavior.

  • Batch Updating Keywords

    Could anyone tell me if there is an easier way to batch update a group of images to add an additional keyword? I know there is a batch update where I can select a group of pictures and then update all the information at once; however, it doesn't pick up existing data from any of the pictures, which means that I have to keep track of what was there before, write it on a piece of paper, then go in and re-enter all the data each time. This is taking too much time to warrant trying to keep things up to date.
    Hopefully someone can tell me an easier way.
    Thanks,
    glenn

    That is quite easy.
    Simply select all the images in the browser, making sure that 'Primary Only' is off (the button next to the 'Quick Preview' button).
    Open the Keywords HUD, then drag the keyword you want to add from the HUD onto one of the selected images; it should then be added to all of them. Multiple keywords can be dragged as well, just use Command to select more. (I'd advise first turning on the lock in the Keywords HUD to prevent accidental moving of keywords.)
    It should also be possible to do it with the 'Batch Change' option, just make sure that you select 'Append' in the 'Add Metadata' area.
