OraclePreparedStatement and setBinaryStream

I'm getting a java.lang.StackOverflowError from the setBinaryStream method on OraclePreparedStatement. Has anyone run across this before? I'm not sure what I'm doing wrong. The serialized size of the object I'm passing to setBinaryStream is around 8.2 KB.
Database version: Oracle8i Enterprise Edition Release 8.1.7.4.0 - Production
Using the Oracle (XA) JDBC driver (10.1.0.2.0)
The problem is happening under IBM WSAD 5.1. Any ideas what I'm doing wrong? I've been fighting this for longer than I care to admit.
java.lang.StackOverflowError
     at java.lang.Throwable.<init>(Throwable.java)
     at java.lang.Throwable.<init>(Throwable.java)
     at java.lang.StackOverflowError.<init>(StackOverflowError.java:51)
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
<snipped about 5 pages of this>
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java)
     at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStream(OraclePreparedStatement.java:6847)
     at com.ibm.ws.rsadapter.jdbc.WSJdbcPreparedStatement.setBinaryStream(WSJdbcPreparedStatement.java:838)
<snipped>

Here is a code snippet that might help. Thanks in advance for any help.
The method with the problem contains this code:
byte[] bytesValue = serialize(value, key);
if (bytesValue != null) {
    // Build the stream only when there is data to send.
    ByteArrayInputStream stream = new ByteArrayInputStream(bytesValue);
    // The following throws the java.lang.StackOverflowError
    statement.setBinaryStream(3, stream, bytesValue.length);
} else {
    statement.setBytes(3, null);
}
The serialize(Serializable object, String key) method:

private byte[] serialize(Serializable object, String key) {
    if (object == null) {
        return null;
    }
    ObjectOutputStream stream = null;
    ByteArrayOutputStream byteStream = null;
    try {
        byteStream = new ByteArrayOutputStream(512);
        stream = new ObjectOutputStream(byteStream);
        // Serialize the object.
        stream.writeObject(object);
        // Flush before copying, so buffered object data reaches the byte array.
        stream.flush();
        return byteStream.toByteArray();
    } catch (IOException e) {
        Attributes tokens = new Attributes();
        tokens.set(OBJECT_KEY_STRING, key);
        throw new ObjectSerializationException(tokens, e);
    } finally {
        // Ensure that the output streams are explicitly closed regardless of the
        // exit path of this method.
        try {
            if (stream != null) {
                stream.close();
            }
            if (byteStream != null) {
                byteStream.close();
            }
        } catch (IOException e) {
            Attributes tokens = new Attributes();
            tokens.set(OBJECT_KEY_STRING, key);
            throw new ObjectSerializationException(tokens, e);
        }
    }
}
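For what it's worth, since the serialized payload is already a byte[] in memory, one workaround to try is setBytes instead of the stream variant. This is only a hedged sketch; whether it sidesteps the driver's setRAW/setBinaryStreamInternal recursion depends on the driver/database combination:

    if (bytesValue != null) {
        // The array is only ~8 KB, so setBytes keeps it in memory and
        // bypasses the setBinaryStream code path that is overflowing here.
        statement.setBytes(3, bytesValue);
    } else {
        statement.setBytes(3, null);
    }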

Similar Messages

  • OraclePreparedStatement and dynamic where clause

    I am trying to figure out a better and more efficient way to construct a dynamic SQL statement where one or more parameters can be passed and included in the WHERE clause. I know I can simply use a case or if statement to build the WHERE clause, but I can't believe this is the only/best way to do it.
    I was playing with OraclePreparedStatement, but it only seems to work when the WHERE columns are known. I need something like:
    select something from somewhere
    where ? = ?
    so I can simply set parameter 1 to be a column and parameter 2 to be a value. I suppose I could subclass or create my own class that does this.
    Any help would be appreciated.
    Ryan

    >>> However, the problem is that my current structure doesn't allow me to write any SQL in the DAL.
    You cannot use dynamic SQL in the DAL? Have a stored procedure where you build the dynamic SQL and call it from within the DAL.
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
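    For the original Java question, the usual pattern is to build the WHERE clause text dynamically (column names cannot be bound with '?') while still binding the values through PreparedStatement parameters. A minimal hedged sketch; the table name, column whitelist, and filters map are hypothetical:

        // Whitelist the filterable columns so user input never becomes SQL text.
        Set<String> allowed = new HashSet<>(Arrays.asList("LAST_NAME", "DEPT_ID"));
        StringBuilder sql = new StringBuilder("SELECT emp_id, last_name FROM employees WHERE 1=1");
        List<Object> values = new ArrayList<>();
        for (Map.Entry<String, Object> filter : filters.entrySet()) {
            if (!allowed.contains(filter.getKey())) {
                throw new IllegalArgumentException("unknown column: " + filter.getKey());
            }
            sql.append(" AND ").append(filter.getKey()).append(" = ?");
            values.add(filter.getValue());
        }
        PreparedStatement ps = conn.prepareStatement(sql.toString());
        for (int i = 0; i < values.size(); i++) {
            ps.setObject(i + 1, values.get(i));
        }
        ResultSet rs = ps.executeQuery();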

  • Blob setBytes and setBinaryStream Clarification Needed

    I'm writing a JDBC driver for SQL Server and working on the Blob implementation details.
    One thing that is not clear in the JDBC spec to me at least is the behavior of the call to setBytes and to setBinaryStream on a Blob.
    Say for a simple example I have a Blob with 5 bytes in it ->
    1 2 3 4 5
    AA BB CC DD EE
    So in the database these 5 bytes are stored. I open a Blob on this field.
    Then I call setBytes with offset of 3 and write bytes 11 22 33 44
    Do I get this ->
    1 2 3 4 5 6
    AA BB 11 22 33 44 <- Writes over data, also past end of Blob.
    Or this?
    1 2 3 4 5
    AA BB 11 22 33 <- Writes over data, but not past end of Blob.
    Or this?
    1 2 3 4 5 6 7 8 9
    AA BB 11 22 33 44 CC DD EE <- Inserts data at offset, pushes existing data to right.
    So in other words, does setBytes insert data at the insertion point, or start overwriting data at the insertion point? Also, does it allow you to keep writing the data in the backend Blob past the end of the Blob?
    Same goes for setBinaryStream. The API spec is not clear about the desired behavior.
    Thanks for any clues!
    Matt

    Well... looking at the API...
    public void setBytes(int parameterIndex, byte[] x) throws SQLException
        Sets the designated parameter to the given Java array of bytes. The driver converts this to an SQL VARBINARY or LONGVARBINARY (depending on the argument's size relative to the driver's limits on VARBINARY values) when it sends it to the database.
    public void setBinaryStream(int parameterIndex, InputStream x, int length) throws SQLException
        Sets the designated parameter to the given input stream, which will have the specified number of bytes. When a very large binary value is input to a LONGVARBINARY parameter, it may be more practical to send it via a java.io.InputStream object. The data will be read from the stream as needed until end-of-file is reached.
    It appears that with setBytes() you pass in an entire in-memory array of bytes, whereas setBinaryStream() allows you to pass in an InputStream that supplies the bytes.
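    Note that the quoted javadoc covers PreparedStatement, not java.sql.Blob, so it does not settle the overwrite-versus-insert question. One way to settle it for a particular driver is to probe it empirically; a small hedged sketch, assuming a row whose BLOB already contains AA BB CC DD EE:

        // java.sql.Blob positions are 1-based.
        Blob blob = rs.getBlob(1);
        blob.setBytes(3, new byte[] {0x11, 0x22, 0x33, 0x44});  // write 4 bytes starting at position 3
        byte[] after = blob.getBytes(1, (int) blob.length());
        // AA BB 11 22 33 44 -> the driver overwrites in place and extends past the old end
        // AA BB 11 22 33    -> the driver overwrites but stops at the original length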

  • When to use setBytes() and setBinaryStream()

    Hi,
    I am using oracle thin driver to insert a image (blob type) into database.
    Which method of PreparedStatement interface should I use to insert data,
    setBytes() or setBinaryStream()?
    My understanding is that setBinaryStream also sends the data as bytes.
    So what is the difference between these two methods?
    Please let me know.
    Thanks,
    -Amol

    Well... looking at the API...
    public void setBytes(int parameterIndex, byte[] x) throws SQLException
        Sets the designated parameter to the given Java array of bytes. The driver converts this to an SQL VARBINARY or LONGVARBINARY (depending on the argument's size relative to the driver's limits on VARBINARY values) when it sends it to the database.
    public void setBinaryStream(int parameterIndex, InputStream x, int length) throws SQLException
        Sets the designated parameter to the given input stream, which will have the specified number of bytes. When a very large binary value is input to a LONGVARBINARY parameter, it may be more practical to send it via a java.io.InputStream object. The data will be read from the stream as needed until end-of-file is reached.
    It appears that with setBytes() you pass in an entire in-memory array of bytes, whereas setBinaryStream() allows you to pass in an InputStream that supplies the bytes.
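    In practice the choice mostly comes down to whether the data is already in memory. A hedged sketch of both calls against a hypothetical IMAGES(ID, DATA) table:

        // Small image already loaded as a byte[]: setBytes is simplest.
        PreparedStatement ps = conn.prepareStatement("INSERT INTO images (id, data) VALUES (?, ?)");
        ps.setInt(1, 1);
        ps.setBytes(2, smallImageBytes);
        ps.executeUpdate();

        // Large file: stream it so the whole value never has to sit in the Java heap at once.
        File f = new File("/tmp/large-image.jpg");
        ps.setInt(1, 2);
        ps.setBinaryStream(2, new FileInputStream(f), (int) f.length());
        ps.executeUpdate();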

  • JDBC Drivers compatibility between Oracle and IBM WSAD 5.1.1

    I am using a DataAccessObject pattern. I discovered you can speed up processing of SELECT statements by using an OraclePreparedStatement and calling defineColumnType, thereby reducing the number of round trips to the database.
    Here is the code ('ps' is a PreparedStatement):
    ((OraclePreparedStatement) ps).defineColumnType(1, Types.BIGINT);
    However, at runtime I am getting a class cast exception for the reason below. I am using WSAD version 5.1.1. I was not aware IBM was wrapping the JDBC driver classes with its own.
    [2005-05-23 15:36:59,855][Servlet.Engine.Transports : 2][ERROR][{TDAO}{getTWebInfos}{APP0000}{Unknown Contained Application Exception}{External Message:com/ibm/ws/rsadapter/jdbc/WSJdbcPreparedStatement incompatible with oracle/jdbc/OraclePreparedStatement}]
    {TDAO}{getTWebInfos}{APP0000}{Unknown Contained Application Exception}{External Message:com/ibm/ws/rsadapter/jdbc/WSJdbcPreparedStatement incompatible with oracle/jdbc/OraclePreparedStatement}
         at java.lang.Throwable.<init>(Throwable.java)
    Could someone please explain this to me? Any help would be greatly appreciated.

    I'm not a WSAD expert, to be honest, but it's quite possible they are using their own wrappers. Check out what calls you can make on their PreparedStatement wrapper or, more likely, the Connection wrapper - they may have some kind of getWrappedConnection functionality, as other vendors do, in which case you can, e.g., cast the connection to OracleConnection and go from there.
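    As a side note, on JDBC 4.0 and later drivers (not the WSAD 5.1-era stack in the original post) the portable way to reach the vendor statement behind an application-server wrapper is java.sql.Wrapper. A hedged sketch:

        // Works only if the wrapper implements java.sql.Wrapper (JDBC 4.0+).
        if (ps.isWrapperFor(oracle.jdbc.OraclePreparedStatement.class)) {
            oracle.jdbc.OraclePreparedStatement ops = ps.unwrap(oracle.jdbc.OraclePreparedStatement.class);
            ops.defineColumnType(1, Types.BIGINT);
        }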

  • Get mails from server and put into DB without allocating memory.

    Hi all,
    I have an application that downloads mails from the mail server and then writes them to the DB.
    I am using the folder.getMessage method to get the mails from the server.
    Is there a way to have a function that acts as a pipe between the mail server and the DB and writes the mails without holding them in memory?
    (Maybe something to do with getInputByteStream/getOutputByteStream, or getInputStream and setBinaryStream on the PreparedStatement interface.)
    Thanks

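    A hedged sketch of the pipe idea, assuming JavaMail and a hypothetical MAIL_STORE(ID, RAW_MESSAGE) table; note that many JDBC drivers still buffer the stream internally, so the actual memory saving depends on the driver:

        Message msg = folder.getMessage(msgNum);
        InputStream in = ((MimeMessage) msg).getRawInputStream();  // raw, undecoded RFC 822 content
        PreparedStatement ps = conn.prepareStatement("INSERT INTO mail_store (id, raw_message) VALUES (?, ?)");
        ps.setInt(1, msgNum);
        ps.setBinaryStream(2, in, msg.getSize());  // getSize() may be -1 or approximate for some folder types
        ps.executeUpdate();
        in.close();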

  • SQL unicode to XML using XMLForest

    I'm using XMLForest to retrieve some data from the DB. One of the columns has Unicode characters such as "Ă". I use XMLForest to read in this data, but the result turns this "Ă" into a "?". How do I keep the Unicode characters? I need to output this XML to HTML using an XSL sheet, and the XSL sheet just passes the "?" through as is.
    Example:
    a table with firstnames and lastnames:
    firstname lastname
    Ăbram Smîth
    I use XMLForest as:
    SELECT XMLForest (table.firstname, table.lastname);
    The result returns as:
    <firstname>?bram</firstname><lastname>Sm?th</lastname>
    So my xsl sheet returns the xml as just
    ?bram Sm?th
    Is there a method I can follow such that I know what unicode characters are being used?
    fyi: this is to run in a java application using Oracle JDBC. I use the following to get the XML string from the XMLType. This is after executing the query using an OraclePreparedStatement and putting the result into an OracleResultSet (ors).
    StringBuilder builder = new StringBuilder();
    oracle.xdb.XMLType xml = XMLType.createXML(ors.getOPAQUE(1));
    builder.append(xml.getStringVal()+"\n");
    xml.close();
    System.out.println(builder.toString());

    There is a slight work-around to this, but I wouldn't highly recommend it. It's just something that I was able to use, since our company's database is using US7ASCII.
    ASCIISTR will return the ASCII form of the query result. When a character is not in the standard ASCII range (for example U+C8E1), it is replaced by its Unicode escape, such as "\C8E1". I parse the XML query result that is returned for the character "\". Then, using Java:
    StringBuilder queryResult = new StringBuilder(xmlString);  // xmlString = the XML returned by the query
    int i = queryResult.indexOf("\\");                      // index of the "\" escape
    String unicode = queryResult.substring(i + 1, i + 5);   // the four hex digits, e.g. "C8E1"
    int con = Integer.parseInt(unicode, 16);
    char c = (char) con;
    queryResult.replace(i, i + 5, String.valueOf(c));       // replace the escape with the actual character
    I replace the "\C8E1" in the query result with char c.
    For the example that is provided, the query would be:
    SELECT XMLElement("Row", XMLForest(ASCIISTR(data_value))) FROM test;
    The result would return as:
    <Row><DATA_VALUE>Abram Sm\00EEth</DATA_VALUE></Row>
    the \00EE would be replaced with î.
    Just an idea, if anyone else is in the same situation.

  • Charsets - java -oracle

    Hi,
    I am trying to transfer data from an Oracle LONG field to a CLOB field in another Oracle database.
    The transfer of the data works fine, except for the transfer of the euro currency sign.
    The charsets of the databases are set to ISO8859-15.
    This charset is supported on my operating system (checked via Charset.availableCharsets()).
    When I output the hex representation of the euro sign I get the following:
    ffffffef, ffffffbf, ffffffbd
    According to the ISO8859-15 code chart this should be 'A4', I guess.
    I have tried various scenarios, e.g. reading the input with the encoding "ISO8859-15", or reading it with "windows-1252" and converting it to ISO8859-15, but without any success.
    When I read the data via sqlplus (which also works with ISO8859-15) I get the euro currency sign and a byte value of 128, which seems to be correct. Here the charset conversion seems to be OK.
    I am using Windows NT with the Oracle JDBC thin client.
    Somehow the Oracle client does some character set conversion that I am missing, but nevertheless I am just reading the bytes and still get the wrong hex digits.
    thanks for any help,
    regards,
    alex
    InputStream druckdat = rs.getBinaryStream(colDruckdat);
    try {
        // inputReader = new InputStreamReader(druckdat, "ISO8859-15");
        byte[] byteArr = new byte[10000];
        druckdat.read(byteArr);
        printHexString(byteArr);
        // String string = new String(byteArr, "windows-1251");
        // printHexString(string);
    } catch (IOException e) {
        e.printStackTrace();
    }

    void printHexString(byte[] byteArr) {
        for (int k = 0; k < byteArr.length; k++) {
            if (k % 8 == 0)
                System.out.println();
            // Mask with 0xFF so negative bytes are not sign-extended; without the mask,
            // EF BF BD (the UTF-8 replacement character) prints as ffffffef, ffffffbf, ffffffbd.
            System.out.print(Integer.toHexString(byteArr[k] & 0xFF));
            System.out.print(", ");
        }
    }

    I am not sure if the problems I had with Oracle are identical to yours, but anyway, here is what I found out:
    I had an Oracle 9i database with a database character set of ISO8859P1 and a national character set of UTF8. The problem was that when I wanted to insert or read non-8859-1 characters I just got garbage. The reason was the automatic charset conversion done by the thin driver, which first converts to the database character set before sending the data to the DB. If one uses an NCHAR data type, the DB then converts it to the national character set in use there. This is obviously a problem when the database character set is only a subset of the national character set, resulting in information loss.
    Unfortunately one has to use an Oracle-specific API (OraclePreparedStatement and its setFormOfUse(..., FORM_NCHAR) method) so the thin driver skips this conversion. The problem continues if one dynamically creates a SQL statement and wants to do an executeQuery() (standard JDBC API); it again gets converted automagically by the thin driver, and OracleStatement has no method to suppress the conversion. So to avoid the charset conversion one has to use the UNISTR function and encode the string as UTF-16 code points.
    Of (only some) help were the JDBC developer's guide and the globalization docs from Oracle, which you can find on tahiti.oracle.com.
    I suspect you suffer from the same problem, as 8859-1 and 8859-15 differ most prominently in the euro sign, and I guess your database character set is set to ISO8859P1.
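    A hedged sketch of the form-of-use workaround mentioned above. setFormOfUse is Oracle-specific API, and the exact package (oracle.jdbc vs. oracle.jdbc.driver) depends on the driver version; the table name is made up:

        OraclePreparedStatement ops = (OraclePreparedStatement) conn.prepareStatement(
                "INSERT INTO t_nchar_test (label) VALUES (?)");
        ops.setFormOfUse(1, OraclePreparedStatement.FORM_NCHAR);  // bind as NCHAR, skip the DB-charset conversion
        ops.setString(1, "\u20AC 4,90");                          // the euro sign survives the round trip
        ops.executeUpdate();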

  • IoEO - Conceptual Clarifications

    Hi All,
    I need a few clarifications and the perspective of everyone who reads this thread.
    I need to understand the significance or importance of EOs. From my observation, very few EOs are actually created; the VOs are most of the time built on a PL/SQL query.
    When everything can be done without an EO, why should one have it? Is there a specific scenario the application cannot handle without it?
    For DML operations, we normally use OraclePreparedStatements and OracleCallableStatements in the COs directly.
    Any quick response will be greatly appreciated.

    Hi,
    1. It would depend on the factors prevailing at the moment. Maybe when the forms were developed, Oracle preferred to reuse the existing PL/SQL APIs created for Forms and avoid the effort of replicating all the logic and validations in the EO. But that is one of my guesses, and frankly I don't know why Oracle did so.
    2. You can definitely write joins and DECODE statements in the VO (based on EOs), but remember that the calculated columns (DECODE columns) can only fetch data. If you associate a field with one of the calculated attributes, your values will not be saved to the database. You can save only those values which are directly based on the EO attributes, which in turn have a one-to-one mapping with the database table.
    Hope that clarifies.
    Regards
    Sumit

  • Standard JDBC and Blobs

    Is the Oracle JDBC driver entirely compatible with JDBC?
    I'm trying to manage Blobs using the thin driver through the standard JDBC interfaces. The code is OK for MySQL, but doesn't do anything for Oracle.

    Not that I can tell. In order to use Oracle Blobs you must cast the PreparedStatement to an OraclePreparedStatement and then use their specific Blob methods. It is really crappy. It's amazing that I can use JDBC drivers with SQL Server and MS Access, from a company whose Java support is minimal at best, and have the Blob stuff work fine. Oracle really needs to get their act together. Or maybe they are just trying to make the software proprietary so it will be a pain to make it work with other databases - all they are really doing is making it a pain for developers to support Oracle!
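    For reference, a hedged sketch of the Oracle-specific pattern the reply alludes to (older oracle.sql.BLOB API; table and column names are made up):

        // 1. Insert a placeholder LOB, 2. re-select it FOR UPDATE, 3. stream into it.
        stmt.executeUpdate("INSERT INTO docs (id, body) VALUES (1, empty_blob())");
        ResultSet rs = stmt.executeQuery("SELECT body FROM docs WHERE id = 1 FOR UPDATE");
        rs.next();
        oracle.sql.BLOB blob = ((oracle.jdbc.OracleResultSet) rs).getBLOB(1);
        OutputStream out = blob.getBinaryOutputStream();  // deprecated in later drivers in favor of setBinaryStream(1L)
        out.write(payloadBytes);
        out.close();
        conn.commit();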

  • RegisterOutParameter - setBinaryStream - Problems inserting Blob - setRAW

    As posted in metalink (was: "Problems inserting BLOB/InputStream with ojdbc14.jar for 10g - Data size bigger than max size for this type"):
    Using setBinaryStream for large Blobs works as long as I don't register outParameters.
    Query that works: "INSERT INTO blobtest (attachment_id,name,data) VALUES(blobtest_SEQ.nextval,?,?)";
    Query that fails: "BEGIN INSERT INTO blobtest (attachment_id,name,data) VALUES( blobtest_SEQ.nextval,?,?) RETURN attachment_id INTO ? ; END;"
    The necessary tables were created by hand:
    CREATE TABLE blobtest ( NAME CHAR(255), data BLOB, attachment_id NUMBER(38))
    And
    CREATE SEQUENCE TBL_ATTACHMENT_SEQ
    The output was: <<user: SEE
    pw: QD
    instantiating oracle driver
    query: INSERT INTO blobtest (attachment_id,name,data) VALUES(TBL_ATTACHMENT_SEQ.nextval,?,?)
    uploaded no Return Parameter blob of size: 256809
    query: BEGIN INSERT INTO blobtest (attachment_id,name,data) VALUES(TBL_ATTACHMENT_SEQ.nextval,?,?) RETURN attachment_id INTO ? ; END;
    java.sql.SQLException: Datengröße größer als max. Größe für diesen Typ: 256809
    (in English: data size bigger than max size for this type: 256809)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:162)
         at oracle.jdbc.driver.OraclePreparedStatement.setRAW(OraclePreparedStatement.java:5342)
         at oracle.jdbc.driver.OraclePreparedStatement.setBinaryStreamInternal(OraclePreparedStatement.java:6885)
         at oracle.jdbc.driver.OracleCallableStatement.setBinaryStream(OracleCallableStatement.java:4489)
         at BlobTest.writeBlob(BlobTest.java:161)
         at BlobTest.testBlob(BlobTest.java:118)
         at BlobTest.main(BlobTest.java:92)
    error: Datengröße größer als max. Größe für diesen Typ: 256809>>
    here the java test case:
    /*
     * Created on 25.08.2004 $Id: BlobTest.java,v 1.4 2005/04/22 11:21:11 hauser Exp $
     * as posted in metalink jdbc forum 050405 and responses by
     * [email protected]
     */
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.Driver;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Types;
    public class BlobTest {
        private static String FILE_NAME = "c:/temp/veryLargeFile.pdf";
        final static int ORACLE = 1;
        final static int MYSQL = 2;
        private String jdbcUrl = "jdbc:mysql://localhost/test?user=monty&password=greatsqldb";
        private int dbType = ORACLE;
        private Driver driver = null;
        private String user = "";
        private String pw = "";
        public static String SCHEME = "";

        public BlobTest() {
        }

        public static void main(String[] args) {
            BlobTest bt = new BlobTest();
            if (args[0] != null) {
                System.out.println("dbType: " + args[0]);
                if (args[0].toLowerCase().indexOf("oracle") != -1) {
                    bt.dbType = ORACLE;
                } else if (args[0].toLowerCase().indexOf("mysql") != -1) {
                    bt.dbType = MYSQL;
                } else {
                    System.out.println("not yet supported db type: " + args[0]);
                    System.exit(99);
                }
            }
            if (args[1] != null) {
                System.out.println("jdbcUrl: " + args[1]);
                if (args[1].trim().length() != 0) {
                    bt.jdbcUrl = args[1].trim();
                } else {
                    System.out.println("not yet supported jdbcUrl: " + args[1]);
                    System.exit(99);
                }
            }
            if (args.length > 2 && args[2] != null) {
                System.out.println("user: " + args[2]);
                if (args[2].trim().length() != 0) {
                    bt.user = args[2].trim();
                } else {
                    System.out.println("invalid user: " + args[2]);
                    System.exit(99);
                }
            }
            if (args.length > 3 && args[3] != null) {
                System.out.println("pw: " + args[3].substring(0, 2));
                if (args[3].trim().length() != 0) {
                    bt.pw = args[3].trim();
                } else {
                    System.out.println("invalid password: " + args[3]);
                    System.exit(99);
                }
            }
            if (args.length > 4 && args[4] != null) {
                System.out.println("filename: " + args[4]);
                if (args[4].trim().length() != 0) {
                    FILE_NAME = args[4].trim();
                } else {
                    System.out.println("invalid filename: " + args[4]);
                    System.exit(99);
                }
            }
            bt.setUp();
            bt.testBlob();
        }

        public void setUp() {
            try {
                if (this.dbType == ORACLE) {
                    System.out.println("instantiating oracle driver");
                    this.driver = (Driver) Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
                } else {
                    this.driver = (Driver) Class.forName("com.mysql.jdbc.Driver").newInstance();
                }
                if (this.driver == null) {
                    System.out.println("driver is null");
                    System.exit(88);
                }
                DriverManager.registerDriver(this.driver);
            } catch (Exception e) {
                e.printStackTrace();
                System.out.println("error: " + e.getMessage());
            }
        }

        public void testBlob() {
            try {
                this.writeBlob();
            } catch (Exception e) {
                e.printStackTrace();
                System.out.println("error: " + e.getMessage());
            }
        }

        /** test function */
        private void writeBlob() throws Exception {
            Connection conn = null;
            PreparedStatement pStmt = null;
            CallableStatement cStmt = null, cStmt2 = null;
            InputStream in = null;
            try {
                File file = new File(BlobTest.FILE_NAME);
                conn = DriverManager.getConnection("jdbc:" + this.jdbcUrl, this.user, this.pw);
                conn.setAutoCommit(false);

                // This statement (no RETURNING clause) works.
                String queryWorks = "INSERT INTO " + SCHEME
                        + "blobtest (attachment_id,name,data) VALUES(" + SCHEME
                        + "TBL_ATTACHMENT_SEQ.nextval,?,?)";
                cStmt = conn.prepareCall(queryWorks);
                System.out.println("query: " + queryWorks);
                cStmt.setString(1, file.getAbsolutePath());
                in = new FileInputStream(file);
                cStmt.setBinaryStream(2, in, (int) file.length());
                cStmt.execute();
                System.out.println("uploaded no Return Parameter blob of size: " + file.length());
                conn.commit();

                // The same insert wrapped in a PL/SQL block with RETURN ... INTO a
                // registered out parameter fails with "data size bigger than max size".
                String queryFails = "BEGIN INSERT INTO " + SCHEME
                        + "blobtest (attachment_id,name,data) VALUES(" + SCHEME
                        + "TBL_ATTACHMENT_SEQ.nextval,?,?)"
                        + " RETURN attachment_id INTO ? ; END;";
                cStmt2 = conn.prepareCall(queryFails);
                System.out.println("query: " + queryFails);
                cStmt2.setString(1, file.getAbsolutePath());
                in = new FileInputStream(file);
                cStmt2.setBinaryStream(2, in, (int) file.length());
                cStmt2.registerOutParameter(3, Types.INTEGER);
                cStmt2.execute();
                System.out.println("uploaded blob of size: " + file.length() + " - id: " + cStmt2.getInt(3));
                conn.commit();
            } catch (Exception e) {
                e.printStackTrace();
                System.out.println("error: " + e.getMessage() + "\nname: " + BlobTest.FILE_NAME);
                if (conn != null) {
                    try {
                        conn.rollback();
                    } catch (Exception e1) {
                        throw e;
                    }
                }
            } finally {
                if (in != null) {
                    try { in.close(); } catch (Exception e) { /* ignore */ }
                }
                if (pStmt != null) {
                    try { pStmt.close(); } catch (Exception e) { /* ignore */ }
                }
                if (conn != null) {
                    try { conn.close(); } catch (Exception e) { /* ignore */ }
                }
            }
        }
    }
    and the batch file I use to start:
    @setlocal
    @echo off
    rem $Id: runBlobTest.bat,v 1.2 2005/04/21 15:06:22 hauser Exp $
    set classpath=../WEB-INF/classes;../WEB-INF/lib/ojdbc14.jar;
    echo JAVA_HOME: %JAVA_HOME%
    set JAVA_HOME=C:\PROGRA~1\Java\j2re1.4.1_02\
    echo classpath: %classpath%
    set javaCmd=C:\PROGRA~1\Java\j2re1.4.1_02\bin\java
    %javaCmd% -version
    %javaCmd% BlobTest "oracle" "oracle:thin://@ORADB.yourdomain.COM:1521:t300" "username" "password" "C:\Temp\veryLargeFile.pdf"
    endlocal

    Apparently, this is partially known - with a different stacktrace though:
    <<From: Oracle, Anupama Srinivasan 25-Apr-05 07:15
    Can you please check on Bug:4083226?
    Using the RETURNING Clause is not supported with JDBC. You could embed the statement in PL/SQL Block as in Metalink Note 124268.1 - JDBC Support for DML Returning Clause.
    The Enhancement Request filed on this issue is being considered for Release 10.2
    >>
    And my answer to it:
    "Using the RETURNING clause is not supported with JDBC." - This is strange; with just "empty_blob()" it DOES work.
    I guess our work-around, which is hopefully more portable than embedding a PL/SQL block, will be to
    1) create the record with an empty blob, and
    2) update the blob in a second statement (a RETURNING clause is then no longer needed).
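    A hedged sketch of that two-step workaround: insert the row without needing RETURNING, then stream the BLOB in a second statement. The key retrieval via CURRVAL assumes the insert runs in the same session; table and sequence names match the test case above:

        PreparedStatement ins = conn.prepareStatement(
                "INSERT INTO blobtest (attachment_id, name, data) VALUES (TBL_ATTACHMENT_SEQ.nextval, ?, empty_blob())");
        ins.setString(1, file.getAbsolutePath());
        ins.executeUpdate();

        // CURRVAL is session-scoped, so this returns the id generated by the insert above.
        ResultSet rs = conn.createStatement().executeQuery("SELECT TBL_ATTACHMENT_SEQ.currval FROM dual");
        rs.next();
        long id = rs.getLong(1);

        PreparedStatement upd = conn.prepareStatement("UPDATE blobtest SET data = ? WHERE attachment_id = ?");
        upd.setBinaryStream(1, new FileInputStream(file), (int) file.length());
        upd.setLong(2, id);
        upd.executeUpdate();
        conn.commit();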

  • Oracle, SELECT IN and PreparedStatement.setArray

    I want to execute the following query: SELECT * FROM SOMETABLE WHERE IDFIELD IN (?)
    The number of values in the IN list is variable. How can I do this with a prepared statement?
    I am aware of the different alternatives:
    1) Keep a cache of prepared statement for each sized list seen so far.
    2) Keep a cache of prepared statements for different sizes (1, 5, 10, 20) and fill the leftover parameter positions with copies of the first value.
    They both have the disadvantage that there could be many prepared statements for each query that get used once, and never used again.
    I have tried this:
    stmt.execute ("CREATE OR REPLACE TYPE LONGINTLIST AS TABLE OF NUMBER(15)");
    ArrayDescriptor desc = ArrayDescriptor.createDescriptor ("LONGINTLIST", conn);
    long idValues [] = {2, 3, 4};
    oracle.sql.ARRAY paramArray = new oracle.sql.ARRAY (desc, conn, idValues);
    PreparedStatement query = conn.prepareStatement ("SELECT * FROM MYTABLE WHERE ID_FIELD IN (?)");
    query.setArray (1, paramArray);
    But Oracle gives a data conversion error.
    I then tried this:
    PreparedStatement query = conn.prepareStatement ("SELECT * FROM MYTABLE WHERE ID_FIELD IN (SELECT * FROM TABLE (?))");
    This works and the rows are returned, but the Oracle optimizer does not like it very much, since it always does a full table scan even though there is a primary key index on ID_FIELD.
    Any ideas?
    I also tried this:
    OraclePreparedStatement oraQuery = (OraclePreparedStatement) query;
    oraQuery.setARRAY (1, paramArray);
    But same behavior.
    Roger Hernandez

    >>> Please re-read the original message. As I mentioned, I am aware of the two commonly used alternatives.
    No, actually the most common alternative is to build the SQL dynamically each time.
    >>> I know how to get both of them to work, and have used both alternatives in the past. The downside to both of these approaches is that you need to save multiple prepared statements for each query. What I am trying to find is a way of having only one saved prepared statement for a query with a variable number of IN clause parameters.
    You could probably use a stored procedure that takes an 'array' and then do the processing in the stored proc to handle each array element.
    However, your database might not support stored procs or arrays. Or it might not cache the statement with arrays. And the overhead of creating the array structure or processing it in the proc might eat any savings you might gain (even presuming there are any) from using a prepared statement in the first place. Of course, given that you must be using an automated profiling tool and a loaded test environment, you should be able to easily determine whether this method saves time or not.
    Other than that there are no other solutions.
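    For completeness, a hedged sketch of the "build the SQL dynamically" alternative: generate one '?' per IN-list element and bind the values, accepting that each distinct list size becomes a distinct statement in the cursor cache:

        long[] ids = {2, 3, 4};
        StringBuilder sql = new StringBuilder("SELECT * FROM mytable WHERE id_field IN (");
        for (int i = 0; i < ids.length; i++) {
            sql.append(i == 0 ? "?" : ", ?");
        }
        sql.append(")");
        PreparedStatement ps = conn.prepareStatement(sql.toString());
        for (int i = 0; i < ids.length; i++) {
            ps.setLong(i + 1, ids[i]);
        }
        ResultSet rs = ps.executeQuery();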

  • Problem in Distributed transaction with Oracle 8.1.7 and Weblogic 7.0

    Hi,
    I am using two unmanaged WebLogic 7.0 servers and Oracle 8.1.7 Enterprise Edition.
    I am using oracle.jdbc.xa.client.OracleXADataSource for creating the connection pool in WebLogic. The pool gets created fine, but when a connection is used it throws the following error:
    java.sql.SQLException: ORA-02044: transaction manager login denied: transaction in progress
    ORA-06512: at "SYS.JAVA_XA", line 0
    ORA-06512: at line 1
         at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:168)
         at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:208)
         at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:543)
         at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1405)
         at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:822)
         at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:1446)
         at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1371)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1900)
         at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:363)
         at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:407)
         at oracle.jdbc.xa.client.OracleXAResource.start(OracleXAResource.java:171)
         at weblogic.jdbc.jta.VendorXAResource.start(VendorXAResource.java:41)
         at weblogic.jdbc.jta.DataSource.start(DataSource.java:569)
    I don't know what is causing this problem. Please send me some pointers.
    Regards,
    Vikash


  • Problem with Oracle 8.1.7 and JDBC

    I'm connecting to Oracle 8.1.7 (running on an MS Windows server) from JBoss 2.4.10, running on a Linux box. The code that generates the problem looks like this:
    String query = "SELECT some_fields FROM some_table WHERE id = ?";
    PreparedStatement ps = conn.prepareStatement(query);
    ps.setString(1, id.toString());
    ps.executeQuery(); // this throws the following exception
    java.sql.SQLException: Zamknięta instrukcja (closed statement)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:169)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:211)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:274)
    at oracle.jdbc.driver.OracleStatement.ensureOpen(OracleStatement.java:4935)
    at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:382)
    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:339)
    at org.jboss.pool.jdbc.PreparedStatementInPool.executeQuery(PreparedStatementInPool.java:71)
    The connection is OK. The statement is not closed or anything. The syntax of the query is fine, because I tested it in the Oracle console. What is strange is that this exception is thrown only sometimes: regardless of the id parameter, the execution of this query sometimes throws an exception and sometimes does not, without any particular pattern.
    I will be grateful for any help.
    Michal
    GG : #2814355
    e-mail : mailto:[email protected]
    fotoWWW : http://foto.natolin.net
    bikeWWW : http://rowery.natolin.net (under construction)

    >>> ID is a NUMBER and id.toString() is not a number; for example, it can be a null reference.
    Well, we also tried this version:
    ps.setLong(1, id.longValue());
    Moreover, the exception wasn't thrown for the id value 56 but was thrown for the id value 88. Hence I think it is caused by something other than my code.
    michal

  • Problem with Unicode and Oracle NCLOB fields

    When I try to INSERT a new (N)CLOB into an Oracle database, all is fine until I use a non-ASCII character, such as an accented roman letter like the "é" (that's '\u00E9') in "café", or the Euro currency symbol "€" (that's '\u20AC' as a Java character literal, just in case the display is corrupted here too). This doesn't happen with "setString", but does happen when streaming characters to the CLOB; however, as Oracle or the driver refuses strings larger than 4000 characters, and as I need to support all the above symbols (and many more), I'm stuck.
    Here's the background to the problem (I've tried to be detailed, after a lot of looking around on the web, I've seen lots of people with similar problems, but no solutions: I've seen and been able to stream ASCII clobs, or add small NCHAR strings, but not stream NCLOBs...).
    I'm using Oracle 9.2.0.1.0 with the "thin" JDBC driver, on a Windows box (XP Pro). My database instance is set up with AL32UTF8 as the database character set and UTF8 as the national character set. I've created a simple user/schema, called LOBTEST, in which I created two tables (see below).
    The basic problems are :
    - with Oracle and JDBC, you can't set the value of a CLOB or NCLOB with PreparedStatement's setString or setCharacterStream methods (as it throws an exception when you send more than 4000 characters)
    - with Oracle, you can only have one LONG VARCHAR-type field per table (according to their documentation) and you MUST read all columns in a set order (amongst other limitations).
    - with a SQL INSERT command, there's no way to set the value of a parameter that's a CLOB (implementations of the CLOB interface can only be obtained by performing a SELECT.... but obviously, when I'm inserting, the record doesn't exist yet...). Workarounds include (possibly) JDBC 4 (doesn't exist yet...) or doing the following Oracle-specific stuff :
    INSERT INTO MyTable (theID,theCLOB) VALUES (1, empty_clob());
    SELECT * FROM MyTable WHERE theId = 1;
    ...and getting the empty CLOB back (via a ResultSet), and populating it. I have a very large application, that's deployed for many of our customers using SapDB and MySQL without a hitch, with "one-step" INSERTS; I can't feasibly change the application into "three-step INSERT-SELECT-UPDATE" just for Oracle, and I shouldn't need to!!!
    The final workaround is to use Oracle-specific classes, described in:
    http://download-east.oracle.com/otn_hosted_doc/jdeveloper/904preview/jdbc-javadoc/index.html
    ...such as CLOB (see my example). This works fine until I add some non-ASCII characters, at which point, irrespective of whether the CLOB data is 2 characters or 2 million characters, it throws the same exception:
    java.io.IOException: Il n'y a plus de données à lire dans le socket
         at oracle.jdbc.dbaccess.DBError.SQLToIOException(DBError.java:716)
         at oracle.jdbc.driver.OracleClobWriter.flushBuffer(OracleClobWriter.java:270)
         at oracle.jdbc.driver.OracleClobWriter.flush(OracleClobWriter.java:204)
         at scratchpad.InsertOracleClobExample.main(InsertOracleClobExample.java:61)
    ...where the error message in English is "No more data to read from socket". I need the Oracle-specific "setFormOfUse" method to force it to correctly use the encoding of the NCLOB field; without it, even plain ASCII data is rejected with an exception indicating that the character set is inappropriate. With a plain CLOB, I don't need it, but the plain CLOB refuses my non-ASCII data anyway.
    So, many many thanks in advance for any advice. The remainder of my post includes my code example and a simple SQL script to create the table(s). You can mess around with the source code to test various combinations.
    Thanks,
    Chris B.
    CREATE TABLE NCLOBTEST (
         ID         INTEGER NOT NULL,
         SOMESTRING NCLOB,
         PRIMARY KEY (ID)
    );
    CREATE TABLE CLOBTEST (
         ID         INTEGER NOT NULL,
         SOMESTRING CLOB,
         PRIMARY KEY (ID)
    );
    package scratchpad;
    import java.io.Writer;
    import java.sql.Connection;
    import java.sql.Driver;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.Properties;
    import oracle.jdbc.driver.OracleDriver;
    import oracle.jdbc.driver.OraclePreparedStatement;
    import oracle.sql.CLOB;
    public class InsertOracleClobExample {
         public static void main(String[] args) {
              Properties jdbcProperties = new Properties();
              jdbcProperties.setProperty("user", "LOBTEST");
              jdbcProperties.setProperty("password", "LOBTEST");
              // jdbcProperties.setProperty("oracle.jdbc.defaultNChar", "true");
              Driver jdbcDriver = new OracleDriver();
              PreparedStatement pstmt = null;
              Connection connection = null;
              String tableName = "NCLOBTEST";
              CLOB clob = null;
              try {
                   connection = jdbcDriver.connect("jdbc:oracle:thin:@terre:1521:orcl", jdbcProperties);
                   pstmt = connection.prepareStatement("DELETE FROM NCLOBTEST");
                   pstmt.executeUpdate();
                   pstmt.close();
                   pstmt = connection.prepareStatement(
                        "INSERT INTO " + tableName + " (ID,SOMESTRING) VALUES (?,?)");
                   // Build a temporary CLOB and stream the characters into it.
                   clob = CLOB.createTemporary(pstmt.getConnection(), true, CLOB.DURATION_SESSION);
                   clob.open(CLOB.MODE_READWRITE);
                   Writer clobWriter = clob.getCharacterOutputStream();
                   clobWriter.write("Café 4,90€ TTC");
                   clobWriter.flush();
                   clobWriter.close();
                   clob.close();
                   OraclePreparedStatement opstmt = (OraclePreparedStatement) pstmt;
                   opstmt.setInt(1, 1);
                   opstmt.setFormOfUse(2, OraclePreparedStatement.FORM_NCHAR);
                   opstmt.setCLOB(2, clob);
                   System.err.println("Rows affected: " + opstmt.executeUpdate());
              } catch (Exception sqlex) {
                   sqlex.printStackTrace();
                   try {
                        clob.freeTemporary();
                   } catch (SQLException e) {
                        System.err.println("Cannot free temporary CLOB: " + e.getMessage());
                   }
              }
              try { pstmt.close(); } catch (SQLException sqlex) {}
              try { connection.close(); } catch (SQLException sqlex) {}
         }
    }

    The solution to this is to use a third-party driver. Oranxo works really well.
    - Chris
