Update oracle long field

Hello:
I'm trying to update an oracle long field with a prepared statement. Here's the
code:
objCallableStmt = objConnection.prepareCall(
"{call spUpdatetblContract(?,?)}");
objCallableStmt.setInt(1, intContractId);
objCallableStmt.setAsciiStream(2, new java.io.ByteArrayInputStream(strContractBody.getBytes()),
strContractBody.length());
I get the following error:
java.sql.SQLException: Data size bigger than max size for this type
I'm using BEA WebLogic Server 8.1 and Oracle9i Enterprise Edition Release 9.2.0.1.0

Armando Calzada wrote:
Thanks Joe
I'm using Oracle's Driver (Thin XA) Versions 8.1.7, 9.0.1, 9.2.0

Sure. And what happens if you get the connection directly from the Oracle driver
as opposed to a WebLogic pool? The same? Most likely it will be. Can you repeat
this code in a tiny standalone Java program that gets its connection from the
latest Oracle driver, and tell me what it does? This may simply be a limitation
of the Oracle thin driver.
Joe
>
Joe Weinstein <[email protected]> wrote:
Armando Calzada wrote:
Hello:
I'm trying to update an oracle long field with a prepared statement. Here's the
code:
objCallableStmt = objConnection.prepareCall(
"{call spUpdatetblContract(?,?)}");
objCallableStmt.setInt(1, intContractId);
objCallableStmt.setAsciiStream(2, new java.io.ByteArrayInputStream(strContractBody.getBytes()),
strContractBody.length());
I get the following error:
java.sql.SQLException: Data size bigger than max size for this type
I'm using BEA WebLogic Server 8.1 and Oracle9i Enterprise Edition Release 9.2.0.1.0
Whose Oracle driver are you using?
thanks
Joe
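
For reference, a tiny standalone test along the lines Joe suggests might look like the sketch below. It is only a sketch: the JDBC URL, credentials, and contract id are placeholders, and it assumes the same spUpdatetblContract procedure and a large contract body.

import java.io.ByteArrayInputStream;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class LongUpdateTest {
    public static void main(String[] args) throws Exception {
        // Connect directly with the Oracle thin driver, bypassing the WebLogic pool
        Class.forName("oracle.jdbc.driver.OracleDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");

        int intContractId = 1;                           // placeholder id
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < 100000; i++) sb.append('x'); // large body to reproduce the error
        String strContractBody = sb.toString();

        CallableStatement cs = con.prepareCall("{call spUpdatetblContract(?,?)}");
        cs.setInt(1, intContractId);
        byte[] bytes = strContractBody.getBytes();
        // Stream the text rather than binding it as a plain string
        cs.setAsciiStream(2, new ByteArrayInputStream(bytes), bytes.length);
        cs.execute();
        con.commit();

        cs.close();
        con.close();
    }
}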

Similar Messages

  • Issue with multiple Oracle Long Fields

    Hello,
     We have some reports that try to display two Oracle LONG data type fields. When we do so, we receive the error "ORA-01002: fetch out of sequence" when we connect the report through the "Oracle Server" connection type.
     If I execute the SQL statement of the report in "Oracle SQL Developer", there is no error.
         I have the following components installed on my Windows 7 PC:
          - Crystal Report 2008 (12.3.0.601)
          - Oracle Client 11.1.7.0
        If you want to replicate the issue you can create the following tables:
    create table TEST1_T (
      fld1 VARCHAR2(4),
      text LONG
    );
    create table TEST2_T (
      fld1 VARCHAR2(4),
      fld2 VARCHAR2(4),
      text LONG
    );
    In the TEST1_T table, insert data and populate the text field. In TEST2_T, populate fld1 so the two tables can be linked together; you can leave the text field blank.
    In the report, insert TEST1_T.fld1, TEST2_T.text and TEST1_T.text; if you hit preview you should get the ORA-01002 error.
    If you need more information let me know.
    Thank you.
    Charles
    P.S.: If we connect through the "CR Oracle ODBC Driver 5.3" and the first LONG field in the SQL statement is null, both fields are blank in the report, but if both fields are populated the report executes correctly.

    Hi,
       This is a known issue with CR when using Oracle LONG fields.
       See below for a direct copy of the information from KBase 1205489; hope it helps.
    Symptom
    In Crystal Reports, problems may arise when using more than one Oracle LONG data type field.
    Some of the symptoms you might see:
    Mixed data between two LONG data columns when displayed within the same report.
    Incorrect data displayed in the column which is based on a LONG field.
    Resolution
    Change LONG fields to CLOB fields in the Oracle table.
    Oracle recommends migrating any LONG data to the CLOB type starting in Oracle 9i.
    To make this change, use the TO_LOB method.
    Ken
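
    Purely as an illustration of the TO_LOB migration Ken mentions, here is a JDBC sketch against the TEST1_T table from the question above. The connection details and new table name are placeholders; TO_LOB is only allowed inside INSERT ... SELECT or CREATE TABLE ... AS SELECT.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LongToClobMigration {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
            Statement st = con.createStatement();

            // New table with a CLOB column in place of the LONG column
            st.executeUpdate("CREATE TABLE TEST1_T_NEW (fld1 VARCHAR2(4), text CLOB)");
            // TO_LOB converts the LONG value row by row during the copy
            st.executeUpdate("INSERT INTO TEST1_T_NEW (fld1, text) "
                           + "SELECT fld1, TO_LOB(text) FROM TEST1_T");
            con.commit();

            st.close();
            con.close();
        }
    }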

  • Can't update oracle Long data type

    Hi All,
    Setup
    Weblogic 5.1.0sp11, oracle 8.1.6, Solaris8, classes12.zip,
    Solaris_JDK_1.2.1_03a
    ConnectionPool:
    weblogic.jdbc.connectionPool.XXXDBPool=\
    url=jdbc:oracle:thin:@10.2.30.50:1521:ZZZZ,\
    driver=oracle.jdbc.driver.OracleDriver,\...etc
    Problem
    I can't seem to write to (insert/update) an oracle type Long column
    using connections from the weblogic connection pool. Each entry (as
    XML) in this column is fairly large (>1 MB). Reading from it works fine,
    however.
    The same configuration (minus the connection pool and associated
    code/config) via tomcat works fine (read/write), so I'm thinking the
    weblogic connection pool is giving me a connection object which
    doesn't update this Long column properly.
    Using the usual (again, works fine for tomcat):
    //Url = jdbc:weblogic:pool:XXXDBPool
    //driver = weblogic.jdbc.pool.Driver
    stat = con.prepareStatement("UPDATE BM_RM_USER_DIARY set DIARY=? where id=?");
    //stat.setString(1, diaryData); // use an input stream instead
    ByteArrayInputStream bais = new ByteArrayInputStream(diaryData.getBytes());
    //StringBufferInputStream bais = new StringBufferInputStream(diaryData); // deprecated
    stat.setAsciiStream(1, bais, bais.available());
    stat.setLong(2, userId);
    I'm not getting any errors from this update, so I can't really see
    what I might be doing wrong. It just returns fine, as if it has
    succeeded (rows updated = 1).
    Assuming I CAN NOT change these settings, are there any other ways I
    can programmatically get these inserts working? Is this even a known
    issue? I can't find a problem in these postings that quite matches
    mine.
    Any help appreciated.
    Cheers
    Alkesh

    Hi All
    Just to follow up on this.
    The problem was solved by explicitly calling commit() after
    the update has taken place.
    It seems as though my connection from the connection pool has
    autocommit set to false.
    Thanks
    Alkesh
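
    For reference, the fix Alkesh describes amounts to something like the sketch below. The pool URL, table, and data come from the post above; the explicit commit() is the relevant part.

    import java.io.ByteArrayInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class DiaryUpdateTest {
        public static void main(String[] args) throws Exception {
            String diaryData = "<diary>...</diary>"; // placeholder XML payload
            long userId = 42L;                       // placeholder id

            Class.forName("weblogic.jdbc.pool.Driver");
            Connection con = DriverManager.getConnection("jdbc:weblogic:pool:XXXDBPool");

            PreparedStatement stat = con.prepareStatement(
                    "UPDATE BM_RM_USER_DIARY set DIARY=? where id=?");
            byte[] bytes = diaryData.getBytes();
            stat.setAsciiStream(1, new ByteArrayInputStream(bytes), bytes.length);
            stat.setLong(2, userId);
            stat.executeUpdate();

            // The pooled connection has autocommit off, so commit explicitly;
            // otherwise the update is rolled back when the connection goes back to the pool.
            con.commit();

            stat.close();
            con.close();
        }
    }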

  • How to insert a record into an Oracle table from Java which has an Oracle LONG field?

    The problem is that only 80 characters are inserted into the LONG field; the same thing happens when I run the insert statement from SQL*Plus.
    Here is the code....
    java.io.File inputFile = new java.io.File("input.txt");
    int inputFileLen = (int) inputFile.length();
    java.io.InputStream ipStream = new java.io.FileInputStream(inputFile);
    System.out.println(inputFileLen);
    PreparedStatement myStmt = conn.prepareStatement("insert into LONG_EXAMPLE (notes,name)values(?,?)");
    myStmt.setAsciiStream(1, ipStream, inputFileLen);
    myStmt.setString(2,"Steve");
    int res = myStmt.executeUpdate();
    myStmt.close();
    Note: the size of input.txt here is between 250 bytes and 1 KB.
    Thanks in advance......:-)
    Cheers,
    Vetri !

    Hien
    I had a similar requirement to you and put the code on a push button. In my case the reports are being run interactively to the screen and then printed, and I have no way to know whether the report has actually been printed, so we rely on the user pressing the button. Of course there is a danger that the user forgets to press the button, but in our situation that would not be disastrous. Whatever method is used, I don't think there is any way to be sure that it has actually come out of the printer.

  • Updating a LONG RAW field in Oracle DB from SQL Server

    Hello Experts,
    I need to be able to update a LONG RAW field (binary; it looks like the SQL Server equivalent is image or varbinary(max)) which resides on an Oracle server and is connected to SQL Server 2012 via a linked server.
    I can retrieve data in general; I'm just trying to find out how to deal with LONG RAW.
    Thanks!
    CB

    Hello,
    It seems that the LONG datatypes in Oracle have a lot of restrictions. According to
    this blog: LONG and LONG RAW columns cannot be used in distributed SQL statements.
    In that case, you should update the LONG RAW column on the Oracle side. You can try to use OPENQUERY, as Rick posted above, to send the SQL statement to Oracle and execute it there.
    Regards,
    Fanny Liu
    TechNet Community Support

  • Oracle Provider for oledb 8i & 9i truncates LONG fields

    Anyone,
    I have a situation where my Oracle OLE DB driver truncates LONG fields.
    My application runs queries using the Oracle OLE DB driver.
    The query is a join between two tables and includes a LONG as one of the fields retrieved.
    When the Cursor Location is set to Server, the data returned is truncated.
    Setting the Cursor Location to Client, I can retrieve the complete data under specific cursor and lock types. The Microsoft driver for Oracle does not truncate.
    Note that a simple query without the join does not cause the problem.
    Has anyone else noticed this? I know it's quite specific.
    Thanks
    Stuart

    Hi,
    I've got the same problem, i.e. a SELECT of a LONG RAW column from
    joined/unioned tables truncates the column (via OraOLEDB).
    Unfortunately, setting ChunkSize to 65535 does not help me,
    because the data in the column are longer than 64 KB (and are truncated
    to 64 KB). Please, have you found any solution?
    Thanks for any reply.
    Milan

  • Oracle OLE DB driver 8i and 9i truncates LONG fields

    Anyone,
    I have a situation where my Oracle OLE DB driver truncates LONG fields.
    My application runs queries using the Oracle OLE DB driver.
    The query is a join between two tables and includes a LONG as one of the fields retrieved.
    When the Cursor Location is set to Server, the data returned is truncated.
    Setting the Cursor Location to Client, I can retrieve the complete data under specific cursor and lock types. The Microsoft driver for Oracle does not truncate.
    Note that a simple query without the join does not cause the problem.
    Has anyone else noticed this? I know it's quite specific.
    Otherwise, where do you submit bugs to Oracle?
    Thanks
    Stuart

    Hi,
    I've got the same problem, i.e. a SELECT of a LONG RAW column from
    joined/unioned tables truncates the column (via OraOLEDB).
    Unfortunately, setting ChunkSize to 65535 does not help me,
    because the data in the column are longer than 64 KB (and are truncated
    to 64 KB). Please, have you found any solution?
    Thanks for any reply.
    Milan

  • Update Yes/No field in access table through oracle procedure

    Hi,
    How can I update a Yes/No field in an Access table through an Oracle procedure? I can update all other fields, like AutoNumber and Text; how do I update a Yes/No field? Please, can anyone help me?
    Thanks and Regards,
    Sudha.

    Sudha Teki wrote:
    select "fldPost" from tblPHd@ODBCLNKNot quite sure what you mean, but the way you select the column would indicate a case sensitive column name
    Look at this example
    SQL> create table t
      2  ("this" varchar2(10))
      3  /
    Table created.
    SQL> insert into t values ('hello')
      2  /
    1 row created.
    SQL> select *
      2    from t
      3  /
    this
    hello
    SQL> select this
      2    from t
      3  /
    select this
    ERROR at line 1:
    ORA-00904: "THIS": invalid identifier
    SQL> select "this"
      2    from t
      3  /
    this
    hello
    Is your column name also case sensitive?

  • Oracle OLE DB provider truncates LONG field's data

    I am using Visual Basic 6, Oracle 8.1, and use ADO to connect to the Oracle database.
    I have a problem with the OLE DB provider OraOLEDB.Oracle reading a LONG field (like a MEMO field in Access).
    When I try to get the value of the LONG field, it is truncated to 100 characters.
    I do not have the problem using Microsoft Oracle OLE DB Provider MSDAORA.
    This is the Oracle table:
    CREATE TABLE tblA (
      tblA_KEY NUMBER(10) NOT NULL,
      tblA_MEMO LONG NULL
    );
    This is the VB 6 code:
    Set rs = New ADODB.Recordset
    rs.Open ssql, adoConn, adOpenForwardOnly, adLockReadOnly
    sMemo = "" & rs.Fields("tblA_Memo") -->> sMemo contains only the first 100
    characters of tblA_Memo
    How can I fix this?
    Thank you.

    Thanks. I assume you are talking about the 9.2 version of Oracle, right?
    While we are on the subject of LONG fields: in Oracle 8.1 I cannot run the following SQL statement:
    select * from tblA where tblA_Memo like '%york%'
    It gives me the error "ORA-00932: inconsistent datatypes".
    Can I do this in Oracle 9.2?
    Thank you.
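
    As far as I know, a LONG column cannot appear in a WHERE clause in any Oracle version, so the LIKE fails in 9.2 as well; the usual workaround is the LONG-to-CLOB migration mentioned in the earlier thread, after which LIKE works on the CLOB. A JDBC sketch against a hypothetical tblA_MEMO_CLOB copy of the column (connection details are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ClobLikeTest {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");

            // tblA_MEMO_CLOB is a hypothetical CLOB copy of the LONG column;
            // LIKE is allowed on a CLOB, unlike on a LONG.
            PreparedStatement ps = con.prepareStatement(
                    "SELECT tblA_KEY FROM tblA WHERE tblA_MEMO_CLOB LIKE ?");
            ps.setString(1, "%york%");
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getLong(1));
            }

            rs.close();
            ps.close();
            con.close();
        }
    }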

  • Want to update BLOB field to LONG field.

    We have two tables: one has a BLOB field and the other has a LONG field. I want to transfer the BLOB data into the LONG field. Please advise how I can do this.
    I tried DBMS_LOB and UTL_RAW.CAST_TO_VARCHAR2; it works, but it saves some unwanted characters at the beginning.
    Thanks
    Bakulesh

    As I understand it, the BLOB holds binary data, and you have to copy it to LONG RAW, not just LONG. The unwanted characters are that binary data. (They might really be of some use!)
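
    Purely as an illustration of copying BLOB data byte-for-byte into a LONG RAW column from Java (a swapped-in JDBC approach, not the DBMS_LOB/UTL_RAW route from the post; table and column names are made up):

    import java.io.InputStream;
    import java.sql.Blob;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class BlobToLongRawCopy {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
            con.setAutoCommit(false);

            // SRC_T(id NUMBER, data BLOB) and DST_T(id NUMBER, data LONG RAW)
            // are hypothetical tables standing in for the two tables in the post.
            PreparedStatement sel = con.prepareStatement("SELECT id, data FROM SRC_T");
            PreparedStatement ins = con.prepareStatement(
                    "INSERT INTO DST_T (id, data) VALUES (?, ?)");

            ResultSet rs = sel.executeQuery();
            while (rs.next()) {
                long id = rs.getLong(1);
                Blob blob = rs.getBlob(2);
                InputStream in = blob.getBinaryStream();
                ins.setLong(1, id);
                // Stream the raw bytes as-is; no character conversion is involved.
                ins.setBinaryStream(2, in, (int) blob.length());
                ins.executeUpdate();
                in.close();
            }
            con.commit();

            rs.close();
            sel.close();
            ins.close();
            con.close();
        }
    }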

  • How to reference LONG field in trigger

    I am trying to create a trigger (update and delete) for auditing purposes. The source table has a LONG field, and I need to insert all columns (including the LONG field) into an audit table if any column in the source record changes or if the source record is deleted. I cannot reference :old.longfield or do a select into.
    If anyone has a suggestion or has done it in the past, please email me at [email protected] or post an answer.
    Thank You so much for your help !!!!!

    As you have noticed, you cannot reference the value of a LONG column using :old/:new in a trigger. The workaround is to instead process your audit information (including the LONG value) in an after-statement trigger. Here is an example to get you started:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:605685::NO::F4950_P8_DISPLAYID

  • PHP - Load Gif image into Oracle Blob field - ORA-00972 error

    I am receiving an "ORA-00972: identifier is too long" error message when I try to update a BLOB field with the contents of a gif file.
    GIF FILES:
    c:\bl\x_PageLayout-4_LA.gif (15K)
    c:\bl\x_PageLayout-4_Spec.gif (21K)
    ===================================================================================================================================
    ORACLE DATABASE (STYLEELEMENTPIX TABLE):
    STYLE_ID NUMBER
    SEQ_KEY NUMBER
    PIX_NAME VARCHAR2(30 BYTE)
    PIX BLOB
    PIX_LABEL VARCHAR2(30 BYTE)
    MODIFY_DATE DATE
    PIX_TYPE CHAR(1 BYTE)
    DEFAULTDISPLAY CHAR(1 BYTE)
    PIX field currently is null
    ===================================================================================================================================
    PHP CODE:
    $filename = 'C:\BL\\';
    $filename .= $row->PIX_NAME; //filename of gif ex: c:\bl\x_PageLayout-4_LA.gif
    $fp = fopen($filename, "rb"); //open gif file
    $file_content = fread($fp, filesize($filename)); //read gif file
    //set gif file in PIX field (PIX datatype BLOB)
    $cursor1 = oci_parse($conn, "UPDATE STYLEELEMENTPIX SET PIX = '$file_content' WHERE STYLE_ID = $row->STYLE_ID AND SEQ_KEY = $row->SEQ_KEY");
    oci_execute ($cursor1);
    For both records, style id will be 100 ($row->STYLE_ID), and seq_key will be 1 for the first record and 2 for the second ($row->SEQ_KEY)
    ===================================================================================================================================
    ERROR MESSAGE:
    Warning: oci_parse() [function.oci-parse]: ORA-00972: identifier is too long in C:\wamp\www\eStyleGuide\Admin\BLOB.php on line 44 ($cursor1 = ....)

    Use a LOB locator. See "Inserting and Updating LOBs" on p 193 of the free book http://www.oracle.com/technetwork/topics/php/underground-php-oracle-manual-098250.html
    A more concerning issue is the security implications of using string concatenation to construct the SQL statement. It is recommended to use bind variables.
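
    For comparison only, here is the same bind-variable idea sketched in JDBC; the file name and key values are placeholders taken from the post, and for PHP itself the OCI8 LOB-locator example in the manual linked above is the one to follow.

    import java.io.File;
    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BlobBindExample {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");

            File gif = new File("C:\\BL\\x_PageLayout-4_LA.gif");
            FileInputStream in = new FileInputStream(gif);

            // The file content is bound as a parameter rather than concatenated into
            // the SQL text, which avoids both the quoting breakage behind ORA-00972
            // and the SQL injection risk.
            PreparedStatement ps = con.prepareStatement(
                    "UPDATE STYLEELEMENTPIX SET PIX = ? WHERE STYLE_ID = ? AND SEQ_KEY = ?");
            ps.setBinaryStream(1, in, (int) gif.length());
            ps.setInt(2, 100);
            ps.setInt(3, 1);
            ps.executeUpdate();
            con.commit();

            in.close();
            ps.close();
            con.close();
        }
    }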

  • Oracle error ORA-01461 when trying to insert into an ORACLE BLOB field

    I am getting Oracle error "ORA-01461: can bind a LONG value only for insert into a LONG column" when trying to insert into an ORACLE BLOB field. The error occurs when trying to insert a large BLOB (JPG), but does not occur when inserting a small (<1K) picture BLOB (JPG). Any ideas?
    BTW, when using a SQL Server datasource using the same code.... everything works with no problems.
    ORACLE version is 11.2.0.1
    The ORACLE datasource is JDBC using Oracle's JDBC driver ojdbc6.jar v11.2.0.1 (I also have tried ojdbc5.jar v11.2.0.1; ojdbc5.jar v11.2.0.4; and ojdbc6.jar v11.2.0.4 with the same error result.)
    Here is my code:
    <cfset file_mime = Lcase(Right(postedXMLRoot.objname.XmlText, 3))>
    <cfif file_mime EQ 'jpg'><cfset file_mime = 'jpeg'></cfif>
    <cfset file_mime = 'data:image/' & file_mime & ';base64,'>
    <cfset image64 = ImageReadBase64("#file_mime##postedXMLRoot.objbase64.XmlText#")>
    <cfset ramfile = "ram://" & postedXMLRoot.objname.XmlText>
    <cfimage action="write" source="#image64#" destination="#ramfile#" overwrite="true">
    <cffile action="readbinary" file="#ramfile#" variable="image_bin">
    <cffile action="delete" file="#ramfile#">
    <cfquery name="InsertImage" datasource="#datasource#">
    INSERT INTO test_images
      (image_blob)
    SELECT
      <cfqueryparam value="#image_bin#" cfsqltype="CF_SQL_BLOB">
    FROM dual
    </cfquery>

    Can't you use "alter index <schema.spatial_index_name> rebuild ONLINE"? Thanks. I could switch to "rebuild ONLINE" and see if that helps. Are there any potential adverse effects going forward, e.g. a significantly longer rebuild than when not using the ONLINE keyword? Also wondering if spatial index operations (index type = DOMAIN) obey all the typical things you'd expect with "regular" indexes, e.g. B-TREE, etc.

  • Problem with update of BLOB field in a table with compound primary key

    Hi,
    I've been developing an application in Application Express 3.1.2.00.02 that includes processing of BLOB data in one of the tables (ZPRAVA). Unfortunately, I've come across strange behaviour when I tried to update the value of a BLOB field for an existing record via a DML form process. Insert of a new record including the BLOB value is OK (the binary file uploads upon submitting the form without any problems). I haven't changed the DML process in any way. The form update process used to work perfectly before I included the BLOB field. Since then, I keep getting this error when trying to update the BLOB field:
    ORA-20505: Error in DML: p_rowid=3, p_alt_rowid=ID, p_rowid2=CZ000001, p_alt_rowid2=PR_ID. ORA-01008: not all variables bound
    Unable to process row of table ZPRAVA.
    OK
    Some time ago, I created another application where I used a similar form that operated on a BLOB field without problems. The only, but maybe very important, difference between the two cases is that the first, successful one is based on a table with a standard one-column primary key, whereas the second (problematic) one uses a table with a compound (composite) two-column PK (two varchar2 fields: ID, PR_ID).
    In both cases, I've followed this tutorial: [http://www.oracle.com/technology/obe/apex/apex31nf/apex31blob.htm]).
    Can anybody confirm my suspicion that Automatic Row Processing (DML) can only be used for updating BLOB fields in tables with single-column primary keys?
    Thanks in advance.
    Zdenek

    Is there a chance that the fix will be included in the next patch?
    No, this fix will be in the next full version, 3.2.
    Scott

  • Modifying/Updating User Defined Field in a Scheduled Task

    I've written a notification task to send an e-mail to a manager who has a contract employee with a contract that is about to expire.
    Once we isolate a user who has a contract about to expire, we send a notification to the manager. The date that the notification is sent out should be stored in the USR table in a user-defined field, "USR_UDF_LASTSENT."
    Updating this USR_UDF_LASTSENT field is where I'm having difficulty.
    I've tried using the UserManager in a couple of ways. Suppose I've isolated a single user using SearchCriteria and the UserManager and have a single User object called "currentUser". I want to store a Date object in the user-defined field "USR_UDF_LASTSENT".
    Date today = new Date();
    I've tried: currentUser.setAttribute("USR_UDF_LASTSENT", today); //This will run without error, but when I check the DB there is no change to the attribute.
    With a defined instance of UserManager called userManager, I've tried: userManager.modify("USR_UDF_LASTSENT", today, currentUser); // This fails with: oracle.iam.identity.exception.NoSuchUserException: IAM-3054135:No user found for the criteria USR_UDF_LASTSENT-9/24/13 2:58 PM.:USR_UDF_LASTSENT:9/24/13 2:58 PM. It looks like it's doing a search rather than a modification.
    I've also tried using the entity manager in the following way:
    Date today = new Date();
    HashMap<String, Object> mapAttrs = new HashMap<String, Object>(); 
    mapAttrs.put("USR_UDF_LASTSENT", today); 
    EntityManager entMgr = Platform.getService(EntityManager.class); 
    entMgr.modifyEntity("User", currentUser.getEntityId(), mapAttrs);
    But it returns with this error: Failed: oracle.iam.platform.entitymgr.UnknownAttributeException: User : [USR_UDF_LASTSENT]
    Is my entityType, "User" inappropriate in this case? What should be used here?
    How can I Set or Update this user defined field from a scheduled task?

    Thanks guys. I did go to the Identity System Administration console and chose 'Export' under "System Management", which I believe Kevin may have been hinting at. I got an XML export of the AttributeDefinitions for our user-defined fields. In this file, there was a header for the attribute I was looking for:
    <AttributeDefinition repo-type="API" name="LastSent" subtype="User Metadata">
       <multiValued>
       <backendName>usr_udf_lastsent</backendName>
    I put the string "LastSent" in place of USR_UDF_LASTSENT in the EntityManager version of my attempt at this task. I believe this is what Kevin and delhi were getting at.
    This didn't work:
    Date today = new Date();
    HashMap<String, Object> mapAttrs = new HashMap<String, Object>(); 
    mapAttrs.put("USR_UDF_LASTSENT", today); 
    EntityManager entMgr = Platform.getService(EntityManager.class); 
    entMgr.modifyEntity("User", currentUser.getEntityId(), mapAttrs);
    But this did:
    Date today = new Date();
    HashMap<String, Object> mapAttrs = new HashMap<String, Object>(); 
    mapAttrs.put("LastSent", today); 
    EntityManager entMgr = Platform.getService(EntityManager.class); 
    entMgr.modifyEntity("User", currentUser.getEntityId(), mapAttrs);
    I wonder if currentUser.setAttribute("LastSent", today); would work... Hmm.
