Value too large error while upgrading

Hi,
We have been upgrading EBS 11i to R12.1.1 on IBM AIX 6.1 with database 11.2.0.3.
While applying patch 6678700 we hit an issue: the worker fails with:
xxxxx/log> tail -f adwork002.log
sqlplus -s APPS/***** @xxxx/apps/apps_st/appl/ap/12.0.0/patch/115/sql/apstca01.sql &un_ap &batchsize 24 40
DECLARE
ERROR at line 1:
ORA-12899: value too large for column "AP"."AP_SUPPLIER_SITES_ALL"."CITY"
(actual: 26, maximum: 25)
ORA-06512: at line 631
Time when worker failed: Sat Oct 26 2013 02:27:10
What should we do?
I searched on Metalink but could not find anything relevant.
Thanks
dba88

Please see Apstca01.Sql Faile In 6678700 (Doc ID 1449827.1).
Thanks,
Hussein
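
In the meantime, here is a minimal sketch of how one might locate the offending data before restarting the failed worker. The assumption that the 11i supplier site data feeding this migration lives in PO_VENDOR_SITES_ALL is mine, not from the note, so verify against the referenced document first:

    -- Hedged sketch: find supplier site rows whose CITY value exceeds the
    -- 25-character limit reported by ORA-12899 (source table is an assumption).
    SELECT vendor_site_id, city, LENGTH(city) AS city_len
    FROM   po.po_vendor_sites_all
    WHERE  LENGTH(city) > 25;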

Similar Messages

  • How to find the exact column which fired the 'column value too large' error

    Hi all,
    I have a procedure which has an insert statement in it.
    I have encountered a 'column value too large' error in that procedure.
    I want to know exactly in which column the error occurred.
    Any ideas?
    Thanks
    Hari

    What is your insert statement? The error reports the exact column name, as in this example:
    SQL> create table sample1(col1 number(1), col2 number(3), col3 varchar2(2))
      2  /
    Table created.
    SQL> insert into sample1
      2  select 1, 2, 'A' from dual
      3  union
      4  select 1,333, 'B' from dual
      5  union
      6  select 2, 44, 'CCC' from dual
      7  /
    insert into sample1
    ERROR at line 1:
    ORA-12899: value too large for column "ETL_USER"."SAMPLE1"."COL3" (actual: 3,
    maximum: 2)
    Cheers
    Sarma.

  • Discoverer 'Value too large' error

    Hi All,
    I have been stuck on a problem with Oracle Discoverer Plus 10g (10.1.2.55.26) for the last two days and have tried all the options mentioned on the forums and elsewhere, but nothing seems to help, so I'm turning to the experts. Below is a description:
    We have a Discoverer report which gives us 'ORA-12899: value too large for column' whenever we try to schedule it through the Scheduler. The report runs fine if we run it manually. To remediate the problem, I did the following:
    1. Identified the Business Areas associated with the report and mapped the column in question to its base table.
    2. Saw that the column is a varchar2 column, so I altered its length from 50 to 200.
    3. Once the column was altered, I deleted it from all the complex folders in its Business Areas and added it again by copying from the simple folder (which maps directly to the column in the table).
    4. Then I refreshed all the BAs so that the new column length would be picked up everywhere; the refresh completed successfully.
    5. I deleted all the existing scheduled workbooks with errors and tried to schedule again.
    Even then, the workbook still throws the same 'ORA-12899 value too large for column (actual: 57, maximum: 50)'. I'm not sure which missing step in refreshing the EUL is causing this error.
    Please advise!
    Thanks.

    Hi,
    Just wanted to mention that we were able to locate and fix the problem yesterday evening. There was another complex folder which also had the same element, and we missed changing that the first time. After making that change, then deleting and rescheduling the report, it started to work perfectly.
    Thanks.

  • Varchar type column value too large error (ORA-12899)

    Hi,
    I'm using build ODI_11.1.1.3.0_GENERIC_100623.1635.
    I'm trying to move data from SQL Server 2005 to Oracle DB 11.2.
    Integration Knowledge Module: IKM SQL INCREMENTAL UPDATE
    Loading Knowledge Module: LKM SQL TO ORACLE
    The problem is with a VARCHAR column of length 20.
    The target column type is VARCHAR2, length 20.
    Integration fails on the loading step with the message: java.sql.BatchUpdateException: ORA-12899: value too large for column "DW_WORKUSER"."C$_10PLFACT"."C65_SOURCENO" (actual: 26, maximum: 20)
    Column mapping is executed on the source.
    In the column mapping I'm not using any conversion, casting or other logic; I'm just moving the column to the target table.
    Can anyone make some suggestions?

    Hi,
    This is due to bytes, not chars. On your SQL Server db you can have multi-byte characters (as you can elsewhere). You may need to modify the C$/I$ and your target tables to bytes rather than chars. You may also want to check the character sets used on the source and target.
    eg
    VARCHAR2(20 BYTE)
    Cheers
    Bos
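    Following up on that point, a minimal sketch of how one might confirm which length semantics the staging and target columns currently use, and restate them explicitly. The target table and column names below are illustrative, not taken from this thread:
    -- CHAR_USED is 'B' for byte semantics and 'C' for character semantics.
    SELECT table_name, column_name, data_length, char_length, char_used
    FROM   user_tab_columns
    WHERE  table_name IN ('C$_10PLFACT', 'PLFACT_TARGET');   -- PLFACT_TARGET is hypothetical
    -- Widen or restate the column so multi-byte source data fits; choose the
    -- semantics that match your requirement.
    ALTER TABLE plfact_target MODIFY (sourceno VARCHAR2(20 CHAR));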

  • ODI Datastore Length differs with the DB length -IKM throws value too large

    When reverse engineered, the ODI datastore shows a different length from the actual column length in the database.
    ODI datastore column details: char(44)
    Target db column: varchar2(11 char)
    The I$ table inserts char(44) into varchar2(11 char) in the target. As the source column value is empty, ODI throws
    "ORA-12899: value too large for column (actual: 44, maximum: 11)".

    Yes, I have reverse engineered the target also.
    Source datatype: varchar2(11 char)
    After reverse engineering, the ODI datastore datatype (source) is char(44).
    Target datatype: varchar2(11 char)
    After reverse engineering, the ODI datastore datatype (target) is char(44).
    Since the target datastore is char(44) in the ODI datastore and the values in the source column are null/spaces, the IKM inserts them into the target column, which is 11 char, and the 'value too large' error mentioned above occurs.
    There are no junk values in the column, and I tried substr(column,1,7) and
    trim functions too; they do not help.

  • PLS-00123: Program too large error

    Hi,
    We have been getting a 'PLS-00123: Program too large' error while compiling a particularly large package on an 11g database. The package has around 15,000 lines of code. We are compiling the package using a third-party IDE (PL/SQL Developer).
    I went through AskTom's note (http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:571023051648) and Metalink Note 62603.1, which helped in understanding the problem. However, I could not resolve it, since modifying the structure of the package was not an option.
    Someone from the team contacted a DBA, who suggested that we compile the package in SQL*Plus. We did as suggested, on a 10g client connected to the 11g database, and the package compiled.
    I cannot understand why we encountered the error while compiling the package in PL/SQL Developer but not while doing the same from SQL*Plus. Can anyone please explain this or point me to some helpful documents?
    Regards,
    Sujoy

    Hope this link helps:
    http://forums.allroundautomations.com/ubb/ubbthreads.php?ubb=showflat&Number=10265&PHPSESSID=2e3c1b028a66500ae8a1730abc88b993
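    One possibility worth checking, offered only as a hedged guess consistent with the link above: IDEs often compile PL/SQL with debug information, which inflates the compiled unit and can push a very large package over the PLS-00123 limit, while a plain SQL*Plus compile does not. A small sketch for checking and changing that setting (the package name is illustrative):
    -- See how the unit was last compiled.
    SELECT name, type, plsql_debug, plsql_optimize_level
    FROM   user_plsql_object_settings
    WHERE  name = 'MY_LARGE_PKG';
    -- Recompile without debug information.
    ALTER PACKAGE my_large_pkg COMPILE PLSQL_DEBUG = FALSE;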

  • Inserted value too large for column error while scheduling a job

    Hi Everyone,
    I am trying to schedule a PL/SQL script as a job in my Oracle 10g database, installed and running on Windows XP.
    While trying to submit the job I get the error "Inserted value too large for column:" followed by my entire code. The code is correct; it compiles and runs in Oracle ApEx's SQL Workshop.
    The code is 4136 characters, 4348 bytes and 107 lines long. It sends an e-mail and contains a utl_smtp.write_data() call with a lot of HTML.
    There is no insert statement in the code whatsoever; it only queries the database for data...
    Any idea why I might be getting this error?
    Thanks in advance
    Sid

    The size of my code is 4136 characters, 4348 bytes and 107 lines long. It is a code that sends an e-mail and has a utl_smtp.write_data(Lots of HTML)
    The SQL variable that holds the job text has a maximum size of 4000, so your code is too long to submit directly.
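    A minimal sketch of the usual way around that limit, assuming the job is submitted through DBMS_JOB (the procedure name and schedule are illustrative): wrap the long block in a stored procedure and submit only the short call.
    CREATE OR REPLACE PROCEDURE send_status_mail AS
    BEGIN
      NULL;  -- the original 107-line UTL_SMTP e-mail code would live here
    END;
    /
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT(job       => l_job,
                      what      => 'send_status_mail;',
                      next_date => SYSDATE,
                      interval  => 'SYSDATE + 1');
      COMMIT;
    END;
    /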

  • 'Value too large for defined data type' error while running flexanlg

    While trying to run flexanlg to analyze my access log file I have received the following error:
    Could not open specified log file 'access': Value too large for defined data type
    The command I was running is
    ${iPLANET_HOME}/extras/flexanlg/flexanlg -F -x -n "Web Server" -i ${TMP_WEB_FILE} -o ${OUT_WEB_FILE} -c hnrfeuok -t s5m5h5 -l h30c+5 -p ctl
    which should generate an HTML report of the web statistics.
    The file has approximately 7 million entries and is 2.3 GB in size.
    Ideas?

    I've concatenated several files together from my web servers as I wanted a single report; several reports based on individual web servers are no use.
    I'm running iWS 6.1 SP6 on Solaris 10, on a zoned T2000
    SunOS 10 Generic_118833-23 sun4v sparc SUNW,Sun-Fire-T200
    Cheers
    Chris

  • I am getting error "ORA-12899: value too large for column".

    I am getting error "ORA-12899: value too large for column" after upgrading to 10.2.0.4.0
    Field is updating only through trigger with hard coded value.
    This happens randomly not everytime.
    select * from v$version
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE     10.2.0.4.0     Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    Table Structure
    desc customer
    Name           Null?     Type
    CTRY_CODE      NOT NULL  CHAR(3 Byte)
    CO_CODE        NOT NULL  CHAR(3 Byte)
    CUST_NBR       NOT NULL  NUMBER(10)
    CUST_NAME                CHAR(40 Byte)
    RECORD_STATUS            CHAR(1 Byte)
    Trigger on the table
    CREATE OR REPLACE TRIGGER CUST_INSUPD
    BEFORE INSERT OR UPDATE
    ON CUSTOMER FOR EACH ROW
    BEGIN
    IF INSERTING THEN
    :NEW.RECORD_STATUS := 'I';
    ELSIF UPDATING THEN
    :NEW.RECORD_STATUS := 'U';
    END IF;
    END;
    ERROR at line 1:
    ORA-01001: invalid cursor
    ORA-06512: at "UPDATE_CUSTOMER", line 1320
    ORA-12899: value too large for column "CUSTOMER"."RECORD_STATUS" (actual: 3,
    maximum: 1)
    ORA-06512: at line 1

    SQL> create table customer(
      2  CTRY_CODE  CHAR(3 Byte) not null,
      3  CO_CODE  CHAR(3 Byte) not null,
      4  CUST_NBR NUMBER(10) not null,
      5  CUST_NAME CHAR(40 Byte) ,
      6  RECORD_STATUS CHAR(1 Byte)
      7  );
    Table created.
    SQL> CREATE OR REPLACE TRIGGER CUST_INSUPD
      2  BEFORE INSERT OR UPDATE
      3  ON CUSTOMER FOR EACH ROW
      4  BEGIN
      5  IF INSERTING THEN
      6  :NEW.RECORD_STATUS := 'I';
      7  ELSIF UPDATING THEN
      8  :NEW.RECORD_STATUS := 'U';
      9  END IF;
    10  END;
    11  /
    Trigger created.
    SQL> insert into customer(CTRY_CODE,CO_CODE,CUST_NBR,CUST_NAME,RECORD_STATUS)
      2                values('12','13','1','Mahesh Kaila','UPD');
                  values('12','13','1','Mahesh Kaila','UPD')
    ERROR at line 2:
    ORA-12899: value too large for column "HPVPPM"."CUSTOMER"."RECORD_STATUS"
    (actual: 3, maximum: 1)
    SQL> insert into customer(CTRY_CODE,CO_CODE,CUST_NBR,CUST_NAME)
      2                values('12','13','1','Mahesh Kaila');
    1 row created.
    SQL> set linesize 200
    SQL> select * from customer;
    CTR CO_   CUST_NBR CUST_NAME                                R
    12  13           1 Mahesh Kaila                             I
    SQL> update customer set cust_name='tst';
    1 row updated.
    SQL> select * from customer;
    CTR CO_   CUST_NBR CUST_NAME                                R
    12  13           1 tst                                      U
    Recheck your code once again.. somewhere you are using the RECORD_STATUS column for insert or update.
    Ravi Kumar

  • Error on reverse on XML: value too large for column

    Hi All,
    I am trying to reverse engineer while creating the data model on XML technology.
    My JDBC URL on data server reads this:
    jdbc:snps:xml?d=../demo/abc/CustomerPartyEBO.xsd&s=MYEBO
    I get an error while doing the reverse.
    java.sql.SQLException: ORA-12899: value too large for column "PINW"."SNP_REV_KEY_COL"."KEY_NAME" (actual: 102, maximum: 100)
    After some checking through selective reverse engineering, I found that this happens only for a few tables whose names are quite long.
    I tried setting the "maximum column name length" and "maximum table name length" to 120 and even higher values on the XML technology in Topology Manager. No luck there.
    Thanks in advance for any help here.

    That is not the place to change it.
    The error states that SNP_REV_KEY_COL.KEY_NAME in the work repository schema PINW has a maximum defined length of 100.
    I do not know if Oracle will support this change, but you will have to modify the work repository table SNP_REV_KEY_COL and increase the column length as a workaround.
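    A minimal sketch of that workaround, with the same caveat (it modifies a repository table, so it may be unsupported); the new length of 128 is an arbitrary choice, and PINW is simply the schema named in the error:
    ALTER TABLE pinw.snp_rev_key_col MODIFY (key_name VARCHAR2(128));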

  • Error: Value too large for DEF_VALUE of SNP_REV_COL

    I am getting the following error while importing my work repository:
    Error: Value too large for DEF_VALUE of SNP_REV_COL
    Can anyone please let me know the root cause of the issue?
    I found the following workaround to resolve it:
    I changed the DEF_VALUE column length in both the SNP_REV_COL and SNP_COL tables.
    I am using ODI version 10.1.3.5.
    alter table SNP_REV_COL modify DEF_VALUE VARCHAR2(400);
    alter table SNP_COL modify DEF_VALUE VARCHAR2(400);
    I am able to import the work repository without any issues after changing the above columns.
    I am looking for the reason why this issue is occurring.
    Thanks,
    Yellanki

    Ankit,
    I am trying to move my Dev work repository to Test. My source technologies are SQL Server and Oracle; the target is Oracle. I already have the workaround for this issue and am looking for its root cause.
    Any help is greatly appreciated.
    Thanks,
    Yellanki

  • SQL Error: ORA-12899: value too large for column

    Hi,
    I'm trying to understand the above error. It occurs when we are migrating data from one oracle database to another:
    Error report:
    SQL Error: ORA-12899: value too large for column "USER_XYZ"."TAB_XYZ"."COL_XYZ" (actual: 10, maximum: 8)
    12899. 00000 - "value too large for column %s (actual: %s, maximum: %s)"
    *Cause:    An attempt was made to insert or update a column with a value
    which is too wide for the width of the destination column.
    The name of the column is given, along with the actual width
    of the value, and the maximum allowed width of the column.
    Note that widths are reported in characters if character length
    semantics are in effect for the column, otherwise widths are
    reported in bytes.
    *Action:   Examine the SQL statement for correctness.  Check source
    and destination column data types.
    Either make the destination column wider, or use a subset
    of the source column (i.e. use substring).
    The source database runs - Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    The target database runs - Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    The source and target table are identical and the column definitions are exactly the same. The column we get the error on is of CHAR(8). To migrate the data we use either a dblink or oracle datapump, both result in the same error. The data in the column is a fixed length string of 8 characters.
    To resolve the error the column "COL_XYZ" gets widened by:
    alter table TAB_XYZ modify (COL_XYZ varchar2(10));
    -alter table TAB_XYZ succeeded.
    We now move the data from the source into the target table without problem and then run:
    select max(length(COL_XYZ)) from TAB_XYZ;
    -8
    So the maximal string length for this column is 8 characters. To reduce the column width back to its original 8, we then run:
    alter table TAB_XYZ modify (COL_XYZ varchar2(8));
    -Error report:
    SQL Error: ORA-01441: cannot decrease column length because some value is too big
    01441. 00000 - "cannot decrease column length because some value is too big"
    *Cause:   
    *Action:
    So we leave the column width at 10, but the curious thing is - once we have the data in the target table, we can then truncate the same table at source (ie. get rid of all the data) and move the data back in the original table (with COL_XYZ set at CHAR(8)) - without any issue.
    My guess is that the error has something to do with storage on the target database, but I would like to understand why. If anybody has an idea or suggestion of what to look for, it would be much appreciated.
    Cheers.

    843217 wrote:
    Note that widths are reported in characters if character length
    semantics are in effect for the column, otherwise widths are
    reported in bytes.
    You are looking at character lengths vs byte lengths.
    The data in the column is a fixed length string of 8 characters.
    select max(length(COL_XYZ)) from TAB_XYZ;
    -8
    So the maximal string length for this column is 8 characters. To reduce the column width back to its original 8, we then run:
    alter table TAB_XYZ modify (COL_XYZ varchar2(8));
    Is that varchar2(8 byte) or varchar2(8 char)?
    Use the SQL Reference for datatype specification, the LENGTH function, etc.
    For more info, see the Globalization Support forum on the topic and, of course, the Globalization Support Guide.
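    A minimal sketch illustrating the byte-versus-character point above, assuming a multi-byte database character set such as AL32UTF8 (the table name is illustrative):
    CREATE TABLE semantics_demo (
      col_byte VARCHAR2(8 BYTE),
      col_char VARCHAR2(8 CHAR)
    );
    -- Eight accented characters fit under character semantics...
    INSERT INTO semantics_demo (col_char) VALUES ('üüüüüüüü');
    -- ...but occupy 16 bytes in AL32UTF8, so byte semantics raise ORA-12899.
    INSERT INTO semantics_demo (col_byte) VALUES ('üüüüüüüü');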

  • Data Profiling - Value too large for column error

    I am running a data profile which completes with errors. The error reported is ORA-12899: value too large for column (actual: 41, maximum: 40).
    I have checked the actual data in the table and the maximum length is only 40 characters.
    Any ideas on how to solve this? Even though the profile completes, no actual profiling is done on the data because of the error.
    OWB version 11.2.0.1
    Log file below.
    Job     Rows Selected     Rows Inserted     Rows Updated     Rows Deleted     Errors     Warnings     Start Time     Elapsed Time     
    Profile_1306385940099                                   2011-05-26 14:59:00.0     106     
    Data profiling operations complete.                                             
    Redundant column analysis for objects complete in 0 s.                                              
    Redundant column analysis for objects.                                              
    Referential analysis for objects complete in 0.405 s.                                              
    Referential analysis for objects.                                              
    Referential analysis initialization complete in 8.128 s.                                             
    Referential analysis initialization.                                             
    Data rule analysis for object TABLE_NAME complete in 0 s.                                             
    Data rule analysis for object TABLE_NAME                                             
    Functional dependency and unique key discovery for object TABLE_NAME complete in 0 s.                                             
    Functional dependency and unique key discovery for object TABLE_NAME                                             
    Domain analysis for object TABLE_NAME complete in 0.858 s.                                             
    Domain analysis for object TABLE_NAME                                             
    Pattern analysis for object TABLE_NAME complete in 0.202 s.                                             
    Pattern analysis for object TABLE_NAME                                             
    Aggregation and Data Type analysis for object TABLE_NAME complete in 9.236 s.                                             
    Aggregation and Data Type analysis for object TABLE_NAME                                             
    Functional dependency and unique key discovery for object TABLE_NAME complete in 0 s.                                             
    Functional dependency and unique key discovery for object TABLE_NAME                                             
    Domain analysis for object TABLE_NAME complete in 0.842 s.                                             
    Domain analysis for object TABLE_NAME                                             
    Pattern analysis for object TABLE_NAME complete in 0.187 s.                                             
    Pattern analysis for object TABLE_NAME                                             
    Aggregation and Data Type analysis for object TABLE_NAME complete in 9.501 s.                                             
    Aggregation and Data Type analysis for object TABLE_NAME                                             
    Functional dependency and unique key discovery for object TABLE_NAME complete in 0 s.                                             
    Functional dependency and unique key discovery for object TABLE_NAME                                             
    Domain analysis for object TABLE_NAME complete in 0.717 s.                                             
    Domain analysis for object TABLE_NAME                                             
    Pattern analysis for object TABLE_NAME complete in 0.156 s.                                             
    Pattern analysis for object TABLE_NAME                                             
    Aggregation and Data Type analysis for object TABLE_NAME complete in 9.906 s.                                             
    Aggregation and Data Type analysis for object TABLE_NAME                                             
    Functional dependency and unique key discovery for object TABLE_NAME complete in 0 s.                                             
    Functional dependency and unique key discovery for object TABLE_NAME                                             
    Domain analysis for object TABLE_NAME complete in 0.827 s.                                             
    Domain analysis for object TABLE_NAME                                             
    Pattern analysis for object TABLE_NAME complete in 0.187 s.                                             
    Pattern analysis for object TABLE_NAME                                             
    Aggregation and Data Type analysis for object TABLE_NAME complete in 9.172 s.                                             
    Aggregation and Data Type analysis for object TABLE_NAME                                             
    Functional dependency and unique key discovery for object TABLE_NAME complete in 0 s.                                             
    Functional dependency and unique key discovery for object TABLE_NAME                                             
    Domain analysis for object TABLE_NAME complete in 0.889 s.                                             
    Domain analysis for object TABLE_NAME                                             
    Pattern analysis for object TABLE_NAME complete in 0.202 s.                                             
    Pattern analysis for object TABLE_NAME                                             
    Aggregation and Data Type analysis for object TABLE_NAME complete in 9.313 s.                                             
    Aggregation and Data Type analysis for object TABLE_NAME                                             
    Execute data prepare map for object TABLE_NAME complete in 9.267 s.                                             
    Execute data prepare map for object TABLE_NAME                                             
    Execute data prepare map for object TABLE_NAME complete in 10.187 s.                                             
    Execute data prepare map for object TABLE_NAME                                             
    Execute data prepare map for object TABLE_NAME complete in 8.019 s.                                             
    Execute data prepare map for object TABLE_NAME                                             
    Execute data prepare map for object TABLE_NAME complete in 5.507 s.                                             
    Execute data prepare map for object TABLE_NAME                                             
    Execute data prepare map for object TABLE_NAME complete in 10.857 s.                                             
    Execute data prepare map for object TABLE_NAME                                             
    Parameters                                             
    O82647310CF4D425C8AED9AAE_MAP_ProfileLoader                              1     2011-05-26 14:59:00.0     11     
    ORA-12899: value too large for column "SCHEMA"."O90239B0C1105447EB6495C903678"."ITEM_NAME_1" (actual: 41, maximum: 40)                                             
    Parameters                                             
    O68A16A57F2054A13B8761BDC_MAP_ProfileLoader                              1     2011-05-26 14:59:11.0     5     
    ORA-12899: value too large for column "SCHEMA"."O0D9332A164E649F3B4D05D045521"."ITEM_NAME_1" (actual: 41, maximum: 40)                                             
    Parameters                                             
    O78AD6B482FC44D8BB7AF8357_MAP_ProfileLoader                              1     2011-05-26 14:59:16.0     9     
    ORA-12899: value too large for column "SCHEMA"."OBF77A8BA8E6847B8AAE4522F98D6"."ITEM_NAME_2" (actual: 41, maximum: 40)                                             
    Parameters                                             
    OA79DF482D74847CF8EA05807_MAP_ProfileLoader                              1     2011-05-26 14:59:25.0     10     
    ORA-12899: value too large for column "SCHEMA"."OB0052CBCA5784DAD935F9FCF2E28"."ITEM_NAME_1" (actual: 41, maximum: 40)                                             
    Parameters                                             
    OFFE486BBDB884307B668F670_MAP_ProfileLoader                              1     2011-05-26 14:59:35.0     9     
    ORA-12899: value too large for column "SCHEMA"."O9943284818BB413E867F8DB57A5B"."ITEM_NAME_1" (actual: 42, maximum: 40)                                             
    Parameters

    Found the answer: it was down to the database character set supporting multi-byte characters.
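    For anyone hitting the same thing, a minimal sketch of how one might confirm the character set and length semantics in play before re-running the profile:
    SELECT parameter, value
    FROM   nls_database_parameters
    WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_LENGTH_SEMANTICS');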

  • Error: value too large for column ?!?

    I use LKM SQL to Oracle and IKM Oracle Incremental Update. When I execute the interface I get an error message:
    12899 : 72000 : java.sql.BatchUpdateException: ORA-12899: value too large for column "SAMPLE"."C$_0EMP"."C4_EMP_NM" (actual: 17, maximum: 15)
    I checked the source and the maximum length is 15, so I don't understand where ODI finds 17 characters. Both columns (source and target) are Varchar2(15).
    Then I replaced the mapping with Substr(EMP_NM,1,15) to make sure ODI would get only 15 characters in the staging area, but... it didn't work. I even tried Substr(EMP_NM,1,10), but no luck.
    Why does this ODI error occur? It doesn't make any sense...

    It didn't work.
    After I changed the length of that column in the source datastore (from 15 to 16), ODI created the temporary tables (C$ and I$) with larger columns (16 instead of 15), but I got the same error message.
    I'm wondering why I have to extend the length of the source datastore in the source model if there are no values in the source table with a length greater than 15...
    Any other ideas? Thanks!
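    A hedged check that often explains this kind of mismatch: if the staging schema uses byte semantics with a multi-byte character set, 15 source characters can need more than 15 bytes in the C$ table. The query below inspects how the work table's columns were created (schema and table names are taken from the error message above, so adjust as needed):
    SELECT column_name, data_length, char_length, char_used
    FROM   all_tab_columns
    WHERE  owner = 'SAMPLE' AND table_name = 'C$_0EMP';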

  • 'Value too large for column' error in msql

    In my 9i Lite database I have a table with a LONG column.
    The documentation says that a LONG column can hold up to 2 GB of data.
    However, when I try to insert more than 4097 characters into the column, I get the error 'Value too large for column' on insert.
    Is this a bug, is the documentation wrong,
    or am I doing something wrong?
    Any help would be much appreciated.

    You have run into a bug in the handling of intermediate results in Oracle 9i Lite. You can bypass the bug as follows in Java:
    import java.io.IOException;
    import java.io.OutputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    // Streams the payload into an Oracle Lite BLOB instead of binding it as a long string.
    public static oracle.lite.poljdbc.BLOB createBlob(Connection conn, byte[] data)
            throws SQLException, IOException {
        oracle.lite.poljdbc.BLOB blob = new oracle.lite.poljdbc.BLOB(
                (oracle.lite.poljdbc.OracleConnection) conn);
        OutputStream writer = blob.getBinaryOutputStream();
        writer.write(data);
        writer.flush();
        writer.close();
        return blob;
    }

    // Inserts a row using the BLOB created above; TEST_BLOB has a numeric key and a BLOB column.
    public static void insertRow(Connection conn, int num, oracle.lite.poljdbc.BLOB blob)
            throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "insert into TEST_BLOB values (?, ?)");
        ps.setInt(1, num);
        ps.setBlob(2, blob);
        ps.execute();
        ps.close();
    }
