TopLink BLOB handling

When we use a serialized mapping in TopLink to write objects to the database, does TopLink add any special headers/bits
to the data written as a binary BLOB into the DB?
When we migrate BLOB data from one DB to another DB outside of TopLink, TopLink is unable
to deserialize the data from the new DB. We compared the data in both the original and the
migrated DB, and the BLOB data looks fine.
Thanks

No, TopLink does not do anything special, only normal Java serialization.
What is the error? Do you have a different version of the class when you try to load it? Have you defined a serialVersionUID in your class?
James : http://www.eclipselink.org
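Since the stored bytes are just the standard Java serialization stream, the migration can be sanity-checked outside TopLink. Below is a minimal sketch (class and field names are made up for illustration): round-trip an object with plain ObjectOutputStream/ObjectInputStream. If deserializing the migrated bytes fails with InvalidClassException, the class version (serialVersionUID) is the likely culprit, not the migration itself.

```java
import java.io.*;

// Sketch of what a TopLink serialized mapping stores: the plain Java
// serialization stream of the object, with no extra header bytes added by
// TopLink. If the class definition differs between writer and reader (and no
// serialVersionUID is pinned), deserialization fails even when the BLOB bytes
// are byte-for-byte identical in both databases.
public class SerializationCheck {
    static class Payload implements Serializable {
        // Pinning serialVersionUID keeps the stream compatible across
        // harmless class changes.
        private static final long serialVersionUID = 1L;
        String text;
        Payload(String text) { this.text = text; }
    }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] blobData = serialize(new Payload("hello"));
        // Every Java serialization stream starts with the magic bytes 0xACED.
        System.out.println(String.format("%02X%02X", blobData[0], blobData[1]));
        Payload restored = (Payload) deserialize(blobData);
        System.out.println(restored.text);
    }
}
```

A cheap cross-database check: hex-dump the first two bytes of the BLOB column in both databases; a valid stream always begins with 0xACED.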

Similar Messages

  • Example program: BLOB data handling with DBMS_LOB

    Product: PRECOMPILERS
    Date written: 1998-09-15
    Example program: BLOB data handling with DBMS_LOB
    ==================================================
    1) SELECT example
    - When compiling with Pro*C 2.2, calls to the DBMS_LOB package must be
    encapsulated in a wrapper package; with Pro*C 8.0 or later, DBMS_LOB
    can be called directly.
    - How to build
    For Pro*C 8.0:
    $ make -f demo_proc.mk EXE=my_prog OBJS=my_prog.o build \
    PROCFLAGS="sqlcheck=full userid=scott/tiger define=V8"
    PRO*C 2.2:
    $ setenv TWO_TASK v8_alias
    $ make -f proc.mk EXE=my_prog OBJS=my_prog.o build \
    PROCFLAGS="sqlcheck=full userid=scott/tiger"
    - SQL script to run
    create or replace package blob_it as
    my_blob blob;
    function get_blob_len return number;
    procedure read_blob(amount in out number, offset in number,
    buf in out raw);
    end;
    create or replace package body blob_it as
    function get_blob_len return number is
    begin
    return DBMS_LOB.GETLENGTH(my_blob);
    end;
    procedure read_blob(amount in out number, offset in number,
    buf in out raw) is
    begin
    DBMS_LOB.READ(my_blob,amount,offset,buf);
    end;
    end;
    drop table lob_tab;
    create table lob_tab (c1 number, c2 blob);
    insert into lob_tab values (1,
    utl_raw.cast_to_raw('AAAAAAAaaaaaaaaaa'));
    - Example program
    #include <stdio.h>
    #include <string.h>
    #define TERM(X) ( X.arr[X.len] = '\0' )
    #define SLEN(X) ( X.len = strlen((char *)X.arr) )
    #define READ_SIZE 60
    EXEC SQL INCLUDE SQLCA;
    /* Structure for VARRAW */
    typedef struct {short len; char arr[READ_SIZE];} vr;
    EXEC SQL BEGIN DECLARE SECTION;
    VARCHAR oracleid[20];
    EXEC SQL TYPE vr IS VARRAW(READ_SIZE);
    vr my_vr;
    EXEC SQL END DECLARE SECTION;
    FILE *fp;
    main()
    {
    char action_str[30];
    long amount;
    long offset;
    short done;
    long total;
    EXEC SQL WHENEVER SQLERROR DO o_error(action_str);
    strcpy( (char *)oracleid.arr, "scott/tiger" );
    SLEN( oracleid );
    strcpy( action_str, "connecting to d/b" );
    EXEC SQL CONNECT :oracleid;
    fp = fopen("my_blob.dat","wb");
    strcpy( action_str, "fetching blob locator" );
    EXEC SQL EXECUTE
    BEGIN
    select c2 into blob_it.my_blob from lob_tab
    where c1 = 1;
    #ifndef V8
    :total := blob_it.get_blob_len;
    #else
    :total := DBMS_LOB.GETLENGTH(blob_it.my_blob);
    #endif
    END;
    END-EXEC;
    amount = READ_SIZE;
    offset = 1;
    done = 0;
    strcpy( action_str, "reading from blob" );
    while (!done) {
    EXEC SQL EXECUTE
    BEGIN
    #ifndef V8
    blob_it.read_blob(:amount,:offset,:my_vr);
    #else
    DBMS_LOB.READ(blob_it.my_blob,:amount,:offset,:my_vr);
    #endif
    END;
    END-EXEC;
    offset += amount;
    if (offset >= total)
    done = 1;
    fwrite(my_vr.arr,(size_t)amount,(size_t)1,fp);
    }
    fclose(fp);
    EXEC SQL WHENEVER SQLERROR CONTINUE;
    EXEC SQL ROLLBACK WORK RELEASE;
    }

    int o_error( action_str )
    char *action_str;
    {
    int i;
    char error_str[150];
    EXEC SQL WHENEVER SQLERROR CONTINUE;
    for ( i = 0; i < sqlca.sqlerrm.sqlerrml; i++ )
    error_str[i] = sqlca.sqlerrm.sqlerrmc[i];
    error_str[i] = '\0';
    printf( "\nFailed with following Oracle error while %s:\n\n%s",
    action_str, error_str );
    EXEC SQL ROLLBACK WORK RELEASE;
    exit(1);
    }
    2) INSERT example
    - SQL to run
    create directory BFILE_DIR as '/mnt3/rctest80/ldt';
    - How to build
    $ make -f demo_proc.mk EXE=my_prog OBJS=my_prog.o build \
    PROCFLAGS="sqlcheck=full userid=scott/tiger"
    - Example program that loads data using a BFILE
    #include <stdio.h>
    #include <string.h>
    #define SLEN(X) ( X.len = strlen((char *)X.arr) )
    EXEC SQL INCLUDE SQLCA;
    EXEC SQL BEGIN DECLARE SECTION;
    VARCHAR oracleid[20];
    EXEC SQL END DECLARE SECTION;
    FILE *fp;
    main()
    {
    char action_str[30];
    EXEC SQL WHENEVER SQLERROR DO o_error(action_str);
    strcpy( (char *)oracleid.arr, "scott/tiger" );
    SLEN( oracleid );
    strcpy( action_str, "connecting to d/b" );
    EXEC SQL CONNECT :oracleid;
    EXEC SQL EXECUTE
    DECLARE
    lobd BLOB;
    fils BFILE :=BFILENAME('BFILE_DIR', 'a30.bmp');
    amt INTEGER;
    BEGIN
    insert into lob_tab values (1, empty_blob())
    returning c2 into lobd;
    DBMS_LOB.FILEOPEN(fils, dbms_lob.file_readonly);
    DBMS_LOB.LOADFROMFILE(lobd, fils, dbms_lob.getlength(fils));
    DBMS_LOB.FILECLOSE(fils);
    END;
    END-EXEC;
    EXEC SQL COMMIT WORK RELEASE;
    }

    int o_error( action_str )
    char *action_str;
    {
    int i;
    char error_str[150];
    EXEC SQL WHENEVER SQLERROR CONTINUE;
    for ( i = 0; i < sqlca.sqlerrm.sqlerrml; i++ )
    error_str[i] = sqlca.sqlerrm.sqlerrmc[i];
    error_str[i] = '\0';
    printf( "\nFailed with following Oracle error while %s:\n\n%s",
    action_str, error_str );
    EXEC SQL ROLLBACK WORK RELEASE;
    exit(1);
    }
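The Pro*C program above pulls the BLOB out READ_SIZE bytes at a time and writes each chunk to a file. The same chunked copy in the thread's other language, Java, is a one-loop helper (a sketch using only the JDK; in real use the InputStream would come from ResultSet.getBinaryStream on the BLOB column):

```java
import java.io.*;

// Chunked stream copy, mirroring the READ_SIZE loop of the Pro*C example:
// read a fixed-size buffer at a time and write each chunk out, returning
// the total number of bytes copied (the DBMS_LOB.GETLENGTH equivalent).
public class ChunkedCopy {
    static final int READ_SIZE = 60; // matches the Pro*C example

    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[READ_SIZE];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);  // write only the bytes actually read
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[150];
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long total = copy(new ByteArrayInputStream(data), out);
        System.out.println(total);
    }
}
```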

  • Attempt to access expired blob handle

    We get a random error in our test environment after upgrading from SQL Server 2008 R2 to 2012.
    System.Data.SqlClient.SqlException (0x80131904): A system assertion check has failed. Check the SQL Server error log for details. Typically, an assertion failure is caused by a software bug or data corruption. To check for database corruption, consider running
    DBCC CHECKDB. If you agreed to send dumps to Microsoft during setup, a mini dump will be sent to Microsoft. An update might be available from Microsoft in the latest Service Pack or in a QFE from Technical Support.
    A severe error occurred on the current command.  The results, if any, should be discarded.
    Location:  tmpilb.cpp:2576
    Expression:  fFalse
    SPID:   55
    Process ID:  1552
    Description:  Attempt to access expired blob handle (3)
    The weird thing is that it looks almost exactly like the KB article below; our scenario matches the symptoms described in that KB.
    support.microsoft.com/kb/2644794/
    This issue did not occur in 2008 R2; we then took the database from 2008 R2 and attached it to the 2012 DB server, and then the errors started.
    Event viewer (throws a lot of the 4 errors below):
    Level Date   Source   Event Id Task category
    Error 01-04-2014 17:42:00 MSSQL$DGOFFICEMAINSQL 17310  Server
     A user request from the session with SPID 58 generated a fatal exception. SQL Server is terminating this session. Contact Product Support Services with the dump produced in the log directory.
    Error 01-04-2014 17:41:59 MSSQL$DGOFFICEMAINSQL 17065  Server
     SQL Server Assertion: File: <tmpilb.cpp>, line = 2576 Failed Assertion = 'fFalse' Attempt to access expired blob handle (3). This error may be timing-related. If the error persists after rerunning the statement, use DBCC CHECKDB to check the database
    for structural integrity, or restart the server to ensure in-memory data structures are not corrupted.
    Error 01-04-2014 17:41:58 MSSQL$DGOFFICEMAINSQL 3624  Server
     A system assertion check has failed. Check the SQL Server error log for details. Typically, an assertion failure is caused by a software bug or data corruption. To check for database corruption, consider running DBCC CHECKDB. If you agreed to send dumps
    to Microsoft during setup, a mini dump will be sent to Microsoft. An update might be available from Microsoft in the latest Service Pack or in a QFE from Technical Support.
    Error 01-04-2014 14:47:41 MSSQL$DGOFFICEMAINSQL 17066  Server
     SQL Server Assertion: File: <qxcntxt.cpp>, line=1137 Failed Assertion = '!"No exceptions should be raised by this code"'. This error may be timing-related. If the error persists after rerunning the statement, use DBCC CHECKDB to check
    the database for structural integrity, or restart the server to ensure in-memory data structures are not corrupted.
    DBCC CHECKDB has been run, and no errors seem to occur.
    All useful ideas are welcome :)

    OMG, weirdest thing ever:
    I have installed SQL Server 2012 SP1 on my Windows 7 machine, running the program against the same database, and then I tried to replicate the error:
    Using IE 11 to go to the page that causes the issue, validating in Event Viewer at the same time: nothing happens, everything looks good.
    Using the newest FF or Chrome consistently throws 3 event log entries every time I go to the page: 2x event ID 17065 and 1x 17066. I assume it throws the other errors upon too many, or too frequent, errors.
    But WHY? I simply cannot wrap my brain around why such a feature inside SQL Server should be able to act differently based on the browser.
    Error 01-04-2014 17:41:59 MSSQL$DGOFFICEMAINSQL 17065  Server
     SQL Server Assertion: File: <tmpilb.cpp>, line = 2576 Failed Assertion = 'fFalse' Attempt to access expired blob handle (3). This error may be timing-related. If the error persists after rerunning the statement, use DBCC CHECKDB to check the database
    for structural integrity, or restart the server to ensure in-memory data structures are not corrupted.
    Error 01-04-2014 14:47:41 MSSQL$DGOFFICEMAINSQL 17066  Server
     SQL Server Assertion: File: <qxcntxt.cpp>, line=1137 Failed Assertion = '!"No exceptions should be raised by this code"'. This error may be timing-related. If the error persists after rerunning the statement, use DBCC CHECKDB to check the database
    for structural integrity, or restart the server to ensure in-memory data structures are not corrupted.
    I am still very interested in good ideas!!!

  • About blob handling - how to display document type and document name?

    Hi,
    I am developing an APEX application, and I need a report region with a BLOB column (document) in it. I want it to display like this:
    an image on the left side (different images for different types of docs, such as Word, PDF, TXT),
    the file name on the right with a download link on the file name, the file name included in the download link address,
    and the file size displayed in the report.
    I haven't been able to get this done with the help of any document; any help will be very much appreciated.
    Thanks.

    OK, this link is very useful: http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/apex/r31/apex31nf/apex31blob.htm#o
    plus the doclib application example.
    It's too late here and I won't write down the details now; the above two give the answers anyway.
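One way to get the per-type image: the report query returns the filename, and a small helper maps the extension to an icon (a sketch; the icon filenames are hypothetical). The file-size column can come straight from dbms_lob.getlength on the BLOB in the report query.

```java
import java.util.*;

// Map a document's filename extension to an icon image, case-insensitively.
// Icon filenames below are made-up placeholders for whatever images the
// APEX application ships.
public class DocIcon {
    static final Map<String, String> ICONS = Map.of(
        "doc", "word.png",
        "pdf", "pdf.png",
        "txt", "text.png");

    static String iconFor(String filename) {
        int dot = filename.lastIndexOf('.');
        String ext = dot < 0 ? "" : filename.substring(dot + 1).toLowerCase(Locale.ROOT);
        // Fall back to a generic icon for unknown or missing extensions.
        return ICONS.getOrDefault(ext, "generic.png");
    }
}
```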

  • ADF, TopLink exceptions handling on 10g

    We have a J2EE application using TopLink and ADF data controls. In the development environment (JDeveloper), all exceptions are shown properly (there's an <html:errors/> tag on every JSP page, generated automatically together with the ADF data controls). But when we deploy our application on the 10g server, we see only a meaningless "null".
    When we look on the server into the log called "redirected output/errors", everything is there
    (including the SQL statement and the exact error description). Must anything be set on the application server to behave exactly as OC4J in JDeveloper? Thanks.

    Errors are stored in the application.log file for each particular application. Would it be possible to show the content of this file in the application through some interface such as the one in Enterprise Manager? That means showing groups of 500 lines, with browsing capability to show the next ones.

  • Blob handling

    1 declare
    2 l_blob blob;
    3 l_bfile bfile;
    4 begin
    5 insert into demo values ( 1, empty_blob() )
    6 returning theBlob into l_blob;
    7 l_bfile := bfilename( 'ext_dir', 'DSC00963.JPG' );
    8 dbms_lob.fileopen( l_bfile );
    9 dbms_lob.loadfromfile( l_blob, l_bfile,
    10 dbms_lob.getlength( l_bfile ) );
    11 dbms_lob.fileclose( l_bfile );
    12* end;
    SQL> /
    declare
    ERROR at line 1:
    ORA-22285: non-existent directory or file for FILEOPEN operation
    ORA-06512: at "SYS.DBMS_LOB", line 523
    ORA-06512: at line 8
    Where am I going wrong?

    Thanks for your reply. I changed what you mentioned; after that it showed this:
    1 declare
    2 l_blob blob;
    3 l_bfile bfile;
    4 begin
    5 insert into demo values ( 1, empty_blob() )
    6 returning theBlob into l_blob;
    7 l_bfile := bfilename( 'EXT_DIR', 'DSC00963.JPG' );
    8 dbms_lob.fileopen( l_bfile );
    9 dbms_lob.loadfromfile( l_blob, l_bfile,
    10 dbms_lob.getlength( l_bfile ) );
    11 dbms_lob.fileclose( l_bfile );
    12* end;
    SQL> /
    declare
    ERROR at line 1:
    ORA-22288: file or LOB operation FILEOPEN failed
    The system cannot find the path specified.
    ORA-06512: at "SYS.DBMS_LOB", line 523
    ORA-06512: at line 8

  • How to store a  blob with toplink

    Hi... I am using TopLink 10g (10.1.3.0) and Oracle 9i, and I am trying to insert a BLOB into the database with TopLink, but I don't know how. Do I need to use a ResultSet and a PreparedStatement, and if so, how do I use them with TopLink?

    You should be able to read the file in and convert its data to a byte-array in your object. TopLink will handle the conversion from byte-array to BLOB in the database.
    If you are using the Oracle thin JDBC drivers, they have a 5k size limit, if your file is larger than this you will need to use the Oracle8/9Platform in your login and a TypeConversionMapping and set the fieldClassification to java.sql.Blob.
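Getting the file into the byte array that TopLink will convert is plain JDK work; a sketch (the path is hypothetical, and Files.readAllBytes assumes Java 7+; on the JDKs of that era you would loop over a FileInputStream instead):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: load a file's contents into the byte[] attribute that the TopLink
// mapping converts to a BLOB column, as described in the reply above.
public class FileToBytes {
    static byte[] load(Path file) throws IOException {
        return Files.readAllBytes(file);
    }

    public static void main(String[] args) throws IOException {
        // Demo with a temporary file standing in for the real document.
        Path tmp = Files.createTempFile("blob-demo", ".bin");
        Files.write(tmp, new byte[] {1, 2, 3});
        byte[] data = load(tmp);
        System.out.println(data.length);
        Files.delete(tmp);
    }
}
```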

  • How to handle blob data with java and mysql and hibernate

    Dear all,
    I am using Java 1.6, MySQL 5.5, and Hibernate 3.0, and sometimes I use the BLOB data type. Earlier my project's database was Oracle 10g, but now I am converting it to MySQL, and I am facing problems saving and fetching BLOB data with the MySQL database. Can anybody give me the source code for BLOB handling with Java + MySQL + Hibernate?
    My code now is:
    ==================================================
    *.hbm.xml :--
    <property name="image" column="IMAGE" type="com.shrisure.server.usertype.BinaryBlobType" insert="true" update="true" lazy="false"/>
    ===================================================
    *.java :--
    package com.shrisure.server.usertype;
    import java.io.OutputStream;
    import java.io.Serializable;
    import java.sql.Blob;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Types;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;
    import oracle.sql.BLOB;
    import org.hibernate.HibernateException;
    import org.hibernate.usertype.UserType;
    import org.jboss.resource.adapter.jdbc.WrappedConnection;
    import com.google.gwt.user.client.rpc.IsSerializable;
    public class BinaryBlobType implements UserType, java.io.Serializable, IsSerializable {
        private static final long serialVersionUID = 1111222233331231L;

        public int[] sqlTypes() {
            return new int[] { Types.BLOB };
        }

        public Class returnedClass() {
            return byte[].class;
        }

        public boolean equals(Object x, Object y) {
            return (x == y) || (x != null && y != null && java.util.Arrays.equals((byte[]) x, (byte[]) y));
        }

        public void nullSafeSet(PreparedStatement st, Object value, int index) throws HibernateException, SQLException {
            BLOB tempBlob = null;
            WrappedConnection wc = null;
            try {
                if (value != null) {
                    Connection oracleConnection = st.getConnection();
                    if (oracleConnection instanceof oracle.jdbc.driver.OracleConnection) {
                        tempBlob = BLOB.createTemporary(oracleConnection, true, BLOB.DURATION_SESSION);
                    }
                    if (oracleConnection instanceof org.jboss.resource.adapter.jdbc.WrappedConnection) {
                        InitialContext ctx = new InitialContext();
                        DataSource dataSource = (DataSource) ctx.lookup("java:/DefaultDS");
                        Connection dsConn = dataSource.getConnection();
                        wc = (WrappedConnection) dsConn;
                        // with the getUnderlyingConnection method, cast it to an
                        // Oracle Connection
                        oracleConnection = wc.getUnderlyingConnection();
                        tempBlob = BLOB.createTemporary(oracleConnection, true, BLOB.DURATION_SESSION);
                    }
                    tempBlob.open(BLOB.MODE_READWRITE);
                    OutputStream tempBlobWriter = tempBlob.getBinaryOutputStream(); // setBinaryStream(1);
                    tempBlobWriter.write((byte[]) value);
                    tempBlobWriter.flush();
                    tempBlobWriter.close();
                    tempBlob.close();
                    st.setBlob(index, tempBlob);
                } else {
                    st.setBlob(index, BLOB.empty_lob());
                }
            } catch (Exception exp) {
                if (tempBlob != null) {
                    tempBlob.freeTemporary();
                }
                exp.printStackTrace();
                st.setBlob(index, BLOB.empty_lob());
                // throw new RuntimeException();
            } finally {
                if (wc != null) {
                    wc.close();
                }
            }
        }

        public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws HibernateException, SQLException {
            final Blob blob = rs.getBlob(names[0]);
            return blob != null ? blob.getBytes(1, (int) blob.length()) : null;
        }

        public Object deepCopy(Object value) {
            if (value == null)
                return null;
            byte[] bytes = (byte[]) value;
            byte[] result = new byte[bytes.length];
            System.arraycopy(bytes, 0, result, 0, bytes.length);
            return result;
        }

        public boolean isMutable() {
            return true;
        }

        public Object assemble(Serializable arg0, Object arg1) throws HibernateException {
            return assemble(arg0, arg1);
        }

        public Serializable disassemble(Object arg0) throws HibernateException {
            return disassemble(arg0);
        }

        public int hashCode(Object arg0) throws HibernateException {
            return hashCode();
        }

        public Object replace(Object arg0, Object arg1, Object arg2) throws HibernateException {
            return replace(arg0, arg1, arg2);
        }
    }
    =================================================================
    Can anyone give me the source code for this BinaryBlobType.java adapted to MySQL BLOB handling?

    Moderator action: crosspost deleted.
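For MySQL, none of the Oracle-specific temporary-BLOB machinery above is needed: the JDBC driver accepts a plain byte[] via setBytes/setBinaryStream, and getBytes on the returned Blob works for reads. A sketch of the conversion core, using only JDK classes (the class and method names are made up; SerialBlob stands in for a driver-created Blob):

```java
import java.sql.Blob;
import javax.sql.rowset.serial.SerialBlob;

// The MySQL-friendly core of a binary UserType. Inside the real class,
// nullSafeSet reduces to st.setBytes(index, (byte[]) value) and
// nullSafeGet to reading the bytes back; the Oracle, JBoss, and JNDI
// branches can be deleted entirely.
public class MySqlBlobSketch {
    // What nullSafeSet needs: a Blob (or just the byte[] itself) for the driver.
    static Blob toBlob(byte[] value) throws Exception {
        return value == null ? null : new SerialBlob(value);
    }

    // What nullSafeGet does: materialize the full contents as a byte[].
    static byte[] toBytes(Blob blob) throws Exception {
        return blob == null ? null : blob.getBytes(1, (int) blob.length());
    }
}
```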

  • ClassCastException w/ BLOB

    We are currently using v 2.5.2 and have run into a problem when attempting
    to insert a BLOB using WebLogic (via EJB). When I run a stand-alone test
    case, everything works as planned. The BLOB is inserted with no problems.
    However, if I change the test case to call an EJB deployed on WebLogic
    8.1 (which in turn calls the same functionality to insert the BLOB via
    Kodo), I receive the following exception:
    NestedThrowablesStackTrace:
    java.lang.ClassCastException
    at
    com.solarmetric.kodo.impl.jdbc.ormapping.OracleBlobMapping.update(OracleBlobMapping.java:100)
    at
    com.solarmetric.kodo.impl.jdbc.ormapping.OracleBlobMapping.insert(OracleBlobMapping.java:73)
    at
    com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.insert(ClassMapping.java:437)
    at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:530)
    at
    com.solarmetric.kodo.runtime.PersistenceManagerImpl.flushInternal(PersistenceManagerImpl.java:697)
    at
    com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:422)
    at
    com.iits.sdb.app.facade.MaintainProducerRequestFacade.addProducerRequest(MaintainProducerRequestFacade.java:99)
    at
    com.iits.sdb.app.delegate.MaintainProducerRequestDelegate.addProducerRequest(MaintainProducerRequestDelegate.java:89)
    at
    com.iits.ejb.RequestProcessBean.persistRequest(RequestProcessBean.java:333)
    at
    com.iits.ejb.RequestProcessBean.prepareAddInitialAppointment(RequestProcessBean.java:125)
    at
    com.iits.ejb.RequestProcessEjb_pfomyb_EOImpl.prepareAddInitialAppointment(RequestProcessEjb_pfomyb_EOImpl.java:150)
    at
    com.iits.ejb.RequestProcessEjb_pfomyb_EOImpl_WLSkel.invoke(Unknown Source)
    at
    weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:466)
    at
    weblogic.rmi.cluster.ReplicaAwareServerRef.invoke(ReplicaAwareServerRef.java:108)
    at
    weblogic.rmi.internal.BasicServerRef$1.run(BasicServerRef.java:409)
    at
    weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:353)
    at
    weblogic.security.service.SecurityManager.runAs(SecurityManager.java:144)
    at
    weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:404)
    at
    weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:30)
    at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:197)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:170)
    Any help would be greatly appreciated.

    Marc,
    After doing some additional investigation, the following link provides
    more insight into why the error occurred.
    http://e-docs.bea.com/wls/docs70/jdbc/thirdparty.html#1045809
    It seems like adding a conditional cast to WebLogic's wrapper class,
    weblogic.jdbc.vendor.oracle.OracleThinBlob, would fix the issue in
    OracleBlobMapping. Is there a recommended way for us to add this
    functionality, to enable us to still use Kodo for BLOBs in WebLogic?
    Thanks,
    Josh
    Josh Zook wrote:
    After trying your suggestion below, I still run into the same issue (see
    stack trace below). However, after looking at the generated SQL, it
    appears that Kodo is trying to INSERT the blob directly, rather than use
    EMPTY_BLOB() and then SELECT FOR UPDATE. I don't think upgrading to 3.0
    is an option for us at this time. Please let me know if you can think of
    anything else.
    Thanks,
    Josh
    java.sql.SQLException: Data size bigger than max size for this type: 4036
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java:95)
    at
    oracle.jdbc.dbaccess.DBDataSetImpl.setBytesBindItem(DBDataSetImpl.java:2413)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java:1166)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setBinaryStream(OraclePreparedStatement.java:2598)
    at
    weblogic.jdbc.wrapper.PreparedStatement.setBinaryStream(PreparedStatement.java:289)
    at
    com.iits.pulsar.jdo.OracleDictionary.blobToPreparedParameter(OracleDictionary.java:76)
    Marc Prud'hommeaux wrote:
    Josh-
    The problem is that the Oracle JDBC drivers require special handling for
    blobs over a certain size: they need to have Oracle-specific BLOB/CLOB
    classes. Unfortunately, when another DataSource wraps the oracle
    DataSource, there isn't much we can do.
    Kodo 3.0 has a better way of doing this that will probably work around
    your problem. If you don't want to upgrade to 3.0 RC 1, then you
    could try to subclass OracleDictionary with your own class that
    overrides AbstractDictionary.blobToPreparedParameter. I have read that
    some versions of the Oracle driver will handle a direct blob better if
    you call stmnt.setBinaryStream(index, new ByteArrayInputStream(bytes))
    than if you do stmnt.setObject(index, ob) (which is the default way we
    deal with this).
    Please let us know if you still have problems after trying this.
    In article <[email protected]>, Josh Zook wrote:
    Thanks for the help. Yes, we are using a WebLogic DataSource, so that is
    the root of the problem. My standalone test case was using Kodo's internal
    connection pooling. It looks like the WebLogic DataSource wraps the
    ResultSet
    and Blob in its own classes (perhaps causing the cast exception?).
    I tried your advice and set isOracleJDBCDriver=false. I now get the
    exception listed below. Any suggestions would be appreciated.
    java.sql.SQLException: Data size bigger than max size for this type: 4009
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java:95)
    at
    oracle.jdbc.dbaccess.DBDataSetImpl.setBytesBindItem(DBDataSetImpl.java:2413)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java:1166)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setBytes(OraclePreparedStatement.java:2208)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:3002)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:3217)
    at
    weblogic.jdbc.wrapper.PreparedStatement.setObject(PreparedStatement.java:172)
    at
    com.solarmetric.kodo.impl.jdbc.schema.dict.AbstractDictionary.blobToPreparedParameter(AbstractDictionary.java:1110)
    at
    com.solarmetric.kodo.impl.jdbc.schema.dict.OracleDictionary.blobToPreparedParameter(OracleDictionary.java:203)
    at
    com.solarmetric.kodo.impl.jdbc.schema.dict.AbstractDictionary.toPreparedParameter(AbstractDictionary.java:840)
    at
    com.solarmetric.kodo.impl.jdbc.sql.SQLValue.applyParameter(SQLValue.java:129)
    at
    com.solarmetric.kodo.impl.jdbc.sql.SQLBuffer.setParameters(SQLBuffer.java:239)
    at
    com.solarmetric.kodo.impl.jdbc.sql.SQLBuffer.setParameters(SQLBuffer.java:222)
    at
    com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.prepareStatementInternal(SQLExecutionManagerImpl.java:753)
    at
    com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatementNonBatch(SQLExecutionManagerImpl.java:445)
    at
    com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatement(SQLExecutionManagerImpl.java:423)
    at
    com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeInternal(SQLExecutionManagerImpl.java:381)
    at
    com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.flush(SQLExecutionManagerImpl.java:255)
    at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:554)
    at
    com.solarmetric.kodo.runtime.PersistenceManagerImpl.flushInternal(PersistenceManagerImpl.java:697)
    at
    com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:422)
    at
    com.iits.sdb.app.facade.MaintainProducerRequestFacade.addProducerRequest(MaintainProducerRequestFacade.java:106)
    Marc Prud'hommeaux wrote:
    Josh-
    Are you using Kodo's own DataSource in the Weblogic environment, or are
    you using a DataSource bound into WebLogic's JNDI? If the latter, then
    it may be that our special blob handling for Oracle is having problems
    with WebLogic's DataSource.
    One thing you might try to do is set the DictionaryProperties to have
    "isOracleJDBCDriver=false", which should prevent Kodo from using its
    special Oracle driver behavior.
    Marc Prud'hommeaux [email protected]
    SolarMetric Inc. http://www.solarmetric.com

  • EJB3: What is the easiest way to handle the rapid change DB structure?

    Hi,
    I'm new to EJB, and it's a big leap to learn EJB 3 without a strong foundation in EJB 2. Currently I'm facing an inefficient way of handling rapid database change in my web application. I chose JSP, Struts, and EJB 3 with Oracle Database 10.1.0.5, Oracle Application Server 10.1.3.1, and Oracle JDeveloper 10.1.3.2.
    I've followed the tutorial on how to make an EJB 3 entity bean and how to access an EJB session bean from JSP. But that tutorial gives no clue how to manage or change an entity bean if the table structure changes, let's say adding 2 new fields and changing 1 field type from VARCHAR to NUMBER.
    What I have done so far is create a new EJB 3 project and build all-new entity beans, then copy and paste all the generated Java files over my current entity bean files.
    Is there any easier and simpler way to handle rapid change during development? I noticed that EJB 3 uses TopLink to handle the entity bean creation process, but I haven't found articles or tutorials on how to deal with my problem.
    Thanks.

    Unfortunately there is no automatic way to do this with EJB right now. You'll basically need to go into your entity bean and add/change the attributes to match the new database columns.
    Note that we do provide automatic synchronization for the ADF Business Components framework, in case you choose to use it instead of EJB 3.0.
    Also, if you are just starting your development/learning, I think you might want to use JSF and not Struts. It seems JSF has a much better future than Struts.

  • Update ORDImage with in memory BLOB?

    According to the interMedia docs, one of your examples is:
    BEGIN
      INSERT INTO emp VALUES (
        'John Doe', 24000, 'Technical Writer', 123,
        ORDSYS.ORDImage(ORDSYS.ORDSource(empty_blob(), NULL, NULL, NULL, SYSDATE, 1),
                        NULL, NULL, NULL, NULL, NULL, NULL, NULL));
      -- select the newly inserted row for update
      SELECT photo INTO Image FROM emp
        WHERE ename = 'John Doe' FOR UPDATE;
      -- BEGIN
      --   use the getContent method to get the LOB locator.
      --   populate the data with DBMS_LOB calls or write an OCI program to
      --   fill in the image BLOB.
      -- END;
      -- set property attributes for the image data
      Image.setProperties;
      UPDATE emp SET photo = Image WHERE ename = 'John Doe';
      -- continue processing
    END;
    However, the part I'm stuck on is populating the data (using OCI or the LOB packages). For a regular BLOB field (or interMedia text), I can use the procedure outlined below:
    sSQL = "SELECT blob FROM store WHERE id = " & ID
    Set oDS = setDS(sSQL)
    oDS.Edit
    Set oBLOB = oDS.Fields("blob").Value
    sRet = oBLOB.Write(oFile.Binary, sSize)
    oDS.Update
    I retrieve the BLOB field (which was inserted with empty_blob()) into a dynaset using OO4O, edit the current row, and write the new BLOB with the file and size parameters (which I get from the rest of my code). This does not work for a field type of ORDImage (the Write method is not available). Is there any sample code that will update the BLOB field with binary in-memory data, as opposed to from an external file? Thanks,
    Kevin

    The interMedia object method getContent() returns a BLOB. This can then be read and written using the BLOB interface.
    So in your example, your SQL would be
    sSQL = "SELECT s.image.getContent() FROM store s WHERE s.id = " & ID
    Then you could use the Fields.Value method to get the BLOB handle.
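In Java terms, the pattern this reply describes can be sketched with the JDK's SerialBlob standing in for the locator that getContent() returns. The writeBytes helper and the sample data are illustrative only; note that unlike SerialBlob, a real database BLOB locator can grow past its initial length.

```java
import javax.sql.rowset.serial.SerialBlob;
import java.sql.Blob;

public class BlobWriteDemo {
    // Write in-memory bytes through the standard java.sql.Blob interface,
    // the same interface the locator returned by getContent() exposes.
    public static long writeBytes(Blob blob, byte[] data) throws Exception {
        blob.setBytes(1, data);   // Blob positions are 1-based
        return blob.length();
    }

    public static void main(String[] args) throws Exception {
        byte[] image = {0x47, 0x49, 0x46, 0x38};   // in-memory binary data
        // SerialBlob (part of the JDK) stands in for a BLOB locator; it is
        // pre-sized because SerialBlob itself cannot grow.
        Blob blob = new SerialBlob(new byte[image.length]);
        System.out.println(writeBytes(blob, image));  // prints 4
    }
}
```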

  • Baffled by Toplink and stored procedures

    Here's what I'm after...
    I have a nice package built to act as a bridge to a MySQL database. Everything works fine, but I need to write classes to "wrap" two stored procedures (procedures, not functions or triggers, defined in the database itself). I want to leverage the TopLink connection handling so that I don't have to open/close connections for these two classes. Surely there is some way to do this.
    Working with one example...
    Stored Procedure definition:
    CREATE PROCEDURE openPorts (ipAddr INT UNSIGNED)
    BEGIN
      DECLARE lastScanID INT UNSIGNED;
      SELECT results.scanrequestid INTO lastScanID
        FROM (results,network,scanner)
        WHERE
          results.ip=ipAddr and
          results.ip between network.ip and network.ip+pow(2,(32-network.cidr))-1 and
          results.scannerip=scanner.ip and
          scanner.regionid=network.regionid
        ORDER BY results.endtime DESC LIMIT 1;
      SELECT distinct protocol,port
        FROM results
        where
          ip=ipAddr and
          scanrequestid=lastScanID and
          port>0 and protocol>0
        ORDER BY PROTOCOL,PORT;
    END
    I want to create a class file OpenPortsDAO such that calling
    List<OpenPort> ports = OpenPortsDAO.find(ipInteger);
    yields a list (or vector, whatever) of the rows returned as the result.
    Ignoring for the moment the issues with mapping returned row fields to object fields in my OpenPort class, how the heck do I...
    1) Define the stored procedure
    2) Set the input variable value
    3) Get the connection (if necessary)
    4) Execute the query
    I imagine it's going to be something like...
    StoredProcedureCall call = new StoredProcedureCall();
    call.setProcedureName("hostSummary");
    call.addNamedArgumentValue("ipAddr", value);
    call.getResult();
    Should this wrapping class be defined in the persistence unit just like any other? If so, my assumption is that I would then not have to manually handle connections, etc.
    Any help would be very much appreciated. I think if I can just get pointed in the right direction, it should "click" for me.
    -David
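    The row-mapping part set aside above might be sketched like this. OpenPort's two fields are assumed from the procedure's final SELECT (protocol, port), and the raw rows stand in for what a TopLink DataReadQuery wrapping the StoredProcedureCall would return:

```java
import java.util.*;

public class OpenPortMapper {
    // Hypothetical value object for the two columns the procedure's
    // final SELECT returns.
    public static class OpenPort {
        public final int protocol;
        public final int port;
        OpenPort(int protocol, int port) { this.protocol = protocol; this.port = port; }
    }

    // Convert raw result rows (protocol, port) into OpenPort objects.
    public static List<OpenPort> map(List<Object[]> rows) {
        List<OpenPort> out = new ArrayList<>();
        for (Object[] row : rows) {
            out.add(new OpenPort(((Number) row[0]).intValue(),
                                 ((Number) row[1]).intValue()));
        }
        return out;
    }

    public static void main(String[] args) {
        List<Object[]> rows = Arrays.asList(
            new Object[]{6, 80}, new Object[]{6, 443}, new Object[]{17, 53});
        List<OpenPort> ports = map(rows);
        System.out.println(ports.size() + " " + ports.get(1).port);  // prints 3 443
    }
}
```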

    Doug,
    Thanks for the info. I was able to handle the stored procedure, but I did so via POJOs.
    I'm interested in this part of your reply...
    >
    This will give you an entity class with a full lifecycle, meaning that TopLink may write it back to the database if modified in a transaction.
    Given the following pseudo-code/SQL...
    drop table starship
    go
    drop table contractor
    go
    create table contractor (
    id serial primary key,
    name text unique)
    go
    create table starship (
    id serial primary key,
    name text unique,
    contractor_id int references contractor)
    go
    create function get_by_id(shipid int, OUT contractor, OUT shipname)
    BEGIN
      blah
    END;
    GO
    Are you saying that I can create a full CRUD object for the function/procedure call? I.e., assuming I get back a result and I see that the ship name is incorrect, I could make a change to the returned data and have that persisted? Without having to get a starship object on my own?
    If that's the case, I'd like to know more.
    -David

  • Blobs and OCI8

    When creating a server which uses JDBC thick drivers to get
    blobs in and out of the DB, I have noticed the following:
    The OCI layer seems to be caching the entire blob in memory.
    This means that if I read in a 30 meg blob in JDBC, I suddenly
    have 30 meg in memory on my middle-tier. My middle tier
    application does the following in a connection it borrows from a
    connection pool:
    -) Gets the Blob handle
    -) Uses getBytes(l_position, l_size, l_buf) to get the bytes and
    simply prints them to an outputStream to the client
    -) Return connection to the pool
    OR
    -) Get the Blob handle
    -) Use getBinaryStream()
    -) Loop with read(l_buf) printing to the output stream
    -) Return connection to the pool
    The top way is slightly faster than the bottom (usually), but
    both have the following problems:
    1) Using the thin driver is painfully slow. It takes forever
    2) Using the thick driver gives good performance, but causes a
    scalability problem. While the blob is being read in, the ENTIRE
    blob data is being stored in memory on my middle tier. Even
    when the Blob is garbage collected, the ENTIRE blob is still in
    memory. The only way I have found to get rid of the extra
    memory is to actually close the connection, which kills
    performance for other reasons.
    Is there some way to use the thick driver Blob class and NOT
    have it store the memory on my middle tier? Or is there someway
    to clear this memory without actually closing the connection????
    This is all done with Oracle8i and JDBC.
    Thanks,
    Kevin
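The second approach in the question (getBinaryStream() plus a read loop) can be sketched as follows. It bounds the buffer the application itself holds, though it cannot control what the OCI driver layer caches internally; the in-memory streams here are stand-ins for the Blob's input stream and the client's output stream.

```java
import java.io.*;

public class ChunkedBlobCopy {
    // Copy a stream in fixed-size chunks so only one buffer's worth of the
    // blob is ever held by the application on the middle tier.
    public static long copy(InputStream in, OutputStream out, int chunkSize)
            throws IOException {
        byte[] buf = new byte[chunkSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        byte[] blobData = new byte[300000];  // stands in for blob.getBinaryStream()
        ByteArrayOutputStream client = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(blobData), client, 8192);
        System.out.println(copied);  // prints 300000
    }
}
```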

    : Is there some way to use the thick driver Blob class and NOT
    : have it store the memory on my middle tier? Or is there someway
    : to clear this memory without actually closing the connection????
    : This is all done with Oracle8i and JDBC.
    : Thanks,
    : Kevin
    Have you tried it with the oci8 JDBC drivers?
    I'm working with the oci8 drivers, getting the pointers to the
    blobs ( getBlob() ) and then reading/writing them with PL/SQL
    calls.

  • How to use resultset preparedstatement

    Thanks for the quick response.
    How can I use a ResultSet and PreparedStatement for the query below?
    select * from custom_table
    where 
    (entity_name= 'PO_HEAD' and pk1_value = :P5_PO_HEADER_ID and pk2_value = '0' )
    OR (entity_name= 'PO_HEADERS' and pk1_value = :P5_PO_HEADER_ID )
    OR (entity_name= 'PO_VENDORS' and pk1_value = :P5_VENDOR_ID )
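One hedged sketch of the preparation step a PreparedStatement needs for this query: rewrite the named binds to '?' placeholders and record the order in which values must be bound with setXxx(index, value). The helper and its name are illustrative, not part of any Oracle API.

```java
import java.util.*;
import java.util.regex.*;

public class NamedBinds {
    // Rewrite Oracle-style named binds (e.g. :P5_PO_HEADER_ID) into JDBC '?'
    // placeholders, returning the bind names in placeholder order. Note that
    // a bind used twice in the query must be set twice on the statement.
    public static List<String> toPlaceholders(StringBuilder sql) {
        List<String> order = new ArrayList<>();
        Matcher m = Pattern.compile(":(\\w+)").matcher(sql.toString());
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            order.add(m.group(1));
            m.appendReplacement(out, "?");
        }
        m.appendTail(out);
        sql.setLength(0);
        sql.append(out);
        return order;
    }

    public static void main(String[] args) {
        StringBuilder sql = new StringBuilder(
            "select * from custom_table where " +
            "(entity_name = 'PO_HEAD' and pk1_value = :P5_PO_HEADER_ID and pk2_value = '0') " +
            "OR (entity_name = 'PO_HEADERS' and pk1_value = :P5_PO_HEADER_ID) " +
            "OR (entity_name = 'PO_VENDORS' and pk1_value = :P5_VENDOR_ID)");
        List<String> order = toPlaceholders(sql);
        System.out.println(order);  // prints [P5_PO_HEADER_ID, P5_PO_HEADER_ID, P5_VENDOR_ID]
        // Then: prepareStatement(sql.toString()), setString(1, headerId),
        // setString(2, headerId), setString(3, vendorId), executeQuery().
    }
}
```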

    You should be able to read the file in and convert its data to a byte-array in your object. TopLink will handle the conversion from byte-array to BLOB in the database.
    If you are using the Oracle thin JDBC drivers, they have a 5k size limit; if your file is larger than this, you will need to use the Oracle8/9Platform in your login, use a TypeConversionMapping, and set the fieldClassification to java.sql.Blob.
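The file-to-byte-array step might look like this (a minimal sketch using java.nio, which postdates this thread but shows the same idea; the temp file stands in for the file being loaded):

```java
import java.io.IOException;
import java.nio.file.*;

public class FileToBlobBytes {
    // Read the whole file into the byte[] attribute that the mapping
    // converts to a BLOB on write.
    public static byte[] toByteArray(Path file) throws IOException {
        return Files.readAllBytes(file);
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("demo", ".bin");  // stand-in file
        Files.write(tmp, new byte[]{1, 2, 3, 4});
        byte[] data = toByteArray(tmp);  // set this on the mapped attribute
        System.out.println(data.length); // prints 4
        Files.deleteIfExists(tmp);
    }
}
```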

  • Oc4j 9.0.2.0.0 and  8.1.7 client

    I need to use the latest production version of OC4J -> 9.0.2.0.0. My client needs to be able to use the OCI drivers with the 8.1.7 client install. Is it possible to use 9.0.2.0.0 with the 8.1.7 client install?
    I can use the thin driver no problem, but the application requires the speed and extras that OCI provides, especially with blob handling.
    I can use the OCI drivers on a machine with the 9i client installed, no problem.
    But when I use it on a machine with the 8.1.7 client installed, I get (of course) an UnsatisfiedLinkError - it can't find ocijdbc9.
    error:
    java.lang.UnsatisfiedLinkError: no ocijdbc9 in java.library.path
         at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1312)
         at java.lang.Runtime.loadLibrary0(Runtime.java:749)
         at java.lang.System.loadLibrary(System.java:820)
         at oracle.jdbc.oci8.OCIDBAccess.logon(OCIDBAccess.java:294)
         at oracle.jdbc.driver.OracleConnection.<init>(OracleConnection.java:287)
         at oracle.jdbc.driver.OracleDriver.getConnectionInstance(OracleDriver.java:442)
         at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:321)
         at java.sql.DriverManager.getConnection(DriverManager.java:517)
         at java.sql.DriverManager.getConnection(DriverManager.java:177)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.DriverManagerDataSource.getConnection(DriverManagerDataSource.java:219)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.DriverManagerConnectionPoolDataSource.getPooledConnection(DriverManagerConnectionPoolDataSource.java:24)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.OrionPooledDataSource.getPooledConnection(OrionPooledDataSource.java:290)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.PooledConnectionUsage.getPooledConnection(PooledConnectionUsage.java:21)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.OrionPooledDataSource.getConnection(OrionPooledDataSource.java:162)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.DriverManagerXADataSource.getAutoCommitConnection(DriverManagerXADataSource.java:248)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.LogicalDriverManagerXAConnection.intercept(LogicalDriverManagerXAConnection.java:113)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.FilterConnection.prepareStatement(FilterConnection.java:240)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.FilterConnection.prepareStatement(FilterConnection.java:241)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.OrclCMTConnection.prepareStatement(OrclCMTConnection.java:774)
         at gc.hc.pphb.webtos.util.WebtosUserManager.userExists(WebtosUserManager.java:152)
         at gc.hc.pphb.webtos.util.SimpleUserManager.getUser(SimpleUserManager.java:34)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.EvermindHttpServletRequest.getUserPrincipalInternal(EvermindHttpServletRequest.java:3117)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.HttpApplication.authenticate(HttpApplication.java:5470)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.HttpApplication.getRequestDispatcher(HttpApplication.java:2299)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:585)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.HttpRequestHandler.run(HttpRequestHandler.java:243)
         at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].util.ThreadPoolThread.run(ThreadPoolThread.java:64)
    If I download the 9i JDBC drivers and install them, I get a JVM message stating that a dependent library - oracore9.dll - is missing. I've tried changing my ORACLE_HOME variable, adding the /lib directory to the classpath, and replacing \jdbc\classes12dms.jar with a renamed version of classes12.zip (so I have an older version of OCIDBAccess that doesn't call the 9i JDBC libraries), but to no avail. There must be a way to get 9.0.2.0.0 to look for the correct version of the OCI driver....
    Any help would be greatly appreciated...
    Mike

    Mike,
    The Oracle9iAS R2 Client CD on Windows is available. Please download the client software from http://otn.oracle.com/software/products/ias/htdocs/solsoft.html#client; this should have the new version of the OCI drivers. Please install it in a separate Oracle home, include that home's $ORACLE_HOME/bin in your PATH, and then start OC4J and you should be in good shape.
    regards
    Debu
