Loading a blob record

Hi,
In my program I must insert some new records into a database. The problem comes when I have to load the BLOB-type column. I want to put the contents of a byte[] array into that field.

This is my exact code:
        // Connecting to the database
        // (stmt is assumed to be a class field: private Statement stmt;)
        MyDBConnection mdbc = new MyDBConnection();
        mdbc.init();
        Connection conn = mdbc.getMyConnection();
        try {
            stmt = conn.createStatement();
        } catch (SQLException ex) {
            ex.printStackTrace();
        }
// Other code
        public void updateDataBase(String data1, byte[] data) {
            int u = stmt.executeUpdate("insert into myTable (column1) values ("
                    + quotate(data1) + ")");
            ResultSet res = stmt.executeQuery("select * from myTable");
            res.next();
            res.getBlob("data").setBytes(1, data);
        }

// To easily make quotes
        private String quotate(String content) {
            return "'" + content + "'";
        }
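A cleaner way to do this — a minimal sketch, assuming the table/column names from the post (`myTable`, `column1`, and a BLOB column assumed to be named `data`) — is to bind the byte array directly with a `PreparedStatement`, which avoids both the hand-rolled `quotate` quoting and the second SELECT:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BlobInsertExample {

    // Builds a parameterized INSERT so values are bound, never concatenated
    // into the SQL string (no quoting problems, no SQL injection).
    static String buildInsertSql(String table, String col1, String col2) {
        return "insert into " + table + " (" + col1 + ", " + col2 + ") values (?, ?)";
    }

    // Inserts one row, binding the byte[] straight into the BLOB column.
    // Table and column names here are assumptions taken from the question.
    static void updateDataBase(Connection conn, String data1, byte[] data)
            throws SQLException {
        String sql = buildInsertSql("myTable", "column1", "data");
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, data1);
            ps.setBytes(2, data); // the JDBC driver writes the bytes into the BLOB
            ps.executeUpdate();
        }
    }
}
```

Whether `setBytes` suffices for very large BLOBs can depend on the driver; `ps.setBinaryStream(...)` is the usual alternative for large payloads.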

Similar Messages

  • Loading multiple physical records into a logical record

    Hello,
    I'm not sure if this is the right place to post this thread.
    I have to import data from a fixed-length positional text file into an Oracle table using SQL*Loader.
    My sample input file (which has 3 columns) looks like:
    Col1 Col2 Col3
    1 A abcdefgh
    1 A ijklmnop
    1 A pqrstuv
    1 B abcdefgh
    1 B ijklmn
    2 A abcdefgh
    3 C hello
    3 C world
    The above text file should be loaded into the table as:
    Col1 Col2 Col3
    1 A abcdefghijklmnpqrstuv
    1 B abcdefghijklmn
    2 A abcdefgh
    3 C helloworld
    My question: Is there a way that I can use this logic of loading multiple physical records into a logical record for my Oracle tables? Please suggest.
    Thanks in advance.

    Hi,
    user1049091 wrote:
    Kulash,
    Thanks for your reply.
    The order of the concatenated strings is important as the whole text is split into several physical records in the flat file and has to be combined into 1 record in Oracle table.
    My scenario is we get these fixed length input files from mainframes on a daily basis and this data needs to be loaded into a oracle table for reporting purpose. It needs to be automated.
    Am still confused whether to use an external table or a staging table using SQL*Loader. Please advise with more clarity, as I am a beginner in SQL*Loader. Thanks.

    I still think an external table would be better.
    You can create the external table like this:
    CREATE TABLE  fubar_external
    (      col1   NUMBER (2)
    ,      col2   VARCHAR2 (2)
    ,      col3   VARCHAR2 (50)
    )
    ORGANIZATION EXTERNAL
    (      TYPE               ORACLE_LOADER
           DEFAULT DIRECTORY  XYZ_DIR
           ACCESS PARAMETERS
           (   RECORDS DELIMITED BY NEWLINE
               FIELDS  (   col1   POSITION (1:2)
                       ,   col2   POSITION (3:4)
                       ,   col3   POSITION (5:54)
                       )
           )
           LOCATION ('fubar.txt')
    );
    where XYZ_DIR is the Oracle Directory on the database server's file system, and fubar.txt is the name of the file in that directory. Every day, when you get new data, just overwrite fubar.txt. Whenever you query the table, Oracle will read the file that's currently in that directory. You don't have to drop and re-create the table every day.
    Note that the way you specify the columns is similar to how you do it in SQL*Loader, but the SEQUENCE generator doesn't work in external tables; use ROWNUM instead.
    Do you need to populate a table with the concatenated col3's, or do you just need to display them in a query?
    Either way, you can reference the external table the same way you would reference a regular, internal table.
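    For the display-only case, the grouping logic the thread asks about (concatenate col3 for each col1/col2 pair) can also be sketched outside the database. This is an illustrative Java version only, not the SQL*Loader or external-table mechanism itself:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RecordMerger {
    // Concatenates col3 values that share the same (col1, col2) key,
    // preserving the order in which keys first appear in the file.
    static Map<String, String> merge(String[][] rows) {
        Map<String, String> logical = new LinkedHashMap<>();
        for (String[] row : rows) {
            String key = row[0] + " " + row[1];         // col1 + col2
            logical.merge(key, row[2], String::concat); // append col3
        }
        return logical;
    }
}
```

    In the database itself, the same result could be produced with a grouped string-aggregation query (e.g. LISTAGG, available in Oracle 11g and later) over the external table.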

  • Error in Loading a BLOB Column in Oracle Database from 1 to diff  schema

    Hi ALL,
    I am in a POC where I have to load BLOB data from one schema to a different schema, i.e. from Staging to Target.
    I load my staging (Oracle schema) through Oracle PL/SQL. Now I have to load into my ODS.
    I am using the LKM (LKM SQL to Oracle) and IKM (IKM Oracle Incremental Update PLSQL).
    It errors out in the 3rd step, 3-Loading-SS_0 Load Data.
    The script used and the error message are in the attachment. However, if I run the script manually it runs and the load happens successfully. Also, I was able to load the same BLOB objects if the tables were in the same schema (in that case the LKM is bypassed).
    Any Thoughts on this?
    The Error I receive is:
    java.lang.NumberFormatException: For input string: "4294967295"
         at java.lang.NumberFormatException.forInputString(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
         at oracle.jdbc.driver.OracleResultSetMetaData.getPrecision(OracleResultSetMetaData.java:303)
         at com.sunopsis.sql.SnpsQuery.getResultSetParams(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Thanks & Regards,
    Krishna

    Hi,
    Are you on the same database? If yes, it is better to grant SELECT on the source schemas to your Staging Area user.
    Then, in Topology, define all the physical schemas under the same Data Server.
    If you are on distinct databases, a good solution is to use a DBLink (KM), because when a SQL-to-SQL KM is used the data goes through the agent, which converts it to Java types.
    I'm not sure whether it can handle LOB fields... every time I needed to do this, I used the approaches I described.
    Does it make any sense to you?
    Cezar

  • Sqlldr is loading only 1st record from xml document

    Hi,
    I am trying to load an XML doc with multiple records using SQL*Loader.
    I have registered my XSD perfectly.
    This is my control file
    LOAD DATA
    INFILE *
    INTO TABLE Orders APPEND
    XMLType(xmldata)
    FIELDS(
         xmldata LOBFILE (CONSTANT FULDTL_2.xml)
    TERMINATED BY '???')
    BEGINDATA
    FULDTL_2.xml
    -- Here, what I have to give for TERMINATED BY '???'
    My xml doc
    <Order ID="146120486" Status="CL" Comments="Shipped On 08/05/2008"/>
    <Order ID="143417590" Status="CL" Comments="Handset/Device has been received at NRC" ShipDate=""/>
    sqlldr is loading only the 1st record from the file.
    How can I make it load all the records from my XML doc?
    Thanks in advance.

    Thanks for both the replies above - essentially the same correct solution.
    Something worth noting, now that I've written and tested both a SAX solution and a DOM solution, is that there is a significant (4x) time penalty using SAX.
    I am considering dividing the vector I am storing/recovering into chunks and saving each chunk separately using DOM to speed things up...
    any thoughts on this approach?
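    For reference, the streaming (SAX) approach discussed above can be sketched in a few lines — here counting `<Order>` elements, with the records wrapped in a hypothetical `<Orders>` root so the document is well-formed (the two top-level `<Order>` elements in the question have no common root, so a parser stops after the first):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class OrderCounter {
    // Streams the document and counts <Order> elements without building
    // an in-memory tree (the SAX trade-off discussed in the reply above).
    static int countOrders(String xml) throws Exception {
        final int[] count = {0};
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if ("Order".equals(qName)) {
                    count[0]++;
                }
            }
        };
        SAXParserFactory.newInstance().newSAXParser().parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)), handler);
        return count[0];
    }
}
```

    The 4x penalty mentioned above is workload-dependent: a streaming parser avoids building the DOM tree at the cost of callback bookkeeping.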

  • Bapi to load Purchase Info records.

    Hi all,
    Can anyone suggest a BAPI for loading purchase info records?
    thanks,
    deepthi.n

    There is no BAPI in 4.6C, but you can use the ALE program RBDSEINF,
    or you can use the ME_POST_INFORECORD FM.

  • Application view loading all the records on view

    Hello,
    We have a customization navigating from JTT pages to open an OA Framework page. Occasionally users report that the target OA Framework page hangs, with poor response from the application. We noticed in EBS -> Poolmonitor that the VO is trying to load all the records from the table; we are unable to reproduce this scenario.
    Why is the application losing the request parameters?
    Any help is appreciated, thanks in advance.
    Pullarao.


  • Loading a BLOB to a table, using webutil

    Is there somewhere I can see a comprehensive example of loading a BLOB into a table using WebUtil? That is, where the user browses and selects a picture (or whatever) from the client machine, and then the file is loaded into the database.
    Thanks, Wayne

    Hello,
    You can take inspiration from this article.
    Francois

  • SQL*Loader-510: Physical record in data file (clob_table.ldr) is long

    If I generate loader / Insert script from Raptor, it's not working for Clob columns.
    I am getting error:
    SQL*Loader-510: Physical record in data file (clob_table.ldr) is longer than the maximum (1048576)
    What's the solution?
    Regards,

    Hi,
    Has the file somehow been changed by copying it between Windows and Unix? Or was a file transfer done as binary instead of ASCII? The most common cause of your problem is that the end-of-line carriage return characters have been changed so they are no longer \r\n. Could this have happened? Can you open the file in a good editor, or do an od command in Unix, to see what is actually present?
    Regards,
    Harry
    http://dbaharrison.blogspot.co.uk/

  • HR_BLP_SAVE_TIMEDATA and how to load high volume records onto Infotype 2001

    Because of the high data volume (an estimated 1 million records each run) and a short SLA (runs every 2 hours), we chose to use the functions called by CAT6 (skipping the CATS-to-HR-infotypes loading process) to load the time records directly into Infotype 2001 and Infotype 2002.
    Now we discover that some records (sporadically and randomly) don't get loaded into the infotypes, and they are not returned in the error message table either. We are wondering whether anyone has used these functions before, and how they resolved the issues of missing records / error handling.
    We are also wondering if there is any other feasible solution to meet our customer's need.
    The two functions we use are:   HR_BLP_MAINTAIN_TIMEDATA and HR_BLP_SAVE_TIMEDATA


  • Excel to SQL Data Flow Task loads some NULL records

    I have an SSIS package to import Excel data into SQL. When I look at the Excel file I don't see ANY null rows. However, when I run the SSIS package and check SQL, it loads some NULL records. Why is it that it loads null records into SQL when I cannot see the null recs in Excel?

    That's because the person who created the Excel file and added the data to it pressed the "Enter" key on some of the empty cells under that specific column. In Excel, such a row might look like any other empty, non-data row, but for SSIS it's an actual data row, except that it doesn't have any value. SSIS treats a missing data value as a DBNULL, and that's what you saw.
    There are multiple ways of fixing this. 
    1. Educate the creator of the Excel file to be more careful at the time of data entry.
    2. Make that target column in the DB table as NOT NULL. 
    3. Handle such "empty" data values as exception inside your SSIS (using a data flow task and ignoring such rows).
    4. Switch to using the CSV file format instead of the Excel format.
    5. All of the above :)
    Hope this helps.
    Cheers!
    Muqadder.

  • LSMW loading multiple structured records..

    Hi,
    We want to know how to load multi-structured records through LSMW, i.e. we have customer records which have multiple sub-items, e.g. telephone numbers and multiple identification numbers for each customer.
    Q. What is the best way to load these through LSMW?
    Q. Do we need one input file or multiple input files (preferred)? How do we link these records in different files?
    Q. Do we use batch recording or BAPI input? We want to load the Business Partner object (I know nothing about SAP!). What BAPI should we use?
    mike

    First you have to work out what your logical group of records is going to be. Business partner, addresses telephone numbers, e-mail, web address, etc.
    Now you must decide how your input data is to be structured. Single file for all records, or multiple files, one for each record type.
    If you want multiple input files, then each record MUST have a unique key that LSMW can use to group together your logical set.
    If you want to use a single input file, then either all records have the same structure and a unique key identifier for each logical group, or you have many record types and an identifier for each one. You must have a way of indicating the start of the next logical group (record type '1', or a change of unique identifier).
    You can only sort your input files if you include a unique key in each record.
    Now look to see what methods are available to load the data. BAPIs (Business APIs) are functions that can be called both from within SAP and from external systems. IDocs (Intermediate DOCuments) are programs that process data in a known format (related to EDI) into SAP; most of these programs are now BAPIs. BI and DI (Batch and Direct Input) are older SAP programs that apply all the edits and the sequence of screens that a user would use; BI allows you to process the updates at a later time. BDC (Batch Data Communication) is a process whereby a program tries to imitate a user inputting the data, to be used as a last resort.
    Do not expect to be able to get all of your logical group loaded with a single method.
    Look at BAPIs whose names start with BAPI_BUPA (transactions BAPI, SE37).
    MattG.
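    The "unique key per logical group" requirement above can be sketched generically. The field layout here (key in the first column, record type in the second) is hypothetical, not an actual LSMW structure:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LogicalGrouper {
    // Groups flat input records into logical sets by their unique key
    // (first field), as LSMW would when linking partner, address and
    // telephone records that belong to one business partner.
    static Map<String, List<String[]>> group(List<String[]> records) {
        Map<String, List<String[]>> sets = new LinkedHashMap<>();
        for (String[] rec : records) {
            sets.computeIfAbsent(rec[0], k -> new ArrayList<>()).add(rec);
        }
        return sets;
    }
}
```

    Sorting the input by that key first, as the reply recommends, guarantees that each logical set arrives contiguously.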

  • Delta update data error. Can we do a full load with bad records?

    Hello,
    We are working with SAP BW 7.0, and we had a problem with a delta update. The delta update itself ran correctly; however, one record was loaded incorrectly, even though that record was correct in the source system.
    We reloaded just that record, and now we have it right in the master data. But we must now update the InfoCube. The data came in with a delta load, and now to load only this record we must do a full load (with only one record), then delete the wrong record from the InfoCube with a selective deletion.
    The problem is that we have doubts about how this would affect the delta loads, because loads are scheduled for each day; the delta could lose track of where the next load should start, and we could have problems with the delta loads over the next few days.
    Thank you.

    hi,
    What is your delta extractor (LIS or not LIS)? What is your target (DSO or cube)?
    Depending on your source and target the procedure is not the same, but you can do it in every case:
    In the non-LIS case:
    just reload with a full InfoPackage to the PSA.
    In the LIS case:
    delete the setup tables,
    then do a restructuring run for your record (if you can, with a selection condition).
    If the target is a cube in BW:
    do a simple full upload.
    If the target is a DSO:
    do a full upload in repair mode (if the dataflow is 3.x), otherwise just use a DTP.
    But if your target is a DSO, and that DSO loads other InfoCubes afterwards, be sure that they are not corrupted.
    Cyril

  • Data Load with 0 records

    Hi,
    How should the system react in the following cases:
    1. A full load bringing in 0 records
    2. An init load bringing in 0 records
    3. A delta load bringing in 0 records
    Note: by 0 records I mean the load actually contains no records.
    For each of the above cases, will the load turn green, or remain yellow and then time out?
    I always get different reactions from the system in these cases. I'd appreciate views from experts…
    Thank you,
    sam

    JR Roberto, the setting which you said exists is true,
    but I already have that setting marked as green.
    I did an init load which pulled in 0 records; this is correct. Now, even though green is checked for 0 records in the RSMO settings, the load errored out after the timeout setting in the InfoPackage,
    and the main traffic light is still running, with "No errors could be found. The current process has probably not finished yet."
    Any tips?

  • (8i) Using | (pipe) as the record separator in SQL*Loader

    Product: ORACLE SERVER
    Date written: 2003-10-21
    ===============================================================
    (8i) Using | (pipe) as the record separator in SQL*Loader
    ===============================================================
    PURPOSE
    Starting with Oracle8i, you can specify the record terminator when using SQL*Loader.
    Explanation
    Before Oracle8i, the record separator defaulted to a linefeed (carriage return, newline, and so on). You had to supply options such as VAR or FIX to handle other file layouts, which was complicated and not flexible.
    Starting with Oracle8i, you can specify the record terminator when using SQL*Loader. When the data to be loaded contains newline or carriage return characters, or other special characters, you can specify the record terminator in hexadecimal.
    Example
    The following example uses '|' (pipe) as the record separator.
    To use a record separator, specify the appropriate value in the 'infile' clause of the SQL*Loader control file.
    The example below specifies "str X'7c0a'" in the 'infile' clause in order to use '|' (pipe).
    --controlfile : test.ctl
    load data
    infile 'test.dat' "str X'7c0a'"
    into table test
    fields terminated by ',' optionally enclosed by '"'
    (col1, col2)
    --datafile: test.dat
    1,this is the first line of the first record
    this is the second|
    2,this is the first line of the second record
    this is the second|
    SQL> desc test
    Name Null? Type
    COL1 VARCHAR2(4)
    COL2 VARCHAR2(100)
    $ sqlldr scott/tiger control=test.ctl log=test.log
    Looking at the loaded data below, you can see that data containing carriage returns was stored correctly in a single column.
    SQL> select * from test;
    COL1
    COL2
    1
    this is the first line of the first record
    this is the second
    2
    this is the first line of the second record
    this is the second
    RELATED DOCUMENT
    <Note:74719.1>

  • Loading 3 million records into a database via external table

    I am loading 3+ million records into a database by using external tables. It is a very slow process. How can I make this process faster?

    Hi,
    1. Break the file down into several files, let's just say 10 files (300,000 records each).
    2. Disable all indexes on the target table if possible.
    3. Disable foreign keys if possible; besides, you can check these later using an exceptions table.
    4. Make sure FREELISTS and INITRANS are 10 for the target table, if you are inserting into a table that resides in a manual segment space management tablespace.
    5. Create 10 processes, each reading from its own file; run these 10 processes concurrently, and use LOG ERRORS with an unlimited reject limit so the insert will continue until finished.
    Hope this helps.
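    Step 1 above (splitting the load into equal chunks) can be sketched generically; this is a hypothetical helper, not tied to any Oracle API:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkSplitter {
    // Splits records into fixed-size chunks so each chunk can be written
    // to its own file and loaded by a separate concurrent process.
    static <T> List<List<T>> split(List<T> records, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < records.size(); i += chunkSize) {
            chunks.add(new ArrayList<>(
                records.subList(i, Math.min(i + chunkSize, records.size()))));
        }
        return chunks;
    }
}
```

    Each chunk would then be written to its own file and handed to one of the concurrent load processes described in step 5.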
