Insert large collections

Oh great Oracle gurus...
How do I insert into collections (varrays or nested tables) where I have more than 1000 members in the collection? I have carefully built an insert statement which gathers up all the members of a collection and tries to insert them as a single statement, but when the collection has a large number of members I get the error ORA-00939: too many arguments for function.
I assume that I can insert numerous times into a nested table, but I am not sure exactly how to do this (the samples in the documentation seem vague). However, the documentation says that a varray must be inserted as a unit.
Cameron

Hi,
The docs say an expression list can't contain more than 1000 expressions, so I would guess this is the limit you're hitting here.
However, in my testing I can't insert any more than 999 elements in a nested table using an expression list. With 1000 elements:
[email protected](161)> create type nt_type is table of varchar2(10);
  2  /
Type created.
[email protected](161)> create table test (x int, y nt_type) nested table y store as test_y;
Table created.
[email protected](161)> @ed
  1  declare
  2     sql_stmt varchar2(32767);
  3  begin
  4     sql_stmt := 'insert into test (x, y) values (1, nt_type(';
  5     for i in 1 .. 999 loop
  6        sql_stmt := sql_stmt || '''hello'',';
  7     end loop;
  8     sql_stmt := sql_stmt || '''hello''))';
  9     execute immediate sql_stmt;
10* end;
11  /
declare
ERROR at line 1:
ORA-00939: too many arguments for function
ORA-06512: at line 9
However, if I try to insert only 999 elements:
[email protected](161)> ed
Wrote file TEST.BRANDT_161_20070322212141.sql
  1  declare
  2     sql_stmt varchar2(32767);
  3  begin
  4     sql_stmt := 'insert into test (x, y) values (1, nt_type(';
  5     for i in 1 .. 998 loop
  6        sql_stmt := sql_stmt || '''hello'',';
  7     end loop;
  8     sql_stmt := sql_stmt || '''hello''))';
  9     execute immediate sql_stmt;
10* end;
[email protected](161)> /
PL/SQL procedure successfully completed.
[email protected](161)> select * from test;
         X Y
         1
NT_TYPE('hello', 'hello', 'hello', 'hello', 'hello', 'hello
', 'hello', 'hello', 'hello', 'hello')
[email protected](161)> select cardinality(y) from test;
CARDINALITY(Y)
           999
Maybe you could try a temporary table approach:
[email protected](161)> delete from test;
1 row deleted.
[email protected](161)> create global temporary table gtt(y varchar2(10)) on commit delete rows;
Table created.
[email protected](161)> insert into gtt(y)
  2     select 'hello'
  3     from dual
  4     connect by level <= 2000;
2000 rows created.
[email protected](161)> insert into test(x, y) values (1, (select cast(collect(y) as nt_type) from gtt));
1 row created.
Elapsed: 00:00:00.07
[email protected](161)> select cardinality(y) from test;
CARDINALITY(Y)
          2000
Elapsed: 00:00:00.01
[email protected](161)> commit;
Commit complete.
Elapsed: 00:00:00.01
[email protected](161)> select * from gtt;
no rows selected
I must say, though... a nested table with more than 1000 elements sounds to me like it should just be a "table".
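Incidentally, if the values are being assembled programmatically anyway, you can sidestep the expression-list limit altogether by populating a collection variable in PL/SQL and binding it as a single value. A minimal sketch against the same test table:
declare
   v nt_type := nt_type();
begin
   v.extend(5000);
   for i in 1 .. 5000 loop
      v(i) := 'hello';                     -- build the collection in memory
   end loop;
   insert into test (x, y) values (2, v);  -- the whole collection is one bind value
end;
/
Because the collection arrives as one bound value rather than thousands of expressions, ORA-00939 never comes into play.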
cheers,
Anthony

Similar Messages

  • Large Collection, Re-Adding Music Folder . . . Best Practice?

My wife and I have a very large collection of music that we have been trying to incorporate into a single library for our laptops to use. I think I have it mostly figured out for sharing a library (I found an article about using SyncToy in Windows to essentially mirror the libraries from a shared network drive while using the same physical copy of the music files on the network drive). This seems to be working so far, but...
In the process I have been moving from having a mirrored copy of the music on each laptop (via an external USB drive on each laptop) to consolidating it into one music folder on the networked drive. I've got it just about complete, but I ran an xcopy script again to ensure all the music files from those drives really made it in their entirety to the networked drive AFTER adding the folder to the library in iTunes. Now of course iTunes doesn't know those new files are there, and they are all over the place, not in just one directory.
    Now the question, if I select add folder to library in iTunes again selecting the entire music folder so it will pick up those new files, will it create duplicate entries in iTunes? Or should this work ok?
    Is there an easier way to do this?

    I think it will not add duplicates. It might if you also tell iTunes to add music to the iTunes library as well. Here is my tip and what I do.
I make a music library folder called "Music Library"; in there I store my music by artist (each in their own folder), and then albums in subfolders. I name the folders exactly as they are embedded in the mp3. I know this is not necessary, but that's just me. For my compilation music I have a folder called Compilations, and again the albums are stored in subfolders named after the album.
    I also name every file with the track, artist and title (I use mp3Tag for this) so every track on my system looks something like this
    "01 The Who - I'm A Man.mp3"
Now I use iTunes to encode and get track info, but that's it; I then do the renaming manually.
    Now that I have all this done and backed up on DVD's I can move, reload etc etc when I want to. I have it all on external hard drives as well.
    So finally I ensure that my iTunes default library folder is always empty, that way whenever I delete all or part of "My" library iTunes does not ask me if I want to delete them from the hard drive.
    Manually adding the top level folder to iTunes does not duplicate anything. I regularly delete my library and reinstall it. This does not take long, I also save playlists and import them back as well.
Letting iTunes organise your music is OK for some, but not me. For one thing it does not add the artist to the actual filename, which is crazy if you ask me, especially if you want to do other things with the files. Of course Apple does not want you to do other things with your music.
If you look on this forum at posts relating to "iTunes has deleted my music", "all my files are missing", or "iTunes can't find my music", I bet the people all let iTunes manage their music library and turned on the sync function. I used to have some of these issues but not anymore. I deleted files from my hard drive twice by accident, and I'm an experienced computer user. Now I'm 100% trouble free thanks to doing things manually and making my own music folder.

HT1386 hi, when i sync my books from my mac to ipad, even after one full day of syncing itunes shows "wait for items to copy" and the books on the ipad are still not fully loaded. how do i fix this? i have a large collection of books

hi, when i sync my books from my mac to ipad, even after 24 hours of syncing itunes shows "wait for items to copy" and the books on the ipad are still not fully loaded.
how do i fix this? i have a large collection of around 12,000 books

    Nobody knows? Not even administrators?
Please, it would be really nice to have help on this so we can take full benefit of the remote app.
    Thank you very much in advance.

  • Very large collection, slow iTunes

I have a massive iTunes library, about 500 GB and 95,000 songs. I have been collecting music for almost 40 years and have ripped my entire CD collection and about 1,000 of my LPs so far. But iTunes is very slow with a large collection like this. Browsing by any of the cover art options is painfully slow. Does anyone have tips on how to speed things up? It would be appreciated. All of my music is in AAC format.
    iMac G5 20 1.8 ghz   Mac OS X (10.4)  

    I'm desperately trying to solve this problem as well. Here's my experience:
    - almost 60,000 songs/300+GB
    - Dual 2.0 G5 w/ 3.5 GB of RAM
    - takes time to load library at application start-up (understandable and doesn't bother me).
    - have run it on both internal and external firewire 400 drives with same problems.
    - my main problem is getting the beachball when I <get info> to alter track info. There is some lag time when using the search box as well amongst a lot of other things. In general, the application just doesn't respond as fast as I would like and it seems to be the size of the library.
    - I added 2GB of ram (up from 1.5) and that didn't help.
    - My percentage of HD used does not seem to affect it.
My only other option (that I can think of) is to get a faster HD (or hard drives and stripe them) to try to improve access or seek times.
    I found this bit of info though in another thread that I found very interesting...
    "Re: Need Help with EXTREMELY Large Library!
    Posted: Sep 7, 2006 7:21 AM   in response to: Bill Ryan2
    I don't know how much of the slowdown is iTunes managing its playlists, and how much is the operating system handling the iTunes Library (the single file containing the tag information).
    I ran Sonar (http://www.matterform.com/macsoftware/filesecurity/) to watch file read/write activity while editing track tags. I saw a lot of iTunes Library temp and XML file writing.
    My iTunes Library (for 160 Gb) of music is 53 Mb. But the Spinning Beachball corresponds to writing this file. Maybe a faster disk would help things."
This is my fear... the fact that iTunes stores all that info in one file (terrifying to me) means it can only handle so much.
    Has anyone tried any sort of RAID array with fast HD's with large buffers? I'd love to know if this will help before I sink a bunch of cash into this setup. Multiple libraries are not an option for me...it's kind of like keeping half my records at another house.
Dual 2.0 G5 w/ 3.5 GB of RAM   Mac OS X (10.4.8)

  • Inserting large xml data into xmltype

    Hi all,
    In my project I need to insert very large XML data into xmltype column.
    My table:
CREATE TABLE TransDetailstblCLOB (id NUMBER, data_xml XMLTYPE) XMLTYPE COLUMN data_xml STORE AS CLOB;
I am using a JDBC approach to insert the values. It works fine for data less than 4000 bytes when using preparedStatement.setString(1, xmlData). As I have to insert large XML data (>4000 bytes), I am now using the preparedStatement.setClob() method.
My code works fine for a table which has its column declared explicitly as CLOB. But for TransDetailstblCLOB, where the column is declared as XMLTYPE with CLOB as the storage option, I am getting the error: "ORA-01461: can bind a LONG value only for insert into a LONG column".
This error means that there is a mismatch between my setClob() and the column, which suggests I am not storing into a CLOB column.
I read on the Oracle site that:
"When you create an XMLType column without any XML schema specification, a hidden CLOB column is automatically created to store the XML data. The XMLType column itself becomes a virtual column over this hidden CLOB column. It is not possible to directly access the CLOB column; however, you can set the storage characteristics for the column using the XMLType storage clause."
I don't understand: it's stated here that there is a hidden CLOB column, so why can't I use setClob()? It worked fine for a pure CLOB column (another table), so why does it give this error for the XMLTYPE table?
I have been stuck on this for 3 days. Can anyone help me please?
    My code snippet:
    query = "INSERT INTO po_xml_tab VALUES (?,XMLType(?)) ";
              //query = "INSERT INTO test VALUES (?,?) ";
         // Get the statement Object
         pstmt =(OraclePreparedStatement) conn.prepareStatement(query);
         // pstmt = conn.prepareStatement(query);
         //xmlData="test";
    //      If the temporary CLOB has not yet been created, create new
         temporaryClob = oracle.sql.CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
         // Open the temporary CLOB in readwrite mode to enable writing
         temporaryClob.open(CLOB.MODE_READWRITE);
         log.debug("tempClob opened"+"size bef writing data"+"length "+temporaryClob.getLength()+
                   "buffer size "+temporaryClob.getBufferSize()+"chunk size "+temporaryClob.getChunkSize());
         OutputStream out = temporaryClob.getAsciiOutputStream();
         InputStream in = new StringBufferInputStream(xmlData);
    int length = -1;
    int wrote = 0;
    int chunkSize = temporaryClob.getChunkSize();
    chunkSize=xmlData.length();
    byte[] buf = new byte[chunkSize];
    while ((length = in.read(buf)) != -1) {
    out.write(buf, 0, length);
    wrote += length;
    temporaryClob.setBytes(buf);
    log.debug("Wrote lenght"+wrote);
         // Bind this CLOB with the prepared Statement
         pstmt.setInt(1,100);
         pstmt.setStringForClob(2, xmlData);
         int i =pstmt.executeUpdate();
         if (i == 1) {
         log.debug("Record Successfully inserted!");
         }

Try this; it works in ADOdb:
declare
   poXML CLOB;
BEGIN
   poXML := '<OIDS><OID>large text</OID></OIDS>';
   UPDATE a_po_xml_tab SET podoc = XMLType(poXML) WHERE poid = 102;
END;

  • How to insert large xml file to XMLType column?

    Hi,
I have a table with one column of type XMLType (Binary XML storage option and Free Text Indexing). When I try to insert a large XML document, around 8 KB, I get the error ORA-01704: string literal too long.
    Insert into TEST values(XMLTYPE('xml HERE'));
    How to insert large XML values to XMLType column?
    Regards,
    Sprightee

    For a large XML file, you basically have two options - you can load the string directly as an XMLType, or you can load the string as a CLOB and cast it on the database side to an XMLType.
    If you decide to load the XML as XmlType client-side, then you may be interested to know that versions of Oracle after 11.2.0.2 support the JDBC 4.0 SQLXML standard. See the JDBC driver release documentation here:
    http://docs.oracle.com/cd/E18283_01/java.112/e16548/jdbcvers.htm#BABGHBCC
If you want to load it as a CLOB, then you'll need to use PreparedStatement's setClob() method, or allocate an oracle.sql.CLOB object.
    For versions before 11.2.0.2, you can create an XMLType with a constructor that includes an InputStream or byte[] array.
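If the CLOB route is taken, the key point is that a bind variable is not subject to the 4000-byte limit on string literals; that limit is what raises ORA-01704. A minimal database-side sketch (assuming a one-column XMLType table like TEST above, with the CLOB built in PL/SQL purely for illustration):
declare
   l_xml clob;
begin
   -- in real code the CLOB would be bound from the client
   l_xml := '<doc>' || rpad('x', 32000, 'x') || '</doc>';
   insert into test values (xmltype(l_xml));
end;
/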
    HTH

  • Powershell Get-CMDevice Failure on Large Collections

We are getting an error with the PowerShell 'Get-CMDevice' cmdlet with larger collections. The line we are using is:
    Get-CMDevice -CollectionName "Test All Desktops and Laptops"
This works on the smaller collections, but with this one we get the error:
    Get-CMDevice : ConfigMgr Error Object:
    instance of __ExtendedStatus
    Description = "[42000][191][Microsoft][SQL Server Native Client 11.0][SQL
    Server]Some part of your SQL statement is nested too deeply. Rewrite the query or
    break it up into smaller queries.";
    Operation = "ExecQuery";
    ParameterInfo = "SELECT * FROM SMS_CombinedDeviceResources WHERE
    (ResourceID=16777223 OR ResourceID=16777224 OR ResourceID=16777226 OR
    ResourceID=16777227 OR ResourceID=16777228 OR ResourceID=16777229 OR
    ResourceID=16777244 OR ResourceID=16777250 OR ResourceID=16777251 OR
    ResourceID=16777260 OR ResourceID=16777272 OR ResourceID=16777273 OR
    ResourceID=16777274 OR ResourceID=16777275 OR ResourceID=16777278 OR ..............ResourceID=16780606  OR ResourceID=2097154624  OR ResourceID=2097154626  OR 
    ResourceID=2097154645  OR ResourceID=2097154873 ) ";
        ProviderName = "WinMgmt";
        StatusCode = 2147749889;
    Error Code:
    Failed
    At line:1 char:1
    + Get-CMDevice -CollectionName "Test All Desktops and Laptops"
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (Microsoft.Confi...etDeviceCommand:GetDe 
       viceCommand) [Get-CMDevice], WqlQueryException
        + FullyQualifiedErrorId : UnhandledExeception,Microsoft.ConfigurationManagement. 
       Cmdlets.Collections.Commands.GetDeviceCommand
    The collection we are querying has 668 machines in it (all direct membership).  
    I have tried using 'Set-CMQueryResultMaximum' to set the maximum to say 2000 but this still gets the same error.
    Environment : SCCM 2012 SP1 CU3 
    Has anyone else seen this or found a fix etc?

    Hi, MattB101. This has been fixed in R2.
    Check out my Configuration Manager blog at http://aka.ms/ameltzer

Since I first discovered the Genius Mix feature I have had it turned on.  I have a large collection of R&B

Since I discovered the Genius Mix feature in iTunes I have had it turned on. I have a large collection of R&B music and enjoyed the Genius Mix "Classic R&B". Lately, since I upgraded the iTunes software, the category disappeared. What's up?

    Hi Jon,
Here is a link to my scripts page. You're actually more likely to grab my attention via a message here, tacked onto any thread that I've posted to, than via email, as I get so much spam that I often miss things.
    Most text fields in iTunes are limited to 255 characters. I believe Description can be longer, but it is difficult to access, so the Lyrics field may be the best place to put the data, assuming iTunes is prepared to accept and store it for the given file type.
    I've developed techniques for matching items in the iTunes database with items in other lists by making a dictionary of identifying strings. For example even though iTunes doesn't allow direct access to a track by its file path, if your Excel spreadsheet knows them then I could find the right video to attach the other data to.
    When the time comes to move the library see this migrate iTunes library post.
    tt2

  • Large Collections

    Hey gang,
    A while ago a discussion went on about large collections. I decided to do
    the following for a collection that can be huge:
class Container {
    Vector items;
}
class Item {
    Container parent;
}
Now items always have their parent set. Also, when an item has its parent
    set, I automatically add it to the Vector items. Now for retrieval of the
    items I've defined a few methods. There are methods that work directly with
    the items collection and there are methods that do a query for the items
    that belong under it and return only subsets. This is so that a small items
    collection can be obtained directly and a large collection is done through
    queries so as not to take too long.
    First of all, does this sound like an OK way to attack this problem?
    Second, is there a JDO method to determine if the items collection has been
    fetched yet?

First of all, does this sound like an OK way to attack this problem?
Yes. Very good.
    >
Second, is there a JDO method to determine if the items collection has
been fetched yet?
There isn't a general JDO way. You can get into Kodo-specific ways:
PersistenceManagerImpl pm = (PersistenceManagerImpl)
    JDOHelper.getPersistenceManager(this);
StateManagerImpl sm = pm.getState(JDOHelper.getObjectId(this));
int field = sm.getMetaData().getField("fieldName").getIndex();
boolean loaded = sm.getLoaded().get(field);
Hmm, I guess it's only natural, though a pity that this isn't JDO standard.
    Thanks for the Kodo tip. However, I think I'll monitor my accessor methods
    then and choose which to use by deciding if the collection has already been
    loaded.

How to insert large xml file into xmltype column

    Hi all,
I am using Oracle 10g. I have a table example1 (key_column varchar2(10), xml_column xmltype). I want to insert an XML file from the server into the xmltype column. Can anybody please suggest how to do this? Also, what is the size limit of the xmltype column: is it the same as a CLOB (2 GB) or something different? One more thing: I am writing a procedure in which we fetch the xmltype value into a CLOB variable. Is that correct or not?
    Thanks

Manipulating a single XMLType of 2GB is probably not sensible, period. I'm guessing that the content of a file that big is probably a set of smaller XML documents which someone has wrapped a begin and end tag around. Since databases are good at handling collections of documents (or collections of anything else for that matter), you should consider decomposing the large document back into a set of smaller documents before storing it in the database.
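For what it's worth, in 10gR2 and later that decomposition can be sketched in SQL with XMLTable (the table, column, and element names here are hypothetical):
insert into small_docs (doc)
select x.column_value
from   big_doc b,
       xmltable('/wrapper/*' passing b.doc) x;
Each child element of the wrapper root then becomes a row of its own, which keeps the individual documents at a manageable size.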

  • Unable to insert large blob data into DB

    Hi,
    I have written a piece of code which serializes a java object and writes it into a blob in DB.
    It works pretty fine for small objects of around 3 to 4 KB size, but I start getting trouble with larger objects.
    Here's my piece of code -
    private final static String QUERY_INSERT_AUDIT_DATA = "INSERT INTO " +
    "KAAS_AUDIT_DATA(object_id, object_type_cd, create_by, summary_data) VALUES (?, ?, ?, ?)";
byte[] byteArray;
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(bos);
    oos.writeObject(summaryData);
    oos.flush();
    oos.close();
    byteArray = bos.toByteArray();
    bos.close();
    ByteArrayInputStream bais = new ByteArrayInputStream(byteArray);
    BufferedInputStream buffInputSteam = new BufferedInputStream(bais);
    trace("addAuditSummary() : byteArray " + byteArray.length);
    trace("addAuditSummary() : buffInputSteam.available " + buffInputSteam.available());
    trace("addAuditSummary() : Calling Query to populating data");
    statement = conn.prepareStatement(QUERY_INSERT_AUDIT_DATA);
    statement.setString(1, objectId.toUpperCase());
    statement.setInt(2, objectType);
    statement.setString(3, createdBy);
    statement.setBinaryStream(4, buffInputSteam, buffInputSteam.available());
    statement.executeUpdate();
    statement.close();
Basically, I am converting the object to a BufferedInputStream to send it to the BLOB (summary_data).
    The error I get is -
    ][30 Nov 2007 10:38:08] [ERROR] [com.hns.iag.kaas.util.debug.DebugDAO] addAuditSummary() : SQL exception occured while adding audit summary data for Object: BUSINESS_SO_BASE_DEAL
    ]java.sql.SQLException: Io exception: Connection reset by peer.
    at oracle.jdbc.dbaccess.DBError.throwSqlException(Ljava/lang/String;Ljava/lang/String;I)V(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(ILjava/lang/Object;)V(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(Ljava/io/IOException;)V(DBError.java:334)
    at oracle.jdbc.ttc7.TTC7Protocol.handleIOException(Ljava/io/IOException;)V(TTC7Protocol.java:3675)
    at oracle.jdbc.ttc7.TTC7Protocol.doOall7(BBI[B[Loracle/jdbc/dbaccess/DBType;[Loracle/jdbc/dbaccess/DBData;I[Loracle/jdbc/dbaccess/DBType;[Loracle/jdbc/dbaccess/DBData;I)V(Optimized Method)
    at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(Loracle/jdbc/dbaccess/DBStatement;B[BLoracle/jdbc/dbaccess/DBDataSet;ILoracle/jdbc/dbaccess/DBDataSet;I)I(TTC7Protocol.java:1141)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout()V(Optimized Method)
    at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate()I(Optimized Method)
    at weblogic.jdbc.wrapper.PreparedStatement.executeUpdate()I(PreparedStatement.java:94)
    at com.hns.iag.kaas.util.debug.DebugDAO.addAuditSummary(Ljava/lang/String;ILjava/lang/Object;Ljava/lang/String;)V(DebugDAO.java:794)
    at com.hns.iag.kaas.servlets.sdm.action.SummaryAction.perform(Lcom/hns/iag/kaas/servlets/sdm/core/Event;Lcom/hns/iag/kaas/servlets/sdm/core/UserContext;)Ljava/lang/String;(SummaryAction.java:246)
    at com.hns.iag.kaas.servlets.sdm.SDMControllerServlet.process(Ljavax/servlet/http/HttpServletRequest;Ljavax/servlet/http/HttpServletResponse;)V(SDMControllerServlet.java:213)
    at com.hns.iag.kaas.servlets.sdm.SDMControllerServlet.doPost(Ljavax/servlet/http/HttpServletRequest;Ljavax/servlet/http/HttpServletResponse;)V(SDMControllerServlet.java:128)
    at javax.servlet.http.HttpServlet.service(Ljavax/servlet/http/HttpServletRequest;Ljavax/servlet/http/HttpServletResponse;)V(HttpServlet.java:760)
    at javax.servlet.http.HttpServlet.service(Ljavax/servlet/ServletRequest;Ljavax/servlet/ServletResponse;)V(HttpServlet.java:853)
    at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run()Ljava/lang/Object;(ServletStubImpl.java:971)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(Ljavax/servlet/ServletRequest;Ljavax/servlet/ServletResponse;Lweblogic/servlet/internal/FilterChainImpl;)V(ServletStubImpl.java:402)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(Ljavax/servlet/ServletRequest;Ljavax/servlet/ServletResponse;)V(ServletStubImpl.java:305)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run()Ljava/lang/Object;(WebAppServletContext.java:6354)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(Lweblogic/security/subject/AbstractSubject;Ljava/security/PrivilegedAction;)Ljava/lang/Object;(AuthenticatedSubject.java:317)
    at weblogic.security.service.SecurityManager.runAs(Lweblogic/security/acl/internal/AuthenticatedSubject;Lweblogic/security/acl/internal/AuthenticatedSubject;Ljava/security/PrivilegedAction;)Ljava/lang/Object;(SecurityManager.java:118)
    at weblogic.servlet.internal.WebAppServletContext.invokeServlet(Lweblogic/servlet/internal/ServletRequestImpl;Lweblogic/servlet/internal/ServletResponseImpl;)V(WebAppServletContext.java:3635)
    at weblogic.servlet.internal.ServletRequestImpl.execute(Lweblogic/kernel/ExecuteThread;)V(ServletRequestImpl.java:2585)
    at weblogic.kernel.ExecuteThread.execute(Lweblogic/kernel/ExecuteRequest;)V(ExecuteThread.java:197)
    at weblogic.kernel.ExecuteThread.run()V(ExecuteThread.java:170)
    at java.lang.Thread.startThreadFromVM(Ljava/lang/Thread;)V(Unknown Source)
    I would really appreciate any help.
    Thanks
Saurabh

I would say most likely BufferedInputStream.available() returns an incorrect length. available() does not return the length of the InputStream; see the javadocs for the details.
Additionally, it doesn't make sense at all to wrap the ByteArrayInputStream in a BufferedInputStream: the array is in memory already, so there is no need to buffer read access to it (you are simply adding more overhead).
Remove the BufferedInputStream (passing the bais directly to the setBinaryStream() method) and use byteArray.length to indicate the length of the data.

  • Problem inserting large files into a Blob-Column

    hi all,
i am using an oracle 10g database.
    i defined a table including one blob column as follows:
    create table contact(
    id     number(22) primary key not null,
    lastupdated date not null,
    lastwriter_id number(22) not null,
    contacttype_id number(22) not null,
    notice varchar2(2000),
    attachment blob,
    attachmentname varchar2(255))
    tablespace users
    storage (initial 2M pctincrease 0)
    lob (attachment) store AS (
    tablespace users
    storage (initial 10M)
    enable storage in row
              pctversion 5
              chunk 1
    index lob_attachment_idx (tablespace users storage (initial 1M)));
now i fill this table from a java application using hibernate.
small files (for example 2700 chars) are ok, they pass into the attachment column.
larger files don't go there, and i receive no error message.
what's going wrong?
is my create table statement wrong?
thanks for the help, dieter

    Quick and dirty testcase:
    test@ORA10G>
    test@ORA10G> --
    test@ORA10G> drop table t;
    Table dropped.
    test@ORA10G> create table t (x blob);
    Table created.
    test@ORA10G>
    test@ORA10G> insert into t (x)
      2  select utl_raw.cast_to_raw(rpad('a',1000,'x')) from dual;
    1 row created.
    test@ORA10G>
    test@ORA10G> commit;
    Commit complete.
    test@ORA10G>
    test@ORA10G> --
    test@ORA10G> select dbms_lob.getlength(x) as len, dbms_lob.substr(x,10,1) as chunk from t;
           LEN CHUNK
          1000 61787878787878787878
    test@ORA10G>
test@ORA10G>
pratz

  • Efficient method to insert large number of data into table

    Hi,
    I have a procedure that accepts an input parameter, that contains, a comma seperated values as input.
Something like G12-UHG,THA-90HJ,NS-98039,... There can be more than 90,000 values in that comma-separated input parameter.
What is the most efficient way to do an insert in this case?
    3 methods I have in mind are :
    1) Get individual tokens from CSV and use a plain old loop and do an insert.
2) Use BULK COLLECT & FORALL. However, I don't know how to do this, since the input does not come from a cursor but from a parameter.
    3) Use Table collections. Again this involves plain old looping through the collection. Same as 1st method.
    Please do suggest the most efficient method.
    Thanks

90,000 values?? What's the data type of the input parameter?
You can use the string-to-rows conversion trick if you want and do a single insert:
    SQL> with t as (select 'ABC,DEF GHI,JKL' str from dual)
      2  select regexp_substr(str,'[^,]+', 1, level) list
      3  from t connect by level <= NVL( LENGTH( REGEXP_REPLACE( str, '[^,]+', NULL ) ), 0 ) + 1
      4  /
    LIST
    ABC
    DEF GHI
JKL
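To turn that into the actual insert, something like the following sketch (the target table and parameter name are hypothetical):
insert into target_table (code)
select regexp_substr(p_csv, '[^,]+', 1, level)
from   dual
connect by level <= nvl(length(regexp_replace(p_csv, '[^,]+', null)), 0) + 1;
Inside a procedure, p_csv would be the comma-separated input parameter, and the whole load becomes a single SQL statement instead of 90,000 row-by-row inserts.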

  • Insert Large Binary Values

Hi all,
I want to insert a very large binary value of about 40,000 characters into a database table. I have tried the CLOB datatype and the BLOB datatype, but all my efforts were in vain.
So can anyone help me on this?
    Thanks In advance

    You should use the dbms_lob package.
    RTM first, then ask.
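For what it's worth, a minimal sketch of the dbms_lob approach (the table name and columns are hypothetical): insert an empty LOB locator first, then append the data in chunks.
declare
   l_clob  clob;
   l_chunk varchar2(4000) := rpad('x', 4000, 'x');  -- stand-in for the real data
begin
   insert into big_vals (id, val) values (1, empty_clob())
   returning val into l_clob;
   for i in 1 .. 10 loop                            -- 40,000 characters in total
      dbms_lob.writeappend(l_clob, length(l_chunk), l_chunk);
   end loop;
   commit;
end;
/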

  • Direct insert Vs collection

    Greetings,
In one of my procedures, I used collections with a commit every 1,000 records. It took about 4 hours to process 14,000,000 records. Then, without changing the logic, I modified the same procedure to use direct insert and delete statements. It took about 1.5 hours.
I was wrong in thinking that collections would improve performance because they fetch everything at once. Is this because of the 'commit'? I also experimented with committing every 10,000 records, and even then it didn't improve much.
Could you explain why one is slower and the other faster? Thank you,
    Lakshmi

    The rules of thumb (borrowed from Tom Kyte)
    1) If you can do it in SQL, do it in SQL
    2) If you can't do it in SQL, do it in PL/SQL. "Can't do it in SQL" generally means that you can't figure out how to code the logic in SQL and/or the business logic is so complex that it makes the SQL unreadable.
    2a) If you have to use PL/SQL, try to use collections & bulk processing.
    3) If you can't do it in PL/SQL, do it in a Java stored procedure
    4) If you can't do it in a Java stored procedure, do it in an external procedure
    Collections are never preferred over straight SQL from a performance standpoint, but they may be easier to code for more complex business rules.
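For illustration, a minimal sketch of the bulk-processing pattern from 2a, for the cases where plain SQL won't do (the table names are hypothetical):
declare
   cursor c is select * from source_table;
   type t_rows is table of c%rowtype;
   l_rows t_rows;
begin
   open c;
   loop
      fetch c bulk collect into l_rows limit 1000;  -- fetch in batches
      exit when l_rows.count = 0;
      forall i in 1 .. l_rows.count                 -- one bulk insert per batch
         insert into target_table values l_rows(i);
   end loop;
   close c;
   commit;
end;
/
Even so, a plain "insert into target_table select * from source_table" will beat this whenever the logic allows it, which is rule 1 above.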
    Justin
