Couldn't copy large amounts of data from EnterpriseDB to Oracle 10g

Hi,
I am using iBATIS to copy data from EnterpriseDB (EDB) to Oracle and vice versa.
The datatype of the field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
I am binding these to a Java String property in a POJO.
I can successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (reading large amounts of data from EDB works fine):
--- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
com.ibatis.common.jdbc.exception.NestedSQLException:
--- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
--- The error occurred while applying a parameter map.
--- Check the updateOracleFromEDB-InlineParameterMap.
--- Check the parameter mapping for the 'btlxml' property.
--- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
at com.iba
I have the latest Oracle 10g JDBC drivers.
Remember, I can copy any amount of data from Oracle to EDB, but not the other way around.
Please let me know if you have come across this issue; any recommendation is very much appreciated.
Thanks,
CK.

Hi,
I finally remembered how I solved this issue previously.
The JDBC driver isn't able to bind a large string directly in an INSERT into an XMLTYPE column. The solution I was using was to build a wrapper procedure in PL/SQL.
Here it is (for insert, but I suppose the update will be the same):
create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
begin
     insert into AQAOST_FILES (file_no, program_no, ost_xml, btl_xml)
     values (file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
end insertXML;
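For the update case mentioned above, a similar wrapper should work. The following is only a sketch reusing the table and columns from insertXML, not code from the original reply:
create or replace procedure updateXML(file_no_in in number, ost_XML_in in clob, btl_XML_in in clob) is
begin
     -- rebuild the XMLTYPE columns from the CLOB parameters for an existing row
     update AQAOST_FILES
        set ost_xml = xmltype(ost_XML_in),
            btl_xml = xmltype(btl_XML_in)
      where file_no = file_no_in;
end updateXML;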
Here is the SQL map file I used:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE sqlMap
PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
"http://ibatis.apache.org/dtd/sql-map-2.dtd">
<sqlMap>
     <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
     <insert id="insert" parameterClass="AqAost">
          begin
               insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
          end;
     </insert>
</sqlMap>
And here is a simple test program:
package com.sg2net.jdbc;
import java.io.IOException;
import java.io.Reader;
import java.io.StringWriter;
import java.sql.Connection;
import oracle.jdbc.pool.OracleDataSource;
import com.ibatis.common.resources.Resources;
import com.ibatis.sqlmap.client.SqlMapClient;
import com.ibatis.sqlmap.client.SqlMapClientBuilder;
public class TestInsertXMLType {

     /**
      * @param args
      */
     public static void main(String[] args) throws Exception {
          String resource = "sql-map-config-xmlt.xml";
          Reader reader = Resources.getResourceAsReader(resource);
          SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
          OracleDataSource dataSource = new OracleDataSource();
          dataSource.setUser("test");
          dataSource.setPassword("test");
          dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
          Connection connection = dataSource.getConnection();
          // let iBATIS work on this connection so we control the commit
          sqlMap.setUserConnection(connection);
          AqAost aqAost = new AqAost();
          aqAost.setFileNo(3);
          aqAost.setProgramNo("prg");
          Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
          Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
          aqAost.setOstXML(readerToString(ostXMLReader));
          aqAost.setBltXML(readerToString(bltXMLReader));
          sqlMap.insert("insert", aqAost);
          connection.commit();
          connection.close();
     }

     public static String readerToString(Reader reader) {
          StringWriter writer = new StringWriter();
          char[] buffer = new char[2048];
          int charsRead = 0;
          try {
               while ((charsRead = reader.read(buffer)) > 0) {
                    writer.write(buffer, 0, charsRead);
               }
          } catch (IOException ioe) {
               throw new RuntimeException("error while converting reader to String", ioe);
          }
          return writer.toString();
     }
}
And here is the POJO:
package com.sg2net.jdbc;

public class AqAost {
     private long fileNo;
     private String programNo;
     private String ostXML;
     private String bltXML;

     public long getFileNo() {
          return fileNo;
     }
     public void setFileNo(long fileNo) {
          this.fileNo = fileNo;
     }
     public String getProgramNo() {
          return programNo;
     }
     public void setProgramNo(String programNo) {
          this.programNo = programNo;
     }
     public String getOstXML() {
          return ostXML;
     }
     public void setOstXML(String ostXML) {
          this.ostXML = ostXML;
     }
     public String getBltXML() {
          return bltXML;
     }
     public void setBltXML(String bltXML) {
          this.bltXML = bltXML;
     }
}
I tested the insert and it works correctly
ciao,
Giovanni

Similar Messages

  • Copying large amount of data from one table to another getting slower

    I have a process that copies data from one table (big_tbl) into a very big archive table (vb_archive_tbl - 30 million records - partitioned table). If there are fewer than 1 million records in big_tbl, the copy to vb_archive_tbl is fast (~10 min) and, more importantly, consistent. However, if there are more than 1 million records in big_tbl, copying the data into vb_archive_tbl is very slow (30 min to 4 hours) and very inconsistent. Every few days the time it takes to copy the same amount of data grows significantly.
    Here's an example of the code I'm using, which uses BULK COLLECT and FORALL INSERT to copy the data.
    I occasionally change 'LIMIT 5000' to see performance differences.
    DECLARE
        TYPE t_rec_type IS RECORD (fact_id    NUMBER(12,0),
                                   store_id   VARCHAR2(10),
                                   product_id VARCHAR2(20));
        TYPE CFF_TYPE IS TABLE OF t_rec_type
            INDEX BY BINARY_INTEGER;
        T_CFF CFF_TYPE;
        CURSOR c_cff IS SELECT * FROM big_tbl;
    BEGIN
        OPEN c_cff;
        LOOP
            FETCH c_cff BULK COLLECT INTO T_CFF LIMIT 5000;
            FORALL i IN T_CFF.first..T_CFF.last
                INSERT INTO vb_archive_tbl
                VALUES T_CFF(i);
            COMMIT;
            EXIT WHEN c_cff%NOTFOUND;
        END LOOP;
        CLOSE c_cff;
    END;
    Thank you very much for any advice.
    Edited by: reid on Sep 11, 2008 5:23 PM

    Assuming that there is nothing else in the code that forces you to use PL/SQL for processing, I'll second Tubby's comment that this would be better done in SQL. Depending on the logic and partitioning approach for the archive table, you may be better off doing a direct-path load into a staging table and then doing a partition exchange to load the staging table into the partitioned table. Ideally, you could just move big_tbl into the vb_archive_tbl with a single partition exchange operation.
    That said, if there is a need for PL/SQL, have you traced the session to see what is causing the slowness? Is the query plan different? If the number of rows in the table is really a trigger, I would tend to suspect that the number of rows is causing the optimizer to choose a different plan (with your sample code, the plan is obvious, but perhaps you omitted some where clauses to simplify things down) which may be rather poor.
    Justin
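    If the partitioning lines up, the staging-table approach described above can look roughly like this. This is only a hedged sketch: the empty target partition p_2008_09 and the staging table vb_archive_stage are placeholder names, not objects from the original post.
    -- Build an empty staging table and load it with a direct-path insert
    CREATE TABLE vb_archive_stage AS
    SELECT * FROM big_tbl WHERE 1 = 0;

    INSERT /*+ APPEND */ INTO vb_archive_stage
    SELECT * FROM big_tbl;
    COMMIT;

    -- Swap the loaded segment into the archive partition (a near-instant
    -- dictionary operation, no row-by-row copying)
    ALTER TABLE vb_archive_tbl
      EXCHANGE PARTITION p_2008_09 WITH TABLE vb_archive_stage
      INCLUDING INDEXES WITHOUT VALIDATION;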

  • Copying large amount of data from one table to another

    We have a requirement to copy data from a staging area (consists of several tables) to the actual tables. The copy operation needs to
    1) delete the old values from the actual tables
    2) copy data from staging area to actual tables
    3) commit only after all rows in the staging area are copied successfully; otherwise, it should roll back.
    Is it possible to complete all these steps in one transaction without causing problems with the rollback segments, etc.? What are the things that I need to consider in order to make sure that we will not run into production problems?
    Also, what other best practices/alternative methods are available to accomplish what is described above?
    Thanks,
    Eser

    It's certainly possible to do this in a single transaction. In fact, that would be the best practice.
    Of course, the larger your transactions are, the more rollback you need. You need to allocate sufficient rollback (undo in 9i) to handle the transaction size you're expecting.
    Justin
    Distributed Database Consulting, Inc.
    www.ddbcinc.com
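    A minimal PL/SQL sketch of the single-transaction pattern Justin describes, assuming hypothetical table names (actual_orders, stg_orders); repeat the DELETE/INSERT pair for each table in the staging area:
    BEGIN
       DELETE FROM actual_orders;
       INSERT INTO actual_orders
       SELECT * FROM stg_orders;

       -- ...same pattern for the other staging tables...

       COMMIT;   -- only reached if every step succeeded
    EXCEPTION
       WHEN OTHERS THEN
          ROLLBACK;
          RAISE;
    END;
    /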

  • How can I copy large amount of data between two HD ?

    Hello!
    Which command could I use to copy large amounts of data between two hard disk drives?
    How does Lion identify the disk drives when you want to write a script? For example, in Windows I use
    Robocopy D:\folder source\Files E:\folder destination
    I just want to copy files, and if the files/folders exist in the destination they should be overwritten.
    Help please, I bought my first Mac 4 days ago.
    Thanks!

    Select the files/folders on one HD and drag & drop onto the other HD. The copied ones will overwrite anything with the same names.
    Since you're a newcomer to the Mac, see these:
    Switching from Windows to Mac OS X,
    Basic Tutorials on using a Mac,
    Mac 101: Mac Essentials,
    Mac OS X keyboard shortcuts,
    Anatomy of a Mac,
    MacTips,
    Switching to Mac Superguide, and
    Switching to the Mac: The Missing Manual,
    Snow Leopard Edition.
    Additionally, *Texas Mac Man* recommends:
    Quick Assist,
    Welcome to the Switch To A Mac Guides,
    Take Control E-books, and
    A guide for switching to a Mac.

  • Transporting large amounts of data from one database schema to another

    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Am currently using Data Pump but it's still quite slow - having to do it in chunks.
    Also the Data Pump files are quite large, so we have to compress them and move them across the network.
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than Data Pump.
    The tablespace names are different in the two databases.
    Also, the source database is on the Solaris operating system on a Sun box and the
    target database is on AIX on an IBM Power Series box.
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31

    user5716448 wrote:
    Hi,
    We need to move a large amount of data from one 10.2.0.4 database schema to another 11.2.0.3 database.
    Pl quantify "large".
    Am currently using Data Pump but it's still quite slow - having to do it in chunks.
    Pl quantify "quite slow".
    Also the Data Pump files are quite large, so we have to compress them and move them across the network.
    Again, pl quantify "quite large".
    Is there a better/quicker way?
    Have heard about transportable tablespaces but never used them and don't know about speed - whether quicker than Data Pump.
    The tablespace names are different in the two databases.
    Also, the source database is on the Solaris operating system on a Sun box and the
    target database is on AIX on an IBM Power Series box.
    It may be possible, assuming you do not violate any of these conditions
    http://docs.oracle.com/cd/E11882_01/server.112/e25494/tspaces013.htm#ADMIN11396
    Any ideas would be great.
    Thanks
    Edited by: user5716448 on 08-Sep-2012 03:30
    Edited by: user5716448 on 08-Sep-2012 03:31
    Master Note for Transportable Tablespaces (TTS) -- Common Questions and Issues [ID 1166564.1]
    HTH
    Srini
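    A hedged sketch of the transportable tablespace steps, with placeholder names (app_data, dp_dir, file path); not from this thread. Note that the tablespace is plugged in under its original name, and if the source and target platforms use different endian formats the datafiles must first be converted with RMAN CONVERT:
    -- On the source (10.2.0.4) database:
    ALTER TABLESPACE app_data READ ONLY;
    -- From the source host shell:
    --   expdp system DIRECTORY=dp_dir DUMPFILE=tts_app_data.dmp TRANSPORT_TABLESPACES=app_data TRANSPORT_FULL_CHECK=Y
    -- Copy tts_app_data.dmp and the APP_DATA datafiles to the target host, then from the target host shell:
    --   impdp system DIRECTORY=dp_dir DUMPFILE=tts_app_data.dmp TRANSPORT_DATAFILES='/u01/oradata/app_data01.dbf'
    -- On the target (11.2.0.3) database:
    ALTER TABLESPACE app_data READ WRITE;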

  • Power BI performance issue when load large amount of data from database

    I need to load a data set from my database which has a large amount of data; it takes a long time to initialize the data before I can build a report. Is there any good way to process large amounts of data for Power BI? As far as I know, many people analyze data based on Power BI - is there any suggestion for loading large amounts of data from a database?
    Thanks a lot for the help

    Hi Ruixue,
    We have made significant performance improvements to Data Load in the February update for the Power BI Designer:
    http://blogs.msdn.com/b/powerbi/archive/2015/02/19/6-new-updates-for-the-power-bi-preview-february-2015.aspx
    Would you be able to try again and let us know if it's still slow? With the latest improvements, it should take between half and one third of the time that it used to.
    Thanks,
    M.

  • Why does copying large amount of data block internet?

    I have a WRT600N and copied about 20GB of data from a 100Mb connection, and it took over the entire router. All computers plugged into the router lost internet access, and I was unable to log into the router admin during the transfer. Why would this happen? After the transfer everything went back to normal.

    It was not just my computer... I tried all the computers on my network; none could reach the internet. Even the new software that Linksys has to display your network shows all lines red.
    And the computer that was sending the data was 100Mb, while the others are all gigabit. Strange... switches should localize the traffic to the 2 computers talking and not affect the others.

  • Faster way to migrate data from SQL Server to Oracle 10g

    We have to migrate data from SQL Server to Oracle 10g.
    One particular table on SQL Server has around 1.25 million records.
    We tried moving the data using a DTS package, but it looks like it will take hours at the current speed of 300 records/minute.
    This table has a TEXT column, which has XML strings stored in it. I am not sure if this is the reason for the slow migration.
    Would you please suggest better options to migrate it faster?
    Thanks in advance!!!

    Have you tried the Migration Workbench?

  • Transfer Large Amounts of Data From External to External Drives

    I have 4 x 1.5TB USB 2.0 drives that contain music & movies for use in iTunes or Apple TV. I purchased two 3TB Thunderbolt drives and would like to transfer the data from the 4 x 1.5TB drives to the new 3TB Thunderbolt drives. How do I do this?
    I have tried copy/paste and drag/drop, but only so much moves and then it stops.
    All formatting is Mac OS Extended (Journaled).

    First I just want to say thank you for the suggestion.
    There were no errors shown; it just stopped after several hours, and sleep is turned off. What I started doing is taking, let's say, approximately 6GB at a time, doing this 4 or 5 times, and after about 20 to 30 minutes it is done and I do it all over again.
    I will run Disk Utility like you suggested and try that again, but give or take there are probably 2500 files that are 2+GB in size, and then all of the music files we have collected come to around 90GB total, if not slightly higher.

  • Migration data from SAP DB2 to Oracle 10g

    Hi, I am assigned to migrate data from an SAP system which is using DB2 to Oracle 10g. I am not very familiar with SAP. I hope someone can help me, especially those familiar with SAP.
    Please help me.
    thanks
    jebatco

    Hello,
    just migrating a DB2 database to Oracle 10g might be an easy task. The Oracle Migration Workbench is the tool for such a migration:
    http://www.oracle.com/technology/tech/migration/workbench/index.html
    I have no idea about SAP, and that might complicate the picture. But there exist specialists for this task:
    Oracle Expertise in the SAP environment
    The Solution Center SAP Support and Service – located in Walldorf – offers SAP
    customers the following services:
    • Advanced Customer Services (ACS)
    • Performance Analysis and Tuning
    • Development of concepts for Backup/Restore/Recovery, and High Availability,
    Administration
    • Security concepts
    • Optimizing of ABAP/4 programs (performance improvement)
    • Migration service for customers, who want to use Oracle as the database for SAP
    applications (from Informix, MaxDB, DB2, or SQL Server to Oracle).
    • Migration services from “Oracle to Oracle” (e.g. Tru64 to HP_UX)
    • Integration products and services
    • Oracle Database: The Database of Choice for Deploying SAP Solutions
    This is taken from http://www.oracle.com/newsletters/sap/docs/ora4sap-db-of-choice.090213.pdf
    Best regards
    Wolfgang

  • Load Data from SQL Server to Oracle 10g using Sql*loader utility

    I am trying to load data from SQL Server 2005 to Oracle 10g.
    What is the best way to do it?
    Can the SQL*Loader utility do it?
    What is the difference between using the SQL*Loader utility and the migration tool from SQL Developer?
    Thanks
    Edited by: user11313758 on Sep 30, 2009 4:30 PM

    Hello:
    You could consider using Oracle Heterogeneous Services to do this. If your Oracle database is on a Windows platform the link below shows you how to make a connection to SqlServer from an Oracle database.
    http://www.databasejournal.com/features/oracle/article.php/3442661/Making-a-Connection-from-Oracle-to-SQL-Server.htm
    Varad
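    If the Heterogeneous Services / gateway setup in the linked article is in place, the load itself can be a plain INSERT ... SELECT over a database link rather than SQL*Loader. A hedged sketch with placeholder names (MSSQL is the gateway TNS alias; customers_ora and Customers are made-up tables):
    CREATE DATABASE LINK mssql_link
       CONNECT TO "sqlserver_user" IDENTIFIED BY "password"
       USING 'MSSQL';

    -- Direct-path copy of the remote SQL Server table into Oracle
    INSERT /*+ APPEND */ INTO customers_ora
    SELECT * FROM "Customers"@mssql_link;
    COMMIT;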

  • Delete large amounts of data from a table

    I have a table with about 10 fields to store info for customers. Over time, as we have added more customers, that table has grown to about 14 million rows. As the data comes in, a service constantly inserts a row into the table. 90% of the data is not relevant,
    i.e. I don't want data that is more than 3 months old, but the most recent data is used to generate tracking reports. My goal is to write SQL to purge the data that is older than a month.
    Here is my problem: I can NOT use TRUNCATE TABLE, as I would lose everything. Yesterday I wrote a DELETE statement with a WHERE clause. When I ran it on a test system it locked up my table and the simulation GPS inserts were intermittently failing. Also,
    my transaction log grew to over 6GB as it attempted to log each delete.
    My first thought was to delete the data a little at a time, starting with the oldest first, but I was wondering if there was a better way. I am expecting solutions apart from these:
    1. Create a temp DB, copy the required data into the temp DB, and truncate the original.
    2. Deleting in chunks (i.e., 1000 - 10000 records at a time).
    3. Set the recovery mode to simple.

    I agree with Satish (+1).
    This is the right way to do it. Your database architect should think about this, and you can use partitioning by the condition for deleting (for example, years if you are deleting old-year data).
    Check this link for more details (but Satish already mentioned most of what you need): http://www.sqlservercentral.com/scripts/Truncate+Table/69506/
    In the link there is an SP named TRUNCATE_PARTITION which you can use.
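    A hedged T-SQL sketch of the partition-switch purge that the linked script automates. It assumes the table has been rebuilt on a monthly partition scheme and that partition 1 holds the oldest month; all object names are placeholders:
    -- Staging table: must match the source table's columns, clustered index,
    -- and the filegroup of the partition being switched out
    -- CREATE TABLE dbo.TrackingData_purge ( ...same definition... );

    -- Metadata-only move of the oldest month out of the live table, then a
    -- cheap truncate instead of logging millions of single-row deletes
    ALTER TABLE dbo.TrackingData SWITCH PARTITION 1 TO dbo.TrackingData_purge;
    TRUNCATE TABLE dbo.TrackingData_purge;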

  • What is the best way to migrate large amounts of data from a g3 to an intel mac?

    I want to help my mom transfer her photos and other info from an older blueberry G3 iMac to a new Intel one. There appears to be no migration provision on the older Mac. Also, the FireWire connections are different. Somebody must have done this before.

    Hello,
    the cable above can be used to enable Target Disk Mode for data transfer.
    http://support.apple.com/kb/ht1661 for more info
    To enable Target Disk Mode, just after the startup sound hold down the "T" key on the keyboard until you see the FireWire symbol on the screen (like a screen saver), then plug the FireWire cable in between the 2 Macs.
    HTH
    Pierre

  • Osx server crashes when copying large amount of data

    OK. I have set up a Mac OS X Server on a G4 Dual 867, set to standalone server. The only services running are VPN, AFP, and DNS (I am pretty sure the DNS is set up correctly). I have about 3 FireWire drives and 2 USB 2.0 drives hooked up to it.
    When I try to copy roughly 230GB from one drive to another, it either just stops in the middle or CRASHES the server! I can't see anything out of the ordinary in the logs, though I am a newbie.
    I am stumped. Could this be hardware related? I just did a complete fresh install of OS X Server!

    This could be most anything: a disk error, a non-compliant device, a FireWire error (I've had FireWire drivers tip over Mac OS X with a kernel panic; if the cable falls out at an inopportune moment when recording in GarageBand, toes up it all goes), or a memory error. It could also be a software error, or a FireWire device (or devices) that's simply drawing too much power.
    Try different combinations of drives, and replace one or more of these drives with another; start a sequence of elimination targeting the drives.
    Here's what Apple lists about kernel panics as an intro; it's details from the panic log that'll most probably be interesting...
    http://docs.info.apple.com/article.html?artnum=106228
    With some idea of which code is failing, it might be feasible to find a related discussion.
    A recent study out of CERN found three hard disk errors per terabyte of storage, so a clean install is becoming more a game of moving the errors around than actually fixing anything. FWIW.

  • NMH305 dies when copying large amounts of data to it

    I have an NMH305 still set up with the single original 500GB drive.
    I have an old 10/100 3COM rackmount switch (the old white one) uplinked to my Netgear WGR614v7 wireless router. I had the NAS plugged into the 3COM switch and everything worked flawlessly. The only problem was it was only running at 100Mb.
    I recently purchased a TRENDnet TEG-S80g 10/100/1000 'green' switch. I basically replaced the 3COM with this switch. To test the 1Gb speeds, I tried a simple drag & drop of about 4GB worth of pics to the NAS on a mapped drive. After about 2-3 seconds, the NAS dropped and Explorer said it was no longer accessible. I could ping it, but the Flash UI was stalled.
    If I waited several minutes, I could access it again. I logged into the Flash UI and upgraded to the latest firmware, but had the same problem.
    I plugged the NAS directly into the Netgear router and transferred files across the wireless without issue. I plugged it back into the green switch and it dropped after about 6-10 pics transferred.
    I totally bypassed the switch and plugged it directly into my computer. I verified I can ping & log in to the Flash UI, then tried to copy files and it died again.
    It seems to only happen when running at 1Gb link speeds. The max transfer I was able to get was about 10mbps, but I'm assuming that's limited by the drive write speeds & controllers.
    Anyone ran into this before?
    TIA!

    Hi cougar694u,
    You may check this review ("click here"). It is a thorough review of the Media Hub's write and read throughput vs. file size over a 1000 Mbps LAN.
    Cheers
