LKM step "Load data" very slow with huge data

Hi all,
When I try to load 4 million rows from a flat file into my Oracle database, it takes around 2.5 hours to finish step 3, "Load data", which I understand is where ODI loads the data from the flat file into the C$ table.
Is there any way to speed it up?
I just chose the simple LKM File to SQL option.
Thanks
I Z
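LKM File to SQL pushes the file through the ODI agent as JDBC inserts into the C$ table, which is usually the bottleneck at millions of rows. When the target is Oracle, a file-specific LKM such as LKM File to Oracle (SQLLDR) or LKM File to Oracle (EXTERNAL TABLE) lets the database read the file directly. As a hedged sketch (not ODI-generated code; the directory, file name and column list are illustrative, borrowed from the example further down), the external-table route boils down to:

create or replace directory SRC_DIR as '/data/src_files';

create table ESBUSER.C$_0TEST_EXT (
  ROW_CODE VARCHAR2(1),
  B        VARCHAR2(20),
  C        VARCHAR2(20),
  D        VARCHAR2(40)
)
organization external (
  type ORACLE_LOADER
  default directory SRC_DIR
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('test_file.txt')
)
reject limit unlimited;

-- one set-based, direct-path insert into the staging table instead of
-- millions of single-row JDBC inserts
insert /*+ APPEND */ into ESBUSER.TC$_0TEST
select ROW_CODE, B, C, D from ESBUSER.C$_0TEST_EXT;
commit;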

This is only the step 3 "create temp table"; it is not the SQL load itself yet.
create table <%=odiRef.getObjectName("T"+odiRef.getTableName("COLL_SHORT_NAME"),"W")%>
(
<%=odiRef.getSrcColList("","[COL_NAME] [DEST_CRE_DT]","[COL_NAME] [DEST_CRE_DT]",",\n","")%>
)
<%=odiRef.getUserExit("WORK_TABLE_OPTIONS")%>
This auto-generates SQL like:
create table ESBUSER.TC$_0TEST
(
ROW CODE VARCHAR2(1), --> this column name is auto-generated by the code above; I have no way to change ROW CODE to ROW_CODE
B VARCHAR2(20),
C VARCHAR2(20),
D VARCHAR2(40)
)
I don't know why it generates ROW CODE without the "_".
Please help.
Thanks
I Z
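On the ROW CODE question (my reading, not an answer from the original thread): the [COL_NAME] pattern in getSrcColList substitutes the column name exactly as it is defined on the source file datastore, so a column defined as "ROW CODE" in the model comes through with the space. Renaming the column to ROW_CODE in the file datastore is the usual fix; the standard Oracle LKMs also use a [CX_COL_NAME] pattern that substitutes a cleaned-up work-table column name, though whether it is available depends on the ODI version. Either way, the generated statement only becomes valid Oracle DDL in one of these two forms:

-- workaround if the source column keeps its space: a quoted identifier
create table ESBUSER.TC$_0TEST
(
"ROW CODE" VARCHAR2(1),
B VARCHAR2(20),
C VARCHAR2(20),
D VARCHAR2(40)
);

-- cleaner: rename the column to ROW_CODE in the file datastore so the
-- same step generates ROW_CODE VARCHAR2(1) with no quoting needed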

Similar Messages

  • SQL loader load data very slow...

    Hi,
    On my production server I have an insert issue. SQL*Loader regularly loads files, and the inserts into the database are taking more and more time.
    For the first 2 to 3 hours each file takes 8 to 10 seconds; after that each one takes 5 minutes.
    My understanding is that OS I/O is very slow: for the first 3 hours the DB buffer cache has free buffers and inserting into the buffer is normal,
    but once the buffer cache fills up the load runs into buffer waits and the inserts slow down. If that is right, please tell me how to improve the I/O (see the sketch at the end of this thread).
    Some analysis from my server is shared here:
    [root@myserver ~]# iostat
    Linux 2.6.18-194.el5 (myserver) 06/01/2012
    avg-cpu: %user %nice %system %iowait %steal %idle
    3.34 0.00 0.83 6.66 0.00 89.17
    Device: tps Blk_read/s Blk_wrtn/s Blk_read Blk_wrtn
    sda 107.56 2544.64 3140.34 8084953177 9977627424
    sda1 0.00 0.65 0.00 2074066 16
    sda2 21.57 220.59 1833.98 700856482 5827014296
    sda3 0.00 0.00 0.00 12787 5960
    sda4 0.00 0.00 0.00 8 0
    sda5 0.69 2.75 15.07 8739194 47874000
    sda6 0.05 0.00 0.55 5322 1736264
    sda7 0.00 0.00 0.00 2915 16
    sda8 0.50 9.03 5.24 28695700 16642584
    sda9 0.51 0.36 24.81 1128290 78829224
    sda10 0.52 0.00 5.98 9965 19004088
    sda11 83.71 2311.26 1254.71 7343426336 3986520976
    [root@myserver ~]# hdparm -tT /dev/sda11
    /dev/sda11:
    Timing cached reads: 10708 MB in 2.00 seconds = 5359.23 MB/sec
    Timing buffered disk reads: 540 MB in 3.00 seconds = 179.89 MB/sec
    [root@myserver ~]# sar -u -o datafile 1 6
    Linux 2.6.18-194.el5 (mca-webreporting2) 06/01/2012
    09:57:19 AM CPU %user %nice %system %iowait %steal %idle
    09:57:20 AM all 6.97 0.00 1.87 16.31 0.00 74.84
    09:57:21 AM all 6.74 0.00 1.25 17.48 0.00 74.53
    09:57:22 AM all 7.01 0.00 1.75 16.27 0.00 74.97
    09:57:23 AM all 6.75 0.00 1.12 13.88 0.00 78.25
    09:57:24 AM all 6.98 0.00 1.37 16.83 0.00 74.81
    09:57:25 AM all 6.49 0.00 1.25 14.61 0.00 77.65
    Average: all 6.82 0.00 1.44 15.90 0.00 75.84
    [root@myserver ~]# sar -u -o datafile 1 6
    Linux 2.6.18-194.el5 (mca-webreporting2) 06/01/2012
    09:57:19 AM CPU %user %nice %system %iowait %steal %idle
    mca-webreporting2;601;2012-05-27 16:30:01 UTC;2.54;1510.94;3581.85;0.00
    mca-webreporting2;600;2012-05-27 16:40:01 UTC;2.45;1442.78;3883.47;0.04
    mca-webreporting2;599;2012-05-27 16:50:01 UTC;2.44;1466.72;3893.10;0.04
    mca-webreporting2;600;2012-05-27 17:00:01 UTC;2.30;1394.43;3546.26;0.00
    mca-webreporting2;600;2012-05-27 17:10:01 UTC;3.15;1529.72;3978.27;0.04
    mca-webreporting2;601;2012-05-27 17:20:01 UTC;9.83;1268.76;3823.63;0.04
    mca-webreporting2;600;2012-05-27 17:30:01 UTC;32.71;1277.93;3495.32;0.00
    mca-webreporting2;600;2012-05-27 17:40:01 UTC;1.96;1213.10;3845.75;0.04
    mca-webreporting2;600;2012-05-27 17:50:01 UTC;1.89;1247.98;3834.94;0.04
    mca-webreporting2;600;2012-05-27 18:00:01 UTC;2.24;1184.72;3486.10;0.00
    mca-webreporting2;600;2012-05-27 18:10:01 UTC;18.68;1320.73;4088.14;0.18
    mca-webreporting2;600;2012-05-27 18:20:01 UTC;1.82;1137.28;3784.99;0.04
    [root@myserver ~]# vmstat
    procs -----------memory---------- ---swap-- -----io---- --system-- -----cpu------
    r b swpd free buff cache si so bi bo in cs us sy id wa st
    0 1 182356 499444 135348 13801492 0 0 3488 247 0 0 5 2 89 4 0
    [root@myserver ~]# dstat -D sda
    ----total-cpu-usage---- --dsk/sda-- -net/total- ---paging-- ---system--
    usr sys idl wai hiq siq| read writ| recv send| in out | int csw
    3 1 89 7 0 0|1240k 1544k| 0 0 | 1.9B 1B|2905 6646
    8 1 77 14 0 1|4096B 3616k| 433k 2828B| 0 0 |3347 16k
    10 2 77 12 0 0| 0 1520k| 466k 1332B| 0 0 |3064 15k
    8 2 77 12 0 0| 0 2060k| 395k 1458B| 0 0 |3093 14k
    8 1 78 12 0 0| 0 1688k| 428k 1460B| 0 0 |3260 15k
    8 1 78 12 0 0| 0 1712k| 461k 1822B| 0 0 |3390 15k
    7 1 78 13 0 0|4096B 6372k| 449k 1950B| 0 0 |3322 15k
    AWR sheet output
    Wait Events
    ordered by wait time desc, waits desc (idle events last)
    Event Waits %Time-outs Total Wait Time (s) Avg wait (ms) Waits/txn
    free buffer waits 1,591,125 99.95 19,814 12 129.53
    log file parallel write 31,668 0.00 1,413 45 2.58
    buffer busy waits 846 77.07 653 772 0.07
    control file parallel write 10,166 0.00 636 63 0.83
    log file sync 11,301 0.00 565 50 0.92
    write complete waits 218 94.95 208 955 0.02
    SQL> select 'free in buffer (NOT_DIRTY)', round(((select count(DIRTY) N_D from v$bh where DIRTY='N')*100)/(select count(*) from v$bh),2)||'%' DIRTY_PERCENT from dual
      2  union
      3  select 'keep in buffer (YES_DIRTY)', round(((select count(DIRTY) N_D from v$bh where DIRTY='Y')*100)/(select count(*) from v$bh),2)||'%' DIRTY_PERCENT from dual;
    'FREEINBUFFER(NOT_DIRTY)'  DIRTY_PERCENT
    free in buffer (NOT_DIRTY) 10.71%
    keep in buffer (YES_DIRTY) 89.29%
    Rag....

    1) Yes, this is a partitioned table with a local partitioned index on it.
    SQL> desc GR_CORE_LOGGING
    Name Null? Type
    APPLICATIONID VARCHAR2(20)
    SERVICEID VARCHAR2(25)
    ENTERPRISENAME VARCHAR2(25)
    MSISDN VARCHAR2(15)
    STATE VARCHAR2(15)
    FROMTIME VARCHAR2(25)
    TOTIME VARCHAR2(25)
    CAMP_ID VARCHAR2(50)
    TRANSID VARCHAR2(25)
    MSI_INDEX NUMBER
    SQL> select index_name,column_name from user_ind_columns where table_name='GR_CORE_LOGGING';
    INDEX_NAME
    COLUMN_NAME
    GR_CORE_LOGGING_IND
    MSISDN
    2) I tried a direct path load, but after that I dropped this table, created a new partitioned table again, and built a fresh index. Still the same issue.
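    When "free buffer waits" dominates an AWR report like this, the usual follow-ups before touching the storage are to check whether DBWR can flush dirty buffers fast enough and whether the buffer cache is sized sensibly. A minimal SQL*Plus sketch with purely illustrative values (not taken from this system):

    SHOW PARAMETER db_writer_processes
    SHOW PARAMETER db_cache_size
    -- how bad are the write-related waits right now?
    select event, total_waits, time_waited
    from v$system_event
    where event in ('free buffer waits', 'write complete waits', 'log file parallel write');
    -- more DBWn processes only help if the I/O subsystem has headroom;
    -- takes effect after an instance restart
    ALTER SYSTEM SET db_writer_processes = 4 SCOPE = SPFILE;

    A direct path load (sqlldr DIRECT=TRUE) bypasses the buffer cache entirely, which is often the bigger win for bulk loads, although a direct load was apparently already tried here.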

  • 4G LTE data very slow or nonexistent in San Diego inland North County area?

    Over the past few weeks I've noticed my phone being a bit slower than usual with data speeds in terms of loading a web page, various apps, etc. The phone is an HTC Thunderbolt.
    However this week since 1/6/2014 and usually in the morning when I'm at work (Poway, CA 92064) my data is VERY slow and goes all over the place from 4G LTE to 3G to 1X and even spends periods of time with no data connection whatsoever even though I have several bars of service.  Was just out to lunch in San Diego/Scripps Ranch 92131 and still had 1X or no data service at all.  This is becoming downright unacceptable as I am paying for this service and it's nonexistent for large periods of time.
    I'm not sure this is limited to a local problem though. My partner is in the Bay Area hundreds of miles away from here and he said his phone (HTC Rezound) the other day also spent a lot of time on 1X and wore his battery down unusually early searching for a data signal.
    Verizon...what the heck?  What's up?

    It may be that they're working on the system in your area, or there may be special events in your area that can tax the system somewhat and cause what you're seeing. You might want to give Customer Service a call and ask for a tech rep, to see if they're having any issues in your area, and while you're on the phone with them, have them check your PRL code; it may need updating and the rep may have to push it to your phone. b33

  • Load Huge data into oracle table

    Hi,
    I am using Oracle 11g Express Edition. I have a .csv file of about 500 MB that needs to be uploaded into an Oracle table.
    Please suggest which would be the best method to upload the data into the table. The data is employee ticket history, which is huge.
    How do I do a mass upload of data into an Oracle table? I need experts' suggestions on this requirement.
    Thanks
    Sudhir

    One method is to use SQL*Loader (sqlldr).
    Another method is to define an external table in Oracle, which allows you to view your big file as a table in the database (see the sketch after this post).
    You may want to have a look at these guides: Choosing the Right Export/Import Utility and Managing External Tables.
    Regards.
    Al
    Edited by: Alberto Faenza on Nov 6, 2012 10:24 AM
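    For a one-off 500 MB CSV, the external table route needs only a directory object and a table definition. The sketch below is illustrative: the directory path, file name, column list, date format, and the target table TICKET_HISTORY are all made-up placeholders, since the actual ticket-history layout is not given in the thread.

    CREATE OR REPLACE DIRECTORY CSV_DIR AS '/u01/app/loads';

    CREATE TABLE ticket_history_ext (
      ticket_id   NUMBER,
      employee_id NUMBER,
      opened_on   VARCHAR2(20),
      status      VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY CSV_DIR
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1                           -- skip a header row, if the CSV has one
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('ticket_history.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- load in one set-based statement; the APPEND hint requests a direct-path insert
    INSERT /*+ APPEND */ INTO ticket_history
    SELECT ticket_id, employee_id, TO_DATE(opened_on, 'YYYY-MM-DD'), status
      FROM ticket_history_ext;
    COMMIT;

    SQL*Loader with DIRECT=TRUE is the equivalent command-line route if creating a directory object on the server is not an option.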

  • Stepping through table is very slow

    LV 7.0 on Win 2K
    I use a table to display records of information. The table is filled once on entry to a user-interface subprogram, and the user has the possibility to select certain records by clicking or by stepping through with the cursor buttons.
    I have also written subprograms to sort the table by the various columns. The active row is used to provide more information, e.g. by displaying it in an image.
    Usually you would click on a line with the mouse, but sometimes it is more convenient to step through subsequent lines with the cursor buttons.
    As it turned out, the latter is sometimes very slow.
    I tried to trace that behaviour in my program, but it seems that the problem is within LabVIEW. I reduced the VI to a simple loop where nothing is done with the table except displaying its value. If you click on a line, the value follows instantaneously. If you click on a line at the top of the table and then step down with the cursor buttons, the value also follows very quickly. But when you do the same while at the bottom, it takes several seconds.
    I read several entries in the forum that describe similar problems when updating the table. Please note that this is a very static table here; the only thing happening is a user interacting with it via the keyboard.
    The answers I have seen so far, suggesting to keep the table small or to switch to other indicators etc., seem to overlook the reason for using a table in the first place: to have a compact display of a lot of ordered information. If the cause really is within the LabVIEW runtime, then NI has a nice task to improve LV next time :-(
    Gabi
    7.1 -- 2013
    CLA
    Attachments:
    Tableaccess.zip ‏65 KB

    Unfortunately, I can confirm that LabVIEW 7.1 shows the same slow behavior. Actually, pressing "down arrow" on my rig is so slow that it seems to lock up the PC (W2K).
    Interestingly, if you capture the "up" and "down" arrows with a filter event, there is no slowdown! Maybe you can use the attached rough draft (LabVIEW 7.0) to improve the UI experience until NI fixes the issue.
    Enjoy!
    LabVIEW Champion . Do more with less code and in less time .
    Attachments:
    Tableaccess.vi ‏266 KB

  • Loading Page is very slow. Thread dump show "BuildCookie"

    Hello, Folks,
    I am developing a portal project which has some pages and books. I find that my page loads very slowly, so I took three thread dumps; the following is the detail of one of them. Could someone give me some hints to help me figure out what needs to be corrected in my code or design?
    "ExecuteThread: '14' for queue: 'default'" daemon prio=5 tid=0x03bf5190 nid=0xe2
    c runnable [44ce000..44cfd90]
    at java.io.WinNTFileSystem.list(Native Method)
    at java.io.File.list(File.java:915)
    at java.io.File.listFiles(File.java:993)
    at com.bea.wlw.runtime.core.dispatcher.AppManager.isExternalBuildCookieV
    alid(AppManager.java:1080)
    at com.bea.wlw.runtime.core.dispatcher.AppManager.validateExternalBuildC
    ookie(AppManager.java:1137)
    at com.bea.wlw.runtime.core.dispatcher.AppManager.ensureAppDeployment(Ap
    pManager.java:655)
    at com.bea.wlw.runtime.core.dispatcher.AppManager.ensureAppDeployment(Ap
    pManager.java:575)
    at jsp_servlet._framework._skeletons._default.__book._jspService(book.js
    p:12)
    at weblogic.servlet.jsp.JspBase.service(JspBase.java:33)
    at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run
    (ServletStubImpl.java:1072)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubIm
    pl.java:465)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubIm
    pl.java:348)
    at weblogic.servlet.internal.RequestDispatcherImpl.include(RequestDispat
    cherImpl.java:638)
    at weblogic.servlet.internal.RequestDispatcherImpl.include(RequestDispat
    cherImpl.java:423)
    at com.bea.netuix.servlets.controls.JspRenderer.renderAlt(JspRenderer.ja
    va:194)
    at com.bea.netuix.servlets.controls.JspRenderer.endRender(JspRenderer.ja
    va:142)
    at com.bea.netuix.nf.ControlLifecycle$1.postVisit(ControlLifecycle.java:
    538)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:564)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walkRecursiveRender(ControlTreeWa
    lker.java:553)
    at com.bea.netuix.nf.ControlTreeWalker.walk(ControlTreeWalker.java:247)
    at com.bea.netuix.nf.Lifecycle.runOutbound(Lifecycle.java:204)
    at com.bea.netuix.nf.Lifecycle.run(Lifecycle.java:146)
    at com.bea.netuix.servlets.manager.UIServlet.runLifecycle(UIServlet.java
    :333)
    at com.bea.netuix.servlets.manager.UIServlet.doPost(UIServlet.java:196)
    at com.bea.netuix.servlets.manager.PortalServlet.doPost(PortalServlet.ja
    va:772)
    at com.bea.netuix.servlets.manager.PortalServlet.doGet(PortalServlet.jav
    a:671)
    at com.bea.netuix.servlets.manager.UIServlet.service(UIServlet.java:147)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run
    (ServletStubImpl.java:1072)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubIm
    pl.java:465)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:28)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.ja
    va:27)
    at com.bea.p13n.servlets.PortalServletFilter.doFilter(PortalServletFilte
    r.java:293)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.ja
    va:27)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationActio
    n.run(WebAppServletContext.java:6987)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(Authenticate
    dSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:
    121)
    at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppSe
    rvletContext.java:3892)
    at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestIm
    pl.java:2766)
    at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)

    The page was slow because of the VO query. I have tuned the query, and now it is faster.

  • Yahoo Finance Portfolios - add/remove page and update data very slow to load

    The Add/Remove symbols page only partially loads then stops dead. Can take a minute or more to complete loading. And when I try to update quantities or price I get a whirling color gif which only sometimes reaches the point of letting me load data. Firefox, no problem.
    Thanks!
    Fresnel1
    New Jersey

  • Mobile LTE data very slow - 80026

    I have been experiencing a massive drop off in data of up and down data speeds in Lafayette, Colorado (80026).
    Speedtest is giving me anywhere from 47 ms to 457 ms pings and up/down data speeds as low as .54/.28 respectively. This is becoming more and more frequent, with the tests trending slower and slower.
    I've reset the device and network settings on the phone (iPhone 5s) to no avail. It shows 2 or 3 bars and "LTE" when these tests are run.
    Not quite sure what the next steps are for troubleshooting or reporting this as the Support sections oddly make no mention of connectivity or data speed issues as topics. Or if they do, they are buried deep or in a category I am not identifying. Something that would presumably be near the top of users' support issues, I would have thought would warrant higher level visibility. Perhaps room for improvement for customer support there?
    Anyway, any thoughts? Is this a temporary issue? Where should I be reporting this? What remedies, if any, should I be seeking? Are there prescribed troubleshooting steps?
    Also, the "tags" field in this forum software doesn't seem to support ZIP codes as a standalone tag, as it won't let me enter more than two digits (Mac 10.9.5, Chrome 39.0.2171.99).

    Hi Sarah, thank you too... I'll likewise try to answer as best I can, as well as add a bit more data:
    I haven't been able to identify a pattern as to when the slowdowns happen except that it is happening far more often than it used to. I'll try and log it.
    I am using the Ookla Speedtest tool (as it is installed and convenient on the device) to ping a variety of networks and test up/down speeds with sample packets. If I am feeling very fancy or annoyed I sometimes even tether my computer and run more advanced network tests from the terminal so I can see if there is a routing issue. But as of yet, the slowdowns appear to be entirely in Verizon IPs.
    The bit more data: my wife and neighbors who are on Verizon are also telling me that they are seeing significant data slowdowns regularly from Lafayette (80026) as well as in south Longmont (80501). None of them have reported it to Verizon as they "don't know how", not that I blame them as "data speeds" or something of that ilk is not a service category under the predefined "Topics" for some reason... and my wife leaves these sort of things up to me anyway.

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY and the same record occurred in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It is very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • For a week or two Firefox has been loading web pages very slowly, or it does load them but it takes a good 3 seconds before I can scroll or enter anything.

    This morning it also froze up for a good 10 seconds and then loaded the web pages again. It happens especially when I am browsing a forum and return to the main page or something; it just takes ages before I can select anything. Pages also load somewhat slower, but the bigger problem is that I cannot select anything or scroll until it stops thinking.

    A possible cause is a problem with the file places.sqlite that stores the bookmarks and the history.
    *http://kb.mozillazine.org/Bookmarks_history_and_toolbar_buttons_not_working_-_Firefox
    *https://support.mozilla.org/kb/Bookmarks+not+saved#w_fix-the-bookmarks-file
    You can also try to repair the places database with this extension:
    *https://addons.mozilla.org/firefox/addon/places-maintenance/
    A possible cause is security software (firewall) that blocks or restricts Firefox or the plugin-container process without informing you, possibly after detecting changes (update) to the Firefox program.
    Remove all rules for Firefox and the plugin-container from the permissions list in the firewall and let your firewall ask again for permission to get full unrestricted access to internet for Firefox and the plugin-container process and the updater process.
    See:
    *https://support.mozilla.org/kb/Server+not+found
    *https://support.mozilla.org/kb/Firewalls
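    If the places.sqlite angle is suspected, a rough idea of what that kind of repair amounts to (a hedged sketch, assuming Firefox is closed and the sqlite3 command-line tool is run against places.sqlite in the profile folder):

    PRAGMA integrity_check;   -- reports corruption in the bookmarks/history database
    VACUUM;                   -- rebuilds and compacts the file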

  • Photoshop loading photo files very slow (not printer)

    For the past several weeks Photoshop CS3 has been loading even the smallest of photo files very, very slowly -- about 10-15 seconds per file.
    I read an earlier post to this effect that said it might be a networked printer, but the printer on this computer is NOT networked.
    I'm running Windows XP, latest update, with 3 GB of RAM. And we have uninstalled and reinstalled the Photoshop program and updated it to make sure it's not a file goober in the program.
    Is anyone else experiencing this trend and/or have any solutions?
    rfrank
    WSU Today
    http://www.wsutoday.wsu.edu

    John:
    Thanks for the effort, but alas it didn't make any difference.
    Our IT support crew tried a variety of generic drivers in the default mode, but a 56 KB photo still took 15 seconds to load.
    However, I still had a version of "Photoshop CS" on my machine. So, out of curiosity, I fired up CS and grabbed the same test photos, and bam, the photo popped right up. So it is definitely a CS3 issue.
    Any other solutions that you know of?
    rfrank
    WSU Today

  • Huge data on slow network

    Hi all!
    I have a client which connects to the database through a slow network (2 Mbit/s) with slow response times:
    PING (10.6.0.5) from 10.1.9.24 : 56(84) bytes of data.
    64 bytes from (10.6.0.5): icmp_seq=1 ttl=124 time=115 ms
    64 bytes from (10.6.0.5): icmp_seq=6 ttl=124 time=109 ms
    64 bytes from (10.6.0.5): icmp_seq=7 ttl=124 time=108 ms
    Once per day the client runs a SQL query which pulls 15 MB of data from the database.
    It takes a lot of time, much more than if I transferred the same data through FTP.
    Why does this happen? Maybe the server can compress the data before transfer? (See the sketch at the end of this thread.)
    Please help me.

    Contact the carrier.    
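    With round-trip times of roughly 110 ms, the elapsed time is usually dominated by the number of SQL*Net round trips rather than by the 15 MB itself. Two client-side things worth trying, as a hedged sketch: a larger fetch size (shown below for SQL*Plus; other clients have an equivalent prefetch/fetch-size setting), and, if both ends are 12c or later, SQL*Net compression via SQLNET.COMPRESSION=on in sqlnet.ora. The view name below is made up as a stand-in for the daily query.

    -- fetch 500 rows per round trip instead of the SQL*Plus default of 15
    SET ARRAYSIZE 500
    -- shows "SQL*Net roundtrips to/from client" so the effect can be measured
    SET AUTOTRACE TRACEONLY STATISTICS
    SELECT * FROM daily_extract_v;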

  • SSIS Moving data very Slow

    We have a large table, roughly 200 GB, that we are attempting to migrate from one server to another. We are attempting to use SSIS, and the table has an image type and a text data type. I've tried to tune my SSIS package by altering the BufferTempStoragePath, BLOBTempStoragePath, DefaultBufferSize, and DefaultBufferMaxRows properties, but the transfer still takes extremely long (up to 15 hours). The table is 15 million rows. I've also attempted to run my SSIS in chunks, as high as 1 million rows per data flow task and as low as 500,000 per data flow task. If I attempt to transfer 250,000 records at a time it starts out transferring pretty fast, then it slows down drastically once it gets to about 800,000 records. I'm using SQL Server 2005 x64 on Windows Server 2003 R2 with SP2. I have 16 GB of memory on the target server and 8 GB on the source. Are there any other changes I should consider?

     CozyRoc wrote:
    Matt,
    Is that behavior of swapping BLOB/CLOB columns out to disk documented? I don't remember reading about it.
    Yep, it sure is. What Matt M refers to is the case where a BLOB cannot fit into the data flow buffer. When this happens, the BLOB data gets written to the directory (optionally) specified in the Data Flow property BlobTempStoragePath, or to the directory stored in the "TEMP" environment variable.
    It is in the description of that property where you will see the confirmation of Matt's statement, "When the data flow task processes a column containing BLOB data that is larger than the available memory on the system it temporarily writes data from the BLOB to the file system."
    http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.dts.pipeline.wrapper.mainpipeclass.blobtempstoragepath(SQL.90).aspx
    That description is a bit misleading, as I suspect the phrase "...larger than the available memory on the system" also includes being able to fit into the data flow buffer, whichever is smaller.

  • Why first time loading ReportViewer object very slow

    I am interested to know why the first time a .NET application loads the ReportViewer object it is so slow. It can take over 60 seconds. Is this a performance issue, or do we need to preload the object?

    Hello Steve,
    If you view a report for the first time, the report viewer needs to load all the DLLs needed to connect to the database and to render and display the report. These DLLs stay in memory for a certain time afterwards.
    That is why the report takes much longer to display the first time.
    Falk

  • eMac loads net pages very slowly -- why?

    Hi guys, I need your help: my eMac only loads internet pages once it thinks it's OK to do so.
    What can cause this? And I can't update the browser from the Apple update site. I need help ASAP.
    Thanks, Abdul
    It's bog standard.

    TenFourFox is the most up to date browser for our PPCs, they even have G4 & G5 optimized versions...
    http://www.floodgap.com/software/tenfourfox/
    SeaMonkey seems pretty fast also, with many options...
    http://www.seamonkey-project.org/
    http://www.seamonkey-project.org/releases/
