Copying a large number of small files across the network: files go missing

On my iMac, I have a directory with 1440 small files (about 100 KB each). I select all of them and drag them to a folder on my Mac mini. The copy finishes quickly, but there are always a few files missing: sometimes 5, sometimes 15, or anything in between.
When I select half of the files, the same problem occurs. Even when I narrow the selection down to 100 files, it once happened that only 99 of them were copied.
Both Macs run 10.5.2 and are connected through a Cisco gigabit switch.
I worked around it by zipping the files into one archive and transferring that, but of course that is hardly a worthy workaround. One of the most basic file operations, copying across a network, should work flawlessly in my opinion.
Do any of you have similar problems? Or perhaps a solution? This, together with the wireless and standby problems on two MacBooks here, makes me consider downgrading to Tiger for the time being.
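For anyone hitting the same thing: instead of trusting the drag-and-drop, the copy can be scripted and verified. This is only a sketch (the single-retry policy and flat-directory assumption are mine, not anything from the thread), using Python's standard library:

```python
import shutil
from pathlib import Path

def copy_and_verify(src_dir, dst_dir):
    """Copy every file from src_dir to dst_dir, then re-copy anything missing.

    A minimal sketch of a zip-free workaround: copy, compare the two
    directory listings, and retry the stragglers once.
    """
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.iterdir():
        if f.is_file():
            shutil.copy2(f, dst / f.name)  # copy2 preserves timestamps
    # Verify: which names are in the source but not the destination?
    missing = ({f.name for f in src.iterdir() if f.is_file()}
               - {f.name for f in dst.iterdir() if f.is_file()})
    for name in missing:  # retry anything the first pass dropped
        shutil.copy2(src / name, dst / name)
    return missing
```

Running it against a mounted network share at least tells you which files were dropped, rather than leaving you to count 1440 icons by hand.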

Scott Oliphant wrote:
Iomega, LaCie, 500GB, 1TB, etc.; it seems to be drive independent. I've formatted and started over with several of the drives, with the same result. If I copy the files over in smaller chunks (say, 70GB) as opposed to 600GB, the problem does not happen. It's like Finder is holding on to some of the info when it puts its "ghost" on the destination drive before it's copied over, and keeping the file locked when it tries to write over it.
This may be a stretch since I have no experience with iomega and no recent experience with LaCie drives, but the different results if transfers are large or small may be a tip-off.
I ran into something similar with Seagate GoFlex drives and the problem was heat. Virtually none of these drives are ventilated properly (i.e., no fans and not much, if any, air flow) and with extended use, they get really hot and start to generate errors. Seagate's solution is to shut the drive down when not actually in use, which doesn't always play nice with Macs. Your drives may use a different technique for temperature control, or maybe none at all. Relatively small data transfers will allow the drives to recover; very large transfers won't, and to make things worse, as the drive heats up, the transfer rate will often slow down because of the errors. That can be seen if you leave Activity Monitor open and watch the transfer rate over time (a method which Seagate tech support said was worthless because Activity Monitor was unreliable and GoFlex drives had no heat problem).
If that's what's wrong, there really isn't any solution except using the smaller chunks of data which you've found works.

Similar Messages

  • Finder issues when copying a large number of files to an external drive

    When copying a large amount of data over FireWire 800, Finder gives me an error that a file is in use and locks the drive up. I have to force eject. When I reopen the drive, there are a bunch of 0KB files sitting in the directory that did not get copied over. This happens on multiple drives. I've attached a screen shot of what things look like when I reopen the drive after forcing an eject. Sometimes I have to relaunch Finder to get back up and running correctly. I've repaired permissions, for what it's worth.
    10.6.8, by the way, 2.93 12-core, 48GB of RAM, fully up to date. This has been happening for a long time; just now trying to find a solution.


  • How can I copy a large amount of data between two hard drives?

    Hello !
    Which command could I use to copy a large amount of data between two hard disk drives?
    How does Lion identify the disk drives when you want to write a script? For example, in Windows I use
    Robocopy D:\folder source\Files E:\folder destination
    I just want to copy files, and if the files/folders already exist in the destination they should be overwritten.
    Help please, I bought my first Mac 4 days ago.
    Thanks !

    Select the files/folders on one HD and drag & drop onto the other HD. The copied ones will overwrite anything with the same names.
    Since you're a newcomer to the Mac, see these:
    Switching from Windows to Mac OS X,
    Basic Tutorials on using a Mac,
    Mac 101: Mac Essentials,
    Mac OS X keyboard shortcuts,
    Anatomy of a Mac,
    MacTips,
    Switching to Mac Superguide, and
    Switching to the Mac: The Missing Manual,
    Snow Leopard Edition.
    Additionally, *Texas Mac Man* recommends:
    Quick Assist,
    Welcome to the Switch To A Mac Guides,
    Take Control E-books, and
    A guide for switching to a Mac.
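If you do want a scriptable equivalent of Robocopy rather than drag & drop, the overwrite-on-copy behavior can be sketched in Python; this is an illustration with placeholder paths, not a built-in Finder feature:

```python
import shutil

def mirror_copy(src, dst):
    """Copy the src folder tree into dst, overwriting files that already
    exist with the same names (roughly what `Robocopy src dst` does on
    Windows). dirs_exist_ok requires Python 3.8+.
    """
    shutil.copytree(src, dst, dirs_exist_ok=True)

# Example (placeholder paths; substitute your own volumes):
# mirror_copy("/Volumes/SourceHD/Files", "/Volumes/DestHD/Files")
```

One difference from Finder worth knowing: dragging a folder onto another volume replaces a same-named folder wholesale, while this merges into the existing folder and only overwrites matching filenames.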

  • Some files that iWork needs are missing.

    I tried updating iWork '08, but the installer failed during the update of Pages. Since then I have been unable to launch any of the iWork apps. Numbers and Pages both give the same error message: "Some files that iWork needs are missing." It then suggests that I reinstall iWork '08, but when I do that, the same issues remain. I have tried reinstalling; uninstalling by deleting the app icon and then installing again; and I've also tried uninstalling using AppDelete and then installing again, but the same issue remains.
    Do you know what i can do?
    Thanks in advance.

    Guys
    I have the same problem as above.
    What happened was that all was fine, then, trying to install an add-in for Safari, I happened to delete the System/Library folder when the GUI went slow on me :-(
    Anyway, I couldn't fix it or reboot, so I ended up reloading the OS from the DVDs.
    Once that was done, almost everything was OK, except for Parallels and iWork08, both of which had worked previously.
    In the case of iWork '08 I've tried all the tips here (deleting any file associated with iWork '06/'08, fixing permissions) and even went as far as looking at what files the install package says it installs.
    Yet, the problem remains.
    Like some others, the installer thinks I'm UPGRADING rather than installing fresh.
    What other tips can you give, where is the root problem and how do I get around it?
    Thanks!
    /TrakDah

  • Files that iWork needs are missing

    When I open Pages or either of the other 2 apps, I get the error window that says, "Files that iWork needs are missing. To restore the missing files, use the iWork installer to reinstall iWork."
    About 2 days after I loaded iWork '08, my computer's hard drive crashed (it would not boot up). I copied everything except the System onto an external hard drive, wiped the G5 hard drive, and reinstalled the System. I recopied everything back from the external hard drive, but iWork will not open.
    I tried reinstallation, and everything went through the install process, but it still would not open.
    Then I did a Spotlight search for anything that said iWork, dumped all of that, did another reinstallation from the disk, and it still gives the same error message.
    Now what? I have iWork on my laptop, and I really like it and would really like to have it work on my desk top computer.

    Dennis,
    No doubt it would have been better had I reinstalled iWork from the CD rather than just dragging all my applications from my external drive. But all 74 other primary apps (Office as 1 primary app) that I brought in by dragging work just fine. If iWork is going to be this touchy, then Apple should provide an "Uninstall" program, as Windows does, for this very reason: you can't ever find all the little files that an application uses.
    I did as you suggested: searched for iWork, Pages, Numbers, and Keynote. I am a university professor and some of these terms generate a lot of files.
    For "pages" I got 20 folders, 2 images, 54 presentations, 2 music, 1 application (the one we want to find), 249 PDF documents, 793 HTML files, 994 documents, and 89 "other". Do I just dump the application, or are there "documents" or any other kind of file I need to dig through?
    For "keynote" I got: 4 folders (all dealing with keynote speakers), 1 application (the one we want to find), 17 PDF documents, 8 HTML files, 33 documents, and 68 "other". Do I just dump the application, or are there "documents" or any other kind of file I need to dig through?
    For "numbers" I got: 0 folders, 531 presentations, 4 application (1 of which is the 1 we want to find), 3 system preferences (none of which seems to be related to iWork), 544 PDF documents, 874 HTML files, 1915 documents, and 20 "other". Do I just dump the application, or are there "documents" or any other kind of file I need to dig through?
    I will wait to hear from you before I dump anything. Thanks.
    Calidris

  • "Files that iLife needs are missing" / "Files that iWeb needs are missing"

    Well, I seem to have a problem that was in other threads, but they really didn't resolve the issue - just got lucky.
    When opening any iWork app or iWeb, I get the following message: "Files that iLife needs are missing."
    This occurred after an archive install, but I have removed every file/library remotely connected to iWeb and (several times) reinstalled iLife from the DVD.
    I tried the install iWork demo workaround - no luck!
    iMovie, iDvd, iTunes all work OK.
    What are the critical files in this case?
    How do I resolve this?

    OK, I fixed it
    1. Use AppDelete to delete all programs installed by iLife
    2. Download iLifeSupport
    3. Show Package contents, open "installers" and run each of the installers (5) then run iLifeSupport.dist
    4. Reinstall iLife
    Did it for me, but it took a while

  • Lightroom says files are missing or offline, but I can still see the images, so they must be in my system somehow

    I have files that say they are missing or offline. I have checked all my folders outside of Lightroom and they cannot be found. I have used the question-mark icon to locate them, and they still cannot be found. I can see the images in Lightroom, so they have to be in my system somehow. How can I retrieve these images? Is there a way to reset Lightroom? Need help ASAP!

    You have probably moved the files outside of Lightroom. The images you see in LR are previews that were made before you moved the files.
    Do a search for the files/folders using your computer's operating system (Explorer/Finder).
    Once you have found them, you can point LR to the right location.
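When the Explorer/Finder search comes up empty, a brute-force walk of a whole volume by exact filename can help before relinking in LR. A small sketch; the search root is whatever disk you suspect the originals moved to:

```python
import os

def find_file(name, search_root):
    """Walk a directory tree and return every path whose filename matches
    `name` exactly. A stand-in for a desktop search, useful when you know a
    missing photo's filename from the Lightroom catalog.
    """
    hits = []
    for dirpath, _dirnames, filenames in os.walk(search_root):
        if name in filenames:
            hits.append(os.path.join(dirpath, name))
    return hits
```

Once this prints a location, you can point Lightroom's "Find missing folder" at that directory.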

  • Couldn't copy a large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from enterprise DB (EDB) to Oracle and vice versa.
    The datatype of the field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO.
    I could successfully copy a limited amount of data from EDB to Oracle, but if there is more data, I get the following exceptions with different Oracle drivers (though I could read large amounts of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I could copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to insert directly into an XMLTYPE column. The solution I was using was to build a wrapper procedure in PL/SQL.
    Here it is (for insert, but I suppose the update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
         insert into AQAOST_FILES (file_no, program_no, ost_xml, btl_xml)
         values (file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    Here is the sqlMap file I used:
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    And here is a simple program:
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
    public class TestInsertXMLType {
         public static void main(String[] args) throws Exception {
              String resource = "sql-map-config-xmlt.xml";
              Reader reader = Resources.getResourceAsReader(resource);
              SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
              OracleDataSource dataSource = new OracleDataSource();
              dataSource.setUser("test");
              dataSource.setPassword("test");
              dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
              Connection connection = dataSource.getConnection();
              sqlMap.setUserConnection(connection);
              AqAost aqAost = new AqAost();
              aqAost.setFileNo(3);
              aqAost.setProgramNo("prg");
              Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
              Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
              aqAost.setOstXML(readerToString(ostXMLReader));
              aqAost.setBltXML(readerToString(bltXMLReader));
              sqlMap.insert("insert", aqAost);
              connection.commit();
         }
         // Drain a Reader into a String so the XML can be bound as a CLOB.
         public static String readerToString(Reader reader) {
              StringWriter writer = new StringWriter();
              char[] buffer = new char[2048];
              int charsRead = 0;
              try {
                   while ((charsRead = reader.read(buffer)) > 0) {
                        writer.write(buffer, 0, charsRead);
                   }
              } catch (IOException ioe) {
                   throw new RuntimeException("error while converting reader to String", ioe);
              }
              return writer.toString();
         }
    }
    package com.sg2net.jdbc;
    public class AqAost {
         private long fileNo;
         private String programNo;
         private String ostXML;
         private String bltXML;
         public long getFileNo() {
              return fileNo;
         }
         public void setFileNo(long fileNo) {
              this.fileNo = fileNo;
         }
         public String getProgramNo() {
              return programNo;
         }
         public void setProgramNo(String programNo) {
              this.programNo = programNo;
         }
         public String getOstXML() {
              return ostXML;
         }
         public void setOstXML(String ostXML) {
              this.ostXML = ostXML;
         }
         public String getBltXML() {
              return bltXML;
         }
         public void setBltXML(String bltXML) {
              this.bltXML = bltXML;
         }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni

  • NMH305 dies when copying large amounts of data to it

    I have an NMH305 still set up with the single original 500GB drive.
    I have an old 10/100 3Com rackmount switch (the old white one) uplinked to my Netgear WGR614v7 wireless router.  I had the NAS plugged into the 3Com switch and everything worked flawlessly.  The only problem was it was only running at 100 Mbps.
    I recently purchased a TRENDnet TEG-S80g 10/100/1000 'green' switch.  I basically replaced the 3Com with this switch.  To test the gigabit speeds, I tried a simple drag & drop of about 4GB worth of pics to the NAS on a mapped drive.  After about 2-3 seconds, the NAS dropped and Explorer said it was no longer accessible.  I could ping it, but the Flash UI was stalled.
    If I waited several minutes, I could access it again.  I logged into the Flash UI and upgraded to the latest firmware, but had the same problem.
    I plugged the NAS directly into the Netgear router and transferred files across the wireless without issue.  I plugged it back into the green switch and it dropped after about 6-10 pics transferred.
    I totally bypassed the switch and plugged it directly into my computer.  I verified I can ping and log in to the Flash UI, then tried to copy files and it died again.
    It seems to only happen when running at gigabit link speeds.  The max transfer rate I was able to get was about 10 Mbps, but I'm assuming that's limited by the drive write speeds and controllers.
    Anyone ran into this before?
    TIA!

    Hi cougar694u,
    You may check the review linked in the original reply. It is a thorough review of the Media Hub's write and read throughput vs. file size on a 1000 Mbps LAN.
    Cheers

  • Redistributing Large Amounts of Media to Different Project Files

    Hello
    I'm working on a WWI documentary that requires an extremely large amount of media. I was just wondering: besides exporting hundreds of QT files, could I capture all of this media, spread out, into several FCP project files?
    In the screen shot below, note that there are 2 FCP projects open at one time. Why? Because I thought it might help to redistribute all the captured media into more than just one FCP project file. Is this advisable? Would creating more FCP files containing more media actually be better than capturing soooo many archival clips into just one main FCP project file?
    Or
    Should I be exporting 'Non-Self Contained' files to be re-imported back into the Project?
    Screen Shot;
    http://www.locationstudio.net/Redistribute-1.jpg
    Thanx
    Mike

    It is absolutely advisable. This keeps project sizes down to manageable levels. This is one of the tips I give in my tutorial DVD on Getting Organized with Final Cut Pro.
    Bear in mind that if your footage and sequences are in different projects, you cannot match back to the clip in the Browser. Once a clip is cut into a sequence that resides in a different project, the link between that media and the original clip in its original project is broken. That can be a pain, so I try not to break up projects unless I have to. And with LARGE projects I find that I have to.
    Although it can still be a pain. When you need to find the footage that you know is in the same bin as that clip... and you don't know what that bin is? Sorry, match back doesn't work. So, just a caveat.
    Shane

  • iTunes Locking when Adding Files Across Network

    I just made the switch from Windows to Mac and I seem to be having issues with iTunes. My main computer used to be a PC and I just purchased a MacBook Air. I have a pretty large media library (1.5TB) and there was no way I could house that on the Air, so I repurposed my PC as a file server. I then created aliases to each media folder on the PC (i.e. Video, Music (320), Music (ALAC), Pictures, eBooks, etc.).
    I set the library in iTunes to be in the local Music folder on the Mac and I also placed the aliases in the Music folder. I made sure to unflag the two "let iTunes do everything" flags. I then started adding the aliases in one by one. Video (which is huge), Pictures, and ALAC music all added with no fuss, slow as ... but they worked. I then tried to add the folder containing MP3s. Not only did it creep, it locked up constantly. I had a beach ball almost constantly. I killed the add process and tried again: same thing. I then rebooted both machines, same thing. At this point I was thinking my files were corrupt, so I loaded iTunes on the PC and had no issues adding the files.
    I then reloaded iTunes on the mac. No good. Now when I say I was getting beach balls I really mean it. I have never in my long life of dealing with PCs ever seen something freeze up and lock completely so often. It is so bad it is crashing finder and I have to power it down. No I know that iTunes is not the best thing that Apple has ever made (trying to be nice) but this is silly.
    Being persistent, I decided to bypass the PC and went and bought a Time Capsule. Now not only does it still crash, the files that do work are added REALLY slowly (and accessed at a pitiful rate). Is the hard drive on the TC that slow?? Time Machine seems to work OK.
    So here I am. I have files that I have confirmed are functional on two PCs using iTunes for Windows, and one of them is completely crashing my Mac. I cannot house this media on the Air; 1.5TB will not fit in a 256GB drive for obvious reasons. There are no strange and funky files, just plain old MP3s.
    Does anyone have suggestions as to what I can do? I have to be honest: at this point I am rather ******. It should not take two days to add MP3s. I am afraid my journey into the world of OS X is about to end. If I didn't have an iPad & iPhone I would use an alternative player, but I do and I am stuck.

    Yes, I am replying to my own thread. The problem only lies with MP3s from a network drive on a Mac (Lion). This problem can't be duplicated on a Windows machine, which is sort of maddening. I converted 500 tracks to AAC and the problem went away, and I have never had an issue with ALAC. No, I am not going to convert 50k tracks to AAC.
    I have been watching the system log as the MP3s import and I keep getting this error:
    "iTunes[1879]: _AMDDeviceAttachedCallbackv3 (thread 0x11309c000): Device 'AMDevice 0x7fd554add380 {UDID = b407ece7782174048d0ee3136beafc5a111e9156, device ID = 17, FullServiceName = a4:d1:d2:76:fe:2d@fe80::a6d1:d2ff:fe76:fe2d._apple-"
    So far google has failed me so I have no idea what it means. Anyone have any idea?

  • Mountain Lion Finder "unresponsive" (beach balls) when trying to copy a large number of files

    Before Mountain Lion, using Lion, transferring 999 files from my CompactFlash card to my SSD via a FW800 reader was SO easy. Select all, drag and drop, and the transfer would begin immediately.
    Now, after upgrading to Mountain Lion, I experience at least 60 seconds of beach balls and an "unresponsive" Finder. Sometimes it starts to transfer; sometimes I have to relaunch Finder.
    I've reset the PRAM (although I didn't hear 4 beeps), the SMC, I did verify and repair disk permissions.
    Any thoughts on this?
    Thank you,
    (Late 2011 15 in MBP)


  • Trouble sharing files across networked computers

    I frequently have this problem when trying to copy files from my MBP at home to my PowerMac at work and vice versa. Regardless of how the connection between the two machines is made (smb, MobileMe), when I try to copy a file or folder, I get a message saying 'You may need to enter the name and password for an administrator on this computer to change the item named "filename.xxx".'
    I am the administrator for both machines, and I am logged in to both as the administrator when this message occurs.
    At this point, I have buttons giving me the option to Stop or Continue. Clicking Continue brings up a message saying 'The item "filename.xxx" contains one or more items you do not have permission to read. Do you want to copy the items you are allowed to read?'
    Again I have the option to Stop or Continue. Clicking Continue gives me the message 'The operation cannot be completed because you do not have sufficient privileges for some of the items.' This message comes up even though I am logged in as the administrator on both machines and I have made sure that I've got read & write privileges for that file.
    At this point, I'm stymied. This occurs whether I'm attempting to transfer a single file, a folder, or an archive. I've adjusted sharing preferences on both machines so that I specifically have full privileges for both the source and target folders as well as the file(s) being moved. Both machines are running OS X 10.5.8.
    Any suggestions? Thanks in advance -

    Hi Keven - just wanted to chime in that I can replicate this error on a nearly identically configured setup as yours, but like you, have no substantive response (yet) from Apple. In fact, my post disappeared. The only thing I see, perusing the forums, is that people are recommending Snow Leopard upgrade to solve numerous networking issues. I realize this is not a satisfactory answer. I will stay subscribed to your post and will let you know if Apple comes back to me with a good response as well.
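One way to narrow this down before another failed copy is to scan the source folder for items the current account genuinely cannot read, since the copy aborts on the first such item. A diagnostic sketch (the folder path is yours to fill in; this only checks readability, it doesn't change any permissions):

```python
import os

def unreadable_items(root):
    """Walk a folder tree and list every file or subfolder the current
    user cannot read. Running this on the source folder before copying
    shows exactly which nested items will trigger the "you do not have
    permission to read" error.
    """
    bad = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if not os.access(path, os.R_OK):
                bad.append(path)
    return bad

# Example: unreadable_items("/Users/you/FolderToCopy")
```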

  • Copying large amount of data from one table to another getting slower

    I have a process that copies data from one table (big_tbl) into a very big archive table (vb_archive_tbl, 30 million records, partitioned). If there are fewer than 1 million records in big_tbl, the copy to vb_archive_tbl is fast (~10 min) and, more importantly, consistent. However, if there are more than 1 million records in big_tbl, copying the data into vb_archive_tbl is very slow (30 min to 4 hours) and very inconsistent. Every few days the time it takes to copy the same amount of data grows significantly.
    Here's an example of the code I'm using, which uses BULK COLLECT and a FORALL INSERT to copy the data.
    I occasionally change 'LIMIT 5000' to see performance differences.
    DECLARE
        TYPE t_rec_type IS RECORD (
            fact_id    NUMBER(12,0),
            store_id   VARCHAR2(10),
            product_id VARCHAR2(20));
        TYPE CFF_TYPE IS TABLE OF t_rec_type INDEX BY BINARY_INTEGER;
        T_CFF CFF_TYPE;
        CURSOR c_cff IS SELECT * FROM big_tbl;
    BEGIN
        OPEN c_cff;
        LOOP
            FETCH c_cff BULK COLLECT INTO T_CFF LIMIT 5000;
            FORALL i IN T_CFF.first..T_CFF.last
                INSERT INTO vb_archive_tbl VALUES T_CFF(i);
            COMMIT;
            EXIT WHEN c_cff%NOTFOUND;
        END LOOP;
        CLOSE c_cff;
    END;
    Thanks you very much for any advice
    Edited by: reid on Sep 11, 2008 5:23 PM

    Assuming that there is nothing else in the code that forces you to use PL/SQL for processing, I'll second Tubby's comment that this would be better done in SQL. Depending on the logic and partitioning approach for the archive table, you may be better off doing a direct-path load into a staging table and then doing a partition exchange to load the staging table into the partitioned table. Ideally, you could just move big_tbl into the vb_archive_tbl with a single partition exchange operation.
    That said, if there is a need for PL/SQL, have you traced the session to see what is causing the slowness? Is the query plan different? If the number of rows in the table is really a trigger, I would tend to suspect that the number of rows is causing the optimizer to choose a different plan (with your sample code, the plan is obvious, but perhaps you omitted some where clauses to simplify things down) which may be rather poor.
    Justin
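For what it's worth, the "do it in SQL" advice above boils down to replacing the whole fetch/insert loop with one set-based statement. Here is a sketch of that shape, with sqlite3 standing in for Oracle purely for illustration (the table names are borrowed from the thread; Oracle could additionally use a direct-path hint like INSERT /*+ APPEND */):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for Oracle. The point is that one
# set-based INSERT ... SELECT replaces the BULK COLLECT / FORALL loop and
# lets the database move the rows itself.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_tbl (fact_id INTEGER, store_id TEXT, product_id TEXT)")
conn.execute("CREATE TABLE vb_archive_tbl (fact_id INTEGER, store_id TEXT, product_id TEXT)")
conn.executemany("INSERT INTO big_tbl VALUES (?, ?, ?)",
                 [(i, "s%d" % i, "p%d" % i) for i in range(1000)])

# The entire archive copy is a single statement, committed once at the end.
conn.execute("INSERT INTO vb_archive_tbl SELECT * FROM big_tbl")
conn.commit()
```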

  • Copying large amount of data from one table to another

    We have a requirement to copy data from a staging area (consists of several tables) to the actual tables. The copy operation needs to
    1) delete the old values from the actual tables
    2) copy data from staging area to actual tables
    3) commit only after all rows in the staging area are copied successfully. Otherwise, it should rollback.
    Is it possible to complete all these steps in one transaction without causing problems with the rollback segments, etc.? What are the things that I need to consider in order to make sure that we will not run into production problems?
    Also, what other best practices/alternative methods are available to accomplish what is described above.
    Thanks,
    Eser

    It's certainly possible to do this in a single transaction. In fact, that would be the best practice.
    Of course, the larger your transactions are, the more rollback you need. You need to allocate sufficient rollback (undo in 9i) to handle the transaction size you're expecting.
    Justin
    Distributed Database Consulting, Inc.
    www.ddbcinc.com
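The single-transaction shape described above (delete, reload from staging, commit only if everything succeeded) looks roughly like this in code. sqlite3 is used only to illustrate the commit/rollback pattern; the table names are placeholders, and an Oracle deployment would additionally need the undo/rollback sizing mentioned in the reply:

```python
import sqlite3

def refresh_from_staging(conn, pairs):
    """Delete old rows and reload each actual table from its staging table
    inside one transaction. `pairs` is a list of (staging_table,
    actual_table) name tuples. Any failure rolls the whole batch back.
    """
    try:
        for staging, actual in pairs:
            conn.execute("DELETE FROM %s" % actual)
            conn.execute("INSERT INTO %s SELECT * FROM %s" % (actual, staging))
        conn.commit()      # commit only after every copy succeeded
    except Exception:
        conn.rollback()    # otherwise nothing is changed
        raise
```

Because the commit happens once at the end, a failure on the third table leaves the first two untouched as well, which is exactly requirement 3 in the question.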
