What's the fastest way to copy 151GB, 375000 files from win 2003 server to win 2008 server

Non techie here.
I have a project where I need to move 151GB of data, spread over 375,000+ files, from a Windows 2003 FAP server to a Windows 2008 server. Copy, XCopy, and Robocopy all take in excess of 50 hours to move it to an external HDD. It has to be an external move for security reasons.
I have 40 hours max to get it off and onto the new server.
Cheers 
Ian

I copied over 12TB in 24 hours using the method below. A lot of this depends on your infrastructure. The scripts are shown exactly as I used them, unmodified for your case; I suggest you look them over, understand the process, and change them to fit your needs.
There are two parts. The first is a main script that schedules the PowerShell jobs that actually do the work. On every loop the main script re-reads a jobcount file to see how many jobs it may run at one time; I did it this way so I could change the number of jobs between daytime (production) hours and night hours. The main loop also reads a nomig file that tells the script not to move certain folders - those were our test cases - and because the file is re-read while the script runs, you can even add test cases during the migration.
The example was used to move thousands of home folders. If you point a single Robocopy command at everything, it will take hours just scanning before it starts copying; running one root folder at a time is much faster, which is why I created this. If you only have a small number of root folders, point it at a level where you do have a lot of subfolders, and remember you can run more than one main process in different runspaces.
Main Script 
VVVVVVVVVV
$homeOld = Get-ChildItem -Path \\server\share | Select-Object Name
$JobRun = 10
$i = 0
$Count = $homeOld.Count
foreach ($homeDir in $homeOld) {
    $i = $i + 1
    $Sdir = $homeDir.Name
    Write-Progress -Activity "Migrating Homes" -Status "Processing $i of $Count" -PercentComplete ($i / $Count * 100) -CurrentOperation "Next to Process $Sdir"
    # Re-read the exclusion list and job limit on every pass so they can be
    # changed while the script is running
    $not = Get-Content \\server\share\script\nomig.txt -ErrorAction Stop
    $JobRun = Get-Content \\server\share\script\jobcount.txt -ErrorAction Stop
    if ($not -notcontains ($homeDir.Name).ToLower()) {
        # Wait for a free job slot, harvesting completed jobs while we wait
        while ((Get-Job -State "Running").Count -gt ($JobRun - 1)) {
            Start-Sleep -Seconds 3
            foreach ($Job in (Get-Job -State "Completed")) {
                $outfile = $Job.Name + ".txt"
                Receive-Job -Job $Job | Out-File -FilePath "\\server\share\verify\$outfile"
                Remove-Job -Job $Job
            }
        }
        Start-Job -Name $Sdir -ArgumentList "\\server\share\$Sdir", "\\newserver\share\$Sdir", "/COPYALL", "/MIR", "/W:1", "/R:1", "/MT:5" -FilePath \\server\share\script\robothread.ps1 > $null
    }
    else {
        Write-Host $homeDir.Name " Excluded" -ForegroundColor Green
    }
}
# Harvest the output of any jobs still outstanding at the end
Get-Job | Wait-Job | ForEach-Object {
    Receive-Job -Job $_ | Out-File -FilePath "\\server\share\verify\$($_.Name).txt"
    Remove-Job -Job $_
}
=====
Thread Script - where Robocopy does the work.
VVVVVVVVV
# Pass the source, destination, and Robocopy switches straight through
& robocopy $args[0] $args[1] $args[2] $args[3] $args[4] $args[5] $args[6]
============
This comes with no warranty; it is just an idea I used to do a very fast copy, with permissions and all attributes, when no other method was usable.
Thanks,
Allan

Similar Messages

  • What is the simplest way to get an XML file from a 10g R2 database?

    Hi,
    I'm new to XML, and there are so many tools to work with it:
    what is the simplest way to get an XML file out of a 10g R2 database?
    I have 10g R2 and an .xsd file describing the XML structure.
    thank you
    Norbert

    There is no automatic way to generate XML documents from an arbitrary set of relational tables using the information contained in an XML Schema. Typically the easiest way to generate XML from relational data is to use the SQL/XML operators (XMLElement, XMLAgg, XMLAttributes, XMLForest); there are many examples of using these operators in the forums. You can validate the generated XML against the XML Schema by registering the schema with XML DB and then using the XMLType.schemaValidate() method.
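    As a rough, language-neutral illustration of the element-per-row/aggregate pattern those SQL/XML operators implement (the table and column names here are hypothetical, and this uses Python's xml.etree rather than Oracle syntax):

```python
import xml.etree.ElementTree as ET

# Hypothetical relational rows, standing in for a table such as EMP
rows = [
    {"empno": 7369, "ename": "SMITH"},
    {"empno": 7499, "ename": "ALLEN"},
]

# One element (with attributes) per row, aggregated under a single root -
# roughly what SELECT XMLElement("Employees",
#                      XMLAgg(XMLElement("Employee", XMLAttributes(empno), ename)))
#              FROM emp; would do in Oracle SQL/XML.
root = ET.Element("Employees")
for row in rows:
    emp = ET.SubElement(root, "Employee", empno=str(row["empno"]))
    emp.text = row["ename"]

print(ET.tostring(root, encoding="unicode"))
# → <Employees><Employee empno="7369">SMITH</Employee><Employee empno="7499">ALLEN</Employee></Employees>
```

    In the database itself the SQL/XML route is usually preferable to exporting rows and assembling the document client-side, since the aggregation happens in one query.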

  • What is the Best way To Copy and paste data from 1 book to another

    I have 18 sheets in 5 different books that I want to extract data from specific cells. What is the best way to do this? Example: one sheet is called Numbers E-O1, with data in 13:WXYZ. The data updates and moves up one row every time I enter
    a new number. So let's say I enter the number 12; through a lot of calculations the output goes in 13:WXYZ, to what I call a counter, which is a 4-digit number. Anyway, how can I send that 4-digit number to a totally different sheet? To bullet out
    what I'm talking about:
    data in cells Row 13:WXYZ in book called Numbers sheet E-O1
    send data to book called "Vortex Numbers" Sheet E-O row 2001:CDEF
    What formula or Macro can I use to make this work?
    thank you!

    Hello Larbec,
    Syntax:
    '[BookName]SheetName'!Range
    Enter in cell  2001:CDEF:
    ='[Numbers]E-O1'!13:WXYZ
    This assumes that the file is open in Excel. Otherwise you need to add the path:
    'ThePath[BookName]SheetName'!Range
    Best regards, George

  • What's the easiest way to select all my files from all the folders on a hard drive and place into one folder?

    Hi there,
    I have about 30,000 images spread across hundreds of folders. I'm wondering what the easiest way is to get them all into one folder so I can select and convert them all from .psd, .tiff, etc. to .jpg. The reason I'm doing this is that the folder structure is such a mess that I'm just going to import them all into Aperture to sort everything. But the .tiffs and .psds are 100 MB each, so I want to scale them down to jpgs of only 4 or 5 MB before I even think about importing them into Aperture.
    I tried doing a search for "kind is image" and it shows them all, but a ton of them are named the same thing, so when I try to select all and move them into one folder it tells me I can skip that one or stop copying.
    Any thoughts or ideas on this?
    Thanks,
    Caleb

    Hi russelldav,
    one note on your data handling:
    When  each of the 50 participants send the same 60 "words" you don't need 3000 global variables to store them!
    You can reorganize those data into a cluster for each participant, and using an array of cluster to keep all the data in one "block".
    You can initialize this array at the start of the program for the max number of participants, no need to (dynamically) add or delete elements from this array...
    Edited:
    When all "words" have the same representation (I16 ?) you can make a 2D array instead of an array of cluster...
    Message Edited by GerdW on 10-26-2007 03:51 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • What is the best way to recover a corrupted file from a failing Hard Drive?

    I have a 3TB WD internal drive that is starting to fail with Bad blocks. I noticed it when I had some write errors and so got a new disc as fast as possible to get the data off before it dies completely.
    However, some files are failing to be copied with the Finder. What tools are best for trying to recover these mainly video files? I have Tech Tool Pro, but its utility seems aimed at finding accidentally deleted files. I seem to remember a utility from way back that would repeatedly try to read a bad block and reconstitute the file.
    All suggestions are welcome!
    Thanks.

    Choose and use only a single background process to monitor the health of a drive; it really isn't safe to use multiple tools. SoftRAID does its scanning and checking in the background during idle time, looking for weak sectors - these are not really black/white, but more "xx tries to read/write."
    As I mentioned, and you fit the profile, DW is a one-trick pony and can be used when you need to mount or access files to recover and back up.
    Yes, DW will create a new directory. It has an advanced deep scan to scavenge for files.
    Sometimes you just need to use DW's Preview to see what the drive volume "looks like" now and what it would look like after a rebuilt/created directory; but sometimes you just want Data Rescue 3. Both are in the $95 range.
    Too often large backups, like 10 drives, are not well categorized, and nobody knows "where is what."
    A clone of a drive means you can restore the volume; TM keeps versions. But I don't have 200% faith in TM sets or backups.
    Sometimes RAID6 is the ideal method for a large video catalogue or collection. And some people or businesses keep duplicate RAID6 sets, mirroring each other.
    I liken a 3TB drive to putting too many, or all, of the eggs in one basket.
    Also, keep any drive 40% free, which may sound wasteful until you consider the data's value.
    SoftRAID 4 could be useful to zero and test the drive, or even to "take over" the drive and then build a mirror from it. MPG recommends using it even in non-array use for some of those very reasons.
    The choice going forward: clone with SuperDuper / CCC; recovery with DR3; and build the directory (or not) with DW 4.x.
    With externals, sometimes the weak link is the case and interface or drive enclosure, plus the bridge if any (native SATA with no bridge is my preference). And controllers, which may be needed.
    Instead of having to order a drive, always have spares and extras. Always have a clone and an extra that can be put into service. And order more.

  • What is the best way to migrate settings/apps/files from late 2008 MBP to new 2012 MBA?

    I have the 2008 MBP backed up via TM to an external HD that is FW800, but is USB 2.0 capable.  I was hoping the TB/FW adapter would come out from Apple soon, and maybe it will.  Bottom line, my new MBA arrives next week, and I'm not sure when the TB/FW adapter will hit the stores.  Can I use USB 2.0 to link the HD to my new MBA or will it be too slow to conduct the migration?  I have about 110GB on the MBP HD and the new MBA is the i7/8G/256GB model. so no worries there.

    The FireWire adapter is due for release in "July" according to the specs. You can make do with your TM backup over USB 2.0. It's slower than FireWire but should complete within 2 hours.

  • What is the fastest way to transfer files and applications from an older iMac to a new MacBook Pro?

    What is the fastest way to transfer files and applications from an older iMac to a new MacBook Pro?
    I have a Firewire cable and Thunderbolt adapter, but no icons showing either Mac appear on either desktop.

    The fastest way is to use Carbon Copy Cloner and an external drive formatted GUID / OS X Extended (Journaled) in Disk Utility, then connect it to the new Mac and use Migration Assistant in the Utilities folder.
    Even faster: if you decide you're going to replace all the apps from fresh sources anyway (e.g. if the older Mac's OS X version, and thus the apps, are old), then just use an external drive, copy only your files to it, then connect it to the new Mac and transfer them over.
    For some apps you can just grab the registration code and install fresh on the new machine with the old code; talk to the developer about transferring the program, as long as it's deleted from the older Mac in the process.
    It used to be that FireWire Target Disk Mode was fastest, but since Thunderbolt came out...
    Note this support doc hasn't been updated since June 2012, so no Thunderbolt info:
    https://support.apple.com/kb/HT1661

  • What is the best way to copy an Aperture library onto an external hard drive? I am getting a message that says "There was an error opening the database. The library could not be opened because the file system of the library's volume is unsupported".

    What is the best way to copy an Aperture library onto an external hard drive? I am getting a message that says "There was an error opening the database. The library could not be opened because the file system of the library's volume is unsupported". What does that mean? I am trying to drag libraries (with metadata) to the external HD, and wondering what the best way to do that is.

    Kirby Krieger wrote:
    Hi Shane.  Not much in the way of thoughts - - but fwiw:
    How is the drive attached?
    Can you open large files on the drive with other programs?
    Are you running any drive compression or acceleration programs (some drives arrive with these installed)?
    Can you reformat the drive and try again?
    Hi Kirby,
    I attached the UltraMax Plus with a USB cable. The UltraMax powers the cable so power is not an issue. I can open other files. Also, there is 500GB of files on the drive so I cannot re-format it. Although, I noted I could import the entire Aperture Library. However, I do not want to create a duplicate on my machine because that would be defeating the purpose of the external drive.
    Thanks,
    Shane

  • What is the fastest way of getting data?

    With a scanning electron microscope, I need to scan a 512*512 pixel area with a pixel repetition of 15000 (two channels), meaning averaging over 15000 measurements. Simultaneously I have to adjust the voltage output for every pixel.
    I am using a 6111E Multifunction I/O board in an 800MHz P3. The whole task has to be done as fast as possible (not more than 20 minutes altogether).
    What is the fastest way to get this huge amount of data with averaging and output in between? (E.g. do I use buffered read with hardware triggering or is there a faster way?)

    Using the NI-DAQ API (not LabVIEW) will give you a significant amount more control over what happens, and when, to the data stream, which translates to a more efficient program. But you then need to program in C/C++ or Delphi. The Measurement Studio provides ActiveX controls that are like the LabVIEW ones for C/C++ (they're slow like the LabVIEW ones though - not a lot you can do about the Windows GDI).
    What are you trying to sample 15000 times? The 512*512 pixel field?
    That's almost 15 gigs of data! And it means you need to process data at 12.8MB/s to finish it in 20 minutes. I hope you know C, x86 assembly and MMX.
    I would set up a huge circular buffer (NI-DAQ calls them "double buffers"), about 30 seconds' worth or so, to use with SCAN_Start. Then I would process the actual buffer the card is DMA'ing the data into with a high-priority thread. Progressively sum the scan values from the 16-bit buffer (the samples are only 12-bit, but the buffer should still be 16 bits wide) into a secondary buffer of DWORDs the size of the screen (512*512); you'll need two of those, one for each channel. Once the 15000 scans are complete, convert each entry to a float, divide by 15000.0f, and store it in a third buffer of floats.
    If you wish to contract this out, send me an email at [email protected]
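    The progressive-summing average described above is language-agnostic; here is a minimal Python sketch of the same idea, with toy sizes and synthetic data standing in for the DMA'd scan buffers:

```python
# Toy sketch of the progressive-summing average: accumulate integer scan
# values per pixel, then do a single float divide at the end. Sizes are
# shrunk from 512*512 pixels / 15000 scans to keep the example quick.
WIDTH, HEIGHT, SCANS = 4, 4, 100
PIXELS = WIDTH * HEIGHT

def fake_scan(i):
    # Stand-in for one DMA'd scan: 12-bit samples in a 16-bit-wide buffer
    return [(i + p) % 4096 for p in range(PIXELS)]

# Secondary buffer of integer accumulators (the "buffer of DWORDs")
acc = [0] * PIXELS
for i in range(SCANS):
    scan = fake_scan(i)
    for p in range(PIXELS):
        acc[p] += scan[p]

# Final pass: convert each sum to a float average (the "divide by 15000.0f")
avg = [a / SCANS for a in acc]
print(avg[0])
# → 49.5
```

    Keeping the accumulators as integers and dividing only once per pixel avoids doing a float operation on every sample, which is exactly why the reply suggests DWORD buffers rather than averaging on the fly.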

  • I'm buying a new Macbook Pro this week and am wondering what is the best way to copy over the software I have from my existing Macbook Pro to the new one? eg. Photoshop and Office etc. I no longer have the CDs.

    I'm buying a new Macbook Pro this week and am wondering what is the best way to copy over the software I have from my existing Macbook Pro to the new one? eg. Photoshop and Office etc. I no longer have the CDs.

    You know what, I'm on a brand new MBP, just about 24 hours old, and something has been working amazingly for me. I have a 27-inch iMac as well, and I've just connected it to my network and been dragging files and apps across the network onto my new MBP. It's really fast and it's flawless. You could always do that: just go into Sharing options and turn them on for both Macs, then click and drag. Of course they both have to be on the same network for this to be possible.
    Look at my network.
    Shared is what you're looking at. I click on there, see all my computer's files, and then drag the ones I want from its folder to my MBP's folders. Hope that helps if you're looking for a very simple way on a wireless network.

  • What is the best way to copy a DVD i made from iMovie?  It was HD720 and I don't want to lose any quality in the copies.

    What is the best way to copy a DVD I made from iMovie?  It was HD720 and I don't want to lose any quality in the copies.  I need to distribute it to about 20 people. It's 42 minutes long.

    You will need to save it as a video to the camera roll.
    Import it into windows as you would a photo.
    Then purchase DVD authoring software, and create a DVD.

  • There are over 4000 duplicates in my iTunes. What is the fastest way to delete them?

    There are over 4000 duplicates in my iTunes. What is the fastest way to delete them?

    Hello there, yandere69keita.
    The following Knowledge Base clarifies your concern about your My Photo Stream counting towards your iCloud storage:
    iCloud: My Photo Stream FAQ
    http://support.apple.com/kb/ht4486
    Does My Photo Stream use my iCloud storage?
    No. Photos uploaded to My Photo Stream do not count against your iCloud storage.
    Thanks for reaching out to Apple Support Communities.
    Cheers,
    Pedro.

  • SQ01 - what is the fastest way?

    hi all,
    what is the fastest way to detect code that updates EBAN from SQ01?
    I am facing a problem with more than 100,000 PRs updated mysteriously (customer tab data) without any change history.
    I suspect it might be from a query, and I need to find the root cause of the problem immediately.
    Please advise.

    Hi Ester,
    I haven't clearly understood what you've written.
    Do you mean to say there were modifications made to the query that do the update?
    If it is standard code, then I don't think it is used for modification.
    Regards,
    Atish

  • What's the fastest way to share files live between 2 Macs in the same room?

    Please can I have some advice on this scenario?
    I'll need to share HD video footage between 2 new Mac Pros in the same room. 1 Mac will be used to upload the footage and the other to edit it using FCP. The footage can be stored either on hard drives in the edit machine or on desktop hard drives connected to the edit machine (or if there is a better option I'm open to advice).
    What is the fastest way of sharing the files and what is the simplest way?
    Any suggestions would be greatly appreciated.
    Thanks in advance

    What is the fastest way of sharing the files and what is the simplest way?
    Fastest way? a fiber channel SAN connecting the two machines to a common fiber-channel based storage array. Can't beat it for performance, but it comes at a cost (think $20K as a starting point, depending on the amount of storage you need).
    Simplest way? Some external media (thumb drive, external hard drive, etc.) that you shuffle between the two machines
    Intermediate: a NAS-based storage box on your network, although be aware that real-time editing of HD video can overwhelm many low-end NAS boxes.

  • What is the fastest way to detect on-to-off tag transitions and then read 500 analog tags?

    What would be the fastest way to read 500 analog tags from the tag engine when a boolean tag transitions from on to off? Right now I have a boolean indicator set up with the HMI wizard in a while loop with a 20ms timer. The indicator feeds the boolean crossing point-by-point VI. When the output is true, I use one Read Multiple Tags VI to get all 500 at once. I am reading data into the tag engine through an OPC server and have around 2500+ tags. I need to read all of the data in less than 100ms. My PLC logic is set up to zero out all of the 500 analog tags when the boolean indicator turns on again. Would I be better off using the Trend Tags VI to monitor the boolean indicator?

    Unclebump,
    You might try using read tag.vi
