Help with large file copy to WKS

I need to copy a folder and all of its contents from
\apps\reallybigfolder to each workstation at c:\putithere.
Because of the size of the file copy, I chose to create an app object
with logic that says if xxx.exe exists then do not push. Basically, I
don't want this thing to just run constantly. I need it to run once on
each workstation on the network when the users log in.
I have tried pointing the app object to \public\ncopy.exe and then
setting the parameters the way I want them, but I keep getting an
error: "Cannot load VDM IPX/SPX Support". The two files in the folder
are copied, but the subfolders are not. I have tried using the /s/e
switches, but it does not help.
I have also tried writing a .bat file to test it, but I get the same
results as above. So next I tried using copy instead of ncopy. I do not
get the error message, but it still does not copy any of the subfolders.
Is there another way? An easier way? I really appreciate the help.
Tony

What you are doing should work.
It sounds as if there are some other workstation issues going on.
I don't think I have seen, or could make, the error "Cannot load VDM
IPX/SPX Support" happen if I tried. Perhaps this would happen w/o a
Novell Client installed. In such a case you could use XCOPY or ROBOCOPY.
(Robocopy is way cooler than XCOPY and is free from MS.)
You can also use the "Distribution Tab" and the "Files" section to copy
entire directories. Just use *.* as the source.
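For example, here is a minimal .bat sketch of that run-once logic (a sketch, not a tested solution: the F: drive mapping is an assumption, and xxx.exe is the marker file from the original post):
if exist "C:\putithere\xxx.exe" goto :eof
rem /E copies subdirectories including empty ones, /C keeps going on errors,
rem /I treats the destination as a directory, /Y suppresses overwrite prompts.
xcopy "F:\apps\reallybigfolder" "C:\putithere" /E /C /I /Y
rem Robocopy equivalent: robocopy "F:\apps\reallybigfolder" "C:\putithere" /E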
Craig Wilson
Novell Product Support Forum Sysop
Master CNE, MCSE 2003, CCN

Similar Messages

  • Network speed affected by large file copy operations. Also, why intermittent network outages?

    Hi
    I have a couple of issues on our company network.
    The first is that a single large file copy impacts the entire network and dramatically reduces network speed. The second is that there are periodic outages where file open/close/save operations may appear to hang, and also where programs that rely on network connectivity, e.g. email, appear to hang. It is as though the PC loses its connection to the network, but the status of the network icon does not change. For the second issue, if we wait the program will respond, but the wait period can be up to 1 min.
    The downside is that this affects Access databases on our server, so that when an 'outage' occurs the Access client cannot recover and hangs permanently.
    We have a Windows Active Directory domain that comprises Windows 2003 R2 (soon to be decommissioned), Windows Server 2008 Standard and Windows Server 2012 R2 Standard domain controllers. There are two member servers: a file server running Windows 2008 Storage Server and a remote access server (which also runs WSUS) running Windows Server 2012 Standard. The clients comprise about 35 Win7 PCs and 1 Vista PC.
    When I copy or move a large file from the 2008 Storage Server to my Win7 client, other staff experience massive slowdowns when accessing the network. Recently I was moving several files from the Storage Server to my local drive. The files comprised pairs (e.g. folo76t5.pmm and folo76t5.pmi), one of which is less than 1 MB while the other varies between 1.5 and 1.9 GB. I was moving two files at a time, so the total file size for each operation was just under 2 GB.
    While the file move operation was taking place, a colleague was trying to open a 36 KB Excel file. After waiting 3 minutes he asked me for help. I did some tests and noticed that when I was not copying large files he could open the Excel file immediately. When I started copying more data from the Storage Server to my local drive, it took several minutes before his PC could open the Excel file.
    I also noticed on my Win7 client that our email client (Pegasus Mail), which was the only application I had open at the time, would hang when the move operation started, and it would take at least a minute for it to start responding.
    Ordinarily we work with many files.
    Anyone have any suggestions, please? This is something that is affecting all clients. I can't carry out file maintenance on large files during normal work hours if network speed is going to be so badly impacted.
    I'm still working on the intermittent network outages (the second issue), but if anyone has any suggestions about what may be causing this I would be grateful if you could share them.
    Thanks

    What have you checked for resource usage during one of these copies of a large file?
    At a minimum I would check Task Manager>Resource Monitor.  In particular check the disk and network usage.  Also, look at RAM and CPU while the copy is taking place.
    What RAID level is there on the file server?
    There are many possible areas that could be causing your problem(s).  And it could be more than one thing.  Start by checking these things.  And go from there.
    Hi, JohnB352
    Thanks for the suggestions. I have monitored the server and can see that the memory is nearly maxed out, with a lot of hard faults (varying between several hundred and several thousand) recorded during normal usage. The disk and CPU seem normal.
    I'm going to replace the RAM and double it up to 12GB.
    Thanks! This may help with some other issues we are having. I'll post back after it has been done.
    [Edit]
    Forgot to mention: there are 6 drives in the server. 2 for the OS (mirrored, RAID 1) and 4 for the data (striped with parity, RAID 5).
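    As a sketch of how to capture those numbers during a test copy, the standard Performance Monitor counters can be logged from a command prompt on the server (typeperf ships with Windows; the 10-second interval and output file name are arbitrary choices):
    typeperf "\Memory\Available MBytes" "\Memory\Pages/sec" "\PhysicalDisk(_Total)\Avg. Disk Queue Length" "\Network Interface(*)\Bytes Total/sec" -si 10 -o copytest.csv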

  • Qosmio X500-148 - Large file copy hangs

    Large file copies (~5-10 GB) between USB or FireWire disks hang the PC. The copy goes up to 50%, then slows down, and there is no possibility of doing anything, including cancelling the task or opening the browser, Windows Explorer, or Task Manager.
    Event viewer does not show anything strange.
    This happens only with Windows 7 64-bit - no problem with Windows 7 32-bit (same external hardware). No problem when copying between internal PC disks.
    The PC is very powerful - Qosmio X500-148

    The external Hardware is:
    1.5 TB WD Hard disk USB
    1.0 TB WD + 250GB Maxtor + 250GB Maxtor on Firewire
    I have used the standard copy feature - copy and paste - as well as
    Viceversa Pro for folder sync, with the same results.
    Please note that the same external configuration was running properly on a 1-core PC - a Satellite - running Win7 x86 without problems. Since I moved to Win7 x64 on my brand new X500-148 I have had a great deal of copying problems.
    I have installed all Windows updates, and Event Viewer doesn't show anything strange when copying.

  • Photoshop CS6 keeps freezing when I work with large files

    I've had problems with Photoshop CS6 freezing on me and giving me RAM and Scratch Disk alerts/warnings ever since I upgraded to Windows 8.  This usually only happens when I work with large files, however once I work with a large file, I can't seem to work with any file at all that day.  Today however I have received my first error in which Photoshop says that it has stopped working.  I thought that if I post this event info about the error, it might be of some help to someone to try to help me.  The log info is as follows:
    General info
    Faulting application name: Photoshop.exe, version: 13.1.2.0, time stamp: 0x50e86403
    Faulting module name: KERNELBASE.dll, version: 6.2.9200.16451, time stamp: 0x50988950
    Exception code: 0xe06d7363
    Fault offset: 0x00014b32
    Faulting process id: 0x1834
    Faulting application start time: 0x01ce6664ee6acc59
    Faulting application path: C:\Program Files (x86)\Adobe\Adobe Photoshop CS6\Photoshop.exe
    Faulting module path: C:\Windows\SYSTEM32\KERNELBASE.dll
    Report Id: 2e5de768-d259-11e2-be86-742f68828cd0
    Faulting package full name:
    Faulting package-relative application ID:
    I really hope to hear from someone soon, my job requires me to work with Photoshop every day and I run into errors and bugs almost constantly and all of the help I've received so far from people in my office doesn't seem to make much difference at all.  I'll be checking in regularly, so if you need any further details or need me to elaborate on anything, I should be able to get back to you fairly quickly.
    Thank you.

    Here you go Conroy.  These are probably a mess after various attempts at getting help.

  • Can someone help with CR2 files? I can't see CR2 files in Adobe Bridge

    Can someone help with CR2 files? I can't see CR2 files in Adobe Bridge. When I open Adobe Photoshop CS5 > Help > About Plug-ins, there are no Camera Raw plug-ins. When I go to Edit > Preferences and click on Camera Raw, it shows a message that the Adobe Camera Raw plug-in cannot be found.

    That's strange. It seems that the Camera Raw.8bi file has been moved to a different location or has become corrupt. By any chance, did you try to move the Camera Raw plug-in to a custom location?
    Go to "C:\Program Files (x86)\Common Files\Adobe\Plug-Ins\CS5\File Formats" and look for the Camera Raw.8bi file.
    If you have that file there, try to download the updated Camera Raw plug-in from the below location.
    http://www.adobe.com/support/downloads/thankyou.jsp?ftpID=5371&fileID=5001
    In case you are not able to locate the Camera Raw.8bi file in the above location, then I think you need to re-install PS CS5.
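    To quickly check from a command prompt whether the plug-in is present at that path:
    dir "C:\Program Files (x86)\Common Files\Adobe\Plug-Ins\CS5\File Formats\Camera Raw.8bi"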
    [Moving the discussion to Photoshop General Discussions Forum]

  • Help with add file name problem with Photoshop CS4

    Frustrating problem: help with an add-file-name problem in Photoshop CS4. What happens is this: when I am in PS CS4 or CS3 and run the following script, it runs fine. When I am in Bridge and go to Tools > Photoshop > Batch and run the same script, it runs until it wants interaction with preference.rulerunits. How do I get it to quit doing this so I can run in batch mode? Any help is appreciated. HLower
    Script follows:
    // this script is another variation of the script addTimeStamp.js that is installed with PS7
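    // Suggested addition (not part of the original script): suppress modal dialogs
    // so Photoshop does not stop to ask for interaction when run from a Bridge batch.
    app.displayDialogs = DialogModes.NO;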
    //Check if a document is open
    if ( documents.length > 0 )
    {
        var originalRulerUnits = preferences.rulerUnits;
        preferences.rulerUnits = Units.INCHES;
        try
        {
            var docRef = activeDocument;
            // Create a text layer at the front
            var myLayerRef = docRef.artLayers.add();
            myLayerRef.kind = LayerKind.TEXT;
            myLayerRef.name = "Filename";
            var myTextRef = myLayerRef.textItem;
            //Set your parameters below this line
            //If you wish to show the file extension, change the n to y in the line below, if not use n.
            var ShowExtension = "n";
            // Insert any text to appear before the filename, such as your name and copyright info between the quotes.
            //If you do not want extra text, delete between the quotes (but leave the quotes in).
            var TextBefore = "Lower© ";
            // Insert any text to appear after the filename between the quotes.
            //If you do not want extra text, delete between the quotes (but leave the quotes in).
            var TextAfter = " ";
            // Set font size in Points
            myTextRef.size = 10;
            //Set font - use GetFontName.jsx to get exact name
            myTextRef.font = "Arial";
            //Set text colour in RGB values
            var newColor = new SolidColor();
            newColor.rgb.red = 0;
            newColor.rgb.green = 0;
            newColor.rgb.blue = 0;
            myTextRef.color = newColor;
            // Set the position of the text - percentages from left first, then from top.
            myTextRef.position = new Array( 10, 99 );
            // Set the Blend Mode of the Text Layer. The name must be in CAPITALS - i.e. change NORMAL to DIFFERENCE.
            myLayerRef.blendMode = BlendMode.NORMAL;
            // select opacity in percentage
            myLayerRef.opacity = 100;
            // The following code strips the extension and writes the text layer. fname = file name only
            var di = (docRef.name).indexOf(".");
            var fname = (docRef.name).substr(0, di);
            //use extension if set
            if ( ShowExtension == "y" )
                fname = docRef.name;
            myTextRef.contents = TextBefore + " " + fname + " " + TextAfter;
        }
        catch( e )
        {
            // An error occurred. Restore ruler units, then propagate the error back
            // to the user
            preferences.rulerUnits = originalRulerUnits;
            throw e;
        }
        // Everything went Ok. Restore ruler units
        preferences.rulerUnits = originalRulerUnits;
    }
    else
    {
        alert( "You must have a document open to add the filename!" );
    }

    You might want to try the scripting forum, Howard:
    http://www.adobeforums.com/webx?13@@.ef7f2cb

  • Wpg_docload fails with "large" files

    Hi people,
    I have an application that allows the user to query and download files stored in an external application server that exposes its functionality via webservices. There's a lot of overhead involved:
    1. The user queries the file from the application and gets a link that allows her to download the file. She clicks on it.
    2. Oracle submits a request to the webservice and gets an XML response back. One of the elements of the XML response is itself an embedded XML document, and one of its elements is the file, encoded in base64.
    3. The embedded XML document is extracted from the response, and the contents of the file are stored into a CLOB.
    4. The CLOB is converted into a BLOB.
    5. The BLOB is pushed to the client.
    Problem is, it only works with "small" files, less than 50 KB. With "large" files (more than 50 KB), the user clicks on the download link and about one second later gets a
    "The requested URL /apex/SCHEMA.GET_FILE was not found on this server" error.
    When I run the webservice outside Oracle, it works fine. I suppose it has to do with PGA/SGA tuning.
    It looks a lot like the problem described at this Ask Tom question.
    Here's my slightly modified code (XMLRPC_API is based on Jason Straub's excellent Flexible Web Service API: http://jastraub.blogspot.com/2008/06/flexible-web-service-api.html):
    CREATE OR REPLACE PROCEDURE get_file ( p_file_id IN NUMBER )
    IS
        l_url                  VARCHAR2( 255 );
        l_envelope             CLOB;
        l_xml                  XMLTYPE;
        l_xml_cooked           XMLTYPE;
        l_val                  CLOB;
        l_length               NUMBER;
        l_filename             VARCHAR2( 2000 );
        l_filename_with_path   VARCHAR2( 2000 );
        l_file_blob            BLOB;
    BEGIN
        SELECT FILENAME, FILENAME_WITH_PATH
          INTO l_filename, l_filename_with_path
          FROM MY_FILES
         WHERE FILE_ID = p_file_id;
        l_envelope := q'!<?xml version="1.0"?>!';
        l_envelope := l_envelope || '<methodCall>';
        l_envelope := l_envelope || '<methodName>getfile</methodName>';
        l_envelope := l_envelope || '<params>';
        l_envelope := l_envelope || '<param>';
        l_envelope := l_envelope || '<value><string>' || l_filename_with_path || '</string></value>';
        l_envelope := l_envelope || '</param>';
        l_envelope := l_envelope || '</params>';
        l_envelope := l_envelope || '</methodCall>';
        l_url := 'http://127.0.0.1/ws/xmlrpc_server.php';
        -- Download XML response from webservice. The file content is in an embedded XML document encoded in base64
        l_xml := XMLRPC_API.make_request( p_url      => l_url,
                                          p_envelope => l_envelope );
        -- Extract the embedded XML document from the XML response into a CLOB
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/methodResponse/params/param/value/string/text()').getclobval(), 1 );
        -- Make a XML document out of the extracted CLOB
        l_xml := xmltype.createxml( l_val );
        -- Get the actual content of the file from the XML
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/downloadResult/contents/text()').getclobval(), 1 );
        -- Convert from CLOB to BLOB
        l_file_blob := XMLRPC_API.clobbase642blob( l_val );
        -- Figure out how big the file is
        l_length    := DBMS_LOB.getlength( l_file_blob );
        -- Push the file to the client
        owa_util.mime_header( 'application/octet', FALSE );
        htp.p( 'Content-length: ' || l_length );
        htp.p( 'Content-Disposition: attachment;filename="' || l_filename || '"' );
        owa_util.http_header_close;
        wpg_docload.download_file( l_file_blob );
    END get_file;
    /
    I'm running XE; PGA is 200 MB, SGA is 800 MB. Any ideas?
    Regards,
    Georger

    Script: http://www.indesignsecrets.com/downloads/MultiPageImporter2.5JJB.jsx.zip
    It works great for files up to ~400 pages; when I have more pages than that is when I get the crash, at around page 332.
    Thanks

  • Help with Automator to copy text from multiple files

    Hello,
    I'm new to automator and applescript but it seems like what I'm trying to accomplish is fairly easy.
    I'd like to run a workflow that will open a text file, copy the contents, paste the contents into a given application and then take a screenshot.
    I'd like to be able to do this for several hundred text files.
    I've tried with a Service but can't figure out how to provide the text input after "Get Specified Finder Items".
    Get Specified Finder Items --> Get Contents of TextEdit Document --> Copy to Clipboard --> results in no data.

    You'll need to use AppleScript (you could use the Automator "Run AppleScript" action).
    set recipientAddress to do shell script "cat <filename.txt>"
    set theSubject to "Type your subject here!"
    set theContent to "Type your message content here!"
    tell application "Mail"
        activate
        set theMessage to make new outgoing message with properties {subject:theSubject, content:theContent, visible:true}
        tell theMessage
            make new to recipient with properties {address:recipientAddress}
            -- Uncomment send to Send the Message:
            -- send
        end tell
    end tell

  • Read parameter error -50 with larger file - please help

    I have this line of code: (read sfRef from SourcePosition as data)
    It works fine with these 811.2 MB files, but when I try to read from a 1.62 GB file I get parameter error -50. Did I miss something about larger files?

    Found it!
    You can't read larger than 1 GB at a time. I had it grab the data in multiple pieces and it works now.

  • Large file copy to iSCSI drive fills all memory until server stalls.

    I am having the file copy issues that people have been having with various versions of Server now for years, as can be read in the forums. I am having this issue on Server 2012 Std., using Hyper-V.
    When a large file is copied to an iSCSI drive, the file is copied into memory first faster than it can be sent over the network. It fills all available GB of memory until the server, which is a VM host, pretty much stalls and also all the VMs stall. This
    continues until the file copy is finished or stopped; then the memory is gradually released as the data is sent over the network.
    This issue was happening on send and receive. I changed the registry setting for Large Cache to disable it, and now I can receive large files from the iSCSI. They now take an additional 1 GB of memory and it sits there until the file copy is finished.
    I have tried all the NIC and disk settings as can be found in the forums around the internet that people have posted in regard to this issue.
    To describe it in a little more detail: when receiving a file from iSCSI, the file copy window shows a speed of around 60-80 MB/sec, which is wire speed. When sending a file to iSCSI, the file copy window shows a speed of 150 MB/sec, which is actually the
    speed at which it is being written to memory. The NIC counter in Task Mgr instead shows the actual network speed, which is about half of that. The difference is the rate at which memory fills until it is full.
    This also happens when using Windows Server Backup. It freezes up the VM Host and Guests while the host backup is running because of this issue. It does cause some software issues.
    The problem does not happen inside the Guests. I can transfer files to a different LUN on the same iSCSI, which uses the same NIC as the Host with no issue.
    Does anyone know if the fix has been found for this? All forum posts I have found for this have closed with no definite resolution found.
    Thanks for your help.
    KTSaved

    Hi,
    Sorry if it causes confusion, but by "by design" I mean "by design it will use memory for copying files via the network".
    In Windows 2000/2003, the following keys could help control the memory usage:
    LargeSystemCache (0 or 1) in HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
    Size (1, 2 or 3) in HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters
    I saw threads mentioning that it will not work in later systems such as Windows 2008 R2.
    For Windows 2008 R2 and Windows 2008, there is a service named Microsoft Windows Dynamic Cache Service which addressed this issue:
    https://www.microsoft.com/en-us/download/details.aspx?id=9258
    However I searched and there is no updated version for Windows 2012 and 2012 R2.
    I also noticed that the following command could help control the memory usage. With value = 1, NTFS uses the default amount of paged-pool memory:
    fsutil behavior set memoryusage 1
    You need a reboot after changing the value. 
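    For reference, here is how the settings mentioned above could be applied from an elevated command prompt (a sketch; the registry values are the 2000/2003-era ones and may be ignored by newer versions, and a reboot is needed after changes):
    :: Query, then set, the NTFS paged-pool behavior (1 is the default).
    fsutil behavior query memoryusage
    fsutil behavior set memoryusage 1
    :: Windows 2000/2003-era registry values from the reply above.
    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v LargeSystemCache /t REG_DWORD /d 0 /f
    reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v Size /t REG_DWORD /d 1 /f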
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Large file copy fails

    I'm trying to move a 60 GB folder from a NAS to a local USB drive, and regardless of how many times I try to do this it fails within the first few minutes.
    I'm on a managed Cisco Gigabit Ethernet switch in a commercial building, and I have hundreds of users having no problems with OS 10.6, Windows XP and Windows 7, but my Yosemite system is not able to do this unless I boot into my OS 10.6 partition.
    Reconfiguring the switch is not a viable option; I can't change things on a switch that would jeopardize hundreds of users just to fix one thing on a Mac that is testing the legitimacy of OS 10.10 in a corporate setting.

    The CPU does occasionally peak at 100% when transferring a large file, but the copy often fails when the CPU is significantly lower. I know a 4240 has 300 Mbit/s throughput, but as I understood it, traffic would still be serviced but would bypass the inspection process if that is exceeded. Maybe a transition from inspection to non-inspection causes the copy to fail, like a TCP reset; I may try a sniffer.
    I do have TAC involved, but I like to utilise the knowledge of other expert users like yourself to try to rectify issues. Thanks for your help. If you have any other comments please let me know; I will certainly post my findings if you are interested.

  • IdcApache2Auth.so Compiled With Large File Support

    Hi, I'm installing UCM 10g on a Solaris 64-bit platform with Apache 2.0.63. Everything went fine until I updated the configuration in the httpd.conf file. When I query the server status it seems to be OK:
    ./idcserver_query
    Success checking Content Server  idc status. Status:  Running
    but in the Apache error_log I found the following error description:
    Content Server Apache filter detected a bad request_rec structure. This is possibly a problem with LFS (large file support). Bad request_rec: uri=NULL;
    Sizing information:
    sizeof(*r): 392
    [int]sizeof(r->chunked): 4
    [apr_off_t]sizeof(r->clength): 4
    [unsigned]sizeof(r->expecting_100): 4
    If the above size for r->clength is equal to 4, then this module
    was compiled without LFS, which is the default on Apache 1.3 and 2.0.
    Most likely, Apache was compiled with LFS, this has been seen with some
    stock builds of Apache. Please contact Support to obtain an alternate
    build of this module.
    When I searched My Oracle Support for suggestions about how to solve my problem, I found a thread which basically says that the Oracle ECM support team could give me a copy of IdcApache2Auth.so compiled with LFS.
    What do you suggest?
    Should I ask the ECM support team for help? (If yes, please tell me how I can do it.)
    Or should I update the Apache web server to version 2.2 and use IdcApache22Auth.so, which is compiled with LFS?
    Thanks in advance, I hope you can help me.

    Hi,
    The easiest approach would be to use Apache 2.2 and the corresponding IdcApache22Auth.so file.
    Thanks
    Srinath

  • Windows 7 64-bit Corrupting (Altering) Large Files Copied to External NTFS Drives

    http://social.technet.microsoft.com/Forums/en-US/w7itproperf/thread/13a7426e-1a5d-41b0-9e16-19437697f62b/
    Continuing from this thread, I have the same problems. The corrupted files are only archives (zip or 7z ...) and exe files; there are no problems copying large files like movies, for example, and no problem when copying via Linux on the same laptop. It is a Windows
    issue. I have all updates installed, nothing missing.

    OK, let's be brief.
    This problem has been annoying me for years. It is totally reproducible, although random. It happens when copying to external drives (mainly USB) when they are configured for "safe removal". I have had issues copying to NTFS and FAT32 partitions. I have had issues
    using 4 different computers, from 7 years old to brand new ones, using AMD or Intel chipsets and totally different USB controllers, and using many different USB sticks, hard disks, etc. The only common thing in those computers is Windows 7 x64 and the external
    drives' optimization for "safe removal". Installing Teracopy reduces the chances of data corruption, but does not eliminate them completely. The only real workaround (tested for 2 years) is activating the write cache in the Device Manager properties of the
    drive. That way, Windows uses the same transfer mechanisms as for the internal drives, and everything is OK.
    MICROSOFT guys, there is a BIG BUG in the Windows 7 x64 external drive data transfer mechanism. There is a bug in the cache handling mechanism of the safe removal function. Nobody listens; I've been talking about this in forums for years. It is a very dangerous bug
    because it is silent, and many non-professional people are experiencing random errors in their backup data. PLEASE, INVESTIGATE THIS. YOU NEED TO FIX SUCH AN IMPORTANT BUG. IT IS UNBELIEVABLE THAT IT HAS STILL BEEN THERE SINCE 2009!!!
    Hope this helps.
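    As an aside, one way to confirm whether a copied file was silently altered is to compare hashes of the source and destination copies (certutil ships with Windows; the paths here are hypothetical):
    certutil -hashfile "D:\source\archive.7z" MD5
    certutil -hashfile "E:\backup\archive.7z" MD5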

  • Help with Reconnecting Files

    Over this past weekend I was moving, via Relocate Masters, large numbers of images from managed status on my MacBook Pro to referenced status on an external drive (an Air Disk, to be precise). This had gone fine for thousands of images, but Sunday night I returned to the computer to find that it was completely locked up in mid-Relocate, and seemed to have been locked up for a while. I'm not sure what happened, but it seemed bad -- I couldn't get to the Force Quit dialog, or anywhere else. I powered down by holding in the power button.
    Upon resurrecting things, Aperture would not open the library. Long story shorter, I rebuilt the database and was then able to see the library contents again. But then I noticed that many files were "offline." Digging further, and using the Manage Referenced Files box, I discovered that some 3500 files need to be reconnected.
    The Reconnect All button did a little bit of magic for me, but now I've gotten to a point where using Reconnect All only reconnects one file at a time. I was renaming files as I Relocated Masters, and my fear is that this is what has caused the disconnect. *So first,* can anyone tell me that this is what has caused the problem (the problem being my apparent need to now reconnect 3400 files, one-by-one)?
    *Second, and more importantly to me, do I have any recourse beyond reconnecting one-by-one?* Again, the problem seems to me to be that the library file expects a file named image.jpg (or whatever), because the change of the file name to greatimage.jpg that occurred during Relocation did not get written to the library file.
    Many thanks for all help and suggestions.

    Homme dArs wrote:
    I hope someone else can join in with a definitive answer to your problem, but I don't believe you will be happy using the configuration you are attempting to set up.
    Having Aperture access your images via wireless is going to be very, very slow, and any connection problems will be very frustrating. I have even abandoned the idea of using wireless backup of image files because it is so slow.
    Perhaps someone can help with your immediate problem, but I would strongly suggest you find a way to store your referenced files locally (a firewire drive would be ideal).
    Thanks. I certainly appreciate your thought, and before trying it myself, my instinct was the same as yours. That said, until the present problem cropped up, I have to say it has been quite a good solution. My network is 802.11n, and I've noticed no real lag in accessing the files or otherwise using Aperture with a library stored as this is.
    But yes, I hope someone is able to help me get these 3400-some files reconnected short of one-by-one.
    Thank you for your comments.

  • Help with moving files

    I'm having problems copying data across my network to my new Mac mini, and I'm looking for some help with how to resolve this issue.
    I'm using "Connect to Server" to connect via SMB to a Vista x64 machine that has the files I need to copy. I'm mounting the share as the user that has permissions to the files on the Vista client. The connection is fine, but when I start the copy process, I receive an error message about not having the appropriate permissions to copy files and then the entire copy process fails. Here is the exact error message:
    Copy
    The operation cannot be completed because you do not have sufficient privileges for some of the items.
    OK
    From the vista machine I have logged in as the same user and I have taken ownership of ALL of the files and folders I'm trying to copy. I also made sure that the folder is shared with EVERYONE having full access (not something I would normally do, but figured it might help this situation). The vista machine is in a workgroup if that makes a difference.
    Question 1: how can I do this via command line so that instead of stopping the entire file copy it skips the ones it can't copy and copies the rest of them?
    With hundreds of subfolders and thousands of files, I have no idea which ones it is failing to copy, making this process quite painful.
    I'm also stuck on another basic issue - when I originally connected to the share, I told the OSX machine to remember the password. If I want to reconnect to the share using a different username and different password, how do I get OSX to reprompt me for this information?
    Help

    You are probably better off using
    ditto
    In a terminal, run
    man ditto
    and read the description and instructions.
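    For example (the paths are hypothetical; -V prints a line for every file copied, so you can see exactly where a failure occurs):
    ditto -V "/Volumes/vista-share/folder" "/Users/you/folder"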
    Yes, sadly, many copy processes abort the rest of the process when they fail at a file. Windows does this as well.
