Saving a file to a networked drive takes a long time

I have been putting up with real slowness when saving files in DW CS4. The save goes to a local network drive (H:). If I use another app, like a text editor, saving a file to the same drive is near instant. But when I save a .cfm file in DW to the same drive, it often takes anywhere from 30 seconds to 5 minutes, all of which is unacceptable for me. It's not every file that takes the same delay, either, and it doesn't seem related to file length: I have a file right now that is 47 lines of CF code, and saving it took about 3 or 4 minutes.
What is going on? Is this DW doing some funky syntax checking or web-standards checking, or is it a network issue on our end? If this is a configuration issue, I'd sure like to know how to fix it. It's extremely annoying, as I forget my train of thought by the time the file has saved.
All suggestions welcomed!

Well, I probably should have checked around here some more. I just talked to someone in our IT dept., and they believe (an unproven theory) that it is likely our network configuration and design. However, if you have a few things to try to eliminate any other possibilities, I'd like to hear them.

Similar Messages

  • Deserializing obj on network drive take a long time -- any ideas to improve

    Good morning!
    Deserializing objects from a network drive is taking way too long, much slower than deserializing the same object files stored on the local hard drive of the development PC. How can I reduce the time?
    The size of the serialized objects is small (72K and 17K).
    The PC accesses network drives quickly enough (word processing, text editing, etc.). The PC's JVM is 1.5.0_09 and the OS is Windows XP.
    Here is the background.
    I had the idea to create objects, serialize them on my server (an AS400), and let client applications deserialize them.
    Now that I have it running, I find that deserializing the objects off the server takes way too long! Does anyone have ideas on how to improve this?
    If I copy the serialized objects off the server onto the local hard drive of the development PC and access them from there, the speed is fine.
    To my mind, that eliminates any incompatibility between the JVM which serialized the objects and the JVM which deserialized them: the same objects deserialize fast from the local drive and slowly over the network.
    I've used both a UNC path and a mapped drive to access the server, and I can't see any performance difference. That is,
    filename = "//server/java/ftpaudit/SerializedObject.ser"
    and
    filename = "J:/ftpaudit/SerializedObject.ser"
    where J: is mapped to //server/java
    This is the code which deserializes the objects:
    try {
         File file = new File(filename);
         FileInputStream f = new FileInputStream(file);
         ObjectInputStream ois = new ObjectInputStream(f);
         Object returnObj = ois.readObject();
         ois.close();
         f.close();
         return returnObj;
    } catch (InvalidClassException ice) {
         ...
    }
    Thank you for your time and help.
    BillB

    Could the file be locked on the server?
    I don't believe it is. It eventually loads (after minutes). The delay is consistent. I haven't put in logging yet to see what part of the deserialize method is chewing up the time.
    Putting in code to copy the file from the server to the local hard drive and then deserializing performs much, much faster (milliseconds, almost as fast as deserializing directly from the local drive).
    I modified this from another post to this forum:
    try {
            FileChannel srcChannel = new FileInputStream(file).getChannel();
            FileChannel dstChannel = new FileOutputStream(dir + file).getChannel();
            dstChannel.transferFrom(srcChannel, 0, srcChannel.size());
            srcChannel.close();
            dstChannel.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    I was wondering if there is a more efficient way to deserialize an object than the one in my original post, or if there are hidden aspects to consider, such as buffer size.
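    On the buffer-size question: ObjectInputStream reads its underlying stream in many small chunks, and over a network share each small read can turn into a round trip, which would explain "fast locally, slow over the network". Wrapping the file stream in a BufferedInputStream so the network is hit in large blocks is a common fix. A minimal sketch (try-with-resources needs Java 7+; on the 1.5 JVM mentioned above, close the streams in a finally block instead):

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.ObjectInputStream;

public class BufferedDeserialize {
    // Same logic as the snippet above, but with a 64 KB buffer between
    // ObjectInputStream and the (possibly remote) file, so the share is
    // read in large blocks instead of a few bytes at a time.
    static Object deserialize(String filename)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(filename), 64 * 1024))) {
            return ois.readObject();
        }
    }
}
```

    If this removes most of the delay, the bottleneck was small-read round trips rather than deserialization itself.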
    Here's an alternative approach: since the server side is written in Java (I'm assuming, since you say that it writes a file containing serialized data), why not create an RMI interface for retrieving the information? The current approach, of opening a file over the network, is very fragile -- what happens if the network mount point or share name changes?
    Thank you. Good suggestion.

  • InDesign 9.2 - Save to network location takes a long time.

    Saving InDesign files to a network location seems to take an extremely long time during which I am unable to do anything else inside InDesign.
    I know that this has been an issue for me for at least 2 weeks (since March 3, 2014, possibly earlier). I am not certain whether I have updated InDesign since then, but have consistently been experiencing this same problem.
    Today I ran some tests (all using the same ~20MB file).
    Save from InDesign to network location: 12 seconds
    Save from InDesign to desktop: <2 seconds
    Copy file (in Windows Explorer) from desktop to network: <2 seconds
    Copy file from network to desktop: <2 seconds
    I know I can work on the file locally and copy my saved versions out at the end of the day, but this was not necessary before, and I don't know what changed to make this take so much longer than it used to.
    Current system configuration: InDesign 9.2 x64 running on a Win 7 Pro 64-bit with 16GB RAM and a Xeon 3.6GHz processor.
    Is anyone else experiencing this? Any ideas on how to resolve?
    Thanks!

    Daylight saving time started on the 9th, right? I'm not sure what a "server time sync problem" would be, but since my problem started before the DST switch, it doesn't seem that this could be the cause.
    Also, since I am able to copy to and from the server outside of InDesign without any problems, doesn't this indicate the server and my computer are communicating just fine?
    Thank you for your assistance!

  • Office 2007 saving documents over mapped network drive takes abnormally long

    On a new Windows 8.1 Pro 64-bit computer, I have installed Office 2007. When I attempt to save documents to a mapped network drive on a peer network (saving to a Windows 8.1 Pro 64-bit computer), I get abnormally long save times. However, when I save documents to my own computer, I have no issues, and I have no issues opening files over the mapped drive.
    Turning off the third-party Bullguard Internet Security antivirus and firewall makes no difference. Excluding the mapped drive folder as an exception makes no difference either.
    I have disconnected the mapped drive and re-mapped the network drive, but this makes no difference.
    I had no problem with the same software on a prior computer running the same Office 2007 version. Another computer on the network has no issues either.
    Please advise how to speed up saving documents (Word, Excel).

    Hi,
    Regarding the slow file-save issue, please try the suggestion below to modify the registry:
    Important
    This section, method, or task contains steps that tell you how to modify the registry. However, serious problems might occur if you modify the registry incorrectly. Therefore, make sure that you follow these steps carefully. For added protection, back up the registry before you modify it. Then you can restore the registry if a problem occurs. For more information about how to back up and restore the registry, see:
    http://windows.microsoft.com/en-US/windows7/Back-up-the-registry
    To start the Registry Editor, press Win + R, type "regedit.exe" in the box, and press Enter. Then add the following registry values on the machine:
    [HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Word\Options]
    "DisableRobustifiedUNC"=dword:00000001
    [HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Excel\Options]
    "DisableRobustifiedUNC"=dword:00000001
    I hope this can help.
    Regards,
    Melon Chen
    TechNet Community Support

  • HT2506 Every time I open an image or PDF file using Preview, the Mac takes a long time to load and opens every file that usually opens with Preview at the same time, after a noticeable delay. I would like to know why, and how I can avoid this.

    Every time I try to open a file like an image or a PDF, it opens by default using the Preview built-in application.
    Occasionally this is accompanied by a long delay, after which Preview opens every file in the Downloads folder and I can navigate through them with ease.
    I would like to know why the application has this delay; I really want to avoid it, because every time I want to open an image there is a risk it will take a minute or two.

    In the Finder, right-click on the iPhoto library and show package contents, find the folder named import (or importing), and drag it to the desktop. Do NOT make any other changes of any sort. Launch iPhoto and you will be fine. Open the import(ing) folder and, if there are any photos in it, you need to import them to iPhoto, then trash the import(ing) folder.
    LN

  • My Final Cut is running extremely slow after having it for 6 months. I use external hard drives and save nothing on my computer. It takes a long time to render now. Do you think something is wrong with the computer? Took it to the Mac store but they failed to help.

    My Final Cut is running extremely slow after having it for 6 months. I use external hard drives and save nothing on my computer. It takes a long time to render now. Do you think something is wrong with the computer? Took it to the Mac store but they failed to help me. I have a 13-inch MacBook Pro, 10.6.8, 8GB, 2.3 GHz Intel Core i5.

    What did Apple tell you about what they did (or did not) find here?
    See if working on the main disk is any faster.
    USB2 is glacial, as I/O interconnects go.  And USB2 disks have been known to fail, or sometimes to drop back to USB1 speeds, too.  FireWire 800 is substantially faster than USB2, but still not as fast as the internal drive connections can provide.
    Put another way, start varying your configuration, and see which configuration changes (still) have the problem, and which don't.

  • Excel 2007 and 2010 slow to save to network drive with Windows 7

    I know this issue has been passed around and I know that there are some fixes out there. I will lay out my issue and resources, what I've tried and the things I've noticed.
    I have a mix of Server 2003 32-bit and 64-bit and Server 2008 64-bit. The desktop/laptop environment is mainly XP (no issues saving with an XP machine), a few Win 7 computers, and a dozen thin clients running off the 2008 TS. The slow saving issue happens with either a Win 7 machine or a client terminal served into 2008. It takes 60 seconds or more to save or "save as" an Excel file to a network drive. The same thing happens with any Office product. There is no issue saving to the local machine. If you log into either Win 7 or the TS with the domain admin account, saving takes no time at all. Only as a domain user does it hang.
    I have tried the SMB2 trick, the 981711 hotfix, turning off indexing, and starting the WebClient service. I am probably forgetting a trick or two that I've seen online from various people having the same issue. The point is, though, the issue remains, and besides being utterly annoying, it is preventing me from rolling Windows 7 out to my employees. Please help. Thank you in advance for any advice posted.

    Hi,
    Have you tried creating a new .txt file with Notepad to test this issue? If the issue also happens with a .txt file, you should check your network.
    And I suggest all users disable their anti-virus to see if the problem goes away.
    There are other things to try if the problem is evident in all of the documents you open. When Excel is not running, make sure you delete all the temporary files that Excel may have left lying around various folders in your system. Look for files that end with the .TMP extension, or files that begin with the tilde (~) character. If your file folders become very cluttered with these temporary files, it can slow Excel down immensely.
    Another thing to try is to start Excel with the /a switch on the command line. This causes it to load without also loading startup files such as add-ins and macros. If this fixes the problem, then you can bet that the slowdown is caused by one of those add-ins or macros.
    You might also rename the Normal.dot (or Normal.dotm) file; if it is corrupted, it can slow down response times. (Renaming the file causes Word to create a clean, fresh, empty one the next time you start the program.)

  • Takes a long time to select files when importing

    When I am selecting files through the import function, it takes a long time (sometimes 5 minutes) for the computer to allow a file to be selected. I have just reformatted the drives and did a fresh install of all applications. I am running OS 10.4.4 on a G5 2.7. This happens regardless of whether I am selecting a file on an external drive (FireWire 800) or an internal drive. Any clue, anyone? Please help. Thanks

    What are you importing?
    Shane

  • Can't preview files from a network drive to a local CF9 server.

    Hi,
    I have the following set up:
    CF9 Local Dev version
    CF Builder 2.
    Project files are on a network drive, N:\project.
    I have RDS set up, and everything seems to work OK: I can view DSNs, files, etc. on my local server.
    In the URL prefix section I even created a mapping:
    Local Path: N:\project,
    URL prefix: http://localhost:8500/project/
    There is no FTP set up on the CF9 server.
    The problem is that when I try to preview a file that I am working on from N:\project, it doesn't show on my local server. The previewing part works if I manually move the file to CF9; it is the automatic uploading (or moving) of the file to the local CF9 server that doesn't seem to work.
    BTW, I have a similar setup in DreamWeaver, where I am editing the same project (or site, in this case) off of N:\project and uploading to the same local CF9 server through RDS, and it works fine.
    I know that if in CF Builder you move the project to under the web root it will work but that would not work for us, since we need to keep the project source files on a network drive for sharing/backup purposes.
    Has anyone been able to successfully preview source files from a network drive on a local CF9 server?
    Thanks in advance,

    Hi again,
    After doing some more googling, I realize that for files to be previewed they MUST be under wwwroot. This wasn't too clear in the CF documentation.
    So my new question is:
    Is there a way, once I save a file, to automatically copy it from one location (N:\project) to another (c:/ColdFusion9/wwwroot/project)?
    I think there is a way to do it with Ant or some sort of Eclipse feature, but I am not too familiar with either. Could someone point me in the right direction?
    Thanks,
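    The copy-on-save step asked about above could also be sketched in plain Java (7+) with java.nio's WatchService, rather than Ant: watch the shared project folder and mirror each saved file into the webroot. The paths below are the ones from the post, and the flat (non-recursive) layout is an assumption for illustration:

```java
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class MirrorToWebroot {

    // Copy one saved file into the webroot, preserving its path
    // relative to the watched folder.
    static void copy(Path srcRoot, Path dstRoot, Path changed) throws IOException {
        Path target = dstRoot.resolve(srcRoot.relativize(changed));
        Files.createDirectories(target.getParent());
        Files.copy(changed, target, StandardCopyOption.REPLACE_EXISTING);
    }

    // Watch srcRoot and mirror every created/modified file into dstRoot.
    // Note: this watches the top-level folder only; registering
    // subdirectories recursively is left out for brevity.
    static void watch(Path srcRoot, Path dstRoot) throws IOException, InterruptedException {
        WatchService ws = FileSystems.getDefault().newWatchService();
        srcRoot.register(ws, StandardWatchEventKinds.ENTRY_CREATE,
                             StandardWatchEventKinds.ENTRY_MODIFY);
        while (true) {
            WatchKey key = ws.take(); // blocks until something is saved
            for (WatchEvent<?> ev : key.pollEvents()) {
                Path rel = (Path) ev.context();
                copy(srcRoot, dstRoot, srcRoot.resolve(rel));
            }
            key.reset();
        }
    }

    public static void main(String[] args) throws Exception {
        // Paths from the post; both are assumptions about the local layout.
        watch(Paths.get("N:\\project"), Paths.get("C:\\ColdFusion9\\wwwroot\\project"));
    }
}
```

    Left running in the background, this keeps the wwwroot copy current on every save, so previews hit the freshly mirrored file.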

  • Lightroom has trouble reading RAW files on my network drive ("The file appears to be unsupported or damaged")

    Hey guys, I currently run LR5 on Mac OS X. I've had my RAWs on an external HDD and have smart previews built for those files. Recently I upgraded to a new router, the ASUS RT-N65U, and plugged my WD My Passport Ultra HDD (FAT32 filesystem) into it. Now I've put my RAWs on the network drive, and verified via MD5 checksums that the files are being read correctly (albeit rather slowly) from the network drive.
    However, whenever I try to view one of the photos on the network drive, Lightroom complains, "The file appears to be unsupported or damaged".  I've verified via MD5 that the files are not damaged.  Does anyone know how I can get LR to be happy with the files on my network drive?
    (Bonus question: how do I make my network drive faster?)
    Thanks!

    Got RAWs from a D7000 and D600, LR v5.2 [922700].
    Oddly, things are working now... I ejected the network drive in Finder and reconnected to my router. Lightroom no longer has problems viewing the photos, even after I closed my laptop and reopened it. I think the problem might occur when my laptop goes to sleep for a long time and I reconnect. Will report back when I try that.
    I'm using an external HDD attached to my router. The reason I don't plug the external HDD into my laptop directly is that I don't want to keep doing it every time I want to access my photos, and my laptop has only limited disk capacity since it's an SSD.

  • Can I "cache" large RAW files imported from network drive on local drive?

    I've seen discussions similar to this topic, regarding "offline" editing, but not exact answers I'm looking for.
    I'm copying RAW files from a shoot onto my network RAID drive in a per-album folder, from where I then import into a new LR catalog (I'm using LR 4.1) on my Windows laptop for that album.
    I'm finding that editing RAW files on a network drive from a wireless laptop is awfully slow (not to mention not being able to continue the work offline somewhere else), so I'd like to cache those files on my speedy SSD drive just until I'm done editing and exporting/publishing that album.
    I back up my per-album catalog onto the network folder alongside the RAW files since that's my IT-managed master repository. When I'm done editing, I'd purge the cached files and just keep the catalog previews on my limited capacity SSD.
    So, is there a way to tell LR, for one/all files imported from a location (in my case a network folder), to look in an alternate location for the identical files? Since I create per-album catalogs in per-album folders, there wouldn't be any filename clashing. Either LR can provide this caching behavior on its own, or I'd manually copy files from the network folder to a local folder and tell LR to look there first.
    I don't really want to make copies of files into secondary folders within LR since then it's a hassle to merge edits on the cached copy to its master copy (I haven't done this, but I'd imagine so).
    Thanks for your workflow tips.
    Erhhung

    Have you tried locating your ACR Raw cache on this SSD drive, also ensuring that the size limit of this cache is large enough, to keep a goodly chunk of these Raw images all in play at the same time?
    My understanding is that for performance reasons, LR will be caching your Raw data locally in any case - on conventional hard disk by default. And it will be doing so in an intermediate half-converted form (demosaiced into "sensor RGB space" but not translated into "picture RGB values" - the latter process varies depending on your live Develop adjustments).
    So long as this cache has not run short, and reallocated space for something else, LR should (as far as I know) not be needing to go back to the network drive to re-fetch actual Raw content. And keeping that cache somewhere fast rather than somewhere slow, should improve responsiveness.
    I imagine, though, that LR will continually check the availability and the unchanged timestamp of the real source file; and wouldn't let you carry on working from automatically cached data otherwise.

  • Saved files take a long time to appear on Server or Desktop

    We have a network of nearly 40 Macs: Panther Server, mostly Panther clients, but the newer machines are on 10.4.
    All of the newer machines exhibit the problem: files saved to the server or to the desktop take a long time to appear.
    It appears to be a refresh issue, as other clients can see the file on the server straight away, and locally the files can be seen on the desktop via the Terminal.
    Force-quitting the Finder forces the files to appear, but this is not a solution.
    The clients which have the problem all run 10.4 and the server is 10.3. The 10.3 clients do not exhibit the same issues.

    I have also seen this issue at one of my larger clients (50+ workstations).
    The issue seems isolated to 10.4.5 or higher; our workstations with 10.4.2 did not exhibit this problem. It is not an issue with hardware (we have seen this on an iBook G3 and various G5 tower models). Nor does it appear to be server-related: we have seen the issue saving locally with no servers mounted. We have seen the issue when saving from Adobe CS apps, MS Office apps, and system apps (like TextEdit). Deleting com.apple.finder.plist from the user's home Library and then relaunching the Finder brings relief for a short time, but the problem returns soon after. plutil reports the file is OK. The volume directory is fine, the permissions check out, and caches have been purged, to no avail.
    Like you, summersault, I am able to see the saved file on the command line, and when my users Force Quit the Finder, the file appears. Other workarounds, like the Nudge contextual menu, do not work.
    I have submitted the error via apple.com/feedback and would encourage you to do the same.
    Anxiously awaiting 10.4.7...
    -B
      Mac OS X (10.4.6)  

  • My MacBook Air died. After I reinstalled from the DVD drive and booted up again, I saw a prohibitory sign on the screen, then got to my logon screen. After logon, it takes a very long time to load any application. Please help.


    OK. If you believe that.
    When the disk is starting to die it can be bad for a moment and then be good for a while. If there was no problem while you were checking it of course it shows up good.
    I tell you those are the signs of a disk that is about to die.
    Allan

  • I am trying to delete an XLS file from a network drive and am getting a message that says the file cannot be deleted because it is in use.  I can't see where it is in use and have tried to reboot the computer to no avail.

    I am trying to delete an XLS file from a network drive and am getting a message that says the file cannot be deleted because it is in use.  I can't see where it is in use and have tried to reboot the computer to no avail.  I can delete other files but several XLS files are giving me this message.

    You said it was a network drive. So is someone else on the network using that same file, or might someone have left with Excel still accessing it on their system?

  • AP Trial Balance - File is massive and takes a long time to open up

    Hello,
    We are upgrading from 11.5.10 to 12.1.3 and are running into some serious problems with the Accounts Payables Trial Balance report in Detail mode.
    1. The Report takes upwards of 10 minutes to open up - and in many cases does not.
    2. On further research we see that the report output file is > 500MB whereas comparable 11i Trial Balance files are only 19MB.
    3. We tried running in various formats Text, PDF, HTML with no luck.
    4. In fact HTML and Text formats gave “Authentication failed” errors when we tried to open the report.
    My questions are:
    1. Why is there a such a large increase in file size? Is this expected going to R12?
    2. Is there a way to speed up the opening of the large files or create smaller files? - (summary mode, and smaller date ranges are not options as per business needs). Even a 2 month date range give the same problems.
    3. We noticed that the XLA_TRIAL_BALANCES table has 36,521,125 records...LARGE.
    4. We compared the ap_liability_balance records between 11i and R12. We have 29,171,382 in 11i and 37,769,034 in R12. But this is expected as we purge regularly in 11i production and I don't think this difference (about 30%) in the size of the tables should cause such a large difference in the size of the report files (nearly 2000%!!!).
    5. Why are we getting Authentication failed errors? Just for these reports?
    Of course we opened an SR, but you know how SRs go :-) and this is a time critical project with our deadlines nearing.
    Any help or pointers from the experts here is greatly appreciated.
    Thanks
    Edited by: user11992646 on Jan 27, 2012 2:12 AM

    Hi Hussein,
    Thanks for your prompt reply and my apologies for the substantial delay in mine.
    We did some trouble shooting since then.
    We resolved the problem by choosing PDF format instead of the seeded RTF format. This opened the output in a couple of minutes, because the PDF-formatted output was about 50MB vs. the RTF output, which was 300MB+.
    But the ORIGINAL problem still remains: the XML-tagged .out file is still huge, with a size of nearly 1GB!
    My earlier questions here now boil down to the following questions.
    Q1. Is the huge XML Publisher .out file expected in R12? It really is massive! The text file from 11i was just 19MB.
    Q2. Oracle does not seem to offer the option to format the output in plain text; we see PDF and RTF, but no plain ugly text. I would expect a plain text file to be much smaller. How can we get the output in plain text?
    1. Yes, our stats are up to date. Let me emphasize that we do not have a performance problem with the report itself. The issue we are having is opening the output.
    2. The authentication error is resolved. It was caused by formatting the reports incorrectly.
    3. We did not use the usual tuning procedures, such as tkprof, because we do not have a performance problem.
    4. We checked the OPP log files and there are no errors. I reiterate: the report does complete without issues. It is just that the output takes a long time to open in RTF.
    I would appreciate it if others could share their experiences: have you faced this situation, and how did you handle it?
    Thanks in advance for your time!
