Downloading large files with a time-limited dial-up connection

My dial-up account limits how long I can stay connected, in this case 8 hours. That's not enough time for me to download the updates needed to bring my iMac up from Mac OS X 10.4.7. How can I get around this problem? There is no broadband available where I live. Thanks.

Hi, somefrank. Welcome to the Discussions.
Here are some ideas for obtaining large updates if you do not have broadband or your ISP restricts the size or duration of your downloads:
• If there is an Apple Retail Store in your area, visit them and bring a suitably sized disc (CD-R, CD-RW, DVD-R/RW) with you, or buy a pack there, and they'll let you download and burn the required Delta or Combo Update in the store. All the stores have broadband.
• If you're not near an Apple Retail Store, perhaps a sympathetic, local Apple Authorized Service facility with broadband would do the same. See "How to find your nearest Apple-Authorized Service Provider (AASP)" to determine if there's an AASP in your area.
• If you have a friend who is using Mac OS X and has broadband, work with them to download the Standalone version of the desired Delta or Combo Update and burn it to CD or DVD. Once you have a CD/DVD with the Standalone Update, copy the Standalone Update to your desktop and run it: it cannot be run from an optical disc.
Note that some of the information above is from page 477 of the "Software Update" chapter of my book, Troubleshooting Mac® OS X, Tiger Edition.
Good luck!
Dr. Smoke
Author: Troubleshooting Mac® OS X
Note: The information provided in the link(s) above is freely available. However, because I own The X Lab™, a commercial Web site to which some of these links point, the Apple Discussions Terms of Use require I include the following disclosure statement with this post:
I may receive some form of compensation, financial or otherwise, from my recommendation or link.

Similar Messages

  • Airport Extreme with Ubee D3.0 - cannot download large files.

    Hello all Macoids!
    Here is my issue...
    I am on Charter's cable network using Charter's provided Ubee D3.0 modem. It is connected to an AirPort Extreme "main base" via Ethernet cable. That AirPort Extreme "main base" distributes Wi-Fi throughout the apartment to other AirPort devices. And it all works.
    However...
    I cannot download large files over Wi-Fi using the iMac, or even while connected to the AirPort Extreme main base via Ethernet cable on my MacBook. What is interesting is that inside iTunes I can download large files such as TV show episodes just fine, but an update for an iPhone or an iPad will stop after 10 or 100 MB with the error message "Unknown Error 9006". Interestingly enough, if I drag the Download window in iTunes around while it is downloading the 1.4 GB iOS update file for the iPhone 5, it completes fine. So it looks like the AirPort main base (or whichever Mac is used) loses the connection to the download server or the Ubee D3.0 modem unless I constantly "renew" it by dragging the Download window around.
    That is quite annoying to do for 15-20 minutes… The same thing happens on the MacBook Pro that is actually connected to the AirPort Extreme main base via Ethernet cable...
    MacBook running OS X 10.9
    iMac running OS X 10.8.5
    Same thing.
    I would have called Charter, but knowing well that they usually have no idea what is up, I thought I'd ask here whether anyone has the same problem….
    Any smart suggestions, even ones not based on experience?
    Anything appreciated very much!

    I ended up reformatting with HFS and the problem was solved, sort of.  The AEBS can now handle large files.  But that allowed me to expose yet another serious firmware bug (version 7.6.1), namely that if you use "account" style fileshares, the fileshares are no longer reliably accessible, and frequent rebooting of the AEBS is needed to bring them back.  A quick test for whether this has happened is to attempt, at ten-minute intervals, to create a file on a read-write share.  You'll find it can work for up to a few hours, but at some point it will fail.  This makes the AEBS essentially unusable for account-based fileshares.
    With firmware 7.5, I'd noticed a variation of this problem, which was that editing the fileshare permissions on the AEBS sometimes resulted in a corruption of the fileshare rights.  When this happened, you needed to reinitialize the AEBS completely.  So I hoped that in 7.6 they fixed the problems.  They fixed that one but added the new one.
    For now, the workaround seems to be using a device-based password for the fileshares, and forgetting about account-based shares.  The huge problem presented by this approach is that all users have full access, so I await Apple's next attempt at stable firmware with great anticipation.  If only they had a beta test program, other than their users, we would not be in this near-constant state of working around serious bugs.

  • Help with downloading a large file (~50 MB) over HTTP

    I'm just using the default HttpURLConnection to download a large file from a server, I can never finish downloading before I get this exception:
    java.net.SocketException: Connection reset
         at java.net.SocketInputStream.read(SocketInputStream.java:168)
         at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
         at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
         at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
         at sun.net.www.MeteredStream.read(MeteredStream.java:116)
         at java.io.FilterInputStream.read(FilterInputStream.java:116)
         at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2446)
         at sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:2441)
             ...
    The file is on an IIS web server; it's a static file, not a server script, web service, or anything special. I can usually download 10-30 MB before this exception occurs. I'm using Java 1.6.0_11 on Windows XP. Does anyone have suggestions or experience downloading large files with Java?

    Thank you, everyone, for your suggestions. I tried wget and it worked fine, but I tried a couple of other clients and they were failing too. I assume wget succeeded because it has resume capabilities, so I debugged the server and found the issue: it was a connection problem on the server end.
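    Since resume capability turned out to matter here (wget succeeded where a plain single-shot HttpURLConnection download kept failing), a sketch of resuming a partial download with an HTTP Range header may be useful. This is a minimal sketch, not production code: the server must support byte ranges (it answers 206 Partial Content when it does), and it uses Java 7+ try-with-resources syntax.

    ```java
    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ResumeDownload {
        // Build the Range header value for resuming: "bytes=<already-downloaded>-"
        static String rangeHeaderFor(File partial) {
            long have = partial.exists() ? partial.length() : 0;
            return "bytes=" + have + "-";
        }

        // Download url to dest, resuming from whatever is already on disk.
        static void download(String url, File dest) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("Range", rangeHeaderFor(dest));
            // 206 = server honoured the range, so append; 200 = full body, start over
            boolean append = (conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL);
            try (InputStream in = conn.getInputStream();
                 OutputStream out = new BufferedOutputStream(new FileOutputStream(dest, append))) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) >= 0) {
                    out.write(buf, 0, n);
                }
            }
        }

        public static void main(String[] args) throws IOException {
            // Offline demo of the header logic only (no network needed)
            File partial = File.createTempFile("resume-demo", ".bin");
            FileOutputStream fos = new FileOutputStream(partial);
            fos.write(new byte[100]);
            fos.close();
            System.out.println(rangeHeaderFor(partial)); // bytes=100-
            partial.delete();
        }
    }
    ```

    On a time-limited or flaky connection, you would call download() repeatedly; each run picks up where the file on disk left off.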

  • iMac disconnects when downloading large files

    The internet on my iMac is a wired connection, hooked up through my router. It works perfectly fine and never disconnects while I'm browsing, playing games, or anything like that. When I download small files, things are still fine. Whenever I download something that takes more than a few minutes, my connection will shut itself off like clockwork. It then takes a minute or two to kick back in and resume, but then after another few minutes it will disconnect itself again. It isn't my ISP, since my laptop (Windows 7) downloads fine on this same network. My iPhone and iPad also have no problem downloading large files on this network either. What could be causing this strange issue with my iMac?

    Please read this whole message before doing anything.
    This procedure is a diagnostic test. It’s unlikely to solve your problem. Don’t be disappointed when you find that nothing has changed after you complete it.
    The purpose of this exercise is to determine whether the problem is caused by third-party system modifications that load automatically at startup or login. Disconnect all wired peripherals except those needed for the test, and remove all aftermarket expansion cards. Boot in safe mode and log in to the account with the problem. The instructions provided by Apple are as follows:
    Be sure your Mac is shut down.
    Press the power button.
    Immediately after you hear the startup tone, hold the Shift key. The Shift key should be held as soon as possible after the startup tone, but not before the tone.
    Release the Shift key when you see the gray Apple icon and the progress indicator (looks like a spinning gear).
    Safe mode is much slower to boot and run than normal, and some things won’t work at all, including wireless networking on certain Macs.
    The login screen appears even if you usually log in automatically. You must know your login password in order to log in. If you’ve forgotten the password, you will need to reset it before you begin.
    Test while in safe mode. Same problem(s)?
    After testing, reboot as usual (i.e., not in safe mode) and verify that you still have the problem. Post the results of the test.

  • Best way to download large files direct to disk?

    Hi,
    I am writing a process that will download a file (in the
    background, no GUI interaction) and will save it onto the users
    drive. Once it is saved to disk they will be notified.
    I don't want to use FileReference as that prompts the user, I
    also don't want to just use UrlStream direct into a ByteArray
    because the files can be quite big and I don't want to hog memory.
    What would you recommend I do? Is the only option to simply
    do an urlstream.readBytes() into a smaller temporary ByteArray and
    then immediately write those bytes to disk, rather than appending
    to an overall larger byte array that only gets written once the
    entire file is saved?

    I think it might be your lucky day; I think I just saw the article you need.....
    http://jarinheit.com/2008/06/19/downloading-large-files-in-adobe-air-with-flex/
    ....you didn't say what you wanted to do it in, and well, I'm trying to learn JS and all this, so I converted it....
    quote:
    <html>
    <head>
    <script src="AIRAliases.js"></script>
    <script>
    var urlString = "http://www.adobe.com/lib/com.adobe/template/gnav/adobe-hq.png";
    var urlReq = new air.URLRequest(urlString);
    var urlStream = new air.URLStream();
    var fileStream = new air.FileStream();
    urlStream.addEventListener(air.Event.COMPLETE, loaded);
    urlStream.addEventListener(air.ProgressEvent.PROGRESS, writeFile);
    var file = air.File.desktopDirectory.resolvePath("testing.png");
    fileStream.openAsync(file, air.FileMode.WRITE);
    urlStream.load(urlReq);
    function writeFile(event) {
        // only write every 50k or so downloaded
        if (urlStream.bytesAvailable > 51200) {
            // Read the buffer into a ByteArray and write it to disk
            var data = new air.ByteArray();
            urlStream.readBytes(data, 0, urlStream.bytesAvailable);
            fileStream.writeBytes(data, 0, data.length);
        }
    }
    function loaded(event) {
        // Write any remaining data to the file before closing it
        var data = new air.ByteArray();
        urlStream.readBytes(data, 0, urlStream.bytesAvailable);
        fileStream.writeBytes(data, 0, data.length);
        fileStream.close();
    }
    </script>
    </head>
    <body>
    </body>
    </html>
    ...and it downloads it in the background.
    Hope it helps.

  • Can I download large files a bit at a time

    I'm on dialup and I'm wondering if it is possible to download large files, such as security updates, in small bits like you can do from a P2P network.
    Can I adjust a preference, download a utility, buy some additional software?
    Thanks in advance for any suggestions.
    G5 iSight   Mac OS X (10.4.2)  

    Thanks Niel for the link. I downloaded that program, and then, knowing what to look for, located a few others as well.
    I'm sure they all will allow me to download large files, but how do I get these programs to press the button that says "Download"? I go to the page that has the "Download" button, copy the web address for that page, paste it into the program, and tell it to download, but of course all it does is copy the page.
    I know there must be an easy way to do it, but after a few hours of trying I can't fathom it.
    How do you find the web address of the actual file?
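    The "Download" button is usually just an HTML link, so one approach is to pull the direct file URL out of the page's source and feed that to the download manager. A rough sketch in Java; the page snippet and the list of file extensions are made up for illustration, and a regular expression is only a quick-and-dirty substitute for a real HTML parser:

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class FindFileLink {
        // Collect href values that point straight at a downloadable file
        // (the extension list here is just an example)
        static List<String> fileLinks(String html) {
            List<String> links = new ArrayList<String>();
            Matcher m = Pattern.compile(
                    "href=[\"']([^\"']+\\.(?:dmg|zip|pdf|gz))[\"']",
                    Pattern.CASE_INSENSITIVE).matcher(html);
            while (m.find()) {
                links.add(m.group(1));
            }
            return links;
        }

        public static void main(String[] args) {
            // hypothetical page source behind a "Download" button
            String page = "<a href=\"http://example.com/updates/SecUpd.dmg\">Download</a>";
            System.out.println(fileLinks(page)); // prints the direct URL(s)
        }
    }
    ```

    Alternatively, most browsers let you right-click (or control-click) the "Download" button and choose "Copy Link", which gives you the same direct URL without any code.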

  • Firefox doesn't reconvert special characters in file names when downloading a file with special characters in its name

    <blockquote>Locking duplicate thread.<br>
    Please continue here: [/questions/815207]</blockquote><br>
    If I try to download a file with any special characters in the file name (e.g. File_Name.pdf), Firefox doesn't reconvert them after the "sanitize URL" process and downloads the file with an incorrect name (e.g. File%5FName.pdf).
    This is really annoying.
    Thank you for your patience.

    Start Firefox in <u>[[Safe Mode]]</u> to check if one of the extensions is causing the problem (switch to the DEFAULT theme: Firefox (Tools) > Add-ons > Appearance/Themes).
    * Don't make any changes on the Safe mode start window.
    * https://support.mozilla.com/kb/Safe+Mode
    * [[Troubleshooting extensions and themes]]

  • TS1368: When downloading large files such as videos of a television season, do you need to turn off Auto-Lock to prevent the download from halting?

    When downloading large files such as videos of a television season, do you need to turn off Auto-Lock to prevent the download from halting? And if so, if the download will take a significant amount of time (not sure how much offhand), how do you prevent burn-in from a prolonged screen image?

    Among the alternatives not mentioned... using a TiVo DVR rather than the X1; a Roamio Plus or Pro would address both the concern over the quality of the DVR and provide the MoCA bridge capability the poster so desperately wanted the X1 DVR to provide. (Although TiVos support only MoCA 1.1.) Just get a third-party MoCA adapter for the distant location. Why the hang-up on having a device provided by Comcast? This seems especially ironic given the opinions expressed regarding payments over time to Comcast. If a MoCA 2.0 bridge was the requirement, they don't exist outside providers. So couldn't the poster have simply requested a replacement XB3 from the local office and configured it down to providing only MoCA bridging, and perhaps acting as a wireless access point? Comcast would bill him the monthly rate for the extra device, but such is the state of MoCA 2.0. Much of the OP sounds like frustration over devices providing capabilities the poster *thinks* they should have.

  • Large file with fstream with WS6U2

    I can't read large files (>2 GB) with the STL fstream. Can anyone do this, or is this a limitation of the WS6U2 fstream classes? I can read large files with low-level C functions.

    I thought that WS6U2 meant Forte 6 Update 2. As for more information: the OS is SunOS 5.8, and the file system is NFS-mounted from an HP-UX 11.00 box and is largefile enabled. My belief is that fstream does not implement access to large files, but I can't be sure.
    Anyway, I'm not sure what you mean by the compiler's access to the OS support for largefiles, but as I mentioned before, I can read more than 2 GB with open() and read(). My problem is with fstream. My belief is that fstream must be largefile enabled. Any ideas?

  • HT5701 Cannot download pdf files with Safari 6.0.4?

    Cannot download pdf files with Safari 6.0.4

    Back up all data.
    Triple-click the line of text below to select it, then copy the selected text to the Clipboard (command-C):
    /Library/Internet Plug-ins
    In the Finder, select
    Go ▹ Go to Folder
    from the menu bar, or press the key combination shift-command-G. Paste into the text box that opens (command-V), then press return.
    From the folder that opens, remove any items that have the letters “PDF” in the name. You may be prompted for your login password. Then quit and relaunch Safari, and test.
    The "Silverlight" web plugin distributed by Microsoft can also interfere with PDF display in Safari, so you may need to remove it as well, if it's present.
    If you still have the issue, repeat with this line:
    ~/Library/Internet Plug-ins
    If you don’t like the results of this procedure, restore the items from the backup you made before you started. Relaunch Safari again.

  • Unable to download large files via my Mobile Network

    I have unlimited mobile data but I am unable to download large files, like podcasts unless I connect to Wi Fi. My settings are set to use mobile data, is there anything I can do?

    There is a 50 MB (recently upgraded from 20 MB) per-file download limit over cellular data. No, you can't change that.

  • When downloading large files they stop at approx 7 MB

    When I download large files, they stop at approx 7 MB. They can be paused and restarted; they then carry on and stop again at other random sizes. Repeating the pause/restart enables the download to complete.
    What can be done to fix this?
    Cheers
    Paul

    I think I may have solved this. I needed to turn off RFC 1323 by adding it to sysctl.conf.
    If sysctl.conf doesn't exist in /etc, you just create it; use vi or pico.
    Search the discussion forums here for references to speedguide.net and make sure you see something like the following...
    Default TCP Receive Window (RWIN) = 65535
    RWIN Scaling (RFC1323) = 0 bits
    Unscaled TCP Receive Window = 65535
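    For reference, on older Mac OS X releases the setting described above would look something like this in /etc/sysctl.conf. The key name comes from the BSD network stack; verify it exists on your own system with `sysctl -a` before relying on it:

    ```shell
    # /etc/sysctl.conf -- disable TCP window scaling (RFC 1323)
    net.inet.tcp.rfc1323=0
    ```

    The same setting can be applied immediately, without a reboot, with `sudo sysctl -w net.inet.tcp.rfc1323=0`; the sysctl.conf entry just makes it persist across restarts.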

  • After some running time I cannot download large files via any browser.

    I am running OS X 10.9.2 on a 27inch iMac 2013 model.
    If I keep it running (without sleeping) for some time, perhaps a week, I can no longer download large files in any browser. The download will start, and after a very short time drop to 0 B/s, stall there, and in the end abort with an "Unknown network error" or something similar.
    Has anyone seen similar problems?

    Open the router's setup page at http://192.168.1.1 .....you will see username & password fields.....leave the username blank & for the password use admin...............
    Click the Status tab..........check the firmware version.......download the latest firmware from http://linksys.com/download..........
    Update the firmware....reset the router for 20-30 seconds........later on, reconfigure the router........

  • Problem downloading large pdf with Safari

    Has anyone else had this problem? When clicking on a link to a large PDF file (1 MB), I get a blank screen in Safari. When I try with Firefox it's fine; it's only with Safari. Is there anything I need to change to allow downloads of larger PDF files in Safari?

    1 MB isn't a particularly large file at all. Can we assume that you've verified that Safari will open smaller PDFs?
    Also, you haven't mentioned which Adobe product and version you are using (it may also help to give the version of Safari).

  • Problem unzipping larger files with Java

    When I extract small zip files with java it works fine. If I extract large zip files I get errors. Can anyone help me out please?
    import java.io.*;
    import java.util.*;
    import java.net.*;
    import java.util.zip.*;

    public class updategrabtest {
        public static String filename = "";
        //public static String filesave = "";
        public static boolean DLtest = false, DBtest = false;

        // update
        public static void main(String[] args) {
            System.out.println("Downloading small zip");
            download("small.zip"); // a few k
            System.out.println("Extracting small zip");
            extract("small.zip");
            System.out.println("Downloading large zip");
            download("large.zip"); // 1 meg
            System.out.println("Extracting large zip");
            extract("large.zip");
            System.out.println("Finished.");
            // update database
            boolean maindb = false; // database wasn't updated
        }

        // download
        public static void download(String filesave) {
            try {
                BufferedInputStream in = new BufferedInputStream(
                    new URL("http://saveourmacs.com/update/" + filesave).openStream());
                BufferedOutputStream bout = new BufferedOutputStream(
                    new FileOutputStream(filesave), 1024);
                byte[] data = new byte[1024];
                int n;
                // write only the n bytes actually read, not the whole buffer,
                // otherwise the last block pads the file and corrupts the zip
                while ((n = in.read(data, 0, 1024)) >= 0) {
                    bout.write(data, 0, n);
                }
                bout.close();
                in.close();
            } catch (Exception e) {
                System.out.println("Error writing to file");
                //System.exit(-1);
            }
        }

        // extract
        public static void extract(String filez) {
            filename = filez;
            try {
                getZipFiles();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        // extract (part 2)
        public static void getZipFiles() throws IOException {
            //String destinationname = ".\\temp\\";
            String destinationname = ".\\";
            byte[] buf = new byte[1024]; // 1k
            ZipInputStream zipinputstream = new ZipInputStream(
                new FileInputStream(filename));
            ZipEntry zipentry = zipinputstream.getNextEntry();
            while (zipentry != null) {
                // for each entry to be extracted
                String entryName = zipentry.getName();
                System.out.println("entryname " + entryName);
                File newFile = new File(entryName);
                if (zipentry.isDirectory()) {
                    newFile.mkdirs();
                    zipentry = zipinputstream.getNextEntry();
                    continue;
                }
                FileOutputStream fileoutputstream = new FileOutputStream(
                    destinationname + entryName);
                int n;
                while ((n = zipinputstream.read(buf, 0, 1024)) > -1) {
                    fileoutputstream.write(buf, 0, n);
                }
                fileoutputstream.close();
                zipinputstream.closeEntry();
                zipentry = zipinputstream.getNextEntry();
            } // while
            zipinputstream.close();
        }
    }

    In addition to the other advice, also change every instance of..
    kingryanj wrote:
         catch (Exception e)
              System.out.println ("Error writing to file");
              //System.exit(-1);
    ..to..
    catch (Exception e) {
        e.printStackTrace();
    }
    I am a big fan of the stack trace.
