Backup to Time Machine over Wi-Fi is very slow!

Backup to Time Machine over Wi-Fi is very slow!
I set up backups with TM, but the backups are slow.
Can you help me?

alexrav wrote:
Backup to Time Machine over Wi-Fi is very slow!
I set up backups with TM, but the backups are slow.
Can you help me?
Yes, Wi-Fi is slow. If this is your first backup, it will be much faster if you connect to your Time Capsule or AirPort Express via an Ethernet cable.
Also try the suggestions in item #D2 of the Time Machine - Troubleshooting *User Tip* at the top of this forum.
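If you want to keep an eye on the backup while it runs, you can also check it from Terminal. A minimal sketch, assuming OS X 10.7 or later, where the tmutil utility is available (earlier systems do not have it):

    # list the configured backup destinations (Time Capsule, local disk, etc.)
    tmutil destinationinfo

    # show whether a backup is running and roughly how far along it is
    tmutil status

    # kick off a backup manually once you are connected over Ethernet
    tmutil startbackup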

Similar Messages

  • I am having a problem with my CS6 running very slowly, and when I save I get an error message that says, "This document is bigger than 2 GB (the file may not be read correctly by some TIFF viewers)." Please help

    I am having a problem with my CS6 running very slowly, and when I save I get an error message that says, "This document is bigger than 2 GB (the file may not be read correctly by some TIFF viewers)." Please help

    wen6275,
    It looks like you're actually using a camera or phone to take a photo of your monitor.
    That's not what is commonly known as a "screen shot". 
    It looks like you're on a Mac.  Hitting Command+Shift+3 places a capture of the contents of your entire monitor on your Desktop; hitting Command+Shift+4 will give you a cross-hairs cursor for you to select just the portion you want to capture.
    I'm mentioning this because I fear I'm not construing your original post correctly when you type "I am working with a large files [sic], currently it has 149 layers, all of which are high resolution photographs", so I'm wondering if there's some similar use of your own idiosyncratic nomenclature in your description of your troublesome file.
    Perhaps I'm just having a major senior moment myself, but I'm utterly unable to imagine such a file.  Would you mind elaborating?
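    For what it's worth, the same captures can be taken from Terminal with the built-in screencapture command; a minimal sketch (the file names are just examples):

        # capture the entire screen to a file on the Desktop
        screencapture ~/Desktop/whole-screen.png

        # -i gives the same cross-hairs selection as Command+Shift+4
        screencapture -i ~/Desktop/selection.png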

  • My iMac 2009 with Snow Leopard is very slow. What can I do? I'm French

    My iMac 2009 with Snow Leopard is very slow. What can I do? I'm French

    Patjan73 wrote:
    My iMac 2009 with Snow Leopard is very slow. What can I do? I'm French
    The first two things I recommend are to boot from your installer DVD or an external bootable drive and repair/verify your drive with Disk Utility (not Repair Permissions), and to do an SMC and PRAM reset.
    Also make sure you have at least 20GB to 25GB of free space on your boot volume for OS operations.
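    If you prefer Terminal, roughly the same checks can be run from the command line; a minimal sketch, assuming the boot volume is the one you want to look at (repairs still need to be done while booted from another volume):

        # read-only check of the boot volume's directory structure
        diskutil verifyVolume /

        # how much free space is left on the boot volume
        df -h /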

  • I updated my iPhone 4s to iOS 8 but it's very slow. Can I go back to iOS 7?

    I updated my iPhone 4s to iOS 8 but it's very slow. Can I go back to iOS 7?

    Is everything slow?  What apps are slow?
    Try a reboot.
    http://www.wikihow.com/Reboot-an-iPad

  • Mavericks install incomplete. No Apple installers on HD. Will I be able to upgrade to Yosemite, and will it fix all that didn't install with the Mavericks upgrade? Very slow connection speed; 30 hrs to download Mavericks. Tried re-install; no help.

    Ever since upgrading to Mavericks I have been having problems which are too numerous to mention. I finally found a program from a UC Berkeley professor called Pacifier which indicated that 4,000+ of 400,000+ files and folders did not install. I was unable to go further to try to fix everything because Pacifier can't find ANY Apple installers on my HD. It recommended I download the Mavericks installer without reinstalling to continue; however, after having done so (very slow connection; it takes 30 hours to download), and although the installer appears in Applications while downloading, it then disappears, and NO Apple installer packages can be found on my hard drive. I tried to download Yosemite once and it indicated my system wasn't eligible for the upgrade. Its download time was similar to Mavericks: 28+ hours.
    I am using a MacBook Pro mid-2012, OS X 10.9.5 (13F34); Processor: 2.5 GHz Intel Core i5; Memory: 4 GB 1600 MHz DDR3; Graphics: Intel HD Graphics 4000 1024 MB.
    Regardless of how many files and apps I remove, my memory is being eaten up to the point that I can hardly do anything at all. There are no civil words for how frustrated I am. Someone please help me.
    Thanks,
    JJ

    Install Mavericks, Lion/Mountain Lion Using Internet Recovery
    Be sure you backup your files to an external drive or second internal drive because the following procedure will remove everything from the hard drive.
    Boot to the Internet Recovery HD:
    Restart the computer and after the chime press and hold down the COMMAND-OPTION-R keys until a globe appears on the screen. Wait patiently - 15-20 minutes - until the Recovery main menu appears.
    Partition and Format the hard drive:
    Select Disk Utility from the main menu and click on the Continue button.
    After DU loads select your newly installed hard drive (this is the entry with the mfgr.'s ID and size) from the left side list. Click on the Partition tab in the DU main window.
    Under the Volume Scheme heading set the number of partitions from the drop-down menu to one. Click on the Options button, set the partition scheme to GUID, then click on the OK button. Set the format type to Mac OS Extended (Journaled). Click on the Partition button and wait until the process has completed. Quit DU and return to the main menu.
    Reinstall Lion/Mountain Lion/Mavericks: Select Reinstall Lion/Mountain Lion or Mavericks and click on the Install button. Be sure to select the correct drive to use if you have more than one.
    Note: You will need an active Internet connection. I suggest using Ethernet if possible because it is three times faster than wireless.
    This should restore the version of OS X originally pre-installed on the computer.
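    For reference, the partition-and-format step can also be done from the Terminal available in the Recovery menu; a minimal sketch, assuming the internal drive is disk0 and you want a single Mac OS Extended (Journaled) volume named Macintosh HD (this erases the drive, so only do it after backing up):

        # confirm which device is the internal drive
        diskutil list

        # one partition, GUID scheme, Journaled HFS+, using the whole drive
        diskutil partitionDisk disk0 1 GPT JHFS+ "Macintosh HD" 100%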

  • In OBIEE, retrieving base members along with their attributes is very slow

    If I use attributes in the prompts and not in the actual report columns, the report retrieval is very fast. But as soon as I add the base member, Attribute1, Measure1, Measure2, etc., the retrieval is slow. All the measures are dynamically calculated. Adding the attribute to the prompt works fine. How do I design an aggregation with the attributes? I have enabled query tracking. Any suggestions?

    I really recommend reading the DBAG section I linked in your other recent question: Re: ASO Cube with attributes very slow in retrieval
    For various classes of attribute queries, aggregations are not / can not be used - unfortunately this is just how ASO works. Personally, I have had to make what would logically be an attribute into a real, stored dimension because of this limitation.

  • Desktop with cross bars and very slow computer

    I have an early 2009 iMac running OS X 10.9.2 (13C64) that has become very slow to start up, to shut down, and to run the Adobe software. Also, the screen of my Mac shows cross bars. I have already cleaned with MacKeeper and already defragmented; I checked with Disk Utility and all is well. I have over 400 GB available.
    Please help

    Sounds like your display or GPU is failing. Back up any data you deem important and book an appointment at your nearest Apple Store or Apple Authorised Repair Centre asap. I would also get rid of MacKeeper (and anything else that purports to do the same) plus whatever you've used to 'defrag' your Mac. They are no use to you at all, and in reality most 'helper' applications just create problems as well as slowing your Mac down.
    Now what question do you have regarding OS X Lion Server?

  • Pacman -Syu with 3.4 is very slow

    Hi,
    I've got a problem with pacman 3.4.0-2: -Syu is very slow ...
    :: Starting full system upgrade... takes ~ 60 seconds to complete.
    sudo pacman -Syu
    :: Synchronizing package databases...
    core is up to date
    extra is up to date
    community is up to date
    :: Starting full system upgrade...
    there is nothing to do
    If I execute -Syu once and then run it again, it takes <5 seconds; after a reboot it's the same as before (60 s).
    I tried pacman-optimize and different mirrors, and I also deleted the cache, but it's still slow. What could be the problem? Pacman 3.3 was so fast ...

    Allan:
    Performed pacman-optimize yesterday... pacman 3.4.2-0
    Today have this time report:
    [root@n6re ~]# time pacman -Syu
    :: Synchronizing package databases...
    core is up to date
    extra is up to date
    community is up to date
    :: Starting full system upgrade...
    warning: vlc: local (1.1.0-1) is newer than extra (1.0.6-8)
    there is nothing to do
    real    0m2.010s
    user    0m0.260s
    sys     0m0.180s
    [root@n6re ~]#
    This indicates a problem I have been experiencing... no upgrades for two weeks!
    Vlc was built from abs extra and installed via pacman -U under the new pacman package.
    I am not sure what the statement "vlc package is newer than extra" means. Is it my listing of extra or the package in the Arch Linux repo that is referenced?
    Perhaps this conflict prevents any upgrade of other packages?
    I read that pacman -U operates in a new way with the new pacman package. I note that it seems to download dependencies whether or not they already exist in a file, such as a local cache dir.
    Is there a problem in the latest pacman related to the use of pacman -U with abs-generated packages?
    A previous post I made in this thread indicated 23 secs for pacman -Syu with the same message resulting. Thus, pacman-optimize was effective in reducing the time. But I still cannot upgrade with -Syu.
    Last edited by lilsirecho (2010-07-03 16:28:22)
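    Not an answer from this thread, but for what it's worth: when pacman warns that a local package is newer than the repo version, a plain -Syu simply skips that package rather than blocking other upgrades. A minimal sketch of the usual way to bring the local package back in line with the repos (downgrading vlc here), assuming your pacman accepts -u twice and that a downgrade is actually what you want:

        # the second -u allows "downgrades", i.e. replacing local packages that
        # are newer than the repo version (vlc 1.1.0-1 -> extra's 1.0.6-8)
        sudo pacman -Syuu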

  • iPhone turning off during phone call with iOS 4 and very slow performance!!

    Ever since the update, my phone has been switching itself off during a conversation.
    Has anyone else experienced this problem since the update???
    Do you have a solution? If so, please post it and help me!!!
    Also, the phone has been going very slowly.
    The keyboard appears 20 seconds after opening a new note, and there is overall slow performance.

    Thread here :
    http://discussions.apple.com/thread.jspa?threadID=2471090&tstart=15

  • Query combining spatial condition with regular condition works very slow

    I have a query like the following. The first part of the condition is an SDO_RELATE, and the others are just regular conditions. If I execute the query with those two conditions separately, they both work very fast, but when I combine them it's terribly slow (more than 5 minutes). I tried different combinations, and tried the /*+ ordered */ hint too. It still does not work. Any Oracle pro got an idea to optimize the query? Thanks a lot!
    select * from MM_ACCOUNT
    where SDO_RELATE(GEOLOC,
         (SELECT SDO_AGGR_UNION(MDSYS.SDOAGGRTYPE(geoloc, 5))
         from MM_METRO_FBR_BF
         where ID = 'cyecroi0o1'
         and CREATION_TIME='1026758969052')
         ,'mask=inside querytype=window') = 'TRUE'
    and VERTICAL_MKT_SUB_GRP_SIC in ('Retail','Air','Ground','Other','Water','Advertising','Entertainment/Recreation','Other (broadcast etc)','Publishing','Sports / Sports Club''s') and ENTERPRISE_CODE in ('Medium-Large','Medium-Small')
    Tim

    Hi, Daniel
    After I dropped the indexes for the vertical market and tried this query you suggested earlier:
    select /*+ ordered */ * from
    (select /*+ no_merge */
    SDO_AGGR_UNION(MDSYS.SDOAGGRTYPE(geoloc, 5)) bufferShape
    from MM_METRO_FBR_BF
    where ID = 'cyecroi0o1'
    and CREATION_TIME='1026758969052') aggr_geom,
    MM_ACCOUNT
    where SDO_RELATE(GEOLOC, aggr_geom.bufferShape,
    'mask=inside querytype=window') = 'TRUE'
    and VERTICAL_MKT_SUB_GRP_SIC in ('Retail--Retail','Air--Transportation','Ground--Transportation','Other--Transportation','Water--Transportation','Advertising--Media / Entertainment','Entertainment/Recreation--Media / Entertainment','Other (broadcast etc)--Media / Entertainment','Publishing--Media / Entertainment') and ENTERPRISE_CODE in ('Medium-Large','Medium-Small')
    IT DID work significantly faster!!!! Thanks a lot! However, I'm just a little bit confused here: why is the index actually slowing things down instead of making it faster?? And what's the "no_merge" doing here?
    Thanks for your great help again!
    Tim

  • Old i810 mobo with built in video very slow

    Hi All,
    It's been a while since I've posted here; I'm loving Arch all over again.  After distro hopping for a while this year I've settled on Arch once again.  I love the minimalistic and elegant approach, and learning about the configuration files and settings that keep the system going.  Anyway, I'm having one issue with an old P3 533 system that sits on an i810 motherboard with on-board video.  The system RAM reads 384 MB in the BIOS as well as when the "free" command is issued, which is telling me that the video chipset is getting zero RAM to work with.  Does anyone know if it's possible to adjust the RAM shared with the video using config files in Linux?  The BIOS does not have any option for this, surprisingly.  Thanks in advance
    -Pete

    Gusar wrote:
    I am quite sure that even on those old things, RAM for the graphics is allocated dynamically. I would search elsewhere for your problems. Which problems are those, exactly? And which driver are you using?
    And as for that 120 x 80, that's not a resolution, it's the number of characters. You shouldn't be using vga= anyway; you should be using KMS, which will automatically give you the native resolution.
    Gusar,
    My main issue was slow window rendering and movement, i.e., a very laggy feeling.  When Win XP was on this machine, windows were rendered much more smoothly.  Since I'm running Openbox, everything should be pretty snappy, but it isn't.  I've run Arch on multiple machines and this is the only one giving me the issue.  I'm not trying to make a big deal out of this; it just doesn't seem right, and I figured there may be an easy fix or something I may have missed.
    Regarding appending vga=771 to the kernel line, I followed the wiki beginner's guide and have been successful with all my other machines except for this one.  If I shouldn't be using this feature, then why is it in the beginner's guide?  I'm using xf86-video-intel as my driver and have followed the beginner's guide for all the xorg requirements.
    -Pete
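    Not from the thread, but as a rough sketch of what "use KMS instead of vga=" usually means on Arch of that era, assuming the chipset is actually handled by the i915 kernel driver (the very first i810/i815 parts use the older i810 DRM driver, which does not do kernel modesetting):

        # /etc/mkinitcpio.conf -- load the Intel DRM driver early so KMS sets
        # the panel's native mode at boot
        MODULES="... i915 ..."

        # rebuild the initramfs (the preset was called kernel26 on Arch back then)
        mkinitcpio -p kernel26

        # finally, remove any vga=... parameter from the kernel line in the
        # bootloader config so it does not fight with KMS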

  • InDesign problem with HTTPS cURL: seems very slow handshaking to the server the first time

    I am trying to connect to the server using cURL from InDesign, but when I call the server, InDesign takes a long time to open and initialize the socket: it takes 25 seconds, but only the first time; all subsequent calls are fast.
    But when I tried the same program in an empty C++ project it was very fast; we get a result back in 2 seconds.
    So the problem is with the InDesign socket initialization over HTTPS on the first call, because I have tried the same cURL code in a simple C++ program and it is fast on all calls.
    Specification:
    I am currently using
         InDesign CS7 version 9.0
         Visual Studio 2010
    Can anyone tell me why this happens?
    My code snippet is as follows; please take a look if anyone knows better.
        /* initialize libcurl once for the whole process */
        curl_global_init(CURL_GLOBAL_ALL);
        curl = curl_easy_init();

        struct data config;
        config.trace_ascii = 1; /* only used if CURLOPT_DEBUGFUNCTION tracing is enabled */

        /* build the request header list; curl_slist_append returns the new list head */
        struct curl_slist *chunk = NULL;
        chunk = curl_slist_append(chunk, "Content-Type: application/json");

        struct AppMemoryStruct chunk1;
        chunk1.memory = (char *)malloc(1);  /* will be grown as needed by realloc in the write callback */
        chunk1.size = 0;                    /* no data at this point */

        if (curl) {
            curl_easy_setopt(curl, CURLOPT_URL, url.GetPlatformString().c_str());
            curl_easy_setopt(curl, CURLOPT_SSLCERTTYPE, "PEM");
            curl_easy_setopt(curl, CURLOPT_SSLCERT, "C:\\test\\omg.aps.net.pem");
            curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L);
            curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 2L);
            curl_easy_setopt(curl, CURLOPT_KEYPASSWD, "");

            char error[CURL_ERROR_SIZE];
            curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, error);

            /* send all response data to this function */
            curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &AppWriteMemoryCallback);
            /* we pass our 'chunk1' struct to the callback function */
            curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)&chunk1);
            /* some servers don't like requests that are made without a user-agent
               field, so we provide one */
            //curl_easy_setopt(curl, CURLOPT_USERAGENT, "libcurl-agent/1.0");
            //curl_easy_setopt(curl, CURLOPT_HTTPHEADER, chunk);
            //curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);

            /* here the performance drops: this call takes around 25 seconds the
               first time the server is contacted over an https URL */
            res = curl_easy_perform(curl);
            /* till here */

            if (CURLE_OK != res) {
                errorCode = kFalse;
            } else {
                errorCode = kTrue;
                if (chunk1.memory)
                    returndata.SetCString(chunk1.memory);
            }

            /* always clean up */
            if (chunk1.memory)
                free(chunk1.memory);
            curl_slist_free_all(chunk);
            curl_easy_cleanup(curl);
        }
        return errorCode;
    Please try to find the key issue: why is the performance so bad over HTTPS?
    Thanks,

    I was tempted to move this over to the InDesign SDK forum where there are coders, but there isn't a lot of traffic over there, so I'm going to leave this open here with the advice that you probably won't get an answer from anyone here, because we are, for the most part, not folks who work with code beyond scripting (and very few scripters hang out here, either).
    You should cross-post over there yourself and hope someone who understands the question sees it in one place or the other.

  • Adobe form with 2D barcode moves very slowly in Adobe Reader 8

    I have created a 2D barcode form in Designer 8. The form is a two-page form with four barcodes to capture the data. The barcodes are on the second page. The form works fine if you open it in Adobe Pro, but it slows down and you see the hourglass appear once you move the mouse over any field when it is open in Reader. I have saved the form with no embedded fonts. Is there a setting I am missing for the barcode setup?
    Frustrated form designer.

    Kofax did send us some error checking to put in the code; see below.
    But when I tried it with raw.Value, without the field names it goes to one field that takes all the data. The workflow does not put the fields in the AC index fields because there are no field names.
    What will I need to do to modify the code?
    This is really getting very difficult and I have a deadline. Is it possible to do the connect web session as you suggested?
    *****Kofax Fix that was sent*****************
    The fix for this is to modify the sample Workflow Agent so that it ignores
    these blank fields in the tab-delimited output:
    For i = 0 To (iBarcodeFields - 1)
        If FieldName(i) <> "" Then
            Set oIndexField_Field = _
                oIndexFieldsElement.FindChildElementByAttribute("IndexField", _
                    "Name", FieldName(i))
            If Not oIndexField_Field Is Nothing Then
                oIndexField_Field.AttributeValue("Value") = FieldValue(i)
            End If
        End If
    Next i
    A check is added (the first If statement) to see if the current field name is an empty string - if it is not, the field is processed.

  • SQL query with parallel hint running very slowly

    I have a SQL query which joins three huge tables (given below).
         insert /*+ append */ into final_table (oid, rmeth, id, expdt, crddt, coupon, bitfields, processed_count)
         select /*+ full(t2) parallel(t2,31) full(t3) parallel(t3,31)*/
         seq_final_table.nextval, '200', t2.id, t3.end_date, '1/jul/2009',123,t2.bitfield, 0
         from table1 t1, table2 t2, table3 t3 where
         t1.id=t2.id and
         t2.pid=t3.pid and
         t2.vid=t3.vid and
         t3.end_date is not null and
         (trunc(t1.expiry_date) != trunc(t3.end_date) or trim(t1.expiry_date) is null);
         Below are some statistics of the three tables.
         Table_Name    RowCount       Size(MB)
         table1         36,469,938       532
         table2        242,172,205    39,184
         table3        231,756,758    29,814
         The above query ran for 30+ hours and returned with no rows inserted into final_table. I didn't get any error message either.
         But when I ran the query with table1 containing just 10,000 records, the query completed successfully within 20 minutes.
         Can anyone please optimize the above query?
    Edited by: jaysara on Aug 18, 2009 11:51 PM

    As a side note: you probably don't want to insert a string into a date field, do you?
    Under the assumption that crddt is of datatype DATE,
    crddt = '1/jul/2009' needs to be changed into
    crddt = to_date('01/07/2009','dd/mm/yyyy'). This is datatype-correct and NLS-independent.
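    Put back into the original statement, the select list would carry the conversion explicitly; a minimal sketch with only that literal changed (everything else as in the original post):

         insert /*+ append */ into final_table
           (oid, rmeth, id, expdt, crddt, coupon, bitfields, processed_count)
         select /*+ full(t2) parallel(t2,31) full(t3) parallel(t3,31) */
                seq_final_table.nextval, '200', t2.id, t3.end_date,
                to_date('01/07/2009', 'dd/mm/yyyy'),   -- explicit, NLS-independent date
                123, t2.bitfield, 0
         from table1 t1, table2 t2, table3 t3
         where t1.id = t2.id
           and t2.pid = t3.pid
           and t2.vid = t3.vid
           and t3.end_date is not null
           and (trunc(t1.expiry_date) != trunc(t3.end_date) or trim(t1.expiry_date) is null);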

  • Transfer rate with USB 5102 is very slow (8 updates/sec). Any possible way to increase it??

    Taking 100 microseconds of data from each of 32 sensors, we want to monitor the results in real time. Because of the slow data transfer protocol (the niScope_InitiateAcquisition - niScope_FetchBinary8 loop) for the USB NI 5102 (~125 ms), we can update only every 4 sec. Any way to speed it up? Will a PCMCIA 5102 do it faster (we don't have PCI)?

    To verify, you wish to take data for 100 microseconds on each of 32 sensors and display the data in "real time." The problem is the overhead of the initiate/acquire, not the actual acquisition time or theoretical data transfer time. Have you considered taking all your data in a single acquisition? I assume you are already multiplexing the signal to the 5102. This will require a bit of work in setting up the advance trigger for the multiplexer and the reference trigger for the 5102, but this is far cheaper than buying another acquisition board. Your data acquisition sequence would then be:
    Set 5102 for 64000 points, 20MHz, 0% trigger reference position (add more points if needed for settling time)
    Set multiplexer to first channel
    Initiate the acquisition on the USB-5102 (you may want to use the hardware start trigger for repeatability)
    Trigger 5102 to start acquisition
    Cycle multiplexer through the 32 sensors with 100 microsecond dwell time per sensor (add extra dwell time for settling time as needed)
    Read data from 5102
    Parse the data by splitting the array using your known timing
    With this method, you only take the overhead hit once per scan instead of 32 times. You should be able to get single-digit Hz update rates. You can speed up the data acquisition by using both channels of the 5102 (unless you are already doing this for some sort of data reduction). If you are doing this in LabVIEW, pass your data into a queue in your acquistion loop and pull it out of the queue in a separate loop to process and display. This allows you to time multiplex the acquisition and display.
    This account is no longer active. Contact ShadesOfGray for current posts and information.
