Time Machine: incredibly long calculation time

I have started backing up with Time Machine, and this is an example of what it shows.
At first it estimated 6,000 days, and it took thirty minutes or so to arrive at that figure.
I have done a full reset of Time Machine and restarted the procedure, but nothing changed.
Currently it says 654 days are required.
It doesn't seem to make any progress.
What is the deal here?
Shall I leave it alone, or does it need fixing?
Thank you and have a great weekend.

OK, now it has reduced the estimated time to eight hours and it's progressing.

Similar Messages

  • Time Machine external hard drive requires repair every time I turn the computer off or eject the external hard drive?


    What kind of repair?

  • Time required for getting DataSet info

    I'm interested in hearing what sort of times others have encountered getting output from the "List Data Set Runs" and "Read Data Set Run by Id" VIs.
    It's my first use of datasets and the information is fine, but with 500 datasets it takes ~30 sec for the above VIs to do their thing (LVDSC 6.1, P4). The time required appears to be a nicely linear function of the number of datasets, so with the 2,500-5,000 datasets I expect to have in my database, it'll be ~3 to 6 minutes. This will really mess up my intended use of them in a quick user interface.
    If it makes any difference, the number of tags in my dataset is 0 (zero). I'm just interested in the start time, end time, and description.
    TIA.
    =====================================================
    Fading out. " ... J. Arthur Rank on gong."

    Hi Donald,
    I can't give you comparable numbers as I am on DSC 7.0. But here's something that might be of interest to you:
    1. If all you are interested in is the start time, end time (of the runs, not the traces), and the description, then you can make do with just the 'List Data Set Runs' VI. This VI returns that info for all the runs in a given DataSet; you would unbundle the 'data runs' cluster to get it. You do not need to use the 'Read Data Set Run by ID' VI at all.
    The 'Read Data Set Run by ID' VI internally looks up the run and, if found, gets the trace info for each tag in that run. These are expensive operations by their nature.
    Also, the description normally stays the same for all runs within a DataSet. If this is the case for you, you can just read the description from the first run. (Of course, it's quite possible that the description changes from run to run; in that case you have to read them all.)
    2. In DSC 7.0, my 'List Data Set Runs' VI takes about 4 seconds to return 500 runs, and about 7 seconds for 1,000 runs. I have one tag in my DataSet but no description. This is way faster than what you are seeing (plus my test machine is a lowly 450 MHz P3). I know DataSets and Citadel in general have improved considerably in DSC 7.0; DataSets are now built into Citadel. You may want to get an eval copy of 7.0 and see if the difference is as significant as we're seeing here.
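Donald's observation that the query time scales linearly with the dataset count can be sketched as a quick extrapolation. The measured numbers below are the ones from this thread (~30 s for 500 datasets); the per-dataset cost derived from them is an assumption, not anything the DSC documentation guarantees:

```python
# Rough extrapolation of the "List Data Set Runs" time reported above,
# assuming the cost is linear in the number of datasets.
def estimated_query_seconds(n_datasets, measured_seconds=30.0, measured_count=500):
    """Linear estimate: time scales with the number of datasets queried."""
    per_dataset = measured_seconds / measured_count  # ~0.06 s per dataset
    return per_dataset * n_datasets

print(estimated_query_seconds(2500))  # 150.0 s (~2.5 min)
print(estimated_query_seconds(5000))  # 300.0 s (~5 min)
```

This matches the "~3 to 6 minutes" Donald projected for his full database.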
    Hope this helps. Please write back if you want to discuss more.
    Regards,
    Khalid

  • Time required - backup

    hi,
    1) I want to know how much time is required to take a cold backup of a 20 GB database. What criteria does that depend on?
    2) If I export the same 20 GB DB to a dump file, by what percentage is the data reduced?
    Thanks in advance...
    By
    Mahi
    B'lore

    It's really hard to guess without knowing your hardware. I have a DB on an AMD Athlon 64-bit machine with SATA hard drives; the size of all the data files is 47 GB, the actual data is about 39 GB, and when I export this DB, the size of the dump is 20 GB.
    I do a periodic full export of this DB to a remote machine, and it takes about 3 hours to create the dump on that network machine (P4 processor, SATA hard drives).
    So it depends on your I/O bandwidth and the specification of your other hardware. I think the figures above should help you estimate the size of your dump file for a 20 GB database and the time it may take. (My dump time includes network overhead, which could be avoided if the dump is done on a local machine.)
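The responder's numbers suggest a back-of-the-envelope method: the dump came out at roughly half the data volume (39 GB of data, 20 GB dump), and the elapsed time is driven by effective throughput. A minimal sketch, where both the compression ratio and the 10 MB/s throughput are illustrative assumptions drawn from or consistent with this thread, not measured values for any particular system:

```python
# Back-of-the-envelope export estimate based on the reply above.
def estimate_dump(data_gb, compression_ratio=20 / 39, throughput_mb_per_s=10.0):
    """compression_ratio (~0.51, from the 39 GB -> 20 GB example) and
    throughput are illustrative assumptions -- measure your own system."""
    dump_gb = data_gb * compression_ratio
    hours = (data_gb * 1024) / throughput_mb_per_s / 3600
    return dump_gb, hours

dump_gb, hours = estimate_dump(20)
print(f"~{dump_gb:.1f} GB dump, ~{hours:.1f} h")  # ~10.3 GB dump, ~0.6 h
```

As the reply says, the only reliable numbers come from testing on your own hardware.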
    Regards

  • To find the time required by the process chain to complete

    Hi Experts,
    I am calculating the average time required by a process chain to complete.
    Is there any way to find the time required by a process chain to complete?
    Thanks in advance.
    Regards,
    Ashwin

    Hi,
    There is a tool provided by SAP to do process chain analysis.
    It is an ABAP program, /SSA/BWT, which provides the following BW tools:
    a) Process Chain Analysis: this tool performs runtime analysis of process chains. The analysis can be performed not only at the process chain level but also at the process type level.
    b) Detailed Request Analysis
    c) Aggregate Toolset
    d) InfoProvider BPPO Analysis
    So you can run the program and analyse the runtime of your process chains.
    Regards,
    Abhishek
    Edited by: Abhishek Dutta on Aug 13, 2008 7:13 AM

  • I backed up my iPhone 4s to iCloud on Jan 19. I am now trying to do another backup, but it says the time required is 7 hours. That seems too long for the 1 GB of data stored in iCloud. Can someone help me please?


    To be honest, that sounds about right.
    For example, on my 8 Mbps (megabits) download service I get around 0.4 Mbps upload. That is the equivalent of (very approximately) 3 MB (megabytes) per minute, or 180 MB per hour. Over 7 hours that would be just over 1 GB.
    Obviously it all depends on your connection speed, but that is certainly what I would expect, and that is why I use my computer for backing up, not iCloud. It's so much quicker.
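The arithmetic in that reply can be made explicit. The 0.4 Mbps uplink is the poster's own figure; the conversion below mixes a binary GB (1024 MB) with decimal megabits, which is fine for a rough estimate:

```python
# Time to push a backup of a given size through a given uplink,
# as sketched in the reply above (no protocol overhead included).
def upload_hours(data_gb, uplink_mbps):
    megabits = data_gb * 1024 * 8          # GB -> MB -> megabits
    return megabits / uplink_mbps / 3600   # seconds -> hours

print(round(upload_hours(1, 0.4), 1))  # ~5.7 hours before overhead
```

With iCloud's protocol and retry overhead on top, a 7-hour estimate for 1 GB at that speed is entirely plausible.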

  • How to estimate the time required by one Replicat to process a 10 MB trail file

    Hi
    Consider one trail file of size 10 MB. The trail file contains only insert statements; the size of one record is 2 KB. Under an ideal configuration, how much time would the Replicat take to read the trail file and apply the transactions on the target server?
    Is there any formula to estimate the time required by the Replicat process for a trail file?
    Thanks

    There are so many variables I wouldn't venture to guess.
    I have implemented GG on multiple systems, and each had its own performance characteristics depending on operating system, Oracle version, makeup of the data being replicated, network latency, load on the source system, load on the target system, etc.
    My best suggestion is to set it up and try it out.
    In my experience, once set up, I have had to do very little GG-specific tuning to get very good speed (sometimes tweaking BATCHSQL helps for some usage patterns). Many SLAs I have encountered are 15 minutes... I get spoiled sometimes when I report that we are getting committed transactions captured, transported, and applied within just a few seconds. And then they always want their data in 10 seconds :)
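The only arithmetic the question itself supports is the record count; anything beyond that needs a measured apply rate. A minimal sketch, where the 1,000 applies/sec rate is a purely hypothetical placeholder (not a GoldenGate figure):

```python
# Rough sizing for the question above: a 10 MB trail of 2 KB inserts.
# applies_per_sec is an assumed placeholder; as the reply says, the only
# reliable number comes from testing on your own hardware.
def replicat_estimate(trail_mb, record_kb, applies_per_sec=1000):
    records = trail_mb * 1024 // record_kb   # 10 MB / 2 KB = 5120 records
    return records, records / applies_per_sec

records, seconds = replicat_estimate(10, 2)
print(records, seconds)  # 5120 records, ~5 s at the assumed rate
```

Swap in an apply rate measured from your own test run (e.g. via Replicat statistics) to turn this into a usable estimate.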

  • Reducing time required for ABAP-only copyback (system copy) process

    Our company is investigating how to reduce the amount of time it takes to perform a copyback (system copy) from a production ABAP system to a QA system.  We use a similar process for all ABAP-only systems in our landscape, ranging from 3.1h systems to ECC6.0 ABAP-only systems on both DB2 and Oracle database platforms, and the process takes approximately two weeks of effort from end-to-end (this includes time required to resolve any issues encountered). 
    Here is an overview of the process we use:
    • Create and release backup transports of key system tables and IDs (via client copy) in the QA system to be overwritten (including RFC-related tables, partner profile and IDOC setup-related tables, scheduled background jobs, archiving configuration, etc.).
    • Reconfigure the landscape transport route to remove the QA system from the transport landscape.
    • Create a virtual import queue attached to the development system to capture all transports released from development during the QA downtime.
    • Take a backup of the target production database.
    • Overwrite the QA destination database with the production copy.
    • Localize the database (performed by DBAs).
    • Overview of Basis tasks (for smaller systems, this process can be completed in one or two days, but for larger systems it takes closer to 5 days because of the BDLS runtime and the time it takes to import larger transport requests and the user ID client copy transports):
      o Import the SAP license.
      o Execute SICK to check the system.
      o Execute BDLS to localize the system.
      o Clear out performance statistics and scheduled background jobs.
      o Import the backup transports.
      o Import the QA client copy of user IDs.
      o Import/reschedule background jobs.
      o Perform any system-specific localization (example: for a CRM system with TREX, delete the old indexes).
    • Restore the previous transport route to include the QA system back in the landscape.
    • Import all transports released from the development system during the QA system downtime.
    Our company's procedure is similar to the procedure demonstrated in this 2010 TechEd session:
    http://www.sapteched.com/10/usa/edu_sessions/session.htm?id=825
    Does anyone have experience with a more efficient process that minimizes the downtime of the QA system?
    Also, has anyone had a positive experience with the system copy automation tools offered by various companies (e.g., UC4, Tidal)?
    Thank you,
    Matt

    Hi,
    > One system that immediately comes to mind has a database size of 2TB.  While we have reduced the copyback time for this system by running multiple BDLS sessions in parallel, that process still takes a long time to complete.  Also, for the same system, importing the client copy transports of user ID's takes about 8 hours (one full workday) to complete.
    >
    For BDLS run, I agree with Olivier.
    > The 2 weeks time also factors in time to resolve any issues that are encountered, such as issues with the database restore/localization process or issues resulting from human error.  An example of human error could be forgetting to request temporary ID's to be created in the production system for use in the QA system after it has been initially restored (our standard production Basis role does not contain all authorizations required for the QA localization effort).
    >
    For the issues you encounter because of the system copy, you can minimize this time, since you will be doing it on a periodic basis: keep a task list and make a note of the issues you faced in the previous run. So normally I don't count it as system copy time.
    Thanks
    Sunny

  • Real time requirement of XI

    Can anyone tell me the real-time requirements for XI: what are the business needs for which a client would prefer to go for XI, and what are some real-time scenarios or situations in which XI is used?

    Hi Santhosh,
    Whether XI comes into the picture depends on the business requirements.
    Let's take an example: suppose a company XXX has several branches in a few areas. This company may be using SAP systems, while its vendors use different systems; let's assume a vendor does not have SAP systems in its landscape. Even though company XXX wants to communicate with that vendor, the two sides cannot talk to each other directly; they can integrate their business processes only by using XI.
    Here company XXX has an SAP system in its landscape and the vendor uses a web application; in this scenario the communication is done using an EAI tool, and we use XI as that EAI tool.
    Hope I am clear.
    Please let me know if you have any queries.
    Thanks and Regards,
    Chandu.

  • Urgent Full Time Requirement for SAP Senior FI/CO Analyst

    Hi,
    Any one looking for Urgent Full Time Requirement for SAP Senior FI/CO Analyst
    Moderator: This is not job board. Upon second violation your user will be banned


  • Is there a way to set Minimum Time Requirement for a course?

    Is there a way to set a minimum time requirement in a Captivate 5 course?
    For example, the learner must be in the course for a minimum of 30 minutes or the course cannot be considered "complete." This is a SCORM course, and the learner's record in the LMS will not show the course as complete until the learner has been in the course at least 30 minutes. (This minimum time requirement may or may not be in conjunction with also passing a test with 80%; for example, in order to be considered complete, the user must be in the course for at least 30 minutes AND pass test questions with a score of at least 80%.)
    Perhaps some modification of the Timer Widget described by Lilybiri at http://bit.ly/ep5cOY?
    Or, some modification of the SCORM Manifest?

    Lilybiri,
    Thanks to your two articles on variables and advanced actions, as well as a few other resources (listed below), I was able to get up to speed on Adobe Captivate variables and advanced actions and implement your solution 90% of the way.
    The last 10% is...
    You had recommended in Step 2 to "Attach a score of 0 points to the CBNo and a score of 100 points (just an example) to the CBYes."
    My question is, how do I attach points to the buttons? Is "score" a system variable? Also, is there anything special I need to know about these attached points in terms of how they will be communicated to our LMS via the SCORM protocol?
    And as a final thought....
    If someone exits the eLearning and then re-enters the next day, I think cpInfoElapsedTimeMS will reset to zero; is that correct? Is there a way for the project to remember what cpInfoElapsedTimeMS was? We are using this in conjunction with an LMS via SCORM, if that is helpful.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    RESOURCES FOR LEARNING ADOBE CAPTIVATE VARIABLE AND ADVANCED ACTIONS
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    In case anyone else needs to get up to speed on Adobe Captivate variables and Adobe Captivate advanced actions (for Captivate 5), here are the resources I used. I hope others find them helpful as well.
    Lilybiri: 
    Article 1:  "Use Power of Variables in Captivate 5 (4) - without advanced actions": http://kb2.adobe.com/community/publishing/862/cpsid_86299.html
    Article 2: "Unleash Power of Variables in Captivate (4&5) with Advanced Actions"
    http://kb2.adobe.com/community/publishing/871/cpsid_87182.html
    http://lilybiri.posterous.com/unleash-the-power-of-variables-in-captivate-5
    Adobe Webinar by Vish at Adobe:
    "Using Advanced Actions & Variables in Captivate 5" http://www.adobe.com/cfusion/event/index.cfm?event=set_registered&id=1808722&loc=en_us
    CP Guru
    "List of Adobe Captivate 5 System Variables" http://www.cpguru.com/2010/07/20/adobe-captivate-5-system-variables/
    If you scroll to the bottom of the webpage, you can download a beautifully formatted PDF.

  • Time required to install GTS 7.2

    Dear BASIS,
    Kindly let me know the time required to install GTS 7.2.
    OS: Sun Solaris
    Database: Oracle 10g
    RAM: 15 GB
    Hard disk: 150 GB
    Kindly let me know your suggestions.
    Thanks & Regards
    Prasanth

    Hi,
    Please tell me the time required to install GTS 7.2 (only the installation part, not the configuration time).
    Thanks for your help.
    Regards
    Prasanth

  • Time Machine Folder Calculating forever

    Hey Guys,
    I have a Time Machine backup folder from July. When I connect it to my MacBook Pro and try to read the contents of the folders, I am unable to.
    There is a red '-' icon attached to some folders, and when I click on them, the preview pane shows 'Calculating' and stays like that forever (I left it overnight). From Terminal I get 'You have no permission to access' or something like that.
    There is a lot of stuff I need in that backup. Any help is really appreciated.
    Thanks guys!
    Uday

    Yosemite is a dog..
    Do you have another computer on the network running a functional version of Mac OS?
    If not, buy a USB drive and plug it into the Mac. Boot from Recovery and try to use Time Machine to restore the whole backup (from a suitable point in time for your files) to the USB drive. Do this over Ethernet with wireless turned off.
    Once you have a working restore, restart the computer from Yosemite and pluck the files you need from the USB drive.
    You can also try using the Finder to do a manual restore:
    Can't access old files on time capsule

  • HT201250 Time machine displays calculating size?

    When trying to set up Time Machine for the first time with a 4 TB Iomega external drive, it displays the message "calculating size" for a long time.

    All sorted now; I used some of the advice from Pondini's FAQs. Thanks!

  • Time Machine configured Mac always requires password

    I have a new MacBook Air that I set up using Time Machine (fast and simple, according to the blurb), but now every time I want to open or move a file, I have to put in my password. Also, when moving folders, the Finder copies them instead.
    The only solution seems to be to change permissions individually for each file, but I have about 5,000... Any ideas?
    I've already repaired permissions and updated permissions on the master folders.

    If you are going to steer clear of Time Machine, here are some cloning programs so you can continue to do a backup.
    Clone - Carbon Copy Cloner
    Clone - Data Backup
    Clone - Deja Vu
    Clone - SuperDuper
    Clone Software - 6 Applications Tested
