Best method/tool for cloning a failing HDD for Data Recovery?

I have been brooding over this question: which method/tool is best for cloning a failing HDD (including the system drive) so that data can be recovered from the clone? Has anyone tried cloning for this specific purpose and achieved any results? I would be interested to hear about your experience, or even just your views on the subject.

I recently reviewed one of the tools; here's my blog review: http://arnavsharma.net/4/post/2014/05/review-stellar-phoenix-windows-data-recovery.html
Arnav Sharma | http://arnavsharma.net/ Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
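As for the original question: a common approach is to take a sector-level image of the failing drive first and then recover from the image, never from the dying disk itself. Below is a hedged sketch assuming GNU ddrescue is installed; the device name /dev/sdX and file names are placeholders, and the runnable part uses plain dd against an ordinary file standing in for the device:

```shell
# Typical two-pass imaging of a failing drive with GNU ddrescue
# (assumed installed; /dev/sdX and file names are hypothetical):
#
#   ddrescue -f -n  /dev/sdX disk.img rescue.map   # first pass: grab the easy sectors, skip bad areas
#   ddrescue -f -r3 /dev/sdX disk.img rescue.map   # second pass: retry bad sectors up to 3 times
#
# Minimal runnable illustration of sector-level copying with plain dd,
# using a regular file in place of the device:
dd if=/dev/urandom of=source.img bs=4096 count=64 2>/dev/null   # fake 256 KiB "drive"
dd if=source.img of=clone.img bs=4096 conv=noerror,sync 2>/dev/null
cmp -s source.img clone.img && echo "clone verified"
```

ddrescue's map file lets an interrupted or repeated run resume where it left off, which matters on a drive that may not survive many passes; dd's conv=noerror,sync merely keeps going past read errors and zero-pads unreadable blocks.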

Similar Messages

  • Best method to create a spinnable flower windmill for iPad and Android

    What is the best method to create a spinnable flower windmill for iPad and Android?  I want to be able to drag and spin.  The faster you drag, the faster it should spin.  Each petal will have an image which will be its own link.  Any links to tutorials, toolkits, and recent examples are greatly appreciated.  Thank you.

Here's a wheel that spins, but it has a spin button instead of being able to drag it:
    http://www.switchonthecode.com/tutorials/creating-a-roulette-wheel-using-html5-canvas
Seems to me the best UI would be one you can drag, and either stop it or spin it at various speeds.

  • Which cable for the add-on HDD for Split X2?

    Does anyone have an idea about which HP cable is used for the extra HDD that is installed (by the consumer) in the keyboard dock?
    It looks like a DV5 series cable.
I ask because these cables are only about $5 or $6, whereas the "kit" (cable and chassis for the HDD) is $50 from HP.
    Thanks,
    Garrett

    I tried many things until I eventually bought the kit. Sorry not to be able to help.

DPM 2012 still requires putting end users into local admin groups for the purpose of end user data recovery?

On client computers that are protected by DPM 2010 and prior versions, you had to put the end user's account in the local administrators group. If you did not add the end user account to the local administrators group, you would get this error after opening the recovery tab in the DPM client: “DPM found no recovery points which you are authorized to restore on the specified DPM server. You can restore only those recovery points for which you were an administrator at the time the backup was taken. To restore other recovery points, contact your DPM administrator, or attempt to restore from another DPM.” This is not ideal on many networks because the end users are not allowed to have local administrator access.
The fix for this was included in hotfix 2465832, found here: http://support.microsoft.com/kb/2465832.
    This hotfix (a hotfix rollup package for DPM 2010) resolves other issues with DPM 2010 as well. You can find the full list of what this hotfix corrects on that link.
One would think this issue would have been resolved in DPM 2012; however, I am encountering exactly the same issue and had to add end users to the workstation's local admin group before they could search for recovery points on the DPM server. This is not acceptable practice.
Is there a new hotfix for the same issue on DPM 2012? I am hesitant to apply KB2465832, since it also includes many other fixes for DPM 2010 which may not be applicable to 2012.
    Please help.
    Thanks,

    This is a hands off solution to allow all users that use a machine to be able to restore their own files.
    1) Make these two cmd files and save them in c:\temp
    2) Using windows scheduler – schedule addperms.cmd to run daily – any new users that log onto the machine will automatically be able to restore their own files.
<addperms.cmd>
:: Runs addreg.cmd in a shell with delayed variable expansion enabled (/v),
:: which the !users! variable in the FOR loop below requires.
Cmd.exe /v /c c:\temp\addreg.cmd
<addreg.cmd>
:: Builds a .reg file listing every profile folder under C:\Users as a
:: DPM client owner, then imports it.
set users=
echo Windows Registry Editor Version 5.00>c:\temp\perms.reg
echo [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft Data Protection Manager\Agent\ClientProtection]>>c:\temp\perms.reg
FOR /F "Tokens=*" %%n IN ('dir c:\users\*. /b') do set users=!users!%Userdomain%\\%%n,
echo "ClientOwners"=^"%users%%Userdomain%\\bogususer^">>c:\temp\perms.reg
REG IMPORT c:\temp\perms.reg
Del c:\temp\perms.reg
Regards, Mike J. [MSFT] This posting is provided "AS IS" with no warranties, and confers no rights.
    That's a good one! Thanks for that.
I've been scripting in KiX for some time, so here is mine; hope it helps someone... (it's probably not the best, but it works)
    ========================================================================
$RC = SetOption("WOW64AlternateRegView", "on")
$DPMkey = "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft Data Protection Manager\Agent\ClientProtection"
$uservariable = "%userdomain%\%username%"
If KeyExist($DPMkey)
    $Userstring = ReadValue($DPMkey, "ClientOwners")
    If $Userstring == ""
        WriteValue($DPMkey, "ClientOwners", $uservariable, "REG_MULTI_SZ")
        ? "Key created"
    Else
        If Not InStr($Userstring, $uservariable)
            $Userstring = "$Userstring,$uservariable"
            WriteValue($DPMkey, "ClientOwners", $Userstring, "REG_MULTI_SZ")
        EndIf
    EndIf
EndIf
    ==========================================================================
The problem is that you still need an admin account to write to the registry, so make sure you configure the scheduled task accordingly.
If you use a service account for the scheduled task, "$uservariable" will get populated with that account instead. As a workaround, I changed it to the following line:
    =========================================================
    $uservariable = ReadValue("HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI", "LastLoggedOnSAMUser")
    =========================================================
The only problem is that this key gets created/updated only when a user logs on physically at that PC; it will not work for anyone connecting through RDP.

Looking for silent HDD for Macbook Pro 2008 Penryn

    Hi,
I'm looking for a really silent HDD for my MB Pro. The HDD in my stock base MB Pro is a Fujitsu MHY2200BH. It makes a permanent, fan-like background noise. Can someone recommend a silent 200-250 GB HDD? (It doesn't matter whether it's 5400 rpm or 7200 rpm.)

These models (from Samsung, Western Digital and Toshiba) are worth considering:
    http://www.tomshardware.com/reviews/wd-toshiba-join-club,1776-8.html

  • Tool and best method for diary

    Hi All,
    Can anyone please help me in this ?
I want to create a diary for 2015.
I have some designs in the diary, unlike a traditional diary.
Which are the best method and tool to create it? How can I create templates and designs effectively without redoing the same design and alignment again and again?
Looking forward to your response.
    Thanks Nirmala

You need to ask specific program questions in that specific program's forum, e.g. InDesign.
The Cloud forum is not about using individual programs;
the Cloud forum is about the Cloud as a delivery & install process.
If you start at the Forums Index https://forums.adobe.com/welcome
you will be able to select a forum for the specific Adobe product(s) you use.
Click the "down arrow" symbol on the right (where it says All communities) to open the drop-down list and scroll.

  • Best Method for Saving Data to File?

    Hi, 
We're using LabVIEW 2009 to acquire data from our instrument. In the past we have used the "Write to LabVIEW Measurement File" Express VI to save data to a file. However, we had some issues with our VI: occasionally we would lose data or column headers somewhat erratically, and we were never able to sort out the problem. We would like to rewrite our VI with the best possible solution.
Here's a summary of what we would like to do:
Save 30 variables at a rate of roughly 1 Hz. We would like one set of column headers per file so that the data can be easily imported into LabVIEW with the variable names intact. We will be collecting data continuously, so we would like to divide the data into 3-4 files per day. Ideally, the program would start new files at the same times from day to day, and the filename could be configured to include the date and time/file number.
I am hoping that users can provide a little feedback about the methods that have been most successful and reliable. From what I have read there are a few different ways to do this (Express VI, TDMS, "Write to Text File"). Any thoughts or relevant examples would be quite useful for us!
    Thanks for your help!

    Meg T wrote:
    Is it correct to say that in your method, the indexing results in building up the data into one large array of data before saving it to the file with the column headers and filenames appended?  If we are writing data at 1Hz for 6 hours of time, will we run into an issue being able to store all the data?
Yes, the indexing will build up one large array. If this is a problem due to array size and memory, you will have to write more often. You can write every second; that should not be a problem. You should still write the column names to the file first, with the append input set to False (or nothing wired in, since the default is False). This creates a new file containing only the headers. Then the data write has True wired in so that the append takes place. If running for 6 hours and gathering data once per second, your array will contain 21,600 rows of data. I'm not sure whether that would cause a memory problem or not.
    Meg T wrote:
It also seems difficult to incorporate headers into the Express VI if you are writing the data continuously as part of a loop with the "append" option.
    If you write the headers before the loop as I have shown, and use append inside the loop, you will not have problems.
    - tbob
    Inventor of the WORM Global
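The write-headers-once-then-append pattern tbob describes can be sketched outside LabVIEW as well. Here is a minimal shell stand-in (the real solution is a VI; the file name and column names are made up for the demo): create the file with its headers once (append = false), then append one row per loop iteration (append = true).

```shell
# Shell stand-in for the LabVIEW pattern described above.
f=acq_demo.csv
printf 'time,ch1,ch2\n' > "$f"          # headers only: new file, append = false
for i in 1 2 3; do                      # stands in for the 1 Hz acquisition loop
  printf '%d,%d,%d\n' "$i" $((i*10)) $((i*20)) >> "$f"   # append = true
done
cat "$f"
```

Because each row is flushed to disk as it is produced, no large array accumulates in memory, which is exactly the trade-off discussed above.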

Best method for secure Internet?

    OK,
    So I wasn't sure where to put this post, so I figured I would start with the system I am using and go from there.
    The Problem:
I am trying to set up a secure method by which I can use Internet services as I make my way between non-secure places throughout the day while studying for school. In any given day I might find myself studying at 3-5 different locations. I often have my computer with me and would like to be able to use it without the worries of surfing on an open line.
    I have read (not completely understood) the VPN concept and think that it might provide what it is I am looking for.
    Before I went spending the money for an aftermarket plan that provides me a VPN, I thought I would look into setting up my own VPN off of my home connection. Although I am not sure if all I need is an internet line (IP Address) - or do I need an actual computer connected to the Internet at home.
    I might be completely wrong here in my thoughts of using the VPN, so please provide any thoughts or suggestions.
    Brief Recap:
We have a MBP & iBook G4. We would like to be able to use either of them when we are out in public, non-secure areas. Is there a way? And if so, what is the best method to make this happen?
    Thanx,
    -Al-
    MBP   Mac OS X (10.4.7)  

    Qanuk,
    VPN provides a method of connecting to a home or business network securely. From some remote location, VPN creates a secure "Tunnel" whereby you can join your home network and use network resources. This would allow you to "share" files, printers, etc. between computers not just on your network locally, but across the internet, regardless of where you are connected.
    If your intention is not to share your network's resources across the internet, but to "surf" normally, you don't need to use VPN. With normal internet use, you are already as secure as you need to be. It is true that there is a potential for "snooping" inherent in using open wireless networks, but this is only a potential, not a real problem.
    The question is, "what will someone else be able to find?" Well, if you send an email, there is a potential that someone could intercept that email and read it. Big deal. There is also the possibility that someone could determine what websites you are visiting. Again, big deal.
In order to do this "snooping," that someone would have to spend a great deal of resources and time learning to use the necessary tools, then spend the time using them while you are on the same network. Why would someone do all this to collect very mundane information from you? It's just not going to happen.
    Now, in any cases where you might actually be transmitting data useful to a "snooper," such as the transmission of credit card information, you will undoubtedly be doing so using a secure browser connection. In such cases, a secure connection is made between your browser and the server to which you are connected (your bank's server, the retailer's server, etc.). No snooping is possible in these cases.
    As for your computer, itself, you are protected by the best firewall in the industry. No one is going to be able to "hack" your computer while you are connected to the same network, as long as you have your firewall turned on, and especially if you have no running services.
    In short, you are already as secure as you need to be; surf away.
    Scott

  • Best method for incremental replication to backup xserve?

I was just trying to figure out the best method for routinely, incrementally replicating a primary Xserve. I have a secondary that has the same hardware. I thought about using scheduled tasks with Carbon Copy Cloner over, say, a crossover cable to the secondary Xserve; however, wouldn't doing this backup continually wipe out the network settings of the secondary Mac? I could do the replication to a secondary drive and then just make that drive the boot drive in the event of data corruption on the master server.
Where I'm at now is that the primary server runs RAID 1, so in the event of a single drive failure or a hardware failure, I could swap the drives into the backup server. However, I'd like some sort of protection against data corruption on the primary Xserve.
    Any general thoughts on a better way to do this? Or is there software that does this well?
    Thanks

Our primary is an Xserve RAID. In addition, it has an external 2 TB drive that I CCC all the user accounts to every night. I then set up a Mac mini server as a replica.
    If the primary XServe has a catastrophic failure, I simply connect the 2TB drive to the Mac Mini and point the replicated user accounts to the cloned data on the 2TB drive.
    I haven't tested it. But this scenario seemed like the best solution.
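For what it's worth, the same kind of incremental replication can be scripted with rsync instead of Carbon Copy Cloner, excluding machine-specific configuration so the secondary's network settings survive each run. This is a hedged sketch: the hostname, volume paths, and exclude list are assumptions, and only the local-directory part below actually runs.

```shell
# In production this would run over ssh to the secondary Xserve, e.g.
# (paths, hostname, and the exclude are hypothetical):
#
#   rsync -a --delete --exclude '/Library/Preferences/SystemConfiguration/' \
#         /Volumes/Data/ root@backup-xserve:/Volumes/Data/
#
# Runnable local illustration of the incremental behavior:
mkdir -p primary secondary
echo v1 > primary/file.txt
rsync -a --delete primary/ secondary/    # initial full copy
echo v2 > primary/file.txt               # change on the primary
rsync -a --delete primary/ secondary/    # only the changed file is transferred
cat secondary/file.txt
```

The --delete flag keeps the replica an exact mirror (removed files disappear on the secondary too), which is what you want for a swap-in boot drive but worth knowing before pointing it at real data.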

Best method for timestamping? (for later use with a Perl script)

What is the best method I can use to timestamp events in Linux for later use with a Perl script?
I am performing some energy measurements, where I run several tasks separated by 20 s gaps. Before I start executing any tasks, I always add an initial delay so that I can start the script and start the measurement device.
My problem is that I don't know exactly how long that first delay is. To solve this, I thought I could use date commands to timestamp all the tasks, or at least to timestamp the first delay.
    Here is example of what I am doing:
    1st delay
    task 1
    20s
    task 2
    20s
    task 3..... etc
    What would be the best to use?

    logger.
It posts messages straight to the system log. You can see each message in all its glory using tools like journalctl: the message, the date, time, host name, user name, and the PID of logger when it ran.
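A minimal sketch of the date-based idea from the question: stamp each task boundary with an epoch time, then compute the unknown initial delay from the log afterwards. The log path and labels are arbitrary, and a Perl script can parse the same file later.

```shell
# Write "epoch-seconds label" lines at each task boundary, then derive
# the initial delay from the first two stamps. Names are arbitrary.
log=run_times.log
: > "$log"
stamp() { printf '%s %s\n' "$(date +%s)" "$1" >> "$log"; }
stamp start       # script starts; measurement device is switched on here
sleep 1           # stands in for the unknown initial delay
stamp task1
stamp task2       # ... one stamp per task boundary ...
awk 'NR==1{t0=$1} NR==2{print "initial delay:", $1-t0, "s"; exit}' "$log"
```

On GNU date, %s can be replaced with %s.%N for sub-second resolution; the logger approach from the reply above gives you the same information in the system journal instead of a private file.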

  • Best method for controlling Office 365 updates

We're looking for the best method for updating Office 365. We will be testing prior to releasing a version to the rest of the company. We have a couple of methods we're contemplating, but we're looking for any pros or cons of each. We are also
    using SCCM 2012.
    1. Run setup.exe setting the version and internal install source in an .xml file run as an SCCM package using distribution points as the install source.
    2. Run click2runclient.exe with command lines setting the version and internal install source as an SCCM package using distribution points as the install source.
3. Set the version through Group Policy, turn on automatic updates, and don't specify an install source.
Option 3 appears to be the most straightforward, with the least administrative overhead. Would it be possible to revert to an earlier version using this method?
I have read various articles but am looking for any input as to what is or isn't working well for others.

    Hi,
I would like to share this blog post with you, which provides an example of how to implement a fully automated testing and deployment process for Office 365 updates. This deployment method gives you the ability to test updates before you approve them in your environment.
    The process might look like:
    Deploy Office 365 in your environment with Office Deployment Tool, configure the "Updates" element in the configuration.xml file so that updates are enabled and the "UpdatePath" attribute points to an internal source.
    Download the latest Office 365 build into a different internal source, configure your test machine to pick up builds from it.
    After testing the updates, copy the updates to the first internal source.
    You should be able to integrate the process with SCCM to reduce your administrative effort.
    Hope this helps.
    Regards,
    Ethan Hua
    TechNet Community Support
It's also recommended to download and install the Office Configuration Analyzer Tool (OffCAT), which is developed by Microsoft Support teams. Once the tool is installed, you can run it at any time to scan for hundreds of known issues in Office programs.

  • Best method for backing up macbook air?

    Please tell me what would be the best method to back up my macbook air?

Yes, this I know. Since I have nearly 100 HDs lying around, when I clone my system it's bootable, and the second HD becomes the de facto recovery;
at the very worst, you can reload the OS online. But yes, you're correct. Still, SuperDuper! is free, and CCC surely isn't.
As a "clone it all" option, SuperDuper! is great for HD upgrading and as an emergency backup for a laptop HD. Since laptop HDs are prone to failure (not the SSD as in the Air), keeping a cloned backup is a great idea.
But yes, you're right... I consider a second cloned HD a "recovery HD", but not in the sense you imply.
An error that both SuperDuper! and CCC make in a clone, however, is that they write the boot files in a different location. So if you install the replacement HD after a crash, it will cause a slightly slower startup from power-on (apart from the first boot, which is always slow); in which case you have to reinstall the OS on the clone... that is, if you care that much about a few extra seconds of boot time on your replacement cloned HD.
But Lion and Mountain Lion systems *also* have Network Recovery, which can download and install the OS without a recovery volume.

  • Best method to clone HDD to new SSD

I am about to upgrade the 256 GB HDD in my Late 2008 aluminium MacBook (running 10.9.4) to a 512 GB SSD.
    What is the best practice for cloning the existing drive to the new one?  My goal is to just have a seamless transition where I can swap one disk for the other, boot up from the new SSD drive and have the computer function just like it did when running from the original HDD, with all my existing apps and data.
    I have a USB to SATA cable if needed, and maintain regular Time Machine backups to a Time Capsule.
I've read about several different approaches, like using Carbon Copy Cloner, OS X Disk Utility, or restoring from a Time Machine backup, and would like to know if there is a stated best-practice way of doing this.
    Thanks in advance for any help - it will be much appreciated!

Get an external enclosure - OWC sells them in a variety of flavors. Or use your SATA-to-USB cable.
Download Carbon Copy Cloner (if only for the free trial period), plug in the SSD inside the enclosure, use Disk Utility to format the SSD, and then use CCC to clone your old drive to your new SSD.
Everything should go off without a hitch...
    Clinton

  • Best method for "UPSERTS"?

    Hello,
I'm trying to do some "update if exists, otherwise insert" logic in my SSIS package but am running into problems. I researched and found that one way to do it is with the Lookup transform, with redirection of the error output. But this causes problems, as I run out of memory (I assume), because random components start failing for no reason. My reference table also has over 2 million rows.
Any ideas on the best method of doing upserts?

    Hi John,
Sorry for the lack of explanation; I meant source, lookups, SCD, insert & update. Maybe it's because I've never seen a suitable situation where I could use the MERGE statement, but I like the fact that you can extract your data, do calculations and FK lookups, and update/insert the destination with SCD type 1 or 2. I have seen a situation where MERGE statements were used, but the "ETL" process consisted only of SQL tasks and script tasks; data flows were used, but only 1-to-1, to load data from the source and store it in staging. This leads to a situation where you copy the data three times into different tables: once into staging, once with the join to determine the FKs, and once in the upsert statement. It works, I agree, but I keep wondering whether this is a good approach: it looks messy, it is difficult to maintain, row-based error handling is not possible, there is no redirection of unknown lookups, ...
Maybe it's because I'm used to working with a DWH.
    Thanks in advance.
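One alternative worth naming plainly: push the upsert into the database rather than the SSIS data flow, so no 2-million-row lookup cache is needed. On SQL Server this is typically a T-SQL MERGE statement; since that isn't runnable here, the same update-or-insert idea is sketched with sqlite3's ON CONFLICT upsert. The table and column names are invented for the demo.

```shell
# Demo of database-side upsert semantics (sqlite3 assumed installed;
# on SQL Server the equivalent is a MERGE statement).
db=upsert_demo.db
rm -f "$db"
sqlite3 "$db" "CREATE TABLE dim(id INTEGER PRIMARY KEY, name TEXT);
               INSERT INTO dim VALUES (1,'old');"
# Row 1 exists (update it); row 2 does not (insert it):
sqlite3 "$db" "INSERT INTO dim(id,name) VALUES (1,'new'),(2,'added')
               ON CONFLICT(id) DO UPDATE SET name=excluded.name;"
sqlite3 "$db" "SELECT id,name FROM dim ORDER BY id;"
```

In an SSIS context the usual shape is: bulk-load the incoming rows into a staging table with a fast data flow, then run one set-based upsert against the destination in an Execute SQL task, which sidesteps the Lookup transform's memory limits entirely.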

  • Best for cloning - Retrospect Duplicate, SuperDuper!, or CCC?

I would like opinions on the "best" and fastest program for cloning my startup volume to another bootable drive. I have been using Carbon Copy Cloner for quite a while, but the new version is still in beta. I have Retrospect, but have used only the "backup" function, not "duplicate". And I have seen good reviews of SuperDuper! If anyone can give me comparisons of these, including the speed of cloning/duplicating, I would be much obliged. Thanks.

    Retrospect is an excellent backup utility although it hasn't been updated in a long time. However, the easiest way to clone is to use the Restore option of Disk Utility:
    How to Clone Using Restore Option of Disk Utility
    1. Open Disk Utility from the Utilities folder.
    2. Select the backup or destination volume from the left side list.
    3. Click on the Erase tab in the DU main window. Set the format type to Mac OS Extended (journaled, if available) and click on the Erase button. This step can be skipped if the destination has already been freshly erased.
    4. Click on the Restore tab in the DU main window.
    5. Select the backup or destination volume from the left side list and drag it to the Destination entry field.
    6. Select the startup or source volume from the left side list and drag it to the Source entry field.
    7. Double-check you got it right, then click on the Restore button.
    8. Select the destination drive on the Desktop and press COMMAND-I to open the Get Info window. At the bottom in the Ownership and Permissions section be sure the box labeled "Ignore Permissions on this Volume" is unchecked. Verify the settings for Ownership and Permissions as follows: Owner=system with read/write; Group=admin with read/write; Other with read-only. If they are not correct then reset them.
    For added precaution you can boot into safe mode before doing the clone.
    For general backup:
    Basic Backup
    My personal recommendations are (order is not significant):
    1. Retrospect Desktop (Commercial - not yet universal binary)
    2. Synchronize! Pro X (Commercial)
    3. Synk (Backup, Standard, or Pro)
    4. Deja Vu (Shareware)
    5. PsynchX 2.1.1 and RsyncX 2.1 (Freeware)
    6. Carbon Copy Cloner (Freeware - 3.0 is a Universal Binary)
    7. SuperDuper! (Commercial)
    The following utilities can also be used for backup, but cannot create bootable clones:
    1. Backup (requires a .Mac account with Apple both to get the software and to use it.)
    2. Toast
    3. Impression
    4. arRSync
    Apple's Backup is a full backup tool capable of also backing up across multiple media such as CD/DVD. However, it cannot create bootable backups. It is primarily an "archiving" utility as are the other two.
    Impression and Toast are disk image based backups, only. Particularly useful if you need to backup to CD/DVD across multiple media.
    Visit The XLab FAQs and read the FAQs on maintenance, optimization, virus protection, and backup and restore. Also read How to Back Up and Restore Your Files.
