Backup strategy with versioning

Hello forum,
I'm currently looking into reorganizing our backups and would like to know how you all perform your file backups.
We used to simply copy our projects to another internal drive with a sync tool and archive unused projects on our NAS. The problem with that setup is that you have no older versions: if you overwrite the wrong file, for example, that wrong file is already copied to the backup.
There is Time Machine on the Mac, but is there something similar for Windows 7 that also performs well with large amounts of video, graphics, and After Effects files?
I'm looking forward to your experiences and workflows.
Thanks in advance
Daniel

We use Paragon here because it is cheaper and provides standard imaging and drive-tool features. The ability to back up data in real time, similar to differential backups but without scheduling, is a feature that, as far as I am aware, only Retrospect currently offers.
Eric
ADK

Similar Messages

  • Any new backup strategy with Oracle VM 3.1 ?

Now that Oracle VM 3.1 has been released, could we get an update on backup strategy?
Backing up the OCFS2 repositories lets us protect the OVF files, right? But what happens with the apps running in the VMs; is their data also backed up? What about open files, databases, etc.?
I would appreciate any comments.
    Thanks in advance.

    try running this on each server:
    /usr/bin/rescan-scsi-bus.sh

  • SCE Backup fail with Version 3.6.5 Build 489

    Hello,
I upgraded my SCE 2020 and now can't do a subscriber backup to the local disk.
    Subscriber backup failed on SCE 2020 Version 3.6.5 Build 489:
    sce#>conf
    sce(config)#>interface LineCard 0
    sce(config if)#>subscriber export csv-file subsback.csv
    0 subscriber(s); 0 error(s)
    sce(config if)#>
    But I have 1298 subscribers !!!
    sce#>show interface linecard 0 subscriber all-names
    There are 1298 subscribers in the data-base:
    N/A
    subscriber1
    subscriber2
    subscriber3
    subscriber1298
    sce#>
(These commands worked well with the old firmware 3.5.0 Build 407.)
Any idea what the problem is? Is it a new bug?
    Thanks for comments.
    Best regards.
    Seb

    Hi,
    This is Shelley from Cisco TAC. I support the SCE.
This is a bug and was identified a few weeks ago. I would like you to open a TAC case so that we can track and monitor the users affected by this bug. Once you open the case, we will provide you with the Bug ID as well (at this time it is internal, hence we cannot share it).
Hope this answers your query. (Please mark the thread as answered if so, and rate.)
    Shelley Bhalla
    CCIE #20002
    Cisco TAC - Carrier services

  • Backup strategy with no tape drives/library

Hi, I am new to the sysadmin job. We recently set up a system running Solaris 10 with a NetApp as its storage, but we do not yet have a library or tape drive attached to it.
So how should I approach this? Can I set up a LUN on the NetApp and copy all the important stuff to that LUN?
    FR

For the Oracle backup, Oracle should have commands to dump or export the database to a file, and that file could be backed up. Alternatively, there are probably options to put the database in read-only mode, so that you can back up the database files without worrying that they will be changed during the backup.
For other files, if you are using ZFS snapshots, you could back up the ZFS snapshots even if the original file is still open.
You might also want to look at the rsync command; it lets you back up only the files that differ between the source and destination (i.e., you could do differential backups). So you could, for example, do a monthly full backup with tar to the NAS or SAN, a weekly rsync incremental backup, and a daily ZFS snapshot that is kept for a week.

  • RMAN Backup Strategy Comment Needed

    Hi,
    Currently I'm trying to implement a backup strategy with RMAN.
Scenario: we are operational 5 days a week (Monday to Friday). I'm planning my backups as follows:
    Mon – Incremental level 1 differential
    Tue – Incremental level 1 differential
    Wed – Incremental level 1 cumulative
    Thurs – Incremental level 1 differential
    Fri – Incremental level 0 (aka full database backup)
I planned Mon, Tue, and Thu as incremental differentials, and Wed as an incremental cumulative. With Wed as a cumulative backup, my thinking is that if my DB fails on Thu/Fri, recovery will need to apply fewer backups, at the expense of storage.
    Any comment or view about the strategy will be greatly appreciated.
    Thanks and Regards
    Eugene (ET)

Hi,
here is Oracle's suggested backup strategy:
http://download.oracle.com/docs/cd/B19306_01/backup.102/b14192/bkup004.htm
and another reference:
http://books.google.cz/books?id=3aEIIqRHkY8C&pg=RA1-PA326&lpg=RA1-PA326&dq=oracle+rman+backup+stratery&source=bl&ots=DrnVZ3tZPs&sig=QajKM0SXbpoa3eWZgl5kmwmSUPI&hl=cs&ei=o6xcSvSaCpKwsAbmu8yVDA&sa=X&oi=book_result&ct=result&resnum=4
    Regards,
    Tom
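For illustration, Eugene's weekly cycle could be written as RMAN commands along these lines (a sketch only; connection settings, channels, and archivelog handling are omitted, and the syntax should be checked against the docs for your version):

```
# Fri: level 0 (the base backup for the week)
BACKUP INCREMENTAL LEVEL 0 DATABASE;
# Mon, Tue, Thu: differential level 1 (changes since the last level 1 or 0)
BACKUP INCREMENTAL LEVEL 1 DATABASE;
# Wed: cumulative level 1 (all changes since the last level 0)
BACKUP INCREMENTAL LEVEL 1 CUMULATIVE DATABASE;
```

The CUMULATIVE keyword is what makes the Wednesday backup larger but reduces the number of backups recovery must apply, exactly the trade-off described in the post.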

  • Back up strategy with Maxtor OneTouch II

I just bought a Maxtor OneTouch II 300GB. I need help with a backup strategy, with or without Retrospect. The external Maxtor will be used for storage and backup.
My questions, since the instructions and the Help in Retrospect are confusing:
When I first back up the entire startup disk, can I simply drag it to the Maxtor on the Desktop, or should I use Retrospect?
In Retrospect, should I use "Removable Disk" or "File"?
How long does it take with the Maxtor over FireWire 800? I've got a 70GB startup disk to copy. Is it hours?
    Can the copied volume on external disk be used as startup disk?
    I would appreciate your experience.
    Milan

I suggest you use Disk Utility to partition the drive so you have a partition large enough for your boot drive. Then use Carbon Copy Cloner, which you can find at versiontracker.com, to "clone" your boot drive to the partition. You need to use a program like this because various files needed to create a "boot drive" are hidden.
Please note: the best backups are completely disconnected from all computers and power sources. Just something to consider.

Your backup is from a different version of Microsoft SharePoint Foundation and cannot be restored to a server running the current version. The backup file should be restored to a server with version '12.0.0.6318' or later.

I am trying to restore the .bak file into a new site collection in my SP 2010 standalone environment.
I am getting this error:
    PS C:\Windows\system32> stsadm -o restore -url http://srvr1-01:123/sites/Repository -filename "C:\mBKUPCOPY\Sharepoint_bankup.bak"
    STSADM.EXE : Your backup is from a different version of Microsoft SharePoint Foundation and cannot be restored to a server running
    the current version. The backup file should be restored to a server with version '12.0.0.6318' or later.
    At line:1 char:1
    + stsadm -o restore -url http://srvr1-01:123/sites/Repository -filename "C: ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (Your backup is ...6318' or later.:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError

As stated in the other thread on this topic, you can't restore a 2007 backup to 2010; the content needs to be upgraded.
    https://social.technet.microsoft.com/Forums/en-US/31c70f0a-5d89-4308-895b-af0c2b249114/restore-the-site-collection-from-moss-2007-to-sp-2010-site-collec?forum=sharepointadminprevious

  • Need help with backup strategy

So my 2 TB backup drive failed this morning.  I'm using a Seagate drive in an OWC external enclosure.  I plugged it in and the drive didn't pop up on the desktop, nor is it visible in Disk Utility.
    I know that all drives eventually fail, but it seems like I've had more than my fair share of problems lately.  This is of course making me think hard about my backup strategy.  Here's what I'm doing now - I'd appreciate your thoughts on how to make it even more "bulletproof".
    > TimeMachine back-up of boot drive and media drive (with photos, documents, movies, etc.)
    > External clone of boot drive and media drive (on two partitions) - this is the one that failed
    I suppose I could add a third external clone for redundancy, but an offsite backup would probably be even better.  Not quite sure what the best option is there, though.  Any ideas?

I too love automation. However, an automated backup requires the backup drive to be always connected (and perhaps powered) in order to perform the backup. Also, the computer you speak of must remain on (by the sounds of it) 24/7 in order to do a nightly backup at midnight.
First of all, it is probably not best to leave your computer on 24/7. I won't go into all the OS reasons why, but here are two other reasons to think about:
1) Your internal HD will always be powered, which is bad for drives
2) You are constantly using power to run your system, which is not good for your power bill or the greater environment
As for the always-connected backup HD: I only connect and power on my backup drive when I go to do a backup. This leaves it disconnected in case of virus infection, and powered down/disconnected it is also shielded from some of the risk of damage from power surges/spikes (yes, I use a UPS, but these can and have failed).
So to sum up: I back up every day. After I am done working in LR for the day I shut it down, and then start it back up (ugh) so that I can back up my library with all the work I just did (I wish Adobe would do a backup upon closing!).
Then I close down LR again and connect my backup HD via USB. Once connected, it automatically powers up and fires up the backup software; all I do is hit Start. Since this is an incremental backup, it only takes a short while. I go take a break, grab a drink, come back, unplug the backup HD, and turn off the system for the day.

  • Best backup strategy

    Time Machine is brilliant. It's saved me many times.
But recently, the backup drive with a year's worth of backups died.
    I therefore lost ALL my backups.
    Not a major problem as I still have my current data and having re-formatted the Time Machine drive it's merrily backing it all up again.  (I just hope I don't need to recover to last week's backup ... as I no longer have it.)
    But until that's finished I have NO backups!  Eeek!
So what is the best backup strategy, bearing in mind that drives can fail, houses can burn down, etc.?  Should I have two or three Time Machine backup discs and swap them every day, so that if one dies I've still got a one-day-old version?
Making DVD backups and storing them elsewhere is very time-consuming, and while my data is really important to me, it defeats the object if I can't get on with any work on that data because I'm constantly burning lots of DVDs!
    Your views would be appreciated!

I pretty much do a similar thing, but my offsite backup goes to a locked cabinet in my office (easier to get to weekly, to bring home and update, than using a bank; I honestly cannot remember when I last physically went to my bank, it's been years).
TM makes a great source for restoring single files or going back to an earlier version of a file.  The clones are easier for full restoration, or even for booting from temporarily if the machine's boot drive dies and you just want to get in to deauthorize it before sending it in for repairs or such.  Always good to have a bootable clone around, for many reasons.
    My external clones are on firewire 800 bus powered portable drives, again for simplicity (no power cables or bricks to go with them).
    I also still burn to optical disc for specific things like filed tax returns, and other financial documents I need to keep around in the fire safe for some period of time.

  • Backup strategy

    Hi,
    short story:
Due to the structure of the library, I have "original" and "modified" images for photos that have been edited. When importing one library into another without copying the images, I get some/many thumbnails twice. How do I prevent iPhoto from using the original thumbnail when a modified version exists?
    Long story:
I'm looking for a working backup strategy for iPhoto. I'd like to collect photos over time, and when the library reaches some GBytes I'd like to copy it to two external drives in a software RAID set (mirror). I know that I can merge the two libraries (one on the RAID, one on the MacBook) with some piece of software. So far so good.
Now I'd like to have the thumbnails of the photos on the external drive on my MacBook as well. I can import them into a library with the option not to copy the images but leave them where they are (on the external drive). When I'm on the go I can access the thumbnails; when viewing them I am asked to insert the drive.
But now comes my problem: due to the structure of the library, I have "original" and "modified" images for photos that have been edited. So I have some thumbnails twice. How do I prevent iPhoto from using the original thumbnail when a modified version exists?
    Thanks a lot.

What you really need to use is Expression Media. It creates catalogs containing thumbnails of the cataloged photos, and a catalog can be used without the source files being available.
You can add keywords and other identifiers to the photos while just using the catalog, and then when you get back and have the source files available, the new metadata can be applied to the actual files.
    Expression Media appears to be available for the upgrade price of $99 if you are using iPhoto as shown here:
    $99 (Full Version $199)
    For qualifying owners of:
    Licensed copy of an earlier version of Expression Media or any iView Multimedia product.
    OR
    Any licensed photographic software, including Windows Photo Gallery or iPhoto
That info is from this site: http://www.microsoft.com/expression/products/Upgrade.aspx and by clicking on the "here" link on that page. I've emailed the developers to see if I've interpreted that page correctly.
    In the meantime you can download and run the demo. You can also catalog your iPhoto library with EM and write all metadata entered in iPhoto back to the original files. Steps 1-7 of Old Toad's Tutorial #1 describe how. That page is in revision since iView is no longer available and Step #2 is slightly different with the EM demo. It should be changed by the end of the day today.
    I use it in conjunction with iPhoto. EM is my primary DAM (digital asset management) application and I use iPhoto for projects like books, calendars, slideshows, etc.
TIP: As insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto (iPhoto.Library for iPhoto 5 and earlier) database file and keeping it current. If problems crop up where iPhoto suddenly can't see any photos, or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backing up after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup, and it's good insurance.
    I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 6 and 7 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to backup the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
    Note: There's now an Automator backup application for iPhoto 5 that will work with Tiger or Leopard.
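If you'd rather not use an Automator application, the same database-file backup can be sketched as a short shell function (the function name is mine, and the default library path in the commented call is only an assumption):

```shell
# backup_iphoto_db DBFILE DESTDIR - copy the iPhoto database file with a
# date stamp, so a restorable copy exists after each import/edit session.
backup_iphoto_db() {
  dbfile="$1"; destdir="$2"
  cp "$dbfile" "$destdir/$(basename "$dbfile").$(date +%Y%m%d)"
}
# typical call (default library location assumed):
# backup_iphoto_db "$HOME/Pictures/iPhoto Library/Library6.iPhoto" "$HOME/Pictures"
```

Run it after each import or editing session, as the tip above recommends, and you always have yesterday's database to fall back on.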

  • Backup strategy in FRA

    Hi Experts,
BANNER
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
PL/SQL Release 11.1.0.6.0 - Production
CORE 11.1.0.6.0 Production
TNS for HPUX: Version 11.1.0.6.0 - Production
NLSRTL Version 11.1.0.6.0 - Production
I would like to ask for some advice on placing my backup in the FRA.
I read in a book that it is advised not to put the archivelog backup in the FRA, since if something happens to the disk that stores the FRA, everything will be gone. However, the concept of the FRA is that of a logically centralized backup location.
So, based on your experience, what is the best way to utilize the FRA in a backup strategy?
Is it safe to put everything in the FRA, or should we still put the archivelog backup in a different location?
    thanks

The idea is that you should never have only a single copy of your backup anyway, whatever the file type. It's true that losing the FRA would make the backups unavailable, but you should have eventually pushed the backups from it to a tape drive to safeguard them. So there won't be any harm in putting the backup in the FRA, as long as you multiplex the backup and keep a copy in another location as well.
    HTH
    Aman....

  • Newbie questions about Database replication, "backups", and sql version

Just created my first SQL database using the "SQL Database Management Portal". A blast, but three questions come to mind:
1. Is this DB automatically "replicated"? (In other words, is DB "failover" or clustering a feature of all newly created Azure databases?)
2. Our on-premise model is to make a daily DB backup which is saved to a nightly tape. If needed, we can restore a two-month-old database backup under a new name to compare it with the current one. Does SQL Azure support this capability or not? Can it be requested?
3. Which on-premise version of SQL Server is "SQL Azure" closest to? (2014?)
    TIA,
    edm2
    P.S. My database was created using the "web" edition.

    Hi edm,
According to your description, you created a SQL Azure database on the Azure platform. The replication feature is not supported by Microsoft Azure SQL Database. If you want to sync a SQL Azure database with a local database, you can use the SQL Azure Data Sync service.
    For more information, see:
    http://blogs.technet.com/b/the_cloud_pilot/archive/2011/10/24/your-first-sql-azure-data-sync-step-by-step.aspx
In addition, if you have Web or Business Edition databases, you must create your own backup strategy. You can use database copy or the Import and Export services to create copies of the data and export the file to a Microsoft Azure storage account. Meanwhile,
Windows Azure SQL Database provides a mechanism for automating the process of exporting a database to BACPAC files at a set interval and frequency. For more information, see:
    Schedule an Automated Export:
    http://msdn.microsoft.com/en-us/library/hh335292.aspx#automate
    Windows Azure SQL Database Backup and Restore strategy:
    http://www.mssqltips.com/sqlservertip/3057/windows-azure-sql-database-backup-and-restore-strategy/
Currently, Azure uses a special version of Microsoft SQL Server as its backend. It provides high availability by storing multiple copies of databases, elastic scale, and rapid provisioning. When we check the version of SQL Server, it shows as follows:
    Microsoft SQL Azure (RTM) - 11.0.9216.62
    Regards,
    Sofiya Li
    If you have any feedback on our support, please click
    here.
    Sofiya Li
    TechNet Community Support

  • Synchronous multimaster replication Backup strategy using RMAN

    Hi all,
I am using synchronous multimaster replication. My question: does the backup strategy have to be performed at all the master sites, or only at one master site? What problems will I encounter with RMAN backups in my scenario? Please help me out with your suggestions.
    nagaraj.K
    [email protected]

    You ask: "I want to configure backup strategy using RMAN. any one can help me that"
And the answer is clearly no, we cannot.
    An RMAN backup strategy depends on your SLA (Service Level Agreement) with your customers that you didn't post. What is your down-time window? What is your skill set?
    You ask: "How to configure for RMAN Incremental backup?"
    Read the docs at http://tahiti.oracle.com for your version (which you didn't mention).
    You ask: "What will be backup space and there size ?"
    We have no idea. Are you going to create an image copy or a backup set? Read the docs and learn the difference. Are you going to turn on compression? Are you going to back up only changed blocks? We don't know.
    You ask: "how to manage growing online archiving files?"
Again, we can't help you without knowing your SLA. How many MB/GB are they? How long do you need to retain them to meet your SLA? Once you figure this out, back up to tape the ones you don't need to keep on the server.
    You ask: "how to manage growing data and there disk space?"
    This is one we can answer: BUY MORE DISK! That was easy.
    You ask: "How we can give good performance our CPU and memory?"
    Do you really expect that we can with zero knowledge of your environment, your version, your application, etc. distill into a few short paragraphs the collective wisdom of Cary Millsap, Jonathan Lewis, and the rest of the Oak Table Network membership?
    Of course we can not. So what you need to do is buy Cary's book, Jonathan's book, and take their classes.
    You ask: "we need keep all archive log in backup files or we need to remove old archive files?"
    Remove the old ones after being absolutely sure you have a good backup.
    You ask: "where we can take backup tape drive,SAN,disk removable hard disk? which one is better?"
    No one solution is better than the other. They are all adequate.

  • Can't open version 3 documents with version 5

    Hello,
My company didn't upgrade me to version 4, so I'm trying to open my version 3 .cp files with version 5. Every time I try to open one, it crashes.
I'm using a different computer (still Windows XP, though) than I had when I last used the version 3 files, and I am accessing them on a remote server via VPN.
Any ideas? I've tried restarting my computer when a file doesn't open, and looking for a version 4 download so I can try opening the file that way, saving it, and then opening it in version 5, but I haven't had any luck so far.
    Thanks for your help,
    Elaine

    Hi Elaine
I see this is your second post here in the forums, so it occurs to me that you may be unaware of one of Captivate's "golden rules" that helps keep you out of trouble.
    That rule?
    always, Always, ALWAYS work with projects ONLY when you have them stored on your local C drive.
never, Ever, NEVER EVER consider opening Captivate, clicking File > Open, navigating to a network drive, and trying to make edits while the project is on the network.
    Sometimes your IT peeps will help you aim the shotgun squarely at your leg and tell you to pull the trigger. They want you to work with your files on the LAN because their precious servers are backed up and your local hard drives typically aren't. It's perfectly fine to store your projects on the LAN if the only purpose is for backup. So consider copying the projects there at the end of each day to satisfy those backup requirements.
    Cheers... Rick
    Helpful and Handy Links
    Begin learning Captivate 5 moments from now! $29.95
    Captivate Wish Form/Bug Reporting Form
    Adobe Certified Captivate Training
    SorcererStone Blog
    Captivate eBooks

  • Limiting Time Machine backup Size with WD MyBookLive and 10.8

I cannot take credit for any part of this solution; I merely merged and clarified the solutions discovered by two Apple Support Communities contributors much smarter than I (namely "Pondini" from Florida and "himynameismarek"), which worked perfectly for my situation. All kudos to these two!
I have about average or better PC skills, but am an absolute newbie with Apple. This week I got a new iMac. Since I have a number of home PCs all sharing files and backup space on a Western Digital MyBookLive ("WD MBL") 3TB network drive (NAS), naturally I wanted to use it to back up the new Mac rather than rushing out to buy an Apple Time Capsule.
There are hundreds of threads on limiting the size of a Time Machine ("TM") backup, many of which require entries in Terminal or were devised on older versions of OS X. I'm running OS X Mountain Lion 10.8, so I was concerned they might not work.
    The issues I wanted to resolve were:
Time Machine will use up all of the space on my WD MBL if left to its own devices.
The WD MBL is compatible with Macs and PCs, which is good, but unlike a backup in Windows 7 Pro, which will let you put backups in a mapped "share" you create yourself, Apple TM backups will not; they end up in a hidden folder on the NAS (much like PC backups done with WD SmartWare).
At first I thought maybe I could limit the size of a share created on the MBL, but that's not possible, at least not that I've seen, and I have searched for days.
    The solutions:
First, make sure you have the latest firmware for the WD MBL; as of today it is MyBookLive 02.11.09-053. From what I've read, Western Digital fixed the compatibility issues with 10.8 Mountain Lion just recently.
Next, you need to start TM so that it begins to create a backup. You can stop the backup once you see files being copied. Do this before you walk through the video tutorial by Marek below. The WD MBL will create the hidden folder you need to find for TM backups. This folder is called "TimeMachine", but it is not visible even in the "MBL_NAME-backup" folder in Finder.
Open Safari and type "afp://xxx.xxx.x.xxx", using the IP address of your own MBL. Mine was 192.168.1.120; yours will be different.
It will ask how you want to connect. CHOOSE AS A GUEST, even if your MBL is password-protected; I'm not sure why it works, but it does. Then a window will come up asking which share you'd like to mount. You will see all of your own shares, plus one called "software" and now one called "TimeMachine". Choose that one.
Now in Finder you will see a mounted shared item called "YOUR_MBL_NAME-" (the same as the one that is probably already there, but with a dash at the end). You'll also see a new "device" in the device list called "Time Machine Backups". If you have already watched the video tutorial by Marek, you know you are looking for a file called "YOUR_MACHINE_NAME.sparsebundle". If you browse the "Backups.backupdb" folder in the Time Machine Backups device you won't find it (again, I don't know why). It resides in the hidden "TimeMachine" folder that is now visible in the share you just mounted in step 4.
    NOW watch this video tutorial http://youtu.be/Nq7mSizqUSI and follow it step by step.
    Voila... issues resolved. Thank you Pondini and Marek!

Try using Terminal to limit the Time Machine sparsebundle size on a Time Capsule;
it should also work to limit the Time Machine backup size on any NAS or external disk (or not...):
sudo defaults write /Library/Preferences/com.apple.TimeMachine MaxSize 500000
To return to unlimited:
sudo defaults delete /Library/Preferences/com.apple.TimeMachine MaxSize
If you want to reclaim the space of deleted files, shrink the bundle with:
hdiutil resize -size 500g -shrinkonly /Volumes/TimeMachineYOURNAME/YOURNAME.sparsebundle/
    Regards
