The best practice when backing up your files

Hi,
I recently started using Carbon Copy Cloner after using only Time Machine as my backup solution.  I understand the purpose of each application (TM and a cloning utility such as Super Duper or CCC), but I was wondering what the best process is when using these two methods to back up your files.
For instance, I use TM to back up my files as frequently as possible to keep all my recent changes updated, but I don't see how I would keep my clone updated and make sure that, when something happens, I will have a workable boot disk and not something that contains corrupted files.  In other words, these cloner utilities have some sort of feature to keep your clone drive updated every time something changes, but that got me wondering: what if you update your clone drive and one of the updated files is corrupted, without you knowing it of course?  Now you have a backup that contains bad files that may affect your system (corrupted font files, etc.), and when you realize that something is not working right and want to recover from your clone, you will basically end up with the same problem, because the bad files were also backed up.
How do you ensure that your clone will always be ready and that it will not contain bad files?
What is your backup process?  Be so kind and share your method.
Again, I'm OK with TM; I just need to know how you guys are managing your clone drives using either Super Duper or Carbon Copy Cloner.
Thanks a lot!

I use CCC exclusively and update my clones every couple of days or after installing updates. I have no use for TM, since I've never had to go back and get something I deleted or changed. I do, however, boot into those clones on a routine basis to ensure that they look and act like the originals. This has worked for over seven years, but YMMV.
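One way to gain a little extra confidence that a clone isn't silently carrying corrupted copies is to spot-check it against the source. The sketch below is only an illustration of that idea in Python, not a feature of CCC or Super Duper; the volume paths and sample size are made-up examples.

    #!/usr/bin/env python3
    # Spot-check a mounted clone against the source volume by comparing file hashes.
    # Illustrative sketch only -- the paths and sample size below are hypothetical.
    import hashlib
    import random
    from pathlib import Path

    SOURCE = Path("/")                    # assumed: the live system volume
    CLONE = Path("/Volumes/CloneBackup")  # assumed: the mounted clone
    SAMPLE_SIZE = 200                     # number of files to spot-check

    def sha256(path: Path) -> str:
        # Hash a file in 1 MB chunks so large files don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def spot_check() -> None:
        candidates = [p for p in (SOURCE / "Users").rglob("*") if p.is_file()]
        for path in random.sample(candidates, min(SAMPLE_SIZE, len(candidates))):
            twin = CLONE / path.relative_to(SOURCE)
            try:
                if not twin.exists():
                    print(f"MISSING on clone: {path}")
                elif sha256(path) != sha256(twin):
                    print(f"MISMATCH: {path}")
            except (PermissionError, OSError):
                pass  # live files may be unreadable or change underneath us

    if __name__ == "__main__":
        spot_check()

Keep in mind a mismatch only means the file differs from the last clone update (it may simply have changed since), so this is a rough sanity check, not proof of corruption; booting the clone, as described above, remains the real test.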

Similar Messages

  • What is the best practice in securing deployed source files

    hi guys,
    Just yesterday, I developed a simple image cropper using AJAX and Flash. After compiling the package, I noticed the package/installer delivers the exact same source files as developed to the installed folder.
    This didn't concern me much at first, but come to think of it, this question keeps coming back to me:
    "What is the best practice in securing deployed source files?"
    How do we secure an application's installed source files from being tampered with, especially after they have been installed? Modifying spraydata.js files, for example, can be done easily with an editor.

    Hi,
    You could compute a SHA or MD5 hash of your source files on first run and save these hashes to the EncryptedLocalStore. On startup, recompute and verify. (This, of course, fails to address the case where the main app's SWF / SWC / HTML itself is decompiled.)
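    The suggestion above is specific to AIR's EncryptedLocalStore. As a generic illustration of the same hash-on-first-run / verify-on-startup idea, here is a small Python sketch; the app folder and manifest file name are made-up, and in practice the manifest would need to live somewhere an attacker can't simply regenerate it (which is exactly what EncryptedLocalStore is for).

        # Minimal sketch: record file hashes on first run, verify them on startup.
        # The APP_DIR and MANIFEST locations are hypothetical examples.
        import hashlib
        import json
        from pathlib import Path

        APP_DIR = Path("app")             # assumed: the deployed source files
        MANIFEST = Path("manifest.json")  # assumed: would live in a protected store

        def digest(path: Path) -> str:
            return hashlib.sha256(path.read_bytes()).hexdigest()

        def build_manifest() -> None:
            # First run: record a hash for every deployed file.
            hashes = {str(p): digest(p) for p in APP_DIR.rglob("*") if p.is_file()}
            MANIFEST.write_text(json.dumps(hashes, indent=2))

        def verify() -> list:
            # Startup: list files whose current hash no longer matches the manifest.
            recorded = json.loads(MANIFEST.read_text())
            return [name for name, h in recorded.items()
                    if not Path(name).is_file() or digest(Path(name)) != h]

        if __name__ == "__main__":
            if not MANIFEST.exists():
                build_manifest()
            else:
                bad = verify()
                if bad:
                    print("Tampered or missing files:", bad)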

  • What is the best practice when any of Src/Tgt DB is re-started in streams

    We have a live production dual-direction Streams environment (A --> B and B --> A), and due to a corrupt data file (DBF) at source A, it was brought down and all traffic was switched to B. All Streams capture, propagation and apply processes were enabled, and messages were captured at B and propagated to A (but they could not reach A and be applied there, as it was down). When A was re-started, some of the captured messages in B never got applied at A. What could be the possible reason? What is the best practice in a Streams environment when any of the source/target instances is shut down and re-started?

    Hi Serge,
    A specific data file got corrupted and they restored it. Can you please send me the URL for the metalink document about that bug in 9.2. I'd really appreciate your help on this.
    Thx,
    Amal

  • What is the best solution when I get a "DLL file is missing" error: run SFC or do a System Restore?

    hi,
    What is the best solution if I get an error that some DLL file is missing: run the SFC utility, do a System Restore, or some other solution?
    thanks
    johan
    h.david

    If SFC cannot repair it, then it's fine to perform a System Restore, but it will revert some changes you have made on your system.
    More detailed information would also be helpful.
    Regards
    Yolanda
    TechNet Community Support
    Most error messages relating to missing .dll files are due to virus scanners. The scanner removes the threat (the .dll file) but leaves behind the program that tries to invoke the .dll file. Your recipe of using System Restore to fix the problem could well backfire: it might restore the infected .dll file! This is why it is important to know the full wording of the error message. If the missing file resided in the System32 folder, then it's probably OK to restore it. If it resided elsewhere, then restoring it would probably be a bad idea.

  • What is the best way to back up iPhoto "files"?

    I use Time Machine, but this backs up the "library".  I want to back up the files.
    I have about 40,000 photos that I just want to put on an external drive.  If I highlight too many photos or events in iPhoto and drag them into a new folder, it crashes my system.  If I do one event at a time it's gonna take forever... Any advice?  I am running a pretty old OS...
    One of the reasons I want to do this is that when I upgrade the OS, it's gonna erase the hard drive, and I want to make sure the files are stored in one other place besides the Time Machine backup before I upgrade...
    THANKS!
    JB

    File -> Export.
    Select what you want to export: Original, JPEG or TIFF, and then export to folders on the desktop. Drag those wherever you need them.
    Or, just drag the whole iPhoto Library from the Pictures folder to the external drive - that way you get everything, files included.
    Regards
    TD
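    If dragging tens of thousands of photos through the iPhoto window is what crashes the system, doing the copy outside iPhoto is another option. The sketch below is only an illustration in Python; the library location, the external volume name, and the "Masters"/"Originals" folder names are assumptions that vary by iPhoto version, so check what your library package actually contains first.

        # Copy the original photo files out of an iPhoto library to an external drive.
        # Paths and folder names below are hypothetical and depend on your iPhoto version.
        import shutil
        from pathlib import Path

        LIBRARY = Path.home() / "Pictures" / "iPhoto Library"   # assumed library location
        DEST = Path("/Volumes/PhotoBackup/iPhoto Files")        # assumed external drive

        def copy_originals() -> None:
            copied = 0
            for folder in ("Masters", "Originals"):  # folder name varies by iPhoto version
                src = LIBRARY / folder
                if not src.is_dir():
                    continue
                for photo in (p for p in src.rglob("*") if p.is_file()):
                    target = DEST / photo.relative_to(LIBRARY)
                    target.parent.mkdir(parents=True, exist_ok=True)
                    if not target.exists():
                        shutil.copy2(photo, target)  # copy2 preserves timestamps
                        copied += 1
            print(f"Copied {copied} files to {DEST}")

        if __name__ == "__main__":
            copy_originals()

    Either way, TD's advice stands: copying the whole library package is the only approach that also preserves albums, events and edits.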

  • What is the best practice when distributing a desktop application which uses SMO

    I have a WPF application which installs a web site. One of the installation steps is to execute a number of SQL scripts which install the website's database. The database server is not necessarily the same machine the product is installed on - in most cases it's a different server on the same network. All the scripts are generated by a build of a database project (Visual Studio 2012 DB project) - hence all of them are in sqlcmd mode.
    I can work around sqlcmd variables (the :variable type of creature) by making some simple text replacements. The big problem is with the "GO" statements. At first I thought that I could split the script into many subscripts using a regex split operation defining "GO" to be the separator - this did not work; having a word like "Polygon" in the script (which contains "go") caused the whole operation to fail. Unfortunately I have no control over the content of the script, so if someone decided to put a comment like /* If you do not know how to use this script then GO TO HELL */ it would break the installation process.
    I have then tried executing the whole script (with the "GO" statements) using SMO. This works well on my development machine, but once I try to do this on a server, the application falls over with missing DLLs. For Microsoft.SqlServer.ConnectionInfo, Microsoft.SqlServer.Smo, Microsoft.SqlServer.SqlClrProvider and Microsoft.SqlServer.SqlEnum the solution is simple - I can just set "Copy Local" to true and those DLLs will be distributed with the application.
    The big problem is with Microsoft.SqlServer.SqlClrProvider.dll. OK, I can get this library from my local machine - but I am not sure which SQL version it will be used with. I can't include more than one of those DLLs, as all of them have the same name.
    I know that the official MS line is to install the SQL feature pack (http://msdn.microsoft.com/en-us/library/ff713979.aspx). But again - I do not know which version of SQL Server my application will be working with, and I need to make the installation process as simple as possible.
    My question is - is there a way I can distribute a desktop application (which makes use of SMO) without any prerequisites and for all versions of SQL Server? If there is no such thing, then using SMO is pretty much pointless, as it cannot be distributed - and even then it's not version-agnostic... Thanks for any help!

    I agree with Olaf, when you want to ensure that you can interact with any version of SQL Server.
    The problem you will run into if you are not controlling the version of SQL Server is the version of SMO. If you use SMO version 10.5 and need to deploy using SMO to SQL 2012, it will not work. SMO will always be backward compatible but not forward compatible.
    So you would have to set a maximum version of SQL Server your setup would deploy to in order to control errors and failures. So if you distribute SQL 2012 SMO, then the maximum version would be SQL 2012 for the deploy.
    So using a tool that is version-agnostic is the way to go, and you can also use System.Data.SqlClient in .NET to execute statements against SQL Server. This will be version-agnostic too.
    Ben Miller - SQL Server MVP, SQL MCM 2008 - @DBADuck http://www.dbaduck.com
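    For what it's worth, the "GO" problem that started this thread is avoidable without SMO, because GO is a client-side batch separator that must appear on a line by itself (optionally followed by a repeat count); splitting only on such lines ignores words like "Polygon" or comments that mention GO mid-line. The sketch below illustrates that idea in Python with pyodbc rather than the poster's .NET stack; the connection string and script file name are made-up, and it ignores sqlcmd variables and GO repeat counts.

        # Split a T-SQL script on GO batch separators and run each batch in turn.
        # Illustrative only: the connection string and file name are hypothetical.
        import re
        import pyodbc  # assumes the pyodbc package and a SQL Server ODBC driver

        CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=dbserver;"
                    "DATABASE=master;Trusted_Connection=yes")
        SCRIPT = "install_database.sql"

        # GO must sit on its own line (optionally with a count), so match whole lines only.
        GO_LINE = re.compile(r"^\s*GO(?:\s+\d+)?\s*$", re.IGNORECASE | re.MULTILINE)

        def run_script() -> None:
            with open(SCRIPT, encoding="utf-8") as f:
                sql = f.read()
            batches = [b.strip() for b in GO_LINE.split(sql) if b.strip()]
            with pyodbc.connect(CONN_STR, autocommit=True) as conn:
                cur = conn.cursor()
                for batch in batches:
                    cur.execute(batch)

        if __name__ == "__main__":
            run_script()

    A GO on its own line inside a block comment would still fool this splitter, so it is a pragmatic workaround rather than a full T-SQL parser.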

  • What is the best way to back up my files (ai. …)

    I need to get my files off my computer.  They are very large and are bogging the system down.  I have tried external hard drives and have had success with that for a while, but I have had two fry on me.  Can I store them somewhere online?  Does Apple offer that?

    TimeMachine
    https://support.apple.com/kb/PH4542
    https://support.apple.com/kb/PH4351
    https://support.apple.com/kb/HT1427
    https://support.apple.com/kb/HT3275
    https://support.apple.com/kb/HT4878
    https://www.apple.com/support/timemachine/
    How to clone your system:
    http://macperformanceguide.com/Mac-HowToClone-backup.html
    http://macperformanceguide.com/Mac-HowToClone.html
    http://www.macupdate.com/app/mac/7032/carbon-copy-cloner
    http://www.macperformanceguide.com/blog/2012/20120711_2-MacPro-internal-clone-backup.html
    My guess is your choices of backup drive, external case, and interface could be improved.
    You should always have a set offline at all times, as well as one off-site - volume cloning as well as Time Machine.
    And keep all your equipment on a UPS.

  • What are the best practices for backing up SAP B1?

    Right now, we are using a nightly tape backup. Is this sufficient? Or are there other, better options?
    Thank you!
    John Sefton

    It is sufficient, as you would have the default SBO-Backup utility, which can be automated.
    In addition, you can even take a manual backup from the back end.

  • PHP 5.6.8 to 5.5.x. What is the best practice when trying to do so?

    Hi everyone,
    I've been using Arch for a while and I try hard to keep it simple.
    Most of the time I'm able to do so, but right now I'm not sure what I should do.
    I have a LAMP stack installed using PHP 5.6.8.
    What I don't like about it is that most rolling-release Linux distributions just don't care about the fact that most PHP software (unless it is very simple) will break due to the poor backward compatibility of PHP itself.
    Ideally, a distribution should package all currently developed versions of software such as PHP (5.6/5.5/5.4), but this is not the case...
    So, basically, I can't run Magento on it.
    I have been investigating my options and I ended up here:
    https://wiki.archlinux.org/index.php/Ar … ck_Machine
    It seems that it expects me to have the old version in the pacman cache, which I don't have (I had been using Vagrant with VirtualBox before).
    Also, it seems that I could install it from the AUR, but I can't find a 5.5.x package.
    So, what can I do?
    It seems that there is no Arch setup capable of running Magento at this point?
    Is it really my only option to do everything from scratch?
    Sorry if I'm missing something obvious.

  • Best Method of Backing Up Computer Files

    Hi,
    What is the best way of backing up your computer files (almost like a clone of the Macintosh HD with Mac OS X Snow Leopard, 10.6.2) in case of a hard drive crash? I am new to the Mac and I wanted to have a backup in case of a hard drive crash, so I could retrieve my files from wherever. Any and all ideas will be appreciated.
    Thanks,
    relaxed4102

    There are three basic types of backup applications: *Bootable Clone, Archive, and Time Machine.*
    This is a general explanation and comparison of the three types. Many variations exist, of course, and some combine features of others.
    _*BOOTABLE "CLONE"*_
    These make a complete, "bootable" copy of your entire system on an external disk/partition, a second internal disk/partition, or a partition of your internal disk.
    Advantages
    If your internal HD fails, you can boot and run from the clone immediately. Your Mac may run a bit slower, but it will run, and contain everything that was on your internal HD at the time the clone was made or last updated. (But of course, if something else critical fails, this won't work.)
    You can test whether it will run, just by booting-up from it (but of course you can't be positive that everything is ok without actually running everything).
    If it's on an external drive, you can easily take it off-site.
    Disadvantages
    Making an entire clone takes quite a while. Most of the cloning apps have an update feature, but even that takes a long time, as they must examine everything on your system to see what's changed and needs to be backed-up. Since this takes lots of time and CPU, it's usually not practical to do this more than once a day.
    Normally, it only contains a copy of what was on your internal HD when the clone was made or last updated.
    Some do have a feature that allows them to retain the previous copy of items that have been changed or deleted, in the fashion of an archive, but of course that has the same disadvantages as an archive.
    _*TRADITIONAL "ARCHIVE" BACKUPS*_
    These copy specific files and folders, or in some cases, your entire system. Usually, the first backup is a full copy of everything; subsequently, they're "incremental," copying only what's changed.
    Most of these will copy to an external disk; some can go to a network locations, some to CDs/DVDs, or even tape.
    Advantages
    They're usually fairly simple and reliable. If the increments are on separate media, they can be taken off-site easily.
    Disadvantages
    Most have to examine everything to determine what's changed and needs to be backed-up. This takes considerable time and lots of CPU. If an entire system is being backed-up, it's usually not practical to do this more than once, or perhaps twice, a day.
    Restoring an individual item means you have to find the media and/or file it's on. You may have to dig through many incremental backups to find what you're looking for.
    Restoring an entire system (or large folder) usually means you have to restore the most recent Full backup, then each of the increments, in the proper order. This can get very tedious and error-prone.
    You have to manage the backups yourself. If they're on an external disk, sooner or later it will get full, and you have to do something, like figure out what to delete. If they're on removable media, you have to store them somewhere appropriate and keep track of them. In some cases, if you lose one in the "string" (or it can't be read), you've lost most of the backup.
    _*TIME MACHINE*_
    Similar to an archive, TM keeps copies of everything currently on your system, plus changed/deleted items, on an external disk, Time Capsule (or USB drive connected to one), internal disk, or shared drive on another Mac on the same local network.
    Advantages
    Like many Archive apps, it first copies everything on your system, then does incremental backups of additions and changes. But TM's magic is, each backup appears to be a full one: a complete copy of everything on your system at the time of the backup.
    It uses an internal OSX log of what's changed to quickly determine what to copy, so most users can let it do its hourly incremental backups without much effect on system performance. This means you have a much better chance to recover an item that was changed or deleted in error, or corrupted.
    Recovery of individual items is quite easy, via the TM interface. You can browse your backups just as you do your current data, and see "snapshots" of the entire contents at the time of each backup. You don't have to find and mount media, or dig through many files to find what you're looking for.
    You can also recover your entire system (OSX, apps, settings, users, data, etc.) to the exact state it was in at the time of any backup, even if that's a previous version of OSX.
    TM manages its space for you, automatically. When your backup disk gets nearly full, TM will delete your oldest backup(s) to make room for new ones. But it will never delete its copy of anything that's still on your internal HD, or was there at the time of any remaining backup. So all that's actually deleted are copies of items that were changed or deleted long ago.
    TM examines each file it's backing up; if it's incomplete or corrupted, TM may detect that and fail, with a message telling you what file it is. That way, you can fix it immediately, rather than days, weeks, or months later when you try to use it.
    Disadvantages
    It's not bootable. If your internal HD fails, you can't boot directly from your TM backups. You must restore them, either to your repaired/replaced internal HD or an external disk. This is a fairly simple, but of course lengthy, procedure.
    TM doesn't keep its copies of changed/deleted items forever, and you're usually not notified when it deletes them.
    It is fairly complex, and somewhat new, so may be a bit less reliable than some others.
    RECOMMENDATION
    For most non-professional users, TM is simple, workable, and maintenance-free. But it does have its disadvantages.
    That's why many folks use both Time Machine and a bootable clone, to have two, independent backups, with the advantages of both. If one fails, the other remains. If there's room, these can be in separate partitions of the same external drive, but it's safer to have them on separate drives, so if either app or drive fails, you still have the other one.
    _*OFF-SITE BACKUPS*_
    As great as external drives are, they may not protect you from fire, flood, theft, or direct lightning strike on your power lines. So it's an excellent idea to get something off-site, to your safe deposit box, workplace, relative's house, etc.
    There are many ways to do that, depending on how much data you have, how often it changes, how valuable it is, and your level of paranoia.
    One of the best strategies is to follow the above recommendation, but with a pair of portable externals, each 4 or more times the size of your data. Each has one partition the same size as your internal HD for a "bootable clone" and another with the remainder for TM.
    Use one drive for a week or so, then take it off-site and swap with the other. You do have to tell TM when you swap drives, via TM Preferences > Change Disk; and you shouldn't go more than about 10 days between swaps.
    There are other options, instead of the dual drives, or in addition to them. Your off-site backups don't necessarily have to be full backups, but can be just copies of critical information.
    If you have a MobileMe account, you can use Apple's Backup app to get relatively-small amounts of data (such as Address book, preferences, settings, etc.) off to iDisk daily. If not, you can use a 3rd-party service such as Mozy or Carbonite.
    You can also copy data to CDs or DVDs and take them off-site. Re-copy them every year or two, as their longevity is questionable.
    Backup strategies are not a "One Size Fits All" sort of thing. What's best varies by situation and preference.
    Just as an example, I keep full Time Machine backups, plus a CarbonCopyCloner clone (updated daily, while I'm snoozing) locally, plus small daily Backups to iDisk, plus some other things on CDs/DVDs in my safe deposit box. Probably overkill, but as many of us have learned over the years, backups are one area where +Paranoia is Prudent!+

  • Best practice when using Tangosol with an app server

    Hi,
    I'm wondering what the best practice is when using Tangosol with an app server (WebSphere 6.1 in this case). I've been able to set it up using the resource adapter, tried using distributed transactions, and it appears to work as expected - I've also been able to see cache data from another app server instance.
    However, it appears that cache data vanishes after a while. I've not yet been able to put my finger on when, but garbage collection is a possibility I've come to suspect.
    Data in the cache survives the removal of the EJB, but somewhere later down the line it appears to vanish. I'm not aware of any expiry settings for the cache that would explain this (to the best of my understanding the default is "no expiry"), so GC came to mind. Would this be the explanation?
    If that is the explanation, what would be a better way to keep the cache from being subject to GC - to have a "startup class" in the app server that holds on to the cache object, or would there be other ways? Currently the EJB calls getCacheAdapter, so I guess Bad Things may happen when the EJB is removed...
    Best regards,
    /Per

    Hi Gene,
    I found the configuration file embedded in coherence.jar. Am I supposed to replace it and re-package coherence.jar?
    If I put it elsewhere (in the "classpath") - is there a way I can be sure that it has been found by Coherence (like a message in the standard output stream)? My experience with WebSphere is that the "classpath" is a rather vague concept; we use the J2CA adapter, which most probably has a different class loader than the EAR that contains the EJB, and I would rather avoid doing a lot of trial-and-error corrections to a file just to find that it's not actually being used.
    Anyway, at this stage my tests are still focused on distributed transactions/2PC/commit/rollback/recovery, and we're nowhere near 10,000 objects. As a matter of fact, we haven't had more than 1024 objects in these app servers. In the typical scenario where I've seen objects "fade away", there have been only one or two objects in the test data. And they both disappear...
    Still confused,
    /Per

  • Want the best practices docs on Oracle database Admin provided by Oracle

    Hi there,
    I have looked everywhere and didn't find best-practices docs on Oracle database administration, especially on creating DBs in Oracle 10g. I can find bits and pieces here and there, but I didn't find it all incorporated in one place. Could somebody direct me to, or provide me with, such a document?

    OK, I'm not looking for the Oracle-provided manuals to find out the best practices in the DB field. I'm looking for the best practices when creating a DB in Oracle, for example. I can read all the Oracle manuals in the Oracle Tahiti world, but I don't know the most used and practical things to do when creating a DB. If I had to define the best practices when creating a DB, here is the checklist:
    Here are the best practices when creating the Oracle 10 DBs:
    1.     Create meaningful database name
    2.     Create the directory structure following the Optimal Flexible Architecture(OFA)
    a.     Give suffix for the redo log .LOG
    b.     Give suffix for the data file .DBF
    c.     Give suffix for the Control file .CTL
    3.     Enable Password complexity
    4.     Enable ARCHIVELOG Mode
    5.     Use Oracle Managed Files
    6.     Create separate tablespace for data files and Indexes
    7.     Put archive log in different multiple drives
    8.     Multiplex redo log files, and control file
    etc.
    I want to see what other DB gurus consider the best practices in the field.

  • What is the best method for backing up movies?

    I've been using Handbrake to rip movies that I OWN so I can watch them on my Apple TV. Everything works great, I'm just wondering what the best way to back up these files is. Any ideas?
    I own about 400 DVDs and would like to have them all in my iTunes library so I can watch them on my Apple TV. I've spent many, many hours with Handbrake and putting them on external FireWire hard drives, and I really love this system. I can watch any movie any time without loading up a DVD.
    BUT, I would hate to have a hard drive failure and lose all of the time I've spent doing this. Any ideas on the best way to back up these movie .MP4 files? I guess I could burn a ton of DVDs, but I'm guessing there is an easier way.
    And PLEASE, leave out the copyright comments. I BOUGHT these DVDs, and there is NO reason I shouldn't be able to watch them on my Apple TV...
    thanks,
    Brown.Recluse

    I'm in a similar situation including movies I've purchased from iTunes...
    Here's my setup:
    I have all my iTunes data (music, movies, etc.), as well as about 10 GB of photos, stored on a network storage device that uses RAID-5 with SATA disks. This is basically a little toaster-sized NAS machine with four hard drives in it (www.infrant.com). If one of these drives dies, I get alerted and can insert a new one without missing a beat, since the data is stored redundantly across all drives. I can also just yank out any one of the four drives while the thing is running and my data is still safe (I tried this after I bought it, as a test).
    That's how I prevent problems with a single disk... redundancy.
    Now onto "backups"...
    Nightly, my little RAID toaster automatically backs itself up to an attached 250 GB USB drive. However, these backups are only of my critical data... documents, photos and home movies... I don't bother backing up my "Hollywood" movies, since I can live without them in a disaster.
    ... I actually don't permanently store anything I care about on a laptop or a desktop. It all goes on the NAS box (Network Attached Storage) and then my other computers use it as a network drive. It's attached via Gigabit to my main computer and via wireless to most everything else.
    My Achilles heel is that I don't store anything outside of my house yet. If I was smart, I'd alternate two USB drives attached to my NAS box and always keep one of them somewhere else (safe deposit box?).
    ...and that's just one way to do it.
    -Brian

  • Database Log File becomes very big, What's the best practice to handle it?

    The log of my production database is getting very big and the hard disk is almost full. I am pretty new to SAP but familiar with SQL Server; can anybody give me advice on the best practice to handle this issue?
    Should I Shrink the Database?
    I know increasing the hard disk is needed for the long term.
    Thanks in advance.

    Hi Finke,
    Usually the log file fills up and grows huge due to not having regular transaction log backups. If your database is in FULL recovery mode, every transaction is logged in the transaction log file, and the log only gets cleared when you take a log backup. If it is a production system and you don't have regular transaction log backups, the problem is just sitting there waiting to explode when you need a point-in-time restore. Please check your backup/restore strategy.
    Follow these steps to get the transaction log file back into normal shape:
    1.) Take a transaction log backup.
    2.) Shrink the log file: DBCC SHRINKFILE('logfilename', 10240)
          The above command will shrink the file to 10 GB (a recommended size for high-transaction systems).
    >
    Finke Xie wrote:
    > Should I Shrink the Database?
    "NEVER SHRINK DATA FILES" - shrink only the log file.
    3.) Schedule log backups every 15 minutes.
    Thanks
    Mush
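    As a concrete illustration of Mush's backup-then-shrink sequence, here is a minimal Python/pyodbc sketch driving the same T-SQL; the server, database, logical log file name and backup path are all made-up examples and would need to match your own system.

        # Back up the transaction log, then shrink only the log file.
        # All names below are hypothetical placeholders.
        import pyodbc  # assumes pyodbc and a SQL Server ODBC driver are installed

        CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=dbserver;"
                    "DATABASE=ProductionDB;Trusted_Connection=yes")
        LOG_BACKUP = r"E:\Backups\ProductionDB_log.trn"

        def backup_then_shrink() -> None:
            # autocommit is required: BACKUP cannot run inside a user transaction.
            with pyodbc.connect(CONN_STR, autocommit=True) as conn:
                cur = conn.cursor()
                # 1) Back up the transaction log first -- this frees log space for reuse.
                cur.execute(f"BACKUP LOG ProductionDB TO DISK = N'{LOG_BACKUP}'")
                while cur.nextset():  # drain the informational results BACKUP returns
                    pass
                # 2) Shrink only the log file (never the data files), here to 10 GB (10240 MB).
                cur.execute("DBCC SHRINKFILE('ProductionDB_log', 10240)")

        if __name__ == "__main__":
            backup_then_shrink()

    Step 3 (log backups every 15 minutes) would normally be a SQL Server Agent job or maintenance plan rather than an ad-hoc script like this.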

  • What is the best Practice to improve MDIS performance in setting up file aggregation and chunk size

    Hello Experts,
    in our project we have planned to make some parameter changes to improve MDIS performance, and we want to know the best practice for setting up file aggregation and chunk size when importing large numbers of small files (one file contains one record, and each file is 2 to 3 KB) through the automatic import process.
    Below is the current setting in production:
    Chunk Size=2000
    No. Of Chunks Processed In Parallel=40
    File Aggregation=5
    Records Per Minute Processed=37
    And we made the below settings in the development system:
    Chunk Size=70000
    No. Of Chunks Processed In Parallel=40
    File Aggregation=25
    Records Per Minute Processed=111
    After making the above changes the import process improved, but we want to get an expert opinion before making these changes in production, because there is a huge difference between what is in prod and what we changed in dev.
    thanks in advance,
    Regards
    Ajay

    Hi Ajay,
    The SAP default values are as below:
    Chunk Size=50000
    No. Of Chunks Processed In Parallel=5
    File Aggregation: depends largely on the data - if you have only one or two records being sent at a time, then it is better to cluster them together and send them in one shot, instead of sending one record at a time.
    Records Per Minute Processed - same as above.
    Regards,
    Vag Vignesh Shenoy

Maybe you are looking for

  • Can't export due to missing file.

    I have completed my sound track in Soundtrack 1.2. While working on the project, I deleted a file that was no longer useful and is not a part of the project. Now, when I try to export the mix it tells me that I must reconnect the non-existent file fi

  • Line Item increment nos. in S Order

    Dear Gurus, Ours is a Mechanical process industry, though most of the items are NORM items, many are BOMs (LUMF), here the main item is a Model no. and the sub items are the parts in it according to the customers' requirement. In a Sales Order if I e

  • FF_5 - Import Bank Statements, Balancing field "Profit Center"

    Hi When importing the bank statement I get the error Error: (GLT2 201) Balancing field "Profit Center " in line item 001 not filled when using the NEW GL only. I know why I'm getting the error, the document is trying to determine a Profit Center, I d

  • Artwork in .mov files

    Is there a way to edit the artwork metadata (used in coverflow view) for .mov files? I can replace the artwork for MPEG-4 movies in the info pane for the movie, but QT movies won't let me delete the screenshot artwork and replace it with a movie post

  • Is there a difference between new iPad wall charger and iPad 2 wall charger?

    I've given my new iPad charger away with my iPad 2 by mistake... just wondering if either can be damaged by the wrong charger or there'll be any difference in charging speeds?