Best strategy for offsite backups

I am using TM on an external hard drive (call it #1) with my iMac; however, I want to store a backup offsite in the event of a fire, etc.
Can you use TM on one machine to make backups on multiple external hard drives (not necessarily at the same time)? Is this the best way to back up offsite: using TM to make an extra external hard drive backup that I would store elsewhere?
If I use TM, would I get another external hard drive (#2), turn off TM, disconnect drive #1, plug in #2, turn TM on, let it do a full backup, turn TM off, disconnect #2, reconnect #1, and then turn TM back on?
Let's say #2 has been offsite for a month (while #1 has been running with TM on the computer) and I want to update #2. Is the best way to erase it and do a full backup? Or, after a month, can I just plug it in and let TM do an incremental backup? Does it matter that TM has been doing backups with #1?
Sorry for the confusing questions.
Thanks for any advice.
Lee

Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
[CarbonCopyCloner|http://www.bombich.com] is donationware; [SuperDuper|http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html] has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup *to that disk*. Each is completely independent.

Similar Messages

  • What is the best strategy for RMAN backup configuration?

    Hi all,
    My database size is 50 GB. I want to take a weekly full backup and incremental backups, without a recovery catalog, using the following commands.
    Weekly full database backup:
    run {
    backup as compressed backupset
    incremental level 0
    device type disk
    tag 'weekly_database'
    format '/sw/weekly_database_%d_t%t_c%c_s%s_p%p'
    database;
    }
    I want to configure RMAN with the following strategy:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    CONFIGURE BACKUP OPTIMIZATION OFF;
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '\SW\AUTOCFILE_%F';
    and the rest left at the defaults:
    sql> alter system set control_file_record_keep_time=15 days;
    The OS is AIX 6, and this is for two databases (10gR2 and 11gR2).
    What is the best configuration strategy for RMAN backup, and should I back up with or without a recovery catalog?
    Please advise.
    Edited by: afzal on Feb 26, 2011 1:45 AM

    For just two databases, there really wouldn't be a need for a recovery catalog. You can still restore/recover without a controlfile and without a recovery catalog.
    From this:
    afzal wrote:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    I am assuming you want to keep two weeks' worth of backups, therefore these:
    alter system set control_file_record_keep_time=15 days;
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days
    should be:
    RMAN> sql 'alter system set control_file_record_keep_time=22';
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 14 DAYS;
    22 would give you that extra layer of protection for instances when a problem occurs with a backup and you want to ensure that data doesn't get aged out.
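    Pulling those settings together, a minimal sketch of the persistent configuration being recommended might look like this (the autobackup format below simply reuses the poster's /sw location with forward slashes for AIX; adjust the path to your environment):
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 14 DAYS;
    RMAN> CONFIGURE BACKUP OPTIMIZATION OFF;
    RMAN> CONFIGURE CONTROLFILE AUTOBACKUP ON;
    RMAN> CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '/sw/autocfile_%F';
    RMAN> sql 'alter system set control_file_record_keep_time=22';
    After a couple of backup cycles have run, REPORT OBSOLETE; lists the backups that have aged out of the 14-day window, and DELETE OBSOLETE; removes them.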

  • Best strategy for variable aggregate custom component in dataTable

    Hey group, I've got a question.
    I'd like to write a custom component to display a series of editable Things in a datatable, but the structure of each Thing will vary depending on what type of Thing it is. So, some Things will display radio button groups (with each radio button selecting a small set of additional input elements, so we have a vertical array of radio buttons and, beside each radio button, a small number of additional input elements), some will display text-entry fields, and so on.
    I'm wondering what the best strategy for tackling this sort of thing is. I'm sort of thinking I'll need to do something like dynamically add to the component tree in my custom component's encodeBegin(), and purge the extra (sub-) components in encodeEnd().
    Decoding will be a bit of a challenge, maybe.
    Or do I simply instantiate (via constructor calls, not createComponent()) the components I want and explicitly call their encode*() and decode() methods, without adding them to the view tree?
    To add to the fun of all this, I'm only just learning Faces (having gone through the Dudney book, Mastering JSF, and writing some simpler custom components) and I don't have experience with anything other than plain vanilla JSP. (No EJB, no Struts, no Tapestry, no spiffy VisualDevStudioWysiwyg++ [bah humbug, I'm an emacs user]). I'm using JSP 2.0, JSF 1.1_01, JBoss 4.0.1 and JDK 1.4.2. No, I won't upgrade to 1.5 (yet).
    Any hints, pointers to good sample code? I've looked at some of the sample code that came with the RI and I've tried to navigate the JSF Blueprints stuff, but I haven't really found anything on aggregating components into a custom component. Did I miss something obvious?
    If this isn't a good question, please let me know how I can sharpen it up a bit.
    Thanks.
    John.

    Hi,
    We're doing something very similar. I had a look at the Tomahawk Date component, and it seems to dynamically create InputText components in encodeEnd(). However, it doesn't decode these directly (it only expects a single textual value). I expect you may have to check the request yourself in decode().
    Other ideas would be appreciated, though - I'm still new to JSF.

  • What's best strategy for dealing with 40+ hours of footage

    We have been editing a documentary with 45+ hours of footage and presently have captured roughly 230 GB. Needless to say, it's a lot of files. What's the best strategy for dealing with so much captured footage? It's almost impossible to remember it all, and labeling it while logging seems inadequate, as it is difficult to actually read comments in dozens and dozens of folders.
    Just looking for suggestions on how to deal with this problem for this and future projects.
    G5 Dual Core 2.3   Mac OS X (10.4.6)   2.5 g ram, 2 internal sata 2 250gb

    Ditto, ditto, ditto on all of the previous posts. I've done four long form documentaries.
    First I listen to all the sound bites and digitize only the ones that I think I will need. I will take in much more than I use, but I like to transcribe bites from the non-linear timeline. It's easier for me.
    I had so many interviews in the last doc that I gave each interviewee a bin. You must decide how you want to organize the sound bites. Do you want a bin for each interviewee, or do you want to do it by subject? That will depend on your documentary and subject matter.
    I then have b-roll bins. Sometimes I base them on location and sometimes on subject matter. This last time I based them on location because I would have a good idea of what was in each bin by remembering where and when it was shot.
    Perhaps you weren't at the shoot and don't have this advantage. It's crucial that you organize your b-roll bins in a way that makes sense to you.
    I then have music bins and bins for my voice over.
    Many folks recommend that you work in small sequences and nest. This is a good idea for long form stuff. That way you don't get lost in the timeline.
    I also make a "used" bin. Once I've used a shot I pull it out of the bin and put it "away" That keeps me from repeatedly looking at footage that I've already used.
    The previous posts are right. If you've digitized 45 hours of footage you've put in too much. It's time to start deleting some media. Remember that when you hit the edit suite, you should be one the downhill slide. You should have a script and a clear idea of where you're going.
    I don't have enough fingers to count the number of times that I've had producers walk into my edit suite with a bunch of raw tape and tell me that that "want to make something cool." They generally have no idea where they're going and end up wondering why the process is so hard.
    Refine your story and base your clip selections on that story.
    Good luck
    Dual 2 GHz Power Mac G5   Mac OS X (10.4.8)  

  • Best strategy for migrating GL 6 to DW CS4?

    First I gotta say that -- as a decade-long user of several Adobe products -- Adobe has really screwed over long-time users of Adobe GoLive. After a week of messing around with something that should be a slam dunk (considering the substantial amount of $ I've paid to Adobe over the years for multiple copies of GoLive 5, 6 and CS2), I can tell you Adobe's GL > DW migration utility ONLY works for sites created FROM SCRATCH in GL CS2 (they must have the web-content, web-data folder structure). This means that Adobe's migration utility only works for maybe 10% - 15% of the GoLive sites out there, and about 10% - 15% of Adobe's loyal GoLive customers, and it particularly screws over Adobe's longtime GoLive customers. Sweet! (Just like Adobe screwed over users of PhotoStyler -- which was a better image editor than Photoshop, BTW.) Obviously, I would walk away from Adobe and make sure I never ever paid them another cent, but the Republican-Democrat "free market economy" has made sure I don't have that option.
    So I've gotta make Dreamweaver work, which means I've gotta migrate several large sites (the biggest has 5,000 static pages and is about 2 gigs in size) from GoLive 6 (or GoLive CS2 with the older folder structure) to Dreamweaver CS4.
    Which brings me to my question -- what's the best strategy for doing this? Adobe's migration utility is a bad joke, so here's the alternative, real-world plan I've come up with. I'd appreciate any suggestions or comments...
    1) create copies of all my GL components in the content folders for my GL site (not in the GoLive components folder)
    2) replace the content of all GoLive components (in the GoLive components folder) with unique character strings, so that instead of a header with images and text, my old header would look something like xxxyyyzzz9
    3) create a new folder called astoni in the root of my hard drive. Copy all my GoLive web site content (HTML, images, SWF, etc.) into astoni in exactly the structure it was in with GL
    4) create a new Dreamweaver site by defining astoni as the local location for my site, astoni\images as the location for images, etc.
    5) use Dreamweaver and open the newly defined astoni site. Then open each of the GoLive components I copied into a content-level folder in astoni, and drag each into the Dreamweaver Assets/Library pane, in order to create library items just like my old GoLive components
    6) use Dreamweaver to Search & Replace out the unique text strings like xxxyyyzzz9 with the content of my new DW library items
    7) refresh site view to make all the links hook up...
    Thanks for your help. Hope this discussion helps others too...

    Instead of ragging on people who are familiar with DW and Site Development, you should be gracious and accept the practical advice you've been given.  A "best strategy" would be to read some tutorials and learn how to work with HTML, CSS and server-side technologies. Without some basic code skills, you're going to find DW an uphill, if not impossible, battle.
    Frankly, I don't have free time to hand-hold someone through the excruciating process of migrating a 5,000 page static site from GoLive  to DW. And I doubt anyone else in this forum has either.  We're not Adobe employees.  We don't get paid to participate here.  We are all product users JUST LIKE YOU.
    I'm sorry you're frustrated.  I'm also sorry for your clients. But the problem you have now isn't Adobe's fault. It's yours for not keeping up with server-side technologies or handing off these huge static sites to more capable web developers.  I'm not saying you need to buy anyone's books, but they are good resources for people willing to learn new things.
    That said, maybe you should stick with GoLive.  The software doesn't have an expiration date on it and will continue working long into the future.  If you're happy using GL, keep using it to maintain your legacy sites. At the same time learn X/HTML, CSS & PHP or ASP.  Use DW CS4 for the creation of new projects.
    FREE Tutorial Links:
    HTML & CSS Tutorials - http://w3schools.com/
    From Tables to CSS Web Design Part 1 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt1.html
    From Tables to CSS Web Design Part 2 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt2.html
    Taking a Fireworks (or Photoshop) comp to a CSS based layout in DW -
    http://www.adobe.com/devnet/fireworks/articles/web_standards_layouts_pt1.html
    Creating your first website in DW CS4 -
    http://www.adobe.com/devnet/dreamweaver/articles/first_cs4_website_pt1.html
    Guidance on when to use DW Templates, Library Items and SSIs -
    http://www.adobe.com/devnet/dreamweaver/articles/ssi_lbi_template.html
    Best of luck,
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists
    www.alt-web.com/
    www.twitter.com/altweb
    www.alt-web.blogspot.com

  • Best command for script backup?

    Hello,
    I always use robocopy for file backups, but with PowerShell and the newer technologies available, is it still the easiest and best choice?

    Your initial question was "Best command for script backup?". For this you received an answer. If you now need support on the subject of your script, then you should start a new thread. Note also that you must test your robocopy command from a command line before trying to embed it in a PowerShell script.

  • Best strategy for burning a dual layer DVD

    I've got a project that won't fit on a single layer DVD, so I've got it set up for dual layer and "best quality," which results in a 7.38 GB disk image.  However, iDVD 8 warned me when burning the disk image that some utilities may not be able to burn a reliable dual layer copy and to use iDVD instead. Does this include Disk Utility?
    I always use Disk Utility to make copies, but iDVD took almost 7 hours to burn the disk image, and I can't afford to wait that long for each copy.  So what's the best strategy for burning dual layer DVDs?

    Subsequent copies burn in far less time immediately after the first, provided you don't quit iDVD.
    If you know ahead of time you'll need more than a few copies, I'd recommend burning from a disc image saved to the desktop and reducing the burn speed (4x or lower). I prefer to use Roxio Toast myself, but others have had success with Apple's Disk Utility as well.

  • Absolute Best Strategy for Cropping

    What's the absolute best strategy for high-precision cropping when time is not an issue and I want "perfect" quality?
    What's the absolute best strategy for low-precision cropping when time is an issue and standards are still high?
    There's gotta be different strategies for doing this, and I was curious if we could choose a winner.
    Thanks,

    Heh, Marian, that about sums it up. 
    In all seriousness...  How does "precision" connect with "cropping"?  I've always thought of "cropping" as mostly based on an artistic judgment, not something done for precision.  Perhaps you could describe your photography and what you expect from your workflow?
    Are you cropping photos of documents?  People?  Stars, nebulae, and galaxies?
    Is this "precision" you speak of an attempt to express "minimal loss of pixel data"?  If that's the case, get the most powerful computer you can and always work at upsampled resolutions.  And don't resample during cropping.
    One last comment:  Consider employing the crop tool in the Camera Raw dialog.  This stores metadata telling future conversions how to crop the image, but the original raw data is still all intact.  Note that you can open a whole set of images in Camera Raw, do exposure and cropping changes, and click [Done] to save that metadata.
    -Noel

  • Development System Backup - Best Practice / Policy for offsite backups

    Hi, I have not found any recommendations from SAP on best practices for backing up Development systems offsite, so I would appreciate some input on what policies other companies have. We continuously make enhancements to our SAP systems and perform daily backups; however, we do not send any Development system backups offsite, which I feel is a risk (losing development work, losing transport & change logs...).
    Does anyone know whether SAP has any recommendations on backing up Development systems offsite? What policies does your company have?
    Thanks,
    Thomas

    Thomas,
    Your question does not mention consideration of both sides of the equation - you have mentioned the risk only.  What about the incremental cost of frequent backups stored offsite?  Shouldn't the question be how the 'frequent backup' cost matches up with the risk cost?
    I have never worked on an SAP system where the developers had so much unique work in progress that they could not reproduce their efforts in an acceptable amount of time, at an acceptable cost.  There is typically nothing in dev that is so valuable as to be irreplaceable (unlike production, where the loss of 'yesterday's' data is extremely costly).  Given the frequency with which an offsite dev backup is actually required for a restore (seldom), and given that the value of the daily backed-up data is already so low, the actual risk cost is virtually zero.
    I have never seen SAP publish a 'best practice' in this area.  Every business is different, and I don't see how SAP could possibly make a meaningful recommendation that would fit yours.  In your business, the risk (the pro-rata cost of infrequently needing to use offsite storage to replace or rebuild 'lost' low-cost development work) may in fact outweigh the ongoing incremental costs of creating and maintaining offsite daily recovery media. Your company will have to perform that calculation to make the business decision.  I personally have never seen a situation where daily offsite backup storage of dev was even close to making any kind of economic sense.
    Best Regards,
    DB49

  • Best strategy for data definition handling while using OCCI

    Hi, the subject says it all.
    Assuming OCCI is used for data manipulation (e.g. object navigation), what's the best strategy for handling data definitions in the same context?
    I mean primarily dynamic creation of tables and columns.
    Apparent choices are:
    - SQL from OCCI.
    - use OCI in parallel.
    Did I miss anything ? Thanks for any suggestion.


  • Best practices for RMAN backup management for many databases

    Dear all,
    We have many 10g databases (>40) hosted on multiple Windows servers, which are backed up using RMAN.
    A year ago, all backups were implemented through Windows Scheduled Tasks using some batch files.
    We have been busy (re)implementing / migrating these backups into Grid Control.
    I personally prefer to maintain the backup management in Grid Control, but a colleague now wants to go back to the batch files.
    What I am looking for here is advice on the management of RMAN backups for multiple databases: do you use Grid Control, a third-party backup management tool, or even a home-made solution?
    One of the discussion topics is the work involved in case the central backup location changes.
    Well... any real-life advice on best practices / strategies for RMAN backup management for many databases will be appreciated!
    Thanks,
    Thierry

    Hi Thierry,
    Thierry H. wrote:
    Thanks for your reaction.
    So, I understand that you do not use Grid Control to manage the backups, and as a consequence, you also have no 'direct' overview of the job schedules.
    One of my concerns is also to avoid too many backups being started at the same time, to prevent network / storage overload. Such an overview is available in Grid Control's Jobs screen.
    And, based on your strategy, do you recreate a 'one-time' Oracle scheduled job for every backup, or do your scripts create an Oracle job with a repeating schedule?
    You're very welcome!
    Well, Grid Control is not an option for us, since each customer is on a separate infrastructure, with their own licensing. I have no real way (in contrast to your situation) to have a centralized point of control, but that, on the other hand, means that I don't have to consider network/storage congestion like you have to.
    The script is run from a "permanent" job in the database scheduler, created like this:
    begin
      dbms_scheduler.create_job(
            job_name        => 'BACKUP',
            job_type        => 'EXECUTABLE',
            job_action      => '/home/oracle/scripts/rman_backup.sh',
            start_date      => trunc(sysdate)+1+7/48,
            repeat_interval => 'trunc(sysdate)+1+7/48',
            enabled         => true,
            auto_drop       => false,
            comments        => 'execute backup script at 03:30');
    end;
    /
    The "master script" then determines which incremental level to use, based on the weekday from the OS. The actual job schedule (start date, run interval, etc.) is set together with the customer's IT/IS department, to avoid congestion on the backup resources.
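    A quick way to confirm that such a job is in place and see when it will next run is a dictionary query along these lines (a sketch only, assuming the job name from the example above):
    -- check the scheduler job created above
    select job_name, enabled, state, last_start_date, next_run_date
    from   dba_scheduler_jobs
    where  job_name = 'BACKUP';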
    I have no overview of the backup status, run times, etc., but have made monitoring scripts that will alert me if/when a backup either fails or runs for too long. This, along with scheduled disaster/recovery tests, makes me sleep rather well at night... ;-)
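    The monitoring scripts themselves aren't shown in the thread; as a rough sketch of the idea (assuming the standard v$rman_backup_job_details view and an arbitrary two-hour threshold), a check like this could flag failed or long-running backups:
    -- recent RMAN runs that failed or took longer than two hours
    select start_time, end_time, status, input_type, elapsed_seconds
    from   v$rman_backup_job_details
    where  start_time > sysdate - 1
    and    (status like 'FAILED%' or elapsed_seconds > 7200)
    order by start_time;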
    I realize that there (might be) better ways of doing backup scheduling in your environment, since my requirements are so completely different than yours, but I guess that we all face the same challenges in unifying the environments as much as possible, to minimize the amount of actual work we have to do. :-)
    Good luck!
    //Johan

  • RAID for offsite backup?

    Hello! My question is not specific to Xserve, but to the OS X software RAID provided by Disk Utility. I posted on the 10.6.2 installation and setup board, but it was recommended that I post here instead. So here goes...
    I'm planning to use 2 external drives with Time Machine. But I only plan to keep one of them always connected to my iMac and the other stored somewhere safe (connecting periodically to sync up). The reasons being the usual ones for an offsite backup. And need I say that one of my computers has also been the unfortunate victim of a lightning strike? Ouch!
    So instead of having to change the target drive in TM every time, I was hoping to set up a mirrored RAID using Disk Utilities. But I would only keep one drive connected at all times, while the other would be connected periodically to sync up.
    Is this doable? Would the second drive automatically mirror the first one when I connect it? Would I keep getting warnings/errors from OSX that the other drive isn't connected? Appreciate your opinions on this. Thanks!

    Don't go there. Seriously.
    Breaking a mirror should never be considered part of your normal workflow, but that's precisely what you're proposing here - periodically breaking the mirror to take one of the copies off-site.
    It might not bring you seven years' bad luck but it is asking for trouble.
    IMHO your best bet is to periodically clone the Time Machine disk to the second disk using Disk Utility.app, asr or any of the numerous disk clone apps out there (Carbon Copy Cloner, Synchronize Pro, etc.).
    In this way Time Machine can continually backup to its own disk and you take periodic snapshots of that disk on your own schedule. Don't try to get Time Machine to backup to two disks (either alternately or via mirroring).

  • Best practice for BI backup

    Hi,
    Can anyone suggest the best practice for backup/restore of the entire set of BI dashboards, reports, permissions, etc.?
    Ed,

    Hi,
    If you want to move the entire set of dashboards, reports, and permissions:
    Zip the *<OracleBIData>/web/catalog* folder and move it to the new environment. In the new environment, unzip this catalog and, in instanceconfig.xml, set the path to this new catalog.
    If you want to move just a few dashboards or reports, do it with Catalog Manager.
    Thank you.

  • Best Strategy for Integrating Crystal/Business Objects with OpenACS Environment

    Post Author: devashanti
    CA Forum: Deployment
    I'm working for a client that uses AOL server and OpenACS for their web services/applications. I need suggestions on the best strategy to integrate a reporting solution using Business Objects XI. Ideally I'd like to send an API call from our web application's GUI to the Crystal API with report parameter values to pass into specific reports called via the API (I can get it down to a single integer value being passed), or, if this is not possible, a way to move seamlessly, from the end-user perspective, into a reporting module. We are using an Oracle backend database. I'm experienced with creating stored procedures and packages for reporting purposes.
    Although I have many years of experience integrating the Crystal active X controls into n-tier client server applications, the past few years I have had little opportunity to work with Business Objects and the newer versions of Crystal or web based solutions with Crystal Reports. I signed up to try out crystalreports.com, but I doubt my client will find this solution acceptable for security reasons as the reports are for an online invoicing system we are developing. However we can set up a reports server in-house if necessary, so it gives me some testing ground.
    Can anyone provide suggestions for a doable strategy?

    Please post this query to the Business Objects Enterprise Administration forum:
    BI Platform
    That forum is monitored by qualified technicians and you will get a faster response there. Also, all BOE queries remain in one place and thus can be easily searched in one place.
    Thank you for your understanding,
    Ludek

  • Best strategy for flash access to an image in a protected folder

    hello all,
    i'm looking for a good strategy for my swf to be able to load images in a password-protected directory. since flash can't access a file behind a protected folder, i figure i'll use php to get the image and pass it to flash.
    how best to get an image from php to flash, since i can't just send a path to a file and have flash execute a Loader() to retrieve it (the Loader won't be able to get behind the protected folder)?
    is it best to have php send the file binary to flash, which flash will receive via URLLoaderDataFormat.BINARY? is it best to, at runtime, have php move the requested file from the protected folder to an unprotected folder and, after flash loads it, delete it from the unprotected folder? or maybe it's best to leave the image folder unprotected and let an .htaccess redirect restrict outside access to the folder?
    suggestions?
    tia,
    michael

    The built-in Firefox Sync service allows you to store your settings, including bookmarks, on a Mozilla server. This isn't accessible as a file like Google Drive; it can only be used by connecting another copy of Firefox to your Sync account. You can read more about this here: [[How do I set up Firefox Sync?]] (I haven't tried it myself since it changed recently.)
    The cross-browser Xmarks service is more focused: it just does bookmarks. http://www.xmarks.com/ (I've never tried this one either)
    If you prefer working with files and you have a Dropbox or similar account, you could in theory back up your Firefox profile folder, or more particularly, the bookmarkbackups subfolder. I haven't tried that myself. To find the folder:
    Open your current Firefox settings (AKA Firefox profile) folder using either
    * "3-bar" menu button > "?" button > Troubleshooting Information
    * Help menu > Troubleshooting Information
    In the first table on the page, click the "Show Folder" button
    A window should launch showing your currently active settings files. The bookmarkbackups folder should contain dated .json files which can be used to restore your bookmarks to another copy of Firefox.
    Alternately, someone might have created an add-on for saving Firefox bookmarks to the cloud another way, but I haven't heard of one.
