Best Archiving Strategy?

I have many hours of video of the kids that is on MiniDV tape.
Now is the time to drag out the FireWire 400 deck and move these off to a hard drive.
Any advice on the best way to handle this so the kids can make sense of it all, assuming I can keep migrating it to new storage methods!
I have iMovie; anything else?
Thanks in advance
Cas

Do you have the FireWire 400 cable too?! I think that's all you need. What you will want to do is just save it into Event folders that are stored on the new external HD. Make sure you have formatted it as Mac OS Extended using Disk Utility (it's inside the Utilities folder, inside the Applications folder). With a Mac-formatted disk you will have full access to the video clips from programs like iMovie.
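The formatting step can also be scripted from Terminal via diskutil. A minimal sketch, assuming the drive shows up as disk2 (a placeholder — always confirm the identifier with `diskutil list` first, since eraseDisk wipes the whole disk):

```python
import subprocess

def erase_command(disk_identifier: str, volume_name: str) -> list[str]:
    """Build the diskutil invocation that formats a whole disk as
    Mac OS Extended (Journaled), which diskutil calls JHFS+."""
    return ["diskutil", "eraseDisk", "JHFS+", volume_name, disk_identifier]

if __name__ == "__main__":
    # WARNING: destructive -- erases the whole disk. "disk2" is a
    # placeholder; verify yours with `diskutil list` before running.
    cmd = erase_command("disk2", "VideoArchive")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually format
```

`JHFS+` is diskutil's name for Mac OS Extended (Journaled), the same format Disk Utility's GUI offers.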

Similar Messages

  • Put Together A Data Archiving Strategy And Execute It Before Embarking On Sap Upgrade

    A significant amount is invested by organizations in a SAP upgrade project. However, few realize that data archiving before embarking on a SAP upgrade yields significant benefits, not only from a cost standpoint but also through reduced complexity during the upgrade. This article describes why this is a best practice and details the benefits that accrue to organizations as a result of data archiving before a SAP upgrade. Avaali is a specialist in the area of Enterprise Information Management. Our consultants come with significant global experience implementing projects for the world's largest corporations.
    Archiving before Upgrade
    It is recommended to undertake archiving before upgrading your SAP system in order to reduce the volume of transaction data that is migrated to the new system. This results in shorter upgrade projects and therefore less upgrade effort and costs. More importantly production downtime and the risks associated with the upgrade will be significantly reduced. Storage cost is another important consideration: database size typically increases by 5% to 10% with each new SAP software release – and by as much as 30% if a Unicode conversion is required. Archiving reduces the overall database size, so typically no additional storage costs are incurred when upgrading.
    It is also important to ensure that data in the SAP system is cleaned before you embark on an upgrade. Most organizations tend to accumulate messy and unwanted data such as old material codes, technical data and subsequent posting data. Cleaning your data beforehand smooths the upgrade process, ensures you only have what you need in the new version, and helps reduce project duration. Consider archiving, or even purging if needed, to achieve this. Make full use of the upgrade and enjoy a new, more powerful and leaner system with enhanced functionality that can take your business to the next level.
    Archiving also yields Long-term Cost Savings
    By implementing SAP Data Archiving before your upgrade project you will also put in place a long term Archiving Strategy and Policy that will help you generate on-going cost savings for your organization. In addition to moving data from the production SAP database to less costly storage devices, archived data is also compressed by a factor of five relative to the space it would take up in the production database. Compression dramatically reduces space consumption on the archive storage media and based on average customer experience, can reduce hardware requirements by as much as 80% or 90%. In addition, backup time, administration time and associated costs are cut in half. Storing data on less costly long-term storage media reduces total cost of ownership while providing users with full, transparent access to archived information.

    Maybe this article can help; it uses XML for structural-change flexibility: http://www.oracle.com/technetwork/oramag/2006/06-jul/o46xml-097640.html

  • Archival Strategy for IMAP?

    I've been reading through the forums and I suppose my issue is different enough to warrant a new thread.
    I have recently switched to IMAP. I have several accounts that I track, and dozens of subfolders into which I route and filter emails as they come in.
    With POP, everything was stored locally, and I didn't have to worry about disk usage. This is no longer the case. So I would like to implement an auto-archival process that copies messages older than 90 days to a mailbox on my local machine.
    Here's the catch: I want to preserve my folder structure.
    Let's say I have this folder:
    Inbox
    |_ account1
       |_ subfolder1
       |_ subfolder2
       |_ subfolder3
    Basically I want to set up an Automator action or a set of rules within Mail.app that periodically looks through the folders and moves any emails older than 90 days to their corresponding folders on my local drive. In other words, mail in subfolder1 would go to a similarly named subfolder1 on my Mac.
    This is a different technique than simply synchronizing my IMAP directories in that the contents of the archived folders will be different than those of the ones on the server.
    So far, I have been unable to find or create an automator action that looks for files older than a non-specified date (a pet peeve that I hope is addressed in Leopard). And even if I could, I have not found a way to have either automator or Mail.app itself apply rules to a specific folder (account, yes, but not a folder).
    If anyone has developed an archival strategy that works in these conditions, please let me know.
    Many thanks,
    Steve
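    The folder-preserving 90-day sweep described above can be sketched outside Mail.app with Python's imaplib and mailbox modules (server details, paths, and helper names here are hypothetical, not a tested tool):

    ```python
    import imaplib
    import mailbox
    import os
    from datetime import date, timedelta

    def imap_cutoff(today: date, days: int = 90) -> str:
        """IMAP SEARCH date in DD-Mon-YYYY form, e.g. 11-Jan-2007.
        Month names are hand-rolled to avoid locale-dependent %b."""
        d = today - timedelta(days=days)
        months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                  "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
        return f"{d.day:02d}-{months[d.month - 1]}-{d.year}"

    def local_mbox_path(root: str, imap_folder: str) -> str:
        """Mirror e.g. 'account1/subfolder1' under the local archive root."""
        return os.path.join(root, *imap_folder.split("/")) + ".mbox"

    def archive_folder(conn: imaplib.IMAP4, folder: str, root: str, cutoff: str) -> None:
        """Move messages older than the cutoff into a matching local mbox."""
        conn.select(f'"{folder}"')
        _, data = conn.search(None, "BEFORE", cutoff)
        path = local_mbox_path(root, folder)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        box = mailbox.mbox(path)
        for num in data[0].split():
            _, msg = conn.fetch(num, "(RFC822)")
            box.add(msg[0][1])
            conn.store(num, "+FLAGS", r"(\Deleted)")  # move, not just copy
        box.close()
        conn.expunge()
    ```

    Run periodically (cron/launchd) after connecting with imaplib.IMAP4_SSL and logging in; each IMAP subfolder then lands in a similarly named .mbox file locally, preserving the hierarchy.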

    Hi Terry,
    What we usually see is that customers run one company-wide SLT box. The monitoring and operational aspects are easier to handle. For example, you can patch the SLT system independently from the source systems. The general recommendation is to have the SLT box close to your HANA system. Why? The RFC compression is much better than the DB compression. From a performance perspective I do not see an advantage when you have SLT on the source system; in real life these are only marginal figures, and the overall network performance should not be a problem. We have customers with one central SLT box next to HANA and more than 80 ECC systems globally distributed.
    Best, Tobias

  • Best backup strategy

    Time Machine is brilliant. It's saved me many times.
    But recently, the backup drive with a year's worth of backups, died.
    I therefore lost ALL my backups.
    Not a major problem as I still have my current data and having re-formatted the Time Machine drive it's merrily backing it all up again.  (I just hope I don't need to recover to last week's backup ... as I no longer have it.)
    But until that's finished I have NO backups!  Eeek!
    So what is the best backup strategy, bearing in mind drives can fail, houses can burn down, etc.  Should I have two or three Time Machine backup discs and keep swapping them every day so if one dies I've still got a one-day-old version?
    Making DVD backups and storing them elsewhere is very time consuming, and while my data is really important to me, it defeats the object if I can't get on with any work on that data because I'm constantly burning to lots of DVDs!
    Your views would be appreciated!
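    The swap-discs-every-day idea is easy to make systematic; a small sketch (disk names hypothetical) that cycles a pool of Time Machine drives so that losing any one drive still leaves a recent backup on another:

    ```python
    from datetime import date

    def disk_for(day: date, disks: list[str]) -> str:
        """Cycle through the backup-disk pool, one disk per day,
        so if today's disk dies, another disk holds yesterday's backup."""
        return disks[day.toordinal() % len(disks)]
    ```

    With three disks, each disk is reused every third day, so the worst case after a single-drive failure is a backup two days old — plus whichever disk is currently offsite.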

    I pretty much do a similar thing, but my offsite backup goes to a locked cabinet in my office (easier to get to weekly to bring home and update than using a bank - I honestly cannot remember when I last physically went to my bank; it's been years).
    TM makes a great source for restoring single files or being able to go back to an earlier version of a file. And the clones are easier for full restoration, or even for booting from temporarily if the machine's boot drive dies and you just want to get in to deauthorize it before sending it in for repairs or such. Always good to have a bootable clone around, for many reasons.
    My external clones are on firewire 800 bus powered portable drives, again for simplicity (no power cables or bricks to go with them).
    I also still burn to optical disc for specific things like filed tax returns, and other financial documents I need to keep around in the fire safe for some period of time.

  • Going to upgrade to Logic X. Need to keep OS 10.6.8 for Adobe CS4? Advise please as to best partition strategy.

    Going to upgrade to Logic X but need to keep OS 10.6.8 for Adobe CS4. Advice please on the best partition strategy.

  • Archiving strategy

    We have 22 million records in the AR cube and 12 million records in the AR ODS - almost 7 years' worth of data. We are planning to archive the data for better reporting performance. Can you please guide me to the best archiving solutions?
    I will Assign full points if useful

    Hi
    Check these links; hope they are helpful.
    http://help.sap.com/saphelp_nw04/helpdata/en/8d/3e4d15462a11d189000000e8323d3a/frameset.htm
    Also these threads:
    Archiving
    Re: archiving
    Check the following how-to guides available in SDN:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/b32837f2-0c01-0010-68a3-c45f8443f01d
    http://www.thespot4sap.com/articles/SAP_Data_Archiving_Overview.asp
    Thanks = points.

  • Best localization strategy

    Hi Guys,
    Which is the best geo strategy in UCM?
    We have an English-version site in UCM. Now we are going to include localized sites as well ...
    We will be using the same templates in the localized sites that we used in the English site.
    Localized site URLs would look like www.example.com/fr/abcsection/
    On every template we have header, footer and left navigation fragments, and contribution regions.
    Experts ,
    Please suggest

    Firstly, if I understood the question correctly, you would like to localize your sites created by UCM (Site Studio?), not UCM itself (it should be OK for both French and English).
    There are actually quite a few strategies for doing that "ideally", and I would suggest contacting Oracle Consulting Services to help you get started.
    In general, it very much depends on how tightly the translations are linked together. Theoretically, you could have:
    - one element, which is translated into many languages. In this case, you might want to use resource files containing technical names and translations, and in the element just use simple code to pick the correct translation. This is something you can see in the Ravenna standard demo (ask for the Linux version; it can be given away for free)
    - many elements/items representing texts in different languages. There is a component, I believe called Translation Workflows, developed by a Belgian consultant Jurgen Debedts ([email protected]), that enables linking these together, so that if a text changes, the related texts can be sent to workflows for translation. It is a little more complicated than that, because for longer texts you might want to have them sent to an external translation agency rather than to your employees.
    In fact, the "social" aspect, i.e. who actually does the translations, is the more complex one, and there is existing experience with it, so why reinvent the wheel?
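    The resource-file approach in the first option amounts to a keyed lookup with a fallback language; a minimal sketch (keys and strings are invented for illustration):

    ```python
    # Hypothetical resource table: technical name -> locale -> translation.
    RESOURCES = {
        "header.welcome": {"en": "Welcome", "fr": "Bienvenue"},
        "footer.contact": {"en": "Contact us"},  # no French translation yet
    }

    def translate(key: str, locale: str, default_locale: str = "en") -> str:
        """Pick the translation for a locale, falling back to the
        site's default language when none exists yet."""
        entry = RESOURCES[key]
        return entry.get(locale, entry[default_locale])
    ```

    The fallback matters in practice: a half-translated locale site then degrades to English rather than showing empty regions.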

  • Record Center Archival Strategy/Best Practise

    I am familiar with record center but I would like some guidance on how to do some of the things.
    Scenario: We have 4 site collections. Two site collections contain team sites built for different departments, one site collection is a project site collection containing sites for ongoing projects, and the last site collection is for the record center. There are different content types associated with each of the libraries in the above sites.
    -- From the team sites in those two site collections, only a few documents from the document libraries need to be archived. (Here I would like to know: should I create a separate library in the record center for each library in the source site - in this approach the record center will have numerous libraries - or just create one library per team site and archive all its documents there - this approach has the drawback of making documents harder to find.)
    In the same way for the project site collection, where once a project has ended they would like to archive all its libraries: should I archive each library separately, or archive all of that project's libraries into one library on the record site?
    Dhaval Raval

    Hi Dhaval,
    I found articles for different aspects of Record Management in SharePoint 2010 for your reference:
    Introducing Records Management in SharePoint 2010
    http://blogs.msdn.com/b/ecm/archive/2010/02/13/introducing-records-management-in-sharepoint-2010.aspx
    Estimate performance and capacity requirements for large scale document repositories in SharePoint Server 2010
    http://technet.microsoft.com/en-us/library/hh395916(v=office.14).aspx
    Choose how to store and manage records
    https://support.office.com/en-us/article/Choose-how-to-store-and-manage-records-5299e96c-00ea-47dc-a711-f21b3cea00d3?ui=en-US&rs=en-US&ad=US
    To design the record management plan, we need to consider governance, performance, security, etc.
    Regards,
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected] .
    Rebecca Tu
    TechNet Community Support

  • What are the best archiving methods for small colleges?

    Hi Folks,
    I'm thinking a few terabyte hard drives will be my best option, at a hundred dollars each.
    We have a small operation here at a New England college with two editing systems that get a good amount of student use, plus use by myself for college needs. Some of the student work will get deleted at the end of each semester, but I need to save all the college's video.
    What are the best ways to archive? Some of my concerns are cost and space. I can afford a few externals; do any of you have a nice cheap solution?
    I understand that if I go the route of archiving on externals I should have a backup of my backup. Do any of you have any feedback on ghosting your backups - is that common?
    thanks
    Mike

    If it's JUST for backup (not operation), you can get bare drives and either an enclosure for 1 or 2 drives with no-tools trays (like the ones from SansDigital: about $120 for a 1-drive enclosure) or a device like the UltraDock from Wiebetech (I use the latter). Physically install (or connect, in the case of the UltraDock) a bare SATA II 3.5" drive, connect it via USB 2, FireWire or eSATA, initialize/erase it, and then copy your files.
    Then store the drive carefully, in the foil anti-static bag it came in, in a dry location where it won't get jostled about by errant students (or kids in my house).
    The savings in not getting full drive enclosures is in the cost of multiple power supplies and interfaces. You pay for that once with the mountable enclosure / drive dock, and thereafter just buy bare drives.
    I also like another reply: have the students provide drives on which to back up their projects. A friend of mine has a ProTools audio studio, and part of his "set up" charge is $100 for a bare drive that he will archive for 5 years, or that the client can take with him at the end of the project (when the bill is paid).
    Eddie O

  • Best Practices & Strategy Planning for SAP BI Architecture

    What best practices and strategy planning should a SAP BI architect know?
    What challenges are involved with this role?
    What other information should this architect know to deliver a robust BI solution?
    Are there any white papers on best practices for architecture & data modeling, please?
    Thanks,
    Venkat.

    Hi
    As per best practice, first load the master data and then the transaction data.
    Please find the link for best practices:
    http://www.sap.com/services/pdf/BWP_SAP_Best_Practices_for_Business_Intelligence.pdf.
    Regarding the architecture, it depends upon the data volume, load frequency and hardware sizing;
    based on this we can propose the best solution.
    If you have any issues, please let me know.
    Regards
    Madan

  • Best migration strategy in ODI

    What's the best strategy for migration in ODI? We have Dev, Test, UA and Production instances. I know there are Metalink notes and some documentation on migration, but I'd like to know the potential issues with migration if anyone has experienced them. We may not migrate the whole development to Test but migrate partially, as some objects might still be under development before moving to Test; in such a scenario, how do we keep track between instances? I'd appreciate it if someone who has experienced different scenarios of migrating objects in ODI could shed some light on this - and, of course, on the best strategy for the migration process.
    Thanks in advance.
    Ram

    Hi ,
    ODI provides the option to export/import ODI objects as XML files, so you can partially export the objects you want and then import them into the production environment. In my experience, this sometimes has issues with internal object ID links that can ruin the repository on production, because all objects in ODI are linked together by internal IDs that relate the objects to each other.
    With this method, you cannot track what differs between the prod and dev environments unless you create a version of each object and use the compare-version utility in Designer.
    Hope this helps,
    Somchai

  • 2048x1536 videos and best compression strategy

    Hi,
    I can't quite figure out if DPS on the iPad 3 supports playback of videos at 2048x1536 resolution. Should I stick to 1024x768 even for the 2048 rendition?
    The cover of our magazine includes a full-screen video that features animated text. I'd like it to stay as crisp as possible, but an error message popped up when I tried to play a 2048x1536 video. Playback of a 1024x768 video works just fine, but I find the text somewhat pixelated, especially after compressing it to MP4 (I exported from AE in QuickTime format using the Photo - JPEG codec, and then compressed to H.264 in Media Encoder).
    The second part of my question, which I realize may not really belong in this forum but rather in the After Effects or Media Encoder forums, is: what would be the best rendering and compression strategy to keep the type as crisp as possible, especially if 2048x1536 is indeed not supported?
    Thanks

    1536x2048 will not render on an iPad 3. This is something programmed into the video playback on iOS: basically it only supports up to 1080p, which is 1920x1080, and it is really just the 1920 dimension we have to worry about - anything over that does not play. So this is what I do with my full-screen videos on the iPad 3: I scale them to 90% of 1536x2048, which is 1382x1843, and that plays just fine with no noticeable loss in quality.
    Here are the sizes I use for full screen videos on iOS devices.
    iPad3 - 1382x1843
    iPad1&2 - 691x922
    iPhone4 - 576x864
    iPhone2&3 - 288x432
    The second part of your question I can help answer, but I am no expert on video encoding; I will just share the settings I use.
    Video
    Format: H.264
    Frame Rate: 30
    Field Order: Progressive
    Aspect: Square Pixels (1.0)
    TV Standard: NTSC
    Bitrate Encoding: VBR, 2 Pass
    Target Bitrate [Mbps]: 8
    Maximum Bitrate [Mbps]: 8
    Key Frame Distance: 72
    Audio
    Audio Codec: AAC
    Sample Rate: 48000 Hz
    Channels: Stereo
    Audio Quality: High
    Bitrate [kbps]: 320
    Precedence: Bitrate
    Multiplexer
    Multiplexer: MP4
    Stream Compatibility: Standard
    Hope that helps.
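    The 90% scaling above checks out arithmetically; a quick sketch (truncating to whole pixels, which reproduces the 1382x1843 figure):

    ```python
    def scaled(width: int, height: int, factor: float = 0.9) -> tuple[int, int]:
        """Scale a frame size by a factor, truncating to whole pixels."""
        return int(width * factor), int(height * factor)
    ```

    The same helper can produce sizes for the smaller devices by plugging in other factors.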

  • SAP Content Server Archiving strategy

    Hi,
    I have installed ECC 6.0 SR3 and SAP Content Server 6.40 recently. The Content Server contains only scanned copies. I want to propose an archival strategy for the Content Server.
    If anybody has a document on this, please let me know.
    Thanks in advance

    hi,
    Archiving,
    There are both technical and legal reasons for archiving application data. Data Archiving:
    · Resolves memory space and performance problems caused by large volumes of transaction data
    · Ensures that data growth remains moderate so that the database remains manageable in the long term
    · Ensures that companies can meet the legal requirements for data storage in a cost-efficient manner
    · Ensures that data can be reused at a later date, for example, in new product development
    please refer this thread
    http://help.sap.com/saphelp_erp60_sp/helpdata/en/8d/3e4c11462a11d189000000e8323d3a/frameset.htm
    Benakaraja
    Edited by: benaka rajes on Sep 10, 2009 6:09 PM

  • FCP X 10.1 Best Backup Strategy

    We have a great backup strategy in FCP 10.0.x utilising offsite backups of each SD card via Camera Archives and Carbon Copy Cloner. We then back up our Final Cut Projects and Final Cut Events folders, but exclude the 'Transcoded Media' folder, as that was the biggy and we could re-transcode media if things turned pear-shaped. We also exclude the 'Render Files' and 'Analysis Files' folders for the same reason - they're quick to rebuild.
    We run 2 edit suites with 8TB Promise R4 drives, and even with these 75% full of all our media we could, at the end of the day (or during the day), create a backup via Carbon Copy Cloner with the above exceptions very fast to an external Thunderbolt drive to take offsite each day - easy and fast! I'm talking a couple of minutes tops to achieve a very complete offsite backup that, if needed, can work alongside the offsite backup of our camera archives, so we could 'rebuild' any edit within a few hours - we've tested it, and it provides great peace of mind.
    But now, with this packaged library structure in 10.1 - is something like this still possible?
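    The exclude-and-copy approach described above (which Carbon Copy Cloner handles in its UI) can be sketched with Python's shutil; the folder names come from the post, the paths are hypothetical:

    ```python
    import shutil

    # Folders FCP X can regenerate, so they are safe to leave
    # out of the offsite copy.
    REGENERABLE = ("Transcoded Media", "Render Files", "Analysis Files")

    def backup_events(src: str, dst: str) -> None:
        """Copy an Events/Projects tree, skipping regenerable media folders."""
        shutil.copytree(src, dst, ignore=shutil.ignore_patterns(*REGENERABLE))
    ```

    For example, backup_events('/Volumes/R4/Final Cut Events', '/Volumes/Offsite/Final Cut Events') would mirror everything except the three regenerable folders.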

    Hi nzglen,
    Welcome to the Support Communities!
    The Managing Media with Final Cut Pro X Libraries white paper (PDF) at http://www.apple.com/final-cut-pro/docs/Media_Management.pdf covers strategies for backing up libraries and media.
    I hope this information helps ....
    Happy Holidays!
    - Judy

  • Mail back-up and archive strategy.

    Hi, all.
    I am new to Mail, as I just recently switched from Entourage. One thing I have always done to avoid database corruption and crashes was to periodically export the contents of my folders and mailboxes and then empty them so the database wouldn't grow too big (a constant threat with Entourage).
    Some questions, since I am new to Mail:
    1. Can the Mail database experience the same problems? Where is the database located, and what would be considered a size limit, if any, before it becomes prone to crashes and/or corruption?
    2. What strategy do you use for exporting mail and archiving different folders and mailboxes? Are there any third-party applications worth using for this process? I have seen some that archive Mail mailboxes as PDF files. I would prefer for them to be archived as text in another database so that I can search past mails if it becomes necessary.
    Thank you in advance.

    I am not currently using Time Machine.
    Whatever backup method you use should be backing up the mail files. If it's not, use a different method, such as Time Machine.
    How would export work?
    Select the mailboxes you want to export in the mailbox list. Then select
    Mailbox ▹ Export Mailbox...
    from the menu bar.
    In case I want to save the entire Mail database, where is this database located in OS X?
    Triple-click the line below to select it:
    ~/Library/Mail
    Right-click or control-click the highlighted line and select
    Services ▹ Reveal
    from the contextual menu.
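    For the "archived as text in another database so that I can search past mails" part of the question, a minimal sketch using Python's mailbox and sqlite3 modules on a mailbox exported via Mailbox ▹ Export Mailbox... (paths hypothetical; multipart handling deliberately crude):

    ```python
    import mailbox
    import sqlite3

    def index_mbox(mbox_path: str, db_path: str) -> int:
        """Copy sender/subject/date/body of each message in an mbox
        export into a SQLite table, returning the number indexed."""
        db = sqlite3.connect(db_path)
        db.execute("CREATE TABLE IF NOT EXISTS mail "
                   "(sender TEXT, subject TEXT, date TEXT, body TEXT)")
        count = 0
        for msg in mailbox.mbox(mbox_path):
            body = msg.get_payload()
            if msg.is_multipart():      # keep the sketch simple:
                body = str(body)        # flatten multiparts to text
            db.execute("INSERT INTO mail VALUES (?, ?, ?, ?)",
                       (msg["From"], msg["Subject"], msg["Date"], str(body)))
            count += 1
        db.commit()
        db.close()
        return count
    ```

    Old mail then stays searchable with ordinary SQL (e.g. `SELECT subject, date FROM mail WHERE body LIKE '%invoice%'`) long after the mailboxes have been emptied.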
