Best strategy for archiving some of your calendar

I'd like to archive all events in my calendar pre-2004. I'd also like to be able to re-import these events at a later date, either as a separate calendar (which can be deleted again) or back into the calendar that they came from.
What is the best strategy to do this, especially with regard to repeating events?
My first inclination is to duplicate my calendar (i.e. export and re-import it), delete all events from 2005 onwards in the copy (including removing all future occurrences of repeating events), and then export this copy as "Pre-2005". Then I would delete all events pre-2005 from the calendar I'm going to keep active, this time without removing the future occurrences of repeating events.
This, however, would appear to be an extremely laborious process, considering I would have to select and delete every event going back 5 years, month by month. Any other ideas?

Answer:
Check the calendar(s) you want to delete events from
Show the "Search Results" list (View>Show Search Results)
Sort by date (click on the Date heading)
Shift-select events to delete, then delete (gives you the choice to delete future events or not)

Similar Messages

  • What's best strategy for dealing with 40+ hours of footage

    We have been editing a documentary with 45+ hours of footage and presently have captured roughly 230 GB. Needless to say, it's a lot of files. What's the best strategy for dealing with so much captured footage? It's almost impossible to remember it all, and labeling it while logging seems inadequate, as it's difficult to actually read comments in dozens and dozens of folders.
    Just looking for suggestions on how to deal with this problem for this and future projects.
    G5 Dual Core 2.3   Mac OS X (10.4.6)   2.5 g ram, 2 internal sata 2 250gb

    Ditto, ditto, ditto on all of the previous posts. I've done four long form documentaries.
    First I listen to all the sound bites and digitize only the ones that I think I will need. I will take in much more than I use, but I like to transcribe bites from the non-linear timeline. It's easier for me.
    I had so many interviews in the last doc that I gave each interviewee a bin. You must decide how you want to organize the sound bites. Do you want a bin for each interviewee, or do you want to do it by subject? That will depend on your documentary and subject matter.
    I then have b-roll bins. Sometimes I base them on location and sometimes I base them on subject matter. This last time I based them on location because I would have a good idea of what was in each bin by remembering where and when it was shot.
    Perhaps you weren't at the shoot and do not have this advantage. It's crucial that you organize your b-roll bins in a way that makes sense to you.
    I then have music bins and bins for my voice over.
    Many folks recommend that you work in small sequences and nest. This is a good idea for long form stuff. That way you don't get lost in the timeline.
    I also make a "used" bin. Once I've used a shot I pull it out of the bin and put it "away." That keeps me from repeatedly looking at footage that I've already used.
    The previous posts are right. If you've digitized 45 hours of footage you've put in too much. It's time to start deleting some media. Remember that when you hit the edit suite, you should be on the downhill slide. You should have a script and a clear idea of where you're going.
    I don't have enough fingers to count the number of times that I've had producers walk into my edit suite with a bunch of raw tape and tell me that they "want to make something cool." They generally have no idea where they're going and end up wondering why the process is so hard.
    Refine your story and base your clip selections on that story.
    Good luck
    Dual 2 GHz Power Mac G5   Mac OS X (10.4.8)  

  • Best strategy for migrating GL 6 to DW CS4?

    First I gotta say that -- as a decade-long user of several Adobe products -- Adobe has really screwed over long-time users of Adobe GoLive. After a week of messing around with something that should be a slam dunk (considering the substantial amount of $ I've paid to Adobe over the years for multiple copies of GoLive 5, 6 and CS2), I can tell you Adobe's GL > DW migration utility ONLY works for sites created FROM SCRATCH in GL CS2 (they must have the web-content, web-data folder structure). This means that Adobe's migration utility only works for maybe 10% - 15% of the GoLive sites out there, and about 10% - 15% of Adobe's loyal GoLive customers, and it particularly screws over Adobe's longtime GoLive customers. Sweet! (Just like Adobe screwed over users of PhotoStyler -- which was a better image editor than Photoshop, BTW.) Obviously, I would walk away from Adobe and make sure I never ever paid them another cent, but the Republican-Democrat "free market economy" has made sure I don't have that option.
    So I've gotta make Dreamweaver work, which means I've gotta migrate several large sites (the biggest has 5,000 static pages and is about 2 gigs in size) from GoLive 6 (or GoLive CS2 with the older folder structure) to Dreamweaver CS4.
    Which brings me to my question -- what's the best strategy for doing this? Adobe's migration utility is a bad joke, so here's the alternative, real-world plan I've come up with. I'd appreciate any suggestions or comments...
    1) create copies of all my GL components in the content folders for my GL site (not in the GoLive components folder)
    2) replace the content of all GoLive components (in the GoLive components folder) with unique character strings, so that instead of a header with images and text, my old header would look something like xxxyyyzzz9
    3) create a new folder called astoni in the root of my hard drive. Copy all my GoLive web site content (HTML, images, SWF, etc.) into astoni in exactly the structure it was in with GL
    4) create a new Dreamweaver site by defining astoni as the local location for my site, astoni\images as the location for images, etc.
    5) use Dreamweaver and open the newly defined astoni site. Then open each of the GoLive components I copied into a content level folder in astoni, and drag each into the Dreamweaver Assets/Library pane, in order to create library items just like my old GoLive components
    6) use Dreamweaver to Search & Replace out the unique text strings like xxxyyyzzz9 with the content of my new DW library items
    7) refresh site view to make all the links hook up...
    Thanks for your help. Hope this discussion helps others too...

    Instead of ragging on people who are familiar with DW and Site Development, you should be gracious and accept the practical advice you've been given.  A "best strategy" would be to read some tutorials and learn how to work with HTML, CSS and server-side technologies. Without some basic code skills, you're going to find DW an uphill, if not impossible battle.
    Frankly, I don't have free time to hand-hold someone through the excruciating process of migrating a 5,000 page static site from GoLive  to DW. And I doubt anyone else in this forum has either.  We're not Adobe employees.  We don't get paid to participate here.  We are all product users JUST LIKE YOU.
    I'm sorry you're frustrated.  I'm also sorry for your clients. But the problem you have now isn't Adobe's fault. It's yours for not keeping up with server-side technologies or handing-off these huge static sites to more capable web developers.  I'm not saying you need to buy anyone's books, but they are good resources for people willing to learn new things.
    That said, maybe you should stick with GoLive.  The software doesn't have an expiration date on it and will continue working long into the future.  If you're happy using GL, keep using it to maintain your legacy sites. At the same time learn X/HTML, CSS & PHP or ASP.  Use DW CS4 for the creation of new projects.
    FREE Tutorial Links:
    HTML & CSS Tutorials - http://w3schools.com/
    From Tables to CSS Web Design Part 1 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt1.html
    From Tables to CSS Web Design Part 2 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt2.html
    Taking a Fireworks (or Photoshop) comp to a CSS-based layout in DW -
    http://www.adobe.com/devnet/fireworks/articles/web_standards_layouts_pt1.html
    Creating your first website in DW CS4 -
    http://www.adobe.com/devnet/dreamweaver/articles/first_cs4_website_pt1.html
    Guidance on when to use DW Templates, Library Items and SSIs -
    http://www.adobe.com/devnet/dreamweaver/articles/ssi_lbi_template.html
    Best of luck,
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists
    www.alt-web.com/
    www.twitter.com/altweb
    www.alt-web.blogspot.com

  • Best strategy for variable aggregate custom component in dataTable

    Hey group, I've got a question.
    I'd like to write a custom component to display a series of editable Things in a dataTable, but the structure of each Thing will vary depending on what type of Thing it is. So, some Things will display radio button groups (with each radio button selecting a small set of additional input elements, so we have a vertical array of radio buttons and, beside each radio button, a small number of additional input elements), some will display text-entry fields, and so on.
    I'm wondering what the best strategy for tackling this sort of thing is. I'm sort of thinking I'll need to do something like dynamically add to the component tree in my custom component's encodeBegin(), and purge the extra (sub-) components in encodeEnd().
    Decoding will be a bit of a challenge, maybe.
    Or do I simply instantiate (via constructor calls, not createComponent()) the components I want and explicitly call their encode*() and decode() methods, without adding them to the view tree?
    To add to the fun of all this, I'm only just learning Faces (having gone through the Dudney book, Mastering JSF, and writing some simpler custom components) and I don't have experience with anything other than plain vanilla JSP. (No EJB, no Struts, no Tapestry, no spiffy VisualDevStudioWysiwyg++ [bah humbug, I'm an emacs user]). I'm using JSP 2.0, JSF 1.1_01, JBoss 4.0.1 and JDK 1.4.2. No, I won't upgrade to 1.5 (yet).
    Any hints, pointers to good sample code? I've looked at some of the sample code that came with the RI and I've tried to navigate the JSF Blueprints stuff, but I haven't really found anything on aggregating components into a custom component. Did I miss something obvious?
    If this isn't a good question, please let me know how I can sharpen it up a bit.
    Thanks.
    John.

    Hi,
    We're doing something very similar. I had a look at the Tomahawk Date component, and it seems to dynamically create InputText components in encodeEnd(). However, it doesn't decode this directly (it only expects a single textual value). I expect you may have to check the request yourself in decode() (roughly as in the sketch below).
    Other ideas would be appreciated, though - I'm still new to JSF.
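    For what it's worth, here is a very rough sketch of that approach, written JSF 1.1 / JDK 1.4 style (no generics or annotations). The class and field names are made up for illustration: the component adds one HtmlInputText child per field of the Thing before rendering, and in decode() it pulls the submitted values back out of the request parameter map so they could be reassembled into the edited Thing. Treat it as a shape to start from, not a drop-in component; state saving, validation and the renderer are left out.

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.Map;

    import javax.faces.component.UIComponentBase;
    import javax.faces.component.html.HtmlInputText;
    import javax.faces.context.FacesContext;

    // Bare-bones "Thing editor": one text input per field of the Thing being edited.
    public class UIThingEditor extends UIComponentBase {

        // Hypothetical field list; a real component would derive this from the
        // type of the Thing (radio groups for some Things, text fields for others).
        private String[] fieldNames = { "name", "quantity" };

        public String getFamily() {
            return "example.ThingEditor";
        }

        public void encodeBegin(FacesContext context) throws IOException {
            // Build the sub-components once, before the first rendering.
            if (getChildren().isEmpty()) {
                for (int i = 0; i < fieldNames.length; i++) {
                    HtmlInputText input = new HtmlInputText();
                    input.setId(getId() + "_" + fieldNames[i]);
                    getChildren().add(input);
                }
            }
            super.encodeBegin(context);
        }

        public void decode(FacesContext context) {
            // The children decode themselves during Apply Request Values, but the raw
            // submitted values can also be read straight from the request parameter map
            // when they need to be reassembled into a single Thing.
            Map params = context.getExternalContext().getRequestParameterMap();
            for (Iterator it = getChildren().iterator(); it.hasNext();) {
                HtmlInputText input = (HtmlInputText) it.next();
                String submitted = (String) params.get(input.getClientId(context));
                // ... store 'submitted' on the Thing this component edits ...
            }
            super.decode(context);
        }
    }

    Whether you add the children to the view tree like this or instantiate them yourself and call their encode*()/decode() methods directly (your second idea) is mostly a trade-off: children in the tree get normal lifecycle processing and state saving for free, at the cost of managing when they are created and purged.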

  • Best strategy for burning dual layer DVD

    I've got a project that won't fit on a single layer DVD, so I've got it set up for dual layer and "best quality", which results in a 7.38GB disk image.  However, iDVD 8 warned me when burning the disk images that some utilities may not be able to burn a reliable dual layer disk copy and to use iDVD instead. Does this include Disk Utility?
    I always use Disk Utility to make copies; iDVD took almost 7 hours to burn the disk image, and I can't afford to wait that long for each copy.  So what's the best strategy for burning dual layer DVDs?

    Subsequent copies burn in far less time immediately after the first, provided you don't quit iDVD.
    If you know ahead of time you'll need more than a few copies, I'd recommend burning from a disc image saved to the desktop and reducing the burn speed (4x or lower). I prefer to use Roxio Toast myself, but others have had success with Apple's Disk Utility as well.

  • Absolute Best Strategy for Cropping

    What's the absolute best strategy for high precision cropping when time is not an issue and you want "perfect" quality?
    What's the absolute best strategy for low precision cropping when time is an issue and standards are still high?
    There's gotta be different strategies for doing this, and I was curious if we could choose a winner.
    Thanks,

    Heh, Marian, that about sums it up. 
    In all seriousness...  How does "precision" connect with "cropping"?  I've always thought of "cropping" as most based on an artistic judgment, not something done for precision.  Perhaps you could describe your photography and what you expect from your workflow?
    Are you cropping photos of documents?  People?  Stars, nebulae, and galaxies?
    Is this "precision" you speak of an attempt to express "minimal loss of pixel data"?  If that's the case, get the most powerful computer you can and always work at upsampled resolutions.  And don't resample during cropping.
    One last comment:  Consider employing the crop tool in the Camera Raw dialog.  This stores metadata telling future conversions how to crop the image, but the original raw data is still all intact.  Note that you can open a whole set of images in Camera Raw, do exposure and cropping changes, and click [Done] to save that metadata.
    -Noel

  • Best Strategy for Integrating Crystal/Business Objects with OpenACS Environment

    Post Author: devashanti
    CA Forum: Deployment
    I'm working for a client that uses AOL server and OpenACS for their web services/applications. I need suggestions on the best strategy to integrate a reporting solution using Business Objects XI. Ideally I'd like to send an API call from our web application's GUI to the Crystal API with report parameter values to pass into specific reports called via the API - I can get it down to one integer value being passed - or if this is not possible a way to seamlessly, from the end user perspective, move into a reporting module. We are using an Oracle backend database. I'm experienced with creating stored procedures and packages for reporting purposes.
    Although I have many years of experience integrating the Crystal active X controls into n-tier client server applications, the past few years I have had little opportunity to work with Business Objects and the newer versions of Crystal or web based solutions with Crystal Reports. I signed up to try out crystalreports.com, but I doubt my client will find this solution acceptable for security reasons as the reports are for an online invoicing system we are developing. However we can set up a reports server in-house if necessary, so it gives me some testing ground.
    Can anyone provide suggestions for a doable strategy?

    Please post this query to the Business Objects Enterprise Administration forum:
    BI Platform
    That forum is monitored by qualified technicians and you will get a faster response there. Also, all BOE queries remain in one place and thus can be easily searched in one place.
    Thank you for your understanding,
    Ludek

  • Best practices for applying sharpening in your workflow

    Recently I have been trying to get a better understanding of some of the best practices for sharpening in a workflow.  I guess I didn't realize it but there are multiple places to apply sharpening.  Which are best?  Are they additive?
    My typical workflow involves capturing an image with a professional digital SLR in either RAW or JPEG or both, importing into Lightroom, and exporting to a JPEG file for screen or for printing, both lab and local.
    There are three places in this workflow to add sharpening: in the SLR, manually in Lightroom, and during the export to a JPEG file or when printing directly from Lightroom.
    It is my understanding that sharpening is not added to RAW images even if you have added sharpening in your SLR.  However, sharpening will be added to JPEGs by the camera.
    Back to my question: is it best to add sharpening in the SLR, manually in Lightroom, or to wait until you export or output to your final JPEG file or printer?  And are the effects additive?  If I add sharpening in all three places, am I probably over-sharpening?

    You should treat the two file types differently. RAW data never has any sharpening applied by the camera; only JPEGs do. Sharpening is often considered in a workflow with three steps (see here for a founding article about this idea).
    I. A capture sharpening step that corrects for the loss of sharp detail due to the Bayer array and the antialias filter and sometimes the lens or diffraction.
    II. A creative sharpening step where certain details in the image are "highlighted" by sharpening (think eyelashes on a model's face), and
    III. output sharpening, where you correct for loss of sharpness due to scaling/resampling or for the properties of the output medium (like blurring due to the way a printing process works, or blurring due to the way an LCD screen lays out its pixels).
    All three of these are implemented in Lightroom. I. and III. are essential and should basically always be performed. II. is up to your creative spirits. I. is the sharpening you see in the develop panel. You should zoom in at 1:1 and optimize the parameters. The default parameters are OK but fairly conservative. Usually you can increase the mask value a little so that you're not sharpening noise, and play with the other three sliders. Jeff Schewe gives an overview of a simple strategy for finding optimal parameters here. This is for ACR, but the principle is the same. Most photos will benefit from a little optimization. Don't overdo it, but just correct for the softness at 1:1.
    Step II, as I said, is not essential, but it can be done using the local adjustment brush, or you can go to Photoshop for this. Step III, however, is very essential. This is done in the export panel, the print panel, or the web panel. You cannot really preview these things (especially the print-directed sharpening) and it will take a little experimentation to see what you like.
    For jpeg, the sharpening is already done in the camera. You might add a little extra capture sharpening in some cases, or simply lower the sharpening in camera and then have more control in post, but usually it is best to leave it alone. Step II and III, however, are still necessary.

  • Best encoder for archival purposes?

    I have been using AAC at 256 kbps recently for all my CD imports to iTunes, but I occasionally get skips in the songs (it happens on all computers). So, a two-part question:
    1) Am I better off using MP3 with Constant Bitrate of 256 or higher for best quality for importing and archiving to iTunes, and
    2) What causes the skipping of imported CDs even when the originals play just fine in other applications or CD players?
    Steve

    archival is a tricky word. it implies you want to keep the files as close to the original as possible. for true archival quality, you could use the Apple Lossless format, but it's not practical at all.
    VBR is the best compromise, it just sorts out the difficult-to-compress areas and gives them a higher bitrate than the rest of the file. i just set it at 192kbps with VBR, which is what i used with MP3. i figure the MP4 AAC LC encoder in iTunes has improved since its early days (not all that confident, though it doesn't matter too much).
    however, in your case, i think that the 'ripping' engine in iTunes is the problem.
    if the gaps and pauses are in the middle of tracks, then it is the drive correcting data incorrectly by using the ECC values from the cd/dvd drive to smooth over the invalid areas. try switching error correction off, it might help, but it depends on the drive and the CD in question.
    it's been a problem in the past for most removable media. reading errors and correcting them has 2 approaches: guessing what is supposed to be there by using the error correcting codes (faster, default behaviour), or re-reading the data several times at low speed until the same data is read back (slower, can damage hardware by constantly spinning up and down to re-read the same sectors).
    if the gaps are only near the end and beginning of tracks, then it's more likely due to the encoder and/or the disc. but it's uncommon, encoders are pretty good these days.
    some copy protection methods break the mastering conventions (known as Red Book CD audio) so you'd probably need to read the disc at 1x speed instead of 30x to 40x as you normally would. even then, it might not work OK.
    there are worse forms of copy-protection; most are oriented towards disabling or slowing down 'ripping' across the entire CD's content.
    if you have Windows, getting a hold of EAC to rip the files is the preferred fix.
    or, get a 1:1 cd copy program (i.e. programs used to rip copy-protected games will also rip copy-protected audio CDs) such as BlindWrite or CloneCD, which will allow you to make a backup from a damaged/copy-protected disc. you can then mount the disc image in a virtual cdrom and test that the virtual image is error-free.
    you can also use SlySoft's AnyDVD to have an on-the-fly application that repairs copy-protected media, i.e. DVDs and audio CDs with copy-protection, for personal use. it also makes reading DVDs and CDs faster and less hardware intensive.
    for mac, well, there's umm ... Boot Camp. a regular Toast/DMG creator won't do the job if it's one of the nastier copy protection mechanisms; they damage/alter the error correcting data on the disc intentionally so that the drive itself mangles the data every time it grabs that section of the cd.
    programs like EAC just read the data in its raw form over and over then compare each pass for integrity. if the drive is passing the mangled data to the OS and correcting it on the fly (via ECC), then your drive just won't be able to read the audio CDs without those gaps and breaks.
    so, in short, it's been a problem for a while.

  • Best strategy for data definition handling while using OCCI

    Hi, subject says all.
    Assuming OCCI is used for data manipulation (e.g. object navigation), what's the best strategy to handle data definitions in the same context?
    I mean primarily dynamic creation of tables and columns.
    Apparent choices are:
    - SQL from OCCI.
    - use OCI in parallel.
    Did I miss anything? Thanks for any suggestions.

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    [CarbonCopyCloner|http://www.bombich.com] is donationware; [SuperDuper|http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html] has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup to that disk. Each is completely independent.

  • Best method for archiving a table?

    I want to archive data in a table, based on a date selection.
    My initial thought was:
    -- Force creation of blank table, if it does not already exist
    create table ifsapp.customer_order_kmarchive as select * from ifsapp.customer_order where 1=2
    -- Update table with latest data to purge
    insert into ifsapp.customer_order_kmarchive as select * from ifsapp.customer_order where order_date < sysdate - 360
    -- Remove data
    delete from ifsapp.customer_order where order_date < sysdate - 360
    But of course you have to specify values on the insert. Is there a simple way round, without explicitly naming each column?
    Is there any check I can do so that if the copy fails, the delete won't happen?
    Thanks

    Oscar
    If a proper date format (or the TRUNC function) is not used, I bet your archival process will get messed up at some stage.
    Consider for example the below scenario and run a test in your database:
    SQL> create table original_table(id Number, order_date date);
    SQL> Begin
           For i in 1..100 Loop
             insert into original_table values (i, (sysdate-360) + i/3600);
           End Loop;
           Commit;
         End;
         /
    Now run the following select statement (it's like inserting into the archive table).
    Keep running it a few times and you will see how the TIME part in the date field affects your whole process.
    SQL> select count(*) from original_table where order_date < sysdate-360;
    You will see that the result returned will change as your clock moves, even though the date is the same.
    As a result, your INSERT into ARCHIVE_TABLE might start at 9 am and select all rows older than a year as of 9 am that day and archive them. Your DELETE statement might begin at 9:30 am and will delete all rows from original_table with order_date older than a year as of 9:30 am. Now you can see why you need a TO_DATE function or a TRUNC function to get consistent results.
    You can use a function-based index on TRUNC(order_date) and your queries will still perform great! A rough sketch of the whole archive-and-delete step, run as a single transaction, follows below.
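    A minimal sketch of the archive-and-delete step run as a single transaction, just to illustrate the consistency point, not a tested solution. It reuses the table and column names from the question (ifsapp.customer_order, customer_order_kmarchive, order_date), the connection details are placeholders, and it assumes the Oracle JDBC driver is on the classpath. With auto-commit off, the DELETE is only committed if the INSERT ... SELECT succeeded and the row counts agree, so a failed copy never loses data. Note that no column list is needed for INSERT ... SELECT when the two tables have the same structure, and TRUNC(SYSDATE) keeps both statements on the same cut-off regardless of the time of day.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class OrderArchiver {
        public static void main(String[] args) throws SQLException {
            // Placeholder connection details; adjust for your environment.
            String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL";
            try (Connection con = DriverManager.getConnection(url, "ifsapp", "secret")) {
                con.setAutoCommit(false); // copy + delete become one unit of work
                try (Statement st = con.createStatement()) {
                    String cutoff = "TRUNC(SYSDATE) - 360"; // same cut-off for both statements
                    int copied = st.executeUpdate(
                        "INSERT INTO ifsapp.customer_order_kmarchive "
                      + "SELECT * FROM ifsapp.customer_order WHERE order_date < " + cutoff);
                    int deleted = st.executeUpdate(
                        "DELETE FROM ifsapp.customer_order WHERE order_date < " + cutoff);
                    if (copied == deleted) {
                        con.commit();   // both statements worked and agree: keep the change
                    } else {
                        con.rollback(); // counts differ: leave everything untouched
                    }
                } catch (SQLException e) {
                    con.rollback();     // the copy (or the delete) failed: nothing is lost
                    throw e;
                }
            }
        }
    }

    The same idea works directly in SQL*Plus: SET AUTOCOMMIT OFF, run the INSERT ... SELECT and the DELETE with the same TRUNC-based cut-off, and only COMMIT when both have succeeded.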

  • Best strategy for flash access to an image in a protected folder

    hello all,
    i'm looking for a good strategy for my swf to be able to load images in a password protected directory. since flash can't access a file behind a protected folder i figure i'll use php to get the image and pass it to flash.
    how best to get an image from php to flash, since i can't just send a path to a file and have flash execute a Loader() to retrieve it (the Loader won't be able to get behind the protected folder)?
    is it best to have php send the file binary to flash, which flash will receive via URLLoaderDataFormat.BINARY? is it best to, at runtime, have php move the requested file from the protected folder to an unprotected folder and, after flash loads it, delete it from the unprotected folder? maybe it's best to leave the image folder unprotected and let an .htaccess redirect restrict outside access to the folder?
    suggestions?
    tia,
    michael

    The built-in Firefox Sync service allows you to store your settings, including bookmarks, on a Mozilla server. This isn't accessible as a file like Google Drive; it can only be used by connecting another copy of Firefox to your Sync account. You can read more about this here: [[How do I set up Firefox Sync?]] (I haven't tried it myself since it changed recently.)
    The cross-browser Xmarks service is more focused: it just does bookmarks. http://www.xmarks.com/ (I've never tried this one either)
    If you prefer working with files and you have a Dropbox or similar account, you could in theory back up your Firefox profile folder, or more particularly, the bookmarkbackups subfolder (a small copy sketch follows after this reply). I haven't tried that myself. To find the folder:
    Open your current Firefox settings (AKA Firefox profile) folder using either
    * "3-bar" menu button > "?" button > Troubleshooting Information
    * Help menu > Troubleshooting Information
    In the first table on the page, click the "Show Folder" button
    A window should launch showing your currently active settings files. The bookmarkbackups folder should contain dated .json files which can be used to restore your bookmarks to another copy of Firefox.
    Alternately, someone might have created an add-on for saving Firefox bookmarks to the cloud another way, but I haven't heard of one.
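    If you do go the file-copy route, the copy itself is only a few lines of code. A small sketch (untested against a live profile, and both folder locations are placeholders you pass in yourself) that copies the dated backup files from bookmarkbackups into another folder, for example one inside Dropbox:

    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class BookmarkBackupCopy {
        public static void main(String[] args) throws IOException {
            // args[0]: the profile folder that the "Show Folder" button opens
            // args[1]: the destination, e.g. a folder inside your Dropbox
            Path source = Paths.get(args[0], "bookmarkbackups");
            Path dest = Paths.get(args[1], "bookmarkbackups");
            Files.createDirectories(dest);
            try (DirectoryStream<Path> backups = Files.newDirectoryStream(source, "*.json*")) {
                for (Path f : backups) {
                    // Overwrite older copies so the destination mirrors the latest backups.
                    Files.copy(f, dest.resolve(f.getFileName()),
                            StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }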

  • Best strategy for importing organized images

    Hello,
    I have recently started using Aperture in my photography workflow and I need some advice from the experts on the best strategy to import the existing .JPGs from my hard disk into my Aperture library.
    Prior to using Aperture, my workflow was as follows:
    1.) I would import ALL the photos from my digital camera to a folder on my hard disk (one folder for each date of import). These, in effect, were my 'master' images, which I never edited or deleted.
    2.) Next I would make a copy of each new master folder, under a separate folder called 'picks', where I would go thru and delete the bad ones and occasionally make some minor edits or crops to the good ones. These 'picks' folders are what I used for my slideshows, photo galleries, etc.
    NOW, as I'm moving everything to Aperture, I'm attempting to find the best strategy to import everything, where I would maintain both a master copy of everything that came off my camera and the already organized 'albums' of 'versions' that represent my 'picks', but also reduce some of the redundancy of having duplicate copies of 60-70% of the photos in my library.
    Here is where I need some advice. In the case of new photos imported directly from the camera into Aperture I have managed to figure out how to import the master images and then create albums of versions for my good pics. However, I haven't found any way to import my existing masters and picks without importing two copies of each "good" pick.
    Is there any way I can get Aperture to recognize that the photos in my 'picks' folders are actually 'versions' of some of the original 'masters' so I don't need to import two master copies of each of my good pics? (I generally left the file names the same.)
    I have several thousand existing pics on my hard disk that I'm hoping to import into Aperture so I'm hoping there is an efficient way to do this!
    Many thanks for the help!

    Hi, I would also be interested in how others do this. I have been playing with Aperture for some time and I am still not sure whether to commit to it.
    I also keep my masters in a folder structure with year at the highest level and day of shooting below that.
    I know it's not necessary to do this in Aperture, but I am not sure I understand why, or whether it's a bad thing to do.
    Would love to hear what others think and advise.
    Thanks

  • Best Format for Archiving Video Files

    I am importing old VHS and 8mm family videos for my cousin, to both burn them to DVD, and to archive the movies in case he wants to edit them later.
    I am planning on purchasing for him a "write once" external HDD, probably FireWire (though I am open to suggestions), to put both the backup Video TS folders on, as well as the QuickTime movies themselves.
    My guess is the best format for the QT movies would be their original DV-NTSC, but at around 13 GB an hour, this can add up pretty fast.
    The h264 .mov codec looks great, takes forever to render, and it seems the file size difference to DV-NTSC is nominal at best.
    Obviously the future is some kind of HD, so these movies will never look great years from now, but I want them to be preserved for him at the best quality possible, at the smallest file size, if it will not compromise the quality too much.
    I know I can render them significantly smaller as h264 .mp4 files, or as DIVX, (which also takes a long time), but I imagine they are of significantly reduced quality that would be visible in future video editing he might do.
    If DV-NTSC is the only way to go, so be it. But if not, then any suggestions for what other format I could use to save these QT files, without visible loss of quality is greatly appreciated.

    Thanks for all the advice. And I mean everyone.
    Here is my latest dilemma.
    Though 15 years old, the original VHS tapes play beautifully. Aside from some (expected) glitches in the beginning or end of some of the shots, the video has not degraded appreciably. This is great news, as my cousin wanted to make backups before degradation begins.
    First, I captured the VHS movies and corrected some bad color, mostly due to poor white balancing, and on the computer they looked great in comparison. I burned DVDs and they looked terrible in comparison with the originals. Now I know I am going to lose some resolution when going to DVD, but it seemed harsh, so I ran some tests.
    I burned a DVD of the original capture, with no color correction, just the raw footage, and the DVD did not look much better, so it wasn't the color correction.
    Not satisfied, I made a copy from the FCP timeline straight to VHS (through the ADVC-110, I should note). This looked significantly degraded as well.
    Finally, I made a straight dub from VHS to DVCam, figuring that had to look good, and that also looked significantly degraded compared to the (now pristine-looking) VHS original.
    As the original VHS looks so good, what good would a TBC do? Or a proc amp?
    Is there some secret to capturing and exporting VHS that I am missing? Or will it always look significantly degraded, even straight to DVCam?
    David Murray wrote: "There is an industrial way to get good VHS copies but it is extremely expensive."
    I don't know what this is, but I am sure it is clearly out of our league. But David M., if you are still following this thread, I'm curious: would this ultra-expensive method actually result in a dub that looks like the original? I am actually really shocked at how bad the straight DVCam dub looks. It really does not look much better than the color-corrected, computer-exported version.
    The VHS tapes of my cousin's daughter are as precious to him as anything he has.
    As they date back 15 years, he is concerned about the longevity of the tapes.
    As there are over forty hours of them, he wants them to be in an editable format, so later he or his daughter can make a highlight reel, or whatever. Thus the QT movie archive/backup on HD.
    As Finalcutter said, this drive, even if unused and put in a cool dry safe, might not function years down the line, which makes it iffy. But what other choice is there for editable versions, unless I break the QT movies up into 20-minute chunks and archive them to over a hundred DVDs instead? Perhaps that is the safest solution, though certainly time consuming.
    As failure of the HDD somewhere in the future is likely, though not certain, I am still not sure that the HDD holding all his QT movies for future editing is the wrong way to go.
    As it is a digital version, and not an analog copy, I get that a DVCam backup of the originals is the best option. I will suggest to him that we do this as well, even though he does not have DVCam himself. But again, is there something I am missing here? Why does the VHS to DVCam copy look so degraded? Is there anything I can do to keep the original quality, or will it die with his original VHS as it slowly degrades over time?

  • Best Strategy for Exporting Fireworks into Dreamweaver

    I created a navigation bar in Fireworks and now I want to export it into Dreamweaver.
    So I went to Edit->Copy HTML Code (the hotkey for this is Ctrl+Alt+C).
    Now, what's the "best choice" from the dialog choices: AIR HTML, Dreamweaver HTML, Generic HTML, GoLive HTML, AIR XHTML, Dreamweaver XHTML, Generic XHTML or GoLive XHTML for a cross-browser "professional" website?
    I only tried the "Dreamweaver HTML" option and it seemed to work nicely in my rookie opinion.
    I think this technique might come at a performance cost to my graphic design or the website's speed. So should I use a different strategy for exporting Fireworks to Dreamweaver?  I have the CS5 package so I can use any of those programs for support.
    Thanks,

    IMHO, the best way to export from Fireworks to Dreamweaver is to first define your site in Dreamweaver, and then export the Fireworks document to the assets folder. Place your cursor where you want the navbar to be on your HTML page and choose Insert > Image Objects > Fireworks HTML.
