Importing/Transcoding best practices

Hello
Apologies if this is a very basic question: I just returned from Africa and I have many hours of video as a result. All the videos are 1080p and high quality, and vary from 30 seconds to 30 minutes in length, so some files are a few megabytes and others a few gigabytes.
The output for my project is no more than 7 minutes or so, so I have to cut a lot.
My question: what is the best practice when you have this much video? Should I import and transcode everything (all 73GB of video), or is the best practice to cut what you need first and then transcode only that?
I am using a 15" MBPr with 16GB RAM and a 512GB SSD, so I've unchecked the "Copy files to Final Cut Events Folder" option in order not to eat up my local HD.
Anyhow, any advice would be really appreciated, thanks again
Rob

First, thank you for your quick reply and being very nice to me seeing as my questions are probably very basic. I have purchased a book on FCPx but it didn't deal with workflows very well, especially with what I'm dealing with.
"I'd suggest you acquire a large external drive":
Done. I'm using a 1TB USB3 drive specifically for this project. All videos are loaded onto that drive, and when I began importing I checked off the option to "Copy files to FCPx events folder" in order to centralize my content on it. That said, the Events folder on the local HD DOES have content (pointers, I believe). Should I be backing those up to the external HD?
"Before capturing footage create a CDL (capture decision list) and capture what you intend to use."
Not done. Some of the footage I have didn't lend itself to this, unfortunately. For example, I had a GoPro camera mounted on my head and another mounted on the head of a local tribesman while we went hunting for small game (their food, of course). So the videos are long, and I'd like to include portions of them in the final video. Is my only option to import and optimize the whole thing, or can I import without optimizing, review, cut, save the portions I like, and then optimize only those sections?
I'm hoping you can spare a little more patience for me. I'm a photog so my workflow there is solid. I'm very new at this and I'd like to get better. The management of files for me is key so I want to get off on the right foot.
Cheers
Rob
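On the GoPro question (cut first, optimize later): one way to do the rough cut before anything touches Final Cut is a lossless stream-copy trim with a tool such as ffmpeg, so only the keepers get imported and optimized. A minimal sketch (the filenames and timecodes are hypothetical):

```python
def trim_cmd(src, start, duration, dst):
    """Build an ffmpeg command that cuts an excerpt WITHOUT re-encoding.

    Stream copy ("-c copy") only rewrites the container, so a 30-minute
    GoPro file can be reduced to the few minutes you actually want in
    seconds, with no generational quality loss. Cut points snap to the
    nearest keyframe, so trim a little wide and fine-cut in the editor.
    """
    return [
        "ffmpeg",
        "-ss", start,        # in-point, e.g. "00:12:30"
        "-i", src,
        "-t", duration,      # excerpt length, e.g. "00:02:00"
        "-c", "copy",        # stream copy: no re-encode
        dst,
    ]

# Hypothetical usage (filenames are made up):
#   import subprocess
#   subprocess.run(trim_cmd("gopro_hunt.mp4", "00:12:30",
#                           "00:02:00", "hunt_excerpt.mp4"), check=True)
```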

Similar Messages

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV).  I of course ask for AVIs when possible, but unfortunately I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files.  The results are unpredictable, ranging from working fine to not working at all, and everything in between.  Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods.  Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio.  It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODEC's, and does not install its own, like a few others. This DOES mean that if I get footage with an oddball CODEC, I need to go get it, and install it on the System.
    I can batch process AV files of most types/CODEC's, and convert to DV-AVI Type II w/ 48KHz 16-bit PCM/WAV Audio and at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS, my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    Only oddity that I have observed (mostly with DivX, or WMV's) is that occasionally, PrPro cannot get the file's Duration correct. I found that if I Import those problem files into PrElements, and then just do an Export, to the same exact specs., that resulting file (seems to be 100% identical, but something has to be different - maybe in the header info?) Imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different, after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most as a converter, though there are just some CODECs that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt
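Hunt's batch-to-DV-AVI workflow can also be sketched with ffmpeg, which ships its own decoders and so sidesteps the system-CODEC hunt. This is a sketch of the same target settings (DV video, 48 kHz 16-bit PCM, 29.97 fps), not DigitalMedia Converter itself:

```python
from pathlib import Path

def dv_convert_cmd(src: Path) -> list:
    """ffmpeg command converting one file to DV-AVI with 48 kHz
    16-bit PCM audio at 29.97 fps (NTSC), mirroring the settings
    described above."""
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "dvvideo",       # DV video codec
        "-c:a", "pcm_s16le",     # 16-bit PCM audio
        "-ar", "48000",          # 48 kHz
        "-r", "30000/1001",      # 29.97 fps
        str(src.with_suffix(".avi")),
    ]

def batch_cmds(folder: str) -> list:
    """One command per WMV/MP4/FLV/DivX file found in a watch folder."""
    exts = {".wmv", ".mp4", ".flv", ".divx"}
    return [dv_convert_cmd(p)
            for p in sorted(Path(folder).iterdir())
            if p.suffix.lower() in exts]
```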

  • Import data from excel file - best practice in the CQ?

    Hi,
    I have a question about importing data from an Excel file and creating a table from that data on a CQ page. Is there an OOTB component inside CQ that provides this kind of functionality? Has somebody implemented something like this, or is there a best practice for doing it?
    Thanks in advance for any answer,
    Regards
    kasq

    You can check a working example package [1] (use your Adobe ID to log in).
    After installing it, go to [2] for an immediate example.
    Unfortunately it only supports the old OLE-2 Excel format (.xls, not .xlsx)
    [1] - http://dev.day.com/content/packageshare/packages/public/day/cq540/demo/xlstable.html
    [2] - http://localhost:4502/cf#/content/geometrixx/en/company/news/pressreleases/my_personal_bests.html
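For anyone building such a component, the transformation itself is small. Here is a hedged sketch (plain Python, not an OOTB CQ component) that turns spreadsheet rows into the table markup a custom component could render; reading the old OLE-2 .xls format is shown with the xlrd library in the trailing comment:

```python
def rows_to_html_table(rows):
    """Render spreadsheet rows (a list of lists) as an HTML table,
    treating the first row as the header -- the kind of markup a
    custom CQ component could emit."""
    head, *body = rows
    out = ["<table>",
           "<tr>" + "".join(f"<th>{c}</th>" for c in head) + "</tr>"]
    for row in body:
        out.append("<tr>" + "".join(f"<td>{c}</td>" for c in row) + "</tr>")
    out.append("</table>")
    return "\n".join(out)

# Reading the old OLE-2 .xls format could be done with the xlrd library:
#   book = xlrd.open_workbook("data.xls")
#   sheet = book.sheet_by_index(0)
#   rows = [sheet.row_values(r) for r in range(sheet.nrows)]
```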

  • Best practice to back up our most important asset : iPhoto!

    Hi everyone!
    I consider my 30k photos, which I've carefully tagged/marked/modified/red-eye-corrected/etc., to be my most important asset. I've spent days and days in iPhoto to eventually build this 50GB library.
    I bought my 500 GB TC a few months ago and now, guess what, it's completely full!
    Calculating modifications, deletions and updates of my library now takes forever. I understand from this board that Time Machine/TC updates the 50GB as a whole instead of updating only the modifications ... what a nightmare in this almost ideal Apple world!
    Any recommendation to share? Any free apps to do this work for me?
    So far my personal brainstorming came to this process:
    1) download "Time Machine Editor"
    2) schedule updates to be daily or weekly
    3) be patient
    Thanks for sharing your best practices!
    dofre

    I would get a program such as SuperDuper ( http://www.versiontracker.com/dyn/moreinfo/macosx/22126 ) to CLONE your hard drive onto the external (as an image, if you don't have a free partition) and then when your new HD comes back, clone it back.
    SuperDuper is well worth the $20 shareware fee.

  • File import best practice

    I need some outside input on a process. We get a file from a bank and I have to take it and move it along to where it needs to go. Pretty straightforward.
    The complexity is the import process from the bank. It's a demand-pull process where an exe needs to be written that pulls the file from the bank and drops it into a folder. My issue is that they want me to kick the exe off from inside SSIS and then use a file watcher to import the file into a database once the download is complete. My opinion is that the SSIS package that imports the file and the exe that gets the file from the bank should be totally divorced from each other.
    Does anybody have an opinion on the best practice of how this should be done?

    Here it is: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/bd08236e-0714-4b8f-995f-f614cda89834/automatic-project-execution?forum=sqlintegrationservices
    Arthur My Blog
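The decoupled design the poster prefers is easy to sketch: the download exe only drops files into a folder, and a separate watcher hands completed files to the import job. A size-stability check is a crude but common way to avoid importing a half-downloaded file (the folder names and wait interval are assumptions):

```python
import shutil
import time
from pathlib import Path

def stable_files(drop: Path, wait: float = 2.0):
    """Return files in `drop` whose size did not change across `wait`
    seconds -- a simple heuristic that the download has finished."""
    first = {p: p.stat().st_size for p in drop.iterdir() if p.is_file()}
    time.sleep(wait)
    return [p for p, size in first.items()
            if p.exists() and p.stat().st_size == size]

def hand_off(drop: Path, ready: Path, wait: float = 2.0):
    """Move completed files into the folder the import job watches,
    keeping the downloader and the importer fully decoupled."""
    ready.mkdir(exist_ok=True)
    moved = []
    for p in stable_files(drop, wait):
        moved.append(Path(shutil.move(str(p), str(ready / p.name))))
    return moved
```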

  • Use Best Practice to Import Master Data

    Hi,
    I am an SAP beginner, and I would be glad if someone could guide me on my issue. How can I use best practice to import 1000+ material master data records into the SAP system? I have already prepared the data in an Excel spreadsheet. Can anyone guide me through the steps?
    Thanks.

    Hi,
    LSMW is a very good tool for master data upload. The tool is very rich in features but is also complex. Being a beginner, you should check with a consultant to learn how you can use LSMW to upload your 1000+ records. The tool itself is quite intuitive: after entering the LSMW transaction you create the project, subproject and the object you are going to upload. When you enter the next screen you see several radio buttons. Typically every upload requires all the features behind those radio buttons, in the same sequence. It is not possible to give the details of each of these steps in this forum. Please take a consultant's help in your vicinity.
    thanx
    Bala

  • Import of Process Chains - What is the best practice

    Hi Colleagues,
    Wish to check on what is considered best practice:
    1. Move process chains to the production environment through transports,
    or
    2. Create them in the production system directly (obviously after having tested the scenario in other environments).
    I would also be glad to know if you have links to posts with discussions of the errors encountered in transports of process chains.
    Good Day and regards!
    Uma Shankar

    Hi Srinivas,
    Thanks a lot for the reply and the time taken to update this post.
    I will have to try this suggestion and shall update this post. Am now away from the project site, however since i was trying to help a colleague, will forward this reply to him to try it out and seek his observation.
    However a couple of things:
    1. This problem arose when the process chain was transported from the Development to the Quality system.
    2. Though the error came up in the transport, when we tried to load data into the Quality system using this process chain we were able to do so.
    So it looks like the error had no impact on the performance of either the InfoPackage or the process chain. However, we were sceptical about our decision to import this process chain into Production because of this error.
    Having seen your reply, I will suggest my colleague try it out (assuming that this error is still a pending issue) and shall update this post for the benefit of all.
    Good Day and regards!
    Uma Shankar

  • Import files into bridge from a CMS - best practice

    Hi All,
    I am trying to come up with the simplest solution for the following:
    We have a CMS in which our content is managed and edited.
    The content editor might like to edit video files that reside in our asset management system.
    Our editing tool is Adobe Premiere.
    My thought was to transfer the video file, along with its metadata and edit instructions, from our asset repository to an FTP server, and then transfer these files into the local video-editing environment using Bridge. After editing, the file should be transferred back into the asset repository via the FTP server.
    It seems the way to do this would be to extend Bridge's functionality using the JavaScript SDK, but I would love to know whether there is a best-practice solution before we start detailed design and development.
    your help is highly appreciated and thanks in advance,
    Deena
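Before reaching for the Bridge SDK, it may help to see how small the FTP leg of this workflow is. A hedged sketch using Python's standard ftplib (the host, credentials, and the .xmp sidecar naming convention are all assumptions for illustration, not CMS or Bridge standards):

```python
from ftplib import FTP
from pathlib import Path

def sidecar_name(video: str) -> str:
    """Name of the metadata/edit-instruction file that travels with a
    video (the .xmp convention here is an assumption)."""
    return str(Path(video).with_suffix(".xmp"))

def pull_asset(host, user, password, video, dest_dir):
    """Fetch a video and its sidecar from the FTP staging server into
    the local editing environment. Pushing the edited result back is
    the same pattern with ftp.storbinary()."""
    dest = Path(dest_dir)
    with FTP(host) as ftp:
        ftp.login(user, password)
        for name in (video, sidecar_name(video)):
            with open(dest / Path(name).name, "wb") as fh:
                ftp.retrbinary(f"RETR {name}", fh.write)
```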

    Not sure what advice you have been given or how you have interpreted it.
    You are of course referring to iMovie 6 here, since you can't drag projects from iMovie '08 to iDVD.
    I don't make many DVDs anymore and may well be wrong here, but I'm not aware that dragging an iMovie 6 project to iDVD is any different from using Share > iDVD. I wonder whether the advice relates to exporting from iMovie 6 and then importing the exported movie into iDVD, which is indeed a different workflow.

  • Importing best practices baseline package (IT) ECC 6.0

    Hello,
    I hope this is the right forum.
    I have an SAP release ECC 6.00 with ABAP stack 14.
    In this release I have to install the preconfigured smartforms that are now called
    best practices baseline packages. These packages are localized, and mine is for Italy:
    SAP Best Practices Baseline Package (IT)
    The documents about the installation say that the required support package level has to be stack 10.
    And they say:
    "For cases when the support package levels do not match the Best Practices requirements, especially when HIGHER support package levels are implemented, only LIMITED SUPPORT can be granted"
    Note 1044256
    In your experience, is it possible to do this installation at this support package level?
    Thanks
    Regards
    Nicola Blasi

    Hi,
    a company wants to implement the preconfigured smartforms in an ECC 6.0 landscape.
    I think these smartforms can be implemented using the SAP best practices, in particular the baseline package (see service.sap.com/bestpractices --> baseline package); once installed, you can configure the scenario you want.
    The package to download differs by localization, for example Italy or another country, but this is not important at the moment.
    The problem is Note 1044256: it says that to implement this, I must have the support package level requested in the note, not lower and above all not higher.
    Before starting with this baseline package installation I'd like to know if I can do it, because I have an SP level of 14 for ABA and BASIS, for example, while the note says it wants an SP level of 10 for ABA and BASIS.
    What can I do?
    I hope it is clear now... let me know.
    thanks
    Nicola

  • Best practices / workflow for importing images?

    Is there a recommended workflow or best practices for importing images into the library? Specifically, I want to know -
    1.) Is it better to import /store the Masters right into the Aperture Library
    or
    2.) Is it better to move / store the Masters into a folder you specify and reference them?
    Even more specific, is there any advantage in storing the Masters in the Aperture library?
    As I'm determining how I want to do my workflow (I'm a new Aperture user, so I want to design my workflow from the start), I am thinking that the latter would be the way to go: storing them not in the library, but in a folder as referenced images. My reasoning is as follows:
    * Images more readily available for other applications, without having to do an export
    * Library database remains small, storing only the versions (less chance of corruption perhaps?)
    * Eliminate any issues with Time Machine - TM would backup the folder where the Masters are stored, and only need to update when new files are imported there. Also thinking that TM would have less issues backing up the Aperture library.
    So can someone either confirm my reasoning or tell me I'm way off base and should store the masters in the library, and why.
    Thanks in Advance!

    This link has a pretty good summary of the pros and cons of each method. Neither one is perfect of course. Otherwise this question would be easy to answer!
    http://www.bagelturf.com/aparticles/qanda/files/8992c352f4d1429747200b3e06c215fe-42.php

  • Importing SECTIONS (not Styles) from ePub Best Practices Sample Doc

    I have successfully imported STYLES from the ePub Best Practices Sample Document (http://support.apple.com/kb/HT4168) into an existing Pages document. However, the Best Practices doc also has defined SECTIONS which you can see by clicking on the sections button in the toolbar.
    How can I get those defined Section types (i.e. Table of Contents, Index, etc.) into my other document, which does not include them?
    Thanks,
    Jeff

    Hi Jeff,
    With the sample document and your new Pages document open, choose View > Page Thumbnails. Click on a thumbnail in the sample doc, then Edit > Copy. Click in the thumbnail pane in the new doc and Edit > Paste. I guess this does the same as selecting and copying the text in the document window.
    Regards,
    Ian.

  • One-time import from external database - best practices/guidance

    Hi everyone,
    I was wondering if there was any sort of best practice or guideline on importing content into CQ5 from an external data source.  For example, I'm working on a site that will have a one-time import of existing content.  This content lives in an external database, in a custom schema from a home-grown CMS.  This importer will be run once - it'll connect to the external database, query for existing pages, and create new nodes in CQ5 - and it won't be needed again.
    I've been reading up a bit about connecting external databases to CQ (specifically this: http://dev.day.com/content/kb/home/cq5/Development/HowToConfigureSlingDatasource.html), as well as the Feed Importer and Site Importer tools in CQ, but none of it really seems to apply to what I'm doing.  I was wondering if there exists any sort of guideline for this kind of process.  It seems like something like this would be fairly common, and a requirement in any basic site setup.  For example:
    Would I write this as a standalone application that gets executed from the command-line?  If so, how do I integrate that app with all of the OSGi services on the server?  Or,
    Do I write it as an OSGi module, or a servlet?  If so, how would you kick off the process? Do I create a jsp that posts to a servlet?
    Any docs or writeups that anyone has would be really helpful.
    Thanks,
    Matt

    Matt,
    the vault file format is just an XML representation of what's in the repository, and the same as the package format. In fact, if you work on your projects with Eclipse and Maven instead of CRXDE Lite, you will become quite used to that format throughout your project.
    Ruben
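If the standalone-script route is chosen, the one-time import can be as simple as POSTing each legacy row to the running instance over HTTP via the Sling POST servlet, with no OSGi packaging needed. A sketch (the paths, credentials, and node properties below are assumptions for illustration, not this project's actual content model):

```python
import urllib.parse

def page_post(parent_path, name, title, text):
    """Build the target path and form body for creating one page node
    through the Sling POST servlet. The property names below are
    illustrative; a real site would use its own templates/components."""
    fields = {
        "jcr:primaryType": "cq:Page",
        "jcr:content/jcr:primaryType": "cq:PageContent",
        "jcr:content/jcr:title": title,
        "jcr:content/text": text,
    }
    return f"{parent_path}/{name}", urllib.parse.urlencode(fields)

# Hypothetical one-time run against a local author instance:
#   import urllib.request, base64
#   path, body = page_post("/content/mysite/en", "legacy-page-1",
#                          "Migrated page", "body text from legacy DB")
#   req = urllib.request.Request("http://localhost:4502" + path,
#                                data=body.encode(), method="POST")
#   req.add_header("Authorization",
#                  "Basic " + base64.b64encode(b"admin:admin").decode())
#   urllib.request.urlopen(req)
```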

  • Number ranges - Import of Legacy Data - best practice

    Hi,
    we are planning to move legacy data objects to our SAP CRM.
    These objects have an external key (numeric, 6 digits) in a number range that is relatively full and highly fragmented.
    Is there a best practice how to implement internal number assignment for this kind of pre filled number range?
    The internal key in SAP would be different and under our control, the external key is the interesting one.
    Cheers,
    Andreas

    Hi Luís,
    The scenario is in the context of insurance business.
    The setup: SAP CRM as central Business Partner system. And in the CRM we keep the policy numbers of the surrounding (non SAP) policy systems as references (I'm talking about insurance policies...).
    For each policy we create a one order object, containing, among others, LOB, policy type and the policy number.
    These policy number ranges are to be maintained in the central CRM system in the future.
    And in one of these Systems they have the situation described above:
    6-digit key in a number range that is relatively full and highly fragmented. They are managing their numbers in an xls right now, but we would have them migrated into our system as well.
    After the migration we would be responsible for finding an unused number whenever a new policy is to be created.
    Cheers,
    Andreas
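Once the used numbers are migrated in, "find an unused number" in a fragmented 6-digit range is a simple gap scan. A sketch of just that lookup logic (not an SAP number-range object):

```python
def next_free(used, lo=0, hi=999999):
    """Return the lowest unused key in the 6-digit external range.

    `used` is the set of numbers migrated from the legacy xls. A full
    scan of a 10^6 range is cheap enough to run per policy creation,
    and it naturally fills the gaps in a fragmented range first.
    """
    taken = set(used)
    for n in range(lo, hi + 1):
        if n not in taken:
            return n
    raise ValueError("external number range exhausted")

def as_external_key(n: int) -> str:
    """Format with leading zeros, e.g. 42 -> '000042'."""
    return f"{n:06d}"
```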

  • Best Practices for Keeping Library Size Minimal

    My library file sizes are getting out of control. I have multiple libraries that are now over 3TB in size!
    My question is, what are the best practices in keeping these to a manageable size?
    I am using FCPX 10.2. I have three cameras (2x Sony Handycam PJ540 and 1x GoPro Hero 4 Black).
    When I import my PJ540 videos they are imported as compressed MP4 files. I have always chosen to transcode the videos when I import them; obviously this is why my library sizes are getting out of hand. How do people deal with this? Do they simply import the videos, choose not to transcode them, and only transcode them when they need to work on them? Or do they leave the files compressed, work with them that way, and let transcoding happen when exporting the final project?
    How do you deal with this?
    As for getting my existing library sizes down, should I just "show package contents" on my libraries and start deleting the transcoded versions, or is there a safer way to do this within FCPX?
    Thank you in advance for your help.

    No. Video isn't compressed like compressing a document. When you compress a document you're just packing the data more tightly. When you compress video you do it basically by throwing information away. Once a video file is compressed, and all video files are heavily compressed in the camera, that's not recoverable. That information is gone. The best you can do is put it into a format that will not deteriorate further as the file is recompressed. Every time you compress a file, especially to heavily compressed formats, more and more information is discarded. The more you do this the worse the image gets. Transcoding converts the media to a high-resolution, high-data-rate format that can be manipulated considerably without loss, and go through multiple generations without loss. You can't go to second-generation H.264/MPEG-4 without discernible loss in quality.

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet, and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the canvas (but NO CHART yet) and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the combo box.
    So, I guess this brings up a few questions:
    1) Am I doing something wrong, and did I miss something that would prevent this problem?
    2) If this is standard Xcelsius behavior, what are the best practices to solve the problem?
    a. Do I have to create 50 different data ranges in order to improve performance (i.e. each Country-Category would have a separate range)?
    b. Would it even work if it had that many data ranges in it?
    c. Do you aggregate it as a crosstab (months as column headings) and insert that crosstabbed data into Excel?
    d. Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to a certain extent.
    Regards
    Nikhil
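One practical way to act on that advice is to pre-compute each Country/Category matrix before it ever reaches the dashboard, instead of making Xcelsius evaluate thousands of SUMIFs at runtime. A sketch of that pre-aggregation (the field order in the tuples is an assumption based on the aggregation described above):

```python
from collections import defaultdict

def pivot(rows, country, category):
    """Build the Base x Month matrix of transaction counts that the
    chart consumes, for one country/category pair -- the same result
    as the concatenated-key SUMIF matrix, computed once up front.

    `rows` are (year, month, country, base, category, count, amount)
    tuples, i.e. the already-aggregated ~6000-row data set.
    """
    matrix = defaultdict(lambda: defaultdict(int))
    for year, month, ctry, base, cat, count, amount in rows:
        if ctry == country and cat == category:
            matrix[base][(year, month)] += count
    return {base: dict(cols) for base, cols in matrix.items()}
```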
