Best codec for data discs?

I want to clear a large project off my drive by burning it onto data discs. What's the best codec to use for later FCP editing?
On a slightly different topic, is there a best "universal" codec that can be edited in other popular NLE systems, e.g. Avid, Pinnacle, etc.?
Thanks.

Read up on H.264; it's for distribution, not editing. The only codec that approaches platform agnosticism or universality is MPEG-1, which is also useless as an image-preservation or editing codec. After that, maybe plain ol' DV. There is a different flavor for each operating system, but getting the files unwrapped and imported into every NLE cannot be assumed.
bogiesan

Similar Messages

  • Best codec for DVD export for editing

    If I want to burn DVDs for a client who will want to edit the footage later, what's the best codec for that?

    What editing application will they be editing with?
    DVDs aren't the best archival solution. They hold 4.7 GB of info, and that is only about 20 minutes of DV-quality footage. Forget about HD quality... oh, unless you use HDV. That's the same data rate as DV.
    Shane
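    For reference, Shane's 20-minute figure checks out. DV runs at roughly 3.6 MB/s on disk (25 Mbit/s video plus audio and overhead), so a quick back-of-the-envelope check, assuming those standard DV numbers, looks like this:

    # Rough check of how much DV footage fits on a 4.7 GB DVD.
    # DV is ~25 Mbit/s video; with audio and overhead it lands
    # around 3.6 MB/s on disk (the commonly quoted figure).
    DV_BYTES_PER_SEC = 3.6 * 1000 * 1000   # ~3.6 MB/s
    DVD_CAPACITY_BYTES = 4.7 * 1000**3     # 4.7 GB (decimal, as marketed)

    minutes = DVD_CAPACITY_BYTES / DV_BYTES_PER_SEC / 60
    print(f"~{minutes:.0f} minutes of DV per single-layer DVD")  # ~22 minutes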

  • Best codec for export from FCE?

    Heya, I'm sort of a newbie, so please be patient with me
    I'm trying to export a video from FCE using QT conversion.  I'm going to be putting it on Youtube.  What is the best codec for export?  I've always heard that H.264 is the 'best', but my videos have always been pale and washed out.  I tried exporting using MPEG-4 video, and it's not washed out.  Is H.264 better and I should just deal with the paleness, or does it not matter too much?
    Thanks

    All the Web video sites re-compress your videos, so it's generally best to upload high-quality files; quality is typically determined by bit rate.
    My suggestion would be to export as a QuickTime movie at current settings (guessing that you edited an AIC sequence).
    If you don't have MPEG Streamclip, do a search and download it. In Streamclip, open the file you exported from FCE and export to QuickTime. For Compression, choose H.264 from the drop-down menu. Don't change Frame Size. If your movie is interlaced, check De-interlace. Check the Limit Data Rate box and type in 12,000 if your movie is 1080, 6,000 if it is 720, or 3,000 if it is 480.
    Use the YT uploader and you should be good to go.
    Russ
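    (MPEG Streamclip is a GUI tool, but the same recipe can be scripted. The sketch below uses ffmpeg instead, purely as an assumed stand-in encoder; the bitrate table mirrors Russ's numbers, and the file names are placeholders.)

    # Hedged sketch of Russ's recipe, using ffmpeg rather than
    # MPEG Streamclip (any H.264 encoder with a data-rate cap will do).
    import subprocess

    # Russ's suggested data-rate caps, keyed by frame height (kbit/s).
    BITRATE_KBPS = {1080: 12000, 720: 6000, 480: 3000}

    def encode_for_youtube(src, dst, height, interlaced=False):
        vf = ["-vf", "yadif"] if interlaced else []  # de-interlace only if needed
        subprocess.run(
            ["ffmpeg", "-i", src,
             "-c:v", "libx264", "-b:v", f"{BITRATE_KBPS[height]}k",
             *vf,
             "-c:a", "aac", "-b:a", "192k",
             dst],
            check=True)

    encode_for_youtube("export_from_fce.mov", "youtube_upload.mp4", 720)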

  • Best practice for data migration install v1.40 - Error 2732 Directory Manager

    Hi
    I'm attempting to install SAP Best Practice for Data migration 1.40 on Win Server 2008 R2 (64 bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help appreciated on what the root cause of the error is, as the file does not exist in that folder in the installation zip file.
    The other prerequisite, .NET 3.5.1, is already met.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on data migration v1.40 installations on the SAP website and marketplace. The link below should guide you to the right place. It has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • Best Codec for exporting animations with alpha channel from FCP timeline

    What's the best codec for exporting animations (hopefully using lossless compression) that will retain the alpha channel and use the current sequence settings for fps and size?
    Currently I'm exporting image sequences, but I'd prefer a wrapper. I don't know much about the 'Animation' export codec. I did notice that "set to current size" had some funky value of 753x450 or something, instead of my current project size of 1920x1080...
    Anyhow, just seeing what others are doing.
    Basically I'm reading out some titles that I'll end up dropping over video later, but since they render out so slowly (**** lower thirds are slow) I'm trying to get a jump on the process.
    Tx..

    PNG should do it. I think PNG is lossless (that's what they say, anyway). But realistically, if you use PNG, Animation, or JPEG 2000 set to highest quality, I'd dare you to tell the difference.
    I use PNG because it's supposedly lossless, renders twice as fast as Animation in an FCP timeline, and is usually about half the size.
    If you use "Export Using Quicktime Conversion", you will need to check all of the settings manually to make sure they match your sequence settings. This is always true, no matter what codec you use.
    If that sounds risky, your other option is to go into your sequence settings and change the codec to the one you want to render to (this will probably turn your whole timeline red), and then choose Export -> Quicktime movie.
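    (If you ever script this step instead of using the export dialog, the same idea can be expressed with ffmpeg: PNG inside a QuickTime wrapper, with the frame size and rate forced to the real sequence settings. The tool choice and the file names below are assumptions for illustration, not what FCP does internally.)

    # Sketch: wrap a PNG image sequence (with alpha) into a QuickTime
    # .mov, pinning size and fps to the sequence settings.
    # 1920x1080 at 25 fps and the file names are placeholders.
    import subprocess

    subprocess.run(
        ["ffmpeg",
         "-framerate", "25",
         "-i", "title_%04d.png",   # hypothetical image-sequence name
         "-c:v", "png",            # lossless, keeps the alpha channel
         "-pix_fmt", "rgba",       # explicit: preserve transparency
         "-s", "1920x1080",        # don't trust defaults like 753x450
         "titles_with_alpha.mov"],
        check=True)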

  • Best Course for Data Warehousing

    Hi,
    I am planning to join a data warehousing course. I heard there are a lot of courses in data warehousing:
    Data warehousing with ETL tools or
    Data warehousing with Crystal Reports or
    Data warehousing with Business object or
    Data warehousing with Informatica or
    Data warehousing with Bo-Webel or
    Data warehousing with Cognos or
    Data warehousing with Data Stage or
    Data warehousing with MSTR or
    Data warehousing with Erwin or
    Data warehousing with Oracle.
    Please suggest which is best to choose and which has more scope, because I don't know the ABCs of data warehousing, but I have some experience in Oracle.
    Is work experience in data warehousing a must before I can get a job? Please also tell me the best book for data warehousing, one that starts from scratch. Please give your suggestions on my queries.
    Thanks & Regards,
    Raji

    Hi,
    Basically, a data warehouse is a concept. To develop a DW, we mainly need two tools: an ETL tool and a reporting tool.
    A few famous ETL tools are:
    Informatica
    Data Stage
    A few famous reporting tools are:
    Crystal Reports
    Cognos
    Business object
    As a DW developer you should know at least one ETL tool and at least one reporting tool. The combination is your choice; it is better to find out the best combination for the job market, and then learn those.
    Erwin is a data modeling tool; it can also be used in DW implementation. You already have experience with Oracle, so my advice is to go for data warehousing with Oracle or with Informatica, and learn one reporting tool. I do not know whether there is any reporting tool available from Oracle.
    My suggestion on books.
    Data Warehousing Fundamentals by Paulraj Ponniah, and
    The Data Warehouse Toolkit by Ralph Kimball.
    http://www.inmoncif.com/about.html is one of the best sites for data warehousing.
    With rgds,
    Anil Kumar Sharma .P
    Assigning points is the way to say thanks on the SDN site.

  • SAP Best Practices for Data Migration :repositories only on MS SQL Server ?

    Hi,
    I'm implementing the "SAP Best Practices for Data Migration" (see https://websmp109.sap-ag.de/bp-datamigration).
    As part of the installation you have to install MS SQL Server Express Edition. The installation guide contains detailed steps to do this. All repositories for Data Services should be running on SQL Server, according to the installation guide.
    The customer I'm working for now does not want to use SQL Server, but DB2, as company standard.
    So I use DB2 for the local and profiler repositories.
    I notice, however, that the web application http://localhost:8080/MigrationServices does not support DB2. The only database type you can select in the configuration area is MS SQL Server.
    Is this a limitation, or by design?

    Hans,
    The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server. The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone. SQL Server Express was chosen to host the repositories because it is easy to set up and can be downloaded for free. Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
    The next release, including the web app, will support SQL Server and Oracle, but not DB2.
    Paul

  • Best practices for data migration

    Hi All,
    This thread is for sharing ideas and knowledge about making better and best use of data migration with Business Objects Data Services.


  • Best codec for delivering 1080p video in flash

    I'm developing some AS3 that will end up in a mac projector (.app).
    This app will stream 1080p videos from the local disk.
    What is the best codec for fluid videos? (no hiccups, fast start, etc)
    Is there any special setting I have to consider in the code or in the publish settings?
    TIA

    bump

  • Best codec for editing

    What's the best codec to export to DVD if someone else is editing the footage?

    Again, you're asking two separate questions. If your client wants to edit the footage, you don't want to burn a playable DVD, you want to burn a data disc. This means you are able to put HD content on there (really, you're able to put anything on there), but you're limited to 4.7GB on a standard disc. As Tom says, that is very little space (especially if you're working with HD).
    What you really need is to send your client a hard drive with this footage. Whether or not it is editable on "various platforms" depends on what it is now: open it in QT and bring up the Inspector. What does it say the format is?
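    (If it helps, burning the data disc itself can also be scripted on the Mac; below is a minimal sketch using hdiutil, with made-up paths. The Finder's burn feature accomplishes the same thing.)

    # Sketch: build an ISO from a folder of footage and burn it
    # as a data disc on a Mac. All paths are hypothetical.
    import subprocess

    # Package the footage folder as a hybrid (ISO/Joliet) disc image.
    subprocess.run(
        ["hdiutil", "makehybrid", "-iso", "-joliet",
         "-o", "footage_disc.iso", "/Users/me/project_footage"],
        check=True)

    # Burn the image to the blank disc in the drive.
    subprocess.run(["hdiutil", "burn", "footage_disc.iso"], check=True)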

  • Best codec for output to stand-alone HD video playable through VLC on a PC

    I finished a wedding in France, recorded in HD on a Sony XDCAM EX-1.
    I am editing in Final Cut 6.0.6.
    I had photos from Photoshop CS4 imported into the film and did not interlace them. They were accepted into the timeline with the Sony footage: XDCAM EX 1080i50, 35 Mb/s VBR.
    I would like to deliver an uncompressed version on a hard drive, playable in VLC on a PC. I would also like to see if I can compress this half hour and put it on a DVD.
    Final Cut and Compressor had trouble with this footage when making the QuickTime stand-alone video and a 39 Program stream.mpeg.
    When I did get a successful copy, it would not copy to an external hard drive; it reads "disc error".
    Is this because of the mix of interlaced and non-interlaced material? Should I go back and interlace all the photos and try again?
    What is the best codec to export a full-res version from Final Cut? (QuickTime, which ProRes?) And if I try to put the half hour on a DVD (not Blu-ray), which codec is best? MPEG-2?
    Thanks very much,
    Bill Phipps

    Hi Kevin,
    Thanks for your input. I shot the wedding in France in PAL on the Sony EX1 using XDCAM EX 1080i50, 35 Mb/s VBR, and I want to deliver to the bride and groom a full HD copy on a hard drive, to be played on a PC using VLC.
    I was able to load my footage, and Sony allows me to select this in Final Cut as a setting when I edit. I imported some still photos from Photoshop and these are not interlaced, something I am now concerned about. I could easily interlace them; however, I was thinking it would be better to de-interlace the video footage instead, as everyone viewing the film will do so on a PC.
    I found a thread on Creative Cow which tells me:
    Re: To deinterlace,or not to deinterlace
    by Ed Dooley on Jan 8, 2009 at 11:01:35 am
    Not necessarily true. Yes you'll lose the resolution if you use the simple de-interlace filter in FCP (and many other programs). If, however, you use Frame Controls in Compressor (do a search of this forum and you'll find lots of instructions), you do not lose res, the fields get blended. There are 3rd party products that do the same thing (Nattress has one).
    Having said all that, I wouldn't deinterlace unless you need to (like for the web for example), the program's going out to TV? Leave it interlaced.
    Ed
    At the moment I cannot find Frame Controls in Compressor, and I saved my project just before my last step of using the Final Cut Pro de-interlace filter on all of the HD PAL footage.
    I had to try 3 times to make a stand-alone QuickTime, which normally goes smoothly for me, so I started to wonder if the interlaced video with non-interlaced photos was the problem.
    The stand-alone QuickTime and the MPEG-2 both gave me trouble, and I got an error when trying to copy them to a drive formatted in the only cross-platform format I know, FAT32. It is a 7200 rpm drive.
    I got an answer on Creative Cow where the person advised me to convert the file to ProRes.
    I would appreciate any input.
    Thanks for your time and attention.
    Bill

  • Best strategy for data definition handling while using OCCI

    Hi, subject says all.
    Assuming OCCI is used for data manipulation (e.g. object navigation), what's the best strategy for handling data definitions in the same context?
    I mean primarily dynamic creation of tables and columns.
    Apparent choices are:
    - SQL from OCCI.
    - use OCI in parallel.
    Did I miss anything? Thanks for any suggestions.

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    [CarbonCopyCloner|http://www.bombich.com] is donationware; [SuperDuper|http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html] has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup *to that disk*. Each is completely independent.

  • Best Practice for Data Refresh & Hierarchy

    Hi,
    During a recent discussion with one of our BI user groups, questions were raised about the best practices for handling the following two issues.
    Issue 1:
    If entries are posted to prior periods in SAP R/3 (outside of the daily auto-refresh range), the current process is that the user group will ask us to conduct a manual refresh in BI for the prior periods which are affected.
    Question: Is it possible to set up a trigger in the system, so that BI knows which periods are changed and automatically refreshes data for those periods?
    Issue 2:
    If a hierarchy used in the reports is modified, there might be an adverse impact on the financial data the user group reports. The current process we have in place is to run a group of BI reports for both the current year and the prior year to make sure nothing is impacted, but there are limitations to this process. What if there is no impact on the current or prior year, but on the years before that?
    Question: What other global companies do to minimize such reporting impact, especially when they have hundreds of complex reports?
    If anyone has any info on this, please share it.
    Thanks all for your support.
    Regards,
    Murali

    Hi Srini,
    1. SAP suggests implementing a data archiving strategy as early as possible to manage database growth.
    However, people usually think of archiving only once they run into problems like large data volumes, slow system response times, performance issues, etc.
    2. There is a proper way to implement data archiving. The database has to be analyzed first to identify the top DB tables and archiving objects.
    3. Based on the DB analysis, the data archiving plan has to be implemented according to the data management guide.
    4. There is a minimum period, known as residence time, that has to elapse before any data can be archived. Once a document is business-complete and has served its minimum required period in the database, it can be archived.
    5. Before going for data archiving there are many steps to be followed, like analysis, configuration, etc., which you can see in detail at the link below:
    http://help.sap.com/saphelp_47x200/helpdata/en/2e/9396345788c131e10000009b38f83b/frameset.htm
    Let me know if this helps you.
    -Supriya
    Edited by: Supriya  Shrivastava on May 4, 2009 10:38 AM

  • Table Owners and Users Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to the users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward the tables in other data marts will be updated on different schedules, and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best Methodology for table ownership of tables in different data marts that share tables across marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanx,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you never have to revoke access. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table (see the sketch after this reply). If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart, with perhaps one additional account for the truly generic tables, that owns each data mart's objects. Those accounts would then grant different database privileges to different roles, and you would then grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without being granted every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
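    (A minimal sketch of the shadow-table swap Justin describes, written as Oracle SQL driven from Python. The table names, credentials, and the cx_Oracle driver are all assumptions for illustration; any client that can issue the same statements works.)

    # Load into a shadow table, then swap it in with quick renames,
    # so readers lose access only for the instant of the rename.
    # SALES_FACT / SALES_FACT_SHADOW and the DSN are hypothetical.
    import cx_Oracle

    conn = cx_Oracle.connect("etl_user/secret@datamart")
    cur = conn.cursor()

    # 1. Load fresh data while readers keep using the live table.
    cur.execute("INSERT INTO sales_fact_shadow SELECT * FROM staging_sales")
    conn.commit()

    # 2. Swap the tables with two renames (DDL auto-commits in Oracle).
    cur.execute("ALTER TABLE sales_fact RENAME TO sales_fact_old")
    cur.execute("ALTER TABLE sales_fact_shadow RENAME TO sales_fact")

    # 3. Recycle the old table as next cycle's shadow table.
    cur.execute("ALTER TABLE sales_fact_old RENAME TO sales_fact_shadow")
    cur.execute("TRUNCATE TABLE sales_fact_shadow")

    One caveat worth testing in your own environment: dependent views, synonyms, and grants do not all survive a rename the same way, so verify the swap before relying on it.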

  • The best practice for data archiving

    Hi
    My client has been using OnDemand for almost 2 years. There are around 2M records in the system (Activities), so I just want to know the best practice for data archiving; we don't care much about data older than 6 months.

    Hi Erik,
    Archival is nothing but deletion.
    Create a backup cube in BW. Copy the data from your planning cube to the backup cube, and then delete that data region from your planning cube.
    Archival will definitely improve the performance of your templates, scripts, etc., since the system will then search a smaller dataset.
    Hope this helps.
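    (The copy-then-delete idea above is BW-specific, but the pattern is generic. Here is a minimal relational analogy in Python with sqlite3; the activities table, its columns, and the six-month cutoff are invented for illustration.)

    # Archive rows older than 6 months: copy them to an archive
    # table, then delete them from the live table, atomically.
    import sqlite3

    conn = sqlite3.connect("crm.db")
    cur = conn.cursor()

    # Create an empty archive table with the same shape as the source.
    cur.execute("""CREATE TABLE IF NOT EXISTS activities_archive AS
                   SELECT * FROM activities WHERE 0""")

    with conn:  # one transaction: copy and delete succeed or fail together
        cur.execute("""INSERT INTO activities_archive
                       SELECT * FROM activities
                       WHERE activity_date < date('now', '-6 months')""")
        cur.execute("""DELETE FROM activities
                       WHERE activity_date < date('now', '-6 months')""")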
