Render in high precision YUV

Hello,
My FCP (version 6.0.3) project contains several scenes. Each scene, which is made up of SD PAL clips, resides in its own sequence. I'm rendering each sequence in high-precision YUV and nesting all of the sequences together in a new sequence. This parent sequence will be exported to Compressor. Should this parent sequence be rendered in high-precision YUV too?
Thanks in advance

I'd make them the same...
Ya know, you don't HAVE to nest these sequences to a master. You can copy and paste the clips from the various scene sequences too... That way you know they are using the renders from the earlier sequences whose settings are exactly the same all the way down the line. But match these sequences exactly as far as all settings go.
Jerry

Similar Messages

  • FCP6: any problem using "Render in 8-bit YUV" instead of "Render 10-bit material in high-precision YUV" video processing?

    I have a long and complex 1080p FCP6 project using ProRes 422. It is made up of mostly high-resolution stills and some 1280i video clips. Rendering has always been a nightmare. It takes extremely long and makes frequent mistakes which have to be re-rendered. Just today, I discovered the option of selecting "Render in 8-bit YUV" instead of "Render 10-bit material in high-precision YUV" video processing. The rendering time is cut down to a fraction, and even on a large HD monitor I can tell no difference in quality. I am getting ready to re-render the entire project in 8-bit and just wanted to check if changing to 8-bit would pose some problems and/or limitations that I'm not aware of. This is not a broadcast or Hollywood film thing, but it does represent my artwork and I burn it to Blu-ray, so I do want it to look good. Like I said, I can tell no difference between the 8-bit and 10-bit color depth with the naked eye. Thank you all for all the help you have always given me with my many questions in the past.

    Unless you have a 10-bit monitor (rare and very expensive) you cannot see the difference, as your monitor is 8-bit.
    10-bit is useful for compositing and color grading. Otherwise, 8-bit is fine for everything else.
    x

  • Render in 8-bit YUV or Render all YUV material in high-precision YUV

    Friends,
    I'm a wedding videographer and I work with Mini DV cams (the PD150). For a particular job I will have a lot (I mean a LOT) of color correction to do. Given that, should I work my sequence with the setting Render in 8-bit YUV or Render all YUV material in high-precision YUV?
    Thanks again!!!

    I read your commentary:
    "Selecting this option does not add quality to clips captured at 8-bit resolution when output back to video; it simply improves the quality of rendered effects that support 10-bit precision."
    So, is the 3-Way Color Corrector an effect that supports 10-bit precision? If I use "Render all YUV material in high-precision YUV", will the video look nicer?
    Thanks again

  • High precision YUV rendering

    May seem excessive....
    I have a long project and I have decided to divide it into smaller video clips for the final DVD, so I created a new sequence for every new video clip (a portion of the previous, bigger one). The original was rendered in high-precision YUV and I forgot to modify the settings in the new sequences before pasting. I pasted, then modified the settings, and then re-rendered... is it OK anyway? I noticed that if you paste after changing the settings, no rendering is required. I just want to be sure that the final result is the same.
    Thanks

    Though if any changes are made to the sequence, it would kill the render, correct?
    I'm just seeing how exporting this with my color correction is taking 7 hours for my system to do... I'd love to know if there is some secret way to have rendered files saved so that it doesn't take 7 hours every time, in case I have to make a change.
    -p

  • High precision YUV

    I shoot DV SD PAL, which is 8-bit. I put two or three filters on a clip. Will those effects look better if I render in high-precision YUV instead of 8-bit YUV? The final output is DVD. Or can rendering in high-precision YUV make things worse? Render time does not matter.

    But the Apple manual says:
    "This is the highest-quality option for processing video in Final Cut Pro. This option processes all 8- and 10-bit video at 32-bit floating point. In certain situations, such as when applying multiple filters to a single clip or compositing several clips together, a higher bit depth will improve the quality of the final render file even though the original clip has only 8 bits of color information."
    To me that means I will get a better result, or...?

  • 8 bit or high precision

    Question for those who edit a lot of miniDV-
    Which render setting do you use or recommend? The 8-bit or the high-precision YUV, and is there a big difference between the two? Seems like the high precision is kind of like polishing a turd, so to speak.

    I rarely use 10-bit rendering; there are some bugs associated with the high precision, and the return is seldom there.
    Patrick

  • Is there a way to view timestamps in DIAdem with a higher precision than 100 microseconds?

    I understand DIAdem has limitations on viewing timestamps due to DateTime values being defined as a double value, and that this results in 100 µs resolution. Is there any way to get around this? I am logging time-critical data with timestamps from an IEEE 1588 clock source, and it is necessary to have higher resolution. Perhaps I could convert the timestamp to a double before logging, but then I would have to convert it back in DIAdem somehow...
    Thanks,
    Ben

    As you said, DIAdem can only display up to 4 decimal positions on a timestamp. Timestamps in DIAdem are recorded as the number of seconds since 01/01/0000 00:00:00.0000. To achieve a higher precision, it would be necessary to use a relative timestamp. Many timestamps are defined from different references anyway, so it might be possible to import the timestamps as numeric values to maintain their precision. Converting the timestamp prior to importing into DIAdem seems like a viable method of working around the precision limit.
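    For instance, a minimal sketch of that idea in Java (the timestamp values and names are made up for illustration): subtract the first timestamp and store relative seconds as a double, so the small magnitudes keep the mantissa available for sub-microsecond resolution instead of spending it on the huge offset since year 0.

        // Hedged sketch: log timestamps relative to the first sample so that a
        // double retains sub-microsecond resolution. Assumes timestamps arrive
        // as nanoseconds since the Unix epoch (e.g. from an IEEE 1588 clock).
        public class RelativeTimestamps {
            public static void main(String[] args) {
                long[] rawNanos = {                 // absolute ns since epoch (example data)
                    1_700_000_000_000_000_000L,
                    1_700_000_000_000_012_345L,
                    1_700_000_000_000_024_690L
                };
                long t0 = rawNanos[0];
                for (long t : rawNanos) {
                    // Relative seconds: small value, full precision preserved.
                    double relSeconds = (t - t0) / 1e9;
                    System.out.printf("%.9f%n", relSeconds);
                }
            }
        }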
    Steven

  • Request for info on fatal error handling and High-Precision Timing in J2SE

    Hi
    Could anyone please provide some information or some useful links on fatal error handling and High-Precision Timing Support in J2SE 5.0?
    Thanks

    Look at System.nanoTime().
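    For interval timing, a minimal sketch (System.nanoTime() exists since J2SE 5.0; it is monotonic and meant for measuring elapsed time, not wall-clock time, and the sleep below is just a stand-in for the work being timed):

        public class NanoTimeDemo {
            public static void main(String[] args) throws InterruptedException {
                long start = System.nanoTime();
                Thread.sleep(10);                        // work being timed
                long elapsedNs = System.nanoTime() - start;
                System.out.printf("elapsed: %.3f ms%n", elapsedNs / 1e6);
            }
        }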

  • FYI - High precision data issue with sdo_anyinteract on 11.2.0.3 with 8307

    For anyone who may be experiencing issues with SDO_ANYINTERACT on 11.2.0.3 with high-precision geodetic data, I currently have a service request in to fix it.
    The issue we have is with locating small polygons ("circles" from 0.5"-4") in the 8307 space. The metadata we have specifies a 1mm tolerance for this data, which has worked fine since (as I remember) 10.1. Support verified it works fine up to 11.2.0.2, then is broken in 11.2.0.3.
    So if you are pulling your hair out - stop. ;-) The SR# is 3-5737847631, and the bug# (will be) 14107534.
    Bryan

    Here is the resolution to this issue...
    Oracle came back and said that what we have at that tolerance is unsupported, and we were just lucky for it to have worked all these years. They are not going to fix anything because it technically isn't broken. We pointed out that the documentation is a little unclear on what exactly supports higher precision, and they noted that for future updates.
    When asked if they would entertain a feature request for a set of high-precision operators (basically the old code) in a future release, they basically said no. So for the few items that must have higher precision, we are on our own.
    What still puzzles us is that apparently no one else is using high-precision data in lat/lon. Amazing, but I guess true.
    Anyhow, here is what we used to use (up to 11.2.0.3), which worked fine at a 1mm tolerance:
    Where mask_geom is:
    mask_geom      :=
             sdo_geometry (2001,
                           8307,
                           sdo_point_type (x_in, y_in, 0),
                           NULL,
                           NULL);
    SELECT copathn_id
      INTO cpn
      FROM c_path_node a
     WHERE sdo_anyinteract (a.geometry_a2, mask_geom) = 'TRUE'
       AND node_typ_d = 'IN_DUCT'
       AND ROWNUM < 2;

    Basically this finds indexed geometry and compares it to a single mask geometry (a simple point for the x/y given). Only one row is returned (in case they overlapped duct openings - not normal).
    Since this no longer returns any rows reliably for items less than 5cm in size, here is our work-around code:
    SELECT copathn_id
      INTO cpn
      FROM (  SELECT copathn_id,
                     node_typ_d,
                       ABS (ABS (x_in) - ABS (sdo_util_plus.get_mbr_center (a.geometry_a2).sdo_point.x))
                     + ABS (ABS (y_in) - ABS (sdo_util_plus.get_mbr_center (a.geometry_a2).sdo_point.y))
                        distdiff
                FROM c_path_node a
               WHERE sdo_nn (a.geometry_a2,
                             mask_geom,
                             'distance=0.05 unit=m') = 'TRUE'
            ORDER BY distdiff)
     WHERE node_typ_d = 'IN_DUCT'
       AND ROWNUM < 2;

    Essentially we use sdo_nn to return all results (the distance usually is 0) at the 5cm level. At first we thought just this would work; then we found that in many cases it would return multiple results, all stating a distance of 0 (not true).
    For those results we then use our own get_mbr_center function that returns the center point for each geometry, and basically compute a delta from the given x_in,y_in and that geometry.
    Then we order the results by that delta.
    The outer select then makes sure the row is of the correct type, and that we only get one result.
    This works, and is fast (actually it is quicker than the original code).
    Bryan

  • Higher precision of timestamp when writing to txt

    Hello,
    I would like to have higher precision (milliseconds) in the timestamp when saving waveforms to .txt. LabVIEW only prints the date and the time as HH:MM:SS. As I am acquiring data at a rate of 1 kHz, 1000 data values have the same time description in the .txt file.
    Note: This problem only occurs when writing to .txt; it is no problem to get higher precision by using the graph or the chart.
    Any help or suggestions would be appreciated.

    Thanks so far.....
    Maybe I was not precise enough. What I am looking for is a way to easily manipulate the format of the timestamp that comes with my data and then write it to .txt. I already used the "Format Date/Time String" VI to get the time with the milliseconds part, and afterwards joined this time information with the data of the waveforms (which I also had to extract from the waveforms first), but I thought there would be a more elegant way - if I can extract the ms part from the timestamp, it must have been in it all along, right? ;-) So why can't I tell LabVIEW to also display the ms part when using the "Write Waveforms to .txt" VI? I attached a .txt file with a short excerpt of data, which should visualize the problem.
    Regards
    Message Edited by Marauder on 03-10-2006 03:20 PM
    Attachments:
    data_with_same_timestamp.txt 10 KB

  • Slow Render with High-End System?

    I'm currently working on a (in my opinion) high-end Windows system for video editing. The system is about two years old and cost a fortune at the time, so I'm expecting significantly better speed. So here's my problem:
    I'm working primarily in Premiere Pro and After Effects. All the media I work with are JPEG sequences imported as video. Often I have multiple sequences (up to 7 or 8) overlaid and tweaked with dissolves and plugins like Twixtor. I also use Adobe Dynamic Link from After Effects to Premiere and vice versa. All the footage is currently 1080p, but in the future I will want to render 4K. I'm aware that a 4K workflow is probably a pain in the *** so I'm surely going to edit offline in 1080p. However, I can't get any real-time playback with all my sequences. I ALWAYS have to render a preview to watch my edits. I don't know if I'm just having too high expectations for my system, but I'm fairly sure there has to be an issue behind this lack of performance. Maybe the Dynamic Link is slowing my system down?
    System Specs:
    Model : HP Z400 Workstation 103C_53335X
    Mainboard : HP 0B4Ch
    System BIOS : HP 786G3 v03.15 10/29/2010
    RAM : 12GB ECC DIMM DDR3
    Processor : Intel(R) Xeon(R) CPU           W3530  @ 2.80GHz (4C 3GHz/3GHz, 2.13GHz IMC, 4x 256kB L2, 8MB L3)
    Chipset:
         HP X58 I/O Hub 2x 2.4GHz (4.79GHz)
         HP Core Desktop (Bloomfield) UnCore 2x 2.4GHz (4.79GHz), 3x 4GB ECC DIMM DDR3 1GHz 192-bit
    Graphic card : NVIDIA Quadro 4000 (8CU 256SP SM5.0 950MHz, 512kB L2, 2GB 2.81GHz 256-bit, PCIe 2.00 x16)
    Harddisks:
          4x WDC WD2002FAEX-007BA0 (1TB, RAID10/SATA600, SCSI-4, 3.5", 7200rpm) : 932GB (C:)
         Intel Raid 1 Volume (4TB, RAID, SCSI-4) : 4TB (D:)
         HL-DT-ST BD-RE BH10LS30 (SATA150, BD-RE, DVD+-RW, CD-RW, 4MB Cache) : k.A. (E:)
    Thank you very much in advance for your help, and I apologize for any grammatical mistakes since English is not my first language.

    Valentin,
    I have always called a RAID10 a solution for the paranoid in a hurry. It takes four drives to give you the capacity and performance of two disks, but gives you security through the mirroring.
    Before going into your specific situation, allow me to tell something about volumes and drives, because they can be confusing and at the same time they are very important for optimal performance of a system.
    Single disk, not partitioned is 1 disk = 1 volume to the OS
    Single disk, partitioned is 1 disk = multiple volumes (not a good idea BTW)
    Multiple disks in one raid array is Many disks = 1 volume
    Multiple disks in one raid array with partitions is Many disks = multiple volumes (not a good idea either)
    Each volume has a distinct drive letter for access.
    Partitioning is a thing of the past and should not be used at all on modern systems.
    You have to think about volumes more than about number of disks. In my current system I have 27 different physical disks but only 4 volumes. In the old one I have 17 disks and 5 volumes.
    Now that we are clear what we are talking about, volumes with distinct drive letters, we can address your situation.
    You have TWO volumes, C: (single disk) and D: (4 disks in RAID10). Spreading the load across two volumes is more demanding and gives slower performance than using more volumes, unless one or more volumes are very fast, as I tried to explain in a previous reply (remember Isenfluh/Sulwald?). If you add an SSD as you intend, you will have increased the number of volumes to 3, which will definitely help performance: SSDs are faster than conventional disks, and the pagefile can be stored on the SSD, so all your performance will go up.
    Compare your setup with mine, with rough estimated figures:

    Volume   Valentin     Harm         Transfer rate Valentin   Transfer rate Harm
    C:       1 HDD        1 SSD        125 MB/s                 450 MB/s
    D:       4x RAID10    1 SSD        250 MB/s                 450 MB/s
    E:       NA           21x RAID30   NA                       2,700 MB/s
    F:       NA           1 HDD        NA                       150 MB/s
    These figures are indicative, but they do show where the major differences are. In my experience the disk setup is overlooked quite often, but it has a huge impact on a system's responsiveness. It is the weakest (slowest) link in the chain, after all, and with your workflow, doubly so.
    But in your specific case there is something else, and that is your disappointing hardware MPE score. 100 seconds is extremely slow, even for a Quadro 4000. It would be quite normal to see a score around 8-9 seconds on such a system, maybe around 12-13 seconds with your ECC memory, but 100 is way too slow. Some background services or processes are interfering with the hand-over from memory to GPU to VRAM and back. This can be caused by a myriad of things, but a good starting point would be the BlackViper list of services to set to manual or disabled, and taking a closer look at the running processes with Process Explorer. There should normally be fewer than 50 processes running.
    Hope this helps.

  • Lightroom doesn't render final high quality images

    In Lightroom 3.3 RC, it seems there are many more times when Lightroom fails to render the final high-quality image. I see the initial low-resolution version, and the final high-resolution version never appears. Once this happens, I never get the high-resolution version for that image until I exit and restart Lightroom. This happens in the modules, but it has also happened in a slideshow after all slides have been prepared by Lightroom. I may have seen this in 3.2, but it seems much worse in 3.3 RC. I updated to 3.3 for Nikon D7000 support, so I have no idea if the difference could be related to support for that camera.
    Thanks,
    Kevin

    I don't use a lot of optical flow, but I found that if I want it rendered, the render needs to be forced. As you point out, the analysis happens automatically, but the render bar remains.
    Just to test, I added some text to a re-timed clip with OF, and naturally the render bar returned. Re-rendered, and it went away.
    Is your experience different?
    BTW, if you can play and evaluate unrendered clips OK, it is possible to skip the render step and export.
    Russ

  • Higher precision for the time

    Hi,
    Can we get the time to a higher precision, such as 1/100 of a second?
    Thks & Bst Rgds,
    HuaMin

    Hi
    Yes, Oracle9i and above have it.
    You can use the TIMESTAMP data type with its precision, like:
    CREATE TABLE test (col1 VARCHAR2(10), col2 TIMESTAMP(4));
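    A minimal usage sketch to go with that (hypothetical row; SYSTIMESTAMP carries fractional seconds, and the FF4 format prints the four digits a TIMESTAMP(4) column stores):
    INSERT INTO test VALUES ('row1', SYSTIMESTAMP);
    SELECT TO_CHAR(col2, 'HH24:MI:SS.FF4') FROM test;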
    If you omit the precision, the default is 6. But I had some issues developing Forms and Reports on tables that have a TIMESTAMP column; I think Forms and Reports do not support TIMESTAMP.
    Regards
    Tony G.

  • GPS Access to high precision time and time stamping WASAPI microphone captures

    I am interested in using multiple WPs to capture microphone WAV that is time-stamped with high accuracy (sub-millisecond), hopefully from the GPS system. The GPS hardware obviously has the capability to sub-microsecond levels, but I can't find an API for it.

    What I would like to do is get geo-positional data, which has a defined statistical uncertainty but might be relatively better correlated, and as accurate a time stamp as possible. Latency isn't an issue but consistency is. GPS, of course, easily produces time information to sub-microsecond precision, though I don't know a way to access it in WP. 0.1 ms consistency would be all I really need, but it's important that each phone in a cluster be able to capture and time-stamp a sound (assume all phones are equidistant from a source) to within 0.1 ms. I am thinking of a product that could be used, for one obvious example, at weddings: capture the proceedings and allow after-the-fact enjoyment by replaying them and shifting the focus to the bride/groom/minister as they talk, using beam-forming DSP on the data. There are other ways, but it occurs to me that the ubiquity of smartphones would really make this easy. Just have the guests download an app. It would be part of a wedding doc package along with videography and stills.

  • High Precision Voltage Source to Calibrate 24 bit ADC

    I am trying to find a cost-effective way to calibrate a 24-bit ADC with a voltage source.
    The ADC has 3 differential inputs. Its ranges are 40, 20, 10, 5, 2, 1, and 0.5 Vpp.
    It natively samples at 32 kHz, but I will be taking a 1-second sample as a reading for the calibration.
    The source must be able to produce half full scale of each range, i.e. 20 V down to 0.25 V differential.
    The source must be able to produce the signal to +-0.01% (i.e. 20 V +-2 mV down to 0.25 V +-25 µV; see the quick check below).
    I was looking at the PXI-4132; I would have to buy the PXI frame and controller.
    Is there a cheaper or better solution for this task? I also looked at the Keithley ($6,000) and Agilent ($5,750) solutions.
    An SMU appears to be more feature-rich than what is called for in this task.
    This is my first project, so please be critical of my post so that I can improve.
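    A quick worked check of that tolerance arithmetic, as a minimal sketch (the values are just the half-scale test points listed above, not tied to any particular instrument):

        // Prints the required source accuracy per range:
        // +-0.01% of each half-full-scale test point.
        public class CalTolerance {
            public static void main(String[] args) {
                double[] rangesVpp = {40, 20, 10, 5, 2, 1, 0.5};
                for (double vpp : rangesVpp) {
                    double testPoint = vpp / 2.0;       // half full scale, in volts
                    double tolVolts = testPoint * 1e-4; // 0.01% of the test point
                    System.out.printf("range %4.1f Vpp: %6.3f V +- %.1f uV%n",
                                      vpp, testPoint, tolVolts * 1e6);
                }
            }
        }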

    Two ways: get a reference source (have a look at the secondary market, like this; I think you will need a recalibration anyway)
    or get a stable source and a reference voltmeter.
    Also have a look at how others solved this problem, like Jim Williams from Linear; see
    http://cds.linear.com/docs/Design%20Note/dsol11.pdf
    http://cds.linear.com/docs/Application%20Note/an86f.pdf
    However, as Lynn already pointed out, 16 bits seems to be all you need.
    And: usually an ADC used ratiometrically might go up to 22 bits; not ratiometric, it will need a reference better than 22 bits (OK, 16 bits here, that can be done). So if your ADC will work ratiometrically in your application, all you need is a stable voltage divider, since reference sources are available....
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'
