Maximum Bit Depth /Maximum Render Quality  Questions

Maximum Bit Depth
If my project contains high-bit-depth assets generated by high-definition camcorders, I was told to select Maximum Bit Depth because Adobe Premiere Pro uses all the color information in these assets when processing effects or generating preview files. I'm capturing HDV using the Matrox RTX-2 Hardware in Matrox AVI format.
When I finally export my project using Adobe Media Encoder CS4, will selecting Maximum Bit Depth provide better color resolution once I post to Blu-ray format?
Maximum Render Quality
I was told that Maximum Render Quality maintains sharp detail when scaling from large formats to smaller formats, or from high-definition to standard-definition formats, and that it also maximizes the quality of motion in rendered clips and sequences, rendering moving assets more sharply. It's my understanding that at maximum quality, rendering takes more time and uses more RAM than at the default normal quality. I'm running Vista 64-bit with 8 GB of RAM, so I'm hoping to take advantage of this feature.
Will this also improve resolution when I finally export my project using Adobe Media Encoder CS4 and post to Blu-ray format?
Does it look like I have the specs to handle Maximum Bit Depth and Maximum Render Quality when creating a new HDV project with the support of the Matrox RTX 2 hardware, capturing in Matrox AVI format? See specs below.
System Specs
Case: Coolmaster-830
Op System: Vista Ultimate 64 Bit
Edit Suite: Adobe Creative Suite 4 Production Premium Line Upgrade
Adobe Premiere Pro CS 4.0.1 update before installing RT.X2 Card and 4.0 tools
Performed updates on all Adobe Production Premium Products as of 03/01/2009
Matrox RTX2 4.0 Tools
Main Display: Dell 3007 30"
DVI Monitor: Dell 2408WFP 24"
MB: ASUS P5E3 Deluxe/WiFi-AP LGA 775 Intel X38
Display Card: SAPPHIRE Radeon HD 4870 512MB GDDR5 Toxic ver.
PS: Corsair|CMPSU-1000HX 1000W
CPU: INTEL Quad Core Q9650 3G
MEM: 2Gx4|Corsair TW3X4G1333C9DHXR DDR3 (8 Gigs Total)
1 Sys Drive: Seagate Barracuda 7200.11 500GB 7200 RPM 32MB Cache SATA 3.0Gb/s
2 Raid 0: Seagate Barracuda 7200.11 500GB 7200 RPM 32MB Cache SATA 3.0Gb/s using Intel's integrated RAID controller on the MB

Just some details that I find useful on Maximum Bit Depth
You really need it, even with 8-bit source files, when using heavy grading, multiple curves, or vignettes. If you see banding after grading, go to Sequence > Sequence Settings from the top menu and check "Maximum Bit Depth" (ignore the performance warning), then check your preview again (it updates in a second) to see whether the banding is still present in 32-bit mode. If there is no banding, check the option when exporting; if the banding is still there, change your grading. Then uncheck it to continue editing.
Unfortunately, exporting at Maximum Bit Depth is extremely time-consuming, but it can really SAVE YOUR DAY when you're facing artifacts after heavy grading, by completely or almost completely eliminating banding and other unwanted color distortions.
Use it only for small previews or the really final output.
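The banding mechanism described above is easy to demonstrate outside Premiere. A toy sketch (this is not Adobe's pipeline; the "grade" is a made-up darken-then-brighten pair, chosen only because it is easy to verify):

```python
# Toy sketch (not Premiere's pipeline): why chained 8-bit grading shows
# banding while 32-bit float grading does not.
def q8(v):                       # quantize a 0..1 value to 8-bit levels
    return round(v * 255) / 255

def darken(v):   return v * 0.25          # a heavy "grade" step
def brighten(v): return min(1.0, v * 4.0) # a compensating step

ramp = [q8(i / 255) for i in range(256)]  # an 8-bit gradient

# 8-bit pipeline: re-quantize after every effect
eight_bit = [q8(brighten(q8(darken(v)))) for v in ramp]
# float pipeline: keep full precision, quantize once at the end
float_pipe = [q8(brighten(darken(v))) for v in ramp]

print(len(set(eight_bit)), "levels vs", len(set(float_pipe)))
# the 8-bit chain ends up with far fewer distinct levels: visible bands
```

The gradient that survives the float pipeline keeps all 256 levels; the one that was quantized between effects collapses to a fraction of them, which is exactly the banding the preview shows.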
Best Regards.

Similar Messages

  • Maximum bit depth-maximum render quality when dynamic linking

    Hi
    A bit confused by the use of Maximum bit depth and Maximum render quality as used both in Sequence Settings and also as options when rendering in AME.
    1 Do you need to explicitly enable these switches in the sequence for best quality or, do you simply need to switch them on in AME when you render in Media Encoder?
    2 When dynamic linking to After Effects, when should you use an 8 bit vs 16 or 32 bit working space, and, how does this bit depth interact with the maximum bit depth, maximum render quality in PPro?

    Hi jbach2,
    I understand your confusion. I'm like that most of the time I'm working. *chuckle* The two settings you mentioned are two completely different parameters affecting (or is it effecting?) your video. You do not need to enable them within the sequence itself unless you want to preview video on your program monitor at the highest quality. I personally don't recommend it, as it's a tremendous resource hog (the program even warns you when you try to click them) and unnecessary for improving final output. Again, do not enable these options in your sequence settings if all you want is a high-quality export. Doing so will greatly reduce your editing performance unless you have a high-end system... and even then I don't think it's worth it unless you're editing on a huge screen with a Director who wants to see everything at maximum quality during the edit process.
    Keeping it simple...
    Resizing your final output video? Use Maximum Render Quality.
    Starting or working with high bitdepth sources? Use Max Bit Depth.
    When/where do I enable these? In the AME only. ^_^
    Why?:
    Enabling the Max bit and Max render only needs to be done when you are exporting.  They both serve different functions. 
    Max Render aids in the scaling/conversion process only.  My understanding is that you never need to enable the Max Render Quality (MRQ) unless you are exporting in a format/pixel ratio different from your original video.  For example, when rendering a 1080p timeline out to a 480p file format, you'll want to use MRQ to ensure the best scaling with the least amount of artifacts and aliasing.  If you're exporting at the same size you're working with, DON'T enable MRQ.  It will just cost you time and CPU. Its only function is to do a high quality resizing of your work.
    Maximum bit depth increases the color depth that your video is working with and rendering to.  If you're working with video that has low color depth, then I don't believe it will matter.  However, if you're working with 32 bit color on your timeline in PPro and/or After Effects, using lots of graphics, high contrast values, or color gradients, you may want to enable this option. It ultimately depends on the color depth of your source material.
    The same applies to After Effects.
    Create something in AE like a nice color gradient. Now switch the same project between 8, 16, and 32-bit depth, and you will see a noticeable difference in how the bit depth affects your colors and the smoothness of the gradient.
    Bit depth affects how different plugins/effects change your overall image. Higher depth means more colors to work with (and, incidentally, more CPU needed).
    Just remember that "DEPTH" determines how many colors you can "fill your bucket with" and "QUALITY" is just that, the quality of your "resize".
    http://blogs.adobe.com/VideoRoad/2010/06/understanding_color_processing.html
    Check out this adobe blog for more info on color depth ^_^  Hope that helps!
    ----a lil excerpt from the blog i linked to above---
    Now, 8-bit, 10-bit, and 12-bit color are the industry standards for recording color in a device. The vast majority of cameras use 8-bits for color. If your camera doesn’t mention the color bit depth, it’s using 8-bits per channel. Higher-end cameras use 10-bit, and they make a big deal about using “10-bit precision” in their literature. Only a select few cameras use 12-bits, like the digital cinema camera, the RED ONE.
    Software like After Effects and Premiere Pro processes color images using color precision of 8-bits, 16-bits, and a special color bit depth called 32-bit floating point. You’ve probably seen these color modes in After Effects, and you’ve seen the new “32″ icons on some of the effects in Premiere Pro CS5.
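The "QUALITY is just the quality of your resize" point above can be sketched in miniature. This is a toy 1-D example, not Adobe's actual resampler: point-sampling versus area-averaging when downscaling 4x:

```python
# Toy sketch: "maximum render quality" is essentially a better
# resampling filter. Compare naive point-sampling with area-averaging
# when downscaling a 1-D "image" by 4x.
src = [(i % 2) * 255 for i in range(64)]   # alternating fine detail

# Naive: take every 4th pixel -> aliasing (only one phase survives)
naive = src[::4]

# Higher quality: average each 4-pixel window -> correct mid grey
quality = [sum(src[i:i + 4]) / 4 for i in range(0, len(src), 4)]

print(naive[:4])    # all 0 here: the detail collapsed to one phase
print(quality[:4])  # settles at the true mean, 127.5
```

The naive path throws away three of every four samples and aliases; the averaged path keeps the energy of the detail, which is the "sharp detail when scaling down" the feature description promises.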

  • Bit Depth and Render Quality

    When you finally export media to some sort of media format via the encoder does the projects preview Bit Depth and Render Quality settings affect the output file?
    I know there is "Use Preview files" setting in the media exporter dialogue but I just want to be sure of what I am doing.

    Jeff's response is my perspective, as well, which is both backed up by my own tests and the official Adobe word.
    Exhibit A: My Tests
    That is DV footage with a title superimposed over it in a DV sequence, with a Gaussian blur effect (the Premiere accelerated one) applied to the title; all samples are from that sequence exported back to DV. This was to show the relative differences of processing between software and hardware MPE, Premiere export and AME queueing, and the effect of the Maximum Bit Depth and Maximum Render Quality options on export (not the sequence settings; those have no bearing on export).
    The "blooming" evident in the GPU exports is due to hardware MPE's linear color processing. I think it's ugly, but that's not the point here. Further down the line, you can see the effect of Maximum Bit Depth (and MRQ) on both software MPE and hardware MPE. I assume you can see the difference between the Maximum Bit Depth-enabled export and the one without. Bear in mind that this is 8-bit DV footage composited and "effected" and exported back to 8-bit DV. I don't understand what your "padding with zeroes" and larger file size argument is motivated by--my source files and destination files are the same size due to the DV codec--but it's plainly clear that Maximum Bit Depth has a significant impact on output quality. Similar results would likely be evident if I used any of the other 32-bit enabled effects; many of the color correction filters are 32-bit, and should exhibit less banding, even on something 8-bit like DV.
    Exhibit B: The Adobe Word
    This is extracted from Karl Soule's blog post, Understanding Color Processing: 8-bit, 10-bit, 32-bit, and more. This section comes from Adobe engineer Steve Hoeg:
    1. A DV file with a blur and a color corrector exported to DV without the max bit depth flag. We will import the 8-bit DV file, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write DV at 8-bit.
    2. A DV file with a blur and a color corrector exported to DV with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DV at 8-bit. The color corrector working on the 32-bit blurred frame will be higher quality than the previous example.
    3. A DV file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will be still higher quality because the final output format supports greater precision.
    4. A DPX file with a blur and a color corrector exported to DPX without the max bit depth flag. We will clamp the 10-bit DPX file to 8 bits, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write 10-bit DPX from 8-bit data.
    5. A DPX file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 10-bit DPX file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will retain full precision through the whole pipeline.
    6. A title with a gradient and a blur on an 8-bit monitor. This will display in 8-bit and may show banding.
    7. A title with a gradient and a blur on a 10-bit monitor (with hardware acceleration enabled). This will render the blur in 32-bit, then display at 10-bit. The gradient should be smooth.
    Bullet #2 is pretty much what my tests reveal.
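Steve Hoeg's cases 1 and 2 can be modelled in a few lines. This is a toy simulation, not Premiere's code: the blur and "color corrector" are invented stand-ins, and the only variable is whether the frame between the two effects is quantized to 8 bits or kept in float:

```python
# Rough model of cases 1 and 2: same 8-bit source, same effects, same
# 8-bit output -- only the intermediate precision differs.
def q8(v):                       # quantize a 0..1 value to 8 bits
    return round(v * 255) / 255

def blur(px):                    # toy 3-tap box blur
    return [(px[max(i - 1, 0)] + px[i] + px[min(i + 1, len(px) - 1)]) / 3
            for i in range(len(px))]

def correct(px):                 # toy "color corrector": gamma lift
    return [v ** 0.4545 for v in px]

src = [q8(i / 99) for i in range(100)]            # 8-bit source ramp

case1 = [q8(v) for v in correct([q8(v) for v in blur(src)])]  # 8-bit chain
case2 = [q8(v) for v in correct(blur(src))]                   # float chain

diff = sum(1 for a, b in zip(case1, case2) if a != b)
print(f"{diff} of 100 output pixels differ")
# the float chain is closer to the ideal result, and the gap grows as
# more effects are stacked
```

Even with identical 8-bit input and output, the two chains do not produce the same pixels, which is the whole argument against the "padding with zeroes" view.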
    I think the Premiere Pro Help Docs get this wrong, however:
    High-bit-depth effects
    Premiere Pro includes some video effects and transitions that support high-bit-depth processing. When applied to high-bit-depth assets, such as v210-format video and 16-bit-per-channel (bpc) Photoshop files, these effects can be rendered with 32-bpc pixels. The result is better color resolution and smoother color gradients with these assets than would be possible with the earlier standard 8-bit-per-channel pixels. A 32-bpc badge appears to the right of the effect name in the Effects panel for each high-bit-depth effect.
    I added the emphasis; it should be obvious after my tests and the quote from Steve Hoeg that this is clearly not the case. These 32-bit effects can be added to 8-bit assets, and if the Maximum Bit Depth flag is checked on export, those 32-bit effects are processed as 32-bit, regardless of the destination format of the export. Rendering and export/compression are two different processes altogether, and that's why using the Maximum Bit Depth option has far more impact than "padding with zeroes." You've made this claim repeatedly, and I believe it to be false.
    Your witness...

  • Turning on Render at Maximum Bit Depth and Maximum Render Quality crashes render every time

    I've tried a few times to render an H264 version of my Red media project with Maximum Bit Depth and Maximum Render Quality.  Premiere crashes every time.  I have GPUs enabled. Are people using these settings with Red media and successfully rendering?

    To answer your specific question: did you see the tooltip?
    I believe it allows for 32-bit processing (16-bit if unchecked). Per the project settings help file at http://helpx.adobe.com/premiere-elements/using/project-settings-presets.html
    Maximum Bit Depth
    Allows Premiere Elements to use up to 32‑bit processing, even if the project uses a lower bit depth. Selecting this option increases precision but decreases performance.
    The help file for export is somewhat less informative about what it actually does but does point out that it is the color bit depth - http://helpx.adobe.com/media-encoder/using/encode-export-video-audio.html
    (Optional) Select Use Maximum Render Quality or Render At Maximum Bit Depth. Note:  Rendering at a higher color bit depth requires more RAM and slows rendering substantially.
    In practice the simplest suggestion is to export twice - once with / once without the setting and compare the time taken and perceived quality.
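One way to make that twice-export comparison objective rather than perceptual (assuming you can grab a matching still frame from each export; the pixel values below are invented for illustration) is a quick PSNR check:

```python
# Compare a matching frame from each export with PSNR. As a rough rule
# of thumb, values above ~45 dB are visually indistinguishable.
import math

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio between two equal-length pixel lists."""
    mse = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

with_flag    = [120, 121, 122, 123, 124, 125]   # hypothetical pixel rows
without_flag = [120, 121, 123, 123, 124, 126]   # from the two exports

print(f"{psnr(with_flag, without_flag):.1f} dB")
```

If the number comes back very high, the extra encode time bought you nothing for that footage; if it is low on graded or gradient-heavy material, the flag is earning its keep.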
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children

  • Wraptor DCP and missing Maximum Bit Depth option

    DCPs are made from 12-bit JPEG 2000 frame files wrapped in an MXF container. But there is no Maximum Bit Depth option in the Wraptor DCP codec for AME 8.
    Does this mean:
    1. Wraptor DCP has Maximum Bit Depth checked on by default, so it correctly produces high-bit-depth renders?
    2. OR, Wraptor DCP ignores AME's Maximum Bit Depth, so it always renders in 8 bits and then scales up to 12 bits (which is a waste of information)?
    The following article implies that option 2 is the case, which would be a shame for a workflow with quality demands as high as DCP production.
    The Video Road – Understanding Color Processing: 8-bit, 10-bit, 32-bit, and more

    Wraptor DCP output is not working for me on a feature-length film. Am I missing something?
    Symptom: it simply hangs at various stages of the job. No message, it never crashes... it just STOPS on one frame and never resumes.
    Hardware:
    OSX 10.9.2 on MacPro late 2013
    Going to try it with an older machine.  Any suggestions?

  • When do I need maximum render quality?

    Help me understand this correctly.
    There is "maximum bit depth" and there is "maximum render quality".
    I only use maximum render quality if I downscale a project from HD to SD, to get a better downscale?!
    I use maximum bit depth if I want to render effects in 10-bit quality.
    I usually cut XDCAM files, outputting to disc - with these 8-bit files there is no need to render with maximum bit depth, is there?
    Because when I output to XDCAM disc, all files must be encoded to 8-bit XDCAM MXF again, right?

    with this 8 Bit Files there is no require to render with maximum bit depth isn't it?
    See The Video Road blog post on Understanding Color Processing. At the end of the article, Steve Hoeg presents a detailed explanation of how the 'Maximum Bit Depth' flag works.
    See also this discussion on 'Maximum Render Quality'.
    Additionally, I don't understand right now why Premiere is rendering my 50 Mbit XDCAM files as 25 Mbit MPEG files. Is this only preview quality?
    If you're talking about preview files, then yes, unless you tick the 'Use Previews' checkbox in the Export Settings dialog. By default PrPro utilises MPEG2 for rendering previews. You can change that while you're creating a new sequence: in the New Sequence dialog click the Settings tab, choose 'Custom' from the Editing Mode drop-down list, and then you will be able to set the Preview File Format and Codec in the Video Previews section. Now the question is whether you really want that: rendering to a production codec will take longer, whereas rendering previews happens more often (if ever) than rendering final output...
    Tento wrote:
    No it's not necessary. Unless you want to make a color grading in 10bit, but that would be with a lossless codec like DNxHD.
    No, that's a delusion. See The Video Road blog post on Understanding Color Processing that I mentioned earlier in my comment.

  • Final cut pro millions of colours + bit depth question

    Hello
    I am working in Final Cut Pro 7 and I wanted to know the maximum bit depth I can export using the ProRes codec. All I see in the compression settings for rendering my timeline with ProRes 4444 are the options 'Millions of Colors' and 'Millions of Colors +'. I was under the impression that 'Millions of Colors' refers to 8-bit... does the alpha channel mean I can get 10-bit? Can the alpha channel hold 2 more bits per channel or something? Or is there no way to export a 10-bit file using the ProRes codec within FCP 7? Is it all just 8-bit? And when I select 422 HQ there are no advanced options for millions of colors... what does that mean? Is the only way to get 10-bit out of FCP 7 to render with the 10-bit uncompressed codec? And if so, can I render the timeline in ProRes while I'm working with it, then delete all the renders and change the render codec to 10-bit uncompressed? Will that properly give me 10-bit from the original 4444 12-bit files I imported in the beginning?
    Any help is much appreciated

    ProRes is 10-bit. Every ProRes codec is 10-bit... LT, 422, HQ. Not one of them is 8-bit. Except for ProRes 4444: that's 12-bit.

  • Creative Audigy 2 NX Bit Depth / Sample Rate Prob

    This is my first post to this forum.
    Down to business: I recently purchased a Creative Audigy 2 NX sound card. I am using it on my laptop (an HP Pavilion zd 7000, which has plenty of power to support the card). I installed it according to the instructions in the manual, but I have been having some problems: I can't seem to set the bit depth and sample rate to their proper values.
    The maximum bit depth available from the drop-down menu in the "Device Control" -> "PCI/USB" tab is 16 bits and the maximum sample rate is 48 kHz. I have tried repairing and reinstalling the drivers several times, but it still won't work. The card is connected to my laptop via USB 2.0.
    I looked around in the forums and found that at least one other person has had the same problem, but no solution was posted. If anyone knows of a way to resolve this issue I would appreciate the input!
    Here are my system specs:
    HP Pavilion zd 7000
    Intel Pentium 4 3.06 GHz
    GB Ram
    Windows XP Prof. SP 2
    Thnx.
    -cmsleiman
    Message Edited by cmsleiman on -27-2004 09:38 PM

    Well, I am new to high-end sound cards, and I may be misinterpreting the terminology, but the sound card is supposed to be a 24bit/96kHz card.
    I am under the impression that one should be able to set the output quality of the card to 24 bits of depth and a 96 kHz sample rate, regardless of the speaker setting in use, to decode good-quality audio streams (say, an audio CD or the Dolby Digital audio of a DVD movie). I can currently achieve this only on 2.1 speaker systems (or when I set the speaker setting of the card to 2.1). Otherwise, the maximum I can set the card's output to is a sample rate of 48 kHz and a bit depth of 16 bits.
    Am I mistaken in thinking that, when playing a good-quality audio stream, I should be able to raise the card's output quality to what it is advertised as capable of?
    Thnx

  • Bit Depth & Sample Rate: 24 bit 96kHz? 192kHz?

    I am using the Apogee Duet for Mac and iOS on my Mac and I love it - I'm thinking about getting an iPad for mobile recording (voice overs, mostly) and I wonder if Garage Band can manage 24 bit audio at 96 kHz or 192 kHz? I know that the Auria app can, so if nothing else I can just buy that, but since all I would use the iPad for is Voice Overs to edit later in a computer, a $50 app feels like overkill. Comments? Thoughts? Specs?


  • Maximum audio sample rate and bit depth question

    Has anyone worked out the maximum sample rates and bit depths AppleTV can output?
    I'm digitising some old LPs, and while I suspect I can get away with a 48 kHz sample rate and 16-bit depth, I'm not sure about a 96 kHz sample rate or 24-bit resolution.
    If I import recordings as AIFFs or WAVs into iTunes, it shows the recording parameters, but my old Yamaha processor, which accepts PCM, doesn't show the source data values, though I know it can handle 96 kHz/24-bit from DVD audio.
    It takes no more time to record at any of the available sample rates or bit depths, so I might as well maximise an album's recording quality for archiving to DVD/posterity, as I only want to do each LP once!
    If AppleTV downsamples, however, there wouldn't be much point streaming higher rates.
    I wonder how many people out there stream uncompressed audio to AppleTV? With external drives that will hold several hundred uncompressed CD albums, is there any good reason not to these days, when you are playing back via your hi-fi? (I confess most of my music is in MP3 format just because I haven't got round to ripping it again uncompressed for AppleTV.)
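For the archiving arithmetic, uncompressed PCM data rate is just sample rate x bit depth x channels. A quick sketch (stereo assumed):

```python
# Uncompressed PCM storage cost at the capture settings discussed above.
def mb_per_minute(rate_hz, bits, channels=2):
    """Megabytes per minute of uncompressed PCM audio."""
    return rate_hz * bits * channels * 60 / 8 / 1_000_000

print(f"CD quality (44.1 kHz / 16-bit): {mb_per_minute(44_100, 16):.1f} MB/min")
print(f"Hi-res     (96 kHz   / 24-bit): {mb_per_minute(96_000, 24):.1f} MB/min")
print(f"Hi-res     (192 kHz  / 24-bit): {mb_per_minute(192_000, 24):.1f} MB/min")
```

At roughly 10.6, 34.6 and 69.1 MB per minute respectively, a 45-minute LP side lands anywhere from about half a gigabyte to over 3 GB, which frames the "is there any good reason not to" question in concrete terms.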
    No doubt there'll be a deluge of comments saying that recording LPs at high quality settings is a waste of time, but some of us still prefer the sound of vinyl over CD...
    AC

    I guess the answer to this question relies on someone having an external digital amp/decoder/processor that can display the source sample rate and bit depth during playback, together with some suitable 'demo' files.
    AC

  • Export with Maximum Render quality

    I'm about to output in 4K a final master project file for a short film that was primarily shot in 4K, but also has some clips in 1920 by 1080 and even 1280 by 720.
    I'm not concerned with how long the export takes.  Is there any downside (or benefit) to activating "maximum render quality"?  What about "render at maximum depth"?  24 bit or 48 bit?
    I would think for a master file, you'd max out all of this, no?
    Thank you for any and all input.  I will use the master file to generate Vimeo, Apple TV, etc. dubs.

    Use Maximum Render Quality: mostly affects upscaling of content. If you're not upscaling, you won't notice much of a difference. It looks like you might be upscaling content to 4K, so I'd say yes to turning this on.
    Render at Maximum Depth: What is the bit-depth of your source content? Do you have any content in your sequence that would benefit from the higher bit-depth?
    If you really don't care how long it takes to export, you might as well turn these on. The drawback is the longer encode time.

  • "Maximum Render Quality" Better to turn it OFF when using CUDA MPE?

    http://crookedpathfilms.com/blog/201...port-settings/
    "IMPORTANT NOTE ABOUT RENDERING TIME:  Make sure you do not select “Use  Maximum Render Quality” if you are utilizing the accelerated GPU  graphics (Mercury Playback Engine).  This will not improve your video  and will only slow down the rendering speed by as much as 4 times!"
    http://blogs.adobe.com/premiereprotr...e-pro-cs5.html
    "For export, scaling with CUDA is always at maximum quality, regardless  of quality settings. (This only applies to scaling done on the GPU.)  Maximum Render Quality can still make a difference with CUDA-accelerated  exports for any parts of the render that are processed on the CPU...
    When rendering is done on the CPU with Maximum Render Quality enabled,  processing is done in a linear color space (i.e., gamma = 1.0) at 32  bits per channel (bpc), which results in more realistic results, finer  gradations in color, and better results for midtones. CUDA-accelerated  processing is always performed in a 32-bpc linear color space. To have  results match between CPU rendering and GPU rendering, enable Maximum  Render Quality."
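The linear-color (gamma = 1.0) point in that excerpt can be shown in miniature. The power-law transfer function below is a simplification of the real sRGB curve, used only to illustrate the idea:

```python
# Averaging pixels (blending, scaling and blurring all average pixels)
# on gamma-encoded values gives a different result than averaging in
# linear light and re-encoding -- this is the "blooming" difference
# between CPU and GPU/MRQ rendering described above.
def to_linear(v, gamma=2.2):     # simplified transfer function
    return v ** gamma

def to_gamma(v, gamma=2.2):
    return v ** (1 / gamma)

black, white = 0.0, 1.0

naive  = (black + white) / 2                          # average encoded values
linear = to_gamma((to_linear(black) + to_linear(white)) / 2)

print(round(naive, 3), round(linear, 3))  # 0.5 vs 0.73
```

A 50/50 blend of black and white comes out as mid grey (0.5) in gamma space but noticeably brighter (about 0.73) when done in linear light, which is why linear-processed exports look different, not wrong.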
    Here is what I got out of reading those two sites:
    I should turn it off, since it's always ON (when CUDA MPE is used) regardless of whether I check or uncheck it. Turning it ON only offloads the calculation to the CPU (instead of the GPU), hence slowing down previewing and encoding performance.
    So I guess I should have the Maximum Render Quality setting turned OFF in both sequence settings and export settings.
    However, David Knarr of Studio 1 Productions suggests otherwise:
    http://www.studio1productions.com/Articles/PremiereCS5.htm
    "When you start up Adobe Premiere CS5 and you don't have a certified video card (or one that is unlocked), the Mercury Playback Engine is in software rendering mode and by default the Maximum Render Quality mode (or MRQ) is OFF.
    (Maximum Render Quality mode will maximize the quality of motion in rendered clips and sequences. So when you select this option, the video will often render moving objects more sharply. Maximum Render Quality also maintains sharp detail when scaling from large formats to smaller formats, or from high-definition to standard-definition formats. For the highest quality exports you should always use Maximum Render Quality mode.)
    When you unlock Adobe Premiere CS5 so the Mercury Playback Engine can use almost any newer NVidia card (or if you are using a "certified" NVidia graphics card), the Mercury Playback Engine will be in hardware rendering mode and Maximum Render Quality mode will be turned ON.
    Since the software mode is not set to maximum render quality, it can sometimes render faster than the hardware render, but at a loss in quality. If you set the software to maximum render quality you will see that it is very, very slow compared to the hardware render.
    Here is how to set the Maximum Render Quality:
    1)  Open up Premiere CS5
    2)  Click on Sequence at the top of the screen
    3)  Then select Sequence Settings
    4)  At the bottom of the window select Maximum Render Quality and click OK
    It is always best to be using the Maximum Render Quality mode."
    Now, I'm lost.

    Okay, I am losing it..... You are correct.
    I am not sure what I was remembering. I could have sworn that when I loaded Premiere CS5 for the first time, before I unlocked the video card, the Maximum Render Quality mode was NOT checked, and that when I unlocked the video card it was checked ON without my setting it.
    I just went back, uninstalled Premiere and re-installed it to see what was going on, and I was totally wrong.
    Sorry for the mistake; I will be updating the article on my website in the next 15 min.
    Also, I have written a small program to do the unlock automatically. The program is free and it works with the cards listed under the Automatic Mode.
    If your video card isn't listed, just let me know what your card is and what you typed into the cuda cards file, and I will add it to the program.
    David Knarr
    Studio 1 Productions

  • Maximum Render Quality not installed

    I am using Premiere Pro CS4 and did not see Use Maximum Render Quality when exporting a video; my only choices are "Use Preview Files", "Include Source XMP Metadata", and "File Info".
    I found one post that said it was added in an update, so I checked my version and it said 4.0.0. I then checked for updates from the Help menu and ran that; the version did not change and there was still no Max Render Quality, so I found the download for PP CS4 4.2.1, downloaded it, and unzipped it.
    When I ran the EXE file it started but then stopped and said "Cannot Install, click the Quit button". The text of the window said everything for 4.2.1 was installed except the language packs. The version number is still 4.0.0.
    What else can I try?
    Thanks.

    Thank you Ann
    On the reinstall and updates: does the last update (4.2.1) for Premiere Pro CS4 have all the updates combined, or do I need to run them one at a time?
    Also, does the Encore update (4.0.1) run with, or come included in, the Premiere updates?
    I will start this on Monday. I hope to have this 2,000 ft project done tonight so I can deliver it tomorrow, but I will keep my data and rerun it after the update if Max Render shows up.
    Thanks again

  • Maximum Render Quality no longer shown.......

    I'm using CS4.
    I have had great success with Steve Bellune's suggested settings for exporting media and encoding HDV 1080 60i projects for DVD burning.
    Now, all of a sudden, for some crazy reason, all I have on the small drop-down menu are the following choices:
        Use Preview files
        Include Source XMP Metadata
        File Info.....
    There is NO Maximum Render Quality choice like I have had in the past.
    Might be my imagination, but I seem to see many, many more Jaggies on the DVDs.....
    As far as I know, nothing else has changed.
    Suggestions  ????  Tks,  Tom B.

    Please show people what you see - http://forums.adobe.com/thread/592070?tstart=30

  • Maximum Render Quality CS5.5

    My project is about 1hr 15 mins long, covering 8 sequences.
    3 days ago, I encoded using the MPG2-DVD preset and burnt a trial disk with an Encore project.
    After reviewing the disk, I made a few trivial changes to the PP project and encoded again.
    Exactly the same encode settings except I checked Maximum Render Quality.
    This time, it took FIVE TIMES longer to encode, and I cannot see any difference in disk quality.
    What does checking Max Render Quality do? 
    Would you expect it to take 5x?
    And would you expect a better quality disk?
    Thanks

    Jim,
    Bill and I are testing this extensively for the new PPBM6 test. Our current timeline is an AVCHD 1080i-29.97 source with numerous effects (Fast Color Corrector, Brightness & Contrast, Gamma Correction, Gaussian Blur and Three-Way Color Corrector) plus speed slowed down by 50%, for a total duration of 2;39;04, exported to MPEG2-DVD with the NTSC 23.976 Widescreen High Quality preset. I have just tested the export times again and got the following results:
    System             | Hardware MPE On, MRQ Off | Hardware MPE On, MRQ On | Software MPE, MRQ On
    i7-3930K, GTX 680  | 24 s                     | 24 s                    | 436 s (94 s without MRQ)
    i7-2600K, GTX 680  | NA                       | 33 s                    | 870 s
    i7-980X, GTX 680   | NA                       | 30 s                    | 556 s
    Now, this test is taxing on the GPU, because there is frame blending, scaling and blurring going on during export, but I cannot see any difference with or without MRQ when hardware MPE is turned on. I ran several passes and they are consistently between 23 and 24 seconds on this test. Several things are of interest here: the advantage the 6-cores have over a 4-core when using software mode only, and the difference the MRQ setting makes in software-mode MPE.
    PS. You may be testing this with AME instead of Direct Export. AME is seriously handicapped in CS6 and that may be the cause of your strange results. How does your same test look when you use Direct Export? Exporting a 1-minute clip without effects in 1 minute contrasts sharply with my export of 24 seconds for a clip 2.5 times longer and filled with effects. You know that both Bill and I have rather tuned systems, and when our software MPE exports take this long (436-870 seconds) there is something going on that I would like to know more about.
