Maximum bit depth / maximum render quality when dynamic linking

Hi
I'm a bit confused by the use of Maximum Bit Depth and Maximum Render Quality, which appear both in Sequence Settings and as options when rendering in AME.
1. Do you need to explicitly enable these switches in the sequence for best quality, or do you simply need to switch them on in AME when you render in Media Encoder?
2. When dynamic linking to After Effects, when should you use an 8-bit vs. a 16- or 32-bit working space, and how does this bit depth interact with Maximum Bit Depth and Maximum Render Quality in PPro?

Hi jbach2,
I understand your confusion. I'm like that most of the time I'm working. *chuckle* The two settings you mentioned are two completely different parameters affecting (or is it effecting?) your video. You do not need to enable them within the sequence itself unless you want to preview video on your program monitor at the highest quality. I personally don't recommend it, as it's a tremendous resource hog (the program even warns you when you try to click them) and unnecessary for improving final output.
Again, do not enable these options in your sequence settings if you only want a high quality export. Doing so will greatly reduce your editing performance unless you have a high-end system... and even then I don't think it's worth it unless you're editing on a huge screen with a director who wants to see everything at maximum quality during the edit process.
Keeping it simple...
Resizing your final output video? Use Maximum Render Quality.
Starting with or working from high bit depth sources? Use Maximum Bit Depth.
When/where do I enable these? In the AME only. ^_^
Why?
Maximum Bit Depth and Maximum Render Quality only need to be enabled when you are exporting, and they serve different functions.
Max Render Quality aids in the scaling/conversion process only. My understanding is that you never need to enable Maximum Render Quality (MRQ) unless you are exporting to a frame size or pixel aspect ratio different from your original video. For example, when rendering a 1080p timeline out to a 480p file, you'll want to use MRQ to ensure the best scaling with the least amount of artifacts and aliasing. If you're exporting at the same size you're working with, DON'T enable MRQ. It will just cost you time and CPU. Its only function is to do a high quality resize of your work.
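(If a concrete picture helps, here's a rough sketch of what "better scaling" means, written in plain Python with the Pillow library and a made-up file name. It's only an analogy for the idea of a higher quality resample, not what MRQ literally does inside Premiere.)

    # Illustration only: downscale the same frame with a cheap filter and a high quality one.
    # Assumes Pillow is installed and "frame_1080p.png" is a hypothetical exported still.
    from PIL import Image

    frame = Image.open("frame_1080p.png")               # e.g. a 1920x1080 source frame
    cheap = frame.resize((854, 480), Image.NEAREST)      # fast, prone to aliasing and jagged edges
    better = frame.resize((854, 480), Image.LANCZOS)     # slower windowed-sinc resample, cleaner detail

    cheap.save("downscale_nearest.png")
    better.save("downscale_lanczos.png")

Compare the two output files on fine detail (text, diagonal lines) and the idea behind MRQ's high quality scaling becomes obvious.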
Maximum bit depth increases the color depth that your video is working with and rendering to.  If you're working with video that has low color depth, then I don't believe it will matter.  However, if you're working with 32 bit color on your timeline in PPro and/or After Effects, using lots of graphics, high contrast values, or color gradients, you may want to enable this option. It ultimately depends on the color depth of your source material.
The same applies to After Effects.
Create something in AE like a nice color gradient. Now switch the same project between 8, 16, and 32 bit depth, and you will see a noticeable difference in how the bit depth affects your colors and the smoothness of the gradient.
Bit depth affects how different plugins/effects change your overall image. Higher depth means more colors to work with (and, incidentally, more CPU you need).
Just remember that "DEPTH" determines how many colors you can "fill your bucket with" and "QUALITY" is just that, the quality of your "resize".
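(If you want to see the banding effect outside of AE, here's a tiny numpy sketch - purely illustrative, not Adobe's actual math - that pushes a subtle gradient through two "effects" with 8-bit intermediates versus float intermediates.)

    # Illustration only: darken a subtle gray ramp, then brighten it back,
    # once rounding to 8 bits after every step and once keeping float precision.
    import numpy as np

    ramp = np.linspace(0.20, 0.30, 1920)            # subtle gray gradient, values 0..1

    def to_8bit(x):
        return np.round(np.clip(x, 0.0, 1.0) * 255) / 255

    # 8-bit intermediates: each "effect" rounds its result to 8 bits
    low_depth = to_8bit(to_8bit(to_8bit(ramp) * 0.25) * 4.0)

    # float intermediates: only the final output is rounded to 8 bits
    high_depth = to_8bit((ramp * 0.25) * 4.0)

    print("distinct levels, 8-bit pipeline:", np.unique(low_depth).size)    # a handful -> visible bands
    print("distinct levels, float pipeline:", np.unique(high_depth).size)   # many more -> smooth ramp

Fewer distinct levels across the same ramp is exactly what you see as banding.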
http://blogs.adobe.com/VideoRoad/2010/06/understanding_color_processing.html
Check out this Adobe blog for more info on color depth ^_^  Hope that helps!
--- a little excerpt from the blog I linked to above ---
Now, 8-bit, 10-bit, and 12-bit color are the industry standards for recording color in a device. The vast majority of cameras use 8-bits for color. If your camera doesn’t mention the color bit depth, it’s using 8-bits per channel. Higher-end cameras use 10-bit, and they make a big deal about using “10-bit precision” in their literature. Only a select few cameras use 12-bits, like the digital cinema camera, the RED ONE.
Software like After Effects and Premiere Pro processes color images using color precision of 8-bits, 16-bits, and a special color bit depth called 32-bit floating point. You've probably seen these color modes in After Effects, and you've seen the new "32" icons on some of the effects in Premiere Pro CS5.

Similar Messages

  • Bit Depth and Render Quality

    When you finally export media to some sort of media format via the encoder, do the project's preview Bit Depth and Render Quality settings affect the output file?
    I know there is a "Use Preview Files" setting in the media exporter dialog, but I just want to be sure of what I am doing.

    Jeff's response is my perspective, as well, which is both backed up by my own tests and the official Adobe word.
    Exhibit A: My Tests
    That is DV footage with a title superimposed over it in a DV sequence, with a Gaussian blur effect (the Premiere accelerated one) applied to the title; all samples are from that sequence exported back to DV. This was to show the relative differences of processing between software and hardware MPE, Premiere export and AME queueing, and the effect of the Maximum Bit Depth and Maximum Render Quality options on export (not the sequence settings; those have no bearing on export).
    The "blooming" evident in the GPU exports is due to hardware MPE's linear color processing. I think it's ugly, but that's not the point here. Further down the line, you can see the effect of Maximum Bit Depth (and MRQ) on both software MPE and hardware MPE. I assume you can see the difference between the Maximum Bit Depth-enabled export and the one without. Bear in mind that this is 8-bit DV footage composited and "effected" and exported back to 8-bit DV. I don't understand what your "padding with zeroes" and larger file size argument is motivated by--my source files and destination files are the same size due to the DV codec--but it's plainly clear that Maximum Bit Depth has a significant impact on output quality. Similar results would likely be evident if I used any of the other 32-bit enabled effects; many of the color correction filters are 32-bit, and should exhibit less banding, even on something 8-bit like DV.
    Exhibit B: The Adobe Word
    This is extracted from Karl Soule's blog post, Understanding Color Processing: 8-bit, 10-bit, 32-bit, and more. This section comes from Adobe engineer Steve Hoeg:
    1. A DV file with a blur and a color corrector exported to DV without the max bit depth flag. We will import the 8-bit DV file, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write DV at 8-bit.
    2. A DV file with a blur and a color corrector exported to DV with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DV at 8-bit. The color corrector working on the 32-bit blurred frame will be higher quality than the previous example.
    3. A DV file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will be still higher quality because the final output format supports greater precision.
    4. A DPX file with a blur and a color corrector exported to DPX without the max bit depth flag. We will clamp the 10-bit DPX file to 8 bits, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write 10-bit DPX from 8-bit data.
    5. A DPX file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 10-bit DPX file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will retain full precision through the whole pipeline.
    6. A title with a gradient and a blur on an 8-bit monitor. This will display in 8-bit and may show banding.
    7. A title with a gradient and a blur on a 10-bit monitor (with hardware acceleration enabled). This will render the blur in 32-bit, then display at 10-bit. The gradient should be smooth.
    Bullet #2 is pretty much what my tests reveal.
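    (For anyone who wants to see cases 1 and 2 in miniature, here's a toy Python/numpy rendition - hypothetical values and stand-in effects, nothing to do with Adobe's actual code - showing how rounding to 8 bits between effects changes the final 8-bit output.)

        # Toy version of Hoeg's cases 1 and 2: blur then color-correct a frame,
        # with 8-bit intermediates (case 1) versus float intermediates (case 2).
        import numpy as np

        rng = np.random.default_rng(0)
        frame8 = rng.integers(0, 256, (480, 720)).astype(np.float32)   # stand-in 8-bit "DV" frame

        def blur(img):
            # crude cross-shaped blur (average of a pixel and its 4 neighbours),
            # enough to create fractional pixel values
            return (img + np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                    np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0

        def color_correct(img, gain=0.6, lift=20.0):
            return img * gain + lift

        # case 1: round back to 8 bits after each step (no Max Bit Depth)
        case1 = np.round(color_correct(np.round(blur(frame8))))

        # case 2: keep float precision through the chain, round once at the end
        case2 = np.round(color_correct(blur(frame8)))

        print("pixels that come out different:", int((case1 != case2).sum()))

    Even though both pipelines are rounded back to 8 bits at the very end, a noticeable share of pixels land on different values, which is the whole point of keeping the intermediates at 32-bit.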
    I think the Premiere Pro Help Docs get this wrong, however:
    High-bit-depth effects
    Premiere Pro includes some video effects and transitions that support high-bit-depth processing. When applied to high-bit-depth assets, such as v210-format video and 16-bit-per-channel (bpc) Photoshop files, these effects can be rendered with 32-bpc pixels. The result is better color resolution and smoother color gradients with these assets than would be possible with the earlier standard 8-bit-per-channel pixels. A 32-bpc badge appears to the right of the effect name in the Effects panel for each high-bit-depth effect.
    I added the emphasis; it should be obvious after my tests and the quote from Steve Hoeg that this is clearly not the case. These 32-bit effects can be added to 8-bit assets, and if the Maximum Bit Depth flag is checked on export, those 32-bit effects are processed as 32-bit, regardless of the destination format of the export. Rendering and export/compression are two different processes altogether, and that's why using the Maximum Bit Depth option has far more impact than "padding with zeroes." You've made this claim repeatedly, and I believe it to be false.
    Your witness...

  • Maximum Bit Depth / Maximum Render Quality Questions

    Maximum Bit Depth
    I was told to select Maximum Bit Depth if my project contains high-bit-depth assets generated by high-definition camcorders, because Adobe Premiere Pro uses all the color information in these assets when processing effects or generating preview files. I'm capturing HDV using the Matrox RTX-2 hardware in Matrox AVI format.
    When I finally export my project using Adobe Media Encoder CS4, will selecting Maximum Bit Depth provide better color resolution once I post to Blu-ray format?
    Maximum Render Quality
    I was told that by using Maximum Render Quality, I maintain sharp detail when scaling from large formats to smaller formats, or from high-definition to standard-definition formats, and that it also maximizes the quality of motion in rendered clips and sequences, rendering moving assets more sharply. It's my understanding that at maximum quality, rendering takes more time and uses more RAM than at the default normal quality. I'm running Vista 64-bit with 8 GB of RAM, so I'm hoping to take advantage of this feature.
    Will this also help improve resolution when I finally export my project using Adobe Media Encoder CS4 and post to Blu-ray format?
    Does it look like I have the specs to handle Maximum Bit Depth and Maximum Render Quality when creating a new HDV project with the support of the Matrox RTX 2 Hardware capturing in Matrox AVI format? See Below Specs.
    System Specs
    Case: Coolmaster-830
    Op System: Vista Ultimate 64 Bit
    Edit Suite: Adobe Creative Suite 4 Production Premium Line Upgrade
    Adobe Premiere Pro CS 4.0.1 update before installing RT.X2 Card and 4.0 tools
    Performed updates on all Adobe Production Premium Products as of 03/01/2009
    Matrox RTX2 4.0 Tools
    Main Display: Dell 3007 30"
    DVI Monitor: Dell 2408WFP 24"
    MB: ASUS P5E3 Deluxe/WiFi-AP LGA 775 Intel X38
    Display Card: SAPPHIRE Radeon HD 4870 512MB GDDR5 Toxic ver.
    PS: Corsair|CMPSU-1000HX 1000W
    CPU: INTEL Quad Core Q9650 3G
    MEM: 2Gx4|Corsair TW3X4G1333C9DHXR DDR3 (8 Gigs Total)
    1 Sys Drive: Seagate Barracuda 7200.11 500GB 7200 RPM 32MB Cache SATA 3.0Gb/s
    2 Raid 0: Seagate Barracuda 7200.11 500GB 7200 RPM 32MB Cache SATA 3.0Gb/s using Intel's integrated RAID controller on MB

    Just some details that I find useful on Maximum Bit Depth:
    You really need it even with 8-bit source files when using heavy grading/multiple curves/vignettes. If after grading you see banding, go to Sequence > Sequence Settings from the top menu and check "Maximum Bit Depth" (ignore the performance popup), then check your preview again (it will change in a second) to see whether banding is still present in 32-bit mode. If there is no banding, check it when exporting; if banding is still there, change your grading, then uncheck it to continue with editing.
    Unfortunately, Maximum Bit Depth exporting is extremely time-consuming, but it can really SAVE YOUR DAY when facing artifacts after heavy grading, by completely or almost completely eliminating banding and other unwanted color distortions.
    Use it only for small previews or the real final output.
    Best Regards.

  • Turning on Render at Maximum Bit Depth and Maximum Render Quality crashes render every time

    I've tried a few times to render an H264 version of my Red media project with Maximum Bit Depth and Maximum Render Quality.  Premiere crashes every time.  I have GPUs enabled. Are people using these settings with Red media and successfully rendering?

    To answer your specific question: did you see the tooltip?
    I believe it allows for 32-bit processing (16-bit if unchecked), per the project settings help file at http://helpx.adobe.com/premiere-elements/using/project-settings-presets.html
    Maximum Bit Depth
    Allows Premiere Elements to use up to 32‑bit processing, even if the project uses a lower bit depth. Selecting this option increases precision but decreases performance.
    The help file for export is somewhat less informative about what it actually does but does point out that it is the color bit depth - http://helpx.adobe.com/media-encoder/using/encode-export-video-audio.html
    (Optional) Select Use Maximum Render Quality or Render At Maximum Bit Depth. Note:  Rendering at a higher color bit depth requires more RAM and slows rendering substantially.
    In practice the simplest suggestion is to export twice - once with / once without the setting and compare the time taken and perceived quality.
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children

  • Wraptor DCP and missing Maximum Bit Depth option

    DCPs are made from 12-bit JPEG 2000 frame files wrapped in an MXF container, but there is no Maximum Bit Depth option in the Wraptor DCP codec for AME 8.
    Does it mean:
    1. Wraptor DCP has Maximum Bit Depth checked on by default, so it correctly produces high depth color renders?
    2. OR, Wraptor DCP ignores AME's Maximum Bit Depth, so it always renders in 8 bits and then scales up to 12 bits (which is a waste of information)?
    The following article implies that option 2 is the case, which would be a shame for such a quality-demanding workflow as DCP production.
    The Video Road – Understanding Color Processing: 8-bit, 10-bit, 32-bit, and more

    Wraptor DCP output is not working for me on a feature-length film. Am I missing something?
    Symptom: it simply hangs at various stages of the job. No message, it never crashes... it just STOPS on one frame and never resumes.
    Hardware:
    OSX 10.9.2 on MacPro late 2013
    Going to try it with an older machine.  Any suggestions?

  • Sb not displaying waveforms when dynamic linked to PR...

    I posted this in Soundbooth, but it seems appropriate here as well. Hopefully there's an easy answer:
    I have the following programs open - Premiere Pro CS4, Adobe Bridge CS4 & Soundbooth CS4.
    I create a new multitrack file in Sb. Click File\Adobe Dynamic Link\Import Premiere Pro Sequence and select the Project and Sequence. Click OK and the Video Track appears with the project name & (reference). There are thumbnails of the video images from the Pr timeline and a wide green band but no audio waveforms. No other audio tracks contain anything at all.
    The video track will play with audio and the meters indicate stereo audio but there is no audio visible anywhere.
    Workspace is "Edit Audio to Video," and Properties show the proper sample rate. All sample rates are the same throughout the timeline.
    One anomaly is that the Properties "Last Modified" entry is wrong as it shows today's date but about 4 hours later than the actual system clock.
    I guess I was under the impression that this workflow would enable me to open my 9 tracks of audio to do sweetening in Sb and that that work would be reflected in my timeline in Pr through the Dynamic Linking.
    How is this supposed to work? The documentation is less than useful in both Sb & Pr. There are implications, but no real information on what's what.
    Anyone doing this, and, if so, how???
    TIA,

    I'm really confused by the whole multitrack thing in Soundbooth. It seems all it is good for is combining multiple files that you can then change again later. I don't see how this is useful at all when editing with an NLE. Once the tracks are in there they are stuck and can't be moved if you tweak any tracks.
    Like you, I don't understand the point of the new Soundbooth multitrack file at all. It seems like they stuck it in so that there would be a new feature for CS4.
    If you could open an OMF exported by Premiere and work with it dynamically linked, that would be great. Maybe someone can surprise us with an actual real-world use.
    Since the update I use Audition anyway. Awesome music app.

  • How can I retain a Speedgrade Lumetri effect on a Premiere clip when dynamic linking that clip to After Effects

    I'm trying to streamline the workflow between me (editor) and the animator at my office.
    Right now we are exporting clips as ProRes files to be tracked and rotoscoped by the animator in AE and then dropping them back in Premiere. We work off a server so it would be much faster to simply dynamic link my clip that needs screen replacement animation into an After Effects comp. Then the animator could open that After Effects file on the server and do their work then just save it so it will automatically repopulate in my timeline with the tracked shot.
    Here is the problem: I want to color correct in Speedgrade before animation so I'm not having to track around screens. So I color correct, and the clips come back to Premiere with the Speedgrade Lumetri effects applied. Then I dynamic link to AE. But when the tracked shot returns from AE, the Lumetri color effect is completely lost, which forces me to re-color and track around the screen replacement animation.
    Does anyone have any tips for this? The help is much appreciated.

    I just responded to this on another thread, but let me repeat it here for convenience:
    Like you, I too (and I presume most everyone) first send to SG for color correction and then (actually after another SG trip for grading) replace with AE comp for effects work. And when I try that with my CC 2014, I have the same issue (all SG effects disappear after replace with AE comp). But I believe I've found a simple workaround (at least working great so far):
    Instead of replace with AE comp, open AE and drag the PP comp to the AE projects - it will appear like a clip.
    Create an AE comp with that and do your thing.
    Save as AE project and import it into PP - it will appear like a clip that you can use in PP (as if you'd done a replace with AE comp).
    Everything will be dynamic: if you change the SG corrections, the changes will get reflected everywhere (in AE as well as the AE project imported in PP), and what you do in SG will apply BEFORE the AE stuff, which is exactly what we want. Looks like Adobe put all the capability in, but the replace with AE comp is not making use of it yet - I will report this.
    One other related issue is that for some weird reason, you can't drag the imported AE project in PP onto a PP sequence, BUT you can select it and ask to create a new PP sequence! Go figure.

  • Sb shows no waveforms when Dynamic Linked to Pr....???

    I have the following programs open - Premiere Pro CS4, Adobe Bridge CS4 & Soundbooth CS4.
    I create a new multitrack file in Sb. Click File\Adobe Dynamic Link\Import Premiere Pro Sequence and select the Project and Sequence. Click OK and the Video Track appears with the project name (reference), thumbnails of the video images and a wide green band with no audio waveforms.
    It will play and the meters indicate stereo audio but there is no audio visible anywhere.
    Workspace is "Edit Audio to Video," and Properties show the proper sample rate. All sample rates are the same throughout the timeline.
    One anomaly is that the Properties "Last Modified" entry is wrong as it shows today's date but about 4 hours later than the actual system clock.
    I guess I was under the impression that this workflow would enable me to open my 9 tracks of audio to do sweetening in Sb and that that work would be reflected in my timeline in Pr through the Dynamic Linking.
    How is this supposed to work? The documentation is less than useful in both Sb & Pr. There are implications, but no real information on what's what.
    Anyone doing this, and, if so, how?
    TIA,

    Research has revealed that this does not work!! Read this thread:
    http://www.adobeforums.com/webx/.59b76657/2
    Dynamic Link is a marketing Pipe Dream, not a Pipeline!!

  • Each clip is precomposed when dynamic linking a timeline from Premiere

    When I'm sending a timeline from Premiere Pro CC to After Effects CC, all of my video clips are being precomposed individually in the After Effects composition.
    The last project I worked on had the same workflow, except when the clips came into After Effects they were not precomposed.
    From the PP timeline, I'm selecting all the timeline media and choosing 'replace with AFX comp'.
    Is there a way to choose whether or not everything gets precomposed from a Premiere timeline? I don't want each piece of media to be precomposed!

    Thanks Todd,
    I can see that's the case, as I went from a 1080 timeline down to 720p and used 'Scale to Comp'.
    It causes havoc in AFX if you have used different sections of the same clip in your edit.
    So I guess the workaround would be to manually scale everything in the clip settings.

  • How can I get Premiere cs 5.5 to open AE 6 when dynamic linking instead of 5.5

    Just wondering, or do i have to buy Premiere 6 for it to open AE6?

    do i have to buy Premiere 6 for it to open AE6?
    Yes, you do, or more to the point you will need to buy a whole suite. DL only works within the same version and only inside Production Premium and Master Collection.
    Mylenium

  • When do I need maximum render quality?

    Help me to understand it right.
    There is "maximum bit depth" and "maximum render quality".
    I only use maximum render quality if I downscale a project from hd to sd to get a better downscale?!
    I use maximum bit depth if I want to render effects in 10 Bit quality.
    I usually cut with XDCAM Files outputting it to disc - with this 8 Bit Files there is no require to render with maximum bit depth isn't it?
    Because when I output to XDCAM-Disc all files must be coded to 8 bit xdcam-mxf again right?

    with these 8-bit files there is no need to render with Maximum Bit Depth, is there?
    See this The Video Road blogpost on Understanding Color Processing. At the end of the article Steve Hoeg presents detailed explanation how the 'Maximum Bit Depth' flag works.
    See also this discussion on 'Maximum Render Quality'.
    Additionally, I don't understand right now why Premiere is rendering my 50 Mbit XDCAM files as 25 Mbit MPEG files. Is this only preview quality?
    If you're talking about preview files, then yes, unless you tick the 'Use Previews' checkbox in the Export Settings dialog. By default PrPro utilises MPEG2 for rendering previews. You can change that while you're creating a new sequence: in the New Sequence dialog click the Settings tab, choose 'Custom' from the Editing Mode drop-down list, and then you will be able to set the Preview File Format and Codec in the Video Previews section. Now the question is whether you really want that: rendering to a production codec will take longer, whereas rendering previews happens more often (if ever) than rendering final output...
    Tento wrote:
    No, it's not necessary. Unless you want to do color grading in 10-bit, but that would be with a lossless codec like DNxHD.
    No, that's a delusion. See The Video Road blog post on Understanding Color Processing that I mentioned earlier in my comment.

  • Maximum Render Quality not installed

    I am using Premiere Pro CS4 and did not see the Use Maximum Render Quality option when exporting a video; my only choices are "Use Preview Files", "Include Source XMP Metadata", and "File Info".
    I found one post that said it was added in an update, so I checked my version and it said 4.0.0. I then checked for updates from the Help menu and ran that; the version did not change and there was still no Max Render Quality, so I found the download for PP CS4 4.2.1, downloaded it and unzipped it.
    When I ran the EXE file it started but then stopped and said "Cannot Install, click the Quit button". The text of the window said everything for 4.2.1 was installed except the language packs. The version number is still 4.0.0.
    What else can I try?
    Thanks.

    Thank you Ann
    On the reinstall and updates, does the last update (4.2.1) for Premiere Pro CS4 have all the updates combined, or do I need to run them one at a time?
    Also, does the Encore update (4.0.1) run separately, or is it included with the Premiere updates?
    I will do this starting Monday. I hope to have this 2,000 ft project done tonight so I can deliver it tomorrow, but I will keep my data and rerun it after the update if Max Render shows up.
    Thanks again

  • Bit Depth and Bit Rate

    I have a pre-recorded mp3 VO. I placed it into a track bed in GB. The client wants a compressed audio file with a bit depth of 16 bits and a bit rate of 128 kbps max, but recommends 96 kbps. If I need to adjust the bit depth and bit rate, can I do it in GB? And if so, where? Thanks for any help.

    Please be aware that Bit Depth and Bit Rate are two completely different things!
    They belong to a group of buzzwords from the field of Digital Audio, and that is the field we are dealing with when using GarageBand or any other DAW. Some of those terms pop up even in iTunes.
    Digital Audio
    To better understand what they are and what they mean, here is a little background information.
    Whenever dealing with Digital Audio, you have to be aware of two steps, that convert an analog audio signal into a digital audio signal. These magic black boxes are called ADC (Analog Digital Converter) and “on the way back”, DAC (Digital Analog Converter).
    Step One: Sampling
    The analog audio (in the form of an electric signal, like from an electric guitar) is represented by a waveform. The electric signal (voltage) changes up and down in a specific form that represents the "sound" of the audio signal. While the audio signal is "playing", the converter measures the voltage every now and then. These are like "snapshots" or samples, taken at a specific time. These specific time intervals are determined by a "Rate", which tells you how often per second something happens. The unit is Hertz [Hz], defined as "how often per second" or "1/s". A Sample Rate of 48kHz means that the converter takes 48,000 Samples per second.
    Step Two: Quantize (or digitize)
    All these Samples are still analog values, for example 1.6 Volt, -0.3 Volt, etc. But this analog value now has to be converted into a digital form of 1s and 0s. This is done similarly to quantizing a note in GarageBand. The value (i.e. the note) cannot sit at just any position, it has to be placed on a grid with specific values (i.e. 1/16 notes). The converter does a similar thing. It provides a grid of available numbers that the original measured Sample has to be rounded to (like when a note gets shifted in GarageBand by the quantize command). This grid, the amount of available numbers, is called the Bit Depth. Other terms like Resolution or Sample Size are also used. A Bit Depth of 16 bit allows for 65,536 possible values.
    So the two parameters that describe the quality of a Digital Audio Signal are the Sample Rate ("how often") and the Bit Depth ("how fine of a resolution"). The very simplified rule of thumb is: the higher the Sample Rate, the higher the possible frequency, and the higher the Bit Depth, the wider the possible dynamic range.
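    (A tiny Python sketch of those two steps with example numbers - one second of a 1 kHz test tone sampled at 48kHz and quantized to 16 bits - just to make "how often" and "how fine" concrete:)

        # Sampling ("how often") and quantizing ("how fine"), in miniature.
        import numpy as np

        sample_rate = 48_000                      # 48,000 samples per second
        bit_depth = 16                            # 2**16 = 65,536 possible values

        t = np.arange(sample_rate) / sample_rate           # one second worth of sample times
        analog = 0.8 * np.sin(2 * np.pi * 1000.0 * t)       # the "analog" 1 kHz signal, range -1..1

        levels = 2 ** bit_depth
        samples = np.round(analog * (levels // 2 - 1)).astype(np.int16)   # 16-bit integer samples

        print("samples per second:", samples.size)          # 48000
        print("available values per sample:", levels)        # 65536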
    Uncompressed Digital Audio vs. Compressed Digital Audio
    So far I haven’t mentioned the “Bit Rate” yet. There is a simple formula that describes the Bit Rate as the product of Sample Rate and Bit Depth: Sample Rate * Bit Depth = Bit Rate. However, the Bit Rate and how it is used (and often misused and misunderstood) has to do with Compressed Digital Audio.
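    (Worked out with CD-style numbers - and note that for stereo you also multiply by the number of channels, which the one-line formula above leaves out:)

        # The formula above, worked out for CD-quality audio.
        sample_rate = 44_100          # samples per second
        bit_depth = 16                # bits per sample
        channels = 2                  # stereo

        per_channel = sample_rate * bit_depth          # 705,600 bit/s
        total = per_channel * channels                 # 1,411,200 bit/s
        print(per_channel / 1000, "kbit/s per channel,", total / 1000, "kbit/s total")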
    Compressed Digital Audio
    First of all, this has nothing to do with a compressor plugin that you use in GarageBand. When talking about compressed digital audio, we talk about data compression. This is a special form of encoding data to make the size of the data set smaller. This is the fascinating field of “perceptual coding”, which uses psychoacoustic models to achieve that data compression. Some smart scientists found out that you can throw away some data in a digital audio signal and you wouldn’t even notice it; the audio would still sound the same (or almost the same). This is similar to a movie set: if you shoot a scene on a street, you only need the facades of the buildings and not necessarily the whole buildings.
    Although the Sample Rate is also a parameter of compressed digital audio, the Bit Depth is not. Instead, the Bit Rate is used here. The Bit Rate tells the encoder the maximum amount of bits it can produce per second. This determines how much data it has to throw away in order to stay inside that limit. An mp3 file (which is a compressed audio format) with a Bit Rate of 128 kbit/s delivers decent audio quality. Raising the Bit Rate to 256 kbit/s would increase the sound quality. AAC (which is technically an mp4 format) uses a better encoding algorithm. If this encoder is set to 128 kbit/s, it produces better audio quality because it is smarter about which bits to throw away and which ones to keep.
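    (And because the Bit Rate is the only size knob for compressed audio, file size follows directly from it. A quick back-of-the-envelope example, assuming a hypothetical 4-minute voice-over at the client's 128 kbps ceiling:)

        # File size of compressed audio depends only on bit rate and duration.
        bit_rate_kbps = 128            # the client's maximum
        duration_s = 4 * 60            # a hypothetical 4-minute voice-over

        size_bytes = bit_rate_kbps * 1000 / 8 * duration_s
        print(round(size_bytes / 1_000_000, 2), "MB")   # 3.84 MB, regardless of sample rate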
    Conclusion
    Whenever you are dealing with uncompressed audio (aiff, wav), the two quality parameters are Sample Rate [kHz] and Bit Depth [bit] (aka Resolution, aka Bit Size)
    Whenever you are dealing with compressed audio (mp3, AAC), the two quality parameters are Sample Rate [kHz] and Bit Rate [kbit/s]
    If you look at the Export Dialog Window in GarageBand, you can see that the Quality popup menu is different for mp3/AAC and AIFF. Hopefully you will now understand why.
    Hope that helps
    Edgar Rothermich
    http://DingDingMusic.com/Manuals/

  • Creative Audigy 2 NX Bit Depth / Sample Rate Prob

    This is my first post to this forum.
    Down to business: I recently purchased a Creative Audigy 2 NX sound card. I am using it on my laptop (an HP Pavilion zd7000, which has plenty of power to support the card). I installed it according to the instructions in the manual, but I have been having some problems with it. I can't seem to set the bit depth and sample rate settings to their proper values.
    The maximum bit depth available from the drop-down menu in the "Device Control" -> "PCI/USB" tab is 16 bits and the maximum sample rate is 48kHz. I have tried repairing and reinstalling the drivers several times, but it still won't work. The card is connected to my laptop via USB 2.0.
    I looked around in the forums and found out that at least one other person has had the same problem, but no solution was posted. If anyone knows of a way to resolve this issue, I would appreciate the input!
    Here are my system specs:
    HP Pavilion zd 7000
    Intel Pentium 4 3.06 GHz
    GB Ram
    Windows XP Prof. SP 2
    Thnx.
    -cmsleiman

    Well, I am new to high-end sound cards, and I may be misinterpreting the terminology, but the sound card is supposed to be a 24-bit/96kHz card.
    I am under the impression that one should be able to set the output quality of the card to 24 bits of depth and a 96kHz sample rate, regardless of the speaker setting that one may be using, to decode good quality audio streams (say an audio CD or the Dolby Digital audio of a DVD movie). I can currently achieve this only on 2. speaker systems (or when I set the speaker setting of the card to 2.) Otherwise the maximum bit depth/sample rate I can set the card output to is a sample rate of 48kHz and a bit depth of 16 bits.
    Am I mistaken in thinking that if I am playing a good quality audio stream I should be able to raise the output quality of the card to what it is advertised and claims to have?
    Thnx

  • Final cut pro millions of colours + bit depth question

    Hello
    I am working in Final Cut Pro 7 and I wanted to know what the maximum bit depth is that I can export using the ProRes codec. All I see in the compression settings for rendering my timeline, when wanting to render with ProRes 4444, is the option for 'millions of colors' and 'millions of colors +'. I was under the impression that millions of colors referred to 8-bit... does the alpha channel mean I can get 10-bit? Can the alpha channel hold 2 more bits per channel or something? Or is there no way I can export a 10-bit file using the ProRes codec within FCP 7 - is it all just 8-bit? And when I select 422 HQ there are no advanced options for millions of colors... what does this mean? Is the only way to get 10-bit out of FCP 7 to render with the 10-bit uncompressed codec? And if so, can I render the timeline in ProRes while I'm working with it, then delete all the renders and change the render codec to 10-bit uncompressed - will this now properly give me 10-bit from the original 4444 12-bit files I imported in the beginning?
    Any help is much appreciated

    ProRes is 10-bit. Every ProRes codec is 10-bit... LT, 422, HQ. Not one of them is 8-bit. Except for ProRes 4444... that's 12-bit.
