30-bit color

I'm a photographer and I need a wide color gamut for my work. I use the Adobe RGB color space on my EIZO screen (CX241). I would like to change my workstation. Currently I use Windows with a Quadro graphics card, but I can't find information about the color management of the OS or of the Mac Pro. If I want to use the Adobe RGB color space, I need 30-bit color management over DisplayPort (with the OS and the graphics card supporting 10 bits per color). Do the OS and the Mac Pro support this?
Can you give me more information please?
Thanks

No 10-bit.
Can I use "deep color" (10 bits per color) on the current Mac mini?
10-bit monitor output from a Mac Pro (Late 2013)

Similar Messages

  • How do I convert a TIFF image to a JPEG without being FORCED into 8-bit color?

    I am an artist. I have high-quality TIFF images. When I convert the TIFFs to JPEG it forces me into 8-bit color automatically. (Forget about 32-bit, it will not let me save a JPEG at all.) The only way I can get back to 16-bit color is to take the already smashed file and convert it back up to 16-bit. THIS makes NO sense. Once the JPEG is compressed, how in the world is it supposed to convert back up again? Then, even though it says the file was converted to 16-bit, the metadata still refers to the file as 8-bit.
    On top of all that confusion, one picture, for example, when supposedly converted to 16-bit, gets much BRIGHTER than even the original TIFF image. It looks good on one hand and overexposed on the other. I assume that is Photoshop throwing in fake resolution, am I right?
    Am I wasting my time with this imaginary 16-bit conversion?
    Is there ANY way to take the original TIFF image and convert it to a 16-bit JPEG without the default 8-bit? I have been trying all kinds of things. I even asked my web guy. My web guy says that 8-bit is unacceptable for printing, even for the web.
    Could this have anything to do with my computer and scanner?
    I have an iMac running OS X 10.8.3 (3.2 GHz) with 8 GB of memory.
    I also have an Epson Expression 10000XL graphic arts scanner capable of scanning at 48-bit color.
    This color stuff really matters! It MATTERS! I have FINE art files. I am already losing so much quality with the JPEG conversion (which I am required to do for SmugMug, in addition to compressing all my files to 50 MB or under).
    Anyone who knows anything that could help me would be much appreciated. 
    Aloha,
    -Melissa

    First of all, JPEG is 8-bit only; there is no way to save a 16- or 32-bit JPEG, it just does not exist. Secondly, people print from 8-bit files all the time, and most if not all web graphics are 8-bit, since that is the only way to view them: there are no 16-bit or 32-bit monitors. All but a few pro monitors are 8-bit monitors.
    If you care about the color gamut and want the full range that 16 and 32 bit provide, then why JPEG? JPEG by its very nature throws out color information just to compress; that's why it is popular on the web, because of its small file size, not its quality. If you need 16 or 32 bit for anything, it must be in a format that supports that color depth.
    That being said, a JPEG at 8 bits per channel will display 16+ million colors: 256 shades of red, 256 shades of green and 256 shades of blue.
    Now here is where I think your bit information is off: a JPEG is a 24-bit image that carries 8 bits of red, 8 bits of green and 8 bits of blue.
    The 8, 16 and 32 are per channel, not total color information.
    If the overall image were only 8 bits total, the image would be grayscale.
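    In case the arithmetic helps, here is a minimal sketch in plain Python (my own illustration, not from the original reply) showing how bits per channel translate into shades per channel and total colors:

        # Bits per channel -> shades per channel -> total RGB colors.
        # 8 bits/channel is what JPEG stores; 6 and 10 bits are common panel depths.
        for bits_per_channel in (6, 8, 10, 16):
            shades = 2 ** bits_per_channel      # e.g. 2**8 = 256 shades of red
            total_bits = bits_per_channel * 3   # e.g. 8 * 3 = 24-bit color
            total_colors = shades ** 3          # three channels: R, G, B
            print(f"{bits_per_channel:>2} bits/channel: {shades:>6} shades each, "
                  f"{total_bits}-bit color, {total_colors:,} colors")

    Running it shows that 8 bits per channel means 256 shades each of red, green and blue, i.e. 24-bit color and about 16.7 million colors, while 6 bits per channel gives only 262,144.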

  • Does anyone know how to change my 27" iMac to 32- or 24-bit color resolution to allow me to use Citrix Receiver?

    Does anyone know how to change my 27" iMac to 32- or 24-bit color resolution to allow me to use Citrix Receiver?

  • T500 with External Monitor Seems to Be in 8 bit Color

    I got a T500 for work running Windows 7 x64. I hooked the computer up to an external monitor via the VGA port (I've connected both directly to the laptop and through the ThinkPad dock).
    Despite my best efforts, the external monitor seems to render in 8-bit color mode (or maybe 16-bit, but definitely not 32-bit). You can tell by the lack of color in gradients. Things I've tried:
    Updating the video drivers
    Using the IBM monitor tool to set it to 32 bit
    Setting it to 32 bit in the Windows settings

    Just to close the loop on this, I've determined that the cause of my issue is the Intel Dynamic Platform and Thermal Framework software/driver. I've disabled it in the BIOS for now, and the machine is working very well. I will look for an update from Lenovo that addresses it more definitively.

  • Is my Cinema Display running in 32-bit color mode?

    I'm calibrating my Cinema Display and have read that I should check to see that it is running in full 24- or 32-bit color mode. Can anyone tell me how to do that? Thanks.

    Open the Displays pane of System Preferences. If you're given the option to change the monitor's bit depth, choose at least millions of colors.

  • Display seems like 16 bit color with slight flicker

    When I turned my iMac on today, the display looks like its color spectrum is in 16-bit color, and it has a flicker to it.
    This happened all of a sudden.
    It's less than a year old.
    21" iMac
    Intel Core 2 Duo
    NVIDIA GeForce 9400
    VRAM 256MB
    Any ideas?

    16-bit colour is the default maximum colour depth for RDP connections to a Windows terminal server. You can increase this by editing the local Group Policy object on the Windows machine.
    Go to Start -> Run and type gpedit.msc.
    Browse to Computer Configuration/Administrative Templates/Windows Components/Terminal Services and edit the "Limit maximum colour depth" setting. Set it to Enabled and select 24-bit. Click OK and close the Group Policy editor. You should now be able to get 24-bit colour over RDP connections.
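    If you would rather script it than click through gpedit.msc, the sketch below sets what I believe is the registry value behind that policy; the key path and the value mapping (4 = 24-bit) are my assumption, so verify them against gpedit.msc on your own machine. It must be run as administrator on the Windows box:

        # Rough sketch: set the "Limit maximum colour depth" policy via the registry.
        # ASSUMPTION: the policy writes the ColorDepth DWORD below (3 = 16-bit,
        # 4 = 24-bit, 5 = 32-bit); double-check with gpedit.msc before relying on it.
        import winreg

        KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"

        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "ColorDepth", 0, winreg.REG_DWORD, 4)  # 4 = 24-bit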

  • How can I watch video encoded with 10-bits color depth correctly via Adobe Flash Player?

    When I watch video on a site, one of the videos says it is encoded with 10-bit color depth and requires Flash Player 10.2+ and a non-Trident browser to play correctly. But with the latest available Flash Player installed and using Firefox, why do I still see green vertical bands over the video, which is a sign of improper rendering? How can I fix it? Platform: Android 2.3.6 with a Texas Instruments OMAP4430.

    http://forums.adobe.com/message/6115370#6115370

  • New MacBook Air display has same 6 bit color? I bet it does.

    Sadly, all we have for color depth in the tech specs is the same old 'support for millions of colors', which we know from past experience to have been a total lie. All the laptop displays have been 6-bit color, 262,144 colors, period. A great article on 6-bit versus 8-bit color can be found at:
    http://compreviews.about.com/od/multimedia/a/LCDColor.htm
    The tech specs for the new laptops don't indicate any improvement.
    And yes, the displays use dithering to fake millions of colors (a small numerical sketch of this appears at the end of this thread). Yes, all displays mix RGB colors to fool the eye into seeing millions of colors. And no laptop display I am aware of, including on PCs, has adequate color depth to allow real color matching for professional use. Use a high-end external display for color-matching purposes. CRTs are still considered adequate, if not superior, for this purpose.

    Umm...
    Every laptop has a 6-bit panel. There aren't any 8-bit laptop panels that Apple could use, in fact, except for some very new 17" ones.
    The thing is, 8-bit panels use more power and are regarded by most of the industry as overkill for laptops.
    Anyway, I repeat: there are no 13" 8-bit panels in existence.
    You want 8-bit color? Plug in a Cinema HD Display or a Dell VA or IPS panel.
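    To make the dithering point above a bit more concrete, here is a small numpy sketch (my own illustration, nothing from the original posts) that quantizes a smooth gradient to 6 bits with and without noise dithering; averaged over neighbouring pixels, the dithered version recovers the in-between shades a 6-bit panel cannot show directly:

        import numpy as np

        rng = np.random.default_rng(0)

        # A smooth horizontal gradient on a 0..1 scale, repeated over 64 rows.
        gradient = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))

        def quantize(x, bits):
            """Round to the nearest level a panel of the given bit depth can show."""
            levels = 2 ** bits - 1
            return np.round(x * levels) / levels

        plain = quantize(gradient, 6)                                  # visible banding
        noise = rng.uniform(-0.5, 0.5, gradient.shape) / (2 ** 6 - 1)  # +/- half a level
        dithered = quantize(gradient + noise, 6)                       # noise dithering

        print("distinct levels per pixel:", np.unique(plain).size)     # only 64
        # Averaged down each column, dithering tracks the original gradient better:
        print("mean error, no dither:", np.abs(plain.mean(0) - gradient[0]).mean())
        print("mean error, dithered :", np.abs(dithered.mean(0) - gradient[0]).mean())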

  • Is the latest iMac monitor (21.5" and 27") using 8-bit color depth or 6-bit dithering?

    So far, there hasn't been an official answer. "Supports millions of colors" could mean either one.

    My question is more about the hardware of that iMac monitor than the software setting (32-bit).
    I noticed a similar discussion from a few years back:
    https://discussions.apple.com/message/10643776#10643776
    So it seems that Apple is not using 8-bit panels on the iMac yet?

  • How do I export from Lightroom to Photoshop 2014 using 8-bit color?

    In LR, when I right-click in the image area, a menu appears. I choose export and then Photoshop 2014. I specify "Edit with Lightroom adjustments". When the image appears in Photoshop it does indeed have the adjustments, but the color depth is always 32-bit. It's become bothersome to then reduce the color depth to 8-bit, and I am wondering if there is a hidden export setting that will allow me to export the image as 8-bit color.

    You can set that in the Lightroom preferences; see the screen capture.

  • Rendering as 10-bit/12-bit color (JPEG 2000?)

    I'm not trying to create a digital cinema package (DCP) per se, but I've got a few questions related to rendering in bit depths greater than 8-bit.
    NOTE:  The digital cinema standard (PDF link, see page 31) for package distribution is JPEG 2000 files with 12-bit color (in XYZ color space)...packaged into the MXF container format.
    1.  I'm wondering if it's possible to render to any 10-bit or 12-bit color format within After Effects.  Let's assume my source is a 16bpc or 32bpc comp, and I render it to the QuickTime container and select JPEG2000 or one of the other variants.  None of them seems to go above "millions of colors", or 8-bit.  (The one that has the option of "millions of colors plus" still only renders to planar 8-bit [YUV 4:2:0] when I inspect its streams and metadata.)
    2.  In the QuickTime container list, what are "Motion JPEG-A" and "Motion JPEG-B"?  These aren't standards with which I'm familiar, and I can't seem to find any detail in the documentation as to what these actually are.  (In all my tests, they're limited to 8-bit color.)
    3.  Is the JPEG 2000 codec that's available via QuickTime the same JPEG 2000 codec that's literally the full ISO/IEC 15444 or SMPTE 429-4 standard, or some crippled bits-and-pieces version?
    Obviously, I can render to TIFF or OpenEXR sequences in 16bpc or full float...I was just wondering if it was possible to get 10-bit or 12-bit color in a standard container via After Effects CC or Premiere Pro CC (via Media Encoder CC). 
    I see the "render at maximum bit depth" option in Premiere Pro, but I've never found a format/container that would output anything higher than 8-bit color...even with 16bpc or 32bpc input media.
    Thanks for your help and insight.

    If you want higher bit depth J2K, you have to render to image sequences. The baseline QuickTime implementation is from the stone age. Perhaps there's some commercial third-party codec out there, or if you have such hardware you could use Blackmagic's, but off the bat there is nothing usable in QT as far as I know.
    Mylenium
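    One practical way to verify what a render actually contains is to open a frame from the image sequence and look at its pixel type. A rough sketch using the third-party numpy and tifffile packages (my own suggestion, not from the thread); the file name is hypothetical:

        # Check the bit depth a rendered frame actually contains.
        # "frame_0001.tif" is a hypothetical name; point it at your own output.
        import numpy as np
        import tifffile

        arr = tifffile.imread("frame_0001.tif")
        print("dtype:", arr.dtype)   # uint8 = 8-bit, uint16 = 16-bit, float32 = full float
        print("shape:", arr.shape)   # (height, width, channels)
        print("distinct values in the first channel:",
              np.unique(arr[..., 0] if arr.ndim == 3 else arr).size)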

  • Does Lightroom 4 support 10-bit color?

    Hoping to learn if Lightroom supports 10-bit color.  I know it supports ProPhotoRGB and 16-bit image files.  Thanks.

    oliver83 wrote:
    Actually, the previous answers were only partially correct with respect to displaying 10-bit color. In contrast to what was said, LR4 (I use version 4.4) does support 10-bit color, however only in the Develop module. The Library module does not seem to support it, which could be because 1) it really does not support it, 2) the library previews are created at a lower color depth, or 3) something I haven't considered.
    I first read this somewhere and then personally verified it with ATI's ramp.psd test file on an Eizo CG276 (and a V4900 FirePro).
    The library previews are JPEG (8-bit).

  • 16 bit color in PHOTOSHOP ELEMENTS

    I get an error when using some of my tools (for example, the Healing Brush) in Full Edit that states "16-bit color not supported. Convert to 8-bit color?"
    Will Photoshop Elements 9 work with 16-bit color? Considering most cameras (especially DSLRs) and scanners now use 16-bit color, I hope so.

    Some features of PSE will work on 16-bit images; other features will not, and you have to convert the image to 8 bits. I don't have PSE 9, but it is my understanding that if a feature didn't work on 16-bit images in earlier versions, it won't work on 16-bit images in the current version (PSE 9).
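    If it is ever more convenient to do the reduction outside PSE, converting 16 bits per channel down to 8 is just a matter of keeping the high byte of each channel. A rough sketch using the third-party numpy and tifffile packages (the file names are hypothetical):

        # Reduce a 16-bit-per-channel TIFF to 8 bits per channel outside of PSE.
        import numpy as np
        import tifffile

        arr16 = tifffile.imread("scan_16bit.tif")    # dtype uint16 for a 16-bit scan
        arr8 = (arr16 >> 8).astype(np.uint8)         # keep the high byte of each channel
        tifffile.imwrite("scan_8bit.tif", arr8)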

  • Working with 32 bit color HDRs in Lightroom

    The feature of being able to work with 32-bit color TIFF files in Lightroom is awesome and wonderful. However, I note that when I do not follow the recommended workflow of bringing the 32-bit file directly back into Lightroom to work on it, and instead work on it with Adobe Camera Raw in Photoshop first, the results are somewhat confusing. I'm fine with following the recommended workflow except that I seem to be able to get even better results quite quickly using ACR. ACR appears to give me a greater tonal range to work with, but when I save the file as a TIFF and take it back to Lightroom, the colors aren't displayed correctly by Lightroom. There is a huge difference between viewing the file in Photoshop and viewing it in LR. If I work on the file immediately in LR, avoiding doing anything in PS, the files look identical in both programs, but at the apparent cost of losing a lot of exposure range to play with. Am I correct in inferring that LR is working on the 32-bit colors using a 16-bit color depth conversion in its tools?

    https://dl.dropboxusercontent.com/u/106834366/Redux%20Vidz.aep
    This is kind of a project-ruining issue, not being able to render it out in 32-bit; I'm working with a lot of gradients and lens blurs.
    Anyone have any hints of why this might be happening?

  • Photoshop CC 2014 30-bit Color Support

    I'm using PS CC 2014 on a Windows 7 64-bit system with an Nvidia Quadro 600 graphics card. I've tried every possible settings combination for PS and Windows 7 with no success using the AMD ramp test file. The Nvidia 10-bit demo works fine. PS CS6 works as well, but only at 64% zoom view and higher.
    https://forums.adobe.com/message/7154720#7154720
    Is anyone using an Nvidia Quadro 600 (not K600) card with Windows 7 having success with 30-bit color mode in PS CC 2014? This card is listed as "tested" in the PS CS6 GPU FAQ, but not in the PS CC 2014 GPU FAQ. I'm assuming it wasn't tested because it's an older card, but it should still work.

    I've read that and the Nvidia PDF multiple times. Basically there's nothing you need to change in the Nvidia driver or in Windows 7 Aero mode, but I have tried it in both the Windows Basic and Aero modes. It's "partially" working in PS CS6, and the Nvidia demo works, so clearly my NEC PA272W monitor, Nvidia Quadro graphics card and Windows 7 support 30-bit color... just not in PS CC 2015, and only partially in PS CS6.
    I'm not a newbie at this stuff, so feel free to talk techie. Thanks!
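    For anyone else testing their chain, here is a rough sketch (my own, not the actual AMD ramp file) that writes a shallow 16-bit grayscale gradient using the third-party numpy and tifffile packages; on a working 30-bit path it should look smooth in Photoshop, while an 8-bit path shows obvious banding:

        # Write a shallow 16-bit grayscale ramp, similar in spirit to the AMD
        # ramp test file (it is NOT that file). A 30-bit pipeline should show a
        # smooth gradient; an 8-bit pipeline shows visible bands.
        import numpy as np
        import tifffile

        WIDTH, HEIGHT = 1920, 512

        # Sweep only ~1/16 of the tonal range so an 8-bit path has very few steps.
        ramp = np.linspace(0.5, 0.5625, WIDTH)
        row = np.round(ramp * 65535).astype(np.uint16)
        image = np.tile(row, (HEIGHT, 1))

        tifffile.imwrite("ramp_16bit_gray.tif", image)
        print("distinct 16-bit levels :", np.unique(image).size)
        print("distinct levels at 8-bit:", np.unique(image >> 8).size)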
