10 Bit Color From DisplayPort in 5.02

I noticed this was either fixed or added in the latest release. I have a Quadro 5000 with DisplayPort and HP ZR30w 10 Bit Color monitors with DisplayPort.
Is there anything I need to configure in Windows 7 or Premiere Pro to make sure I am seeing 10 Bit Color? 10 Bit files look amazing and I see no banding, but I want to make sure I have it set up right.
Thanks,
Carlos Morillo

I have run several searches for DisplayPort, Display Port, and Display-Port and have found nothing. The searches I ran for 10 Bit Color, 10-Bit Color, and High Depth Color all returned results that seem to be specific to the AVI v210 codec in Desktop Mode and/or SDI. There are many other file formats that handle greater than 8 Bit Color, and I would like to make sure I am actually seeing that through my monitor. I just switched to the Quadro and the HP 10 Bit display and I want to know that it's set up right. If I am missing something from the Premiere Pro CS5 PDF file, could someone please point me to the right page? I ran a search for just the word "color" and still found no answer. According to the notes on 5.02, 10 Bit Color through DisplayPort works.
Is it just automatic when you have a 10 Bit Sequence and 10 Bit or greater files? I know you can render using High Depth Color and Maximum Quality, but I don't think that's the same as actually seeing 10 Bit color on screen, since those options are available to you even when you are on an 8 Bit Sequence with 8 Bit files.
Thanks,

Similar Messages

  • 10 Bit Color In LR 3.3

    I am planning on buying an LCD monitor that provides 10 bit color with DisplayPort. Does LR support that depth?

    Yes, please let's not get into the whole colorspace thing (LR uses MelissaRGB), especially since no one has confirmed LR 3.3 even supports 10 bit/color output.
    You say:
    Another recommended choice is the ATI FirePro 5800 but it only has DVI outputs.
    The ATI FirePro V5800 Professional card has two DisplayPort interfaces AND one dual-link DVI interface, exactly the same as the Nvidia Quadro 2000:
    DisplayPort: 2 Standard
    Dual-link DVI: 1
    Max Resolution: 2560x1600 @ 60Hz
    See link here:
    http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/v5800/pages/v5800.aspx#4
    GrandpaHenry you have a lot of energy! Assembling a system that can support 10 bit/color displays is not for the "faint of heart." Like I tell my grandkids sometimes.....slooooooooooow down! No offense intended, but I just suggest doing more product research and reading of detailed specification information. Both of the above mentioned graphics cards will fit the bill, but like I said there may be less expensive models that will also work. Wide Gamut monitor selection is even more challenging and a much more expensive system investment. Some of the wide gamut displays come bundled with a hardware monitor calibration device and software. This is probably a more economical display solution, which also provides easier setup. Monitor setup and calibration is crucial to ANY digital imaging workstation, and even more so for a wide gamut system!
    Read, read, and re-read reviews and datasheet specifications, and look at other forums using Google search. Wide Gamut is still in its infancy for ALL applications – You are a pioneer, and I respect you for the gumption and fortitude required to make the leap. Best of luck to you – I am sure you will be successful!

  • Remote desktop from Win 7 to Win server 2003 only allows 16 bit color depth. How to increase to at least 24 bit?

    Hi! I'd like to RDP in to a Windows Server 2003 computer from my new Windows 7 (x64) computer. There's no problem connecting, but the connection only allows me to use 16 bit color depth on the Windows Server 2003 machine. The program I want to use on this computer requires at least 24 bit to run.
    I've googled around and tried it all:
    -setting the color depth to 4 or 5 through regedit (on both computers)
    -creating a Group Policy Object snap-in through MMC
    -reading endless forums, but no one seems to have had the same issue as I have
    My question: if none of the ordinary solutions work, what could the issue be? My only suspicion at this point is that the RDP client on the Windows server isn't working properly. The Windows Server 2003 machine is fully updated and should have RDP 5.2 installed, but I can only find 5.1. Could this be the issue? Does it need to be 5.2 to properly communicate with Windows 7 (x64)? If so, how can I upgrade just the RDP client, since it doesn't seem to have come through Automatic Updates as it should?
    All answers and feedback are very much appreciated!
    Many thanks
    Frank

    Hello Frank,
    Thanks for your feedback and the test.
    I still recommend checking the local Group Policy setting “Limit maximum color depth” on the Terminal Server side. Based on your description of Group Policy Object Editor, I suspect the missing Terminal Services folder is caused by the system.adm template not loading successfully. Please try to fix it via the following steps:
    1. Open Group Policy Object Editor (run “gpedit.msc”) on the terminal server.
    2. Expand Computer Configuration \ Administrative Templates.
    3. Right-click the Administrative Templates folder and select Add/Remove Templates. If system is not listed in the panel, go to Step 4; if it is, remove it and then go to Step 4.
    4. Click the Add button.
    5. The dialog should open in the Windows\inf folder; if it does not, navigate there. Select the system.adm file and click Open.
    6. Check whether the Terminal Services folder now appears under Windows Components.
    Thanks for your cooperation; we’re looking forward to the result.
    Regards,
    Lionel Chen
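    For what it's worth, the per-connection color depth that Frank mentions editing in regedit can also be read programmatically. Below is a minimal read-only sketch in Python (assuming Python is available on the server; the key path is the standard RDP-Tcp WinStation location, and the value meanings, commonly reported as 3 = 16-bit, 4 = 24-bit, 5 = 32-bit, should be verified against Microsoft's documentation for Windows Server 2003):

        # Sketch: read the per-connection RDP color depth value on the server.
        # Key path and value meanings should be verified for your OS version.
        import winreg

        KEY = r"SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
            value, _ = winreg.QueryValueEx(key, "ColorDepth")

        # Commonly reported mapping: 1 = 8-bit, 2 = 15-bit, 3 = 16-bit,
        # 4 = 24-bit, 5 = 32-bit (verify before relying on it).
        print("ColorDepth =", value)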

  • How do I export from Lightroom to Photoshop 2014 using 8-bit color?

    In LR, when I right-click in the image area, a menu appears. I choose export and then Photoshop 2014, and I specify "Edit with Lightroom adjustments". When the image appears in Photoshop it indeed has the adjustments, but the color depth is always 32-bit. It's become bothersome to then adjust the color depth back down to 8-bit, and I am wondering if there is a hidden export setting that will allow me to export the image as 8-bit.

    You can set that in Lightroom's preferences; see the screen capture.

  • MacBook Pro & Mini DisplayPort Does NOT Support 10 bit Color Monitors

    I am extremely concerned about Apple's Mini DisplayPort. Mini DisplayPort has been out for a year now, and is the video-out port with the maximum signal output on MacBook Pros. Apple's Mini DisplayPort still does not offer the compatibility and performance that is possible with the standard DisplayPort implementation on Windows.
    Case in point: two monitors from NEC and Eizo, the NEC PA214W and the Eizo CG223W:
    http://www.eizo.com/global/products/coloredge/cg223w/index.html
    http://www.necdisplay.com/Products/Product/?product=5a6621b9-e9c4-4f02-8542-e6251364bf7c
    All current MacBook Pros with Mini DisplayPort are unable to support the 10 bit color that these monitors are capable of. This is unacceptable, and an example of Apple leading customers to believe that Mini DisplayPort offers the same capabilities and performance as DisplayPort.
    It is my understanding that on paper MBPs with the Nvidia 9400/9600 are capable of outputting a 10 bit signal. They just do not at this point in time, unlike all PC notebooks with DisplayPort. Mini DisplayPort has been sold by Apple as being every bit as capable as DisplayPort, only smaller. The truth is that it is not.
    Apple notebooks cannot currently produce the same level of quality on these monitors as Windows notebooks with DisplayPort, which have also been on the market for a year now.
    http://www.tftcentral.co.uk/newsarchive/19.htm#10-bitips

    Displayport:
    http://www.displayport.org/consumer/
    http://www.displayport.org/consumer/?q=content/faq
    "Performance for a display interface is really bandwidth. So Displayport offers up to 10.8 Gigabits per second. That bandwidth can be allocated for greater color depth (more colors per pixel, i.e.. 30 bit color monitors), it can be allocated to resolution, i.e.. WQXGA , or refresh rate."
    Note: The bandwidth has increased to 21.6 Gbps with the announcement of Displayport v1.2.
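    To put those bandwidth numbers in perspective, here is a rough back-of-the-envelope calculation in Python (a sketch only; it ignores blanking intervals and protocol overhead, and the 8b/10b figure is the usual rule of thumb for DisplayPort 1.1):

        # Approximate payload bandwidth for WQXGA at 60 Hz, 24-bit vs 30-bit color.
        width, height, refresh = 2560, 1600, 60

        payload_24 = width * height * refresh * 24 / 1e9   # ~5.9 Gbit/s
        payload_30 = width * height * refresh * 30 / 1e9   # ~7.4 Gbit/s

        # DisplayPort 1.1: 10.8 Gbit/s raw, roughly 8.64 Gbit/s after 8b/10b coding.
        link_budget = 10.8 * 8 / 10

        print(f"24-bit: {payload_24:.2f} Gbit/s, 30-bit: {payload_30:.2f} Gbit/s, "
              f"usable link: {link_budget:.2f} Gbit/s")

    Even the 30-bit payload fits within a full DisplayPort 1.1 link, which supports the point that the limitation is in the systems driving the port rather than the connector itself.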
    It appears to me that aside from the top-of-the-line (Windows PC) notebooks, the only systems that will currently be able to achieve 30 bit color with these monitors are going to be desktop systems with DisplayPort/Mini DisplayPort and the necessary GPU and operating system. This is an example of the difficulty in getting accurate information when it comes to what a system can and cannot do. Rod correctly pointed out that I was mistaken in saying that "all" PC notebooks with DisplayPort are capable of achieving 10 bit per channel output; some do not.
    As BSteely, Lewis, and Rod have discussed, Mini DisplayPort is not what is responsible for the fact that MacBook Pros cannot currently output a 10 bit per channel signal. DisplayPort exists and allows for this capability, and I stand by my opinion that MacBook Pros with Mini DisplayPort currently do not implement the full DisplayPort spec, in particular the ability to output a 10 bit per channel signal to 30 bit color monitors. On the PC side it is the "workstation" class notebooks that have had, and continue to have, this capability. There may be others, but in reviewing Dell, HP, and Lenovo I am only finding it in the Precision, W series, and EliteBook models. The latest T series ThinkPads with Quadro GPUs may work as well. At this point I am not going to research each and every model to determine exactly which notebook models and configurations will work, except to say none of them are MacBook Pros.
    The lesson here should be: get multiple confirmations regarding your system's configuration and your intentions for its use. I was mistaken in my understanding that the MBP 17 with Mini DisplayPort would support the DisplayPort spec as it relates to 30 bit color monitors. I have stated what I think is wrong about that, but that is the way it is for now. Apple is not the only company to hype a technology, only to offer a limited implementation of it. "Vista Capable" comes to mind.
    http://www.informationweek.com/news/windows/operatingsystems/showArticle.jhtml?articleID=212100567
    This post will hopefully alert others to this limitation. To end on a positive note, in my experience the last six months with my Macbook Pro, Apple has demonstrated it can resolve issues quickly and effectively, and I have been extremely pleased to the extent that they have worked with me to address the few significant issues that have come up, including a system replacement in quick order. This may or may not be something they will be able to address with the current Macbook Pros or even the next generation. We will see.

  • 30 bit color

    I'm a photographer, and I need a wide color gamut for my work. I use the Adobe RGB color space with my Eizo screen (CX241). I would like to change my workstation. Currently I use Windows with a Quadro graphics card, but I have no information about the color management of the OS and of the Mac Pro. If I want to use the Adobe RGB color space, I need 30 bit color management over DisplayPort (with the OS and graphics card supporting 10 bits per color). Do the OS and the Mac Pro support this?
    Can you give me more information, please?
    Thanks

    No 10 bit

  • T500 with External Monitor Seems to Be in 8 bit Color

    I got a T500 for work running Windows 7 x64. I hooked up the computer to an external monitor via the VGA port (I've connected directly to the laptop and also through the Thinkpad dock).
    Despite my best efforts, the external monitor seems to render in 8 bit color mode (or maybe 16, but definitely not 32 bit). You can tell by the lack of smooth color in gradients. Things I've tried:
    Updating the video drivers
    Using the IBM monitor tool to set it to 32 bit
    Setting it to 32 bit in the Windows settings

    Just to close the loop on this, I've determined that the cause of my issue is the Intel Dynamic Platform and Thermal Framework software/driver. I've disabled it in the BIOS for now, and the machine is working very well. I will look for an update from Lenovo to address this more definitively.

  • Display output on LED TV from DisplayPort on zbook 15

    Hi,
    I have an HP ZBook 15 notebook (product D5H42AV).
    On this laptop I have a DisplayPort output and a Thunderbolt output.
    I have bought a third-party DisplayPort-to-HDMI adapter (not from HP).
    I cannot get the output displayed on my LED TV (Full HD).
    Could it be that the adapter supports only DisplayPort 1.1 and not 1.2?
    If I want to see the screen on the TV, should I use a DisplayPort adapter or a Thunderbolt adapter?
    Is there any way to troubleshoot why I cannot display the output on TV from my laptop?

    Hi Provost,
    Thank you for your feedback. I have tried that, but it is still not working.
    According to this document:
    http://h20195.www2.hp.com/V2/GetPDF.aspx/4AA5-2657ENW
    it seems that I should be able to connect an HDMI display to the DisplayPort:
    *DisplayPort connector supports a DisplayPort display, a HDMI display with an DP-to-HDMI dongle, a VGA display with a DP-to-VGA dongle, or a DVI display with a DP-to-DVI dongle.
    I have a Samsung TV Series 6 with 4 different HDMI input:
    HDMI1 (STB)
    HDMI2 (ARC)
    HDMI3
    HDMI4 (DVI)
    I have tried with all of them, unsuccessfully. I'm afraid that the DP-to-HDMI dongle I bought from a local internet shop (it's made in China) is somehow not fully compatible with my ZBook.
    Maybe I should buy the original one from HP.
    When I connect the laptop to the TV, the TV seems to detect that the port is connected, but it says "No Signal".
    Also, pressing F4 and checking the Screen Resolution window (in Windows 7) shows the second display as VGA, so it looks like it does not detect the DisplayPort.
    Is it possible that the port must be enabled (in the BIOS or a Windows setting), or that I am missing some drivers?
    Checking on the Internet, I found that there are some adapters that can also connect to the Thunderbolt port:
    https://m.cdw.com/shop/products/StarTech.com-Mini-DisplayPort-to-HDMI-Video-Adapter-Converter/197508...
    The product is described as compatible with zbook 15
    However, the official HP site sells a dongle only for the DisplayPort:
    http://shopping1.hp.com/is-bin/INTERSHOP.enfinity/WFS/WW-USSMBPublicStore-Site/en_US/-/USD/ViewProdu...
    The adapter I'm using can be seen in this page:
    http://media.extron.com/download/files/userman/68-1887-01_B.pdf
    It's the "DisplayPort Male to HDMIF Active Adapter" (top right code 26-655-01).
    I'm a bit confused now. I don't know if I have to buy a different adapter or if there is a way to troubleshoot my problem and see if the DisplayPort is enabled and working on my laptop.
    Regards.
    Alberto

  • Rendering as 10-bit/12-bit color (JPEG 2000?)

    I'm not trying to create a digital cinema package (DCP) per se, but I've got a few questions related to rendering in bit depths greater than 8-bit.
    NOTE:  The digital cinema standard (PDF link, see page 31) for package distribution is JPEG 2000 files with 12-bit color (in XYZ color space)...packaged into the MXF container format.
    1.  I'm wondering if it's possible to render to any 10-bit or 12-bit color format within After Effects.  Let's assume my source is a 16bpc or 32bpc comp, and I render it to the QuickTime container and select JPEG2000 or one of the other variants.  None of them seems to go above "millions of colors", or 8-bit.  (The one that has the option of "millions of colors plus" still only renders to planar 8-bit [YUV 4:2:0] when I inspect its streams and metadata.)
    2.  In the QuickTime container list, what are "Motion JPEG-A" and "Motion JPEG-B"?  These aren't standards with which I'm familiar, and I can't seem to find any detail in the documentation as to what these actually are.  (In all my tests, they're limited to 8-bit color.)
    3.  Is the JPEG 2000 codec that's available via QuickTime the same JPEG 2000 codec that's literally the full ISO/IEC 15444 or SMPTE 429-4 standard, or some crippled bits-and-pieces version?
    Obviously, I can render to TIFF or OpenEXR sequences in 16bpc or full float...I was just wondering if it was possible to get 10-bit or 12-bit color in a standard container via After Effects CC or Premiere Pro CC (via Media Encoder CC). 
    I see the "render at maximum bit depth" option in Premiere Pro, but I've never found a format/container that would output anything higher than 8-bit color...even with 16bpc or 32bpc input media.
    Thanks for your help and insight.

    If you want higher bit depth J2K, you have to render to image sequences. The baseline QT implementation is from the stone age. Perhaps there's some commercial third-party codec out there, or if you have such hardware you could use Blackmagic's, but off the bat there is nothing usable in QT as far as I know.
    Mylenium
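    As a follow-up to Mylenium's suggestion, once you have rendered an image sequence you can at least confirm its bit depth outside of QuickTime. Here is a small Python sketch, assuming Pillow is installed, the frames were exported as TIFF, and the path pattern below (which is hypothetical) is adjusted to your render location. Pillow's support for high-bit-depth TIFF flavors varies by version, so treat this as a starting point:

        # Sketch: report the BitsPerSample tag of rendered TIFF frames.
        import glob
        from PIL import Image

        for path in sorted(glob.glob("renders/frame_*.tif")):   # hypothetical pattern
            with Image.open(path) as img:
                bits = img.tag_v2.get(258)          # TIFF tag 258 = BitsPerSample
                print(path, img.mode, bits)         # e.g. (8, 8, 8) or (16, 16, 16)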

  • Problems with color from Photoshop to InDesign CS2?

    Hi all,
    Couple of questions:
    When I select "Proof Colors" from "View", the color changes a little bit. Is this normal?
    Also, on "View", "Proof Setup", which one should I select "Document CMYK" or "Working CMYK" or some other "Custom"
    thanks for all your help,
    Alex

    >When I select "Proof Colors" from "View", the color changes a little bit. Is this normal?
    It's not abnormal, but it will depend on what color spaces the objects are in and the color space that you choose to proof.
    >Also, on "View", "Proof Setup", which one should I select "Document CMYK" or "Working CMYK" or some other "Custom"
    You have a lot of choices for how to set up color management. I usually leave my settings set up for my most common situation, US Sheetfed coated and Adobe RGB, but if I have a special job I can reassign the document color space (notice I said Assign, not convert, in order to preserve the native object color numbers). You can proof the color in either the Working CMYK (the one in the color settings) the Document CMYK (if that's different) or something else entirely if you are repurposing an existing document for a new output method.
    When you proof colors, InDesign calculates the conversions on the fly and displays the results. RGB objects will almost always show some shifting due to gamut differences, but the type and degree will depend on the rendering intent you specify. CMYK objects tend not to show as much shifting because the gamuts are likely to be closer, but again it will depend on the exact profiles and intents. Objects that are already in the chosen proofing space should not show any shift at all.
    There is a caveat here regarding working in one space and outputting in another: if you want the colors to be as close as possible you will be converting to the new profile, which will by necessity change some color numbers to match the appearance. This is not usually a problem with things like photographs, but other things, like type, which is created as 100% K, can be converted to a four-color mix, which is not good. If you know the output conditions, I always recommend setting the document CMYK to that space when you start.
    Peter

  • Photoshop with 16 bit colors

    Hi,
    Is it possible to work with 16 bit colors in Photoshop?
    I have some bitmaps where I would like to change the color table from 24 bit colors to 16 bit colors.
    Thanks,
    best regards,
    Poul

    Let us consider the possibilities of the Windows Bitmap (BMP) file format. Photoshop can save as BMP with different color depths, but this has nothing to do with "Photoshop Bitmap".
    Mode Indexed Color (BMP, 8 bits per pixel) uses a color look-up table, a CLUT: 256 rows with 3 bytes for R, G and B (one byte each). The image contains one byte per pixel, which is a pointer into the CLUT, from where the color bytes are taken. Finally we have a 24 bit per pixel color representation (True Color). For GIF it's almost the same, but one row in the CLUT means transparency. Building the CLUT for an image which may originally contain 256^3 different colors is called color quantization (a keyword for a further search).
    If the display should require a smaller number of bits, e.g. 5R, 6G, 5B (green is more accurate), then the CLUT can be taken from the Indexed Color BMP image and modified:
    R: shift right 3 bits and round
    G: shift right 2 bits and round
    B: shift right 3 bits and round
    Assemble R, G, B in one 16 bit word. Similarly, if the display should require 5R, 5G, 5B. Such a table cannot be generated directly by Photoshop.
    Mode High Color (BMP, 16 bits per pixel) does not use a CLUT. Perhaps, at the time when these file formats were developed, a CLUT with 2^16 rows was considered nonsensical. Photoshop can generate a BMP image with 16 bits per pixel by one of three modes, as explained earlier in my previous post.
    Thus it is possible to convert an image which contains an arbitrary number of "swatches", as defined by True Color 24 bits per pixel, into High Color 16 bits per pixel with Photoshop. From the result one can take the coding and build a table. Arbitrary means: up to the maximal number, e.g. 32*64*32 swatches for 5R 6G 5B.
    How to proceed further depends strongly on the decision: only 256 colors, or more colors. Accepting the 16 bit per pixel limit, one may use a fixed full-size CLUT and an automatic True Color to High Color conversion with dithering.
    Best regards --Gernot Hoffmann
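    A minimal code sketch of the 8-8-8 to 5-6-5 packing Gernot describes above (Python for illustration; it truncates rather than rounds, and does no dithering):

        # Pack 8-bit R, G, B into one 16-bit High Color word (5R 6G 5B) and back.
        def rgb888_to_rgb565(r, g, b):
            return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

        def rgb565_to_rgb888(word):
            r = (word >> 11) & 0x1F
            g = (word >> 5) & 0x3F
            b = word & 0x1F
            # Replicate the high bits into the low bits to spread values over 0..255.
            return (r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2)

        print(hex(rgb888_to_rgb565(255, 128, 0)))   # 0xfc00
        print(rgb565_to_rgb888(0xfc00))             # (255, 130, 0)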

  • 10-bit color works in LR4 devel module but not in PS CS6

    Dear all,
    my setup is an ATI FirePro V4900 connected to two monitors: an Eizo CG276 and an old second monitor. I would like to get the Eizo to display 10-bit color in PS CS6 under 64-bit Windows 7.
    After setting everything up, I am at a point where Lightroom 4.4 displays the notorious ramp.psd file (see below) without banding in the develop module, hence I assume 10-bit is working there. In Photoshop, however, I had no success so far to get 10-bit to work.
    Here's what I've done so far:
    I set everything up as described here http://www.eizo.com/global/support/compatibility/photoshopcs6_nvidia_amd/index.html
    Summed up, I did:
    1) enable 10-bit mode in ATI driver and rebooted -> checked and this is still enabled after reboot
    2) open PS CS6 and enable "advanced" drawing mode and 30-bit display in the "performance" options and rebooted  -> checked and this is still enabled after reboot
    I verified that the monitor runs in 10-bit mode using Monitor Asset Manager 2.6 (http://www.entechtaiwan.com/util/moninfo.shtm). It states that Color bit depth is "10 bits per primary color", which is what I want.
    Then I tried to open the ATI 10-bit test file ramp.psd (http://www.amd.com/us/products/workstation/graphics/software/Pages/adobe-photoshop.aspx). In Photoshop it shows banding. As I've read that Lightroom 4's develop module uses 10-bit, I opened the file there and it had no banding.
    The only troubleshooting I could find for 10-bit colors in PS was this post http://www.dpreview.com/forums/post/40945907 however I do not have desaturation set in the color preferences of PS, hence it does not apply to my problem.
    Does anyone of you have any idea about what to try next?

    I forgot to mention: the Eizo is obviously connected to the V4900 via DisplayPort as that is the required connection type for 10-bit colors.
    PLEASE, someone help me with a tip on this; I'm really stuck.
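    Not an answer, but one thing that may help while testing: instead of relying only on AMD's ramp.psd, you can generate your own high-bit-depth gradient and open it in both applications. A small sketch, assuming NumPy and Pillow are installed (the file name is arbitrary); on an 8-bit path the ramp shows roughly 256 visible steps, while on a working 10-bit path the steps become much finer:

        # Sketch: write a 16-bit grayscale ramp as a PNG for banding tests.
        import numpy as np
        from PIL import Image

        width, height = 1920, 256
        ramp = np.linspace(0, 65535, width).astype(np.int32)   # one row, 0..65535
        img = np.tile(ramp, (height, 1))                       # repeat into a band

        Image.fromarray(img, mode="I").save("ramp_16bit.png")  # saved as 16-bit PNG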

  • 1,280 x 1,024 32-bit color video display adapter (true color) 128 MB or greater, Direct3D capable

    I want to run AutoCAD 2009 and work in 3D. Which Lenovo meets the minimum requirements? i.e., Intel or AMD dual-core processor, 2.0 GHz or greater; 2 GB RAM or greater; 2 GB of free hard disk space, not including installation; 1,280 x 1,024 32-bit color video display adapter (true color), 128 MB or greater, OpenGL® or Direct3D® capable workstation-class graphics card. For Windows Vista, a Direct3D-capable workstation-class graphics card with 128 MB or greater is required.
    Which model is the least expensive with the minimum requirements?
    Clarification: I want to run AutoCad 2009 in 3d. Re: 1,280 x 1,024 32-bit color . . .
    Note from Moderator:  Please don't post the same message in multiple boards/threads as it splinters the discussion.  Duplicate(s) removed.

    Any ThinkPad with discrete or switchable graphics will work...T400/500, R400/500, W500/700...make sure that you're not getting the one with Intel integrated GPU and you'll be all set...
    FWIW, even T61/p with discrete GPU will run this with no sweat...
    Hope this helps.
    Cheers,
    George

  • 32 bit color Wine for Diptrace

    I've been running DipTrace in Wine, and it works almost perfectly, but every time it opens it complains that it is not in 32 bit color. I'm getting some weird effects using either OpenGL or Direct3D, but not Windows GDI. When stuff is moved around or the screen is zoomed, it stops, then jumps slightly. It's pretty annoying, and I was wondering if it is related to the 32 bit color. I'm not sure where to start with this, so anything you can provide will be helpful. I'm using Openbox with the open-source Intel drivers, and DipTrace is 64 bit in my Wine. Thanks

    Do any "real" 32 bit color systems actually exist? And, if so, which colour do they drop a bit from to get to 32? (3x11 = 33, 3x10=30) I see that according wiki even those that aren't 3x8 + 8 bits of extras are usually 3x10 + 2 bits of padding:
    32-bit color
    "32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.
    Systems using more than 24 bits in a 32-bit pixel for actual color data exist, but most of them opt for a 30-bit implementation with two bits of padding so that they can have an even 10 bits of color for each channel, similar to many HiColor systems.
    (from http://en.wikipedia.org/wiki/Color_depth)
    Cheers
    Rod
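    To make the layouts in that quote concrete, here is a small illustrative sketch (the function names are mine, not from any particular API):

        # Two ways a 32-bit pixel word is commonly laid out.
        def pack_argb8888(a, r, g, b):
            # "32-bit color" as usually meant: 24 bits of color + 8 bits of alpha/padding.
            return (a << 24) | (r << 16) | (g << 8) | b

        def pack_a2r10g10b10(a, r, g, b):
            # 30-bit color in a 32-bit word: 10 bits per channel + 2 bits of alpha/padding.
            return (a << 30) | (r << 20) | (g << 10) | b

        print(hex(pack_argb8888(255, 255, 128, 0)))     # 0xffff8000
        print(hex(pack_a2r10g10b10(3, 1023, 512, 0)))   # 0xfff80000

    Either way, only 24 or 30 of the 32 bits carry color, which is the point Rod is making.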

  • 16 bits color?

    A client complained about the colors in a movie. As it turns out, they see the boundaries of MCs and text fields (in text fields it's the rendering area of the text, not the complete text field) in a slightly lighter tint than the background. I tried to replicate it and had to set my display to 16 bits.
    Solution? Limit the color scheme to web safe?
    How 'common' is 16 bit color? Isn't this something from the middle ages?

    Web safe probably won't help (it may be slightly better, or may be worse).
    The problem is a bug in the player: it renders solid colors differently to image/gradient fills (and in some other cases). It also uses different rendering when transparency is involved.
    If the problem is due to a solid color fill being next to a gradient/image color, then you can solve the problem by using a gradient with only a single color in it for the solid fill.
    Apart from the fact that web-safe colors are not web safe at all (apart from a handful, like a handful of blues and the primary colors), it really won't make a significant difference (the web-safeness of the color is not the issue here).
    Just have a think on this for a moment: no matter what color system you use on a screen (8-bit, 16-bit, 24-bit, 32-bit), if the PC is trying to output a given RGB color, it will display the same everywhere on that screen. It may be different from what some other computer would show, but it will be consistent across the screen. So the problem is that Flash really is trying to display different colors on different pixels where it should be trying to display the same color (if it weren't for the bugs in how it renders). Now, changing what color is there may increase or lessen the visible difference, but it is unlikely to help (unless you do something like use pure black or pure white, etc., which I think Flash manages to get right).
    NOTE: There is a technote that tries to explain away why this happens so it doesn't look like a bug (technotes are good at that). But it really is a bug.
    Jeckyl
