Setting 16 bit color on ATI Xpert 98 PCI 8MB

Hi folks,
I am new to Solaris 7 Intel. I recently installed it and am trying to set my ATI video card to use 16-bit color. I edited the /etc/openwin/server/etc/OWconfig file and changed defdepth="8" to defdepth="16". After that, Solaris worked fine. But when I used Netscape 4.x to access some sites, such as www.foxnews.com or www.kgo.com, it terminated my applications and restarted CDE. Has anyone experienced the same problem? Your advice would be appreciated.
Regards,
KRT
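For reference, the change described above is a single attribute inside the device entry in OWconfig; the surrounding entry layout varies by card, so this is only a sketch of the attribute that changes, not a complete entry:

```text
# In /etc/openwin/server/etc/OWconfig, within the entry for the ATI device
# (exact entry layout varies by card):
defdepth="16"      # was defdepth="8"
```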

MisterAnderson wrote:
Does the same happen with the open-source radeon drivers? (xf86-video-ati)
Thanks for the reply!
The open-source driver doesn't support my card yet. I tried installing it first since it's the recommended driver, but Xorg couldn't detect any display at all.
It seems the PCI BusID is not being read from the xorg.conf file; auto-detection happens every time the PC boots.
Last edited by hdhiman (2012-10-04 00:43:52)
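For what it's worth, pinning the card in xorg.conf usually means giving the Device section an explicit BusID; a minimal sketch (the bus address 1:0:0 is a placeholder, check yours with lspci):

```text
Section "Device"
    Identifier "ATI Radeon"
    Driver     "radeon"
    BusID      "PCI:1:0:0"
EndSection
```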

Similar Messages

  • Can I set 4-bit color depth (16 colors) with a Unix command?

"M4" by Deadly Games, a Sherman tank simulator, insists that the color depth be set to 16 colors before it will launch and run completely, but the lowest I can select on my G4 QS 2002 with 9.2.2/10.3.9/10.4.10 is 256 colors. Is there a way to choose fewer colors than 256 using Terminal and a Unix command?
    TIA

    Camelot wrote:
    4-bit screen depths have not been supported in a long time. I don't know any option that will get you that setting.
    I can't imagine any current game would require that setting. Even if it only uses 16 colors internally it shouldn't need the entire system to be set that way.
    Can't argue with any of your statements or opinions.
    BTW, do you have an answer/solution to the problem?

  • Help!!! How do I get the bit color depth the user has his screen set to?

    Thank you.

I'm not sure if it's what you're looking for, but it worked for me:

        import java.awt.GraphicsDevice;
        import java.awt.GraphicsEnvironment;

        GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
        GraphicsDevice[] gs = ge.getScreenDevices();
        for (int i = 0; i < gs.length; i++) { // length is 1 (= I have only one screen?)
            GraphicsDevice g = gs[i];
            // getDisplayMode() is only in JDK 1.4+, I think
            int nBits = g.getDisplayMode().getBitDepth();
            System.out.println(nBits);
        }

    The output is 32 if I set my screen color to 32 bits, 24 if... 24, ...
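An alternative that doesn't require enumerating screens is to ask the default Toolkit for its ColorModel. The helper below is my own sketch, not from the original post; it returns -1 when no display is available instead of throwing:

```java
import java.awt.GraphicsEnvironment;
import java.awt.Toolkit;

public class BitDepth {
    /** Bits per pixel of the default screen, or -1 when running headless. */
    public static int currentBitDepth() {
        if (GraphicsEnvironment.isHeadless()) {
            return -1;  // no display to query
        }
        return Toolkit.getDefaultToolkit().getColorModel().getPixelSize();
    }

    public static void main(String[] args) {
        System.out.println(currentBitDepth());
    }
}
```

On a desktop this typically prints 24 or 32, matching what the GraphicsDevice loop above reports.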

  • 10-bit color works in LR4 devel module but not in PS CS6

    Dear all,
my setup is an ATI FirePro V4900 connected to two monitors: an Eizo CG276 and an old second monitor. I would like to get the Eizo to display 10-bit color in PS CS6 under 64-bit Windows 7.
After setting everything up, I am at the point where Lightroom 4.4 displays the notorious ramp.psd file (see below) without banding in the develop module, so I assume 10-bit is working there. In Photoshop, however, I have had no success so far getting 10-bit to work.
    Here's what I've done so far:
    I set everything up as described here http://www.eizo.com/global/support/compatibility/photoshopcs6_nvidia_amd/index.html
    Summed up, I did:
    1) enable 10-bit mode in ATI driver and rebooted -> checked and this is still enabled after reboot
    2) open PS CS6 and enable "advanced" drawing mode and 30-bit display in the "performance" options and rebooted  -> checked and this is still enabled after reboot
    I verified that the monitor runs in 10-bit mode using Monitor Asset Manager 2.6 (http://www.entechtaiwan.com/util/moninfo.shtm). It states that Color bit depth is "10 bits per primary color", which is what I want.
    Then I tried to open the ATI 10-bit test file ramp.psd (http://www.amd.com/us/products/workstation/graphics/software/Pages/adobe-photoshop.aspx). In Photoshop it shows banding. As I've read that Lightroom 4's develop module uses 10-bit, I opened the file there and it had no banding.
    The only troubleshooting I could find for 10-bit colors in PS was this post http://www.dpreview.com/forums/post/40945907 however I do not have desaturation set in the color preferences of PS, hence it does not apply to my problem.
    Does anyone of you have any idea about what to try next?

    I forgot to mention: the Eizo is obviously connected to the V4900 via DisplayPort as that is the required connection type for 10-bit colors.
    PLEASE someone help me with some tip on this, I'm really stuck.

  • T500 with External Monitor Seems to Be in 8 bit Color

    I got a T500 for work running Windows 7 x64. I hooked up the computer to an external monitor via the VGA port (I've connected directly to the laptop and also through the Thinkpad dock).
Despite my best efforts the external monitor seems to render in 8-bit color mode (or maybe 16, but definitely not 32-bit). You can tell by the lack of color in the gradients. Things I've tried:
    Updating the video drivers
    Using the IBM monitor tool to set it to 32 bit
    Setting it to 32 bit in the Windows settings

    Just to close the loop on this, I've determined that the cause of my issue is the Intel Dynamic Platform and Thermal Framework software/driver. I've disabled in the BIOS for now, and the machine is working very well. Will look for an update from Lenovo to address more definitively.

  • Display seems like 16 bit color with slight flicker

When I turned my iMac on today, it's like the display color spectrum is at 16-bit color, and it has a flicker to it.
This happened all of a sudden.
    It's less than a year old.
    21" iMac
    Intel Core 2 Duo
    NVIDIA GeForce 9400
    VRAM 256MB
    Any ideas?

16-bit colour is the default maximum colour depth for RDP connections to a Windows terminal server. You can increase this by editing the local group policy object on the Windows machine.
Go to Start -> Run and type gpedit.msc.
Browse to Computer Configuration/Administrative Templates/Windows Components/Terminal Services and edit the "Limit maximum colour depth" policy. Set it to Enabled and select 24 bits. Click OK and close the Group Policy editor. You should now be able to get 24-bit colour over RDP connections.
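On editions without gpedit.msc, the same policy can be set via the Terminal Services policy key in the registry; treat the exact value meanings as an assumption to verify (here dword 4 is understood to select 24-bit):

```text
; Equivalent registry policy (save as a .reg file and import, then reconnect)
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services]
"ColorDepth"=dword:00000004
```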

  • Is the latest iMac monitor (21.5" and 27") using 8-bit color depth or 6-bit dithering?

So far, there hasn't been an official answer. "Supports millions of colors" could be either one.

My question is more about the hardware of that iMac monitor than the software setting (32-bit).
I noticed a similar discussion from a few years back:
https://discussions.apple.com/message/10643776#10643776
So it seems that Apple is not using 8-bit color yet on the iMac?

  • How do I export from Lightroom to Photoshop 2014 using 8-bit color?

In LR, when I right-click in the image area, a menu appears. I choose Export and then Photoshop 2014. I specify "Edit with Lightroom adjustments". When the image appears in Photoshop it indeed has the adjustments, but the color depth is always 32-bit. It's become bothersome to then adjust the color depth down to 8-bit, and I am wondering if there is a hidden export setting that will allow me to export the image as 8-bit.

You can set that in Lightroom's preferences; see the screen capture.

  • Does Lightroom 4 support 10-bit color?

    Hoping to learn if Lightroom supports 10-bit color.  I know it supports ProPhotoRGB and 16-bit image files.  Thanks.

    oliver83 wrote:
Actually, the previous answers were only partially correct w.r.t. displaying 10-bit colors. In contrast to what was said, LR4 (I use version 4.4) supports 10-bit color, however only in the develop module. The library module does not seem to support it, which could be because 1) it really does not support it, 2) the library previews are created with lower color depth, or 3) something I haven't considered.
    I first read this somewhere and then personally verified it with ATI's ramp.psd test file on an Eizo CG276 (and a V4900 FirePro).
    The library previews are jpeg (8-bit). 

  • Setting a Transparent Color?

    Hey folks,
I'm working with Keynote for the first time this evening, and as a long-term Windows user I am having a bit of a hard time transitioning.
I am looking for a feature in Keynote that allows me to set a transparent color on an image (similar to the one in PowerPoint/Word).
Is there a way to do this? Namely, taking a picture with a white background, like an apple, and removing the white background.
^.^ I'm not totally against using another program to do this, but I'm not a huge fan of using something like Photoshop to do this with each picture I have.
    Any suggestions?
    Thanks!
    ~Matthew

    I have, in the past, used Keynote's masking feature to crop images. For example, the image of the MacBook Pro on this page,
http://web.mac.com/makentosh/iWeb/tipsfromtheiceberg/Blog/7A7B5816-296B-428E-9FC5-A183EA22E123.html
    was made from this image
    http://images.apple.com/macbookpro/gallery/images/macbookpro0120061024.jpg
    I used "Draw a Shape" to trace around the edges of the MBP and when I had it the way I liked it, masked the image to the shape. A small amount of touching up after masking was required, but the result doesn't look too bad.

  • 1,280 x 1,024 32-bit color video display adapter (true color) 128 MB or greater, Direct3D capable

    I want to run AutoCad 2009 and work in 3d. Which lenovo meets the minimum reqs.? i.e. Intel or AMD Dual Core processor, 2.0 GHz or greater •    2 GB RAM or greater  •    2 GB free hard disk available not including installation  •    1,280 x 1,024 32-bit color video display adapter (true color) 128 MB or greater, OpenGL®, or Direct3D® capable workstation class graphics card. For Windows Vista, a Direct3D-capable workstation-class graphics card with 128 MB or greater is required.
Which model is the least expensive with the minimum reqs.?
Clarification: I want to run AutoCad 2009 in 3d.
    Note from Moderator:  Please don't post the same message in multiple boards/threads as it splinters the discussion.  Duplicate(s) removed.
    Message Edited by JaneL on 02-08-2009 10:01 PM

    Any ThinkPad with discrete or switchable graphics will work...T400/500, R400/500, W500/700...make sure that you're not getting the one with Intel integrated GPU and you'll be all set...
    FWIW, even T61/p with discrete GPU will run  this with no sweat...
    Hope this helps.
    Cheers,
    George
    In daily use: R60F, R500F, T61, T410
    Collecting dust: T60
    Enjoying retirement: A31p, T42p,
    Non-ThinkPads: Panasonic CF-31 & CF-52, HP 8760W

  • 10 Bit Color From Displayport in 5.02

I noticed this was either fixed or added in the latest release. I have a Quadro 5000 with DisplayPort and HP ZR30w 10-bit color monitors with DisplayPort.
Is there anything I need to configure in Windows 7 or Premiere Pro to make sure I am seeing 10-bit color? 10-bit files look amazing and I see no banding, but I want to make sure I have it set up right.
    Thanks,
    Carlos Morillo

I have run several searches for DisplayPort, Display Port, and Display-Port and have found nothing. The searches I ran for 10 Bit Color, 10-Bit Color, and High Depth Color all returned results that seem to be specific to the AVI v210 codec in Desktop Mode and/or SDI. There are many other file formats that handle greater than 8-bit color, and I would like to make sure I am actually seeing that through my monitor. I just switched to the Quadro and the HP 10-bit display and I want to know that it's set up right. If I am missing something from the Premiere Pro CS5 PDF file, could someone please point me to the right page? I ran a search for just the word "color" and still found no answer. According to the notes on 5.02, 10-bit color through DisplayPort works.
Is it just automatic when you have a 10-bit sequence and 10-bit or greater files? I know you can render using High Depth Color and Maximum Quality, but I don't think that's the same as actually seeing 10-bit color on screen, since those options are available to you even when you are on an 8-bit sequence with 8-bit files.
    Thanks,

  • Will Photoshop CS6 display 30-bit-color with two GPUs?

    Hello there!
I am thinking about purchasing a small Nvidia Quadro, which supports 30-bit color, since it's not available on my current GTX 780 Ti running on Windows 7 Ultimate x64.
It is not a big deal to set the Quadro, once installed, as the GPU responsible for color and OpenCL acceleration...
BUT:
as long as the 780 Ti has around 999999x more computing power than the small Quadro (around 250 €), it would be nice to profit from this as well, at the same time.
So: is there a way to use the Quadro purely for "displaying and color" and the 780 Ti for accelerating tasks in Photoshop CS6?
I'm afraid not...
but please cheer me up.
    Thanks in advance!

    Thank you for the quick response!
Unfortunately, that's what I expected to hear.
It's a shame that Nvidia won't enable the 30-bit feature under Windows or Mac until you spend a few thousand euros on a K5000 or K6000.
I hope that there will be a workaround for that soon (maybe some tweaked Quadro drivers, as I read a few years ago).
    Thanks so far!

  • 16-bit color depth for gradients?

    Hello,
I have created gradient backgrounds inside of InDesign CS6 but see some color banding in the offset printer's color proofs. Is there a setting/option to make InDesign generate its color gradients at 16-bit color depth?
    Thanks in advance.
    Jeffrey

Then open and rasterize in Photoshop, where you can specify both the resolution and bit depth.
Jeffrey, it seems like that should work, but if you look closely at the histograms, it doesn't look like you get 16 bits of gray levels. Here's a black-to-white blend exported to PDF/X-4 and opened as 8- and 16-bit CMYK. The black channel histograms are the same.
If there were more than 256 levels of gray in the 16-bit version, I should be able to make a correction without gaps showing in the histogram, but that doesn't happen.
If I make the black channel blend in Photoshop, I can make the same correction without gaps.

  • 16 bits color?

A client complained about the colors in a movie. As it turns out, they see the boundaries of mc's and text fields (in text fields it's the rendering area of the text, not the complete text field) in a slightly lighter tint than the background. I tried to replicate it and had to set my display to 16 bits.
Solution? Limit the color scheme to web-safe?
How 'common' is 16-bit color? Isn't this something from the middle ages?

Web safe probably won't help (it may be slightly better, or may be worse).
The problem is a bug in the player: it renders solid colors differently to image/gradient fills (and in some other cases). It also uses different rendering when transparency is involved.
If the problem is due to a solid color fill being next to a gradient/image color, then you can solve the problem by using a gradient with only a single color in it for the solid fill.
Apart from the fact that web-safe colors are not web-safe at all (apart from a handful, like a handful of blues and the primary colors), it really won't make a significant difference (the web-safeness of the color is not the issue here).
Just have a think on this for a moment... No matter WHAT color system you use on a screen (8-bit, 16-bit, 24-bit, 32-bit), if the PC is trying to output a given RGB color, it will display the same everywhere on that screen. It may be different from what some other computer would show, but it would be consistent across the screen. So the problem is that Flash really is trying to display different colors on different pixels where it should be trying to display the same color (if it weren't for the bugs in how it renders). Now, changing what color is there may increase or lessen the visible difference, but it is unlikely to help (unless you do something like use pure black or pure white, etc., which I think Flash manages to get right).
NOTE: There is a technote that tries to explain away why this happens so it doesn't look like a bug (technotes are good at that). But it really is a bug.
Jeckyl
