
I want to run AutoCAD 2009 and work in 3D. Which Lenovo meets the minimum requirements? i.e. Intel or AMD dual-core processor, 2.0 GHz or greater  •    2 GB RAM or greater  •    2 GB free hard disk space, not including installation  •    1,280 x 1,024 32-bit color video display adapter (true color), 128 MB or greater, OpenGL®, or Direct3D® capable workstation-class graphics card. For Windows Vista, a Direct3D-capable workstation-class graphics card with 128 MB or greater is required.
Which model is the least expensive that meets the minimum requirements?
Clarification: I want to run AutoCAD 2009 in 3D.
Note from Moderator:  Please don't post the same message in multiple boards/threads as it splinters the discussion.  Duplicate(s) removed.
Message Edited by JaneL on 02-08-2009 10:01 PM

Any ThinkPad with discrete or switchable graphics will work: T400/500, R400/500, W500/700. Just make sure you're not getting the configuration with the Intel integrated GPU and you'll be all set.
FWIW, even a T61/p with a discrete GPU will run this with no sweat.
Hope this helps.
Cheers,
George
In daily use: R60F, R500F, T61, T410
Collecting dust: T60
Enjoying retirement: A31p, T42p,
Non-ThinkPads: Panasonic CF-31 & CF-52, HP 8760W

Similar Messages

  • How do I Convert a  Tiff image to a jpeg without being FORCED to 8-bit Color?

    I am an artist, and I have high-quality TIFF images. When I convert the TIFFs to JPEG, it forces me into 8-bit color automatically. (Forget about 32-bit; it will not let me save that as JPEG at all.) The only way I can get back to 16-bit color is to take the already-compressed file and convert it back up to 16-bit, which makes no sense. Once the JPEG is compressed, how in the world is it supposed to convert back up? And even though it says the file was converted to 16-bit, the metadata still lists it as 8-bit.
    On top of all that confusion, one picture, for example, when supposedly converted to 16-bit, gets much brighter than even the original TIFF. It looks good on one hand and overexposed on the other. I assume that is Photoshop throwing in fake data, am I right?
    Am I wasting my time with this imaginary 16-bit conversion?
    Is there any way to take the original TIFF and convert it to a 16-bit JPEG without the default 8-bit? I have been trying all kinds of things. I even asked my web guy, and he says 8-bit is unacceptable for printing, even for the web.
    Could this have anything to do with my computer and scanner?
    I have an iMac, OS X 10.8.3 (3.2 GHz), with 8 GB of memory.
    I also have an Epson Expression 10000XL graphic-arts scanner capable of scanning at 48-bit color.
    This color stuff really matters! I have fine-art files, and I am already losing so much quality with the JPEG conversion (which I am required to do for SmugMug, in addition to compressing all my files to 50 MB or under).
    Anyone who knows anything that could help would be much appreciated.
    Aloha,
    -Melissa

    First of all, JPEG is 8-bit only; there is no way to save a 16- or 32-bit JPEG, it just does not exist. Secondly, people print in 8-bit all the time, and most if not all web graphics are 8-bit, since that is the only way to view them: there are no 16-bit or 32-bit consumer monitors, and all but a few pro monitors are 8-bit panels.
    If you care about the color gamut and want the full range that 16- and 32-bit provide, then why JPEG? JPEG by its nature throws out color information just to compress; that's why it is popular on the web: because of its small file size, not its quality. If you need 16- or 32-bit for anything, it must be in a format that supports that color depth.
    That being said, an 8-bit JPEG will still display 16+ million colors: 256 shades of red, 256 shades of green, and 256 shades of blue.
    Now here is where I think your bit information is off: a JPEG is a 24-bit image that carries 8 bits of red, 8 bits of green, and 8 bits of blue.
    The 8, 16, and 32 figures are per channel, not total color information.
    If the overall image were 8 bits total, it would be grayscale.
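
To put numbers on the per-channel explanation above, here is a small Python sketch (the function names are just illustrative):

```python
# Color counts at a given bit depth. A "24-bit" JPEG is really
# 8 bits per channel across red, green, and blue.

def shades_per_channel(bits):
    """Distinct values one channel can hold at the given bit depth."""
    return 2 ** bits

def total_colors(bits_per_channel, channels=3):
    """Distinct colors across all channels combined."""
    return shades_per_channel(bits_per_channel) ** channels

print(shades_per_channel(8))   # 256 shades of red (or green, or blue)
print(total_colors(8))         # 16777216 -- the "16+ million colors" of 8-bit
```

The same arithmetic explains the 6-bit laptop panels discussed further down: 2^6 = 64 shades per channel, or 64^3 = 262,144 total colors.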

  • Does anyone know how to change my 27"iMac to 32 or 24 bit color resolution to allow me to use Citrix Receiver?

    Does anyone know how to change my 27"iMac to 32 or 24 bit color resolution to allow me to use Citrix Receiver?


  • T500 with External Monitor Seems to Be in 8 bit Color

    I got a T500 for work running Windows 7 x64. I hooked the computer up to an external monitor via the VGA port (I've connected directly to the laptop and also through the ThinkPad dock).
    Despite my best efforts, the external monitor seems to render in 8-bit color mode (or maybe 16-bit, but definitely not 32-bit); you can tell by the banding in the gradients. Things I've tried:
    Updating the video drivers
    Using the IBM monitor tool to set it to 32-bit
    Setting it to 32-bit in the Windows settings

    Just to close the loop on this: I've determined that the cause of my issue is the Intel Dynamic Platform and Thermal Framework software/driver. I've disabled it in the BIOS for now, and the machine is working very well. I will look for an update from Lenovo that addresses this more definitively.

  • Same 6 bit color? I bet it is.

    Sadly, all we have for color depth in the tech specs is the same old 'support for millions of colors' which we know in the past to have been a total lie. All the laptop displays have been 6 bit color, 262,144 colors, period. A great article on 6 bit versus 8 bit color can be found at:
    http://compreviews.about.com/od/multimedia/a/LCDColor.htm
    The tech specs for the new laptops don't indicate any improvement.
    And yes, the displays use dithering to fake millions of colors. Yes, all displays use mixing of RGB colors to fool the eye into seeing millions of colors. And yes, not one laptop display I am aware of, including PCs, has adequate color depth to allow any real color matching for professional use. Use a high end external display for color matching purposes. CRTs are still considered adequate if not superior for this purpose.

    Umm...
    Every laptop has a 6-bit panel. There aren't any 8-bit laptop panels that Apple could use, in fact, except for some very new 17" ones.
    The thing is, 8-bit panels use more power and are regarded by most of the industry as overkill for laptops.
    Anyway, I repeat: there are no 13" 8-bit panels in existence.
    You want 8-bit color? Plug in a Cinema HD or a Dell VA or IPS panel.

  • Is my Cinema Display running in 32-bit color mode?

    I'm calibrating my Cinema Display and have read that I should check to see that it is running in full 24- or 32-bit color mode. Can anyone tell me how to do that? Thanks.

    Open the Displays pane of System Preferences. If you're given the option to change the monitor's bitdepth, choose at least millions of colors.

  • Display seems like 16 bit color with slight flicker

    When I turned my iMac on today, the display looks like it's in 16-bit color and has a flicker to it.
    This happened all of a sudden.
    It's less than a year old.
    21" iMac
    Intel Core 2 Duo
    NVIDIA GeForce 9400
    VRAM 256MB
    Any ideas?

    16-bit colour is the default maximum colour depth for RDP connections to a Windows terminal server. You can increase this by editing the local Group Policy object on the Windows machine.
    Go to Start -> Run and type gpedit.msc.
    Browse to Computer Configuration/Administrative Templates/Windows Components/Terminal Services and edit the "Limit maximum colour depth" setting. Set it to Enabled and select 24-bit, click OK, and close the Group Policy editor. You should now be able to get 24-bit colour over RDP connections.

  • How can I watch video encoded with 10-bit color depth correctly via Adobe Flash Player?

    While watching video on a site, one of the videos said it was encoded with 10-bit color depth and required Flash Player 10.2+ and a non-Trident browser to play correctly. But with the latest available Flash Player installed and using Firefox, why do I still see green vertical bands over the video, which is a sign of improper rendering? How can I fix it? Platform: Android 2.3.6 with a Texas Instruments OMAP4430.

    http://forums.adobe.com/message/6115370#6115370


  • Is the latest iMac monitor (21.5" and 27") using 8-bit color depth or 6-bit dithering?

    So far there hasn't been an official answer; "supports millions of colors" could mean either one.

    My question is more about the hardware of the iMac monitor than the software setting (32-bit).
    I noticed a similar discussion a few years back:
    https://discussions.apple.com/message/10643776#10643776
    So it seems that Apple is not yet using 8-bit panels on the iMac?

  • How do I export from Lightroom to Photoshop 2014 using 8-bit color?

    In LR, when I right-click in the image area, a menu appears. I choose Export and then Photoshop 2014, and specify "Edit with Lightroom adjustments." When the image appears in Photoshop it indeed has the adjustments, but the color depth is always 32-bit. It's become bothersome to then adjust the color depth down to 8-bit, and I am wondering if there is a hidden export setting that will let me export the image as 8-bit.

    You can set that in Lightroom's preferences, under External Editing (the Bit Depth option).

  • Rendering as 10-bit/12-bit color (JPEG 2000?)

    I'm not trying to create a digital cinema package (DCP) per se, but I've got a few questions related to rendering in bit depths greater than 8-bit.
    NOTE:  The digital cinema standard (PDF link, see page 31) for package distribution is JPEG 2000 files with 12-bit color (in XYZ color space)...packaged into the MXF container format.
    1.  I'm wondering if it's possible to render to any 10-bit or 12-bit color format within After Effects.  Let's assume my source is a 16bpc or 32bpc comp, and I render it to the QuickTime container and select JPEG2000 or one of the other variants.  None of them seems to go above "millions of colors", or 8-bit.  (The one that has the option of "millions of colors plus" still only renders to planar 8-bit [YUV 4:2:0] when I inspect its streams and metadata.)
    2.  In the QuickTime container list, what are "Motion JPEG-A" and "Motion JPEG-B"?  These aren't standards with which I'm familiar, and I can't seem to find any detail in the documentation as to what these actually are.  (In all my tests, they're limited to 8-bit color.)
    3.  Is the JPEG 2000 codec that's available via QuickTime the same JPEG 2000 codec that's literally the full ISO/IEC 15444 or SMPTE 429-4 standard, or some crippled bits-and-pieces version?
    Obviously, I can render to TIFF or OpenEXR sequences in 16bpc or full float...I was just wondering if it was possible to get 10-bit or 12-bit color in a standard container via After Effects CC or Premiere Pro CC (via Media Encoder CC). 
    I see the "render at maximum bit depth" option in Premiere Pro, but I've never found a format/container that would output anything higher than 8-bit color...even with 16bpc or 32bpc input media.
    Thanks for your help and insight.

    If you want higher-bit-depth J2K, you have to render to image sequences. The baseline QuickTime implementation is from the stone age. Perhaps there's some commercial third-party codec out there, or if you have the hardware you could use Blackmagic's, but off the bat there is nothing usable in QT as far as I know.
    Mylenium

  • Does Lightroom 4 support 10-bit color?

    Hoping to learn if Lightroom supports 10-bit color.  I know it supports ProPhotoRGB and 16-bit image files.  Thanks.

    oliver83 wrote:
    Actually, the previous answers were only partially correct w.r.t. displaying 10-bit color. In contrast to what was said, LR4 (I use version 4.4) supports 10-bit color, but only in the Develop module. The Library module does not seem to support it, which could be because 1) it really does not support it, 2) the library previews are created with a lower color depth, or 3) something I haven't considered.
    I first read this somewhere and then personally verified it with ATI's ramp.psd test file on an Eizo CG276 (and a FirePro V4900).
    The library previews are JPEG (8-bit).

  • 16 bit color in PHOTOSHOP ELEMENTS

    I get an error when using some of my tools (for example, the Healing Brush) in Full Edit that states "16-bit color not supported. Convert to 8-bit color?"
    Will Photoshop Elements 9 work with 16-bit color? Considering that most cameras, especially DSLRs, and scanners now use 16-bit color, I hope so.

    Some features of PSE will work on 16-bit images; others will not, and for those you have to convert the image to 8 bits. I don't have PSE 9, but my understanding is that if a feature didn't work on 16-bit images in earlier versions, it won't work on them in the current version (PSE 9).
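
For what it's worth, the 16-to-8-bit conversion itself is just a rescaling of each channel value from the 0-65535 range down to 0-255. A minimal Python sketch of that mapping (not PSE's actual implementation, which may round differently):

```python
def to_8bit(value16):
    """Map a 16-bit channel value (0-65535) to 8 bits (0-255).

    Dividing by 257 hits both endpoints exactly: 0 -> 0, 65535 -> 255.
    """
    return round(value16 / 257)

print(to_8bit(0))      # 0
print(to_8bit(65535))  # 255
```

What the conversion loses is fine tonal gradation, which is why it's best to do heavy edits while the file is still 16-bit and drop to 8-bit only at the end.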

  • Working with 32 bit color HDRs in Lightroom

    The feature of being able to work with 32 bit color Tiff files in Lightroom is awesome and wonderful.  However, I note that when I do not follow the recommended workflow of bringing the 32 bit file directly back into Lightroom to work on it - working on it with Adobe Camera Raw in Photoshop first - the results are somewhat confusing. I'm fine with following the recommended workflow except for the fact that I seem to be able to come up with even better results quite quickly using ACR.  ACR appears to give me a greater tonal range to work with, but when I save the file as a Tiff and take it back to Lightroom the colors aren't displayed correctly by Lightroom.  There is a huge difference between viewing the file in Photoshop and viewing it in LR. If I work on the file immediately in LR, avoiding doing anything in PS, the files look identical in both programs, but at the apparent cost of losing a lot of exposure range to play with.  Am I correct in inferring that LR is working on the 32 bit colors using a 16 bit color depth conversion in the tools? 

    https://dl.dropboxusercontent.com/u/106834366/Redux%20Vidz.aep
    This is kind of a project-ruining issue not being able to render it out in 32-bit -- I'm working with a lot of gradients and lens blurs.
    Anyone have any hints of why this might be happening?
