Nvidia's Faulty GPUs

After reading this article:
http://www.engadget.com/2008/07/31/figuring-out-which-nvidia-gpus-are-defective-its-a-lot/
It appears that the 8600M and 8400M GPUs are faulty and are having overheating issues. I was considering buying a MacBook Pro today, but after reading this I was worried and decided to look further into the problem. I was curious whether anyone is experiencing problems with their 8600M, and whether anyone can provide any insight for me on this issue.
Thanks

My 6 month old 2.6 MacBook Pro is starting to exhibit symptoms. Actually it did a few weird things early on but then it stopped. As I need it to earn my living (3D graphics professional) I wasn't in a hurry to lose it for a couple of weeks anyway as my fallback machine is a 5 year old TiBook which isn't up to the work I do.
Now it's getting worse, with all sorts of weird screen artifacts going on, and I'll have to back it up and take it in soon before it dies on me altogether. I am worried, though, as all they can do is replace it with the same chip, which will also fail in time, and then the laptop will be out of warranty unless I spend £300 UK (or $600 US) on AppleCare. I'm not happy about that, as I don't see why I should pay extra and lose work because Nvidia's cards are dodgy.

Similar Messages

  • HP Pavilion dv2799ea special edition - faulty NVIDIA graphics chip!

    My HP laptop has stopped working as a result of a faulty NVIDIA graphics chip. The HP Pavilion dv2799ea is now unusable unless I replace the motherboard. I am horrified to discover that lots of other models with the same problem have been granted a free replacement motherboard, but my model is not on HP's list of products they are prepared to repair, so apparently I am not eligible for the 2-year extended warranty and repair.
    My laptop is less than 18 months old, cost £1000 and I now have to pay out £300 to have it fixed!
    This has got to be the worst experience I have had with any product or company in my whole life.
    How come HP will replace other models with the same problem but not my version?
    This is apparently a known issue across the world, and NVIDIA have admitted they fitted models with this faulty chip, so surely it is HP's responsibility to repair any laptop that has this fault?
    I bought this expensive equipment from HP in good faith, trusting that they would supply a product that would be reliable and of excellent quality. Quality worth spending £1000 on... What did I get? A low-end budget laptop that is faulty just outside the year's warranty, and an expensive repair bill!
    I wish I had only bought a cheap one. The blow would not be so bad.
    I warn anyone out there who thinks that spending a lot of money with HP will buy you quality and durability: think again! HP are obviously happy for their customers to be out of pocket and apparently are not concerned that those customers would not buy from them again. I could now never recommend their products to anyone; their products are not worth the money, and the after-sales service is even worse!
    If anyone has any ideas as to how I could get this problem fixed without it costing me £300 I would be grateful. I cannot afford to get it repaired right now so will be without it until such time as I can get the funds together.
    When I bought this 'special edition' I did not realise at the time quite how 'special' it was going to end up being.
    I would also be interested to hear from anyone who has the same model and has experienced the same issue as myself.
    Thanks
    Message Edited by Emmaw on 11-18-2009 06:41 AM

    Reflow video at YouTube:
    http://www.youtube.com/watch?v=vR8L3B3eDr0

  • Laptops with inherited defective GPUs from Nvidia

    I came across this website while googling about Nvidia's faulty cards:
    http://www.nvidiadefect.com/
    By reading a few of the links, one could imagine almost every Nvidia chip supplied is defective:
    http://www.nvidiadefect.com/evidence-with-links-to-external-sites-t10.html
    So if you are having trouble with a laptop with an Nvidia chip, the website above is worth checking.
    I unfortunately bought a Lenovo laptop with a defective GPU, and I have started the process of getting a refund or repair from the retailer who sold me the product with the defective chip.
    I sent them an email explaining the situation. They agreed to look into the matter and asked me to send an 'engineer's report', which at least means they didn't flatly refuse with the popular phrase: "Sorry, we can't do anything because your laptop is out of warranty"!
    best wishes
    Note from Moderator:  A comment which violated the forum rules was removed.

    MSI is doing a great job delivering all its gear with the new NV GPUs to the market; even the media reviews have come out remarkably quickly.
    Wondering what brilliant news is coming next?

  • Nvidia 8600 GPU failure no longer covered?

    Is anyone else getting denied coverage on Nvidia GPU failures? I went into the Apple Store today, and they confirmed an Nvidia failure, but said it was not covered anymore. Does anyone else get confirmation but no coverage, even though they got $200 million from Nvidia?

    There are many comments about the NVIDIA GPU issue and Apple not replacing the GPU.  I will say I had good success in getting Apple to replace mine in my MacBook Pro. When my computer suddenly would not fully turn on, I researched the issue to find out the possible reasons and then somewhat dawdled in taking it in. It was 2 weeks past the 4-year GPU warranty window. But I had had problems for some time before it wouldn't fully turn on, such as geometric designs appearing on the screen and the screen going dark, and those issues did occur at least a month or more before my warranty ended. I had no idea what those issues meant and simply ignored them until the computer wouldn't fully turn on.
    The good folks at the Apple Store in Austin initially said no luck; yes, it's the GPU, but it's two weeks past the warranty. But then they kindly relented, because clearly the symptoms I described to them that occurred during the warranty were caused by the faulty GPU. How would I have known this was the issue?  They replaced the logic board at no cost. I've seen comments here that a Genius said the problem was the logic board, not the GPU, and therefore Apple would not make the repair. I was told by my Genius that the GPU is part of the logic board, and therefore the logic board needed replacing. This was a $526 repair that I was grateful Apple took care of. I would not have made the repair and would have gone out and bought a new non-Apple computer with a warranty.
    When I got my computer back, I checked it before I left the Apple Store and it was running slow. I ended up paying Apple to replace the hard drive, so ultimately I did pay for an Apple repair. But I'm a happy rather than disappointed Apple customer because of the free GPU repair. Happy, devoted customers have made Apple a great company, and it should be worth it to the company to fix the faulty GPUs that are close to warranty to keep customers and keep them happy.

  • Which nvidia driver?? [Solved]

    Hi all,
    I am sorry, I am not able to understand the beginners guide regarding installation of nvidia drivers.
    I have X and kdemod3 running with the nv driver, but it is slow.
    My card is: nVidia Corporation NV44A [GeForce 6200] (rev a1)
    On beginners guide I read:
    NVIDIA Graphics Cards
    The NVIDIA proprietary drivers are generally considered to be of good quality, and offer good 3D performance, whereas the open source nv driver offers only 2d support at this time and the new nouveau driver offers only experimental 3D acceleration.
    Before you configure your Graphics Card you will need to know which driver fits. Arch currently has several different driver packages that each match a certain subset of Cards:
    1. nvidia-96xx slightly newer cards up to the GF 4.
    2. nvidia-173xx Geforce FX series cards
    3. nvidia newest GPUs after the GF FX
    I am not able to identify my card as belonging to any of the series mentioned. I do not have any knowledge of the chronology of the graphics card series, so none of this makes sense to me.
    I looked at the Nvidia page, which in itself does not get me any further, but it gives me a link to the Nvidia download page, where I should be able to sort out which is which.
    There I initially guessed and later confirmed that my card belongs to the 6 series. I am given the 195.36.15 driver, which is also listed as supporting my 6200 card.
    But I don't recognize that among the options given in the beginners guide. Arch Linux package-wise, I still don't know what to do.
    I can only find 195 in the AUR, where it is called beta.
    Is it this one I need to install?
    My card is several years old, so I wonder if there is not a driver in the official repos?
    Thank you for clarifying this for me!
    Last edited by slot (2010-03-27 17:51:01)

    Take the newest,
    you will find a compatibility list here: http://www.nvidia.com/object/linux_disp … 36.15.html
    under "supported products"
    Greets
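
    For anyone else stuck on the same mapping, the lookup can be scripted. This is a minimal sketch only: the sample `lspci` line below is assumed for illustration (on a real machine substitute the output of `lspci | grep -i vga`), and the three package names are the ones from the beginners guide quoted above.

```shell
# Assumed sample line; on a real system use: line=$(lspci | grep -i vga)
line='01:00.0 VGA compatible controller: nVidia Corporation NV44A [GeForce 6200] (rev a1)'

# Map the reported chip generation onto the Arch driver packages
case "$line" in
  *'GeForce 2'*|*GeForce2*|*'GeForce 4'*|*GeForce4*) pkg=nvidia-96xx  ;;  # up to GF 4
  *'GeForce FX'*)                                    pkg=nvidia-173xx ;;  # GF FX series
  *)                                                 pkg=nvidia       ;;  # GF 6 and newer
esac

echo "$pkg"
```

    For the GeForce 6200 above this selects the plain `nvidia` package, which could then be installed with `pacman -S "$pkg"`.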

  • ATI Primary and Nvidia Secondary for Hardware MPE Acceleration

    Hi everyone,
    I'm not sure if this has been discovered yet. I think it is very exciting, and very important for anyone with an AMD (ATI) GPU who wants hardware MPE acceleration.
    It is possible to use Hardware MPE acceleration while using an ATI video card as your primary adapter, and a lesser CUDA Nvidia GPU as a secondary adapter not connected to any monitor.
    My system:
    CPU: 1090T
    Mobo: 890GX
    RAM: 8 1333
    RAID: No
    GPU1: 5870
    GPU2: GTS 450
    As you can see, I have an Nvidia and an AMD GPU in the same system. The 5870 is obviously by far the more powerful of the two, and it is what I use to record rendered footage using FRAPS.
    Recently, I became aware of the powers of hardware MPE. I concluded that the best way to obtain HMPE and maintain my FRAPS recording was to purchase a GTX 480. However, this was out of my wallet's league, as I could not sell the 5870.
    I was already aware that PhysX (A CUDA physics calculation library) could only be run on Nvidia CUDA GPUs (Like HMPE). Many Nvidia card users used secondary CUDA cards to accelerate physics calculation in games. ATI card users could not use a secondary Nvidia card for physics calculation as the Nvidia driver locked down PhysX in the presence of an active ATI GPU. Luckily a clever fellow called GenL managed to hack the Nvidia drivers to force PhysX to work in the presence of an ATI GPU.
    I hypothesised that if I performed that hack, HMPE would gain access to CUDA in a similar fashion to PhysX, thus allowing me to buy a far cheaper GTS 450 and pair it as an HMPE renderer with my 5870. After buying a GTS 450, I failed at implementing the hack and was about to give up.
    HMPE worked when my monitor was connected to the GTS 450, but if I tried to start PPro with the 5870 connected to any monitor, HMPE was unavailable.
    I had two monitors connected to my GTS 450, and was playing around with adding stupid amounts of HMPE accelerated effects to an AVCHD clip. Realising that it was impractical to constantly switch the DVI cable from 5870 to GTS 450 I decided to leave my primary monitor connected to the 5870 and give up on HMPE. So, I reached around behind my computer and did it, but crucially did not quit PPro before I did so.
    When the screen flickered back to life, the yellow HMPE preview bar was still yellow. The timeline still scrubbed perfectly smoothly. HMPE was still working with a 5870 as the primary monitor: The PPro window was on the 5870 monitor, and the 5870 was rendering the window!
    I found that provided I did not close PPro, I could switch between HMPE and SMPE at will, all while using the 5870 as the primary adapter.
    I tested this using a 10-second composition of 3 AVCHD 1920x1080 clips with CC, drop shadow, Gaussian blur, edge feather, Basic 3D, transform, Ultra Key, and drop shadow applied, rotating amongst each other. I could still switch even if the 5870 was the only card connected to a monitor.
    Rendering this test clip via PPro direct export takes 30 seconds in HMPE mode with the 5870 and 1:43 in SMPE mode with the 5870.
    However: rendering performance in AME stays the same whether I selected HMPE or SMPE. I believe this is because AME is a separate application that 're-detects' the ATI card and disables HMPE before beginning the encode, in the same manner that restarting PPro while using the 5870 removes the HMPE option. Rendering the clip in SMPE and HMPE modes using the GTS 450 gave the same 30-second vs 1:43 result.
    Therefore, as long as you are happy to encode via direct PPro export you will still see the benefit of HMPE while using an AMD card as the primary adapter.
    I hope this is as terribly exciting to other users of ATI cards as it was for me. This has saved me several hundred dollars.
    Cheers,
    NS2HD

    Interesting results. I own a system manufactured by BOXX, a system developer out of Texas who really knows their stuff. I had asked them if it would be possible to purchase a CUDA enabled card and put it in my secondary slot and use it for MPE while maintaining my current (nvidia) card to run my monitors (also giving me the ability to run four screens). They said that no, according to the Adobe developers they were working with, Premiere could only use MPE off the CUDA card if the monitor previewing your work was plugged into that card. I guess they were wrong!
    Also, from my understanding, you don't see lesser results with AME because it's a separate program that starts separately, you see the lesser results because it has not yet been coded to take advantage of CUDA.

  • Which display driver package do I need for Nvidia Quadro NVS 110m?

    Hi there,
    I'm new to (Arch) Linux and was wondering which display driver package I need. I have a Dell Latitude D620 with a "nvidia quadro nvs 110m".
    In the Arch Linux beginners guide it says:
    1. nvidia-96xx slightly newer cards up to the GF 4.
    2. nvidia-173xx Geforce FX series cards
    3. nvidia newest GPUs after the GF FX
    Consult the NVIDIA-Homepage to see which one is for you.
    But on the NVIDIA-Homepage it says:
    The GeForce M series and GeForce Go series notebook GPUs use drivers that have been customized by the notebook manufacturers to support hot key functions, power management functions, lid close and suspend/resume behavior. NVIDIA has worked with some notebook manufacturers to provide notebook-specific driver updates, however, most notebook driver updates must come from the notebook manufacturer.
    And they just forward me to Dell Support, which obviously offers neither a Linux driver nor any Linux driver information for the D620.
    So which of the three packages do I have to install?
    Are these packages open-source drivers? Do I need proprietary nvidia drivers to get full 3D-performance?
    Does anybody have experience with the D620?
    Although I'm new to Linux and I like the KISS idea behind Arch Linux, I still want to have all that Compiz Fusion eye candy. I also want to play a few games which need 3D performance (once I'm familiar with wine). ;-)
    Thanks in advance for your help,
    Blackhole
    Last edited by blackhole (2008-12-14 10:22:05)

    Thanks a lot.
    Final question: how does Arch Linux deal with license issues? If the nvidia package includes the original Nvidia drivers linked above, doesn't that cause any legal/moral problems? In Ubuntu, for example, they always point out that it's proprietary software you're going to install when enabling the Nvidia drivers.
    Just wondering...
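
    One way around the vague marketing names is to match on the PCI device ID instead: the proprietary driver's README lists supported products by their `vendor:device` ID. A hedged sketch (the sample line, including the `10de:01d7` ID in it, is assumed for illustration; on the D620 run `lspci -nn | grep -i vga` to get the real one):

```shell
# Assumed sample output; on the real machine: line=$(lspci -nn | grep -i vga)
line='01:00.0 VGA compatible controller [0300]: nVidia Corporation [Quadro NVS 110M] [10de:01d7] (rev a1)'

# Pull out the vendor:device ID (10de is NVIDIA's PCI vendor ID) to look up
# in the supported-products appendix of the driver README
id=$(printf '%s\n' "$line" | grep -o '10de:[0-9a-f]*')
echo "$id"
```

    Whichever package's README lists that ID is the one to install.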

  • Thinkstation D20 & Quad SLI Configuration using 2 X Nvidia N295GTX Problem.

    I got the new D20 with dual quad-core W5580 processors, 16GB RAM, and 2 SAS HDs.  I tried to install 2 IDENTICAL Nvidia N295GTX GPUs with their SLI connector; at the BIOS level, it gave me one long beep and 3 consecutive ones, and the monitor stayed blank. I checked the BIOS and enabled the SLI GPUs to communicate with each other through the SLI Flex connector. It still didn't work, bearing in mind that I correctly connected the GPU power cables (6- and 8-pin cables for each card). I upgraded the BIOS to the latest version. When I installed each card separately, they worked perfectly; the problem starts when I install them in a quad SLI configuration.

    SurferGuy - welcome to the forum!
    One long and three short beeps is a keyboard error.   Are you sure it isn't one long and two short beeps?   If so, this means your video cards are incompatible for some reason.   I don't know if this is a BIOS limitation or something else limiting the D20 from working with four GPU cores.   Since the D20 was originally intended to be used as a workstation, it's possible that dual GTX 295s were never tested in these systems, since these are desktop/gaming cards.
    This isn't caused by insufficient power.   If that were the case, the GTX 295s themselves would beep.   The PSU in the D20 is rated at 1080W and natively supports dual 6+8-pin PCIe power sockets.   1080W is plenty for dual GTX 295s at 289W each, as long as you aren't somehow running over 500W of other accessories.
    At this point I suggest calling service to see if they have any suggestions.   As far as I know, your cards aren't officially supported and may not even work at this point in time.   A BIOS update may change this in the future, but I cannot speak in more detail at this point.
    ThinkStation C20
    ThinkPad X1C · X220 · X60T · s30 · 600
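
    The power-budget claim in the reply above is easy to check with quick arithmetic, using the figures it quotes (1080W PSU, 289W per GTX 295):

```shell
psu=1080                      # D20 PSU rating, watts (per the reply)
card=289                      # GTX 295 power draw, watts (per the reply)
headroom=$((psu - 2 * card))  # watts left for everything else in the system
echo "$headroom"
```

    That leaves roughly 500W for the rest of the machine, which is why insufficient power is an unlikely explanation for the beep code.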

  • Apple, Mac Pro, Amd vs Nvidia with cuda? Why, and what to buy for Premiere Pro?

    The Mac Pro has dual AMD video cards installed, and does not offer Nvidia cards with CUDA in this line-up, as far as I can tell. The iMacs offer Nvidia cards, which seem a better choice for Premiere Pro, which uses the CUDA rendering capabilities. Which would be best for Premiere Pro? The quad Mac Pro, or the top-of-the-line iMac with double the RAM, say 32GB vs 16GB in the Mac Pro?  Why did Apple put AMD cards in the Mac Pro line?

    jtopenshaw wrote:
    Why did Apple put AMD cards in the Mac Pro line?
    Apple believes that OpenCL, not CUDA, is the future of GPGPU computing.  CUDA is closed and proprietary to nVidia only.  OpenCL is, as the name says: open.
    As of the creation and planning of the Mac Pro, nVidia's GPUs were (and most of them still are) very poor performers when it comes to OpenCL.  I expect that, over time, nVidia will catch up to AMD regarding OpenCL.  But they're not quite there yet, across the board.
    Why are the GPUs in the iMac (and Macbook Pro) nVidia?  They make better and more efficient mobile GPUs at this point in time.  The GPU in the iMac is similar to the GPU in the Macbook Pro in that they're both mobile, not desktop GPUs.  AMD has a fairly poor representation in that market (so far), so Apple stuck with nVidia.
    It's likely that Apple doesn't see the iMac and Macbook Pro lineup as devices to be used for massive GPGPU processing.  The Mac Pro, on the other hand is where they see that being used heavily.  Perhaps when and if AMD decides to produce an efficient and powerful mobile GPU, Apple will cut (back) over to them.  But one can only guess, there.
    Ugio wrote:
    What I can tell you is that the combination of the Mac Pro (2013) and the latest Adobe Suite is not worth the money.
    That's fair and is your opinion.  I don't agree. :-)

  • SLI with two Nvidia 8600 GTs

    I am having a problem configuring Xorg and the nvidia driver to get SLI working.
    Here are some files that might help:
    xorg.conf
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig: version 1.0 (buildmeister@builder58) Tue Jan 27 12:47:59 PST 2009
    Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    InputDevice "Keyboard0" "CoreKeyboard"
    InputDevice "Mouse0" "CorePointer"
    EndSection
    Section "Files"
    EndSection
    Section "Module"
    Load "dbe"
    Load "extmod"
    # Load "type1"
    Load "freetype"
    Load "glx"
    EndSection
    Section "InputDevice"
    # generated from default
    Identifier "Mouse0"
    Driver "mouse"
    Option "Protocol" "auto"
    Option "Device" "/dev/psaux"
    Option "Emulate3Buttons" "no"
    Option "ZAxisMapping" "4 5"
    EndSection
    Section "InputDevice"
    # generated from default
    Identifier "Keyboard0"
    Driver "kbd"
    EndSection
    Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Unknown"
    HorizSync 28.0 - 33.0
    VertRefresh 43.0 - 72.0
    Option "DPMS"
    EndSection
    Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BusID "01:00:00"
    EndSection
    Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    DefaultDepth 24
    Option "SLI" "On"
    SubSection "Display"
    Depth 24
    EndSubSection
    EndSection
    and /var/log/Xorg.0.log
    X.Org X Server 1.5.3
    Release Date: 5 November 2008
    X Protocol Version 11, Revision 0
    Build Operating System: Linux 2.6.27-ARCH x86_64
    Current Operating System: Linux BlackWater 2.6.28-ARCH #1 SMP PREEMPT Sun Jan 25 09:43:53 UTC 2009 x86_64
    Build Date: 17 December 2008 10:46:49PM
    Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    (==) Log file: "/var/log/Xorg.0.log", Time: Sat Feb 7 15:02:18 2009
    (==) Using config file: "/etc/X11/xorg.conf"
    (==) ServerLayout "Layout0"
    (**) |-->Screen "Screen0" (0)
    (**) | |-->Monitor "Monitor0"
    (**) | |-->Device "Device0"
    (**) |-->Input Device "Keyboard0"
    (**) |-->Input Device "Mouse0"
    (==) Automatically adding devices
    (==) Automatically enabling devices
    (==) FontPath set to:
    /usr/share/fonts/misc,
    /usr/share/fonts/100dpi:unscaled,
    /usr/share/fonts/75dpi:unscaled,
    /usr/share/fonts/TTF,
    /usr/share/fonts/Type1
    (==) ModulePath set to "/usr/lib/xorg/modules"
    (WW) AllowEmptyInput is on, devices using drivers 'kbd' or 'mouse' will be disabled.
    (WW) Disabling Keyboard0
    (WW) Disabling Mouse0
    (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
    (II) No APM support in BIOS or kernel
    (II) Loader magic: 0x7b54e0
    (II) Module ABI versions:
    X.Org ANSI C Emulation: 0.4
    X.Org Video Driver: 4.1
    X.Org XInput driver : 2.1
    X.Org Server Extension : 1.1
    X.Org Font Renderer : 0.6
    (II) Loader running on linux
    (++) using VT number 7
    (!!) More than one possible primary device found
    (--) PCI: (0@1:0:0) nVidia Corporation GeForce 8600GT rev 161, Mem @ 0xfa000000/16777216, 0xe0000000/268435456, 0xf8000000/33554432, I/O @ 0x0000ac00/128, BIOS @ 0x????????/131072
    (--) PCI: (0@4:0:0) nVidia Corporation GeForce 8600GT rev 161, Mem @ 0xf6000000/16777216, 0xd0000000/268435456, 0xf4000000/33554432, I/O @ 0x00008c00/128, BIOS @ 0x????????/131072
    (II) System resource ranges:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [5] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [6] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [7] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [8] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [9] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [10] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [11] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [12] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [13] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [14] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [15] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [16] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [17] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [18] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [19] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [20] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [21] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [22] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [23] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (II) "extmod" will be loaded. This was enabled by default and also specified in the config file.
    (II) "dbe" will be loaded. This was enabled by default and also specified in the config file.
    (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
    (II) "freetype" will be loaded. This was enabled by default and also specified in the config file.
    (II) "dri" will be loaded by default.
    (II) LoadModule: "dbe"
    (II) Loading /usr/lib/xorg/modules/extensions//libdbe.so
    (II) Module dbe: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    Module class: X.Org Server Extension
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension DOUBLE-BUFFER
    (II) LoadModule: "extmod"
    (II) Loading /usr/lib/xorg/modules/extensions//libextmod.so
    (II) Module extmod: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    Module class: X.Org Server Extension
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension SHAPE
    (II) Loading extension MIT-SUNDRY-NONSTANDARD
    (II) Loading extension BIG-REQUESTS
    (II) Loading extension SYNC
    (II) Loading extension MIT-SCREEN-SAVER
    (II) Loading extension XC-MISC
    (II) Loading extension XFree86-VidModeExtension
    (II) Loading extension XFree86-Misc
    (II) Loading extension XFree86-DGA
    (II) Loading extension DPMS
    (II) Loading extension TOG-CUP
    (II) Loading extension Extended-Visual-Information
    (II) Loading extension XVideo
    (II) Loading extension XVideo-MotionCompensation
    (II) Loading extension X-Resource
    (II) LoadModule: "freetype"
    (II) Loading /usr/lib/xorg/modules/fonts//libfreetype.so
    (II) Module freetype: vendor="X.Org Foundation & the After X-TT Project"
    compiled for 1.5.3, module version = 2.1.0
    Module class: X.Org Font Renderer
    ABI class: X.Org Font Renderer, version 0.6
    (II) Loading font FreeType
    (II) LoadModule: "glx"
    (II) Loading /usr/lib/xorg/modules/extensions//libglx.so
    (II) Module glx: vendor="NVIDIA Corporation"
    compiled for 4.0.2, module version = 1.0.0
    Module class: X.Org Server Extension
    (II) NVIDIA GLX Module 180.27 Tue Jan 27 12:43:19 PST 2009
    (II) Loading extension GLX
    (II) LoadModule: "dri"
    (II) Loading /usr/lib/xorg/modules/extensions//libdri.so
    (II) Module dri: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension XFree86-DRI
    (II) LoadModule: "nvidia"
    (II) Loading /usr/lib/xorg/modules/drivers//nvidia_drv.so
    (II) Module nvidia: vendor="NVIDIA Corporation"
    compiled for 4.0.2, module version = 1.0.0
    Module class: X.Org Video Driver
    (II) NVIDIA dlloader X Driver 180.27 Tue Jan 27 12:23:08 PST 2009
    (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    (II) Primary Device is:
    (II) Loading sub module "fb"
    (II) LoadModule: "fb"
    (II) Loading /usr/lib/xorg/modules//libfb.so
    (II) Module fb: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org ANSI C Emulation, version 0.4
    (II) Loading sub module "wfb"
    (II) LoadModule: "wfb"
    (II) Loading /usr/lib/xorg/modules//libwfb.so
    (II) Module wfb: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org ANSI C Emulation, version 0.4
    (II) Loading sub module "ramdac"
    (II) LoadModule: "ramdac"
    (II) Module "ramdac" already built-in
    (II) resource ranges after probing:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [5] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [6] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [7] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [8] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [9] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [10] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [11] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [12] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [13] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [14] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [15] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [16] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [17] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [18] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [19] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [20] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [21] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [22] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [23] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
    (==) NVIDIA(0): RGB weight 888
    (==) NVIDIA(0): Default visual is TrueColor
    (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    (**) NVIDIA(0): Option "SLI" "On"
    (**) NVIDIA(0): Enabling RENDER acceleration
    (**) NVIDIA(0): NVIDIA SLI auto-select rendering option.
    (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    (II) NVIDIA(0): enabled.
    (EE) NVIDIA(0): Failed to find a valid SLI configuration.
    (EE) NVIDIA(0): Invalid SLI configuration 1 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (EE) NVIDIA(0): Invalid SLI configuration 2 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (EE) NVIDIA(0): Invalid SLI configuration 3 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (WW) NVIDIA(0): Failed to find a valid SLI configuration for the NVIDIA
    (WW) NVIDIA(0): graphics device PCI:1:0:0. Please see Chapter 25:
    (WW) NVIDIA(0): Configuring SLI and Multi-GPU FrameRendering in the README
    (WW) NVIDIA(0): for troubleshooting suggestions.
    (EE) NVIDIA(0): Only one GPU will be used for this X screen.
    (II) NVIDIA(0): NVIDIA GPU GeForce 8600 GT (G84) at PCI:1:0:0 (GPU-0)
    (--) NVIDIA(0): Memory: 1048576 kBytes
    (--) NVIDIA(0): VideoBIOS: 60.84.41.00.25
    (II) NVIDIA(0): Detected PCI Express Link width: 16X
    (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    (--) NVIDIA(0): Connected display device(s) on GeForce 8600 GT at PCI:1:0:0:
    (--) NVIDIA(0): HP w1907 (DFP-0)
    (--) NVIDIA(0): HP w1907 (DFP-0): 330.0 MHz maximum pixel clock
    (--) NVIDIA(0): HP w1907 (DFP-0): Internal Dual Link TMDS
    (II) NVIDIA(0): Assigned Display Device: DFP-0
    (==) NVIDIA(0):
    (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
    (==) NVIDIA(0): will be used as the requested mode.
    (==) NVIDIA(0):
    (II) NVIDIA(0): Validated modes:
    (II) NVIDIA(0): "nvidia-auto-select"
    (II) NVIDIA(0): Virtual screen size determined to be 1440 x 900
    (--) NVIDIA(0): DPI set to (89, 87); computed from "UseEdidDpi" X config
    (--) NVIDIA(0): option
    (==) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
    (--) Depth 24 pixmap format is 32 bpp
    (II) do I need RAC? No, I don't.
    (II) resource ranges after preInit:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [5] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [6] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [7] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [8] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [9] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [10] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [11] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [12] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [13] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [14] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [15] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [16] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [17] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [18] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [19] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [20] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [21] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    [22] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [23] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (II) NVIDIA(GPU-1): NVIDIA GPU GeForce 8600 GT (G84) at PCI:4:0:0 (GPU-1)
    (--) NVIDIA(GPU-1): Memory: 1048576 kBytes
    (--) NVIDIA(GPU-1): VideoBIOS: 60.84.41.00.25
    (II) NVIDIA(GPU-1): Detected PCI Express Link width: 16X
    (--) NVIDIA(GPU-1): Interlaced video modes are supported on this GPU
    (--) NVIDIA(GPU-1): Connected display device(s) on GeForce 8600 GT at PCI:4:0:0:
    (II) NVIDIA(0): Initialized GPU GART.
    (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
    (II) NVIDIA(0): may not be running or the "AcpidSocketPath" X
    (II) NVIDIA(0): configuration option may not be set correctly. When the
    (II) NVIDIA(0): ACPI event daemon is available, the NVIDIA X driver will
    (II) NVIDIA(0): try to use it to receive ACPI event notifications. For
    (II) NVIDIA(0): details, please see the "ConnectToAcpid" and
    (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
    (II) NVIDIA(0): Config Options in the README.
    (II) NVIDIA(0): Setting mode "nvidia-auto-select"
    (II) Loading extension NV-GLX
    (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
    (==) NVIDIA(0): Disabling shared memory pixmaps
    (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
    (==) NVIDIA(0): Backing store disabled
    (==) NVIDIA(0): Silken mouse enabled
    (**) Option "dpms"
    (**) NVIDIA(0): DPMS enabled
    (II) Loading extension NV-CONTROL
    (II) Loading extension XINERAMA
    (==) RandR enabled
    (II) Initializing built-in extension MIT-SHM
    (II) Initializing built-in extension XInputExtension
    (II) Initializing built-in extension XTEST
    (II) Initializing built-in extension XKEYBOARD
    (II) Initializing built-in extension XC-APPGROUP
    (II) Initializing built-in extension SECURITY
    (II) Initializing built-in extension XINERAMA
    (II) Initializing built-in extension XFIXES
    (II) Initializing built-in extension RENDER
    (II) Initializing built-in extension RANDR
    (II) Initializing built-in extension COMPOSITE
    (II) Initializing built-in extension DAMAGE
    (II) Initializing built-in extension XEVIE
    (II) Initializing extension GLX
    (II) config/hal: Adding input device Logitech USB Receiver
    (II) LoadModule: "evdev"
    (II) Loading /usr/lib/xorg/modules/input//evdev_drv.so
    (II) Module evdev: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 2.1.0
    Module class: X.Org XInput Driver
    ABI class: X.Org XInput driver, version 2.1
    (**) Logitech USB Receiver: always reports core events
    (**) Logitech USB Receiver: Device: "/dev/input/event5"
    (II) Logitech USB Receiver: Found 9 mouse buttons
    (II) Logitech USB Receiver: Found x and y relative axes
    (II) Logitech USB Receiver: Found keys
    (II) Logitech USB Receiver: Configuring as mouse
    (II) Logitech USB Receiver: Configuring as keyboard
    (**) Logitech USB Receiver: YAxisMapping: buttons 4 and 5
    (**) Logitech USB Receiver: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    (II) XINPUT: Adding extended input device "Logitech USB Receiver" (type: KEYBOARD)
    (**) Option "xkb_rules" "evdev"
    (**) Logitech USB Receiver: xkb_rules: "evdev"
    (**) Option "xkb_model" "evdev"
    (**) Logitech USB Receiver: xkb_model: "evdev"
    (**) Option "xkb_layout" "us"
    (**) Logitech USB Receiver: xkb_layout: "us"
    (II) config/hal: Adding input device Logitech USB Receiver
    (**) Logitech USB Receiver: always reports core events
    (**) Logitech USB Receiver: Device: "/dev/input/event4"
    (II) Logitech USB Receiver: Found keys
    (II) Logitech USB Receiver: Configuring as keyboard
    (II) XINPUT: Adding extended input device "Logitech USB Receiver" (type: KEYBOARD)
    (**) Option "xkb_rules" "evdev"
    (**) Logitech USB Receiver: xkb_rules: "evdev"
    (**) Option "xkb_model" "evdev"
    (**) Logitech USB Receiver: xkb_model: "evdev"
    (**) Option "xkb_layout" "us"
    (**) Logitech USB Receiver: xkb_layout: "us"
    (II) config/hal: Adding input device Macintosh mouse button emulation
    (**) Macintosh mouse button emulation: always reports core events
    (**) Macintosh mouse button emulation: Device: "/dev/input/event0"
    (II) Macintosh mouse button emulation: Found 3 mouse buttons
    (II) Macintosh mouse button emulation: Found x and y relative axes
    (II) Macintosh mouse button emulation: Configuring as mouse
    (**) Macintosh mouse button emulation: YAxisMapping: buttons 4 and 5
    (**) Macintosh mouse button emulation: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    (II) XINPUT: Adding extended input device "Macintosh mouse button emulation" (type: MOUSE)
    (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
    (II) No APM support in BIOS or kernel
    (II) NVIDIA(0): Setting mode "nvidia-auto-select"
    (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
    (II) NVIDIA(0): may not be running or the "AcpidSocketPath" X
    (II) NVIDIA(0): configuration option may not be set correctly. When the
    (II) NVIDIA(0): ACPI event daemon is available, the NVIDIA X driver will
    (II) NVIDIA(0): try to use it to receive ACPI event notifications. For
    (II) NVIDIA(0): details, please see the "ConnectToAcpid" and
    (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
    (II) NVIDIA(0): Config Options in the README.
    (II) Logitech USB Receiver: Device reopened after 1 attempts.
    (II) Logitech USB Receiver: Device reopened after 1 attempts.
    (II) Macintosh mouse button emulation: Device reopened after 1 attempts.
    In particular
    (**) NVIDIA(0): Option "SLI" "On"
    (**) NVIDIA(0): Enabling RENDER acceleration
    (**) NVIDIA(0): NVIDIA SLI auto-select rendering option.
    (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    (II) NVIDIA(0): enabled.
    (EE) NVIDIA(0): Failed to find a valid SLI configuration.
    (EE) NVIDIA(0): Invalid SLI configuration 1 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (EE) NVIDIA(0): Invalid SLI configuration 2 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (EE) NVIDIA(0): Invalid SLI configuration 3 of 3:
    (EE) NVIDIA(0): GPUs:
    (EE) NVIDIA(0): 1) NVIDIA GPU at PCI:1:0:0
    (EE) NVIDIA(0): 2) NVIDIA GPU at PCI:4:0:0
    (EE) NVIDIA(0): Errors:
    (EE) NVIDIA(0): - This configuration is not supported
    (WW) NVIDIA(0): Failed to find a valid SLI configuration for the NVIDIA
    (WW) NVIDIA(0): graphics device PCI:1:0:0. Please see Chapter 25:
    (WW) NVIDIA(0): Configuring SLI and Multi-GPU FrameRendering in the README
    (WW) NVIDIA(0): for troubleshooting suggestions.
    (EE) NVIDIA(0): Only one GPU will be used for this X screen.
    (II) NVIDIA(0): NVIDIA GPU GeForce 8600 GT (G84) at PCI:1:0:0 (GPU-0)
    (--) NVIDIA(0): Memory: 1048576 kBytes
    (--) NVIDIA(0): VideoBIOS: 60.84.41.00.25
    (II) NVIDIA(0): Detected PCI Express Link width: 16X
    (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    (--) NVIDIA(0): Connected display device(s) on GeForce 8600 GT at PCI:1:0:0:
    I do not know where to go from here. I have tried using two Device sections, as well as the nvidia-beta package from the AUR.
    I have not installed the NVIDIA driver any way other than with pacman.

    Thank you for at least looking; this is the first time I got a real answer. Mostly I just get "keep your luxury problems to yourself."
    Yes, I did read Chapter 25,
    in particular the end, which has two tests to see whether the graphics cards are detected by the kernel correctly.
    The first shows that both cards are detected:
    >> sudo lspci | grep -i vga
    01:00.0 VGA compatible controller: nVidia Corporation G84 [GeForce 8600GT] (rev a1)
    04:00.0 VGA compatible controller: nVidia Corporation G84 [GeForce 8600GT] (rev a1)
    The second shows that both are connected to a bus attached to the Root Bridge:
    >> sudo lspci -t
    -[0000:00]-+-00.0
               +-00.1
               +-00.2
               +-00.3
               +-00.4
               +-00.5
               +-00.6
               +-00.7
               +-04.0-[01]----00.0  <<HERE
               +-08.0
               +-09.0
               +-09.1
               +-0a.0
               +-0a.1
               +-0c.0
               +-0d.0
               +-0d.1
               +-0d.2
               +-0e.0-[02]----0b.0
               +-0e.1
               +-10.0
               +-11.0
               +-16.0-[03]----00.0
               +-17.0-[04]----00.0  << AND HERE
               +-18.0
               +-18.1
               +-18.2
               \-18.3
    Link to Chapter 25:
    http://us.download.nvidia.com/XFree86/L … er-25.html
    Any other ideas would be appreciated.
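    In case the Device-section syntax is the issue: here is a sketch of what the two explicit Device sections look like, with BusID values taken from the lspci output above and the "SLI" option name from the NVIDIA README. Treat the exact layout as an assumption, not a known-good config:

```
Section "Device"
    Identifier "GPU0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    Option     "SLI" "On"
EndSection

Section "Device"
    Identifier "GPU1"
    Driver     "nvidia"
    BusID      "PCI:4:0:0"
EndSection
```

    The X log above shows both GPUs being found, so if a layout like this still fails with "This configuration is not supported", the driver itself is rejecting the pairing rather than the config file.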

  • Nvidia Driver Update for MSi Gt 683

    Hello Fellows;
    I own a MSi GT683-244R notebook.
    I received a beta key for testing the game Battlefield 3, but when I try to start the game it says the NVIDIA driver is an old version and I need the latest one. I went to the NVIDIA web page to try to update my driver and saw that my notebook from MSI is not supported; I need to download from the MSI website. On the MSI driver page (http://www.msi.com/product/nb/GT683.html#/?div=Driver&os=Win7%2064) the driver version is 268.12, and on the NVIDIA driver page (http://www.nvidia.com/Download/Find.aspx?lang=en-us) the latest certified one is 280.26.
    So, my question is... when will MSI release the latest driver for our notebook???
    GT is a gaming notebook series, so we need the latest driver. Please MSI, update it for us.
    Best Regards;
    FaHeinen

    Quote from: DoDeH1 on 03-October-11, 00:30:43
    Product Support
     This driver supports the following NVIDIA notebook GPUs (please refer to the Products Supported tab for exceptions):
    ION notebook GPUs.
    GeForce 8M, 9M, 100M, 200M, 300M, 400M, and 500M-series notebook GPUs.
    Quadro 5010M, 5000M, 4000M 3000M, 2000M and 1000M notebook GPUs.
    Quadro NVS-series notebook GPUs (only those that support DirectX 10 or higher).
    Quadro FX-series notebook GPUs (only those that support DirectX 10 or higher).
    NVS-series notebook GPUs (only those that support DirectX 10 or higher).
    And by the way, I use them on my GT680 without a problem.
    You cannot install any video drivers outside of Live Update. NVIDIA released a beta driver specifically for the Battlefield 3 beta; because it is not available through Live Update, you cannot install it on the MSI notebooks.

  • MacBook Pro won't boot up. HELP!!! Need this computer for work tomorrow!!

    My MacBook Pro will not boot. I was using it fine one minute, then closed the lid to walk downstairs. When I opened the lid there was nothing. Tried to reboot it... no chime, just a black screen... nothing. The battery is working and fully charged. Tried using the power plug as well.

    There are many comments about the NVIDIA GPU issue and Apple not replacing the GPU.  I will say I had good success in getting Apple to replace mine in my Macbook Pro. When my computer suddenly would not fully turn on, I researched the issue to find out the possible reasons and then somewhat dawdled in taking it in. It was 2 weeks past the 4-year GPU warranty window. But I had had problems for some time preceding when it wouldn't fully turn on, such as geo designs appearing on the screen and the screen going dark, and those issues did occur at least a month or more before my warranty ended. I had no idea what those issues meant and simply ignored them until the computer wouldn't fully turn on.
    The good folks at The Apple store in Austin initially said no luck; yes it's the GPU but it's two weeks past the warranty. But then they kindly relented because clearly the symptoms I described to them that occurred during the warranty were caused by the faulty GPU. How would I know this was the issue?  They replaced the logic board at no cost. I've seen comments here that a Genius said the problem was the logic board, not the GPU, and therefore Apple would not make the repair. I was told by my Genius that the GPU is part of the logic board, and therefore the logic board needed replacing. This was a $526 repair that I was grateful Apple took care of. I would not have made the repair and would have gone out and bought a new non-Apple computer with a warranty.
    When I got my computer back, I checked it before I left the Apple store and it was running slow. I ended up paying Apple to replace the hard drive, so ultimately I did pay for an Apple repair. But I'm a happy rather than disappointed Apple customer because of the free GPU repair. Happy, devoted customers have made Apple a great company, and it should be worth it to the company to fix the faulty GPUs that are close to warranty to keep customers and keep them happy.

  • HT203254 MBP Logic board failure or GPU failure?  Apparently it's simple to test...

    I have a 2008 MacBook Pro 15" (2.2 GHz) with the NVIDIA 8600M GT graphics chip.  Last month it failed and no longer boots up.  The fans start and the LED light comes on, but the keyboard, hard drive and screen fail to come on. After reading article TS2377 I was convinced that the symptoms described match those on my MBP, so I took it to the nearest Apple store.  The 'genius' at the 'genius bar' (cringe) took it away, then came back and told me that the problem was with the logic board, not the GPU.  They diagnosed this by removing the RAM and listening for the RAM-failure beeps.  Since there was no sound, they concluded that the logic board was the source of the failure, because the RAM test comes before the GPU test in the boot sequence.  This test is so simple that surely it should be posted under article TS2377???
    I must say that I'm completely unconvinced by the diagnosis, but who can argue with a 'genius'?  I'd really appreciate it if someone on here could give me a second opinion.  Even if they turn out to be right, I think I might take out the logic board to see if there's evidence that the GPU has fried it...


  • Installed Premiere Pro CS4 but video display does not work?

    I just got my copy of CS4. After installing Premiere I found two things that seem very wrong:
    1) Video display does not work, not even the little playback viewer next to imported film clips located in the project/sequence window. Audio works fine.
    2) The UI is way too slow for my big beefy system.
    My PC is a dual-boot Vista 32-bit and XP system with 4 GB of memory installed and an NVIDIA GeForce GTX 280 graphics board with plenty of GPU power. CS4 is installed on the Vista-32 partition. My Windows XP partition on the same PC with Premiere CS2 works great and really fast.
    Any ideas how to solve this CS4 install issue?
    Ron

    I would like to thank Dan, Hunt, and Haram:
    The problem is now very clear to me. It only shows up with video footage imported into PP CS4 that was encoded with the "MS Video 1" codec, so this seems to be a bug. The codec is very clearly called out and supported within various menus, but video with this codec just will not play in any monitor or preview window. In addition, the entire product performs horribly while PP CS4 tries its best to play the video. Audio will start playing after about 30 seconds, and once in a while part of the video shows up at the wrong magnification before blanking out again.
    My suggestion to the Adobe team: fix the bug and add some sample footage to the next release so new installations can test their systems with known footage.
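    For anyone wanting to confirm which clips are affected: Microsoft Video 1 footage is tagged with the FOURCC 'CRAM' (or 'MSVC') in the AVI stream header. Below is a rough, hand-rolled sketch (the helper name avi_video_fourcc is my own invention, not an Adobe or Premiere API) that scans an AVI file for the video stream's codec FOURCC:

```python
def avi_video_fourcc(path):
    """Return the codec FOURCC of the first video stream in an AVI file.

    Microsoft Video 1 clips report b'CRAM' or b'MSVC'. This is a naive
    scan: it looks for each 'strh' (stream header) chunk and reads its
    type and handler fields, which is good enough for well-formed files.
    """
    with open(path, "rb") as f:
        data = f.read()
    pos = data.find(b"strh")
    while pos != -1:
        # chunk layout: 'strh' (4) + size (4) + fccType (4) + fccHandler (4)
        if data[pos + 8 : pos + 12] == b"vids":   # 'vids' marks a video stream
            return data[pos + 12 : pos + 16]      # codec FOURCC
        pos = data.find(b"strh", pos + 4)
    return None
```

    If this returns b"CRAM" or b"MSVC" for a clip, that clip uses MS Video 1 and should reproduce the playback problem described above.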
    My PC is brand new with the following "beefy" components:
    Motherboard
    nForce 790i SLI FTW
    Features:
    3x PCI Express x16 graphics support
    PCI Express 2.0
    NVIDIA SLI-Ready (requires multiple NVIDIA GeForce GPUs)
    DDR3-2000 SLI-Ready memory w/ ERP 2.0 (requires select third party system memory)
    Overclocking tools
    NVIDIA MediaShield w/ 9 SATA 3 Gb/sec ports
    ESA Certified
    NVIDIA DualNet and FirstPacket Ethernet technology
    Registered
    CPU: Intel Core 2 Quad Q9550
    S-Spec: SLAWQ
    Ver: E36105-001
    Product Code: BX80569Q9550
    Made in Malaysia
    Pack Date: 09/04/08
    Features:
    Freq.: 2.83 GHz
    L2 Cache: 12 MB Cache
    FSB: 1333 MHz (MT/s)
    Core: 45nm
    Code named: Yorkfield
    Power:95W
    Socket: LGA775
    Cooling: Liquid Cooled
    NVIDIA GeForce GTX 280 SC graphics card
    Features:
    1 GB of onboard memory
    Full Microsoft DirectX 10
    NVIDIA 2-way and 3-way SLI Ready
    NVIDIA PureVideo HD technology
    NVIDIA PhysX Ready
    NVIDIA CUDA technology
    PCI Express 2.0 support
    Dual-link HDCP
    OpenGL 2.1 Capable
    Output: DVI (2 dual-link), HDTV
    Western Digital
    2 WD VelociRaptor 300 GB SATA Hard Drives configured as Raid 0
    Features:
    10,000 RPM, 3 Gb/sec transfer rate
    RAM Memory , Corsair 4 GB (2 x 2 GB) 1333 MHz DDR3
    p/n: TW3X4G1333C9DHX G
    product: CM3X2048-1333C9DHX
    Features:
    XMS3 DHX Dual-Path 'heat xchange'
    2048 x 2 MB
    1333 MHz
    Latency 9-9-9-24-2T
    1.6V ver3.2

  • What upgrade graphics card do I need for an early 2008 Mac Pro?

    I have an early 2008 Mac Pro.  The ATI Radeon HD 2600 XT 256 MB is what I have now.  What is a good graphics card to upgrade to?

    The GTX 780 6GB runs fine on the dual PCIe internal power cables -- as does the GTX 680 and Radeon HD 7950. The same can't be said of the GTX TITAN or GTX 780 Ti or Radeon R9 280X -- all of which require an auxiliary power feed to avoid a nasty power down of your tower at the worst moment when too much wattage is demanded of the Mac Pro's factory power supply.
    Did I mention that the GTX 780 with 6GB of GDDR5 matches the 6GB of GDDR5 in the TITAN?
    The one fallacy to my 'sweet spot' award has to do with OpenCL. In case you hadn't noticed, the AMD Radeon GPUs smoked the NVIDIA GeForce GPUs running Photoshop's OpenCL accelerated Iris Blur filter and rendering LuxMark's OpenCL accelerated Room scene.
    - http://www.barefeats.com
    So it depends on your budget, whether you want a PC card for less but no "early EFI boot screen" prior to drivers loading, and what apps and needs you have.
    The one thing for sure is that the 2600 XT should have been retired ages ago, and even more so with 10.8.3 and above; it could be trouble or just... a PITA/POS.
