Installing the NVIDIA driver on a Quadro K2000M

Hi there,
I am trying to install the latest NVIDIA driver on my Arch installation, but without any success.
After I installed the driver as suggested in the wiki, I am stuck at a black screen... none of the tips in the wiki seem to work.
I would like to have both the NVIDIA and the Intel driver installed, so that I can switch manually in the BIOS which card to use (no Optimus needed).
lspci | grep "VGA"
01:00.0 VGA compatible controller: NVIDIA Corporation GK107GLM [Quadro K2000M] (rev a1)
Has anybody successfully installed the NVIDIA driver with a Quadro K2000M?
wucherpfennig
Last edited by wucherpfennig (2014-07-09 23:20:34)
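
Since the discrete card is selected in the BIOS (no Optimus), the dual-device layout from the wiki is not strictly needed; a single-Device xorg.conf is usually enough. A minimal sketch, assuming the BusID from the lspci output above (the file name is arbitrary):

```
# /etc/X11/xorg.conf.d/20-nvidia.conf -- minimal sketch for running the
# discrete card alone; BusID taken from the lspci output (01:00.0)
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection
```

With the Intel card selected in the BIOS instead, X normally autodetects the Intel/modesetting driver without any Device section, so the file above can simply be removed for that case.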

Still no success. Here is my Xorg log (I tried to set up Optimus; all settings are according to the wiki page):
[ 4.847]
X.Org X Server 1.15.2
Release Date: 2014-06-27
[ 4.847] X Protocol Version 11, Revision 0
[ 4.847] Build Operating System: Linux 3.15.1-1-ARCH x86_64
[ 4.847] Current Operating System: Linux lastesel 3.15.5-1-ARCH #1 SMP PREEMPT Thu Jul 10 07:08:50 CEST 2014 x86_64
[ 4.847] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=3ca83d3d-6d1b-4543-8551-56238718e4f1 rw quiet rcutree.rcu_idle_gp_delay=1
[ 4.847] Build Date: 27 June 2014 07:32:26PM
[ 4.847]
[ 4.847] Current version of pixman: 0.32.6
[ 4.847] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 4.847] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 4.847] (==) Log file: "/var/log/Xorg.0.log", Time: Thu Jul 17 22:32:35 2014
[ 4.850] (==) Using config file: "/etc/X11/xorg.conf"
[ 4.850] (==) Using config directory: "/etc/X11/xorg.conf.d"
[ 4.850] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 4.856] (==) ServerLayout "layout"
[ 4.856] (**) |-->Screen "nvidia" (0)
[ 4.856] (**) | |-->Monitor "<default monitor>"
[ 4.858] (**) | |-->Device "nvidia"
[ 4.858] (==) No monitor specified for screen "nvidia".
Using a default monitor configuration.
[ 4.858] (**) |-->Inactive Device "intel"
[ 4.858] (==) Automatically adding devices
[ 4.858] (==) Automatically enabling devices
[ 4.858] (==) Automatically adding GPU devices
[ 4.865] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
[ 4.865] Entry deleted from font path.
[ 4.865] (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
[ 4.865] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
[ 4.865] Entry deleted from font path.
[ 4.865] (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
[ 4.866] (==) FontPath set to:
/usr/share/fonts/misc/,
/usr/share/fonts/TTF/,
/usr/share/fonts/OTF/,
/usr/share/fonts/Type1/
[ 4.866] (==) ModulePath set to "/usr/lib/xorg/modules"
[ 4.866] (II) The server relies on udev to provide the list of input devices.
If no devices become available, reconfigure udev or disable AutoAddDevices.
[ 4.866] (II) Loader magic: 0x811cc0
[ 4.866] (II) Module ABI versions:
[ 4.866] X.Org ANSI C Emulation: 0.4
[ 4.866] X.Org Video Driver: 15.0
[ 4.866] X.Org XInput driver : 20.0
[ 4.866] X.Org Server Extension : 8.0
[ 4.866] (II) xfree86: Adding drm device (/dev/dri/card1)
[ 4.866] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 4.867] (--) PCI:*(0:0:2:0) 8086:0166:17aa:21f5 rev 9, Mem @ 0xf1400000/4194304, 0xe0000000/268435456, I/O @ 0x00006000/64
[ 4.867] (--) PCI: (0:1:0:0) 10de:0ffb:17aa:21f5 rev 161, Mem @ 0xf0000000/16777216, 0xc0000000/268435456, 0xd0000000/33554432, I/O @ 0x00005000/128, BIOS @ 0x????????/524288
[ 4.867] (II) Open ACPI successful (/var/run/acpid.socket)
[ 4.869] Initializing built-in extension Generic Event Extension
[ 4.869] Initializing built-in extension SHAPE
[ 4.869] Initializing built-in extension MIT-SHM
[ 4.869] Initializing built-in extension XInputExtension
[ 4.869] Initializing built-in extension XTEST
[ 4.869] Initializing built-in extension BIG-REQUESTS
[ 4.869] Initializing built-in extension SYNC
[ 4.869] Initializing built-in extension XKEYBOARD
[ 4.869] Initializing built-in extension XC-MISC
[ 4.869] Initializing built-in extension SECURITY
[ 4.869] Initializing built-in extension XINERAMA
[ 4.869] Initializing built-in extension XFIXES
[ 4.869] Initializing built-in extension RENDER
[ 4.869] Initializing built-in extension RANDR
[ 4.869] Initializing built-in extension COMPOSITE
[ 4.869] Initializing built-in extension DAMAGE
[ 4.869] Initializing built-in extension MIT-SCREEN-SAVER
[ 4.869] Initializing built-in extension DOUBLE-BUFFER
[ 4.869] Initializing built-in extension RECORD
[ 4.869] Initializing built-in extension DPMS
[ 4.869] Initializing built-in extension Present
[ 4.869] Initializing built-in extension DRI3
[ 4.869] Initializing built-in extension X-Resource
[ 4.869] Initializing built-in extension XVideo
[ 4.869] Initializing built-in extension XVideo-MotionCompensation
[ 4.869] Initializing built-in extension XFree86-VidModeExtension
[ 4.869] Initializing built-in extension XFree86-DGA
[ 4.869] Initializing built-in extension XFree86-DRI
[ 4.869] Initializing built-in extension DRI2
[ 4.869] (II) "glx" will be loaded by default.
[ 4.869] (II) LoadModule: "dri2"
[ 4.869] (II) Module "dri2" already built-in
[ 4.869] (II) LoadModule: "glamoregl"
[ 4.873] (II) Loading /usr/lib/xorg/modules/libglamoregl.so
[ 5.053] (II) Module glamoregl: vendor="X.Org Foundation"
[ 5.053] compiled for 1.15.0, module version = 0.6.0
[ 5.053] ABI class: X.Org ANSI C Emulation, version 0.4
[ 5.053] (II) LoadModule: "glx"
[ 5.053] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[ 5.216] (II) Module glx: vendor="NVIDIA Corporation"
[ 5.216] compiled for 4.0.2, module version = 1.0.0
[ 5.216] Module class: X.Org Server Extension
[ 5.218] (II) NVIDIA GLX Module 340.24 Wed Jul 2 15:04:31 PDT 2014
[ 5.218] Loading extension GLX
[ 5.218] (II) LoadModule: "nvidia"
[ 5.218] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[ 5.236] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 5.236] compiled for 4.0.2, module version = 1.0.0
[ 5.236] Module class: X.Org Video Driver
[ 5.237] (II) LoadModule: "intel"
[ 5.237] (II) Loading /usr/lib/xorg/modules/drivers/intel_drv.so
[ 5.241] (II) Module intel: vendor="X.Org Foundation"
[ 5.241] compiled for 1.15.2, module version = 2.99.912
[ 5.241] Module class: X.Org Video Driver
[ 5.241] ABI class: X.Org Video Driver, version 15.0
[ 5.241] (II) NVIDIA dlloader X Driver 340.24 Wed Jul 2 14:42:23 PDT 2014
[ 5.241] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 5.241] (II) intel: Driver for Intel(R) Integrated Graphics Chipsets:
i810, i810-dc100, i810e, i815, i830M, 845G, 854, 852GM/855GM, 865G,
915G, E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM,
Pineview G, 965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33,
GM45, 4 Series, G45/G43, Q45/Q43, G41, B43
[ 5.242] (II) intel: Driver for Intel(R) HD Graphics: 2000-5000
[ 5.242] (II) intel: Driver for Intel(R) Iris(TM) Graphics: 5100
[ 5.242] (II) intel: Driver for Intel(R) Iris(TM) Pro Graphics: 5200
[ 5.242] (++) using VT number 7
[ 5.248] (II) Loading sub module "fb"
[ 5.248] (II) LoadModule: "fb"
[ 5.248] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 5.251] (II) Module fb: vendor="X.Org Foundation"
[ 5.251] compiled for 1.15.2, module version = 1.0.0
[ 5.251] ABI class: X.Org ANSI C Emulation, version 0.4
[ 5.251] (WW) Unresolved symbol: fbGetGCPrivateKey
[ 5.251] (II) Loading sub module "wfb"
[ 5.251] (II) LoadModule: "wfb"
[ 5.251] (II) Loading /usr/lib/xorg/modules/libwfb.so
[ 5.254] (II) Module wfb: vendor="X.Org Foundation"
[ 5.254] compiled for 1.15.2, module version = 1.0.0
[ 5.254] ABI class: X.Org ANSI C Emulation, version 0.4
[ 5.254] (II) Loading sub module "ramdac"
[ 5.254] (II) LoadModule: "ramdac"
[ 5.254] (II) Module "ramdac" already built-in
[ 5.259] (II) intel(1): Using Kernel Mode Setting driver: i915, version 1.6.0 20080730
[ 5.264] (II) intel(G0): Using Kernel Mode Setting driver: i915, version 1.6.0 20080730
[ 5.264] (EE) Screen 1 deleted because of no matching config section.
[ 5.264] (II) UnloadModule: "intel"
[ 5.264] (II) NVIDIA(0): Creating default Display subsection in Screen section
"nvidia" for depth/fbbpp 24/32
[ 5.264] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
[ 5.264] (==) NVIDIA(0): RGB weight 888
[ 5.264] (==) NVIDIA(0): Default visual is TrueColor
[ 5.264] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[ 5.268] (**) NVIDIA(0): Enabling 2D acceleration
[ 9.541] (EE) NVIDIA(GPU-0): Failed to initialize the NVIDIA GPU at PCI:1:0:0. Please
[ 9.541] (EE) NVIDIA(GPU-0): check your system's kernel log for additional error
[ 9.541] (EE) NVIDIA(GPU-0): messages and refer to Chapter 8: Common Problems in the
[ 9.541] (EE) NVIDIA(GPU-0): README for additional information.
[ 9.542] (EE) NVIDIA(GPU-0): Failed to initialize the NVIDIA graphics device!
[ 9.542] (EE) NVIDIA(0): Failing initialization of X screen 0
[ 9.542] (II) UnloadModule: "nvidia"
[ 9.542] (II) UnloadSubModule: "wfb"
[ 9.542] (II) UnloadSubModule: "fb"
[ 11.796] (EE) intel(G0): [drm] failed to set drm interface version: Permission denied [13].
[ 11.796] (EE) intel(G0): Failed to claim DRM device.
[ 11.796] (II) UnloadModule: "intel"
[ 11.796] (EE) Screen(s) found, but none have a usable configuration.
[ 11.796] (EE)
Fatal server error:
[ 11.796] (EE) no screens found(EE)
[ 11.796] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 11.796] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[ 11.796] (EE)
[ 11.805] (EE) Server terminated with error (1). Closing log file.
Adding rcutree.rcu_idle_gp_delay=1 to the GRUB kernel command line does not resolve the problem...
Thanks for your help.
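
Since the server dies with several (EE) lines, it helps to pull just the warnings and errors out of the Xorg log before posting, and to check the kernel log (dmesg or journalctl -k) as the NVIDIA message suggests. A small shell helper, as a sketch (the function name xorg_errors is mine):

```shell
# xorg_errors: keep only the (EE)/(WW) lines of an Xorg log read from stdin.
xorg_errors() {
    grep -E '\(EE\)|\(WW\)'
}

# Demo on a three-line sample; on a real system you would run
#   xorg_errors < /var/log/Xorg.0.log
printf '%s\n' \
    '[ 9.541] (EE) NVIDIA(GPU-0): Failed to initialize the NVIDIA GPU' \
    '[ 5.251] (WW) Unresolved symbol: fbGetGCPrivateKey' \
    '[ 5.254] (II) Loading sub module "ramdac"' | xorg_errors
# prints only the (EE) and (WW) sample lines
```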
Last edited by wucherpfennig (2014-07-17 20:40:58)

Similar Messages

  • NVIDIA Quadro 2100M vs Quadro K2000M

    I got my ThinkPad W530 today. It comes with an NVIDIA Quadro 2100M. Is this the same as the NVIDIA Quadro K2000M? I also noticed that the fan runs louder than on my W500. Is this normal? Thank you.
    Intel Core i7-3820, Windows 7 Pro, 4GB, 320G HDD.

    viman89,
    See post by ibmthink in this thread.
    http://forums.lenovo.com/t5/W-Series-ThinkPad-Laptops/W530-Sub-par-features-to-the-actual-components...
    *Non Lenovo employee*
    I have a Y2P (i5)... Feel free to ping me if you want me to test some applications on your Y2P if you have the same model. I don't mind doing recovery on it repeatedly if needed. =)

  • NVIDIA Quadro K2000M Stress Test Warning

    Anyone with the NVIDIA Quadro K2000M card:  Has anyone received a stress test warning after running a hardware scan?
    I am currently connected to only two monitors (through the dock). During the scan I only had a couple of browsers open; I was not streaming or watching videos, simply reading a couple of articles (nothing "stressful" taking place).
    This is the first time this has come up; I will be checking settings and running the test later.
    I couldn't find anything on the result code I received:  WVC00V000-WJSWSM
    Just thought I would check whether anyone else had received the same result (a stress test warning, if not the same code) and what the final outcome was.

    Similar result on a T530 with i7, 8GB, and NVidia 5400: installed new version of Lenovo Solutions (prompted by Lenovo when I opened my existing version), ran full hardware check, everything passed except stress test on video card, which gave warning with result code WVC00V000-UJ8XWD.  Am also running in a dock (Mini Series 3+, not the USB 3.0 version) with a single external monitor attached to a DVI port.
    Interestingly, the log section for the NVidia 5400 says "Displays: 4", as if it thinks that I have 4 displays attached (whereas the log section for Intel 4000 video card says "Displays: 1").  Since the dock could output three displays (two outputs that can be either DisplayPort or DVI, plus one VGA) and the laptop screen is active, perhaps the program is assuming that it's actually running four screens.
    The fan exhaust put out some serious heat during that part of the test.  A couple of programs were open but not in use during the hardware test.  I had not run the hardware test under the previous version of Lenovo Solutions, so I don't know whether it would have shown the same thing, or whether it's new to the new version of Lenovo Solutions.
    Would appreciate any insights the community has to offer.
    Thanks.

  • Premiere CC crashes on launch on Lenovo ThinkPad W530/Quadro K2000M

    Hey everyone!
    I can't get Premiere Pro CC to launch on my new laptop--Lenovo ThinkPad W530 with Quadro K2000M graphics. On launch I get "Adobe Premiere CC has stopped working."
    The K2000M is on the approved list, so I was hoping that wouldn't be a problem. But uninstalling the nVidia drivers does actually allow me to open Premiere, so clearly that is the problem. I went to the Lenovo site and I'm using the most up-to-date driver (9.18.13.1270/8.15.10.2725/G5DE43WW).
    Things that didn't work that I've seen elsewhere on this forum:
    - signing out of and into Creative Cloud
    - running Premiere in Windows 7 compatibility mode
    - running Premiere as administrator
    Any help in getting the program and the NVIDIA driver running at the same time would be appreciated... I have a project coming up shortly.
    Matt Levie
    If it helps, here's the problem signature:
    Problem signature:
      Problem Event Name:    BEX64
      Application Name:    Adobe Premiere Pro.exe
      Application Version:    7.2.1.4
      Application Timestamp:    52aed7f3
      Fault Module Name:    StackHash_1dc2
      Fault Module Version:    0.0.0.0
      Fault Module Timestamp:    00000000
      Exception Offset:    0000000000000000
      Exception Code:    c0000005
      Exception Data:    0000000000000008
      OS Version:    6.1.7601.2.1.0.256.48
      Locale ID:    1033
      Additional Information 1:    1dc2
      Additional Information 2:    1dc22fb1de37d348f27e54dbb5278e7d
      Additional Information 3:    eae3
      Additional Information 4:    eae36a4b5ffb27c9d33117f4125a75c2

    Unfortunately, the crash data you posted doesn't give us any clues.
    It looks like the driver you're on is very fresh: 2014-1/5. Another user recently reported a different problem on a Lenovo w/ the K1000M (http://forums.adobe.com/message/5987620#5987620). Don't know yet what driver he's on, but if he's also on a new one, I'd begin to suspect a bug in the driver.
    Some additional data could be illuminating. Premiere installs a utility called GPUsniffer. You'll find it here: C:\Program Files\Adobe\Adobe Premiere Pro CC. However, unless you read quicker than I do--and I mean waaaay faster--launching it from there won't do much good. You'll have to open the command console, navigate to the path above, and type gpusniffer.exe. (Let me know if you need more explicit instructions for getting around in the console.) Then copy the report and paste it here.
    edited to add: To help determine if the current Lenovo K2000M driver is at fault--and possibly get yourself back in business--please roll back to an earlier driver.

  • Quadro K2000m drivers Win 8.1 W530

    Hello everyone,
    I was trying to see whether the drivers I originally had (can't remember the version) for the Quadro K2000m on my W530 were doing their job.
    The first sign that something was off was when playing CS:GO: I can only play it at 1600x900 (using only the laptop's LCD) with most settings on medium and some on high. Then I noticed that although Premiere Pro CS6 works with the CUDA cores (I can play and add effects to most clips and still play them in real time from a 170 Mbps GH2 patch), I appear to still not get full performance, since some effects immediately make playback choppy.
    According to GPU-Z, the GPU is not working when I render my work area in Premiere.
    I am attaching two screenshots from GPU-Z. The sensors screenshot shows the GPU usage when playing a clip at full resolution with one effect (FilmConvert) added to it. Playback is choppy; I would say it drops half the frames every second.
    Now, the kicker: since I downloaded a new driver from Lenovo, I started getting BSODs.
    So I downloaded another one from NVIDIA... same BSODs, and usually not when I am working. Mostly when I am browsing or not doing much, like watching a video on YouTube or Vimeo.
    Am I missing something here?

    When you run the NVIDIA installer, choose Custom. At the page where you can choose components to install/upgrade, there should be a checkbox for clean install. It will uninstall the old driver, prompt for reboot, then install the new driver.
    You can quickly check whether Premiere is set to run on the discrete GPU by right-clicking on the shortcut (not sure if this works in the Metro UI) and looking in the "Run with graphics processor" submenu. If this says integrated is the default, you can change it in the NVIDIA Control Panel.
    W520: i7-2720QM, Q2000M at 1080/688/1376, 21GB RAM, 500GB + 750GB HDD, FHD screen
    X61T: L7500, 3GB RAM, 500GB HDD, XGA screen, Ultrabase
    Y3P: 5Y70, 8GB RAM, 256GB SSD, QHD+ screen

  • Incompatibility between Quadro K2000M and After Effects CS6

    Hi,
    I just received my new Dell M4700 laptop with an NVIDIA Quadro K2000M.
    It is sold as a fully CUDA-compatible card for all CS6 products, but GPU rendering doesn't work!!!! Ray-traced 3D runs on the CPU only... very frustrating.
    What's the problem? Are the drivers too young? Not yet compatible? Does After Effects need an update?
    I have the same problem with Premiere, which doesn't use Mercury GPU playback... (Photoshop isn't very fast either.)
    I'm very disappointed with my brand-new PC.
    Help me please! I cannot find any discussion about it on the web.
    Please excuse my English, I'm French.
    Thanks a lot,
    ciao

    melkirn wrote:
    Where do you find the 'Adobe compatible card list', please?
    After Effects tech specs
    Premiere Pro tech specs
    I found them by doing a Google search for...well, those exact terms.

  • W530 Quadro K2000m and Premiere Pro CS6

    Hi all,
    I have a W530 with an i7, Quadro K2000M, 32GB RAM and 3 internal drives (1 mSATA SSD, 2 HGST 7K1000 1TB in the other two bays). I don't know whether I have an issue or not, but:
    1) CUDA acceleration is enabled, so we're OK on that front.
    2) When I assemble a few clips together, although the render line is yellow, after a few seconds the videos start becoming choppy. I was wondering how this can be, because I am only editing a 720p timeline and I have not graded anything yet. Just Warp Stabilizer on some clips, and some of them (since I shot at 60p) are slowed down to 24p.
    3) When I run GPU-Z, I am only seeing a 8-15% GPU load rate.
    4) Because I can't reverse time and stabilize at the same time, I stabilized a clip on another sequence and then nested that sequence in my main sequence. Now, when I reverse that nested sequence...I get a blank screen UNLESS I kill the hardware acceleration and switch to CPU only (this seems to be a Premiere Pro issue)
    5) I did everything possible to make sure the Quadro card was working. I set everything to "Performance NVIDIA graphics" in the NVIDIA Control Panel, installed the latest drivers from NVIDIA the day before yesterday, and even flashed my BIOS to the latest version downloadable from Lenovo's System Update.
    Am I missing something? Or is the card just underclocked and under-powered on purpose? Why can it not run beyond 15% load when playback of the clips is choppy?
    You may have noticed I am a bit frustrated... but hey, I love the machine, so I hope I am missing something that I can set up to make it all run smoothly again.
    Many thanks for any ideas, guidance.
    PS: All captured clips are in the 1st HDD on the main bay and the scratch/preview disk is the 2nd HDD on the CD drive. Premiere is running from the mSATA SSD (Plextor M5M 256GB)

    I don't have Premiere, so I can't see by comparison what happens on my own W530, which only has a K1000M. I also don't know whether your symptoms are due to hardware or software (i.e. Premiere itself). Do you have another machine of your own to try the same experiment on, to see what GPU usage measures at and whether the visual results are as poor as you describe on your W530, or whether they are much better and much higher performing?
    Aside from our both doing something in common, to compare results, I just took a look at two video applications on my own W530: (a) VideoReDo TV Suite 5 editing both 720p and 1080i copy-freely WTV recorded videos made by Windows Media Center (on my HTPC), and (b) Windows Media Center running on the W530 and playing both of these 720p and 1080i WTV recordings.  I opened GPU-Z while the video apps were running to monitor the GPU load.  I also have Aida64 running, which also displays presumably the same "GPU utilization".  I expect VRD to use less GPU horsepower than Windows Media Center uses.
    Also, here is my Lenovo Power Manager setup, with my i7-3720QM 8GB W530/K1000m 512GB Samsung 840 Pro SSD powered into the wall.  The display is external 24" Eizo S2433W monitor (miniDP->DP from the W530), 1920x1200, laptop screen OFF.  I'm running "discrete graphics" in the BIOS, using "retail" nVidia driver 347.88.
    And here's Windows Media Center playing a 1080i video (which involves hardware de-interlacing), and showing 57% GPU load through GPU-Z.  Note that with Aida64 (on the right side of the image) it also shows 55% GPU utilization, so this number is clearly correct.
    And the same clip being "edited" using VideoRedo.  Note that GPU usage is now down to 35% (at least at this instant), as I expected.  I'm quite sure hardware de-interlacing is also taking place.
    Video performance with both apps is superb.
    Anyway, without our being able to duplicate the other's test situation, I don't know what to suggest that you haven't already done.  Unfortunately I don't have Premiere to try.

  • Nvidia GTX + Quadro for Premiere and Speedgrade - is it possible?

    Hi,
    I have a workstation with an NVIDIA GTX 470 GPU. I do a lot of color grading work for my projects and I am considering purchasing an HP DreamColor 10-bit monitor. As the GTX doesn't output 10-bit, I am thinking about purchasing a Quadro GPU just for output preview purposes.
    Does anyone know whether it is possible to put those two cards together in the same PC (Windows 8, 64-bit) without issues? If so, can I use the cheapest Quadro GPU, for example the K420? The main idea is to use the GTX as the workhorse for rendering and live timeline preview and the Quadro for 10-bit preview on the HP DreamColor.
    Thanks,
    Tom

    Generally it is not advised by NVIDIA, as they do not test having two different families of drivers running at the same time. That does not mean it would not work, but there have been reports of problems in the forums. Of course, if anyone is successful it generally does not get reported.
    You might have to consider removing the GTX 470 and getting something like the K2200, which has a few more CUDA cores than your GTX 470 and also 10-bit output.

  • NVIDIA to release drivers that support the Quadro K1000M and K2000M soon?

    I have been checking to see if the Verde drivers support the K2000M in my W530, and today I saw that NVIDIA has added the K1000M and K2000M to the product listing on the drivers page. I'm hoping this means that they are gearing up to release drivers for these cards soon!
    W530(2436-CTO): i7-3720QM, nVidia Quadro K2000M,16GB RAM, 500 GB hard drive, 128GB mSATA SSD, Ubuntu 14.04 Gnome, Centrino Ultimate-N 6300.
    Yoga 3 Pro: Intel Core-M 5Y70, Intel HD 5300, 8GB RAM, 128GB eMMC, Windows 8.1, Broadcom Wireless 802.11ac.

    I got those. They seem to run fine. I really want to upgrade to Windows 8 so I can get rid of my VM.

  • ThinkPad W530 with the Quadro K1000M or K2000M? BTSBEST coupon expires tomorrow!

    I've been looking for a good 15" Ivy Bridge laptop that will last four or five years without breaking or becoming totally obsolete. The ThinkPad W530 fits these criteria. I'll be using it for playing Portal 2 and Minecraft, watching (and sometimes encoding) 1080p video, developing graphical and other programs (fractal renderers, procedural generation, etc), and web design. Programs I run include Eclipse, VMware, x264, Photoshop, and Firefox (with pretty many extensions, userscripts, and tabs).
    I'm getting the Intel Core i7-3610QM, 1600x900 display (for 1080p I'd use an external monitor larger than 15"), 4GB RAM and 320GB hard drive (I can upgrade them myself more cheaply), and Intel Centrino Advanced-N 6205 AGN. So: which graphics card should I get, the K1000M or K2000M? The K1000M has 192 pipelines at 850MHz; the K2000M has 384 at 745MHz. Is having twice as many shaders worth an extra $250 for my purposes? Four years from now, will the K2000M be acceptable while the K1000M is obsolete? And if I do get the cheaper K1000M, should I upgrade to the Core i7-3520M for $50 or the i7-3720QM for $85? (Why does the dual-core 3520M cost more than the quad-core 3610QM?)
    The BTSBEST coupon is saving me $300 on this configuration, but it expires tomorrow, so I need to make a decision. Thanks for any advice!
    ThinkPad W530 (Intel Core i7-3610QM, NVIDIA Quadro K2000M, 4GB DDR3, 320GB 7200RPM, 15.6" 1600x900, Intel Centrino Ultimate-N 6300)

    Hello Rangi42,
    Personally I would go with the K2000M.  I am a big gamer and that graphics card would be great for Minecraft and Portal 2.  Graphics should run smoothly.  I would agree on the RAM and hard drive upgrade, maybe an SSD to make the computer run faster and smoother.
    I am not sure why the dual-core 3520M costs more than the quad-core 3610QM.
    The w530 in general is a great computer for graphics development like your web design.  Video editing is good on the W530 as well.
    Hope this helps,
    Alex

  • NVIDIA Quadro 4000 is freezing / shutting down / buggy with Mac Pro 2009

    I currently have two NVIDIA Quadro 4000 Mac cards, and they're causing my Mac Pro 2009 to kernel panic and freeze or shut down.
    I dropped it off at the Apple Store for them to diagnose the problem for 9 days, and they confirmed that it was the card causing the problem.
    NVIDIA PLEASE UPDATE YOUR DRIVERS FOR THIS CARD FOR MAC.
    It's ridiculous that a card you spend $1200 on (Apple Store) will crap out your Mac Pro. I'm waiting for an updated driver in order to test the stability with the Mac Pro.
    I'm almost 100% sure I did the 10.6.6 update with the stock card, then installed the most up-to-date drivers from the NVIDIA website, then the CUDA drivers, and finally installed the video card in the machine. After two days, the system was acting up.
    Once I get my machine back from Apple tomorrow, I will give it one last try to see if it works. I mean, the cards are amazing with Adobe Premiere and Media Encoder (super fast), but at the cost of a very buggy machine.
    Let's wait and see what Apple, NVIDIA or PNY will do about this big problem. I want to keep these babies, so make some moves, people, and fix the issues for the professionals.

    I hadn't been experiencing the problems you have, but I have been having issues, and yes it absolutely is a case of immature drivers. When the card was released in December, nVidia merely did a simple patch job on the 256.01.00f03 driver that shipped with 10.6.5 rather than include an optimized driver that was comparable to the 259.x driver available on the Windows platform at the time.
    Since then, nVidia's engineers have been hard at work doing what appears to be nothing for the Mac. On the Windows side, the Quadro drivers have progressed to 267.11. Rather than provide Apple with updated drivers to include with 10.6.6 or this week's 10.6.7 release, they chose to sit back and wait for the 10.6.7 release and then release their own update.
    After 4 months, their best and brightest have brought us <drumroll> driver version 256.01.00f03. To be fair, they changed it from "v5" (the patch job to enable Quadro 4000 compatibility) to "v6". The idea was that it would add compatibility for the Quadro 4000 running under 10.6.7. Sadly for NVIDIA's Quadro engineers, that driver's installer didn't actually work. It took them nearly a full day to fix that, finally releasing 256.01.00f03v7. As expected, there are no improvements in either performance or stability. In fact, what happened to me is that the new driver actually broke compatibility with Adobe Premiere Pro CS5's Mercury Playback Engine GPU acceleration feature.
    Fortunately, I still had my GTX-285 card available, and this evening I pulled the Quadro and re-installed the older GTX card. I really wish nVidia would care enough to release a solid driver update, I really want to like the Quadro 4000. On paper the potential for video production and OpenGL rendering performance should be huge.

  • NVIDIA K2000M Premiere Pro issues

    Hi,
    I have an MSI GT60 0NG-294US, which comes with the NVIDIA Quadro K2000M graphics card. Adobe lists this as a compatible card for CUDA acceleration in Premiere Pro. However, the option in the Premiere Pro settings is greyed out for me and only allows the software-only Mercury Playback Engine.
    What am I doing wrong? Help!
    Thank you!!

    What is your exact device driver version?
    How to determine your nVidia driver version
    Right click in any open area of the Windows desktop
    Select NVIDIA Control Panel from the popup window
    Select SYSTEM INFORMATION at the lower left corner
    Driver version should be the 1st line in Details
    I have a GTX285 using driver 296.10 and Win7

  • VDPAU and full-screen YouTube problems on NVIDIA Quadro NVS 140M

    I have an NVIDIA NVS 140M on 64-bit Arch. I am trying to find the perfect configuration for my ThinkPad, with no luck.
    I used to have the nouveau driver; now I am using nvidia-173xx. The problems are the following:
    1) YouTube videos:
    - with nouveau I was not able to watch any video (it was like a slideshow)
    - with nvidia everything works fine.
    I'd prefer nouveau over nvidia, because of KMS.
    2) Glxgears
    As far as I remember, you can check whether acceleration is working using this tool: CPU usage should not change after glxgears is started. With both drivers (nouveau and nvidia), a running glxgears process uses 100% CPU.
    3) vdpau
    According to this thread, my card supports VDPAU. I cannot make it work.
    Additional information:
    thinkpad ~ $ pacman -Qs nvidia
    local/lib32-libvdpau 0.4.1-3
    Nvidia VDPAU library (32-bit)
    local/libvdpau 0.4.1-1
    Nvidia VDPAU library
    local/nvidia-173xx 173.14.28-3
    NVIDIA drivers for kernel26, 173xx branch.
    local/nvidia-173xx-utils 173.14.28-1
    NVIDIA drivers utilities and libraries, 173xx branch.
    thinkpad ~ $ vainfo
    libva: libva version 0.31.1
    Xlib: extension "XFree86-DRI" missing on display ":0.0".
    libva: va_getDriverName() returns 0
    libva: Trying to open /usr/lib/dri/nvidia_drv_video.so
    Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory
    libva error: /usr/lib/dri/nvidia_drv_video.so init failed
    libva: va_openDriver() returns -1
    vaInitialize failed with error code -1 (unknown libva error),exit
    thinkpad ~ $ cat /etc/X11/xorg.conf.d/20-nvidia.conf
    Section "Module"
    Load "glx"
    Disable "dri"
    Disable "dri2"
    EndSection
    Section "Device"
    Identifier "Default nvidia Device"
    Driver "nvidia"
    Option "NoLogo" "True"
    EndSection
    thinkpad ~ $ glxinfo
    name of display: :0.0
    display: :0 screen: 0
    direct rendering: Yes
    server glx vendor string: NVIDIA Corporation
    server glx version string: 1.4
    server glx extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer,
    GLX_ARB_fbconfig_float, GLX_EXT_framebuffer_sRGB
    client glx vendor string: NVIDIA Corporation
    client glx version string: 1.4
    client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info,
    GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync,
    GLX_NV_swap_group, GLX_NV_video_out, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGI_swap_control, GLX_NV_float_buffer, GLX_ARB_fbconfig_float,
    GLX_EXT_fbconfig_packed_float, GLX_EXT_texture_from_pixmap,
    GLX_EXT_framebuffer_sRGB, GLX_NV_present_video
    GLX version: 1.3
    GLX extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer,
    GLX_ARB_fbconfig_float, GLX_EXT_framebuffer_sRGB,
    GLX_ARB_get_proc_address
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: Quadro NVS 140M/PCI/SSE2
    OpenGL version string: 2.1.2 NVIDIA 173.14.28
    OpenGL shading language version string: 1.20 NVIDIA via Cg compiler
    OpenGL extensions:
    GL_ARB_color_buffer_float, GL_ARB_depth_texture, GL_ARB_draw_buffers,
    GL_ARB_fragment_program, GL_ARB_fragment_program_shadow,
    thinkpad ~ $ cat /var/log/Xorg.0.log
    [ 2830.098]
    X.Org X Server 1.9.4
    Release Date: 2011-02-04
    [ 2830.098] X Protocol Version 11, Revision 0
    [ 2830.099] Build Operating System: Linux 2.6.37-ARCH x86_64
    [ 2830.099] Current Operating System: Linux thinkpad 2.6.37-ARCH #1 SMP PREEMPT Tue Mar 8 08:34:35 CET 2011 x86_64
    [ 2830.099] Kernel command line: root=/dev/sda3 ro
    [ 2830.100] Build Date: 04 February 2011 09:38:18PM
    [ 2830.100]
    [ 2830.100] Current version of pixman: 0.20.2
    [ 2830.101] Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    [ 2830.101] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [ 2830.103] (==) Log file: "/var/log/Xorg.0.log", Time: Sat Mar 12 21:24:44 2011
    [ 2830.104] (==) Using config directory: "/etc/X11/xorg.conf.d"
    [ 2830.104] (==) No Layout section. Using the first Screen section.
    [ 2830.104] (==) No screen section available. Using defaults.
    [ 2830.104] (**) |-->Screen "Default Screen Section" (0)
    [ 2830.104] (**) | |-->Monitor "<default monitor>"
    [ 2830.105] (==) No device specified for screen "Default Screen Section".
    Using the first device section listed.
    [ 2830.105] (**) | |-->Device "Default nvidia Device"
    [ 2830.105] (==) No monitor specified for screen "Default Screen Section".
    Using a default monitor configuration.
    [ 2830.105] (==) Automatically adding devices
    [ 2830.105] (==) Automatically enabling devices
    [ 2830.105] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
    [ 2830.105] Entry deleted from font path.
    [ 2830.105] (==) FontPath set to:
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/,
    /usr/share/fonts/Type1/,
    /usr/share/fonts/100dpi/,
    /usr/share/fonts/75dpi/
    [ 2830.105] (==) ModulePath set to "/usr/lib/xorg/modules"
    [ 2830.105] (II) The server relies on udev to provide the list of input devices.
    If no devices become available, reconfigure udev or disable AutoAddDevices.
    [ 2830.105] (II) Loader magic: 0x7d3b20
    [ 2830.105] (II) Module ABI versions:
    [ 2830.105] X.Org ANSI C Emulation: 0.4
    [ 2830.105] X.Org Video Driver: 8.0
    [ 2830.105] X.Org XInput driver : 11.0
    [ 2830.105] X.Org Server Extension : 4.0
    [ 2830.108] (--) PCI:*(0:1:0:0) 10de:0429:17aa:20d8 rev 161, Mem @ 0xd6000000/16777216, 0xe0000000/268435456, 0xd4000000/33554432, I/O @ 0x00002000/128
    [ 2830.108] (II) Open ACPI successful (/var/run/acpid.socket)
    [ 2830.108] (WW) "dri" will not be loaded unless you've specified it to be loaded elsewhere.
    [ 2830.108] (WW) "dri2" will not be loaded unless you've specified it to be loaded elsewhere.
    [ 2830.108] (II) "extmod" will be loaded by default.
    [ 2830.108] (II) "dbe" will be loaded by default.
    [ 2830.108] (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
    [ 2830.108] (II) "record" will be loaded by default.
    [ 2830.108] (II) "dri" will be loaded even though the default is to disable it.
    [ 2830.108] (II) "dri2" will be loaded even though the default is to disable it.
    [ 2830.108] (II) LoadModule: "glx"
    [ 2830.109] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
    [ 2830.118] (II) Module glx: vendor="NVIDIA Corporation"
    [ 2830.118] compiled for 4.0.2, module version = 1.0.0
    [ 2830.118] Module class: X.Org Server Extension
    [ 2830.118] (II) NVIDIA GLX Module 173.14.28 Wed Sep 29 10:19:01 PDT 2010
    [ 2830.118] (II) Loading extension GLX
    [ 2830.118] (II) LoadModule: "extmod"
    [ 2830.118] (II) Loading /usr/lib/xorg/modules/extensions/libextmod.so
    [ 2830.118] (II) Module extmod: vendor="X.Org Foundation"
    [ 2830.118] compiled for 1.9.4, module version = 1.0.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension MIT-SCREEN-SAVER
    [ 2830.119] (II) Loading extension XFree86-VidModeExtension
    [ 2830.119] (II) Loading extension XFree86-DGA
    [ 2830.119] (II) Loading extension DPMS
    [ 2830.119] (II) Loading extension XVideo
    [ 2830.119] (II) Loading extension XVideo-MotionCompensation
    [ 2830.119] (II) Loading extension X-Resource
    [ 2830.119] (II) LoadModule: "dbe"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/extensions/libdbe.so
    [ 2830.119] (II) Module dbe: vendor="X.Org Foundation"
    [ 2830.119] compiled for 1.9.4, module version = 1.0.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension DOUBLE-BUFFER
    [ 2830.119] (II) LoadModule: "record"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/extensions/librecord.so
    [ 2830.119] (II) Module record: vendor="X.Org Foundation"
    [ 2830.119] compiled for 1.9.4, module version = 1.13.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension RECORD
    [ 2830.119] (II) LoadModule: "nvidia"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [ 2830.120] (II) Module nvidia: vendor="NVIDIA Corporation"
    [ 2830.120] compiled for 4.0.2, module version = 1.0.0
    [ 2830.120] Module class: X.Org Video Driver
    [ 2830.120] (II) NVIDIA dlloader X Driver 173.14.28 Wed Sep 29 10:00:06 PDT 2010
    [ 2830.120] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [ 2830.120] (--) using VT number 7
    [ 2830.124] (II) Loading sub module "fb"
    [ 2830.124] (II) LoadModule: "fb"
    [ 2830.124] (II) Loading /usr/lib/xorg/modules/libfb.so
    [ 2830.124] (II) Module fb: vendor="X.Org Foundation"
    [ 2830.124] compiled for 1.9.4, module version = 1.0.0
    [ 2830.124] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 2830.124] (II) Loading sub module "wfb"
    [ 2830.125] (II) LoadModule: "wfb"
    [ 2830.125] (II) Loading /usr/lib/xorg/modules/libwfb.so
    [ 2830.125] (II) Module wfb: vendor="X.Org Foundation"
    [ 2830.125] compiled for 1.9.4, module version = 1.0.0
    [ 2830.125] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 2830.125] (II) Loading sub module "ramdac"
    [ 2830.125] (II) LoadModule: "ramdac"
    [ 2830.125] (II) Module "ramdac" already built-in
    [ 2830.125] (II) NVIDIA(0): Creating default Display subsection in Screen section
    "Default Screen Section" for depth/fbbpp 24/32
    [ 2830.125] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
    [ 2830.125] (==) NVIDIA(0): RGB weight 888
    [ 2830.125] (==) NVIDIA(0): Default visual is TrueColor
    [ 2830.125] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    [ 2830.125] (**) NVIDIA(0): Option "NoLogo" "True"
    [ 2830.125] (**) NVIDIA(0): Enabling RENDER acceleration
    [ 2830.125] (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    [ 2830.126] (II) NVIDIA(0): enabled.
    [ 2834.780] (II) NVIDIA(0): NVIDIA GPU Quadro NVS 140M (G86GL) at PCI:1:0:0 (GPU-0)
    [ 2834.780] (--) NVIDIA(0): Memory: 524288 kBytes
    [ 2834.780] (--) NVIDIA(0): VideoBIOS: 60.86.3e.00.00
    [ 2834.780] (II) NVIDIA(0): Detected PCI Express Link width: 16X
    [ 2834.780] (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    [ 2834.780] (--) NVIDIA(0): Connected display device(s) on Quadro NVS 140M at PCI:1:0:0:
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0)
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0): 330.0 MHz maximum pixel clock
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0): Internal Dual Link LVDS
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): HorizSync range (42.088-49.305 kHz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's HorizSync (32.9 kHz); ignoring HorizSync check for
    [ 2834.785] (WW) NVIDIA(0): mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): VertRefresh range (52.000-60.000 Hz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's VertRefresh (39.9 Hz); ignoring VertRefresh check
    [ 2834.785] (WW) NVIDIA(0): for mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): HorizSync range (42.088-49.305 kHz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's HorizSync (32.9 kHz); ignoring HorizSync check for
    [ 2834.785] (WW) NVIDIA(0): mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): VertRefresh range (52.000-60.000 Hz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's VertRefresh (39.9 Hz); ignoring VertRefresh check
    [ 2834.785] (WW) NVIDIA(0): for mode "1280x800".
    [ 2834.787] (II) NVIDIA(0): Assigned Display Device: DFP-0
    [ 2834.787] (==) NVIDIA(0):
    [ 2834.787] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
    [ 2834.787] (==) NVIDIA(0): will be used as the requested mode.
    [ 2834.787] (==) NVIDIA(0):
    [ 2834.787] (II) NVIDIA(0): Validated modes:
    [ 2834.787] (II) NVIDIA(0): "nvidia-auto-select"
    [ 2834.787] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 800
    [ 2836.069] (--) NVIDIA(0): DPI set to (98, 96); computed from "UseEdidDpi" X config
    [ 2836.069] (--) NVIDIA(0): option
    [ 2836.069] (==) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
    [ 2836.069] (--) Depth 24 pixmap format is 32 bpp
    [ 2836.074] (II) NVIDIA(0): Initialized GPU GART.
    [ 2836.084] (II) NVIDIA(0): Setting mode "nvidia-auto-select"
    [ 2836.968] (II) Loading extension NV-GLX
    [ 2837.035] (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
    [ 2837.037] (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
    [ 2837.037] (==) NVIDIA(0): Backing store disabled
    [ 2837.037] (==) NVIDIA(0): Silken mouse enabled
    [ 2837.040] (==) NVIDIA(0): DPMS enabled
    [ 2837.040] (II) Loading extension NV-CONTROL
    [ 2837.041] (==) RandR enabled
    [ 2837.041] (II) Initializing built-in extension Generic Event Extension
    [ 2837.041] (II) Initializing built-in extension SHAPE
    [ 2837.041] (II) Initializing built-in extension MIT-SHM
    [ 2837.041] (II) Initializing built-in extension XInputExtension
    [ 2837.041] (II) Initializing built-in extension XTEST
    [ 2837.041] (II) Initializing built-in extension BIG-REQUESTS
    [ 2837.041] (II) Initializing built-in extension SYNC
    [ 2837.041] (II) Initializing built-in extension XKEYBOARD
    [ 2837.041] (II) Initializing built-in extension XC-MISC
    [ 2837.041] (II) Initializing built-in extension SECURITY
    [ 2837.041] (II) Initializing built-in extension XINERAMA
    [ 2837.042] (II) Initializing built-in extension XFIXES
    [ 2837.042] (II) Initializing built-in extension RENDER
    [ 2837.042] (II) Initializing built-in extension RANDR
    [ 2837.042] (II) Initializing built-in extension COMPOSITE
    [ 2837.042] (II) Initializing built-in extension DAMAGE
    [ 2837.042] (II) Initializing extension GLX
    [ 2837.204] (II) config/udev: Adding input device Power Button (/dev/input/event4)
    [ 2837.204] (**) Power Button: Applying InputClass "evdev keyboard catchall"
    [ 2837.204] (II) LoadModule: "evdev"
    [ 2837.204] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
    [ 2837.205] (II) Module evdev: vendor="X.Org Foundation"
    [ 2837.205] compiled for 1.9.4, module version = 2.6.0
    [ 2837.205] Module class: X.Org XInput Driver
    [ 2837.205] ABI class: X.Org XInput driver, version 11.0
    [ 2837.205] (**) Power Button: always reports core events
    [ 2837.205] (**) Power Button: Device: "/dev/input/event4"
    [ 2837.216] (--) Power Button: Found keys
    [ 2837.216] (II) Power Button: Configuring as keyboard
    [ 2837.216] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD)
    [ 2837.216] (**) Option "xkb_rules" "evdev"
    [ 2837.216] (**) Option "xkb_model" "evdev"
    [ 2837.216] (**) Option "xkb_layout" "us"
    [ 2837.264] (II) config/udev: Adding input device Video Bus (/dev/input/event3)
    [ 2837.264] (**) Video Bus: Applying InputClass "evdev keyboard catchall"
    [ 2837.264] (**) Video Bus: always reports core events
    [ 2837.264] (**) Video Bus: Device: "/dev/input/event3"
    [ 2837.279] (--) Video Bus: Found keys
    [ 2837.279] (II) Video Bus: Configuring as keyboard
    [ 2837.279] (II) XINPUT: Adding extended input device "Video Bus" (type: KEYBOARD)
    [ 2837.279] (**) Option "xkb_rules" "evdev"
    [ 2837.280] (**) Option "xkb_model" "evdev"
    [ 2837.280] (**) Option "xkb_layout" "us"
    [ 2837.284] (II) config/udev: Adding input device Lid Switch (/dev/input/event1)
    [ 2837.284] (II) No input driver/identifier specified (ignoring)
    [ 2837.284] (II) config/udev: Adding input device Sleep Button (/dev/input/event2)
    [ 2837.284] (**) Sleep Button: Applying InputClass "evdev keyboard catchall"
    [ 2837.284] (**) Sleep Button: always reports core events
    [ 2837.284] (**) Sleep Button: Device: "/dev/input/event2"
    [ 2837.306] (--) Sleep Button: Found keys
    [ 2837.306] (II) Sleep Button: Configuring as keyboard
    [ 2837.306] (II) XINPUT: Adding extended input device "Sleep Button" (type: KEYBOARD)
    [ 2837.306] (**) Option "xkb_rules" "evdev"
    [ 2837.306] (**) Option "xkb_model" "evdev"
    [ 2837.306] (**) Option "xkb_layout" "us"
    [ 2837.312] (II) config/udev: Adding input device Logitech USB Laser Mouse (/dev/input/event9)
    [ 2837.312] (**) Logitech USB Laser Mouse: Applying InputClass "evdev pointer catchall"
    [ 2837.312] (**) Logitech USB Laser Mouse: always reports core events
    [ 2837.312] (**) Logitech USB Laser Mouse: Device: "/dev/input/event9"
    [ 2837.333] (--) Logitech USB Laser Mouse: Found 12 mouse buttons
    [ 2837.333] (--) Logitech USB Laser Mouse: Found scroll wheel(s)
    [ 2837.333] (--) Logitech USB Laser Mouse: Found relative axes
    [ 2837.333] (--) Logitech USB Laser Mouse: Found x and y relative axes
    [ 2837.333] (II) Logitech USB Laser Mouse: Configuring as mouse
    [ 2837.333] (II) Logitech USB Laser Mouse: Adding scrollwheel support
    [ 2837.333] (**) Logitech USB Laser Mouse: YAxisMapping: buttons 4 and 5
    [ 2837.333] (**) Logitech USB Laser Mouse: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    [ 2837.333] (II) XINPUT: Adding extended input device "Logitech USB Laser Mouse" (type: MOUSE)
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) keeping acceleration scheme 1
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration profile 0
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration factor: 2.000
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration threshold: 4
    [ 2837.333] (II) Logitech USB Laser Mouse: initialized for relative axes.
    [ 2837.334] (II) config/udev: Adding input device Logitech USB Laser Mouse (/dev/input/mouse1)
    [ 2837.334] (II) No input driver/identifier specified (ignoring)
    [ 2837.335] (II) config/udev: Adding input device HDA Digital PCBeep (/dev/input/event7)
    [ 2837.335] (II) No input driver/identifier specified (ignoring)
    [ 2837.342] (II) config/udev: Adding input device Integrated Camera (/dev/input/event8)
    [ 2837.342] (**) Integrated Camera: Applying InputClass "evdev keyboard catchall"
    [ 2837.342] (**) Integrated Camera: always reports core events
    [ 2837.342] (**) Integrated Camera: Device: "/dev/input/event8"
    [ 2837.373] (--) Integrated Camera: Found keys
    [ 2837.373] (II) Integrated Camera: Configuring as keyboard
    [ 2837.373] (II) XINPUT: Adding extended input device "Integrated Camera" (type: KEYBOARD)
    [ 2837.373] (**) Option "xkb_rules" "evdev"
    [ 2837.373] (**) Option "xkb_model" "evdev"
    [ 2837.373] (**) Option "xkb_layout" "us"
    [ 2837.382] (II) config/udev: Adding input device AT Translated Set 2 keyboard (/dev/input/event0)
    [ 2837.382] (**) AT Translated Set 2 keyboard: Applying InputClass "evdev keyboard catchall"
    [ 2837.382] (**) AT Translated Set 2 keyboard: always reports core events
    [ 2837.382] (**) AT Translated Set 2 keyboard: Device: "/dev/input/event0"
    [ 2837.403] (--) AT Translated Set 2 keyboard: Found keys
    [ 2837.403] (II) AT Translated Set 2 keyboard: Configuring as keyboard
    [ 2837.403] (II) XINPUT: Adding extended input device "AT Translated Set 2 keyboard" (type: KEYBOARD)
    [ 2837.403] (**) Option "xkb_rules" "evdev"
    [ 2837.403] (**) Option "xkb_model" "evdev"
    [ 2837.403] (**) Option "xkb_layout" "us"
    [ 2837.404] (II) config/udev: Adding input device TPPS/2 IBM TrackPoint (/dev/input/event6)
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: Applying InputClass "evdev pointer catchall"
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: always reports core events
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: Device: "/dev/input/event6"
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found 3 mouse buttons
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found relative axes
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found x and y relative axes
    [ 2837.419] (II) TPPS/2 IBM TrackPoint: Configuring as mouse
    [ 2837.419] (**) TPPS/2 IBM TrackPoint: YAxisMapping: buttons 4 and 5
    [ 2837.419] (**) TPPS/2 IBM TrackPoint: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    [ 2837.419] (II) XINPUT: Adding extended input device "TPPS/2 IBM TrackPoint" (type: MOUSE)
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) keeping acceleration scheme 1
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration profile 0
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration factor: 2.000
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration threshold: 4
    [ 2837.420] (II) TPPS/2 IBM TrackPoint: initialized for relative axes.
    [ 2837.420] (II) config/udev: Adding input device TPPS/2 IBM TrackPoint (/dev/input/mouse0)
    [ 2837.420] (II) No input driver/identifier specified (ignoring)
    [ 2837.422] (II) config/udev: Adding input device ThinkPad Extra Buttons (/dev/input/event5)
    [ 2837.422] (**) ThinkPad Extra Buttons: Applying InputClass "evdev keyboard catchall"
    [ 2837.422] (**) ThinkPad Extra Buttons: always reports core events
    [ 2837.422] (**) ThinkPad Extra Buttons: Device: "/dev/input/event5"
    [ 2837.446] (--) ThinkPad Extra Buttons: Found keys
    [ 2837.446] (II) ThinkPad Extra Buttons: Configuring as keyboard
    [ 2837.446] (II) XINPUT: Adding extended input device "ThinkPad Extra Buttons" (type: KEYBOARD)
    [ 2837.446] (**) Option "xkb_rules" "evdev"
    [ 2837.446] (**) Option "xkb_model" "evdev"
    [ 2837.446] (**) Option "xkb_layout" "us"

    lukaszg wrote: GlxGears still uses 100% CPU - is it OK?
    Should I disable both dri and dri2 in xorg.conf for the nvidia drivers?
    Dunno. Comment them out, restart X and see what happens.
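Commented out, the Module section of the 20-nvidia.conf quoted above would look like this (a sketch; only the two Disable lines change):

```
Section "Module"
    Load "glx"
#   Disable "dri"
#   Disable "dri2"
EndSection
```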
    Last edited by skunktrader (2011-03-13 16:01:02)
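The vainfo output also says the VDPAU backend library libvdpau_nvidia.so cannot be opened. A minimal check for it (a sketch; these are the usual library locations, and the legacy 173xx driver may simply not ship this file at all):

```shell
# Look for the NVIDIA VDPAU backend that vainfo failed to open.
backend="not found"
for f in /usr/lib/libvdpau_nvidia.so* /usr/lib/vdpau/libvdpau_nvidia.so*; do
    [ -e "$f" ] && backend="$f"
done
echo "libvdpau_nvidia backend: $backend"
```

If this prints "not found", no amount of xorg.conf tweaking will make vdpau work with the installed driver.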

  • [SOLVED] nVidia 87.76 Drivers installer fails to build kernel module

    Hey,
    I have an nVidia GeForce2 Integrated graphics card on my Asus A7N266 motherboard. According to the nVidia site, http://www.nvidia.com/object/IO_32667.html, the most recent driver series that supports it is 96.xx. Yet as shown here, http://www.nvnews.net/vbulletin/showthread.php?t=87332, many people have found that the 96.xx series drivers cause graphical corruption when used with a GeForce2 IGP.
    Before coming to Arch, I used Xubuntu 7.04 and managed to compile the last known working drivers, version 87.76, following the instructions here: http://kmandla.wordpress.com/2007/03/25 … 20-12-386/
    That all went well, but now I'm using Arch. I tried the nvidia-96.xx driver from the repos just for testing's sake, but I still got the same graphical corruption.
    Thus I have been trying to install the 87.76 drivers on Arch, but the installation always fails at the kernel-module build stage. I first applied this patch, http://www.nvnews.net/vbulletin/showthr … ost1086669, before compiling, as the plain driver won't compile against recent kernels.
    Here is the output from the installer:
    nvidia-installer log file '/var/log/nvidia-installer.log'
    creation time: Sun Aug 5 11:57:53 2007
    option status:
    license pre-accepted : false
    update : false
    force update : false
    expert : false
    uninstall : false
    driver info : false
    precompiled interfaces : true
    no ncurses color : false
    query latest version : false
    OpenGL header files : true
    no questions : false
    silent : false
    no recursion : false
    no backup : false
    kernel module only : false
    sanity : false
    add this kernel : false
    no runlevel check : false
    no network : false
    no ABI note : false
    no RPMs : false
    no kernel module : false
    force SELinux : default
    force tls : (not specified)
    X install prefix : (not specified)
    X library install path : (not specified)
    X module install path : (not specified)
    OpenGL install prefix : (not specified)
    OpenGL install libdir : (not specified)
    utility install prefix : (not specified)
    utility install libdir : (not specified)
    doc install prefix : (not specified)
    kernel name : (not specified)
    kernel include path : (not specified)
    kernel source path : (not specified)
    kernel output path : (not specified)
    kernel install path : (not specified)
    proc mount point : /proc
    ui : (not specified)
    tmpdir : /tmp
    ftp mirror : ftp://download.nvidia.com
    RPM file list : (not specified)
    Using: nvidia-installer ncurses user interface
    -> License accepted.
    -> No precompiled kernel interface was found to match your kernel; would you li
    ke the installer to attempt to download a kernel interface for your kernel f
    rom the NVIDIA ftp site (ftp://download.nvidia.com)? (Answer: Yes)
    -> No matching precompiled kernel interface was found on the NVIDIA ftp site;
    this means that the installer will need to compile a kernel interface for
    your kernel.
    -> Performing CC sanity check with CC="cc".
    -> Performing CC version check with CC="cc".
    -> Kernel source path: '/lib/modules/2.6.22-ARCH/build'
    -> Kernel output path: '/lib/modules/2.6.22-ARCH/build'
    -> Performing rivafb check.
    -> Performing nvidiafb check.
    -> Cleaning kernel module build directory.
    executing: 'cd ./usr/src/nv; make clean'...
    rm -f -f nv.o nv-vm.o os-agp.o os-interface.o os-registry.o nv-i2c.o nv.o nv
    -vm.o os-agp.o os-interface.o os-registry.o nv-i2c.o nvidia.mod.o
    rm -f -f build-in.o nv-linux.o *.d .*.{cmd,flags}
    rm -f -f nvidia.{o,ko,mod.{o,c}} nv_compiler.h *~
    rm -f -f stprof stprof.o symtab.h
    rm -f -rf .tmp_versions
    -> Building kernel module:
    executing: 'cd ./usr/src/nv; make module SYSSRC=/lib/modules/2.6.22-ARCH/bui
    ld SYSOUT=/lib/modules/2.6.22-ARCH/build'...
    NVIDIA: calling KBUILD...
    make CC=cc KBUILD_VERBOSE=1 -C /lib/modules/2.6.22-ARCH/build SUBDIRS=/home
    /kris/Source/nVidia GLX 87.76 Driver/NVIDIA-Linux-x86-1.0-8776-pkg1/usr/src/
    nv modules
    test -e include/linux/autoconf.h -a -e include/config/auto.conf || ( \
    echo; \
    echo " ERROR: Kernel configuration is invalid."; \
    echo " include/linux/autoconf.h or include/config/auto.conf are mis
    sing."; \
    echo " Run 'make oldconfig && make prepare' on kernel src to fix it
    echo; \
    /bin/false)
    make[2]: *** No rule to make target `GLX'. Stop.
    NVIDIA: left KBUILD.
    nvidia.ko failed to build!
    make[1]: *** [mdl] Error 1
    make: *** [module] Error 2
    -> Error.
    ERROR: Unable to build the NVIDIA kernel module.
    ERROR: Installation has failed. Please see the file
    '/var/log/nvidia-installer.log' for details. You may find suggestions
    on fixing installation problems in the README available on the Linux
    driver download page at www.nvidia.com.
    Any ideas as to how I can get the kernel module to build?
    I am using 'kernel26 2.6.22.1-4' with the 'kernel-headers 2.6.22.1-1' from the testing repo.
    Last edited by Nameless One (2007-08-10 07:08:38)

    make CC=cc KBUILD_VERBOSE=1 -C /lib/modules/2.6.22-ARCH/build SUBDIRS=/home/kris/Source/nVidia GLX 87.76 Driver/NVIDIA-Linux-x86-1.0-8776-pkg1/usr/src/
    The name of the directory you placed the driver in contains spaces. That is why you get the "no rule to make target GLX" error.
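A quick way to see the damage the spaces do (the path is the one from your log; unquoted, the shell splits it into four separate make arguments):

```shell
# Count how many arguments the path expands to, unquoted vs. quoted.
path="/home/kris/Source/nVidia GLX 87.76 Driver"
unquoted=$(printf '%s\n' SUBDIRS=$path | wc -l)
quoted=$(printf '%s\n' "SUBDIRS=$path" | wc -l)
echo "unquoted: $unquoted arguments; quoted: $quoted argument"
```

Renaming or moving the extracted driver to a path without spaces avoids the problem entirely.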
    It would be better not to circumvent pacman. I am using the following PKGBUILDs for the 8776 driver:
    nvidia-8776:
    pkgname=nvidia-8776
    pkgver=1.0.8776
    _nver=1.0-8776
    _kernver='2.6.22-ARCH'
    pkgrel=1
    pkgdesc="NVIDIA drivers for Arch kernel."
    arch=(i686 x86_64)
    [ "$CARCH" = "i686" ] && ARCH=x86
    [ "$CARCH" = "x86_64" ] && ARCH=x86_64
    url="http://www.nvidia.com/"
    depends=(kernel26 nvidia-8776-utils)
    conflicts=('nvidia' 'nvidia-96xx' 'nvidia-71xx' 'nvidia-legacy')
    install=nvidia.install
    source=(http://download.nvidia.com/XFree86/Linux-$ARCH/${_nver}/NVIDIA-Linux-$ARCH-${_nver}-pkg0.run NVIDIA_kernel-1.0-8776-20061203.diff.txt)
    md5sums=('93ad45fe7b974a5a80348e1890f9b7c9' '70e669f06ee4881c2583261672de292a')
    [ "$CARCH" = "x86_64" ] && md5sums=('f5340e4bbce811add994b1685cdea03b' '70e669f06ee4881c2583261672de292a')
    build() {
      # Extract
      cd $startdir/src/
      sh NVIDIA-Linux-$ARCH-${_nver}-pkg0.run --extract-only
      cd NVIDIA-Linux-$ARCH-${_nver}-pkg0
      # Any extra patches are applied in here...
      patch -p0 < $startdir/NVIDIA_kernel-1.0-8776-20061203.diff.txt || return 1
      cd usr/src/nv/
      ln -s Makefile.kbuild Makefile
      make SYSSRC=/lib/modules/$_kernver/build module || return 1
      # install kernel module
      mkdir -p $startdir/pkg/lib/modules/${_kernver}/kernel/drivers/video/
      install -m644 nvidia.ko $startdir/pkg/lib/modules/${_kernver}/kernel/drivers/video/
      sed -i -e "s/KERNEL_VERSION='.*'/KERNEL_VERSION='${_kernver}'/" $startdir/*.install
    }
    Place the patch (NVIDIA_kernel-1.0-8776-20061203.diff.txt) and nvidia.install in the same directory as the PKGBUILD.
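As an aside, the sed line at the end of build() pins the kernel version inside nvidia.install. A minimal reproduction on a stand-in file (the file name and old version string here are hypothetical) looks like:

```shell
# Create a stand-in install file with an old version string,
# then rewrite it the same way the PKGBUILD does.
printf "KERNEL_VERSION='2.6.21-ARCH'\n" > nvidia.install.demo
sed -i -e "s/KERNEL_VERSION='.*'/KERNEL_VERSION='2.6.22-ARCH'/" nvidia.install.demo
cat nvidia.install.demo
```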
    nvidia-utils:
    pkgname=nvidia-8776-utils
    pkgver=1.0.8776
    _nver=1.0-8776
    pkgrel=1
    pkgdesc="NVIDIA drivers utilities and libraries."
    arch=(i686 x86_64)
    [ "$CARCH" = "i686" ] && ARCH=x86
    [ "$CARCH" = "x86_64" ] && ARCH=x86_64
    url="http://www.nvidia.com/"
    depends=(xorg-server)
    conflicts=('libgl' 'libgl-dri' 'ati-fglrx-utils' 'nvidia-legacy-utils' 'nvidia-71xx-utils' 'nvidia-96xx-utils')
    provides=('libgl' )
    #install=nvidia.install
    source=(http://download.nvidia.com/XFree86/Linux-$ARCH/${_nver}/NVIDIA-Linux-$ARCH-${_nver}-pkg0.run)
    md5sums=('93ad45fe7b974a5a80348e1890f9b7c9')
    [ "$CARCH" = "x86_64" ] && md5sums=('f5340e4bbce811add994b1685cdea03b')
    build() {
      # override the nvidia install routine and do it the long way.
      cd $startdir/src/
      sh NVIDIA-Linux-${ARCH}-${_nver}-pkg0.run --extract-only
      cd NVIDIA-Linux-${ARCH}-${_nver}-pkg0/usr/
      mkdir -p $startdir/pkg/usr/{lib,bin,share/applications,share/pixmaps,man/man1}
      mkdir -p $startdir/pkg/usr/lib/xorg/modules/{extensions,drivers}
      mkdir -p $startdir/pkg/usr/share/licenses/nvidia/
      install `find lib/ -iname \*.so\*` $startdir/pkg/usr/lib/
      install lib/tls/* $startdir/pkg/usr/lib
      install share/man/man1/* $startdir/pkg/usr/man/man1/
      rm $startdir/pkg/usr/man/man1/nvidia-installer.1.gz
      install X11R6/lib/libXv* $startdir/pkg/usr/lib/
      install share/applications/nvidia-settings.desktop $startdir/pkg/usr/share/applications/
      # fix nvidia .desktop file
      sed -e 's:__UTILS_PATH__:/usr/bin:' -e 's:__PIXMAP_PATH__:/usr/share/pixmaps:' -i $startdir/pkg/usr/share/applications/nvidia-settings.desktop
      install share/pixmaps/nvidia-settings.png $startdir/pkg/usr/share/pixmaps/
      install X11R6/lib/modules/drivers/nvidia_drv.so $startdir/pkg/usr/lib/xorg/modules/drivers
      install X11R6/lib/modules/extensions/libglx.so.$pkgver $startdir/pkg/usr/lib/xorg/modules/extensions
      install -m755 bin/nvidia-{settings,xconfig,bug-report.sh} $startdir/pkg/usr/bin/
      cd $startdir/pkg/usr/lib/
      ln -s /usr/lib/libGL.so.$pkgver libGL.so
      ln -s /usr/lib/libGL.so.$pkgver libGL.so.1
      ln -s /usr/lib/libGLcore.so.$pkgver libGLcore.so.1
      ln -s /usr/lib/libnvidia-cfg.so.$pkgver libnvidia-cfg.so.1
      ln -s /usr/lib/libnvidia-tls.so.$pkgver libnvidia-tls.so.1
      cd $startdir/pkg/usr/lib/xorg/modules/extensions
      ln -s /usr/lib/xorg/modules/extensions/libglx.so.$pkgver libglx.so
      install $startdir/src/NVIDIA-Linux-${ARCH}-${_nver}-pkg0/LICENSE $startdir/pkg/usr/share/licenses/nvidia/
      find $startdir/pkg/usr -type d -exec chmod 755 {} \;
      # phew :)
    }
    Last edited by kappa (2007-08-09 11:14:42)

  • Nvidia quadro 4000 or 5000 in Mac Pro

    Hello all,
    I have a question that I already put to a sales advisor in a Mac store, who did not know the answer, and also to a technical Mac sales agent, again with no result.
    I do not own a Mac Pro yet; I am looking to buy one because it is the only system that can hold (in a stable way) both Mac OS X and Windows 7.
    I am a professional working with 3D content creation, CAD design and digital video, and I want to be able to do pro 3D content in Windows 7 and also work on pro videos in Mac OS X. The current graphics cards for the 2010 Mac Pro are intended for video games rather than professional production (this is the info posted on the AMD/ATI website when one searches for the Radeon 5770 and 5780; of course they can handle some 3D content, but not like the latest generation of Nvidia cards).
    My question is:
    Can I use a Mac Pro with one of those ATI cards installed in the first PCIe 2.0 16x slot for Mac OS X, and then install an Nvidia Quadro 4000 or 5000 in the second PCIe 2.0 16x slot for Windows 7? (given the fact that Nvidia does not even have beta drivers for Mac OS X for those cards yet)
    I'm thinking that I can use the Mac with the ATI card and then, when switching to Windows (through Boot Camp), use the Nvidia Quadro, but I am not sure whether this is possible, whether installing another card will create problems, and, if it is possible, how to tie each card to each OS.
    I ask this before buying the Mac because I am planning to make a huge investment in it and I want to be sure about this. (The specs will be 32GB RAM, 512GB SSD, 3x2TB HDD, 2x 2.93GHz 6-core Xeons.)
    Thank you all; I will be pleased to hear any comments or suggestions.

    The Quadro FX 5800 is launching at $3,499, and its spec sheet looks well furnished: 240 stream processors, fill rate of 52 billion texels per second, 128-bit precision, true 10-bit color support, 102GB/s memory bandwidth (which should translate to an 800MHz GDDR3 clock speed, if my math is right), and "interactive 4D modeling with time lapse capabilities." Nvidia says the Quadro FX 5800 should be well suited to applications like "oil and gas exploration, medical imaging, styling and design, and scientific visualization."
    http://techreport.com/discussions.x/15866
    First announced fall-winter 2008
    http://www.nvidia.com/object/productquadro_fx_5800us.html
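The poster's bandwidth arithmetic checks out, assuming the FX 5800's published 512-bit memory bus: bandwidth = bus width in bytes × memory clock × 2 transfers per clock (GDDR3 is double data rate). A quick shell check:

```shell
#!/bin/sh
# Back-of-envelope check of the 102 GB/s figure, assuming a 512-bit bus
# and GDDR3's two transfers per clock cycle.
bus_bytes=$((512 / 8))            # 64 bytes per transfer
clock_mhz=800                     # memory clock in MHz
bw_mb=$((bus_bytes * clock_mhz * 2))
echo "${bw_mb} MB/s"              # prints "102400 MB/s", i.e. ~102.4 GB/s
```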
    If you wanted to use Nvidia's Quadro CX to speed up CS5, it is Windows-only, and the Mac Pro only has TWO 6-pin aux power connectors. So even though the Quadro FX 5800's power draw is under 200W, there are no spare aux power connectors for it. You could try to add a small 450W PSU for graphics (2 x 8-pin / 2 x 6-pin), but running it from the second optical drive bay gets messy, and the fit is tight to impossible for the cables; the machine was not designed to power dual graphics cards.
    The EVGA GTX 285 came out for the Mac last year, but drivers have been a performance problem; a year later, with 10.6.4, Nvidia rushed to post a patch, but there has been no word since the GTX 285 went EOL. What's up with that?
