NVIDIA Issues; hitting the video card hangs the machine

Hello everybody. This could easily be placed under the Kernel and Hardware forum, but this one suits my experience level better. I tried searching for similar issues to the one I'm having, but wasn't able to come up with anything (perhaps I was an unknown keyword off in my searches).
Thanks in advance for any help! I'm pretty much out of ideas short of replacing some hardware.
My problem is that whenever my desktop displays anything that is graphically intensive, it freezes. I first experienced this while I had Slackware 13.1 installed when trying to run Minecraft. I had been running it successfully for a month or two when it hung while I was playing. I switched over to Arch in the hopes that it would be easier for me to troubleshoot the problem. This was a few months ago, and I've not been able to figure it out.
I'm running with an i7 930 CPU, GTX 480 video card, and six gigs of RAM. My xorg.conf is below:
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig: version 270.41.06 ([email protected]) Mon Apr 18 15:14:00 PDT 2011
Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
EndSection
Section "Files"
EndSection
Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Unknown"
HorizSync 28.0 - 33.0
VertRefresh 43.0 - 72.0
Option "DPMS"
EndSection
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Depth 24
EndSubSection
EndSection
I dug through my logs and the only strange thing I found was in kernel.log:
Jul 3 13:56:43 localhost kernel: [ 101.297777] NVRM: GPU at 0000:03:00.0 has fallen off the bus.
For what it's worth, here is my Xorg.0.log (the only error in it is related to my keyboard, which actually works fine):
[ 28.765]
X.Org X Server 1.10.2
Release Date: 2011-05-28
[ 28.766] X Protocol Version 11, Revision 0
[ 28.766] Build Operating System: Linux 2.6.38-ARCH x86_64
[ 28.766] Current Operating System: Linux myhost 2.6.39-ARCH #1 SMP PREEMPT Mon Jun 27 21:26:22 CEST 2011 x86_64
[ 28.766] Kernel command line: root=/dev/disk/by-uuid/53882ab2-b171-4117-a8a2-6ec984ace8e2 ro
[ 28.767] Build Date: 30 May 2011 08:18:15AM
[ 28.767]
[ 28.767] Current version of pixman: 0.22.0
[ 28.767] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 28.768] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 28.770] (==) Log file: "/var/log/Xorg.0.log", Time: Sun Jul 3 14:20:49 2011
[ 28.800] (==) Using config file: "/etc/X11/xorg.conf"
[ 28.801] (==) Using config directory: "/etc/X11/xorg.conf.d"
[ 28.817] (==) ServerLayout "Layout0"
[ 28.817] (**) |-->Screen "Screen0" (0)
[ 28.817] (**) | |-->Monitor "Monitor0"
[ 28.817] (**) | |-->Device "Device0"
[ 28.817] (**) |-->Input Device "Keyboard0"
[ 28.817] (**) |-->Input Device "Mouse0"
[ 28.817] (==) Automatically adding devices
[ 28.817] (==) Automatically enabling devices
[ 28.837] (WW) The directory "/usr/share/fonts/TTF/" does not exist.
[ 28.837] Entry deleted from font path.
[ 28.837] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
[ 28.837] Entry deleted from font path.
[ 28.837] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
[ 28.837] Entry deleted from font path.
[ 28.857] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
[ 28.857] Entry deleted from font path.
[ 28.857] (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
[ 28.857] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
[ 28.857] Entry deleted from font path.
[ 28.858] (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
[ 28.858] (==) FontPath set to:
/usr/share/fonts/misc/
[ 28.858] (==) ModulePath set to "/usr/lib/xorg/modules"
[ 28.858] (WW) Hotplugging is on, devices using drivers 'kbd', 'mouse' or 'vmmouse' will be disabled.
[ 28.858] (WW) Disabling Keyboard0
[ 28.858] (WW) Disabling Mouse0
[ 28.858] (II) Loader magic: 0x7d3440
[ 28.858] (II) Module ABI versions:
[ 28.858] X.Org ANSI C Emulation: 0.4
[ 28.858] X.Org Video Driver: 10.0
[ 28.858] X.Org XInput driver : 12.2
[ 28.858] X.Org Server Extension : 5.0
[ 28.859] (--) PCI:*(0:3:0:0) 10de:06c0:10de:075f rev 163, Mem @ 0xdc000000/33554432, 0xc8000000/134217728, 0xd4000000/67108864, I/O @ 0x0000df00/128, BIOS @ 0x????????/524288
[ 28.859] (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
[ 28.859] (II) LoadModule: "extmod"
[ 28.878] (II) Loading /usr/lib/xorg/modules/extensions/libextmod.so
[ 28.889] (II) Module extmod: vendor="X.Org Foundation"
[ 28.889] compiled for 1.10.2, module version = 1.0.0
[ 28.889] Module class: X.Org Server Extension
[ 28.889] ABI class: X.Org Server Extension, version 5.0
[ 28.889] (II) Loading extension MIT-SCREEN-SAVER
[ 28.889] (II) Loading extension XFree86-VidModeExtension
[ 28.889] (II) Loading extension XFree86-DGA
[ 28.889] (II) Loading extension DPMS
[ 28.889] (II) Loading extension XVideo
[ 28.889] (II) Loading extension XVideo-MotionCompensation
[ 28.889] (II) Loading extension X-Resource
[ 28.889] (II) LoadModule: "dbe"
[ 28.889] (II) Loading /usr/lib/xorg/modules/extensions/libdbe.so
[ 28.890] (II) Module dbe: vendor="X.Org Foundation"
[ 28.890] compiled for 1.10.2, module version = 1.0.0
[ 28.890] Module class: X.Org Server Extension
[ 28.890] ABI class: X.Org Server Extension, version 5.0
[ 28.890] (II) Loading extension DOUBLE-BUFFER
[ 28.890] (II) LoadModule: "glx"
[ 28.890] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[ 29.187] (II) Module glx: vendor="NVIDIA Corporation"
[ 29.195] compiled for 4.0.2, module version = 1.0.0
[ 29.195] Module class: X.Org Server Extension
[ 29.195] (II) NVIDIA GLX Module 275.09.07 Wed Jun 8 14:34:43 PDT 2011
[ 29.195] (II) Loading extension GLX
[ 29.195] (II) LoadModule: "record"
[ 29.195] (II) Loading /usr/lib/xorg/modules/extensions/librecord.so
[ 29.196] (II) Module record: vendor="X.Org Foundation"
[ 29.196] compiled for 1.10.2, module version = 1.13.0
[ 29.196] Module class: X.Org Server Extension
[ 29.196] ABI class: X.Org Server Extension, version 5.0
[ 29.196] (II) Loading extension RECORD
[ 29.196] (II) LoadModule: "dri"
[ 29.196] (II) Loading /usr/lib/xorg/modules/extensions/libdri.so
[ 29.207] (II) Module dri: vendor="X.Org Foundation"
[ 29.207] compiled for 1.10.2, module version = 1.0.0
[ 29.207] ABI class: X.Org Server Extension, version 5.0
[ 29.207] (II) Loading extension XFree86-DRI
[ 29.207] (II) LoadModule: "dri2"
[ 29.207] (II) Loading /usr/lib/xorg/modules/extensions/libdri2.so
[ 29.207] (II) Module dri2: vendor="X.Org Foundation"
[ 29.207] compiled for 1.10.2, module version = 1.2.0
[ 29.207] ABI class: X.Org Server Extension, version 5.0
[ 29.207] (II) Loading extension DRI2
[ 29.207] (II) LoadModule: "nvidia"
[ 29.208] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[ 29.240] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 29.241] compiled for 4.0.2, module version = 1.0.0
[ 29.241] Module class: X.Org Video Driver
[ 29.246] (II) NVIDIA dlloader X Driver 275.09.07 Wed Jun 8 14:18:12 PDT 2011
[ 29.246] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 29.246] (--) using VT number 7
[ 29.255] (II) Loading sub module "fb"
[ 29.255] (II) LoadModule: "fb"
[ 29.255] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 29.262] (II) Module fb: vendor="X.Org Foundation"
[ 29.262] compiled for 1.10.2, module version = 1.0.0
[ 29.262] ABI class: X.Org ANSI C Emulation, version 0.4
[ 29.262] (II) Loading sub module "wfb"
[ 29.262] (II) LoadModule: "wfb"
[ 29.262] (II) Loading /usr/lib/xorg/modules/libwfb.so
[ 29.270] (II) Module wfb: vendor="X.Org Foundation"
[ 29.270] compiled for 1.10.2, module version = 1.0.0
[ 29.270] ABI class: X.Org ANSI C Emulation, version 0.4
[ 29.270] (II) Loading sub module "ramdac"
[ 29.270] (II) LoadModule: "ramdac"
[ 29.270] (II) Module "ramdac" already built-in
[ 29.271] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[ 29.271] (II) Loading /usr/lib/xorg/modules/libwfb.so
[ 29.271] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 29.272] (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
[ 29.272] (==) NVIDIA(0): RGB weight 888
[ 29.272] (==) NVIDIA(0): Default visual is TrueColor
[ 29.272] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[ 30.069] (II) NVIDIA(GPU-0): Display (AOC 2436 (CRT-1)) does not support NVIDIA 3D Vision
[ 30.069] (II) NVIDIA(GPU-0): stereo.
[ 30.070] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 480 (GF100) at PCI:3:0:0 (GPU-0)
[ 30.070] (--) NVIDIA(0): Memory: 1572864 kBytes
[ 30.070] (--) NVIDIA(0): VideoBIOS: 70.00.21.00.02
[ 30.070] (II) NVIDIA(0): Detected PCI Express Link width: 16X
[ 30.070] (--) NVIDIA(0): Interlaced video modes are supported on this GPU
[ 30.070] (--) NVIDIA(0): Connected display device(s) on GeForce GTX 480 at PCI:3:0:0
[ 30.070] (--) NVIDIA(0): AOC 2436 (CRT-1)
[ 30.071] (--) NVIDIA(0): AOC 2436 (CRT-1): 400.0 MHz maximum pixel clock
[ 30.109] (II) NVIDIA(0): Assigned Display Device: CRT-1
[ 30.109] (==) NVIDIA(0):
[ 30.109] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[ 30.109] (==) NVIDIA(0): will be used as the requested mode.
[ 30.109] (==) NVIDIA(0):
[ 30.109] (II) NVIDIA(0): Validated modes:
[ 30.109] (II) NVIDIA(0): "nvidia-auto-select"
[ 30.110] (II) NVIDIA(0): Virtual screen size determined to be 1920 x 1080
[ 30.134] (--) NVIDIA(0): DPI set to (93, 94); computed from "UseEdidDpi" X config
[ 30.134] (--) NVIDIA(0): option
[ 30.134] (--) Depth 24 pixmap format is 32 bpp
[ 30.134] (II) NVIDIA: Using 3072.00 MB of virtual memory for indirect memory
[ 30.134] (II) NVIDIA: access.
[ 30.144] (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
[ 30.144] (II) NVIDIA(0): may not be running or the "AcpidSocketPath" X
[ 30.144] (II) NVIDIA(0): configuration option may not be set correctly. When the
[ 30.144] (II) NVIDIA(0): ACPI event daemon is available, the NVIDIA X driver will
[ 30.144] (II) NVIDIA(0): try to use it to receive ACPI event notifications. For
[ 30.144] (II) NVIDIA(0): details, please see the "ConnectToAcpid" and
[ 30.144] (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
[ 30.144] (II) NVIDIA(0): Config Options in the README.
[ 30.147] (II) NVIDIA(0): Setting mode "nvidia-auto-select"
[ 30.208] (II) Loading extension NV-GLX
[ 30.270] (==) NVIDIA(0): Disabling shared memory pixmaps
[ 30.270] (==) NVIDIA(0): Backing store disabled
[ 30.270] (==) NVIDIA(0): Silken mouse enabled
[ 30.271] (**) NVIDIA(0): DPMS enabled
[ 30.271] (II) Loading extension NV-CONTROL
[ 30.271] (II) Loading extension XINERAMA
[ 30.271] (II) Loading sub module "dri2"
[ 30.271] (II) LoadModule: "dri2"
[ 30.271] (II) Loading /usr/lib/xorg/modules/extensions/libdri2.so
[ 30.272] (II) Module dri2: vendor="X.Org Foundation"
[ 30.272] compiled for 1.10.2, module version = 1.2.0
[ 30.272] ABI class: X.Org Server Extension, version 5.0
[ 30.272] (II) NVIDIA(0): [DRI2] Setup complete
[ 30.272] (==) RandR enabled
[ 30.272] (II) Initializing built-in extension Generic Event Extension
[ 30.272] (II) Initializing built-in extension SHAPE
[ 30.272] (II) Initializing built-in extension MIT-SHM
[ 30.272] (II) Initializing built-in extension XInputExtension
[ 30.272] (II) Initializing built-in extension XTEST
[ 30.272] (II) Initializing built-in extension BIG-REQUESTS
[ 30.272] (II) Initializing built-in extension SYNC
[ 30.272] (II) Initializing built-in extension XKEYBOARD
[ 30.272] (II) Initializing built-in extension XC-MISC
[ 30.272] (II) Initializing built-in extension SECURITY
[ 30.272] (II) Initializing built-in extension XINERAMA
[ 30.272] (II) Initializing built-in extension XFIXES
[ 30.272] (II) Initializing built-in extension RENDER
[ 30.272] (II) Initializing built-in extension RANDR
[ 30.272] (II) Initializing built-in extension COMPOSITE
[ 30.272] (II) Initializing built-in extension DAMAGE
[ 30.273] (II) Initializing extension GLX
[ 30.487] (II) config/udev: Adding input device Power Button (/dev/input/event1)
[ 30.487] (**) Power Button: Applying InputClass "evdev keyboard catchall"
[ 30.487] (II) LoadModule: "evdev"
[ 30.499] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.506] (II) Module evdev: vendor="X.Org Foundation"
[ 30.506] compiled for 1.10.0, module version = 2.6.0
[ 30.506] Module class: X.Org XInput Driver
[ 30.506] ABI class: X.Org XInput driver, version 12.2
[ 30.506] (II) Using input driver 'evdev' for 'Power Button'
[ 30.506] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.506] (**) Power Button: always reports core events
[ 30.506] (**) Power Button: Device: "/dev/input/event1"
[ 30.530] (--) Power Button: Found keys
[ 30.530] (II) Power Button: Configuring as keyboard
[ 30.530] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/LNXPWRBN:00/input/input1/event1"
[ 30.530] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD)
[ 30.530] (**) Option "xkb_rules" "evdev"
[ 30.530] (**) Option "xkb_model" "evdev"
[ 30.530] (**) Option "xkb_layout" "us"
[ 30.561] (II) config/udev: Adding input device Power Button (/dev/input/event0)
[ 30.561] (**) Power Button: Applying InputClass "evdev keyboard catchall"
[ 30.561] (II) Using input driver 'evdev' for 'Power Button'
[ 30.561] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.561] (**) Power Button: always reports core events
[ 30.561] (**) Power Button: Device: "/dev/input/event0"
[ 30.583] (--) Power Button: Found keys
[ 30.583] (II) Power Button: Configuring as keyboard
[ 30.583] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0/event0"
[ 30.583] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD)
[ 30.583] (**) Option "xkb_rules" "evdev"
[ 30.583] (**) Option "xkb_model" "evdev"
[ 30.583] (**) Option "xkb_layout" "us"
[ 30.590] (II) config/udev: Adding input device Logitech Logitech Illuminated Keyboard (/dev/input/event4)
[ 30.590] (**) Logitech Logitech Illuminated Keyboard: Applying InputClass "evdev keyboard catchall"
[ 30.590] (II) Using input driver 'evdev' for 'Logitech Logitech Illuminated Keyboard'
[ 30.591] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.591] (**) Logitech Logitech Illuminated Keyboard: always reports core events
[ 30.591] (**) Logitech Logitech Illuminated Keyboard: Device: "/dev/input/event4"
[ 30.623] (--) Logitech Logitech Illuminated Keyboard: Found keys
[ 30.623] (II) Logitech Logitech Illuminated Keyboard: Configuring as keyboard
[ 30.623] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.1/usb6/6-1/6-1:1.0/input/input4/event4"
[ 30.623] (II) XINPUT: Adding extended input device "Logitech Logitech Illuminated Keyboard" (type: KEYBOARD)
[ 30.623] (**) Option "xkb_rules" "evdev"
[ 30.624] (II) config/udev: Adding input device Logitech Logitech Illuminated Keyboard (/dev/input/event5)
[ 30.624] (**) Logitech Logitech Illuminated Keyboard: Applying InputClass "evdev keyboard catchall"
[ 30.624] (II) Using input driver 'evdev' for 'Logitech Logitech Illuminated Keyboard'
[ 30.624] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.624] (**) Logitech Logitech Illuminated Keyboard: always reports core events
[ 30.624] (**) Logitech Logitech Illuminated Keyboard: Device: "/dev/input/event5"
[ 30.650] (--) Logitech Logitech Illuminated Keyboard: Found 1 mouse buttons
[ 30.650] (--) Logitech Logitech Illuminated Keyboard: Found scroll wheel(s)
[ 30.650] (--) Logitech Logitech Illuminated Keyboard: Found relative axes
[ 30.650] (--) Logitech Logitech Illuminated Keyboard: Found absolute axes
[ 30.650] (--) Logitech Logitech Illuminated Keyboard: Found keys
[ 30.650] (II) Logitech Logitech Illuminated Keyboard: Configuring as mouse
[ 30.650] (II) Logitech Logitech Illuminated Keyboard: Configuring as keyboard
[ 30.650] (II) Logitech Logitech Illuminated Keyboard: Adding scrollwheel support
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: YAxisMapping: buttons 4 and 5
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
[ 30.650] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.1/usb6/6-1/6-1:1.1/input/input5/event5"
[ 30.650] (II) XINPUT: Adding extended input device "Logitech Logitech Illuminated Keyboard" (type: KEYBOARD)
[ 30.650] (**) Option "xkb_rules" "evdev"
[ 30.650] (**) Option "xkb_model" "evdev"
[ 30.650] (**) Option "xkb_layout" "us"
[ 30.650] (EE) Logitech Logitech Illuminated Keyboard: failed to initialize for relative axes.
[ 30.650] (II) Logitech Logitech Illuminated Keyboard: initialized for absolute axes.
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: (accel) keeping acceleration scheme 1
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: (accel) acceleration profile 0
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: (accel) acceleration factor: 2.000
[ 30.650] (**) Logitech Logitech Illuminated Keyboard: (accel) acceleration threshold: 4
[ 30.651] (II) config/udev: Adding input device Microsoft Microsoft Optical Mouse with Tilt Wheel (/dev/input/event6)
[ 30.651] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: Applying InputClass "evdev pointer catchall"
[ 30.651] (II) Using input driver 'evdev' for 'Microsoft Microsoft Optical Mouse with Tilt Wheel'
[ 30.651] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
[ 30.651] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: always reports core events
[ 30.651] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: Device: "/dev/input/event6"
[ 30.676] (--) Microsoft Microsoft Optical Mouse with Tilt Wheel: Found 9 mouse buttons
[ 30.676] (--) Microsoft Microsoft Optical Mouse with Tilt Wheel: Found scroll wheel(s)
[ 30.676] (--) Microsoft Microsoft Optical Mouse with Tilt Wheel: Found relative axes
[ 30.676] (--) Microsoft Microsoft Optical Mouse with Tilt Wheel: Found x and y relative axes
[ 30.676] (II) Microsoft Microsoft Optical Mouse with Tilt Wheel: Configuring as mouse
[ 30.676] (II) Microsoft Microsoft Optical Mouse with Tilt Wheel: Adding scrollwheel support
[ 30.676] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: YAxisMapping: buttons 4 and 5
[ 30.676] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
[ 30.676] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.2/usb7/7-2/7-2:1.0/input/input6/event6"
[ 30.676] (II) XINPUT: Adding extended input device "Microsoft Microsoft Optical Mouse with Tilt Wheel" (type: MOUSE)
[ 30.676] (II) Microsoft Microsoft Optical Mouse with Tilt Wheel: initialized for relative axes.
[ 30.676] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: (accel) keeping acceleration scheme 1
[ 30.677] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: (accel) acceleration profile 0
[ 30.677] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: (accel) acceleration factor: 2.000
[ 30.677] (**) Microsoft Microsoft Optical Mouse with Tilt Wheel: (accel) acceleration threshold: 4
[ 30.677] (II) config/udev: Adding input device Microsoft Microsoft Optical Mouse with Tilt Wheel (/dev/input/mouse0)
[ 30.677] (II) No input driver/identifier specified (ignoring)
[ 30.677] (II) config/udev: Adding input device HDA Digital PCBeep (/dev/input/event3)
[ 30.677] (II) No input driver/identifier specified (ignoring)
[ 30.680] (II) config/udev: Adding input device PC Speaker (/dev/input/event2)
[ 30.680] (II) No input driver/identifier specified (ignoring)
The more I investigate, the more I feel like this could be a hardware problem. I checked the CPU temperature first, but it only reaches ~32 degrees Celsius when the system hangs, so I doubt that's the cause. I've checked the cables (power and otherwise) between everything in my box, and it's all solid. I've also taken the sides off, thinking it might be an airflow issue, but that didn't solve it either.
On the software side of things, I've got nvidia, nvidia-utils, and lib32-nvidia-utils installed, and they aren't popping any errors that I can see. When I run glxinfo or glxgears, nothing happens; the terminal just drops back to the prompt (which is actually an improvement: in Slackware they hung the machine).
Again, any help on this issue is greatly appreciated; I'm at my wit's end about it.

I had a similar problem that turned out to be the graphics card working itself loose. So, if you haven't already, try taking it out, cleaning the contacts, and plugging it back in. Worth a try, and it worked for me.
Also, Xorg 1.10.2 and the nvidia proprietary driver don't seem to be in conflict for me, so I think williewillus is mistaken.
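If reseating doesn't do it, it may also be worth ruling out heat and power on the card before you replace hardware. Here's a rough, untested sketch; it assumes glxinfo and nvidia-smi (from nvidia-utils) are on your path and that your kernel log really lives at /var/log/kernel.log as in your excerpt, so adjust as needed. Run the watchers from a second box over SSH (or a plain VT), then do whatever normally triggers the freeze:

# Sanity-check the GL stack first; this should print "direct rendering: Yes"
glxinfo | grep -i "direct rendering"

# In one terminal: follow the kernel log and catch any NVRM/Xid lines right before the hang
tail -f /var/log/kernel.log | grep --line-buffered -i nvrm

# In another terminal: record the GPU temperature every few seconds
# (nvidia-settings -q GPUCoreTemp should also work while X is running)
while true; do
    date
    nvidia-smi -q | grep -i temperature
    sleep 5
done

If the temperature climbs sharply, or another "has fallen off the bus" line shows up just before the freeze, that points at power delivery or cooling on the card rather than at the driver or Xorg.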

Similar Messages

  • How much power is supplied for the video card in the 2009 mac pro?

    I am trying to find out how much power is supplied by each six-pin power connector and by the PCIe slots on my 2009 Mac Pro.

    The first two slots can provide 75 watts each.  40 watts each for slots 3 and 4.
    Each auxiliary 6-pin power connector can provide 75 watts.
    You can't use all these at maximum at once, though.  Total power is limited to 300 watts.
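    For a sense of the headroom, the stated per-source maxima add up to 75 + 75 + 40 + 40 + 75 + 75 = 380 watts, so the 300-watt total budget is the limit you hit first when everything is loaded at once.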

  • Better video cards for the new 8-core Mac Pro...?

    I don't know a ton about video cards. I did some research, and the internet says that the Nvidia GeForce GTX 295 video card is the best card out right now. Will it work in the 8-core Mac Pro? If not, or if it's not the best, what is the best video card that will work in the 8-core Mac Pro?
    P.S.
    Does anyone know how to tell whether one video card is better than another? I can guess that a card with 1792MB is going to be better than one with 512MB, but I've also heard that, for example, the ATI Radeon HD 4870 512MB is better than the NVIDIA GeForce GT 120 512MB. Apple charges more for it, so it probably is better, but all that tells me is the names of the cards and that they both have 512MB; from that alone I would assume they're exactly the same, just made by different companies. There must be some other details that explain why one is better, right? Thanks in advance.

    Well I found out the Quadro FX 4800 works on it, that looks pretty good.
    It's a good card, but it has a different application focus.
    What applications do you use?
    Answering that helps determine what the best card is.
    The 5800 is better though, will it work?
    Better, how?
    The spec's read better?
    How about real world performance?
    Benchmarks?
    They are very similar in shape and such.
    Shape has little to do with anything.
    The ROM is the key.
    If the card is not produced as "OS X compatible", it does not have Mac ROM and cannot be used.
    If there is a Mac edition that is similar (same GPU series, similar architecture) the ROM may be flashed to Mac ROM.
    Especially in Geforce cards, the ability to edit ROMs and even write portions of the ROM is necessary to be able to flash a card.
    Even then, no guarantees.
    Often a port on a flashed card won't work after flashing.
    Seems the 5800 has a 10 bit display port, so if the card were flashable, the display port wouldn't work.
    The ROM chip size of a card is also of concern.
    Many cards require either a new, larger ROM chip to accept the Mac ROM (soldering), or, require a hacked, "reduced" ROM to be written to allow flash.
    Invariably, there will be some feature loss with a reduced ROM, but a good hacker usually gets rid of superfluous stuff.
    Cards with more VRAM than the Mac counterpart will often times lose the extra VRAM- it won't be read by OS X.
    Then there is the EFI question, which often creates a final stumbling block for converting a card.
    There are many pioneers who flash cards.
    If a card is flashable, it has already been done.
    The Quadro 4500, 4800, and 5600 have all been worked out.
    As of yet, the 5800 hasn't been sussed out for flash (as far as I can find).
    Flashing the card is easy.
    Finding a physically compatible card and a compatible ROM are the hard parts.
    The two best retail cards for the Mac Pro are the Geforce GTX 285 and the Radeon HD 4870.

  • Dual video card on the m3985

    I'm planning to upgrade my m3958 to its limits for 3D rendering. This is the present rig. Processor: Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz (8 CPUs), ~3.4GHz; Memory: 8GB; GPU: NVIDIA Quadro K2000. (Really, I don't see much difference with this card, though it is rated among the best; I don't know why.) So what can I really add to make this machine more powerful for 3D rendering? Thanks.

    More information on the Video Card integration problem, but no success yet.
    I plugged in the second monitor; an Acer AL2223W LCD connected to one of the ports on the EVGA GeForce GT 610 card, to see if it would be recognized. It displays "No Signal" when turned on.
    Did a search on the computer and found the AMD VISION Engine Control Center. That had a place to search for video equipment and the only thing it would find is the Samsung 213T.
    I verified that the fan on the card is running so the card is getting power via the PCI-E slot.
    The 2 outputs on the video card and the receptacle on the Acer are DVI-I dual link configuration with 24 pins + 4 pins at the flat pin for analog signals. The Samsung has a DVI-D dual link receptacle (24 pins).
    The cable is DVI-D Single Link (18 pins + the flat pin). When I get another cable it will be DVI-D Dual Link.
    It seems that the EVGA GeForce GT 610 video card is not being recognized by the motherboard, or it is not putting out a signal.
    There was no driver disk with either the video card or the computer.
    Power supply is 430 Watts and the card specs say the PS should be at least 300 Watts.
    Do I need to do something with the BIOS or get a driver or setup disk for the board or card or do something with a jumper on the MB?
    Do I need to get the DVI-D Dual Link cable to get the signal from the Video Card to the Monitor?
    Thanks for any help you can give me.
    Bob Campbell

  • Upgrading Video Card in the iMac

    I've been very confused by what I have read on so many discussion groups. Is the video card in the 2.8 GHz, dual-core, Intel-based iMac upgradeable? I know there is a standard plug-in card, MXM, but am not sure if the 2400 ATI card I have in the iMac can be upgraded. I am sure it is a chore to access the innards of the iMac.

    YC73, you can familiarize yourself with Apple's policies: Sales and Refund Policy. The link is to the US online store, but the policy is consistent, allowing for local laws, throughout the world.
    As you can see from the policy, a built-to-order iMac is not returnable in the first 14 days, as a standard model would be. It is my understanding that something as simple as a RAM upgrade makes the Mac BTO.
    As a BTO, the Mac is subject to the Apple Manufacturer's Warranty. It is repaired or replaced at Apple's discretion. It is reported by others in the forums with experience that Apple traditionally replaces a machine if it has been repaired three times.
    I hope this answers your question.
    PS, Forgive Ricktorronto, he may have drunk too much Apple Kool-Aid recently. Most long-time posters are usually more tolerant in their replies.

  • CPU rendering why can't a Video Card do the job?

    I started noticing that the Flash player relies on the processor more than the graphics card, and I was kind of hoping that someday Flash would use the video card as its rendering engine. Right now, playing Flash files seems to put too much stress on the processor. Simply put, why can't the video card do the work when the processor takes up about 90% of the job, and do it more smoothly? Just an idea I'm putting out for Adobe staff members.

    What Portal provides is
    authentication
    authorization,
    user management,
    security (SRAP),
    content presentation framework (URL Scraper provider, RSS provider and JSP provider),
    dynamic VPN (through netlets you can integrate any TCP-based network application),
    Wireless portal :-),
    and a variety of other features.
    I don't believe the app server provides all these features. The need for these features creates a portal market, which the portal server addresses.

  • Motherboard issue? With video card.

    Mainboard:  760GMA-P34(FX) (MS-7641)   
    PCB Version:  5.0
    BIOS Version: VP.0 (aka Version: V25.0)
    BIOS Date:  5/28/2013
    Everything works fine using onboard video. Windows 7 64 bit.
    I plug in my Nvidia GT 620 and the CPU fan immediately maxes out with no increase in temperature; the system is still stable using the standard VGA driver provided by Windows. After ANY NVIDIA driver is installed, Windows blue-screens upon boot. We can't see the error because the blue screen is too quick, and it immediately tries to reboot and restore. Event logs show nothing.
    I'm at a loss here. MSI motherboard tech support isn't open on the weekends, so hopefully you guys can shed some light. Is this common? Do I need a new board? I've never come across anything like this before. The rest of the machine is 100% stable; the only issues arise when the video card and the drivers are installed. The video card works fine in another machine, I've replaced the power supply with a much heftier one with more juice, and the same thing happens.
    Thanks!

    Quote from: badboy2k on 03-May-15, 21:22:59
    >>Posting Guide<< <-- can you read and list your parts used here?
    Without above done, hard to say what I think.

  • Problems with FCP and Nvidia GeForce 8600M GT video card

    Not sure if this should be in the MacBook Pro forum or this one...
    I am having trouble with the GeForce 8600M GT video card while editing in Final Cut Pro (and After Effects). I have talked with numerous people at AppleCare, but they say that they are unaware of issues regarding the video card.
    The problem: it's almost like the card is unable to keep up with professional video applications. Whenever there is movement in the video I'm editing, I get lines across the screen. I'm not sure how better to explain it than to say it looks like the video card is having a tough time refreshing. (I can take the same video to a much older PowerBook and it plays perfectly, so it's not the source file.)
    At first I thought it was an FCP issue, but I just started using After Effects, and I'm having the same problem there. If I edit in After Effects, when I play it back (in After Effects), I get the lines. I can then export the project to a QuickTime file. The QuickTime file plays perfectly (no lines) when not involved with a professional application. Then if I import the same QuickTime file into FCP and play it back in FCP, the lines happen again.
    Is anyone else having this problem? Any thoughts? Any solutions?
    Thanks!
    Michael

    I had searched the MacBook Pro forum, but not this one. It seems that the "screen tearing" issue has been talked about a lot, so I apologize for the repetitive nature of the post. I will say that I am fairly shocked that, given how many people have clearly called AppleCare, no one there seems to have a clue that there is a problem with this video card. I have talked with at least 4 different people in the FCP specialty group, and not one had any clue what I was talking about... 3 of the 4 were fairly insistent that it was a video source issue on my side.

  • Nvidia 8800GT video card in the First Generation Mac Pro

    This message is a bit of advice if you install the new Nvidia 8800GT video card in the first-generation Mac Pro: keep your old video card. The old video card will be needed if you ever need to reinstall the OS X operating system. The new card needs the video drivers (kernel extensions) that are in OS X Leopard 10.5.2 or later. With this video card installed, both the OS X original installation disc that came with this computer and the OS X Leopard upgrade discs will cause a kernel panic when booted, because they do not contain the kernel extensions needed for this video card. One question: does anyone know how to create and burn a reinstallation DVD with OS X Leopard version 10.5.2 on it? That would help solve this reinstallation problem.

    This is an advisory to whom it may concern.
    Here is my latest reply from an Apple product specialist, who was very good and helpful but could only give me this answer to my upgrade question.
    I want to upgrade my graphics card to the 8800 and run Tiger on my 1st-gen Mac Pro.
    Here it goes:
    "This article indicates what cards are supported in what machines: http://docs.info.apple.com/article.html?artnum=305346
    Your Mac Pro is the first generation, and did not have PCI-Express 2.0 slot like the current generation does have.
    Unfortunately, we don't foresee the release of any drivers for the 10.4 OS on this machine.
    In regards to the second issue, we also don't support two video cards in two slots, using a different one for each OS; there is no supported way to do this."
    "Unfortunately, the latest version of the Mac Pro will not run Tiger, as it was manufactured after the release of Leopard, therefore the drivers for the machines hardware will not be available on Tiger."
    Hope that clears up someone's question.
    kindly
    S.

  • My hp m8100n freezes so I want to replace the video card

    I bought this computer a few years ago and lately it's been freezing a lot (pretty much freezes 5 mins after I turn it on). It's to the point where I have to press the power button and turn off the computer.  I'm actually not sure why but this tends to happen when I'm trying to watch videos online or just playing games (even solitaire).  I've updated all the drivers from the website and that doesn't seem to work.  I read on some forums that this is a problem with the video card so I want to replace it with another computer's video card but I know nothing about computers.
    I'm wondering if anybody could tell me if I have to know anything before I go and plug in another video card. I know the m8100n video card is on the motherboard; do I need to disable it before I install the new card, or can I just put this one in? I just don't want to break the computer.
    The video card is ati radeon x1600 pro 256 mb and I'm pulling this from another computer in my house (that comp's dead so I'm pulling parts out and using it in other places).

    Hi,
    Many things can cause freezes.
    If you are using a wireless keyboard and/or mouse then replace the batteries.
    You might be having a heat issue.
    Try this procedure:
    Unplug the PC and open it up. Clean out all the dust. Carefully remove and replace all the cables going to the motherboard one at a time. Do the same for the memory dimms and the video and sound cards if you have any.  You might want to buy a can of compressed air to blow the dust out of the CPU heat sink and the system fan.  Plug your PC back in and give it a go.
    Don't install the ATI Radeon x1600 pro 256 mb video card as it needs a bigger power supply compared to the 300 watt power supply in your PC.
    HP DV9700, t9300, Nvidia 8600, 4GB, Crucial C300 128GB SSD
    HP Photosmart Premium C309G, HP Photosmart 6520
    HP Touchpad, HP Chromebook 11
    Custom i7-4770k,Z-87, 8GB, Vertex 3 SSD, Samsung EVO SSD, Corsair HX650,GTX 760
    Custom i7-4790k,Z-97, 16GB, Vertex 3 SSD, Plextor M.2 SSD, Samsung EVO SSD, Corsair HX650, GTX 660TI
    Windows 7/8 UEFI/Legacy mode, MBR/GPT

  • How does Photoshop CS6 choose the video card for its Graphics Processor?

    Running an Intel Mac on 10.7.4 - CS6.
    I have two video cards running three Dell monitors.
    The ATI 5770 (runs the 24") is on the 16-lane PCI slot and the older Nvidia 8800 GT (running the two Dell 19") is on the 8-lane slot.
    But Photoshop chooses the slower card on the slower lane.
    Any advice on changing this is welcome.

    It's not really Adobe's fault.
    The simple fact is that most of the software design that supports display processing is still mired in the days where a computer had one monitor.  Making things worse is the fact that the GPU makers don't really have any incentive to make their hardware/software work in a system with other GPU makers' hardware.  In a selfish sense, quite the opposite.  Who's at fault when your ATI and nVidia cards don't work together?
    Macs have just as many problems as PCs - possibly more, since Mac users don't have the luxury of mixing and matching driver versions.
    Many modern video cards are now made to support up to 4 monitors.
    -Noel

  • What video cards will the p7-1423w support?

    I would like to upgrade my video graphics to something that will support Witcher 2.  I picked up a video card but then realized the power consumption is too great for this pc.  Any suggestions on what card I can get that will support this type of graphics?  (It is a windows 8 machine if that makes any difference.)
    Thanks!

    Hello Leach69,
    There are two things to consider when purchasing a video card: the slots that are available in your computer and the power supply that is already in it. If you are looking to play modern games you are going to need a good video card, as you already know. I would suggest that you also consider upgrading your power supply to something that provides more power, so that you can use a card that draws more power.
    Upgrading the power supply is also going to give you the freedom to do other upgrades in the future if you choose to do so. The power supply that is currently in your computer is only 300 W from what I can tell, which means you are going to be extremely limited.
    I'd suggest upgrading your power supply to something around 550-600 W; then you can get just about any video card you want, as long as you have the slot for it.

  • What formats of videos can be played from the SDHC card on the Zen?

    What formats of videos can be played from the SDHC card on the Zen? I realize that the SDHC card data is not integrated into the same functions as the data on the internal drive. That's not my issue.
    My issue is trying to find ANY videos that can be played AT ALL from the SDHC card.
    I've been e-mailing with technical support for 2 weeks, but it seems all I get is the run-around (and a different technician) with each e-mail reply.
    First they tell me I have to get my videos from Amazon Unbox in order to play them. Then they tell me that, no, the Amazon ones don't play from the SDHC card.
    Can anyone tell me what format or what source I can use to actually play videos from the SDHC card?

    Re: What formats of videos can be played from the SDHC card on the Zen?
    why_itsme wrote:
    I realize that the SDHC card data is not integrated into the same functions as the data on the internal drive. That's not my issue.
    My issue is trying to find ANY videos that can be played AT ALL from the SDHC card.
    I've been e-mailing with technical support for 2 weeks, but it seems all I get is the run-around (and a different technician) with each e-mail reply.
    First they tell me I have to get my videos from Amazon Unbox in order to play them. Then they tell me that, no, the Amazon ones don't play from the SDHC card.
    Can anyone tell me what format or what source I can use to actually play videos from the SDHC card?
    The specs are here: http://us.creative.com/products/prod...duct=6999&nav=

  • Is it possible to change the video card of a pavilion g7-1075dx?

    So, I have an HP Pavilion g7-1075dx notebook with Windows 7. I'm an avid gamer and wish to play the newest games. However, there's one problem that's keeping me from doing so: a lot of the newer games require newer video/graphics cards. This particular laptop has an ATI Mobility Radeon HD 4250, and the newest games seem to require an ATI Radeon HD 4850 or above. This is the only thing that's keeping me from playing games like StarCraft II on the highest settings, or, in the case of Star Wars: The Old Republic, it almost keeps me from playing the game, period, because the graphics are too much for my current card to handle, causing massive lag. I was hoping I could change the video card. Upgrade. However, I've heard that with most laptops or notebooks you can't upgrade anything other than the RAM. I'm hoping someone on here can help me out and tell me if it is possible to get a new video card for my laptop. We don't have a desktop computer, so this laptop is my only hope for a long time to play newer PC games.

    Alright, let me put this in a simpler message: does anyone know if it is possible to change the video card currently in my laptop, the HP Pavilion g7-1075dx, for a newer and better one? ATI Mobility Radeon HD 4250 to ATI Radeon HD 4850 or higher.

  • What are the video card requirements for running a 23" cinema display

    What are the video card requirements for running a 22" Cinema Display (clear acrylic case) with a PC? My motherboard is AGP. Thanks to anyone who can help.
    Intel P4 3.0 GHz, Windows XP

    Hi Lionel,
    As a general rule of thumb, the ATI Rage 128 Pro will not support a 20" LCD. That being said, there are reports of it doing just that (possibly the edition that went into the cube).
    I'm not that familiar with the ins and outs of the Cube, so I can't give you authoritative information on it.
    A good place to start looking for answers is:
    http://cubeowner.com/kbase_2/
    Cheers!
    Karl
