Nvidia Quadro 600, GeForce GTX 560 Ti or cheaper for Photoshop CS5 and Lightroom 3?

Hello,
I am a professional photographer and I am setting up a new PC (i7, Windows 7 64-bit), but I am having trouble choosing the graphics card.
I use Lightroom 3 and Photoshop CS5, no video editing.
Between OpenGL, OpenCL and CUDA acceleration, professional graphics card models and consumer ones, I am lost!
The Nvidia GeForce GTX 560 Ti seems more powerful and more versatile, but I tell myself that if Nvidia has a professional range there must be a reason.
So for Photoshop CS5 and Lightroom 3, which would be better: the GeForce GTX 560 Ti or the Nvidia Quadro 600?
http://www.geforce.com/Hardware/GPUs/geforce-gtx-560ti/specifications
http://www.nvidia.com/object/product-quadro-600-us.html
Any reason to get a Quadro 2000?
http://www.nvidia.com/object/product-quadro-2000-us.html
Or does the graphics card not matter much, so I should just take an entry-level GeForce and enjoy the HDMI output and the silence? Which one then?
Does the Quadro 600 handle 10-bit display output? And the GeForce?
Have any changes been announced for Photoshop CS6 and Lightroom 4?
The only 3D application I use is Google Earth in 3D mode; does that make any difference?
Thank you for your help.

You say "no video editing"...  If that's going to be the case, and you won't use the Mercury Playback Engine in Adobe Premiere Pro, which needs nVidia's CUDA subsystem, then I recommend you consider the ATI brand over nVidia.
Why?
Because while neither brand's developers (ATI or nVidia) always release perfect drivers, I find ATI display drivers to be of consistently higher quality than nVidia's releases.  What this means for you is generally fewer crashes and quirks.  ATI has also traditionally supported older cards for longer than nVidia - this might matter to you in a few years.
People ask me what video card I would recommend, and right now that would be a VisionTek ATI Radeon HD 6670 1 GB GDDR5 card.  I like this particular card because:
• I've had 100% success with VisionTek cards in a number of different systems, not only at first install; they have all kept running for as long as I have used them, without ever breaking down.
• The 6670 model uses very little power (under 70 watts), so it doesn't stress your computer's power supply, doesn't need a separate power connection, and doesn't make a lot of fan noise.
• It's not the fastest card made for 3D gaming, but it's inexpensive and excellent for Photoshop.  No matter what you choose, you should get a card that scores over 500 on the Passmark GPU benchmark, ideally over 1000:  http://www.videocardbenchmark.net/
• The ATI Catalyst display driver implementations for the 4670/5670/6670 line of cards have been good and solid.
• 1 GB of on-card memory seems to be a good size, even for editing a lot of images, and GDDR5 memory provides faster access than DDR3.
You should know that besides using Photoshop heavily, I also develop OpenGL-based software, so I have some additional insight into driver implementations.
-Noel

Similar Messages

  • NVidia Quadro K600 vs GTX 560

    Hello all!
    I have repurposed my home office HP Elite MiniTower 8300 into an editing rig. It already had an i7, so all I did was bump the RAM to 16GB and replace the system drive with a shiny new, fast 512GB SSD from Transcend. Added 2x 3TB WD Black drives for storage. But now the video card...I can afford the PNY K600 with 1GB, or stretch the budget a little and get a Palit GTX 560 Ti with 2GB RAM for another $50.
    I know this question has been flogged to death between the Quadro and gaming cards, but the new Quadro K series - do they change the playing field?
    Which card will give Premiere and Photoshop the most bang?
    Many thanks for any pointers!

    Quadros are overpriced and underperforming. It is a protected brand of nVidia and they rob you blind on the strength of their A-brand reputation. It is their cash cow. I have yet to find a Maximus configuration, at a whopping $6K, that comes even close to a GTX 680 that costs only $500. The ONLY reason to choose a Quadro card is when you really NEED 10-bit output to equally exorbitant 10-bit monitors.
    You may want to have a look at http://ppbm7.com/index.php/news and scroll down to see a limited comparison of video cards and their performance with PR. If you have the choice between a GTX 560 Ti and a GTX 660, I would go for the 660, because it consumes less power, runs quieter and can drive up to 4 monitors.

  • Problem with 3 monitors Intel HD Graphics and GeForce GTX 560

    I would like to attach 3 monitors to my box -- 1 to the integrated Intel card and 2 to the discrete Nvidia one. I've tried different configs, but I failed to get monitors connected to different cards to work at the same time. I have a system with the stock kernel, the latest updates and the following configuration:
    $ lspci | egrep "VGA|Disp"
    00:02.0 Display controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    01:00.0 VGA compatible controller: NVIDIA Corporation GF114 [GeForce GTX 560 Ti] (rev a1)
    $ pacman -Qi xorg-server xf86-video-intel nvidia awesome-git linux | egrep "Name|Version"
    Name : xorg-server
    Version : 1.17.1-3
    Name : xf86-video-intel
    Version : 2.99.917-3
    Name : nvidia
    Version : 346.47-3
    Name : awesome-git
    Version : 3.5.2.397.gdde5b1b-1
    Name : linux
    Version : 3.18.6-1
    $ lsmod | egrep 'i915|nvidia'
    nvidia 8335766 46
    i915 946695 2
    button 12953 1 i915
    i2c_algo_bit 12744 1 i915
    video 18043 2 i915,asus_wmi
    drm_kms_helper 80985 1 i915
    drm 263481 7 i915,drm_kms_helper,nvidia
    i2c_core 50152 6
    drm,i915,i2c_i801,drm_kms_helper,i2c_algo_bit,nvidia
    intel_gtt 17848 2 i915,intel_agp
    I've tried different xorg configs. Here is the best one. All 3 monitors are active with it (I can drag the cursor over them), but only the 2 connected to the nvidia card display the DE.
    Section "DRI"
    Mode 0666
    EndSection
    Section "Monitor"
    Identifier "SAM"
    ModelName "Samsung SMS27A850"
    HorizSync 30.0 - 90.0
    VertRefresh 56.0 - 75.0
    Option "DPMS"
    DisplaySize 597 336 # In millimeters
    EndSection
    Section "Monitor"
    Identifier "VS"
    ModelName "ViewSonic VG1930wm"
    Option "DPMS"
    Option "RightOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Monitor"
    Identifier "BNQ"
    ModelName "BenQ FP737s"
    Option "DPMS"
    Option "LeftOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTX 560 Ti"
    BusID "01:00:0"
    Option "Monitor-DVI-I-1" "SAM"
    Option "Monitor-DVI-I-2" "VS"
    EndSection
    Section "Device"
    Identifier "intel"
    Driver "intel"
    Option "AccelMethod" "none"
    VendorName "Intel"
    BoardName "Intel Integrated Graphic Controller"
    BusID "00:02:0"
    Option "Monitor-VGA-0" "BNQ"
    EndSection
    Section "Screen"
    Identifier "SAM-Screen"
    Device "nvidia"
    Monitor "SAM"
    EndSection
    Section "Screen"
    Identifier "VS-Screen"
    Device "nvidia"
    Monitor "VS"
    EndSection
    Section "Screen"
    Identifier "BNQ-Screen"
    Device "intel"
    Monitor "BNQ"
    EndSection
    Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "SAM-Screen"
    Screen 1 "VS-Screen" RightOf "SAM-Screen"
    Screen 2 "BNQ-Screen" LeftOf "SAM-Screen"
    Option "Xinerama" "0"
    EndSection
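    One variation I have not actually tried (just a sketch, untested here): since the layout above defines separate Screen sections per card with Xinerama turned off, the server brings the cards up as independent X screens (:0.0 for the nvidia pair, :0.1 for the Intel VGA), which matches the DE only appearing on the nvidia monitors. Enabling Xinerama in the ServerLayout should merge them into one desktop, at the cost of RandR:
    Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "SAM-Screen"
    Screen 1 "VS-Screen" RightOf "SAM-Screen"
    Screen 2 "BNQ-Screen" LeftOf "SAM-Screen"
    # "1" asks the server to merge the per-card screens into one logical screen;
    # note that the server disables RandR while Xinerama is active
    Option "Xinerama" "1"
    EndSection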
    Also, xrandr shows the last monitor (VGA-0, which is plugged into the Intel card) as disconnected
    $ xrandr
    Screen 0: minimum 8 x 8, current 3460 x 1440, maximum 16384 x 16384
    DVI-I-0 disconnected primary (normal left inverted right x axis y axis)
    VGA-0 disconnected (normal left inverted right x axis y axis)
    DVI-I-1 connected 2560x1440+0+0 (normal left inverted right x axis y axis) 518mm x 324mm
    2560x1440 59.95*+
    1920x1200 59.88
    1920x1080 60.00 50.00
    1680x1050 59.95
    1600x1200 60.00
    1440x900 59.89
    1280x1024 75.02 60.02
    1280x960 60.00
    1280x800 59.81
    1280x720 60.00 50.00
    1152x864 75.00
    1024x768 75.03 70.07 60.00
    800x600 75.00 72.19 60.32 56.25
    720x576 50.00
    720x480 59.94
    640x480 75.00 72.81 59.94
    HDMI-0 disconnected (normal left inverted right x axis y axis)
    DVI-I-2 connected 900x1440+2560+0 left (normal left inverted right x axis y axis) 410mm x 256mm
    1440x900 59.89*+ 74.98
    1280x1024 75.02 60.02
    1280x960 60.00
    1152x864 75.00
    1024x768 75.03 70.07 60.00
    800x600 75.00 72.19 60.32 56.25
    640x480 75.00 72.81 59.94
    640x400 70.10
    But it's still possible to start an application on the last monitor with 'env DISPLAY=:0.1 urxvt'
    Here is the full Xorg log file and the highlights:
    [ 4789.920] (EE) intel(1): Cannot position output VGA1 relative to unknown output SAM
    [ 4792.053] (EE) intel(G0): [drm] failed to set drm interface version: Permission denied [13].
    [ 4792.053] (II) intel(G0): [drm] Contents of '/sys/kernel/debug/dri/0/clients':
    [ 4792.053] (II) intel(G0): [drm] command pid dev master a uid magic
    [ 4792.053] (II) intel(G0): [drm] Xorg 10086 0 y y 0 0
    [ 4792.053] (II) intel(G0): [drm] Xorg 10086 0 n y 0 0
    [ 4792.053] (EE) intel(G0): Failed to claim DRM device.
    [ 4792.053] (II) UnloadModule: "intel"
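    (Side note on the intel(G0) lines above: /sys/kernel/debug/dri/0 is one of the two DRM nodes, so it can help to double-check which /dev/dri/cardN belongs to which GPU. A generic check, using nothing beyond the standard udev tools:)
    $ ls -l /dev/dri/by-path/    # if these symlinks exist, they are named after each card's PCI address
    $ udevadm info --query=path --name=/dev/dri/card0
    $ udevadm info --query=path --name=/dev/dri/card1
    The reported device paths show which node is the Intel controller at 00:02.0 and which is the GTX 560 Ti at 01:00.0.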
    [ 4789.342]
    X.Org X Server 1.17.1
    Release Date: 2015-02-10
    [ 4789.342] X Protocol Version 11, Revision 0
    [ 4789.342] Build Operating System: Linux 3.18.6-1-ARCH x86_64
    [ 4789.342] Current Operating System: Linux earth 3.18.6-1-ARCH #1 SMP PREEMPT Sat Feb 7 08:44:05 CET 2015 x86_64
    [ 4789.342] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=dc49b07c-e67c-4b6f-91f7-7c1adc050e21 rw quiet ipv6.disable=1 init=/usr/lib/systemd/systemd security=tomoyo TOMOYO_trigger=/usr/lib/systemd/systemd
    [ 4789.342] Build Date: 22 February 2015 12:50:32PM
    [ 4789.342]
    [ 4789.342] Current version of pixman: 0.32.6
    [ 4789.342] Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    [ 4789.342] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [ 4789.342] (==) Log file: "/var/log/Xorg.0.log", Time: Sun Mar 1 14:43:55 2015
    [ 4789.342] (==) Using config directory: "/etc/X11/xorg.conf.d"
    [ 4789.342] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
    [ 4789.342] (==) ServerLayout "Layout0"
    [ 4789.342] (**) |-->Screen "SAM-Screen" (0)
    [ 4789.342] (**) | |-->Monitor "SAM"
    [ 4789.343] (**) | |-->Device "nvidia"
    [ 4789.343] (**) |-->Screen "VS-Screen" (1)
    [ 4789.343] (**) | |-->Monitor "VS"
    [ 4789.343] (**) | |-->Device "nvidia"
    [ 4789.343] (**) |-->Screen "BNQ-Screen" (2)
    [ 4789.343] (**) | |-->Monitor "BNQ"
    [ 4789.343] (**) | |-->Device "intel"
    [ 4789.343] (**) Option "Xinerama" "0"
    [ 4789.343] (==) Automatically adding devices
    [ 4789.343] (==) Automatically enabling devices
    [ 4789.343] (==) Automatically adding GPU devices
    [ 4789.343] (==) FontPath set to:
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/,
    /usr/share/fonts/OTF/,
    /usr/share/fonts/Type1/,
    /usr/share/fonts/100dpi/,
    /usr/share/fonts/75dpi/
    [ 4789.343] (==) ModulePath set to "/usr/lib/xorg/modules"
    [ 4789.343] (II) The server relies on udev to provide the list of input devices.
    If no devices become available, reconfigure udev or disable AutoAddDevices.
    [ 4789.343] (II) Loader magic: 0x815d80
    [ 4789.343] (II) Module ABI versions:
    [ 4789.343] X.Org ANSI C Emulation: 0.4
    [ 4789.343] X.Org Video Driver: 19.0
    [ 4789.343] X.Org XInput driver : 21.0
    [ 4789.343] X.Org Server Extension : 9.0
    [ 4789.344] (EE) systemd-logind: failed to get session: PID 10086 does not belong to any known session
    [ 4789.345] (II) xfree86: Adding drm device (/dev/dri/card1)
    [ 4789.345] (II) xfree86: Adding drm device (/dev/dri/card0)
    [ 4789.347] (--) PCI: (0:0:2:0) 8086:0102:1043:84ca rev 9, Mem @ 0xf6400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64
    [ 4789.347] (--) PCI:*(0:1:0:0) 10de:1200:10b0:0401 rev 161, Mem @ 0xf4000000/33554432, 0xe0000000/134217728, 0xe8000000/67108864, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
    [ 4789.347] (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
    [ 4789.347] (II) LoadModule: "glx"
    [ 4789.347] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
    [ 4789.359] (II) Module glx: vendor="NVIDIA Corporation"
    [ 4789.359] compiled for 4.0.2, module version = 1.0.0
    [ 4789.359] Module class: X.Org Server Extension
    [ 4789.359] (II) NVIDIA GLX Module 346.47 Thu Feb 19 18:09:07 PST 2015
    [ 4789.359] (II) LoadModule: "nvidia"
    [ 4789.359] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [ 4789.360] (II) Module nvidia: vendor="NVIDIA Corporation"
    [ 4789.360] compiled for 4.0.2, module version = 1.0.0
    [ 4789.360] Module class: X.Org Video Driver
    [ 4789.360] (II) LoadModule: "intel"
    [ 4789.360] (II) Loading /usr/lib/xorg/modules/drivers/intel_drv.so
    [ 4789.360] (II) Module intel: vendor="X.Org Foundation"
    [ 4789.360] compiled for 1.17.1, module version = 2.99.917
    [ 4789.360] Module class: X.Org Video Driver
    [ 4789.360] ABI class: X.Org Video Driver, version 19.0
    [ 4789.360] (II) NVIDIA dlloader X Driver 346.47 Thu Feb 19 17:47:18 PST 2015
    [ 4789.360] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [ 4789.360] (II) intel: Driver for Intel(R) Integrated Graphics Chipsets:
    i810, i810-dc100, i810e, i815, i830M, 845G, 854, 852GM/855GM, 865G,
    915G, E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM,
    Pineview G, 965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33,
    GM45, 4 Series, G45/G43, Q45/Q43, G41, B43
    [ 4789.360] (II) intel: Driver for Intel(R) HD Graphics: 2000-6000
    [ 4789.360] (II) intel: Driver for Intel(R) Iris(TM) Graphics: 5100, 6100
    [ 4789.360] (II) intel: Driver for Intel(R) Iris(TM) Pro Graphics: 5200, 6200, P6300
    [ 4789.360] (++) using VT number 7
    [ 4789.471] (II) Loading sub module "fb"
    [ 4789.471] (II) LoadModule: "fb"
    [ 4789.471] (II) Loading /usr/lib/xorg/modules/libfb.so
    [ 4789.471] (II) Module fb: vendor="X.Org Foundation"
    [ 4789.471] compiled for 1.17.1, module version = 1.0.0
    [ 4789.471] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 4789.472] (II) Loading sub module "wfb"
    [ 4789.472] (II) LoadModule: "wfb"
    [ 4789.472] (II) Loading /usr/lib/xorg/modules/libwfb.so
    [ 4789.472] (II) Module wfb: vendor="X.Org Foundation"
    [ 4789.472] compiled for 1.17.1, module version = 1.0.0
    [ 4789.472] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 4789.472] (II) Loading sub module "ramdac"
    [ 4789.472] (II) LoadModule: "ramdac"
    [ 4789.472] (II) Module "ramdac" already built-in
    [ 4789.473] (II) intel(1): Using Kernel Mode Setting driver: i915, version 1.6.0 20140905
    [ 4789.474] (II) intel(G0): Using Kernel Mode Setting driver: i915, version 1.6.0 20140905
    [ 4789.474] (II) NVIDIA(0): Creating default Display subsection in Screen section
    "SAM-Screen" for depth/fbbpp 24/32
    [ 4789.474] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
    [ 4789.474] (==) NVIDIA(0): RGB weight 888
    [ 4789.474] (==) NVIDIA(0): Default visual is TrueColor
    [ 4789.474] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    [ 4789.474] (**) NVIDIA(0): Enabling 2D acceleration
    [ 4789.841] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20150116)
    [ 4789.842] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 560 Ti (GF114) at PCI:1:0:0 (GPU-0)
    [ 4789.842] (--) NVIDIA(0): Memory: 1048576 kBytes
    [ 4789.842] (--) NVIDIA(0): VideoBIOS: 70.24.21.00.00
    [ 4789.842] (II) NVIDIA(0): Detected PCI Express Link width: 16X
    [ 4789.875] (--) NVIDIA(0): Valid display device(s) on GeForce GTX 560 Ti at PCI:1:0:0
    [ 4789.875] (--) NVIDIA(0): CRT-0
    [ 4789.875] (--) NVIDIA(0): CRT-1
    [ 4789.875] (--) NVIDIA(0): Samsung SMS27A850 (DFP-0) (boot, connected)
    [ 4789.875] (--) NVIDIA(0): DFP-1
    [ 4789.875] (--) NVIDIA(0): ViewSonic VG1930wm (DFP-2) (connected)
    [ 4789.875] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
    [ 4789.875] (--) NVIDIA(GPU-0): CRT-1: 400.0 MHz maximum pixel clock
    [ 4789.875] (--) NVIDIA(0): Samsung SMS27A850 (DFP-0): Internal TMDS
    [ 4789.875] (--) NVIDIA(GPU-0): Samsung SMS27A850 (DFP-0): 330.0 MHz maximum pixel clock
    [ 4789.875] (--) NVIDIA(0): DFP-1: Internal TMDS
    [ 4789.875] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
    [ 4789.875] (--) NVIDIA(0): ViewSonic VG1930wm (DFP-2): Internal TMDS
    [ 4789.875] (--) NVIDIA(GPU-0): ViewSonic VG1930wm (DFP-2): 330.0 MHz maximum pixel clock
    [ 4789.875] (**) NVIDIA(0): Option "Rotate" "left"
    [ 4789.875] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4789.875] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4789.875] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4789.875] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4789.875] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4789.875] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4789.875] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4789.875] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4789.876] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4789.876] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4789.876] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4789.876] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4789.876] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4789.876] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4789.876] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4789.876] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4789.876] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4789.876] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    [ 4789.877] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4789.877] (**) NVIDIA(0): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4789.877] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4789.878] (==) NVIDIA(0):
    [ 4789.878] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
    [ 4789.878] (==) NVIDIA(0): will be used as the requested mode.
    [ 4789.878] (==) NVIDIA(0):
    [ 4789.878] (II) NVIDIA(0): Validated MetaModes:
    [ 4789.878] (II) NVIDIA(0):
    [ 4789.878] (II) NVIDIA(0): "DFP-0:nvidia-auto-select,DFP-2:nvidia-auto-select{Rotation=90}"
    [ 4789.878] (II) NVIDIA(0): Virtual screen size determined to be 3460 x 1440
    [ 4789.905] (--) NVIDIA(0): DPI set to (125, 114); computed from "UseEdidDpi" X config
    [ 4789.905] (--) NVIDIA(0): option
    [ 4789.906] (--) intel(1): Integrated Graphics Chipset: Intel(R) HD Graphics 2000
    [ 4789.906] (--) intel(1): CPU: x86-64, sse2, sse3, ssse3, sse4.1, sse4.2, avx
    [ 4789.906] (II) intel(1): Creating default Display subsection in Screen section
    "BNQ-Screen" for depth/fbbpp 24/32
    [ 4789.906] (==) intel(1): Depth 24, (--) framebuffer bpp 32
    [ 4789.906] (==) intel(1): RGB weight 888
    [ 4789.906] (==) intel(1): Default visual is TrueColor
    [ 4789.906] (**) intel(1): Option "AccelMethod" "none"
    [ 4789.906] (II) intel(1): Output VGA1 using monitor section BNQ
    [ 4789.906] (**) intel(1): Option "LeftOf" "SAM"
    [ 4789.906] (**) intel(1): Option "Rotate" "left"
    [ 4789.906] (II) intel(1): Enabled output VGA1
    [ 4789.906] (II) intel(1): Output HDMI1 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output HDMI1
    [ 4789.906] (II) intel(1): Output DP1 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output DP1
    [ 4789.906] (II) intel(1): Output HDMI2 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output HDMI2
    [ 4789.906] (II) intel(1): Output HDMI3 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output HDMI3
    [ 4789.906] (II) intel(1): Output DP2 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output DP2
    [ 4789.906] (II) intel(1): Output DP3 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output DP3
    [ 4789.906] (--) intel(1): Using a maximum size of 256x256 for hardware cursors
    [ 4789.906] (II) intel(1): Output VIRTUAL1 has no monitor section
    [ 4789.906] (II) intel(1): Enabled output VIRTUAL1
    [ 4789.906] (II) intel(1): EDID for output DP1
    [ 4789.906] (II) intel(1): EDID for output DP2
    [ 4789.906] (II) intel(1): EDID for output DP3
    [ 4789.906] (II) intel(1): EDID for output HDMI1
    [ 4789.906] (II) intel(1): EDID for output HDMI2
    [ 4789.907] (II) intel(1): EDID for output HDMI3
    [ 4789.920] (II) intel(1): EDID for output VGA1
    [ 4789.920] (II) intel(1): Manufacturer: BNQ Model: 7659 Serial#: 28
    [ 4789.920] (II) intel(1): Year: 2003 Week: 39
    [ 4789.920] (II) intel(1): EDID Version: 1.3
    [ 4789.920] (II) intel(1): Analog Display Input, Input Voltage Level: 0.700/0.700 V
    [ 4789.920] (II) intel(1): Sync: Separate Composite
    [ 4789.920] (II) intel(1): Max Image Size [cm]: horiz.: 34 vert.: 27
    [ 4789.920] (II) intel(1): Gamma: 2.20
    [ 4789.920] (II) intel(1): DPMS capabilities: StandBy Suspend Off; RGB/Color Display
    [ 4789.920] (II) intel(1): First detailed timing is preferred mode
    [ 4789.920] (II) intel(1): redX: 0.640 redY: 0.340 greenX: 0.290 greenY: 0.611
    [ 4789.920] (II) intel(1): blueX: 0.140 blueY: 0.069 whiteX: 0.310 whiteY: 0.330
    [ 4789.920] (II) intel(1): Supported established timings:
    [ 4789.920] (II) intel(1): 720x400@70Hz
    [ 4789.920] (II) intel(1): 640x480@60Hz
    [ 4789.920] (II) intel(1): 640x480@67Hz
    [ 4789.920] (II) intel(1): 640x480@72Hz
    [ 4789.920] (II) intel(1): 640x480@75Hz
    [ 4789.920] (II) intel(1): 800x600@56Hz
    [ 4789.920] (II) intel(1): 800x600@60Hz
    [ 4789.920] (II) intel(1): 800x600@72Hz
    [ 4789.920] (II) intel(1): 800x600@75Hz
    [ 4789.920] (II) intel(1): 832x624@75Hz
    [ 4789.920] (II) intel(1): 1024x768@60Hz
    [ 4789.920] (II) intel(1): 1024x768@70Hz
    [ 4789.920] (II) intel(1): 1024x768@75Hz
    [ 4789.920] (II) intel(1): 1280x1024@75Hz
    [ 4789.920] (II) intel(1): 1152x864@75Hz
    [ 4789.920] (II) intel(1): Manufacturer's mask: 0
    [ 4789.920] (II) intel(1): Supported standard timings:
    [ 4789.920] (II) intel(1): #0: hsize: 1152 vsize 864 refresh: 75 vid: 20337
    [ 4789.920] (II) intel(1): #1: hsize: 1280 vsize 1024 refresh: 76 vid: 36993
    [ 4789.920] (II) intel(1): #2: hsize: 1280 vsize 1024 refresh: 60 vid: 32897
    [ 4789.920] (II) intel(1): #3: hsize: 1280 vsize 1024 refresh: 72 vid: 35969
    [ 4789.920] (II) intel(1): Supported detailed timing:
    [ 4789.920] (II) intel(1): clock: 108.0 MHz Image Size: 338 x 270 mm
    [ 4789.920] (II) intel(1): h_active: 1280 h_sync: 1328 h_sync_end 1440 h_blank_end 1688 h_border: 0
    [ 4789.920] (II) intel(1): v_active: 1024 v_sync: 1025 v_sync_end 1028 v_blanking: 1066 v_border: 0
    [ 4789.920] (II) intel(1): Supported detailed timing:
    [ 4789.920] (II) intel(1): clock: 25.2 MHz Image Size: 304 x 228 mm
    [ 4789.920] (II) intel(1): h_active: 640 h_sync: 656 h_sync_end 752 h_blank_end 800 h_border: 0
    [ 4789.920] (II) intel(1): v_active: 350 v_sync: 387 v_sync_end 389 v_blanking: 449 v_border: 0
    [ 4789.920] (II) intel(1): Ranges: V min: 56 V max: 75 Hz, H min: 31 H max: 81 kHz, PixClock max 145 MHz
    [ 4789.920] (II) intel(1): Monitor name: BenQ FP737s
    [ 4789.920] (II) intel(1): EDID (in hex):
    [ 4789.920] (II) intel(1): 00ffffffffffff0009d159761c000000
    [ 4789.920] (II) intel(1): 270d01036c221b78eac6f6a3574a9c23
    [ 4789.920] (II) intel(1): 114f54bfef80714f81908180818c0101
    [ 4789.920] (II) intel(1): 010101010101302a009851002a403070
    [ 4789.920] (II) intel(1): 1300520e1100001ed50980a0205e6310
    [ 4789.920] (II) intel(1): 1060520830e41000001a000000fd0038
    [ 4789.920] (II) intel(1): 4b1f510e000a202020202020000000fc
    [ 4789.920] (II) intel(1): 0042656e51204650373337730a200072
    [ 4789.920] (II) intel(1): Printing probed modes for output VGA1
    [ 4789.920] (II) intel(1): Modeline "1280x1024"x60.0 108.00 1280 1328 1440 1688 1024 1025 1028 1066 +hsync +vsync (64.0 kHz eP)
    [ 4789.920] (II) intel(1): Modeline "1280x1024"x76.0 141.81 1280 1376 1512 1744 1024 1025 1028 1070 -hsync +vsync (81.3 kHz)
    [ 4789.920] (II) intel(1): Modeline "1280x1024"x75.0 135.00 1280 1296 1440 1688 1024 1025 1028 1066 +hsync +vsync (80.0 kHz e)
    [ 4789.920] (II) intel(1): Modeline "1280x1024"x72.0 132.84 1280 1368 1504 1728 1024 1025 1028 1067 -hsync +vsync (76.9 kHz)
    [ 4789.920] (II) intel(1): Modeline "1152x864"x75.0 108.00 1152 1216 1344 1600 864 865 868 900 +hsync +vsync (67.5 kHz e)
    [ 4789.920] (II) intel(1): Modeline "1024x768"x75.1 78.80 1024 1040 1136 1312 768 769 772 800 +hsync +vsync (60.1 kHz e)
    [ 4789.920] (II) intel(1): Modeline "1024x768"x70.1 75.00 1024 1048 1184 1328 768 771 777 806 -hsync -vsync (56.5 kHz e)
    [ 4789.920] (II) intel(1): Modeline "1024x768"x60.0 65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync (48.4 kHz e)
    [ 4789.920] (II) intel(1): Modeline "832x624"x74.6 57.28 832 864 928 1152 624 625 628 667 -hsync -vsync (49.7 kHz e)
    [ 4789.920] (II) intel(1): Modeline "800x600"x72.2 50.00 800 856 976 1040 600 637 643 666 +hsync +vsync (48.1 kHz e)
    [ 4789.920] (II) intel(1): Modeline "800x600"x75.0 49.50 800 816 896 1056 600 601 604 625 +hsync +vsync (46.9 kHz e)
    [ 4789.920] (II) intel(1): Modeline "800x600"x60.3 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync (37.9 kHz e)
    [ 4789.920] (II) intel(1): Modeline "800x600"x56.2 36.00 800 824 896 1024 600 601 603 625 +hsync +vsync (35.2 kHz e)
    [ 4789.920] (II) intel(1): Modeline "640x480"x75.0 31.50 640 656 720 840 480 481 484 500 -hsync -vsync (37.5 kHz e)
    [ 4789.920] (II) intel(1): Modeline "640x480"x72.8 31.50 640 664 704 832 480 489 491 520 -hsync -vsync (37.9 kHz e)
    [ 4789.920] (II) intel(1): Modeline "640x480"x66.7 30.24 640 704 768 864 480 483 486 525 -hsync -vsync (35.0 kHz e)
    [ 4789.920] (II) intel(1): Modeline "640x480"x60.0 25.20 640 656 752 800 480 490 492 525 -hsync -vsync (31.5 kHz e)
    [ 4789.920] (II) intel(1): Modeline "720x400"x70.1 28.32 720 738 846 900 400 412 414 449 -hsync +vsync (31.5 kHz e)
    [ 4789.920] (II) intel(1): Modeline "640x350"x70.1 25.17 640 656 752 800 350 387 389 449 +hsync -vsync (31.5 kHz e)
    [ 4789.920] (II) intel(1): EDID for output VIRTUAL1
    [ 4789.920] (II) intel(1): Output DP1 disconnected
    [ 4789.920] (II) intel(1): Output DP2 disconnected
    [ 4789.920] (II) intel(1): Output DP3 disconnected
    [ 4789.920] (II) intel(1): Output HDMI1 disconnected
    [ 4789.920] (II) intel(1): Output HDMI2 disconnected
    [ 4789.920] (II) intel(1): Output HDMI3 disconnected
    [ 4789.920] (II) intel(1): Output VGA1 connected
    [ 4789.920] (II) intel(1): Output VIRTUAL1 disconnected
    [ 4789.920] (II) intel(1): Using user preference for initial modes
    [ 4789.920] (II) intel(1): Output VGA1 using initial mode 1280x1024
    [ 4789.920] (EE) intel(1): Cannot position output VGA1 relative to unknown output SAM
    [ 4789.920] (II) intel(1): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
    [ 4789.920] (==) intel(1): TearFree disabled
    [ 4789.920] (==) intel(1): DPI set to (96, 96)
    [ 4789.920] (II) Loading sub module "dri2"
    [ 4789.920] (II) LoadModule: "dri2"
    [ 4789.920] (II) Module "dri2" already built-in
    [ 4789.920] (II) Loading sub module "present"
    [ 4789.920] (II) LoadModule: "present"
    [ 4789.920] (II) Module "present" already built-in
    [ 4792.053] (EE) intel(G0): [drm] failed to set drm interface version: Permission denied [13].
    [ 4792.053] (II) intel(G0): [drm] Contents of '/sys/kernel/debug/dri/0/clients':
    [ 4792.053] (II) intel(G0): [drm] command pid dev master a uid magic
    [ 4792.053] (II) intel(G0): [drm] Xorg 10086 0 y y 0 0
    [ 4792.053] (II) intel(G0): [drm] Xorg 10086 0 n y 0 0
    [ 4792.053] (EE) intel(G0): Failed to claim DRM device.
    [ 4792.053] (II) UnloadModule: "intel"
    [ 4792.053] (--) Depth 24 pixmap format is 32 bpp
    [ 4792.053] (II) NVIDIA: Using 3072.00 MB of virtual memory for indirect memory
    [ 4792.053] (II) NVIDIA: access.
    [ 4792.057] (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
    [ 4792.057] (II) NVIDIA(0): may not be running or the "AcpidSocketPath" X
    [ 4792.057] (II) NVIDIA(0): configuration option may not be set correctly. When the
    [ 4792.057] (II) NVIDIA(0): ACPI event daemon is available, the NVIDIA X driver will
    [ 4792.057] (II) NVIDIA(0): try to use it to receive ACPI event notifications. For
    [ 4792.057] (II) NVIDIA(0): details, please see the "ConnectToAcpid" and
    [ 4792.057] (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
    [ 4792.057] (II) NVIDIA(0): Config Options in the README.
    [ 4792.091] (II) NVIDIA(0): Setting mode "DFP-0:nvidia-auto-select,DFP-2:nvidia-auto-select{Rotation=90}"
    [ 4792.264] (==) NVIDIA(0): Disabling shared memory pixmaps
    [ 4792.264] (==) NVIDIA(0): Backing store enabled
    [ 4792.264] (==) NVIDIA(0): Silken mouse enabled
    [ 4792.264] (**) NVIDIA(0): DPMS enabled
    [ 4792.264] (II) Loading sub module "dri2"
    [ 4792.264] (II) LoadModule: "dri2"
    [ 4792.264] (II) Module "dri2" already built-in
    [ 4792.264] (II) NVIDIA(0): [DRI2] Setup complete
    [ 4792.264] (II) NVIDIA(0): [DRI2] VDPAU driver: nvidia
    [ 4792.264] (--) RandR disabled
    [ 4792.264] (II) intel(1): SNA initialized with disabled backend
    [ 4792.264] (==) intel(1): Backing store enabled
    [ 4792.264] (==) intel(1): Silken mouse enabled
    [ 4792.264] (II) intel(1): HW Cursor enabled
    [ 4792.264] (II) intel(1): RandR 1.2 enabled, ignore the following RandR disabled message.
    [ 4792.264] (**) intel(1): DPMS enabled
    [ 4792.264] (==) intel(1): display hotplug detection enabled
    [ 4792.264] (II) intel(1): Textured video not supported on this hardware
    [ 4792.264] (WW) intel(1): loading DRI2 whilst the GPU is wedged.
    [ 4792.264] (II) intel(1): [DRI2] Setup complete
    [ 4792.264] (II) intel(1): [DRI2] DRI driver: i965
    [ 4792.264] (II) intel(1): [DRI2] VDPAU driver: i965
    [ 4792.264] (II) intel(1): direct rendering: DRI2 enabled
    [ 4792.265] (II) intel(1): hardware support for Present enabled
    [ 4792.265] (WW) intel(1): Option "Monitor-VGA-0" is not used
    [ 4792.265] (WW) intel(1): Option "LeftOf" is not used
    [ 4792.265] (WW) intel(1): Option "Rotate" is not used
    [ 4792.265] (--) RandR disabled
    [ 4792.268] (II) Initializing extension GLX
    [ 4792.268] (II) Indirect GLX disabled.
    (II) intel(1): switch to mode 1280x1024@60.0 on VGA1 using pipe 0, position (0, 0), rotation left, reflection none
    [ 4792.332] (II) intel(1): Setting screen physical size to 270 x 338
    [ 4792.386] (II) config/udev: Adding input device Power Button (/dev/input/event2)
    [ 4792.386] (**) Power Button: Applying InputClass "evdev keyboard catchall"
    [ 4792.386] (**) Power Button: Applying InputClass "libinput keyboard catchall"
    [ 4792.386] (**) Power Button: Applying InputClass "system-keyboard"
    [ 4792.386] (II) LoadModule: "libinput"
    [ 4792.386] (II) Loading /usr/lib/xorg/modules/input/libinput_drv.so
    [ 4792.387] (II) Module libinput: vendor="X.Org Foundation"
    [ 4792.387] compiled for 1.17.1, module version = 0.7.0
    [ 4792.387] Module class: X.Org XInput Driver
    [ 4792.387] ABI class: X.Org XInput driver, version 21.0
    [ 4792.387] (II) Using input driver 'libinput' for 'Power Button'
    [ 4792.387] (**) Power Button: always reports core events
    [ 4792.387] (**) Option "Device" "/dev/input/event2"
    [ 4792.387] (II) input device 'Power Button', /dev/input/event2 is tagged by udev as: Keyboard
    [ 4792.387] (II) input device 'Power Button', /dev/input/event2 is a keyboard
    [ 4792.399] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/LNXPWRBN:00/input/input2/event2"
    [ 4792.399] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD, id 6)
    [ 4792.399] (**) Option "xkb_model" "pc104"
    [ 4792.399] (**) Option "xkb_layout" "us,ru"
    [ 4792.399] (**) Option "xkb_options" "caps:swapescape,grp:toggle"
    [ 4792.432] (II) input device 'Power Button', /dev/input/event2 is tagged by udev as: Keyboard
    [ 4792.432] (II) input device 'Power Button', /dev/input/event2 is a keyboard
    [ 4792.432] (II) config/udev: Adding input device Power Button (/dev/input/event1)
    [ 4792.432] (**) Power Button: Applying InputClass "evdev keyboard catchall"
    [ 4792.432] (**) Power Button: Applying InputClass "libinput keyboard catchall"
    [ 4792.432] (**) Power Button: Applying InputClass "system-keyboard"
    [ 4792.432] (II) Using input driver 'libinput' for 'Power Button'
    [ 4792.432] (**) Power Button: always reports core events
    [ 4792.432] (**) Option "Device" "/dev/input/event1"
    [ 4792.433] (II) input device 'Power Button', /dev/input/event1 is tagged by udev as: Keyboard
    [ 4792.433] (II) input device 'Power Button', /dev/input/event1 is a keyboard
    [ 4792.489] (**) Option "config_info" "udev:/sys/devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input1/event1"
    [ 4792.489] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD, id 7)
    [ 4792.489] (**) Option "xkb_model" "pc104"
    [ 4792.489] (**) Option "xkb_layout" "us,ru"
    [ 4792.489] (**) Option "xkb_options" "caps:swapescape,grp:toggle"
    [ 4792.490] (II) input device 'Power Button', /dev/input/event1 is tagged by udev as: Keyboard
    [ 4792.490] (II) input device 'Power Button', /dev/input/event1 is a keyboard
    [ 4792.490] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=3 (/dev/input/event0)
    [ 4792.490] (II) No input driver specified, ignoring this device.
    [ 4792.490] (II) This device may have been added with another device file.
    [ 4792.490] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=7 (/dev/input/event17)
    [ 4792.490] (II) No input driver specified, ignoring this device.
    [ 4792.490] (II) This device may have been added with another device file.
    [ 4792.490] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=8 (/dev/input/event18)
    [ 4792.490] (II) No input driver specified, ignoring this device.
    [ 4792.490] (II) This device may have been added with another device file.
    [ 4792.491] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=9 (/dev/input/event19)
    [ 4792.491] (II) No input driver specified, ignoring this device.
    [ 4792.491] (II) This device may have been added with another device file.
    [ 4792.491] (II) config/udev: Adding input device Microsoft Microsoft® 2.4GHz Transceiver v8.0 (/dev/input/event3)
    [ 4792.491] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "evdev keyboard catchall"
    [ 4792.491] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "libinput keyboard catchall"
    [ 4792.491] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "system-keyboard"
    [ 4792.491] (II) Using input driver 'libinput' for 'Microsoft Microsoft® 2.4GHz Transceiver v8.0'
    [ 4792.491] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: always reports core events
    [ 4792.491] (**) Option "Device" "/dev/input/event3"
    [ 4792.491] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event3 is tagged by udev as: Keyboard
    [ 4792.491] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event3 is a keyboard
    [ 4792.554] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.0/usb1/1-1/1-1.1/1-1.1:1.0/0003:045E:07B2.0004/input/input24/event3"
    [ 4792.554] (II) XINPUT: Adding extended input device "Microsoft Microsoft® 2.4GHz Transceiver v8.0" (type: KEYBOARD, id 8)
    [ 4792.554] (**) Option "xkb_model" "pc104"
    [ 4792.554] (**) Option "xkb_layout" "us,ru"
    [ 4792.554] (**) Option "xkb_options" "caps:swapescape,grp:toggle"
    [ 4792.554] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event3 is tagged by udev as: Keyboard
    [ 4792.554] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event3 is a keyboard
    [ 4792.555] (II) config/udev: Adding input device Microsoft Microsoft® 2.4GHz Transceiver v8.0 (/dev/input/event4)
    [ 4792.555] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "evdev pointer catchall"
    [ 4792.555] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "libinput pointer catchall"
    [ 4792.555] (II) Using input driver 'libinput' for 'Microsoft Microsoft® 2.4GHz Transceiver v8.0'
    [ 4792.555] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: always reports core events
    [ 4792.555] (**) Option "Device" "/dev/input/event4"
    [ 4792.555] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event4 is tagged by udev as: Mouse
    [ 4792.555] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event4 is a pointer caps
    [ 4792.609] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.0/usb1/1-1/1-1.1/1-1.1:1.1/0003:045E:07B2.0005/input/input25/event4"
    [ 4792.610] (II) XINPUT: Adding extended input device "Microsoft Microsoft® 2.4GHz Transceiver v8.0" (type: MOUSE, id 9)
    [ 4792.610] (**) Option "AccelerationScheme" "none"
    [ 4792.610] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: (accel) selected scheme none/0
    [ 4792.610] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: (accel) acceleration factor: 2.000
    [ 4792.610] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: (accel) acceleration threshold: 4
    [ 4792.610] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event4 is tagged by udev as: Mouse
    [ 4792.610] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event4 is a pointer caps
    [ 4792.610] (II) config/udev: Adding input device Microsoft Microsoft® 2.4GHz Transceiver v8.0 (/dev/input/mouse0)
    [ 4792.610] (II) No input driver specified, ignoring this device.
    [ 4792.610] (II) This device may have been added with another device file.
    [ 4792.611] (II) config/udev: Adding input device Microsoft Microsoft® 2.4GHz Transceiver v8.0 (/dev/input/event5)
    [ 4792.611] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "evdev keyboard catchall"
    [ 4792.611] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "joystick catchall"
    [ 4792.611] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "libinput keyboard catchall"
    [ 4792.611] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "system-keyboard"
    [ 4792.611] (II) Using input driver 'libinput' for 'Microsoft Microsoft® 2.4GHz Transceiver v8.0'
    [ 4792.611] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: always reports core events
    [ 4792.611] (**) Option "Device" "/dev/input/event5"
    [ 4792.611] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event5 is tagged by udev as: Keyboard Joystick
    [ 4792.611] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event5 is a keyboard
    [ 4792.676] (**) Option "config_info" "udev:/sys/devices/pci0000:00/0000:00:1a.0/usb1/1-1/1-1.1/1-1.1:1.2/0003:045E:07B2.0006/input/input26/event5"
    [ 4792.676] (II) XINPUT: Adding extended input device "Microsoft Microsoft® 2.4GHz Transceiver v8.0" (type: KEYBOARD, id 10)
    [ 4792.676] (**) Option "xkb_model" "pc104"
    [ 4792.676] (**) Option "xkb_layout" "us,ru"
    [ 4792.676] (**) Option "xkb_options" "caps:swapescape,grp:toggle"
    [ 4792.676] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event5 is tagged by udev as: Keyboard Joystick
    [ 4792.676] (II) input device 'Microsoft Microsoft® 2.4GHz Transceiver v8.0', /dev/input/event5 is a keyboard
    [ 4792.677] (II) config/udev: Adding input device Microsoft Microsoft® 2.4GHz Transceiver v8.0 (/dev/input/js0)
    [ 4792.677] (**) Microsoft Microsoft® 2.4GHz Transceiver v8.0: Applying InputClass "system-keyboard"
    [ 4792.677] (II) No input driver specified, ignoring this device.
    [ 4792.677] (II) This device may have been added with another device file.
    [ 4792.677] (II) config/udev: Adding input device HDA Intel PCH Front Mic (/dev/input/event7)
    [ 4792.677] (II) No input driver specified, ignoring this device.
    [ 4792.677] (II) This device may have been added with another device file.
    [ 4792.677] (II) config/udev: Adding input device HDA Intel PCH Line (/dev/input/event8)
    [ 4792.677] (II) No input driver specified, ignoring this device.
    [ 4792.677] (II) This device may have been added with another device file.
    [ 4792.677] (II) config/udev: Adding input device HDA Intel PCH Line Out Front (/dev/input/event9)
    [ 4792.677] (II) No input driver specified, ignoring this device.
    [ 4792.677] (II) This device may have been added with another device file.
    [ 4792.678] (II) config/udev: Adding input device HDA Intel PCH Line Out Surround (/dev/input/event10)
    [ 4792.678] (II) No input driver specified, ignoring this device.
    [ 4792.678] (II) This device may have been added with another device file.
    [ 4792.678] (II) config/udev: Adding input device HDA Intel PCH Line Out CLFE (/dev/input/event11)
    [ 4792.678] (II) No input driver specified, ignoring this device.
    [ 4792.678] (II) This device may have been added with another device file.
    [ 4792.678] (II) config/udev: Adding input device HDA Intel PCH Line Out Side (/dev/input/event12)
    [ 4792.678] (II) No input driver specified, ignoring this device.
    [ 4792.678] (II) This device may have been added with another device file.
    [ 4792.678] (II) config/udev: Adding input device HDA Intel PCH Front Headphone (/dev/input/event13)
    [ 4792.678] (II) No input driver specified, ignoring this device.
    [ 4792.678] (II) This device may have been added with another device file.
    [ 4792.678] (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=7 (/dev/input/event14)
    [ 4792.678] (II) No input driver specified, ignoring this device.
    [ 4792.678] (II) This device may have been added with another device file.
    [ 4792.679] (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=8 (/dev/input/event15)
    [ 4792.679] (II) No input driver specified, ignoring this device.
    [ 4792.679] (II) This device may have been added with another device file.
    [ 4792.679] (II) config/udev: Adding input device HDA Intel PCH Rear Mic (/dev/input/event6)
    [ 4792.679] (II) No input driver specified, ignoring this device.
    [ 4792.679] (II) This device may have been added with another device file.
    [ 4792.679] (II) config/udev: Adding input device Eee PC WMI hotkeys (/dev/input/event16)
    [ 4792.679] (**) Eee PC WMI hotkeys: Applying InputClass "evdev keyboard catchall"
    [ 4792.679] (**) Eee PC WMI hotkeys: Applying InputClass "libinput keyboard catchall"
    [ 4792.679] (**) Eee PC WMI hotkeys: Applying InputClass "system-keyboard"
    [ 4792.679] (II) Using input driver 'libinput' for 'Eee PC WMI hotkeys'
    [ 4792.679] (**) Eee PC WMI hotkeys: always reports core events
    [ 4792.679] (**) Option "Device" "/dev/input/event16"
    [ 4792.679] (II) input device 'Eee PC WMI hotkeys', /dev/input/event16 is tagged by udev as: Keyboard
    [ 4792.679] (II) input device 'Eee PC WMI hotkeys', /dev/input/event16 is a keyboard
    [ 4792.712] (**) Option "config_info" "udev:/sys/devices/platform/eeepc-wmi/input/input19/event16"
    [ 4792.712] (II) XINPUT: Adding extended input device "Eee PC WMI hotkeys" (type: KEYBOARD, id 11)
    [ 4792.712] (**) Option "xkb_model" "pc104"
    [ 4792.713] (**) Option "xkb_layout" "us,ru"
    [ 4792.713] (**) Option "xkb_options" "caps:swapescape,grp:toggle"
    [ 4792.713] (II) input device 'Eee PC WMI hotkeys', /dev/input/event16 is tagged by udev as: Keyboard
    [ 4792.713] (II) input device 'Eee PC WMI hotkeys', /dev/input/event16 is a keyboard
    [ 4793.050] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.050] (**) NVIDIA(0): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4793.050] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.081] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.081] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4793.081] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.081] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.081] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4793.081] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.081] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.081] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4793.081] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.081] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4793.081] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.081] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.081] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4793.081] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.081] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4793.081] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.081] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.081] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    [ 4793.129] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.129] (**) NVIDIA(0): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4793.129] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.160] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.160] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4793.160] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.160] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.160] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4793.160] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.160] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.160] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4793.160] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.160] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4793.160] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.160] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.160] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4793.160] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.160] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4793.160] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.160] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.160] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    [ 4793.229] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.229] (**) NVIDIA(0): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4793.229] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.259] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4793.259] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4793.259] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4793.259] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.259] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4793.259] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.259] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.259] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4793.259] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.259] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4793.259] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.259] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.259] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4793.259] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4793.259] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4793.259] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4793.259] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4793.259] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    [ 4849.763] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4849.763] (**) NVIDIA(0): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4849.763] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4849.794] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4849.794] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4849.794] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4849.794] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4849.794] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4849.794] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4849.794] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4849.794] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4849.795] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4849.795] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4849.795] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4849.795] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4849.795] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4849.795] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4849.795] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4849.795] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4849.795] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4849.795] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    I also tried other configs, but none of them made me happy:
    Section "DRI"
    Mode 0666
    EndSection
    Section "Monitor"
    Identifier "SAM"
    ModelName "Samsung SMS27A850"
    HorizSync 30.0 - 90.0
    VertRefresh 56.0 - 75.0
    Option "DPMS"
    DisplaySize 597 336 # In millimeters
    EndSection
    Section "Monitor"
    Identifier "VS"
    ModelName "ViewSonic VG1930wm"
    Option "DPMS"
    Option "RightOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Monitor"
    Identifier "BNQ"
    ModelName "BenQ FP737s"
    Option "DPMS"
    Option "LeftOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Device"
    Identifier "nvidia-sam"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTX 560 Ti"
    BusID "01:00:0"
    Option "Monitor-DVI-I-1" "SAM"
    Screen 0
    EndSection
    Section "Device"
    Identifier "nvidia-vs"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTX 560 Ti"
    BusID "01:00:0"
    Option "Monitor-DVI-I-2" "VS"
    Screen 1
    EndSection
    Section "Device"
    Identifier "intel"
    Driver "modesetting"
    Option "AccelMethod" "none"
    VendorName "Intel"
    BoardName "Intel Integrated Graphic Controller"
    BusID "00:02:0"
    Option "Monitor-VGA-0" "BNQ"
    Screen 2
    EndSection
    Section "Screen"
    Identifier "SAM-Screen"
    Device "nvidia-sam"
    Monitor "SAM"
    #DefaultDepth 24
    # Option "Stereo" "0"
    # Option "SLI" "Off"
    # Option "MultiGPU" "Off"
    # Option "BaseMosaic" "off"
    #Option "AllowEmptyInitialConfiguration"
    # SubSection "Display"
    # Depth 24
    # EndSubSection
    EndSection
    Section "Screen"
    Identifier "VS-Screen"
    Device "nvidia-vs"
    Monitor "VS"
    #DefaultDepth 24
    # Option "Stereo" "0"
    # Option "SLI" "Off"
    # Option "MultiGPU" "Off"
    # Option "BaseMosaic" "off"
    #Option "AllowEmptyInitialConfiguration"
    # SubSection "Display"
    # Depth 24
    # EndSubSection
    EndSection
    Section "Screen"
    Identifier "BNQ-Screen"
    Device "intel"
    Monitor "BNQ"
    EndSection
    Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "SAM-Screen"
    Screen 1 "VS-Screen" RightOf "SAM-Screen"
    Screen 2 "BNQ-Screen" LeftOf "SAM-Screen"
    Option "Xinerama" "0"
    EndSection
    [ 4488.527]
    X.Org X Server 1.17.1
    Release Date: 2015-02-10
    [ 4488.527] X Protocol Version 11, Revision 0
    [ 4488.527] Build Operating System: Linux 3.18.6-1-ARCH x86_64
    [ 4488.527] Current Operating System: Linux earth 3.18.6-1-ARCH #1 SMP PREEMPT Sat Feb 7 08:44:05 CET 2015 x86_64
    [ 4488.527] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=dc49b07c-e67c-4b6f-91f7-7c1adc050e21 rw quiet ipv6.disable=1 init=/usr/lib/systemd/systemd security=tomoyo TOMOYO_trigger=/usr/lib/systemd/systemd
    [ 4488.527] Build Date: 22 February 2015 12:50:32PM
    [ 4488.527]
    [ 4488.527] Current version of pixman: 0.32.6
    [ 4488.527] Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    [ 4488.527] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [ 4488.527] (==) Log file: "/var/log/Xorg.0.log", Time: Sun Mar 1 14:38:54 2015
    [ 4488.527] (==) Using config directory: "/etc/X11/xorg.conf.d"
    [ 4488.527] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
    [ 4488.528] (==) ServerLayout "Layout0"
    [ 4488.528] (**) |-->Screen "SAM-Screen" (0)
    [ 4488.528] (**) | |-->Monitor "SAM"
    [ 4488.528] (**) | |-->Device "nvidia-sam"
    [ 4488.528] (**) |-->Screen "VS-Screen" (1)
    [ 4488.528] (**) | |-->Monitor "VS"
    [ 4488.528] (**) | |-->Device "nvidia-vs"
    [ 4488.528] (**) |-->Screen "BNQ-Screen" (2)
    [ 4488.528] (**) | |-->Monitor "BNQ"
    [ 4488.528] (**) | |-->Device "intel"
    [ 4488.528] (**) Option "Xinerama" "0"
    [ 4488.528] (==) Automatically adding devices
    [ 4488.528] (==) Automatically enabling devices
    [ 4488.528] (==) Automatically adding GPU devices
    [ 4488.528] (==) FontPath set to:
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/,
    /usr/share/fonts/OTF/,
    /usr/share/fonts/Type1/,
    /usr/share/fonts/100dpi/,
    /usr/share/fonts/75dpi/
    [ 4488.528] (==) ModulePath set to "/usr/lib/xorg/modules"
    [ 4488.528] (II) The server relies on udev to provide the list of input devices.
    If no devices become available, reconfigure udev or disable AutoAddDevices.
    [ 4488.528] (II) Loader magic: 0x815d80
    [ 4488.528] (II) Module ABI versions:
    [ 4488.528] X.Org ANSI C Emulation: 0.4
    [ 4488.528] X.Org Video Driver: 19.0
    [ 4488.528] X.Org XInput driver : 21.0
    [ 4488.528] X.Org Server Extension : 9.0
    [ 4488.529] (EE) systemd-logind: failed to get session: PID 32224 does not belong to any known session
    [ 4488.530] (II) xfree86: Adding drm device (/dev/dri/card1)
    [ 4488.530] (II) xfree86: Adding drm device (/dev/dri/card0)
    [ 4488.531] (--) PCI: (0:0:2:0) 8086:0102:1043:84ca rev 9, Mem @ 0xf6400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64
    [ 4488.531] (--) PCI:*(0:1:0:0) 10de:1200:10b0:0401 rev 161, Mem @ 0xf4000000/33554432, 0xe0000000/134217728, 0xe8000000/67108864, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
    [ 4488.531] (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
    [ 4488.531] (II) LoadModule: "glx"
    [ 4488.531] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
    [ 4488.542] (II) Module glx: vendor="NVIDIA Corporation"
    [ 4488.542] compiled for 4.0.2, module version = 1.0.0
    [ 4488.542] Module class: X.Org Server Extension
    [ 4488.542] (II) NVIDIA GLX Module 346.47 Thu Feb 19 18:09:07 PST 2015
    [ 4488.542] (II) LoadModule: "nvidia"
    [ 4488.542] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [ 4488.543] (II) Module nvidia: vendor="NVIDIA Corporation"
    [ 4488.543] compiled for 4.0.2, module version = 1.0.0
    [ 4488.543] Module class: X.Org Video Driver
    [ 4488.543] (II) LoadModule: "modesetting"
    [ 4488.543] (II) Loading /usr/lib/xorg/modules/drivers/modesetting_drv.so
    [ 4488.543] (II) Module modesetting: vendor="X.Org Foundation"
    [ 4488.543] compiled for 1.17.1, module version = 1.17.1
    [ 4488.543] Module class: X.Org Video Driver
    [ 4488.543] ABI class: X.Org Video Driver, version 19.0
    [ 4488.543] (II) NVIDIA dlloader X Driver 346.47 Thu Feb 19 17:47:18 PST 2015
    [ 4488.543] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [ 4488.543] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
    [ 4488.543] (++) using VT number 7
    [ 4488.645] (II) Loading sub module "fb"
    [ 4488.645] (II) LoadModule: "fb"
    [ 4488.645] (II) Loading /usr/lib/xorg/modules/libfb.so
    [ 4488.645] (II) Module fb: vendor="X.Org Foundation"
    [ 4488.646] compiled for 1.17.1, module version = 1.0.0
    [ 4488.646] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 4488.646] (II) Loading sub module "wfb"
    [ 4488.646] (II) LoadModule: "wfb"
    [ 4488.646] (II) Loading /usr/lib/xorg/modules/libwfb.so
    [ 4488.646] (II) Module wfb: vendor="X.Org Foundation"
    [ 4488.646] compiled for 1.17.1, module version = 1.0.0
    [ 4488.646] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 4488.646] (II) Loading sub module "ramdac"
    [ 4488.646] (II) LoadModule: "ramdac"
    [ 4488.646] (II) Module "ramdac" already built-in
    [ 4488.647] (II) modeset(2): using drv /dev/dri/card0
    [ 4488.647] (II) modeset(G0): using drv /dev/dri/card0
    [ 4488.647] (EE) Screen 2 deleted because of no matching config section.
    [ 4488.647] (II) UnloadModule: "modesetting"
    [ 4488.647] (II) NVIDIA(0): Creating default Display subsection in Screen section
    "SAM-Screen" for depth/fbbpp 24/32
    [ 4488.647] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
    [ 4488.647] (==) NVIDIA(0): RGB weight 888
    [ 4488.647] (==) NVIDIA(0): Default visual is TrueColor
    [ 4488.647] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    [ 4488.648] (II) NVIDIA(1): Creating default Display subsection in Screen section
    "VS-Screen" for depth/fbbpp 24/32
    [ 4488.648] (**) NVIDIA(0): Enabling 2D acceleration
    [ 4489.049] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20150116)
    [ 4489.050] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 560 Ti (GF114) at PCI:1:0:0 (GPU-0)
    [ 4489.050] (--) NVIDIA(0): Memory: 1048576 kBytes
    [ 4489.050] (--) NVIDIA(0): VideoBIOS: 70.24.21.00.00
    [ 4489.050] (II) NVIDIA(0): Detected PCI Express Link width: 16X
    [ 4489.082] (--) NVIDIA(0): Valid display device(s) on GeForce GTX 560 Ti at PCI:1:0:0
    [ 4489.082] (--) NVIDIA(0): CRT-0
    [ 4489.082] (--) NVIDIA(0): CRT-1
    [ 4489.082] (--) NVIDIA(0): Samsung SMS27A850 (DFP-0) (boot, connected)
    [ 4489.082] (--) NVIDIA(0): DFP-1
    [ 4489.082] (--) NVIDIA(0): ViewSonic VG1930wm (DFP-2) (connected)
    [ 4489.082] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
    [ 4489.082] (--) NVIDIA(GPU-0): CRT-1: 400.0 MHz maximum pixel clock
    [ 4489.082] (--) NVIDIA(0): Samsung SMS27A850 (DFP-0): Internal TMDS
    [ 4489.082] (--) NVIDIA(GPU-0): Samsung SMS27A850 (DFP-0): 330.0 MHz maximum pixel clock
    [ 4489.082] (--) NVIDIA(0): DFP-1: Internal TMDS
    [ 4489.082] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
    [ 4489.082] (--) NVIDIA(0): ViewSonic VG1930wm (DFP-2): Internal TMDS
    [ 4489.082] (--) NVIDIA(GPU-0): ViewSonic VG1930wm (DFP-2): 330.0 MHz maximum pixel clock
    [ 4489.083] (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4489.083] (**) NVIDIA(0): device Samsung SMS27A850 (DFP-0) (Using EDID frequencies
    [ 4489.083] (**) NVIDIA(0): has been enabled on all display devices.)
    [ 4489.083] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4489.083] (WW) NVIDIA(GPU-0): mode "1920x1080" is specified in the EDID; however, the
    [ 4489.083] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4489.083] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4489.083] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1920x1080".
    [ 4489.083] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4489.083] (WW) NVIDIA(GPU-0): mode "1280x720" is specified in the EDID; however, the
    [ 4489.083] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4489.083] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4489.083] (WW) NVIDIA(GPU-0): VertRefresh check for mode "1280x720".
    [ 4489.083] (WW) NVIDIA(GPU-0): The EDID for Samsung SMS27A850 (DFP-0) contradicts itself:
    [ 4489.083] (WW) NVIDIA(GPU-0): mode "720x576" is specified in the EDID; however, the
    [ 4489.083] (WW) NVIDIA(GPU-0): EDID's valid VertRefresh range (56.000-75.000 Hz) would
    [ 4489.083] (WW) NVIDIA(GPU-0): exclude this mode's VertRefresh (50.0 Hz); ignoring
    [ 4489.083] (WW) NVIDIA(GPU-0): VertRefresh check for mode "720x576".
    [ 4489.084] (==) NVIDIA(0):
    [ 4489.084] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
    [ 4489.084] (==) NVIDIA(0): will be used as the requested mode.
    [ 4489.084] (==) NVIDIA(0):
    [ 4489.084] (II) NVIDIA(0): Validated MetaModes:
    [ 4489.084] (II) NVIDIA(0): "DFP-0:nvidia-auto-select"
    [ 4489.084] (II) NVIDIA(0): Virtual screen size determined to be 2560 x 1440
    [ 4489.113] (--) NVIDIA(0): DPI set to (125, 114); computed from "UseEdidDpi" X config
    [ 4489.113] (--) NVIDIA(0): option
    [ 4489.114] (==) NVIDIA(1): Depth 24, (==) framebuffer bpp 32
    [ 4489.114] (==) NVIDIA(1): RGB weight 888
    [ 4489.114] (==) NVIDIA(1): Default visual is TrueColor
    [ 4489.114] (==) NVIDIA(1): Using gamma correction (1.0, 1.0, 1.0)
    [ 4489.114] (II) NVIDIA(1): NVIDIA GPU GeForce GTX 560 Ti (GF114) at PCI:1:0:0 (GPU-0)
    [ 4489.114] (--) NVIDIA(1): Memory: 1048576 kBytes
    [ 4489.114] (--) NVIDIA(1): VideoBIOS: 70.24.21.00.00
    [ 4489.114] (II) NVIDIA(1): Detected PCI Express Link width: 16X
    [ 4489.146] (--) NVIDIA(1): Valid display device(s) on GeForce GTX 560 Ti at PCI:1:0:0
    [ 4489.146] (--) NVIDIA(1): CRT-0
    [ 4489.146] (--) NVIDIA(1): CRT-1
    [ 4489.146] (--) NVIDIA(1): Samsung SMS27A850 (DFP-0) (boot, connected)
    [ 4489.146] (--) NVIDIA(1): DFP-1
    [ 4489.146] (--) NVIDIA(1): ViewSonic VG1930wm (DFP-2) (connected)
    [ 4489.146] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
    [ 4489.146] (--) NVIDIA(GPU-0): CRT-1: 400.0 MHz maximum pixel clock
    [ 4489.146] (--) NVIDIA(1): Samsung SMS27A850 (DFP-0): Internal TMDS
    [ 4489.146] (--) NVIDIA(GPU-0): Samsung SMS27A850 (DFP-0): 330.0 MHz maximum pixel clock
    [ 4489.146] (--) NVIDIA(1): DFP-1: Internal TMDS
    [ 4489.146] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
    [ 4489.146] (--) NVIDIA(1): ViewSonic VG1930wm (DFP-2): Internal TMDS
    [ 4489.146] (--) NVIDIA(GPU-0): ViewSonic VG1930wm (DFP-2): 330.0 MHz maximum pixel clock
    [ 4489.146] (**) NVIDIA(1): Option "Rotate" "left"
    [ 4489.146] (**) NVIDIA(1): Using HorizSync/VertRefresh ranges from the EDID for display
    [ 4489.146] (**) NVIDIA(1): device ViewSonic VG1930wm (DFP-2) (Using EDID frequencies
    [ 4489.146] (**) NVIDIA(1): has been enabled on all display devices.)
    [ 4489.147] (==) NVIDIA(1):
    [ 4489.147] (==) NVIDIA(1): No modes were requested; the default mode "nvidia-auto-select"
    [ 4489.147] (==) NVIDIA(1): will be used as the requested mode.
    [ 4489.147] (==) NVIDIA(1):
    [ 4489.147] (II) NVIDIA(1): Validated MetaModes:
    [ 4489.147] (II) NVIDIA(1): "DFP-2:nvidia-auto-select{Rotation=90}"
    [ 4489.147] (II) NVIDIA(1): Virtual screen size determined to be 900 x 1440
    [ 4489.150] (--) NVIDIA(1): DPI set to (55, 140); computed from "UseEdidDpi" X config
    [ 4489.150] (--) NVIDIA(1): option
    [ 4489.150] (EE)
    [ 4489.150] (EE) Backtrace:
    [ 4489.151] (EE) 0: /usr/lib/xorg-server/Xorg (OsLookupColor+0x119) [0x5949c9]
    [ 4489.151] (EE) 1: /usr/lib/libc.so.6 (__restore_rt+0x0) [0x7f64dd88653f]
    [ 4489.151] (EE) 2: /usr/lib/xorg-server/Xorg (xf86nameCompare+0x19) [0x4a9159]
    [ 4489.151] (EE) 3: /usr/lib/xorg-server/Xorg (xf86findOption+0x2c) [0x4a2c6c]
    [ 4489.151] (EE) 4: /usr/lib/xorg-server/Xorg (xf86SetDepthBpp+0x430) [0x4828d0]
    [ 4489.151] (EE) 5: /usr/lib/xorg/modules/drivers/modesetting_drv.so (_init+0x25e1) [0x7f64d695aab1]
    [ 4489.151] (EE) 6: /usr/lib/xorg-server/Xorg (InitOutput+0xbcc) [0x47b63c]
    [ 4489.151] (EE) 7: /usr/lib/xorg-server/Xorg (remove_fs_handlers+0x22a) [0x43c9da]
    [ 4489.152] (EE) 8: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7f64dd873800]
    [ 4489.152] (EE) 9: /usr/lib/xorg-server/Xorg (_start+0x29) [0x427039]
    [ 4489.152] (EE) 10: ? (?+0x29) [0x29]
    [ 4489.152] (EE)
    [ 4489.152] (EE) Segmentation fault at address 0x231
    [ 4489.152] (EE)
    Fatal server error:
    [ 4489.152] (EE) Caught signal 11 (Segmentation fault). Server aborting
    [ 4489.152] (EE)
    [ 4489.152] (EE)
    Please consult the The X.Org Foundation support
    at http://wiki.x.org
    for help.
    [ 4489.152] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
    [ 4489.152] (EE)
    [ 4489.295] (EE) Server terminated with error (1). Closing log file.
    Section "DRI"
    Mode 0666
    EndSection
    Section "Monitor"
    Identifier "SAM"
    ModelName "Samsung SMS27A850"
    HorizSync 30.0 - 90.0
    VertRefresh 56.0 - 75.0
    Option "DPMS"
    DisplaySize 597 336 # In millimeters
    EndSection
    Section "Monitor"
    Identifier "VS"
    ModelName "ViewSonic VG1930wm"
    Option "DPMS"
    Option "RightOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Monitor"
    Identifier "BNQ"
    ModelName "BenQ FP737s"
    Option "DPMS"
    Option "LeftOf" "SAM"
    Option "Rotate" "left"
    EndSection
    Section "Device"
    Identifier "nvidia-sam"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTX 560 Ti"
    BusID "01:00:0"
    Option "Monitor-DVI-I-1" "SAM"
    Screen 0
    EndSection
    Section "Device"
    Identifier "nvidia-vs"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTX 560 Ti"
    BusID "01:00:0"
    Option "Monitor-DVI-I-2" "VS"
    Screen 1
    EndSection
    Section "Device"
    Identifier "intel"
    Driver "intel"
    Option "AccelMethod" "none"
    VendorName "Intel"
    BoardName "Intel Integrated Graphic Controller"
    BusID "00:02:0"
    Option "Monitor-VGA-0" "BNQ"
    Screen 2
    EndSection
    Section "Screen"
    Identifier "SAM-Screen"
    Device "nvidia-sam"
    Monitor "SAM"
    #DefaultDepth 24
    # Option "Stereo" "0"
    # Option "SLI" "Off"
    # Option "MultiGPU" "Off"
    # Option "BaseMosaic" "off"
    #Option "AllowEmptyInitialConfiguration"
    # SubSection "Display"
    # Depth 24
    # EndSubSection
    EndSection
    Section "Screen"
    Identifier "VS-Screen"
    Device "nvidia-vs"
    Monitor "VS"
    #DefaultDepth 24
    # Option "Stereo" "0"
    # Option "SLI" "Off"
    # Option "MultiGPU" "Off"
    # Option "BaseMosaic" "off"
    #Option "AllowEmptyInitialConfiguration"
    # SubSection "Display"
    # Depth 24
    # EndSubSection
    EndSection
    Section "Screen"
    Identifier "BNQ-Screen"
    Device "intel"
    Monitor "BNQ"
    EndSection
    Section

    Hi:
    There is a specific order in which you need to install the graphics drivers on notebooks with the nvidia/intel switchable graphics (a quick way to verify the result follows the steps below).
    1. Install the Intel chipset installation utility and reboot.  If the one on your notebook's support page doesn't work, use this one.
    https://downloadcenter.intel.com/Detail_Desc.aspx?DwnldID=20775&lang=eng&ProdId=816
    2. Install the Intel HD graphics driver and reboot.  If the one on your notebook's support page doesn't work, try this one.
    https://downloadcenter.intel.com/Detail_Desc.aspx?DwnldID=24348
    3. Install the nVidia graphics driver and reboot.  If the one on your notebook's support page doesn't work, try this one.
    http://www.nvidia.com/download/driverResults.aspx/81879/en-us
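    Once all three are installed, a quick way to confirm that Windows sees both GPUs and which driver versions they ended up with is the stock WMIC query below (standard Windows tooling, nothing vendor-specific; run it from a normal command prompt):
        rem list every display adapter Windows knows about, with its driver version
        wmic path win32_VideoController get Name,DriverVersion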

  • Will NVIDIA GeForce GTX 560 fit into HP ProBook 6470b

    Hi,
    I want to install an NVIDIA GeForce GTX 560 card into my HP laptop (ProBook 6470b). Will it require any other hardware as well?
    Into which port can I insert/install it?
    Thanks in advance.
    Br
    Manvash

    Hi Erico, Thanks a lot!

  • Can't get GeForce GTX 560 Ti card CUDA to work with new Premiere Pro CC 2014 version

    In the past I've been able to get CUDA on my GeForce GTX 560 Ti card to work with Mercury Playback engine by adding "GeForce GTX 560 Ti" to the cuda_supported_cards.txt file. But with the new CC 2014 version of Premiere, that file isn't in the Premiere Pro directory anymore. Is there a way to get my GTX 560 Ti to be listed as a CUDA-supported card with CC 2014?

    Okay, I figured it out.  I installed the latest graphics driver and then opened the NVIDIA control panel.  In there I selected "Manage 3D settings" and went to the "program settings" tab.  I selected Adobe Premiere (adobe premiere pro.exe) from the list and under the "CUDA - GPUs" feature, I changed it to "Use these GPUs" and selected my GTX 560 Ti (which was the only one listed).
    After doing that, I opened Premiere Pro CC 2014 and selected "Project Settings - General" and was then able to choose "Mercury Playback Engine GPU Acceleration" under Renderer (which had previously been unavailable).
    Hopefully this will help someone else if they encounter the same problem.
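    For anyone still on an older version: the cuda_supported_cards.txt hack was nothing more than a plain-text list of card names, one per line, kept in the Premiere Pro program folder (e.g. C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt on a default CS6 install), and adding the line for your card was the entire trick. Illustrative contents (the stock entries vary by version):
        GeForce GTX 285
        GeForce GTX 470
        GeForce GTX 560 Ti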

  • Why does the GeForce GTX 560 Ti NOT support OpenCL?

    From the Nvidia web site it says that the GeForce GTX 560 Ti supports:  DirectX 11 > 3D Vision > PhysX > CUDA > SLI >
    The 301.24 drivers were installed, and I downgraded to 296.10 to see if that would help, but it didn't.
    When I check the Performance tab under Advanced it shows that the OpenCL option is grayed out.  When I check the System Info log it also shows that OpenCL is not available (included below).
    Why is this and when will it be fixed?  Now that CS6 is shipping it should be working.
    Adobe Photoshop Version: 13.0 (13.0 20120305.m.415 2012/03/05:21:00:00) x64
    Operating System: Windows 7 64-bit
    Version: 6.1 Service Pack 1
    System architecture: Intel CPU Family:6, Model:10, Stepping:7 with MMX, SSE Integer, SSE FP, SSE2, SSE3, SSE4.1, SSE4.2, HyperThreading
    Physical processor count: 4
    Logical processor count: 8
    Processor speed: 3411 MHz
    Built-in memory: 16351 MB
    Free memory: 12614 MB
    Memory available to Photoshop: 14726 MB
    Memory used by Photoshop: 95 %
    Image tile size: 128K
    Image cache levels: 4
    OpenGL Drawing: Enabled.
    OpenGL Drawing Mode: Advanced
    OpenGL Allow Normal Mode: True.
    OpenGL Allow Advanced Mode: True.
    OpenGL Allow Old GPUs: Not Detected.
    Video Card Vendor: NVIDIA Corporation
    Video Card Renderer: GeForce GTX 560 Ti/PCIe/SSE2
    Display: 2
    Display Bounds:=  top: -1080, left: 0, bottom: 0, right: 1920
    Display: 1
    Display Bounds:=  top: 0, left: 0, bottom: 1080, right: 1920
    Video Card Number: 1
    Video Card: NVIDIA GeForce GTX 560 Ti 
    OpenCL Unavailable
    Driver Version: 8.17.12.9610
    Driver Date: 20120229000000.000000-000
    Video Card Driver: nvd3dumx.dll,nvwgf2umx.dll,nvwgf2umx.dll,nvd3dum,nvwgf2um,nvwgf2um
    Video Mode: 1920 x 1080 x 4294967296 colors
    Video Card Caption: NVIDIA GeForce GTX 560 Ti 
    Video Card Memory: 1024 MB
    Video Rect Texture Size: 16384

    Bandalier1 wrote:
    Problem solved.  NVidia just released their first 300-series drivers and the OpenCL option is working now…
    Good for you!  I was just about to start typing a reply to the OP clarifying that support for OpenCL depends on the driver for the card, when I read your last post.
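    One more quick check on Windows: the OpenCL loader only finds implementations that the driver registers under the Khronos ICD key, so if no NVIDIA entry (nvopencl.dll) shows up in the query below, Photoshop has no OpenCL device to use regardless of what the card supports on paper (standard Windows command; the key shown is the usual location for 64-bit drivers):
        reg query "HKLM\SOFTWARE\Khronos\OpenCL\Vendors"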

  • GeForce GTX 560 Ti stopped working

    Hi!
    My "GeForce GTX 560 Ti" stopped working after the last update even though I didn't see it on the legacy driver list (http://www.nvidia.com/object/IO_32667.html) and it still is listed on the "supported products" list of the "latest short lived branch 343.22 " (http://www.nvidia.com/Download/driverRe … 7844/en-us)
    After uninstalling the normal nvidia driver and installing nvidia-304xx instead, it works again.
    Uhm... what's going on?
    thx.
    Last edited by whoops (2014-10-09 17:10:53)
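    For anyone landing here later, the driver swap described above boils down to something like this (package names as they were in the Arch repos at the time; check what the current repos actually provide):
        # remove the mainline driver, then install the legacy 304xx branch
        pacman -Rns nvidia nvidia-utils
        pacman -S nvidia-304xx nvidia-304xx-utils
        # reboot afterwards so the matching kernel module gets loaded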

    Hmm...  guess I was thrown off track by the coinciding nvidia news there...
    I didn't see it because it's not in the logs, and when I use Xorg's auto-detect I get no errors at all (it just uses vga or something instead); but when I do everything manually and try to force nvidia for the display, I get "modprobe: could not insert 'nvidia': invalid argument".
    I checked whether the mirror I updated from is OK: other mirrors have the same versions... the linux and nvidia packages are both up to date... uninstalling and reinstalling changed nothing... so... *shrug*
    weird...
    Last edited by whoops (2014-10-09 19:55:38)
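    A couple of generic checks that usually narrow a "modprobe: could not insert 'nvidia': invalid argument" failure down to a kernel/module version mismatch (nothing Arch-specific beyond pacman):
        dmesg | tail -n 20       # the kernel usually logs why the insert failed
        pacman -Q linux nvidia   # the nvidia module must be built for the running kernel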

  • Can I install nvidia quadro 600/ AMD FirePro V4900 (ATI FireGL) on HPE h8-1220t ?

    HP/Nvidia Quadro 600 graphics card hangs w/ blue HP startup screen after installation into HPE h8-1220t.
    Error: After installation and startup, it just sounds a 'beep' and after a minute it beeps again. No further progress beyond the HP blue screen.
    Notes: Card spec
    nVidia Quadro 600 by PNY
    Frame Buffer Memory: 1 GB DDR3
    Memory Interface: 128-bit
    Memory Bandwidth: 25.6 GB/s
    CUDA parallel processing cores: 96
    Max Power Consumption: 40 W
    Energy Star Compliant: Yes
    Physical Dimensions: 2.713" H x 6.60" L
    Single Slot, Low Profile Form Factor: Yes
    Display Connectors: DVI-I (1), DP (1)
    Number of Displays Supported: 2
    DisplayPort: Yes; DVI: Yes; VGA: Yes
    Graphics Bus: PCI Express 2.0 x16
    Thermal Solution: Active
    3D Vision Pro Support: Via USB
    Warranty: 3 Year
    PNY Part #: VCQ600-PB
    I contacted both HP and PNY customer support. They have not found any solution yet.
    The only suggestion I heard from HP customer support is to change the graphics card to one with PCIe 3.0 support.
    I believe the PCIe slot should be backward compatible with 2.0 cards.
    Does it really matter if I install a PCIe 2.0 graphics board in a 3.0 slot?
    My other option is the AMD FirePro V4900, which is one of the certified graphics cards for Autodesk.
    However, I'm not sure whether it works with this machine.
    Please give me an answer.
    Kyoo

    masterqueue wrote:
    HP/Nvidia Quadro 600 graphics card hangs w/ blue HP startup screen after installation into HPE h8-1220t. [...]
    Hello masterqueue.
    This desktop was a CTO (Configured-to-Order) model.  As such, there is a lot of information I do not know about the machine.  Most importantly, how large is the power supply?  According to the computer's specification page it could have anywhere from a 300 watt to a 600 watt power supply.  That is a very big difference.
    I'll keep an eye out for your response.  Have a great day.

  • PSU for MSI GeForce GTX 560 Ti HAWK in SLI ?

    Do you know what PSU I should get for two MSI GeForce GTX 560 Ti HAWKs?

    Depends on what other components your system consists of. >>Posting Guide<<
    But here is the link to the recommended PSU for such a combo. https://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards
    A PSU with 42 A on the 12 V rail is suggested. Thus a Corsair TX650 will suffice, but to leave yourself additional headroom you could opt for a Corsair TX750. The Corsair PSUs have single 12 V rails, which are recommended, and above-average reliability and build quality.
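    A rough sanity check of where that 42 A figure comes from (using the reference 170 W TDP per GTX 560 Ti; the factory-overclocked HAWK will pull somewhat more, so treat these as ballpark numbers):
        2 x 170 W for the two cards      ~ 340 W
        CPU, board, drives, fans         ~ 150-200 W
        estimated system load            ~ 490-540 W
        42 A x 12 V on the 12 V rail     = 504 W
    So a quality 650 W unit covers it with a little to spare, and the 750 W suggestion simply buys more headroom.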

  • Photoshop CS5 and NVIDIA GeForce GTX 295 Crash

    My specifications:
    Photoshop CS5 (x64)
    Windows 7 x64
    NVIDIA GeForce GTX 295 (latest v197.45 driver)
    Intel i7 920 2.67 GHz
    My card is listed as a tested card for CS5 and NVIDIA says it is compatible with CS5, but today while using it my screen went blank and the display driver recovered itself. It took about 5 minutes until Photoshop CS5 became responsive again, and the GPU features were disabled. I am going to try GPU advanced mode and basic mode instead of normal mode to see what happens, but it is a shame that I have issues like this. I had similar issues with CS4.
    What I did to cause this issue: I clicked "Place..." to place an object, and that is when it crashed. It only happens randomly as I use Photoshop. Please, NVIDIA or Adobe, fix this issue. I love the GPU features and I want to run them, but this crashing is annoying after spending so much on this product. I hate having to pay to contact Adobe technical support; it should be included with the purchase of the software!
    How do I fix this without having to drop to basic GPU mode or disable the GPU entirely?

    SteveR From UK wrote:
    With my own problems, and as a keen observer of this thread, it is making me think that I should stay away from Nvidia cards completely and go ATI
    It comes down to policies within companies that in some cases we simply cannot know.
    For example:
    Does nVidia or ATI have a better plan to support cards with new drivers into the future?  My experience is that right now, with my particular video card, ATI does a better job of making new features available (and making them work right) than nVidia.  Is this based on support policy or simply what the engineers can and can't do?  Maybe it's partially a matter of funding.
    Who has a better handle on how to develop unified software and keep it working across a whole family of cards?  What is their software quality?  I am active on Windows forums, and it sure seemed to take nVidia a lot longer than ATI to stabilize their drivers for Windows 7.
    Who has a better test lab?  You can imagine that if one company has a better built/funded/maintained test lab, they're more likely to make new versions of drivers for old cards work properly if they have a computer set aside somewhere with that old card in it just waiting to try out new drivers.  Unfortunately you can imagine with these companies making many, many new models all the time that funding that lab (and funding the MAINTENANCE of that lab) could get to be HUGE!
    Lastly, how much do Marketing or other greedy business people influence company direction, vs. Engineers?  Imagine a Marketing person's thoughts:  "Stop making new drivers for that old card; we're not selling those and haven't for a long time.  If a person has one and can't run the latest game or graphics software, he'll go buy a new one, and that's more money for us."
    By the way, just to get an idea of how many different cards we're talking about, look at the sheer number of models listed in the various lists on this page:  http://www.videocardbenchmark.net/
    Just food for thought.
    -Noel

  • The new EVGA GeForce GTX 560 Ti 2Win

    Adobe, it would be interesting if someone would check out this card for compatibility with Premiere for hardware MPE.  We constantly have people who have requirements for more than two displays.  All four outputs on this card (3 DVI and 1 HDMI) are usable simultaneously.  While it is appreciably more expensive than two GTX 560 Ti cards, the big advantage would be that it uses only a single PCIe interface.  Apparently it is not SLI-enabled externally (some gamers are upset by this), but that might be because the two onboard GPUs are in SLI mode internally--that would be a bummer for Premiere.

    Bill Gehrke wrote:
    Adobe, it would be interesting if someone would check out this card for compatibility with Premiere for hardware MPE.
    Approach this card with the utmost of care.  Recently, nVidia has been having a bunch of driver trouble with this specific card for gamers.  It's probably not completely apropos to compare the stress a gamer puts on his card to what an editor does.  However, what the gamers are seeing is the driver just up and crashing under load.
    It only seems to be happening with the 560Ti for some oddball reason.  The troublemaking game is Battlefield 3.  nVidia has ack'd the problem and they're hacking away at it.  Given that, however, it might behoove you to look elsewhere.
    jas

  • GPUsniffer.exe will not accept my GeForce GTX 560 Ti using drivers newer than 296.10

    I updated my GPU drivers since the 296.10 version is over a year old, but the Mercury Playback hack wouldn't work anymore. I was going bonkers all day today trying to troubleshoot the issue. I uninstalled and reinstalled a bunch of times. GPUsniffer would give the notice that it "did not find any devices that support GPU computation". I went back to the old drivers and, sure enough, GPUsniffer would detect the card.
    I wanted to make this post for people googling issues with the cuda_supported_cards.txt hack for CUDA-enabled GeForce cards that aren't officially approved by Adobe, and also so the engineers at Adobe can look into whether it's GPUsniffer.exe that fails to detect the newer drivers, or whether it's Nvidia's problem on their end. At least you guys can talk to one another to straighten things out.
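    For anyone who wants to see the raw output themselves, GPUsniffer can be run by hand from a command prompt; a minimal sketch, assuming the default CS6 install path (adjust the folder for your version):
        cd "C:\Program Files\Adobe\Adobe Premiere Pro CS6"
        GPUsniffer.exe
        rem the output lists the CUDA/OpenCL devices Premiere considers usable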

    OK guys, I just went ahead and made a video showing you exactly what I'm doing and what happens with the new and old drivers installed. You'll see that the new drivers simply do not work, with everything else necessary to make the hack work remaining constant. In the middle of the video you might want to skip ahead, since there's a long stretch while the old drivers are being installed.
    I'm totally open to the possibility that something could still be wrong on my end, but what can it be when the video shows that the only difference between it working and not working is a driver change?

  • Workstation with Quadro 2000 or GTX 570 HD 2.5 GB, for PP CS 5.5?

    Hey there,
    I'm going to build up a new workstation for video-editing using the Production Premium Suite CS 5.5.
    But there is still one big question and I can't find a proper answer.
    What GPU should I take, or which one will be faster: a Quadro 2000 or a GTX 570 HD with 2.5 GB?
    I know the Quadro has 192 cores and the GTX has 480 cores, so the GTX should be faster on paper. But would it really be faster? I can't find any benchmark comparisons or anything.
    Some say a Quadro 2000 is better if it's only a workstation, but I also read that people prefer the GTX models.
    I know the GTX needs more power and runs warmer under load, but those two facts wouldn't persuade me to buy the Quadro.
    The rest of my system would look like this:
    Intel Core i7-2600
    ASRock Z68 Extreme 3 Gen. 3
    G.Skill RipJaws-X DIMM Kit 16 GB
    Crucial m4 128GB for OS and programs
    Western Digital AV-GP for the media archive and the original video files
    WD Caviar Green for exports and stuff like that
    Fractal Design Arc
    Scythe Katana 3
    Super Flower Golden Green Pro 650 W
    So the only missing thing is the GPU.
    Thanks in advance for your help!

    For the most part, I second Harm. You see, the AV-GP is not compatible with PCs at all - but rather, it's a version of the WD Caviar Green designed specifically for set-top DVRs/PVRs. And in either case, the current WD Greens spin far slower than 7200 RPM - in fact, most current WD Green drives spin at only 5400 RPM (with a few spinning as slow as 4200 RPM). The slower rotational speed negatively affects both sequential transfer performance and random seek performance.
    As for the non-K 2600, it is limited unlocked, not completely locked. There are two disadvantages to this limited unlock: Only the maximum single-core Turbo Boost multiplier is manually selectable, with the differing multi-core Turbo multipliers also increasing by the exact same number of steps as the single-core Turbo frequency (unlike on the K chips, the multi-core Turbo multipliers on the non-K chips cannot be set independently of the single-core Turbo multiplier). Second, the maximum Turbo multiplier setting is limited to four steps above the normal single-core Turbo multiplier: In the case of the 2600, the maximum single-core Turbo multiplier can be set at up to 42x (this will force the maximum quad-core multiplier to be boosted to 39x, which will result in a maximum quad-core overclock to 3.9GHz with the BCLK remaining at its stock 100MHz). The 2600K is so much easier to overclock the way the user wants it while costing only a few dollars more than the non-K 2600.
    As for the original decision between the Quadro 2000 and the GTX 570, definitely the latter: The Quadro 2000, as far as CS5.5 is concerned, is little more than a slightly underclocked GeForce GTS 450 with a huge heatsink attached to it and still only 1GB of VRAM. And as Bill's testing with the various GeForce GPUs (to be specific, Bill tested the GTX 580, GTX 480, GTX 560 Ti 448, GTX 285, GTX 260, GTX 550 Ti and the 9500 GT, from fastest to slowest - however, the GTX 560 Ti 448 is roughly equal to the GTX 480 in performance) in CS5.5 has demonstrated, the Quadro 2000 would definitely be slower than a GTX 550 Ti, especially in MPEG-2 DVD encodes.

  • MSI GTX 560 Ti Twin Frozr II - Heat sink and chipset heat sink come into contact

    Pretty much what the title says. There is a small heat sink on my GTX 560 which comes into contact with one of the motherboard's heat sinks. The card slipped right into the PCIe slot without any problems and is working.
    However, I'm concerned about heat transfer. Could the card heat up the motherboard's heat sink or the other way around?
    This is the small card's heat sink.
    Top view.
    Closeup. I guess the contact surface is not that large.
    PS: I'm sorry for the poor quality of the photos.

     Contact MSI tech support about that issue and see what they have to say about it. You are the second person on this forum to ask about it, and the only thing I can suggest is trimming the heat sink on the MB or the video card, but doing so to either one could possibly void the warranty.
      >> How to contact MSI <<

  • GeForce GTX 560 Ti ... temperature too high and crash

    hi there
    My post is similar to "MSI GTX 560 Ti HAWK Memory Temperature too high???"
    but my graphics card is a GeForce GTX 560 Ti _ 2GB GDDR5 (I can't post an external link to show you which one exactly).
    When I play one game my card's temperature goes over 100°C and my screen goes black :(
    ... I'm sure that my card doesn't really reach that temperature, and I even replaced the stock cooler with an Accelero Xtreme Plus II, but that did nothing for my problem.
    So I'm here to ask for help.
    (sorry for my poor English)
    Thanks in advance for your time!

    I think there is a problem with the temperature sensor, or it is broken, or... I don't know...
    The temperature stays constant at about 78-80°C and has instant spikes when I play a game...
    I saw it with SpeedFan and with MSI Afterburner.
