Dual GPU, ATI + NVidia. Is it possible?

Hello. I work with CUDA at college and now I am playing around with pyrit, and I have seen that ATI cards outperform the NVIDIA ones at a lower price. So, is it possible to use an ATI card together with an NVIDIA card? The main card (the one that will provide video) will probably be the NVIDIA or the onboard one, since the video keeps freezing when computing with CUDA.
My motherboard is a Z77M-D3H and my GPU is an MSI GTX 660 Ti. The mobo has 2 PCI-E slots.
Will there be any problem with drivers?
I also want to start using OpenCL with an ATI card, but I don't want to buy another computer or physically swap which card I am using. So I need both the ATI and the NVIDIA card in the same computer.
Thanks.

carneiro :
try opencl-nvidia / opencl-nvidia-304xx; looking at their dependencies, it should be possible to run them without having the nvidia video driver installed.
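A minimal sketch of what that would look like on Arch (package names as carneiro gives them; ocl-icd is the vendor-neutral ICD loader, and whether the NVIDIA ICD really runs without the full video driver installed is exactly the open question):
# install the OpenCL ICD loader plus NVIDIA's OpenCL implementation
pacman -S ocl-icd opencl-nvidia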
granada :
check https://wiki.archlinux.org/index.php/Opencl
The AMD OpenCL solution currently requires Catalyst to be present.
The Gallium compute solution for the open-source radeon driver requires a custom build of mesa.
(if you want to try this one, check lordheavy's mesa-git repo; some details about using OpenCL with that repo: https://bbs.archlinux.org/viewtopic.php?id=170468 )
I very much doubt the combination of the nvidia video driver and catalyst or xf86-video-ati will work, though.
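Whatever combination ends up installed, a quick sanity check is to see what the ICD loader and pyrit can actually enumerate (clinfo is a separate small utility; the output names here are illustrative):
ls /etc/OpenCL/vendors/            # one .icd file per installed OpenCL implementation
clinfo | grep -i "platform name"   # should list one platform per vendor
pyrit list_cores                   # the compute devices pyrit itself will use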

Similar Messages

  • Laptop intel 4000 gpu and nvidia 675m offer same performance?

    In Photoshop... particularly when rendering 3D extrusions, they both take the same time rendering a layer. Obviously I can force Photoshop to use either GPU, and the 675M is a heck of a lot more powerful than the integrated Intel GPU.
    Any special trick to make it render faster using the NVIDIA as opposed to the Intel?

    The concept of a dual GPU laptop is that the embedded Intel GPU comes into play when you are on battery, doing very low-level and general computing, to save battery life.
    In a perfect world, when using higher-intensity programs, such as Ps, the nVidia or AMD/ATI GPU will automatically come into play, giving you greater performance. That is not always the case, but it is what is intended.
    With a program such as Ps, you will not be using both GPUs for any process - it's one or the other.
    Good luck,
    Hunt

  • CC & Dual GPU

    My new build is running a GTX 570 OC; I can pick up a 660 Ti for $200 to add.
    I know SLI was unsupported a while back, but how does CC handle dual GPU support?
    Will it utilize both cards, and unlike SLI, does it matter if they are a matching pair?
    The 660 has 2.5 times the CUDA cores at slightly slower bandwidth, so I think I'll get some good results.
    Does dual GPU in CC require much setup/tinkering, and does it just use both, or make one primary and one secondary?
    Thanks
    Troy

    Troy,
    Dual GPU in CC is trivial to set up and requires no tinkering for the additional card. Of course, this assumes that you have enough power supply, PCIe x16 slots, the MPE hack, and cooling for the additional card(s). You do not need to SLI the cards to get the benefits in Premiere Pro. I did read on this forum that it works with mismatched GPUs, but I've only seen that verified for cards using a common driver. As you probably know, the two cards that you mention do share a common driver from nVidia.
    You need to ask yourself why you are doing this, though. For most rigs, I would expect that with a single decent GTX video card already in place, the only speed gain would be for DVD renders. For most other workflows, other items in the PC would be the limiting resource (CPU power, drive speed, etc.). On the other hand, if you are constantly doing DVD exports of high-def media, then the increased number of CUDA cores will make your world so much better.
    See the following post for my test results on a dual GTX Titan setup, where adding a second video card doubled DVD exports but left pretty much all other performance areas completely unchanged:
    http://forums.adobe.com/message/5588807
    Regards,
    Jim
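    For anyone unfamiliar with the "MPE hack" Jim mentions: in the CS5/CS6 era it generally meant adding the card's name to Premiere's GPU whitelist file (later CC releases reportedly dropped the whitelist entirely). A hedged sketch from an elevated command prompt, with the install path and card name only illustrative:
    cd "C:\Program Files\Adobe\Adobe Premiere Pro CS6"
    GPUSniffer.exe
    rem append the exact name GPUSniffer reports, e.g.:
    echo GeForce GTX 660 Ti>> cuda_supported_cards.txt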

  • Dual GPUs

    Is Aperture able to leverage any advantage from dual GPUs?
    quad G5   Mac OS X (10.4.4)  

    Is Aperture able to leverage any advantage from dual GPUs?
    I think Aperture would benefit to some extent, because while Core Image is using the GPUs, so is the rest of the system. It could also possibly split Core Image work across both GPUs; I do not know enough details about how Core Image uses the video cards to say for sure.
    A stronger video card is always going to help Aperture to some degree, though.

  • ATI & NVIDIA

    Hey everyone, I originally posed this question about a month ago and got mixed results. But now that the Mac Pro has been out for a good month+, maybe someone has found a definite answer. I am looking to use 4 monitors: 2 CRTs, 1 regular LCD screen that needs to be rotated, and 1 HD LCD. I currently have the standard NVIDIA 7300GT and was wondering if I could shift that to the second PCI slot and put an ATI X1900 XT in the first spot. I am concerned about three things: first, whether the Mac will have any driver issues or conflicts that will prevent 4 simultaneous monitors from two separate-brand graphics cards; secondly (the same as the first) but under Windows XP (x64); and finally whether there will be any problems rotating the canvas of one monitor.
    Any insight would be great. Thanks

    Hi,
    First off, I too found little information about multiple monitor setups on the Mac Pro. I have a triple monitor setup (ati + nvidia) and posted some of my comments here: http://discussions.apple.com/thread.jspa?threadID=646226&start=0&tstart=0
    I have not been able to get the nvidia card to work along with the ati card in windows. I can use the 2 monitors hooked up to the ati card, but not the monitor connected to the nvidia card in windows. This means that I can use windows, it just won't work with monitors hooked up to the nvidia card. If anyone has got this working please post!
    Also, I am using Dell LCDs, while another poster in the above-mentioned thread using Apple LCDs said he could not get Windows to run properly (a resolution issue). I have not tested portrait mode, but I don't think it would be a problem.
    I am not currently a Final Cut user, so I cannot comment on those questions but otherwise the 3 monitors work OK on the OS X side. I have only worked with 2d apps and have not had any real issues. I was able to run 3 720p HD videos concurrently (1 in each monitor) with no stuttering.
    Regarding spanning, if your app is really GPU intensive, it would be best to span across the 2 monitors connected to the ati card.
    Regards
      Mac OS X (10.4.8)  

  • Is it possible to connect my 2008 black MacBook to my new 2014 MacBook Pro Retina display to make a dual monitor? If this is possible, what cables would I need to buy?

    Is it possible to connect my 2008 black MacBook to my new 2014 MacBook Pro Retina display to make a dual monitor? If this is possible, what cables would I need to buy?

    You can try using Screen Sharing in the Sharing preferences. Or try a third-party utility such as ScreenRecycler.

  • Dual monitors on nvidia geforce fx 5200

    Hi,
    I'm using an nvidia geforce fx 5200 card that has VGA out, DVI out, and S-Video out. I have one CRT hooked into the VGA (the main monitor), and I have a DVI-to-VGA converter to enable a second VGA CRT monitor. When I try to use nvidia-settings and enable TwinView, I get this message:
    "failed to set metamode (1) 'CRT00: nvidia-auto-select @1024x768 + 1024+0, DFP-0: 1024x768 @1024x768 +0+0' (Mode 2048x768, id: 50) on X screen 0."
    Any ideas? Thanks.

    Hm, not working; it just displays on the true VGA monitor. Are you using a DVI-to-VGA connector as well? Here's my log output; I couldn't make too much of it. I tried to switch CRT-1 to DFP-0, which didn't seem to do the trick either.
    X.Org X Server 1.5.3
    Release Date: 5 November 2008
    X Protocol Version 11, Revision 0
    Build Operating System: Linux 2.6.27-ARCH i686
    Current Operating System: Linux kenny-desktop 2.6.28-ARCH #1 SMP PREEMPT Sun Feb 8 10:13:45 UTC 2009 i686
    Build Date: 17 December 2008 08:20:05PM
    Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    (==) Log file: "/var/log/Xorg.0.log", Time: Fri Feb 13 13:50:20 2009
    (==) Using config file: "/etc/X11/xorg.conf"
    (==) ServerLayout "Layout0"
    (**) |-->Screen "Screen0" (0)
    (**) | |-->Monitor "Monitor0"
    (**) | |-->Device "Videocard0"
    (**) |-->Input Device "Keyboard0"
    (**) |-->Input Device "Mouse0"
    (**) Option "Xinerama" "0"
    (==) Automatically adding devices
    (==) Automatically enabling devices
    (WW) The directory "/usr/share/fonts/Type1" does not exist.
    Entry deleted from font path.
    (==) FontPath set to:
    /usr/share/fonts/misc,
    /usr/share/fonts/100dpi:unscaled,
    /usr/share/fonts/75dpi:unscaled,
    /usr/share/fonts/TTF
    (==) ModulePath set to "/usr/lib/xorg/modules"
    (**) Extension "Composite" is enabled
    (WW) AllowEmptyInput is on, devices using drivers 'kbd' or 'mouse' will be disabled.
    (WW) Disabling Keyboard0
    (WW) Disabling Mouse0
    (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
    (II) No APM support in BIOS or kernel
    (II) Loader magic: 0x81d5fe0
    (II) Module ABI versions:
    X.Org ANSI C Emulation: 0.4
    X.Org Video Driver: 4.1
    X.Org XInput driver : 2.1
    X.Org Server Extension : 1.1
    X.Org Font Renderer : 0.6
    (II) Loader running on linux
    (++) using VT number 7
    (--) PCI:*(0@1:0:0) nVidia Corporation NV34 [GeForce FX 5200] rev 161, Mem @ 0xfd000000/0, 0xe0000000/0, BIOS @ 0x????????/131072
    (--) PCI: (0@3:10:0) Internext Compression Inc iTVC16 (CX23416) MPEG-2 Encoder rev 1, Mem @ 0xf0000000/0
    (II) System resource ranges:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [5] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (II) "extmod" will be loaded. This was enabled by default and also specified in the config file.
    (II) "dbe" will be loaded. This was enabled by default and also specified in the config file.
    (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
    (II) "freetype" will be loaded. This was enabled by default and also specified in the config file.
    (II) "dri" will be loaded by default.
    (II) LoadModule: "dbe"
    (II) Loading /usr/lib/xorg/modules/extensions//libdbe.so
    (II) Module dbe: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    Module class: X.Org Server Extension
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension DOUBLE-BUFFER
    (II) LoadModule: "extmod"
    (II) Loading /usr/lib/xorg/modules/extensions//libextmod.so
    (II) Module extmod: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    Module class: X.Org Server Extension
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension SHAPE
    (II) Loading extension MIT-SUNDRY-NONSTANDARD
    (II) Loading extension BIG-REQUESTS
    (II) Loading extension SYNC
    (II) Loading extension MIT-SCREEN-SAVER
    (II) Loading extension XC-MISC
    (II) Loading extension XFree86-VidModeExtension
    (II) Loading extension XFree86-Misc
    (II) Loading extension XFree86-DGA
    (II) Loading extension DPMS
    (II) Loading extension TOG-CUP
    (II) Loading extension Extended-Visual-Information
    (II) Loading extension XVideo
    (II) Loading extension XVideo-MotionCompensation
    (II) Loading extension X-Resource
    (II) LoadModule: "freetype"
    (II) Loading /usr/lib/xorg/modules/fonts//libfreetype.so
    (II) Module freetype: vendor="X.Org Foundation & the After X-TT Project"
    compiled for 1.5.3, module version = 2.1.0
    Module class: X.Org Font Renderer
    ABI class: X.Org Font Renderer, version 0.6
    (II) Loading font FreeType
    (II) LoadModule: "glx"
    (II) Loading /usr/lib/xorg/modules/extensions//libglx.so
    (II) Module glx: vendor="NVIDIA Corporation"
    compiled for 4.0.2, module version = 1.0.0
    Module class: X.Org Server Extension
    (II) NVIDIA GLX Module 173.14.12 Thu Jul 17 18:36:35 PDT 2008
    (II) Loading extension GLX
    (II) LoadModule: "dri"
    (II) Loading /usr/lib/xorg/modules/extensions//libdri.so
    (II) Module dri: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org Server Extension, version 1.1
    (II) Loading extension XFree86-DRI
    (II) LoadModule: "nvidia"
    (II) Loading /usr/lib/xorg/modules/drivers//nvidia_drv.so
    (II) Module nvidia: vendor="NVIDIA Corporation"
    compiled for 4.0.2, module version = 1.0.0
    Module class: X.Org Video Driver
    (II) NVIDIA dlloader X Driver 173.14.12 Thu Jul 17 18:15:54 PDT 2008
    (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    (II) Primary Device is: PCI 01@00:00:0
    (II) Loading sub module "fb"
    (II) LoadModule: "fb"
    (II) Loading /usr/lib/xorg/modules//libfb.so
    (II) Module fb: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org ANSI C Emulation, version 0.4
    (II) Loading sub module "wfb"
    (II) LoadModule: "wfb"
    (II) Loading /usr/lib/xorg/modules//libwfb.so
    (II) Module wfb: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 1.0.0
    ABI class: X.Org ANSI C Emulation, version 0.4
    (II) Loading sub module "ramdac"
    (II) LoadModule: "ramdac"
    (II) Module "ramdac" already built-in
    (II) resource ranges after probing:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [5] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
    (==) NVIDIA(0): RGB weight 888
    (==) NVIDIA(0): Default visual is TrueColor
    (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    (**) NVIDIA(0): Option "TwinView" "1"
    (**) NVIDIA(0): Option "MetaModes" "CRT-0: 1024x768 +0+0, CRT-1: 1024x768_85 +1024+0; CRT-0: 800x600 +0+0, CRT-1: NULL; CRT-0: 640x480 +0+0, CRT-1: NULL"
    (**) NVIDIA(0): Option "TwinViewXineramaInfoOrder" "CRT-0"
    (**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
    (**) NVIDIA(0): Enabling RENDER acceleration
    (**) NVIDIA(0): TwinView enabled
    (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    (II) NVIDIA(0): enabled.
    (II) NVIDIA(0): NVIDIA GPU GeForce FX 5200 (NV34) at PCI:1:0:0 (GPU-0)
    (--) NVIDIA(0): Memory: 131072 kBytes
    (--) NVIDIA(0): VideoBIOS: 04.34.20.42.00
    (II) NVIDIA(0): Detected AGP rate: 8X
    (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    (--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
    (--) NVIDIA(0): Envision EFT7x0 Series (CRT-0)
    (--) NVIDIA(0): DELL D1025HTX (DFP-0)
    (--) NVIDIA(0): Envision EFT7x0 Series (CRT-0): 350.0 MHz maximum pixel clock
    (--) NVIDIA(0): DELL D1025HTX (DFP-0): 135.0 MHz maximum pixel clock
    (--) NVIDIA(0): DELL D1025HTX (DFP-0): Internal Single Link TMDS
    (II) NVIDIA(0): Display Device found referenced in MetaMode: CRT-0
    (WW) NVIDIA(0): TwinView requested, but only 1 display devices found.
    (II) NVIDIA(0): Assigned Display Device: CRT-0
    (WW) NVIDIA(0): Invalid display device in Mode Description
    (WW) NVIDIA(0): "CRT-1:1024x768_85+1024+0"
    (WW) NVIDIA(0): Not using mode description "CRT-1:1024x768_85+1024+0"; unable
    (WW) NVIDIA(0): to map to display device
    (WW) NVIDIA(0): Invalid display device in Mode Description "CRT-1:NULL"
    (WW) NVIDIA(0): Not using mode description "CRT-1:NULL"; unable to map to
    (WW) NVIDIA(0): display device
    (WW) NVIDIA(0): Invalid display device in Mode Description "CRT-1:NULL"
    (WW) NVIDIA(0): Not using mode description "CRT-1:NULL"; unable to map to
    (WW) NVIDIA(0): display device
    (II) NVIDIA(0): Validated modes:
    (II) NVIDIA(0): "CRT-0:1024x768+0+0,CRT-1:1024x768_85+1024+0"
    (II) NVIDIA(0): "CRT-0:800x600+0+0,CRT-1:NULL"
    (II) NVIDIA(0): "CRT-0:640x480+0+0,CRT-1:NULL"
    (II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
    (--) NVIDIA(0): DPI set to (81, 81); computed from "UseEdidDpi" X config
    (--) NVIDIA(0): option
    (**) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
    (--) Depth 24 pixmap format is 32 bpp
    (II) do I need RAC? No, I don't.
    (II) resource ranges after preInit:
    [0] -1 0 0xffffffff - 0xffffffff (0x1) MX[b]
    [1] -1 0 0x000f0000 - 0x000fffff (0x10000) MX[b]
    [2] -1 0 0x000c0000 - 0x000effff (0x30000) MX[b]
    [3] -1 0 0x00000000 - 0x0009ffff (0xa0000) MX[b]
    [4] -1 0 0x0000ffff - 0x0000ffff (0x1) IX[b]
    [5] -1 0 0x00000000 - 0x00000000 (0x1) IX[b]
    (II) NVIDIA(0): Initialized AGP GART.
    (II) NVIDIA(0): Unable to connect to the ACPI daemon; the ACPI daemon may not
    (II) NVIDIA(0): be running or the "AcpidSocketPath" X configuration option
    (II) NVIDIA(0): may not be set correctly. When the ACPI daemon is
    (II) NVIDIA(0): available, the NVIDIA X driver can use it to receive ACPI
    (II) NVIDIA(0): events. For details, please see the "ConnectToAcpid" and
    (II) NVIDIA(0): "AcpidSocketPath" X configuration options in Appendix B: X
    (II) NVIDIA(0): Config Options in the README.
    (II) NVIDIA(0): Setting mode "CRT-0:1024x768+0+0,CRT-1:1024x768_85+1024+0"
    (II) Loading extension NV-GLX
    (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
    (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
    (==) NVIDIA(0): Backing store disabled
    (==) NVIDIA(0): Silken mouse enabled
    (**) Option "dpms"
    (**) NVIDIA(0): DPMS enabled
    (II) Loading extension NV-CONTROL
    (==) RandR enabled
    (II) Initializing built-in extension MIT-SHM
    (II) Initializing built-in extension XInputExtension
    (II) Initializing built-in extension XTEST
    (II) Initializing built-in extension XKEYBOARD
    (II) Initializing built-in extension XC-APPGROUP
    (II) Initializing built-in extension SECURITY
    (II) Initializing built-in extension XINERAMA
    (II) Initializing built-in extension XFIXES
    (II) Initializing built-in extension RENDER
    (II) Initializing built-in extension RANDR
    (II) Initializing built-in extension COMPOSITE
    (II) Initializing built-in extension DAMAGE
    (II) Initializing built-in extension XEVIE
    (II) Initializing extension GLX
    (II) config/hal: Adding input device Logitech Optical USB Mouse
    (II) LoadModule: "evdev"
    (II) Loading /usr/lib/xorg/modules/input//evdev_drv.so
    (II) Module evdev: vendor="X.Org Foundation"
    compiled for 1.5.3, module version = 2.1.0
    Module class: X.Org XInput Driver
    ABI class: X.Org XInput driver, version 2.1
    (**) Logitech Optical USB Mouse: always reports core events
    (**) Logitech Optical USB Mouse: Device: "/dev/input/event2"
    (II) Logitech Optical USB Mouse: Found 3 mouse buttons
    (II) Logitech Optical USB Mouse: Found x and y relative axes
    (II) Logitech Optical USB Mouse: Configuring as mouse
    (**) Logitech Optical USB Mouse: YAxisMapping: buttons 4 and 5
    (**) Logitech Optical USB Mouse: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    (II) XINPUT: Adding extended input device "Logitech Optical USB Mouse" (type: MOUSE)
    (II) config/hal: Adding input device AT Translated Set 2 keyboard
    (**) AT Translated Set 2 keyboard: always reports core events
    (**) AT Translated Set 2 keyboard: Device: "/dev/input/event1"
    (II) AT Translated Set 2 keyboard: Found keys
    (II) AT Translated Set 2 keyboard: Configuring as keyboard
    (II) XINPUT: Adding extended input device "AT Translated Set 2 keyboard" (type: KEYBOARD)
    (**) Option "xkb_rules" "evdev"
    (**) AT Translated Set 2 keyboard: xkb_rules: "evdev"
    (**) Option "xkb_model" "evdev"
    (**) AT Translated Set 2 keyboard: xkb_model: "evdev"
    (**) Option "xkb_layout" "us"
    (**) AT Translated Set 2 keyboard: xkb_layout: "us"
    (II) config/hal: Adding input device Macintosh mouse button emulation
    (**) Macintosh mouse button emulation: always reports core events
    (**) Macintosh mouse button emulation: Device: "/dev/input/event0"
    (II) Macintosh mouse button emulation: Found 3 mouse buttons
    (II) Macintosh mouse button emulation: Found x and y relative axes
    (II) Macintosh mouse button emulation: Configuring as mouse
    (**) Macintosh mouse button emulation: YAxisMapping: buttons 4 and 5
    (**) Macintosh mouse button emulation: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    (II) XINPUT: Adding extended input device "Macintosh mouse button emulation" (type: MOUSE)
    AUDIT: Fri Feb 13 13:52:42 2009: 3113 X: client 17 rejected from local host ( uid=0 gid=0 pid=3481 )
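    Reading the log above: the driver reports the Envision as CRT-0 and the DELL on the DVI connector as DFP-0, but the MetaModes address the second monitor as CRT-1, so every mode naming CRT-1 is rejected ("Invalid display device in Mode Description") and TwinView falls back to a single display. A hedged guess at a fix (untested; identifiers taken from this log) is to name the second display the way the driver reports it:
    Section "Screen"
        Identifier "Screen0"
        Device     "Videocard0"
        Option     "TwinView"  "1"
        Option     "MetaModes" "CRT-0: 1024x768 +0+0, DFP-0: 1024x768 +1024+0"
    EndSection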

  • Dual-head on NVIDIA

    Hello, I'm trying to configure a dual-head setup (NVIDIA proprietary drivers). I've tried both Xinerama and Twinview, but the screen is always treated as one big virtual screen and window managers spread fullscreen windows over both monitors.
    Here's what I've tried (they all resulted in the above):
    - # nvidia-xconfig
    - These instructions
    - # nvidia-xsettings
    - These instructions (I might've misunderstood them, they were fairly vague)
    In addition, Xinerama seems to disable the randr extension, which makes me unable to use xrandr or arandr, so the preferred option is to use Twinview.
    For clarity: when I open arandr, I see one big, wide screen called "default". How do I set it up so that it shows 2 different screens?

    Separate X screens on the dual monitors (each monitor has its own X screen); no xrandr - only the GUI Nvidia X Server Settings. Here is the generated config:
    # nvidia-settings: X configuration file generated by nvidia-settings
    # nvidia-settings: version 295.40 ([email protected]) Thu Apr 5 22:40:34 PDT 2012
    Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    InputDevice "Keyboard0" "CoreKeyboard"
    InputDevice "Mouse0" "CorePointer"
    Option "Xinerama" "0"
    EndSection
    Section "Files"
    EndSection
    Section "InputDevice"
    Identifier "Mouse0"
    Driver "mouse"
    Option "Protocol" "auto"
    Option "Device" "/dev/psaux"
    Option "Emulate3Buttons" "no"
    Option "ZAxisMapping" "4 5"
    EndSection
    Section "InputDevice"
    Identifier "Keyboard0"
    Driver "kbd"
    EndSection
    Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "FUS SL3220W"
    HorizSync 30.0 - 83.0
    VertRefresh 56.0 - 75.0
    Option "DPMS"
    EndSection
    Section "Monitor"
    Identifier "Monitor1"
    VendorName "Unknown"
    ModelName "NEC LCD22WV"
    HorizSync 31.0 - 83.0
    VertRefresh 56.0 - 76.0
    Option "DPMS"
    EndSection
    Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GTS 250"
    Option "RegistryDwords" "PerfLevelSrc=0x2222"
    BusID "PCI:1:0:0"
    EndSection
    Section "Device"
    Identifier "Device1"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce 9600 GT"
    Option "RegistryDwords" "PerfLevelSrc=0x2222"
    BusID "PCI:2:0:0"
    EndSection
    Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    DefaultDepth 24
    Option "TwinView" "0"
    Option "TwinViewXineramaInfoOrder" "DFP-0"
    Option "metamodes" "nvidia-auto-select +0+0"
    SubSection "Display"
    Depth 24
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen1"
    Device "Device1"
    Monitor "Monitor1"
    DefaultDepth 24
    Option "TwinView" "0"
    Option "TwinViewXineramaInfoOrder" "CRT-0"
    Option "metamodes" "nvidia-auto-select +0+0"
    SubSection "Display"
    Depth 24
    EndSubSection
    EndSection
    Tested with KDE, Xfce, LXDE, Awesome.
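    With two independent X screens like this (Xinerama off), the window manager treats each monitor as its own screen, so fullscreen windows stay on one monitor. A hedged usage example, assuming the screen numbering follows the config above:
    # start a program on the second X screen
    DISPLAY=:0.1 mplayer movie.mkv
    # xrandr works too, but per screen
    xrandr -d :0.1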

  • Dual Xeons Or Dual GPU Premiere cs6?

    Hey,
    Does PR support "dual" processors, GPU or CPU?
    I am willing to upgrade to dual Xeons like the "HD-Juggernaut" from ADK.
    Is it helpful for PR or not, or should I just stick with the i7?

    As Bill said, dual GPU is not a good idea... some previous discussions
    2 cards and 3 monitors http://forums.adobe.com/thread/875252
    -and http://forums.adobe.com/thread/876675
    Dual Card SLI http://forums.adobe.com/thread/872941
    -problem http://forums.adobe.com/thread/872103

  • Graphics Error AMD Radeon HD 7640G/7670M Dual GPU

    Hi,
    I am facing a problem with my HP Pavilion g6-2301ax Notebook PC with an AMD Radeon HD 7640G/7670M Dual GPU (2 GB DDR3 dedicated) running Windows 8 64-bit. I haven't made any changes to the hardware/software, nor have I connected any new hardware.
    I cannot watch videos no matter what program I use, be it VLC, Media Player, etc. The programs simply crash. It also takes considerably longer to boot and restart/shut down than before. Here's a screenshot of the error message...
    Also, the AMD Catalyst Control Centre keeps crashing...
    Before, the graphics control interface was totally different. The so-called "Hydra Grid" wasn't there. The control menu had more options and the ability to customise. Since it is a dual GPU, there was an option to choose whether to use both or just one. This problem started 2-3 days back.
    I am really, really frustrated. Because of this, my work is getting hampered a lot. Also, it doesn't read an external hard drive which has USB 3.0, even though the "safely remove" icon is shown in the task bar. And now, because of this, I can't even back up my files onto the external hard drive in order to Recover or Restore to Factory Settings.
    Somebody help me.
    Warm Regards,
    Zamyang.

    Hi @cheemsay ,
    Welcome to the HP Forums!
    It is a dynamite  place to find answers and ideas!
    For you to have the best experience in the HP forum I would like to direct your attention to the HP Forums Guide Learn How to Post and More
    I understand that you are unable to play videos regardless of what player you try. You have not made any physical changes to your notebook or added any external hardware.
    When you connect your external drive you get the safely remove icon, but the drive does not install.
    If you look in Device Manager, does the external drive have any bangs or errors on it? Have you checked the manufacturer's site for a driver for it?
    If you check Disk Management, does it have a drive letter assigned to it?
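    A hedged way to check and assign a drive letter from an elevated command prompt (the volume number is illustrative; use the one shown for your external drive):
    diskpart
    DISKPART> list volume
    DISKPART> select volume 3
    DISKPART> assign letter=E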
    Here is a link to Testing for Hardware Failures (Windows 8) that may help determine the cause.
    You state this happened only 2-3 days ago and you cannot do a restore as the external drive is not being seen correctly, but you should be able to do a recovery back to factory condition.
    Do you know if updates were automatically installed?
    Here is a link to Performing an HP system recovery (Windows 8) that will guide you through the recovery process.
    I hope this helps!
    Sparkles1
    I work on behalf of HP

  • Can I run CC on an Intel Pentium G2030 3.00GHz dual core, NVIDIA GeForce GT610 1GB graphics and a Viglen Vig642M motherboard?

    Can I run CC on an Intel Pentium G2030 3.00GHz dual core, NVIDIA GeForce GT610 1GB graphics and a Viglen Vig642M motherboard?
    I also have 8GB DDR3 RAM and a 1TB HDD, and it's on Windows 7. Is the allocated graphics card a problem?
    P

    Thanks John,
    I've done this, but couldn't gather clear information about whether the graphics card was suitable for Premiere CC. Using CPU Benchmark, it seemed to me like it would be suitable. It's unlikely we would use After Effects CC, but it would be handy if it worked.
    Also, what about the G2030 dual core? Can you recommend how I find out if this would work, as the site states the requirement to be a "Dual Core2"... does this mean the software (Premiere CC, Photoshop CC, InDesign CC etc.) would not work at all?
    Any guidance helpful.
    P

  • Cannot use GPU acceleration with 64 bit CS5 under 64 bit Windows 7 with two cards ATI & nVidia

    Subject sums up the problem, but here are the details:
    My ATI is a 5850, and I have the 10.6 drivers installed. That's my primary card, connected to my main monitor.
    The second card is an nVidia GTS 250 connected to my second monitor, with 258.96 drivers.
    CS5 does not detect my GPU(s) and hence does not enable GPU acceleration. It says:
    "Graphics hardware acceleration is unavailable. You will need to upgrade your video driver and possibly your video card" - both of which are obviously wrong, since I have the latest CS5-supporting drivers installed for two cards that can run GPGPU applications!
    Any suggestions?
    Thank you.

    nzkiwi wrote:
    I'm curious about whether the requirement to support HLSL Shader Model 3.0 is a big part of what is holding back GPU acceleration support for ATI cards in Adobe products.  It seems that HLSL Shader Model 3.0 is NVidia/Intel proprietary, whereas GLSL is an open standard like OpenGL.
    What ATI support do you find being held back?  Just curious.  I ask because I have a 2 year old ATI 4670 card that supports Photoshop nicely.  The Catalyst software seems to have embraced whatever's necessary to run Photoshop.*
    -Noel
    * That said, I see a specific bug where sometimes the 32 bit drivers on my x64 system will not be recognized by Photoshop 32 bit as OpenGL-capable at Photoshop startup.  I've reported it to ATI, and they seem to have taken it seriously, so I expect a fix in a future version.  The 64 bit drivers always satisfy Photoshop's needs, so I'm sure it's not a design deficiency but just a lil' ol' bug.

  • Acer V5 753g (Intel Haswell / Nvidia GT 750m dual GPU) Bumblebee

    I think I am fighting several issues here, although I might have fixed one already. Up to this point I have managed to disable the nvidia GPU using the proprietary driver and bbswitch, but bumblebee is not working.
    Okay, first: I could isolate an issue with ACPI by following the instructions in this bug report: https://github.com/Bumblebee-Project/Bu … issues/460. To be exact, it made me install a kernel module; the code and install instructions are described here: https://github.com/Bumblebee-Project/bb … ack-lenovo (after the bug report, the author of this code was generous enough to add my model to the source, so modifications weren't necessary any more).
    Now after a reboot the GPU is disabled, but as soon as I run optirun (which fails) I am not able to disable it again.
    I know the exact same issue is described in the bumblebee page of the Arch wiki, but none of the workarounds fit my situation.
    bbswitch dmesg output
    dmesg | grep bbswitch
    [   16.608903] bbswitch: version 0.7
    [   16.608910] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.GFX0
    [   16.608917] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.RP05.PEGP
    [   16.609010] bbswitch: detected an Optimus _DSM function
    [   16.609063] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on
    [   16.610310] bbswitch: disabling discrete graphics
    [   78.445044] bbswitch: enabling discrete graphics
    Optirun output:
    optirun -vv glxspheres
    [ 1166.063193] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
    [ 1166.063481] [DEBUG]optirun version 3.2.1 starting...
    [ 1166.063488] [DEBUG]Active configuration:
    [ 1166.063491] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
    [ 1166.063494] [DEBUG] X display: :8
    [ 1166.063497] [DEBUG] LD_LIBRARY_PATH: /usr/lib/nvidia:/usr/lib32/nvidia
    [ 1166.063500] [DEBUG] Socket path: /var/run/bumblebee.socket
    [ 1166.063503] [DEBUG] Accel/display bridge: auto
    [ 1166.063505] [DEBUG] VGL Compression: proxy
    [ 1166.063508] [DEBUG] VGLrun extra options:
    [ 1166.063511] [DEBUG] Primus LD Path: /usr/lib/primus:/usr/lib32/primus
    [ 1166.063527] [DEBUG]Using auto-detected bridge virtualgl
    [ 1166.080418] [INFO]Response: No - error: [XORG] (EE) No devices detected.
    [ 1166.080441] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
    [ 1166.080445] [DEBUG]Socket closed.
    [ 1166.080469] [ERROR]Aborting because fallback start is disabled.
    [ 1166.080475] [DEBUG]Killing all remaining processes.
    Xorg error log:
    [  1166.066]
    X.Org X Server 1.14.4
    Release Date: 2013-10-31
    [  1166.066] X Protocol Version 11, Revision 0
    [  1166.066] Build Operating System: Linux 3.11.6-1-ARCH x86_64
    [  1166.066] Current Operating System: Linux acer-joschka 3.11.6-1-ARCH #1 SMP PREEMPT Fri Oct 18 23:22:36 CEST 2013 x86_64
    [  1166.066] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=9b2b7068-afcf-43a7-a6ac-7fc82d88e472 rw acpi_backlight=vendor
    [  1166.066] Build Date: 01 November 2013  05:10:48PM
    [  1166.066] 
    [  1166.066] Current version of pixman: 0.30.2
    [  1166.066]     Before reporting problems, check http://wiki.x.org
        to make sure that you have the latest version.
    [  1166.066] Markers: (--) probed, (**) from config file, (==) default setting,
        (++) from command line, (!!) notice, (II) informational,
        (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [  1166.066] (==) Log file: "/var/log/Xorg.8.log", Time: Mon Nov  4 14:34:19 2013
    [  1166.066] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
    [  1166.066] (++) Using config directory: "/etc/bumblebee/xorg.conf.d"
    [  1166.066] (==) ServerLayout "Layout0"
    [  1166.066] (==) No screen section available. Using defaults.
    [  1166.066] (**) |-->Screen "Default Screen Section" (0)
    [  1166.066] (**) |   |-->Monitor "<default monitor>"
    [  1166.066] (==) No device specified for screen "Default Screen Section".
        Using the first device section listed.
    [  1166.066] (**) |   |-->Device "DiscreteNvidia"
    [  1166.066] (==) No monitor specified for screen "Default Screen Section".
        Using a default monitor configuration.
    [  1166.066] (**) Option "AutoAddDevices" "false"
    [  1166.066] (**) Option "AutoAddGPU" "false"
    [  1166.066] (**) Not automatically adding devices
    [  1166.066] (==) Automatically enabling devices
    [  1166.066] (**) Not automatically adding GPU devices
    [  1166.066] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
    [  1166.066]     Entry deleted from font path.
    [  1166.066] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
    [  1166.066]     Entry deleted from font path.
    [  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
    [  1166.066]     Entry deleted from font path.
    [  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
    [  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
    [  1166.066]     Entry deleted from font path.
    [  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
    [  1166.066] (==) FontPath set to:
        /usr/share/fonts/misc/,
        /usr/share/fonts/TTF/
    [  1166.066] (++) ModulePath set to "/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules"
    [  1166.066] (==) |-->Input Device "<default pointer>"
    [  1166.066] (==) |-->Input Device "<default keyboard>"
    [  1166.066] (==) The core pointer device wasn't specified explicitly in the layout.
        Using the default mouse configuration.
    [  1166.066] (==) The core keyboard device wasn't specified explicitly in the layout.
        Using the default keyboard configuration.
    [  1166.067] (II) Loader magic: 0x7fdc20
    [  1166.067] (II) Module ABI versions:
    [  1166.067]     X.Org ANSI C Emulation: 0.4
    [  1166.067]     X.Org Video Driver: 14.1
    [  1166.067]     X.Org XInput driver : 19.1
    [  1166.067]     X.Org Server Extension : 7.0
    [  1166.067] (II) xfree86: Adding drm device (/dev/dri/card0)
    [  1166.067] setversion 1.4 failed
    [  1166.067] (II) xfree86: Adding drm device (/dev/dri/card1)
    [  1166.068] (--) PCI:*(0:1:0:0) 10de:0fe4:1025:079b rev 161, Mem @ 0xb2000000/16777216, 0xa0000000/268435456, 0xb0000000/33554432, I/O @ 0x00003000/128
    [  1166.068] Initializing built-in extension Generic Event Extension
    [  1166.068] Initializing built-in extension SHAPE
    [  1166.068] Initializing built-in extension MIT-SHM
    [  1166.068] Initializing built-in extension XInputExtension
    [  1166.068] Initializing built-in extension XTEST
    [  1166.068] Initializing built-in extension BIG-REQUESTS
    [  1166.068] Initializing built-in extension SYNC
    [  1166.068] Initializing built-in extension XKEYBOARD
    [  1166.068] Initializing built-in extension XC-MISC
    [  1166.068] Initializing built-in extension SECURITY
    [  1166.068] Initializing built-in extension XINERAMA
    [  1166.068] Initializing built-in extension XFIXES
    [  1166.068] Initializing built-in extension RENDER
    [  1166.068] Initializing built-in extension RANDR
    [  1166.068] Initializing built-in extension COMPOSITE
    [  1166.068] Initializing built-in extension DAMAGE
    [  1166.068] Initializing built-in extension MIT-SCREEN-SAVER
    [  1166.068] Initializing built-in extension DOUBLE-BUFFER
    [  1166.068] Initializing built-in extension RECORD
    [  1166.068] Initializing built-in extension DPMS
    [  1166.068] Initializing built-in extension X-Resource
    [  1166.068] Initializing built-in extension XVideo
    [  1166.068] Initializing built-in extension XVideo-MotionCompensation
    [  1166.068] Initializing built-in extension XFree86-VidModeExtension
    [  1166.068] Initializing built-in extension XFree86-DGA
    [  1166.068] Initializing built-in extension XFree86-DRI
    [  1166.068] Initializing built-in extension DRI2
    [  1166.068] (II) LoadModule: "glx"
    [  1166.068] (II) Loading /usr/lib/nvidia/xorg/modules/extensions/libglx.so
    [  1166.078] (II) Module glx: vendor="NVIDIA Corporation"
    [  1166.078]     compiled for 4.0.2, module version = 1.0.0
    [  1166.078]     Module class: X.Org Server Extension
    [  1166.078] (II) NVIDIA GLX Module  325.15  Wed Jul 31 18:12:00 PDT 2013
    [  1166.078] Loading extension GLX
    [  1166.078] (II) LoadModule: "nvidia"
    [  1166.078] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [  1166.078] (II) Module nvidia: vendor="NVIDIA Corporation"
    [  1166.078]     compiled for 4.0.2, module version = 1.0.0
    [  1166.078]     Module class: X.Org Video Driver
    [  1166.078] (II) LoadModule: "mouse"
    [  1166.078] (II) Loading /usr/lib/xorg/modules/input/mouse_drv.so
    [  1166.078] (II) Module mouse: vendor="X.Org Foundation"
    [  1166.078]     compiled for 1.14.0, module version = 1.9.0
    [  1166.078]     Module class: X.Org XInput Driver
    [  1166.078]     ABI class: X.Org XInput driver, version 19.1
    [  1166.078] (II) LoadModule: "kbd"
    [  1166.078] (WW) Warning, couldn't open module kbd
    [  1166.078] (II) UnloadModule: "kbd"
    [  1166.078] (II) Unloading kbd
    [  1166.078] (EE) Failed to load module "kbd" (module does not exist, 0)
    [  1166.078] (II) NVIDIA dlloader X Driver  325.15  Wed Jul 31 17:50:57 PDT 2013
    [  1166.078] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [  1166.078] (--) using VT number 7
    [  1166.078] (EE) No devices detected.
    [  1166.078] (EE)
    Fatal server error:
    [  1166.078] (EE) no screens found(EE)
    [  1166.079] (EE)
    Please consult the The X.Org Foundation support
         at http://wiki.x.org
    for help.
    [  1166.079] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
    [  1166.079] (EE)
    bumblebee.conf (I didn't change anything here):
    # Configuration file for Bumblebee. Values should **not** be put between quotes
    ## Server options. Any change made in this section will need a server restart
    # to take effect.
    [bumblebeed]
    # The secondary Xorg server DISPLAY number
    VirtualDisplay=:8
    # Should the unused Xorg server be kept running? Set this to true if waiting
    # for X to be ready is too long and don't need power management at all.
    KeepUnusedXServer=false
    # The name of the Bumbleblee server group name (GID name)
    ServerGroup=bumblebee
    # Card power state at exit. Set to false if the card shoud be ON when Bumblebee
    # server exits.
    TurnCardOffAtExit=false
    # The default behavior of '-f' option on optirun. If set to "true", '-f' will
    # be ignored.
    NoEcoModeOverride=false
    # The Driver used by Bumblebee server. If this value is not set (or empty),
    # auto-detection is performed. The available drivers are nvidia and nouveau
    # (See also the driver-specific sections below)
    Driver=
    # Directory with a dummy config file to pass as a -configdir to secondary X
    XorgConfDir=/etc/bumblebee/xorg.conf.d
    ## Client options. Will take effect on the next optirun executed.
    [optirun]
    # Acceleration/ rendering bridge, possible values are auto, virtualgl and
    # primus.
    Bridge=auto
    # The method used for VirtualGL to transport frames between X servers.
    # Possible values are proxy, jpeg, rgb, xv and yuv.
    VGLTransport=proxy
    # List of paths which are searched for the primus libGL.so.1 when using
    # the primus bridge
    PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
    # Should the program run under optirun even if Bumblebee server or nvidia card
    # is not available?
    AllowFallbackToIGC=false
    # Driver-specific settings are grouped under [driver-NAME]. The sections are
    # parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
    # detection resolves to NAME).
    # PMMethod: method to use for saving power by disabling the nvidia card, valid
    # values are: auto - automatically detect which PM method to use
    #         bbswitch - new in BB 3, recommended if available
    #       switcheroo - vga_switcheroo method, use at your own risk
    #             none - disable PM completely
    # https://github.com/Bumblebee-Project/Bu … PM-methods
    ## Section with nvidia driver specific options, only parsed if Driver=nvidia
    [driver-nvidia]
    # Module name to load, defaults to Driver if empty or unset
    KernelDriver=nvidia
    PMMethod=auto
    # colon-separated path to the nvidia libraries
    LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia
    # comma-separated path of the directory containing nvidia_drv.so and the
    # default Xorg modules path
    XorgModulePath=/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules
    XorgConfFile=/etc/bumblebee/xorg.conf.nvidia
    ## Section with nouveau driver specific options, only parsed if Driver=nouveau
    [driver-nouveau]
    KernelDriver=nouveau
    PMMethod=auto
    XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
    xorg.conf.nvidia (I placed my GPU device address in the Device section):
    Section "ServerLayout"
        Identifier  "Layout0"
        Option      "AutoAddDevices" "false"
        Option      "AutoAddGPU" "false"
    EndSection
    Section "Device"
        Identifier  "DiscreteNvidia"
        Driver      "nvidia"
        VendorName  "NVIDIA Corporation"
    #   If the X server does not automatically detect your VGA device,
    #   you can manually set it here.
    #   To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
    #   as you see in the commented example.
    #   This Setting may be needed in some platforms with more than one
    #   nvidia card, which may confuse the proprietary driver (e.g.,
    #   trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
         BusID "PCI:01:00.0"
    #   Setting ProbeAllGpus to false prevents the new proprietary driver
    #   instance spawned to try to control the integrated graphics card,
    #   which is already being managed outside bumblebee.
    #   This option doesn't hurt and it is required on platforms running
    #   more than one nvidia graphics card with the proprietary driver.
    #   (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
    #   If this option is not set, the new Xorg may blacken the screen and
    #   render it unusable (unless you have some way to run killall Xorg).
        Option "ProbeAllGpus" "false"
        Option "NoLogo" "true"
        Option "UseEDID" "false"
        Option "UseDisplayDevice" "none"
    EndSection
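    One hedged thing to try when the card will not power off again after a failed optirun: drive bbswitch by hand through the /proc interface it documents (this only works while nothing holds the nvidia kernel module):
    cat /proc/acpi/bbswitch          # reports 0000:01:00.0 ON or OFF
    lsmod | grep nvidia              # the module must be unloaded first
    modprobe -r nvidia
    echo OFF > /proc/acpi/bbswitch   # power the discrete card back down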

    Your issue seems to be similar to what I had to face with the Razer Blade 14" (similar hardware as well).
    Go to https://bbs.archlinux.org/viewtopic.php?id=173356 and see the instructions there to get graphics working; see if it applies (summarized: run linux-ck and set rcutree.rcu_idle_gp_delay=2 in the kernel parameters).
    I haven't tried undoing this in more recent updates because I haven't had the time to mess with it - it is possible that it has been fixed, although the thread following my (and possibly your) issue at nvidia doesn't give any indication of that:
    https://devtalk.nvidia.com/default/topi … iver-crash
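    For reference, a hedged sketch of setting that kernel parameter on a GRUB system (paths are the usual Arch defaults):
    # in /etc/default/grub
    GRUB_CMDLINE_LINUX_DEFAULT="quiet rcutree.rcu_idle_gp_delay=2"
    # then regenerate the config and reboot
    grub-mkconfig -o /boot/grub/grub.cfg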

  • ATI X800 XT 512MB  on MDD dual boot dual 1.25Ghz.... possible?

    Hi,
    Just looking to see if I can install a Mac version of the ATI X800 XT 512MB in a G4 MDD dual-boot, dual 1.25GHz machine.
    If so... great... I'll go pick it up straight away.
    Thanks
    PS. I know it won't accelerate 9.2.2, but I already have the 9200 PCI card to accommodate that.

    Just following up on japamac's excellent response.
    ADC presents a serious limitation on cards, since the only two retail cards (no flashing, no fuss) on the market are the ATI 9600 and 9800, both without ADC.
    I use a modified 9600 Pro from a G5, which has an unpowered ADC port - unpowered because the ADC tab doesn't align, although some people have modified the card further to get power to the ADC port. For info on both of those mods, see:
    http://www.pistolerapost2.com/viewing/96002g4.html
    There is one last card - extremely rare - I can think of with ADC and more oomph than the retail 9600: the Apple OEM 9700. That was a BTO option for the MDD. Here's some info on that card:
    http://www.xlr8yourmac.com/Graphics/Radeon9700PROOEM/index.html
    Again, it's really rare because it wasn't available even as a BTO option for very long. They fetch a premium on the used market when you find them.
    What ADC monitor do you have? I hope an LCD, in which case something to consider is the Apple ADC-DVI adapter, which will set you back $100. It's basically a kind of splitter, with a brick supplying power to the ADC monitor while passing the signal from the DVI port. Yeah, this adds to your costs, but it gives you more options in terms of cards.
    One more thing about the 9.x acceleration you mentioned: with ATI cards, you might need to pull an extension or library from the Extensions folder to keep 9.x from stalling at boot. I had to do that with my 9600. Sorry, I'm on my MBP now, so I can't tell you which ones. Let me know if you need/want to know which ones I pulled.

  • Is a dual monitor setup with 2 graphics cards possible?

    Hi.
    What I want to do is run Arch Linux on my desktop, but I don't want to lose my second monitor.
    I could use both monitors on my nvidia graphics card, but I want to use the energy-saving features too (which don't work with 2 monitors on one card); that's why my Windows setup is the big monitor on the nvidia card and the smaller one on the Intel onboard card.
    They both get recognized by lspci, but only the onboard card is being used (the onboard card is the first card on Windows too).
    So is it possible to get this to work? I'd really like to use Arch Linux on my desktop too, but I don't want to use it with one screen or without energy saving.
    Working compositing is a must; I want to use GNOME 3, which is the only reason to switch to Linux for the desktop (Arch has been running on the laptop for ages^^), but I don't have to switch - Win7 is working flawlessly and I didn't have to pay for it (MSDNAA).
    Thanks in advance!

    Why do you need to clock down your card?
    Most GPUs have a function that clocks the card down at idle and/or on the desktop.
    In my case that's down from 850MHz (560 Ti) to 200MHz while using Windows, and 50MHz if the GPU load is low.
    That's about 20 watts less than at full clock.
    The problem is that the GPU won't clock down with 2 monitors on it; it stays at 850MHz.
    There are some driver tweaks for AMD cards which "fix" this (I believe it's intentional: 2 monitors means more load, but it's no problem for new cards...).
    That's one reason I use my onboard GPU for the second monitor; it works well with Windows and also means that switching out of fullscreen (games) to the desktop works a lot better - the second monitor can play a movie while I'm playing, which doesn't work with one GPU and fullscreen.
    I really don't need that with Linux, since I use it for work and multimedia only, but the energy-saving feature is nice when the PC is on for 8 hours or more, which is often the case.
    Can you tell which one you run? I'm going to buy one for a dual-monitor setup, and I'm looking for something that will work for sure.
    As said, dual monitor on one GPU works; AMD can do 3 monitors while nvidia can do 2 at once, so if you intend to use 3 monitors some day with your card, you have to buy an AMD card.
    I took the nvidia because I can use it to render videos with CUDA support (it's not working with all tools yet, but it's getting better, and when it works it's a lot faster than my CPU rendering).
    PS: That's one thread about the power saving with 2 monitors: http://forums.nvidia.com/index.php?showtopic=197557
    Since my monitors don't share the same resolution, it seems there's no way to make it work.
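    To see whether the discrete card actually drops its clocks with a given monitor layout, nvidia-smi (shipped with the proprietary driver) can report them; a hedged example:
    nvidia-smi -q -d CLOCK   # current graphics/memory clocks
    nvidia-smi -q -d POWER   # board power draw, where the card supports it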
