Dual graphics cards in late 2008 Mac Pro?

I've just bought an 8-core (Harpertown) with the ATI HD 2600 card and am finding its 256MB of video RAM a problem compared to my 4-core with the ATI X1900 XT and its 512MB of video memory (some filters I originally created on the 4-core with the X1900 won't render on the 8-core's HD 2600).
I use the 8-core primarily for grading with COLOR and online work with FCP.
I'd love to buy a new ATI X1900 XT, but it's expensive, and I want to know if anybody has any experience or input on the following possible scenarios:
2 x ATI HD 2600 cards
or
ATI 3870 card
cheers
Llewelyn

Barefeats compared the various cards, 2600, 8800, 3870, and X1900.
3870:
http://www.barefeats.com/harper16.html
http://eshop.macsales.com/Item_XLR8YourMac.cfm?ID=10876&Item=ATI100435928
http://www.barefeats.com/harper8.html
http://www.barefeats.com/harper15.html
Does 10.5.6 make a difference?
http://www.barefeats.com/harper21.html
I can only see a case for 2 x 2600 where you have multiple monitors.
Also, I thought there were some problems with the 2008 model and the X1900 anyway (and who would really want to pay $399 for 3-plus-year-old technology?).

Similar Messages

  • Graphics Card Upgrade with 2008 MacPro?

    Hello: I have a sturdy, workhorse 2008 MacPro (6 GB RAM). I need to confirm whether I can/need to upgrade the graphics card that came with the original box (Chipset Model: ATI Radeon HD 2600). I am buying a NEC PA271W-BK 27" monitor and an X-Rite ColorMunki, whose specs say I need "1024 x 768 or higher display resolution with 16-bit video card." (http://www.bhphotovideo.com/c/product/798928-REG/X_Rite_CMUNDIS_ColorMunki_Display.html)
    Thoughts? (I would prefer not to buy a new MacPro yet, as this machine likely has at least another year in it.) Thanks.

    RE: Mac Pro silver tower (2006-2012) Replacement Graphics cards
    1) Apple brand cards,
    2) "sold in the Apple store" cards, and
    3) "Mac Edition" cards ...
    ... show all the screens, including Boot up screens, Safe Mode, Installer, Recovery, debug screens, and Alt/Option boot screens. At this writing, these choices include:
    1) Apple brand cards:
    • Apple-firmware 5770, about US$250** works near full speed in every model Mac Pro, Drivers in 10.6.5 [BUT! This card has just been discontinued and is no longer available in the Apple Store, or at many Apple Resellers]
    • Apple-firmware 5870, about US$450
    2) "sold in the Apple store" cards
    • NVIDIA Quadro 4000, about US$1200
    • NVIDIA Quadro 5000, about US$2500
    3) "Mac Edition" cards -- REQUIRE 10.8.3 or later:
    • SAPPHIRE HD 7950 3GB GDDR5 MAC Edition, about US$480** Vendor recommends Mac Pro 4,1
    • EVGA GTX 680 Mac Edition, about US$600
    The cards above require no more than the two 6-pin aux power connectors provided in Mac Pro models through 2012. Aux cables may not be included with third-party cards, but they are readily available.
    If you meet ALL of these:
    • running 10.8.3 or later, AND
    • don't care about "no boot screens" etc., AND
    • can re-wire or otherwise "work out" the power cabling, THEN:
    you can use many more cards, even most "PC-only" cards.
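    Before ordering anything, it's worth confirming what the machine currently reports. A quick check from Terminal, using the stock system_profiler tool (output labels vary a little between OS X versions):
    # list graphics cards, VRAM, and attached displays as OS X sees them
    system_profiler SPDisplaysDataType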

  • Dual graphics cards and single 4K display - Mac Pro (late 2013)

    I'm considering a new Mac Pro as my next graphics/development workstation. I see that the new Mac Pros have dual graphics cards (i.e., dual D300, D500, or D700). I was wondering if there is any benefit to having the second graphics card if I have only one 4K display connected via DisplayPort 1.2. Would the two cards work in tandem to output to the single 4K display? Can the second GPU still be used for OpenCL, even if it is not connected to a display?
    Thanks

    There is no option to have only one Dxxx card; yes, some apps do make use of the second GPU (OpenCL included); and no, the number of displays connected does not affect video output.
    Whether it is right for you versus, say, a maxed-out iMac or something else is something you should continue to research and consider.
    www.barefeats.com and www.macperformanceguide.com, as well as other forums and sites, are good places to look.

  • Photoshop CS6 has problems with my dual graphics cards

    I'm currently using Photoshop CS6 on my laptop (HP Pavilion dv4 3114), and this laptop has dual graphics cards (one is Mobile Intel HD Graphics, the other an ATI HD 6750). It usually switches to the faster one when games are running, or for software that requires a lot of graphics work. But this time, even though I have already set the graphics card profile to let Photoshop use the faster one, when I launch Photoshop CS6 it tells me that the graphics card is not officially supported, as you can see in the screenshot.
    Then I turned to Photoshop's Preferences (Performance) and found it only identifies the Mobile Intel HD Graphics. I'm currently using Photoshop for designing large posters, so performance is very important for me. Does anyone know how to solve this problem so that Photoshop can run with the highest performance?

    I can't imagine how these computer companies expect to fully integrate GPUs from two different vendors, but they're trying. To me, knowing a bit about how display drivers are implemented, it seems an impossible task.
    I have heard (in the context of Macs that have this option) that disabling the low-power options, so that only the more powerful GPU is active, can help. I don't know if that applies to your system, but it might give you an idea of where to start looking.
    Another thing might be to look for whatever settings it offers for "affinity" and set Photoshop and its cousins to use just the ATI GPU exclusively. It might be the "switchover" process that is causing the problems.
    Good luck.
    -Noel

  • How to get dual graphics cards to work together?

    I have an L-series Satellite with dual graphics cards that won't work together. It's an AMD 3400M APU with integrated HD 6520G graphics, and a discrete 6400M graphics card.
    The 6400M never turns on. I have used AMD System Monitor to see if it is ever used, but it never is.
    Even when the 6520G is at full load, it won't turn on.
    Any suggestions would be helpful.
    Thanks.

    Hi,
    As far as I know this topic has already been discussed here in the forum, and switching between the internal and external (ATI) GPUs is not supported.
    Only the Optimus technology designed by NVIDIA is supported,
    so you can switch between an Intel GPU and an NVIDIA one (if both support Optimus).

  • Dual graphics card in M92 Tower?

    Hello,
    I have a manager here requesting 3 monitors for each of their department's workstations (Lenovo M92 Tower, Intel i5), and I am just wondering if the M92 supports dual graphics cards. I am aware that it only has a PCIe x1 and a PCIe x16 slot; the plan is to install two x1 cards, if they are supported.
    Could someone confirm whether this will work?
    Thanks!

    First, I have no experience at all with more than two monitors on any PC.
    However... AMD's EyeFinity setup can support three (or more) monitors using the multiple connectors on a usable video card.  For example, the relatively inexpensive AMD R7 250 card comes in both full-size and low-profile varieties, to fit inside an M92p case (either mid-tower or SFF) in the PCIe x16 slot (which is what you want to use).  The low-profile R7 250 DDR5 card from Sapphire (which includes both low-profile and full-size brackets in the retail box, so you can use the card in either M92p case size) has three digital connectors on it: DVI, miniDP, and microHDMI.  So you can connect three monitors to them. The retail box also includes adapters for the microHDMI and miniDP connectors, but I'd personally just buy a proper 2-ended straight-through cable (say from Monoprice or other online source) with the correct connector at each end to connect each of your three monitors. I'm not a fan of adapters myself, I'd prefer a suitable cable with the correct connectors at each end.
    According to the EyeFinity setup instructions, you use two connectors to go to two monitors, and the third (or more) monitors must connect on the DisplayPort path.  In your case you only have three monitors, so you just use all three digital connectors on the R7 250 and you're done!  No need to have two graphics cards, and you get high-performance DDR5 as you want, on three monitors.
    Again... I've never done this myself, but EyeFinity for three or more monitors is standard technology given an AMD card like the R7 250 sitting in a single PCIe x16 slot.

  • HP Envy H8-1540t - dual graphics cards, PCIe 3.0 support

    I have 2 questions about using dual graphics cards and PCIe 3.0 support in my machine.
    The machine is:
    HP Envy H8-1540t
    Win 8 64
    i7-3820 2nd Gen
    10 GB DDR3 1600
    256 GB SSD
    1 TB 7200 rpm SATA hard drive
    Blu-Ray Player
    TV Tuner
    Premium Wireless-N LAN and Bluetooth 2x2
    600 Watt PSU
    NVidia GTX 660
    This machine uses the Pegatron IPIWB-PB motherboard and has 2 PCIe x16 slots. I realize that with dual-width GPUs like the GTX 660, the smaller PCIe slots next to them will be buried, rendering them useless. So these are my 2 questions:
    1.) Will 2 NVIDIA GTX 660 GPUs physically fit within the machine, and are they supported?
    2.) Does this motherboard, with its Intel X79 chipset, support PCIe 3.0?
    Thank You

    Hi,
    You can find ALL the information at the following OFFICIAL link; it looks like the board only supports PCIe Gen 2.0:
       http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&lc=en&dlc=en&docname=c03132964#N241
    From the motherboard image at that link and the following image of the GTX 660, the simple answer is: 2 cards won't physically fit.
       http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660/product-images
    Regards.
    BH

  • GeForce GTX 680 Mac Edition graphics card advice wanted for 2008 Mac Pro eight-core desktop

    I have read some threads, yet it is not clear to me whether it is a good move to upgrade a Mac Pro eight-core 3,1 (2008) to a GeForce GTX 680 Mac Edition graphics card. I ask because my 2008 GeForce 8800 GT card has given up the ghost after sterling service, R.I.P. The machine is in the repair shop for a service itself, and this problem. So I thought I'd upgrade while I'm at it. Daunting choice out there. Um, learning curve and all that!
    I mainly do 2D graphics (Canvas, Adobe stuff), yet a little amount of video, which may grow some. No games on this workhorse!
    So here I am fishing for experienced advice where my own is lacking, from anyone who has done this. Has it held up? Do you get a boot screen? Does it run two monitors easily via DVI?
    If you have one, or something good fitted, please share the skinny!
    For me it is turning into the year of the video card. My laptop looks like it got the disease first!
    Thanks.

    #1: AMD 7950 Mac Edition, or not. $425
    #2: GTX 680 Mac $625
    #3: MacVidCards - various but worth a look
    http://www.ebay.com/sch/macvidcards/m.html
    #4: GTX 285 - no idea how 'safe', as I've often read of someone (not MacVidCards) flashing a PC card and selling it as "Mac". They are also old and may not be supported in Mavericks (which would apply to an 8800 too).
    Information purposes only:
    http://www.amazon.com/EVGA-Geforce-Gtx285-Graphics-Pci-express/dp/B00G9TQMOE/
    Check out Barefeats, again, for ideas if you want 'near best.'
    A new GPU is $5,000 cheaper than an overhaul and a new Mac Pro.
    A 2008, though, will never perform as well as later models; ideal is 2010 or later.
    Consider even a top-of-the-line iMac with a GTX if you are having a lot of trouble and are thinking of ditching the 2008.

  • G5 Dual Graphics card

    I have a G5 Dual (PCI-X, I believe) as well as a G5 Quad (PCIe).
    I've just replaced the 6500 graphics card in the Quad; can I put it in the Dual?

    The G5 dual will have an AGP graphics card slot, I'm afraid.
    That's presuming it isn't a dual-core - "Late 2005"
    http://support.apple.com/specs/powermac/PowerMac_G5_Late2005.html

  • Replacing AirPort Extreme Card in Late 2008 MBP - What to buy?

    I've limped along for about 2 years now with my wifi constantly dropping while everyone else has a strong, constant signal. Doesn't matter where I am, it's always dropping and takes ages to reconnect. I've tried everything advised on the forum etc and am now about to buy something else as my main machine. I don't want to get rid of my MBP though so am planning on installing new memory and hard drive and while I'm doing that, want to replace the airport extreme card. The MBP will then become my secondary machine.
    Problem is, I can't figure out what I need to buy to replace the **** card! A lot of the ones I'm finding seem to be either for machines that never had cards or for 2009 or newer. I just need to know which card to buy.
    OR . . . given that I hear they are a pain to get to (keeping in mind I'll have the guts exposed while I upgrade the memory and HD) and fiddly . . . is there something external to plug into a USB port that I should get instead?
    Here are my specs:
    MacBook Pro, 15" Late 2008 (Unibody)
    2.4 GHz Intel Core 2 Duo
    2 GB 1067 MHz DDR3 (which I'm about to upgrade)
    Mac OS X Lion 10.7.4 (11E53)
    Card Type:
    AirPort Extreme  (0x14E4, 0x8D)
      Firmware Version:
    Broadcom BCM43xx 1.0 (5.106.198.4.20)
      MAC Address:
      Locale:
      Country Code:
      Supported PHY Modes:
    802.11 a/b/g/n
      Supported Channels:
    1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 36, 40, 44, 48, 52, 56, 60, 64, 100, 104, 108, 112, 116, 120, 124, 128, 132, 136, 140, 149, 153, 157, 161, 165
      Wake On Wireless:
    Supported
      AirDrop:
    Supported

    I'm already upgrading to 8GB of RAM and I'm planning on adding a solid-state drive since it will be a secondary machine. I really don't want to replace the optical drive too; at that point it's got new components in an old chassis with an old CPU.
    I've been to the Genius Bar (back when I lived in a place where there was one in town) and they couldn't replicate the issue, but as soon as I left it started up again. There are times when it holds the signal for a good hour or more, and days like today when it's dropping it every few minutes. It's completely random.
    No. This girl is going to go build her own PC with a fantastic GPU so as to kick some Diablo butt, and the old MacBook Pro will only get used when I don't need it for anything serious. And I am most certainly not dropping another $2k+ on a new MBP, no matter how shiny they are! It's the end of an era, as I've been using Macs exclusively for over 10 years now.

  • Dual Graphics cards in Quad PPC G5?

    I have one GeForce 6600 and want to add a second; is this possible?
    If not, is there another card model which can be dualed up inside of the PPC G5 Quad?
    If not, where can you buy Nvidia graphics card upgrades for the PPC G5 Quad?

    Anyone know where you can get one of these upgrades?
    NVIDIA Quadro FX 4500 graphics card
    http://www.apple.com/pr/library/2005/oct/19pmg5.html
    Malcolm's last link is the Quadro FX 4500. However, unless you need its unique features, the 7800 GT (the next-to-last link in Malcolm's post) is a much better bang for the buck. The 7800 isn't much slower for most things, but costs half as much as the Quadro.

  • Cannot detect outputs in a dual graphics card configuration

    Hi,
    I have a Dell Precision M6800 which has these graphic cards:
    00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
    01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Saturn XT [FirePro M6100]
    My ATI graphics card is supported by the xf86-video-ati driver, according to http://www.x.org/wiki/RadeonFeature/ .
    Since I have multiple cards, I followed the instructions at https://wiki.archlinux.org/index.php/PRIME and I have:
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    OpenGL renderer string: Gallium 0.4 on AMD BONAIRE
    But when I plug my screen into my laptop's DisplayPort, I don't see it in xrandr:
    Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
    eDP1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 382mm x 215mm
    1920x1080 60.01*+
    1400x1050 59.98
    1280x1024 60.02
    1280x960 60.00
    1024x768 60.00
    800x600 60.32 56.25
    640x480 59.94
    VGA2 disconnected (normal left inverted right x axis y axis)
    DP4 disconnected (normal left inverted right x axis y axis)
    HDMI1 disconnected (normal left inverted right x axis y axis)
    VIRTUAL1 disconnected (normal left inverted right x axis y axis)
    If I plug this screen into the VGA output, it is detected, but the maximum resolution allowed by xrandr is 1920x1080, and my screen is 2560x1080, so the image is stretched.
    My guess is that xrandr sees only the outputs of my Intel card. How can I have it see the outputs supported by my ATI card?
    I also guess that my radeon card is not powered up, because cat /sys/kernel/debug/dri/0/radeon_pm_info gives me:
    PX asic powered off
    I tried DRI_PRIME=1 glxgears and that crashed my X server with a segfault:
    [ 554.701] (EE)
    [ 554.701] (EE) Backtrace:
    [ 554.701] (EE) 0: /usr/bin/X (xorg_backtrace+0x56) [0x58f186]
    [ 554.701] (EE) 1: /usr/bin/X (0x400000+0x192fc9) [0x592fc9]
    [ 554.701] (EE) 2: /usr/lib/libpthread.so.0 (0x7fb2738f1000+0xf4b0) [0x7fb2739004b0]
    [ 554.701] (EE) 3: /usr/lib/xorg/modules/drivers/intel_drv.so (0x7fb26d165000+0x1034a8) [0x7fb26d2684a8]
    [ 554.701] (EE) 4: /usr/bin/X (0x400000+0x15ea73) [0x55ea73]
    [ 554.701] (EE) 5: /usr/bin/X (0x400000+0x15f843) [0x55f843]
    [ 554.701] (EE) 6: /usr/bin/X (DRI2GetBuffersWithFormat+0xb) [0x55fc8b]
    [ 554.701] (EE) 7: /usr/bin/X (0x400000+0x16172b) [0x56172b]
    [ 554.701] (EE) 8: /usr/bin/X (0x400000+0x36b2f) [0x436b2f]
    [ 554.701] (EE) 9: /usr/bin/X (0x400000+0x3ad16) [0x43ad16]
    [ 554.701] (EE) 10: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7fb27255e000]
    [ 554.701] (EE) 11: /usr/bin/X (0x400000+0x250fe) [0x4250fe]
    [ 554.701] (EE)
    It's a bad sign, but I'm fine if we can find a workaround that uses only the radeon card. What I can read from the Xorg log files before this crash is the following:
    [ 545.834] (II) RADEON(G0): Printing probed modes for output DisplayPort-1-0
    [ 545.834] (II) RADEON(G0): Modeline "2560x1080"x60.0 185.58 2560 2624 2688 2784 1080 1083 1093 1111 +hsync -vsync (66.7 kHz eP)
    So my radeon detects this mode; I just don't see it in xrandr and don't know how to use it. I tried DRI_PRIME=1 xrandr but saw no difference from a standard xrandr.
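    One thing worth trying before hand-editing Xorg config files is xrandr's provider commands ("reverse PRIME"). A sketch, not a guaranteed fix; the provider names are whatever --listproviders reports on your machine, so "radeon" and "Intel" below are placeholders:
    # list the render/output providers X knows about
    xrandr --listproviders
    # make the radeon card's outputs available, rendered through the Intel card
    xrandr --setprovideroutputsource radeon Intel
    # the discrete card's DisplayPort outputs should now be listed
    xrandr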
    Additional information:
    dmesg | grep radeon gives me:
    [ 5.097162] [drm] radeon kernel modesetting enabled.
    [ 5.118130] radeon 0000:01:00.0: enabling device (0000 -> 0003)
    [ 10.764770] radeon 0000:01:00.0: VRAM: 2048M 0x0000000000000000 - 0x000000007FFFFFFF (2048M used)
    [ 10.764772] radeon 0000:01:00.0: GTT: 1024M 0x0000000080000000 - 0x00000000BFFFFFFF
    [ 10.764852] [drm] radeon: 2048M of VRAM memory ready
    [ 10.764853] [drm] radeon: 1024M of GTT memory ready.
    [ 10.766910] [drm] radeon/BONAIRE_mc2.bin: 31792 bytes
    [ 10.774998] [drm] radeon: dpm initialized
    [ 10.782479] radeon 0000:01:00.0: WB enabled
    [ 10.782490] radeon 0000:01:00.0: fence driver on ring 0 use gpu addr 0x0000000080000c00 and cpu addr 0xffff8807ff9e9c00
    [ 10.782491] radeon 0000:01:00.0: fence driver on ring 1 use gpu addr 0x0000000080000c04 and cpu addr 0xffff8807ff9e9c04
    [ 10.782492] radeon 0000:01:00.0: fence driver on ring 2 use gpu addr 0x0000000080000c08 and cpu addr 0xffff8807ff9e9c08
    [ 10.782493] radeon 0000:01:00.0: fence driver on ring 3 use gpu addr 0x0000000080000c0c and cpu addr 0xffff8807ff9e9c0c
    [ 10.782494] radeon 0000:01:00.0: fence driver on ring 4 use gpu addr 0x0000000080000c10 and cpu addr 0xffff8807ff9e9c10
    [ 10.782874] radeon 0000:01:00.0: fence driver on ring 5 use gpu addr 0x0000000000076c98 and cpu addr 0xffffc9000a336c98
    [ 10.783472] radeon 0000:01:00.0: fence driver on ring 6 use gpu addr 0x0000000080000c18 and cpu addr 0xffff8807ff9e9c18
    [ 10.783473] radeon 0000:01:00.0: fence driver on ring 7 use gpu addr 0x0000000080000c1c and cpu addr 0xffff8807ff9e9c1c
    [ 10.783487] radeon 0000:01:00.0: irq 46 for MSI/MSI-X
    [ 10.783496] radeon 0000:01:00.0: radeon: using MSI.
    [ 10.783516] [drm] radeon: irq initialized.
    [ 11.056307] radeon 0000:01:00.0: No connectors reported connected with modes
    [ 11.057358] radeon 0000:01:00.0: fb1: radeondrmfb frame buffer device
    [ 11.057360] radeon 0000:01:00.0: registered panic notifier
    [ 11.058383] [drm] Initialized radeon 2.38.0 20080528 for 0000:01:00.0 on minor 0
    Does anyone know how to get this new screen working with my radeon card?
    Thanks in advance!
    Last edited by jolivier (2014-07-02 16:02:01)

    OK, I found a solution by changing my Xorg configuration and adding my radeon card (although the wiki states this should be unnecessary):
    Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    BusId "PCI:1:0:0"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-1"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-2"
    EndSection
    Section "Monitor"
    Identifier "eDP1"
    EndSection
    Section "Screen"
    Identifier "Screen0"
    Device "Radeon"
    Monitor "DisplayPort-2"
    SubSection "Display"
    Depth 24
    Modes "2560x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen1"
    Device "Radeon"
    Monitor "DisplayPort-1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen2"
    Device "Intel"
    Monitor "eDP1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "ServerLayout"
    Identifier "Default Layout"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    Screen 2 "Screen2" RightOf "Screen1"
    EndSection
    (I plugged the two external screens into my DisplayPort ports.)
    This works, but I then have two different screens (:0.0 and :0.1), and the intel driver keeps segfaulting when I use them together or when I try Xinerama. So I disabled my intel monitor and screen, and everything is fine except that I cannot use my laptop screen with my radeon card. I am left with only two screens out of three, but I will investigate this issue more deeply later on.

  • LAPTOP dual graphics card / CPU scaling (with bash script)

    Hey all. Basically, getting Arch working with my laptop was a pain due to the lack of power options and graphics control, especially using the open-source drivers. My laptop would overheat because both the dedicated and integrated graphics cards were running at the same time, and my CPUs were running at 100%. After a long while of looking around, I finally found a solution, and being the nice guy I am, I decided to make a script to streamline the process for most people. It mounts the debugging filesystem, adds it to fstab, installs the necessary tools, loads the correct module, and also lets you change power plans as well as check on battery, graphics card, and CPU status. This is basically version one, so I guess I'll add a bit to it over time.
    *** MAKE SURE KMS IS ENABLED ON YOUR KERNEL GRUB/SYSLINUX LINE, E.G.: "radeon.modeset=1"
    ****** ERROR CHECKING:
    If you have the debug fs mounted already, unmount it with umount /sys/kernel/debug.
    If you get an error modprobing, check which cpufreq modules your kernel provides with ls /lib/modules/$(uname -r)/kernel/drivers/cpufreq/.
    With the debugging fs mounted, run cat /sys/kernel/debug/vgaswitcheroo/switch to find out what your graphics adapters are named, and if needed replace IGD and DIS with yours (the checks are collected in the snippet below).
    You may have to modify some parts of the program, but I tried my best to make it as easy as I can.
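    The same checks as one paste-able snippet, run as root (paths as above; the vgaswitcheroo file only exists once the KMS driver is loaded):
    # mount the debug filesystem (skip if already mounted)
    mount -t debugfs none /sys/kernel/debug
    # list the cpufreq modules your kernel ships, to pick the right one to modprobe
    ls /lib/modules/$(uname -r)/kernel/drivers/cpufreq/
    # see how vgaswitcheroo names your adapters (e.g. IGD = integrated, DIS = discrete)
    cat /sys/kernel/debug/vgaswitcheroo/switch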
    Installation:
    copy it and save it as foo.sh
    chmod 777 it for good measure
    RUN AS ROOT
    chmod 777 foo.sh
    ./foo.sh
    #!/bin/bash
    # By: Dominic dos Santos
    # [email protected]
    #
    # Reference commands:
    # mount -t debugfs none /sys/kernel/debug -- mount the debugging fs
    # echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch -- switch to onboard graphics
    # echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch -- switch to dedicated graphics
    # echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch -- power down the unselected card
    # cpufreq-set -c 3 -g powersave -- powersave cpu freq set
    # cpufreq-set -c 3 -g performance -- performance
    # cpufreq-set -c 3 -g ondemand -- ...
    #
    # !!!PLEASE NOTE!!! I am using a quad-core laptop, hence the '-c 3' argument.
    # Cores are numbered 0 1 2 3; dual-core would be '-c 1', six-core '-c 5'.
    echo "RUNNING THIS WITH X RUNNING WILL NOT MODIFY THE GRAPHICS CARD SETTINGS"
    # Check whether the debugging fs is mounted; if not, mount it.
    if [ -f /sys/kernel/debug/vgaswitcheroo/switch ]; then
        echo "DEBUGFS is mounted, continuing with program"
    else
        read -p "Press ENTER to mount the debugging directory (REQUIRED)"
        mount -t debugfs none /sys/kernel/debug # the mount fs command
        echo "Add to fstab?"
        read fs
        if [ "$fs" == "y" ]; then
            # add the required line to fstab
            echo "debugfs /sys/kernel/debug debugfs 0 0" >> /etc/fstab
        fi
        read -p "We are now going to install the cpu drivers and load the required modules."
        pacman -S cpufrequtils
        echo "Do you have an [a]MD or [i]ntel cpu?" # load the correct module now
        read input
        if [ "$input" == "a" ]; then # AMD
            modprobe powernow-k8
        elif [ "$input" == "i" ]; then # INTEL
            modprobe acpi-cpufreq
        fi
        echo "REMEMBER TO ADD acpi-cpufreq cpufreq_powersave cpufreq_ondemand cpufreq_performance to the MODULES=( ) line in your rc.conf -- FOR INTEL CPUs ONLY!"
        echo "OR powernow-k8 cpufreq_powersave cpufreq_ondemand cpufreq_performance -- FOR AMD CPUs ONLY!"
    fi
    # Menu
    echo -e "Welcome to my CPU and Videocard Power and Performance Switcherooer"
    echo " 1: Powersave"
    echo " 2: On Demand"
    echo " 3: Performance"
    echo " 4: Check Status"
    echo "Please select an option"
    read input
    if [ "$input" = 1 ]; then
        # Powersave: set the CPU governor to powersave, switch video to onboard,
        # and power down the card not being used (i.e. the dedicated one)
        cpufreq-set -c 3 -g powersave
        echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch # "OFF" cuts power to the card that isn't selected
    elif [ "$input" = 2 ]; then
        # On Demand: set the CPU governor to ondemand, switch video to onboard,
        # and power down the unused dedicated card
        cpufreq-set -c 3 -g ondemand
        echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 3 ]; then
        # Performance: set the CPU governor to performance, switch video to the
        # dedicated graphics card, and power down the onboard one
        cpufreq-set -c 3 -g performance
        echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 4 ]; then
        # Status check
        echo " 1: Battery"
        echo " 2: Graphics Card"
        echo " 3: CPU Info"
        read status
        if [ "$status" = 1 ]; then # battery
            acpi
            read -p "Press Enter"
        elif [ "$status" = 2 ]; then # graphics card
            cat /sys/kernel/debug/vgaswitcheroo/switch
            read -p "Press Enter"
        elif [ "$status" = 3 ]; then # cpu info
            cpufreq-info
            read -p "Press Enter"
        fi
    fi
    Last edited by imagoose (2012-02-15 22:51:13)

    That's great, thank you. I have an older Dell Studio XPS 13 which has NVIDIA Hybrid SLI. Its current power usage in Arch is killing me (about an hour and a half, whereas I can get 3-4 hrs in Win7). Right now I am doing all the work through the integrated graphics card, per my xorg.conf, but I don't think I've managed to disable the discrete card yet. When I get on the laptop I'll let you know how it goes.
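    If vgaswitcheroo recognises the Hybrid SLI pair on that XPS (an assumption; check the switch file first), powering the discrete card down is the same two writes the script above performs:
    # as root, with debugfs mounted
    cat /sys/kernel/debug/vgaswitcheroo/switch # confirm IGD/DIS entries exist
    echo IGD > /sys/kernel/debug/vgaswitcheroo/switch # route output to the integrated GPU
    echo OFF > /sys/kernel/debug/vgaswitcheroo/switch # power down the inactive (discrete) card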

  • [K7N2] Trouble with dual graphics cards

    I have a K7N2-DeltaL (Barton 2600+) which I have used for a while with a Radeon 9600 AGP at 8x. To use multiple displays I installed a Radeon 9200 PCI graphics card.
    However, I am having trouble getting it to work. The new card works fine alone, both in this and another computer. It also works OK together with another AGP card (a GeForce2) on an Abit K7T with a Duron 800.
    I have only gotten it to work once, for a few minutes: long enough to install drivers etc. I ran it for a while, but it ended in a BSOD (endless loop). Otherwise one of the following scenarios occurs:
    1) with "Init AGP first", the computer boots but fails to initialize the PCI video; otherwise it works OK.
    2) with "Init PCI first", the computer fails to boot (it stops at "Check RTC" according to the D-bracket LEDs; or, come to think of it, I think it actually only shows one red and no green. I'll have to check that when I am back at it).
    3) with "Init PCI first", the computer POSTs but shuts down and can't be started again without removing the power cord.
    If after 3) I remove the *AGP* card, it still won't boot until I remove the power cable. After that it works fine with just the PCI card.
    I am dual-booting XP Professional and WinME, but as it doesn't even get that far, I fail to see how this might matter.
    I am thinking "PSU", but am amazed that it worked in the other computer with the older PSU and not with the new 300W one (I have not found any amp specs).
    Any advice?

    I would have thought that the PCI card uses the same power whether it is "init first" or not. I would also have thought that a power problem would more likely produce an unstable computer rather than an unbootable one.
    Could a few more amps on the PCI bus really make that much (consistent) difference?

  • Need Xorg config file for dual graphics cards, triple monitors

    Hey guys, I'm having a bit of trouble getting triple monitors working on my system. The two plugged into the graphics card (Radeon X300) work fine, but the one plugged into the onboard graphics (GeForce 6150SE) refuses to appear in xrandr. I figure I need to make an /etc/X11/xorg.conf.d/10-monitor.conf file, but I'm confused about how to do it with separate cards. I find the wiki instructions confusing, as I've never had to deal with Xorg files before (it has always autoconfigured for me with no problems).
    Relevant code:
    [greg@spacebar ~]$ lspci | grep VGA
    00:0d.0 VGA compatible controller: nVidia Corporation C61 [GeForce 6150SE nForce 430] (rev a2)
    02:00.0 VGA compatible controller: ATI Technologies Inc RV370 5B60 [Radeon X300 (PCIE)]
    [greg@spacebar ~]$ xrandr
    Screen 0: minimum 320 x 200, current 2960 x 1050, maximum 4096 x 4096
    VGA-0 connected 1280x1024+1680+0 (normal left inverted right x axis y axis) 338mm x 270mm
    1280x1024 60.0*+ 75.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    DVI-0 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 430mm x 270mm
    1680x1050 59.9*+
    1280x1024 75.0 60.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    S-video disconnected (normal left inverted right x axis y axis)
    [greg@spacebar ~]$ pacman -Q | grep xf86-video
    xf86-video-ati 6.14.3-1
    xf86-video-nouveau 0.0.16_git20120106-1
    xf86-video-vesa 2.3.0-7
    [greg@spacebar ~]$ cat /etc/mkinitcpio.conf | grep MODULES=
    MODULES="radeon nouveau"
    I just can't seem to understand how to write the 10-monitor.conf file. I was wondering if anyone could give me a hand?
    Also, the third monitor, to be connected via onboard VGA, is also 1280x1024 and will sit to the left of the other two.

    Depends on the motherboard. This one allows you to enable the onboard graphics only when no external card is found, or to always enable it. The PCI-E card initialises first, if that helps.
    EDIT: I can also confirm that it can run both cards (all three monitors) at the same time by switching the BIOS to initialise the onboard card first; however, that makes some weird stuff happen, so I switched it back. Also, the nouveau driver shows up in lsmod and gets detected in the boot message logs.
    Last edited by MisterAnderson (2012-01-26 16:56:58)
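    For what it's worth, a minimal, untested sketch of a two-card 10-monitor.conf based on the lspci output above. Xorg BusID values are decimal, so 00:0d.0 becomes PCI:0:13:0; the Identifier names and layout are illustrative:
    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        BusID      "PCI:2:0:0" # 02:00.0 Radeon X300
    EndSection
    Section "Device"
        Identifier "Nvidia"
        Driver     "nouveau"
        BusID      "PCI:0:13:0" # 00:0d.0 GeForce 6150SE (0x0d = 13)
    EndSection
    Section "Screen"
        Identifier "Screen0"
        Device     "Radeon"
    EndSection
    Section "Screen"
        Identifier "Screen1"
        Device     "Nvidia"
    EndSection
    Section "ServerLayout"
        Identifier "Layout0"
        Screen 0 "Screen0"
        Screen 1 "Screen1" LeftOf "Screen0"
    EndSection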

Maybe you are looking for

  • How to validate numbers in a char field

    Hello all, I have one database column with the CHAR data type. This field should allow inserting only numbers (zero to nine) and the plus symbol. How can I validate this? Please help. I'm using an Oracle 9i database, so it does not allow the REGEXP and WITH methods.

  • AP - 1099 Tax form

    Hi, SAP tax experts. I have configured AP and need to know what configuration is required to get the US tax form 1099 printed. Is it part of withholding tax? Is it to be reported to the federal government for vendors who are contractors/private individuals?

  • Where is my music library in iTunes 11?

    Just upgraded to iTunes 11. The music library used to be on the left side of the screen with all the individual playlists. Now it's gone; I only have playlists and purchases. Where is it, and how do I put it back in that left-side list with the others? Thanks!

  • Have Adobe X and can't download PDF files; it says I have encountered a problem

    Tried to download a newer version of Adobe, but it won't download... what can I do?

  • How do I get my restriction passcode?

    How do I get my restriction passcode?