HP Envy H8-1540t - dual graphics cards, PCIe 3.0 support

I have 2 questions about using dual graphics cards and PCIe 3.0 support in my machine.
The machine is:
HP Envy H8-1540t
Win 8 64
i7-3820 2nd Gen
10 GB DDR3 1600
256 GB SSD
1 TB 7200 rpm SATA hard drive
Blu-Ray Player
TV Tuner
Premium Wireless-N LAN and Bluetooth 2x2
600 Watt PSU
NVidia GTX 660
This machine uses the Pegatron IPIWB-PB motherboard and has 2 PCIe x16 slots. I realize that by using dual-width GPUs like the GTX 660, the smaller PCIe slots next to them will be buried, rendering them useless. So these are my 2 questions:
1.) Will 2 NVIDIA GTX 660 GPUs physically fit within the machine and be supported?
2.) Does this motherboard with its Intel X79 chipset support PCIe 3.0?
Thank You

Hi,
You can find all the information at the following official link; it looks like the board only supports PCIe Gen 2.0:
   http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&lc=en&dlc=en&docname=c03132964#N241
From the image of the motherboard (above) and the following image of the GTX 660, the simple answer is: two cards won't physically fit.
   http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660/product-images
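If you want to double-check which link generation a card actually negotiates once it is installed, the NVIDIA driver ships a command-line tool that can report it; this is only a sketch, assuming nvidia-smi is on the PATH (it installs with the driver) and that your driver version exposes these query fields for GeForce cards:
   REM Report the PCIe generation the GPU is currently running at and the maximum it supports
   REM (many cards idle at a lower generation and only train up to the maximum under load)
   nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.gen.max --format=csv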
Regards.
BH

Similar Messages

  • Dual graphics card in M92 Tower?

    Hello,
    I have a manager here requesting 3 monitors for each of their department's workstations (Lenovo M92 Tower, Intel i5), and I am just wondering if the M92 supports dual graphics cards. I am aware that it only has a PCIe x1 and a PCIe x16 slot - the plan is to install two x1 cards, if they are supported.
    Could someone confirm if this will work or not?
    Thanks!

    First, I have no experience at all with more than two monitors on any PC.
    However... AMD's EyeFinity setup can support three (or more) monitors using the multiple connectors on a single suitable video card.  For example, the relatively inexpensive AMD R7 250 card comes in both full-size and low-profile varieties, to fit inside an M92p case (either mid-tower or SFF) in the PCIe x16 slot (which is what you want to use).  The low-profile R7 250 GDDR5 card from Sapphire (which includes both low-profile and full-size brackets in the retail box, so you can use the card in either M92p case size) has three digital connectors on it: DVI, miniDP, and microHDMI, so you can connect three monitors to them.  The retail box also includes adapters for the microHDMI and miniDP connectors, but I'd personally just buy a proper straight-through cable (say from Monoprice or another online source) with the correct connector at each end for each of your three monitors; I'm not a fan of adapters.
    According to the EyeFinity setup instructions, you use two connectors to go to two monitors, and the third (or more) monitors must connect on the DisplayPort path.  In your case you only have three monitors, so you just use all three digital connectors on the R7 250 and you're done!  No need for two graphics cards, and you get high-performance GDDR5 as you want, on three monitors.
    Again... I've never done this myself, but EyeFinity for three or more monitors is standard technology given an AMD card like the R7 250 sitting in a single PCIe x16 slot.

  • Photoshop CS6 have problems with my dual graphics card

    I'm currently using Photoshop CS6 on my laptop (HP Pavilion dv4-3114), and this laptop has dual graphics cards (one is the mobile Intel HD graphics, the other is an ATI HD 6750). It usually switches to the faster one when games are running or for software that requires a lot of graphics work. This time I have already set the graphics profile to let Photoshop use the faster one, but when I launch Photoshop CS6 it tells me that the graphics card is not officially supported, as you can see in the screenshot.
    Then I looked at Photoshop's Preferences (Performance) and found it only identifies the mobile Intel HD graphics. I'm currently using Photoshop for designing large posters, so performance is very important to me. Does anyone know how to solve this problem so that Photoshop can run with the highest performance?

    I can't imagine how these computer companies expect to fully integrate GPUs from two different vendors, but they're trying.  To me, knowing a bit about how display drivers are implemented, it seems an impossible task.
    I have heard (in the context of Macs that have this option) that disabling the low-power options, so that only the more powerful GPU is active, can help.  I don't know if that applies to your system, but it might give you an idea of a place to start looking.
    Another thing might be to try to find whatever settings it offers for "affinity" and set it so Photoshop and its cousins use just the ATI GPU exclusively.  It might be the "switchover" process that is causing the problems.
    Good luck.
    -Noel

  • How to get dual graphics card to work together?

    I have an L-series Satellite with dual graphics cards that won't work together. It's an AMD 3400M APU with HD 6520G integrated graphics and a 6400M discrete graphics card.
    The 6400M never turns on; I have used AMD System Monitor to see if it is ever used, but it never is.
    Even when the 6520G is at full load, it won't turn on.
    Any suggestions would be helpful.
    Thanks.

    Hi
    As far as I know this topic has already been discussed here in the forum, and switching between the internal and external ATI GPUs is not supported.
    Only the Optimus technology designed by NVIDIA is supported,
    so you can switch between an Intel GPU and an NVIDIA GPU (if both support Optimus technology).

  • Dual graphics cards and single 4K display - Mac Pro (late 2013)

    I'm considering a new Mac Pro as my next graphics/development workstation. I see that the new Mac Pros have dual graphics cards (i.e. dual D300, D500, D700.) I was wondering if there is any benefit to having the second graphics card if I have only one 4K display connected via DisplayPort 1.2? Would the two cards work in tandem to output to the single 4K display? Can the second GPU still be used for OpenCL, even if it is not connected to a display?
    Thanks

    There is no option to have only one Dxxx card; yes, some apps do use the second GPU for OpenCL; and no, the number of connected displays does not affect that.
    Whether it is right for you vs say a maxed out iMac or something else is something you should continue to research and consider.
    See www.barefeats.com and www.macperformanceguide.com, as well as other forums and sites.
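    If you want to confirm from the OS that both GPUs are present regardless of how many displays are attached, a quick check from Terminal (built-in macOS tooling, nothing to install) might look like this:
    # List every graphics card macOS sees, with its VRAM and any attached displays
    system_profiler SPDisplaysDataType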

  • Can the HP ENVY 15-J126TX graphics card be upgraded?

    Can the HP ENVY 15-J126TX graphics card be upgraded? It has an NVIDIA GT 740 and I want to maybe upgrade it to something better, like a GeForce GTX 780M or GTX 770. Help please.

    Definitely not,
    Manual:
    http://h10032.www1.hp.com/ctg/Manual/c03943414.pdf
    Go through pages 22, 64, and 65.
    The graphics card is soldered onto the board, and none of the cards you mentioned are supported by the system.
    Regards,

  • Jabber 4.4 "Graphics card or driver not supported" error on one machine but not another.

    Hi, I looked at the KB document but the email link on it no longer works.
    I have two end-users using Jabber 4.4 on identical machines, same specs, same driver build and OS image. One gets the  "Graphics card or driver not supported" error, the other does not.
    We have two Dell OptiPlex 7010s.
    Both have 2 instances of the AMD Radeon HD 7470 - they are dual-monitor machines.
    In both cases the driver version is 8.922.0.0, driver date 12/6/2011. This is the latest Dell has on their website.
    On one machine, when I run a fresh install of Jabber for TelePresence 4.4, it works fine. But on the other the user first gets:
    "supportabilitytest.exe has stopped working
    A problem caused the program to stop working correctly. Please close the program."
    Then when that's closed out we get:
    "Graphics card or driver not supported!
    New features in this version of Jabber Video are not supported by your computer's graphics driver.
    Update to the newest graphics driver available and run Jabber Video again."
    I can't believe it's just a matter of upgrading the drivers because in this case one machine with identical drivers works.
    I appreciate any insights, thanks !

    Hi ksouthall,
    It sounds like the supported OpenGL version did not install on the one that is failing.  There isn't much we can do regarding that error.  Best practice is to always upgrade to the latest driver available from the manufacturer.  Client requirements are below.
    Windows 7 (32-bit or 64-bit), Vista, or XP (SP 2 or newer), with OpenGL 1.2 or newer.
    For 720p HD video calls, Intel Core2Duo @ 1.2GHz or better.
    For VGA video calls, Intel Atom @ 1.6GHz or better.
    Webcam, built-in or external. You need an HD webcam if you want other callers to see you in HD.
    Broadband Internet connection with a recommended bandwidth of 768kbps upstream and downstream. You need about 1.2Mbps upstream and downstream for 720p HD video calls.
    Regards,
    Jason
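    Since the two machines are supposed to be identical, it may also be worth confirming that the graphics driver really is the same on both before reinstalling anything; a small sketch using built-in Windows tooling (run on each machine and compare the output):
    REM Dump the display adapter name, driver version and driver date for comparison
    wmic path Win32_VideoController get Name,DriverVersion,DriverDate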

  • G5 Dual Graphics card

    I have a G5 Dual (PCI-X, I believe) as well as a G5 Quad (PCIe).
    I've just replaced the 6500 graphics card in the Quad - can I put it in the Dual?

    The G5 dual will have an AGP graphics card slot, I'm afraid.
    That's presuming it isn't a dual-core - "Late 2005"
    http://support.apple.com/specs/powermac/PowerMac_G5_Late2005.html

  • [K7N2] Trouble with dual graphics cards

    I have a K7N2 Delta-L (Barton 2600+) which I have used for a while with a Radeon 9600 AGP @ 8x. To use multiple displays I installed a Radeon 9200 PCI graphics card.
    However, I am having trouble getting it to work. The new card works fine alone, both in this and another computer. It also works together with another AGP card (GeForce2) on an Abit K7T with a Duron 800.
    I have only gotten it to work for a few minutes once, long enough to install drivers etc. It ran for a while but ended in a BSOD (endless loop). Otherwise one of the following scenarios occurs:
    1) with "Init AGP first" the computer boots but fails to initialize the PCI video; it works OK otherwise.
    2) with "Init PCI first" the computer fails to boot (it stops at "Check RTC" according to the D-bracket LEDs - or, come to think of it, I think it actually shows only one red and no green; I'll have to check when I am back at it).
    3) with "Init PCI first" the computer POSTs but shuts down and can't be started again without removing the power cord.
    If after 3) I remove the *AGP* card, it still won't boot until I remove the power cable. After that it works fine with just the PCI card.
    I am dual-booting XP Professional and WinME, but as it doesn't even get that far, I fail to see how that could matter.
    I am thinking "PSU", but I am amazed that it worked in the other computer with the older PSU and not with the new 300W one (I have not found any amp specs).
    Any advice?

    I would have thought that the PCI card draws the same power whether it is "init first" or not. I would also have thought that a power problem would more likely show up as an unstable computer rather than an unbootable one.
    Could a few more amps on the PCI bus really make that much (consistent) difference?

  • Can I upgrade my HP ENVY dv7-7300 Quad graphic card?

    I've heard that most laptop graphics cards nowadays can't be upgraded, but I have an HP ENVY dv7-7300 Quad and I was wondering if it is possible to upgrade its graphics card to an NVIDIA GT 635M?
    Thanks

    Hi,
    Basically you can - nothing can stop you - but at a cost, and I believe that cost may be more than buying a new computer, because you have to take into account:
    The new card price, and WHERE you can buy it, because laptop cards are not the same as desktop cards; you can't buy one from any normal shop.
    A new system board (maybe).
    A new cooling system, because generally speaking a more powerful card pumps out more heat (you want a more powerful card, don't you?).
    Maybe a new PSU, because your current PSU may not be suitable for the card. (I ordered a machine with an NVIDIA card and I needed to order a 90W charger at the same time, not the normal 65W charger; they simply rejected my order otherwise.)
    Labor cost.
    Many, many packets of Aspirin.
    Hope this helps.
    BH

  • Need Xorg config file for dual graphics cards, triple monitors

    Hey guys, I'm having a bit of trouble getting triple monitors working on my system. The two plugged into the graphics card (Radeon X300) work fine, but the one plugged into the onboard graphics (GeForce 6150SE) refuses to appear in xrandr. I figure I need to make an /etc/X11/xorg.conf.d/10-monitor.conf file, but I'm confused about how to do it with separate cards. I find the wiki instructions confusing, as I've never had to deal with xorg files before (it has always autoconfigured for me with no problems).
    Relevant code:
    [greg@spacebar ~]$ lspci | grep VGA
    00:0d.0 VGA compatible controller: nVidia Corporation C61 [GeForce 6150SE nForce 430] (rev a2)
    02:00.0 VGA compatible controller: ATI Technologies Inc RV370 5B60 [Radeon X300 (PCIE)]
    [greg@spacebar ~]$ xrandr
    Screen 0: minimum 320 x 200, current 2960 x 1050, maximum 4096 x 4096
    VGA-0 connected 1280x1024+1680+0 (normal left inverted right x axis y axis) 338mm x 270mm
    1280x1024 60.0*+ 75.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    DVI-0 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 430mm x 270mm
    1680x1050 59.9*+
    1280x1024 75.0 60.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    S-video disconnected (normal left inverted right x axis y axis)
    [greg@spacebar ~]$ pacman -Q | grep xf86-video
    xf86-video-ati 6.14.3-1
    xf86-video-nouveau 0.0.16_git20120106-1
    xf86-video-vesa 2.3.0-7
    [greg@spacebar ~]$ cat /etc/mkinitcpio.conf | grep MODULES=
    MODULES="radeon nouveau"
    I just can't seem to understand how to write the 10-monitor.conf file. I was wondering if anyone could give me a hand?
    Also, the third monitor to be connected via onboard VGA is also 1280x1024 and will be left of the other two.

    Depends on the motherboard. This one allows you to enable the onboard graphics only when no external card is found, or to always enable it. The PCI-E card initialises first, if that helps.
    EDIT: I can also confirm that it can run both cards (all three monitors) at the same time by switching to initialise the onboard card first; however, that makes some weird stuff happen, so I switched it back. Also, the nouveau driver shows up in lsmod and gets detected in the boot message logs.
    Last edited by MisterAnderson (2012-01-26 16:56:58)
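    For reference, the usual way to describe two cards explicitly is one Device section per card keyed by its BusID (converted to decimal from the lspci output above), plus a ServerLayout that places the screens. This is a minimal sketch only - it assumes the radeon driver for the X300, nouveau for the onboard 6150SE, and that the BusIDs really match your lspci listing:
    # /etc/X11/xorg.conf.d/10-monitor.conf -- sketch only, adjust BusIDs and positions to your setup
    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        BusID      "PCI:2:0:0"
    EndSection
    Section "Device"
        Identifier "Onboard"
        Driver     "nouveau"
        BusID      "PCI:0:13:0"
    EndSection
    Section "Screen"
        Identifier "Screen0"
        Device     "Radeon"
    EndSection
    Section "Screen"
        Identifier "Screen1"
        Device     "Onboard"
    EndSection
    Section "ServerLayout"
        Identifier "Layout0"
        Screen 0 "Screen0"
        Screen 1 "Screen1" LeftOf "Screen0"
    EndSection
    Note that 00:0d.0 in lspci becomes PCI:0:13:0 here (hex 0d = decimal 13). Xinerama can be added to the ServerLayout if you want the desktops joined, though mixing two drivers this way can be fragile.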

  • HP Envy 15t-J100 Nvidia Graphics card issue.

    Hello, I recently bought an HP Envy 15t-j100. I was under the impression that it came with an NVIDIA GeForce GT 740M graphics card (it even has the sticker on the laptop) alongside the Intel HD 4600 graphics. However, when I went to Device Manager and looked at the display adapters, the NVIDIA device was not there; the only one listed was the Intel HD. I then decided to go to the NVIDIA website and download the latest driver, but that didn't work because it told me it couldn't continue due to incompatibility. I'm confused: I have the sticker that tells me that I "have" the NVIDIA GeForce graphics card, but in Device Manager I don't see it. Help!

    Hi,
    The Envy 15t-jxxx series is a set of CTO machines. What is the model/product number of your computer? Please use the following instructions to find the model/product number of your machine:
      http://h10025.www1.hp.com/ewfrf/wc/document?lc=en&cc=us&docname=c00033108
    Can you see something like the following image?
    Regards.
    BH

  • Dual Graphics cards in Quad PPC G5?

    I have one GeForce 6600 and want to add a second - is this possible?
    If not, is there another card model which can be doubled up inside the PPC G5 Quad?
    If not, where can you buy Nvidia graphics card upgrades for the PPC G5 Quad?

    Anyone know where you can get one of these upgrades?
    NVIDIA Quadro FX 4500 graphics card
    http://www.apple.com/pr/library/2005/oct/19pmg5.html
    Malcolm's last link is the Quadro FX 4500. However, unless you need its unique features, the 7800GT (the next-to-last link in Malcolm's post) is a much better bang for the buck. The 7800 isn't much slower for most things, but costs half as much as the Quadro.

  • Cannot detect outputs on dual graphic card configuration

    Hi,
    I have a Dell Precision M6800 which has these graphic cards:
    00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
    01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Saturn XT [FirePro M6100]
    My ATI graphic card is supported by the xf86-video-ati driver based on http://www.x.org/wiki/RadeonFeature/ .
    Since I have multiple cards, I followed the instructions of https://wiki.archlinux.org/index.php/PRIME and I have
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    OpenGL renderer string: Gallium 0.4 on AMD BONAIRE
    But when I plug my screen using my laptop display port, I don't see it in xrandr:
    Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
    eDP1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 382mm x 215mm
    1920x1080 60.01*+
    1400x1050 59.98
    1280x1024 60.02
    1280x960 60.00
    1024x768 60.00
    800x600 60.32 56.25
    640x480 59.94
    VGA2 disconnected (normal left inverted right x axis y axis)
    DP4 disconnected (normal left inverted right x axis y axis)
    HDMI1 disconnected (normal left inverted right x axis y axis)
    VIRTUAL1 disconnected (normal left inverted right x axis y axis)
    If I plug this screen into the VGA output, it is detected, but the maximum resolution allowed by xrandr is 1920x1080 and my screen is 2560x1080, so the image is stretched.
    My guess is that xrandr only sees the outputs of my Intel card; how can I make it see the outputs supported by my ATI card?
    I also guess that my radeon card is not powered up, because cat /sys/kernel/debug/dri/0/radeon_pm_info gives me:
    PX asic powered off
    I tried DRI_PRIME=1 glxgears and that crashed my X server with a segfault:
    [ 554.701] (EE)
    [ 554.701] (EE) Backtrace:
    [ 554.701] (EE) 0: /usr/bin/X (xorg_backtrace+0x56) [0x58f186]
    [ 554.701] (EE) 1: /usr/bin/X (0x400000+0x192fc9) [0x592fc9]
    [ 554.701] (EE) 2: /usr/lib/libpthread.so.0 (0x7fb2738f1000+0xf4b0) [0x7fb2739004b0]
    [ 554.701] (EE) 3: /usr/lib/xorg/modules/drivers/intel_drv.so (0x7fb26d165000+0x1034a8) [0x7fb26d2684a8]
    [ 554.701] (EE) 4: /usr/bin/X (0x400000+0x15ea73) [0x55ea73]
    [ 554.701] (EE) 5: /usr/bin/X (0x400000+0x15f843) [0x55f843]
    [ 554.701] (EE) 6: /usr/bin/X (DRI2GetBuffersWithFormat+0xb) [0x55fc8b]
    [ 554.701] (EE) 7: /usr/bin/X (0x400000+0x16172b) [0x56172b]
    [ 554.701] (EE) 8: /usr/bin/X (0x400000+0x36b2f) [0x436b2f]
    [ 554.701] (EE) 9: /usr/bin/X (0x400000+0x3ad16) [0x43ad16]
    [ 554.701] (EE) 10: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7fb27255e000]
    [ 554.701] (EE) 11: /usr/bin/X (0x400000+0x250fe) [0x4250fe]
    [ 554.701] (EE)
    It's a bad sign, but I'm fine if we find a workaround that only uses the radeon card. What I can read from the Xorg log file before this crash is the following:
    [ 545.834] (II) RADEON(G0): Printing probed modes for output DisplayPort-1-0
    [ 545.834] (II) RADEON(G0): Modeline "2560x1080"x60.0 185.58 2560 2624 2688 2784 1080 1083 1093 1111 +hsync -vsync (66.7 kHz eP)
    So my radeon detects this mode; I just don't see it in xrandr and don't know how to use it. I tried DRI_PRIME=1 xrandr but saw no difference from a standard xrandr.
    Additional information:
    dmesg | grep radeon gives me:
    [ 5.097162] [drm] radeon kernel modesetting enabled.
    [ 5.118130] radeon 0000:01:00.0: enabling device (0000 -> 0003)
    [ 10.764770] radeon 0000:01:00.0: VRAM: 2048M 0x0000000000000000 - 0x000000007FFFFFFF (2048M used)
    [ 10.764772] radeon 0000:01:00.0: GTT: 1024M 0x0000000080000000 - 0x00000000BFFFFFFF
    [ 10.764852] [drm] radeon: 2048M of VRAM memory ready
    [ 10.764853] [drm] radeon: 1024M of GTT memory ready.
    [ 10.766910] [drm] radeon/BONAIRE_mc2.bin: 31792 bytes
    [ 10.774998] [drm] radeon: dpm initialized
    [ 10.782479] radeon 0000:01:00.0: WB enabled
    [ 10.782490] radeon 0000:01:00.0: fence driver on ring 0 use gpu addr 0x0000000080000c00 and cpu addr 0xffff8807ff9e9c00
    [ 10.782491] radeon 0000:01:00.0: fence driver on ring 1 use gpu addr 0x0000000080000c04 and cpu addr 0xffff8807ff9e9c04
    [ 10.782492] radeon 0000:01:00.0: fence driver on ring 2 use gpu addr 0x0000000080000c08 and cpu addr 0xffff8807ff9e9c08
    [ 10.782493] radeon 0000:01:00.0: fence driver on ring 3 use gpu addr 0x0000000080000c0c and cpu addr 0xffff8807ff9e9c0c
    [ 10.782494] radeon 0000:01:00.0: fence driver on ring 4 use gpu addr 0x0000000080000c10 and cpu addr 0xffff8807ff9e9c10
    [ 10.782874] radeon 0000:01:00.0: fence driver on ring 5 use gpu addr 0x0000000000076c98 and cpu addr 0xffffc9000a336c98
    [ 10.783472] radeon 0000:01:00.0: fence driver on ring 6 use gpu addr 0x0000000080000c18 and cpu addr 0xffff8807ff9e9c18
    [ 10.783473] radeon 0000:01:00.0: fence driver on ring 7 use gpu addr 0x0000000080000c1c and cpu addr 0xffff8807ff9e9c1c
    [ 10.783487] radeon 0000:01:00.0: irq 46 for MSI/MSI-X
    [ 10.783496] radeon 0000:01:00.0: radeon: using MSI.
    [ 10.783516] [drm] radeon: irq initialized.
    [ 11.056307] radeon 0000:01:00.0: No connectors reported connected with modes
    [ 11.057358] radeon 0000:01:00.0: fb1: radeondrmfb frame buffer device
    [ 11.057360] radeon 0000:01:00.0: registered panic notifier
    [ 11.058383] [drm] Initialized radeon 2.38.0 20080528 for 0000:01:00.0 on minor 0
    Does anyone know how I can use this new screen with my radeon card?
    Thanks in advance!
    Last edited by jolivier (2014-07-02 16:02:01)

    OK, I found a solution by changing my Xorg configuration and adding my radeon card (although the wiki states that this should be unnecessary):
    Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    BusId "PCI:1:0:0"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-1"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-2"
    EndSection
    Section "Monitor"
    Identifier "eDP1"
    EndSection
    Section "Screen"
    Identifier "Screen0"
    Device "Radeon"
    Monitor "DisplayPort-2"
    SubSection "Display"
    Depth 24
    Modes "2560x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen1"
    Device "Radeon"
    Monitor "DisplayPort-1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen2"
    Device "Intel"
    Monitor "eDP1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "ServerLayout"
    Identifier "Default Layout"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    Screen 2 "Screen2" RightOf "Screen1"
    EndSection
    (I plugged two external screens into my DisplayPort ports.)
    This works, but I then have two different screens (:0.0 and :0.1), and the Intel driver keeps segfaulting when I use them together or when I try Xinerama. So I disabled my Intel monitor and screen, and everything is fine except that I cannot use my laptop panel with my radeon card, so I am left with only two screens out of three. I will investigate this issue more deeply later on.
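    For what it's worth, on a newer X server another route that sometimes avoids the separate-X-screens problem is "reverse PRIME" via xrandr providers; this is only a sketch, assuming xrandr 1.4+ and that the provider names below match what --listproviders actually reports on this machine:
    # Show the render/output providers X knows about (names and indices vary per system)
    xrandr --listproviders
    # Let the Radeon's connectors act as outputs for the Intel-rendered desktop
    xrandr --setprovideroutputsource radeon Intel
    # The discrete outputs (e.g. DisplayPort-1-0 from the log above) should then appear in plain xrandr
    xrandr --output DisplayPort-1-0 --mode 2560x1080 --right-of eDP1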

  • LAPTOP Dual Graphics Card CPU scailing (with bash script)

    Hey all. Basically, getting Arch working with my laptop was a pain due to the lack of power options and graphics control, especially using the open-source drivers. My laptop would overheat because both the dedicated and integrated graphics cards were running at the same time and my CPUs were running at 100%. After a long while of looking around, I finally found a solution, and being the nice guy I am, I decided to make a script to streamline the process for most people. It mounts the debugging filesystem, adds it to fstab, installs the necessary tools, loads the correct module, and also lets you change power plans, as well as check on battery, graphics card, and CPU status. This is basically version one, so I guess I'll add a bit to it over time.
    *** MAKE SURE KMS IS ENABLED ON YOUR KERNEL GRUB/SYSLINUX LINE, EG: "radeon.modeset=1"
    ******ERROR CHECKING:
    If you have the debug fs mounted already, unmount it with umount /sys/kernel/debug.
    If you get an error modprobing, check which modules are supported for your CPU with ls /lib/modules/$(uname -r)/kernel/drivers/cpufreq/
    With the debugging fs mounted, run cat /sys/kernel/debug/vgaswitcheroo/switch to find out what your graphics adapters are named and, if needed, replace IGD and DIS with yours (a sample of the file is shown just below).
    You may have to modify some parts of the program, but I tried my best to make it as easy as I can.
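    For reference, a typical switch file looks something like this (example output only - the PCI addresses, and which line carries the '+' marking the active card, will differ on your machine; IGD is the integrated GPU, DIS the discrete one):
    cat /sys/kernel/debug/vgaswitcheroo/switch
    0:IGD:+:Pwr:0000:00:02.0
    1:DIS: :Off:0000:01:00.0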
    Installation:
    copy it and save it as foo.sh
    chmod 777 it for good measure
    RUN AS ROOT
    chmod 777 foo.sh
    ./foo.sh
    #! /bin/bash
    #By: Dominic dos Santos
    #[email protected]
    #mount -t debugfs none /sys/kernel/debug --mount debugging fs
    #echo"IGD"> /sys/kernel/debug/vgaswitcheroo/switch && echo"OFF"> /sys/kernel/debug/vgaswitcheroo/switch --onboard graphics
    #echo"DIS"> /sys/kernel/debug/vgaswitcheroo/switch && echo"OFF"> /sys/kernel/debug/vgaswitcheroo/switch --dedicated graphics
    #cpufreq-set -c 3 -g powersave # --powersave cpu freq set
    #cpufreq-set -c 3 -g performance # --performance
    #cpufreq-set -c 3 -g ondemand #--...
    #!!!PLEASE NOTE!!! I am using a quad core laptop, therefore I have the '-c 3' argument. Cores are as such: 0 1 2 3
    #Dual core would be '-c 1', 6-core would be '-c 5'
    echo "RUNNING THIS WIH X RUNNING WILL NOT MODIFY THE GRAPHICS CARD SETINGS"
    #checking if debugging fs is mounted, if not, mounting.
    if [ -f /sys/kernel/debug/vgaswitcheroo/switch ]; then
    echo "DEBUGFS is mounted, continuing wih program"
    else
    read -p "Press ENTER to mount the debugging directory (REQUIRED)"
    mount -t debugfs none /sys/kernel/debug #the mount fs command
    echo "Add to fstab?"
    read fs
    if [ "$fs" == "y" ]; then
    echo "debugfs /sys/kernel/debug debugfs 0 0" >> /etc/fstab #add the required line to the fstab
    fi
    read -p "We are now going to install the cpu drivers and load the required modules."
    pacman -S cpufrequtils
    echo "Do you have an [a]MD or [i]ntel cpu?" #load the [correct] module now
    read input
    if [ "$input" == "a" ]; then #AMD
    modprobe powernow-k8
    elif [ "$input" == "i" ]; then #INTEL
    modprobe acpi-cpufreq
    fi
    echo "REMEMBER TO ADD acpi-cpufreq cpufreq_powersave cpufreq_ondemand cpufreq_performance to your rc.conf beside MODULES=( ****** FOR INTEL CARDS ONLY!"
    echo "OR powernow-k8 cpufreq_powersave cpufreq_ondemand cpufreq_performance ****** FOR AMD CPU's ONLY!"
    fi
    #menu
    echo -e "Welcome to my CPU and Videocard Power and Performance Switcherooer"
    echo " 1: Powersave"
    echo " 2: On Demand"
    echo " 3: Performance"
    echo " 4: Check Status"
    echo "Please select an option"
    read input
    if [ "$input" = 1 ]; then
    #Powersave
    #Set CPU to "Powersave", set VGA to onboard and disables one not being used, ie. the dedicated
    cpufreq-set -c 3 -g powersave
    echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch #the "OFF" infers to cutting the power to the one that isn't selected
    elif [ "$input" = 2 ]; then
    #On Demand
    #Set CPU to "On Demand", set VGA to onboard and disables one not being used, ie. the dedicated
    cpufreq-set -c 3 -g ondemand
    echo "IGD"> /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF"> /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 3 ]; then
    #Performance
    #Set CPU to "Performance", set VGA to the dedicated graphics card and disables the onboard
    cpufreq-set -c 3 -g performance
    echo "DIS"> /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF"> /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 4 ]; then # status check
    echo " 1: Battery"
    echo " 2: Graphics Card"
    echo " 3: CPU Info"
    read status
    if [ "$status" = 1 ]; then #battery
    acpi
    read -p "Press Enter"
    elif [ "$status" = 2 ]; then #battery
    cat /sys/kernel/debug/vgaswitcheroo/switch
    read -p "Press Enter"
    elif [ "$status" = 3 ]; then #battery
    cpufreq-info
    read -p "Press Enter"
    fi
    fi
    Last edited by imagoose (2012-02-15 22:51:13)

    That's great, thank you.  I have an older Dell Studio XPS 13 which has NVIDIA Hybrid SLI.  Its current power usage in Arch is killing me (about an hour and a half, whereas I can get 3-4 hours in Win7).  Right now I am doing all the work through the integrated graphics card, per my xorg.conf, but I don't think I've managed to disable the discrete card yet. When I get on the laptop I'll let you know how it goes.
