LAPTOP Dual Graphics Card / CPU Scaling (with bash script)

Hey all. Getting Arch working on my laptop was a pain due to the lack of power options and graphics control, especially with the open source drivers. My laptop would overheat because both the dedicated and integrated graphics cards were running at the same time, and my CPU cores were running at 100%. After a long while of looking around I finally found a solution, and being the nice guy I am, I decided to make a script to streamline the process for most people. It mounts the debugging filesystem, adds it to fstab, installs the necessary tools, loads the correct module, and also lets you change power plans, as well as check battery, graphics card, and CPU status. This is basically version one, so I guess I'll add a bit to it over time.
*** MAKE SURE KMS IS ENABLED ON YOUR KERNEL GRUB/SYSLINUX LINE, e.g. "radeon.modeset=1"
******ERROR CHECKING:
If the debug fs is already mounted, unmount it with: umount /sys/kernel/debug
If you get an error modprobing, check which cpufreq modules your kernel provides with: ls /lib/modules/$(uname -r)/kernel/drivers/cpufreq/
With the debugging fs mounted, run cat /sys/kernel/debug/vgaswitcheroo/switch to find out how your graphics adapters are named, and if needed replace IGD and DIS in the script with yours.
You may have to modify some parts of the script, but I tried my best to make it as easy as I can.
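For reference, with the debugging fs mounted the switch file lists one line per adapter, and the active one is marked with a '+'. Here is a small sketch of pulling that name out; the helper name parse_active and the two sample lines are just illustrative, and your adapters may be labelled differently, which is exactly why you should check first:

```shell
# Sketch: print the name (e.g. IGD or DIS) of the currently active adapter,
# i.e. the line vgaswitcheroo marks with '+'.
# Illustrative sample of the file format:
#   0:IGD:+:Pwr:0000:00:02.0
#   1:DIS: :Off:0000:01:00.0
parse_active() {
    awk -F: '$3 == "+" { print $2 }'
}

# Usage: parse_active < /sys/kernel/debug/vgaswitcheroo/switch
```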
Installation:
Copy the script below and save it as foo.sh, make it executable, and RUN AS ROOT:
chmod +x foo.sh
./foo.sh
#!/bin/bash
# By: Dominic dos Santos
# [email protected]
#
# Reference commands used below:
#   mount -t debugfs none /sys/kernel/debug               -- mount the debugging fs
#   echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch   -- switch to onboard graphics
#   echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch   -- switch to dedicated graphics
#   echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch   -- power off the unused card
#   cpufreq-set -c 3 -g powersave|ondemand|performance    -- set the CPU governor
#
# !!!PLEASE NOTE!!! I am using a quad-core laptop, hence the '-c 3' argument.
# Cores are numbered from zero: 0 1 2 3.
# A dual-core machine would use '-c 1', a six-core machine '-c 5'.

echo "RUNNING THIS WITH X RUNNING WILL NOT MODIFY THE GRAPHICS CARD SETTINGS"

# Check whether the debugging fs is mounted; if not, mount it.
if [ -f /sys/kernel/debug/vgaswitcheroo/switch ]; then
    echo "DEBUGFS is mounted, continuing with program"
else
    read -p "Press ENTER to mount the debugging directory (REQUIRED)"
    mount -t debugfs none /sys/kernel/debug
    echo "Add it to fstab? [y/n]"
    read fs
    if [ "$fs" == "y" ]; then
        echo "debugfs /sys/kernel/debug debugfs 0 0" >> /etc/fstab   # add the required line to fstab
    fi
    read -p "We are now going to install the CPU frequency tools and load the required module. Press ENTER."
    pacman -S cpufrequtils
    echo "Do you have an [a]MD or [i]ntel CPU?"   # load the correct module now
    read input
    if [ "$input" == "a" ]; then      # AMD
        modprobe powernow-k8
    elif [ "$input" == "i" ]; then    # Intel
        modprobe acpi-cpufreq
    fi
    echo "REMEMBER to add 'acpi-cpufreq cpufreq_powersave cpufreq_ondemand cpufreq_performance' to MODULES=( ) in your rc.conf -- FOR INTEL CPUs ONLY!"
    echo "OR 'powernow-k8 cpufreq_powersave cpufreq_ondemand cpufreq_performance' -- FOR AMD CPUs ONLY!"
fi

# Menu
echo "Welcome to my CPU and Videocard Power and Performance Switcherooer"
echo " 1: Powersave"
echo " 2: On Demand"
echo " 3: Performance"
echo " 4: Check Status"
echo "Please select an option"
read input
if [ "$input" = 1 ]; then
    # Powersave: "powersave" governor, switch to the onboard card,
    # then power off the one not in use, i.e. the dedicated card.
    cpufreq-set -c 3 -g powersave
    echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch   # "OFF" cuts power to whichever card is not selected
elif [ "$input" = 2 ]; then
    # On Demand: "ondemand" governor, onboard card, dedicated card powered off.
    cpufreq-set -c 3 -g ondemand
    echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
elif [ "$input" = 3 ]; then
    # Performance: "performance" governor, switch to the dedicated card,
    # onboard card powered off.
    cpufreq-set -c 3 -g performance
    echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch
    echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
elif [ "$input" = 4 ]; then
    # Status check
    echo " 1: Battery"
    echo " 2: Graphics Card"
    echo " 3: CPU Info"
    read status
    if [ "$status" = 1 ]; then      # battery
        acpi
        read -p "Press Enter"
    elif [ "$status" = 2 ]; then    # graphics card
        cat /sys/kernel/debug/vgaswitcheroo/switch
        read -p "Press Enter"
    elif [ "$status" = 3 ]; then    # CPU
        cpufreq-info
        read -p "Press Enter"
    fi
fi
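One caveat with the '-c' handling above: cpufreq-set -c 3 only changes the governor for core 3, and depending on the driver the other cores may keep their old governor. Below is a minimal sketch of a helper that applies the chosen governor to every core; the function name set_governor_all is mine, and it assumes cpufrequtils is installed and the script runs as root:

```shell
#!/bin/bash
# Sketch: apply a cpufreq governor to every core rather than a single one.
# Assumes cpufreq-set (cpufrequtils) is installed and we are running as root.
set_governor_all() {
    local gov="$1"
    local last=$(( $(nproc) - 1 ))   # highest core index, e.g. 3 on a quad core
    for c in $(seq 0 "$last"); do
        cpufreq-set -c "$c" -g "$gov"
    done
}

# Example: set_governor_all powersave
```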
Last edited by imagoose (2012-02-15 22:51:13)

That's great, thank you. I have an older Dell Studio XPS 13 which has NVIDIA Hybrid SLI. Its current power usage in Arch is killing me (about an hour and a half, whereas I can get 3-4 hrs in Win7). Right now I am doing all the work through the integrated graphics card, per my xorg.conf, but I don't think I've managed to disable the discrete card yet. When I get on the laptop I'll let you know how it goes.

Similar Messages

  • Photoshop CS6 has problems with my dual graphics cards

    I'm currently using Photoshop CS6 on my laptop (an HP Pavilion dv4-3114), which has dual graphics cards (one is the mobile Intel HD graphics, the other an ATI HD 6750). It usually switches to the faster one when games or other graphics-heavy software are running. This time I have already set the graphics card profile to let Photoshop use the faster one, but when I launch Photoshop CS6 it tells me that the graphics card is not officially supported, as you can see in the screenshot.
    Then I went to Photoshop's Preferences > Performance and found it only identifies the mobile Intel HD graphics. I'm currently using Photoshop for designing large posters, so performance is very important to me. Does anyone know how to solve this problem so that Photoshop can run with the highest performance?

    I can't imagine how these computer companies expect to fully integrate GPUs from two different vendors, but they're trying.  To me, knowing a bit about how display drivers are implemented, it seems an impossible task.
    I have heard (in the context of Macs that have this option) that disabling the low-power options so that only the more powerful GPU is active can help.  I don't know if that applies to your system, but it might give you an idea of a place to start looking.
    Another thing might be to try to find whatever settings it offers for "affinity" and set it so Photoshop and its cousins use just the ATI GPU exclusively.  It might be the "switchover" process that is causing the problems.
    Good luck.
    -Noel

  • How to get dual graphics card to work together?

    I have an L series Satellite with dual graphics cards that won't work together. It's an AMD 3400M APU with an HD 6520G integrated GPU, and a 6400M discrete graphics card.
    The 6400M never turns on; I have used AMD System Monitor to see if it is ever used, but it never is.
    Even when the 6520G is at full load it won't turn on.
    Any suggestions would be helpful.
    Thanks.

    Hi
    As far as I know this topic has already been discussed here in the forum, and switching between the internal and external ATI GPUs is not supported.
    Only the Optimus technology designed by nVidia is supported,
    so you can switch between the Intel GPU and the nVidia GPU (if both support Optimus).

  • Dual graphics card in M92 Tower?

    Hello,
    I have a manager here requesting 3 monitors for each of their department's workstations (Lenovo M92 Tower, Intel i5), and I am just wondering if the M92 supports dual graphics cards. I am aware that it only has a PCIe x1 and a PCIe x16 slot; the plan is to install two x1 cards, if they are supported.
    Could someone confirm if this will work or not?
    Thanks!
    Solved!
    Go to Solution.

    First, I have no experience at all with more than two monitors on any PC.
    However... AMD's EyeFinity setup can support three (or more) monitors using the multiple connectors on a suitable video card.  For example, the relatively inexpensive AMD R7 250 card comes in both full-size and low-profile varieties, to fit inside an M92p case (either mid-tower or SFF) in the PCIe x16 slot (which is what you want to use).  The low-profile R7 250 DDR5 card from Sapphire (which includes both low-profile and full-size brackets in the retail box, so you can use the card in either M92p case size) has three digital connectors on it: DVI, miniDP, and microHDMI, so you can connect three monitors to them.  The retail box also includes adapters for the microHDMI and miniDP connectors, but I'm not a fan of adapters myself; I'd personally just buy a proper straight-through cable (say from Monoprice or another online source) with the correct connector at each end for each of your three monitors.
    According to the EyeFinity setup instructions, you use two connectors to go to two monitors, and the third (or more) monitors must connect on the DisplayPort path.  In your case you only have three monitors, so you just use all three digital connectors on the R7 250 and you're done!  No need to have two graphics cards, and you get high-performance DDR5 as you want, on three monitors.
    Again... I've never done this myself, but EyeFinity for three or more monitors is standard technology given an AMD card like the R7 250 sitting in a single PCIe x16 slot.

  • HP Envy H8-1540t - dual graphics cards, Pcie 3.0 support

    I have 2 questions about using dual graphics cards and Pcie 3.0 support in my machine.
    The machine is:
    HP Envy H8-1540t
    Win 8 64
    i7-3820 2nd Gen
    10 GB DDR3 1600
    256 GB SSD
    1 TB 7200 rpm SATA hard drive
    Blu-Ray Player
    TV Tuner
    Prem wireless N-Lan and Blutooth 2x2
    600 Watt PSU
    NVidia GTX 660
    This machine uses the Pegatron IPIWB-PB motherboard and has 2 PCIe x16 slots. I realize that by using dual-width GPUs like the GTX 660, the smaller PCIe slots next to them will be buried, rendering them useless. So these are my 2 questions:
    1.) Will 2 Nvidia GTX 660 GPUs physically fit within the machine and be supported?
    2.) Does this motherboard with its Intel X79 chipset support PCIe 3.0?
    Thank You
    This question was solved.
    View Solution.

    Hi,
    You can find ALL information on the following OFFICIAL link; it looks like it only supports PCIe Gen 2.0:
       http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&lc=en&dlc=en&docname=c03132964#N241
    From the image of the motherboard (above) and the following image of the GTX 660, simple answer: 2 cards won't physically fit.
       http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660/product-images
    Regards.
    BH

  • Mac Pro with two graphics cards, each with two DVI outputs: can I extend the desktop to three outputs?

    Mac Pro with two graphics cards, each with two DVI outputs: can I extend the desktop to three outputs?

    Mac OS 10.7.4, 3.2GHz Quad-Core Intel Xeon processors, 12GB of 800MHz DDR2, and 2 ATI Radeon HD 2600 XT 256MB cards (two dual-link DVI ports each). The problem is, I would like to use one output as the main display and the three others as secondary or extended desktop, but the system doesn't allow me to use two different video cards this way.

  • Radeon HD 5770 graphics card works perfectly with my Mac Pro 1,1 and Adobe CS6. However, Photoshop CC won't read the card drivers, says I need to reinstall them, and has disabled 3D functions as a result. Anyone else with this issue?

       Radeon HD 5770 graphics card works perfectly with my Mac Pro 1,1 and Adobe CS6. However, I upgraded to Adobe CC and Photoshop CC disabled some filters and 3D functionality. The preference panel for Performance/Graphics Card in Photoshop CC is greyed out and inaccessible and shows this message:
    "Photoshop detected an error in your display driver. Update or reinstall the driver and check Use Graphics Processor to retry."
       I still have Photoshop CS6 installed and it still works perfectly. Anyone else with this issue? Is there a driver update for this compatible with my setup?
    OS 10.7.5
    Mac Pro 1,1
    20 GB RAM
    ATI Radeon HD 5770 graphics card
    30" Apple Cinema HD display

    After going round with this, Adobe was able to find the fix: open the package contents of the Photoshop CC application (Control-click on the application), navigate to the "sniffer" file, and rename it by putting a tilde (~) in front of the name ("~sniffer"). Close and reopen Photoshop CC and, voila, problem solved. Thank you, Adobe support.

  • Dual graphics cards and single 4K display - Mac Pro (late 2013)

    I'm considering a new Mac Pro as my next graphics/development workstation. I see that the new Mac Pros have dual graphics cards (i.e. dual D300, D500, D700.) I was wondering if there is any benefit to having the second graphics card if I only have only one 4K display connected via Display Port 1.2? Would the two cards work in tandem to output to the single 4k display? Can the second GPU still be used for OpenCL, even if it is not connected to a display?
    Thanks

    There is no option to have only one Dxxx card; yes, some apps can use the second GPU (for OpenCL) even with no display attached to it; and no, the number of displays does not affect video output.
    Whether it is right for you vs say a maxed out iMac or something else is something you should continue to research and consider.
    www.barefeats.com and www.macperformanceguide.com as well as other forums and sites.

  • [p7-1240] How can I tell which graphics cards will work with my motherboard?

    I recently went through the nightmare of trying to install a Gigabyte GTX 650 in my PC. After a few days of struggling and scouring the internet for answers, I discovered that it is incompatible with the BIOS on my motherboard, and HP does not offer BIOS updates. I returned the card and am now looking for a new one, but I am unsure which cards will work, since my computer met all of the system requirements for the graphics card and it still did not work. Since GPU manufacturers do not list BIOS specs under system requirements, how can I tell if a graphics card will work with my system?
    My computer is a p7-1240 with an upgraded 520W PSU and approximately 11" of room to install the card.
    This question was solved.
    View Solution.

    Thexn, welcome to the forum.
    The problem is, many of the newer video cards require the computer to have a full UEFI BIOS.  HP began using one in mid-October, 2012.  Before this date the BIOS is Legacy.  This is why the video card didn't work with your computer.
    I don't know what you want to spend for a card, but MSI makes a GTX 750ti that has a hybrid VBIOS.  It has a switch to change to Legacy or UEFI.  There is a video link directly below the picture of a card.  Paul from Newegg gives a full description of the card.
    I like EVGA cards the best.  Their Tech Support has given me a lot of information about their cards.  However, you have to go back to a GT 640 card to find one of their cards that doesn't require the UEFI BIOS.  They do have some updates for their BIOS'es that solve the problem, but you have to call them for the exact cards.

  • What graphics card is compatible with my Presario SR1710NX PC?

    What graphics card is compatible with my Presario SR1710NX PC?  The current one died and I get nothing on my monitor; I tried connecting another monitor and still nothing, so I assume it is the graphics card.
    This question was solved.
    View Solution.

    Hi:
    That is an old model.
    The specs state it has onboard (integrated graphics).
    If you are sure it is just the graphics that is kaput (you would know because you can still hear Windows boot up and the logon music play), then...
    I would get a used Dell Radeon HD 2400 on eBay which should work just fine for you, but you will need to have a monitor with a DVI connection and a DVI cable.
    This is the cheapest one I could find.
    http://www.ebay.com/itm/ATI-Radeon-HD-2400PRO-256MB-PCI-Express-video-card-ATI-102-B17002-B-USED-/12...

  • [K7N2] Trouble with dual graphics cards

    I have a K7N2-DeltaL (Barton 2600+) which I have used for a while with a Radeon 9600 AGP@8x. To use multiple displays I installed a Radeon9200 PCI graphics card.
    However, I am having trouble getting it to work. The new card works fine alone, both in this and another computer. It works ok together with another AGP-card (GeForce2) on a Abit K7T with Duron 800.
    I have only gotten it to work for a few minutes once, long enough to install drivers etc. Ran it for a while but ended in a BSOD (endless-loop). Otherwise one of the following scenarios occur:
    1) with "Init AGP first" the computer boots but fails to initialize the PCI-video, works ok otherwise.
    2) with "Init PCI first" the computer fails to boot (stops in "Check RTC" according to D-bracket LEDs, or come to think about it, I think it actually only shows one red and no green, have to check that when I am back at it)
    3) with "Init PCI First" the computer POSTs but shuts down and can't be started without removing the power cord
    If after 3) I remove the *AGP* card, it still won't boot until I remove the power cable. After this it works fine with just the PCI-card.
    I am dual booting with XP Professional and WinME, but as it doesn't even get that far I am failing to see that this might matter.
    I am thinking "PSU", but am amazed that it worked in the other computer with the older PSU and not in the new 300W (have not found any Amp-specs).
    Any advice?

    I would have thought that the PCI card used the same power whether it is "init first" or not. Also I would have thought that a power problem would more probably be an unstable computer rather than an unbootable one.
    Could a few more AMPs on the PCI-bus really make that much (consistent) difference?

  • Hi, please suggest some games for my Compaq CQ62-105TU laptop. Can I upgrade my laptop's graphics card?

    Hi, please suggest some games for my Compaq CQ62-105TU laptop. Can I upgrade my laptop's graphics card?

    Unfortunately, you cannot upgrade the processor or graphics card on this laptop.
    Here is a page from CPU-World that gives the specs for your processor:
    http://www.cpu-world.com/CPUs/Bobcat/AMD-E%20Series%20E2-1800.html
    You will see down near the bottom of the page that it is not upgradeable. Some laptops come with the processor soldered to the motherboard. Almost all laptop graphics are this way. The only way that you would be able to get a faster processor/better graphics for this laptop would be to purchase a brand new motherboard. You can do an Ebay search for 'HP G6 motherboard' to see what average prices are if you want to go this route.

  • Graphics card drivers problem with star craft 2!!!! ah!

    I have a problem with my new laptop (a Compaq Presario CQ62z-200cto) that I bought for star craft II and I was hoping you guys could help.
    This is my computer’s stats. They surpass the recommended settings for Star craft 2 by a little bit.
    2.8 GHz Phenom dual core
    2 GB DDR3 RAM
    ATI Radeon Mobility HD 545v (yes, this card is better than the recommended 3870)
    When I first turned on star craft it said:
    “Your video card drivers are out of date” and “Starcraft II does not recognize your graphics card”
    In addition to this, Starcraft II runs relatively poorly. I'm not sure how well it's supposed to run, but it's laggy even on low-ish settings. I tried to get new drivers from ATI, but they said that if you have a laptop you should get drivers from the manufacturer, not ATI (because HP makes the specific drivers for the computer). ATI supposedly only has "generic" drivers for laptops or something like that. So I have to get the latest drivers from HP, which I already have (they were made in April… sheesh).
    So, I’m stumped it seems like the drivers are holding me back even though I have the hardware for the job, but the drivers I need don’t exist.
    Any ideas? Anyone else have this problem?
    Thanks,
    Tim

    The "graphics card" on an iMac is built into the motherboard. Therefore the only updates would be through an Apple firmware update.

  • Cannot detect outputs on dual graphic card configuration

    Hi,
    I have a Dell Precision M6800 which has these graphic cards:
    00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
    01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Saturn XT [FirePro M6100]
    My ATI graphic card is supported by the xf86-video-ati driver based on http://www.x.org/wiki/RadeonFeature/ .
    Since I have multiple cards, I followed the instructions of https://wiki.archlinux.org/index.php/PRIME and I have
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    OpenGL renderer string: Gallium 0.4 on AMD BONAIRE
    But when I plug my screen using my laptop display port, I don't see it in xrandr:
    Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
    eDP1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 382mm x 215mm
    1920x1080 60.01*+
    1400x1050 59.98
    1280x1024 60.02
    1280x960 60.00
    1024x768 60.00
    800x600 60.32 56.25
    640x480 59.94
    VGA2 disconnected (normal left inverted right x axis y axis)
    DP4 disconnected (normal left inverted right x axis y axis)
    HDMI1 disconnected (normal left inverted right x axis y axis)
    VIRTUAL1 disconnected (normal left inverted right x axis y axis)
    If I plug this screen into the VGA output, it is detected, but the maximum resolution allowed by xrandr is 1920x1080 and my screen is 2560x1080, so the image is stretched.
    My guess is that xrandr sees only the output of my intel card, how can I have it see the outputs supported by my ATI card?
    I also guess that my radeon card is not powered up, because cat /sys/kernel/debug/dri/0/radeon_pm_info gives me.
    PX asic powered off
    I tried DRI_PRIME=1 glxgears and that crashed my X server with a segfault:
    [ 554.701] (EE)
    [ 554.701] (EE) Backtrace:
    [ 554.701] (EE) 0: /usr/bin/X (xorg_backtrace+0x56) [0x58f186]
    [ 554.701] (EE) 1: /usr/bin/X (0x400000+0x192fc9) [0x592fc9]
    [ 554.701] (EE) 2: /usr/lib/libpthread.so.0 (0x7fb2738f1000+0xf4b0) [0x7fb2739004b0]
    [ 554.701] (EE) 3: /usr/lib/xorg/modules/drivers/intel_drv.so (0x7fb26d165000+0x1034a8) [0x7fb26d2684a8]
    [ 554.701] (EE) 4: /usr/bin/X (0x400000+0x15ea73) [0x55ea73]
    [ 554.701] (EE) 5: /usr/bin/X (0x400000+0x15f843) [0x55f843]
    [ 554.701] (EE) 6: /usr/bin/X (DRI2GetBuffersWithFormat+0xb) [0x55fc8b]
    [ 554.701] (EE) 7: /usr/bin/X (0x400000+0x16172b) [0x56172b]
    [ 554.701] (EE) 8: /usr/bin/X (0x400000+0x36b2f) [0x436b2f]
    [ 554.701] (EE) 9: /usr/bin/X (0x400000+0x3ad16) [0x43ad16]
    [ 554.701] (EE) 10: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7fb27255e000]
    [ 554.701] (EE) 11: /usr/bin/X (0x400000+0x250fe) [0x4250fe]
    [ 554.701] (EE)
    It's a bad sign, but I'm fine if we find a workaround to use only the radeon card. What I can read from the Xorg log files before this crash is the following:
    [ 545.834] (II) RADEON(G0): Printing probed modes for output DisplayPort-1-0
    [ 545.834] (II) RADEON(G0): Modeline "2560x1080"x60.0 185.58 2560 2624 2688 2784 1080 1083 1093 1111 +hsync -vsync (66.7 kHz eP)
    So my radeon detects this mode, I just don't see it in xrandr and don't know how to use it. I tried DRI_PRIME=1 xrandr but saw no difference with a standard xrandr.
    Additional information:
    dmesg|grep radeon gives me
    dmesg|grep radeon
    [ 5.097162] [drm] radeon kernel modesetting enabled.
    [ 5.118130] radeon 0000:01:00.0: enabling device (0000 -> 0003)
    [ 10.764770] radeon 0000:01:00.0: VRAM: 2048M 0x0000000000000000 - 0x000000007FFFFFFF (2048M used)
    [ 10.764772] radeon 0000:01:00.0: GTT: 1024M 0x0000000080000000 - 0x00000000BFFFFFFF
    [ 10.764852] [drm] radeon: 2048M of VRAM memory ready
    [ 10.764853] [drm] radeon: 1024M of GTT memory ready.
    [ 10.766910] [drm] radeon/BONAIRE_mc2.bin: 31792 bytes
    [ 10.774998] [drm] radeon: dpm initialized
    [ 10.782479] radeon 0000:01:00.0: WB enabled
    [ 10.782490] radeon 0000:01:00.0: fence driver on ring 0 use gpu addr 0x0000000080000c00 and cpu addr 0xffff8807ff9e9c00
    [ 10.782491] radeon 0000:01:00.0: fence driver on ring 1 use gpu addr 0x0000000080000c04 and cpu addr 0xffff8807ff9e9c04
    [ 10.782492] radeon 0000:01:00.0: fence driver on ring 2 use gpu addr 0x0000000080000c08 and cpu addr 0xffff8807ff9e9c08
    [ 10.782493] radeon 0000:01:00.0: fence driver on ring 3 use gpu addr 0x0000000080000c0c and cpu addr 0xffff8807ff9e9c0c
    [ 10.782494] radeon 0000:01:00.0: fence driver on ring 4 use gpu addr 0x0000000080000c10 and cpu addr 0xffff8807ff9e9c10
    [ 10.782874] radeon 0000:01:00.0: fence driver on ring 5 use gpu addr 0x0000000000076c98 and cpu addr 0xffffc9000a336c98
    [ 10.783472] radeon 0000:01:00.0: fence driver on ring 6 use gpu addr 0x0000000080000c18 and cpu addr 0xffff8807ff9e9c18
    [ 10.783473] radeon 0000:01:00.0: fence driver on ring 7 use gpu addr 0x0000000080000c1c and cpu addr 0xffff8807ff9e9c1c
    [ 10.783487] radeon 0000:01:00.0: irq 46 for MSI/MSI-X
    [ 10.783496] radeon 0000:01:00.0: radeon: using MSI.
    [ 10.783516] [drm] radeon: irq initialized.
    [ 11.056307] radeon 0000:01:00.0: No connectors reported connected with modes
    [ 11.057358] radeon 0000:01:00.0: fb1: radeondrmfb frame buffer device
    [ 11.057360] radeon 0000:01:00.0: registered panic notifier
    [ 11.058383] [drm] Initialized radeon 2.38.0 20080528 for 0000:01:00.0 on minor 0
    Does anyone know how I can use this new screen with my radeon card?
    Thanks in advance!
    Last edited by jolivier (2014-07-02 16:02:01)

    OK, I found a solution by changing my Xorg configuration and adding my radeon card (although the wiki states that it should be unnecessary):
    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        BusId      "PCI:1:0:0"
    EndSection
    Section "Monitor"
        Identifier "DisplayPort-1"
    EndSection
    Section "Monitor"
        Identifier "DisplayPort-2"
    EndSection
    Section "Monitor"
        Identifier "eDP1"
    EndSection
    Section "Screen"
        Identifier "Screen0"
        Device     "Radeon"
        Monitor    "DisplayPort-2"
        SubSection "Display"
            Depth 24
            Modes "2560x1080"
        EndSubSection
    EndSection
    Section "Screen"
        Identifier "Screen1"
        Device     "Radeon"
        Monitor    "DisplayPort-1"
        SubSection "Display"
            Depth 24
            Modes "1920x1080"
        EndSubSection
    EndSection
    Section "Screen"
        Identifier "Screen2"
        Device     "Intel"
        Monitor    "eDP1"
        SubSection "Display"
            Depth 24
            Modes "1920x1080"
        EndSubSection
    EndSection
    Section "ServerLayout"
        Identifier "Default Layout"
        Screen 0 "Screen0"
        Screen 1 "Screen1" RightOf "Screen0"
        Screen 2 "Screen2" RightOf "Screen1"
    EndSection
    (I plugged two external screens into my DisplayPort ports.)
    This works, but I then have two separate screens (:0.0 and :0.1), and the intel driver keeps segfaulting when I use them together or when I try Xinerama. So I disabled my intel monitor and screen, and everything is fine except that I cannot use my laptop screen with my radeon card, so I am left with only two screens out of three. I will investigate this issue more deeply later on.

  • Can I upgrade my Sony Laptop's Graphic Card?

    My laptop is relatively new and I only bought it a few months ago, model #VPCEB. It currently has a 512 MB ATI Radeon HD 5470 which I want to upgrade so I can have StarCraft II run smoothly because it is kind of choppy right now. I just want to know, is there a way I can actually upgrade my computer and what would be a good graphics card?

    iamdre17 wrote:
    My laptop is relatively new and I only bought it a few months ago, model #VPCEB. It currently has a 512 MB ATI Radeon HD 5470 which I want to upgrade so I can have StarCraft II run smoothly because it is kind of choppy right now. I just want to know, is there a way I can actually upgrade my computer and what would be a good graphics card?
    It used to be that some laptops would come with 2-3 options of graphics chipsets, and you could swap between them; manufacturers might also use the same form factor for a series of laptops (for example, all Dell Inspiron 8000 series units used the same card form factor, so you could upgrade an I8000 to the graphics chipset that came with the I8200).  This is no longer the case; even the high-end gaming laptops don't have upgradeable graphics.
