Dual Graphics cards in Quad PPC G5?

I have one GeForce 6600 and want to add a second; is this possible?
If not, is there another card model that can be run as a pair inside the PPC G5 Quad?
If not, where can you buy Nvidia graphics card upgrades for the PPC G5 Quad?

Anyone know where you can get one of these upgrades?
NVIDIA Quadro FX 4500 graphics card
http://www.apple.com/pr/library/2005/oct/19pmg5.html
Malcolm's last link is the Quadro FX 4500. However, unless you need its unique features, the 7800GT (the next-to-last link in Malcolm's post) is much better bang for the buck. The 7800 isn't much slower for most things, but costs half as much as the Quadro.

Similar Messages

  • Photoshop CS6 has problems with my dual graphics cards

    I'm currently using Photoshop CS6 on my laptop (HP Pavilion dv4 3114), and this laptop has dual graphics cards (one is the mobile Intel HD Graphics, the other is the ATI HD 6750). It usually switches to the faster one when games are running or for software that requires a lot of graphics work. This time I have already set the graphics card profile to let Photoshop use the faster one, but when I launch Photoshop CS6 it tells me that the graphics card is not officially supported, as you can see in the screenshot.
    Then I went to Photoshop's Preferences (Performance) and found it only identifies the mobile Intel HD Graphics. I'm currently using Photoshop for designing large posters, so performance is very important to me. Does anyone know how to solve this problem so that Photoshop can run with the highest performance?

    I can't imagine how these computer companies expect to fully integrate GPUs from two different vendors, but they're trying.  To me, knowing a bit about how display drivers are implemented, it seems an impossible task.
    I have heard (in the context of Macs that have this option) that disabling the low-power options so that only the more powerful GPU is active can help.  I don't know if that can apply to your system, but it might give you an idea of a place to start looking.
    Another thing might be to try to find whatever settings it offers for "affinity" and set it so Photoshop and its cousins use just the ATI GPU exclusively.  It might be the "switchover" process that is causing the problems.
    Good luck.
    -Noel

  • How to get dual graphics cards to work together?

    I have an L series Satellite with dual graphics cards that won't work together. It's an AMD 3400M APU with an integrated HD 6520G and a discrete 6400M graphics card.
    The 6400M never turns on. I have used AMD System Monitor to see if it is ever used, but it never is.
    Even when the 6520G is at full load it won't turn on.
    Any suggestions would be helpful.
    Thanks.

    Hi
    As far as I know, this topic has already been discussed here in the forum, and switching between the internal and discrete ATI GPUs is not supported.
    Only the Optimus technology designed by nVidia is supported,
    so you can switch between the Intel GPU and the nVidia GPU (provided both support Optimus).

  • Dual graphics card in M92 Tower?

    Hello,
    I have a manager here requesting 3 monitors for each of their department's workstations (Lenovo M92 Tower, Intel i5), and I am just wondering if the M92 supports dual graphics cards. I am aware that it only has a PCIe x1 and a PCIe x16 slot; the plan is to install two x1 cards, if they are supported.
    Could someone confirm if this will work or not?
    Thanks!

    First, I have no experience at all with more than two monitors on any PC.
    However... AMD's Eyefinity setup can support three (or more) monitors using the multiple connectors on a single suitable video card.  For example, the relatively inexpensive AMD R7 250 card comes in both full-size and low-profile varieties, to fit inside an M92p case (either mid-tower or SFF) in the PCIe x16 slot (which is what you want to use).  The low-profile R7 250 DDR5 card from Sapphire (which includes both low-profile and full-size brackets in the retail box, so you can use the card in either M92p case size) has three digital connectors on it: DVI, miniDP, and microHDMI.  So you can connect three monitors to them. The retail box also includes adapters for the microHDMI and miniDP connectors, but I'm not a fan of adapters; I'd personally just buy a proper straight-through cable (say from Monoprice or another online source) with the correct connector at each end for each of your three monitors.
    According to the Eyefinity setup instructions, you use two connectors to go to two monitors, and the third (or more) monitors must connect on the DisplayPort path.  In your case you only have three monitors, so you just use all three digital connectors on the R7 250 and you're done!  No need for two graphics cards, and you get high-performance DDR5, as you want, on three monitors.
    Again... I've never done this myself, but Eyefinity for three or more monitors is standard technology given an AMD card like the R7 250 sitting in a single PCIe x16 slot.

  • Dual graphics cards and single 4K display - Mac Pro (late 2013)

    I'm considering a new Mac Pro as my next graphics/development workstation. I see that the new Mac Pros have dual graphics cards (i.e., dual D300, D500, or D700). I was wondering if there is any benefit to having the second graphics card if I have only one 4K display connected via DisplayPort 1.2. Would the two cards work in tandem to output to the single 4K display? Can the second GPU still be used for OpenCL, even if it is not connected to a display?
    Thanks

    There is no option to have only one Dxxx card; yes, some apps do use the second GPU (for OpenCL, for example) even without a display attached; and no, the number of displays does not affect video output.
    Whether it is right for you versus, say, a maxed-out iMac or something else is something you should continue to research and consider.
    See www.barefeats.com and www.macperformanceguide.com, as well as other forums and sites.

  • HP Envy H8-1540t - dual graphics cards, PCIe 3.0 support

    I have 2 questions about using dual graphics cards and about PCIe 3.0 support on my machine.
    The machine is:
    HP Envy H8-1540t
    Win 8 64
    i7-3820 2nd Gen
    10 GB DDR3 1600
    256 GB SSD
    1 TB 7200 rpm SATA hard drive
    Blu-Ray Player
    TV Tuner
    Premium wireless N-LAN and Bluetooth 2x2
    600 Watt PSU
    NVidia GTX 660
    This machine uses the Pegatron IPIWB-PB motherboard and has 2 PCIe x16 slots. I realize that by using dual-width GPUs like the GTX 660, the smaller PCIe slots next to them will be covered, rendering them unusable. So these are my 2 questions:
    1.) Will 2 Nvidia GTX 660 GPUs physically fit within the machine and be supported?
    2.) Does this motherboard, with its Intel X79 chipset, support PCIe 3.0?
    Thank You

    Hi,
    You can find all the information at the following official link; it looks like the board only supports PCIe Gen 2.0:
       http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&lc=en&dlc=en&docname=c03132964#N241
    From the image of the motherboard (above) and the following image of the GTX 660, the simple answer is that 2 cards won't physically fit:
       http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660/product-images
    Regards.
    BH

  • Looking for Nvidia GeForce 6600 OR Similar Graphics Card for Quad G5 PPC

    Hi there!
    Wishing a Happy New Year to y'all!
    Was wondering whether someone might help me out with a minor issue I'm having...
    I have recently acquired a Quad 2.5GHz PPC G5 (with PCIe). When I bought it, it came installed with an NVIDIA Quadro FX 4500 graphics card, which works just fine. I've heard these cards are very good for 3D rendering, etc...
    However... As I am an audio engineer, I don't need a flashy graphics card... Instead I need "flashy" sound cards...
    I have two PCIe UAD-2 Quad cards for all my UAD plugin needs within Logic Pro... And I have a PCIe Symphony X card for my Apogee (AD16X & DA16X) front-end sound conversion. All of which sound and work very well... So far so good!
    But here's the problem... My Quad PPC has 4 PCIe slots, which should be fine for my arrangement, i.e. one slot for the graphics card, one for the PCIe Symphony card, two for the UAD-2 Quad cards... BUT the damnable NVIDIA Quadro FX 4500 takes up two PCIe slots!!!
    I have looked online to find out what the Quad 2.5GHz G5 PPC was originally supplied with: an Apple OEM Nvidia GeForce 6600. However, these are rare and hard to come by nowadays... And I can't wait forever till one pops up for sale.
    So... Can anyone recommend a graphics card that will work in the Quad 2.5GHz G5 PPC... AND take up only one PCIe slot and be able to run a 30 inch DVI Apple display, i.e. it needs a dual-link DVI port???
    I'd even do an outright swap of the NVIDIA Quadro FX 4500 for a good-condition, working Apple OEM Nvidia GeForce 6600!? I've heard this is more than a fair deal...
    Cheers in advance!

    Hi-
    You can buy a Geforce 6600LE at We Love Macs. They aren't cheap, though.
    This eBay seller has several of the 6600LE cards, at a much better price. The seller is very reliable (from experience) and carries the positive feedback to support my recommendation.
    This 7800GT is also a good possibility. It takes only one slot, and provides much better performance than the 6600LE.
    This flashed 7800GT is also an option. A flashed card is one that has the PC ROM replaced with Mac ROM. Again, this seller knows his stuff.....
    As with the 6600LE, the 7800GT has a dual link DVI to support the 30" ACD.
    After you get a replacement card, you might try to get a few quid for the FX 4500 on eBay.
    An original will sell better than the flashed models that are selling there.

  • Looking for an Nvidia GeForce 6600 OR Similar Type Graphics Card for Quad G5

    Hi there!
    Wishing a Happy New Year to y'all!
    Was wondering whether someone might help me out with a minor issue I'm having...
    I have recently acquired a Quad 2.5GHz PPC G5. When I bought it, it came installed with an NVIDIA Quadro FX 4500 graphics card, which works just fine. I've heard these cards are very good for 3D rendering, etc...
    However... As I am an audio engineer, I don't need a flashy graphics card... Instead I need "flashy" sound cards...
    I have two PCIe UAD-2 Quad cards for all my UAD plugin needs within Logic Pro... And I have a PCIe Symphony X card for my Apogee (AD16X & DA16X) front-end sound conversion. All of which sound and work very well... So far so good!
    But here's the problem... My Quad PPC has 4 PCIe slots, which should be fine for my arrangement, i.e. one slot for the graphics card, one for the PCIe Symphony card, two for the UAD-2 Quad cards... BUT the damnable NVIDIA Quadro FX 4500 takes up two PCIe slots!!!
    I have looked online to find out what the Quad 2.5GHz G5 PPC was originally supplied with: an Apple OEM Nvidia GeForce 6600. However, these are rare and hard to come by nowadays... And I can't wait forever till one pops up for sale.
    So... Can anyone recommend a graphics card that will work in the Quad 2.5GHz G5 PPC... AND take up only one PCIe slot and be able to run a 30 inch DVI Apple display, i.e. it needs a dual-link DVI port???
    Cheers in advance!
    Message was edited by: Polynomial

    Hi, Polynomial. Try posting the question in one of the G5 forums. Most MacBook Pro owners aren't likely to be of much help with it.

  • G5 Dual Graphics card

    I have a G5 Dual (PCI-X, I believe) as well as a G5 Quad (PCIe).
    I've just replaced the 6500 graphics card in the Quad; can I put it in the Dual?

    The G5 Dual will have an AGP graphics card slot, I'm afraid.
    That's presuming it isn't a dual-core "Late 2005" model:
    http://support.apple.com/specs/powermac/PowerMac_G5_Late2005.html

  • LAPTOP Dual Graphics Card and CPU scaling (with bash script)

    Hey all. Basically, getting Arch working with my laptop was a pain due to the lack of power options and graphics control, especially using the open source drivers. My laptop would overheat because both the dedicated and integrated graphics cards were running at the same time and my CPUs were running at 100%. After a long while of looking around, I finally found a solution, and being the nice guy I am, I decided to make a script to streamline the process for most people. It mounts the debugging filesystem, adds it to fstab, installs the necessary tools, loads the correct module, and also lets you change power plans, as well as check battery, graphics card, and CPU status. This is basically version one, so I guess I'll add a bit to it over time.
    *** MAKE SURE KMS IS ENABLED ON YOUR KERNEL GRUB/SYSLINUX LINE, E.G.: "radeon.modeset=1"
    ******ERROR CHECKING:
    If you have the debug fs mounted already, unmount it with umount /sys/kernel/debug
    If you get an error modprobing, check which modules are available for your CPU with ls /lib/modules/$(uname -r)/kernel/drivers/cpufreq/
    With the debugging fs mounted, run cat /sys/kernel/debug/vgaswitcheroo/switch to find out what your graphics adapters are named, and if needed replace IGD and DIS with yours
    You may have to modify some parts of the program, but I tried my best to make it as easy as I could.
    Installation:
    Copy it and save it as foo.sh, make it executable, and RUN AS ROOT:
    chmod 777 foo.sh
    ./foo.sh
    #!/bin/bash
    # By: Dominic dos Santos
    # [email protected]
    # mount -t debugfs none /sys/kernel/debug    -- mount the debugging fs
    # echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch && echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch   -- onboard graphics
    # echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch && echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch   -- dedicated graphics
    # cpufreq-set -c 3 -g powersave    # -- powersave cpu frequency governor
    # cpufreq-set -c 3 -g performance  # -- performance
    # cpufreq-set -c 3 -g ondemand     # -- ...
    # !!!PLEASE NOTE!!! I am using a quad-core laptop, therefore I have the '-c 3' argument. Cores are numbered: 0 1 2 3
    # Dual core would be '-c 1', 6-core would be '-c 5'
    echo "RUNNING THIS WITH X RUNNING WILL NOT MODIFY THE GRAPHICS CARD SETTINGS"
    # Check if the debugging fs is mounted; if not, mount it and do the first-time setup.
    if [ -f /sys/kernel/debug/vgaswitcheroo/switch ]; then
        echo "DEBUGFS is mounted, continuing with program"
    else
        read -p "Press ENTER to mount the debugging directory (REQUIRED)"
        mount -t debugfs none /sys/kernel/debug # the mount fs command
        echo "Add to fstab? (y/n)"
        read fs
        if [ "$fs" == "y" ]; then
            echo "debugfs /sys/kernel/debug debugfs 0 0" >> /etc/fstab # add the required line to fstab
        fi
        read -p "We are now going to install the cpu drivers and load the required modules. Press ENTER."
        pacman -S cpufrequtils
        echo "Do you have an [a]MD or [i]ntel cpu?" # load the correct module now
        read input
        if [ "$input" == "a" ]; then # AMD
            modprobe powernow-k8
        elif [ "$input" == "i" ]; then # Intel
            modprobe acpi-cpufreq
        fi
        echo "REMEMBER TO ADD acpi-cpufreq cpufreq_powersave cpufreq_ondemand cpufreq_performance to the MODULES=() line in your rc.conf ****** FOR INTEL CPUs ONLY!"
        echo "OR powernow-k8 cpufreq_powersave cpufreq_ondemand cpufreq_performance ****** FOR AMD CPUs ONLY!"
    fi
    # menu
    echo -e "Welcome to my CPU and Videocard Power and Performance Switcherooer"
    echo " 1: Powersave"
    echo " 2: On Demand"
    echo " 3: Performance"
    echo " 4: Check Status"
    echo "Please select an option"
    read input
    if [ "$input" = 1 ]; then
        # Powersave
        # Set the CPU governor to "powersave", switch VGA to onboard and power off the unused (dedicated) card
        cpufreq-set -c 3 -g powersave
        echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch # "OFF" cuts power to the card that isn't selected
    elif [ "$input" = 2 ]; then
        # On Demand
        # Set the CPU governor to "ondemand", switch VGA to onboard and power off the unused (dedicated) card
        cpufreq-set -c 3 -g ondemand
        echo "IGD" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 3 ]; then
        # Performance
        # Set the CPU governor to "performance", switch VGA to the dedicated card and power off the onboard one
        cpufreq-set -c 3 -g performance
        echo "DIS" > /sys/kernel/debug/vgaswitcheroo/switch
        echo "OFF" > /sys/kernel/debug/vgaswitcheroo/switch
    elif [ "$input" = 4 ]; then # status check
        echo " 1: Battery"
        echo " 2: Graphics Card"
        echo " 3: CPU Info"
        read status
        if [ "$status" = 1 ]; then # battery
            acpi
            read -p "Press Enter"
        elif [ "$status" = 2 ]; then # graphics card
            cat /sys/kernel/debug/vgaswitcheroo/switch
            read -p "Press Enter"
        elif [ "$status" = 3 ]; then # CPU info
            cpufreq-info
            read -p "Press Enter"
        fi
    fi
    Last edited by imagoose (2012-02-15 22:51:13)

    That's great, thank you.  I have an older Dell Studio XPS 13 which has NVIDIA Hybrid SLI.  Its current power usage in Arch is killing me (about an hour and a half, whereas I can get 3-4 hrs in Win7).  Right now I am doing all the work through the integrated graphics card, per my xorg.conf, but I don't think I've managed to disable the discrete card yet. When I get on the laptop I'll let you know how it goes.
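    For anyone in the same boat, here is a minimal sketch of checking and powering off the unused discrete GPU by hand with vgaswitcheroo, assuming the debugfs mount and the IGD/DIS naming discussed above (older Hybrid SLI machines may not expose vgaswitcheroo at all, so treat this as something to try rather than a guaranteed fix):
    # run as root; assumes debugfs is already mounted at /sys/kernel/debug
    cat /sys/kernel/debug/vgaswitcheroo/switch        # list the GPUs; the active one is marked with '+'
    echo OFF > /sys/kernel/debug/vgaswitcheroo/switch # power down whichever GPU is currently not in use
    cat /sys/kernel/debug/vgaswitcheroo/switch        # the inactive card should now report Off
    If the discrete card is still marked active, switching to the integrated one first (echo IGD > .../switch, with X not using the discrete card) before the OFF write is the approach the script above takes.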

  • [K7N2] Trouble with dual graphics cards

    I have a K7N2-DeltaL (Barton 2600+) which I have used for a while with a Radeon 9600 AGP at 8x. To use multiple displays I installed a Radeon 9200 PCI graphics card.
    However, I am having trouble getting it to work. The new card works fine alone, both in this and another computer. It also works fine together with another AGP card (GeForce2) in an Abit K7T with a Duron 800.
    I have only gotten it to work once, for a few minutes, long enough to install drivers etc. It ran for a while but ended in a BSOD (endless loop). Otherwise, one of the following scenarios occurs:
    1) with "Init AGP first" the computer boots but fails to initialize the PCI-video, works ok otherwise.
    2) with "Init PCI first" the computer fails to boot (stops in "Check RTC" according to D-bracket LEDs, or come to think about it, I think it actually only shows one red and no green, have to check that when I am back at it)
    3) with "Init PCI First" the computer POSTs but shuts down and can't be started without removing the power cord
    If after 3) I remove the *AGP* card, it still won't boot until I remove the power cable. After this it works fine with just the PCI-card.
    I am dual booting with XP Professional and WinME, but as it doesn't even get that far I am failing to see that this might matter.
    I am thinking "PSU", but am amazed that it worked in the other computer with the older PSU and not in the new 300W (have not found any Amp-specs).
    Any advice?

    I would have thought that the PCI card used the same power whether it is "init first" or not. I would also have thought that a power problem would more likely show up as an unstable computer rather than an unbootable one.
    Could a few more amps on the PCI bus really make that much (consistent) difference?

  • Need Xorg config file for dual graphics cards, triple monitors

    Hey guys, I'm having a bit of trouble getting triple monitors working on my system. The two plugged into the graphics card (Radeon X300) work fine, but the one plugged into the onboard graphics (GeForce 6150SE) refuses to appear in xrandr. I figure I need to make an /etc/X11/xorg.conf.d/10-monitor.conf file, but I'm confused about how to do it with separate cards. I find the wiki instructions confusing, as I've never had to deal with xorg files before (it has always autoconfigured for me with no problems).
    Relevant output:
    [greg@spacebar ~]$ lspci | grep VGA
    00:0d.0 VGA compatible controller: nVidia Corporation C61 [GeForce 6150SE nForce 430] (rev a2)
    02:00.0 VGA compatible controller: ATI Technologies Inc RV370 5B60 [Radeon X300 (PCIE)]
    [greg@spacebar ~]$ xrandr
    Screen 0: minimum 320 x 200, current 2960 x 1050, maximum 4096 x 4096
    VGA-0 connected 1280x1024+1680+0 (normal left inverted right x axis y axis) 338mm x 270mm
    1280x1024 60.0*+ 75.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    DVI-0 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 430mm x 270mm
    1680x1050 59.9*+
    1280x1024 75.0 60.0
    1152x864 75.0
    1024x768 75.1 60.0
    800x600 75.0 60.3
    640x480 75.0 60.0
    720x400 70.1
    S-video disconnected (normal left inverted right x axis y axis)
    [greg@spacebar ~]$ pacman -Q | grep xf86-video
    xf86-video-ati 6.14.3-1
    xf86-video-nouveau 0.0.16_git20120106-1
    xf86-video-vesa 2.3.0-7
    [greg@spacebar ~]$ cat /etc/mkinitcpio.conf | grep MODULES=
    MODULES="radeon nouveau"
    I just can't seem to understand how to write the 10-monitor.conf file. I was wondering if anyone could give me a hand?
    Also, the third monitor, to be connected via onboard VGA, is also 1280x1024 and will sit to the left of the other two.

    Depends on the motherboard. This one allows you to enable the onboard graphics only when no external card is found, or to always enable it. The PCIe card initialises first, if that helps.
    EDIT: Also, I can confirm that it can run both cards (all three monitors) at the same time by switching the BIOS to initialise the onboard card first; however, that makes some weird stuff happen, so I switched it back. Also, the nouveau driver shows up in lsmod and gets detected in the boot message logs.
    Last edited by MisterAnderson (2012-01-26 16:56:58)
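    No complete config was posted in the thread, but a minimal sketch of what an /etc/X11/xorg.conf.d/10-monitor.conf for two separate cards could look like is below, assuming the BusIDs from the lspci output above (02:00.0 for the Radeon, 00:0d.0 for the onboard GeForce, written in Xorg's decimal notation); the identifiers are illustrative names, not tested values:
    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        BusID      "PCI:2:0:0"     # 02:00.0 from lspci
    EndSection
    Section "Device"
        Identifier "Onboard"
        Driver     "nouveau"
        BusID      "PCI:0:13:0"    # 00:0d.0 from lspci (0d hex = 13 decimal)
    EndSection
    Section "Screen"
        Identifier "Screen0"
        Device     "Radeon"
    EndSection
    Section "Screen"
        Identifier "Screen1"
        Device     "Onboard"
    EndSection
    Section "ServerLayout"
        Identifier "Layout0"
        Screen 0 "Screen0"
        Screen 1 "Screen1" LeftOf "Screen0"
    EndSection
    Note that two Screen sections like this give two separate X screens (:0.0 and :0.1) rather than one spanned desktop unless Xinerama is enabled, which is the same trade-off the Dell M6800 thread below ran into.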

  • Photoshop CC - force selection of graphics card when dual graphics cards are in use

    I have Photoshop CC on Win 8 with 2 graphics cards: the inbuilt one and an extra card.
    This enables me to utilise 4 monitors in my dev environment.
    It is not practical to remove a card as I need all 4 monitors.
    I have tested disabling each card (one at a time), and Photoshop finds the remaining GPU and works fine.
    Q. Is there a fix to force Photoshop to select a chosen graphics card?
    Looking around the web, this seems to be a common problem.
    Regards,
    Chris

    As per the tech specifications of Photoshop CC, you need at least 512MB of VRAM on the installed graphics card.
    Tech Specification Photoshop CC:
    http://www.adobe.com/in/products/photoshop/tech-specs.html
    Graphics: ATI Radeon HD 2600 XT, 256 MB
    The graphics card installed on the machine has only 256MB of VRAM.
    That is why Photoshop gives you the warning.

  • Cannot detect outputs on dual graphics card configuration

    Hi,
    I have a Dell Precision M6800 which has these graphics cards:
    00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
    01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Saturn XT [FirePro M6100]
    My ATI graphics card is supported by the xf86-video-ati driver according to http://www.x.org/wiki/RadeonFeature/ .
    Since I have multiple cards, I followed the instructions at https://wiki.archlinux.org/index.php/PRIME and I have
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    OpenGL renderer string: Gallium 0.4 on AMD BONAIRE
    But when I plug in my screen using my laptop's DisplayPort, I don't see it in xrandr:
    Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
    eDP1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 382mm x 215mm
    1920x1080 60.01*+
    1400x1050 59.98
    1280x1024 60.02
    1280x960 60.00
    1024x768 60.00
    800x600 60.32 56.25
    640x480 59.94
    VGA2 disconnected (normal left inverted right x axis y axis)
    DP4 disconnected (normal left inverted right x axis y axis)
    HDMI1 disconnected (normal left inverted right x axis y axis)
    VIRTUAL1 disconnected (normal left inverted right x axis y axis)
    If I plug this screen into the VGA output, it is detected, but the maximum resolution allowed by xrandr is 1920x1080 and my screen is 2560x1080, so the image is stretched.
    My guess is that xrandr sees only the outputs of my Intel card; how can I get it to see the outputs supported by my ATI card?
    I also guess that my radeon card is not powered up, because cat /sys/kernel/debug/dri/0/radeon_pm_info gives me:
    PX asic powered off
    I tried DRI_PRIME=1 glxgears and that crashed my X server with a segfault:
    [ 554.701] (EE)
    [ 554.701] (EE) Backtrace:
    [ 554.701] (EE) 0: /usr/bin/X (xorg_backtrace+0x56) [0x58f186]
    [ 554.701] (EE) 1: /usr/bin/X (0x400000+0x192fc9) [0x592fc9]
    [ 554.701] (EE) 2: /usr/lib/libpthread.so.0 (0x7fb2738f1000+0xf4b0) [0x7fb2739004b0]
    [ 554.701] (EE) 3: /usr/lib/xorg/modules/drivers/intel_drv.so (0x7fb26d165000+0x1034a8) [0x7fb26d2684a8]
    [ 554.701] (EE) 4: /usr/bin/X (0x400000+0x15ea73) [0x55ea73]
    [ 554.701] (EE) 5: /usr/bin/X (0x400000+0x15f843) [0x55f843]
    [ 554.701] (EE) 6: /usr/bin/X (DRI2GetBuffersWithFormat+0xb) [0x55fc8b]
    [ 554.701] (EE) 7: /usr/bin/X (0x400000+0x16172b) [0x56172b]
    [ 554.701] (EE) 8: /usr/bin/X (0x400000+0x36b2f) [0x436b2f]
    [ 554.701] (EE) 9: /usr/bin/X (0x400000+0x3ad16) [0x43ad16]
    [ 554.701] (EE) 10: /usr/lib/libc.so.6 (__libc_start_main+0xf0) [0x7fb27255e000]
    [ 554.701] (EE) 11: /usr/bin/X (0x400000+0x250fe) [0x4250fe]
    [ 554.701] (EE)
    It's a bad sign, but I'm fine with a workaround that only uses the radeon card. What I can read from the Xorg log file before the crash is the following:
    [ 545.834] (II) RADEON(G0): Printing probed modes for output DisplayPort-1-0
    [ 545.834] (II) RADEON(G0): Modeline "2560x1080"x60.0 185.58 2560 2624 2688 2784 1080 1083 1093 1111 +hsync -vsync (66.7 kHz eP)
    So my radeon card detects this mode; I just don't see it in xrandr and don't know how to use it. I tried DRI_PRIME=1 xrandr but saw no difference from a standard xrandr.
    Additional information:
    dmesg | grep radeon gives me:
    [ 5.097162] [drm] radeon kernel modesetting enabled.
    [ 5.118130] radeon 0000:01:00.0: enabling device (0000 -> 0003)
    [ 10.764770] radeon 0000:01:00.0: VRAM: 2048M 0x0000000000000000 - 0x000000007FFFFFFF (2048M used)
    [ 10.764772] radeon 0000:01:00.0: GTT: 1024M 0x0000000080000000 - 0x00000000BFFFFFFF
    [ 10.764852] [drm] radeon: 2048M of VRAM memory ready
    [ 10.764853] [drm] radeon: 1024M of GTT memory ready.
    [ 10.766910] [drm] radeon/BONAIRE_mc2.bin: 31792 bytes
    [ 10.774998] [drm] radeon: dpm initialized
    [ 10.782479] radeon 0000:01:00.0: WB enabled
    [ 10.782490] radeon 0000:01:00.0: fence driver on ring 0 use gpu addr 0x0000000080000c00 and cpu addr 0xffff8807ff9e9c00
    [ 10.782491] radeon 0000:01:00.0: fence driver on ring 1 use gpu addr 0x0000000080000c04 and cpu addr 0xffff8807ff9e9c04
    [ 10.782492] radeon 0000:01:00.0: fence driver on ring 2 use gpu addr 0x0000000080000c08 and cpu addr 0xffff8807ff9e9c08
    [ 10.782493] radeon 0000:01:00.0: fence driver on ring 3 use gpu addr 0x0000000080000c0c and cpu addr 0xffff8807ff9e9c0c
    [ 10.782494] radeon 0000:01:00.0: fence driver on ring 4 use gpu addr 0x0000000080000c10 and cpu addr 0xffff8807ff9e9c10
    [ 10.782874] radeon 0000:01:00.0: fence driver on ring 5 use gpu addr 0x0000000000076c98 and cpu addr 0xffffc9000a336c98
    [ 10.783472] radeon 0000:01:00.0: fence driver on ring 6 use gpu addr 0x0000000080000c18 and cpu addr 0xffff8807ff9e9c18
    [ 10.783473] radeon 0000:01:00.0: fence driver on ring 7 use gpu addr 0x0000000080000c1c and cpu addr 0xffff8807ff9e9c1c
    [ 10.783487] radeon 0000:01:00.0: irq 46 for MSI/MSI-X
    [ 10.783496] radeon 0000:01:00.0: radeon: using MSI.
    [ 10.783516] [drm] radeon: irq initialized.
    [ 11.056307] radeon 0000:01:00.0: No connectors reported connected with modes
    [ 11.057358] radeon 0000:01:00.0: fb1: radeondrmfb frame buffer device
    [ 11.057360] radeon 0000:01:00.0: registered panic notifier
    [ 11.058383] [drm] Initialized radeon 2.38.0 20080528 for 0000:01:00.0 on minor 0
    Does anyone know how I can use this new screen with my radeon card?
    Thanks in advance!
    Last edited by jolivier (2014-07-02 16:02:01)

    OK, I found a solution by changing my Xorg configuration and adding my radeon card (although the wiki states that this should be unnecessary):
    Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    BusId "PCI:1:0:0"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-1"
    EndSection
    Section "Monitor"
    Identifier "DisplayPort-2"
    EndSection
    Section "Monitor"
    Identifier "eDP1"
    EndSection
    Section "Screen"
    Identifier "Screen0"
    Device "Radeon"
    Monitor "DisplayPort-2"
    SubSection "Display"
    Depth 24
    Modes "2560x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen1"
    Device "Radeon"
    Monitor "DisplayPort-1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "Screen"
    Identifier "Screen2"
    Device "Intel"
    Monitor "eDP1"
    SubSection "Display"
    Depth 24
    Modes "1920x1080"
    EndSubSection
    EndSection
    Section "ServerLayout"
    Identifier "Default Layout"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    Screen 2 "Screen2" RightOf "Screen1"
    EndSection
    (I plugged two external screens into my DisplayPort ports.)
    This works, but I then have two different screens (:0.0 and :0.1), and the intel driver keeps segfaulting when I use them together or when I try Xinerama. So I disabled my intel monitor and screen, and everything is fine except that I cannot use my laptop screen with my radeon card, so I am left with only two screens out of three. I will investigate this issue more deeply later on.
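    For what it's worth, a route that might avoid separate X screens entirely is xrandr's provider output source mechanism ("reverse PRIME"), where the Intel GPU keeps doing the rendering and the radeon GPU's connectors are used purely as display sinks. A minimal sketch, assuming the provider names reported on this machine are "Intel" and "radeon" (check the real names with the first command; driver support for this varies):
    # list render/output providers and the names X gives them
    xrandr --listproviders
    # use the radeon provider's outputs as sinks for images rendered by the Intel provider
    xrandr --setprovideroutputsource radeon Intel
    # the DisplayPort outputs wired to the FirePro should now show up in plain xrandr
    xrandr
    If that works, the external 2560x1080 screen can then be arranged with the usual xrandr --output ... --mode ... options instead of a hand-written ServerLayout.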

  • Dual Graphics cards 7300 & 8800 in 1st gen

    Does it work to have both video cards in my Mac Pro, so I can dual boot? The newly
    released 8800 GT only works with Leopard, so I plan to install Leopard on a separate drive
    and then just use the original 7300 card for Tiger.
    Anyone know?

    Well, I currently use one display... I have another one that I don't use; I am fine with one for now. Here are the specs:
    Connectivity
    Input Video Compatibility: Analog RGB, Digital
    Connectors: D-Sub (1), DVI (1), HDMI (1), Video (CVBS, S-video, Component)
    Would I manually have to change the cable in the back? I am looking to see if there is some form of DVI splitter. There are ones that go from one card to multiple monitors, but I need one monitor connected to multiple cards. I use DVI now; or maybe I could use another input on my monitor... not sure which one, though.
