T530 buggy dual GPUs

I purchased a somewhat souped-up T530 with the sole purpose of making it my entertainment laptop, connecting it to a big-screen LCD via DisplayPort to watch internet streaming video and DVDs.  The T530 came with two GPUs: an NVIDIA NVS 5400M and an "onboard" Intel HD Graphics 4000.  From the beginning I had buggy behaviour in the way the two GPUs hand off to each other, but after some experimentation I may have finally fixed my problem.  Since I had very little luck finding any documentation of the issues I was having, I am posting this here for others.
Since day 1, the NVIDIA Control Panel application (which you must use to configure multiple displays) has listed the DisplayPort output (the TV) as being under the control of the NVS 5400M and the laptop display as being under the control of the Intel HD Graphics 4000.  Having the two displays controlled by different GPUs caused a bunch of problems, such as displays suddenly blacking out in the middle of movies and DVDs playing on only one screen or the other (never both).  I monkeyed around with this for months and just assumed that was how it was.  See the screenshot below for how the NVIDIA Control Panel saw the situation:
Link to image 1
I was also having power management issues (another thread here) and someone suggested that I change the GPU configuration in the BIOS.  I had never done this before, so I went into Config - Graphics Device, where there are three choices, none of which is particularly informative:
1. Integrated Graphics mode will achieve longer battery life.
2. Discrete Graphics mode will achieve higher graphics performance.
3. NVIDIA Optimus mode runs as Integrated Graphics mode and Discrete Graphics is enabled on demand.  NOTE: NVIDIA Optimus mode should not be used if using XP.
I had apparently been running with option 3, NVIDIA Optimus mode, which seemed to make the two displays choppy: if the laptop display woke up from a black screen, it would black out a second time for a few seconds, as if the two GPUs were trying to sync up with each other.  Back to the BIOS - I changed the setting to option 2, Discrete Graphics mode, and rebooted.
The result was that the multiple-display blackouts stopped happening, but the laptop display and the DisplayPort display still came up under the control of different GPUs.  So one of my problems was resolved, but I still couldn't watch DVDs simultaneously on both displays (a feature called Theatre Mode on AMD GPUs).
I also tried monkeying around with the Intel HD 4000 control panel, which never allowed multiple displays and seemed set on controlling only the laptop screen.  I don't think the Intel HD 4000 even has access to the DisplayPort.  It is very confusing having two graphics control panels, one from NVIDIA and one from Intel; only the NVIDIA Control Panel allows any manipulation of a secondary screen, btw.
So, I went back into the BIOS again to see what would happen if I chose option 1, Integrated Graphics mode.  This time, on reboot, the NVIDIA Control Panel wouldn't even launch, so the NVIDIA GPU must be turned off in this mode.  The only option then is the Intel control panel, which won't recognize secondary screens at all.  So apparently with Integrated Graphics mode you lose the ability to use the DisplayPort entirely.
I went back yet again and changed the BIOS to option 2, Discrete Graphics mode, thereby putting the NVIDIA NVS 5400M back in business.  But guess what: this time the NVIDIA Control Panel showed both the laptop screen and the DisplayPort under the control of the NVIDIA GPU only.  This resolved the DVD problem; DVDs now play simultaneously on both displays since both are under the control of a single GPU.
Link to image 2
I have found this whole experience (months, actually) with the dual GPUs to be very poorly documented and very buggy.  To think that I only corrected my problem by changing the BIOS settings four times is incredible to me.  Anyway, I suppose this post will just float down into obscurity, but I had been having ZERO luck finding anything about this topic on the boards.
Moderator note: large image(s) converted to link(s):  About Posting Pictures In The Forums

Even more on this topic.  I ended up rebooting for an unrelated reason and found my display adapter configuration back in the split scenario, with the Intel HD 4000 in charge of the laptop display and the NVIDIA NVS 5400M in charge of the DisplayPort.  I was surprised to see the undesired configuration that had been causing me so many problems return.
It turns out there is also a secondary setting in the BIOS under Config - Graphics Device, right beneath the three choices for the graphics mode I mentioned before.  The setting is called OS Detection for NVIDIA Optimus and it says "If Enabled, System BIOS automatically switches Graphics device setting to NVIDIA Optimus Mode if the OS supports this feature, and to Discrete Graphics mode if the OS does not support it".
So, my secondary setting had been ENABLED, meaning my request for Discrete Graphics mode (#2 above) was being changed automatically back to NVIDIA Optimus mode (#3 above).  I changed this setting to DISABLED, changed the mode back to Discrete Graphics (#2), and have rebooted a few times.  It looks like it should now stay in Discrete Graphics mode for good, I hope.
Just playing around with this BIOS setting a little, I can boil it down to this:
If Integrated Graphics mode is chosen, only the Intel HD 4000 GPU is active in Device Manager; the NVIDIA GPU is completely deactivated.  With this mode there is no ability to drive a second display at all.  I suppose the idea here is that you are running entirely on battery power and wouldn't have any secondary displays attached anyway.
If Discrete Graphics mode is chosen, only the NVIDIA NVS 5400M GPU is active in Device Manager; the Intel HD 4000 GPU is completely deactivated.  With this mode a second display can be driven via the DisplayPort, and the NVIDIA GPU takes charge of both the primary and secondary displays.  While it may draw the most power, it works the best for actually viewing video.
If NVIDIA Optimus mode is chosen, both the Intel HD 4000 GPU and the NVIDIA NVS 5400M GPU are active in Device Manager.  With this mode, each GPU takes charge of one of the displays and the two must try to coordinate with each other on video output and power consumption (I have found this to be buggy).  Also, the OS Detection switch mentioned above makes this the default setting unless you specifically tell it not to be.
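A quick way to double-check which mode actually took effect after a reboot (just a rough sketch; this assumes the wmic tool that ships with Windows 7) is to list the video controllers Windows currently sees from a command prompt:
wmic path win32_VideoController get Name,Status
In Discrete Graphics mode only the NVS 5400M should be listed, in Integrated Graphics mode only the Intel HD 4000, and in NVIDIA Optimus mode both, matching the Device Manager behaviour described above.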
I am mostly just happy to finally understand what had been happening.  It appears that putting the T530 into Discrete Graphics mode corrected the multiple power state errors, random screen blackouts, and inability to view DVDs properly after months of mystery.  This seems to be a very poorly documented problem, as I had been searching for the answer for months...

Similar Messages

  • CC & Dual GPU

    My new build is running a GTX 570 OC; I can pick up a 660 Ti for $200 to add.
    I know SLI was unsupported a while back, but how do CC and its dual GPU support work?
    Will it utilize both cards, and unlike SLI, not require that they be a matching pair?
    The 660 has 2.5 times the CUDA cores at slightly lower bandwidth, so I think I'll get some good results.
    Does dual GPU in CC require much setup/tinkering, and does it just use both, or make one primary and one secondary?
    Thanks
    Troy

    Troy,
    Dual GPU in CC is trivial to set up and requires no tinkering for the additional card. Of course, this assumes that you have enough power supply capacity, PCIe x16 slots, the MPE hack, and cooling for the additional card(s). You do not need to SLI the cards to get the benefits in Premiere Pro. I did read on this forum that it works with mismatched GPUs, but I've only seen that verified for cards using a common driver. As you probably know, the two cards that you mention do share a common driver from NVIDIA.
    You need to ask yourself why you are doing this, though. For most rigs, I would expect that with a single decent GTX video card already in place, the only speed gain would be for DVD renders. For most other workflows, other items in the PC would be the limiting resource (CPU power, drive speed, etc.). On the other hand, if you are constantly doing DVD exports from high-def media, then the increased number of CUDA cores will make your world so much better.
    See the following post for my test results on a dual GTX Titan setup, where adding a second video card doubled DVD export speed but left pretty much all other performance areas completely unchanged:
    http://forums.adobe.com/message/5588807
    Regards,
    Jim

  • Dual GPUs

    Is Aperture able to leverage any advantage from dual GPUs?
    quad G5   Mac OS X (10.4.4)  

    Is Aperture able to leverage any advantage from dual GPUs?
    I think Aperture would benefit to some extent, because while Core Image is using the GPUs, so is the rest of the system. It could also possibly split Core Image work across both GPUs; I don't know enough details about how Core Image uses the video cards to say for sure.
    A stronger video card is always going to help Aperture to some degree, though.

  • Dual Xeons Or Dual GPU Premiere cs6?

    Hey,
    Does PR support "dual" processors, GPU or CPU?
    I am willing to upgrade to dual Xeons, like the "HD-Juggernaut" from ADK.
    Is it helpful for PR, or should I just stick with an i7?

    As Bill said, dual GPU is not a good idea... some previous discussions
    2 cards and 3 monitors http://forums.adobe.com/thread/875252
    -and http://forums.adobe.com/thread/876675
    Dual Card SLI http://forums.adobe.com/thread/872941
    -problem http://forums.adobe.com/thread/872103

  • Graphics Error AMD Radeon HD 7640G/7670M Dual GPU

    HI,
    I am facing a problem with my HP Pavilion g6-2301ax Notebook PC with AMD Radeon HD 7640G/7670M Dual GPU (2 GB DDR3 dedicated) running Windows 8 64-bit. I haven't made any changes to the hardware or software, nor have I connected any new hardware.
    I cannot watch videos no matter what program I use, be it VLC, Media Player, etc. The programs simply crash. Booting and restarting/shutting down also take considerably longer than before.  Here's a screenshot of the error message...
    Also, the AMD Catalyst Control Centre keeps crashing...
    Before, the graphics control interface was totally different. The so-called "Hydra Grid" wasn't there. The control menu had more options and the ability to customise. Since it is a dual GPU setup, there was an option to choose whether to use both or just one. This problem started 2-3 days back.
    I am really, really frustrated. Because of this, my work is getting hampered a lot. Also, it doesn't read an external hard drive with USB 3.0, even though the "safely remove" icon is shown in the task bar. And now, because of this, I can't even back up my files onto the external hard drive in order to recover or restore to factory settings.
    Somebody help me.
    Warm Regards,
    Zamyang.

    Hi @cheemsay ,
    Welcome to the HP Forums!
    It is a dynamite  place to find answers and ideas!
    For you to have the best experience in the HP forum I would like to direct your attention to the HP Forums Guide Learn How to Post and More
    I understand that you are unable to play videos regardless of what player you try.  You have not made any physical changes to your notebook or added any external hardware.
    When you connect your external drive you get the safely remove icon, but the drive does not install.
    If you look in Device Manager, does the external drive show any bangs or errors?  Have you checked the manufacturer's site for a driver for it?
    If you check Disk Management, does it have a drive letter assigned to it?
    Here is a link to
    Testing for Hardware Failures (Windows 8) that may help determine the cause.
    You state this happened only 2-3 days ago and you cannot do a restore since the external drive is not being seen correctly, but you should be able to do a recovery back to factory.
    Do you know if updates were automatically installed?
    Here is a link to Performing an HP system recovery (Windows 8) that will guide you through the recovery process.
    I hope this helps!
    Sparkles1
    I work on behalf of HP

  • Proper way to test the dual GPU cards?

    I'll be receiving my Mac Pro tomorrow and I wanted to know if there is a way to properly put both GPUs through their paces to make sure they are both functioning correctly. I've found a few threads where people realized their second GPU (not used most of the time) was in fact defective. Is there something on OS X that will test this? I plan to go into Boot Camp and do the usual CrossFire tests, but something on the OS X side might be useful.
    TIA
    -Andres

    Sorry to have to tell you this guys but all of the above are broken.
    First of all... you should all take a good look at what the read() method actually does.
    http://java.sun.com/j2se/1.3/docs/api/java/io/InputStream.html#read(byte[], int, int)
    Pay special attention to this paragraph:
    Reads up to len bytes of data from the input stream into an array of bytes. An attempt is made to read as many as len bytes, but a smaller number may be read, possibly zero. The number of bytes actually read is returned as an integer.
    In other words... when you use read() and you request say 1024 bytes, you are not guaranteed to get that many bytes. You may get less. You may even get none at all.
    Supposing you want to read length bytes from a stream into a byte array, here is how you do it:
    int bytesRead = 0;
    int readLast = 0;
    byte[] array = new byte[length];
    // keep reading until end of stream or until the requested number of bytes has arrived
    while(readLast != -1 && bytesRead < length){
      readLast = inputStream.read(array, bytesRead, length - bytesRead);
      if(readLast != -1){
        bytesRead += readLast;
      }
    }
    And then the matter of write()...
    http://java.sun.com/j2se/1.3/docs/api/java/io/OutputStream.html#write(byte[])
    write(byte[] b) will always attempt to write b.length bytes, no matter how many bytes you actually filled it with. All you C/C++ converts... forget all about null terminated arrays. That doesn't exist here. Even if you only read 2 bytes into a 1024 byte array, write() will output 1024 bytes if you pass that array to it.
    You need to keep track of how many bytes you actually filled the array with, and if that number is less than the size of the array, you'll need to pass that count as an argument.
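    Putting the two halves together, a minimal copy loop might look like the sketch below (in and out stand for whatever InputStream and OutputStream you are using, and the 1024-byte buffer size is arbitrary):
    byte[] buffer = new byte[1024];
    int bytesRead;
    // read() may return fewer bytes than requested; -1 signals end of stream
    while((bytesRead = in.read(buffer, 0, buffer.length)) != -1){
        // write only the bytes that were actually read, never the whole buffer blindly
        out.write(buffer, 0, bytesRead);
    }
    out.flush();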
    I'll make another post about this... once and for all.
    /Michael

  • Dual GPU

    Does the MBP unibody's option of switching between the integrated or discrete GPU apply to Windows Vista in Boot Camp? If not, which does Vista choose? Does Windows 7 work in Boot Camp?
    Thanks in advance!

    Vista will use whichever GPU has been selected in Energy Saver preferences prior to booting Windows.
    Windows 7 is early beta software. It may or may not work. Certainly there will not be drivers available unless it can use Vista's drivers for the Mac hardware.

  • Hi, can I upgrade or use dual GPUs in my EliteBook 2570p?

    I was wondering if I could use two GPUs in my EliteBook 2570p laptop.

    Hi,
    Double post. You are talking about buying a new machine here; it would be much cheaper to sell this one and buy a new one.
    Regards.
    BH

  • Best way to setup dual GPU?

    Hello,
    I just got a new Intel CPU that has an integrated GPU and now I am wondering how to set up my system.
    I already have a Radeon 280X on the open source driver and play games with it, both native and through Wine.
    I'd like to be able to use QEMU's GPU passthrough, so for that I suppose I would need to plug my main monitor into my motherboard's HDMI instead of my Radeon's. But then would I be able to play native games on the Radeon?
    Same thing for Xorg: is there a good way to set up both cards, or does it have to be one at a time, restarting Xorg when I need the other?
    Thanks!

    Would you have a configuration example on how to do this? Or even some documentation I can read?
    I might also not have made myself very clear in the initial post. I do have VLANs on both of the 3550s. The 3550 on the left has VLAN 2 and the one on the right has VLAN 3. I would like to be able to create VLAN 4 and have it available on all of the switches, while leaving the OSPF routing as is.
    So far the documentation I have been reading only talks about creating L2 trunks between the switches, and as far as I can tell you cannot do that on a routed port. So now I'm not sure where to go from here.
    Thanks.
    Dan.

  • Acer V5 753g (Intel Haswell / Nvidia GT 750m dual GPU) Bumblebee

    I think I am fighting several issues here, although I might have fixed one already. Up to this point I have managed to disable the NVIDIA GPU using the proprietary driver and bbswitch, but Bumblebee is not working.
    Okay, first: I isolated an issue with ACPI by following the instructions in this bug report: https://github.com/Bumblebee-Project/Bu … issues/460. To be exact, it made me install a kernel module; the code and install instructions are described here: https://github.com/Bumblebee-Project/bb … ack-lenovo (after the bug report, the author of this code was generous enough to add my model to the source, so modifications weren't necessary any more).
    Now after a reboot the GPU is disabled, but as soon as I run optirun (which fails) I am not able to disable it again.
    I know the exact same issue is described in the Bumblebee Arch wiki page, but none of the workarounds fit my situation.
    bbswitch dmesg output
    dmesg | grep bbswitch
    [   16.608903] bbswitch: version 0.7
    [   16.608910] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.GFX0
    [   16.608917] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.RP05.PEGP
    [   16.609010] bbswitch: detected an Optimus _DSM function
    [   16.609063] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on
    [   16.610310] bbswitch: disabling discrete graphics
    [   78.445044] bbswitch: enabling discrete graphics
    Optirun output:
    optirun -vv glxspheres
    [ 1166.063193] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
    [ 1166.063481] [DEBUG]optirun version 3.2.1 starting...
    [ 1166.063488] [DEBUG]Active configuration:
    [ 1166.063491] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
    [ 1166.063494] [DEBUG] X display: :8
    [ 1166.063497] [DEBUG] LD_LIBRARY_PATH: /usr/lib/nvidia:/usr/lib32/nvidia
    [ 1166.063500] [DEBUG] Socket path: /var/run/bumblebee.socket
    [ 1166.063503] [DEBUG] Accel/display bridge: auto
    [ 1166.063505] [DEBUG] VGL Compression: proxy
    [ 1166.063508] [DEBUG] VGLrun extra options:
    [ 1166.063511] [DEBUG] Primus LD Path: /usr/lib/primus:/usr/lib32/primus
    [ 1166.063527] [DEBUG]Using auto-detected bridge virtualgl
    [ 1166.080418] [INFO]Response: No - error: [XORG] (EE) No devices detected.
    [ 1166.080441] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
    [ 1166.080445] [DEBUG]Socket closed.
    [ 1166.080469] [ERROR]Aborting because fallback start is disabled.
    [ 1166.080475] [DEBUG]Killing all remaining processes.
    Xorg error log:
    [  1166.066]
    X.Org X Server 1.14.4
    Release Date: 2013-10-31
    [  1166.066] X Protocol Version 11, Revision 0
    [  1166.066] Build Operating System: Linux 3.11.6-1-ARCH x86_64
    [  1166.066] Current Operating System: Linux acer-joschka 3.11.6-1-ARCH #1 SMP PREEMPT Fri Oct 18 23:22:36 CEST 2013 x86_64
    [  1166.066] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=9b2b7068-afcf-43a7-a6ac-7fc82d88e472 rw acpi_backlight=vendor
    [  1166.066] Build Date: 01 November 2013  05:10:48PM
    [  1166.066] 
    [  1166.066] Current version of pixman: 0.30.2
    [  1166.066]     Before reporting problems, check http://wiki.x.org
        to make sure that you have the latest version.
    [  1166.066] Markers: (--) probed, (**) from config file, (==) default setting,
        (++) from command line, (!!) notice, (II) informational,
        (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [  1166.066] (==) Log file: "/var/log/Xorg.8.log", Time: Mon Nov  4 14:34:19 2013
    [  1166.066] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
    [  1166.066] (++) Using config directory: "/etc/bumblebee/xorg.conf.d"
    [  1166.066] (==) ServerLayout "Layout0"
    [  1166.066] (==) No screen section available. Using defaults.
    [  1166.066] (**) |-->Screen "Default Screen Section" (0)
    [  1166.066] (**) |   |-->Monitor "<default monitor>"
    [  1166.066] (==) No device specified for screen "Default Screen Section".
        Using the first device section listed.
    [  1166.066] (**) |   |-->Device "DiscreteNvidia"
    [  1166.066] (==) No monitor specified for screen "Default Screen Section".
        Using a default monitor configuration.
    [  1166.066] (**) Option "AutoAddDevices" "false"
    [  1166.066] (**) Option "AutoAddGPU" "false"
    [  1166.066] (**) Not automatically adding devices
    [  1166.066] (==) Automatically enabling devices
    [  1166.066] (**) Not automatically adding GPU devices
    [  1166.066] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
    [  1166.066]     Entry deleted from font path.
    [  1166.066] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
    [  1166.066]     Entry deleted from font path.
    [  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
    [  1166.066]     Entry deleted from font path.
    [  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
    [  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
    [  1166.066]     Entry deleted from font path.
    [  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
    [  1166.066] (==) FontPath set to:
        /usr/share/fonts/misc/,
        /usr/share/fonts/TTF/
    [  1166.066] (++) ModulePath set to "/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules"
    [  1166.066] (==) |-->Input Device "<default pointer>"
    [  1166.066] (==) |-->Input Device "<default keyboard>"
    [  1166.066] (==) The core pointer device wasn't specified explicitly in the layout.
        Using the default mouse configuration.
    [  1166.066] (==) The core keyboard device wasn't specified explicitly in the layout.
        Using the default keyboard configuration.
    [  1166.067] (II) Loader magic: 0x7fdc20
    [  1166.067] (II) Module ABI versions:
    [  1166.067]     X.Org ANSI C Emulation: 0.4
    [  1166.067]     X.Org Video Driver: 14.1
    [  1166.067]     X.Org XInput driver : 19.1
    [  1166.067]     X.Org Server Extension : 7.0
    [  1166.067] (II) xfree86: Adding drm device (/dev/dri/card0)
    [  1166.067] setversion 1.4 failed
    [  1166.067] (II) xfree86: Adding drm device (/dev/dri/card1)
    [  1166.068] (--) PCI:*(0:1:0:0) 10de:0fe4:1025:079b rev 161, Mem @ 0xb2000000/16777216, 0xa0000000/268435456, 0xb0000000/33554432, I/O @ 0x00003000/128
    [  1166.068] Initializing built-in extension Generic Event Extension
    [  1166.068] Initializing built-in extension SHAPE
    [  1166.068] Initializing built-in extension MIT-SHM
    [  1166.068] Initializing built-in extension XInputExtension
    [  1166.068] Initializing built-in extension XTEST
    [  1166.068] Initializing built-in extension BIG-REQUESTS
    [  1166.068] Initializing built-in extension SYNC
    [  1166.068] Initializing built-in extension XKEYBOARD
    [  1166.068] Initializing built-in extension XC-MISC
    [  1166.068] Initializing built-in extension SECURITY
    [  1166.068] Initializing built-in extension XINERAMA
    [  1166.068] Initializing built-in extension XFIXES
    [  1166.068] Initializing built-in extension RENDER
    [  1166.068] Initializing built-in extension RANDR
    [  1166.068] Initializing built-in extension COMPOSITE
    [  1166.068] Initializing built-in extension DAMAGE
    [  1166.068] Initializing built-in extension MIT-SCREEN-SAVER
    [  1166.068] Initializing built-in extension DOUBLE-BUFFER
    [  1166.068] Initializing built-in extension RECORD
    [  1166.068] Initializing built-in extension DPMS
    [  1166.068] Initializing built-in extension X-Resource
    [  1166.068] Initializing built-in extension XVideo
    [  1166.068] Initializing built-in extension XVideo-MotionCompensation
    [  1166.068] Initializing built-in extension XFree86-VidModeExtension
    [  1166.068] Initializing built-in extension XFree86-DGA
    [  1166.068] Initializing built-in extension XFree86-DRI
    [  1166.068] Initializing built-in extension DRI2
    [  1166.068] (II) LoadModule: "glx"
    [  1166.068] (II) Loading /usr/lib/nvidia/xorg/modules/extensions/libglx.so
    [  1166.078] (II) Module glx: vendor="NVIDIA Corporation"
    [  1166.078]     compiled for 4.0.2, module version = 1.0.0
    [  1166.078]     Module class: X.Org Server Extension
    [  1166.078] (II) NVIDIA GLX Module  325.15  Wed Jul 31 18:12:00 PDT 2013
    [  1166.078] Loading extension GLX
    [  1166.078] (II) LoadModule: "nvidia"
    [  1166.078] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [  1166.078] (II) Module nvidia: vendor="NVIDIA Corporation"
    [  1166.078]     compiled for 4.0.2, module version = 1.0.0
    [  1166.078]     Module class: X.Org Video Driver
    [  1166.078] (II) LoadModule: "mouse"
    [  1166.078] (II) Loading /usr/lib/xorg/modules/input/mouse_drv.so
    [  1166.078] (II) Module mouse: vendor="X.Org Foundation"
    [  1166.078]     compiled for 1.14.0, module version = 1.9.0
    [  1166.078]     Module class: X.Org XInput Driver
    [  1166.078]     ABI class: X.Org XInput driver, version 19.1
    [  1166.078] (II) LoadModule: "kbd"
    [  1166.078] (WW) Warning, couldn't open module kbd
    [  1166.078] (II) UnloadModule: "kbd"
    [  1166.078] (II) Unloading kbd
    [  1166.078] (EE) Failed to load module "kbd" (module does not exist, 0)
    [  1166.078] (II) NVIDIA dlloader X Driver  325.15  Wed Jul 31 17:50:57 PDT 2013
    [  1166.078] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [  1166.078] (--) using VT number 7
    [  1166.078] (EE) No devices detected.
    [  1166.078] (EE)
    Fatal server error:
    [  1166.078] (EE) no screens found(EE)
    [  1166.079] (EE)
    Please consult the The X.Org Foundation support
         at http://wiki.x.org
    for help.
    [  1166.079] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
    [  1166.079] (EE)
    bumblebee.conf (I didn't change anything here)
    # Configuration file for Bumblebee. Values should **not** be put between quotes
    ## Server options. Any change made in this section will need a server restart
    # to take effect.
    [bumblebeed]
    # The secondary Xorg server DISPLAY number
    VirtualDisplay=:8
    # Should the unused Xorg server be kept running? Set this to true if waiting
    # for X to be ready is too long and don't need power management at all.
    KeepUnusedXServer=false
    # The name of the Bumbleblee server group name (GID name)
    ServerGroup=bumblebee
    # Card power state at exit. Set to false if the card shoud be ON when Bumblebee
    # server exits.
    TurnCardOffAtExit=false
    # The default behavior of '-f' option on optirun. If set to "true", '-f' will
    # be ignored.
    NoEcoModeOverride=false
    # The Driver used by Bumblebee server. If this value is not set (or empty),
    # auto-detection is performed. The available drivers are nvidia and nouveau
    # (See also the driver-specific sections below)
    Driver=
    # Directory with a dummy config file to pass as a -configdir to secondary X
    XorgConfDir=/etc/bumblebee/xorg.conf.d
    ## Client options. Will take effect on the next optirun executed.
    [optirun]
    # Acceleration/ rendering bridge, possible values are auto, virtualgl and
    # primus.
    Bridge=auto
    # The method used for VirtualGL to transport frames between X servers.
    # Possible values are proxy, jpeg, rgb, xv and yuv.
    VGLTransport=proxy
    # List of paths which are searched for the primus libGL.so.1 when using
    # the primus bridge
    PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
    # Should the program run under optirun even if Bumblebee server or nvidia card
    # is not available?
    AllowFallbackToIGC=false
    # Driver-specific settings are grouped under [driver-NAME]. The sections are
    # parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
    # detection resolves to NAME).
    # PMMethod: method to use for saving power by disabling the nvidia card, valid
    # values are: auto - automatically detect which PM method to use
    #         bbswitch - new in BB 3, recommended if available
    #       switcheroo - vga_switcheroo method, use at your own risk
    #             none - disable PM completely
    # https://github.com/Bumblebee-Project/Bu … PM-methods
    ## Section with nvidia driver specific options, only parsed if Driver=nvidia
    [driver-nvidia]
    # Module name to load, defaults to Driver if empty or unset
    KernelDriver=nvidia
    PMMethod=auto
    # colon-separated path to the nvidia libraries
    LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia
    # comma-separated path of the directory containing nvidia_drv.so and the
    # default Xorg modules path
    XorgModulePath=/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules
    XorgConfFile=/etc/bumblebee/xorg.conf.nvidia
    ## Section with nouveau driver specific options, only parsed if Driver=nouveau
    [driver-nouveau]
    KernelDriver=nouveau
    PMMethod=auto
    XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
    xorg.conf.nvidia (I placed my GPU device address in the Device section)
    Section "ServerLayout"
        Identifier  "Layout0"
        Option      "AutoAddDevices" "false"
        Option      "AutoAddGPU" "false"
    EndSection
    Section "Device"
        Identifier  "DiscreteNvidia"
        Driver      "nvidia"
        VendorName  "NVIDIA Corporation"
    #   If the X server does not automatically detect your VGA device,
    #   you can manually set it here.
    #   To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
    #   as you see in the commented example.
    #   This Setting may be needed in some platforms with more than one
    #   nvidia card, which may confuse the proprietary driver (e.g.,
    #   trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
         BusID "PCI:01:00.0"
    #   Setting ProbeAllGpus to false prevents the new proprietary driver
    #   instance spawned to try to control the integrated graphics card,
    #   which is already being managed outside bumblebee.
    #   This option doesn't hurt and it is required on platforms running
    #   more than one nvidia graphics card with the proprietary driver.
    #   (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
    #   If this option is not set, the new Xorg may blacken the screen and
    #   render it unusable (unless you have some way to run killall Xorg).
        Option "ProbeAllGpus" "false"
        Option "NoLogo" "true"
        Option "UseEDID" "false"
        Option "UseDisplayDevice" "none"
    EndSection

    Your issue seems to be similar to what I had to face with the Razer Blade 14" (similar hardware as well).
    Go to https://bbs.archlinux.org/viewtopic.php?id=173356 and see the instructions to get graphics working; see if they apply (summarized: run linux-ck and set rcutree.rcu_idle_gp_delay=2 in the kernel parameters).
    I haven't tried undoing this in more recent updates because I haven't had the time to mess with it; it is possible that it has been fixed, although the thread following my issue (and possibly your issue) at nvidia doesn't give any indication of that:
    https://devtalk.nvidia.com/default/topi … iver-crash
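    For anyone unsure how to apply the kernel parameter part of that advice, the usual route on a stock GRUB setup (just a sketch; adjust if you use another boot loader) is to append it in /etc/default/grub and regenerate the config:
    # /etc/default/grub
    GRUB_CMDLINE_LINUX_DEFAULT="quiet rcutree.rcu_idle_gp_delay=2"
    # then, as root:
    grub-mkconfig -o /boot/grub/grub.cfg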

  • Dual GPU, ATI + NVidia. Is it possible?

    Hello. I work with CUDA at college and now I am playing around with pyrit. I have seen that ATI cards outperform the NVIDIA ones at a lower price. So, is it possible to use an ATI card together with an NVIDIA card? The main card (the one that will provide video) will probably be the NVIDIA or the onboard one, since the display keeps freezing when computing with CUDA.
    My motherboard is a Z77M-D3H and my GPU is an MSI GTX 660 Ti. The mobo has 2 PCI-E slots.
    Will there be any problem with drivers?
    I also want to start using OpenCL with an ATI card. But I don't want to buy another computer or physically swap which card I am using, so I need both the ATI and NVIDIA cards in the same computer.
    Thanks..

    carneiro :
    try opencl-nvidia / opencl-nvidia-304xx; looking at their dependencies it should be possible to run them without having the nvidia video driver installed.
    granada :
    check https://wiki.archlinux.org/index.php/Opencl
    amd opencl solution currently requires catalyst to be present.
    The gallium compute solution for the open source radeon driver requires a custom build of mesa.
    (if you want to try this one, check lordheavy's mesa-git repo. Some details about using OpenCL with that repo: https://bbs.archlinux.org/viewtopic.php?id=170468 )
    I very much doubt the combination of nvidia video driver and catalyst or xf86-video-ati will work though.
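    Once an OpenCL runtime is installed, a quick sanity check (assuming the clinfo package is installed) is to list the platforms and devices it can actually see:
    clinfo | grep -E 'Platform Name|Device Name'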

  • HT3207 Which is the better setting for my MBPRO 15" Late 2008? I have 8 GB RAM, 2.66 GHz, and dual GPU cards. Thanks.

    HELP

    Since you have both an integrated and a discrete GPU, whenever an application 'needs' the beefier GPU, your system will automatically switch to it.
    To see when it's switching, download and install gfxCardStatus.
    Good luck,
    Clinton

  • [SOLVED] radeon/radeon dual GPU halp:)

    Hey all, I just got a new laptop with radeon/radeon hybrid GPUs and I'm a little lost as to how to go about setting anything up. The hybrid graphics wiki only really mentions ATI/Intel, but I've got both my cards on the radeon driver.
    I'm pretty sure Arch has both cards going full tilt, because the laptop is getting pretty hot, and I can change the backlight in /sys/class/backlight/ for either video card and they both change the screen brightness.
    I'm also not even sure which card is which:
    # lspci |grep -i vga
    00:01.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Device 9647
    02:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Seymour [Radeon HD 6400M Series]
    I thought both cards were 6400M series; the dedicated one is a 6470M, and I thought I remembered seeing the integrated card listed as 6400-series as well on Windows when I first got it, but I'm no longer sure.
    here they are in lshw
    description: VGA compatible controller
    product: Seymour [Radeon HD 6400M Series]
    vendor: Advanced Micro Devices [AMD] nee ATI
    physical id: 0
    bus info: pci@0000:02:00.0
    version: 00
    width: 64 bits
    clock: 33MHz
    capabilities: pm pciexpress msi vga_controller bus_master cap_list rom
    configuration: driver=radeon latency=0
    resources: irq:44 memory:b0000000-bfffffff memory:fea20000-fea3ffff ioport:e000(size=256) memory:fea00000-fea1ffff
    description: VGA compatible controller
    product: Advanced Micro Devices [AMD] nee ATI
    vendor: Advanced Micro Devices [AMD] nee ATI
    physical id: 1
    bus info: pci@0000:00:01.0
    version: 00
    width: 32 bits
    clock: 33MHz
    capabilities: pm pciexpress msi vga_controller bus_master cap_list rom
    configuration: driver=radeon latency=0
    resources: irq:41 memory:a0000000-afffffff ioport:f000(size=256) memory:feb00000-feb3ffff
    I was thinking of picking one of the cards and writing an xorg.conf to try to disable one and enable the other at X launch, but before I do that, does anyone have any words of wisdom or anywhere I should look for guidance? Also, is it strange that even with both video cards on the open source driver, compositing is very slow?
    EDIT: OK, so I've fixed the Compiz issue and I did try an xorg.conf that specifies one of the devices; this is over my head, though. Is specifying the card in the xorg.conf enough to keep the other card idle?
    EDIT 2: nah, both cards are still burning full tilt; lm-sensors has both at ~70-74 degrees, like they're crunching my math homework.
    thanks all.
    Last edited by htedrom (2012-05-27 05:09:17)

    Hey Pres, thanks for the reply
    Yeah, nothing in the BIOS, and sorry, specifying by BusID is what I meant by using xorg.conf, so I do have only one specified; though with both cards on the same driver I am unable to blacklist one of them.
    I have looked a bit at the ATI wiki for power saving, but it doesn't really seem to matter what I do to the cards in terms of power profile. Switching from "profile" to "dynamic" didn't seem to change much, nor did specifying a low power profile for one of them. I'm going to install an OpenGL game when I get home and see what happens when switching between profiles, but so far, in terms of temperature, the power profiles didn't seem to have much effect.
    Edit: is trying to put only one of the cards on the proprietary driver possible/a good idea? Could I then blacklist the open source driver for the dedicated card?
    Last edited by htedrom (2012-05-08 19:25:09)
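    One more thing worth checking is the vga_switcheroo interface mentioned in the hybrid graphics wiki; a rough sketch (this assumes the kernel has vga_switcheroo enabled, debugfs is mounted, and you are root):
    # show both GPUs and which one is currently active (marked with +)
    cat /sys/kernel/debug/vgaswitcheroo/switch
    # power down whichever card is not currently driving the display
    echo OFF > /sys/kernel/debug/vgaswitcheroo/switch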

  • The new version release today (7.2.1) fixes the dual GPU crash.

    http://blogs.adobe.com/movingcolors/2014/03/31/speedgrade-cc-7-2-1-update/

    Dear Adobe,
    So I waited 4 months for a patch which doesn't work correctly for me. I have a Quadro 6000 and a GTX 580 (2 cards because of another application; I don't want to promote it here).
    If I play the timeline in 7.2.1 with both cards enabled, I get no realtime playback. If I disable the GTX 580 in my control panel and run the same project, also with 7.2.1, it plays in realtime.
    So what exactly did you patch? The startup error screen only? Who are your beta testers and what do they actually do? I liked SpeedGrade by IRIDAS very much, but since you bought it, it has been unusable for me. I have never seen so many bugs in one application; just compare this discussion forum with the other forums.
    My setup:
    Intel 6 core 4930K !!!!
    64GB RAM !!!!
    10TB disk RAID with 500MB/s
    Quadro 6000
    GTX 580
    Win7 64bit SP1

  • [SOLVED] Lenovo T530 UEFI Arch/Ubuntu Dual boot - Arch fails to boot.

    Hi All,
    I have installed Arch to my Lenovo T530 to dual boot with Ubuntu using UEFI and Grub.
    After installation, Arch is presented to me as an option when my laptop fires up. However, if I select it, the loader goes to a purple screen and then hangs.
    I have attached here the bootloader scripts for my Arch installation (not working), my Ubuntu installation (working) and the output from sudo lsblk -o name,mountpoint,label,size,uuid.
    Please let me know if there is more useful information I can provide. (I have output from Bootinfoscript available but it is quite extensive).
    I am hoping to find out if there is an easily fixable error in the boot scripts used by GRUB. If not, I have seen the section on dual booting with Arch in the wiki; my worry is that if I resort to it, UEFI being temperamental at best, I risk breaking my currently working Ubuntu installation.
    Thanks and regards,
    Simon
    Arch boot script (not working):
    setparams 'Arch (on /dev/sda4)'
    insmod part_gpt
    insmod ext2
    set root='hd0,gpt4'
    if [ x$feature_platform_search_hint = xy ]; then
    search --no-floppy --fs-uuid --set=root --hint-bios=hd0,gpt4 --hint-efi=hd0,gpt4 --hint-baremetal=ahci0,gpt4 729b5164-22c4-4c21-8212-66038d60943e
    else
    search --no-floppy --fs-uuid --set=root 729b5164-22c4-4c21-8212-66038d60943e
    fi
    linux /boot/vmlinuz root=UUID=ad4103fa-d940-47ca-8506-301d8071d467 rw quiet
    initrd /boot/initramfs-linux.img
    Ubuntu boot script (working):
    setparams 'Ubuntu, with Linux 3.13.0-24-generic'
    recordfail
    load_video
    gfxmode $linux_gfx_mode
    insmod gzio
    insmod part_gpt
    insmod ext2
    set root='hd0,gpt2'
    if [ x$feature_platform_search_hint = xy ]; then
    search --no-floppy --fs-uuid --set=root --hint-bios=hd0,gpt2 --hint-efi=hd0,gpt2 --hint-baremetal=ahci0,gpt2 542bf27c-0fd5-424a-b4d8-107f7cf97b75
    else
    search --no-floppy --fs-uuid --set=root 542bf27c-0fd5-424a-b4d8-107f7cf97b75
    fi
    echo 'Loading Linux 3.13.0-24-generic ...'
    linux /boot/vmlinuz-3.13.0-24-generic root=UUID=542bf27c-0fd5-424a-b4d8-107f7cf97b75 ro quiet splash $vt_handoff
    echo 'Loading initial ramdisk ...'
    initrd /boot/initrd.img-3.13.0-24-generic
    Output from sudo lsblk -o name,mountpoint,label,size,uuid
    NAME MOUNTPOINT LABEL SIZE UUID
    sda 119.2G
    ├─sda1 /boot/efi BOOTLOADER 524M 9360-2939
    ├─sda2 / Linux_Ubuntu 34.6G 542bf27c-0fd5-424a-b4d8-107f7cf97b75
    ├─sda3 [SWAP] Swap 9.8G 7768ae01-6e37-450b-bf0c-d873e3fd06a1
    ├─sda4 Linux_Arch 32.7G 729b5164-22c4-4c21-8212-66038d60943e
    ├─sda5 /media/Data Data 33.2G 5a971a77-685b-43d5-a8e6-c7b407a4c2ff
    └─sda6 Misc_Data 8.5G b165990d-bd25-458f-b2d6-63fae28d0870
    sdb 1T
    └─sdb1 1024G a1ee2f60-007a-4292-982b-7d5f8375fc7e
    sr0 1024M
    Last edited by simon_sjw (2015-03-22 10:43:03)

    linux /boot/vmlinuz root=UUID=ad4103fa-d940-47ca-8506-301d8071d467 rw quiet
    Change the UUID here. Where did that come from?
    EDIT: curiously, if you DuckDuckGo search this exact UUID, it comes up a bunch of times and has caused people headaches before. If you fix it you should be okay. If anyone knows why this same exact UUID would incorrectly be created on multiple systems, I'd love to know. It seems like some kind of issue with dual/triple booting and os-prober.
    2nd EDIT: this UUID is in the default grub.cfg. For some reason it sometimes won't be replaced by grub-mkconfig... Maybe the user didn't run grub-mkconfig, but edited the file himself or herself? simon_sjw?
    Last edited by nullified (2015-03-22 03:12:36)
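    For completeness, a corrected version of that line would point root= at the Arch partition's UUID from the lsblk output above (a sketch only; on a standard Arch install the kernel image is /boot/vmlinuz-linux, so the path may also need adjusting):
    linux /boot/vmlinuz-linux root=UUID=729b5164-22c4-4c21-8212-66038d60943e rw quiet
    initrd /boot/initramfs-linux.img
    Re-running grub-mkconfig (or update-grub on Ubuntu) after fixing it should regenerate the entry cleanly.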
