[W520] Programs not using NVIDIA graphics

Quadro 1000M.  Windows 7.  Optimus set in the BIOS.  I have the newest drivers from the Lenovo website (8.17.12.7658).  Plugged into AC power.
I have specified the NVIDIA graphics for certain programs in the NVIDIA Control Panel, and I've also tried right-click --> Run with NVIDIA graphics.
Device manager shows both the Intel and NVIDIA display adapters with no problems.
When I plug in an external monitor the NVIDIA graphics are used, the NVIDIA GPU activity monitor says "1 display connection".  If I try to start a program with the NVIDIA graphics while an external display is connected, the program still won't use the NVIDIA GPU.
I just ran WEI and, for the first time ever, the NVIDIA GPU activity indicator actually lit up a few times, mostly during the "Windows Media Tuning" section, and my graphics score went up from 5.2 to 6.4, so the system is sometimes able to use the NVIDIA graphics.  However, if I hovered the mouse over the NVIDIA GPU indicator it said "GPU activity: none", even while it was lit up, and GPU-Z indicated no NVIDIA GPU activity for the entire WEI test.  I can't figure out what "Windows Media Tuning" means or what exactly is being tested here. [Edit] A WEI test with only integrated graphics enabled in the BIOS also gets 6.4, so the increase from 5.2 to 6.4 wasn't a result of the WEI test using the NVIDIA graphics.  With discrete graphics set in the BIOS I get 6.7.
GPU-Z also indicates that the NVIDIA graphics are not being used (GPU load stays at 0%).  But if I have GPU-Z set to monitor the NVIDIA graphics, then the NVIDIA activity indicator lights up, although it still says "GPU activity: none".
Sometimes when I start a program (e.g. Photoshop), the NVIDIA GPU indicator will light up for a second then go out.  This happens whether I try to start the program with the Intel or NVIDIA GPU.
If I set the BIOS to discrete graphics then the NVIDIA card is used properly, according to GPU-Z (i.e. GPU load shows activity).
I'm pretty sure this all used to work properly, but I don't know when it stopped working.  I just upgraded Photoshop and wanted to see if it was using the NVIDIA graphics, which is how I noticed the problem.  It's been several months since I used the NVIDIA GPU Activity Monitor, and I just got GPU-Z to try to troubleshoot this problem.
In the ThinkVantage Toolbox, when I go to video diagnostics, only the Intel GPU is listed.  But if I first start GPU-Z and have it monitor the NVIDIA graphics, then start the ThinkVantage Toolbox, the NVIDIA GPU is also listed and I can run a Short Video test (the stress test is only available for the Intel graphics), although the test doesn't really run anything (it just takes a second and doesn't display any of the animation).  If I leave the ThinkVantage Diagnostics window open, close GPU-Z, and try to run a test on the NVIDIA GPU again, it says "No Device".  GPU-Z, while set to monitor the NVIDIA GPU, seems to "turn on" or "prime" the NVIDIA GPU so that the diagnostic test can use it.  This doesn't work for other programs (i.e. even if GPU-Z is monitoring the NVIDIA GPU, the NVIDIA GPU still won't be used by any other programs I subsequently start).  If I have an external monitor plugged in, I can run all the NVIDIA diagnostic tests (short test with graphics, stress test, CUDA test).  I just ran the short test and everything passed.
***This last bullet point seems important to me. Is this normal behaviour? Can it point to where the problem is?
How can I get programs to use the NVIDIA graphics card?
W520 i7-2720QM | 32 GB RAM 1333 MHz | FHD 1920x1080 | Quadro 1000M | Windows 7 & Ubuntu 14.04
Crucial M500 480GB mSATA + 2 x 1 TB Hitachi Travelstar 7200 rpm | BIOS 1.38

Fullmetal99012 wrote:
The nVidia control panel should allow you to set options for programs to use the nVidia GPU. That's what I ended up having to do.
I tried that (see the first point in my original post); it doesn't help.
I haven't tried the drivers from NVIDIA, but I would think the Lenovo ones should work, so the drivers probably aren't the problem.  Has anyone used the NVIDIA ones?  Are there any downsides?  I assume I can just reinstall the Lenovo ones if I want?
W520 i7-2720QM | 32 GB RAM 1333 MHz | FHD 1920x1080 | Quadro 1000M | Windows 7 & Ubuntu 14.04
Crucial M500 480GB mSATA + 2 x 1 TB Hitachi Travelstar 7200 rpm | BIOS 1.38

Similar Messages

  • T540p not using nvidia graphics, very slow

    Hi!
    Well, I'm using a T540p with a GeForce GT 730M.
    When I use applications with WPF or DirectDraw, like Office or Chrome (I think they use them...), the notebook is very slow. I get a high load on the System process.
    When I use Process Explorer to see what's going on:
    ntoskrnl.exe!WheaAttemptPhysicalPageOffline
    is under high load.
    I checked the BIOS: there is no way to deactivate the Intel HD Graphics. I also checked the nvidia system settings: it's set to use the GeForce.
    I also tried to install the newest drivers and so on: they are installed.
    Oh well: I use Windows 7, 64-bit.
    Thanks!

  • Programs don't use nvidia graphics

    Hi, my programs don't use the nvidia graphics: for example, web browsers, video converters, and video players. I configured the nvidia control panel, but they still only use the integrated graphics.
    Note: the only program that uses the nvidia graphics is Autodesk Maya 2012.

    You may try the following solution.
    1. Right-click on the free space of desktop and then select “Configure Switchable Graphics”.
    2. Configure GPU for application
    3. Assign GPU for an application by clicking the button next to it
    4. Add the application that you want to specify GPU for it by clicking “Browse”
    5. Click the application and then click “Open”
    6. Assign the GPU for the application you have added by clicking the button next to it
    7. Save your settings by clicking “Apply”
    Hope this will resolve your query.
    Best Regards,
    Tanuj

  • MSI GE60 not using dedicated graphics?

    Hi everyone, I recently purchased a refurbished MSI GE60 laptop online, equipped with a 2GB Nvidia GTX 660M and integrated Intel HD 4000 graphics as well. My laptop seems to always be using the Intel graphics, because I am getting terrible performance in games that it should be able to play, based on various videos I've researched on YouTube. For example, I should be able to play Skyrim at 60fps on at least high settings, but whenever I load it up the game automatically sets my settings to Low, and even if I change it to High the fps is unbearable. I have the latest drivers as far as I'm aware, and whenever I play a game it always has "run with NVIDIA high performance processor" as the default. Is there anything I can do about this? I have searched this topic many times and I know many people with the same model laptop have had this problem, but I can't work out why it isn't using the graphics card even when it is set to it. Any tips would be much appreciated, thank you.

    Quote from: Svet on 01-October-14, 22:05:12
    Nope, for this model the power button is not an indicator for the graphics cards;
    it always lights up blue regardless of which video card is in use.
    @GatorKing95
    Is your nvidia 660m card properly installed when you look in Device Manager?
    If yes:
    When you go to the Nvidia control panel,
    under 3D settings, select "Adjust image settings with preview"
    and leave it that way. Does your Turbo button start lighting up?
    If I remember right, the power button still changes on the GE series. Watching a few videos, I did in fact see the power button switching colors when using different video cards. I believe ALL the gaming laptops use this feature, as an added indicator for which video card is in fact in use.

  • Envy x360 2015 is not using Nvidia GTX 930m - GTX 900 Series not boot graphics

    I custom ordered a 2015 Envy x360 with a gtx 930m and my laptop was using Intel HD 5500 graphics out of the box. The BIOS has no option to configure boot graphics that I could find (even after a BIOS update through HP's driver/software update page). The 930m is detected and running correctly, but it is not the boot graphics and the only program I can get to actually use the GPU is Nvidia's Geforce Experience. Am I missing something here? EDIT: I have also updated my Nvidia drivers from HP's driver update site.

    I just got off the phone with an HP support case manager and he wouldn't guarantee, or even suggest it was likely, that a boot graphics option would be added to the BIOS. My options are to return it or not boot with the graphics card.  According to my research, the InsydeH2O BIOS actually has lots of options and settings, but manufacturers (including HP) often hide the "advanced" settings because we're too dumb not to break our own laptops. If you're feeling frisky you can request that someone unlock your BIOS for you at bios-mods.com to give you access to the advanced options, but you're playing with fire with BIOS modding, so... I'm going to go to Best Buy and see if the pre-configured version of my laptop boots on the GPU. If it does, then I'm going to request that HP send me that pre-configured model to make amends for what they've done. Good luck to you, and please let me know if you get any new information.

  • U530 not recognizing Nvidia graphics

    I noticed that my laptop wasn't recognizing the graphics card (Nvidia GT 730M) after trying to play Fallout 3. The game was laggy, and when I went to the options to choose the dedicated graphics, it didn't appear as an option. I tried to access the Nvidia control panel, but I get an error message: "NVidia display settings are not available. You are not currently using a display attached to an Nvidia GPU."
    The driver is up to date and the device shows up and is enabled in Device Manager. The integrated graphics card is also enabled and up to date.  I ran a hardware check with Lenovo Solution Center and the check failed for the Nvidia GPU, but not the integrated GPU.
    Can anyone help me with this?

    hi biochem_matt,
    If you have the Intel® HD Graphics 4400 + Nvidia GT730M in Device Manager > Display Adapters, then your system uses Optimus Technology to automatically switch between the Intel GPU (for power saving) and the Nvidia GPU (for maximum performance).
    The low FPS while gaming and the "NVidia display settings are not available. You are not currently using a display attached to an Nvidia GPU" error might mean that the Intel and Nvidia drivers are not in sync, causing Optimus not to work, or that the Nvidia GPU itself is showing signs of failure.
    Things that you can try:
    1. Follow this guide on how to disable automatic driver installation, then navigate to the Control Panel (icon view) > Programs and Features, uninstall both the Intel and Nvidia video drivers, and reboot.
    Alternatively, you can also use DDU (Display Driver Uninstaller) and select Clean and Restart.
    2. Install the Intel graphics driver first, from the D:\Drivers folder or from the U530 drivers page, then reboot. When finished, install the Nvidia graphics driver and observe the game.
    3. If the above steps don't work, back up important files, shut down the machine, and press the OneKey Recovery button to restore the unit to factory settings and observe (note that this will wipe all data on the OS partition).
       - Link to picture
    Let me know how it goes.
    Regards

  • Why can I not see the Nvidia graphics card when I click "More Info" on my 15-inch MacBook Pro?

    I've just purchased a mid 2012 MacBook Pro.
    It's supposed to have an Nvidia graphics card, but when I click "About This Mac" --> "More Info" I only see the Intel HD 4000 graphics card. Is this OK? Does this mean that somebody opened my MacBook and stole the card? :- /

    Nope - it's likely just the card in use at the time when you do the More Info...
    ...to check to see if the card is 'there', hold down the Option key and select System Information from the Apple menu. Go under Hardware > Graphics & Displays and two cards should be listed.
    Good luck,
    Clinton
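    If you prefer the Terminal, a roughly equivalent check (my addition, not part of Clinton's reply) is to ask system_profiler for the displays section; on a dual-GPU machine both the Intel and the NVIDIA card should be listed:
    # list all graphics hardware the system knows about
    system_profiler SPDisplaysDataType
    # or just the GPU model lines for a quick glance
    system_profiler SPDisplaysDataType | grep "Chipset Model"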

  • NVidia packages - let's not use nvidia-installer [RFC]

    Hi. After messing around with Xgl I (for some reason) decided that I hated the way the nvidia driver package doesn't account for all the files the nvidia-installer spits out/modifies. I wanted all the files involved to be properly managed by pacman. So I made some PKGBUILDs, looking at Fedora and Gentoo stuff for inspiration, and getting annoyed trying to work out what needs to be where based on NV's unmaintained install makefile..:shock:
    The PKGBUILDs that follow separate the kernel module from the rest of the nvidia package, but this is by no means necessary - I just wanted to be able to reinstall one without the other (to do with installing gl parts without restarting X - see below for more).
    (If there is to be a separation, it may be more logical to put the x.org driver module with the kernel module, with just the GL parts separate.)
    pkgname=nvidia-kernel
    pkgver=1.0.8178
    _pkgbinary=NVIDIA-Linux-x86-1.0-8178
    pkgrel=1
    pkgdesc="NVidia driver kernel module"
    url="http://www.nvidia.com"
    depends=('bash' 'gcc' 'binutils' 'glibc' 'make' 'nvidia-glx')
    source=(ftp://download.nvidia.com/XFree86/Linux-x86/1.0-8178/$_pkgbinary-pkg0.run nvidia.rc
    nvidia.patch nv2.diff)
    install="nvidia.install"
    #provides=()
    conflicts=('nvidia')
    build() {
    cd $startdir/src/
    chmod +x $_pkgbinary-pkg0.run
    ./$_pkgbinary-pkg0.run --extract-only
    cd $_pkgbinary-pkg0
    #strip stuff that's in nvidia-glx to make download lighter
    rm -rf usr/bin
    rm -rf usr/include
    rm -rf usr/lib
    rm -rf usr/share
    rm -rf usr/X11R6
    mkdir -p $startdir/pkg/usr/share/nvidia
    cp -p LICENSE $startdir/pkg/usr/share/nvidia
    # adding patches from nvidia forum and now provided by zander
    patch -Np0 -i $startdir/src/nvidia.patch || return 1
    patch -Np0 -i $startdir/src/nv2.diff || return 1
    #clean src
    install -D -m 755 $startdir/src/nvidia.rc $startdir/pkg/etc/rc.d/nvidia
    cd $startdir/src
    rm *
    mkdir -p $startdir/pkg/opt/nvidia
    mv * $startdir/pkg/opt/nvidia
    }
    md5sums=('6c8081bfde4a806a487efc2a9a1ff016' '08f4f614066c08bd0774c7e557953fbe'
    '3b5a2525633e88b9d78c4721190542e6')
    (the file nv2.diff patches the kernel code for 2.6.16 kernels..)
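    A side note, not from the original post: the md5sums array above lists three checksums for four source files. Assuming makepkg of that era behaves like current versions, the checksums can be regenerated from the PKGBUILD's directory with:
    # download/verify the sources and print a fresh md5sums array to paste over the old one
    makepkg -g
    # or append it directly to the PKGBUILD
    makepkg -g >> PKGBUILD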
    Here's the .install
    _driver=NVIDIA-Linux-x86-1.0-8178-pkg0/./nvidia-installer
    post_install() {
    cat << EOF
    IMPORTANT
    ==> In order to use the software, you have to agree to NVIDIA's license located in
    ==> /usr/share/nvidia/LICENSE
    ==> If you don't, please remove this package (pacman -R nvidia-kernel)
    Installation starts now ...
    EOF
    cd /opt/nvidia/NVIDIA-Linux-x86-1.0-8178-pkg0/usr/src/nv
    make install > /dev/null 2>&1 || (echo "INSTALLATION FAILED!" ; echo "==> You have to shutdown Xserver to finish installation! You have to rerun 'pacman -S nvidia-kernel'" ; echo "==> If you're running a custom kernel, make sure the source tree is available." ; echo "==> Other common solutions can be found on the wiki: http://wiki.archlinux.org/index.php/How_to_install_NVIDIA_driver" )
    modprobe nvidia
    rm -r /opt/nvidia
    cat << EOF
    To use this driver you need the nvidia-glx package too.
    If you need more information about setting up nvidia drivers have a look at:
    "http://wiki.archlinux.org/index.php/How_to_install_NVIDIA_driver"
    EOF
    }
    post_upgrade() {
    rmmod nvidia > /dev/null 2>&1
    post_install $1
    }
    pre_remove() {
    cat << EOF
    ==> Deinstallation starts now!
    EOF
    rmmod nvidia
    _KERNELNAME=$(uname -r)
    rm /lib/modules/$_KERNELNAME/kernel/drivers/video/nvidia.ko > /dev/null 2>&1
    cat << EOF
    ==> Don't forget to update your /etc/X11/XF86Config or /etc/X11/xorg.conf!
    You may want to remove nvidia-glx as well.
    EOF
    }
    op=$1
    shift
    $op $*
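    Not part of the original .install, but a quick sanity check one might run after installing nvidia-kernel, assuming the module path that pre_remove above uses:
    # confirm the module was built and installed for the running kernel
    ls -l /lib/modules/$(uname -r)/kernel/drivers/video/nvidia.ko
    # confirm it is loaded (post_install runs modprobe, but it doesn't hurt to check)
    lsmod | grep nvidia
    # load it manually if it isn't
    modprobe nvidia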
    Now for the other bit..
    pkgname=nvidia-glx
    pkgver=1.0.8178
    _pkgbinary=NVIDIA-Linux-x86-1.0-8178
    pkgrel=1
    pkgdesc="The NVidia X.org driver and utilities (without the kernel module)"
    url="http://www.nvidia.com"
    depends=('bash' 'gcc' 'binutils' 'glibc' 'make' 'nvidia-kernel')
    source=(ftp://download.nvidia.com/XFree86/Linux-x86/1.0-8178/$_pkgbinary-pkg0.run)
    install="nvidia-glx.install"
    provides=('libgl')
    conflicts=('libgl' 'nvidia')
    md5sums=('6c8081bfde4a806a487efc2a9a1ff016')
    build() {
    cd $startdir/src/
    chmod +x $_pkgbinary-pkg0.run
    ./$_pkgbinary-pkg0.run --extract-only
    cd $_pkgbinary-pkg0
    mkdir -p $startdir/pkg/usr/share/applications
    mkdir -p $startdir/pkg/usr/bin
    mkdir -p $startdir/pkg/usr/man/man1
    mkdir -p $startdir/pkg/usr/lib/xorg/modules/{extensions,drivers}
    mkdir -p $startdir/pkg/usr/share/nvidia
    #Point .desktop to correct location
    chmod +w ./usr/share/applications/nvidia-settings.desktop
    sed 's:__UTILS_PATH__:/usr/bin:' ./usr/share/applications/nvidia-settings.desktop > ./nv.desktop
    sed 's:__DOCS_PATH__:/usr/share/nvidia:' ./nv.desktop > ./usr/share/applications/nvidia-settings.desktop
    chmod -w ./usr/share/applications/nvidia-settings.desktop
    #Grab what we need and put it where we want
    install -m 755 usr/bin/nvidia-bug-report.sh ${startdir}/pkg/usr/bin
    install -m 755 usr/bin/nvidia-settings ${startdir}/pkg/usr/bin
    install -m 755 usr/bin/nvidia-xconfig ${startdir}/pkg/usr/bin
    install usr/lib/libGL.so.${pkgver} ${startdir}/pkg/usr/lib
    install usr/lib/libGLcore.so.${pkgver} ${startdir}/pkg/usr/lib
    install usr/lib/libnvidia-tls.so.${pkgver} ${startdir}/pkg/usr/lib
    install usr/lib/libnvidia-cfg.so.${pkgver} ${startdir}/pkg/usr/lib
    install usr/X11R6/lib/libXvMCNVIDIA.a ${startdir}/pkg/usr/lib
    install usr/X11R6/lib/libXvMCNVIDIA.so.${pkgver} ${startdir}/pkg/usr/lib
    install usr/X11R6/lib/modules/drivers/nvidia_drv.o ${startdir}/pkg/usr/lib/xorg/modules/drivers
    install usr/X11R6/lib/modules/drivers/nvidia_drv.so ${startdir}/pkg/usr/lib/xorg/modules/drivers
    install usr/X11R6/lib/modules/extensions/libglx.so.${pkgver} ${startdir}/pkg/usr/lib/xorg/modules/extensions
    cp -pr usr/include ${startdir}/pkg/usr/share/nvidia
    cp -pr usr/share/doc/* ${startdir}/pkg/usr/share/nvidia
    cp -pr usr/share/applications/* ${startdir}/pkg/usr/share/applications
    cp -pr usr/share/man/man1 ${startdir}/pkg/usr/man
    cd ${startdir}/pkg/usr/lib/
    ln -fs libGL.so.${pkgver} libGL.so
    ln -fs libGL.so.${pkgver} libGL.so.1
    ln -fs libGLcore.so.${pkgver} libGLcore.so.1
    ln -fs libnvidia-cfg.so.${pkgver} libnvidia-cfg.so.1
    ln -fs libnvidia-cfg.so.${pkgver} libnvidia-cfg.so
    ln -fs libnvidia-tls.so.${pkgver} libnvidia-tls.so.1
    ln -fs libXvMCNVIDIA.so.${pkgver} libXvMCNVIDIA-dynamic.so.1
    cd ${startdir}/pkg/usr/lib/xorg/modules/extensions
    ln -fs libglx.so.${pkgver} libglx.so
    }
    ..and here's the .install
    _driver=NVIDIA-Linux-x86-1.0-8178-pkg0/./nvidia-installer
    post_install() {
    #switch in nvidia GL headers
    ln -sf /usr/share/nvidia/include/GL/gl.h /usr/include/GL/gl.h
    ln -sf /usr/share/nvidia/include/GL/glext.h /usr/include/GL/glext.h
    ln -sf /usr/share/nvidia/include/GL/glx.h /usr/include/GL/glx.h
    ln -sf /usr/share/nvidia/include/GL/glxext.h /usr/include/GL/glxext.h
    cat << EOF
    IMPORTANT
    ==> In order to use the software, you have to agree to NVIDIA's license located in
    ==> /usr/share/nvidia/LICENSE (installed with nvidia-kernel - required)
    ==> If you don't, please remove this package (pacman -R nvidia-glx)
    To use this driver you need the nvidia-kernel package too.
    If you need more information about setting up nvidia drivers have a look at:
    "http://wiki.archlinux.org/index.php/How_to_install_NVIDIA_driver"
    Note: OpenGL headers have been replaced (if applicable) by symlinks to the NVidia headers.
    To switch to the mesa headers, reinstall mesa.
    (You may want to do this if compiling a GL app for another system.)
    EOF
    }
    post_upgrade() {
    post_install $1
    }
    pre_remove() {
    cat << EOF
    ==> Don't forget to update your /etc/X11/XF86Config or /etc/X11/xorg.conf!
    You may want to remove nvidia-kernel as well.
    EOF
    }
    op=$1
    shift
    $op $*
    ..As you can see, the nvidia-glx .install instates the NVidia GL headers to aid compiling against them. If mesa is installed, its versions of these files will be overwritten. Making sure to install the package for the desired headers last seemed like the simplest way to select them. Gentoo has some script to select the GL libs.
    PS: This is designed alongside the current mesa-6.4.2-1 package in testing.
    So. Any comments? Any good? Or shall I just keep them to myself?
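    For reference, a minimal build-and-install sketch (my addition, not from the original post), assuming each PKGBUILD sits in its own directory named after the package, next to its .install file and, for nvidia-kernel, nvidia.rc and the two patches:
    # build both packages as a regular user
    (cd nvidia-kernel && makepkg)
    (cd nvidia-glx && makepkg)
    # install them in a single transaction (as root), since each depends on the other;
    # exact package filenames depend on your makepkg/PKGDEST settings
    pacman -U nvidia-kernel/nvidia-kernel-*.pkg.tar.gz nvidia-glx/nvidia-glx-*.pkg.tar.gz
    # remove them later, if needed
    pacman -R nvidia-glx nvidia-kernel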

    tpowa wrote: well, the problem is not that it's not possible; the problem is whether it is allowed.
    I'm not a license expert, but as far as I can remember no big distro provides the packages other than through the nvidia-installer.
    Gentoo is a special case.
    Yeah, I thought that might be a problem. The LICENSE seems to have a clause to do with splitting up the package...
    No Separation of Components.  The SOFTWARE is licensed as a
    single product.  Its component parts may not be separated for use
    on more than one computer, nor otherwise used separately from the
    other parts.
    But this refers to *use* of the separate parts, which I suppose is a bit vague. Anyhoo, as I say, this modified install method can still work with a combined package. It could even skip the kernel module compile if the module is already present.
    The only other part that seems relevant is...
    2.1.2  Linux/FreeBSD Exception.  Notwithstanding the foregoing terms
    of Section 2.1.1, SOFTWARE designed exclusively for use on the Linux or
    FreeBSD operating systems, or other operating systems derived from the
    source code to these operating systems, may be copied and redistributed,
    provided that the binary files thereof are not modified in any way
    (except for unzipping of compressed files).
    But since no binary files are modified this doesn't seem to apply. Of course, IANAL..

  • *URGENT* T500 not using any graphics card?

    I accidentally disabled the ATI Radeon 3650 card and then the Intel graphics card. Now none of my games will run after disabling both of them, and it says to run dxdiag. Is there any way I can enable the cards again? In Display Properties it says I am running some VGA card and it shows N/A for all the properties about the card. How do I enable the graphics card again?

    I visited the link that you sent, but my problem is the 10th step: I only have 3 options, which are Visa, MasterCard, and the other green card; the iTunes program that I have does not have the "None" option for me to choose. Is there anything I have to do?

  • Mid 2010 Macbook Pro won't switch to Nvidia graphics.

    I have been playing Starcraft II on my Macbook Pro (i7/4G RAM/15" Display) for a few weeks now. Yesterday my frame rates dropped through the floor. The game was nearly unplayable. I thought initially that the level I was playing just had too many units. I re-played one of the tutorial levels and it too was slow. I lowered the graphics quality settings in the game and it didn't get much better.
    I put the game into windowed mode (rather than full screen) and started playing. I pulled up the system profile tool and saw that the intel graphics were active, not the nvidia graphics.
    I went into system preferences/energy saver and shut off automatic graphics switching and it was still using the Intel HD graphics.
    This happens all the time, plugged in or not.
    Any ideas?

    Thanks for your suggestions. I ended up taking it to an Apple store and they ran diagnostics on it. It came up clean.
    The tech powercycled it and cleared the PRAM (?) by holding cmd-option-p-r while booting.
    I am not sure what fixed it, but Starcraft 2 framerates are back where they used to be.

  • Std::bad_alloc occurs only when using VS Graphics Debugger

    I'm using C# and SharpDX (a c# wrapper for directx) and I keep getting the following exception when using the Graphics Debugger in VS2012 and VS2013:
    First-chance exception at 0x7690c41f in Craft.exe: Microsoft C++ exception: std::bad_alloc at memory location 0x3276eda4.
    A first chance exception of type 'System.Runtime.InteropServices.SEHException' occurred in SharpDX.Direct3D11.dll
    An exception of type 'System.Runtime.InteropServices.SEHException' occurred in SharpDX.Direct3D11.dll but was not handled in user code
    Additional information: External component has thrown an exception.
    I have native code debugging enabled and I'm using the directx debug device.  The program runs fine normally, but when I use the graphics debugger, I get this exception almost all the time, usually pretty late in the loading phase.  When it happens,
    it highlights this line of code:
    newVertexBuffer = new Buffer(Engine.Device, VertexDataStream, (int)VertexDataStream.Length, ResourceUsage.Immutable, BindFlags.VertexBuffer, CpuAccessFlags.None, ResourceOptionFlags.None, 0);
    Like I said, this happens quite late into the loading phase, at which point this line of code has run hundreds of times successfully, and works just fine when not using the graphics debugger.  Is this a bug/issue with the VS debugger or something?

    Hi,
    Welcome to MSDN.
    While running under the Visual Studio debugger, the default memory allocation goes through the debugger heap, not the debug CRT heap.
    This can be slow if we allocate/free a lot of memory, since operations on this special heap cost a lot of time.
    You could refer to this thread and this link to get more information. There is a detailed description of the debugger heap shared by "MSN" (his user name) in this link.
    In order to solve this issue, we can disable the debug heap with the following steps:
    Control Panel -> System Properties -> Advanced System Settings -> Environment Variables -> System Variables.
    Add a new variable called "_NO_DEBUG_HEAP" with a value of "1".
    In addition, I suggest you post this issue in the forum below:
    http://xboxforums.create.msdn.com/forums/76.aspx
    like this thread, "Visual Studio 2012 Graphics Debugger Issue",
    since there are a lot of developers there with more specialized knowledge who can help you.
    Thanks for your understanding.
    Regards.

  • I am using a code based typesetting program (not WYSIWYG) that outputs PDFs. I am producing 100 plus pages that have multiple graphics on each page. I need to know how to format a PDF command that I can include in my programming that will tag my graphics

    I am using a code based typesetting program (not WYSIWYG) that outputs PDFs. I am producing 100 plus pages that have multiple graphics on each page. I need to know how to format a PDF command that I can include in my programming that will tag my graphics with "Alternative Text".
    I know that with a Microsoft product, graphics can be tagged before a PDF is made. I need to know how to do this with my programming.

    The Acrobat SDK might be a starting point.
    From there, perhaps a plug-in (built with C++).
    Perhaps with a licensed release of a PDF Library (this could be $$).
    The viable and cost-effective alternative is to use the tried and true.
    Authoring in an appropriate authoring application with appropriate tag management.
    Example:  Adobe InDesign; Adobe FrameMaker or MS Word with PDFMaker (comes with install of Acrobat).
    This way you place "Alternative Text" when mastering content in the authoring file.
    Going that route, and with some look-see (research), you may find programmatic approaches to placing the alt text in the authoring file.
    Note: as discussed in the Matterhorn Protocols there is no programmatic method that provides a fully accessible PDF (specifically, that is an ISO 14289-1, PDF/UA-1 compliant PDF).
    Regardless, here you have a sub-forum for discussions on Acrobat usage.
    Consequently, discussions on/of 3rd-party software are rather out of scope, eh.
    Be well...

  • How to disable Intel HD graphics and use nvidia card?

    Hello,
    I would like to ask for some assistance.
    I want to use the Gtx 630m but instead the laptop keeps using the Intel HD chipset.
    What I tried so far:
    1. In the nVidia control panel I set the nVidia graphics card as the default in the global settings, and for several programs I frequently use as well
    2. In BIOS there is no option for Switchable Graphics
    Is there any BIOS update for this?
    Thanks!
    Laptop:
    HP Pavilion 15-b102sh Sleekbook
    Model: D4N32EA
    System version: Windows 8.1

    @Spinkick ,
    Hi again and thanks for posting back.  Here is a link from our forums that deals with the same issue. There is a pile of information here on how to deal with this.
    How to switch to Nvidia graphics instead of intel
    Here is another one from HP support.  It is not for the same model, however it covers the same topic and works with the nvidia software.
    NVIDIA Optimus Graphics with Integrated Intel Graphics on HP Pavilion dv7t-7000 CTO Notebook PCs (Wi...
    If you are trying to stop using the intel GPU altogether, you are not going to be able to do that at all.
    The system and the software are designed to work with both GPU chipsets and assign work accordingly.
    I hope this helps.
    Thank you again for posting and have a great day.
    D5GR
    I work on behalf of HP

  • Why does my Thinkpad W541 use Integrated graphics in WoW instead of the nVidia Quadro K1100m?

    I have seen reviews by many people on the W540/541 and they all said they were getting around 100 fps in WoW, and even the guy in this video was able to play it on high settings: http://www.youtube.com/watch?v=j0j045vg3W8 Don't get me wrong, I use this device mainly for professional work, but every now and then I would like to be able to play a game or two on it. Thanks!

    According to a number of contributors on a number of threads regarding the same subject, THIS IS BY DESIGN.
    If you want to force use of the K1100m on the W541, it will need to be (a) on an external monitor and (b) controlled through a dock, rather than via the video connectors on the laptop itself.  According to "the specs", the laptop screen is always handled by the Intel Graphics... no matter whether the W540/W541 BIOS is set to "basic" or "advanced" graphics mode.
    According to the following description of how graphics works in W540/W541 and newer machines (which is that Optimus Mode is always active, although you can select "basic mode" or "advanced mode"), you simply MUST use Optimus Mode and cannot disable Intel Graphics as you could with the W530.
    Now I'd always thought that in theory, for the laptop screen, you can use nVidia Control Panel (3D settings) to specify which programs you want to get nVidia graphics for when those programs' windows have focus.  But my experience (granted, with the W530 and not with the W540/W541) is that nVidia graphics kick in (and take over from the Intel graphics) reliably only when the firmware determines that graphics performance requirements justify it.
    Strangely, the description of Optimus Driver behavior (below) makes no mention of the NVidia graphics ever kicking in for the laptop screen, but I thought that was how it worked.  Confusing and contradictory descriptions, seemingly.  You'd think that gaming applications would be just such an example of "graphics performance demands nVidia graphics", but your thread subject suggests that nVidia is NOT kicking in (I assume you're running on your laptop screen, and have probably tried to go to nVidia Control Panel to request nVidia graphics when you run WoW), which would be consistent with the written descriptions but very definitely annoying.  On the W530, Optimus behavior had the K1000m definitely kicking in on the laptop screen when needed.
    So apparently by design Optimus is always active on W540/W541 and newer machines. You can't actually disable the integrated GPU at all and force the use of the discrete graphics, as you can on the W530 for the K1000m nVidia graphics via its BIOS. Here is the description of Optimus Drivers for the W541, which describes Standard vs. Advanced mode, and by implication how the new W540/W541 BIOS design works:
    STANDARD and ADVANCED MODE
    In Standard mode, all dock displays use Integrated Graphics as the display output,
    and the configuration is limited to a maximum of 3 displays including the Computer's LCD.
    In Advanced mode, all dock displays use Discrete Graphics as the display
    output, which increases the maximum number of displays to 6 including the Computer's LCD.
    ThinkPad W540, W541 (Standard Mode)
    Intel HD Graphics
    - (Computer's LCD)
    - Computer's analog VGA connector
    - Computer's DisplayPort connector
    - Docking Station's analog VGA connector
    - Docking Station's DVI connector(s)
    - Docking Station's DisplayPort connector(s)
    - Docking Station's HDMI connector
    NVIDIA Quadro K2100M or NVIDIA Quadro K1100M
    - No display is connected to this display adapter.
    ThinkPad W540, W541 (Advanced Mode)
    Intel HD Graphics Family
    - (Computer's LCD)
    - Computer's analog VGA connector
    - Computer's DisplayPort connector
    NVIDIA Quadro K2100M or NVIDIA Quadro K1100M
    - Docking Station's analog VGA connector
    - Docking Station's DVI connector(s)
    - Docking Station's DisplayPort connector(s)
    - Docking Station's HDMI connector

  • Photoshop CC 2014 (freshly installed and updated) does not recognize my graphics processor (NVIDIA GTX 840)

    Photoshop CC 2014 (freshly installed and updated) does not recognize my graphics processor (NVIDIA GTX 840), but only an Intel HD 4600. How do I ensure that PS is using the full hardware power of my notebook? My OS is Win 8.1. Thanks for your help, Meik

    In the end it was the NVIDIA software: to save energy, it was excluding PS as an application supported by the high-performance graphics processor. I activated NVIDIA card use for PS & LR, and all is fine now.
