LR6 Using Intel Graphics instead of Nvidia

It's using the Intel HD 4600 instead of my Nvidia GTX 680 with 4GB of memory. My Nvidia driver is updated to the very latest. Lightroom says it passed the OpenGL test, but it's not using the Nvidia card.
Lightroom version: 6.0 [1014445]
License: Perpetual
Operating system: Windows 8.1 Home Premium Edition
Version: 6.3 [9600]
Application architecture: x64
System architecture: x64
Logical processor count: 8
Processor speed: 2.4 GHz
Built-in memory: 16303.0 MB
Real memory available to Lightroom: 16303.0 MB
Real memory used by Lightroom: 1416.9 MB (8.6%)
Virtual memory used by Lightroom: 1454.4 MB
Memory cache size: 352.5 MB
Maximum thread count used by Camera Raw: 4
Camera Raw SIMD optimization: SSE2,AVX,AVX2
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Displays: 1) 1920x1080
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
Graphics Processor Info:
Intel(R) HD Graphics 4600
Check OpenGL support: Passed
Vendor: Intel
Version: 3.3.0 - Build 10.18.10.3960
Renderer: Intel(R) HD Graphics 4600
LanguageVersion: 3.30 - Build 10.18.10.3960

I too am having a primary/secondary graphics card issue.
I have two graphics cards installed, a GTX 760 and a Quadro K600. The GTX 760 is configured as the primary graphics card, yet Lightroom insists on using the Quadro (a slower card) for GPU acceleration.
For me, it is not acceptable to disable the Quadro in order to get the GTX 760 to be the GPU accelerator.
Is there anything in the works to let us choose which GPU to use from the Preferences > Performance panel? Photoshop picks the correct GPU, so why doesn't Lightroom?
Thanks in advance.
Phil Kemp

Similar Messages

  • GT60 2PC 3K keeps using Intel graphics instead of Nvidia?

    Hiya, having incredibly frustrating issues with my GT60 here.
    Out of the box, this thing works relatively OK, except the version of Win 8 that's installed on it has some memory leak issues when running fullscreen applications. No big deal; I ran Windows Update and that turned out to be sorted.
    However, now it's got the awful Intel onboard graphics as the main graphics processor. The majority of games won't start because it won't support DirectX and other shenanigans like that. Disabling the Intel graphics in Device Manager had some small effect: I actually got my games to start, but as soon as they try to render anything more than a loading screen, I get "out of video memory" errors, so it's like it's still refusing to use the Nvidia card.
    This was all working fine before I did any Windows updates, except for the random memory leaks that would suddenly lag the crap out of games for a while. Has anyone had any issues like this with a GT60 3K or similar series? Any suggestions would be greatly appreciated!

    Hello gorgoncola,
    Have you tried recovering the system and then trying the same scenario?
    Does your game run okay on a cleanly recovered system, or does it still not run on the NVIDIA graphics card?
    Not all applications are detected by the NVIDIA graphics driver; sometimes you'll need to manually add them to the program list in the NVIDIA Control Panel so that they automatically switch to the NVIDIA graphics card.

  • Lightroom 6 uses Intel graphics instead of Nvidia K610M

    So far so good, but on my ZBook 17 LR runs on the Intel HD Graphics 4600.
    That is about as fast (or slow) as running on the CPU alone.
    Is there any way to make LR use the Nvidia K610M instead of the Intel one?
    Kind regards

  • Using Intel graphics in Windows on MacBook Pro Retina Late 2013?

    I own a MacBook Pro Retina 15" Late 2013 with Nvidia GT750M graphics.
    On Windows, the power consumption and heat production are noticeably higher than on OS X.
    People say this is due to the Nvidia GT750M being used constantly, because there is officially no way to switch to the more power-saving Intel graphics on Windows.
    I have found an extremely tricky way to switch the graphics, but unfortunately I have no idea how to do it:
    http://forums.macrumors.com/showthread.php?t=696523&page=40
    Does someone know an easier way to do it, or can explain it step by step?
    I'd appreciate it very much if I could have longer battery life on Windows.
    Thank you.

  • How to make MacBook Pro use the Intel graphics card only?

    When unplugged, my 15" Pro always uses the AMD graphics, which makes my battery last only 2-3 hours. Is there a way to stop the computer from using the AMD graphics card when I'm not playing games?

    Try using this third-party (but safe) software called gfxCardStatus. It places itself in the menu bar (not the Dock), usually as an italic d or an i. Just click on it and choose the Integrated Only option, which switches the machine to the integrated graphics card. Also check for dependencies listed below the "Dynamic Switching" option (if there are none, just skip this step); dependencies can force a particular graphics card to stay in use and may be the problem, as they were for me, though it might also be some internal error. If you don't trust me I totally understand; I'm just putting it up there for those who want it. The suggestion posted above by "ds store" might also work.
    Link: http://codykrieger.com/gfxCardStatus
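    For reference, if you want to confirm from Terminal which GPU is actually in use before and after switching with gfxCardStatus, something like the following should work. These are read-only checks; the "gpuswitch" entry in the pmset output is an undocumented setting that only shows up on some dual-GPU MacBook Pro models, so treat that part as an assumption.
    # List both graphics adapters; the one with the display/resolution details
    # attached is the GPU currently driving the built-in screen.
    system_profiler SPDisplaysDataType
    # On some dual-GPU MacBook Pros this shows the automatic graphics-switching state.
    pmset -g live | grep -i gpuswitch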

  • Ideas for using Intel graphics

    I've got two laptops, one with an i830, the other with an i852/855GM. Neither works with KMS (kernel mode setting). The i830 just shows a black screen with KMS; the i855 works for a little while and then completely freezes.
    The i830 laptop works well with the nomodeset grub kernel option and the 2.9.1 driver. It fails completely with KMS enabled and doesn't work at all with 2.10, as that has no UMS (user mode setting). Suspend/resume doesn't work well. The vesa driver doesn't work well with the i830 either, and I'm not sure why.
    The i855 freezes with KMS, the 2.10 driver doesn't support UMS, and the 2.9 driver doesn't work at all. I've removed the xf86-video-intel driver and set Xorg to use the vesa driver. I've had no issues using the vesa driver, with suspend/resume actually working.
    I would like comments on issues with older graphics chips and newer versions of Xorg. There seems to be a trend where support for older graphics chips is being dropped by the manufacturers, and the older drivers are also being dropped by Xorg.

    I don't have a notebook, but my IBM Netvista works only w/ intel-legacy:
    [karol@black arch]$ lspci | grep VGA
    00:02.0 VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 01)
    I have KMS enabled and get no lock-ups, but once I get into X, I can't go back to the console (the screen is garbled, but otherwise works fine).
    This is my downgrade script:
    #!/bin/bash
    # Reinstall known-good package versions from the local pacman cache
    # (pacman -U installs local package files, pinning these older builds).
    pacman -U /var/cache/pacman/pkg/xorg-server-1.6.3.901-1-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xorg-server-utils-7.4-7-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xf86-video-intel-legacy-2.3.2-3-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xf86-video-vesa-2.2.0-1-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xf86-input-evdev-2.2.5-1-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xf86-input-keyboard-1.3.2-2-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/xf86-input-mouse-1.4.0-2-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/intel-dri-7.5.1-2-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/libgl-7.5.1-2-i686.pkg.tar.gz \
    /var/cache/pacman/pkg/libdrm-2.4.19-1-i686.pkg.tar.gz
    in my pacman.conf I have
    IgnorePkg = xorg-server xorg-server-utils xf86-video-intel-legacy xf86-video-vesa xf86-input-evdev xf86-input-keyboard xf86-input-mouse intel-dri libgl libdrm
    I tried many different drivers, libdrm versions, etc.; nothing worked. With the above setup I have no complaints.
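    For reference, a quick way to double-check what the machine actually ended up using after this kind of downgrade. The log path and module name below are the usual defaults, so adjust them if your setup differs.
    # Which driver modules did the running X server load (intel, intel-legacy or vesa)?
    grep -i loadmodule /var/log/Xorg.0.log
    # Is kernel modesetting active for the Intel driver? The i915 module also covers
    # the older i830/i855 chips; this file only exists when the module is loaded.
    cat /sys/module/i915/parameters/modeset 2>/dev/null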

  • Who here uses Intel Graphics 4400 and experiences pixelation problems? Because I do. Please help

    Help

    [koala@myhost ~]$ lsmod
    Module Size Used by
    uhci_hcd 18764 0
    video 16208 0
    backlight 3652 1 video
    ath_pci 207800 0
    wlan 186612 1 ath_pci
    ath_hal 298208 1 ath_pci
    ath5k 88896 0
    Here is my lsmod...:)

  • Re-Enable Default Intel Graphics

    Need some help here.
    Previously, my laptop would only use the Nvidia graphics during gaming sessions. However, since I updated my Intel graphics driver, the Nvidia has suddenly become the default graphics card.
    Does anyone have any ideas on how to get my laptop to use the Intel graphics by default?
    I have tried the following methods, but none of them helped:
    1) Changed from "Auto-Select" to "Integrated Graphics" in the NVIDIA Control Panel.
    2) Disabled the "Nvidia graphics card" in Device Manager.
    3) System restore.
    My LED is orange, which indicates I'm using the Nvidia graphics card...

    Quote from: mklim;111695
    Hi, thanks for replying.
    I googled around and some people were suggesting your method too, but I couldn't find that particular setting in the BIOS.
    Anyway, I have solved my issue.
    What I did was completely reinstall my Nvidia graphics driver and voilà!
    Glad that you solved it. I will be getting my first MSI as well, a GS70 2PE, which I hope does not have problems.

  • Flash video artifacts when using discrete graphics

    So I have a 2011 MacBook Pro with Intel and ATI graphics. When watching flash video (YouTube), I have weird artifacting going on when using just discrete graphics. While using Intel graphics, the video usually plays fine, just a little choppy. I am using gfxCardStatus to switch between them. Is this a flash bug or something else? I've added an image to show what I'm talking about.

    Darrell110 posted in macromedia.dreamweaver:
    > problem is I do not want to use another player. I add a lot of video for clients and I love how easy it is to do that with Dreamweaver.
    I don't see any evidence that you used Dreamweaver to insert the player. Normally, when you insert Flash with Dreamweaver, it inserts code I can readily identify. Did you use GoLive?
    The pages you show are conspicuously missing the <object> code and the JavaScript that DW uses.
    And I don't see any differences in the players. The pages both link to files named flvplayer.swf (in different folders) with the same look and the same file size. I am late in this thread, so maybe I missed the boat.
    Anyway, I don't know if this would make a difference with that player, but one page uses a relative link to the .flv, while the other page uses an absolute link, including http://. See if changing that one to a relative link helps.
    I only watched about a minute from both pages and, although 872 Kbps for the video track is VERY high, I did not see any glitches. (Did you mean to use two slashes in the second link?)
    > Here are samples using the same video file, only different PLAYERS:
    > http://bobbywarns.com/viewvideo.html
    > http://bobbywarns.com//video/viewvideo.html
    Mark A. Boyd
    Keep-On-Learnin' :)

  • Choppy and laggy, especially iChat, on C2D iMac with Intel graphics

    Last night I upgraded from 10.4.10 to Leopard on my iMac C2D. Now, my iMac has the integrated Intel graphics and not a fully fledged ATI card, but this has never been an issue before, as Tiger was always nippy. I have 1.25GB of RAM, from a 1GB stick and a 256MB stick.
    I performed an upgrade install and it all went fine. Initially, though, the system was very laggy and choppy, especially when using the Dock, Cover Flow and even screensavers. Having used Cover Flow extensively in iTunes, I can say it was never an issue before. I then decided to check out iChat. iChat was extremely choppy, in particular its video capture, which lagged several seconds behind and even dropped frames. When I tried the effects these were about the same, no worse really, and the background substitution effects worked very poorly, with most of me missing.
    I rebooted after installing the second wave of updates for Leopard and there was a significant increase in performance, but it's still choppy in places. I'm not sure if it's due to my integrated graphics and wondered whether anyone else has had a similar issue, such as MacBook or Mac mini users?
    I've ordered a second 1GB stick from Crucial and hope this will bottom out any potential memory bottleneck, but I can't help worrying it's the Intel graphics. Would Apple seriously use Intel graphics only to release an OS which struggles with it?

    In part that's a relief. I did not move over to Tiger until it was at 10.4.6, to ensure stability, and I was always pleased with its performance. In truth Leopard works fine, but graphically it does seem to have speed issues, which I had put down to the GMA Intel graphics.
    I find iChat and Cover Flow the worst affected, and I notice even the minimal slowdown, I'm guessing, because of the excellent performance I had previously in Tiger.
    In iChat the video effects are terrible and the interactive backdrops fail to detect me properly. In Cover Flow in the Finder the delay is less noticeable, but you can see the icons sharpen up after a second or two.
    Hopefully Apple will sort this out shortly.

  • G780 Nvidia 630M refuses to use dedicated graphics card

    Okay, so I have the G780: Windows 7 64-bit, Core i7 2.9 GHz, 6GB RAM, 750GB HDD, Intel HD Graphics 4000, and the Nvidia GT 630M dedicated 2GB card.
    For the past two years I have had this computer and have about 20 games that at one time worked perfectly fine, until one day I decided to launch a game (Dishonored) and it crashed. So I decided to play another game, Sleeping Dogs, and it crashed as well. Launched Borderlands 2, and what do you know? It crashed too.
    So I downloaded TechPowerUp's GPU-Z to see if the GPU was running, overheating, etc. I found out, as I was running Borderlands 2, that it was using the integrated HD 4000 graphics instead of my dedicated card.
    Now I have been having several problems with this. I have updated the Nvidia drivers to version 9.18.13.3467 (1/15/14), am running graphics BIOS version 70.08.a8.00.43, and have updated the NVIDIA Control Panel to the latest version. This barely works: even though I have personally gone in and added every game's .exe file to the "Use High-Powered Graphics" setting, it still refuses to run on the dedicated card and runs on the integrated one, as seen in GPU-Z.
    Any help here?

    Hi unsterblich666,
    Welcome to the Lenovo Community Forums!
    Try uninstalling both GPU drivers from Device Manager:
    first the Intel HD Graphics, making sure to place a check mark under "Delete the driver software for this device",
    then continue to uninstall the Nvidia one as well, again checking the box to delete the driver software.
    Restart the system if prompted,
    then install the drivers back using these versions from the support site:
    Intel Video Driver for 64-bit Windows
    NVIDIA VGA Driver for 64-bit Windows
    Let me know your findings.
    Cheers!

  • Why does my ThinkPad W541 use integrated graphics in WoW instead of the Nvidia Quadro K1100M?

    I have seen reviews by many people of the W540/541 and they all said they were getting around 100 fps in WoW, and even the guy in this video was able to play it on high settings: http://www.youtube.com/watch?v=j0j045vg3W8 Don't get me wrong, I use this device mainly for professional work, but every now and then I would like to be able to play a game or two on it. Thanks!

    According to a number of contributors on a number of threads regarding the same subject, THIS IS BY DESIGN.
    If you want to force use the K1100m on the W541, it will need to be (a) on an external monitor and (b) controlled through a dock, rather than via the video connectors on the laptop itself.  According to "the specs", the laptop screen is always handled by the Intel Graphics... no matter whether the W540/W541 BIOS is set to "basic" or "advanced" graphics mode.
    According to the following description of how graphics works in W540/W541 and newer machines (which is that Optimus Mode is always active, although you can select "basic mode" or "advanced mode"), you simply MUST use Optimus Mode and cannot disable Intel Graphics as you could with the W530.
    Now, I'd always thought that in theory, for the laptop screen, you can use the nVidia Control Panel (3D settings) to specify which programs should get nVidia graphics when those programs' windows have focus. But my experience (granted, with the W530 and not with the W540/W541) is that nVidia graphics kick in (and take over from Intel graphics) reliably only when the firmware determines that graphics performance requirements justify it.
    Strangely, the description of Optimus Driver behavior (below) makes no mention of the NVidia graphics ever kicking in for the laptop screen, but I thought that was how it worked.  Confusing and contradictory descriptions, seemingly.  You'd think that gaming applications would be just such an example of "graphics performance demands nVidia graphics", but your thread subject suggests that nVidia is NOT kicking in (I assume you're running on your laptop screen, and have probably tried to go to nVidia Control Panel to request nVidia graphics when you run WoW), which would be consistent with the written descriptions but very definitely annoying.  On the W530, Optimus behavior had the K1000m definitely kicking in on the laptop screen when needed.
    So apparently, by design, Optimus is always active on W540/W541 and newer machines. You can't actually disable the integrated GPU at all and force the use of the discrete graphics, as you can on the W530 for the K1000M nVidia graphics via its BIOS. Here is the description of Optimus drivers for the W541, which describes Standard vs. Advanced mode and, by implication, how the new W540/W541 BIOS design works:
    STANDARD and ADVANCED MODE
    In Standard mode, all dock displays uses Integrated Graphics as display output
    and is limited to a maximum of 3 displays including Computer's LCD.
    While in Advanced mode, all dock displays uses Discrete Graphics as display
    output and it increases the maximum number of displays to 6 including Computer's LCD.
    ThinkPad W540, W541 (Standard Mode)
    Intel HD Graphics
    - (Computer's LCD)
    - Computer's analog VGA connector
    - Computer's DisplayPort connector
    - Docking Station's analog VGA connector
    - Docking Station's DVI connector(s)
    - Docking Station's DisplayPort connector(s)
    - Docking Station's HDMI connector
    NVIDIA Quadro K2100M or NVIDIA Quadro K1100M
    - No display is connected to this display adapter.
    ThinkPad W540, W541 (Advanced Mode)
    Intel HD Graphics Family
    - (Computer's LCD)
    - Computer's analog VGA connector
    - Computer's DisplayPort connector
    NVIDIA Quadro K2100M or NVIDIA Quadro K1100M
    - Docking Station's analog VGA connector
    - Docking Station's DVI connector(s)
    - Docking Station's DisplayPort connector(s)
    - Docking Station's HDMI connector

  • Photoshop CS6 Randomly Crashes using Intel HD Graphics 4000

    When I use Photoshop CS6 on Windows 8 [Surface Pro], it randomly crashes. I can normally only use Photoshop for less than a minute, doing fewer than ten actions, before it crashes. I get an error message saying "Adobe Photoshop CS6 has stopped working." If I debug it using Visual Studio, I get this message:
    Unhandled exception at 0x5442B60E (ig7icd32.dll) in Photoshop.exe: 0xC0000005: Access violation reading location 0x0000002C.
    I looked up ig7icd32.dll and it seems to be published by Intel and related to my graphics hardware, so I assume the problem is related to my Intel HD Graphics 4000.
    Does anyone have any idea on how to fix this problem or as to what might be causing it?
    EDIT:
    Also, when launching Photoshop, I get this message:
    Photoshop has encountered a problem with the display driver, and has temporarily disabled enhancements which use the graphics hardware.
    Check the manufacturer's website for the latest software.
    Graphics hardware enhancements can be enabled in the performance panel of preferences.
    For more information, visit:
    http://www.adobe.com/go/photoshop_gpu_en
    All my drivers are fully updated, and all Windows updates are installed. No other applications have any problems with my graphics hardware.

    If Intel does not have an updated driver, I would check the MS Surface Pro Web site.
    If no updated driver exists, you might get it to run if you turn off all graphics acceleration.
    Unfortunately, while the Intel graphics chips themselves are good, driver support has been horrible. I recommend against using them if one has programs like Ps CS 6, Premiere Pro CS 5 - 6, Premiere Elements 9 - 11, After Effects CS 6, any 3D applications, etc. The Intel chips and drivers are OK for checking e-mail, surfing the Web, doing light word processing, or some spreadsheet work, i.e. the tasks that they were designed for. They fall far short with video-intensive applications that need to interface with the video driver.
    Good luck,
    Hunt

  • What video card do I need to make Photoshop CC 2014 use 3D? I have NVIDIA GeForce GTX 780M 4096 MB graphics.

    A couple of questions:
    1.  Have you tried resetting your preferences?
    2.  If you go to Preferences > Performance - is "Use Graphics Processor" enabled?  If not, can you select it?
    Your system should be fine to support 3D in PS CC 2014. Once I have more information from you, we can move on to the next steps if necessary.
    Thanks,
    Adam

  • Set Nvidia over Intel Graphics?

    My computer (a Lenovo Y580) has an integrated Intel chip and an Nvidia GTX 660M. I have both drivers installed on my system, yet whatever I open, it uses the Intel chip to run it, when I really only want the Nvidia running it. I am getting really frustrated now; is there a program for my system that lets you switch between the devices?

    Okay, I think this is what I need. How do I go about this, though? Am I supposed to remove my Intel or Nvidia drivers before applying the hack? Do I do the hack and then the Bumblebee support? Even if I get the hack and it works, would it still choose the Intel over the Nvidia?
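    For anyone else reading this later: since the reply above mentions Bumblebee, here is a rough sketch of what that route typically looks like on an Arch-based install. Package names and the bumblebeed service name are the common defaults, so treat them as assumptions and adapt to your distro.
    # Install Bumblebee, the proprietary NVIDIA driver, and glxinfo (from mesa-demos).
    sudo pacman -S bumblebee nvidia mesa-demos
    # Let your user talk to the Bumblebee daemon, then enable it.
    sudo gpasswd -a "$USER" bumblebee
    sudo systemctl enable --now bumblebeed
    # After logging back in, run something on the discrete GPU and confirm the
    # renderer string reports the GTX 660M rather than the Intel chip.
    optirun glxinfo | grep "OpenGL renderer"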
