Acer V5 753g (Intel Haswell / Nvidia GT 750m dual GPU) Bumblebee

I think I am fighting several issues here, although I might have fixed one already. So far I have managed to disable the nvidia GPU using the proprietary driver and bbswitch, but Bumblebee itself is not working.
First, I was able to isolate an ACPI issue by following the instructions in this bug report: https://github.com/Bumblebee-Project/Bu … issues/460. To be exact, it had me install a kernel module; the code and install instructions are described here: https://github.com/Bumblebee-Project/bb … ack-lenovo (after the bug report, the author of this code was generous enough to add my model to the source, so modifications are no longer necessary).
Now, after a reboot, the GPU is disabled, but as soon as I run optirun (which fails), I am no longer able to disable it.
I know the exact same issue is described on the Bumblebee Arch wiki page, but none of the workarounds fit my situation.
bbswitch dmesg output
dmesg | grep bbswitch
[   16.608903] bbswitch: version 0.7
[   16.608910] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.GFX0
[   16.608917] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.RP05.PEGP
[   16.609010] bbswitch: detected an Optimus _DSM function
[   16.609063] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on
[   16.610310] bbswitch: disabling discrete graphics
[   78.445044] bbswitch: enabling discrete graphics
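For reference, bbswitch's current power state can also be checked (and, as root, toggled by hand) through its /proc interface. A minimal sketch, with the parsing factored into a small helper so it can be tried against a captured status line; the helper name is illustrative:

```shell
# Query bbswitch's power state. On a real system the status line comes from
# /proc/acpi/bbswitch and looks like "0000:01:00.0 ON" or "0000:01:00.0 OFF".
bbswitch_state() {
    # Usage on a live system: bbswitch_state "$(cat /proc/acpi/bbswitch)"
    echo "${1##* }"   # keep the last whitespace-separated field: ON or OFF
}

bbswitch_state "0000:01:00.0 ON"    # → ON

# Manual toggling (root required; bumblebeed normally handles this itself):
#   echo OFF > /proc/acpi/bbswitch   # power the discrete card down
#   echo ON  > /proc/acpi/bbswitch   # power it back up
```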
Optirun output:
optirun -vv glxspheres
[ 1166.063193] [DEBUG]Reading file: /etc/bumblebee/bumblebee.conf
[ 1166.063481] [DEBUG]optirun version 3.2.1 starting...
[ 1166.063488] [DEBUG]Active configuration:
[ 1166.063491] [DEBUG] bumblebeed config file: /etc/bumblebee/bumblebee.conf
[ 1166.063494] [DEBUG] X display: :8
[ 1166.063497] [DEBUG] LD_LIBRARY_PATH: /usr/lib/nvidia:/usr/lib32/nvidia
[ 1166.063500] [DEBUG] Socket path: /var/run/bumblebee.socket
[ 1166.063503] [DEBUG] Accel/display bridge: auto
[ 1166.063505] [DEBUG] VGL Compression: proxy
[ 1166.063508] [DEBUG] VGLrun extra options:
[ 1166.063511] [DEBUG] Primus LD Path: /usr/lib/primus:/usr/lib32/primus
[ 1166.063527] [DEBUG]Using auto-detected bridge virtualgl
[ 1166.080418] [INFO]Response: No - error: [XORG] (EE) No devices detected.
[ 1166.080441] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ 1166.080445] [DEBUG]Socket closed.
[ 1166.080469] [ERROR]Aborting because fallback start is disabled.
[ 1166.080475] [DEBUG]Killing all remaining processes.
Xorg error log:
[  1166.066]
X.Org X Server 1.14.4
Release Date: 2013-10-31
[  1166.066] X Protocol Version 11, Revision 0
[  1166.066] Build Operating System: Linux 3.11.6-1-ARCH x86_64
[  1166.066] Current Operating System: Linux acer-joschka 3.11.6-1-ARCH #1 SMP PREEMPT Fri Oct 18 23:22:36 CEST 2013 x86_64
[  1166.066] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=9b2b7068-afcf-43a7-a6ac-7fc82d88e472 rw acpi_backlight=vendor
[  1166.066] Build Date: 01 November 2013  05:10:48PM
[  1166.066] 
[  1166.066] Current version of pixman: 0.30.2
[  1166.066]     Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
[  1166.066] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[  1166.066] (==) Log file: "/var/log/Xorg.8.log", Time: Mon Nov  4 14:34:19 2013
[  1166.066] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
[  1166.066] (++) Using config directory: "/etc/bumblebee/xorg.conf.d"
[  1166.066] (==) ServerLayout "Layout0"
[  1166.066] (==) No screen section available. Using defaults.
[  1166.066] (**) |-->Screen "Default Screen Section" (0)
[  1166.066] (**) |   |-->Monitor "<default monitor>"
[  1166.066] (==) No device specified for screen "Default Screen Section".
    Using the first device section listed.
[  1166.066] (**) |   |-->Device "DiscreteNvidia"
[  1166.066] (==) No monitor specified for screen "Default Screen Section".
    Using a default monitor configuration.
[  1166.066] (**) Option "AutoAddDevices" "false"
[  1166.066] (**) Option "AutoAddGPU" "false"
[  1166.066] (**) Not automatically adding devices
[  1166.066] (==) Automatically enabling devices
[  1166.066] (**) Not automatically adding GPU devices
[  1166.066] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
[  1166.066]     Entry deleted from font path.
[  1166.066] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
[  1166.066]     Entry deleted from font path.
[  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
[  1166.066]     Entry deleted from font path.
[  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
[  1166.066] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
[  1166.066]     Entry deleted from font path.
[  1166.066]     (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
[  1166.066] (==) FontPath set to:
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/
[  1166.066] (++) ModulePath set to "/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules"
[  1166.066] (==) |-->Input Device "<default pointer>"
[  1166.066] (==) |-->Input Device "<default keyboard>"
[  1166.066] (==) The core pointer device wasn't specified explicitly in the layout.
    Using the default mouse configuration.
[  1166.066] (==) The core keyboard device wasn't specified explicitly in the layout.
    Using the default keyboard configuration.
[  1166.067] (II) Loader magic: 0x7fdc20
[  1166.067] (II) Module ABI versions:
[  1166.067]     X.Org ANSI C Emulation: 0.4
[  1166.067]     X.Org Video Driver: 14.1
[  1166.067]     X.Org XInput driver : 19.1
[  1166.067]     X.Org Server Extension : 7.0
[  1166.067] (II) xfree86: Adding drm device (/dev/dri/card0)
[  1166.067] setversion 1.4 failed
[  1166.067] (II) xfree86: Adding drm device (/dev/dri/card1)
[  1166.068] (--) PCI:*(0:1:0:0) 10de:0fe4:1025:079b rev 161, Mem @ 0xb2000000/16777216, 0xa0000000/268435456, 0xb0000000/33554432, I/O @ 0x00003000/128
[  1166.068] Initializing built-in extension Generic Event Extension
[  1166.068] Initializing built-in extension SHAPE
[  1166.068] Initializing built-in extension MIT-SHM
[  1166.068] Initializing built-in extension XInputExtension
[  1166.068] Initializing built-in extension XTEST
[  1166.068] Initializing built-in extension BIG-REQUESTS
[  1166.068] Initializing built-in extension SYNC
[  1166.068] Initializing built-in extension XKEYBOARD
[  1166.068] Initializing built-in extension XC-MISC
[  1166.068] Initializing built-in extension SECURITY
[  1166.068] Initializing built-in extension XINERAMA
[  1166.068] Initializing built-in extension XFIXES
[  1166.068] Initializing built-in extension RENDER
[  1166.068] Initializing built-in extension RANDR
[  1166.068] Initializing built-in extension COMPOSITE
[  1166.068] Initializing built-in extension DAMAGE
[  1166.068] Initializing built-in extension MIT-SCREEN-SAVER
[  1166.068] Initializing built-in extension DOUBLE-BUFFER
[  1166.068] Initializing built-in extension RECORD
[  1166.068] Initializing built-in extension DPMS
[  1166.068] Initializing built-in extension X-Resource
[  1166.068] Initializing built-in extension XVideo
[  1166.068] Initializing built-in extension XVideo-MotionCompensation
[  1166.068] Initializing built-in extension XFree86-VidModeExtension
[  1166.068] Initializing built-in extension XFree86-DGA
[  1166.068] Initializing built-in extension XFree86-DRI
[  1166.068] Initializing built-in extension DRI2
[  1166.068] (II) LoadModule: "glx"
[  1166.068] (II) Loading /usr/lib/nvidia/xorg/modules/extensions/libglx.so
[  1166.078] (II) Module glx: vendor="NVIDIA Corporation"
[  1166.078]     compiled for 4.0.2, module version = 1.0.0
[  1166.078]     Module class: X.Org Server Extension
[  1166.078] (II) NVIDIA GLX Module  325.15  Wed Jul 31 18:12:00 PDT 2013
[  1166.078] Loading extension GLX
[  1166.078] (II) LoadModule: "nvidia"
[  1166.078] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[  1166.078] (II) Module nvidia: vendor="NVIDIA Corporation"
[  1166.078]     compiled for 4.0.2, module version = 1.0.0
[  1166.078]     Module class: X.Org Video Driver
[  1166.078] (II) LoadModule: "mouse"
[  1166.078] (II) Loading /usr/lib/xorg/modules/input/mouse_drv.so
[  1166.078] (II) Module mouse: vendor="X.Org Foundation"
[  1166.078]     compiled for 1.14.0, module version = 1.9.0
[  1166.078]     Module class: X.Org XInput Driver
[  1166.078]     ABI class: X.Org XInput driver, version 19.1
[  1166.078] (II) LoadModule: "kbd"
[  1166.078] (WW) Warning, couldn't open module kbd
[  1166.078] (II) UnloadModule: "kbd"
[  1166.078] (II) Unloading kbd
[  1166.078] (EE) Failed to load module "kbd" (module does not exist, 0)
[  1166.078] (II) NVIDIA dlloader X Driver  325.15  Wed Jul 31 17:50:57 PDT 2013
[  1166.078] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[  1166.078] (--) using VT number 7
[  1166.078] (EE) No devices detected.
[  1166.078] (EE)
Fatal server error:
[  1166.078] (EE) no screens found(EE)
[  1166.079] (EE)
Please consult the The X.Org Foundation support
     at http://wiki.x.org
for help.
[  1166.079] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
[  1166.079] (EE)
bumblebee.conf (I didn't change anything here)
# Configuration file for Bumblebee. Values should **not** be put between quotes
## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready is too long and don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumbleblee server group name (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card shoud be ON when Bumblebee
# server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=
# Directory with a dummy config file to pass as a -configdir to secondary X
XorgConfDir=/etc/bumblebee/xorg.conf.d
## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false
# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
#         bbswitch - new in BB 3, recommended if available
#       switcheroo - vga_switcheroo method, use at your own risk
#             none - disable PM completely
# https://github.com/Bumblebee-Project/Bu … PM-methods
## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=auto
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia
## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=auto
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
xorg.conf.nvidia (I placed my GPU device address in the Device section)
Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
    Option      "AutoAddGPU" "false"
EndSection
Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
#   If the X server does not automatically detect your VGA device,
#   you can manually set it here.
#   To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
#   as you see in the commented example.
#   This Setting may be needed in some platforms with more than one
#   nvidia card, which may confuse the proprietary driver (e.g.,
#   trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
     BusID "PCI:01:00.0"
#   Setting ProbeAllGpus to false prevents the new proprietary driver
#   instance spawned to try to control the integrated graphics card,
#   which is already being managed outside bumblebee.
#   This option doesn't hurt and it is required on platforms running
#   more than one nvidia graphics card with the proprietary driver.
#   (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
#   If this option is not set, the new Xorg may blacken the screen and
#   render it unusable (unless you have some way to run killall Xorg).
    Option "ProbeAllGpus" "false"
    Option "NoLogo" "true"
    Option "UseEDID" "false"
    Option "UseDisplayDevice" "none"
EndSection
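Since Xorg reports "No devices detected" even though its PCI probe finds the card at 01:00.0, the BusID string is worth double-checking: Xorg usually wants decimal fields separated by colons (PCI:1:0:0), while lspci prints hex with a dot before the function (01:00.0). A sketch of the conversion; the sample lspci line is illustrative:

```shell
# Convert an lspci slot ("bus:dev.func", hex) into the Xorg BusID form
# ("PCI:bus:device:function", decimal). The sample input mimics the output
# of `lspci | egrep 'VGA|3D'`.
to_busid() {
    slot=${1%% *}          # e.g. "01:00.0"
    bus=${slot%%:*}
    rest=${slot#*:}
    dev=${rest%%.*}
    fn=${rest##*.}
    printf 'PCI:%d:%d:%d\n' $((0x$bus)) $((0x$dev)) $((0x$fn))
}

to_busid "01:00.0 3D controller: NVIDIA Corporation GK107M [GeForce GT 750M] (rev a1)"
# → PCI:1:0:0
```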

Your issue seems to be similar to what I had to face with the Razer Blade 14" (similar hardware as well).
Go to https://bbs.archlinux.org/viewtopic.php?id=173356 and see the instructions for getting graphics to work; check whether they apply (summarized: run linux-ck and set rcutree.rcu_idle_gp_delay=2 in the kernel parameters).
I haven't tried undoing this in more recent updates because I haven't had the time to mess with it. It is possible that it has been fixed, although the thread tracking my (and possibly your) issue at nvidia gives no indication of that:
https://devtalk.nvidia.com/default/topi … iver-crash
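For anyone following along, that kernel parameter goes onto the kernel command line. A minimal sketch assuming GRUB (the `add_param` helper is illustrative, shown on a sample line; syslinux users would edit the APPEND line in syslinux.cfg instead):

```shell
# Append a parameter to GRUB_CMDLINE_LINux_DEFAULT-style lines as found in
# /etc/default/grub, splicing it in just before the closing quote.
add_param() {
    printf '%s\n' "$1" | sed "s/\"\$/ $2\"/"
}

add_param 'GRUB_CMDLINE_LINUX_DEFAULT="quiet"' 'rcutree.rcu_idle_gp_delay=2'
# → GRUB_CMDLINE_LINUX_DEFAULT="quiet rcutree.rcu_idle_gp_delay=2"

# After editing the real file, regenerate the config and reboot:
#   grub-mkconfig -o /boot/grub/grub.cfg
```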

Similar Messages

  • Intel Haswell GPU acceleration in CC?

    Has anyone tried Intel Haswell integrated graphics chips for GPU acceleration in Adobe Premiere Pro CC? If so: cigar? No cigar?
    (I realize Haswell is neither supported nor is anywhere in the same performance bracket with discrete desktop GPUs from AMD and NVidia, and clearly isn't going to make you return your GTX-Titan. Just curious if GPU acceleration is at all possible given Haswell's hardware support for OpenCL and other neat tricks.)

    Premiere Pro CC cannot use Intel graphics for hardware acceleration.
    Guess things change pretty fast in Adobe town...
    GPU acceleration on Premiere Pro CC with Iris Pro (Windows) - Dave Helmly, Intel Developer's Conference (I think), August 2013.
    GPU acceleration in Premiere Pro CC on rMPB with Intel Iris (not Pro) - can anyone with an rMBP and Iris confirm this?

  • HP Envy 15 Graphics Problem with Windows 8.1 (Intel and NVidia Graphics). Help!

    I bought an HP ENVY TouchSmart 15-j013ea about 6 months ago and I have had a major problem with the graphics that I cannot seem to solve. I think it started when I updated to Windows 8.1, even though I am not 100% certain.
    I have noticed the following three problems, and I think they are related:
    Some windows (such as Chrome, Skype, or even Device Manager or other windows) are very blurry compared to others.
    The firefox menu window has some deformed black arrows instead of the normal ones.
    When playing a game (specifically, Star Wars: The Force Unleashed 2), after a while I get an error that the computer has run out of memory. However, according to Task Manager, only about 50% is in use, not to mention that I have 8 GB of RAM + 2 GB graphics memory. At the same time, the computer gets rather loud (like a fan working overtime) and hot.
    The laptop has both Intel and NVidia graphics, and both are enabled in Device Manager. To be honest, I have no idea which is used when, or how to check that (let me know if there is a way)
    I have tried the following without success:
    Uninstalling the Intel graphics driver from device manager and installing a version I downloaded from the HP website for my laptop and OS. After that, Windows would NOT BOOT, and I had to do a system restore.
    Uninstalling the Intel graphics from Programs and Features, and installing 2 different versions from the Intel website. Nothing changed.
    Updating Nvidia driver through GeForce Experience. Nothing changed.
    Updating Intel drivers from Windows Update. Nothing changed.
    Quite frankly, I am not sure what the problem could be, since the graphics driver versions are supposed to be compatible with Windows 8.1. I also don't get why some windows are blurry and others are not. I am thinking the computer uses only the Intel graphics and not the Nvidia (at least in the blurry windows), but even if that were the case (and it shouldn't be), I don't think I should be having this problem.
    At this point, I really need help. My graphics details are:
    Intel HD Graphics 4600: Driver version: 10.18.10.3412
    NVidia GeForce GT 740M: Driver version: 335.23
    NVidia GeForce Experience version: 1.8.2
    I am attaching screenshots that show driver details and the problems.
    Thank you.
    Blurry Chrome:
    Firefox menu problem:
    Device Manager and driver windows are also blurry, both graphics are shown

    I had a similar problem with a Dell laptop.  I found the problem was that the laptop wanted to switch between the Intel and NVIDIA video cards.  It would switch to the Intel when high-end graphics were not required, then to the NVIDIA when they were.  The laptop does this to save power, but it caused problems with some applications, specifically graphics-heavy games.
    For me the solution was to disable the Intel graphics processor.  You can do this by right-clicking on your wallpaper, selecting the NVIDIA Control Panel, then selecting Set PhysX Configuration.  You will see a dropdown box labeled Select PhysX Processor.  It is set to Auto by default, which allows the switching I described.  Open this dropdown box and select your NVIDIA processor.  This will prevent the laptop from switching and force it to use your NVIDIA processor exclusively.
    Hope this helps.
    Sacto Fred
    Owner and CEO
    H&O PC Solutions

  • My itunes opens after 2-3 hrs after doubleclicking it to open;it happened once or twice  with itunes 11.x , but its a constant problem after updating itunes to 12.x ....my operating system win7;processor i5 intel haswell and 4gb ram...i hav no issue

    My iTunes opens 2-3 hours after double-clicking it. This happened once or twice with iTunes 11.x, but it has been a constant problem since updating to the latest 12.0.1. My operating system is Windows 7, with an i5 Intel Haswell processor and 4 GB RAM. I have no issue opening other big programs. I tried completely uninstalling and reinstalling iTunes, since at first I thought it simply wasn't opening (because it takes such a long time), and I followed these support pages:
    1) iTunes for Windows Vista, Windows 7, or Windows 8: Fix unexpected quits or launch issues
    2) iTunes for Windows doesn't open after upgrading in Windows Vista, Windows 7, or Windows 8
    but my problem was not solved even after trying all the methods on those pages.
    Now I have realized it opens 2-3 hours later, seemingly at will, and after opening it keeps hanging. Finally, it does not open again immediately after closing. Kindly help me out as soon as possible.


  • Intel and nVidia at the same time, with OpenGL

    Hi all,
    I'd like to set up a multi-seat Xorg config, with the onboard Intel video powering three monitors connected to that card, and an nVidia card powering two monitors connected to it.
    This seems straightforward enough to configure in the Xorg config which I have done, but when the time comes to install the "nvidia" package, it conflicts with the Intel driver.  Specifically I can't have mesa-libgl (Intel) and nvidia-libgl installed at the same time.
    I guess this is fair enough as they both want to become the default GL driver, but in my case the actual GL driver will depend on which seat is being used - sitting at the Intel monitors the GL driver should be Intel, and sitting at the nVidia monitors it should be the nvidia GL driver.
    Is there a way to install both drivers at the same time, and specify which libGL to use by e.g. changing the library search path?
    I have looked at Bumblebee but it seems aimed at duplicating displays across to the other video card, whereas I want the displays to appear on directly attached hardware.

    Short answer: No.
    Long answer: Arch wiki NVIDIA (replace nouveau with the intel driver package).
    They do not work at the same time, however.

  • Can I run CC on Intel Pentium G2030 3.00GHz dual core. NVIDIA Geforce GT610 1GB graphics. Viglen Vig642M Motherboard?

    Can I run CC on Intel Pentium G2030 3.00GHz dual core. NVIDIA Geforce GT610 1GB graphics. Viglen Vig642M Motherboard?
    I also have 8GB RAM DDR3, 1TB HDD, Its on Windows 7, is that a problem with the allocated graphics card?
    P

    Thanks John,
    I've done this, but couldn't gather clear information about whether the graphics card was suitable for Premiere CC? Using CPUbenchmark it seemed to me like it would be suitable. Its unlikely we would use After Effects CC but would be handy if it would work.
    Also the G2030 dual core? Can you recommend how I find out if this would work, as the site states the requirement to be a Dual Core2... does this mean the software (Premiere CC, Photoshop CC, Indesign CC etc) would not work at all?
    Any guidance helpful.
    P

  • Intel and Nvidia graphics - setting which one supports a chosen game

    Hello,
    On a game forum, someone told me I can set which graphics (Intel or NVIDIA) supports my chosen game, since I have an Intel processor and Intel motherboard but also an NVIDIA GeForce 550 Ti graphics card. My question is: can I choose whether the Intel or the NVIDIA graphics runs my chosen game?

    x_user wrote: Yes, I have xf86-video-nouveau and mesa-libgl, but performance in a game that requires only a GeForce FX series card is really bad. Maybe that model of graphics card just isn't well suited to the open drivers; maybe that is the point.
    There's no reclocking for this card currently.
    Nouveau runs it locked at 405 MHz GPU clock, 648 MHz memory clock, 810 MHz processor clock.
    Nvidia can run it at 3 perf levels:
             GPU Clock    Memory Clock    Processor Clock
    0        50 MHz       270 MHz         101 MHz
    1        405 MHz      648 MHz         810 MHz
    2        900 MHz      4104 MHz        1800 MHz

  • Photoshop CS6 doesn't recognize my NVIDIA GeForce GTX 675M GPU?

    I just bought a brand new Alienware M17x R4 with Windows 8 Pro x64 and Photoshop CS6 won't recognize my NVIDIA GeForce GTX 675M GPU?
    I've already updated Windows 8, Photoshop and the Nvidia graphics driver.
    Any ideas?  I spent so much on a nice graphics card but now it seems PS CS6 can't even use it.  It's using the Intel HD4000 instead:
    I also tried this:  http://forums.adobe.com/message/4545768#4545768
    And deleted the settings but that didn't help.
    According to this, my graphics card should work, it's a NVIDIA GeForce GTX 675M, which is a 600 series GPU:  http://helpx.adobe.com/photoshop/kb/photoshop-cs6-gpu-faq.html#tested_cards
    Tested video cards for Photoshop CS6
    Adobe tested the following video cards before the release of Photoshop CS6. This document lists the video card by series. The minimum amount of RAM supported on video cards for Photoshop CS6 is 256 MB. Photoshop 13.1 cannot display 3D features if you have less than 512 MB VRAM on your video card.
    Important: This document is updated as newly released cards are tested. However, Adobe cannot test all cards in a timely manner. If a video card is not listed here, but was released after May 2012, you can assume that the card will work with Photoshop CS6.
    Adobe tested laptop and desktop versions of the following cards. Be sure to download the latest driver for your specific model. (Laptop and desktop versions have slightly different names.)
    nVidia GeForce 8000, 9000, 100, 200, 300, 400, 500, 600 series
    I've run out of ideas.  Any suggestions?

    Okay, I figured it out...duh.
    My bad...or rather, Nvidia's bad!
    In their app profiler, they set Photoshop to use the "default" settings, which defaults to the integrated graphics.  By changing the profile to make it use the discrete graphics, it worked!
    This also fixed my issue of the background flashing / flickering.
    I know that's a known bug with Windows 8 + AMD (http://helpx.adobe.com/photoshop/kb/image-background-transparent-or-flickers.html)
    However, Adobe should look into this same bug occurring with the Intel HD4000 integrated GPU, because this happened with my laptop.
    Maybe that bug doesn't occur on systems that only have an integrated GPU, but on systems with Optimus, it's definitely a reproducible issue.
    When I switch back to integrated, the issue pops up again.
    It's nice that Nvidia provides an option for us to use either the integrated or the discrete graphics; however, Adobe needs to follow through and make sure that Photoshop doesn't wig out when we're using the integrated GPU.  Sometimes your customers need to work on battery power, so the integrated flickering / flashing / transparent-background bug definitely needs to be fixed.
    Personally, I can get 6 hrs out of this laptop running the integrated GPU, but when I use the discrete one, that cuts it down to only 2 hrs.

  • [BUG] Photoshop CC crashed immediately after selecting NVIDIA GeForce GTX 680M GPU

    I just started using the Photoshop CC trial and immediately ran into a problem with my video card.
    First, Photoshop didn't recognize my GPU, an Nvidia GTX 680M. I'm using a new Alienware R4 with Win8 Pro. I searched the forum and followed this thread:
    Photoshop CS6 doesn't recognize my NVIDIA GeForce GTX 675M GPU? http://forums.adobe.com/thread/1157982
    After following the instructions, Photoshop CC is now supposedly using my NVIDIA GPU, but it crashes immediately after launching, for both the 32-bit and 64-bit versions. My NVIDIA driver is version 320.49 for the 600 series, the latest one, which I just updated. If I deselect the NVIDIA option in the control panel I no longer get the crash, but as you know there is a reported problem with the Intel HD4000 integrated GPU where the screen keeps flickering if you use a toolbox. To avoid that, the only thing I can do is deselect the use-GPU option in CC's preferences (changing the advanced mode setting doesn't help). Deselecting the GPU option gets rid of the problem, but introduces another one with zooming: it can no longer zoom in real time; instead there's a tool where you choose a region and it jumps to that region, and I can only zoom in this way, not zoom out.
    So basically I have a problem with NVIDIA, which crashes CC immediately; a problem with the Intel HD4000 integrated GPU, with a flickering screen; and another problem with no GPU selected, where I cannot zoom in real time. Whatever I do gives me some set of problems that I can't solve.
    BTW, I haven't tried CS6 yet. I may download a trial copy; I thought CC and CS6 use the same core. Nevertheless, these problems should definitely be looked into.
    Any help? Thanks in advance.

    If you check that user's Profile, you'll discover it is not actually nVidia per se
    Profile
    Username:              Sora
    Age:                       7 ¾     
    Favourite activity:  Playing computer games
    Spots:                    In abundance
    Ambition:               To have a clue
    For what it’s worth, I have most of the CC apps installed on an MSI GT70 ONE with GTX680M — driver 9.18.13.614 dated 8/28/2012 and using Windows 8, and while I have not used them all on that laptop, the ones I have appear to be working faultlessly.

  • Will Premiere Elements 12 work with the NVIDIA GeForce GT 820M GPU?

    I am researching new computers, and we want to buy Premiere Elements 12. The computer we are looking at comes with an NVIDIA GeForce GT 820M GPU. This GPU does not appear in the list of "certified" GPUs on the Adobe web site. I was just wondering if Premiere Elements will work with this type of GPU?
    Thanks!

    mimisonia123
    What computer operating system is involved?
    With the exception of Premiere Elements 10, Premiere Elements works with most NVIDIA and ATI video cards/graphics cards.
    The CUDA cards for Premiere Pro do not apply to Premiere Elements which, since version 8.0/8.0.1, does not even have GPU effects or transitions.
    NVIDIA lists the GT820M as a "CUDA Enabled GeForce Product"
    CUDA GPUs
    GeForce 820M Dedicated Graphics | GeForce
    My answer to your question is a "probably yes" but CUDA Enabled is not applicable to Premiere Elements.
    Unless you get a definite answer from a Premiere Elements 12/12.1 user here who is using the card in his/her particular computer operating system with Premiere Elements 12/12.1, I would consider contacting NVIDIA for its input.
    Are you thinking about staying with Premiere Elements or going to Premiere Pro in the near future?
    ATR

  • Nvidia GeForce GTX 485M GPU - Anyone Used It?

    I'm about to pull the trigger on a new Sager laptop for CS5.5 MC.
    I have the options for the Nvidia GeForce GTX 485M GPU with 2GB GDDR5 Video Memory, or the Nvidia Quadro FX 3800M Graphics with 1GB DDR3 Video Memory. The difference is ~ US$500, and pays for a bunch of other options.
    Unless it has just been updated, the GTX 485M is not directly supported for hardware MPE, but I assume that by applying the hack for that card, it should work. Does anyone have experience with that less-expensive card?
    My old workstation (soon to be replaced too), has a Quadro FX-4500, which has been a stout card for years, however was crazy expensive, and I never have really put it to the test, considering what I do.
    If I have to go with the FX 3800M, then so be it, but I'd rather buy more toys.
    Thanks for your observations and thoughts,
    Hunt
    PS - I know that Sager's biggest market is the gaming world, but it is getting harder to get a top-level laptop without SLI for dual-nVidia cards/chips, or Crossfire for the ATI/AMD. As soon as I take delivery, Adobe will probably announce that PrPro NOW fully supports SLI, but I'll live with it.

Hey, it would not be the first time that I was the "new kid on the block" with something, but not lately...
Will probably go with it, and then run the Benchmark, just to get it listed. The Quadros are nice, BUT I'm not sure that the laptop would really benefit from one, especially to the tune of an extra US$500.
    Thanks,
    Hunt

  • How to verify Nvidia GT 750M is present?

I recently purchased a MacBook Pro 15 Retina with the Nvidia GT 750M configuration, but I would like to verify somewhere in OS X that my laptop indeed has it.
In "About This Mac" I can only see the "Intel Iris Pro 1024 MB" graphics processor. Is there any way to verify it (e.g., a command in the Terminal)?
    Thank you.

According to that table...
"Core i7" 2.6 15" (IG) | Late 2013 | Iris Pro 5200 | 1 GB†
"Core i7" 2.3 15" (DG) | Late 2013 | Iris Pro 5200 + GeForce GT 750M* | 2 GB*
"Core i7" 2.6 15" (DG) | Late 2013 | Iris Pro 5200 + GeForce GT 750M* | 2 GB*
I should have the latest in that list (2.6, 2 GB*). But going by "About This Mac", I can only see the "Intel Iris Pro 1024 MB" graphics processor. How can I be sure that my model is correct?
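For what it's worth, "About This Mac" only shows the GPU that is currently driving the display, and on Optimus-style Macs that is usually the integrated Iris Pro. From the Terminal, `system_profiler` reports every graphics adapter the system knows about, so a minimal check (assuming a dual-GPU "DG" model) would be:

```shell
# List every graphics chipset macOS reports; on a dual-GPU (DG) model both
# the Iris Pro and the GeForce GT 750M should appear as "Chipset Model"
# lines, even while the integrated GPU is driving the display.
system_profiler SPDisplaysDataType | grep "Chipset Model"
```

On a DG model this should print two lines, one per adapter. A third-party menu bar tool such as gfxCardStatus can additionally show which of the two GPUs is active at any given moment.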

  • UNABLE TO SWITCH GRAPHICS (FROM INTEL TO NVIDIA)

Hello to my favourite brand...
I have a laptop with the following specs:
HP 15-r032tx
RAM: 4 GB
Graphics: Nvidia GeForce 820M (2 GB)
Windows 8.1 (SL) 64-bit
When I right-click and go to Screen Resolution > Advanced Settings, it shows the adapter is Intel HD Graphics Family, but I want to switch it to the Nvidia GeForce 820M. How can I do that?
Please reply.
I have gone through the properties option, but it says this device driver doesn't support this version of Windows.

Hi,
The fact that the Intel chip is showing as the display device (and cannot be changed) is correct.
With the switchable graphics on your type of system (Nvidia Optimus), the Intel GPU is always used to render the final display, regardless of whether the discrete Nvidia chip is being used for graphics processing or not.
    For further information on how this type of 'Switchable' graphics technology works, see the Whitepaper on the following link.
    http://www.nvidia.co.uk/object/LO_optimus_whitepapers.html
    Regards,
    DP-K
    Microsoft MVP - Windows Experience
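As a quick sanity check that the discrete chip is present even though the Intel GPU renders the display, Windows can list every video adapter it sees; on an Optimus machine both should appear. A minimal sketch from a Command Prompt (wmic is still shipped on Windows 8.1, though it is deprecated on recent Windows versions):

```shell
REM List all video adapters Windows has enumerated. On an Optimus laptop
REM this should show both the Intel HD Graphics Family device and the
REM NVIDIA GeForce 820M, even though only Intel drives the display.
wmic path Win32_VideoController get Name
```

If the NVIDIA entry is missing here, the problem is driver installation rather than graphics switching.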

  • GPU acceleration fix for Dell Intel and Nvidia chipsets with external monitor?

I have a Dell XPS laptop with both an integrated Intel chip and an Nvidia chip, and I can only get GPU acceleration to work when I don't have any external monitors plugged in.
Is there some way I can get Illustrator to enable the Nvidia GPU and stop being confused by multiple chipsets on external monitors? I really like the live zoom.

Same problem here.
I'm getting a little frustrated trying to get my laptop working with my LG DLP 3D-ready projector to display in 3D. The projector itself will work in 3D as long as the resolution is 1024x768@120Hz, and the laptop's NVidia GTX 770M should have no issues driving the projector at this resolution.
The problem seems to be that the Intel HD 4600 graphics driver gets in the way every time I try to run anything 3D: the software I'm using to run 3D files does not recognize the projector as a 3D-capable device, because all the laptop is using is the Intel HD 4600 graphics.
If I could somehow turn off the Intel graphics and force the laptop to use just the NVidia card, I think all my issues would go away. I just am not having success in doing this.
I have tried going into the NVidia control panel and, under 3D settings, picking the software program that runs my 3D files and telling it to use the NVidia card... but it never works.
I think everything is sent through the Intel graphics card regardless of whether the laptop is using the NVidia card (just guessing here)?
I also tried to simply uninstall the drivers, but that did not work.
I also tried to disable the Intel graphics, but it then went into some type of generic display driver mode and still would not work.
I also went into the BIOS to look for any settings that would allow me to disable the Intel graphics card; no luck, I could not find any settings for this.
    Can someone help me with this? how can I disable the Intel graphics and force the laptop to just use the NVidia card all the time?
    Thanks!!

  • What is the normal gaming temp of the new NVidia GeForce 750m? Mine is running at like 75c playing DAYZ on bootcamp?? Is this safe?

    I'm concerned =/
    Running on 2880x1800, the game runs smooth, but it gets really warm and the fan gets loud =/
    checked it with speedfan, 75c =/
    playing plugged into the charger, and running windows 7 ultimate. 64bit
    thanks for the input

    860M is a good GPU. I would like to know if this is one of your first gaming laptops?
    My suggestions -
1) Play games at 1080p (max). The GPU you have cannot handle games at a resolution higher than that.
2) In most games, you will have to know the graphical settings and turn down the most demanding ones (such as Ambient Occlusion, Post-Processing FX, Tessellation, etc.). You need to learn what these effects are and how they affect your system.
3) Take care of your system; for example, benchmark a game before playing it to prevent overheating.
4) Make sure the game you are trying to play is stable and optimized (otherwise you won't get the performance you want).
    5) Change desktop resolution to 1080p, adjust font and size sliders in Windows settings for optimal experience.
    If you want specific advice for an issue, let me know. Have a nice day.
