GPU Acceleration for Lightroom

When will Lightroom have GPU acceleration like Photoshop CS4 has?

From what I've read, GPU acceleration in CS4 currently only applies to canvas zooming and rotation... not EVERYTHING is accelerated by the GPU. So what effect (if any) would GPU acceleration have for those operations in LR? (Honest question... I don't know.)
I've also read that Adobe is planning for more CS features to use the GPU later on. So to me this currently appears to be a first-stage deployment of the technology... not TOO exciting in terms of actual benefit today, but with promising prospects for the future.
As for LR, I assume it all depends on which features would benefit from this (some would actually slow down if routed through the GPU, if I understand some of the write-ups about this).

Similar Messages

  • Feature Suggestion // CUDA GPU Acceleration for Lightroom 4

    GPU (CUDA) acceleration for rendering image previews, exporting images, and playing video in Lightroom 4.
    I'd like to see Lightroom 4 make use of GPU processing, similar to the Mercury Playback Engine in Premiere Pro CS5.5.
    GPU acceleration should be available for ALL CUDA-enabled GPUs.
    I absolutely love GPU acceleration and the Mercury Playback Engine in Adobe Premiere Pro. It really helps to speed up real-time previewing of high-resolution footage.
    I am sure that GPU acceleration would speed up any professional's workflow.
    http://forums.adobe.com/message/4238531#4238531

    Feature requests go here - http://feedback.photoshop.com/photoshop_family/products/photoshop_family_photoshop_lightroom

  • GPU notes for Lightroom CC (2015)

    Hi everyone,
    I wanted to share some additional information regarding GPU support in Lr CC.
    Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
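    To put those numbers in context, here is a quick sketch (Python; the exact panel resolutions are my own assumptions) that reproduces the megapixel counts and the "4 times as many pixels" figure:

        # Rough check of the display pixel counts quoted above.
        displays = {
            "Standard HD (1080p)": (1920, 1080),
            "MacBook Pro Retina 15\"": (2880, 1800),
            "4K (UHD)": (3840, 2160),
            "5K": (5120, 2880),
        }
        hd_pixels = 1920 * 1080
        for name, (w, h) in displays.items():
            print(f"{name}: {w * h / 1e6:.1f} MP ({w * h / hd_pixels:.1f}x HD)")
        # Prints ~2.1, 5.2, 8.3 and 14.7 MP -- matching the rounded 2/5/8/15 MP
        # figures in the post, with 4K at 4.0x the pixels of standard HD.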
    For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
    So why doesn't everything feel faster?
    Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.
    First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).
    Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
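    To get a feel for that transfer overhead, here is a back-of-envelope sketch; the image size, channel layout, and bus bandwidth are illustrative assumptions, not Lightroom's actual internals:

        # Estimate the CPU-to-GPU upload cost for one full-resolution image.
        megapixels = 36e6          # e.g. a 36 MP raw file (assumed)
        bytes_per_pixel = 4 * 2    # RGBA, 16 bits per channel (assumed)
        pcie_bytes_per_sec = 16e9  # ~PCIe 3.0 x16 practical bandwidth

        image_bytes = megapixels * bytes_per_pixel
        upload_ms = image_bytes / pcie_bytes_per_sec * 1000
        print(f"{image_bytes / 1e6:.0f} MB per upload, ~{upload_ms:.0f} ms on the bus")
        # -> about 288 MB and ~18 ms: cheap once, but it recurs on every
        #    image-to-image switch or full-resolution reload, which is
        #    exactly where the slowdown described above shows up.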
    Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
    Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that may work in the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
    So let's clear up what's currently GPU accelerated in Lr CC and what's not:
    First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether you have the GPU enabled or disabled in the prefs).
    Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.
    While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.
    Summary:
    1. GPU support is currently available in Develop only.
    2. Most (but not all) Develop controls benefit from GPU acceleration.
    3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.
    4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.
    5. Prefer newer GPUs (faster models from within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but will likely not benefit much. Aim for at least 1 GB of GPU memory; 2 GB is better.
    6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
    The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.
    Eric Chan
    Camera Raw Engineer

    I posted the following information on the Luminous Landscape forum (GPU used in Develop but not Library?) in response to comments you made there.
    I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.
    OS X (10.9.5)
    Hardware configuration:
       MacPro (late 2013)
       AMD FirePro D300 2048 MB
       Apple Cinema Display 1920 x 1200
       16 GB RAM
       1 TB SSD
    Test file:  Nikon D800 NEF, 50 MB
    (0)  open the Develop module
    (1)  select a different NEF file and zoom to 1:1
    (2)  clear the ACR cache
    (3)  select the test file
    (4)  take 3 screenshots to illustrate the 3 display states (the first one is hard to capture)
    (5)  select another image
    (6)  same 3 states are present
    (7)  return to the test file and the same 3 display states are present
       Why isn’t the ACR cache coming into play in step 7?
    If I repeat this process with the GPU disabled the image is displayed without the intermediate states.
    I have attached the 3 screenshots mentioned in step (4).

  • GPU acceleration in Lightroom 6 not working

    Hi,
    Any ideas on how I turn on GPU acceleration in Lightroom 6?
    I meet the minimum requirements (Win 7 x64, 1 GB VRAM, OpenGL 3.3) and, as advised in "Learn more", I checked my graphics driver, which is at the latest version.
    Still, I am getting the message: "GPU acceleration was disabled due to errors"
    Any help appreciated,
    thank you
    System info:
    Lightroom version: 6.0.1 [ 1018573 ]
    License: Perpetual
    Operating system: Windows 7 Home Premium Edition
    Version: 6.1 [7601]
    Application architecture: x64
    System architecture: x64
    Logical processor count: 4
    Processor speed: 3,0 GHz
    Built-in memory: 12245,9 MB
    Real memory available to Lightroom: 12245,9 MB
    Real memory used by Lightroom: 633,5 MB (5,1%)
    Virtual memory used by Lightroom: 620,2 MB
    Memory cache size: 0,0 MB
    Maximum thread count used by Camera Raw: 4
    Camera Raw SIMD optimization: SSE2,AVX
    System DPI setting: 96 DPI
    Desktop composition enabled: Yes
    Displays: 1) 1920x1200
    Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
    Graphics Processor Info:
    Check OpenGL support: Failed
    Vendor: ATI Technologies Inc.
    Version: 3.3.13283 Core Profile Context 14.501.1003.0
    Renderer: AMD Radeon HD 6670
    LanguageVersion: 4.40
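    For anyone who wants to reproduce that failing "Check OpenGL support" test outside Lightroom, the minimal sketch below tries to create the same kind of OpenGL 3.3 core-profile context Lightroom 6 needs. It assumes the third-party glfw and PyOpenGL Python packages (pip install glfw PyOpenGL) and is purely illustrative of the check, not Adobe's actual code:

        import glfw
        from OpenGL.GL import (glGetString, GL_VENDOR, GL_RENDERER,
                               GL_VERSION, GL_SHADING_LANGUAGE_VERSION)

        if not glfw.init():
            raise SystemExit("GLFW failed to initialize")

        # Ask for a 3.3 core profile -- roughly what Lightroom 6 requires.
        glfw.window_hint(glfw.CONTEXT_VERSION_MAJOR, 3)
        glfw.window_hint(glfw.CONTEXT_VERSION_MINOR, 3)
        glfw.window_hint(glfw.OPENGL_PROFILE, glfw.OPENGL_CORE_PROFILE)
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no on-screen window

        window = glfw.create_window(64, 64, "gl-check", None, None)
        if window is None:
            glfw.terminate()
            raise SystemExit("Could not create an OpenGL 3.3 core context -- "
                             "roughly the condition Lightroom reports as an error")

        glfw.make_context_current(window)
        for label, enum in [("Vendor", GL_VENDOR), ("Renderer", GL_RENDERER),
                            ("Version", GL_VERSION),
                            ("GLSL", GL_SHADING_LANGUAGE_VERSION)]:
            print(f"{label}: {glGetString(enum).decode()}")
        glfw.terminate()

    If this fails too, the problem is in the driver/OpenGL stack itself rather than anything Lightroom-specific.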

    You can also install an "old" driver version from April 2014 (v14.4), which works with Lr 6, after uninstalling your current drivers.
    You can find it here: http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064
    Be aware that with this driver you may be able to activate GPU acceleration, but it's possible that you won't get any benefit... bug or not, nobody knows...
    Try it.

  • GPU Acceleration in Lightroom CC

    This is the page talking about GPU usage in Lightroom CC: Adobe Photoshop Lightroom Help | Lightroom GPU FAQ.
    This is the page talking about GPU usage in Photoshop: Photoshop CC and CC 2014 GPU FAQ
    As you can see, I haven't seen many details about the actual use cases for OpenGL/OpenCL acceleration in Lightroom. Now that Adobe names Lightroom "Adobe Photoshop Lightroom", does that mean Lightroom is using the same kind of engine? I would think the new Photo Merge function would be a good candidate for GPU acceleration, given that use cases like panoramas or large-image rotation can be quite heavy duty. Does anyone know where to find these under-the-hood details?
    Having said this, I wonder which graphics card vendor Adobe would officially recommend? Perhaps NVIDIA?


  • Any GPU acceleration in Lightroom 4?

    Hi there-
    I know this has been bandied about as a feature request; I was just curious if there was ever any implementation. I'm off to buy a new video card today, and didn't want to buy a Radeon if the only LR GPU acceleration was CUDA (or vice versa).
    Any general advice on which video card (if it matters at all for LR) to buy would also be appreciated.
    Thanks,

    No.

  • 2D GPU acceleration for Linux Reader 8 (when it will be available)?

    It would be nice if Adobe Linux Reader 8 could use "2D GPU acceleration" in Linux too. It could be done using the OpenGL driver. The NVIDIA GeForce Linux driver supports shader technology, and ATI also ships its own accelerated OpenGL library for Linux.
    Will Adobe make Linux Reader 8 with 2D GPU acceleration?

    What a pity. Adobe Reader is very slow even on fast machines when compared to open alternatives. 2D GPU acceleration would give Adobe Reader a chance to run at least as fast as the open-source viewers.
    However, thanks for the reply.

  • GPU Acceleration for both FCP-X and Premiere? Anyone have NVIDIA GTX285 or NVIDIA QuadroFX 4800? ...and successful at using both editors with GPU Acceleration?

    I'm in the market for a laptop Mac solution for GPU-accelerated editing. Because of multi-client workflow needs, I need to edit HD video with both FCP X and Adobe CS5.5 Premiere. The only GPU-capable graphics cards that seem to satisfy both "camps" are from NVIDIA: the GTX 285 and the Quadro FX 4800.
    Does anyone have these cards AND use both editors AND have success at it?
    I sure would appreciate any experienced advice with using NVIDIA GPU acceleration with a MacBook Pro especially!
    Thanks,
    ~mars9

    How do you plan on using a PCIe card with a laptop? Via a Thunderbolt PCIe adapter? I can't help with that. In fact, I didn't think those adapters were even out yet.
    I do have a Mac Pro and I use the NVIDIA Quadro 4000 for Mac. This works great for Premiere Pro. As good as the 4800 and for less money. Check out this comparison:
    http://barefeats.com/wst10g11.html

  • Nvidia Geforce 670 and the Mercury Playback Engine GPU Acceleration for Adobe Premiere Pro

    Will Adobe test and certify the Nvidia Geforce GTX 670 for use with the mercury playback engine GPU acceleration feature of Adobe Premiere Pro CS5 in the near future? Any advice would be much appreciated, thanks in advance for any help.

    Adobe only certifies cards after extensive testing... Many others, with at least 1 GB of video RAM, use the nVidia hack http://forums.adobe.com/thread/629557 - which is a simple entry in a "supported cards" file (sketched below) - and, for Mac, http://www.vidmuze.com/how-to-enable-gpu-cuda-in-adobe-cs6-for-mac/
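    As a rough sketch of what that hack amounts to: append your card's exact name to Premiere's CUDA whitelist. The install path and card name below are assumptions for a default Windows CS6 setup; adjust both, and run with administrator rights:

        from pathlib import Path

        # Assumed default install path; the file name is the one the hack edits.
        cards_file = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6"
                          r"\cuda_supported_cards.txt")
        gpu_name = "GeForce GTX 670"  # must match the name your driver reports

        lines = cards_file.read_text().splitlines()
        if gpu_name not in lines:
            lines.append(gpu_name)
            cards_file.write_text("\n".join(lines) + "\n")
            print(f"Added {gpu_name!r} to {cards_file.name}")
        else:
            print(f"{gpu_name!r} is already listed")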

  • Premiere CC kicks GPU acceleration for 6770M

    Installing Premiere CC has disabled GPU acceleration on my 2011 iMac running the "supported" AMD Radeon HD 6770M 512 MB graphics card. GPU acceleration was working fine on CS6 without the need for any hack.
    I am running a fully updated OS X Mountain Lion.
    GPU acceleration was also kicked off on a work computer I was using on a different license, a PC with an Nvidia card. That required the card driver to be reinstalled. But because my iMac's driver is built into the OS, I can't do this on the iMac.
    Since this happened on two completely different computers, with different OSes and graphics cards, I'm surprised this isn't a big issue for lots of people.
    Any advice would be appreciated, thanks.
    James

    And yet it was working fine on CS6.
    The required specs on the Adobe page are a little confusing, as they read...
       Windows
    Mac
    AMD Radeon HD 6750M (only on certain MacBook Pro computers running OS X Lion (v10.7.x) with a minimum of 1GB of VRAM)
    AMD Radeon HD 6770M (only on certain MacBook Pro computers running OS X Lion (v10.7.x) with a minimum of 1GB of VRAM)
    ATI Radeon HD 6750M (OpenCL)
    ATI Radeon HD 6770M (OpenCL)
    NVIDIA GeForce GTX 285 (CUDA)
    NVIDIA GeForce GTX 675MX (CUDA)
    NVIDIA GeForce GTX 680 (CUDA)
    NVIDIA GeForce GTX 680MX (CUDA)
    NVIDIA GeForce GTX 650M (CUDA)
    NVIDIA GeForce Quadro CX (CUDA)
    NVIDIA GeForce Quadro FX 4800 (CUDA)
    NVIDIA GeForce Quadro 4000 (CUDA)
    NVIDIA GeForce Quadro K5000 (CUDA)
    So they obviously have the Mac and Windows titles in the wrong place.
    But further, when I look inside my system through Terminal, it says I have the "ATI Radeon HD 6770M" - which is supported without the 1 GB VRAM caveat. When I open "About This Mac" / "More Info" it tells me I have the AMD Radeon HD 6770M 512 MB.

  • GPU acceleration for video editing

    I am using CyberLink PowerDirector 12, Adobe Premiere Elements 12, and Sony Vegas 12 (yes, a lot of 12's). The problem is, my new notebook (MSI GT70 Dominator Pro) doesn't let the NVIDIA 880M run for acceleration. I've used all these programs on my desktop and on my previous Asus G750 laptops (the Asus kept overheating, so I went MSI). They all work great.
    The NVIDIA GeForce is all set up optimally, and I've tried both integrated and dedicated graphics for each program (not that it should matter which one a program uses). All settings in the programs themselves are checked for hardware/GPU acceleration. This isn't my first time doing this, and with the laptop and desktop side by side, all settings are identical. But when I go to the produce section, the box for acceleration isn't valid to click, no matter what final format I choose. All 7 Asus laptops I've had over the last 6 months worked just fine.
    Any help, please. It's a shame I spent so much on a laptop, finally got one that doesn't thermal throttle, and now the programs don't work. I also reinstalled Windows and the programs once already, and that didn't fix it. All drivers are also 100% up to date.
    The components are different from the Asus, but that shouldn't matter. The Asus had an i7-4700 and a GTX 860M/870M/880M (I went through all of them: JM, JS, JZ); my MSI has an i7-4810/4900 and an 880M.

    I've been working on this problem for days now; I was up until 2 am this morning trying to figure it out.
    I then plugged my monitor into my laptop for my son to watch music videos while I did some things around the house, and noticed that as soon as I plugged in the monitor, it switched over to NVIDIA graphics, not Intel. While trying to figure this one out, I loaded PowerDirector to work on a movie; rendering takes so long that I figured I should start now while I'm busy. Well, GPU acceleration works just fine. Just as before. I did absolutely nothing different.
    I still haven't figured out how to run the second screen off the Intel graphics, but now I don't know if I want to, since that's the only thing that changed and now GPU acceleration is working fine.
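    If you hit something similar, it can help to confirm which video controllers Windows actually exposes and which driver each is on. The sketch below just shells out to the stock wmic tool (present on Windows 7/8; deprecated on recent Windows versions):

        import subprocess

        # List every video controller the OS sees, with its driver version.
        out = subprocess.run(
            ["wmic", "path", "Win32_VideoController", "get", "Name,DriverVersion"],
            capture_output=True, text=True, check=True)
        print(out.stdout)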

  • Media Encoder CC not using GPU acceleration for After Effects CC raytrace comp

    I created a simple scene in After Effects that's using the raytracer engine... I also have GPU enabled in the raytracer settings for After Effects.
    When I render the scene in After Effects using the built-in Render Queue, it only takes 10 minutes to render the scene.
    But when I export the scene to Adobe Media Encoder, it indicates it will take 13 hours to render the same scene.
    So clearly After Effects is using GPU acceleration but for some reason Media Encoder is not.
    I should also point out that my GeForce GTX 660 Ti card isn't officially supported and I had to manually add it into the list of supported cards in:
    C:\Program Files\Adobe\Adobe After Effects CC\Support Files\raytracer_supported_cards.txt
    C:\Program Files\Adobe\Adobe Media Encoder CC\cuda_supported_cards.txt
    While it's not officially supported, it's weird that After Effects has no problem with it yet Adobe Media Encoder does...
    I also updated After Effects to 12.1 and AME to 7.1 as well as set AME settings to use CUDA but it didn't make a difference.
    Any ideas?

    That is normal behavior.
    The "headless" version of After Effects that is called to render frames for Adobe Media Encoder (or for Premiere Pro) using Dynamic Link does not use the GPU for acceleration of the ray-traced 3D renderer.
    If you are rendering heavy compositions that require GPU processing and/or the Render Multiple Frames Simultaneously multiprocessing, then the recommended workflow is to render and export a losslessly encoded master file from After Effects and have Adobe Media Encoder pick that up from a watch folder to encode into your various delivery formats.
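    As a rough illustration of that recommended workflow, something like the sketch below drives aerender from a script and drops the lossless master into a folder AME watches (File > Add Watch Folder). Every path, comp, and template name here is a placeholder:

        import subprocess

        AERENDER = (r"C:\Program Files\Adobe\Adobe After Effects CC"
                    r"\Support Files\aerender.exe")
        WATCH_FOLDER = r"D:\ame_watch"  # add this same folder as a watch folder in AME

        subprocess.run([
            AERENDER,
            "-project", r"D:\projects\raytrace_scene.aep",
            "-comp", "Main Comp",
            "-RStemplate", "Best Settings",  # render settings template
            "-OMtemplate", "Lossless",       # built-in lossless output module
            "-output", WATCH_FOLDER + r"\raytrace_master.avi",
        ], check=True)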

  • GPU acceleration for command line renderer?

    Hello,
    Is it possible to use CUDA for batch rendering many projects via the command-line tool aerender?
    Would an NVIDIA Tesla GPU significantly increase performance? Or are only the 570/580 and Quadro GPUs supported?
    kind regards
    Poison Nuke
    www.poisonnuke.de

    No.
    The GPU matters hardly at all for After Effects. After Effects doesn't use CUDA at all, and primarily uses OpenGL for a low-fidelity preview renderer. Don't waste your time thinking about the GPU for After Effects.
    See this video for details:
    http://www.video2brain.com/en/videos-5359.htm

  • Resolution loss in multicam when GPU acceleration enabled

    Hello all,
    Strange multicam problem here. I know others besides me have struggled with multicam playback resolution, but at least on the surface, this particular problem seems counter-intuitive.
    When I enable GPU acceleration (I have a GTX 650 Ti BOOST), playback resolution is decreased in the multicam monitor. With CS5.5, this was extremely noticeable; now in CC, it's less apparent, but still detectable. The footage will blur slightly when playback begins, and then sharpen again when paused. If I disable GPU acceleration, playback resolution in the multicam monitor is the same as the paused resolution.
    Is this a bug, or something that should be expected? I can live with this, but it'd be nice not to have to disable GPU acceleration for multi-cam, and then re-enable it for other tasks.
    Thanks in advance.

    Update: At least in some cases, I can see that the paused image is sharper with GPU acceleration enabled than the paused image without it. So the playback resolutions may well be the same with or without the GPU, and all I'm seeing is a greater difference between playback and paused states with GPU acceleration. Also, there may in fact be a drop in playback resolution in software-only mode, albeit a less noticeable one.
    (As mentioned though, this was certainly an issue I experienced in 5.5; don't know if anyone else has experienced something similar.)
    My new question would be: is it possible, with the right hardware, to guarantee full-resolution playback in multicam? I'm currently editing 4 cameras: two 5D Mark IIIs, a GH2, and a Sony NX5U. The editing system comprises an i7-4770K overclocked to 4.2 GHz, 16 GB RAM, a GTX 650 Ti Boost, and three HDDs: an SSD for OS and programs, a 7200 rpm drive for media and projects, and another 7200 rpm drive for pagefile, previews, and media cache. Would creating a RAID 0 for the media files help at all in this case? Or is the drop in resolution I experience something I likely won't be able to change?

  • Does GPU acceleration in Illustrator CC still work after disabling 'GPU Preview' and enabling 'CPU Preview' (on a workstation gpu)?

    I'm using a workstation graphics card that allows for GPU acceleration in Illustrator CC.
    Selecting 'GPU Preview' instead of 'CPU Preview' under the 'View' menu in Illustrator CC results in a choppier experience.
    Which leads me to wonder:
    Does GPU acceleration in Illustrator CC still work after disabling 'GPU Preview' and enabling 'CPU Preview' (on a workstation gpu)?
    Does GPU acceleration for my gpu actually improve the performance for my work?

    If you disable the GPU acceleration in preferences then you'll be unable to enable GPU preview under the view menu. It will automatically go into CPU preview as the GPU is no longer enabled.
    If you're experiencing choppy performance when you do have the GPU preview selected, something is amiss because any GPU that can support GPU acceleration should be able to do it without a performance hit.
    What is the video card that's in the workstation?
