Feature Suggestion // CUDA GPU Acceleration for Lightroom 4

GPU (CUDA) acceleration for rendering image previews, exporting images, and playing video in Lightroom 4.
I'd like to see Lightroom 4 make use of GPU processing, similar to the Mercury Playback Engine in Premiere Pro CS5.5.
GPU acceleration should be available for ALL CUDA-enabled GPUs.
I absolutely love GPU acceleration and the Mercury Playback Engine in Adobe Premiere Pro. It really helps speed up real-time previewing of high-resolution footage.
I am sure that GPU acceleration would speed up any professional's workflow.
http://forums.adobe.com/message/4238531#4238531

Feature requests go here - http://feedback.photoshop.com/photoshop_family/products/photoshop_family_photoshop_lightroom

Similar Messages

  • GPU Acceleration for Lightroom

    When will Lightroom have GPU acceleration like Photoshop CS4 has?

    From what I've read, GPU acceleration in CS4 currently only applies to canvas zooming and rotation... not EVERYTHING is accelerated by the GPU. So what effect (if any) would GPU acceleration have for those items in LR? (Honest question... I don't know.)
    I've also read that Adobe is planning for more CS features to use the GPU later on. So to me this currently appears to be a first-stage deployment of the technology... not TOO exciting in terms of actual benefit today, but with promising prospects for the future.
    As for LR, I assume it all depends on which features would benefit from this (some would actually slow down if routed through the GPU, if I understand some of the write-ups about this).

  • GPU notes for Lightroom CC (2015)

    Hi everyone,
    I wanted to share some additional information regarding GPU support in Lr CC.
    Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
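    The pixel counts above can be checked with simple shell arithmetic. A minimal sketch, assuming common resolutions for each display class (exact 4K and 5K dimensions vary by monitor):

    ```shell
    # Rough arithmetic behind the figures above (resolutions assumed:
    # HD 1920x1080, 4K/UHD 3840x2160, 5K 5120x2880).
    hd=$((1920 * 1080))       # 2,073,600  ~= 2 MP
    uhd=$((3840 * 2160))      # 8,294,400  ~= 8 MP
    fivek=$((5120 * 2880))    # 14,745,600 ~= 15 MP
    echo "4K/HD pixel ratio: $((uhd / hd))x"   # prints: 4K/HD pixel ratio: 4x
    ```

    That 4x ratio is why a slider that feels smooth on an HD screen can turn choppy on a 4K display: four times as many pixels must be rendered per frame.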
    For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
    So why doesn't everything feel faster?
    Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.
    First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).
    Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
    Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
    Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that work with the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
    So let's clear up what's currently GPU accelerated in Lr CC and what's not:
    First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether you have the GPU enabled or disabled in the prefs).
    Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.
    While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.
    Summary:
    1. GPU support is currently available in Develop only.
    2. Most (but not all) Develop controls benefit from GPU acceleration.
    3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.
    4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.
    5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.
    6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
    The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.
    Eric Chan
    Camera Raw Engineer

    I posted the following information on the Luminous Landscape forum (GPU used in Develop but not Library?) in response to comments you made there.
    I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.
    OS X (10.9.5)
    Hardware configuration:
       MacPro (late 2013)
       AMD FirePro D300 2048 MB
       Apple Cinema Display 1920 x 1200
       16 GB RAM
       1 TB SSD
    Test file:  Nikon D800 NEF, 50 MB
    (0)  open the Develop module
    (1)  select a different NEF file and zoom to 1:1
    (2)  clear the ACR cache
    (3)  select the test file
    (4)  take 3 screenshots to illustrate the 3 display states (the first one is hard to capture)
    (5)  select another image
    (6)  same 3 states are present
    (7)  return to the test file and the same 3 display states are present
       Why isn’t the ACR cache coming into play in step 7?
    If I repeat this process with the GPU disabled the image is displayed without the intermediate states.
    I have attached the 3 screenshots mentioned in step (4).

  • GPU acceleration in Lightroom 6 not working

    Hi,
    Any ideas on how to turn on GPU acceleration in Lightroom 6?
    I meet the minimum requirements (Win 7 x64, 1 GB VRAM, OpenGL 6.1), and as advised in "learn more" I did check my graphics driver, which is at the latest version.
    Still, I am getting the message: "GPU acceleration was disabled due to errors"
    Any help appreciated
    thank you
    System info:
    Lightroom version: 6.0.1 [ 1018573 ]
    License: Perpetual
    Operating system: Windows 7 Home Premium Edition
    Version: 6.1 [7601]
    Application architecture: x64
    System architecture: x64
    Logical processor count: 4
    Processor speed: 3,0 GHz
    Built-in memory: 12245,9 MB
    Real memory available to Lightroom: 12245,9 MB
    Real memory used by Lightroom: 633,5 MB (5,1%)
    Virtual memory used by Lightroom: 620,2 MB
    Memory cache size: 0,0 MB
    Maximum thread count used by Camera Raw: 4
    Camera Raw SIMD optimization: SSE2,AVX
    System DPI setting: 96 DPI
    Desktop composition enabled: Yes
    Displays: 1) 1920x1200
    Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
    Graphics Processor Info:
    Check OpenGL support: Failed
    Vendor: ATI Technologies Inc.
    Version: 3.3.13283 Core Profile Context 14.501.1003.0
    Renderer: AMD Radeon HD 6670
    LanguageVersion: 4.40

    You can also install an "old" version from April 2014 (V 14.4), which works with Lr6, after having uninstalled your current drivers.
    You can find it here: http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064
    Be aware that with this driver you can activate GPU acceleration, but it's possible you won't see any benefit... bug or not, nobody knows...
    Try

  • How to: Enable CUDA GPU Support for "Unsupported Cards" in CS6 (same process as in CS5+)  :) for MAC

    Hi all, I was previously involved in some discussion regarding GPU acceleration with CUDA in Premiere Pro CS5 using an 'unsupported card' with my MacBook Pro... I'd like to show that this still works, and I'll post a video of the whole thing as a step-by-step guide probably tomorrow (it's quite late where I am just now), but here's a quick run-through... Proof at the bottom of the page...
    P.S. This only applies to NVIDIA-card Macs. I have no ATI-card Mac to try this out with (OpenCL etc.), but I would imagine it would be a similar process...
    Step 1... Install the CUDA drivers for your Mac (I'm using 4.2.7) from here:   http://www.nvidia.com/object/mac-driver-archive.html
    Step 2... Install gfxCardStatus and force your graphics (if using a laptop) to the discrete card only:   http://codykrieger.com/gfxCardStatus
    Step 3... Open Terminal and run GPUSniffer. The path below is for the default installation directory; copy and paste it, then hit Enter (or change as required):
                  /Applications/Adobe\ Premiere\ Pro\ CS6/Adobe\ Premiere\ Pro\ CS6.app/Contents/GPUSniffer.app/Contents/MacOS/GPUSniffer 
    This should give you an output as follows, while it may differ this is my output...
    --- OpenGL Info ---
    2012-05-02 01:00:28.642 GPUSniffer[6661:2403] invalid drawable
    Vendor: NVIDIA Corporation
    Renderer: NVIDIA GeForce GT 330M OpenGL Engine
    OpenGL Version: 2.1 NVIDIA-7.18.11
    GLSL Version: 1.20
    Monitors: 2
    Monitor 0 properties -
       Size: (0, 0, 1920, 1080)
       Max texture size: 8192
       Supports non-power of two: 1
       Shaders 444: 1
       Shaders 422: 1
       Shaders 420: 1
    2012-05-02 01:00:28.666 GPUSniffer[6661:2403] invalid drawable
    Monitor 1 properties -
       Size: (0, -1200, 1920, 1200)
       Max texture size: 8192
       Supports non-power of two: 1
       Shaders 444: 1
       Shaders 422: 1
       Shaders 420: 1
    --- GPU Computation Info ---
    Found 2 devices supporting GPU computation.
    CUDA Device 0 -
       Name: GeForce GT 330M
       Capability: 1.2
       Driver: 4.02
       Total Video Memory: 511MB
    OpenCL Device 1 -
       Name: GeForce GT 330M
       Capability: 1.1
       Driver: 1
       Total Video Memory: 512MB
       Not chosen because it did not match the named list of cards
    Step 4... Using the name of your card from the above output (I have it underlined; in my case a GeForce GT 330M), copy this exact name, as we will now add it to the list of supported cards...
    Step 5... Using terminal still, again assuming default install directories, copy and paste then hit enter on the following...
                 sudo nano /Applications/Adobe\ Premiere\ Pro\ CS6/Adobe\ Premiere\ Pro\ CS6.app/Contents/cuda_supported_cards.txt
    You will then be asked for your computer password; type it and hit Enter (it will not be shown as you type, FYI). You will then see the default supported cards. Add your card by pasting the name you copied in Step 4, then use Ctrl+X to exit, type Y to save, and hit Enter a couple of times to get out...
    You should now be able to run Premiere Pro CS6 using the CUDA GPU option, which you can select under Renderer as you start a new project!!
    Any questions, I will try to answer them as soon as possible, and as I said I'll screen-capture and record a video tutorial, as I know that would be of help to some of you out there... Some proof, here is a pic...
    PS. It is a full 1080 pic, so it is quite large...
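    For those who'd rather not edit the file interactively, the nano step above can also be scripted. A hypothetical sketch, assuming the default CS6 install path and a card name (`GeForce GT 330M` here) copied exactly from GPUSniffer's output; it backs the file up first and only appends if the card isn't already listed:

    ```shell
    # Sketch only: append a card to Premiere's CUDA whitelist.
    # LIST assumes the default install path; CARD must match the
    # GPUSniffer output exactly (e.g. "GeForce GT 330M").
    LIST="/Applications/Adobe Premiere Pro CS6/Adobe Premiere Pro CS6.app/Contents/cuda_supported_cards.txt"
    CARD="GeForce GT 330M"

    sudo cp "$LIST" "$LIST.bak"     # keep a backup of the original list
    # Append only if the card is not already listed (safe to re-run)
    grep -qxF "$CARD" "$LIST" || printf '%s\n' "$CARD" | sudo tee -a "$LIST" > /dev/null
    ```

    The `grep -qxF` guard makes the append idempotent, so re-running the snippet after an update won't duplicate the entry.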

    Very useful! It worked perfectly in Premiere CS6 on my MacBook Pro 2.8 GHz with a GeForce GT 330M. Any suggestion for enabling ray-traced 3D in After Effects as well? I added the GeForce GT 330M to the After Effects card list, but when I enable ray-traced 3D the text in the comp disappears!
    thanks!

  • Any GPU acceleration in Lightroom 4?

    Hi there-
    I know this has been bandied about as a feature request; I was just curious if there was ever any implementation. I'm off to buy a new video card today, and didn't want to buy a Radeon if the only LR GPU acceleration was CUDA (or vice versa).
    Any general advice on which video card (if it matters at all for LR) to buy would also be appreciated.
    Thanks,

    No.

  • CUDA GPU Acceleration not working on Dell 4700 K2000M

    Hello,
    I have a new Dell 4700 laptop that was outfitted with the NVIDIA Quadro K2000M, and my CS6 Premiere Pro does not allow me to choose GPU acceleration; the option is grayed out for new files. I searched the web and tried several things suggested, such as turning Optimus off in the laptop BIOS and installing the latest drivers for the NVIDIA card, and still Premiere Pro refuses to allow hardware acceleration.
    Am I missing something? I really want this to work. Please help!!
    Thanks,
    Charles

    CUDA hack
    CS6 - minimum 1 GB video RAM
    PC - add your card's name (example: GeForce GTX 680)
    to this list:
    C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt
    (or simply delete the cuda_supported_cards.txt file)
    mac:
    How To Enable GPU Cuda in Adobe CS6 for Mac
    http://www.vidmuze.com/how-to-enable-gpu-cuda-in-adobe-cs6-for-mac/
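    The PC side of the hack above can also be scripted. A minimal sketch, assuming a Unix-style shell run with administrator rights (for example Git Bash); the same change can just as easily be made in Notepad opened as Administrator. The `GeForce GTX 680` name is the example from above, not necessarily your card:

    ```shell
    # Sketch only: add your card's exact name to the CS6 whitelist.
    # Windows path as given above; CRLF line ending to match the file.
    LIST="/c/Program Files/Adobe/Adobe Premiere Pro CS6/cuda_supported_cards.txt"
    CARD="GeForce GTX 680"   # replace with your card's exact name

    cp "$LIST" "$LIST.bak"               # back up the original first
    printf '%s\r\n' "$CARD" >> "$LIST"   # append the card name

    # Alternative mentioned above: remove the whitelist check entirely
    # rm "$LIST"
    ```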

  • Nvidia Geforce 670 and the Mercury Playback Engine GPU Acceleration for Adobe Premiere Pro

    Will Adobe test and certify the Nvidia Geforce GTX 670 for use with the mercury playback engine GPU acceleration feature of Adobe Premiere Pro CS5 in the near future? Any advice would be much appreciated, thanks in advance for any help.

    Adobe only certifies after extensive testing... Many others, with at least 1 GB of video RAM, use the NVIDIA hack http://forums.adobe.com/thread/629557 - which is a simple entry in a "supported cards" file - and on Mac http://www.vidmuze.com/how-to-enable-gpu-cuda-in-adobe-cs6-for-mac/

  • GPU Acceleration in Lightroom CC

    This is the page talking about GPU usage in Lightroom CC: Adobe Photoshop Lightroom Help | Lightroom GPU FAQ.
    This is the page talking about GPU usage in Photoshop: Photoshop CC and CC 2014 GPU FAQ
    As you can see, I haven't seen many details about the actual use cases for OpenGL/OpenCL acceleration in Lightroom. Now that Adobe names Lightroom "Adobe Photoshop Lightroom", does that mean Lightroom is using the same kind of engine? I would think the new Photo Merge function would be a good candidate for GPU acceleration, given that use cases like panoramas or large-image rotation could be quite heavy-duty. Does anyone know where to find these under-the-hood details?
    Having said this, I wonder which graphics card vendor Adobe would officially recommend. Perhaps NVIDIA?


  • AME Adobe Media Encoder Cuda GPU Acceleration not activated, How do I activate it

    Mac Pro Early 2009 4.1, Processor  2.93 GHz Quad-Core Intel Xeon, Memory  16 GB 1066 MHz DDR3 ECC, Graphics  NVIDIA GeForce GTX 680 2048 MB, Software  OS X 10.9.2 (13C64)
    Adobe CS6 suite up to date all programs as of 3-18-14
    NVidia Cuda Driver 5.5.47 installed, Open CL and GL capable
    I read this article: http://blogs.adobe.com/kevinmonahan/2013/09/13/enabling-cuda-for-the-mercury-playback-engine-in-the-macbook-pro-retina/. It shows GPU acceleration through AME (see below). In my AME queue window I do not see this.

    Todd:
    (edit)
    Just reread - sorry, CS6 vs CC.
    What about AME CC?
    http://blogs.adobe.com/kevinmonahan/2013/09/13/enabling-cuda-for-the-mercury-playback-engine-in-the-macbook-pro-retina/
    So, CUDA acceleration in AME and OpenCL is supported. Any way to monitor this? I get FAILED when I try either (on a box with NVIDIA, or a box with AMD).
    Found this:  http://helpx.adobe.com/media-encoder/using/whats-new-media-encoder-7-1.html#gpu-acceleration
    GPU acceleration
    Adobe Media Encoder now takes advantage of GPU for rendering purposes. Both CUDA and OpenCL are supported. The latest release of AME uses the GPU for the following renders:
    Scaling (HD to SD; SD to HD)
    Timecode filter
    Pixel format conversions
    Deinterlacing
    Aspect ratio changes
    All effects in the Effects tab
    GPU accelerated effects in Premiere Pro
    If you are rendering a Premiere Pro sequence, Adobe Media Encoder will use the GPU render preference you have set for the project. All GPU rendering capabilities of Premiere Pro are utilized. The limited set of GPU-renderable tasks in Adobe Media Encoder is only for renders that originate in Adobe Media Encoder.
    If you are rendering a sequence with native sequence support, the GPU setting in AME is used and the project setting is ignored. In this case, all GPU rendering capabilities of Premiere Pro are utilized directly in AME.
    If your project uses 3rd party VSTs (Virtual Studio plugins), the GPU setting in the project is used. The sequence is encoded through headless Premiere Pro just as in earlier versions of Adobe Media Encoder. If Enable Native Premiere Pro Sequence Import option is unchecked, headless Premiere Pro will always be used and the GPU setting is used.

  • Premiere CC kicks GPU acceleration for 6770M

    Installing Premiere CC has disabled GPU acceleration on my 2011 iMac running the "Supported" AMD Radeon HD 6770M 512 MB Graphics card. GPU acceleration was working fine on CS6 without the need for any hack.
    I am running a fully updated OS Mountain Lion.
    GPU acceleration was also kicked on a work computer I was using on a different license, a PC with an NVIDIA card. That required the card driver to be reinstalled. But because my iMac has the driver built into the OS, I can't do this for my iMac.
    Since this happened on two completely different computers I've used with different OS and graphics cards I'm surprised this isn't a big issue for lots of people.
    Any advice would be appreciated, thanks.
    James

    And yet it was working fine on CS6.
    The required specs on the Adobe page are a little confusing, as they read...
       Windows
    Mac
    AMD Radeon HD 6750M (only on certain MacBook Pro computers running OS X Lion (v10.7.x) with a minimum of 1GB of VRAM)
    AMD Radeon HD 6770M (only on certain MacBook Pro computers running OS X Lion (v10.7.x) with a minimum of 1GB of VRAM)
    ATI Radeon HD 6750M (OpenCL)
    ATI Radeon HD 6770M (OpenCL)
    NVIDIA GeForce GTX 285 (CUDA)
    NVIDIA GeForce GTX 675MX (CUDA)
    NVIDIA GeForce GTX 680 (CUDA)
    NVIDIA GeForce GTX 680MX (CUDA)
    NVIDIA GeForce GTX 650M (CUDA)
    NVIDIA GeForce Quadro CX (CUDA)
    NVIDIA GeForce Quadro FX 4800 (CUDA)
    NVIDIA GeForce Quadro 4000 (CUDA)
    NVIDIA GeForce Quadro K5000 (CUDA)
    So they obviously have the Mac and Windows titles in the wrong place.
    But further, when I look inside my system through Terminal, it says I have the "ATI Radeon HD 6770M" - which is supported without the 1 GB VRAM caveat. When I open "About This Mac" / "More Info", it tells me I have the AMD Radeon HD 6770M 512 MB.

  • 2D GPU acceleration for Linux Reader 8 (when it will be available)?

    It would be nice if Adobe Linux Reader 8 could use "2D GPU acceleration" on Linux too. It could be done using the OpenGL driver. NVIDIA's Linux GeForce driver supports shader technology on Linux, and ATI also uses its own accelerated OpenGL library on Linux.
    Will Adobe make Linux Reader 8 with 2D GPU acceleration?

    What a pity. Adobe Reader is very slow even on fast machines when compared to open alternatives. 2D GPU acceleration could give Adobe Reader a chance to work at least as fast as open-source viewers.
    Thanks for the reply, though.

  • GPU Acceleration for both FCP-X and Premiere? Anyone have NVIDIA GTX285 or NVIDIA QuadroFX 4800? ...and successful at using both editors with GPU Acceleration?

    I'm in the market for a laptop Mac solution to GPU-accelerated editing. Because of multi-client workflow needs, I need to edit HD video with both FCP X and Adobe CS5.5 Premiere. The only GPU-capable graphics cards that seem to satisfy both "camps" are from NVIDIA: the GTX 285 and the Quadro FX 4800.
    Does anyone have these cards AND use both programs AND have success at it?
    I sure would appreciate any experienced advice on using NVIDIA GPU acceleration, especially with a MacBook Pro!
    Thanks,
    ~mars9

    How do you plan on using a PCIe card with a laptop? Via a Thunderbolt PCIe adapter? I can't help with that. In fact, I didn't think those adapters were even out yet.
    I do have a Mac Pro, and I use the NVIDIA Quadro 4000 for Mac. It works great for Premiere Pro. As good as the 4800, and for less money. Check out this comparison:
    http://barefeats.com/wst10g11.html

  • GPU acceleration for video editing

    I am using CyberLink PowerDirector 12, Adobe Premiere Elements 12, and Sony Vegas 12 (yes, a lot of 12s). The problem is, my new notebook (MSI GT70 Dominator Pro) doesn't let the NVIDIA 880M run for acceleration. I've used all these programs on a desktop and with my previous 7 ASUS G750 laptops (the ASUS kept overheating, so I went MSI). They all work great.
     NVIDIA GeForce is all set up optimally, and I've tried integrated and dedicated graphics for each program (not that it should matter which one the program uses). All settings in the programs themselves are checked for hardware/GPU acceleration. This isn't my first time doing this, and with laptop and desktop side by side, all settings are identical. But when I go to the Produce section, the box for acceleration isn't valid to click, no matter what final format I choose. Of all 7 ASUS laptops I've had in the last 6 months, all of them worked just fine.
     Any help, please. It's a shame I spent so much on a laptop, finally got one that doesn't thermal-throttle, and now the programs don't work. I also reinstalled Windows and the programs once already, and that didn't fix it. All drivers are also 100% up to date.
     Components are different from the ASUS, but that shouldn't matter. The ASUS had an i7 4700 and a GTX 860M/870M/880M (I went through all of them: JM, JS, JZ); my MSI has an i7 4810/4900 and an 880M.

    I've been working on this problem for days now, I was up until 2am this morning trying to figure it out.
    I then plugged my monitor into my laptop for my son to watch music videos while I did some things around the house, and noticed that as soon as I plugged in the monitor it switched over to NVIDIA graphics, not Intel. While trying to figure this one out, I loaded PowerDirector to work on a movie; rendering takes so long I figured I should start now while I'm busy. Well, GPU acceleration works just fine. Just as before. I did absolutely nothing different.
    I still haven't figured out how to run the second screen off Intel graphics, but now I don't know if I want to, since that's the only thing that changed and now GPU acceleration is working fine.

  • Media Encoder CC not using GPU acceleration for After Effects CC raytrace comp

    I created a simple scene in After Effects that's using the raytracer engine... I also have GPU enabled in the raytracer settings for After Effects.
    When I render the scene in After Effects using the built-in Render Queue, it only takes 10 minutes to render the scene.
    But when I export the scene to Adobe Media Encoder, it indicates it will take 13 hours to render the same scene.
    So clearly After Effects is using GPU acceleration, but for some reason Media Encoder is not.
    I should also point out that my GeForce GTX 660 Ti card isn't officially supported and I had to manually add it into the list of supported cards in:
    C:\Program Files\Adobe\Adobe After Effects CC\Support Files\raytracer_supported_cards.txt
    C:\Program Files\Adobe\Adobe Media Encoder CC\cuda_supported_cards.txt
    While it's not officially supported, it's weird that After Effects has no problem with it yet Adobe Media Encoder does...
    I also updated After Effects to 12.1 and AME to 7.1 as well as set AME settings to use CUDA but it didn't make a difference.
    Any ideas?

    That is normal behavior.
    The "headless" version of After Effects that is called to render frames for Adobe Media Encoder (or for Premiere Pro) using Dynamic Link does not use the GPU for acceleration of the ray-traced 3D renderer.
    If you are rendering heavy compositions that require GPU processing and/or the Render Multiple Frames Simultaneously multiprocessing, then the recommended workflow is to render and export a losslessly encoded master file from After Effects and have Adobe Media Encoder pick that up from a watch folder to encode into your various delivery formats.
