Lightroom JPEG files far larger than equivalent Photoshop files?

I'm using Lightroom 2.3 on a Mac. If I take an image and export it from Lightroom as a JPEG at the 60 quality setting, the resulting JPEG is typically around 2-3 times larger than the same file saved from Adobe Photoshop CS3 using the High Quality setting in 'Save for Web and Devices'.
Lightroom does not appear to be generating a preview or anything else that would account for the disparity.
To get Lightroom to shrink files down to the sizes that Photoshop produces, I need to reduce the quality to a level at which the images contain glaring JPEG artifacts.
I'd like to use Lightroom as part of my web workflow, but this is pretty much a showstopper. Exporting to TIFF, then using Photoshop batch actions to try to automate a conversion to JPEG runs into the problem that Photoshop's "Save for Web and Devices" command can't be fully automated (it's not possible to override the save location).
So my questions are:
1. Why is Lightroom's JPEG compression so poor in comparison to Photoshop's?
2. Is there any way to get around this?
3. Is it safe to use Photoshop's standard 'Save as ...' command in place of 'Save for Web and Devices' when preparing images for the web (I have a distant memory that 'Save As ...' used to export metadata that would choke certain browsers, but I don't know if that's still true in CS3).

It looks as if metadata contributes about 4-8K to the image. Photoshop is stripping a large part of the metadata (even with 'include XMP' on), which contributes to the difference, but doesn't explain it completely.
Here are some test results:
Test Image #1: 660 x 440:
Lightroom 2 (60, with metadata) - 170,904 bytes
Lightroom 2 (60, without metadata) - 164,142 bytes
Photoshop CS3 (60, no metadata) - 141,489 bytes
Photoshop CS3 (30, no metadata) - 72,394 bytes
Test Image #2: 660 x 440 (no metadata)
Lightroom 2 (70) - 124,231 bytes
Lightroom 2 (60) - 88,448 bytes
Lightroom 2 (60) - 80,560 bytes
Photoshop CS3 (Save for Web 70) - 94,032 bytes
Photoshop CS3 (Save for Web 60) - 69,654 bytes
Photoshop CS3 (Save for Web 30) - 32,287 bytes
Photoshop CS3 (Save 8) - 93,517 bytes
Photoshop CS3 (Save 7) - 73,067 bytes
However, there's a quality difference to be taken into account as well. Photoshop 60 is about equivalent to Lightroom 70.
To compare the images, I took the original image (scaled to 660 x 440 by Lightroom) and layered the JPEG over it, then set the layer to 'Difference' to reveal JPEG compression artifacts. Lightroom at quality level 60 shows visible artifacts; Photoshop Save for Web at level 60 does not. To get the artifacts to disappear, I have to take Lightroom up to about 70, by which time the Lightroom files are almost twice the size of the Photoshop files.
Just for amusement, I tried Graphic Converter as well. GC is a great program, but its JPEG conversion turns out to be vile: even at a quality setting of 100, the artifacting on fine near-vertical lines is obvious to the naked eye.
Using Photoshop's own 'Save' (rather than 'Save for Web') command yields similar results: quality level 8 in 'Save' appears to be fractionally poorer than quality level 60 in 'Save for Web'.
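For anyone who wants to repeat the size/quality test outside Lightroom and Photoshop, here is a minimal sketch in Python with the Pillow library. The file name is hypothetical, and Pillow's quality scale is its own - it doesn't map onto Lightroom's or Photoshop's numbers, which is really the point of the tests above.

    # Save one master image at several JPEG quality settings, record the byte
    # counts, and build a 'Difference'-style image to look for artifacts.
    from io import BytesIO
    from PIL import Image, ImageChops

    master = Image.open("original.tif").convert("RGB")  # hypothetical 660 x 440 master

    for quality in (30, 60, 70):
        buf = BytesIO()
        master.save(buf, format="JPEG", quality=quality)
        print(f"quality {quality}: {buf.tell():,} bytes")

        # Equivalent of layering the JPEG over the original and setting the
        # layer to 'Difference': brighter pixels mean larger compression error.
        jpeg = Image.open(BytesIO(buf.getvalue()))
        diff = ImageChops.difference(master, jpeg)
        diff.save(f"diff_q{quality}.png")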

Similar Messages

  • Why would an alias file be larger than the original file?

    I have a simple document that is 106 KB. When I create an alias to the file, the alias is 133 KB. Why would the alias file be larger than the original file when the alias is simply a redirect to the original file?

    Francine has a good explanation here:
    https://discussions.apple.com/message/10337543#10337543
    Matt

  • NT 4.0, LabVIEW 6, Error 4 (END OF FILE) when trying to seek to byte offset 4096 (from start of file) when the file is larger than 2 Gig

    If I try to seek (or read) with the position mode wired and set for START, I get error 4 (END OF FILE) if the file is larger than 2 GB. I'm only trying to move the file pointer 4096 bytes, not trying to seek or read more than 2GB, but I get the error if the file is over 2Gig.
    Instead, I have to do reads, with the position mode unwired, until I get to the place in the file that I want to be.
    Is this expected behavior?

    Hello,
    LabVIEW File I/O functions use an I32 value to store the size of a file, which limits them to files of roughly 2 GB. That is why you see the error even though you are only seeking 4,096 bytes from the start: the file itself is larger than ~2 GB, so its byte count overflows the I32.
    I hope this explanation sheds some light on the situation for you. Hopefully the workaround (repeated reads) is not too much of an inconvenience.
    Good luck with your application, and have a pleasant day.
    Sincerely,
    Darren Nattinger
    Applications Engineer
    National Instruments
    Darren Nattinger, CLA
    LabVIEW Artisan and Nugget Penman
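    As a rough illustration of that ~2 GB limit - in Python rather than LabVIEW, and with a made-up file size - a byte count stored in a signed 32-bit integer simply wraps negative once the file passes 2,147,483,647 bytes:

        # Sketch: a signed 32-bit (I32) byte count wraps once a file passes ~2 GB.
        import ctypes

        I32_MAX = 2**31 - 1                       # 2,147,483,647 bytes
        file_size = 2_500_000_000                 # hypothetical file larger than 2 GB

        stored = ctypes.c_int32(file_size).value  # what an I32 size field would hold
        print(I32_MAX)   # 2147483647
        print(stored)    # -1794967296: negative, so end-of-file checks misfire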

  • Why are ACR PSD files 10-20 percent larger than the same file resaved in PSD?

    Why are ACR > PSD files 10-20 percent larger than the same file resaved in PSD? I posted this many years ago and never found an answer. Now that my drives fill up quicker, I thought I might chase this question a little bit further.
    Same .CR2 saved within ACR, either with cmd-R or by opening ACR from within PS: the saved file is 34.5 MB. Resave that same file (no edits) within PS, with or without Maximize Compatibility, and the file is now 30.7 MB. Another file that is 24.5 MB becomes 19.5 MB.
    Why the difference? Are ACR and PS actually using different compression strategies?
    thanks.
    Mac 10.8.5 / CC / ACR 8.4.1 - but this has been a consistent behavior over many years and versions, CS6 / CC.

    Hi Jeff
    If it is RLE it's not as efficient as LZW:
    Saved ACR>PSD = 40.1MB  (sample image this AM)
    opened in PS and resaved as PSD = 30.8MB
    resaved as TIF without LZW = 40.1MB (this adds to your thought that the ACR>PSD doesn't use any compression)
    resaved as TIF/LZW = 9.6MB
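    The same uncompressed / RLE / LZW comparison can be sketched outside Photoshop with Python and Pillow; TIFF stands in for PSD here, and the file name is hypothetical:

        # Save the same pixels three ways and compare sizes:
        # no compression, PackBits (RLE) and LZW.
        import os
        from PIL import Image

        img = Image.open("master.tif")  # hypothetical source image

        for label, method in [("uncompressed", None),
                              ("PackBits/RLE", "packbits"),
                              ("LZW", "tiff_lzw")]:
            out = f"test_{method or 'raw'}.tif"
            img.save(out, format="TIFF", compression=method)
            print(f"{label:14s} {os.path.getsize(out):>12,} bytes")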
    Jeff Schewe wrote:
    I really think your priorities are a bit off. 10-20% is meaningless...you just need to get bigger....  and quit fussing over a few GB's here or there...
    ???   I hope that the Adobe engineers are fussing over 10-20% efficiencies.
    I'm within arm's reach of a rack of 40 TB of drives (not counting off-site drives), and all the 2 TB drives are being recycled to 4 TB drives, so the rack is always growing. Actually the ACR>PSD files don't really make a difference to our long-term storage, only to the nightly backups. But anyway, how you save, what you save, etc. should all be part of the discussion.
    So in my case, throw in an excess MB here and there and all of a sudden you are talking TBs. Plus there are advantages in backup times, drive life, and energy use.
    Somebody added compression to the PS>PSD format but left it out of the ACR>PSD format. Was that a decision or an oversight? If it's just a matter of making ACR match PS when saving the same PSD format, then why not?
    regards,
    j

  • Aperture exports jpeg files larger than original RAW files

    Can anyone tell me why a RAW file (10.6 MB), when exported as a JPEG (10.8 MB) from Aperture, ends up larger than the original RAW file? The same RAW file, when opened and then saved as a JPEG (6.4 MB) in Photoshop, is a lot smaller. The photo dimensions and resolution are the same in both saved files (34.5 MB open file, 300 dpi, 4256 x 2831 pixels). I have tried this on several photos, all with similar results. For information, I am saving the photos in both Photoshop and Aperture at 300 dpi, original size, and at a quality setting of 12. In these tests I have done no work to the photos; obviously the file sizes increase after work has been carried out on them (in both Ps and Aperture).
    Almost doubling the size of saved JPEGs has massive implications for my library, and may be one reason to consider Adobe Lightroom, as it gives JPEG file sizes similar to Photoshop's, i.e. almost half the size of the original RAW file.
    Reducing the quality setting on saved JPEGs is an obvious way to reduce file size, but it doesn't answer the question of the considerable discrepancy when saving at the same quality in different software.
    Is this a feature of Aperture and nothing can be done about it ? I would prefer to use Aperture but cannot cope with the large jpeg sizes !
    Any comments would be much appreciated - thank you
    Nick

    Think you might be right, Allen - the quality-12 saved JPEGs seem to be pretty high quality, closer to the original than the files saved in Ps at quality 12. I have just run an identical set of processing actions on all the files in Photoshop, and the JPEGs previously saved in Aperture at the 12, 11 and even 8 quality settings seem to be better than the same files saved at 12 in Ps.
    Bizarrely, the file size drops from 10.6 MB at quality setting 12 in Aperture to 3.2 MB when saved just one notch down at quality setting 11. That is a massive drop, especially considering that the next one down, quality 10, results in a 2.8 MB file.
    rw just ran some checks and tests on the file export settings and file sizes in Aperture, on a file I sent him, and we get the same results. So at least my copy of Aperture is not up the wall!
    It would be useful to have an explanation from Apple for the vast variance in settings and file sizes, but I guess we will just have to keep guessing - and buying more and more hard drives for all the large files.
    I am considering keeping the RAW originals in future, in which case I need only save smaller JPEGs and issue them at whatever size they are needed at the time - it just needs a bit of planning to look after an ever-increasing collection, which is about to have two sets of images added at a time, on top of the 80,000 images already amassed at last count.
    Thanks
    Nick

  • Help with Photoshop CS2 files MUCH larger than normal, possibly not compessing layer masks

    Has anyone else come across a problem where Photoshop files grow in size much faster than expected, especially when adding layer masks?
    Normally when you add a layer mask the file size grows by a small amount if the mask is mostly all white or mostly all black. In this case the file size seems to grow exponentially, almost as if the layer mask isn't being compressed at all.
    The file in question, although large, has similar dimensions, resolution and number of layers to other projects that have been worked on, and all the usual things have been checked: resolution, colour mode, bit depth, and the image has also been cropped so there's nothing outside the page.
    The file has been checked on 2 different macs and exhibits the same behaviour and Photoshop preferences have been deleted and re-created and we've tried copying the file by dragging layers to a new document of the same dimensions with no success.
    The only things we're certain of is that file size grows dramatically when adding layer masks in particular and layers in general.
    The file was created by someone else, if that might be a factor, though the file has been "saved as.." with no improvement.
    Any ideas?
    TIA
    Tom
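    For what it's worth, the "a mostly white mask should cost almost nothing" expectation is easy to demonstrate outside Photoshop. The Python sketch below uses zlib as a stand-in for the RLE that PSD files normally apply to channel data, and the mask dimensions are made up:

        # A uniform (all-white) mask compresses to almost nothing;
        # a noisy mask barely compresses at all.
        import os
        import zlib

        w, h = 4000, 3000                      # hypothetical 8-bit mask dimensions
        uniform_mask = b"\xff" * (w * h)       # mostly/all white
        noisy_mask = os.urandom(w * h)         # worst case: random data

        print(len(zlib.compress(uniform_mask)))  # a tiny fraction of the 12 MB
        print(len(zlib.compress(noisy_mask)))    # roughly the full 12 MB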

    Maybe it has something to do with "Maximize Compatibility". If that's checked when you save, it might add a lot of weight to your file, even with just a simple layer mask. Don't know, haven't done any testing, just a thought.
    Edit: Just did a quick test:
    1. flat image saved as PSD = 5.2 MB
    2. same image converted to a layer with simple layer mask, saved as PSD = 5.8 MB
    3. same layer mask image saved WITH "maximize compatibility" = 10.7 MB
    So adding "maximize compatibility" nearly doubled this test file. Could be it?

  • The amount of memory used for data is a lot larger than the saved file size. Why is this, and can I get the memory usage down without splitting up the file?

    I end up having to take a lot of high-sample-rate data for relatively long periods of time. When I save the data it is usually over 100 MB. When I load the data for post-processing, though, the amount of memory used is excessively higher than the file size. This causes my computer to crash because 1.5 GB is not enough. Is there a way to stop this from happening without splitting the data up into smaller files?

    LabVIEW can efficiently handle large files, far beyond 100 MB, provided that care is taken in the coding of the loading/processing routines. Here are several suggestions:
    1) Check out the resources National Instruments has put together (NI Developer Zone > Development Library > Measurement and Automation Software > LabVIEW > Development System > Optimizing Applications > Managing Memory), specifically the article entitled "Managing Large Data Sets in LabVIEW".
    2) Load and process the data in chunks if possible (see the sketch after this list).
    3) Avoid sending the data to front panel indicators, using local/global variables for data storage, or changing data types unless absolutely necessary.
    4) If using LabVIEW 7.1, use the "show buffer" tool to determine when LabVIEW is creating extra copies of data in memory.
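    A minimal sketch of suggestion 2, in Python rather than LabVIEW (the file name, chunk size and per-chunk processing are placeholders):

        # Read and process a large binary capture in fixed-size chunks so the
        # whole file never has to sit in memory at once.
        CHUNK_BYTES = 1024 * 1024  # 1 MB per read

        def process(chunk: bytes) -> None:
            pass  # replace with the real per-chunk analysis

        with open("big_capture.dat", "rb") as f:
            while True:
                chunk = f.read(CHUNK_BYTES)
                if not chunk:
                    break
                process(chunk)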

  • Why is Keynote storing images 20x larger than the original file size upon import?

    In Keynote, I'll add a 150KB PNG image to a slide and the 500KB presentation goes to 3.7MB... a more than 3MB increase. I'm trying to troubleshoot large file sizes as a simple deck takes up 50-100MB on disk. Anyone know what is going on to make the file 20x larger once added to Keynote? And how I can keep the file size down? The 'Reduce File Size' option is worthless in this case... offering a less than 10% reduction on the earlier 20x amplification. Thanks

    Chris H 71 wrote:
    This was originally in Keynote 09.
    Yes, converting an existing version 5 file into Keynote 6 does many strange things. Any existing version 5 files I have, I continue to edit in 5. When starting a new presentation, I can use 6 without many problems at all.
    I provided the files to replicate the issue to Apple support and never heard back.
    Apple do not provide feedback to users on issues, they only request information from users.

  • I cannot add nor send file attachments larger than 100k on any of my email servers from this particular PC. Is there a fix for this?

    I cannot attach a file larger than 100k to any email account (gmail, aol, secureserver.net) while using Firefox. However, I can send files larger than 100k from those same accounts on this computer when using IE. Switched to Firefox from IE and would prefer to continue using Firefox, however this issue presents a problem for my business needs. Please advise. Thank you.

    Well seeing as I'm running Lion on 2 GB of RAM, I can't see how upgrading the RAM will make that much of a difference.    If you are seeing launch times in minutes, it probably indicates your cache got corrupted by some third party cache cleaning software.  To see if some corruption can be fixed, first backup your data:
    https://discussions.apple.com/docs/DOC-1992
    Then boot in safe mode, holding the shift key.  If issues persist in safe mode, run the hardware test that came with your Mac. 
    http://docs.info.apple.com/article.html?artnum=303081
    And if that is unrevealing, and booting into safe mode doesn't solve anything, and you are backed up, try repairing the directory by:
    1. Booting into the restore partition with command-R.
    2. Selecting the installer's Utilities menu.
    3. Selecting Disk Utility
    4. Selecting First Aid
    5. Selecting the boot partition.
    6. Click on repair disk (not repair permissions)
    Do NOT proceed with the install itself. Just quit the install program.
    If the disk cannot be repaired, purchase Alsoft Disk Warrior and boot off that to repair the directory.
    If it is able to be repaired, and the machine is still slow, I think you may have run into some RAM that is bad, and need to replace it, not add to it.
    If Disk Warrior can't repair the hard disk, replace the hard disk itself.

  • File too large error or corrupt file error

    I have scanned some images using a Nikon Cool Scan, and when trying to import the NEF files into Lightroom I get a corrupt or unrecognized file error. Bringing one into CS2 or CS3, saving as TIFF, and trying the import gives a 'File too large' error.
    Any ideas or help on this. What is the file size max for import?
    The scan is 4000dpi even tried at 300dpi.
    Thanks in advance for any insight.

    > Is it truly a size problem? If so, what is recommended? Lee Jay states that 10000 pixels is the max on either side. Okay, in DPI, what does that translate to?
    There's no necessary relationship between pixels and dots. You could scan an image at 4,000,000 dpi and translate it into an image of 100 x 100 pixels. I've used ridiculous extremes to make a point. The LR limitation is currently 10,000 pixels for any side. So you could have 9,000 x 9,000 pixels but not 10,001 x 50 pixels.
    Is this now clearer?
    John "McPhotoman"
    ~~ John McWilliams
    MacBookPro 2 GHz Intel Core Duo, G-5 Dual 1.8;
    Canon DSLRs

  • Why is audio file size larger than in Flash?

    I use both Flash and Captivate. Final file size is a big consideration. I wondered why the file size of the audio in Captivate is almost twice as large as the same audio if it's used in Flash. I'm using the same audio settings for both. Does anyone know what causes the audio compression to be different in Captivate than in Flash?

    OK, sounds like memory will be fine, so let's look at some other stuff...
    Is it only one particularly keynote presentation that slows up for you, or does keynote behave in the same way even if you start from scratch and create a new presentation?
    If it's just the one presentation then there's probably some content (possibly video) within it that KN takes exception to. Sometimes a single problematic slide or piece of content can slow down the whole app.
    Is the problematic presentation one that was imported from Powerpoint? If so, import it again and look carefully at the import error/warning logs.
    Before you start keynote, start up activity monitor - which will give you an indication of what programs are using all the CPU/memory/disk bandwidth etc. You'll then have an idea of whether it is KN maxing out on CPU or whether anything else is dragging the system down.
    Another thing you could try would be to log in to a different user account on your machine (create one if necessary) and then try opening the presentation from within this different account. If this works for you then it suggests a problem with the keynote plist file in your original user account.
    I don't know if it'd be acceptable to you, but if you could post a link to your problematic presentation then I could download it and try it out on my normally reliable Keynote on one of my computers. (Make sure that you've ticked the box to embed all media in the presentation)
    Hope some of this is helpful,
    Mike.

  • Time machine backup file size larger than size of backup hard drive.  Is something wrong?

    My 3 TB external hard drive (My Book Studio) is my Time Machine drive.  But Finder says that backup.backupdb is 6.9 TB in size, or over twice the drive's capacity.  Should I assume that the drive is corrupt or failing and that I can't rely on Time Machine now?  Or is Finder measuring something other than the actual size of the file...?

    I have this problem as well.  I have two laptops backed up to the same external hard drive, and I would like to move just one of the laptop's backups to a new drive.  The partition for both is 600GB, but the size of one laptop's backup folder is 850GB and for the other it is 3.3TB.
    I would like to know the actual size of each laptop's backup folder.  Getting the size of the volume is not useful because I need to know the sizes of the two individual backup folders within it.

  • When I export a PDF the file opens larger than resolution so it is pixelated? How do I stop this?

    I have created an InDesign document at 1024 x 768 and exported it as a 72 dpi PDF. The PDF is only to be viewed on screen.
    My problem is that when I export the PDF it always opens at more than 100%, so everything is slightly pixelated. To see everything in focus as I want it, I would need to zoom out.
    Obviously I want everyone to view it as I see it, so how do I get around this?
    I chose 1024 x 768 just in case someone wanted to view it on an iPad. Should I have just created it bigger to overcompensate for this problem? I guess on an iPad the PDF would just be shrunk so the whole page could be viewed, and I guess that wouldn't be pixelated?
    Thanks in advance for any help!
    Dan

    Set the initial view in Acrobat Pro and resave the PDF there.
    Bob

  • Why is my "Combined PDF" file size smaller than the original files?

    Hello!
    I am trying to combine two individual PDF files into a single PDF. Each file is 32 MB; however, when I use Acrobat to combine them, the newly created "combined" file is only 19 MB. I believe I've taken the necessary steps to ensure no degradation is happening (i.e. selecting Large File Size in the options panel), but I am still puzzled as to how two files can be put together as one and be smaller than the two separate files without any compression. What am I missing?
    Thanks in advance!

    When you combine files it does a "Save As" operation. This re-writes all of the PDF objects into the single file and is supposed to clean up the file, whereas the individual files may have had multiple incremental saves, which (if you look at the internals of a PDF) simply add on to the end of the file. In other words, you get a more cleanly written and optimized file that is also saved for Fast Web View.
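    The effect of such a rewrite can be approximated outside Acrobat with, for example, the third-party pikepdf library in Python (file names hypothetical; this only illustrates rewriting versus incremental appending, not what Acrobat does internally):

        # Re-write a PDF from scratch: objects left over from earlier
        # incremental saves are not carried across, so the output is often smaller.
        import os
        import pikepdf

        with pikepdf.open("incrementally_saved.pdf") as pdf:
            pdf.save("rewritten.pdf")

        print(os.path.getsize("incrementally_saved.pdf"))
        print(os.path.getsize("rewritten.pdf"))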

  • Devenv always rebuilds my project, because a typescript file is newer than a pdb file

    I have a project with TypeScript files, and devenv is constantly rebuilding it. The diagnostics output contains this line:
    Project 'Benefits' is not up to date. Input file 'C:\abc\UI\BenefitsWeb\Scripts\Benefits\Modules\PPACA\ApprovalScreen.ts' is modified after output file 'C:\abc\UI\BenefitsWeb\bin\Dayforce.Web.Benefits.pdb'.
    So, how do I resolve this issue? The aforementioned TypeScript file is indeed newer than the PDB, but the PDB is not the output of the TypeScript compilation!
    Here is how I compile my typescript files:
        <PropertyGroup>
          <CompileDependsOn>
            $(CompileDependsOn);
            CompileTypeScript
          </CompileDependsOn>
        </PropertyGroup>
        <Target Name="GetInputs">
        </Target>
        <Target Name="GetOutputs">
        </Target>
        <Target Name="CompileTypeScript"
                DependsOnTargets="GetInputs;GetOutputs"
                Inputs="@(InputTypeScripts)"
                Outputs="@(OutputJavaScripts)">
        </Target>
    The project specifies the typescript files in the `TypeScriptCompile` item group, for instance:  
    <TypeScriptCompile Include="Scripts\Benefits\Modules\PPACA\CalendarSetup.ts" />
    Of course, @(OutputJavaScripts) lists the expected .js files only. No PDB files there.
    So how come devenv matches a typescript file with a PDB file? How do I fix it?
    I would like to clarify: it does not actually recompile the TypeScript or C# files (well, the first time through it does compile the TypeScript files, but only that). It starts the build because a TypeScript file is newer than the PDB, but then it recognizes that no PDB dependency has actually changed. The same is true for the JavaScript files - they are all up to date. So nothing is actually rebuilt, but why enter the build sequence in the first place?

    Hi mark,
    >>The project type is C# (csproj).
    C# is a programming language; the project type is something like a console application, Windows Forms application, or web application.
    Do all project types have this issue? If so, try resetting VS to the default settings with the following commands to see if it helps.
    1.devenv.exe /resetsettings
    2.devenv.exe /resetuserdata
    regards
