Batch processing ignores ACR changes

I am using CS5 (32-bit) on a Vista PC.  I often want to batch process several raw files.  I have modified many of them in ACR, but when I do my batch processing, the changes I made in ACR are not applied to the files I end up with.  I do not have any Open commands in my actions; I check "Suppress File Open Options" and "Override Action "Save As" Commands" and select a specific destination folder.  Some of the actions just save the file as a JPEG without any other changes and then close the file.  (I tried the Save jpg button in ACR but it always says it can't create the file.)
Has anyone seen this?  Even better, is there a solution?
Thanks,
Paul

Intermittently.  Sometimes it seems to work for the first few images in a batch and then starts to ignore the ACR changes even though the batch keeps running.
Paul

Similar Messages

  • How can I batch process files to change name and extension in the latest Photoshop CC?

    Hi,
    The batch process function has changed design recently and I cannot get it to change file names in photoshop cc 2014.
    I keep getting new layers in the first file to open instead of a renamed batch.
    I would also like to convert files to jpeg and resize ( at the same time ? )
    thank you for help in advance,
    christos

    Can't help you with CC, but you cannot change file types by merely changing the extension, regardless of program or version.

  • Can you do batch processing with Photoshop (changing the ICC profile for 200+ photos)?

    I want to change the ICC profile of 200+ pictures. Is there any means of doing this as a batch process?

    You could record an Action of the "Convert to Profile" step and use that in Batch or the Image Processor, for example.

  • In Photoshop Elements 12, is there a way to batch process all photos in a file with 'Auto Tone' and save the changes?

    Thank you, that was perfect!
    Yoni

  • Batch process makes image resolution change to 72dpi - it should be 300dpi - Help!

    In my work I use a batch process to run an action that converts EPS files to TIFF files. I have been using this successfully in Photoshop 7 on a Win2000 PC for a number of years. The process is as follows...
    Open EPS file
    Flatten Image
    Save
    Close
    When I recorded the action I made sure that the resolution was set to 300dpi as we require the TIFF produced at the end to be that resolution.
    I am now testing PSCS3 on a WinXP machine as we are upgrading all of our equipment. When I run the action as a batch process the TIFFs that are saved end up being 72dpi - They should be 300dpi. This worked in PS7 on the Win2000 machine and I am using exactly the same process in PSCS3 on the new WinXP machine, but something must be going wrong somewhere for the TIFFs to be saved at 72dpi instead of the required 300dpi. Any ideas?

    That doesn't seem to work - it doesn't appear to record any changes I make in the 'Image Size' dialog.
    When I look at the actions that I recorded in PS7 there are all the settings that I specified at the 'Open' step, such as 300dpi, greyscale, etc. saved under 'Open', but when I do the same in PSCS3 it doesn't record all the settings, just the actual 'Open' command, so I need to find a way of making PSCS3 remember '300dpi', 'greyscale', etc. within the 'Open' step like it did when I recorded the action using PS7.

  • Is it possible to add or change OCR text in a batch process?

    Hello,
    This is my first submission.  Really hope you can help as if there's a solution to this it will significantly help our business.
    Is it possible to 'batch process' the adding or changing of OCR text in a PDF?
    This might sound like a strange process but let me explain what we do:
    1. We scan old, handwritten books and registers.  Typically these registers contain lists of information. Each page will be scanned to a filename like pge01.jpg, pge02.jpg, pge03.jpg etc.
    2. We transcribe the content of each page.  Each page will contain multiple records (e.g. 20 records); typical fields might be:
    Unique ID
    Surname
    Forename
    Year
    Address
    Filename (i.e. pge02.jpg)
    3. We provide this content back within database-driven software so that when a user performs a search on, say, 'Surname=Jones' & 'Year=1945', all of the scanned pages containing handwritten text that matches the search criteria are displayed in a list.  The user can then click on a search result and see the scanned page containing that record.
    Rather than provide database driven software, we'd simply like to produce a standard PDF file.  Each page within the PDF will show each of the scanned pages (pge01.jpg, pge02.jpg, pge03.jpg etc.).  But where you would normally store the 'ocr recognised text' behind each image, we would like to show our transcribed content.
    If this is possible, then I realise that it's likely that you can't do field searches (i.e. 'Surname=Jones' & 'Year=1945') but at the very least I'd be able to type 'Jones' into the search box and it would find all pages that contained the transcribed word 'jones'.
    If it's possible to add the transcribed data as 'ocrd text' then is it possible to do it in some sort of batch process?  We scan lots of big books and capture millions of records - so doing it manually is not an option.
    Any help that anyone can provide will be hugely appreciated.
    Thanks,
    Paul

    I don't think it's possible to manually add OCRd text, but you can add form
    fields with the text in them. And yes, it is possible to search the content
    of form fields, using a script.

  • Where is it possible to batch process the size of images? I am able to change pixel dimensions in the Image Processor but not image size in inches.

    Hi- I need to batch process images for a video project.  I am able to change the pixel dimensions in the Image Processor but don't seem to have the option to change image size in inches.  Please advise!  Thanks

    You don't need inches for video (or screen viewing in general). It all goes by pixel count. Inches is for print, nothing else.
    But to answer the question, you can run actions in the Image Processor, and this is where you set size in inches. Just bring up Image Size, uncheck "resample image", and specify size. You'll notice resolution changes to reflect the fact that the existing image pixels are now redistributed over the new print size.
    But again, screen doesn't care about size or resolution. It only counts pixels.
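The relationship that answer describes is simple arithmetic, and can be sketched in a few lines of Python (a standalone illustration, not Photoshop scripting; the helper name is made up):

```python
def effective_ppi(pixels, print_inches):
    """With "resample image" unchecked the pixel count is fixed,
    so resolution (ppi) = pixel dimension / print size in inches."""
    return pixels / print_inches

# An 1800-pixel-wide image printed 6 inches wide:
print(effective_ppi(1800, 6))   # 300.0
# The same pixels spread over 12 inches halve the resolution:
print(effective_ppi(1800, 12))  # 150.0
```

This is exactly what you see in the Image Size dialog: change the print size with resampling off, and the resolution field updates to keep the product constant.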

  • Batch Process Change Canvas Size Buggy

    I have 300+ products shots that need to go up to our website.
    Using FW CS3 I was able to resize and optimize both full size
    images and the thumbnails. The issue is due to our website setup
    the thumbnails are expected to be exactly 100x100. If the canvas
    size is irregular the image gets stretched or squeezed when served
    up on the site.
    So I created a custom action to set the canvas size to 100 x
    100 with a white background and to expand from the center out. When
    I run this manually everything is ducky. But when I add it to my
    batch process the actual content of the thumbnails moves from image
    to image. Some items are centered fine, some are too far left or
    too far right (or high,low) some are shifted so much that they are
    actually cut off to the left or right. There seems to be no rhyme
    or reason to it.
    Here is how the batch is set up:
    It grabs the original file ( a press resolution PSD file) and
    fit to size to 100 x 100. Then I run the command to set canvas size
    to 100x100 white background. And then it exports it to optimized
    jpeg. I have tried running the actions separately. And when I just
    run the fit to size and optimization as a batch the thumbs look
    fine no weird positioning. So I know it is the command.
    Any ideas?

    Thank you, Heath.
    If I choose scale > scale to fit area > then choose a maximum width or height, that works for that part of the process.
    Percentage doesn't work, because I need them to be a particular size (either width or height, doesn't matter, as long as neither of them exceeds 00px).
    For the part where I then want to find the smallest measurement (width or height) and make it a particular size, there doesn't seem to be any way that I can find in FW to do that.
    But, after searching online, I found an app that does all that and more; it's brilliant.  It will find the largest side, the smallest side, or whatever and change it to what you want and rescale the other side to fit.
    This is the link - http://www.rw-designer.com/picture-resize
    Fiona
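The fit-then-center logic the batch command is supposed to apply boils down to two small calculations. Here is a plain-Python sketch (illustrative only, not Fireworks scripting; the function names are invented):

```python
def scale_to_fit(w, h, box):
    """Scale (w, h) proportionally so the longer side equals box."""
    scale = box / max(w, h)
    return round(w * scale), round(h * scale)

def centered_offset(w, h, canvas):
    """Top-left offset that centers a w x h image on a square canvas."""
    return (canvas - w) // 2, (canvas - h) // 2

w, h = scale_to_fit(1200, 800, 100)  # a landscape shot becomes (100, 67)
print(centered_offset(w, h, 100))    # (0, 16): flush horizontally, centered vertically
```

When the same command centers correctly by hand but drifts during a batch, the offset is evidently being computed against the wrong anchor point in the batch run, which matches the symptoms described above.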

  • Photoshop CS6: batch process to change the canvas size

    I am looking to change some images from 320x? to 320x320 without distortion.  The way to do that would be to change the canvas size from 320x? to 320x320 and have the product stay in the middle.  Can CS6 do that in a batch process?

    I am sorry, is that a yes or a no?  I do not own this version of Photoshop but would upgrade if I can do this, as I have several images to reformat.
    Sent from Windows Mail

  • ACR use of work disk and scratch disk when batch processing images

    I am setting up a new PC optimized for PS, and I am pondering the optimal disk setup for ACR when I batch process several images:
    Alternative 1
    2 physical disks in RAID 0 with 2 partitions: partition 1 for scratch and the second for images to work on.
    The question is whether there will ever be a conflict in the sense that PS/ACR wants to access both the scratch and work partitions at the same time. If this should occur, I guess the following is a better setup:
    Alternative 2
    1 physical disk for scratch
    1 physical disk for images to work on
    Christopher

    Thank you Ramón,
    I am not limited to 2 drives. I am planning on having OS and programs on a third separate disk.
    The point of using alternative 1 would be to use RAID 0 to obtain faster read/write (both for scratch and for reading and saving images on the work partition). However, if scratch usage competes with read/write to the work partition, I'm probably better off with alternative 2. If I work on one image this is simple; however, I frequently batch process several RAW images. How does ACR/PS handle this? If all the images are initially read from the work partition, loaded into RAM + scratch, and processed there, there will be no competition between the two partitions and I will benefit from the RAID solution. However, if the images are read one or just a few at a time, processed in RAM/scratch, and then saved to the work partition, I imagine the scratch and work partitions could be competing for the disk's read/write head when finished images are being saved and ACR is "waiting" for more images to process.

  • Batch process icon changes

    Is there a way to change icons in a Finder window for several items at a time?
    By choosing Get Info from the File menu I can only change one icon, and then repeat the process for the next one.
    Photoscene
    PowerBook G4 17 inch, PowerBook G4 12 inch   Mac OS X (10.4.6)   Tiger/Panther

    There is a plugin that lets you batch process Finder icons, FinderIconCM.
    http://www.pixture.com/software/macosx.php
    Photoscene

  • Batch process name changes in Photoshop CC 2014

    Hi,
    How can I batch process name changes in Photoshop CC 2014 without opening Bridge?
    thank you in advance,
    christos

    Are you talking about layer names or file names, or about turning layers into files?

  • Can I use Organizer to change the .jpg size of all photos in a folder in one batch process?

    We're using wa-a-ay too much disk space for a growing photo collection.   My daughter is a camera freak with four kids -- need I say more? 
    In my current test folder the sizes of the original photos range from 2.3 MB to 4.2 MB.  I picked one photo at random and printed the original and then three more, saved by PSE9 as .jpg files at maximum, middle, and minimum quality settings, on plain letter-size paper and again on 4x6 photo paper.  It takes a magnifying glass and a calibrated eyeball to detect a difference on the plain paper, and it's even more difficult on the photo paper.  I think I should relieve the pressure on the hard drive's capacity by reducing everything to a more reasonable size; after all, the minimum-quality print takes up less than one percent of the disk space that the maximum-quality print does.
    I have a hunch that I can accomplish this in a batch process with Organizer, but that's all I have is the hunch -- not the know-how.   I would like someone to step me through the process.
    I also expect a lecture on why I shouldn't do this.  Go for it; I'll listen.
    Thanks,  Bud

    Bud,
    BudV wrote:
    I also expect a lecture on why I shouldn't do this.  Go for it; I'll listen.
    You are the one to judge... Will you regret it if you want to use some pictures with the original size and resolution at a later time?
    Yes, you can use either the organizer or the 'process multiple files' command of the editor. But first you should think about two questions:
    - Are you ready to backup the full resolution and size files before the 'shrinking'? External USB drives with big capacities are available and affordable.
    - From what you are saying, the optimal size would be for files resized for 4" x 6" format at 300 ppi in jpeg quality 8 to 10.
    Resizing from the Editor: Use the 'process multiple files' command and click the 'Same as source' button. I would not do it without a sound backup first... This can be done folder by folder.
    Resizing from the Organizer: Use the 'Export' command. That command leaves the original and puts the resulting files in a new folder. The resulting files are NOT included in the catalog. This means that keeping your organization (categories, albums, tags...) will be tricky; the same applies if you are using a folder organization system.
    Another solution with a third party software such as Faststone photo resizer:
    - Absolutely do a full backup before
    - run the resizing options in the external software so that the files are resized without being moved. That replaces the originals.
    - If you are using the organizer, your organization will be kept, but the file sizes in the catalog and thumbnail databases will be wrong. You can regenerate the thumbnail cache by deleting the thumbs.5.cache file in your catalog folder; it will be automatically rebuilt.
    That has worked for me in the past... no guarantee for you!
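The "4 x 6 at 300 ppi" suggestion above translates into concrete pixel dimensions; the arithmetic can be sketched in Python (plain math, not Organizer or Editor scripting):

```python
def target_pixels(width_in, height_in, ppi):
    """Pixel dimensions needed to print at the given size and resolution."""
    return round(width_in * ppi), round(height_in * ppi)

# 4" x 6" at 300 ppi:
print(target_pixels(4, 6, 300))  # (1200, 1800)
```

Anything much larger than 1200 x 1800 pixels is wasted on a 4" x 6" print, which is why resizing to that target frees up so much disk space.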

  • Batch processing: change settings from AS2 to AS3

    We have previously developed animations in the AS2 version and we want to republish them with AS3. There are more than 10,000 of them.
    How can we batch process this using JSFL?
    And one more thing: if there is any code on the main timeline, comment out that code.
    Is it possible?

    Hi,
    Thanks for your reply.
    I want to convert the Flash publish settings from AS2 to AS3 using batch processing, and at the same time I want to comment out the code in the last frame (whatever code is present) located on the main timeline.

  • Gather_table_stats in a batch process

    Hi everyone,
    we have a bit of a problem with several batch processes. The basic structure of a batch process is this:
    1. do some init tasks - record the batch process as running, load parameters, etc.; generally very simple tasks (haven't seen those fail)
    2. prepare some data into a TMP table (regular table, used from this single process - to pass data between runs, etc.)
    3. process the data from a TMP table (e.g. insert records to other tables, etc.)
    4. do some final tasks - mark the process as finished, create some simple records describing results (this fails, although very rarely)
    Steps 2-4 are always in a single transaction (usually a single procedure). One thing we've noticed is that the TMP table may be very different between the runs (e.g. it may be truncated and filled with new data on each run), which causes problems to the optimizer - in case of several batch processes, this often results in a very bad plan.
    So we have modified the processes so that a dbms_stats.gather_table_stats is called after steps 2 and 4, i.e. it looks like this
    1. do some init tasks - record the batch process as running, load parameters, etc.; generally very simple tasks (haven't seen those fail)
    2a. prepare some data into a TMP table (regular table, used from this single process - to pass data between runs, etc.)
    2b. gather stats on the TMP table
    3a. process the data from a TMP table (e.g. insert records to other tables, etc.)
    3b. gather stats on the modified tables (usually just one table)
    4. do some final tasks - mark the process as finished, create some simple records describing results (this fails, although very rarely)
    which works (the query plans are much better), but there's a BIG problem we hadn't noticed before - dbms_stats.gather_table_stats commits the transaction. That is not that big a deal in the case of (2b), as the previous steps are rather trivial and easy to undo manually, but (3b) is much worse.
    I thought we could fix this by running gather_table_stats from an autonomous transaction, but that does not work as it does not read uncommitted data :-(
    We could move the (3b) out of the batch process and run it at the very end, but that still does not solve the other gather_table_stats call (which is a PITA) and we would have to propagate what tables were modified in the process.
    Is there a way to do this? I.e. how to gather stats within a single batch process (PL/SQL procedure) without committing the previous changes and without ignoring the uncommitted data?

    First, you refer to the table as a TMP (temporary) table, but you also mention truncating it, so is this actually a permanent table that is used as a work table? With a work table, it is my experience that what you want to do is load it with a representative sample (usually a larger one when there is significant variation between runs), collect statistics, and then leave those statistics in place.
    I find the same is true when you use a global temporary table (GTT) in place of a permanent work table. Create the GTT, load it, collect statistics, verify the plan, and leave those statistics in place. With a GTT every user session gets their own private copy of the work table so concurrent session access takes no special logic. With a permanent work table we have added a key column where each concurrent session selects a sequence value and uses it on every row and we have used dbms_lock to limit the process to one user in cases where only one user in the department needs to create the online report but multiple users will query the result.
    If this is not helpful then a fuller description of how the table is loaded and accessed may provide someone with the information necessary to provide you with a more useful reply.
    HTH -- Mark D Powell --
