Batch processor design

Hi, my question is: is there a good resource for learning batch-processing design (e.g. an existing app, a case study, etc.)?
And how can we implement a rule-based engine using set logic?
I work at a company whose main processing is batch data processing in nature.
It receives a file feed, applies some business-logic processing, and then passes the result to another company's program to do the fulfillment.
When the main app was designed 8 years ago, the architect anticipated what kind of features it would need.
It is implemented in Oracle. It uses a rule-based engine to do this processing row by row.
It uses a set of dynamic SQL statements to implement every business rule. So if there are 10 rules, every record results in 10 PL/SQL calls against that set of dynamic SQL.
It also relies heavily on triggers for business logic, so a locking package was implemented to keep the batch processing from deadlocking.
When I had my interview, my boss said it is an OLTP machine, and in a way that is true, as it is designed to work row by row like OLTP even though it is batch processing in nature.
My co-worker and I both agree that it is a very sub-optimal design, as it does not use set-based operations even though the workload is set-based by nature.
I can see some good things about having multiple SQL rules: it is similar to extension in object design. If one SQL rule is good and we want to build on top of it, we can just insert new dynamic SQL into the table and it gets picked up by the processing. The drawback is that the performance is pretty sub-optimal.
Would you all share some experience on this topic?

> Hi, my question is: is there a good resource for learning batch-processing design (e.g. an existing app, a case study, etc.)?

Not that I am aware of.

> And how can we implement a rule-based engine using set logic?

Too general a question to be able to provide much of an answer, especially given that you may have a very specialized custom implementation of the process in use now.
Custom implementations usually present several issues for conversion projects (such as replacement with new technologies or functionality):
1. The original author/developer is no longer available.
2. The system lacks sufficient documentation.
3. The business rules being implemented are either not documented at all, poorly documented, or documented in a way that is out of date and no longer matches the code actually in use.
Rule #1 is "if it ain't broke, don't fix it". If you so much as touch the system, you run the risk of breaking it completely, introducing new errors, or making things worse.
{quote}
The drawback is that the performance is pretty sub-optimal.
{quote}
So what? See rule #1. Does the current system meet the required service-level agreements? Is there functionality needed that isn't currently available?
1. Determine if change is actually needed. In other words, what is the business reason for even thinking about changing things?
2. Determine the scope of the change that is needed. Are we talking about minor changes or mods? Or major new modules and functionality? Are the current business rules getting the job done, so that we only need a couple more?
3. Determine if the required changes can be implemented by modifying the current system. You said "it receives some file feed, then some business-logic processing, and then passes it to another company's program to do the fulfillment". So if you now need to add another feed for another company, and you need pretty much what you have now, why switch technologies? Well, maybe the volume is 10 times what you have now and the current system can't handle it. That would be a good case for doing the new feed as a separate project and using new functionality for it. You can run it in parallel to get the kinks worked out and, ultimately, convert the existing feed to the new code later.
4. Determine the level of functionality that the business rules apply to. Data cleansing is often primarily done at the row level, so this type of processing is very amenable to "set" processing: apply cleansing rule #1 to the entire table, then rule #2, and so on. That is a serialized "batch" process rather than applying 10 rules to each row one at a time. Checks for dirty data (invalid nulls, data too long for a column, invalid datatypes, etc.) can be done in bulk; see the sketch after this list. Other functionality requires data from multiple tables: parent/child validation, lookup-table validation, lookup-code conversions from source values to target values.
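To make the set-based idea concrete, here is a minimal sketch in Java/JDBC. The table name, rule SQL, and connection details are hypothetical; it assumes each business rule can be expressed as a standalone set-based UPDATE, so 10 rules mean 10 statements per batch run rather than 10 calls per row:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;
    import java.util.List;

    public class SetBasedRules {
        public static void main(String[] args) throws Exception {
            // Hypothetical rules: each is one set-based statement that
            // touches every qualifying row of the staging table in one pass.
            List<String> rules = List.of(
                "UPDATE feed_stage SET status = 'REJECT' WHERE amount IS NULL",
                "UPDATE feed_stage SET region = 'APAC' WHERE country IN ('SG','HK')",
                "UPDATE feed_stage SET status = 'OK' WHERE status IS NULL");

            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 Statement stmt = con.createStatement()) {
                con.setAutoCommit(false);
                // 10 rules => 10 statements per run, not 10 calls per row.
                for (String rule : rules) {
                    int rows = stmt.executeUpdate(rule);
                    System.out.println(rows + " rows affected: " + rule);
                }
                con.commit();
            }
        }
    }

The rules could just as easily be read from the same rules table the current system uses; the point is only that each rule runs once over the whole set.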
In short, you need to have a clear business objective and goals:
A. Project justification
B. Functional requirements document
C. Technical requirements document
D. And then define the project

Similar Messages

  • PS/Bridge CS5 Batch Processor Not Working / Missing Scripts

    I'm having a bit of a problem within Photoshop/Bridge CS5 and haven't found anything that works to resolve it.
    In Bridge, I want to launch the Photoshop Batch Processor via Tools>Photoshop>Batch.
    When I select this, Photoshop launches, but the dialog to process the images never shows. On occasion, if I return to Bridge and attempt the same operation again, I'll get a notification that there's already something in progress, asking if I want to queue this operation. The rest of the time (the majority) I can return to Bridge and attempt the operation again but get no dialogs in Photoshop. I can try the other options in this menu as well (e.g. Image Processor, etc.) and get the same results.
    Here's what I've done so far:
    1. Checked Bridge Edit>Preferences>Startup Scripts and ensured those were all checked. I even disabled all, closed Bridge and Photoshop, then re-enabled.
    2. Cleared all Preferences from Bridge and Photoshop (using Ctrl+Shift+Alt at launch).
    3. Reinstalled Photoshop (multiple times)
    4. Uninstalled Photoshop, ran the Adobe CS Cleaner application, then reinstalled Photoshop.
    It should be noted that in Bridge>Edit>Preferences>Startup Scripts, if I click the button to 'Reveal my scripts', the folder it takes me to is blank. Not sure if this folder is solely used for personal/user-created scripts or for others as well, but it's completely empty.
    I have also gone to Help>Updates and updated both applications with any necessary updates from Adobe.
    If I go into Photoshop and select File>Automate>Batch or even File>Scripts>Image Processor, both dialogues open fine.
    Running Windows 8.1 Pro.
    Has anyone else seen this and actually resolved the issue? Does anyone know where the scripts are supposed to be located and/or if I can get them from somewhere else? Any help is much appreciated.

    Make sure Photoshop is not started and then try to use the Bridge menu Tools>Photoshop>Batch. Does Photoshop start up?
    The Bridge CS5 menu Tools>Photoshop>Batch works on my Windows 7 Pro system.
    Photoshop's Automate>Batch is not a script, and Photoshop does not use any external Java program; it has its own scripting plug-in:
    c:\Program Files (x86)\Adobe\Adobe Photoshop CS5\Plug-ins\Extensions\ScriptingSupport.8li
    c:\Program Files\Adobe\Adobe Photoshop CS5 (64 Bit)\Plug-ins\Extensions\ScriptingSupport.8li
    However, there are two startup scripts involved, one for Bridge and one for Photoshop. You must have them in order to have the Bridge menu Tools>Photoshop> items.
    The two scripts on my system are:
    c:\Program Files (x86)\Common Files\Adobe\Startup Scripts CS5\Adobe Bridge\bridge-4.jsx
    c:\Program Files (x86)\Common Files\Adobe\Startup Scripts CS5\Adobe Photoshop\photoshop.jsx
    The photoshop.jsx seems to be the one that inserts the menu Tools>Photoshop>Batch into Bridge and invokes Photoshop's Automate>Batch. photoshop.jsx also inserts the other Bridge menu Tools>Photoshop> menu items that you state work.
    Here is what I see in my photoshop.jsx for Batch. I don't know or understand Bridge scripting and I don't know JavaScript, but what I see there seems to do some checking about Photoshop being available, and if it is not, the script just returns, which is what you seem to be experiencing. The test looks like this:
      if (!photoshop.correctPSIsAvailable (availabilityCheckOptions))
          return;
    The whole batch support section follows:
    //=================================================================
    // Batch
    // Sets up Photoshop's Batch automation command to be
    // accessed from the Bridge.
    //=================================================================
    // This routine takes an array of files. If called by Photoshop,
    // it will invoke Batch with the files. If called by
    // any other app, it will send a BridgeTalk message to Photoshop
    // to invoke this routine with the same arguments.
    photoshop.batch = function (/* Array */ files, availabilityCheckOptions)
    {
      try
      {
        if (availabilityCheckOptions == null)
          availabilityCheckOptions = new AvailabilityCheckOptions ();
        // Make sure this can be run.
        if (!photoshop.correctPSIsAvailable (availabilityCheckOptions))
          return;
        if (BridgeTalk.appName != "photoshop")
        {
          // Bring Photoshop to the foreground.
          BridgeTalk.bringToFront (photoshop.versionInfo.btTargetName);
          // Create a new BridgeTalk message for Photoshop to invoke
          // Batch with the selected files.
          var btMessage = new BridgeTalk;
          btMessage.target = photoshop.versionInfo.btTargetName;
          btMessage.body = "photoshop.batch (" + files.toSource () + ");";
          btMessage.send ();
        }
        else
        {
          app.bringToFront();
          photoshop.runActionCommand ('batch', files,
            localize ("$$$/PSBI/Automate/BatchSettingsFile=Batch via Bridge Settings"));
        }
      }
      catch (error)
      {
        if (error.number != 8007) // Don't report user-cancelled errors.
          alert (error);
      }
    }
    // This routine is called when the Bridge's Batch menu item is
    // chosen.
    photoshop.batchFromBridge = function ()
    {
      try
      {
        var availabilityCheckOptions = new AvailabilityCheckOptions ();
        // Make sure this can be run.
        if (!photoshop.correctPSIsAvailable (availabilityCheckOptions))
          return;
        var files = photoshop.getBridgeFileListForAutomateCommand (true, false);
        if (files.length != 0)
          photoshop.batch (files, availabilityCheckOptions);
      }
      catch (error)
      {
        if (error.number != 8007) // Don't report user-cancelled errors.
          alert (error);
      }
    }
    // The code below inserts the Batch menu item into the Bridge menus.
    if ((BridgeTalk.appName == "bridge") && (BridgeTalk.appVersion.indexOf (photoshop.versionInfo.bridgeTargetVersion) == 0))
    {
      // Use a temp function to keep vars out of the global namespace.
      photoshop.tempFunction = function ()
      {
        // Set up the info necessary for inserting this item into the Bridge's menus later.
        var menuItemInfo = new Object;
        menuItemInfo.text = localize ("$$$/PSBI/Menu/Automate/Batch=Batch...");
        menuItemInfo.name = 'Batch';
        menuItemInfo.onSelect = photoshop.batchFromBridge;
        photoshop.menuItemInfoArray.push (menuItemInfo);
      }
      photoshop.tempFunction ();
      delete photoshop.tempFunction;
    }

  • Batch Processor, WIA and Inferred entity instances

    Hi All,
    I'm playing around with the new OPA 10.4. I bumped into an error in the What-If Analysis document when I tried to add a new worksheet for an inferred entity. Is this not possible in WIA? Does that mean it will not work with the Batch Processor either?
    Here's my data model:
    global
    --- the applicant
    --- --- the qualification
    --- --- --- the exempted course (inferred instances from 'the qualification' attributes)
    Thank you in advance!
    Regards,
    Jerome

    Brad Tuckett wrote:
    Jerome wrote:
    Is there a possible workaround right now in 10.4 to persist inferred instances in the Batch Processor?
    The Batch Processor will not output any inferred instances in 10.4.
    One more thing: I'm also using 10.2 and have implemented an InferencingListener to create instances. Would you happen to know if I will bump into the same issue with DSC?
    There probably isn't a way of having entity instances created by an event listener appear in the Batch Processor output for database or CSV.
    However, you may be able to set the output to saved sessions. Although this isn't explicitly supported, I would expect that any instances you have created would show up in the exported sessions, which will be XML files in the XDS format. This creates one session file for each case.
    You can set the output to exported sessions by using "--export <dir>" on the command line, or by using the XML configuration for the Batch Processor.
    When 10.4.1 is released you should be able to switch to using inferred entities, which will be compatible with the Batch Processor.

  • Can anyone suggest how to fix batch processor performance?

    The DBAs examined the OPA batch process while it was running. Oracle was sitting idle most of the time waiting on the batch processor (according to the DBA staff). Locks were not an issue and the DB was not under any load.
    The performance is absolutely dismal and we are going to open an OPA support ticket, but we would appreciate a quick answer if anyone on this forum has one.
    (Also, if you kill the batch job with <ctrl-c> because it is taking too long, it leaves the sessions and cursors open. Is there a way to gracefully shut down the batch processor?)
    We have a single simple table where the first 5 columns are conditions and the last column is a conclusion.
    I filled the table with 1,000,000 rows of conditional data. I ran with a configuration like the one below:
    <configuration>
      <options>
        <rulebase>../output/MyRules.zip</rulebase>
        <database>
          <url>jdbc:oracle:thin:myusername/mypassword@mydatabase:1521:mySID</url>
          <driver>oracle.jdbc.OracleDriver</driver>
          <driversrc>\myjdbcdriverpath\ojdbc6.jar</driversrc>
        </database>
        <output type="db" />
        <mappings>
          <boolean-format true-value="TRUE" false-value="FALSE" />
          <mapping entity="global" table="SSPNA_IN" primary-key="ID">
            <attribute name="bHasNoCountableIncome" field="BHASNOCOUNTABLEINCOME" />
            <attribute name="bIsResidentOfNY" field="BISRESIDENTOFNY" />
            <attribute name="cSSIBenefitAmt" field="CSSIBENEFITAMT" />
            <attribute name="tFedLivingArrangementCD" field="TFEDLIVINGARRANGEMENTCD" />
            <attribute name="tManualHMOInd" field="TMANUALHMOIND" />
            <attribute name="tPaymentStatusCD" field="TPAYMENTSTATUSCD" />
            <attribute name="tPrincipleProviderCD" field="TPRINCIPLEPROVIDERCD" />
            <attribute name="cSSPNAPayment" field="CSSPNAPAYMENT" output="true" />
          </mapping>
        </mappings>
      </options>
    </configuration>

    This is definitely not something we've seen in the testing environment.
    Can you please raise a SR via Oracle Support and provide the following information?
    1. Product version. Version 10.4.1 added multi-processor support for the Java database connection, in case you were using 10.4.0.
    2. Oracle database version
    3. Is the database installed on a physical server or a VM? What's the machine specification, i.e. CPU and memory size?
    4. Is the SSPNA_IN table indexed?
    5. What is the logging level? Does it help if you change the logging level to ERROR?
    6. Please provide the rulebase project and DDL/DML to create tables and insert sample data.

  • Batch Processor and Custom Functions

    Hi,
    We're using the 10.4.2 Batch Processor, connected to an Oracle 11g database to handle large amounts of data.
    Recently, four custom functions have been added to the rulebase. Nothing complicated, just some string manipulation stuff (trim whitespace etc.). Each function is just a few lines of fairly simple code. No database calls, no web service calls or anything like that. It stays within the Java jar file.
    Suddenly the processing time of the BP has gone up from a couple of minutes to well over an hour. The only change to the rulebase has been the addition of the functions, nothing else.
    It almost looks like the BP is initialising the custom functions with every case that's processed. If so, this would be massively inefficient, and as a result we would not be able to implement our new functions.
    Has anybody else come across this before?
    Regards,
    Evert

    943413 (Evert) wrote:
    {quote}
    It almost looks like the BP is initialising the custom functions with every case that's processed. If so, this would be massively inefficient, and as a result we would not be able to implement our new functions.
    {quote}
    Evert,
    You are correct: it does initialise the custom functions for every case that is processed. This is necessary because every case is a separate determinations-engine session.
    However, you have control over how long the initialization takes. If you make the time-consuming parts of your custom function static, so that per-case initialization is fast, you will get a much faster custom function call. The custom function code is entirely within your control and you can make it as fast as you like; see the sketch after this message.
    Cheers
    Frank
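    To illustrate the pattern Frank describes, a minimal hypothetical sketch follows. The class and method names and the lookup-table loading are assumptions for illustration, not the actual OPA custom-function API; the point is that expensive setup lives in static state shared across cases, so per-case construction stays cheap:

      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;

      public class TrimWhitespaceFunction {
          // Loaded once per JVM, not once per case (hypothetical expensive setup).
          private static final Map<String, String> EXPENSIVE_LOOKUP = loadLookupTable();

          private static Map<String, String> loadLookupTable() {
              Map<String, String> m = new ConcurrentHashMap<>();
              // ... populate from a file or classpath resource ...
              return m;
          }

          // Per-case construction now does no heavy work.
          public TrimWhitespaceFunction() { }

          public String evaluate(String input) {
              String trimmed = input == null ? "" : input.trim();
              return EXPENSIVE_LOOKUP.getOrDefault(trimmed, trimmed);
          }
      }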

  • OPA 10.4 - batch processor and Entity

    Hello all.
    I have a problem with the use of Entity with batch processor.
    The fact is we want to use OPA with CSV inputs named like "TARIF_ONE.csv": one CSV per tariff calculation, and each time a different project that will be called by a batch process (the client's requirement), without using a configuration file!
    So, if I have understood the OPA batch process correctly, for a "TARIF_ONE" project to be fed with a "TARIF_ONE.csv" file, I need a "TARIF_ONE" entity that will receive all the data from the CSV input file.
    I've done so.
    But my problem is that when I call the OPA zip in batch, the TARIF_ONE entity instances are correctly fed with the data from the CSV file, but the calculations from the Excel file are not performed at all, and in an exported TSC file the entities (TARIF_ONE) are set to "unknown" for all the cases input.
    I've set up my project like this:
    My properties file is set up as follows:
    Global
    - TARIF_ONE (entity)
    --- INPUT_DATA_ONE (attribute) : Text = "Input data one"
    --- INPUT_DATA_TWO (attribute) : Text = "Input data two"
    --- OUTPUT_DATA_ONE (attribute) : Text = "Output data one"
    --- OUTPUT_DATA_TWO (attribute) : Text = "Output data two"
    And my Excel rules file is set up like below:
    - Declaration Sheet
    -- INPUT_DATA_ONE : Text = "Input data one from TARIF_ONE"
    -- INPUT_DATA_TWO : Text = "Input data two from TARIF_ONE"
    -- OUTPUT_DATA_ONE : Text = "Output data one from TARIF_ONE"
    -- OUTPUT_DATA_TWO : Text = "Output data two from TARIF_ONE"
    - Calculation Sheet
    -- OUTPUT_DATA_ONE = INPUT_DATA_ONE * INPUT_DATA_TWO
    As I said, when using the batch process, the input attributes for the TARIF_ONE entity are fed correctly, but the OUTPUT attributes are not (still unknown), and the entity itself is reported as "Unknown". Any ideas on how to solve this?
    Thank-you,
    Philippe,

    Philippe Carpentier wrote:
    Thank you Frank for your answer.
    I have just one more question: is it possible to use different entities in the zero-configuration mode?
    I tried with one data file per entity, and the entities are not created.
    I tried with all the data for the entities in the global.csv file, and the same.
    It seems to me that when using different entities in a project for batch, we have to use a configuration file. Can you confirm my feeling?
    Philippe,
    Edited by: Philippe Carpentier on 25 mai 2012 03:12
    You can definitely load entities other than global using zero configuration. To do this, create a CSV file that has the same name as the public name of your entity. For example, if you have an entity with the name "person", create a person.csv file containing a row for each entity instance.
    You will need a reference to the global instance to which each entity instance belongs. The way to do this is to have a column, named after the entity's containment relationship, that holds the foreign key to the entity instance's parent.
    Example:
    We have a global.csv with 3 globals (3 cases):
    #,(number_of_persons)
    1,
    2,
    3,
    And then a person.csv with 7 people, each of which must be linked to its owning (parent) global. This is done through the containment relationship with the public name "all_persons":
    #,person_name,person_age,all_persons
    1,Fred,27,1
    2,Barney,28,1
    3,Homer,40,2
    4,Marge,38,2
    5,Bart,10,3
    6,Maggie,1,3
    7,Lisa,8,3
    Each value in the column "all_persons" is a foreign key to the global instance which contains that person.
    For more information, see "CSV input for the Batch Processor" and "Zero-configuration conventions for CSV input" in the "Oracle Policy Automation Developer's Guide": http://docs.oracle.com/html/E29403_01/toc.htm

  • Processor design

    Hello all,
    I am interested in getting familiar with processor design.
    Can anybody suggest an appropriate place to start?
    Maybe a forum, web help, or a beginner's tutorial.
    Thank you.

    That's quite a broad question.
    I've seen some interesting articles on Wikipedia which are worth reading, and if you are looking specifically for the SPARC architecture you might find some useful information at:
    http://www.sparc.orc/
    .7/M.

  • Running the Batch Processor from a Java application (on schedule)

    Hello,
    I would like to run the Batch Processor on a schedule from a Java application, not from the command prompt.
    To achieve that, I suppose I would need to call a certain function contained in engine/determinations-batch.jar, perhaps using the path to the config XML as an argument.
    The trouble is that the OPA developer's guide doesn't mention it (or at least I didn't find it).
    Could you help me out?
    Thanks in advance.
    Razvan

    RazvanSaftescu wrote:
    Thanks Frank, but is there a way to identify this method and its syntax? I haven't found anything so far on the internet or in the dev guide.
    This is the Java main method of the determinations-batch.jar file, so it has the syntax:
    public static void main(String[] args)
    args is an array of strings matching the command-line arguments described in the OPA Developer's Guide topic "Configure the Batch Processor".
    Regards,
    Brad
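    For illustration, here is a minimal sketch of one way to do this. It reads the entry-point class from the jar's own manifest, so the class name does not need to be hard-coded; the jar path, argument list, and schedule are assumptions to adapt (see the "Configure the Batch Processor" topic for the real arguments). Note that if the jar's main method ever calls System.exit, you would need to launch it in a separate process instead:

      import java.lang.reflect.Method;
      import java.net.URL;
      import java.net.URLClassLoader;
      import java.util.concurrent.Executors;
      import java.util.concurrent.ScheduledExecutorService;
      import java.util.concurrent.TimeUnit;
      import java.util.jar.JarFile;

      public class ScheduledBatch {
          public static void main(String[] args) throws Exception {
              String jarPath = "engine/determinations-batch.jar";

              // Read the entry-point class name from the jar's manifest.
              String mainClass;
              try (JarFile jar = new JarFile(jarPath)) {
                  mainClass = jar.getManifest().getMainAttributes().getValue("Main-Class");
              }

              URLClassLoader loader = new URLClassLoader(
                  new URL[] { new java.io.File(jarPath).toURI().toURL() });
              Method mainMethod = loader.loadClass(mainClass).getMethod("main", String[].class);

              // Same strings you would pass on the command line (format is an
              // assumption; check the OPA Developer's Guide for the real flags).
              String[] batchArgs = { "batch-config.xml" };

              ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
              scheduler.scheduleAtFixedRate(() -> {
                  try {
                      mainMethod.invoke(null, (Object) batchArgs);
                  } catch (Exception e) {
                      e.printStackTrace(); // log and keep the schedule alive
                  }
              }, 0, 24, TimeUnit.HOURS); // e.g. run once a day
          }
      }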

  • Batch interface design

    We are starting a project to significantly redesign a system. Part of this will be to migrate several batch interfaces and data extracts, mostly from one source. Most are scheduled, but some will be service-based, and they will likely produce flat files that are then communicated to various external stakeholders. Traditionally we have created and stored queries that are called by an oraperl script.
    Would Integrator be a suitable tool for designing, developing, and maintaining the data extracts and interfaces?
    What are the licensing costs and arrangements for this product? Is it licensed on its own or included in other licensed products?
    Thanks, Brian

    It looks like ODI would be a very good fit for your environment:
    - ODI will generate queries as you do it today, but the generation and maintenance of these queries will be greatly simplified. In addition, you will benefit from the comfort of a graphical interface, and all the bells and whistles of a best of breed Data Integration tool. An important point though: you will always be able to control and modify what has been generated by ODI.
    - ODI processes can be scheduled (ODI scheduler or any third party scheduler like cron, control M, windows NT scheduler, etc.) or invoked as web services (all processes are automatically exposed by ODI as web services)
    - Regarding the distribution to external stakeholders, ODI can generate files (fixed, delimited, XML, ASCII, EBCDIC) but can also connect to an ESB, FTP the files or email them among other delivery capabilities.
    Licensing cost is $12,000 per database target CPU and $4,000 per database source CPU. (http://www.oracle.com/corporate/press/2007_feb/oracledataintegrator.html)
    I hope this helps
    Chris

  • Batch Automating Design update on 1000 InDesign files- Any ideas?

    I have just been brought in to a new company that issues one-page (and separately saved) biographies using InDesign. The biographies use a standard style, with a great deal of recurring information (company details, logo, etc.) and then some personal info that makes up the biography.
    The company recently underwent a re-brand, which now means some 1000 of these biographies are displaying incorrect logos, colour schemes, fonts, watermarks, contact information and company details.
    I want to amend this, upgrading them to the new style guidelines. And for my personal sanity I would like to automate the process as much as possible. Any ideas on how best to tackle this would be greatly appreciated!
    My thoughts are as follows:
    Partial automation: Perhaps I can export swatches, colour profiles and font settings from the updated templates and retrospectively apply them to all the old ones. This would get me a good deal of the way there, but would still leave the issue of changing all the new company details. If I could apply this in bulk I could flag the files for a small amount of editing next time they're needed.
    Partial automation: Perhaps with a better understanding of the master-pages tools I could create a default collection of swatches and spacing layout and import data into certain fields manually?
    Full automation: I would love it if some combination of the above and InDesign scripts could achieve this, but I am not optimistic. Perhaps the better idea is to use the exportallstories.js script to scrape all the relevant data from my old files. From there I would need to write a Python script to identify the relevant fields (namely: name, title, biography). I could then migrate these details to a spreadsheet and save it as a .csv file. From here I might finally be able to re-import it back into my new template style using the Automated Data Merge functions! It's a bit Heath Robinson and would require some careful thinking about Python expressions to get everything into the .csv correctly, but I think it's crazy enough to work. Any assistance with ideas here, or ideas about how I automate this to batch export to RTF (without 1000 tedious "OK" dialog-box clicks), would be greatly appreciated.
    Any help, advice or pointers to relevant literature would be greatly appreciated.
    Other Info:
    I am on CS5
    Windows 8 (i know... i know...)
    I can get a Python IDE up and running pretty quickly
    Am happy to migrate this to my native Linux for some parts of this process if there are tools for it, or if I can use the command line to get things done quicker!
    Thanks
    aBoL

    Thanks all for the replies; it's given me a great deal to think on.
    I have discovered that some of this information is helpfully stowed on their website. I am going to contact the IT guy; if I could scrape it right off the site, or better yet get access to their MySQL database or whatever, perhaps that would be the fastest way to make my magic .csv and then batch import it all.
    As for plans b-z:
    Eugene Tyson: I looked into what you suggested about using books. I've still got a problem with importing everything into one file; if anyone knows a handy script, that would be much appreciated. I still obviously need to read up on master pages; when I've been dragging the style it seems to sandwich both the new style and the old style together, so perhaps the data isn't clean enough anyway (cleaning 1000+ files would probably take longer than ctrl+c ctrl+v on all of them).
    Peter, you're exactly right. Dirty in, dirty out on data, right? I've found that, given these files were all made with the same workflow, there is a great deal of consistency. Whether it's 100% I cannot yet tell, but if I export using the story-export .js it seems I could either scalp off the bottom-most five files (which seem to inevitably contain the valued data) or ponder clever if-statements that detect whether something might be a name or just our Latin placeholder text. But yes, I figure a good deal of checking will be involved.
    Ariel, your help would be greatly appreciated and I will PM you and anyone else who would be willing to get stuck into this challenge!! The main thing for me right now would be the automated importing of files in bulk into a single story, perhaps via automated use of the other story-export script, or, if you were feeling extremely talented, doing all the work in InDesign itself: opening the files in a folder in sequence, stripping out the contents according to rules (which we could base on the regularity of the content) or placement (they're generally in the same area), and saving that out to a txt file or, even better, a .csv!!
    Anyway, I'm teetering between this being a fun challenge and an utter nightmare, but the looming idea of manual data entry is enough to propel me on. Any help, discussion or further thoughts would still be vastly appreciated!
    Thanks all and have a great weekend.

  • Why do I need to use GIMP to batch process?

    Dear Adobe,
    I'm grinding through thousands of display and template images for an online retailer. Photoshop layering and masks came in quite handy when I was making the templates,
    but now that it's got down to reusing those templates, I have to look elsewhere.
    1. Why does every other piece of software on the market have a batch-processing section, and yet you hide yours?
    2. Why does Image Processor refuse to resize both horizontal and vertical, although it claims to? It actually refuses to distort the image to resize, which I actually want it to do. I had to download GIMP to get this capability, as well as to specify the output file and change the dpi. These are all included in GIMP but not Photoshop. Thanks for that. And don't tell me to go to Actions. That may be useful for some things, but it's hardly the optimal way to do it: recording your exact actions, then saving or not saving, and then seeing what happens. The way I wanted it was in GIMP, and it's pathetic that 20 years in, Photoshop still doesn't have a credible batch processor. Or tell me where it is now. Charge extra for that, perhaps?
    3. What has changed in Bridge CS6 since CS4? I see absolutely nothing different, but then output is still limited to PDF and "web gallery", a dysfunctional glob of code bested easily by jQuery, but maybe you've improved that end. My last experience with web gallery was so dysfunctional that I'll just stick to fixing what's under the hood myself. I mean, web gallery used to be Flash, then something else that's not current. Let's just agree it's a hobbyist option. Of course, I don't know why "output" might not include other options for the batch processing of files, since batch rename does actually work, but you guys are the deciders.
    4. It's too bad. This job has taught me a lot of new stuff about layering, masks, and the functionality of the pattern section, which I have no great qualms with, though, considering how many companies you've bought, you could include tons of patterning capabilities that used to exist in those programs, but that would be too much expenditure for MGMT. Maybe let's just fix the window views we broke in CS4, and call it a new version.
    5. Thanks for forcing me to discover GIMP, who don't ask for a thing. I don't really like being in there yet, but we'll see if that changes with time. They actually have some different filters and rendering options that Adobe could've included years ago for fun, but again, there's just not enough money to go around at Adobe.

    Antinet wrote:
    Thanks to both of you for proving that the function doesn't exist.
    When the Image Processor resizes an image, it uses the Photoshop plug-in Fit Image to do the resize. Both height and width are resized and the image aspect ratio is preserved, so the image is not distorted. That means the image will fit within the height and width used in the dialog; only images with the same aspect ratio will fill the area. It sounds to me like you want the image to be exactly the height and width you entered in the dialog, in which case you need to use Photoshop features that can accomplish that. If you do not care that the image is distorted, use Image Size: check Resample, uncheck Constrain, and set the height and width you want to resize to. If you do not want too much distortion, you can try Content-Aware Scale. That will still distort the image somewhat, but it tries to maintain the perspective of areas where it finds objects, and you can help that process by selecting areas. If you want no distortion, you need to resize the image to fill the area and mask off the excess, virtually cropping the image to the area's aspect ratio. To automate that you need to use some Photoshop scripting. Knowledge is required to get the most out of programs like GIMP and Photoshop. The function doesn't exist because of your lack of knowledge.

  • Pro Ver 6 Batch Processing Error

    When running batch processing on .doc files, at the end of the run the batch processor's errors-and-warnings window shows a "bad parameter" message for each document, but it does not say what the bad parameter is. I can successfully convert the same documents manually, so I am not sure what to do. The log file is no help, as it just contains the same error messages. Running version 6.0.6 of Acrobat Pro, WinXP Pro and Office 2003. The documents are all single-page Word files with an embedded .tif file. The manually converted .pdf files look fine.

    Hi TAJ0808,
    There is a fix for it; go and ask support. We already have it installed.
    Regards, Chris

  • Batch trimming/bounce

    Hi,
    I have around 240 audio files that are around 31 seconds each. I now want to create a 10-15 second version of each file and can't think of a quick and easy way to do this without bouncing down every file again. I want to trim each file at exactly the same point and the only ways I can think of doing this are:
    1. Selecting each file in the audio window and trimming them at the same points.
    or
    2. Putting the files onto an individual audio track, selecting the bounce function and then entering the start and end points in the bounce dialog window.
    However, the number of audio tracks available to me is much lower than 240, and either way I would have my work cut out doing every track individually.
    Any ideas would be greatly appreciated,
    Jack

    Well, the cutting process was explained so it could be done in, like, 2 minutes.
    To answer your question, a procedure such as this could be done by creating a simple macro of key commands which could:
    step to the next region
    switch to the sample editor
    perform the fade
    switch back to the arrange window
    repeat 240 times
    If you had no batch processor, it could be done almost as easily (and almost as quickly) as using a batch processor. Of course, Accelerando integrates with several components of your OS and other applications, including Peak Pro and Sample Manager, both of which have very good batch capabilities.
    J

  • Using batch convert tool on my website

    Can anyone help me out with a few questions:
    1)  How can I use Batch Convert tool on my website where I will be able to process the files with an automated process?
    2)  How and from where can I get the API to embed on my website?
    3)  I want to automate the batch conversion process so that no human intervention is required. Can I do that?
    Appreciate the answers.

    The batch processor in Audition is an internal tool, not an external one, so no, you can't use it on your website; it's a DAW, not a web development kit. There's no API available.

  • Implication on batch setup

    Hi,
    I am wondering if there are any implications to the following.
    Presently we have a batch class set up in our Asia Pacific region (let's say this batch class is called A, and it has 5 characteristics), and we plan to roll out one project to one of our plants in one country. However, there is a requirement to set up a new batch class with a different specification (hence we call it batch class B, and it has 12 characteristics associated with it).
    Background:
    This plant does have interplant business transactions with other plants which are using batch class A.
    This plant shares the material master with other plants.
    Questions:
    1 - If we decide to endorse the idea of using batch class B for this new plant rollout, may I know if there are any implications, in terms of system setup conflicts or business process, with regard to batch classification?
    2 - Would it be better for this new plant to use batch class A, which every plant in the region is using, or to continue with batch class B?
    Thanks
    Tuff

    Please consider OSS note 115581 (No multiple classification for batches) before designing a new class.
