External process execution and destruction (multi-platform)

I want to execute a program from within Java and later stop it, also from within the same Java program. Since some programs have no clean shutdown mechanism (say, an exit or stop command), I'm looking for a way to kill a process.
The process is created using:
process = Runtime.getRuntime().exec(.....);
The Process object can then be used for things such as reading the process's output and sending it input. It can also be used in a form like:
try {
    // check whether the process at index i in the list is still running
    Process p = (Process) l.get(i);
    int exitValue = p.exitValue(); // throws if the process has not yet terminated
    if (exitValue != 0) {
        // the process terminated with a non-zero (error) exit code
    }
} catch (IllegalThreadStateException e) {
    System.out.println("process is still running");
}
You can also do something like:
process.destroy();
However, on a UNIX system this will only destroy the shell that was used to launch the process, not the program the shell executed; that child process is then reparented to init. So process.destroy() by itself is not a good solution for killing off a process.
An example:
XXXXX  28393 28389  0 14:10 pts/4    00:00:00 /bin/sh /home/egandt/IdeaProjects/JumpstartKit/RunTime/Email_Server/james-2.3.1/bin/phoenix.sh run
XXXXX   28398 28393  9 14:10 pts/4    00:00:01 /opt/jdk1.6.0_02_amd64/bin/java -Djava.ext.dirs=/home/egandt/IdeaProjects/JumpstartKit/RunTime/Email_Server/james-2.3.1/lib:/home/egandt/IdeaProjects/JumpstartKit/RunTime/Email_Server/james-2.3.1/tools/lib -Djava.security.manager ....
egandt   28497 32547  0 14:10 pts/7    00:00:00 grep 28393
In this example, James is executed by running a wrapper shell script (phoenix.sh above), which starts the actual Java process. If I kill the process for the script (28393), the Java process (28398) is inherited by init (PID 1).
I'm running out of ideas for getting around this problem on UNIX. If I had the PID of the process behind the Process object, I could use it to find the child PID that is actually running, but that does not help on Windows. Any ideas?
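The still-running check and destroy() call described above can be sketched as a small self-contained example. (The use of a Unix `sleep` command as the child process is an assumption for illustration; substitute any long-running program.)

```java
public class ProcessCheck {

    // Returns true if the process has not yet terminated.
    // exitValue() throws IllegalThreadStateException while the process
    // is still running, which is the only way to poll a java.lang.Process
    // without blocking on pre-Java-8 JVMs.
    static boolean isRunning(Process p) {
        try {
            p.exitValue();
            return false;
        } catch (IllegalThreadStateException e) {
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        // "sleep" is assumed to exist (Unix-like systems)
        Process p = Runtime.getRuntime().exec(new String[] {"sleep", "60"});
        System.out.println(isRunning(p) ? "process is still running" : "finished");
        p.destroy();   // sends SIGTERM on Unix
        p.waitFor();   // reap the child to avoid a zombie
        System.out.println(isRunning(p) ? "still running" : "terminated");
    }
}
```

Note that when the exec'd command is itself a shell script, destroy() still only terminates the script's shell, as described above.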
Thanks,
ERIC GANDT

This bug is involved in the problem:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4770092
On current Windows versions there are two command-line tools that can identify PIDs and then kill processes: tasklist and taskkill. Open Windows Help and search for "tasklist" and "taskkill".
What you want to do is unreliable because of these OS-level problems.
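As a hedged sketch of the taskkill/kill approach: the command names and flags below are real, but the program structure is illustrative, and it assumes the target PID has already been discovered (e.g. via tasklist or ps). On Windows, taskkill's /T flag terminates the whole process tree, which is exactly what destroy() fails to do for shell-wrapped children. (Obtaining the PID from a java.lang.Process is the hard part on the JVMs of this era; one known non-portable workaround on Sun Unix JVMs is reading the private "pid" field of java.lang.UNIXProcess via reflection.)

```java
import java.io.IOException;

public class PidKiller {

    // Builds a platform-specific kill command for the given PID.
    // On Windows, "taskkill /T /F" forcibly terminates the whole
    // process tree, covering children spawned by an intermediate shell.
    static String[] killCommand(long pid) {
        String os = System.getProperty("os.name").toLowerCase();
        if (os.contains("windows")) {
            return new String[] {"taskkill", "/PID", Long.toString(pid), "/T", "/F"};
        }
        return new String[] {"kill", "-TERM", Long.toString(pid)};
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        long pid = Long.parseLong(args[0]); // PID discovered elsewhere (ps, tasklist, ...)
        Process killer = Runtime.getRuntime().exec(killCommand(pid));
        System.out.println("kill command exited with " + killer.waitFor());
    }
}
```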

Similar Messages

  • Externally processed operation and planned order

    Hello experts,
When an operation is defined as externally processed, once the production order (from a planned order generated via MRP) is created, the purchase requisition for that operation is automatically generated. My question is whether it is possible to generate the purchase requisition before creating the production order, I mean, with the planned order.
Although this external operation is defined in the routing, the planned order does not show it, which means the purchase requisition is not generated.
In our scenario we want to convert the production orders as late as possible to pick up automatic changes in the BOM, but if we do not convert the planned orders to production orders, we won't have these purchase requisitions.
    Is there any way to solve this?
    Regards

    Hi Laura
    No, it's not possible to generate the purchase requisition for an externally processed operation before the creation of the production order. The planned order routing is only exploded for scheduling and capacity requirements calculation and an operation is not actually created.
As an alternative, you can create a separate material number and assign special procurement key 30 (subcontracting) to this material. If you add this component to the BOM of the parent material, MRP will directly generate a purchase requisition for subcontracting.
    BR
    Caetano

  • OPM Process Execution and OPM Standard Costing for Poultry Business

I have a business requirement from a client at their poultry processing plant: the client feels that the OPM Execution steps are too cumbersome up to the point of closing the batches. The client wants to maintain a minimum number of batches in production: one for Primary, one for Secondary and one for Further Processing, rather than having multiple batches based on different formulas. There are 3 main types of processing, briefly described as follows:
    A.) Primary Processing:
    Live Birds (Ingredient) become Whole Dressed Chicken with Neck, without Neck, Liver, Heart, etc... (40 Finished Products)
    e.g. Live Birds (Broiler) -> Whole Dressed Chicken with/without neck, Whole Dressed Chicken with/without neck, etc.
    B.) Secondary Processing:
    Only one type of Whole Dressed Chicken (from Primary Processing) become Chicken Cuts (80 Finished Products).
    e.g.[Scenario B1] Whole Dressed Chicken (Broiler) -> Drumsticks, Thighs, Wings, Breasts, Backbone, etc.
    [Scenario B2] Breasts -> Filets and Deboned Breasts (2 FGs)
    [Scenario B3] Thighs -> Thighs in Plain Bag
    Please note that the Whole Dressed Chicken is also sold as a Finished Good as well as an Ingredient for the Secondary Processing.
    C.) Further Processing:
    Chicken Cuts become 40 Finished Products
    e.g Filets -> Chicken Nuggets, Chicken Fries, etc...
The client is using the Standard Costing method for OPM Financials, and all the finished products (160 in all) have different production costs. Please note that 1 kg of Chicken Wings does not cost the same as 1 kg of Chicken Thighs; different body parts have different costs. In order to reduce the maintenance of multiple batches per day on the production floor, the client wishes to have a minimum number of batches. Could you therefore confirm whether the approach below is correct?
    a.) OPM Process Execution:
    ==========================
    EITHER should I (Option 1):
a.) Create ONLY 3 Formulas for Primary, Secondary and Further Processing. The 3 Formulas will have all the processing-line Finished Products grouped together (as per the scenarios explained above). The Formulas for Primary and Secondary Processing can also be combined, reducing it to only 2 batches to maintain per day.
    b.) Can set any Quantity Values for the Ingredient, Product and By-Product in the Formula details with ANY Cost Allocation (amounting to a total of '1') in the Products section
    c.) Set the Validity Rules as 'PRODUCTION' for the 3 Recipes
    d.) Complete the steps defined in the 'OPM Cost Management' (as described below)
    e.) Create the 2 or 3 batches and record the appropriate quantities at the end of the day before closing the batches
    OR should I (Option 2):
    a.) Create MULTIPLE Formulas (above 100 formulas) for Primary, Secondary and Further Processing based on the different products processed.
    b.) Can set any Quantity Values for the Ingredient, Product and By-Product in the Formula details with ANY Cost Allocation (amounting to a total of '1') in the Products section
    c.) Set the Validity Rules as 'PRODUCTION' for the 3 Recipes
    d.) Complete the steps defined in the 'OPM Cost Management' (as described below)
    e.) Create the MULTIPLE batches and record the appropriate quantities at the end of the day before closing the batches.
    b.) OPM Cost Management:
    ========================
    Whether (Option 1 or Option 2) selected, the below needs to be set for OPM Costing:
    a.) Define multiple formulas (above 100), as in Option 2.
    b.) Set the Quantity value to be '1' for the Ingredient, Product and By-Product in the Formula details with the appropriate Cost Allocation in the Products section
    c.) Set the Validity Rules as 'COSTING' and 'PRODUCTION' for each Recipe
    d.) Run Cost Rollup at least once so that the products can have an item cost per unit
As per me, for the purpose of Costing, it would be imperative to have multiple batches (created one time only) with appropriate Cost Allocation in the Formulas, and the 'Recipe Use' in the Validity Rules should be set to 'COSTING'. Then, set the Profile Option 'GMF: Use Only Costing Validity Rules for Cost Rollup' to 'Yes'. In this way, we are sure that the different products in the formula with correct Cost Allocation will have their Item Cost calculated after performing the Cost Rollup. And, for the purpose of operations, we can use only one combined formula of Live Birds (as Ingredients) to yield all FGs for Primary and Secondary Processing with Cuts. But, this time, the 'Recipe Use' in the Validity Rules of the Recipe should be set to 'PRODUCTION'.
    I want to confirm which approach (Option 1 or Option 2) is more appropriate in terms of Operations and confirm that the above proposed steps are correct with no circular reference (as certain finished products are also used as ingredients for another product in the Secondary Processing)?
    Thanks and regards
    Raveesh Nobeen
    [email protected]
    Edited by: user12189219 on Jan 20, 2010 3:48 PM

    Hi Raveesh
I am implementing OPM R12 in a Poultry Processing business, and I think Option 1 (create 3 formulas/recipes/batches) is more appropriate. In my case we are using actual costing (Moving Average) as the costing method.
I have set the profile option GMF: Cost Allocation Factor Calculation to "Dynamic" to calculate the batch cost allocations as the ratio of the actual quantity of each product produced to the total batch output quantity, and I have used the OPM Financials Cost Allocation process to allocate GL expenses to different products based on a fixed percentage according to product "value", where Filets return more than Wings.
Can you please share your knowledge in this business area and confirm to what extent my approach is correct?
    Thank you and best regards
    Mamdouh Ragab

  • Owb process execution and alert message

    Hi,
Is there any provision in OWB to give an alert message during, or before starting, execution of a process? If there is, please let me know.
    Thanks
    Ram

    HI,
    There is no such option as of now.
You can maintain your own audit tables and register the messages of your processing there. Whenever you start or end a process, you can log a record in your defined audit tables.
    If you want to implement in mappings, you can use premap and postmap operators to log the details.
    Regards,
    Gowtham.

  • External Processing with material Provided to Vendor

    Dear All,
I have a problem with external processing in a maintenance order.
I have created an external processing operation and assigned one component to that operation, which I have to send to the vendor.
After saving the maintenance order, the system generates a purchase requisition for that operation.
That purchase requisition has account assignment category F, but it is not picking up item category L, and the component assigned to this external operation is not flowing into the purchase requisition.
    Please guide me to solve this issue.
    Thanks & Regards
    Bala

    Hi
For external services we can use control key PM03; for that we have to create a service entry sheet and proceed further.
But here my requirement is that, just like PP external processing, they are performing one operation outside.
If you look at a PP routing, in the operation overview there is external processing data and a subcontracting tick.
If we tick the subcontracting button, the system will copy the component assigned to that operation into the purchase requisition.
If we don't tick that button, the system will not copy the component into the PReq.
    Is there any option available here in PM?
    Please guide me.
    Thanks & Regards
    Bala

  • Just installed Mac OS X 10.8.5  on a Mac Pro 2010 platform.    The App Store shows there is an upgrade, so I click the download button.   After about 2 hrs the process stops and an  Error (102) appears on the screen.  Any idea what goes wrong?  THX


    ahstephen wrote:
    Thank you for the response.
    The upgrade I'm interested is for OS X  v.10.8.5...
    ...The App Store page shows 2 different upgrades:   
    Mountain Lion  (10.8.5)  Software Upgrade,  and
    Yosemite FREE upgrade
    If the App Store is showing 10.8.5 as an update, what do you currently have installed? The final update to Mountain Lion was 10.8.5, and since the basic OS installation of Mountain Lion is no longer offered in the App Store, that would suggest you're currently at an earlier version of Mountain Lion - 10.8.x where x=less than 5. If that's the case, I'd suggest getting the 10.8.5 update. There is also a Supplemental Update for 10.8.5 and that may be what the App Store is offering.

  • An Important AS3-based Multi-Platform Framework for Developers and Adobe

    Hello to all developers and to all representatives at Adobe!
    We're all fortunate for Adobe's progressive thinking and their immensely helpful programs to battle against device fragmentation.  In order to advance our efforts, we, as developers, must also permit our creative juices to flow to allow the cup of opportunity to runneth over -- not just for our and Adobe's benefit, but for the benefit of businesses and consumers who demand innovation.
There are, however, barriers in human form that deter potential developers, businesses, and consumers from embracing and capitalizing on such a powerful platform.  These obstructions aren't only coming from behind walled gardens; unfortunately, they're entrenched within our own camp.
Ignoring rules of optimization and taking careless shortcuts, combined with not possessing an optimized and simplified alternative to targeting a wide range of browsers and devices in one fell swoop, are the ammunition needed for skeptics to plant seeds of doubt about such a useful and future-proof ecosystem.
Much like any platform and object-oriented programming language, the Flash Platform and ActionScript 3 are powerful weapons that must be wielded with responsibility, or risk creating wounds that, in the minds of consumers, aren't easily healed.
And since this is OUR responsibility, I've decided to take on that burden...which then became a challenge...which soon became the innovative answer we so desperately need.
As an offering to my fellow developers, and as a proposal to Adobe -- the company to which I'm devoted -- I've created an OOP-based framework that simplifies the process of developing optimized applications that cover a broad spectrum of browsers and devices.
    It's called Cross Model View Controller™, or XMVC™ for short, and it offers the following features:
    -Imagine using one base source code -- which you build just once -- to target a multitude of platforms: from smartphones and tablets, to desktops and browsers, and even to smart televisions.
    -Your concrete View classes determine the layout of components and animations for each targeted device; once your base classes are built, your concrete View classes are the ONLY classes that require alteration (see diagram).
    -Utilize the flexibility of the XMVC UI Components to work across ALL devices and browsers.  You simply specify the platform type (mobile, desktop, browser, or television) in the concrete view classes, and the XMVC UI Components do the rest.
         Example: The XMVC Container (which holds child elements and incorporates scrolling): When set to "mobile," it incorporates touch scrolling; when set to "desktop" or "browser," it incorporates a scroll bar; when set to "television," its scrolling is controlled by remote control directional events, as well as incorporates virtual directional buttons.  And this is all from ONE instance of the XMVC Container component; this eliminates switching out various types of containers for each targeted device.
    -Animations are created using Greensock's TweenMax platform, for lightweight, optimized animations.
    -For complete orientation control, you can assign custom animations for orientation changes.  It even incorporates an Upside Down view for Android devices.
-You can allow users the preference of turning animations on or off with the flip of a switch (literally).
    -XMVC automatically removes event listeners for better Garbage Collection processing.
    -XMVC accepts various data structures, including XML, PHP/AMFPHP with SQL, SAP, and HTTP Web Services.  It would be incredibly beneficial to allow incorporation of ColdFusion, LiveCycle Data Services, and BlazeDS data structures, as well.
    Necessity is indeed the mother of invention, and possessing and utilizing a powerful tool is necessary to progress our efforts in reaching an abundant amount of individuals and entities.
    The primary functionality of the framework (as displayed in the diagram) is complete; improving and adding additional XMVC components are the primary objective at this point of the development stage.
    Very soon, I will upload the source code to an outlet (such as Google Code) for developers to download and experiment.  Within the same timeframe, anticipate demo apps within App Markets to test on your devices.
    Adobe has such an incredible development community, and my hope is that XMVC provides these developers, as well as Adobe, an incredible amount of leverage to persuade businesses, consumers, as well as other developers to embrace and utilize a platform that can withstand the change of time.
Imagine businesses excelling beyond their self-imposed barriers due to the robustness and flexibility of a platform and framework that can function in any environment.
    Imagine liberating consumers to allow them to use your applications whenever, wherever, and on WHATever they desire.
    Think of the possibilities of quickly submitting applications -- with very little, or no, alterations to your programs -- once device manufacturers like Windows Mobile, webOS, and Symbian finally accept Adobe AIR.
    If there are developers out there who share the same sentiment, and if there are Adobe representatives who find this framework intriguing and effective, by all means, contact me.
The possibilities are endless...so must be our efforts.
    Onward and upward,
    Adrian-Cortez Jackson
    [email protected]

    Thanks for posting.

  • Routing and Externally Process and Purchase requisition

    Dear Experts
    I try to set routing with External processing.
    I have 2 questions.
    I would really appreciate you, if you could answer my questions.
    I still have not understood the relationship between "Control key" and "Purchase Order"
    The routing is like below
    10 self manufacturing
    20 External Processing with the control key is pp02
    30 self manufacturing
In case I set "+" (externally processed operation) for "External processing" in PP02, the purchase requisition will be created.
However, in case I set "X" (internally processed operation / externally processed operation) for "External processing" in PP02, the purchase requisition will not be created.
My question is: in case I set "X" for "External processing" in the control key, how can I create the purchase requisition?
Should I create the purchase requisition manually?
In case I set operation 20 as external processing, what kind of work center should I register for operation 20?
Is a dummy work center needed, or is leaving it empty OK?
    Best Regards

    Dear,
When you maintain external processing in your control key PP02, the system will create a purchase requisition when you release the production order. For that particular external processing operation you need to maintain the purchasing group and info record details in the external processing tab of the routing.
    Please refer my reply from this link,
    Re: Control key for external process operation & internal.
    Regards,
    R.Brahmankar

  • Process Notification and Notification Wait activity - External Relationship

    Hi,
    I have a query relating to the Process Notification and Notification Wait activity.
In my process, after finishing 2 interactive activities, I need to send a notification to the instance waiting in a Notification Wait activity.
For this I'm using the ALBPM predefined Process Notification activity to send the notification.
I'm defining an instance variable and mapping it as an argument to the Notification Wait activity.
I have set the type of event to wait for as External Relationship, and defined a correlation at the Notification Wait activity by setting the Initiate property to Yes and defining the association with argument mapping. I then selected the same correlation in the Process Notification activity.
When I try to execute this, I always get the exception "Instance was not found for notification".
    Please help me to resolve this issue.
    Thank You,
    ~Kavitha

    Hi Matthias,
What you have experienced is exactly how it works: the notification is processed after the screenflow has finished.
I tested this a lot some time ago and was really happy that it worked well.
    Regards

  • LR5 multi-platform backup strategy

    > PC (2010):
    SSD: Intel SSD 120Gb (Installed apps: Windows 7, CS6 Master Collection, Lightroom 5)
    HDD1: 1TB (Lightroom gallery 1)
    HDD2: 1TB  (Lightroom gallery 2)
    HDD3: Backup 1 (3TB SDD + HDD1 + HDD3 + OS X)
    > rMBP (2013): i5 2.6 GHz/16GB/512GB (Lightroom temporary Gallery)
    OS X: (200GB): LR5, Windows 7 (300GB): CS6, LR5
    > External Western Digital HDDs (current backup strategy):
    WD1 2TB – Backup 2 (copy of Backup 1)
    WD2 1TB – Backup 3 (copy of Lightroom gallery 1, 2)
Hi guys, I have recently purchased Lightroom 5 and would appreciate some advice on how to set up a backup strategy prior to syncing all my images with it. Currently, the PC above is my primary machine (HDD1 and HDD2 hold the image galleries, and HDD3 is the primary backup). I also have a Mac where OS X and Windows 7 (Boot Camp) both have LR5 installed. Images stored on the laptop are temporary, for while I am out and about, and eventually get moved back to HDD1/HDD2 (with backups on HDD3 and WD1). I also have a spare 1TB external HDD that I would like to use exclusively for storing Lightroom files.
    My queries are as follows:
    1) What's the best way to move my laptop's (OS X and bootcamp) temporary LR5 files to my primary PC (HDD1, HDD2)? Can I do this over home network easily?
    2) Best way to sync all Lightroom files (from HDD1 and HDD2) with 1TB external HDD? This is so that I have a copy of my images on the go and can sync the laptop to external drive on longer trips.
3) As my images (along with other data) get backed up to HDD3 automatically, should I exclude this drive from syncing with LR5?
    4) Is there any backup software I can use to simplify this backup process?
    Cheers!

This is an interesting topic!
How do you solve this?
A user-friendly and multi-platform-friendly backup strategy for LR and PS is probably interesting for everybody.
(I am looking for help for about 5? main LR install and backup configurations ...)

  • Is there any way to stop a process execution (all instances)

    Hello,
I'd like to know if there is any way to stop a specific process from executing on the engine without needing to undeploy it, since we don't want to lose process instances when we start the process again later.
We have a PRD environment with a lot of processes from different departments (developed by different teams and external suppliers), and a feature to stop a specific process and isolate the environment would be very useful for root cause analysis when issues occur.
Sometimes stopping a specific process (or several of them) could help in investigating issues that cause the engine to malfunction (a lot of audit enabled, a badly controlled loop, a lot of concurrent access), but I could not see this option in the webconsole.
In version 5.7 a separate EAR was created for each deployed process, and stopping a process could be done by stopping the EAR created for it. Does anyone know how to do this in version 6?
    Thanks

Well, the bad news is you are right: there really isn't any way to do this in versions after 5.7.
Starting at 6.0, all projects are deployed under the engine EAR, so if you stop the engine, you stop all deployed projects.
I'm a little concerned that you are first seeing these issues in a PRD environment. Is this something you could set up in a DEV, UAT, SIT, or any other similarly built environment to recreate the issues? Then undeploy the other projects and isolate the problem...
    -Kevin

  • Error in using External Process in the Process Flow

    I Created a Process Flow with an external process to Move the file from one location to another location,
    I gave the below parameters for the External Process
    COMMAND: move
    PARAMETER_LIST: ?F:\\FlatFiles\\in\\company.txt?F:\\FlatFiles\\error\\company.err
    SUCCESS_THRESHOLD: 0
    SCRIPT:
    The environment is
    Windows 2003
    OWB 9.2.0.8
    OWF Builder 2.6
    When I deploy and execute using Deployment Manager, it gave me the below error
    Starting Execution TEST
    Starting Task TEST
    Starting Task TEST:EXTERNALPROCESS
    CreateProcess: move move F:\FlatFiles\in\company.txt F:\FlatFiles\error\company.err error=2
    Completing Task TEST:EXTERNALPROCESS
    Completing Task TEST
    Completing Execution TEST
Am I missing something here?
Are my parameters correct?
Please give me a link where I can find more on using the External Process activity.
    Please...please...help me..
    Shree
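One possible explanation for the error=2 (Windows "file not found"), offered as a guess rather than a confirmed fix: move is a cmd.exe built-in, not a standalone executable, so CreateProcess cannot launch it directly. Routing it through cmd.exe would look roughly like this (the ? separators follow the convention in the original post; the paths are the poster's own):

```
COMMAND: cmd
PARAMETER_LIST: ?/c?move?F:\\FlatFiles\\in\\company.txt?F:\\FlatFiles\\error\\company.err
SUCCESS_THRESHOLD: 0
SCRIPT:
```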

    Nikolai,
    I have created a simple process flow which only calls the external process. The script is on the same host as the process flow is deployed to.
I have used two different values for the command parameter.
    1. I placed the full path of the file in the command parameter and left the script parameter blank:
    COMMAND: /edwftp/ppas/scripts/ClearPPAS.sh
    PARAMETER_LIST:
    SUCCESS_THRESHOLD: 0
    SCRIPT:
    2.I placed the bash command in the command parameter and the full path in the script parameter.
    COMMAND: /usr/bin/sh
    PARAMETER_LIST:
    SUCCESS_THRESHOLD: 0
    SCRIPT: /edwftp/ppas/scripts/ClearPPAS.sh
    Both of these appear to work as they print out the statements inside the script but the files that are supposed to be removed still remain.
    Starting Execution EXTER_FILE
    Starting Task EXTER_FILE
    Starting Task EXTER_FILE:EXTERNALPROCESS
    Removing ActivatedAudit.dat...
    Removing ActivatedCustomers.dat...
    Removing ActiveAudit.dat...
    Removing ActiveCustomers.dat...
    Done!
    Create the Activated Customers data file...
    Create the Active Customers data file...
    Done!
    WARNING: Log file truncated - see RAB for further information.
    /edwftp/ppas/scripts/ActivatedCustomers.sh: /edwftp/ppas/log/ActivatedCustomers.log: cannot create
    /edwftp/ppas/scripts/ActiveCustomers.sh: /edwftp/ppas/log/ActiveCustomers.log: cannot create
    WARNING: Log file truncated - see RAB for further information.
    Completing Task EXTER_FILE:EXTERNALPROCESS
    Completing Task EXTER_FILE
    Completing Execution EXTER_FILE
The permissions on the /log directory are 775. The user I registered the file location with owns this directory.
I can't think of anything else I might have missed. I really appreciate your help :)
    Ryan

  • I accidentally updated iPhoto on my (now) external hard drive and now it won't let me access any of my photos. What can I do?

    I accidentally updated iPhoto on my (now) external hard drive and now it won't let me access any of my photos. What can I do?

    So, you have two versions of iphoto? One from the old machine and one from the new?
    If so, forget about opening with the older version of iPhoto.
    For the newer version:
    Download iPhoto Library Manager and use its rebuild function. This will create a new library based on data in the albumdata.xml file. Not everything will be brought over - no slideshows, books or calendars, for instance - but it should get all your albums and keywords back.
    Because this process creates an entirely new library and leaves your old one untouched, it is non-destructive, and if you're not happy with the results you can simply return to your old one.  

  • Multi-platform File Adapter

We are currently looking at the requirements for several new processes involving the File Adapter functionality. Our production BPM environment is Red Hat Linux 4, while this particular legacy system runs on Windows 2000. The legacy system is already configured to read and write XML files in Windows file shares. Some integration processes should kick off when XML files are written to the file shares; other processes will write XML files back into the file shares. I am looking for recommendations on the best architectural approach to this multi-platform problem within BPEL PM. As I see it, we have a few options:
    1.     Utilize FTP server functionality rather than direct file access to read and write the files in a platform-independent manner.
    2.     Use some other technology to bridge between the platform-specific file directories and something less dependent on platform. For example, pick up the files from the Windows directory and write them to an AQ queue. Then feed the BPEL process from the queue.
    3.     Run BPM on multiple platforms and allow the Windows instance to handle Windows file drops while the Linux instance handles Linux file drops. Obviously there is a cost penalty here as well as complexity during deployment.
    Any thoughts or experiences are welcome.
    Thank you.

Have you looked into relative paths? That would solve the issue of different OS path names.
Also, you might want to consider the deployment descriptors you can use when compiling; you can set environment-specific variables like paths in there.

  • Substitute variables for external process activity in process flows

    Has anyone used with success substitute variables such as ${Working.Rootpath} for external process activity?
    I can't get it working. Variables aren't substituted and my scripts fail.
    Sample value for parameter_list parameter for external process I use is:
    |${Working.Rootpath}|
    and in the script I get:
    ${Working.Rootpath}
    which is of course not what I expected.

The documentation says Working.Rootpath, while the name that actually works is Working.RootPath, so there is a bug in the documentation. It is ugly because it's hard to guess.
Thanks for your reply, Michael. I checked everything you described. Previously I had the Working location set to "Use default location"; when I changed it to the actual location, substitute variables started to work properly.
If I understand correctly, "Use default location" means: use the location associated with the process module. It works for execution, but not for substitute variables, so I think it is a bug.
The next thing is variables in the script itself. From the examples sent by Mark (script: cd ${Working.RootPath}...) they should be set in the environment and accessible to the shell. This doesn't work for me, but it is not described in the documentation and can easily be achieved by passing parameters, so that's not a problem.
One more question: should I open TARs and file bugs describing what we found?
