Command & Process flow?

Hi Experts,
I have some confusion regarding the control flow among the COMMAND nodes.
I have a Smart Form whose tree nodes are defined as follows:
1 - COMMAND (icon): GO_TO_NEXT_PAGE - go to next page (no condition is specified), and in the maintenance box on the right-hand side the Next Page parameter is set to SECOND_PAGE
2 - TEXT ELEMENT (icon): WRITE_MYTEXT1 - property detail description 1
3 - COMMAND (icon): GO_TO_SECOND_PAGE - go to next page (no condition is specified), and in the maintenance box on the right-hand side the Next Page parameter is set to SECOND_PAGE
4 - TEXT ELEMENT (icon): WRITE_MYTEXT2 - property detail description 2
5 - PAGE (icon): SECOND_PAGE - this is the second page
As per my understanding, control first comes to node 1,
then it goes to node 5 and processes the nodes under node 5,
then control comes back to node 2 and writes MYTEXT1,
then control encounters node 3, where it is directed to go to node 5, so
control goes to node 5 and processes it,
then control comes back to node 4 and writes MYTEXT2,
and then control encounters node 5 again and processes all the nodes under node 5.
Is this process flow correct?
If not, please clarify my doubt.
Thank you.

Hi Srinivas,
You haven't quite understood the behaviour of the COMMAND node.
With your node 1 you are calling SECOND_PAGE: when processing reaches this node it triggers SECOND_PAGE and printing continues there.
So WRITE_MYTEXT1 (node 2) is printed on SECOND_PAGE.
Then node 3 is executed: SECOND_PAGE is triggered again, and WRITE_MYTEXT2 is printed on a new page of type SECOND_PAGE.
You are expecting control to come back and then trigger SECOND_PAGE again at node 3, but it never comes back unless and until you call PAGE1 with a COMMAND node that goes to PAGE1.
Remember: wherever you call another page with a COMMAND node, a new page of the called type (here SECOND_PAGE) is started, and the remaining main-window nodes are printed on that SECOND_PAGE.
For all COMMAND nodes, use the Conditions tab: 'Only on Page PAGE1' if you want the node to execute only on PAGE1, or 'Only on Page SECOND_PAGE' if you want it to execute only on SECOND_PAGE.
The reason for this is that a Smart Form has only one main window, so all of its nodes exist on every page; if you want some nodes printed on PAGE1 and other nodes on SECOND_PAGE, you have to specify this explicitly on the Conditions tab.
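For example, purely as a sketch of that idea (which node belongs on which page depends on the layout you actually want), the two text nodes from your form could be conditioned like this:
WRITE_MYTEXT1 (TEXT ELEMENT) - Conditions tab: Only on Page PAGE1
WRITE_MYTEXT2 (TEXT ELEMENT) - Conditions tab: Only on Page SECOND_PAGE
so that each text prints only on the page it is meant for, even though both nodes sit in the single main window.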
I hope it is clear now; revert back if you want any clarification.
Please close this thread when your problem is solved, and reward all helpful answers.
Regards
Naresh Reddy K

Similar Messages

  • Execute Process Flow via command line

    How can I execute a Process flow from the command line?

As far as I know, once the job is set up in OEM it is submitted either through the OEM UI or by scheduling; I don't think there is a command-line interface. I would ask on the Enterprise Manager forum anyway.
    But why such complexity? All OWB maps keep the execution audit history in one place already.
    Nikolai

  • Call/Invoke an OWB Process Flow from a Windows Command/batch

    Hi everyone,
I'm a newbie in Oracle Warehouse Builder and I've developed some process flows to load data from one database to another.
One of these processes, called "MAIN", is the starting point of the execution. I need to call this process flow (MAIN) from the Windows command line
or a batch (.bat) file so that I can schedule it with the Windows Task Scheduler.
Does anyone have an idea of how to do this?
Thanks a lot!
Edited by: Maximiliano García on 24/02/2012 05:41 PM

Check this blog post, which shows how to execute via SQL*Plus; you can then wrap that in a batch script or whatever you use to schedule it.
    https://blogs.oracle.com/warehousebuilder/entry/how_to_execute_process_flow_from_sqlplus
    Cheers
    David
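For reference, such a wrapper might look roughly like the sketch below. The connect string, repository owner, location and process names are placeholders, and the exact location and parameter list of sqlplus_exec_template.sql vary by OWB release, so check the copy shipped under OWB_HOME\owb\rtp\sql before relying on this:
rem run_main.bat - hypothetical wrapper to schedule with Windows Task Scheduler
set OWB_HOME=D:\oracle\owb
sqlplus owb_rep_owner/password@orcl @%OWB_HOME%\owb\rtp\sql\sqlplus_exec_template.sql OWB_REP_OWNER PF_LOCATION PROCESS MAIN "," ","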

  • External commands in OWB Process Flow

    Hello *,
How can I embed an external command (e.g. a batch file) in an OWB process flow?
I know about the User Defined activity, but I didn't succeed in embedding this.
    Has someone an example for me?
    Thanks and regards
    Michael

Hi Thomas,
thanks for your reply.
I changed the parameters in Runtime.properties to NATIVE_JAVA as you said, but it is still not running.
I will give you a clear picture of what I am doing now.
I wrote a bat file called cmd.bat with the following commands:
d:
cd temp
copy var.dat d.txt
del var.dat
I put the path of this file (d:\temp\cmd.bat) in the SCRIPT parameter of the User Defined activity,
and the path of cmd.exe (c:\windows\system32\cmd.exe) in the COMMAND parameter.
I created a basic process flow (ch) with start, user defined and stop activities.
I just want to copy the data from var.dat to d.txt and delete var.dat.
Deploying and running this process (ch) from the Control Center Manager succeeds, but the actual copying and deletion from my cmd.bat are not happening.
I am using OWB 10gR2.
Any more suggestions please?
Thanks
gopi
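One configuration worth trying here (a sketch only, not verified against this exact environment): keep cmd.exe as the COMMAND, but pass the /c switch and the batch file through the '?'-separated PARAMETER_LIST, in the same style used for the external process activities in the other threads below, so that cmd.exe actually runs the batch file and then exits:
COMMAND: c:\windows\system32\cmd.exe
PARAMETER_LIST: ?/c?d:\temp\cmd.bat
SUCCESS_THRESHOLD: 0
SCRIPT: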

  • Cannot deploy process flow package specification from file

    Hi,
we are using OWB 11gR2 on Linux. We successfully managed to deploy mappings and tables into a specification file, and then deploy from that file into the target database (RAC). But when we try to deploy a process flow package we run into an error:
a process flow package can be deployed to a specification file, but a deployment from that specification file fails with the error "OMB05623: Cannot deploy specification from file. Exception follows. null".
    ### SETUP
    set CONN_DESIGN "xxx"
    set CONN_TARGET_DESIGN "yyy"
set CONN_TARGET_RT "xxx"
    set DIR "D:\\tmp"
    set RELEASE "V0_2"
    set file "$DIR\\$RELEASE"
    set plan "FILE_DEPLOY_$RELEASE"
    ### Deployment via control center: works!
    OMBCONNECT $CONN_DESIGN
    OMBCONNECT CONTROL_CENTER $CONN_TARGET_RT
    OMBCC '/KLINGEL_DWH'
    OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN '$plan'
    OMBALTER DEPLOYMENT_ACTION_PLAN '$plan' ADD ACTION 'PF_TEST' SET PROPERTIES (OPERATION) VALUES ('REPLACE') SET REFERENCE PROCESS_FLOW_PACKAGE '/KLINGEL_DWH/WF_DWH/PF_TEST'
    OMBDEPLOY DEPLOYMENT_ACTION_PLAN '$plan'
    OMBDISCONNECT CONTROL_CENTER
    OMBCOMMIT
    OMBDISCONNECT
### Deployment via specification file: fails!
    OMBCONNECT $CONN_DESIGN
    OMBCONNECT CONTROL_CENTER $CONN_DESIGN
    OMBCC '/KLINGEL_DWH'
    OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN '$plan'
    OMBALTER DEPLOYMENT_ACTION_PLAN '$plan' ADD ACTION 'PF_TEST' SET PROPERTIES (OPERATION) VALUES ('REPLACE') SET REFERENCE PROCESS_FLOW_PACKAGE '/KLINGEL_DWH/WF_DWH/PF_TEST'
    OMBDEPLOY DEPLOYMENT_ACTION_PLAN '$plan' AS SPECIFICATION TO '$file.xml'
    OMBDROP DEPLOYMENT_ACTION_PLAN '$plan'
    OMBDISCONNECT CONTROL_CENTER
    OMBCOMMIT
    OMBDISCONNECT
    OMBCONNECT CONTROL_CENTER $CONN_TARGET_RT
    OMBDEPLOY SPECIFICATION FROM '$file.xml'
### OMB05623: Cannot deploy specification from file. Exception follows. null
If we use the operation DROP it works, but REPLACE and CREATE do not.
We get the error regardless of whether the code is executed as an Expert or on the command line on a Windows client.
Does anyone know a workaround?
    Deployment via a specification file is a must due to license and architecture requirements.
    Thanks,
    Carsten.

It worked fine for me on OWB 10.2.0.4. Some steps are missing from your code; when you are connecting again it should be:
OMBDISCONNECT
OMBCONNECT CONTROL_CENTER $CONN_TARGET_RT
OMBDISCONNECT
OMBCC '/KLINGEL_DWH'
OMBCONNECT CONTROL_CENTER $CONN_TARGET_RT
I think you can try the same on a different system.
    Cheers
    Nawneet

  • Experts VS Process Flows on OWB 10g Release 2 (Paris)

I was looking at the new features list for OWB 10g Release 2 (Paris), and I noticed the new feature called Experts. Does anyone have any insight into when an Expert would be used instead of a Process Flow? It doesn't look like Process Flows are going away, so I am trying to get a grasp of the use for Experts.
    From Here:
    http://www.oracle.com/technology/products/warehouse/htdocs/owb10gr2%20new%20features.htm
    Experts. Experts are solutions that enable advanced users to design solutions that simplify routine or complex tasks that end users perform in Warehouse Builder. You can design these solutions in the Expert Editor which resembles the Process Flow Editor and shares many of the same commands and navigational tools.

    Hi,
Just to start with the Experts capabilities: they can be used for designing schema objects. I tried an Expert that samples the source files, creates external tables on top of them, then validates and deploys; it is very interesting to work with, although I am unable to see the practical use of this for end users.
Whereas, as we all know, a process flow can only be used for executing the deployed objects.
Mahesh

  • Error in using External Process in the Process Flow

I created a process flow with an external process to move a file from one location to another.
I gave the external process the parameters below:
    COMMAND: move
    PARAMETER_LIST: ?F:\\FlatFiles\\in\\company.txt?F:\\FlatFiles\\error\\company.err
    SUCCESS_THRESHOLD: 0
    SCRIPT:
    The environment is
    Windows 2003
    OWB 9.2.0.8
    OWF Builder 2.6
When I deploy and execute using the Deployment Manager, it gives me the error below:
    Starting Execution TEST
    Starting Task TEST
    Starting Task TEST:EXTERNALPROCESS
    CreateProcess: move move F:\FlatFiles\in\company.txt F:\FlatFiles\error\company.err error=2
    Completing Task TEST:EXTERNALPROCESS
    Completing Task TEST
    Completing Execution TEST
What am I missing here?
Are my parameters correct?
Please give me a link where I can find more on using the external process.
Please help me.
Shree

    Nikolai,
    I have created a simple process flow which only calls the external process. The script is on the same host as the process flow is deployed to.
I have used two different values for the command parameter.
    1. I placed the full path of the file in the command parameter and left the script parameter blank:
    COMMAND: /edwftp/ppas/scripts/ClearPPAS.sh
    PARAMETER_LIST:
    SUCCESS_THRESHOLD: 0
    SCRIPT:
2. I placed the shell command in the command parameter and the full path in the script parameter.
    COMMAND: /usr/bin/sh
    PARAMETER_LIST:
    SUCCESS_THRESHOLD: 0
    SCRIPT: /edwftp/ppas/scripts/ClearPPAS.sh
    Both of these appear to work as they print out the statements inside the script but the files that are supposed to be removed still remain.
    Starting Execution EXTER_FILE
    Starting Task EXTER_FILE
    Starting Task EXTER_FILE:EXTERNALPROCESS
    Removing ActivatedAudit.dat...
    Removing ActivatedCustomers.dat...
    Removing ActiveAudit.dat...
    Removing ActiveCustomers.dat...
    Done!
    Create the Activated Customers data file...
    Create the Active Customers data file...
    Done!
    WARNING: Log file truncated - see RAB for further information.
    /edwftp/ppas/scripts/ActivatedCustomers.sh: /edwftp/ppas/log/ActivatedCustomers.log: cannot create
    /edwftp/ppas/scripts/ActiveCustomers.sh: /edwftp/ppas/log/ActiveCustomers.log: cannot create
    WARNING: Log file truncated - see RAB for further information.
    Completing Task EXTER_FILE:EXTERNALPROCESS
    Completing Task EXTER_FILE
    Completing Execution EXTER_FILE
The permissions on the /log directory are 775. The user I register the file location with owns this directory.
    Can't think of anything else I have missed. I really appreciate your help :)
    Ryan

  • Error Deploying a Process Flow

    Hi All,
Each time I try to deploy a process flow I created, I get the error "ORA-12514: TNS:listener does not currently know of service requested in connect descriptor". The Oracle Workflow engine is on a separate machine and all connection parameters are correct. A tnsping of the workflow server works as required, and I can even connect to the server (SQL prompt) from the command prompt.
Also, I would appreciate any direction to a site where the steps for creating a process flow are detailed.
    Cheers!!!

    Have you installed the workflow server into the database?
    Have you configured the workflow with the configuration assistant?
    Have you linked the workflow in the design center?
    Is the workflow packaged configured with the location?
    !!! Note for Oracle Workflow 2.6.4:
    When you run the Workflow Configuration Assistant, you need to specify the database connect string in the TNS Connect Descriptor box.
    Enter the following values:
    Accept the default for Install Option
Workflow Account: (accept the default) owf_mgr
Workflow Password: owf_mgr
Password: <Enter the SYS account password>
TNS Connect Descriptor: localhost:1521:orcl
    Click Submit.

  • (Urgent) How to run the sqlldr script in an OWB process flow?

Dear all,
in my Oracle warehouse I have to load many *.dat files
into the database with sqlldr from an OWB process flow. In the process flow, I use the external process activity to run the sqlldr script with the following configuration:
    1:======external process==========
    command : /app/ftpfile/sqlldr2.sh
    parameter list:
    success_threshold:0
    script:
    ================================
    2:create a file location in FILE LOCATION node:
    =============
    ODS_LOCAL_LOC
    =============
3: in the runtime repository I register the location
============
user name: oracle (since sqlldr should run as the oracle user)
    password : oracle
    host name: localhost
    root path: /app/ftpfile/
    ============
    4:configure the process flow
    ============
    path settings
    working locations:ods_local_loc
    ============
After deploying them successfully to the runtime repository,
I run it and it shows me the following error:
    ==========
    SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
    ORA-12545: Connect failed because target host or object does not exist
    ===========
Please help me!
With best regards!

    Hello,
our developers were getting this error code just the other day. They are using the "sqlplus_exec_template" script to initiate these things. In our case, I had to do two things:
1) Modify their "initiator" script (the one that connects as the runtime access user and then calls the "template") - it has to use TNS connectivity, "user/passwd@service_name".
2) Create a TNS entry (server side) for the "service_name" above.
    Now these SQL*LOADER mappings run successfully.
    Alex.
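To make that concrete, here is a sketch of the two pieces; the alias, host, credentials and control file below are placeholders to adapt to your own environment:
# server-side tnsnames.ora entry (placeholder values)
ODS_LOCAL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = ods))
  )
# sqlldr2.sh - connect through the TNS alias instead of a bare username
export ORACLE_HOME=/app/oracle/product/10.2.0
export PATH=$ORACLE_HOME/bin:$PATH
sqlldr userid=ods_user/ods_pwd@ODS_LOCAL control=/app/ftpfile/company.ctl log=/app/ftpfile/company.log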

  • Process Flow - Executing a shell Script

    I have a simple shell script being called at the start of a process flow, up until a few days ago this worked fine across DEV, TEST and PROD.
    It's now stopped working in PROD.
It executes fine from the command line as an OS user, and when you inspect:
select * from all_rt_audit_executions where execution_name like '%SCRIPT_ONLY%' order by updated_on desc;
the return_result is OK, but the script has not executed (it creates a simple file, and I have added another command, 'id >> whoami.txt', at the end of the script).
    I've no idea what the sysadmin has been up to so can't comment on configuration changes on the box.
    Any ideas?
    Edited by: RodCouncil on Mar 16, 2009 4:51 PM

First off, if the admin has upgraded or re-installed the repository, it might have reset back to the default security properties.
The properties file is found in $OWB_HOME/owb/bin/admin/Runtime.properties; change the following parameters to NATIVE_JAVA:
    property.RuntimePlatform.0.NativeExecution.FTP.security_constraint = NATIVE_JAVA
    property.RuntimePlatform.0.NativeExecution.Shell.security_constraint = NATIVE_JAVA
    property.RuntimePlatform.0.NativeExecution.SQLPlus.security_constraint = NATIVE_JAVA
    Make sure that there are no trailing spaces on the line after NATIVE_JAVA.
Also, if they have upgraded software, could they have changed the default UNIX profile so that you are no longer finding executables? Or changed the file structure so that your script is in a different absolute location and OWB doesn't find it anymore? Or changed directory permissions so that the Oracle user can't see the script anymore?
    Just a few thoughts on things to check for....
    I'd ask your sysadmin to check the OWB security profile, and then to try running the script from the Oracle Unix account and let you know the results (assuming that you do not have access to the Oracle user).
    Edited by: zeppo on Mar 16, 2009 10:40 AM
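A quick way for the sysadmin to act on that last suggestion is to run the script directly as the oracle OS user and capture the exit code, for example (the paths are placeholders for wherever the script and its output file live):
su - oracle
/path/to/your_script.sh; echo "exit code: $?"
ls -l /path/to/whoami.txt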

  • User Defined Activity "hangs" Process Flow if it runs for more than 10mins

I have been using OWB (10gR2) for a while and have used User Defined activities in production process flows successfully before.
However, my problem now is that the activity (shell command) I want to perform takes more than 10 minutes to complete. If I run the process flow with a parameter that generates a shell command that executes in two or three minutes, the process flow continues past the successfully executed User Defined activity to the next activity.
BUT
If I run the same process flow (without modification or even redeployment) with a different parameter, the User Defined activity successfully executes (after about 11 minutes or so), but the process flow doesn't seem to know that it is finished and just says it is still executing.
    my "user defined" activity has
    COMMAND: /usr/bin/ksh
    PARAMETER_LIST bound to a variable which is set earlier in the process flow.
    RESULT_CODE:
    SCRIPT :
SUCCESS_THRESHOLD : 0
    by examining (RUNTIME CONTROL CENTER SCHEMA).all_rt_audit_execution_params
    I can see that the variable contents assigned to RESULT_CODE look like this
    '/usr/bin/rsh -n WINSRV1 dtsrun /usa /ppw /S WINSQLSRV /N JOBNAME_SHORT'
    or this
    '/usr/bin/rsh -n WINSRV1 dtsrun /usa /ppw /S WINSQLSRV /N JOBNAME_LONG'
I put the -n in because I thought that may have been the problem, but I think it is more likely that there is some sort of property in a runtime file that needs to be set to control the timeout of user defined activities
(like in Runtime.properties
property.RuntimePlatform.0.NativeExecution.Shell.security_constraint = NATIVE_JAVA
except maybe
.........NativeExecution.timeout = .....?????).
Has anybody had a problem like this? Has anyone found a document describing the available configuration properties for process flow activities?
Can anyone help?


  • FTP process flow not using registered userid

    Hi,
I posted the following last week at the end of a thread that Igor was answering, but haven't seen any replies yet. Can somebody please answer the question regarding the userid used on the target location when an FTP process flow is run?
    Thanks again.
    ==================
    Igor,
    I followed the examples given in this thread, and checked the case study in the PDF, but am still not able to 'get' a file using FTP to the desired location.
    I have an FTP work flow configured with the Path Settings pointing to: REMOTE LOCATION is a w2k server, WORKING LOCATION is on Unix. These locations are registered in the Deployment Mgr properly.
    I am able to PUT a file to win2k from Unix, but not GET.
This is caused by the fact that, during FTP workflow execution, it runs as the Unix user 'oracle', who does not have write access to the Root Path registered for the WORKING LOCATION. (I can do a get to /tmp.)
    Also, I am getting these messages in the execution log, even if the FTP was successful:
    WARNING: Log file truncated - see RAB for further information.
    ftp: ioctl I_PUSH ttcompat: No such device or address
    ftp: ioctl(TIOCGETP): No such device or address
All of these problems indicate that the runtime Unix user, 'oracle', doesn't have sufficient rights to various directories. Is it possible to force OWB to use the userid that was registered for the WORKING LOCATION?
    Thanks.

    Hi,
    Please follow the below steps.
    1. Kill the OWF process using the Oracle Workflow Monitor.
    - On the OWF Monitor home page, use "Find Process" to find the process
    - In the "Process List" page, click on the Process Name
    - Click on the "View Diagram" button
    - Click on the "Abort Process" button
    In the process list, the process should have a white-black flag.
    2. Connect as the Workflow schema owner and execute the following commands in order to purge the item type.
    - WF_PURGE.TOTAL package : deletes obsolete runtime data which includes: Items, Item activity statuses, Notifications, Expired activity versions.
    SQL> execute wf_purge.total
    - WFRMITT.sql script : deletes all definitions for an Item.
SQL> @<database_oracle_home>\wf\admin\sql\WFRMITT.sql
    3. Deploy the Workflow Process again from the OWB Deployment Manager.
    Thanks,
    Leo.

  • Where is a process flow's Execution details - Log section stored ?

    Hi,
    We're using OWB 11.2.0.1.7.
I want to know where in the database the Execution Details for a process flow are stored.
When I run a process flow with a Java activity or a User Defined activity and it halts with an error, I can inspect what went wrong in the process flow by opening the job in the Control Center client.
I double-click on the name of the job in the Control Center and the Job Details window opens.
I click on the first node, which is the name of the job (process flow).
Then I click on the blue 'i', aka 'Display Details'.
    That window has a tab called Log.
    The Log tab shows information that Java/Userdefined activity returns.
    (In my case return info from a javabased xmlparser, or exitinfo from a shell command.)
Unfortunately, this is the only place where I can find this info.
I want to know where this info is stored in the database. I tried the runtime audit views such as:
    select e.*, m.*, l.*
    from owbsys.wb_rtv_audit_executions e
    LEFT JOIN owbsys.wb_rtv_audit_messages m ON e.audit_execution_id = m.audit_execution_id
    LEFT JOIN owbsys.wb_rtv_audit_message_lines l ON l.audit_message_id = m.audit_message_id
    left join owbsys.wb_rtv_audit_message_params p on p.audit_message_line_id = l.audit_message_line_id
But although I can see my process flow, I cannot find the Log information here.
Can someone tell me where this info is stored?
    Edited by: MichaelR64 on 6-apr-2011 14:01

    Hi Michael
    The standard output and error streams for the Java activities are stored in
    -> ALL_RT_AUDIT_EXEC_FILES
    Is this what you are after?
    You can see the doc for it in the runtime public views section;
    http://download.oracle.com/docs/cd/E11882_01/owb.112/e10584/api_2runviews.htm#i707034
The FILE_TYPE column will be set to JavaErrorStream, for example, with the information that was written to the Java standard error stream, and JavaOutputStream will have the Java standard output stream. The same technique is used for other activities - SQLPlus, SQLLoader, FTP.
    Cheers
    David
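A sketch of the kind of query this points to (the FILE_TYPE values and view names come from the reply above and the linked documentation; the join column is an assumption, so describe the views first to confirm it):
-- list the captured Java output/error streams for a given execution
-- (columns other than FILE_TYPE are assumed; run DESC ALL_RT_AUDIT_EXEC_FILES to verify)
select f.*
from all_rt_audit_exec_files f
join all_rt_audit_executions e
  on e.audit_execution_id = f.audit_execution_id   -- assumed join column
where e.execution_name like '%MY_PROCESS_FLOW%'
  and f.file_type in ('JavaErrorStream', 'JavaOutputStream');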

  • Date parameter in a process flow

    Hi,
I'm working with OWB 10.1.0.2.0.
I have a process flow with a transform.
The transform has a date parameter, and I've defined a date parameter in the start activity, to which the transform's parameter is bound.
If I try to pass the date parameter as sysdate, the process doesn't recognize it, and in the all_rt_audit_execution_params table I can see that the process parameter has no value.
If I pass the date parameter in the format 2006/08/07, the process works fine.
I need to pass the date parameter from the start activity to the process activity using expressions with sysdate.
Can anyone help me? Please and thanks,
beatriz

    Hi,
there are several tasks you want to accomplish:
1. "I need to check if the parameter file exists"
For this, OWB has the File Exists activity (Process Flow). It checks whether a file with the specified name exists in the specified location. If so, it returns SUCCESS; if not, it returns a WARNING. In the second case you should sleep for a while and then re-check.
2. "accessing the date value from the parameter file"
There are several ways to do this. The easiest may be to declare an external table on this file and simply select from it (use it as a source in your mapping).
3. "After the ETL is finished running, the name of the parameter file needs to be changed"
In your process flow you create a User Defined activity. In it you start cmd.exe (on Windows) and execute any DOS commands you like, in this case a move or rename.
Hope I could help you.
    Bye,
    iOn.

  • Can not delete a parameter of a mapping in a process flow

    Hi,
I want to delete a certain input parameter of all mappings in a flow with Tcl.
I use this:
OMBALTER PROCESS_FLOW '$flow' DELETE PARAMETER 'IP_VERSION' OF ACTIVITY '$act'
If I test this in the OMB*Plus window, it says it deleted it; however, it is still there in the flow.
If I use this statement in a Tcl script, the script halts without a message.
It says:
Process Flow Parameter IP_VERSION deleted.
Process Flow PFP_MIKE_1 altered.
And when I retrieve the parameters afterwards, it is indeed still listed:
OMBRETRIEVE PROCESS_FLOW '$flow' ACTIVITY '$act' GET PARAMETERS
Can somebody confirm:
-- that I am using the proper syntax
-- whether this is a known bug in OWB 10gR2 10.2.0.1.31
Regards, Mike
    Nobody ?
    Edited by: MichaelR64 on 7-jun-2013 9:45
    Edited by: MichaelR64 on 7-jun-2013 9:46

    Nobody ?
I did some more testing.
The clause I am using is this: deleteGenericActivityParameterClause
deleteGenericActivityParameterClause = PARAMETER "QUOTED_STRING" OF ACTIVITY "QUOTED_STRING"
See here: http://docs.oracle.com/html/E14406_01/chap4003.htm for the 11g version.
My flow has calls to other mappings in it.
Both the mappings and the flow itself have this input parameter IP_VERSION.
It seems the OMB*Plus parser is making an error.
When I specify the name of the input parameter that belongs to the flow (and not to the mapping in the flow), while the rest of the command specifies the mapping activity as in the OP, OMB*Plus says that it deleted the parameter and that it altered the flow,
and it halts there.
But when I check the actual flow, nothing has changed.
When I specify the name of the parameter that the called mapping has, of course with the full command line as in the OP, OMB*Plus says it cannot find the specified parameter.
So it seems that the parser makes two errors:
when searching for the parameter it looks in the wrong place - it always looks in the flow section instead of the activity specified;
the 'OF ACTIVITY '$act'' part does not seem to be used.
The delete part of the instruction is also confused by this and says it deleted the parameter, but nothing actually happens.
Hope somebody can confirm/shed more light on this.
    Edited by: MichaelR64 on 6-jun-2013 17:07
