How to configure FTP in Process Flow

I'm having trouble configuring an FTP activity in my process flow. Any clue how that should be done?
The documentation does not say much. I added the following to my parameter list in the "Activity Panel":
/${Working.Host}=100.1.15.236/${Working.User}=XYZ/${Working.Password}=xxxxx/${Working.Rootpath}=C:\Nova\Data/${Remote.Host}=192.168.100.100/${Remote.User}=oracle/${Remote.Password}=xxxxx/${Remote.RootPath}=/export/home/oracle/
where "/" is the variable delimiter. The deployment went through successfully, but when I try to execute, the flow hangs at the FTP activity and generates an error when cancelled.
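Before debugging the activity itself, the remote side can be verified outside the process flow. This is only a sanity-check sketch, not anything OWB-specific; the values are the placeholders from the parameter list above. If this also hangs or fails, the problem is connectivity or credentials rather than the activity configuration:

```python
# Hedged sketch: verify the ${Remote.*} values independently of OWB.
# All concrete values are placeholders taken from the question.
from ftplib import FTP

def check_ftp(host, user, password, root_path):
    """Log in, change to the remote root path, and list it.

    Raises ftplib.error_perm (or a socket error/timeout) on failure,
    which is more informative than an FTP activity that just hangs.
    """
    with FTP(host, timeout=30) as ftp:
        ftp.login(user, password)
        ftp.cwd(root_path)
        return ftp.nlst()

# Example call (placeholder values):
# check_ftp("192.168.100.100", "oracle", "xxxxx", "/export/home/oracle/")
```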
Any help will be appreciated
Thanks,
Rene

URGENT!!!
I have included an FTP activity in my process flow, and I am able to FTP the file to the remote server.
I have two transitions that point to two end processes: one is End-Success and the other is End-Error. But even when the FTP activity completes and the file has been successfully sent to the remote server, the process still ends with End-Error.
And if the FTP activity cannot send the file for some other reason, the process also ends with End-Error.
Can anyone give me some clues?

Similar Messages

  • How to Configure alert for process failures

    Hi Friends.
    How do I configure an alert for process failures due to any reason, such as mapping or runtime exceptions?
    I am developing a file-to-flatfile scenario. In this scenario I am also using a mail adapter to send a daily mail to the admin reporting the record size and the number of records in parallel. I am not using BPM in this scenario.
    I have very little experience in XI.
    Please answer with clear steps so that I can do it.

    hi,
    check the links below for all the details and help on configuring alerts
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/9418d690-0201-0010-85bb-e9b2c1af895b - How to setup alerts for monitoring in XI 3.0
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/003de490-0201-0010-68a4-bf7a1e1fd3a5 - Monitoring in XI 3.0
    /people/michal.krawczyk2/blog/2005/09/09/xi-alerts--step-by-step - XI: Alerts configuration - Step by step
    /people/michal.krawczyk2/blog/2005/09/09/xi-alerts--troubleshooting-guide - XI: Alerts - Troubleshooting guide
    /people/bhavesh.kantilal/blog/2006/07/25/triggering-xi-alerts-from-a-user-defined-function
    /people/community.user/blog/2006/10/16/simple-steps-to-get-descriptive-alerts-from-bpm-in-xi

  • How to configure advance payment process in sd and fi

    hi,
    sap gurus,
    how to configure the advance payment process in SD and FI,
    how it is triggered from SD and FI,
    and the accounting entries are also needed.
    Please help me on this.
    regards,
    balaji.t
    09990019711

    Hi
    As far as my knowledge goes, this has a lot to do with FI configuration for down payments.
    Regards
    Aravind
    Assign points if useful

  • How to configure the "FTP" in process flow

    host1: OWB on Solaris 5.9
    host2: ERP system with Solaris OS
    I created a workflow process in OWB; the purpose is to use the "ftp" command to get a file from host2. The "ftp" activity in the workflow process has the following parameters:
    ===
    1:COMMAND : /usr/bin/ftp
    2:PARAMETER_LIST: ?"${Task.Input}"?
    3:SUCCESS_THRESHOLD: 0
    4:SCRIPT :
    open ${Remote.Host}
    ${Remote.User}
    ${Remote.Password}
    lcd ${Working.RootPath}
    cd ${Remote.RootPath}
    get local.login
    quit
    ======
    At the same time, I created two locations in the FILE LOCATION node, named "Res_REMOTE_LOC" and "Tar_LOCAL_LOC", then configured the workflow process:
    remote location: Res_REMOTE_LOC
    working location: Tar_LOCAL_LOC
    =====
    In the runtime repository I deployed the workflow process successfully, but when I run the workflow process, it runs forever!
    Why? Please help me!
    Thanks a lot

    Igor,
    I followed the examples given in this thread, and checked the case study in the PDF, but am still not able to 'get' a file using FTP to the desired location.
    I have an FTP work flow configured with the Path Settings pointing to: REMOTE LOCATION is a w2k server, WORKING LOCATION is on Unix. These locations are registered in the Deployment Mgr properly.
    I am able to PUT a file to win2k from Unix, but not GET.
    This is caused by the fact that, during FTP WF execution, it runs as the Unix user 'oracle', who does not have write access to the Root Path registered for the WORKING LOCATION. (I can do a get to /tmp.)
    Also, I am getting these messages in the execution log, even if the FTP was successful:
    WARNING: Log file truncated - see RAB for further information.
    ftp: ioctl I_PUSH ttcompat: No such device or address
    ftp: ioctl(TIOCGETP): No such device or address
    All of these problems indicate that the runtime Unix user, 'oracle', doesn't have sufficient rights to various directories. Is it possible to force OWB to use the userid that was registered for the WORKING LOCATION?
    Thanks.
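    A quick way to confirm the diagnosis above: run a small check as the same OS user that executes the workflow (e.g. 'oracle') against the Root Path registered for the WORKING LOCATION. This is only a sketch; the directory to test is whatever your location actually points at:

```python
# Can the current OS user create (and delete) a file in a directory?
# If this returns False for the working location's Root Path, the GET
# failure is a plain filesystem-permissions problem.
import os
import tempfile

def can_write(directory):
    """Return True if the current user can create a file in directory."""
    try:
        fd, path = tempfile.mkstemp(dir=directory)
        os.close(fd)
        os.remove(path)
        return True
    except OSError:
        return False
```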

  • How to schedule mappings to process flows?

    Hi,
    I have scheduled a calendar (job) which refers to a process flow. But how can I make sure that the mappings refer to the same process flow?
    E.g. I have scheduled a job at 10 AM, and I have created the process flow for 10 AM referring to the same scheduled job.
    My understanding here is there is a hierarchy: Scheduled Jobs > Process Flows > Mappings.
    I have configured the process flow to run under a scheduled job; now I want the mappings to run at the same time as that schedule.
    And also, when I start the process flow, all the mappings should get executed.
    Is there any parameter to tell the process flow that all these mappings fall under it?
    Hope I have made myself clear.
    Can anyone please look into this query?
    Thanks in advance.

    When I double-click and open my process flow I am not able to see any mapping. We have stored procedures written:
    ln_exists          NUMBER;
    LS_ERROR           VARCHAR2(200);
    LD_START_PERIOD_DT DATE;
    LD_END_PERIOD_DT   DATE;
    EX_PF_NOT_VALID     EXCEPTION;
    EX_SUB_PF_NOT_VALID EXCEPTION;
    EX_LAYER_NOT_VALID  EXCEPTION;
    EX_MODULE_NOT_VALID EXCEPTION;
    EX_DATE_FORMAT_ERR  EXCEPTION;
    BEGIN
      --1: Check the Process Flow parameter value
      IF IP_PF IS NOT NULL THEN
        SELECT COUNT(*)
          INTO ln_exists
          FROM adm_process_flow_par
         WHERE process_flow = IP_PF;
        IF ln_exists = 0 THEN
          RAISE EX_PF_NOT_VALID;
        END IF;
      END IF;
      --2: Check the Sub Process Flow parameter value
      IF IP_SUB_PF IS NOT NULL THEN
        SELECT COUNT(*)
          INTO ln_exists
          FROM adm_sub_pf_par
         WHERE sub_pf_code = IP_SUB_PF;
        IF ln_exists = 0 THEN
          RAISE EX_SUB_PF_NOT_VALID;
        END IF;
      END IF;
      --3: Check the Layer Code parameter value
      IF IP_LAYER IS NOT NULL THEN
        SELECT COUNT(*)
          INTO ln_exists
          FROM adm_lookup_code
         WHERE lookup_type = 'LAYER_CODE'
           AND lookup_code = IP_LAYER;
        IF ln_exists = 0 THEN
          RAISE EX_LAYER_NOT_VALID;
        END IF;
      END IF;
      --4: Check the Module Code parameter value
      IF IP_MODULE IS NOT NULL THEN
        SELECT COUNT(*)
          INTO ln_exists
          FROM adm_lookup_code
         WHERE lookup_type IN ('SOURCE_SYSTEM','SUBJECT_CODE')
           AND lookup_code = IP_MODULE;
        IF ln_exists = 0 THEN
          RAISE EX_MODULE_NOT_VALID;
        END IF;
      END IF;
      --5: Check Start Period Date & End Period Date format
      BEGIN
        IF IP_START_PERIOD_DT IS NOT NULL THEN
          LD_START_PERIOD_DT := TO_DATE(IP_START_PERIOD_DT, 'YYYY-MM-DD');
        END IF;
        IF IP_END_PERIOD_DT IS NOT NULL THEN
          LD_END_PERIOD_DT := TO_DATE(IP_END_PERIOD_DT, 'YYYY-MM-DD');
        END IF;
      EXCEPTION
        WHEN OTHERS THEN
          RAISE EX_DATE_FORMAT_ERR;
      END;
    EXCEPTION
      WHEN EX_DATE_FORMAT_ERR THEN
        LS_ERROR := 'Date Format is not valid, please check (FORMAT: YYYY-MM-DD HH24 /YYYYMMDDHH24)';
        SP_ERROR_REC(NULL, IP_PF, IP_SUB_PF, IP_MODULE, IP_LAYER, NULL, NULL, LS_ERROR, 'SP_CHECK_PARAMETER_VALID', NULL);
        RAISE_APPLICATION_ERROR(-20002, LS_ERROR);
      WHEN EX_PF_NOT_VALID THEN
        LS_ERROR := 'The Process Flow value is not valid, please check table adm_process_flow_par';
        SP_ERROR_REC(NULL, IP_PF, IP_SUB_PF, IP_MODULE, IP_LAYER, NULL, NULL, LS_ERROR, 'SP_CHECK_PARAMETER_VALID', NULL);
        RAISE_APPLICATION_ERROR(-20002, LS_ERROR);
      WHEN EX_SUB_PF_NOT_VALID THEN
        LS_ERROR := 'The Sub Process Flow value is not valid, please check table adm_sub_pf_par';
        SP_ERROR_REC(NULL, IP_PF, IP_SUB_PF, IP_MODULE, IP_LAYER, NULL, NULL, LS_ERROR, 'SP_CHECK_PARAMETER_VALID', NULL);
        RAISE_APPLICATION_ERROR(-20003, LS_ERROR);
      WHEN EX_LAYER_NOT_VALID THEN
        LS_ERROR := 'The Layer Code value is not valid, please check adm_lookup_code(lookup_type="LAYER_CODE")';
        SP_ERROR_REC(NULL, IP_PF, IP_SUB_PF, IP_MODULE, IP_LAYER, NULL, NULL, LS_ERROR, 'SP_CHECK_PARAMETER_VALID', NULL);
        RAISE_APPLICATION_ERROR(-20004, LS_ERROR);
      WHEN EX_MODULE_NOT_VALID THEN
        LS_ERROR := 'The Module Code value is not valid, please check adm_lookup_code(lookup_type IN ("SOURCE_SYSTEM","SUBJECT_CODE"))';
        SP_ERROR_REC(NULL, IP_PF, IP_SUB_PF, IP_MODULE, IP_LAYER, NULL, NULL, LS_ERROR, 'SP_CHECK_PARAMETER_VALID', NULL);
        RAISE_APPLICATION_ERROR(-20005, LS_ERROR);
    END;
    Can anyone throw some light on this issue?
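    Step 5 of the procedure above is just a format check. The same validation, sketched outside the database in Python (assuming the 'YYYY-MM-DD' mask the procedure uses), looks like this:

```python
from datetime import datetime

def is_valid_period_date(value, fmt="%Y-%m-%d"):
    """Mirror step 5 of the procedure: NULL (None) passes; otherwise
    the string must parse with the expected format, else it is rejected."""
    if value is None:
        return True
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False
```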

  • How to run procedure in Process flow

    Hello,
    I have a small procedure which I want to run between two different mappings. I want to achieve this in a process flow.
    My understanding was that this can be done using a transformation, but I didn't have much success.
    Please suggest.
    Thanks

    You are correct that this is done with a transformation in the process flow, and this is the way to do it.
    What kind of error message did you get?
    Was it something with the deployed location for the transformation not being set?
    If that's the case: in the Design Center, select the process flow, right-click, and choose Configure. Find the transformation and set the location where the procedure is deployed.
    Ragnar

  • How to configure FTP Adapter in BPEL PM 10.1.2 Beta3?

    Greetings all!
    I am working on a legacy integration using BPEL PM. I need to use the FTP adapter. There is little information in the Beta documents on the FTP adapter. Can anyone point me in the right direction on how to configure the FTP adapter, or where to find the documentation?
    Thanks in advance!

    For more documentation please send a mail to [email protected]

  • How to monitor the wrapper process flows

    Hi,
    I have a process flow (main wrapper) which contains two child process flows (Child_PF1, Child_PF2), which looks like this:
    start -> Child_PF1 -> Child_PF2 -> end
    I am able to run the process flows successfully. I want to monitor the process flows and the percentage of completion of each process flow.
    I am able to check the parent and child process flows separately, but I am unable to link the parent process flow to its child workflows. Can anyone explain how to link/track the parent process flow to its child process flows?
    When I open the parent process flow I am able to see the child process flow as an operator, but I am not able to view the contents of the child process flow further.
    Oracle Workflow server version is 2.6.4
    OWB version is 10.1.0.4
    Thanks in Advance,
    SriGP.

    Hi,
    the following SQL works with OWB 10.2.
    SELECT to_char(x.created_on,'dd.mm. hh24:mi') as created_on,
           --x.root_id as r_id,
           sys_connect_by_path(x.map_name, '/') AS NAME,
           step_name as target,
           --x.map_name as map_name,
           --x.run_status,
           x.number_records_updated + x.number_records_inserted + x.number_records_merged + x.number_records_deleted AS "#R",
           --x.number_records_selected AS "#S",
           --x.number_records_inserted AS "#I",
           --x.number_records_updated AS "#U",
           --x.number_records_merged AS "#M",
           --x.number_records_deleted AS "#D",
           NVL(x.elapse_time, round((x.updated_on - x.created_on) * 86400)) AS secs,
           x.status
      FROM (SELECT r.top_level_execution_audit_id AS root_id,
                   r.parent_execution_audit_id AS parent_id,
                   r.execution_audit_id AS audit_id,
                   r.created_on,
                   r.updated_on,
                   coalesce(substr(m.map_name, 2, length(m.map_name) - 2), r.execution_name) AS map_name,
                   r.return_result,
                   s.elapse_time,
                   m.run_status,
                   s.number_records_selected,
                   s.number_records_inserted,
                   s.number_records_updated,
                   s.number_records_merged,
                   s.number_records_deleted,
                   s.step_name,
                   s.run_status as step_status,
                   s.elapse_time as step_time,
                   s.step_id,
                   p1.VALUE AS p1,
                   coalesce(e.run_error_message, msg.message_text, s.run_status, m.run_status, r.return_result, 'RUNNING') AS status,
                   msg.message_text
              FROM all_rt_audit_executions r,
                   all_rt_audit_map_runs m,
                   all_rt_audit_map_run_errors e,
                   all_rt_audit_step_runs s,
                   (SELECT execution_audit_id,
                           message_text
                      FROM all_rt_audit_exec_messages
                     WHERE message_line_number = 1) msg,
                   (SELECT p.execution_audit_id,
                           p.parameter_name,
                           p.parameter_kind,
                           p.VALUE
                      FROM all_rt_audit_execution_params p
                     WHERE p.parameter_kind = 'CUSTOM'
                       AND p.parameter_name = 'OTIM_ID') p1
             WHERE 1 = 1
               AND r.execution_audit_id = p1.execution_audit_id(+)
               AND m.map_run_id = e.map_run_id(+)
               AND m.map_run_id = s.map_run_id(+)
               AND r.execution_audit_id = m.execution_audit_id(+)
               AND r.execution_audit_id = msg.execution_audit_id(+)) x
    WHERE 1 = 1
       AND x.created_on > trunc(SYSDATE) - 0
    CONNECT BY x.parent_id = PRIOR x.audit_id
    START WITH x.parent_id IS NULL
    ORDER SIBLINGS BY x.root_id DESC, x.audit_id DESC, x.step_id DESC
    Maybe you are lucky and it will also work with OWB 10.1.
    Regards,
    Carsten.
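    The heart of Carsten's query is the CONNECT BY walk that links each child run to its parent via parent_execution_audit_id. The same linkage, sketched in plain Python over rows already fetched from ALL_RT_AUDIT_EXECUTIONS (the sample rows and names below are made up for illustration):

```python
def build_paths(rows):
    """rows: list of dicts with 'audit_id', 'parent_id', 'name'.
    Returns {audit_id: '/parent/child/...'} -- the equivalent of
    SYS_CONNECT_BY_PATH(name, '/') in the hierarchical query, linking
    each child run to its parent wrapper."""
    by_id = {r["audit_id"]: r for r in rows}

    def path(r):
        # A root run (parent_id IS NULL) starts its own path.
        if r["parent_id"] is None:
            return "/" + r["name"]
        # Otherwise prepend the parent's path, as CONNECT BY does.
        return path(by_id[r["parent_id"]]) + "/" + r["name"]

    return {r["audit_id"]: path(r) for r in rows}
```

    With rows for a MAIN_WRAPPER run and its two child runs, the result maps each child audit_id to a path like "/MAIN_WRAPPER/CHILD_PF1", which is exactly the parent-to-child tracking asked about.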

  • RPE-02248: FTP in process flow deactivated by db-admin

    Hi,
    when starting a process flow with an FTP activity I get the error RPE-02248, saying that the FTP activity has been deactivated by the DB admin because of security risks.
    But there is no hint of where and how to activate it!
    Can anybody helps me, please?
    Regards,
    Detlef

    Hello!
    Metalink gives the following solution to this problem:
    To enable these activity-types you need to edit the file:
    <owb-home>/owb/bin/admin/Runtime.properties
    To enable your ftp activity, you need to change the value of the property
    property.RuntimePlatform.0.NativeExecution.FTP.security_constraint
    to NATIVE_JAVA.
    Change the below parameter from:
    property.RuntimePlatform.0.NativeExecution.FTP.security_constraint = DISABLED
    to
    property.RuntimePlatform.0.NativeExecution.FTP.security_constraint = NATIVE_JAVA
    You need to modify the file on the Oracle Home running the Control Center service,
    not the client, and then bounce the Control Center service using:
    <owb-home>/owb/rtp/sql/stop_service.sql
    <owb-home>/owb/rtp/sql/start_service.sql
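    The edit itself is a one-line change to a properties file. A small sketch of that change (the key name is the one quoted above; reading and writing the file, and bouncing the Control Center service afterwards, are left to you):

```python
# Flip the FTP security constraint in Runtime.properties text from
# DISABLED to NATIVE_JAVA, leaving every other line untouched.
KEY = "property.RuntimePlatform.0.NativeExecution.FTP.security_constraint"

def enable_ftp(text):
    """Return the properties-file text with the FTP activity enabled."""
    out = []
    for line in text.splitlines():
        if line.strip().startswith(KEY):
            line = KEY + " = NATIVE_JAVA"
        out.append(line)
    return "\n".join(out)
```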

  • How to configure CVC automation process using SNP master data and BI customer master data

    Hi Gurus,
    How do I configure CVC automation in the MPOS structure technically, using SNP master data (such as product and location details) in CVC creation,
    and in parallel extract data from ECC to BI customer master data in DP? The APO BI cube should contain only new combinations, which are to be validated against ECC and against the existing combinations in the MPOS structure before CVCs are created in different regions of the MPOS.
    Second, the automation should also validate that the products and locations are part of the SNP-ECC master data and that the combinations are genuinely new.
    Could someone guide us through this process?
    Thanks
    Kumar

    Praveen,
    The short answer is to move the data into infoproviders or flat files for all of the 'characteristic-type' source data; and then copy CVCs from the infoproviders into the Planning area, probably using one or more Process Chains (program /SAPAPO/TS_PLOB_MAINTAIN).
    The long answer is that this is not a trivial undertaking; this could end up being a pretty involved solution.  If you do not have enough BW/DP expertise available locally to create this solution for you, then I recommend you consider engaging external resources to assist you.  I personally wouldn't even consider starting to work on such a solution without first knowing a lot more about the detailed business requirements, and about any existing solutions that may already be in place.  An SCN forum is not really suitable for such an answer.  In my opinion, the BBP doc alone would be 20+ pages, assuming no enhancements.
    Best Regards,
    DB49

  • How to configure  Oracle BPEL Process Manager for JBoss 4.2.1

    Can anybody help me configure Oracle BPEL Process Manager for JBoss 4.2.1?

    Look here:
    http://download.oracle.com/docs/cd/E10291_01/doc.1013/e10538/toc.htm
    Marc
    http://orasoa.blogspot.com

  • How to configure a status process

    I would like to configure a control process using a percentage (progress) bar for a VI.

    Hello Novice Programmeur,
    The file that I sent to you is a ".LLB" file: it saves all VIs, objects, and external subroutines, excluding those in the vi.lib folder, to a library. It allows saving a hierarchy of VIs (main VI and subVIs) in one file. This kind of file can be opened by LabVIEW.
    You will find in the attached zip file the same program with the main VI and subVIs. Launch and execute the VI "ProgressBarExample".
    Best Regards,
    Sanaa TAZI
    National Instruments France
    Sanaa T.
    National Instruments France
    Attachments:
    test.zip ‏54 KB

  • How to configure CF8 to process all URL requests?

    Configuration: Apache frontend, ColdFusion MX 8 application server.
    My Application.cfc defines a cffunction OnRequestStart to check every URL (to ensure the user is allowed access). The function works fine for .cfm files (because the jrun-handler specifies .cfm files). However, I need every URL sent to Apache that contains a specific url-pattern (e.g. /secureMe/*) to be forwarded on to ColdFusion so that the OnRequestStart function will check for user access.
    How do I configure ColdFusion (e.g. jrun-handler, web.xml, jrun-web.xml) to process every URL?
    I've tried <servlet-mapping> and <virtual-mapping> and even tried out CFFileServlet, but I cannot get ColdFusion to do anything with directories and non-cfm files.
    Any solutions and suggestions would be greatly appreciated.

    Originally posted by: Newsgroup User
    devodl wrote:
    >> I had already tried to use the servlet mapping you suggested:
    >> <servlet-mapping>
    >> <servlet-name>CfmServlet</servlet-name>
    >> <url-pattern>/secureMe/*</url-pattern>
    >> </servlet-mapping>
    >> in the hopes that the OnRequestStart function would process the URL. However, it seemed that ColdFusion would ignore the URL because it did not contain the *.cfm pattern.
    > Correct, this only serves to get the request handed off from the webserver to JRun, not to select the correct servlet in JRun.
    I had gotten that far but ran into the need for a servlet, which I may need to avoid (see below).
    >> Did you mean rewrite the original URL from /secureMe/private/files/foo.pdf (which is what I want to protect)
    >> to something like /secureMe/index.cfm/private/files/foo.pdf,
    >> or did you mean rewrite it to /secureMe/index.cfm?/private/files/foo.pdf,
    >> so that the original URL is passed to index.cfm as a QUERY_STRING argument?
    > Whichever you want. Make sure that SES URLs are enabled in web.xml if you choose the first option.
    Since I have control over the form of the links to non-cfm files, I decided to pass the path of the non-cfm file as an argument (QUERY_STRING) to a cfm template: /secureMe/checkAccess.cfm?/private/files/foo.pdf
    Then the checkAccess.cfm template can do the work necessary to verify user access to non-cfm files.
    >> I am concerned that my OnRequestStart function is being bypassed when the URL is for non-cfm files, and that using a UrlRewriteServlet to rewrite the URL and pass it to index.cfm will not address the problem.
    > It will, the UrlRewriteServlet is executed first.
    Thanks again, this is good to know.
    >> If I must use a servlet to catch the incoming URLs for non-cfm files, then I might as well scrap use of my OnRequestStart function (for cfm files) and simply use my own servlet to check user access for all incoming URLs.
    > If you have the skills to write a servlet, that is absolutely preferred above hacking this into CF.
    I have the skills and have written many servlets for Tomcat using Struts, etc., but the ColdFusion admin group likes to use everything OOTB (Out Of The Box). When I speak of servlets they get a glazed look in their eyes...
    While I am quite comfortable creating a WAR file and having the J2EE container perform the deployment, it seems to be a bit more complicated with ColdFusion (i.e. include the runtime bits and deploy to JRun). I am not sure the admin group is ready to tackle something that formidable. Development and deployment of my web application to the ColdFusion servers is much easier if I simply send the admin group a zip file of components, templates, css, etc. and just have them unzip it onto the server.
    >> I was hoping that the Apache mod_jrun connector would be configurable to handle more than just file extensions. Ideally I would like the ability to configure the mod_jrun20 connector in Apache to handle the /secureMe/* pattern, since it would be a cleaner solution.
    > It should be able to handle wildcards.
    > Jochem
    I tried, and it seems that the AddHandler jrun-handler setting only handles file extensions. I read somewhere that the mod_jrun connector only does two things:
    1 - Pass the request to JRun to see if it should be picked up by a servlet mapping
    2 - Pass the specific filetypes (e.g. .cfm, .cfml, .cfswf) to ColdFusion
    Once again thank you for all your help and advice. It is greatly appreciated.
    Steve Deal
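    The gatekeeper approach settled on above (the protected path arrives as a QUERY_STRING argument to checkAccess.cfm) still needs one core check before serving a non-cfm file: that the requested path cannot escape the protected root once normalized. A minimal language-neutral sketch of that check, with a hypothetical root:

```python
import os.path

SECURE_ROOT = "/secureMe/private"  # hypothetical protected root

def is_allowed(requested):
    """Reject anything that escapes the protected root after
    normalization (e.g. '../' traversal) -- the check a gatekeeper
    template like checkAccess.cfm would need before serving a file."""
    full = os.path.normpath(os.path.join(SECURE_ROOT, requested.lstrip("/")))
    return full == SECURE_ROOT or full.startswith(SECURE_ROOT + "/")
```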

  • How to configure FTP/VAN Adapter for SFTP connection?

    Hi,
    I am new to the use of Seeburger's FTP/VAN adapter. I read SDN threads where the FTP/VAN adapter can be used for SFTP connections. Could anyone assist me in the steps to configure the communication channels for SFTP connections?
    Scenarios:
    1. R/3 -> XI -> File (SFTP server)
    2. File (SFTP server) -> XI -> IDoc
    There is no VAN connection involved. I am just trying to utilize the adapters my PI system currently has for my interface requirement.
    Is the configuration as simple as configuring the communication channel, or are there scripts involved? If there is scripting involved, is there a benefit to using scripting in the FTP/VAN adapter rather than XI's own File adapter?
    Please assist. Thank you.

    Hi Andrew,
    The standard PI adapter does not support connecting to an SFTP server; we have to use a third-party adapter. Do you have any third-party adapters?
    Many companies provide SFTP adapters, and SAP PI supports them.
    If you don't have a third-party adapter, there is another alternative; refer to the blog below:
    /people/daniel.graversen/blog/2008/12/11/sftp-with-pi-the-openssh-way
    Regards,
    Raj

  • How to configure FTP Adapter for multiple endpoints?

    Scenario: I have multiple FTP endpoints (5-15) and, depending on some business logic, I have to send data to the right destination. With the default FTP adapter configuration, I have to add an adapter connection factory, reachable via JNDI, for each destination. But I want to read the different endpoints dynamically from e.g. a database and configure my FTP adapter at runtime, because the endpoints change quite often.
    Interconnect had such a feature: FTP Sender: The FTP adapter supports sending to multiple FTP endpoints. This feature provides flexibility for sending messages to different remote FTP servers.
    Any ideas on that for SOA Suite?
    Thanks,
    Torsten
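    One way to sketch Torsten's requirement, independent of any particular adapter: keep the endpoint definitions in a lookup loaded at runtime (a dict here stands in for the database table) and resolve the destination per message by business key, instead of declaring one JNDI connection factory per destination. All names below are invented for illustration:

```python
# Endpoint registry; in practice this would be read from a database
# table so endpoints can change without redeploying anything.
ENDPOINTS = {
    "EU_ORDERS": {"host": "ftp.eu.example.com", "user": "eu", "path": "/in"},
    "US_ORDERS": {"host": "ftp.us.example.com", "user": "us", "path": "/in"},
}

def resolve_endpoint(business_key):
    """Pick the FTP endpoint for a message based on business logic."""
    try:
        return ENDPOINTS[business_key]
    except KeyError:
        raise ValueError(f"no FTP endpoint configured for {business_key!r}")
```

    The actual transfer would then be driven entirely by the resolved dict, which is what makes the endpoint set changeable at runtime.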

    For more documentation please send a mail to [email protected]
