File Adapter to BPEL interaction issue in a high-availability environment

Hi all,
I would really appreciate your help on an issue I'm facing with a composite (SCA) deployed in a clustered environment configured for high availability. To help you understand the issue, I'll briefly describe what my composite does. Composite instances are started by an inbound File Adapter which periodically polls a directory to check whether any file matching a well-defined naming convention is available. The adapter is not meant to read the file content, only its properties. Furthermore, the adapter automatically makes a backup copy of the file and does not delete it. The properties read by the adapter are handed to a BPEL process, which obtains them through the various "jca.file.xyz" properties (configurable on any BPEL receive activity) and stores them in some of its process variables. How the BPEL process uses these properties is irrelevant to the issue I'd like to bring to your attention.
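For reference, this is roughly how the receive picks up the file metadata (a simplified sketch; the partner-link and variable names are illustrative, and the exact property syntax differs between BPEL 1.1 and 2.0):

    <!-- BPEL 1.1-style sketch: copy File Adapter header properties into process variables -->
    <receive name="ReceiveFileEvent" partnerLink="PollFiles"
             portType="ns1:Read_ptt" operation="Read"
             variable="FileMessage" createInstance="yes">
      <!-- set through the receive activity's Properties tab in JDeveloper -->
      <bpelx:property name="jca.file.FileName" variable="v_FileName"/>
      <bpelx:property name="jca.file.Directory" variable="v_FileDirectory"/>
      <bpelx:property name="jca.file.Size" variable="v_FileSize"/>
    </receive>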
The interaction just described between the File Adapter and the BPEL process has always worked in other, non-HA environments. The problem I'm facing is that it stops working when I deploy the composite in a clustered environment configured for high availability: the File Adapter succeeds in reading the file, but no BPEL process instance gets started and the composite instance gets stuck (that is, it keeps running until you manually abort it!).
Interestingly, if I put a Mediator between the File Adapter and the BPEL process, the Mediator instance does get started, that is, the file properties read by the adapter are passed to the Mediator, but then the composite gets stuck again because even the Mediator doesn't seem able to initiate the BPEL process instance.
I think the problem lies in how I configured either the SOA infrastructure for HA, or the File Adapter, or the BPEL process in my composite. To configure the adapter, I followed the instructions given here:
http://docs.oracle.com/cd/E14571_01/integration.1111/e10231/adptr_file.htm#BABCBIAH
but maybe I missed something. On the other hand, I didn't find anything about BPEL configuration for HA with SOA Suite 11g (all the material I found refers to SOA Suite 10g).
I've also read in some posts that, in order to use the database as a coordinator between the File Adapters deployed on the different nodes of the cluster, the database must be a RAC! Is that true, or is it possible to use another type of Oracle database?
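For reference, this is roughly how I set up the inbound binding after reading that chapter (a sketch from memory; the eis/HAFileAdapter connection factory and its controlDir/inboundDataSource settings are what I understood from the docs, so please correct me if I got them wrong):

    <!-- inbound .jca binding pointed at the HA (database-coordinated) connection factory -->
    <adapter-config name="PollFiles" adapter="File Adapter"
                    xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
      <connection-factory location="eis/HAFileAdapter"/>
      <endpoint-activation portType="Read_ptt" operation="Read">
        <activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
          <!-- plus the usual IncludeFiles / PollingFrequency properties, omitted here -->
          <property name="PhysicalDirectory" value="/shared/input"/>
          <property name="DeleteFile" value="false"/>
          <property name="MaxRaiseSize" value="5"/>
        </activation-spec>
      </endpoint-activation>
    </adapter-config>

On the WebLogic side, the eis/HAFileAdapter connection factory exposes controlDir and inboundDataSource, so the pollers on the cluster nodes are supposed to coordinate through the database rather than the file system.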
Please let me know if any of you have already encountered (and solved :)) a problem like this!
Thanks in advance,
Bye!

Hi,
thanks for your prompt reply. Anyway, I had already read through that documentation and tried all the settings it suggests, without any luck! I'm thinking the problem could be related to the Oracle DB used in the clustered environment, which is not RAC, while all the documentation I've read about high-availability configuration always refers to a RAC database. Does anyone know whether a RAC Oracle DB is strictly required for File Adapter configuration in an HA cluster?
Thanks, bye!
Fabio

Similar Messages

  • How to use File Adapter in BPEL

    Hi,
    How do I use the File Adapter in BPEL? I am using Oracle SOA 10g. Any sample or good example on a blog or elsewhere?

    Hi,
    Please check the links below for samples on how to use the File Adapter in BPEL (usage is the same in 10g and 11g) and let me know if you have any further queries.
    11g - http://blogs.oracle.com/theshortenspot/entry/soa_suite_integration_part_3_l
    10g - http://erpschools.com/articles/bpel-file-adapter-tutorial

  • Reading file using File Adapter in BPEL

    Hi,
    I am using JDeveloper 11g.
    I am trying to read an XML file and write its content to a text file.
    I created an XML schema and an XML file.
    I created an asynchronous BPEL process (auto-generated).
    I added one File Adapter with a Read operation and another File Adapter with a Write operation.
    Then I added one Receive activity connected to the Read partner link,
    and one Invoke activity linked to the Write partner link.
    I am able to deploy the process.
    While testing, the first (auto-generated) receive works, but the Receive for Read never starts.
    I am getting " Waiting for "Read" from "ReadFile". Asynchronous callback.Waiting for "Read" from "ReadFile". Asynchronous callback "
    How do I read a file in a BPEL process?
    Thanks

    So, you have two Receive activities (one for the SOAP client and one for the Read operation) and two Invokes (one to write the file and one to reply to the client).
    I think the first Receive activity, for the SOAP client, is not required, since you want to initiate the process by reading the file. You can delete the default SOAP client and the corresponding Receive activity, and check the Create Instance checkbox on the second Receive activity, the one linked with the Read partner link.
    There is a known issue with a second Receive activity in BPEL:
    BPEL 11g -- Issue with second Receive activity
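    Roughly, the Receive wired to the Read partner link should end up looking like this in the .bpel source (a sketch; the partner link, port type and variable names are taken from your description and may differ):

        <!-- the receive bound to the file-read partner link must create the instance -->
        <receive name="Receive_Read" partnerLink="ReadFile"
                 portType="ns1:Read_ptt" operation="Read"
                 variable="Read_InputVariable" createInstance="yes"/>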

  • Two Files merging using File adapter in Bpel 2.0

    Hi All,
    I have two different files (File 1 and File 2) of the same format (two columns each). Now I want to merge these two files based on their data and form a consolidated file (File 3 with four columns).
    Please advise on how to make this possible using the File Adapter.
    Example:
    File 1 format: A1 and A2 - two columns of number and string type
    File 2 format: B1 and B2 - two columns of number and string type
    Consolidated File 3 format: A1 B1 A2 B2 - forming a single row if A1=B1 and also populating A2 and B2.
    Thanks
    Karthick.

    Hi Karthick,
    I would say read both files completely. Then create a transform and select both messages as inputs, creating one output. With XSLT it should not be too hard to combine the two inputs: you can loop over one file and then select the matching row from the other file. That output can be written to a file.
    I don't know what triggers the process, but you could either let BPEL be triggered by the polling on one file and read the other one synchronously, or read both files synchronously.
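    A rough sketch of such a transform, assuming both files were already read into the mapping (the parameter, element and key names below are purely illustrative):

        <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <!-- file 2 is passed in as a second source; the parameter name is illustrative -->
          <xsl:param name="File2Doc"/>
          <xsl:template match="/">
            <Consolidated>
              <!-- loop over file 1 rows and look up the matching row in file 2 by key -->
              <xsl:for-each select="/File1/Row">
                <xsl:variable name="key" select="A1"/>
                <xsl:variable name="match" select="$File2Doc/File2/Row[B1 = $key][1]"/>
                <xsl:if test="$match">
                  <Row>
                    <A1><xsl:value-of select="A1"/></A1>
                    <B1><xsl:value-of select="$match/B1"/></B1>
                    <A2><xsl:value-of select="A2"/></A2>
                    <B2><xsl:value-of select="$match/B2"/></B2>
                  </Row>
                </xsl:if>
              </xsl:for-each>
            </Consolidated>
          </xsl:template>
        </xsl:stylesheet>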
    Regards,
    Martien

  • File Read and Write using File Adapter in Bpel

    In a BPEL process I am using the File Adapter (the schema is opaque) to read and write file contents. The first time I deploy, everything works: the deployment succeeds and both the read and the write work. But when I run the application again, it no longer writes the file content: it only writes the file, without any data or content in it.
    Please help me...
    Saravanan
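    For reference, with an opaque schema the adapter wraps the raw file content in a base64 element, roughly like the wizard-generated wrapper below (a sketch):

        <schema xmlns="http://www.w3.org/2001/XMLSchema"
                targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
                elementFormDefault="qualified">
          <!-- the whole file travels through BPEL as one base64-encoded element -->
          <element name="opaqueElement" type="base64Binary"/>
        </schema>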

    Hi Eric
    My domain.log file has the following details. From this file I am unable to work out what the exact problem is. Please look at it and help me.
    <2008-01-22 18:25:42,024> <INFO> <default.collaxa.cube.compiler> validating "C:\product\10.1.3.1\OracleAS_1\bpel\domains\default\tmp\.bpel_BPELProcess2_1.1_298e83988d77b6640c33dfeec11ed31b.tmp\BPELProcess2.bpel" ...
    <2008-01-22 18:25:49,850> <INFO> <default.collaxa.cube.engine.deployment> <CubeProcessFactory::generateProcessClass>
    Process "BPELProcess2" (revision "1.1") successfully compiled.
    <2008-01-22 18:25:49,914> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> Loading JCAActivationAgent for {portType=Read_ptt}
    <2008-01-22 18:25:49,914> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> JCAActivationAgent::load - Locating Adapter Framework instance: OraBPEL
    <2008-01-22 18:25:49,930> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> JCAActivationAgent::load - Done loading JCAActivationAgent for processId='bpel://localhost/default/BPELProcess2~1.1/
    <2008-01-22 18:25:49,930> <INFO> <default.collaxa.cube.engine.deployment> Process "BPELProcess2" (revision "1.1") successfully loaded.
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> JCAActivationAgent::uninit Shutting down the JCA activation agent, processId='bpel://localhost/default/BPELProcess2~1.0/', activation properties={portType=Read_ptt}
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraBPEL - performing endpointDeactivation for portType=Read_ptt, operation=Read
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.ws> <File Adapter::Outbound> Endpoint De-activation called in adapter for endpoint : D:\MAXIMUS_Project_Softwares\jdevstudiobase10132\jdev\mywork\MyLabs\BPELProcess2\in
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> JCAActivationAgent::init - Initializing the JCA activation agent, processId='bpel://localhost/default/BPELProcess2~1.1/
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> JCAActivationAgent::initiateInboundJcaEndpoint - Creating and initializing inbound JCA endpoint for:
    process='bpel://localhost/default/BPELProcess2~1.1/'
    domain='default'
    WSDL location='rd.wsdl'
    portType='Read_ptt'
    operation='Read'
    activation properties={portType=Read_ptt}
    <2008-01-22 18:26:02,698> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraBPEL - endpointActivation for portType=Read_ptt, operation=Read
    <2008-01-22 18:26:02,730> <INFO> <default.collaxa.cube.activation> <File Adapter::Inbound> Endpoint Activation called in File Adapter for endpoint: D:\MAXIMUS_Project_Softwares\jdevstudiobase10132\jdev\mywork\MyLabs\BPELProcess2\in
    <2008-01-22 18:26:02,730> <INFO> <default.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraBPEL - successfully completed endpointActivation for portType=Read_ptt, operation=Read
    <2008-01-22 18:26:02,890> <WARN> <default.collaxa.cube.activation> <File Adapter::Inbound> PollWork::run exiting, Worker thread will die
    <2008-01-22 18:26:04,171> <INFO> <default.collaxa.cube.ws> <File Adapter::Outbound> Managed Connection Created
    <2008-01-22 18:26:04,171> <INFO> <default.collaxa.cube.ws> <File Adapter::Outbound> Connection Created
    <2008-01-22 18:26:04,171> <INFO> <default.collaxa.cube.ws> <File Adapter::Outbound> FileInteraction Created

  • File adapter: End of line issue in unix receiver

    Hi all,
    We are sending flat files using the File Adapter (FTP) with the file content conversion functionality, and the file looks fine on our Windows system, but when it arrives at the Unix receiver they get an error regarding the end of line.
    It has a ^M character on every line, so it cannot be processed.
    Does anyone know how to strip this end-of-line character from the file while keeping the same format?
    Thanks in advance.

    Hi Daniel,
    you can try to use this adapter module:
    localejbs/SAP XI Sample/ConvertCRLFfromToLF
    http://help.sap.com/saphelp_nw04/helpdata/en/32/43d84072378031e10000000a1550b0/content.htm
    Hope this helps.
    Francesco

  • Polling using File Adapter in BPEL

    My requirement is to constantly poll a directory location using the File Adapter for a particular file name. If the file is found, pick it up; otherwise continue polling.
    Any thoughts on the approach are appreciated.

    Refer to the link below; the File Adapter is capable of doing that.
    http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CIAHDAEB
    For file polling, refer to the *4.3.1.4 File Polling* section.
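    As a rough illustration, the inbound activation spec ends up with properties along these lines (the directory and file-name pattern are placeholders):

        <activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
          <property name="PhysicalDirectory" value="/data/incoming"/>
          <!-- poll only for the particular file name you are waiting for -->
          <property name="IncludeFiles" value="expected_file\.csv"/>
          <property name="PollingFrequency" value="30"/>
          <property name="MinimumAge" value="0"/>
          <property name="DeleteFile" value="true"/>
        </activation-spec>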
    Thanks,
    Vijay

  • Interpreting Multibyte data with File Adapter in BPEL Process

    Hi all,
    I am trying to interpret a multibyte data file using a native schema (NXSD) file. The schema is defined as below:
    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    targetNamespace="http://TargetNamespace.com/ftpinbound"
    xmlns:tns="http://TargetNamespace.com/ftpinbound"
    elementFormDefault="qualified"
    attributeFormDefault="unqualified" nxsd:encoding="MS932" nxsd:stream="bytes" nxsd:version="NXSD">
    <xsd:element name="root">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element name="Header" minOccurs="1" maxOccurs="1">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element name="RecType" type="xsd:string" nxsd:style="fixedLength" nxsd:length="5"/>
    <xsd:element name="OperMode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1"/>
    <xsd:element name="SendSideCompCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="RecvSideCompCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="BPID" type="xsd:string" nxsd:style="fixedLength" nxsd:length="8"/>
    <xsd:element name="InfoTypeCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="4"/>
         <xsd:element name="CreateDate" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="SendSideSysCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="RecvSideSysCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="Preliminary" type="xsd:string" nxsd:style="fixedLength" nxsd:length="458"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:element name="Lines" minOccurs="1" maxOccurs="unbounded" nxsd:style="array" nxsd:arrayTerminatedBy="(">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element name="DataProcessNo" type="xsd:string" nxsd:style="fixedLength" nxsd:length="5"/>
    <xsd:element name="InfoClassCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="4"/>
    <xsd:element name="DataCreateDate" type="xsd:string" nxsd:style="fixedLength" nxsd:length="6"/>
    <xsd:element name="SellerCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="OrderRecvPartyCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="PODeptCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="8"/>
    <xsd:element name="ProductNo" type="xsd:string" nxsd:style="fixedLength" nxsd:length="19"/>
    <xsd:element name="CorrectionCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1"/>
    <xsd:element name="SupplyClass" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1"/>
    <xsd:element name="Buyer" type="xsd:string" nxsd:style="fixedLength" nxsd:length="7"/>
    <xsd:element name="MatStdDimension" type="xsd:string" nxsd:style="fixedLength" nxsd:length="20"/>
    <xsd:element name="Package" type="xsd:string" nxsd:style="fixedLength" nxsd:length="7"/>
    <xsd:element name="Remarks" type="xsd:string" nxsd:style="fixedLength" nxsd:length="30"/>
    <xsd:element name="ConTaxClass" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1"/>
    <xsd:element name="SuppDeptCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="8"/>
    <xsd:element name="SuppProdName" type="xsd:string" nxsd:style="fixedLength" nxsd:length="30"/>
    <xsd:element name="SuppProdNameCode" type="xsd:string" nxsd:style="fixedLength" nxsd:length="25"/>
    <xsd:element name="SuppUnit" type="xsd:string" nxsd:style="fixedLength" nxsd:length="3"/>
    <xsd:element name="SuppUnitPrice" type="xsd:string" nxsd:style="fixedLength" nxsd:length="13"/>
    <xsd:element name="PODeptName" type="xsd:string" nxsd:style="fixedLength" nxsd:length="20"/>
    <xsd:element name="Buyer_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="14"/>
    <xsd:element name="MatStdDim_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="40"/>
    <xsd:element name="Remarks_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="60"/>
    <xsd:element name="SuppDeptName_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="40"/>
    <xsd:element name="SuppProdName_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="60"/>
    <xsd:element name="PODeptName_K" type="xsd:string" nxsd:style="fixedLength" nxsd:length="40"/>
    <xsd:element name="CountingMTHs" type="xsd:string" nxsd:style="fixedLength" nxsd:length="4"/>
    <xsd:element name="SuppProdNvAmount" type="xsd:string" nxsd:style="fixedLength" nxsd:length="10"/>
    <xsd:element name="SupProdSingleItem" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="SupProdWIP" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    <xsd:element name="SupProdSpoiledQnty" type="xsd:string" nxsd:style="fixedLength" nxsd:length="12"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:element name="Trailer" minOccurs="1" maxOccurs="1">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element name="RecType" type="xsd:string" nxsd:style="fixedLength" nxsd:length="4"/>
    <xsd:element name="NoOfRec" type="xsd:string" nxsd:style="fixedLength" nxsd:length="5"/>
    <xsd:element name="Preliminary" type="xsd:string" nxsd:style="fixedLength" nxsd:length="526"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    </xsd:schema>
    This schema works fine and the data is produced as expected when I use "chars" as the stream option. However, I can't use "chars" and need "bytes" as the stream option. When I specify the "bytes" option, the following exception is generated:
    Charset "MS932" not supported.
    Check the error stack and fix the cause of the error. Contact oracle support if error is not fixable.
         at oracle.tip.pc.services.translation.xlators.nxsd.ByteReader.<init>(ByteReader.java:87)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.createScanner(NXSDTranslatorImpl.java:800)
         at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.translateFromNative(NXSDTranslatorImpl.java:440)
         at oracle.tip.adapter.file.inbound.ProcessWork.doTranslation(ProcessWork.java:504)
         at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:266)
         at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:179)
         at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
         at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:272)
         at java.lang.Thread.run(Thread.java:595)
    Please let me know whether I can read a file in byte format with the File Adapter using a desired encoding at all. This is a little urgent.
    Thanks

    Hi James,
    Yeah, I tried that too and it gave the same error. And I need to encode with MS932 only, since the data will contain Japanese characters and this encoding is recognised by both Windows and non-Windows operating systems.
    Please can you advise what I can do, since the requirement is urgent.
    Thanks

  • JDBC adapter connected to a DB in high availability!

    Hi folks,
    I have finished my scenario File -> XI -> JDBC and now I'm preparing to transport it to QAS. I found that the QAS database runs on two cluster nodes. I have two hostnames to fill in for the <IP address> parameter and I don't know which hostname I should use! Am I supposed to use both, or do I need another, virtual one?
    Connection: jdbc:oracle:thin:@<IP address>:<listener port>:<instance name (database name)>
    Thanks in advance,
    Ricardo.

    I don't know this high-availability mechanism with two IPs, but if you really have two hostnames you should think of a way to switch from one node to the other easily, deactivating one and activating the other (for instance with Controlling a Communication Channel Externally: http://help.sap.com/saphelp_nw2004s/helpdata/en/45/0c86aab4d14dece10000000a11466f/frameset.htm)
    I suggest you verify how it will be in the production instance: a real cluster? switchover? a virtual IP?
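    For what it's worth, a thin-driver URL can also list both nodes in a TNS-style descriptor (a sketch, not something confirmed in this thread; check it against your database setup):

        jdbc:oracle:thin:@(DESCRIPTION=
          (ADDRESS_LIST=
            (ADDRESS=(PROTOCOL=TCP)(HOST=<hostname node 1>)(PORT=<listener port>))
            (ADDRESS=(PROTOCOL=TCP)(HOST=<hostname node 2>)(PORT=<listener port>)))
          (CONNECT_DATA=(SERVICE_NAME=<service name>)))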
    Regards,
    Sandro

  • SAP Web Dispatcher in a high availability environment

    Hello, guys
    We are working on a CRM 7.0 implementation project. Our system landscape is the following:
       - Two hosts (host1 & host2) in an MSCS cluster (Windows 2008) with SQL Server and the ASCS in high availability. Additionally, this MSCS cluster has an instance of SAP Web Dispatcher.
       - On these two hosts we've installed a CI & DI instance, outside the high-availability scope
       - Two additional hosts (host3 & host4) with one dialog instance on each host
    We have severe problems with the communication between SAP Web Dispatcher and the ICM components. Our configuration is the following:
       - ASCS (MSCS_virtual_hostname):
    ms/server_port_0 = PROT=HTTP,PORT=8141
    SAPLOCALHOSTFULL = <MSCS_virtual_hostname>.<domain>
       - IC (host1)
    icm/server_port_0 = PROT=HTTP,PORT=8040,TIMEOUT=90,PROCTIMEOUT=600
    icm/host_name_full = <host1>.<domain>
       - ID1 (host2)
    icm/server_port_0 = PROT=HTTP,PORT=8044,TIMEOUT=90,PROCTIMEOUT=600
    icm/host_name_full = <host2>.<domain>
       - ID3 (host3)
    icm/server_port_0 = PROT=HTTP,PORT=8045,TIMEOUT=90,PROCTIMEOUT=600
    icm/host_name_full = <host3>.<domain>
       - ID4 (host4)
    icm/server_port_0 = PROT=HTTP,PORT=8046,TIMEOUT=90,PROCTIMEOUT=600
    icm/host_name_full = <host4>.<domain>
       - SAP Web Dispatcher (MSCS_virtual_hostname):
    SAPGLOBALHOST = <MSCS_virtual_hostname>
    SAPLOCALHOSTFULL = <MSCS_virtual_hostname>.<domain>
    SAPLOCALHOST = <MSCS_virtual_hostname>
    ms/http_port = 8141
    icm/server_port_0 = PROT=HTTP, PORT=8042,TIMEOUT=30,PROCTIMEOUT=600
    wdisp/add_xforwardedfor_header = TRUE
    In the SAP Web Dispatcher log we've found the following error messages:
    Fri Jan 28 15:45:22 2011
    ***LOG Q0I=> NiPConnect2: connect (10061: WSAECONNREFUSED: Connection refused)
    *** ERROR => NiPConnect2: SiPeekPendConn failed for hdl 6 / sock 130060
        (SI_ECONN_REFUSE/10061; I4; ST; 192.168.6.182:8044)
    *** ERROR => Connection request to host: , service: 8044 failed (NIECONN_REFUSED)
    SAP Web Dispatcher is trying to connect to the dialog instances through , which is incorrect (ports 8044, 8045 & 8046 are open on the dialog instances, not on the virtual instance). I think it should try the real hostnames (host1, host2, host3 & host4).
    Please help!! Thanks in advance

    Hello, Karthi,
    Our Web Dispatcher profile looks as follows:
    Instance-specific parameters
    (maybe some of these parameters are needless)
    SAPSYSTEMNAME = <CRM SID>
    INSTANCE_NAME = <WD SID>
    SAPSYSTEM = <WD System number>
    SAPGLOBALHOST = <virtual hostname of WD>
    SAPLOCALHOSTFULL = <FQDN of virtual hostname of WD>
    SAPLOCALHOST = <virtual hostname of WD>
    Directories
    DIR_INSTANCE = R:\usr\sap\wd
    DIR_INSTALL = R:\usr\sap\wd
    DIR_CT_RUN = $(DIR_EXE_ROOT)\$(OS_UNICODE)\NTAMD64
    DIR_EXECUTABLE = R:\usr\sap\wd
    DIR_PROFILE = R:\usr\sap\wd
    DIR_HOME = R:\usr\sap\wd
    DIR_ICMAN_ROOT = $(DIR_INSTANCE)\icmanroot
    R:\usr\sap\wd\global\security\data
    Message Server accessibility
    rdisp/mshost = <virtual hostname of CRM Message Server>
    ms/http_port = <HTTP port of CRM Message Server>
    HTTP Settings
    Standard HTTP access port
    icm/server_port_0 = PROT=HTTP, PORT=8042,TIMEOUT=30,PROCTIMEOUT=600
    These parameters define load-balancing weights
    #wdisp/server_00 = NAME=<hostname_SID_SYSNR>, LB=4, ACTIVE=0
    #wdisp/server_01 = NAME=<hostname_SID_SYSNR>, LB=10, ACTIVE=1
    #wdisp/server_02 = NAME=<hostname_SID_SYSNR>, LB=20, ACTIVE=1
    #wdisp/server_03 = NAME=<hostname_SID_SYSNR>, LB=20, ACTIVE=1
    Admin web interface access port
    icm/HTTP/admin_0 = PREFIX=/sap/admin, DOCROOT=$(DIR_ICMAN_ROOT)/admin, AUTHFILE=$(DIR_INSTANCE)\sec\icmauth.txt
    SAP Web Dispatcher cache activation
    icm/HTTP/server_cache_0/http_cache_control = true
    icm/HTTP/server_cache_0 = PREFIX=/, CACHEDIR=$(DIR_INSTANCE)\cache
    Security log file
    icm/security_log = LOGFILE=$(DIR_HOME)\log\security_%y%m%d.log, SWITCHTF=day, MAXSIZEKB=1024, FILEWRAP=off
    icm/HTTP/logging_0 = PREFIX=/, LOGFILE=$(DIR_HOME)\log\wd_log_%y%m%d.log, SWITCHTF=day, MAXSIZEKB=1024, FILEWRAP=off
    icm/log_level = 1
    Dispatcher Configuration
    wdisp/add_xforwardedfor_header = FALSE
    Memory parameters
    Sizing data used as the starting point
    #users = 1800 users (900 concurrent)
    #req_per_dialog_step = 6 HTTP requests per step
    #thinktime_per_diastep_sec = 10 sec of "think time"
    #conn_keepalive_sec = 30 sec keeping the connection to the ICM open
    #icm/max_conn = users * req_per_dialog_step * conn_keepalive_sec / thinktime_per_diastep_sec
    icm/max_conn = 16200
    wdisp/HTTP/max_pooled_con = icm/max_conn
    wdisp/HTTP/max_pooled_con = 16200
    icm/max_sockets = at least the sum of icm/max_conn and wdisp/HTTP/max_pooled_con
    icm/max_sockets = 32400
    mpi/buffer_size = 64K = 64 * 1024 = 65536
    mpi/buffer_size = 65536
    mpi/total_size_MB = icm/max_conn * mpi/buffer_size (mpi/buffer_size converted to MB)
    mpi/total_size_MB = 1024
    icm/req_queue_len = icm/max_conn / 2
    icm/req_queue_len = 8100
    icm/min_threads = icm/max_conn / ~50
    icm/min_threads = 512
    icm/max_threads = icm/max_conn / ~20
    icm/max_threads = 1024
    Security parameters
    Avoid sending technical messages to the end user
    is/HTTP/show_detailed_errors = FALSE
    #icm/HTTP/error_templ_path
    And ICM parameters are:
    - SAPLOCALHOSTFULL= <FQDN of every application server>
    - icm/server_port_0 = PROT=HTTP,PORT=8080,TIMEOUT=90,PROCTIMEOUT=600:
    - icm/host_name_full = <FQDN of every application server>  ## This parameter is ignored if SAPLOCALHOSTFULL is defined
    I hope it helps you.
    Best regards,
    Sergio Sánchez

  • Sender File adapter complex structure FCC issue - flat structure RFC-stuck

    Hi,
    Please help.
    I have the below file structures:
    Option 1-
    H,100890,P100,A02,S101,AUD#
    I,P,NULL,TH,Test PO TH,1,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    I,P,NULL,TH,Test PO TH1 2,2,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    H,100899,P100,A02,S101,GBP#
    I,P,NULL,AS,Test PO AS1,1,EA,100,10160000,A002,0001,720090,E.1.4.2,,,,VT#
    I,P,NULL,AS,Test PO AS12,2,EA,100,10160000,A002,0001,720090,E.1.4.2,,,,VT#
    I need to map this into a Z RFC which expects all the data in one row (multiple rows, of course).
    Option 2-
    Same structure as above, but everything in the same row:
    H,100890,P100,A02,S101,AUD,P,NULL,TH,Test PO TH,1,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    I,100890,P100,A02,S101,AUD,P,NULL,TH,Test PO TH,1,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    I,100890,P100,A02,S101,AUD,P,NULL,TH,Test PO TH1 2,2,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    H,100899,P100,A02,S101,GBP,P,NULL,AS,Test PO AS1,1,EA,100,10160000,A002,0001,720090,E.1.4.2,,,,VT#
    I,100899,P100,A02,S101,GBP,P,NULL,AS,Test PO AS1,1,EA,100,10160000,A002,0001,720090,E.1.4.2,,,,VT#
    I,100899,P100,A02,S101,GBP,P,NULL,AS,Test PO AS12,2,EA,100,10160000,A002,0001,720090,E.1.4.2,,,,VT
    This also needs to be mapped into a Z RFC which expects all the data in one row (multiple rows, of course).
    I am having issues getting the file data across to the receiver adapter.
    I have tried various combinations, but the message is failing in SXMB_MONI.
    How do I do it, given that I need to settle on the final option for the incoming file structure?
    Option 1 is being pushed, but how do I map it if I cannot get the receiver structure into the RFC?
    Please help, as I am stuck now.
    Regards,
    Archana

    Hi,
    The problem is basically in the message mapping from the file to the RFC external message.
    Option 2 is working now: I get the correctly converted file structure after FCC into the RFC, and also a correct RFC payload.
    However, the business is insisting that they send the file in the format given in Option 1, where you have different structures - header and items. This does not come out correctly in the RFC payload: the header has 5 fields compared to more in the item, but the header and the item are still both mapped to the flat RFC structure, and this creates a mismatch. The item line is missing the 5 fields from the header.
    How do I do the FCC in this situation to get the correct structure in the RFC?
    This means that in the RFC payload, the first line should be the one below:
    H,100890,P100,A02,S101,AUD#
    The 2 records after this, as received in the RFC internal table, should be the 2 given below:
    I,P,NULL,TH,Test PO TH,1,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    I,P,NULL,TH,Test PO TH1 2,2,EA,100,10160000,A002,0001,720090,E.1.4.3,,,,VT#
    However, the 2 structures contain a different number of field columns.
    Please help.
    Regards,
    Archana
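    For Option 1, the usual direction is key-field-based content conversion on the sender channel, roughly along these lines (a sketch only; the field names are illustrative placeholders, and the RFC-side mapping still has to carry the header fields down to each item):

        Recordset Structure : H,1,I,*
        Key Field Name      : recType

        H.fieldNames        : recType,hfield1,hfield2,hfield3,hfield4,hfield5
        H.fieldSeparator    : ,
        H.keyFieldValue     : H

        I.fieldNames        : recType,ifield1,ifield2,ifield3,ifield4,ifield5,ifield6,ifield7,ifield8,ifield9,ifield10,ifield11,ifield12,ifield13,ifield14,ifield15,ifield16
        I.fieldSeparator    : ,
        I.keyFieldValue     : I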

  • NW2004s Installation fails in High Availability Environment

    We are installing our Production Server
    SAP NetWeaver 2004s with high availability.
    As per the installation guide, we installed Oracle 10g on the local disk of both nodes.
    Then we installed OFS and the SCS instance successfully.
    Then we created the Oracle Fail Safe group.
    After this, when we try to install the database instance, during the prerequisites check we get the message
    "Oracle Software is not installed. Install Oracle First"
    But we do have Oracle installed on the server, on the local disk.
    Any Ideas?
    Thanks & Regards
    Sumanth

    Hi Benny,
    Thanks for the info. We received a reply in OSS stating that NW2004s in HA is not tested.
    It will only work with NW2004s SR1.
    Now we are planning to go with a distributed system installation without HA, because we will not get the NW2004s SR1 DVD.
    Where can we get information about how to convert the system landscape from a distributed system to an HA system? Is this possible as part of a migration, or only as a separate installation?
    Can you please share your views on converting from a distributed environment to an HA environment?
    Thanks & Regards
    Sumanth

  • How to find EOF in BPEL File adapter

    Hi All,
    I am using the File Adapter in BPEL (SOA 11g) to read a CSV file. The CSV file is very large, so we are processing it in batches of 20000 records. I am inserting the records from the CSV file into a staging table, and after that I am invoking a concurrent program for further processing. This is working fine, and BPEL initiates a number of instances based on the file size.
    The issue here is that I want to submit the concurrent program only once, when all the records are stored in the staging table. I mean the BPEL process should process the file in batches and insert them into the staging table, and only once all the records from the file are stored in the table should the concurrent program get submitted.
    I am using the DB Adapter to invoke a PL/SQL API, and from there we are submitting the concurrent program.
    Please help me solve this.
    Thanks!

    It is possible to register a Java listener class, which will be invoked when the last batch of the file is read. The code for invoking the concurrent program can be written inside this class.
    Refer to the following document on how to register the Java listener class:
    http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CACJBIGD
    Moreover, as an alternative to the above approach, the chunked-read interaction spec can be implemented along with a synchronous file read for your scenario (however, this would be a complete rewrite of your current approach; the good news is that a full-fledged example is readily available in the Oracle docs).
    Here is the link if you would like to implement the Chunk Read.
    http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#BABJFCBH
    Mark the posting appropriately as "helpful" or "correct answer" if your issue is solved.

  • Sender File Adapter polling interval & file size issue

    Hi ALL,
    I have a 400 MB file to be picked up by the sender File Adapter, and I am facing an issue where the system is unable to even pick up the file and place it in the queue for processing.
    1) We have no mapping and no content conversion - we only pick the file from one location and place it in 3 different locations with 3 different names. Can anyone help me solve this? What is the best workable method to achieve this?
    2) This file gets created once a day, say at 2 PM, so I could set the poll interval to 24 hours. But sometimes the file creation on the server gets delayed, and they can't wait for the next 24-hour cycle to process the file. So my query is: can we reduce the poll interval between 2 PM and 3 PM to, say, 3 polls, and once it reaches 3 PM have it poll as usual with a gap of 24 hours?
    thanks
    RK

    Hi Rupash,
    >>I have a 400 MB file to be picked up by the sender File Adapter, and I am facing an issue where the system is unable to even pick up the file and place it in the queue for processing.
    Never use the File Adapter for picking up large files (the maximum size I have seen work successfully is 200 MB).
    Instead go for Java proxies. Check this forum post for the discussion:
    Java proxy to use XI as a file mover
    Also if you want some more info on Java proxies refer these blogs and articles:
    /people/prasad.ulagappan2/blog/2005/06/27/asynchronous-inbound-java-proxy
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    >>This file gets created once a day, say at 2 PM, so I could set the poll interval to 24 hours. But sometimes the file creation on the server gets delayed.
    You can use availability time planning; refer to these links for more information:
    Planning Availability Times
    http://help.sap.com/saphelp_nw04/helpdata/en/45/06bd029da31122e10000000a11466f/content.htm
    Controlling a Communication Channel Externally
    http://help.sap.com/saphelp_nw04/helpdata/en/45/0c86aab4d14dece10000000a11466f/content.htm
    Regards
    Suraj

  • File Adapter to EJB  in BPEL

    Hi,
    I have one XML file that I am reading using the File Adapter in a BPEL process. After reading it, I want to pass those values to the EJB adapter.
    I used a Transform to map the output variables of the File Adapter to the input parameters of the EJB service.
    But the Invoke is not happening.
    How do I pass values from the File Adapter to the EJB?
    Thanks

    Hi Sonia
    It is not possible to achieve this using debatching. In fact, the debatching technique is appropriate when your requirement is exactly the opposite: when you need to run instances simultaneously.
    If you need to process each record sequentially, you should consider using just one instance (not using debatching, letting the process read the entire file), using a while loop and treating your XML as an array, processing each record as a single unit.
    If the CSV file is too big, you should consider using a process A that breaks the file into smaller files and sequentially copies the smaller pieces to a directory where a process B reads each entire file.
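    A minimal sketch of that while-loop idea (BPEL 2.0 syntax; the variable, element and partner-link names are purely illustrative, and an integer variable idx is assumed to exist):

        <!-- walk the records of the file message one by one -->
        <assign name="InitIndex">
          <copy><from>1</from><to>$idx</to></copy>
        </assign>
        <while name="PerRecord">
          <condition>$idx &lt;= count($FileVar.payload/ns0:Records/ns0:Record)</condition>
          <sequence>
            <assign name="PickRecord">
              <copy>
                <from>$FileVar.payload/ns0:Records/ns0:Record[$idx]</from>
                <to>$EjbInput.parameters/ns1:record</to>
              </copy>
            </assign>
            <invoke name="CallEjb" partnerLink="EjbService" operation="process"
                    inputVariable="EjbInput" outputVariable="EjbOutput"/>
            <assign name="IncrementIndex">
              <copy><from>$idx + 1</from><to>$idx</to></copy>
            </assign>
          </sequence>
        </while>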
    []´s
    Marcelo
