XI scenario: synchronous RFC (ABAP program) over XI to a C program receiver

Hi experts,
I have a synchronous RFC call between an SAP system (sender) and a C program (receiver). Now XI should become part of this scenario, and I have a few problems including it.
1. SENDER:
I have an ABAP application which calls a function module and sends the data over synchronous RFC to XI, e.g.:
CALL FUNCTION 'FM_XYZ' DESTINATION 'RFC_SENDER'
  EXPORTING  <parameters>
  IMPORTING  <parameters>
  TABLES     <tables>.
With this part I have had no problems, but I do have problems on the receiver side.
2. RECEIVER:
On the receiver side I have a server on which a C program starts a process and registers itself as an RFC server program (registered program ID) at the gateway. The program then waits for requests. How can this C program be called from the XI side?
Could anybody help me? Thank you.
Regards
Mario

Hi Ravi,
thanks for your fast reply.
In our current RFC scenario (without XI):
1. On the SAP side we have configured a TCP/IP RFC destination.
2. On the C-program side we start the C program as follows:
We run a file which contains the name of the C program executable, the RFC program ID and the gateway. The RFC destination is then registered on the external server side.
Using the connection test in SM59, the RFC connection works fine.
3. Now this scenario should be replaced by the XI scenario. Is it possible to use an RFC adapter on the receiver side which calls such an external, registered RFC server?
Regards
Mario
Message was edited by:
        Mario Bauer

Similar Messages

  • How to include an ABAP program in process Chain

    Hello Gurus,
    I have a situation where I have to go to SE38 and run an ABAP program which fills the active table of ODS2 by taking data from ODS1. The data from ODS2 is then fed into another ODS3. My question is: how do I create a process chain which takes care of this scenario, especially how to include the ABAP program in the process chain? Looking forward to your replies.
    Regards,
    Kalyan

    Hi,
    take a look at the 'how to' guide on ABAP in process chains:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7

  • Calling JCO RFC Server program from JCO RFC client

    Hi,
    I have an RFC registered server program which implements JCO.Server. It seems to be working fine when called from SAP.
    For testing purposes, I was trying to write a JCO client program which would take the place of the SAP client.
    This program opens a connection to the RFC server and executes a function e.g.
    JCO.Client client = JCO.createClient("xx.yy.com", "sapgw35", "MYPROGID");
    client.connect();
    client.execute(function);
    The RFC server program receives the call fine when I test with a simple function which has no table parameters. However, when I try a more complex function with table parameters, I get a serverExceptionOccurred from the RFC server program:
    com.sap.mw.jco.JCO$Exception: (104) RFC_ERROR_SYSTEM_FAILURE: connection closed without message (CM_NO_DATA_RECEIVED)
        at com.sap.mw.jco.rfc.MiddlewareRFC$Server.nativeListen(Native Method)
        at com.sap.mw.jco.rfc.MiddlewareRFC$Server.listen(MiddlewareRFC.java:1368)
        at com.sap.mw.jco.JCO$Server.listen(JCO.java:6805)
    I have tried to initialize the repositories in both the server and client programs correctly, so that the function is in the cached function list and the table structures are in the cached structure list before the function is invoked. But I am not sure if there is still something I am missing, so any ideas would be welcome.
    Thanks,
    Richard

    JCO example 5 is a very good one for server-side programming.
    Try the example; a few things you should make clear first:
    1) JCO.Server
    2) repositories -- data mapping
    3) parameters: export, import, table...
    Further topics:
    1) JCO pools
    2) tRFC, qRFC
    After you have succeeded with client-side programming, try example 5.
    Regards

  • Synchronizing Two Custom Z Tables Using Abap Program

    Hi,
    My requirement is to synchronize two custom Z tables using an ABAP program.
    I have vendors in two tables; I have to select the common vendors from the first table which also exist in the second table.
    In the first table each vendor can have more than one supplier type, stored in a single field. In the second table these supplier types are split into different fields.
    I have to check the supplier types in the first table and pass 'X' to the corresponding supplier-type fields of that vendor in the second table, depending on the supplier types found in the first table.
    How can this be done? Can anyone suggest an approach, with code?
    Thanks in advance,
    Vivek
    <subject edited by moderator>
    Message was edited by: Manish Kumar

    Hi,
    Imho, you first need to extract the different supplier types from Table1 into separate fields. Your key for Table1 is the vendor number, which is also the key in Table2 (or the key for Table2 is vendor number & type).
    For better performance, select multiple/all required entries from Table1 in one go instead of using SELECT ... ENDSELECT.
    Depending on the format of the vendor types in Table1, put them into a new itab (for our purpose named Table1New) where vendor number & type are the only two fields. For example, if the type length is fixed to two characters, or the types are separated by spaces, adjust your coding accordingly.
    The next step is to select all vendor numbers from Table2 which you have selected from Table1. If the vendor number is the only key of Table2 (and all vendor types are filled in a single record), loop over Table1New and check its types against the type fields in Table2.
    If the key of Table2 is vendor number & vendor type, do a READ TABLE with that key instead.
    The logic in pseudo-code:
    Select from Table1 into table. If you'd like to limit the selection size, add package size statement.
         extract the vendor types in to itab Table1New.
         Select the vendor & types from Table2 by using the for all entries option (better performance).
         loop at Table1New
              check in Table2:
                   if the unique key is vendor no: check all fields for the vendor type from Table1New
                   if the unique key combo is vendor no & type: check by using a read table.
              If not found => add entry to Table2
         endloop.
    endselect Table1 (when using package size)
    I guess the most difficult step is to extract the types from Table1 into separate fields; all the rest seems straightforward. Please keep the itab type definitions in mind for better performance. A rough ABAP sketch of the whole logic follows below.
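    A sketch only, assuming hypothetical tables ZVEND1 (fields VENDOR and TYPES, with the supplier types concatenated in TYPES) and ZVEND2 (fields VENDOR, TYPE_A, TYPE_B) and a fixed type length of two characters; adapt the names and the split logic to your real structures:
    report z_sync_vendor_types.
    types: begin of ty_t1new,
             vendor(10) type c,
             vtype(2)   type c,
           end of ty_t1new.
    data: lt_t1    type standard table of zvend1,
          lt_t2    type standard table of zvend2,
          lt_t1new type standard table of ty_t1new,
          ls_t1    type zvend1,
          ls_t2    type zvend2,
          ls_t1new type ty_t1new,
          lv_len   type i,
          lv_off   type i.
    select * from zvend1 into table lt_t1.
    * split the concatenated supplier types (length 2) into single rows
    loop at lt_t1 into ls_t1.
      lv_len = strlen( ls_t1-types ).
      lv_off = 0.
      while lv_off < lv_len.
        ls_t1new-vendor = ls_t1-vendor.
        ls_t1new-vtype  = ls_t1-types+lv_off(2).
        append ls_t1new to lt_t1new.
        lv_off = lv_off + 2.
      endwhile.
    endloop.
    * read only the Table2 vendors that also exist in Table1
    if not lt_t1new[] is initial.
      select * from zvend2 into table lt_t2
        for all entries in lt_t1new
        where vendor = lt_t1new-vendor.
    endif.
    sort lt_t2 by vendor.
    * flag the matching supplier-type field in Table2
    loop at lt_t1new into ls_t1new.
      read table lt_t2 into ls_t2
           with key vendor = ls_t1new-vendor binary search.
      check sy-subrc = 0.
      case ls_t1new-vtype.
        when 'A'. ls_t2-type_a = 'X'.
        when 'B'. ls_t2-type_b = 'X'.
      endcase.
      modify lt_t2 from ls_t2 index sy-tabix
             transporting type_a type_b.
    endloop.
    update zvend2 from table lt_t2.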
    Good luck!
    Best regards,
    Zhou

  • Synchronous RFC -- SOAP Scenario: problem with SOAP Response/Fault Mapping

    Hi,
    I have a synchronous RFC --> PI --> SOAP scenario. The problem is that the message structure of the sending RFC doesn't match the web service structure.
    The (SAP standard) RFC has just a request/response message structure. Part of the response message structure is an exception structure.
    The web service has a request/response message structure, and in case of an error I get a SOAP:Fault.
    The problem is that I cannot configure this scenario without using BPM, since I would have to map either the SOAP response or the SOAP fault to the RFC response structure.
    Does anybody have another idea how to do this synchronous scenario (using message mapping) without BPM?
    BR
    Holger

    1)
    You must define 3 mappings:
    1) request
    2) response
    3) fault
    In the interface mapping, assign both the response and the fault mapping (2 and 3) on the response side. Is that clear?
    2)
    Otherwise, if something is still not clear: why do you want a fault at all? Why don't you return only a response message? We implement this kind of response:
    <response_MT>
      <ID>      (error ID)
      <system>  (target system)
      <error>   (error description)
    </response_MT>
    This way a fault message is not needed. But if you must have the fault, follow the first proposal; otherwise I propose the second.
    Thanks
    Rodrigo
    Edited by: Rodrigo Pertierra on Feb 25, 2008 11:52 AM

  • Executing ABAP program in RFC to File

    Hi,
    In the RFC-to-File scenario, the RFC is called from an ABAP program.
    When I execute the ABAP program for the first time it doesn't reach PI; only when I click the execute button multiple times does the execution reach PI and a message ID get created along with the required data.
    What is the problem? Why does it not reach PI the first time I execute it?

    Hi Ravi,
    Thanks for the response.
    >>> First you have to check whether the RFC destination is working or not. 1) Your program ID should match between the RFC destination and the RFC adapter configuration.
    I have tested the RFC destination and it works fine. I have also entered the same program ID in the adapter configuration.
    >>> 2) You should write COMMIT WORK at the end of the ABAP program.
    COMMIT WORK is present at the end of the ABAP program.
    A cache refresh was also done, but we are still facing the same problem.
    >>> 5) Check in SM58.
    We are getting the following errors randomly:
    Server repository could not create function template for 'ZFN_EXTRACT
    Commit fault: com.sap.aii.adapter.rfc.afcommunication.RfcChannelMismatchE
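    For reference, the sender pattern being discussed here looks roughly like this (a minimal sketch; 'Z_SEND_TO_PI' and 'PI_RFC_DEST' are placeholder names for the imported RFC function module and for the SM59 TCP/IP destination that carries the registered program ID of the RFC adapter, and ZEXTRACT_LINE is a hypothetical line type):
    data lt_data type standard table of zextract_line.
    * ... fill lt_data ...
    * tRFC call towards PI
    call function 'Z_SEND_TO_PI' in background task
      destination 'PI_RFC_DEST'
      tables
        it_data = lt_data.
    * the tRFC LUW is only dispatched at COMMIT WORK; without it the call
    * never leaves the sender system and nothing arrives in PI (check SM58)
    commit work.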

  • Synchronous RFC  scenario

    Hi,
    I have a two-part asynchronous scenario as shown below:
    JMS -> XI -> receiver RFC : sender RFC -> XI -> JMS.
    In this case I am calling the sender RFM from within the receiver RFM.
    When I execute the scenario, the first part runs successfully, and the receiver RFM calls the sender RFM correctly.
    But I see that the second part of the scenario is not triggered.
    How can I get it working?
    Thanks
    Raghu

    Hi Tim,
    There are no errors; all the adapters show green.
    Nothing is shown in SXMB_MONI for the second part; the first part is successful.
    The sender RFC is called within the receiver RFM, using 'IN BACKGROUND TASK' and a 'COMMIT WORK' after the call.
    Any idea what is missing?
    Thanks

  • How to check whether a batch input session is completed in ABAP program

    I have created an ABAP program to create a batch input session (with reference to RSBDCSUB). After creating the batch input session, I kick it off and read the execution log. However, sometimes I cannot read anything from the execution log because the batch input is executed in parallel to my program, i.e. at the time I try to read the log of a particular transaction, that transaction is still being processed or hasn't started processing yet.
    How can I check in the program whether a batch input session is completed?
    The code that corresponding to the triggering of batch input session:
    SUBMIT (SUBREPORT)
       USER MTAB-USERID
       VIA JOB MTAB-GROUPID
       NUMBER JNUMB
       WITH QUEUE_ID  EQ MTAB-QID
       WITH MAPPE     EQ MTAB-GROUPID
       WITH MODUS     EQ 'N'
       WITH LOGALL    EQ LMODUS
       AND RETURN.
    Or is there any method to wait here until the process is completed before further processing?

    Hi gundam,
    1. Or is there any method to wait here until the process is completed before further processing?
    There is no direct method to wait.
    2. Immediately after submitting in the background we cannot wait, nor can we loop and keep checking whether the background process has completed or not.
    3. To overcome such problems, we have to use another technique.
    4. We have to submit another job which gets triggered on the event SAP_END_OF_JOB, i.e. when the original job finishes, our new job is triggered automatically.
    5. This new job / program will then perform the further actions (see the sketch below).
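    A rough sketch of points 4 and 5 in code. Instead of the event, this variant uses the 'start after predecessor job' parameters of JOB_CLOSE to chain the follow-up job to the batch input job (JNUMB / MTAB-GROUPID from the coding above); Z_READ_BDC_LOG is a placeholder name for the follow-up report, and the exact JOB_OPEN / JOB_CLOSE parameters should be verified on your release:
    data: lv_jobname  type tbtcjob-jobname value 'Z_AFTER_BDC',
          lv_jobcount type tbtcjob-jobcount.
    * create the follow-up job
    call function 'JOB_OPEN'
      exporting
        jobname  = lv_jobname
      importing
        jobcount = lv_jobcount.
    * Z_READ_BDC_LOG evaluates the batch input log afterwards
    submit z_read_bdc_log
      via job lv_jobname number lv_jobcount
      and return.
    * release the job so that it starts only after the batch input job ends
    call function 'JOB_CLOSE'
      exporting
        jobcount          = lv_jobcount
        jobname           = lv_jobname
        pred_jobcount     = jnumb
        pred_jobname      = mtab-groupid
        predjob_checkstat = 'X'.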
    regards,
    amit m.

  • Execute ABAP Program - Error in Open File Cust_Dim.dat

    HI All,
    I am trying to implement the Data Services Designer scenario for extracting SAP application data (page 181 onwards). It transfers data from SAP ECC (KNA1) to an MSSQL table.
    I have the Data Services server installed on my system.
    I have implemented all the steps, but when I execute the job I get an error like:
    "Execute ABAP program <C:/Program Files/Business Objects/BusinessObjects Data Services/ZCUSTDIM_1.aba> error < Open File Error -- C:\Program Files\Business Objects\/Cust_Dim.dat>"
    Does anyone know about this? I have searched a lot on SDN but didn't find any solution.
    Thanks
    Ratnakar

    Hi all,
    I was having the exact same problem as the one explained here.
    I was a little mixed up, because I had assigned full privileges on the destination folder to one specific account called installsap, which I had used for installing SAP BO Data Services. This account is the one configured for the Business Objects Data Services service.
    However, I kept getting the same error, as if this account did not have enough privileges on the folder.
    I saw that there is another service called Server Intelligence Agent. I went through its configuration properties with the Central Configuration Manager, specifically the Log On As property, and it looked as if a non-specific system account was configured there.
    Then I opened services.msc (the Windows application) to check this very same service configuration, and it had NOTHING specified there, neither a Local System account nor a specific user account.
    So I stopped the service and specified the installsap account I had used for the Business Objects Data Services service. After that I verified that I could see this configuration through the Central Configuration Manager, and both tools now showed the same.
    Then I started the SIA service, retried executing the ABAP data flow, and the Error in Open File was gone.
    By the way, I am not sure whether the "Job Server Service" cited earlier in this thread is the same as the Server Intelligence Agent service I modified. Although I've been using DS for quite a considerable amount of time, I have never heard of such a "Job Server Service". Let me know if I am wrong.
    Anyway, it worked for me and I hope it works for someone else who faces the same error.
    Best regards!

  • Error in Posting Data Using ABAP Program.

    I am using a SAP (RFC) to Webservice scenario through XI.
    I call my RFC from an ABAP program with an RFC destination, and when I execute it, it goes into a short dump.
    It gives the error message
    " alternativeServiceIdentifier:party/service from channel configuration are not"
    What should I do about this error?
    regards,
    Jayasimha Jangam

    Hi
    Please look into these threads:
    alternativeServiceIdentifier: party/service from channel configuration are.
    Rfc sender problem(sap r/3 -se37) 'alternativeServiceIdentifier: party/serv
    Also, this blog will help you:
    /people/michal.krawczyk2/blog/2005/09/07/xi-why-dont-start-searching-for-all-errors-from-one-place
    Reward points if found useful.

  • ABAP program and FB for value mapping replication

    Hi,
    we are using a 40B system and I want to use value mapping replication in XI. To fill the data into the XI database I have to write an ABAP program and a function module which transfer the data out of R/3 into XI (via RFC). Does anybody have an example of how the program (and the function module) must look? I.e. I want to read table MVKE and extract the material number and the product hierarchy.
    Thanks and best regards
    Arnold

    Hi Arnold,
    First you need a table type with a structure like follows:
    operation
    groupid
    context
    identifier
    agency
    scheme
    (corresponding to the Interface ValueMappingReplication)
    All used data elements should have type string or charXX
    for example: operation - char10, groupid - char32, rest - char120
    Next you create a function module with attribute 'remote-enabled module'.
    The import parameter is your new table structure.
    Next you create an ABAP program. Here an example:
    report z_value_mapping .
    tables mvke.
    data:
    p_value_mapping type zvalue_mapping,
    p_value_mapping_table type zvalue_mapping_table.
    p_value_mapping-operation = 'Insert'.
    p_value_mapping-context = 'http://xi.com/Material'.
    select * from mvke where matnr between '170' and '501'.
    check not mvke-prodh is initial.
    * Create a value mapping group to join two entries.
    * use a unique 32 digit number.
    concatenate '00000000000000' mvke-prodh into p_value_mapping-groupid.
    translate p_value_mapping-groupid using ' 0'.
    * Store the mapping source as first entry to the group
    p_value_mapping-identifier = mvke-matnr.
    p_value_mapping-agency = 'SenderAgency'.
    p_value_mapping-scheme = 'MATNR'.
    append p_value_mapping to p_value_mapping_table.
    * Store the mapping target as second entry to the group
    p_value_mapping-identifier = mvke-prodh.
    p_value_mapping-agency = 'ReceiverAgency'.
    p_value_mapping-scheme = 'PRODH'.
    append p_value_mapping to p_value_mapping_table.
    endselect.
    * Push data to XI
    call function 'Z_VALUE_MAPPING' in background task
      destination 'IS_XID'
      exporting
        value_mapping = p_value_mapping_table.
    commit work.
    Import the RFC to the Integration Builder, create a mapping between your RFC and the interface ValueMappingReplication.
    Check this Blog for additional steps:
    /people/sreekanth.babu2/blog/2005/02/23/value-mapping-replication
    Choose names for context, agency and scheme which are useful for your scenario.
    Regards
    Stefan

  • Performance of ABAP program

    How do you take care of performance issues in your abap programs?

    Hi,
    You can also have a look at the following.
    Ways of Performance Tuning
    1.     Selection Criteria
    2.     Select Statements
    •     Select Queries
    •     SQL Interface
    •     Aggregate Functions
    •     For all Entries
    Select Over more than one Internal table
    Selection Criteria
    1.     Restrict the data via the selection criteria itself, rather than filtering it out in the ABAP code using the CHECK statement.
    2.     Select with selection list.
    Points # 1/2
    SELECT * FROM SBOOK INTO SBOOK_WA.
      CHECK: SBOOK_WA-CARRID = 'LH' AND
             SBOOK_WA-CONNID = '0400'.
    ENDSELECT.
    The above code can be much more optimized by the code written below, which avoids CHECK and selects with a selection list:
    SELECT  CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK
      WHERE CARRID = 'LH' AND
            CONNID = '0400'.
    Select Statements   Select Queries
    1.     Avoid nested selects
    2.     Select all the records in a single shot using into table clause of select statement rather than to use Append statements.
    3.     When a base table has multiple indices, the where clause should be in the order of the index, either a primary or a secondary index.
    4.     For testing existence , use Select.. Up to 1 rows statement instead of a Select-Endselect-loop with an Exit. 
    5.     Use Select Single if all primary key fields are supplied in the Where condition .
    Point # 1
    SELECT * FROM EKKO INTO EKKO_WA.
      SELECT * FROM EKAN INTO EKAN_WA
          WHERE EBELN = EKKO_WA-EBELN.
      ENDSELECT.
    ENDSELECT.
    The above code can be much more optimized by the code written below.
    SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB
        FROM EKKO AS P INNER JOIN EKAN AS F
          ON P~EBELN = F~EBELN.
    Note: A simple SELECT loop is a single database access whose result is passed to the ABAP program line by line. Nested SELECT loops mean that the number of accesses in the inner loop is multiplied by the number of accesses in the outer loop. One should therefore use nested SELECT loops  only if the selection in the outer loop contains very few lines or the outer loop is a SELECT SINGLE statement.
    Point # 2
    SELECT * FROM SBOOK INTO SBOOK_WA.
      CHECK: SBOOK_WA-CARRID = 'LH' AND
             SBOOK_WA-CONNID = '0400'.
    ENDSELECT.
    The above code can be much more optimized by the code written below, which avoids CHECK, selects with a selection list and fetches the data in one shot into an internal table:
    SELECT  CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK
      WHERE CARRID = 'LH' AND
            CONNID = '0400'.
    Point # 3
    To choose an index, the optimizer checks the field names specified in the where clause and then uses an index that has the same order of the fields . In certain scenarios, it is advisable to check whether a new index can speed up the performance of a program. This will come handy in programs that access data from the finance tables.
    Point # 4
    SELECT * FROM SBOOK INTO SBOOK_WA
      UP TO 1 ROWS
      WHERE CARRID = 'LH'.
    ENDSELECT.
    The above code is more optimized as compared to the code mentioned below for testing existence of a record.
    SELECT * FROM SBOOK INTO SBOOK_WA
        WHERE CARRID = 'LH'.
      EXIT.
    ENDSELECT.
    Point # 5
    If all primary key fields are supplied in the Where condition you can even use Select Single.
    Select Single requires one communication with the database system, whereas Select-Endselect needs two.
    Select Statements           contd..  SQL Interface
    1.     Use column updates instead of single-row updates
    to update your database tables.
    2.     For all frequently used Select statements, try to use an index.
    3.     Using buffered tables improves the performance considerably.
    Point # 1
    SELECT * FROM SFLIGHT INTO SFLIGHT_WA.
      SFLIGHT_WA-SEATSOCC =
        SFLIGHT_WA-SEATSOCC - 1.
      UPDATE SFLIGHT FROM SFLIGHT_WA.
    ENDSELECT.
    The above mentioned code can be more optimized by using the following code
    UPDATE SFLIGHT
           SET SEATSOCC = SEATSOCC - 1.
    Point # 2
    SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA
      WHERE CARRID = 'LH'
        AND CONNID = '0400'.
    ENDSELECT.
    The above mentioned code can be more optimized by using the following code
    SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA
      WHERE MANDT IN ( SELECT MANDT FROM T000 )
        AND CARRID = 'LH'
        AND CONNID = '0400'.
    ENDSELECT.
    Point # 3
    Bypassing the buffer increases the network load considerably.
    SELECT SINGLE * FROM T100 INTO T100_WA
      BYPASSING BUFFER
      WHERE     SPRSL = 'D'
            AND ARBGB = '00'
            AND MSGNR = '999'.
    The above mentioned code can be more optimized by using the following code
    SELECT SINGLE * FROM T100  INTO T100_WA
      WHERE     SPRSL = 'D'
            AND ARBGB = '00'
            AND MSGNR = '999'.
    Select Statements       contd…           Aggregate Functions
    •     If you want to find the maximum, minimum, sum and average value or the count of a database column, use a select list with aggregate functions instead of computing the aggregates yourself.
    Some of the Aggregate functions allowed in SAP are  MAX, MIN, AVG, SUM, COUNT, COUNT( * )
    Consider the following extract.
    Maxno = 0.
    Select * from zflight where airln = 'LF' and cntry = 'IN'.
      Check zflight-fligh > maxno.
      Maxno = zflight-fligh.
    Endselect.
    The above mentioned code can be much more optimized by using the following code.
    Select max( fligh ) from zflight into maxno where airln = 'LF' and cntry = 'IN'.
    Select Statements    contd…For All Entries
    •     The for all entries creates a where clause, where all the entries in the driver table are combined with OR. If the number of entries in the driver table is larger than rsdb/max_blocking_factor, several similar SQL statements are executed to limit the length of the WHERE clause.
         The plus
    •     Large amount of data
    •     Mixing processing and reading of data
    •     Fast internal reprocessing of data
    •     Fast
         The Minus
    •     Difficult to program/understand
    •     Memory could be critical (use FREE or PACKAGE size)
    Points that must be considered for FOR ALL ENTRIES:
    •     Check that data is present in the driver table
    •     Sorting the driver table
    •     Removing duplicates from the driver table
    Consider the following piece of code:
    Loop at int_cntry.
      Select single * from zfligh into int_fligh
        where cntry = int_cntry-cntry.
      Append int_fligh.
    Endloop.
    The above mentioned can be more optimized by using the following code.
    Sort int_cntry by cntry.
    Delete adjacent duplicates from int_cntry.
    If NOT int_cntry[] is INITIAL.
      Select * from zfligh appending table int_fligh
        For all entries in int_cntry
        Where cntry = int_cntry-cntry.
    Endif.
    Select Statements    contd…  Select Over more than one Internal table
    1.     It is better to use a view instead of nested Select statements.
    2.     To read data from several logically connected tables, use a join instead of nested Select statements. Joins are preferable only if all the primary key fields are available in the WHERE clause for the tables that are joined; if the primary keys are not provided in the join, the joining of the tables itself takes time.
    3.     Instead of using nested Select loops it is often better to use subqueries.
    Point # 1
    SELECT * FROM DD01L INTO DD01L_WA
      WHERE DOMNAME LIKE 'CHAR%'
            AND AS4LOCAL = 'A'.
      SELECT SINGLE * FROM DD01T INTO DD01T_WA
        WHERE   DOMNAME    = DD01L_WA-DOMNAME
            AND AS4LOCAL   = 'A'
            AND AS4VERS    = DD01L_WA-AS4VERS
            AND DDLANGUAGE = SY-LANGU.
    ENDSELECT.
    The above code can be more optimized by extracting all the data from the view DD01V:
    SELECT * FROM DD01V INTO  DD01V_WA
      WHERE DOMNAME LIKE 'CHAR%'
            AND DDLANGUAGE = SY-LANGU.
    ENDSELECT.
    Point # 2
    SELECT * FROM EKKO INTO EKKO_WA.
      SELECT * FROM EKAN INTO EKAN_WA
          WHERE EBELN = EKKO_WA-EBELN.
      ENDSELECT.
    ENDSELECT.
    The above code can be much more optimized by the code written below.
    SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB
        FROM EKKO AS P INNER JOIN EKAN AS F
          ON P~EBELN = F~EBELN.
    Point # 3
    SELECT * FROM SPFLI
      INTO TABLE T_SPFLI
      WHERE CITYFROM = 'FRANKFURT'
        AND CITYTO = 'NEW YORK'.
    SELECT * FROM SFLIGHT AS F
        INTO SFLIGHT_WA
        FOR ALL ENTRIES IN T_SPFLI
        WHERE SEATSOCC < F~SEATSMAX
          AND CARRID = T_SPFLI-CARRID
          AND CONNID = T_SPFLI-CONNID
          AND FLDATE BETWEEN '19990101' AND '19990331'.
    ENDSELECT.
    The above mentioned code can be even more optimized by using subqueries instead of for all entries.
    SELECT * FROM SFLIGHT AS F INTO SFLIGHT_WA
        WHERE SEATSOCC < F~SEATSMAX
          AND EXISTS ( SELECT * FROM SPFLI
                         WHERE CARRID = F~CARRID
                           AND CONNID = F~CONNID
                           AND CITYFROM = 'FRANKFURT'
                           AND CITYTO = 'NEW YORK' )
          AND FLDATE BETWEEN '19990101' AND '19990331'.
    ENDSELECT.
    1.     Table operations should be done using explicit work areas rather than via header lines.
    2.     Always try to use binary search instead of linear search. But don’t forget to sort your internal table before that.
    3.     A dynamic key access is slower than a static one, since the key specification must be evaluated at runtime.
    4.     A binary search using secondary index takes considerably less time.
    5.     LOOP ... WHERE is faster than LOOP/CHECK because LOOP ... WHERE evaluates the specified condition internally.
    6.     Modifying selected components using “ MODIFY itab …TRANSPORTING f1 f2.. “ accelerates the task of updating  a line of an internal table.
    Point # 2
    READ TABLE ITAB INTO WA WITH KEY K = 'X' BINARY SEARCH.
    IS MUCH FASTER THAN USING
    READ TABLE ITAB INTO WA WITH KEY K = 'X'.
    If TAB has n entries, linear search runs in O( n ) time, whereas binary search takes only O( log2( n ) ).
    Point # 3
    READ TABLE ITAB INTO WA WITH KEY K = 'X'.
    is faster than using
    READ TABLE ITAB INTO WA WITH KEY (NAME) = 'X'.
    Point # 5
    LOOP AT ITAB INTO WA WHERE K = 'X'.
    ENDLOOP.
    The above code is much faster than using
    LOOP AT ITAB INTO WA.
      CHECK WA-K = 'X'.
    ENDLOOP.
    Point # 6
    WA-DATE = SY-DATUM.
    MODIFY ITAB FROM WA INDEX 1 TRANSPORTING DATE.
    The above code is more optimized as compared to
    WA-DATE = SY-DATUM.
    MODIFY ITAB FROM WA INDEX 1.
    7.     Accessing the table entries directly in a "LOOP ... ASSIGNING ..." accelerates the task of updating a set of lines of an internal table considerably
    8.    If collect semantics is required, it is always better to use COLLECT rather than READ ... BINARY SEARCH and then ADD.
    9.    "APPEND LINES OF itab1 TO itab2" accelerates the task of appending a table to another table considerably as compared to “ LOOP-APPEND-ENDLOOP.”
    10.   “DELETE ADJACENT DUPLICATES“ accelerates the task of deleting duplicate entries considerably as compared to “ READ-LOOP-DELETE-ENDLOOP”.
    11.   "DELETE itab FROM ... TO ..." accelerates the task of deleting a sequence of lines considerably as compared to “  DO -DELETE-ENDDO”.
    Point # 7
    Modifying selected components only makes the program faster as compared to Modifying all lines completely.
    e.g,
    LOOP AT ITAB ASSIGNING <WA>.
      I = SY-TABIX MOD 2.
      IF I = 0.
        <WA>-FLAG = 'X'.
      ENDIF.
    ENDLOOP.
    The above code works faster as compared to
    LOOP AT ITAB INTO WA.
      I = SY-TABIX MOD 2.
      IF I = 0.
        WA-FLAG = 'X'.
        MODIFY ITAB FROM WA.
      ENDIF.
    ENDLOOP.
    Point # 8
    LOOP AT ITAB1 INTO WA1.
      READ TABLE ITAB2 INTO WA2 WITH KEY K = WA1-K BINARY SEARCH.
      IF SY-SUBRC = 0.
        ADD: WA1-VAL1 TO WA2-VAL1,
             WA1-VAL2 TO WA2-VAL2.
        MODIFY ITAB2 FROM WA2 INDEX SY-TABIX TRANSPORTING VAL1 VAL2.
      ELSE.
        INSERT WA1 INTO ITAB2 INDEX SY-TABIX.
      ENDIF.
    ENDLOOP.
    The above code uses BINARY SEARCH for collect semantics. READ BINARY runs in O( log2(n) ) time. The above piece of code can be more optimized by
    LOOP AT ITAB1 INTO WA.
      COLLECT WA INTO ITAB2.
    ENDLOOP.
    SORT ITAB2 BY K.
    COLLECT, however, uses a hash algorithm and is therefore independent
    of the number of entries (i.e. O(1)) .
    Point # 9
    APPEND LINES OF ITAB1 TO ITAB2.
    This is more optimized as compared to
    LOOP AT ITAB1 INTO WA.
      APPEND WA TO ITAB2.
    ENDLOOP.
    Point # 10
    DELETE ADJACENT DUPLICATES FROM ITAB COMPARING K.
    This is much more optimized as compared to
    READ TABLE ITAB INDEX 1 INTO PREV_LINE.
    LOOP AT ITAB FROM 2 INTO WA.
      IF WA = PREV_LINE.
        DELETE ITAB.
      ELSE.
        PREV_LINE = WA.
      ENDIF.
    ENDLOOP.
    Point # 11
    DELETE ITAB FROM 450 TO 550.
    This is much more optimized as compared to
    DO 101 TIMES.
      DELETE ITAB INDEX 450.
    ENDDO.
    12.   Copying internal tables using "ITAB2[] = ITAB1[]" is much faster than using "LOOP-APPEND-ENDLOOP".
    13.   Specify the sort key as restrictively as possible to run the program faster.
    Point # 12
    ITAB2[] = ITAB1[].
    This is much more optimized as compared to
    REFRESH ITAB2.
    LOOP AT ITAB1 INTO WA.
      APPEND WA TO ITAB2.
    ENDLOOP.
    Point # 13
    "SORT ITAB BY K." makes the program run faster than "SORT ITAB."
    Internal Tables         contd…
    Hashed and Sorted tables
    1.     For single read access hashed tables are more optimized as compared to sorted tables.
    2.      For partial sequential access sorted tables are more optimized as compared to hashed tables
    Hashed And Sorted Tables
    Point # 1
    Consider the following example where HTAB is a hashed table and STAB is a sorted table
    DO 250 TIMES.
      N = 4 * SY-INDEX.
      READ TABLE HTAB INTO WA WITH TABLE KEY K = N.
      IF SY-SUBRC = 0.
      ENDIF.
    ENDDO.
    This runs faster for single read access as compared to the following same code for sorted table
    DO 250 TIMES.
      N = 4 * SY-INDEX.
      READ TABLE STAB INTO WA WITH TABLE KEY K = N.
      IF SY-SUBRC = 0.
      ENDIF.
    ENDDO.
    Point # 2
    Similarly for Partial Sequential access the STAB runs faster as compared to HTAB
    LOOP AT STAB INTO WA WHERE K = SUBKEY.
    ENDLOOP.
    This runs faster as compared to
    LOOP AT HTAB INTO WA WHERE K = SUBKEY.
    ENDLOOP.

  • ABAP Program and Execution Caching?

    Hi All
    We have a strange problem with a custom ABAP program, possibly caching a SQL statement. Here is the problem:
    Transaction: ZTRANS_C1
    This transaction is about 4 years old and we noticed a problem with a SQL statement within the program which is causing timeouts and short dumps (it runs for 600ms). We managed to fix the statement and change the program, but it is still taking 600ms and timing out.
    So we created a new transaction ZTRANS_C2, which is an exact copy of ZTRANS_C1, and ran it. This took only 65ms.
    So is there any way in SAP or SQL to stop it caching the statements, or is this some other problem?
    Thanks
    Phil


  • New to SAP XI BPM scenario File -- RFC -- File

    Hi All,
    I am new to SAP XI BPM.
    My scenario is: the input parameters for the RFC are sent through a file, and the response from the RFC comes back into XI and is then written to a file.
    My doubt here is: how many containers do I have to create, and what is the use of a container?
    Thanks & Regards
    Siva

    Well, in short:
    A container is like a table whose structure has two fields, ELEMENT and VALUE, so it is roughly similar to declaring variables in general programming; in workflow terms we call them container elements. The container is the place where all the defined variables are stored as name/value pairs.
    Now, coming to the issue: make sure the file imported from XI has the structure Name/Type/Value.
    Count the number of lines and create the container dynamically using the class CL_SWF_CNT_CONTAINER from the ABAP program; it has lots of methods, check them.

  • ABAP program parallel processing

    Dear team,
    (Not sure if I have opened this message in the correct place.)
    We have a program (forecasting, related to production planning). It runs country-wise.
    For country XYZ, where there are many plants, the job runs without any glitch in the system.
    The same program runs for another country at a different time (with a different variant). The problem arises here: all the dialog work processes get clogged with sequential reads of table SWEQUEUE, and the internal RFC destinations (those pointing to the application server of ECC and 'NONE') get occupied in SM58.
    When this job runs it is impossible to log in to the system or even enter another transaction code (the system simply has no free dialog work process).
    What could be the reason? In SM58 I see that all transactions are either running, recorded or in error, e.g. 'Comm. table full' (which is caused by the gateway communication limit).
    All I see in SM58 is the function module IDOCS_OUTPUT_VIA_XML_HTTP running in parallel.
    Please help.
    Regards,
    Arthur

    Maybe you do not have other programs that use that much parallelism...
    By default, the parallel RFC groups defined in RZ12 should include some quota rules to avoid saturating the dialog work processes.
    Have a look at the parameter rdisp/rfc_check; I had to change it from the default 1 to 2 in a similar scenario.
    With the default value (1), only asynchronous RFC calls are restricted; with (2) the quotas are also applied to synchronous RFC calls.
    Regards
    https://help.sap.com/saphelp_nw70/helpdata/en/76/992c1f8731514c8467e77f115796c7/content.htm
    Level 2: Monitors, in addition to level 1, all RFCs started anew from asynchronous RFC sessions. This includes synchronous RFCs. So that applications that transmit a lot of RFCs can run at this level, the number of dialog processes used for RFCs may have to be increased (rdisp/rfc_min_wait_dia_wp may have to be reduced).
    Increase the value if you notice that your application server becomes flooded with RFCs and dialog mode is no longer possible, even though you have set rdisp/rfc_min_wait_dia_wp.
