IDOC Best Practices

Hi all,
I'm starting to build a number of scenarios using ECC -> PI -> IDoc, and I've found several references to get started, which point to:
https://wiki.sdn.sap.com/wiki/display/XI/SAP%20R3%20(Idocs)%20To%20XI%20--%20Steps%20Summarized
/people/swaroopa.vishwanath/blog/2007/01/22/ale-configuration-for-pushing-idocs-from-sap-to-xi
Do these links point to best practices for IDoc integrations? Does anybody have considerations about performance, or configuration tips for these scenarios? Or is following these posts a good way to begin?
Best regards

Hi Isaias,
The following link has very good information on IDoc scenario troubleshooting and analysis:
http://help.sap.com/saphelp_nw04/helpdata/en/6a/e6194119d8f323e10000000a155106/frameset.htm
Chapter 7.3 covers IDoc adapter related issues in the following troubleshooting guide:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/108ba391-e826-2a10-608f-c1769c51dc29
Thanks,
Ram.

Similar Messages

  • Best practice for migrating IDOCs?

    Subject: Best practice for migrating IDocs?
    Hi,
    I need to migrate some IDocs to another system for 'historical reference'.
    However, I don't want to move them using the regular setup, as I don't want the inbound processing to be triggered.
    The data that was created in the original system by the processed IDocs will be migrated to the new system using the migration workbench. I only need to migrate the IDocs as-is due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download the IDoc table contents to a local file and upload them in the new system. A quick and dirty approach, but it might also be a bit risky.
    B) Use LSMW. However, I'm not sure whether this is feasible for IDocs.
    C) Use ALE and set up a custom partner profile where inbound processing only writes the IDocs to the database, then send the IDocs from legacy to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDocs, once migrated, get the same status as they had in the old system.
    Any help/input will be appreciated
    Regards
    Karl Johan
    PS. For anyone interested in the business case: within the EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDocs. I'm working on a merger between two utility companies, and for legal reasons we need to move the IDocs. All other data is migrated using the migration workbench for IS-U.
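    For option A above, a minimal sketch of pulling the IDoc control records (table EDIDC) out of the legacy system, assuming the pyrfc library, placeholder connection details, and the standard RFC_READ_TABLE function module. Note that RFC_READ_TABLE returns each row as a single 512-character field, so the wide SDATA field of the EDID4 data records would need a different extraction route (for example an SE16 download):

    # Hedged sketch: export IDoc control records via RFC_READ_TABLE.
    # Connection parameters are hypothetical placeholders.
    from pyrfc import Connection

    conn = Connection(ashost="legacy-host", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")

    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE="EDIDC",                         # IDoc control records
        DELIMITER="|",
        OPTIONS=[{"TEXT": "CREDAT GE '20100101'"}],  # restrict by creation date
        FIELDS=[{"FIELDNAME": f} for f in
                ("DOCNUM", "STATUS", "MESTYP", "IDOCTP", "SNDPRN", "RCVPRN")],
    )

    with open("edidc_export.txt", "w") as out:
        for row in result["DATA"]:                   # each row is one WA string
            out.write(row["WA"] + "\n")

    conn.close()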

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for Governance, for example: Change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method? Server Manager backup and restore, etc.?
    And
    Best practice recommendations on how to upgrade to a different version of BPC, for example: upgrading from BPC 7.0 to 7.5 or 10.0?
    Kind Regards
    Daniel

  • EDI - 753 Routing Request and 754 Routing Instructions transactions: what SAP ECC6 EDI output type, message type, message code, and basic IDoc type are best practice to generate the IDoc?

    One of our trading partners wishes to implement the 753 Routing Request and 754 Routing Instructions. Can anyone give me best-practice answers to the following questions? We are on SAP ECC6.
    What application? i.e. V2, V7?
    What output condition type to use? i.e. LALE, LAT2, SEDI?
    What message type? i.e. SHPADV?
    What basic IDoc type? i.e. SHPMNT05?
    What message code? i.e. SHPM?
    What process code? i.e. SHPM?
    What function module? i.e. IDOC_OUTPUT_SHPMNT?
    Does the SAP Transportation module have to be configured for this to be implemented?
    Your comments are greatly appreciated; we have until Nov 1 to be compliant.

    Good Morning,
    While I did not get any responses to my question, I was able to determine what needed to be configured. I hope the below will help you; this is for SAP ECC6.
    The output application for the 753 Routing Request is "V7"
    The output type is "SEDI"
    Transmission medium is "6"
    The basic IDoc type is "SHPMNT05"
    When setting up your partner profile (WE20), add message partner role "SP" with message type "SHPADV", message code "753", and basic type "SHPMNT05"; under Message Control, add application "V7", message type "SEDI", and process code "SHPM".
    Pages 14 and 15 of http://scn.sap.com/thread/698368 were also helpful.
    I have not configured the Inbound 754.  For now the inbound 754 Routing Instructions will be emailed to our traffic department.
    Thank you,
    Have a great day,
    jane

  • Idoc processing best practices - use of RBDAPP01 and RBDMANI2

    We are having performance problems in the processing of inbound IDocs. The message type is SHPCON, and the transaction volume is very high. I am a functional consultant, not an ABAP developer, but I will try my best to explain our current setup.
    1)     We have a number of message variants for the inbound SHPCON message, almost all of which are set to trigger immediately upon receipt under the Processing by Function Module setting.
    2)      For messages that fail to process on the first try, we have a batch job running frequently using RBDMANI2.
    Almost every day we have instances of the RBDMANI2 job getting stuck and running for a very long time. We frequently have multiple SHPCON IDocs coming in containing the same material number, and IDocs frequently fail because the material in the IDoc has become locked. Once the stuck batch job is cancelled and the job starts running again normally, the materials unlock and the failed IDocs begin processing. The variant for the RBDMANI2 batch job is currently set with a packet size of 1 and without parallel processing enabled.
    I am trying to determine the best practice for processing inbound IDocs such as these for maximum performance in a very high-volume system. I know that RBDAPP01 processes IDocs in status 64 and 66, and RBDMANI2 is used to reprocess IDocs in all statuses. I have been told that setting the messages to trigger immediately in WE20 can result in poor performance. So I am wondering if the best practice is to:
    1)     Set messages in WE20 to Trigger by background program
    2)     Have a batch job running RBDAPP01 to process inbound idocs waiting in status 64
    3)     Have a periodic batch job running RBDMANI2 to try and clean up any failed messages that can be processed
    I would be grateful if somebody more knowledgeable than myself on this can confirm the best practice for this process and comment on the correct packet size in the program variant and whether or not parallel processing is desirable.  Because of the material locking issue, I felt that parallel processing was not desirable and may actually increase the material locking problem.  I would welcome any comments.
    This appeared to be the correct area for this discussion based upon other discussions.  If this is not the correct area for this discussion, then I would be grateful if the moderator could re-assign this discussion to the correct area (if possible) or let me know the best place to post it.  Thank you for your help.

    Hi Bob,
    Not sure if there is an official best practice, but note 1333417 (Performance problems when processing IDocs immediately) does state that immediate processing is not a good option for high volumes.
    I'm hoping that for SHPCON there is no dependency in the IDoc processing (i.e. it doesn't matter whether or not they're processed in the same sequence); otherwise it'd add another level of complexity.
    In the past for the high volume IDoc processing we scheduled a background job with RBDAPP01 (with parallel processing) and RBDMANIN as a second step in the same job to re-process the IDocs with errors due to locking issues. RBDMANI2 has a parallel processing option, but it was not needed in our case (actually we specifically wouldn't want to parallel-process the errors to avoid running into a lock issue again). In short, your steps 1-3 are correct but 2 and 3 should rather be in the same job.
    Also I believe we had a designated server for the background jobs, which helped with the resource availability.
    As a side note, you might want to confirm that the performance issues are caused only by the high volume. An ABAPer or a Basis admin should be able to run a performance trace. There might be an inefficiency in the process that could be adding to the performance issue as well.
    Hope this helps.
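    The locking trade-off described in this reply can be illustrated outside SAP with a small generic sketch (plain Python, not SAP code): parallel workers that process IDocs for the same material contend for one lock, so extra parallelism mainly re-creates the enqueue failures instead of adding throughput:

    # Generic illustration of lock contention between parallel IDoc workers.
    import threading
    from collections import defaultdict

    idocs = [("IDOC_1", "MAT_A"), ("IDOC_2", "MAT_A"), ("IDOC_3", "MAT_B")]
    material_locks = defaultdict(threading.Lock)  # stands in for SAP enqueue locks
    failed = []

    def process(docnum, material):
        # A non-blocking acquire mimics an enqueue failure (status 51)
        # instead of waiting for the lock to be released.
        if material_locks[material].acquire(blocking=False):
            try:
                print(f"{docnum}: posting for {material}")
            finally:
                material_locks[material].release()
        else:
            failed.append(docnum)
            print(f"{docnum}: {material} locked, left for reprocessing")

    threads = [threading.Thread(target=process, args=doc) for doc in idocs]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # A serial second step (like RBDMANI2 without parallel processing)
    # would now retry the failed IDocs one at a time.
    print("failed in parallel run:", failed)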

  • Best practice for creating RFC destination entries for 3rd parties (BizTalk)

    Hi,
    We are on SAP ECC 6 and we have been creating multiple RFC destination entries for external 3rd-party applications such as BizTalk and others, using the TCP/IP connection type and sharing the program ID.
    The RFC connections carrying IDoc data use synchronous mode for the few time-critical interfaces and asynchronous mode for the rest. RFC destination entries have been created for many interfaces, each with a unique RFC destination and its corresponding port defined in SAP.
    We have both inbound and outbound connectivity. With the large number of RFC destinations being added, we wanted to review them, check with others who have encountered a similar situation, and learn from their experiences.
    We also wanted to know if there are any best practices to optimize the number of RFC destinations.
    Here were a few suggestions we had in mind to tackle the same.
    1. Create a unique RFC destination for every port defined in SAP for external applications such as BizTalk, for as many connections as needed. (This would mean one for inbound and one for outbound.)
    2. Create one single RFC destination entry for the external host/application, with the external application reading the IDoc control record to decide what action to perform at its end.
    3. Create RFC destinations based on the modules they link with, such as materials management, sales and distribution, and warehouse management. This would limit the number of RFC destinations to be created and make the flow of data simple to understand.
    I have checked the SAP Best Practices website, SAP OSS notes, and help pages but could not find the specific information I was after.
    I do understand we can have an unlimited number of RFC destinations and maximum connections using the appropriate profile parameters for the gateway, RFC, client connections, and additional app servers.
    I would appreciate it if you could suggest the best architecture or practice for setting up RFC destinations in an optimized manner.
    Thanks in advance
    Sam

    Not easy to give a perfect answer.
    1. Create a unique RFC destination for every port defined in SAP for external applications such as BizTalk, for as many connections as needed. (This would mean one for inbound and one for outbound.)
    -> Be careful if you have multiple clients (for example in acceptance): RFC destinations are client-independent but ports are not! You could run into trouble.
    2. Create one single RFC destination entry for the external host/application, with the external application reading the IDoc control record to decide what action to perform at its end.
    -> Could be the best solution... it's easier to create partner profiles, and the control record will contain the correct partner.
    3. Create RFC destinations based on the modules they link with, such as materials management, sales and distribution, and warehouse management. This would limit the number of RFC destinations to be created and make the flow of data simple to understand.
    -> For this, consider option 2 instead.
    For example, we send to our message broker with one RFC destination, carrying multiple IDoc types, different partners, and different ports.
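    A hedged sketch of why option 2 scales: with the standard inbound function module IDOC_INBOUND_ASYNCHRONOUS, the partner and message type travel inside the IDoc control record, so one shared destination/port can carry many interfaces. This assumes the pyrfc library; the connection details and names such as BIZTALK_PORT are hypothetical placeholders, and the segment data is elided:

    # Hedged sketch: one destination, routing decided by the control record.
    # In production this would typically arrive via tRFC; a plain call is
    # enough to show the idea.
    from pyrfc import Connection

    conn = Connection(ashost="sap-host", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")

    control = [{
        "TABNAM": "EDI_DC40",
        "DIRECT": "2",             # inbound
        "IDOCTYP": "SHPMNT05",     # basic type
        "MESTYP": "SHPADV",        # message type drives inbound processing
        "SNDPOR": "BIZTALK_PORT",  # one shared port for all interfaces
        "SNDPRT": "LS",
        "SNDPRN": "BIZTALK",       # partner varies per sender, not per destination
        "RCVPRT": "LS",
        "RCVPRN": "SAPCLNT100",
    }]
    data = [{"SEGNUM": "000001", "SEGNAM": "E1EDT20", "SDATA": "..."}]

    conn.call("IDOC_INBOUND_ASYNCHRONOUS",
              IDOC_CONTROL_REC_40=control,
              IDOC_DATA_REC_40=data)
    conn.close()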

  • Best practice for retail store cycle count

    Hi experts
    Do you have any experience/best practices in designing a cycle count process for your clients?
    By cycle count, I mean frequent (daily) counting for specific item categories or exceptional count requests.
    My client wants to implement a cycle count process to minimize the effort of the full stocktakes, which happen around 3 times per year.
    They are using SAP as the core system to keep track of inventory, together with an external store system, so some considerations would be, for example:
    - whether the count variance or the counted quantity should be interfaced back to SAP
    - whether SAP or the POS should be the source that initiates a stock count
    It would be very nice if you can share some best practice as in the process flow, system information flow, etc.
    Best regards
    Dominic

    Hello
    Both can be done, but since, as you say, SAP is the core system at least for tracking inventory, it is better to create and integrate the inventories in SAP as well.
    The main advice I would give is to avoid custom tables for inventory integration, because standard and custom tables will sometimes not be updated consistently and support would be needed; stick to standard transactions.
    If the inventory is centralized in SAP, you can, for example, count with PDAs and send the counts to an SAP connector, where an XML file is transformed into an IDoc for SAP.
    One important parameter to consider is the inventory freeze. You have to decide whether or not to take into account stock movements that occur after your inventory document is created. If you do, the variance after the counting will depend on the initial stock value and also on the movements.
    If you don't, the variance will only reflect your initial stock value, without considering the stock movements.
    Hope it helps you,
    Génia.
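    A generic sketch of the XML-to-IDoc step mentioned above: parse a PDA count file and emit fixed-width (segment name, data) rows that a connector could wrap into an inbound physical-inventory IDoc. The XML layout, the field widths, and the segment names E1COUNT_HDR/E1COUNT_ITM are hypothetical placeholders rather than standard SAP segments; the real basic type would be chosen during interface design:

    # Hypothetical sketch: turn a PDA count file into IDoc-style segment rows.
    import xml.etree.ElementTree as ET

    COUNT_XML = """
    <count store="R100" date="20240131">
      <item article="000000000000012345" qty="12" uom="EA"/>
      <item article="000000000000067890" qty="3"  uom="EA"/>
    </count>
    """

    root = ET.fromstring(COUNT_XML)
    segments = [("E1COUNT_HDR", f"{root.get('store'):<10}{root.get('date')}")]
    for item in root.findall("item"):
        sdata = (f"{item.get('article'):<18}"
                 f"{float(item.get('qty')):>13.3f}"
                 f"{item.get('uom'):<3}")
        segments.append(("E1COUNT_ITM", sdata))

    for segnam, sdata in segments:
        print(f"{segnam:<27}{sdata}")  # fixed-width layout, as in EDID4-SDATA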

  • Best Practice for Placing meta tags in HEAD of document

    Good afternoon,
    We are utilizing Oracle UCM's ability to dynamically generate pages based on page templates, region templates, and contributor data files that contributors may edit and publish. We have a specific need to include meta tags in the head section of each page that are custom to the data present in each contributor data file. Including the meta tags in the region template that pertains to a section of our website does not work, as it does not include a <head> section. Nor can I place the meta tags in the page template, as the data in each meta tag is specific to the metadata surrounding each contributor data file.
    My question is this: Is there a best practice, or Oracle-supported method of accomplishing this task? I have to believe there is some way to accomplish this, as the need for meta tags for social media sites and search engines is very common.
    Thank you very much for your time,
    Josh

    Thanks for the reply, Jiri.
    I believe that the answer is 'yes' to both of your questions above. I do have a question about the formatting of the meta tag itself, as my main question is how to extract the needed information from a particular CDF so that it is displayed in the <head> section located in the page template.
    I have a Region Definition for the content section of my page with a 'Title' field, which a contributor can fill in, save, and create a Contributor Data File with their own specified title. Let's also say that my page template, which formats the entire page, looks like this (rough):
    <html>
      <head>
      <title>Web page</title>
      </head>
      <body>
      <!-- placeholder for header section -->
      <!-- placeholder for content section -->
      <!-- placeholder for footer section -->
      </body>
    </html>
    What would my <meta> tag look like for the following:
      <meta property="og:title" content="Title of the content section"/>
    Let's name our templates/definitions:
    Page Template: PT_META
    Region Template: RT_META
    Region Definition: RD_META
    Contributor Data File: CDF_META
    In other words, how do I extract the information included in the Contributor Data File (as well as its metadata) for use in the Page Template? Is there some sort of Idoc Script call that can be used?
    Thank you so much,
    Josh
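    As a language-neutral sketch of the extraction Josh describes: a contributor data file is an XML file, so pulling the 'Title' element and rendering the og:title tag looks like the snippet below. The wcm:root/wcm:element layout and the namespace URI are assumptions about the CDF structure; inside UCM itself this would be an Idoc Script call in the page template (for example Site Studio's ssIncludeXml, which should be verified against the UCM version in use):

    # Sketch only: extract a CDF element and render the og:title meta tag.
    # The XML structure and namespace below are assumptions.
    import xml.etree.ElementTree as ET

    CDF_XML = """
    <wcm:root xmlns:wcm="http://www.stellent.com/wcm-data/ns/8.0.0">
      <wcm:element name="Title">Title of the content section</wcm:element>
    </wcm:root>
    """

    NS = {"wcm": "http://www.stellent.com/wcm-data/ns/8.0.0"}
    root = ET.fromstring(CDF_XML)
    title = root.find("wcm:element[@name='Title']", NS).text

    print(f'<meta property="og:title" content="{title}"/>')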

  • Best practice on integration message augmentation via user exit or RFC call

    I am looking for documentation that would provide best practices on whether to use user exits to augment the data on an IDoc, or to forward the standard SAP-produced IDoc to PI, where RFC calls are made to augment the data as required for the specific target system.
    I am sure there are pros and cons for both solutions, but I am hesitant to use user exits since that moves the knowledge of what a target system wants from the integration layer to the source layer. If a second target system comes along in the future, the user exit becomes more complicated with additional target-specific requirements.
    Any links to best practice documentation on this subject are greatly appreciated.
    Edited by: Sean Sweeney on Oct 15, 2009 6:59 PM

    Hi Steve,
    You might have been trying to find a solution for a long time. If I understood your question correctly, let me clarify a few points.
    You are trying to access a BEx query that is designed with exits in the background, based on the logic, and you are calling all the dimensions and key figures in a single connection. Then you are trying to map that data into the charts.
    Steve, try to make more connections based upon the logic and split them: use the same query, but split it into sales per customer group, sales per day, and sales per week by making three different connections. You can merge the prompts from all connections.
    Hope this Helps!!!
    Sorry if I misunderstood your question.
    --SumanT

  • Best practices for Apps integration with third-party systems?

    Hi all
    I would like to know if there is any document, from Oracle or of your own, regarding best practices for Apps integration with third-party systems.
    For example, let's say a customization in a given module (e.g. Payables) needs to provide data to a third-party system; consider the following:
    Outbound interface:
    1) Should the third-party system be given direct access to the Oracle database, to a particular payments table/view, to look for data?
    2) Should Oracle create a file for the third-party system, so that it can read it and do what it needs to do?
    Inbound:
    1) Should the third party directly log in and insert data into the tables that hold response data?
    2) Again, should the third party create a file that Oracle Apps picks up for further processing?
    Again, there could be a lot of company-specific requirements, such as whether it has to be real time or not, etc.
    How do companies make sure third-party systems are not directly dipping into other systems (Oracle Apps or others), so that certain integration best practices are followed?
    How does enterprise architecture play a role in this? Can we apply SOA standards? Should we use request/reply via TIBCO, etc.?
    Many Oracle Apps implementation customizations interact more or less directly with third-party systems, including code that logs into the respective third-party systems and vice versa.
    Let me know if you have done this differently; that would help the Oracle Apps community.
    thanks
    rrb.
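    For outbound option 2 above, the usual safeguard is worth sketching: write the extract under a temporary name, rename it atomically, then drop a semaphore file so the third party never reads a half-written extract. The paths and the payments rows are illustrative assumptions (the data would really come from a payments table/view):

    # Generic sketch of a file-based outbound interface with a semaphore file.
    import csv
    import os
    from pathlib import Path

    outbox = Path("/interfaces/payables/outbound")  # hypothetical directory
    payments = [("PAY001", "VENDOR_A", "1500.00"),  # would come from a DB view
                ("PAY002", "VENDOR_B", "230.50")]

    tmp = outbox / "payments_20240131.csv.tmp"
    final = outbox / "payments_20240131.csv"

    with open(tmp, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["payment_id", "vendor", "amount"])
        writer.writerows(payments)

    os.rename(tmp, final)  # atomic on the same filesystem
    (outbox / "payments_20240131.done").touch()  # semaphore: extract complete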

    You want to send an IDoc to a third-party (non-SAP) system.
    What kind of system is it? Can it handle HTTP requests,
    or
    can it handle web services?
    Which version of R/3 are you using?
    What mechanism does the receiving system have to receive data?
    Regards
    Raja

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts who perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Best practice file SMB18_SE37_O011_Q78_878.TXT

    Hi! I am going through the Best Practices site for the chemical industry. I am keen to find out how the data is structured for PROACT01 IDoc processing. Does anyone happen to have the file, or a sample file, for the inbound stock data handling? I have gone through the BP website but could only find the configuration and scenario guides.
    Regards
    SF

    No answer

  • Could scenario 161 of SAP Best Practices help to generate automatic postings?

    Dear All,
    We need to generate automatic postings for sales costs and admin costs in FI. We have these sums on the corresponding cost center groups.
    Could scenario 161 of SAP Best Practices help us?
    Best regards,
    Kamila

    No answer

  • SAP Best Practice: Problems with Loading of Transaction Data

    Hi!
    I am about to implement SAP Best Practices scenario "B34: Accounts Receivable Analysis".
    Therefore I am loading the data from an SAP ERP IDES system into an SAP NetWeaver 2004s system.
    My problems are:
    When I try to load the transaction data for InfoSources 0FI_AR_4 and 0FI_AR_6, I get the following errors/warnings:
    when I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)"
    On the right side I see some actions that are also "yellow", e.g. "DataStore Activation (Change Lo ): not yet activated".
    As a result I cannot see any data in tcode "RSRT" when executing the queries "0FIAR_C03/...".
    The problems there:
    1) Input help of the web template of query don't contain any values
    2) no data will be shown
    Can some one help me to solve this problem?
    Thank you very much!
    Jürgen

    Be in the monitor window where you got the issue below:
    when I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)"
    Go to Environment in the menu options, Transact. RFC ---> In the source system..., give the logon details, press Enter, and from there enter the correct target destination (your BI server) and execute.
    If you find some IDocs pending there, push them manually using F6,
    then come back to your load and refresh.
    If it still doesn't turn green, you can manually change the status to red in the STATUS tab, then go to the Processing tab, expand your processing details, right-click on the data packet which was not yet updated, and select manual update.
    It shows a busy status; when it comes out of that, refresh once again.
    Regards,

  • Best Practices Invoice communication to 3rd party

    Hi All,
    I am looking for some pointers on best practices for 3rd-party invoice communication in FMCG.
    Background: in-house printing, enveloping, and posting to the customer is an expensive option for the client, compared to sending invoice data to a 3rd party and having the 3rd party do the printing, enveloping, and posting.
    Currently a text file is sent to the 3rd party, which is enriched with a logo etc. and merged into a template to prepare the final invoice document. This is then printed by the 3rd party and sent to the customer.
    Possible solution: we can continue with a similar approach, where ECC sends an IDoc to PI and PI converts the data to text format for the 3rd party to enrich and print. However, this requires development in both ECC and PI, and testing at ECC, PI, and the 3rd party. Is the suggested approach the best industry practice?
    I would appreciate it if you could share your thoughts on the best practices adopted in FMCG for third-party communication.
    Thanks and Regards,
    Namish
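    A generic sketch of the IDoc-to-text conversion described in the possible solution above, of the kind the PI mapping would perform for the print bureau. The XML below is a simplified INVOIC-like structure rather than a full INVOIC02 payload, and the pipe-delimited layout is an illustrative assumption to be agreed with the 3rd party:

    # Sketch only: flatten a simplified INVOIC-like IDoc XML to delimited text.
    import xml.etree.ElementTree as ET

    IDOC_XML = """
    <INVOIC>
      <E1EDK01><BELNR>90000001</BELNR><CURCY>EUR</CURCY></E1EDK01>
      <E1EDP01><POSEX>000010</POSEX><MENGE>5</MENGE><NETWR>100.00</NETWR></E1EDP01>
      <E1EDP01><POSEX>000020</POSEX><MENGE>2</MENGE><NETWR>40.00</NETWR></E1EDP01>
    </INVOIC>
    """

    root = ET.fromstring(IDOC_XML)
    header = root.find("E1EDK01")
    lines = [f"H|{header.findtext('BELNR')}|{header.findtext('CURCY')}"]
    for item in root.iter("E1EDP01"):
        lines.append(f"I|{item.findtext('POSEX')}|{item.findtext('MENGE')}"
                     f"|{item.findtext('NETWR')}")

    print("\n".join(lines))  # pipe-delimited text handed to the print bureau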

    The best possible solution for your requirement is to maintain customer master data with email addresses in the customer general data tab. Take help from your ABAPer and Basis administrator to activate email sending. Maintain an output type for email just like the one you are using for printing. You must have a smart form for your invoice with the company logo and all other information. When you create an invoice, a soft copy of it in PDF format will be sent automatically to that particular customer.
    For this you need to do some configuration for the output type. The ABAPer will help you convert the invoice into PDF and the Basis administrator will activate email sending. You can also search the forum for more help; email output in PDF format has been discussed many times.
    Hope it will help you in your requirement.
