Integrating Multiple Systems - Best Practice

Hi,
I need to integrate the following scenario; please let me know the best practice, with steps.
The scenario is end-to-end synchronous:
System A (supports open tech, i.e. a web service client) => SAP PI <=> Oracle (call stored procedure) => update in ECC => response back to system A (source)
Thanks
My3

Hi Mythree,
First, get the request from the web service into PI and map it to the stored procedure using a synchronous send step (Sync Send1) in the BPM, and get the response back. Once you have the response from Oracle, use another synchronous send step: take the response from the database and map it to ECC (use either a proxy or an RFC), which is Sync Send2 in the BPM, and get the update confirmation back. Once you have that response, send the response back to the source system. The steps in the BPM would be like this:
Start --> Receive --> Sync Send1 --> Sync Send2 --> Send --> Stop
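For reference, the JDBC receiver channel behind Sync Send1 executes the stored procedure on the Oracle side; the request message you map to simply describes that call. Below is a minimal JDBC sketch of the roughly equivalent call, purely for illustration: the procedure name GET_ORDER_STATUS, its parameters, and the connection details are hypothetical, and in the real scenario the adapter builds this call from your mapped message.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class StoredProcCallSketch {
    public static void main(String[] args) throws Exception {
        // Connection details are hypothetical; the PI JDBC receiver channel manages this for you.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "pi_user", "secret")) {
            // IN parameter carries the mapped request data, OUT parameter returns the result.
            try (CallableStatement cs = con.prepareCall("{call GET_ORDER_STATUS(?, ?)}")) {
                cs.setString(1, "4711");                   // e.g. order number from the source request
                cs.registerOutParameter(2, Types.VARCHAR); // status sent back to PI as the response
                cs.execute();
                System.out.println("Status: " + cs.getString(2));
            }
        }
    }
}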
These blogs might be useful for this integration:
/people/siva.maranani/blog/2005/05/21/jdbc-stored-procedures
/people/luis.melgar/blog/2008/05/13/synchronous-soap-to-jdbc--end-to-end-walkthrough
Regards,
---Satish

Similar Messages

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one file directory structure, as it is now, then the Unix team is afraid that we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+), or one with Planning apps where users can run large, long-running rules, I would recommend you separate the applications onto separate volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times -- although I've had mixed results in getting the server/disk SMEs to assist in these kinds of efforts.
    A more advanced thing to worry about is journaling filesystems that share a common cache for all disks within a VG.
    Regards,
    -John

  • Application preferences system - best practice

    Hi,
    I'm migrating a Forms application from 6i to 11g.
    The application source forms are the same for all possible deployments, which are set up in different databases (say, one database for each client).
    But as you may expect, each client wants a customized app, so we have to define some kind of preferences and take them into account in Forms (and DB packages).
    The problem:
    The application, as it was designed, has this customizing system spread over different solutions:
    A database table with one row for each custom parameter.
    Constants in a database package.
    Forms global variables defined at the main menu.
    Even worse, instead of defining a good set of properties, I'm finding a lot of code with "if the client is one of these then ... else ..." statements, sometimes implemented with INSTR and global variables defining groups of clients ... bufff....
    The question:
    I'd like to take advantage of the migration process to fix this a little. Can you give advice on a best practice for this?

    Thanks. I was hoping there would be something better than both approaches (package constants or parameter table) bundled within the database.
    Of course, the
    if customer name = 'COMPANY_A' then use this logic ...
    gives anyone the creeps.
    Instead of this, everything should be
    if logic_to_do_this = 'A' then ...
    There are two minor problems with the table approach:
    A single row with one column for each parameter? (You can control the parameter datatype, but you need DDL every time you add a parameter.)
    Or a parameter table with parameter name/value pairs? (Here you'd have to go with VARCHAR for everything, but you could have a set of pre-established conversion masks.)
    I prefer the second (you only need to establish masks for number and date parameters), but I'm a bit late, as the app has been working with a single-row parameter table from the beginning, and they didn't even wrap it with a getter function.
    In fact, in an old Forms application I developed where customization was paramount, I remember I used two tables, master/detail: the master table for the parameter "context" definition and the detail table for the values and their entry-into-force dates (the app worked 24x7 and users even needed to schedule changes to preferences in advance). A getter function returns the value currently in force.
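    To make the name/value idea concrete, here is a minimal sketch of the getter pattern. In the real app this would naturally live in a database package called from Forms; the APP_PARAMETERS table, its columns, and this Java wrapper are purely hypothetical, for illustration only.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.time.LocalDate;
    import java.time.format.DateTimeFormatter;

    public class AppParameters {
        private final Connection con;

        public AppParameters(Connection con) {
            this.con = con;
        }

        // Every value is stored as VARCHAR; typed getters apply the agreed conversion masks.
        public String getString(String name) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT param_value FROM app_parameters WHERE param_name = ?")) {
                ps.setString(1, name);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        throw new IllegalArgumentException("Unknown parameter: " + name);
                    }
                    return rs.getString(1);
                }
            }
        }

        public int getNumber(String name) throws Exception {
            return Integer.parseInt(getString(name));
        }

        public LocalDate getDate(String name) throws Exception {
            // One agreed date mask for all date parameters, instead of per-parameter DDL.
            return LocalDate.parse(getString(name), DateTimeFormatter.ISO_LOCAL_DATE);
        }
    }

    Client code then asks for, say, getNumber("MAX_INVOICE_LINES") (a made-up parameter name) instead of branching on the customer name.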

  • Setting Disks/Caches/Vault for multiple projects - Best Practices

    Please confirm a couple assumptions for me:
    1. Because Scratch Disk, Cache and Autosave preferences are all contained in System Settings, I cannot choose different settings for different projects at the same time (i.e. I have to change the settings upon launch of a new project, if I desire a change).
    2. It is good practice to set the Video/Render Disks to an external drive, and keep the Cache and Autosave Vault set to the primary drive (e.g. user:Documents:FCP Documents). It is also best practice to save the Project File to your primary drive.
    And a question: I see that the Autosave Vault distinguishes between projects, and the Waveform Cache Files distinguishes between clips. But what happens in the Thumbnail Cache Files folder when you have more than one project targeting that folder? Does it lump it into the same file? Overwrite it? Is that something about which I should be concerned?
    Thanks!

    maxwell wrote:
    Please confirm a couple assumptions for me:
    1. Because Scratch Disk, Cache and Autosave preferences are all contained in System Settings, I cannot choose different settings for different projects at the same time (i.e. I have to change the settings upon launch of a new project, if I desire a change).
    Yes
    2. It is good practice to set the Video/Render Disks to an external drive, and keep the Cache and Autosave Vault set to the primary drive (e.g. user:Documents:FCP Documents).
    Yes
    It is also best practice to save the Project File to your primary drive.
    I don't. And I don't think it matters. But you should back that file up to some other drive (like Time Machine).
    And a question: I see that the Autosave Vault distinguishes between projects, and the Waveform Cache Files distinguishes between clips. But what happens in the Thumbnail Cache Files folder when you have more than one project targeting that folder? Does it lump it into the same file? Overwrite it? Is that something about which I should be concerned?
    I wouldn't worry about it.
    o| TOnyTOny |o

  • SOA 11g  Composite Deployment across multiple Instances: Best Practice

    Hi,
    We have a requirement to deploy the composite across multiple instances (DEV, TEST, Production) without JDeveloper.
    We are using SOA 11.1.3.3 (cluster) and Linux.
    Please suggest the best practice for deploying the SOA composite.
    Thanks,
    AB

    Why are there different ways to deploy the composite in different environments? The higher the environment, the greater its business importance and the more restricted developer access becomes, hence there are several ways of deploying. If you are developing an application, you would not like to export a SAR and then log in to the EM console to deploy it. For a developer it is always very convenient to use the IDE itself for deployment, hence JDev is preferred for deployment to Dev instances.
    Once development finishes, the developer checks the artifacts into the version control system, and if you want to deploy to a test instance you have to check out the stable version, compile it and package it, and only then can you deploy. Hence for test instances ANT is the preferable mode of deployment. (Remember that developers may not be able to connect to the test environment directly, so deployment from JDev is not possible.)
    Once a configuration has been tested in the test environment, its SAR should be checked into the version control system so that it does not have to be recompiled and repackaged. This also makes sure that any artifact which has not been tested on the test instance cannot go to higher instances (like UAT/PROD). In Pre-Prod/UAT/Prod you may simply access the EM console and deploy the SAR which was already tested in the test instance. Remember that there will be very limited access to such critical environments, and with the role-based access of EM it is easy to control who deploys. Moreover, it is a more secure mode of deployment because only users with the appropriate privileges are able to deploy, and it is also easier to track the changes.
    What are the pros and cons if we use only one way to deploy the composite across the instances? As such, there are no major pros and cons. You may use EM for all environments, but it may be a little inconvenient for the developers/deployers of the test environment. You may also use ANT for all environments, but that is not suggested unless you have a very good and secure deployment process in place.
    Regards,
    Anuj

  • Embedding same images across multiple components, best practices?

    Hi Guys,
    Once again a great forum.
    I have a fairly large dashboard application, built from many components, which from an obvious maintenance point of view is preferred.
    To keep the look and feel the same across each component I reuse a lot of the icons; these icons have 3 states: disabled, normal and hot (rollover).
    The mouse roll-over and roll-out handlers are the same for each set of icons. At the moment each component embeds each icon and repeats the functions.
    This is not the optimum way to do this, so I'm looking for best practices. Should I build a separate ActionScript package with all the embedded images and roll-over/out functions, then import that into each component? But then ALL the images would be embedded in each component. Or should I embed all the images in the main container app -- when Flex sees the same image in a component, will it avoid embedding it again, or will it embed the images again?
    In the example below, 4 of the components have the same refresh icon and roll-over states, so this code is repeated (bad practice) in all 4. Moving it to a separate ActionScript package will make maintenance easier, but as stated above, will just one copy get embedded for the entire app, or will 4 copies get embedded?
    [Embed(source="images/refresh_24.png")]
    [Bindable]
    private var refreshIcon:Class;
    [Embed(source="images/refresh_24_hot.png")]
    [Bindable]
    private var refreshIconHot:Class;
    [Embed(source="images/refresh_24_dis.png")]
    [Bindable]
    private var refreshIconDis:Class;
    private function rollOverRefresh(event:Event):void {
        if (event.target.source != refreshIconDis) {
            event.target.source = refreshIconHot;
        }
    }
    private function rollOutRefresh(event:Event):void {
        if (event.target.source != refreshIconDis) {
            event.target.source = refreshIcon;
        }
    }

    Flex is able to collate those Embeds so they are only included once, IIRC.
    While it may seem like bad practice to include it in each component, it really isn't from a code-reusability standpoint. But you could probably continue on that path and create a custom component <mycontrols:RefreshButton> that has the embeds in one place for you.
    You could probably also do something similar with a style very easily:
    .MyButton {
        overSkin: Embed(source='../assets/orb_over_skin.gif');
        upSkin: Embed(source='../assets/orb_up_skin.gif');
        downSkin: Embed(source='../assets/orb_down_skin.gif');
    }
    Then apply that style to your button.
    You could also put the button into a single SWF file in Flash and include it that way to reduce the number of embeds. I never include PNG, JPG, GIF, etc. files directly, always SWF, as you get better compression that way IMO. Plus I just find it gives me greater flexibility... if I want to animate my skins in the future (a button that gleams, for instance), I already have SWFs in my code, so there's no need to change anything but the SWF.

  • Multiple Libraries - Best practice

    I have been using iPhoto for the last 5 years. At the end of the year I move the library to an external drive and use iPhoto Buddy to point to it. Now I have 6-7 libraries and I would rather have everything together.
    What have people found to be the best way to combine libraries so that all files end up in one directory on the FireWire drive, and then what is the best way to move this year's photos from the hard drive to the external drive later? Or is having multiple libraries best?
    It seems like I am always moving files to and from libraries, I lose any structure, and backing it up is a pain. Am I missing something here? Thanks in advance
    jf

    Jeff:
    Welcome to the Apple Discussions. The only way at present to merge multiple libraries into one library and retain keywords, rolls, etc., is to use the paid version of iPhoto Library Manager. That will get you one library. If that library gets very large, then you might have a storage problem on your boot drive, where you want to keep adequate free space for optimal performance. In that case multiple libraries might be best for you.
    FWIW, here's my photo strategy. I keep my photos in folders based on the shoot (each upload from the camera after an event, usually family get-togethers), dated for the event with a brief description. I then import that folder into iPhoto to get a roll with the same name. Since I take the step of uploading before importing to iPhoto, I also batch-rename the files with the international date format: YYYY-MM-DD-01.jpg and so forth.
    With iPhoto's new capability of using only aliases for the original/source files (I kept my original folders as a backup on a second drive), I used those original source folders for my combined library by importing them into a new V6 library using the alias method. So, for 17,400+ photos that take up 27 GB on my second drive, my iPhoto library on my boot drive only takes up 8 GB. This is very valuable for PowerBook owners who can't afford that much library space on the boot drive. The one downside to the alias method is that if you delete a photo from the library, the source/original file does not get deleted. Hopefully that will be addressed in the next update.
    If the external drive is dismounted, the library is still usable in that you can view thumbnails, create, delete and move albums, and add comments and keywords (though with some effort). You can't do anything that requires moving the thumbnails, or edit. I've created a workflow to convert from a conventional iPhoto library structure to an alias-based one.
    The downside to converting, as I mentioned, is that you will lose keywords, albums, etc. in the conversion, so it's not for everyone.
    Bottom line: for combining/merging libraries, iPhoto Library Manager is the only option at the present time. Hope this has been of some help to you.

  • Responsive projects autosizing for multiple devices -- best practices?

    While it's a nice idea to have the three different breakpoints on width that allow you to make some major layout changes, I'm having trouble making this work well across multiple devices. With the high resolutions of current mobile devices, this model just doesn't seem to work well. For example, the iPad 2 in landscape mode selects the desktop layout, which isn't bad, but in portrait mode it gets the tablet layout, which by default is set up to be landscape. I changed the heights for both breakpoints to better fit the iPad, but then the iPhone in landscape gets the tablet layout, requiring a lot of scrolling.
    Given the wide range of resolutions on mobile devices, it's not at all clear how a simplistic mechanism like this could ever work. For one thing, it sure seems like you need to provide both portrait and landscape layouts for tablets and phones, with Captivate using both width and height to select the best one. Also, is there any way to tell it to fill the screen without scrolling? How about automatically leaving clear the area where there's a status bar (or telling it to hide the status bar)?
    I could just create a separate version of the project for each device, but then I would lose much of the value of creating a Responsive Project in the first place. Have those of you with more experience found good ways to deal with these variables, or is the best option just to create separate versions of the project?

  • CC5.2: Logical and Cross Systems - Best Practices

    Hello,
    we are using CC 5.2 in a landscape of multiple SAP systems. In order to streamline the process of creating our rule set, I would like to clear up my confusion about the use of logical and cross systems.
    Here my questions:
    1. Are logical systems intended to encapsulate systems of successive stages (e.g. DEV1, QA1, PRD1), or to group systems of one stage which have the same structure of risks and can therefore share rules (e.g. DEVECC1, DEVECC2, ...)?
    2. I don't see entries of logical systems in the Rule Architect - wouldn't it make sense to create functions for logical systems instead of uploading function authorizations for each system?
    3. When you create functions for one sample system - is it sufficient to generate the rules for the logical system the sample system belongs to?
    4. Regarding the cross systems functionality: After creating risks across different systems - is it still necessary to create corresponding cross systems and to generate the rules from the cross systems menu?
    Thanks for your help in advance!
    Regards,
    Martin

    Hi Frank,
    thanks for your answer.
    I now see the handling of logical systems as independent from the system landscape tiers. After recreating the desired logical system a second time, it now also appears in the Rule Architect and in the upload dialog for function authorizations. I guess that after uploading them it is still necessary to generate the rules for logical systems in the configuration tab under 'Logical Systems'?!
    Regarding the cross-system functionality, I don't understand the redundancy of the different settings. Actually there are four options for cross-system checks. Let's go into detail:
    a) Functions can include actions of different systems...
    b) ...and they can be flagged as cross-system functions. --> Shouldn't the system automatically set the analysis scope of a function to 'cross system' when the function includes actions of different systems, or is this flag used for the analysis of logical systems?
    c) Risks can include functions referencing different systems. --> In my understanding it is essential that in this case the analysis scope is 'cross systems'. Is it still necessary to create cross systems and to generate rules for those? (Seems a bit redundant.) Or does one option take precedence over the other?
    d) Finally, you can set up 'cross systems'. --> How do cross systems relate to the other settings mentioned above? Do you have to create a cross system for each combination of systems which are related in risk definitions or function definitions? Or can all related systems be accumulated in one overall cross system?
    Hope there is not too much confusion now.
    Thanks and regards,
    Martin

  • JHeadstart and multiple developers - best practices

    Hi,
    What is the best way to use JHeadstart when you have a team of several programmers?
    We've seen that several examples have only one "Application Structure File", which makes it quite difficult for a group of programmers to work in parallel.
    What is the best way to handle this situation? Several "Application Structure Files"?
    Our project will have a team of 4 programmers, and we certainly want them to work efficiently with this tool.
    Any thought?
    Best regards,

    Michael et al,
    I am working on a big ADF project where we do the following:
    - one big Model project with all business component objects, but we organize the EO's, VO's and AM's in separate packages, and for the VO's, we make sub-packages per functional subsystem
    - several ViewController projects for each functional subsystem, each having its own struts-config-<subsystemname> file.
    - within a ViewController project, we sometimes have multiple Application Structure files generating into the same struts-config file.
    Please see this thread for more info on config management tools:
    Re: Jheadstart, JDeveloper and CVS and TeamWork
    Steven Davelaar,
    JHeadstart Team.

  • Oracle R12 HCM to SCM Key Integration Points Best Practice Documentation

    My client is implementing Oracle R12 HCM and SCM modules on a Single Global Instance and would like to know if there are any key integration points or best practice documentation.
    Impacted scenarios include:
    1.     Multiple Business Groups
    2.     Retiree Business Groups
    3.     Ex-Pats
    4.     Return to Workers (Payroll vs Pension)
    Thank you,
    Steve

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Advice for Soon-to-be MacPro Owner. Need Recs for Best Practices...

    I'll be getting a Quad Core 3 GHz with 1 GB of RAM, a 250 GB HD, and the ATI X1900 card. It will be my first new Mac in five years (replacing a well-used G4 TiBook 1 GHz).
    First the pressing questions: Thanks to the advice of many on this board, I'll be buying 4GB of RAM from Crucial (and upgrading the HD down the road when needs warrant).
    1) Am I able to add the new RAM with the 1G that the system comes with? Or will they be incompatible, requiring me to uninstall the shipped RAM?
    Another HUGE issue I've been struggling with is whether or not to migrate everything on my TiBook to the Mac Pro in one batch. I have so many legacy apps and fonts that I probably don't use any more and that have probably contributed to intermittent crashes and performance issues. I'm leaning towards fresh installs of my most crucial apps -- Photoshop with plugins, Lightroom, Firefox with extensions -- and just slowly and systematically re-installing software as the need arises.
    Apart from that...I'd like to get a consensus as to new system best practices. What should I be doing/buying to ensure and establish a clean, maintenance-lite, high-performance running machine?

    I believe you will end up with 2x512 MB RAM from the Apple Store. If you want to add 4 GB more, you'll want to get 4x1 GB RAM sticks. 5 GB is never an "optimal" amount, but people talk like it's bad or something, when it's simply that the last gig of RAM isn't accessed quite as fast. You'll want to change the placement so the 4x1 GB sticks are "first" and are all paired up nicely, so your other two 512 MB sticks only get accessed when needed. A little searching here will turn up explanations of how best to populate the RAM for your situation. It's still better to have 5 GB, where the fifth gig isn't quite as fast, than only 4 GB. The modules will not be incompatible, but you WILL want to remove the original RAM, put the 4 GB into the optimal slots, and then add the other two 512 MB chips back.
    Do fresh installs. Absolutely. Then only add those fonts that you really need. If you use a ton of fonts I'd get some font checking app that will verify them.
    I don't use RAID for my home machine. I use 4 internal 500gig drives. One is my boot, the other is my data (although it is now full and I'll be adding a pair of external FW). Each HD has a mirror backup drive. I use SuperDuper to create a clone of my Boot drive only after a period of a week or two of rock solid performance following any system update. Then I don't touch it till another update or installation of an app followed by a few weeks of solid performance with all of my critical apps. That allows me to update quicktime or a security update without concern...because some of those updates really cause havoc with people. If I have a problem (and it has happened) I just boot from my other drive and clone that known-good drive back to the other. I also backup my data drive "manually" with Superduper.
    You will get higher performance with RAID, of course, but doing that requires three drives (two for performance and one for backup) just for data/scratch, as well as two more for boot and backup of boot. Some folks can fit all their boot and data on one drive, but Photoshop and many other apps (FCP) really prefer data to be on a separate disk. My setup isn't the absolute fastest, but for me it's a very solid, low-maintenance, good-performing setup.

  • Best practices for apps integration with third party systems ?

    Hi all
    I would like to know if there is any documentation, from Oracle or your own, regarding best practices for integrating Oracle Apps with third-party systems.
    For example, say a customization in a given module (e.g. Payables) needs to provide data to a third-party system; consider the following:
    Outbound interface:
    1) Should the third-party system be given direct access to the Oracle database, to look for data in a particular payments table/view?
    2) Should Oracle Apps create a file for the third-party system, so that it can read it and do what it needs to do?
    Inbound:
    1) Should the third party log in directly and insert data into the tables which hold response data?
    2) Or again, should the third party create a file that Oracle Apps picks up for further processing?
    Again, there could be a lot of company-specific requirements, such as whether it has to be real time or not, etc.
    How do companies make sure third-party systems are not dipping directly into other systems (Oracle Apps or others), so that certain integration best practices are followed?
    How does enterprise architecture play a role in this? Can we apply SOA standards? Should we use request/reply via TIBCO, etc.?
    Many Oracle Apps implementation customizations interact more or less directly with third-party systems, including code to log in to the respective third-party systems and vice versa.
    Let me know if you have done this differently; that would help the Oracle Apps community.
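    To illustrate what I mean by a controlled outbound interface (the first option, but through a dedicated read-only view rather than base tables), here is a minimal sketch. The XX_PAYMENTS_OUT_V view, its columns, and the integration account are hypothetical, for illustration only.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class OutboundPaymentsReader {
        public static void main(String[] args) throws Exception {
            // Read-only integration account; it is granted SELECT on the interface view only,
            // never on the Oracle Apps base tables.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@appsdb:1521:PROD", "xx_integration_ro", "secret");
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT payment_id, supplier_name, amount, currency_code "
                   + "FROM xx_payments_out_v WHERE extract_flag = 'N'");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s %s %.2f %s%n",
                            rs.getString("payment_id"), rs.getString("supplier_name"),
                            rs.getDouble("amount"), rs.getString("currency_code"));
                }
            }
        }
    }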
    thanks
    rrb.

    You want to send an IDoc to a third-party (non-SAP) system.
    What kind of system is it? Can it handle HTTP requests, or can it handle web services?
    Which version of R/3 are you using?
    What mechanism does the receiving system have to receive data?
    Regards
    Raja

  • Best Practices for FSCM Multiple systems scenario

    Hi guys,
    We have a scenario to implement an FSCM Credit, Collections and Dispute Management solution for our landscape, which comprises the following:
    a 4.6c system
    a 4.7 system
    an ECC 5 system
    2 ECC6 systems
    I have documented my design, but would like to double-check and pick colleagues' brains regarding the following areas/questions.
    Business partner replication and synchronization: what is the best practice for replicating customers in each of the different systems to business partners in the FSCM system, (a) for the initial creation, and (b) for ongoing synchronization of new customers and changes to existing customers?
    Credit management: what is the best practice for updating exposures from SD and FI-AR from each of the different systems? Should this be real-time for each transaction from SD and AR (synchronous), or periodic, say once a day? (Assuming we can control this in the BAdI.)
    Is there any particular point to note in dispute management?
    Any other general note regarding this scenario?
    Thanks in advance. Comments appreciated.

    Hi,
    I guess that when you have information that SAP can read and act on, the interface has to be asynchronous (from non-SAP to FSCM).
    But when the credit analysis is done by a non-SAP party such as Experian, SAP sends the information about paid and unpaid invoices, and this external party returns a rating for the customer. All banks and big companies in the world do the same. For that you have the synchronous interface. This interface will update FSCM-CR (Credit), blocking the vendor or not, and decreasing or increasing their limit.
    So, for these 1,000 sales orders, you'll have to think with PI about how to create an interface for this volume. What parameters does SAP have to check? Is there a time interval to receive and send back? Will it be synchronous or asynchronous?
    Contact your PI team to help think through this information exchange.
    Does that answer your question?
    JPA
