Best practice for Documenting SOA Composites

Hi,
I am looking for general guidelines or best practices for creating documentation for the composites developed as part of a project.
Are there any plugins that help export to Visio or another tool?
I don't see a Create JPEG button in the composite editor like the one in the BPEL editor, so any suggestions for documenting that?
In general, I would like your opinions/suggestions on adopting a process for better documentation.
Thanks.

Hi,
There are no formal guidelines or best practices that are commonly followed for documentation.
You can integrate your source control system with JDeveloper, but that only helps during the coding process.
We have used OER (Oracle Enterprise Repository) in our project for maintaining documentation and the relationships among the different artifacts (XSDs, WSDLs, BPEL processes, mediators, etc.).
Thanks

Similar Messages

  • Best Practice For Working on Composite In Team

    Hello,
    I would like to know what the best practice is for multiple team members working on a single composite.
    We have a core services module in which a single composite contains many services. So, to finish on time, we would like several members to work on it simultaneously.
    In such scenarios, if someone adds a new adapter or other services, composite.xml changes.
    Saving it would overwrite other members' changes. Also, it is not possible for several people to hold a lock on the same file simultaneously through a version control mechanism.
    Please let us know what should be the best practice in such scenarios.
    Thanks-
    Ashish

    You can certainly use version control software with JDeveloper. You may refer to:
    http://www.oracle.com/technetwork/articles/soa/jimerson-config-soa-355383.html
    I think that without a version control mechanism (like Subversion) it won't be easy to work in a multi-developer environment. If you really don't have source and version control, manual merging will be required, which is error prone and costly in time and effort.
    Regards,
    Anuj

  • Best practice for documenting Hyperion BQY's?

    I was wondering if anyone had any suggestions on the best way to document a BQY within the BQY itself. I do use comments inside the calculated fields, but I was looking for a way to document the complete BQY. Any thoughts or ideas would be greatly appreciated.
    Thanks,

    I have used a couple of different approaches. User documentation has been kept on a notes dashboard, and developer documentation has been kept as a comment section in the Document Scripts. I often keep a history of technical changes in the Document Scripts as well.
    I have also written documentation in a separate file, printed that file to PDF, and then put a link to the document on the dashboard, whether it is hosted on the EPM Server as static content or in a document library hosted on a portal like SharePoint. This approach is often good if you are using use cases, design docs, etc. and want them available.

  • Best Practice for documenting projects BW

    Hi people.
    We are currently studying how to document new application developments in SAP BW.
    Is there a documentation standard recommended by SAP, or a good practice I can use to create and document objects developed in SAP BW?
    Can someone help me?

    Hi Marques, in my experience this varies from one customer to another. Every site has different practices, document formats, tools, habits, etc.
    What I found very useful in BW is the metadata repository that is part of transaction RSA1. You can easily get some nice screenshots, such as data flows. Moreover, in BW 7.x you can get documentation on particular transformations, DTPs, etc.: just select the object with a left mouse click and hit the F1 key.
    BR
    m./

  • Best practices for folder structure in SOA Project

    In my project, I have more than 10 BPEL processes along with several DBAdapters and some Human Tasks. For each BPEL process, JDeveloper creates .bpel, .componentType, and WSDL files. For the DBAdapters JDeveloper creates a lot of files, and the same goes for the Human Tasks.
    By default, JDeveloper puts all these files in the root directory of the project, and it looks messy having hundreds of files there.
    How can I organize all these files?
    What are the best practices for folder structure in SOA Project ?
    Thanks

    Yes, Yatan. I did notice that the polling service WSDL is disabled in EM. But I want to try this approach because I want to move all the database-related pieces into one composite. I have another composite with three BPEL processes. One BPEL queries an external web service and saves the response in the database. One BPEL just fires that first (query) process and responds to the caller with a unique id. Another BPEL keeps polling the database for 5 seconds (I'm using a Pick activity); if there is a new record in the database it returns it to the caller, and if there are no new records it returns empty after 5 seconds.
    If I put all of these in one composite, with that many files in the root directory, the composite really looks messy.
    As you said, I want to have two composites: the first composite will have the three BPEL processes, and the second will have the DB polling and DB save. I will have another project for MDS to store the XSDs.
    But because I can't have the polling service as an exposed service, I have to keep it in the same composite as the BPEL, which brings me back to a big, messy project.
    Is there any way I can separate everything DB-related, including polling, into a separate composite?
    Thanks
    --Sreeny

  • Anybody has documentation on the SAP Best Practices for BI 7.0

    I am currently looking for the SAP Best Practices for BI 7.0 and specifically need documentation on the installation of Business Content. Please email me at bala215 "at" yahoo.com.

    Hi,
    There are some more links in the threads below:
    SAP BI Best Practices - Info
    Best Practices for Implementing BI7.0
    Cheers,
    Kedar

  • Best Practices for JMS Service Documentation

    Our software consists of a variety of JMS producers and consumers used to transform and transmit business to business messages on a large scale.
    The service nodes do a variety of things, and we've had trouble over the years ensuring every queue-based service clearly identifies the parameters and payload it accepts, so that everyone from programmers to system administrators can easily see what services are available and how they are to be used.
    Some have advocated always adding a web service in front of each message-based service to guarantee interface contracts are all well publicized. I think there are reasons to choose web services and reasons to choose message-based services, and am not convinced this is the right answer to address limitations in expressing the design contract for a message-based service. That said, I really like what we've been able to do with self-documenting web services based on annotations, and wonder if there's a conceptual equivalent in message-based software.
    What are your best practices for ensuring your message-based services are as self-documenting as your modern web service?
    Thanks in advance for your advice!

    *bump
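
    One way to get a similar self-documenting effect for message-based services is a small contract annotation plus a catalogue generator that scans for it, roughly mirroring what annotation-driven web services give you. The sketch below is only an illustration: @QueueContract, OrderTransformService, and the property names are invented for this example (only javax.jms.MessageListener is a real API), so adapt the idea to your own stack.

    ```java
    import java.lang.annotation.Documented;
    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import javax.jms.Message;
    import javax.jms.MessageListener;

    /** Hypothetical contract annotation -- not part of any real library. */
    @Documented
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface QueueContract {
        String destination();                      // JNDI name of the queue the service listens on
        String payloadSchema();                    // schema describing the accepted payload
        String[] requiredProperties() default {};  // JMS properties the service expects
        String description() default "";
    }

    /** Example consumer that documents itself through the annotation. */
    @QueueContract(
            destination = "jms/orders.inbound",
            payloadSchema = "http://example.com/schemas/Order.xsd",
            requiredProperties = {"tenantId", "messageVersion"},
            description = "Transforms inbound B2B orders and forwards them to fulfilment.")
    class OrderTransformService implements MessageListener {
        public void onMessage(Message message) {
            // ... transformation and routing logic ...
        }
    }

    /** Build-time or admin-time catalogue: turns the annotations into readable docs. */
    public class ContractCatalog {
        public static void main(String[] args) {
            Class<?>[] services = {OrderTransformService.class};
            for (Class<?> svc : services) {
                QueueContract c = svc.getAnnotation(QueueContract.class);
                if (c == null) continue;
                System.out.printf("%s%n  queue:   %s%n  payload: %s%n  props:   %s%n  %s%n",
                        svc.getSimpleName(), c.destination(), c.payloadSchema(),
                        String.join(", ", c.requiredProperties()), c.description());
            }
        }
    }
    ```

    In practice you would replace the hard-coded class list with classpath scanning and publish the generated catalogue wherever your web service documentation already lives.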

  • Best Practices for zVM/SLES10/zDB2 environment for dialog instances.

    Hi,  I am a zSeries system programmer who has just completed an IBM led Proof of Concept which demonstrated the viability of running SAP instances on SUSE SLES10 Linux booted in zVM guests and accessing zDB2 data via hipersockets. Before we build a Linux infrastructure using the 62 IFLs we just procured, we are wondering if any best practices for this environment have been developed as an OSS note or something else by SAP.    Below you will find an email which was sent and responded to by IBM and Novell on these topics...
    "As you may know, Home Depot has embarked on an IBM led proof of concept using SUSE SLES10 running in zVM guests on IBM zSeries hardware to host SAP server instances.  The Home Depot IT organization is currently in the midst of a large scale push to modernize our merchandising and people systems on SAP platforms.  The zVM/SUSE/SAP POC is part of that effort, as is a parallel POC of an Intel Blade/Red Hat/SAP platform.  For our production financial systems we now use a pSeries/AIX/SAP platform.
          So far in the zVM/SUSE/SAP POC, we have been able to create four zVM LPARS on IBM z9 hardware, create twelve zVM guests on those LPARS, boot SLES10 in those guests, install and run SAP instances in those guests using hipersockets for access to our DB2 SAP databases running on zOS, and direct user workloads to the SAP instances with good results.  We have also successfully developed cloning scripts that have made it possible to create new SLES10 instances, configured and ready for SAP installs, in about 10 seconds using FLASHCOPY and IBM DASD.
          I am writing in the hope that you can direct us to technical resources at IBM/Novell/SAP who may be able to field a few questions that have arisen.  In our discussions about optimization of the zVM/SUSE/SAP platform, we wondered if any wisdom about the appropriateness of and support for using zVM capabilities to virtualize SAP has ever been developed or any best practices drafted.  Attached you will find an IBM Redbook and a PowerPoint presentation which describes the use of the zVM discontiguous shared segments and the zVM named saved system features for the sharing of reentrant code and other  elements of Linux and its applications, thereby conserving storage and disk resources allocated to guest machines.   The specific question of the hour is, can any SAP code be handled similarly?  Have specific SAP elements eligible for this treatment been identified? 
          I've searched the SUSE Knowledgebase for articles on this topic to no avail.  Any similar techniques that might help us reduce the total cost of ownership of a zVM/SUSE/SAP platform as we compare it to Intel Blade/Red Hat/SAP and pSeries/AIX/SAP platforms are of great interest as we approach the end of our POC.  Can you help?
          Greg McKelvey is a Client I/T Architect at IBM.  He found the attached IBM documents and could give a fuller account of our POC.  Pat Downs, IBM zSeries IT Architect, has also worked to guide our POC. Akshay Rao, IBM Systems IT Specialist - Linux | Virtualization | SOA, is acting as project manager for the POC.  Jim Hawkins is the Home Depot Architect directing the POC.  I've CC:ed their email addresses.  I am sure they would be pleased to hear from you if there are the likely questions about what the heck I am asking about here.  And while writing, I thought of yet another question that I hoping somebody at SAP might weigh in on; are there any performance or operational benefits to using Linux LVM to apportion disk to filesystems vs. using zVM to create appropriately sized minidisks for filesystems without LVM getting involved?"
    As you can see, implementation questions need to be resolved.  We have heard from Novell that the SLES10 Kernel and other SUSE artifacts can reside in memory and be shared by multiple operating system images.  Does SAP support this configuration?  Also, has SAP identified SAP components which are eligible for similar treatment?  We would like to make sure that any decisions we make about the SAP platforms we are building will be supportable.  Any help you can provide will be greatly appreciated.  I will supply the documents referenced above if they are not known to any answerer.  Thanks,  Al Brasher 770-433-8211 x11895 [email protected]

    Hello Al,
    First, let me welcome you on board. I am sure you won't be disappointed with your choice to run SAP on z/OS.
    As for your questions: it wasn't easy to find them in this long post, so I suggest you take the time to write a short summary with a very short list of questions.
    As for answers, here are a few useful sources of information:
    1. The SAP on DB2 for z/OS SDN page:
    SAP on DB2 for z/OS
    There you can find two relevant documents:
    a. Best Practices for ...
    b. Database Administration for DB2 UDB for z/OS.
    This second publication is excellent; apart from DB2-specific information, it covers all the components of SAP on DB2 for z/OS, such as zLinux, z/VM, and so on.
    2. I can see that you are already familiar with the IBM Redbooks, but it seems you haven't yet gotten the most out of that resource. From your post it is clear that you have found one useful publication, but I know there are several.
    3. A few months ago I wrote a short post on a similar subject. I'm sure it's not exactly what you are looking for at this moment, but it's a good start, and with some patience you may be able to get some answers. Here's the link:
    http://blogs.ittoolbox.com/sap/db2/archives/index-of-free-documentation-on-sap-db2-administration-14245
    Good luck.
    Omer Brandis

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same Oracle Application Server.
    Defining an agent for each?
    A gateway for all?
    BPEL security extensions?
    The top-level service that is defined as the business process is itself secured through OWSM with usernames and passwords, but what is the best practice for establishing security for each of the lower-level services?
    Regards
    Farbod

    It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, it should be secured.
    The idea of SOA and of designing services is to have them available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service that uses one of the existing lower-level services.
    If all the services are on one application server, you can make the configuration/development environment a lot easier by securing them using the gateway.
    The typical problem with any gateway architecture is that the service is available without any security enforcement when it is accessed directly.
    You can enforce rules at your network layer to allow access to the application server only from the gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from extensions. Two things to consider:
    The next BPEL developer in your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram
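
    For completeness: if a lower-level service does end up secured with message-level username/password rather than through the gateway, the client side typically injects a WS-Security UsernameToken header. The sketch below is only an illustration using the standard JAX-WS/SAAJ handler API with placeholder credentials; it is not the OWSM approach recommended above, where you would attach the equivalent client policy declaratively instead of hand-coding a handler.

    ```java
    import java.util.Collections;
    import java.util.Set;
    import javax.xml.namespace.QName;
    import javax.xml.soap.SOAPElement;
    import javax.xml.soap.SOAPEnvelope;
    import javax.xml.soap.SOAPHeader;
    import javax.xml.ws.handler.MessageContext;
    import javax.xml.ws.handler.soap.SOAPHandler;
    import javax.xml.ws.handler.soap.SOAPMessageContext;

    /** Adds a WS-Security UsernameToken to every outbound request. Illustrative sketch only. */
    public class UsernameTokenHandler implements SOAPHandler<SOAPMessageContext> {

        private static final String WSSE_NS =
                "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd";

        public boolean handleMessage(SOAPMessageContext ctx) {
            boolean outbound = (Boolean) ctx.get(MessageContext.MESSAGE_OUTBOUND_PROPERTY);
            if (!outbound) {
                return true; // only decorate requests, not responses
            }
            try {
                SOAPEnvelope env = ctx.getMessage().getSOAPPart().getEnvelope();
                SOAPHeader header = (env.getHeader() != null) ? env.getHeader() : env.addHeader();
                SOAPElement security = header.addChildElement(new QName(WSSE_NS, "Security", "wsse"));
                SOAPElement token = security.addChildElement(new QName(WSSE_NS, "UsernameToken", "wsse"));
                token.addChildElement(new QName(WSSE_NS, "Username", "wsse"))
                     .addTextNode("svc_invoker");   // placeholder credential
                token.addChildElement(new QName(WSSE_NS, "Password", "wsse"))
                     .addTextNode("change_me");     // placeholder credential
                return true;
            } catch (Exception e) {
                throw new RuntimeException("Failed to add UsernameToken header", e);
            }
        }

        public boolean handleFault(SOAPMessageContext ctx) { return true; }

        public void close(MessageContext ctx) { }

        public Set<QName> getHeaders() { return Collections.emptySet(); }
    }
    ```

    The handler is registered on the client's service proxy via a handler chain; the key point is that each lower-level service, not just the gateway, then requires credentials.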

  • Best Practices for Workshop IDE (Development Workstation Setup)

    Is there any Oracle documentation that describes best practices for setting up Workshop and developing on a workstation that includes Oracle's ODSI, OSB, Portal, and WLI? We are using all of these products on a WebLogic server on each developer's machine and are experiencing performance and reliability issues. What's the optimal way to use these products on a developer's workstation? Thanks.

    Hi,
    Currently there is no such best-practices site for Workshop,
    but you can work through most issues using the documentation:
    http://docs.oracle.com/cd/E13224_01/wlw/docs103/
    If you need any further assistance, let me know.
    Regards,
    Kal

  • Workflow not completed, is this best practice for PR?

    Hi SAP Workflow experts,
    I am new to workflow and now responsible for supporting an existing PR release workflow.
    The workflow is quite simple and straightforward, but the issue is that the workflow never seems to be completed.
    When the user releases the PR, the next activity is "Requisition released", which uses task TS20000162.
    This sends a work item to the user's (the PR creator's) SAP inbox; when they double-click it, the workflow is completed.
    The thing is, in our organization users do not access the SAP inbox, so there are thousands of work items that have not been completed (our procurement system has been running since 2009).
    Our PR creators receive notification of the PR approval in their Outlook mail, handled by a program that is scheduled every 5 minutes.
    Since the documentation is not clear enough, I can't work out why the implementer used this approach.
    May I know whether this is the best practice for a PR workflow or not?
    My idea now is to modify the send-email program to complete the work item after the email has been sent to the user's Outlook mail.
    I am not sure whether that is common in the workflow world, though.
    Any help is deeply appreciated.
    Thank you.

    Hello,
    "This will send work item to user (pr creator) sap inbox which when they double click it will complete the workflow."
    It sounds liek they are sending a workitem where an email would be enough. By completing the workitem they are simply acknowledging that they have received notification of the completion of the PR.
    "Our PR creator will receive notification of the PR approval to theirs outlook mail handled by a program that is scheduled every 5 minutes."
    I hope (and assume) that they only receive the email once.
    I would change the workflow to send an email (SendMail step) to the initiator instead of the workitem. That is normally what happens. Either that or there is no email at all - some businesses only send an email if something goes wrong. Of course, the business has to agree to this change.
    Having that final workitem adds nothing to the process. Replace it with an email.
    regards
    Rick Bakker
    hanabi technology

  • Best Practice for Distributing Databases to Customers

    I did a little searching and was surprised not to find a best-practice document on how to distribute Microsoft SQL Server databases. With other database formats, it's common to distribute them as scripts. That approach seems rather limited with the built-in tools Microsoft provides; there appear to be limits on the length of the script. We're looking to distribute a database several GB in size. We could detach the database or provide a backup, but that has its own disadvantages, since it limits which versions of SQL Server will accept the database.
    What do you recommend, and can you point me to some documentation that covers this practice?
    Thank you.

    It's much easier to distribute schema/data from an older version to a newer one than the other way around. Nearly all SQL Server deployment features support database version upgrades, including the Copy Database wizard, BACKUP/RESTORE, detach/attach, script generation, the Microsoft Sync Framework, and a few others.
    Even if you just want to distribute schemas, you may want to distribute the entire database and then truncate the tables to purge the data.
    Backing up and restoring your database is by far the most reliable method of distributing it, but it may not be practical in some cases because you'll need to generate a new backup every time a schema change occurs; that is less of an issue if you already have an automated backup/maintenance routine in your environment.
    As an alternative, you can use the Copy Database functionality in SSMS, although it can be unstable in some situations, especially if you are distributing across multiple subnets and/or domains. It will also require you to purge data if/when applicable.
    Another option is to detach your database, copy its files, and then attach them in both the source and destination instances. This generates downtime for the detached databases, so there are better methods for distribution available.
    And then there is the previously mentioned method of generating scripts for the schema and then using INSERT statements or the import data wizard in SSMS (which is very practical and internally implements an SSIS package that can be saved for repeated executions). It works fine and, while not as practical as the other options, it is the best way to distribute databases when the version is being downgraded.
    With all this said, there is no single "best practice" for this. There are multiple features, each offering its own advantages and drawbacks, which allows them to align with different business requirements.
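
    To make the BACKUP/RESTORE route concrete, here is a minimal sketch that issues the T-SQL from Java through the Microsoft JDBC driver. The connection strings, credentials, paths, database name, and logical file names are all placeholders (check the real logical names with RESTORE FILELISTONLY before restoring), so treat this as an outline rather than a finished tool.

    ```java
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    /** Minimal sketch: distribute a database via backup on the source and restore on the target. */
    public class BackupRestoreDemo {

        public static void main(String[] args) throws Exception {
            String sourceUrl = "jdbc:sqlserver://source-host:1433;databaseName=master";
            String targetUrl = "jdbc:sqlserver://target-host:1433;databaseName=master";

            // 1. On the source instance: write a full backup to disk.
            try (Connection con = DriverManager.getConnection(sourceUrl, "sa", "placeholder");
                 Statement st = con.createStatement()) {
                st.execute("BACKUP DATABASE [SalesDb] TO DISK = N'C:\\backups\\SalesDb.bak' WITH INIT");
            }

            // 2. Copy the .bak file to the target host, then restore it there,
            //    relocating the data/log files ('SalesDb_Data' and 'SalesDb_Log'
            //    are the logical file names of the source database).
            try (Connection con = DriverManager.getConnection(targetUrl, "sa", "placeholder");
                 Statement st = con.createStatement()) {
                st.execute(
                    "RESTORE DATABASE [SalesDb] FROM DISK = N'C:\\backups\\SalesDb.bak' " +
                    "WITH MOVE 'SalesDb_Data' TO N'C:\\data\\SalesDb.mdf', " +
                    "     MOVE 'SalesDb_Log' TO N'C:\\data\\SalesDb_log.ldf', REPLACE");
            }
        }
    }
    ```

    Note that, as mentioned above, this only works when the target instance is the same version as, or newer than, the source.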

  • Best practice for remote topic subscription with HA

    I'd like to create an orchestrator EJB, in cluster A, that must persist some data to a database of record and then publish a business event to a topic. I have two durable subscribers, MDBs on clusters B & C, that need to receive the events and perform some persistence on their side.
    I'd like HA so that a failure in any managed server would not interrupt the system. I can live with at-least-once delivery, but at the same time I'd like to minimize the amount of redundant message processing.
    The documentation gets a little convoluted when dealing with clustering. What is the best practice for accomplishing this task? Has anyone successfully implemented a similar solution?
    I'm using WebLogic 8.1 SP5, but I wouldn't mind hearing solutions for later versions as well.

    A managed server failure makes that server's JMS servers unavailable, which, in turn, makes the JMS servers' messages unavailable until either (A) the JMS server is migrated or (B) the managed server is restarted.
    For more discussion, see my post today on the topic "distributed destinations failover - can't access messages from other node". Also, you might be interested in the circa-8.1 migration white paper on dev2dev: http://dev2dev.bea.com/pub/a/2004/05/ClusteredJMS.html
    Tom
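
    For reference, a durable topic subscription with the classic JMS 1.1 API looks roughly like the sketch below. The JNDI names, client id, and subscription name are placeholders, and in the scenario above the equivalent settings would normally live in the MDB deployment descriptors on clusters B and C rather than in hand-written code.

    ```java
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;
    import javax.jms.TopicConnection;
    import javax.jms.TopicConnectionFactory;
    import javax.jms.TopicSession;
    import javax.jms.TopicSubscriber;
    import javax.naming.InitialContext;

    /** Minimal durable subscriber sketch; all JNDI names are placeholders. */
    public class DurableSubscriberDemo {

        public static void main(String[] args) throws Exception {
            InitialContext ctx = new InitialContext(); // assumes jndi.properties points at the cluster

            TopicConnectionFactory cf = (TopicConnectionFactory) ctx.lookup("jms/BusinessEventCF");
            Topic topic = (Topic) ctx.lookup("jms/BusinessEventTopic");

            TopicConnection connection = cf.createTopicConnection();
            connection.setClientID("clusterB-eventConsumer"); // must be stable across restarts

            TopicSession session = connection.createTopicSession(false, Session.AUTO_ACKNOWLEDGE);

            // The subscription name identifies this consumer's backlog on the JMS server,
            // so messages published while the consumer was down are delivered on reconnect.
            TopicSubscriber subscriber = session.createDurableSubscriber(topic, "eventConsumerSub");

            connection.start();
            TextMessage msg = (TextMessage) subscriber.receive(30000); // wait up to 30 seconds
            if (msg != null) {
                System.out.println("Received business event: " + msg.getText());
            }
            connection.close();
        }
    }
    ```

    The durability gives at-least-once behaviour across consumer restarts; how quickly messages become available after a managed-server failure still depends on the migration/restart options Tom describes above.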

  • Best practice for loading config params for web services in BEA

    Hello all.
    I have deployed a web service using a Java class as the back end.
    I want to read in config values (like init-params for servlets in web.xml). What is the best practice for doing this in the BEA framework? I am not sure how to use the web.xml file in the WAR file, since I do not know the name of the underlying servlet.
    Any useful pointers will be very much appreciated.
    Thank you.

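    One container-neutral way to handle this is to declare the values as context-params in web.xml and read them in a ServletContextListener, which avoids having to know the name of the generated servlet. The sketch below is only an illustration; the parameter name and holder class are made up.

    ```java
    import javax.servlet.ServletContext;
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;

    /**
     * Loads configuration at deployment time from <context-param> entries in web.xml, e.g.:
     *   <context-param>
     *     <param-name>backend.endpoint</param-name>
     *     <param-value>http://internal-host:7001/backend</param-value>
     *   </context-param>
     * Register this class with a <listener> element in the same web.xml.
     */
    public class ConfigLoader implements ServletContextListener {

        private static volatile String backendEndpoint; // simple holder; a config bean works too

        public void contextInitialized(ServletContextEvent event) {
            ServletContext ctx = event.getServletContext();
            backendEndpoint = ctx.getInitParameter("backend.endpoint");
        }

        public void contextDestroyed(ServletContextEvent event) {
            backendEndpoint = null;
        }

        /** The web service implementation class calls this to read the value. */
        public static String getBackendEndpoint() {
            return backendEndpoint;
        }
    }
    ```

    A JNDI environment entry or an external properties file loaded the same way works just as well; the point is to keep the lookup independent of the servlet the container generates for the web service.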

  • Best practice for consuming web services

    Hi
    We are consuming a web service in an orchestration via "Add Generated Item". This option creates one orchestration, one XSD file, and some bindings.
    We have different projects for schemas, maps, and orchestrations under our solution in Visual Studio.
    Now I need to know the best practice for consuming a web service in an orchestration, i.e. in which project I should use "Add Generated Item" (the orchestration project or the schemas project), since it generates both an orchestration and a schema.
    thanks

    From a service-orientation perspective you should keep the service artifacts separate from the other artifacts. Otherwise it will be very difficult to update the service interface without affecting the other artifacts. For example, you don't want to have to redeploy your entire application if only one field changes in the service you consume.
    So I typically generate the items, remove the unnecessary pieces, and put them in a separate project.
    Depending on the control you have over the services you want to consume, it would be even better to create another layer of abstraction. By that I mean creating your own interface (schema) and mapping it to the one the service exposes. This is mainly necessary when you consume external services that are beyond your control. By abstracting the interface the external service exposes, you limit the impact of changes to that interface on the rest of your system. All changes are absorbed behind your own interface.
    If you consume internal services, you can probably control the way the interface is defined. In a service-oriented world, all internal services expose a well-known interface, based on the domain objects you have within your organisation.
    Jean-Paul Smit | Didago IT Consultancy
    MCTS BizTalk 2006/2010 + Certified SOA Architect
    Please indicate "Mark as Answer" if this post has answered the question.
