ESB best practices

I have the following questions on ESB and would appreciate your input on them:
1. Does one ESB service map to a single operation in the service provider's WSDL, or to multiple operations?
2. How can we implement security if we want to restrict the operations exposed in the ESB WSDL to certain clients?
3. We are planning to expose the routing service to consumers as a synchronous call. Should the ESB define the schema for the consumers, or should the ESB use the schema provided by the consumers (assuming there is more than one consumer)?
4. Is it bad practice to use DB adapters?
5. Is it necessary to use a Service Registry?
Thanks for any help with these.

"Best practice" is an overstated term that theorists bandy around to justify spending so much time talking about it instead of doing it.
In the real world you have to balance so-called best practice against your actual situation, because sometimes best practice does not fit your use case.
I would like to understand your second approach. Are you saying your services can call the ESB directly, e.g. via web services?
If that is the case, I would go with your first option, as it provides guaranteed delivery and more options for failover: you can run your process in an XA transaction so that a message is not dequeued until it has been enqueued on the next destination.
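For illustration, here is a minimal sketch of that transacted hand-off using the generic javax.jms API; the queue names and timeout are made up, and a real ESB would typically use XA to span multiple resources, but the principle is the same:

    import javax.jms.*;

    // Sketch: move a message from one queue to the next atomically.
    // With a transacted session the receive is not committed until the
    // send succeeds, so the message is never lost in between.
    public class QueueHop {

        public static void moveOne(ConnectionFactory factory) throws JMSException {
            Connection connection = factory.createConnection();
            try {
                connection.start();
                // true = transacted: receive and send commit (or roll back) together
                Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
                MessageConsumer in = session.createConsumer(session.createQueue("IN_QUEUE"));
                MessageProducer out = session.createProducer(session.createQueue("OUT_QUEUE"));

                Message message = in.receive(5000); // wait up to five seconds
                if (message != null) {
                    out.send(message);
                    session.commit(); // only now is the message removed from IN_QUEUE
                }
            } finally {
                // closing without a commit rolls back, leaving the message on IN_QUEUE
                connection.close();
            }
        }
    }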
cheers
James

Similar Messages

  • Question on ESB Best Practice

    Hi,
    I would like to know the best practice for the following scenario:
    1) I have to call different web services based on message content, through ESB.
    2) I have two options: either develop one ESB service and route to the different web services based on content, or create a separate ESB service for each web service.
    Can anyone tell me which would be best from a performance perspective, and overall?
    Thanks,
    Jack

    I don't think I'm experienced enough, but I guess it depends on many things.
    - First, about logic
    Where do you want to place the logic, and how do you want to manage it?
    You can place the logic (routing to different endpoints) in BPEL, but it is hard to manage there. When you put the routing logic in the ESB you can manage it without redeploying BPEL; it is easier to update ESB routing rules than BPEL processes.
    - Performance
    I don't think one ESB service instead of three is a bottleneck, but it is a performance consideration.
    You could even create a separate subprocess to hold the routing logic, but I think that amounts to the same thing as one ESB service (see the sketch below).
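    To make the trade-off concrete, here is a hypothetical sketch (plain Java, made-up names) of routing as a data-driven rule table: changing a route means editing configuration rather than redeploying process code, which is the point about keeping routing rules in the ESB layer.

        import java.util.Map;

        // Hypothetical content-based router: message type -> endpoint URL.
        // In a real ESB this table would live in the routing-rule
        // configuration, so routes can change without any redeployment.
        public class ContentBasedRouter {

            private static final Map<String, String> ROUTES = Map.of(
                    "order",   "http://esb-host/services/OrderService",
                    "invoice", "http://esb-host/services/InvoiceService");

            public static String route(String messageType) {
                String endpoint = ROUTES.get(messageType);
                if (endpoint == null) {
                    throw new IllegalArgumentException("No route for type: " + messageType);
                }
                return endpoint;
            }
        }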

  • Best practice for version control B2B, ESB and BPEL

    Hello,
    we are setting up a new system using B2B, ESB and BPEL. The development team is more experienced with PL/SQL and Oracle Workflow, and we are worried that JDeveloper generates changes to the source files during development, which might cause problems with version control.
    Is there any best practice for setting up version control for these systems? Do we need to take anything in particular into consideration when setting up the projects?
    We are using Serena Dimensions 9.1 for version control, with the add-on in JDeveloper.
    Thanks in advance!

    I believe JDeveloper has a plugin for Dimensions.
    I haven't used it, but to get it, go to Tools (it may be under Help; I don't have JDeveloper on this machine to confirm) and check for updates.
    If you select the third-party checkbox and click Next, you will see an entry for Dimensions.
    Configure the connection and develop as you would any other project.
    cheers
    James

  • XREF best practices in ESB cluster installation-OESB10.1.3.3

    Hi,
    We have been using Oracle ESB for the last two years.
    Two months ago I migrated our ESB installation to an ESB cluster in production (1 ESB DT, 1 ESB RT for polling adapters, 2 ESB RT for further message processing).
    We are using SOA Suite 10.1.3.3 with MLR#17 applied.
    I am facing an issue with XREF (the populateXRefRow XPath function) in the production system and need assistance.
    All our ESB processes consist of these main parts:
    1) A polling DB adapter (or FTP adapter, it doesn't matter) that initiates an ESB process, plus a routing service for that polling adapter that asynchronously (!) invokes Requestor ABC-level services (in AIA terms);
    2) The Requestor ABC-level services perform the XREF population and continue the message processing.
    XREF population is done in two steps: we call the lookupXRefRow XPath function, and if the value is not present in the XREF we call populateXRefRow.
    This logic works fine when we are not using an ESB cluster, but now step 2 (the ReqABC level) is performed by different ESB servers, and we frequently get a unique constraint violation on XREF_DATA during the populateXRefRow call.
    The ESB RT nodes are used to balance the load, but the transmitted data overlaps. For example, we poll not documents but document details (the polling table is populated by Oracle Streams, and there is no guarantee that a document header arrives before its details, because our system is heavily loaded and we use commit_serialization=none with parallelism on the APPLY processes).
    Each ESB RT instance can therefore receive different rows of the same document, while XREF population is done at the document-header level.
    My question is: what is the best practice for working with XREF in ESB cluster installations?
    Have other people run into this issue, and how was it resolved?
    I know a possible workaround (sketched below): instead of calling the populateXRefRow function in XSLT, call a PL/SQL procedure or function that does the same thing but can ignore the duplicate-key exception.
    I don't like this solution, but I don't know of any others.
    Simply not populating the XREF is not an option either, because the XREF is actively used in inter-system communication.
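    To illustrate that workaround, here is a minimal sketch of the same idempotent-insert idea in plain JDBC rather than PL/SQL. The table and column names are made up (not the real XREF_DATA layout), and note that Oracle reports the violation as error code 1 (ORA-00001):

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        // Sketch: instead of lookup-then-insert (which races between cluster
        // nodes), attempt the insert and treat a duplicate-key error as
        // success, since it means another node already stored the mapping.
        public class XrefPopulator {

            public static void populateIfAbsent(Connection conn, String xrefTable,
                                                String refValue, String value) throws SQLException {
                String sql = "INSERT INTO MY_XREF (XREF_TABLE, REF_VALUE, VALUE) VALUES (?, ?, ?)";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setString(1, xrefTable);
                    ps.setString(2, refValue);
                    ps.setString(3, value);
                    ps.executeUpdate();
                } catch (SQLException e) {
                    if (e.getErrorCode() != 1) { // ORA-00001: unique constraint violated
                        throw e;                 // anything else is a real error
                    }
                    // duplicate key: the row already exists, which is what we wanted
                }
            }
        }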


  • ESB Exception Handling Best Practices

    I've updated the "ESB Transactions, Error Handling and Resubmit" lesson PDF to include a best practices section. Go to http://otn.oracle.com/goto/esb and click the link in the "Learning more" section. Feedback welcome.

    Hi Dave,
    I checked this document yesterday; it now contains 18 pages.
    There is some great info in the additional 7 pages, and just in time as well: at a customer site we are hitting bug 5547165, with the rejected messages being empty. I had checked the rejection handlers for BPEL and was investigating how they could be used with ESB. It seems you have provided the answer.
    Any chance a fix for the bug mentioned here will be in the 10.1.3.3 patch set?
    One more thing: by default the rejected messages for ESB are written to the file system, in a directory below the 'home' OC4J instance. Could this be made configurable in a future release?
    Thanks and best regards, Sjoerd

  • Best practice for versioning a SOA Suite application

    Hi everyone,
    I recently organised a seminar for customers about the SOA Suite stack, and one of the interesting questions asked that day concerned versioning.
    Let's say we've built a BPEL/ESB application interacting with different external and internal web services, we've deployed this application to our production environment, and we now need to change an internal web service.
    How can we add versioning to this heterogeneous system in a consistent way? I know you can tell ESB which version of the BPEL process it needs to use, but what about the custom and external web services that we've integrated with?
    My 2 cents: you need to add a versioning tag to your custom web services, manage it yourself, and use this versioning tag in the services you're integrating with.
    Could somebody point out what the best practice is, or what Oracle's development team is working on regarding versioning for SOA applications?

    Marc,
    Are you saying versioning isn't supported in ESB now? I thought ESB already uses the versioning tag from BPEL when you're integrating BPEL and ESB?
    During the development of my demo I saw that the version tag was used when invoking the BPEL process through a SOA service.

  • Best practice for data persistance for monitoring without BAM

    Greetings,
    We are modeling a business process in a large organization using BPEL Process Manager. The key point is that business people need to monitor the execution of the business process at several key points of its execution, and they also need to get report information about the process.
    To model this in our project, we decided to create a new Oracle database schema that will hold the information about the business process execution (we decided on this because, for this initial offering, the customer is not buying BAM). In this context, the BPEL process will send this key information to the repository so business people can view real-time information about the process execution, as well as historical information in the form of reports.
    The important issue here is whether there is a best practice for sending the information to the database schema. Could it be done simply with database adapters? Or perhaps with sensors sending the data over topic connections?
    Any help will be highly appreciated.
    Thanks in advance.

    Hi, yes, this suggestion is nice. First configure the sensors (activity or variable), then configure the sensor action as a JMS topic, which will in turn insert the data into a DB. Alternatively, when you configure the sensor action as a DB, the data goes to the Oracle Reports schema. Is there any chance of altering that, i.e. changing config files so that the data doesn't go to the Reports schema but to a custom schema created by a user? I don't know if it can be done. My problem is that when I configure the JMS topic for sensor actions, I see blank data coming through; for some reason the data is not getting posted. I have used an ESB with a routing service based on the schema I am monitoring. Can anyone help?
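    As a rough sketch of the topic-to-database leg of this pattern (the topic name, table and connection details are all made up for illustration), a plain JMS subscriber could write each sensor payload into a custom monitoring schema:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import javax.jms.*;

        // Sketch: subscribe to the sensor topic and persist each payload
        // into a custom schema instead of the default Oracle Reports schema.
        public class SensorTopicToDb implements MessageListener {

            private final String jdbcUrl;

            public SensorTopicToDb(String jdbcUrl) {
                this.jdbcUrl = jdbcUrl;
            }

            @Override
            public void onMessage(Message message) {
                try {
                    String payload = ((TextMessage) message).getText(); // sensor XML
                    try (Connection conn = DriverManager.getConnection(jdbcUrl, "monitor", "secret");
                         PreparedStatement ps = conn.prepareStatement(
                                 "INSERT INTO PROCESS_MONITOR (RECEIVED_AT, PAYLOAD) "
                                         + "VALUES (SYSTIMESTAMP, ?)")) {
                        ps.setString(1, payload);
                        ps.executeUpdate();
                    }
                } catch (Exception e) {
                    e.printStackTrace(); // in production, route failures to an error queue
                }
            }

            public static void listen(TopicConnectionFactory factory, String jdbcUrl) throws JMSException {
                TopicConnection connection = factory.createTopicConnection();
                TopicSession session = connection.createTopicSession(false, Session.AUTO_ACKNOWLEDGE);
                TopicSubscriber subscriber = session.createSubscriber(session.createTopic("SENSOR_TOPIC"));
                subscriber.setMessageListener(new SensorTopicToDb(jdbcUrl));
                connection.start(); // messages are delivered until the connection is closed
            }
        }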

  • OBIEE Best Practice Data Model/Repository Design for Objectives/Targets

    Hello World!
    We are faced with a design question that has become somewhat difficult, and we need some help. We want to be able to compare actual measures side by side with their corresponding objectives/targets. Sounds simple, but our objectives are static (they cannot be aggregated) and have multiple dimensions and levels. We need some best practice tips on how to design our data model and repository properly, so that we can see the objective/target for a measure regardless of the dimensions used in the criteria and regardless of the level.
    Here are some more details:
    Example of the existing objective table:
    Dimension1  Dimension2  Dimension3  Obj1  Obj2  Quarter
    NULL        NULL        NULL        .99   1.8   1Q13
    DIM1VAL1    NULL        NULL        .99   2.4   1Q13
    DIM1VAL1    DIM2VAL1    NULL        .98   2.41  1Q13
    DIM1VAL1    DIM2VAL1    DIM3VAL1    .97   2.3   1Q13
    DIM1VAL1    NULL        DIM3VAL1    .96   1.9   1Q13
    NULL        DIM2VAL1    NULL        .97   2.2   1Q13
    NULL        DIM2VAL1    DIM3VAL1    .95   2.0   1Q13
    NULL        NULL        DIM3VAL1    .94   3.1   1Q13
    - Right now we have quarterly objectives set using three different dimensions. So, if an author adds one or more (or zero) dimensions to their criteria for a given measure, they can get back a different objective. They could add Dimension1 and get 99%. They could add Dimension1 and Dimension2 and get 98%. They could add all three dimensions and get 97%. They could add zero dimensions (the highest grain) and get 99%. Using our existing structure, if we were to add a new dimension to the mix, the number of possible combinations would grow dramatically. (Not flexible.)
    - We would like our final solution to be flexible enough so that we could view objectives with altogether different dimensions and possibly get different objectives.
    - We currently have 3 fact tables with 3+ conformed dimension tables and a few unique dimension tables.
    Could anyone share a similar situation where you have implemented a data model structure with the proper repository joins to handle showing side-by-side objectives/targets where the objectives were static and could be displayed at differing levels with flexible dimensions as described?
    Any help would be greatly appreciated.


  • Best practice for how to access a set of wsdl and xsd files

    I've recently been poking around with the Oracle ESB, which requires a bunch of WSDL and XSD files from HOME/bpel/system/xmllib. What is the best practice for including these files in a BPEL project? It seems like a bad idea to copy all these files into every project that uses the ESB, especially if there are quite a few consumers of the bus. Is there a way I can reference this directory from the project so that the files can stay in a common place for all the projects that use them?
    Bret

    Hi,
    I created a project (in JDeveloper) with local XSD files, then tried deleting them and recreating them in the structure pane as references to a version on the application server. After reopening the project I deployed it successfully to the BPEL server. The process works fine, but in the structure pane there is no longer any information about the XSDs, and for the payload in the variables there is an exception (problem building schema).
    How does BPEL know where to look for the XSD files, and how does the mapping still work?
    This cannot be the correct way to do it. Can I rework an existing project, or do I have to rebuild it from scratch in order to get all the references right?
    Thanks for any clue.
    Bette
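    As one illustration of Bret's shared-location idea (the URL below is hypothetical): most XML toolkits resolve relative imports against the base URI of the document that declares them, so a schema loaded from one central server pulls its dependencies from that same server, and nothing has to be copied per project. A minimal Java sketch:

        import javax.xml.XMLConstants;
        import javax.xml.transform.stream.StreamSource;
        import javax.xml.validation.Schema;
        import javax.xml.validation.SchemaFactory;

        // Sketch: compile a schema straight from a shared server location.
        // Relative <xsd:import>/<xsd:include> paths inside it resolve
        // against the same base URL, so the xmllib can live in one place.
        public class SharedSchemaLoader {

            public static Schema load() throws Exception {
                SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
                // hypothetical central copy of the xmllib directory
                return factory.newSchema(
                        new StreamSource("http://soa-server:8888/xmllib/common-types.xsd"));
            }
        }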

  • Environment best practices

    In my experience with other integration tools, it has been a best practice to deploy an instance of the product per environment. For example, I would expect to have Oracle SOA Suite deployed for a development environment, an integration test environment, a system test environment, and production. In each of those environments there is also a version of the corporate applications in various states of readiness.
    The question has been raised at this client: why do we need to do that, and could we not just stand up a single ESB instance for all non-production environments and one ESB instance for production?
    What is the experience of others on this forum around this topic - best practice, pros/cons, things to watch out for, etc.?

    That's a bad idea. You must have at least three environments.
    Developers need a space to deploy and test code. You also need a Test environment, ideally a copy of production, to smoke out any weirdness in the code. The Test environment should resemble Prod in all aspects, i.e. hardware, memory, software versions, etc.

  • Best practices or design framework for designing processes in OSB(11g)

    Hi all,
    We have been working with Oracle 10g; in the new project we are going to use SOA Suite 11g. For 10g we designed our services very much along the lines of the AIA framework, but in 11g, now that OSB has been introduced, we cannot fit the AIA framework exactly, because OSB is structured differently from ESB.
    Can anybody suggest best practices or some design framework for designing processes in OSB or 11g SOA Suite ?

    http://download.oracle.com/docs/cd/E12839_01/integration.1111/e10223/04_osb.htm
    http://www.oracle.com/technology/products/integration/service-bus/index.html
    Regards,
    Anuj

  • BPEL/WSDL Reference Best Practices?

    We have a BPEL project with several (30+) processes. The WSDL of one process is needed by another, which is typical for a BPEL project. The processes also need resources such as supplementary XML files. Where can I read about how to construct the partnerLink references to these resources in a way that facilitates both local testing and production deployment? There must be a recommended methodology.
    If I force every WSDL (or resource) to be accessed through a separate web service, then a WSDL created or modified in the BPEL process must be deployed to that web service before I can test. Is that the only solution?
    I do not see how I could use a file URL, because there are multiple developers and the BPEL workspaces are probably under different top-level folder paths, though each of the processes is at the same level within its workspace. Any suggestions would be appreciated.

    This link should have all that information.
    http://www.oracle.com/technology/tech/soa/soa-suite-best-practices/soa_best_practices_1013x_drop1.pdf
    You should see little to no performance hit from creating multiple subprocesses; if they are reusable, then that is what the product is for, and it is encouraged. This is what SOA is all about. Whether this is best practice I don't know; this is just one person's point of view.
    Remember that Oracle products are scalable, so if you are willing to invest in the right solution, performance is not an issue.
    When integrating with ESB, don't use SOAP; use the default JCA adapter.
    cheers
    James

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I go straight to the production tables, so the fact (and dimension) tables are pretty complex, since I use multiple sources in the logical tables to increase performance. What I often struggle with is the Logical Levels (on the Content tab), where the level of each dimension is set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the business model (and physical model) gets more complex I sometimes struggle with the aggregates - getting them to work/appear with different dimensions. (Using the menu "More" - "Get Levels" does not always give the best solution, far from it.) I also have some combinations of left and right outer joins, making it even more complicated for the BI server.
    For instance, I have about 10-12 different dimensions; should all of them always be connected to each fact table, either at the Detail or the Total level? I can see the use of the logical levels when working with aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical-level setup when no aggregate tables are used? Sometimes that seems like the easiest approach.
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    "For instance, I have about 10-12 different dimensions; should all of them always be connected to each fact table, either at the Detail or the Total level?" It is not necessary to connect to all dimensions; it depends on the report you are creating. But as a best practice, when you specify join conditions in the physical layer, you should maintain them all at the Detail level.
    For example, for the sales table: if you want to report at the ProductDimension.Productname level, you should use the Detail level; otherwise use the Total level (at the Product/Employee level).
    Get Levels (available only for fact tables) changes the aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension.
    Source: Admin Guide (Get Levels definition)
    thanks,
    Saichand.v

  • Best practices for setting up users on a small office network?

    Hello,
    I am setting up a small office and am wondering about the best practices/steps for setting up and managing the admin accounts, user logins and sharing privileges for the setup below:
    Users: 5 users on new iMacs (x3) and upgraded G4s (x2)
    Video Editing Suite: Want to connect a new iMac and a Mac Pro, on an open login (multiple users)
    All machines need to be able to connect to the network, the peripherals and an external hard drive. I would also like to set up drop boxes to easily share files between the computers (I was thinking of using the external hard drive for this).
    Thank you,

    Hi,
    Thanks for your posting.
    When you install AD DS in the hub or staging site, disconnect the installed domain controller, and then ship the computer to the remote site, you are disconnecting a viable domain controller from the replication topology.
    For more detailed information, please refer to:
    Best Practices for Adding Domain Controllers in Remote Sites
    http://technet.microsoft.com/en-us/library/cc794962(v=ws.10).aspx
    Regards.
    Vivian Wang

  • Add fields in transformations in BI 7 (best practice)?

    Hi Experts,
    I have a question regarding transformation of data in BI 7.0.
    Task:
    Add new fields to a second-level DSO, based on some manipulation of the first-level DSO data. In 3.5 we would have used a start routine to manipulate the data and append the new fields to the structure.
    Possible solutions:
    1) Add the new fields to first level DSO as well (empty)
    - Pro: Simple, easy to understand
    - Con: Consumes disk space, degrades performance when writing to the first-level DSO
    2) Use routines in the field mapping
    - Pro: Simple
    - Con: Hard to optimize for performance (we could of course fill an internal table in the start routine and then read from it to gain some performance, but that solution would be more complex).
    3) Update the fields in the End routine
    - Pro: Simple, easy to understand, can be performance optimized
    - Con: We need to ensure that the data we need actually exists (i.e. if we have a field in DSO 1 that we only use to calculate a field in DSO 2, it would also have to be mapped to DSO 2 in order to be available in the routine).
    Does anybody know what the best practice is? Or do you have any experience regarding what you see as the best solution?
    Thank you in advance,
    Mikael

    Hi Mikael,
    I like the 3rd option and have used it many, many times. In answer to your question:
    Update the fields in the end routine
    - Pro: Simple, easy to understand, can be performance optimized. Yes, I have read and tested that this works faster; there is an OSS consulting note out there indicating the speed of the end routine.
    - Con: We need to ensure that the data we need also exists (i.e. if we have a field in DSO 1 that we only use to calculate a field in DSO 2, it would also have to be mapped to DSO 2 in order to exist in the routine). Yes, but by using the result package, the manipulation can be done easily.
    Hope it helps.
    Thanks,
    Pom
