Best Practices for Accessing Configuration Data Modelled as an XML File in OSB

Hi,
I have referred to a couple of blog posts/forum threads on how to model and access configuration data as XML inside OSB.
One of the easiest ways is described here:
Re: OSB: What is best practice for reading configuration information
Another could be:
Uploading the XML data as an .xq file (i.e. creating an .xq file and pasting all of the configuration XML into it).
I need expert answers to the following.
1] I have an .xsd file representing the configuration data. The structure of the XSD is:
<FrameworkConfig>
<Config type="common" key="someKey">propertyvalue</Config>
</FrameworkConfig>
2] As my project moves from one environment to another, the property values will change according to the environment...
For Dev:
<FrameworkConfig>
<Config type="common" key="someKey">propertyvalue_Dev</Config>
</FrameworkConfig>
For Stage:
<FrameworkConfig>
<Config type="common" key="someKey">propertyvalue_Stage</Config>
</FrameworkConfig>
3] Let's say I create the following folder structure to store the configuration files specific to the Dev/Stage/Prod instances:
OSB Project Folder
|
|---Dev
|
|--Dev_Config_file.xml
|
|---Stage
|
|--Stage_Config_file.xml
|
|---Prod
|
|-Prod_Config_file.xml
4] I need a way to load these property files as an XML element/variable inside the OSB message flow. I can't use the XPath function fn:doc("URL") because I don't know the exact path of the XML on the deployed server.
5] I also need to look up/model a value that specifies the current server type (Dev/Stage/Prod) on which the OSB message flow is running, i.e. some construct that acts as a global configuration and is accessible inside the OSB message flow. If the value of that global variable is Dev, I will load the XML config file under the Dev directory at runtime, containing the key/value pairs for the Dev environment.
6] This thread: Re: OSB: What is best practice for reading configuration information
suggests designing a web application that serves the XML file over HTTP and reading its contents into a variable (which in turn can be used in the OSB message flow). Can we address this problem without creating an extra project and adding the dependencies? I have read about the configuration file approach too, but the sample configuration file doesn't show an entry for an .xml file as a resource.
I hope I am clear... I really appreciate your comments and suggestions.
Sushil
Edited by: Sushil Deshpande on Jan 24, 2011 10:56 AM
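
For reference, a minimal sketch of the .xq-resource approach mentioned above; the resource name FrameworkConfig_Dev.xq and the variable name $frameworkConfig are illustrative assumptions, not anything prescribed by OSB:

(: FrameworkConfig_Dev.xq -- hypothetical XQuery resource whose body is simply the config document :)
<FrameworkConfig>
    <Config type="common" key="someKey">propertyvalue_Dev</Config>
</FrameworkConfig>

(: In an Assign action, execute the resource, bind its result to $frameworkConfig, then look up a key: :)
$frameworkConfig/Config[@type = 'common' and @key = 'someKey']/text()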

If you can enforce some sort of naming convention for the transport endpoint of this proxy service across the environments, where the environment name is part of the endpoint, you may be able to retrieve it from $inbound in the message pipeline.
e.g. http://osb_host/service/prod/service1 ==> Prod and http://osb_host/service/stage/service1 ==> Stage. Then I think $inbound/ctx:transport/ctx:uri can give you /service/prod/service1 or /service/stage/service1, and by applying appropriate XPath functions you will be able to extract the environment name.
Check this link for details on $inbound/ctx:transport: http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/context.html#wp1080822
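
For example, a rough sketch of the kind of XQuery that could pull the environment token out of the URI in an Assign action; it assumes a URI layout of /service/<env>/<service_name>, which is an illustrative convention, not anything mandated:

(: $inbound/ctx:transport/ctx:uri holds e.g. /service/prod/service1 :)
let $uri := fn:data($inbound/ctx:transport/ctx:uri)
(: take the path segment between '/service/' and the next '/' :)
let $env := fn:substring-before(fn:substring-after($uri, '/service/'), '/')
return $env   (: yields 'prod', 'stage' or 'dev' under this naming convention :)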

Similar Messages

  • Best practice on extending the SIEBEL data model

    Can anyone point me to a reference document or provide from their experience a simple best practice on extending the SIEBEL data model for business unique data? Basically I am looking for some simple rules - based on either use case characteristics (need to sort and filter by, need to update frequently, ...) or data characteristics (transient, changes frequently, ...) to tell me if I should extend the tables, leverage the 'x' tables, or do something else.
    Preferably they would be prescriptive and tell me the limits of the different options from a use perspective.
    Thanks

    Accepting the given that Siebel's vanilla data model will always work best, here are some things to keep in mind if you need to add something to meet a process that the business is unwilling to adapt:
    1) Avoid re-using existing business component fields and table columns that you don't need for their original purpose. This is a dangerous practice that is likely to haunt you at upgrade time, or (worse yet) might be linked to some mysterious out-of-the-box automation that you don't know about because it is hidden in class-specific user properties.
    2) Be aware that X tables add a join to your queries, so if you are mapping one business component field to ATTRIB_01 and adding it to your list applets, you are potentially putting an unnecessary load on your database. X tables are best used for fields that are going to be displayed in only one or two places, so the join would not normally be included in your queries.
    3) Always use a prefix (usually X_ ) to denote extension columns when you do create them.
    4) Don't forget to map EIM extensions to the extension columns you create. You do not want to have to go through a schema change and release cycle just because the business wants you to import some data to your extension column.
    5) Consider whether you need a conversion to populate the new column in existing database records, especially if you are configuring a default value in your extension column.
    6) During upgrades, take the time to re-evaluate your need for the extension column, taking into account the inevitable enhancements to the vanilla data model. For example, you may find, as we did, that the new version of the S_ADDR_ORG table had an ADDR_LINE_3 column, and our X_ADDR_ADDR3 column was no longer necessary. (Of course, re-configuring all your business components to use the new vanilla column can also be quite an ordeal.)
    Good luck!
    Jim

  • Application to Read and Write Configuration Data from an XML File

    Hi,
    I have to develop a Webdynpro application that will do the following:
    Read the Configuration data from an xml file
    If stored in a file, and the file is not found, prompt the user to provide the file location.
    Pre-Populate the screen(table) with the configuration data
    Allow the user to edit the data
    Store the configuration data when the user hits the Save button
    The config xml file details are as follows:
    Regardless of the location of the configuration data, the following fields will be stored:
    Application (string): will always contain "SFA_EDOCS"
    Key (string): e.g. LDAP_USER, DB_PREFIX, etc.
    Type (character): "P" = Plain Text, "E" = Encrypted
    Value (string)
    Since I am new to WD, I would require help on this.
    Please help. It's urgent.
    Regards,
    Vaishali.
    Edited by: vaishali dhayalan on Sep 19, 2008 8:29 AM
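
    Just to make the requirement concrete, a minimal sketch of what such a configuration file could look like given the fields listed above; the element and attribute names are assumptions for illustration only, not a prescribed SAP format:

    <Configuration>
        <Entry Application="SFA_EDOCS" Key="LDAP_USER" Type="P" Value="some_plain_text_value"/>
        <Entry Application="SFA_EDOCS" Key="DB_PREFIX" Type="E" Value="some_encrypted_value"/>
    </Configuration>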


  • What are the best practice for CQ5.5 configuration?

    Hello,
    What are the best practices for CQ5.5 configuration to handle high availability?
    Last time I had an issue on the server when I uploaded 2 GB of DAM assets; after that the server was not able to start and kept getting errors regarding Tar Persistence.
    So I kindly request you to let me know the best Apache Felix configuration.
    Thanks in advance...
    Regards,
    Satish

    Hi,
    A DAM upload, regardless of the size of the assets, never should result in TarPM problems, unless you run into an OOM, which left the repository in an unclean state. So if you regularly do DAM uploads of that size, you should check the Garbage Collection logs and probably adjust the heapsize if necessary. You might want to limit the number of concurrent running workflows to keep the memory consumption a bit lower.
    To your question: HA in a traditional sense you cannot achieve with a single box, even with optimized settings. In an author usecase you would need clustering.
    Jörg

  • Best practice for Plan and actual data

    Hello, what is the best practice for plan and actual data? Should they both be in the same application or in different ones?
    Thanks.

    Hi Zack,
    It will be easier for you to maintain the data in a single application. Every application needs to have the category dimension, mandatorily. So, you can use this dimension to maintain the actual and plan data.
    Hope this helps.

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott

    Not sure what exactly you are asking but Muse handles the different orientations nicely without having to do anything.
    Example: http://www.cariboowoodshop.com/wood-shop.html

  • Best practices for using the knowledge directory

    Anyone know when it is best to store docs in the Knowledge Directory versus Collab? They are both searchable, but I guess you can publish from the Publisher to the KD. Anyone have any best practices for using the KD or setting up taxonomies in the KD?


  • Best practices for using the 'cost details' fields

    Hi
    Please could you advise us to the best practices for using the 'cost details' field within Pricing. Currently I cannot find the way to surface the individual Cost Details fields within the Next Generation UI, even with the tick box for 'display both cost and price' ticked. It seems that these get surfaced when the Next Generation UI is turned off, but cannot find them when it is turned on. We can see the 'Pricing Summary' field but this does not fulfill our needs, as some of our services have both recurring and one-off costs.
    Attached are some screenshots to further explain the situation.
    Many thanks,
    Richard Thornton

    Hi Richard,
    If you need to configure dynamic pricing that may vary by tenant and/or if you want to set up cost drivers that are service item attributes, you should configure Billing Tables in the Demand Management module in 10.0. 
    The cost detail functionality in 9.4 will likely be merged with the new pricing feature in 10.0.  The current plan is not to bring cost detail into the Service Catalog module.

  • Best Practice for enhancing the SAP delivered standard WD ABAP application

    Hi,
    I am new to Web Dynpro ABAP.
    I need to enhance an SAP-delivered standard Web Dynpro component (a complex component with business objects & POWL).
    Kindly let me know the best practice for enhancing standard WD ABAP: option 1 or 2 below.
    1) Copy and create a "Z" version of the component and make changes in that, or
    2) Enhance the standard component directly, without making a "Z" copy.
    Regards,
    NS

    Hi NS,
    If it is a standard component, it is better to enhance the component rather than copy it into a Z component.
    If there is any issue within the standard component, SAP supports it through notes and OSS messages. If it is a Z component, SAP doesn't support it.
    If there is any upgrade of business packages, changes will be made to the standard component but not to Z components, so we could miss them.
    Further, since a standard component may be used in many places, reflecting the required changes everywhere would be difficult with a Z component.
    Regards,
    Harsha

  • Best practice for including additional DLLs/data files with plug-in

    Hi,
    Let's say I'm writing a plug-in which calls code in additional DLLs, and I want to ship these DLLs as part of the plug-in.  I'd like to know what is considered "best practice" in terms of whether this is ok  (assuming of course that the un-installer is set up to remove them correctly), and if so, where is the best place to put the DLLs.
    Is it considered ok at all to ship additional DLLs, or should I try and statically link everything?
    If it's ok to ship additional DLLs, should I install them in the same folder as the plug-in DLL (e.g. the .8BF or whatever), in a subfolder of the plug-in folder or somewhere else?
    (I have the same question about shipping additional files too, such as data or resource files.)
    Thanks
                             -Matthew

    Brother wrote:
    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third party ODBC which allows us to access data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object oriented programming with attributes and methods.
    OO is a choice, not a mandate. Using Java in a procedural way is certainly not ideal, but given that it is existing code I would look more into whether it is well-written procedural code rather than at the lack of OO.
    Now that I am becoming more comfortable with the Java language, I would greatly appreciate any feedback as to best practices for incorporating Java into my database.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Normally you create a data model driven by business need. You then implement using whatever means seem expedient in terms of other business constraints to closely model that data model.
    It is often the case that there is a strong correlation between data models and tables but certainly in my experience it is rare when there are not other needs driven by the data model (such as how foreign keys and link tables are implemented and used.)

  • Best Practice for Significant Amounts of Data

    This is basically a best-practice/concept question and it spans both Xcelsius & Excel functions:
    I am working on a dashboard for the US Military to report on some basic financial transactions that happen on bases around the globe.  These transactions fall into four categories, so my aggregation is as follows:
    Year,Month,Country,Base,Category (data is Transaction Count and Total Amount)
    This is a rather high level of aggregation, and it takes about 20 million transactions and aggregates them into about 6000 rows of data for a two year period.
    I would like to allow the users to select a Category and a country and see a chart which summarizes transactions for that country ( X-axis for Month, Y-axis Transaction Count or Amount ).  I would like each series on this chart to represent a Base.
    My problem is that 6000 rows still appears to be too many rows for an Xcelsius dashboard to handle.  I have followed the Concatenated Key approach and used SUMIF to populate a matrix with the data for use in the Chart.  This matrix would have Bases for row headings (only those within the selected country) and the Column Headings would be Month.  The data would be COUNT. (I also need the same matrix with Dollar Amounts as the data). 
    In Excel this matrix works fine and seems to be very fast.  The problem is with Xcelsius.  I have imported the spreadsheet, but have NOT even created the chart yet and Xcelsius is CHOKING (and crashing).  I changed Max Rows to 7000 to accommodate the data.  I placed a simple combo box and a grid on the Canvas - BUT NO CHART yet - and the dashboard takes forever to generate and is REALLY slow to react to a simple change in the Combo Box.
    So, I guess this brings up a few questions:
    1)     Am I doing something wrong and did I miss something that would prevent this problem?
    2)     If this is standard Xcelsius behavior, what are the Best Practices to solve the problem?
    a.     Do I have to create 50 different Data Ranges in order to improve performance (i.e. Each Country-Category would have a separate range)?
    b.     Would it even work if it had that many data ranges in it?
    c.     Do you aggregate it as a crosstab (Months as Column headings) and insert that crosstabbed data into Excel.
    d.     Other ideas that I'm missing?
    FYI:  These dashboards will be exported to PDF and distributed.  They will not be connected to a server or data source.
    Any thoughts or guidance would be appreciated.
    Thanks,
    David

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to certain extent.
    Regards
    Nikhil

  • Best Practices for converting SAP HR data (4.7 to ECC)

    Hello Experts ...
    We are going from 4.6 to ECC... not an upgrade; it will be a new implementation...
    I am looking for best practices to convert SAP HR data from one SAP instance (4.6) to another (ECC)...
    I am not sure if direct input or LSMW or any other method/tool is the best way...
    I will really appreciate and award points if I can get good advice or documentation...
    Let me know if my question is not clear enough.
    Thanks,

    Hi
    You can check SAP Marketplace and follow the download link there. You will require an SAP Marketplace login to download the Best Practices. Fortunately, Best Practices for ECC 5.00 and 6.00 are available there, but they are country-specific versions. I know HCM for the US is available there.
    Reward points, if helpful.
    Regards
    Waz

  • What are the best practices for using the enhancement framework?

    Hello enhancement framework experts,
    Recently, my company upgraded to SAP NW 7.1 EhP6.  This presents us with the capability to use the enhancement framework.
    A couple of senior programmers were asked to deliver a guideline for use of the framework.  They published the following statement:
    "SAP does not guarantee the validity of the enhancement points in future releases/versions. As a result, any implemented enhancement points may require significant work during upgrades. So, enhancement points should essentially be used as an alternative to core modifications, which is a rare scenario.".
    I am looking for confirmation or contradiction to the statement  "SAP does not guarantee the validity of enhancement points in future releases/versions..." .  Is this a true statement for both implicit and explicit enhancement points?
    Is the impact of activated explicit and implicit enhancements much greater to an SAP upgrade than BAdi's and user exits?
    Is there any SAP published guidelines/best practices for use of the enhancement framework?
    Thank you,
    Kimberly
    Edited by: Kimberly Carmack on Aug 11, 2011 5:31 PM

    Found an article that answers this question quite well:
    [How to Get the Most From the Enhancement and Switch Framework as a Customer or Partner - Tips from the Experts|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0f0373e-a915-2e10-6e88-d4de0c725ab3]
    Thank you Thomas Weiss!

  • Best practice for phone number column data type

    Hi,
    I hope I have posted this in the right forum; if not, just notify me and I will move it to the correct area.
    What would be considered best practice for storing a phone number:
    Number or Varchar2?
    Ben

    Well I was thinking that Number would have the following disadvantages,
    1. Users entering phone numbers into a form may be tempted to pre-format with spaces, thereby throwing up an error.
    2. Mobile phone numbers may have leading zeros
    3. Calculations are not carried out on phone numbers.
    I was leaning towards a varchar2 type.
    Ben

  • Best practices for customizing the standard OBIA metedata repository (RPD)

    Hello
    Is there a best-practices document published by Oracle or a partner that talks about best practices for customizing the OBIA out-of-the-box RPD? I am specifically looking for guidance around:
    1. Adding new objects to the physical layer or modifying an existing table definition to add more columns
    2. Building new logical columns in the BMM layer
    3. Modifying the existing Subject Areas.
    Thanks

    There is a very good presentation by Rittman mead on extending and customizing BI Applications. Refer to this link (http://www.rittmanmead.com/files/OOW2008%20-%20Extending%20and%20Customizing%20the%20BI%20Apps%20Data%20Warehouse.pdf ).
    Thanks,
    -Amith.
