Best practice to maintain code across different environments

Hi All,
We have a portal application built with:
JDev version: 11.1.1.6
Fusion Middleware Control 11.1.1.6
In our application we have created many portlets using iframes inside our .jspx files (a few are referenced in the navigation file as well), and the URLs for these portlets differ across our environments (dev, test and prod). We use Subversion to maintain our code.
The problem we are having: apart from changing environment details while deploying to test and prod, we also have to manually change the portlet URLs from the dev URLs to the corresponding environment's URLs.
Is there a best practice to avoid this cumbersome task? Can we achieve it by creating a deployment profile?
Thanks
Kotresh

Hi.
Please post a sample of two of the differing URLs. In any case, you can use an EL expression to obtain the current host instead of hardcoding it. You could also consider using a common DNS name for all environments, mapped in each environment's hosts file.
Regards.
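To make the reply concrete: one way to avoid hardcoding the portlet URLs is to back an EL expression (e.g. #{portletUrls.baseUrl}) with a small managed bean that maps the current server name to the right base URL. The sketch below is a plain-Java illustration of that mapping; the host names, URLs and class name are all made-up assumptions, not part of any Oracle API.

```java
import java.util.Map;

/**
 * Hypothetical helper behind an EL expression such as
 * #{portletUrls.baseUrl}: resolves the portlet base URL from the
 * server name of the current request instead of hardcoding it.
 * All host names and URLs below are illustrative assumptions.
 */
public class PortletUrlResolver {

    // One entry per environment (dev, test, prod).
    private static final Map<String, String> BASE_URLS = Map.of(
        "portal-dev.example.com",  "http://portlets-dev.example.com",
        "portal-test.example.com", "http://portlets-test.example.com",
        "portal.example.com",      "http://portlets.example.com");

    /** Falls back to the dev URL when the host is not recognized. */
    public static String baseUrlFor(String serverName) {
        return BASE_URLS.getOrDefault(serverName, "http://portlets-dev.example.com");
    }
}
```

The iframe src in the .jspx would then reference the bean value rather than a literal URL, so the same build can be deployed unchanged to all three environments.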

Similar Messages

  • What is the best practice to migrate code from dev  to prod

    I have a few questions related to OWB version control/migration.
    1. What is the best way to version the design repository?
    We want to keep a separate copy of the production and test versions at all times.
    Do we have to create a design repository in each environment?
    2. Is it possible to have multiple copies of the same mapping? If yes, how is this done in OWB?
    3. How do we migrate the code to different environments?
    2.1 Is there a way to provide the DBA with a script for deploying a PL/SQL mapping without using OWB?
    2.2 How important is the setting before actual code in the XML file is required to execute the mapping.
    2.3 We are planning to create tables, materialized views etc. outside OWB; will this cause any problems?
    Note: Our DBA prefers compiling the code through SQL*Plus.
    3. Which user (target owner or runtime access) should be used to execute the mapping through SQL*Plus?
    We are looking for a solution where we do not have to use OWB tools for version control, migration and deployment. Currently we are using PVCS for version control and migration.
    We are using OWB 9.2, Oracle 9.2.
    Any suggestion/ comments are appreciated.
    Thanks,
    Sekar,K

    Some answers below:
    1. Yes, you should use snapshots (right-click on the object and select Create Snapshot, or on the Project menu select Change Manager for centralized snapshot management).
    2. See above.
    3. Usually you will export the (test/development) design repository to an mdl file, and import it into the other (production) design directory. Then you will deploy the code from the other design directory into the production target (runtime).
    2.1 From 9.2 on, you will have to deploy to file and then use OWB scripting (OMBPlus) to deploy the generated file outside of the deployment manager.
    2.2 Not sure what is asked in this question - can you rephrase?
    2.3 No, but the creation and use of these objects will not be managed and audited by OWB (Runtime Audit Browser).
    3.(bis) The target owner.
    Regards:
    Igor

  • Best practice to move things between various environments in SharePoint 2013

    Hi All SharePoint Gurus! I was using the SP Deployment Wizard (spdeploymentwizard.codeplex.com) to move sites/lists/libraries/items etc. in SP 2010. We just upgraded to SP 2013. I have a few lists and libraries that I need to push from the Development 2013 environment into the Staging 2013 and Production 2013 environments. The SP Deployment Wizard is throwing an error right from startup. I checked that SP 2013 provides granular backups, but they are restricted to the list/library level. Could anybody let me know if the SP Deployment Wizard works for 2013? I love that tool. Also, what's the best practice to move things between various environments?
    Regards,
    Khushi

    Hi Khushi,
    I want to let you know that we built a SharePoint migration tool, MetaVis Migrator, that can copy and migrate to and from on-premise or hosted SharePoint sites. The tool can copy entire sites with sub-site hierarchies, content types, fields, lists, list views, documents, items with attachments, look-and-feel elements, permissions, groups and other objects, all together or at any level of granularity (for example, just lists, or just list views, or selected items). The tool preserves created/modified properties, all metadata and versions. It looks like Windows Explorer with copy/paste and drag-and-drop functions, so it is easy to learn. It does not require any server-side installations, so you can do everything from your own computer or any other server. The tool can copy complete sites, individual lists, or even selected items, and it supports incremental or delta copies based on previous migrations. It also includes a Pre-Migration Analysis that helps to identify customizations.
    A free trial is available:
    http://www.metavistech.com . Feel free to contact us.
    Good luck with your migration project,
    Mark

  • Best Practice Regarding Maintaining Business Views/List of Values

    Hello all,
    I'm still in the learning process of using BOXI to run our Crystal Reports.  I was never familiar with the BO environment before but I have recently learned that every dynamic parameter we create for a report, the Business View/Data Connectors/LOV are created on the Enterprise Repository the moment the Crystal Report is uploaded.
    All of our reports are authored from a SQL Command statement and often times, various reports will use the same field name from the database for different reports.  For example, we have several reports that use the field name "LOCATION" that exists on a good number of tables on the database.
    When looking at the Repository, I've noticed there are several variations of LOCATION, each of which I'm assuming belongs to one specific report.  Having said that, I can see it starting to become a nightmare to figure out which variation of LOCATION belongs to which report.  Sooner or later the Repository will need to be maintained a bit more cleanly, and at the rate we author reports, I foresee a huge headache down the road.
    With that being said, what's the best practice in a nutshell when trying to maintain these repository items?  Is it done indirectly on the Crystal Report authoring side where you name your parameter field identifiable to a specific report?  Or is it done directly on the Repository side?
    Thank you.

    Eric, you'll get a faster qualified response if you post to the Business Objects Enterprise Administration forum, as that forum is monitored by qualified support for BOE.

  • Best Practice in maintaining multiple apps and user logins

    Hi,
    My company is just starting to use APEX, and none of us (the developers) have worked on this before either. It is greatly appreciated if we can get some help here.
    We have developed quite a few applications in the same workspace. Now, we are going to setup UAT and PRD environments and also trying to understand what the best practice is to maintain multiple apps and user logins.
    Many of you have already worked on APEX environment for sometime, can you please provide some input?
    Should we create multiple apps(projects) for one department or should we create one app for one department?
    Currently we have created multiple apps for one department, but we are not sure if a user can log in once and access all of the authenticated apps.
    Thank you,
    LC

    LC,
    I am not sure how much of this applies to your situation - but I will share what I have done.
    I built a single 700+ page application for my department; other areas create separate smaller applications.
    The approach I chose is flexible enough to accommodate both.
    I built a separate access control application (Control) in its own schema.
    We use database authentication for this app; an Oracle account is required.
    We prefer to use LDAP authentication for the user applications.
    For users where LDAP is not an option, an encrypted password is stored and reset via email.
    We use position-based security: privileges are based on job functions.
    We have applications, applications have roles, and roles have access to components (tabs, buttons, unmasked card numbers, etc.).
    We have positions that are granted application roles; they inherit access to the role components.
    Users have a name, a login, a position, and a site.
    We have users on both the East Coast and the West Coast; we use the site in a sys_context
    and views to emulate VPD. We also use the role components, sys_contexts and views to mask/unmask
    card numbers without rewriting the dependent objects (queries, reports, views, etc.).
    The position-based security has worked well: when someone moves,
    we change the position they are assigned to and they immediately have the privileges they need.
    If you are interested I can provide more detail.
    Bill

  • Best Practice for managing variables for multiple environments

    I am very new to Java WebDynPro and have a question concerning our deployments to the Sandbox, Development, QA, and Production environments.
    What is the 'best practice' people use so that information specific to each environment is not hard-coded in your Java WebDynPro code?
    I could put the value in a properties file, but how do I make that vary? Otherwise I'd still have to change the property file and re-deploy for each environment. I know there are some configurations on the Portal, but I am not sure if that will work in my instance.
    For example, I have a URL that varies by environment. I don't want to hard-code and re-compile for each environment; I'd prefer to get that information on the fly by knowing which environment I'm running in and loading the appropriate URL.
    So far the only thing I've found that comes close to telling me where I'm running is a parameter map, but the 'key' in the map is the URL, not the value, and I suspect there's a cleaner way to get something like that.
    I used Eclipse's autosense in NetWeaver to discover some of the things available in my web context.
    Here's the code I used to get that map:
    TaskBinder.getCurrentTask().getWebContextAdapter().getRequestParameterMap();
    In the forum there is an example that gets the IP address of the site you're serving from. It sounds like it is going to be (or has been) deprecated, although it works on my system right now, and I would really rather have something like the DNS name, not an IP address that could change.
    Here's that code:
    String remoteHost = TaskBinder.getCurrentTask().getWebContextAdapter().getHttpServletRequest().getRemoteHost();
    Thanks in advance for any clues you can throw my way -
    Greg

    Hi Greg:
    I suggest you check the "Software Change Management Guide"; in this guide you can find an explanation of the best practices for working with a development infrastructure.
    This is the link:
    http://help.sap.com/saphelp_erp2005/helpdata/en/83/74c4ce0ed93b4abc6144aafaa1130f/frameset.htm
    Now, if you can get the IP of your server or the name of your site, you can do the following:
    HttpServletRequest request = ((IWebContextAdapter) WDWebContextAdapter.getWebContextAdapter()).getHttpServletRequest();
    String server_name = request.getServerName();
    String remote_address = request.getRemoteAddr();
    String remote_host = request.getRemoteHost();
    You only need to add servlet.jar in your project properties > Build Path > Libraries.
    Good Luck
    Josué Cruz
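    Building on request.getServerName(): a common pattern is to let the server name select an environment-specific properties file, so nothing is hard-coded or re-deployed per environment. The sketch below is a generic Java illustration, not WebDynPro-specific API; the host-name conventions and file names are assumptions.

```java
import java.io.InputStream;
import java.util.Properties;

/**
 * Sketch: pick a properties file (config-dev.properties,
 * config-qa.properties, ...) based on the current server name.
 * Host-name conventions and file names are assumptions.
 */
public class EnvConfig {

    /** Maps a server name to an environment label. */
    static String environmentFor(String serverName) {
        if (serverName.contains("prod")) return "prod";
        if (serverName.contains("qa"))   return "qa";
        if (serverName.contains("dev"))  return "dev";
        return "sandbox";  // fallback for anything unrecognized
    }

    /** Loads the matching properties file from the classpath. */
    static Properties load(String serverName) throws Exception {
        Properties props = new Properties();
        String resource = "config-" + environmentFor(serverName) + ".properties";
        try (InputStream in = EnvConfig.class.getResourceAsStream(resource)) {
            if (in != null) {
                props.load(in);
            }
        }
        return props;
    }
}
```

    At runtime you would call EnvConfig.load(request.getServerName()) once and read the environment-specific URL from the returned Properties, so each deployment picks up its own values automatically.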

  • "best practice to maintain the SAP OM Org Structure"

    Hi SAP Experts,
    My client wants a best practice, or a safe process, to update, improve and maintain their existing SAP HCM organizational structure. In a way you could say I am doing a process-oriented job.
    Our client's system is not up to date due to a lack of user awareness and complete knowledge of the system. Because of this, they are unsure of the accuracy of the reports that come out of the system.
    As an HCM functional consultant I can look at this from the technical perspective, but not from this process-oriented role. I need your guidance in this regard; please suggest how I can move ahead and make some really valuable recommendations. I am confused about where to start and how to start. Please help me in this regard.
    Thanks in advance,
    Amar

    The only thing you need to keep in mind is the relationships between the objects in OM.
    Check the transaction OOVK for the relationships, and PP01/PP02 for assigning those objects.
    Re: Organization Structure
    This thread may help you.
    Let us know if there is anything else.
    Edited by: Sikindar on Dec 4, 2008 9:36 AM

  • Best practice for maintaining URLs between Dev, Test, Production servers

    We sometimes send order confirmations which include links to other services in requestcenter.
    For example, we might use the link <a href="http://#Site.URL#/myservices/navigate.do?query=orderform&sid=54">Also see these services</a>
    However, the service ID (sid=54) changes between our dev, test, and production environments.  Thus we need to manually go through notifications when we deploy between servers.
    Any best practices out there?

    Your best practice in this instance depends a bit on how much work you want to put into it at the front end and how tied to the idea of a direct link to a service you are.
    If your team uses a decent build sheet and migration checklist then updating the various URL’s can just be part of the process. This is cumbersome but it’s the least “technical” solution if you want to continue using direct links.
    A more technical solution would be to replace your direct links with links to a “broker page”. It’s relatively simple to create an asp page that can accept the name of the service as a parameter and then execute an SQL query against the DB to return the ServiceID, construct the appropriate link and pass the user through.
    A less precise, but typically viable, option would be to use links that take advantage of the built in search query functionality. Your link might display more results than just one service but you can typically tailor your search query to narrow it down. For example:
    If you have a service called Order New Laptop or Desktop and you want to provide a link that will get the user to that service you could use: http://#Site.URL#/RequestCenter/myservices/navigate.do?query=searchresult&&searchPattern=Order%20New%20Desktop%20or%20Laptop
    The above would open the site and present the same results as if the user searched for “Order New Desktop or Laptop” manually. It’s not as exact as providing a direct link but it’s quick to implement, requires no special technical expertise and would be “environment agnostic”.
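    The broker-page idea from above can be sketched in a few lines; here in plain Java rather than ASP. The class name, site URL, and the Map standing in for the SQL lookup of the ServiceID are all illustrative assumptions:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;

/**
 * Sketch of the "broker page": resolve the environment-local service ID
 * by service name and build the redirect URL. The Map stands in for an
 * SQL query against the environment's database (conceptually:
 * SELECT ServiceID FROM ... WHERE Name = ?); all names are assumptions.
 */
public class ServiceLinkBroker {

    // Per-environment lookup; in dev/test/prod this table differs.
    private static final Map<String, Integer> SERVICE_IDS =
        Map.of("Order New Laptop or Desktop", 54);

    static String linkFor(String siteUrl, String serviceName) {
        Integer sid = SERVICE_IDS.get(serviceName);
        if (sid == null) {
            // Unknown service: fall back to the search-query style link.
            return siteUrl + "/RequestCenter/myservices/navigate.do"
                 + "?query=searchresult&&searchPattern="
                 + URLEncoder.encode(serviceName, StandardCharsets.UTF_8);
        }
        return siteUrl + "/myservices/navigate.do?query=orderform&sid=" + sid;
    }
}
```

    Because the service name rather than the numeric sid is what the notification links carry, the same notification text works in every environment; only the broker's lookup data differs.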

  • Best Practices for Maintaining SSAS Projects

    We started using SSAS recently, and we maintain one project which we deploy to both the DEV and PROD instances by changing the deployment properties. However, this gets messy when we introduce new fact tables into the DEV data warehouse (that are not promoted to the Production data warehouse). While we work on adding new measure groups and calculations (based on the new fact tables in DEV), we are unable to make any changes to the production cube (such as changes to calculations, formatting, etc.) requested by business users. Sorry for the long question, but is there a best practice for managing projects and migrations? Thanks.

     While we work on adding new measure groups and calculations (based on new fact tables in DEV) we are unable to make any changes to production cube (such as changes to calculations, formatting etc) requested by business users.
    Hi Sbc_wisc,
    You can create a new project by importing the metadata from the production cube on the server, using the template, Import from Server (Multidimensional and Data Mining) Project, in SQL Server Data Tools (SSDT). And then make some changes on this project
    and then redeploy it to production server.
    Reference:
    Import a Data Mining Project using the Analysis Services Import Wizard
    Regards,
    Charlie Liao
    TechNet Community Support

  • Can BPM maintain flow across different applications

    Hello,
    I have a requirement where I have to maintain the business flow across different applications (Siebel CRM, Oracle Financials and third-party applications) without the end user knowing.
    Is it possible with BPM to navigate users from one application to another (CRM application -> third-party application -> Financials)? If there is a solution available with BPM, or with a different product, please provide the relevant documentation. I appreciate your help.
    Regards,
    Jay

    Hi,
    Yes. Oracle BPM can maintain a flow across multiple applications without the end user knowing. It is something it was built to do.
    First, applications like the ones you mentioned have an API (typically web service today but older applications exposed their API as Java POJOs, EJBs, COM, etc.). For Oracle BPM to access the applications, you need to expose the API in Oracle BPM's catalog. Customers that have a service bus expose the application APIs in the service bus and then Oracle BPM catalogs the service bus proxy services. Customers that do not have a service bus can expose the application APIs directly in Oracle BPM's catalog. Either way will work.
    Second, you'd design a process with a series of Interactive activities (human activities) and Automatic activities (activities that invoke the components that in turn invoke the APIs of your applications without human intervention). You'd add something called instance variables that carry the information throughout the life of the process for each work item instance. Interactive activities are placed into roles with a name associated with them (e.g. CSR or Manager) so the work done in each activity is done by the right type of person. Interactive activities can be set up so that the work item instance goes to a specific person instead of everyone in the role where the activity is located (e.g. send the instance to the CSR that talked to the customer last time).
    Third, at runtime, as each work item instance is created in the process (e.g. "Order 227"), the work item instance flows to one of the process's Interactive or Automatic activities. If it flows into an Interactive (human) activity, the end user assigned to the role where the activity is located clicks on an item in their web-based Oracle BPM WorkSpace inbox for the specific work item instance they are interested in working on (again, perhaps "Order 227"). Once clicked by the end user, a UI presentation (either built using Oracle BPM's WYSIWYG presentation editor or a JSP) shows the work that needs to be done specifically by that end user. The UI presentation is already populated with the information gathered from a database or from a previous API call in an Automatic activity. All this is done without end users having to cut out of one application and then paste into another application's screen; the right contextual information is sent to the right person at the right time. Once the end user finishes their manual task, the work might flow to an Automatic task that invokes another application's API automatically, using the logic and variable information gathered in earlier activities in the process.
    All this is done without the end users knowing that they are flowing through multiple applications to get their work done.
    Hope this helps,
    Dan

  • The best practice for data mart to different BW System.

    Hi All,
    Could you suggest what I should do in this case?
    I have 2 SAP BW systems, e.g. BW A and BW B.
    I want to transfer data from an InfoCube in BW A into an InfoCube in BW B.
    The things that I did:
    1. 'Generate Export Data Sources' for the InfoCube in BW A.
    2. Replicate the source system in BW B. BW B will then show the DataSource from the InfoCube in BW A.
    3. In SAP BW B, I create an InfoPackage; then I can fetch data from SAP BW A.
    What I want to ask is:
    1. Could I make this automatic? Because every time I want to fetch data from the InfoCube in SAP BW A, I must run the InfoPackage in SAP BW B. What's the best practice?
    2. Could RDA make it automatic? Automatic in my case means that every time there is new/updated data in the InfoCube in SAP BW A, I don't have to run the InfoPackage in SAP BW B;
    SAP BW B will automatically fetch the data from InfoCube A.
    If yes, could you give me step-by-step instructions on how to use RDA to solve my case?
    I really need your guidance.
    Thanks,
    Best regards,
    Daniel N.

    Hi Daniel,
    You can create a process chain to load your cube in BW A, and similarly create a process chain in your BW B system to load its cube.
    At the end of the process chain in BW A, you can then automatically run the process chain in BW B; you can use the Remote Chain option for this.
    This will trigger the chain automatically in the remote system.
    Regards,
    Mansi

  • Best Practice for Migrating code from Dev to a fresh Test ODI instance

    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project we have separate installations for Dev, Test and Prod, i.e. the master repositories are not common between the three. Now my code is ready in Dev. The Test environment has just been installed with ODI, and the master and work repositories have been created. That's it.
    Now, I need to understand the simplest and best way to export the code from Dev and migrate it to the Test environment. Can someone describe this as a step-by-step procedure in 5-6 lines?
    Some questions on current state.
    1. Do the id's of master and work repositories in Dev and Test need to be the same?
    2. I usually see in export file a repository id with 999 and fail to understand what it is exactly. None of my master or work repositories are named with that id.
    3. Logical Architecture objects and context do not have an export option. What is the suitable alternative for this?
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod. i.e. Master repositories are not common between all the three. Now my code is ready in dev. Test environment is just installed with ODI and Master and Work repositories are created. Thats it
    Now, I need to know and understand what is the simple & best way to import the code from Dev and migrate it to test environment. Can some one brief the same as a step by step procedure in 5-6 lines?
    If this is the first time you are moving to QA, it is better to export/import the complete work repository. If it is not the first time, then create scenarios of the specific packages and export/import them to QA. In the case of scenarios you need not bother about models/datastores. Keep in mind that the logical schema names in QA should be the same as those used in your DEV.
    Some questions on current state.
    1. Do the id's of master and work repositories in Dev and Test need to be the same?
    They should be different.
    2. I usually see in export file a repository id with 999 and fail to understand what it is exactly. None of my master or work repositories are named with that id.
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to:
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical Architecture objects and context do not have an export option. What is the suitable alternative for this?
    If you export the topology, you will get the logical connection and context details. If you do not export the topology, you will need to manually create the context and the other physical/logical connections.
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

  • Best Practices: BIP Infrastructure and Multiple Installations/Environments

    Hi all,
    We are in process of implementing BI Publisher as the main reporting tool to replace Oracle Reports for a number of Oracle Form Applications within our organization. Almost all of our Forms environments are (or will be) SSO enabled.
    We have done a server install of BIP (AS 10gR3) and enabled BIP with SSO (test) and everything seems in order for this one dev/test environment. I was hoping to find out how others out there are dealing with some of the following issues regarding multiple environments/installs (and licensing):
    Is it better to have one production BIP server, or as many BIP servers as there are middle-tier Forms servers? (Keeping in mind that all of these need to be SSO enabled.) Multiple installs would mean higher maintenance/resource costs, but is there any significant gain in autonomy from each application having its own BIP install?
    Can we get away with stand alone installations for dev/test environments? If so, how do we implement/migrate reports to production if BIP server is only accessible to DBAs in production (and even real UAT environment where developer needs to script work for migration)? In general, what is the best way to handle security when it comes to administration/development?
    I have looked at the Oracle iStore for some figures but this last question is perhaps one for Oracle Sales people but just in case anybody knows... How's licensing affected by multiple installations? Do we pay per installation or user? Do production and test/dev cost the same? Is the cost of stand alone environment different?
    I would appreciate if you can share your thoughts/experiences in regards to any of the above topics. Thank you in advance for your time.
    Regards,
    Yahya

    Your data is bigger than I run, but what I have done in the past is to restrict their accounts to a separate datafile and limit its size to the max that I want for them to use: create objects restricted to accommodate the location.

  • Best Practice for Debugging Code

    Hi!
    I am new to development in Crystal.  What method do you recommend to verify my formulas are doing what I expect?  For example, in other environments I would print or prompt the result. 
    What is the best approach in Crystal?
    Any help you can provide will be appreciated.

    It really depends on what you're trying to achieve.  If it's simply seeing if the calculation is right, just drop the field in the appropriate format and have it printed (or add a new format for this purpose that you can suppress or display).
    If a formula comes up with a value and you can't figure out why, my approach has been to add the following code to the formula in question (basic syntax):
    dim debug as number
    debug = debug / debug
    This will cause a divide-by-zero error, which will then cause Crystal to pull up the formula.  On the left of the window, all fields and variables used in the formula are displayed with their current values.  (You need to use the "debug" variable because Crystal Designer will give a syntactical divide-by-zero error if you code "formula = 1 / 0"...)
    As written, the breakpoint will hit on the first iteration of the formula.  You can cause the "breakpoint" to happen on later iterations by using something like:
    global dbgcnt as number
    dim debug as number
    dbgcnt = dbgcnt + 1
    if dbgcnt = 5 then
      debug = debug / debug
    end if
    where the "5" is the iteration number that you want it to break on.
    If anyone has a better way to "set a breakpoint", I'd love to hear it!  (Ah ha!  A great idea for the Suggestions thread.  Going there now!)
    HTH,
    Carl

  • Best practice to Maintain Folders in Portal

    Hello Everyone,
    Our Portal folders are a bit messed up, so we are thinking of re-arranging all of them in a proper manner. Before we do that, I thought of asking the experts' advice about the way folders copied or delta-linked from the SAP standard ESS folders should be maintained.
    It would be much appreciated if anyone could advise us on the best way of copying or delta-linking the SAP standard ESS folders.
    Regards,
    Gopal.

    Hi Gopal,
    I'm not sure if I got your question completely: whether you want to know if copying or delta-linking is the better approach, or whether you want to know in more detail how to arrange this etc. The latter I couldn't answer.
    For the first question: as long as you don't have many, many roles, and as long as you don't maintain long chains of delta-links, using delta-links is of course better from the point of view of administration and maintaining things centrally.
    Long delta-link chains may be a problem from the point of view of performance. So if you have many roles, big navigation structures within the roles, and targets that are delta-links pointing to delta-links pointing to ... pointing to the original iViews or whatever, this can become a performance problem.
    Anyhow, that last case points to a problematic structure in how you maintain your PCD content, and as you are just tidying up, I expect that using delta-links, with their advantage in maintainability, is the best way to go.
    Hope it helps
    Detlev
