Deploy to production (best practices)

I am wondering if there are any best practices published somewhere regarding deploying an app from your dev environment to a production environment?
I currently do the following.
1) export/import the application
2) generate a sync script (Toad) to get the schema objects in sync. Maybe it's better to do an export/import (expdp) of the schema here.
3) import images/files needed by the application
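For reference, here is a rough sketch of how step 2 could be scripted so the schema export is repeatable per release. This is only a sketch: it assumes expdp is on the PATH and that a DATA_PUMP_DIR directory object exists, and the connect string and schema name below are placeholders, not real values.

import java.io.IOException;

// Sketch only: runs a schema-level Data Pump export (step 2 above).
// The credentials, schema name, and directory object are placeholders.
public class SchemaExport {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "expdp", "dev_user/dev_pass@devdb",   // placeholder connect string
                "schemas=APP_SCHEMA",                 // schema that owns the app's objects
                "directory=DATA_PUMP_DIR",            // existing Oracle directory object
                "dumpfile=app_schema.dmp",
                "logfile=app_schema_exp.log");
        pb.inheritIO();                               // stream expdp output to the console
        int exitCode = pb.start().waitFor();
        System.out.println("expdp finished with exit code " + exitCode);
    }
}

The resulting dump file can then be imported on the production side with impdp, leaving step 1 (application export/import) and step 3 (images/files) to the APEX export facilities.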

Hi,
Have a look at:
Book: Pro Oracle Application Express
Author: John Edward Scott
Author: Scott Spendolini
Publisher: Apress
Year: 2008
It has a good section on migrating between environments.
Hope this helps.
Cheers,
Patrick Cimolini

Similar Messages

  • CF10 Production Best Practices

    Is there a document or additional information on the best way to configure multiple instances of CF10 in a production environment? Do most folks install CF10 as an ear/war J2EE deployment under JBoss or Tomcat with Apache as the web server?

    There’s no such document that I know of, no.
    And here’s a perfect example where “best practices” is such a loaded phrase.
    You wonder whether to “install CF10 as an ear/war J2EE deployment under JBoss or Tomcat with Apache as the web server”. I’d say the answer to that is “absolutely not”. Most folks do NOT deploy CF as a JEE ear/war. It’s an option, yes. And if you are running a JEE server already, then it does make great sense to deploy CF as an ear/war on said container.
    But would it be a recommended practice for someone installing CF10 without interest in JEE deployment? I’d say not likely, unless they already have familiarity with JEE deployment.
    Now, could one argue “but there are benefits to deploying CF on a JEE container”? Sure, they could. But would it be a “best practice”? Only in the minds of a small minority, I think (those who appreciate the benefits of native JEE deployment and containers). Of course, CF already deploys on a JEE container (Tomcat in CF10, JRun in CF 6-9), but the Standard and Enterprise Server forms of deployment hide all that detail, which is best for most. With those, we just have a ColdFusion directory and are generally none the wiser that it runs on JRun or Tomcat.
    That leads then to the crux of your first sentence: you mention multiple instances. That does change things quite a bit.
    First, a couple of points of clarification before proceeding: in CF 7-9, such “multiple instance” deployment was for most folks enabled using the Enterprise Multiserver form of deployment, which created a JRun4 directory where instances were installed (as distinguished from the Enterprise Server form I just mentioned above, which hid the JRun guts).
    In CF10, though, there is no longer a “multiserver” install option. It’s just that CF10 Enterprise (or the Trial or Developer editions) does let you create new instances, using the same Instance Manager in the CF Admin that existed for CF Enterprise Multiserver in 7-9. CF10 still only lets you create instances with the Enterprise (or Trial or Developer) edition, not Standard.
    (There is a change in CF10 about multiple instances, though: in CF10, you never see a Tomcat directory, even if you use multiple instances. When you create them, they are created right under the CF10 directory, as siblings to the cfusion directory. And while that cfusion directory previously existed only in the CF 7-9 Multiserver form of deployment, it now exists even in CF10 Standard, as the only instance it can use.)
    So all that is a lot of info, not any “best practices”, but you asked if there was any “additional info”, and I thought that helpful for you to have as you contemplate your options. (And of course, CF10 Enterprise does still let you deploy as a JEE ear/war if you want.)
    But no, doing it would not be a best practice. If someone asked for “the best way to configure multiple instances of CF10 in a production environment”, I’d tell them to just proceed as they would have in CF 7-9, using the same CF Admin Instance Manager capability to create them (and optionally cluster them).
    All that said, everything about CF10 does now run on Tomcat instead of JRun, and some things are improved under the covers, like clustering (and related things, like session replication), because those are now Tomcat-based features (which are actively updated and used by the Tomcat community), rather than JRun-based (which were pretty old and hardly used by anyone since JRun was EOL-ed several years ago).
    I’ll note that I offer a talk with a lot more detail contrasting CF10 on Tomcat to CF9 and earlier on JRun. That may interest you, snormo, so check out the presentations page at carehart.org.
    Hope all that’s helpful.
    /charlie
    PS You conclude with a mention of Apache as the web server. And sure, if one is on a *nix deployment or just favors Apache, it’s a fine option. But someone running CF10 on Windows should not be discouraged from running on IIS. It’s come a long way and is now very secure, flexible, and capable, whether used for one or multiple instances of CF.

  • Eclipse / Workshop dev/production best practice environment question.

    I'm trying to setup an ODSI development and production environment. After a bit of trial and error and support from the group here (ok, Mike, thanks again) I've been able to connect to Web Service and Relational database sources and such. My Windows 2003 server has 2 GB of RAM. With Admin domain, Managed Server, and Eclipse running I'm in the 2.4GB range. I'd love to move the Eclipse bit off of the server, develop dataspaces there, and publish them to the remote server. When I add the Remote Server in Eclipse and try to add a new data service I get "Dataspace projects cannot be deployed to a remote domain" error message.
    So, is the best practice to run everything locally (admin server, Eclipse/Workshop), get everything working, and then configure the same JDBC (or whatever) connections on the production server and deploy the locally created dataspace to the production box using the Eclipse that's installed on the server? I've read some posts/articles about a scripting capability that can perhaps do the configuration and deployment, but I'm really in baby-steps mode and probably need the UI for now.
    Thanks in advance for the advice.

    You'll want 4 GB.
    - mike

  • Deploying Branding Files Best Practice

    Question about best practice (if one exists) for the deployment method of branding files.
    Background:
    I created 2 different projects for a public-facing SP 2010 site.
    The first project contains 1 feature and is responsible for deploying the branding files: my custom columns, layouts, masterpage, CSS, etc.
    The second project is my web template. It contains 3 features: contenttypebinding, the webtemp default page, and the web temp files (onet.xml).
    I deploy my branding project, then my template files.
    Do you deploy branding as a farm solution or a sandboxed solution? And how do you update your branding files at this point?
    1. You don't; you deploy and forget about the solution, doing everything in SP Designer from this point on.
    2. Do step 1, then copy everything back to the VS project and save to TFS server.
    3. Do all design in VS and then update solution.
    I like the idea of having a fully completed project, but don't like the idea of having to go back to VS, package, and re-deploy every time I have a minor change to my masterpage when I can just open up Designer and edit.
    Do you deploy and forget about the branding files, using SP Designer to update master pages, layouts, etc., or do you work and deploy via the VS project?

    Hi,
    Many times we use sandboxed solutions for branding projects; that way it minimizes the dependency on SharePoint farm administration. Though there are advantages to using a SharePoint farm solution, a sandboxed solution is simpler to use, though it is essentially limited to the set of APIs available to sandboxed solutions.
    On the SP Designer side, many of the clients I have worked with were not comfortable with the idea of enabling SharePoint Designer access to their portal. SP Designer, though powerful, sometimes brings more issues when untrained business users start customizing the site.
    Another issue with SPD is that you will not be able to track changes (you can still use versioning), and retracting changes is not that easy. As you said, you simply connect to the site using SPD and make changes to the master pages, but those changes have to be documented, and you should always maintain copies of the master pages so the changes can be retracted.
    Think about a situation where you make changes to the live site using SPD and the master page is messed up, and your users cannot access the site. We cannot follow a trial-and-error methodology on the production server. This brings up more questions: what is your contingency plan for restoring the site, what is your SLA for restoring it, and how critical is your business data?
    Although this SPD model may be useful for a small SharePoint setup with a limited tech budget, it is still not advisable.
    I would always favor a VS-based solution, which gives us more control over the design and planning for deployment.
    We do have a solution deployment window; as per our governance we categorize solution deployments based on criticality, and for important changes we plan for the weekends to avoid unavailability of the site.
    Hope this helps!
    Ram - SharePoint Architect
    Blog - SharePointDeveloper.in
    Please vote or mark your question answered, if my reply helps you

  • Delivering demo or trial versions of a product - best practice?

    Hello there RoboHelp gang,
    I am using RH8 HTML to create a knowledge product for commercial resale to clients and customers. One of the things I need to do is offer a 'trial' version of the end product. I imagine doing this by creating an instance of the knowledge product with selected areas freely available but with the bulk of the content 'locked down'.
    I understand from the instructions that I can do this via the conditional text function... the reason for posting is to find out if anyone has any additional best-practice comments, suggestions, or tips on setting up and managing a demo or trial version of applications built in RoboHelp 8 HTML? The end format of mine will be Adobe AIR.
    FYI - I've tried searching online and in these forums for more on this but can't find anything (...and that may be because I'm over-complicating it?! :)), but I'll happily post my findings (if any) once it's all set up to satisfaction.
    Thanks,
    DK
    RH8 newbie from the UK

    KeepItSimpleStupid!
    That is, make it as functional as possible, but don't raise expectations needlessly with a lot of fancy folderol; there'll be plenty of time to add cute stuff, and only as user feedback warrants. Yes, there are wonderful things you can add, but just because you can, doesn't mean you should.
    Note also that the topics you have retained (not conditionalized away) must not contain links to the ones that are missing (you can try explaining away such broken links, but they'll leave a bad taste nonetheless). In addition, their Index entries will be missing, and the topics will not be included in a Search.
    Good luck,
    Leon

  • UDDI and deployed Web Services Best Practice

    Which would be considered a best practice?
    1. To run the UDDI Registry in its own OC4J container with Web Services deployed in another container
    2. To run the UDDI Registry in the same OC4J container as the deployed Web Services

    The reason you don't see your services in the drop-down is because CE does lazy initialization of EJB components (which gives you a faster startup time of the server itself). But your services are still available to you. You do not need to redeploy each time you start the server. One thing you could do is create a logical destination (in NWA) for each service and use the "search by logical destination" button. You should always see your logical names in that drop-down, which you can use to invoke your services. Hope it helps.
    Rao

  • Deployment specific configuration - best practice

    I'm trying to figure out the best way to set up deployment-specific configurations.
    From what I've seen I can configure things like session timeouts and datasources. What I'd like to configure is a set of programmatically accessible parameters. We're connecting to a BPM server and we need to configure the URL, username and password, and make these available to our operating environment so we can set up the connection.
    Can we set up the parameters via a deployment descriptor?
    What about Foreign JNDI? Can I create a simple JNDI provider (from a file perhaps?) and access the values?
    Failing these, I'm looking at stuffing the configuration into the database and pulling it from there.
    Thanks

    Which version of the product are you using?
    Putting the configs in web.xml config params as in this example:
    https://codesamples.samplecode.oracle.com/servlets/tracking/remcurreport/true/template/ViewIssue.vm/id/S461/nbrresults/103
    will allow you to change the values per deployment easily with a deployment plan.
    Another alternative in 10.3.2 and later is to use a feature that allows resources like normal properties files to be overridden by putting them in the plan directory. I don't have the link to this one right now.
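    To make that concrete, here is a minimal sketch of reading such web.xml context params from application code. The bpm.* parameter names are made up for illustration; they would be declared as <context-param> entries in web.xml, and a deployment plan can then override their values per environment without rebuilding the archive.

    import javax.servlet.ServletContext;

    // Minimal sketch: the bpm.* parameter names are hypothetical and would be
    // declared as <context-param> entries in web.xml, then overridden per
    // environment with a deployment plan.
    public class BpmConnectionSettings {
        private final String url;
        private final String user;
        private final String password;

        public BpmConnectionSettings(ServletContext ctx) {
            this.url      = ctx.getInitParameter("bpm.server.url");
            this.user     = ctx.getInitParameter("bpm.server.user");
            this.password = ctx.getInitParameter("bpm.server.password");
        }

        public String getUrl()      { return url; }
        public String getUser()     { return user; }
        public String getPassword() { return password; }
    }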

  • Parallels Server For Mac configuration (or any VM product), best practices

    What is the recommended practice for configuring Parallels Server for Mac on an Xserve? The Xserve has 3 hard drive bays. What do most people do to maximize uptime, failover, disaster recovery, etc.? We do have 3 different physical Xserves with PSfM also.
    Is a software RAID recommended for the boot OS drive? For the drive storing the VMs, if it's not the same drive?
    How are your hard drives configured?
    I'm concerned that mirroring two 1TB hard drives and using that for both the host OS and for the VMs will present problems:
    When creating new VMs, copying large amounts of data, etc., I'd guess that moving 10-100 GB of data is taxing for the software RAID and will noticeably affect performance for all VMs.
    If one of the mirrored drives fails and the RAID needs to be rebuilt, this probably would bring the entire system to a crawl, at best, and possibly be unusable while it works to repair the RAID. Thus it would have to be done during off-peak hours. This would prevent a complete outage, though.
    are these correct assumptions?

    Hello there,
    There is no upgrade path from Server 2012 Essentials to R2. R2 is not an upgrade; it is a new build. Essentials does not upgrade easily, so you will want to perform a migration of the server. This will require additional hardware. You can possibly back up the 2012 Essentials server, build a temp server on workstation hardware just to do the migration from, restore the backup to the new server, and then perform the migration to R2.

  • Transports to Production - Best Practice for Timing Imports

    I would like to compile a list of possible issues/problems which can occur through Transporting requests to the Production system during Productive time, i.e. 'Business Time'.
    This is obviously against SAP recommendations that little, if any, user activity should take place during an import, but I would like, if possible, some more detail on the risks.
    Thanks for any help.

    Hello Ashley,
    If you are looking for a list of problems and issues related to TMS and what needs to be done for each particular problem,
    here is a very useful link:
    http://www.geocities.com/SiliconValley/Grid/4858/sap/Basis/tpprobs_solutions.htm
    Thanks & Regards
    Vivek

  • Problem in deploy BPM 11g best practices to weblogic

    Hi everybody,
    I am following the BPM 11g Best Practices instructions step by step. When I want to deploy SalesProcesses to the application server, in the 'select SOA servers' step of the deploy wizard there isn't any server to select. I logged in to the WebLogic admin server console and saw that the soa_server1 and bam_server1 state is SHUTDOWN. I found two ways to start them:
    1. from console that first must start node manager and then start them
    2. from command line --> startManagedWebLogic.cmd soa_server1
    I tried both ways, but both of them have problems. When I look at the log files, I finally see the exception below:
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Critical> <WebLogicServer> <BEA-000362> <Server failed. Reason:
    There are 1 nested errors:
    weblogic.management.ManagementException: Booting as admin server, but servername, soa_server1, does not match the admin server name, AdminServer
         at weblogic.management.provider.internal.RuntimeAccessService.start(RuntimeAccessService.java:67)
         at weblogic.t3.srvr.ServerServicesManager.startService(ServerServicesManager.java:461)
         at weblogic.t3.srvr.ServerServicesManager.startInStandbyState(ServerServicesManager.java:166)
         at weblogic.t3.srvr.T3Srvr.initializeStandby(T3Srvr.java:802)
         at weblogic.t3.srvr.T3Srvr.startup(T3Srvr.java:489)
         at weblogic.t3.srvr.T3Srvr.run(T3Srvr.java:446)
         at weblogic.Server.main(Server.java:67)
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>
    What should I do? Please help me :-(
    Regards.

    Try starting it as: startManagedWebLogic.cmd <managed server name> <Admin Server URL>
    e.g. startManagedWebLogic.cmd soa_server1 t3://localhost:7001

  • Best practice deploying additional updates

    Hello, what is the best practice concerning monthly Windows updates? We are currently adding additional Windows updates to the existing single package and updating the content on the DPs. However, this seems to work with inconsistent results.
    DPs are not finalising content.
    At other places I have worked we would create a separate package each month for additional updates and never had an issue. Any thoughts?
    SCCM Deployment Technician

    The documented best practices are all related to the maximum number of patches that are part of one deployment. That number should not pass 1000.
    Remember this is a hard limit of 1000 updates per Software Update Group (not deployment package). It's quite legitimate to use a single deployment package.
    I usually create static historical Software Update Groups at a point in time (e.g. November 2014). In this case it is not possible to have a single SUG for all products (Windows 7 has over 600 updates, for example). You have to split them. I deploy these updates (to pilot and production) and leave the deployments in place. Then I create an ADR which creates a new SUG each month and deploy it (to pilot and production).
    You can use a single deployment package for all the above.
    Gerry Hampson | Blog: www.gerryhampsoncm.blogspot.ie | LinkedIn: Gerry Hampson | Twitter: @gerryhampson

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer’s legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer’s legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Best Practice Analyzer for Exchange 2013

    Greetings,
    I have upgraded the messaging infrastructure from Exchange 2007 to Exchange 2013.
    I want to test the health of the system through ExBPA for Exchange 2013,
    but I don't find any setup for Exchange 2013 like there was for 2010.
    I went through an article from the Office 365 community, according to which, for on-premises Exchange as well, we need an Office 365 account (a trial account can also be used) to get the downloader file for ExBPA 2013.
    http://community.office365.com/en-us/w/deploy/office-365-best-practices-analyzer-for-exchange-server-2013.aspx
    But to run the setup, the servers need to be connected to the internet,
    and I don't want to expose my environment to the internet under any circumstances.
    Could somebody please suggest whether any setup is available that I can install directly without exposing the environment to the internet?
    Thanks in advance.
    Best Regards,
    K2

    Welcome to Exchange 2013.
    Exchange Server 2013 doesn't come with an ExBPA for health checks. This might help:
    http://exchangeserverpro.com/powershell-script-health-check-report-exchange-2010/
    Apart from that you can run these commands too
    Get-ServerHealth -Identity Exchange2013ServerName
    Test-ServiceHealth
    Cheers,
    Gulab Prasad
    Technology Consultant
    Blog: http://www.exchangeranger.com
    Check out CodeTwo’s tools for Exchange admins

  • Office 365 Best Practices Analyzer requirements

    Can you install the Office 365 BPA from a Windows 7 workstation, or do you need to install it on the Exchange server itself and run it from there?

    Hi cf090,
    This depends on which BPA you are trying to run. If you are running Office 365 Best Practices Analyzer for Exchange Server 2013 then this needs to be installed on your Exchange server (you can see the requirements here: http://community.office365.com/en-us/w/deploy/office-365-best-practices-analyzer-for-exchange-server-2013-requirements.aspx)
    There are other BPAs for O365, such as the Office 365 Best Practices Analyzer for your PC, but this is designed to see if your PC supports O365, not Exchange. More details here: http://community.office365.com/en-us/w/deploy/office-365-best-practices-analyzer-for-your-pc.aspx
    Hope this answers your question! 
    Mike 
    Mike Parker | MCSE - Messaging | www.cloudbusiness.com
