Deploying Branding Files: Best Practices

A question about best practice (if one exists) for how to deploy branding files.
Background:
I created two different projects for a public-facing SharePoint 2010 site.
The first project contains one feature and is responsible for deploying the branding files: my custom columns, page layouts, master page, CSS, etc.
The second project is my web template. It contains three features: the content type binding, the web template default page, and the web template files (onet.xml).
I deploy my branding project first, then my template files.
Do you deploy branding as a farm solution or a sandboxed solution? And how do you update your branding files from that point on?
1. You don't: you deploy once, forget about the solution, and do everything in SharePoint Designer from then on.
2. Do option 1, then copy everything back into the VS project and check it into the TFS server.
3. Do all design work in VS and then update the solution.
I like the idea of having a complete, self-contained project, but I don't like having to go back to VS, package, and redeploy every time I make a minor change to my master page when I could just open Designer and edit.
So: do you deploy and forget, using SP Designer to update master pages, layouts, and so on, or do you keep working and deploying via the VS project?

Hi,
We often use sandboxed solutions for branding projects; that minimizes the dependency on SharePoint farm administration. There are advantages to using a farm solution, but a sandboxed solution is simpler to work with, though it is limited to the subset of the API available to sandboxed solutions.
On the SP Designer side, many of the clients I have worked with were not comfortable with the idea of enabling SharePoint Designer access to their portal. SPD is powerful, but it sometimes brings more issues when untrained business users start customizing the site.
Another issue with SPD is that you cannot track changes (you can still use versioning), and retracting changes is not easy. As you said, you simply connect to the site with SPD and edit the master pages, but those changes have to be documented, and you must always keep copies of the master pages so that you can roll changes back.
Think about the situation where you change the live site with SPD, the master page gets messed up, and your users cannot access the site. We cannot follow a trial-and-error methodology on a production server. This raises further questions: what is your contingency plan for restoring the site, what is your SLA for restoring it, and how critical is your business data?
The SPD model can be useful for a small SharePoint setup with a limited tech budget, but even then I would not advise it.
I would always favor a VS-based solution, which gives us more control over the design and over planning the deployment.
We do have a solution deployment window; per our governance we categorize solution deployments by criticality, and for important changes we schedule the weekend to avoid making the site unavailable.
Hope this helps!
Ram - SharePoint Architect
Blog - SharePointDeveloper.in
Please vote or mark the question as answered if my reply helps you

Similar Messages

  • Deploy to production (best practices)

    I am wondering whether there are best practices published somewhere for deploying an app from your dev environment to a production environment?
    I currently do the following:
    1) Export/import the application.
    2) Generate a sync script (Toad) to get the schema objects in sync. Maybe it's better to do an export/import (expdp) of the schema here.
    3) Import the images/files needed by the application.

    Hi,
    Have a look at:
    Book: Pro Oracle Application Express
    Author: John Edward Scott
    Author: Scott Spendolini
    Publisher: Apress
    Year: 2008
    It has a good section on migrating between environments.
    Hope this helps.
    Cheers,
    Patrick Cimolini

  • Managing Alert Log Files: Best Practices?

    DB Version : 10.2.0.4
    Several of our DBs' alert logs have become quite large. I know that Oracle will create a brand-new alert log file if I delete the existing one.
    But I just want to know how you manage your alert log files. Do you archive them (move them to a different directory) and recreate a brand-new alert log? I just want to know if there are any best practices I could follow.

    ScottsTiger wrote:
    DB Version : 10.2.0.4
    Several of our DBs' alert logs have become quite large. I know that Oracle will create a brand-new alert log file if I delete the existing one.
    But I just want to know how you manage your alert log files. Do you archive them (move them to a different directory) and recreate a brand-new alert log? I just want to know if there are any best practices I could follow.
    At the end of every day (or at whatever interval suits you), archive the alert.log by moving it to another directory and then removing the original. The next time it writes a message, the database instance will automatically create a new alert.log.
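    As an illustration of that archive-and-recreate approach, here is a minimal sketch in Java (the paths and the date-suffix naming are assumptions; point them at your own instance's dump directory):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.time.LocalDate;

    // Minimal sketch, assuming hypothetical paths: move the current alert
    // log into an archive directory; Oracle recreates alert.log on the
    // next write, so nothing else is needed.
    public class AlertLogArchiver {
        public static void main(String[] args) throws IOException {
            Path alertLog = Paths.get("/u01/app/oracle/admin/ORCL/bdump/alert_ORCL.log");
            Path archiveDir = Paths.get("/u01/app/oracle/admin/ORCL/alert_archive");
            Files.createDirectories(archiveDir);
            // e.g. alert_ORCL.log.2010-11-03
            Path target = archiveDir.resolve(alertLog.getFileName() + "." + LocalDate.now());
            // Moving (not copying) the file means the instance simply
            // starts a new alert log the next time it writes a message.
            Files.move(alertLog, target);
            System.out.println("Archived alert log to " + target);
        }
    }

    A daily cron/scheduler job running this (or an equivalent shell mv) keeps the live log small while preserving history.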

  • Import data from Excel file - best practice in CQ?

    Hi,
    I have a question about importing data from an Excel file and creating a table from that data in a CQ page. Is there an OOTB component in CQ that provides this kind of functionality? Has somebody implemented something like this, or is there a best practice for doing it?
    Thanks in advance for any answer,
    Regards
    kasq

    You can check a working example package [1] (use your Adobe ID to log in).
    After installing it, go to [2] for an immediate example.
    Unfortunately it only supports the old OLE-2 Excel format (.xls, not .xlsx).
    [1] - http://dev.day.com/content/packageshare/packages/public/day/cq540/demo/xlstable.html
    [2] - http://localhost:4502/cf#/content/geometrixx/en/company/news/pressreleases/my_personal_bests.html

  • UDDI and deployed Web Services Best Practice

    Which would be considered a best practice?
    1. To run the UDDI Registry in its own OC4J container, with Web Services deployed in another container
    2. To run the UDDI Registry in the same OC4J container as the deployed Web Services

    The reason you don't see your services in the drop-down is that CE does lazy initialization of EJB components (which gives you a faster startup time for the server itself), but your services are still available to you. You do not need to redeploy each time you start the server. One thing you could do is create a logical destination (in NWA) for each service and use the "search by logical destination" button. You should always see your logical names in that drop-down, and you can use them to invoke your services. Hope it helps.
    Rao

  • Constants or properties file - best practice question

    Hi,
    My application has a number of values that will be used in different classes throughout my project. For example, I perform a check against the max allowed length of an ID in numerous places in my code.
    Therefore, it makes sense to set this value in a central location, and refer to it as a variable where it is required in my code.
    I see in other projects that a public static final member in a Constants class is used to set these types of values. Is this recommended as best practice?
    The only alternative I can think of would be to use a properties file, and inject the value using Spring etc.
    What is considered best practice or the neatest way for doing this?
    Thanks

    user10340197 wrote:
    Thanks. I'm using Spring anyway, so it would provide me with the PropertyPlaceholderConfigurer for injecting these.
    And the name of that class provides a clue. As Kayaman said, constants are constants. Math.PI does not and will not change, EVER. Neither will Integer.MAX_VALUE.
    Configuration parameters, on the other hand, might change. If your MAX_ID_LENGTH is ever likely to change, and could do so without causing widespread chaos, then it probably should be a property (i.e., a configuration) value.
    If not, it should probably be a constant (with appropriate 60-point documentation warning people what might happen if they DO change it).
    Winston
    PS: There is nothing particularly terrible about having a Properties class (except that you'll want to call it something different) that initializes its values from configuration files, although if there are gazillions of them, you might want to:
    (a) Split them up into "themes".
    (b) Re-think your design.
    Winston
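    To make that distinction concrete, here is a minimal sketch (the AppSettings class, app.properties file, and max.id.length key are made-up examples) showing a true constant next to a configuration value loaded from a properties file:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    // Minimal sketch, assuming a hypothetical app.properties file on the
    // working directory: a true constant is a public static final field;
    // a configurable value is read from the file per deployment.
    public final class AppSettings {

        // A genuine constant: fixed forever, like Math.PI.
        // (36 is the length of a canonical UUID string.)
        public static final int MAX_UUID_LENGTH = 36;

        private static final Properties PROPS = new Properties();

        static {
            try (FileInputStream in = new FileInputStream("app.properties")) {
                PROPS.load(in);
            } catch (IOException e) {
                throw new ExceptionInInitializerError(e);
            }
        }

        // A configuration parameter: might change per environment, so it
        // is a property with a sensible default rather than a constant.
        public static int maxIdLength() {
            return Integer.parseInt(PROPS.getProperty("max.id.length", "32"));
        }

        private AppSettings() { } // no instances
    }

    With Spring, as the thread suggests, the hand-rolled loading would be replaced by a PropertyPlaceholderConfigurer pointed at the same file.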

  • In the Beginning it's Flat Files - Best Practice for Getting Flat File Data

    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third-party ODBC driver which allows us to access the data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object-oriented programming with attributes and methods.
    Now that I am becoming more comfortable with the Java language, I would greatly appreciate any feedback as to best practices for incorporating Java into my database.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Thanks,
    Gregory

    Brother wrote:
    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third-party ODBC driver which allows us to access the data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object-oriented programming with attributes and methods.
    OO is a choice, not a mandate. Using Java in a procedural way is certainly not ideal, but given that it is existing code I would look more at whether it is well-written procedural code than at the lack of OO.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Normally you create a data model driven by business need. You then implement, using whatever means seem expedient given other business constraints, something that closely models that data model.
    It is often the case that there is a strong correlation between the data model and the tables, but in my experience it is rare that the data model does not drive other needs as well (such as how foreign keys and link tables are implemented and used).
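    As a rough sketch of the table-modeling idea (the CUSTOMER table and its column names are hypothetical; substitute whatever your ODBC driver exposes), one ERP "table" might map to a class like this:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Minimal sketch: one ERP "table" modeled as a class whose attributes
    // correspond to the fields, loaded via plain JDBC.
    public class Customer {
        private final String id;
        private final String name;

        private Customer(String id, String name) {
            this.id = id;
            this.name = name;
        }

        public String getId()   { return id; }
        public String getName() { return name; }

        // Loads one row and maps its columns onto attributes.
        public static Customer findById(Connection conn, String custId)
                throws SQLException {
            String sql = "SELECT CUST_ID, CUST_NAME FROM CUSTOMER WHERE CUST_ID = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, custId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        return null; // no such customer
                    }
                    return new Customer(rs.getString("CUST_ID"),
                                        rs.getString("CUST_NAME"));
                }
            }
        }
    }

    Whether generating such classes automatically is worthwhile depends on how closely your data model tracks the tables, per the caveat above.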

  • Deployment specific configuration - best practice

    I'm trying to figure out the best way to set up deployment-specific configurations.
    From what I've seen, I can configure things like session timeouts and datasources. What I'd like to configure is a set of programmatically accessible parameters. We're connecting to a BPM server and we need to configure the URL, username, and password, and make these available to our operating environment so we can set up the connection.
    Can we set up the parameters via a deployment descriptor?
    What about Foreign JNDI? Can I create a simple JNDI provider (from a file, perhaps?) and access the values?
    Failing these, I'm looking at stuffing the configuration into the database and pulling it from there.
    Thanks

    Which version of the product are you using?
    Putting the configs in web.xml context-params, as in this example:
    https://codesamples.samplecode.oracle.com/servlets/tracking/remcurreport/true/template/ViewIssue.vm/id/S461/nbrresults/103
    will allow you to change the values per deployment easily with a deployment plan.
    Another alternative, in 10.3.2 and later, is a feature that allows resources like normal properties files to be overridden by putting them in the plan directory. I don't have the link to that one right now.
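    For example, a minimal sketch of the web.xml approach (the bpm.* parameter names are made up): declare <context-param> entries in web.xml, let a deployment plan override them per environment, and read them at startup with a listener:

    import javax.servlet.ServletContext;
    import javax.servlet.ServletContextEvent;
    import javax.servlet.ServletContextListener;

    // Minimal sketch, assuming hypothetical bpm.* context-params defined
    // in web.xml; a WebLogic deployment plan can override the values per
    // environment without repackaging the WAR.
    public class BpmConfigListener implements ServletContextListener {

        @Override
        public void contextInitialized(ServletContextEvent sce) {
            ServletContext ctx = sce.getServletContext();
            // Values come from <context-param> entries in web.xml.
            String url      = ctx.getInitParameter("bpm.url");
            String user     = ctx.getInitParameter("bpm.username");
            String password = ctx.getInitParameter("bpm.password");
            ctx.log("Connecting to BPM server at " + url + " as " + user);
            // ... open the BPM connection here using url/user/password ...
        }

        @Override
        public void contextDestroyed(ServletContextEvent sce) {
            // Release the BPM connection if one was opened.
        }
    }

    Since the plan only overrides descriptor values, the same WAR can move unchanged from dev to production.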

  • Problem deploying BPM 11g best practices to WebLogic

    Hi everybody,
    I am following the BPM 11g Best Practices instructions step by step. When I try to deploy SalesProcesses to the application server, there is no server to select in the "select SOA servers" step of the deploy wizard. I logged in to the WebLogic admin server console and saw that the soa_server1 and bam_server1 state is SHUTDOWN. I found two ways to start them:
    1. From the console: first start the node manager, then start the servers.
    2. From the command line: startManagedWebLogic.cmd soa_server1
    I tried both ways, but both have problems. When I look at the log files I finally see the exception below:
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Critical> <WebLogicServer> <BEA-000362> <Server failed. Reason:
    There are 1 nested errors:
    weblogic.management.ManagementException: Booting as admin server, but servername, soa_server1, does not match the admin server name, AdminServer
         at weblogic.management.provider.internal.RuntimeAccessService.start(RuntimeAccessService.java:67)
         at weblogic.t3.srvr.ServerServicesManager.startService(ServerServicesManager.java:461)
         at weblogic.t3.srvr.ServerServicesManager.startInStandbyState(ServerServicesManager.java:166)
         at weblogic.t3.srvr.T3Srvr.initializeStandby(T3Srvr.java:802)
         at weblogic.t3.srvr.T3Srvr.startup(T3Srvr.java:489)
         at weblogic.t3.srvr.T3Srvr.run(T3Srvr.java:446)
         at weblogic.Server.main(Server.java:67)
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
    <Jul 10, 2011 10:15:20 AM GMT+03:30> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>
    What should I do? Please help me :-(
    Regards.

    Try starting it as: startManagedWebLogic.cmd <managed server name> <admin server URL>
    e.g. startManagedWebLogic.cmd soa_server1 t3://localhost:7001

  • Naming Files - Best Practices?

    Since I began using Aperture (v1.0) I've imported files after first giving them a name such as "EventName 041225-001", etc. This added an extra step outside Aperture if I wanted the filename to match my previous naming scheme:
    (Event)(2-digit year)(2-digit month)(2-digit date)-(padded index)
    This year (2008) I switched to a 4-digit year to adapt to Aperture, but the index still isn't zero-padded. So I finally began wondering, "why bother?" Why not just import the files as named by my camera?
    1) What are the disadvantages of not renaming my RAW files upon import? 2) What happens when my camera starts to recycle the filenames (it has already happened once)? What will Aperture do then?

    DP Roberts wrote:
    Why not just import the files as named by my camera?
    I let the Nikon D2x name the pix; I name the folder that the Masters live in, and I similarly name the Project.
    The plus is that it is easy, and the archived original unedited file can always be found easily if need be. The downside is that every 10,000 images there could be another pic with the same name, but since two thirds of pix are culled, that leaves a pic with the same name only every ~30,000 images.
    Aperture knows which pic is which anyway. If I need to find an archived original file, yes, I may have 5-10 to choose among, but it happens seldom and it is easy enough to find the correct pic.
    -Allen Wicks

  • Best practice deploying additional updates

    Hello. What is the best practice concerning monthly Windows updates? We are currently adding the additional Windows updates to the existing single package and updating the content on the DPs. However, this seems to give inconsistent results: the DPs are not finalizing the content.
    At other places I have worked, we would create a separate package each month for the additional updates and never had an issue. Any thoughts?
    SCCM Deployment Technician

    The documented best practices all relate to the maximum number of patches that can be part of one deployment; that number should not exceed 1000.
    Remember this is a hard limit of 1000 updates per Software Update Group (not per deployment package). It's quite legitimate to use a single deployment package.
    I usually create static, historical Software Update Groups at a point in time (e.g. November 2014). In that case it is not possible to have a single SUG for all products (Windows 7 has over 600 updates, for example), so you have to split them. I deploy these updates (to pilot and production) and leave the deployments in place. Then I create an ADR which creates a new SUG each month, and deploy that (to pilot and production).
    You can use a single deployment package for all of the above.
    Gerry Hampson | Blog: www.gerryhampsoncm.blogspot.ie | LinkedIn: Gerry Hampson | Twitter: @gerryhampson

  • Flat File load best practice

    Hi,
    I'm looking for a flat file best practice for data loading.
    The need is to load flat file data into BI 7. The flat file structure has been standardized, but it contains four slightly different flavors of data; thus some fields may be empty while others are mandatory. The idea is to have separate cubes at the end of the data flow.
    On to the loading of said file:
    Is it best to load all data flavors into one PSA and then separate them into four specific DSOs based on data type?
    Or should the data be separated into separate file loads as early as the PSA? That is, have four DataSources/PSAs and separate flows from there on up to the cubes?
    I guess the pros and cons may come down to where the maintenance falls: separate files vs. separate PSAs/DSOs...?
    Appreciate any suggestions/advice.
    Appreciate any suggestions/advice.
    Thanks,
    Gregg

    I'm not sure whether there is a best practice for this scenario (or maybe there is one), as this is data related to a specific customer's needs. But if I were you, I would bring the one file into the PSA and route the data from there to its respective DSO. That would give me more flexibility within BI to manipulate the data as needed, without having to involve the business with four different files (chances are they would get the split wrong). In case of any issue, your troubleshooting would then start from the PSA rather than going through the file (very painful and frustrating) to see which records in the file screwed up the report. I'm more comfortable handling BI objects than data files, because you know exactly where you have to look.

  • Best Practice Analyzer for Exchange 2013

    Greetings,
    I have upgraded our messaging infrastructure from Exchange 2007 to Exchange 2013.
    I want to test the health of the system with ExBPA for Exchange 2013, but I can't find any setup for Exchange 2013 like there was for 2010.
    I went through an article from the Office 365 community, according to which, even for on-premises Exchange, we need an Office 365 account (a trial account also works) to get the downloader file for ExBPA 2013:
    http://community.office365.com/en-us/w/deploy/office-365-best-practices-analyzer-for-exchange-server-2013.aspx
    But to run that setup the servers need to be connected to the internet, and I don't want to expose my environment to the internet under any circumstances.
    Can somebody please suggest whether there is any setup available that I can install directly, without exposing the servers to the internet?
    Thanks in advance.
    Best Regards,
    K2

    Welcome to Exchange 2013.
    Exchange Server 2013 doesn't come with ExBPA for health checks. This might help:
    http://exchangeserverpro.com/powershell-script-health-check-report-exchange-2010/
    Apart from that, you can run these commands too:
    Get-ServerHealth -Identity Exchange2013ServerName
    Test-ServiceHealth
    Cheers,
    Gulab Prasad
    Technology Consultant
    Blog: http://www.exchangeranger.com
    Note: Posts are provided "AS IS" without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package, including:
    1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements: All participants must bring their own laptop to install SAP BusinessObjects Data Services on it. Please see the attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register, please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package, including:
    1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
    3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements: All participants must bring their own laptop to install SAP BusinessObjects Data Integrator on it. Please see the attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultants and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register, please follow the hyperlink below.
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald
