SCCM 2012 Update deployment best practices?

I recently upgraded our environment from SCCM 2007 to 2012. In switching over from WSUS to SCCM updates, I am having to learn how the new deployments work. I've got the majority of it working just fine: Microsoft updates, Adobe updates (via SCUP), etc.
A few users have complained that their systems seem to be using more processing power during the update scans, so I am wondering what the best practices are for this.
I am deploying all Windows 7 updates (32 and 64 bit) to a collection with all Windows 7 computers (32 and 64 bit)
I am deploying all Windows 8 updates (32 and 64 bit) to a collection with all Windows 8 computers (32 and 64 bit)
I am deploying all office updates (2010, and 2013) to all computers
I am deploying all Adobe updates to all computers... etc.
I'm wondering if it is best to be more granular than that? For example: should I deploy Windows 7 32-bit patches to only Windows 7 32-bit machines? Should I deploy Office 2010 Updates only to computers with Office 2010?
It's certainly easier to deploy most things to everyone and let the update scan take care of it... but I'm wondering if I'm being too general?

I haven't considered cleaning it up yet because the server has only been active for a few months, and I only connected the bulk of our domain computers to it a few weeks ago (550 PCs).
I checked several PCs, some that were complaining and some not. I'm not familiar with what the standard size of that file should be, but they seemed to range from 50 MB to 130 MB. My own is 130 MB, but mine is 64-bit and the others are not. Not sure if that makes a difference.
Briefly read over that website. I'm confused; it was my impression that WSUS was no longer used and only needed to be installed so SCCM can use some of its functions for its own purposes. I thought the PCs no longer even connected to it.
I'm running the WSUS cleanup wizard now, but I'm not sure it'll clean anything because I've never approved a single update in it. I do everything in the Software Update Point in SCCM, and I've been removing expired and superseded updates fairly regularly.
The wizard just finished, a few thousand updates deleted, disk space freed: 0 MB.
I found a script on TechNet that's supposed to clean out old updates:
http://blogs.technet.com/b/configmgrteam/archive/2012/04/12/software-update-content-cleanup-in-system-center-2012-configuration-manager.aspx
Haven't had the chance to run it yet.

Similar Messages

  • SCCM 2012 R2 Driver - Best Practices on Updating Driver Packages?

    For example, the new Surface drivers were released and we are currently using the September ones. What is the best way to update the drivers? If I import them, it shows multiple drivers, old and new... Thoughts? Blog post?

    No. You must always import drivers to be able to use either of the Apply Driver task types in a task sequence.
    However, you can also run a driver installer provided by the vendor as a package, because the driver installer is a generic exe that does whatever it's supposed to do outside the control of ConfigMgr.
    Note that although you can't use an Auto Apply Driver task in stand-alone media, you can absolutely use an Apply Driver Package task in stand-alone media. In general, most folks do not rely on Auto Apply but instead rely on Apply Driver Package, for multiple reasons.
    Jason | http://blog.configmgrftw.com | @jasonsandys

  • Grid Control deployment best practices

    Looking for this document; interested to know more about Grid Control deployment best practices, and about monitoring and managing 300+ databases.

    hi
    have a search for the following document
    MAA_WP_10gR2_EnterpriseManagerBestPractices.pdf
    regards
    Alan

  • SCCM 2012 updates stuck on device

    Hi all,
    I have a problem with SCCM 2012 updates on two machines. I have published security updates to all devices in my collections without any problems, until last week. On two devices, the download process has been stuck at 50%, 66%, or 78%.
    Could I ask how I should start troubleshooting this problem? (I'm still "new" to SCCM topics.)
    Thank you for any ideas and advice.
    Cheers!

    Pls check the logs
    http://technet.microsoft.com/en-us/library/hh427342.aspx#BKMK_SU_NAPLog
    Check these discussions
    http://social.technet.microsoft.com/Search/en-US/?query=stuck%20downloading&rq=meta:Search.MSForums.ForumID(7f829d1c-2fc5-4e76-aadc-a118080ff523)&rn=Configuration+Manager+Software+Updates+Management+Forum
    Thanks, Prabha G

  • Creating Software Update Packages - Best Practice?

    I am setting up our SCCM 2012 R2 environment to begin using it for Windows Updates, however I'm not sure 100% the best method of setting it up.
    Currently my plan is to break out the deployment packages by OS, but I have read/been told that I should avoid creating too many dynamic deployment packages, as every time one changes all the computers will re-scan that package. So what I want to do is create various packages by OS and year: I would have a package that contains all updates for Windows 7 older than January 31, 2013 (assuming the package doesn't have 1,000+ updates) that are not superseded/expired. Then I would create packages for the 2014 monthly updates each month, then at the end of 2014 combine them all into one package and restart the process for 2015. Is this a sound plan, or is there a better course of action?
    If this is the best-practice method, is there any way to automatically create these packages? I tried the Automatic Deployment Rules, but I cannot set a year of release, only a time frame relative to the release (older than 9 months), unless I am missing something. The only way I can see doing this is going into All Software Updates, filtering on my requirements, and then manually creating the package, but this would be less desirable, as after each year I would like to remove the superseded and expired updates without having to recreate the package.
    Mark.

    First, please learn what the different objects are -- not trying to be rude, just stating that if you don't do this, you will have fundamental issues. Packages are effectively meaningless when it comes to deploying updates. Packages are simply a way of grouping the binary files so they can be distributed to DPs and in turn made available to clients. The package an update is in is irrelevant. Also, you do not "deploy" update packages, and packages are not scanned by clients. The terminology is very important because there are implications that go along with it.
    What you are actually talking about above are software update groups. These are separate and distinct objects from update packages. Software Update groups group updates (not the update binaries) into logical groups that can be in-turn deployed or used for
    compliance reporting.
    Thus, you have two different containers that you need to be concerned about: update packages and update groups. As mentioned, the update package an update is in is pretty meaningless as long as the update is in a package that is also available to the clients that need it. Thus, the best way (IMO) to organize packages is by calendar period. Yearly or semi-annually usually works well. This is done more or less to avoid putting all the updates into a single package that could get corrupted or would be difficult to deploy to new DPs.
    As for update groups, IMO, the best way is to create a new group every month for each class of products. This typically equates to one for servers, one for workstations, and one for Office every month. Then at the end of every year (or some other timeframe),
    rolling these monthly updates into a larger update group. Keep in mind that a single update group can have no more than 1,000 updates in it though. (There is no explicit limit on packages at all except see my comments above about not wanting one huge package
    for all updates.)
    Initially populating packages (like 2009, 2010, 2011, etc) is a manual process as is populating the update groups. From then on, you can use an ADR (or really three: one for workstations, one for servers, and one for Office) that runs every month, scans
    for updates released in the past month, and creates a new update group.
    Depending upon your update process, you may have to go back and add additional deployments to each update group also, but that won't take too long. Also, always QC your update groups created by an ADR. You don't want IE11 slipping through if it will break
    your main LOB application.
    Jason | http://blog.configmgrftw.com
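The calendar-period package and monthly update-group scheme described above can be sketched as a toy model. The update records, KB numbers, and grouping functions below are hypothetical illustrations of the bookkeeping only, not the actual ConfigMgr objects or cmdlets:

```python
from collections import defaultdict
from datetime import date

# Hypothetical update records: (title, product class, release date).
updates = [
    ("KB111 Win7 Security Update", "workstation", date(2013, 2, 12)),
    ("KB222 Server 2008 R2 Update", "server", date(2013, 2, 12)),
    ("KB333 Office 2010 Update", "office", date(2013, 3, 12)),
    ("KB444 Win7 Security Update", "workstation", date(2014, 1, 14)),
]

# Deployment packages grouped by calendar year (binary content only).
packages = defaultdict(list)
for title, _, released in updates:
    packages[released.year].append(title)

# Software update groups: one per month per product class (the deployable objects).
update_groups = defaultdict(list)
for title, product_class, released in updates:
    update_groups[(released.year, released.month, product_class)].append(title)

def rollup(year):
    """Year-end rollup: merge a year's monthly groups, respecting the 1,000-update cap."""
    merged = [t for (y, _m, _c), titles in update_groups.items() if y == year for t in titles]
    assert len(merged) <= 1000, "a single update group cannot exceed 1,000 updates"
    return merged

print(sorted(packages))   # -> [2013, 2014]
print(len(rollup(2013)))  # -> 3
```

The split also shows why package membership doesn't matter to clients: the yearly package dict only carries content, while the deployable unit is the (year, month, product class) group.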

  • SCCM 2012 Update Group inconsistency Problem with Red marked Updates

    Hi everybody,
    we have a big problem with our SCCM 2012 R2 CU3 environment regarding update groups that have gone out of sync.
    We have an SCCM site infrastructure with one CAS and 4 primary sites, and the SUP role is installed on every site.
    I don't know when this failure first occurred, but I think it was after the CU3 installation. The installation itself went smoothly, without any errors or warnings.
    The problem is as follows. We have some updates in update groups (all of them are Core XML updates) which are out of sync and marked red as invalid updates on 2 primary sites. On the CAS and the 2 other primary sites they are marked green (downloaded: yes, deployed: yes).
    We have no replication issues according to the replication status (everything is synchronized to 100%), and the Replication Link Analyzer also shows no problems at all.
    I then deleted the deployments and the software update group, waited until replication was fine, created a new one, and downloaded these patches on one of the primary sites that had shown this failure.
    The result was not good. It looks the same as before. On the CAS and 2 primary sites the deployment shows as downloaded, but on the other 2 sites the status is again downloaded: no.
    Does anybody have any idea what to do now? I checked objmgr.log and rcmctrl.log but found nothing that points me in the right direction.
    Thanks for your time; it would be great if anybody could share knowledge about this failure and how to fix it.
    All other ideas are also welcome.
    Thx a lot in advance and have a nice bug free day :-)
    Bastian

    Hi,
    Please try to manually synchronize software updates from the CAS and monitor the WSUSCtrl.log, WCM.log and wsyncmgr.log on the CAS and Primary sites.
    Best Regards,
    Joyce

  • Expired Updates Removal - Best Practices

    http://blogs.technet.com/b/configmgrteam/archive/2012/04/12/software-update-content-cleanup-in-system-center-2012-configuration-manager.aspx
    I was searching for best practices for removing expired updates from environment and found this useful link.
    There are some queries :
    1) It actually says to remove all the expired updates in one go, without removing them from SUGs first. When expired updates that are also part of an active SUG are removed, wouldn't this trigger a software update scan request for all clients in the collections those SUGs were targeted to, since there was a change in the SUG?
    2) How about deleting the deployments from collections first, then removing the expired updates from only those SUGs, and proceeding that way? Wouldn't this lower the processing?
    3) Expired updates not part of any SUG will be removed; just to make sure, if an expired update is part of a SUG that is not targeted to any collection, will it still be removed?
    4) Once an expired update is removed, what is the process for its removal from the distribution point? What other automated tasks will be triggered, like updating the software update packages on the DP once there is any change? I have been prestaging software update packages and extracting them on DPs. For any new DP, as the prestage still contains the older (expired, removed) updates, will they get extracted on the new DP?
    Are all the steps I mentioned above also valid for superseded updates instead of expired ones?

    I am not clear on the below, Jacob:
    "If you delete the deployment, all of the policy for those updates will be removed."
    But that removes every single update, not just the ones you removed. A bit more processing goes into removing everything.
    My concern here is: suppose there are 10 SUGs, each deployed to 40 collections, and, let's say, 1,000 updates.
    If I select all the expired updates and just edit their membership, and those updates happen to be spread across all 10 SUG deployments, removing them will trigger the policy cycle for the clients of all the collections.
    What I was talking about is: what if I pick 1 SUG out of the 10 and remove it from its 40 collections first, and once that is done, go ahead with removing the expired updates from that SUG?
    This is what I need some clarification on.

  • SQL 2012 service accounts best practice

    I'm installing SQL Server 2012 for ConfigMgr 2012 r2 and I wonder what is the best practice for SQL service accounts.
    During the installation of SQL Server, in the server configuration/Service accounts menu I'm allowed to configure following service accounts: SQL Server Agent, SQL Server Agent Database Engine, SQL Server Reporting Services, SQL Server Browser.
    Do I have to create separate domain user (not admin) accounts for each service and configure service principal name (SPN) for all of them?
    For example: Domain user account named SQLSA for SQL Server Agent, another domain user account
    SQLADBE for SQL Server Agent Database Engine etc.

    During the installation of SQL Server 2012, the user is prompted to provide service account
    credentials. The default service accounts suggested vary depending on whether SQL Server
    2012 is installed on a computer running Windows Vista or Windows Server 2008 or on a computer
    running Windows 7 or Windows Server 2008 R2. On computers running Windows Vista
    or Windows Server 2008 operating systems, the following default service accounts are used:
    - NETWORK SERVICE: Database Engine, SQL Server Agent, Analysis Services, Integration Services, Reporting Services, SQL Server Distributed Replay Controller, SQL Server Distributed Replay Client
    - LOCAL SERVICE: SQL Server Browser, FD Launcher (Full-Text Search)
    - LOCAL SYSTEM: SQL Server VSS Writer
    On computers running Windows 7 or Windows Server 2008 R2 operating systems, the following
    default accounts are used:
    - Virtual Account or Managed Service Account: Database Engine, SQL Server Agent, Analysis Services, Integration Services, Replication Services, SQL Server Distributed Replay Controller, SQL Server Distributed Replay Client, FD Launcher (Full-Text Search)
    - LOCAL SERVICE: SQL Server Browser
    - LOCAL SYSTEM: SQL Server VSS Writer
    For Windows 7 and Windows Server 2008 R2, you can use a Managed Service Account
    (MSA) or a Managed Local Account. The differences between these account types are as
    follows:
    - Managed Service Account (MSA) This special kind of domain account managed
    by a domain controller is assigned to a single member computer and used for running
    services. The MSA password is managed by the domain controller. MSAs can register
    a Service Principal Name (SPN) with Active Directory. MSAs use a $ name suffix; for
    example, CONTOSO\SQL-A-MSA$. You must create the MSA prior to running SQL
    Server Setup if you want to use an MSA with SQL Server services.
    - Virtual Accounts or Managed Local Accounts These virtual accounts can access
    the network in a domain environment and are used by default for service accounts
    during SQL Server 2012 setup when run on Windows 7 or Windows Server 2008 R2.
    Such accounts use the NT SERVICE\<SERVICENAME> format. You don’t need to specify
    a password when using virtual accounts with SQL Server 2012 because this is handled
    automatically by the operating system.
    You should run SQL Server services, using the minimum possible user rights, and use an
    MSA or virtual account when possible. If you are manually configuring service accounts, use
    separate accounts for different SQL Server services. If it is necessary to change the properties
    of service accounts used for SQL Server 2012, use SQL Server tools such as SQL Server
    Configuration Manager. This ensures that all necessary dependencies are
    updated, which does not happen if you use only the Services console.
    Although you can configure domain accounts as service accounts, this strategy requires
    more effort because you must ensure that service account passwords are changed regularly.
    You must also manage SPNs, which are required for Kerberos authentication.
    Best regards
    P.Ceglie

  • SCCM 2012 updates, refine query

    Hi,
    Please advise on refining a Windows update search in SCCM 2012.
    We want only security and critical updates for Office, Windows 2008 R2 and 2012, x64, and Internet Explorer 10 only.
    Problem:
    - can't get only Internet Explorer 10 to appear (tried it several ways; now we work with a custom severity set to None, then selecting Internet Explorer 11 etc. and setting them to severity None). Please advise how you do that.
    - x86 updates appear (OK for Office, but not for Windows), and also IA64; we only need x64.
    Jan Hoedt

    Thanks for your feedback.
    Why do we worry about the query? Maybe I'm missing something here too. How else can you select specific updates in bulk? Otherwise we'd need to evaluate them one by one.
    The goal is to select all the updates matching a specific query and add them to an update group, without needing to worry that a wrong update slipped in (e.g. deploying IE11 would mean a disaster).
    What we want to do: deploy security and critical updates for Office, Windows 2008 R2 and 2012, x64, and Internet Explorer 10 only.
    If we would just select the updates and deploy:
    - the not-applicable updates would indeed not be deployed, but they would still be downloaded, wasting bandwidth and storage (e.g. IA64);
    - also previews, Internet Explorer 11, 12, etc. would be deployed, whereas we need to stick to Internet Explorer 10.
    If I select "title does not contain Internet Explorer 11" or "title does not contain IA64", it totally ignores it.
    J.
    Jan Hoedt
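The compound criteria in this thread (security/critical only, x64 only, IE 10 but never IE 11) are easier to verify when written out as explicit filter logic. Below is a rough sketch with made-up update titles; the metadata fields and severity values are assumptions for illustration, not the actual SCCM search-criteria engine:

```python
# Hypothetical update metadata: (title, severity, architecture).
updates = [
    ("Security Update for Windows Server 2008 R2 x64", "Critical", "x64"),
    ("Security Update for Windows Server 2008 R2 IA64", "Critical", "IA64"),
    ("Cumulative Update for Internet Explorer 10 x64", "Critical", "x64"),
    ("Internet Explorer 11 for Windows Server 2008 R2 x64", "None", "x64"),
    ("Update for Office 2010 x86", "Important", "x86"),
]

def wanted(title, severity, arch):
    """Apply Jan's criteria as one explicit pass instead of ad-hoc console filters."""
    if severity not in ("Critical", "Important"):  # security/critical only
        return False
    if arch == "IA64":                             # never want Itanium
        return False
    if "Internet Explorer 11" in title:            # stay on IE 10
        return False
    if arch == "x86" and "Office" not in title:    # x86 acceptable for Office only
        return False
    return True

selected = [t for t, s, a in updates if wanted(t, s, a)]
```

Writing each exclusion as its own condition makes it obvious why a single "title not contains" filter can silently miss cases: the rules interact (architecture vs. product), so they need to be combined with AND logic, not applied one at a time.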

  • SCCM 2012 R2 Deployment assistance and guidelines required

    Hi All,
    We have purchased the System Center suite and plan to deploy the below products in our environment.
    SCOM - plan already made and architecture ready, with no issues.
    SCCM - in planning stage.
    We are planning to use the same SQL Server for both SCOM and SCCM, which is SQL Server 2008 R2 SP1 CU6, so the database engine service is available to host our database.
    But the business has said we need to deploy reporting on another machine, which may be SQL 2012 SP1.
    What I want to know is: does SCCM 2012 R2 support using different versions of SQL Server for its features? I see no documentation for this.
    However, MS has stated in its documentation for SCOM that this is not supported and the same version of SQL must be used for all the features.
    Also, is SQL Express edition supported for the other primary sites if I use a licensed SQL in my CAS?
    Does any one have an idea on SCCM 2012 R2 for the above ?
    Gautam.75801

    Thank you very much, Garth.
    Also, my last remaining question is about SQL Server version interoperability.
    Does using different versions of SQL work in SCCM? I.e., can I use SQL 2008 R2 SP1 CU6 for the DB engine and SQL 2012 for reporting?
    Will this work in SCCM? For SCOM, MS has said in its documentation that using different versions of SQL for its features is not supported.
    So what is the case with SCCM: is it the same, or is it supported?
    The Microsoft documents do not address this question.
    The reason I ask is that the business does not have budget for hardware for a SQL Server, and they are asking to use a SQL Server in another domain for the DB engine, and another SQL Server for reporting, as the DB engine server does not have reporting.
    Gautam.75801

  • Jdev101304 SU5 - ADF Faces - Web app deployment best practice|configuration

    Hi Everybody:
    1.- We have several web applications that provide a service/product used for public administration purposes.
    2.- The apps use ADF Faces and ADF BC.
    3.- All of the apps participate in JavaSSO.
    4.- The web apps are deployed on on-demand servers.
    5.- We have noticed, with the increase of users on these dates, that the sessions created by the middle tier in the database stay inactive but are never destroyed or removed.
    6.- Even when we only sign into the apps using JavaSSO and perform no transactions (like inserting or deleting something), when we query v$sessions in the database the number of inactive sessions is always increasing, until the server collapses.
    So, we want to know if this is an issue with the configuration of the Application Module's properties, and whether there are some "best practices" you could provide us to configure a web application and avoid this behavior.
    The only configuration we found recommended for web apps is setting jbo.locking.mode to optimistic, but this doesn't correct the "increasing inactive sessions" problem.
    Please help us find some documentation or other resources to correctly configure our apps.
    Thanks in advance.
    Edited by: alopez on Jan 8, 2009 12:27 PM

    hi alopez
    Maybe this can help, "Understanding Application Module Pooling Concepts and Configuration Parameters"
    see http://www.oracle.com/technology/products/jdev/tips/muench/ampooling/index.html
    success
    Jan Vervecken

  • SCCM 2012 - App deploy - Configuring a dependency in an app which is also configured for supersedence.

    Hi there
    I am replacing an app with a newer version via SCCM 2012.
    So let's call the existing app App1 and the new app App2.
    I have an application created which has App1 configured to be superseded (uninstall) and App2 to be installed as an MSI deployment type.
    The issue I have run into is that the install/upgrade fails if App1 is currently open and running. The uninstall only works if App1 is not open.
    Has anyone had to deal with something like this before without reverting to a script deployment type that runs taskkill commands and then an msiexec /x command? Is there a more sophisticated way to do this using the new application model?
    The sequence would need to be:
    1) App1 if running on the target machine is killed
    2) App1 is uninstalled
    3) App2 is installed
    Anyone put anything like that together?
    Regards
    John

    Thank you for replying.
    Yes the key is indeed the uninstallation command and behaviour on App1; as my question makes clear I am aware of that.
    I am hoping someone has come up with a clever way of achieving such a requirement, or I will simply have to use a script, which seems a bit unsophisticated given the range of options available in the application model.
    Regards
    John
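If it does come down to a script deployment type, the kill/uninstall/install sequence John lists maps to three command lines. Below is a minimal sketch of such a wrapper; App1.exe, the product code GUID, and the App2.msi path are placeholders, not values from this thread:

```python
import subprocess

# Placeholders -- substitute the real process name, product code, and MSI path.
APP1_PROCESS = "App1.exe"
APP1_PRODUCT_CODE = "{00000000-0000-0000-0000-000000000000}"
APP2_MSI = r"C:\Deploy\App2.msi"

def build_steps():
    """Build the command lines for the kill -> uninstall -> install sequence."""
    return [
        # 1) Kill App1 if it is running (/F force, /IM match by image name).
        ["taskkill", "/F", "/IM", APP1_PROCESS],
        # 2) Uninstall App1 silently by product code.
        ["msiexec", "/x", APP1_PRODUCT_CODE, "/qn", "/norestart"],
        # 3) Install App2 silently.
        ["msiexec", "/i", APP2_MSI, "/qn", "/norestart"],
    ]

def run_steps():
    for cmd in build_steps():
        # taskkill exits non-zero when App1 isn't running; treat that as OK.
        subprocess.run(cmd, check=False)
```

Note that taskkill exits non-zero when the process isn't already running, so the wrapper deliberately ignores return codes on that step rather than failing the whole deployment.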

  • Oracle Identity Manager - automated builds and deployment/Best practice

    Is there a best practice for the directory structure of the repository in a version control system?
    Do you recommend keeping the whole xellerate folder plus a separate structure for XML files and Java code? (Considering the fact that multiple upgrades can occur over time.)
    How is custom code merged into the main application?
    How does deployment to the WebLogic application server occur? (Do you create your own script, or is there an out-of-the-box script that can be reused?)
    I would appreciate any guidance regarding this matter.
    Thank you for your help.

    Hi,
    You can use any IDE (Eclipse, Netbeans) for development.
    For, Getting started with OIM API's using Eclipse, please follow these steps
    1. Creating the working folder structure
    2. Adding the jar/configuration files needed
    3. Creating a java project in Eclipse
    4. Writing a sample java class that will call the API's
    5. Debugging the code with Eclipse debugger
    6. API Reference
    1. Creating the working folder structure
    The following structure must be created in the home directory of your project (Separate project home for each project):
    <PROJECT_HOME>
    \ bin
    \ config
    \ ext
    \ lib
    \ log
    \ src
    The folders will store:
    src - source code of your project
    bin - compiled code of your project
    config - configuration files for the API and any of your custom configuration files
    ext - external libraries (3rd party)
    lib - OIM API libraries
    log - local logging folder
    2. Adding the jar/configuration files needed
    The easiest way to perform this task is to copy all the files from the OIM Design Console folders into the corresponding <PROJECT_HOME> folders.
    That is:
    <XEL_DESIGN_CONSOLE_HOME>/config -> <PROJECT_HOME>/config
    <XEL_DESIGN_CONSOLE_HOME>/ext -> <PROJECT_HOME>/ext
    <XEL_DESIGN_CONSOLE_HOME>/lib -> <PROJECT_HOME>/lib
    3. Creating a java project in Eclipse
    + Start Eclipse platform
    + Select File->New->Project from the menu on top
    + Select Java Project and click Next
    + Type in a project name (For example OIM_API_TEST)
    + In the Contents panel select "Create project from existing source",
    click Browse and select your <PROJECT_HOME> folder
    + Click Finish to exit the wizard
    At this point the project is created and you should be able to browse through it in Package Explorer.
    Setting src in the build path:
    + In Package Explorer right click on project name and select Properties
    + Select Java Build Path in the left and Source tab in the right
    + Click Add Folder and select your src folder
    + Click OK
    4. Writing a sample Java class that will call the API's
    + In Package Explorer, right click on src and select New->Class.
    + Type the name of the class as FirstAPITest
    + Click Finish
    Put the following sample code in the class:
    import java.util.Hashtable;
    import com.thortech.xl.util.config.ConfigurationClient;
    import Thor.API.tcResultSet;
    import Thor.API.tcUtilityFactory;
    import Thor.API.Operations.tcUserOperationsIntf;

    public class FirstAPITest {
        public static void main(String[] args) {
            try {
                System.out.println("Startup...");
                System.out.println("Getting configuration...");
                ConfigurationClient.ComplexSetting config =
                    ConfigurationClient.getComplexSettingByPath("Discovery.CoreServer");
                System.out.println("Login...");
                Hashtable env = config.getAllSettings();
                tcUtilityFactory ioUtilityFactory =
                    new tcUtilityFactory(env, "xelsysadm", "welcome1");
                System.out.println("Getting utility interfaces...");
                tcUserOperationsIntf moUserUtility =
                    (tcUserOperationsIntf) ioUtilityFactory.getUtility("Thor.API.Operations.tcUserOperationsIntf");
                Hashtable mhSearchCriteria = new Hashtable();
                mhSearchCriteria.put("Users.First Name", "System");
                tcResultSet moResultSet = moUserUtility.findUsers(mhSearchCriteria);
                for (int i = 0; i < moResultSet.getRowCount(); i++) {
                    moResultSet.goToRow(i);
                    System.out.println(moResultSet.getStringValue("Users.Key"));
                }
                System.out.println("Done");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    Replace the "welcome1" with your own password.
    + save the class
    To run the example class perform the following steps:
    + Click in the menu on top Run, and run "Create, Manage, and run Configurations" wizard. (In the menu, this can be either "run..." or "Open Run Dialog...", depending on the version of Eclipse used).
    + Right click on Java Application and select New
    + Click on arguments tab
    + Paste the following in VM arguments box:
    -Djava.security.manager -DXL.HomeDir=.
    -Djava.security.policy=config\xl.policy
    -Djava.security.auth.login.config=config\authwl.conf
    -DXL.ClientClassName=%CLIENT_CLASS%
    (please replace the URL, in ./config/xlconfig.xml, to your application server if not running on localhost or not using the default port)
    + Click Apply
    + Click Run
    At this point your class is executed. If everything is correct, you will see the following output in the Eclipse console:
    Startup...
    Getting configuration...
    Login...
    log4j:WARN No appenders could be found for logger (com.opensymphony.oscache.base.Config).
    log4j:WARN Please initialize the log4j system properly.
    Getting utility interfaces...
    1
    Done
    Regards,
    Sunny Ajmera

  • SOA OSB Deployment best practices in Production environment.

    Hi All
    I just wanted to know the best practices followed in a production environment for deploying OSB and SOA code. As you are aware, both require libraries from either (JDeveloper or SOA Suite) and (OEPE and OSB). Should one rip out the libraries and package them with the ANT scripts (I am not sure, but SOA would require its internal ANT scripts and a lot of libraries to be bundled; OSB requires only a few OEPE and OSB libraries), or do we simply use one of the below:
    1) Use the production run time (SOA Server and OSB Server) to build and deploy the code. OEPE would not be present here, so we would just have to deploy the already created sbconfig.jar (We would build this in a local environment where OEPE and OSB would be installed). The code is checked out from a repository and transferred to this linux machine.
    2) Use a windows machine (which has access to prod environment) and have Jdeveloper, OEPE and OSB installed to build\deploy the code to production server. The code is checked out from a repository.
    Please let us know your personal experiences with the deployment in PROD. Thanks a lot!

    There are two approaches for deployment of OSB and SOA code.
    1. Use a machine specifically for build and deployment which will have access to all production environments (where deployment needs to be done). Install all the required software (oepe, OSB etc..) and use remote deployment for deploying the code.
    2. Bundle all the build and deployment related libraries and ship them as a deployment package on the target server and proceed with the deployment.
    Most commonly followed approach is approach#1.
    Regards
    Vivek

  • Portal server deployment best practices

    Does anyone out there know the right way to deploy the portal server into a production environment, instead of manually copying all the folders and running the necessary commands? Is there a better way to deploy the portal server? Are there any best practices I should follow for deploying the portal server?

    From the above, what I understood is that you would like to transfer your existing portal server configuration to the new one. I don't think there is an easy way to do it.
    One way you can do it is by taking an LDIF backup from the existing portal server.
    First install the portal server on the new box, then export the existing portal server's data (db2ldif exports; ldif2db imports) using
    # /opt/netscape/directory4/slapd-<host>/db2ldif /tmp/profile.ldif
    Edit the /tmp/profile.ldif file and replace <hostname> and <Domain name> with the new system values.
    Copy this file to the new server and import it using
    # /opt/netscape/directory4/slapd-<host>/ldif2db -i /tmp/profile.ldif
    Also copy the file "slapd.user_at.conf" under /opt/netscape/directory4/slapd-<hostname>/config to the new system.
    Restarting the server lets you access the portal server with the configuration of the old one.
