Best Practice for CQ Updates in complex installations (clustering, replication)?

Hi everybody,
we are planning a production setup of CQ 5.5 with an authoring cluster replicating to 4 publisher instances. We were wondering what the best update process looks like in a scenario like this. Let's say we need to install the latest CQ 5 update (which we actually have to):
Do we need to do this on every single instance, or can replication be utilized to distribute updates?
If updating a cluster - same question: one instance at a time? Just one, and the cluster does the rest?
The question is really: can update packages (official or custom) be automatically distributed to multiple instances? If yes, is there a "best practice" way to do this?
Thanks for any help on this!
Henning

Hi Henning,
The CQ 5.5 service packs are distributed as CRX packages. You can replicate these packages, and on the publish instances they are unpacked and installed.
In a cluster the situation is different: you have only one repository. So when you have installed the service pack on one node, the new versions of the bundles and other content are unpacked into the repository (most likely to /libs). Then the magic (essentially the JcrInstaller) takes care that the bundles are extracted and started.
I would not recommend activating the service pack in a production environment, because then all publish instances will be updated at the same time. And as a restart is required, you might encounter downtime. Of course you can make it work if you play with the replication agents :-)
cheers,
Jörg
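
For reference, a package replication like the one Jörg describes can be triggered over the CRX Package Manager HTTP service once the package has been uploaded to the author instance. Below is a minimal Java sketch, assuming the default Package Manager endpoint and cmd=replicate; the host, credentials, and package path are placeholders:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    // Minimal sketch: ask the author instance to replicate an already-uploaded
    // CRX package to its publish instances via the Package Manager service.
    public class ReplicatePackage {
        public static void main(String[] args) throws IOException {
            String packagePath = "/etc/packages/my-servicepack.zip"; // hypothetical package
            URL url = new URL("http://localhost:4502/crx/packmgr/service/.json"
                    + packagePath + "?cmd=replicate");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            String auth = Base64.getEncoder()
                    .encodeToString("admin:admin".getBytes("UTF-8"));
            conn.setRequestProperty("Authorization", "Basic " + auth);
            System.out.println("HTTP " + conn.getResponseCode()
                    + " " + conn.getResponseMessage());
            conn.disconnect();
        }
    }

As Jörg warns, this pushes the package to every publish instance wired to the default replication agents at once; for a rolling update, disable the agents for all but one publisher first.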

Similar Messages

  • Best practice for auto-updating Flex web applications

    Hi all
    Is there a best practice for auto-updating Flex web applications, much in the same way AIR applications have an auto-update mechanism?
    can you please point me to the right direction?
    cheers
    Yariv

    Hey drkstr
    I'm talking about a more complex mechanism that can handle updates to modules being loaded into the application, etc.
    I can always query the server for the version and prevent loading from cache when a module needs to be updated,
    but I was hoping for something easy like the AIR auto-update feature.
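
    As a rough, language-agnostic illustration of the version-query idea above (this is not Flex-specific; the URL and version values are placeholders), the client fetches the server's current version string and compares it with the one it is running:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.net.URL;

        // Sketch of a client-side version check against a hypothetical
        // version.txt published next to the application.
        public class VersionCheck {
            private static final String LOCAL_VERSION = "1.0.3";

            public static void main(String[] args) throws IOException {
                URL url = new URL("http://example.com/app/version.txt");
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(url.openStream(), "UTF-8"))) {
                    String serverVersion = in.readLine();
                    if (serverVersion != null && !LOCAL_VERSION.equals(serverVersion.trim())) {
                        // A Flex app would bypass the cache here, e.g. by
                        // appending ?v=<serverVersion> to its module URLs.
                        System.out.println("Update available: " + serverVersion.trim());
                    } else {
                        System.out.println("Up to date.");
                    }
                }
            }
        }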

  • Best practice for the Update of SAP GRC CC Rule Set

    Hi GRC experts,
    We have, in a CC production system, an SoD matrix that we would like to modify extensively, basically by activating many permissions.
    What is the best practice for accomplishing our goal?
    Many thanks in advance. Best regards,
      Imanol

    Hi Simon and Amir
    My name is Connie and I work in Accenture's GRC practice (I am a colleague of Imanol's). I have been reading this thread and I would like to ask you a question related to this topic. We have a case with a Global Rule Set ("Logic System"), and we may also need to create a Specific Rule Set. Is there a document (from SAP or from best practices) that indicates the potential impact (regarding risk analysis, system performance, process execution time, etc.) of implementing both types of rule set in a production environment? Are there any special considerations to be aware of? Have you ever implemented this type of scenario?
    I would really appreciate your help, and if you could point me to specific documentation that would be of great assistance. Thanks in advance and best regards,
    Connie

  • Best Practice for Software Update Structure?

    Is there a best practice guide for Software Update Structure? I would like to keep this neat and organized, and I would also like to have a test folder for updates with a test group. Thanks.

    Hi,
    Meanwhile, please refer to the following blog for more inspiration.
    Managing Software Updates in Configuration Manager 2012
    http://blogs.technet.com/b/server-cloud/archive/2012/02/20/managing-software-updates-in-configuration-manager-2012.aspx

  • Best Practice for Expired updates cleanup in SCCM 2012 SP1 R2

    Hello,
    I am looking for assistance in finding a best-practice method for dealing with expired updates in SCCM 2012 SP1 R2. I have read this blog post: http://blogs.technet.com/b/configmgrteam/archive/2012/04/12/software-update-content-cleanup-in-system-center-2012-configuration-manager.aspx
    I have been led to believe there may be a better method, or a more up-to-date best-practice process, for dealing with expired updates.
    On one hand I was hoping to keep software update groups intact, to have a history of what was deployed, but I also want to keep things clean and avoid issues down the road, as I had to in 2007 with expired updates.
    Any assistance would be greatly appreciated!
    Thanks,
    Sean

    The best idea is still to remove expired updates from software update groups. The process described in that post is still how it works. That also means that if you don't remove the expired updates from your software update groups, the expired updates will still show...
    To automatically remove the expired updates from a software update group, have a look at this script:
    http://www.scconfigmgr.com/2014/11/18/remove-expired-and-superseded-updates-from-a-software-update-group-with-powershell/
    My Blog: http://www.petervanderwoude.nl/
    Follow me on twitter: pvanderwoude

  • Best practice for windows updates

    Hello,
    I'm new with zpm (10.3.2) and have been searching for some kind of best practice for installing Windows updates with it. If somebody has advice or could point me in the right direction, it would be greatly appreciated.
    thx
    -jarkko-

    leppja,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://forums.novell.com/

  • Best practices for additional storage devices in a clustered container

    I am wondering what is considered best practice for adding more storage devices. I have a Solaris Cluster 3.2 setup: a failover resource group with an sczbt resource and a HASP resource for the zone root path. Now I have to add more storage devices (for Oracle) to the zone. The additional storage devices are in the same metaset as the zone root. I see the following options:
    -> Add it to the zone (simple, not managed by HASP)
    -> Add it to the existing HASP resource for the RG and lofs-mount it in the zone (managed by HASP, but needs a lofs mount)
    Are there other options, and what is considered best practice?
    Fritz

    Hi Fritz,
    no matter which option you take, it should result in lofs mounts.
    Personally, I would not mix the concepts: you started with HASP, so I would continue with HASP.
    To achieve this you have to take three steps:
    1. add the file systems to the HASP resource,
    2. create the mount points in the local zone,
    3. add the file systems to the Mounts variable in the sczbt's parameter file.
    If you do not want to reboot the zones, you can mount the loopback file systems manually, but the safest option (so you don't mess up with a typo in the mount points and only realize it later) would be to disable and re-enable the sczbt resource.
    Cheers
    Detlef
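
    For what it's worth, here is a rough sketch of what Detlef's three steps can look like on the command line. All resource names, paths, and the zone name are placeholders, and this assumes the sczbt component shipped with the HA for Containers agent; check the actual property and parameter-file names in your installation:

        # 1. Add the new file system to the existing HAStoragePlus resource
        clresource set -p FilesystemMountPoints+=/oradata hasp-rs

        # 2. Create the mount point inside the local zone
        zlogin myzone mkdir -p /oradata

        # 3. Add the file system to the Mounts variable in the sczbt parameter
        #    file (e.g. Mounts="/oradata"), then bounce the resource so the
        #    lofs mount is set up cleanly
        clresource disable sczbt-rs
        clresource enable sczbt-rs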

  • Best Practices for Batch Updates, Inserts and Complex Queries

    The approach we have taken for our ALDSP architecture is to model our data services as Business Data Objects, each DS joining several (sometimes many) tables and lookups. This works OK when we need to access individual records for read and update, but when we need to update multiple tables and rows within the same commit, trying to do this with a single logical DS built on tables or other data services proves both cumbersome and slow. This is also the case for queries, when we have complex where clauses within a DS built upon two or more multi-table-joined logical data services.
    We tried a DS built on SQL, but that does not allow DML operations. We may have to just use JDBC. Any thoughts on how best to leverage DAS in this respect?

    I tried doing this by creating a UO class and using it on a DS built on a SQL statement. What we wanted to do here is first read the DS to get a list of ID values that met the conditions of the query, then call submit() and have the UO update all the necessary tables associated with those IDs.
    However, we found that the UO never gets called unless you actually update something, not just send submit() after a read. Did I misunderstand the way this should work?

  • Java - best practice for getting values from complex dialogs

    Hi,
    Java is a language I use quite a bit, but much of my work hasn't required GUIs. Now I'm developing an app which will contain a number of custom-made dialogs (which will obviously extend JDialog). Take a typical Options dialog, which stores many parameters for a given application.
    The simple dialogs that ship with Java let you 'get' a given value, because they assume the dialog obtains only a single piece of data; think JFileChooser.
    So, what is the best way to cope with a dialog that could potentially hold hundreds of parameters? It's surely unfeasible to implement getters for each field. I was thinking of either creating a HashMap to hold key-value pairs and 'get'ting that from the dialog, or encapsulating things a little more by creating a new class that manages these values, ProgramOptions or suchlike, which can be passed around between the dialog and the main app.
    I hope this makes sense
    TIA

    sudman1 wrote:
    I doubt it's the most efficient method, but the way I've dealt with things like this in the past is to set up the fields in either an Array or ArrayList, then use a for loop that gets the data and drops it into an Array/ArrayList which is then returned.
    import java.util.ArrayList;
    import javax.swing.JDialog;
    import javax.swing.JTextField;

    class MyDialog extends JDialog {
        // Global variable holding the dialog's text fields
        private ArrayList dialogFieldsArray = new ArrayList();
        private int numRequiredFields = 5; // however many fields the dialog needs

        private void initGUI() {
            for (int i = 0; i < numRequiredFields; i++) {
                dialogFieldsArray.add(new JTextField(10));
            }
        }

        private ArrayList getDialogData() {
            ArrayList rval = new ArrayList();
            for (int i = 0; i < dialogFieldsArray.size(); i++) {
                String tempString = ((JTextField) dialogFieldsArray.get(i)).getText();
                rval.add(tempString);
            }
            return rval;
        }
    }
    The only thing to remember with the above method is the casting involved, but you could implement your own subclasses of ArrayList that deal only with JTextFields and Strings separately to get around that.
    I personally wouldn't go for this exact style; instead I'd opt for a Map collection. This is because you need to know which field is at which exact index. Imagine you make a form that has the fields forename, surname, and date of birth, so you know that forename is .get(0) of your ArrayList, etc.
    But then you decide to put a new field, salutation (Mr, Miss, etc.), at the beginning. This shifts all your old fields along by one, and so you need to edit that code. Sure, no biggie with a handful of fields, but large dialogs could cause headaches.
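
    A minimal sketch of that Map-based approach (the field names, and the ProgramOptions idea from the original question, are purely illustrative):

        import java.util.LinkedHashMap;
        import java.util.Map;
        import javax.swing.JDialog;
        import javax.swing.JTextField;

        class OptionsDialog extends JDialog {
            // Key the fields by name instead of by position, so adding a
            // field later does not shift the existing ones.
            private final Map<String, JTextField> fields = new LinkedHashMap<String, JTextField>();

            private void initGUI() {
                fields.put("salutation", new JTextField(10));
                fields.put("forename", new JTextField(10));
                fields.put("surname", new JTextField(10));
                // ... add each field to the dialog's layout as usual ...
            }

            // Snapshot of all current values, keyed by field name; this map
            // (or a small wrapper class such as ProgramOptions) is what gets
            // passed back to the main application.
            public Map<String, String> getDialogData() {
                Map<String, String> values = new LinkedHashMap<String, String>();
                for (Map.Entry<String, JTextField> e : fields.entrySet()) {
                    values.put(e.getKey(), e.getValue().getText());
                }
                return values;
            }
        }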

  • Best Practice for Developer update access to database in Production

    I am curious to find out what other organizations are doing about developer access to SYSADM in production. For example, using a database account created like SYSADM that can be checked out for use and locked when not in use? Or something else?

    Developers can be provided with read-only access to the SYSADM schema.
    Thanks
    Soundappan
    Edited by: Soundappan on Dec 26, 2011 12:00 PM

  • Best Practice for SAP BW 3.5 Installation (post install) procedure

    Hi,
    I have installed BW 3.5 on the Windows platform; the database is Oracle 9.2. I have added BI_Content as an add-on and followed these steps:
    Install oracle
    Install Oracle patch
    Install BW ABAP WAS
    Activate TMS
    Apply the new SPAM and support packages, and upgrade PI_Basis from 2004 to 2005.
    Install BI_Content 353
    Apply BI_content support packages
    Perform client copy to client 100 from client 000 using the SAP_ALL profile
    I performed these steps without any error or discrepancy.
    But now some people have raised an issue, saying that we need to follow a different sequence, namely the one below:
    Install oracle
    Install Oracle patch
    Install BW ABAP WAS
    Install BI_Content 353
    Perform client copy to client 100 from client 000 using the SAP_ALL profile
    Activate TMS and place in the proper domain
    Apply the new SPAM and support packages, and upgrade PI_Basis from 2004 to 2005.
    Kindly help me out, as this is of high priority now.
    Thanks in advance.
    Regards,
    chaitra

    Hello Ramesh,
    you should have a look at the SAP installation guide, and if it tells you to do as your colleagues say, open a ticket with SAP and ask whether doing it the other way could have spoiled your system. Anything but an official "yes, it's OK" won't help you.
    For the other guys: have a look at http://service.sap.com/instguides; there you can find all the guides you need.
    kind regards,
    Carl

  • Best Practices for SRM Installation !!

    Hi
    Can someone share the best practices for an SRM installation?
    What is the typical timeframe to install SRM on a development server, as well as on the production server?
    Appreciate the responses.
    Thanks,
    Arvind

    Hi
    I don't know whether this will help you.
    See these links as well.
    http://help.sap.com/bp_epv170/EP_US/HTML/Portals_intro.htm
    http://help.sap.com/bp_scmv150/index.htm
    http://help.sap.com/bp_biv170/index.htm
    http://help.sap.com/bp_crmv250/CRM_DE/index.htm
    Hope this will help.
    Please reward suitable points.
    Regards
    - Atul

  • Best Practice for Updating children UIComponents in a Container?

    What is the best practice for updating children UIComponents in response to a Container being changed? For instance, when a Canvas is resized, I would like to update the height and width of all the children UIComponents so the content scales properly.
    Right now I am trying to loop over the children, calling invalidateProperties(), invalidateSize(), and invalidateDisplayList() on each. I know some of the containers, such as VBox and HBox, have layout managers; is there a way to leverage something like that?
    Thanks.

    You would only do that if it makes your job easier; generally speaking, it would not.
    When trying to sync sound and animation, I think most authors find it easiest to use graphic symbols, because you can see their animation when scrubbing the main timeline. With movie clips you only see their animation when testing.
    However, if you're going to use ActionScript to control some of your symbols, those symbols should be movie clips.

  • Best Practice for SUP and WSUS Installation on Same Server

    Hi Folks,
    I have a question. I am in the process of deploying SCCM 2012 R2, and I was about to deploy the Software Update Point for SCCM using an existing WSUS server installed on a separate server from SCCM.
    A debate has started with a colleague who says that using a remote WSUS server is recommended by Microsoft for scalability and security: WSUS downloads the updates from Microsoft, and SCCM works as a downstream server fetching updates from the WSUS server.
    But in my view it is recommended to install WSUS on the same server where SCCM is installed; actually, it is recommended to install WSUS on a site system, and you can use the same SCCM server to deploy WSUS.
    Please advise me on the best practices for deploying SCCM and WSUS. What does Microsoft say: should WSUS be installed on the same server as SCCM, or on a separate server?
    Awaiting your advice :)
    Regards, Owais

    Hi Don,
    thanks for the information, another quick one...
    is the above-mentioned configuration I did correct in terms of planning and best practices?
    I agree with Jorgen: it's OK to have WSUS/SUP on the same server as your site server, or you can have WSUS/SUP on a dedicated server if you wish.
    The "best practice" is whatever suits your environment and is a supported-by-MS way of doing it.
    One thing to note is that if WSUS ever becomes "corrupt", it can be difficult to repair, and sometimes it's simplest to rebuild the WSUS Windows OS. If this is on your site server, that's a big deal. Sometimes WSUS goes wrong (not because of ConfigMgr).
    Note that if you have a very large estate, or multiple primary site servers, you might have a CAS, and you would need a SUP on the CAS. (This is not a recommendation for a CAS, just something to be aware of.)
    Don

  • SQL 2008 R2 Best Practices for Updating Statistics for a 1.5 TB VLDB

    We currently have a ~1.5 TB VLDB (SQL 2008 R2) that services both OLTP and DSS workloads pretty much on a 24x7x365 basis. For many years we have been updating statistics (full scan, 100% sample size) for this VLDB once a week on the weekend, which currently takes up to 30 hours to complete.
    Somewhat recently we have been experiencing intermittent issues while statistics are being updated, which I doubt is just a coincidence. I'd like to understand exactly why the process of updating statistics can cause these issues (timeouts/errors). My theory is that the optimizer is forced to choose an inferior execution plan while the needed statistics are in "limbo" (stuck between the "old" and the "new"), but that is again just a theory. I'm somewhat surprised that the "old" statistics couldn't continue to be used while the new/current statistics are being generated (like the process for rebuilding indexes online), but I don't know all the facts behind this mechanism yet, so that may not even apply here.
    I understand that we have the option of reducing the sample percentage/size for updating statistics, which is currently set at 100% (full scan). Reducing the sample percentage/size will reduce the total processing time, but it's also my understanding that doing so will leave the optimizer with less than optimal statistics for choosing the best execution plans. This seems to be a classic case of not being able to have one's cake and eat it too.
    So in a nutshell, I'm looking to fully understand why the process of updating statistics can cause access issues, and I'm also looking for best practices in general for updating statistics on such a VLDB. Thanks in advance.
    Bill Thacker

    I'm with you. Yikes is exactly right with regard to suspending all index optimizations for so long. I'll probably start a separate forum thread about that in the near future, but for now let's stick to the best practices for updating statistics.
    I'm a little disappointed that multiple people haven't already chimed in about this and offered up some viable solutions. Like I said previously, I can't be the first person in need of such a thing. This database has 552 tables, with a whole lot more statistics objects than that associated with those tables. The metadata has to be there for determining which statistics objects can go (not utilized much, if at all, so delete them; also produce an actual script to delete the useless ones identified) and what the proper sample percentage/size should be for updating the remaining, utilized statistics (again, also produce a script that executes the appropriate UPDATE STATISTICS commands for each table based on cardinality).
    The above solution would be far better, IMO, than just issuing a single update statistics command that samples the same percentage/size for every table (e.g. 10%). That's what we're doing today at 100% (full scan).
    Come on, SQL Server Community. Show me some love :)
