LCM best practice: run it on its own BOE installation?

Is it a best practice to install/run the Life Cycle Manager on its own BOE installation? The LCM installation documents seem to suggest that the LCM can run on an existing BOE installation. I assume that means it can run on a BOE installation that also provides Webi/Deski/PM/etc. services that users access.
I am just curious whether it is better to have a separate host that only runs an instance of BOE and the LCM but does not host any other BOE reporting/dashboards.
Also, is there an "LCM Best Practices" document floating around anywhere?
Thanks,
George

Thanks for the reply!
Do you know of any specifics as to why it should run on its own machine? I heard this was suggested by someone at a BO user conference but had not heard any details as to why. Does the LCM use a lot of resources when promoting, or something?
Thanks again.

Similar Messages

  • Best practice to run Microsoft Endpoint Protection client in VDI environment

    We are using a Citrix XenDesktop VDI environment. The Symantec Endpoint Protection client (VDI performance optimised) has been installed on the virtual machine image that is streamed to the clients. Basically, all the files in the golden image have been “tattooed” with a Symantec signature. Now, when a new VM starts, the Symantec scan engine simply ignores the “tattooed” files and also randomises scan times. This is a rough explanation, but I hope you’ve got the idea.
    We are switching from Symantec to Microsoft Endpoint Protection, and I’m looking for any information and documentation regarding best practices for running the Microsoft Endpoint Protection client in a VDI environment.
     Thanks in advance.

    I see this post is a bit old, but the organization I'm with has a very large VDI deployment using VMware. We are also using SCEP 2012 for the AV.
    Did you find out what you were looking for, or did you elect to take a different direction?
    We install SCEP 2012 into the base image and manage the settings using GPO; definition updates come through the normal route.
    Our biggest challenge is getting alert messages from the clients.
    Thanks

  • Best practice for running pcastconfig --sync_library

    Every so often pcastconfig --sync_library fails with a Ruby "method not found" error (uid). If I run it again it might fail the same way in a different place, or it might run to completion with no errors. I've taken to turning off Time Machine before running pcastconfig --sync_library, and sometimes I turn off Podcast Producer and Xgrid as well, just to 'feel' safer.
    Does anyone know what the best practice is? --sync_library isn't in the man page for pcastconfig, and the docs don't mention anything about turning anything off before running it.
    Another error I see sometimes is "database locked".
    Any ideas or tips appreciated.


  • Best practice for running multiple sites on 1 CF install?

    Hi-
    I'm setting up a new hosting environment (Windows Server 2008 Standard 64 bit VPS  configuration, MySQL, IIS 7, CF 9)
    Has anyone seen any docs or can anyone suggest best practices for configuring multiple sites in this environment? At this point I'm thinking simple is best, one new site in IIS for each client (domain) and point it to CF.
    Given this environment, is anyone aware of any gotchas within the setup of CF 9 on IIS 7?
    Thank you in advance,
    Rich

    There's nothing wrong with that approach. You can run as many IIS sites as you like against a single CF install.
    As for installing CF on IIS 7, I recommend that you do the following: install CF 9 without connecting it to IIS, then install the 9.0.1 upgrade and any hotfixes, then connect CF to IIS using the web server configuration utility. This will keep you from having to install the IIS 6 compatibility layer that's needed with CF 9 but not with CF 9.0.1.
    Dave Watts, CTO, Fig Leaf Software
    http://www.figleaf.com/
    http://training.figleaf.com/

  • After Vcenter best practice documentation - running as a VM

    Hi there,
    We are currently running our vCenter as a physical machine. A few weeks back I saw something on Twitter saying VMware had changed their best practice to recommend running vCenter as a virtual machine. We are looking into this as a way of running one less physical server.
    Can anyone point me in the direction of any (revised) good practice documentation from vmware?

    Here are a couple of things as well:
    http://www.vmware.com/pdf/vsphere4/r40_u1/vsp_40_u1_esx_vc_installation_guide.pdf
    Although a little dated, it still applies:
    http://www.vmware.com/pdf/vi3_vc_in_vm.pdf
    I would also consider setting your restart priority to HIGH for your vCenter VM. If you run your vCenter DB instance on a VM, I would also consider setting up a DRS rule to keep them together for better performance, as well as setting a HIGH restart priority for your VCDB.

  • Best Practice: continuously running db procedure

    I've written a database procedure that pulls messages off an AQ, does some processing on them, and stores the result in a table. I'd like the procedure to run continuously. I also call the same procedure with different parameters, which determine which messages get pulled off. My questions are these:
    1. What is the best practice for keeping this procedure running continuously? If the client-side connection is eventually terminated, will the process keep running? Do I set a timeout somewhere for no timeout?
    2. How do I determine which procedure instances are running? I'm thinking I may need to create different schemas that have execute privilege for the different instances so that I can at least tell which process is which. Is there a better way to tell which is which if I need to kill one?
    thanks,
    dan

    > 1. What is the best practice for keeping this procedure running continuously? If the client-side connection is eventually terminated, will the process keep running? Set timeout somewhere for no timeout?
    DBMS_JOB or DBMS_SCHEDULER processes are ideal as these have no client part.
    As for a client.. when it dies, it usually takes its server process with it. As soon as Oracle notices that the client is gone (usually when it attempts to send data to it), it will terminate the dedicated server process that serviced the client, or it will clean up the virtual circuit of the shared server session that serviced that client.
    > 2. How do I determine which procedure instances are running? I'm thinking I may need to create different schemas that have execute privilege for the different instances so that I can at least tell which process is which. Is there a better way to tell which is which if I need to kill one?
    With DBMS_JOB/DBMS_SCHEDULER it is easy: you check the running-jobs views (DBA_JOBS_RUNNING and DBA_SCHEDULER_RUNNING_JOBS). Details on these are in the Oracle® Database Reference guide: http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/toc.htm
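
    For illustration, a minimal PL/SQL sketch of the DBMS_SCHEDULER approach described above (the schema, job, and procedure names are made up for the example):

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'DEQUEUE_TYPE_A_JOB',   -- hypothetical job name
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN my_schema.process_aq_messages(p_msg_type => ''A''); END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=MINUTELY',        -- re-launches the job if it ever stops
        enabled         => TRUE,
        comments        => 'Continuously dequeues type A messages');
    END;
    /

    -- One row per running job instance:
    SELECT job_name, session_id, elapsed_time
    FROM   dba_scheduler_running_jobs;

    -- Stop a specific instance if needed:
    -- EXEC DBMS_SCHEDULER.STOP_JOB('DEQUEUE_TYPE_A_JOB');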

  • Best practice to run BOBJ server

    Is it a best practice to install the BOBJ (BOE) server in the NetWeaver stack?
    It may or may not use BI.
    It may or may not use the NetWeaver Portal.
    Having said that, I would appreciate the best solution for running the BOBJ server.
    Thanks-gopal


  • What is the best practice for running a long report/query against an active database?

    We are using SQL Server 2012 EE but currently do not have the option to run queries on an R/O mirror, though that is my long-term goal. I am concerned I may still run into the issue below in that scenario as well, since the mirror would also be updating the data I am querying.
    I have a view that joins across several tables from two databases and is used by an invoicing program on existing data. Three of these tables are also actively updated by ongoing transactions. Running a report that used this view did not use to be a problem, but now our database is getting larger and we have run into some timeout problems for the live transactions coming in.
    First the report query was timing out, so I set the command timeout to 0 and reran the query, which pegged all 4 CPUs at 100% for 90 minutes, and so I finally killed it. Strangely there were no problems with active transactions during that time, so I'm wondering if the query was really doing anything useful or was somehow spinning and waiting. I reviewed the view, found a field I was joining on that was not indexed, and created an index on that field; I reran the report, which then finished in three minutes, and all the CPUs were busy but not at all pegged out. The same data was queried both times. I figured the problem was solved. Of course later, my boss ran a similar invoice report, with the same amount of data, and our live transactions started timing out 100% while his query was running. I did not get a chance to see the CPU usage during that time.
    I looked at the execution plan of the underlying view and added the suggested index, but that did not help. When I run just the view in SQL Server it does not seem to cause any problems and finishes in a couple of seconds. Perhaps something else is going on in the reporting tool using the view.
    My main question is: given that I have to use the live and active database, what is the proper way to run a long R/O query/report so that active transactions can still continue to update the tables that I am querying? sp_who2 did show transactions being blocked, so I guess a long query accessing the tables blocks live transactions accessing those same tables, but certainly I'm not the only one doing this. I am considering adding "with (nolock)" but am hoping there is a better standard practice, as that clause can return dirty data and I understand why.
    Thanks, Dave

    Hello,
    You can change the DB isolation level to READ UNCOMMITTED:
    http://technet.microsoft.com/en-us/library/ms378149(v=sql.110).aspx
    or use WITH (NOLOCK).
    I use the NOLOCK option for dirty reads to avoid taking locks on the tables.
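
    For illustration, a minimal T-SQL sketch of the two options above (the view name, columns, and filter are made up for the example):

    -- Option 1: per-query hint; dirty reads are possible
    SELECT InvoiceNumber, Amount
    FROM   dbo.InvoiceView WITH (NOLOCK)   -- hypothetical reporting view
    WHERE  InvoiceDate >= '20140101';

    -- Option 2: session-level isolation change; same effect for every
    -- statement the reporting connection runs afterwards
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
    SELECT InvoiceNumber, Amount
    FROM   dbo.InvoiceView
    WHERE  InvoiceDate >= '20140101';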
    Javier Villegas |
    @javier_vill | http://sql-javier-villegas.blogspot.com/
    Please click "Propose As Answer" if a post solves your problem or "Vote As Helpful" if a post has been useful to you

  • Best Practice: Application runs on Extend Node or Cluster Node

    Hello,
    I am working within an organization where the standard way of using Coherence is for all applications to run on extend nodes, which connect to the cluster via a proxy service. This practice is followed even if the application is a single, dedicated JVM process (perhaps a server, perhaps a data aggregator) which could easily be co-located with the cluster (i.e. on a machine which is on the same network segment as the cluster). The primary motivation behind this practice is to protect the cluster from a poorly designed or implemented application.
    I want to challenge this standard procedure. If performance is a critical characteristic then the "proxy hop" can be eliminated by having the application code execute on a cluster node.
    Question: Is running an application on a cluster node a bad idea or a good idea?

    Hello,
    It is common to have application servers join as cluster members as well as Coherence*Extend clients. It is true that there is a bit of extra overhead when using Coherence*Extend because of the proxy server. I don't think there's a hard and fast rule that determines which is a better option. Has the performance of said application been measured using Coherence*Extend, and has it been determined that the performance (throughput, latency) is unacceptable?
    Thanks,
    Patrick

  • Best Practice for Running Number Table

    Dear All
    Thank you for your attention.
    I would like to generate a number for each order, e.g.
    AAAA150001
    where AAAA is the prefix, 15 is the year, and 0001 is the sequence number.
    I proposed the table as below
    Prefix    | Year     | Number
    AAAA    | 15        | 1
    I use the SQL query below to get the latest number:
    SELECT CurrentNumber = Prefix + Year + RIGHT ('0000'+ CAST (Number+1 AS VARCHAR(4)), 4)
    FROM RunningNumber WHERE Prefix = 'AAAA'
    After the save process completes, I update the running number table:
    UPDATE RunningNumber SET Number = (Number +1) WHERE Prefix = 'AAAA' AND Year = '15'
    Is that a normal approach, and is it good for handling concurrent saving?
    Thanks.
    Best Regards
    mintssoul

    Dear Visakh16
    Each year the number will reset; the table will be as below:
    Prefix    | Year     | Number
    AAAA    | 15        | 8749
    AAAA    | 16        | 1
    I could only use option 1 from your reference.
    To use this approach, I must make sure that:
    a) the number is not duplicated or skipped, as multiple users are using the system concurrently;
    b) the number is not incremented when an error occurs after getting the new number.
    Would using the following methods achieve a) and b)? (See the sketch below.)
    1) .NET SqlTransaction.Rollback
    2) SQL ROLLBACK TRANSACTION
    To avoid repeating information, the details of 1) and 2) are not listed here; please refer to my previous reply to Uri.
    Thanks.
    Best Regards, mintssoul
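
    For illustration, a minimal T-SQL sketch of one way to get a) and b): do the increment as a single atomic UPDATE inside the same transaction as the order save, so a failure rolls the number back as well. The Prefix/Year/Number columns come from this thread; everything else is assumed.

    BEGIN TRANSACTION;

    -- Atomically increment and read back the new value in one statement;
    -- the UPDATE's exclusive row lock serialises concurrent callers.
    DECLARE @NewNumber TABLE (Number INT);

    UPDATE dbo.RunningNumber
    SET    Number = Number + 1
    OUTPUT inserted.Number INTO @NewNumber (Number)
    WHERE  Prefix = 'AAAA' AND [Year] = '15';

    DECLARE @OrderNo VARCHAR(10);
    SELECT @OrderNo = 'AAAA' + '15'
                    + RIGHT('0000' + CAST(Number AS VARCHAR(4)), 4)
    FROM   @NewNumber;

    -- ... save the order here using @OrderNo ...

    COMMIT TRANSACTION;   -- on any error, ROLLBACK undoes the increment too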

  • Best practice for running multiple instances?

    Hi!
    I have an MMORPG interface that currently uses shared objects to store several pages of information per account. A lot of users have several accounts (some as many as 50 or more) and may access one or several different game servers. I am building a manager application in AIR to manage them and plan on putting all the information in several SQL DBs.
    The original authors obviously had no idea what the future held. Currently players have a separate folder for each account, with a copy of the same SWF application in EACH folder. So if a player has 20 accounts, he manually opens 20 instances of the same SWF (or projector EXE, based on personal preference). I have spent the last year or so tinkering with the interface, adding functionality, streamlining, etc., and have gathered a large following of supporters.
    Each account is currently a complete, isolated copy of a given interface (there are several different ones out there; it could shape up to be quite a battle). In order to remedy this undesirable situation, I have replaced the login screen with a controller. The question now is how to handle instantiating each account. The original application simply replaced the login screen with the main application screen in the top application container at login.
    My main (first) question is: if I replace the login screen with a controller, is it more economical to have the controller open a window for each account and load an instance of the required classes, or to compile the main application and load instances of the SWF?
    Each account can have up to 10 instances of about 30 different ActionScript classes that get switched in and out of the main display. I need to be able to both send and receive events between each instance and the main controller.
    I tentatively plan on using AIR to open windows, and to simply change the storage system from shared objects to storing the same objects in an SQL table.
    Or should that be 1 row per account? I am not all that worried about the player DB, since it is basically file storage, but the shared DB will be in constant use, possibly from several accounts (map and player data is updated constantly). I am not sure yet how I plan to handle updating that one.
    I am at the point now where all the basic groundwork is laid, and the controller (though still rough around the edges) stands ready to open some accounts. I had the first account up and running a couple of days ago, but ran into trouble when the next one tried to access what used to be static information. The next step is to build some databases, and I need to get it right the first time. Once I release the app and it writes a DB to the user's machine, I do not want to have to change it.
    I am an avid listener and an eager student. (I have posted here before under the name eboda_kcuf but was notified a few weeks ago that it was not acceptable in the forums.) I got some great help from Alex and a few others, so I am sure you can help me out here.
    After all, you guys are the pros! Just point me in the right direction.
    Oh, almost forgot: I use Flash Builder 4.5, SDK 4.5.1.
    Message was edited by: the0bot  typo.


  • OSB best practices to run business service on two different environments

    Hi.
    I am using Service Bus 11gR1
    Oracle Service Bus Version: [Oracle Service Bus L10N Dependencies 11.1 Fri Dec 4 17:43:22 EST 2009 ]
    Oracle Weblogic Server Version: [WebLogic Server 10.3.5.0 Fri Apr 1 20:20:06 PDT 2011 1398638 ]
    I deploy my OSB services on two different environments (development, production).
    How do I set up a business service to run on two different environments without changing the source (the business service transport Endpoint URI)?
    Thanks in advance.

    I am not sure of any tutorial.
    For your case, if you just have one URI and you want to change the URI for the business service, you can simply use the OSB customization file. This is straightforward.
    If you have complex routing logic based on input fields, you can follow the steps below (a sketch of the lookup table follows after this reply):
    Create a simple table with Business Service Name, Env, and URI as columns.
    Create a select DBAdapter to return the URI.
    Create a business service out of the DBAdapter files.
    Use the business service to fetch the URI, and finally
    use the URI override (ref - http://www.oracle.com/technetwork/middleware/service-bus/learnmore/index.html).
    Edited by: Prabu on Feb 21, 2012 8:10 PM
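
    For illustration, a minimal SQL sketch of the lookup table described in the steps above (the table, column, service names, and URIs are made up for the example; the DBAdapter would issue the SELECT):

    CREATE TABLE service_endpoints (
      service_name  VARCHAR2(100),
      env           VARCHAR2(20),
      endpoint_uri  VARCHAR2(500),
      CONSTRAINT pk_service_endpoints PRIMARY KEY (service_name, env)
    );

    INSERT INTO service_endpoints VALUES ('CustomerOrderService', 'DEV',  'http://dev-host:7001/CustomerOrderService');
    INSERT INTO service_endpoints VALUES ('CustomerOrderService', 'PROD', 'http://prod-host:8011/CustomerOrderService');

    -- Query the business service (via the DBAdapter) runs to fetch the URI
    -- that is then applied with the URI override:
    SELECT endpoint_uri
    FROM   service_endpoints
    WHERE  service_name = :service_name
    AND    env          = :env;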

  • Symantec antivirus Best practice for oracle database on windows server 2003

    Hi all,
    I have an Oracle 10.2.0.4 database server on the Windows Server 2003 platform. What would be the best practice for running Symantec antivirus on that server, and which database files should be excluded from scanning?
    My server has rebooted unexpectedly many times; in the event log I see event ID 6008. What may be the cause of it?

    Normally, you don't run a virus scanner on a database server because your database server isn't vulnerable to viruses. It's behind firewalls, people aren't reading mail on it, people aren't plugging thumb drives into it, etc. If you do decide that you need to run a virus scanner on a database server, at least exclude the Oracle data files from the scan. Oracle gets very unhappy if someone else tries to open its data files (or, worse, if someone opens a data file before it gets the chance to acquire exclusive access).
    Justin

  • Best practice for business rules

    Our business rules have
    Fix ( [Cost Center] )
    to extract the user's Cost Center from his form so that it runs faster.
    What is the best practice for running that same business rule but for all Cost Centers? Would it be to put that business rule in a menu somewhere and let it prompt users to manually type "Cost Center" so that the business rule processes all cost centers?
    Thanks.
    David

    You can try this way: create your primary business rule with FIX(@RELATIVE(VarCostCenter,0)), where VarCostCenter is a runtime prompt. Then you can easily use it to calculate only the current member on the form (the FIX will give you only one member).
    Then you create a new sequence and add your business rule to it, go to the "Launch Variables" tab, find the prompt for Cost Center, set it to "Total Cost Centers" and click hide. So basically you now have a copy of the primary rule, but it runs for all cost centers automatically.
    Using this technique you will have to maintain only one business rule!

  • Eclipse / Workshop dev/production best practice environment question.

    I'm trying to set up an ODSI development and production environment. After a bit of trial and error and support from the group here (OK, Mike, thanks again) I've been able to connect to Web Service and relational database sources and such. My Windows 2003 server has 2 GB of RAM. With the Admin domain, Managed Server, and Eclipse running I'm in the 2.4 GB range. I'd love to move the Eclipse bit off of the server, develop dataspaces there, and publish them to the remote server. When I add the Remote Server in Eclipse and try to add a new data service I get a "Dataspace projects cannot be deployed to a remote domain" error message.
    So, is the best practice to run everything locally (admin server, Eclipse/Workshop), get everything working, and then configure the same JDBC (or whatever) connections on the production server and deploy the locally created dataspace to the production box using the Eclipse that's installed on the server? I've read some posts/articles about a scripting capability that can perhaps do the configuration and deployment, but I'm really in the baby-steps mode and probably need the UI for now.
    Thanks in advance for the advice.

    you'll want 4GB.
    - mike
