RE : Multiple environments on NT

Hello Veronika,
I have just tested a multiple-environment configuration (two environments) on NT4.
Within a single NT user session, it works very well.
I also run multiple versions (R2 and R3) on the same PC.
I use the following commands:
"C:\forteR3\install\bin\nodemgr.exe -e TstEnv1 -fnd Node1 -fns
HOSTNAME:5000" for the first environment, and
"C:\forteR3\install\bin\nodemgr.exe -e TstEnv2 -fnd Node2 -fns
HOSTNAME:5010" for the second one.
No re-installation of Forte is needed; a single FORTE_ROOT is enough.
You should be able to do the same with NT services using the srvcinst.exe
utility in FORTE_ROOT/install/bin.
Hope this helps.
Daniel Nguyen.
Freelance Forte Consultant.

Dan,
The problem you are describing with the system agents and partitions not
being visible sounds like a problem with how you are starting the
partitions. In configurations such as these, you need to specify which
node manager the partition is being started under. This is done
automatically when you start an application from the environment, but from
the command line, you should use something like:
ftexec -fi ... -fnd node_name -ftsvr 0 .....
Don
At 09:46 AM 1/13/98 +0100, Daniel Nguyen wrote:
Hello Ray,
You are right: I didn't explain enough.
The aim is to define a different context for each environment.
One solution is to create (just as on Unix) a command file (.bat) that
defines the Forte environment variables, and to use it in other command
files, for instance to start an environment. This is very useful for
testing, but I think it is difficult to manage. The other solution is to
define a specific NT account for each context, redefining the
FortesoftwareInc registry entries (especially FORTE_NS_ADDRESS) for each
account.
These are two ways to obtain the same result, but by using the registry
I only use one system referential, and I don't have to redefine each
Forte command that has to become a service.
For the second part, I start my partitions using the ftexec -fi command
(with the -ftsvr option for server partitions).
It seems that the Forte nodemgr reuses the environment variables from
the process context rather than the options you pass, and in that case
the System Agents aren't refreshed correctly. The nodes are visible, but
the processes you start by hand on those nodes aren't. I have the same
problem with the Repository agent on multiple environments on AIX: it
may be that some agents aren't instantiated correctly.
On AIX, I used the standard solution: defining a specific user and a
specific fortedef.sh for each environment.
Happy new year.
Daniel Nguyen.
Freelance Forte Consultant
Ray(mond) Blum wrote:
Daniel Nguyen wrote:
Hello,
After discussing it with Forte and running new tests, beware of
the solution you choose:
if you use the same login for your services, you will have
problems starting your applications. You need to use different
logins (NT accounts) for each nodemgr.
Ummm... I do not think that this is right. We have had (under Forte
R2) several nodemgrs on one machine and (under R3) several launch
servers all started from the same NT (4.0) login. All they needed to
differentiate themselves was different environments.
If you don't, Forte won't start your application properly.
Then if you start your partitions by hand (using DOS),
the application starts well, but the Forte agents don't
see the application partitions (even ftexec processes).
What do you mean by the above? Forte System agents? Our client
machines described above have always been visible in EConsole/EScript.
Hope this helps.
Daniel Nguyen
Freelance Forte Consultant.
--
Raymond Blum
[Yet Another Forte Consultant]
Hurry Star Force!
Planet Earth has just 164 days left!!!
============================================
Don Nelson
Regional Consulting Manager - Rocky Mountain Region
Forte Software, Inc.
Denver, CO
Phone: 303-265-7709
Corporate voice mail: 510-986-3810
aka: [email protected]
============================================
"We tigers prefer to inflict excitement on others." - Hobbes

Similar Messages

  • Having multiple environments open in the same thread.

Dear Sir,
I have a question about the concurrency model inside BDB JE. I read the documents and FAQs and wrote some simple
programs using it.
Here is my question: from my understanding, one can open an environment in multiple processes, assuming that only
one of the environments is opened for writing. I would like to know how I can open one read environment
and one write environment in the same process, since in my application reads and writes can be initiated from
multiple processes (assuming there is always at most one writer). In my tests, when a process receives
a read request it creates an environment for reading, and if during the reading the process receives
a write request, it creates a new environment to handle the write request. This simple thing throws the following exception:
je.env.isReadOnly is set to false in the config parameter which is incompatible with the value of true in the underlying environment
It seems I can't have multiple environments with different configurations open at the same time in the same process. I would like
to know if this is really the case, and whether there is any option I can tweak to fix this.
Thanks,
AliS

Hello,
An immutable property of an Environment cannot be changed
at runtime. The ReadOnly property for an environment is one of
the few immutable environment properties. Hence, the IllegalArgumentException
was thrown when your application tried to open an environment handle as
ReadWrite when a handle had already been opened ReadOnly. All environment
handles opened by a single process must have the same value for immutable
properties.
You can find additional information at:
http://www.oracle.com/technology/documentation/berkeley-db/je/java/index.html
under: EnvironmentConfig
    thanks,
    Sandra
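The constraint Sandra describes can be sketched with JE's com.sleepycat.je API. This is a minimal sketch, not the poster's actual code, and the directory path is a made-up placeholder: open the environment read-write once, then share that single handle for both reads and writes within the process.

```java
import java.io.File;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class SharedEnvHandle {
    public static void main(String[] args) {
        // Open the environment once, read-write; reuse this one handle
        // for both read and write operations instead of opening a
        // second handle with a conflicting ReadOnly setting.
        EnvironmentConfig config = new EnvironmentConfig();
        config.setAllowCreate(true);
        config.setReadOnly(false); // immutable across all handles in this process
        Environment env = new Environment(new File("/tmp/je-env"), config);
        try {
            // ... open databases and perform reads and writes here ...
        } finally {
            env.close();
        }
    }
}
```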

  • How to config a shared cache for multiple environments with C API

How do I configure a shared cache for multiple environments with the C API, just like in the Java edition (Chapter 2, Database Environments)?
I want to open a large number of databases, at least 10,000. But as the number of opened databases increases, the db->open operation becomes very slow; it takes almost 2 hours to open 10,000 databases.
So I am trying to distribute these databases across multiple environments (for example, 5 envs), and to use memory more efficiently I want to share the cache between the envs.

Hi,
It is not clear what you mean by multiple environments. Do you mean these environments are in different directories or in the same directory? If you mean that environments in different dirs share the same cache, it would be interesting to know why you need that.
If you do not use DB_PRIVATE to open the environment, the created cache will be on disk, in the environment directory, so it can be shared by multiple processes and multiple threads. Currently, the cache file is in the environment directory, and we do not support specifying a separate directory for the cache only.
    Regards,
    Oracle Berkeley DB.
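For comparison, in the Java edition the asker mentions, a process-wide cache shared across environments can be enabled via EnvironmentConfig.setSharedCache. A sketch (the directory names are made-up placeholders):

```java
import java.io.File;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class SharedCacheSketch {
    public static void main(String[] args) {
        // With setSharedCache(true), all JE environments opened in this
        // process draw from a single shared cache instead of each
        // allocating its own.
        EnvironmentConfig config = new EnvironmentConfig();
        config.setAllowCreate(true);
        config.setSharedCache(true);
        Environment env1 = new Environment(new File("/tmp/env1"), config);
        Environment env2 = new Environment(new File("/tmp/env2"), config);
        // ... distribute databases across env1 and env2 ...
        env2.close();
        env1.close();
    }
}
```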

  • Help With Multiple Schemas In Multiple Environments

    Dear Oracle Forum:
    We have a bit of controversy around the office and I was hoping we could get some expert input to get us on the right track.
    For the purposes of this discussion, we have two machines, development and production. Currently, on each machine, we have one database with multiple schemas, say, one for sales data and another for inventory. The sales data has maybe 200 tables and the inventory has another 50. About 12 times a year, once a month, we have a release and move code from dev to prod. The database is accessed by several hundred Pro*C and Pro*Cobol programs for online transaction processing.
    The problem comes up when we need to have multiple development environments. If I need to work on something for May that requires the customer address field to be 50 characters and somebody else is working on something for July that requires the customer address field to be 100 characters, we can’t both function in the same schema. We have a method of configuring running programs to attach to a given schema/database. Currently, everything connects to the same place. We were told that we should not have the programs running as the owners of the schemas for some reason so we set up additional users. The SALES schema is accessed with the connect string: SALES_USER/[email protected]. (I don’t know where we got dot world from but that is not the current discussion.)
    One of the guys said that we should have 12 copies of the database running, which is kind of painful to think about in my opinion. Oracle is not a lightweight product and there are any number of ancillary processes that would have to be duplicated 12 times.
    My recommendation is that we have 12 schemas each for sales and inventory with 12 users each to access them. We would have something like JAN_SALES_USER, FEB_SALES_USER, etc. Each user would have synonyms set up for each of the tables it is interested in. When my program connects as MAY_SALES_USER, I could select from the customer table and I would get my 50 character address field. When the other user connects as JUL_SALES_USER, he would get his 100 character address field. Both of us would not know anything different.
    Another idea that came up is to have a logon trigger that would set the current schema for that user to the appropriate base schema. When JUL_SALES_USER logs in, the current schema would be set to JUL_SALES, etc. This would simplify things by allowing us to avoid having something like 2400 synonyms to maintain (which could be automated without too much difficulty) but it would complicate things by requiring a trigger.
    There are probably other ways to go about this we have not considered as yet. Any input you can give will be appreciated.
    Regards,
    /Bob Bryan

Hans Forbrich wrote:
I'd rather see you with 12 schemas than with 12 databases. Unless you have lots of CPUs to spare ... and lots of cash to pay for those extra CPU licenses.
Then again, I'd take it one step further and ask to investigate the base design. There should be little reason to change the schema based on time. Indeed, from what little I know of your app, I'd have to ask whether adding a 'date' column and appropriate views or properly coded SQL statements might simplify things.
Interesting. If we were to have one big Customer table with views for each month, how would we handle the case where the May people have to see a 50-character address field and the July people a 100-character one? I guess we could have MAY_ADDRESS VARCHAR2(50) and JULY_ADDRESS VARCHAR2(100) and take care to make sure that people connecting as May can only see the May columns, etc. Is this simpler than multiple schemas?
I may have overly simplified things in my effort to get something down that would not require too much explanation. The big thing is that multiple people are doing development and they have to be independent of each other. If we were to drop a column for July, the May people would have trouble compiling if we didn't keep things separate. It is not a case of making the data available; the data in development is something we cook up to allow us to test. The other part is that the code we compile now will be released to production at some point. In production, there is only a need for one database.
We are moving from another database product where multiple databases are effectively different sets of files. We have lots of disk space, so multiple databases were no problem. Oracle is such a powerful product; I can't believe there is not some way to set up something similar.

  • Best Practice for managing variables for multiple environments

    I am very new to Java WebDynPro and have a question
    concerning our deployments to Sandbox, Development, QA,
    and Production environments.
    What is the 'best practice' that people use so that if
    you have information specific to each environment you
    don't hard-code it in your Java WebDynPro code.
I could put the value in a properties file, but how do I
make that vary per environment? Otherwise I'd still have to
change the property file and re-deploy for each environment.
I know there are some configurations on the
Portal but am not sure if that will work in my instance.
    For example, I have a URL that varies based on my
    environment.  I don't want to hard-code and re-compile
    for each environment.  I'd prefer to get that
    information on the fly by knowing which environment I'm
    running in and load the appropriate URL.
So far, the only thing I've found that comes close to
telling me where I'm running is a Parameter Map,
but the 'key' in the map is the URL, not the value, and I
suspect there's a cleaner way to get something like that.
    I used Eclipse's autosense in Netweaver to discover some
    of the things available in my web context.
    Here's the code I used to get that map:
    TaskBinder.getCurrentTask().getWebContextAdapter().getRequestParameterMap();
    In the forum is an example that gets the IP address of
    the site you're serving from. It sounds like it is going
    to be or has been deprecated (it worked on my system
    right now) and I would really rather have something like
    the DNS name, not something like an IP that could change.
    Here's that code:
    String remoteHost = TaskBinder.getCurrentTask().getWebContextAdapter().getHttpServletRequest().getRemoteHost();
    Thanks in advance for any clues you can throw my way -
    Greg

Hi Greg:
I suggest that you check the "Software Change Management Guide"; in it you can find an explanation of the best practices for working with a development infrastructure.
This is the link:
http://help.sap.com/saphelp_erp2005/helpdata/en/83/74c4ce0ed93b4abc6144aafaa1130f/frameset.htm
Now, to get the IP of your server or the name of your site, you can do the following:
HttpServletRequest request = ((IWebContextAdapter) WDWebContextAdapter.getWebContextAdapter()).getHttpServletRequest();
String server_name = request.getServerName();
String remote_address = request.getRemoteAddr();
String remote_host = request.getRemoteHost();
You only need to add servlet.jar in your project properties > Build Path > Libraries.
Good luck,
Josué Cruz
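One generic way to build on the server name Josué's snippet retrieves, sketched here in plain Java (not WebDynPro-specific API; the host names and URLs are made-up placeholders): map the server name reported by the request to an environment-specific URL, so nothing is hard-coded per deployment.

```java
import java.util.HashMap;
import java.util.Map;

public class EnvironmentUrls {
    // Hypothetical host-to-URL mapping; replace with your real hosts.
    private static final Map<String, String> URLS = new HashMap<>();
    static {
        URLS.put("dev-server", "http://dev.example.com/service");
        URLS.put("qa-server", "http://qa.example.com/service");
        URLS.put("prod-server", "http://www.example.com/service");
    }

    /** Returns the URL for the given server name, falling back to the dev URL. */
    public static String urlFor(String serverName) {
        return URLS.getOrDefault(serverName, URLS.get("dev-server"));
    }
}
```

Usage would be something like `String url = EnvironmentUrls.urlFor(request.getServerName());`.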

  • Multiple environments on Application Server

    Hi,
    I'm using 4.7.25, and the base API via db.jar.
    I have db.jar in the shared library folder on JBoss AS5 and I have to say I am very impressed with Berkeley DB. Performance is greatly improved from using MySQL, and is getting even better as we tune the environment settings.
My question is this: if I have an environment set up as, let's say, "/opt/myenv" and within it a database called "mydb", will I run into trouble if multiple different applications all use this database? Bear in mind that each application has its own memory space, so I need to create an environment in each application. All the apps would be reading and writing to the DB in question, probably at the same time.
I would imagine this would be OK, but would like to confirm with you guys before starting to recode the apps.
Thanks for your help,
    Ross

    Hi Ross,
user8648084 wrote:
I have db.jar in the shared library folder on JBoss AS5 and I have to say I am very impressed with Berkeley DB. Performance is greatly improved from using MySQL, and is getting even better as we tune the environment settings.
Thank you! Please let us know if you need a review of your performance settings/results.
user8648084 wrote:
if I have an environment set up as let's say "/opt/myenv" and within this a database called "mydb", will I run into trouble if multiple different applications all use this database? Bear in mind that each application has its own memory space so I need to create an environment in each application. All the apps would be reading and writing to the DB in question, probably at the same time.
An environment may be shared by any number of processes, as well as by any number of threads within those processes. It is possible for an environment to include resources from other directories on the system, and applications often choose to distribute resources to other directories or disks for performance or other reasons. However, by default, the databases, shared regions (the locking, logging, memory pool, and transaction shared memory areas) and log files will be stored in a single directory hierarchy.
It is important to realize that all applications sharing a database environment implicitly trust each other. They have access to each other's data as it resides in the shared regions, and they will share resources such as buffer space and locks. At the same time, any applications using the same databases must share an environment if consistency is to be maintained between them.
    Please let us know if this is answering your question.
    Thanks,
    Bogdan Coman
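To make Bogdan's answer concrete: each application process would open the same on-disk environment, initializing the shared subsystems so the processes coordinate with each other. A sketch assuming the base API in db.jar (com.sleepycat.db); the path matches the poster's example:

```java
import java.io.File;
import com.sleepycat.db.Environment;
import com.sleepycat.db.EnvironmentConfig;

public class SharedEnvPerProcess {
    public static Environment openSharedEnv() throws Exception {
        // Each process opens the same on-disk environment; the shared
        // regions (cache, locks, logs) coordinate access between them.
        EnvironmentConfig config = new EnvironmentConfig();
        config.setAllowCreate(true);
        config.setInitializeCache(true);   // shared memory pool
        config.setInitializeLocking(true); // cross-process locking
        config.setInitializeLogging(true); // write-ahead logging
        config.setTransactional(true);
        return new Environment(new File("/opt/myenv"), config);
    }
}
```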

  • Is Bridge a good tool for keeping files in multiple environments?

    Is Bridge capable of being a good digital assets manager?  For example, we have a "public to the company internally", "public externally" and "private to marketing" main categories we work with in regards to our images.  I have a pretty good idea of using Bridge for meta data (category, keyword, etc), but what's the best way to use it for master image management and web sharing hi-res images?
    Thanks
    Jason

Bridge is not really designed to be a digital asset manager for multiple users. In addition, use of Bridge on a network can be problematic. So it will work, but it may not be the best fit.
Since time is money, you might look into a dedicated DAM product and see if it works better.

  • WL 4.5.1 JNDI defaults,JNDI lookup with multiple environments etc..

    The WL4.5 docs, state that if "java.naming.provider.url" is not
    explicitly provided
    (either during the JVM start or in the InitialContext), the WL will
    default to
    "t3://localhost:7001".
Q1) If the server is listening on a different port, shouldn't we
explicitly specify
"-Djava.naming.provider.url=t3://theServerIP:theServerPort" during the
WL JVM start-up? If we don't, how would the naming lookup work when
there is no naming service on localhost:7001?
In real life there are usually several (usually identical) copies
of the WL server instances running in DEV/TEST/QUAL environments.
Assume that all servers are on the same network and have the same EJBs
deployed (using the same home names).
    Q2) Does each WL instance have its own naming service running? (assume
    yes)
    Q3) If no WL instances listen on 7001, what happens with Q1?
    Q4) If client specifically sets the TEST server IP:port for the
    "java.naming.provider.url", would TEST naming service first lookup its
    own tree and
    then, if object does not exist ask other naming services (assume yes ...
    how is this
    set-up, how do the WL naming services discover each other)
    Q5) How many naming services would run if all WL instances are running
    on the
    same machine using different listen ports? (assume N)
    Darko Bohinc
    Consultant - Synergy International Limited
    Level 7, Synergy House, 131 Queen Street, PO Box 7445 Wellesley Street,
    Auckland, NZ
    Phone: +64 9 3772400
    Fax: +64 9 3772444
    URL: http://www.synergy.co.nz

No, version 4.5 is built to the JNDI 1.1 API.
    Version 1.5 is updated to JNDI 1.2.
    Thanks,
    Michael
    Michael Girdley
    BEA Systems Inc
    <[email protected]> wrote in message news:[email protected]..
    Hi All
    I have been using WL 4.5.1 for a while with JNDI 1.1.2 which are
    included in weblogicaux.jar.
    Can I use JNDI 1.2.1 with WL 4.5.1 by overriding the default 1.1.2 ?
    If so, how do I do it?
    Thanks
    Madhu
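For Q1, a client can set the provider URL explicitly per lookup rather than relying on the t3://localhost:7001 default. A sketch (the factory class is WebLogic's standard one; the host, port, and JNDI name in the comment are placeholders):

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class WlLookup {
    public static Object lookup(String providerUrl, String jndiName) throws NamingException {
        // Point the InitialContext at a specific server instance,
        // e.g. providerUrl = "t3://theServerIP:theServerPort".
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        env.put(Context.PROVIDER_URL, providerUrl);
        Context ctx = new InitialContext(env);
        try {
            return ctx.lookup(jndiName);
        } finally {
            ctx.close();
        }
    }
}
```

Each WL instance runs its own naming service, so pointing DEV, TEST, and QUAL clients at different provider URLs keeps the lookups isolated.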

  • OpenScript 9.1 - use one script for multiple environments

    Greetings everyone.
    I am using Openscript 9.1 on Windows XP. I want to record a script on a web application hosted in one environment and reuse the same script (with minimal re-work) against the same web application hosted in a different environment. For example, record in development and reuse in test and production.
    From what I read on the Internet, the only way to do it is to find the URL in the Java code of the script and replace it as needed. Is there a more sophisticated way of doing this? Thanks in advance for your time and response.
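One generic alternative to editing the recorded Java, sketched in plain Java rather than any OpenScript-specific API (the property name "app.base.url" is a made-up example): read the base URL from a property that each environment sets differently, and fall back to a default when it is absent.

```java
public class BaseUrl {
    /**
     * Resolve the application base URL from a system property,
     * falling back to the given default when the property is not set.
     * The property name "app.base.url" is a hypothetical example.
     */
    public static String resolve(String defaultUrl) {
        return System.getProperty("app.base.url", defaultUrl);
    }
}
```

The recorded script would then build its requests from `resolve(...)` instead of a hard-coded host, so moving from development to test only means launching with a different `-Dapp.base.url=...` value.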

This one could work:
WITH t AS (SELECT ROWNUM rn,
                  ' abcdefghijklmnopqrstuvwxyz' char_table
             FROM all_objects)
SELECT 1000 + ROWNUM - 1 empid,
       'king ' || char_counter ename,
       100 + (ROWNUM - 1) * 100 sal
  FROM (SELECT char_counter
          FROM (SELECT SUBSTR(t1.char_table, t1.rn, 1) || SUBSTR(t2.char_table, t2.rn, 1) char_counter
                  FROM t t1,
                       t t2
                 WHERE t1.rn <= 27
                   AND t2.rn <= 27
                   AND t2.rn > t1.rn)
         ORDER BY char_counter)
 WHERE ROWNUM <= 100;
C.

  • 1 XMLP Enterprise instance, multiple environments (databases) - Datasources

    Hi
    We are trying out XML Publisher Enterprise 5.6.2 (stand-alone) in an OracleAS 10.1.2.0.2 environment (OC4J).
    In our "pre-Production" environment we have four separate instances.
    (instance -in this case- means combination of DB instance + front-end apps deployed in OracleAS OC4J)
    Our aim is to have an XMLP server (OC4J) per major environment, i.e. DEV, pre-PROD and PROD.
    The XMLP reports will mostly be run via URL from a bespoke application.
    The problem is that the data source (either data source defined in XMLP or JNDI data source defined in container) is specified in the XMLP Report. Notably as the "Default Data Source".
    We cannot find a way of dynamically (runtime) specifying the data source to be used or to override the "Default Data Source".
    (this is the data source as a connection definition to a schema in a database)
    We need to achieve this in order to avoid having to have an XMLP server per instance, where our goal is to only have an XMLP server per environment.
    Can anybody please help us on how to dynamically specify (override) the data source used by the XMLP report?
    Thank You & Regards

Hi Tim,
Understood that by setting up the appropriate data sources one can point to many databases from one XMLP server.
    Our situation is that our SDLC prescribes that (in the case of reports):
    - the report be developed and unit tested in Dev.
    (In Dev we plan to have one XMLP that developers can use at will)
    - once unit tested the report shall be deployed to pre-PROD environment, system test instance
    (DB=bugfix, OC4J for bespoke app=bugfix_fe, OC4J for XMLP=xmlpub)
    - once system tested the report shall be deployed to final UAT and sign-off
    (DB=final, OC4J for bespoke app=final_fe, OC4J for XMLP=xmlpub)
    - once signed off, deployed into PROD
    The 'movement' of the report from Dev to Bugfix to Final to Prod should all happen without again editing the report. Thus, if Developer created report with default data source being 'opsi' and this points to e.g. the opsi schema in dev db, this presents the issue that there can only be one data source named 'opsi' in the pre-PROD XMLP.
    Thus we would like to find a way of overriding that 'opsi', e.g. with 'opsi_bug' and then with 'opsi_fin' so that the respective data source can then point to the opsi schema in bugfix and final databases respectively.
    I trust this clarifies the matter some.
    Regards

  • Best Practices: BIP Infrastructure and Multiple Installations/Environments

    Hi all,
    We are in process of implementing BI Publisher as the main reporting tool to replace Oracle Reports for a number of Oracle Form Applications within our organization. Almost all of our Forms environments are (or will be) SSO enabled.
    We have done a server install of BIP (AS 10gR3) and enabled BIP with SSO (test) and everything seems in order for this one dev/test environment. I was hoping to find out how others out there are dealing with some of the following issues regarding multiple environments/installs (and licensing):
Is it better to have one production BIP server or as many BIP servers as there are middle-tier Forms servers? (Keeping in mind all of these need to be SSO enabled.) Multiple installs would mean higher maintenance/resource costs, but is there any significant gain from having more autonomy, where each application has its own BIP install?
    Can we get away with stand alone installations for dev/test environments? If so, how do we implement/migrate reports to production if BIP server is only accessible to DBAs in production (and even real UAT environment where developer needs to script work for migration)? In general, what is the best way to handle security when it comes to administration/development?
    I have looked at the Oracle iStore for some figures but this last question is perhaps one for Oracle Sales people but just in case anybody knows... How's licensing affected by multiple installations? Do we pay per installation or user? Do production and test/dev cost the same? Is the cost of stand alone environment different?
    I would appreciate if you can share your thoughts/experiences in regards to any of the above topics. Thank you in advance for your time.
    Regards,
    Yahya

    Your data is bigger than I run, but what I have done in the past is to restrict their accounts to a separate datafile and limit its size to the max that I want for them to use: create objects restricted to accommodate the location.

  • Office Web Apps for Hosted Environments

    Hi there,
    I have a some servers that host two different SharePoint Farms and currently one is connected to an OWA server -
    http://blogs.technet.com/b/justin_gao/archive/2013/06/30/configuring-office-web-apps-server-communication-using-https.aspx
    I want my second farm to use OWA too and was wondering if it is possible to use the same OWA server for this?  So, can I create two Office Web App Farms on the same server?

    Yes you can use OWA for multiple environments. You can refer to below links:
    http://social.technet.microsoft.com/Forums/en-US/78f678cb-69fc-48e6-9f4d-6985154f2d0c/office-web-apps-with-multiple-sp-farms?forum=sharepointadmin
    http://blogs.technet.com/b/office_resource_kit/archive/2012/09/11/introducing-office-web-apps-server.aspx
    Please ensure that you mark a question as Answered once you receive a satisfactory response.

  • RE: Accessing multiple Env from single Client-PC

    Look in the "System Management Guide" under connected environments page
    72. This will allow services in your primary environment to find
    services in your connected environment. However, there is a bug
    reported on this feature which is fixed in 2F4 for the HP and H1 for all
    other servers. The following is from Forte:
    The connected environments bug that was fixed in 2F4 is #24282. The
    problem
    was in the nodemgr/name server source code and caused the following to
    occur:
    Service1 is in connected envs A and B.
    Client has env A as primary, B as secondary.
    Envmgr A dies before the client has ever made a call to Service1.
After env A is gone, the client makes a call to Service1, which causes
Envmgr B to seg fault.
You should upgrade your node manager/env manager nodes to 2F4. The 2F2
development and runtime clients are fully compatible with 2F4 servers.
    Kal Inman
    Andersen Windows
    From: Inho Choi[SMTP:[email protected]]
    Sent: Monday, April 21, 1997 2:04 AM
    To: [email protected]
    Subject: Accessing multiple Env from single Client-PC
    Hi, All!
Does anybody have an idea how to access multiple environments from a
single client PC? I have to have multiple environments because each
environment resides on a geographically remote node, and network
bandwidth and reliability are not good enough to include all the
systems in a single environment.
Using the Control Panel for this is not easy for those who are not
familiar with Windows. End users tend to use just a single application
to access all necessary services.
I can see two options for doing this:
1. Make a DOS batch command file to switch between environments,
e.g. copying environment repositories back and forth and setting up
forte.ini to change FORTE_NS_ADDRESS, and then invoking the proper
client partition (ftexec).
2. Duplicate the necessary services in each environment.
But these two options have many drawbacks in terms of system
management (option 1), performance (option 2) and others.
Does anybody have a good idea how to implement this? Any suggestion
would be appreciated.
    Inho Choi, Daou Tech., Inc.
    email: [email protected]
    phone: +82-2-3450-4696


  • Accessing multiple Env from single Client-PC

    Hi, All!
Does anybody have an idea how to access multiple environments from a
single client PC? I have to have multiple environments because each
environment resides on a geographically remote node, and network
bandwidth and reliability are not good enough to include all the
systems in a single environment.
Using the Control Panel for this is not easy for those who are not
familiar with Windows. End users tend to use just a single application
to access all necessary services.
I can see two options for doing this:
1. Make a DOS batch command file to switch between environments,
e.g. copying environment repositories back and forth and setting up
forte.ini to change FORTE_NS_ADDRESS, and then invoking the proper
client partition (ftexec).
2. Duplicate the necessary services in each environment.
But these two options have many drawbacks in terms of system
management (option 1), performance (option 2) and others.
Does anybody have a good idea how to implement this? Any suggestion
would be appreciated.
    Inho Choi, Daou Tech., Inc.
    email: [email protected]
    phone: +82-2-3450-4696


  • Multiple databases in one Environment

I am trying to figure out the best way to implement my JE environment. My use case requires that I be able to use UNIX tools like 'cp' to move JE databases from the place of generation (development) to the place of consumption (production). Moreover, each JE database stores very different data and has a different refresh cycle (e.g., one may be refreshed every week, another once a month).
    My dilemma is:
    1)- Should I use one env w/multiple databases, or,
    2)- A separate env for each database?
    With (1), I can't see a way of selectively updating one of the databases and copying over the changed JE log files to production servers (for read-only access at runtime). Every time I want to update one database, I'm thinking I'll have to pull down all JE log files, update the DB and push out all JE log files to production servers.
    (2) seems to go against the guidelines I've seen in this forum, viz., one env per process is more efficient.
    Thoughts/experiences/comments are welcome.
    Thanks

    Hi,
    What Charles suggested is the best approach if you need to avoid the performance issues with having multiple environments in a single process (you mention that you've read about them elsewhere on the forum). Please also be aware that we're working to solve those problems in an upcoming release. So if you prefer, you can implement your application using multiple environments now, and expect that the performance issues will be resolved later. This might make sense if your deployment will be small at first, and performance will not be a big issue right away.
    Mark
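For reference, option (1), one environment holding several databases, looks like this in JE (a sketch; the directory path and database names are made-up placeholders):

```java
import java.io.File;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;

public class MultiDbEnv {
    public static void main(String[] args) {
        EnvironmentConfig envConfig = new EnvironmentConfig();
        envConfig.setAllowCreate(true);
        Environment env = new Environment(new File("/data/je-env"), envConfig);

        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setAllowCreate(true);
        // All logical databases in the environment share the same set of
        // JE log files, which is why one database cannot be 'cp'-ed to
        // production independently of the others.
        Database weekly = env.openDatabase(null, "weeklyData", dbConfig);
        Database monthly = env.openDatabase(null, "monthlyData", dbConfig);

        monthly.close();
        weekly.close();
        env.close();
    }
}
```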
