Optimal environment configuration

Hi,
Could you please let us know whether the following configuration options are optimal for good performance? We want the best read performance without compromising the validity of writes. The BDB version is 5.0.21 (native edition). Thanks.
Non replicated environment
===========================
envConfig.setErrorStream(System.out)
envConfig.setErrorPrefix("BDBEnvironment")
envConfig.setAllowCreate(true)
envConfig.setInitializeLogging(true)
envConfig.setInitializeCache(true)
logFlush is invoked after creating objects in bulk
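A sketch of the bulk-load pattern mentioned above, assuming the com.sleepycat.db (native edition) Java API, where Environment.logFlush maps to DB_ENV->log_flush and passing null flushes through the last written log record (the class and method names here are illustrative, not from the original post):

```java
import com.sleepycat.db.Database;
import com.sleepycat.db.DatabaseEntry;
import com.sleepycat.db.Environment;

public class BulkLoadSketch {
    // Write many records without forcing a log sync per record,
    // then flush the log once at the end of the bulk load.
    static void bulkLoad(Environment env, Database db,
                         byte[][] keys, byte[][] values) throws Exception {
        for (int i = 0; i < keys.length; i++) {
            db.put(null, new DatabaseEntry(keys[i]), new DatabaseEntry(values[i]));
        }
        env.logFlush(null); // flush the log through the last written record
    }
}
```

Flushing once per batch rather than syncing each write is only acceptable when losing the unflushed tail of the log on a crash is tolerable.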
Replicated environment
======================
envConfig.setLockDetectMode(LockDetectMode.DEFAULT)
envConfig.setAllowCreate(true)
envConfig.setRunRecovery(true)
envConfig.setThreaded(true)
envConfig.setInitializeReplication(true)
envConfig.setInitializeLocking(true)
envConfig.setInitializeLogging(true)
envConfig.setInitializeCache(true)
envConfig.setTransactional(true)
envConfig.setTxnNoSync(true)
ReplicationManagerAckPolicy=NONE
requestMinWaitTime = 20000
requestMaxWaitTime = 500000
bulk = false
priority = 100
totalSites = 0
cacheSize = 32 * 1024 * 1024
writeTransactionTimeout = 10 * 1000 * 1000
readTransactionTimeout = 750 * 1000
totalThreads = 3
heartBeatSendInterval = 5000000
heartBeatWaitTimeout = 5 * 60 * 1000 * 1000
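For reference, the replicated options above can be collected into one configuration sketch. Setter names follow the com.sleepycat.db Java API; the ack policy, cache size, and transaction timeout are the values listed above, and anything I was unsure of is flagged as an assumption in a comment rather than guessed:

```java
import java.io.File;
import com.sleepycat.db.Environment;
import com.sleepycat.db.EnvironmentConfig;
import com.sleepycat.db.LockDetectMode;
import com.sleepycat.db.ReplicationManagerAckPolicy;

public class ReplicatedEnvSketch {
    public static Environment open(File home) throws Exception {
        EnvironmentConfig c = new EnvironmentConfig();
        c.setAllowCreate(true);
        c.setRunRecovery(true);
        c.setThreaded(true);
        c.setInitializeReplication(true);
        c.setInitializeLocking(true);
        c.setInitializeLogging(true);
        c.setInitializeCache(true);
        c.setTransactional(true);
        // Good for throughput, but committed transactions still in the log
        // buffer can be lost if the whole site crashes.
        c.setTxnNoSync(true);
        c.setLockDetectMode(LockDetectMode.DEFAULT);
        c.setCacheSize(32L * 1024 * 1024);
        // NONE: the master never waits for replica acks. Fastest, but a
        // master failure can lose transactions the replicas never received.
        c.setReplicationManagerAckPolicy(ReplicationManagerAckPolicy.NONE);
        c.setTxnTimeout(10L * 1000 * 1000); // microseconds
        // Request min/max wait times and heartbeat intervals have their own
        // setters; their exact names vary by release, so check the javadoc.
        return new Environment(home, c);
    }
}
```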

Similar Messages

  • Best environment configuration practice

    We are deploying three different web applications which use BDB XML. They do not share data at all. We currently have all three apps configured to use the same environment, but with different containers. We are running into problems where, if one app goes down, it can take the entire environment down.
    Is a 1:1 application-to-environment configuration a best practice, or is sharing one environment the best practice?
    thanks.

    Hi,
    This is the normal recovery process. If process 2 was in the middle of something, there is potential corruption in the database. So when process 2 rejoins the environment with DB_RECOVER, it sets the panic bit and starts recovery. Process 1 detects that and gets out of the environment; after process 2 finishes recovery, process 1 can rejoin. Since we are a library, we have to be cautious about what we are doing and assume that when some process terminates abnormally, something could be wrong.
    You can put different containers into different environments. Alternatively, the DB_REGISTER and DB_FAILCHK flags can help reduce the number of occurrences of such panic events. It's worth following the reference guide documentation starting here:
    http://download.oracle.com/docs/cd/E17076_02/html/programmer_reference/transapp_fail.html
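    As a rough sketch of the DB_REGISTER suggestion in the Java API (setRegister corresponds to DB_REGISTER; DB_FAILCHK additionally needs thread-is-alive bookkeeping, so check the javadoc for your release before relying on this):

    ```java
    import java.io.File;
    import com.sleepycat.db.Environment;
    import com.sleepycat.db.EnvironmentConfig;

    public class RegisterSketch {
        public static Environment open(File home) throws Exception {
            EnvironmentConfig c = new EnvironmentConfig();
            c.setAllowCreate(true);
            c.setTransactional(true);
            c.setInitializeLocking(true);
            c.setInitializeLogging(true);
            c.setInitializeCache(true);
            // DB_REGISTER + DB_RECOVER: recovery runs only when the process
            // registry shows a previous process exited uncleanly, so healthy
            // processes are not forced out of the environment.
            c.setRegister(true);
            c.setRunRecovery(true);
            return new Environment(home, c);
        }
    }
    ```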
    Thanks,
    Rucong Zhao
    Oracle Berkeley DB XML

  • Re: Multiple Environment Configured as Service Objects

    In the registry, hkey_local_machine\SYSTEM\currentcontrolset\services\Forte
    Environment Manager 3.0.F.2\Parameters, there are three values. One of
    these is "Command line". You can add specific values there. For example,
    you might want to specify the name server address there with the "-fns"
    flag. You will also need to specify a different node name for each
    environment manager, using the -fnd flag.
    Don
    At 04:35 PM 3/11/98 +0100, Ampollini Michele wrote:
    Hello everybody.
    We're trying to configure a NT server with multiple environments.
    The environment managers need to be configured as NT services , and we
    use the srvcinst utility to install them. Does anybody know how to pass
    additional environment settings to server partitions, keeping them
    differentiated among different environments ? We cannot use the system
    variable settings , because they are shared among the different
    environments. We cannot use the
    mycomputer\hkey_local_machine\software\fortesoftwareinc\forte\3.0.f.2,
    because of the same reason ?
    We're also experiencing an error when trying starting a partition up,
    because the server partitions don't seem to recognize the
    FORTE_NS_ADDRESS environment variable.
    Has anybody ever experienced an error such as that one?
    Thank you very much.
    Mik & Frank
    Ds Data Systems Parma
    ============================================
    Don Nelson
    Regional Consulting Manager - Rocky Mountain Region
    Forte Software, Inc.
    Denver, CO
    Phone: 303-265-7709
    Corporate voice mail: 510-986-3810
    aka: [email protected]
    ============================================
    "Until you learn to stalk and overrun, you can't devour anyone" - Hobbes

    Another way that I have done this is to use the SRVANY tool included with
    the NT resource kit to run a .BAT file as an NT service. The .BAT file sets
    the appropriate environment variables and then starts the node manager. The
    benefit of this approach is that any FTEXECs started by the node manager
    inherit this environment.
    Kevin Klein
    Millennium Partners, Inc.
    Milwaukee, Wisconsin, USA
    Mankind, when left to themselves, are unfit for their own government.
    -- George Washington
    -----Original Message-----
    From: Ampollini Michele <[email protected]>
    To: '[email protected]' <[email protected]>
    Date: Wednesday, March 11, 1998 11:04 AM
    Subject: Multipe Environment Configured as Service Objects
    Hello everybody.
    We're trying to configure a NT server with multiple environments.
    The environment managers need to be configured as NT services , and we
    use the srvcinst utility to install them. Does anybody know how to pass
    additional environment settings to server partitions, keeping them
    differentiated among different environments ? We cannot use the system
    variable settings , because they are shared among the different
    environments. We cannot use the
    mycomputer\hkey_local_machine\software\fortesoftwareinc\forte\3.0.f.2,
    because of the same reason ?
    We're also experiencing an error when trying starting a partition up,
    because the server partitions don't seem to recognize the
    FORTE_NS_ADDRESS environment variable.
    Has anybody ever experienced an error such as that one?
    Thank you very much.
    Mik & Frank
    Ds Data Systems Parma

  • Coherence with berkeley db environment configuration problem in weblogic

    Hi
    I am new to Coherence, and I developed a web application in which Coherence is the cache and Berkeley DB is the backend store. I configured coherence-config.xml correctly, per the instructions on the Oracle site. The problem is that when I try to put my data in the cache, I get an exception like this:
    java.lang.NoSuchMethodError: com/sleepycat/je/EnvironmentConfig.setAllowCreate(Z)V
         at com.tangosol.io.bdb.DatabaseFactory$EnvironmentHolder.configure(DatabaseFactory.java:544)
         at com.tangosol.io.bdb.DatabaseFactory$EnvironmentHolder.(DatabaseFactory.java:262)
         at com.tangosol.io.bdb.DatabaseFactory.instantiateEnvironment(DatabaseFactory.java:157)
         at com.tangosol.io.bdb.DatabaseFactory.(DatabaseFactory.java:59)
         at com.tangosol.io.bdb.DatabaseFactoryManager.ensureFactory(DatabaseFactoryManager.java:74)
         at com.tangosol.io.bdb.BerkeleyDBBinaryStoreManager.createBinaryStore(BerkeleyDBBinaryStoreManager.java:176)
         at com.tangosol.net.DefaultConfigurableCacheFactory.instantiateExternalBackingMap(DefaultConfigurableCacheFactory.java:2620)
         at com.tangosol.net.DefaultConfigurableCacheFactory.configureBackingMap(DefaultConfigurableCacheFactory.java:1449)
         at com.tangosol.net.DefaultConfigurableCacheFactory$Manager.instantiateBackingMap(DefaultConfigurableCacheFactory.java:3904)
         at com.tangosol.coherence.component.util.CacheHandler.instantiateBackingMap(CacheHandler.CDB:7)
         at com.tangosol.coherence.component.util.CacheHandler.setCacheName(CacheHandler.CDB:35)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.ReplicatedCache.instantiateCacheHandler(ReplicatedCache.CDB:16)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.ReplicatedCache.ensureCache(ReplicatedCache.CDB:152)
         at com.tangosol.coherence.component.util.safeService.SafeCacheService.ensureCache$Router(SafeCacheService.CDB:1)
         at com.tangosol.coherence.component.util.safeService.SafeCacheService.ensureCache(SafeCacheService.CDB:33)
         at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:875)
         at com.tangosol.net.DefaultConfigurableCacheFactory.configureCache(DefaultConfigurableCacheFactory.java:1223)
         at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:290)
         at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:735)
         at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:712)
         at com.coherence.cachestore.ReadFromFile.putValuesInCache(ReadFromFile.java:149)
         at com.coherence.cachestore.ReadFromFile.doPost(ReadFromFile.java:78)
         at com.coherence.cachestore.ReadFromFile.doGet(ReadFromFile.java:43)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:183)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3717)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    I googled it, but I only found answers about setting the environment in a Java class (and even that did not succeed). I don't know how to do it for Coherence: how will Coherence pick up the Berkeley DB configuration when I load data into the cache? Please help me. If you know the answer, please show how to configure the Berkeley DB environment for Coherence, where I need to change or create the configuration, and what I have to do so that Coherence will invoke the Berkeley DB environment and store the data on the local disk. I am using Coherence 3.6 with WebLogic Server 10.3.5.
    Edited by: 875786 on Dec 2, 2011 4:37 AM
    Edited by: 875786 on Dec 2, 2011 4:39 AM

    Hi, thank you very much. It works fine with JE version 3.3. I have a few remaining doubts. As configured, my application stores the data on local disk using Berkeley DB. When I restart the server the cache data is gone, but the stored data is still available on disk. Is there any configuration or technique available to preload the data from disk into the cache (I am using the replicated cache scheme)? If yes, please provide full details and sample code. Thanks in advance.
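    For anyone hitting the same setup, a minimal coherence-cache-config.xml fragment for a Berkeley DB backed backing map looks roughly like this. The cache/scheme names and directory are made up for illustration; the bdb-store-manager element inside an external-scheme is the piece that points Coherence at the on-disk store:

    ```xml
    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>my-bdb-cache</cache-name>
          <scheme-name>bdb-replicated</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <replicated-scheme>
          <scheme-name>bdb-replicated</scheme-name>
          <backing-map-scheme>
            <external-scheme>
              <bdb-store-manager>
                <directory>/data/bdb</directory>
              </bdb-store-manager>
            </external-scheme>
          </backing-map-scheme>
          <autostart>true</autostart>
        </replicated-scheme>
      </caching-schemes>
    </cache-config>
    ```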

  • Environment configuration for Hot Backups

    Hi all,
    1. I am trying to create a hot backup tool based on the read-only Environment strategy (discussed in a previous thread: http://forums.oracle.com/forums/message.jspa?messageID=3674008#3674008).
    Now, leaving aside the EnvironmentConfig.setReadOnly(true), I have found quite a few possible other configuration options in the EnvironmentParams class and I'm wondering if there are some that I should be using.
    Here are a couple of examples:
    - ENV_RECOVERY
    - ENV_RUN_INCOMPRESSOR
    - ENV_RUN_CHECKPOINTER
    - ENV_RUN_CLEANER
    Would it make sense to configure any of these?
    2. After creating a hot backup I have tried to test its state. Basically, the approach was quite simple:
    - open a read-only env on the backup
    - try to access the databases in the env
    My idea is that if the above 2 ops are succeeding then there is a very good chance that the backup is correct.
    Now, while playing with the above configuration options I have noticed that if I'm setting ENV_RECOVERY to false in this test environment, then any attempt to access the databases within results in a DatabaseNotFoundException.
    Can someone help me understand what is happening? (basically, I cannot make a connection between recovery and access to the DBs in the environment)
    Many thanks in advance,
    ./alex
    PS: I forgot to mention that I'm running quite an old version: 2.1.30
    Edited by: Alex Popescu on Aug 13, 2009 5:50 AM

    ENV_RECOVERY - suppresses running recovery at Environment creation. Internal parameter.
    ENV_RUN_INCOMPRESSOR, ENV_RUN_CHECKPOINTER, ENV_RUN_CLEANER - disable the INCompressor, Checkpointer, and Cleaner Daemon threads.
    You should not need to adjust any of these parameters for your DbBackup utility. In fact, ENV_RECOVERY is an "internal use only" parameter.
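    For completeness, the public-API way to control those daemons is EnvironmentConfig.setConfigParam rather than EnvironmentParams directly. A sketch of a read-only verification environment for a backup (parameter names use the je.* property namespace; double-check them against the javadoc for your JE release):

    ```java
    import java.io.File;
    import com.sleepycat.je.Environment;
    import com.sleepycat.je.EnvironmentConfig;

    public class BackupVerifySketch {
        public static Environment openForVerification(File backupDir) throws Exception {
            EnvironmentConfig c = new EnvironmentConfig();
            c.setReadOnly(true);      // never modify the backup being checked
            c.setAllowCreate(false);  // the backup must already exist
            // Keep background daemons from touching the files under test.
            c.setConfigParam("je.env.runCleaner", "false");
            c.setConfigParam("je.env.runCheckpointer", "false");
            c.setConfigParam("je.env.runINCompressor", "false");
            return new Environment(backupDir, c);
        }
    }
    ```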
    PS: I forgot to mention that I'm running quite an old version: 2.1.30
    I'm sorry to be the bearer of bad news, but as my colleague Mark Hayes stressed in a previous post, you really need to upgrade from 2.1.30 to 3.3.latest. It is highly probable that you will eventually run into bugs with 2.x and we are unlikely to (1) be willing to diagnose them, and (2) fix them. As Mark pointed out, 2.1 is 3.5 years old and the product has had a lot of improvements in that time. We are happy to answer questions on this forum relating to the latest major release, but dealing with old and crusty code is certainly going to be well below our allowable priority level.
    Charles Lamb

  • Overwhelmed "newbie" with an environment configuration question...

    This will surely display my lack of experience when it comes to both MIDI and DAW software, but:
    Page 84 of the "Getting Started With Logic" document states:
    On the left hand you see an Object named +Physical Input+. Only one of these Objects exists in the environment.
    I have multiple MIDI input devices (M-Audio Keystation 88es, Keystation 49e, and Trigger Finger). They're all connected via USB. In Audio/MIDI Setup they're all recognized separately. Before I read this, I was trying to assign each device its own Physical Input, but no matter what I did, messages from all three devices showed up on the same Keyboard Object in the environment.
    How can I (hopefully) configure Logic Express to recognize them separately, because I'd like to assign specific devices to separate Audio Instrument tracks?
    Much, much thanks (and props) for the wisdom and sharing it!

    I noticed that in my audio/midi set-up utility, all three controllers have been set to a "port 1" without the option to select a different port...
    You might just have hit it on the head... you're onto something there. I only have one USB MIDI interface, and it is not supported anymore, so I use its MIDI ports instead, into my Unitor8 MkII. No USB...
    Is my problem that I'm using the controllers hooked up with USB cables versus MIDI cables (say with a USB MIDI interface)?
    That might be it. I'm not 100% sure, but that might be why. You could check in the Audio Midi Setup Application (in the utilities folder that is in the main applications folder) to see if each Midi controller is setup correctly, as far as number of MIDI channels to transmit and receive on.
    And thanks for clearing up the channel splitter thing for me, if / when I get another controller, I'll be sure to try this.
    Another thought: have you tried setting up each controller to transmit on a different MIDI channel, instead of OMNI? OMNI means the device transmits on ALL MIDI channels ALL the time... this would undoubtedly cause further troubles. Try, if possible, to set each device internally to its own MIDI transmit and receive channel, say 1, 2, and 3. See what happens...
    As far as the output is concerned, if you are triggering internal VIs, you don't need to concern yourself as much, except for the EVB3, which accepts different messages on chs 1,2,and 3, for different parts of the plugin. (upper,middle and foot pedals)
    Cheers

  • Building with different environment configuration

    I currently work on a Portal project and need to build for different environments, such as production and test. These environments have their own configuration settings. For instance, I would like to take advantage of this configuration in test and production, but not in development:
    http://dev2dev.bea.com/blog/gnunn/archive/2005/06/a_no_brainer_pe_1.html
    Another issue is security setup. I would like to be able to run the portal file during development, but being more restrictive in test and production.
    Have anyone handled this in a good automated way? Any codeshare project that might help me?
    Any input appreciated... Thanks
    Trond Andersen, Invenia AS, http://www.invenia.no

    Hello Jinsoo,
    Yes, it should be fine to set the cache size different on the master
    versus the replica.
    There are several configuration items that must be set the same
    across all sites, such as the size of the log file, or ack policy, etc.
    You should carefully read the pages for any configuration items that
    you want to have different across sites to see if it is allowed or not.
    Sue LoVerso
    Oracle

  • Optimal ram configuration

    Hey guys, I'm a bit of a noob. Which configuration is best for OWC RAM on a 2007 2 x 2.66 GHz Dual-Core Intel Xeon, 667 MHz?
    DIMM Riser A/DIMM 1          2GB               DDR2 FB-DIMM    ---> stock ram
    DIMM Riser A/DIMM 2          2GB               DDR2 FB-DIMM    ---> stock ram
    DIMM Riser B/DIMM 1          empty             DDR2 FB-DIMM
    DIMM Riser B/DIMM 2          empty             DDR2 FB-DIMM
    DIMM Riser A/DIMM 3          empty             DDR2 FB-DIMM
    DIMM Riser A/DIMM 4          empty             DDR2 FB-DIMM
    DIMM Riser B/DIMM 3          empty             DDR2 FB-DIMM
    DIMM Riser B/DIMM 4          empty             DDR2 FB-DIMM
    I want to add OWC RAM to this to get up to anything over 12 GB. Which configurations are optimal?
    Should I get 6 -2GB modules or maybe just 2- 4GB modules? or 2- 2GB and 4- 4GB modules?
    Also does it matter where I put them?
    I actually bought 4x4gb ram for the 2010 quad-core mac pro by mistake. Will this work on my 2007 2x dual core model?
    (btw does 2 x dual core or just a single quad core provide better performance for say, 6 concurrent Adobe applications or Video editing?)
    Thanks!

    2 x 4GB should go in Riser B, DIMMs 1 and 2.
    Four DIMMs is the ideal quad configuration; eight DIMMs would be next.

  • Optimal physical configuration of hard drives

    Hello,
    I have just purchased a new hard drive and would like input on the optimal bay to install it into. My machine has two rear ATA/100 bays and two forward ATA/66 bays. My hard drives are as follows:
    - IBM Deskstar 60GXP 60GB 7200 ATA/100, 2MB(?) cache (Apple original boot drive)
    - Maxtor DiamondMax Plus 9 250GB 7200 ATA/133, 8MB cache
    - Seagate Barracuda ST3250623A 250GB 7200 ATA/100, 16MB cache
    I still boot from the Deskstar. I'd like to dedicate a drive toward highest performance for video. A few specific questions:
    1. Would the Barracuda be the fastest drive among these on my existing system since it has the most cache, or would the ATA/133 drive somehow be faster even though it has half the cache and the bus is only ATA/100?
    2. Generally speaking, DDR memory can be run ok on a bus slower than the memory is rated for, the memory just runs at the slower bus speed; is it the same for hard drives (specifically, running an ATA/133 drive on a slower bus)?
    3. Is it better/faster to run an ATA/133 drive on the ATA/66 bus or the ATA/100 bus?
    4. Is there any "overhead" when running an ATA/133 drive on a slower bus (would a native ATA/100 drive be faster than an ATA/133 drive on an ATA/100 bus, all else being equal)?
    Thanks!
    Power Mac G4 dual 867 Mirrored Drive Doors   Mac OS X (10.4.6)  

    High performance, 100 or 133, ATA drives are backward compatible. They will run at the speed of the bus to which they are attached.
    The real-world performance of your drives depends on the bus speed, the seek speed, the cache size and efficiency, what else is on the bus, how the drive is configured, and other factors specific to the individual setup. I'd suggest you purchase a copy of Speed Tools and run the drive bench tests to determine your best read/write scores.
    Off the top of my head I'd say that your IBM is your slowest drive and your Seagate is your fastest. The fastest single drive setup would be to hook up the Seagate to the ATA100 bus and the other two drives to the ATA66 bus. Of course your Maxtor drive will really take a performance hit in that setup.
    If I was doing it, I'd compromise a little on the Seagate's performance and hook up both the Seagate and Maxtor to the ATA100 bus, both set to CS. The IBM won't be much slower than it already is going from ATA100 to ATA66, maybe 2/3 MBs/sec at most. Others may disagree that this is the best compromise.
    Carl B.

  • Problem with X environment configuration

    I have a desktop PC, a DELL OptiPlex GX260, with a DELL 2405FPW LCD monitor (24", optimal preset resolution 1920 x 1200). I installed Solaris 10 x86 (March 2005). All devices in the OptiPlex GX260 were configured successfully, including the integrated Intel 82845G graphics card, but I cannot correctly install and configure the display device (DELL 2405FPW). The X server will not start at run level 3 (CDE or GNOME); the logs say "Can't open X server". The question is: is it possible to correctly configure the Intel 82845G graphics card and the DELL 2405FPW display? (Of course, I have tried configuring this with the kdmconfig command.)
    Any sugestion, someone have this or similar problem?

    http://docs.info.apple.com/article.html?artnum=304424

  • HD and Mac Pro - Optimal RAID configuration?

    I'm fairly new to RAID, but have done a fair amount of reading over the past few weeks. I currently only have 1 external drive setup as mirrored RAID (1.5TB).
    I do a lot of HD video work (as well as a lot of PShop/Illustrator/AE) and just ordered a new Mac Pro (to arrive in a few weeks). I'm curious to know what you may suggest configuring my system like with the following components. I don't want to have to buy anything else right now, nor do I think that I'll have to.
    I've got:
    1. New Mac Pro with RAID card and 4 identical SAS 300GB internal drives (no, didn't order those all from Apple)
    2. An external 1.5 TB WD drive
    3. Two external 500GB WD drives
    Of course, I want the best of both worlds of speed and redundancy. I had been considering RAID 5, but the more I thought about it, I thought maybe RAID 0 was the way on some of the drives, then create another array making use of the externals and/or Time Machine to take care of the redundancy. I have this super machine coming, and just want to be sure I'm making most efficient use of it all. Any opinions on how you'd set this up, very welcomed.
    I typically work on short projects, I'm not a filmmaker. Client work is usually between 2-15 minutes per clip, and once its done, it can be archived. I've got a ton of old data and large photo library and all the space needs you may imagine with HD video and photography, but I'll certainly have room to spare and play with. How would you work with these 7 drives?

    There is a significant difference between running 'incompatible' software and using a version that has some vestigial system checks. The software is quite capable of running as it should. The issue I experienced has nothing to do with the factors that make it not supported (which is significantly different from incompatible). While your statement is true, it's not properly applied. The fact is, FCP HD is NOT incompatible with the Mac Pro.
    The quick fix of ignoring the scratch location did the job. Taking that option vs spending money has nothing to do with wisdom, rather discernment. It runs just fine now, cost me nothing to get it running, and isn't something I have a huge need for anyway. So this somewhat pretentious attitude isn't needed.

  • Optimal XP Configuration for Workshop

    We've put Workshop 8.1 SP1 on WinXP Pro workstations (2.3GHZ, 1Gb RAM, 20Gb HD). We used them for our week of BEA Portal training. The performance was miserable - it took minutes for jsps to compile and open for debugging - and they locked up or crashed at least twice a day. After class, we upgraded to SP2 (we couldn't during class to avoid conflict with class examples) but have seen no appreciable difference in stability or performance. BEA provides "minimum" requirements (PIII 700Mhz, 1 Gb RAM), which we exceed. Before I go essentially sell my first born to get our PC guys to get me some newer PC hardware (we buy Dell OptiPlex as our corporate standard - can get 3.2GHz w/ 2Gb RAM), are there other tweaks which may help? I've received internal suggestions to possibly get the Dell Precision WinXP workstation to get:
    * A bigger, faster disk (70 Gb, 7200rpm SATA) and create a much bigger swap file (15-20Gb).
    * A dual proc XP box
    But before we sink $3,500 into these, would this help my problem? Or is Workbench just slow and unreliable?

    Mike,
    Have you tried running with SP2 yet? You should notice perf
    improvements with SP2.
    Thomas
    Mike Pinter wrote:
    We've put Workshop 8.1 SP1 on WinXP Pro workstations (2.3GHZ, 1Gb RAM, 20Gb HD). We used them for our week of BEA Portal training. The performance was miserable - it took minutes for jsps to compile and open for debugging - and they locked up or crashed at least twice a day. After class, we upgraded to SP2 (we couldn't during class to avoid conflict with class examples) but have seen no appreciable difference in stability or performance. BEA provides "minimum" requirements (PIII 700Mhz, 1 Gb RAM), which we exceed. Before I go essentially sell my first born to get our PC guys to get me some newer PC hardware (we buy Dell OptiPlex as our corporate standard - can get 3.2GHz w/ 2Gb RAM), are there other tweaks which may help? I've received internal suggestions to possibly get the Dell Precision WinXP workstation to get:
    * A bigger, faster disk (70 Gb, 7200rpm SATA) and create a much bigger swap file (15-20Gb).
    * A dual proc XP box
    But before we sink $3,500 into these, would this help my problem? Or is Workbench just slow and unreliable?

  • Runtime environment configuration and spell check applet

    We have a spell check applet that checks ASP form data for correct spelling. An error occurs where the applet is being called. I have tried to adjust the user's IE settings for Security (ActiveX) as well as several combinations of Java Sun or Microsoft VM under the Advanced tab. The user's system has 1.4.2_08. No combinations of setting changes have helped. Does anyone know of a good JRE source to help me to determine if the user's system is properly finding Java for the applet? Thanks.

    I will check the console. Does the execution status appear on a particular tab of the console?
    The web application runs the applet following clicks of the nav buttons to go between screens. Upon click the JavaScript error is stating that an object is invalid, as if it cannot find the applet or it was not loaded. In the JavaScript code, that is where a function inside the applet is being called.

  • Configure Apps for SharePoint 2013 in dev environment without DNS

    I have a SharePoint 2013 dev environment at http://spitlab/. I want to configure the app store in this environment.
    Will I be able to do it without access to a DNS server?
    I followed the two articles below.
    http://www.ashokraja.me/post/Develop-SharePoint-2013-Napa-App-In-Local-Dev-Environment-Configuring-On-Premises-without-DNS.aspx
    I am able to install third-party apps, but when I click on one it gets redirected to sfs.in.
    Next I tried this:
    http://sharepointconnoisseur.blogspot.com/2013/07/shortcut-to-prepare-sharepoint-2013-app.html
    Same thing: I am able to install third-party apps, but when I click on one it goes to intranet.com.
    So is it possible to install third-party apps on a dev box without DNS and try them out, and if so, what steps am I missing?

    Hi,
    If you click the third-party apps and they are redirected to sfs.in or intranet.com, this means you configured the app domain correctly.
    You can read the official documentation at the first link below to understand what an app domain is (with DNS configured); the app domain can be defined as you want (e.g. ContosoApps.com).
    Without DNS, as the two articles you cited describe, the app domain (e.g. apps.com, or apps.sfs.in) is written manually into the hosts file directly. You can construct an app domain of your own; then, after you install a custom-developed app, its URL will use that app domain.
    http://technet.microsoft.com/en-us/library/fp161236(v=office.15).aspx
    http://www.ashokraja.me/post/Develop-SharePoint-2013-Napa-App-In-Local-Dev-Environment-Configuring-On-Premises-without-DNS.aspx
    http://sharepointconnoisseur.blogspot.jp/2013/07/shortcut-to-prepare-sharepoint-2013-app.html
    Thanks
    Daniel Yang
    TechNet Community Support

  • Configuring Environment for receiving discrete multichannel midi in Logic 8

    I am using ipMidi to send multichannel midi via ethernet to Logic 8 on my G5.
    The midi is showing up fine at Logic 8. I can run one instrument track no problem - but when I try to use more than one instrument track at a time I have the following problem: The multiple channel data is getting combined or summed, I think, by the default environment configuration in such a way that each instrument channel is receiving all the incoming midi channels and not just the one it is set to receive. Can someone help me with configuring the environment for multichannel midi? My goal is to have 8 instrument tracks in Logic respond discretely and simultaneously to 8 channels of incoming data.
    My Apple Audio Midi Setup recognizes the ipMidi port and so does Logic - but I have yet to figure out how to properly cable my environment so that each instrument channel only sees the midi channel it is set to receive.
    Thanks,
    SB

    mystrobray wrote:
    That works beautifully! Thanks very much - it required no changes to the default environment, which makes it pretty foolproof. That gets me 16 channels, but if for some crazy reason I needed more than 16 channels and had to go to a second ipMidi port of 16 channels, will the demix still demix by port as well as by channel?
    Thanks again,
    SB
    In a bit of a hurry, off to a gig so I may not be thinking this through.
    It looks like software instruments are set up to use the merged ports, as I don't see anyplace to set the port for a virtual instrument (external MIDI devices yes, virtual instruments no). However, in the environment you can set up a multi-instrument for a multi-timbral softsynth/sample player, and that would give you more channels, but on a single instrument; you can select a port when using a multi-instrument.
    Also, "settings" are on a per-song basis.
    pancenter-
