JVM Setting of Max Heap Equal To Max Perm Size

Hello,
I was wondering, based on anyone's experience or knowledge, whether there would be any bad effects from setting the maximum heap size (-Xmx) equal to the maximum permanent generation size (-XX:MaxPermSize)?
I have a person that has the following settings:
-native -Xnoclassgc -ms256m -mx256m -XX:MaxPermSize=256M
running under WLS 6.1 sp1 on a Sun Solaris box.
Just curious how this would affect applications running under this environment and whether it could contribute to a java.lang.OutOfMemoryError being generated.
Thanks,
Bob Krause
PHS

"Bob Krause" <[email protected]> wrote in message news:19491214.1100812698044.JavaMail.root@jserv5...
> I was wondering, based on anyone's experience or knowledge, whether there would be any bad effects from setting the maximum heap size (-Xmx) equal to the maximum permanent generation size (-XX:MaxPermSize)?
>
> I have a person that has the following settings:
> -native -Xnoclassgc -ms256m -mx256m -XX:MaxPermSize=256M
> running under WLS 6.1 sp1 on a Sun Solaris box.
> Just curious how this would affect applications running under this environment and whether it could contribute to a java.lang.OutOfMemoryError being generated.
The settings above don't make much sense. Check java.sun.com for details on JVM tuning.
Regards,
Slava Imeshev
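
For background, on Sun's HotSpot VM the heap bounded by -Xmx and the permanent generation bounded by -XX:MaxPermSize are separate memory regions, so a java.lang.OutOfMemoryError can originate in either one filling up. The rough sketch below (it needs Java 5 or later, so it is illustrative rather than something to run under WLS 6.1's own JDK) lists the VM's memory pools and their configured maximums, which makes it easy to see that the heap generations and the permanent generation are reported separately.

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

/** Prints each memory pool with its configured maximum, so the heap
 *  generations and the permanent generation can be compared side by side. */
public class MemoryPools {
    public static void main(String[] args) {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long max = pool.getUsage().getMax();
            System.out.println(pool.getType() + " \"" + pool.getName() + "\" max = "
                    + (max < 0 ? "undefined" : (max / (1024 * 1024)) + " MB"));
        }
    }
}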

Similar Messages

  • Setting max heap in Oracle JVM

    Hello -
    I'm having a problem with a Java stored procedure running out of heap memory in Oracle 10g running Java 1.4.
    Normally (running Java in a standard context) I would just modify the -Xmx with a higher value, but for the life of me I can't figure out how to do it in a stored procedure context.
    I have browsed Google and I have browsed the Oracle JVM installation stuff, all to no avail.
    Can anyone help me with how to set my max heap size, or verify that it's impossible? I have taken all of the standard Oracle memory parameters (JAVA_POOL, UGA/PGA/SGA limits) out of the picture by jacking them up and keeping an eye on memory values up to the point that the procedure fails (at ~700m), so I'm pretty sure that this is my problem.
    So far I have looked at:
    Config files
    Config tables
    DB parameters
    I haven't been able to find anything remotely related to JVM option configuration in any of the above.
    It is worth noting that I ran across another forum where someone was wanting to set their minimum heap size, and they were told that it was not possible. I'm just having trouble believing that it's the same story with something as critical as max heap size.
    Much obliged for any help.
    Thanks,
    Annaka

    From Metalink Note 466112.1:
    Applies to:
    Oracle Server - Enterprise Edition - Version: 9.2.0.1 to 10.2.0.3
    This problem can occur on any platform.
    Symptoms
    When attempting to execute a Java class that works fine in a standalone JVM, it fails in the Oracle JVM with the following error:
    ERROR
    ORA-29532: Java call terminated by uncaught Java exception:
    java.lang.OutOfMemoryError
    ORA-06512: at "IDS_SYS.POD", line 3
    Cause
    The MaxMemorySize was set to 256 MB (the default value), but the Java stored procedure needs more than 256 MB to run.
    This can be checked as follows:
    SQL> create or replace function getMaxMemorySize return number
    2 is language java name
    3 'oracle.aurora.vm.OracleRuntime.getMaxMemorySize() returns long';
    4 /
    Function created.
    SQL> select getMaxMemorySize from dual;
    GETMAXMEMORYSIZE
    268435456
    After increasing MaxMemorySize to a larger value (1 GB), the problem was fixed.
    Solution
    Please increase MaxMemorySize to a larger value (e.g. 1 GB). This can be done as follows:
    SQL> create or replace function setMaxMemorySize(num number) return number
    2 is language java name
    3 'oracle.aurora.vm.OracleRuntime.setMaxMemorySize(long) returns long';
    4 /
    Function created.
    SQL> select setMaxMemorySize(1024*1024*1024) from dual;
    SETMAXMEMORYSIZE(1024*1024*1024)
    Then you can check if the value is set correctly using the following:
    SQL> select getMaxMemorySize from dual;
    GETMAXMEMORYSIZE
    1073741824
    In my case I had to set the parameter within the job's session.
    bye
    TPD
    Edited by: TPD on Sep 23, 2008 4:27 PM - tags added
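
    If the value has to be set programmatically for each session (as TPD notes, it had to be set within the job's session), one hedged possibility is to call the wrapper functions from the note over JDBC. This is only a sketch: the connection URL, user and password are placeholders, and it assumes the getMaxMemorySize/setMaxMemorySize functions created above already exist in the connected schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    /** Raises OracleRuntime MaxMemorySize for the current session via the wrapper
     *  functions from the note. All connection details are placeholders. */
    public class RaiseOracleJvmHeap {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.OracleDriver");   // older drivers need explicit registration
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
            try {
                Statement st = con.createStatement();
                ResultSet rs = st.executeQuery("select setMaxMemorySize(1024*1024*1024) from dual");
                rs.next();
                System.out.println("setMaxMemorySize returned: " + rs.getLong(1));
                rs.close();

                rs = st.executeQuery("select getMaxMemorySize from dual");
                rs.next();
                System.out.println("MaxMemorySize is now: " + rs.getLong(1));
                rs.close();
                st.close();
            } finally {
                con.close();
            }
        }
    }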

  • Does the jvm allocate complete max heap size initially?

    Does the JVM allocate the memory for the entire max heap size up front, or does it start with the specified minimum size and increase and grab more later if needed?
    The reason this is being posted is because we have a number of jboss servers that are being run. Some don't require large heap sizes, others do. If we use the same large max heap size for all would all the memory get allocated up front, or possibly a smaller initialization portion?

    I have done the test with Solaris, Linux and WinXP.
    Test with -Xms512M
    A simple Java program was written with the minimum heap size set to 512 MB (-Xms512m), and it was then executed on Solaris and WinXP. The memory usage of the Java process was 6 MB on WinXP and 9 MB on Solaris, rather than 512 MB. The JVM does not allocate the configured minimum size of 512 MB at the start of process execution.
    Reason:
    If you ask the OS for 512 MB it'll say "here it is", but pages won't actually be allocated until your app actually touches them.
    If the allocation is not being made initially during the start of the process, the concept of minimum heap size is not required.
    But the garbage collection log shows the minimum heap size as what was configured using -Xms option.
    Test with -Xms1024M
    The JVM arguments were set to -Xms1024m -Xmx1024m, but the memory usage observed with Windows perfmon was 573 MB.
    6.524: [Full GC 6.524: [Tenured: 3081K->10565K(967936K), 0.1949291 secs] 52479K->10565K(1040512K), [Perm : 12287K->12287K(12288K)], 0.1950893 secs]
    Reason:
    This optimization is something the operating system does. The JVM allocates the memory in its address space and initializes all data structures according to -Xms. In every way the JVM can measure, the allocation from the OS is complete. But the OS doesn't physically assign a page to the application until the first store instruction. Almost all modern OSs do this.
    Hope this is helpful.
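
    A small way to see the gap between the configured -Xms/-Xmx figures and what the VM has actually committed is the MemoryMXBean (Java 5 or later). A rough sketch; run it with, for example, -Xms1024m -Xmx1024m and compare the committed figure with what perfmon or top reports for the process.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    /** Shows the init, used, committed and max heap figures for the running VM. */
    public class HeapUsage {
        public static void main(String[] args) {
            MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            System.out.println("init      = " + heap.getInit() / (1024 * 1024) + " MB");
            System.out.println("used      = " + heap.getUsed() / (1024 * 1024) + " MB");
            System.out.println("committed = " + heap.getCommitted() / (1024 * 1024) + " MB");
            System.out.println("max       = " + heap.getMax() / (1024 * 1024) + " MB");
        }
    }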

  • How to set  max-heap-size outside the jnlp file?

    Due to bug_id=6631056 it may not be possible to specify max-heap-size within
    the JNLP file for certain jnlp java applications.
    Are there other possibilities to specify this Jvm parameter?
    In the ControlPanel there is the possibility to specify Xmx for applets but not for jnlp.
    I have tried to add properties like
    "deployment.javaws.jre.0.args=Xmx\=128M" without success
    Many thanks

    You can also specify the max heap size in the JNLP itself:
    <j2se version="1.5+" initial-heap-size="128m" max-heap-size="512m"/>
    Thanks,
    Suresh
    [http://sureshdevi.co.in|http://sureshdevi.co.in]
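
    Whichever mechanism ends up supplying the setting, it is worth having the application itself report the value it actually received, so a JNLP change or external override can be verified. A minimal sketch (the class name is just illustrative) that could be called from a Web Start application's main():

    /** Prints the maximum heap the running VM will use, in megabytes. */
    public class HeapReport {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Effective max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }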

  • Setting System-Wide Max Heap Size

    We want to set the heap-size of Java-Plugin 1.5.0_14 for a company-wide rollout to a fixed size under Windows XP.
    In deployment.config under C:\Winnt\Sun\Java\deployment I am giving this:
    deployment.system.config=file:C:\\WINNT\\Sun\\Java\\Deployment\\deployment.properties
    deployment.system.config.mandatory=true
    In the respective deployment.properties I am giving
    #deployment.properties
    deployment.cache.max.size=50m
    deployment.javapi.jre.1.5.0_14.args=-Xmx256m -Xms75m
    While the cache-parameter is taking effect (visible in the Java Control Panel) there is no change in the Max Heap Size.
    Any idea how this could be achieved?
    Thank you
    Michael

    I determine the max heap size with Runtime.getRuntime().maxMemory().
    Setting it manually in the javacpl via -Xmx works fine, but...
    Problem is that we do not want each user to open his Java Control Panel and set this value manually.
    May be error prone and difficult to communicate in a scenario where you have hundreds of users in different locations, countries etc.
    It should be possible to set this value once when installing the Java Plugin.
    Thank You
    Michael

  • Java vm creation crashes if max heap is set too big

    I start the jvm from deep within a c / fortran app. If I give a "decent" max heap size w/ -Xmx option, everything is fine. However, the JNI_CreateJavaVM crashes w/
    Unhandled exception at 0x060b766b (jvm.dll) in tnapa.exe: 0xC0000005: Access violation reading location
    0x00000ff1.
    The violator is:
    jvm.dll!_findNonConsecutive() + 0x18b
    if I give a large max heap size (e.g. -Xmx1g). I first suspected that this is our own code messing up some memory before the JVM gets into the picture (and I still consider this a strong possibility). However, when trying to solve the problem, I looked at the reported memory location with the MS Visual C++ debugger, and it showed ?? for the values there. I could not find it in the documentation, but my coworker said it means that memory is outside the scope of the process in question. This makes it sound like the JVM is trying to access illegal memory areas. Could this be a JNI bug?
    Environment:
    jdk 1.5.0_09
    win xp sp 2
    ms cl compiler (/MD flag is used in compilation)
    2GB of ram
    The jvm is loaded dynamically, i.e. the reference to the func JNI_CreateJavaVM is fetched from the jvm.dll, which is loaded lazily, i.e. just in time when needed for the first time. A proper typecast is done to ensure that JNICALL calling convention is used.
    -Antti-

    Sorry, blaming the wrong horse here... the JVM in use was actually JRockit (BEA JRockit(R) build R26.4.0-63-63688-1.5.0_06-20060626-2259-win-ia32, jdk 1.5.0_06).
    With the Sun JDK everything is fine. Well, at least it does not crash; it just fails due to not enough memory.
    -Antti-

  • Strange: Mac screen menu-bar requires max-heap-size to be set.

    I planned to omit the max-heap-size attribute in the line of my jnlp file
    <j2se version="1.6+" max-heap-size="256m" />
    The idea was that with Java 1.6 the heap size is set automatically
    according to the client's RAM.
    Unfortunately, the Macintosh screen menu-bar works if and only if the max-heap-size attribute is present.
    It is Mac OS X 10.4 with Java 1.5.
    Strange, since the Mac runs Java 1.5 and I am talking about settings for 1.6.
    The jnlp passed ValidateJNLP at http://mindprod.com/jgloss/jnlp.html#VALIDATION
    Here is another post stating that attributes in JNLP have side effects on Mac's screen menu-bar:
    http://lists.apple.com/archives/Java-dev/2008/Jul/msg00122.html
    Here is my jnlp:
    w3m -dump http://www.bioinformatics.org/strap/strap.jnlp
    Is there an explanation for this?
    Christoph

    user10289576 wrote:
    > I would not blame Macintosh.
    > The error might still be in the Sun's code.
    Could be. But to fix the Mac VM it would require the following.
    1. Find it in the Sun VM.
    2. Fix it in the Sun VM.
    3. Move the changes to the Mac VM...somehow.
    > If the jdk would be smaller, less redundant and clearer, then
    > open JDK could possibly be compiled on a Mac.
    Not sure what that means, since there are likely OS-level calls that must be implemented somewhere in the API, and those are specific to the Mac.
    Just as there are differences between Windows/Solaris/Linux which Sun accounted for.
    And that would be what Apple would have done to make it work on the Mac. And what someone (someone who likes the Mac) will need to continue to do with the public release (expulsion?), which is the form that Java will have going forward.
    > The main thing that should be improved on a Macintosh
    > is to directly allow for Linux and Solaris executables such that the Linux JDK could directly work on a Mac.
    The main thing, again, is that Apple is no longer supporting Java on the Mac.
    And Apple, not Sun and certainly not Oracle, were the ones that created the Mac Java VM.
    So the main thing at this point is that ALL future directions that Java takes on the Mac depend not on Apple but on the Macintosh community. That includes features as well as fixes.

  • Plug-in max heap space for IE on Windows Vista

    G'day,
    The organisation I work for will be moving from Windows 2000 to Windows Vista later this year.
    We'll be including JRE6 as part of the standard desktop installation :-)
    Ahead of that I'd like to know the default maximum heap-space configuration for JRE6 running in IE7 on Windows Vista.
    Elsewhere I've read it's as low as 64M (!)
    http://forum.java.sun.com/thread.jspa?forumID=30&threadID=5168556
    If it's this low I'll need to make sure our JRE6 install is tweaked to increase its max heap space (-Xmx)
    Thanks,
    Chris.

    Hi Kenneth,
    Feel free not to post this if you feel the write-up is too detailed. First, I want to let you know that I am a novice just trying to help MYSELF, JAVA, and others with the same or similar problem. This is going to be a very long post, just like the one you (kbrussel) posted about this problem. I realize it may be unnecessary, but maybe HARDWARE could be the problem, so I listed all my hardware. More data is better than less data, I feel!
    Background: I have the identical problems on 2 different Gateway Models over almost a year.
    System # 01 is an Older Gateway Computer 838GM (Media Center) Pentium 4 - 630 HT Processor 3.0 GHz/2MB Cache, 1.536 GB of RAM (1.49GB Recognized) PC3200 with 800 MHz FSB and 200 GB Hard Drive running Windows XP Media Center 2002 SP2 with all latest updates (Actually I was told it is XP Professional plus additional programs for media) and a GMA900 on-board Video outputting to a Gateway EV700 CTR Monitor using a Linksys Cable Modem Model: BEFCMU10 Version 4.0 and Comcast Broadband ISP for Internet Connection.
    System # 02 is a Newer Gateway Computer GM5457E (Media Center) Core 2 Duo Processor E6320 1.86 GHz per CPU Core @ 1066 MHz FSB/4MB Cache, 4.096 GB of RAM (3.405216GB Available to Windows) PC5400 with Intel 945G Chip Set, a SATA II 500 GB Hard Drive running Windows Vista Home Premium V 6.0 (Build 6000) and an NVIDIA GE Force 7500LE Video Card outputting DVI to a new Acer AL2223W HDLCD 22" Monitor using a Linksys Cable Modem Model: BEFCMU10 Version 4.0 and Comcast Broadband ISP for Internet Connection.
    I am a high-volume, quick-trading new "day trader" (actually a "minute" trader) and I have been having an "EXTREME" problem over the past 10 months, maybe longer, with both my older and my newer computer SYSTEMS locking up (looping, I think; nothing using JAVA seems to work, totally locked) when running the JAVA-based TD Ameritrade "Command Center 2.0". It happens during the early, very FAST trading minutes (usually 9:30 AM to 10:00 AM) with all functions in use (limited to 6 of each function by TD Ameritrade), maybe 20-28 windows working at the same time. I use "Charts, Last Sales, Level II, Watch Lists, Trade Tickets, Actives", as a "DAY-TRADER". The JAVA plug-in providing "streaming" data to my older computer was unreliable, always locking up. I bought a totally new computer system after I was told by TD Ameritrade TECHNICAL that "IT COULD BE MY OLDER COMPUTER". With the new computer I have the same problem: lock-up with CPU usage between 52% and 59%.
    I have spent over 12 hours with many different TD Ameritrade technicians trying to help me. I have tried both Firefox and IE and the problem occurs on both. I place a buy order and then I may need to sell it in less than a minute. I cannot afford for JAVA to lock up on me for any number of SECONDS while I am trading, and JAVA has done that to me so many times I can't count. By the time I "End Task" and re-open my account I can be down (LOST) several thousand dollars. This only happens when the market is trading furiously and JAVA can't keep up, at least on my computer.
    I have tried all the fixes that TD Ameritrade technicians have given me, some from your blog (http://forum.java.sun.com/profile.jspa?userID=523605), and some from other sites that have been recommended, but I still fear that the problem is "not fixed". I have set -Xmx to "-Xmx100m" and Temporary Internet Files to 100 MB. This has worked for me all day today, but the stock market was not TOO busy and I did not have my second trading account up as I usually do. I think the problem is amplified when the "Market Volume" is very high, meaning that the data is streaming so fast that either JAVA or my computer cannot keep up.
    I also noticed that there is a totally new version of JAVA (http://www.java.net/download/jdk6/6u10/promoted/b11/binaries/jre-6u10-ea-bin-b11-windows-i586-p-24_jan_2008.exe); do you think I should try this version? I had version 6, I think rev. 4, a while ago and it failed too. Is this version newer than the one from a month or so ago? I am now running 1.6.0 (build 1.6.0_03-b05).
    I have not tried all the solutions on your list, but this fix (setting the parameter to -Xmx100m) seems to work so far, just one day. I will let you know of new failures or continuing good operation.
    Thanks, Russ Bower [email protected]. I NEED THIS JAVA program TO WORK RELIABLY. Also, from a blog I visited, it seems to be a problem on Scottrade as well; several people claim it is a JAVA bug.

  • Significance of max heap size mentioned in configtool

    Hi all,
    could anyone please tell me the exact significance of
    max heap size mentioned in configtool in SAP Netweaver in
    <b>1)Instance_ID</b>
    -servers general
    -message servers and bootstrap
    <b>2)Dispatcher_ID</b>
    -general
    -bootstrap
    <b>3)Server_ID</b>
    -general
    -bootstrap
    Which of these do i change to improve the performance?
    I tried changing the max heap size specified in
    <b>Server_ID</b>
    -general
    but i got the following error while trying to start the server  in std_server0.out:
    node name   : server0
    pid         : 3452
    system name : N02
    system nr.  : 01
    started at  : Tue Mar 20 21:53:37 2007
    Reserved 1610612736 (0x60000000) bytes before loading DLLs.
    [Thr 1912] MtxInit: -2 0 0
    Error occurred during initialization of VM
    Could not reserve enough space for object heap
    Regards,
    Namrata.

    Hi,
    The biggest impact on runtime performance will come from adjusting the heap size of the server JVM. This is done in Server_ID -> general. The JVM parameters entered here take precedence over the parameters in Instance_ID -> servers general. The server jobs do by far the most work in the Java engine, so it is very important that the JVM for the server node is tuned to handle the workload. Tuning the server JVM, or even adding additional server nodes, depends on the workload and the amount of work on the system.
    Adjusting the heap for the other JVMs will have much less of an impact than adjusting the heap in the server JVM.
    The dispatcher JVM heap settings may have a slight impact at runtime, but compared to the server jobs the dispatcher does relatively little work. Depending on your situation you may need to tune the dispatcher a little, but my experience has been that the default value for the dispatcher is usually sufficient.
    The values for all of the bootstrap jobs may have an impact on startup time, but they will have no impact on runtime since these jobs go away once the system is up. From what I have seen, the default values for the bootstrap jobs are sufficient.
    I never adjust anything under Instance_ID; I'm not sure what these parameters are used for, except perhaps as default values when adding server nodes. Maybe someone out there knows.
    Hope this helps.
    Regards,
    Kolby

  • How to define the limit of the max heap size?

    Hi All,
    I would like to know what should be the limit of the JVM max heap size.
    What will happen if we will not define it?
    What is the purpose of defining it from the technical point of view?
    Thanks
    Edited by: Anna78 on Jul 31, 2008 12:36 PM

    Defining a max heap space too large can have the following effect:
    If you create new objects, the VM may decide it is not worth getting rid of garbage-collectable ones, as there
    is still plenty of space between the current heap size and the max allowed. The result will be that the
    application will run faster and will consume more memory than it really needs.
    If the heap size is too small, but still sufficient, the application will do a lot of garbage-collection and therefore
    run slower. On the other hand, it will stay inside the tight space it has been allowed to use.
    The speed difference may or may not be noticeable, while the difference between 256M and 512M may
    or may not matter on today's computers.
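
    One way to see this trade-off in practice is to run a small allocation-heavy program under -verbose:gc with different -Xmx values and compare how often the collector runs. A rough sketch (the class name and figures are just illustrative):

    /** Allocates short-lived arrays in a loop so garbage collection activity is easy to observe.
     *  Run it twice with different heap limits, e.g.:
     *    java -verbose:gc -Xmx64m  AllocationChurn
     *    java -verbose:gc -Xmx512m AllocationChurn
     *  With the smaller limit you should see far more frequent collections. */
    public class AllocationChurn {
        public static void main(String[] args) {
            byte[] block = null;
            for (int i = 0; i < 200000; i++) {
                block = new byte[64 * 1024];   // 64 KB that becomes garbage on the next iteration
            }
            System.out.println("done, last block length = " + block.length);
        }
    }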

  • Allocated Heap vs. Max Heap and OutOfMemoryError

    In my application, I specify heap size like following:
    -Xms384m -Xmx384m
    I got a few OOME errors recently. From our monitoring tool, I can see the 'allocated heap' was only in the 280-340 MB range when the OOME happened. That means the used heap size was close to, or reached, the 'allocated' heap size but not the max heap defined by '-Xmx'. My question is:
    Why does the JVM act like this? What is the problem that prevents the JVM from obtaining the promised memory from the OS?
    We are using JDK 1.5
    Thanks,
    J

    It's easy to check this: create as many threads as possible with a small heap size and a big stack size configured, and count the number of created threads; then increase the heap and run the program again. You will see the number of threads increase too. (A sketch of this experiment follows at the end of this reply.)
    I have created a [java program to do exactly this, check it out|http://weblogs.java.net/blog/claudio/archive/2007/05/how_many_thread_1.html] .
    See the links below on thread stack sizing and on how thread-local allocation buffers are carved out of the heap.
    [http://java.sun.com/docs/hotspot/threads/threads.html|http://java.sun.com/docs/hotspot/threads/threads.html]
    excerpt "TLEs (in 1.3) or TLABs (in 1.4) are thread local portions of the heap used in the young generation"
    [Java Memory White Paper|http://java.sun.com/javase/technologies/hotspot/gc/memorymanagement_whitepaper.pdf]
    For multithreaded applications, allocation operations need to be multithread-safe. If global locks were used to
    ensure this, then allocation into a generation would become a bottleneck and degrade performance. Instead,
    the HotSpot JVM has adopted a technique called Thread-Local Allocation Buffers (TLABs). This improves
    multithreaded allocation throughput by giving each thread its own buffer (i.e., a small portion of the
    generation) from which to allocate. Since only one thread can be allocating into each TLAB, allocation can take
    place quickly by utilizing the bump-the-pointer technique, without requiring any locking. Only infrequently,
    when a thread fills up its TLAB and needs to get a new one, must synchronization be utilized. Several techniques
    to minimize space wastage due to the use of TLABs are employed. For example, TLABs are sized by the allocator
    to waste less than 1% of Eden, on average. The combination of the use of TLABs and linear allocations using the
    bump-the-pointer technique enables each allocation to be efficient, only requiring around 10 native instructions.
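
    Here is a minimal sketch of the thread-counting experiment described at the start of this reply; the class name and the suggested flags are illustrative, and the exact counts depend heavily on the OS and VM.

    /** Spawns sleeping threads until the VM refuses to create more, then prints the count.
     *  Run with different -Xss and -Xmx values and compare the totals, e.g.:
     *    java -Xmx64m -Xss1m ThreadCounter
     *  Warning: this deliberately exhausts VM resources; run it in isolation. */
    public class ThreadCounter {
        public static void main(String[] args) {
            int count = 0;
            try {
                while (true) {
                    Thread t = new Thread(new Runnable() {
                        public void run() {
                            try {
                                Thread.sleep(Long.MAX_VALUE);   // keep the thread alive
                            } catch (InterruptedException ignored) {
                                // exit quietly when interrupted
                            }
                        }
                    });
                    t.setDaemon(true);
                    t.start();
                    count++;
                }
            } catch (OutOfMemoryError e) {
                System.out.println("Created " + count + " threads before: " + e);
            }
        }
    }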

  • Max-Heap deletion

    I'm studying for a Data Structures midterm and one of the questions is making a delete algorithm to delete element e from a max heap. Just wondering if anyone could help me with how this is done. The method should be able to delete from any part of the heap. I know how to delete the max value (remove the max, move the min value to the top and downheap) -- I'm just not sure how I would implement this if the element I wanted to remove was in the middle of the heap. Any help would be greatly appreciated.
    Thanks

    Are you referring to the Java heap, which is used for memory management? If so, you cannot manage the heap in this way; it is up to the garbage collector for the particular JVM.
    If you are referring to an array or hashtable or collection of some sort, which collection are you referring to?
    Don
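
    Coming back to the original data-structures question: deleting an arbitrary element from an array-backed max-heap is usually done by swapping the last element into the removed slot and then restoring the heap property in whichever direction it is violated (sift up or sift down; only one of the two will actually move it). A minimal sketch, with illustrative names:

    import java.util.ArrayList;
    import java.util.List;

    /** Minimal array-backed max-heap supporting removal at an arbitrary index. */
    public class MaxHeap {
        private final List<Integer> a = new ArrayList<Integer>();

        public void insert(int value) {
            a.add(value);
            siftUp(a.size() - 1);
        }

        /** Removes the element at index i by swapping in the last element,
         *  then restoring the heap property above or below as needed. */
        public int removeAt(int i) {
            int removed = a.get(i);
            int last = a.remove(a.size() - 1);
            if (i < a.size()) {          // skip if we just removed the last slot itself
                a.set(i, last);
                siftDown(i);
                siftUp(i);               // only one of these can actually move the element
            }
            return removed;
        }

        private void siftUp(int i) {
            while (i > 0) {
                int parent = (i - 1) / 2;
                if (a.get(i) <= a.get(parent)) break;
                swap(i, parent);
                i = parent;
            }
        }

        private void siftDown(int i) {
            int n = a.size();
            while (true) {
                int left = 2 * i + 1, right = 2 * i + 2, largest = i;
                if (left < n && a.get(left) > a.get(largest)) largest = left;
                if (right < n && a.get(right) > a.get(largest)) largest = right;
                if (largest == i) break;
                swap(i, largest);
                i = largest;
            }
        }

        private void swap(int i, int j) {
            int tmp = a.get(i);
            a.set(i, a.get(j));
            a.set(j, tmp);
        }
    }

    Removal by value rather than by index would first need a lookup (for example, a map from value to index) to keep the operation at O(log n).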

  • Max heap size limits

    I've been looking around for information on the max heap size limits on Sun's JVMs but can't seem to find any information. Just by testing, it seems like the max heap size for Windows 2000 can vary from 1.3G to 1.6G depending upon the machine (JDK 1.4). Does anybody know where I could find actual documentation that describes the limits for Sun's VMs on Windows (2000 and Advanced Server), Linux, and Solaris? I'm about to file this as a documentation bug against the JDK.
    God bless,
    -Toby Reyelts

    There was an older thread in the forums that had some info on this - my quick search failed to locate it; you might want to spend some time looking. The basic problem is memory space fragmentation by the OS, where the OS locates items in memory and effectively constrains heap growth to the unfragmented area that the heap starts in. While there may be more "unused" memory, it's not contiguous. There is also some info in MS's MSDN data regarding this condition, with information on the various OSs. I think Linux has a similar "condition".

  • Max heap memory values

    Dear all,
    In the config tool of a Java 6.40 system, for a server process I see the Max Heap size under the "General" tab and the Max Heap size under the "Bootstrap" tab. What is the difference between these two values? If you can send me a link to documentation on this, it would be very much appreciated.
    Many thanks
    Andreas

    Check Note 876722 - the general setting for the server node takes precedence over the bootstrap value.
    If you can see the process details, look at the -Xmx value of the java process to find the actual heap value.
    Hope it helps.

  • Max Heap size of j2ee web dispatcher ??

    Hello All,
    The max heap size of the J2EE web dispatcher for our DEV BI 7.0 system is 1024 MB, but on our QA BI 7.0 system it is 130 MB. I need to know in which scenario we should increase the max heap size of the J2EE web dispatcher. For BI 7.0, what would be the right heap size?

    I am not sure what you are referring to; maybe I am missing something?
    Max Heap size of j2ee web dispatcher for our DEV BI 7.0 is 1024 MB. But our QA BI 7.0 system is 130 MB.
    What is the web dispatcher? It is described here -->
    http://help.sap.com/erp2005_ehp_04/helpdata/EN/c6/0c2c79b9fc4e1c8548815bf56300f4/frameset.htm
    Where is its heap? If you mean the dispatcher nodes in AS Java, then that is not the "web dispatcher".
    Manoj Chintawar wrote:
    You should be ok to set max heap size to 1024 MB since you have 16GB of RAM in your system.
    Where did you read that? Per the same Note 723909 - Java VM settings for J2EE 6.40/7.0 -->
    For dispatcher nodes it's sufficient to set the heap size to 171m on 32 bit and to 256m on 64 bit platforms. There is no need to apply the parameters mentioned under 4-9 below.
    Regards.
