HashMap usage

Dear Java community,
Could you please help me? I'm an old C++ programmer and I am trying to understand Java. Thank you very much.
This works nicely:
HashMap myMap = new HashMap();
for (int i = 0; i < someGivenNumber; i++) {
    MyClass r = new MyClass(x * i, y * i);
    myMap.put("r" + i, r);
}
Now I would like to access a method of the instance stored under "r2", e.g.:
r2.print();
Unfortunately, I have not been able to discover how to do this in the documentation. (Maybe because I am mentally handicapped by older experiences :-)
Thank you very much for your kind explanations.
Emil from Vienna

Alternatively (although I prefer the approach above):
import java.util.HashMap;

public class TestMethod {
    public static void main(String[] args) {
        HashMap myMap = new HashMap();
        for (int i = 0; i < 10; i++) {
            myMap.put("r" + i, new MyClass(5 * i, 5 * i));
        }
        // get() returns Object, so cast back to MyClass before calling print()
        ((MyClass) myMap.get("r2")).print();
    }
}

class MyClass {
    int i = 0;
    int j = 0;

    public MyClass(int i, int j) {
        this.i = i;
        this.j = j;
    }

    public void print() {
        System.out.println("i is " + i + " j is " + j);
    }
}
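For reference, a minimal variant using generics (Java 5 and later); it assumes the same MyClass as above and avoids the explicit cast, because the type parameters tell the compiler what get() returns:

import java.util.HashMap;

public class TestMethodGenerics {
    public static void main(String[] args) {
        HashMap<String, MyClass> myMap = new HashMap<String, MyClass>();
        for (int i = 0; i < 10; i++) {
            myMap.put("r" + i, new MyClass(5 * i, 5 * i));
        }
        // No cast needed: get() is declared to return MyClass
        myMap.get("r2").print();
    }
}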

Similar Messages

  • HashMap usage in multithread env

    Based on my understanding, java.util.HashMap is not thread-safe. My question is: does anyone know exactly how a HashMap will behave in the following situation if no external synchronization is done?
    //hashMapInstance has oldobj associated with "SOMEKEY"
    Multiple Reader threads executing hashMapInstance.get("SOMEKEY")
    One Writer thread executing hashMapInstance.put("SOMEKEY", newobj)
    I understand the behaviour will be unpredictable, but does that mean that reader threads have a chance of getting null values?
    Or does that mean that some readers might get oldobj and some readers might get newobj?

    Why worry about what it might do... just fix it.
    hashMapInstance = Collections.synchronizedMap(hashMapInstance);
    ...there, now you're thread-safe. (Note that this only compiles if hashMapInstance is declared as Map rather than HashMap, since synchronizedMap returns a wrapper.)
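    For reference, a small self-contained sketch of the two standard options; the key and variable names are illustrative. Collections.synchronizedMap serializes every call on a single lock, while ConcurrentHashMap (Java 5 and later) allows concurrent readers and is usually the better fit for a read-mostly map:

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class SafeMapDemo {
        public static void main(String[] args) {
            // Option 1: wrap an existing HashMap; every call locks the whole map.
            Map<String, Object> syncMap =
                    Collections.synchronizedMap(new HashMap<String, Object>());
            syncMap.put("SOMEKEY", new Object());

            // Option 2: a ConcurrentHashMap never exposes a half-written entry;
            // a racing get() returns either the old value or the new one.
            Map<String, Object> concMap = new ConcurrentHashMap<String, Object>();
            concMap.put("SOMEKEY", new Object());
            System.out.println(concMap.get("SOMEKEY") != null);
        }
    }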

  • HashMap, ObjectOutputStream, CPU usage

    Hi Java gurus,
    I am developing an application that constantly writes a HashMap to a file through an ObjectOutputStream, using serialization. With this approach, whenever I need to add a new record I must read the whole file into a HashMap, add the record to the HashMap, and then write the whole HashMap back to the file.
    Doing this drives my CPU usage to 100%, which makes my application slow.
    Please tell me how to keep my CPU usage from reaching 100%.
    Please also advise whether I should change my design; if so, tell me what I should be doing.
    Thank you in advance.

    You know, you might find it worthwhile to try defining your own, specialised hash map rather than using the general-purpose one in the class library. Get the number of objects down. I don't know what the key and value items actually are in your application, but consider unpacking them into primitives. Ideally, avoid even using arrays.
    So, let's say you want to hash from a string of up to four characters to an integer; define an entry object like:
    class MyHashEntry {
      static final int TABLE_SIZE = 100000;
      static MyHashEntry[] map = new MyHashEntry[TABLE_SIZE];

      MyHashEntry synonym;   // next entry in the same bucket (collision chain)
      byte char0;
      byte char1;
      byte char2;
      byte char3;
      short value;

      public MyHashEntry(String key, int value) {
         byte[] chars = key.getBytes();
         char0 = chars.length < 1 ? 0 : chars[0];
         char1 = chars.length < 2 ? 0 : chars[1];
         char2 = chars.length < 3 ? 0 : chars[2];
         char3 = chars.length < 4 ? 0 : chars[3];
         this.value = (short) value;
         int col = (hashValue() & 0x7fffffff) % TABLE_SIZE;
         synonym = map[col];
         map[col] = this;
      }

      public int hashValue() {
        return (char0 << 24) ^ (char1 << 16) ^ (char2 << 8) ^ char3;
      }

      public boolean matches(String key) {
        byte[] chars = key.getBytes();
        return char0 == (chars.length < 1 ? 0 : chars[0])
            && char1 == (chars.length < 2 ? 0 : chars[1])
            && char2 == (chars.length < 3 ? 0 : chars[2])
            && char3 == (chars.length < 4 ? 0 : chars[3]);
      }

      public static MyHashEntry find(String key) {
        byte[] chars = key.getBytes();
        int hash = 0;
        for (int i = 0; i < 4; i++) {
          hash <<= 8;
          if (i < chars.length)
            hash ^= chars[i];
        }
        hash = (hash & 0x7fffffff) % TABLE_SIZE;
        MyHashEntry candidate = map[hash];
        while (candidate != null) {
          if (candidate.matches(key))
            return candidate;
          candidate = candidate.synonym;
        }
        return null;
      }
    }
    (Untested)
    This should reduce the number of objects by a factor of three or four.
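    A quick usage sketch with the names defined above:

    public class MyHashEntryDemo {
        public static void main(String[] args) {
            new MyHashEntry("word", 42);   // the constructor links the entry into the table
            MyHashEntry e = MyHashEntry.find("word");
            System.out.println(e == null ? "missing" : "value=" + e.value);
        }
    }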

  • HashMap High Memory Usage, is this expected?

    I'm using this test class to demonstrate memory usage in HashMaps. I've done some preliminary testing by adding Runtime calls to get memory before and after, but what I'm immediately noticing is that I'm running out of memory before I hit the 200K mark.
    The error specifically is: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    Am I doing anything horribly wrong here that is over-allocating memory? It doesn't seem to me like it should be hitting 75MB so quickly. How can I estimate the usage per HashMap here? What is each String key worth? I saw that there's a chart of primitive memory sizes somewhere...
    thanks in advance.
    package test;
    import java.util.*;
    /* burns through 75MB, and runs out.
     * is this expected behavior?
     */
    public class ALMemoryTest {
         public static void main(String[] args) {
              ArrayList<HashMap<String, Object>> al = new ArrayList<HashMap<String, Object>>();
              HashMap<String, Object> hm;
              for (int i = 0; i < 200000; i++) {
                   hm = new LinkedHashMap<String, Object>();
                   hm.put("original", i);
                   hm.put("times2", i * 2);
                   hm.put("stringkey", i + "string");
                   al.add(hm);
              }
              System.out.println(al.size());
         }
    }

    So I made another test class. This shows I can get up to 185K-ish elements or so...
    package test;
    import java.util.*;
    /* burns through 75MB, and runs out.
     * is this expected behavior?
     */
    public class ALMemoryTest {
         ArrayList<HashMap<String, Object>> al = new ArrayList<HashMap<String, Object>>();

         public double bytesToMbs(long bytes) {
              return bytes / 1048576.0;   // dividing by the int literal would truncate
         }

         public LinkedHashMap<String, Object> newHashMap(int i) {
              LinkedHashMap<String, Object> hm = new LinkedHashMap<String, Object>();
              hm.put("original", i);
              hm.put("times2", i * 2);
              hm.put("stringkey", i + "string");
              return hm;
         }

         public void totalMem() {
              System.out.println("total bytes: " + Runtime.getRuntime().totalMemory());
         }

         public void freeMem() {
              System.out.println("free bytes: " + Runtime.getRuntime().freeMemory());
         }

         public void arrayListTest(int max) {
              for (int i = 0; i < max; i++) {
                   al.add(newHashMap(i));
              }
              System.out.println(al.size());
         }

         public static void main(String[] args) {
              ALMemoryTest test = new ALMemoryTest();
              test.arrayListTest(100000);
              test.arrayListTest(50000);
              test.arrayListTest(20000);
              test.arrayListTest(10000);
              test.arrayListTest(5000);
              test.freeMem();
         }
    }
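    For rough per-entry numbers, the usual trick is to compare used heap before and after populating the structure. A minimal sketch (the counts are illustrative, and results vary by JVM):

    import java.util.*;

    public class MapOverheadEstimate {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            System.gc();   // best-effort hint only
            long before = rt.totalMemory() - rt.freeMemory();

            int n = 100000;
            List<Map<String, Object>> al = new ArrayList<Map<String, Object>>();
            for (int i = 0; i < n; i++) {
                Map<String, Object> hm = new LinkedHashMap<String, Object>();
                hm.put("original", i);
                hm.put("times2", i * 2);
                hm.put("stringkey", i + "string");
                al.add(hm);
            }

            System.gc();
            long after = rt.totalMemory() - rt.freeMemory();
            // Each element carries its own map object, hash table, entry
            // objects, boxed Integers and a fresh String, so several hundred
            // bytes per element is plausible.
            System.out.println("approx bytes per element: " + (after - before) / n);
        }
    }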

  • HashMap memory usage

    Hi,
    I am implementing an indexer/compressor for plain text files (text, query log and URL files). The basic skeleton of the indexer is the Huffman codec, plus various add-ons to boost performance.
    Huffman is used on words (Huffword); the first operation I execute is a complete scan of the file to collect term frequencies, which I will use to generate the Huffman model. Frequencies are stored in a HashMap<String, Integer>.
    The main problem is the HashMap's size; I quickly run out of memory.
    In a query log of 300MB I collect around 1,700,000 String-Integer pairs; is it possible that I need a 512MB heap?

    >
    Huffman is used on words (Huffword); the first operation I execute is a complete scan of the file to collect term frequencies, which I will use to generate the Huffman model. Frequencies are stored in a HashMap<String, Integer>.
    The main problem is the HashMap's size; I quickly run out of memory.
    In a query log of 300MB I collect around 1,700,000 String-Integer pairs; is it possible that I need a 512MB heap?
    >
    Answer to your question: yes, if you are consuming lots of memory, you need lots of heap.
    Answer to the question you didn't ask: with that many unique words, attempting to assign each word a Huffman code will make your file larger. Huffman codes are only useful when you have a relatively small vocabulary, where an even smaller number of terms predominate. This allows you to use a small number of bits for the frequently-occurring items, and a large number of bits for the rarely-occurring items.
    In your case you're going to have an extremely broad tree, with most of the terms being leaf nodes. If I'm remembering correctly, it will have log2(x) + N bits for a leaf node (where N accounts for the non-overlapping leading bits of the few predominant words), so 24+ bits per word. Plus, you have to store your entire dictionary in the file to be used for reconstruction.
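    To see why the heap demand is credible, here is a hedged back-of-envelope sketch of the frequency-counting step; the per-object sizes in the comments are typical HotSpot figures but vary by JVM and word length:

    import java.util.HashMap;
    import java.util.Map;

    public class FreqCount {
        public static void main(String[] args) {
            Map<String, Integer> freq = new HashMap<String, Integer>();
            for (String w : "the quick the lazy the".split(" ")) {
                Integer c = freq.get(w);
                freq.put(w, c == null ? 1 : c + 1);   // one boxed Integer per update
            }
            // Rough cost per unique word: a HashMap entry (~32-48 bytes)
            // + a String object with its char[] (~40 bytes + 2 bytes/char)
            // + a boxed Integer (~16 bytes), i.e. on the order of 100 bytes.
            // 1,700,000 entries * ~100 bytes is ~170MB of live data, so a
            // 512MB heap is not surprising once GC headroom is added.
            System.out.println(freq);
        }
    }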

  • High cpu usage during JSF lifecycle phase execution

    In our performance test we encountered high CPU usage (100%), and the thread dumps indicated that most of the time the threads are either executing the restore-view or render-response phase of the JSF lifecycle, or they are blocked while accessing the jar files which contain the xhtml pages.
    One thread dump of a runnable thread is:
    java.lang.Thread.State: RUNNABLE
    at java.util.HashMap.get(HashMap.java:317)
    at javax.faces.component.ComponentStateHelper.get(ComponentStateHelper.java:174)
    at javax.faces.component.ComponentStateHelper.add(ComponentStateHelper.java:216)
    at javax.faces.component.UIComponent.setValueExpression(UIComponent.java:436)
    at com.sun.faces.facelets.tag.jsf.CompositeComponentTagHandler$CompositeComponentRule$CompositeExpressionMetadata.applyMetadata(CompositeComponentTagHandler.java:631)
    at com.sun.faces.facelets.tag.MetadataImpl.applyMetadata(MetadataImpl.java:81)
    at javax.faces.view.facelets.MetaTagHandler.setAttributes(MetaTagHandler.java:129)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.setAttributes(DelegatingMetaTagHandler.java:102)
    at com.sun.faces.facelets.tag.jsf.CompositeComponentTagHandler.setAttributes(CompositeComponentTagHandler.java:246)
    at com.sun.faces.facelets.tag.jsf.CompositeComponentTagHandler.applyNextHandler(CompositeComponentTagHandler.java:184)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:308)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:367)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:346)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeFacelet(DefaultFaceletContext.java:199)
    at com.sun.faces.facelets.tag.ui.IncludeHandler.apply(IncludeHandler.java:120)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.applyNextHandler(DelegatingMetaTagHandler.java:137)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at com.sun.faces.facelets.tag.ui.DefineHandler.applyDefinition(DefineHandler.java:107)
    at com.sun.faces.facelets.tag.ui.CompositionHandler.apply(CompositionHandler.java:178)
    at com.sun.faces.facelets.impl.DefaultFaceletContext$TemplateManager.apply(DefaultFaceletContext.java:395)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeDefinition(DefaultFaceletContext.java:366)
    at com.sun.faces.facelets.tag.ui.InsertHandler.apply(InsertHandler.java:112)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.applyNextHandler(DelegatingMetaTagHandler.java:137)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:308)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:367)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:346)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeFacelet(DefaultFaceletContext.java:199)
    at com.sun.faces.facelets.tag.ui.CompositionHandler.apply(CompositionHandler.java:155)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:308)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:367)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:346)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeFacelet(DefaultFaceletContext.java:199)
    at com.sun.faces.facelets.tag.ui.IncludeHandler.apply(IncludeHandler.java:120)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.apply(DefaultFacelet.java:152)
    at com.sun.faces.application.view.FaceletViewHandlingStrategy.buildView(FaceletViewHandlingStrategy.java:774)
    at com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:100)
    at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:101)
    at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:139)
    at javax.faces.webapp.FacesServlet.service(FacesServlet.java:594)
    at org.apache.catalina.core.StandardWrapper.service(StandardWrapper.java:1550)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:343)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:217)
    while a thread trace for a blocked thread is:
    java.lang.Thread.State: BLOCKED (on object monitor)
    at java.util.zip.ZipFile.getEntry(ZipFile.java:302)
    - waiting to lock <0x00000000c0f678f8> (a java.util.jar.JarFile)
    at java.util.jar.JarFile.getEntry(JarFile.java:225)
    at java.util.jar.JarFile.getJarEntry(JarFile.java:208)
    at sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:817)
    at sun.misc.URLClassPath$JarLoader.findResource(URLClassPath.java:795)
    at sun.misc.URLClassPath.findResource(URLClassPath.java:172)
    at java.net.URLClassLoader$2.run(URLClassLoader.java:551)
    at java.net.URLClassLoader$2.run(URLClassLoader.java:549)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findResource(URLClassLoader.java:548)
    at java.lang.ClassLoader.getResource(ClassLoader.java:1138)
    at java.lang.ClassLoader.getResource(ClassLoader.java:1133)
    at org.glassfish.web.loader.WebappClassLoader.getResource(WebappClassLoader.java:1156)
    at org.glassfish.web.loader.WebappClassLoader.getResourceFromJars(WebappClassLoader.java:1111)
    at org.apache.catalina.core.StandardContext.getMetaInfResource(StandardContext.java:7586)
    at org.apache.catalina.core.StandardContext.getResource(StandardContext.java:6979)
    at org.apache.catalina.core.ApplicationContext.getResource(ApplicationContext.java:382)
    at org.apache.catalina.core.ApplicationContextFacade.getResource(ApplicationContextFacade.java:260)
    at com.sun.faces.context.ExternalContextImpl.getResource(ExternalContextImpl.java:502)
    at com.sun.faces.application.resource.WebappResourceHelper.getURL(WebappResourceHelper.java:119)
    at com.sun.faces.application.resource.ResourceImpl.getURL(ResourceImpl.java:190)
    at com.sun.faces.facelets.tag.jsf.CompositeComponentTagHandler.applyCompositeComponent(CompositeComponentTagHandler.java:366)
    at com.sun.faces.facelets.tag.jsf.CompositeComponentTagHandler.applyNextHandler(CompositeComponentTagHandler.java:191)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:308)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:367)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:346)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeFacelet(DefaultFaceletContext.java:199)
    at com.sun.faces.facelets.tag.ui.IncludeHandler.apply(IncludeHandler.java:120)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.applyNextHandler(DelegatingMetaTagHandler.java:137)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at com.sun.faces.facelets.tag.ui.DefineHandler.applyDefinition(DefineHandler.java:107)
    at com.sun.faces.facelets.tag.ui.CompositionHandler.apply(CompositionHandler.java:178)
    at com.sun.faces.facelets.impl.DefaultFaceletContext$TemplateManager.apply(DefaultFaceletContext.java:395)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeDefinition(DefaultFaceletContext.java:366)
    at com.sun.faces.facelets.tag.ui.InsertHandler.apply(InsertHandler.java:112)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.applyNextHandler(DelegatingMetaTagHandler.java:137)
    at com.sun.faces.facelets.tag.jsf.ComponentTagHandlerDelegateImpl.apply(ComponentTagHandlerDelegateImpl.java:184)
    at javax.faces.view.facelets.DelegatingMetaTagHandler.apply(DelegatingMetaTagHandler.java:120)
    at javax.faces.view.facelets.CompositeFaceletHandler.apply(CompositeFaceletHandler.java:98)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:308)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:367)
    at com.sun.faces.facelets.impl.DefaultFacelet.include(DefaultFacelet.java:346)
    at com.sun.faces.facelets.impl.DefaultFaceletContext.includeFacelet(DefaultFaceletContext.java:199)
    at com.sun.faces.facelets.tag.ui.CompositionHandler.apply(CompositionHandler.java:155)
    at com.sun.faces.facelets.compiler.NamespaceHandler.apply(NamespaceHandler.java:93)
    at com.sun.faces.facelets.compiler.EncodingHandler.apply(EncodingHandler.java:86)
    at com.sun.faces.facelets.impl.DefaultFacelet.apply(DefaultFacelet.java:152)
    at com.sun.faces.application.view.FaceletViewHandlingStrategy.buildView(FaceletViewHandlingStrategy.java:774)
    at com.sun.faces.application.view.StateManagementStrategyImpl.restoreView(StateManagementStrategyImpl.java:223)
    at com.sun.faces.application.StateManagerImpl.restoreView(StateManagerImpl.java:188)
    at com.sun.faces.application.view.ViewHandlingStrategy.restoreView(ViewHandlingStrategy.java:123)
    at com.sun.faces.application.view.FaceletViewHandlingStrategy.restoreView(FaceletViewHandlingStrategy.java:453)
    at com.sun.faces.application.view.MultiViewHandler.restoreView(MultiViewHandler.java:148)
    at com.sun.faces.lifecycle.RestoreViewPhase.execute(RestoreViewPhase.java:192)
    at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:101)
    at com.sun.faces.lifecycle.RestoreViewPhase.doPhase(RestoreViewPhase.java:116)
    at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:118)
    at javax.faces.webapp.FacesServlet.service(FacesServlet.java:593)
    at org.apache.catalina.core.StandardWrapper.service(StandardWrapper.java:1550)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:343)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:217)
    We use Glassfish 3.1.1 as our application server, and the project_stage property is set to System_test. I would like suggestions on how I should investigate this further. Is this normal behavior? Does Glassfish provide an alternative for resolving blocked threads, like some caching mechanism for resources?
    Thanks in advance

    Nik wrote:
    > Even if it is legal, have you tried moving them out of there (just to pinpoint a possible bug, since the stack trace indicates a wait on a jar file)?
    Indeed. If that clears up the issue, it is good information to put in a JSF bug report (which may even cascade to the Glassfish level).
    Putting resources in a jar file is only really useful when you want to share those resources among different web applications, which should be a rare case. Even when that happens, I would probably still choose to simply copy the resources so they are individually managed and you don't get unnecessary dependencies between applications. Just because something is technically possible doesn't make it a good idea.

  • URGENT HELP ON HASHMAP OBJECTOUTPUTSTREAM

    Hi Java gurus,
    I am developing an application that constantly writes a HashMap to a file through an ObjectOutputStream, using serialization. With this approach, whenever I need to add a new record I must read the whole file into a HashMap, add the record to the HashMap, and then write the whole HashMap back to the file.
    Doing this drives my CPU usage to 100%, which makes my application slow.
    Please tell me how to keep my CPU usage from reaching 100%.
    Please also advise whether I should change my design; if so, tell me what I should be doing.
    Thank you in advance.

    If the objects in your HashMap are strings, you might have better luck using the Properties class. The Properties class has built-in methods for reading and writing a properties file.
    Use Properties.list(PrintWriter) and Properties.load(InputStream) to write and read your properties file. Since Properties extends Hashtable, you can treat it much like your HashMap.
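    A minimal sketch of that approach; note that Properties.store(OutputStream, String) is the writing counterpart of load() (list() is more of a debugging aid), and the file name here is illustrative:

    import java.io.*;
    import java.util.Properties;

    public class PropsDemo {
        public static void main(String[] args) throws IOException {
            Properties props = new Properties();
            props.setProperty("r1", "some value");   // keys and values must be Strings

            // This still rewrites the whole file, but store/load are far cheaper
            // than serializing an entire HashMap object graph each time.
            FileOutputStream out = new FileOutputStream("records.properties");
            props.store(out, "record store");
            out.close();

            Properties loaded = new Properties();
            FileInputStream in = new FileInputStream("records.properties");
            loaded.load(in);
            in.close();
            System.out.println(loaded.getProperty("r1"));
        }
    }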

  • Using HashMaps in an EJB

    OK, this is the second time I'm posting this problem. Last time I posted all the details and it may have been too general. This time I will be more specific. The following are the requirements that we were given for a school assignment:
    Develop a prototype software module to implement a messenger service. Messaging systems consist of one messaging server and several clients. A message can range from simple text to a complex object like an image. The system should include the following components:
    - Messaging Server: A messaging server will hold the message queue for our messaging system (which you need to develop)
    - Messaging Queue: A queue will hold the messages from the client. The messages on this queue will be MapMessages that will allow us to store name/value pair information about the message to be sent out
    - Message Client: A client will create a message and put it on the messaging queue. This message will hold the information for the email message to be sent out
    - Message Driven Bean: A message-driven bean will be responsible for taking in the MapMessage and for mailing it out.
    To develop the system you have to use the EJB 3.0 architecture. Usage of the Java Persistence API for modeling data is recommended.
    I don't want the code to develop the system; I have already created that part, and it works perfectly when I send text or numbers as messages. What I need help with is how to handle a HashMap. Simply SET'ing and GET'ing a HashMap does not seem to work. If you good people can give me some pointers on how to set up the entity and how the GET'ing and SET'ing methods work in relation to the HashMap, I would be a very happy and thankful guy indeed.
    P.S. The *** indicate 'ing. The forum seems to censor those words.
    Message was edited by:
    jomanlk

    Well, I went through your code in the link
    http://forum.java.sun.com/thread.jspa?threadID=5138929&messageID=9509475#9509475
    > Well, that's the important part of the code. It works fine when I send simple messages, but when I try out the HashMap the servlet is displayed but no messages are sent to the queue. It would be great if anyone could help.
    The reason is simple: THE HASHMAP CONTAINS NULL VALUES, AND YOU ARE SENDING NULL TO THE JMS QUEUE, SO THERE IS NO OUTPUT.
    protected void processRequest(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html;charset=UTF-8");
        String title = request.getParameter("title");
        System.out.println("TITLE IS :" + title);
        String body = request.getParameter("body");
        System.out.println("BODY IS  :" + body);
        HashMap hm = new HashMap();
        //hm.put("title", request.getParameter("title"));
        //hm.put("body", request.getParameter("body"));
        hm.put("title", title);
        hm.put("body", body);
        if ((title != null) && (body != null)) {
            try {
                Connection connection = connectionFactory.createConnection();
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer messageProducer = session.createProducer(queue);
                ObjectMessage message = session.createObjectMessage();
                // here we create the NewsEntity that will be sent in the JMS message
                NewsEntity e = new NewsEntity();
                e.setTitle(title);
                e.setBody(body);
                e.setHm(hm);
                message.setObject(e);
                messageProducer.send(message);
                messageProducer.close();
                connection.close();
                // response.sendRedirect("ListNews");
            } catch (JMSException ex) {
                ex.printStackTrace();
            }
        }
    }
    Try out this modified code, and allot some dukes if you succeed. Best of luck :))

  • Design decision: attributes vs. HashMap

    I want to design a few value objects to hold customer and order information. In a later step these objects will be used by EJBs and their DAOs. I'm weighing two design approaches:
    1) The "classical" approach: a serializable class with the corresponding attributes + getters/setters.
    2) A class holding a HashMap, with a key-value pair added to it for each "attribute".
    Any experiences about usage, performance, maintenance, etc. would be nice.
    Thanks!
    -chris

    Hi.
    You didn't say whether you were still going to have accessors/mutators for the properties in (2). If so, then it's just a question of internal representation. If you're not going to have accessors/mutators (instead having put(String name, Object value) and get(String name) methods), then you are almost certainly going to cause yourself a lot of pain in maintenance, and in my experience this pain will be felt before you even get a first release out.
    I'm always in favour of a strong object model, even for these little convenience-type things (maybe that's because I consider myself a strong semantic modeller :-). The reason is that Java is strongly typed, and the strong typing makes itself known when you (or worse, the client code) have to cast the thing in the Map; you subvert the type system somewhat when you upcast and then downcast, as you would be doing in (2).
    So, we are being held to account by the type system, but we're throwing away the help it would have given us.
    Now, even putting that argument aside for the moment: suppose you refactor your code and the thing in the map is no longer what you're casting it to (this is why it's worse in client code). That will result in a run-time exception. If instead of (2) you had done (1), you would have had a compile-time error.
    Clearly the argument above is weakened if you are still using accessors and mutators, but then it seems like such a small effort at that point to add fields for the properties.
    I can't say anything about performance as such I'm afraid, other than write clean code, then measure it to see where the bottlenecks are. By 'clean' I would have to include 'semantically strong'.
    Regards,
    Lance
    Lance Walton - [email protected]
    Team In A Box - Software without Tragedy
    http://www.teaminabox.co.uk
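    To make the tradeoff concrete, a small illustrative sketch (the class and field names are invented for the example):

    import java.io.Serializable;
    import java.util.HashMap;
    import java.util.Map;

    // Approach 1: typed value object; mistakes surface at compile time.
    class OrderVO implements Serializable {
        private String customer;
        public String getCustomer() { return customer; }
        public void setCustomer(String customer) { this.customer = customer; }
    }

    // Approach 2: map-backed object; mistakes surface at run time.
    class MapVO implements Serializable {
        private final Map<String, Object> values = new HashMap<String, Object>();
        public void put(String name, Object value) { values.put(name, value); }
        public Object get(String name) { return values.get(name); }
    }

    public class VoDemo {
        public static void main(String[] args) {
            OrderVO typed = new OrderVO();
            typed.setCustomer("ACME");
            String c1 = typed.getCustomer();              // checked by the compiler

            MapVO untyped = new MapVO();
            untyped.put("customer", "ACME");
            String c2 = (String) untyped.get("customer"); // can only fail at run time
            System.out.println(c1 + " " + c2);
        }
    }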

  • Query memory use of a HashMap at runtime? Overall JVM Memory Use?

    Is there any way that my application can query the deep memory usage (including keys/values) of a HashMap instance at runtime?
    Is there any way that I can query overall memory use of the entire app or JVM through code?
    Thanks!

    You can use the memory profiler in JDeveloper to test this:
    http://www.oracle.com/technology/pub/articles/masterj2ee/j2ee_wk11.html
    http://www.oracle.com/technology/products/jdev/tips/duff/debugger_memoryleaks3.html
    http://radio.weblogs.com/0118231/stories/2005/07/21/tipsForUsingHeapWindowAndMemoryProfilerToFindMemoryLeaks.html
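    If you want numbers from code rather than a profiler, the standard library offers Runtime and (since Java 5) MemoryMXBean; a small sketch:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryUsage;

    public class JvmMemQuery {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            System.out.println("used bytes: " + (rt.totalMemory() - rt.freeMemory()));
            System.out.println("max bytes:  " + rt.maxMemory());

            MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
            System.out.println("heap used:  " + heap.getUsed());
            System.out.println("heap max:   " + heap.getMax());
            // There is no standard API for the deep size of a single HashMap;
            // measuring used heap before and after populating it is the usual
            // in-code approximation, and profilers do it more accurately.
        }
    }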

  • Vector is way faster than HashMap (why?)

    I thought that a HashMap would be faster than a Vector (at adding stuff)... could anyone explain why there is such a HUGE difference?
    here's the code I used:
    import java.util.*;

    public class SpeedTest {
        public static void main(String[] args) {
            final int max = 1000001;
            Integer[] arrayzinho = new Integer[max];
            Arrays.fill(arrayzinho, 0, max, new Integer(1));
            Vector jota = new Vector(max, max);
            HashMap ele = new HashMap(max, 1);

            System.out.println("Adicionando " + (max - 1) + " elementos ao array...");
            long tempo = System.currentTimeMillis();
            for (int i = 0; i < max; i++)
                arrayzinho[i] = new Integer(i);
            System.out.println("A operação demorou " + (System.currentTimeMillis() - tempo) + " msecs.");

            System.out.println("Adicionando " + (max - 1) + " elementos ao Vector...");
            tempo = System.currentTimeMillis();
            for (int i = 0; i < max; i++)
                jota.add(arrayzinho[i]);
            System.out.println("A operação demorou " + (System.currentTimeMillis() - tempo) + " msecs.");

            System.out.println("Adicionando " + (max - 1) + " elementos ao HashMap...");
            tempo = System.currentTimeMillis();
            for (int i = 0; i < max; i++)
                ele.put(arrayzinho[i], arrayzinho[i]);
            System.out.println("A operação demorou " + (System.currentTimeMillis() - tempo) + " msecs.");
        }
    }
    Of course, when adding to the HashMap, two references are entered instead of the one added to the Vector... but even doubling the time the Vector used, the difference is huge!
    here's some output I've got:
    1:
    Adicionando 1000000 elementos ao array...
    A operação demorou 4500 msecs.
    Adicionando 1000000 elementos ao Vector...
    A operação demorou 469 msecs.
    Adicionando 1000000 elementos ao HashMap...
    A operação demorou 7906 msecs.
    2:
    Adicionando 1000000 elementos ao array...
    A operação demorou 4485 msecs.
    Adicionando 1000000 elementos ao Vector...
    A operação demorou 484 msecs.
    Adicionando 1000000 elementos ao HashMap...
    A operação demorou 7891 msecs.
    and so on; the results are almost the same every time it's run. Does anyone know why?

    Note: this only times the for loop and the insert into each one... not the lookup time and array stuff of the original.
    Test One:
    Uninitialized capacity for Vector and HashMap
    import java.util.*;

    public class SpeedTest {
        public static void main(String[] args) {
            final int max = 1000001;
            Vector jota = new Vector();   // new Vector(max, max);
            HashMap ele = new HashMap();  // new HashMap(max, 1);
            Integer a = new Integer(1);

            long tempo = System.currentTimeMillis();
            for (int i = 0; i < max; ++i)
                jota.add(a);
            long done = System.currentTimeMillis();
            System.out.println("Vector Time " + (done - tempo) + " msecs.");

            tempo = System.currentTimeMillis();
            for (int i = 0; i < max; ++i)
                ele.put(a, a);
            done = System.currentTimeMillis();
            System.out.println("Map Time " + (done - tempo) + " msecs.");
        } // main
    } // SpeedTest
    Administrator@WACO //c
    $ java SpeedTest
    Vector Time 331 msecs.
    Map Time 90 msecs.
    Test Two:
    Initialize the Vector and HashMap capacity
    import java.util.*;

    public class SpeedTest {
        public static void main(String[] args) {
            final int max = 1000001;
            Vector jota = new Vector(max, max);
            HashMap ele = new HashMap(max, 1);
            Integer a = new Integer(1);

            long tempo = System.currentTimeMillis();
            for (int i = 0; i < max; ++i)
                jota.add(a);
            long done = System.currentTimeMillis();
            System.out.println("Vector Time " + (done - tempo) + " msecs.");

            tempo = System.currentTimeMillis();
            for (int i = 0; i < max; ++i)
                ele.put(a, a);
            done = System.currentTimeMillis();
            System.out.println("Map Time " + (done - tempo) + " msecs.");
        } // main
    } // SpeedTest
    Administrator@WACO //c
    $ java SpeedTest
    Vector Time 60 msecs.
    Map Time 90 msecs.
    We see that IF you know the capacity of the vector before its usage, it is BEST to create one of the needed capacity...
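    The same applies to HashMap: if the expected size is known up front, sizing the table so it never has to rehash avoids the repeated resize-and-copy work. A small sketch (0.75f is the default load factor; the arithmetic just guarantees capacity * loadFactor >= expected):

    import java.util.HashMap;
    import java.util.Map;

    public class PresizeDemo {
        public static void main(String[] args) {
            int expected = 1000000;
            Map<Integer, Integer> map =
                    new HashMap<Integer, Integer>((int) (expected / 0.75f) + 1, 0.75f);
            for (int i = 0; i < expected; i++) {
                map.put(i, i);   // no rehashing during the fill
            }
            System.out.println(map.size());
        }
    }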

  • HashMap load factor and performance

    Hi,
    Does anyone know why an increase in the load factor of a HashMap decreases lookup performance?
    Furthermore, assume that 99% of the time you will be reading from the HashMap, and writing (put) a well-known amount only 1% of the time. Should the load factor be .01 to get the best performance?
    Perhaps I should use a different data structure? I am trying to load String name/value pairs into a data structure. The loading usually occurs once, but the data can be reloaded. The data will only be accessed via get(name). Nothing else matters: I will submit a name and want the value back. Practically speaking, performance is not even an issue in my application, but I want to be aware of the issues involved and not blindly use the structure. I don't want to study the source. I have searched the web but did not find anything that clearly deals with these issues. I am hoping someone who has already spent the effort can shed some light.
    Thanks,
    elisahak

    If you read any book on performance, they'll tell you to optimise where you need it. And you work this out through profiling; where you need to optimise might not be where you think.
    Anyway, assuming that you've determined that your bottleneck will be your hash table, and you wish to tune it: if you read any book on performance, they'll tell you to measure. If you refactor/redesign/reconfigure because you think it'll be more efficient, but don't verify that it actually is, then you're wasting your time (no pun intended).
    So, I would suggest that you write a method:
    void timeTypicalUsage(final int initialCapacity, final float loadFactor) {
       // ... add loads of stuff
       // ... read loads of stuff (maybe in a loop 99 times, so that
       //     you reflect your "typical usage")
    }
    Then call this method several times with various load factors, and time each call. If you're using JUnit, it'll automatically time the invocations for you.
    Just remember that the first invocations of a method are typically the slowest, as class loaders, JIT, etc. are all still "warming up". And you should be testing with enough invocations that each test takes at least several seconds, if not tens of seconds; otherwise any other overhead starts to become significant.
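    A minimal, self-contained harness along those lines; the sizes and the 99:1 read/write mix are illustrative. (As to the original question: a higher load factor lets the table fill more densely before resizing, so buckets hold longer chains and get() probes more entries, which is why lookups slow down.)

    import java.util.HashMap;
    import java.util.Map;

    public class LoadFactorBench {
        static long timeTypicalUsage(int initialCapacity, float loadFactor) {
            Map<String, String> map = new HashMap<String, String>(initialCapacity, loadFactor);
            long start = System.currentTimeMillis();
            for (int i = 0; i < 100000; i++) {          // the 1%: writes
                map.put("key" + i, "value" + i);
            }
            for (int pass = 0; pass < 99; pass++) {     // the 99%: reads
                for (int i = 0; i < 100000; i++) {
                    map.get("key" + i);
                }
            }
            return System.currentTimeMillis() - start;
        }

        public static void main(String[] args) {
            timeTypicalUsage(16, 0.75f);                // warm-up run, result ignored
            for (float lf : new float[] {0.25f, 0.5f, 0.75f, 1.0f, 2.0f}) {
                System.out.println("load factor " + lf + ": "
                        + timeTypicalUsage(16, lf) + " msecs");
            }
        }
    }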

  • Problem with Firefox and very heavy memory usage

    For several releases now, Firefox has been particularly heavy on memory usage. With its most recent version, with a single browser instance and only one tab, Firefox consumes more memory than any other application running on my Windows PC. The memory footprint grows significantly as I open additional tabs, getting to almost 1GB when there are 7 or 8 tabs open. This is true with no extensions or plugins, and with my usual set (Firebug, Firecookie, Fireshot, XMarks). Right now, with 2 tabs, the memory is at 217,128K and climbing; CPU is between 0.2 and 1.5%.
    I have read dozens of threads providing "helpful" suggestions, and tried any that seemed reasonable. But like most others who experience Firefox's memory problems, none address the issue.
    Firefox is an excellent tool for web developers, and I rely on it heavily, but I have now resorted to using Chrome as the default and only open Firefox when I must, in order to test or debug a page.
    Is there no hope of resolving this problem? So far, from responses to other similar threads, the response has been to deny any responsibility and blame extensions and/or plugins. This is not helpful and not accurate. Will Firefox accept ownership of this problem and try to address it properly, or must we continue to suffer for your failings?


  • Problem with scanning and memory usage

    I'm running PS CS3 on Vista Home Premium, 1.86Ghz Intel core 2 processor, and 4GB RAM.
    I realise Vista only sees 3.3GB of this RAM, and I know Vista uses about 1GB all the time.
    Question:
    While running PS, and only PS, with no files open, I have 2GB of RAM free. Why will PS not let me scan a file that it says will take up 300MB?
    200MB is about the limit that it will let me scan, but even then the actual end product ends up being less than 100MB (around 70MB in most cases). I'm using a Dell AIO A920, latest drivers etc., and PS is set to use all available RAM.
    Not only will it not let me scan; once a file I've opened has used up "x" amount of RAM, even if I then close that file, "x" amount of RAM will STILL be unavailable. This means if I scan something, I have to save it, close PS, then open it again before I can scan anything else.
    Surely this isn't normal. Or am I being stupid and missing something obvious?
    I've also monitored the memory usage during scanning using Task Manager and various other things; it hardly goes up at all, then shoots up to 70-80% once the 70ish-MB file is loaded. Something is up, because if that were true I'd actually only have 1GB of RAM, and running Vista would be nearly impossible.
    It's not a Vista thing either, as I had this problem when I had XP. In fact it was worse then; I could hardly scan anything, and had to use very low resolution.
    Thanks in advance for any help

    55%, it's still 1.6GB... there shouldn't be a problem scanning something that it says will take up 300MB when it actually only takes up 70MB.
    And no, not wrong: it obviously isn't releasing the memory when other applications need it, because I have to close PS before it will release it. Yes, it probably is supposed to release it, but it isn't.
    Thank you for your answer (even if it did appear to me to be a bit rude/shouty; perhaps something more polite than "Wrong!" next time), but I'm sitting at my computer and I can see what is using how much memory and when; you can't.

  • Problem with JTree and memory usage

    I have a problem with JTree when memory usage exceeds physical memory (I have 512MB).
    I use a JTree to display very large data about the organizational structure of a big company. It works fine until memory usage exceeds physical memory; then some of the nodes are not visible.
    I hope somebody has an idea about this problem.

