Log4j: How to avoid duplicate log messages in a multithreaded application?

Hi,
I am using log4j in my heavily multithreaded Java application. A few minutes after the application comes up, I find many duplicate logger messages on the console (messages logged from threads are displayed 8-9 times each). Is this because my application is multithreaded, or is there something wrong with my configuration file?
My log4j-config.xml is as below:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/"
                     configDebug="true">
     <appender name="ASYNC" class="org.apache.log4j.AsyncAppender" additivity="false">
          <appender-ref ref="TEMP"/>
          <appender-ref ref="CONSOLE"/>
          <appender-ref ref="TRACE"/>
          <appender-ref ref="DEBUG"/>
          <appender-ref ref="INFO"/>
          <appender-ref ref="WARN"/>
          <appender-ref ref="ERROR"/>
          <appender-ref ref="FATAL"/>
     </appender>
     <appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
          <!-- param name="ImmediateFlush" value="true"/-->
          <layout class="org.apache.log4j.PatternLayout">
                 <!-- param name="ConversionPattern"
                            value="%d %-5p [%t] %C %A (%F:%L) - %m%n"/-->
          </layout>
     </appender>
     <appender name="TEMP" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpu.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
     </appender>
     <appender name="TRACE" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acputrc.log"/>
          <param name="Threshold" value="TRACE"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
     </appender>
     <appender name="DEBUG" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpudeb.log"/>
          <param name="Threshold" value="DEBUG"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
     </appender>
     <appender name="INFO" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpuinfo.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
          <filter class="org.apache.log4j.varia.LevelMatchFilter">
                 <param name="LevelToMatch" value="info"/>
          </filter>
          <filter class="org.apache.log4j.varia.DenyAllFilter"/>
     </appender>
     <appender name="WARN" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpuwar.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
          <filter class="org.apache.log4j.varia.LevelMatchFilter">
                 <param name="LevelToMatch" value="warn"/>
          </filter>
          <filter class="org.apache.log4j.varia.DenyAllFilter"/>
     </appender>
     <appender name="ERROR" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpuerr.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
          <filter class="org.apache.log4j.varia.LevelMatchFilter">
                 <param name="LevelToMatch" value="error"/>
          </filter>
          <filter class="org.apache.log4j.varia.DenyAllFilter"/>
     </appender>
     <appender name="FATAL" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/abs/acpu_fun/acpufatl.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
          <filter class="org.apache.log4j.varia.LevelMatchFilter">
                 <param name="LevelToMatch" value="fatal"/>
          </filter>
          <filter class="org.apache.log4j.varia.DenyAllFilter"/>
     </appender>
     <appender name="SLCS" class="org.apache.log4j.FileAppender">
          <param name="File" value="/var/log/slcs/slcs.log"/>
          <layout class="org.apache.log4j.PatternLayout">
                 <param name="ConversionPattern"
                            value="%d %-5p [%t] %C (%F:%L) - %m%n"/>
          </layout>
     </appender>
     <category name="com.aircell.abs.acpu.softwareloadconfig" additivity="false">
          <level value="debug"/>
          <appender-ref ref="SLCS"/>
     </category>
     <root>
          <priority value="info"/>
          <appender-ref ref="ASYNC"/>
     </root>
</log4j:configuration>
Regards.

Hi,
Did you find any solution for this problem? I am also encountering the same problem.
Thanks,
Mohit
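For anyone hitting the same symptom: a common cause of duplicated log output (independent of what the config file contains) is that log4j gets configured more than once, e.g. several threads or entry points each call DOMConfigurator.configure(). Every call creates a fresh set of appender instances and attaches them to the same loggers, so each message is then written once per configuration pass. Below is a minimal guard for that case, assuming the application configures log4j programmatically with the log4j-config.xml above; the class and method names are purely illustrative:

import org.apache.log4j.LogManager;
import org.apache.log4j.xml.DOMConfigurator;

public final class LoggingSetup {

    private static boolean configured = false;

    // Call from every entry point; the XML file is loaded exactly once,
    // so appenders are not attached repeatedly and messages are not duplicated.
    public static synchronized void init() {
        if (!configured) {
            // Drop appenders left over from any earlier configure() call,
            // then apply the configuration once.
            LogManager.resetConfiguration();
            DOMConfigurator.configure("log4j-config.xml");
            configured = true;
        }
    }

    private LoggingSetup() {
    }
}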

Similar Messages

  • How to avoid duplicate posting of noted items for advance payment requests?

    How to avoid duplicate posting of noted items for advance payment requests?

    Puttasiddappa,
    In the PS module we allow the deletion of a component purchase requisition although a purchase order exists. The system sends message CN707 "A purchase order already exists for purchase requisition &" as an information message by design, to allow flexible project management.
    If, however, you want message CN707 to be of type E, you have to modify the standard coding. Using SE91 you can invoke the where-used list of message 707 in message class CN and change
      i707(cn)
    to
      e707(cn)
    where desired.
    Also, user exit CNEX0039 provides the possibility to reject the deletion of a component according to customer needs, e.g. you may check there whether a purchase order exists and reject the deletion.
    Hope this helps!
    Best regards
    Martina Modolell

  • How to avoid duplicate values from ALV grid (see code below)

    How to avoid duplicate values from the ALV grid? See the code below.
    In the query below the doc no is repeated again and again;
    how can I avoid duplication in this query?
    select * into corresponding fields of table itab
             from  J_1IEXCHDR
                     inner join  J_1IEXCDTL
                        on  J_1IEXCDTL~lifnr =  J_1IEXCHDR~lifnr
                     where  J_1IEXCHDR~status = 'P'.

    Hi Laxman,
    After that select statement:
    select * into corresponding fields of table itab
    from J_1IEXCHDR
    inner join J_1IEXCDTL
    on J_1IEXCDTL~lifnr = J_1IEXCHDR~lifnr
    where J_1IEXCHDR~status = 'P'.
    if sy-subrc = 0.
      sort itab by <field name of itab internal table>.
      delete adjacent duplicates from itab comparing <field name of itab internal table>.
    endif.
    (The sort is needed because DELETE ADJACENT DUPLICATES only removes consecutive duplicate rows.)
    This will delete your duplicate entries. Once you are done with this, call the ALV FM:
    call function 'REUSE_ALV_GRID_DISPLAY'
    exporting
      I_INTERFACE_CHECK                 = ' '
      I_BYPASSING_BUFFER                = ' '
      I_BUFFER_ACTIVE                   = ' '
       i_callback_program                = v_repid
      I_CALLBACK_PF_STATUS_SET          = ' '
      I_CALLBACK_USER_COMMAND           = 'IT_USER_COMMAND'
      I_CALLBACK_TOP_OF_PAGE            = ' '
      I_CALLBACK_HTML_TOP_OF_PAGE       = ' '
      I_CALLBACK_HTML_END_OF_LIST       = ' '
      I_STRUCTURE_NAME                  =
      I_BACKGROUND_ID                   = ' '
       i_grid_title                      = 'Purchase Order Details'
      I_GRID_SETTINGS                   = I_GRID_SETTINGS
       is_layout                         = wa_layout
       it_fieldcat                       = it_fieldcat
      IT_EXCLUDING                      = IT_EXCLUDING
      IT_SPECIAL_GROUPS                 = IT_SPECIAL_GROUPS
       it_sort                           = it_sort
      IT_FILTER                         = IT_FILTER
      IS_SEL_HIDE                       = IS_SEL_HIDE
      I_DEFAULT                         = 'X'
      I_SAVE                            = ' '
      IS_VARIANT                        = IS_VARIANT
       it_events                         = it_event
      IT_EVENT_EXIT                     = IT_EVENT_EXIT
      IS_PRINT                          = IS_PRINT
      IS_REPREP_ID                      = IS_REPREP_ID
      I_SCREEN_START_COLUMN             = 0
      I_SCREEN_START_LINE               = 0
      I_SCREEN_END_COLUMN               = 0
      I_SCREEN_END_LINE                 = 0
      I_HTML_HEIGHT_TOP                 = 0
      I_HTML_HEIGHT_END                 = 0
      IT_ALV_GRAPHICS                   = IT_ALV_GRAPHICS
      IT_HYPERLINK                      = IT_HYPERLINK
      IT_ADD_FIELDCAT                   = IT_ADD_FIELDCAT
      IT_EXCEPT_QINFO                   = IT_EXCEPT_QINFO
      IR_SALV_FULLSCREEN_ADAPTER        = IR_SALV_FULLSCREEN_ADAPTER
    IMPORTING
      E_EXIT_CAUSED_BY_CALLER           = E_EXIT_CAUSED_BY_CALLER
      ES_EXIT_CAUSED_BY_USER            = ES_EXIT_CAUSED_BY_USER
        tables
          t_outtab                          = ITAB
    exceptions
       program_error                     = 1
       others                            = 2.
      if sy-subrc <> 0.
        message id sy-msgid type sy-msgty number sy-msgno
                with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      endif.
    Thanks
    Vikranth Khimavath

  • How to avoid Duplicate Records while joining two tables

    Hi,
    I am trying to join three tables; basically two of the tables are the same, one is just a history table, so I wrote a query like:
    select e.id,
           e.seqno,
           e.name,
           d.resdate,
           d.details
    from employees e
    join ((select * from dept) union (select * from dept_hist)) d
      on d.id = e.id and e.seqno = d.seqno
    but this is returning duplicate records.
    Could anyone please tell me how to avoid duplicate records in this query?

    Actually, once a record is processed it is moved to the history table, so the two tables will not have the same records, and I need the records from both tables; that is why I did the union of the two tables, so d has the union of both.
    But I am still getting duplicate records even when I use DISTINCT.

  • How to avoid duplicate data while inserting from sample.dat file to table

    Hi Guys,
    We have an issue with duplicate data in the flat file while loading data from sample.dat into a table. How do we avoid duplicate data via the control file?
    Can any one help me on this.
    Thanks in advance!
    Regards,
    LKR

    No, a control file will not remove duplicate data.
    You would be better off using an external table and then removing the duplicate data with SQL as you query the data to insert it into your destination table.

  • How to avoid directory listing in java web applications.

    How to avoid directory listing in Java web applications?
    That is, on typing the URL of the application it should not show the directory listing. The welcome-file tag in web.xml does not fully solve the problem, since the images folder etc. is still accessible.

    I know of two ways.
    If you're using tomcat and have access to the conf directory.
    Edit your $TOMCAT/conf/web.xml. Find your default servlet properties and change
      <servlet>
        <servlet-name>default</servlet-name>
        <servlet-class>org.apache.catalina.servlets.DefaultServlet</servlet-class>
        <init-param>
          <param-name>debug</param-name>
          <param-value>0</param-value>
        </init-param>
        <init-param>
          <param-name>listings</param-name>
          <param-value>true</param-value>
        </init-param>
        <load-on-startup>1</load-on-startup>
      </servlet>
    to
      <servlet>
        <servlet-name>default</servlet-name>
        <servlet-class>org.apache.catalina.servlets.DefaultServlet</servlet-class>
        <init-param>
          <param-name>debug</param-name>
          <param-value>0</param-value>
        </init-param>
        <init-param>
          <param-name>listings</param-name>
          <param-value>false</param-value>
        </init-param>
        <load-on-startup>1</load-on-startup>
      </servlet>
    And restart your server. This will affect every directory on the server, and return a 405 directory browsing forbidden error.
    Another way is to place an index.jsp inside each directory with a simple one-line redirect to your application's CONTEXT_PATH:
      response.sendRedirect("http://yourserver/yourapp/");
    This will only affect the specific directories which contain these index.jsp files.
    Hope this helps

  • How to avoid duplicate record in a file to file

    Hi Guys,
    Could you please provide a solution to avoid duplicate entries in a flat file based on a key field?
    I request it in terms of standard functions, either at message mapping level or by configuring the file adapter.
    warm regards
    mahesh.

    hi mahesh,
    Write a module processor for checking the duplicate record in the file adapter,
    or
    with a Java/ABAP mapping you can eliminate the duplicate records (a rough sketch of that idea follows below).
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas
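    For illustration only, here is a minimal sketch of the dedup idea behind such a Java mapping. It assumes each record is one line and that the key field is the first semicolon-separated column; the class name, delimiter and sample values are made up, and a real XI/PI mapping or adapter module would of course use the mapping/module API rather than plain collections:

    import java.util.ArrayList;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    public class DuplicateRecordFilter {

        // Keeps the first occurrence of each key and drops later records with a
        // key that was already seen; the order of first occurrences is preserved.
        public static List<String> dropDuplicates(List<String> records) {
            Set<String> seenKeys = new LinkedHashSet<>();
            List<String> result = new ArrayList<>();
            for (String record : records) {
                String key = record.split(";", -1)[0]; // key field = first column (assumed)
                if (seenKeys.add(key)) {
                    result.add(record);
                }
            }
            return result;
        }

        public static void main(String[] args) {
            List<String> input = List.of(
                    "1001;Mahesh;100",
                    "1002;Anu;200",
                    "1001;Mahesh;100"); // duplicate key 1001
            System.out.println(dropDuplicates(input)); // [1001;Mahesh;100, 1002;Anu;200]
        }
    }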

  • How to avoid duplicates for a result set

    How to avoid the duplicate rows for the below query:
    SELECT to_char(grecode(titleid)) gre_code,
           to_char(toeflcode(titleid)) toefl_code,
           titleid
      FROM (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name, KEY)))) RESULT,
                   titleid
              FROM mcp_specifications a
                   JOIN mcp_title_specifications b ON a.specificationid = b.specificationid
                   JOIN mcp_titles c ON b.titleid = c.titleid
             WHERE b.is_parent = 'F'
               AND UPPER(TRIM(c.university_state)) = UPPER(TRIM('USA'))
               AND TO_CHAR(get_clob_value(table_name, KEY)) IS NOT NULL
               AND UPPER(TRIM(SPECIFICATION)) IN (UPPER(TRIM('program'))))
     WHERE UPPER(TRIM(RESULT)) = UPPER(TRIM('COMPUTER SCIENCE'))
     ORDER BY RESULT ASC;
    The output of the query would be
    gre_code    toefl_code   titleid
    402         78           5518
    402         78           5519
    402         78           5520
    402         78           5521
    The output should be
    402         78           any titleid

    Some simplified code:
    SELECT grecode(titleid) gre_code,
           toeflcode(titleid) toefl_code,
           min(titleid) titleid
    FROM   (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name,KEY)))) RESULT,
                   titleid
            FROM   mcp_specifications a
                   JOIN mcp_title_specifications b
                        ON a.specificationid = b.specificationid
                   JOIN mcp_titles c
                        ON b.titleid = c.titleid
            WHERE  b.is_parent = 'F'
            AND    UPPER(TRIM(c.university_state)) = 'USA'
            AND    TO_CHAR (get_clob_value (table_name, KEY)) IS NOT NULL
            AND    UPPER(TRIM(SPECIFICATION)) = 'PROGRAM')
    WHERE  UPPER(TRIM(RESULT)) = 'COMPUTER SCIENCE'
    GROUP BY grecode(titleid),
             toeflcode(titleid)
    Please note that applying functions like UPPER and TRIM on a string literal can and should be avoided.
    For example:
    UPPER(TRIM('USA')) = 'USA'
    Why force the database to do both an UPPER and a TRIM on something that can just be represented in uppercase with no surrounding spaces? It's a waste of time.

  • How to avoid duplicates in export?

    Hi
    Is there any way to avoid duplicates when exporting from LR4, i.e. avoid exporting the same picture (to the same folder) again?
    Kindly
    Jan

    Rob Cole wrote:
    I stand corrected - thanks Jim .
    I think I had forgotten this because of how I use publish services / collections.
    All of my publish services have exactly one (smart) collection, which defines the photos to be published.
    I don't create a multitude of publish collections in order to define an associated publish tree - the tree is defined by the source folders. I started this convention before Lr was a glimmer in Adobe's eyes (even before digital photography was invented). If I was inventing now from scratch I might do it differently, but this is one reason I get aggravated when some of the experts in this forum continually "forget" that the need to maintain a prescribed convention is sometimes an absolute requirement (or at least *highly* desirable), and Lr should be able to adapt to the convention - and not the other way around.
    If I quit Lr today, I could still maintain published trees without Lr's publishing collections. If your scheme depends on publishing collections which have no visibility outside Lightroom (e.g. a multitude of hard drive publishing collections), then you'd be screwed (so to speak) if you wanted to migrate to other software for maintenance. Not only that, but if you rebuild your catalog, all such collections are lost (unless you know how to use plugins to preserve them). Even if your scheme depends on regular (non-publishing, I mean, whether smart or not) collections, and you use jf's Collection Publisher to publish in a matching hierarchy, you'd still be screwed when migrating, since those collections do not exist outside Lightroom.
    Impact may vary of course, but I like to minimize dependence on a specific piece of software if possible.
    Folders exist regardless of which software you use to edit your photos. Put another way: collection hierarchies are proprietary, folder structure isn't.
    So, although lots of people prefer jf's Collection Publisher (understandably), it's worth considering jf's Folder Publisher too, or my very own TreeSync Publisher.
    Cheers,
    Rob

  • How to avoid duplicate BOM Item Numbers?

    Hello,
    is there a way to avoid duplicate BOM Item Numbers (STPO-POSNR) within one BOM?
    For Routings I could avoid duplicate Operation/Activity Numbers with transaction OP46 by setting T412-FLG_CHK = 'X' for Task List Check. Is there an equivalent for BOMs?
    Regards,
    Helmut Gante

  • How to avoid duplicates in CROSS JOIN Query

    Hi,
    I am using CROSS JOIN to get all the subsets of a table column's values as shown below:
    PRODUCT (Col Header)
    Bag
    Plate
    Biscuit
    While doing the cross join we will get:
    Bag Bag
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Plate
    Plate Biscuit ..... like this
    By placing the where condition prod1 <> prod2 I avoid the Bag Bag and Plate Plate values. So the output will be like below:
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Biscuit
    Now "Bag Plate" and "Plate Bag" are the same combination; how to avoid these records? My expected result is
    Bag Biscuit
    Plate Biscuit
    How to derive this?
    Sridhar

    Hi,
    This is the solution that I found to fit the OP's question, but
    Visakh16 already posted the same idea (assuming the names are unique) from the start and I don't think that anyone noticed it!
    Sridhar.DPM, did you check Visakh16's response (the second response received)?
    I will mark his response as an answer. If this is not what you need, please clarify and you can unmark it :-)
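    For readers landing here, the idea referred to above (presumably what Visakh16 suggested, assuming the product names are unique) is to keep each unordered pair only once by requiring the first value to sort strictly before the second, i.e. a condition along the lines of prod1 < prod2 instead of prod1 <> prod2. A minimal sketch of that filter outside SQL; the product list and class name are purely illustrative:

    import java.util.List;

    public class PairCombinations {
        public static void main(String[] args) {
            List<String> products = List.of("Bag", "Plate", "Biscuit");
            // Keep a pair only when the first value sorts strictly before the second;
            // this drops the self-pairs ("Bag Bag") and the mirrored duplicates
            // ("Plate Bag" is dropped because "Bag Plate" is already kept).
            for (String a : products) {
                for (String b : products) {
                    if (a.compareTo(b) < 0) {
                        System.out.println(a + " " + b);
                    }
                }
            }
        }
    }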

  • How to avoid duplicates in iPhoto 5?

    I have iPhoto 5.0.4 with Leopard on a G4. When I attach my digital camera, all the photos from my camera upload. I keep some family pics on my camera all the time, so the same ones duplicate themselves every time I upload new photos. How can this be avoided? Thanks.

    There is a way to do this.
    If your camera mounts on the Desktop as an external device, then use iPhoto's File > Import to Library command. Navigate to the camera, preview the photos, select just the ones you want, and import directly into iPhoto.
    If your camera does not mount on the Desktop, like most Canon models, you can place the camera card into a USB card reader. That will mount on the Desktop, and you can control your imports as described above. It also gives you the advantage of not having to worry about your camera's battery power during the import.
    Regards.

  • How to avoid duplicate measures in reports due to case functions?

    Hi,
    If I create a report using a dimension called insert_source_type, where the next level in the dimension hierarchy would be insert_source, and I do not put in any formula, I get a report where I can drill down on insert_source_type and see the insert_source values.
    If I use a function like (CASE "Ins Source"."Ins Source Type" WHEN 'OWS' THEN 'WEB' ELSE "Ins Source"."Ins Source Type" END) and change the label of insert_source_type to Channel Group instead, then when
    I drill down on Channel Group, it goes to insert_source_type and from there I can drill down to insert_source.
    There is one insert_source_type level too many!
    How can this be avoided?
    Thanks and Regards
    Giuliano

  • How to avoid duplicates in LOV

    Hi, I'm using the search query component (11g) and for the search fields I'm adding an LOV. In the UI I can see the LOV with values from the table, but I need to avoid the duplicates in it. I looked at the docs and the demo and couldn't figure out anything. Could someone point me to a resource? Thanks.

    How do you create the LOV then?
    To make a view object with a lov you need two viewObjects - let us say viewObject and viewObjectLOV.
    viewObject is updatable, viewObjectLOV is not.
    Your problem is that viewObjectLOV returns duplicate values.
    1. find where is the viewObjectLOV, you can reach it via viewObject's accessor
    2. make sure it has a primary key.
    3. modify viewObjectLOV's query so it does not return duplicate values, for example by using the DISTINCT keyword.
    0. If you did not understand what I tried to explain, maybe you should read some more documentation first :)

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this datasource are the same as the settings of 0COSTCENTER_ATTR. The problem is that when loading to BW it seems that validity (DATEFROM and DATETO) is not taken into account. If there is a cost center with several entries having different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR to have one datasource instead of two is not an option.
    I know that you can set ignore duplicates in the infopackage, but that is not a nice solution. 0COSTCENTER_ATTR can run without this!
    Is there a trick you know to tell the system that the date fields are also part of the key??
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
    Siggi - I don't have the error message described in the note.
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In PSA the records are marked red with the same message (MSG no 191).
    As you see the key does not contain the date when the record is valid. How do I add it? How is it working for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 or on the BW side?
    Thanks
    Peter
