Dynamically defined output log file in log4j
I am trying to generate an output log file dynamically using log4j. I have a standard log4j XML configuration file with an appender configured. In my Java code I am trying to get the appender from the logger and then change its output file:
<code>
Logger logger = Logger.getLogger(ClassFSTest.class);
FileAppender rfa = (FileAppender) logger.getAppender("rolling");
rfa.setFile(myFileName);
rfa.activateOptions();
</code>
However, the getAppender method returns null, even though I have an appender named "rolling" in my configuration. And if I use the standard output mechanism like logger.info(), the message is written to the defined file.
thanks in advance.
I hope the following link addresses your problem:
http://cognitivecache.blogspot.com/2008/08/log4j-writing-to-dynamic-log-file-for.html
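For completeness (in case the link goes stale): in log4j 1.x, getAppender() only returns appenders attached directly to that logger. An appender declared under the root element of the configuration lives on the root logger, which is why the lookup above returns null. A minimal sketch of the fix, assuming log4j 1.x and a file-based appender named "rolling" attached to the root logger:

```java
import org.apache.log4j.FileAppender;
import org.apache.log4j.Logger;

public class DynamicLogFile {
    static void redirectRollingAppender(String myFileName) {
        // Fetch the appender from the root logger, where the XML config
        // attached it; Logger.getLogger(SomeClass.class).getAppender("rolling")
        // would return null because the child logger has no appenders of its own.
        FileAppender rfa = (FileAppender) Logger.getRootLogger().getAppender("rolling");
        if (rfa != null) {
            rfa.setFile(myFileName);   // the dynamically chosen path
            rfa.activateOptions();     // close the old file and open the new one
        }
    }
}
```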
Similar Messages
-
In R12 how to change concurrent output/log file name prefix?
how to change concurrent output/log file name prefix?
but I want to change the concurrent output/log file name prefix.
You cannot, and I believe it is not supported for all concurrent requests -- please log an SR to confirm this with Oracle Support.
Thanks,
Hussein -
Getting empty log files with log4j and WebLogic 10.0
Hi!
I get empty log files with log4j 1.2.13 and WebLogic 10.0. If I don't run the application in the application server, then the logging works fine.
The properties file is located in a jar in the LIB folder of the deployed project. If I change the name of the log file name in the properties file, it just creates a new empty file.
What could be wrong?
Thanks!
I assume that when you change the name of the expected log file in the properties file, the new empty file has that name, correct?
That means you're at least getting that properties file loaded by log4j, which is a good sign.
As the file ends up empty, it appears that no logger statements are being executed at a level that passes the configured threshold. Can you throw in a logger.error() call at a point you're certain is executed? -
Multiple log files using Log4j
Hello,
I want to generate log files based on package structure, e.g. com.temp.test logging to test.log, while the application as a whole logs to app.log.
My requirement is that whatever is logged to test.log should not also be logged to app.log. This is my log4j.properties file:
# Log4j configuration file.
# Available levels are DEBUG, INFO, WARN, ERROR, FATAL
# Default logger
log4j.rootLogger=DEBUG, PFILE
log4j.logger.com.temp.test=DEBUG,TEST
# PFILE is the primary log file
log4j.appender.PFILE=org.apache.log4j.RollingFileAppender
log4j.appender.PFILE.File=./App.log
log4j.appender.PFILE.MaxFileSize=5120KB
log4j.appender.PFILE.MaxBackupIndex=10
#log4j.appender.PFILE.Threshold=DEBUG
log4j.appender.PFILE.layout=org.apache.log4j.PatternLayout
log4j.appender.PFILE.layout.ConversionPattern=%p %d[%l][%C] %m%n
#log4j.appender.PFILE.layout.ConversionPattern=%p %d %m%n
log4j.appender.TEST=org.apache.log4j.RollingFileAppender
log4j.appender.TEST.File=./test.log
log4j.appender.TEST.MaxFileSize=5120KB
log4j.appender.TEST.MaxBackupIndex=10
log4j.appender.TEST.layout=org.apache.log4j.PatternLayout
log4j.appender.TEST.layout.ConversionPattern=%p %d[%l][%C] %m%n
Can you help me?
You have to configure the com.temp.test logger so that it does not pass its events on to the root logger.
For this, you can use the additivity flag.
# Default logger
log4j.rootLogger=DEBUG, PFILE
log4j.additivity.com.temp.test=false
log4j.logger.com.temp.test=DEBUG,TEST
The rest of the file remains the same. -
Hi,
We are invoking the financialUtilService web service using an HTTP proxy client to upload data files, submit ESS jobs, and get ESS job log files. We are able to successfully upload and submit the ESS job, but downloadESSJobExecutionDetails is not uploading the log/out files to UCM.
List<DocumentDetails> docDetails = financialUtilService.downloadESSJobExecutionDetails(requestId.toString(), "log");
// List<DocumentDetails> docDetails1 = financialUtilService.downloadExportOutput(requestId.toString());
System.out.println("Ess Job output:" + docDetails);
for (DocumentDetails documentDetails : docDetails) {
    System.out.println("Account: " + documentDetails.getDocumentAccount().getValue());
    System.out.println("File Name: " + documentDetails.getFileName().getValue());
    System.out.println("Document Title: " + documentDetails.getDocumentTitle().getValue());
    System.out.println("DocumentName " + documentDetails.getDocumentName().getValue());
    System.out.println("ContentType " + documentDetails.getContentType().getValue());
}
Below output is returned:
Ess Job status:SUCCEEDED
Ess Job output:[com.oracle.xmlns.apps.financials.commonmodules.shared.financialutilservice.DocumentDetails@5354a]
Account: fin$/payables$/import$
File Name: null
Document Title: Uma Test Import
DocumentName: 84037.zip
ContentType zip
Hey,
We have the same problem. On calling `downloadESSJobExecutionDetails`, a zipfile is returned, but it only contains my original upload file, not any logs. Although, if I call `downloadESSJobExecutionDetails` on a dependent child subprocess, the log is included.
I also reported this to Oracle support (SR 3-10267411981) - they referred me to a known bug, to be fixed in v11: https://bug.oraclecorp.com/pls/bug/webbug_edit.edit_info_top?rptno=20356187 (not publicly accessible though).
Is there any work around available? -
Dynamic bad and Log file names!!
Hi,
I have a scenario where the data file name is passed as a parameter to the mapping, and the mapping has a pre-mapping procedure that changes the file name in the external table definition. This works fine. The remaining issue is that I would also like to change the bad file, log file, and discard file names through this procedure. In other words, for each incoming data file I would like to create a bad, log, and discard file. How do I accomplish this in the present procedure, which executes the following ALTER TABLE statement:
execute immediate 'alter table '||p_table_name||' location('''
||p_file_name||''')';
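One possible extension, sketched here: external tables also accept an ALTER TABLE ... ACCESS PARAMETERS clause, which can repoint the bad/log/discard files. Note that this replaces the entire access parameters clause, so the record and field settings your external table was created with must be repeated (the ones below are placeholders, not taken from the original definition):

```sql
-- Sketch: repoint bad/log/discard files per incoming data file.
-- The records/fields settings here are placeholders; repeat whatever
-- your external table was actually created with.
execute immediate
     'alter table ' || p_table_name || ' access parameters ('
  || ' records delimited by newline'
  || ' badfile ''' || p_file_name || '.bad'''
  || ' logfile ''' || p_file_name || '.log'''
  || ' discardfile ''' || p_file_name || '.dsc'''
  || ' fields terminated by '','''
  || ' )';
execute immediate 'alter table ' || p_table_name
  || ' location (''' || p_file_name || ''')';
```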
Any suggestion would be greatly appreciated
Thanks
Balaji -
Java binding - output log file
I have successfully implemented a simple java binding interface, but I am having some issues with the java class and I would like to do some debugging.
I was under the assumption that System.out.println statements in the code would simply be redirected to the default domain.log; however, I don't see any of my output.
Does anyone know how I should go about debugging the external class? I am sure that using log4j would be more appropriate, but for now, I just want some simple output to let me know the health and status.
Thanks in advance.
Solved by adjusting the opmn config output to redirect stdout.
-
How to Specify the Log File address as RootDir/Logs/Error.log in Log4j
I have a web application,
How do I configure the log4j rolling file appender in it?
I want the logs to be redirected to RootDir/logs/Error.log.
Currently, I'm using the following configuration:
# This file must live on the classpath of the jvm
# Set root logger level to DEBUG and log to both stdout and rollingFile appenders
# (see below for their definitions)
# The set of possible levels are: DEBUG, INFO, WARN, ERROR and FATAL
log4j.rootLogger=INFO, stdout, R
####### Console appender ######
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
# Pattern to output the caller's file name and line number.
# The pattern: Date Priority (Filename:Line Number) - Message\n
log4j.appender.stdout.layout.ConversionPattern=%d %-5p (%F:%L) - %m%n
#### Second appender writes to a file
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=Error.log
# Control the maximum log file size
log4j.appender.R.MaxFileSize=100KB
# Archive log files (one backup file here)
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d %-5p (%F:%L) - %m%n
Thanks for your replies in advance.
Please follow these steps; it will work.
To enable log4j logging to a file on lunar pages do the following:
1) Create a servlet class that will initialize log4j. Here is
the code:
package logging;
import org.apache.log4j.PropertyConfigurator;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class Log4jInit extends HttpServlet {
    public void init() {
        String prefix = getServletContext().getRealPath("/");
        String file = getInitParameter("log4j-init-file");
        // if the log4j-init-file is not set, then no point in trying
        if (file != null) {
            PropertyConfigurator.configure(prefix + file);
        }
    }

    public void doGet(HttpServletRequest req, HttpServletResponse res) {
    }
}
2) In your web.xml file add the following entry:
<servlet>
<servlet-name>log4j-init</servlet-name>
<servlet-class>logging.Log4jInit</servlet-class>
<init-param>
<param-name>log4j-init-file</param-name>
<param-value>WEB-INF/classes/log4j.lcf</param-value>
</init-param>
<load-on-startup>1</load-on-startup>
</servlet>
3) Create a log4j.lcf file located in your WEB-INF/classes directory with the following entries:
log4j.rootLogger=debug, R
# yourdirectory below is where your site lives.
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=/home/yourdirectory/public_html/logs/out.log
log4j.appender.R.MaxFileSize=500KB
# Keep one backup file
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d %-5p [%c] %m%n
4) Add logging to your class. Here is an example:
import org.apache.log4j.Logger;
public class Example {
    static Logger logger = Logger.getLogger(Example.class);

    public void doSomething() {
        logger.debug("testing logging");
    }
}
5) Make sure you have the log4j-1.2.8.jar file located in your WEB-INF/lib directory.
That should do it! -
Log4j problem for backing up the log file
This is my log4j.properties. It doesn't seem to back up the log file and create a new log file when it reaches the MAX size. Can anybody look at it?
Thanks..
log4j.rootCategory=debug, stdout, R
# Print only messages of priority WARN or higher for your category
log4j.category.your.category.name=WARN
# Specifically inherit the priority level
#log4j.category.your.category.name=INHERITED
#### First appender writes to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
# Pattern to output the caller's file name and line number.
log4j.appender.stdout.layout.ConversionPattern=[%d] [%c] %-5p - %m%n
#### Second appender writes to a file
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=/opt/mc/logs/AMSWAS.log
# Control the maximum log file size
log4j.appender.R.MaxFileSize=200000KB
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=[%d] [%c] %-5p - %m%n
Hello again Tom, and thanks for your help!
No, I didn't optimize any media though I've rendered proxies for all media. I have about 14TB of R3D footage on five external harddrives, it's a feature. The proxies and sound files which I copied to the library makes the library file that big. I can't consolidate all media, as far as I know, since I don't have a drive even close to the size of all the R3D footage.
In terms of copying I have only tried the good old "copy/paste" method. I have never used any of those programs you mentioned. Can those programs be used to copy certain files or will they copy an entire drive?
Will the automatic back-ups FCPX does every 15 minutes save my timelines if something were to go wrong and the library file were to disappear? I don't fully understand how that back-up process works. I could always render new proxies, though it would take time, but re-editing all those timelines is a whole other thing. Important to note here is that I'm used to Premiere Pro and the "old" FCP, which is why all of this is so confusing.
Thank you again! -
How to create outside Logging file in osb 11g using log4j.jar?
Hi all,
Currently, I am using OSB 11g to develop a system. In the system we need to create a log file using the log4j.jar library. This sub-program works on the OSB 10g base but fails on the OSB 11g base. Can anyone give me some advice about this? Has anyone created one like this in 11g? Was it successful?
Sorry, the path was missing from the above request:
path="\\192.168.0.14\c$\LOG\d9\May_08_2008_log.txt ";
please help.
Saravanan.K -
Different log file name in the Control file of SQL Loader
Dear all,
I get every day 3 log files with ftp from a Solaris Server to a Windows 2000 Server machine. In this Windows machine, we have an Oracle Database 9.2. These log files are in the following format: in<date>.log i.e. in20070429.log.
I would like to load this log file's data to an Oracle table every day and I would like to use SQL Loader for this job.
The problem is that the log file name is different every day.
How can I give this variable log file name in the Control file, which is used for the SQL Loader?
file.ctl
LOAD DATA
INFILE 'D:\gbal\in<date>.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
Do you have any better idea about this issue?
I thought of renaming the log file to a constant name, such as in.log, but how can I distinguish the desired log file from the other two?
Thank you very much in advance.
Giorgos Baliotis
I don't have a direct solution for your problem.
However, if you invoke SQL*Loader from an Oracle stored procedure, it is possible to set the control/log file dynamically.
# Grant privileges to the user to execute command-prompt statements
BEGIN
dbms_java.grant_permission('bc4186ol','java.io.FilePermission','C:\windows\system32\cmd.exe','execute');
END;
* Procedure to execute operating system commands using PL/SQL (an Oracle script making use of Java packages):
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "Host" AS
import java.io.*;
public class Host {
    public static void executeCommand(String command) {
        try {
            String[] finalCommand = new String[4];
            finalCommand[0] = "C:\\windows\\system32\\cmd.exe";
            finalCommand[1] = "/y";
            finalCommand[2] = "/c";
            finalCommand[3] = command;
            final Process pr = Runtime.getRuntime().exec(finalCommand);
            // Drain stdout on its own thread so the process does not block.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_in = new BufferedReader(new InputStreamReader(pr.getInputStream()));
                        String buff = null;
                        while ((buff = br_in.readLine()) != null) {
                            System.out.println("Process out :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process output.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
            // Drain stderr the same way.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_err = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
                        String buff = null;
                        while ((buff = br_err.readLine()) != null) {
                            System.out.println("Process err :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process error.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
        } catch (Exception ex) {
            System.out.println(ex.getLocalizedMessage());
        }
    }

    public static boolean isWindows() {
        return System.getProperty("os.name").toLowerCase().indexOf("windows") != -1;
    }
}
* Oracle wrapper to call the above procedure
CREATE OR REPLACE PROCEDURE Host_Command (p_command IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'Host.executeCommand (java.lang.String)';
* Now invoke the procedure with an operating system command (execute SQL*Loader).
* The execution of the script ensures the Prod mapping data file is loaded into the PROD_5005_710_MAP table.
* Change the control/log/discard/bad files as appropriate.
BEGIN
Host_Command (p_command => 'sqlldr system/tiburon@orcl control=C:\anupama\emp_join'||1||'.ctl log=C:\anupama\ond_lists.log');
END;
Does that help you?
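The fixed file names in the sqlldr command above can be made day-specific in the calling code. A minimal Java sketch of building the command for the in&lt;yyyyMMdd&gt;.log naming convention from the question (credentials and paths are placeholders); note that SQL*Loader's data= parameter overrides the INFILE clause, so the control file itself never needs editing:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;

public class LoaderCommand {
    /** Day-specific data file name, e.g. in20070429.log. */
    static String dataFileName(Date day) {
        return "in" + new SimpleDateFormat("yyyyMMdd").format(day) + ".log";
    }

    /** Full sqlldr command string for Host_Command; user/paths are placeholders. */
    static String sqlldrCommand(Date day) {
        String stamp = new SimpleDateFormat("yyyyMMdd").format(day);
        // data= overrides the INFILE in the control file; log= and bad=
        // give each day's run its own loader log and bad file.
        return "sqlldr userid=scott/tiger@orcl"
             + " control=D:\\gbal\\file.ctl"
             + " data=D:\\gbal\\in" + stamp + ".log"
             + " log=D:\\gbal\\load_" + stamp + ".log"
             + " bad=D:\\gbal\\load_" + stamp + ".bad";
    }

    public static void main(String[] args) {
        Date apr29 = new GregorianCalendar(2007, Calendar.APRIL, 29).getTime();
        System.out.println(dataFileName(apr29));
        System.out.println(sqlldrCommand(apr29));
    }
}
```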
Regards,
Bhagat -
Regarding Log4j.xml to add timestamp in log file
Dear Sir,
Could you guide me on how to append a timestamp to the log file generated from Log4j.xml? This is my Log4j.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/">
<!-- Order of child elements is appender*, logger*, root?. -->
<!-- Appenders control how logging is output. -->
<appender name="CM" class="org.apache.log4j.FileAppender">
<param name="File" value="customer_master.log"/>
<param name="Threshold" value="DEBUG"/>
<param name="Append" value="true"/>
<param name="MaxFileSize" value="1MB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<!-- {fully-qualified-class-name}:{method-name}:{line-number}
- {message}{newline} -->
<param name="ConversionPattern" value="%C:%M:%L - %m%n"/>
</layout>
</appender>
<appender name="stdout" class="org.apache.log4j.ConsoleAppender">
<param name="Threshold" value="INFO"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%C:%M:%L - %m%n"/>
</layout>
</appender>
<!-- Logger hierarchy example:
root - com - com.ociweb - com.ociweb.demo - com.ociweb.demo.LogJDemo
-->
<!-- Setting additivity to false prevents ancestor categories
for being used in addition to this one. -->
<logger name="com.tf" additivity="true">
<priority value="DEBUG"/>
<appender-ref ref="CM"/>
</logger>
<!-- Levels from lowest to highest are
trace, debug, info, warn, error, fatal & off. -->
<!-- The root category is used for all loggers
unless a more specific logger matches. -->
<root>
<appender-ref ref="stdout"/>
</root>
</log4j:configuration>
It would be great if you could give the solution for this. There are no problems getting the timestamp from the following Log4j.properties file:
# Configure the logger to output info level messages into a rolling log file.
log4j.rootLogger=DEBUG, R
log4j.appender.R=org.apache.log4j.DailyRollingFileAppender
log4j.appender.R.DatePattern='.'yyyy-MM-dd
# Edit the next line to point to your logs directory.
# The last part of the name is the log file name.
log4j.appender.R.File=c:/temp/log/${log.file}
log4j.appender.R.layout=org.apache.log4j.PatternLayout
# Print the date in ISO 8601 format
log4j.appender.R.layout.ConversionPattern=%d %-5p %c %L - %m%n
thanks in advance -
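For the XML side of the question: plain FileAppender has no DatePattern (and ignores MaxFileSize/MaxBackupIndex, which belong to RollingFileAppender), so switching the appender class to DailyRollingFileAppender and adding %d to the pattern should give the same behaviour as the properties file above. A sketch of the XML equivalent, assuming the appender name and file from the question:

```xml
<appender name="CM" class="org.apache.log4j.DailyRollingFileAppender">
  <param name="File" value="customer_master.log"/>
  <param name="DatePattern" value="'.'yyyy-MM-dd"/>
  <param name="Append" value="true"/>
  <layout class="org.apache.log4j.PatternLayout">
    <!-- %d prints the ISO 8601 timestamp on each line -->
    <param name="ConversionPattern" value="%d %-5p %C:%M:%L - %m%n"/>
  </layout>
</appender>
```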
Node.js loss of permission to write/create log files
We have been operating Node.js as a worker role cloud service. To track server activity, we write log files (via log4js) to C:\logs
Originally the logging was configured with size-based roll-over. e.g. new file every 20MB. I noticed on some servers the sequencing was uneven
socket.log <-- current active file
socket.log.1
socket.log.3
socket.log.5
socket.log.7
it should be
socket.log.1
socket.log.2
socket.log.3
socket.log.4
Whenever there is an uneven sequence, I realise the beginning of each file reveals the Node process was restarted. The Windows Azure event log further indicated that the worker role hosting mechanism found node.exe to have terminated abruptly.
With no other information to clue me in on what exactly was happening, I thought there was some fault with the log4js roll-over implementation (updating to the latest version did not help). I subsequently switched to date-based roll-over mode, saw that roll-over happened every midnight, and was happy with it.
However, some weeks later I realised the roll-over was (not always, but pretty predictably) only happening every alternate midnight:
socket.log-2014-06-05
socket.log-2014-06-07
socket.log-2014-06-09
And each file again revealed that at midnight the roll-over did not happen; node.exe was crashing again. Additional logging on uncaughtException and exit showed nothing, which seems to suggest node.exe was killed by external influence (e.g. a process kill), but it is unfathomable that anything in the OS would want to kill node.exe.
Additionally, having two instances in the cloud service, we observed both node.exe processes crashing within minutes of each other. Always. However, if the two server instances were brought up on different days, the crash "schedule" would be offset by the difference in instance launch dates.
Unable to trap more detail on what was going on, we tried a different logging library, winston. winston has the additional feature of logging uncaughtExceptions, so it was not necessary to log those manually. Since winston does not have date-based roll-over, we went back to size-based roll-over, which obviously meant no more midnight crashes.
Eventually, I spotted some random midday crash today. It did not coincide with size-based rollover event, but winston was able to log an interesting uncaughtException.
"date": "Wed Jun 18 2014 06:26:12 GMT+0000 (Coordinated Universal Time)",
"process": {
"pid": 476,
"uid": null,
"gid": null,
"cwd": "E:
approot",
"execPath": "E:\\approot
node.exe",
"version": "v0.8.26",
"argv": ["E:\\approot\\node.exe", "E:\\approot\\server.js"],
"memoryUsage":
{ "rss": 80433152, "heapTotal": 37682920, "heapUsed": 31468888 }
"os":
{ "loadavg": [0, 0, 0], "uptime": 163780.9854492 }
"trace": [],
"stack": ["Error: EPERM, open 'c:\\logs\\socket1.log'"],
"level": "error",
"message": "uncaughtException: EPERM, open 'c:\\logs\\socket1.log'",
"timestamp": "2014-06-18T06:26:12.572Z"
Interesting question: the Node process _was_ writing to socket1.log all along; why would there be a sudden EPERM error?
On restart it could resume writing to the same log file. Or in previous cases it would seem like the lack of permission to create a new log file.
Any clues on what could possibly cause this? On a "scheduled" basis per server? Given that it happens so frequently and in sync with sister instances in the cloud service, something is happening in the back scenes which I cannot put a finger to.
thanks
The melody of logic will always play out the truth. ~ Narumi Ayumu, Spiral
Hi,
It is strange. From your description, how many instances does your worker role have? Do you store the log file on the VM's local disk? To avoid this issue, the best choice is to store your log files in Azure Blob Storage; that way, all log files are kept in blob storage. For how to use Azure Blob Storage, please see this doc:
http://azure.microsoft.com/en-us/documentation/articles/storage-introduction/
Please try it.
If I misunderstood, please let me know.
Regards,
Will
-
Very weird issue with server logging when using log4j.properties file
I'm using log4j logging. In log4j.properties the root logger is set up to use the ServerLoggingAppender class so that all our application logs go to the main server logfile. In addition, there are several appenders defined for specific components, with output going to specific per-component log files.
Everything is fine until I launch the server console. At that point all of those per-component log files get wiped out (zero length) and some non-ASCII lines are written to at least one of them, after which the logs appear to be fine. The main server log file does not appear to be affected. (Because the root logger is set to "warn" level while the component-specific loggers are set to trace, the contents of these files differ; however, I tried disabling all the other appenders and turning the root logger up to trace, and that still did not re-create the problem in the main server log file.)
And here's the really weird part: if I use the same configuration, but in a log4j.xml file, the problem does not happen.
Figured it out.
We were passing in the log4j configuration as -Dlog4j.configuration=file:/<properties file>, and this was added to the command line for both the managed and admin servers. The problem is that the console app starts its own instance of log4j, and when it reads the configuration for the appenders it initializes or rolls over the files. At that point we have two JVMs accessing the same files, so some corruption is bound to happen.
I'm not clear why the .xml file made a difference, but earlier we had been passing the log4j configuration as a jar file placed in the domain/lib folder, so perhaps the designer reverted to that (placed the log4j.xml file in a jar in lib, and did not simply change the -Dlog4j.configuration=file:/ option). -
Configure log4j to write to different log files conditionally
Hi folks,
Is there a way log4j could be configured to write to multiple log files conditionally? Let me try to explain what I am trying to do.
I have two classes, DatabaseChecker and FTPChecker, which extend the Checker class. Within the Checker class there is a method logTestResult(CheckerType c, boolean isFailed, int retry, int isMaxRetryExceeded). Depending on the CheckerType (database or FTP), I need to write log output to different files (dbchecker.log or ftpchecker.log). How do I configure log4j to do this?
I've seen how to configure log4j based on classes from different packages, but I'm not sure about this one. Any clue would be much appreciated.
lgmqy2000 wrote:
I have two classes DatabaseChecker and FTPChecker extends Checker class. Within Checker class, there is a method logTestResult(CheckerType c, boolean isFailed, int retry, int isMaxRetryExceeded). Depending on the CheckerType(database or FTP), I need to write log outputs to different files (dbchecker.log or ftpchecker.log). How do i configure log4j to do this?
if (checker.isType(CheckerType.DATABASE)) {
dbLogger.info(someMessage);
} else if (checker.isType(CheckerType.FTP)) {
ftpLogger.info(someMessage);
} else {
defaultLogger.info(someMessage);
}
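Since log4j 1.x has no built-in conditional appender for this, the routing lives in code, as the reply shows: one logger per checker type, each with its own file appender and additivity disabled. A runnable sketch of the same idea using java.util.logging stand-ins (ListHandler below plays the role of the two file appenders so the routing can be observed in memory; all names are illustrative):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.logging.Handler;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class CheckerLogDemo {
    /** In-memory handler so routing can be observed without real files. */
    static class ListHandler extends Handler {
        final List<String> messages = new ArrayList<>();
        @Override public void publish(LogRecord r) { messages.add(r.getMessage()); }
        @Override public void flush() {}
        @Override public void close() {}
    }

    enum CheckerType { DATABASE, FTP }

    static final Logger DB_LOGGER  = Logger.getLogger("checker.db");
    static final Logger FTP_LOGGER = Logger.getLogger("checker.ftp");

    /** The dispatch idea from the reply: pick the logger from the checker type. */
    static void logTestResult(CheckerType type, String message) {
        (type == CheckerType.DATABASE ? DB_LOGGER : FTP_LOGGER).info(message);
    }

    /** Wire each logger to its own handler; with log4j these would be FileAppenders. */
    static Map<String, List<String>> demo() {
        ListHandler db = new ListHandler();
        ListHandler ftp = new ListHandler();
        DB_LOGGER.setUseParentHandlers(false);   // same role as log4j additivity=false
        FTP_LOGGER.setUseParentHandlers(false);
        DB_LOGGER.addHandler(db);
        FTP_LOGGER.addHandler(ftp);

        logTestResult(CheckerType.DATABASE, "db check failed, retry 2");
        logTestResult(CheckerType.FTP, "ftp check ok");

        Map<String, List<String>> out = new HashMap<>();
        out.put("db", db.messages);
        out.put("ftp", ftp.messages);
        return out;
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```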