XMLDB log-level configuration

I've been searching for a document that explains the different log-level settings that can be put into xdbconfig.xml, but I can't seem to find anything on this. The default level is set to 0 on install, which is what we currently have. We are being required to change this to log-level 1, but I don't know the implications of doing so.
Does anyone know where I can find more information regarding this? We have an Oracle 11.1.0.7 installation on AIX 7.1 -- any help would be appreciated.

Apparently the IRS does, once again, know more than I do ;-)
Seriously, though: the status of this feature has not changed since 2007.
The following is a feature that does work: http://www.liberidu.com/blog/2010/01/17/ora-31098-internal-event-to-turn-on-xdb-tracing
You can also enable listener logging/tracing, which will also record login attempts (and TCP/IP address and FQDN info, btw).
The XFiles demo XMLDB app also shows a way to program this feature via XDB events and such...
The needed info can probably also be retrieved by implementing Database Vault or the Database Firewall software (and yes, security comes at a cost).
http://www.oracle.com/technetwork/products/database-firewall/overview/index.html
Edited by: Marco Gralike on Sep 20, 2012 3:07 AM

Similar Messages

  • Apache Sling Logging Writer Configuration

    Hi,
    I'm having an issue where my custom log writer configuration is sometimes not picked up and used by CQ5. I've created a custom error log writer based on the example provided at http://helpx.adobe.com/cq/kb/HowToRotateRequestAndAccessLog.html, which I've installed on an author and replicated to the publishers, and everything seemed to be working correctly. The settings are:
    Log File: ../logs/error.log
    Number of Log Files: 5
    Log File Threshold: 200MB
    After installing this, the error logs rotated at 200MB resulting in error.log.0, error.log.1 etc as expected.
    However after rebuilds on the machines, this configuration was overwritten (as expected).  So I've installed and replicated the package again, but now the configuration is not taking effect.  I'm using the exact same config package as I was previously, but the logs don't seem to be rotating at all now (not even daily).  I've deleted the config from the Felix console on all authors and publishers, reinstalled on the authors, replicated to the publishers, then restarted the CQ5 service on all machines, but it's still not working.
    So I have a couple of questions about this:
    Is there somewhere else in CQ5 that might be overriding these log writer config settings?
    Is ../logs/error.log correct for the standard log file location?  A note in step 3 of Creating Your Own Loggers and Writers on http://dev.day.com/docs/en/cq/current/deploying/configure_logging.html states:
    Log writer paths are relative to the crx-quickstart/launchpad location.
    Therefore, a log file specified as logs/thelog.log writes to crx-quickstart/launchpad/logs/thelog.log.
    To write to the folder crx-quickstart/logs the path must be prefixed with ../ (as ../logs/thelog.log).
    So a log file specified as ../logs/thelog.log writes to crx-quickstart/logs/thelog.log.
         The configs in the log rotation example also use ../logs.  However when looking at the default/standard logging writer config, it uses logs/error.log.  Which one is correct?
    Any help on what's going on here would be appreciated!!
    Thanks,
    K

    Hi,
    I am answering your last question and this one here.
    1. Log configuration can be overridden at the project level by creating a config and overriding the factory "org.apache.sling.commons.log.LogManager.factory.config-<identifier>" and, if required, the writer "org.apache.sling.commons.log.LogManager.factory.writer-<identifier>". So please check whether you have configured a log at the project level. If you still cannot see it, recheck your log configuration in the Felix console at http://localhost:4502/system/console/slinglog, where you can see the entire log configuration for the CQ system.
    2. Both will work, but the location changes:
         i. CQ 5.5 - the log file is located under "crx-quickstart"; ../logs/<filename>.log will be created there.
         ii. CQ 5.4 - logs are created under "crx-quickstart\logs" and also under "crx-quickstart\launchpad\logs", so if you select ../logs/ it will go under "crx-quickstart\logs" only, but you can change where you want to store them. Again, you can see this configuration info at http://localhost:4502/system/console/slinglog
    3. You can also customize your logging by creating it at the project level and assigning the "identifier" (with all the other configuration parameters). For more information, refer to http://sling.apache.org/site/logging.html in addition to the links shared earlier.
    I hope the above helps you proceed. Please let me know if you need more information.
    Thanks,
    Pawan
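To make point 1 above concrete, here is a hedged sketch of a project-level OSGi factory configuration. The property names come from the Sling Commons Log documentation; the "myapp" identifier, logger name, file path, and the simplified key/value syntax are made-up examples, not canonical values:

```
# Hypothetical file: org.apache.sling.commons.log.LogManager.factory.config-myapp.config
org.apache.sling.commons.log.level="debug"
org.apache.sling.commons.log.file="../logs/myapp.log"
org.apache.sling.commons.log.file.number="5"
org.apache.sling.commons.log.file.size="200MB"
org.apache.sling.commons.log.names=["com.myproject"]
```

A config like this, deployed in a project-level config folder, would take precedence over the default logging configuration for the named loggers, which is why a stray project-level config can override the one installed via package.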

  • Setting logging level

    Hello folks;
    Is there a way to set the logging threshold for log messages sent to
    the weblogic log? I see where to adjust logging threshold for the
    standard out, but this level doesn't seem to affect the threshold for
    messages getting to the physical logfile. Unlike the log4j classes
    that seem to back up weblogic's logging ( I checked this out by
    looking in the weblogic.jar file); there are no methods for setting
    the logging threshold on weblogic's log classes.
    Cheers-

    I was mistaken about the log4j stuff being used under the covers.
    However, changing the logging threshold does not seem to affect
    messages sent to the log for my server. I just tested this again by
    setting the log level to "CRITICAL" in the console. Sure enough the
    messages sent to the standard out console are affected, but the
    messages sent to the physical log are not affected. I still get Info,
    Notice, etc messages there--- but nearly no messages to the standard
    out console.
    This was done on version 7.0 of weblogic.
    Cheers--
    "Jane Sampson" <[email protected]> wrote in message news:<3e5a84b4$[email protected]>...
    Hi Steve,
    If you set the logging level in the Logging tab on Server Configuration, it
    DOES affect the level of messages that are logged to both stdout and the
    physical log. Are you sure you are looking at the correct log file?
    Also, none of the weblogic classes use log4j. What version did you find
    log4j classes included in the weblogic.jar??
    Jane
    BEA Support

  • Could you tell me what's the meaning of the logfile-path and log-level?

    We are running a production XML database, but it is not stable at the moment. Sometimes it reports a resource conflict error when xmldb is accessed via the HTTP protocol. I read the database logs and listener logs, but found no abnormal messages. So I want to find more information from the xmldb logs. When I run select DBMS_XDB.cfg_get().getclobval() from dual, I see several logfile-path and log-level tags, and I guess these relate to xmldb logs. Does anyone know what these tags mean?

    I have often wondered about that one too. I haven't had a chance yet to investigate (but maybe Mark will elaborate a little here); my guess is that it is, or will be, a way to enable tracing for the protocols or servlets.
    It looks like, if you enable it, it will trace to the XML file defined in xdbconfig.xml. I also guess (because there is also an XSD counterpart) that one could create a resource that streams the errors into an XDB ftp or http or ... XMLType table based on these settings.
    That would be great, because it would mature the protocol server's functionality. You could enable the tracing and see what happens. For now the documentation doesn't give much extra insight...
    <!-- FTP specific -->
    <element name="ftpconfig">
      <complexType><sequence>
        <element name="ftp-port" type="unsignedShort" default="2100"/>
        <element name="ftp-listener" type="string"/>
        <element name="ftp-protocol" type="string"/>
        <element name="logfile-path" type="string" default="/sys/log/ftplog.xml"/>
        <element name="log-level" type="unsignedInt" default="0"/>
        <element name="session-timeout" type="unsignedInt" default="6000"/>
        <element name="buffer-size" default="8192">
          <simpleType>
            <restriction base="unsignedInt">
              <minInclusive value="1024"/> <!-- 1KB -->
              <maxInclusive value="1048496"/> <!-- 1MB -->
            </restriction>
          </simpleType>
        </element>
        <element name="ftp-welcome-message" type="string" minOccurs="0" maxOccurs="1"/>
      </sequence></complexType>
    </element>
    <!-- HTTP specific -->
    <element name="httpconfig">
      <complexType><sequence>
        <element name="http-port" type="unsignedShort" default="8080"/>
        <element name="http-listener" type="string"/>
        <element name="http-protocol" type="string"/>
        <element name="max-http-headers" type="unsignedInt" default="64"/>
        <element name="max-header-size" type="unsignedInt" default="4096"/>
        <element name="max-request-body" type="unsignedInt" default="2000000000" minOccurs="1"/>
        <element name="session-timeout" type="unsignedInt" default="6000"/>
        <element name="server-name" type="string"/>
        <element name="logfile-path" type="string" default="/sys/log/httplog.xml"/>
        <element name="log-level" type="unsignedInt" default="0"/>
        <element name="servlet-realm" type="string" minOccurs="0"/>
        ...etc...

  • How to change log level in log4j at runtime

    Hi,
    I have a small requirement: the default log level is info, but if the user invokes the Java program with debug level, debug logging should turn on. I am trying to achieve this with this code:
              Logger root = Logger.getRootLogger();
              root.setLevel(Level.DEBUG);
              log.debug("This is debug");
    When I do this, the debug statement at the end does not print.
    Then I thought maybe the root logger level is not propagated after the initialization at start-up, so I tried this:
              log.setLevel(Level.DEBUG);
              log.info("This is info");
              log.debug("This is debug");
              log.info("Log level is " + log.getLevel());
              log.info(log.isDebugEnabled());
    The first info statement prints "This is info", the next prints "Log level is DEBUG", and the last prints "true", but the intermediate debug statement does not print.
    I have done this before in a web app and it worked just fine. I do not understand what is going on. Can someone please help me understand this?

    I am not really familiar with Log4J, but in java.util.logging you define a log level both on the Logger and on the Handler that is actually responsible for the output. A quick look at the Log4J site shows that these are called 'appenders' in Log4J. So you might want to check whether your appender is actually configured to output DEBUG-level statements.
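To make the logger-versus-handler split concrete, here is a minimal, runnable java.util.logging sketch (the JDK analogue of the Log4J situation above; the class name and logger name are made up for illustration):

```java
import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LevelDemo {
    public static void main(String[] args) {
        Logger log = Logger.getLogger("demo");
        log.setUseParentHandlers(false);   // keep the root handler out of the picture

        // The handler has its own threshold, independent of the logger's level.
        ConsoleHandler handler = new ConsoleHandler();
        handler.setLevel(Level.INFO);      // handler drops anything below INFO
        log.addHandler(handler);

        log.setLevel(Level.FINE);          // logger now accepts FINE records...
        log.fine("dropped by the handler"); // ...but the handler still rejects this

        handler.setLevel(Level.FINE);      // raise the handler threshold too
        log.fine("now actually printed");   // both gates open: this one appears
    }
}
```

The same two-gate model applies in Log4J: the logger's level admits the event, and the appender's threshold decides whether it is actually written.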

  • Wireless AirOS Global AP Syslog Level configuration command 7.4.121.0

    Hello
    I have a controller 5508 running on version 7.4.121.0. With the command "show ap config global" I can check the global AP syslog config:
    AP global system logging host.................... 0.0.0.0
    AP global system logging level................... informational
    Default the syslog host ip is 0.0.0.0. With the command ">config ap syslog host global x.x.x.x" I can configure the IP of the syslog server.
    Question:
    How can I configure the global syslog level?
    I searched in the command reference but there is no specific command to set the global AP syslog level.
    Thanks,
    Rolf

    Hi Rolf,
    Here is the command you required
    config ap logging syslog level <syslog_level> all   
    This post also should give you an idea how to configure syslog in different WLC platforms & how to analyze them using splunk
    http://mrncciew.com/2014/09/19/wlc-syslog-analysis/
    Pls mark the thread as "answered" if this is what you were looking for. 
    HTH
    Rasika

  • Logging level that will show when rules are added/changed/deleted?

    What level of logging on the ASA will make syslog show when a firewall rule has been changed? I know logging config events at the debugging level should capture it, but I don't want to put my firewall through that level of logging for everything.
    any help would be greatly appreciated!

    Hi,
    Would seem to me that you would be looking for Syslog messages with the following IDs
    111008 (level 5 = Notifications)
    111009 (level 7 = Debugging)
    111010 (level 5 = Notifications)
    Source:
    http://www.cisco.com/en/US/docs/security/asa/syslog-guide/logmsgs.html#wp4769400
    You can also change the level of a particular syslog ID without changing the global level configured for a given destination.
    Say you wanted the Debugging-level message above changed to the Notifications level; you would configure
    logging message 111009 level notifications
    I am not completely sure whether you would also need to add the following to specify how many such log messages can be generated and in what timeframe; there is also an "unlimited" option.
    logging rate-limit message 111009
    logging rate-limit message 111008
    logging rate-limit message 111010
    - Jouni

  • Set Logging level for my application

    I've got an web-based application (ADF/EJB) and I'm trying to configure the logging level.
    I'm using the Java Logging API and no matter what I change, short of hardcoding a call in to set the global logging level, I can't alter what level it is using. I've tried changing in the WebLogic console and all that does is change the logging level for WebLogic. I've tried adding properties files but they don't seem to be read.
    Any Ideas?

    Hey Mark,
    I have no idea if this will help, but I was wanting to print out only my DML operations on commit and I got this post on my thread.
    You could probably do this with -Djbo.debugoutput=adflogger and then fuss with the logging configuration, or perhaps by turning on JDBC logging: double-click your View Controller project -> Run/Debug/Profile -> Edit, and then type -Djbo.debugoutput=adflogger into your Java Options. But I don't know anything beyond that.
    Good luck.
    Will

  • RAID Level Configuration Best Practices

    Hi Guys ,
       We are building new Virtual environment for SQL Server and have to define RAID level configuration for SQL Server setup.
    Please share your thoughts for RAID configuration for SQL data, log , temppdb, Backup files .
    Files  RAID Level 
    SQL Data File -->
    SQL Log Files-->
    Tempdb Data-->
    Tempdb log-->
    Backup files--> .
    Any other configuration best practices are more than welcome, like memory settings at the OS level and LUN settings, as well as best practices for configuring SQL Server in Hyper-V with clustering.
    Thank you
    Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah

    Hi,
    If you can spend some extra bucks, you should go for RAID 10 for all files. As a best practice, keeping database log and data files on different physical drives gives optimum performance. Tempdb can be placed with the data files or on a different drive depending on usage. It is always good to use a dedicated drive for tempdb.
    For memory settings, please refer to this link for setting max server memory.
    You should monitor SQL Server memory usage using the counters below, taken from this link.
    SQLServer:Buffer Manager--Buffer Cache Hit Ratio (BCHR): If your BCHR is high (90 to 100), it indicates that you don't have memory pressure. Keep in mind that if somebody runs a query which requests a large number of pages, BCHR might momentarily drop to 60 or 70 or even less, but that does not mean memory pressure; it means the query requires a lot of memory and will take it. After the query completes, you will see BCHR rising again.
    SQLServer:Buffer Manager--Page Life Expectancy (PLE): PLE shows how long a page remains in the buffer pool; the longer it stays, the better. It is a common misconception to take 300 as a baseline for PLE, but that is outdated. I read in Jonathan Kehayias's book (Troubleshooting SQL Server) that this value was a baseline when SQL Server 2000 was current and the most RAM one would see was 4-6 GB. Now, with 200 GB of RAM in the picture, this value is no longer correct. He also gave a (tentative) formula for calculating it: take the base value of 300 presented by most resources, and then determine a multiple of this value based on the configured buffer cache size, which is the 'max server memory' sp_configure option in SQL Server, divided by 4 GB. So, for a server with 32 GB allocated to the buffer pool, the PLE value should be at least (32/4)*300 = 2400. So far this has served me well, so I would recommend you use it.
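The scaling formula above is plain arithmetic; as a quick sketch (the class name is made up, and 32 GB is just the example figure from the text):

```java
public class PleBaseline {
    // Scaled PLE baseline: 300 * ('max server memory' in GB / 4)
    static int pleBaseline(int maxServerMemoryGb) {
        return (maxServerMemoryGb / 4) * 300;
    }

    public static void main(String[] args) {
        // For 32 GB allocated to the buffer pool: (32/4)*300 = 2400
        System.out.println(pleBaseline(32));
    }
}
```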
    SQLServer:Buffer Manager--Checkpoint Pages/sec: This counter is important for spotting memory pressure, because if the buffer cache is small, many new pages need to be brought in and flushed out of the buffer pool; under load the checkpoint's work increases and it starts flushing out dirty pages very frequently. If this counter is high, your SQL Server buffer pool cannot cope with the incoming requests, and you need to increase it by increasing buffer pool memory, or by increasing physical RAM and then making adequate changes to the buffer pool size. Technically this value should stay low; on a perfmon line graph it should stay near the baseline for a stable system.
    SQLServer:Buffer Manager--Free Pages: This value should not be low; you always want to see a high value for it.
    SQLServer:Memory Manager--Memory Grants Pending: If you see memory grants pending, your server is facing a SQL Server memory crunch and increasing memory would be a good idea. For memory grants, please read this article:
    http://blogs.msdn.com/b/sqlqueryprocessing/archive/2010/02/16/understanding-sql-server-memory-grant.aspx
    SQLServer:Memory Manager--Target Server Memory: This is the amount of memory SQL Server is trying to acquire.
    SQLServer:Memory Manager--Total Server Memory: This is the memory SQL Server has currently acquired.
    For other settings I would like you to discuss with vendor. Storage questions IMO should be directed to Vendor.
    Below would surely be a good read
    SAN storage best practice For SQL Server
    SQLCAT best practice for SQL Server storage
    Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it.
    My TechNet Wiki Articles

  • Dynamic Logging level control in a grid/rac system

    How is the logging level dynamically configured in a grid/RAC-configured WebLogic system? I would like to control the logging level in a grid/RAC-configured system dynamically via a GUI. What are the steps for controlling the level? Or please direct me to the appropriate documentation.


  • Logging Level for Identity Server

    There is a path name configurable via the console, but not the log level. Within the AMConfig.properties file there is a com.iplanet.services.debug.level parameter, but it does not seem to control the level of data being logged into the "logs" directory. Are there any other places where there are logging level parameters? Changing this one has no effect on the files in the "logs" directory - only the "debug" directory.
    There are 4 main files in the logs directory - amAuthLog, amSSO.access, amPolicy.access and amAuthentication.access. Is there any configuration data for any of these files?
    Any help would be appreciated!
    Thanks

    The logging parameter you are referring to can be toggled on/off within AMConfig.properties (changing it requires a restart):
    com.iplanet.am.logstatus=ACTIVE
    HTH!
    Satish

  • Logging level warning not working.

    We have logging set to "warning" in our sessions.xml file, but we still see fine and finer logging lines. Is there anything we are missing?
    Any help is appreciated. We are on Toplink version 10.1.3
    Here are the lines from the sessions.xml file.
    <logging xsi:type="toplink-log">
    <log-level>warning</log-level>
    </logging>

    Amehta,
    I verified in our code that INFO is the default and that the level string is case-insensitive ("warning" == "WARNING").
    So it looks like FINER may be set somewhere else in your code.
    You may want to check out a previous post about a similar sessions.xml logging-level issue, where it is suggested that multiple sessions.xml files were in the classpath - if so, change the level in all of them.
    Also, if you are using JPA as well as native ORM, then change the level in persistence.xml as well.
    Where is logging configured?
    thank you
    /michael

  • Where can I set the log level for the "Inbox log file" ?

    From the Siebel 8 Bookshelf, it says :
    "To set the level of the Inbox log file for troubleshooting
    *In Siebel Tools, set the Log Level for the Inbox log file (Alias = InboxLog) to 5*."
    But where exactly in Siebel Tools can I find that Log Level ? Which object does the Siebel bookshelf talk about ?

    Hi,
    Log levels are not configured in Siebel Tools. You have to configure them with the Siebel client. You can find the parameter at "Administration - Server Configuration / Server / Events".
    Search for "Inbox General Log Events" and set this parameter to 5. I think this should help you.
    Cheers Andreas

  • Change opmn log level

    How do I change the opmn log level in OAS 10.1.3.3? I want to set it to debug level.
    Thanks
    AB

    Thanks. I looked at the following section, but I'm still not sure what exactly to change in opmn.xml:
    2.2 There used to be a level attribute in the <log-file> tag that I don’t find anymore. How do I define log levels?
    In 10.1.3.1.0, logging is configured by component codes rather than level codes. The logging messages contain the literal value based on logging levels rather than an integer value; for example: none, fatal, error, warn, notify, debug1, debug2, debug3, and debug4.
    All OPMN log messages of non-debug type (none, fatal, error, warn and notify) for the ONS and PM components go in opmn.log file. If debugging is enabled, then all OPMN debug log messages (of type debug1, debug2, debug3 and debug4) are written into opmn.dbg file.
    You can configure the following component codes for logged events for both log and debug:
    • internal: a log for the common internal information for OPMN
    • ons: a log for the ONS component information for OPMN
    • pm: a log for the PM component information for OPMN
    Both the ons and pm components consist of subcomponents that can also be configured. Refer to the Oracle Process Manager and Notification Server Administrator’s Guide 10g Release 3 (10.1.3.1.0) for the list of ons and pm subcomponents.
    I looked at http://download-west.oracle.com/docs/cd/B25221_04/core.1013/b15976/common.htm#g1044967
    and there is a section
    <debug>
    Required: false
    Default: see attributes
    Parents: <opmn>
    Attributes: path, comp, rotation-size, rotation-hour
    Can you tell me an example of how I can set it?
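Based on the attribute list quoted above (parent <opmn>; attributes path, comp, rotation-size, rotation-hour; component codes internal, ons, pm), a hedged sketch of what enabling debug logging might look like in opmn.xml; the path, separator, and rotation-size values here are illustrative guesses, not canonical, so check them against the OPMN Administrator's Guide:

```xml
<opmn>
  <!-- ... existing configuration ... -->
  <!-- Illustrative only: turn on OPMN debug logging for the ons and pm components -->
  <debug path="$ORACLE_HOME/opmn/logs/opmn.dbg" comp="ons;pm" rotation-size="1500000"/>
</opmn>
```

With debugging enabled this way, the debug1-debug4 messages should go to the opmn.dbg file described in the quoted FAQ, while non-debug messages continue to go to opmn.log.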

  • Increasing log level for saw.log

    Hi
    Please let me know the steps to increase the logging level for saw.log
    Thanks.

    In the enterprise manager goto:
    Business Intelligence >> coreapplication >> Diagnostics >>      Log Configuration
    Regards
    John
    http://obiee101.blogspot.com
    http://obiee11g.com
