Growing NSure audit log file in sys\etc\logcache

I have a NetWare 6.5 OES2 server that suddenly had a quickly growing file in the \sys\etc\logcache folder. The file has just recently stabilized, but I would like to shrink it. I am aware that this is part of NSure auditing and would like to leave that running. Can the files in this directory be deleted, or how do I go about shrinking or truncating them?
Thanks.

That would be OES, as OES2 only exists on Linux.
TID 10089097 seems to cover this in a general sense:
Configuring the PA on NetWare means configuring the eDirectory, Filesystem, and NetWare OS instrumentation. This is done at the NCP server object in iManager. From the eDirectory Administration task list, select Modify Object. Browse for the server object. This is the NCP server object in the tree, not the Secure Logging Server object in the Logging Services container. Click on the Nsure Audit tab. Below the tab, there will be links to the individual components: eDirectory, NetWare, and Filesystem.
Without having it installed myself, I would expect that you could reset log files and suchlike in there.

Similar Messages

  • The format of Audit log file

    We have a Perl script to extract data from audit log files (Oracle Database 10g Release 10.2.0.1.0), which have the format below.
    Audit file /u03/oracle/admin/NIKKOU/adump/ora_5037.aud
    Oracle Database 10g Release 10.2.0.1.0 - Production
    ORACLE_HOME = /u01/app/oracle/product/10.2.0
    System name:     Linux
    Node name:     TOYDBSV01
    Release:     2.6.9-34.ELsmp
    Version:     #1 SMP Fri Feb 24 16:54:53 EST 2006
    Machine:     i686
    Instance name: NIKKOU
    Redo thread mounted by this instance: 1
    Oracle process number: 22
    Unix process pid: 5037, image: oracleNIKKOU@TOYDBSV01
    Sun Jul 27 03:06:34 2008
    ACTION : 'CONNECT'
    DATABASE USER: 'sys'
    PRIVILEGE : SYSDBA
    CLIENT USER: oracle
    CLIENT TERMINAL:
    STATUS: 0
    After we updated the DB from Release 10.2.0.1.0 to Release 10.2.0.4.0, the format of the audit log file changed to something like the one below.
    Audit file /u03/oracle/admin/NIKKOU/adump/ora_1897.aud
    Oracle Database 10g Release 10.2.0.4.0 - Production
    ORACLE_HOME = /u01/app/oracle/product/10.2.0
    System name:     Linux
    Node name:     TOYDBSV01
    Release:     2.6.9-34.ELsmp
    Version:     #1 SMP Fri Feb 24 16:54:53 EST 2006
    Machine:     i686
    Instance name: NIKKOU
    Redo thread mounted by this instance: 1
    Oracle process number: 21
    Unix process pid: 1897, image: oracle@TOYDBSV01
    Tue Oct 14 10:30:29 2008
    LENGTH : '135'
    ACTION :[7] 'CONNECT'
    DATABASE USER:[3] 'SYS'
    PRIVILEGE :[6] 'SYSDBA'
    CLIENT USER:[0] ''
    CLIENT TERMINAL:[7] 'unknown'
    STATUS:[1] '0'
    Because we have to rewrite the Perl script, could anyone tell us where we can find the manual that describes the format of the audit log file?

    Oracle publishes views of the audit trail data. You can find a list of the views for the 11.1 database here:
    http://download.oracle.com/docs/cd/B28359_01/network.111/b28531/auditing.htm#BCGIICFE
    The audit trail does not really change between patchsets, as that would constitute underlying structure changes, and right now the developers are not allowed to change the underlying structure of tables in patchsets. But we can change what may be displayed in a column from patchset to patchset. For example, we are getting ready to update the comment$text field to display more information, such as dblinks and program names.
    I personally don't like overloading the comment$text field like that, but sometimes when you need the information, that is the only choice except to wait for the next major release :)
    As for the output of the audit log files, those can change between patchsets because of bugs that were found and some changes to support Audit Vault. My apologies to anyone out there who is reading the audit files written to the OS directly; I would recommend using the views.
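    To illustrate, here is a minimal query against one of those views (DBA_AUDIT_TRAIL, present in both 10g and 11g). Note that SYSDBA connection records like the ones in your post are, as far as I know, written only to the OS .aud files, so the views cover the standard audit trail:
    -- standard audit records from the last 24 hours, oldest first
    SELECT os_username, username, userhost, action_name, priv_used, returncode, timestamp
    FROM dba_audit_trail
    WHERE timestamp > SYSDATE - 1
    ORDER BY timestamp;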
    Hope that helps. Tammy

  • BOE XI 3.1 Removing Audit log files

    Hi there experts,
    we have an issue with our production BOE install (3.1 SP7) whereby we have over 39,000 audit log files awaiting processing in the BOE_HOME/auditing folder. These audit files were generated a few months back when we had an issue with the system whereby thousands of scheduled events were created; we are not sure how. The removal of these events has had a knock-on effect in that we have too many audit files to process, i.e. the system just can't process them all quickly enough.
    So my question is: can we just remove these audit files from the auditing directory with no knock-on effects? We don't need them loaded into the audit database anyway, as they are all multiples of the same event.
    As an aside, when we upgraded from SP3 to SP7 the problem went away, i.e. no new audit files for these delete events are being generated. We have yet to establish how/why these audit events were created, but for the time being we just want to be able to remove them. Unfortunately, as it's a production system, we don't want to just take a chance and remove them without some advice first.
    thanks in advance
    Scott

    Is your auditing running now, or still pending? Can you check in the audit DB what the max(audit_timestamp) is? This will tell you when the most recent activity happened.
    Deleting the audit files will not harm your BO system; you just will not be able to see auditing details for that period.
    Are the new auditing files being processed, or do you still see files created in the auditing folder without being processed?
    If an auditing file's size shows 0 KB, it means it was processed.
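    As a sketch of the max(audit_timestamp) check above, assuming a default XI 3.1 audit schema where events land in a table named AUDIT_EVENT with a Start_Timestamp column (please verify both names against your own audit DB):
    SELECT MAX(Start_Timestamp) AS last_audited_event
    FROM AUDIT_EVENT;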

  • Maximum number of events per audit log file must be greater than 0.

    BOE-XI (R2)
    Windows Server 2003
    Running AUDIT features on all services.
    Report Application Server (RAS) keeps giving the following error in the Windows Application Event Log.
    Maximum number of events per audit log file must be greater than 0.  Defaulting to 500.
    I am assuming that this is because the RAS is not being used by anyone at this time - and there is nothing in the local-audit-log to be copied to the AUDIT database.
    Is there any way to suppress this error...?
    Thanks in advance for the advice!

    A couple more reboots after applying service pack 3 seemed to fix the issue.
    Also had to go to IIS and set the BusinessObjects and CrystalEnterprise11 web sites to use ASP .NET 1.1 instead of 2.

  • Any software/program that can read audit log files

    Hi,
    Currently I am searching for a program/tool that can read audit log files and format them into a readable format. Does anyone know if there is any on the market, or any open source program?
    Thank You.

    Not sure what you mean by "audit log".
    Anyway. Pete Finnigan's tools page has only one thing that might be what you're looking for - LMON, which runs on BSD, Solaris, Linux. As he's the go-to guy for Oracle security, the chances of there being a good free log analyzer tool that he hasn't heard of are slight.
    Cheers, APC

  • Oblix v7 audit log file missing

    Hi,
    I'm using oblix v7.
    I have enabled audit logs and specified the file name as: C:\audit33.txt
    But on the machine there is no such file. It is somehow missing.
    The same configuration works on another machine.
    Any idea why the audit log file is missing?
    Thanks.
    Sash.

  • Bad date recorded by AccessServer in Audit Log File

    Hi all,
    I have installed OAM and configure Audit Log File to AccessServer:
    Access System Configuration >> Access Server Configuration >> and put ON "Audit to File"
    The log is recorded OK, but when I compare the date written in the log file with the OS date, there is a 6-hour difference:
    LOG FILE
    01/28/2009 *00:18:07* -0500 - AUTHZ_SUCCESS - GET - AccessServer - 192.168.3.105 - sec.biosnettcs.com/access/oblix/lang/en-us/msgctlg.js - cn=orcladmin,cn=Users,dc=biosnettcs,dc=com - 00:18:07 - http - AccessGate - - 2
    OS date
    # date
    mar ene 27 *18:18:15 CST* 2009
    # date -u
    mié ene 28 *00:18:23 UTC* 2009
    As we can see in these lines, the audit log is recording the date in UTC, but I need it in the timezone set in the OS.
    How can I do this (print the date in the audit log file with the same timezone set by the OS)?
    Thanks in advance,
    Julio

    Answering my own question:
    there is no way to set the date/time format to anything other than UTC for the OAM component logs.
    See note 742777.1 for in-depth information.
    Julio.

  • Remote management audit log file

    I've read the documentation @
    http://www.novell.com/documentation/...a/ad4zt4x.html
    which indicates that the audit file is auditlog.txt and is located in the
    system directory of the managed workstation. The problem is I can't find the
    log file in that location or anywhere else on the computer. I even looked in
    C:\Program Files\Novell\ZENworks\RemoteManagement\RMAgent but I can't find
    anything. Any ideas? Can someone point me in the right direction.
    BTW, I'm using ZDM 6.5 SP2 for both the server and the workstations.
    Jim Webb

    Just an FYI, with ZDM 6.5 HP3 the file name changed from AuditLog.txt to
    ZRMAudit.txt still located under system32 on Windows XP.
    Jim Webb
    >>> On 5/22/2006 at 3:27 PM, in message
    <[email protected]>,
    Jim Webb<[email protected]> wrote:
    > Well I found out the ZDM 6.5 HP2 fixes the problem of the log file not
    > being
    > created.
    >
    > Jim Webb
    >
    >>>> On 5/19/2006 at 8:37 AM, in message
    > <[email protected]>,
    > Jim Webb<[email protected]> wrote:
    >> Well, it does show up in the event log but not in the inventory. If I
    >> disable inventory the log file won't be deleted, correct?
    >>
    >> Jim Webb
    >>
    >>>>> On 5/18/2006 at 10:03 AM, in message
    >> <[email protected]>, Marcus
    >> Breiden<[email protected]> wrote:
    >>> Jim Webb wrote:
    >>>
    >>>> I did a search on a machine I am remote controlling, no log file. What
    >>>> next?
    >>> good question... does the session show up in the eventlog?

  • Audit log files user rights

    Hello,
    I started binary audit some of my servers. It works fine.
    The generated files have a 600 mask and root:root owner and group. This breaks my backup routines: the backup scripts run as another user, and permission denied errors arise.
    How can I change the audit files' mask?
    Thanks,
    Osman

    Although I'm not sure, I don't think you can, since audit data will always need solid protection due to the information it contains. The only viable option I see is to use syslog as your logging daemon.

  • Viewing SQL Audit Logs

    I am new at the SQL auditing feature. I have played around with it, and I have one problem that I hope someone can help me with. When I first turn on the SQL audit and the audit file is small I can view the log on my local machine with the log viewer in
    SSMS. However, as the log files continue to be created and grow, the log file viewer no longer works. It sits there for several minutes saying "initializing log file #1", then just comes back with 0 records processed. Sometimes it comes back with an OutOfMemoryException.
    I have the SQL audit set to go to files, and the files are set to be 250 MB each.
    Is there some better way to look at the SQL audit log files generated, or is there some way to make the log file viewer actually work once the log files begin to grow? I know I can use Transact-SQL statements, but I was hoping to use the log file viewer
    or some other GUI viewer. Another issue I'm concerned about is being able to archive the log files to another location and having the ability to look at them.
    Thank you.

    It looks like a bug:
    http://connect.microsoft.com/SQLServer/feedback/details/709364/sql-server-audit-logs-do-not-display
    http://thomaslarock.com/2012/10/viewing-sql-server-2008-r2-audit-logs-using-ssms-2012/
    Use the sys.fn_get_audit_file() system function to see the data, as you mentioned you can already do with Transact-SQL.
     http://msdn.microsoft.com/en-us/library/cc280765.aspx
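    For example (the file path below is just a placeholder for wherever your audit target writes its .sqlaudit files):
    -- read all the audit files for the target directly, bypassing the SSMS log viewer
    SELECT event_time, action_id, succeeded,
           server_principal_name, database_name, object_name, statement
    FROM sys.fn_get_audit_file(N'E:\SQLAudit\*.sqlaudit', DEFAULT, DEFAULT)
    ORDER BY event_time DESC;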
    -Prashanth

  • Protection of SAP Log Files

    Does anyone know of any tools (SAP or third-party) to protect SAP log files (system logs, security audit logs, etc.) from alteration by an authorized user (e.g., someone with SAP_ALL)?  We are looking for an audit-friendly method to protect log files such that someone with SAP_ALL privileges (via Firefighter or special SAP userid (DDIC, SAP*)) can't perform actions and then cover up their tracks by deleting log entries etc.  For example, we're wondering if any tools exist that enable the automatic export of the log files to a protected area (that's inaccessible to users with SAP privileges)?  We'd certainly appreciate any advice or insight as to how to resolve this issue.
    Regards,
    Gail

    For anyone who is interested, I wanted to pass along what we did (this was in response to an audit finding):
    First, SAP_ALL access is restricted to monitored Firefighter accounts (we already had that in place).  Recognizing that users with SAP_ALL and super-user access at the UNIX level (i.e., our Basis Team) can still circumvent pretty much any measure we take (e.g., can disable alerts in CCMS, delete batch jobs, deactivate Security Audit Log filters, delete Security Audit Log files, etc.), at least the actions would be captured via FF  (although they could disable that as well) or other utilities at the UNIX level.  And the more things the person has to disable/deactivate, the more likely it becomes that someone would notice that something was amiss. 
    Our company was already using SPLUNK to capture logs from other (non-SAP) systems so we decided to leverage that to capture and retain certain SAP Security Audit Log entries.  We created a batch job on SAP that runs a custom program at 5 minute intervals to extract records from the Security Audit Log files into a UNIX file (the program includes some logic that uses timestamps in the UNIX file to determine which records to extract).  The UNIX file is monitored by the UNIX tail-f command which is spawned by a Perl script.  The output from the tail-f command is then piped to a file on a central syslog server via the logger command in the script.  Finally, a SPLUNK process, which monitors syslog entries, extracts the information into SPLUNK real-time.
    This process is not bulletproof as our Basis Team (with SU privileges at the UNIX level) could disable the Perl script or delete/change entries within the UNIX file.  All we can really do is make it difficult for them to cover their tracks…
    Gail

  • Transactional log file issue

    Dear All,
    There have been issues in the past where the transactional log file has grown so big that it hit the size limit of the drive. I would like to know the answers to the following
    please:
    1. To resolve the space issue, is the correct way to first take a backup of the transactional log then shrink the transactional log file?
    2. What would be the recommended auto growth size, for example if I have a DB which is 1060 GB?
    3. At the moment, the transactional log backup is done every 1 hour, but I'm not sure if it should be taken more regularly?
    4. How often should the update stats job run, please?
    Thank you in advance!

    Hi
    My answers might be very similar to what geeks have already answered, but I hope they add something more.
    1. To resolve the space issue, is the correct way to first take a backup of the transactional log then shrink the transactional log file?
     --> If the database recovery model is full or bulk-logged, then a t-log backup is helpful; if it doesn't help, try to increase the frequency of log backups, and you can refer to:
    Factors That Can Delay Log Truncation
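    As a concrete sketch of point 1, assuming a full recovery model (the database, file, and path names are placeholders; the shrink target is in MB):
    BACKUP LOG [YourDatabase] TO DISK = N'E:\Backups\YourDatabase_log.trn';
    -- after the log backup, shrink the log file back to a reasonable working size
    DBCC SHRINKFILE (N'YourDatabase_log', 10240);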
    2. What would be the recommended auto growth size, for example if I have a DB which is 1060 GB?
    Autogrowth for a very large DB is very crucial: if it is too high it can cause very large active VLFs, and if it is too small it can cause fragmentation. In your case your priority is to control space utilization.
    I suggest you keep autogrowth to a minimum, and it must be specified as a size, not as a percentage.
    /*******Auto grow formula for log file**********/
    Auto grow less than 64MB = 4 VLFs 
    Autogrow  of 64MB and less than 1GB = 8 VLFs 
    Autogrow of 1GB and larger = 16 VLFs
    3. At the moment, the transactional log backup is done every 1 hour, but I'm not sure if it should be taken more regularly?
    ---> If the query below returns log_backup for the respective database, then yes, you can increase the log backup frequency. But if it returns some other factor, please check the above-mentioned link.
    "select name as [database] ,log_reuse_wait , log_reuse_wait_desc from sys.databases"
    4. How often should the update stats job run, please?
    This totally depends on the amount of DML operations you are performing. You can select auto update stats, and weekly you can do an update stats with full scan.
    Thanks Saurabh Sinha
    http://saurabhsinhainblogs.blogspot.in/
    Please click the Mark as answer button and vote as helpful
    if this reply solves your problem

  • How can I audit specific file??

    I have some important files used by more than one user, and I want to audit only these files (removing or renaming events ...).
    Can anybody tell me how to do it on Solaris 8?

    As you know, auditing is usually done on a system-wide basis, which is selected when the Basic Security Module (BSM) is initialized (example: /etc/security/bsmconv /). In your case, the solution may be to modify the audit_user file to track the users who may be accessing the files in question. Still, this will not give you exactly the behavior that you want. You could always post-process the audit.log files after the fact.

  • IdM Audit Log

    Does Identity Manager keep a record of all events like adds/deletes/modifies to entries it manages?
    Are all attributes and values added recorded?
    Are all attributes and values (before and after modification) recorded on updates?
    Are all delete entry events recorded?
    If so, how would I extract this information out of IdM to a log FILE?
    Also, about how much effort is involved in creating the desired audit log FILE.
    (Potential) customers of Identity Manager here have asked, after being shown a quick demo of IdM, where the ability is to get statistical info, e.g. how many entries were added in the past 24 hours/week/month, how many email accounts were created in the past 24 hours/week/month, etc.
    I/they see an on-screen audit report as an IdM task, but it doesn't seem to be able to dump useful information to a file. A file can be manipulated to produce these statistics; a screen cannot. This file can also be used by other external systems, of course.

    I cannot agree more with Mr greenfan88:
    Clients should have HIGH expectations of a system such as IDM, which relates to provisioning, meta-directory, and workflow.
    The main reason being that business processes are the core driver of successful projects; technical things come second. Thus processes need to be highly traceable and reports customizable.
    What I think of IDM Reports:
    * Nearly half of the standard reports are administrative reports (ex: list the connectors status, list the admins...) => No business value
    * Other reports are pure AuditLog reports that correspond to a grep on logs => Low business value
    * There are as well resource risk reports that scan inactive accounts... => No business value
    * One report type provide statistical information which is good
    * Only one report consolidates information (different from just an auditlog grep listing)
    All these reports have low business value:
    1) the attributes are technical ones
    2) the reports types are frozen
    3) Consolidation is very low
    4) Scoping/Security of reports is based on ORGANIZATIONS. Very limited
    5) Inputting parameters such as a date range, people/account status (active/inactive), or department perimeter is impossible or very difficult to achieve
    What I think of IDM AUDITOR:
    * Quite the same, since lots of reports are administrative
    * Auditor introduces the notion of COMPLIANCE rules. This is good BUT it should be extended to business attributes, time ranges, active/inactive status...
    Except for the COMPLIANCE addition, I don't see many interesting features in Auditor. Is it still in V1 or beta?
    => I hope the product line will improve to include REAL REPORTS like the ones we can make with BUSINESS OBJECTS or CRYSTAL REPORTS...
    Rgds,

  • How to configure log files in SQL 2012?

    I'm installing a SharePoint solution using Microsoft SQL 2012, and I have limited knowledge of installing SQL. I simply run the wizard and create the DB for SharePoint.
    Are there any links/materials available demonstrating how to correctly configure the log files, step by step, on a specific partition during the SQL installation for SharePoint 2013?
    thanks, 

    Hello,
    SQL Server database log files benefit from RAID 1 and RAID 10 configurations. RAID 10 is recommended for both data files and log files.
    To control the growth of transaction log files, please back them up regularly. The following article explains in detail why:
    http://technet.microsoft.com/en-us/library/ms175495.aspx
    Monitor the growth of log files using Performance Monitor and the SQLServer:Databases --> Log Growths counter. Adjust the size of log files until Log Growths
    is constantly zero.
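    If it helps, a minimal sketch of both suggestions (the database and file names below are placeholders for your SharePoint content DB):
    -- how many times each log file has had to auto-grow since the counters were reset
    SELECT instance_name AS database_name, cntr_value AS log_growths
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Log Growths' AND object_name LIKE '%:Databases%';
    -- pre-size the log and give it a fixed growth increment so Log Growths stays at zero
    ALTER DATABASE [WSS_Content]
    MODIFY FILE (NAME = N'WSS_Content_log', SIZE = 4GB, FILEGROWTH = 512MB);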
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com
