Audit data Tables

Hi All,
Can you tell me the names of the tables where audit data is stored on the Microsoft platform?
I think there are two tables, i.e. an Audit Activity table and an Audit Data table.
Thank you
Kind Regards
Abhinav Sona

Hi Sona,
These are the table names:
AuditDtsLog
AuditActivityDetail<Appset>
AuditActivityDetail<Appset>Archive
AuditActivityHdr<Appset>
AuditActivityHdr<Appset>Archive
And, for each application:
AuditActivityDetail<Application>
AuditActivityDetail<Application>Archive
AuditActivityHdr<Application>
AuditActivityHdr<Application>Archive
AuditData<Application>
AuditData<Application>Archive
AuditDataTmp<Application>
AuditHdr<Application>
AuditHdr<Application>Archive
Comment<Application>Archive
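For example, with an application named Finance (the name here is only an illustration), you could inspect the audit rows with queries such as:
select top 20 * from dbo.AuditDataFinance
select top 20 * from dbo.AuditActivityHdrFinance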
Kind regards
Roberto

Similar Messages

  • Tax error - audit data - table ETXDCJ with wrong data

    Hi,
    I am working on the audit_data file that the external tax system receives from SAP. The entries from table ETXDCJ that are fed to the audit_data file are wrong. Please let me know how this table gets its entries, and any transaction or steps I need to follow to replicate the issue. We have this wrong tax data issue only for certain sales document types, not for all. Please let me know.
    Thanking you
    Ram.

    External tax calculation is a setting that you have to enable in configuration. Every document has a tax procedure and a pricing procedure assigned to it, so when you create a sales order your prices and taxes are calculated based on these two. I am not sure exactly how it works, but I think the external tax calculation software calculates the taxes and sends tax documents back to SAP. These are what get stored in the tables ETXDCJ, ETXDCH and ETXDCI. The update function module EXTERNAL_TAX_DOC_INSERT_DB appears to do the insert.
    What exactly are you calling an error? Are the calculations wrong, or is it something else? Are you sure it is the document type, and not the customer or state, that is influencing this? Look in configuration for external tax calculation under 'Financial Accounting Global Settings', as follows:
    Implementation Guide for R/3 Customizing (IMG)
    -->Financial Accounting
       -->Financial Accounting Global Settings
          -->Tax on Sales/Purchases
             -->External Tax Calculation
    I am wondering if your issue is with the tax jurisdictions and your condition table entries. If you are sure that it is always with a particular document type, compare the document type settings with another one that is working.
    If this helps, please don't forget to reward.
    Srinivas

  • Audit - Not loading audit data into audit tables

    Hi,
    Audit data is not loading into the audit tables in my XI 3.1 SP4 environment.
    The audit database is on SQL Server 2008 R2; the audit database is configured and audit events are enabled properly.
    The log files are also being created without any errors.
    Please help me.

    I am getting the error below:
    bsystem_impl.cpp:2651: TraceLog message 1
    2011/11/09 03:22:47.071|>>|A| |13664|9368| |||||||||||||||assert failure: (.\auditsubsystem_impl.cpp:2651). (false : Next event id value should be greater than the current one, check the auditee packing events code).

  • Audit database: configuring the auditing data source (DB2)

    Hello experts,
    I want to enable the audit database for BusinessObjects. I have followed the admin guide but it does not work.
    I installed BusinessObjects Enterprise XI 3.1 on AIX 5.3.
    During the installation I chose 'Use an existing database' and selected DB2 (the same database as the SAP BW database).
    When the installation asked for the CMS and Audit database information, I entered the same DB alias as the SAP BW database.
    So now I have SAP BW, CMS and Audit data in the same database.
    After installation I can see the CMS tables in the "DB2<aliasName>" schema, but I cannot find the audit tables.
    Are the audit tables supposed to be created after installation?
    I then tried to enable the audit database using cmsdbsetup.sh: I chose 'selectaudit' and filled in the information it requires.
    It finished with no errors:
    "Select auditing data source was successful. Auditing Data Source setup finished."
    But I still cannot find any audit tables in the database.
    I also ran serverconfig.sh, and I do not see the Enable Audit option when I choose 'modify a server'.
    Any ideas?
    Thanks in advance.
    Chai

    Hello,
    Thanks for your reply.
    It is not a BO cluster.
    The log detail from selecting the audit data source is shown below:
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Select auditing data source was successful.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Select auditing data source was successful.
    And the CMS log file did not show any error.
    Additional details:
    - My BW and BO are on the same server.
    - I have already granted all the necessary rights to the user related to the audit database.
    - My BW and BO are in the same database.
    - No audit tables appear in the database.
    - No Fix Pack is installed.
    I wonder why the BO audit connection does not see my database.
    (In the case of DB2, I think the DB2 alias name defaults to the database name.
    So if my database name is BWD, then the database alias name should be BWD, am I right?)
    Any ideas?
    Thanks in advance.
    Chai

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Our Internal Audit department has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We do have all the information available, but not in a format that allows proper "reporting" of the log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
    We do have housekeeping in place, so the logs are partly "live" DB tables and partly purged tables that were exported to Excel to archive the historical log information.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message' button, ha ha.
    The information below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information, as it is not 'human readable' in the database.
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    For instance, if you wanted to pull Rules Load, Metadata Load and Member List Load activity, you could run a query like this (NOTE: strAppName should be set to the name of your application):
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    In regards to the activity codes, here's a quick breakdown (a sketch showing how to join them into the query follows the list):
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
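    If you want to avoid hard-coding the activity names in the query, one option is to load this list into a small lookup table and join to it. A rough sketch, reusing the column names from the query above (the lookup table itself is only an illustration):
    create table hfm_activity_lookup (activity_id int primary key, activity_name nvarchar(60))
    insert into hfm_activity_lookup values (1, 'Rules Load')
    insert into hfm_activity_lookup values (21, 'Metadata Load')
    insert into hfm_activity_lookup values (23, 'Member List Load')
    -- ...and so on for the remaining codes from the list above...
    select au.sUserName as [User], al.activity_name as Activity,
           cast(ta.StartTime - 2 as smalldatetime) as [Time Start],
           cast(ta.EndTime - 2 as smalldatetime) as [Time End],
           ta.ServerName, ta.strDescription, ta.strModuleName
      from YourAppNameHere_task_audit ta
      join hsv_activity_users au on au.lUserID = ta.ActivityUserID
      join hfm_activity_lookup al on al.activity_id = ta.activitycode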

  • Creating Audit Data with Triggers

    I want to create an audit table like
    create table AuditTable (
      FieldName varchar2(40),
      OldValue  varchar2(100),
      NewValue  varchar2(100),
      UpdtUser  varchar2(20),  -- "User" is a reserved word in Oracle, so renamed here
      UpdtDate  date
    );
    Whenever table X is updated, the trigger should capture the changes and create a row for each field in the UPDATE statement.
    I don't want to turn on database auditing, as the DB is very big and highly transaction oriented; I just want to watch one particular table.
    I need a little guidance on parsing the update statement to get the field names, their old values and their new values.

    Well, you could certainly audit a single table - you don't have to audit every table.
    In any case, the trigger would look something like:
    create or replace trigger t_audit
    after update on t
    for each row
    begin
      if updating('column_a') then
        insert into audittable values ('column_a', :old.column_a, :new.column_a, user, sysdate);
      end if;
      -- ... repeat for each audited column ...
      if updating('column_z') then
        insert into audittable values ('column_z', :old.column_z, :new.column_z, user, sysdate);
      end if;
    end;
    /
    You can use the data dictionary (user_tab_columns) to automate the writing of that trigger if there are many columns.
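    For instance, a query like the one below against user_tab_columns will generate the repetitive if updating(...) blocks for you. The table name 'T' is just a placeholder, and non-character columns may need a to_char() around the :old/:new references so they fit the varchar2 audit columns:
    select 'if updating(''' || lower(column_name) || ''') then' || chr(10) ||
           '  insert into audittable values (''' || lower(column_name) || ''', :old.' ||
           lower(column_name) || ', :new.' || lower(column_name) || ', user, sysdate);' || chr(10) ||
           'end if;'
      from user_tab_columns
     where table_name = 'T'
     order by column_id;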

  • View customization table changes and data table changes in a report

    Hi All,
    How can I view customization table changes and data table changes in a report?
    Are SCU3 or RSVTPROT the right transactions?
    Also, please explain the concept of an audit trail.
    Thanks
    SD

    Hi,
    Changes to master data objects must be captured for compliance purposes. This allows you to view and print an audit log of changes to master data objects for a chosen period. It is very common for external auditors to focus on what has changed from one year or quarter to the next to help determine the nature, extent, and population for testing. To configure and access your audit log, perform the actions listed with each of the following utilities.
    The audit trail is used to track record changes.
    The report RPUAUD00 shows all changes made to master data by any user at any time.
    However, before using this audit trail, please ensure that the system hardware is adequately sized, since activating the audit trail can later become a performance issue as it occupies more and more space over time.
    Best Regards,
    Venkat.

  • What is the best way to audit data

    What is the best way to audit actual changes in the data, that is, to be able to see each insert, update, delete on a given row, when it happened, who did it, and what the row looked like before and after the change?
    Currently, we have implemented our own auditing infrastructure where we generate standard triggers and an audit table to store OLD (values at the beginning of the Before Row timing point) and NEW (values at the beginning of the After Row timing point) values for every change.
    I'm questioning this strategy because of the performance impact it has (significant, to say the least) and because it's something that a developer (confession, I'm the developer) came up with, rather than something a database administrator came up with. I've looked into Oracle Auditing, but this doesn't seem like we'd be able to go back and see what a row looked like at a given point in time. I've also looked at Flashbacks, but this seems like it would require a monumental amount of storage just to be able to go back a week, much less the years we currently keep this data.
    Thanks,
    Matt Knowles

    You can either:
    1. Implement your own custom auditing (as you currently do)
    2. Flashback Data Archive (11g). Requires a license (see the sketch at the end of this reply).
    3. Version enable your tables with Workspace Manager.
    Unfortunately, auditing data always takes lots of space. You must also consider performance, as custom triggers and Workspace Manager will perform much slower than FDA if there is heavy DML on the table.
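    For option 2, a minimal Flashback Data Archive sketch (11g, separately licensed; the archive name, tablespace, retention and table names below are placeholders only):
    create flashback archive audit_fda tablespace users retention 1 year;
    alter table your_schema.your_table flashback archive audit_fda;
    -- later, view row history, e.g. for the last 7 days:
    select versions_starttime, versions_operation, t.*
      from your_schema.your_table
           versions between timestamp systimestamp - interval '7' day and systimestamp t;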

  • XI R2 Auditing data retention period

    Hi,
    We are using XI R2 SP2 FP5 with a SQL Server database and want to limit the amount of audit data held, ideally by date, i.e. only 6 months of data.
    I would expect a setting somewhere to say keep x days of data but can't find one and can't find any reference to it in documentation or on these forums.
    Any help much appreciated.
    John

    Hello,
    There is no way to restrict or purge audit data out of the box. You could, however, purge data in your database as described in SAP Note 1198638 - How to purge the BO AUDIT tables leaving only the last 6 months data, i.e.:
    To purge the BO AUDIT tables, leaving only the last 6 months of data, delete from AUDIT_DETAIL with a join to AUDIT_EVENT, selecting dates older than 6 months. Then delete the same period from AUDIT_EVENT.
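    The delete described in the note might look something like the sketch below. The column names (Event_ID, Start_Timestamp) are assumptions, so verify them against your own audit schema before running anything:
    delete d
      from AUDIT_DETAIL d
      join AUDIT_EVENT e on e.Event_ID = d.Event_ID
     where e.Start_Timestamp < dateadd(month, -6, getdate())
    delete from AUDIT_EVENT
     where Start_Timestamp < dateadd(month, -6, getdate())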
    Note that this is apparently not supported; see SAP Note 1406372 - Is it possible to purge Audit database entries?
    Best,
    Srinivas

  • Collector goes down and audit data isn't in the OAV Console

    Hi,
    I have OAV 10.2.3.2 with a DBAUD collector; the collection agent is on Windows. When I run this collector, it starts, but after a while it stops again. I can retrieve policies, create policies and provision them. Audit data is stored in AUD$ but I can't see it in the Audit Vault console.
    In C:/oracle/product/10.2.3.2/av_agent_1/av/log/DBAUD_Collector_av_db2_2 is the following:
         ***** Started logging for 'AUD$ Audit Collector' *****
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Collector Name = DBAUD_Collector
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Source Name = av_db2
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Av Name = AV
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Initialization done OK
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Starting CB
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** CSDK inited OK + 1
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** Src alias = SRCDB2
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** SRC connected OK
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** SRC data retrieved OK
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** Recovery done OK
    ERROR @ '21/04/2010 12:05:07 02:00':
    On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
    ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
    ORA-06512: at "AVSYS.AV$DW", line 1022
    ORA-01882: timezone region not found
    ORA-06512: at "AVSYS.AV$DW", line 1290
    ORA-06512: at line 1
    ERROR @ '21/04/2010 12:05:08 02:00':
    Collecting thread died with status 46821
    INFO @ '21/04/2010 12:06:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:07:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:08:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:09:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:10:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:11:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:12:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:41 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:41 02:00':
         ***** Started logging for 'AUD$ Audit Collector' *****
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Collector Name = DBAUD_Collector
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Source Name = av_db2
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Av Name = AV
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Initialization done OK
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Starting CB
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** CSDK inited OK + 1
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** Src alias = SRCDB2
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** SRC connected OK
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** SRC data retrieved OK
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** Recovery done OK
    ERROR @ '21/04/2010 12:14:03 02:00':
    On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
    ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
    ORA-06512: at "AVSYS.AV$DW", line 1022
    ORA-01882: timezone region not found
    ORA-06512: at "AVSYS.AV$DW", line 1290
    ORA-06512: at line 1
    ERROR @ '21/04/2010 12:14:05 02:00':
    Error on get metric callback: 46821
    ERROR @ '21/04/2010 12:14:05 02:00':
    Collecting thread died with status 46821
    ERROR @ '21/04/2010 12:14:06 02:00':
    Receive error. NS error 12537
    ERROR @ '21/04/2010 12:14:06 02:00':
    Timeout for getting metrics reply!
    INFO @ '21/04/2010 12:15:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:16:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:17:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    I've already followed the help in the administrator guide for "Problem: Cannot start the DBAUD collector and the log file shows an error". It says that if I can run the command
    $ sqlplus /@SRCDB1
    successfully, my source database is set up correctly. I can run this command successfully, but my problem is not solved.
    Any advice, please?

    I reinstalled the AVS and the AV collection agent, applied patchset 10.2.3.2 again, and re-added the source database, agent and collectors, but I still have the same problem: the collector starts successfully, but after a while it stops, and in the AV console there is no data for the reports (I created some audit policies and alerts).
    I tested the connection with sqlplus /@SRCDB1 and it is OK. The collector log is:
    May 11, 2010 8:11:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:11:38 AM Thread-44 FINE: timer task interval calculated = 60000
    May 11, 2010 8:11:38 AM Thread-44 FINE: Going to start collector, m_finalCommand=/bin/sh -c $ORACLE_HOME/bin/a
    vaudcoll hostname="avtest.zcu.cz" sourcename="stag_db" collectorname="DBAUD_Collector" avname="AV" loglevel="I
    NFO" command=START
    May 11, 2010 8:11:46 AM Thread-44 FINE: collector started, exitval=0
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=IS_ALIVE value=true
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=BYTES_PER_SEC value=0.0000
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=RECORDS_PER_SEC value=0.0000
    May 11, 2010 8:11:49 AM Thread-44 FINE: timer task going to be started...
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: Stop caching since it is NOT alive for 10 times of query
    And the collected audit data is in the av$rads_flat table...
    Does anyone know where the problem could be?

  • Options for auditing data changes

    Hi Friends,
    I thought I would get some input on the following implementation. The requirement is to audit some data changes within the system (Oracle 10.2 on RHEL 4.7).
    The audit must capture the before images of the data and who changed it. I have looked at options like Oracle auditing and FGA, but these cannot give me an audit of the data changes themselves, i.e. when they happened and who made them.
    The first thing that comes to mind is using triggers. Another option is using LogMiner. I have successfully tested both of these approaches. The environment is like this:
    1) For some critical tables for which auditing is required, triggers were written (ours is an OLTP application).
    2) For some non-critical tables, LogMiner is used, called by a stored procedure that runs at regular intervals.
    3) The audit data is stored in a different schema, with the same table names as in the base schema.
    I would like to know your thoughts on this.
    Thank You,
    SSN

    The delay with LogMiner is acceptable for the less critical audit tables, and as you said, triggers can be implemented for the critical tables.
    One bottleneck with using LogMiner is that it depends on the availability of archived redo logs; any backup mechanism in place must make sure that no archived logs are removed from the locations the audit program reads from until the periodically running LogMiner-based audit program has processed them.
    Wondering if there is any other recommended approach for this.
    Thanks
    SSN

  • LogMiner issue: not getting the proper auditing data

    Hi,
    I am using the LogMiner utility on my test database to view the contents of archived logs that I copied from my prod database. I have set up everything and started trying to view SQL_REDO from v$logmnr_contents.
    The output looks a bit strange, as it shows object IDs and hexadecimal values for the columns.
    Here it is -
    SQL> set lines 120
    SQL> set pages 300
    SQL> select sql_redo FROM v$logmnr_contents;
    set transaction read write;
    update "UNKNOWN"."OBJ# 979115" set "COL 1" = HEXTORAW('c40b072041') where "COL 1" = HEXTORAW('3b5b5f462666') and ROWID =
    'AADvCrACCAAAAQ2AAA';
    delete from "UNKNOWN"."OBJ# 291676" where "COL 1" = HEXTORAW('c4504d3a42') and "COL 2" = HEXTORAW('4220') and "COL 3" =
    HEXTORAW('c3064c4f1d') and "COL 4" = HEXTORAW('80') and "COL 5" = HEXTORAW('80') and "COL 6" = HEXTORAW('80') and "COL 7
    " = HEXTORAW('80') and "COL 8" = HEXTORAW('80') and "COL 9" = HEXTORAW('80') and "COL 10" = HEXTORAW('c3091a0a45') and "
    COL 11" = HEXTORAW('c3091a0a45') and "COL 12" = HEXTORAW('80') and "COL 13" = HEXTORAW('80') and "COL 14" = HEXTORAW('80
    ') and "COL 15" = HEXTORAW('c4504d3a42') and "COL 16" = HEXTORAW('786d08150e080a') and "COL 17" = HEXTORAW('786d08150e08
    0a') and "COL 18" = HEXTORAW('534554544c455f55534552') and "COL 19" = HEXTORAW('534554544c455f55534552') and ROWID = 'AA
    BHNcAA9AABX7KABy';
    (I guess the dictionary information is missing.)
    However, when I queried dba_objects on my prod database, I do get the object_name and type:
    OWNER OBJECT_NAME OBJECT_TYPE CREATED
    SSCHEMA SCHEMATABLE TABLE 09-FEB-08
    Please suggest: is there any way I can retrieve the proper auditing data?

    Thanks for the reply!
    Another thing I forgot to mention is that our test database is in NOARCHIVELOG mode. Following your suggestion, it gives me the error below:
    SQL> EXECUTE DBMS_LOGMNR_D.BUILD( -
    OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);
    BEGIN DBMS_LOGMNR_D.BUILD( OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS); END;*
    ERROR at line 1:
    ORA-01325: archive log mode must be enabled to build into the logstream
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 3172
    ORA-00258: manual archiving in NOARCHIVELOG mode must identify log
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5786
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5884
    ORA-06512: at "SYS.DBMS_LOGMNR_D", line 12
    ORA-06512: at line 1
    Please suggest.
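
    Since the test database is in NOARCHIVELOG mode, STORE_IN_REDO_LOGS cannot be used there. One alternative is to extract the LogMiner dictionary to a flat file on the production database (where the archived logs come from) and point LogMiner at that file on the test side. A rough sketch, with file names and paths as placeholders (the dictionary directory must be visible to the source database, e.g. via UTL_FILE_DIR):
    -- on the PROD database:
    BEGIN
      DBMS_LOGMNR_D.BUILD(
        dictionary_filename => 'logmnr_dict.ora',
        dictionary_location => '/u01/app/oracle/logmnr',
        options             => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);
    END;
    /
    -- copy the dictionary file to the test server, then on the TEST database:
    BEGIN
      DBMS_LOGMNR.ADD_LOGFILE('/u01/app/oracle/arch/prod_arch_1234.arc', DBMS_LOGMNR.NEW);
      DBMS_LOGMNR.START_LOGMNR(dictfilename => '/u01/app/oracle/logmnr/logmnr_dict.ora');
    END;
    /
    SELECT sql_redo FROM v$logmnr_contents;
    With the dictionary available, v$logmnr_contents should show real table and column names instead of "OBJ# nnnnnn" and HEXTORAW values.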

  • BPC Audit data

    Auditing has been enabled in BPC, and since then a lot of audit data has been piling up; it is slowing down the reporting.
    How can we delete the BPC audit data that has piled up over time?
    I want to completely delete the data.
    I appreciate your input.

    Hi,
    there are 2 standard process chains for archiving audit data:
    /CPMB/ARCHIVE_ACTIVITY
    /CPMB/ARCHIVE_DATA
    tables for audit:
    activity tables: UJU_AUDACTHDR, UJU_AUDACTDET, UJU_AUDACTHDR_A, UJU_AUDACTDET_A
    data audit tables: UJU_AUDDATAHDR, /1CPMB/KIOMBAD, UJU_AUDDATAHDR_A, /1CPMB/KIOMBAD_A
    regards
    D

  • HFM audit data export utility availability in version 11

    Hi Experts,
    We have a client with an HFM environment where the audit and task logs grow very large very quickly.
    They need to be able to archive and clear the logs. They are too large for EPM Maestro to handle and they don't want to schedule them as a regular event.
    I am concerned because I am sure that these large log tables are impacting performance.
    They want to know if the old System 9 utility they used is still available in the latest version. It was called the HFM Audit Data Export utility. Does anyone know?
    Thanks in advance and kind regards
    Jean

    I know this is a reasonably old post but I found it through Google. To help those in the future, this utility is available via Oracle Support. It is HFM Service Fix 11.1.1.2.05 but it is compatible up to 11.1.1.3.
    Here is the Oracle Support KB Article:
    How To Extract the Data Audit and Task Audit records of an HFM application to a File [ID 1067055.1]
    Modified 23-MAR-2010 Type HOWTO Status PUBLISHED
    Applies to:
    Hyperion Financial Management - Version: 4.1.0.0.00 to 11.1.1.3.00 - Release: 4.1 to 11.1
    Information in this document applies to any platform.
    Goal
    Some system administrators of Financial Management desire a method to archive / extract the information from the DATA_AUDIT and TASK_AUDIT database tables of an HFM application before truncating those tables.
    Solution
    Oracle provides a stand-alone utility called HFMAuditExtractUtility.exe to accomplish this task. As well as extracting the records of the two log tables, the utility can also be used to truncate the tables at the same time.
    The utility comes with a Readme file which should be consulted for more detailed instructions on how it should be used.
    The latest version of the utility which is compatible with all versions of HFM up to 11.1.1.3 is available as Service Fix 11.1.1.2.05 (Oracle Patch 8439656).
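
    If the utility cannot be used for some reason, the same idea can be approximated with plain SQL against the HFM schema; a rough sketch for SQL Server, with YourApp standing in for the application name (back up first, and check with Oracle whether direct SQL against the audit tables is supported in your version):
    select * into YourApp_TASK_AUDIT_ARCHIVE from YourApp_TASK_AUDIT
    select * into YourApp_DATA_AUDIT_ARCHIVE from YourApp_DATA_AUDIT
    truncate table YourApp_TASK_AUDIT
    truncate table YourApp_DATA_AUDIT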

  • Audit data from Identity Mgr repository

    Hi,
    I want to get data like the following: I create an account for a user and assign him certain resources today (say the date is A); these resources need to be approved, and they are approved after two days (say the date is B), at which point the resource is created.
    Any clue how I could get the date when the request was made?
    Is there any way, or any workflow service operation, that lets us get the audit data for a user from the repository?
    Thanks,
    Pandu

    In the Audit Vault Server infrastructure, there are a number of objects that are used to store the audit data from source databases. The Agent/Collector continually extracts and sends the audit data from the source databases into the Audit Vault repository. One of the tables that stores the 'raw' audit data is av$rads_flat which should never be externalized, changed, or manipulated by anyone.
    Out of the box, a job runs every 24 hours to transform or normalize the raw audit data into the warehouse structure that the reports are run against. The warehouse tables are published in the Audit Vault Auditor's documentation in which you can run your own report tool or other plsql to analyze the audit data.
    What do you want to do with the raw audit data?
    Thanks, Tammy
