Moving Audit Data

Hello,
An error was raised while moving audit data from an existing runtime repository to the Control Center using the OWB Control Center Upgrade Assistant. Here OWBRTR2TST is the runtime repository in the Control Center.
The error is as follows:
Exception:
Failed to Migrate Audit Data into OWBRTR2TST java.sql.SQLException: ORA-02298:
cannot validate (OWBRTR2TST.co.FK_PARENT_CO) - parent keys not found
ORA-06512: at line 16
What do we need to do to avoid this error?
Thanks
Santi

Hello,
I was getting an almost identical problem. Once you solve it, please let us know.
Mine is like this:
An error was raised while moving audit data from an existing runtime repository to the Control Center
using the OWB Control Center Upgrade Assistant. Here OWBRTR2TST is the runtime repository in
the Control Center.
Exception:
Failed to Migrate Audit Data into OWBRTR2TST java.sql.SQLException: ORA-02298:
cannot validate (OWBRTR2TST.co.FK_PARENT_CO) - parent keys not found
ORA-06512: at line 16
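
For reference, ORA-02298 here means the migrated table still contains child rows whose parent keys do not exist, so Oracle cannot validate the FK_PARENT_CO constraint. A sketch for locating the orphans before re-running the assistant, assuming the constraint is a self-referencing foreign key on the CO table; the id/parent_id column names are placeholders, so check the real ones first:

-- Find which columns the failing constraint actually covers.
SELECT constraint_name, column_name
  FROM all_cons_columns
 WHERE owner = 'OWBRTR2TST'
   AND constraint_name LIKE '%FK_PARENT_CO%';

-- Then look for child rows with no matching parent
-- (id / parent_id are hypothetical column names).
SELECT child.*
  FROM owbrtr2tst.co child
 WHERE child.parent_id IS NOT NULL
   AND NOT EXISTS (SELECT 1
                     FROM owbrtr2tst.co parent
                    WHERE parent.id = child.parent_id);

Repairing or removing the orphaned rows (or restoring the missing parents) should let the migration validate the constraint.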

Similar Messages

  • Excel Upload and moving the data to dynamic itab

    Hi Folks,
    I am building a field catalog dynamically based on the table name and then creating a dynamic internal table from that field catalog. Now I have an Excel sheet whose fields follow the same structure as the field catalog I built, and I want to upload the data from the Excel sheet into an internal table with that structure. I am having a problem moving the data from the Excel file into the itab (the dynamic itab built from the field catalog).
    1. Building the field catalog:
    DATA: gt_fieldcat TYPE lvc_t_fcat,
          ls_fieldcat TYPE lvc_s_fcat.  " work area was missing from the declaration
      ls_fieldcat-tabname    = p_tabname.
      ls_fieldcat-fieldname  = p_fieldname.
      ls_fieldcat-inttype    = 'C'.
      ls_fieldcat-outputlen  = wa_dfies-outputlen.
      ls_fieldcat-coltext    = wa_dfies-fieldtext.
      APPEND ls_fieldcat TO gt_fieldcat.
      CLEAR ls_fieldcat.
    2. Building the dynamic itab:
    DATA: gp_table TYPE REF TO data.
    FIELD-SYMBOLS: <gt_table> TYPE table.  " must end with a period, not a comma
    CALL METHOD cl_alv_table_create=>create_dynamic_table
      EXPORTING
        it_fieldcatalog = gt_fieldcat
      IMPORTING
        ep_table        = gp_table.
    ASSIGN gp_table->* TO <gt_table>.
    3. Now I have an Excel file with a similar structure containing data. Uploading the Excel data into an itab:
    DATA: g_t_xltab  TYPE STANDARD TABLE OF ty_xltab,
          wa_xltab   TYPE ty_xltab,
          it_rawdata TYPE truxs_t_text_data.  " period, not comma, before the call
    CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
      EXPORTING
    *   I_FIELD_SEPERATOR    =
        i_line_header        = 'X'
        i_tab_raw_data       = it_rawdata
        i_filename           = xls_filename
      TABLES
        i_tab_converted_data = g_t_xltab[]
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    * Move the data that got uploaded from Excel into the
    * respective internal tables.
    LOOP AT g_t_xltab INTO wa_xltab.
    * Here I want to move the data into the internal table built from
    * the field catalog / dynamic itab, i.e. <gt_table>.
    ENDLOOP.
    I tried using ASSIGN COMPONENT sy-index OF <gt_table> and MOVE-CORRESPONDING, but in vain, as I am not very familiar with dynamic itabs and field symbols. Kindly let me know how the Excel data can be moved to the dynamic itab.
    Thanks,
    K.Kiran.

    Hi,
    You can try some logic like this. Please note this is just an example:
    TYPE-POOLS: truxs.
    TYPES: BEGIN OF ty_xltab,
             field TYPE string,
           END OF ty_xltab.
    DATA: wa TYPE ty_xltab.
    DATA: lv_len TYPE i.
    DATA: startpos TYPE i.
    DATA: ls_mara TYPE mara.                    " work area the field symbol will point at
    DATA: lt_mara TYPE STANDARD TABLE OF mara.  " target table (<fs_tab> was never declared)
    FIELD-SYMBOLS: <fs_line> TYPE mara.
    FIELD-SYMBOLS: <fs> TYPE any.
    DATA: g_t_xltab  TYPE STANDARD TABLE OF ty_xltab,
          it_rawdata TYPE truxs_t_text_data.
    ASSIGN ls_mara TO <fs_line>.                " a field symbol must be assigned before use
    CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
      EXPORTING
    *   I_FIELD_SEPERATOR    =
        i_line_header        = 'X'
        i_tab_raw_data       = it_rawdata
        i_filename           = ' '
      TABLES
        i_tab_converted_data = g_t_xltab[]
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    LOOP AT g_t_xltab INTO wa.
      startpos = 0.                             " reset the offset for every Excel row
      CLEAR <fs_line>.
      DO.
        ASSIGN COMPONENT sy-index OF STRUCTURE <fs_line> TO <fs>.
        IF sy-subrc = 0.
          DESCRIBE FIELD <fs> OUTPUT-LENGTH lv_len.
          IF lv_len > 0.
            <fs> = wa-field+startpos(lv_len).
            startpos = startpos + lv_len.
          ENDIF.
        ELSE.
          APPEND <fs_line> TO lt_mara.          " row is complete, add it to the table
          EXIT.
        ENDIF.
      ENDDO.
    ENDLOOP.
    Sorry, my answer is not for your requirement. Please ignore.
    Your problem can easily be solved by looking at the ASSIGN COMPONENT concept. Please search for it.

  • Validate Date if it falls within 30, 120 and 365 days of the audit date I enter on any given day

    PDF Forms in Acrobat 9 Pro
    When auditing a file, I enter the date I am auditing it.
    Example: 02/18/2015
    I then enter the dates when information was last verified. Some of these dates need to fall within 30 days of the audit date I enter on any given day, others need to fall within 120 days, and another within 365 days. If a date doesn't fall within the specified number of days for that field, I would like it to show an error message in that field.
    Examples:
    College Grad Date 01/15/2015
    Certification Date  12/23/2015
    License Date  02/13/2014 - This is over 365 days old, so I do not want the date; I want it to display "Need to re-verify".
    I do not have any inkling of an idea of how to write this. Your help would be most appreciated.
    Thank you.

    Solved the problem - I contacted Apple on their support line found on this page... http://www.apple.com/uk/support/mac/ and they were really helpful and talked me through what to do on the phone. I now have all three productivity apps. Hope this helps.

  • Audit database: configuring the auditing data source (DB2)

    Hello experts,
    I want to enable the audit database for BusinessObjects. I have followed the admin guide, but it does not work.
    I installed BusinessObjects Enterprise XI 3.1 on AIX 5.3.
    When I installed BO, I chose 'Use an existing database' and selected DB2 (the same database as the SAP BW database).
    When the installation asked for the CMS and Audit database information, I filled in the same DB alias as the SAP BW database.
    So now I have SAP BW, CMS, and Audit data in the same database.
    After the installation I saw the CMS tables in the "DB2<aliasName>" schema,
    but I cannot find the Audit tables.
    Are the audit tables supposed to be created after installation?
    I then tried to enable the audit database using cmsdbsetup.sh: I chose 'selectaudit' and filled in the information it required.
    It finished with no error:
    "Select auditing data source was successful. Auditing Data Source setup finished."
    But I still cannot find any audit tables in the database.
    I ran serverconfig.sh and I can't see the Enable Audit option when I choose 'modify a server'.
    Any ideas?
    Thanks in advance.
    Chai

    Hello,
    Thanks for your reply.
    It is not a BO cluster.
    The log detail from selecting the audit data source is shown below:
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Select auditing data source was successful.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST)
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) About to process commandline parameters...
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Finished processing commandline parameters.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
    Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Select auditing data source was successful.
    And the CMS log file did not show any error.
    Additional details:
    - My BW and BO are on the same server.
    - I have already granted all rights to the user related to the audit database.
    - My BW and BO are in the same database.
    - No audit tables appear in the database.
    - No Fix Pack is installed.
    I wonder why the BO audit connection did not see my database.
    (In the case of DB2, I think the DB2 alias name is by default the same as the database name.
    So if my database name is BWD then the database alias name should be BWD, am I right?)
    Any idea?
    Thanks in advance.
    Chai
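
    One quick way to confirm whether the audit tables were ever created, possibly under an unexpected schema, is to query the DB2 catalog directly. A minimal sketch, assuming the audit tables follow an AUDIT_* naming pattern (an assumption; adjust the pattern to whatever your version actually creates):

    -- List any audit-looking tables and the schema they landed in.
    SELECT TABSCHEMA, TABNAME, CREATE_TIME
      FROM SYSCAT.TABLES
     WHERE TABNAME LIKE 'AUDIT%'
     ORDER BY TABSCHEMA, TABNAME;

    If nothing comes back, the cmsdbsetup.sh step likely only registered the data source without creating the tables, which points back at the connection or the audit user's rights.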

  • Should cancelled mailbox moves result in any moved mail data being removed from the target server?

    I was moving mailboxes from one DB to another. 3 out of 120 were left running for 12 hours; they were the biggest ones and it was very slow to process the items. There were also reports of slowness on the server, so I had no choice but to cancel the moves.
    My question is: should cancelled mailbox moves result in any moved mail data being removed from the target database? It doesn't appear that my disk has gained free space again, so I'm wondering if it's there as whitespace maybe? There is an event logged saying the mailbox has been deleted from the target.
    Would appreciate it if someone could clear this up and possibly explain exactly what occurs during this type of scenario.
    Thanks.

    Hi,
    Thank you for the kind reminder.
    In Exchange 2000, 2003, and 2007, mailboxes become unavailable during the move. When a mailbox move takes place, the mailbox isn't actually moved; it's just copied to the destination location. Once that copy has finished, AD is updated to point to the new location and the old mailbox is deleted from the source EDB file.
    The biggest difference in mailbox moves between Exchange 2010 and the legacy versions is that Exchange 2010 allows end users to stay online in their email accounts; on completion of the move, users just need to reopen their Outlook clients.
    Mailbox move requests are a new feature of Exchange 2010 and require running a series of cmdlets to perform asynchronous online mailbox moves with the help of a service agent called the Mailbox Replication Service (MRS). MRS is available on all Exchange 2010 Client Access Servers.
    Best Regards,
    Allen Wang

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    The Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit, and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
    We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
    For instance, if I wanted to pull Metadata Load, Rules Load, Member List load, you could run a query like this. (NOTE: strAppName should be equal to the name of your application .... )
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    In regards to activity codes, here's a quick breakdown of those:
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
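
    If you do decide to move the audit rows out periodically, here is a minimal T-SQL sketch of that step, assuming a pre-created archive table with the same columns (task_audit_archive is a hypothetical name, and the three-month cutoff is arbitrary):

    -- Copy task-audit rows older than the cutoff into the archive ...
    INSERT INTO task_audit_archive
    SELECT * FROM YourAppNameHere_task_audit
     WHERE cast(StartTime-2 as smalldatetime) < DATEADD(month, -3, GETDATE());

    -- ... then remove them from the live audit table.
    DELETE FROM YourAppNameHere_task_audit
     WHERE cast(StartTime-2 as smalldatetime) < DATEADD(month, -3, GETDATE());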

  • What is the best way to audit data

    What is the best way to audit actual changes in the data, that is, to be able to see each insert, update, delete on a given row, when it happened, who did it, and what the row looked like before and after the change?
    Currently, we have implemented our own auditing infrastructure where we generate standard triggers and an audit table to store OLD (values at the beginning of the Before Row timing point) and NEW (values at the beginning of the After Row timing point) values for every change.
    I'm questioning this strategy because of the performance impact it has (significant, to say the least) and because it's something that a developer (confession, I'm the developer) came up with, rather than something a database administrator came up with. I've looked into Oracle Auditing, but this doesn't seem like we'd be able to go back and see what a row looked like at a given point in time. I've also looked at Flashbacks, but this seems like it would require a monumental amount of storage just to be able to go back a week, much less the years we currently keep this data.
    Thanks,
    Matt Knowles

    mattknowles wrote:
    What is the best way to audit actual changes in the data, that is, to be able to see each insert, update, delete on a given row, when it happened, who did it, and what the row looked like before and after the change?
    Currently, we have implemented our own auditing infrastructure where we generate standard triggers and an audit table to store OLD (values at the beginning of the Before Row timing point) and NEW (values at the beginning of the After Row timing point) values for every change.
    You can either:
    1. Implement your own custom auditing (as you currently do)
    2. Flashback Data Archive (11g). Requires license.
    3. Version enable your tables with Workspace Manager.
    >
    I'm questioning this strategy because of the performance impact it has (significant, to say the least) and because it's something that a developer (confession, I'm the developer) came up with, rather than something a database administrator came up with. I've looked into Oracle Auditing, but this doesn't seem like we'd be able to go back and see what a row looked like at a given point in time. I've also looked at Flashbacks, but this seems like it would require a monumental amount of storage just to be able to go back a week, much less the years we currently keep this data.
    Unfortunately, auditing data always takes lots of space. You must also consider performance, as custom triggers and Workspace Manager will perform much slower than FDA if there is heavy DML on the table.
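
    For option 2, the setup is fairly compact. A minimal sketch, assuming 11g with the Flashback Data Archive feature licensed; the archive, tablespace, and table names are placeholders:

    -- Create an archive that keeps row history for 5 years.
    CREATE FLASHBACK ARCHIVE audit_fba
      TABLESPACE audit_ts
      RETENTION 5 YEAR;

    -- Start tracking history for a table.
    ALTER TABLE orders FLASHBACK ARCHIVE audit_fba;

    -- See what a row looked like a week ago.
    SELECT *
      FROM orders AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '7' DAY)
     WHERE order_id = 42;

    The appeal over hand-written triggers is that the history capture happens in the background, and the AS OF query syntax answers the "what did this row look like then" question directly.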

  • How to show User Auditing data in dashboard/reports in MS CRM 2013 online?

    Hi,
    I have a requirement to show user auditing details, like a user's last logged-in date and session time spent, in MS CRM 2013 Online.
    I have not found any option to query the user auditing data.
    I found the Audit Summary view but don't know how to use it.
    Could anyone suggest how I can achieve this?
    Thanks
    Baji Rahaman

    Please try this:
    Public Function Decompress(ByVal arr As Byte()) As Byte()
            Dim notCompressed As Boolean
            notCompressed = False
            ' Copy the input into a seekable stream for GZipStream to read.
            Dim MS As System.IO.MemoryStream
            MS = New System.IO.MemoryStream()
            MS.Write(arr, 0, arr.Length)
            MS.Position = 0
            Dim stream As System.IO.Compression.GZipStream
            stream = New System.IO.Compression.GZipStream(MS, System.IO.Compression.CompressionMode.Decompress)
            Dim temp As System.IO.MemoryStream
            temp = New System.IO.MemoryStream()
            Dim buffer As Byte() = New Byte(4095) {}   ' 4 KB read buffer
            While (True)
                Try
                    Dim read As Integer
                    read = stream.Read(buffer, 0, buffer.Length)
                    If (read <= 0) Then
                        Exit While
                    Else
                        ' Only write the bytes actually read, not the whole buffer.
                        temp.Write(buffer, 0, read)
                    End If
                Catch ex As Exception
                    ' Read failed, so the input was not GZip-compressed.
                    notCompressed = True
                    Exit While
                End Try
            End While
            stream.Close()
            If (notCompressed = True) Then
                ' Fall back to the original bytes untouched.
                Return arr
            Else
                Return temp.ToArray()
            End If
        End Function
    Thanks & Regards Manoj

  • Audit - Not loading audit data into audit tables

    Hi,
    Audit data is not loading into the audit tables in my XI 3.1 SP4 environment.
    The audit database is on SQL Server 2008 R2.
    The audit database is configured and the audit events are enabled properly,
    and the log files are being created without any errors.
    Please help me.

    I am getting the below error:
    bsystem_impl.cpp:2651: TraceLog message 1
    2011/11/09 03:22:47.071|>>|A| |13664|9368| |||||||||||||||assert failure: (.\auditsubsystem_impl.cpp:2651). (false : Next event id value should be greater than the current one, check the auditee packing events code).

  • XI R2 Auditing data retention period

    Hi,
    Using XI R2 SP2 FP5 with a SQL Server database; we want to limit the amount of audit data held, ideally by date, i.e. only 6 months of data.
    I would expect a setting somewhere to say "keep x days of data", but I can't find one, and can't find any reference to it in the documentation or on these forums.
    Any help much appreciated.
    John

    Hello,
    There is no way to restrict/purge audit data out of the box. You could, however, purge data in your DB as mentioned in SAP Note 1198638 (How to purge the BO AUDIT tables leaving only the last 6 months of data), i.e.:
    To purge the BO AUDIT tables, leaving only the last 6 months of data, delete from AUDIT_DETAIL with a join to AUDIT_EVENT, selecting dates less than 6 months old. Then delete the same period from AUDIT_EVENT.
    Note that this is apparently not supported; see SAP Note 1406372 (Is it possible to purge Audit database entries?).
    Best,
    Srinivas
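
    A minimal T-SQL sketch of that purge, assuming the audit schema's Event_ID and Start_Timestamp columns (an assumption; verify the column names against your own AUDIT_EVENT/AUDIT_DETAIL tables, and back up the audit DB first):

    -- Delete detail rows for events older than 6 months.
    DELETE d
      FROM AUDIT_DETAIL d
      JOIN AUDIT_EVENT e ON e.Event_ID = d.Event_ID
     WHERE e.Start_Timestamp < DATEADD(month, -6, GETDATE());

    -- Then delete the events themselves for the same period.
    DELETE FROM AUDIT_EVENT
     WHERE Start_Timestamp < DATEADD(month, -6, GETDATE());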

  • Collector goes down and audit data isn't in the OAV Console

    Hi,
    I have OAV 10.2.3.2, a DBAUD collector, and the collection agent is on Windows. When I run this collector it starts, but after a while it is stopped again. I can retrieve policies, create policies, and provision them. Audit data is stored in AUD$, but I can't see it in the Audit Vault console.
    The log C:/oracle/product/10.2.3.2/av_agent_1/av/log/DBAUD_Collector_av_db2_2 contains the following:
         ***** Started logging for 'AUD$ Audit Collector' *****
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Collector Name = DBAUD_Collector
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Source Name = av_db2
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Av Name = AV
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Initialization done OK
    INFO @ '21/04/2010 12:04:45 02:00':
    ***** Starting CB
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
    INFO @ '21/04/2010 12:04:46 02:00':
    Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** CSDK inited OK + 1
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** Src alias = SRCDB2
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** SRC connected OK
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** SRC data retrieved OK
    INFO @ '21/04/2010 12:04:46 02:00':
    ***** Recovery done OK
    ERROR @ '21/04/2010 12:05:07 02:00':
    On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
    ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
    ORA-06512: at "AVSYS.AV$DW", line 1022
    ORA-01882: timezone region not found
    ORA-06512: at "AVSYS.AV$DW", line 1290
    ORA-06512: at line 1
    ERROR @ '21/04/2010 12:05:08 02:00':
    Collecting thread died with status 46821
    INFO @ '21/04/2010 12:06:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:07:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:08:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:09:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:10:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:11:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:12:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:41 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:13:41 02:00':
         ***** Started logging for 'AUD$ Audit Collector' *****
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Collector Name = DBAUD_Collector
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Source Name = av_db2
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Av Name = AV
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Initialization done OK
    INFO @ '21/04/2010 12:13:41 02:00':
    ***** Starting CB
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
    INFO @ '21/04/2010 12:13:42 02:00':
    Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** CSDK inited OK + 1
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** Src alias = SRCDB2
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** SRC connected OK
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** SRC data retrieved OK
    INFO @ '21/04/2010 12:13:42 02:00':
    ***** Recovery done OK
    ERROR @ '21/04/2010 12:14:03 02:00':
    On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
    ORA-06512: at "AVSYS.DBMS_AUDIT_VAULT", line 6
    ORA-06512: at "AVSYS.AV$DW", line 1022
    ORA-01882: timezone region not found
    ORA-06512: at "AVSYS.AV$DW", line 1290
    ORA-06512: at line 1
    ERROR @ '21/04/2010 12:14:05 02:00':
    Error on get metric callback: 46821
    ERROR @ '21/04/2010 12:14:05 02:00':
    Collecting thread died with status 46821
    ERROR @ '21/04/2010 12:14:06 02:00':
    Receive error. NS error 12537
    ERROR @ '21/04/2010 12:14:06 02:00':
    Timeout for getting metrics reply!
    INFO @ '21/04/2010 12:15:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:16:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    INFO @ '21/04/2010 12:17:01 02:00':
    Could not call Listener, NS error 12541, 12560, 511, 2, 0
    I've already followed the help in the administrator guide for "Problem: Cannot start the DBAUD collector and the log file shows an error". It says that if I can run the command
    $ sqlplus /@SRCDB1
    successfully, my source database is set up correctly. I can run this command successfully, but my problem isn't solved.
    Any advice, please?

    I reinstalled the AVS and the AV collection agent, then patchset 10.2.3.2 again, and again added the source database, agent, collectors... but I still have the same problem: the collector starts successfully, but after a while it is stopped, and in the AV console there is no data for the reports (I created some audit policies and alerts).
    I tested the connection with sqlplus /@SRCDB1 and it is OK; the log for the collector is:
    May 11, 2010 8:11:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:11:38 AM Thread-44 FINE: timer task interval calculated = 60000
    May 11, 2010 8:11:38 AM Thread-44 FINE: Going to start collector, m_finalCommand=/bin/sh -c $ORACLE_HOME/bin/a
    vaudcoll hostname="avtest.zcu.cz" sourcename="stag_db" collectorname="DBAUD_Collector" avname="AV" loglevel="I
    NFO" command=START
    May 11, 2010 8:11:46 AM Thread-44 FINE: collector started, exitval=0
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=IS_ALIVE value=true
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=BYTES_PER_SEC value=0.0000
    May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=RECORDS_PER_SEC value=0.0000
    May 11, 2010 8:11:49 AM Thread-44 FINE: timer task going to be started...
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
    May 11, 2010 8:21:15 AM Thread-43 FINE: Stop caching since it is NOT alive for 10 times of query
    And in the table av$rads_flat the collected audit data is there...
    Please, does anyone know where the problem could be?
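
    For what it's worth, ORA-01882 ("timezone region not found") in the first log usually means some component is passing a timezone region name the database does not recognize. A small sketch for checking this from the Audit Vault database, assuming you can log in with SQL*Plus; 'Europe/Prague' is just an example value to compare against the agent host's TZ setting:

    -- What the database and session currently use:
    SELECT DBTIMEZONE, SESSIONTIMEZONE FROM DUAL;

    -- Does the database recognize the region the agent host is set to?
    SELECT DISTINCT tzname
      FROM v$timezone_names
     WHERE tzname LIKE 'Europe/P%';

    If the agent host's region is missing from v$timezone_names, aligning the host TZ (or the agent's Java timezone) with a region the database knows is the usual fix.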

  • BPEL retrieving audit data in Java sample

    Hello,
    I am interested in using the BPEL audit facilities programmatically from inside a BPEL process. I had a look at the API documentation in the orabpel folder and noticed that there are facilities for adding information to the audit trail (via <bpelx:exec> and the addAuditTrailEntry() methods). Using these seems to be straightforward.
    I also noticed in the API that there are some methods for retrieving audit info from BPEL, based on an instance handle. I would be interested in seeing how some information could be obtained from this audit data in a Java program. Could you point me to an existing example of how to do this?
    Do you have any other sources of information regarding the audit process inside BPEL?
    Many thanks in advance!
    Regards,
    Marinel

    Delayed response:
    You can use these methods from java-exec to add your own audit entries. I don't think there is a way to retrieve the audit trail for the current instance from java-exec. If you want to retrieve an audit trail, use IInstanceHandle.getAuditTrail().
    * Add a log entry to the audit trail.
    * @param message
    * @param detail
    abstract protected void addAuditTrailEntry( String message, Object detail );
    * Add a log entry to the audit trail.
    * @param message
    abstract protected void addAuditTrailEntry( String message );
    * Add an exception log entry to the audit trail.
    * @param t
    abstract protected void addAuditTrailEntry( Throwable t );
    HTH.
    Thanks,
    Rakesh

  • Moving shipment data using IDOC SHPMNT05

    Hi guys,
    I want to configure an ALE process across two SAP systems and move shipment data using the IDoc SHPMNT05.
    May I know how to do that?
    Please help me.
    Thanks in advance,
    Swetha.

    Hi Shwetha,
    Please check the following links to start with.
    http://help.sap.com/saphelp_nw04/helpdata/en/78/217da751ce11d189570000e829fbbd/content.htm
    http://help.sap.com/saphelp_erp2004/helpdata/en/dc/6b835943d711d1893e0000e8323c4f/content.htm
    Thanks,
    Naren

  • Options for auditing data changes

    Hi Friends,
    I thought I would get some input on the following implementation. The requirement is to audit some data changes within the system (Oracle 10.2 on RHEL 4.7).
    The audit is required in the sense that the before images of the data, and information on who changed the data, are required. I have looked at options like Oracle Auditing, FGA, and so on, but these cannot give me an audit of the data changes: what changed, when, and by whom.
    The first thing that comes to mind is using triggers. Another option is using LogMiner. I have successfully tested both of these approaches. The environment is like this:
    1) For some critical tables for which audit is required, triggers were written (ours is an OLTP application).
    2) For some non-critical tables, LogMiner is used, called by a stored procedure that runs periodically.
    3) Audit data is stored in a different schema, with the same table names as in the base schema.
    I would like to know your thoughts on this.
    Thank You,
    SSN

    The delay with LogMiner is acceptable for some less critical audit tables, and as you said, triggers can be implemented for the critical tables.
    One bottleneck with using LogMiner is that it depends on the availability of archived redo logs: the backup mechanism, if any, must make sure that no archived logs are removed from the locations specified for the audit program before the periodically running audit job has processed them.
    Wondering if there is any other recommended approach for this.
    Thanks
    SSN
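
    A minimal sketch of the trigger approach from point 1, assuming a hypothetical base table emp and a same-named audit table in a separate audit schema with extra who/when/action columns (this is illustrative, not the actual code from the poster's system):

    CREATE OR REPLACE TRIGGER emp_audit_trg
    AFTER UPDATE OR DELETE ON emp
    FOR EACH ROW
    DECLARE
      l_action VARCHAR2(1);
    BEGIN
      IF DELETING THEN
        l_action := 'D';
      ELSE
        l_action := 'U';
      END IF;
      -- Store the before image plus who made the change and when.
      INSERT INTO audit_schema.emp
        (empno, ename, sal, audit_user, audit_time, audit_action)
      VALUES
        (:OLD.empno, :OLD.ename, :OLD.sal,
         SYS_CONTEXT('USERENV', 'SESSION_USER'), SYSTIMESTAMP, l_action);
    END;
    /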

  • LogMiner issue: not getting the proper auditing data

    Hi,
    I am using the LogMiner utility on my test database to view the contents of archived logs which I copied from my prod database. I have set everything up and started trying to view SQL_REDO from v$logmnr_contents.
    The output looks a bit strange, as it has picked up the object_id and some hexadecimal values for the columns.
    Here it is -
    SQL> set lines 120
    SQL> set pages 300
    SQL> select sql_redo FROM v$logmnr_contents;
    set transaction read write;
    update "UNKNOWN"."OBJ# 979115" set "COL 1" = HEXTORAW('c40b072041') where "COL 1" = HEXTORAW('3b5b5f462666') and ROWID =
    'AADvCrACCAAAAQ2AAA';
    delete from "UNKNOWN"."OBJ# 291676" where "COL 1" = HEXTORAW('c4504d3a42') and "COL 2" = HEXTORAW('4220') and "COL 3" =
    HEXTORAW('c3064c4f1d') and "COL 4" = HEXTORAW('80') and "COL 5" = HEXTORAW('80') and "COL 6" = HEXTORAW('80') and "COL 7
    " = HEXTORAW('80') and "COL 8" = HEXTORAW('80') and "COL 9" = HEXTORAW('80') and "COL 10" = HEXTORAW('c3091a0a45') and "
    COL 11" = HEXTORAW('c3091a0a45') and "COL 12" = HEXTORAW('80') and "COL 13" = HEXTORAW('80') and "COL 14" = HEXTORAW('80
    ') and "COL 15" = HEXTORAW('c4504d3a42') and "COL 16" = HEXTORAW('786d08150e080a') and "COL 17" = HEXTORAW('786d08150e08
    0a') and "COL 18" = HEXTORAW('534554544c455f55534552') and "COL 19" = HEXTORAW('534554544c455f55534552') and ROWID = 'AA
    BHNcAA9AABX7KABy';
    (I guess the dictionary information is missing.)
    However, when I queried dba_objects on my prod database, I got the object_name with its type:
    OWNER OBJECT_NAME OBJECT_TYPE CREATED
    SSCHEMA SCHEMATABLE TABLE 09-FEB-08
    Please suggest: is there any way I can retrieve the proper auditing data?

    Thanks for the reply!
    Another thing I missed mentioning here: our test database is in NOARCHIVELOG mode. As per your suggestion, it is giving me the below error:
    SQL> EXECUTE DBMS_LOGMNR_D.BUILD( -
    OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);
    BEGIN DBMS_LOGMNR_D.BUILD( OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS); END;*
    ERROR at line 1:
    ORA-01325: archive log mode must be enabled to build into the logstream
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 3172
    ORA-00258: manual archiving in NOARCHIVELOG mode must identify log
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5786
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5884
    ORA-06512: at "SYS.DBMS_LOGMNR_D", line 12
    ORA-06512: at line 1
    Please suggest.
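
    Since STORE_IN_REDO_LOGS needs ARCHIVELOG mode on the database where it runs, the usual alternative is to extract the dictionary to a flat file on the production database (where the objects actually exist) and point LogMiner at that file on the test database. A sketch with hypothetical paths and log names; dictionary_location must be a directory listed in UTL_FILE_DIR on prod:

    -- On PROD: write the data dictionary to a flat file.
    EXECUTE DBMS_LOGMNR_D.BUILD( -
      dictionary_filename => 'logmnr_dict.ora', -
      dictionary_location => '/tmp', -
      options             => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);
    -- On TEST: register the copied archived log(s).
    EXECUTE DBMS_LOGMNR.ADD_LOGFILE( -
      logfilename => '/copied_logs/arch_0001.arc', -
      options     => DBMS_LOGMNR.NEW);
    -- Start LogMiner with the copied dictionary file.
    EXECUTE DBMS_LOGMNR.START_LOGMNR( -
      dictfilename => '/tmp/logmnr_dict.ora');
    -- The OBJ# / HEXTORAW output should now resolve to real names.
    SELECT sql_redo FROM v$logmnr_contents;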
