Triggers Vs Audit data
Is it possible to send the changes made in the database to a
flat file or to a MOM queue in real time, as and when they are
committed?
Is it possible to do that using triggers? How will the trigger
come to know that the transaction is committed?
I don't want to do auditing using triggers, as then I will have
to keep polling the audit tables.
Thanx in advance,
Rgrds,
Kailash M. L.
Hi...
I haven't done this, but here's an idea. I've used this
approach on similar things.
Have the trigger write the changes to a table designed to hold
the changes made to the desired tables.
Insert to these tables using triggers on the desired DML
statements.
In the same triggers, use DBMS_JOB.SUBMIT to schedule a
background job to write the changes to the flat file. This job
would also delete the record in the change-recording table. It
would probably be a good idea to pass the job a unique id
for the recording-table row.
This takes advantage of the fact that the background job is not
scheduled/run until a COMMIT is performed in the transaction
that calls DBMS_JOB. If the transaction containing the DML and
the DBMS_JOB submission is rolled back, then the records in the
recording table and job scheduling tables are rolled back too, and
the background job is never run.
For critical applications, it probably would be a good idea to
write to an error table when the background job fails, as can
happen when writing to flat files.
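A minimal sketch of that pattern, assuming a hypothetical change-log table EMP_CHANGES, a sequence EMP_CHANGES_SEQ, and a WRITE_CHANGES_TO_FILE procedure you would implement yourself with UTL_FILE:
CREATE OR REPLACE TRIGGER trg_emp_capture
AFTER INSERT OR UPDATE OR DELETE ON emp
FOR EACH ROW
DECLARE
  v_id  NUMBER;
  v_job BINARY_INTEGER;
BEGIN
  SELECT emp_changes_seq.NEXTVAL INTO v_id FROM dual;
  -- Record the change; this insert rolls back together with the triggering DML
  INSERT INTO emp_changes (change_id, empno, changed_at)
  VALUES (v_id, NVL(:NEW.empno, :OLD.empno), SYSDATE);
  -- The job is only queued when the surrounding transaction commits;
  -- on rollback, both the change record and the job submission disappear
  DBMS_JOB.SUBMIT(
    job  => v_job,
    what => 'write_changes_to_file(' || v_id || ');');
END;
/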
Hope this helps. Good Luck.
Similar Messages
-
What is the best way to audit data
What is the best way to audit actual changes in the data, that is, to be able to see each insert, update, delete on a given row, when it happened, who did it, and what the row looked like before and after the change?
Currently, we have implemented our own auditing infrastructure where we generate standard triggers and an audit table to store OLD (values at the beginning of the Before Row timing point) and NEW (values at the beginning of the After Row timing point) values for every change.
I'm questioning this strategy because of the performance impact it has (significant, to say the least) and because it's something that a developer (confession, I'm the developer) came up with, rather than something a database administrator came up with. I've looked into Oracle Auditing, but this doesn't seem like we'd be able to go back and see what a row looked like at a given point in time. I've also looked at Flashbacks, but this seems like it would require a monumental amount of storage just to be able to go back a week, much less the years we currently keep this data.
Thanks,
Matt Knowles
Edited by: mattknowles on Jan 10, 2011 8:40 AM
mattknowles wrote:
What is the best way to audit actual changes in the data, that is, to be able to see each insert, update, delete on a given row, when it happened, who did it, and what the row looked like before and after the change?
Currently, we have implemented our own auditing infrastructure where we generate standard triggers and an audit table to store OLD (values at the beginning of the Before Row timing point) and NEW (values at the beginning of the After Row timing point) values for every change.
You can either:
1. Implement your own custom auditing (as you currently do)
2. Flashback Data Archive (11g). Requires license.
3. Version enable your tables with Workspace Manager. (A quick sketch of options 2 and 3 follows this list.)
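A rough sketch of options 2 and 3, assuming 11g syntax; the archive name, tablespace, retention, and history option are illustrative, not from the thread:
-- Option 2: Flashback Data Archive (licensed)
CREATE FLASHBACK ARCHIVE emp_archive
  TABLESPACE fda_ts
  RETENTION 5 YEAR;
ALTER TABLE emp_tab FLASHBACK ARCHIVE emp_archive;
-- Query a past image of a row:
SELECT * FROM emp_tab
  AS OF TIMESTAMP TO_TIMESTAMP('2011-01-03 09:00', 'YYYY-MM-DD HH24:MI')
 WHERE empno = 7369;
-- Option 3: version-enable the table with Workspace Manager
EXECUTE DBMS_WM.EnableVersioning('EMP_TAB', 'VIEW_WO_OVERWRITE');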
mattknowles wrote:
I'm questioning this strategy because of the performance impact it has (significant, to say the least) and because it's something that a developer (confession, I'm the developer) came up with, rather than something a database administrator came up with. I've looked into Oracle Auditing, but this doesn't seem like we'd be able to go back and see what a row looked like at a given point in time. I've also looked at Flashbacks, but this seems like it would require a monumental amount of storage just to be able to go back a week, much less the years we currently keep this data.
Unfortunately, auditing data always takes lots of space. You must also consider performance, as custom triggers and Workspace Manager will perform much slower than FDA if there is heavy DML on the table. -
Let us say I want to audit data updates and deletes on an existing table EMP_TAB that
has a few hundred thousand records.
I created a shadow table Emp_tab_audit and added a few audit columns.
Emp_tab (
Empno NUMBER NOT NULL,
Ename VARCHAR2(10),
Job VARCHAR2(9),
Mgr NUMBER(4),
Hiredate DATE,
Sal NUMBER(7,2),
Comm NUMBER(7,2),
Deptno NUMBER(2) NOT NULL);
CREATE TABLE Emp_tab_audit (
seq number,
operation varchar2(3),
username varchar2(20),  -- USER is a reserved word, so the column is named USERNAME
timestamp date,
ip_address varchar2(25),
terminal varchar2(10),
Empno NUMBER NOT NULL,
Ename VARCHAR2(10),
Job VARCHAR2(9),
Mgr NUMBER(4),
Hiredate DATE,
Sal NUMBER(7,2),
Comm NUMBER(7,2),
Deptno NUMBER(2) NOT NULL);
I am mostly interested in UPDATES and DELETES, but I decided to add INSERTS to have the full history for
each employee in one table (audit schema) instead of querying two tables all the time (production
table and audit table) to see the changes.
I created this AFTER INSERT, UPDATE, DELETE trigger.
I decided to copy the :NEW values for INSERT and UPDATE and :OLD values for DELETE.
See attached.
So when an insert happens, the first audit row is created in EMP_TAB_AUDIT.
When an update happens, a second row is created in EMP_TAB_AUDIT.
The problem I am facing is the old records that currently exist. If someone updates an old row, I am
copying the :NEW values, so I won't have a copy of the :OLD values unless I create 2 rows (one for
the old and one for the new).
Do you think I should copy all the hundreds of thousands of records to the AUDIT tables for this to
work? I am hesitant to do that.
Any better ideas? I am applying this solution to several tables (not just one).
This is also on 9i, and I don't have flexibility other than using a trigger to track data changes.
CREATE OR REPLACE TRIGGER TRG_EMP_AUDIT
AFTER INSERT OR DELETE OR UPDATE ON EMP_TAB
FOR EACH ROW
DECLARE
v_operation VARCHAR2(3) := NULL;
v_user VARCHAR2(20);
v_timestamp DATE;
v_ip_address VARCHAR2(25);
v_terminal VARCHAR2(10);
BEGIN
-- USERENV takes a quoted parameter name; SYS_CONTEXT works on 9i as well
v_user := SYS_CONTEXT('USERENV', 'SESSION_USER');
v_timestamp := SYSDATE;
v_ip_address := SYS_CONTEXT('USERENV', 'IP_ADDRESS');
v_terminal := SYS_CONTEXT('USERENV', 'TERMINAL');
IF INSERTING THEN
v_operation := 'INS';
ELSIF UPDATING THEN
v_operation := 'UPD';
ELSE
v_operation := 'DEL';
END IF;
IF INSERTING OR UPDATING THEN
-- Store the :NEW image for inserts and updates
INSERT INTO EMP_TAB_AUDIT (
seq,
operation,
username,
timestamp,
ip_address,
terminal,
empno,
ename,
job,
mgr,
hiredate,
sal,
comm,
deptno )
VALUES (
audit_seq.nextval,
v_operation,
v_user,
v_timestamp,
v_ip_address,
v_terminal,
:new.empno,
:new.ename,
:new.job,
:new.mgr,
:new.hiredate,
:new.sal,
:new.comm,
:new.deptno);
ELSE
-- Store the :OLD image for deletes
INSERT INTO EMP_TAB_AUDIT (
seq,
operation,
username,
timestamp,
ip_address,
terminal,
empno,
ename,
job,
mgr,
hiredate,
sal,
comm,
deptno )
VALUES (
audit_seq.nextval,
v_operation,
v_user,
v_timestamp,
v_ip_address,
v_terminal,
:old.empno,
:old.ename,
:old.job,
:old.mgr,
:old.hiredate,
:old.sal,
:old.comm,
:old.deptno);
END IF;
END;
/
*******************************************************************************
For auditing you normally use AFTER triggers (not BEFORE), as there could be other triggers that change the :NEW data stored in the row.
You can capture :OLD values in an AFTER trigger too, but the issue is what you save on updates: if you store an image of the OLD row, you still need to check the current row to find out what has been updated. My preference was to store NEW values on updates. -
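One way to get both images without backfilling the existing rows is to write two audit rows on every update, one for the :OLD image and one for the :NEW image. A rough sketch of the UPDATE branch, assuming the operation column is widened to hold the longer codes (the codes and column subset are illustrative):
IF UPDATING THEN
  -- Old image first, then new image; both rows share the same timestamp
  INSERT INTO emp_tab_audit (seq, operation, username, timestamp, empno, ename, sal, deptno)
  VALUES (audit_seq.nextval, 'UPD_OLD', v_user, v_timestamp,
          :old.empno, :old.ename, :old.sal, :old.deptno);
  INSERT INTO emp_tab_audit (seq, operation, username, timestamp, empno, ename, sal, deptno)
  VALUES (audit_seq.nextval, 'UPD_NEW', v_user, v_timestamp,
          :new.empno, :new.ename, :new.sal, :new.deptno);
END IF;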
Options for auditing data changes
Hi Friends,
I thought I would get some input on the following implementation. The requirement is to audit some data changes within the system (Oracle 10.2 on RHEL 4.7).
The audit is required in the sense that the before images of the data, and information on who changed the data and when, are required. I have looked at options like Oracle Auditing and FGA, but these cannot give me an audit of the data changes themselves: what changed, when, and by whom.
The first thing that comes to mind is using triggers. Another option is using Log Miner. I have successfully tested both of these approaches. The environment is like this:
1) For some critical tables for which audit is required, triggers were written (ours is an OLTP application).
2) For some non-critical tables, Log Miner is used, called by a stored procedure which runs at certain periods.
3) Audit data is stored in a different schema, with the same table names as in the base schema.
I would like to know your thoughts on this.
Thank You,
SSN
The delay with Log Miner is acceptable for some less critical audit tables and, as you said, triggers can be implemented for some critical tables.
One bottleneck with using Log Miner is that it depends on the availability of archived redo logs: any backup mechanism must ensure that no archive logs are removed from the locations specified for the audit program until the periodically running audit job that uses Log Miner has processed them.
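A minimal sketch of such a periodic Log Miner pass; the archive log path is illustrative, and DICT_FROM_ONLINE_CATALOG assumes you mine on the source database:
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(
    logfilename => '/u01/arch/arch_1234.arc',
    options     => DBMS_LOGMNR.NEW);
  DBMS_LOGMNR.START_LOGMNR(
    options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
END;
/
-- Pull the changes for an audited table, then insert them into the audit schema
SELECT username, timestamp, operation, sql_redo, sql_undo
  FROM v$logmnr_contents
 WHERE seg_name = 'EMP_TAB'
   AND operation IN ('INSERT', 'UPDATE', 'DELETE');
EXECUTE DBMS_LOGMNR.END_LOGMNR;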
Wondering if there is any other recommended approach for this.
Thanks
SSN -
Action needs to be triggered in future date
Hi All,
We have a requirement where we need to trigger an Action on a contract's end date. How can we define the action so that it is triggered on a future date automatically, without any user intervention? I have checked with date profiles; that wouldn't help in this scenario.
Can we do this with events? How can we schedule an event so that it is triggered in the future (e.g. after 2 years)?
Thanks in Advance!
Regards,
Rajesh.
Hi Rajesh,
A lot depends on how you are planning to schedule the action in your case. With actions, a lot of different combinations of 'schedule condition', 'start condition', 'action merging' options and 'processing time' can be used to meet the requirements.
If you want to trigger the action immediately when the contract end date is reached, one option could be to use the condition contract end date = current date as the schedule condition and then use the following:
Processing Time = Immediate Processing
Action Merging = Max. 1 Action for Each Action Definition
Do not set a start condition.
Thanks & regards,
Ahmad -
PDF Forms in Acrobat 9 Pro
When auditing a file, I enter the date I am auditing it.
Example: 02/18/2015
I then enter dates of when information was last verified. Some of these dates need to fall within 30 days of the audit date I enter on any given day, others need to fall within 120 days, and another within 365 days. If they don't fall within the specified number of days for that field, I would like it to give an error message in that field.
Examples:
College Grad Date 01/15/2015
Certification Date 12/23/2015
License Date 02/13/2014 - This is over 365 days so I do not want the date, I want it to display "Need to re-verify"
I do not have any inkling of an idea of how to write this. Your help would be most appreciated.
Thank you.
Solved the problem - I contacted Apple on their support line found on this page: http://www.apple.com/uk/support/mac/ and they were really helpful and talked me through what to do on the phone. I now have all three productivity apps. Hope this helps.
-
Audit database. config auditing data source (DB2)
Hello expert,
I want to enable the audit database for BusinessObjects and I have followed the admin guide, but it does not work.
I installed BusinessObjects Enterprise XI 3.1 on AIX 5.3.
When I installed BO, I chose 'Use an existing database' and chose DB2 (the same database as the SAP BW database).
When the installation asked for the CMS and Audit database information, I filled in the same DB alias as the SAP BW database.
So now I have SAP BW, CMS and Audit data in the same database.
After installation I saw the CMS tables in the "DB2<aliasName>" schema,
but I cannot find the audit tables.
Will the audit tables be created after installation?
Then I tried to enable the audit database using cmsdbsetup.sh; I chose 'selectaudit' and filled in the information it requires.
It finished with no error:
"Select auditing data source was successful. Auditing Data Source setup finished."
But I still cannot find any audit tables in the database.
I ran serverconfig.sh and I can't see the Enable Audit option when I choose 'modify a server'.
Any idea?
Thanks in advance.
Chai
Hello,
Thanks for your reply.
It is not a BO cluster.
And the log detail from selecting the audit data source is shown below:
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST)
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) About to process commandline parameters...
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Finished processing commandline parameters.
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
Wed Nov 17 2010 10:17:02 GMT+0700 (THAIST) Select auditing data source was successful.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) /bodev/bobje/setup/jscripts//ccmunix.js started at Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST)
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) About to process commandline parameters...
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Finished processing commandline parameters.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No password supplied, setting password to default.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No username supplied, setting username and password to default.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No authentication type supplied, setting authentication type to default.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Warning: No CMS name supplied, setting CMS name to machine name.
Wed Nov 17 2010 10:25:22 GMT+0700 (THAIST) Select auditing data source was successful.
And the CMS log file did not show any error.
Additional detail:
- My BW and BO are in the same server.
- I already granted all the rights to the user related to the audit database.
- My BW and BO are in the same database.
- No audit tables appear in the database.
- No Fix pack installed.
I wonder why the BO audit connection did not see my database.
(In the case of DB2, I think the DB2 alias name is by default the same as the database name.
So if my database name is BWD then the database alias name should be BWD, am I right?)
Any idea?
Thanks in advance.
Chai -
Analysing Task Audit, Data Audit and Process Flow History
Hi,
Internal Audit dept has requested a bunch of information, that we need to compile from Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export required audit information?
We do have housekeeping in place, so the logs are partial "live" db tables, and partial purged tables that were exported to Excel to archive the historical log info.
Many Thanks.
I thought I posted this Friday, but I just noticed I never hit the 'Post Message' button, ha ha.
The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move it to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
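A minimal sketch of that dump step in T-SQL; the archive table name and date cutoff are illustrative, and in production you would wrap the two statements in a transaction:
INSERT INTO task_audit_archive
SELECT * FROM YourAppNameHere_task_audit
 WHERE cast(EndTime-2 as smalldatetime) < '1/1/2012';

DELETE FROM YourAppNameHere_task_audit
 WHERE cast(EndTime-2 as smalldatetime) < '1/1/2012';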
For instance, if I wanted to pull Metadata Load, Rules Load and Member List Load activity, you could run a query like this. (NOTE: @strAppName should be set to the name of your application.)
The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
-- Declare working variables --
declare @dtStartDate as nvarchar(20)
declare @dtEndDate as nvarchar(20)
declare @strAppName as nvarchar(20)
declare @strSQL as nvarchar(4000)
-- Initialize working variables --
set @dtStartDate = '1/1/2012'
set @dtEndDate = '8/31/2012'
set @strAppName = 'YourAppNameHere'
--Get Rules Load, Metadata, Member List
set @strSQL = '
select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (1)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (21)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (23)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
exec sp_executesql @strSQL
In regards to activity codes, here's a quick breakdown of those:
ActivityID ActivityName
0 Idle
1 Rules Load
2 Rules Scan
3 Rules Extract
4 Consolidation
5 Chart Logic
6 Translation
7 Custom Logic
8 Allocate
9 Data Load
10 Data Extract
11 Data Extract via HAL
12 Data Entry
13 Data Retrieval
14 Data Clear
15 Data Copy
16 Journal Entry
17 Journal Retrieval
18 Journal Posting
19 Journal Unposting
20 Journal Template Entry
21 Metadata Load
22 Metadata Extract
23 Member List Load
24 Member List Scan
25 Member List Extract
26 Security Load
27 Security Scan
28 Security Extract
29 Logon
30 Logon Failure
31 Logoff
32 External
33 Metadata Scan
34 Data Scan
35 Extended Analytics Export
36 Extended Analytics Schema Delete
37 Transactions Load
38 Transactions Extract
39 Document Attachments
40 Document Detachments
41 Create Transactions
42 Edit Transactions
43 Delete Transactions
44 Post Transactions
45 Unpost Transactions
46 Delete Invalid Records
47 Data Audit Purged
48 Task Audit Purged
49 Post All Transactions
50 Unpost All Transactions
51 Delete All Transactions
52 Unmatch All Transactions
53 Auto Match by ID
54 Auto Match by Account
55 Intercompany Matching Report by ID
56 Intercompany Matching Report by Acct
57 Intercompany Transaction Report
58 Manual Match
59 Unmatch Selected
60 Manage IC Periods
61 Lock/Unlock IC Entities
62 Manage IC Reason Codes
63 Null -
Hello,
An error is created while moving audit data from the existing runtime repository to the Control Center using the OWB Control Center Upgrade Assistant. Here OWBRTR2TST is the runtime repository in the Control Center.
Error as follows:
Exception:
Failed to Migrate Audit Data into OWBRTR2TST java.sql.SQLException: ORA-02298:
Can not validate (OWBRTR2TST.co.FK_PARENT_CO) - parent keys not found
ORA-06512 at line 16
What do we need to do to avoid this error?
Thanks
Santi
Hello,
I was getting an almost identical problem. Once you solve it, please let us know.
Mine is like this
Error Created while moving audit data from existing runtime repository to Control Center
Using OWB Control Center Upgrade Assistant. Here OWBRTR2TST is Runtime repository in
Control Center.
Exception:
Failed to Migrate Audit Data into OWBRTR2TST java.sql.SQLException: ORA-02298:
Can not validate (OWBRTR2TST.co.FK_PARENT_CO) - parent keys not found
ORA-06512 at line 16 -
How to show User Auditing data in dashboard/reports in MS CRM 2013 online?
HI,
I have a requirement to show user auditing details, like the user's last logged-in date and session time spent, in MS CRM 2013 Online.
I did not find any option to query user auditing data.
I found the Audit Summary view but don't know how to use it.
Could anyone suggest how I can achieve this?
Thanks
Baji Rahaman
Please try this:
Public Function Decompress(ByVal arr As Byte()) As Byte()
    Dim MS As New System.IO.MemoryStream()
    MS.Write(arr, 0, arr.Length)
    MS.Position = 0
    Dim stream As New System.IO.Compression.GZipStream(MS, System.IO.Compression.CompressionMode.Decompress)
    Dim temp As New System.IO.MemoryStream()
    Dim buffer As Byte() = New Byte(4095) {} ' 4096-byte read buffer (VB array bounds are inclusive)
    While True
        Try
            Dim read As Integer = stream.Read(buffer, 0, buffer.Length)
            If read <= 0 Then
                Exit While
            Else
                ' Write only the bytes actually read, not the whole buffer
                temp.Write(buffer, 0, read)
            End If
        Catch ex As Exception
            ' Input was not GZip-compressed; return whatever was decoded so far
            Exit While
        End Try
    End While
    stream.Close()
    Return temp.ToArray()
End Function
Thanks & Regards Manoj -
Audit - Not loading audit data into audit tables
Hi ,
Audit data is not loading into the audit tables in my XI 3.1 SP4 environment.
The audit database is in SQL Server 2008 R2.
The audit database is configured and audit events are enabled properly,
and the log files are also being created without any errors.
Please help me.
Edited by: Reddeppa Konduru on Nov 9, 2011 11:31 AM
Edited by: Reddeppa Konduru on Nov 9, 2011 12:31 PM
I am getting the below error:
bsystem_impl.cpp:2651: TraceLog message 1
2011/11/09 03:22:47.071|>>|A| |13664|9368| |||||||||||||||assert failure: (.\auditsubsystem_impl.cpp:2651). (false : Next event id value should be greater than the current one, check the auditee packing events code). -
XI R2 auditing data retention period
Hi,
Using XI R2 SP2 FP5 with a SQL Server database, I want to limit the amount of audit data held, ideally by date, i.e. only 6 months of data.
I would expect a setting somewhere to say keep x days of data, but I can't find one, and I can't find any reference to it in the documentation or on these forums.
Any help much appreciated.
John
Hello,
There is no way to restrict/purge audit data out of the box. You could, however, purge data in your DB as mentioned in SAP NOTE 1198638 - How to purge the BO AUDIT tables leaving only the last 6 months data, i.e.:
To purge the BO AUDIT tables, leaving only the last 6 months data, delete AUDIT_DETAIL with a join to AUDIT_EVENT and select the date for less then 6 months. Then delete the same period in AUDIT_EVENT.
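A rough sketch of that purge in SQL Server syntax; the join key (Event_ID) and timestamp column (Start_Timestamp) are assumptions to verify against your audit schema, and you should back the tables up first:
DELETE d
  FROM AUDIT_DETAIL d
  JOIN AUDIT_EVENT e ON e.Event_ID = d.Event_ID
 WHERE e.Start_Timestamp < DATEADD(month, -6, GETDATE());

DELETE FROM AUDIT_EVENT
 WHERE Start_Timestamp < DATEADD(month, -6, GETDATE());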
Note that this is apparently not supported. SAP NOTE 1406372 - Is it possible to purge Audit database entries?
Best,
Srinivas -
Collector goes down and audit data aren't in OAV Console
Hi,
I have OAV 10.2.3.2 with a DBAUD collector; the collection agent is on Windows. When I run this collector, it starts, but after a while it stops again. I can retrieve policies, create policies and provision them. Audit data is stored in AUD$, but I can't see it in the Audit Vault console.
In C:/oracle/product/10.2.3.2/av_agent_1/av/log/DBAUD_Collector_av_db2_2 is the following:
***** Started logging for 'AUD$ Audit Collector' *****
INFO @ '21/04/2010 12:04:45 02:00':
***** Collector Name = DBAUD_Collector
INFO @ '21/04/2010 12:04:45 02:00':
***** Source Name = av_db2
INFO @ '21/04/2010 12:04:45 02:00':
***** Av Name = AV
INFO @ '21/04/2010 12:04:45 02:00':
***** Initialization done OK
INFO @ '21/04/2010 12:04:45 02:00':
***** Starting CB
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
INFO @ '21/04/2010 12:04:46 02:00':
Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
INFO @ '21/04/2010 12:04:46 02:00':
***** CSDK inited OK + 1
INFO @ '21/04/2010 12:04:46 02:00':
***** Src alias = SRCDB2
INFO @ '21/04/2010 12:04:46 02:00':
***** SRC connected OK
INFO @ '21/04/2010 12:04:46 02:00':
***** SRC data retrieved OK
INFO @ '21/04/2010 12:04:46 02:00':
***** Recovery done OK
ERROR @ '21/04/2010 12:05:07 02:00':
On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
ORA-06512: na "AVSYS.DBMS_AUDIT_VAULT", line 6
ORA-06512: na "AVSYS.AV$DW", line 1022
ORA-01882: oblast časového pásma nebyla nalezena (timezone region not found)
ORA-06512: na "AVSYS.AV$DW", line 1290
ORA-06512: na line 1
ERROR @ '21/04/2010 12:05:08 02:00':
Collecting thread died with status 46821
INFO @ '21/04/2010 12:06:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:07:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:08:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:09:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:10:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:11:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:12:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:13:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:13:41 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:13:41 02:00':
***** Started logging for 'AUD$ Audit Collector' *****
INFO @ '21/04/2010 12:13:41 02:00':
***** Collector Name = DBAUD_Collector
INFO @ '21/04/2010 12:13:41 02:00':
***** Source Name = av_db2
INFO @ '21/04/2010 12:13:41 02:00':
***** Av Name = AV
INFO @ '21/04/2010 12:13:41 02:00':
***** Initialization done OK
INFO @ '21/04/2010 12:13:41 02:00':
***** Starting CB
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_DELAY_TIME|, got |20|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_SLEEP_TIME|, got |5000|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_ACTIVE_SLEEP_TIME|, got |1000|
INFO @ '21/04/2010 12:13:42 02:00':
Getting parameter |AUDAUDIT_MAX_PROCESS_RECORDS|, got |1000|
INFO @ '21/04/2010 12:13:42 02:00':
***** CSDK inited OK + 1
INFO @ '21/04/2010 12:13:42 02:00':
***** Src alias = SRCDB2
INFO @ '21/04/2010 12:13:42 02:00':
***** SRC connected OK
INFO @ '21/04/2010 12:13:42 02:00':
***** SRC data retrieved OK
INFO @ '21/04/2010 12:13:42 02:00':
***** Recovery done OK
ERROR @ '21/04/2010 12:14:03 02:00':
On line 1287; OAV-46599: internal error ORA-1882 count(DWFACT_P20100420_TMP) =
ORA-06512: na "AVSYS.DBMS_AUDIT_VAULT", line 6
ORA-06512: na "AVSYS.AV$DW", line 1022
ORA-01882: oblast časového pásma nebyla nalezena (timezone region not found)
ORA-06512: na "AVSYS.AV$DW", line 1290
ORA-06512: na line 1
ERROR @ '21/04/2010 12:14:05 02:00':
Error on get metric callback: 46821
ERROR @ '21/04/2010 12:14:05 02:00':
Collecting thread died with status 46821
ERROR @ '21/04/2010 12:14:06 02:00':
Receive error. NS error 12537
ERROR @ '21/04/2010 12:14:06 02:00':
Timeout for getting metrics reply!
INFO @ '21/04/2010 12:15:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:16:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
INFO @ '21/04/2010 12:17:01 02:00':
Could not call Listener, NS error 12541, 12560, 511, 2, 0
I've already followed the help in the administrator guide for 'Problem: Cannot start the DBAUD collector and the log file shows an error'. It says that if I can run the command
$ sqlplus /@SRCDB1
successfully, my source database is set up correctly. I ran this command successfully, but my problem wasn't solved.
Any advice, please?
I reinstalled AVS and the AV collection agent, applied patchset 10.2.3.2 again, and re-added the source database, agent and collectors, but I still have the same problem: the collector starts successfully, but after a while it stops, and in the AV console there is no data for the reports (I created some audit policies and alerts).
I tested the connection with sqlplus /@SRCDB1 and it is OK; the collector log is:
May 11, 2010 8:11:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:11:38 AM Thread-44 FINE: timer task interval calculated = 60000
May 11, 2010 8:11:38 AM Thread-44 FINE: Going to start collector, m_finalCommand=/bin/sh -c $ORACLE_HOME/bin/avaudcoll hostname="avtest.zcu.cz" sourcename="stag_db" collectorname="DBAUD_Collector" avname="AV" loglevel="INFO" command=START
May 11, 2010 8:11:46 AM Thread-44 FINE: collector started, exitval=0
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=IS_ALIVE value=true
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=BYTES_PER_SEC value=0.0000
May 11, 2010 8:11:49 AM Thread-44 FINE: return cached metric , name=RECORDS_PER_SEC value=0.0000
May 11, 2010 8:11:49 AM Thread-44 FINE: timer task going to be started...
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:12:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:13:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:14:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:15:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:16:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:17:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:18:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:19:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:20:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=IS_ALIVE value=false
May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=BYTES_PER_SEC value=0.00
May 11, 2010 8:21:15 AM Thread-43 FINE: return cached metric , name=RECORDS_PER_SEC value=0.00
May 11, 2010 8:21:15 AM Thread-43 FINE: Stop caching since it is NOT alive for 10 times of query
And the collected audit data is in the table av$rads_flat...
Please, does anyone know where the problem could be?
BPEL retrieving audit data in Java sample
Hello,
I am interested in using the BPEL audit facilities programmatically from inside a BPEL process. I had a look at the API documentation in the orabpel folder and noticed that there are facilities to add information to the audit trail (via <bpelx:exec> and the addAuditTrailEntry() methods). Using these seems to be straightforward.
I also noticed in the API that there are some methods for retrieving audit info from BPEL, based on an instance handle. I would be interested in seeing how some information could be obtained from this audit data in a Java program. Could you point me to an existing example of how to do this?
Do you have any other source of information regarding the audit process inside the BPEL?
Many thanks in advance!
Regards,
Marinel
Delayed response:
You can use these methods from java-exec to add your own audit entries. I don't think there is a way to retrieve the audit trail for the current instance from java-exec. If you want to retrieve an audit trail, use IInstanceHandle.getAuditTrail().
/**
 * Add a log entry into the audit trail.
 * @param message
 * @param detail
 */
abstract protected void addAuditTrailEntry( String message, Object detail );

/**
 * Add a log entry into the audit trail.
 * @param message
 */
abstract protected void addAuditTrailEntry( String message );

/**
 * Add an exception log entry into the audit trail.
 * @param t
 */
abstract protected void addAuditTrailEntry( Throwable t );
HTH.
Thanks,
Rakesh -
Logminer Issue : Not getting the proper auditing data
Hi,
I am using the LogMiner utility on my test database to view the contents of archived logs which I copied from my prod database. I have set up everything and started trying to view SQL_REDO from v$logmnr_contents.
The output looks a bit strange, as it shows object IDs and hexadecimal values for the columns.
Here it is -
SQL> set lines 120
SQL> set pages 300
SQL> select sql_redo FROM v$logmnr_contents;
set transaction read write;
update "UNKNOWN"."OBJ# 979115" set "COL 1" = HEXTORAW('c40b072041') where "COL 1" = HEXTORAW('3b5b5f462666') and ROWID =
'AADvCrACCAAAAQ2AAA';
delete from "UNKNOWN"."OBJ# 291676" where "COL 1" = HEXTORAW('c4504d3a42') and "COL 2" = HEXTORAW('4220') and "COL 3" =
HEXTORAW('c3064c4f1d') and "COL 4" = HEXTORAW('80') and "COL 5" = HEXTORAW('80') and "COL 6" = HEXTORAW('80') and "COL 7
" = HEXTORAW('80') and "COL 8" = HEXTORAW('80') and "COL 9" = HEXTORAW('80') and "COL 10" = HEXTORAW('c3091a0a45') and "
COL 11" = HEXTORAW('c3091a0a45') and "COL 12" = HEXTORAW('80') and "COL 13" = HEXTORAW('80') and "COL 14" = HEXTORAW('80
') and "COL 15" = HEXTORAW('c4504d3a42') and "COL 16" = HEXTORAW('786d08150e080a') and "COL 17" = HEXTORAW('786d08150e08
0a') and "COL 18" = HEXTORAW('534554544c455f55534552') and "COL 19" = HEXTORAW('534554544c455f55534552') and ROWID = 'AA
BHNcAA9AABX7KABy';
(I guess the dictionary information is missing.)
However, when I queried dba_objects on my prod database, I got the object_name with its type.
OWNER OBJECT_NAME OBJECT_TYPE CREATED
SSCHEMA SCHEMATABLE TABLE 09-FEB-08
Please suggest: is there any way I can retrieve the proper auditing data?
Thanks for the reply!
Another thing I missed mentioning is that our test database is in NOARCHIVELOG mode. As per your suggestion, it gives me the error below:
SQL> EXECUTE DBMS_LOGMNR_D.BUILD( -
OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);
BEGIN DBMS_LOGMNR_D.BUILD( OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS); END;*
ERROR at line 1:
ORA-01325: archive log mode must be enabled to build into the logstream
ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 3172
ORA-00258: manual archiving in NOARCHIVELOG mode must identify log
ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5786
ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 5884
ORA-06512: at "SYS.DBMS_LOGMNR_D", line 12
ORA-06512: at line 1
Please suggest.
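One workaround to sketch here, under the assumption that the prod (source) database can run it: extract the dictionary to a flat file instead of into the redo stream, which avoids the ARCHIVELOG requirement. This needs UTL_FILE_DIR to include the target directory; the path and file name are illustrative.
-- On the source (prod) database:
EXECUTE DBMS_LOGMNR_D.BUILD( -
  DICTIONARY_FILENAME => 'dictionary.ora', -
  DICTIONARY_LOCATION => '/u01/logmnr');
-- Copy dictionary.ora to the mining host, then point LogMiner at it:
EXECUTE DBMS_LOGMNR.START_LOGMNR( -
  DICTFILENAME => '/u01/logmnr/dictionary.ora');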