Recording column-level changes to a table for audit.
Hi Experts,
I need suggestions on recording column-level changes in tables for internal audit purposes. Our company policy does not allow us to enable CDC or Change Tracking (CT) at the database level, so I am looking for the best alternative ways of doing this.
I know we can use triggers and the OUTPUT clause to record changes on tables, but is there any other lightweight solution for recording changes on transaction tables, which are very large?
Seeking your guidance on the solution side.
Shivraj Patil.
The OUTPUT clause should be the best choice for your case, as it is much lighter-weight than a trigger.
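As a hedged sketch (the table and column names here are assumptions, not from your schema), an UPDATE with OUTPUT ... INTO can write the old and new values straight into an audit table in the same statement:

```sql
-- Hypothetical tables: dbo.Orders and an audit table dbo.Orders_Audit
UPDATE dbo.Orders
   SET Status = 'SHIPPED'
OUTPUT inserted.OrderID,
       deleted.Status,          -- old value
       inserted.Status,         -- new value
       SYSUTCDATETIME()
  INTO dbo.Orders_Audit (OrderID, OldStatus, NewStatus, ChangedAt)
 WHERE OrderID = 42;
```

One caveat: OUTPUT only captures changes made through statements that include the clause; ad-hoc updates that omit it are not recorded, which is the usual trade-off versus triggers.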
Similar Messages
-
How may I include the table for audit?
How may I include global tables for audit? We are going to control changes.
Do you know which table contains that list?
Thanks
Hi Marina,
The following is the information on table log:
You must start the SAP system with the rec/client parameter. This parameter specifies whether the SAP system logs changes to table data in all clients or only in specific clients. The parameter can have
the following values:
- rec/client = ALL logs all clients
- rec/client = 000 [,...] logs the specified clients
- rec/client = OFF turns logging off
In the technical settings (use transaction SE13), set the Log data changes flag for those tables that you want to have logged.
If both of these conditions are met, the database logs table changes in the DBTABPRT table. You can also check at transaction SCU3.
Use the RSTBHIST report to obtain a list of those tables that are currently set to be logged.
Hope it helps.
Please award points if it is useful.
Thanks & Regards,
Santosh -
Update trigger on table for audit purposes to record column-level information
Hi,
I want to create an update trigger which will record data in an audit table.
In audit table I want to have columns like old_value_of_Col and new_value_of_Col.
This is easily achievable. But my main concern is to add three more columns having the information of:
1) Which column was updated?
2) Old value of column that was updated
3) New value of column that was updated
Also, if one updates three columns, for example, then I would like to have three entries in the audit table for the corresponding three columns.
Please help.
Thanks in advance.
A few approaches to consider.
First, if you are on 11g, take a look at Flashback Data Archive: http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28424/adfns_flashback.htm#ADFNS01011
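For reference, a minimal Flashback Data Archive sketch (11g syntax; the tablespace and table names below are assumptions):

```sql
-- Create an archive and attach a table to it (hypothetical names):
CREATE FLASHBACK ARCHIVE audit_fba TABLESPACE fba_ts RETENTION 5 YEAR;
ALTER TABLE orders FLASHBACK ARCHIVE audit_fba;
-- Historical rows can then be queried with AS OF / VERSIONS BETWEEN:
SELECT * FROM orders AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' DAY);
```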
Second, you can use Fine-Grained Auditing to capture DML statements and bind variables. It is not as easy to reconstruct the before/after picture, but it may be sufficient for some purposes.
For trigger-based solutions, I have seen the approach you propose (one row for each column changed), and it is tedious and prone to maintenance headaches. You have to write code for each column, doing compares (remembering to consider NULLs), dealing with different datatypes, etc. We used that design because there was an actual requirement to produce a report that needed such a structure.
An easier trigger-based solution is to create a history table for the table you want to track, with all of the columns from the original, plus whatever else you need for housekeeping. Write an After Insert/Update/Delete trigger on your base table, and populate/insert rows into the history table:
- For inserts, populate and insert a row from the :new values
- For deletes, populate and insert a row from the :old values
- For updates, populate and insert a row from the :old values and another from the :new values
I would also have a column to designate whether this is an Insert, Delete, Update-Old or Update-New row.
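A minimal sketch of that trigger, assuming a hypothetical emp base table and an emp_hist history table with a dml_type housekeeping column:

```sql
CREATE OR REPLACE TRIGGER trg_emp_hist
AFTER INSERT OR UPDATE OR DELETE ON emp
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    INSERT INTO emp_hist (empno, ename, sal, dml_type, changed_on)
    VALUES (:new.empno, :new.ename, :new.sal, 'I', SYSDATE);
  ELSIF DELETING THEN
    INSERT INTO emp_hist (empno, ename, sal, dml_type, changed_on)
    VALUES (:old.empno, :old.ename, :old.sal, 'D', SYSDATE);
  ELSE
    -- Updates get two rows: the before image and the after image
    INSERT INTO emp_hist (empno, ename, sal, dml_type, changed_on)
    VALUES (:old.empno, :old.ename, :old.sal, 'U-OLD', SYSDATE);
    INSERT INTO emp_hist (empno, ename, sal, dml_type, changed_on)
    VALUES (:new.empno, :new.ename, :new.sal, 'U-NEW', SYSDATE);
  END IF;
END;
/
```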
Once you have done one example, the rest are easy. If you were sufficiently motivated (I have not yet been :-) ), you could probably write a script to query DBA_TAB_COLS and generate the code. -
Change log table for Vendor Bidder information?
Hi,
In BBPMAININT transaction we have BIDDER DATA tab.
If we change any Classification data in BIDDER DATA tab for the Vendor in which table it updates the change log information?
I checked in BBP_SUPP_MONI table this table updates only when we change any Company data information for the vendor not Bidder tab.
Can anyone help me?
Thanks.
Hi
There is no standard report.
But you can check in CDHDR & CDPOS tables for change log.
Object Class= BUPA_BBP0020
Tabname=BBPM_BUT_FRG0021
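Schematically (e.g. via SE16 or a quick query), the change documents can be located like this; the selection values are the ones Ramki gives above:

```sql
-- Header entries for the business partner change object
SELECT * FROM CDHDR WHERE OBJECTCLAS = 'BUPA_BBP0020';
-- Item-level entries showing old/new field values
SELECT * FROM CDPOS
 WHERE OBJECTCLAS = 'BUPA_BBP0020'
   AND TABNAME    = 'BBPM_BUT_FRG0021';
```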
Best regards
Ramki -
Changing color table for intensity graphs
Hi, I've been working on superimposing intensity graphs and I'm almost getting it; I'm just having trouble with the color table, which is supposed to be composed of a 1-D array of 256 colors.
I have 2 intensity graphs, each with a different color table (256 colors - 256 locations in the color table array). I need to "squeeze" both color tables into a third color table for superimposing the 2 intensity graphs. Therefore I figured I would take every other color value from each of the first 2 color tables (so each color table shrinks to 128 colors) and combine the values to form a 3rd color table, so that the bottom will correspond to the first color table and the top will correspond to the 2nd color table.
I can do this manually, but it takes WAY too long, so I've been trying to use some kind of algorithm, but I cannot succeed. Does anyone have any suggestions? It seems that the color table is an unusual kind of array.
My attempted program is attached.
thanks very much!!
Attachments:
superimposing3.vi 111 KB
Brian,
I think this is what you're after (attached). The easiest thing to do is to decimate, then concatenate your original color tables to form your new "squeezed" color table. However, keep in mind that you also have to squeeze, or remap, your underlying data values to correspond with either one half of the color table or the other.
See if what I did makes sense.
Regards,
John
Attachments:
superimposing3_MOD.vi 90 KB -
How to update a column in a nested table for a given record in the master table
Hi I have translations for all attributes of an item stored as a nested table
CREATE OR REPLACE TYPE T_ITM_ATTR AS OBJECT(
ATTR_NM VARCHAR2(30),
ATTR_VAL VARCHAR2(200 CHAR),
ATTR_STS_BL NUMBER(1))
INSTANTIABLE
FINAL;
CREATE OR REPLACE TYPE T_ITM_ATTRIBUTES AS TABLE OF T_ITM_ATTR;
CREATE TABLE XGN_MOD_ITEMS_T (
IDS NUMBER,
MOD_IDS NUMBER NOT NULL,
MOD_ITM_IDS NUMBER NOT NULL,
LGG_ID VARCHAR2(3 CHAR) NOT NULL,
ITM_TYPE VARCHAR2(50 CHAR) NOT NULL,
ITM_NM VARCHAR2(50 CHAR) NOT NULL,
ITM_BLOCK VARCHAR2(50 CHAR),
ITM_ATTR T_ITM_ATTRIBUTES,
ITM_COL1 VARCHAR2(1 CHAR),
ITM_DSC VARCHAR2(100 CHAR),
CREATED_BY VARCHAR2(30 CHAR) DEFAULT USER NOT NULL,
CREATION_DATE DATE DEFAULT SYSDATE NOT NULL,
LAST_UPDATED_BY VARCHAR2(30 CHAR),
LAST_UPDATE_DATE DATE
)
NESTED TABLE ITM_ATTR STORE AS NESTED_ITM_ATTR_T
TABLESPACE XGN4_TAB
PCTUSED 40
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
FREELISTS 1
FREELIST GROUPS 1
BUFFER_POOL DEFAULT
);
What I want to do is update only the attr_val of each item to a value coming from a temporary table where the user inserted his translations.
So how can I do this update?
This doesn't work, since I would have to know the previous value:
update table(
select t2.attr_val
from XGN_MOD_ITEMS_T t1, table(t1.itm_attr) t2
where t1.mod_itm_ids=160) attr
set value(attr) = 'Profil'
where value(attr) = 'Profile'
This updates all occurrences for all entries, which doesn't work either, because I have another record for each language:
UPDATE /*+ NESTED_TABLE_GET_REFS */
NESTED_ITM_ATTR_T
SET attr_val = 'SHIT'
WHERE attr_val = 'Profile'
http://www.psoug.org/reference/nested_tab.html
Look for UPDATE. There is a working demo on the page.
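As a hedged sketch based on that demo, the trick is to scope the TABLE() expression to a single parent row and filter on the attribute name rather than its current value (the lgg_id filter and attribute name below are assumptions for illustration):

```sql
UPDATE TABLE(
         SELECT t1.itm_attr
           FROM XGN_MOD_ITEMS_T t1
          WHERE t1.mod_itm_ids = 160
            AND t1.lgg_id = 'FR'        -- assumed: pick one language row
       ) attr
   SET attr.attr_val = 'Profil'
 WHERE attr.attr_nm  = 'PROFILE_LABEL'; -- hypothetical attribute name
```

Note the subquery must return exactly one collection, i.e. it must identify a single row of the master table.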
That said, nested tables are not a good place to store data. Reconsider using relational tables with, if necessary, object views. -
Column Level Privileges on Table
I want to give a user the right to select just one column of a table, and not even view the other columns.
A quick glance at the documentation would have quickly resolved this for you.
"You can specify columns only when granting the INSERT, REFERENCES, or UPDATE privilege. If you do not list columns, then the grantee has the specified privilege on all columns in the table or view." (my emphasis)
If you are on 10gR2 you do have the option of implementing column-oriented Row Level Security but that is a big thing to undertake. Find out more.
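To illustrate both halves of that answer (the object and user names here are hypothetical):

```sql
-- Column lists are allowed only for INSERT, REFERENCES and UPDATE:
GRANT UPDATE (salary) ON hr.employees TO scott;
-- For a column-restricted SELECT, expose a view over that one column instead:
CREATE VIEW hr.emp_names_v AS SELECT last_name FROM hr.employees;
GRANT SELECT ON hr.emp_names_v TO scott;
```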
Cheers, APC -
Configuring grades and levels in the T510 table for Payroll
Hi Experts,
We have a scenario in an implementation where we have grades with Level 1 ... Level 8.
There is a minimum-maximum amount for each level in a particular grade (like L1-A: Min-Max, L1-B: Min-Max).
How do we handle this kind of situation in the payroll configuration in the T510 table?
Regards,
Irfan
Hi Experts,
I am doing this for international payroll driver. Our OM and PA are integrated.
I will repeat my question
Level 1 - A : 250 - 300
Level 1 - B : 301 - 400
Level 1 - C : 401 - 450
Level 1 - D : 451- 500
I want to make this appear in 1005 infotype.
But instead I am suggesting to the client:
Level 1 - A : 300
Level 1 - B : 400
Level 1 - C : 450
Level 1 - D : 500
In the 1005 infotype.
Please advise.
Regards,
IFF
Edited by: IFF on Nov 12, 2010 11:47 AM -
Change dimension tables for an infocube
hello all,
We have an existing cube where a few of the dimension tables are quite big compared to the size of the fact table. We understand enough about getting the correct dimension table sizes; our question revolves around the existing cube.
Meaning, when we make the dimension table changes in dev and transport them to val, will the transport be successful with data in the cube (so that it is stored in the format of the old dimension table structure)? Do we have to dump the cube before transport and then reload after the transport is successful?
Thanks so much!
Hi,
Please check the below link
Re: Cube Transport issue
For remodelling,
http://www.sdn.sap.com/irj/scn/weblogs;jsessionid=(J2EE3417700)ID0196158950DB01648128345571943727End?blog=/pub/wlg/5772
Thanks -
Is it possible to Delete all Records of Std. SAP tables for Quality Server
Hi,
We want to delete some sensitive information from the Test (Quality) server.
For standard SAP tables, is it possible to delete all records?
Hi,
Through an ABAP program it is possible. You can develop a custom program on the development server, then transport it to QS and run it there.
Regards,
Aalok
Edited by: aalokg on May 13, 2010 12:19 PM -
Dears,
I need to find out the tables which maintain the Audit/Security Log and can be accessed through ABAP or any other programming language.
I have already seen the reports from SM30 and SM31, but what I need is the tables these reports are fetched from.
Thanks & Regards,
Hi,
After I turned on the audit trail on the database server and issued a DML statement, I found nothing in the audit table.
1) Set "audit_trial" = true in the init.ora file
2) Ran $ORACLE_HOME/rdbms/admin/cataudit.sql
3) Connected as sys/password as sysdba and issued the following command:
- AUDIT SELECT TABLE, UPDATE TABLE, INSERT TABLE, DELETE TABLE BY APPS BY ACCESS;
Do I need to restart the database after step 1? Or have I made some mistake?
Thanks. -
Hi All -
I have a table which has 6 columns
create table main_tbl
(p_id integer,
p_lname varchar2(20),
p_fname varchar2(20),
p_dept varchar2(15),
p_office varchar2(15),
p_ind char(1)
);
And I have a corresponding audit/history table
create table main_tbl_audit
(p_id integer,
p_lname varchar2(20),
p_fname varchar2(20),
p_dept varchar2(15),
p_office varchar2(15),
p_ind char(1)
);
As part of the application audit process, I want to move a record from main_tbl to main_tbl_audit only when a column value changes. I can do this using a column level trigger but if I use column level trigger and if for example 3 values are changed in the main_tbl at once then it will create 3 different rows in the main_tbl_audit table.
Is there a way to always create one row in main_tbl_audit table even if a record in the main_tbl table has one or more column value changes.
Please share your expertise.
Thanks,
Seenu001
I'm not quite sure what you mean by a "column-level trigger", since there is no such thing in Oracle. You can specify a list of columns in the OF clause of a row-level trigger, so I'm guessing that's what you're talking about. But then I don't understand why you would get three rows in the audit table. Unless you created three different row-level triggers, each of which specified a single column?
Why wouldn't you simply have a single row-level trigger that compared the old and new values, i.e. (ignoring NULLs)
CREATE OR REPLACE TRIGGER trg_audit_main
BEFORE UPDATE ON main_tbl
FOR EACH ROW
BEGIN
  IF( :new.p_lname != :old.p_lname or
      :new.p_fname != :old.p_fname or
      :new.p_dept != :old.p_dept or
      :new.p_office != :old.p_office or
      :new.p_ind != :old.p_ind )
  THEN
    INSERT INTO main_tbl_audit (p_id, p_lname, p_fname, p_dept, p_office, p_ind)
    VALUES (:old.p_id, :old.p_lname, :old.p_fname, :old.p_dept, :old.p_office, :old.p_ind);
  END IF;
END;
/
Justin
Hi,
I am designing a schema in which all the data tables will have created_by, created_on, modified_by and modified_on columns, to find what changes were made, when, and by whom to the records of the tables, for audit purposes. These audit columns are rarely used in queries.
Apart from the audit columns above, these tables will have on average 20-25 application data columns at most.
Will these additional audit columns impact the DML performance on the data tables?
If it is, will it be a good idea creating separate audit table with all these audit columns having one-to-one relationship with the actual data tables? Will this separation really reduce the load on actual data tables and improve the DML performance?
In case of creating separate audit tables, is it advisable creating these audit tables in separate schema? So that, the audit tables and related indexes will be created on a different data file and hence archiving, backup and recovery of actual data tables will be easy.
To update these audit columns in separate tables, is using insert, update & delete triggers advisable?
Please advise.
Thanks in advance
Sathish
Welcome to OTN forums!
What's the db version in 4 digits and o/s information?
Why are you reinventing the wheel by adding such columns to the table? Why not take advantage of the auditing facility that is already there in the database?
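A minimal sketch of that built-in facility (standard auditing; the schema and table names are assumptions, and the audit_trail parameter must be enabled first):

```sql
-- Audit all DML on one table, one record per access:
AUDIT SELECT, INSERT, UPDATE, DELETE ON app_owner.orders BY ACCESS;
-- Who did what, and when, is then visible in the audit trail:
SELECT username, action_name, timestamp
  FROM dba_audit_trail
 WHERE obj_name = 'ORDERS';
```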
Aman.... -
Designing metadataLogging table for data warehouse
Hi:
I am in the process of creating metadata logging tables for the DW.
I have thought of 5 tables to track errors and check ETL execution.
Master table for all major jobs. Environment: Oracle 9i. DW size not more than 30 MB to start with.
CREATE TABLE ADMIN_JOB_MASTER_TBL (
JOBNUMBER NUMBER,
JOBNAME VARCHAR2(768 BYTE),
MINTHRESHRECORDS NUMBER,
MAXTHRESHRECORDS NUMBER,
MINTHRESHTIME VARCHAR2(64 BYTE),
MAXTHRESHTIME VARCHAR2(64 BYTE),
CREATEDATE DATE
);
Audit Table for auditing jobs.
CREATE TABLE ADMIN_AUDIT_JOBS_TBL (
JOBNUMBER NUMBER,
JOBNAME VARCHAR2(768 BYTE),
STARTDATE DATE,
ENDDATE DATE,
NUMBERRECORDS NUMBER,
FAIL_OR_SUCCESS VARCHAR2(64 BYTE)
);
Step Master: for jobs involving various steps and logic such as branching, loops, flow etc., breaking the job master down to a lower level for more insight.
audit_step
CREATE TABLE ADMIN_AUDIT_STEP_TBL (
RECORDID VARCHAR2(256 BYTE),
JOBAUDITID VARCHAR2(256 BYTE),
STEPNUMBER NUMBER,
PARAMETERS VARCHAR2(512 BYTE),
NUMBERRECORDS NUMBER,
STARTDATE DATE,
ENDDATE DATE,
USERNAME VARCHAR2(256 BYTE)
);
Error table: to track errors during execution.
CREATE TABLE ERROR_TBL (
ERROR_NO NUMBER,
TABLE_NAME VARCHAR2(124 BYTE),
ERR_NUM NUMBER,
ERR_MSG VARCHAR2(124 BYTE),
ERROR_TIME DATE
);
I am planning to load the master tables manually with expected values,
and during each run, to load the auditing tables.
The error table would of course be loaded during the run.
So every time a threshold changes, I would have to update the master table manually. I don't mind doing that initially.
Would the above tables and the stated approach be good for ETL?
Please let me know your thoughts, guys. Suggest changes to the tables or the approach if you feel something is wrong.
Thanks in advance. All inputs are highly appreciated !!!
Regards
Som
Hi,
What better than what Oracle suggests... is there?
Have you read Oracle doc titled...
"Data Warehousing Guide" and the "Handling Data Errors with an Error Logging Table " section for example...
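For example, the DML error logging described there looks roughly like this (the table and tag names are hypothetical; note this feature needs 10gR2 or later, so it would not apply directly to the 9i environment mentioned above, but it shows the pattern the guide describes):

```sql
-- Create the error table once (produces ERR$_TARGET_TBL by default):
EXEC DBMS_ERRLOG.CREATE_ERROR_LOG('TARGET_TBL');
-- Loads then divert bad rows instead of failing the whole statement:
INSERT INTO target_tbl
SELECT * FROM staging_tbl
LOG ERRORS INTO err$_target_tbl ('nightly_load') REJECT LIMIT UNLIMITED;
```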
Regards,
Simon -
ODS Activation Returns 0 records in the Change Log
Hi All,
I have question regarding the Change Log build after the ODS req has been activated.
BW system - SAP NetWeaver BI 7.0 - Level 15
Upon loading a delta request into the ODS and activating it, no records are added to the change log table, but the active table gets the correct records.
Checks I Did
ODS -> Menu-> Environment -> Delete Change Log Data.
No job scheduled or runing to delete the change log reqs (sm37)
RSRV - Check the Status of the Generated Program of a Data Store Object
ODS - reactivation to see if it works..
ODS Manage - Auto Req Processing set Quality status, Activate DSO to Yes.
The activation log shows at the end that no changed records were added:
Data pkgs 000001; Added records 0; Changed records 0; Deleted records 0
Log for activation request ODSR_4APQQHKM459YNO7CIGCE0629U data package 000001...000001
Activation of records from DataStore object C_PUR_C successfully finished
U table for DataStore object C_PUR_C deleted successfully
Job finished
Can anyone help on the above issue??
Thanks
Rao
Hello Siggi,
Referring to the previous post:
I have a situation where a few documents are not transferred to the change log during DSO activation, though the after-image values of these records are reflected in the active table.
I have found one common pattern for the documents which are not recorded in the change log:
they have two pairs of after- and before-image records.
First pair (e.g. Overall Status as the only data field):
Sales Doc   Overall Status   Recordmode
1001        C                X
1001        A
Second pair:
1001        A                X
1001        C
If we consolidate all these 4 records, there seems to be no change in any data field. Should I consider this the standard behaviour when activating the DSO, causing these records not to get recorded in the change log?
Or is there any SAP Note that can make sure the change log gets populated in such a case?
Hope my details are clear to you..
- Varma