Multiple Audit Records created

We have seen that when executing certain workflows, IdM tends to log multiple audit records for the same task. This results in two identical records being created in the audit logs, as seen in the Today's Report.
Can anyone help with understanding why this occurs and how to resolve this?

We see this behavior frequently, and I'd love to know whether a product issue causes it.
One reason we sometimes see duplicate entries is that IDM actually updates the account twice. E.g., if a workflow loads a user, checks the user in, then later reloads the user and checks it in again, we see two updates.
A common cause of double saves seems to be that IDM stores dates in memory in a different format than it does in the database, yet it appears to compare the date values as strings. If you check out a view with a date in it and then save it (without making any changes), the record is still treated as updated. E.g., "1 Jan 2009 10:00:00AM" in IDM memory vs. "01/01/2009 10:00:00" in the database.
But that only explains some of the duplicates we see.
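
To illustrate the string-comparison point (this is only an illustration of the idea, not IdM's actual code path): the same instant rendered two ways compares as unequal when treated as text, so a check-in with no real edits still looks like an update.

-- Purely illustrative: two renderings of the same instant compared as strings
SELECT CASE
         WHEN '1 Jan 2009 10:00:00AM' = '01/01/2009 10:00:00'
         THEN 'no change detected'
         ELSE 'treated as updated'
       END AS comparison_result
FROM   dual;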

Similar Messages

  • Multiple MBEWH records created when creating new material with BAPI

    Hi,
    I am using BAPI_MATERIAL_SAVEDATA to create a material.
    When creating a brand new material, the BAPI creates multiple MBEWH records, which causes the Previous Period/Year Moving Average price in MBEW to be updated.
    Any suggestion as to why the BAPI would create historical data in MBEWH?
    Thank you very much.
    Oleg.

  • Multiple audit records per one report refresh

    We use BO XI R3 on Windows with CMS on DB2.
    I need to create audit reports on BO report usage with the following data:
    username, timestamp, duration, report name for DeskI and WebI reports.
    I use auditing on DesktopIntelligenceCacheServer and WebIntelligenceProcessingServer.
    The problem is that I'm getting multiple records in the AUDIT_EVENT table, with different event_ids and timestamps, per one refreshed report.
    The following SQL always returns 2 records from the WebI server (the first with duration=0) and from 1 to 5 records from the DeskI server, all of them with the real report duration.
    SELECT
    start_timestamp,
    ae.EVENT_ID,
    user_name,
    duration,
    object_type,
    detail_id,
    detail_text
    FROM BO_XI_R3.AUDIT_EVENT ae,
    bo_xi_r3.AUDIT_DETAIL ad,
    bo_xi_r3.EVENT_TYPE et
    where ae.EVENT_TYPE_ID=et.EVENT_TYPE_ID
    and ae.EVENT_ID=ad.EVENT_ID
    and ad.DETAIL_TYPE_ID=8
    and ae.EVENT_TYPE_ID=19
    order by 1;
    How to separate unique refresh info?
    Edited by: Valentin Volk on Oct 3, 2008 9:03 PM

    Valentin,
    The "duplicate" records that you are seeing: is it always consistent, or just sometimes? I ran your query against my Auditor database and sometimes (maybe less than 50% of the time) I am seeing "duplicate" records. I say "duplicate" because the Event_ID is different in each case, but by sorting by start_timestamp I can see a duration-0 record, and if the query takes, say, 5 seconds, then five seconds later I see the entry again (with a new Event_ID) and the second record has a duration of 5. What does all this mean? I don't know exactly, other than that BO sometimes sees the act of submitting a report as an activity (and records an "enter" record with a duration of zero, zero for obvious reasons), and then when the report ends another entry is written to record the duration. At other times I don't see the "enter" record, just the entry with the duration. In my practice we run a query similar to the one you've provided, but we do not report rows where duration is zero.
    thanks,
    John
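
    For reference, a variant of Valentin's query with that filter applied might look like this (same tables and joins as above; only the duration predicate is new):

    SELECT start_timestamp,
           ae.event_id,
           user_name,
           duration,
           object_type,
           detail_id,
           detail_text
    FROM   bo_xi_r3.audit_event  ae,
           bo_xi_r3.audit_detail ad,
           bo_xi_r3.event_type   et
    WHERE  ae.event_type_id  = et.event_type_id
    AND    ae.event_id       = ad.event_id
    AND    ad.detail_type_id = 8
    AND    ae.event_type_id  = 19
    AND    duration > 0                -- drop the zero-duration "enter" rows
    ORDER  BY start_timestamp;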

  • Loading multiple physical records into a logical record

    Hello,
    I'm not sure if this is the right place to post this thread.
    I have to import data from a fixed-length positioned text file into an Oracle table using SQL*Loader.
    My sample input file (which has 3 columns) looks like:
    Col1 Col2 Col3
    1 A abcdefgh
    1 A ijklmnop
    1 A pqrstuv
    1 B abcdefgh
    1 B ijklmn
    2 A abcdefgh
    3 C hello
    3 C world
    The above text file should be loaded into the table as:
    Col1 Col2 Col3
    1 A abcdefghijklmnpqrstuv
    1 B abcdefghijklmn
    2 A abcdefgh
    3 C helloworld
    My question: is there a way that I can use the logic of loading multiple physical records into a logical record for my Oracle tables? Please suggest.
    Thanks in advance.

    Hi,
    user1049091 wrote:
    Kulash,
    Thanks for your reply.
    The order of the concatenated strings is important, as the whole text is split into several physical records in the flat file and has to be combined into one record in the Oracle table.
    My scenario is that we get these fixed-length input files from mainframes on a daily basis, and the data needs to be loaded into an Oracle table for reporting purposes. It needs to be automated.
    I am still confused about whether to use an external table or a staging table loaded with SQL*Loader. Please advise with more clarity, as I am a beginner with SQL*Loader. Thanks.
    I still think an external table would be better.
    You can create the external table like this:
    CREATE TABLE fubar_external
    (   col1   NUMBER (2)
    ,   col2   VARCHAR2 (2)
    ,   col3   VARCHAR2 (50)
    )
    ORGANIZATION EXTERNAL
    (   TYPE ORACLE_LOADER
        DEFAULT DIRECTORY XYZ_DIR
        ACCESS PARAMETERS
        (   RECORDS DELIMITED BY NEWLINE
            FIELDS (   col1   POSITION (1:2)
                   ,   col2   POSITION (3:4)
                   ,   col3   POSITION (5:54)
                   )
        )
        LOCATION ('fubar.txt')
    );
    where XYZ_DIR is the Oracle Directory on the database server's file system, and fubar.txt is the name of the file on that directory. Every day, when you get new data, just overwrite fubar.txt. Whenever you query the table, Oracle will read the file that's currently in that directory. You don't have to drop and re-create the table every day.
    Note that the way you specify the columns is similar to how you do it in SQL*Loader, but the SEQUENCE generator doesn't work in external files; use ROWNUM instead.
    Do you need to populate a table with the concatenated col3's, or do you just need to display them in a query?
    Either way, you can reference the external table the same way you would reference a regular, internal table.
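
    If the concatenated col3 values do need to end up in one row per (col1, col2), a minimal sketch of the aggregation step against the external table above is below. It assumes Oracle 11.2 or later for LISTAGG; older releases need a SYS_CONNECT_BY_PATH-style workaround.

    -- Concatenate the physical records into one logical record per key.
    -- Ordering by ROWNUM assumes a serial read of the file, so the pieces
    -- come back in file order, as mentioned above.
    SELECT col1,
           col2,
           LISTAGG(col3) WITHIN GROUP (ORDER BY ROWNUM) AS col3
    FROM   fubar_external
    GROUP  BY col1, col2;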

  • At what point is a user record created in the WWSEC_PERSON$ table?

    Hello -
    We are using OID for our SSO into the Oracle Portal. After we sync AD -> OID and I query the Portal WWSEC_PERSON$ table, the new users are not present. I have to go into the Portal Administration page, query one of the users, and set a default group. After this action they will have a record in the WWSEC_PERSON$ table.
    The reason I'm asking is that we are trying to automate the default group assignment using the wwsec_api.set_defaultgroup API, but because these users are missing from the WWSEC_PERSON$ table, they obviously cannot be assigned a group.
    Is there a way to force new users into the portal DB table? Has anyone else automated this type of assignment?
    Thanks
    Brian

    I've found that ODS.DS_ATTRSTORE comes in very handy to query OID data using SQL. There are multiple attribute records for each entry. The attribute 'uid' contains the user name and 'createtimestamp' contains the timestamp for when the user was created in OID after being synched from AD. The attribute 'orcldefaultprofilegroup' stores the default portal group for the user too.
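
    A rough sketch of that kind of lookup is below; the column names (ENTRYID, ATTRNAME, ATTRVAL) are assumptions that may differ between OID releases, so describe ODS.DS_ATTRSTORE first and adjust accordingly.

    -- Column names are assumed; verify with DESC ODS.DS_ATTRSTORE before use
    SELECT u.attrval AS user_name,
           c.attrval AS created_in_oid,
           g.attrval AS default_portal_group
    FROM   ods.ds_attrstore u,
           ods.ds_attrstore c,
           ods.ds_attrstore g
    WHERE  u.attrname    = 'uid'
    AND    c.attrname    = 'createtimestamp'
    AND    c.entryid     = u.entryid
    AND    g.attrname(+) = 'orcldefaultprofilegroup'
    AND    g.entryid(+)  = u.entryid;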

  • ORA-28056 "Writing audit records to Windows Event Log failed" error at database creation time during 11g installation

    Hi Friends,
    OS = Windows XP 3
    Database = Oracle 11g R2 32 bit
    Processor= intel p4 2.86 Ghz
    Ram = 2 gb
    Virtual memory = 4gb
    I was able to install Oracle 11g successfully, but during installation, at the time of database creation, I got the following error many times and ignored it each time. At 55% the installation finally hung and nothing happened after that:
    ORA-28056: Writing audit records to Windows Event Log failed. With the installation stuck at 55%, I ended it and tried to create the database afterward with DBCA, but the same thing happened.
    Please, someone help me out, as I need to install on this same machine.
    Thanks and Regards

    AAP wrote:
    Thanks. Now I am able to create a database, but with one error.
    When I created a database using DBCA, at the last stage I got this error:
    Database Configuration Assistant : Warning
    Enterprise Manager configuration failed due to the following error: Listener is not up or database service is not registered with it. Start the listener, register the database service, and run EM Configuration Assistant again.
    But when I checked, the listener was up.
    Now what is the problem? I am able to connect and work through SQL*Plus,
    but I didn't get the EM link, and when I try to create a new connection in SQL Developer it gives an error (Status : failure - Test Failed: the Network Adapter could not establish the connection).
    Thanks & Regards
    Creation of the dbcontrol requires a connection via the listener. When configuring the dbcontrol as part of database creation, it appears that the dbcontrol creation step runs before the dynamic registration of the database with the listener is complete. Now that the database itself is complete and enough time (really, just a minute or two) has passed to allow the instance to register, use dbca or emca to create the dbcontrol.
    Are you able to get a SQL*Plus connection via the listener (sqlplus scott/tiger@orcl)? That needs to be the first order of business.

  • CRM 2011: when auditing is turned off, why do existing Audit records lose "New Value"?

    Not sure if this is by design (if so, would love to know why!) or if this is a bug.  Let's say I have auditing turned on for Accounts.  I change the phone number in the Main Phone field from 555-1111 to 555-5555, and an audit record is
    created showing the Old Value (555-1111) and New Value (555-5555).  Great.  Now I turn off auditing. The system goes back to the audit record that had previously been created, and modifies it so that it shows the Old Value (555-1111),
    and inserts an icon into the New Value field indicating auditing was turned off.  The tool tip for that icon states "Auditing was stopped and restarted.  The new values for the fields that changed during this period cannot be identified or recorded."
    First of all (minor picky issue) auditing hasn't been turned back on yet, despite what the tool tip says (says the same thing whether or not auditing is turned back on).  But the real problem is
    why the existing audit record's New Value would be removed... why would that happen?!?  We're talking about a historical update.  It really did change, and the system knew what it changed from/to, so why would the system remove that new value
    from the audit record?  Logically I would expect that any changes made
    while auditing is off would not be logged, and that once auditing is turned back on, Old/New values would correspond to whatever was/is in the system at the time that any new audit record is created, so there may be a disconnect between an older audit
    record and a new one (old record's New Value not matching the new record's Old Value, because changes were made while auditing was off). And it's quite apparent when auditing is turned off, because an audit record is created stating "Auditing Disabled". 
    Any ideas if there's a logical reason why it would do this, or if this is really a bug that Microsoft needs to fix?  We have a client with a business need to periodically turn off auditing while integration updates run between CRM and another system
    (otherwise hundreds of thousands of unnecessary audit records would be created).  But if they're going to lose audit data every time they disable auditing, the auditing is worthless.

    We wouldn't ever temporarily turn auditing off on purpose... it's happening when we import managed solutions. We have vendors with managed solutions that contain system entities such as Contacts. When importing their solution, it momentarily
    turns auditing off on those entities. The only evidence of this is audit records indicating "Entity Audit Stopped", and of course the loss of the most recent "New Value" on all audit records. The interesting (?) parts of this are:
    1. When importing the managed solution, the recommended option to keep all existing customizations is selected. I would think this would include the audit settings on the entity, but apparently not (sort of... see #2).
    2. As soon as the managed solution is imported, in addition to creating an "Entity Audit Stopped" audit record on relevant entities and wiping out new values, it apparently turns auditing back on, yet there's no record of this (in customizations, the entity
    still (or once again) indicates auditing is enabled, but on records for that entity there is no "Entity Audit Enabled" record).
    So, now it's on to working with the vendors, to dig into their managed solutions and see if there's a way to prevent this!

  • How to Pass Multiple data records into SDATA for a segment

    Hi Friends,
    I need to pass data records to function module
    MASTER_IDOC_DISTRIBUTE for creating outbound IDocs.
    I am collecting all the data records and passing them in to the structure EDIDC.
    Here my problem is:
    for one route there are multiple customer records,
    so my first parent segment is ZROUTE
    and I am passing the data record for this segment.
    For the customer segment I have 5 customers,
    so I am passing the 5 data records to SDATA.
    LOOP AT gt_route INTO gw_route.
      " Pass the route data into the data record SDATA
      gw_idocdata-sdata  = gw_route_header.
      gw_idocdata-segnam = lc_route_header.
      gw_idocdata-hlevel = lc_4.
      APPEND gw_idocdata TO gt_idocdata.
      CLEAR: gw_route_header, gw_idocdata.
      LOOP AT gt_customer INTO gw_customer
                          WHERE anlage = gw_route-anlage.
        " Pass the customer data into the data record SDATA
        gw_idocdata-sdata  = gw_customer.
        gw_idocdata-segnam = lc_customer.
        gw_idocdata-hlevel = lc_5.
        APPEND gw_idocdata TO gt_idocdata.
        CLEAR: gw_customer, gw_idocdata.
      ENDLOOP.
    ENDLOOP.
    So this customer segment comes 5 times in my IDoc.
    Can anyone tell me whether I am going about this the correct way or not?
    I am getting IDoc status 26,
    "Error during syntax check of IDoc (outbound)".
    I think it is because the IDoc type is not yet set to Released.
    But I also have a doubt about passing the data to the SDATA field
    of the EDIDC structure.
    Can anyone suggest?
    Thanks in Advance,
    Ganesh

    Hi,
    First of all, EDIDC is the control record; EDID4 is the data record. When collecting the customer records into the IDoc data table, you need to insert the customer's parent segment number.
    Eg.
    route1, segnum 1.
       child 1, segnum 2 and parent segnum = 1
       child 2, segnum 3 and parent segnum = 1
       child 3, segnum 4 and parent segnum = 1
    route 2, segnum 5
       child 1, segnum 6 and parent segnum = 5
       child 2, segnum 7 and parent segnum = 5
       child 3, segnum 8 and parent segnum = 5
    Cheers.
    ...Reward if useful.

  • Multiple assignment records

    Hi All,
    I have a requirement to display multiple assignment and person records.
    select pf.employee_number
    ,pf.full_name
    ,pf.email_address
    ,paf.supervisor_id
    ,paf.job_id
    from per_all_people_f pf
    ,per_all_assignments_f paf
    where pf.person_id = paf.person_id
    group by
    pf.employee_number
    ,pf.full_name
    ,pf.email_address
    ,paf.supervisor_id
    ,paf.job_id
    having count(*)>=1;
    I need to display multiple assignment records as well as person records, but the above query takes too much time.
    Kindly explain how to modify the query.
    For example: employee number 1234 exists in the application,
    and for that employee 3 assignment records exist.
    Regards,
    Visu

    Hi All,
    Kindly give me suggestions for my problem.
    For example: employee 114589 was created on 01-Jan-2010, and one more record was created on 24-Jul-2010.
    So, in person record (per_all_people_f) we have two records as below:
    effective start date effective_end date employee number ...etc
    01-jan-2010 23-jul-2010 114589 ...etc
    24-Jul-2010 31-dec-4712 114589 --etc
    In the same way, 3 assignment records are created for the same employee.
    My requirement is that I need to display all assignment records as well as person records, in a format like the one below (see also the sketch after this post):
    employee number assignment id ---etc
    114589 1526 --etc
    114589 1526 --etc
    114589 1526 --etc
    Likewise I need to check all records in the database and display them accordingly.
    regards,
    Visu
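
    A sketch of one way to do this: since every joined row satisfies COUNT(*) >= 1, the GROUP BY/HAVING in the original query filters nothing and only forces an expensive sort, so listing every person and assignment row can be done with a plain join (add date-effective predicates only if just the current rows are wanted rather than the full history).

    SELECT pf.employee_number,
           pf.full_name,
           pf.email_address,
           paf.assignment_id,
           paf.supervisor_id,
           paf.job_id
    FROM   per_all_people_f      pf,
           per_all_assignments_f paf
    WHERE  pf.person_id = paf.person_id
    ORDER  BY pf.employee_number;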

  • Fine Grained Audit records to syslog

    Hello experts,
    I am working on Standard Auditing and Fine Grained Auditing on 11.2.0.3 databases on Red Hat x86_64.
    I am trying to send Fine Grained Audit records to syslog, as I do for my Standard Audit records with AUDIT_TRAIL set to OS, but I can't find any appropriate option.
    When I create FGA policies with the ADD_POLICY procedure of the DBMS_FGA package, the audit_trail parameter can only be set to DB or XML, as stated in [PL/SQL Packages and Types Reference|http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_fga.htm#CDEIECAG].
    Does somebody know if it is possible to send FGA audit records to syslog directly:
    1. without using any additional product (e.g. Oracle Audit Vault)?
    2. without doing manual extraction from fga_log$ or DBA_COMMON_AUDIT_TRAIL?
    Thanks for any suggestion.

    Hi,
    Well, I have not used FGA yet.
    I used audit_trail=db and the query: SELECT username, extended_timestamp, owner, obj_name, action_name, sql_text FROM dba_audit_trail WHERE to_char(extended_timestamp, 'DD/MM/RR') = to_char(SYSDATE - 1, 'DD/MM/RR') ORDER BY timestamp
    Then I wrote a procedure and exported the results using UTL_FILE (a sketch follows this reply),
    and I scheduled this procedure to run daily.
    It works pretty well; if you like the solution, ask for details.
    Hope that helps,
    Regards.
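
    A rough sketch of that daily export, using the query quoted above; the directory object name and file naming are placeholders, and the DIRECTORY object must already exist.

    CREATE OR REPLACE PROCEDURE export_yesterdays_audit AS
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      -- 'AUDIT_DUMP_DIR' is a placeholder DIRECTORY object
      l_file := UTL_FILE.FOPEN('AUDIT_DUMP_DIR',
                               'audit_' || TO_CHAR(SYSDATE - 1, 'YYYYMMDD') || '.txt',
                               'w', 32767);
      FOR r IN (SELECT username, extended_timestamp, owner, obj_name, action_name, sql_text
                FROM   dba_audit_trail
                WHERE  TO_CHAR(extended_timestamp, 'DD/MM/RR') = TO_CHAR(SYSDATE - 1, 'DD/MM/RR')
                ORDER  BY timestamp)
      LOOP
        UTL_FILE.PUT_LINE(l_file, r.username || '|' || TO_CHAR(r.extended_timestamp, 'DD/MM/RR HH24:MI:SS')
                               || '|' || r.owner || '|' || r.obj_name
                               || '|' || r.action_name || '|' || r.sql_text);
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /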

  • Punch in & multiple tracks recording

    I've been working with Logic Express for a few weeks now, and I need to know 2 things:
    How can I make a punch in/out with Logic Express?
    I worked with Adobe Audition, and I just needed to select the part of the track where the repair should be done, and play.
    Now I just can't find out how it works with logic!
    How can I record multiple tracks at the same time with my Tascam FW1082? Right now I can only select 2 tracks for multi-track recording.
    Thanks!
    Martin

    As for punching in and out, I just loop the section that I wish to record over and BAM, I'm done. You can use the magnifying glass to loop a certain area in detail, or even use the loop markers on your Transport bar.
    For multiple-track recording, press Record and then shift+click the other tracks you want to record on.
    Peace and Chicken Grease
    Frisco

  • Moving audit Records from one database to another database using dblink

    I have five databases. I have to move SYS.AUD$ records from those five databases into another schema in one centralized database, every day at 10:00, using a database link. In the centralized database I have to create the same table as SYS.AUD$, under a different schema, with one extra column DB_UNIQUE_NAME. Can anyone help me with how to create the database link(s) and how to write the script that moves the audit records from all five databases to the centralized database? From a maintenance perspective the records have to be consolidated in one place. If you have any related scripts, please post them here; it will be helpful for me.

    spool audit.log
    -- Auditing initialisation parameters: check the current setting
    select name || '=' || value from v$parameter where name like '%audit%';
    -- If auditing is disabled then issue this command and bounce the instance
    alter system set audit_trail=db,extended scope=spfile;
    shutdown immediate
    startup
    create tablespace ORDER_DATA datafile '+DDATA' size 50m;
    create user INFO identified by INFO;
    grant connect, resource to INFO;
    alter user INFO quota unlimited on ORDER_DATA;
    -- ORDER is a reserved word, so the table name has to be quoted
    -- Either create the archive table as a plain copy of SYS.AUD$ ...
    create table INFO."ORDER" as select * from sys.aud$;
    alter table INFO."ORDER" add db_unique_name varchar2(50);
    -- ... or, instead, create it range-partitioned on TIMESTAMP#
    create table INFO."ORDER"
    tablespace ORDER_DATA
    partition by range (timestamp#)
    subpartition by hash (dbid)
    subpartition template
    (subpartition sp1 tablespace users,
     subpartition sp2 tablespace users)
    (partition p1 values less than (TO_DATE('07/29/2010','MM/DD/YYYY')),
     partition p2 values less than (TO_DATE('07/29/2011','MM/DD/YYYY')),
     partition p3 values less than (MAXVALUE))
    as select * from sys.aud$;
    spool off;
    exit
    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'MOVE_AUD_RECORDS',   -- job names cannot contain spaces
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'DECLARE
                              ts TIMESTAMP;
                            BEGIN
                              ts := SYSTIMESTAMP;
                              INSERT INTO info."ORDER"
                                SELECT a.*, SYS_CONTEXT(''USERENV'',''DB_UNIQUE_NAME'')
                                FROM   sys.aud$ a
                                WHERE  a.timestamp# < ts;
                              DELETE FROM sys.aud$ WHERE timestamp# < ts;
                              COMMIT;
                            END;',
        start_date      => TRUNC(SYSDATE) + 22 / 24,
        repeat_interval => 'FREQ=daily;BYHOUR=22;BYMINUTE=0;BYSECOND=0',
        enabled         => TRUE,
        comments        => 'Move audit records into the local archive table');
    END;
    /
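
    The script above archives SYS.AUD$ locally; to centralise from the five sources as the question asks, one sketch, run from the central database, is shown below. The link name, TNS alias and account are placeholders, and the remote account needs SELECT on SYS.AUD$.

    -- On the centralized database: one link per source database
    CREATE DATABASE LINK src1_link
      CONNECT TO audit_reader IDENTIFIED BY audit_reader_pwd
      USING 'SRC1_TNS_ALIAS';

    -- Pull the remote records and tag them with the source name; repeat per link
    INSERT INTO info."ORDER"
      SELECT a.*, 'SRC1_DB_UNIQUE_NAME'
      FROM   sys.aud$@src1_link a
      WHERE  a.timestamp# < TRUNC(SYSDATE);
    COMMIT;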

  • Multiple results recording for Delivery inspection lot

    Hi Gurus,
    My requirement is to configure quality results recording for the sales delivery document, where the client wants to record Customer, Supplier, Umpire, etc. results for inspection characteristics.
    I got a response from a few SAP folks that I should use the inspection point concept for multiple results recording. Could anyone tell me how I can do this?
    Your help will be highly appreciated.
    Thanks,
    Kumar

    Hi Kumar,
    Okay!
    First of all, let's try to understand whether inspection points are really suitable for your requirement or not; accordingly, further errors can be eliminated.
    A- Using Inspection Points
    1. Inspection points are used when you inspect the same parameters again and again.
    2. When you click on the results tab in QA32, the system asks you to enter an identification number. Let's say you put A1 and then press Enter. Thereafter you record the actual results (LENGTH, WIDTH, etc.). You save this without giving the UD.
    3. The next time you click on results for the same inspection lot, the system again asks you for an identification number. This time suppose you put A2. Then again you record the results (LENGTH, WIDTH) for A2.
    4. Likewise you can retrieve the inspection results from the system with respect to the inspection point (identification number).
    5. Typically this is used in chemical industries where you record the temperature of the same chemical every half an hour. Here 10:00 am and 10:30 am can be used as identification numbers. As a result you can see that at 10:00 am the temperature was 40 degrees Celsius and at 10:30 am it was 75 degrees Celsius.
    Only if your requirement resembles this situation would the use of inspection points be fruitful.
    B- Without Inspection Points
    But if you want to record the readings taken by various agencies such as the customer and the umpire (refer to the following pattern) and want to compare them, there are other ways.
    1. Readings by your inspector
    a. LENGTH - 100, 101, 102 mm (average is 101 mm)
    b. WIDTH - 10.50, 10.50, 10.60 mm (average is 10.53 mm)
    2. Readings by the customer
    a. LENGTH - 100, 101, 102 mm (average is 101 mm)
    b. WIDTH - 10.50, 10.50, 10.60 mm (average is 10.53 mm)
    3. Readings by the umpire
    a. LENGTH - 100, 101, 102 mm (average is 101 mm)
    b. WIDTH - 10.50, 10.50, 10.60 mm (average is 10.53 mm)
    You can achieve this without using inspection points as well. For this, in QP02 you need to create 3 operations: Inspector, Customer and Umpire respectively. Feed the MICs into each of these operations. When you click on the results tab, the system will throw a pop-up for these 3 operations. Double-click on Umpire if you want to record the results noted by the umpire, and likewise for the rest of the operations. You can do this even after the UD (even after 1-2 months). For this you need to use MICs with the control indicator set to long-term characteristic in QS23.
    Regarding averaging of results: this can be achieved using a single-result-recording MIC, with or without inspection points. Use an appropriate sampling procedure. When you feed results in QA32, the system automatically averages out the readings. This can be viewed against the phi symbol.
    Finally, you did mention that "here I am not getting the newly created inspection lots". Does that mean the inspection lot (origin 10) has not been created?
    Regards,
    Anand Rao

  • When will audit on "CREATE SESSION" generate audit log?

    Hi,
    When we set up audit on "CREATE SESSION" for all users by access, does it mean there will be an audit log entry every time some user is granted "CREATE SESSION", or does it mean there will be an audit log entry every time a user connects? (I'm assuming internally "CREATE SESSION" is executed every time a user logs into the database.)
    The command would be something like: audit CREATE SESSION by access
    Thanks for the help.

    Hozy wrote:
    Thanks Aman.
    I read somewhere that it is good practice to enable audit on "CREATE SESSION" for all users by access, and the next question that came to my mind is: if we do this, when will the audit log get generated?
    To enable auditing you have to set the audit_trail initialization parameter; please refer to
    http://www.oracle-base.com/articles/10g/Auditing_10gR2.php
    If it is every time a user connects, then it will be a performance overhead.
    No, but after some days you can delete old records from sys.aud$.
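
    To make the behaviour concrete: auditing CREATE SESSION audits connections, so a record is written per logon, not per grant. A minimal check (assuming AUDIT_TRAIL is set to DB and the instance has been bounced):

    -- One audit record is generated for each logon attempt from now on
    AUDIT CREATE SESSION BY ACCESS;

    -- The logon/logoff records then show up here
    SELECT username, timestamp, action_name, returncode, logoff_time
    FROM   dba_audit_session
    ORDER  BY timestamp;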

  • How to get all records created this month?

    OK, I'm trying to write a query that gets records that were created this month or later, but not in the past.
    So today is Nov 16th, 2009.
    I need to look for all records created from 11/2009 onward (>= 11/2009).
    Any ideas?

    Do you have a field like "create_date" on that table? Here is a simple query:
    with t as ( select 1 as id,to_date('01-OCT-2009','DD-MON-YYYY') as create_date from dual
    UNION
    select 2,to_date('11-OCT-2009','DD-MON-YYYY') from dual
    UNION
    select 3, to_date('02-NOV-2009','DD-MON-YYYY') from dual
    UNION
    select 4, to_date('01-NOV-2009','DD-MON-YYYY') from dual
    UNION
    select 5, to_date('13-DEC-2009','DD-MON-YYYY') from dual)
    select * From t
    where CREATE_DATE >= trunc(sysdate,'MON');
    Edited by: rkolli on Nov 16, 2009 1:23 PM
