Data purge utility in OIM 9.1.0.2

Hi All,
Is anybody aware of any data purge procedures, steps, or utilities in OIM?
I have read that the following tables are used for audit purposes; they hold snapshots of the user profile (taken whenever user data changes). Reference document: http://docs.oracle.com/cd/E10391_01/doc.910/e10365/useraudit.htm
UPA
UPA_USR
UPA_FIELDS
UPA_GRP_MEMBERSHIP
UPA_RESOURCE
AUD_JMS
UPA_UD_FORMS
UPA_UD_FORMFIELDS
I assume these tables can be truncated (after taking a backup) without any risk?
Please suggest.
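What I have in mind is roughly the following (only a sketch, assuming a verified backup first and a sanity check against Oracle's audit/archival guidance; AUD_JMS is left out since it may still hold unprocessed audit messages):

-- Back up each audit table (backup table names are illustrative)
CREATE TABLE UPA_BKP               AS SELECT * FROM UPA;
CREATE TABLE UPA_USR_BKP           AS SELECT * FROM UPA_USR;
CREATE TABLE UPA_FIELDS_BKP        AS SELECT * FROM UPA_FIELDS;
CREATE TABLE UPA_GRP_MEMBERSHIP_BKP AS SELECT * FROM UPA_GRP_MEMBERSHIP;
CREATE TABLE UPA_RESOURCE_BKP      AS SELECT * FROM UPA_RESOURCE;
CREATE TABLE UPA_UD_FORMS_BKP      AS SELECT * FROM UPA_UD_FORMS;
CREATE TABLE UPA_UD_FORMFIELDS_BKP AS SELECT * FROM UPA_UD_FORMFIELDS;

-- Only after the backups are verified, truncate (children before parents if FKs exist)
TRUNCATE TABLE UPA_UD_FORMFIELDS;
TRUNCATE TABLE UPA_UD_FORMS;
TRUNCATE TABLE UPA_RESOURCE;
TRUNCATE TABLE UPA_GRP_MEMBERSHIP;
TRUNCATE TABLE UPA_FIELDS;
TRUNCATE TABLE UPA_USR;
TRUNCATE TABLE UPA;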
Ritu

Read more about the script here: docs.oracle.com/cd/E14899_01/doc.9102/e14763/bulkload.htm#CHDHAFHC
The OIM server should be up and running while you run the script.
Regards,
GP

Similar Messages

  • LOG_FILE_NOT_FOUND when running cleaner manually after some data purge

    I hit a LOG_FILE_NOT_FOUND error when running the cleaner manually after a data purge. I searched the forum and found that someone had faced the same issue before, but I cannot find any clue on how to fix it. Below is the error trace, followed by our configuration.
    Caused by: com.sleepycat.je.EnvironmentFailureException: (JE 4.1.6) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.6) /scratch/tie/thirdeye/index/data-store fetchTarget of 0x50f/0x3fb9dd6 parent IN=368491717 IN class=com.sleepycat.je.tree.IN lastFullVersion=0x510/0x2ca7d18 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
        at com.sleepycat.je.EnvironmentFailureException.wrapSelf(EnvironmentFailureException.java:196)
        at com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:1439)
        at com.sleepycat.je.Environment.checkEnv(Environment.java:2117)
        at com.sleepycat.je.Environment.checkpoint(Environment.java:1440)
        at com.oracle.thirdeye.datastore.DataStoreManager.clean(DataStoreManager.java:402)
        at com.oracle.thirdeye.infostore.InfoStoreManager.clean(InfoStoreManager.java:301)
        ... 11 more
    Caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.10) /scratch/tie/thirdeye/index/data-store fetchTarget of 0x50f/0x3fb9dd6 parent IN=368491717 IN class=com.sleepycat.je.tree.IN lastFullVersion=0x510/0x2ca7d18 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
        at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1332)
        at com.sleepycat.je.tree.IN.findParent(IN.java:2886)
        at com.sleepycat.je.tree.Tree.getParentINForChildIN(Tree.java:881)
        at com.sleepycat.je.tree.Tree.getParentINForChildIN(Tree.java:809)
        at com.sleepycat.je.cleaner.FileProcessor.findINInTree(FileProcessor.java:1152)
        at com.sleepycat.je.cleaner.FileProcessor.processIN(FileProcessor.java:1090)
        at com.sleepycat.je.cleaner.FileProcessor.processFile(FileProcessor.java:538)
        at com.sleepycat.je.cleaner.FileProcessor.doClean(FileProcessor.java:241)
        at com.sleepycat.je.cleaner.Cleaner.doClean(Cleaner.java:463)
    ------------Configurations-------------------------
    EnvironmentConfig.ENV_RUN_CLEANER -> false
    EnvironmentConfig.CHECKPOINTER_HIGH_PRIORITY -> true
    EnvironmentConfig.CLEANER_EXPUNGE -> false
    Any hints are appreciated. I'm also working for Oracle CDC; feel free to call me at 861065151679 or drop me an email at [email protected] so that we can talk in more detail.
    Anfernee

    Anfernee, I will contact you via email.
    --mark                                                                                                                                                                                                                   

  • Statistic on throughput of data loader utility

    Hi All
    Can you share some statistics on the throughput of the data loader utility? As a reference point, consider 1 million records: how long would it take to import them?
    I need these numbers to decide between using the Web Service and the Data Loader utility. Any suggestions are appreciated.
    Thank you.

    It really depends on the object and the amount of data in there (both the number of fields you are mapping and how much data is already in the table).
    For example…
    One of my clients has over 1.2M Accounts. It takes about 3 hours (multi-tenant) to INSERT 28k new customers, but when we were first doing it, it took under an hour. Because the bulk loader is limited on record count (most objects are limited to 30k records in the input file), you will need to break up your files accordingly.
    Strangely, for the "Financial Account" object (not normally exposed in standard CRMOD), we can insert 30k records in about 30 minutes, even though there are over 1M rows in that table. Part of this is probably due to the number of fields on the Account and on the address itself (remember the address is a separate table in the underlying DB, even though it looks like there are two sets of address fields on the Account).
    The bulk loader and the wizard are roughly the same. However, the command-line approach doesn't allow simultaneous INSERT/UPDATE (there are little tricks around this; it depends on how you can prepare the extract files from your other system, e.g. an UPDATE file and an INSERT file; some systems aren't able to extract this way because of how they are built).
    Be very careful with some objects because of the way their indexes are built. For example, ASSET and CONTACT will both create duplicates even when you have an "External Unique Id". For those, we use web services; you aren't limited to a file size there. I think (same client) we have over 800k ASSETs and 1.5M CONTACTs.
    The ASSET load (via web service, which does both INSERT and UPDATE) can typically insert about 40k records in about 6 hours.
    The CONTACT load (via web service, which does both INSERT and UPDATE) can typically insert about 40k records in about 10 hours.
    Your best bet is to do some timings via the import wizard and apply a roughly linear time increase as the amount of data sitting in the tables grows.
    My company (Hitachi Consulting) can help build these things (both automated bulk loaders and web services) if you are interested, e.g. because of limited resource bandwidth or other factors.

  • Up-to-date unzip utility for MOPatch

    Hi All,
    SAP Note 1027012 - MOPatch - Install Multiple Oracle Patches in One Run
    mentions the following:
    MOPatch requires an up-to-date "unzip" utility. As of version 1.7,
    MOPatch by default uses the "unzip" utility that is located at
    $ORACLE_HOME/bin.
    Does anyone know where we can download the up-to-date version of the unzip utility and how to install it?
    Is it just a matter of overwriting the existing files?
    Thanks

    Rizwan Choudhry wrote:
    Hi Orkun Gedik,
    >
    > Thank you for replying.
    >
    > We are using OS :
    > HP-UX Itanium (64-bit)
    >
    > regards riz
    Hi Riz,
    You can find the unzip package at the link below:
    http://hpux.connect.org.uk/hppd/hpux/Misc/unzip-6.0/
    Follow the installation instructions on the same page.
    Best regards,
    Orkun Gedik

  • Oracle EBS Data Purging and Archival

    Hi,
    I would like to know if there is any tool available in the market for Oracle EBS data purging and archival.
    Thanks,

    Yes, there are third-party tools available which apply a set of business rules (e.g. purge all data older than Nov. 1, 2007) across the various Oracle modules implemented at a customer site.
    They are third-party tools; you can go to Oracle.com and look under partners' validated integration solutions. At the moment there are two partners offering such an integrated solution:
    Solix EDMS Validated Integration with 12.1
    IBM Optim Data Growth Solution
    The only other option is to hire OCS for a custom-developed solution.

  • Scrub3.exe - IBM Secure Data Disposal Utility - Problem

    D10 6493 Problem
    I have a refurbished D10 6493; upon boot-up I get the following message:
    IBM Secure Data Disposal Util 2.0
    Remote Deployment Mgr 4.1
    Command Code Executed
    C:\Dos7\Scrub3.exe /d=all /1=2
    Return Code = 0
    I changed the boot sequence to CD and tried to load the Win7 OS, but to no avail, and I end up with the following message.
    What changes should I make to the BIOS so I can install my OS?
    Ted

    What does your startup sequence look like?  Assuming you're trying to install the OS from optical, you should have the optical in front of the HDD in the startup sequence, and you might have to hit a key right after POST in order to boot the system to the DVD.
    That message you're seeing looks like it might be a log from the secure erase utility.  That's a utility usually used to wipe media.  I'm guessing this is displaying because your system is booting to a blank (but formatted) HDD that's been scrubbed already.

  • C100 Data Import Utility - download where?

    Hi there!
    I lost the DVD that came with my C100. Can I download the Data Import Utility somewhere? I can't seem to find it anywhere on Canon's pages...

    Hi o-shows!
    Thanks for posting.
    The Data Import Utility is licensed software provided by the Pixela Corporation. They have it available on their website for download; there is a download link at the bottom of the download page, which you can click after reading over the instructions.

  • Canon's Data Import Utility C100

    Does anyone know where to download the Data Import Utility for the C100? I just bought a used C100 that did not come with the disc, and I'm hoping to find out how to order a new one or whether there is a download somewhere. Does anybody out there think this utility is a must, or can it be bypassed with something like clipwrapper?

    So I found a simple workaround. I was having difficulty downloading only the VIDEO files, with no AUDIO, because they are stored in two separate folders when recorded to the SD card from the C100. All you need to do is:
    SD card > card reader > computer > [open folder via My Computer]
    EOS CANON C100 > PRIVATE > AVCHD > BDMV > STREAM (video files) and CLIPINF (enables the audio to be interpreted in NLEs)
    STREAM contains the video files; copy them to a new folder on your hard drive. Then the tricky part: also open the CLIPINF folder, which holds files with the extension .CPI. Select all of those and copy them into the same folder you created for the video files. Then simply import the video files into an NLE such as Premiere Pro and you're good to go!
    It took me about an hour of frustration, then I went back and just opened all the folders. Trial and error = knowledge!
    Enjoy, hope this helps.

  • HFM audit data export utility availability in version 11

    Hi Experts,
    We have a client who has an HFM environment where the audit and task logs grow very large very quickly.
    They need to be able to archive and clear the logs. They are too large for EPM Maestro to handle and they don't want to schedule them as a regular event.
    I am concerned because I am sure that these large log tables are impacting performance.
    They want to know if the old System 9 utility they used is still available in the latest version; it was called the HFM Audit Data Export utility. Does anyone know?
    Thanks in advance and kind regards
    Jean

    I know this is a reasonably old post but I found it through Google. To help those in the future, this utility is available via Oracle Support. It is HFM Service Fix 11.1.1.2.05 but it is compatible up to 11.1.1.3.
    Here is the Oracle Support KB Article:
    How To Extract the Data Audit and Task Audit records of an HFM application to a File [ID 1067055.1]
    Modified 23-MAR-2010 Type HOWTO Status PUBLISHED
    Applies to:
    Hyperion Financial Management - Version: 4.1.0.0.00 to 11.1.1.3.00 - Release: 4.1 to 11.1
    Information in this document applies to any platform.
    Goal
    Some system administrators of Financial Management desire a method to archive / extract the information from the DATA_AUDIT and TASK_AUDIT database tables of an HFM application before truncating those tables.
    Solution
    Oracle provides a standalone utility called HFMAuditExtractUtility.exe to accomplish this task. As well as extracting the records of the two log tables, the utility can also be used to truncate the tables at the same time.
    The utility comes with a Readme file which should be consulted for more detailed instructions on how it should be used.
    The latest version of the utility which is compatible with all versions of HFM up to 11.1.1.3 is available as Service Fix 11.1.1.2.05 (Oracle Patch 8439656).
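    As a rough illustration of what the extract-then-truncate flow boils down to (a sketch only; the actual audit tables are usually prefixed with the HFM application name, and the utility plus its Readme should be used rather than raw SQL):

    -- Export the audit rows first (e.g. spool the SELECT output to a file), then clear the tables
    SELECT * FROM DATA_AUDIT;
    SELECT * FROM TASK_AUDIT;
    TRUNCATE TABLE DATA_AUDIT;
    TRUNCATE TABLE TASK_AUDIT;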

  • Bulkload Utility in OIM

    Hi all,
    I am trying to upload a CSV data file into OIM 11.1.1.5.0 using the bulk load utility oim_blkld.sh. I have followed the steps mentioned in the document "Developer's Guide for Oracle Identity Manager 11g Release 1 (11.1.1)".
    After running the utility (oim_blkld.sh), I received the following error in oim_blkld_user_load_summary.log:
         ERROR ==> The provided User ID *oimtest*, for user created from web console, does not exist
    But I had created the user "oimtest" using the OIM admin console before running the utility. I ran the utility after I stopped the OIM server.
    Please suggest how to resolve this error and upload the users into OIM.
    Thanks and Regards,
    Karthick Sugumaran

    Hi Suren,
    We are currently using pre-populate adapters: one generates the user ID and another generates the password for a user and stores it in OIM.
    OIM then provisions the user in OID.
    But per the migration requirement, a lot of users need to be migrated to OIM and later provisioned in OID.
    These users already have a user ID and password, since they are existing users.
    During the bulk load we do not want these adapters to fire, because the data is already there.
    Please suggest a solution for this.
    Thanks.

  • Error while running uploadJar.sh Utility in OIM

    [Enter Xellerate admin username :]
    [Enter the admin password :]
    [[Enter serverURL (Ex. t3://oimhostname:oimportno for weblogic or corbaloc:iiop:localhost:2801 for websphere)]:
    [[Enter context (i.e.: weblogic.jndi.WLInitialContextFactory for weblogic or com.ibm.websphere.naming.WsnInitialContextFactory for websphere)]:]weblogic.jndi.WLInitialContextFactory
    Exception in thread "main" java.lang.NoClassDefFoundError: oracle/jrf/PortabilityLayerException
            at oracle.iam.platformservice.utils.JarUploadUtility.main(JarUploadUtility.java:219)
    Caused by: java.lang.ClassNotFoundException: oracle.jrf.PortabilityLayerException
            at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
            ... 1 more
    bash-4.1$
    I checked for the jrf-api jar location in ./setEnv.sh and the path is correct.
    Please help.

    It must be an issue with your classpath export.
    Have you also generated wlfullclient.jar, and is it available on your classpath?
    1) Verify the Scheduler structure: Oracle Stack: Sample Custom Schedule Task
    You can register using standalone code as well: FYI Oracle Stack: Registering a Plugin using OIM APIs
    2) Also, verify it with this blog: http://idmoim.blogspot.in/2011/07/developing-and-deploying-oim-11g-custom.html
    ~J

  • Data Access Error in OIM Scheduler

    I have a scheduled task in OIM which fetches the "Manager Employee ID" custom field from OIM and, if it is not empty, updates the Manager ID for that user.
    The scheduled task, when run, throws a Data Access Error every time it starts and closes.
    The same scheduler code works fine in the UAT environment but throws a "Data Access Error" in the Prod environment.
    In the code there is a catch block for catching tcDataSetException, and this is where the error is thrown from.
    When I looked into the code I found that tcUserOperationsIntf was initialized but not closed in the finally block. Will closing tcUserOperationsIntf solve this problem, or am I missing something?
    When does the Data Access Error occur? There are other scheduled tasks running fine in the Prod environment.
    Any help in this regard would be really helpful to me.
    Thanks

    I am getting the UDF using a query.
    Here is the piece of code I am using, for your reference:
    try {
        // Get users who have their manager EMPID set (not null)
        String msQuery1 = "select usr_key, usr_login, " + udf_Manager_EMPID
                + " from usr where " + udf_Manager_EMPID + " is not null";
        tcDataSet getMemberUsers = new tcDataSet();
        getMemberUsers.setQuery(getDataBase(), msQuery1);
        getMemberUsers.executeQuery();
        for (int i = 0; i < getMemberUsers.getTotalRowCount(); i++) {
            getMemberUsers.goToRow(i);
            String managerEmpId = getMemberUsers.getString(udf_Manager_EMPID);
            logger.debug("the manager emp id that we get for the user is " + managerEmpId);
            long userKey = getMemberUsers.getLong("usr_key");
            String userid = getMemberUsers.getString("usr_login"); // Users.User ID
            logger.debug("the userkey that we get for the user is " + userKey);
            // NOTE: managerEmpId is concatenated into the SQL without quotes,
            // so this query can fail if the employee ID data is not purely numeric.
            String msQuery2 = "select usr_key from usr where " + udf_EmployeeID + "=" + managerEmpId;
            tcDataSet getMemberUser = new tcDataSet();
            getMemberUser.setQuery(getDataBase(), msQuery2);
            getMemberUser.executeQuery();
            long userassociatedwithmgrid = 0;
            for (int j = 0; j < getMemberUser.getTotalRowCount(); j++) {
                getMemberUser.goToRow(j);
                userassociatedwithmgrid = getMemberUser.getLong("usr_key");
                logger.debug("the user id that we get for the user's manager is " + userassociatedwithmgrid);
                updateusermanager(userid, userassociatedwithmgrid);
            }
        }
    } catch (tcDataSetException e) {
        logger.error(e.getMessage() + " DataSetException");
    }
    In the above code, the error is thrown from the catch block as a Data Access Error.
    As I said earlier, this piece of code works fine in UAT but does not work in Prod and throws the above error. The error is thrown from the msQuery2 query.
    Thanks

  • Is it possible to execute the Data Load Utility from a button?

    I want to have a button on a form to load spreadsheet data each day.
    Can the XE data load utility be called from a button?
    How? It is not in any of the documentation I have looked at.
    Jean

    No. The "data load button" is not something you can embed in your applications. However, there are solutions to your problem that have been developed by some very talented folks. Take a look at Vikas' examples here: http://htmldb.oracle.com/pls/otn/f?p=38131:1
    Although it wasn't designed with XE, it should work just fine.
    Earl

  • Sliding window for historical data purge in multiple related tables

    All,
    It is a well-known question: how to efficiently back up and purge historical data based on a sliding window.
    I have a group of tables that all have to be backed up and purged based on a sliding time window. These tables have FKs relating them to each other, and these FKs are not necessarily on the timestamp column. I am considering partitioning all of these tables on the timestamp column, so I can export the out-of-date partitions and then drop them. The price I have to pay with this design is that the timestamp column is duplicated across the parent table, child tables, and grand-child tables even though the value is the same, because I have to partition every table on that column.
    It's very much like the STATSPACK tables: one STATS$SNAPSHOT and many child tables storing the actual statistic data. I am wondering how statspack.purge does this, since using DELETE statements is inefficient and time-consuming. In the STATSPACK tables, snap_time is stored only in STATS$SNAPSHOT, not in every child table, and they are not partitioned, so I guess the procedure uses DELETE statements.
    Any thoughts on other good design options? How would you optimize backup and purge of the STATSPACK tables' historical data? Thanks!
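    For illustration, the partitioned design I am considering looks roughly like this (only a sketch; table and column names are made up, and the monthly export/drop would be scripted):

    -- Parent and child both range-partitioned on the same timestamp column.
    CREATE TABLE parent_hist (
        parent_id  NUMBER PRIMARY KEY,
        snap_time  DATE   NOT NULL
    )
    PARTITION BY RANGE (snap_time) (
        PARTITION p_2024_01 VALUES LESS THAN (TO_DATE('2024-02-01', 'YYYY-MM-DD')),
        PARTITION p_2024_02 VALUES LESS THAN (TO_DATE('2024-03-01', 'YYYY-MM-DD'))
    );

    CREATE TABLE child_hist (
        child_id   NUMBER PRIMARY KEY,
        parent_id  NUMBER NOT NULL REFERENCES parent_hist (parent_id),
        snap_time  DATE   NOT NULL   -- duplicated from the parent purely so this table can be partitioned the same way
    )
    PARTITION BY RANGE (snap_time) (
        PARTITION p_2024_01 VALUES LESS THAN (TO_DATE('2024-02-01', 'YYYY-MM-DD')),
        PARTITION p_2024_02 VALUES LESS THAN (TO_DATE('2024-03-01', 'YYYY-MM-DD'))
    );

    -- Slide the window: export the oldest partitions (e.g. via Data Pump), then drop the
    -- child partition first and the parent partition second. The FK may need to be disabled
    -- (or the child defined with reference partitioning on 11g+) for the parent-side DDL to succeed.
    ALTER TABLE child_hist  DROP PARTITION p_2024_01 UPDATE GLOBAL INDEXES;
    ALTER TABLE parent_hist DROP PARTITION p_2024_01 UPDATE GLOBAL INDEXES;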

    hey oracle gurus, any thoughts?

  • Delete or Truncate statement for data purging

    Hi,
    I am writing a stored procedure to purge data from a table every month. There are no constraints on the table columns except the primary key.
    I am using a DELETE statement, as it records an entry in the transaction log for each deleted row, but it is a slow process compared to a TRUNCATE statement.
    Can you please suggest what I should choose (DELETE or TRUNCATE) as a best practice?
    Thanks
    Sandy

    If you want to delete all rows in the table, use TRUNCATE; otherwise I would suggest the technique below:
    -- SQL Server 2008 and onwards
    insert into Archive..db
    select getdate(), d.*
    from (delete from db1
          output deleted.*
          where col = <value>) d
    go
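    If only part of the table is purged each month, another common option is to DELETE in batches so each transaction (and the log growth) stays small. A rough sketch only, with hypothetical table and column names:

    -- Purge in batches of 10k rows until nothing old enough remains
    DECLARE @batch int = 10000;
    WHILE 1 = 1
    BEGIN
        DELETE TOP (@batch) FROM dbo.db1
        WHERE  purge_date < DATEADD(month, -1, GETDATE());

        IF @@ROWCOUNT < @batch BREAK;
    END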
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
