Purge PD Data in Sandbox Development Client

I am trying to establish a Sandbox/Demo Client in our development environment for the exclusive use of HR Development practitioners.
I want it to be a copy of the current Development Client minus the data, i.e. something like a blank canvas.  However, my BASIS team is struggling to deliver this.  They can eliminate data from the PA structure as part of the copy process, but they have not yet found a way to eliminate data from the PD structure.  I find myself having to tediously break down every relationship within an org. unit before I can delete the lowest-level objects within the PD structure.
I wonder if you have ever encountered this problem?  Perhaps you know someone who has?  How did you or someone you know overcome it?
Regards,
Dan Carr

Hi Jyothi,
You can follow the above instructions and inform your company's BASIS team.
There is no need to worry about it, but be very sure before deleting any data.
Also, always take a backup of the data before deleting anything from SAP.
One more option: you can copy the data from the quality system and upload it.
I hope this helps.
Regards,
Rahul

Similar Messages

  • Purging Old Data

    Please help in solving an issue related to purging data.
    There are some 40 tables in the database I am working in. I need to check these tables for data that is more than 5 years old and delete it, but the tables are interlinked with other tables (parent-child relations). How should I proceed with this?
    I thought of the "with cascade" option, but what if a related record in another table is less than 5 years old (data within 5 years should not be deleted)?

    Aparna16 wrote:
    Please help in solving an issue related to purging data.
    Interesting problem, and one that I, wearing my DBA hat, will throw back at the developers.
    They know the data model. This request for purging old data is very likely to occur again in a year's time. What then? Go through a painful exercise again (this time taking data model changes since the last time into consideration)?
    I do not see the logic in that. So instead I will throw this back at the developers and tell them that a PL/SQL package needs to be designed and written to purge old data. DBA input into this will be in terms of design. Can the purge be done as a single massive delete transaction? Or does it make more sense to design the purge for a specific date range? Or for a specific product/invoice/customer/whatever business entity that can be aged, and then run multiple such purges in parallel?
    And why the purge? Is it to free up space? That may not be the case depending on the pctfree/pctused settings on a data block. So do some tables perhaps need to be rebuilt after a purge in order to reorganise the table and free space?
    That's the DBA side of the problem. Figuring out what data can be deleted from which tables and writing code to do that - that's a developer problem.
    So whichever side you are on, you need the other side to assist you with this task.
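    A minimal sketch of the kind of PL/SQL purge procedure suggested above, using two hypothetical tables (ORDERS as parent, ORDER_LINES as child) and a date-range parameter; the real table names, batching and commit strategy would have to come from your own data model:
    CREATE OR REPLACE PROCEDURE purge_old_orders(p_before IN DATE) IS
    BEGIN
      -- delete children first so the parent-child FKs are never violated
      DELETE FROM order_lines ol
       WHERE EXISTS (SELECT 1
                       FROM orders o
                      WHERE o.order_id   = ol.order_id
                        AND o.order_date < p_before);
      -- then delete the aged parent rows themselves
      DELETE FROM orders
       WHERE order_date < p_before;
      COMMIT;
    END purge_old_orders;
    /
    For a yearly purge this could be run once per cut-off date (e.g. purge_old_orders(ADD_MONTHS(SYSDATE, -60))), or several date ranges could be run in parallel as discussed above.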

  • How to transport the data contained in a table from development to production

    How can I transport the data contained in a table from development to production?
    Please let me know ASAP.

    Hello Dilip
    Create a workbench request and add the following entries to the request:
    Object key: R3TR TABU <name of z-table>
    For this object add the following value key:
    - client-independent table: '*'
    - client-dependent table (e.g. client 100): '100*'
    See also: [SAP Network Blog: Transport Table Entries|/people/community.user/blog/2007/01/07/transport-table-entries]
    Regards
      Uwe

  • Oracle EBS Data Purging and Archival

    Hi,
    I would like to know if there is any tool available in the market for Oracle EBS data purging and archival?
    Thanks,

    Yes, there are 3rd-party tools available that will apply a set of business rules (e.g. all data older than Nov 1, 2007) across the various Oracle modules implemented at a customer site.
    They are 3rd-party tools; you can go to Oracle.com and look in the partners' validated integration solutions. At the moment there are 2 partners offering such an integrated solution:
    Solix EDMS Validated Integration with 12.1
    IBM Optim Data Growth Solution
    The only other option is to hire OCS for a custom-developed solution.

  • Purpose of Demantra Purge History Data step

    During our Demantra implementation phase the seeded EBS Full Download workflow was not completing successfully. As recommended by Support, we created a custom EBS Full Download workflow and excluded the Purge History Data step. This resolved the issue and the EBS Download completes successfully. However, we have been running with this process for nearly 10 months and have not run the Purge History Data process.
    What are the consequences of not running the Purge History Data process? I have found no information on the purpose of this process in the implementation/user guides or on Metalink.
    Thanks

    Hi,
    According to the description, I understand that “Microsoft.SqlServer.Management.PowerShell.sqlps110” is missing in regedit.exe.
    Per http://msdn.microsoft.com/en-us/library/cc281962.aspx:
    Beginning in SQL Server 2008, Setup installs the following Windows PowerShell components when you select either the client software or the Database Services nodes:
    •Windows PowerShell 1.0, if Windows PowerShell is not already present on your computer.
    •The SQL Server snap-ins. The snap-ins are dll files that implement two types of Windows PowerShell support for SQL Server:
    A set of SQL Server cmdlets. Cmdlets are commands that implement a specific action. For example, Invoke-Sqlcmd runs a Transact-SQL or XQuery script that can also be run by using the sqlcmd utility, and Invoke-PolicyEvaluation reports whether SQL Server objects comply with policy-based management policies.
    A SQL Server provider. The provider lets you navigate the hierarchy of SQL Server objects using a path similar to a file system path. Each object is associated with a class from the SQL Server Management object models. You can use the methods and properties of the class to perform work on the objects. For example, if you cd to a databases object in a path, you can use the methods and properties of the Microsoft.SqlServer.Management.Smo.Database class to manage the database.
    •The sqlps utility that is used to run Windows PowerShell sessions that include the SQL Server snap-ins.
    In addition, you can export the missing registry entry from another working machine and import it into the problem machine. Make sure that the registry path is the same.
    Best Regards,
    Tracy
    Tracy Cai
    TechNet Community Support

  • After purging the data, is it necessary to rebuild the index?

    Hi All,
    In our production DB, the application-level incident purging has finished. (Meaning the first 3 years of data, which is not usable by the client, got purged. This was done for a performance issue.)
    In this situation, is it necessary to rebuild the indexes?
    Before that FYI,
    oracle 9i(Enterprise Edition)
    hp-ux b11
    dbsize=30G
    Application:
    tool:HP-ov
    Please share your input.
    Thanks in advance :-)
    Regards,
    DB.
    Edited by: DB on May 21, 2013 11:26 PM

    >
    In our production DB, the application-level incident purging has finished. (Meaning the first 3 years of data, which is not usable by the client, got purged. This was done for a performance issue.)
    In this situation, is it necessary to rebuild the indexes?
    >
    Please clarify.
    You said you did the purge for 'performance issue' and now, after the purge, you still don't know if your performance issue went away?
    If you no longer have whatever (unknown to us since you didn't post it) 'performance issue' you had then why do you need to do anything?
    If you do still have a performance issue then you need to follow the same steps you should follow whenever you have a performance issue:
    1. validate that you REALLY have an issue
    2. determine the cause of the issue
    3. identify solutions to mitigate/eliminate the issue
    4. test one or two of the possible solutions
    5. implement your 'best' tested solution from step #4
    6. go back to step #1
    Based on what you posted it sounds like you implemented a 'purge' without really knowing that it was even necessary or was even causing a performance issue.
    And you make no mention at all of whether you recollected statistics after the purge was done.
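    If the purge does turn out to justify housekeeping, a minimal sketch of the two steps mentioned above (index rebuild and statistics recollection), using hypothetical object names and assuming Oracle 9i Enterprise Edition as stated in the post:
    -- rebuild one index without blocking DML (an Enterprise Edition feature)
    ALTER INDEX app_owner.incidents_pk REBUILD ONLINE;

    -- recollect optimizer statistics on the purged table and its indexes
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'APP_OWNER',   -- placeholder schema
        tabname => 'INCIDENTS',   -- placeholder table
        cascade => TRUE);         -- also gathers index statistics
    END;
    /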

  • SQL Developer can't commit edited data in Table Data pane

    When I try to commit changes in "Data" pane for selected table SQL Developer gives me a strange error:
    One error saving changes to table "TABLENAME".:
    Row XXX: Data got commited in another/same session, cannot update row.
    I can see in the log that SQL Developer tries to do something like:
    UPDATE "TABLENAME" set "COLUMN"="value1" where ROWNUM="xxxx1" and ROW_SCN=nnn1;
    UPDATE "TABLENAME" set "COLUMN"="value2" where ROWNUM="xxxx2" and ROW_SCN=nnn1;
    UPDATE "TABLENAME" set "COLUMN"="value3" where ROWNUM="xxxx3" and ROW_SCN=nnn2;
    If I update the same rows in a SQL window using a different condition and commit, all is OK. Why such strange behaviour?
    My table does not have a primary key, and no other users are trying to change it. SQL Developer version 3.0.04 and Oracle 10.2.0.4 on Linux.
    Best regards,
    Sergey Logichev

    That's because of the inaccuracy of ROW_SCN.
    I suggest you turn off Preferences - Database - ObjectViewer - Use ORA_ROWSCN (as I did the very moment we got the option).
    Have fun,
    K.
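    As a workaround of the kind Sergey describes (updating the rows from a SQL worksheet instead of the Data grid), a minimal sketch using ROWID as the condition, since the table has no primary key; the table, column and ROWID values are placeholders:
    UPDATE tablename
       SET column_name = 'value1'
     WHERE ROWID = 'AAAR3sAAEAAAACXAAA';  -- hypothetical ROWID copied from a prior query
    COMMIT;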

  • LOG_FILE_NOT_FOUND when running cleaner manually after some data purge

    I hit a LOG_FILE_NOT_FOUND error when running the cleaner manually after some data purge. I searched the forum and found that someone had faced the same issue before, but I cannot find any clue on how to fix it. Below is the error trace, followed by our configuration.
    Caused by: com.sleepycat.je.EnvironmentFailureException: (JE 4.1.6) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.6) /scratch/tie/thirdeye/index/data-store fetchTarget of 0x50f/0x3fb9dd6 parent IN=368491717 IN class=com.sleepycat.je.tree.IN lastFullVersion=0x510/0x2ca7d18 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.EnvironmentFailureException.wrapSelf(EnvironmentFailureException.java:196)
    at com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:1439)
    at com.sleepycat.je.Environment.checkEnv(Environment.java:2117)
    at com.sleepycat.je.Environment.checkpoint(Environment.java:1440)
    at com.oracle.thirdeye.datastore.DataStoreManager.clean(DataStoreManager.java:402)
    at com.oracle.thirdeye.infostore.InfoStoreManager.clean(InfoStoreManager.java:301)
    ... 11 more
    Caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.10) /scratch/tie/thirdeye/index/data-store fetchTarget of 0x50f/0x3fb9dd6 parent IN=368491717 IN class=com.sleepycat.je.tree.IN lastFullVersion=0x510/0x2ca7d18 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1332)
    at com.sleepycat.je.tree.IN.findParent(IN.java:2886)
    at com.sleepycat.je.tree.Tree.getParentINForChildIN(Tree.java:881)
    at com.sleepycat.je.tree.Tree.getParentINForChildIN(Tree.java:809)
    at com.sleepycat.je.cleaner.FileProcessor.findINInTree(FileProcessor.java:1152)
    at com.sleepycat.je.cleaner.FileProcessor.processIN(FileProcessor.java:1090)
    at com.sleepycat.je.cleaner.FileProcessor.processFile(FileProcessor.java:538)
    at com.sleepycat.je.cleaner.FileProcessor.doClean(FileProcessor.java:241)
    at com.sleepycat.je.cleaner.Cleaner.doClean(Cleaner.java:463)
    ------------Configurations-------------------------
    EnvironmentConfig.ENV_RUN_CLEANER -> false
    EnvironmentConfig.CHECKPOINTER_HIGH_PRIORITY -> true
    EnvironmentConfig.CLEANER_EXPUNGE -> false
    Any hints are appreciated. I'm also working for Oracle CDC; feel free to call me at 861065151679 or drop me an email at [email protected] so that we can talk in more detail.
    Anfernee

    Anfernee, I will contact you via email.
    --mark

  • Data purge utility in OIM 9.1.0.2

    Hi All,
    Is anybody aware of any data purge procedures/steps/utilities in OIM?
    I have read that the following tables are used for audit purposes. These tables hold user profile snapshots (taken whenever user data changes). The reference document is http://docs.oracle.com/cd/E10391_01/doc.910/e10365/useraudit.htm
    UPA
    UPA_USR
    UPA_FIELDS
    UPA_GRP_MEMBERSHIP
    UPA_RESOURCE
    AUD_JMS
    UPA_UD_FORMS
    UPA_UD_FORMFIELDS
    I guess these tables can be truncated (after taking a backup) without any risk?
    Please advise.
    Ritu

    Read more about the script here: docs.oracle.com/cd/E14899_01/doc.9102/e14763/bulkload.htm#CHDHAFHC
    The OIM server should be up and running while the script runs.
    Regards,
    GP
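    Purely as an illustration of Ritu's own backup-then-truncate proposal, and not a confirmed OIM procedure: archive each table with CTAS before emptying it, handle the child tables (UPA_USR, UPA_FIELDS, etc.) before UPA if FK constraints exist, and confirm the whole approach with Oracle Support for your OIM release first:
    -- hypothetical archive-and-empty of one audit table
    CREATE TABLE upa_usr_bkp AS SELECT * FROM upa_usr;
    TRUNCATE TABLE upa_usr;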

  • How to purge BPEL data on 10.1.3.1

    Hi Experts,
    I want to purge old data from the BPEL dehydration repository.
    a) I read notes on Metalink, but they talk about 10.1.2, whereas mine is 10.1.3.
    b) I also read the below URL:
    http://orasoa.blogspot.com/2007/02/delete-bulk-bpel-instances.html
    c) Can someone PLEASE suggest a supported and proven method for a data purge on 10.1.3.1?
    I'm still reading the archives of this forum... no luck so far.
    In fact I'm also looking for details of the tables that I can purge.
    Thanks a lot!!!
    Natrajan

    Hi,
    here are the comments from the PL/SQL package collaxa:
    * procedure insert_sa
    * Stored procedure to do a "smart" insert of a scope activation message.
    * If a scope activation message already exists, don't bother to insert
    * and return 0 (this process can happen if two concurrent threads generate
    * an activation message for the same scope - say the method scope for
    * example - only one will insert properly; but both threads will race to
    * consume the activation message).
    * procedure insert_wx
    * Stored procedure to insert a retry exception message into the
    * wi_exception table. Each failed attempt to retry a work item
    * gets logged in this table; each attempt is keyed by the work item
    * key and an increasing retry count value.
    * procedure update_doc
    * Stored procedure to do a "smart" insert of a document row. If the
    * document row has not been inserted yet, insert the row with an empty
    * blob before returning it.
    * procedure delete_ci
    * Deletes a cube instance and all rows in other Collaxa tables that
    * reference the cube instance. Since we don't have referential
    * integrity on the tables (for performance reasons), we need this
    * method to help clean up the database easily.
    * procedure delete_cis_by_domain_ref
    * Deletes all the cube instances in the system. Since we don't have
    * referential integrity on the tables (for performance reasons), we
    * need this method to help clean up the database easily.
    * procedure delete_cis_by_pcs_id( processId )
    * Deletes all the cube instances in the system for the specified process.
    * Since we don't have referential integrity on the tables
    * (for performance reasons), we need this method to help clean
    * up the database easily.
    * procedure insert_document_ci_ref
    * Stored procedure to do a "smart" insert of a document reference.
    * If a document reference already exists for a cube instance, don't bother to insert
    * and return 0.
    */
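    A hedged sketch of how one of the documented procedures above might be invoked; the parameter type is an assumption, so check the actual package specification in your 10.1.3.1 dehydration store (and take a backup) before running anything like this:
    BEGIN
      -- delete all cube instances for one BPEL process, per the comment
      -- "procedure delete_cis_by_pcs_id( processId )" above; process id assumed numeric
      collaxa.delete_cis_by_pcs_id(12345);
      COMMIT;
    END;
    /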

  • Sliding window for historical data purge in multiple related tables

    All,
    It is a well-known question: how to efficiently back up and purge historical data based on a sliding window.
    I have a group of tables that all have to be backed up and purged based on a sliding time window. These tables have FKs relating them to each other, and the FK columns are not necessarily the timestamp column. I am considering range-partitioning all of these tables on the timestamp column, so I can export the out-of-date partitions and then drop them. The price I have to pay for this design is that the timestamp column is duplicated many times across the parent, child and grandchild tables, even though the value is the same, because I have to partition every table on this column.
    It is very much like the statspack tables: one stats$snapshot table and many child tables that store the actual statistics data. I am just wondering how statspack.purge does this, since using DELETE statements is very inefficient and time consuming. In the statspack tables, snap_time is stored only in the stats$snapshot table, not in every child table, and the tables are not partitioned, so I guess the procedure uses DELETE statements.
    Any thoughts on other good design options? Or how would you optimise the backup and purge of the statspack tables' historical data? Thanks!

    hey oracle gurus, any thoughts?
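    A minimal sketch of the partition-based sliding window described above, using a hypothetical ORDERS table range-partitioned by month on its timestamp column; archiving (e.g. export, or ALTER TABLE ... EXCHANGE PARTITION into a standalone table) would happen before the drop:
    CREATE TABLE orders (
      order_id  NUMBER,
      order_ts  DATE,
      amount    NUMBER
    )
    PARTITION BY RANGE (order_ts) (
      PARTITION p_2007_01 VALUES LESS THAN (TO_DATE('2007-02-01','YYYY-MM-DD')),
      PARTITION p_2007_02 VALUES LESS THAN (TO_DATE('2007-03-01','YYYY-MM-DD')),
      PARTITION p_max     VALUES LESS THAN (MAXVALUE)
    );

    -- slide the window: drop the oldest partition instead of DELETEing its rows
    ALTER TABLE orders DROP PARTITION p_2007_01 UPDATE GLOBAL INDEXES;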

  • What is master data in SAP CRM 2007 in Web Client

    Hi All,
    Could you please explain what the master data is in the SAP CRM 2007 Web Client Interaction Center?
    regards,
    pasha

    Master Data are as under:
    Business Partner
    Product
    IBase
    Regards,
    Rajesh Banka

  • Question: How to make EDQ Purge Project data after Finishing a Job?

    Hi all
    I know how to purge project data manually by clicking
    but I am wondering if there is a way to make EDQ purge project data by itself after finishing a job.
    Thank you in advance for any answers.

    Hi,
    In EDQ 9.0, there is no automated purge facility.
    However, note that you can configure a job to write only bare metrics plus whatever data and results you choose to stage (or indeed to stage no data at all), and you can make such options variable at runtime using a run profile. It is normally better to consider purging separately from the running of jobs, since any temporary footprint a job leaves in the results database is cleared up automatically in any case.
    Regards,
    Mike

  • Oracle Apps data Purge

    Hi Apps Gurus
    We have a requirement to delete one operating unit's data from the Oracle Projects, SCM and Finance modules.
    There are huge transaction volumes, interfaced to GL and other modules (e.g. Payables invoices have been transferred to GL).
    There are standard concurrent programs and purge functionalities, but they check for dependencies. We would like to delete the data completely, including from the GL module.
    Regards
    Dharam

    Hi;
    You cannot delete a module's data from EBS; if you do, consistency will be destroyed. You should not delete data from the backend either, because that can also cause many problems.
    If you need to purge a large amount of data, you have to log an SR and confirm the approach with Oracle Support.
    Regards
    Helios

  • Cmp 2.0 how to purge all data

    Hi,
    if we want to purge/delete data based on, say, a date or an ID, and there are around 100 tables, how can I implement this?
    Do I really have to call 100 EJBs and do a find-and-remove 100 times?
    any idea will be appreciated.
    Thanks
    John
    Toronto

    You can use the bulk update feature of the Java Persistence Query Language.
    For example
    DELETE FROM Customer c WHERE c.status = 'inactive'
    Please see section 4.11 of the spec for more details
