Purging of data

I am trying to move data from one database to another on a different machine.
I tried using a database link, but it gives me an error like "distributed transaction waiting for lock failed".
I want to insert the data using a simple INSERT with a SELECT query. Is there any option other than a database link to copy data from a table in one database to a table in another database?
Please help me solve this.
Thanks in advance
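For reference, the basic INSERT ... SELECT over a database link looks like the sketch below (the table name, link name, and date column are placeholders, not objects from the thread); committing immediately after the insert keeps the distributed transaction short, which often avoids the lock wait described above:

```sql
-- Sketch only: SRC_TABLE, REMOTE_DB, and CREATED_DATE are placeholder names
INSERT /*+ APPEND */ INTO local_copy
SELECT *
FROM   src_table@remote_db
WHERE  created_date < SYSDATE - 365;

COMMIT;  -- commit promptly so the distributed transaction releases its locks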

Sounds like you should consider an archive policy. How about creating mirror archive tables (identical apart from an ARC suffix) and kicking off a process that inserts data from each table based on date criteria and then deletes it from the master table? When this is complete, a table-level export can export the ARC tables, and after that they can be truncated.
Is there any business or performance reason why you have to archive this data?
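A minimal sketch of that flow for a single table (the ORDERS table, the CREATED_DATE column, and the 12-month cutoff are all illustrative assumptions):

```sql
-- 1. Mirror archive table (structure only, no rows)
CREATE TABLE orders_arc AS SELECT * FROM orders WHERE 1 = 0;

-- 2. Move rows older than the cutoff into the archive table
INSERT INTO orders_arc
SELECT * FROM orders WHERE created_date < ADD_MONTHS(SYSDATE, -12);

DELETE FROM orders WHERE created_date < ADD_MONTHS(SYSDATE, -12);
COMMIT;

-- 3. After a table-level export of ORDERS_ARC, empty it
TRUNCATE TABLE orders_arc;
```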

Similar Messages

  • Purging Old Data

Please help me solve an issue related to purging data.
There are some 40 tables in the database I am working on, and I need to check these tables for data that is 5 years old and delete it. But these tables are interlinked with other tables (parent-child relations), so how do I proceed with this?
I thought of the ON DELETE CASCADE option, but what if a related record in a child table is less than 5 years old (data within the last 5 years should not be deleted)?

    Aparna16 wrote:
    pls help in solving one issue related to purging the data.Interesting problem.. and one that I, wearing my DBA hat, will throw back at the developers.
    They know the data model. This request for purging old data is very likely to occur again in a year's time. What then? Go through a painful exercise again (this time taking data model changes since the last time into consideration)?
I do not see the logic in that. So instead I will throw this back at the developers and tell them that a PL/SQL package needs to be designed and written to purge old data. DBA input into this will be in terms of design. Can the purge be done as a single massive delete transaction? Or does it make more sense to design the purge for a specific date range? Or for a specific product/invoice/customer/whatever business entity that can be aged? And then run multiple such purges in parallel?
And why the purge? Is it to free up space? That may not be the case, depending on the pctfree/pctused settings on a data block. So do some tables perhaps need to be rebuilt after a purge in order to reorganise the table and free space?
That's the DBA side of the problem. Figuring out what data can be deleted from which tables, and writing the code to do that, is a developer problem.
So whichever side you are on, make sure you get the other side to assist you with this task.
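A hedged sketch of such a purge for one parent-child pair (the ORDERS/ORDER_LINES tables and their columns are invented for illustration): children are deleted first so foreign keys are never violated, and the guards protect any record younger than the cutoff.

```sql
DECLARE
  v_cutoff DATE := ADD_MONTHS(SYSDATE, -60);  -- 5 years
BEGIN
  -- Delete child rows first, but only those that are themselves old enough
  -- and whose parent is also older than the cutoff
  DELETE FROM order_lines l
  WHERE  l.created_date < v_cutoff
  AND    EXISTS (SELECT 1 FROM orders o
                 WHERE  o.order_id = l.order_id
                 AND    o.created_date < v_cutoff);

  -- Delete parents only if no child rows remain for them
  DELETE FROM orders o
  WHERE  o.created_date < v_cutoff
  AND    NOT EXISTS (SELECT 1 FROM order_lines l
                     WHERE  l.order_id = o.order_id);
  COMMIT;
END;
/
```

In a real package each pair of statements would be driven off the data model's foreign-key metadata, as the reply above suggests, rather than hard-coded per table.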

  • How to purge BPEL data on 10.1.3.1

    Hi Experts,
I want to purge old data from the BPEL dehydration repository.
a) I read the notes on Metalink, but they talk about 10.1.2, whereas mine is 10.1.3.
b) I also read the below URL:
http://orasoa.blogspot.com/2007/02/delete-bulk-bpel-instances.html
c) Can someone PLEASE suggest a supported and proven method for data purge on 10.1.3.1?
I'm still reading the archives of this forum... no luck so far.
In fact, I'm also looking for details of which tables I can purge.
    Thanks a lot!!!
    Natrajan

    Hi,
here are the comments from the collaxa PL/SQL package:
    * procedure insert_sa
    * Stored procedure to do a "smart" insert of a scope activation message.
    * If a scope activation message already exists, don't bother to insert
    * and return 0 (this process can happen if two concurrent threads generate
    * an activation message for the same scope - say the method scope for
    * example - only one will insert properly; but both threads will race to
    * consume the activation message).
    * procedure insert_wx
    * Stored procedure to insert a retry exception message into the
    * wi_exception table. Each failed attempt to retry a work item
    * gets logged in this table; each attempt is keyed by the work item
    * key and an increasing retry count value.
    * procedure update_doc
    * Stored procedure to do a "smart" insert of a document row. If the
    * document row has not been inserted yet, insert the row with an empty
    * blob before returning it.
    * procedure delete_ci
    * Deletes a cube instance and all rows in other Collaxa tables that
    * reference the cube instance. Since we don't have referential
    * integrity on the tables (for performance reasons), we need this
    * method to help clean up the database easily.
    * procedure delete_cis_by_domain_ref
    * Deletes all the cube instances in the system. Since we don't have
    * referential integrity on the tables (for performance reasons), we
    * need this method to help clean up the database easily.
    * procedure delete_cis_by_pcs_id( processId )
    * Deletes all the cube instances in the system for the specified process.
    * Since we don't have referential integrity on the tables
    * (for performance reasons), we need this method to help clean
    * up the database easily.
    * procedure insert_document_ci_ref
    * Stored procedure to do a "smart" insert of a document reference.
    * If a document reference already exists for a cube instance, don't bother to insert
    * and return 0.
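Going by the comments above, a purge would presumably go through one of the delete procedures. For example (the schema/package qualification and the parameter name are assumptions, so verify the actual signature in your dehydration store before running anything):

```sql
-- Hypothetical invocation -- check the real signature in your orabpel schema
BEGIN
  collaxa.delete_ci(p_cikey => 12345);  -- purge one cube instance and the
                                        -- rows in other tables that reference it
END;
/
```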

  • Question: How to make EDQ Purge Project data after Finishing a Job?

    Hi all
    I know how to purge project data manually by clicking
but I am wondering if there is a way to make EDQ purge project data by itself after finishing a job.
    Thank you in advance for all answer.

    Hi,
    In EDQ 9.0, there is no automated purge facility.
However, note that you can configure a job to write only bare metrics and to stage only the data and results you choose (or indeed to stage no data at all), and you can make such options variable at runtime using a run profile. It is normally better to consider purging separately from the running of jobs, since any temporary footprint a job has in the results database is cleaned up automatically in any case.
    Regards,
    Mike

  • Cmp 2.0 how to purge all data

    Hi,
If we want to purge/delete data based on criteria such as date or ID, and there are around 100 tables, how can I implement this?
Do I really have to call 100 EJBs and do find-and-remove 100 times?
    any idea will be appreciated.
    Thanks
    John
    Toronto

    You can use the bulk update feature of the Java Persistence Query Language.
    For example
DELETE FROM Customer c WHERE c.status = 'inactive'
    Please see section 4.11 of the spec for more details

  • Purpose of Demantra Purge History Data step

During our Demantra implementation phase, the seeded EBS Full Download workflow was not completing successfully. As recommended by Support, we created a custom EBS Full Download workflow and excluded the Purge History Data step. This resolved the issue, and the EBS download completes successfully. However, we have been running with this process for nearly 10 months and have not run the Purge History Data process.
    What are the consequences of not running Purge History Data process? I have found no information in implementation/user guides nor on metalink on the purpose of this process.
    Thanks

    Hi,
According to the description, I understand that "Microsoft.SqlServer.Management.PowerShell.sqlps110" is missing from the registry (regedit.exe).
    Per http://msdn.microsoft.com/en-us/library/cc281962.aspx:
    Beginning in SQL Server 2008, Setup installs the following Windows PowerShell components when you select either the client software or the Database Services nodes:
    •Windows PowerShell 1.0, if Windows PowerShell is not already present on your computer.
    •The SQL Server snap-ins. The snap-ins are dll files that implement two types of Windows PowerShell support for SQL Server:
A set of SQL Server cmdlets. Cmdlets are commands that implement a specific action. For example, Invoke-Sqlcmd runs a Transact-SQL or XQuery script that can also be run by using the sqlcmd utility, and Invoke-PolicyEvaluation reports whether SQL Server objects comply with policy-based management policies.
A SQL Server provider. The provider lets you navigate the hierarchy of SQL Server objects using a path similar to a file system path. Each object is associated with a class from the SQL Server Management object models. You can use the methods and properties of the class to perform work on the objects. For example, if you cd to a databases object in a path, you can use the methods and properties of the Microsoft.SqlServer.Management.Smo.Database class to manage the database.
    •The sqlps utility that is used to run Windows PowerShell sessions that include the SQL Server snap-ins.
    In addition, you can export the missing registry from other working machine and import it into the machine in problem. Make sure that the registry path is same.
    Best Regards,
    Tracy
    Tracy Cai
    TechNet Community Support

  • Purging the data in exchange tables in IM 3.2

    Hi All,
Please suggest on what basis we can purge the data of the Exchange tables for SAP IM 3.2.
    Regards-
    Garima
    Tags edited by: Michael Appleby


  • RUEI 12.1.0.1.1 - Purge Collected Data Issue

    Hello all,
    I am facing some strange behaviour from RUEI purge collected data option.
    Under RUEI Dashboard:
    System > Maintenance > System Reset > Purge Collected Data
Under RUEI's help, it says "Purge collected data to remove all collected data from the appliance." However, there are times when, after I purge data, I still see some collected data.
    Is this due to the reporter still monitoring the network traffic? Will stopping the collector before I do the purging help? (If yes, how do we stop the collector?) How can I make sure that all collected data is successfully purged?
    Regards,
    Nathan

    Dear Nathan,
your observation is correct: the purge collected data option actually purges data from the OLAP database model, i.e. processed data.
Data that has been collected since, or that is still in the processing engine, might not be purged.
The process you suggested should guarantee that no data remains in the collector / processing engine that might not be purged.
    Kind regards,
    Stefan

  • How to purge Sales Data in Oracle Demantra

    Hi Demantra experts,
I have one doubt in Demantra:
Suppose the user has loaded historical data, say sales history data, into the SALES_DATA table.
And after running the Analytical Engine, a forecast has been generated as well.
Later he found that it was wrong data and now it should be removed.
If the user wants to remove these records from the SALES_DATA table,
what is the process for purging the records?
    Thanks,
    Neeraj.

    Thanks a lot for your help.
    I checked the metalink note. It deals with removing the data from temporary tables and not the base tables.
The temporary tables act like interface tables. We need to populate the records in the respective temp table and run the .bat file.
However, the records stay in the temp table after the .bat file has loaded the data into the base tables.
    The suggested metalink note provides script to cleanup the temp tables of demantra.
    My requirement is to remove the records from base tables.
    Thanks,
    Neeraj.

  • How to Archive and Purge SLA data?

We are looking for a solution to archive and purge the SLA tables. I have opened an SR with Oracle, and they suggested working with a third party who specializes in archiving and purging data. I would like to know whether any clients have developed a custom solution for this, and if so, what criteria were used.
    Thanks in advance,
    Prathibha

    Hi,
    Thanks for the information.
But this traditional way of archiving is for maintenance and backup operations.
We want this process to run online, without taking the DB offline. Will this approach work in that case?
    In our case, the rules can be like -
1. For table 'A', if rows exceed 10 million, start archiving the data for that table.
2. For table 'B', if data is older than 6 months, start archiving the data for that table.
3. Archiving should run for 15 minutes only, then pause, and should resume whenever the user wants to resume.
4. Archiving should start on specified days only... etc.
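Rules like these usually end up in a custom, scheduler-driven PL/SQL routine rather than a packaged tool. A rough sketch of the time-boxed rule (the table names, date column, and 10,000-row batch size are invented for illustration); collecting ROWIDs first guarantees that exactly the rows copied to the archive are the ones deleted:

```sql
DECLARE
  v_deadline DATE := SYSDATE + 15 / (24 * 60);  -- stop after 15 minutes
  TYPE t_rids IS TABLE OF ROWID;
  v_rids t_rids;
BEGIN
  WHILE SYSDATE < v_deadline LOOP
    -- Pick the next batch of rows older than 6 months
    SELECT ROWID BULK COLLECT INTO v_rids
    FROM   big_table
    WHERE  created_date < ADD_MONTHS(SYSDATE, -6)
    AND    ROWNUM <= 10000;

    EXIT WHEN v_rids.COUNT = 0;  -- nothing left to archive

    FORALL i IN 1 .. v_rids.COUNT
      INSERT INTO big_table_arc
      SELECT * FROM big_table WHERE ROWID = v_rids(i);

    FORALL i IN 1 .. v_rids.COUNT
      DELETE FROM big_table WHERE ROWID = v_rids(i);

    COMMIT;  -- small commits let the job pause and resume safely
  END LOOP;
END;
/
```

Because each batch is committed independently, the block can simply be rerun on the next scheduled window to resume where it left off, which covers rules 3 and 4 above.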

  • Purging the data older than 1 year

I have a requirement to purge data older than 1 year.
The table has 5 billion rows, and about 4 billion of them need to be deleted in batches.
There is no primary key and there are no indexes on the table.
There is a Datetime column; I created a nonclustered index on it and plan to purge data based on that column.
Any help in preparing a script to purge the data in batches, older than 1 year, would be appreciated.
    Thanks,
    Ron.
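For the batched approach, a common T-SQL pattern looks like the sketch below (the table name is a placeholder, the Datetime column is taken from the description above, and the 50,000-row batch size is an arbitrary starting point to tune):

```sql
-- Delete in batches so the transaction log and lock footprint stay manageable
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM   dbo.BigTable
    WHERE  [Datetime] < DATEADD(YEAR, -1, GETDATE());

    SET @rows = @@ROWCOUNT;  -- loop ends when a batch deletes nothing

    CHECKPOINT;  -- helps log reuse under the SIMPLE recovery model
END;
```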

As noted above, it is better to build a new table containing only the rows you want to keep, using SELECT ... INTO:
    http://www.sqlusa.com/bestpractices/select-into/
    Kalman Toth Database & OLAP Architect
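Since only about a fifth of the rows survive, copying the keepers is often far cheaper than deleting 4 billion rows. A sketch of that approach (table and column names assumed as above; verify the copy before dropping anything):

```sql
-- Copy only the rows to keep, then swap the tables
SELECT *
INTO   dbo.BigTable_keep
FROM   dbo.BigTable
WHERE  [Datetime] >= DATEADD(YEAR, -1, GETDATE());

-- After verifying the copy and recreating any needed indexes:
-- DROP TABLE dbo.BigTable;
-- EXEC sp_rename 'dbo.BigTable_keep', 'BigTable';
```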

  • Reporting database creation without purging any data

We want to create a reporting database where no purge operation is performed and all records are retained. The production DB and the reporting DB are at different geographic locations. In this scenario, what tool should be used for replication? Can we achieve this using Data Guard? Can Streams be run over the WAN? Does Streams have a lot of overhead that might hamper performance?

Can we achieve this using Data Guard? No.
Can Streams be run over the WAN? Sure.
Does Streams have a lot of overhead that might hamper performance? In a downstream scenario, all you have to do is transport the archive logs to the other site; both capture and apply are done on the target site, so there is no overhead on the source site (however, processing can be quite demanding on the target site).
You can write your own DDL/DML handlers according to your needs.

  • Year end to purge previous data ( in PO, AP, AR & GL)?

    Dear all,
We are going to purge data in PO, AP, AR & GL. Is there any purge program to do this in EBS?
    Thanks,
    Amy

    Hi;
Please follow:
    Subject: Purging Strategy for eBusiness Suite 11i Doc ID: 732713.1
    How To Purge Oracle Payables Data Doc ID: 158903.1
    List Of General Ledger Tables That The Archive And Purge Process Purges Data From Doc ID: 275580.1
Hope it is helpful.
Regards
    Helios

  • After purging the data..is need to rebuild the index?

    Hi All,
In our production DB, application-level incident purging has finished. (Meaning the first 3 years of data, which the client does not use, were purged. This was done for a performance issue.)
Now, in this situation, is it necessary to rebuild the indexes?
    Before that FYI,
    oracle 9i(Enterprise Edition)
    hp-ux b11
    dbsize=30G
    Application:
    tool:HP-ov
    Please throw your input.
    Thanks in advance :-)
    Regards,
    DB.
    Edited by: DB on May 21, 2013 11:26 PM

    >
In our production DB, application-level incident purging has finished. (Meaning the first 3 years of data, which the client does not use, were purged. This was done for a performance issue.)
Now, in this situation, is it necessary to rebuild the indexes?
    >
    Please clarify.
    You said you did the purge for 'performance issue' and now, after the purge, you still don't know if your performance issue went away?
    If you no longer have whatever (unknown to us since you didn't post it) 'performance issue' you had then why do you need to do anything?
    If you do still have a performance issue then you need to follow the same steps you should follow whenever you have a performance issue:
    1. validate that you REALLY have an issue
    2. determine the cause of the issue
    3. identify solutions to mitigate/eliminate the issue
    4. test one or two of the possible solutions
    5. implement your 'best' tested solution from step #4
    6. go back to step #1
    Based on what you posted it sounds like you implemented a 'purge' without really knowing that it was even necessary or was even causing a performance issue.
    And you make no mention at all of whether you recollected statistics after the purge was done.
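Following up on that last point, the usual post-purge housekeeping looks something like this sketch (the schema and index names are placeholders; on 9i, a rebuild is only worth doing if analysis shows the index holds a lot of now-empty space):

```sql
-- Regather optimizer statistics after the purge
BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'APP_SCHEMA', cascade => TRUE);
END;
/

-- Rebuild a specific index only if it is badly fragmented after the purge
ALTER INDEX app_schema.incident_dt_idx REBUILD ONLINE;
```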

  • How to purge EBS data without deleting any setup or configuration details

Today I got a very interesting requirement. We have a one-year-old instance. Now, senior management wants to remove/purge/delete all transactional data, though they want to keep all the EBS R12.1 setup and configuration information. I have no idea how to achieve this. Please help me with this.

    Hi Su;
    Please see:
    Purging Strategy for eBusiness Suite 11i [ID 732713.1]
    Also see:
http://oracleappstechnology.blogspot.com/2008/12/in-built-data-purge-concurrent-programs.html << everything for R11 is here
    In r12 What is use of Purge log and Closed system alerts
    Purge Debug Log And System Alerts Performance Issues
Regards
    Helios

  • How can I purge old data on EM Repository?

I'd like to remove old data from the EM repository; for example, data collected before Jan 1st 2007.
Is there any automatic job for that? Any ideas?

Be sure your dbms_jobs are working and have run.
MGMT_JOB_ENGINE.apply_purge_policies();
is perhaps the one you need to have run.
You can verify the info under:
Setup tab,
Management Services and Repository,
Repository Operations.
Then check the timestamps in the Last Scheduled Run column.
They should be no more than a day or two old for each and every one of them.
If they are not, check the repository DB's dbms_jobs and dbms_job.run(xxx) the job.
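To check and kick those jobs from SQL*Plus, something along these lines should work (the query uses the standard DBA_JOBS view; the job number 42 is a placeholder for whatever your query actually returns):

```sql
-- Find the repository purge jobs and see when they last ran
SELECT job, what, last_date, next_date, broken
FROM   dba_jobs
WHERE  UPPER(what) LIKE '%MGMT_JOB_ENGINE%';

-- Run a stale job manually by its job number (42 is an example)
BEGIN
  DBMS_JOB.RUN(42);
END;
/
```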

Maybe you are looking for

  • After clicking of applications folder in the dock, the items have no labels.

    Only happens after a reboot. System:   Model Name: MacBook   Model Identifier: MacBook4,1   Processor Name: Intel Core 2 Duo   Processor Speed: 2.4 GHz   Number Of Processors: 1   Total Number Of Cores: 2   L2 Cache: 3 MB   Memory: 2 GB   Bus Speed:

  • Connection to MySQL on GoDaddy hosted site

    I have a site set up with PHP and MySQL (by following David Powers book - exellent). Everything works fine on my localhost, but when I try to use the remote site to view a page containing MySQL data all that comes up is a blank page. I have added a l

  • Production order :  urgent

    Production order :   Hi PP guru's, I am getting a error message like this when i release the order and the same is not visible in COOIS production order information system . What could be the reason since we need to produce the material urgent. The s

  • Refresh rate for Numeric Indicator

    I have application that displays 60 16 bit word of data in Hex format on the screen simulatanousely as I receive block of data at at approximately 10,000 times a second through USB. Obviously, at this 10,000Hz rate, it is an over-kill in terms of the

  • Upgrade from LR 5.2 to 5.3. Is any preparation necessary?

    Is there anything I need to do before upgrading, such as backing up my develop presets ready to re-instate them after the upgrade ,or do I just go ahead, hit the "download" button and leave it to grind away? I can't find any description of the proces