How to purge BPEL data on 10.1.3.1

Hi Experts,
I want to purge old data in the BPEL dehydration repository.
a) I read the notes on Metalink, but they talk about 10.1.2, whereas mine is 10.1.3.
b) I also read the URL below:
http://orasoa.blogspot.com/2007/02/delete-bulk-bpel-instances.html
c) Can someone PLEASE suggest a supported and proven method for purging data on 10.1.3.1?
I'm still reading the archives of this forum... no luck so far.
In fact, I'm also looking for details of the tables I can purge.
Thanks a lot!!!
Natrajan

Hi,
here are the comments from the collaxa PL/SQL package:
* procedure insert_sa
* Stored procedure to do a "smart" insert of a scope activation message.
* If a scope activation message already exists, don't bother to insert
* and return 0 (this process can happen if two concurrent threads generate
* an activation message for the same scope - say the method scope for
* example - only one will insert properly; but both threads will race to
* consume the activation message).
* procedure insert_wx
* Stored procedure to insert a retry exception message into the
* wi_exception table. Each failed attempt to retry a work item
* gets logged in this table; each attempt is keyed by the work item
* key and an increasing retry count value.
* procedure update_doc
* Stored procedure to do a "smart" insert of a document row. If the
* document row has not been inserted yet, insert the row with an empty
* blob before returning it.
* procedure delete_ci
* Deletes a cube instance and all rows in other Collaxa tables that
* reference the cube instance. Since we don't have referential
* integrity on the tables (for performance reasons), we need this
* method to help clean up the database easily.
* procedure delete_cis_by_domain_ref
* Deletes all the cube instances in the system. Since we don't have
* referential integrity on the tables (for performance reasons), we
* need this method to help clean up the database easily.
* procedure delete_cis_by_pcs_id( processId )
* Deletes all the cube instances in the system for the specified process.
* Since we don't have referential integrity on the tables
* (for performance reasons), we need this method to help clean
* up the database easily.
* procedure insert_document_ci_ref
* Stored procedure to do a "smart" insert of a document reference.
* If a document reference already exists for a cube instance, don't bother to insert
* and return 0.
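
Going by those comments alone, calling the purge procedures from SQL*Plus would look something like the sketch below. This is only a sketch under assumptions: that the package is invoked as collaxa in the dehydration-store schema (typically ORABPEL) and that delete_ci / delete_cis_by_pcs_id take the single key/process id suggested by the comments. Check the actual package spec before running anything, and test on a clone with a backup first.

-- Hedged sketch only: verify the procedure signatures in your
-- dehydration-store schema, and back up before purging anything.

-- Purge one cube instance and every row referencing it:
BEGIN
  collaxa.delete_ci( 12345 );  -- 12345 is a hypothetical cube instance key
END;
/

-- Purge all instances of one process (processId, as named in the comments above):
BEGIN
  collaxa.delete_cis_by_pcs_id( 'MyBPELProcess' );  -- hypothetical process id
END;
/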

Similar Messages

  • CMP 2.0: how to purge all data

    Hi,
    if we want to purge/delete data based on, say, date or ID, and there are around 100 tables, how can I implement this?
    I mean, will I have to call 100 EJBs and do 100 finds and removes?
    any idea will be appreciated.
    Thanks
    John
    Toronto

    You can use the bulk update feature of the Java Persistence Query Language.
    For example
    DELETE FROM Customer c WHERE c.status = 'inactive'
    Please see section 4.11 of the spec for more details

  • How to purge EBS data without deleting any setup or configuration details

    Today I got a very interesting requirement. We have a one-year-old instance. Now senior management wants to remove/purge/delete all transactional data, though they want to keep all the EBS R12.1 setup and configuration information. I have no idea how I can achieve this. Please help me in this regard.

    Hi Su;
    Please see:
    Purging Strategy for eBusiness Suite 11i [ID 732713.1]
    Also see:
    http://oracleappstechnology.blogspot.com/2008/12/in-built-data-purge-concurrent-programs.html << everything for R11 is here
    In r12 What is use of Purge log and Closed system alerts
    Purge Debug Log And System Alerts Performance Issues
    Regards
    Helios
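
    If you prefer to schedule seeded purge programs from PL/SQL rather than from the Submit Request form, a minimal sketch using the standard FND_REQUEST API is below. The short name FNDLGPRG for "Purge Debug Log And System Alerts", the context ids, and the empty argument list are assumptions; verify them in your own instance first.

    -- Hedged sketch: submit a seeded purge concurrent program from PL/SQL.
    DECLARE
      l_request_id NUMBER;
    BEGIN
      -- An applications context must be set first; the ids here are hypothetical.
      fnd_global.apps_initialize( user_id => 0, resp_id => 20420, resp_appl_id => 1 );
      l_request_id := fnd_request.submit_request(
                        application => 'FND',
                        program     => 'FNDLGPRG',   -- assumed short name; confirm it
                        description => NULL,
                        start_time  => NULL,
                        sub_request => FALSE );
      COMMIT;
      dbms_output.put_line( 'Submitted request ' || l_request_id );
    END;
    /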

  • How to purge Sales Data in Oracle Demantra

    Hi Demantra experts,
    I have a doubt in Demantra:
    Suppose the user has loaded historical data, say sales history, into the SALES_DATA table.
    And after running the Analytical Engine, the forecast has been generated as well.
    Later he found that it was wrong data and now it should be removed.
    The user wants to remove these records from the SALES_DATA table.
    What is the process for purging the records?
    Thanks,
    Neeraj.

    Thanks a lot for your help.
    I checked the Metalink note. It deals with removing the data from the temporary tables, not the base tables.
    The temporary tables act like interface tables: we populate records into the respective temp table and run the .bat file.
    However, the records stay in the temp tables after the .bat file has loaded the data into the base tables.
    The suggested Metalink note provides a script to clean up Demantra's temp tables.
    My requirement is to remove the records from the base tables.
    Thanks,
    Neeraj.
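
    For reference, if Support does bless a direct delete from the base table, the shape of it would be something like the sketch below. The column name sales_date and the date window are assumptions on my part; verify them against your SALES_DATA table, take a backup first, and expect to re-run the engine afterwards.

    -- Hedged sketch only: delete a bad historical load from SALES_DATA.
    DELETE FROM sales_data
    WHERE  sales_date BETWEEN DATE '2011-01-01' AND DATE '2011-06-30';  -- hypothetical bad-load window
    COMMIT;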

  • Question: How to make EDQ purge project data after finishing a job?

    Hi all
    I know how to purge project data manually by clicking through the GUI,
    but I am wondering if there is a way to make EDQ purge project data by itself after finishing a job.
    Thank you in advance for any answers.

    Hi,
    In EDQ 9.0, there is no automated purge facility.
    However, note that you can configure a job to write only bare metrics plus whatever data and results you choose to stage (or indeed to stage no data at all), and you can make such options variable at runtime using a run profile. It is normally better to consider purging separately from the running of jobs, since any temporary footprint a job has in the results database is cleared up automatically in any case.
    Regards,
    Mike

  • How to purge data cache table using command line

    Hi:
    Is there a way to purge the data cache table using command line?
    thanks!

    Thanks, Mike.
    I'm thinking about the ldconsole provided with ALDSP.
    The ldconsole has a link for purging the cache. Is there anything I can leverage from there? Is it a JMX component that I can call?

  • How to purge the workflow which is in process

    Hi Friends,
    I am facing a problem with an AME workflow.
    When a user submits a page for approval, the Workflow engine is invoked, and a transaction id for that workflow transaction is created in the hr_api_transactions table. Now the problem is: if I purge that workflow process using the concurrent program 'Purge Obsolete Workflow Runtime Data', it deletes the workflow, but when I look at the hr_api_transactions table, the transaction id for that workflow is still active.
    Example:
    When I look through the responsibility (Workflow Administrator Web Applications) => Administrator Workflow => Status Monitor
    and type an item key, say some 'X', and search for the workflow status, I cannot see the workflow status; I can see it has been deleted.
    But when I query the same item key in hr_api_transactions,
    like: select * from hr_api_transactions where item_key='X', I can see the status of this transaction is active and the workflow is pending.
    Can anyone explain why the system is behaving like this, how to stop the workflow process, and how to delete the transaction id from the table?
    Thanks in advance

    Please check the notes below:
    453137.1 (Oracle Workflow Best Practices Release 12 and Release 11i), sections titled "Choosing Not to Use E-mail Notifications" and "Cleaning Up the WF_NOTIFICATION_OUT Queue"
    How to purge e-mail notifications from the workflow queue so the e-mail is not sent [ID 372933.1]
    264191.1, which describes how purging Oracle Workflow tables of obsolete runtime information for completed workflow processes is a required regular maintenance task
    Notification Mailers Unavailable
    How to delete undelivered notifications from WF mailer
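
    To spot transactions left behind after a purge, a quick check along the following lines may help. It is only a sketch built from the table and columns already mentioned in this thread (hr_api_transactions, item_key, status); verify the names in your instance before relying on it.

    -- Hedged sketch: AME transactions whose workflow item no longer exists.
    SELECT t.transaction_id, t.item_key, t.status
    FROM   hr_api_transactions t
    WHERE  t.item_key = 'X'   -- the item key that was purged
    AND    NOT EXISTS ( SELECT 1
                        FROM   wf_items wi
                        WHERE  wi.item_key = t.item_key );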

  • Purging Old Data

    Please help me solve an issue related to purging data.
    There are some 40 tables in the database I am working in. I need to check these tables for data that is 5 years old and delete it, but these tables are interlinked with other tables (parent-child relations). How should I proceed?
    I thought of the "with cascade" option, but what if a related record in a child table is less than 5 years old (data within 5 years should not be deleted)?

    Aparna16 wrote: "pls help in solving one issue related to purging the data."
    Interesting problem.. and one that I, wearing my DBA hat, will throw back at the developers.
    They know the data model. This request for purging old data is very likely to occur again in a year's time. What then? Go through a painful exercise again (this time taking data model changes since the last time into consideration)?
    I do not see the logic in that. So instead I will throw this back at the developers and tell them that a PL/SQL package needs to be designed and written to purge old data. DBA input into this will be in terms of design. Can a purge be done as a single massive delete transaction? Or does it make more sense to design the purge for a specific date range? Or for a specific product/invoice/customer/whatever business entity that can be aged, and then run multiple such purges in parallel? (A sketch of the date-range pattern follows below.)
    And why the purge? Is it to free up space? That may not be the case, depending on the pctfree/pctused settings on a data block. So do some tables perhaps need to be rebuilt after a purge in order to reorganise the table and free space?
    That's the DBA side of the problem. Figuring out what data can be deleted from which tables, and writing code to do that - that's a developer problem.
    So whichever side you are on, make sure you use the other side to assist you with this task.
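
    As a starting point for such a package, a minimal sketch of the pattern (children first, then the parent, purged by date range) is below. Every table and column name here is hypothetical; the real package has to follow your own data model.

    -- Hedged sketch of a date-ranged purge with no FK cascade available.
    CREATE OR REPLACE PROCEDURE purge_old_orders( p_before IN DATE ) AS
    BEGIN
      DELETE FROM order_lines ol                 -- child table first
      WHERE  EXISTS ( SELECT 1
                      FROM   orders o
                      WHERE  o.order_id   = ol.order_id
                      AND    o.order_date < p_before );
      DELETE FROM orders                         -- then the parent
      WHERE  order_date < p_before;
      COMMIT;
    END purge_old_orders;
    /
    -- Run it per date range rather than as one massive delete, e.g.:
    -- EXEC purge_old_orders( ADD_MONTHS( TRUNC(SYSDATE,'MM'), -60 ) );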

  • How to purge Workflow queue after R12 upgrade before starting WF Mailer?

    Hi,
    We are about to upgrade to R12.1.3 from R11.5.9.
    As part of the testing, I configured and started the Workflow Mailer in the new system. As soon as I did that, the system started sending a ton of notifications regarding past Requisition approvals apparently queued in the system. WF Mailer has been down since I did the last upgrade test.
    We plan to approve all requisitions before the cutover in the old system, so there should be no email notifications pending. Is there a way of updating msg_state of those notifications in the wf_notification_out table with a value of "READY", so we can make sure that we will have a clean system and users will not receive any notifications regarding the past requisition approvals?
    Thanks,
    Sinan

    Please see these docs.
    Note: 847889.1 - Stop Workflow Notification Emails During Clone
    Note: 828812.1 - How To Stop Old Outbound Workflow Notification Email Messages During Clone Activity
    Note: 603003.1 - How To Remove Workflow Data On A Test Or Cloned Instance
    Note: 372933.1 - How to purge e-mail notifications from the workflow queue so the e-mail is not sent
    Note: 736508.1 - How to Cancel Email Notifications for Particular Workflow Type
    Regards,
    Hussein

  • How to purge tables using ODI???

    hi,
    I want to purge data in a target table. My data retention policy is to keep only the last 90 days of data. How can I do this in ODI? I have only one DATE column.
    Thanks,
    Regards,
    AMSII

    If your existing date field can be used to identify the records to be purged, then you could simply create a procedure in ODI along the lines of:
    delete from <%=odiRef.getSchemaName( )%>.your_table where your_date_field < (sysdate - 90)
    This is the simplest way, but be wary of the indexes on the table so they don't get too fragmented. You could also investigate a daily partition strategy on the table, which would enable you to drop partitions older than 90 days; that would be much faster, as sketched below.
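
    For the partitioning route, the general idea looks like the sketch below; the table, column, and date values are hypothetical, and interval partitioning assumes Oracle 11g or later. Dropping a partition is a dictionary operation, so it is far cheaper than deleting the same rows.

    -- Hedged sketch: daily interval partitioning so old data is dropped, not deleted.
    CREATE TABLE your_table (
      your_date_field DATE NOT NULL,
      payload         VARCHAR2(4000)
    )
    PARTITION BY RANGE (your_date_field)
    INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
    ( PARTITION p_initial VALUES LESS THAN (DATE '2012-01-01') );

    -- Drop the partition holding a given day once it is older than 90 days
    -- (partition names are system-generated; see USER_TAB_PARTITIONS):
    ALTER TABLE your_table DROP PARTITION FOR (DATE '2012-01-15') UPDATE GLOBAL INDEXES;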

  • Year end to purge previous data ( in PO, AP, AR & GL)?

    Dear all,
    We are going to purge data in PO, AP, AR & GL. Is there any purge program to do this in EBS?
    Thanks,
    Amy

    Hi;
    Please follow:
    Purging Strategy for eBusiness Suite 11i, Doc ID: 732713.1
    How To Purge Oracle Payables Data, Doc ID: 158903.1
    List Of General Ledger Tables That The Archive And Purge Process Purges Data From, Doc ID: 275580.1
    I hope it's helpful.
    Regards
    Helios

  • How is BPEL different from ESB?

    How is BPEL different from ESB?

    BPEL and ESB are similar at a high level:
    they can both be used to integrate systems/processes and make data flow between them.
    BPEL is for designing processes
    1) where human tasks are involved, or
    2) where a lot of complex looping is needed.
    ESB
    1) is designed for integrating different systems/processes and making data flow between them, and
    2) since it is designed for that very purpose, it is faster than BPEL at it.
    Yatan

  • RUEI 12.1.0.1.1 - Purge Collected Data Issue

    Hello all,
    I am facing some strange behaviour from RUEI purge collected data option.
    Under RUEI Dashboard:
    System > Maintenance > System Reset > Purge Collected Data
    Under RUEI's help, it says "Purge collected data to remove all collected data from the appliance." However, there are times after I purged data, I still see some collected data.
    Is this due to the reporter still monitoring the network traffic? Will stopping the collector before I do the purging help? (If yes, how do we stop the collector?) How can I make sure that all collected data is successfully purged?
    Regards,
    Nathan

    Dear Nathan,
    Your observation is correct: the purge collected data option actually purges data from the OLAP database model, i.e. processed data.
    Data that has been collected since, or that is still in the processing engine, might not be purged.
    The process you suggested should guarantee that there is no data left in the collector / processing engine that might not be purged.
    Kind regards,
    Stefan

  • Soundtrack Pro Crashed!! How I got my data back.

    If you get the beachball during a recording (possibly caused by the screen saver kicking in or the disk drive going to sleep, all of which should be disabled to start with), you might end up with a file that shows a size and a good creation time but won't play or show a waveform. In fact, even Audacity won't touch it. Here's how to recover the data. Hopefully you know the settings you were using, e.g. 48 kHz, 44.1 kHz, etc. For the sake of this post, I used rescue information from a 24-bit 44.1 kHz AIFF sample. Bear with me.
    First locate your bad file in the Soundtrack Pro recordings folder. Duplicate it. Note its size. Move the copy to the desktop. Do not muck with the original.
    Download the 0xED (that's the name) hex editing app. You'll also want a duplicate of a working file from an earlier session; copy it and put it on your desktop too. You'll need the header info from it.
    Open your known good file with 0xED. Highlight and copy from the beginning of that file, address 0000000 through 0000FF7, with a Control-C on the keyboard.
    Open your bad file copy. They can both be open at the same time. Highlight those same addresses and Control-V that data into your bad file. Pay close attention that that's all you changed; nothing should have shifted anywhere else. If you screwed up, Control-Z out of it and try again until it looks like a clean paste. Yes, the data is going to be different in your paste operation; that's why your file is hosed to begin with. Hopefully you have just patched your file. SAVE the potentially repaired file using the save icon in 0xED. Open your repaired file
    with Audacity or Soundtrack Pro. You should have a workable file now. If not, go back and try again; perhaps you messed something up or didn't save the repair properly. It's crucial that you use data from a good file with the same settings to fix the bad one. If you're not sure of the settings you used, try a couple of possibilities. Most people use the same settings with only a few variations, so your choices should hopefully be few. NOTE: To test your file, close all instances of the bad file in Soundtrack Pro, or your new file won't check out even if it is indeed fixed. Do a fresh load of your new file into a freshly started Soundtrack Pro or Audacity.
    What else have I used? I've used SoundHack a few times. It's a bit strange and clunky, and the results are hard to nail down. I once ended up with a file that played like Mickey Mouse, but I was able to use Audacity to cut the speed down by 50% and save the audio.
    The problems that got me to this point were as described at the beginning of this write-up. I had the beachball. One time Soundtrack Pro just quit without warning. Again, the data was there but just wasn't closed out properly.
    I hope this saves someone from a disaster. It saved me. I never gave up.
    cheers
    Johnny
    P.S. Here's the header data from a good 44.1 kHz 24-bit AIFF file, if it can be of use. You'd want to place it right over the start of your bad file in 0xED. No guarantees. Make sure you're mucking with a copy and NOT your original file...... (yes, that's 4C twice; that's not a misprint)
    464F524D0A90582741494646434F4D4D000000120001000000000018400EAC44000000000000464C4C5200000FC2
    (followed by a long run of 00 padding bytes, the FLLR filler chunk, up to the start of the data chunk below; the forum software wrapped and mangled the original dump, so take the padding from your own good file rather than copying it from here)
    53534E440B3E3D63
    P.P.S. By all means drop me a line if this helps anyone. I'd love to know.

    If you did not back up your phone and memory card, there is no way to get the information back. You can't roll back the memory or firmware; once it's done, it's done for good. Sorry, I know this is not good news.

  • How can I move data from one column to another in my Access table?

    I have two columns: one that stores the current month’s data and one that stores last month’s data. Every month, data from column 2 (this month’s data) needs to be moved to column 1, which holds last month’s data. I then null out column 2 so I can accumulate this month’s data.
    I understand how to drop or add a column, but how do I transfer data from one column to another?
    Here is my trial code:
    <cfquery name="qQueryChangeColumnName" datasource="#dsn#">
      ALTER TABLE leaderboard
      UPDATE leaderboard SET  points2 = points3
    </cfquery>
    Unfortunately, I get the following error:
    Error Executing Database Query.
    [Macromedia][SequeLink JDBC Driver][ODBC Socket][Microsoft][ODBC Microsoft Access Driver] Syntax error in ALTER TABLE statement.
    How can I transfer my data with the alter table method?

    "I looked up the Access SQL reference (which is probably a good place to start when having issues with Access SQL), and it suggests you probably need a WHERE clause in there."
    I agree the documentation is a good place to start. But you should not need a WHERE clause here.
    "Too few parameters. Expected 1."
    If you run the SQL directly in Access, what are the results? At the very least, it should provide a more informative error message.
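
    For what it's worth, a minimal sketch of the corrected statements is below (the query names are mine). The ALTER TABLE line should go entirely, since this is a plain UPDATE, and the move plus the null-out are two separate statements:

    <cfquery name="qMovePoints" datasource="#dsn#">
      UPDATE leaderboard
      SET points2 = points3
    </cfquery>
    <cfquery name="qResetPoints" datasource="#dsn#">
      UPDATE leaderboard
      SET points3 = NULL
    </cfquery>

    As an aside, Access usually raises "Too few parameters. Expected 1" when it doesn't recognise a column name and treats it as a query parameter, so double-check that points2 and points3 both exist in leaderboard.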
