Data deletion in PSA tables

Hello folks,
I am trying to delete data older than 7 days from my PSA tables. With process chains I am able to delete the request IDs, but the actual data is not getting dropped. I could write an ABAP program, but that would delete the entire content of the PSA tables. I went through a lot of SDN messages on this topic but did not find a solution. Could you please offer some suggestions?
Kris

Hi,
There is a process type called "Deletion of Requests from PSA" where you can set the retention, i.e. how many days of old data you want to keep.
As far as I have seen, once a request is deleted, all the data belonging to that request is deleted as well; otherwise there would be no point in deleting just the request IDs.
I guess there must be some issue with your process chain. Try deleting a PSA request manually if any are still visible in the PSA table.
Regards,
Durgesh.
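
To make the idea concrete, here is a minimal, hedged ABAP sketch of the "delete by request age" approach. The PSA table name /BIC/B0000123000 is only an example, and looking up the load date via RSREQDONE-DATUM is an assumption; the process type above remains the preferred way, since it also cleans up the request administration.

  REPORT z_psa_delete_old_requests.

  CONSTANTS: c_psa_table TYPE tabname VALUE '/BIC/B0000123000'. "example PSA table

  DATA: lt_rnr   TYPE TABLE OF rsreqdone-rnr,
        lv_rnr   TYPE rsreqdone-rnr,
        lv_date  TYPE sy-datum,
        lv_where TYPE string.

  lv_date = sy-datum - 7.

  "Requests whose load date is more than 7 days old (assumption: RSREQDONE-DATUM)
  SELECT rnr FROM rsreqdone INTO TABLE lt_rnr
    WHERE datum < lv_date.

  LOOP AT lt_rnr INTO lv_rnr.
    "Delete only the rows belonging to this request from the PSA table
    CONCATENATE 'REQUEST = ''' lv_rnr '''' INTO lv_where.
    DELETE FROM (c_psa_table) WHERE (lv_where).
  ENDLOOP.

  COMMIT WORK.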

Similar Messages

  • Data Deletion from PSA Tables

    Hi,
    I deleted 2 failed requests (about 10,000,000 records) from the PSA tables because of a disk space issue on our BW system:
    RSA1 -> PSA -> Delete PSA Data.
    The requests are deleted from the PSA; in BW I can see that there are no requests left in that particular PSA.
    But the Basis/database team raised an issue that data has been added to the database.
    What does this mean, and how can I proceed?
    Thanks

    Hi,
    Thanks for the information provided.
    So, according to that, deleting the request from the PSA only marks it as 'To be deleted'.
    'When the last of all requests in one partition gets deleted' - what do you mean by this?
    There were 2 requests in our PSA; I deleted both of them, and when I go to the PSA in BW I don't find any requests left.
    But our Basis team complained that some data has been added to the database.
    Thanks

  • Missing data packages for PSA Table in sap netweaver 2004s

    When we tried to load the master data 0CUSTOMER, we got the message:
    "Information IDoc: sent, but did not arrive"
    Missing data packages for PSA table.
    Diagnosis:
    Data packets are missing from the PSA table. BI processing does not return any errors. The data transport from the source system to BI was probably incorrect.
    We couldn't find any entries in the source system. We have checked the tablespaces and authorizations; they look fine.
    We have replicated and tried again, but we still get the same error message.
    We have SAP_BASIS release 700, level 0008, highest support package SAPKB70008.
    Please help us with a solution.
    Thanks in advance.
    -Soujanya

    Hi
    1) Go to transaction BD87, take your IDoc number, and check the status of the IDoc; if it is 64, just process it manually.
    2) Go to the tRFC monitor (e.g. SM58) in the data warehouse and execute the LUWs manually.
    hope it helps
    regards
    AK
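
    As a concrete aid for step 1 above, a small, hedged ABAP check can list the IDocs that are still waiting in status 64 before you push them through BD87. The message types RSRQST/RSINFO are assumptions about which BW load IDocs are involved; adjust them to your case.

      "Hedged sketch: list IDocs in status 64 (ready for processing)
      DATA: lt_idocs TYPE STANDARD TABLE OF edidc,
            ls_idoc  TYPE edidc.

      SELECT * FROM edidc INTO TABLE lt_idocs
        WHERE status = '64'
          AND mestyp IN ('RSRQST', 'RSINFO').   "assumed BW message types

      LOOP AT lt_idocs INTO ls_idoc.
        WRITE: / ls_idoc-docnum, ls_idoc-mestyp, ls_idoc-credat.
      ENDLOOP.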

  • How to correct the data in the psa table?

    1Q. There are a lot of invalid characters in an InfoPackage of, say, 1 million records. It takes a lot of time to check and correct each record in the data package (PSA). I think there is a more efficient way to resolve this, namely going into the PSA table and correcting all the records there. Is that right, and if yes, how do I do it?
    2Q. Say there are 30 data packages in the request and only data package 25 has the bad records. If I correct the data in the PSA and push it to the data target, it will process all the data packages one by one, which takes a lot of time and delays our process chain job that depends on this load. Can I manually process just this one data package? If yes, how?
    3Q. When I have successfully corrected all the bad records in the data package and pushed it from the PSA, the request does not turn green, and I have to set it to green manually in the data target after verifying that none of the data packages have bad records; it is a delta update. Is my process right? As it is a delta, what pitfalls do I have to watch for? The next step after this is to compress the request, which is risky because this basic cube holds a lot of history and would take a long time, probably weeks, to reload. What precautions should I take before setting the status to green in the data target?
    Thanks in advance! And I know how to thank SDN experts: by assigning points.

    Hi,
    1Q. Maintain the additional permitted characters in transaction RSKC, and/or write an ABAP routine to filter out the invalid characters (a small routine sketch follows at the end of this reply).
    2Q. For the incorrect data packet, you can right-click on the data packet in the monitor details tab and choose "update manually". That way you don't need to reload the entire request.
    3Q. When you reload the request or update an individual data packet again, the request should automatically turn green; you don't have to set it green manually. The pitfall is that if you manually set a delta request to green, you risk losing data and corrupting the delta. Best practice is never to set a request green manually. Even if you have already compressed the requests, you can use selective deletion to delete the data and then reload it with an InfoPackage that uses the same selections you used for the deletion.
    Cheers,
    Kedar
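
    Regarding 1Q, here is a minimal, hedged sketch of the kind of 3.x transfer routine meant above, blanking out characters outside an allowed set. The allowed character set and the source field TXTMD are placeholders; the real list of permitted characters should match what is maintained via RSKC. In a BI 7.0 transformation, the same logic would go into a field or start routine.

      "Hedged sketch: blank out characters that are not in the allowed set
      CONSTANTS c_allowed(60) TYPE c VALUE
        ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.

      DATA: lv_text(60) TYPE c,
            lv_len      TYPE i,
            lv_pos      TYPE i.

      lv_text = TRAN_STRUCTURE-txtmd.        "placeholder source field
      lv_len  = strlen( lv_text ).

      WHILE lv_pos < lv_len.
        IF lv_text+lv_pos(1) CN c_allowed.   "character is not in the allowed set
          lv_text+lv_pos(1) = ' '.
        ENDIF.
        lv_pos = lv_pos + 1.
      ENDWHILE.

      RESULT = lv_text.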

  • Custom delta extractor: All data deleted in source table in R/3

    Hi everyone,
    I have made a custom delta extractor from R/3 to a BW system. The setup is the following:
    The source table in R/3 holds a timestamp, which is used for the delta. The data is then loaded into a DSO in the BW system. The extractor works as expected with delta capability. Furthermore, if I delete a record in the source table, this is not transmitted to the DSO, which is also as expected.
    The issue is this, however: if we delete all data in the source table, then on the next load there is a request showing 1 record transferred to the DSO. This request does not show up in the PSA, and afterwards all data fields in the DSO are set to initial.
    Does anyone know why this happens?
    Thank you in advance.
    Philip R. Jarnhus

    Hi Philip,
    As you have used a generic extractor, I am not sure how ROCANCEL will behave in your case, but you can check the documentation on 0RECORDMODE for more information.
    Regards,
    Durgesh.

  • Data deletion in DB table

    Hi,
    I have set "maintenance allowed" in the Delivery and Maintenance settings of a DB table,
    but I still cannot delete data from the table. What could be the reason? Should I change
    some other settings of the DB table?

    Hi,
    First of all, you should NEVER directly delete entries from a standard SAP table.
    Otherwise, you can either
    (1) delete the entries row by row through SE16N, or
    (2) delete them with a SQL DELETE statement, for example:
            DELETE FROM ztable WHERE field = 'VALUE'.
    Regards,
    Hardik B

  • Deleting the data from the PSA

    Hi Experts,
    I have a case where I have to delete my PSA data, which amounts to nearly two years of data.
    Any idea on this highly appreciated.
    Regards,
    Srini.

    Yes, data in the PSA tables should be deleted periodically to prevent them from growing indefinitely.
    It is also a good idea to delete incorrect requests, or deltas for a data target to which you no longer want to load deltas.
    http://help.sap.com/saphelp_nw04/helpdata/en/b0/078f3b0e8d4762e10000000a11402f/frameset.htm

  • PSA data deletions

    There are scheduled loads to a data target via a meta chain.
    I want to include deletion of PSA data as part of the scheduled loads. I just wanted to check the following:
    1. Is there a process chain process type to delete PSA tables?
    2. Is it better to delete the PSA tables collectively?
    3. Should I include the PSA deletion as the last step of each individual process chain, or as a new process in the meta chain?
    Thanks

    Hi,
    If you put the deletion of all PSA tables into one process, the system will delete the PSAs one by one and the process will take more time.
    So group the PSA tables, e.g. master data PSAs, transaction data PSAs, PSAs between the source system and BI, PSA tables within BI, etc.
    Then analyse the data volume of each PSA. If a table receives a large volume, it is better to keep it alone in a single process type, whereas smaller ones can be grouped. It also depends on the work processes available on your server, your regular process chain design, and the data volume.
    Hope this gives you a direction.
    Regards,
    Arun Thangaraj.

  • Any table for the data in ODS, PSA, INFOCUBE

    Hi All,
    We are loading data to the PSA, then to an ODS, and then to an InfoCube. Is there any table where we can check the number of records loaded in each run into the PSA, the ODS, and the InfoCube?
    When you check the data in the PSA you will see, for example:
                    Processing          PSA
                    Update Mode         Full update
                    Records Check       32 records received
    Since the number of records loaded into the PSA is shown here, there must be a table where this information is stored.
    Does anyone know which table that is?
    Thanks,
    Gautam

    Hi,
    You can check the data in the PSA table and Active table of ODS.
    PB
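
    To make this concrete, here is a hedged ABAP sketch that looks up the technical PSA table name in the directory table RSTSODS and counts its rows. The field names ODSNAME and ODSNAME_TECH, and the search pattern, are assumptions about that table's layout; for the ODS you would count the active table in the same way.

      "Hedged sketch: find the PSA table behind a DataSource and count its records
      DATA: lv_psa_tab TYPE tabname,
            lv_count   TYPE i.

      SELECT SINGLE odsname_tech FROM rstsods INTO lv_psa_tab
        WHERE odsname LIKE '2LIS_03_BF%'.    "assumed transfer structure / PSA name pattern

      IF sy-subrc = 0.
        SELECT COUNT( * ) FROM (lv_psa_tab) INTO lv_count.
        WRITE: / 'Records in', lv_psa_tab, ':', lv_count.
      ENDIF.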

  • How to add index to PSA table?

    Hi Experts,
    As the title says, I want to add an index to a PSA table.
    My scenario is as follows:
    We initialized 2LIS_03_BF WITHOUT blocking business users (we could not stop the R/3 system), so there are 6 years of data in the PSA table - more than 48 million records. Because we did not stop R/3 business during the initialization, I have to search for the overlap between the delta records and the full-load records (this overlap contains duplicate records, which I need to find and delete to make sure the data is correct).
    This means the selection is very expensive, as it has to look for more than five thousand delta records among more than 48 million full-load records. I have already triggered this search with a program, but it has been running for 7 days and is still not finished, which is driving me crazy. So my question is: if I create an index on the PSA table, will the comparison run faster? Or is there another way to handle my scenario?
    Thanks in advance.
    Best Regards,
    Bruce

    Hi,
    Look up the PSA table name in the RSTSODS table; then you can create an index on it in SE11. Note that loads will become slower if you do not drop the index afterwards.
    Use 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM to load the 0IC_C03 cube and design the report on that.
    See these documents for the steps to load 0IC_C03:
    Treatment of historical full loads with Inventory cube
    Setting up material movement/inventory with limit locking time
    If it is BI 7, then for BX you need to select Extraction mode = "Non-cumulative" in the Extraction tab of the DTP.
    Thanks
    Surendra Kumar Reddy Koduru
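
    As an alternative (or a complement) to the index, a key-based selection can let the database do the matching: read from the PSA table only those full-load rows that share the document keys of the roughly five thousand delta records, instead of scanning all 48 million rows in ABAP. This is a hedged sketch; the PSA table name and the key fields MBLNR/ZEILE are assumptions for 2LIS_03_BF.

      "Hedged sketch: fetch only full-load rows matching the delta record keys
      TYPES: BEGIN OF ty_key,
               mblnr TYPE mblnr,               "material document number
               zeile TYPE mblpo,               "material document item
             END OF ty_key.

      DATA: lt_delta TYPE STANDARD TABLE OF ty_key,
            lt_full  TYPE STANDARD TABLE OF ty_key.

      "... fill lt_delta with the ~5000 delta record keys ...

      IF NOT lt_delta IS INITIAL.
        SELECT mblnr zeile
          FROM ('/BIC/B0000123000')            "assumed PSA table name
          INTO TABLE lt_full
          FOR ALL ENTRIES IN lt_delta
          WHERE mblnr = lt_delta-mblnr
            AND zeile = lt_delta-zeile.
      ENDIF.

    An index on these two fields (created via SE11 as described above) is what makes this selection fast on a 48-million-row table.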

  • PSA table name not visible in the list

    Dear All,
    I have been trying to delete data from the PSA tables using the process type "Deletion of Requests from PSA" and the object type "PSA table".
    When searching for the PSA table name in the Object Name list, I cannot find one of the PSA table names (/BIC/B0000505), although I can find all the other table names in the list (e.g. /BIC/B0000504, /BIC/B0000506).
    Please let me know the possible reason for this and how to rectify it so that this PSA table name is included.
    Regards
    Shalabh

    Hi Raghavendra,
    The table is present in the system (SE11/SE16) and contains data as well.
    The problem is that I can find the other tables from the same DataSource in the list, but not this one PSA table.
    Could it be a corrupted PSA table, so that I need to re-transport the DataSource?
    Regards
    Shalabh

  • PSA tables in SAP BW

    Hi experts,
    I'm having trouble creating PSA tables in SAP BW. Can somebody please explain the different ways to create PSA tables?
    Thank You Very Much
    My Regards...

    Hi..
    After it is extracted from the source systems, data is transferred to the entry layer of the data warehouse, the persistent staging area (PSA). In this layer, data is stored in the same form as in the source system. The way in which data is transferred from here to the next layer incorporates quality-assuring measures as well as the transformations and clean-up required for a uniform, integrated view of the data.
    When you activate the DataSource, BI generates a PSA table and a transfer program.
    The data of the DataSource (R3TR RSDS) is transferred to the PSA.
    When you transport the restored 3.x DataSource into the target system, the DataSource (R3TR RSDS) is deleted in the after image. The PSA and InfoPackages are retained. If a transfer structure (R3TR ISTS) is transported with the restore process, the system tries to transfer the PSA to this transfer structure. This is not possible if no transfer structure exists when you restore the 3.x DataSource, or if IDoc is specified as the transfer method for the 3.x DataSource. In that case the PSA is retained in the target system but is not assigned to a DataSource/3.x DataSource or to a transfer structure.
    The PSA table to which the data is written is created when the transfer structure is activated.
    A transparent PSA table is created for every DataSource that is activated.
    The PSA tables each have the same structure as their respective DataSource, plus key fields for the request ID, the data package number, and the data record number.
    InfoPackages load the data from the source into the PSA. The data in the PSA is then processed further with data transfer processes.
    With the context menu entry "Manage" for a DataSource in the Data Warehousing Workbench, you can go to the PSA maintenance for the data records of a request, or delete request data from the PSA table of this DataSource. You can also reach the PSA maintenance from the monitor for requests of the load process.
    Using partitioning, you can separate the data set of a PSA table into several smaller, physically independent, and redundancy-free units. This separation can improve performance when you update data from the PSA.
    In the Implementation Guide, under SAP NetWeaver -> Business Intelligence -> Connections to Other Systems -> Maintain Control Parameters for Data Transfer, you define the number of data records needed to create a new partition. Only data records from a complete request are stored in a partition; the specified value is a threshold value.
    The number of fields is limited to a maximum of 255 when using tRFC to transfer data, and the length of the data record is limited to 1962 bytes.
    PSA table storage parameters:
    For PSA tables, you access the database storage parameter maintenance by choosing Goto -> Technical Attributes in DataSource maintenance. In 3.x dataflow, you access this setting in transfer rule maintenance via the Extras menu.
    You can also assign storage parameters for a PSA table that already exists in the system; however, this has no effect on the existing table. If the system generates a new PSA version (a new PSA table) because of changes to the DataSource, it is created in the data area defined by the current storage parameters.
    Define the PSA partition size (minimum number of records in a PSA partition) using transaction RSCUSTV6.
    Use program SAP_PSA_PARTITION_COMPRESS for existing PSAs.
    RSTSODS is the directory of all PSA tables.
    with regards,
    hari kv

  • How do I reclaim the unused space after a huge data delete- very urgent

    Hello all,
    How do I reclaim the unused space after a huge data delete?
    ALTER TABLE "ODB"."BLOB_TABLE" SHRINK SPACE; - this fails with an ORA-10662 error. Could you please help?

    'Shrink space' has requirements:
    shrink_clause
    The shrink clause lets you manually shrink space in a table, index-organized table or its overflow segment, index, partition, subpartition, LOB segment, materialized view, or materialized view log. This clause is valid only for segments in tablespaces with automatic segment management. By default, Oracle Database compacts the segment, adjusts the high water mark, and releases the recuperated space immediately.
    Compacting the segment requires row movement. Therefore, you must enable row movement for the object you want to shrink before specifying this clause. Further, if your application has any rowid-based triggers, you should disable them before issuing this clause.
    Werner
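
    As a hedged illustration of the ordering described above (row movement has to be enabled before the shrink), the two statements for the table from the question would be issued in this sequence; a DBA would normally run them directly in SQL*Plus, but from ABAP they could be passed through as native SQL:

      "Hedged sketch: enable row movement first, then shrink the segment
      EXEC SQL.
        ALTER TABLE "ODB"."BLOB_TABLE" ENABLE ROW MOVEMENT
      ENDEXEC.
      EXEC SQL.
        ALTER TABLE "ODB"."BLOB_TABLE" SHRINK SPACE
      ENDEXEC.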

  • Data not received in PSA Table - Data of request already deleted

    Hi
    I'm running a delta load on a BI 7.0 system and get the error "Data not received in PSA table".
    When I click on the PSA table, it tells me that the data of request XXXXXX has already been deleted.
    I have tried a consistency check in RSRV and checked whether there are any IDocs in BD87.
    The short dump analysis just says:
    A RAISE statement in the program "SAPLRSSM" raised the exception
    condition "NOT_EXIST".
    Since the exception was not intercepted by a superior
    program, processing was terminated.
    I have also tried to repeat the load, but with no luck.
    Does anyone know how to fix this?
    Thank you in advance

    Hi,
    Delete the bad requests in the 3 targets and delete the data mart status in the source DSO.
    Also delete the green request in the source DSO and repeat it.
    Once it loads successfully, load the data to the other 3 targets.
    Rgds,
    Ram

  • How to delete the duplicate data  from PSA Table

    Dear All,
    How can I delete duplicate data from a PSA table? I have a purchasing cube and I am getting the data from the item DataSource.
    In the PSA table I found some cancellation records: for those records the quantity is negative while, for the same record, the value is positive.
    Because of this, the quantity is updated correctly to the target, but the values are added up, so I get the summarized value of both the normal and the cancellation records.
    Please let me know how to handle this data while updating to the target.
    Thanks
    Regards,
    Sai

    Hi,
    Deleting individual records in the PSA table is difficult, and how many would you delete?
    You can achieve this in different ways:
    1. Create a DSO with suitable key fields; records will then be overwritten based on those key fields.
    2. Write ABAP logic to delete the duplicate records at InfoPackage level (check with your ABAPer; a small sketch follows below).
    3. Restrict the cancellation records at query level.
    Thanks,
    Phani.
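
    For option 2, this is a minimal, hedged sketch of the kind of routine meant (a BI 7 start routine working on SOURCE_PACKAGE; in a 3.x transfer rule the table is called DATA_PACKAGE). The key fields are placeholders and must match your item DataSource. Note that this only removes true duplicates; for cancellation records with opposite signs, option 1 (a DSO with overwrite) or option 3 is usually the cleaner fix.

      "Hedged sketch: drop exact duplicates from the incoming data package
      SORT SOURCE_PACKAGE BY doc_number doc_item record_mode.
      DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
        COMPARING doc_number doc_item record_mode.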

    Hi   While creating Down Payment Request in A/R Tax % get affected which i don't want. I don't want to use Payment on Account. Secondly i want to know if i changed the tax code to exempt  at the time of DPR , am i right or i am wrong somewhere. Thank