Delete Master Data using Process Chain?

Hi Friends,
How do I delete Master Data using Process Chain?
Thanks,
Pradeep

Hi Bhanu,
I am not sure why we would want to delete MD either, but that is the task I have been given. I know it is supposedly not possible, but is there any way to delete MD using chains?
Thanks,
Pradeep

Similar Messages

  • How to delete Master data using Process Chain

    I need to delete data in the respective data targets, such as InfoCubes, DSOs and master data InfoObjects. It worked fine for InfoCubes and DSOs using the respective process types, but the issue is with the master data InfoObjects.
    Is there any process type with which I can delete data in master data InfoObjects using a process chain?

    Murali,
    There is no such process type for master data objects because, in BW, master data is conformed, i.e. shared by other InfoProviders. Due to this dependency, master data handling is a bit more complex: you first need to drop the data from all the InfoProviders that use the master data you want to delete.
    -Saket

  • How to delete master data via process chain

    Hi All,
    I have to delete master data before loading it again via a process chain.
    In the process chain I chose the process "Complete Deletion of Data Target Contents", but there I could not find the InfoObject as a data target option for deleting the data.
    Please suggest how to solve this.
    Thanks,
    Harini

    Hi Harini,
    P_CHABNM is for the reference characteristic.
    I think if you don't have a reference characteristic, then give the InfoObject name itself.
    This is what I fetched from the help:
    Reference characteristic
    The reference characteristic has the technical properties of a characteristic, such as data type and length, as well as the master data (attributes, texts and hierarchies). The characteristic itself also has business semantics. Several characteristics can refer to the same reference characteristic; such characteristics automatically have the same technical properties and master data.
    Example:
    The characteristics sender cost center and receiver cost center use the reference characteristic cost center, and with it the same values and the same texts.
    Bye
    Dinesh

  • Deleting Master Data via Process Chain

    I am in the midst of creating a process chain and am able to create a process for deleting ODS contents, but I am not able to do the same for master data. Is this possible?

    Firstly, why do you want to delete master data?
    You cannot delete master data if it is still used in other transaction data or master (attribute) data.
    You may use:
    Program -> RSDMD_DEL_BACKGROUND
    FM -> RSDMD_DEL_MASTER_DATA
    To my knowledge you cannot delete MD directly with a standard process type in process chains.
    You can delete data in the PSA periodically; is that what you are looking for?
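    If you really do need to trigger the deletion from a chain, a common workaround is to wrap the standard program in a small custom report and run that report from an "ABAP Program" process type. The sketch below is only an outline: the report name ZBW_DEL_MASTER_DATA and the variant ZDEL_0MATL are placeholders you would create yourself, and you should test the deletion manually (RSDMD_DEL_BACKGROUND in SE38) before putting it into a chain.
    REPORT zbw_del_master_data.
    * Minimal wrapper so that the standard deletion program can be
    * scheduled from an "ABAP Program" process type in a process chain.
    * 'ZDEL_0MATL' is a placeholder variant of RSDMD_DEL_BACKGROUND that
    * you create yourself (InfoObject name and deletion options).
    SUBMIT rsdmd_del_background
           USING SELECTION-SET 'ZDEL_0MATL'
           AND RETURN.
    The function module RSDMD_DEL_MASTER_DATA could be called from such a report instead, but its interface varies between releases, so check it in SE37 first. Either way the deletion will still refuse values that are in use, exactly as described above.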

  • Initial full load of Master data using process chain

    Hi All,
    Could you please help me with an initial master data load to characteristics with attributes and texts? I need to load master data to 23 InfoObjects; by using a process chain, can I do a full load of master data to all the InfoObjects at the same time? One more doubt: as far as I know, we can't maintain more than one variant in an InfoPackage, is that right, or can we?
    I mean: Start Variant -> InfoPackage (0Customer_Text, 0Customer_Attr, 0BILL_TYPE_TEXT, BILL_CAT_TEXT) -> DTP ( ", ", ", ") -> ACR.
    Your Help will be appreciated.
    Thanks & Regards
    Sunil

    Hi,
    "I need to load master data to 23 info objects, by using process chain can I do full load of master data to all info objects at a time."
    If there is no dependency between the attributes, you can create the process chains and trigger them at the same time. No issues.
    "we can't maintain more than one variant in an info package, is that right ? or we can ?"
    With one InfoPackage you can't load data to all 23 PSAs, because each DataSource has its own PSA. You need to use 23 InfoPackages.
    In general: start variant --> InfoPackage --> DTP (assuming your BW is 7.x) --> attribute change run.
    You could create 23 chains like that, or create two big chains,
    one for the attributes and another for the texts.
    In the attribute chain:
    start variant --> InfoPackage (InfoObject 1) --> DTP (InfoObject 1) --> InfoPackage (InfoObject 2) --> DTP (InfoObject 2), and so on.
    In this way you can combine series and parallel steps to load the attribute data into the InfoObjects; at the end, add a change run for 6 InfoObjects each. You can do the same for the text loads.
    Thanks

  • Deleting PSA data  using process chain doesn't delete records from psatable

    Dear All,
    We have an issue whereby the process chain runs and appears to delete the PSA records; from the monitor, that is what we thought it had done, but it did not actually delete the records from the PSA tables!
    Does anyone have any clues about this ?
    Thanks
    Craig

    Chetan,
    Yes, you read correctly: the actual table content for the PSA records is not physically removed.
    By "monitor" I meant the request entries for the PSA.
    It's a strange one, for sure.
    Cheers
    Craig

  • Deletion of changelog data using process chain

    Hi All,
    I have a requirement where I have to delete the change log data for a particular set of data using a process chain. We have an option to delete the entire change log contents, but how can I delete only a particular set of data?

    If you are on BI 7, then:
    Choose "Deletion of Requests from the Change Log" from the process category Further BI Processes by double-clicking it.
    Create a new process variant and, in the variant maintenance screen, select Type of Object as Change Log Table.
    Under Name of Object, select the DSO name.
    Specify the requests that need to be deleted by determining the days or dates. You also have the option to specify whether you only want to delete successfully updated requests, and/or only incorrect requests that are no longer updated in an InfoProvider.

  • Loading master data through process chain

    Hi Gurus
    I am designing a process chain. I know that I have to start by loading the master data first; for that I am planning to load attributes first, then texts, and then hierarchies. Is that the right sequence?
    Also, how can I make sure that once the master data is loaded, the transaction data loading starts automatically?
    I am planning to put the attribute change run as the last process in every master data load. Is that correct?
    I would also appreciate it if you could tell me the very first step I should follow to start my design. I have already identified the master data and transaction data and their respective dependencies. What would be my next step?
    Thanks for your help in advance.
    Kris

    Hi Kris,
    Make two separate process chains, one for master data and one for transaction data.
    For master data:
    Your sequence is absolutely correct. Identify all the master data objects, make a list of the InfoPackages, and start creating the process chain as
    Start --> Attribute --> Text --> Hierarchy --> Attribute Change Run
    For transaction data:
    Identify the data targets and the sequence of loading, then create the process chain.
    NOTE: Master data should always be loaded before transaction data.
    After creating the different chains for master data and transaction data, make one super chain, called a meta chain, where you can put all these process chains:
    Meta Chain --> Process Chain 1 (Master Data Chain) --> Process Chain 2 (Transaction Data)
    also check these threads for process chain creation
    Re: Process Chain : Master Data
    Re: process chains ?
    Hope this helps,
    Sudhakar.

  • Delete some hierarchies using process chain?

    Hello,
    I wonder if there is any way to delete specific hierarchies from an InfoObject using a process chain.
    For example:
    - In the product InfoObject, there are 5 existing hierarchies which are HIER1, HIER2, HIER3, HIER4, HIER5.
    - In the process chain, there is a schedule to load product hierarchy every week.
    - The requirement is to delete HIER5 before loading it into the product InfoObject again, and we would like to add this step to the process chain as well.
    The only solution we can think of is writing an ABAP program. I'm not sure if there is any alternative way to meet this requirement without using ABAP?
    Any suggestion would be sincerely appreciate.
    -WJ-

    Closed without solution!

  • Loading Data Using Process Chains

    Hi All,
    We have 3 layers for loading data using DB Connect, i.e. the 1st and 2nd layers are DSOs and the 3rd layer is a cube. We use the following update modes for the 3 layers:
    1st layer - full
    2nd layer - delta
    3rd layer - delta
    Is there any possibility of getting duplicates in the cube, and if so, how can we solve this using process chains?
    regards,

    Hi,
    Your data flow looks OK.
    Hopefully you are using a standard DSO ("Std DSO"); in that case there is no chance of getting duplicate records.
    But just in case: if you are using a write-optimized DSO ("WO DSO") with the checkbox "Do not check Uniqueness of records" set, you will get duplicates, because a WO DSO with this setting simply keeps adding records.
    Thanks
    Mayank

  • Issue while loading Master Data through Process Chain in Production

    Hi All,
    We are getting the following error in a process chain while loading master data:
    Non-updated Idocs found in Source System
    Diagnosis
    IDocs were found in the ALE inbox for Source System that are not updated.
    Processing is overdue.
    Error correction:
    Attempt to process the IDocs manually. You can process the IDocs manually using the Wizard or by selecting the IDocs with incorrect status and processing them manually.
    I have also checked the PSA but could not find any records, and the strange thing is that the job itself is not getting scheduled. Can anyone help me resolve this issue?
    Regards
    Bhanumathi

    Hi
    This problem is not related to the process chain.
    You can try this:
    In RSMO, select the particular load you want to monitor.
    In the menu bar, Environment >>> Transact. RFC >>> select whichever is required, BW or Source System.
    In the next screen you can select the Execute button and the IDocs will be displayed.
    Check Note 561880 - Requests hang because IDocs are not processed.
    OR
    Transact. RFC - status running yellow for a long time (Transact. RFC is enabled in the Status tab in RSMO).
    Step 1: Go to Details, Status, get the IDoc number, and go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC queue under outbound processing and click on "Display IDoc" in the menu bar.
    Step 2: In the next screen click on "Display tRFC calls" (this takes you to SM58 for that particular tRFC call), place the cursor on the particular transaction ID, go to EDIT in the menu bar and press "Execute LUW".
    (Display tRFC calls (takes you to SM58 for that tRFC call) ---> select the TransID ---> EDIT ---> Execute LUW)
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as that takes you to the particular tRFC request for that IDoc.
    OR
    Go into the job overview of the load; there you should be able to find the data package ID.
    (For this, in the RSMO screen > Environment > there is an option for Job Overview.)
    This data package TID is the transaction ID in SM58.
    OR
    SM58 > enter * / the user name or the background (ALEREMOTE) user name and execute. It will show you all the pending tRFCs with their transaction IDs.
    In the Status Text column you can see two statuses:
    Transaction Recorded and Transaction Executing.
    Do not disturb it if the status is the second one (Transaction Executing). If the status is the first one (Transaction Recorded), manually choose "Execute LUWs".
    OR
    Go directly to SM58 > enter * / the user name or the background (ALEREMOTE) user name and execute. It will show the tRFCs to be executed for that user. Find the particular tRFC (SM37 > request name > TID from the data packet with sysfail), select the TransID in SM58 ---> EDIT ---> Execute LUW.
    (from JituK)
    Hope it helps
    Darshan

  • Usage of IO (read master data) in Process Chain

    Hi experts,
    I have a slightly complicated task.
    I have an InfoObject (as input) and I need to find the list of process chains that read data from this IO, either directly via update rules/transformations or via function modules.
    That means checking all the steps in each process chain - all routines, selection conditions, update rules, transformations, functions (a function can call another function), ... - and finding out whether information is read from the IO (master data). I think it is complicated.
    Another possible solution: does any process chain log exist that contains information about "select" statements on the IO (master data)?
    Thanks in advance
    Martin

    There is something that can make your work easier:
    1. Go to table RSAABAP and search field LINE for the value 'yourInfoObject'.
    You will get a list of all the places where this IO is read.
    2. Go to table RSTRANSTEPROUT and paste in all the CODEID values from the previous table; you will get a list of all the transformations.
    Having the transformations, you can find all the DTPs and then the process chains. A small report can automate these two look-ups; see the sketch below.
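    The following is only a rough sketch of such a report. The field names LINE and CODEID come from the steps above; TRANID (assumed here to be the transformation ID in RSTRANSTEPROUT), the report name and the parameter are assumptions, so verify the table definitions in SE11 before using it.
    REPORT zbw_find_io_usage.
    * Which InfoObject name should be searched for in routine source code?
    PARAMETERS p_iobj(30) TYPE c DEFAULT '0MATERIAL'.
    TYPES: BEGIN OF ty_code,
             codeid TYPE rsaabap-codeid,
           END OF ty_code,
           BEGIN OF ty_tran,
             tranid TYPE rstransteprout-tranid,
           END OF ty_tran.
    DATA: lt_code        TYPE STANDARD TABLE OF ty_code,
          lt_tran        TYPE STANDARD TABLE OF ty_tran,
          ls_tran        TYPE ty_tran,
          lv_pattern(34) TYPE c.
    CONCATENATE '%' p_iobj '%' INTO lv_pattern.
    * Step 1: routine code IDs whose source lines mention the InfoObject.
    SELECT codeid FROM rsaabap
      INTO TABLE lt_code
      WHERE line LIKE lv_pattern.
    IF lt_code IS NOT INITIAL.
    * Step 2: transformations that use those routines (TRANID assumed).
      SELECT tranid FROM rstransteprout
        INTO TABLE lt_tran
        FOR ALL ENTRIES IN lt_code
        WHERE codeid = lt_code-codeid.
      SORT lt_tran.
      DELETE ADJACENT DUPLICATES FROM lt_tran.
      LOOP AT lt_tran INTO ls_tran.
        WRITE: / ls_tran-tranid.
      ENDLOOP.
    ELSE.
      WRITE: / 'No routine references found.'.
    ENDIF.
    From the resulting transformation IDs you can then look up the DTPs and the process chains, for example via the transformation display in RSA1.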
    Regards,
    DL

  • Loading master data via process chain

    hi experts,
    I have to load data into an InfoObject via a process chain; my InfoPackage has the update mode Full Update. My question is: do I have to delete the previously loaded data from the InfoObject and then run this full update InfoPackage in the process chain, or will the system fetch only the newly added records from the source?
    regards,
    ray

    You cannot delete master data once transactional data has been posted against the master data value. Attribute values can be deleted, but you cannot delete the value of the main characteristic from the master data table.
    Moreover, you don't need to delete the data when loading master data: master data is always overwritten with the new records.
    As described, first run the full load, and then on a daily basis you can load the delta.
    DO NOT FORGET TO SCHEDULE THE ATTRIBUTE CHANGE RUN to activate the newly loaded master data.
    Make sure that your master data is not enhanced; otherwise, a change in an enhanced field will not generate a delta record. If that is the case, then load in full mode every day.
    - Danny

  • Delete older data via process chain

    The business needs only the data for the past 45 days at any given time, and the data loads run on a daily basis via a process chain.
    So I would like to have a step in the process chain that deletes any data older than 30 days from today's date, on a daily basis.
    How can I do that in the process chain?

    Hi,
    You can use the process type called Delete Overlapping Requests from InfoCube in the process chain. To delete requests older than 30 days, open the deletion selections and write an ABAP routine under "Request Selection Through Routine" using the following template (a filled-in example follows the template below).
    Routines for determining the old requests to be deleted after successfully loading a new request.
    form compute_<InfoCube-Name>
      tables l_t_request_to_delete structure rsreqdelstruc
      using l_request like rsreqdone-rnr
      changing p_subrc like sy-subrc.
    *Insert Source Code to decide if requests should be deleted.
    *All Requests in table l_t_request_to_delete will be deleted
    *from Infocube <InfoCube-Name>.
    *Add new requests if you want to delete more (from this cube).
    *Remove requests you did not want to be deleted.
    * $$ begin of routine - insert your code only below this line -
         loop at l_t_request_to_delete.
         endloop.
         clear p_subrc.
    * $$ end of routine - insert your code only before this line -
    ENDFORM.
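    For example, to keep only the requests of the last 30 days, the part between the begin/end markers could be filled roughly like this. This is only a sketch: it assumes the deletion candidates carry their request number in field RNR and that the load date of a request can be read from RSREQDONE-DATUM; verify both field names in SE11 before using it.
      data: l_datum  like sy-datum,
            l_cutoff like sy-datum.
    * keep requests loaded within the last 30 days
      l_cutoff = sy-datum - 30.
      loop at l_t_request_to_delete.
    *   assumed lookup: load date of the candidate request (check RSREQDONE)
        select single datum from rsreqdone into l_datum
               where rnr = l_t_request_to_delete-rnr.
        if sy-subrc = 0 and l_datum >= l_cutoff.
    *     newer than the cut-off, so remove it from the deletion list
          delete l_t_request_to_delete.
        endif.
      endloop.
      clear p_subrc.
    Everything that remains in l_t_request_to_delete after the loop is deleted from the cube, so requests you remove from the table are kept.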
    hope it helps...
    regards,
    Raju

  • Master data in process chain

    I have to create a process chain for updating master data for the following objects:
    0Vendor - full load
    0GL_account - Delta
    0pur_group - Delta
    Any guidelines on how I should proceed?

    Usually we create separate process chains for delta and full loads. Since you have only three objects, you could create just one process chain for all three. First create the start process, then load these three objects either in parallel or one after the other, depending on the volume of records.
    Hope this helps
    Thanks
    Sat
