Process chains for loading data to targets are not functioning

Hi SAPians,
Recently, we upgraded the firmware on our IBM P5-590 with IBM's help on Saturday, 06/12/2008 (the firmware was upgraded from SF235_209 to SF240_XXX), and since then the process chains for loading data to targets have not been functioning properly. We stopped all the SAP services, database services, and OS services from our end, and the services were restarted after the firmware upgrade.
The problem with the process chains that load transaction data and hierarchies is now solved, but the chains that load master data are still not running as scheduled.
We tried to load the master data manually by running the DTP that loads data to the InfoObject Analysis Code (attributes); the request remained YELLOW. After refreshing it several times, the request turned RED. The error message was PROCESSING TERMINATED (screenshot 1 attached).
To mitigate this, we tried deleting the bad request, which prompted the following message:
"Cannot lock request DTPR_4C37TCZGDUDX1PXL38LVD9CLJ of table ZANLYSIS for deletion (Message no. RSMPC178)" (screenshot 2 attached)
Please advise how we should delete the bad request.
Regards,
Soujanya

Hi Soujanya,
Follow the procedure below to turn a yellow request RED:
In SE37, execute the function module RSBM_GUI_CHANGE_USTATE.
On the next screen, enter the request ID for I_REQUID and execute.
On the screen after that, select the 'Status Erroneous' radio button and continue.
This function module changes the status of the request from Green/Yellow to RED.
Once it is RED, you can delete the request manually.
Releasing the lock:
Go to transaction code SM12 and release the lock.
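If SE37 is inconvenient, the same function module can be wrapped in a small test report. This is only a hedged sketch: the parameter typing (CHAR30) is an assumption, the request ID is the one from the error message above and must be replaced with yours, and the function module still raises its own status dialog where you pick 'Status Erroneous'.

```abap
REPORT zfix_request_status.

* Hedged sketch: call the function module mentioned above from a report
* instead of SE37. Replace the request ID with your own; the FM then
* shows its status dialog, where 'Status Erroneous' forces the request
* to RED so it can be deleted manually afterwards.
DATA lv_requid TYPE char30 VALUE 'DTPR_4C37TCZGDUDX1PXL38LVD9CLJ'.

CALL FUNCTION 'RSBM_GUI_CHANGE_USTATE'
  EXPORTING
    i_requid = lv_requid.
```

Afterwards, release any remaining lock in SM12 as described above.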
Hope it helps.
Regards,
Nagaraju.V

Similar Messages

  • How to create process chains for Master Data?

    I need information on " how we can create process Chains for Master Data"

    Hi Sachin,
      http://help.sap.com/saphelp_bw33/helpdata/en/ad/6b023b6069d22ee10000000a11402f/frameset.htm
    and also Modelling aspects in process chains (ppt)
    https://websmp109.sap-ag.de/~sapidb/011000358700002337702003
    Hope this helps.
    Srini

  • Process chain for loading master data attributes

    Dear Experts,
Can anyone explain the steps needed to load master data into an InfoObject through a process chain?
Thanks in advance,
    Santhosh
    Please search the forum
    Edited by: Pravender on Jun 8, 2010 3:06 PM

In BI 7 this would be:
    start
    infopackage
    dtp
    attribute change run
    M.

  • Process Chain which loads data to a cube

    Hi All
Please help me create a process chain in BI 7.0. I have to load data into a cube, and before that I need to delete the existing request in the cube. I think I should use "Delete overlapping requests from InfoCube". In the maintain-variant screen for this process type, which object types do I need to add? Do I need to add both the Execute InfoPackage and the Data Transfer Process objects?
    Regards
    Naga

    Hi Ravi
I am loading the data from the PSA to the cube using a DTP. Actually, my DataSource is an export DataSource (8ZHYP, prefix 8). So according to your answer, I should use the DTP object type in the "delete overlapping requests" process type. But when I create a variant for that process type, I get Indexes > Create Index > Data Transfer Process > Request to delete data in the InfoCube.
I just want to delete the data in the InfoCube before I load the data. So can I delete all the remaining processes?
    Regards
    Naga

  • Creation of Decision in Process Chain to load data to a SPO

    Hi,
    I have a question regarding adding a Decision type to my Process chain.
I want to load data to an SPO that contains 12 InfoCubes (Jan-Dec) from a DSO, but I do not want to start all 12 DTPs.
What I am looking for is a way to start only the DTPs that load to the InfoCubes for the current month and the previous month; all other data should be loaded to an InfoCube (no. 13) that is not part of the SPO. That way I would only have 3 DTPs running at the same time instead of 13.
E.g. a delta load to the DSO containing 3 records (one for 2014.09, one for 2014.08, and one for 2013.05) would trigger the DTP to the September InfoCube, the DTP to the August InfoCube, and the DTP to the 13th InfoCube.
    Is this possible by using a Decision and if so how?
    Kind Regards
    Steffen

    I haven't worked with SPOs yet, but I do work frequently with decisions, so here's my 5 cents...
In your case you wish to base your decision on the data contained in "a delta load to the DSO". I'm assuming you're talking about the InfoPackage here, because you wish to execute the DTPs afterwards.
So basically you need to determine which "months" are in your incoming data packages. The only way I see that being possible is via a start/end routine. In your example, you would look at the data package and find 3 records, each with a different month: 08.2014, 09.2014, and 05.2013. You could then "convert" these to numbers ranging from 1 to 13. That makes it relatively "easy" to link the right DTP to the right number.
Now, how to pass that info (those numbers) back to the decision process? By default you base your formula on a "system" field (see the basic how-to linked below). I don't see how this would work in your case. You'll need to somehow export your numbers (most likely an internal table or a range) to memory (or fill them into a [z-]table) and then read them back in with custom code. I haven't done that yet (at least not in the context of decision steps), but I would recommend having a look at the "experienced" way-of-working document below. It won't be easy, but I think it could work.
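To make the export half of that idea concrete, here is a hedged end-routine sketch. The field name CALMONTH, the CHAR6 typing, the INDX cluster area `zm`, and the ID 'ZDEC_MONTHS' are all assumptions for illustration, not a tested implementation:

```abap
* Hedged sketch of an end routine: collect the distinct months seen in
* the data package and persist them in the INDX-type cluster table so a
* later custom step (e.g. code behind the decision) can read them back.
DATA lt_months TYPE SORTED TABLE OF char6 WITH UNIQUE KEY table_line.
FIELD-SYMBOLS <ls_result> LIKE LINE OF RESULT_PACKAGE.

LOOP AT RESULT_PACKAGE ASSIGNING <ls_result>.
  INSERT <ls_result>-calmonth INTO TABLE lt_months.
ENDLOOP.

* Store the month list under a fixed ID; the decision-side custom code
* would read it back with:
* IMPORT months = lt_months FROM DATABASE indx(zm) ID 'ZDEC_MONTHS'.
EXPORT months = lt_months TO DATABASE indx(zm) ID 'ZDEC_MONTHS'.
```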
    Cheers,
    Raf
    The official documentation can be found here.
    For a "basic" how to on decision step, click here.
    For a more "experienced" way of working, click here.

  • How to increase the number of processes taken for loading data?

    Hi,
    While loading data from R/3 to BW, we found that the particular load is taking 2 processes to load data into Cube.
Is there any way to increase the number to 4?
    Thanks in advance.
    Bobby.

    Bobby,
To my knowledge we can't change that directly. Let me explain: there is a setting in the source system for the DataSource's default data transfer, and that is where the processes are assigned. If you want to assign 4, you need to change the setting in the source system. For flat files we can change it in the BW system. We can maintain the setting at the InfoPackage level (within what is assigned in the source system), but we can't increase the number of processes there.
To check the setting in the source system, go to SBIW > General Settings > Control Parameters for Data Transfer.
We would need to change it here, but this affects all the DataSources, so check with your Basis team before making changes.
    All the best.
    Regards,
    Nagesh Ganisetti.

  • How to create process chain for this data flow

    Hi experts,
    My data flow is:
    ======================================
    2lis_11_vahdr->Transfer rules->2lis_11_vahdr->Update rules->DSO1->Transformation->DTP->Cube1.
    2lis_11_vaitm->Transfer rules->2lis_11_vaitm->Update rules->DSO1->Transformation->DTP->Cube1.
    ======================================
    2lis_12_vchdr->Transfer rules->2lis_12_vchdr->Update rules->DSO2->Transformation->DTP->Cube1.
    2lis_12_vcitm->Transfer rules->2lis_12_vcitm->Update rules->DSO2->Transformation->DTP->Cube1.
    ======================================
    2lis_13_vdhdr->Transfer rules->2lis_13_vdhdr->Update rules->DSO3->Transformation->DTP->Cube1.
    2lis_13_vditm->Transfer rules->2lis_13_vditm->Update rules->DSO3->Transformation->DTP->Cube1.
    ======================================
Here, for each DataSource, the InfoPackage brings the data up to the DSO, and then the DTP fetches the data from the DSO.
For the deltas, I want to put this data flow into a process chain.
Can anyone please guide me on how to set this up in a process chain?
    Full points will be assigned.
    Regards,
    Bhadri m.
    Edited by: Bhadri M on Sep 2, 2008 7:20 PM

    Hello,
    Sure it is possible to maintain that dataflow.
    Start here:
    This is a very good document about process chains:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
    Let me know if you have any doubts.
    Regards,
    Jorge Diogo

  • Process chain while loading data into DataStoreObject

Hi,
I'm trying to load data into a DataStoreObject using a process chain. I built the process chain like this:
Start Process -> InfoPackage -> Data Transfer Process -> Activate DataStoreObject Data.
When I try to schedule the process chain, the job gets cancelled.
I then looked in ST22, where I found the error MESSAGE_TYPE_X.
How can I solve this error? Please let me know.
I'll assign the points.
    kumar

    Hi,
Try to activate the ODS again; you can also activate the transfer structure manually by executing the following report from SE38:
RS_TRANSTRU_ACTIVATE_ALL
Enter just the InfoSource of the ODS and execute it. After this, replicate the DataSource and try to execute the chain again.
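For completeness, the activation report can also be triggered from another program. A minimal hedged sketch; the report's selection parameters are not filled in here and would still need to be supplied on its selection screen:

```abap
* Hedged sketch: launch the activation report mentioned above.
* VIA SELECTION-SCREEN lets you enter the InfoSource before executing;
* AND RETURN brings you back to the calling program afterwards.
SUBMIT rs_transtru_activate_all VIA SELECTION-SCREEN AND RETURN.
```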
    Rgs
    Antonino

  • Why process chains for BI statistics Technical Content are not showing in 0

I am looking into 0TCT_MC21, which shows process statistics from 0TCT_C21 and 0TCT_VC21.
I know process chains such as 0TCT_C2_DELTA_PO1 and 0TCT_C2_INIT_PO1 have been run, yet
they do not show in 0TCT_MC21, which includes real-time data (via the virtual cube) as well as
the collected data in 0TCT_C21.
Do I have to specify somewhere which process chains statistics should be collected for, and if yes, how?
    Thanks.

    Hi,
    Have you checked this existing thread:
    /message/3214515#3214515 [original link is broken]
    Regards,
    Lee

  • Process chain for a full load failed

    Hi,
    I have a process chain where load data from an InfoCube to a DSO with the following steps:
    Begin
    Load data: Execute an Infopackage with full load.
    Update from PSA
    Activate Data: Activate data of DSO.
When I execute the process chain, the Execute InfoPackage step succeeds, but the Update from PSA step fails. Looking at the errors, I can see that when the InfoPackage finishes, the DSO data needs to be activated; however, the status of the request is yellow, and Update from PSA requires the QM status to be green. What can I do? The data is transferred correctly to the DSO but is not activated.
    Thank you very much.
PS: Some steps and options may not be named exactly as I wrote them because I'm working in Spanish, but the meaning is the same.

Hi,
If Update from PSA has failed, go to the Details tab in the InfoPackage monitor, expand the failed data packet, and check the error message; there may be several reasons. Try to resolve that issue, then delete the request from the target without setting the QM status to red, and reconstruct the request.
Actually, I am not following you completely. I think your process chain is fine. Are you saying that your ODS activation also failed? But how would the ODS activation start when the load itself failed? Is the link on success or on failure?
    Regards,
    Debjani....
    Edited by: Debjani  Mukherjee on Oct 16, 2008 2:02 PM

  • Creating Process Chains for Infocube

    Hi All,
I am trying to create a process chain for Vendor Evaluation (0PUR_C05). I created a start process and the Execute InfoPackage step, which is a delta load, so it prompted for the "Delete Index" process, which I created. Now, what do I do next?
The data flow is like this:
InfoPackage (delta) -> PSA -> InfoSource -> InfoCube
Is deleting the index and rebuilding it mandatory?
Please throw some light on this and help me out.
    Points will be rewarded..
    Regards,
    Manjari.

    Hi Manjari,
How to do the INIT:
Go to the process chain for transaction data, and go to the InfoPackage you want to use for the INIT.
Right-click on that variant and click "Maintain Variant".
If you have not yet loaded the cube using this InfoPackage, you will find the options Full Update and Initialize Data Transfer in the Update tab. If you have not loaded the cube with any update, choose Initialize Data Transfer, select the data target you want in the Data Targets tab, check "Start Data Load Immediately" in the Schedule tab, and execute.
Now go to the monitor screen and refresh it from time to time.
Keep monitoring that screen until the requests turn green.
Once the requests are green, go back to the cube in RSA1 and check in the Manage screen whether the data is populated correctly.
If the data is populated correctly, go to the InfoPackage and, in the Update tab, change the update mode to Delta. From then on, you can run the delta upload using the process chain.
You will find out whether the process chain works properly by executing it once. In your case, run the process chain only after you are done with the INIT.
If you have already loaded the cube with a full load, follow the process below for the INIT load:
Go to the process chain for transaction data, go to the InfoPackage, right-click it, and click "Maintain Variant".
To switch it to Initialize Data Transfer, go to the menu: Scheduler > Initialization Options for Source System, and click on it.
You will get a popup with some requests in it. Select the requests, set them to RED, delete all the requests, and confirm.
If you now check the Update tab, the update mode will be shown as Initialize Data Transfer.
Then move to the Data Targets tab and check the data target to which you want to load the data.
NOTE: If there is already data in the cube and you want to do an INIT, delete the data first (delete the requests in the data target) and then start the INIT.
Now move to the Schedule tab in the InfoPackage, check that the radio button is on "Start Data Load Immediately", and click the Start button.
Assign points if this helps you.
    Regards,
    KK.

  • Problem in creating process chains for CO

    Hi Friends,
I have a CO cube, 0OPA_C11, getting data from three DataSources:
0CO_OM_OPA_6 (supports delta), 0CO_OM_OPA_1 (full only), and 0CO_OM_OPA_2 (full only). I have already created a process chain for this cube using these three DataSources. The chain is like this:
1. Delete indexes
2. 0CO_OM_OPA_6 (supports delta)
3. 0CO_OM_OPA_1 (full only)
4. 0CO_OM_OPA_1 (delete overlapping requests)
5. 0CO_OM_OPA_2 (full only)
6. 0CO_OM_OPA_2 (delete overlapping requests)
7. Create indexes
Is this right? The chain runs, but after it completes, the Manage screen of the cube shows only the request for 0CO_OM_OPA_2 (full only). The requests that were there earlier, such as the init request and the new delta request for 0CO_OM_OPA_6 and the full request for 0CO_OM_OPA_1, are getting deleted; only the request for 0CO_OM_OPA_2 remains.
Is the way I created the process chain right or wrong? Can anybody give me a solution?
    Thanx in Advance
    BalajiReddy

    Hi Nick,
Thanks for your reply. Everything you asked me to check is OK. I think there must be a specific scenario for creating process chains for data targets that get data from multiple DataSources, some supporting delta and some full loads only.
I want that scenario; can you tell me, or can anybody else?
The right points will be assigned to the right answers.
    Thanx in advance
    balajireddy

  • Delta and Full process chain for same master data target.

    Hi  Friends,
Can I do a full update and a delta update through a process chain for the same master data target and from the same source?
    Regards
    shekar reddy

    Hi Sriram,
Why do you want to load both full and delta for the same master data object?
You can achieve this, but make sure you select the Repair Full Request option for the full-load InfoPackage.
You will find this option in the Scheduler menu in the menu bar.
    Regards,
    Venkatesh

  • Error in process chain for PCA full load

    Hello everyone,
I'm trying to use a process chain to delete a previous full load of plan data in a cube prior to the new load (to avoid duplicate records). The successor job in the process chain loads a delta of actual data into the same cube (same InfoSource).
When executing the process chain (and the included full-load InfoPackage), the setting "Automatic loading of similar/identical requests from InfoCube" in the InfoPackage is not working (I have ticked "full or init, data-/InfoSource are the same").
I have checked that the function itself works, as I have executed the InfoPackage manually with success. So the problem is somehow in the chain.
In the chain I just execute the InfoPackage as usual, so to my understanding it should work the same way as if I executed it manually. Or am I wrong? Is some additional setting required in the chain to make it work?
    Any ideas?
    Thanks,
    Fredrik

    Hi Fredrik,
Not all settings in InfoPackages work in chains the same way they do when running the package manually. Usually you can check this by pressing F1 on the setting. In your case, you need to add a process type for deleting the data to the chain. In your chain maintenance, look at the process types, then under the load processes; there you will find the type you need.
    kind regards
    Siggi

  • How to create Process chain for Aggregate for Master data

    Hello friends,
I created aggregates on navigational attributes.
They are working fine, but I need to know how to create the roll-up for aggregates on navigational attributes.
The point is that the master data changes frequently, e.g. for 0CUSTOMER.
So could anyone send me step-by-step documents on how to roll up aggregates on navigational attributes, or aggregates created on master data?
How do I create process chains for the same?
Because if the master data changes, rolling up the aggregates straight away will not help.
So we would need a process chain that deactivates the aggregate, reactivates it, and fills it up again.
If I misinterpreted something, please correct me.
Please advise.

    Hello,
The change run that you have to schedule in order to activate the master data will adjust the aggregates automatically. There is no need to deactivate them after master data loads.
    Best regards,
    Ralf
