Delta load problem in process chain

Hi,
I have an ODS feeding data to another ODS. The init of this flow was done manually. Now I have created a process chain in which the further-processing step has an "execute InfoPackage" variant for the delta InfoPackage. When I check this process chain, it gives me the error "Delete init. request REQU_49LKNGG0SE5966D3S7SQYYDQX before running init. again with same selection". This is not the init request which I ran manually; it is the request which was created when I ran this process chain. What should I do to add only the delta InfoPackage to the process chain, so that I don't have to delete the init?
Thanks
Prashant

hi,
first run all the init loads from the ODS to the cube. Then create the delta InfoPackage and include it in the chain.
Create the init InfoPackage under the InfoSource 8<ods name>, trigger that InfoPackage, then create the delta InfoPackage under the same 8<ods name> InfoSource and include that one in the process chain.
Please make these changes in your process chain:
Local chain 1 - start -> load to ODS -> activation of ODS -> further processing (remove the InfoPackage in this process variant).
Local chain 2 - start -> delete index -> load delta to cube -> create index -> roll up (if needed)
Main chain - start -> local chain 1 -> local chain 2
Ramesh

Similar Messages

  • Initialization of delta loads possible with process chain?

    hello all,
    followed situation:
    I load data from a file and an SAP extractor into ODS 1. From this ODS I have an automatic delta load to another ODS 2, where the data is filtered and aggregated.
    It is possible that some of the data changes its state from filtered out to included, but is not in ODS 2 because of the criteria it met earlier. So now I want a weekly initial load from ODS 1 to ODS 2, plus a daily delta load from ODS 1 to ODS 2 whenever new data arrives in ODS 1.
    I tried to create a process chain for that, but I can't. I cannot use the InfoPackage for the initialization of the load from ODS 1 to ODS 2 because it is a generated InfoPackage, and if I use the process "Update ODS Object Data", it always tries to make a delta load. As a first step I delete all data of ODS 2, so I need an initialization with data transfer as the next step.
    How can I do what I want?
    thanks for any idea
    Frank

    Hello Tom,
    OK, I understand, but I don't know how.
    The only way I know to create an InfoPackage between ODS objects is to use "Update ODS Data in Data Target" from the context menu of an ODS. Then I choose between full/delta/init load and the InfoPackage is generated. How else can I create it?
    Frank

  • Delta Problem in process chain

    Hi all,
    the case is this: I have an ODS "A" as the source of a data mart to two InfoProviders, "B" and "C". I wanted to separate them into two different deltas; they were previously in the same delta InfoPackage, but now I need them separate. When I separated them into two delta InfoPackages in the same process chain, an error occurred after only one of the two InfoProviders of the data mart was loaded. When I check the source ODS "A", it says it already sent a delta load and there are no more deltas for the other InfoProviders; it does not recognize the two InfoProviders.


  • Problem in Process chains - Need help

    Dear Experts
    I have a problem in process chains.
    I have ODS1, from which I'm updating ODS2 and ODS3 with the same delta InfoPackage.
    The difference between ODS2 and ODS3 is that ODS2 is loaded with local currency,
    while ODS3 is loaded with group currency.
    When I run the process chain, it executes twice, as below:
    First time: it shows only ODS2 as the data target and loads all the delta records to ODS2.
    Second time: it shows both ODS2 and ODS3 as data targets and loads 0 records.
    In this way no record gets updated in ODS3.
    Could you please share your thoughts on why it triggers twice, and how we can avoid this?
    Thanks
    Lakshminarayana

    Thanks Shambu for the quick reply.
    The variant is being used to load ODS2 & ODS3, and the "automatic further processing" tick has also been removed.
    Thanks
    Nerusu

  • Delta datasource management in process chain

    Hi experts,
    I have the following problem with process chains.
    I have one <b>delta</b> InfoSource used to load two different InfoProviders (ODS 1 and ODS 2). ODS 2 must be loaded only once ODS 1 has been loaded and activated. The delta process implies that <b>both</b> ODS objects are data targets for the delta InfoPackage.
    In my process chain, I have:
    'Execute InfoPackage IFPCK1'
    <i>(with the 'only PSA' option and, of course, the two data targets ODS 1 and ODS 2 selected)</i>
    then a first
    'Read PSA and Update Data Target' process
    <i>(with options: data target = ODS 1, execute InfoPackage IFPCK1)</i>
    followed by
    'Activate ODS' for ODS 1
    then a second
    'Read PSA and Update Data Target' process
    <i>(with options: data target = ODS 2, execute InfoPackage IFPCK1)</i>
    Each time I run the process chain, I get the following message for <b>ODS 2</b>:
    "the request <b>(ODS 1)</b> already exists in infocube/ODS; Not loaded."
    It is as if the first 'Read PSA' process tries to load both ODS objects instead of only one.
    Any suggestion to help me solve this problem?
    Thanks in advance

    Hi Daniel,
    unfortunately there is no easy way around this. I would suggest creating an ODS 3 as an exact copy of the InfoSource, loading ODS 1 and ODS 3 in parallel, and then loading ODS 2 from ODS 3.
    Best regards
       Dirk

  • Problem in Process chain due to Aggregate Roll-up

    Hi,
    I have an InfoCube with aggregates built on it. I have loaded data into the InfoCube from 2000 to 2008, and rolled up & compressed the aggregates for this.
    I have also loaded the 2009 data into the same InfoCube using the Prior Month & Current Month InfoPackages, for which I only roll up the aggregates; no compression of aggregates is done. The current & prior month load runs through a process chain four times per day. The process chain is built in such a way that it deletes the overlapping requests when loading for the second/third/fourth time in a day.
    The problem is that when the overlapping requests are deleted, the process chain also takes the compressed aggregate requests (2000 to 2008 data), de-compresses them, deactivates the aggregates, activates the aggregates again, and re-fills and compresses the aggregates again. This takes nearly 1 hour for the process chain to run, when it should take no more than 3 minutes.
    So, what could be done to tackle this problem?  Any help would be highly appreciated.
    Thanks,
    Murali

    Hi all,
    Thanks for your reply.
    Arun: the problem with the solution you gave is that until I roll up the aggregates for the Current & Prior Month InfoPackages, the "ready for reporting" symbol does not appear for the particular request.
    Thanks,
    Murali

  • How do I run that delta InfoPackage in my process chain

    Hello Sir,
    I have a delta InfoPackage which is scheduled in a process chain, but it does not pick up any delta. I have to manually run a repair full InfoPackage daily, and only then does the delta come into the picture. My target is a DSO, so this doesn't cause any problem.
    The DataSource is customized.
    The delta pointer is calendar day.
    My question is: how do I run that delta InfoPackage in my process chain?
    Thanks...

    Hi
    Is the delta InfoPackage process type executing in the process chain or not? Please check in transaction SM37. And why are you running the repair full request InfoPackage? Only if you have missed delta records do you need to run a repair full request. If the delta records are not extracted, it is better to run the delta InfoPackage manually.
    Please also check in SM37 whether the "execute InfoPackage" process step is running or not, and whether any error has occurred in that process chain.
    Thanx & Regards,
    RaviChandra

  • Data Load Requirement through Process Chain

    Hello All,
    I have implemented an area through a custom DataSource, and the following is my data load requirement through a process chain. Request you to kindly help me with the same.
    1. The design is an InfoCube and updated using a Transformation through the custom data source.
    2. For the entire year 2008 I want a single request to be loaded. So it gets loaded into the PSA and then into the Infocube through a (Delta) DTP.
    3. Now I have created an InfoPackage (full update) with the year 2009 in the selection. That is, I want the data load henceforth to be for the year 2009.
    4. Hence the InfoPackage will run to bring the 2009 data into the PSA, and I run the (delta) DTP to update the same into the cube.
    5. Now, what I require is that every day the InfoPackage (full update) for 2009 runs and brings data into the PSA, and the same is updated into the InfoCube after deleting the 2009 request already present, keeping the previously loaded 2008 request intact.
    I hope the above is not confusing.
    Please let me know if i can elaborate the same.
    Thank you.
    Regards,
    Kunal Gandhi

    Hi,
    Please go through the links.
    http://help.sap.com/saphelp_nw04/Helpdata/EN/21/15843b74f7be0fe10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/f8/e5603801be792de10000009b38f842/frameset.htm
    http://help.sap.com/saphelp_nw04/Helpdata/EN/b0/078f3b0e8d4762e10000000a11402f/frameset.htm
    These may help you in designing the process chain as required.
    Regards,
    Sunil

  • How to load hierarchies using process chains

    Hi ,
    Can anyone please explain the steps for loading hierarchies using a process chain? Whenever I drag in a hierarchy InfoPackage, a save variant also comes in by default. Do we need a different save variant for each hierarchy InfoPackage, or can we have one save variant for all the hierarchy InfoPackages?
    Thanks,
    Vaka

    Hello Vaka,
    Yes, a save variant & attribute change run will be added when you load the hierarchy InfoPackage.
    Provide the InfoObject name for which you are loading the hierarchy in the save variant, and save it. The same will be transferred to the attribute change run as well.
    If you are creating the chain with more InfoPackages, place the save variant & attribute change run at the end.
    Best Regards....
    Sankar Kumar
    +91 98403 47141

  • Problem in Process chain data loading

    Hi Experts,
    We have a Process chain in which the lists of jobs are as follows.
    1) BI_TRIGGER_PROCESS
    2) BI_PROCESS_LOADING
    3) BI_PROCESS_ODSACTIVAT
    4) BI_PROCESS_DROPINDEX
    5) BI_PROCESS_LOADING
    6) BI_PROCESS_INDEX
    7) BI_PROCESS_ABAP
    These jobs trigger the process chain, load the data to the ODS, activate it, delete the index of the cube, load the data to the cube, re-create the index, and then delete the file from the FTP server.
    The problem is that the initial job (BI_TRIGGER_PROCESS) repeats multiple times, because of which the same file is loaded repeatedly to the ODS. One day it happened 130 times. This happens on a random basis, and we are unable to figure out where the problem is. Can anybody help in this regard?
    Thanks.

    Hi Rama,
    Following is my understanding of problem.
    You have Following steps in process chain
    1) BI_TRIGGER_PROCESS
    2) BI_PROCESS_LOADING
    3) BI_PROCESS_ODSACTIVAT
    4) BI_PROCESS_DROPINDEX
    5) BI_PROCESS_LOADING
    6) BI_PROCESS_INDEX
    7) BI_PROCESS_ABAP
    Sometimes only the first two steps repeat multiple times, i.e.
    1) BI_TRIGGER_PROCESS
    2) BI_PROCESS_LOADING
    In such a case, you should check the log in the process chain log view.
    I believe you should take your process chain out of schedule and reschedule it.
    Also check whether somebody has scheduled the InfoPackage separately.
    Regards,
    Ajinkya

  • Inventory delta load problem

    Hi,
    We are facing an inventory data load issue with the error below. The main problem is that this error has occurred at the PSA level since 16th June, yet the red request icons were changed to green and the data was loaded into the InfoCube and compressed. When running the process chain, the same error occurs daily.
    Error:
    Caller 09 contains an error message
    Problem:
    -> Data flow PSA -> InfoCube: the 20th request (which appears to contain data from the 16th to the 20th) had its red PSA status changed to green, and was loaded to the cube and compressed.
    The 21st request (which appears to contain data from the 16th to the 21st) had its red PSA status changed to green, and was loaded to the cube and compressed.
    Kindly guide me on what solution we can give.
    thanks

    I suggest that you handle this with utmost care. Since it does not look like you have a DSO in the data flow, recovering the data would be impossible if the cube data is corrupted.
    I need to understand the details to be able to give a proper solution.
    You can consider the request reverse posting option, which will post reverse images of the PSA requests under the scope and nullify the values in the cube. Please try this in the development and quality systems before trying it in production.
    Else, you can create a custom DataSource based on this PSA table and reverse the values in the transformation.
    If you can identify the documents created after this delta corruption, you can set them up in the source system and do a repair full upload.
    Please let me know whether it helps.
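    The "reverse the values in the transformation" option could be sketched roughly as below. This is a hypothetical, untested sketch: RESULT_PACKAGE and the generated type _ty_s_tg_1 exist in BI 7.x transformation end routines, but the key figure name QUANTITY is purely an assumption for illustration.

    ```abap
    * Hypothetical end routine sketch: negate the key figure so that
    * loading these reversed records nullifies the corrupted values.
    FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.

    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      " Flip the sign of the (assumed) quantity key figure
      <result_fields>-quantity = <result_fields>-quantity * -1.
    ENDLOOP.
    ```

    The characteristic values are left untouched, so the reversed records aggregate against the original ones in the cube.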

  • Delta load problem from ODS

    Hello everyone here.........,
    I have a problem loading from an ODS to another target with delta.
    I am loading the ODS from two DataSources, both full loads: Full1 and Full2. When I try to load from the ODS to another target with delta, it picks up only the Full1 data, not Full2. So I am now loading Full2 manually.
    Why does this happen here? Please let me know how to make
    Full2 load automatically as well.
    swetha

    Yes, you are correct. After Full1 is loaded into the ODS, the ODS is activated; these two steps are done through the process chain. The Full2 load we are doing manually, with a manual activation.
    Let's take a small example: yesterday's Full1 load and activation completed successfully, and after that I loaded
    Full2 manually and activated it. Today the process chain ran again, loaded Full1 and activated the ODS. What I am wondering is whether the further data mart load that runs in the process chain picks up only the Full1 load, and not the Full2 load which was already loaded and activated successfully.
    I think you now understand what I am trying to ask.
    Please clear up my confusion.
    swetha

  • Problem in process chain running for 3 days

    Hi Experts,
    Please help me
    We have process chain like below
    1. delete index
    2. full loads going to 3 cubes (16 InfoPackages)
    3. loading to ODS and then update to cube
    4. delete overlapping requests (16 InfoPackages)
    5. create index
    6. aggregate filling
    7. delete PSA
    Now the issue is that since last Saturday
    the process chain has been in yellow status, so steps 1, 2, 3 and 4 are still running.
    In the process chain, out of 16 InfoPackages, 12 are green and the remaining 4 are stopped.
    The "activation of ODS data" variant shows yellow, but when checked the data is activated; it is just not pushed to the cube. The log shows "system exception error_message", so the process terminated.
    When we checked for a short dump,
    it says "local time on the application server is incorrect".
    Experts, please let me know how to solve this and correct the load.

    Hi Experts,
    Thank you for your help.
    I did the steps manually yesterday; checking today, it is still in yellow status.
    Let me explain the scenario.
    The data goes to 3 cubes,
    and the process chain is designed like this:
    1. initially delete index
    2. full load to the cubes using 16 InfoPackages, and load from (2) ODS objects to the cubes
    3. delete overlapping requests
    4. create index (3 cubes)
    5. fill aggregates (only one cube)
    6. delete PSA request
    So now the chain is having problems at these points:
    1. data is loaded to the ODS
    2. data activation (showing yellow, but when checked the data is activated)
    3. further update to the cubes (2 cubes); in the process chain this did not start at all, so yesterday I did it manually
    4. out of 16 InfoPackages, only 12 are green and the remaining 4 did not start.
    So yesterday I manually scheduled the remaining 4 InfoPackages,
    and deleted the previous requests in the InfoCubes as they are full loads, but I did not do the create index and fill aggregates steps.
    Now the problem is that, out of the 3 cubes, the "report available" request is present for 2 cubes, but for the 3rd cube this request is not available, although the QM status is green.
    Please let me know how to proceed.
    One small question: do I need the fill aggregates step to get this "report available" request? Please suggest.
    Thanks & Regards,
    Raghu

  • How to redirect loading flow in process chain based on logic?

    Hi Experts,
    I have a scenario where I want to keep data for the last 3 years in 3 different cubes. For example, let's say cube 1 holds data for the current year 2006, cube 2 holds 2005 and cube 3 holds 2004. In the next year, 2007, I want to keep data for 2007, 2006 and 2005. I want to keep the 2007 data in cube 3, which was holding the 2004 data (delete the 2004 data and load 2007).
    This process should be automated, i.e., the decision of which cube to load should be made automatically. In short, <b>data for the new year should go into the cube that holds the data for the oldest year.</b>
    I want to know :
    1) Which Options I can use to do this?
    2) What about an ABAP program: a custom table can be maintained to track which cube holds which data, but how do we redirect the loading flow?
    3) What about the "decision" process type in the process chain?
    4) Would a custom process type also solve this?
    Any ideas would be highly appreciated.
    Thanks in advance,
    Sorabh

    Hi Sorabh,
    It's just an idea; I'm assuming this would work. I guess it should also work for init delta, but it would need proper testing.
    Create a custom table ZCUBEYEAR and maintain CUBE as the key:
    ZCUBEYEAR
    CUBE     YEAR
    Y1       2004
    Y2       2005
    Y3       2006
    In the update rule start routine for cube Y1, read the entry from table ZCUBEYEAR for Y1, which in this case would be 2004:
    DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY1.
    In our case YEARFORCUBEY1 = 2004.
    For cubes Y2 and Y3 the delete statements in their start routines would be:
    DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY2.
    DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY3.
    This ensures that only 2004 data flows to Y1, 2005 to Y2 and 2006 to Y3.
    When a new year starts, we need to run a program or manually update the custom table ZCUBEYEAR, setting cube Y1 to 2007; the deltas would then flow correctly.
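    The start routine idea above could be sketched in ABAP roughly as follows. This is an untested sketch: the custom table ZCUBEYEAR with fields CUBE and YEAR is the assumption from this thread, and the field name YEAR in DATA_PACKAGE is also assumed.

    ```abap
    * Sketch of the start routine for cube Y1 (untested; field names assumed).
    DATA: lv_year TYPE gjahr.

    * Read the year currently assigned to this cube from the custom table
    SELECT SINGLE year FROM zcubeyear INTO lv_year
      WHERE cube = 'Y1'.

    IF sy-subrc = 0.
      " Keep only the records belonging to the year assigned to Y1
      DELETE DATA_PACKAGE WHERE year NE lv_year.
    ENDIF.
    ```

    At year end only the ZCUBEYEAR entry changes; the routines themselves stay untouched.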
    Please email me at [email protected], we could have a detailed discussion.
    Hope the above helps your cause.
    Regards,
    Praveen.

  • Urgent: Problem with Process chain.

    Hello Experts,
    In my process chain I have an ABAP Program which produces an error message:
    RFC server sapftp cannot be started - 1: Program not started via RFC. No callback possible. Message no. CMS057
    Manually, I executed this program without any problems, but it does not respond when run via the process chain.
    After a successful load, this program copies the files from the application server into our central project directory to archive the data, and then deletes the files from the application server.
    Any suggestion please...........
    Thanks in advance.
    Thanks & Regards

    Hi Sailekha,
    When the program was executed through the process chain, it was not able to access the application server; this can sometimes happen due to RFC issues.
    Did you try repeating the program from the process chain?
    Is this the first time the issue has happened?
    Depending on the answers, you can decide whether it was an RFC issue.
    If that is not the case, check the next run of the process chain, and also check the variant assigned in "Maintain Variant" for this ABAP program in the process chain.
    Hope it helps!
    Cheers,
    Neelesh Jain.
