Load Balancing in Process Chains
Hi Experts,
There is a need to do load balancing for events triggered within a process chain. This has to be achieved without impacting any global settings (such as switching to time-based scheduling by manipulating background process settings on the central instance). Please let me know if there is any setting or process that allows this distribution to happen (kindly exclude data activation and change run steps).
Assign points to helpful answers? Of course!
Thanks and regards,
Prakash
Hi,
Make sure all the transactional data loads do not start at the same time.
regards
Srinivas
Similar Messages
-
How to load hierarchies using process chains
Hi ,
Can anyone please explain the steps for loading hierarchies using a process chain? Whenever I drag the hierarchy InfoPackage, a Save variant also comes in by default. So do we need a different Save variant for each hierarchy InfoPackage, or can we have one Save variant for all the hierarchy InfoPackages?
Thanks,
Vaka
Hello Vaka,
How are you?
Yes! A Save variant and an Attribute Change Run will be added when loading the hierarchy InfoPackage.
Provide the InfoObject name for which you are loading the hierarchy in the Save variant and save it. The same will be transferred to the Attribute Change Run as well.
If you are creating the chain with more InfoPackages, then place the Save variant and the Attribute Change Run at the end.
Best Regards....
Sankar Kumar
-
Data Load Requirement through Process Chain
Hello All,
I have implemented an area through a custom data source and following is my data load requirement through Process Chain. Request you to kindly help me for the same.
1. The design is an InfoCube and updated using a Transformation through the custom data source.
2. For the entire year 2008 I want a single request to be loaded. So it gets loaded into the PSA and then into the Infocube through a (Delta) DTP.
3. Now, I have created an InfoPackage (Full Update) with the year 2009 in the selection. That is, I want that henceforth the data load should be for the year 2009.
4. Hence, the InfoPackage will run to bring the 2009 data into the PSA, and I run the (Delta) DTP to update the same in the cube.
5. Now, what I require is that every day the InfoPackage (Full Update) for 2009 should run and bring data into the PSA, and the same should be updated in the InfoCube after deleting the 2009 request already present, keeping the previously loaded 2008 request intact.
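The request handling wanted in step 5 can be sketched as follows (Python used purely to illustrate the intended logic, not the actual BW mechanism; in the system this is typically done with the "Delete overlapping requests" process type, and the request structure here is a hypothetical model):

```python
# Hypothetical model: each request records the year selection it was loaded with.
def load_new_request(requests, new_request):
    """Drop any existing request with the same year selection, then append the new one."""
    kept = [r for r in requests if r["year"] != new_request["year"]]
    kept.append(new_request)
    return kept

cube = [
    {"id": "REQ_1", "year": 2008},  # initial 2008 request, must stay intact
    {"id": "REQ_2", "year": 2009},  # yesterday's full load for 2009
]

# Today's full load for 2009 replaces the old 2009 request only.
cube = load_new_request(cube, {"id": "REQ_3", "year": 2009})
print([r["id"] for r in cube])  # ['REQ_1', 'REQ_3']
```

The 2008 request survives every daily run because its selection never matches the incoming 2009 request.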
I hope the above is not confusing.
Please let me know if I can elaborate.
Thank you.
Regards,
Kunal Gandhi
Hi,
Please go through the links.
http://help.sap.com/saphelp_nw04/Helpdata/EN/21/15843b74f7be0fe10000000a114084/frameset.htm
http://help.sap.com/saphelp_nw04/Helpdata/EN/f8/e5603801be792de10000009b38f842/frameset.htm
http://help.sap.com/saphelp_nw04/Helpdata/EN/b0/078f3b0e8d4762e10000000a11402f/frameset.htm
These may help you in designing the process chain as required.
Regards,
Sunil
Edited by: sunil kumar on Apr 17, 2009 6:20 PM -
How to redirect loading flow in process chain based on logic?
Hi Experts,
I have a scenario where I want to keep data for the last 3 years in 3 different cubes. E.g., let's say cube 1 holds data for the current year 2006, cube 2 holds 2005, and cube 3 holds 2004. Now in the next year, 2007, I want to keep data for 2007, 2006 and 2005. I want to keep the data for 2007 in cube 3, which was holding the 2004 data (delete the 2004 data and load 2007).
This process should be automated, i.e. the decision of which cube to load should be made automatically. In short, data for the new year should go into the cube that is holding data for the oldest year.
I want to know :
1) Which Options I can use to do this?
2) What about ABAP program : custom table can be maintained to know which cube is holding what data, but how to redirect loading flow?
3) What about "Decision process type" in process chain?
4) Also would custom process type solve this functionality?
Any ideas would be highly appreciated.
Thanks in advance,
Sorabh
Hi Sorabh,
It's just an idea; I'm assuming that this would work. I guess this should also work for an init delta, but it would need proper testing.
Have a Custom Table ZCUBEYEAR and maintain the CUBE as the Key.
ZCUBEYEAR
CUBE YEAR
Y1 2004
Y2 2005
Y3 2006
In the update rule->Start Routine for Cube Y1, extract this entry from the table ZCUBEYEAR for Y1, which in this case would be 2004.
DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY1.
in our case YEARFORCUBEY1 = 2004.
For cube Y2 and Y3 the Delete statement would be as follows in their Start Routines.
DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY2.
DELETE DATA_PACKAGE WHERE YEAR NE YEARFORCUBEY3.
This would ensure that only 2004 data would flow to Y1, 2005 for Y2 and 2006 for Y3.
Once we come to a new year, we need to run a program or manually change the custom table ZCUBEYEAR, updating cube Y1 with 2007; the deltas would then flow correctly.
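The filter and yearly rotation described above can be sketched like this (Python as an illustration of the logic only; in the system this would be the ZCUBEYEAR table plus the start routines, and all names here mirror the hypothetical example):

```python
# Hypothetical in-memory model of the ZCUBEYEAR mapping.
cube_year = {"Y1": 2004, "Y2": 2005, "Y3": 2006}

def start_routine(cube, data_package):
    """Equivalent of DELETE DATA_PACKAGE WHERE YEAR NE <year assigned to this cube>."""
    keep = cube_year[cube]
    return [rec for rec in data_package if rec["year"] == keep]

def rotate_to_new_year(new_year):
    """At year end: the cube holding the oldest year is reassigned to the new year."""
    oldest_cube = min(cube_year, key=cube_year.get)
    cube_year[oldest_cube] = new_year
    return oldest_cube

data = [{"year": y} for y in (2004, 2005, 2006)]
assert start_routine("Y1", data) == [{"year": 2004}]

# New year 2007 arrives: Y1 (holding the oldest year, 2004) is reassigned to 2007.
assert rotate_to_new_year(2007) == "Y1"
assert start_routine("Y1", data) == []  # old 2004 records no longer pass the filter
```

Because every cube's start routine looks the year up in the shared table, only the table entry has to change at year end; the routines themselves stay untouched.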
Please email me at [email protected], we could have a detailed discussion.
Hope the above helps your cause.
Regards,
Praveen. -
Delta load problem in process chain
Hi,
I have an ODS giving data to another ODS. The init of this flow was done manually. Now I have created a process chain in which the further processing has the variant "execute InfoPackage" with the delta package made. Now when I check this process chain it gives me the error "Delete init. request REQU_49LKNGG0SE5966D3S7SQYYDQX before running init. again with same selection". Actually this is not the init request which I ran manually; this is the request which was run when I ran this process chain. What should I do to add only the delta InfoPackage in the process chain so that I don't have to delete the init?
Thanks
Prashanthi,
First load all the init loads from the ODS to the cube. Then you have to create the delta InfoPackage and include that in the chain.
Create the init InfoPackage under the InfoSource 8<ods name>.
Trigger that InfoPackage, create the delta InfoPackage under the 8<odsname> InfoSource, and include that in the process chain.
Please make these changes in your process chain:
Local chain 1: start -> load to ODS -> activation of ODS -> further processing (remove the InfoPackage in this process's maintained variant).
Local chain 2: start -> delete index -> load delta to cube -> create index -> roll up (if needed)
Main chain: start -> local chain 1 -> local chain 2
Ramesh -
Loading simultaneously several process chains
Is it recommended to load fiscal year 1, fiscal year 2, fiscal year 3, fiscal year 4, etc. at the same time using process chains under ALEREMOTE?
What would be the consequences for the system if several process chains load simultaneously into one InfoCube?
There are millions of records coming from the DSO for each fiscal year.
Hi Yosemite,
It is a good option to load data in parallel into the same target when the selection criteria are different. But please ensure that the number of parallel processes in each InfoPackage/DTP is restricted so that the system can sustain the parallel loads.
For a DTP this setting can be maintained via Goto -> Settings for Batch Manager.
Regards,
Vikram -
Flat file data loading error using process chain
Hi SAP Experts,
I am having a problem loading flat file data to the cube using a process chain. The issue is that when I run the process chain it fails with the message "Date format error, please enter the date in the format _.yyyy". I am using 0CALMONTH in the DataSource. The strange thing is that when I manually execute the InfoPackage, I don't get any errors and am able to load the same file successfully to the DSO and the cube. Is there any special setting for the process chain that I am missing?
The date format in the flat file is mm/yyyy. I have tried all the options I could, including recreating the DataSource, and so far without success. Please help me solve this problem, as we are in the middle of a testing cycle.
Thanking you all for your quick response and support all the time.
Kind Regards,
Sanjeev
Hi Sanjeev,
I believe you are opening the .csv file again after saving it. In this case the initial 0 in a single-digit month (say 02/2010) gets changed to 2/2010, and the resulting file is not readable by the system. Just do not open the file after you have entered the data in the .csv file and saved and closed it. If required, open the file in Notepad, but not in Excel, in case you want to re-check the data.
Hope this helps.
regards,
biplab -
BP daily load failed in process chain
Hi,
I am using process chains for the daily loading of master data. I have an InfoPackage scheduled in that process chain named 0BP_DEF_ADDRESS_ATTR_INIT/DELTA_CRM01 (ZPAK_48WV8918R85SEUHG8N1IP6DJQ). This InfoPackage is showing an error. When I diagnosed it, it said "Second attempt to write record '0030368557' to /BI0/PBPARTNER was successful".
And during error analysis i found this is causing the error.
Has anybody come across this error? How to correct this?
Appreciate your valuable responses.
Thanks
Ashok
Thanks.
I deleted the request on the Requests tab. However, the request still shows on the Reconstruction tab, and when I manually try to do a delta load it says the last load is not finished yet.
Could you please suggest?
Thanks -
Initialization of delta loads possible with process chain?
hello all,
followed situation:
I load data from a file and an SAP extractor to ODS 1. From this ODS I have an automatic delta load to another ODS 2, where the data is filtered and aggregated.
It is possible that some of the data changes its state from filtered out to in, but is not in ODS 2 because of the earlier criteria it had. Now I want a weekly (initial) load from ODS 1 to ODS 2 and a daily delta load from ODS 1 to ODS 2 whenever new data arrives in ODS 1.
I tried to create a process chain for that but I can't! I cannot use the InfoPackage for the initialization of the load from ODS 1 to 2 because it is a generated InfoPackage. And if I use the process "update ODS object data", a delta load is always attempted! In the first step I delete all data of ODS 2, so I need an initialization with data as the next step.
How can I do what I want?
thanks for any idea
Frank
Hello Tom,
OK, I understand, but I don't know how...
The only way I know to create an InfoPackage between ODS objects is to use "Update ODS Data in Data Target" from the context menu of an ODS. Then I choose between full/delta/init load and the InfoPackage is generated. How else can I create it?
Frank -
BPC:: Master data load from BI Process chain
Hi,
We are trying to automate the master data load from BI.
Currently we are using a package with:
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(DIMENSIONNAME,%DIMNAME%,"Dimension name:",,,%DIMS%)
PROMPT(RADIOBUTTON,%WRITEMODE%,"Write Mode",2,{"Overwrite","Update"},{"1","2"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/MASTER_CONVERT,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_CONVERT,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/MASTER_CONVERT,SUSER,%USER%)
TASK(/CPMB/MASTER_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/MASTER_CONVERT,SAPP,%APP%)
TASK(/CPMB/MASTER_CONVERT,FILE,%FILE%)
TASK(/CPMB/MASTER_CONVERT,DIMNAME,%DIMNAME%)
TASK(/CPMB/MASTER_LOAD,INPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_LOAD,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_LOAD,DIMNAME,%DIMNAME%)
TASK(/CPMB/MASTER_LOAD,WRITEMODE,%WRITEMODE%)
But we need to include these tasks into a BI process chain.
How can we add the INFO statement into a process chain?
And how can we declare the variables?
Regards,
EZ.
Hi,
I have followed your recommendation, but when I try to use the process /CPMB/MASTER_CONVERT with the parameter TRANSFORMATIONFILEPATH and the path of the transformation file as its value, I have a new problem. The value can only hold 60 characters, and my path is longer:
\ROOT\WEBFOLDERS\APPXX\PLANNING\DATAMANAGER\TRANSFORMATIONFILES\trans.xls
How can we pass this path?
Regards,
EZ. -
Data load Error using Process chain
Hi,
In NW2004s, when I schedule delta loads for 2LIS_11_VAITM using process chains and there is a zero delta, the Manage screens of the InfoCubes correctly show a green light with zero records, whereas the process chain log display shows red. The details of the process monitor show zero records. Is this right? I assume that even with a zero delta, the log should show green and all the subsequent processes like activation of the ODS, deletion of the PSA and construction of indexes should be carried out. Please confirm. Is this a bug?
Thanks.
Param
Hi Victor,
Can you please advise how you solved the issue? We are having a similar issue where it works fine when loaded manually but throws a communication error when the process chain runs. Following is the error message:
Communication error: call FM RSSDK_DATA_REMOTE_GET
Thanks in advance for your help!!
Sandeep -
Steps for master data loading by using process chain
Hi
I want to load the master data by using a process chain. Can anyone tell me the step-by-step method for loading the master data?
I mean, what are the processes I need to include in the process chain? Please tell me the sequence.
I'll assign the points.
kumar
Hi,
The process of loading master data is similar to transaction data in terms of using the InfoPackages.
First you have to load attributes, followed by texts and hierarchies. For master data you have to include the process type called Attribute Change Run. This can be included as a separate process type or by using the AND process.
Hope this helps.
Assign points if useful.
Regards,
Venkat -
Master data loading by using process chain
hi
I want to load the master data by using a process chain in BI 7.0. How can I do this?
Can anyone tell me the step-by-step method for loading the master data?
kumar
Hi Kumar,
Check this link on loading master data by using a process chain:
Re: Process Chain : Master Data
cheers
Sunil -
Loading Data Using Process Chains
Hi All,
We have 3 layers for loading data using DB Connect, i.e. in the 1st and 2nd layers we have DSOs, and the 3rd layer is a cube. We use the following update modes for the 3 layers:
1st layer - full
2nd layer - delta
3rd layer - delta
Is there any possibility of getting duplicates in the cube, and if so, how can this be solved using process chains?
regards,
Hi,
Your data flow looks OK.
Hopefully you are using a standard DSO; then there is no chance of getting duplicate records.
But in case you are using a write-optimized DSO with the checkbox "Do not check uniqueness of records" set, in this situation you will get the duplicates, as a write-optimized DSO with this property will simply keep on adding records.
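The difference can be illustrated with a small sketch (Python as a stand-in for the DSO behavior; the record keys and values here are hypothetical): a standard DSO overwrites by semantic key during activation, while a write-optimized DSO with the uniqueness check switched off simply appends every incoming record.

```python
# Hypothetical incoming records: (document number, amount); "4711" arrives twice.
incoming = [("4711", 100), ("4712", 200), ("4711", 100)]

# Standard DSO: activation overwrites by key, so the repeated record collapses.
std_dso = {}
for key, amount in incoming:
    std_dso[key] = amount
print(sorted(std_dso.items()))  # [('4711', 100), ('4712', 200)]

# Write-optimized DSO without the uniqueness check: every record is appended,
# and the delta to the cube then carries the duplicate onward.
wo_dso = list(incoming)
print(len(wo_dso))  # 3 records, including the duplicate
```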
Thanks
Mayank -
Regarding Master Data Loading by using Process Chain
hai
Can anyone tell me the step-by-step process and which processes and process types are used for loading the master data and transaction data into an ODS and into an InfoCube?
I'll assign maximum points.
bye
rizwan
Hi Mohammad,
1. Master Data loading:
http://help.sap.com/saphelp_nw04/helpdata/en/3d/320e3d89195c59e10000000a114084/content.htm
2. Transactional data to ODS:
http://help.sap.com/saphelp_nw04/helpdata/en/ce/dc87c1d711f846b34e0e42ede5ebb7/content.htm
3. Transactional data to CUBE:
http://help.sap.com/saphelp_nw04/helpdata/en/d5/e80d3dbd82f72ce10000000a114084/content.htm
Hope it Helps
Srini