Number of CVCs
Hello APO Experts,
As part of a sizing activity, I need a ballpark for the number of CVCs that will be used. For reference, I would like to know the maximum number of CVCs you have encountered at clients who have implemented APO DP. I appreciate your replies.
Thanks,
Uthira.
Uthira,
I suggest that you only create CVCs for the actual data you will be managing. I have a hard time conceiving of a company with that many active combinations significant enough to manage in DP. For instance, on my 89K-CVC project there were about half a million undeleted MARC records and 8,000 undeleted sold-tos, but analysis of VBAP showed that only a small fraction of those combinations ever made it into sales documents.
I assume that your goal is to manage and forecast sales data in DP. Have you determined how many of those CVCs actually exist in Sales docs? And how many need to go in each MPOS/Planning Area?
If you determine that you really need to create 100 million CVCs in one MPOS, then you need to focus on performance. I won't say it is impossible, but I have never attempted such an implementation. I believe you will need a lot of hardware, and performance will always be an issue.
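The VBAP check suggested above amounts to counting distinct characteristic tuples in the sales-item data. A minimal sketch in Python (the field names are hypothetical stand-ins for the actual table columns):

```python
def distinct_combinations(sales_items, keys=("product", "plant", "soldto")):
    """Return the set of distinct characteristic combinations in sales items."""
    return {tuple(item[k] for k in keys) for item in sales_items}

# Two document lines share the same combination, so only 2 CVCs are active.
items = [
    {"product": "P1", "plant": "W1", "soldto": "C1"},
    {"product": "P1", "plant": "W1", "soldto": "C1"},  # duplicate combination
    {"product": "P2", "plant": "W1", "soldto": "C2"},
]
print(len(distinct_combinations(items)))  # 2
```

The gap between the master-data cross product and this active count is usually what makes a 100-million-CVC estimate shrink dramatically.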
Best Regards,
DB49
Similar Messages
-
Typical number of CVCs and Matlocs for Sizing
Friends,
Can anyone tell me what the typical size of an APO application would be?
I know it varies from company to company, but I am trying to understand what an optimal number would be.
I have seen anywhere between 6,000 and 600,000 SKUs for SNP and about 500,000 CVCs for DP. Please share your experiences; I am hoping this thread can be a good resource for people trying to understand the sizing effort of an APO application.
Thanks
Prod_Planner
Hi,
You are right. It completely depends on the organization, type of manufacturing and so on. It is really difficult to quote any specific numbers.
I have seen Matlocs of about 250k and CVCs of about 50k.
Hope it helps.
Thanks
Mani -
Hi,
We are using SAP SCM release 5.1, the component Demand Planning. We are running macros in background jobs, e.g. a macro that populates a key figure by summarizing other key figures etc. We are using parallel processing, configured based on recommendation from SAP. When the number of CVCs to be processed by the macros becomes very high (we are processing ~300.000 CVCs at the moment), we get the following message in the job log in SM37:
"Some CVCs cannot be processed in block 2 ", message number /SAPAPO/SDP_PAR023. Obviously the block number varies, and in our case we have had the above message for up to 19 blocks. Users have spotted CVC's that were not updated by the background job, but we don't have a specific list of the CVCs that were not processed.
OSS notes mentioning this behaviour:
Note 1282811 - Error processing 1 CVC, terminates the Parallel Profile
Note 1501424 - DP Job with parallel processing - job status message
Note 1494099 - DP Job with parallel processing - job status
The question below is only to those who have encountered the same message in a DP background job:
Did you find a log of the CVCs that were not processed, and what did you do to overcome the problem?
Thanks in advance!
Kind regards,
Geir Kronkvist
Hi Rico,
Thanks for your reply! The spool consists of 23.145 pages, so I searched for the word "Lock" using "Find in request" in the "Spool request" menu. The search found two entries with a message stating that a CVC was locked. It must be noted that no users are logged on while our background job is running, and there are no other processes (background or dialog) running in parallel. When checking transaction SM12 prior to running the job, there are no locks in the system.
Our job schedule consists of a nightly process chain transferring data from BI to liveCache, and a monthly job that prepares historical data and calculates the forecast. We are now running the monthly job.
Is it possible that the parallel processing may cause the locking by itself?
Kind regards,
Geir Kronkvist -
Taking a long time to load from the planning area to the cube - is the number of CVCs the reason?
Hi all,
We have a huge number of CVCs for various countries. When we load data from the planning area to the cube using those CVCs, it takes 40 hours.
I checked and found there are more than 15,000 CVCs, which may be one reason. But I still need to improve the time taken to load data from the planning area to the cube using the process chain.
I tried splitting the process chain to load data per sales organisation, but it's still the same!!
Can anyone help me or recommend the SAP process to improve data loading from the planning area to the cube in APO Demand Planning?
Thanks
Pooja
Hi Pooja,
15K is not huge at all. We have worked with 50K and still managed to do the extract into the cube in about an hour.
Please help me understand a few things so that we can help you better.
1) Number of key figures?
2) Number of periods?
3) Key figure storage types - are any key figures stored in the InfoCube rather than in time series? This can be found in the key figure details in the Planning Area.
4) Please explain your data flow, e.g. Planning Area --> DataSource --> Communication structure --> Update rules --> Cube?
5) Are you using parallel extraction in your DataSource? This can be checked in the data extraction tools from your Planning Area screen.
A few general tips:
1) Parallelize your DataSource.
2) Load into the PSA and the InfoCube in parallel.
3) Do not include key figures stored in the InfoCube in your backup; use only key figures stored in liveCache.
Thanks
Mani -
How do we run the realignment process for CVCs in DP using a flat file? Realignment of single values can be done, but how is it done for a number of CVCs through an Excel/tab-delimited text file? Also, when do you use the Delete Source option in realignment? What if we want to copy the same product to a number of different locations?
Thanks,
Harsh
We had the same issue. I would like to share the solution so that someone can reference it in the future.
Step 1: Create the source file as a tab-delimited text file and name it .xls rather than .txt.
Step 2: Leave the first line of the source file blank.
Step 3: When the source is created from the SAP template, delete the column "Realignment Step"; you don't need it, the system will generate the sequence.
Step 4: Leave Step Status blank, Realignment Factor mostly 100, the date in YYYYMMDD format, and Realignment Logic as M.
Step 5: Upload the file; it will work. -
Calculating the no. of CVCs
Hi,
We have 15 products, each with about 10 SKUs. There are 4 plants and 12 DCs. Can I calculate the number of CVCs I may have? Is any other information needed to calculate how many CVCs we would potentially have?
I appreciate any links.
Thanks.
Hi Visu,
With the numbers you give us, the greatest number of CVCs you can obtain, if you use all of them as characteristics, is 15 x 10 x 4 x 12 = 7,200.
The only way I know to approximately calculate the number of CVCs is to know all the characteristics you are going to use and the possible values of each.
If you want to reduce the number of CVCs, check whether some characteristics have n:1 relations; if so, you can use such a characteristic as a navigational attribute.
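The upper bound described here is just the product of the distinct value counts per characteristic. A minimal sketch:

```python
from math import prod

def max_cvcs(values_per_characteristic):
    """Worst-case CVC count: every value of every characteristic combines."""
    return prod(values_per_characteristic)

# 15 products x 10 SKUs x 4 plants x 12 DCs, as in the question above
print(max_cvcs([15, 10, 4, 12]))  # 7200
```

In practice the real CVC count is far lower, since not every product is sold from every plant and DC.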
I hope this can help you.
Regards.
Marín. -
Error while loading the planning book
Hi There,
I am getting the below error while loading data in the planning book:
"Error for COM routine using application program (return code 40,016)
Error reading data - Planning book cannot be processed further"
I ran /SAPAPO/TS_LCM_CONS_CHECK and /SAPAPO/TS_LCM_REORG, but that did not work; I still get the same error.
Could you please advise?
Thanks,
Krishna
Hi Krishna,
It looks like a key figure might be incorrectly created, or there might be a problem in aggregation/disaggregation of key figures due to a very large number of CVCs.
Regards
JB -
Use of storage bucket profile in APO DP
I'm trying to clarify the sizing implications of using various buckets in the APO DP storage buckets profile.
Suppose I have a storage bucket profile 1 consisting of calendar weeks and days, and a profile 2 consisting of weeks only.
What will be the relative database/memory sizing resulting from these 2 profiles?
Thanks for any advice...
Hi,
As our other friends have mentioned here, just having a storage bucket profile doesn't consume memory. However, assuming you have generated time series objects based on these storage bucket profiles, the following example highlights the memory usage.
Horizon used -> 2 years.
No. of weekly buckets --> 104 or 105
No. of daily buckets --> 365
Now, if you generate time series from your SBP containing both daily and weekly buckets, the total memory occupied will be 365 + 104 = 469 times your number of CVCs.
If you generate time series from your SBP containing just weeks, the total memory occupied will be 104 times your number of CVCs.
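The arithmetic above can be wrapped in a small helper. This is only a rough estimate per key figure; it ignores liveCache overhead and per-cell byte size:

```python
def timeseries_cells(n_cvcs, bucket_counts, n_keyfigures=1):
    """Cells = CVCs x key figures x total storage buckets in the profile."""
    return n_cvcs * n_keyfigures * sum(bucket_counts)

# 2-year horizon: profile 1 holds days + weeks, profile 2 weeks only
print(timeseries_cells(50_000, [365, 104]))  # days + weeks: 469 buckets per CVC
print(timeseries_cells(50_000, [104]))       # weeks only: 104 buckets per CVC
```

Keeping daily buckets in the storage bucket profile thus multiplies the footprint by roughly 4.5x compared with weeks only.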
Hope this helps.
Thanks
Mani Suresh -
APO DP: Disaggregation to product&plant level from higher levels.
Hi.
We do demand planning on groups of products, per country/region in general; we have around 48.000 CVCs in our current setup. It works very well.
A new situation has arisen where we need the forecast split down to product and plant level.
As is, we simply don't have the information at this level of granularity.
I don't see how we can add, for instance, product to our setup; we have around 20.000 products, so the number of CVCs in DP would become massive if we did this.
I was thinking that perhaps something could be done by exporting the relevant key figures to a new DP setup with fewer characteristics (to keep the number of CVCs down) via some InfoCubes; perhaps some disaggregation could be done via some tables and the BW update rules. This still leaves the issue of how to get the figures properly disaggregated to plant and product, though.
Does anyone have experience getting figures split to lower levels from DP when you're planning at a higher level?
Simon,
One approach, as you mentioned, is to create a Z table in which you set up disaggregation proportions from product group level to product level or product-location level, e.g.:
Product Group X  100  ->  Product A@loc1  10
                          Product B@loc1  90
Download your planning area data into InfoCube C and then use BW routines to convert the data from group level in InfoCube C to the lower level, referring to the Z table, into another InfoCube.
SAP also provides standard functionality for splitting an aggregate demand plan to a detailed SNP plan, through functionality such as location split or product split.
Essentially you would be using the same concept in your BW solution, or you may also want to consider releasing your DP to an SNP planning area as a solution for disaggregating the data to a lower level.
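The Z-table proportion logic can be sketched in a few lines of Python (the table layout and names are a hypothetical illustration, not the actual BW routine):

```python
def disaggregate(group_totals, z_table):
    """Split group-level key figure values to detail level using percentages."""
    detail = {}
    for group, total in group_totals.items():
        for member, pct in z_table[group].items():
            detail[member] = total * pct / 100.0
    return detail

z_table = {"X": {"A@loc1": 10, "B@loc1": 90}}   # proportions per product group
print(disaggregate({"X": 100.0}, z_table))
# {'A@loc1': 10.0, 'B@loc1': 90.0}
```

The real work is maintaining the proportion table; the split itself is a straightforward multiply per detail member.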
Regards,
Manish -
I am using APO DP version 5.
In my planning object structure I have <u>product group</u> and <u>product</u> characteristics.
I have a key figure called <u>% adjustment</u>. If the user enters, for example, 10 in this key figure at the product group level, I want this figure of 10 to be applied for <u>each</u> product in the product group, via a macro where a percentage adjustment of 10% is applied to an unadjusted key figure (at the product level).
The question is - what calculation type should be used for the % adjustment key figure so that the value entered at the higher level is simply replicated down to the lower level?
Thanks, Bob Austin
Use calculation type 'A' (average). An easy way to populate values is to enter (manually, via macro, or via job) this key figure value at the highest possible aggregation level; the same value gets copied to the lower level automatically. But be careful when adding new CVCs: because this is an average type, the aggregate is calculated by adding all key figure values and dividing by the number of CVCs at the next level. If new CVCs are added (with this key figure value zero), the value will not stay the same (e.g. 10.000% will become 9.9%, depending on the number of CVCs). So when new CVCs are added, you need to zero out and re-enter the same value at the aggregate level.
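The dilution effect described here can be illustrated with a toy calculation (pure Python, just to show why a new zero-valued CVC lowers the displayed aggregate):

```python
def aggregate_average(detail_values):
    """Calculation type 'A': the aggregate shows the mean of the detail cells."""
    return sum(detail_values) / len(detail_values)

members = [10.0] * 9               # 10 entered at aggregate level, copied to 9 CVCs
print(aggregate_average(members))  # 10.0
members.append(0.0)                # a new CVC joins with key figure value 0
print(aggregate_average(members))  # 9.0 -- the aggregate no longer shows 10
```

Hence the advice to zero out and re-enter the value at the aggregate level after adding CVCs.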
Hope this helps,
Niranjan -
Reading of attributes of characteristics in APO DP
I am using APO DP V5.
Within DP, I have a characteristic with associated navigation and display attributes.
Is it possible, using macro function, to read these attributes so that they can be used in macro coding?
Thanks,
Bob Austin.
If you have characteristic values for every "prod at customer" value, you can try loading this as a new characteristic. Since it will have a one-to-one relation with the main characteristic, there will not be an increase in the number of CVCs, and in the macro you can refer to the value of the selected characteristic (lead time, in this case) directly with ACT_IOBJNM_VALUE().
Another way is to use the product master and load the attribute as an extra (tab) field, and then use the macro function MATLOC_EXTRA() to read it.
But then you need to define your material and location in DP (product corresponding to "product at customer"). -
Collective Product Allocation Wildcard
Hi, I am having an issue setting up collective product allocation. Allocation is currently based on the following parameters, and this works fine:
Product Family
Customer type
Order type
Currently we have around 4-5 order types defined in R/3, and CVCs have been created with all the combinations; for this reason the number of CVCs is high: Prod. Fam * Cust type * Order type.
In the future we are looking at specific allocation for certain order types. I have manually created CVCs with a wildcard for the order type, but when I try to create a sales order it gives me a combination error.
Fam1 X ####
Fam2 Y ####
Fam3 Z ####
Fam1 X ORD1
Fam2 Y ORD1
I am not really sure whether collective allocation would help in this scenario. Can somebody let me know what exactly the issue could be?
Thanks,
HarishProduct allocation is been completely setup what I mean by this is Allocation groups, procedures assignments in master data have already been done.
As you have mentioned I know the number of combinations going into the future will have issues...that is the reason I want to setup collective allocation but I keep getting a CVC combination error when I do this..
If you look at the example I had given earlier thats the combination I would like to achieve without the CVC error.. -
Hi colleagues
We plan to use DP and GATP in APO for the FMCG industry.
I have the following questions:
Level for Forecasting:
Product, Product Group, Plant, and Province are required. Forecasting is done at monthly level.
Expected combinations = approximately 9,000 to 11,000 CVCs.
For forecasting we plan to use key figure fixing at product and plant level. We need to maintain an aggregate at Product and Plant.
For GATP Check and Allocation Check:
Product, Product Group, Plant, Province, 9AKNOB, and Customer are required. Allocations are done at weekly level.
Expected combinations = 69,000 CVCs.
In our process, the statistical forecast is also one of the inputs to decide allocation.
For allocations we plan to fix at aggregate level at Product and Province. We need to maintain an aggregate at Product and Province.
For designing MPOS:
Option 1: Separate MPOS for Forecasting and Allocation
Pros: 1) Performance increases for forecasting because of fewer CVCs, since there are no customers in forecasting.
2) Only a single aggregate to maintain on each of the forecasting MPOS and allocation MPOS.
Cons: 1) CVC duplication across the 2 MPOS.
2) Data realignment needs to be handled twice.
Option 2: Single MPOS for forecasting and Allocation
What are your recommendations on my requirement?
BR
Katerine
Hi Katerine,
I too think that a separate MPOS may be the better choice.
Just a couple of points...
'In our process, Statistical forecast is also one of the input to decide allocation."
For this point, how do you plan to have the forecast updated for the CVCs of the GATP MPOS?
Generating it in the GATP MPOS itself, or copying from the other MPOS?
Please also note that 69K total CVCs is not a huge number to me; we have operated with CVC volumes 10-15 times higher.
Having many additional background jobs/process chains means additional dependencies, monitoring, scope for confusion, and more BW InfoObjects (say, backup cube, extraction cube, history cubes).
Regards
Datta -
Hello All,
I am stuck in the creation of CVCs.
I have successfully loaded the data from a file, and I am able to see the data in the InfoCube via "display data in InfoCube".
But I am not able to create CVCs in the MPOS. I get the message "No new characteristic combinations were loaded".
Please tell me, what should I do?
I am able to create single CVC.
Thanks a lot in advance
Prabhat
Dear Prabhat,
please check the date of the data in the cube. This date has to be included in the horizon of the variant you use for the generation of CVCs.
You can do it as follows:
1.) Enter transaction /n/sapapo/mc62
2.) Choose the MPOS
3.) Click "create characteristic combinations"
4.) Click "generate in background"
5.) Load from the InfoProvider and choose the cube
6.) Make sure that the date in the cube is included in the horizon
-> Double-check that CVCs are selected by clicking "Number of hits"
7.) Execute
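Step 6 is the usual stumbling block: combinations whose dates fall outside the variant's horizon are not selected, which is what the "Number of hits" check reveals. A toy Python sketch of that filter (field names are illustrative, not the actual cube structure):

```python
from datetime import date

def cvcs_in_horizon(rows, start, end, chars=("product", "location")):
    """Distinct combinations whose record date lies inside the horizon."""
    return {tuple(r[c] for c in chars)
            for r in rows if start <= r["date"] <= end}

rows = [
    {"product": "P1", "location": "L1", "date": date(2011, 5, 1)},
    {"product": "P2", "location": "L1", "date": date(2009, 1, 1)},  # outside horizon
]
print(len(cvcs_in_horizon(rows, date(2011, 1, 1), date(2011, 12, 31))))  # 1
```

If "Number of hits" shows fewer combinations than expected, widening the horizon is the first thing to try.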
I hope this will help.
Regards,
Tibor -
Process chains: error when including CVC generation process type
hello all !
I have a process chain that loads data from a custom extractor into an InfoCube.
It then triggers "Generate Characteristic Combinations", which never executes. The status remains active, but there is no "completed successfully" (yellow triangle) in the process chain monitor.
When I run it alone, the CVC generation process type executes OK...?!
My sequence is: Delete index from cube ---> Load InfoPackage --> Attribute change run ---> Create index --> Delete overlapping requests (up to here it loads OK, although the status never becomes green?).
The final step is a CVC generation + adjust time series process type.
Does anyone know why the error occurs, and how to solve it?
I appreciate all your comments.
Hi Samir,
I guess you can go directly to RSA1 -> Source Systems, right-click and Transfer Global Settings, then delete the red request, go to the particular process chain, and do a Repeat/Repair.
You can try repairing it...
The rest of your question is not clear... Give me your number, I will call you, and then we will discuss.
Rgrds,
Habeeb