APO DP Disaggregation issue.
Hi All,
We have a disaggregation issue in Demand Planning forecast execution. There is history for 3 materials, CVC combinations exist for 6 materials, and history matches the CVCs for only 3 of them. When I execute the forecast, it is disaggregated to all 6 materials. I want disaggregation to only the 3 materials with history. I used APODPDANT for disaggregation. Please help.
Hello,
If I understand correctly, you want to use proportional factor to disaggregate the calculated forecast, right?
Please first make sure that the proportional factor APODPDANT is correctly calculated based on the history key figure.
Make sure APODPDANT's values are zero for those CVCs without historical data.
(You can check the APODPDANT key figure's values by putting it into a data view.)
If the proportional factor is not correct, please recalculate it in /sapapo/mc8v.
Then please make sure that the disaggregation type of the forecast key figure is correctly set. It should be:
1) Disaggregation type 'P' instead of 'I'. ('I' is also OK, but for testing purposes 'P' is better.)
2)Disaggregation key figure is set as APODPDANT.
3)Time based disaggregation should have the same setting as structural disaggregation.
At last, the forecast run in foreground should be run at aggregated level.
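The proportional logic described above can be sketched outside APO (a minimal Python illustration, not SAP code; the material names and numbers are invented):

```python
def proportional_factors(history):
    """Compute APODPDANT-style proportions from a history key figure.

    CVCs with no history get a factor of 0, so a 'P' disaggregation
    gives them no share of the aggregate forecast.
    """
    total = sum(history.values())
    if total == 0:
        return {cvc: 0.0 for cvc in history}
    return {cvc: value / total for cvc, value in history.items()}

def disaggregate(aggregate_forecast, factors):
    """Split an aggregate forecast onto the CVCs using the factors."""
    return {cvc: aggregate_forecast * f for cvc, f in factors.items()}

# 6 CVCs, history for only 3 of them (as in the question above)
history = {"MAT1": 40.0, "MAT2": 40.0, "MAT3": 20.0,
           "MAT4": 0.0, "MAT5": 0.0, "MAT6": 0.0}
factors = proportional_factors(history)
result = disaggregate(300.0, factors)
# MAT1 and MAT2 each get ~120, MAT3 ~60, MAT4..6 get 0
```

With correct zero factors for the history-less CVCs, the forecast lands only on the three materials with history, which is the behavior the original poster wants.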
Best Regards,
Ada
Similar Messages
-
APO DP SCM 5.0 Disaggregation issue + performance guide
Hi All,
I am again facing a "small" disaggregation calculation issue. What I am trying to do is calculate Corrected sales history at the most detailed level through disaggregation.
KF1: Sales history
KF2: Correction
KF3: Corrected Sales history
Situation:
The calculation is performed via background job:
Aggregation level:
Country
Sales org.
Product
Selection:
Country: 001
Sales org: 2000
Product: 12 & 13
I use macro for the calculation:
Start
Step1: KF3 = initial
Step2: KF3 (redisaggregation) = KF1 (Values) - KF2 (Values)
End
Planning Area:
KF3: I (if initial, based on another key figure) - KF1, time-based disaggregation (not relevant)
How can I get KF1-based disaggregation for KF3? Is it even possible with the above solution, or should I come up with a totally different approach? Calculating at the most detailed level is not possible because the system can't handle so many CVCs. Although I do kind of wonder: the system is almost a beast and it can only calculate 50,000 CVCs in 5 hours? Any suggestions on this one also?
Thanks in advance,
Juha
Hi Manimaran,
I just tested your suggestion and the results are:
Sales history KF1:
SUM: 1000
Product: A: 750
Product: B: 250
Product A for Customer 1: 325
Product A for Customer 2: 325
Product B for Customer 1: 125
Product B for Customer 2: 125
Correction KF2:
SUM: 1000 (automatically split via disaggregation, this number is inserted in planning view and after this the background job is executed)
Product: A: 500
Product: B: 500
Product A for Customer 1: 250
Product A for Customer 2: 250
Product B for Customer 1: 250
Product B for Customer 2: 250
Corrected Sales history KF3 (after background job and disaggregation):
SUM: 2000
Product: A: 1250
Product: B: 750
Product A for Customer 1: 625
Product A for Customer 2: 625
Product B for Customer 1: 375
Product B for Customer 2: 375
As I analyze this, it seems like it is disaggregating evenly. I have checked that the planning area settings are:
KF3: P - (based on a keyfigure) Keyfigure: KF1
so on that basis there should be no even disaggregation. I also tried with different aggregation level settings but the results are the same.
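For comparison, the intended "based on a key figure" split versus an even split can be sketched like this (an illustrative Python sketch with invented numbers, not APO internals):

```python
def split_by_reference(total, reference):
    """Type-P-style split: each member gets a share proportional to
    its value in the reference key figure (e.g. KF1)."""
    ref_sum = sum(reference.values())
    return {k: total * v / ref_sum for k, v in reference.items()}

def split_evenly(total, members):
    """Even split: what you see when the reference is ignored."""
    share = total / len(members)
    return {m: share for m in members}

kf1 = {"Product A": 750.0, "Product B": 250.0}   # sales history
total = 2000.0                                    # value at the sum level

print(split_by_reference(total, kf1))  # A: 1500, B: 500
print(split_evenly(total, kf1))        # A: 1000, B: 1000
```

If the observed product-level numbers match the even split rather than the reference-based one, the reference key figure is effectively being ignored at that level, which matches the symptom described above.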
BR,
Juha
Edited by: Juha Pajarinen on Aug 12, 2011 11:37 AM -
APO DP: Disaggregation to product&plant level from higher levels.
Hi.
We do demand planning on groups of products and per country/region in general; we have around 48,000 CVCs in our current setup. It works very well.
A new situation has arisen where we need to have the forecast split down to product and plant level.
As is we simply don't have the information at this level of granularity.
I don't see how we can add, for instance, product to our setup; we have around 20,000 products, so the number of CVCs in DP would become massive if we did this.
I was thinking that perhaps something could be done by exporting the relevant key figures to a new DP setup with fewer characteristics (to keep the number of CVC's down) via some infocubes, perhaps some disaggregation could be done via some tables and the BW update rules. This still leaves the issue of how to get the figures properly disaggregated to plant and product though.
Does anyone have experience with getting the figures split to lower levels from DP when you're planning on a higher level?
Simon,
One approach, as you mentioned, can be creating a Z table in which you set up the disaggregation proportions from product group level to product (or product-location) level, e.g.:
Product Group X 100 -> Product A@loc1 10
Product Group X 100 -> Product B@loc1 90
Download your planning area data into infocube C and then use BW routines to convert the data from group level in infocube C to the lower level, referring to the Z table, into another infocube.
SAP also provides standard functionality for splitting the aggregate demand plan down to a detailed-level SNP plan, through functionality like location split or product split.
Essentially you will be using the same concept in your BW solution, or you may also want to consider releasing your DP to an SNP planning area as a solution for disaggregating data to a lower level.
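The Z-table idea can be sketched as follows (hypothetical names and ratios; in a real solution this logic would sit in a BW update/transformation routine reading the Z table):

```python
# Hypothetical Z table: group -> [(product@location, share)]
z_table = {
    "Product Group X": [("Product A@loc1", 10), ("Product B@loc1", 90)],
}

def split_group_value(group, value, z_table):
    """Disaggregate a group-level value to product@location level
    using the shares maintained in the Z table."""
    rows = z_table[group]
    total_share = sum(share for _, share in rows)
    return {key: value * share / total_share for key, share in rows}

detailed = split_group_value("Product Group X", 100.0, z_table)
# {'Product A@loc1': 10.0, 'Product B@loc1': 90.0}
```

Normalizing by the total share (rather than assuming the shares sum to 100) keeps the split correct even if the Z table is maintained inconsistently.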
Regards,
Manish -
Hi All,
actually I have a requirement where the standard disaggregation function seems to bring no success. Here is my requirement:
In the Planning Query I've a row Hierarchy with a structure and two Characteristics:
e.g.
Structure:
Region I
Region II etc.
Characteristics:
Subregion and Respons. Person
Original Values:
Region I: 1.000
-> Subregion I: 1.000
> Musterman I: 500
> Musterman II. 500
The structure element is locked for manual planning. The user should be able to plan values for subregions and responsible persons. The only way to get the cells input-ready was using the disaggregation function in the query. But the function should not do a normal disaggregation. Here are a few examples of how it must work:
Entering 1.500 for the Subregion I:
Instead of:
Region I: 1.500
-> Subregion I: 1.500
> Musterman I: 750
> Musterman II. 750
...the result should look like:
Region I: 1.500
-> Subregion I: 1.500
> Musterman I: 500
> Musterman II. 500
> #: 500
On the other hand, changing values of the responsible persons, lead to
1)
Region I: 1.000
-> Subregion I: 1.000
> Musterman I: 500
> Musterman II. 500
2)
Region I: 1.500
-> Subregion I: 1.500
> Musterman I: 1000
> Musterman II. 500
Does anybody have an idea how this issue can be solved? I already searched the forum for a while...
Hi Christopher,
posting the difference of a higher level to # is not really a disaggregation (or even a planning function). It's standard functionality. Create a planning level and query that do not contain resp. person and enter the values there. Then they will appear as # in the query you already have.
Regards,
Marc
SAP NetWeaver RIG -
Hi,
We have an APO system where most of the dialog processes are used by RFCs. This is more than the RFC quota and the minimum dialog quota set. It looks like RFCs are creating new RFCs; in such cases, the quota parameter is not enforced. Did anyone have similar issues?
Thanks
Hi Raj,
Well, before you conclude, I would suggest monitoring the RFC utilization for a while (a few days) using transaction SARFC.
This transaction basically helps in analyzing the utilization; using it you can monitor the RFC resources on all application servers and thus find out the load incurred by parallel RFCs on a server.
If you suspect there is a need for more resources, then you can probably configure dynamic allocation of the resources.
For a detailed explanation, configuration, and tuning parameter info, please refer to the links below:
http://help.sap.com/saphelp_NW70EHP1/helpdata/en/fd/1d75a8f34308449ef6c4cdcda4e137/frameset.htm
http://help.sap.com/saphelp_erp2004/helpdata/EN/5b/90a0231a36b544a6ba54908a898828/content.htm
Last but not least, please take your system configuration (hardware/software) into consideration before making any changes.
I hope this helps you in fine tuning your system.
Regards
Sekhar -
Hi Gurus,
Of the two disaggregation types, structural disaggregation (for example, region as the aggregation level and customer/DC etc. as the detailed level, using a calculation type) and time-based disaggregation, which one happens first?
Please share your ideas.
Regards,
Hi,
As such, there is no fixed sequence between time-based and characteristic-based disaggregation.
It depends on the business need whether disaggregation of the planning data should be done on the basis of characteristics (like product brand or sales region) or with respect to the time level (say, monthly data disaggregated to weeks). This is done at key figure level and is defined at the time of planning area creation.
Either one can be used first, or both can be used.
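For simple pro-rata splits the order indeed does not matter, which can be sketched as follows (illustrative Python with invented proportions, not APO code):

```python
def pro_rata(total, weights):
    """Split a total proportionally to the given weights."""
    s = sum(weights.values())
    return {k: total * w / s for k, w in weights.items()}

month_total = 1200.0
product_weights = {"Brand A": 2.0, "Brand B": 1.0}           # characteristic-based
week_weights = {"W1": 1.0, "W2": 1.0, "W3": 1.0, "W4": 1.0}  # time-based

# characteristic first, then time:
by_product = pro_rata(month_total, product_weights)
a = {p: pro_rata(v, week_weights) for p, v in by_product.items()}

# time first, then characteristic:
by_week = pro_rata(month_total, week_weights)
b = {w: pro_rata(v, product_weights) for w, v in by_week.items()}

# same cell values either way, e.g. Brand A in week W1:
assert a["Brand A"]["W1"] == b["W1"]["Brand A"] == 200.0
```

This is why, as noted above, either dimension can be disaggregated first when both use proportional logic.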
Thanks -
Invalid I/O node config error while passing project reservations to APO
Guys,
I had created a project with a WBS element, network, and internal activity in ECC Project Systems. This data was CIFed and seen in APO without any issue.
Upon creating a material component as a reservation under the network activity in PS and releasing it, there is an error in the APO-side log (SLG1) as shown below.
Source system: xyz, user: abc transaction:CJ20N function module:/SAPAPO/CIF_PRJ_INBOUND
-----Error start--
New order (warning)
Error in activity of operation 0010 order 4001740(warning)
Invalid I/O node configuration (error)- Message no. /SAPAPO/OM_ERROR258
Diagnosis-The I/O node cannot be scheduled using the value combination specified.
--Error end--
Error while processing project order: 000001234- (error)- Message no. /SAPAPO/PRJ003
------Log Information
The registered objects of the queue are marked as faulty - Message no. /SAPAPO/CIF_ERRHDLG604
CIF error handling activated - Message no. /SAPAPO/CIF_ERRHDLG504
End of processing registered for RFC 00000001 of the LUW with ID xxx
Message no. /SAPAPO/CIF_ERRHDLG605
I checked APO postprocessing and manually triggered the transfer of the selected order; even then the order reservation wasn't pushed to APO.
I would appreciate it if anyone can provide info on why this error is being produced and ways to resolve it!
Thanks
Please check if you have any issues under the network activity. If you cannot find any, please try to debug the failed queue with the help of your ABAP counterpart. Here is the process:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d0d1265d-db32-2b10-79ba-ccf6fe2c161d -
Displaying data on a different level then the allocation check
Hi
we are creating sales orders in R/3 and executing a sales order check based on a planning area in APO.
This check is done on product/sold-to, because that is the level where the forecast is entered.
However, one sold-to can contain different ship-tos, so because of disaggregation it is distributed randomly over the ship-tos.
What we want to obtain is that, when the allocation check is done and the order is in APO via the CIF, the data is displayed on product, sold-to, and ship-to as in the sales order, while the check to see whether there is enough quantity can remain on product/sold-to.
Do you have any ideas?
Tommy
Tommy,
Please elaborate on your problem.
I assume you are talking about the Planning book used for Allocations; please confirm. If the only CVCs in your Allocation planning book and your Product Allocation Group are 'Product' and 'Sold To', then 'ShipTo' (and ShipTo disaggregation) is irrelevant. Product Allocation only considers the CVCs that have been created. In this case, multiple ShipTos against a single SoldTo are 'first come first served' until the SoldTo Incoming Orders qty reaches the SoldTo Allocation Qty.
It is possible to include ShipTo in your Allocation Group and Allocation Planning Area in addition to SoldTo; this is a fairly common solution. If you do so, you will THEN have to consider ShipTo Disaggregation issues. Since this seems to be a negative issue for you, I would recommend against it.
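The "first come, first served" consumption described above can be sketched as follows (illustrative only; order IDs and quantities are invented, and the real product allocation check runs inside APO):

```python
def allocation_check(allocation_qty, incoming_orders):
    """Confirm orders in arrival sequence until the SoldTo
    allocation quantity is used up."""
    remaining = allocation_qty
    confirmations = []
    for order_id, qty in incoming_orders:
        confirmed = min(qty, remaining)
        remaining -= confirmed
        confirmations.append((order_id, confirmed))
    return confirmations

# one SoldTo with 100 allocated; the ShipTo does not matter at this level
orders = [("shipto1-ord1", 60), ("shipto2-ord2", 30), ("shipto1-ord3", 25)]
print(allocation_check(100, orders))
# [('shipto1-ord1', 60), ('shipto2-ord2', 30), ('shipto1-ord3', 10)]
```

Note that the third order is only partially confirmed, regardless of which ShipTo it belongs to: without ShipTo in the allocation group, the SoldTo bucket is the only thing being consumed.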
Best Regards,
DB49 -
Hi,
My question is related to the CIF of master data.
I built an integration model to transfer the materials from R/3 to APO, and the delta modifications are then CIFed by a background process at night.
I would like to transfer all the delta master data, but having one excetption.
I do not need to transfer the delta changes of the safety stock in R/3 to APO, so that I can maintain these values independently in APO. The issue is that the safety stock values in R/3 are also changed for another purpose, and after the delta CIF transfer the R/3 values overwrite the values that I upload directly in APO.
Is there any way that I can CIF the delta master data changes in R/3, but restricting the delta safety stock changes?
Thanks a lot!
Yes, it is possible. You have to add custom logic in the material CIF user exit to turn off the update of the safety stock fields in APO. There are actually a lot of threads in this forum that discuss this issue.
You can start by referring to the thread below
Re: CIF problem with Dynamic and Static Stock Method -
CIF Error while transferring vendor
We are facing CIF error while transferring vendor master to APO.
The issue is in e-mail address of the vendor master.
One way is to correct the e-mail addresses - one per line without semicolon at the end.
But the issue list is very huge to get it corrected.
Is there any other way by which we can exclude the e-mail addresses or their validation?
Thanks and regards,
Sushant
Hi Senthil,
The objective was to avoid the CIF error due to the incorrect address setup. This required us to bypass it before the data transfer. We could fix this using an enhancement in ECC.
Appreciate your kind reply, though. Thanks a lot.
Warm regards,
Sushant -
Hi,
We are using CTM for characteristic-dependent planning (CDP), and for that we are CIFing the PDS from ECC.
In CDP, since CTM supports only time-continuous planning, we have to use the PP/DS PDS, and since CTM supports only the CTM PDS, we have done the BAdI enhancement and are getting the CTM PDS in APO.
The issue is that when we CIF the PDS with object dependencies, the OD is not reflected in the CTM PDS in APO, while we are able to see the OD in PP/DS for the same production version.
Has anyone worked on CTM PDS with object dependencies, and what can be the probable reason for this discrepancy?
I have also applied note 0001342840 to resolve this issue.
Thanks & Regards,
Sanjog
Hi,
there are some restrictions with CDP in CTM according to note 1284461:
Characteristics-based planning with CTM:
The PDS must come from an ERP system. The supply chain should not mix VC and CDP scenarios. For reference characteristics, only the components /SAPAPO/CULL_CFG_COMPONENT-QUANT and /SAPAPO/CULL_CFG_MODE_PROCEDURE-DURVAR are supported.
Variant functions are not supported. Only the following object variables are supported: $SELF, $PARENT, $ROOT. For selection conditions and procedures, the following operators are supported: =, <>, AND, OR, NOT. For procedures, the following operator is supported: IF. Complex selection conditions are not supported.
Multi-value characteristics are not supported.
Can you check if you fulfill the above-mentioned criteria?
Regards Frank -
CTM Order Selection MTO scenario
Hi,
I have created a sales order in ECC with the requirement strategy Make To Order and CIFed the order to APO.
The order is getting CIFed and is reflecting in APO.
The issue is that when I try to do Demand Simulation using CTM the order is not getting recognised.
Is there some setting that needs to be done in the case of a sales order for the MTO scenario?
Please help me in this regard.
Thanks,
Sanjog
Hi Sanjog,
Can you see the sales order in APO (/sapapo/rrp3)?
Please maintain the category for this sales order in the demand prioritization profile and then try simulating the demands.
Let me know if it works.
Thanks
Binod -
Issue with Disaggregation Type A
I am seeing some strange behavior with respect to Disaggregation type A.
We have region and product as characteristics, and we use disaggregation type A for KF A.
Let's say Region A has 10,000 products under it and Region B has 15,000 products under it. Most products for Regions A and B are common. We load decimal numbers like 1.11, 1.0, 1.2 and 0.94 for 36 months into KF A by region only (not by region and product) in APO.
So according to disaggregation type A, I would expect all the products under Region A to have the same numbers: 1.11, 1.0, 1.2 and 0.94. But I am seeing somewhat different numbers, with a slight difference, when I load Region A and product 1004, or a bunch of products.
What could be the reason for this behavior and how to correct it ? Would really appreciate any inputs on this from the experts.
Thanks
Alicia
Alicia,
You mention that you see a "slight difference" in the values. Could you let me know what values you see? A slight difference can occur in disaggregation due to the decimal settings for the KF and rounding effects.
Also, is this something that has happened all of a sudden and used to work correctly earlier? Does this issue occur across all regions/CVCs or only some?
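The rounding effect mentioned above can be illustrated generically (a sketch with invented numbers and decimal settings, not a reproduction of APO's internal calculation): when disaggregated values are stored with a fixed number of decimal places, the stored details need not add back up exactly to the loaded aggregate.

```python
from decimal import Decimal, ROUND_HALF_UP

def store(value, decimals=3):
    """Round the way a key figure with a fixed number of
    decimal places would store a disaggregated value."""
    q = Decimal(10) ** -decimals
    return Decimal(str(value)).quantize(q, rounding=ROUND_HALF_UP)

# a region value of 1.0 spread over 3 products and read back:
n_products = 3
share = 1.0 / n_products             # 0.3333333...
stored = [store(share) for _ in range(n_products)]
print(stored)                        # each product stores 0.333
print(sum(stored))                   # 0.999, not the loaded 1.0
```

The fewer decimal places the key figure keeps, the larger these slight differences can become.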
Abhi
Edited by: abhishek sharma on Apr 12, 2010 10:35 AM -
Issue in transfer of data from ECC to APO
Hi All,
I have a requirement to transfer data from ECC to APO. I am using EXIT_SAPLCMAT_001 for this purpose. The problem is, I need to transfer the data of a field that is not present in CIF_MATLOC but is present in /SAPAPO/MATLOC.
How should I proceed? Please help... this is an urgent requirement.
Thanks & Regards,
SriLalitha
Hi,
you may want to go to transaction /SAPAPO/SNP_SFT_PROF.
Determine Forecast of Replenishment Lead Time
Use: In this field, you specify how extended safety stock planning determines the forecast of the replenishment lead time (RLT). The following values are available:
1) Supply Chain: The system determines the RLT forecast using the supply chain structure by adding the corresponding production, transportation, goods receipt, and goods issue times. If there are alternative procurement options, the system always takes the longest option into account.
2) Master Data: The system determines the RLT forecast from the location product master data.
3) Master Data/Supply Chain: First, the system determines the RLT forecast from the location product master data. If no RLT forecast can be determined, the system determines the forecast using the supply chain structure (as described under Supply Chain).
Dependencies: You can retrieve the replenishment lead time forecast yourself by using the GET_LEADTIME method of the Business Add-In (BAdI) /SAPAPO/SNP_ADV_SFT.
Replenishment Lead Time in Calendar Days: the number of calendar days needed to obtain the product, including its components, through in-house production or external procurement.
Use: The replenishment lead time (RLT) is used in the enhanced methods of safety stock planning in Supply Network Planning (SNP). The goal of safety stock planning is to comply with the specified service level, in order to be prepared for unforeseen demand that may arise during the replenishment lead time. The longer the RLT, the higher the planned safety stock level.
Dependencies: The field is taken into account by the system only if you have specified Master Data or Master Data/Supply Chain in the "RLT: Determine Forecast" field of the safety stock planning profile used.
Hope this helps.
The RLT from ECC is in MARC-WZEIT, which is transferred to APO in structure /SAPAPO/MATIO field CHKHOR.
Maybe if you maintain the setting in the profile, you will get the value in RELDT.
Thanks. -
Sales order doesn't exist issue in APO
Hi ,
I am facing an issue while trying to create an outbound delivery from a sales order. During delivery creation, the BAPI BAPI_OUTB_DELIVERY_CREATE_SLS throws the error "Sales order xxxxxxxxx doesn't exist in APO, only 0 EA of material is available...". But when done manually, the delivery is created. Hence we analysed that it is due to the time the APO system takes to update the sales order.
Let me know how to get the sales order (or order ID in APO terms) to verify that the SO is updated in the APO system.
Kindly suggest.
Dear Experts,
this message is assumed answered; I'm facing a similar problem: during the replication of business partners, the address isn't created in SAP ECC, and a BDoc with the error appears.
Thanks in advance
Cris