Incorrect key figure disaggregation
Hi All,
Please help me resolve an issue with incorrect key figure disaggregation in the planning book. The key figure value that is disaggregated is based on another key figure. The calculation type and disaggregation type set up in the planning area for these two key figures are:

Key figure | Calculation type | Based on KF | Time-based disaggregation
KF1        | S                | -           | P
KF2        | P                | KF1         | P
I am providing an example as below.
You select one SKU (50058801807) in the planning book. These three characteristic value combinations exist for the SKU: CVC1, CVC2, and CVC3. You enter 100 pieces as the actual impact of promotional events quantity (KF2) in the month of May (month five).
You save the data. The system first performs time-based disaggregation: it disaggregates the 100 pieces at SKU level into technical periods based on the number of workdays in each period. In this example, a factory calendar has been defined in the planning area, so there are five workdays in each week. The month of May has 22 workdays (one day is a public holiday). No proportional factors exist. The calculation is as follows:
Level 1: The system rounds each period's share and collects the rounding differences from all the periods.

Technical Period | Days in TP | Calculation         | Result | Rounding Difference
5/1-5            | 5          | 100 * 5/22 = 22.727 | 23     | -0.273
5/8-12           | 5          | 100 * 5/22 = 22.727 | 23     | -0.273
5/15-19          | 5          | 100 * 5/22 = 22.727 | 23     | -0.273
5/22-26          | 5          | 100 * 5/22 = 22.727 | 23     | -0.273
5/29-31          | 2          | 100 * 2/22 = 9.090  | 9      | 0.090
Total            |            |                     | 101    | (rounded sum exceeds 100 by 1)
Level 2: Distribution of the difference -1

Technical Period | Days in TP | Calculation        | Result | Brought Forward
5/1-5            | 5          | -1 * 5/22 = -0.227 | 0      | -0.227
5/8-12           | 5          | -1 * 5/22 = -0.227 | 0      | -0.454
5/15-19          | 5          | -1 * 5/22 = -0.227 | 0      | -0.681
5/22-26          | 5          | -1 * 5/22 = -0.227 | 0      | -0.908
5/29-31          | 2          | -1 * 2/22 = -0.090 | -1     | 0
Total            |            |                    | -1     | 0
The result:

Technical Period | Result
5/1-5            | 23
5/8-12           | 23
5/15-19          | 23
5/22-26          | 23
5/29-31          | 8
Total            | 100
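The two-level scheme above (round each period's share, then push the accumulated rounding difference into the last bucket) can be sketched in Python. This is only an illustration of the example's arithmetic under the stated workday weights, not SAP's actual implementation; the same scheme also reproduces the CVC splits shown further below.

```python
# Illustration of the example's two-level rounding, not SAP code.
def disaggregate(total, weights):
    """Split an integer total over weights: round each share (level 1),
    then move the accumulated rounding difference into the last bucket
    so the total is preserved (level 2)."""
    wsum = sum(weights)
    shares = [round(total * w / wsum) for w in weights]
    shares[-1] += total - sum(shares)
    return shares

# 100 pieces over weeks with 5, 5, 5, 5, 2 workdays of a 22-workday May:
assert disaggregate(100, [5, 5, 5, 5, 2]) == [23, 23, 23, 23, 8]
# The same scheme at CVC level (three CVCs in equal proportion):
assert disaggregate(23, [1, 1, 1]) == [8, 8, 7]
assert disaggregate(8, [1, 1, 1]) == [3, 3, 2]
```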
As an example, the 23 pieces, which are distributed in several technical periods (5/1-5, 5/8-12, 5/15-19 and 5/22-26), are calculated below for the three characteristic value combinations:
CVC   | Calculation      | Result | Remainder
CVC1  | 23 * 1/3 = 7.666 | 8      | -0.334
CVC2  | 23 * 1/3 = 7.666 | 8      | -0.334
CVC3  | 23 * 1/3 = 7.666 | 8      | -0.334
Total |                  | 24     | -1.002
The remainder -1.002 is distributed to the three characteristic value combinations:
CVC   | Calculation           | Result | Brought Forward
CVC1  | -1.002 * 1/3 = -0.334 | 0      | -0.334
CVC2  | -1.002 * 1/3 = -0.334 | 0      | -0.668
CVC3  | -1.002 * 1/3 = -0.334 | -1     | 0
Total |                       | -1     |
The following table shows how many pieces the system assigns to each of the three characteristic value combinations:

CVC   | Result
CVC1  | 8
CVC2  | 8
CVC3  | 7
Total | 23
Calculation for the 8 pieces (period 5/29-31):

CVC   | Calculation     | Result | Remainder
CVC1  | 8 * 1/3 = 2.666 | 3      | -0.333
CVC2  | 8 * 1/3 = 2.666 | 3      | -0.333
CVC3  | 8 * 1/3 = 2.666 | 3      | -0.333
Total |                 | 9      | -0.999
The remainder -0.999 is distributed to the three characteristic value combinations:

CVC   | Calculation           | Result | Brought Forward
CVC1  | -0.999 * 1/3 = -0.333 | 0      | -0.333
CVC2  | -0.999 * 1/3 = -0.333 | 0      | -0.666
CVC3  | -0.999 * 1/3 = -0.333 | -1     | 0
Total |                       | -1     |
The following table shows how many pieces the system assigns to each of the three characteristic value combinations:

CVC   | Result
CVC1  | 3
CVC2  | 3
CVC3  | 2
Total | 8
This means that each characteristic combination is assigned its proportion from the first table for every technical period in which 23 pieces are distributed; in the period 5/29-31, the eight pieces are distributed to the three characteristic combinations as in the second table.
The results are displayed schematically below.

Technical periods in which data is saved:
5/1-5: 23 | 5/8-12: 23 | 5/15-19: 23 | 5/22-26: 23 | 5/29-31: 8

Disaggregation of each technical-period value to the detailed (CVC) level:
Value 23 -> CVC1: 8, CVC2: 8, CVC3: 7
Value 8  -> CVC1: 3, CVC2: 3, CVC3: 2
According to the above calculation, the expected values are:
CVC1: 35 (8 + 8 + 8 + 8 + 3)
CVC2: 35 (8 + 8 + 8 + 8 + 3)
CVC3: 30 (7 + 7 + 7 + 7 + 2)
But the planning book shows:
CVC1: 33.623
CVC2: -50.276
CVC3: -16.101
Please note: three decimal places are set for the key figure in the planning area, but this calculation is based on one decimal place. Even then, the differences should not be this high.
Please provide some examples of disaggregating a key figure based on another key figure with no proportional factors maintained.
Thank you in advance for your help and support.
Kind Regards,
Hi,
Your calculations seem to be in line with the <a href="http://help.sap.com/saphelp_scm50/helpdata/en/26/53f1f3758211d398490000e8a49608/content.htm">SAP help for rounding and disaggregation</a>.
If there are no preexisting proportions in either KF1 or KF2, the values you expected should come up. It is possible that other proportions exist in KF1 that don't appear in your selection but cause this aggregation in KF2. (If you have a remote cube, you can run LISTCUBE on what's in the planning area and check all the CVCs.)
It would also be a good idea to check both key figures in Details (All) mode for the CVCs, display the time buckets as technical periods, and see whether that matches your calculations.
To see the planning book in technical periods, go from the planning book to the period structure settings and choose 9ASTORAGE. (You can't use this permanently in the planning books, though.)
Similar Messages
-
Currency Translation with a Calculate Key Figure
Hi,
Does anyone have experience with applying currency translation in BEX/Query Designer on a Calculated Key Figure?
I have a need for a calculated key figure to be converted to a target currency of USD (which is already defined via RRC1). I also need the Results Row to display the summation in USD.
Currently, my calculated key figure displays two currencies (MYR and USD) and an incorrect summation of both currencies.
Here is how I have defined my calculated key figure:
NODIM ( 'Consumption (STOs)' ) * NDIV0 ( 'Material Source Plant Cost' / NODIM ( 'Source Plant Price Unit' ) )
I am multiplying a quantity field by an amount field and then dividing by a price-per-unit field. I have applied NODIM to the other fields so that the amount field will retain its properties.
Running this query through transaction RSRT and clicking on the Generate Report button, I get the following message:
"<b>Currency translation cannot be carried out for element 20 (my calculated key figure). Element 20 neither contains a Basic key figure nor a variable with type Amount. For this reason, you cannot and do not need to perform a currency translation</b>."
From the above message, I must be setting up my calculated key figure incorrectly.
Any ideas?
Thanks!
Hau

Hello Ajeet and N Ganesh,
Thank you for your help. I verified that the dimension of my key figure 'Material Source Plant Cost' was of type 0AMOUNT, so that was not the issue.
The issue was in the error message that was returned when I pressed the Generate Report button in RSRT. Essentially, currency conversions in BEX can only be performed on basic key figures or simple replacement path variables.
I can perform the currency translation on a SIMPLE calculated key figure that contains only the replacement path variable (where 'Material Source Plant Cost' is an attribute of my master data characteristic ZMAT_SRC). However, I found that I cannot perform the currency translation on COMPLEX calculated key figures, like in my example above.
To get around this currency translation issue and as suggested by the error message, I created a SIMPLE calculated key figure for replacement path variable 'Material Source Plant Cost' and performed the currency translation. The problem with this solution is that while I get my currency translation, I also get a calculated key figure that is aggregated, according to the query layout.
To get around the aggregation issue, I also created a dummy counter as a master data attribute (of ZMAT_SRC) to capture the aggregation. I assign a value of 1 to this dummy counter/master data attribute in the update rules to my ZMAT_SRC infoobject.
I then divide my new (aggregated) calculated key figure for 'Material Source Plant Cost' by the dummy counter (which is also aggregated). Essentially, I divided the aggregated replacement path variable by the scaling factor. The result is a new calculated key figure that has been translated into the target currency and has the correct scaling factor. I can use the new calculated key figure in the above formula, regardless of how the report is rolled up.
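The dummy-counter workaround described above can be sketched outside BEx (a minimal Python illustration; the per-row cost of 40 is an invented number, not from the original thread):

```python
# Sketch of the dummy-counter trick: a per-unit attribute value repeated
# on every row is inflated by summation, but dividing by an equally
# summed counter of 1s restores the per-unit value.
rows = [  # three rows sharing the same source-plant cost (invented: 40)
    {"cost": 40, "counter": 1},
    {"cost": 40, "counter": 1},
    {"cost": 40, "counter": 1},
]

agg_cost = sum(r["cost"] for r in rows)        # 120 after roll-up
agg_counter = sum(r["counter"] for r in rows)  # 3

assert agg_cost / agg_counter == 40            # per-unit value restored
```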
Again, thanks for your help. It gave me the pieces to solve this puzzle. -
Disaggregation of Key figure not in proportion at monthly bucket
Dear Expert,
Please find the below case.
Here key figure 2 (KF2) is disaggregated based on key figure 1 (KF1, calculation type P), and we also maintained time-based disaggregation K.
The issue: the total is 13 for both key figures, as shown in the screenshot below.
But region-wise (at detail level) they are not in 1:1 proportion, and this is causing a problem for us.
For example, Region 2 has a value of 2 in KF1 for week 11.2015, but after disaggregation it got nothing in KF2.
(The reason is that KF2 has values in weeks 12 and 13 and nothing in week 11.)
We tried two different approaches to get a 1:1 proportion in the monthly bucket, but neither solved our problem.
Approach 1: After KF2 is calculated in the weekly bucket, reset KF2 at total level in the monthly data view, enter the total of KF2 at total level in the monthly bucket, and check the result at detail level.
But in this case it is calculated for weeks 10, 11, 12 and 13 as 0, 9, 3, 1. (We want KF2 as per the original calculation, i.e. 12 and 1 in weeks 12 and 13 respectively.)
Approach 2: Add a new key figure, KF1 Total, which sums KF1 at detail level per month, and then disaggregate KF2 based on this new key figure.
This works for the scenario above, but not for a different one. For example, here regions 3 and 4 are affected.
If the disaggregation at month level is corrected, our issue will be resolved.
Waiting for your feedback.
Thank you
Sachin

Hi,
In the Planning Area key figure settings, did you try using 'N - No disaggregation in time' for key figure KF2?
This means that copying will occur in the technical periods of the storage bucket profile, not the way it happens now.
Other option:
Currently, your storage bucket profile and your planning bucket profile are in days/weeks/months.
Is it necessary for your users to view data in daily buckets? Can they view the data in weekly buckets only (and not in daily buckets)?
Your planning bucket profile can then be in weeks/months, so the displayed data will be in weekly buckets.
Regards
Datta -
Addition of Key Figures in Query Produces incorrect Results
Hi,
I have a query with a globally defined structure of Restricted Key Figures. These are mainly G/L Account Balances from R/3.
When I create a global Calculated Key Figure the sum is incorrect (ie does not match R/3) despite the fact that the individual restricted key figures match perfectly.
The only inconsistency is that one or two of the RKFs are +ve in R/3 but -ve in BW.
Would really appreciate some help on this.
Thanks a lot.

Hi All,
Thanks for your replies.
Paul B - I tried your recommendation and it did not work. I only set the result to summation, nothing else.
Bjorn - I tried what you said, but that didn't work either.
Venkat - I thought about this, but I have about 30+ calculated key figures and over 100 restricted key figures, and a fair number have -ve signs. Do I have to do this for all of them?
Regards -
Query key figures showing incorrect values
Hi,
The key figures in my query are showing incorrect values: NO CALC. POSSIBLE.
Data in the cube comes from different DataSources, and the cube is compressed.
This is how the data in the cube looks:

Plant | Unit | Currency | Stock Quantity | Stock Value | Total (CKF, inflow and outflow)
P001  | ST   |          | 1000           |             | 0
P001  |      | EUR      |                | 100         |

Query result:

Plant | Stock (PC)        | Stock Quantity | Stock Value | Total (CKF, inflow and outflow)
P001  | No calc. possible | 1000           | 100         | No calc. possible

How do I resolve this at query or cube level so that NO CALC. POSSIBLE is not displayed?
Thanks

Hi,
Indeed my cube is a copy of 0IC_C03.
Could you please explain more about the filtering.
I have the following key figures
0TOTALSTCK
0ISSTOTSTCK
0RECTOTSTCK
ZISSSTCK_VALUE
ZREC_VALUE
ZTOTSTKVALUE
ZTOTSTKVALUE is a CKF with inflow and outflow of ZISSSTCK_VALUE and ZREC_VALUE.
Whenever there is a 0, it displays NO CALC. POSSIBLE. I would like it to be blank or 0.
thanks -
Different disaggregation rules within the same key figure
Hello All,
We need to apply different disaggregation rules within the different levels of the same key figure.
As an Example:
Key Figure: SALES VOLUME
Level1: Market
Level2: Segment
Level3: Model
We want to define such rules:
1- Disaggregate from Market Level to Segment Level using Historical Data
2- Disaggregate from Segment Level to Model Level using APODPNDANT (a key figure in which the input is supplied from business)
Does anybody know if this (defining two different rules at different levels of the same key figure) is possible?
Thanks in advance...

Hi Ergul,
Ideally it is better to have APODPDANT as the disaggregation key figure at all levels.
You can choose to manipulate APODPDANT instead by making it an editable key figure in the planning book.
To start with, all levels can be based on the proportional key figure calculation.
By loading the selections at the segment-to-model level, you can manually enter the values based on business input.
You might need some macros to fix the totals and to make sure the sales volume is redisaggregated.
I think this can be solved with a good business process and simple functionality in the way you see the data. -
Calculated key figures with incorrect values
Hi experts,
I have a requirement where I have multiple calculated key figures based on different restricted key figures.
say for eg:
RKF1 - restricted to calmonth 01.2006
RKF2 - restricted to calmonth 01.2007
CKF1 = RKF1 %a RKF2
RKF3 - restricted to calmonth 02.2006
RKF4 - restricted to calmonth 02.2007
CKF2 = RKF3 %a RKF4
and so on...
In the output of the report, ALL CKFs show the same result as CKF1!
Please suggest.
Thank you all in advance. -
How to ignore blank/null key figure value in BI Queries
Reports on Multiprovider - we see some cells of a Key figure as blanks. These blanks are interpreted as zeros by the system and calculated accordingly resulting in incorrect values. As per our requirement, we need a count of all hard/real zeros only, not the blanks. For example, if there are 10 rows of which 6 are real zeros and 4 are blanks - our count should be 6 and not 10.
How to ignore the blanks in BEx queries please?
Thanks for your help.
Upender

Rakesh,
It is not possible to find a pattern because the report is on a MultiProvider with 2 InfoProviders- Purchasing documents DSO and Material Movements InfoCube.
Every purchasing document has several materials associated with it. These materials are compared with the materials in Material Movements. Not all materials in the purchasing document are found in Material Movements. For materials found in Material Movements, the quantity is obtained, and the correct value shows up: if the quantity is zero, it shows in reports as zero. If the material is not found in Material Movements, the quantity shows up as a blank value.
My requirement is to ignore such blank quantities and not count them. Only quantities with 0 values should be counted. Currently both blanks and zero values are counted, showing an inflated count.
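The distinction can be sketched in plain Python (blanks modelled as None; the row counts follow the 6-zeros/4-blanks example above): a naive truthiness test counts blanks and zeros alike, while an explicit equality test counts only real zeros.

```python
# 10 rows: 6 real zeros, 4 blanks (missing quantities modelled as None)
values = [0, None, 0, None, 0, 0, None, None, 0, 0]

inflated = sum(1 for v in values if not v)     # blanks count too -> 10
real_zeros = sum(1 for v in values if v == 0)  # None == 0 is False -> 6

assert inflated == 10
assert real_zeros == 6
```

In BEx the analogue is a formula or counter key figure that distinguishes "no record" from a posted zero; the sketch only shows the counting logic, not the query definition.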
Thanks,
Upender -
Key figure display in planning book with respect to Time bucket profile
Hi,
I am loading a key figure to planning area from the info cube for the current month. When I review the key figure in planning book with monthly time bucket profile it shows 85 for the current month. In the same planning book with weekly bucket profile, it shows 55 from the current week and future weeks and the remaining 30 goes into the past weeks of the current month.
How can I make the total quantity of 85 show in the current and future weeks only?
thanks and regards
Murugesan

Hi Murugesan,
Within the planning area, data is stored at the lowest level of granularity that you maintain in the storage bucket profile. During display, the system decides what data to show depending on the time bucket profile used in the planning view and the time-based disaggregation maintained for the key figure.
In the case below, what time characteristic do you have in the cube? Is it date, week or month?
If it's date, check how much key figure data is maintained on the dates belonging to a week that has days in both this month and the last month; e.g. for Dec 2011, how much data is stored on 1, 2, 3 and 4 Dec 2011.
This data would appear in December in the monthly view, but in the weekly view it would appear in the week starting 28th November.
If data is maintained in cube in weeks, then you need to calculate how time based disaggregation would show it to you in months.
If it's months, then you need to find out how much data goes to the days in the past week of the month.
Time-based disaggregation may be causing you some issues, but in that case I would not expect 30 out of 85 to go into the past week, unless you have daily data in the cube.
Data shown in the weekly view for the week starting 28th Nov should ideally be a small proportion of 85, unless you are using a time stream/fiscal year variant due to which most of December is in holidays. The only other exception I can think of is that you have data on the days mentioned above.
It would be best to help the business understand this disaggregation logic, rather than thinking of manipulating the data to shift to a later week.
If this logic doesn't explain your situation, then please provide the date/week/month at which you have data in cube, and what quantity.
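The month/week boundary effect described above can be sketched as follows (the daily values are invented purely for illustration; only the calendar arithmetic matters):

```python
import datetime as dt

# Invented daily data: 5 pieces on each of 1-17 Dec 2011 (total 85)
daily = {dt.date(2011, 12, d): 5 for d in range(1, 18)}

monthly_total = sum(v for day, v in daily.items() if day.month == 12)

# The week starting Mon 28 Nov 2011 contains 1-4 Dec:
week_start = dt.date(2011, 11, 28)
first_week = sum(v for day, v in daily.items()
                 if 0 <= (day - week_start).days < 7)

assert monthly_total == 85
assert first_week == 20   # these 20 pieces show up in a "past" week
```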
Thanks - Pawan -
Key Figure units in Fact Table - Error
All -
When I run a report off a cube, some rows display 0 when there are corresponding values in my cube. The report doesn't agree with LISTCUBE. I ran transaction RSRV on my cube and tested "Key figure units in fact tables of InfoCube", and I got an error saying that 1380 units are missing from the fact table.
<b>Diagnosis
In the fact table /BIC/FEU_FRCTS, records have been found that contain values other than zero for key figures that have units, but no value for the unit of the key figure. Since the value of the unit has to correspond to the value of the key figure, this indicates an error when the data was loaded. The values of the units have not been loaded into BW correctly. Choose Details to display the incorrect records.</b>
Does anyone know what this error means? How do I solve this problem?
Thanks,
Nyrvole

Hi Nyrvole,
As the message says, you have key figures with a unit but the unit value is not filled. Click 'Details' as suggested to check which key figure(s) are involved, then go to RSD2, enter the key figure and see which unit InfoObject is used; then check in the transfer/update rules how this unit InfoObject is mapped, correct the values, and upload again.
There is a 'repair' option in RSRV, but I don't think it can fix this error; still, give it a try.
hope this helps. -
Impact of Delta Records on Key Figure Summation in DSO
Hi experts,
I have a key figure with aggregation type "summation" in a DSO. I would like to know the impact of delta records on the key figure.
E.g.
source DSO
doc_id (key) | doc_pos | type | amount
4711 | 1 | A | 100 USD
4711 | 2 | B | 20 USD
target DSO
doc_id (key) | amount
4711 | 120 USD
If the first record is modified ("type" from A to C) as follows and delta-loaded to target DSO:
4711 | 1 | C | 100 USD
This will lead to an incorrect amount:
target DSO
doc_id (key) | amount
4711 | 220 USD
How can I handle this situation?
Thanks in advance.
Regards,
Meng

Hi,
I believe one document number and document item will have only one type; e.g. 4711/1 should have only one type (A, B or C).
If that assumption is true, just remove doc type from the key fields of the source DSO.
Then the change log table can handle the delta from source to target.
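The effect can be sketched in Python (an illustration of the change-log principle, not SAP code): a delta carrying a before-image keeps a summation key figure correct, while a delta without one double-counts.

```python
def apply_delta(target, deltas):
    """Add delta amounts into target, keyed by doc_id."""
    for doc_id, amount in deltas:
        target[doc_id] = target.get(doc_id, 0) + amount
    return target

# Initial load: doc 4711 has two items, 100 + 20
target = apply_delta({}, [("4711", 100), ("4711", 20)])
assert target["4711"] == 120

# Item 1 changes type A -> C. The change log emits a before-image
# (-100) plus the after-image (+100), so the summed amount stays right:
apply_delta(target, [("4711", -100), ("4711", 100)])
assert target["4711"] == 120

# Without the before-image, the new image alone inflates the total:
wrong = apply_delta(dict(target), [("4711", 100)])
assert wrong["4711"] == 220
```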
Regards
Anindya -
Key Figure calculation in Abap is not working correctly - Overlooping
Hi,
I wrote logic to calculate the ratio of a key figure, but it is not working correctly.
For example, I have a requirement to split one product into several new products, and the net amount must be split across these new products as well. The total amount of the new products should equal the net amount.
So far my logic splits the product into several new products, but the amounts are incorrect because the calculation compounds across iterations.
Sample
A PRODUCT has Net Amount 1000. And this product needs to be splitted into 3 new products. Each of this new product is assigned a ratio of 0.3, 0.2 and 0.7 respectively. total sum of the ratio is 1.
PRODUCT1 0.3 = 1000 * 0.3 = 300
PRODUCT2 0.2 = 1000 * 0.2 = 200
PRODUCT3 0.7 = 1000 * 0.7 = 700
The total amount of this new products is 1000.
Now my logic is working this way.
PRODUCT1 0.3 = 1000 * 0.3 = 300
PRODUCT2 0.2 = 1000 * 0.2 * 0.3 = 60
PRODUCT3 0.7 = 1000 * 0.2 * 0.3 * 0.7 = 42
Only PRODUCT1 is calculated correctly; for the remaining products the ratios are multiplied cumulatively.
Logic used
DATA: t_data TYPE data_package_structure OCCURS 0 WITH HEADER LINE.
DATA: t_newdso LIKE /bic/newdso OCCURS 0 WITH HEADER LINE.
DATA: t_olddso LIKE /bic/olddso OCCURS 0 WITH HEADER LINE.
DATA: amount LIKE data_package-netamount.
DATA: zidx LIKE sy-tabix.
REFRESH t_data.
LOOP AT data_package.
zidx = sy-tabix.
MOVE-CORRESPONDING data_package TO t_data.
REFRESH t_newdso.
SELECT * FROM newdso INTO TABLE t_newdso WHERE prod =
data_package-prod.
SORT t_newdso BY prod.
*LOOP AT T_NEWDSO.
READ TABLE t_newdso WITH KEY prodh4 = t_data-prod.
IF sy-subrc EQ 0.
LOOP AT t_newdso.
t_data-prod = t_newdso-/bic/znew_mp.
t_data-material = t_newdso-material.
T_DATA-NETAMOUNT = T_DATA-NETAMOUNT * T_NEWDSO-/BIC/ZSP_RATIO.
APPEND t_data.
ENDLOOP.
ELSE.
REFRESH t_olddso.
SELECT * FROM olddso INTO TABLE t_olddso WHERE prod =
data_package-prod.
SORT t_olddso BY prod.
READ TABLE t_olddso WITH KEY prodh4 = t_data-prod.
t_data-prod = t_olddso-prod.
t_data-material = t_olddso-material.
APPEND t_data.
ENDIF.
MODIFY data_package INDEX zidx.
ENDLOOP.
REFRESH data_package.
data_package[] = t_data[].
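The compounding comes from multiplying the already-multiplied T_DATA-NETAMOUNT inside the inner loop. A minimal Python sketch of both the buggy and the corrected calculation (amounts and ratios taken from the example above):

```python
def split_correct(amount, ratios):
    """Each share is taken from the ORIGINAL amount."""
    return [amount * r for r in ratios]

def split_buggy(amount, ratios):
    """Reuses the mutated running value, so the ratios compound."""
    shares = []
    for r in ratios:
        amount = amount * r   # bug: amount shrinks every iteration
        shares.append(amount)
    return shares

assert split_correct(1000, [0.3, 0.2, 0.7]) == [300.0, 200.0, 700.0]
assert split_buggy(1000, [0.3, 0.2, 0.7]) == [300.0, 60.0, 42.0]
```

In the ABAP, the corresponding fix is to derive each share from the unchanged source record, e.g. T_DATA-NETAMOUNT = DATA_PACKAGE-NETAMOUNT * T_NEWDSO-/BIC/ZSP_RATIO.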
thanks
Edited by: Matt on Sep 27, 2010 2:25 PM - added tags

Hi,
I am not really good at debugging ABAP code since I am a newbie. However, I have tried to add CLEAR T_DATA before the first loop.
REFRESH T_DATA.
LOOP AT DATA_PACKAGE.
ZIDX = SY-TABIX.
MOVE-CORRESPONDING DATA_PACKAGE TO T_DATA.
and before the second loop and select statement and at the end of the loop.
REFRESH T_NEWDSO.
SELECT * FROM NEWDSO INTO table T_NEWDSO WHERE PROD =
DATA_PACKAGE-PROD.
SORT T_NEWDSO BY PROD.
READ TABLE T_NEWDSO WITH KEY PROD = T_DATA-PROD.
IF sy-subrc EQ 0.
LOOP AT T_NEWDSO.
but then not all data are being fetched.
thanks
Edited by: Bhat Vaidya on Sep 28, 2010 8:33 AM -
APO-DP - Change in Calculation Type of Key Figure in Production Environment
Hi Team,
I wish to change the calculation type from "P" to "S" and also to enter a disaggregation key figure.
Regards,
Tarun Jha

Hi,
The TR will not move the changes as you are expecting here.
For any change to a planning area below release 5.0, you need to de-initialize the planning area:
1. Take the latest backup.
2. De-initialize the planning area.
3. Make the required changes to the planning area.
4. Initialize the planning area and run consistency checks.
5. Load the data back into the planning area at all CVC levels.
All the steps need to be performed in the Dev, Quality and Production servers.
Plan the activity for a weekend so that users will not be disturbed.
All the best
ARUN R Y -
BEx Query - Stop Aggregation of Key Figure upon Drill
Hi,
I have a situation wherein I am joining header and line item details using a BW InfoSet (header: DSO1, line item: DSO2). A query is built off this InfoSet, providing a single view of both the header and line item information. This works great and users love it. Note that this information comes from a 3rd-party application sitting in ECC and is data before posting into SAP.
Using BEx Analyzer, we have header and line item fields in the rows, along with the line item amount and the header amount. If I remove the line item characteristics from the rows, the amounts get aggregated (summed up). This is fine for the line item amount (summing all the different lines), but not for the header amount, whose repeated values sum into an incorrect header amount.
Ex: Document Number 123 has 3 distribution lines. Total Header Amount is $100.
Doc Number (DSO1) | Line Item Number (DSO2) | Line Item Amount (DSO2) | Header Amount (DSO1)
123               | 1                       | $20                     | $100
123               | 2                       | $20                     | $100
123               | 3                       | $50                     | $100
This view looks good. Say, we remove Line Item Number from the rows, we now get:
Doc Number | Line Item Amount | Header Amount
123 $90 $300
The Header Amount is incorrect as it should be $100. My initial plan was to use Line Item Amount as it would aggregate up and provide the detail at the line item level and sum up to the header level.
As this information is before posting into SAP, none of the SAP validations are called for this data, and the sum of the line item amounts will not always tie to the header amount (there could be tax and freight amounts at the header level in different key figures). As seen in my example above, the line item amounts sum to $90 whereas the header amount is $100.
I guess the simplest solution would be to set the Header Amount on this report to not change with any navigation/filtering etc. As in the example, the header amount remains $100. Any way this can be achieved?
Any ideas/suggestions are welcome. Thanks in advance.
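The inflation can be sketched as plain arithmetic (amounts from the example above); dividing the summed header amount by the line count is one possible workaround, shown here only as arithmetic, not as a BEx definition:

```python
# Doc 123: the header amount of $100 is repeated on each of 3 line items
lines = [
    {"doc": "123", "line": 1, "line_amt": 20, "header_amt": 100},
    {"doc": "123", "line": 2, "line_amt": 20, "header_amt": 100},
    {"doc": "123", "line": 3, "line_amt": 50, "header_amt": 100},
]

line_total = sum(r["line_amt"] for r in lines)       # 90: fine to sum
header_summed = sum(r["header_amt"] for r in lines)  # 300: inflated
header_fixed = header_summed / len(lines)            # 100 again

assert (line_total, header_summed, header_fixed) == (90, 300, 100.0)
```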
-Vivek

Hi Yasemin,
Thank you for the prompt reply. I like your idea of counting line item numbers and dividing the header amounts by the count. For my situation though, Akshay and Anshu's answer worked best, and I like the fact that I can leverage BEx functionality out of the box.
Akshay & Anshu,
Thanks for the prompt reply. Your suggested solution worked great.
Many Thanks,
Vivek -
BEx Query key figure sum different at monthly or at docu number level
I have a query designed in Query Designer. The report is summarized at the monthly level, but the document number is available for drilling down. What's strange is that the sum of the key figures differs between running at the monthly level and drilling down to the document number level. Can someone share insight on how to solve this problem?
Thanks,
Sharon

Hi Sharon,
Which key figures give you an incorrect picture or discrepancies between the two displays?
Check the properties of these key figures. Are any of the problematic key figures calculated as 'Formula'? If yes, what is 'Calculate Result As' for this key figure? Is it TOTAL?
Cheers
Umesh