Regarding Key figures

Hi All,
I will be taking data from a file stored on the application server (transaction AL11).
Suppose the amount is 23.4578 but the key figure is defined with only 2 decimal places (0.00).
How will this value arrive in the data target: as 23.45, as 23.46, or will it give an error?
Thanks for your reply.

I think it will give 23.46, and it will definitely not give any error; the load will complete successfully.
If you use data type FLTP for the key figure, the complete value 23.4578 can be stored.
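A quick way to convince yourself of the rounding behaviour is a small ABAP test (illustrative sketch only; the variable names are made up, and the packed field simply stands in for an amount key figure with 2 decimals):

REPORT zkf_rounding_check.
* Assigning a 4-decimal value to a 2-decimal packed field rounds commercially.
DATA: lv_raw    TYPE p LENGTH 8 DECIMALS 4 VALUE '23.4578',
      lv_amount TYPE p LENGTH 8 DECIMALS 2,
      lv_float  TYPE f.

lv_amount = lv_raw.   " becomes 23.46 - rounded, no error raised
lv_float  = lv_raw.   " FLTP keeps the full precision 23.4578

WRITE: / lv_amount, / lv_float.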

Similar Messages

  • Regarding currency translation for a display attribute (key figure)

    Hi Gurus,
    How can I do currency conversion for MATERIAL COST (a key figure), which is a display attribute of MATERIAL?
    Thanks in advance.
    CHANDRA

    Hi,
    You can create the currency translation type in transaction RSCURR.
    Here are the steps:
    1.     The transaction for creating a currency translation type is RSCURR.
    2.     Choose the exchange rate, source currency, target currency and the time reference in the respective tabs.
    3.     In the query definition, choose the currency conversion key in the properties of the key figure.
    Regards,
    Yogesh.

  • Query regarding characteristics and key figures

    Hi All,
    Can anybody please tell me in what scenarios a key figure can be used as an attribute of a characteristic?
    In what scenarios is each type of DSO used?

    Hi,
    Can anybody please tell me in what scenarios a key figure can be used as an attribute of a characteristic?
    If you use the key figure at the transactional data level, it gives you a fact.
    If you use the key figure at the master data level, it gives you the current truth.
    For example, you have 0MATERIAL, and the material price is an attribute of 0MATERIAL.
    Read the link below:
    http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/content.htm
    In what scenarios is each type of DSO used?
    We have 3 DSO types in BI 7:
    1. Standard DSO: contains 3 tables (new, active and change log). If you want overwrite functionality, go for this DSO.
    2. Direct Update DSO: contains only the active table. If you want to write data using an APD or a BAPI, go for this DSO.
    3. Write-Optimized DSO: contains only the active table. Used for mass loads and for staging purposes.
    Hope it helps you.
    Regards,
    Venkat.

  • Regarding classifying values in a key figure

    Hi All,
    I need help in defining one query. In my query there is a key figure (Quantity) with positive and negative values. I need another key figure (key figure 2) to display all the negative values for the condition price list type = 02.
    That is, "IF PRICE LIST TYPE = 02 and the quantity is negative", then display it in key figure 2.
    How can I achieve this?
    Regards
    Jay Y

    Hi,
    Create a Calculated key figure with the following formula:
    ( 'Keyfigure' < 0) * 'Keyfigure'
    The expression ( 'Keyfigure' < 0 ) returns 1 whenever the condition is true and 0 otherwise, so only negative values survive the multiplication.
    Use this CKF in the query along with the restriction on Price List type.
    (You can create a restricted key figure using the above CKF & restricting it on Price List type)
    Hope this helps.
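    If you want to sanity-check the logic of that formula outside BEx, a tiny ABAP-style sketch behaves the same way (illustrative only; lv_qty simply stands in for 'Keyfigure'):

    REPORT zkf_negative_check.
    DATA: lv_qty  TYPE p LENGTH 8 DECIMALS 2,
          lv_flag TYPE i,
          lv_out  TYPE p LENGTH 8 DECIMALS 2.

    lv_qty = '-7.50'.            " try a positive value such as '3.00' as well
    IF lv_qty < 0.
      lv_flag = 1.
    ELSE.
      lv_flag = 0.
    ENDIF.
    lv_out = lv_flag * lv_qty.   " -7.50 for negative input, 0.00 otherwise
    WRITE: / lv_out.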

  • Regarding Non-Cumulated Key figures

    Dear All,
    I wanted to create a MultiProvider which includes an InfoCube having non-cumulative key figures. When I include that non-cumulative key figure in the MultiProvider, I get the error "Inconsistencies were found when checking MultiProvider ZMULTIPTP".
    The detail of this error is:
    Define the characteristics of the validity table for non-cumulatives
    Message no. R7846
    Diagnosis
    The InfoCube contains non-cumulative values. A validity table is created for these non-cumulative values, in which the time interval is stored, for which the non-cumulative values are valid.
    The validity table automatically contains the "most detailed" of the selected time characteristics (if such a one does not exist, it must be defined by the user, meaning transferred into the InfoCube).
    Besides the most detailed time characteristic, additional characteristics for the InfoCube can be included in the validity table:
    Examples of such characteristics are:
    A "plan/actual" indicator, if differing time references exist for plan and actual values (actual values for the current fiscal year, plan values already for the next fiscal year),
    the characteristic "plant", if the non-cumulative values are reported per plant, and it can occur that a plant already exists for plant A for a particular time period, but not yet for plant B.
    Procedure
    Define all additional characteristics that should be contained in the validity table by selecting the characteristics.
    In the example above, the characteristics "plan/actual" and  "plant" must be selected.
    The system automatically generates the validity table according to the definition made. This table is automatically updated when data is loaded.

    Hello,
    See this: "Validity table for non-cumulatives in a self-defined cube".
    Thanks
    Chandran

  • Change order of key figure is not working in Web report output (using WAD)

    Hi,
    We are using BI 7.0, release 701, level 008. We are facing a problem in WAD (web report output): whenever we use 'Select filter' for key figures and change the order of the key figures, the new order is not reflected, although drag and drop of key figures does work.
    The same change of order works fine in BEx Analyzer.
    Kindly suggest some inputs.
    Thanks.

    Hi,
    Which support package stack are you on? WAD has this feature of rearranging the key figures in BI 7.0 as of SPS 14...
    Regards,
    Bhagyarekha.

  • How can I display all the columns (key figures) on the same page of the report

    Hi All,
    I am currently working on BI 7.0 reporting. In the BEx Query Designer report I have 13 columns (7 key figures and 6 formulas). It currently displays 10 columns on the first page of the report; the other 3 columns can only be seen by clicking to the next page. But the requirement is to see all columns on the SAME PAGE.
    Are there any settings, in the Query Designer or elsewhere, which would enable me to see all the columns on the same page? The users are not interested in scrolling to the next page for 3 columns.
    A quick response would be appreciated.
    Regards
    Ramana

    In Web Application Designer, in the properties of your Analysis web item under the Paging category, change the parameter 'Number of Data Columns Displayed at Once' (BLOCK_COLUMNS_SIZE) to 0. This will make all columns show at once instead of paginating as it does now.

  • Creation of new key figure at report level depending on the month range

    Dear All,
    I have a requirement where the user wants to see dollar values for the Amount key figure. Data is stored in the cube in INR.
    The exchange rate keeps changing from quarter to quarter; for the four quarters we have four different exchange rates.
    The system has to identify the applicable exchange rate and show a new column with the USD value right beside the INR column.
    How best can we achieve this?
    Thanks & Rgds,
    Anup

    Hi
    You can use currency translation at query level to achieve this.
    https://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/600157ec-44e5-2b10-abb0-dc9d06ba6c2f&overridelayout=true
    Translation in Query
    Regards
    Ravi

  • What is the difference between key figures and characteristics?

    Hi all,
    I am a newbie to SAP BI... I am doing a Masters in ERP (SAP)... I am confused about the basic difference between key figures and characteristics.
    Why do we have to define attributes for characteristics?
    If anyone can refer me to a thread that answers my questions, or explain with a very lucid example, I will highly appreciate it... Please answer this question, as I have only spent 4-5 months studying BI and have not really implemented it in industry, though we do have access to an SAP BI system where we work through tutorials...
    Also, please let me know of links to materials where the basics of BI are explained in a very simple way...
    Thanks,
    Regards,
    Rahul

    A KF is anything which can be quantitatively measured: an amount, a number, a value, a price... whatever you want to measure is called a key figure. Key figures are also called KPIs (Key Performance Indicators). The number of sales orders, the number of deliveries, the amount delivered... all of these are key figures.
    And to add to that, key figures are divided into two groups:
    Cumulative KF: like I said, amount, value, price.
    Non-cumulative KF: a different kind of key figure which should not be summed up, for example age, inventory, or number of employees. When you report on these key figures you do not add up everybody's age, because that would not make any sense, whereas adding up all deliveries or all sales does make sense. That is the difference between the two kinds of key figures.
    Characteristics are anything which cannot be quantitatively measured: material, customer, plant, employee ID and so on...
    In BI you always analyse the key figures against the characteristics.
    thanks.
    Wond

  • Key Figures getting added

    Hi All,
    We are in the process of migrating 3.x transfer rules and update rules to NW2004s. We are following the standard procedure of migrating the objects, so that transfer rules are migrated to transformations, and likewise for update rules.
    In one of the objects, when running the loads after migration, the key figures are getting doubled. We were able to debug the standard code and figure out where the addition happens. The strange thing is that this is standard SAP code generated in all the migrated objects, but in this particular case it has generated some extra code which adds up the key figures. I am pasting the code below for your reference and would appreciate it if someone could help us resolve this.
    As mentioned above, the READ statement below is standard in all the TRCS transformations migrated from update rules. This READ statement fails with sy-subrc = 4 because <_yth_TG_1> is empty at this stage, but later the statement INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1> populates <_yth_TG_1> and in turn makes the next READ statement successful. That second READ statement is the extra one in this case and is the one adding up the key figures.
    * =====  PUT groups to target
          IF skipseg_all IS INITIAL.
            READ TABLE <_yth_TG_1>
              WITH TABLE KEY
    * [key specification omitted by the poster: it reads all the different characteristics and key figures; removed here to shorten the message]
               ASSIGNING <_ys_TG_1>.
            G1_subrc = sy-subrc.
            G1_tabix = sy-tabix.
            IF G1_subrc = 0.
    *   aggregation SUM
              <_ys_TG_1>-POS_FRE = <_ys_TG_1>-POS_FRE + G1-POS_FRE.
              <_ys_TG_1>-POS_OCC = <_ys_TG_1>-POS_OCC + G1-POS_OCC.
              <_ys_TG_1>-POS_VAC = <_ys_TG_1>-POS_VAC + G1-POS_VAC.
              <_ys_TG_1>-/BIC/ZKMOSALRY = <_ys_TG_1>-/BIC/ZKMOSALRY + G1-/BIC/ZKMOSALRY.
              <_ys_TG_1>-/BIC/ZFILFTEFX = <_ys_TG_1>-/BIC/ZFILFTEFX + G1-/BIC/ZFILFTEFX.
              <_ys_TG_1>-/BIC/ZWRKFTEPX = <_ys_TG_1>-/BIC/ZWRKFTEPX + G1-/BIC/ZWRKFTEPX.
              ls_cross-insegid      = 1.
              ls_cross-inrecord     = l_recno_SC_1.
              ls_cross-outsegid     = 1.
              ls_cross-outrecord    = <_ys_TG_1>-record.
              CALL METHOD i_r_log->add_cross_tab
                EXPORTING
                  I_S_CROSSTAB = ls_cross.
            ELSE.
              ASSIGN rdsTG_1->*          to <_ys_TG_1>.
              CLEAR <_ys_TG_1>.
              MOVE-CORRESPONDING G1 TO <_ys_TG_1>.
              <_ys_TG_1>-requid    = l_requid.
              l_recno_TG_1          = l_recno_TG_1 + 1.
              ls_cross-insegid      = 1.
              ls_cross-inrecord     = l_recno_SC_1.
              ls_cross-outsegid     = 1.
              ls_cross-outrecord    = l_recno_TG_1.
              CALL METHOD i_r_log->add_cross_tab
                EXPORTING
                  I_S_CROSSTAB = ls_cross.
    *   Record# in target = sy-tabix - if sorting of table won't be changed
              <_ys_TG_1>-record     = l_recno_TG_1.
              INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1>.
              IF sy-subrc <> 0.
                CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack.
              ENDIF.
            ENDIF.      "Read table
    * This READ statement is the extra one in this code: when it executes, sy-subrc becomes zero and the key figures are added up a second time.
            READ TABLE <_yth_TG_1>
              WITH TABLE KEY
    * [key specification omitted by the poster: this also contains the characteristics and key figures which are being updated as part of the transformation]
               ASSIGNING <_ys_TG_1>.
            G2_subrc = sy-subrc.
            G2_tabix = sy-tabix.
            IF G2_subrc = 0.
    *   aggregation SUM
              <_ys_TG_1>-POS_OCC = <_ys_TG_1>-POS_OCC + G2-POS_OCC.
              <_ys_TG_1>-POS_VAC = <_ys_TG_1>-POS_VAC + G2-POS_VAC.
              <_ys_TG_1>-POS_COUNT = <_ys_TG_1>-POS_COUNT + G2-POS_COUNT.
    *   aggregation MAX
              IF G2-POS_CTSPAN > <_ys_TG_1>-POS_CTSPAN.
                <_ys_TG_1>-POS_CTSPAN = G2-POS_CTSPAN.
              ENDIF.
              ls_cross-insegid      = 1.
              ls_cross-inrecord     = l_recno_SC_1.
              ls_cross-outsegid     = 1.
              ls_cross-outrecord    = <_ys_TG_1>-record.
              CALL METHOD i_r_log->add_cross_tab
                EXPORTING
                  I_S_CROSSTAB = ls_cross.
            ELSE.
              ASSIGN rdsTG_1->*          to <_ys_TG_1>.
              CLEAR <_ys_TG_1>.
              <_ys_TG_1>-CURRENCY = G1-CURRENCY.
              MOVE-CORRESPONDING G2 TO <_ys_TG_1>.
              <_ys_TG_1>-requid    = l_requid.
              l_recno_TG_1          = l_recno_TG_1 + 1.
              ls_cross-insegid      = 1.
              ls_cross-inrecord     = l_recno_SC_1.
              ls_cross-outsegid     = 1.
              ls_cross-outrecord    = l_recno_TG_1.
              CALL METHOD i_r_log->add_cross_tab
                EXPORTING
                  I_S_CROSSTAB = ls_cross.
    *   Record# in target = sy-tabix - if sorting of table won't be changed
              <_ys_TG_1>-record     = l_recno_TG_1.
              INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1>.
              IF sy-subrc <> 0.
                CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack.
              ENDIF.
            ENDIF.      "Read table
    Thanks for looking into this; I would appreciate your response.
    Regards
    Raman

    Have you tried the same thing by deleting the extra 'READ' statement?

  • Key figure issue

    I am loading some data from a flat file to a cube. The file does not supply the base unit of measure and the currency as fields, but I do have them in my InfoSource; because of that, when I run the query I see the symbol "ERR" next to the key figures.
    So I thought of writing a routine for the individual key figures in the update rules; the routine is:
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  -*
    * TABLES: ...
    * DATA:   ...
    DATA: IN    TYPE F,
          OUT   TYPE F,
          DENOM TYPE F,
          NUMER TYPE F.
    DATA: DEB_CRED(2) TYPE C VALUE 'HK'.
    DATA: QUOT(1) TYPE C VALUE 'B'.
    CONSTANTS: c_msgty_e VALUE 'E'.
    *$*$ end of global - insert your declaration only before this line   -*
    FORM compute_data_field
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
      USING    COMM_STRUCTURE LIKE /BIC/CSZSD_SALHIST_HDR
               RECORD_NO LIKE SY-TABIX
               RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING RESULT LIKE /BIC/VZSD_C03T-SUBTOT_2S
               UNIT LIKE /BIC/VZSD_C03T-STAT_CURR
               RETURNCODE LIKE SY-SUBRC
               ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        -*
    * fill the internal table "MONITOR", to make monitor entries
    * result value of the routine
      RESULT = COMM_STRUCTURE-SUBTOT_2S.
    * result value of the unit
      UNIT = 'USD'.
    * if the returncode is not equal zero, the result will not be updated
      RETURNCODE = 0.
    * if abort is not equal zero, the update process will be canceled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         -*
    ENDFORM.
    But now when I am loading the data it is taking very long. I want to move this code to the start routine. I have the same kind of code for the other key figures as well.
    I know I can get rid of the error by using the NODIM function in BEx, but I do not want the users to have to do that.
    Thanks

    Hi Jimmy,
    In the transfer rules you may simply assign the constant 'USD' to your currency field. You can assign a constant to the unit too (if you know it), or assign a routine if you know the logic of the unit determination.
    Best regards,
    Eugene
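    If you do prefer to move the logic into the start routine of the update rules instead, a minimal sketch could look like the following (illustrative only; STAT_CURR and BASE_UOM are assumed field names - replace them with the currency and unit fields of your own communication structure):

    * 3.x start routine body: set the constant currency/unit once per data package
    LOOP AT DATA_PACKAGE.
      DATA_PACKAGE-STAT_CURR = 'USD'.   " constant currency for every record
    *  DATA_PACKAGE-BASE_UOM = 'EA'.    " constant unit, if one is known
      MODIFY DATA_PACKAGE.
    ENDLOOP.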

  • Aggregates on non-cumulative InfoCubes (stock key figures)

    Hi..Guru's
    Please let me know if anybody has created aggregates on non-cumulative cubes or key figures (i.e. 0IC_C03, Inventory Management).
    I am facing a performance problem at query execution time on 0IC_C03 (runtime dump).
    I have tried a lot to create an aggregate, using the proposal from the query and other options, but the query is not using that aggregate.
    Can somebody tell me about any sample aggregates which they are using on 0IC_C03?
    Or about any tool to get better performance when executing queries on the said cube?
    One more clarification required: what is "move the marker pointer" for stock calculation? I have compressed only the two initial data load requests. Should I compress all requests in the cube regularly?
    If so, is there an option to compress requests automatically after a successful load into the data target?
    We are using all three DataSources 2LIS_03_BX, BF & UM for the same.
    Regards,
    Navin

    Hi,
    Compression definitely has a larger effect on query execution time for inventory cubes than for other (cumulated) cubes.
    So do compression regularly, once you feel that deletion by request is no longer needed.
    If the query does not need the calendar day characteristic and needs only the month characteristic, use a snapshot InfoCube (the procedure is given in the corresponding How-to paper) and divert the month-wise queries (and those with higher granularity on the time characteristic, like quarter and year) to this cube.
    The percentage improvement in query execution time from aggregates is smaller for non-cumulated cubes than for normal (cumulated) cubes, but there is still an improvement in using aggregates.
    With rgds,
    Anil Kumar Sharma .P

  • Problem in Calculated Key Figure in BEx 7.0

    Hi All,
    In Bex 7.0
    We have a requirement to display a key figure which holds the unit price of an article. In our InfoProvider we have article, store and date, and the report is based on article. As we have to display only the latest price, we created an RKF by putting a restriction of the latest date in the date range on the unit price key figure. But the unit price still gets aggregated over stores.
    For example, if there are 10 stores and the unit price of an article is 5, the data displayed is 50 and not 5.
    We tried to create a calculated key figure using the above RKF with the exception aggregation "Average of all detailed values that are not zero, null or error" on the reference characteristic 0PLANT. It works fine when the query is restricted to one or two articles, but for all articles it gives the following error messages.
    1. Error while reading data; navigation is possible.
    2. Error occurred during parallel processing of query 17; RC3
    3. Error executing physical plan: AttributeEngine: no 6952
    4. Error reading the data of infoprovider ZXMSC01$X
    5. Amount of data that is to be read from the BIA server is too large
    6. The following error 6952 occurred in BIA server
    7. Error executing physical plan: AttributeEngine: not enough memory;BwPopMerge pop1(setIsLastMerge(1)),in executor::Executor in cube: p03_zxmsc01
    Can anyone tell me the solution for the above errors, or whether there is a workaround for our requirement?
    Regards,
    Sachin

    Hi,
    This could be a memory issue.
    One of your cubes is on BIA, which is causing the problem.
    So execute the query in RSRT, selecting the "Do not use BIA index" option.
    If the query then executes without any issue, take the cube out of BIA and push it back in (rebuild the BIA index).
    Hope it helps.
    Regards,
    Raghu
    P.S.: Also, please let us know whether this query is live and this is an issue you are facing after it ran properly for a few days,
    or whether this is a new query that you created and are testing now, and you got this error in the process.

  • IF condition in a Calculated Key Figure

    Hi,
    Can anyone tell me the syntax for using an IF statement in a calculated key figure?

    Hi,
    If amt1 ne amt2.
      select amt1.
    endif.
    Try the following for the above condition:
    Create a formula (calculated key figure).
    1) Select 'amt1'.
    2) Go to the Boolean operators and select 'is not equal to'.
    3) Select 'amt2'.
    4) Select '*'.
    5) Select 'amt1'.
    It looks like this:
    ( 'amt1' <> 'amt2' ) * 'amt1'
    For example, with amt1 = 10 and amt2 = 7 the comparison returns 1 and the result is 10; when amt1 = amt2 the comparison returns 0 and the result is 0.
    hope this helps..
    regards,
    raju

  • Error in key figure value at Reporting level

    Hi Experts,
    I have a problem with a key figure value which displays wrong data in reporting.
    The process is:
    I added the key figure SALK3 to the generic DataSource (xxx). The extractor is a table/view; the table is MBEW. I replicated it to the BW side.
    In BW the flow is: DataSource (generic) to 0MAT_PLANT (the InfoObject; SALK3 added on the Attributes tab of the InfoObject), which then loads to a MultiProvider. I can see data in the MultiProvider for that particular field after loading.
    Now the problem is: in Query Designer I added the key figure to the columns and ran the report. In the output of that field, if the value is 100 it displays "100 ERROR"; if it is 200 it displays "200 ERROR". Could you please help me to find where I went wrong?
    Regards,
    Kishan.

    Hello,
    Please check the following steps:
    Right-click on the key figure you created and select Edit.
    Once the screen opens, wrap the key figure in NODIM from the Data Functions, i.e. NODIM('your key figure'). NODIM suppresses the unit/currency, so the mixed-unit "ERROR" indicator disappears.
    Finally, click OK.
    Regards
    NS
