Issue with Decimal Values in BEx Reporting

Hello gurus,
I have extracted data from 2LIS_13_VDITM. In the ODS, when I look at the Active data, the column Billing Qty Sls Units shows "13,190" and the Sales Unit is "KM".
ITEM   SALES UNIT   Billing Qty: Sales Unit
10     KM           13,190
20     KM            0,500
The R/3 table VBRK also stores the values in this format.
But in reporting the data is rounded off to 13,000 and 1,000.
How can I get the actual values in reporting as well?
Please advise. Thanks in advance.

Hi Eugene - scaling it to 1 did not help.
Hi Roberto - I had a look at the entries in T006 for KM (kilometers) and M (meters). This is what I got:
[Int. meas. unit] [Decimal pl. rnd.] [Decimal places]
KM            |       0       |            0
M             |       2       |            0
And in SPRO, neither in the source system nor in the BW system is there any mention of KM or meters.
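
For reference, the per-unit rounding behaviour comes from T006 (maintained via transaction CUNI), and the relevant settings can be checked directly; a minimal sketch, assuming a standard ECC/BW system:

REPORT zcheck_unit_decimals.

* Read the decimal settings for a unit of measure from T006.
* ANDEC = number of decimal places to which values are rounded,
* DECAN = number of decimal places used for display.
DATA: ls_t006 TYPE t006.

SELECT SINGLE * FROM t006 INTO ls_t006
  WHERE msehi = 'KM'.

IF sy-subrc = 0.
  WRITE: / 'Unit:', ls_t006-msehi,
         / 'Rounding decimals (ANDEC):', ls_t006-andec,
         / 'Display decimals (DECAN):', ls_t006-decan.
ENDIF.

With ANDEC = 0 for KM, as in the T006 entries above, quantities in KM are rounded to whole numbers wherever that rounding is applied, which would explain the 13,190 -> 13,000 behaviour described here.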

Similar Messages

  • Getting wrong decimal values in a BEx report

    Hi,
    I am getting a problem with decimal values in a BEx report.
    In the ODS I can see the value for the key figure (Total PO Release Val) = 140.692,00; the same value in the BEx report looks like $ 140,692.00.
    A calculation is done with these values:
    % Used = 'Total PO Release Val' / 'Target Value' * 100
    As per the ODS value the calculation is correct: 140.692/1000 * 100 = 14.692.
    As per BEx it comes out as: 140,692.00/1000 * 100 = 142.692.
    In the BEx report the percentage calculation is not considering the decimals; the value is taken as 140,692.00 rather than 140.692. I mean to say that the decimal values are ignored during the calculation in the BEx report. Can anyone please advise me how to come out of this problem? Appreciate your help.
    Regards
    Ramesh

    "(Total PO Release Val) = 140.692,00; the same value in the BEx report looks like $ 140,692.00."
    How do you check the ODS value? The amount will only have two decimal places, so you cannot have a value like 140.692,00. This seems to be an issue with the user's decimal notation (where the decimal is represented by a comma and the thousands separator by '.'). Go to the user profile, change this setting (on the Defaults tab) to the correct one (i.e. decimal represented by '.'), log in again and check. The value you see in the ODS is one hundred forty thousand six hundred ninety-two, the same as you get in BEx; only the representation is different.
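
    The notation a user sees is controlled by the DCPFM field in the user master (the same setting as on the Defaults tab of transaction SU3); as a small illustrative sketch, it can be read like this:

    REPORT zcheck_decimal_notation.

    * DCPFM controls the user's decimal notation:
    *   ' ' = 1.234.567,89  (comma as decimal separator)
    *   'X' = 1,234,567.89  (period as decimal separator)
    *   'Y' = 1 234 567,89  (space as thousands separator)
    DATA: lv_dcpfm TYPE usr01-dcpfm.

    SELECT SINGLE dcpfm FROM usr01 INTO lv_dcpfm
      WHERE bname = sy-uname.

    WRITE: / 'Decimal format setting (DCPFM):', lv_dcpfm.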

  • Issue with Variable Personalisation in Bex Report

    Hi Gurus,
    I'm facing a peculiar problem with variable personalisation in a report.
    I executed a BEx report, filtered the data on some characteristics, and used the SAVE AS button on the portal to save the result set. When I try to use the variants again, say after 2 or 3 days, I don't get updated data for those variants; rather, they reflect the old data. Furthermore, when I execute the report and filter on the same criteria from scratch, I am able to see the updated result set.
    Is there any setting to be enabled when we personalise variants, or are variants dependent on the data or the date on which they were created? I'm not sure why the reports behave this way. Or do I need to create the variants time and again?
    Any pointers would be of great help.
    Regard,
    Yaseen

    Hello,
    In the BEx report, when you get the variable selection screen, enter your selection; at the bottom you have the option of saving it as a variant.
    In future, when you run the report, simply use this variant and execute. It should work fine.
    Also remember that date selections can be tricky: if you put today's date in the variant, it will use that same date in subsequent runs. So check that too.
    Regards,
    Shashank

  • Issue with variable values while adding report to Bookmarks/Favorites

    Hello All,
    I am able to save reports from my portal as bookmarks with the appropriate navigation state and variable values for reports that do not have a mandatory variable without a default value. For all other queries, i.e. queries with no mandatory variables or queries containing mandatory variables with default values, this works fine. Any input is appreciated. We are on BI 7.0 with SP12.
    Thanks,
    Danny

    What's your question? From your statement it seems like everything is fine.
    Cheers,
    Andrew

  • Issue with Decimal Notation in BEx

    HI BW Experts !!
    Can any one help on this please.
    Thanks a million in advance.
    InfoCube: Equipment Barometer InfoCube (Technical Name: ZRLC_BAR)
    Key Figure: Month to Date Income (Technical Name: ZRL_MTDIN)
    If I execute the query with Month/Year = 10.2004 and compare it with the data in the InfoCube for the same period, I see the following:
    Query    =  611303.49   (correct amount)
    InfoCube = 6113034.92   (wrong amount, off by a factor of 10)
    The amounts in the query (BEx) are always divided by 10.
    If someone can help me, I will appreciate it.

    Hi Marcos,
    In query edit mode, go to the properties of the key figure (right-click on ZRL_MTDIN) and select 'Nothing Defined' for the scaling factor.
    This should work.
    Regards,
    Vikrant.

  • URGENT: Issue with hierarchy level keys and report drill down

    Hi,
    BASIC STRUCTURE:
    I have created a subject area with 3 facts (FACT_A, FACT_B, FACT_C) and 4 dimensions (TIME_DIM, DIM_2, DIM_3, DIM_4). Each fact table also has additional aggregate tables aggregated along levels of the time dimension. The time dimension likewise has aggregated dimension tables: TIME_DIM_WEEK, TIME_DIM_MONTH, TIME_DIM_QUARTER and TIME_DIM_YEAR.
    GOAL:
    All 3 facts have the same measures M_1 and M_2 in them, but may not have data for the same dimension values selected.
    For example:
    For month JAN 2000, FACT_A.M1 = $100 and no data exists for JAN 2000 in FACT_B and FACT_C. Then in the report,
    for JAN 2000 it should show FACT_A.M1 = $100, FACT_B.M1 = 0 and FACT_C.M1 = 0. In this case I should be able to drill down to the lowest level.
    ISSUE:
    The time dimension TIME_DIM has the following levels: Total -> Year -> Quarter -> Month -> Week -> Day
    However, I am having an issue with drill down in the reports whenever I pull metrics from more than one fact at a time. I have defined the level keys but am not sure if I need to do anything in addition, since I am using aggregates.
    I have to fix this issue quickly. Please help me.

    Alastair,
    All the fact tables have aggregated facts as sources.
    I have checked the levels set for each of the sources to the time dimension table in the BMM and they look okay. The time dim table in the BMM has 4 source tables:
    Time_Day (level set to day, table key is "day"),
    Time_Month (level set to month, table key is "Fiscal_Month_Code"),
    Time_Quarter (level set to quarter, table key is "Fiscal_Quarter_Code") and
    Time_Year (level set to year, table key is "Fiscal_Year_Code").
    Note: no time week aggregate is added as a logical source.
    Again, the time dim hierarchy based off this table has levels: Total -> Year -> Quarter -> Month -> Week -> Day
    The level keys set for each level are:
    Year -> primary key is Year_Name (YYYY) (checked as chronological key); another key is Year_Num (YYYY) (checked as chronological key)
    Quarter -> primary key is Quarter_Name (YYYY Qn); another key is Quarter_Number (format n, where n can assume values 1, 2, 3, 4). Both keys are set as chronological keys.
    Month -> primary key is Month_Name (MON YYYY); another key is Month_Num (format n, where n can assume values from 1 to 12). Both keys are set as chronological keys.
    Week -> primary key is Week_Name (YYYY Wk nn, where nn can have values from 1 to 53); another key is Week_Num (nn, where nn can have values from 1 to 53)
    Day -> primary key set to day (date format)
    Issue 1: When I try to drill to lower levels, it throws an error saying the report cannot find any data because the filters may be too restrictive, even though I see data at the higher level.
    For example: if I drill down to Year: 2010 and Qtr: 2010 Q2 with M1: $100, then when I click on Qtr to drill to the month level, it throws the error.
    Issue 2: When I add year and quarter columns to the report, I see data as below, which is incorrect:
    Year_Name   Qtr_Name   FACT_A.M1   FACT_B.M1
    2009        2009 Q1    $10         $5
    2009        2009 Q2    $20         $80
    2009        2009 Q3    $20
    2009        2009 Q4    $30
    2010        2010 Q1    $100
                2010 Q2    $101
                2010 Q3    $102        $230
                2010 Q4    $103
                2011 Q1    $10
    In the above example, Year_Name is not showing up for 2010 Q2 and after. However, if I change the primary key for the level 'Quarter' so that the key consists of year name and quarter name instead of just quarter name, the issue doesn't occur and drill down works great. The only problem then is that when I drill from quarter, it first shows year name and quarter name instead of showing the next level, which would be month name.
    Sorry about the long message, but I thought you might notice something in how I have set up the keys.
    Thanks

  • Issue in Decimal value round off

    Hi All,
    I am facing an issue wherein a decimal value of 0.01 is getting converted to 0.00. Let me brief you about the source/target types:
    Source: Flat File
    Target: DB2
    PowerCenter Version: 9.6.1 HF2
    Issue: the field in the source file is defined as Number(5,2). The corresponding field in the DB2 target table is defined as Decimal(5,2). The mapping is a simple, straightforward 1:1 load from source to target.
    Now, my source file contains the data below:
    0.01
    0.02
    0.03
    12.01
    When I execute the mapping, the DB2 target table gets loaded with all records. The values in the corresponding DB2 target table are:
    0.00
    0.02
    0.03
    12.01
    As you can see, the value 0.01 is getting converted to 0.00, whereas all other values come through as expected. As a quick fix, I have applied to_decimal(source_field, 2) before loading to the target. Could anyone please explain whether this is an issue with PowerCenter or something else? If I think it's a rounding issue, then how come the value 0.02 works fine? Please advise.

    Hi All, I have a scenario reading a source file. The file delimiter is '|', and the number of pipes per line should be 17; if a line contains more than 17, an error email should be sent. For this, I first print the total pipe count of each line to a file called pipelinecount.txt. Then I read this file and feed each value to a loop; when a value is > 17, the process should exit and send an email. But in the script I was getting an error at the while line ('while line in ...' is not valid ksh); using 'for line in' (or 'while read line') instead works:
    #!/bin/ksh
    set -x
    SOURCE_DIR=/vp01/SrcFiles
    # count the '|' characters on each line of the source file
    sed 's/[^|]//g' /vp01/SrcFiles/Test.txt | awk '{ print length }' > /vp01/SrcFiles/pipelinecount.txt
    cd $SOURCE_DIR
    for line in `cat pipelinecount.txt`; do
      if [ $line -eq 17 ]; then
        echo "No issue in pipeline"
        exit 0
      fi
      if [ $line -gt 17 ]; then
        echo "No of pipelines exceeded the expected. Please verify the source file." | mailx -s "WKFS Load: Failed" [email protected]
      fi
    done

  • Re: Issue in Decimal value round off in DB2

    Hi All, it is happening only when the decimal separator of the source is ',' and behaving normally when the decimal separator is '.'. Once we cast the incoming field with TO_DECIMAL(IN_FIELD, 2), it works fine.
    Regards, Srinivas


  • How to remove the unassigned (#) value in a BEx report in 04s

    Any help on hiding/removing the unassigned (#) value in a BEx report in 04s? The query is created on an InfoSet (left outer join).

    You can exclude # in your restriction...
    Assign points if it helps.

  • Issue with indicator values in report display

    Hi All,
    I am facing an issue with value type #.
    We have an account restricted hierarchy, fiscal year period and value type in the rows.
    I need to show the actuals indicator (value type 10) data in one row for each fiscal year period. But for some of the key figures we are getting # data: the value type indicator repeats twice for each period, and we need to post these # values to value type 10. Does anyone have an idea how to handle this?
    For example, my report currently displays like this:
    Account code   Fiscal Year period   Value type   Keyfig 1   Keyfig 2   Keyfig 3
    CA1100         001.2006             10                      100        22
                   001.2006             #            200
                   002.2006             10                      200        44
                   002.2006             #            300
    But I need the report like below:
    Account code   Fiscal Year period   Value type   Keyfig 1   Keyfig 2   Keyfig 3
    CA1100         001.2006             10           200        100        22
                   002.2006             10           300        200        44

    Sirisha,
    In the query, you can filter on the actual value type, i.e. 10, display KF1 and KF2 as they come from the source, and create a restricted key figure with a restriction on account, fiscal year period and value type = #.
    That will give you the exact value. Alternatively, while loading to the cube or ODS, you can move that value to KF3 using a start routine.
    Nagesh Ganisetti.
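
    For the start routine option, a minimal sketch of the idea (BW 3.x style, where DATA_PACKAGE is the standard table in update rule start routines); the field name VTYPE is an assumption and must be adapted to your communication structure:

    * Hypothetical sketch: repost records with value type '#' (initial)
    * to value type '10' so they aggregate into the actuals row.
    FIELD-SYMBOLS: <fs_data> LIKE LINE OF DATA_PACKAGE.

    LOOP AT DATA_PACKAGE ASSIGNING <fs_data>.
      IF <fs_data>-vtype IS INITIAL.   " '#' in reporting = initial value
        <fs_data>-vtype = '10'.        " repost to actuals
      ENDIF.
    ENDLOOP.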

  • BEx authorization issue with colon value

    Hi All,
    I have created a few reports in version 3.5 on a cube. Two reports have the authorization object for company, and the rest don't. If I do not give the colon (:) value for company, authorization fails for the reports that do not have company. If I give the colon value for company in the authorization object, both the reports with company and those without work fine. If I do not specify any input value, the reports that have company return all company values (but they should give a no-authorization message instead of displaying all companies). I'm using a characteristic variable of type authorization, ready for input. Can you please advise me how to fix this issue?
    Cheers,
    MKR

    Hi MKR,
    If you have a custom authorization object and you have checked it against a cube, then every query/report on that cube should have that authorization object.
    Please change the queries accordingly and check.
    Hope this helps.

  • Issue with Region field in BW report

    Hi experts,
    I am facing an issue with a BW report: the data for REGION for some billing document numbers shows as # with the description 'not assigned'. There is also no data for region in its data target, and no data in the VBRK table for those billing document numbers on the R/3 side.
    Can you please suggest some solution.
    Thanks for help in advance.
    Regards
    NP

    Hello,
    If there is no data for a characteristic value, # ('not assigned') will be displayed in the report.
    This is standard BW functionality.
    If you would like to remove the # values in BEx, you have to go for a macro.
    In WAD you have to eliminate the first value in the respective web item.
    Regards,
    Vishwa.

  • Issue with Negative Value for Total valuated stock 0VALSTCKQTY.

    Hi Experts,
    We loaded the cube from DataSources 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM.
    We mapped the quantity field from 2LIS_03_BF in the transformation into either the key figure "Quantity Issued from Valuated Stock" (0ISSVALSTCK) or "Quantity Received into Valuated Stock" (0RECVALSTCK) of the cube.
    For the total valuated stock, we used the key figure 0VALSTCKQTY. This key figure has 0RECVALSTCK as inflow and 0ISSVALSTCK as outflow. When I tried to check the content of 0VALSTCKQTY, the key figure was not present in the InfoCube content. I understand that the value for this key figure is calculated at query execution time with the formula:
    last valuated stock + (received valuated stock - issued valuated stock)
    The issue is that the first record in the query comes out with a negative value for the total valuated stock 0VALSTCKQTY, even though the received and issued valuated stock values are zero. Could anyone please help me understand how the first record in the query can be negative even though its inflow and outflow fields are zero?
    Many thanks in advance.
    Jeswanth

    Hi Srini,
    I observed an interesting reason for the stock being negative in the first record.
    Issue: while executing the BEx report, we get the first record with a negative value.
    Let me explain with an example:
    Material: XYZ
    Plant: A
    Date of stock initialization for DataSource 2LIS_03_BX: 12 April 2009.
    So on 12 April 2009, consider that we have a stock in store with a value of 2640.
    Then we loaded DataSource 2LIS_03_BF for all historic movement types.
    At query execution time, the first record comes out as the negative of the available stock present on the day of initialization.
    So, as we initialized DataSource 2LIS_03_BX on 12 April 2009, we had at that time a stock of 2640 available, and with exactly the same value of 2640 we get a negative value, i.e. -2640.
    One more point to note: since we initialized DataSource 2LIS_03_BX on 12 April 2009, it created an opening balance of 2640 in the InfoCube on the day of initialization (which is an extra record). So if the record preceding 12.04.2009 had some value in it, that value would get added to the 2640 EA, giving an incorrect stock.
    So in the query the records appear in the following manner:
    Calendar Day                      Total stock   Received stock   Issued stock
    28.12.2005                        -2640 EA
    29.12.2005                         2000 EA      640 EA           0
    10.04.2009                            0         0                2000 EA
    11.04.2009                            0         0                0
    12.04.2009 (initialization day)    2640 EA      0                0
    13.04.2009                         ...
    (To bring the record just before initialization to 0, a negative value of -2640 EA is created in the first record.)
    The day before initialization, the total stock is 0 due to the negative effect introduced by the first record, and from the day of initialization onwards the records show the accurate values. On 12 April 2009 we can see that a stock of 2640 is brought into the total stock. In fact, if there were no negative value of -2640 in the first record, the value on 11.04.2009 would be 2640 EA, and this would get summed with the opening balance of 2640 EA created by 2LIS_03_BX on the day of initialization; so on 12.04.2009 the total stock would be shown as 5280 EA. To prevent this double counting, the first record is created with the negative of the available stock present at initialization, making the record before the day of initialization (11.04.2009) equal to 0, so that from 12 April 2009 onwards the query result shows the actual total stock.
    This happens only if we use both DataSources 2LIS_03_BX and 2LIS_03_BF for loading into BW.
    If we load with 2LIS_03_BF alone, then since 2LIS_03_BX is ruled out of the loading, no negative of the available stock is created, because no opening balance is created on the day of initialization; the available opening balance flows into the consecutive records through the movement types, and 12 April 2009 shows the available total stock of 2640 EA.
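
    In short, the offset record makes the arithmetic work out; a compact recap of the example above (the figures are the ones from this thread):
    Marker from 2LIS_03_BX on 12.04.2009:                    +2640 EA
    Net historic movements from 2LIS_03_BF up to 11.04.2009: +2640 EA
    Offset record at the start of history:                   -2640 EA
    Stock on 11.04.2009 = -2640 + 2640          =     0 EA
    Stock on 12.04.2009 =     0 + 2640 (marker) =  2640 EA   (correct)
    Without the offset:  2640 + 2640            =  5280 EA   (double counted)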
    Kindly let us know your opinions on this...
    Thanks.
    regards,
    Jeswanth

  • Importing a text file with variable values into web reporting

    Hello,
    I'm looking for a way to import values from a text file into a variable selection in web reporting.
    With BEx reporting in Excel this is possible by clicking the multiple selection button in the variable popup screen and then selecting the "Import from Text File" button.
    However, this function does not seem to be available for web reporting...
    It would be great if someone could help me out with this!
    Thanks & regards,
    Arvid

    Hi,
    we could resolve this issue, so I thought it might also be helpful for others:
    In our example we used a file with material numbers. This file is stored in a directory which SAP BI must be authorized to read.
    The file looks something like this:
    4711
    4712
    4713
    4714
    The file is named "import.txt" and lies in the directory "/usr/sap/EC6/files/bi/"
    *&  Include           ZXRSRU01
    * global variables
    Data: intern_range LIKE LINE OF i_t_var_range,
          l_s_range    TYPE rsr_s_rangesid,
          line(100)    TYPE c,
          p_file(128)  TYPE c,
          length_rangelow  type i,
          tmp_rangelow     like l_s_range-low.
    * internal tables for selection-transfer from transaction
    * Data: BEGIN OF it_file occurs 0,
    *        it_p_file(128) TYPE c,
    *      END of it_file.
    IF i_step = 1.
    ** variables can be changed or set before pop-up appears
      CASE i_vnam.
    * take material from external file to selection-list
         WHEN 'ZSD_UPMA'.
    ** call of transaction, with which the path can be set
    *CALL TRANSACTION 'ZBW_VARIABLE' using it_file
    *MODE 'A'      " call should be visible, so that variable can be set
    *UPDATE 'S'.   " first transaction, then processing
    ** The path entered on the selection screen is passed to the variable
    ** The SET parameter is in report ZSD_SELECT_VARIABLE
    *  get parameter id 'VAR' field p_file.
    p_file = '/usr/sap/EC6/files/bi/import.txt'.
    * further handling of variable in BI
          OPEN DATASET p_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
          IF sy-subrc = 0.
            READ DATASET p_file INTO line.
            WHILE sy-subrc = 0.
              IF line(2) <> '//'.
                l_s_range-sign = 'I'.
                l_s_range-opt  = 'EQ'.
                l_s_range-low  = line.
    * fill with leading Zeros
    * in variable tmp_rangelow the value from l_s_range-low is taken
        tmp_rangelow    = l_s_range-low.
    * read the length
        length_rangelow = strlen( tmp_rangelow ).
    * in our case: material has 18 characters
        while length_rangelow lt 18.
          CONCATENATE '0' tmp_rangelow INTO tmp_rangelow.
          length_rangelow = length_rangelow + 1.
        endwhile.
    * initialize l_s_range-low
        clear l_s_range-low.
    * set with filled values
        l_s_range-low = tmp_rangelow.
    * transfer to structure
                APPEND l_s_range TO e_t_range.
              ENDIF.
              READ DATASET p_file INTO line.
            ENDWHILE.
          ENDIF.
          CLOSE DATASET p_file.
        ENDCASE.
    ELSEIF i_step = 2.
    ** in step 2 all variable values from pop-up input can be processed or
    ** User Exit variables can be derived
    * End of user exit.
    ENDIF.
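    A side note on the zero-padding: for material numbers, the manual WHILE loop can also be replaced by the standard ALPHA conversion exit, which pads a value with leading zeros to the internal length of the target type (18 characters for MATNR). A minimal sketch:
    * Alternative to the manual padding loop above: let the standard
    * ALPHA conversion exit pad the material number to internal format.
    DATA: lv_matnr TYPE matnr.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = line
      IMPORTING
        output = lv_matnr.
    l_s_range-low = lv_matnr.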
    Hope I could help!
    Best regards,
    Tobias

  • Strange issue with a huge SSRS 2008 report (supposed to return 3 million records)

    Hi,
    NOTE: The strange part (as mentioned in title) will come in the end.
    I am running a SSRS 2008 report which fetches 3 million records from a remote server. After around 1 hour the report processing stops and I see an error icon on the lft bottom of the browser window. When I click on that to see the error details it shows
    some PageRequestManagerSQLErrorException with an unknown error message with code 12029 (sometimes 12002).
    When I see the reportserver logs there is an error message logged in it which says "Microsoft.ReportingServices.Library.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the
    database is required for all requests and processing. ---> Microsoft.ReportingServices.Library.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for
    all requests and processing. ---> System.InvalidOperationException: Timeout expired.  The timeout period elapsed prior to obtaining a connection from the pool.  This may have occurred because all pooled connections were in use and max pool size
    was reached."
    The <DatabaseQueryTimeOut> value in the report server configuration file is already having a value set to 7200 seconds(2 hours).
    NOW, the strange part is, when I open the ExecutionLog2 table in ReportServer database, there is an entry for the same report with the status as "success" !!!
    My head is spinning over this issue, somebody please rescue.
    Regards.

    Not sure if this will help, but you might give it a try:
    1. Open the Registry Editor.
    2. Navigate to the registry path below:
       HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Tcpip\Parameters\
    3. Create a DWORD value named MaxUserPort with value data 10000 (decimal).
    4. Restart the server.
    5. Check and confirm that the registry key 'MaxUserPort' has been added.
    Do the same for TCPTimedWaitDelay and reduce its value to 60. The steps above increase the maximum number of user ports and reduce the TIME_WAIT delay in order to free up resources on the server.
