Decimal values are being rounded at ODS level

Hi Guys,
I have an issue in the production system. While loading data from the Oracle database to the ODS, the values are being rounded.
Example: the value in the Oracle database is 0.0625, but after loading into the ODS it is rounded to 0.0630. In the properties of the key figure, the number of decimal places is set to 0.0000.
Can anybody help me resolve this issue?
Points will be awarded. It is an urgent issue.
Thanks,
Chinna

Hi,
try the following in your key figure settings in the InfoObject maintenance:
Additional Properties -> Key figure with maximum precision
regards,
MG.

Similar Messages

  • By default in BPC values are summed at node level; it does not make sense to sum averages on parent nodes.

    Hi Guys,
    We have a business requirement to store data as average figures.
    The values are entered at the base member level, but in reports we need to show average values at the parent level instead of the sum
    (by default, BPC sums values at node level); it does not make sense to sum averages on parent nodes.
    We managed it with Excel formulas, but we are facing a problem with dimensions that are in the context.
    (I tried using dimension member formulas, but the system does not accept the AVG keyword in member formulas.)
    In the end we managed the report with formulas, but I am worried about how we will manage when formulas are not an option.
    Please suggest a solution to show average values at the node level in specific reports.

    Hi Krao,
    Please define what you mean by average - arithmetic average or weighted average (the latter makes more business sense to my mind).
    Also, please provide some samples explaining the dimensions used in the average calculation.
    In our case we use dimension member formulas to show average prices, discounts etc. in the reports.
    B.R. Vadim

  • Decimal values are not flowing to Excise Invoice created in J1IIN

    Dear Sap Gurus,
    Please help me with this problem. After creating the invoice in VF01, when I create the outgoing excise invoice the decimal values for BED, ECS and SEC. ECess (AT1) are not carried over; instead the values are rounded off to the nearest whole number.
    For example, if the invoice values are as below:
    BED= 16
    ECS= 3.2
    HSC= 1.6
    when the excise invoice is created, the values are changed as below:
    BED= 16
    ECS= 3
    HSC= 2
    Please give a proper solution.
    Regards,
    Suresh

    Check the settings under Logistics - General -> Tax on Goods Movements -> India -> Basic Settings -> Maintain Company Code Settings.
    Also ensure that in V/06, for the respective excise condition types, rounding is not activated.
    thanks
    G. Lakshmipathi

  • How to use the NODIM function without its values being rounded

    I created a new calculated key figure in Query Designer 3.x and used the function NODIM() - value without dimensions. When I use this function, the values are rounded off to the nearest value.
    For example, I have a value of 0.000075 US$; when I use the NODIM function the value is displayed as 0.000080. The value is getting rounded to the nearest value.
    I tried using the absolute value; it did not work.
    Can anyone tell me how to use the NODIM function without its value being rounded to the nearest value?
    Thanks,

    Hi,
    According to your description, you might want the "Notice" field to have a default value when the form is created, while users remain able to change the value of that field.
    As a workaround, you can add an action rule to the "Name" field via InfoPath to fill the default value into the "Notice" field only when the "Name" field is not blank and the "Notice" field is blank.
    The settings of the rule are as below; you can modify them based on your needs:
    Here is a link about how to add an action rule to an InfoPath form, which you can use as a reference:
    https://support.office.microsoft.com/en-us/article/Add-rules-for-performing-other-actions-5322d726-b787-4e5f-8d37-ba6db68b451d?CorrelationId=8a64c12f-aa60-4d90-b1f9-a44fcc4e74b5&ui=en-US&rs=en-US&ad=US
    Best regards
    Patrick Liang
    TechNet Community Support

  • Pay scale group and level values are not appearing in IT0008

    Hi Friends,
    I created pay scale groups and levels in V_T510. Even so, these values are not appearing as F4 (drop-down) values in IT0008.
    Please advise where I have missed the configuration.
    Thanks

    Hi..
    The ESG/CAP and its groupings in V_T510 have already been maintained; however, the values are still not appearing in the drop-down.
    Please suggest how to resolve this.
    Thanks

  • Content Organizer bug - PDF files do not get routed correctly if auto-declaration is on and library-level default values are set

    It looks like whenever column default values are specified at the library level, the Content Organizer routing goes a bit awry, specifically for non-Office files (e.g. PDF). Below are the observations and issues:
    1. A column-level default value is set on a record library with auto-declaration of records turned on. The Content Organizer routes the document to the library but also keeps a copy of the document in the Drop Off library; it does not remove it from the Drop Off library. The instant we clear the default value settings at the library level of the target library, the Content Organizer works as expected again.
    2. If default value settings are specified on a column in the target library, then the PDF file gets routed to the document library but all the metadata is blanked out. The copy of the file that remains in the Drop Off library has all the correct metadata, but the copy in the target library has blanked-out metadata.
    Are the two observations described above by design, or are they bugs? Is there any documentation available that confirms this? It does not make logical sense, and proving this to a client in the absence of any documentation is a challenge.
    The problem goes away if we set the default value on the site columns directly at the site collection level. It is just the library-level defaults that the PDF files do not seem to agree with.

    Hi Lisa,
    Thanks for responding. This can be replicated in any environment, but only for a specific combination of Content Organizer settings. The combination of settings I am referring to can be seen in the screenshot below.
    If you turn off "Redirect users to the Drop Off library" for direct uploads to libraries and turn on SharePoint versioning, you should be able to replicate the issue. We are also using managed metadata site columns. I simplified this use case to a custom content type with just two custom managed metadata columns and can still replicate the issue in several environments.
    Also note that the issue does not occur if the default values are set at the site or site collection level; it only occurs if you set the column default value at the library level. I was able to replicate this on a completely vanilla Enterprise Records site collection freshly created just to test this.
    Finally, note that the issue is not that the file does not reach the destination library. The issue is that the document does not get removed from the Drop Off library after it is transferred to the destination library, when it technically should have been removed.

  • DRG-11427: valid gist level values are S, P

    When I try to create a gist, this error appears, but I don't understand what it means.
    I've found in the documentation: "Specify a valid gist level" - but what does that mean?
    SQL> begin
    2 Ctx_Doc.Gist (
    3 index_name => 'realistic_docs_text',
    4 textkey => 2,
    5 restab => 'ctx_gists',
    6 query_id => 2,
    7 pov => 'GENERIC',
    8 glevel => 'paragraph',
    9 numparagraphs => 2,
    10 maxpercent => 100 );
    11 end;
    12 /
    begin
    ERROR at line 1:
    ORA-20000: Oracle Text error:
    DRG-11425: gist level paragraph is invalid
    DRG-11427: valid gist level values are S, P
    ORA-06512: at "CTXSYS.DRUE", line 160
    ORA-06512: at "CTXSYS.CTX_DOC", line 401
    ORA-06512: at line 2

    OK, I've solved it!
    Solution:
    glevel => 'paragraph' is wrong;
    glevel => 'P' is correct.
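    For reference, the working call is just the block above with the glevel value changed; everything else stays the same:
    begin
      ctx_doc.gist(
        index_name    => 'realistic_docs_text',
        textkey       => 2,
        restab        => 'ctx_gists',
        query_id      => 2,
        pov           => 'GENERIC',
        glevel        => 'P',           -- must be 'S' or 'P', not the full word
        numparagraphs => 2,
        maxpercent    => 100);
    end;
    /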

  • How can this happen? Quantity field values are showing double in the ODS

    Hello BW Gurus,
    I have a staging ODS, and on top of it a reporting ODS and a cube. I have loaded the data from a flat file into the staging ODS, which looks perfectly fine, but in the reporting ODS and the cube the values in the quantity field are doubled. If the values were doubled only in the cube, I might have thought it was due to loading twice or duplicate records, but I can see the quantity values doubled in the reporting ODS as well (which has the overwrite option).
    Does anyone have any clues? When we had this issue a couple of days back, I deleted the request from the reporting ODS and the cube and reloaded it, and everything looked perfectly fine.
    But today the issue is back - isn't that strange?
    Help me if you have any thoughts, and thanks so much,
    Swathi.

    Hello Dinesh and Vikash,
    My issue has taken a twist, and here is the new problem:
    On Friday the report was fine.
    On Saturday the report was fine.
    On Sunday we had problems with the reporting ODS, as it was showing all values doubled. (I cleaned the reporting ODS and cube and reloaded the data.) After that the report worked fine.
    On Monday the report was fine.
    On Tuesday we again got duplicate values (for a few plants), now in the staging ODS as well, which is entirely overwrite-based with one-to-one mapping and no routines.
    I loaded the same file in DEV, and it worked fine there.
    I debugged the code; that was fine too.
    I think this is a strange issue.
    Please let me know your ideas.
    Thanks so much,
    Swathi

  • Reporting issue: "Opening balance values are going to Not Assigned"

    Hi Friends,
    The closing balance values are showing perfectly, but the opening balance values are going to "Not Assigned", and I am not able to figure out why.
    Our query is built on the PCA Daily MultiProvider.
    Please help.
    Thanks
    Asim

    Hi Ravish,
    Here are the details
    Stock values and quantities in the PCA cube of BW never show correctly at the article level due to an opening balance issue. This reduces our ability to report on stock at moving average price, i.e. anything that actually matches the financial values in the system. The issue arises because stock at the beginning of the year (quantity and value) for any site is only shown against an article of "Not assigned". Stock movements during the year are shown against the appropriate article. (If you have a new site, it will have stock assigned against an article up until 31 December, and then the 31 December values are shown against "Not assigned" for the next year.)
    Thanks
    Asim

  • Are These i7 4770K Temperatures and Voltage Values Normal With a Z87-G43?

    Hello everyone,
    I recently upgraded my good old Core2Duo rig and bought a new CPU-motherboard-RAM trio. My new specs are as follows:
    - i7 4770K @ stock speed + Cooler Master 212 Evo CPU cooler
    - MSI Z87-G43 motherboard
    - 2x4 GB G.Skill Ripjaws X 1866 MHz
    - Case: Corsair Carbide 400R
    The problem is I wasn't aware that the new Haswell CPUs run slightly hotter, and now I'm a little bit worried about my temperatures. Since I was also planning to overclock my CPU a bit and am trying to find a point that doesn't need a vast voltage increase, I'm losing sleep over this situation at the moment.
    Anyway, I'm using software like HWMonitor, Core Temp, RealTemp and Intel Extreme Tuning Utility. Turbo Boost is also active, so the CPU goes up to 3.9 GHz. Other than that I'm on stock speeds and the motherboard's default values. Here are my temperatures:
    Ambient room temp: Varies between 23-25 °C
    Idle: 28-30 °C
    While playing demanding games like Battlefield 4: Max 58-65°C
    With Intel Extreme Tuning's stress test for 15 mins: max 65-70 °C
    With Prime 95 Blend and OCCT burn tests for 15 mins: max 78-82 °C
    I also ran RealTemp's sensor test and the values are identical, since it uses Prime95 too.
    I also noticed that Prime95 and OCCT increase my CPU voltage from 1.156 V to 1.21 V, while Intel Extreme Tuning Utility's stress test and BurnInTest use 1.156 V. All these tests load 100% of the cores. I couldn't understand why there is a voltage increase on certain tests, which makes my temps go even higher. Will I encounter these kinds of random voltage increases during normal tasks, like playing games, rendering some stuff, etc.?
    On the other hand, I tried the motherboard's OC Genie feature to see what happens. It overclocked the CPU automatically to 4.0 GHz @ 1.10 V. With this setting I've seen a max of 70 °C for a second and mostly 65-68 °C under the OCCT stress test, and my voltage didn't increase at all and sat at 1.10 V. I'm a bit confused about these values, since with default settings I'm getting hotter values and my voltage goes up to 1.21 V with Turbo Boost under Prime95/OCCT burn tests. I also found out that my BIOS is v1.0. I don't really have a performance or stability issue for now except this voltage thing. Would a BIOS update help in this situation? I don't really like to touch something that is already working OK and end up with a dead board.
    I'm also wondering if my temperature values are normal with the CPU cooler I have (Cooler Master 212 Evo).
    I could also buy some extra fans for my case (one exhaust at the top and one intake at the side) and maybe a second fan for the CPU cooler, if you guys think that would help a bit.
    Sorry for my English, by the way; I'm not a native speaker.
    Thanks in advance for all your comments and suggestions.

    Thanks for the reply, Nichrome.
    I will follow your suggestions for the fans. Currently I don't have any fans on top, but I'm considering buying some fans for the top and side. So you get better results with the top fans set as exhaust, right?
    Also, which fan are you using as the second fan for the CPU heatsink? I would buy one of those as well, since we have the same heatsink.
    I'm also using the default/auto voltages and settings at the moment. Just Turbo Boost is enabled, and when it kicks in the voltage goes up to 1.156 V, which seems normal and doesn't produce a dangerous level of heat. The thing is, if I start running Prime95 or OCCT, the voltage goes up to 1.21+ V at the same Turbo Boost speed (3.9 GHz), and that produces a lot more heat than usual. But if I use BurnInTest or the Intel Extreme Tuning Utility stress test, the voltage sits at 1.156 V under full load on all cores. I'm wondering what the reason for this difference is and whether it is software or motherboard related. Even when using OC Genie @ 4.1 GHz, the temperatures and voltages seem lower than with the stock/auto settings (idle 35-38 °C, OCCT stress test 70 °C max, gaming 60-65 °C max). I'm not sure if a BIOS update would fix this, and the whole BIOS flashing process creeps me out. I don't like to bother with something that is already working OK and don't want to end up with a dead board in the end. :P Maybe I'm becoming a bit paranoid, though, since this is a really hard-earned upgrade after 6 years. :P

  • Updated (VORowImpl) values are not reflected in inline popup table

    Hi Experts,
    Currently I am getting an exception when I try to update my iterator binding.
    Here is my use case:
    I have a view object displayed inside an inline popup as a table. When I modify one of the cells, the value change listener fires and calls a method of the ViewObjectRowImpl class. Inside that method I do some computation (I execute a DB query to get back a value) and then update the other columns' data based on the changes.
    First I PPR'ed the table and checked the updated values; unfortunately the changed values are not reflected.
    Then I used the following code:
    DCIteratorBinding itrBinding = findIteratorBinding("EmployeeVO1Iterator");
    if (itrBinding != null) {
        itrBinding.executeQuery();
        itrBinding.refresh(DCIteratorBinding.RANGESIZE_UNLIMITED);
    }
    Now I am getting "JBO-25003: Object EmployeeMgmtAM of type ApplicationModule is not found."
    May I know what went wrong? Am I missing anything? How can I refresh the UI table based on the updated VORowImpl? Looking forward to hints.
    Thanks

    Thanks Frank. Actually, I am accessing the client method as follows:
    DCIteratorBinding itrBinding = findIteratorBinding("EmployeeVO1Iterator");
    if (itrBinding != null) {
        EmployeeVORowImpl employeeVO = (EmployeeVORowImpl) itrBinding.getCurrentRow();
        employeeVO.methodName();
    }
    May I know what the difference is between accessing the method as shown above and accessing it via a MethodBinding?
    What about accessing the attributes of the view object using the above approach, i.e. employeeVO.getName() and the like?
    Also, I have configured the ADF logger to the finest level and found that the primary reason for "Application module not found" was some entity validation. But this does not always happen; the same test case sometimes works fine and sometimes fails. I am clueless. Could you please throw some light on this?
    Thanks
    Edited by: user1022639 on Feb 7, 2012 5:24 PM

  • AET-generated field values are not saved

    Hi Gurus,
    I have created two AET fields on the screen (marked below) that should store their values in table CRMD_CUSTOMER_H. When I create a new service request, enter the values and save, the AET field values are not saved. However, when I then edit the same service request, enter the values and save, the AET field values are saved to the database.
    While debugging I found that the relationship BTHeaderCustExt does not exist the first time, and from the second time onward it appears. Because of this, the data is not saved the first time (line no. 27: current is empty).
    When I tried to create the relationship using create_related_entity, it throws the exception cx_crm_genil_model_error.
    Please advise me on a solution for this.
    Regards,
    Anand

    There should be a context node at your view level. Please check whether the ON NEW FOCUS method is implemented.
    If not, you can implement that method with the code below.
        DATA: lv_collection TYPE REF TO if_bol_bo_col,
              entity        TYPE REF TO cl_crm_bol_entity.
    *   get collection of dependent nodes for the current focus object
        entity ?= focus_bo.
        TRY.
            lv_collection = entity->get_related_entities(
                   iv_relation_name = 'BTHeaderCustExt' ).
            IF lv_collection IS NOT BOUND OR lv_collection->size( ) = 0.
    *         the relation does not exist yet (e.g. on the first save) -> create it
              IF entity->is_changeable( ) = ABAP_TRUE.
                TRY.
                    entity = entity->create_related_entity(
                     iv_relation_name = 'BTHeaderCustExt' ).
                  CATCH cx_crm_genil_model_error cx_crm_genil_duplicate_rel.
    *               should never happen
                ENDTRY.
                IF entity IS BOUND.
    *             wrap the newly created entity in a collection
                  CREATE OBJECT lv_collection TYPE cl_crm_bol_bo_col.
                  lv_collection->add( entity ).
                ENDIF.
              ENDIF.
            ENDIF.
          CATCH cx_crm_genil_model_error.
    *       should never happen
            EXIT.
          CATCH cx_sy_ref_is_initial.
        ENDTRY.
    *   hand the (possibly new) collection over to the context node
        me->set_collection( lv_collection ).

  • Inventory cube - non-cumulative key figure values are showing negative values

    Hi Gurus,
    To improve the performance of the inventory cube 0IC_C03, I did the following steps:
    1) Created a history cube as a copy of the actual cube (0IC_C03).
    2) Transferred all four years of data (2007, 2008, 2009, 2010) to the history cube as a backup, in order to do clustering and cube remodelling.
    3) After doing all this, loaded the current three years of data (2008, 2009, 2010) back into the actual cube and kept one year of data (2007) in the history cube (i.e. maintained only the most recent three years of data in the actual cube).
    5) Created a MultiProvider that includes the actual and history cubes and rebuilt the existing report on top of the MultiProvider.
    6) After purging one year of data from the actual cube, the stock values in the reports are showing negative values.
    7) To clear that issue I loaded the 2007 data back into the actual cube (so the cube now has all years of data, as it did before) to avoid the negative stock values, but the stock values are still showing negative values.
    How do I solve this issue in the inventory cube?
    How do I eliminate the negative values in the reports, which were working properly before the data purge (removing the first year of data from the actual cube)?

    Hi Prayog, thanks for answering. Yes, I went to the data targets, and the formula is already written like this: IF( Debit/Credit = 'H', Qty in OUn, ( 1- * Qty in OUn ) ) for the Actual Consumption key figure, and IF( Debit/Credit = 'H', Amt. in local curr., ( 1- * Amt. in local curr. ) ) for the Amount.
    As I said, from one of the InfoSources the data flows through the ODS and then to the cube. So I checked the data in the ODS with the movement type and posting date as per the report: I selected Debit/Credit = 'H', the movement type and the posting date, but in the ODS output the key figures are not displayed. This is the problem.
    Cheers,
    Hemanth Aluri...

  • Grand total values are not matching the detail report

    The report has grand totals, and when I drill down to the detail report, the grand total values do NOT match the parent report totals. I did some analysis but I am clueless about this issue.
    Please provide your thoughts and insight on this issue.
    Thanks

    Are your summary and detail reports hitting different fact tables, e.g. the summary hitting an aggregate and the detail report hitting its corresponding detail-level fact?
    If so:
    From the front end:
    Fix the filter values in the detail report that are passed from the master report, then try deleting the columns one by one and check the grand total each time. If the values match after deleting a particular column, you need to investigate what the issue is with that dimension table.
    From the database side:
    1. First check that the aggregate table contains a proper aggregation of its detail data.
    2. Take the OBIEE-generated query for the detail report and comment out each dimension table and its corresponding joins to the fact, one at a time. (Before doing this, delete all the dimension columns and other measures from the SELECT statement and keep only the measure that shows the wrong value, so that you do not have to comment out all the SELECT and GROUP BY columns, which saves time.) If the values match after commenting out a particular dimension WID and its table in the FROM clause, then there is a problem with how that WID column is populated in your ETL. A sketch of this step is shown below.
    Is this a BI Apps project?
    By the way, what is your name?
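    To illustrate step 2, the stripped-down query could look something like the following (a sketch only - w_sales_f, w_time_d, w_product_d and the column names are placeholders, not tables from your actual repository):
    SELECT SUM(f.revenue) AS total_revenue               -- keep only the measure that is wrong
    FROM   w_sales_f f
    JOIN   w_time_d  t ON t.row_wid = f.time_wid         -- keep the joins you are not testing
    -- JOIN w_product_d p ON p.row_wid = f.product_wid   -- comment out one dimension join at a time
    WHERE  t.fiscal_year = 2012;                         -- keep the filters passed from the master report
    If the total changes when a particular dimension join is commented out, the corresponding WID column in the fact table is the one to investigate.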

  • In which table are tax code values stored?

    Hi,
    Can anyone tell me in which tables and fields tax code values are stored?
    These are the values maintained in transaction FTXP.
    regards,
    sathya

    Hi,
    Education cess at PO level is not stored directly in any table; instead it is calculated at runtime using function module "PRICING" with calculation type "B".
    Education cess for invoices is stored in table BSET. You can identify the record by the condition type (KSCHL).
    Manual excise conditions are stored in table KONV, linked to the purchase order via the EKKO-KNUMV field.
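    For example, the condition records belonging to a purchase order can be read with a lookup along these lines (a sketch only - the PO number and the condition type below are placeholders, not values from your system):
    SELECT k.kschl, k.kbetr, k.kwert
    FROM   ekko e
    JOIN   konv k ON k.knumv = e.knumv   -- EKKO-KNUMV points to the condition records in KONV
    WHERE  e.ebeln = '4500000123'        -- placeholder purchase order number
    AND    k.kschl = 'JEXC';             -- placeholder excise condition type (KSCHL)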
