Rounding for Analytical Data

Is there a function module that does Rounding for Analytical Data?
If I use function module ROUND, 1.25 becomes 1.3 and 1.35 becomes 1.4.
But under analytical rounding rules, when the digit to be dropped is exactly 5, you round down if the preceding digit is even and round up if it is odd. So 1.25 becomes 1.2 and 1.35 becomes 1.4.

Megan,
do one thing.
Check the last digit of the number (dec_digit in the code snippet below), i.e. the first digit after the decimal point as per your rule. You can convert the number to a string and split it (or use some other method) to get that first digit after the decimal.
Then use an IF block to either floor or ceil.
* note: floor/ceil round to whole numbers, so scale amount to the digit you
* are rounding at first (e.g. * 10 for one decimal) and divide back afterwards
gv_val = dec_digit mod 2.
if gv_val = 0.
  gv_result = floor( amount ).   " kept digit is even -> round down
else.
  gv_result = ceil( amount ).    " kept digit is odd -> round up
endif.
I am not giving executed code here as I don't have SAP with me at the moment, but use this logic. Hope this will do the job.
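As far as I know, function module ROUND only offers up/down/commercial rounding, but if your release already has the built-in rounding functions (7.02 or later, I believe), the rounding mode ROUND_HALF_EVEN of class cl_abap_math implements exactly this analytical rule; a minimal sketch, variable names are just examples:
DATA lv_amount TYPE decfloat34 VALUE '1.25'.
DATA lv_result TYPE decfloat34.

lv_result = round( val  = lv_amount
                   dec  = 1
                   mode = cl_abap_math=>round_half_even ).
" 1.25 -> 1.2 and 1.35 -> 1.4: ties go to the even neighbour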
thanks
Somu

Similar Messages

  • Java rounding for Double data type

    Hi,
    Is there a way to change the default precision to 2 digits after the decimal point for the Double data type?

    java.text.DecimalFormat:
    import java.text.DecimalFormat;
    import java.text.FieldPosition;
    DecimalFormat format = new DecimalFormat("#.00");
    StringBuffer formatted = format.format(Math.PI, new StringBuffer(), new FieldPosition(0));
    System.out.println(formatted); // prints out 3.14
    Probably as applicable today as it was 3 years ago.

  • User exit for additional data B for sales order item

    Hi all,
    my client requirement is
    (This business requirement will make the Last Price for a given item visible during order entry.)
    • Retrieve & display during order entry the most recent unit price given to a customer for a specific item, from the billing data.
    • Display the Last Price under the Additional Data B screen.
    Add a new field (last extended price) on the Additional Data B screen.
    After that:
    1. Use the Sales Order Material Number (VBAP-MATNR), Sales Order Sales Organization (VBAK-VKORG), Sales Order Distribution Channel (VBAK-VTWEG), Sales Order Division (VBAK-SPART) and Sales Order Sold-to Number (VBPA-KUNNR for VBPA-PARVW = 'AG') to access the Billing Items by Material Index Table (VRPMA), and specify a billing date (VRPMA-FKDAT) of less than 60 days from the current Sales Order requested delivery date (if specified at the header, VBAK-VDATU, or at the schedule line level, VBEP-EDATU). This results in all the billing documents where the sold-to bought the item, but it isn't completely refined yet. Retain the billing document (VRPMA-VBELN), item (VRPMA-POSNR) and billing date (VRPMA-FKDAT) in a temporary table to pass to step 2 as the input.
    2. Use the billing document (VRPMA-VBELN) and item (VRPMA-POSNR) to read the Sales Document Partners Table (VBPA) for the partner function (VBPA-PARVW = "SH") and the Sales Order Ship-to (VBPA-KUNNR for VBPA-PARVW = 'WE') to select ONLY billing documents that are for that given ship-to location. This filters out only the billing documents relevant for that ship-to location.
    3. From the resulting list of billing documents, select the most recent date (VRPMA-FKDAT), which refines the search down to the last billing document (VRPMA-VBELN) and item (VRPMA-POSNR).
    4. Using the most recent billing document (VBPA-VBELN), access the Billing Document Item Table (VBRP). This yields the Last Extended Price as VBRP-KZWI1.
    5. This price is an extended price which needs to be calculated as a 'unit price'. For this billing item, select the sales unit (VBRP-VRKME) to determine whether the sales unit is in cases or eaches.
    a. If the unit of measure is in cases, then simple math is required to divide the Last Extended Price (VBRP-KZWI1) by the billing quantity (VBRP-FKIMG). Standard rounding should apply when .005 results in .01.
    How do we achieve this?

    Hi Chakravarthy,
    use the exits provided in SAPMV45A - the MV45*ZZ includes and the screen exits 8309, 8310, 8459, 8460 as well. Just be sure to
    put your code in a ZZNNNNNN include called from the SAP-provided FORMs instead of coding directly in those FORMs.
    You can check the user exits below:
    MV45ATZZ: For entering metadata for sales document processing. User-specific metadata must start with "ZZ".
    MV45AOZZ:
    For entering additional installation-specific modules for sales document processing which are called up by the screen and run under PBO (Process Before Output) prior to output of the screen. The modules must start with "ZZ".
    MV45AIZZ:
    For entering additional installation-specific modules for sales document processing. These are called up by the screen and run under PAI (Process After Input) after data input (for example, data validation). The modules must start with "ZZ".
    MV45AFZZ and MV45EFZ1:
    For entering installation-specific FORM routines and for using user exits, which may be required and can be used if necessary. These program components are called up by the modules in MV45AOZZ or MV45AIZZ.
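    A rough sketch of the lookup logic from steps 1-5 of the requirement, as it might sit in one of those routines (for example USEREXIT_MOVE_FIELD_TO_VBAP in MV45AFZZ). The new field name ZZ_LAST_PRICE, the way the ship-to is obtained and the exact key fields of VRPMA are assumptions you will have to verify in your system:
    DATA: lt_vrpma      TYPE STANDARD TABLE OF vrpma,
          ls_vrpma      TYPE vrpma,
          ls_vbrp       TYPE vbrp,
          lv_shipto     TYPE vbpa-kunnr,
          lv_kunnr_we   TYPE vbpa-kunnr,
          lv_date_from  TYPE vrpma-fkdat,
          lv_unit_price TYPE vbrp-kzwi1.

    " lv_shipto must be filled with the order's ship-to (PARVW 'WE'),
    " e.g. from the partner table XVBPA of SAPMV45A.

    " Step 1: billing items for this material within the last 60 days
    lv_date_from = vbak-vdatu - 60.
    SELECT * FROM vrpma INTO TABLE lt_vrpma
           WHERE matnr  = vbap-matnr
             AND fkdat >= lv_date_from.
    " (add sold-to / sales area restrictions as per the VRPMA key fields)

    " Steps 2 and 3: newest billing item that belongs to the same ship-to
    SORT lt_vrpma BY fkdat DESCENDING.
    LOOP AT lt_vrpma INTO ls_vrpma.
      SELECT SINGLE kunnr FROM vbpa INTO lv_kunnr_we
             WHERE vbeln = ls_vrpma-vbeln
               AND posnr = '000000'        " header partner of the billing doc
               AND parvw = 'WE'.
      IF sy-subrc = 0 AND lv_kunnr_we = lv_shipto.
        EXIT.                              " most recent relevant billing item
      ENDIF.
      CLEAR ls_vrpma.
    ENDLOOP.

    " Steps 4 and 5: read the billing item, turn the extended price into a unit price
    IF ls_vrpma-vbeln IS NOT INITIAL.
      SELECT SINGLE * FROM vbrp INTO ls_vbrp
             WHERE vbeln = ls_vrpma-vbeln
               AND posnr = ls_vrpma-posnr.
      IF sy-subrc = 0 AND ls_vbrp-fkimg > 0.
        lv_unit_price = ls_vbrp-kzwi1 / ls_vbrp-fkimg.
        vbap-zz_last_price = lv_unit_price.   " ZZ_LAST_PRICE: assumed custom field
      ENDIF.
    ENDIF.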
    Reddy

  • How to create an ABAP report off of an SRM box for live data?

    How to create an ABAP report off of an SRM box for live data?
    Thanks in advance.
    York.

    you can try an InfoSet query:
    STEP - A:
    1. Go to T Code RSQ02 and give the InfoSet name & select CREATE.
    2. Provide the Name (Description) and Data Source, e.g. here I take "DIRECT READ OF TABLE" = /BIC/AODS100. Then CONTINUE.
    3. Select what to include from the 3 options available in the popup, here "INCLUDE ALL TABLE FIELDS". Then check the fields and click GENERATE (the red and white round icon).
    4. Now provide the PACKAGE for the INFOSET. Come back (F3).
    STEP - B: optional (if you want to create a new user group)
    1. Select ENVIRONMENT -> USER GROUPS. Provide the User Group name and CREATE.
    2. Provide Description and SAVE.
    3. Provide PACKAGE and SAVE. Come Back (F3) to the Initial Screen.
    4. Click Role/User Group Assignment. Select Newly Created User Group or an existing one. Then SAVE (CTRL + S). F3.
    STEP - C:
    1. Select ENVIRONMENT -> Queries. Provide the query name and CREATE.
    2. Select the InfoSet you have created and assigned to the user group.
    3. Provide the Title and select BASIC LIST. There you have to select (check) the fields you want to display, SAVE and then TEST. It will ask for a Variant; just CONTINUE.
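    If you prefer a plain ABAP report instead of the InfoSet query, a minimal sketch that reads the same table directly could look like this (the table name /BIC/AODS100 is taken from the steps above, and cl_salv_table is assumed to be available; adjust both to your system):
    REPORT zsrm_live_data.

    " Read the DSO active table directly and display the first 100 rows.
    " /BIC/AODS100 is only an example name - use your own active table.
    DATA lt_data TYPE STANDARD TABLE OF /bic/aods100.
    DATA lo_alv  TYPE REF TO cl_salv_table.

    START-OF-SELECTION.
      SELECT * FROM /bic/aods100 INTO TABLE lt_data UP TO 100 ROWS.

      TRY.
          cl_salv_table=>factory( IMPORTING r_salv_table = lo_alv
                                  CHANGING  t_table      = lt_data ).
          lo_alv->display( ).
        CATCH cx_salv_msg.
          WRITE: / 'ALV could not be created.'.
      ENDTRY.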

  • Segmentation Builder using ADS - Analytical Data Storage

    Hi -
    We are using ADS (Analytical Data Storage) in Segment Builder. ADS brings in InfoObjects from BW, an InfoSet is created on that ADS, and the InfoSet is then used as the Data Source in Segment Builder. We are looking for some information on the following. Any help will be greatly appreciated.
    1. Are any other customers using ADS (Analytical Data Source) in Segment Builder, and how has their experience been?
    2. We have a large number of records, in the millions, which will be pulled from BW to the CRM ADS. Can there be any performance issues? Are there any known issues?
    3. Keeping the performance issue in mind, we have divided the ADS into several tables. However, the number of records in one table can be much higher than in the others, as the tables correspond to different customer types. What is the optimal number of records in one table? Is there a better way to define the ADS?
    Your help is greatly appreciated.
    Thanks. - MP

    Hello Mohanpreet,
    We are pushing data from BW to CRM for subsequent generation of BPs in the CRM system, and for this we are exploring the APD functionality.
    I am using the ADS adapter in CRM and have made the proper entries in the CRM system (ref. screen 'Release data target adapter' in the CRM system).
    Now, when I go back to the BW system and start to design my landscape in the APD transaction, I select CRM as my target system, which is connected to BW via RFC.
    The problem comes when I try to define the attributes of the CRM system: while selecting 'ADS as target adapter' I get the error "CRM error: Adapter 'ADS' not known".
    How do I solve this error? I'm totally stuck at this point.
    Please advise. Due points will be awarded.
    Rgds,
    Ak-

  • EE says time has run out for my data but I still have 500mb PAYG top-up ?

    Hi. I received 10 GB of data when I bought my phone 3 months ago. EE says the time for this data has run out, which is correct; however, I still have 500 MB of the monthly PAYG top-up remaining (this is shown on the data, call, text remaining allowance screen). However, away from Wifi I cannot access the internet. It says I have no data - 'Time Run Out'. Any help appreciated.

    Hi Leanne, I tried contacting Customer Services via phone but just ended up going round in circles with the various options and not being able to speak to a person and basically getting nowhere.  So I went into the local EE shop and a very helpful and patient young lady listened to my story and phoned Customer Services who put her onto the Tech area - then back and forth between the two for over an hour till eventually they reset my account and credited me with a new PAYG monthly pack.  Now I can use the data allowance to access the internet as normal.  The only thing that I believe needs to be checked is that on my next purchase of PAYG I was due to get a 250mb Boost so I need to check I receive this as the text msgs I received from EE stated I was on my first PAYG purchase. Thanks Brendan  

  • MCW_AA111 System not a valid BW target system for analytical applications

    Dear Gurus,
    Our project would like to use retail allocation strategy with reference to SAP BW, such as
    Quotas based on SAP BW data
    Top down (SAP BW)
    Bottom up (SAP BW)
    To retract the data from BW, we go to the configuration of Data Retraction for Retail/CP. (T-CODE: MCW_AA).
    In the Maintenance of Queries for Analytical Applications, we wanted to add a new entry to the query list. We can find the Target Sys., but when we tried the input help (F4) for the Analyt. Appl. Query, the message "MCW_AA111: System is not a valid BW target system for analytical applications" popped up.
    RFC connection has already been established between SAP-retail and BW.
    Does anybody have any clue?
    Thank you.
    Alex.

    hi Kevin,
    check if the following OSS note helps:
    Note 983449 - Termin A122 1COLUMN no valid characteristic of infoprovider
    Symptom
    Termination A 122 Brain occurs when you test and generate a query. The system does not recognise the characteristic 1COLUMN.
    Other terms
    Query, condition, COB_PRO
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 2004s BI
               Import Support Package 10 for SAP NetWeaver 2004s BI (BI Patch 10 or SAPKW70010) into your BI system. The Support Package is available once Note 914304 "SAPBINews BI 7.0 Support Package 10", which describes this Support Package in more detail, has been released for customers.
    In urgent cases, you can implement the correction instructions.
    You must first implement Notes 932065, 935140, 948389, 964580 and 969846, which provide information about transaction SNOTE. Otherwise, problems and syntax errors may occur when you deimplement some notes.

  • Definition for the "Data Manager"

    Hi all!!!
    Can anyone give me a definition for the DATA MANAGER?
    I have a presentation about performance issues in a week and it would be nice to have an official definition for it.
    Thanks a lot.

    Part of the analytic engine that controls read access to all InfoProviders.
    InfoProviders are accessed either using an SQL access to the database of the BI system or by accessing alternative sources, such as the database of another operative system, using RFC or HTTP, for example.
    Change operations are also provided for InfoProviders that are stored persistently in the database of the BI system, for example, writing in or deleting data from InfoCubes and DataStore objects. These functions are part of warehouse management.

  • HT1212 my niece's device is locked for 23 million minutes!!!! How can I get round this without losing data, as at 8 she hasn't backed up!

    my niece's device is locked for 23 million minutes!!!! How can I get round this without losing data, as at 8 she hasn't backed up!

    If you restore you will have to restore to Apple's latest software, which frankly *****. I did it and I completely regret it. On iOS 6.1.5 NOTHING works.

  • Essbase Analytics for HFM- Data Update Agent Status

    Hi,
    In Essbase Analytics for HFM, the Data Update Agent status is showing 'Not Active', as is the 'Analytic Link Data' status.
    The Start/Stop option for it is disabled. The other connectivity statuses in the Bridge are all 'Available'.
    What needs to be done to make the Data Update status 'Active'?

    Hi HP,
    The Essbase Analytics Link and HFM need to be registered with the same Shared Services. Hope you are doing the same.
    Thanks & Regards
    Sandy

  • Looking for Calendar functionality for a Date Variable

    Hi Experts,
    Currently I am creating a WEBI report where the source system is an SAP BI system. I have a BEx query with some characteristics at the row level and key figures at the column level. I have a date interval variable (based on 0DATE and optional), where the user inputs the From and To dates to execute the query. I have created the universe on top of this query, but the date interval variable appears as a character in the form of an LOV. When I use this universe and build the report in WEBI, the user prompt for the date appears as a list of date values for this date interval variable, whereas my requirement is to have a date calendar for it.
    I did some R&D on this: when the date variable is a single value and optional, I am able to get the date calendar, but when I use the date interval variable it is treated as a character. I searched the forum, but I didn't find any solution.
    Is it possible to have a date calendar for a date interval variable in BEx, or is it only available for single-value date variables?
    Kindly suggest
    Regards
    Santosh

    Hi,
    you stated it correctly that the calendar shows up in the case of a key date and in the case of a single value, but not in the case of a range.
    In addition, the underlying characteristic needs to be of type DATS.
    Ingo
    Edited by: Ingo Hilgefort on Dec 8, 2009 1:35 PM

  • I need the Log Report for the Data which I am uploading from SAP R/3.

    Hi All,
    I am on a BI 7.0 platform with Support Package 20.
    I need the Log Report for the Data which I am uploading from SAP R/3.
    I extract the data from R/3 into a BI 7.0 DSO where I am mapping the GL accounts to the FS Item. In the transformation I have written a routine on the FS Item InfoObject. I am checking the GL code against a Z table for the FS Item.
    I capture the FS Item from the Z table and then update the FS Item InfoObject with it.
    Now I need to stop the data upload if I do not find the GL code in the Z table, and generate a report of all GL codes for which the FS Item is not maintained in the Z table.
    Please suggest.
    Regards
    nilesh

    Hi.
    Add a field that you will use to identify whether the GL account of the record was found in the Z table or not. For example, create ZFOUND with length 1 and no texts.
    In your routine, when you do the lookup, populate ZFOUND with 'X' when you find a match (sy-subrc = 0) and leave it blank if you don't. Then create a report filtering on ZFOUND = <blank> and output the GL accounts; those will be the ones not existing in the Z table but coming in from your transactions. A sketch of such a routine follows below.
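    A minimal sketch of that field routine body (the Z table name ZGL_FSITEM, its fields GLACCOUNT and FSITEM, and the source field name are assumed; replace them with your own):
    " Field routine for ZFOUND - goes between the generated
    " "begin of routine" and "end of routine" markers of the transformation.
    DATA lv_fsitem TYPE zgl_fsitem-fsitem.   " assumed Z table / field names

    SELECT SINGLE fsitem FROM zgl_fsitem INTO lv_fsitem
           WHERE glaccount = source_fields-glaccount.
    IF sy-subrc = 0.
      result = 'X'.      " GL account is maintained in the Z table
    ELSE.
      CLEAR result.      " blank -> picked up later by the error report
    ENDIF.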
    Regards
    Jacob

  • About delta loading for master data attribute

    Hi all,
    We have a master data attribute load that failed, and it is a delta load. I have to fix this problem but I have two questions:
    1. How can I find those delta requests? I need to delete all the failed requests first, am I right? Master data is not like a cube or ODS where we can find the requests in Manage; for master data, how can we find them?
    2. Could you please let me know the detailed procedure to perform this delta load again?
    Thanks a lot

    Hi...
    1. How can I find those delta requests? I need to delete all the failed requests first, am I right? Master data is not like a cube or ODS where we can find the requests in Manage; for master data, how can we find them?
    Look, for master data there is no need to delete the request from the target. Just make the status red and repeat the load. The problem is that master data sometimes doesn't support a repeat delta; if you repeat, the load will again fail with update mode R, and in that case you have to do a re-init:
    1) Delete the init flag (in the InfoPackage scheduler >> in the top Scheduler tab >> Initialization options for the source system).
    2) Init with data transfer (if the failed load picked up some records); otherwise init without data transfer, if the last failed delta picked up 0 records.
    3) Then delta.
    2. Could you please let me know the detailed procedure to perform this delta load again?
    1) Make the QM status red, to set back the init pointer.
    2) Repeat the load.
    After that, if the load again fails with update mode R:
    1) Delete the init flag.
    2) Init with data transfer (if the failed load picked up some records); otherwise init without data transfer.
    3) Then delta.
    Regards,
    Debjani.....

  • For existing data in BI, client need to take a call to park the same from s

    hi,
    friends,
    For the existing data in BI, the client needs to take a call on parking it, from a safety point of view, by copying the existing ODS.
    Is this possible? If it is, please give me the details.
    Thanks & regards
    sivaji

    hi,
    well, when you copy an ODS only the structure is copied. To load the data you need to create a transformation and a DTP for the new ODS.
    You need to load the data manually; copying the ODS only copies the structure.
    Regds,
    Shashank

  • Data Federator XI 3.0 using DB2 VARCHAR FOR BIT DATA Column?

    We have a column in a DB2 database that is defined as VARCHAR(16) FOR BIT DATA.
    We are using the suggested IBM JDBC driver, db2jcc.jar, against a DB2 OS/390 8.1.5 version database.
    The Datasource column displays a data type of NULL, indicating the DF does not understand or cannot handle this IBM data type.
    We have two issues.
    First, target tables are not able to return any columns, regardless of whether we exclude the columns defined as NULL as mentioned above. We see the 'Wait' animation for a very long time when we use the 'Target table test tool' option. Selecting to display the count only returns zero.
    We are able to fetch and view non-NULL column data when using the 'Query tool' under the Datasource pane.
    I also get the same result when using the 'My Query Tool' in Server Administrator; a selection against the sources returns data while selecting from a target table returns no data. Also, a 'select count(*)' returns zero.
    The second issue is in mapping a relationship between two DB2 tables where the join is between two columns of the above-mentioned type (NULL).
    The error we get back when we use "Show Errors" is "The types 'NULL' (in 'S1.PLANNEDGOALID') and 'NULL' (in 'S2.PLANNDEDGOALID') are not compatible.". When reviewing the relationship, a dashed red line appears instead of a solid grey line between the two tables in the "Table relationships and pre-filters" section of our mapping pane.
    The following query returns an error via the Server Administrator Query Tool: "Types 'NULL' and 'NULL' are not compatible for operator '=' (Error code: 10248)".
    select count(*)
    from (select s1.CASEID, s2.PLANNEDGOALID, s2.NAME, s2.PLANNEDGRPSTTYCD
            from "/DF_CMS_ODS/sources/CMFSREPT/CMSPROD.PLANNEDGOAL" AS s1,
                 "/DF_CMS_ODS/sources/CMFSREPT/CMSPROD.PLANNEDGOAL" s2
           where s1.PLANNEDGOALID = s2.PLANNEDGOALID)
    Here are the properties settings in the Resource Connector Settings for jdbc.db2.zSeries we are using.
    capabilities: isjdbc=true;orderBy=false
    driverLocation: drivers/db2jcc_license_cisuz.jar;drivers/db2jcc.jar
    jdbcClass: com.ibm.db2.jcc.DB2Driver
    sourceType: db2
    supportsCatalog: no
    urlTemplate: jdbc:db2://<hostname>[:<port>]/<databasename>
    Here are the Connection parameters as defined for the datasource in DF Designer.
    Defined resource: jdbc.db2.zSeries
    Jdbc connection URL: jdbc:db2://DB2D03:50000/CMFSREPT
    Authentication: Use a specific database logon for all Data Federator users.
    User Name: x
    Password: hidden
    Login domain: -- Choose a defined login domain --
    Supports Schema: checked
    Schema: is empty
    Prefix table names with schema name: checked
    Supports catalog: unchecked
    Prefix table names with the database name: unchecked
    Table types: TABLE and VIEW
    So, the following are the two questions we need answered:
    Is this a limitation of Data Federator?
    Is there a workaround short of changing the data type in the database?

    Hi Darren,
    The VARCHAR() FOR BIT DATA type is a binary data type, and Data Federator does not support binaries. But if, in your case, it makes sense to map this column to a VARCHAR data type, you can configure the DB2 connector to view this column as a VARCHAR.
    Your column can be mapped explicitly to a data type of your choice using a property: castColumnType.
    This property can be set by updating the resource you selected when you registered your DB2 data source.
    If the resource is "jdbc.db2", then:
    1. Launch Data Federator Administrator
    2. Click on "Administration" tab
    3. Click on "Connector Settings"
    4. Select the right resource: "jdbc.db2"
    5. Click "Add a property"
    6. Select "castColumnType"
    7. Set its value to: VARCHAR() FOR BIT DATA=VARCHAR
    8. Click on Ok
    You should see this column as a VARCHAR.
    Regards,
    Mokrane
    PS: For the target table issue, we have forwarded your mail to the Data Federator Designer team.
