Calculating DELTAs

Hi,
In Discoverer I need to write a report that shows deltas, i.e. columns showing the difference between two days' data. Let me give a simple example:
Name  Age  Tennis Score (delta)  Soccer Score (delta)
Pat   25   4                     2
Amy   23   5                     3
Matt  21   6                     7
where the deltas are the differences between today's and yesterday's scores. The data is stored in the database like this:
Name  Age  Date        Tennis Score  Soccer Score
Pat   25   09/20/2005  14            12
Amy   23   09/20/2005  19            8
Matt  21   09/20/2005  18            19
Pat   25   09/19/2005  10            10
Amy   23   09/19/2005  14            5
Matt  21   09/19/2005  12            12
Any help will be highly appreciated.
Thanks
shalu

Hi,
You can do this by using the LAG analytic function, then adding another calculation for the difference. Try something like the sketch below.
NP
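For example, a minimal sketch in Oracle SQL, assuming the data sits in a table called SCORES with columns NAME, AGE, SCORE_DATE, TENNIS_SCORE and SOCCER_SCORE (illustrative names, not the actual Discoverer folder):

    -- LAG fetches each player's previous-day value; subtracting it gives the delta.
    SELECT name, age, tennis_delta, soccer_delta
    FROM (
        SELECT name,
               age,
               score_date,
               tennis_score - LAG(tennis_score)
                   OVER (PARTITION BY name ORDER BY score_date) AS tennis_delta,
               soccer_score - LAG(soccer_score)
                   OVER (PARTITION BY name ORDER BY score_date) AS soccer_delta
        FROM   scores
    )
    WHERE score_date = DATE '2005-09-20';  -- keep only today's row

With the sample data this returns Pat 4/2, Amy 5/3 and Matt 6/7, matching the desired report.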

Similar Messages

  • Global Correlation Risk Delta

    Hi,
    I'm currently working on tuning a pair of IPS modules in ASAs. We are currently in promiscuous mode and tuning/filtering to ensure we don't block any valid traffic when making the switch to inline.
    We are using the new 7.0.1 code and getting the global correlation / reputation data - works great & rocks.
    When viewing the events there is a parameter, "Global Correlation Risk Delta" -- could someone explain to me what that is?
    I understand how it adjusts the RR based on reputation & have the chart (including it for those who do not have it - got it from a Networkers presentation). However, I am having a hard time figuring out what Global Correlation Risk Delta is/means/does... anyone know?
    Thanks,
    Brad

    Here is a basic description.
    Without Global Correlation (versions prior to 7.0, or version 7.0 with the feature turned off), every alert that triggers has a Risk Rating calculated.
    How a Risk Rating is calculated is explained in the following White Paper on cisco.com:
    http://www.cisco.com/en/US/prod/collateral/vpndevc/ps5729/ps5713/ps4077/prod_white_paper0900aecd806e7299.html
    Now with version 7.0, when Global Correlation is enabled, a new parameter is added to the Risk Rating calculation ( + Global Correlation Risk Delta ).
    The Global Correlation Risk Delta is either 0 or a positive value and so can keep the Risk Rating the same, or raise the Risk Rating, but will not decrease the Risk Rating.
    The Global Correlation Risk Delta is calculated based on both the Attacker IP address, and the Initial Risk Rating ( The Initial Risk Rating is the Risk Rating calculated without the Global Correlation Risk Delta).
    When Global Correlation is enabled in version 7.0, the sensor will download a Reputation Database from the Cisco servers. This Reputation Database contains lists of Public IP Addresses that have been known to be sources of attacks in the past. From that database a Negative Reputation Score is determined for each Address in the database. The Negative Reputation Score can range anywhere from -0.5 to -10. If only a few attacks have been seen from the address, the score may be only slightly negative, in the -0.5 to -3 range. The worst offending Attacker IP Addresses could have negative scores in the -8 to -10 range.
    That Reputation Database is only for Public IP Addresses. So Private IP Addresses (addresses used only with NAT/PAT and are not Internet routable) will not exist in the Reputation Database.
    If the attacker IP Address is a Private IP Address, or is a Public IP Address that is NOT in the Reputation Database, then the sensor will automatically set the Global Correlation Risk Delta to 0.
    When added to the Initial Risk Rating, the Risk Rating winds up the same (no change).
    So Global Correlation has no effect on Private IP Addresses, or Public IP Addresses that do NOT have Negative Reputation.
    It is only when the Attacker is from a Public IP Address with Negative Reputation that the Global Correlation Risk Delta is calculated.
    Internally the sensor has a formula to calculate what that Delta should be.
    The inputs to that formula are the Negative Reputation Score for the Attacker IP and the Initial Risk Rating, as well as some proprietary variables for fine-tuning the formula.
    All of these are inputs to the formula, and the one output is the Delta.
    The Delta is then Added to the Initial Risk Rating and results in a Higher Risk Rating.
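    As a worked illustration (numbers invented for the example, not taken from the proprietary formula): if the Initial Risk Rating is 85 and the formula outputs a Delta of 10, the final Risk Rating becomes 85 + 10 = 95; with a Delta of 0 the Risk Rating stays at 85.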
    The chart from your first post is the result of plugging in the 20 highest possible Risk Ratings and 20 possible Negative Reputation Scores, using the original proprietary variable settings, and it shows what the formula will output as the Global Correlation Risk Delta.
    So this should be used as just an example.
    The formula will still be used for Risk Ratings lower than 80 that are not shown on the chart, and will also be used for Negative Reputation Scores that are not neatly rounded to a 0.5 number.
    The proprietary variables are also subject to change, as we continue to fine-tune the formula.
    So the chart you've posted is a good example of the type of Deltas that the formula can output.
    Because of this calculated Delta being added to the Risk Rating, the same attack coming from a known Negative Reputation Public Address will wind up with a Higher Risk Rating than the same attack coming from a Private IP Address (or even the same Public Address when not using Global Correlation).
    The sensor then has features for how it can then make use of the Risk Rating.
    And I will talk about this in the next post. I am limited by the number of characters in a single post or I would have put it into this post.

  • Client for maintenance optimizer

    Hi friends,
    Which client is to be used for Maintenance Optimizer configuration (is it 000 or 001)?
    Kindly advise.
    Regards
    Ayush johri

    Hi Thomas,
    I want the automatically calculated delta between the available Support Packages and the actual Support Package level of my systems, and for this I have to connect my DEV system and assign it to a logical component.
    1. How can I connect the DEV system to the logical component (if you can guide me through the steps)?
    2. Since installing Solution Manager I haven't created any Solution Manager landscape; you said the SolMan landscape would be used for connecting the DEV system to it. How can I do this?
    Please advise.
    Your suggestions would be highly appreciated
    Regards
    Ayush

  • Converting data from frequency to mass but nothing shows up on waveform chart

    I am trying to convert frequency data from LabVIEW to mass in ng. I have a conversion formula, which I used, and wired the output data to a waveform chart.
    So, I think I did everything correctly when trying to convert frequency values from the QCM to mass values in ng/cm^2. What I did was convert the frequency data into mass data as a function of time by calculating the delta frequency and multiplying by the appropriate scalar. I approximated the area of the quartz crystal to be 1 cm^2. I wired the data to a waveform chart (called mass (ng/cm^2)) as shown in the block diagram I attached. The problem is that when I run the VI, nothing shows up on the mass waveform chart I just created. I played around with changing the scales, but absolutely no data is recorded when the VI runs. I don't know why that is.
    Thanks.  -Sicelo
    Attachments:
    block diagram.png (55 KB)

    Hi guys.
    Thank you so much for your help. I succeeded in converting the frequency data to mass. I posted a new question about a new problem I am encountering: my VI does not run even though everything is properly connected.
    My new message is posted under "vi does not sense signal from instrument"
    Any help will be highly appreciated.
    Thanks.
    -Sicelo

  • Clear journal data

    Hi
    When I run the Clear package, the data stored on JRN_Adj is not removed from the fact tables.
    Why is that?
    How should I clear data for a selection and get it cleared for all data sources?
    Are there any extra steps I need to consider when deleting journal data?
    Jesper

    This is about the fundamental way BPC uses the Microsoft OLAP database.
    If you run the Clear package, BPC calculates the delta between the current value and zero, and submits that delta to bring the value of the destination region to zero. So you cannot completely eliminate any record in the FACT, FAC2 or FACTWB tables with Data Manager packages.
    However, if you want to remove records explicitly, you may run a SQL query to delete them (a sketch follows below). But I wonder about the reason - is there a particular one?
    If you unpost a journal, it will also submit a delta to the FACTWB table. Or you can delete from the journal table manually; that is the ultimate way, and somewhat risky.
    If it is necessary for your business, I suggest the following:
    1. Full optimize with compress data.
    2. Unpost all journals on JRN_Adj.
    3. Full optimize with compress data.
    This is based on the scenario that no one has sent data to JRN_Adj by Excel or ZFP, only by journal.
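    For reference, a hedged sketch of such an explicit SQL delete, assuming a BPC for Microsoft application named Finance (so the fact tables would be tblFactFinance, tblFAC2Finance and tblFactWBFinance - substitute your own application's table names) and that JRN_Adj is a DATASRC member. Back up the database first.

        -- Illustrative only: the "Finance" table names and the DATASRC column
        -- are assumptions; adapt them to your application before running.
        DELETE FROM tblFactFinance   WHERE DATASRC = 'JRN_Adj';
        DELETE FROM tblFAC2Finance   WHERE DATASRC = 'JRN_Adj';
        DELETE FROM tblFactWBFinance WHERE DATASRC = 'JRN_Adj';

    After deleting directly in SQL you would typically run a full process/optimize so the OLAP cube reflects the change.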

  • Delta Calculation and Updating multiple tables

    We pull data from a System of Record table that contains the most up-to-date information. The information changes daily, so we have a delta process to identify which new records were added, which records were deleted (records that are not found in the table as compared to yesterday) and which were updated. The delta process compares the already loaded data with the newly updated SOR data to find the differences.
    Once the delta is established, either new records get added or existing records get updated or existing records are marked as inactive (Deletes). Additions and Updates generally happen across multiple destination tables.
    Updates are identified by looking at different columns to see if any one column is changed. These columns end up in different tables.
    Example
    Source Delta Table, S1
    ID  COL1  COL2  COL3  ACTION
    1   abc   xyz   pqr   A
    2   bcd   lmn   def   U
    S1.Col1 maps to Destination Table D1.Col23
    S1.Col2 maps to Destination Table D2.Col45
    S1.Col3 maps to Destination Table D3.Col11
    Currently all tables are updated irrespective of whether the relevant data has changed or not (All 3 destination tables are updated).
    I would like to know which of the Columns for a given row has changed values so that I can update only the relevant tables.
    Thus if additional columns are available that act as flags
    Source Delta Table, S1
    ID  COL1  COL2  COL3  ACTION  COL1_FLAG  COL2_FLAG  COL3_FLAG
    1   abc   xyz   pqr   A       -          -          -
    2   bcd   lmn   def   U       N          Y          N
    3   kjh   qwe   iop   U       Y          Y          N
    then for incoming ID = 2 I just have to update destination table D2 and not D1 and D3,
    and for incoming ID = 3 I have to update destination tables D1 and D2 but not D3.
    How can I achieve that?
    This is mainly to improve performance, as the processing window is very short: the faster the delta processing, the better.
    Thanks in advance.

    Thanks for your response.
    My question was more about establishing what has changed.
    Given a table, which is updated daily, how does one efficiently establish which data has changed?
    Here is an example to clarify my question further
    The Source table has the following data on a particular day
    Data in Source table on Monday               
    ID     Col1     Col2     Col3
    1     abc     bcd     cde
    2     def     efg     fgh
    3     ghi     hij     ijk
    4     jkl     klm     lmn
    A copy of the above data is stored in an Old Data table.
    Data in Source table on Tuesday
    ID     Col1     Col2     Col3
    1      bac      bcd      cde
    2      def      gfe      hgf
    3      ghi      hij      ijk
    5      mno      nop      opq
    Data in Source Table is compared with data in Old Data Table
    Delta established by comparing Source Table with Old Data Table                    
    ID     Col1     Col2     Col3     Delta_Flag
    1      bac      bcd      cde      U
    2      def      gfe      hgf      U
    4                                 D
    5      mno      nop      opq      A
    Rows with IDs 1 & 2 were updated - thus to be updated
    Row with ID 3 - no change so not seen in delta
    Row with ID 4 was not found - thus to be deleted
    Row with ID 5 was new - To be added
    I can do the above easily. I would like to go a step further, to be able to say, for updates:
    Row with ID 1 has Col1 changed
    Row with ID 2 has Col2 and Col3 changed
    Is there an easy way to do this?
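    One possible approach, as a minimal sketch in Oracle SQL (SOURCE_T and OLD_DATA are assumed names for the source table and the old-data copy), is to full-outer-join the two snapshots and derive the action plus one change flag per column. DECODE is handy here because it treats two NULLs as equal:

        -- DECODE(a, b, 0, 1) yields 0 when a and b match (NULLs included), else 1.
        SELECT COALESCE(s.id, o.id) AS id,
               s.col1, s.col2, s.col3,
               CASE WHEN o.id IS NULL THEN 'A'
                    WHEN s.id IS NULL THEN 'D'
                    ELSE 'U'
               END AS action,
               DECODE(DECODE(s.col1, o.col1, 0, 1), 1, 'Y', 'N') AS col1_flag,
               DECODE(DECODE(s.col2, o.col2, 0, 1), 1, 'Y', 'N') AS col2_flag,
               DECODE(DECODE(s.col3, o.col3, 0, 1), 1, 'Y', 'N') AS col3_flag
        FROM   source_t s
               FULL OUTER JOIN old_data o ON o.id = s.id
        WHERE  o.id IS NULL OR s.id IS NULL        -- adds and deletes
           OR  DECODE(s.col1, o.col1, 0, 1) = 1    -- or any column changed
           OR  DECODE(s.col2, o.col2, 0, 1) = 1
           OR  DECODE(s.col3, o.col3, 0, 1) = 1;

    For ID 1 this yields action U with COL1_FLAG = 'Y' and the other flags 'N'; unchanged rows such as ID 3 drop out of the result, so only the relevant destination tables need to be touched.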

  • I_UPDMODE has no value in my Function Module when using Delta Extraction

    Help me please.
    My system is BW 3.52
    Please see the source code below and tell me why I_UPDMODE is not being passed a value. I have used "I_SOURCE" before, and its value is passed to I_DSOURCE. Can anyone tell me where the update mode is passed to?
    FUNCTION ZBWFN_TEST_DELTA.
    ""Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_RLOGSYS) TYPE  SRSC_S_INTERFACE-RLOGSYS OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_INTERFACE-READONLY OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZISU_ERCHC OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * This extractor is part of a delta scenario based on a timestamp
    * included in the fields of table ROVERCUBE1. The interesting part
    * takes place in form get_time_interval, where the date range is
    * calculated update-mode-specifically.
    * The pointer for the date up to which delta was extracted during
    * the last delta update is held in table ROBWQTSTAT.
      TABLES: ZISU_TP_ERCHC, ERCH, ERCHC.
    * Auxiliary Selection criteria structure
      DATA: L_S_SELECT TYPE SBIWA_S_SELECT.
      DATA: L_ERCHC LIKE ZISU_TP_ERCHC OCCURS 0 WITH HEADER LINE.
      DATA: L_DATE LIKE SY-DATUM,
                L_ACTUAL_DATE LIKE SY-DATUM,
                L_LAST_DATE LIKE SY-DATUM.
    * Maximum number of lines for DB table
      STATICS: L_MAXSIZE TYPE SBIWA_S_INTERFACE-MAXSIZE,
               BEGIN OF S_S_INTERFACE.
      INCLUDE TYPE SBIWA_S_INTERFACE.
      INCLUDE TYPE SRSC_S_INTERFACE.
      STATICS: END OF S_S_INTERFACE.
    STATICS: BEGIN OF S_R_TSTMP OCCURS 1,
                SIGN(1),
                OPTION(2),
                LOW  LIKE ROVERCUBE1-TSTMP,
                HIGH LIKE ROVERCUBE1-TSTMP,
              END   OF S_R_TSTMP.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Invalid second initialization call -> error exit
        IF NOT G_FLAG_INTERFACE_INITIALIZED IS INITIAL.
          IF 1 = 2. MESSAGE E008(R3). ENDIF.
          LOG_WRITE 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDIF.
    * Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZOVER_TRANS'.
          WHEN 'TEST_ROVERCUBE'.
          WHEN 'DO_DATASOURCE'.
          WHEN '0VER_DELTA_WITH_LONG_NAME'.
          WHEN '0VER_CUBE_OLD_LIS'.
          WHEN '0VER_TYPE_ATTR'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE            "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
    * Check for supported update mode
        CASE I_UPDMODE.
          WHEN 'F'.
          WHEN 'D'.
          WHEN 'C'.
          WHEN 'R'.
          WHEN 'S'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E011(R3). ENDIF.
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '011'                "message number
                      I_UPDMODE            "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
        APPEND LINES OF I_T_SELECT TO G_T_SELECT.
    * Fill parameter buffer for data extraction calls
        S_S_INTERFACE-REQUNR    = I_REQUNR.
        S_S_INTERFACE-ISOURCE   = I_DSOURCE.
        S_S_INTERFACE-MAXSIZE   = I_MAXSIZE.
        S_S_INTERFACE-INITFLAG  = I_INITFLAG.
        S_S_INTERFACE-UPDMODE   = I_UPDMODE.
        S_S_INTERFACE-RLOGSYS   = I_RLOGSYS.
        S_S_INTERFACE-READONLY  = I_READ_ONLY.
        G_FLAG_INTERFACE_INITIALIZED = SBIWA_C_FLAG_ON.
        APPEND LINES OF I_T_FIELDS TO G_T_FIELDS.
    * Here the time range for update modes concerning delta is calculated
    * and the status table is updated
        PERFORM GET_CAL_INTERVAL TABLES G_R_DELTA_DATE[]
                                 USING  S_S_INTERFACE-ISOURCE
                                        S_S_INTERFACE-UPDMODE
                                        S_S_INTERFACE-RLOGSYS.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: first call      calculate range tables for key fields
    *                                calculate date range due to update mode
    *                                OPEN CURSOR + FETCH
    *                following calls FETCH only
    * First data package -> OPEN CURSOR
        G_COUNTER_DATAPAKID = G_COUNTER_DATAPAKID + 1.
        IF G_COUNTER_DATAPAKID = 1.
    * Fill range tables.
         LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'COUNTRY'.
           MOVE-CORRESPONDING L_S_SELECT TO L_R_COUNTRY.
           APPEND L_R_COUNTRY.
         ENDLOOP.
         LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'REGION'.
           MOVE-CORRESPONDING L_S_SELECT TO L_R_REGION.
           APPEND L_R_REGION.
         ENDLOOP.
         LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'KUNNR'.
           MOVE-CORRESPONDING L_S_SELECT TO L_R_KUNNR.
           APPEND L_R_KUNNR.
         ENDLOOP.
         LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'TYPE'.
           MOVE-CORRESPONDING L_S_SELECT TO L_R_TYPE.
           APPEND L_R_TYPE.
         ENDLOOP.
         LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'GJAHR'.
           MOVE-CORRESPONDING L_S_SELECT TO L_R_GJAHR.
           APPEND L_R_GJAHR.
         ENDLOOP.
    * No data must be selected in Init simulation mode
          CHECK S_S_INTERFACE-UPDMODE NE SRSC_C_UPDMODE_INITSIMU.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE.
          L_MAXSIZE = G_S_INTERFACE-MAXSIZE.
          REFRESH: L_ERCHC.
          SELECT * FROM ERCH WHERE ERDAT IN G_R_DELTA_DATE
                                          OR AEDAT IN G_R_DELTA_DATE.
            SELECT SINGLE * FROM ERCHC WHERE BELNR = ERCH-BELNR.
            IF SY-SUBRC = 0.
              CLEAR: L_ERCHC.
              L_ERCHC-BUKRS = ERCH-BUKRS.
              L_ERCHC-ABRVORG = ERCH-ABRVORG.
              L_ERCHC-PORTION = ERCH-PORTION.
              L_ERCHC-GPARTNER = ERCH-GPARTNER.
              IF ERCHC-CPUDT IN G_R_DELTA_DATE.
                L_ERCHC-DELDT = ERCHC-CPUDT.
                L_ERCHC-DOCDT = ERCHC-BUDAT.
                L_ERCHC-RELNO = 1.
                COLLECT L_ERCHC.
              ENDIF.
              IF ERCHC-INTCPUDT IN G_R_DELTA_DATE AND
                 ERCHC-INTCPUDT IS NOT INITIAL.
                L_ERCHC-DELDT = ERCHC-INTCPUDT.
                L_ERCHC-DOCDT = ERCHC-INTBUDAT.
                L_ERCHC-REVNO = 1.
                COLLECT L_ERCHC.
              ENDIF.
            ENDIF.
          ENDSELECT.
          DELETE FROM ZISU_TP_ERCHC.
          LOOP AT L_ERCHC.
            MOVE-CORRESPONDING L_ERCHC TO ZISU_TP_ERCHC.
            INSERT ZISU_TP_ERCHC.
          ENDLOOP.
          OPEN CURSOR WITH HOLD G_CURSOR FOR
          SELECT * FROM ZISU_TP_ERCHC.
        ENDIF.                             "First data package ?
        IF S_S_INTERFACE-UPDMODE = SRSC_C_UPDMODE_INITSIMU.
          RAISE NO_MORE_DATA.
        ENDIF.
    * Fetch records into interface table.
        FETCH NEXT CURSOR G_CURSOR
                   APPENDING CORRESPONDING FIELDS OF TABLE E_T_DATA
                   PACKAGE SIZE  S_S_INTERFACE-MAXSIZE.
        IF SY-SUBRC <> 0.
          RAISE NO_MORE_DATA.
        ENDIF.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.

    Dave,
    1. You can fire SELECTs in an RFC as well, but in your case the data exists in System A and the RFC is in System B, so you can't do that. You can only fire SELECTs on tables in the same system.
    2. Quick example of two table loops - EKKO (HEADER) EKPO (ITEM).
    LOOP AT EKKO.
      LOOP AT EKPO WHERE EBELN = EKKO-EBELN.
      ENDLOOP.
    ENDLOOP.
    I hope this is clear now.
    Regards,
    Ravi

  • Pythagoras calculation

    I have a VI I am trying to write in LabVIEW for a project that really has me stuck. Basically, what I want to be able to do is calculate distance using Pythagoras, given a user-selected number of dimensions (up to 20). That means rather than traditionally just calculating Pythagoras using x and y, I now have x, y, z, and so on.
    Currently I am at the stage where I can send an array of integers, where each represents which dimension to use in the calculation, e.g. 1 is x, 2 is y, 3 is z, etc.
    What I have done is a VI for just x and y, which I have attached, but as for expanding this for z, alpha, beta, gamma, delta etc. I draw a blank. Any clues?
    Attachments:
    Pythagoras.vi (9 KB)

    LabVIEW will do primitive operations on arrays. For example, if you connect two one-dimensional arrays to the Add primitive, it will return an array that contains an element-by-element addition of the inputs. This makes your problem fairly easy. Instead of indexing individual points, index the entire location vectors of the two points you want to know the distance between. Subtract one from the other to get the differences. Square the resultant vector. Use the array sum to sum it, then take the square root. This approach will scale to however many dimensions you want.
    Good luck!

  • Calculating Life to date values

    Hi there,
    the users are looking to get life-to-date values for service orders. We can do this in the query (on the front end), but performance takes a big hit due to the volume of records retrieved, so we are looking to build this calculation on the back end. Has anyone here done something similar who could share their experience? To repeat, the users would like to see the life-to-date value, and this will contain any amounts that have been posted to this object. I was thinking about using a standard DSO for this solution and using the amounts-related cubes as the source for this DSO. The key figures (or data fields) in the DSO would be incremented by the delta loads from the underlying cubes.
    Thanks for sharing your thoughts or experience handling this issue.

    If I am getting you correctly, you are getting the following output (sample), where you see a few additional orders for which $costs is zero but which have a $costs life-to-date value, like 10002563.
    Order#          Fiscal period               $costs         $costs (life-to-date)
    10001111           001.2012                 5,000             50,000
    10001111           002.2012                 5,000             50,000
    10001112           001.2012                 6,000             30,000
    10001112           002.2012                 7,000             30,000
    10001112           003.2012                 7,000             30,000
    10001113           003.2012                 5,000             10,000
    10002563           001.2012                  0                   5000
    If this is the issue, then you can put a condition on the $costs key figure to be not equal to zero. That would solve the issue.
    I would request that you provide us the sample output with both the correct and incorrect orders so that we can understand the issue further.

  • Calculation of Safety Stock and Reorder Point under Forecast Model T

    Hi Gurus!
    Happy Holiday!
    I would like to ask for your assistance on how the safety stock and reorder point were calculated with the following values available. I would really appreciate it if you could give me the details of the calculation.
    Below are the values:
    Basic data
    Forecast date        01.12.2009        Unit                  CTN
    Forecast model       T                 Service level         0.0
    Period indicator     M                 Paramtr profile
    Control data
    Initialization                                Tracking limit        4.000
    Model selection      A                 Procedure selection   2
    Parameter optimizatio                Optimization level    F
    Alpha factor         0,10               Beta factor           0,10
    Gamma factor       0,00              Delta factor          0,30
    Basic value           5.464-           Trend value          5.603-
    MAD                      4.758            Error total              4.722
    Safety stock         1                   Reorder pnt.          1
    No. of values
    Consumption           6                Forecast periods       1
    Initial. periods      0                Fixed periods          0
    Periods per season   12
    Historical data
    Period                Original     Corrected value
    11.2009                3.000              3.000
    10.2009                0.000              0.000
    09.2009                0.000              0.000
    08.2009                9.000              9.000
    07.2009               21.000             21.000
    06.2009               20.300             20.300
    Forecast results
    Period                Original     Corrected value
    12.2009                0.000              0.000
    Appreciate your assistance!
    Thank you and Happy Holidays!
    Ji

    Sweth, you are asking for consulting, and in my opinion it is way beyond what can or should reasonably be achieved in such a forum. You are asking complex questions that most probably have more than one possible answer.
    I suggest that you get on-site help from a knowledgeable and experienced consultant. These are crucial business issues and should be dealt with seriously.

  • Calculations based on Summarized data in Cross Tab

    First off, I'm pretty experienced with Crystal.
    I've run across something that seems like it should be really easy to do, and the sort of thing you would expect to do in a cross tab... so maybe I'm missing something totally obvious.
    I'm doing a year over year comparison of some financial data broken down by month and by quarter.
    So, my Rows are Quarter, and a field called 'YEARMONTH' (calculated field, YYYYMM, for ease of sorting)
    My column is Year, and for summarized fields, I have the data field I'm interested in which is a float. Let's call it 'Dollars' for sake of argument.
    What I want to do is create a summary field (next to the total field) called 'Difference' or 'Delta' that calculates, for each row, the difference between my two years (2008 and 2009).
    The only solution I can come up with is to dummy in a record from the data source with a year value of 'Difference' and some other dummy values so that it will show up as a column on the cross tab, and then somehow use the currentrowcolumn function or some such creature to mask the output in the column. But now that I type it out here, I'm not sure it's going to work. I also don't think it's going to export the way I want it to, either.
    I'd really prefer it to be in a cross tab, and not in some manually created cross-tab emulation using manual running totals, but I'll go there if I have to.
    Thanks a ton!

    This is what I did in my report to get the difference. My cross tab looks like this:
                        2004     2005    Total
    Total              T1         T2         T
    USA               A          B          C
    INDIA              X          Y          Z
    Right-click on T1, go to Format Field, and write the suppress condition like this:
    numberVar d:=0;
    currencyVar array arr1;
    currencyVar array arr2;
    numberVar e;
    if GridRowColumnValue('year')=2004 then
    (e:=e+1;
    redim preserve arr1[e];
    arr1[e]:=CurrentFieldValue)
    else
    (e:=e+1;
    redim preserve arr2[e];
    arr2[e]:=CurrentFieldValue);
    false
    Right-click on T, go to Format Field, and write the display string condition like this:
    currencyVar array arr1;
    currencyVar array arr2;
    totext(arr1[1]-arr2[1])
    Right-click on A, go to Format Field, and write the suppress condition like this:
    currencyVar array four;
    currencyVar array five;
    numberVar d;
    if GridRowColumnValue('year')=2004 then
    (d:=d+1;
    redim preserve four[d];
    four[d]:=CurrentFieldValue)
    else
    (d:=d+1;
    redim preserve five[d];
    five[d]:=CurrentFieldValue);
    false
    Right-click on C, go to Format Field, and write the display string condition like this:
    currencyVar array four;
    currencyVar array five;
    numberVar g;
    g:=g+1;
    ToText(four[g]-five[g])
    Note: please select the option "Column totals on top" for the cross tab in Customize Style.
    Hope this helps!
    Raghavendra

  • Function Module Extractor with Delta

    I'm very new to working with the BW, so I hope I'm explaining my question in a way that makes sense.
    I need to make a BW extractor that pulls data from at least one table, possibly more, and which also calculates some data.
    I think the only way I can do that is by using an extractor with a function module.
    I've been looking at the examples in function group RSAX, and I think I sort of get it, except for one part.
    I'm really confused about how the delta works.
    I'm looking at the example: RSAX_BIW_GET_DATA_SIMPLE
    Does it even use a delta?
    I'm trying to understand how the delta gets passed, and how it gets used to filter the data in the function module.  I'm not sure if it's something I need to code in myself, or if it's something that BW does by itself, or what.
    I'm not sure if I'm making sense, but if anyone could help me, I'd appreciate it.

    Thanks guys, you've been super helpful so far, so I'm just going to keep asking questions.  I promise to award points at the end
    I'm looking at it now, and I think if I change it to read from a table, instead of from a function module, I can use the 'numeric pointer' option for the delta. (Does that just mean a number? For example, payroll run ID is a number, so I could use that and only get new payroll runs?)
    However, there still are some calculated fields they want.  Someone mentioned doing that on the BW side.  How would that work?
    Sorry for such vague questions, but any help is really appreciated.

  • CONVERT_TO_LOCAL_CURRENCY currency conversion errors in Delta mode

    Hello,
    I have been working with SAP via customer messages for a month now with no resolution.  As many of you have experienced, the SAP Level 1 support is pretty useless so I would like to check here to see if I can get some help.
    I found a thread where another user has had the exact same problem as I have. 
    Problem: Exchange rate in Update rules with DELTA mode
    However, there was no answer given to him.  So, if anyone can help, I appreciate it.  Here's the problem:
    When a record comes through 2LIS_13_VDITM (for example), it had a document currency of EUR, a local currency of PLN and a statistical currency of USD.  The first time the record comes through, it comes through just fine.  The exchange rate from EUR to PLN is 3.533.  The statistical currency value is calculated by taking the document currency value and multiplying it by the exchange rate from EUR to PLN to get the local currency.  Then, this local currency is converted to USD by multiplying itself by the PLN to USD exchange rate of 1.519.
    These are the exchange rates that come through as key figures on the transaction data. The problem is when a delta occurs. The data goes into the ODS, and once the update rules to the cube get it, the signs are backwards on the exchange rates. This makes sense, so that the exchange rate key figures will cancel themselves out, but the SAP code says that to calculate the local currency it must first take the document currency and divide by the exchange rate. It should still be multiplying, because the exchange rate from EUR to PLN is still 3.533. The negative sign on the key figure that is being zeroed out tells the code to divide, though.
    You cannot simply set the reversal indicator, because the values that do come through OK would then be erroneous, as the previous poster who had the same issue found. This is really causing some bad data for us. This is SAP-delivered code. Any help would be appreciated.
    Thanks,
    Brent

    Hi,
    Check out these currency conversion links:
    Need Urgent Help on Currency Conversion Routine
    Currency conversion using transformation rule
    http://help.sap.com/saphelp_nw04/helpdata/en/bc/058f4073b2f30ee10000000a155106/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/16/088f4073b2f30ee10000000a155106/frameset.htm
    Ramesh

  • Negative values in report for delta

    Hello,
    I am getting negative values for service-level report metrics in delta loading. The interesting thing is that the metrics give correct values for almost all orders, but for around 40 orders I am getting negative values (-100% where the correct value should be 100%; it is only for around 40 records, the other records are correct). If I do the full initialization again it brings the correct positive values, but I again get some negative values when I run new delta loads.
    Please give your suggestions or comments on this. I will assign points to all answers and full points to the right answer.
    If you need any further information, please let me know.
    Hitesh

    Hello Manga, Kedar and Eric,
    Thanks for your quick replies,
    Manga, I am checking on R/3 how the orders were processed; thanks for your suggestion.
    Kedar, yes, these are custom extractors using 2lis_11_v_ssl and 2lis_11_vaitm, going to an ODS and then to an InfoCube, and I am reporting on the InfoCube. Can you please specify in detail what exactly I should check on the extractor side? Do I need to check anything specific at the first-level ODS?
    These metric calculations are done using RKFs and CKFs in Query Designer, and those are all correct.
    Please advise.
    Any other suggestions/ comments are most welcome
    Note: I have assigned points to all the answers
    Hitesh

  • -ve values when using record mode in ODS delta

    Hi
    We have an ODS where we are calculating the number of items using an update routine. I'm not sure if the code below is correct.
    I'm getting negative values in my item count for delta loads from the ODS to the cube. The ODS (active) data looks correct, but the cube data is showing a negative item count. I guess I should remove the code below that checks the record mode ( if COMM_STRUCTURE-recordmode EQ 'X' ) from the ODS update rules to count the number of items correctly. Can anyone please go through the code below and advise?
      IF COMM_STRUCTURE-PROCESSKEY = '001'.   "Purchase Order
        RESULT = 1.
        IF COMM_STRUCTURE-RECORDMODE EQ 'X'.
          RESULT = -1.
        ENDIF.
      ELSE.
        RESULT = 0.
      ENDIF.
    Thanks
    Sam

    Thanks very much for the blog, but I already knew this, as I checked it before. Now my issue is that our logic is bringing negative values into the InfoCube... Here is what I have in the ODS active table, change log and cube. I want to make the cube result the same as the ODS active data:
    Active:
      OI_EBELN   OI_EBELP DOC_CAT COMP_CODE PUR_GROUP  ORDER_VAL PO_ITEMS
      4700020064       10 F       1004      907       10.000,00    1,000            
    Changelog:
       OI_EBELN   OI_EBELP DOC_CAT COMP_CODE PUR_GROUP    ORDER_VAL PO_ITEMS
                                                     20.000,00    2,000              2              0
    Infocube:
    COMP_CODE PUR_GROUP    ORDER_VAL PO_ITEMS
              10.000,00     6,000-     0     1
