Best practice for comparing a measure with historic data?

Hello everyone,
we are currently trying to compare a measure for a certain week with the same measure for the weeks before, to build a condition such as: "If the number of alerts in the 3 previous weeks > 100 then 1 else 0 [for this specific week]".
However, we are struggling because Measures and Dimensions seem to be strictly separated from each other: "number of alerts" would be a measure, "week" would be a dimension. When creating a measure, only other measures are available in the formula creator; when creating a dimension, only other dimensions. As we understand it, we would need a formula that mixes measures and dimensions.
Any suggestions?
Thanks in advance!
Marvin

Thank you for your fast reply. That's a good thought; however, I currently don't see a possibility for aggregations in the formula of a dimension, so the logic "if number of alerts in the previous 3 weeks > 100 then 1 else 0" still eludes me.
The values in my table are at alert level, but for my measure I have to aggregate them at the level of the object where the alert occurred (a dimension).
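Tool specifics aside, the week-window logic Marvin describes is simple; here is a hypothetical sketch of it in plain Java (the week indices, alert counts, and the threshold of 100 are illustrative assumptions, not taken from any particular BI tool):

```java
public class AlertFlag {
    // Returns 1 if the alert counts of the 3 weeks before `week` sum to more
    // than `threshold`, else 0.
    static int flagForWeek(int[] alertsPerWeek, int week, int threshold) {
        int sum = 0;
        for (int w = week - 3; w < week; w++) {
            if (w >= 0) sum += alertsPerWeek[w];
        }
        return sum > threshold ? 1 : 0;
    }

    public static void main(String[] args) {
        // Alerts per calendar week; index = week number.
        int[] alerts = {40, 50, 30, 10, 5};
        for (int week = 0; week < alerts.length; week++) {
            System.out.println("week " + week + " flag=" + flagForWeek(alerts, week, 100));
        }
    }
}
```

In most BI tools this would live in a calculated measure using a time-window or period-offset function, not in a dimension formula.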

Similar Messages

  • Compare measures by dynamic Date filter.

    Hello,
I have a Date dimension with the hierarchy Year.Quarter.Month and a measure Amount.
I would like to know if it is possible to compare measures at the selected Date dimension level with the values from 1 year before.
    For example there are values
    Month Amount
    2012-01 10
    2012-02 20
    2012-03 30
    2013-01 40
    2013-02 50
    2013-03 60
When I browse the cube I need to create a calculated member (Amount_last_period) like the following:
Month    Amount    Amount_last_period
2012-01  10
2012-02  20
2012-03  30
2013-01  40        10
2013-02  50        20
2013-03  60        30
I need to do it dynamically: if I choose to output amounts by quarter, it must compare quarterly data.
    Any help or thoughts would be helpful.

    Hi VB,
You can use ParallelPeriod in combination with CurrentMember.
    An example from the AW cube:
WITH
MEMBER [Measures].[Prior year Reseller Sales Amount] AS
  (
    ParallelPeriod(
      [Date].[Calendar].[Calendar Year],
      1,
      [Date].[Calendar].CurrentMember
    ),
    [Measures].[Reseller Sales Amount]
  )
  , FORMAT_STRING = "Currency"
SELECT
  { [Measures].[Reseller Sales Amount],
    [Measures].[Prior year Reseller Sales Amount]
  } ON COLUMNS,
  { [Date].[Calendar].[Calendar Quarter].MEMBERS
  } ON ROWS
FROM [Adventure Works]
    Philip,

  • Populated the new field with historic data

    HI,
I have data for 2 years. Now I am enhancing the data source; how can I populate the new field with historic data? Is it by
1) deleting all the data on the BW side, then doing an init and setting up a regular delta, or
2) running a repair full request with a selection condition?
Or is there any other option available? Which is the best scenario to load this historic data?
    Regards,
    Ravi

    Hi,
I think your DataSource is already in production, and you want data only from today onwards for the enhanced fields, i.e. historical data is not required for the enhanced fields.
1. Fix ECC downtime for 20 to 30 minutes.
2. Keep all objects in the QA systems in ECC and BW.
3. Run delta loads in BW 2 to 3 times. With this step you can clear SMQ1 and RSA7. Check the entries in RSA7; if it is ZERO then it is fine.
4. Move the DataSource from ECC QA to ECC PROD.
5. Replicate in BW.
6. Move all BW objects from BW QA to BW PROD.
7. Delete the init load at InfoPackage level (not in the ODS/Cube).
8. Load an init without data transfer.
9. Then run the delta.
10. From the next day onwards deltas will come as usual.
If you need historical data as well:
1. Delete the data in the cube.
2. Fix downtime, load an init, then run deltas.
Check SAP Note 328181 - Changes to extraction structures in Customizing Cockpit
    Thanks
    Reddy

Best practice in SAP BW master data management and transport

Hi SAP BW gurus,
I would like to know the best practice for SAP BW master data transport. For example, if I updated my attributes in development, which BW objects are strictly required for the transport?
    Appreciate advice.
    Thank you,
    Eric

    Hi Vishnu,
Thanks for the reply, but that answer may be more suitable if I were implementing a new BW system. What I'm looking for is daily operational maintenance and transport (a BW system that went live a while ago).
    Regards,
    Eric

  • How to fill a new single field in a Infocube with historical data

    Hello Everybody,
    We have an SD infocube with historical data since 1997.
    Some of the infoobjects (fields) of the infocube were empty during all this time.
    Now we require to fill a single field of the infocube with historical data from R/3.
    We were thinking that an option could be to upload data from the PSA in order to fill only the required field (infoobject).
Is it possible? Is there any problem with uploading from the PSA requests directly to the InfoCube?
Some people on our team think the data may be duplicated... are they right?
    Which other solutions can we adopt to solve this issue?
    We will appreciate all your valuable help.
    Thanks in advance.
    Regards.
    Julio Cordero.

    Remodeling in BI 7:
    /people/mallikarjuna.reddy7/blog/2007/02/06/remodeling-in-nw-bi-2004s
    http://www.bridgeport.edu/sed/projects/cs597/Fall_2003/vijaykse/step_by_step.htm
    Hope it helps..

  • Non-cumulative initialization is not possible with historical data

    Hi all,
While loading the inventory cube 0IC_C03: the 2LIS_03_BX request finishes successfully, but when I compress that request with "No Marker Update" I get the following error message:
Non-cumulative initialization is not possible with historical data
    Regards
    siva.

Are you sure you didn't use BF instead of BX? This message indicates that you have historical data in your cube, whereas BX only loads a fixed state at a specific point in time... Or maybe you initialized BX twice in R/3 without deleting the previous init; I don't know if that could cause this error, but it's another possibility.

  • Simple question: validate infocube data by comparing it with PSA data

    Hi all,
I want to validate InfoCube data by comparing it with PSA data. For that I went to the PSA via the context menu and then generated an export DataSource. But now I am not able to find the PSA export DataSource. Where can I find it?
    Thanks
    raju

    Hi all,
I've tried to validate InfoCube data by comparing it with PSA data, following the step-by-step guide in the HOW TO document.
The problem is that I get the following error when I activate the transfer rules after adding the InfoObject 0TCTREQUID, and I don't know how to resolve it:
    " IDoc segment /BIC/CICA7ZBW_RECEP_DI could not be assigned to IDoc ZSCA002
    Message no. RSAR240
    Diagnosis
    The transfer structure is assigned to IDoc ZSCA002 as an IDoc segment for the data transfer between source system and Business Information Warehouse.
    This assignment failed.
    System response
    IDoc segment /BIC/CICA7ZBW_RECEP_DI not known to IDoc V2&. Therefore, no data import can take place from the source system for this InfoSource."
Any idea what could have happened?
    Thanks in advance,
    Diego

  • Compare Hiredate with Current date in PCR

As per my requirement, a certain type of accrual needs to occur on the employee's anniversary date, and this accrual should only happen on the first anniversary. How can I compare the hire date with the current date ignoring the year, and then skip this logic from the second year onwards?
Can someone please help me with a possible method of implementing this requirement in a time PCR.

In my humble opinion, you could achieve that with an ABAP program scheduled to run every day: compare each employee's hiring date with the current date, then create a batch input for Infotype 2013... it would be easier to track and monitor, since you would have all the logs in SM35.
    Best regards,
    Federico.
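Whether this ends up in a PCR or an ABAP report, the date test itself is easy to get wrong; here is a hypothetical sketch of it in Java (the class and method names are mine, and the hire date is made up), flagging exactly the first anniversary by comparing month and day while ignoring the year:

```java
import java.time.LocalDate;
import java.time.MonthDay;

public class AnniversaryCheck {
    // True only on the first anniversary: same month/day, exactly one year after hire.
    static boolean isFirstAnniversary(LocalDate hireDate, LocalDate today) {
        return MonthDay.from(today).equals(MonthDay.from(hireDate))
                && today.getYear() == hireDate.getYear() + 1;
    }

    public static void main(String[] args) {
        LocalDate hire = LocalDate.of(2023, 6, 15);
        System.out.println(isFirstAnniversary(hire, LocalDate.of(2024, 6, 15))); // first anniversary
        System.out.println(isFirstAnniversary(hire, LocalDate.of(2025, 6, 15))); // second anniversary
    }
}
```

Note that hire dates falling on Feb 29 would need a convention of their own; the sketch ignores that edge case.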

  • Formula Created in BI Query appears in Universe as Measure with No Data

    Hi,
    I have created the universe on top of SAP BI Query(Which is built on Infoset).
    There are some formulas created in BI Query as mentioned below:
Eg: Status1 =
        If Completion Code = null       then Status1 = 0
        If Completion Code = 10, 11, 18 then Status1 = 1
    Status2 =
        If Status1 = 1 and Field Completion Date <= Regulatory Due Date then Status2 = 3
        If Status1 = 0 and Report Date <= Regulatory Due Date           then Status2 = 4
In the Universe I get Status1 and Status2 as measures.
When I use these measure objects in a WebI report, I don't see any data for them: both the Status1 and Status2 columns appear blank, although I do get data for both in the SAP BI Query.
    Is there any issue with the formulas to be used in SAP BI Query?
    Are Formulas supported in Business Objects from SAP BI Query?
    regards,
    Nisha

    Hi Ingo,
    I tried running the standard test MDX in MDXTEST and I got the data for those calculations.
But I wonder why there is no data in WebI for those formulas (key figures)?
    In Standard test MDX the MDX Query is as Follows:
    SELECT
    [Measures].MEMBERS ON COLUMNS,
    NON EMPTY [Z_WM_IS01___F98].[LEVEL01].MEMBERS ON ROWS
    FROM ZWM_M02/Z_ZWM_M02_Q001 SAP VARIABLES
    [!V000001] INCLUDING [Z_WM_IS01___F15].[3]
    Based on above Query I created the WebI report which includes the objects as described below:
    one Dimension Object Notification Number which is equivalent to [Z_WM_IS01___F98].[LEVEL01].MEMBERS  from above query.
    Selected All the measures objects available in query which refers to [Measures].MEMBERS
    And  one Prompt on Region which is equivalent to [!V000001] INCLUDING [Z_WM_IS01___F15].[3]
Why is there no data for the calculated columns (key figures Status1 and Status2) in the WebI report?

Inventory 0IC_C03, issue with historical data (data before stock initialization)

    Hi Experts,
For our Inventory Management implementation we proceeded as below.
The initialization data and the delta records match the ECC MB5B data, but the historical data (2007 and 2008, up to the date before initialization) does not match the stock on the ECC side; it shows only the difference quantity between the stock initialization and the date of the query.
We have done all the initial settings in BF11 and the process keys, and filled the setup tables for the BX and BF DataSources; we are not using the UM DataSource.
1. We loaded the BX data and compressed the request (without a tick mark at "No Marker Update").
2. We initialized the BF data and compressed the request (with a tick mark at "No Marker Update").
3. For deltas we are compressing the requests daily (without a tick mark at "No Marker Update").
Is this the correct process?
And, as you mentioned, for BX there is no need to compress (should we not compress the BX request?),
and do we need to compress the delta requests?
We have an issue with historical data validation; here is an example:
We initialized on May 5th, 2009, and loaded BX data from 2007 (historical data).
When we look at the data for January 1st, 2007, the BI side shows a value with a negative sign, while ECC shows a different value.
For example, ECC stock on January 1st, 2007: 1500 KG.
Stock at initialization, May 5th, 2009: 2200 KG.
On the BI side it shows: -700 KG.
2200 + (-700) = 1500,
but on the BI side it is not showing 1500 KG
(it shows values as negative with reference to the initialization stock).
Can you please tell us whether this process is correct, or whether we did something wrong in the data loading?
In the validity table (L table) there are 2 records, with SID values 0 and -1; is this correct?
    thanks in advance.
    Regards,
    Daya Sagar
    Edited by: Daya Sagar on May 18, 2009 2:49 PM
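Just to pin down the arithmetic in Daya's example (the figures are from the post; reading the BI value as a delta relative to the marker stock is an assumption on my part, not confirmed behaviour):

```java
public class StockArithmetic {
    public static void main(String[] args) {
        int markerStockKg = 2200;  // stock at initialization, May 5th 2009
        int biDisplayedKg = -700;  // value shown on the BI side for Jan 1st 2007
        int eccStockKg = 1500;     // stock reported by ECC for Jan 1st 2007

        // If the BI value is a delta against the marker, the historical stock is:
        int reconstructedKg = markerStockKg + biDisplayedKg;
        System.out.println("reconstructed: " + reconstructedKg + " KG");
        System.out.println("matches ECC: " + (reconstructedKg == eccStockKg));
    }
}
```

So the underlying numbers are mutually consistent; the open question in the thread is why the query renders the delta instead of the reconstructed stock, which is what the compression and marker-update settings are supposed to ensure.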

    Hi Anil,
    Thanks for your reply.
    1. You have performed the initialization on 15th May 2009.
    yes
    2. For the data after the stock initialization, I believe that you have either performed a full load from BF data source for the data 16th May 2009 onwards or you have not loaded any data after 15th May 2009.
for BF, the delta data after stock initialization was compressed with the marker update option unchecked.
    If this is the case, then I think you need to
    1. Load the data on 15th May (from BF data source) separately.
do you mean that the BF (material movements) data for 15th May should be compressed with the No Marker Update option unchecked, as we do for the BX DataSource?
    2. Compress it with the No Marker Update option unchecked.
    3. Check the report for data on 1st Jan 2007 after this. If this is correct, then all the history data will also be correct.
    After this you can perform a full load till date
here, does "till date" mean that May 15th is not included?
    for the data after stock initialization and then start the delta process. The data after the stock initialization(after 15th May 2009) should also be correct.
    can you please clarify these doubts?
    Thanks
    Edited by: Daya Sagar on May 20, 2009 10:20 AM

  • How to convert a string to date and then compare it with todays date???

    Hello.
    I want to set a format first for my dates
DateFormat df = new SimpleDateFormat("yyyy-mm-dd");
once this is done then I want to convert any string to a date object in the above format
String str = "2001-07-19";
Date d = null;
try {
    d = df.parse(str);
} catch (ParseException pe) {
    pe.printStackTrace();
}
First of all there is something wrong above, because what I get for this is
Fri Jan 19 00:07:00 MST 2001
whereas it should have been
2001-07-19... to my understanding.
    once this part is done I need to get current date in the above set format and compare the
    current date and the date I set.
    I will appreciate the help.
    Thanks

for the output part:
a date is a point in time;
the output depends on the format you specify for the output,
using for example a SimpleDateFormat.
You only specified the format for parsing (which is independent of the output format), so Java uses its default format ... see the DateFormat.format() method for details.
for the comparison stuff, I just posted a little code snippet in this forum a few minutes ago.
the hint is: Date.getTime() returns milliseconds since a fixed date
    hth Spieler
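Putting Spieler's two hints together, here is a self-contained version of the snippet with the actual bug fixed: the pattern must be yyyy-MM-dd, because capital MM means month while lowercase mm means minutes, which is exactly why the original printed Jan and 00:07:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateCompare {
    public static void main(String[] args) throws ParseException {
        // MM = month of year; the original "yyyy-mm-dd" parsed 07 as minutes.
        SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");
        Date parsed = df.parse("2001-07-19");
        Date today = new Date();

        // Use the same formatter for output; otherwise Date.toString()'s
        // default format ("Fri Jan 19 ...") is what gets printed.
        System.out.println("parsed: " + df.format(parsed));
        System.out.println("today : " + df.format(today));

        // Compare as points in time, via milliseconds since the epoch.
        if (parsed.getTime() < today.getTime()) {
            System.out.println("parsed date is before today");
        }
    }
}
```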

Best Practices: Repurposing videos with Captivate and exporting as SCORM

I'm new to Captivate and SCORM. I have a series of 6 basic video lessons (MP4s) which make up one course. Learners must complete (pass a summary quiz for) each lesson before they can move on to the next. We want to repurpose these files to create a more immersive UX. I'm looking for "best practices" suggestions for this work.
    Currently the plan looks like this:
    Re-edit existing videos into sets of smaller clips.
    Create Captivate project/s using the edited clips and adding other Captivate tools/features e.g. navigation, quizzes etc.
    The Captivate project/s will be exported/saved as SCORM package/s.
    We will up-load the SCORM package/s to Moodle.
    I'm very new to this, I've not yet created a SCORM-based Moodle project, and am not clear on how they "work," but that's for the Moodle forums.
Should I create one large "Course" with Captivate that contains all of the content from all 6 MP4s? This would require us to rely on Captivate's functions to control access to each lesson. Each lesson has a summary quiz; learners must pass each quiz before they can move to the next lesson. This one SCORM package would then be imported into one Moodle course.
    Should I create six smaller "Lessons" with Captivate that contain only the content from one MP4?  This will require us to rely on Captivate's functions to control progress through only one lesson.  The summary quiz will be 'administered' by Captivate/SCORM.  When the learner passes the summary quiz the lesson is complete.  We will rely on Moodle's access control functionality to then provide access to the next SCORM package.
If anyone has experience with this sort of project, or can offer some advice, I would appreciate it.
    Thanks
    Larry

    While I cannot claim enough expertise to state these are best practices, I offer them as food for thought as you work out your implementation. We use Moodle and Captivate in combination. We do kindergarten to grade 12 curriculum and we realized that there is a huge number of lessons that can be repurposed for multiple grade levels and we also have more than one copy of many of our courses.
    What we decided to do was to use what would be equivalent to a repository for the lessons portions. These have interactives in them for practice of the new concept but they are not scored. Our many courses that use a particular lesson can then link to the one instance of the lesson from the repository. This saves us a lot of disk space, development time in new course builds, and if we need to fix or update the lesson we merely have to do it once. Those are huge benefits.
    Anything that will score to the grade book runs inside a course so the SCORMs can 'talk' to the course the student is in, so homework, quizzes, and tests are housed inside Moodle courses.
    We can place links to the repository lessons easily into the SCORM final landing pages of after the pretest or provide the ability to link to the lesson as feedback if a student is having trouble with a particular question.
We are also starting a project of building interactive walkthroughs for each problem, which will be housed in the repository. If students get stuck, they can be linked out to the walkthrough. This is a great way to do these, because their homework will not have to be weighed down with a walkthrough for every problem. That makes the homework faster to load, yet when students do need a walkthrough we can link them out from the SCORM to deep, interactive added support for just what they need.
So far, this setup has worked well for us. We are not expecting any major changes, such as a new domain name for the repository. If your repository's domain name or server structure is likely to be in flux, then this setup is not ideal.

  • Update GL accounts (historical data) with new Business Area values

    Hi experts,
    Situation:
One company code with multiple plants is already productive without Business Area.
Business Area will be implemented soon to internally differentiate the company code.
What is needed:
To update the GL account balances with the proper Business Area values, as required, for instance starting in a new year. Something like executing a balance carry forward and being able to add the business area.
Purpose:
For internal analysis and reporting, to be able to compare data without having to deal with historical data that has a blank Business Area field.
Question:
Does anybody know if a tool or program is available to do so?
    Your responses are greatly appreciated.
    Thanks,
    GG

    Hi,
this message cannot be changed (OBA5), nor is a new analysis period for the cost center helpful if any plan/actual data has already been posted in this fiscal year.
If it is not only test data (which could be deleted so the change can be made), the only way is to add the business area in the cost center master by maintaining the CSKS table directly, if you are allowed to do so...
If there are balance sheet items (e.g. assets assigned to such a cost center; the balance sheet postings from asset accounting are posted with the business area from the cost center master), you have to repost them in GL accounting, otherwise the business-area-wise balance sheet will be wrong.
    Best regards, Christian
    Edited by: Christian Ortner on Mar 23, 2010 12:15 PM

  • Need help with enhanced data source in Production system

    Hello Gurus,
1. I enhanced a DataSource in BW and populated the field via a customer exit using CMOD. In the Dev system I don't have much data, so I deleted all the data and did a full load.
What should I do on the production side so that the delta won't be affected? Since production has millions of records, we won't do a full load. What is the best way to populate the field in production, after transporting the DataSource, without disturbing the deltas, so that the new field is filled for previous years' data?
    2.  can we put 0customer and 0material in the same dimension?? how its going to affect the performance?
    Thanks in advance.,
    Best Regards,
    Pavan

    Hi,
    Please see this
    1.
    see this thread
    populated the new field with historic data
    2. can we put 0customer and 0material in the same dimension?? how its going to affect the performance?
It's better not to put them in a single dimension, because one customer can take more than one material: if you have 100 customers and 1,000 materials, the combination can generate a large number of records. It's always better to keep characteristics with a 1:N relationship in one dimension; in your case customer and material have an M:N relationship, which will result in slow performance.
    Regards,
    Ravi
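A back-of-the-envelope illustration of Ravi's point (the counts 100 and 1,000 come from the reply; treating the worst case as the full cross product is a simplifying assumption):

```java
public class DimensionCardinality {
    public static void main(String[] args) {
        long customers = 100;
        long materials = 1_000;

        // Separate dimensions: each dimension table stays small.
        long separateRows = customers + materials;

        // One shared dimension for an M:N pair: worst case is every combination.
        long combinedRowsWorstCase = customers * materials;

        System.out.println("separate dimensions: " + separateRows + " rows");
        System.out.println("combined worst case: " + combinedRowsWorstCase + " rows");
    }
}
```

The join cost against a 100,000-row dimension table, versus two tables of 100 and 1,000 rows, is where the slowdown Ravi mentions comes from.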

  • AP historical data

    Dear experts,
My client wants to bring at least one year of historical data from the legacy system into SAP to prevent creating duplicate invoices. E.g. an invoice may already have been created and paid in legacy, and they would like to have visibility into that.
Also, for 1099 reporting purposes: the client is going live in the middle of the year and wants to bring the data from legacy into SAP so that they can do their 1099s in one single place.
Can someone explain how other clients are handling this and what the best practice is for bringing in the historical data?

    Hi,
If your client wants 1 year of data entered in SAP, then you have to upload all the invoices and payments separately in SAP, with the offsetting GL being an initial-upload account; after uploading, you have to manually clear the documents that were already cleared in the legacy system.
Example
Invoice
  Purchase A/c            Dr
    To Initial Upload A/c    Cr
  Initial Upload A/c      Dr
    To Vendor A/c            Cr
Payment
  Vendor A/c              Dr
    To Initial Upload A/c    Cr
  Initial Upload A/c      Dr
    To Bank A/c              Cr
Now you have to clear all the documents manually.
Hope this solves your issue.
    Regards,
    Shayam
    Edited by: Shayam_210 on Aug 25, 2011 6:35 AM
