Time series inconsistency due to change in Fiscal variant/Storage bucket

Hi All,
We have run into a situation where the time series consistency check on a DP planning area fails with a message pointing to 'changes to the fiscal year variant'. We cannot trace any change to the fiscal year variant, though it is possible that the storage bucket horizon was changed. Is there a way to correct/synchronize this by running a report?
We are trying to avoid re-initialization, although it is an option, for the obvious reason that the production data would first have to be backed up. Is there an alternative solution to fix this inconsistency?
Cheers!

Dear Ashok,
You should never change a FYV while it is used in a storage bucket profile that is assigned to a planning area for which time series objects have already been created. It is no problem to maintain the FYV for an additional time range, but periods that are actively used must never be changed or deleted.
If you want to make any changes to the buckets in your FYV, or delete buckets, you should always make a COMPLETE backup of ALL your data from the planning area into an InfoCube, then delete the time series objects for the planning area (and with them all the data in liveCache), and only then change the FYV. After that, you can create the time series objects again and reload the data from the backup InfoCube into the planning area. If you follow these steps, you will not risk losing any data: the data in the InfoCube is the backup you can reload.
As some processes check the FYV some time before and after it is used, it is strongly recommended to maintain the FYV at least 2-3 years into the past and into the future. For example, if you create time series objects from 2001 onwards, you should maintain your FYV at least back to 1999, and if you create time series objects up to 2006, you should maintain your FYV at least until 2008. You may never experience problems without maintaining the FYV over this extra range, but some processes perform additional checks, and then problems can occur.
Regards,
Rakesh
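The 2-3 year buffer rule above can be expressed as a simple pre-flight check. The sketch below is a hypothetical helper for illustration only, not an SAP API:

```python
# Pre-flight check: is the maintained FYV range wide enough to cover
# the time series horizon plus a safety buffer, as recommended above?
# Hypothetical helper for illustration -- not an SAP API.

def fyv_covers_horizon(fyv_first_year, fyv_last_year,
                       ts_first_year, ts_last_year, buffer_years=2):
    """True if the fiscal year variant is maintained at least
    `buffer_years` beyond the time series horizon on both sides."""
    return (fyv_first_year <= ts_first_year - buffer_years and
            fyv_last_year >= ts_last_year + buffer_years)

# Time series from 2001 to 2006 -> FYV should span at least 1999..2008.
print(fyv_covers_horizon(1999, 2008, 2001, 2006))  # True
print(fyv_covers_horizon(2001, 2006, 2001, 2006))  # False: no buffer
```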

Similar Messages

  • Bad performance due to the use of AGO and TO_DATE time series functions

    Hi all,
I'm building an OBI EE project on top of a 1 TB DW, and I'm facing major performance problems due to the use of the AGO and TO_DATE time series functions in some of the metrics included in the reports. I discovered that when a report with one of those metrics is submitted to the DB, the resulting query/explain plan is just awful: apparently OBI EE asks the DB to send everything it needs so it can do the calculations itself. The CPU cost goes through the roof!
I've tried new indexes, updated statistics, and MVs, but the result remains the same, i.e., if you use AGO or TO_DATE in a report, you get lousy query times...
    Please advise, if you have come across the same problem.
    Thanks in advance.

    Nico,
    Combining the solution to view the data in dense form (http://gerardnico.com/wiki/dat/obiee/bi_server/design/obiee_densification_design_preservation_dimension), and the use of the lag function (http://gerardnico.com/wiki/dat/obiee/presentation_service/obiee_period_to_period_lag_lead_function) appears to be the best solution for us.
    Thanks very much.
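The lag approach on densified data can be sketched outside OBIEE as well. The snippet below uses plain Python to stand in for the SQL LAG logic; column names are illustrative assumptions:

```python
# Period-over-period comparison via a lag over dense (gap-free) rows,
# mimicking what the SQL LAG window function computes.
# Illustrative sketch -- row/column names are assumptions.

def add_lag(rows, key):
    """rows: list of dicts sorted by period; returns copies with an
    added 'prev_<key>' taken from the previous row (None for the
    first period, where no prior value exists)."""
    out = []
    prev = None
    for r in rows:
        out.append(dict(r, **{f"prev_{key}": prev}))
        prev = r[key]
    return out

sales = [{"period": "2011-01", "amount": 100},
         {"period": "2011-02", "amount": 120},
         {"period": "2011-03", "amount": 90}]
for row in add_lag(sales, "amount"):
    print(row["period"], row["amount"], row["prev_amount"])
```

On dense data, each period is guaranteed to have a predecessor row, which is exactly why the densification step matters before applying the lag.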

  • Time Series initialization dates with fiscal periods!

    Dear Experts
    Problem:
    I cannot initialize planning area.
    Period YYYY00X is invalid for periodicity P FYV
    Configuration
    Storage Buckets Profile: Week, Month, Quarter, Year, Post Period FYV all checked.
    Horizon
    Start: 24.02.2013
    End: 31.12.2104
    No Time stream defined.
Fiscal Year Variant (maintained period end dates per year):

2012:  Month   1   2   3   4   5   6   7   8   9  10  11  12
       Edate  28  25  31  28  26  30  28  25  29  27  24  31
       FP      1   2   3   4   5   6   7   8   9  10  11  12

2013:  Month   1   2   3   4   5   6   7   8   9  10  11  12
       Edate  26  23  30  27  25  29  27  24  28  26  23  31
       FP      1   2   3   4   5   6   7   8   9  10  11  12

2014:  Month   1   2   3   4   5   6   7   8   9  10  11  12
       Edate  25  22  29  26  24  28  26  23  27  25  22  31
       FP      1   2   3   4   5   6   7   8   9  10  11  12

2015:  Month   1   2   4   5   5   7   8   8  10  10  11  12
       Edate  31  28   4   2  30   4   1  29   3  31  28  31
       FP      1   2   3   4   5   6   7   8   9  10  11  12
Question
What dates should I enter for the planning area initialization start and end to initialize for the maximum duration, given the settings above?
I tried a few dozen combinations, but none was accepted. To start with, I tried the same dates as in the horizon of the storage bucket profile. But from the error text alone I cannot work out which dates I am expected to enter for time series creation.
    Thanks
    BS

    Thanks Mitesh,
No, it's 2014.
    Here is what worked.
    Storage Bucket Horizon
    Start: 24.02.2013
    End: 22.11.2014
    Time Series Initialization
    Start: 01.02.2013
    End Date: 31.12.2014
    The fiscal year variant is what I pasted above.
I thought time series could only be initialized for a subset of the period in the storage bucket profile! This is my first experience of an initialization period being larger than the storage horizon. Is that really the case? I have always gone by the book, and this was purely a chance discovery.
I was aware of the limitation from SAP Notes that the fiscal year variant must be maintained for one more year on either side to be usable for initialization, hence the range of dates I tried.
    Appreciate your comments on this.
    All dates in dd/mm/yyyy.
    Thanks
    BS
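One way to get a feel for the "Period YYYY00X is invalid for periodicity P FYV" error is to look up a date against the maintained period ends: any date beyond the maintained range has no fiscal period to map to. The lookup below is a hypothetical illustration, not SAP's algorithm:

```python
from bisect import bisect_left
from datetime import date

def fiscal_period(d, period_ends):
    """Map a calendar date to (fiscal_year, FP) using maintained period
    end dates; raise if the date lies outside the maintained range,
    which is roughly what the 'Period ... is invalid' error signals.
    Hypothetical lookup for illustration -- not SAP's algorithm."""
    for year in sorted(period_ends):
        ends = period_ends[year]
        if d <= ends[-1]:
            # First period whose (inclusive) end date is >= d.
            return year, bisect_left(ends, d) + 1
    raise ValueError(f"{d} is outside the maintained FYV range")

# 2014 period ends from the thread (month, end-day pairs).
ends_2014 = [date(2014, m, e) for m, e in
             zip([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
                 [25, 22, 29, 26, 24, 28, 26, 23, 27, 25, 22, 31])]
periods = {2014: ends_2014}

print(fiscal_period(date(2014, 3, 15), periods))  # (2014, 3)
try:
    fiscal_period(date(2104, 12, 31), periods)    # far beyond the FYV
except ValueError as err:
    print(err)
```

A storage bucket horizon ending in 2104 against a FYV maintained only through 2015 would fail exactly this kind of check, which is consistent with the shorter horizon that finally worked.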

  • Programatically change the Date Time series

Is there a way to set a Date Time Series (DTS) programmatically through JavaScript? I have a Time dimension and I have defined a DTS as YTD. Now, based on the month selected by the end user, I need to show YTD numbers in IR.
Any idea how to achieve that? I am not seeing any option to call a DTS through JavaScript.

    I do not believe that option is available.
    9.3.1 introduced the new interface into OLAP Essbase Queries. My guess is that there will be more functionality added in future releases.
    Wayne Van Sluys
    TopDown Consulting

  • Best log file format for multivariable non-continous time series

Databases or TDM(S) files are great, but what if you cannot use a database (due to the type of target) and TDM files seem unsuitable because the data does not come in blocks of continuous time series? What is the best file option for data logging?
    Scenario:
    The number of variables you are going to log to a file can change during run-time
    The data is not sampled at fixed intervals (they have been deadband filtered e.g.)
    The files must be compact and fast to search through (i.e. binary files with known positions of the time stamps, channel descriptions etc.)
    Must be supported on compact fieldpoint and RIO controllers
Right now we use our own custom format for this, but it does not support item no. 1 in the list above (at least not within the same file), and it would be much nicer to have an open format that other software can read as well.
    Any suggestions?
    MTO

I did some performance tests. For a month's worth of data (2,592,000 rows) with 4 channels, I got the following results when reading all of the data:
1. TDMS file written as blocks of 60 values (1-minute buffers): 1.5 seconds.
2. As test 1, but with a defrag run on the final file: 0.9 seconds.
3. As tests 1 & 2, but with all the data written in one operation: 0.51 seconds.
4. The same data stored in a binary file (1 header + 2D array): 0.17 seconds.
So even if I could write everything in one go (which I cannot), reading a month of data is 3 times faster with a binary file. The application might get a lot of read requests and will need to read much more than 1 month of data, so the difference is significant (reading a year of data stored as monthly files would take me 12-18 seconds with TDMS files, but just 2 seconds with a binary file).
Because I'll be writing different groups of data at different rates, using the advanced API to get just one (set of) header(s) is not an option.
TDMS files are very versatile: it is great to be able to dump a new group/channel into the file at any time, and to have a file format that other applications support as well. However, if the number of writes is large and the size of each write is (has to be) small, performance takes a serious hit. In this particular case performance trumps ease of use, so I'll probably need to rewrite our custom binary format to preallocate chunks for each group (feature request for TDMS? :-) ).
    MTO
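The "header + 2D array" layout described above can be sketched in a few lines. The snippet below uses plain Python's struct module to stand in for the LabVIEW implementation; the exact layout (magic tag, channel count, float64 rows) is an assumption for illustration, not the poster's actual custom format:

```python
import os
import struct
import tempfile

# Minimal 'header + 2D array' binary log: a fixed-size header records a
# magic tag and the channel count, then float64 samples follow row by row.
# Illustrative layout -- not the poster's actual custom format.

HEADER = struct.Struct("<4sI")  # magic tag + number of channels

def write_log(path, channels, rows):
    with open(path, "wb") as f:
        f.write(HEADER.pack(b"TLOG", channels))
        row_fmt = struct.Struct(f"<{channels}d")
        for row in rows:
            f.write(row_fmt.pack(*row))

def read_log(path):
    with open(path, "rb") as f:
        magic, channels = HEADER.unpack(f.read(HEADER.size))
        assert magic == b"TLOG"
        row_fmt = struct.Struct(f"<{channels}d")
        data = f.read()
        return [row_fmt.unpack_from(data, i)
                for i in range(0, len(data), row_fmt.size)]

path = os.path.join(tempfile.gettempdir(), "tlog_demo.bin")
write_log(path, 4, [(0.0, 1.1, 2.2, 3.3), (1.0, 1.2, 2.3, 3.4)])
print(read_log(path))
```

Fixed-size rows mean every timestamp sits at a known offset, so a reader can seek straight to a time range without parsing the whole file, which is the search-speed property described above. The trade-off is also visible: the channel count is baked into the header, so it cannot change within one file (item no. 1 in the list).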

  • Time series for cvc's

    hello
In any case (whether new CVCs are added or a new period is to be included), we create time series for the planning area. I am not aware why the 'Adjust time series' option is offered for the planning object structure at the time of CVC creation.
    Any thoughts please
    regards
    KK

    Hi
Yes, you are correct: there are background jobs that create time series for the planning area, which execute at whatever frequency has been set. But if you create CVCs and run a check before that program has executed in the background, you may see an inconsistency. To avoid this, it is advisable to set the 'Adjust Time Series Objects' indicator, which immediately updates the time series objects for the new CVCs in all planning areas based on this master planning object structure.
SAP recommends that you update all planning areas immediately, which is why this indicator should be used.
Alternatively, you can leave the indicator unselected for all new characteristic value combinations. To adjust the time series objects at a later date, choose 'Adjust Time Series' from the context menu for the planning object structure in S&DP Administration. This has the advantage that you can create a variant and schedule the job to run later, when there is little system activity.
    I hope this helps to answer your question
    Thanks
    Amol

  • All input time series to which you can distribute are fixed

    Hi Experts,
I ran a macro in the background. Usually it runs without any unusual message, but this time I got the following:
"All input time series to which you can distribute are fixed". The macro executed successfully and the data is also correct, but it took a long time to finish. When I checked, I found the above message.
Can anyone help me find the exact meaning of this message?
When I dug around the message, I found the following explanation, but did not understand it.
    Diagnosis
All aggregated time series objects of an aggregate time series are fixed. As such, a change to the characteristic value at aggregate level cannot be disaggregated to any of the aggregated time series objects.
    Procedure
    Remove at least one of the fixings of the aggregated time series objects.
Please help me find a solution to avoid the delay in the job.
    Regards
    Sujay

    Hi Sujay,
I think you have a hierarchy in DP, e.g. one product at different locations, and you have fixed the values of the key figure (say Sales Demand) at all locations. When you then try to change a value at product level, the fixing prevents it from being distributed/disaggregated to the lower location levels, and hence you get this message.
Please refer to SAP Notes 609074 & 687074, which will be very helpful to you.
    Regards,
    Digambar
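The situation can be illustrated with a toy disaggregation step. The logic below is a simplified sketch, not the liveCache implementation: a delta entered at aggregate level is spread proportionally over the child series that are not fixed, and if every child is fixed there is nothing left to absorb the change:

```python
def disaggregate(children, fixed, delta):
    """Spread `delta` proportionally over non-fixed child values.
    Raises if all children are fixed -- the case behind the message
    'All input time series to which you can distribute are fixed'.
    Toy logic for illustration, not the liveCache algorithm."""
    free = [k for k in children if k not in fixed]
    if not free:
        raise ValueError("all input time series are fixed")
    total = sum(children[k] for k in free)
    for k in free:
        # Proportional share; equal split if the free values sum to 0.
        share = children[k] / total if total else 1 / len(free)
        children[k] += delta * share
    return children

# One product at three locations; LOC1 and LOC2 are fixed, so a +10
# change at product level lands entirely on LOC3.
vals = {"LOC1": 50.0, "LOC2": 30.0, "LOC3": 20.0}
print(disaggregate(vals, fixed={"LOC1", "LOC2"}, delta=10.0))
```

With all three locations fixed, the call would raise instead, which mirrors why the macro emits the diagnostic and why removing at least one fixing is the documented remedy.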

  • Time-series Chart

    Warning, CR Newbie here so this may be a stupid question. I am evaluating the trial version of CR to see if it will be a good fit for an upcoming project. I've seen some related posts in the SCN, but no answers that quite fit.
    I'm looking to create a line chart (or a scatter chart) with time-series data. My dataset includes a time stamp field (yyyy-MM-dd hh:mm:ss) and some floating-point temperature values like this:
    2014-05-01 08:00:00, 123.4, 115.1, 109.2
    2014-05-01 08:00:10, 123.6, 116.0, 109.8
    The desired outcome has the date / time along the X-axis with data points spaced proportionally in the X dimension and plotted in the Y-dimension according to the temperature. The interval between the time stamps is not always the same, so numerical scaling is required on both axes. The desired chart would show a temperature scale along the vertical axis, three trend lines for the three series of temperature data and times shown on the X axis label.
    I've played with several options in an attempt to make this work. On the data tab, it would seem I would want to select "on change of" and then my time-stamp field. However, with this selection, I can only use summary values and end up with a chart with a single data point for each series. I don't need or want any summary calculations carried out on the data, I just want to plot it so I can look at a trend over time. I can get trend lines if I select "for each record" on the data tab of the wizard, but then my X-axis is meaningless and the horizontal scaling is misleading unless the interval between my samples is constant.
    I would welcome any suggestions on how best to accomplish this with Crystal Reports.
    Thanks for reading.

    Jamie,
    Thanks for continuing to reply. I am getting close, but still no success.
    Here is the procedure I've followed and problem:
    Put chart in RF section
    Start Chart Expert
    Chart Type = Numeric Axes, subtype = Date axis line chart
    Data tab
    On change of datetime field
    Order... ascending, printed for each second
    Values avg of my data fields (must select summary when on change of is used)
    Right-click on X-axis label, select Group (X) Axis Settings
    Scales tab: base unit, major unit and minor unit can only be set to days, months or years
    I cannot set the minimum and maximum date with resolution other than day
    Right-click Chart, select Chart Options...Axes tab: show group axes set to show time scale
    No matter the setting I use, I can't find a way to adjust the resolution of the time scale lower than days.
    I tried using a formula to extract only the time portion of my datetime field. I used that as my "on change" data series, hoping maybe CR would automatically recognize I was looking at a fraction of a day if I did that. No good - now it gives me a date scale with the dates showing up as the beginning of the epoch, but I can still only get resolution of integer days.
    Thanks for your patience and persistence.
    - Max

  • Regarding Time Series Graph in OBIEE 11g

    Hi,
I need to create a time-series graph in OBIEE 11g. However, the values for time on the x-axis of the graph automatically come out in days.
    Is there any way to change it to hours or minutes?
    Thanks,
    Naman Misra

Yes, that could be a problem. Try using separate tables for the fact and the time dimension.

  • Issue in new macro calculating values in time series in CVC

    Hi friends.
I'm new to APO.
I have a problem with a new macro in a CVC which calls an FM to calculate and store data.
This new macro calculates the selected data in the time series (TS); e.g., if we select 3 days for the calculation, the first selected day in the TS is ignored.
We created this macro to perform the calculation when the user enters manual values, and we want it to be calculated at the delivery points in the CVC, by TS.
The macro calls my Z function, which internally calls the standard FM '/SAPAPO/TS_DM_SET' (Set the TS Information).
Sometimes this FM raises error 6 (invalid_data_status = 6), but only when I use the 'fcode_check' routine together with it.
After that, we call the FM '/SAPAPO/MSDP_GRID_REFRESH' in mode 1 and the 'fcode_check' routine in program '/sapapo/saplmsdp_sdp'.
At first I thought it could be dirty global variables in the standard FM, so I put it inside a program and called it via SUBMIT AND RETURN. But now I think it is not that kind of error, because that did not work; it just inverted the results, and now only the first line of the TS gets stored and changed in the CVC.
It's a crazy issue. Please, friends, guide me to a solution!
    thanks.
    Glauco

Hi friend, the issue is still without a proper solution.
A colleague changed the macro, adding another step that calls the same function, so the macro now has two calls in sequence to the same FM. Now it works, but we cannot understand why.
It seems like dirty memory in liveCache.
Nobody knows how to solve this!
    Glauco.

  • Time series functions are not working for fragmented logical table sources?

If I remove the fragmented logical table sources, then it works fine.
If anybody knows the reason, please let me know.
    thanks and regards,
    krishna

    Hi,
It is because time series functions are not supported for fragmentation content; see this content from Oracle Support:
"The error occurs due to the fact that fragmented data sources are used for some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources."
Confirmation is documented in the following guide: Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions:
"Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to 'About Time Series Conversion Functions' on page 197."
    Regards,
    Gianluca

  • Use of time series functions with horizontally fragmented fact tables

    Hi Guys,
in OBIEE 10g it wasn't possible to use time series functions [AGO, TO_DATE] on horizontally fragmented fact tables. This was due to be fixed in 11g.
Has this been fixed? Has anybody used this new functionality? What are the limitations?
    Tkx
    Emil

    Hello,
Can you give us some examples of "horizontally fragmented fact tables"? Then we can tell you whether that is possible or not.
    Thanks,

  • Time series functions are not working in OBIEE for ESSBASE data source

    Hi All,
I am facing a problem in OBIEE: I am getting error messages for measure columns with time series functions (Ago, ToDate and PeriodRolling) in both the RPD and Answers.
The error is "Target database does not support Ago operation".
But I am aware that OBIEE supports time series functions for an Essbase data source.
I am using Hyperion 9.3.1 as the data source and OBIEE 11.1.1.5.0 as the reporting tool.
    Appreciate your help.
    Thanks,
    Aravind

    Hi,
It is because time series functions are not supported for fragmentation content; see this content from Oracle Support:
"The error occurs due to the fact that fragmented data sources are used for some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources."
Confirmation is documented in the following guide: Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions:
"Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to 'About Time Series Conversion Functions' on page 197."
    Regards,
    Gianluca

  • Variation calculation without time series?

    Hi all.
I've got a pivot table like the one below. Time series is not implemented in the RPD for performance reasons. Is there a way to calculate the variation between two measures on two different dates in a pivot table (the Difference column below), similar to what we achieve using the AGO() time series function? The dates are chosen by the user from a dashboard prompt, so I am not sure whether a calculated item would apply here, and dynamic items ($1, $2) do not work either. The pivot table has more columns and rows than the example below.
Product   01/10/11   02/10/11   Difference
A             10         15          5
B              8          6         -2
C              8          8          0
    Thank you.
    Regards,

    Hi,
By using $2 - $1 in a new calculated item in the pivot table view.
Ex:
Name   3-10-2011   1-10-2011   diff
dev    423         400         23
Say the from date (1-10-2011) is referred to as $1 and the to date (3-10-2011) as $2; then $2 - $1 gives the difference between any two dates, fully automatically. It will work; I have implemented the same myself.
    Thanks
    Deva
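The $2 - $1 calculated-item idea can be mimicked outside OBIEE as a quick sanity check. Plain Python sketch; the pivot layout and names are illustrative:

```python
# Difference between two user-chosen date columns of a pivoted result,
# mimicking the $2 - $1 calculated item. Names are illustrative.

pivot = {
    "A": {"01/10/11": 10, "02/10/11": 15},
    "B": {"01/10/11": 8,  "02/10/11": 6},
    "C": {"01/10/11": 8,  "02/10/11": 8},
}

def add_difference(pivot, from_date, to_date):
    """Return {product: to - from} for the two prompt-selected dates."""
    return {prod: vals[to_date] - vals[from_date]
            for prod, vals in pivot.items()}

print(add_difference(pivot, "01/10/11", "02/10/11"))
# {'A': 5, 'B': -2, 'C': 0}
```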

  • Semiweekly Time Series Creation for DP

    Hi,
I want to create a semiweekly (twice-a-week) time series for DP.
It would span 2 years (i.e. 1 year in the past and 1 year in the future).
So a total of 208 entries need to be made for this time series.
    My understanding of the method is
    1)     Create a fiscal Year Variant.
    2)     Create a Storage Bucket profile based on fiscal Year Variant.
    3)     Create a Time Series Object based on Storage Bucket profile.
    4)     In the new Planning Area, provide the Time Series Object.
    Correct me if my understanding is wrong.
    Also can you provide the T-Code or Menu Path for the first three activities?
    Regards,
    Vikas

The menu paths for
(1) Fiscal Year Variants:
SPRO > APO > Supply Chain Planning > Demand Planning > Basic Settings > Maintain Fiscal Year Variants
(2) Storage Bucket Profiles:
SPRO > APO > Supply Chain Planning > Demand Planning > Basic Settings > Define Storage Bucket Profiles
(3) I guess by "time series" you meant the planning bucket profile:
SPRO > APO > Supply Chain Planning > Demand Planning > Basic Settings > Define Planning Bucket Profiles
(4) Define Planning Area: /n/sapapo/msdp_admin
Here you assign the storage bucket profile.
(5) Define Planning Book
Here you assign the planning bucket profile.
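For a feel of the bucket count involved, the 208 semiweekly periods can be enumerated with a short script. The twice-weekly split days (Wednesday and Sunday) are an assumption for illustration; the real boundaries come from the fiscal year variant:

```python
from datetime import date, timedelta

def semiweekly_buckets(start, weeks):
    """Generate two bucket end dates per week: mid-week (Wednesday)
    and week end (Sunday). The split days are an assumption for
    illustration; the real boundaries come from the FYV."""
    # Align to the Monday of the starting week.
    monday = start - timedelta(days=start.weekday())
    ends = []
    for w in range(weeks):
        week_start = monday + timedelta(weeks=w)
        ends.append(week_start + timedelta(days=2))  # Wednesday
        ends.append(week_start + timedelta(days=6))  # Sunday
    return ends

# 1 year past + 1 year future = 104 weeks -> 208 buckets.
buckets = semiweekly_buckets(date(2013, 2, 24) - timedelta(weeks=52), 104)
print(len(buckets))  # 208
```

In practice these dates would be keyed into the fiscal year variant, which the storage bucket profile then references, per steps (1) and (2) above.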
