Storage bucket inconsistency

I am getting an error message when I run a liveCache consistency check. The error says: "Storage bucket profile start times in liveCache are inconsistent with those in the database." I tried creating the time series as per the storage bucket profile and recreating the POB and PA, but neither fixed the problem. Has anybody come across this problem? Any help is appreciated.
Thanks,
Jeff

Hi Jeff,
Try the following reports (transactions):
1) /SAPAPO/PSTRUCONS
2) /SAPAPO/TSCONS
If any inconsistencies are found, repair them.
If the problem persists even after that, bring liveCache down after taking a backup, bring it up again, and then recheck.
Regards
R. Senthil Mareeswaran.

Similar Messages

  • Time series inconsistency due to change in Fiscal variant/Storage bucket

    Hi All,
We got into a situation where running the time series consistency check on a DP planning area fails with a message pointing to 'changes to fiscal year variant'. We are unable to track any changes to the fiscal year variant, though it is possible that the storage bucket horizon was changed. Is there a way to correct/sync this by running a report?
We are trying to avoid re-initialization here, though it is an option, for the obvious reason that the production data would have to be backed up. Is there an alternative solution to fix this inconsistency?
    Cheers!

    Dear Ashok,
You should never change an FYV while it is used in a storage bucket profile that is assigned to a planning area and time series objects have already been created. It is no problem to maintain the FYV for an additional time range, but periods should never be changed or deleted while the FYV is actively in use; you should not change existing periods. If you want to change buckets in your FYV or delete buckets, you should always make a COMPLETE backup of ALL your data from the planning area into an InfoCube, then delete the time series objects for the planning area (and with them all the data in liveCache), and then change the FYV. After that, you can create time series objects again and reload the data from the backup InfoCube into the planning area. If you follow these steps, you will not risk losing any data; the data in the InfoCube is the backup you can reload.
As some processes check the FYV some time before and after it is used, it is highly recommended to maintain the FYV at least 2-3 years into the past and into the future. For example, if you create time series objects from 2001 onward, you should maintain your FYV at least back to 1999, and if you create time series objects up to 2006, you should maintain your FYV at least until 2008. You might never experience problems without maintaining the FYV over this range, but some processes do additional checks, and problems can then occur.
    Regards,
    Rakesh

  • Copy Key figure data between two planning area with different storage bucket profile

    Hi,
I have a DP planning area 'PA1' with a monthly storage bucket profile and a data view 'PA1M' (monthly TBP), and I want to copy the data to another planning area 'PA2' whose storage bucket profile is weekly. This would require me to write a custom program that splits monthly data into weeks and then populates 'PA2'.
However, in 'PA2' I created two data views: 'PA2W' (weekly time bucket profile) and 'PA2M' (monthly time bucket profile). If I can copy the data from PA1 (PA1M) to PA2 (PA2M), it will automatically be split from months into weeks in PA2W.
So my question is: is there a way to copy key figure data from PA1M to PA2M? Any BAPI?

    Hi Rakesh,
You can use the Copy/Version Management function to do this.
Path: Demand Planning > Environment > Copy/Version Management
The system takes into account only those periodicities that are common to both planning areas. For example, if the data is saved in months in the source planning area (PA1) but in months and weeks in the target planning area (PA2), the system copies to months in the target planning area and then disaggregates the data to the storage buckets in accordance with the time-based disaggregation.
Hope this helps you understand the basic concept.
    Kapil
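
For anyone who does end up writing the custom split program mentioned above, here is a minimal Python sketch of the pro-rata monthly-to-weekly logic. The data and the equal-weighting-by-days rule are illustrative assumptions; a real implementation would read and write the planning areas through SAP's interfaces rather than plain Python.

```python
from datetime import date, timedelta

def monthly_to_weekly(year, month, value):
    """Spread a monthly value over ISO weeks by the number of month days in each week."""
    first = date(year, month, 1)
    next_first = date(year + month // 12, month % 12 + 1, 1)
    days_in_month = (next_first - first).days
    shares = {}
    for offset in range(days_in_month):
        d = first + timedelta(days=offset)
        week = d.isocalendar()[:2]                    # (ISO year, ISO week)
        shares[week] = shares.get(week, 0) + 1
    return {wk: value * n / days_in_month for wk, n in shares.items()}

# e.g. split a monthly quantity of 100 for January 2015 across its weeks
print(monthly_to_weekly(2015, 1, 100))
```

Weeks that straddle two months get only the share belonging to each month's days, which is also why weekly views can show partial buckets at month boundaries.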

  • Use of storage bucket profile in APO DP

I'm trying to clarify the sizing implications of using various buckets in the APO DP storage bucket profile.
Suppose I have storage bucket profile 1 consisting of calendar weeks and days, and profile 2 consisting of weeks only.
    What will be the relative database/memory sizing resulting from these 2 profiles?
    Thanks for any advice...

    Hi,
As our other friends have mentioned here, just having a storage bucket profile doesn't consume memory. But assuming you have generated time series objects based on these storage bucket profiles, the following example illustrates the memory usage.
Horizon used -> 2 years
No. of weekly buckets -> 104 or 105
No. of daily buckets -> 365
Now, if you generate time series from your SBP that contains both daily and weekly buckets, the total memory occupied will be 365 + 104 = 469 times your number of CVCs.
If you generate time series from your SBP that contains just weeks, the total memory occupied will be 104 times your number of CVCs.
Hope this helps.
    Thanks
    Mani Suresh
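
A quick back-of-the-envelope check of that arithmetic, as a Python sketch (the CVC count is a made-up figure for illustration; actual bytes per time-series cell depend on the liveCache build and key figure count):

```python
DAILY_BUCKETS = 365    # daily buckets in the profile
WEEKLY_BUCKETS = 104   # weekly buckets over the 2-year horizon

def cells(num_cvcs, bucket_counts):
    # one time-series cell per bucket per CVC (per key figure)
    return num_cvcs * sum(bucket_counts)

cvcs = 50_000                                         # assumed number of CVCs
both = cells(cvcs, [DAILY_BUCKETS, WEEKLY_BUCKETS])   # profile 1: days + weeks
weeks_only = cells(cvcs, [WEEKLY_BUCKETS])            # profile 2: weeks only
print(both, weeks_only, round(both / weeks_only, 1))  # 469x vs 104x CVCs, ~4.5:1
```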

  • Storage Unit Inconsistency!!

    Dear Experts,
A background job in SAP which runs every 10 minutes was canceled because the storage unit was inconsistent with the transfer order data. I am not able to find the cause of this storage unit inconsistency!
    Any ideas??
    Regards

Can you share what kind of error (details of the inconsistencies) you are getting?

  • Storage Standby Inconsistent Behaviour (hdparm -S)

Well hi folks, I have a bunch of ATA/SATA drives.
-M acoustic = not supported
-B APM_level = not supported
-k DIO_GET_KEEPSETTINGS failed: Inappropriate ioctl for device
If I apply -S 60 (5 min) it works out very well, but any value higher than 120 won't spin down the drives at all. Ideally I'd like to set 240 (20 min); 60 is too low.
I've tried hdparm -K 1 as well as smartctl -s off --offlineauto=off --saveauto=on, with mounted partitions or not, with a systemd oneshot unit or the udev rule mentioned in the wiki, and with fstab commit=600 or 1200. Or should I apply these commands in some specific order?
On top of that, I have a feeling these used to work with -S 120, but only on a random basis. Totally arbitrary behaviour.
I use hddtemp /dev/sd? to check the state.
Still I can't pinpoint the reason why. At this point it seems incomprehensible to me.
Where should I look next?
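
For reference, the hdparm man page defines the -S encoding: values 1 to 240 specify multiples of 5 seconds (5 seconds to 20 minutes), and 241 to 251 specify 1 to 11 units of 30 minutes. A small Python sketch of the mapping:

```python
def standby_value(minutes):
    """Map a spindown timeout in minutes to the hdparm -S argument."""
    seconds = minutes * 60
    if seconds <= 240 * 5:
        return seconds // 5              # 5-second units: values 1..240
    half_hours = -(-seconds // 1800)     # ceiling division to 30-minute units
    if half_hours <= 11:
        return 240 + half_hours          # 30-minute units: values 241..251
    raise ValueError("timeout too long to express with -S")

print(standby_value(5), standby_value(10), standby_value(20))  # 60 120 240
```

So -S 240 is a valid 20-minute setting; if it does not take effect, the encoding is not the problem.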

Now there's been some development. If I leave the freshly booted system alone, standby performs as expected, apparently.
I have some reason to believe all this mess is caused by Wine: winecfg alone wakes up all standby drives immediately for no apparent reason, and I use Wine all the time for my charting platform. But this doesn't explain why HDD standby is being prevented in the first place.

  • Key figure display in planning book with respect to Time bucket profile

    Hi,
I am loading a key figure into a planning area from an InfoCube for the current month. When I review the key figure in a planning book with a monthly time bucket profile, it shows 85 for the current month. In the same planning book with a weekly bucket profile, it shows 55 in the current and future weeks, and the remaining 30 falls into the past weeks of the current month.
How can I make the total quantity of 85 appear in the current and future weeks only?
    thanks and regards
    Murugesan

    Hi Murugesan,
Within the planning area, the data is stored at the lowest level of granularity that you maintain in the storage bucket profile. During display, the system decides what data to show depending on the time bucket profile you use in the planning view and the time-based disaggregation you maintain for the key figure.
In your case, what time characteristic do you have in the cube? Is it date, week or month?
If it's date, check how much key figure data is maintained on the dates belonging to a week that has days in both this month and last month; e.g., for Dec 2011, how much data is stored on the 1st, 2nd, 3rd and 4th of Dec 2011. This data would appear in Dec in the monthly view, but in the weekly view it would appear in the week starting 28th November.
If the data is maintained in the cube in weeks, then you need to work out how time-based disaggregation would show it in months. If it's months, then you need to find out how much data would go to the days in the past week of the month.
Time-based disaggregation may be causing you some issues, but in that case I would not expect 30 out of 85 to go into the past week, unless you have daily data in the cube. The data shown in the weekly view for the week starting 28th Nov should ideally be a small proportion of 85, unless you are using a time stream/fiscal year variant due to which most of December falls on holidays. The only other exception I can think of is that you have data on the days mentioned above.
It would be best to help the business understand this disaggregation logic, rather than manipulating the data to shift it to a later week. If this logic doesn't explain your situation, then please provide the date/week/month at which you have data in the cube, and the quantities.
    Thanks - Pawan
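
Pawan's point about the straddling week can be checked directly. A short Python sketch using the Dec 2011 dates from his example: the first four days of the month belong to the week that starts on Monday 28 November, which is why daily data on those dates surfaces in a "past" week of the weekly view.

```python
from datetime import date, timedelta

for day in range(1, 5):
    d = date(2011, 12, day)
    week_start = d - timedelta(days=d.weekday())   # back up to Monday
    print(d, "-> week starting", week_start)       # all show 2011-11-28
```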

  • Disaggregation of Key figure not in proportion at monthly bucket

    Dear Expert,
Please consider the following case.
Key figure 2 is disaggregated based on key figure 1 (calculation type P), and we maintain time-based disaggregation K.
The issue is this: if you check the totals, both key figures show 13, as in the screenshot below. But at the region-wise (detail) level the split is not in 1:1 proportion, and this is causing a problem for us. For example, region 2 has a value of 2 in KF1 for week 11.2015, but after disaggregation it got nothing in KF2. (The reason is that KF2 has values in weeks 12 and 13 and nothing in week 11.)
We tried two different approaches to get a 1:1 proportion in the monthly bucket, but neither solved our problem.
Approach 1: After KF2 is calculated in the weekly bucket, reset KF2 at total level in the monthly data view, enter the total of KF2 at total level in the monthly bucket, and check the result at the detailed level. But in this case it is calculated for weeks 10, 11, 12 and 13 as 0, 9, 3, 1. (We want KF2 as per the original calculation, i.e. 12 and 1 in weeks 12 and 13 respectively.)
Approach 2: Add another new key figure, 'Key figure 1 Total', which sums KF1 at the detailed level per month, and then disaggregate KF2 based on this new KF. This works for this scenario, but a different scenario does not work; for example, here regions 3 and 4 are affected.
If the disaggregation at month level is corrected, our issue will be resolved.
Waiting for your feedback.
    Thank you
    Sachin

    Hi,
In the Planning Area - Key Figure settings, did you try using 'N - No disaggregation in time' for key figure KF2? This means copying will occur in the technical periods of the storage bucket profile, and not the way it happens now.
Another option: your storage bucket profile and your planning bucket profile are both in days/weeks/months. Is it necessary for your users to view data in daily buckets, or can they view the data in weekly buckets only (and not in daily buckets)? Your planning bucket profile could then be in weeks/months, so the displayed data would be in weekly buckets.
    Regards
    Datta

  • Move IIS Log files to AWS S3 Bucket...

I'm seeking to automate a process that will copy or move IIS logs to a remote location. The following requirements must be taken into account:
1. Copy or move all IIS logs (xcopy?) to another location. (Each server hosts several websites.)
2. Delete the existing log files, up to the current day's log file, from each website/server (to free up disk space).
3. I need to retain the last 2 days of logs per site, per server.
4. I'd like to be able to schedule this task per server.
5. This will be performed on several IIS web servers.
6. The logs need to move into their respective folders within the remote location, or the process should create a new folder, and it should confirm the copy/move of the logs and their location.
7. I don't have to worry about retaining the actual website paths from the servers, as long as the log files are in folders named by server name / website (W3SVC1, W3SVC4, W3SVC5, etc.).
8. End goal: scheduling an automated task that moves these logs into an AWS S3 location (Amazon storage bucket).
    Thank you.
    LRod

    Hi,
Okay, so what's your question? All I see up there is a list of requirements (note that we don't write scripts on demand here).
My initial recommendation would be to look into robocopy as a starting point:
http://ss64.com/nt/robocopy.html
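
If the end goal really is S3 rather than a file share, here is a minimal sketch of the upload-and-prune step in Python with boto3. The log root, bucket name, and two-day cutoff are placeholders, and it assumes boto3 is installed and AWS credentials are configured; robocopy remains a fine choice for the copy-to-share variant.

```python
import os
import time

import boto3  # AWS SDK for Python

LOG_ROOT = r"C:\inetpub\logs\LogFiles"  # default IIS log root; W3SVC* folders per site
BUCKET = "my-iis-log-archive"           # hypothetical bucket name
KEEP_SECONDS = 2 * 24 * 3600            # retain the last 2 days of logs locally

s3 = boto3.client("s3")
server = os.environ.get("COMPUTERNAME", "unknown-server")

for site in os.listdir(LOG_ROOT):                 # W3SVC1, W3SVC4, W3SVC5, ...
    site_dir = os.path.join(LOG_ROOT, site)
    if not os.path.isdir(site_dir):
        continue
    for name in os.listdir(site_dir):
        path = os.path.join(site_dir, name)
        if time.time() - os.path.getmtime(path) < KEEP_SECONDS:
            continue                              # keep the 2 most recent days on disk
        key = f"{server}/{site}/{name}"           # folder per server/site in the bucket
        s3.upload_file(path, BUCKET, key)         # upload, then free local disk space
        os.remove(path)
```

Scheduled per server with Task Scheduler, this covers points 1-5 and 8 of the list; a verification step before the delete (point 6) could check the uploaded object exists first.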

  • Time Series Storage Design

Hi, I've got the unenviable task of rewriting the data storage back end for a very complex legacy system which analyses time series data for a range of different data sets. What I want to do is bring this data kicking and screaming into the 21st century by putting it into a database. While I have worked with databases for many years, I've never really had to put large amounts of data into one, and I've certainly never had to make sure I can get large chunks of that data back very quickly.
The data is shaped like this: multiple data sets (about 10, normally), each with up to 100k rows, with each row containing up to 300 data points (a grand total of about 300,000,000 data points). Within a data set all rows contain the same number of points, but different data sets may contain different numbers of points. I will typically need to access a whole data set at a time, but I also need to be able to address individual points (or at least rows).
My current thinking is that storing each data point separately, while great from an access point of view, probably isn't practical from a speed point of view. Combined with the fact that most operations are performed on a whole row at a time, I think row-based storage is probably the best option.
Of the row-based storage solutions I think I have two options: multiple columns, or array-based. I'm favouring a single column holding an array of data points, as it fits well with the requirement that different data sets can have different numbers of points. If I have separate columns, I'm probably into multiple tables for the data and dynamic table/column creation.
To make sure this solution is fast I was thinking of using Hibernate with caching turned on. Alternatively, I've used JBoss Cache with great results in the past.
Does this sound like a solution that will fly? Have I missed anything obvious? I'm hoping someone might help me check over my thinking before I commit serious amounts of time to this...
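
A self-contained sketch of the array-per-row layout described above, using Python's stdlib sqlite3 with packed binary blobs (a PostgreSQL float8[] column would be the equivalent with a native array type; table and column names here are made up for illustration):

```python
import sqlite3
import struct

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ts_row (
    dataset_id INTEGER,
    row_id     INTEGER,
    points     BLOB,               -- one packed array of doubles per row
    PRIMARY KEY (dataset_id, row_id))""")

def save_row(dataset_id, row_id, points):
    """Store a whole row of data points as one little-endian double array."""
    blob = struct.pack(f"<{len(points)}d", *points)
    conn.execute("INSERT OR REPLACE INTO ts_row VALUES (?, ?, ?)",
                 (dataset_id, row_id, blob))

def load_row(dataset_id, row_id):
    """Fetch a whole row in a single read; point k of the row is result[k]."""
    (blob,) = conn.execute(
        "SELECT points FROM ts_row WHERE dataset_id = ? AND row_id = ?",
        (dataset_id, row_id)).fetchone()
    return struct.unpack(f"<{len(blob) // 8}d", blob)

save_row(1, 0, [1.5, 2.5, 3.5])
print(load_row(1, 0))   # (1.5, 2.5, 3.5)
```

Rows of different lengths coexist naturally in this layout, which sidesteps the dynamic table/column creation the multi-column option would require.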

    Hi,
Time series key figure:
Basically, a time series key figure is used in Demand Planning only. Whenever you create a key figure and add it to a DP planning area, it is automatically converted into a time series key figure. When you activate the planning area, you activate each key figure of the planning area with a time series planning version.
There is one more type of key figure, the order series key figure, which is mainly used in SNP planning areas.
Storage bucket profile:
The SBP is used to reserve space in liveCache for a given periodicity, e.g. from 2003 to 2010. Whenever you create an SBP, it occupies space in liveCache for that periodicity, which the planning area then uses to store its data. So the storage bucket profile is used for storing the planning area's data.
Time/planning bucket profile:
Basically, the TBP is used to define the periodicity of a data view. If you want to see the data view in yearly, monthly, weekly and daily buckets, you define that in the TBP.
Hope this helps.
    Regards
    Sujay

  • Time Buckets in DP

    Dear All,
    I have the following scenario in APO DP:
    I have two plants: Plant A and Plant B.
For Plant A (the weekly off-day is Wednesday), the weekly buckets for Oct 08 should look like this:
W1 --> 2nd to 7th Oct
W2 --> 9th to 14th Oct
W3 --> 16th to 21st Oct
W4 --> 23rd to 29th Oct
And for Plant B (the weekly off-day is Sunday), the weekly buckets should look like this:
W1 --> 1st to 4th Oct
W2 --> 6th to 11th Oct
...and so on.
How do I address this in ONE PLANNING AREA? Or else, should I create two fiscal year variants and attach them to TWO DIFFERENT planning areas?

Hi,
If I understand your requirement correctly, you are going to need two storage bucket profiles. That being the case, you will need two planning areas.
Not only will you require two FYVs but also two time streams to assign to the storage bucket profiles. In the time streams you can define the order of the workdays in a week.
    Hope this helps.
    M

  • Time bucket not displaying in planning book

When I opened the planning book, the time buckets were not displayed as they used to be. I tried to select a CVC to be displayed in the planning book and got the following error message:
Error: Insufficient information about read period transferred
Error: Error reading data - Planning book cannot be processed further
Insufficient information about read period transferred
Message no. /SAPAPO/TSM231
This internal error should not occur during normal processing. However, if this message is displayed, contact your system administration or the SAP hotline, quoting the message number.
    Any idea what went wrong?

    Hi Nic,
Either someone deleted the time bucket profile, or the periodicities in the time bucket profile are not in line with the periodicities allowed in your planning area.
Go to Advanced Planning and Optimization -> Demand Planning -> Environment -> Current Settings -> Maintain Time Buckets Profile for Demand Plng and Supply Network Plng, or directly use transaction /SAPAPO/TR30.
Check that your time bucket profiles ZDP_PBP_9Q & ZDP_PBP_5Q exist using F4 help, and also open them to see what periods they use, e.g. weeks, days, etc.
If you don't find the time bucket profiles, they need to be created (based on the names, my guess is that *9Q contains 9 quarters and *5Q contains 5 quarters). I expect the issue lies here.
If the time bucket profiles exist, then check the following:
1) Go to /SAPAPO/MSDP_ADMIN and select your planning area. See the storage bucket profile name on the "Info" tab.
2) Go to /SAPAPO/TR32 (Periodicities for Planning Area). Enter the name of the storage bucket profile from step 1.
3) Check what periods the storage bucket profile uses.
4) The storage bucket profile needs to have smaller periods maintained than the time bucket profile. E.g., assuming your time bucket profile has quarters, you need at least quarters maintained in the storage bucket profile; day, week or month in the storage bucket profile is also fine, because these are smaller periods than the quarters in your time bucket profile.
Also note whether "post period" is checked, and note the name of the "fiscal year variant" if one is maintained along with this tick. I hope you don't need to go further, but if the above doesn't resolve your issue, we need to check the fiscal year variant.
    Thanks - Pawan

  • Invalid time buckets profile reference

Hi,
I am getting the following error while loading data into a planning book: "Invalid time buckets profile reference", and I am unable to load any data into the planning book.
I have already run the consistency check, initialized the version, and checked the time bucket profile and storage bucket profile; those are fine. There are no time stream IDs defined either.

    Hi Virendra,
Check whether your time bucket profile is consistent with your storage bucket profile; in particular, check the fiscal year variant in the storage bucket profile as well as in the time bucket profile. Because you have assigned the storage bucket profile, with its fiscal year variant, to the planning area, if you use a different fiscal year variant in the time bucket profile, the error you mention will appear.

  • Qty... weekly buckets wise

    Hi Experts,
I am creating a report on top of an InfoCube. I have 3 quantities in the cube, and I have to show one quantity per calendar week and the remaining 2 in single columns.
I put the calendar week in the columns with the first quantity under it, but I am not able to keep the remaining 2 in the rows. To achieve this, I created two characteristic InfoObjects and assigned them to the remaining quantities by rounding the decimals. Now the problem is that I am getting more than one line per item due to these new characteristic objects, while my requirement is to show one line per item.
Please suggest how to fulfil this requirement.
    Thank you
    Manju

Thanks, I did look up note 737230 yesterday and realized that it is standard SAP functionality; I had a feeling that was the case anyway. The problem is that the user is running a report out of APO to make sure there are no data inconsistencies with the original data in the external database we import from. They look at the database data, which is in the correct format, and then they look at APO at the storage bucket profile level, where those weeks are separated, which makes the comparison difficult. I realize they should look at it at the time bucket profile level, but this raises concerns for them.
How can they view the data in a non-planning-book view where the weeks are not split, to make sure the data came over correctly?
Also, when I extract this data back to an outside database, will the data still be in those separated buckets, or will it consolidate? I guess I can test this myself by extracting to a text file and viewing that file.
To keep everything in sync, is my only option to create a fiscal year variant corresponding to the true weeks? That feels like overkill for functionality many people would want.

  • Unable to save Time Buckets Profile for SNI

I am getting some weird behavior with the Time Buckets Profile in SNI. Whenever I edit and save the changes, the Time Buckets Profile reverts back to Years 0000, Months 0000, Weeks 0000 and so on.
I am using SCM 7.0 with SP 06, and I have maintained the necessary visibility profiles for the relevant applications. It was working until a couple of days ago and stopped today. We recently added a custom view under SNI Details.
    Any thoughts?
    Kedar

Amit,
This is not a time bucket profile but a storage bucket profile. You cannot create or maintain it; it is a standard SAP structure for storing data in technical buckets.
Its use is in the interactive planning book, when you want to see how the data is stored (which you seem to have done). This is useful, say, for explaining how a week overlapping two months got its quantity. Again, this is a temporary setting and cannot be saved in the planning book.
SCM 5.0 makes this a bit more accessible by offering it as an option in the drop-down icon for "period settings".
For a related thread, see:
Weekly Buckets Incorrectly Split into incomplete weeks!?!?
