Time-based disaggregation

Hi All,
The requirement is to disaggregate from months to weeks for the current month plus the next month, based on last month's historical data.
There are two data views:
1. One with the data in months (forecasting is done in this data view).
2. One with the data in months and weeks (last month's history is available in weeks, and so are the current and current +1 months).
The forecast that is in months needs to be disaggregated into weeks for the first two months based on last month's history.
I have created another KF to copy last month's history into the current and current +1 months in the weekly view.
Also, no calendar is defined.
During disaggregation based on this KF, the system is throwing unexpected values, especially in the weeks where one month ends and the next begins.
Ex: these are the disaggregated values:
W 31.2010   W 32.2010   W 33.2010   W 34.2010   W 35.2010   M 09.2010
3747        2214        2554        3406        14258
Here the value in W 35.2010 is 14258, which is wrong: the value for M 09.2010 has also been added into this week.
Kindly throw some light on this weird behavior.
Regards

Hello Nitin,
Sorry to intrude into this old thread, which I read with much hope, but I find it unanswered just as I happen to have posted a ditto issue.
Would you recall how this was resolved? :-) Pardon me if this sounds irritating and unrelated to you :-)
This is what I asked a while ago
http://scn.sap.com/message/14565886#14565886
My settings for KF1 are P-KF2 and K-KF2, but in my case KF2 has 0 values for some selections (new products). In human language: I am trying to disaggregate an "adjustment" to the statistical forecast, which is 0 for new products. The adjustment itself is done in month buckets. A simple workaround is to do the adjustments in weeks, but that is like multiplying the effort by 5.
SAP is doing something dumb here; this looks like a logic of convenience. To me it doesn't appear to be a storage bucket setting issue, as I am only expecting a sane result in week buckets, and I do have week (and also month) checked in the storage bucket profile.
Here is what is happening. I am hoping some consultant from SAP looks into this and acknowledges it as a bug, as I didn't manage to find a note; I will have to grope for keywords all night. This is such a common, unavoidable scenario.
Step 1: The system finds the COUNT of week buckets (NOT calendar weeks, just the weeks that FALL in the month) in EACH calendar month separately. E.g. Feb 2014 has 5 weekly buckets, incl. 1 partially in Jan and 1 partially in March. The week of 27th Jan 2014 is a "week bucket" for BOTH Jan 2014 and Feb 2014.
Step 2: It then divides the monthly value of KF1 by the number of weeks from Step 1 to calculate EQUAL values per bucket week belonging to the respective calendar month, IRRESPECTIVE of whether the week is a WHOLE or PARTIAL week.
Step 3: It then simply adds the EQUAL values from Step 2 WHENEVER a week is spread across months. E.g. for the week beginning 27th Jan 2014, which carries over into the next month (Feb 2014), it adds up the results calculated in Step 2 for both months (see the sketch below).
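A minimal sketch of those three steps in plain Java (month names, bucket counts and monthly values are invented for illustration; this models the behavior described above, not actual SAP code):

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;

    // Reproduces Steps 1-3: equal split per month, then summing shared weeks.
    public class DisaggSketch {
        public static void main(String[] args) {
            // Step 1: week buckets per month, including partial weeks shared
            // with the neighbouring month (illustrative data).
            Map<String, List<String>> monthWeeks = new LinkedHashMap<>();
            monthWeeks.put("Jan 2014", List.of("W02", "W03", "W04", "W05"));
            monthWeeks.put("Feb 2014", List.of("W05", "W06", "W07", "W08", "W09"));
            Map<String, Double> monthValue = Map.of("Jan 2014", 400.0, "Feb 2014", 500.0);

            Map<String, Double> weekValue = new TreeMap<>();
            for (Map.Entry<String, List<String>> e : monthWeeks.entrySet()) {
                // Step 2: equal share per bucket, whole or partial week alike.
                double share = monthValue.get(e.getKey()) / e.getValue().size();
                // Step 3: a week shared by two months receives both shares.
                for (String w : e.getValue())
                    weekValue.merge(w, share, Double::sum);
            }
            weekValue.forEach((w, v) -> System.out.printf("%s = %.1f%n", w, v));
        }
    }

Run as-is, this prints 100.0 for every normal week but 200.0 for the shared week W05 (400/4 + 500/5), the same kind of spike as W 35.2010 in the original post.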
Thanks
BS

Similar Messages

  • Problem with time-based disaggregation

    Hi!
    In my planning area ZPA_REG, I have 2 data views. One is by week and the second one is by month. I enter 100 at month level for October 2007. Then I go into the weekly data view and I see:
    S40  25
    S41  25
    S42  14
    S43  25
    S44  11
    My question is about S42 and S44. For S44, since there are only 3 days in October, that could explain the result. But I don't understand why the value for S42 is 14.
    The Time Stream ID (transaction /sapapo/calendar) is based on the US calendar. Its calculation type is "with gaps".
    The storage bucket profile is by Day, Week, Month and Year.
    The time-based disaggregation type for my KF is P.
    Can you help me, please?
    Thanks by advance,
    LB

    1. Check if there are any holidays in that week; it looks like there could be 2 holidays there.
    2. Zero out all the values in the weeks and re-enter 100. If a value already exists, the system sometimes follows that proportion by default instead of your planning area disaggregation type (see the sketch below).
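    To illustrate point 2: with pre-existing weekly values, re-entering the monthly total spreads it in proportion to what is already there. A minimal sketch in plain Java (using the weekly numbers from the post; this models proportional splitting, not the actual APO logic):

        import java.util.Arrays;

        // Spread a monthly total in proportion to values already stored in the weeks.
        public class ProportionalDisagg {
            public static void main(String[] args) {
                double monthTotal = 100.0;                    // value entered at month level
                double[] existing = {25, 25, 14, 25, 11};     // weekly values already there
                double sum = Arrays.stream(existing).sum();   // 100.0
                for (double w : existing)
                    System.out.printf("%.1f%n", monthTotal * w / sum); // same split again
            }
        }

    So once an uneven split exists (e.g. from holidays), simply re-entering 100 reproduces it; zeroing the weeks first forces the disaggregation type to apply.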

  • Sales orders in TDMS company/time-based reduction are outside the scope

    Guys,
    I have had some issues with TDMS where it didn't handle company codes without plants very well. That was fixed by SAP. But I have another problem now. If I do a company code and time-based reduction, it doesn't seem to affect my sales orders in VBAK/VBUK as I would have expected. I was hoping it would only copy across sales orders that have a plant assigned to a company code specified in the company-code-based reduction scenario. That doesn't seem to be the case.
    VBAK is now about one third of the size of the original table (number of records). But I see no logic behind the reduction. I can clearly see plenty of sales documents with a time stamp well before what I specified in my copy procedure, and I can see others with plant entries that should have been excluded from the copy because they belong to different company codes than the ones I specified.
    I was under the impression that TDMS would sort out the correct sales orders for me, but somehow that doesn't seem to be happening. I have to investigate further as to what exactly it did bring across, but just by looking at what's in the target system I can see plenty of "wrong" entries in there, either with a date outside the scope or with a plant outside the scope.
    I can also see that at least the first 10'000 entries in VBAK in the target system have a valid-from and valid-to date of 00.00.0000, which could explain why the time-based reduction didn't work?
    Did you have similar experiences with your copies? Do I have to do a more detailed reduction such as specifying tables/fields and values?
    Thanks for any suggestions
    Stefan
    Edited by: Stefan Sinzig on Oct 3, 2011 4:57 AM

    The reduction itself is not based on the date when the order was created; the logic extends it to invoices and offers, basically the complete update process.
    If you see data that definitely shouldn't be there I'd open an OSS call and let the support check what's wrong.
    Markus

  • Calculation of SLA times based on Service Organization

    Is it possible to calculate the SLA times based only on the service org?
    a) Using service contracts, i.e. create an SC with only the org and assign the Service & Response profiles.
    Or else, as mentioned below.
    Please give me your further thoughts.
    I maintain the Service & Response profiles at "Maintain Availability and Response Times".
    Can I access these values directly in the BADI?
    My scenario is
    a) An agent belongs to a service org.
    b) I define these profiles separately for each org (Org1, Org2, etc.) in the above tcode. This is manual entry. I know we don't have an org-to-profiles mapping in the above tcode; I just plainly maintain them.
    c) In the BADI I check the org entered in the complaint.
    d) For example, if Org1 is entered, I want to access the profiles for Org1. An if/else ladder.
    e) Then use these profiles to calculate the SLA times.
    f) Then save the document.
    g) Also trigger an e-mail stating the above timelines.
    Is the above flow possible??
    Let me know if you want me to post this onto another thread.
    Thanks
    amol

    Shalini,
    I will be just maintaining the service and response profiles in the "Maintain Availa..." tcode.
    There won't be an exact mapping stored in any table.
    My logic would be as follows; I don't know whether this is right or wrong:
    1) Once I get the org, I would compare like this (see the sketch below):
    if (orgdata = org1)
    then use service profile 1, etc.
    2) Then apply the profiles to the calculation of the SLA times.
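    In plain code terms, that if/else ladder boils down to a static org-to-profiles lookup. A minimal sketch outside any SAP API (all names here are made up):

        import java.util.Map;

        // Hypothetical org -> (service profile, response profile) lookup,
        // standing in for the if/else ladder described above.
        public class OrgProfileLookup {
            record Profiles(String serviceProfile, String responseProfile) {}

            static final Map<String, Profiles> BY_ORG = Map.of(
                "ORG1", new Profiles("SERV_PROF_1", "RESP_PROF_1"),
                "ORG2", new Profiles("SERV_PROF_2", "RESP_PROF_2"));

            public static void main(String[] args) {
                Profiles p = BY_ORG.get("ORG1"); // org taken from the complaint
                System.out.println(p.serviceProfile() + " / " + p.responseProfile());
            }
        }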
    I think we can achieve what you said using CRM_ORDERADM_I_BADI
    Or we need to use the BADIs specifically meant for service contract determination and calculation of SLA.
    As you know, in SAP, for SLA times we need to have a service contract for a) the customer, b) the org, and many other parameters; then to this SC we need to associate the service and response profiles. When the SC is determined in a complaint, the service and response profiles are used to get the SLA times.
    But my requirement is to determine the SLA times based on the service organisation only, not based on the customer or any other parameters.
    For example: if my service org is in India the times would be different; if my service org is in the US the times would be different.
    So let me know which approach would be best:
    Use BADIs as above, or the approach of defining different service contracts (without partner functions, customer, etc.) for different orgs?
    Thanks
    Amol

  • Problem in Time Based Publishing Content

    Hi everyone,
      I'm working with time-based publishing.
    Using the XML Form Builder I created 3 contents, i.e. 3 XMLs.
    Then I created one iView for reading the contents (a KM Navigation iView) and I set up properties like
    layout set, layout set mode, and root folder of the contents.
    After creating the iView I checked the preview: all 3 contents were visible in my iView.
      Now I want to show only time-valid content in that iView,
    with contents displayed according to their time span.
    For that I enabled time-based publishing and the lifetime of each content (XML) in the following way:
    Time-dependent publishing in KM: I clicked on the right-hand side of the name of my folder -> go to Details -> Settings -> Lifetime; there you have to enable time-dependent publishing. Then I opened the folder and clicked on the right-hand side of the document -> Properties -> Lifetime; here, give the time span of the document.
    After the lifetime setup I again checked the iView for reading the contents (created previously); in the preview
    all 3 contents are still displayed, including the one whose lifetime has expired.
       Please give me a solution for this, or any more configuration that is required.
    Note:
    I need to display the contents only within the applicable time span (from time and until time).
    Thanks in advance
    Vasu

    I have waited more than 3 hours for the settings to apply,
    but I couldn't find any changes.
    Any other solution?
    Thanks
    Vasu

  • Time Based Workflow - how to make it work?

    Hello,
    Has anyone successfully built a Time Based Workflow? Could you share your examples?
    For me it does not work properly.
    I have tried to set up 2 workflows: on Opportunity Close Date and Account Contract Expiration Date.
    - Account Contract Expiration Date: I want an account owner to get an email notification exactly 6 months before the contract with his client expires. However, the email is triggered each time the record is modified, so I have seen in the workflow monitor that on the date "contract expiration - 180 days" users will receive as many emails as the number of times they modified the record! Is there a way to avoid this situation?
    - Opportunity Close Date: I want to send an email to the opportunity owner's manager 10 days after the opportunity was closed. However, there will be the same issue as above, plus the wait action does not work with a PRE function.
    Please let me know what you think and if you have already built a Time Based Workflow that works correctly.
    Edited by: MagdaR on May 18, 2010 1:57 AM

    Let's start with the workflow for Opty Close Date.
    There are a lot of ways to do this, so you'll need to evaluate which way is best for your case, but the basics are to check that the opty is being closed for the first time, then set a flag. To accommodate an opty that is already closed when it is created, you will also have to consider a post default for the flag in addition to the workflow.
    In this case, you could create a workflow on the opty using the "before modified record saved" trigger event. In the rule condition, have the workflow check for a closed opty and whether the status changed to closed during this modification. There are a number of options to validate this, including Sales Stage = Closed/Won or Lost, Closed Date populated for the first time, or Status = Closed. In any case, just validate that the opty was closed for the first time using the PRE function (i.e., PRE(Closed Date) is null and PRE(Closed Date) <> Closed Date). When your condition is met, set a flag that will trigger the event. You could also add a date for when the workflow conditions were met the first time, to ensure that you track when the rule was originally triggered.
    The next step is to have a workflow that unsets the flag if the conditions are not met. Set the order on this one to follow the rule above.
    The last rule is the wait/email rule and it uses the when modified record saved event. This rule triggers on the flag being checked, then waits to send the email.
    Test this and validate that it will work for your purposes. Based on this workflow, you should be able to create the other one, and I can help if you have any issues.
    Good Luck,
    Thom

  • Define Time-Based Fields for Cost Centers

    Dear All!
    I would like to know how I can change the time-based field setting
    of the business area field for cost centers to "period".
    The transaction is OKEG.
    I would be thankful.

    You can maintain master data for cost centers, cost elements, activity
    types, and business processes with time dependencies. You can make
    changes at any time for any given time interval. Data storage also takes
    place with a time reference. In this way, a master data record can have
    multiple database records storing different information.
    The smallest interval is one day. To ensure data consistency, you
    cannot change each field daily. The timeframes in which you can change a
    field depend on the field's function, which is fixed by the SAP R/3
    System and cannot be changed. Master data maintenance includes an
    automatic check of each field's time-based consistency, resulting in
    individual time-based maintenance for each field.
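    A rough illustration of that storage model, with one cost center field stored as several date-bounded database records (field names and dates are invented for the example):

        import java.time.LocalDate;
        import java.util.List;

        // One master data record, multiple database records, each valid for an interval.
        public class TimeSlices {
            record Slice(LocalDate from, LocalDate to, String businessArea) {}

            public static void main(String[] args) {
                List<Slice> costCenter = List.of(
                    new Slice(LocalDate.of(2023, 1, 1), LocalDate.of(2023, 6, 30), "BA01"),
                    new Slice(LocalDate.of(2023, 7, 1), LocalDate.of(9999, 12, 31), "BA02"));
                LocalDate key = LocalDate.of(2023, 8, 15);
                costCenter.stream()
                    .filter(s -> !key.isBefore(s.from()) && !key.isAfter(s.to()))
                    .forEach(s -> System.out.println(s.businessArea())); // prints BA02
            }
        }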
    Regards
    Prabhu

  • Exclude a table from time-based reduction

    Hi,
    I'd like to exclude a table from time-based reduction. How can I do this? Is there a manual on how to do customizing in TDMS?
    Regards
    p121848

    Thank you Markus for your annotation.
    AUFK is technically declared as a master data table, but it stores orders. Standard
    TDMS provides a reduction of this file, and in the client copies we did via TDMS a lot of records disappeared when we selected time-based reduction.
    Now we found out that some transactions such as OKB9 or KA03 refer to old internal orders. So we would like to maintain the customizing to exclude AUFK from reduction. But this is not possible in activity TD02P_TABLEINFO, because no changes can be made to tables that have transfer status 1 = Reduce.
    You can manipulate the transfer status in table CNVTDMS_02_STEMP before getting to activity TD02P_TABLEINFO, but I wonder whether this is the way one should do it.
    Any idea ?
    Regards p121848

  • Error in computing time based on TimeZones on a server

    I have a piece of code that generates timestamps based on US time zones such as Pacific, Central and Eastern. I get correct results when I run this code locally (EST) but results an hour off when I run the same code on a server running in US Central time. See the results below and the code that I ran.
    Results from running the code on the dev server:
    ts: 2007-11-05 10:39:03.19 (default time)
    tsEST: 2007-11-05 10:39:03.019 (EST based on TimeZone string America/New_York; should be 11:39)
    tsCST: 2007-11-05 09:39:03.019 (Central time based on TimeZone string US/Central; should be 10:39)
    tsPST: 2007-11-05 07:39:03.019 (Pacific time based on TimeZone string America/Los_Angeles; should be 8:39)
    Results from running the same code on the local machine:
    tsEST: 2007-11-05 11:39:01.272 (Eastern time based on TimeZone string America/New_York)
    tsCST: 2007-11-05 10:39:01.272 (Central time based on TimeZone string US/Central)
    tsPST: 2007-11-05 08:39:01.272 (Pacific time based on TimeZone string America/Los_Angeles)
    Below is the code that I ran.
    import java.sql.Timestamp;
    import java.text.DateFormat;
    import java.text.SimpleDateFormat;
    import java.util.Calendar;
    import java.util.Date;
    import java.util.GregorianCalendar;
    import java.util.TimeZone;

    // (The enclosing method must declare "throws ParseException" for the parse calls.)
    Timestamp ts = new Timestamp(Calendar.getInstance().getTime().getTime());
    DateFormat df1 = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SS"); // no explicit zone: uses the JVM default
    GregorianCalendar cal1 = new GregorianCalendar();
    Timestamp tsNow = new Timestamp(cal1.getTimeInMillis());
    TimeZone tsEST = TimeZone.getTimeZone("America/New_York");
    String inPattern = "yyyy-MM-dd HH:mm:ss.SS";
    DateFormat df = new SimpleDateFormat(inPattern);
    df.setTimeZone(tsEST);
    // tsNow.toString() renders in the default zone but is parsed here as EST
    Date date = df.parse(tsNow.toString());
    df.setTimeZone(tsEST);
    ts = new Timestamp(df1.parse(df.format(date)).getTime());
    System.out.println("tsEST: " + ts.toString());
    DateFormat df2 = new SimpleDateFormat(inPattern);
    TimeZone tsCST = TimeZone.getTimeZone("US/Central");
    df2.setTimeZone(tsCST);
    ts = new Timestamp(df1.parse(df2.format(date)).getTime());
    System.out.println("tsCST: " + ts.toString());
    TimeZone tsPST = TimeZone.getTimeZone("America/Los_Angeles");
    df.setTimeZone(tsPST);
    ts = new Timestamp(df1.parse(df.format(date)).getTime());
    System.out.println("tsPST: " + ts.toString());


  • Repository Services - Time based publishing missing

    Hi,
    We are running NW07, and want to configure time based publishing.
    I can't find the Repository Services entry for this. It is supposed to be under
    System Administration -> Content Management -> Repository Services,
    but it is not.
    can anyone help?

    After that, you have to define the real lifetime:
    http://help.sap.com/saphelp_nw70/helpdata/en/e8/a9a76828b8dc469969ff450ec81ced/frameset.htm
    And keep in mind that only users with no more than read permissions will see the document only during its lifetime. Users with additional write permissions can always see it.
    Kind regards
    Karin

  • Time-based publishing stopped working

    Hi,
    We currently have a problem with time-based publishing in KM. Since a few days ago, documents stopped becoming visible once they reach their "valid from" date. We have not been able to publish documents with TBP since then on that system.
    These errors keep appearing in the knowledgemanagement.#.log files, which seem related to this issue :
    #1.5#C000AC10005900130000012D00000B3400040F325EE040B8#1142608921379#com.sapportals.wcm.WcmException#irj#com.sapportals.wcm.WcmException.WcmException(62)#System#0#####ThreadPool.Worker1##0#0#Error##Plain###application property service not found com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishException: application property service not found
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.getApplicationPropertyService(TimebasedPublishServiceManager.java:589)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.setValidEventSent(TimebasedPublishServiceManager.java:540)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.handleVisibleResources(TimebasedPublishServiceManager.java:327)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.CheckValidFromSchedulerTask.run(CheckValidFromSchedulerTask.java:65)
         at com.sapportals.wcm.service.scheduler.SchedulerEntry.run(SchedulerEntry.java:128)
         at com.sapportals.wcm.service.scheduler.crt.PoolWorker.run(PoolWorker.java:107)
         at java.lang.Thread.run(Thread.java:479)
    The KMC version is SP2 with Patch level 29 hotfix 1, and is running on Windows Server 2003 with an Oracle database. We have opened an OSS message but while we are waiting I thought I would post this here in case anyone ever experienced this.
    Best regards,
    Olivier

    Hi,
    1. Have you checked that the tbp service is still assigned to your repository?
    2. If you create a new repository and assign these services, does it work?
    The service enables users to define a time frame during which documents are published (visible).
    Note that the time-dependent publishing service requires the application property service.
    This service cannot be configured.
    Patricio.

  • Time Based publishing on taxonomy

    Hi there,
    we have built a news application where everyone can post their own news items. The location for each user is a personal folder created using a userhome filter. The folder has time-based publishing activated; the lifetime can be entered in the edit form or directly on the news item through "Details".
    To display all items for all users, a taxonomy is used. However, the lifetime is ignored and all items are visible to all users.
    I have activated the tbp service on the /taxonomies repository, as well as enabled "Lifetime" on my "news" subfolder (everyone has read permissions on this one). Unfortunately, all items are still visible to all users.
    Is TBP possible on a taxonomy?
    Thanks,
    Kevin

    Hi Kevin,
    we have kind of the same scenario here.
    We have a KM folder where editors edit news. This folder has a taxonomy to display different views on the news folder. TBP is activated in the folder.
    It works like this:
    End users only see valid news in the taxonomy folders. BUT editors see even the expired ones, even though they only have read permissions in the taxonomy and the /taxdata repository. So it seems to us that the "Full Control" permission from the source KM folder somehow overrides the read-only permission in the taxonomy. (Maybe you have the same problem.)
    Our solution might be to program a "KM Resource List Filter" to filter the unwanted expired news out of the taxonomy folder for the editors as well.
    Cheers
    Christoph

  • Time Based workflow not working

    Hi,
    I am facing an issue with a time-based workflow. I designed a workflow with two actions: a Wait action first, and a Send Email action second. For the email, the From address is the current user and To is the owner of the record. One thing I noticed is that it works fine when "I" trigger the workflow and assign the owner as "me". It does not trigger the mail when the owner is somebody else. I can see the instance with the following error message in the workflow monitor: "The buscomp Service Request is no longer valid for workflow name ********. The workflow instance ******* has terminated."
    Edited by: user11100286 on Oct 1, 2010 4:33 PM


  • Time Based Publishing - Not Working

    Hello SAP KM Gurus-
    I had configured time-based publishing to work on our clustered portal.  Everything worked fine until we went to a central instance / dialog instance set-up.  Now time-based publishing no longer works and I can't seem to get it to work no matter what I do.  So far I have: scheduled the job on only one instance (as per the clustering guidelines in the SAP Library), turned it on in the properties of the repository (and for the folder I wish to use), and checked that the service is okay in KM Configuration.  However, it seems the job never comes by to hide the documents, because they just show up for read users no matter what I change.  As I stated before, this was working fine until we went to the new configuration.
    I've checked SAP Notes with no luck.  Anyone have any idea why this is not working?  I'm fresh out of ideas.
    Any help greatly appreciated...
    Jim

    Hello Anjali-
    Thanks for your post.  Yes, I have checked that.  Here are my settings: I have Check Valid From assigned to one instance running on the central instance and Check Valid To assigned to the other instance (we have two instances on each server), as per the help docs.  In the component monitor, tbp comes up green and its properties show State: OK.  On the repositories, I have both tbp and the properties service assigned, and when I enable tbp I get the Lifecycle tab for the documents.  It appears as if everything is set up right.  However, read users can still see the documents when they shouldn't.  It seems as if the Check Valid From and Check Valid To jobs just never run.
    Is there any way I can see whether the jobs have run and what the schedule was?  The tbp report is also showing nothing...  Does it look like I'm doing anything wrong above?  I'm on EP 14 / KM 14, by the way...
    Thanks for your help-
    Jim

  • Time-based publishing

    Hello All,
      I'm trying to use document lifetimes following this link:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d1/5b6635f5e0ef428fb513336881679b/frameset.htm
    So I have done these steps:
                  activated the lifetime property on my folder
                  activated the tasks "valid from" and "valid to"
    I'm trying to activate the time-based publishing service (TBP) but I'm unable to find it in the portal.
       I have looked in System Administration -> System Configuration -> KM -> Content Management -> Repository Services and in System Administration -> System Configuration -> KM -> Content Management -> Global Services, without success.
       How can I check/configure whether TBP is running on the portal?
       Do I have to do anything more to use the document lifetime service?
       I'm using NW 2004s and EP 7.0
       Thanks in advance,
       Regards

    Hi Yaiza,
    The TBP service should be running in your portal, but the configuration of the service cannot be changed. You only need to register this service with a repository, and you must also register the application property service, as described here: Time-Dependent Publishing Service - http://help.sap.com/saphelp_nw2004s/helpdata/en/c1/c87d3cf8ff3934e10000000a11405a/frameset.htm
    This registration is done by default for the standard /documents repository in the portal.
    Please be aware that a user with read/write permission for the folder where you have activated time-dependent publishing will be able to see all documents in that folder. Please refer to this: Lifetime of Documents - http://help.sap.com/saphelp_nw2004s/helpdata/en/e8/a9a76828b8dc469969ff450ec81ced/frameset.htm
    Please remember to set valid time periods for your documents. When time-dependent publishing is activated, the properties "valid from" and "valid to" have the same value by default.
    One more thing is to set the correct CM systems for the TBP scheduler task (at least one system in a clustered environment): Scheduler Tasks for Time-Dependent Publishing - http://help.sap.com/saphelp_nw70/helpdata/en/d1/5b6635f5e0ef428fb513336881679b/frameset.htm
    Hope it helps,
    Best Regards,
    Michal M.
