Time-Based Levels

Hi, I'm developing a chasing game. The game is split into a number of levels, and the objective is for the user to stay alive for as long as possible within a time limit.
Once the time limit is reached, the next level is shown.
Any ideas on how to go about doing this? I've heard of people using threads for it, but my attempts result in the applet not painting, or in the level not stopping after the allocated time.
Thanks

Obviously, you'll have to use threads. Not necessarily: since your game loop will probably call sleep(), you could count down an int and, when it reaches 0, your "timer" goes off. There is no real harm in using a Timer anyway, though...
So in your main thread, the one which contains the actual gameplay, put in a timer. When the timer hits 0, the level is over: load the new stuff, reset the timer, and start the thread over again.
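A minimal sketch of that countdown approach, assuming a 60-second level and a 20 ms tick (all class and method names here are illustrative, not from the original poster's code):

// Sketch: run the game loop in its own thread so the applet stays free to
// paint, and count the level time down one tick at a time.
public class GameLoopSketch implements Runnable {
    private static final long TICK_MS = 20;      // roughly 50 updates per second
    private static final long LEVEL_MS = 60000;  // 60-second level limit

    private long remaining = LEVEL_MS;
    private volatile boolean running = true;

    public void run() {
        while (running) {
            updateGame();                 // move the player and the chasers
            // repaint();                 // in an applet, request a repaint here
            try {
                Thread.sleep(TICK_MS);    // yields the CPU so painting can happen
            } catch (InterruptedException e) {
                running = false;          // stop cleanly if interrupted
            }
            remaining -= TICK_MS;         // the "count down an int" idea
            if (remaining <= 0) {         // time limit reached: level is over
                loadNextLevel();
                remaining = LEVEL_MS;     // reset the timer for the new level
            }
        }
    }

    private void updateGame() { /* game logic goes here */ }

    private void loadNextLevel() { System.out.println("Next level!"); }

    public static void main(String[] args) {
        new Thread(new GameLoopSketch()).start();
    }
}

The detail that matters for the painting problem is that the loop runs on its own thread and sleeps every tick; if the loop runs on the AWT event thread instead, paint() never gets a chance to execute.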

Similar Messages

  • SCM230 - Time based Capacity Leveling by decreased storage (Days Supply)

    Hi,
    Reading through the SCM230 training course on the subject of Capacity Leveling, the manual describes the following functionality:
    If you choose Time-Based Capacity Leveling by decreasing storage (days'
    supply), the system levels the order with the largest days' supply first, then
    the one with the second largest supply, and so on.
    Can anyone explain this functionality in plain English, and what/where are the parameters etc. to influence it?
    Thanks for your help
    Mark

    Hi Marius,
    I understand forward and combined forward/backward scheduling in capacity leveling using the SNP heuristics/optimizer method. However, forward leveling (or combined forward/backward leveling) does not fulfil the requirement, because capacity overloads (i.e. SNP planned orders) are then shifted into the future, causing shortages in the weeks where demand exists. The business needs a way to shift capacity overloads into earlier buckets, if available capacity exists in previous weeks or months. If sufficient available capacity does not exist in previous weeks, then the remaining capacity overload should stay in the week/month where it occurs due to excess demand. This gives the business an opportunity to meet demand by finding alternative ways of balancing that capacity overload: either adding capacity (running machines overtime, opening up production on a non-workday) or outsourcing the overload to sub-contractors. However, neither the heuristics nor the optimizer method of capacity leveling in SNP fulfils this requirement.
    Regards,
    Jagjeet.

  • Time-based publishing stopped working

    Hi,
    We currently have a problem with time-based publishing in KM. A few days ago, documents stopped becoming visible once they reach their "valid from" date, and we have not been able to publish documents with TBP on that system since.
    These errors keep appearing in the knowledgemanagement.#.log files and seem related to the issue:
    #1.5#C000AC10005900130000012D00000B3400040F325EE040B8#1142608921379#com.sapportals.wcm.WcmException#irj#com.sapportals.wcm.WcmException.WcmException(62)#System#0#####ThreadPool.Worker1##0#0#Error##Plain###application property service not found com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishException: application property service not found
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.getApplicationPropertyService(TimebasedPublishServiceManager.java:589)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.setValidEventSent(TimebasedPublishServiceManager.java:540)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.handleVisibleResources(TimebasedPublishServiceManager.java:327)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.CheckValidFromSchedulerTask.run(CheckValidFromSchedulerTask.java:65)
         at com.sapportals.wcm.service.scheduler.SchedulerEntry.run(SchedulerEntry.java:128)
         at com.sapportals.wcm.service.scheduler.crt.PoolWorker.run(PoolWorker.java:107)
         at java.lang.Thread.run(Thread.java:479)
    The KMC version is SP2 with patch level 29 hotfix 1, running on Windows Server 2003 with an Oracle database. We have opened an OSS message, but while we are waiting I thought I would post this here in case anyone has experienced it.
    Best regards,
    Olivier

    Hi,
    1. Have you checked that the time-based publishing service is still assigned to your repository?
    2. If you create a new repository and assign the service there, does it work?
    Enables users to define a time frame during which documents are published (visible).
    Note that the time-dependent publishing service requires the application property service.
    This service cannot be configured.
    Patricio.

  • TDMS copy based on Company Code and time based reduction.

    I'm struggling to understand the process of this scenario.
    I created a new client in the target system (using local client copy with SAP_UCSV).
    I configured a time based and company code based reduction.
    After the TDMS copy is complete, I check the target system and still find plenty of data that isn't related to the company codes I selected. It appears that it still copies part of the excluded data. For example: I can still find work orders and sales orders for company codes that I specifically didn't select for the copy process.
    Any idea why this is? It appears that the copy still brings across some of the excluded data, though not all of it.
    I double checked and triple checked my TDMS selection for company code based reduction and I can't find any error in there.
    To be more specific: I selected company code A01 A02 A03 and left B01 B02 B03 out. After the copy there are still orders visible in the target system with company code B01 B02 B03.
    This is after I created an empty target client first so it's not copying into an already existing client.
    Thanks for any guidance guys.

    Hm, not sure what's going on. There is no clear link between those existing orders and the company codes I initially selected. Then again it only seems to have a "high level" record of the data. As soon as one starts to drill down, the data is missing as expected. Same goes for other areas as well.
    Looks like we can live with that for now so I won't lose any sleep over it anymore.

  • TDMS - time based reduction - receiver system deletion

    Experts,
    I'm doing a time-based reduction. I'm on the step "Start Deletion of Data in Receiver System", and it has been running for over 18 hours.
    But I don't see any jobs running in SM66 on the Central/Control, Sender, or Receiver systems.
    When I click on the "Task" button, I see it has completed 8,444 of 12,246 sub-activities. There are 3,802 not yet started.
    We're on all the latest levels of DMIS and ECC.
    Any ideas?
    Thanks
    NICK

    Ashley and Niraj,
    Hey, I'm all for tips/tricks so don't worry about messing up my thread.
    I completely shut down the central/control system via stopsap and restarted. It was still in "running" status, but no jobs were running on the sender/receiver or central/control systems.
    So I tried the troubleshooting, but it was unclear to me what to do.
    I ended up highlighting the phase I referenced earlier, then doing "execute" again. The status changed from the "truck" to a green flag, and I started to see jobs run again on the receiver system. They have stopped again, but I see another job scheduled to run in a few minutes... It's just weird; I didn't run into this on my last time-based copy.
    I'll post a few things I've learned to increase performance:
    RDISP/MAX_WP_RUNTIME = 0
    At LEAST 25 WP and 25 BCK procs
    rec/client = OFF
    RDISP/BTCTIME = 60
    RUN STATS regularly
    TAKE OUT OF ARCHIVELOG MODE
    Read/implement these notes, then update these parameters:
    o TD05X_FILL_VBUK_1 Note 1058864
    o TD05X_FILL_VBUK_2 Note 1054584
    o TD05X_FILL_BKPF Note 1044518
    o TD05X_FILL_EBAN Note 1054583
    o TD05X_FILL_EQUI Note 1037712
    Set these Oracle indexes on the receiver system:
    Table: QMIH
      fields: MANDT, BEQUI
    Table: PRPR
      fields: MANDT, EQUNR
    Table: VBFA
      fields: MANDT, VBELN, VBELV, POSNV
    set parameter 'P_CLU' to 'Y' in the following
    activities before you start the activities for filling internal header tables:
    TD05X_FILL_BKPF
    TD05X_FILL_CE
    TD05X_FILL_EKKO
    TD05X_FILL_VBUK
    TD05X_FILL_VBUK_1
    TD05X_FILL_VBUK_2
    TD05X_FILL_VSRESB
    TD05X_FILL_WBRK_1
    To do this, run tcode CNVMBTACTPAR and specify the project number.
    IMPORTANT TCODEs
    CNV_MBT_TDMS_MY - main TDMS starting point
    CNVMBTMON - process monitor (you must know your project number)
    DTLMON - MWB transfer monitor
    CNVMBTACTPAR - activity parameters
    CNVMBTACTDEF - MBT PCL activity maintenance
    CNVMBTTWB - TDMS workbench to develop scrambling rules
    CNV_TDMS_HCM_SCRAM - run in the SENDER system for scrambling functionality
    Reports
    CNV_MBT_PACKAGE_REORG - reorganizes (i.e. deletes) TDMS projects
    CNV_MBT_DTL_FUGR_DELETE - deletes function groups associated with old projects
    Tables
    CNVMBTUSEDMTIDS - lists obsolete MTIDs
    IMPORTANT NOTES
    Note 894307 - TDMS: Tips, tricks, general problems, error tracing
    Note 1405597 - All relevant notes for TDMS Service Pack 12 and above
    Note 1402704 - TDMS Composite Note : Support Package Independent
    Note 890797 - SAP TDMS - required and recommended system settings
    Note 894904 - TDMS: Problems during deletion of data in receiver system
    Note 916763 - TDMS performance "composite SAP note"
    Note 1003051 - TDMS 3.0 corrections - Composite SAP Note
    Note 1159279 - Objects that are transferred with TDMS
    Note 939823 - TDMS: General questionnaire for problem specification
    Note 897100 - TDMS: Generating profiles for TDMS user roles
    Note 1068059 - To obtain the optimal deletion method for tables (receiver)
    Note 970531 - Installation and delta upgrade of DMIS 2006_1
    Note 970532 - Installation of DMIS Content 2006_1
    Note 1231203 - TDMS release strategy (Add-on: DMIS, DMIS_CNT, DMIS_EXT...)
    Note 1244346 - Support Packages for TDMS (add-on DMIS, DMIS_CNT, ...)
    I'm doing this for an ECC 6.0 EHP6 system, by the way.
    Any help with my deletion issue would still be appreciated, but do post tips I don't know about.
    NICK

  • Problem with time-based disaggregation

    Hi!
    In my planning area ZPA_REG, I have 2 data views. One is by week and the second one is by month. I enter 100 at month level for October 2007. Then I go into the second data view (week level) and I see
    S40  25
    S41  25
    S42  14
    S43  25
    S44  11
    My question is about S42 and S44. For S44, since only 3 days of that week fall in October, that could explain the result. But I don't understand why the value for S42 is 14.
    The Time Stream ID (transaction /sapapo/calendar) is based on the US calendar. Its calculation type is "with gaps".
    The storage bucket profile is by day, week, month and year.
    The time-based disaggregation for my KF is type P.
    Can you help me please?
    Thanks in advance,
    LB

    1. Check if there are any holidays in that week; it looks like there could be 2 holidays there.
    2. Zero out all the values in the weeks and re-enter 100. Sometimes, if a value already exists, the system follows that existing proportion by default instead of your planning area's disaggregation type.
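    For reference, proportional (type P) time-based disaggregation amounts to a day weighting. As a simplified sketch, assuming the split follows the workdays of the time stream (this is not the exact SNP algorithm):

    \[ v_{\text{week}} = v_{\text{month}} \times \frac{d_{\text{week}}}{d_{\text{month}}} \]

    where d_week is the number of the month's (working) days that fall into the week and d_month is the month's total. A week reduced by holidays therefore receives a smaller share, which matches point 1 above.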

  • Sales orders in TDMS company/time based reduction are outside the scope

    Guys,
    I have had some issues with TDMS, in that it didn't handle company codes without plants very well. That was fixed by SAP. But I have another problem now. If I do a company code and time-based reduction, it doesn't seem to affect my sales orders in VBAK/VBUK as I would have expected. I was hoping it would only copy across sales orders whose plant is assigned to one of the company codes specified in the company-code-based reduction scenario. That doesn't seem to be the case.
    VBAK is now about one third of the size of the original table (number of records). But I see no logic behind the reduction. I can clearly see plenty of sales documents that have a time stamp way back from what I specified in my copy procedure and I can see others that have plant entries that should have been excluded from the copy as they do belong to different company codes than the ones I specified.
    I was under the impression that TDMS would sort out the correct sales orders for me but somehow that doesn't seem to be happening. I have to investigate further as to what exactly it did bring across but just by looking at what's in the target system I can see plenty of "wrong" entries in there either with a date outside the scope or with a plant outside the scope.
    I can also see that at least the first 10'000 entries in VBAK in the target system have a valid from and to date of 00.00.0000 which could explain why the time based reduction didn't work?
    Did you have similar experiences with your copies? Do I have to do a more detailed reduction such as specifying tables/fields and values?
    Thanks for any suggestions
    Stefan
    Edited by: Stefan Sinzig on Oct 3, 2011 4:57 AM

    The reduction itself is not based on the date when the order was created; the logic extends it to invoices and offers, basically the complete update process.
    If you see data that definitely shouldn't be there I'd open an OSS call and let the support check what's wrong.
    Markus

  • Getting an error while activating a planning area "Enter values for planning horizon From and planning horizon To for the storage time profile level"

    Dear S&OP community,
    I am getting the following error while creating a planning area in a newly installed sandbox: "Enter values for planning horizon From and planning horizon To for the storage time profile level".
    This is what I did:
    1) Created new attributes and master data objects and activated them successfully.
    2) Time profile created and activated successfully
    3) Tried to create a planning area by assigning the time profile from step 2 and the master data from step 1. I am unable to save the data; the system returns
    this error: "Enter values for planning horizon From and planning horizon To for the storage time profile level"
    My understanding is that the time profile needs to be active but doesn't have to have values loaded...
    Any help is appreciated.
    Thanks,
    Krishna

    YS,
    Here are my time profile settings
    Level  Name       Display Horizon - Past  Display Horizon - Future
    1      Monthly    -6                      11
    2      Quarterly  -2                      3
    3      Yearly     -1                      2
    The time profile is active, but the time profile data is not loaded.
    Thanks,
    Krishna

  • Calculation of SLA times based on Service Organization

    Is it possible to calculate the SLA times based only on Service org?
    a) Using service contracts, i.e. create a service contract with only the org and assign the Service & Response profiles to it.
    Or else as described below.
    Please give me your further thoughts.
    I maintain the Service & response profiles at "Maintain Availability and Response Times" .
    Can I access these values directly in the BADI ?
    My scenario is
    a) An agent belongs to a service org.
    b) I define these profiles separately for each org (Org1, Org2, etc.) in the above tcode. This is manual entry; I know we don't have an org-to-profile mapping in that tcode, I just plainly maintain them.
    c) In the BADI, I check the org entered in the complaint.
    d) For example, if Org1 is entered, I want to access the profiles for Org1 (an if/else ladder).
    e) Then use these profiles to calculate the SLA times.
    f) Then save the document.
    g) Also trigger an e-mail stating the above timelines.
    Is the above flow possible?
    Let me know if you want me to post this onto another thread.
    Thanks
    amol

    Shalini,
    I will just be maintaining the service and response profiles in the "Maintain Availa..." tcode.
    There won't be an exact mapping stored in any table.
    My logic would be as follows; I don't know whether this is right or wrong:
    1) Once I get the org, I would compare like this:
    if (orgdata = org1)
    then service profile 1, etc.
    2) Then apply the profiles to the calculation of the SLA times.
    I think we can achieve what you said using CRM_ORDERADM_I_BADI,
    or we need to use the BADIs specifically mentioned for service contract determination and calculation of the SLA.
    As you know, in SAP, for SLA times we need a service contract for a) the customer, b) the org, and many other parameters; then we need to associate the service and response profiles with this service contract. When the service contract is determined in a complaint, the service and response profiles are used to get the SLA times.
    But my requirement is to determine the SLA times based on the service organisation alone, not on the customer or any other parameters.
    For example: if my service org is in India the times would be different than if my service org is in the US.
    So let me know which approach would be best:
    use the BADIs as above, or define different service contracts (without partner functions, customer, etc.) for the different orgs?
    Thanks
    Amol
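    Purely to illustrate the lookup idea (the real implementation would live in the ABAP BADI; the org and profile names below are made up), a table-driven mapping avoids the if/else ladder:

    import java.util.Map;

    // Hypothetical org-to-profile mapping; in practice this would come from
    // a customizing table rather than being hard-coded.
    public class SlaProfileLookup {
        private static final Map<String, String> PROFILE_BY_ORG = Map.of(
                "ORG_INDIA", "SERV_PROFILE_IN",
                "ORG_US", "SERV_PROFILE_US");

        static String profileFor(String serviceOrg) {
            // Fall back to a default profile for unmapped orgs.
            return PROFILE_BY_ORG.getOrDefault(serviceOrg, "SERV_PROFILE_DEFAULT");
        }

        public static void main(String[] args) {
            System.out.println(profileFor("ORG_INDIA"));  // prints SERV_PROFILE_IN
        }
    }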

  • Problem in Time Based Publishing Content

    Hi everyone,
    I'm working with time-based publishing.
    Using the XML Form Builder, I created 3 pieces of content, i.e. 3 XMLs.
    Then I created one iView for reading the content (a KM Navigation iView), and I set up properties like the
    layout set, the layout set mode, and the root folder of the content.
    After creating the iView, I checked the preview: all 3 pieces of content are visible in my iView.
    Now I want that iView to show only time-based content,
    i.e. content displayed according to its publishing period.
    For that, I enabled time-based publishing and the lifetime of each piece of content (XML) in the given way:
    time-dependent publishing in KM. I clicked on the right-hand side of the name of my folder -> Details -> Settings -> Lifetime; there you have to enable time-dependent publishing. Then I opened the folder, clicked on the right-hand side of the document -> Properties -> Lifetime, and entered the time span of the document.
    After the lifetime setup, I checked the content iView (created earlier) in the preview again,
    and again all 3 pieces of content are displayed, including the one whose lifetime has expired.
    Please give me a solution for this, or tell me whether more configuration is required.
    Note:
    I need to display content only within the applicable time window (from time and until time).
    Thanks in advance,
    Vasu

    I have waited more than 3 hours for the settings to apply,
    but I couldn't see any changes.
    Is there any other solution?
    Thanks
    Vasu

  • Time Based Workflow - how to make it work?

    Hello,
    Has anyone successfully built a Time Based Workflow? Could you share your examples?
    For me it does not work properly.
    I have tried to set up 2 workflows: on Opportunity Close Date and Account Contract Expiration Date.
    - Account Contract Expiration Date: I want an Account Owner to get an email notification exactly 6 months before the contract with his client expires. However, the email is triggered each time the record is modified, so I have seen in the workflow monitor that, on the contract expiration date minus 180 days, users will receive as many emails as the number of times they modified the record! Is there a way to avoid this situation?
    - Opportunity Close Date: I want to send an email to the Opportunity Owner's manager 10 days after the opportunity was closed. However, there will be the same issue as above, plus the wait action does not work with a PRE function.
    Please let me know what you think and if you have already built a Time Based Workflow that works correctly.
    Edited by: MagdaR on May 18, 2010 1:57 AM

    Let's start with the workflow for Opty Close Date.
    There are a lot of ways to do this, so you'll need to evaluate which way is best for your case, but the basics are to check that the opty is being closed for the first time, then set a flag. To accommodate an opty that is already closed when it is created, you will also have to consider a post default for the flag in addition to the workflow.
    In this case, you could create a workflow on the opty using the before modified record saved trigger event. In the Rule Condition, have the workflow check that the opty is closed and that the status changed to closed during this modification. There are a number of options to validate this, including sales stage = Closed/Won or Lost, Closed Date populated for the first time, or Status closed. In any case, just validate that the opty was closed for the first time using the PRE function (i.e., PRE(Closed Date) is null and PRE(Closed Date) <> Closed Date). When your condition is met, set a flag that will trigger the event. You could also store the date the wf conditions were first met, to ensure that you can track when the rule was originally triggered.
    The next step is to have a workflow that unsets the flag if the conditions are not met. Set the order on this one to follow the rule above.
    The last rule is the wait/email rule and it uses the when modified record saved event. This rule triggers on the flag being checked, then waits to send the email.
    Test this and validate that it will work for your purposes. Based on this workflow, you should be able to create the other one, and I can help if you have any issues.
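    A rough illustration of that first-time-closed check (all names are hypothetical; the real rules are built in the workflow UI, not in code):

    import java.util.Date;

    // Sketch of the three-rule pattern described above.
    public class OptyCloseRules {
        // Rule 1 (before modified record saved): set the flag only when the
        // opty is closed for the first time in this modification, mirroring
        // PRE(Closed Date) is null AND PRE(Closed Date) <> Closed Date.
        static boolean closedForFirstTime(Date preClosedDate, Date closedDate) {
            return preClosedDate == null && closedDate != null;
        }

        // Rule 2 (ordered after rule 1): clear the flag when the condition
        // does not hold, so later edits cannot re-trigger the email.
        static boolean shouldClearFlag(Date preClosedDate, Date closedDate) {
            return !closedForFirstTime(preClosedDate, closedDate);
        }

        // Rule 3 (when modified record saved) watches the flag, waits the
        // 10 days, and sends the email; that part stays in the workflow.
    }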
    Good Luck,
    Thom

  • How to know which time dimension level we are on?

    Hello,
    I would like to know whether there is a variable or any means to know dynamically which time dimension level we are on, in order to use that in a CASE WHEN clause.
    By using a sort of aggregation table in which one of the columns contains the name of the level, I can tell which level I am on, but I can't use that for drill-down. What I mean:
    Tab1 :
    'Year' as typelevel, year
    Tab2 :
    'Month' as typelevel, year, month
    In the BMM, I have made one logical table with tab1 and tab2 as sources, and typelevel, year and month as columns.
    tab1 has the year level in its content column;
    tab2 has the month level in its content column.
    So when in Answers I retrieve
    typelevel, year
    the result is : 'Year', 2008
    and when I request : typelevel, year, month
    the result is : 'Month', 2008, 1
    But if I want to drill from year to month in order to have:
    'Year', 2008
    and then after drill
    'Month', 2008, 1
    it is impossible, as a filter on typelevel='Year' is added at the month level, so it retrieves 0 rows.
    If someone has an idea on how to do that it would be very great.
    Thanks in advance for your help.

    Hi Supriya,
    OOTB I think you can use SharePoint Designer, but I would suggest custom code to iterate over all pages and get the lists that are associated with those pages.
    http://stackoverflow.com/questions/633633/sharepoint-how-can-i-find-all-the-pages-that-host-a-particular-web-part
    Another option, if those lists were never used, is to check for lists with empty data.
    I would use Get-SPLists to get all of the lists and check for zero items.
    http://blogs.technet.com/b/heyscriptingguy/archive/2010/09/15/use-windows-powershell-to-manage-lists-in-sharepoint-2010.aspx
    http://sharepointrelated.com/2011/11/28/get-all-sharepoint-lists-by-using-powershell/
    Hope this helps!
    Ram - SharePoint Architect
    Blog - SharePointDeveloper.in
    Please vote or mark your question answered, if my reply helps you

  • Define Time-Based Fields for Cost Centers

    Dear All!
    I would like to know how I can change the setting
    of the business area time-based field of cost centers to "period".
    The transaction is OKEG.
    I would be thankful.

    You can maintain master data for cost centers, cost elements, activity
    types, and business processes with time dependencies. You can make
    changes at any time for any given time interval. Data storage also takes
    place with a time reference. In this way, a master data record can have
    multiple database records storing different information.
    The smallest interval is one day. To ensure data consistency, you
    cannot change each field daily. The timeframes in which you can change a
    field depend on the field's functions, which are fixed by the SAP R/3
    System and cannot be changed. Master data maintenance includes an
    automatic check of each field's time-based consistency, resulting in
    individual time-based maintenance for each field.
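    For illustration (hypothetical values), a cost center whose business area changes mid-year would then be stored as two database records, one per validity interval:

    Cost Center  Valid From   Valid To     Business Area
    4711         01.01.2007   30.06.2007   0001
    4711         01.07.2007   31.12.9999   0002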
    Regards
    Prabhu

  • Exclude a table from time-based reduction

    Hi,
    I'd like to exclude a table from time-based reduction. How can I do this? Is there a manual on how to do customizing in TDMS?
    Regards
    p121848

    Thank you Markus for your annotation.
    AUFK is technically declared as a master data table, but it stores orders. Standard
    TDMS provides a reduction of this table, and in the client copies we did via TDMS a lot of records disappeared when we selected time-based reduction.
    Now we found out that some transactions, such as OKB9 or KA03, refer to old internal orders. So we would like to maintain the customizing to exclude AUFK from reduction. But this is not possible in activity TD02P_TABLEINFO, because no changes can be made to tables that have transfer status 1 = Reduce.
    You can manipulate the transfer status in table CNVTDMS_02_STEMP before getting to activity TD02P_TABLEINFO, but I wonder whether this is the way one should do it.
    Any idea ?
    Regards p121848

  • Error in computing time based on TimeZones on a server

    I have a piece of code that generates timestamps based on US TimeZones such as Pacific, Central and Eastern. I get correct results when I run this code locally (EST) but get results an hour off when I run the same code on a server running in US central time. See the results below and the code that I run.
    Results from running the code on Dev Server:
    ts: 2007-11-05 10:39:03.19 (default time)
    tsEST: 2007-11-05 10:39:03.019 (EST based on TimeZone string America/New_York; should be 11:39)
    tsCST: 2007-11-05 09:39:03.019 (Central time based on TimeZone string US/Central; should be 10:39)
    tsPST: 2007-11-05 07:39:03.019 (Pacific time based on TimeZone string America/Los_Angeles; should be 8:39)
    Results from running the same code on Local machine:
    tsEST: 2007-11-05 11:39:01.272 (Eastern time based on TimeZone string America/New_York)
    tsCST: 2007-11-05 10:39:01.272 (Central time based on TimeZone string US/Central)
    tsPST: 2007-11-05 08:39:01.272 (Pacific time based on TimeZone string America/Los_Angeles)
    Below is the code that I ran.
    import java.sql.Timestamp;
    import java.text.DateFormat;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.GregorianCalendar;
    import java.util.TimeZone;

    String inPattern = "yyyy-MM-dd HH:mm:ss.SS";

    // df1 keeps the JVM's default time zone; it re-parses the zone-formatted
    // strings into Timestamps for printing.
    DateFormat df1 = new SimpleDateFormat(inPattern);

    // Current time as a Timestamp.
    GregorianCalendar cal1 = new GregorianCalendar();
    Timestamp tsNow = new Timestamp(cal1.getTimeInMillis());

    // Parse the default-zone string tsNow.toString() as if it were New York
    // wall-clock time.
    TimeZone tsEST = TimeZone.getTimeZone("America/New_York");
    DateFormat df = new SimpleDateFormat(inPattern);
    df.setTimeZone(tsEST);
    Date date = df.parse(tsNow.toString());

    // Eastern
    Timestamp ts = new Timestamp(df1.parse(df.format(date)).getTime());
    System.out.println("tsEST: " + ts.toString());

    // Central
    TimeZone tsCST = TimeZone.getTimeZone("US/Central");
    DateFormat df2 = new SimpleDateFormat(inPattern);
    df2.setTimeZone(tsCST);
    ts = new Timestamp(df1.parse(df2.format(date)).getTime());
    System.out.println("tsCST: " + ts.toString());

    // Pacific
    TimeZone tsPST = TimeZone.getTimeZone("America/Los_Angeles");
    df.setTimeZone(tsPST);
    ts = new Timestamp(df1.parse(df.format(date)).getTime());
    System.out.println("tsPST: " + ts.toString());

    Actually, I did try a complete removal and re-install with the same results. On the desktop machine on which the Ctime client is working, Ctime was installed on an earlier system version and was working before the upgrade to the current system. The client continued to work. This at least indicates an initial setup problem on 10.2.4 and may indeed relate to the product not yet being ready for the latest system.
