Time based inspection

Once produced, the material goes into an oven and is kept there for a few hours. For example, material A can be kept in the oven for 6.5 hours, material B for 5.5 hours, and material C for 6.0 hours.
If the material is pulled out of the oven before the specified time, the system should not allow the user to move the stock out of the oven. The stock will be unrestricted when it enters the oven.
My thought is to create an inspection lot when the material is moved into the oven (inspection type 08). When results are entered, they would be validated against the time at which the material was moved into the oven, checking whether the elapsed time is less than the required time for that material.
I am not sure where to capture the hours for each material, or how to validate the result entered against the time at which the material was moved into the oven.
The client does not want to create a production order for this process.
Is there a standard way to map this in SAP?
Please advise…
Thanks
S

Well, you'd have to get an ABAPer to write the code. But assuming you create the inspection lot (i.e., via whatever movement you set up) when the material is placed in the oven, then your start time is the lot creation time. Alternatively, you could use inspection points and create the inspection point when the material is placed in the oven. The start time would be part of the inspection point identifier and could be edited to the exact time.
When someone attempts the UD, take the current date/time and subtract the date/time of the inspection point or of the lot creation. Compare that difference to the value in the inspection plan. If it doesn't exceed the required time, don't allow the UD.
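The check itself is just a date/time difference compared against the required cure hours. Here is a minimal sketch of that logic in Java, purely illustrative: the lot creation time and the required hours would come from wherever you decide to store them in SAP, and in the real system the check would be coded in ABAP at usage-decision time.

    import java.time.Duration;
    import java.time.LocalDateTime;

    public class CureTimeCheck {

        // True if the stock has been in the oven at least the required number of hours.
        static boolean cureTimeReached(LocalDateTime ovenEntry, double requiredHours) {
            Duration elapsed = Duration.between(ovenEntry, LocalDateTime.now());
            return elapsed.toMinutes() >= (long) (requiredHours * 60);
        }

        public static void main(String[] args) {
            // Hypothetical example: material A requires 6.5 hours; the lot was created 5 hours ago.
            LocalDateTime lotCreation = LocalDateTime.now().minusHours(5);
            if (!cureTimeReached(lotCreation, 6.5)) {
                System.out.println("Cure time not reached - block the usage decision.");
            }
        }
    }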
There are many places you could store the cure time for the material. You could use a general characteristic in a material class assigned to the material; that would probably be my first choice. You could also use the QM inspection text in the material master.
This shouldn't be too hard for a good programmer.
FF

Similar Messages

  • Auto creation of time based inspection points against in process required

    Hi
    We want the system to create/generate inspection points automatically when a process order is released and the inspection lot is created. Currently we have to create inspection points manually using either QE71 or QE11. Previously, when scheduling was active in PP, inspection points were created automatically for the duration of the process order. But now PP scheduling is not required and, as a result, inspection points are no longer created automatically.
    Please assist.
    regards
    Chris

    SAP provides an FM (QAPP_CUST_IP_CREATE) to be used as a template for creating your own FM for automatically generating inspection points. There is documentation in the FM to explain it. You then assign this FM in configuration where you define the inspection point identifiers.
    FF

  • Create Time Master Inspection Characteristic (MIC)

    Hi all,
    Please tell me how to create a time-based MIC (inspection characteristic) where the format of the values should be 00.00 to 12.00.
    I tried creating it with 2 decimal places, but the result is like 0.00 and 12.00; the first digit is being dropped.
    Your help is appreciated.
    Regards

    Dear colleague,
    The two types of characteristics in standard SAP are
    qualitative (for example, "product color") or
    quantitative (for example, "material density").
    Please see the online documentation:
    http://help.sap.com/saphelp_47x200/helpdata/en/61/2db434d6e3681ae10000009b38f83b/frameset.htm
    There is no possibility to convert numerical fields into dates in results recording. Your request might be achievable using classification of characteristics. Please see the following online documentation:
    http://help.sap.com/saphelp_47x200/helpdata/en/ec/62ab0a416a11d1896d0000e8322d00/frameset.htm
    under
    Entering Basic Data / Format / Data Type: Time Format/Date Format
    Thank you for contacting us.
    Maria

  • Sales orders in TDMS company/time based reduction  are outside the scope

    Guys,
    I have had some issues with TDMS in that it didn't handle company codes without plants very well. That was fixed by SAP. But I have another problem now. If I do a company code and time based reduction, it doesn't seem to affect my sales orders in VBAK/VBUK as I would have expected. I was hoping it would only copy across sales orders that have a plant assigned to one of the company codes specified in the company code based reduction scenario. That doesn't seem to be the case.
    VBAK is now about one third of the size of the original table (number of records). But I see no logic behind the reduction. I can clearly see plenty of sales documents that have a time stamp way back from what I specified in my copy procedure and I can see others that have plant entries that should have been excluded from the copy as they do belong to different company codes than the ones I specified.
    I was under the impression that TDMS would sort out the correct sales orders for me but somehow that doesn't seem to be happening. I have to investigate further as to what exactly it did bring across but just by looking at what's in the target system I can see plenty of "wrong" entries in there either with a date outside the scope or with a plant outside the scope.
    I can also see that at least the first 10'000 entries in VBAK in the target system have a valid from and to date of 00.00.0000 which could explain why the time based reduction didn't work?
    Did you have similar experiences with your copies? Do I have to do a more detailed reduction such as specifying tables/fields and values?
    Thanks for any suggestions
    Stefan
    Edited by: Stefan Sinzig on Oct 3, 2011 4:57 AM

    The reduction itself is not based on the date when the order was created; the logic extends it to invoices and offers, basically the complete update process.
    If you see data that definitely shouldn't be there I'd open an OSS call and let the support check what's wrong.
    Markus

  • Calculation of SLA times based on Service Organization

    Is it possible to calculate the SLA times based only on Service org?
    a) Using service contracts, i.e., create a SC with only the org and assign the Service & Response profiles.
    Or else as mentioned below.
    Please give me your thoughts.
    I maintain the Service & Response profiles at "Maintain Availability and Response Times".
    Can I access these values directly in the BADI?
    My scenario is:
    a) An agent belongs to a service org.
    b) I define these profiles separately for each org (Org1, Org2, etc.) in the above tcode. This is a manual entry; I know we don't have an org-to-profile mapping in the above tcode, I just plainly maintain them.
    c) In the BADI I check the org entered in the complaint.
    d) For example, if Org1 is entered, I want to access the profiles for Org1 (if-else ladder).
    e) Then use these profiles to calculate the SLA times.
    f) Then save the document.
    g) Also trigger an e-mail stating the above timelines.
    Is the above flow possible?
    Let me know if you want me to post this onto another thread.
    Thanks
    amol

    Shalini,
    I will be just maintaining the service and response profiles in the "Maintain Availa..." tcode.
    There won't be an exact mapping stored in any table.
    My logic would be as follows (I don't know whether this is right or wrong):
    1) Once I get the org, I would compare like this:
    if (orgdata = org1)
    then service profile 1, etc.
    2) Then apply the profiles to the calculation of the SLA times (see the sketch after this reply).
    I think we can achieve what you said using CRM_ORDERADM_I_BADI.
    Or we need to use the BADIs specifically mentioned for service contract determination and calculation of SLA.
    As you know, in SAP for SLA times we need to have a service contract for a) the customer, b) the org, and many other parameters; then to this SC we need to assign the service and response profiles. When the SC is determined in a complaint, the service and response profiles are used to get the SLA times.
    But my requirement is to determine the SLA times based on the service organization, not based on the customer or any other parameters.
    For example: if my service org is in India the times would be different; if my service org is in the US the times would be different.
    So let me know which approach would be best:
    Use BADIs as above, or define different service contracts (without partner functions such as customer) for the different orgs?
    Thanks
    Amol
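    A minimal sketch of the org-to-profile lookup described above, in Java and purely illustrative (the org names and response hours are hypothetical; in CRM this logic would sit in the SLA determination BAdI against the profiles maintained in the transaction mentioned above):

        import java.time.Duration;
        import java.time.ZonedDateTime;
        import java.util.Map;

        public class SlaByServiceOrg {

            // Hypothetical mapping: service organization -> first-response time in hours.
            static final Map<String, Integer> RESPONSE_HOURS_BY_ORG = Map.of(
                    "ORG_INDIA", 8,
                    "ORG_US", 4);

            static ZonedDateTime firstResponseDue(String serviceOrg, ZonedDateTime complaintCreated) {
                // Fall back to a default profile if the org is not maintained.
                int hours = RESPONSE_HOURS_BY_ORG.getOrDefault(serviceOrg, 24);
                return complaintCreated.plus(Duration.ofHours(hours));
            }

            public static void main(String[] args) {
                System.out.println("Due: " + firstResponseDue("ORG_INDIA", ZonedDateTime.now()));
            }
        }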

  • Problem in Time Based Publishing Content

    Hi every1,
    I'm working with time-based publishing.
    Using XML Form Builder I created 3 pieces of content, i.e., 3 XMLs.
    Then I created one iView for reading the contents (a KM Navigation iView) and set up properties such as
    layout set, layout set mode, and the root folder of the contents.
    After creating the iView I checked the preview: all 3 contents are visible in my iView.
    Now I want to show only time-based content in that iView,
    i.e., contents displayed according to their validity.
    For that I enabled time-based publishing and the lifetime of each piece of content (XML) in the following way:
    Time-dependent publishing in KM: I clicked on the right-hand side of the name of my folder -> go to Details -> Settings -> Lifetime; there you have to enable time-dependent publishing. Then I opened the folder, clicked on the right-hand side of the document -> Properties -> Lifetime, and entered the time span of the document.
    After setting up the lifetime, I again looked at the preview of the iView for reading the contents (created earlier);
    again all 3 contents are displayed, including the content whose lifetime has expired.
    Please give me a solution for this, or tell me if any more configuration is required.
    Note:
    I need to display only the contents within the applicable time window (from time and until time).
    Thanks in advance
    Vasu

    I have waited more than 3 hours for the settings to apply,
    but I couldn't find any changes.
    Any other solution?
    Thanks
    Vasu

  • Time Based Workflow - how to make it work?

    Hello,
    Has anyone successfully built a Time Based Workflow? Could you share your examples?
    For me it does not work properly.
    I have tried to set up 2 workflows: on Opportunity Close Date and Account Contract Expiration Date.
    - Account Contract Expiration Date: I want an Account Owner to get an email notification exactly 6 months before the contract with his client expires. However, the email is triggered each time the record is modified, so I have seen in the workflow monitor that on the date of contract expiration minus 180 days, users will receive as many emails as the number of times they modified the record! Is there a way to avoid this situation?
    - Opportunity Close Date: I want to send an email to the Opportunity Owner's Manager 10 days after the opportunity was closed. However, there will be the same issue as above, plus the Wait action is not working with a PRE function.
    Please let me know what you think and if you have already built a Time Based Workflow that works correctly.
    Edited by: MagdaR on May 18, 2010 1:57 AM

    Let's start with the workflow for Opty Close Date.
    There are a lot of ways to do this, so you'll need to evaluate which way is best for your case, but the basics are to check that the opty is being closed for the first time, then set a flag. In order to accommodate the opty being closed when it is created, you will have to consider a post default for the flag in addition to the workflow.
    In this case, you could create a workflow on Opty using the "before modified record saved" trigger event. In the rule condition, have the workflow check that the opty is closed and that the status changed to closed during this modification. There are a number of options to validate this, including Sales Stage = Closed/Won or Lost, Closed Date is populated for the first time, or Status is closed. In any case, just validate that the opty was closed for the first time using the PRE function (i.e., PRE(Closed Date) is null and PRE(Closed Date)<>Closed Date). When your condition is met, set a flag that will trigger the event. You could also add a date on which the workflow conditions were first met, to ensure that you track when the rule was originally triggered. (A small illustration of this first-time-closed check follows this reply.)
    The next step is to have a workflow that unsets the flag if the conditions are not met. Set the order on this one to follow the rule above.
    The last rule is the wait/email rule and it uses the when modified record saved event. This rule triggers on the flag being checked, then waits to send the email.
    Test this and validate that it will work for your purposes. Based on this workflow, you should be able to create the other one, and I can help if you have any issues.
    Good Luck,
    Thom
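    The "closed for the first time" condition is just a comparison between the previous and current value of the Close Date. A purely illustrative sketch of that check in Java (in CRM On Demand this is an expression in the workflow rule condition, not code; the field names here are hypothetical):

        import java.time.LocalDate;

        public class FirstCloseCheck {

            // Mirrors: PRE(Closed Date) is null AND PRE(Closed Date) <> Closed Date
            static boolean closedForFirstTime(LocalDate previousCloseDate, LocalDate currentCloseDate) {
                return previousCloseDate == null && currentCloseDate != null;
            }

            public static void main(String[] args) {
                System.out.println(closedForFirstTime(null, LocalDate.now()));            // true: set the flag
                System.out.println(closedForFirstTime(LocalDate.now(), LocalDate.now())); // false: was already closed
            }
        }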

  • Define Time-Based Fields for Cost Centers

    Dear All!
    I would like to know how I am able to change the setting
    of the business area time-based field for cost centers to "period".
    The transaction is OKEG.
    I would be thankful.

    You can maintain master data for cost centers, cost elements, activity
    types, and business processes with time dependencies. You can make
    changes at any time for any given time interval. Data storage also takes
    place with a time reference. In this way, a master data record can have
    multiple database records storing different information.
    The smallest interval is one day. To ensure data consistency, you
    cannot change each field daily. The timeframes in which you can change a
    field depend on the field functions, which are fixed by the SAP R/3
    System and cannot be changed. Master data maintenance includes an
    automatic check for each field's time-based consistency, resulting in
    individual time-based maintenance for each field.
    Regards
    Prabhu

  • Exclude a table from time-based reduction

    Hi,
    I'd like to exclude a table from time-based reduction. How can I do this? Is there a manual on how to do customizing in TDMS?
    Regards
    p121848

    Thank you Markus for your annotation.
    AUFK is technically declared as a master data table, but it stores orders. Standard
    TDMS provides a reduction of this table, and in the client copies we did via TDMS a lot of records disappeared when we selected time-based reduction.
    Now we found out that some transactions such as OKB9 or KA03 refer to old internal orders. So we would like to maintain the customizing to exclude AUFK from reduction. But this is not possible in activity TD02P_TABLEINFO, because no changes can be made to tables that have transfer status 1 = Reduce.
    You can manipulate the transfer status in table CNVTDMS_02_STEMP before getting to activity TD02P_TABLEINFO, but I wonder whether this is the way one should do it.
    Any idea ?
    Regards p121848

  • Error in computing time based on TimeZones on a server

    I have a piece of code that generates timestamps based on US TimeZones such as Pacific, Central and Eastern. I get correct results when I run this code locally (EST) but get results an hour off when I run the same code on a server running in US central time. See the results below and the code that I run.
    Results from running the code on Dev Server:
    ts: 2007-11-05 10:39:03.19 (default time)
    tsEST: 2007-11-05 10:39:03.019 (EST based on TimeZone string America/New_York; should be 11:39)
    tsCST: 2007-11-05 09:39:03.019 (Central time based on TimeZone string US/Central; should be 10:39)
    tsPST: 2007-11-05 07:39:03.019 (Pacific time based on TimeZone string America/Los_Angeles; should be 8:39)
    Results from running the same code on Local machine:
    tsEST: 2007-11-05 11:39:01.272 (Eastern time based on TimeZone string America/New_York)
    tsCST: 2007-11-05 10:39:01.272 (Central time based on TimeZone string US/Central)
    tsPST: 2007-11-05 08:39:01.272 (Pacific time based on TimeZone string America/Los_Angeles)
    Below is the code that I ran.
    import java.sql.Timestamp;
    import java.text.DateFormat;
    import java.text.SimpleDateFormat;
    import java.util.Calendar;
    import java.util.Date;
    import java.util.GregorianCalendar;
    import java.util.TimeZone;

    public class TimeZoneTest {
        public static void main(String[] args) throws Exception {
            // Current time as a Timestamp in the default (server) time zone
            Timestamp ts = new Timestamp(Calendar.getInstance().getTime().getTime());
            DateFormat df1 = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SS");
            GregorianCalendar cal1 = new GregorianCalendar();
            Timestamp tsNow = new Timestamp(cal1.getTimeInMillis());
            // Eastern time
            TimeZone tsEST = TimeZone.getTimeZone("America/New_York");
            String inPattern = "yyyy-MM-dd HH:mm:ss.SS";
            DateFormat df = new SimpleDateFormat(inPattern);
            df.setTimeZone(tsEST);
            Date date = df.parse(tsNow.toString());
            df.setTimeZone(tsEST);
            ts = new Timestamp(df1.parse(df.format(date)).getTime());
            System.out.println("tsEST: " + ts.toString());
            // Central time
            DateFormat df2 = new SimpleDateFormat(inPattern);
            TimeZone tsCST = TimeZone.getTimeZone("US/Central");
            df2.setTimeZone(tsCST);
            ts = new Timestamp(df1.parse(df2.format(date)).getTime());
            System.out.println("tsCST: " + ts.toString());
            // Pacific time
            TimeZone tsPST = TimeZone.getTimeZone("America/Los_Angeles");
            df.setTimeZone(tsPST);
            ts = new Timestamp(df1.parse(df.format(date)).getTime());
            System.out.println("tsPST: " + ts.toString());
        }
    }

    Actually, I did try a complete removal and re-install with the same results. On the desktop machine on which the Ctime client is working, Ctime was installed on an earlier system version and was working before the upgrade to the current system. The client continued to work. This at least indicates an initial setup problem on 10.2.4 and may indeed relate to the product not yet being ready for the latest system.

  • Repository Services - Time based publishing missing

    Hi,
    We are running NW07, and want to configure time based publishing.
    I can't find the repository service for this; it is supposed to be under
    System Administration -> Content Management -> Repository Services,
    but it is not there.
    Can anyone help?

    After that, you have to define the real lifetime
    http://help.sap.com/saphelp_nw70/helpdata/en/e8/a9a76828b8dc469969ff450ec81ced/frameset.htm
    And keep in mind that only users with no more than read permission will see the document only during its lifetime; users with additional write permissions can always see it.
    Kind regards
    Karin

  • Time-based publishing stopped working

    Hi,
    We currently have a problem with time-based publishing in KM. Since a few days ago, documents stopped becoming visible once they reach their "valid from" date. We have not been able to publish documents with TBP since then on that system.
    These errors keep appearing in the knowledgemanagement.#.log files, which seem related to this issue :
    #1.5#C000AC10005900130000012D00000B3400040F325EE040B8#1142608921379#com.sapportals.wcm.WcmException#irj#com.sapportals.wcm.WcmException.WcmException(62)#System#0#####ThreadPool.Worker1##0#0#Error##Plain###application property service not found com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishException: application property service not found
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.getApplicationPropertyService(TimebasedPublishServiceManager.java:589)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.setValidEventSent(TimebasedPublishServiceManager.java:540)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.handleVisibleResources(TimebasedPublishServiceManager.java:327)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.CheckValidFromSchedulerTask.run(CheckValidFromSchedulerTask.java:65)
         at com.sapportals.wcm.service.scheduler.SchedulerEntry.run(SchedulerEntry.java:128)
         at com.sapportals.wcm.service.scheduler.crt.PoolWorker.run(PoolWorker.java:107)
         at java.lang.Thread.run(Thread.java:479)
    The KMC version is SP2 with Patch level 29 hotfix 1, and is running on Windows Server 2003 with an Oracle database. We have opened an OSS message but while we are waiting I thought I would post this here in case anyone ever experienced this.
    Best regards,
    Olivier

    Hi,
    1. Have you checked that the time-based publishing (tbp) service is still assigned to your repository?
    2. If you create a new repository and assign this service, does it work?
    Enables users to define a time frame during which documents are published (visible).
    Note that the time-dependent publishing service requires the application property service.
    This service cannot be configured.
    Patricio.

  • Time Based publishing on taxonomy

    Hi there,
    we have built a news application where everyone can post their own news items. The location for each user is a personal folder created using a userhome filter. The folder has time-based publishing activated; the lifetime can be entered in the edit form or directly on the news item through 'Details'.
    To display all items for all users, a taxonomy is used. However, the lifetime is ignored and all items are visible to all users.
    I have activated the tbp service on the /taxonomies repository, as well as enabled 'Lifetime' on my 'news' subfolder (everyone has read permissions on this one). Unfortunately, all items are still visible to all users.
    Is TBP possible on a taxonomy?
    Thanks,
    Kevin

    Hi Kevin,
    we have kind of the same scenario here.
    We have a KM Folder where editors are editing news. This folder has a taxonomy to display different views on this news folder. TBP is activated in the folder.
    It works like that:
    End users only see valid news in the taxonomy folders, BUT editors see even the expired ones, despite having only read permissions on the taxonomy and the /taxdata repository. So it seems to us that the "Full Control" permission from the source KM folder is kind of overriding the read-only permission in the taxonomy. (Maybe you have the same problem.)
    Our solution might be to program a "KM Resource List Filter" to filter the unwanted expired news out of the taxonomy folder for the editors as well.
    Cheers
    Christoph

  • Time Based workflow not working

    Hi,
    I am facing an issue with a time-based workflow. I designed a workflow with two actions: the first action is a Wait action and the second is Send Email. For Send Email, the From address is the current user and To is the owner of the record. One thing I noticed is that it works fine when "I" trigger the workflow and assign the owner as "me"; it does not trigger the mail when the owner is somebody else. I can see the instance with the following error message in the workflow monitor: "The buscomp Service Request is no longer valid for workflow name ********. The workflow instance ******* has terminated."
    Edited by: user11100286 on Oct 1, 2010 4:33 PM


  • Time Based Publishing - Not Working

    Hello SAP KM Gurus-
    I had configured time-based publishing to work on our clustered portal. Everything worked fine until we went to a central instance / dialog instance set-up. Now time-based publishing no longer works and I can't seem to get it to work no matter what I do. So far I have: scheduled the job on only one instance (as per the clustering guidelines in the SAP Library), turned it on in the properties of the repository (and for the folder I wish to use), and checked that the service is okay in KM Configuration. However, it seems like the job never comes by to hide the documents, because they just show up for read users no matter what I change. As I stated before, this was working fine until we went to the new configuration.
    I've checked SAP Notes with no luck.  Anyone have any idea why this is not working?  I'm fresh out.
    Any help greatly appreciated...
    Jim

    Hello Anjali-
    Thanks for your post. Yes, I have checked that. Here are my settings: I have Check Valid From assigned to one instance running on the central instance and Check Valid To assigned to the other instance (we have two instances on each server), as per the help docs. In the component monitor, tbp is coming up green and for properties it is State: OK. On the repositories, I have both tbp and properties assigned, and when I enable tbp I get the Lifetime tab for the documents. It appears as if everything is set up right. However, the read users can see the documents just fine when they shouldn't. It seems as if the Check Valid From and Check Valid To jobs just never run.
    Is there any way I can see whether the jobs have run and what the schedule was? The tbp report is also showing nothing... Does it look like I'm doing anything wrong above? I'm on EP 14 / KM 14, by the way...
    Thanks for your help-
    Jim
