Significance and scope of STO & STR

Folks
I am trying to understand the significance and scope of STOs & STRs.
1. Are STOs & STRs primarily used to transport stock between
       a). plants within the same company, or
       b). plants in different companies?
2. At what point in the logistics/SCM process does an STR/STO get created?
3. Are STRs/STOs created manually or through MRP / SNP?
4. What's the standard approach to convert an STR to an STO?
Thanks

How do you differentiate an STO from a PO? Both are purchase orders that are created with ME21N and changed with ME22N.
If the plants are in the same company code, then you use order document type UB.
If the plants are in different company codes, then you either stay with the standard and use NB as the document type, or, if you still want to differentiate between real external orders and purchase orders placed with a partner company, you can create an extra document type like IC for intercompany.
MRP creates planned orders or requisitions only, based on the settings made in the MRP parameters and the indicators set in the material master, but never an external document like a purchase order or STO.
The main difference between UB and NB stock transport orders is that in UB orders you just mention a plant instead of a vendor number. Further, in NB orders you need to have a price, because there is a billing requirement between two legal entities, while UB orders do not have price conditions (except for freight costs with external carriers).
The setting from which SAP knows whether it has to create an NB or a UB order is made in the purchasing customizing for stock transports. Further, SAP knows from enterprise customizing whether the plants involved belong to the same or different company codes.
In the case of different company codes, you need to create vendor master data for each plant, and the vendor needs to be linked to a plant (table T001W).
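The decision described above can be sketched in a few lines. This is an illustrative sketch only, not SAP code: SAP makes this determination through customizing, and the plant-to-company-code mapping here (with made-up plant and company-code names) just mirrors the role of table T001W.

```python
# Hypothetical plant -> company code assignments (mirrors table T001W's role)
plant_company_code = {
    "PL01": "1000",
    "PL02": "1000",
    "PL03": "2000",
}

def sto_document_type(supplying_plant, receiving_plant):
    """Return the order document type for a stock transport order."""
    same_company = (plant_company_code[supplying_plant]
                    == plant_company_code[receiving_plant])
    # Same company code -> intra-company STO (UB, no pricing conditions);
    # different company codes -> cross-company STO (NB, billing required).
    return "UB" if same_company else "NB"

print(sto_document_type("PL01", "PL02"))  # UB
print(sto_document_type("PL01", "PL03"))  # NB
```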

Similar Messages

  • Significance and scope of backorder processing

    Folks
    I am trying to understand the significance and scope of backorder processing.
    Is backorder processing a standard process that is run frequently in every company, or is there a special need that drives this process?
    Thanks

    Hi Sree
    In simple terms, back order processing is more of a manipulation:
    the quantities already assured to other customers earlier are manipulated/changed here.
    A back order is created in two scenarios:
    1. The order quantity is not totally confirmed.
    2. The requested delivery date cannot be kept.
    I will give you a simple example.
    Suppose you have a material M1.
    Your sales closing is 30th Oct.
    As of today your stock is 1000 units, out of which 200, 300 and 400 units are already assured/promised to customers A, B and C respectively.
    You are not going to receive any stock before 1st Nov.
    In this scenario, suppose one of the biggest institutions (your most important customer) has placed an order for 400 units, to be delivered before 30th Oct., whereas you have only 100 units left open.
    You have been trying to win this order for the last few months, as this business is very lucrative: it grows by at least 30% every year, payment is very good, profit is very good, and it is a prestigious institute where all competitors are active.
    If you do not serve this order, you will lose the business permanently, which you cannot afford.
    In this situation, what will you do?
    Obviously, you will change the quantities already promised to the other customers to free up enough units for this order, and this process is called Back Order Processing.
    Back order processing is of 2 types:
    1. Manual (here ATP quantities are reassigned manually for confirmation, to secure the required stock for the important customer). Transaction CO06 is used.
    2. Automatic, or via rescheduling (here the system does it automatically, taking the "delivery priority" from the customer master into consideration). Transaction V_V2 is used.
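    The manual reallocation in the worked example above can be sketched as follows. This is a toy sketch of the decision, not SAP's ATP logic; the customer names and the priority order (taking stock back from C first) are assumptions for illustration.

    ```python
    # Earlier confirmations: 200/300/400 units promised to A, B and C
    confirmed = {"A": 200, "B": 300, "C": 400}
    total_stock = 1000
    free = total_stock - sum(confirmed.values())  # 100 units still unassigned

    priority_qty = 400                 # the important institution's order
    shortfall = priority_qty - free    # 300 units must be taken back

    # Take stock back from existing confirmations, lowest priority first
    # (in SAP, the delivery priority from the customer master drives this).
    for customer in ("C", "B", "A"):
        if shortfall <= 0:
            break
        taken = min(confirmed[customer], shortfall)
        confirmed[customer] -= taken
        shortfall -= taken

    confirmed["BigInstitution"] = priority_qty
    print(confirmed)  # {'A': 200, 'B': 300, 'C': 100, 'BigInstitution': 400}
    ```

    Customer C ends up with only 100 of the 400 units originally promised, which is exactly the kind of manual reassignment CO06 supports.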

  • Significance of deliveries in STOs

    Hi
    Please explain the significance of deliveries and the process when deliveries are used in STO processes.
    Regards
    Jupudi

    Hi Jupudi,
    a lot of companies want to have the same kind of documentation for a goods issue for an STO as they have with a delivery for a sales order.
    Process:
    1. create STO (ME21N)
    2. create delivery (VL10B/VL10D/VL10F etc)
    3. PGI (VL02N, VL06G, VL06O)
    4. PGR against the delivery to correctly update document flow (MIGO). In case of a one step STO the PGR is done automatically when doing PGI
    5. cross company STO: create intercompany billing doc (VF01)
    6. cross company STO: create invoice for PO (MIRO)
    You need to set up sales views for the materials affected, and execute the IMG activities for STOs (OLME -> Purchase Order -> Set up Stock Transport Order). These steps and the process are really just the basics; you can complicate it if you want.
    Please also search the forum, I'm pretty sure you'll find more than enough threads
    Cheers,
    Attila

  • Open PIR quantity, unconverted Pld orders and open Prod/ STO orders

    Dear all,
    How should we deal with open PIR quantities, unconverted planned orders and open production/STO orders in the next MRP run? Do we need to delete/reduce them manually?
    Thanks and Regards,
    Raghu

    Hi Shiva,
    Thanks for your immediate response. I still have some confusion.
    Last month, I had material A with a production order quantity of 30000.
    'A' has a component 'B' which is an in-house produced material. Three planned orders, each of 10000, were generated for B. Of these, two planned orders were converted to production orders. One is still in planned order form.
    This month I am not going to use this planned order of B. I get a new requirement for A, so during the MRP run I get the planned orders for B as well.
    For last month's requirement, as I have not converted one planned order of B to a production order, there will be an open quantity on the A production order in this plant and an open STO quantity in another plant. Again, the PIR will also not be reduced to the same extent. If I reduce the PIR manually or through some reorganization, what about the open quantities of the production order and the STO?
    I think I can TECO the production order of A. Is there any such thing for STOs too?
    Thanks and Regards,
    Raghu

  • Automating table header and scope assignment?

    Is there any way to automate the process of assigning headers and scope in a table in a pdf document?
    We regularly produce large volumes of PDF documents for our clients that include, among other things, tables. We have recently been informed that our documents need to be accessible (508 compliant). Being able to produce these in an automated fashion is required due to our volume. For example, our current project involves producing 4,300 four-page documents that include a total of six tables each. While the layout is uniform for these, the data in the tables is unique for each one.
    We produce the documents using SAS and are using adobe action scripts to automate the process of tagging the documents, adding properties and doing a full accessibility check.  We have also used commonlook to verify that we are meeting the additional accessibility requirements that the full accessibility check does not cover.  The one part that we have not been able to figure out is how to automate the tagging of header cells in the document as headers and set the scope. 
    SAS does not natively tag the PDF documents, which is why we tag them with an Acrobat action.  For an alternative approach, I have tried converting an html document with a properly formatted table (headers and scope assigned) from html to pdf to see if headers and scope could be assigned that way, but the header and scope assignment was lost in the conversion.
    If there is no way to automate this through Acrobat somehow, is there a PDF authoring tool that can produce PDF documents with properly tagged and scoped tables in an automated fashion from a data source?  Preferably one that can use a SAS dataset or Oracle for the source data.

    Thank you for replying.  I understand that header rows and columns need to be properly tagged and scoped to be accessible (508 compliant) which is why I am looking for a tool or plugin that I can set up to properly tag them in an automated fashion. 
    You mention using a plug-in.  Do you know of a plug-in that can be set up to properly tag the table headers and scope without manual intervention?
    If I tag the tables in these documents manually, it takes me about 10 minutes per document. For 4,300 documents, at 40 hours per week, it will take about 4 1/2 months to tag all of them. That is just one of many projects, so that is not an option.
    Can anyone suggest a PDF authoring tool that can produce properly tagged PDF's, including table headers and scope, in a productionalized fashion using SAS or Oracle or SQL Server or Access, or Excel as a data source?
    Or an Acrobat plug-in that can be set up to properly tag the table headers and scope without manual intervention?

  • What is the proper way to run a DMM and Scope back and forth continuously?

    I am running a list of tests from an Excel file; each could be a DMM or scope test, or neither. I am creating a session of each at the beginning, setting up the device before each measurement and disabling it after taking a reading, then setting up the device again before the next reading. I thought this would be quicker than closing and initializing around each test. Is this an efficient way, or is there a better one?

    Execute Tests runs the list of tests. Call Required Test Steps runs through each individual test. Device is called to initialize all devices before Execute Tests is called. They are then not closed until all tests have been run.
    Maybe to clarify my question.   
    Here is what I do with the DMM and scope
    init DMM, init Scope 
    setup DMM or scope
    measure
    disable
    setup DMM or scope
    measure
    disable
    //(repeat these three steps as many times as needed)
    Close DMM and Scope.
    I disable so that the DMM isn't still reading Ohms when a voltage is connected, etc
    I am just wondering if this is efficient
    Attachments:
    Call Required Test Steps.vi ‏36 KB
    Execute Tests.vi ‏347 KB
    Device.vi ‏53 KB

  • Significance and use of CVD indicator and MRP indicator in J1IEX

    Dear friends,
    Please explain the significance and use of the CVD indicator and the MRP indicator in J1IEX.
    Regards,
    Amit P Hiran
    njoy SAP..
    njoy Life...........

    Hi,
    > CVD tick
    The CVD indicator is ticked automatically when you go for the IMPORT process.
    Check the import process customizing for details.
    > MRP tick
    The MRP tick is used for the first-stage dealer scenario, where at the time of purchase you cannot have the excise values.
    The bill will include base price + excise + other taxes + dealer's commission.
    E.g.:
    Material Price = 80/-
    Excise Duty = 14/- (BED + ECess + SECess)
    Margin = 6/-
    Total = 100/-
    VAT @ 3% = 3/-
    Grand Total = 103/-
    Step 1:
    Create PO with Base Price as 100/- by selecting tax code which is having VAT as 3% (Total PO value will be 103/-)
    Step 2:
    Post MIGO and Capture Excise Part - 1
    Click on the More details icon in the Excise tab (header), check the MRP Indicator in the Miscellaneous tab, enter the assessable value as 80 in the Excise tab at item level, enter the excise values BED, ECess and SECess manually, then check & save the document.
    Step 3;
    Post Excise Part 2 in J1IEX
    Step 4:
    Post MIRO by checking calculate tax, edit base value as 80/- enter the amount in header as proposed by the system (103/-), simulate & post the document.
    Note 1104456 - Use of MRP indicator for capturing excise duties from dealer
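    The arithmetic in the dealer example above works out as follows. This is just the worked numbers from the example, not SAP pricing logic:

    ```python
    # First-stage dealer example (MRP indicator scenario)
    material_price = 80.0     # assessable value entered at MIGO
    excise_duty = 14.0        # BED + ECess + SECess, passed on by the dealer
    dealer_margin = 6.0

    base_total = material_price + excise_duty + dealer_margin  # PO base price
    vat = round(base_total * 0.03, 2)                          # VAT @ 3%
    grand_total = base_total + vat

    print(base_total, vat, grand_total)  # 100.0 3.0 103.0
    ```

    This is why the PO is created with a base price of 100 (the dealer's all-inclusive price), while at MIGO the assessable value is corrected to 80 and the excise values are keyed in manually, and at MIRO the vendor is still paid the full 103.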

  • Dynamic View Object Creation and Scope Question

    I'm trying to come up with a way to load an SQL statement into a view object, execute it, process the results, then keep looping, populating the view object with a new statement, and so on. I also need to handle any bad SQL statement and keep going. I'm running into a problem that is split between the way Java scopes objects and the available methods of a view object. Here's some pseudo code:
    while (moreQueries) {
        try {
            ViewObject myView = am.createViewObjectFromQueryStmt("myView", query); // fails if previous query was bad
            myView.executeQuery();
            Row myRow = myView.first();
            int rc = myView.getRowCount();
            String outStr = "";
            int x = 1;
            while (x <= rc) { // get query output
                Object[] result = myRow.getAttributeValues();
                int cc = 0;
                while (cc < result.length) {
                    outStr = outStr + result[cc].toString();
                    cc = cc + 1;
                }
                x = x + 1;
            }
            myView.remove();
        } catch (Exception sql) {
            sql.printStackTrace();
            myView.remove(); // won't compile, out of scope
        } finally {
            myView.remove(); // won't compile, out of scope
        }
        // do something with query output
    }
    Basically, if the queries are all perfect, everything works fine, but if a query fails, I can't execute myView.remove() in an exception handler, nor can I clean it up in a finally block. The only other way I can think of to handle this would be to reuse the same view object and just change the SQL being passed to it, but there is no method to set the SQL directly on the view object, only at creation time as a method call from the application module.
    Can anyone offer any suggestions as to how to deal with this?

    I figured this out. You can pass a null name to the createViewObjectFromQueryStmt method, which apparently creates a unique name for you. I got around my variable scoping issue by rethinking my loop logic.

  • Is there an idiots guide to JSF and scope somewhere?

    I am getting very confused with scope and JSF.
    I have a page that just displays customer details (from CustomerBean extends Customer) with a button that should allow the user to change the customer details. Both pages use the same backing bean (is that recommended?).
    customerDetail.jsp --> editCustomer.jsp
    If I set the scope of CustomerBean to session, editCustomer sees the same customer as customerDetail. It seems to me that I shouldn't really be using session scope as I don't want a particular customer to hang around once I have finished with him. So what should I be doing?
    Should customerBean be request scoped? If so how does editCustomer see it?
    Or should I somehow destroy the session scoped customerBean when I have finished with it?
    I also notice that some example jsps out there have a hidden field for the ID - should I simply look up the customer again from the database in editCustomer?
    I also tried to add a <h:inputHidden value="#{customerBean}"/> but that broke my jsp.
    I have bought and read the J2EE tutorial but I am still confused as to the recommended way to carry the same object through two JSP pages.
    It's probably very simple but it's doing my head in ;-)
    - David

    Yeah, forming a model in your head to explain something can be painful, sometimes. This question has come up before, including where I work, and I've never really seen a comprehensive answer, so I'll just write one. :)
    This is kind of a basic servlet concept, so I'll talk mostly about servlets. JSF is just flavor on top of this.
    The lifetime of the request is: from the time the user hits "submit" until the time the response is fully rendered, whatever page that is.
    So, you have a form the user has filled out and he/she hits "submit".
    The HTTP POST request goes to the server (open port 80, write some "key: value" headers to satisfy the HTTP protocol requirements, followed by a stream of text that represents the users' form field values, wait for a response) which then proceeds to process it by parsing the incoming data and making a bunch of subroutine calls. The last set of subroutine calls basically involves a bunch of println() calls to write HTML into an output stream, which is the response the requesting browser is listening for. When that stream is done, the browser displays the html.
    There's nothing that says the HTML that's displayed is the same as the HTML that originally held the form the user submitted. The first page is essentially garbage that somehow generated some form fields. The server couldn't care less what it was; all it wants is the key=value pairs.
    You could, if you were so inclined, code all those println()s yourself. That's straight servlet programming. It's totally under your control. You could code println( "<html><body>Hi, Mom!</body></html>"); and be done.
    Or, you could write a JSP that, when compiled, turns into essentially a subroutine chock full o' println()s, and you could call that subroutine.
    You do that with RequestDispatcher.forward(). It's just a subroutine call. (But don't do any more scribbling on the output stream after it returns, 'cause the stream's essentially been closed.)
    It's all a big call tree with one thread of execution, single entry, single exit. One of the nice things about servlets is that the infrastructure makes available to you, in a contextual sorta way, the original request parameters plus whatever attributes you choose to attach to the request as you proceed w/your processing, kind of like charms on a charm bracelet (via ServletRequest.setAttribute()). (When I say "contextual", I mean the ServletRequest is passed in as a parameter to Servlet.service() so you can sling it around in your call tree.) Attributes you choose to attach while processing incoming form data are available later (for instance... in the subroutine that has all those println()s you so carefully coded up or allowed the JSP compiler to generate for you).
    When the call tree is done (you've finally printed "</html>", marked the output stream "done" somehow and shipped all that HTML back out to the browser), the ServletRequest object that held the incoming form parameters plus whatever attribute cruft it accumulated is garbage collected. (I could write something poetic about Elysium and gamboling among daisies, but... nah.) So, the lifetime of that data associated w/the ServletRequest is the duration of that request-processing call tree.
    JSF gives you a nice bunch of automatically-generated request attributes, corresponding to your request-scoped managed beans. It even very kindly transfers (via the value bindings) incoming form parameters into properties of beans which are attributed onto your ServletRequest, automagically.
    So, if, in your JSP, you bind your form data to request-scoped bean properties (not the bean itself, but the bean's properties), those exact same bean properties will be visible on whatever JSP you eventually wind up on and it will be available to whatever intervening logic you code up ("Invoke Application" phase), and when the request is done, it all vanishes into thin air.
    To be more specific to your question: yes, I believe it is recommended to have the same bean between pages. That's kind of the whole point. If you find yourself at the end of a request trying to destroy session data that was created at the beginning of that request, you should probably be using request scoping, not session.
    I could be wrong, but I don't think you can bind an entire bean to an html element value. You bind bean properties. Of course, there's nothing to say that a bean property couldn't be... another bean!
    In your particular case, I guess you have a bunch of display-only strings that come from your customer bean, plus a hidden "key" field somewhere. You could bind that hidden field to the customer.key property and Customer.setKey() would do whatever's necessary to get the rest of the data into the bean. That could be a d/b lookup or a map or array (cache) fetch. Or you could have a "current customer" in your session (that would have to be session-scoped, because you paint the detail screen w/one request and then paint the "edit" screen w/the same customer but in a different request). That "current customer" concept might cause you some problems later when you go multi-window or multi-frame in your webapp (truuuuuuuust me).
    Also, I'm not sure why you need a CustomerBean separate from Customer. Can you just make Customer a bean and be done with it?
    Holy cow, what an essay this turned out to be.
    John Lusk.

  • How do we link b/w P.O and Delivery in STO?

    Hi Consultants,
    Where do we link the PO and the delivery, and how does the system know that a delivery should be created against a PO in the STO process?
    Thanks in Advance.
    Siva

    Hi,
    Using transaction code VL10B, the system gives you a report of the POs for which we have to create deliveries.
    For extracting the report we can use different selection criteria.
    Thus a link (document flow) is created between the PO and the delivery.
    Hope this answers your question; if yes, reward points.
    Thanks
    Venkat.

  • DHCP Scopes and Scope Options Import & Export

    I need to adjust lease times for over one hundred scopes spread across multiple servers (about half of them are on one server, though). There will be 2 or 3 different lease times used. What is the best way to do this?
    I know I can use netsh to change the option for each scope. But I would like to script the collection of the list of scopes, rather than typing the list manually. Is there a way to export a list that contains just scopes and descriptions?
    Thanks

    Hi,
    Actually, it can be exported as a txt file:
    netsh dhcp server export c:\DHCP\myscopes.txt
    Export-DhcpServer
    And you can also manage it via powershell
    Use the PowerShell DHCP Module to Simplify DHCP Management
    http://blogs.technet.com/b/heyscriptingguy/archive/2011/02/14/use-the-powershell-dhcp-module-to-simplify-dhcp-management.aspx
    Hope this helps.

  • 5122 and scope show video signal in bad resolution

    I'm using the 5122 digitizer, trying to show a nice clear scope (NI-SCOPE front panel) image of an entire video frame (all lines). Some of my settings are 2.0 ms/div, 50.0 MS/s, TV trigger, and a record length of 1,000,000.
    What I see on the scope is far from a million points plotted; where the points are concentrated it is just solid white. I know the scope has recorded a million points because I've exported the data and analyzed it elsewhere, but the issue isn't data analysis: the people I'm working on this for need to be able to view the scope image and determine by eye whether it is a good signal.
    If there are any alternate methods to do this then please elaborate. Thanks.
    Jeff Padham
    HBE
    [email protected]

    You are being limited by the resolution of your monitor.  A nice monitor has about 2000 pixels of horizontal resolution.  If you expanded the NI-SCOPE Soft Front Panel to be the full width of such a monitor, you would get a plot area about 1700 pixels wide.  You are collecting 1 million points.  There is almost three orders of magnitude difference between these two numbers.  The soft front panel is doing the best it can.  Unfortunately, it does not support zoom, which would help with this issue.  You have several options:
    Take ten or twenty lines of data at once and cycle through the frame, using the Line Number event and changing the Line # to change your position in the frame.  This will only work if your signal is repetitive.  This essentially does what a zoom would do.
    Set up your computer with multiple monitors and stretch the soft front panel window across all of them to give yourself more horizontal resolution.  Two monitors is probably fairly easy, since most modern video cards support that many.  More than that can get expensive, since it will require another video card and probably a new motherboard to support an extra video card (although Matrox has a nice external solution, as well).  I am not sure this will give you enough resolution.
    Export the data to NI-HWS and create a viewer to zoom and view all of it at high resolution. This is fairly easy in LabVIEW. Let us know if you want to pursue this and we can give you some pointers (see the tutorial Managing Large Data Sets in LabVIEW, which has an example that almost does what you need - you would need to add the file I/O to replace the waveform generate code).
    Write your own application in LabVIEW to take a frame and allow zoom on the final result.  Once again, this is fairly easy in LabVIEW.  The NI-SCOPE examples and the previously mentioned tutorial should give you the info you need.
    Let us know if you need more help.
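    The core of the problem above (a million samples squeezed onto ~1,700 pixel columns) is often handled in custom viewers with min/max decimation: per pixel column, plot the minimum and maximum of the samples falling in that column, so narrow peaks survive the reduction. A hedged, generic sketch (not NI-SCOPE code; the sample data is synthetic):

    ```python
    import math
    import random

    def minmax_decimate(samples, columns):
        """Reduce `samples` to at most `columns` (min, max) pairs for display."""
        n = len(samples)
        step = math.ceil(n / columns)   # samples per pixel column
        pairs = []
        for start in range(0, n, step):
            chunk = samples[start:start + step]
            # Keeping both extremes per column preserves narrow peaks
            pairs.append((min(chunk), max(chunk)))
        return pairs

    signal = [random.uniform(-1, 1) for _ in range(1_000_000)]
    display = minmax_decimate(signal, 1700)
    print(len(display))  # at most 1700 (min, max) pairs, one per pixel column
    ```

    A viewer built this way can redo the decimation on the zoomed-in subrange each time the user zooms, so full resolution is always available underneath.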
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Distribution Channel and Division in STO

    Hi All,
    We have an STO scenario. We are also using the common distribution channel and common division. However, when we create an invoice from the delivery, the system picks up the common division, not the material division. In the delivery line item, the division also shows 00 (cross-division) and not the division of the material from the material master.
    We have configured the shipping data for the plant with the common division and common distribution channel.
    We want the system to pick the material division instead of the common division.
    Kindly guide us on this issue.
    Regards
    Rajev

    We also searched the forum and found one thread:
    Material division in Intercompany STO Invoice
    But it is not helpful for us.
    Regards
    Rajev

  • Receiving grade and weight govern STOs?

    Hello,
    There is a common practice in the scrap metal industry that the buyer determines the weight and grade of the material on physical receipt of the delivery. This weight and grade govern the transaction, even if the seller initially advised a different weight and grade.
    Example:
    The seller advises X tons of material A.
    The buyer determines on receipt Y tons of material B.
    The settlement of the transaction (GR and payment) will be based on the value of Y tons of material B.
    QUESTION: How do we support this scenario with STO?
    With standard STOs, the system does not allow changing the quantity or material (grade) when posting the GR against an outbound delivery, so basically the vendor's weight and grade govern. If we want to change them, we would have to reverse all the transactions and re-create them with the new values, which is too cumbersome for the users.
    We looked at OSS note 167795, which allows the receiving weight/grade to govern, but it's a core mod and we're trying to avoid that.
    I would appreciate any feedback.
    Edited by: marcin_j on May 8, 2009 4:22 PM


  • Companies using labview in India and scope of labview in india

    Hi,
    This is Hari Krishna. I have 1+ years of experience in LabVIEW application development.
    I would like to know how many companies use LabVIEW in India, and the job scope/future for LabVIEW developers.
    Regards,
    Hari Krishna.k
    R&D Engineer.

    Hi Hari,
    The scope of LabVIEW in India is indeed very bright.
    You may find the below link useful:
    http://forums.ni.com/t5/LabVIEW-Job-Openings/Scope-of-LabVIEW-job-in-India/td-p/1017671
    Regards,
    Rohan Sood
    Applications Engineer
    National Instruments
