MRP Related queries

Hi all,
1. Can materials planning be set to generate planned orders, delivery schedules, and purchase requisitions (PRs)?
2. Which report do you use to view the most up-to-date planning data (available stock, orders, reservations, etc.) for a material?
Stock Overview report
MRP list
Stock/Requirements list
3. The following MRP types could use a material forecast:
Manual reorder point planning
Automatic reorder point planning
Forecast-based planning
Time-phased planning
Storage location MRP
4. Reservations can be created for:
Goods receipt for a purchase order
Goods receipt without a purchase order
Goods issue to a cost centre

Hi Kali,
Question 2: Which report do you use to view the most up-to-date planning data (available stock, orders, reservations, etc.) for a material?
Stock Overview report
MRP list
Stock/Requirements list
Correct answer: Stock/Requirements list.
Regards,
Sankaran

Similar Messages

  • MD04 Transaction and MRP related queries

    Hello All,
    I have some specific questions regarding MD04 and the MRP run.
    1. In MD04, for a few raw materials I am getting negative available quantities. I don't understand why MRP didn't run for them before the available quantity became negative; we have a batch job scheduled for the MRP run.
    What are all the possible reasons for MRP not running?
    We don't have reorder point planning! The MRP type is the standard one, PD.
    2. For some materials we are not able to create a planned order. The settings in the Material Master MRP views are fine, as another material with the same settings works. I am running MRP through transaction code MD41.
    Forecasting is done through MD61/62 with the active indicator ticked.
    What else could be the reason?
    Secondly, one silly question: does MRP take available stock into consideration during the run? I mean, if I run MRP for a material with transaction MD41, will MRP create planned orders even if enough stock is available?
    Thanks for coming back in advance.
    Thanks and regards

    Look for the MRP list as well (MD05), not only the Stock/Requirements List (MD04).
    You can see there the date and time of the last MRP run, and whether there was a termination that prevented a proper run.
    There may be a number of reasons why a material was not included in the MRP run, such as the type of run (e.g. regenerative or net change), etc.

  • SD Related Queries

    Hello all,
    In our project we are using the Automotive Industry module,
    so now I want to get the Automotive module BW SD-related queries from
    [http://help.sap.com/erp2005_ehp_04/helpdata/EN/50/296fe7bf1a474f84d5955cedefa0a3/frameset.htm]
    Please tell me the BW SD queries from the above link.
    Regards.
    Edited by: Ranga123 on Mar 24, 2010 2:28 PM

    No one is answering my question, so admin, please delete this thread. Thanks.

  • AME related queries, where to post ?

    Hi all,
    Can you please tell me where we should post our AME-related queries? Since Oracle treats AME as part of HRMS, do we post those queries in this forum, or is there a separate forum for this purpose? Please provide me the link as well, if you can.
    Thanks a lot in advance.

    You can post it here, I think.

  • Automotive Industry SD Related Queries

    Hello all,
    In our project we are using the Automotive Industry module,
    so now I want to get the Automotive module SD-related queries from
    [http://help.sap.com/erp2005_ehp_04/helpdata/EN/50/296fe7bf1a474f84d5955cedefa0a3/frameset.htm]
    Please tell me the SD queries from the above link.

    Check this link
    [SAP Best Practices for Automotive|http://help.sap.com/content/bestpractices/industry/bestp_industry_automotive.htm]
    thanks
    G. Lakshmipathi

  • Idoc Related queries

    1. How can we view and rectify the errors or warnings raised while we create a new IDoc type, which may be an extension of an existing basic IDoc type (transaction code WE30)?
    2. How can we delete an IDoc type we created, if it has already been released (transaction code WE30) and configured (transaction code WE82)?
    3. Is it mandatory that the ‘Mandatory’ checkbox always be checked whenever we create (extend) a new segment for an existing segment (transaction code WE30)?
    4. On what basis can we identify to which existing segment we can append our needed segment (if a new segment is to be appended)?

    Hi Nagarajan,
    Answers to your questions:
    1) How can we view and rectify the errors or warnings raised while we create a new IDoc type, which may be an extension of an existing basic IDoc type (transaction code WE30)?
    WE30 is where IDoc types are created. First set a breakpoint in the related user exit. For testing, use WE19: enter the erroneous IDoc number in WE19 and press F8, and it will display the segments. Then enter /H in the command box and press the inbound function module push button (just beside the inbound push button). It will open in debug mode, and we can test.
    2) How can we delete an IDoc type we created, if it has already been released (transaction code WE30) and configured (transaction code WE82)?
    It is possible to delete an IDoc type that has been released from our system, I think through a remote function, but I am not sure.
    3) Is it mandatory that the ‘Mandatory’ checkbox always be checked whenever we create (extend) a new segment for an existing segment (transaction code WE30)?
    Select that checkbox based on the requirement. Suppose you upload data for the MM01 transaction: observe which fields are mandatory on that screen, and based on that select the mandatory checkbox for the corresponding fields in the segment (in MM01, if the material number is mandatory, then while creating the segment select the mandatory checkbox for MATNR).
    4) On what basis can we identify to which existing segment we can append our needed segment (if a new segment is to be appended)?
    Based on the basic IDoc type and the information given by the user.
    Hope this helps; reply if you have further queries.
    Regards,
    Kumar

  • Adobe Creative Suite 64-bit related queries

    Hi,
    I have the following questions related to 64-bit support in Adobe products.
    1. Do Adobe Illustrator CS3, CS4 and CS5 support 64-bit?
    2. Do Adobe Photoshop CS3, CS4 and CS5 support 64-bit?
    3. I heard that CS5 would support 64-bit. Do all applications under Creative Suite 5 support 64-bit?
    4. Do 32-bit and 64-bit have separate installers, or can the same installer be used for both?
    5. On which Windows platforms will CS 64-bit be supported?
    6. On which Mac platforms will CS 64-bit be supported?
    7. Does a separate licence have to be purchased for 32-bit and 64-bit, or can the same licence be used?
    Please clarify the above queries.
    Regards,
    Avudaiappan

    Find answers inline.
    AvudaiappanSornam wrote:
    Hi,
    I have the following questions related to 64-bit support in Adobe products.
    1. Do Adobe Illustrator CS3, CS4 and CS5 support 64-bit?
    Illustrator CS5 is not 64-bit.
    2. Do Adobe Photoshop CS3, CS4 and CS5 support 64-bit?
    Photoshop CS5 is 64-bit.
    3. I heard that CS5 would support 64-bit. Do all applications under Creative Suite 5 support 64-bit?
    Since the answer to question 1 is no, you know the answer to this one.
    4. Do 32-bit and 64-bit have separate installers, or can the same installer be used for both?
    The same download installs the 64-bit version if you have a 64-bit OS.
    5. On which Windows platforms will CS 64-bit be supported?
    XP, Vista, Win 7.
    6. On which Mac platforms will CS 64-bit be supported?
    10.5.7 and 10.6.x.
    7. Does a separate licence have to be purchased for 32-bit and 64-bit, or can the same licence be used?
    I believe not, but you can always cross-check with Adobe or a reseller before purchasing.
    Please clarify the above queries.
    Regards,
    Avudaiappan

  • Relational queries through JDBC with the help of Kodo's metadata for O/R mapping

    Due to JDOQL's limitations (inability to express joins, when relationships
    are not modeled as object references), I find myself needing to drop down to
    expressing some queries in SQL through JDBC. However, I still want my Java
    code to remain independent of the O/R mapping. I would like to be able to
    formulate the SQL without hardcoding any knowledge of the relational table
    and column names, by using Kodo's metadata. After poking around the Kodo
    Javadocs, it appears as though the relevant calls are as follows:
    ClassMetaData cmd = ClassMetaData.getInstance(MyPCObject.class, pm);
    FieldMetaData fmd = cmd.getDeclaredField( "myField" );
    PersistenceManagerFactory pmf = pm.getPersistenceManagerFactory();
    JDBCConfiguration conf = (JDBCConfiguration) ((EEPersistenceManagerFactory) pmf).getConfiguration();
    ClassResolver resolver = pm.getClassResolver(MyPCObject.class);
    Connector connector = new PersistenceManagerConnector((PersistenceManagerImpl) pm);
    DBDictionary dict = conf.getDictionary( connector );
    FieldMapping fm = ClassMapping.getFieldMapping(fmd, conf, resolver, dict);
    Column[] cols = fm.getDataColumns();
    Does that look about right?
    Here's what I'm trying to do:
    class Foo {
        String name; // application identity
        String bar;  // foreign key to Bar
    }
    class Bar {
        String name; // application identity
        int weight;
    }
    Let's say I want to query for all Foo instances whose bar.weight >
    100. Clearly this is trivial to do in JDOQL, if Foo.bar is an object
    reference to Bar. But there are frequently good reasons for modeling
    relationships as above, for example when Foo and Bar are DTOs exposed by the
    remote interface of an EJB. (Yeah, yeah, I'm lazy, using my
    PersistenceCapable classes as both the DAOs and the DTOs.) But I still want
    to do queries that navigate the relationship; it would be nice to do it in
    JDOQL directly. I will also want to do other weird-ass queries that would
    definitely only be expressible in SQL. Hence, I'll need Kodo's O/R mapping
    metadata.
    Is there anything terribly flawed with this logic?
    Ben
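    For concreteness, the query being aimed at would look roughly like the JDBC sketch below; the FOO/BAR table and column names in it are purely hypothetical hardcodings, which is exactly what the metadata lookup above is meant to avoid:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class HeavyFooQuery {
        // Hypothetical table/column names; in real code they would come from
        // the FieldMapping/Column metadata rather than being hardcoded here.
        public static void printFoosWithHeavyBars(Connection con) throws SQLException {
            String sql = "SELECT f.NAME FROM FOO f JOIN BAR b ON f.BAR = b.NAME WHERE b.WEIGHT > ?";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setInt(1, 100);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1)); // Foo.name of each match
                    }
                }
            }
        }
    }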

    I have one point before I get to this:
    There is nothing wrong with using PC instances as both DAO and DTO
    objects. In fact, I strongly recommend this for most J2EE/JDO designs.
    However, there should be no need to expose the foreign key values: use
    application identity to quickly reconstitute an object id (which can in
    turn find the persistent version), or, like the J2EE tutorial, store the
    object id in some form (Object or String) and use that to re-find the
    matching persistent instance at the EJB tier.
    Otherwise, there is a much easier way of finding ClassMapping instances
    and in turn FieldMapping instances (see ClassMapping.getInstance() in
    the JavaDocs).
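    A minimal sketch of the "store the object id and re-find it" idea above, using only standard JDO calls (this assumes Bar uses application identity with a String-representable id; the exact newObjectIdInstance signature differs slightly between JDO 1.x and 2.x):
    import javax.jdo.PersistenceManager;

    public class BarLookup {
        // Re-find the persistent Bar from the object-id string the DTO carried across the EJB tier.
        public Bar findBar(PersistenceManager pm, String barIdString) {
            // Reconstitute the application-identity object id from its String form...
            Object oid = pm.newObjectIdInstance(Bar.class, barIdString);
            // ...and ask the PersistenceManager for the matching persistent instance.
            return (Bar) pm.getObjectById(oid, true /* validate */);
        }
    }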
    Steve Kim
    [email protected]
    SolarMetric Inc.
    http://www.solarmetric.com

  • RFC related queries

    Hi Friends,
    I have some queries which are RFC-related. Can you please clarify them?
    1) What does this syntax mean:
    call function 'FM' destination 'dev-60'
    If I use this syntax in an FM in dev-90, will it execute the above remote-enabled FM in dev-60?
    Can I use this syntax in the same remote-enabled FM?
    Thanks and Regards,
    Sakshi

    Hello Sakshi,
    This is a basic question which can be answered by googling. It is really easy; try this [link|http://tinyurl.com/yeqwqfv].
    BR,
    Suhas

  • Oracle 9iDS related queries

    Dear all,
    Please help me with the answers or documents for the following queries.
    1. What are the major differences between Oracle 6iDS and Oracle 9iDS?
    2. Can I execute an application developed in Oracle 6i (client-server) as-is in Oracle 9iDS?
    3. Can I execute forms developed in Oracle 9iDS without Application Server?
    4. What is the equivalent of DFLT.PRT (available in 6i) in Oracle 9iDS?
    You can also send me the document (if any) by mail. My mail id is [email protected]
    Thanks

    Hi,
    1. What are the major differences between Oracle 6iDS and Oracle 9iDS?
    - Listener servlet architecture
    - Web only
    - 25+ new features (design, deployment and architecture)
    2. Can I execute an application developed in Oracle 6i (client-server) as-is in Oracle 9iDS?
    You need to re-compile them. There are also some obsolete built-ins that you need to migrate if you use them. There is a migration assistant (FMA) contained in the Developer Suite.
    3. Can I execute forms developed in Oracle 9iDS without Application Server?
    Oracle9iDS includes a stand-alone OC4J instance, but only for design-time testing of your application. For production, the only supported application server is Oracle Application Server.
    4. What is the equivalent of DFLT.PRT (available in 6i) in Oracle 9iDS?
    That sounds Reports-related, and I would try this question on their forum.
    See also the 9i collateral on http://otn.oracle.com/products/forms
    Frank

  • MRP Related Functionality

    Hi All,
    Need help with the below query related to MRP.
    Consider an item I1 for which I have run the MRP Wizard.
    The forecast states wk 39 (26/9/11) for 4,502 cases.
    The MRP results state wk 39 (26/9/11) for 4,502 cases (noted in red as overdue, based on the Item Master lead time).
    However, the order recommendation states a due date of wk 38 (19/09/11) for 4,502 cases – release date 19/9/11 (noted in red as not allowing enough time, based on the Item Master lead time / release date of 28 days). So the due date is 19/9/11, wk 38, which is one week earlier than the MRP results, and the order recommendation due date is 7 days earlier than the MRP results/forecast date.
    Does the system bring an order recommendation delivery date forward one week earlier than the MRP result for the requirement, as if ordering is based on one week? If so, is this controlled by a specific setting?
    Regards
    Sharat

    Hi
    Please see the link below for more info on MRP areas:
    http://help.sap.com/saphelp_dimp50/helpdata/EN/c4/106956ae8a11d1a6720000e83235d4/content.htm
    Please let us know if you need any more info.
    Thanks

  • 'ADMINISTRATION TOOL' RELATED QUERIES

    Hi All,
    I have a few queries regarding the OBIEE Administration Tool; I would appreciate help in getting answers to these.
    We are using OBIEE version 10.1.3.4.0; any information, documents, or sites related to these topics would help.
    1. Suppose I have more than one dimension model in a single RPD, and more than one developer has access to this RPD. Is it possible to restrict access to one dimension model so that a developer cannot access the other models?
    2. Also, when there is more than one RPD in the Administration Tool and many developers access them, can security be defined such that 'User A will have access only to RPD1' and cannot access any other offline/online RPD?
    3. Must the Administration Tool be installed on the server, or can it be installed on a client system as well? I am asking because if more than one developer wants to access the Administration Tool at the same time, how can that be achieved?
    4. In my RPD I have more than one dimension model; can I import one model from this RPD into another RPD?
    5. What is the multiuser environment? Will it help with any of my above requirements?
    Thanks in advance

    1. No, but you can use MUD to define different projects so that developers "check out" the projects they are working on. See the links provided in the previous response.
    2. The security is defined in each RPD. To open an RPD you need to be an Administrator user defined in that RPD. Online access can be restricted if you block connections to your BI Servers on port 9703 so that they can only come from a local connection or from defined IPs; you will need a firewall for that, though. Offline access cannot be restricted: if I have an RPD locally and I have an admin account, I can open it anywhere I want.
    3. Client-only is fine. You would simply install the client tools on each developer's machine.
    4. Yes, search this forum for "merge repositories"; plenty of guidance is already available.
    5. The links provided above give you a good explanation of what MUD is and what it is for. Searching this forum also gives you plenty of information.

  • MRP related message during creation of Sales order

    When creating a sales order, I am getting the following message: "MRP area for material # xxxx not determined, continue with plant MRP area". The system allows me to save the SO, but the message pops up three different times, including when it checks for ATP. This material has MRP type ND.
    How can I suppress this message, and where is the config for it? The message number is AG211.
    Thanks
    Raj

    Thank you for the info. We are on ECC 6.0 and the notes are from 2001. Would I still need to apply these notes, or would they already be applied as part of the new version?
    Thanks
    Raj

  • Monitoring Related Queries - BPIM

    Dear All,
    We are planning to implement Business Process & Interface Monitoring, with the first phase mainly involving application-specific monitoring and interface monitoring. I have a few queries on the same. It would be great if you could help with these:
    1) Our present DB is about 35 TB. If we implement BPMon in SolMan, how can we make sure that the performance of monitored systems like ECC, SRM, etc. is not impacted while data is collected by the local CCMS and then passed to the SolMan central CCMS? There could be thousands of open sales orders globally at various locations, so collecting that data may have some impact on system performance.
    2) What are the best practices and recommendations on the BPMon side, specifically for cross-application monitoring like ABAP dumps, update errors and batch files? I have links to the SAP standard slides, so please don't share those from the marketplace.
    3) Do you have any strategy document showing how this was proposed/implemented in some project/clients, with real-time examples? That would give more clarity.
    4) Escalation management / corrective-measure procedure: is any standard facility available for escalation management? We are also looking for some task-assignment kind of feature where alert actions can be assigned to various project team members by process experts, etc.
    Thanks in advance.
    SM

    Hello Suchit,
    1. There is no guarantee that the collectors will not influence performance; however, they are written in a way that should not drastically affect the system. If you are in doubt, I would suggest running a chosen TBI report (in ST13), which more or less illustrates the data collection, and tracing the results (especially a performance trace).
    2. If you have the SAP slides you should be able to find the best practices. I believe that the goal of BPMon is to monitor application/process-specific errors. That's why, for example, the ABAP dumps monitor has a selection on what is dumping, so you can search only among errors belonging to your process. In my case we created a separate business process called cross-process monitoring, and we monitor there things that are critical but not possible to assign to just one process.
    3. The only "strategic" document is the technical documentation of BPMon, as SAP will not identify your critical business processes and tell you how to monitor them (at least not for free).
    4. That depends on what kind of tool you are using for incident management. You can utilize email notifications if you don't have a specific tool; otherwise you might consider building your own BAPI to support your IM tool.
    BR
    Jolanta

  • Smart Sync Related Queries

    A. Are there any guidelines for defining parent-child relationships and associations when making BAPI wrappers? Any documents/notes/links?
    C. How can we separate the BAPI wrapper interface and the filtering rules? Can I bring my filtering/distribution rules defined in the backend to the MI middleware? If yes, how? Any documents?
    D. Which type is suitable for master data and which type is suitable for transactional data? Is it OK to make every BAPI wrapper of type T51 (server-driven)? Are there any drawbacks to this approach?
    E. In the server-driven case, which has more load, the server or the middleware?
    regards
    anubhav

    Hi Anubhav,
    T51 is the best type of SyncBO not only for master data but also for transactional data. It is the one with the highest performance and the least load on both the backend and MI.
    In the T51 case, the backend informs MI about changes in the data (add/modify/delete). The backend is best placed to identify changes, as the data resides there. The task of identifying data changes is quite easy in the backend (depending on the backend implementation), but if there are too many ways (too many applications) in which the data can be changed in the backend, the effort to catch all these cases will be higher there.
    In the T01 case, MI identifies the changes in the data. Since the data primarily resides in the backend and changes happen there, the only way MI can identify changes is by comparing every record from the backend with the replicated data in the middleware RDB. This process is very time-consuming and can lead to performance problems if the data volume is huge. The replication time will also be higher.
    In the case of master data, where changes seldom occur, the T01 replicator runs periodically (as scheduled), comparing the whole data set only to find that nothing has changed. In the T51 case, the replicator runs automatically only when the backend reports a change in the data.
    Even for transactional data, T51 is better in terms of performance. The delay before data is updated in the middleware after a change in the backend is very small, and is even configurable in the middleware, so the latest data will always be in the middleware.
    Regards
    Ajith Chandran
