Automotive Industry SD Related Queries

Hello all,
In our project we are using the Automotive Industry module.
Now I want to find the Automotive module SD-related queries from
[http://help.sap.com/erp2005_ehp_04/helpdata/EN/50/296fe7bf1a474f84d5955cedefa0a3/frameset.htm]
Please tell me the SD queries available at the above link.

Check this link:
[SAP Best Practices for Automotive|http://help.sap.com/content/bestpractices/industry/bestp_industry_automotive.htm]
Thanks
G. Lakshmipathi

Similar Messages

  • SD Related Queries

    Hello all,
    In our project we are using the Automotive Industry module.
    Now I want to find the Automotive module BW SD-related queries from
    [http://help.sap.com/erp2005_ehp_04/helpdata/EN/50/296fe7bf1a474f84d5955cedefa0a3/frameset.htm]
    Please tell me the BW SD queries available at the above link.
    Regards.
    Edited by: Ranga123 on Mar 24, 2010 2:28 PM

    No one is answering my question, so admin please delete this thread. Thanks.

  • Usage of material variants in automotive industry

    Dear Experts,
    I want to understand the usage of material variants in the automotive industry (car or two-wheeler). Please provide the required config settings and also the usage of super BOM & routing (if required) with the complete flow. I would prefer if someone could share the scenarios from a two-wheeler industry.
    Thanks in advance.
    Sameer

    Hi Suresh,
    Thanks for your answer.
    1) We have about 9 characteristics. If we need to create material variants, around 200 material variants would have to be created. Is that the best option?
    2) If we select the first characteristic value, we want only those remaining characteristics to be available for selection which are related to the first characteristic. Likewise, we want to have control over the selection of each characteristic, until all the characteristic value selections are made in the sales order. Is it possible for a pop-up to show the available characteristic options for that combination?
    3) Can we make a super BOM for item selection? How can I control the selections from the super BOM? Can it be done via object dependencies? Routings are the same for all the options.
    Thanks in advance.
    Sameer
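The restriction asked for in point 2 is what object dependencies (preconditions and selection conditions) achieve in variant configuration. A minimal, generic sketch of the idea follows; the characteristic names and the value combinations here are invented for illustration and are not SAP configuration:

```python
# Hypothetical valid combinations; in a real system these constraints live
# in the configuration model (object dependencies), not a hard-coded list.
VALID_COMBINATIONS = [
    {"ENGINE": "100cc", "BRAKE": "drum"},
    {"ENGINE": "150cc", "BRAKE": "drum"},
    {"ENGINE": "150cc", "BRAKE": "disc"},
]

def allowed_values(characteristic, chosen):
    """Return the values of `characteristic` still consistent with the
    characteristic values chosen so far in the sales order."""
    values = set()
    for combo in VALID_COMBINATIONS:
        if all(combo.get(k) == v for k, v in chosen.items()):
            values.add(combo[characteristic])
    return sorted(values)

print(allowed_values("BRAKE", {"ENGINE": "100cc"}))  # ['drum']
print(allowed_values("BRAKE", {"ENGINE": "150cc"}))  # ['disc', 'drum']
```

Each selection narrows the pool of valid combinations, so the next characteristic only offers values that can still lead to a valid variant, which is exactly the pop-up behaviour described above.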

  • Automotive industry

    Friends,
    What kind of questions can be asked about the automotive industry (two-wheeler) in an SAP SD interview?
    Please tell me some of the important business terminologies related to this, and also list the various business processes.
    Please note that they are using version 4.7.
    Thanks

    Possible scenarios are:
    1. Dealer Sales
    2. OEM Sales
    3. Export Sales
    4. Bulk Sales (Corporate Sales)
    5. Stock Transfer between plants.
    6. Spare Part Sales.
    Possible topics are the scenarios above, and in addition to those:
    7. Question on Enterprise Structure.
    8. Question on Pricing Procedure.
    9. Batch Number Management.
    10. Serial Number Management.
    & so on .......
    Also check the Link below:
    http://help.sap.com/
    Path: SAP Best Practices >> Industry >> Automotive
    https://www.sdn.sap.com/irj/sdn/bpx-automotive
    Regards,
    Rajesh Banka

  • AME related queries, where to post ?

    Hi all,
    Can you please tell me where we should post our AME-related queries? Since Oracle treats AME as part of HRMS, do we post those queries in this forum, or is there a separate forum for this purpose? Please provide me the link as well, if you can.
    Thanks a lot in advance.

    You can post it here I think

  • Role of Quality Management during NPDI Stage for Automotive industry

    Hi All,
    Can anyone tell me how the quality requirements for the automotive industry during the NPD stage can be addressed by mySAP PLM Quality Management?
    Best Regards
    Guruprasad

    Hello Isabelle,
    Greetings! and thanks for the reply.
    Can you please elaborate more on exit QPL10001? I am not clear where the inspection plan will be maintained and with what usage. How will the inspection lot get generated? Does it need to be triggered manually?
    I will briefly explain the scenario: during <b>New Product Development</b>, the OEM can get parts in 2 ways, one from a supplier and the other from in-house production.
    <b>In the case of supplier parts</b>, the part comes as a sample until the drawing and other aspects are finalised. At all stages during development, supplier part quality is thoroughly verified - dimensionally, metallurgically, performance, endurance, etc.
    Here I am facing 2 main difficulties:
    1. Too many characteristics.
    2. How can the recorded results be identified later, when the part moves to regular production? The parts will then be inspected against inspection type 01.
    Similarly, <b>for MIW (made in works / in-house production)</b>, the material/part is initially manufactured without using the actual machines, and hence there is no rate routing and no inspection characteristics assigned to the material.
    At a later stage the process is finalised and the rate routing is maintained.
    Here I am facing these problems:
    1. No rate routing, and hence no inspection characteristics, initially at NPD.
    2. Even if I maintain the inspection characteristics in the rate routing, since the auto industry follows a repetitive manufacturing scenario, the production order type PP01 etc. is of no use.
    I am trying manual generation of the inspection lot for the material version using MFPR, and asking the client to maintain the inspection characteristics in the rate routing to cater to the requirement.
    I would really be glad if you could briefly explain how and where to maintain the inspection plan, and how the inspection lot will get generated.
    Thanks a lot.
    take care
    Guruprasad..

  • Idoc Related queries

    1. How can we view and rectify the errors or warnings caused while we create a new IDoc, which may be an extension of an existing basic IDoc type (at transaction code WE30)?
    2. How can we delete an IDoc type we created, if it has already been released (at transaction code WE30) and configured (at transaction code WE82)?
    3. Is it mandatory that the 'Mandatory' check box always be checked whenever we create (extend) a new segment on an existing segment (at transaction code WE30)?
    4. On what basis can we identify to which existing segment we can append our needed segment (if a new segment is to be appended)?

    Hi Nagarajan,
    Answers for your questions:
    1) How can we view and rectify the errors or warnings caused while we create a new IDoc, which may be an extension of an existing basic IDoc type (at transaction code WE30)?
    IDoc types are created in WE30. First set a breakpoint in the related user exit. For testing use WE19: just enter the error IDoc number in WE19 and press F8. It will then display the segments. Then enter /H in the command box and press the inbound function module push button (just beside the inbound push button). It will open in debug mode and we can test.
    2) How can we delete an IDoc type we created, if it has already been released (at transaction code WE30) and configured (at transaction code WE82)?
    Yes, it is possible to delete an IDoc type which has been released from our system, I think through a remote function, but I am not sure.
    3) Is it mandatory that the 'Mandatory' check box always be checked whenever we create (extend) a new segment on an existing segment (at transaction code WE30)?
    We select that check box based on the requirement. For example, if you upload data for the MM01 transaction, observe which fields are mandatory on that screen, and based on that choose the 'Mandatory' check box for the corresponding fields in the segment. (If in MM01 the material number is mandatory, then while creating the segment select the 'Mandatory' check box for MATNR.)
    4) On what basis can we identify to which existing segment we can append our needed segment (if a new segment is to be appended)?
    Based on the basic IDoc type and the information given by the user.
    Hope this helps you; reply if you have queries.
    Regards,
    Kumar.

  • Adobe Creative Suite 64-bit related queries

    Hi,
    I have the following questions related to 64-bit support in Adobe products.
    1. Do Adobe Illustrator CS3, CS4 and CS5 support 64-bit?
    2. Do Adobe Photoshop CS3, CS4 and CS5 support 64-bit?
    3. I heard that CS5 would support 64-bit, and that all applications under Creative Suite 5 would support 64-bit. Is that correct?
    4. Do 32-bit and 64-bit have separate installers, or can the same installer be used for both 32-bit and 64-bit?
    5. On which Windows platforms will CS 64-bit be supported?
    6. On which Mac platforms will CS 64-bit be supported?
    7. Does a separate licence have to be purchased for 32-bit & 64-bit, or can the same licence be used?
    Please clarify the above queries.
    Regards,
    Avudaiappan

    Find answers inline.
    AvudaiappanSornam wrote:
    Hi,
    I have the following questions related to 64-bit support in Adobe products.
    1. Do Adobe Illustrator CS3, CS4 and CS5 support 64-bit?
    Illustrator CS5 is not 64-bit.
    2. Do Adobe Photoshop CS3, CS4 and CS5 support 64-bit?
    Photoshop CS5 is 64-bit.
    3. I heard that CS5 would support 64-bit, and that all applications under Creative Suite 5 would support 64-bit. Is that correct?
    Since the answer to question 1 is no, you know the answer.
    4. Do 32-bit and 64-bit have separate installers, or can the same installer be used for both 32-bit and 64-bit?
    The same download can install 64-bit if you have a 64-bit OS.
    5. On which Windows platforms will CS 64-bit be supported?
    XP, Vista, Win 7
    6. On which Mac platforms will CS 64-bit be supported?
    10.5.7 and 10.6.x
    7. Does a separate licence have to be purchased for 32-bit & 64-bit, or can the same licence be used?
    I believe no, but you can always cross-check with Adobe or a reseller before purchasing.
    Please clarify the above queries.
    Regards,
    Avudaiappan

  • Relational queries through JDBC with the help of Kodo's metadata for O/R mapping

    Due to JDOQL's limitations (inability to express joins, when relationships
    are not modeled as object references), I find myself needing to drop down to
    expressing some queries in SQL through JDBC. However, I still want my Java
    code to remain independent of the O/R mapping. I would like to be able to
    formulate the SQL without hardcoding any knowledge of the relational table
    and column names, by using Kodo's metadata. After poking around the Kodo
    Javadocs, it appears as though the relevant calls are as follows:
    ClassMetaData cmd = ClassMetaData.getInstance(MyPCObject.class, pm);
    FieldMetaData fmd = cmd.getDeclaredField("myField");
    PersistenceManagerFactory pmf = pm.getPersistenceManagerFactory();
    JDBCConfiguration conf =
        (JDBCConfiguration) ((EEPersistenceManagerFactory) pmf).getConfiguration();
    ClassResolver resolver = pm.getClassResolver(MyPCObject.class);
    Connector connector =
        new PersistenceManagerConnector((PersistenceManagerImpl) pm);
    DBDictionary dict = conf.getDictionary(connector);
    FieldMapping fm = ClassMapping.getFieldMapping(fmd, conf, resolver, dict);
    Column[] cols = fm.getDataColumns();
    Does that look about right?
    Here's what I'm trying to do:
    class Foo {
        String name; // application identity
        String bar;  // foreign key to Bar
    }
    class Bar {
        String name; // application identity
        int weight;
    }
    Let's say I want to query for all Foo instances for which its bar.weight >
    100. Clearly this is trivial to do in JDOQL, if Foo.bar is an object
    reference to Bar. But there are frequently good reasons for modeling
    relationships as above, for example when Foo and Bar are DTOs exposed by the
    remote interface of an EJB. (Yeah, yeah, I'm lazy, using my
    PersistenceCapable classes as both the DAOs and the DTOs.) But I still want
    to do queries that navigate the relationship; it would be nice to do it in
    JDOQL directly. I will also want to do other weird-ass queries that would
    definitely only be expressible in SQL. Hence, I'll need Kodo's O/R mapping
    metadata.
    Is there anything terribly flawed with this logic?
    Ben
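For reference, the relationship-navigating query described above is simple once expressed in SQL. A minimal runnable sketch against SQLite follows; the table and column names are illustrative only, not Kodo's actual generated schema (in practice they would be looked up from the O/R mapping metadata as discussed):

```python
import sqlite3

# Illustrative schema mirroring the Foo/Bar classes from the post; real
# table/column names would come from Kodo's mapping metadata, not be hard-coded.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bar (name TEXT PRIMARY KEY, weight INTEGER);
    CREATE TABLE foo (name TEXT PRIMARY KEY, bar TEXT REFERENCES bar(name));
    INSERT INTO bar VALUES ('heavy', 150), ('light', 50);
    INSERT INTO foo VALUES ('f1', 'heavy'), ('f2', 'light');
""")

# All Foo instances whose Bar weighs more than 100.
rows = conn.execute("""
    SELECT f.name
    FROM foo f JOIN bar b ON f.bar = b.name
    WHERE b.weight > 100
""").fetchall()
print(rows)  # [('f1',)]
```

The join condition `f.bar = b.name` is exactly the navigation that JDOQL cannot express when the relationship is held as a plain string key rather than an object reference.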

    I have a one point before I get to this:
    There is nothing wrong with using PC instances as both DAO and DTO
    objects. In fact, I strongly recommend this for most J2EE/JDO design.
    However, there should be no need to expose the foreign key values... use
    application identity to quickly reconstitute an object id (which can in
    turn find the persistent version), or like the j2ee tutorial, store the
    object id in some form (Object or String) and use that to re-find the
    matching persistent instance at the EJB tier.
    Otherwise, there is a much easier way of finding ClassMapping instances
    and in turn FieldMapping instances (see ClassMapping.getInstance () in
    the JavaDocs).
    Steve Kim
    [email protected]
    SolarMetric Inc.
    http://www.solarmetric.com

  • RFC related queries

    Hi Friends,
    I have some queries which are RFC-related. Can you please clarify them:
    1) What does this syntax mean:
    CALL FUNCTION 'FM' DESTINATION 'dev-60'
    If I use this syntax in an FM in dev-90, will it execute the above remote-enabled FM in dev-60?
    Can I use this syntax inside the same remote-enabled FM?
    Thanks and Regards,
    Sakshi

    Hello Sakshi,
    This is a basic question which can be answered by googling. It is really easy; try this [link|http://tinyurl.com/yeqwqfv].
    BR,
    Suhas

  • Oracle 9iDS related queries

    Dear all,
    Please help me with the answers or documentation for the following queries.
    1. Major difference between Oracle 6iDS and Oracle 9iDS.
    2. Can I Execute application developed in Oracle 6i
    (Client-Server) as is in Oracle 9iDS?
    3. Can I execute forms developed in Oracle 9iDS without
    Application Server?
    4. Equivalent of DFLT.PRT (available in 6i) in Oracle 9iDS
    You can also send me the document (if any) by mail. My mail id is [email protected]
    Thanks

    Hi,
    1. Major difference between Oracle 6iDS and Oracle 9iDS.
    - Listener Servlet architecture
    - Web only
    - 25+ new features (design, deployment and architecture)
    2. Can I Execute application developed in Oracle 6i
    (Client-Server) as is in Oracle 9iDS?
    You need to re-compile them. There are also some obsolete built-ins that you need to migrate if you use them. A migration assistant (FMA) is included in the Developer Suite.
    3. Can I execute forms developed in Oracle 9iDS without Application Server?
    Oracle9iDS includes a standalone OC4J instance that you can use for design-time testing of your application. For production, the only supported application server is Oracle Application Server.
    4. Equivalent of DFLT.PRT (available in 6i) in Oracle 9iDS
    This sounds Reports-related, and I would try this question on their forum.
    See also the 9i collateral on http://otn.oracle.com/products/forms
    Frank

  • Automotive Industry Specific Training on VMS and Warranty ready for booking

    Hi all,
    this is indeed not a typical topic for a discussion forum, but potentially for your interest:
    The SAP trainings on Automotive Sales and also SAP Warranty are usually only offered on request, and are not held that often. But, by accident, there are some seats available:
    [SAP Industry Training on SAP Warranty (IAU280)|https://service.education.sap.com/sap(bD1lbiZjPTAwMSZkPW1pbg==)/bc/bsp/sap/hcm_learning/trainingtype.htm?sap-params=b3R5cGU9RCZvYmppZD03MDAxNjk2MQ%3d%3d]
    is scheduled for February 25-28, 2008 and will be held in Walldorf, Germany, and
    [SAP Industry Training on Automotive Vehicle Sales (IAU270)|https://service.education.sap.com/sap(bD1lbiZjPTAwMSZkPW1pbg==)/bc/bsp/sap/hcm_learning/trainingtype.htm?sap-params=b3R5cGU9RCZvYmppZD03MDA0MTMzOA%3d%3d] is scheduled for April 21-23 2008 in London, UK. Both are available for registration.

    Hi,
    I am an employee of SAP...
    Can I get the training material for IAU280 and IAU270 somewhere
    on a common server...

  • 'ADMINISTRATION TOOL' RELATED QUERIES

    Hi All,
    I have a few queries regarding the OBIEE Administration Tool; please help me find answers for these.
    I am using OBIEE version 10.1.3.4.0; any information, documents or sites related to these topics would help.
    1. Suppose I have more than one dimension model in a single RPD, and more than one developer has access to this RPD. Is it possible to restrict access for one dimension model so that a developer will not be able to access the other models?
    2. Also, when there is more than one RPD in the Administration Tool and many developers have access to them, can security be defined such that 'User A will have access only to RPD1' and A cannot access any other offline/online RPD?
    3. Must the Administration Tool be installed on the server, or can it also be installed on a client system? I ask because if more than one developer wants to access the Administration Tool at the same time, how can that be achieved?
    4. In my RPD I have more than one dimension model; can I import one model from this RPD into another RPD?
    5. What is a multiuser environment? Will it help with any of my above requirements?
    Thanks in advance

    1. No, but you can use MUD to define different projects so that developers "check out" the projects they are working on. See the links provided in the previous response.
    2. The security is defined in each RPD. To open an RPD you need to be an Administrator user defined in that RPD. Online access can be restricted if you block connections to your BI Servers on port 9703 so that they can only come from a local connection or from defined IPs. You will need a firewall for that, though. Offline access cannot be restricted: if I have an RPD locally and I have an admin account, I can open it anywhere I want.
    3. Client-only is fine. You would simply install the client tools on each developer's machine.
    4. Yes; search this forum for "merge repositories", plenty of guidance is already available.
    5. The links provided above give you a good explanation of what MUD is and what it is for. Searching these forums also gives you plenty of information.

  • Monitoring Related Queries - BPIM

    Dear All,
    We are planning to implement Business Process & Interface Monitoring - the first phase mainly involving application-specific monitoring and interface monitoring. I have a few queries on the same. It would be great if you could help:
    1) Our present DB is about 35 TB. If we implement BPMon in SolMan, how can we make sure that the performance of monitored systems like ECC, SRM, etc. is not impacted while data is collected by the local CCMS and then passed to SolMan's central CCMS? There could be thousands of open sales orders globally at various locations, so collecting that data could have some impact on system performance.
    2) What are the best practices and recommendations on the BPMon side, specifically for cross-application monitoring like ABAP dumps, update errors and batch files? I have links to the SAP standard slides, so please don't share the ones from the marketplace.
    3) Do you have any strategic document showing how this was proposed/implemented in some project/client with real-time examples, as that would give more clarity?
    4) Escalation management / corrective measure procedure - is any standard facility available for escalation management? We are also looking for some task-assignment kind of feature, where alert actions can be assigned to various project team members by process experts for follow-up, etc.
    Thanks in advance.
    SM

    Hello Suchit,
    1. There is no guarantee that the collectors will not influence performance; however, they are written in a way that should not drastically affect the system. If you are in doubt, I would suggest running a chosen TBI report (in ST13), which more or less illustrates the data collector, and tracing the results (especially a performance trace).
    2. If you have the SAP slides you should be able to find the best practices. I believe the goal of BPMon is to monitor application/process-specific errors. That is why, for example, the ABAP dumps monitor has a selection on what is dumping, so you can search only among errors belonging to your process. In my case we created a separate business process called cross-process monitoring, and there we monitor things that are critical but not possible to assign to just one process.
    3. The only "strategic" document is the technical documentation of BPMon, as SAP will not identify your critical business processes and tell you how to monitor them (at least not for free).
    4. That depends on what kind of tool you are using for incident management. You can use email notifications if you don't have a specific tool; otherwise you might consider building your own BAPI to support your IM tool.
    BR
    Jolanta

  • Smart Sync Related Queries

    A. Are there any guidelines on defining parent-child relationships and associations when making BAPI wrappers? Any documents/notes/links?
    C. How can we separate the BAPI wrapper interface and the filtering rules? Can I bring my filtering/distribution rules defined in the backend to the MI middleware? If yes, how? Any documents?
    D. Which type is suitable for master data and which type is suitable for transactional data? Is it OK to make every BAPI wrapper of type T51 (server-driven)? Are there any drawbacks to this approach?
    E. In the server-driven case, which has more load: the server or the middleware?
    Regards,
    Anubhav

    Hi Anubhav,
    T51 is the best type of SyncBO not only for master data but for transactional data as well. It is the one with the highest performance and the least load on both the backend and MI.
    In the T51 case, the backend informs MI about changes in the data (add/modify/delete). The backend is best placed to identify such changes, since the data resides there. The task of identifying data changes is quite easy in the backend (it depends on the backend implementation), but if there are too many ways (too many applications) through which the data can be changed in the backend, the effort to catch all these cases will be higher.
    In the T01 case, MI identifies the changes in the data. Since the data primarily resides in the backend and changes happen there, the only way MI can identify the changes is by comparing every record from the backend with the replicated data in the middleware RDB. This process is very time-consuming and can lead to performance problems if the data volume is large. The replication time will also be higher.
    In the case of master data, which seldom changes, the T01 replicator runs periodically (as scheduled), comparing the whole data set only to find that there are no changes. In the T51 case, the replicator runs automatically only when the backend reports a change in the data.
    Even for transactional data, T51 is better in terms of performance. The delay before data is updated in the middleware after a change in the backend is very small, and is even configurable in the middleware. So the latest data will always be in the middleware.
    Regards
    Ajith Chandran
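The cost difference described above can be sketched generically. This is not MI API code; the record structures and function names are invented for illustration. A T01-style replicator must compare the whole backend snapshot against the middleware copy, while a T51-style replicator only applies the change events the backend pushes:

```python
def t01_replicate(backend_snapshot, middleware_copy):
    """Timed replicator: compare every backend record against the middleware
    copy to discover changes (work grows with total data size)."""
    touched = 0
    for key, value in backend_snapshot.items():
        touched += 1
        if middleware_copy.get(key) != value:
            middleware_copy[key] = value
    for key in [k for k in middleware_copy if k not in backend_snapshot]:
        touched += 1
        del middleware_copy[key]
    return touched

def t51_apply(change_events, middleware_copy):
    """Server-driven: apply only the add/modify/delete events the backend
    reports (work grows with the number of changes, not the data size)."""
    for op, key, value in change_events:
        if op == "delete":
            middleware_copy.pop(key, None)
        else:  # "add" or "modify"
            middleware_copy[key] = value
    return len(change_events)

backend = {"MAT1": "v2", "MAT2": "v1"}                # MAT1 modified, MAT3 deleted
copy_a = {"MAT1": "v1", "MAT2": "v1", "MAT3": "v1"}
copy_b = dict(copy_a)
print(t01_replicate(backend, copy_a))                 # 3 records touched
print(t51_apply([("modify", "MAT1", "v2"),
                 ("delete", "MAT3", None)], copy_b))  # 2 events applied
print(copy_a == copy_b)                               # True
```

Both paths converge on the same middleware state, but T01's work is proportional to the full data volume on every scheduled run, whereas T51's work is proportional only to the number of actual changes.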
