What is project cost analysis?

Hi SAP gurus,
Can anybody clarify what project cost analysis is?
Thank you,
Venkatareddy

Hi Venkat,
Please go through the following link, which is useful:
http://en.wikipedia.org/wiki/Project_management#Cost
Best regards
Assign points if helpful

Similar Messages

  • How to find the difference between Project costs & Cost center costs?

    Hi all,
    Can you please explain what project costs and cost centre costs are, which tables hold the fields for project costs and cost centre costs, and how to find the difference between their costs? Please also give me the functional idea behind it.
    Thanking you all in Advance...
    Regards,
    Chandru

    Hi,
    Go to table COSP:
    1) Object numbers (OBJNR) beginning with 'KS' refer to cost centres.
       Example: 'KSKOKA4711', i.e. 'KS' + controlling area KOKA + cost centre 4711.
    2) OBJNR beginning with 'PR' refers to projects / WBS elements.
       Example: PR00001234
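    To make the object-number convention concrete, here is a minimal sketch (Python, using the hypothetical OBJNR values from the examples above; field widths should be verified in your own system) of how the OBJNR prefix separates cost-centre records from project/WBS records:

    ```python
    # Minimal sketch: classify COSP object numbers (OBJNR) by prefix.
    # Values are hypothetical; verify field widths in your own system.

    def classify_objnr(objnr: str) -> dict:
        """Split a COSP OBJNR into its components based on its prefix."""
        if objnr.startswith("KS"):
            # 'KS' + controlling area + cost centre, e.g. 'KSKOKA4711'
            return {"type": "cost centre",
                    "controlling_area": objnr[2:6],
                    "cost_centre": objnr[6:]}
        if objnr.startswith("PR"):
            # 'PR' + internal WBS-element number, e.g. 'PR00001234'
            return {"type": "project / WBS element",
                    "wbs_internal_no": objnr[2:]}
        return {"type": "other", "objnr": objnr}

    for objnr in ("KSKOKA4711", "PR00001234"):
        print(objnr, "->", classify_objnr(objnr))
    ```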

  • CO03 cost Analysis report to BW Report

    I need to create a report in BW that is the same as the (CO03) Cost Analysis. I want to create an extractor that extracts the same data that the (CO03) Cost Analysis shows, but unfortunately I don't know which tables and selection conditions I need to use for my extraction.
    Alternatively, can anyone tell me which function module SAP uses for this, and whether it is possible to use it in my extractor?
    Thank you very much in advance

    Hi Joher,
    You can look for Controlling extractors in R/3, basically
    0CO_PC_01
    and other DataSources related to it.
    You can also look in transaction SBIW under Controlling and Profitability Analysis (CO-PA); basically it covers all the requirements, but it will only contain those orders which have hit your accounts.
    Since CO03 is related to production planning and maintenance, also try the other standard extractors, those starting with
    2LIS_17_.....
    These DataSources are based on PM.
    I think it should help.

  • Project cost controlling

    Hi guru
    What is project cost controlling, and what are its step-by-step customisation settings?
    Thanks
    shivaji

    Hi,
    You can refer to this link:
    http://help.sap.com/saphelp_erp60_sp/helpdata/en/bf/5ffd166d4311d1ad5e0000e8a6353a/frameset.htm
    Regards,
    Eli

  • What kind of component to use as CI if I want to add a project costing rec

    It seems like the Project_Express and Intfc_Proj_Express components do not allow, by default, for the creation of new records. When I tried to enable creation by allowing Adds in the component properties, and then created CIs from these components, it still would not let me create a record (Create in the CI tester is disabled). Any ideas on how to create a project costing transaction record via a CI?


  • What is the impact of complex procurement with retention on Oracle Project cost?

    Hello Experts / Dina,
    Can you please explain the flow and accrual of cost when a complex purchase order with multiple retentions has been raised against an Oracle Project and the cost is interfaced to Oracle Projects?
    Example: a complex purchase order for contract consulting is raised for 1000 GBP with 4 milestones of 250 GBP, each upon 25% completion of work.
    I want to know, in this scenario, how the retention payment / invoice matched to the purchase order impacts the cost interfaced to the project using PRC: Interface cost to projects.
    Thanks in advance,
    Regards
    :-)

    Hello,
    As per the Oracle costing user guide on retainage:
    "Oracle Projects does not report on or interface project-related retainage distribution lines as commitments or actual costs. Instead, Oracle Projects captures the full amount of the expense as a commitment and, when applicable, for the funds check. Later, the full amount of the expense is interfaced to Oracle Projects as an actual cost. Retainage is related to the payment of the invoice and it ultimately does not have an impact on the overall project cost."
    So, in relation to my previous example: if period-end accounting is enabled on the procurement side, then on raising the purchase order of 1000 GBP and interfacing cost to Oracle Projects, will it be treated as a commitment amount (1000 GBP)? And when the invoice is raised / matched and accounted for the 1st milestone, say 300 GBP, will the complete 1000 GBP be interfaced as actual cost, or just 300 GBP?
    Really appreciate it if you can help me understand. :-)
    Thanks

  • Cost Analysis by Sales Order

    Hello,
    I'm seeking to enable in BI a cost analysis scenario with respect to Sales Orders; that is, something similar to the analysis you can easily perform in the R/3 area Accounting -> Controlling -> Product Cost Controlling -> Product Cost by Sales Order.
    For instance, in standard report KKBC_KUN you can choose a sales order number and get all the cost elements related to that sales order, with the corresponding costs.
    I have examined the following objects provided by BW's Business Content:
    - InfoCube 0PC_C01: it contains only the costs related to production orders, not the costs associated with sales orders.
    - InfoSource 0CO_OM_CCA_9: it seems interesting, in fact it delivers some header and line item data of the Controlling Object Document. Unfortunately it doesn't deliver the crucial characteristics Controlling Object and Controlling Object Type, which I definitely need for enabling my analysis scenario!
    (By the way, are these 2 characteristics delivered by the BC? There is a 0COSTOBJ characteristic, but it's not used by any object in the flow afterwards. Moreover, I can't find a standard DataSource in R/3 which I can use to load either its attributes or its texts, and see what data enters it.)
    I have also looked at the Business Content for Product Cost by Sales Order, but nothing could help me.
    It seems really weird that it's not possible to carry out a cost analysis by sales order in BW, so any suggestion will be very much appreciated indeed.
    Thanks & Regards, Davide

    I eventually found the following SAP Note. There is no BCT for this analysis scenario...
    Note 584791 - BW-BCT-CO: Costs not extracted on sales orders
    Symptom
    In R/3, costs are posted directly to sales orders. The costs have to be analyzed in the BW system. In the standard system, however, this data is not extracted into the BW system.
    Other terms
    0CO_OM_OPA_1; 0CO_OM_OPA_6
    Solution
    1. There are currently no DataSources for costs on sales orders in the standard BW Business Content. Unfortunately, there are currently no resources scheduled for a new development in this area either.
    2. Therefore, the DataSources and the Business Content must be created in the customer project, within the framework of consulting, as a customer solution using the generic extraction options of the BW Service APIs or function modules you have programmed yourself for extracting data.
    3. Technical background information about costs on sales orders:
    The transaction data is available in the following R/3 tables: COSS and COSP (totals records) or COEP (line items). This concerns records with VB* object numbers.
    The master data of the sales orders is contained in the VBAK and VBAP tables.
    The 0CO_OM_OPA_1 and 0CO_OM_OPA_6 DataSources only extract costs for orders with master data in the AUFK table. Those orders always have OR* object numbers. Therefore, these DataSources cannot be used for sales orders.
    However, you may be able to assign orders from the AUFK table (for example, maintenance orders or production orders) to a sales order through the AUFK-KDAUF table field.
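    As a rough illustration of the generic extraction logic the note describes, the sketch below uses Python's stdlib sqlite3 as a stand-in for the R/3 database. The table and field names follow the note (COSP, VBAK, OBJNR), the data is invented, and the exact OBJNR layout should be verified before relying on it:

    ```python
    # Rough sketch of the note's extraction logic; sqlite3 stands in for
    # the R/3 tables, and the data is made up.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE COSP (OBJNR TEXT, GJAHR INT, WRTTP TEXT, WKG001 REAL);
        CREATE TABLE VBAK (VBELN TEXT, AUART TEXT);
        -- 'VB' + sales order number (+ item) is the object-number pattern
        -- for costs posted directly to sales orders; OR* records belong
        -- to orders covered by 0CO_OM_OPA_1 / 0CO_OM_OPA_6.
        INSERT INTO COSP VALUES ('VB0000004711000010', 2005, '04', 1500.0);
        INSERT INTO COSP VALUES ('OR000012000123',     2005, '04',  900.0);
        INSERT INTO VBAK VALUES ('0000004711', 'TA');
    """)

    # Keep only VB* totals records and join the sales order master.
    for row in con.execute("""
        SELECT c.OBJNR, v.VBELN, c.GJAHR, c.WKG001
        FROM COSP c
        JOIN VBAK v ON substr(c.OBJNR, 3, 10) = v.VBELN
        WHERE c.OBJNR LIKE 'VB%'
    """):
        print(row)
    ```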

  • What is BI? How do we implement it, and what is the cost to implement?

    What is BI? How do we implement it, and what is the cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
    Below is a description addressing your query.
    Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making.
    The five key stages of Business Intelligence:
    1. Data Sourcing
    2. Data Analysis
    3. Situation Awareness
    4. Risk Assessment
    5. Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
    Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:
    • probability theory - e.g. classification, clustering and Bayesian networks;
    • statistical methods - e.g. regression;
    • operations research - e.g. queuing and scheduling;
    • artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.).  Situation awareness is the grasp of  the context in which to understand and make decisions.  Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
    Business Intelligence is about using information wisely. It aims to warn you of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales, customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
    • Data Sources
    • Data Warehouses
    • Data Marts
    • Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
    The university has a multitude of data sources residing in different Database Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, a physical inventory and a logical inventory were conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
    The physical inventory comprised a review of DBMS catalogued tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
    The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
    We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
    This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against the appropriate data quality business rules/standards required to perform the data, information and statistical modeling described previously.
    Inbound data that does not meet the data warehouse's data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). While it is desirable for rejected records to be corrected in the operational system, if this is not possible then the start dates from which data can begin to be collected into the data warehouse may need to be adjusted to accommodate the necessary source-system data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) approach is to use Structured Query Language (SQL) views on the host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
    • extraction of data from operational data stores
    • transformation of this data
    • loading of the extracted data into your data warehouse or data mart
    When the data source is a "non-DBMS" data set it may be advantageous to pre-convert it into a SAS® data set to standardize data warehouse metadata definitions. It can then be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the metadata required to facilitate productive extraction, transformation and loading (ETL) work.
    Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code, as sketched below. The option exists to "materialize" these views on the host systems, or to leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differently depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations; the flexibility of choosing different deployment options based upon these factors is a considerable advantage.
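    As a small, generic illustration of the view-as-quality-filter idea (Python's stdlib sqlite3 standing in for the host DBMS; the table, columns and rule are invented for the example):

    ```python
    # A SQL view acting as a data-quality filter before loading; sqlite3
    # stands in for the host DBMS, and the table and rules are invented.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src_enrolment (student_id TEXT, dept TEXT, credits REAL);
        INSERT INTO src_enrolment VALUES ('S1', 'MATH', 3.0);
        INSERT INTO src_enrolment VALUES ('S2', NULL,   3.0);  -- incomplete hierarchy
        INSERT INTO src_enrolment VALUES ('S3', 'HIST', -1.0); -- invalid measure

        -- The view encodes the quality rules; no procedural code needed.
        CREATE VIEW v_enrolment_clean AS
            SELECT student_id, dept, credits
            FROM src_enrolment
            WHERE dept IS NOT NULL AND credits >= 0;
    """)

    # The load step reads only from the view, so rejected rows never
    # reach the warehouse; they stay behind for source-system re-work.
    print(con.execute("SELECT * FROM v_enrolment_clean").fetchall())
    ```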
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
    • Customer "visible" relational tables
    • OLAP cubes
    • Pre-determined parameterized and non-parameterized reports
    • Ad-hoc reports
    • Spreadsheet applications with pre-populated work sheets and pivot tables
    • Data visualization graphics
    • Dashboards/scorecards for performance indicator applications
    Typically a business intelligence (BI) project may be scoped to deliver an agreed-upon set of data marts. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built, or is used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specification (DDMS) and the conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
    Over time, as more and more data marts and logical data models (LDMs) are built, the conceptual data model (CDM) becomes more complete. One very important advantage of this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope, as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
    McMaster's Data Warehouse design
    [Diagram: DB2, Oracle and other operational sources feed the development, test and production warehouses through ETL jobs; further ETL populates the data marts (DB2/Oracle and SAS data sets, via a staging area), which users access through BI tools. Users do not access the warehouses directly.]
    Publication Services
    This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS® Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio), will be accessed through this publication channel.
    Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets, and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
    • Access and view SAS® data sources
    • Access and view any other data source that is available from a SAS® server
    • Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
    SECURITY - AUTHENTICATION AND AUTHORIZATION
    The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
    • Data identification
    • Data classification
    • Value of the data
    • Identifying any data security vulnerabilities
    • Identifying data protection measures and associated costs
    • Selection of cost-effective security measures
    • Evaluation of effectiveness of security measures
    At McMaster, access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific security policy (who you are). Authorization is the process of determining which permissions the user has for which resources (permissions). Authentication is also a prerequisite for authorization. At McMaster, business intelligence (BI) services that are not public require a sign-on with a single university-wide login identifier, which is currently authenticated using Microsoft Active Directory. After a successful authentication, the university login identifier can be used by the SAS® Metadata Server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster, aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Using SAS® Information Map Studio, an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or the SAS® Information Delivery Portal (ability to view only). Previously, access to data residing in DB2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful: they capture metadata about allowable usage and query generation rules, they describe what can be done, they are database independent and can cross databases, and they hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role-based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
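    A hedged sketch of that filtering pattern in plain Python (the real implementation is a SAS® stored process; all names and data below are hypothetical):

    ```python
    # Sketch of role-based filtering: resolve the login to an employee
    # number via a directory, then return only that investigator's rows.
    # All names and data are hypothetical.

    DIRECTORY = {"jsmith": "E1001", "mlee": "E1002"}   # login -> employee no.

    RESEARCH_MART = [
        {"project": "P-01", "pi_employee": "E1001", "funding": 50000},
        {"project": "P-02", "pi_employee": "E1002", "funding": 75000},
    ]

    def rows_for_login(login: str) -> list:
        """Filter the research mart to the signed-on investigator's projects."""
        emp = DIRECTORY.get(login)
        if emp is None:
            return []  # unknown login: return no data rather than all data
        return [r for r in RESEARCH_MART if r["pi_employee"] == emp]

    print(rows_for_login("jsmith"))
    ```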
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages of building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform, which will allow us to scale horizontally in the future.
    Our development environment consists of a server with 2 x Intel Xeon 2.8 GHz processors and 2 GB of RAM, running Windows 2000 Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
    A. Hardware
    1. Server 1 - SAS® Data Server
    - 4-way 64-bit 1.5 GHz Itanium2 server
    - 16 GB RAM
    - 2 x 73 GB drives (RAID 1) for the OS
    - 1 x 10/100/1Gb Cu Ethernet card
    - 1 x Windows 2003 Enterprise Edition for Itanium
    2. Mid-Tier (Web) Server
    - 2-way 32-bit 3 GHz Xeon server
    - 4 GB RAM
    - 1 x 10/100/1Gb Cu Ethernet card
    - 1 x Windows 2003 Enterprise Edition for x86
    3. SAN Drive Array (modular and can grow with the warehouse)
    - 6 x 72 GB drives (RAID 5), 360 GB total for SAS® and data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
    2. Mid-Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA Web Logic for future SAS® SPM Platform
    - Xythos Web File System (WFS)
    3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
    Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to the integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
    As well, we will begin the implementation of SAS® Strategic Performance Management software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. The software can perform causal analyses among various measures within and across areas, providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and possible detractors, and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results, that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • Opening Balances for Project Costing in R12

    Hi All,
    What open transactions should be migrated as opening balances in Project Costing (R12) for projects and tasks? At the client site, costs will come from INV, PO, AP and OTL into Project Costing (R12). Can you please explain in detail (opening balances & historical cost - how to migrate)?
    Waiting for replies...

    Hi
    Usually on implementation of Projects, people migrate total historical costs per project.
    Depending on your current system, you may convert the historical data summarized by project, task, expenditure organization and expenditure type. There is an option to summarize costs into the latest period, just before starting the future transactions import into Projects. Otherwise, consider summarizing cost data per period, per quarter or per year.
    The summarized data may be loaded as miscellaneous expenditures into Projects (a small sketch of this summarization follows below).
    Be aware that for converted expenditures you should set up a Transaction Source which is costed and accounted.
    Dina
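    A minimal sketch of the summarization step Dina describes (Python; the legacy cost lines are hypothetical, the grouping keys are the ones named above):

    ```python
    # Roll historical cost lines up by project, task, expenditure
    # organization and expenditure type; each summarized row then becomes
    # one miscellaneous-expenditure line loaded through a transaction
    # source set up as costed and accounted. Data is hypothetical.
    from collections import defaultdict

    legacy_lines = [
        ("PRJ1", "T1", "ENG", "Labor",    120.0),
        ("PRJ1", "T1", "ENG", "Labor",     80.0),
        ("PRJ1", "T2", "OPS", "Material", 300.0),
    ]

    totals = defaultdict(float)
    for project, task, exp_org, exp_type, amount in legacy_lines:
        totals[(project, task, exp_org, exp_type)] += amount

    for key, amount in sorted(totals.items()):
        print(*key, amount)   # e.g. PRJ1 T1 ENG Labor 200.0
    ```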

  • Production Order Cost Analysis - Goods Receipt

    I have been assigned to create an analysis of production orders. I need to show the volume variance as well as the cost variance, using both the initial planned costs and the actual costs when the production order is completed. All this information is available in COR3 under cost analysis, but I need to display multiple orders based on plant and date information.
    I have found that the ingredients are in table RESB with most of the information that I need, but I have not had luck finding which table has the goods receipt information (the produced product). SAP help says that the fields are in KKBCS_OUT, which is a structure, and I can't find where it is populated from.
    I would appreciate any assistance in this endeavor.
    Thanks
    Steven M Engel

    Thanks for the suggestions, but they have not provided me with what I think I need to give a solution to my superiors.
    I will attempt to clarify what I need to do. We are a manufacturing facility. We have a BOM that says to make 10 E, I need 3 A, 2 B, 3 C, and 4 D. Each of the components (A to D) has a set cost at the time the production order is created; this looks to be saved as part of the planned cost in RESB. Produced product E also has a planned cost at the time the production order is created (as seen in COR3), but that is not stored in RESB.
    At production time we actually use 2.5 A, 2.5 B, 3 C, and 5 D to make the 10 E. The COR3 cost analysis shows me the cost variance. This cost variance has two parts: part 1 arises because the quantities of the inputs are different; part 2 arises because, between the time the production order was created and the time it was finished, the costs of A to E may have changed (a small numeric sketch of this split follows below).
    I need to break out the different components to show management how much of the total cost variance is attributed to each part. I need a report that management can run (or have run) that will allow them to choose by plant, date, end product, etc., so they can direct what changes need to be made to costing, pricing, and BOMs.
    I hope that I have laid out what I am trying to do in a coherent manner. Also, I am new to the SAP system, with less than 1 year using it.
    Thanks for your time.
    Steven M Engel
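    For the breakout Steven describes, a standard decomposition splits each component's total variance into a quantity (volume) part valued at plan price and a price part valued at actual quantity. A small sketch (Python; the quantities follow the example above, the prices are hypothetical since none are given):

    ```python
    # Split each component's total variance into a quantity (volume) part
    # and a price part. Quantities follow the BOM example above; plan and
    # actual prices are hypothetical, since none are given in the post.

    components = {
        # name: (plan_qty, actual_qty, plan_price, actual_price)
        "A": (3.0, 2.5, 10.0, 11.0),
        "B": (2.0, 2.5,  4.0,  4.0),
        "C": (3.0, 3.0,  6.0,  5.5),
        "D": (4.0, 5.0,  2.0,  2.5),
    }

    for name, (pq, aq, pp, ap) in components.items():
        qty_var   = (aq - pq) * pp      # volume variance, valued at plan price
        price_var = (ap - pp) * aq      # price variance, on actual quantity
        total_var = aq * ap - pq * pp   # identical to qty_var + price_var
        print(f"{name}: qty {qty_var:+.2f}  price {price_var:+.2f}  total {total_var:+.2f}")
    ```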

  • Cost analysis through Purchase Order created from Sales Order

    Hi Experts,
    This is the scenario: there is an item no. which is not stocked in the warehouse and is purchased only when customers ask for it. The cost of the sales order line having this item is calculated from the purchase order line on which the item appears. (The PO has items for other sales orders as well, and sales orders have multiple item types.) The sales order has the PO no. against it, and the purchase order has the SO no. against the line. This is all currently done using MS Access and Sage Accounts, with lots of free typing and manual cost analysis involved.
    I have gathered from this forum that we can create a purchase order from the sales order, and that the document no. information can be pulled from one to the other using a formatted search. Can we do the accounting and cost analysis as well?
    Please let me know if I have not made myself clear.
    Thank you in advance and kind regards.
    IC

    Hi Ian,
    I concur with Faraz; it is better if you use the procurement document facility for doing back-to-back transactions from SO to PO.
    However, in your case, I'm afraid you can't analyze the gross profit and record the related accounting activity as you request using standard SAP B1 functionality.
    As for the gross profit case:
    Since you're actually using a workaround in order to avoid all the hassle of maintaining all the item master data related to this case, you create just one item code called S1, right?
    SO DocEntry - SO Line # - Item Code - Item Name - Quantity - Price
    1001 - 0 - S1 - Table - 2 - 100
    1001 - 1 - S1 - Chair - 8 - 25
    1001 - 2 - S1 - Cloth - 3 - 10
    PO # - PO Line # - PO BaseEntry (SO DocEntry)  - PO BaseLine (SO Line #) - Item Code - Item Name - Quantity - Price
    2001 - 0 - 1001 - 0 - S1 - 2 - 100
    2001 - 1 - 1001 - 1 - S1 - 8 - 25
    2001 - 2 - 1001 - 2 - S1 - 3 - 10
    Note that the standard B1 gross profit calculation is that the system compares the selling price of each item with the item cost / a certain price list for each respective item. In your case, however, you put every item under a single item master, so the system can't compare Table to Table, Chair to Chair, Cloth to Cloth: the item cost for these 3 items will be the same, and the price list for these 3 items will also be the same, since there is actually only one master data record. Of course you could update the gross profit calculation manually for each SO, but to be honest it seems rather weird to do this manually.
    So, in order to simulate the gross profit calculation, you need to create a custom query that compares the PO line no. with the SO line no., as sketched below.
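    A rough sketch of that custom query (Python's stdlib sqlite3 standing in for the B1 database; RDR1 / POR1 and the field names follow B1 conventions but should be verified, and the prices are taken from the example that follows):

    ```python
    # Simulate gross profit per SO line by joining each PO line back to
    # its base SO line via BaseEntry / BaseLine. sqlite3 stands in for
    # the B1 database; verify table and field names in your system.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE RDR1 (DocEntry INT, LineNum INT, Dscription TEXT,
                           Quantity REAL, Price REAL);
        CREATE TABLE POR1 (DocEntry INT, LineNum INT, BaseEntry INT,
                           BaseLine INT, Quantity REAL, Price REAL);
        -- SO 1 (selling prices) and PO 1 (purchase prices), linked line
        -- by line through BaseEntry / BaseLine.
        INSERT INTO RDR1 VALUES (1, 0, 'Table', 1, 130), (1, 1, 'Chair', 1, 30);
        INSERT INTO POR1 VALUES (2, 0, 1, 0, 1, 100), (2, 1, 1, 1, 1, 25);
    """)

    for row in con.execute("""
        SELECT s.DocEntry, s.LineNum, s.Dscription,
               s.Quantity * s.Price AS revenue,
               p.Quantity * p.Price AS cost,
               s.Quantity * s.Price - p.Quantity * p.Price AS gross_profit
        FROM RDR1 s
        JOIN POR1 p ON p.BaseEntry = s.DocEntry AND p.BaseLine = s.LineNum
    """):
        print(row)   # Table: GP 30; Chair: GP 5
    ```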
    As far as the accounting-side recording is concerned, I'm afraid you can't do it:
    1. SO 1 : Item Table - Selling price 130 ; Item Chair - Selling price 30
    2. PO 1 (linked to SO 1) : Item Table - Purchase price 100 ; Item Chair - Purchase price 25
    3. SO 2 : Item Table - Selling price 150 ; Item Chair - Selling price 40
    4. PO 2 (linked to SO 2) : Item Table - Purchase price 110 ; Item Chair - Purchase price 30
    Correct me if I'm wrong: you want to record, accounting-wise, that the CoGS of the delivery based on SO 1 is 100 for Table and 25 for Chair, isn't it? No matter what costing method you use, you can't achieve this.
    If you use moving average, the system will average the cost price across both Table and Chair, since it is actually one item master data record. Even if you distinguish each item as its own master data, the system will still average each item's cost price across the goods receipts for PO 1 and PO 2.
    Example (in sequence of events):
    1. GR based on PO 1 - Table - Purchase price = 100, therefore Item cost = 100
    2. GR based on PO 2 - Table - Purchase price = 110, therefore Item cost = 105 (considering quantity only one for both PO)
    3. Delivery based on SO 1 - Table - CoGS incurred will be 105, not 100
    To a certain extent, the same thing could be said of the FIFO costing method:
    1. GR based on PO 1 - Table - Purchase price = 100 (qty 1)
    2. GR based on PO 2 - Table - Purchase price = 110 (qty 1)
    3. Delivery based on SO 1 - Table - CoGS incurred will be 100 (qty 1)
    4. Delivery based on SO 2 - Table - CoGS incurred will be 110 (qty 1)
    At first glance this seems right, since the CoGS for SO 1 is 100, tied to the purchase price of PO 1, and the CoGS for SO 2 is 110, tied to the purchase price of PO 2. But we're relying on an assumption here: we're assuming that the goods receipts of the POs and the deliveries of the SOs occur in the very same sequence as the issuance of the SOs and POs. Consider this sequence instead:
    1. GR based on PO 1 - Table - Purchase price = 100 (qty 1)
    2. GR based on PO 2 - Table - Purchase price = 110 (qty 1)
    3. Delivery based on SO 2 - Table - CoGS incurred will be 100 (qty 1)
    4. Delivery based on SO 1 - Table - CoGS incurred will be 110 (qty 1)
    If the above scenario / sequence happens, you will get the wrong gross profit figure, since the delivery for SO 2 will take the item cost based on the GR for PO 1. Since we're talking about a real-life business process, who can guarantee that the above scenario won't happen? It is entirely possible that you deliver first to the customer who ordered later; there is too much room for this, so it is better to mitigate such a result.
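    A numeric sketch of both behaviours for the Table rows (Python; single-unit quantities, purchase prices 100 and 110 as in the example):

    ```python
    # Moving average: after both goods receipts the item cost is the
    # average, so the delivery for SO 1 draws 105, not the 100 of PO 1.
    qty, cost = 0.0, 0.0
    for gr_qty, gr_price in [(1, 100.0), (1, 110.0)]:
        cost = (qty * cost + gr_qty * gr_price) / (qty + gr_qty)
        qty += gr_qty
    print("moving-average item cost:", cost)   # 105.0

    # FIFO: deliveries consume cost layers in receipt order, so whichever
    # delivery happens first gets the 100 layer, regardless of which SO
    # it belongs to (the mis-sequencing problem described above).
    layers = [(1, 100.0), (1, 110.0)]          # (qty, unit cost), oldest first
    for delivery in ("SO 2 delivered first", "SO 1 delivered second"):
        layer_qty, layer_cost = layers.pop(0)  # oldest layer consumed first
        print(delivery, "-> CoGS", layer_cost)
    ```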
    My suggestion is that you record the accounting transactions based on the standard business process; there is no need to try to capture the actual cost and actual gross profit for a certain SO and PO. It is too complicated, and the software itself doesn't cater for this kind of requirement. The best you can do is create a custom query for a gross profit simulation calculation; for truly accurate accounting recording, IMO you should forget it. Hope this helps.
    Best Regards,
    Hendry Wijaya

  • Runtime error in KOB5, KOB1 and line item display in Cost Analysis in KO03

    Hello everyone,
    I would like to ask for your expert advice on a situation for which we need a solution. Let me tell you what happened.
    In KO88, we tried to settle an internal order to a final asset using 'partial capitalization'. We encountered the error 'Settlement amount is greater than available amount'. I checked in KOB5 and we have two line items: 90,000 and 30,000. Both have to be settled to three different final assets, namely asset 1, asset 2 and asset 3. The settlement rule was at first as follows:
    Line 1: 90,000 to be settled to asset 1, asset 2 and asset 3 at 65,000; 40,000 and 15,000, respectively.
    Line 2: 30,000 to be settled to asset 1, asset 2 and asset 3 at 65,000; 40,000 and 15,000, respectively.
    Due to the error, we changed the settlement rule following these directions:
    1. Remove the settlement rules that we had so we can start anew.
    2. Choose the first line item and click on "Final Settlement".
    3. Enter the necessary settlement details (total should equal to 90,000 only).
    4. Go back and save.
    5. Choose the second line item and click on "Final Settlement".
    6. Enter the necessary settlement details (total should equal to 30,000).
    7. Go back and save.
    8. Then, check again the distribution rules set.
    Everything was saved and done correctly, so we tried KO88 again. But the previous error, 'Settlement amount is greater than available amount', still occurred.
    I went back to KOB5 and when I ran it, a runtime error occurred. I went to see the actual line items in KOB1, and I encountered the same runtime error. I went to KO03 > Cost Analysis > the actual debit line items, and I encountered the same runtime error. Were my actual line items lost? How do I retrieve them?
    Thanks in advance.
    Best regards,
    Peter

    Hi,
    Check in SAP OSS, with the message number you get, whether there are any SAP Notes to be applied.

  • Project cost plan Header Data table (PS module - T.code CJR2)

    Hi,
    I am working on the PS module, where we have to build an integration between the PS cost plan (at WBS element level) and an SD quotation. I have to use transaction CJR2 for this. We will add a custom field 'Indicator' on the initial screen of transaction CJR2. I want to know what the HEADER DATA table for the project cost plan is, so I can add this custom field to that table and later on use it in CJR2.
    If someone has done any kind of modification work on transaction CJR2, please provide me some information on this, since I am working on the PS module for the first time. I need some information on modifying the initial screen of transaction CJR2.
    Any inputs will be appreciated.
    Thanks,

    Hi,
    you'll find the project cost plan in tables RPSCO / COSP (a small sketch follows below)
    regards Andreas
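    A hedged sketch of reading plan values for a WBS element from RPSCO (Python's stdlib sqlite3 standing in for the SAP database; apart from RPSCO, OBJNR and the plan/actual value type, the fields below are simplified placeholders, so check the real structure in SE11 first):

    ```python
    # Simplified sketch: plan vs. actual totals for a WBS element from
    # RPSCO. Field names other than OBJNR/WRTTP are placeholders.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE RPSCO (OBJNR TEXT, WRTTP TEXT, GJAHR INT, VALUE REAL);
        -- WRTTP '01' = plan, '04' = actual; OBJNR 'PR...' = WBS element
        INSERT INTO RPSCO VALUES ('PR00001234', '01', 2007, 12000.0);
        INSERT INTO RPSCO VALUES ('PR00001234', '04', 2007,  9500.0);
    """)

    plan = con.execute(
        "SELECT SUM(VALUE) FROM RPSCO WHERE OBJNR LIKE 'PR%' AND WRTTP = '01'"
    ).fetchone()[0]
    print("plan total:", plan)   # 12000.0
    ```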

  • A Standard RRB report that shows actual project cost line items .... ?

    Is there a standard RRB report that shows actual project cost line items, but that also shows amounts/quantities by rejected, billed, to be billed, and postponed?
    I'm looking for something that will allow project managers to reconcile with the RRB totals and be able to drill in and see a level of detail not available in RRB, to decide what should be billed, what should be deferred (postponed) and what should be rejected out of hand.
    Thanks.

    Thanks Virendra Pal, but the VAx5 reports are not really the answer - maybe a custom report will be required right enough.

  • Cost analysis not showing up in production order

    Hi All
    The preliminary cost analysis is not showing up in the production order... can anyone suggest possible reasons for this?

    Hi,
    Only once you issue goods (GI) to the order can you see actual costs in the cost analysis of the production order.
    If you are looking for target costs in the cost analysis, you have to post a goods receipt (GR) from the order to inventory.
    hope this helps you.
    Thanks,
    Rau
