HRPAD00INFTY - How to determine what action the user took (IN_UPDATE)

Greetings, Experts.
I've been asked to modify an existing BAdI for infotype 0167. The existing code is in method IN_UPDATE.  FYI, we have to use this method (as opposed to, say, AFTER_INPUT) since we have to make sure the user completed the operation as we're modifying data in another infotype based on their action.
As such, the logic depends on what the user is doing, i.e., inserting, updating, delimiting or deleting a record. I was testing different scenarios in PA30 to see what the value of IPSPAR-ACTIO would be and got some puzzling results.
For example, I used the Edit->Copy feature in PA30 to copy and then delimit an existing record and was expecting an ACTIO of "LIS9" (Delimit) but it was "MOD". Similarly, I used the trashcan to remove an existing record and got a "MOD" operation. Finally, I deleted another record the same way (trashcan icon) and this time it was "LIS1" which I found stupefying since that's not even a valid value according to the DDIC for data element ACTIO.
I can't seem to get a consistent way to determine what the user is doing. What is the best practice?
Thanks.

Hi Steve,
You can try the IPSYST-IOPER parameter, which shows the right values.
Regards,
Kishore.S
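
For illustration, a minimal ABAP sketch of the IN_UPDATE logic based on Kishore's suggestion is below. It assumes IPSYST is available as an importing parameter of the method and that IOPER carries operation codes such as INS, MOD, DEL and LIS9; since Steve saw runtime values that are not DDIC fixed values of ACTIO, verify the exact value set in the debugger for your release before relying on it.

METHOD if_ex_hrpad00infty~in_update.
  " Sketch only: branch on the operation reported in IPSYST-IOPER.
  " The value set below is an assumption to be verified in debug.
  CASE ipsyst-ioper.
    WHEN 'INS'.   " record inserted
      " ... maintain the dependent infotype for an insert ...
    WHEN 'MOD'.   " record changed (also observed for copy/delimit above)
      " ... handle a change ...
    WHEN 'LIS9'.  " record delimited
      " ... handle a delimit ...
    WHEN 'DEL'.   " record deleted
      " ... handle a deletion ...
    WHEN OTHERS.
      " Log unexpected operations rather than guessing.
  ENDCASE.
ENDMETHOD.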

Similar Messages

  • How to determine which cell the user has just left?

    I want to check the contents of a JTable cell just after the user has left that cell.
    What is the most reliable way to determine which cell the user has just left?

    Hi,
    I use the cellRenderer for that... if the value is not correct, I call an editCellAt() method...
    JRG

  • What is the user exit for automatic account determination in PO

    Hi Experts
I need to know what the user exit is for automatic account determination in PO.
My client needs the document type to define the Account Assignment Category: once a document type is chosen, the Account Assignment Category is set automatically, so you can't choose a wrong Account Assignment Category from the drop-down list.
Thanks for your help.

If this is the case, then why don't you use a transaction variant?
SHD0: http://www.mortenhjorthnielsen.dk/Security/transactionvariants.htm
If you have knowledge of transaction variants, you can use the SHD0 tcode to generate a variant transaction with the defaults and give that transaction to the users instead of ME21N, so they will not have the issue.
The exact solution for your problem is as follows:
Go to tcode SHD0 and enter the transaction variant.
In the T code field enter ME21N, then enter ZXXXX (or whatever name you want to keep) in the transaction variant field and hit the Create button.
This will take you to the ME21N transaction; here you just enter the data, making sure you enter the account assignment you want to default.
Once you finish, the system will save this as a variant.
Every time you hit Enter, a popup screen comes up; just hit Enter, and select whichever data you want to default in the popup by checking the boxes "W. content" and "Output only". For your case the field XXXXXXXX should be selected for "W. content" and "Output only".
Make sure you have value 1 in the contents field; if not, restart the variant again and reselect the invoice field on the ME21N screen.
Later exit and save (save in an object which you can transport).
Then from SHD0 select Goto -> Create variant transaction, create the tcode there, and use that tcode instead of ME21N (e.g. ZTRAN for a specific STO).

• HT201441 What if the user also forgot their Apple ID? How can this problem be solved? Can the iPhone still be used?

What if the user also forgot their Apple ID password? How can this problem be solved? Can the iPhone still be used?

    Start here:
    https://iforgot.apple.com

• What is BI? How do we implement it, and what is the cost to implement?

What is BI? How do we implement it, and what is the cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
Below is a description addressing your query:
Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making. This process is outlined below.
The five key stages of Business Intelligence:
1. Data Sourcing
2. Data Analysis
3. Situation Awareness
4. Risk Assessment
5. Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:
• probability theory - e.g. classification, clustering and Bayesian networks;
• statistical methods - e.g. regression;
• operations research - e.g. queuing and scheduling;
• artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.).  Situation awareness is the grasp of  the context in which to understand and make decisions.  Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
Business Intelligence is about using information wisely. It aims to warn you of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales or customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
• Data Sources
• Data Warehouses
• Data Marts
• Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
The university has a multitude of data sources residing in different Data Base Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, a physical inventory and a logical inventory were conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
The physical inventory comprised a review of DBMS cataloged tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against appropriate data quality business rules/standards which are required to perform the data, information and statistical modeling described previously.
Inbound data that does not meet data warehouse data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). While it is desirable for rejected records to be corrected in the operational system, if this is not possible then start dates for when the data can begin to be collected into the data warehouse may need to be adjusted in order to accommodate necessary source systems data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) process is to use Structured Query Language (SQL) views on host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
• The extraction of data from operational data stores
• The transformation of this data
• The loading of the extracted data into your data warehouse or data mart
When the data source is a "non-DBMS" data set it may be advantageous to pre-convert this into a SAS® data set to standardize data warehouse metadata definitions. Then it may be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the necessary metadata required to facilitate productive extraction, transformation and loading (ETL) work.
Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code. The option exists to "materialize" these views on the host systems or leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differentially depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations. The flexibility of choosing different deployment options based upon these factors is a considerable advantage.
    4
    SUGI 30 Focus Session
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
• Customer "visible" relational tables
• OLAP cubes
• Pre-determined parameterized and non-parameterized reports
• Ad-hoc reports
• Spreadsheet applications with pre-populated work sheets and pivot tables
• Data visualization graphics
• Dashboard/scorecards for performance indicator applications
Typically a business intelligence (BI) project may be scoped to deliver an agreed-upon set of data marts in a project. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built or used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specification (DDMS) and conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
Over time, as more and more data marts and logical data models (LDMs) are built, the conceptual data model (CDM) becomes more complete. One very important advantage to this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
McMaster's Data Warehouse design
[Diagram: development, test and production warehouse tiers fed by ETL from DB2 and Oracle operational sources, with data marts, a staging area (SAS data sets), and DB2/Oracle BI tool access for users.]
    Publication Services
This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio), will be accessed through this publication channel.
Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
• Access and view SAS® data sources
• Access and view any other data source that is available from a SAS® server
• Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
SECURITY - AUTHENTICATION AND AUTHORIZATION
The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
• Data identification
• Data classification
• Value of the data
• Identifying any data security vulnerabilities
• Identifying data protection measures and associated costs
• Selection of cost-effective security measures
• Evaluation of effectiveness of security measures
At McMaster access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific security policy (who you are). Authorization is the process of determining which permissions the user has for which resources (permissions). Authentication is also a prerequisite for authorization. At McMaster business intelligence (BI) services that are not public require a sign-on with a single university-wide login identifier which is currently authenticated using Microsoft Active Directory. After a successful authentication the university login identifier can be used by the SAS® Metadata Server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Through using SAS® Information Map Studio which is an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or SAS® Information Delivery Portal (ability to view only). Previously access to data residing in DB-2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful as they capture metadata about allowable usage and query generation rules. They also describe what can be done, are database independent and can cross databases and they hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages for building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform which will allow us to scale horizontally in the future:
Our development environment consists of a server with 2 x Intel Xeon 2.8 GHz processors and 2 GB of RAM, running Windows 2000 Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
A. Hardware
1. Server 1 - SAS® Data Server
- 4-way 64-bit 1.5 GHz Itanium 2 server
- 16 GB RAM
- 2 x 73 GB drives (RAID 1) for the OS
- 1 x 10/100/1Gb Cu Ethernet card
- 1 x Windows 2003 Enterprise Edition for Itanium
2. Mid-Tier (Web) Server
- 2-way 32-bit 3 GHz Xeon server
- 4 GB RAM
- 1 x 10/100/1Gb Cu Ethernet card
- 1 x Windows 2003 Enterprise Edition for x86
3. SAN Drive Array (modular and can grow with the warehouse)
- 6 x 72 GB drives (RAID 5), 360 GB total, for SAS® and data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
2. Mid-Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA Web Logic for future SAS® SPM Platform
    - Xythos Web File System (WFS)
3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by Department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
As well, we will begin the implementation of SAS® Strategic Performance Management Software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. This software can perform causal analyses among various measures within and across areas, providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and also possible detractors and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • What are the User Exits for Sales Order creation process?

    Hi,
What are the user exits for the sales order creation process? How can I find them?
    thanks in advance,
    will reward,
    Mindaugas

    Please check this info:
    User Exits In Sales Document Processing
    This IMG step describes additional installation-specific processing in sales document processing. In particular, the required INCLUDES and user exits are described.
    Involved program components
    System modifications for sales document processing affect different areas. Depending on the modification, you make the changes in the program components provided:
    MV45ATZZ
    For entering metadata for sales document processing. User-specific metadata must start with "ZZ".
    MV45AOZZ
    For entering additional installation-specific modules for sales document processing which are called up by the screen and run under PBO (Process Before Output) prior to output of the screen. The modules must start with "ZZ".
    MV45AIZZ
    For entering additional installation-specific modules for sales document processing. These are called up by the screen and run under PAI (Process After Input) after data input (for example, data validation). The modules must start with "ZZ".
    MV45AFZZ and MV45EFZ1
    For entering installation-specific FORM routines and for using user exits, which may be required and can be used if necessary. These program components are called up by the modules in MV45AOZZ or MV45AIZZ.
    User exits in the program MV45AFZZ
    The user exits which you can use for modifications in sales document processing are listed below.
    USEREXIT_DELETE_DOCUMENT
    This user exit can be used for deleting data which was stored in a separate table during sales document creation, for example, if the sales document is deleted.
    For example, if an additional table is filled with the name of the person in charge (ERNAM) during order entry, this data can also be deleted after the sales order has been deleted.
    The user exit is called up at the end of the FORM routine BELEG_LOESCHEN shortly before the routine BELEG_SICHERN.
    USEREXIT_FIELD_MODIFICATION
    This user exit can be used to modify the attributes of the screen fields.
    To do this, the screen fields are allocated to so-called modification groups 1 - 4 and can be edited together during a modification in ABAP. If a field has no field name, it cannot be allocated to a group.
    The usage of the field groups (modification group 1-4) is as follows:
    Modification group 1: Automatic modification with transaction MFAW
    Modification group 2: It contains 'LOO' for step loop fields
    Modification group 3: For modifications which depend on check tables or on other fixed information
    Modification group 4: is not used
    The FORM routine is called up for every field of a screen. If you require changes to be made, you must make them in this user exit.
    This FORM routine is called up by the module FELDAUSWAHL.
    See the Screen Painter manual for further information on structuring the interface.
    USEREXIT_MOVE_FIELD_TO_VBAK
    Use this user exit to assign values to new fields at sales document header level. It is described in the section "Transfer of the customer master fields into the sales document".
    The user exit is called up at the end of the FORM routine VBAK_FUELLEN.
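For orientation, a sketch of this exit follows; the appended field ZZKUNRG is hypothetical, and KUAGV-REGIO is assumed to be populated with the sold-to customer's region at this point.

FORM userexit_move_field_to_vbak.
  " Sketch: fill a hypothetical custom header field from the
  " sold-to customer's region (KUAGV is assumed to be filled here).
  vbak-zzkunrg = kuagv-regio.
ENDFORM.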
    USEREXIT_MOVE_FIELD_TO_VBAP
    Use this user exit to assign values to new fields at sales document item level. It is described in the section "Copy customer master fields into the sales document".
    The user exit is called up at the end of the FORM routine VBAP_FUELLEN.
    USEREXIT_MOVE_FIELD_TO_VBEP
    Use this user exit to assign values to new fields at the level of the sales document schedule lines.
    The user exit is called up at the end of the FORM routine VBEP_FUELLEN.
    USEREXIT_MOVE_FIELD_TO_VBKD
    Use this user exit to assign values to new fields for business data of the sales document. It is described in the section "Copy customer master fields into sales document".
    The user exit is called up at the end of the FORM routine VBKD_FUELLEN.
    USEREXIT_NUMBER_RANGE
    Use this user exit to define the number ranges for internal document number assignment depending on the required fields. For example, if you want to define the number range depending on the sales organization (VKORG) or on the selling company (VKBUR), use this user exit.
    The user exit is called up in the FORM routine BELEG_SICHERN.
    USEREXIT_PRICING_PREPARE_TKOMK
    Use this user exit if you want to include and assign a value to an additional header field in the communication structure KOMK taken as a basis for pricing.
    USEREXIT_PRICING_PREPARE_TKOMP
    Use this user exit if you want to include or assign a value to an additional item field in the communication structure KOMP taken as a basis for pricing.
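Both pricing exits follow the same move-to-communication-structure pattern; a sketch for the header case, with ZZKUNRG again a hypothetical field appended to both VBAK and KOMK:

FORM userexit_pricing_prepare_tkomk.
  " Sketch: pass a hypothetical custom header field to pricing.
  tkomk-zzkunrg = vbak-zzkunrg.
ENDFORM.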
    USEREXIT_READ_DOCUMENT
    You use this user exit if further additional tables are to be read when importing TA01 or TA02.
    The user exit is called up at the end of the FORM routine BELEG_LESEN.
    USEREXIT_SAVE_DOCUMENT
    Use this user exit to fill user-specific statistics update tables.
The user exit is called up by the FORM routine BELEG_SICHERN before the COMMIT command.
    Note
    If a standard field is changed, the field r185d-dataloss is set to X. The system queries this indicator at the beginning of the safety routine. This is why this indicator must also be set during the maintenance of user-specific tables that are also to be saved.
    USEREXIT_SAVE_DOCUMENT_PREPARE
    Use this user exit to make certain changes or checks immediately before saving a document. It is the last possibility for changing or checking a document before posting.
    The user exit is carried out at the beginning of the FORM routine BELEG_SICHERN.
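As a hedged example, a last-moment completeness check might look like this; the field ZZKUNRG and message class ZSD are invented for the sketch:

FORM userexit_save_document_prepare.
  " Sketch: refuse the save while a hypothetical custom field is empty.
  " An E message at this point stops the save in dialog processing.
  IF vbak-zzkunrg IS INITIAL.
    MESSAGE e001(zsd) WITH 'Fill the customer region before saving'.
  ENDIF.
ENDFORM.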
    User exits in the program MV45AFZA
    USEREXIT_MOVE_FIELD_TO_KOMKD
    Use this user exit to include or assign values to additional header fields in the communication structure KOMKD taken as a basis for the material determination. This is described in detail in the section "New fields for material determination".
    USEREXIT_MOVE_FIELD_TO_KOMPD
    Use this user exit to include or assign values to additional item fields in the communication structure KOMPD taken as a basis for the material determination. This is described in detail in the section "New fields for material determination".
    USEREXIT_MOVE_FIELD_TO_KOMKG
    Use this user exit to include or assign values to additional fields in the communication structure KOMKG taken as a basis for material determination and material listing. This is described in detail in the section "New fields for listing/exclusion".
    USEREXIT_MOVE_FIELD_TO_KOMPG
Use this user exit to include or assign values to additional fields in the communication structure KOMPG taken as a basis for material determination and material listing. This is described in detail in the section "New fields for listing/exclusion".
    USEREXIT_REFRESH_DOCUMENT
    With this user exit, you can reset certain customer-specific fields as soon as processing of a sales document is finished and before the following document is edited.
    For example, if the credit limit of the sold-to party is read during document processing, in each case it must be reset again before processing the next document so that the credit limit is not used for the sold-to party of the following document.
    The user exit is executed when a document is saved if you leave the processing of a document with F3 or F15.
    The user exit is called up at the end of the FORM routine BELEG_INITIALISIEREN.
    User-Exits in program MV45AFZB
USEREXIT_CHECK_XVBAP_FOR_DELET
In this user exit, you can enter additional criteria for deletion of an item. If the criteria are met, the item is not deleted (unlike in the standard system).
USEREXIT_CHECK_XVBEP_FOR_DELET
In this user exit, you can enter additional criteria for deletion of a schedule line. If the criteria are met, the schedule line is not deleted (unlike in the standard system).
    USEREXIT_CHECK_VBAK
    This user exit can be used to carry out additional checks (e.g. for completion) in the document header. The system could, for example, check whether certain shipping conditions are allowed for a particular customer group.
    USEREXIT_CHECK_VBAP
    This user exit can be used to carry out additional checks (e.g. for completion) at item level.
    USEREXIT_CHECK_VBKD
    The user exit can be used to carry out additional checks (e.g. for completion) on the business data in the order.
    USEREXIT_CHECK_VBEP
This user exit can be used to carry out additional checks (e.g. for completion) on the schedule line. During BOM explosion, for example, you may want certain fields to be copied from the main item to the sub-items (as for billing block in the standard system).
    USEREXIT_CHECK_VBSN
    You can use this user exit to carry out additional checks (e.g. for completion) on the serial number.
USEREXIT_CHECK_XVBSN_FOR_DELET
In this user exit, you can enter additional criteria for deletion of the serial number. If the criteria are met, the serial number is not deleted (unlike in the standard system).
    USEREXIT_FILL_VBAP_FROM_HVBAP
    You can use this user exit to fill additional fields in the sub-item with data from the main item.
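A minimal sketch, with ZZKUNRG again hypothetical; HVBAP is assumed to hold the higher-level (main) item at this point:

FORM userexit_fill_vbap_from_hvbap.
  " Sketch: inherit a hypothetical custom field from the main item.
  vbap-zzkunrg = hvbap-zzkunrg.
ENDFORM.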
    USEREXIT_MOVE_FIELD_TO_TVCOM_H
    You can use this user exit to influence text determination for header texts. For example, you can include new fields for text determination or fill fields that already exist with a new value.
    USEREXIT_MOVE_FIELD_TO_TVCOM_I
    You can use this user exit to influence text determination for item texts. For example, you can include new fields for text determination or fill fields that already exist with a new value.
    User-Exits for product allocation:
    The following user exits all apply to structure COBL, in which the data for account determination is copied to item level.
    USEREXIT_MOVE_FIELD_TO_COBL
    Option to include new fields in structure COBL.
    USEREXIT_COBL_RECEIVE_VBAK
    Option to assign values from the document header to the new fields.
    USEREXIT_COBL_RECEIVE_VBAP
    Option to supply values from the item to the new fields.
    USEREXIT_COBL_SEND_ITEM
    A changed field can be copied from the structure into the item. You could use the user exit to display a certain field in the account assignment block (see also MV45AFZB).
    USEREXIT_COBL_SEND_HEADER
A changed field can be copied from the structure to the header (see source text MV45AFZB).
    USEREXIT_SOURCE_DETERMINATION
    You can use this user exit to determine which plant will be used for the delivery. In the standard system, the delivering plant is copied from the customer master or the customer-material info record. If you want to use a different rule, then you must enter it in this user exit.
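A hedged sketch of such a rule; the customer number and plant are made up, and VBAP-WERKS is assumed to be the target field for the delivering plant:

FORM userexit_source_determination.
  " Sketch: override the delivering plant for one made-up customer.
  IF vbak-kunnr = '0000001234'.
    vbap-werks = 'Z001'.   " hypothetical plant
  ENDIF.
ENDFORM.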
    USEREXIT_MOVE_FIELD_TO_ME_REQ
With this user exit you can include additional fields for the following tables:
    EBAN (purchase requisition)
    EBKN (purchase requisition-account assignment)
    USEREXIT_GET_FIELD_FROM_SDCOM
    Option to include new fields for the variant configuration. Fields that are included in structure SDCOM can be processed and then returned to the order.
    USEREXIT_MOVE_WORKAREA_TO_SDWA
    You can use this user exit to format additional work areas for the variant configuration. You will find notes on the user exit in MV45AFZB.
    User-Exits for first data transfer:
    The following user exits can only be used for the first data transfer.
    Note
Only use the user exits if the fields do NOT have the same name.
    USEREXIT_MOVE_FIELD_TO_VBAKKOM
Option to include additional fields in structure VBAKKOM (communication fields for maintaining the sales document header)
    USEREXIT_MOVE_FIELD_TO_VBAPKOM
    Option to include additional fields in structure VBAPKOM (communication fields for maintaining a sales item)
    USEREXIT_MOVE_FIELD_TO_VBEPKOM
    Option to include additional fields in structure VBEPKOM (communication fields for maintaining a sales document schedule line)
    USEREXIT_MOVE_FIELD_TO_VBSN
    You can use this user exit to include fields in structure VBSN (scheduling agreement-related change status).
    USEREXIT_MOVE_FIELD_TO_KOMKH
    You can use this user exit to include new fields for batch determination (document header).
    USEREXIT_MOVE_FIELD_TO_KOMPH
    You can use this user exit to include new fields for batch determination (document item).
    USEREXIT_CUST_MATERIAL_READ
    You can use this user exit to set another customer number in the customer material info record (e.g. with a customer hierarchy)
    USEREXIT_NEW_PRICING_VBAP
    Option for entry of preconditions for carrying out pricing again (e.g. changes made to a certain item field could be used as the precondition for pricing to be carried out again). Further information in MV45AFZB.
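A sketch of the usual pattern, comparing the item against its old values; ZZKUNRG is hypothetical and *VBAP is assumed to hold the old item values at this point:

FORM userexit_new_pricing_vbap CHANGING new_pricing.
  " Sketch: reprice when a hypothetical custom item field changed.
  IF vbap-zzkunrg NE *vbap-zzkunrg.
    new_pricing = 'C'.   " pricing type 'C': carry out new pricing
  ENDIF.
ENDFORM.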
    USEREXIT_NEW_PRICING_VBKD
    Option for entry of preconditions for carrying out pricing again (e.g. changes to the customer group or price group could be set as the preconditions for the system to carry out pricing again). Further information in MV45AFZB.
    User-Exits in Program MV45AFZD
    USEREXIT_CONFIG_DATE_EXPLOSION
    The BOM is exploded in the order with the entry date. You can use this user exit to determine which data should be used to explode the BOM (explosion with required delivery date, for example).
    User exits in the program FV45EFZ1
    USEREXIT_CHANGE_SALES_ORDER
    In the standard SAP R/3 System, the quantity and confirmed date of the sales document schedule line is changed automatically if a purchase requisition is allocated, and it or the sales document is changed (for example, quantity, date).
    If you want to change this configuration in the standard system, you can define certain requirements in order to protect your sales orders from being changed automatically. Use this user exit for this purpose. Decide at this point whether the schedule lines are to be changed.
    User-Exits in Program RV45PFZA
    USEREXIT_SET_STATUS_VBUK
In this user exit you can store a specification for the reserve fields in VBUK (header status). Reserve field UVK01 could, for example, be used for an additional order status (as for rejection status, etc.); see the sketch after the list of work areas below.
    The following workareas are available for this user exit:
    VBUK (header status)
    FXVBUP (item status)
    FXVBUV (Incompletion)
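A minimal sketch of such a specification; the order reason and status value are invented:

FORM userexit_set_status_vbuk.
  " Sketch: derive a custom header status from a made-up condition.
  IF vbak-augru = 'Z01'.   " hypothetical order reason
    vbuk-uvk01 = 'A'.      " e.g. 'A' = not yet processed
  ENDIF.
ENDFORM.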
    USEREXIT_SET_STATUS_VBUP
In this user exit you can store a specification for the reserve fields for VBUP (item status).
    The following workareas are available for this user exit:
    FXVBAP (Item data)
    FXVBAPF (Dynamic part of order item flow)
    FXVBUV (Incompletion)
    USEREXIT_STATUS_VBUK_INVOICE
    You can use this user exit to influence billing status at header level.
    User exits in the screens
    Additional header data is on screen SAPMV45A 0309, additional item data on screen SAPMV45A 0459. These screens contain the Include screens SAPMV45A 8309 or SAPMV45A 8459 as user exits.
    Fields which are also to be included in the sales document for a specific installation should be included on the Include screens for maintaining. If an application-specific check module is needed for the fields, this can be included in the Include MV45AIZZ. The module is called up in the processing logic of the Include screens.
    For field transports, you do not have to make changes or adjustments.
    Example
    A new field, VBAK-ZZKUN, should be included in table VBAK.
    If the check is defined via the Dictionary (fixed values or check table) the field must be included with the fullscreen editor in the Include screen SAPMV45A 8309. In this case, no change has to be made to the processing logic.
    User Exits in Program MV45AFZ4
    USEREXIT_MOVE_FIELD_TO_KOMK
You can use this user exit to add or edit additional header fields in the communication structure KOMK for free goods determination. For more information, see the New Fields for Free Goods Determination IMG activity.
    USEREXIT_MOVE_FIELD_TO_KOMP
    You can use this user exit to add or edit additional item fields in the communication structure KOMP for free goods determination. For more information see the New Fields for Free Goods Determination IMG activity.
    User Exits in the SAPFV45PF0E and SAPFV45PF0C Programs
    EXIT_SAPFV45P_001
You can use this user exit to decide whether intercompany billing data is used in the profitability segment for cross-company code sales, or whether the data comes from external billing (external customer, sales data from the selling company code).
    Regards
    Eswar

  • How can i get all the users from weblogic server?

How can I get all the users from WebLogic Server?
I have configured an LDAP server using iPlanet, and in the WebLogic Server console I see those users from the LDAP server. But how can I get all the users in my program from WebLogic Server instead of the LDAP server?
BTW, how do I configure an RDBMSAuthenticator, and what should I do in Oracle? Which tables should I create, and what are their architectures?
Thanks
Daniel

BTW, I use WebLogic Platform 8.1.
    "Daniel" <[email protected]> дÈëÓʼþ
    news:[email protected]..
    how can i get all the users from weblogic server?
    i have configurated a LDAP server using iPlanet and
    in weblogic server console i see those users from LDAP
    server. but how can i get all the users in my program
    from weblogic server instead of LDAP server?
    BTW,how to configure a RDBMSAuthenticator and what should i do
    in Oracle? which tables should i create? and how are their architectures?
    Thanks
    Daniel

  • What are the user exits available related to Credit Management?

    What are the user exits available related to Credit Management?
I have blocked users from releasing sales orders through Tcode VKM3; now I want to block the same (releasing of SOs through VKM4), as I want VKM4 to be used only for releasing delivery orders. Can anyone help me in configuring this?
Thanks in advance,
Regards,
    Binayak

    Hi,
    User Exits For Credit Checks And Risk Management
    Credit Check
    If you want to carry out your own individual credit checks, that differ from those in the standard system, you must define them in the following user exits:
    LVKMPTZZ
    LVKMPFZ1: USER_CREDIT_CHECK1
    LVKMPFZ2: USER_CREDIT_CHECK2
    LVKMPFZ3: USER_CREDIT_CHECK3
    User exit for availability check
    User exit USEREXIT_AVAIL_CHECK_CREDIT exists in Include MV45AFZF.
    This user exit allows you to determine whether the system should or should not carry out an availability check after a blocked document has been released or after a new credit check.
    Risk Management for Receivables
    RVKMPUS2
    Using this user exit, you can define new forms of payment guarantee.
    Regards,
    Amrish Purohit

• How to find out which user exits have been activated for a T-Code

    Hi Everybody,
-->I want to know what user exits have been activated or used for a particular t-code.
--> I will give a small example to make my question clearer.
-->For my client, a lot of user exits and screen exits were used in the sales order screen; how do I find them? Is there any easy way to find out which user exits have been activated for a particular template?
    Regards,
    Madhan

If you mean how to see a user list with the T-Code they are using, then you can just use T-Code SM04. It will give all the information about those users.
Or, if you mean user exits, then please use this link:
http://www.sap-img.com/abap/a-short-tutorial-on-user-exits.htm
    Thanks,
    Sugauli
    Edited by: Sugauli on Dec 11, 2008 9:20 PM

  • How to determine ADF roles a user is in - before fully authenticated

    [JDev/ADF v11.1.1.5.0]
    I am trying to intercept a user's login to our ADF application (to log it to a database). I have written a custom login page and backing bean to handle the login using:
    mySubject = login(this._username, this._password);
    HttpServletRequest request = (HttpServletRequest)ctx.getExternalContext().getRequest();
    ServletAuthentication.runAs(mySubject, request);
    ServletAuthentication.generateNewSessionID(request);
// ??? determine what ADF 'Application Roles' the user has
    // log to database here
    // ... [code removed] ...
    HttpServletResponse response = (HttpServletResponse)ctx.getExternalContext().getResponse();
    RequestDispatcher dispatcher = request.getRequestDispatcher("/adfAuthentication");
    dispatcher.forward(request, response);
    What I need to do, however, is determine what roles a user has in the app, at the "???" point in the above code. If I interrogate the 'mySubject' object, it lists the groups from our authentication source that the user is a member of. In ADF Security, I've mapped these "Enterprise Roles" to "Application Roles", and need to get access to the Application Roles before redirecting them to the adfAuthentication servlet.
    I've tried using ADFContext.getCurrent().getSecurityContext().getUserRoles() where the '// ???' is, but it returns the 'anonymous' user (and associated roles). It appears that even though I've switched to runAs the authenticated user (via ServletAuthentication.runAs), ADF still thinks I'm running as the initial (anonymous) user.
    Is there a way to tell ADF to 'refresh' who it thinks I am now, so it will see me as the (now-authenticated) user, with their roles, etc.? Or, is there some other way to determine what (Application) roles a user has given their username?
    Thanks!
    Edited by: Karl C on Nov 27, 2012 12:28 PM

Just checked code.
Sorry, in our code we test enterprise roles (and not application roles) because we are using a ReadOnlySqlAuthenticator to retrieve DB users/roles.
Set<Principal> allPrincipals = mySubject.getPrincipals();
for (Principal principal : allPrincipals) {
    if (principal instanceof WLSGroupImpl) {
        roles.add(principal.getName());
    }
}
Dario

• What is the user and password for Oracle 8.1.7

    Dear expert,
I have loaded IDES 4.7 with Oracle 8.1.7.
    I wish to query the SAP table using Oracle PL/SQL.
What are the user, password and connection string to log in to the SAP Oracle database? Since I was not asked for a user and password when installing Oracle, I do not know how to log in.
Also, can you let me know how to view/change table fields in SAP tables? I understand there are different ways, but which in your opinion is the easiest way to find the record I want to correct (as some tables have many fields)?
    Your kind assistance is appreciated.

Thank you for your reply. Both the user and password do not work on my IDES 4.7e system.
However, I tried with user 'internal' and no password, and I successfully logged in.
I then created a new user and also changed the SYSTEM user's password.

• What are the user-defined functions

    hi all,
What are the user-defined functions?
    Regards,
    priya

    Hi Priya,
Generally we complete our mappings with the standard functions available in graphical mapping.
UDFs are used when a mapping cannot be completed with these standard functions; in that case we need to write a small piece of Java code to achieve the required functionality.
Steps to create a UDF
Activities
1. To create a new user-defined function, in the data-flow editor, choose Create New Function, which is located on the lower left-hand side of the screen. In the menu, choose Simple Function or Advanced Function.
2. In the window that appears, specify the attributes of the new function:
Name - technical name of the function. The name is displayed in the function chooser and on the data-flow object.
Description - description of how the function is used.
Cache - function type (see above).
Argument Count - in this table, you specify the number of input values the function can process, and name them. All functions are of type String.
3. In the window that appears, you can create Java source code:
a. You can import Java packages to your methods from the Imports input field, by specifying them separated by a comma or semi-colon. You do not need to import the packages java.lang.*, java.util.*, java.io.*, and java.lang.reflect.*, since all message mappings require these packages and therefore import them. You should be able to access standard JDK and J2EE packages of the SAP Web Application Server by simply specifying the package under Import; in other words, you do not have to import it as an archive into the Integration Repository. You can also access classes of the SAP XML Toolkit, the SAP Java Connector, and the SAP Logging Service (see also: Runtime Environment (Java Mappings)). In addition to the standard packages, you can also specify Java packages that you have imported as archives and that are located in the same, or in an underlying, software component version as the message mapping.
b. Create your Java source text in the editor window or copy source text from another editor.
4. Confirm with Save and Close.
5. User-defined functions are limited to the message mapping in which you created the function. To save the new function, save the message mapping.
6. To test the function, use the test environment.
The new function is now visible in the User-Defined function category. When you select this category, a corresponding button is displayed in the function chooser pushbutton bar. To edit, delete, or add the function to the data-flow editor, choose the arrow next to the button and select from the list box displayed.
Regards
    Seshagiri

• Have a situation on an unprotected wireless network where I cannot get a router address. I'm on system 10.6.8 on a summer 2009 MacBook Pro. Any thoughts on how to determine what's up?

I have a situation on an unprotected wireless network: when accessing it, I cannot get a router address. I'm on system 10.6.8 on a summer 2009 MacBook Pro. Any thoughts on how to determine what's up?

    You have basically verified that the issue is NOT with your MBP, but with this particular Wi-Fi network. Even though it may be unsecured, the provider may have elected to configure it for MAC filtering. In this case I am referring to the MAC (Media Access Control) address and not your Mac laptop itself. If this has been enabled, your Mac would only be able to connect if the Mac's MAC address was added to that filter.

  • How to determine what prevents auto sleep?

My iMac doesn't go to sleep automatically, so my question is how to determine what prevents sleep?

I, too, have just noticed this issue after I left my computer running Secure Empty Trash. I thought my computer would enter sleep mode automatically, but when I returned to my home office to turn in for the night, I found that the computer had been on (with the monitor off) the entire evening. So much for "Energy Saver."
I have been through many of the threads covering this issue. I checked to see if there were any apps using the processor. None. I checked "Put the hard drive(s) to sleep when possible." I made sure Internet Sharing was unchecked. I even unchecked Bluetooth Sharing and anything that would require some type of network or internet connection. Despite all this, the computer still does not go to sleep on its own.
I have just upgraded to Lion 10.7.3 on my iMac 2.4 GHz Intel Core 2 Duo, so this is definitely a continuing problem which has not been addressed since older threads dealing with early versions of Lion.
    Let's hope Mountain Lion will fix this.

  • How to determine what Index is required by a query ?

    Hi all,
In our application, we create some complex views with complex queries.
The question is: how do we determine what index is required by a query?
I tried using SQL Tuning Advisor, but it seems I can only use it after the query shows up in Top SQL.
Is there any way to determine the index even before the query is under heavy load / in the Top SQL list?
    Thank you very much,
    xtanto

You can run an explain plan to check the table access path. You can create an index and see if the optimizer is using the index, then compare the costs. Creating an index will help only on some occasions, and you need to provide complete details of the requirement to come to a conclusion about which type of index to use as well. On some occasions a B*Tree index is useful; in a few DW applications, a bitmap index will work efficiently on columns of low cardinality.
    So it is again the task for you to look into this.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/toc.htm
    Cheers
    Sarma.
