Requirements for master data

Basic requirements for the master data template.
Which infotypes are to be included in an Indian implementation when collecting the master data template?
Please help.
Thanks in advance

Previous Employer Tax Details
(IT0580)     Infotype 0580 captures an employee's previous employment tax details:
Salary as per u/s 17(1), Value of perquisites u/s 17(2), Profit in lieu of salary u/s 17(3), Exemptions u/s 10, Professional Tax, Provident Fund, Income Tax deducted, Medical exemption, Leave Encashment exemption amount, Gratuity exemption amount, LTA exemptions.     Indian-specific infotypes for Payroll
House Rent Allowance
(IT0581)     This infotype captures the employee's housing details (HRA / CLA / COA), i.e. whether the employee lives in an own house, a rented house, company accommodation, etc.
Exemptions – Subtypes
LTA / MDA / CEA / CHA
(IT0582)     The LTA subtype of the Exemptions infotype (0582) stores the details of travel submitted as proof for Leave Travel Allowance (LTA) tax exemptions. The details in this subtype are used during the calculation of the LTA tax exemption. This infotype can be created at any point of time during a financial year; the infotype records are read for the entire financial year each time there is a regular payroll run.
The MDA subtype of the Exemptions infotype (0582) stores the details of the bills submitted as proofs for medical exemptions. This includes the details of proofs submitted for:
•     Medical reimbursements
•     Medical insurance
•     Medical reimbursement treated as insurance
The SCEA and SCHA subtypes of the Exemptions infotype (0582) store the details of expenditure submitted as proofs for exemptions on child education and hostel allowances, respectively.
Car & Conveyance
(IT0583)     This infotype stores, for an employee, the details of the:
•     Conveyance type
•     Car and driver for the different car schemes
Income from Other Sources (IT0584)     This infotype stores the details of income from sources other than employment, for the current financial year.
There are two subtypes to this infotype:
•     Subtype 0001 - Income from House Property
•     Subtype 0002 - Income from Other Sources
Section 80 Deductions
(IT0585)     This infotype keeps a record of the proposed and actual contributions made by an employee towards Section 80 subsections and divisions.
It captures the employee's investments with respect to the following subsections:
80CCC     Contribution to certain pension funds
80D     Medical insurance premium
80DD     Medical treatment/maintenance of handicapped dependents
80DDB     Medical treatment
80E     Repayment of loan for higher education
80G     Donations to certain funds and charitable institutions
80GG     Deduction in respect of rent paid
80GGA     Donations for scientific research/rural development
80L     Interest on certain securities/dividends
80RR     Professional income from foreign sources
80RRA     Remuneration received for services rendered abroad
80U     Permanent physical disability
Section 88 Deductions
(IT0586)     This infotype keeps a record of the proposed and actual investments made by an employee towards Section 88.
Provident Fund Contribution (IT0587)     This infotype stores the Provident Fund information of an employee.
This component helps you create and maintain information on employee Provident Fund. Provident Fund is a statutory contribution and consists of two parts:
1.     Provident Fund (PF) - Both the employee and the employer contribute a fixed percentage of the PF basis towards the Provident Fund. The minimum percentage contributed is as specified by the authorities.
2.     Pension Fund - The employer contributes a fixed percentage of the PF basis towards the Pension Fund of an employee. The minimum percentage contributed is as specified by the authorities.
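As a rough illustration of the two-part split described above, here is a minimal sketch. The 12% / 8.33% rates and the 15,000 pension wage ceiling are illustrative assumptions, not values from this thread; actual figures come from the authorities and your SAP configuration.

```python
# Hedged sketch: employee/employer PF split using illustrative statutory rates.
# The 12% and 8.33% figures and the 15,000 pension wage ceiling are assumptions
# for illustration only.

def pf_contributions(pf_basis, employee_rate=0.12, employer_rate=0.12,
                     pension_rate=0.0833, pension_ceiling=15000.0):
    """Split monthly contributions into employee PF, employer PF and pension fund."""
    employee_pf = employee_rate * pf_basis
    # Pension is computed on the PF basis capped at the ceiling
    pension = pension_rate * min(pf_basis, pension_ceiling)
    # The employer's remaining share goes to the PF account
    employer_pf = employer_rate * pf_basis - pension
    return employee_pf, employer_pf, pension
```

The key point the sketch makes is that the employer's single percentage is carved up between the Pension Fund and the PF account, while the employee's share goes to PF in full.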
Other Statutory Deductions (IT0588)     There are three subtypes to this infotype:
1. Subtype 0001 – ESI: Employees' State Insurance (ESI) is a benefit provided to any employee earning wages below a specified limit, covering:
•     Hospital facilities and out-patient facilities
•     Medicines provided by the hospital
•     Compensation in case of accident or death
•     Leave, subject to an ESI hospital's or doctor's certificate of ill-health
2. Subtype 0002 – LWF: Labour Welfare Fund (LWF) is a component of compensation provided to promote the welfare of labour in the state. Employees who are eligible for LWF contribute a statutory amount to the LWF fund.
3. Subtype 0003 – PTX: Professional Tax is a tax on employment deducted from the wages payable to all employees, with the exception of certain categories such as the disabled; the rules may vary across states as per the laws that govern them.
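The wage-limit rule for ESI above can be sketched as follows; the 21,000 limit is an assumed placeholder, since the actual limit is set by the ESI authorities and maintained in configuration.

```python
# Hedged sketch: ESI eligibility is wage-based. The 21,000 monthly wage limit
# is an assumption for illustration; the real limit comes from the authorities.

ESI_WAGE_LIMIT = 21000.0

def esi_eligible(monthly_wage):
    """An employee earning wages at or below the specified limit is covered by ESI."""
    return monthly_wage <= ESI_WAGE_LIMIT
```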
Nominations (IT0591)     This infotype stores the nomination details of employees for the following benefits:
•     Employee State Insurance
•     Gratuity
•     Maternity Act
•     Provident Fund
•     Pension

Similar Messages

  • Requirements for master data template

Hi,
Please let me know the basic requirements for the master data template. What are all the infotypes involved in collecting the master data template?
    Thanks and regards,
    Aylwin

Hi Aylwin,
If you are dealing with Indian payroll, then definitely yes - subject to your business process, again.
Even though it is Indian payroll, some items may not be applicable.
For example, income from other sources: if you go to manufacturing industries you may face a problem - even though employees have such income, they may not want to reveal it to the employer.
In such a case that infotype is not required as per your business process.
Some states, like UP, have no professional tax. Generally we say professional tax needs to be configured and you need that master data;
in such a case that might be wrong, so you need to review all the info.
Use the PA30 transaction code, go inside, check what the mandatory fields are, take them as master data fields, and put them in the template.
If your client is able to give other fields, you can very well add them - no harm in it.
But make sure you fill the mandatory fields.
That is the reason I was asking you to tell which country's payroll you are using.
    Best Regards

  • To Understand requirement for master data

    Hi Experts,
I am given a requirement to create master data and was asked to do the below:
    Extraction from r/3 for the new object ZCPAR
    Similar to ZNTORD (get key & text from 0COORDER)          
    Restricted to Order type Z040 only
    and
    Based on value of ZCPAR_ZCFAM and ZCPAR_ZCTYN                         
         update attribute ZCFAM_ZCTYN     
    How do we do that please guide .
    Thanks In Advance,
    Nitya.

    Hi Nitya,
    Just create a char infoobject ZCPAR with required Datatype & length. In the IO properties screen, you will find a tab Master data attribute/text. Just make sure that the check box "Attribute" is checked/ticked.
    Loading is same as loading any other BW DS. Just create a Infopackage & load to DS. create a Transformation from DS to MD IO ZCPAR & load the data using DTP.
    Similar to ZNTORD (get key & text from 0COORDER) Restricted to Order type Z040 only
Similar to the 1st point. Make sure both Attribute & Text are checked. You can restrict by Order type either in the Infopackage, or you can write a small field routine in the transformation like
IF order_type EQ 'Z040'.
RESULT = 'XYZ'.
ENDIF.
    Based on value of ZCPAR_ZCFAM and ZCPAR_ZCTYN update attribute ZCFAM_ZCTYN
    Simple field level routine for ZCFAM_ZCTYN.
    IF ZCPAR_ZCFAM EQ 'XYZ' and ZCPAR_ZCTYN EQ 'XYZ'.
    RESULT = 'XYZ'.
    ENDIF.
    Note : Replace 'XYZ' with suitable values
    Hope its clear & helpful!
    Regards,
    Pavan
    Edited by: PavanRaj_S on Dec 26, 2011 2:45 PM
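For readers following along, the field-routine logic Pavan sketches can be modeled in plain Python; ZCPAR_ZCFAM, ZCPAR_ZCTYN and ZCFAM_ZCTYN are placeholder object names taken from the post, not real BW objects.

```python
# Hedged model of the BW field-routine logic from the thread, in plain Python.
# Field names are placeholders from the post; 'XYZ' stands for suitable values,
# exactly as Pavan's note says.

def derive_zcfam_zctyn(source_row, family='XYZ', type_no='XYZ', result='XYZ'):
    """Return the derived attribute value only when both source fields match."""
    if (source_row.get('ZCPAR_ZCFAM') == family
            and source_row.get('ZCPAR_ZCTYN') == type_no):
        return result
    return None  # leave the attribute initial otherwise
```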

  • Hardware requirement for master collection cs5

hi guys, just a newbie here. I'm a web/flash game-animation hobbyist, and I thought I'd take things to the next level with Adobe's production software by taking in Master Collection CS5.
I got a few questions before buying this monster suite.
My machine is a Dell Studio 17 laptop: Core i5 2.93GHz max, 4GB RAM, 1GB video card, 64-bit Windows 7, 500GB 7200rpm HDD.
My machine has only a 1600x900 video display - MC CS5 recommends 1280x1024; what is the disadvantage if I don't use their recommended spec and stick to my display?
What is the "DVD-ROM drive compatible with dual-layer DVDs" for? Is it required for installation, or just a requirement in video production, say reading dual-layer format discs as reference or burning final projects in that format? My DVD drive is just a regular +/- single-layer RW drive. No Blu-ray at the moment.
Adobe-certified GPU card for GPU-accelerated performance in Adobe Premiere Pro - will a 1GB ATI Mobility Radeon™ HD 5650 graphics card do the job? This is all I have.
    Any help will do. thanks guys!!

I wanna ask: is it my display or something else? I'm using 1366x768.
Yes it is. The system requirements are there for a reason, not just for fun. These apps all have tons of palettes and require extra space for displaying timelines, compositions and video previews. Even if they could be installed with such a small screen resolution, working with them would be no fun at all, because things would get cut off or you would be forced to shuffle palettes around all the time. You really need a qualified computer.
    Mylenium

  • System Requirements for the Creative Suite 5.5 Master Collection

    I am going to purchase a Mac notebook and would like to know which one will accommodate my Creative Suite 5.5 Master Collection of programs I already own from Adobe.

You can find the system requirements for Master Collection CS5.5 at System requirements | Master Collection - http://helpx.adobe.com/x-productkb/policy-pricing/system-requirements-master-collection.html#main_CS5_5_Master_Collection_system_requirements.

What is the setting required to make Price Control an optional entry in the material master?

    Hi SAP EXPERT,
What is the setting required to make Price Control an optional entry in the material master?
    Regards
    Mahendra
    Edited by: MAHENDRA  NAVALE on Aug 27, 2011 2:23 PM

Hi
In general, in the material master record you can hide a field, display it, or make it optional, but certain fields should be mandatory for material creation - like base unit of measure, material description, price control, valuation price and valuation class; without these you cannot create a purchase order.
If you don't want the price control, go for the non-valuated material type NLAG, because the price control is mainly linked with material valuation. Based on the price control, the price difference will post either to the stock account or to the price difference account.
Go to > MM > Logistics General > Material Master > Field Selection > Assign Field Selection to Field Selection Groups (OMSR): check your field reference for Price Control (MBEW-VPRSV); the field reference group is 32. Then go to Maintain Field Reference for Data Screens (OMS9), enter your field selection group 32 (Price Control), choose MM01, and change the field from Required to Optional.
    Thanks
    Edited by: Nijamudeen**MM on Aug 29, 2011 10:19 AM

  • Master data required for implementation

    Hi all,
We are going for a university implementation. Could someone tell me what master data is required from the client for the implementation - like student data, learning content data, grading system.
My next question is: which third-party tool is good for library management and for class and exam scheduling?
    Thanks
    Ravi

    Hi Vinod,
    I need some information about the implementation of SLCM. I have found some standard documentation about the functions it offers, but they usually give an overview. I couldn't access any information on issues such as the technical prerequisites or how to implement it yet.
    As no such project has been implemented in my country, I'll keep on doing research online. If you have any documents that would be useful to implement or at least to understand the implementation process of SLCM, they would be of utmost value for me.
    Thanks in advance
    Aslı

  • Master Data Required for 2LIS_02_HDR Data Source Installation

    Hi All,
We are trying to install the 2LIS_02_HDR DataSource through Business Content for our purchasing implementation. Can you suggest what master data is required for this DataSource?
    Regards,
    Aditya

    Hi ,
    Please see this link.
    http://help.sap.com/saphelp_nw70/helpdata/en/ed/62073c44564d59e10000000a114084/frameset.htm
    In the same page you can find all details.
    Hope this helps.
    Thanks
    CK

  • What are the settings required for QM in procurement

    Hi Team,
What are the settings required for QM in procurement? I have set the indicator for QM in procurement in the QM view of the material master.
I am not clear about the following fields to be maintained in the QM view:
    QM Control Key
    Certificate type
    Target QM system
    Tech. delivery terms Indicator.
Please suggest in which cases these fields are to be used. Are they relevant to quality certificates?
    Thanks

    Hi,
The meanings are:
QM Control Key:
If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement.
Certificate type:
Certificate types apply to certificate processing in procurement and certificate creation.
Target QM system:
Whether the vendor's verified QM system, according to the vendor master record or quality info record (for a combination of vendor/material), meets the requirements for QM systems as specified in the material master.
-  If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement. If you want procurement control, then define the Control Key accordingly.
-  If you want the vendor's particular certificate for a material, then you have to define the Certificate type.
Also, you have to maintain the material/vendor info record at plant level.
    Thanks,
    JM

  • List of Manual Setup required for iSetup to work

    Hi All,
This is Mugunthan from iSetup development. Based on my interactions with customers and Oracle functional experts, I have documented a list of manual setups that are required for smooth loading of selection sets. I am sharing the same. Please let me know if anyone had to enter some other manual setup while using iSetup.
Understanding iSetup
iSetup is a tool to migrate and report on your configuration data. Various engineering teams from Oracle develop the APIs/programs which migrate the data across EBS instances, hence all your data is validated for all business cases and data consistency is guaranteed. It requires a good amount of functional setup knowledge and a bit of technical knowledge to use this tool.
    Prerequisite setup for Instance Mapping to work
    ·     ATG patch set level should be same across all EBS instances.
    ·     Copy DBC files of each other EBS instances participating in migration under $FND_SECURE directory (refer note below for details).
·     Edit sqlnet.ora to allow connections between DB instances (tcp.invited_nodes=(<source>,<central>)).
    ·     Make sure that same user name with iSetup responsibility exists in all EBS instances participating in migration.
    Note:- iSetup tool is capable of connecting to multiple EBS instances. To do so, it uses dbc file information available under $FND_SECURE directory. Let us consider three instances A, B & C, where A is central instance, B is source instance and C is target instances. After copying the dbc file on all nodes, $FND_SECURE directory would look like this on each machine.
    A => A.dbc, B.dbc, C.dbc
    B => A.dbc, B.dbc
    C => A.dbc, C.dbc
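The DBC layout above follows a simple rule - each instance holds its own DBC plus the central's, and the central holds all of them - which this small sketch (with the A/B/C instance names from the note) makes explicit:

```python
# Hedged sketch: which DBC files each instance's $FND_SECURE must hold, given
# the central/source/target roles described above (A = central, B = source,
# C = target). Instance names are illustrative.

def required_dbc_files(central, sources, targets):
    """Every instance keeps its own DBC plus the central's; central keeps all."""
    layout = {central: {f"{i}.dbc" for i in [central, *sources, *targets]}}
    for inst in [*sources, *targets]:
        layout[inst] = {f"{central}.dbc", f"{inst}.dbc"}
    return layout
```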
    Prerequisite for registering Interface and creating Custom Selection Set
    iSetup super role is mandatory to register and create custom selection set. It is not sufficient if you register API on central/source instance alone. You must register the API on all instances participating in migration/reporting.
    Understanding how to access/share extracts across instances
    Sharing iSetup artifacts
    ·     Only the exact same user can access extracts, transforms, or reports across different instances.
    ·     The “Download” capability offers a way to share extracts, transforms, and loads.
    Implications for Extract/Load Management
    ·     Option 1: Same owner across all instances
    ·     Option 2: Same owner in Dev, Test, UAT, etc – but not Production
    o     Extract/Load operations in non-Production instances
    o     Once thoroughly tested and ready to load into Production, download to desktop and upload into Production
    ·     Option 3: Download and upload into each instance
    Security Considerations
·     iSetup does not use SSH to connect between instances. It uses the Concurrent Manager framework to launch concurrent programs on the source and target instances.
    ·     iSetup does not write password to any files or tables.
    ·     It uses JDBC connectivity obtained through standard AOL security layer
    Common Incorrect Setups
    ·     Failure to complete/verify all of the steps in “Mapping instances”
    ·     DBC file should be copied again if EBS instance has been refreshed or autoconfig is run.
    ·     Custom interfaces should be registered in all EBS instances. Registering it on Central/Source is not sufficient.
·     A standard Concurrent Manager should be up for picking up iSetup concurrent requests.
    ·     iSetup financial and SCM modules are supported from 12.0.4 onwards.
    ·     iSetup is not certified on RAC. However, you may still work with iSetup if you could copy the DBC file on all nodes with the same name as it had been registered through Instance Mapping screen.
    Installed Languages
iSetup has a limitation where it cannot Load or Report if the number and type of installed languages and the DB charset differ between the Central, Source and Target instances. If that is your case, there is a workaround. Download the extract zip file to the desktop and unzip it. Edit AZ_Prevalidator_1.xml to match your target instance's languages and DB charset. Zip it back and upload it to the iSetup repository. Now you would be able to load to the target instance. You must ensure that this does not corrupt data in the DB. This is considered a customization, and any data issue coming out of this modification is not supported.
    Custom Applications
    Application data is the prerequisite for the most of the Application Object Library setups such as Menus, Responsibility, and Concurrent programs. iSetup does not migrate Custom Applications as of now. So, if you have created any custom application on source instance, please manually create them on the target instance before moving Application Object Library (AOL) data.
    General Foundation Selection Set
Setup objects in the General Foundation selection set support filtering, i.e. the ability to extract specific setups. Since most of the AOL setup data such as Menus, Responsibilities and Request Groups is shipped by Oracle itself, it does not make sense to migrate all of it, as it would already be available on the target instance. Hence, it is strongly recommended to extract only those setup objects which are edited or added by you. This improves performance. iSetup uses FNDLOAD (the seed data loader) to migrate most of the AOL setups. The default behavior of FNDLOAD is given below.
    Case 1 – Shipped by Oracle (Seed Data)
FNDLOAD checks the last_update_date and last_updated_by columns to decide whether to update a record. If a record is shipped by Oracle, its default owner is Oracle, and FNDLOAD skips records that are identical. So, it won't change the last_updated_by or last_update_date columns.
Case 2 – Shipped by Oracle and customized by you
If a record was customized in the source instance, then FNDLOAD updates the record based on the last_update_date column. If the last_update_date in the target is more recent, FNDLOAD does not update the record, so it won't change the last_updated_by column. Otherwise, it updates the record with the user who customized it in the source instance.
Case 3 – Created and maintained by customers
If a record was newly added or edited in the source instance by you, then FNDLOAD updates the record based on the last_update_date column. If the last_update_date of the record in the target is more recent, FNDLOAD does not update the record, so it won't change the last_updated_by column. Otherwise, it updates the record with the user who customized it in the source instance.
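The three cases can be condensed into one decision rule. This is a hedged model of the behavior described above, not FNDLOAD's actual implementation; records are simplified to dictionaries with the two columns the text names.

```python
# Hedged model of FNDLOAD's update rule from Cases 1-3 above.
from datetime import date

ORACLE = 'ORACLE'  # stand-in for the seed-data owner

def fndload_should_update(source_rec, target_rec):
    """Decide whether FNDLOAD overwrites the target record.

    Identical Oracle-owned seed records are skipped (Case 1); otherwise the
    record with the more recent last_update_date wins (Cases 2 and 3).
    """
    if (source_rec['last_updated_by'] == ORACLE
            and source_rec == target_rec):
        return False  # Case 1: identical seed data is skipped
    # Cases 2 and 3: a newer target record is preserved
    return source_rec['last_update_date'] > target_rec['last_update_date']
```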
    Profiles
    HR: Business Group => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    HR: Security Profile => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    MO: Operating Unit => Set the Operating Unit name for which you would like to extract data from source instance. After loading Operating Unit onto the target instance, make sure that this profile option is set if required.
    Navigation path to do the above setup:
    System Administrator -> Profile -> System.
    Query for the above profiles and set the values accordingly.
    Descriptive & Key Flex Fields
You must compile and freeze the flexfield values before extracting using iSetup.
Otherwise, it would result in partial migration of data. Please verify that all the data has been extracted by reporting on your extract before loading, to ensure data consistency.
You can load the KFF/DFF data to the target instance even when the structures in the source and target instances are different, but only in the below cases.
    Case 1:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1, Loc2, Loc3 (Mandate), Loc4, Loc5 and Loc6
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3, Loc4, then locations will be loaded to target instance without any issue. If you do not provide value for Loc3, then API will fail, as Loc3 is a mandatory field.
    Case 2:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1 (Mandate), Loc2
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4 and load data to target instance, API will fail as Loc3 and Loc4 are not there in target instance.
    It is always recommended that KFF/DFF structure should be same for both source as well as target instances.
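Cases 1 and 2 boil down to two checks against the target structure, sketched here with the hypothetical segment names (Loc1 … Loc6) used in the examples above:

```python
# Hedged sketch of the Case 1 / Case 2 behavior: loading flexfield segment
# values succeeds only if every target-mandatory segment has a value and no
# supplied segment is missing from the target structure. Segment names are
# the placeholder Loc1..Loc6 from the examples.

def can_load_segments(values, target_segments):
    """values: {segment: value}; target_segments: {segment: is_mandatory}."""
    for seg, mandatory in target_segments.items():
        if mandatory and values.get(seg) is None:
            return False  # Case 1 failure: mandatory target segment missing
    for seg in values:
        if seg not in target_segments:
            return False  # Case 2 failure: segment absent in target structure
    return True
```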
    Concurrent Programs and Request Groups
The Concurrent Program API migrates the program definition (definition + parameters + executable) only. It does not migrate the physical executable files under APPL_TOP; please use a custom solution to migrate executable files. Load Concurrent Programs prior to loading Request Groups. Otherwise, the associated concurrent program metadata will not be moved, even though the Request Group extract contains the associated Concurrent Program definition.
Locations - Geographies
If you have any custom Geographies, iSetup does not have an API to migrate this setup. Enter them manually before loading the Locations API.
Currency Types
iSetup does not have an API to migrate currency types. Enter them manually on the target instance after loading the Currency API:
GL Fiscal Super User --> Setup --> Currencies --> Rates --> Types
Associating Employee Details with a User
The extract process does not capture the employee details associated with users. So, after loading the employee data successfully on the target instance, you have to configure the associations again on the target instance.
    Accounting Setup
    Make sure that all Accounting Setups that you wish to migrate are in status “Complete”. In progress or not-completed Accounting Setups would not be migrated successfully.
Note: Currently iSetup does not migrate Sub-Ledger Accounting (SLA) methods. Oracle supports some default SLA methods such as Standard Accrual and Standard Cash; you may make use of these two. If you want to use your own SLA method, you need to create it manually on the target instances, because iSetup does not have an API to migrate SLA. If a Primary Ledger is associated with Secondary Ledgers using different Charts of Accounts, then mapping rules should be defined in the target instance manually. The mapping rule name should match the XML tag "SlCoaMappingName". After that you will be able to load the Accounting Setup to the target instance.
    Organization API - Product Foundation Selection Set
All Organizations defined in the HR module will be extracted by this API. This API will not extract Inventory Organizations or the Business Group. To migrate Inventory Organizations, you have to use the Inventory Organization API under the Discrete Mfg. and Distribution Selection Set. To extract the Business Group, you should use the Business Group API.
    Inventory Organization API - Discrete Mfg & Distribution Selection Set
    Inventory Organization API will extract Inventory Organization information only. You should use Inventory Parameters API to move parameters such as Accounting Information. Inventory Organization API Supports Update which means that you can update existing header level attributes of Inventory Organization on the target instance. Inventory Parameters API does not support update. To update Inventory Parameters, use Inventory Parameters Update API.
We have a known issue where the Inventory Organization API migrates non-process-enabled organizations only. If your inventory organization is process enabled, you can migrate it with a simple workaround. Download the extract zip file to the desktop and unzip it. Navigate to the Organization XML and edit the XML tag <ProcessEnabledFlag>Y</ProcessEnabledFlag> to <ProcessEnabledFlag>N</ProcessEnabledFlag>. Zip the extract back and upload it to the target instance. You can load the extract now. After successful completion of the load, you can manually enable the flag through the Forms UI. We are working on this issue and will update you once a patch is released to Metalink.
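The XML workaround above amounts to a one-line text substitution on the extracted Organization XML; a minimal sketch (file handling omitted, tag name taken from the note above):

```python
# Hedged sketch of the workaround: flip <ProcessEnabledFlag> from Y to N
# inside the extracted Organization XML before re-uploading. The tag name is
# taken from the post; integrating this with the actual extract zip is up to
# the reader.

def disable_process_flag(xml_text):
    """Return the XML with the process-enabled flag switched off."""
    return xml_text.replace('<ProcessEnabledFlag>Y</ProcessEnabledFlag>',
                            '<ProcessEnabledFlag>N</ProcessEnabledFlag>')
```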
    Freight Carriers API - Product Foundation Selection Set
The Freight Carriers API in the Product Foundation selection set requires Inventory Organization and Organization Parameters as prerequisite setup. These two APIs are available under the Discrete Mfg. and Distribution Selection Set. Also, the Freight Carriers API is available under the Discrete Mfg and Distribution Selection Set with the name Carriers, Methods, Carrier-ModeServ, Carrier-Org. So, use the Discrete Mfg selection set to load Freight Carriers. In the next rollup release, the Freight Carriers API will be removed from the Product Foundation Selection Set.
    Organization Structure Selection Set
    It is highly recommended to set filter and extract and load data related to one Business Group at a time. For example, setup objects such as Locations, Legal Entities,Operating Units,Organizations and Organization Structure Versions support filter by Business Group. So, set the filter for a specific Business Group and then extract and load the data to target instance.
    List of mandatory iSetup Fwk patches*
    8352532:R12.AZ.A - 1OFF:12.0.6: Ignore invalid Java identifier or Unicode identifier characters from the extracted data
    8424285:R12.AZ.A - 1OFF:12.0.6:Framework Support to validate records from details to master during load
    7608712:R12.AZ.A - 1OFF:12.0.4:ISETUP DOES NOT MIGRATE SYSTEM PROFILE VALUES
    List of mandatory API/functional patches*
    8441573:R12.FND.A - 1OFF:12.0.4: FNDLOAD DOWNLOAD COMMAND IS INSERTING EXTRA SPACE AFTER A NEWLINE CHARACTER
    7413966:R12.PER.A - MIGRATION ISSUES
    8445446:R12.GL.A - Consolidated Patch for iSetup Fixes
    7502698:R12.GL.A - Not able to Load Accounting Setup API Data to target instance.
    Appendix_
    How to read logs
    ·     Logs are very important to diagnose and troubleshoot iSetup issues. Logs contain both functional and technical errors.
    ·     To find the log, navigate to View Detail screens of Extracts/ Transforms/Loads/Standard/Comparison Reports and click on View Log button to view the log.
    ·     Generic Loader (FNDLOAD or Seed data loader) logs are not printed as a part of main log. To view actual log, you have to take the request_id specified in the concurrent log and search for the same in Forms Request Search Window in the instance where the request was launched.
    ·     Functional errors are mainly due to
    o     Missing prerequisite data – You did not load one more perquisite API before loading the current API. Example, trying to load “Accounting Setup” without loading “Chart of Accounts” would result in this kind of error.
    o     Business validation failure – Setup is incorrect as per business rule. Example, Start data cannot be greater than end date.
    o     API does not support Update Records – If the there is a matching record in the target instance and If the API does not support update, then you would get this kind of errors.
    o     You unselected Update Records while launching load - If the there is a matching record in the target instance and If you do not select Update Records, then you would get this kind of errors.
    Example – business validation failure
o     VONAME = Branches PLSQL; KEY = BANKNAME = 'AIBC'
    o     BRANCHNAME = 'AIBC'
    o     EXCEPTION = Please provide a unique combination of bank number, bank branch number, and country combination. The 020, 26042, KA combination already exists.
    Example – business validation failure
    o     Tokens: VONAME = Banks PLSQL
    o     BANKNAME = 'OLD_ROYAL BANK OF MY INDIA'
    o     EXCEPTION = End date cannot be earlier than the start date
    Example – missing prerequisite data.
    o     VONAME = Operating Unit; KEY = Name = 'CAN OU'
    o     Group Name = 'Setup Business Group'
    o     ; EXCEPTION = Message not found. Application: PER, Message Name: HR_ORG_SOB_NOT_FOUND (Set of books not found for ‘Setup Business Group’)
    Example – technical or fwk error
    o     OAException: System Error: Procedure at Step 40
    o     Cause: The procedure has created an error at Step 40.
    o     Action: Contact your system administrator quoting the procedure and Step 40.
    Example – technical or fwk error
    o     Number of installed languages on source and target does not match.
    Edited by: Mugunthan on Apr 24, 2009 2:45 PM

    Mugunthan
Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    please assist.
    regards
    girish

  • Purpose of instructor led training requirement for OCP?

    What is the purpose of requiring OCP candidates to take an instructor led course? It all seems a bit random / superfluous. Costly too if one does not have an employer willing to pay for the training.
    My Oracle Press "OCA/OCP Oracle Database 11g All-in-One Exam Guide" doesn't even hint that classes are a requirement for the OCP; the most said is: "The OCA qualification is based on two examinations; the OCP qualification requires passing a third examination." (pg. XXIX). An incomplete characterization, correct?
    And "To prepare for the OCA/OCP examinations, you can attend Oracle University instructor-led training courses, you can study Oracle University online learning material, or you can read this book." (pg. XXIX). Prepare for the exams, sure. But why would such a book omit the not insignificant detail that a perhaps expensive course is required to get the OCP? Seems discourteous to the reader.
    From http://education.oracle.com/pls/web_prod-plq-dad/db_pages.getpage?page_id=244#5 via http://education.oracle.com/pls/web_prod-plq-dad/db_pages.getpage?page_id=198&p_org_id=&lang=
    I say it seems random and superfluous because of the following list:
    "These foundation courses meet the Hands On Course Requirement for OCP-Level certification:
    * Oracle Database 11g: Introduction to SQL
    * Oracle Database 11g: Introduction to SQL Ed 1 LVC
    * Oracle Database 11g: SQL Fundamentals I
    * Oracle Database 11g: SQL Fundamentals l Ed 1 LVC
    * Oracle Database 11g: SQL and PL/SQL Fundamentals
    * Oracle Database 11g: SQL and PL/SQL Fundamentals Ed 1 LVC
    * Oracle Database 11g: Develop PL/SQL Program Units
    * Oracle Database 11g: Administration Workshop I
    * Oracle Database 11g: Administration Workshop I Ed 1.1 LVC
    * Oracle Database 11g: Administration Workshop II
    * Oracle Database 11g: Data Warehousing Fundamentals"
    A modest Intro to SQL course will count? But an OCA will already have that knowledge. Obviously, one should choose a course with higher-level content. But I'm not sure I understand the logic of the OCA-grade courses being eligible. And for those funding certification without financial assistance from an employer, the question this suggests is: what is the least expensive way to satisfy the "instructor led" course requirement? The cheapest price I see from Oracle is a 3-day class costing $1,800. It would also cost 3 days of vacation time, or unpaid leave, if an employer is not supportive of certification.

    Dana N wrote:
    Thanks for your thoughtful reply Hans.
    To expose the candidate to techniques related to the new features rather than traditional techniques that are commonplace in the industry.
    That makes sense to me. But many of us like to tinker with new features on our own.
    In my experience, many, many more people want the OCP without doing any work. Some will use pure memorization or outright cheating to get there.
    It's unfortunate that you and I get penalized by those.
    >
    And to ensure that the candidate has had at least some hands-on time, instead of only pure theory. While some people may get hands-on at work, many others do not have that opportunity, especially in the area of new features.
    Then it might make sense for OCP certification to allow submission of proof of work experience in place of the HOC, as is the case with other professional certifications.
    Yes, that would be nice.
    But difficult to verify. and you would be surprised how many organizations have very narrow definitions of the job. Experimenting not allowed in some places.
    It all seems a bit random / superfluous.
    Sad, from an instructor's and OCP's point of view, to hear you think of it that way.
    I don't mean to say there isn't anything new any of us couldn't pick up, say, even from an Intro to SQL course. There's a vastness to most technology topics and we can't master every facet. But the cost/benefit ratio may be terribly low for those who are already Oracle professionals with many years of on-the-job experience. The reality as well is that some instructors are duds--that's the case anywhere. And for my learning style, I prefer self-guided / on-my-own study.
    Costly too if one does not have an employer willing to pay for the training.
    No doubt.
    My Oracle Press "OCA/OCP Oracle Database 11g All-in-One Exam Guide" doesn't even hint that classes are a requirement for the OCP; the most said is: "The OCA qualification is based on two examinations; the OCP qualification requires passing a third examination." (pg. XXIX). An incomplete characterization, correct?
    A topic to discuss with the authors, and publisher. We (the user community with whom you are talking) have little to no control over this.
    Wonder if the authors participate in these forums. I realize you, as a user, have no control over this. But a book blessed by Oracle via Oracle Press, I respectfully submit, can and should do better--by including details about the HOC requirement.
    Some do. But much more effective would be to lodge a formal complaint or inquiry.
    "These foundation courses meet the Hands On Course Requirement for OCP-Level certification:
    ...All of which will ensure you have had time becoming acquainted with SQL*Plus, DB Control, and perhaps SQL Developer.
    I already use all 3 tools. Would love to take all the courses listed; but I'm not independently wealthy or otherwise.
    So would I. ;-)
    As well as logging in (you would be surprised how many OCA candidates do not know how), these all give you an opportunity to issue SQL and PL/SQL commands.
    Scary, but it seems beside the point. The HOC requirement is one solution to this problem, but it seems like a sub-optimal one. Again, this punishes Oracle professionals already working in the field who'd like to become certified. Why is there no regard for them? One way to correct this, if this is a genuine concern on Oracle's part, is to include a "live" aspect to the exams. Then those who fail it might be interested in taking HOCs as a remedy. In this day and age, I'm certain training could involve interactive SQL*Plus sessions, etc. Only a failure of imagination and will could prevent it, I suspect; particularly in a world of virtualization. I'd consider a redesign of exams before I'd require people to pay for classes they may be unable to afford. It's a bit of an insult to working Oracle professionals who want to add certification.
    I agree to some extent. Again, the issue is the consistency in experience. I have met people with 10 years of experience, and others with 10 times 1 year of experience.
    A modest Intro to SQL course will count? But an OCA will already have that knowledge.
    I wish this was consistently true.
    Then I misspoke. But the HOC still punishes existing Oracle professionals.
    I sympathize. I'm an independent, and all exams and certification costs come out of my own (corporate=me) pocket. But life in the professional fast lane is not
    cheap. This topic has been the source of some contention for quite some time.
    Unfortunately a number of people have been unscrupulous in their method of getting certification - cheating, using gunners, and so on - and I see this as one of the few ways to combat the issue. If the cheating issue is just left alone, the value of OCP (and any other certification) is effectively lost.
    The unscrupulous, I'm afraid, will always be with us. This problem is not solved, I'm also afraid, by requiring a costly HOC. It is in my opinion one potential solution of many, and not at all the best one. I wonder how other vendors with certifications handle this. I do know that some industry-generic certifications require proof of X years of work in the field (e.g. GISPs), as well as providing proof of various contributions to the field. This could be gamed as well, I suppose. But the honest should not be punished along with the dishonest.
    Basically I agree with the sentiment. Implementation is the issue.
    >
    Bottom line: in my view, a few alternate solutions to the HOC are:
    1) Allow submission of work credentials to substitute for the HOC
    2) Incorporate hands-on aspects to the exam, e.g. a SQL*PLUS simulator.
    I'm sure the brilliant minds at Oracle, or a suitable contractor, could easily accomplish #2. And #1 might be a true bar-raiser. It seems highly unlikely to me that, say, requiring an Intro to SQL HOC course raises the bar more than a few millimeters.
    There have been many discussions around this. Several of us have even considered non-vendor certifications as a route to accomplish this.
    It is an imperfect world. But hopefully the steps will correct some of the deficiencies.
    I am sure Paul and the rest of the Certification team would be happy to hear more feedback on how to implement practical verifiable real-world experience alternatives to the HOC that do not increase Oracle's costs.
    >
    Since both are unlikely to happen any time soon, what is the least expensive option for the HOC, and where do I find a list of courses and institutions that offer them? In my view, Oracle should provide this information as a courtesy. No reason not to unless there is a revenue motive; and the author of the blog posts you sent me assures us the addition of the HOC is not a revenue enhancement maneuver. I'd be more inclined to believe it if those words were backed up with helpful actions like the one I've proposed--since such a thing seems not to exist.
    All courses and course locations are available through http://education.oracle.com (and have been available for years), as well as the price. Oracle's policy does not allow for much price variation for education. The exception is when bulk purchases are negotiated, which is outside what you and I can talk with them about.

  • Planning of Dependent Requirements for BOM Components in Make to Order scen

    Hi Friends - I want to execute a full-cycle Make to Order scenario with materials planning using MRP.
    The scenario will include BOM Components with Routing.
    The process flow I want to execute -
    1. Create a Sales Order for a Finished Product (Product should have a BOM for Production)
    2. Do the product Materials Planning for the Sales Order through MD50
    3. Get all the dependent material requirements (both for products as Planned Order and also for BOM Components - all the BOM components should also be planned)
    4. Convert Planned Order for Product to Production Order
    5. Convert Planned Orders for BOM components to Purchase Requisition
    6. Confirm Production Order
    To execute this I have done the following
    1. I have created a Material Master data for Finished Product (FERT).
    2. I have assigned Item Category Group - NORM, and General Item Category group as NORM
    3. I have assigned MRP Type PD in MRP 1 View and Strategy Group 20 (Make to Order Production) in MRP 3 View.
    4. Now created a Universal (Type 3) BOM for the product with one material component.
    5. Now also created a Routing for the Material with two operations having PP01 as the control key.
    6. Now created a standard sales order for the product.
    7. Executed the MD50 for the Sales Order Item.
    Now where do I find the BOM components for this material in the Sales Order?
    After executing the planning Run MD50 I am only getting the requirements for Product only. I can't see any dependent requirements for the BOM components.
    How do I make sure that the BOM components are also planned while the product is planned?
    I have checked the component Stock Requirement List - there are no requirements generated from the Sales Order Product BOM.
    Pls help me on this. How to do this so that the raw material components are also planned?
    Thanks
    Purnendu
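    The dependent-requirements idea in steps 3 and 5 above can be sketched as a recursive BOM explosion. The following Python sketch uses an invented BOM purely for illustration; MD50 itself reads the production BOM maintained for the material, so material names and quantities here are assumptions, not SAP data:

    ```python
    # Hedged sketch: how MRP derives dependent requirements from a BOM.
    # The BOM data below is invented for illustration only.
    BOM = {
        "FERT-PRODUCT": [("COMP-A", 2), ("COMP-B", 1)],  # (component, qty per unit)
        "COMP-A": [("RAW-1", 3)],
    }

    def dependent_requirements(material, qty, reqs=None):
        """Recursively explode the BOM and accumulate component requirements."""
        if reqs is None:
            reqs = {}
        for comp, per_unit in BOM.get(material, []):
            need = qty * per_unit
            reqs[comp] = reqs.get(comp, 0) + need
            dependent_requirements(comp, need, reqs)  # explode lower levels too
        return reqs

    reqs = dependent_requirements("FERT-PRODUCT", 10)
    # For 10 units: COMP-A totals 20, COMP-B totals 10, RAW-1 totals 60 (via COMP-A)
    ```

    If the explosion finds no entry for the material (as in the "No material components were determined" message discussed below), no dependent requirements are generated at all, which is why a missing or unselected production BOM leaves the components unplanned.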

    Hi Dhaval, Sandeep, Ashok - Thanks a lot for your replies.
    I have checked the Planned Order Created through Planning Run.
    If I click on the Components Tab I am getting following message
    No material components were determined
    Message no. 61027
    Diagnosis
    Until now, no BOM explosion has been carried out or no BOM has been created for this material yet.
    Procedure
    Either explode the bill of material for this material, or
    create the components manually by processing the planned order in change mode.
    If I try to explode the BOM through Header Menu - Edit> Explode BOM
    I am getting the same message.
    If I click on the Bill of Material Tab - I am getting the same above messge.
    Ashok - I have tried both the Selection Method -
    Nil - Selection by Order Quant.
    1 - Selection by Explosion Date
    But with no effect.
    Waiting for your advice.
    Thanks and warm regards
    Purnendu

  • Batch Management Indicator Required - Material Master

    Hello,
    My requirement -
    Batch management field has to be required for material Type "FERT"
    1) I created a new field selection group for the field "batch management" (MARA-XCHPF)
    2) Created a new Field Reference group with "Required" entry (OMS9)
    3) Assigned the New Field Reference to the material Type in OMS2
    The batch management Field is now a mandatory field.
    The issue is, when we try to create only the Basic Data views of a FERT material, the system does not allow saving the material until the "Batch Management Indicator" is activated (even if we are not maintaining any plant-specific views of the material master).
    We need the batch management indicator to be a required entry for FERT, but at the same time we should be able to create the material master with the Basic Data view only (we would be extending the plant-specific views later).
    Is there an alternative method to achieve this?
    Thanks

    Hi
    I think you can check whether the batch level is defined at material or client level; if yes, check whether it can be switched to plant level. As far as I know, the basic data is at client level, so the field is shown on the Basic Data view and asks for input as you customized it.
    Alternatively, I think you can use the user exit MGA00001 to check the field instead of using field selection. This exit is called when the material master gets saved, so you can write your own source code to check whether the user filled the field, only for certain situations.
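    The conditional check Leon suggests for the save-time user exit can be sketched language-neutrally. Here it is in Python just to show the logic; real MGA00001 code would be ABAP reading MARA/MARC, and the function and field names below are invented for illustration:

    ```python
    # Hedged sketch of the conditional validation: demand the batch-management
    # flag only when plant-specific views are actually being maintained.
    # Names are illustrative, not real SAP structures.
    def validate_on_save(material_type, batch_managed, plant_views_maintained):
        """Return an error message if the save should be rejected, else None."""
        if material_type != "FERT":
            return None                  # rule applies only to FERT in this scenario
        if plant_views_maintained and not batch_managed:
            return "Batch management indicator is required for FERT at plant level"
        return None                      # a basic-data-only save passes

    # Basic data only: save allowed even without the indicator
    assert validate_on_save("FERT", False, False) is None
    # Plant views present without the flag: rejected
    assert validate_on_save("FERT", False, True) is not None
    ```

    The point of moving the rule into the exit is exactly this conditionality: field selection in OMS9/OMS2 applies unconditionally per material type, while the exit can inspect which views are being saved.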
    Best Regards.
    Leon.

  • Authorisation control for master data creation on the basis of eq. category

    Hi Experts,
    In one of my business scenarios, I want to control the authorisation of a particular person on the basis of equipment category. I want to create a role for that particular ID and assign that equipment category so that he cannot create equipment of any other category.
    How can I control this authorisation at object level? If there is any other way, please let me know so that I can try it out.
    With Regards
    VT

    Hi
    You can use SPRO > PM and CS > Tech objects > Equipment > Define field selection for eqmt master record. Choose the 2nd activity in the list, click 'influencing', and select Equipment category. Through this, for an equipment category, you can control fields as input, required, display, or hide. Maintain the field auth group as required for the equipment category you want to control.
    You can create an auth group in SPRO, same path as above, under technical objects, general data. Check with the GRC/Basis team to limit auth to the equipment category for this group. If not, you can do this through ABAP by validating the auth group for the equipment category. Assign this auth group to a role.
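    The check Hemanth describes boils down to a lookup: map each equipment category to an auth group, and allow creation only if the user's role carries that group. A minimal Python sketch of that logic, with invented category codes and auth group names (the real check would be an SAP authorization object validated in ABAP):

    ```python
    # Hedged sketch: equipment category -> required auth group mapping.
    # Codes and group names are invented for illustration.
    CATEGORY_TO_AUTH_GROUP = {"M": "AG_MACH", "P": "AG_PROD"}

    def may_create_equipment(user_auth_groups, equipment_category):
        """Allow creation only if the user's role holds the mapped auth group."""
        required = CATEGORY_TO_AUTH_GROUP.get(equipment_category)
        return required is not None and required in user_auth_groups

    assert may_create_equipment({"AG_MACH"}, "M")        # matching group: allowed
    assert not may_create_equipment({"AG_MACH"}, "P")    # wrong category: denied
    ```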
    Regards
    Hemanth

  • Transfer of Requirements for Return Orders

    Hi,
    Please I need  help on TOR management with Return Orders in SD.
    I've already read all the threads related to TOR and ATP but I couldn't find a solution. As we know, TOR is dependent on the following data: reqts type, reqts class, checking group and schedule line category.
    The reqts type and class are determined in the strategy group (material master - MRP3).
    Now if I switch on Transfer of Requirements for Returns, at schedule line level and reqts class level, the result is that I can see requirements related to returns in the availability check overview of sales orders (ATP count), but if I check the requirements list (i.e. transaction MD04), these requirements related to returns are not recorded. Why?
    And another point please: requirements generated from return orders are treated in the ATP count like normal sales orders (negative sign, reducing the ATP quantity), when it should be the contrary, because a goods receipt is expected.
    Can you help me please on these 2 points (perhaps something related with Sales Document Category)?
    Thank you very much
    Kind Regards
    AP

    I practically did a sample order / returns order to see your requirements and these are my observations
    a)
    If you create a standard sales order, then the availability check registers a minus sign with stock required in Rec./reqd qty field.
    Once you save the sales order, this is not seen again as the total quantity is reduced by sales order quantity.
    It was indicating a minus sign when you created a returns order because you checked the Transfer of requirements and availability check indicators in schedule line category.
    It was simulating a requirement as you checked the indicators in schedule line category.
    If you create returns order with standard DN (i.e. Transfer of requirements and availability check indicators are not checked), then you would not see any requirement for stock i.e. you don't see any new entry in availability overview screen.
    b)
    You are not seeing returned quantities in MD04 because of Movement types. This has nothing to do with Transfer of requirements.
    Schedule line DN uses movement type 651.
    Using movement type 651, you post returns from a customer with returns delivery to blocked stock returns.
    The blocked stock returns are neither valuated nor part of "unrestricted-use" stock.
    Hence you are not seeing the returned stock in your available stock.
    If you click the magnifying glass on total stock i.e. top line of your stock/requirements list (transaction MD04), you could see the returned quantity in 'Returns field'.
    If you want to post returns from a customer with returns delivery directly to the valuated stock, you could use the following movement types in schedule line category.
    653 Returns from customer to unrestricted-use stock
    655 Returns from customer to stock in quality inspection
    With these movement types, you could see returned quantities in stock/requirements list.
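    The movement-type behavior described above can be sketched as postings to different stock buckets. The quantities and bucket names in this Python sketch are invented for illustration; the substance is that 651 posts to blocked returns stock (not available, hence invisible in the available quantity), while 653 and 655 post to valuated stock:

    ```python
    # Hedged sketch: why the movement type decides whether a customer return
    # is visible as available stock. Quantities are illustrative only.
    def post_goods_receipt(stock, movement_type, qty):
        """Apply a returns goods receipt to the appropriate stock bucket."""
        if movement_type == "651":
            stock["blocked_returns"] += qty       # neither valuated nor available
        elif movement_type == "653":
            stock["unrestricted"] += qty          # visible as available stock
        elif movement_type == "655":
            stock["quality_inspection"] += qty    # valuated, in quality inspection
        return stock

    stock = {"unrestricted": 100, "quality_inspection": 0, "blocked_returns": 0}
    post_goods_receipt(stock, "651", 10)   # unrestricted stays 100: return "hidden"
    post_goods_receipt(stock, "653", 5)    # unrestricted rises to 105: return visible
    ```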
