Is TREX required for CCM 2.0 search?

Hello,
Is TREX a mandatory component for CCM 2.0?
Regards,
Bejoy

Hi,
As already indicated, TREX is a required component of SAP CCM 2.0. Even in a scenario where the search functionality is not used (for example, only navigating the hierarchical structure and selecting items), TREX is still required for publishing and indexing the published data.
Regards,
Jason

Similar Messages

  • BPM - Requirement for CCMS Setup?

    Hi all,
    In the BPM setup guide, it says: "The Computing Center Management System or CCMS Alert Monitor (transaction /nrz20) provides the underlying monitoring infrastructure for Business Process Monitoring."
    Is there any further information on exactly what CCMS Setup is required at a minimum in the Solution Manager and in the Monitored system for BPM to function correctly?
    I read the technical prerequisites which are described in SAP notes 784752 - "BPMon in SAP Solution Manager - Prerequisites" and 521820 - "Availability of Business Process Monitoring". The first note does not reference CCMS, and the second one only makes reference to it if you want to use "CCMS Monitors" in BPM.
    Thanks for any guidance on this,
    Regards,
    John

    Hi JD
    To use CCMS monitoring in BPMon, first implement Note 1293668 as described in Note 521820. This note fixes the CCMS-monitoring-related problems.
    >Description of Note 521820
    >(d) CCMS Monitor
    >If you plan to link any CCMS monitors from the managed systems into
    >Business Process Monitoring, implement SAP Note 1293668.
    As for the procedure, you just need to copy and paste the MTE from RZ20 of the related system.
    For more detail, please refer to the document below:
    http://www.service.sap.com/bpm
    -> Technical Information
    -> -> Setup Guide - Interface Monitoring
    Page 78 describes the steps for the CCMS monitoring setup.
    (Originally this description covers one of the monitoring topics for the PI area:
    create an MTE for PI and, using the CCMS monitoring of BPMon, include that MTE in BPMon.
    But the page above describes exactly the steps you need for CCMS monitoring,
    so it applies to your case as well.)
    Best Regards
    Keiji Mishima

  • What is the harddisk requirement for SP2013 Application server + Search component?

    Dear all,
    In my SP2013 system, we don't have a separate search server, so all search components reside on the application server.
    According to: Hardware requirements—web servers, application servers, and single server installations
    http://technet.microsoft.com/en-us/library/cc262485.aspx#reqOtherCap
    I need 80GB for system drive
    According to: Hardware requirements for search topologies for the enterprise
    http://technet.microsoft.com/en-us/library/651dba4d-8751-4bd8-9492-f2842b2e1177(v=office.15)#HW_Enterprise
    Index component
    80 GB regardless of the number of search components hosted on the server.
    Analytics processing component
    80 GB regardless of the number of search components hosted on the server.
    Crawl component,Content processing component,Query processing component,Search administration component
    80 GB regardless of the number of search components hosted on the server
    Sorry, I am confused here. Do I need 160GB or 320GB? My current setup is 250GB. Is that risky? Thanks.
    Mark

    The answer depends on whether you are planning to run all the Search components on one server.  They can be distributed across multiple servers.  If you put them all on one server you need the following amounts
    80GB for the C: drive (system drive) for the OS and SharePoint binaries
    80GB for the Index component (you may need more than that depending on how much content is in SharePoint)
    80GB for the Analytics processing component
    80GB for the rest of the components
    That's a total of 320GB.  That's about right for a starting point for a Search Server that contains all the search components.  Moving one or more components to another server would decrease that total.  
    But you could probably get by with 250GB depending on how much content you are indexing. As your system grows, though, you will probably need more. I normally start with an 80GB C: drive and a 300GB D: drive.
    Paul Stork SharePoint Server MVP
    Principal Architect: Blue Chip Consulting Group
    Blog: http://dontpapanic.com/blog
    Twitter: Follow @pstork
    Please remember to mark your question as "answered" if this solves your problem.

  • Which Version of TREX is Required for CCM 2

    Hi,
    We are upgrading to Portal NW04s, SRM 5 and CCM 2. We are using TRex for searching the CCM catalog. What version of TREX is required to do so? (We currently are running version 6 and would like to know if it can stay that way or if we need to upgrade to TREX 7.0)
    Thanks,
    Bert


  • Is TREX required for the customer specific catalog views in SAP ERP E-com

    Hi gurus,
    I see there are few ramp-up sap notes for enabling the customer specific catalog views of Product catalog in SAP ERP E-commerce scenario.
    Has anyone who has already implemented them confirmed whether TREX is mandatory for catalog views?
    Specifically for XECOM 5.0 and ECC 6.0.
    Thank you

    See [Note 696095 - ISA R/3 4.0: Collective note on Catalog Views|https://service.sap.com/sap/support/notes/696095]
    See the first line in the Reasons and Prerequisites:
    Important: The catalog views functionality is only available from ISA 4.0 SP4 on, we recommend to use the latest SP. It is also only available with TREX as catalog engine
    This is true for recent versions too.
    The requirement stems from how the solution is implemented: VIEWS_ID is actually published to TREX for optimized extraction of the customer views.

  • Using wildcards in CCM 2.0 search

    Hi experts
    we're using SRM 5.0 (SRM 5.5 Server) with CCM 2.0.
    Is it possible to use wildcards in catalog search?
    Are there any documents about wildcards in CCM ?
    Regards.
    Sven

    Hi
    Yes, TREX is mandatory for CCM 2.0 (written in the master guide). Please restart the TREX server and retry if search does not work in your case.
    As far as I know, there are two wildcard characters:
    ? stands for a single character
    * stands for a sequence of characters of any combination and any length
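The wildcard semantics above can be illustrated with a small Python sketch (illustrative only, not CCM/TREX code): `?` maps to a single-character match and `*` to an arbitrary-length match.

```python
import re

def wildcard_to_regex(pattern: str) -> str:
    """Translate catalog-style wildcards to a regular expression:
    '?' matches exactly one character, '*' any sequence of characters."""
    parts = []
    for ch in pattern:
        if ch == "?":
            parts.append(".")      # single character
        elif ch == "*":
            parts.append(".*")     # any sequence, including empty
        else:
            parts.append(re.escape(ch))
    return "^" + "".join(parts) + "$"

def matches(pattern: str, text: str) -> bool:
    """True if text matches the wildcard pattern as a whole."""
    return re.match(wildcard_to_regex(pattern), text) is not None
```

For example, `matches("cat?log", "catalog")` and `matches("cat*", "catalogue")` both hold, while `matches("cat?log", "cataalog")` does not.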
    Related links ->
    CCM 2.0 Simple search on CSE does not show any result
    Re: Cross catalog search doesn't work for CCM 2.0
    Re: Search function for Catalogue CCM 1.0 SRM version 4
    CCM view rules - based on wild cards?
    Re: Is TREX required for CCM 2.0 search?
    Hope this answers your queries. Do let me know.
    Regards
    - Atul

  • TREX for CCM

    Dear Experts,
    We have implemented self service procurement and we are planning to implement CCM within the same client.
    As I understand that TREX is mandatory for CCM, I require some clarifications from you.
    1) Should we maintain a separate client for TREX, or can we use the same CCM client?
    2) How do we install the TREX component on our server, or does it already exist when we install SRM?
    3) If I go for a different client for TREX, then what settings do I need to maintain?
    Thanks for your help
    Thanks
    Ravi

    Hi Ravi,
    For CCM, yes, you are correct: TREX is mandatory.
    For your questions:
    1)     I assume you have installed TREX somewhere. Once it is installed and you have performed the TREX configuration, TREX will automatically create an RFC destination in SRM (if you are using CCM in the same client as SRM) or in the CCM client; this is how you connect to TREX from your CCM client.
    2)     You need to install TREX; it does not come with your CCM/SRM installation. It is a standalone NetWeaver component. You can find the guides here, under Standalone Engines -> SAP NetWeaver Search and Classification TREX:
    https://websmp106.sap-ag.de/installnw70
    You will find two guides, single host and multiple host. Download the Single Host guide; it covers the majority of installations, but every installation is different depending on your business requirements.
    Multiple Host guide
    https://websmp206.sap-ag.de/~sapidb/011000358700000854982007E/TREX71InstallMultipleHosts.pdf
    Single Host guide
    https://service.sap.com/~sapidb/011000358700000854952007E/TREX71InstallSingleHost.pdf
    3)     TREX is not installed in a client; it is a standalone engine, not an ABAP component. The only thing you need to do after you configure TREX is make sure that the RFC destination got created (during the TREX configuration).
    Best Regards,
    Juan Jose

  • Cross catalog search doesn't work for CCM 2.0

    Hi SRM gurus,
    We are using SRM 5.0 and CCM2.0.
    We created several CCM catalogs but cross catalog search doesn't work at all.
    In the call structure, option "cross catalog" is correctly customized.
    Are there additional checks I have to do in order to get this cross-catalog search working?
    It's quite urgent...
    Thanks a lot,
    regards,
    Caroline

    Hi
    The TREX server (service) must be started and be contactable by RFC; this can be checked by carrying out the actions listed above. (Refer to OSS Note 866547 - Error when accessing TREX server for more details.)
    <b>Please have a look at the following SAP OSS Note, which will help -></b>
    Note 851106 - Search in catalog from SRM leads to "Service not reachable"
    <u>Other related OSS Notes</u>
    Note 973594 Cross Catalog Search - Configuration
    Note 894717 Items from Cross Catalogs Result does not appears in step 2
    Note 803731 Cross-category search returns no result
    Note 847137 OCI, cross-catalog search: detail display
    Note 996885 Cross Catalog Search - Timeout while accessing MDM Catalog
    1023487 cross-catalog search in portal opens up a duplicate window
    1020025 Item detail display in Cross Catalog Search
    1027352 Item detail display in Cross Catalog Search
    Note 866547 - Error when accessing TREX server
    Note 988427 - Update to TREX 6.1 Rev 27
    Note 994623 - Hierarchy Buffer and BIA
    Note 1030056 - Improvement in the Search within Results feature of CSE
    Note 798988 CCM/CSE: Sorting sometimes returns no results
    Note 778688 TREX_INDEX_MANAGER unit test update_view(): incorrect search
    Note 808754 Display sequence of the characteristics is not changeable
    Note 794325 - Error in OCI transfer in the BAdI /CCM/OCI_SCALEPRI
    Note 745235 Search ability changes to cross-catalog characteristics
    Note 724097 - Search of the comp. in case of structured characteristics
    Note 743643 Search ability change in cross-catalog characteristics
    Note 847551 Displaying date, time, and timestamp in the CSE
    Note 750756 Program for the deletion/clean up of TREX indexes
    Do update me as well.
    Regards
    - Atul

  • TREX for CCM implementation

    Dear Experts,
    We have implemented self service procurement and we are planning to implement CCM within the same client.
    As I understand that TREX is mandatory for CCM, I require some clarifications from you.
    1) Should we maintain a separate client for TREX, or can we use the same CCM client?
    2) How do we install the TREX component on our server, or does it already exist when we install SRM?
    3) If I go for a different client for TREX, then what settings do I need to maintain?
    Thanks for your help
    Thanks
    Ravi

    Hi Masa,
    Thanks for your reply. I understand from you that we should go for the MDM catalog only, right?
    Approximately how long will it take us to implement the MDM catalog (we require only 6 internal catalogs)?
    Please let us know the approximate time.
    Thanks
    Ravi

  • Disk Space Requirement for TREX Installation

    Hi guys,
    I just installed the TREX server on one of our systems. It will be used for indexing and searching for our portal system, which runs on another server.
    While checking the TREX 7.0 installation guide, the hard disk capacity requirements caught my attention.
    On the server where we installed TREX, there is 30 GB of free space. (The index, TREX, and queue directories are on the same partition.)
    Our KM contains PDF and DOC documents with a total size of 25 GB. The guide says the index directory needs approximately half as much disk space as the documents, plus the same amount of disk space temporarily for optimization, and the queue directory needs approximately three quarters of the disk space required by the indexes.
    Calculation of disk space required:
    For the index directory:  permanent size = 25 / 2 = 12.5 GB
                              temporary size = 12.5 GB
                              total = 25 GB
    For the queue directory:  3 * 25 / 4 = 18.75 GB
    Total space (index + queue) = 43.75 GB
    So 30 GB is not enough for this volume of documents.
    Is this true?
    Any suggestions?
    Tolga
    Edited by: Tolga Akinci on Nov 25, 2008 9:14 AM
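The sizing rules quoted from the installation guide can be checked with a short calculation (a sketch; the one-half and three-quarters ratios are the guide's rules of thumb, not exact figures):

```python
def trex_disk_estimate(doc_size_gb: float) -> dict:
    """Estimate TREX disk needs from the total document size,
    following the installation guide's rules of thumb."""
    index_permanent = doc_size_gb / 2    # half the document size
    index_temporary = index_permanent    # same again, temporarily, for optimization
    index_total = index_permanent + index_temporary
    queue = 0.75 * index_total           # three quarters of the index space
    return {"index_gb": index_total,
            "queue_gb": queue,
            "total_gb": index_total + queue}

# 25 GB of documents -> 25 GB index + 18.75 GB queue = 43.75 GB total
estimate = trex_disk_estimate(25)
```

This reproduces the numbers in the post, confirming that a 30 GB partition is undersized for 25 GB of documents under these rules.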

    We were facing stability issues. Mapbox is a consulting solution and there is not much documentation about its internals; it runs as a Java application inside a J2EE engine and thus has all the implications that come with that.
    The behaviour of that version was nondeterministic: mails didn't get through, and there were many places to check why a certain mail was stuck.
    You can run Mapbox as part of J2EE on i5/OS, but I can't say anything about stability or issues; you'd better ask on the i5/OS forum here.
    Markus

  • Mark Search Results in Trex results for pdf

    Hello Colleagues,
    I perform a TREX search and click on the results.
    When the result is an HTML document, TREX highlights the occurrences of the searched word in the document. When the result is a PDF document, this does not work. Does somebody know how I can get this feature for a PDF document?
    Best regards,
    Patrick

    Hi Prashanth,
    There are different possibilities:
    On the one hand, the standard link is rendered as the displayname property with the modifier "contentlink", see http://help.sap.com/saphelp_nw04/helpdata/en/79/a1d23e6b2c3d67e10000000a114084/frameset.htm
    This could be modified, but that would be a hard task.
    I would suggest a dummy property (DisplayNameContentLink) which then could be rendered without the contentlink modifier and within the SearchResultList, only show this property. If the resource is a PDF, add the search expression (also some non-trivial task to access this!); otherwise render the "standard" content link.
    Hope it helps
    Detlev

  • Is SAF required for ICSS?

    Hi,
    I was wondering whether anyone knows if you need to install the Software Agent Framework (SAF) to use the Solution Search and FAQs in the Internet Customer Self Service (ICSS) scenario. We are using CRM 4.0 SIE and understand that if using the non-Java configuration of the IC WebClient we do not need SAF. However, is SAF still required for ICSS?
    Any ideas would be much appreciated.
    Regards
    Sulaman

    Hi Sulaman
    Look at
    <a href="http://help.sap.com/saphelp_crm50/helpdata/en/8a/336641df6c7f47e10000000a1550b0/frameset.htm">http://help.sap.com/saphelp_crm50/helpdata/en/8a/336641df6c7f47e10000000a1550b0/frameset.htm</a>
    The prerequisites say: "You have installed the TREX index server and the Software Agent Framework (SAF) components".
    Regards.
    Manuel

  • Does TREX keep a log of all search entries?

    I would like to see what my users have tried to search for.  Does TREX keep a log of all search entries?

    Hi Eric
    Take a look at note 937055, I believe the described approach meets your requirements.
    Also, search the forum for "search statistics" before posting; a search can often give you the solution right away. In this case, a search for "search statistics" yielded the thread "Need to report Search Statistics from TREX", which might also be interesting for your future needs.
    Kind regards,
    Martin

  • Table for temporary stock/requirement for tcode /afs/md04

    Dear experts,
    I developed a Z-report to display the STO number, production order number, operation, etc.
    It mainly uses the tables AFPO, AFRU, MSEG, MCHB, and J_3ABDSI.
    My problem is that when I compare with tcode /afs/md04, tab "temporary stock/requirement",
    the data shows properly for some materials (MATNR),
    while other materials are blank, with a message such as "Last MRP run on 04.04.2011".
    How can I filter out, in the Z-report, the materials that do not appear in the /afs/md04 "temporary stock/requirement" tab?
    My code is:
    SELECT j_3abdsi~aufnr j_3abdsi~matnr j_3abdsi~j_4krcat j_3abdsi~mbdat j_3abdsi~menge
      INTO TABLE it_eket FROM j_3abdsi
      FOR ALL ENTRIES IN it_final1
      WHERE j_3abdsi~j_4krcat = it_final1-j_4ksca AND
            j_3abdsi~matnr    = it_final1-matnr   AND
            j_3abdsi~werks    = it_final1-werks   AND
            j_3abdsi~bdart    = 'TB'              AND
            j_3abdsi~plart    = 'B'               AND
            j_3abdsi~bsart    = 'UB'.
    Please help.
    Rayhan
    Edited by: Abu Rayhan on Apr 5, 2011 10:24 AM

    CLEAR i_data1.
    REFRESH i_data1.
    LOOP AT i_mara.
      READ TABLE i_marc WITH KEY matnr = i_mara-matnr BINARY SEARCH.
      IF sy-subrc = 0.
        " Read the AFS stock/requirement situation for this material
        CALL FUNCTION 'J_3AM_DISPOSITION_DISPL'
          EXPORTING
            i_matnr           = i_mara-matnr
            i_werks           = p_werks
            i_todate          = todate
          IMPORTING
            ex_dbba           = i_data3
          EXCEPTIONS
            material_gesperrt = 1
            wbz_fehler        = 2
            material_prgr     = 3
            dispo_gesperrt    = 4
            OTHERS            = 5.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                  WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
          IF i_data3[] IS NOT INITIAL.
            LOOP AT i_data3 INTO i_data4.
              IF ( i_data4-j_3astat = 'A' OR i_data4-j_3astat = 'T' ) AND
                   i_data4-j_3abskz = 'C'.
                READ TABLE i_t001l WITH KEY lgort = i_data4-lgonr BINARY SEARCH.
                IF sy-subrc = 0.
                  " Accumulate quantities per material/size/storage location
                  CLEAR i_data1str.
                  i_data1str-matnr    = i_data4-matnr.
                  i_data1str-j_3asize = i_data4-j_3asize.
                  i_data1str-lgort    = i_data4-lgonr.
                  i_data1str-menge    = i_data4-menge.
                  COLLECT i_data1str INTO i_data1.
                ENDIF.
              ENDIF.
            ENDLOOP.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDLOOP.
    Questions:
    i_mara contains about 500 materials.
    The report takes more than 3 hours to finish.
    What do I need to change? Can you help me?
    Thanks.

  • List of Manual Setup required for iSetup to work

    Hi All,
    This is Mugunthan from iSetup development. Based on my interactions with customers and Oracle functional experts, I have documented a list of the manual setups that are required for smooth loading of selection sets. I am sharing it here. Please let me know if you have had to enter any manual setup while using iSetup.
    Understanding iSetup
    iSetup is a tool to migrate and report on your configuration data. Various engineering teams from Oracle develop the APIs/programs that migrate the data across EBS instances, so all your data is validated for all business cases and data consistency is guaranteed. The tool requires a good amount of functional setup knowledge and a bit of technical knowledge.
    Prerequisite setup for Instance Mapping to work
    ·     ATG patch set level should be same across all EBS instances.
    ·     Copy DBC files of each other EBS instances participating in migration under $FND_SECURE directory (refer note below for details).
    ·     Edit sqlnet.ora to allow connections between DB instances (tcp.invited_nodes=(<source>,<central>)).
    ·     Make sure that same user name with iSetup responsibility exists in all EBS instances participating in migration.
    Note:- iSetup tool is capable of connecting to multiple EBS instances. To do so, it uses dbc file information available under $FND_SECURE directory. Let us consider three instances A, B & C, where A is central instance, B is source instance and C is target instances. After copying the dbc file on all nodes, $FND_SECURE directory would look like this on each machine.
    A => A.dbc, B.dbc, C.dbc
    B => A.dbc, B.dbc
    C => A.dbc, C.dbc
    Prerequisite for registering Interface and creating Custom Selection Set
    iSetup super role is mandatory to register and create custom selection set. It is not sufficient if you register API on central/source instance alone. You must register the API on all instances participating in migration/reporting.
    Understanding how to access/share extracts across instances
    Sharing iSetup artifacts
    ·     Only the exact same user can access extracts, transforms, or reports across different instances.
    ·     The “Download” capability offers a way to share extracts, transforms, and loads.
    Implications for Extract/Load Management
    ·     Option 1: Same owner across all instances
    ·     Option 2: Same owner in Dev, Test, UAT, etc – but not Production
    o     Extract/Load operations in non-Production instances
    o     Once thoroughly tested and ready to load into Production, download to desktop and upload into Production
    ·     Option 3: Download and upload into each instance
    Security Considerations
    ·     iSetup does not use SSH to connect between instances. It uses the Concurrent Manager framework to launch concurrent programs on the source and target instances.
    ·     iSetup does not write password to any files or tables.
    ·     It uses JDBC connectivity obtained through standard AOL security layer
    Common Incorrect Setups
    ·     Failure to complete/verify all of the steps in “Mapping instances”
    ·     DBC file should be copied again if EBS instance has been refreshed or autoconfig is run.
    ·     Custom interfaces should be registered in all EBS instances. Registering it on Central/Source is not sufficient.
    ·     The standard Concurrent Manager should be up, for picking up iSetup concurrent requests.
    ·     iSetup financial and SCM modules are supported from 12.0.4 onwards.
    ·     iSetup is not certified on RAC. However, you may still work with iSetup if you could copy the DBC file on all nodes with the same name as it had been registered through Instance Mapping screen.
    Installed Languages
    iSetup cannot Load or Report if the number and type of installed languages and the DB charset differ between the Central, Source, and Target instances. If that is your case, there is a workaround: download the extract zip file to your desktop and unzip it, edit AZ_Prevalidator_1.xml to match your target instance's language and DB charset, zip it back, and upload it to the iSetup repository. You will then be able to load to the target instance. You must ensure that this does not corrupt data in the DB; this is considered a customization, and any data issue arising from this modification is not supported.
    Custom Applications
    Application data is the prerequisite for the most of the Application Object Library setups such as Menus, Responsibility, and Concurrent programs. iSetup does not migrate Custom Applications as of now. So, if you have created any custom application on source instance, please manually create them on the target instance before moving Application Object Library (AOL) data.
    General Foundation Selection Set
    Setup objects in General foundation selection set supports filtering i.e. ability to extract specific setups. Since most of the AOL setup data such as Menus, Responsibilities and Request Groups are shipped by Oracle itself, it does not make sense to migrate all of them to target instance since they would be available on target instance. Hence, it is strongly recommended to extract only those setup objects, which are edited/added, by you to target instance. This improves the performance. iSetup uses FNDLOAD (seed data loader) to migrate most of the AOL Setups. The default behavior of FNDLOAD is given below.
    Case 1 – Shipped by Oracle (Seed Data)
    FNDLOAD checks last_update_date and last_updated_by columns to update a record. If it is shipped by Oracle, the default owner of the record would be Oracle and it would skip these records, which are identical. So, it won’t change last_update_by or last_updated_date columns.
    Case 2 – Shipped by Oracle and customized by you
    If a record were customized in source instance, then it would update the record based on last_update_date column. If the last_update_date in the target were more recent, then FNDLOAD would not update the record. So, it won’t change last_update_by column. Otherwise, it would update the records with user who customized the records in source instance.
    Case 3 – Created and maintained by customers
    If a record were newly added/edited in source instance by you, then it would update the record based on last_update_date column. If the last_update_date of the record in the target were more recent, then FNDLOAD would not update the record. So, it won’t change last_update_by column. Otherwise, it would update the records with user who customized the records in source instance.
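The three FNDLOAD cases above reduce to a simple rule: identical Oracle-owned seed records are skipped, and otherwise the more recent last_update_date wins. A Python sketch of that decision logic (an illustration of the described behavior, not FNDLOAD's actual implementation):

```python
def fndload_should_update(record_owner: str,
                          source_last_update,
                          target_last_update,
                          identical: bool) -> bool:
    """Decide whether the loader overwrites a target record.

    Case 1: identical records shipped by Oracle are skipped.
    Cases 2 and 3: customized or customer-created records are updated
    only if the source copy is more recent than the target copy.
    """
    if record_owner == "ORACLE" and identical:
        return False  # Case 1: identical seed data, nothing to do
    return source_last_update > target_last_update  # Cases 2 and 3
```

So a record customized on the source and untouched on the target gets updated, while a record changed more recently on the target is left alone.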
    Profiles
    HR: Business Group => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    HR: Security Profile => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    MO: Operating Unit => Set the Operating Unit name for which you would like to extract data from source instance. After loading Operating Unit onto the target instance, make sure that this profile option is set if required.
    Navigation path to do the above setup:
    System Administrator -> Profile -> System.
    Query for the above profiles and set the values accordingly.
    Descriptive & Key Flex Fields
    You must compile and freeze the flex field values before extracting using iSetup.
    Otherwise, it would result in a partial migration of data. Please verify that all the data has been extracted by reporting on your extract before loading, to ensure data consistency.
    You can load the KFF/DFF data to the target instance even if the structures in the source and target instances differ, but only in the cases below.
    Case 1:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1, Loc2, Loc3 (Mandate), Loc4, Loc5 and Loc6
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3, Loc4, then locations will be loaded to target instance without any issue. If you do not provide value for Loc3, then API will fail, as Loc3 is a mandatory field.
    Case 2:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1 (Mandate), Loc2
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4 and load data to target instance, API will fail as Loc3 and Loc4 are not there in target instance.
    It is always recommended that the KFF/DFF structure be the same in both the source and target instances.
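The two cases above boil down to a compatibility check between the provided segment values and the target structure. A hypothetical Python sketch (the function and data shapes are illustrative, not an iSetup API):

```python
def can_load_flexfield(provided: dict, target_fields: dict) -> bool:
    """Check whether extracted segment values can load into a target
    flexfield structure.

    provided      -> {segment_name: value} extracted from the source
    target_fields -> {segment_name: is_mandatory} on the target
    """
    # Case 2 failure: a provided segment does not exist on the target.
    if any(name not in target_fields for name in provided):
        return False
    # Case 1 failure: a target-mandatory segment has no provided value.
    return all(name in provided
               for name, mandatory in target_fields.items() if mandatory)
```

For Case 1, values for Loc1..Loc4 load fine into a target with extra optional segments; for Case 2, providing Loc3 and Loc4 against a target that only has Loc1 and Loc2 fails.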
    Concurrent Programs and Request Groups
    The Concurrent Program API migrates the program definition only (definition + parameters + executable name). It does not migrate the physical executable files under APPL_TOP; please use a custom solution to migrate executable files. Load Concurrent Programs prior to loading Request Groups; otherwise, the associated concurrent program metadata will not be moved, even though the Request Group extract contains the associated Concurrent Program definition.
    Locations - Geographies
    If you have any custom Geographies, iSetup does not have any API to migrate this setup. Enter them manually before loading Locations API.
    Currencies Types
    iSetup does not have API to migrate Currency types. Enter them manually on target instance after loading Currency API.
    GL Fiscal Super User -> Setup -> Currencies -> Rates -> Types
    Associating Employee Details with a User
    The extract process does not capture employee details associated with users. So, after loading the employee data successfully on the target instance, you have to configure them again on target instance.
    Accounting Setup
    Make sure that all Accounting Setups that you wish to migrate are in status “Complete”. In progress or not-completed Accounting Setups would not be migrated successfully.
    Note: Currently iSetup does not migrate Sub-Ledger Accounting methods (SLA). Oracle supports some default SLA methods such as Standard Accrual and Standard Cash. You may make use of these two. If you want to use your own SLA method then you need to manually create it on target instances because iSetup does not have API to migrate SLA. If a Primary Ledger associated with Secondary Ledgers using different Chart of Accounts, then mapping rules should be defined in the target instance manually. Mapping rule name should match with XML tag “SlCoaMappingName”. After that you would be able to load Accounting Setup to target instance.
    Organization API - Product Foundation Selection Set
    All Organizations which are defined in HR module will be extracted by this API. This API will not extract Inventory Organization, Business Group. To migrate Inventory Organization, you have to use Inventory Organization API under Discrete Mfg. and Distribution Selection Set. To extract Business Group, you should use Business Group API.
    Inventory Organization API - Discrete Mfg & Distribution Selection Set
    Inventory Organization API will extract Inventory Organization information only. You should use Inventory Parameters API to move parameters such as Accounting Information. Inventory Organization API Supports Update which means that you can update existing header level attributes of Inventory Organization on the target instance. Inventory Parameters API does not support update. To update Inventory Parameters, use Inventory Parameters Update API.
    We have a known issue where the Inventory Organization API migrates non-process-enabled organizations only. If your inventory organization is process enabled, you can migrate it with a simple workaround. Download the extract zip file to your desktop and unzip it. Navigate to the Organization XML and edit the XML tag <ProcessEnabledFlag>Y</ProcessEnabledFlag> to <ProcessEnabledFlag>N</ProcessEnabledFlag>. Zip the extract back and upload it to the target instance. You can load the extract now. After successful completion of the load, you can manually enable the flag through the Forms UI. We are working on this issue and will update you once a patch is released to Metalink.
    Freight Carriers API - Product Foundation Selection Set
    The Freight Carriers API in the Product Foundation selection set requires Inventory Organization and Organization Parameters as prerequisite setups. These two APIs are available under the Discrete Mfg. and Distribution Selection Set. The Freight Carriers API is also available under the Discrete Mfg. and Distribution Selection Set, under the names Carriers, Methods, Carrier-ModeServ, and Carrier-Org. So, use the Discrete Mfg. selection set to load Freight Carriers. In the next rollup release, the Freight Carriers API will be removed from the Product Foundation Selection Set.
    Organization Structure Selection Set
    It is highly recommended to set a filter and extract and load data for one Business Group at a time. For example, setup objects such as Locations, Legal Entities, Operating Units, Organizations, and Organization Structure Versions support filtering by Business Group, so set the filter for a specific Business Group and then extract and load the data to the target instance.
    List of mandatory iSetup Fwk patches*
    8352532:R12.AZ.A - 1OFF:12.0.6: Ignore invalid Java identifier or Unicode identifier characters from the extracted data
    8424285:R12.AZ.A - 1OFF:12.0.6:Framework Support to validate records from details to master during load
    7608712:R12.AZ.A - 1OFF:12.0.4:ISETUP DOES NOT MIGRATE SYSTEM PROFILE VALUES
    List of mandatory API/functional patches*
    8441573:R12.FND.A - 1OFF:12.0.4: FNDLOAD DOWNLOAD COMMAND IS INSERTING EXTRA SPACE AFTER A NEWLINE CHARACTER
    7413966:R12.PER.A - MIGRATION ISSUES
    8445446:R12.GL.A - Consolidated Patch for iSetup Fixes
    7502698:R12.GL.A - Not able to Load Accounting Setup API Data to target instance.
    Appendix
    How to read logs
    ·     Logs are very important for diagnosing and troubleshooting iSetup issues. Logs contain both functional and technical errors.
    ·     To find a log, navigate to the View Detail screens of Extracts/Transforms/Loads/Standard/Comparison Reports and click the View Log button.
    ·     Generic Loader (FNDLOAD or Seed Data Loader) logs are not printed as part of the main log. To view the actual log, take the request_id specified in the concurrent log and search for it in the Forms Request Search window on the instance where the request was launched.
    ·     Functional errors are mainly due to:
    o     Missing prerequisite data – you did not load one or more prerequisite APIs before loading the current API. For example, trying to load “Accounting Setup” without first loading “Chart of Accounts” results in this kind of error.
    o     Business validation failure – the setup is incorrect per a business rule. For example, a start date cannot be later than its end date.
    o     API does not support Update Records – if there is a matching record in the target instance and the API does not support update, you will get this kind of error.
    o     Update Records was not selected when launching the load – if there is a matching record in the target instance and you did not select Update Records, you will get this kind of error.
    Example – business validation failure
    o     VONAME = Branches PLSQL; KEY = BANKNAME = 'AIBC'
    o     BRANCHNAME = 'AIBC'
    o     EXCEPTION = Please provide a unique combination of bank number, bank branch number, and country combination. The 020, 26042, KA combination already exists.
    Example – business validation failure
    o     Tokens: VONAME = Banks PLSQL
    o     BANKNAME = 'OLD_ROYAL BANK OF MY INDIA'
    o     EXCEPTION = End date cannot be earlier than the start date
    Example – missing prerequisite data.
    o     VONAME = Operating Unit; KEY = Name = 'CAN OU'
    o     Group Name = 'Setup Business Group'
    o     ; EXCEPTION = Message not found. Application: PER, Message Name: HR_ORG_SOB_NOT_FOUND (Set of books not found for ‘Setup Business Group’)
    Example – technical or fwk error
    o     OAException: System Error: Procedure at Step 40
    o     Cause: The procedure has created an error at Step 40.
    o     Action: Contact your system administrator quoting the procedure and Step 40.
    Example – technical or fwk error
    o     Number of installed languages on source and target does not match.
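    When a load produces many such errors, the functional error lines above follow a regular VONAME/KEY/EXCEPTION token format that can be scraped for triage. The helper below is hypothetical, not part of iSetup; it only assumes the token layout shown in the examples.

    ```python
    import re

    # Matches lines like:
    #   VONAME = Operating Unit; KEY = Name = 'CAN OU'; EXCEPTION = ...
    # DOTALL lets the KEY portion span wrapped lines, as in the examples above.
    LOG_PATTERN = re.compile(
        r"VONAME\s*=\s*(?P<voname>[^;]+?)\s*;.*?EXCEPTION\s*=\s*(?P<exception>.+)",
        re.DOTALL,
    )

    def parse_load_error(text):
        """Return the object name and exception message from one functional
        error entry, or None if the text does not match the token format."""
        m = LOG_PATTERN.search(text)
        if not m:
            return None
        return {
            "voname": m.group("voname").strip(),
            "exception": m.group("exception").strip(),
        }
    ```

    Running this over each error entry in a load log gives a quick per-object summary of what failed and why, which is usually enough to decide whether the cause is missing prerequisite data or a business validation failure.
    
    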
    Edited by: Mugunthan on Apr 24, 2009 2:45 PM

    Mugunthan,
    Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is:
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    Please assist.
    regards
    girish
