Is SAF required for ICSS?

Hi,
I was wondering whether anyone knows if you need to install the Software Agent Framework (SAF) if you want to use the Solution Search and FAQs in the Internet Customer Self-Service (ICSS) scenario? We are using CRM 4.0 SIE and understand that if we use the non-Java configuration of the IC WebClient we do not need SAF. However, is SAF still required for ICSS?
Any ideas would be much appreciated.
Regards
Sulaman

Hi Sulaman
Look at
http://help.sap.com/saphelp_crm50/helpdata/en/8a/336641df6c7f47e10000000a1550b0/frameset.htm
The prerequisites there state: "You have installed the TREX index server and the Software Agent Framework (SAF) components".
Regards.
Manuel

Similar Messages

  • Safe boot required for firmware upgrades??

    The two recent firmware upgrades, EFI and SMC, both required me to do a safe boot for the firmware to install. Is this normal behaviour? Trying to do the firmware install from a normal boot results in an installation failure.
    Possibly related, the "fixed" version of Parallels crashes the Mac during the install. If I try the same install during a safe boot, it will install. However, the system will panic during a normal boot until Parallels is uninstalled.
    Thoughts?? Advice??? Thanks!
    -Phil

    Safe Mode boot…
    http://docs.info.apple.com/article.html?artnum=107393
    You get updates by running Software Update (in System Preferences, or the second item in the Apple menu).

  • What are the thread safety requirements for container implementation?

    I rarely see references to thread-safety requirements in the TopLink documentation, and it's no different for container implementations.
    The default TopLink implementation for:
    - List is Vector
    - Set is HashSet
    - Collection is Vector
    - Map is HashMap
    Half of them (List/Collection) are thread-safe implementations and the other half (Set/Map) are not.
    So if I choose my own implementation, do I need a thread-safe implementation for:
    - List ?
    - Set ?
    - Collection ?
    - Map ?
    Our application always reads and writes via a UnitOfWork (UOW). So if TopLink synchronizes updates on client session objects, we should be safe with non-thread-safe implementations for any type; does TopLink synchronize updates on client session objects?
    The only thing we are certain of is that it is not thread safe to read a client session object, or a read-only UOW object, if they are ever expired or refreshed.
    We got the stack dump below in an application that always reads and writes objects via a UOW, so we believe that TopLink doesn't synchronize correctly when it updates the client session objects.
    java.util.ConcurrentModificationException
    at java.util.AbstractList$Itr.checkForComodification(AbstractList.java:449)
    at java.util.AbstractList$Itr.next(AbstractList.java:420)
    at oracle.toplink.internal.queryframework.InterfaceContainerPolicy.next(InterfaceContainerPolicy.java:149)
    at oracle.toplink.internal.queryframework.ContainerPolicy.next(ContainerPolicy.java:460)
    at oracle.toplink.internal.helper.WriteLockManager.traverseRelatedLocks(WriteLockManager.java:140)
    at oracle.toplink.internal.helper.WriteLockManager.acquireLockAndRelatedLocks(WriteLockManager.java:116)
    at oracle.toplink.internal.helper.WriteLockManager.checkAndLockObject(WriteLockManager.java:349)
    at oracle.toplink.internal.helper.WriteLockManager.traverseRelatedLocks(WriteLockManager.java:144)
    at oracle.toplink.internal.helper.WriteLockManager.acquireLockAndRelatedLocks(WriteLockManager.java:116)
    at oracle.toplink.internal.helper.WriteLockManager.checkAndLockObject(WriteLockManager.java:349)
    at oracle.toplink.internal.helper.WriteLockManager.traverseRelatedLocks(WriteLockManager.java:144)
    at oracle.toplink.internal.helper.WriteLockManager.acquireLockAndRelatedLocks(WriteLockManager.java:116)
    at oracle.toplink.internal.helper.WriteLockManager.acquireLocksForClone(WriteLockManager.java:56)
    at oracle.toplink.publicinterface.UnitOfWork.cloneAndRegisterObject(UnitOfWork.java:756)
    at oracle.toplink.publicinterface.UnitOfWork.cloneAndRegisterObject(UnitOfWork.java:714)
    at oracle.toplink.internal.sessions.UnitOfWorkIdentityMapAccessor.getAndCloneCacheKeyFromParent(UnitOfWorkIdentityMapAccessor.java:153)
    at oracle.toplink.internal.sessions.UnitOfWorkIdentityMapAccessor.getFromIdentityMap(UnitOfWorkIdentityMapAccessor.java:99)
    at oracle.toplink.internal.sessions.IdentityMapAccessor.getFromIdentityMap(IdentityMapAccessor.java:265)
    at oracle.toplink.publicinterface.UnitOfWork.registerExistingObject(UnitOfWork.java:3543)
    at oracle.toplink.publicinterface.UnitOfWork.registerExistingObject(UnitOfWork.java:3503)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.registerIndividualResult(ObjectLevelReadQuery.java:1812)
    at oracle.toplink.internal.descriptors.ObjectBuilder.buildWorkingCopyCloneNormally(ObjectBuilder.java:455)
    at oracle.toplink.internal.descriptors.ObjectBuilder.buildObjectInUnitOfWork(ObjectBuilder.java:419)
    at oracle.toplink.internal.descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:379)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.buildObject(ObjectLevelReadQuery.java:455)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.conformIndividualResult(ObjectLevelReadQuery.java:622)
    at oracle.toplink.queryframework.ReadObjectQuery.conformResult(ReadObjectQuery.java:339)
    at oracle.toplink.queryframework.ReadObjectQuery.registerResultInUnitOfWork(ReadObjectQuery.java:604)
    at oracle.toplink.queryframework.ReadObjectQuery.executeObjectLevelReadQuery(ReadObjectQuery.java:421)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.executeDatabaseQuery(ObjectLevelReadQuery.java:811)
    at oracle.toplink.queryframework.DatabaseQuery.execute(DatabaseQuery.java:620)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.execute(ObjectLevelReadQuery.java:779)
    at oracle.toplink.queryframework.ReadObjectQuery.execute(ReadObjectQuery.java:388)
    at oracle.toplink.queryframework.ObjectLevelReadQuery.executeInUnitOfWork(ObjectLevelReadQuery.java:836)
    at oracle.toplink.publicinterface.UnitOfWork.internalExecuteQuery(UnitOfWork.java:2604)
    at oracle.toplink.publicinterface.Session.executeQuery(Session.java:993)
    at oracle.toplink.publicinterface.Session.executeQuery(Session.java:950)
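    For illustration only, here is a minimal, self-contained Java sketch (not TopLink-specific; the class and variable names are invented for the example) showing how a ConcurrentModificationException like the one in the trace above arises when one thread iterates a non-thread-safe collection while another thread structurally modifies it:

    import java.util.ArrayList;
    import java.util.List;

    public class ComodificationDemo {
        public static void main(String[] args) throws InterruptedException {
            // A shared, non-thread-safe collection (the same kind of situation as an
            // unsynchronized HashSet/HashMap used as a mapping's container class).
            List<Integer> shared = new ArrayList<>();
            for (int i = 0; i < 1_000; i++) {
                shared.add(i);
            }

            // Writer thread: keeps structurally modifying the list while the main thread iterates it.
            Thread writer = new Thread(() -> {
                for (int i = 0; i < 1_000_000; i++) {
                    shared.add(i);
                }
            });
            writer.start();

            long sum = 0;
            try {
                // Reader: iterates without any synchronization. The collection's fail-fast
                // iterator usually detects the concurrent structural change and throws.
                while (writer.isAlive()) {
                    for (Integer value : shared) {
                        sum += value;
                    }
                }
                System.out.println("No conflict observed this run (timing dependent), sum = " + sum);
            } catch (RuntimeException e) {
                // Usually java.util.ConcurrentModificationException, as in the TopLink trace above.
                System.out.println("Caught: " + e);
            }
            writer.join();
        }
    }

    Whether TopLink itself must guard its client session cache during UnitOfWork registration is exactly the question asked above; the snippet only illustrates the fail-fast iterator mechanism that produces the exception seen in the trace.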

    Hi Lionel,
    As a general rule of thumb, the ATI Rage 128 Pro will not support a 20" LCD. That being said, there are reports of it doing just that (possibly the edition that went into the cube).
    I'm not that familiar with the ins and outs of the Cube, so I can't give you authoritative information on it.
    A good place to start looking for answers is:
    http://cubeowner.com/kbase_2/
    Cheers!
    Karl

  • What packages are required for an 11.2.0.3 install on Solaris 11?

    Where can I find information on which packages are required to install Oracle 11.2.0.3 on Solaris 11?
    The documentation only mentions Solaris 10. I tracked down the packages required for Solaris 10, and most of them are either included in the GA release of 11 or obsoleted.
    There are however a couple of packages that are missing from the base install of Solaris 11:
    SUNWarc (renamed to):
         * consolidation/osnet/osnet-incorporation (installed)
         * developer/library/lint (missing)
    SUNWhea (renamed to):
         * consolidation/osnet/osnet-incorporation (installed)
         * system/header (missing)
    So it looks like developer/library/lint and system/header are packages required for Solaris 10 that are missing on 11. The installer, however, doesn't check for any packages whatsoever.
    Does that mean it's safe to install the database?
    Edited by: 894946 on Nov 21, 2011 8:38 AM

    Here is list of packages you need in order to install Oracle 11gR2 on Solaris 11:
    SUNWarc SUNWbtool SUNWhea SUNWlibms SUNWpool SUNWpoolr SUNWsprot SUNWtoo SUNWlibm SUNWuiu8 SUNWfont-xorg-core SUNWfont-xorg-iso8859-1 SUNWmfrun SUNWxorg-client-programs SUNWxorg-clientlibs SUNWxwfsw SUNWxwplt
    Some of the above are installed by default, which you can check with:
    pkginfo -i SUNWarc SUNWbtool SUNWhea SUNWlibms SUNWpool SUNWpoolr SUNWsprot SUNWtoo SUNWlibm SUNWuiu8 SUNWfont-xorg-core SUNWfont-xorg-iso8859-1 SUNWmfrun SUNWxorg-client-programs SUNWxorg-clientlibs SUNWxwfsw SUNWxwplt
    I believe you will only have to install the following:
    pkg install compatibility/packages/SUNWxwplt SUNWarc SUNWhea SUNWxorg-client-programs SUNWxorg-clientlibs
    Cheers

  • What are the optimal hardware specification requirements for Adobe Photoshop CS6?

    Greetings to the staff and users of this forum,
    As the title clearly stipulates, the question is "What are the optimal hardware specification requirements for Adobe Photoshop CS6?".
    Unfortunately, I am not satisfied with the specs specified on the website, as I believe they are far lower than the specs needed to run the program at its full potential. This belief comes from my experience with Adobe Photoshop CS2, which I currently run on my computer. In effect, Ps CS2 cannot fully run without inducing time lags on my computer (i.e. when using a 500 px diameter smudge tool or the Liquify gallery), even though my specs are already higher than those specified for CS6 (mine are: Intel Core 2 Duo 6300 1.86 GHz, 3 GB RAM). I can assure you that I get those time lags even when almost in safe mode (only main system processes running) and that my computer is clean of any process/virus that could lower its performance.
    I've been through two chat rooms and four telephone operators to always get the same answer: the specs on the website.
    If any user here could provide me with proper higher specs, I would really be grateful.
    Thanks,
    phdwengr

    This document looks at different workflows and gives recommendations. It should give you an idea of what's needed.
    http://blogs.adobe.com/jnack/files/2012/07/CS6_hardware_recommendations.pdf

  • HT4007 The installed graphics card does not meet the minimum requirements for Aperture.

    The installed graphics card does not meet the minimum requirements for Aperture.  I have not used Aperture in a while and just got this error message.  I tried applying OS updates and reinstalling Aperture and still get the same message.  Does anybody have a solution or workaround?

    Are you using a non-standard graphics card?  Then this may be the reason.
    You may also get this error message if you accidentally booted into "Safe Boot Mode". Then certain extensions of the graphics card will be disabled and Aperture cannot be launched.
    Reboot and make sure that this time you boot into regular boot mode.
    See: Mac OS X: What is Safe Boot, Safe Mode?
    If you are indeed running the system in Safe Mode, the question is what made this happen. Do you use a wireless mouse and keyboard? There is a strange bug that makes Mac OS X boot into Safe Mode if you turn the wireless devices on while the system is booting. Make sure they are active before you boot.
    But if you did not boot into Safe Mode, you could try to boot into this mode to reset the graphics card and then boot again into regular mode.
    Merry Christmas!
    Léonie

  • Specific Power Requirements For iPods?

    (Yes I searched, no I didn't find this information.)
    I am looking for specific power requirements for multiple iPod models; 1st-Generation iPod touch, 5th-Generation 80GB Video, 3rd & 4th-Generation iPod nano, and 2nd-Generation Shuffle, to be precise.
    I am not looking for the obvious information found on the Apple site, such as charge times and USB vs. FireWire. I need more accurate data such as exact mAh and wattage input requirements for each iPod. I know I have chargers that will satisfy the needs of the nanos and 80GB but leave the 1st-Gen touch wanting and waiting for more. At the same time, I do not wish to overload the smaller iPod models like the shuffle.
    I have searched Apple.com and the Apple forums as well as Google, but I cannot find this information anywhere.
    Please help! I am trying to find suitable alternate power supplies such as solar, crank, and AA/AAA-battery backup sources and I want one that will work with all of my current iPods and hopefully future iPod purchases as well (to include many current iPods). Thank you!

    I don't have the info you need on the iPods.
    But I have an AC charger bought originally to use with another brand of mp3 player. After I got my iPod, I didn't see the point in buying another AC charger if it would work with iPod, since the company (Griffintechnology) also makes some for iPods. (Of course, I would have to switch out the USB cable to the one that works with iPod, but that's easy enough.) I couldn't find the info on their web site or on Apple's, so I emailed Griffin, and they told me it was safe to use with the iPod. So far, so good.
    So I would suggest emailing the maker of the chargers you already own, to check on compatibility. I know Griffin makes them, as does DLO. As far as AA/AAA rechargers, we recently got one from www.adrenalinetechnologies.com (haven't yet used it, though) that's supposed to work with multiple iPod models, and they have multiple adapters and such for different things.

  • Acsrvmui.cab required for this install is corrupt

    I uninstalled SharePoint 2013 on Server 2012 and restarted.  Now I am installing SharePoint 2013 again, StandAlone and within 30 seconds I am getting an error
    acsrvmui.cab required for this install is corrupt.
    I have downloaded this to my desktop and run it as administrator and I still get this error.

    I was attempting a clean install with all Pre-Reqs installed and have encountered this error.
    To be safe I extracted the ISO with WinRAR and tested - FAIL
    Mounted the ISO as a Drive - FAIL
    I ran this both with and without Administrative rights - FAIL
    This is a Windows 2012 R2 with the latest updates.
    Did you ever get around this or get a resolution from Microsoft?
    Steve

  • What's the minimum Clock Speed Requirement for GB3?

    What's the minimum clock speed requirement for GB3 (assuming one wishes to create and play multiple audio tracks within this current iLife app)? Is it 733 MHz?
    Here's what I was able to find on the iLife '06 requirements:
    To install iLife, you need:
    • A Macintosh computer with a PowerPC G4, G5, or Intel Core processor; 733 MHz or faster required for iDVD
    • 256 MB of RAM; 512 MB recommended
    • Mac OS X version 10.3.9 or Mac OS X version 10.4.3 or later; Mac OS X version 10.4.4 recommended
    • iTunes 6.0.2 and QuickTime 7.0.4 (included)
    • A DVD drive for installation
    • 10 GB of disk space to install iLife '06 applications

    Thanks Christoph, TIA, and AppleGuy!
    Christoph, what model are you using? You said you have even fewer MHz, but your model looks like the same as mine.
    TIA, I looked at the other thread (thanks for the link) any idea what other models are affected by the sound output problem?
    I probably will back up first just to be safe, and won't try running them both at the same time. I was hoping though to have full functionality of the Podcasting features plus all the old music ones without slowing the machine down.
    Tansef

  • Runtime requirements for binaries built using Studio 12.3?

    Is there any documentation of 'hard' runtime requirements in terms of Solaris kernel etc. versions?
    I've just installed Studio 12.3 on Solaris 10 u9 (x64) and both the toolchain and binaries built with it seem to work fine, despite the documented requirement for Sol10u10 for Studio itself.
    I can't find anything which talks about runtime requirements for binaries built using Studio.
    Are there lurking issues which I may run into in production if I try to deploy binaries built using 12.3 on a Solaris version before 10u10?
    Thanks

    The Release Notes for Studio 12.3
    http://docs.oracle.com/cd/E24457_01/html/E21986/ossrn.html#scrolltoc
    list the minimum OS version as "Solaris 10 10/08", which is S10u6, not S10u10. So you are safe using S10u9. (Sometimes documentation refers to the date of the update and sometimes to the update number, which I agree can be confusing.)
    The basic runtime compatibility rule has been that you can run on a newer OS version than where you built, not necessarily on an older OS version. The problem is that new OS versions introduce new interfaces, and a program might intentionally or unintentionally depend on an interface that does not exist in an older Solaris version. In that case, the application would not work on the older OS.
    Historically, updates to Solaris were only bug fixes. But Solaris 10 has been out so long that some updates introduced new features and new interfaces.
    A safe rule is to build on the oldest OS version that you intend to allow clients to use. For Solaris 10, it is safest to extend that rule to the oldest update that you intend to allow clients to use.

Table for temporary stock/requirement for Tcode /afs/md04

    Dear experts,
    I developed a Z-report to display the STO number, production order number, operation, etc.
    Mainly I use the AFPO, AFRU, MSEG, MCHB and J_3ABDSI tables here.
    My problem is, when I compare with Tcode /afs/md04, tab "temporary stock/requirement":
    for some MATNR the data shows properly,
    and some MATNR are blank, with the message "Last MRP run on 04.04.2011" or a similar date.
    How can I filter in the Z-report which MATNR are not in the Tcode /afs/md04 "temporary stock/requirement" tab?
    My code is:
    SELECT j_3abdsi~aufnr j_3abdsi~matnr j_3abdsi~j_4krcat j_3abdsi~mbdat j_3abdsi~menge INTO TABLE it_eket FROM j_3abdsi
        FOR ALL ENTRIES IN it_final1
        WHERE j_3abdsi~j_4krcat = it_final1-j_4ksca AND
              j_3abdsi~matnr    = it_final1-matnr   AND
              j_3abdsi~werks    = it_final1-werks   AND
              j_3abdsi~bdart    = 'TB'              AND
              j_3abdsi~plart    = 'B'               AND
              j_3abdsi~bsart    = 'UB'.
    Pls help .
    Rayhan
    Edited by: Abu Rayhan on Apr 5, 2011 10:24 AM

    CLEAR i_data1.
    REFRESH i_data1.
    LOOP AT i_mara.
      READ TABLE i_marc WITH KEY matnr = i_mara-matnr BINARY SEARCH.
      IF sy-subrc = 0.
        CALL FUNCTION 'J_3AM_DISPOSITION_DISPL'
          EXPORTING
            i_matnr           = i_mara-matnr
            i_werks           = p_werks
*           i_dialog          = ' '
*           i_sperr           = ' '
*           i_aufruf          = ' '
*           i_baner           = ' '
            i_todate          = todate
*           i_header_only     = ' '
          IMPORTING
            ex_dbba           = i_data3
*           e_mdkp            =
*           ex_pbbd           =
*           ex_meld           =
*           e_cm61m           =
          EXCEPTIONS
            material_gesperrt = 1
            wbz_fehler        = 2
            material_prgr     = 3
            dispo_gesperrt    = 4
            OTHERS            = 5.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
          IF i_data3[] IS NOT INITIAL.
            LOOP AT i_data3 INTO i_data4.
              IF ( i_data4-j_3astat = 'A' OR i_data4-j_3astat = 'T' ) AND i_data4-j_3abskz = 'C'.
                READ TABLE i_t001l WITH KEY lgort = i_data4-lgonr BINARY SEARCH.
                IF sy-subrc = 0.
                  CLEAR i_data1str.
                  i_data1str-matnr    = i_data4-matnr.
                  i_data1str-j_3asize = i_data4-j_3asize.
                  i_data1str-lgort    = i_data4-lgonr.
                  i_data1str-menge    = i_data4-menge.
                  COLLECT i_data1str INTO i_data1.
                ENDIF.
              ENDIF.
            ENDLOOP.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDLOOP.
    Questions:
    i_mara contains a recordset of 500 materials.
    It takes more than 3 hours to finish this report.
    What do I need to change?
    Can you help me?
    Thanks.

  • What are the settings required for QM in procurement

    Hi Team,
    What are the settings required for QM in procurement? I have set the indicator for QM in procurement in the QM view in the material master.
    I am not clear about the following fields to be maintained in the QM view:
    QM Control Key
    Certificate type
    Target QM system
    Tech. delivery terms Indicator.
    Please suggest in which cases these fields are to be used. Are they relevant to quality certificates?
    Thanks

    Hi,
    The meanings are:
    QM Control Key :
    If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement.
    Certificate type :
    Certificate types apply to certificate processing in procurement and certificate creation.
    Target QM system :
    Indicates whether the vendor's verified QM system, according to the vendor master record or quality info record (for a combination of vendor/material), meets the requirements for QM systems as specified in the material master.
    -  If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement. If you want procurement control, then define the control key accordingly.
    -  If you want a particular certificate from the vendor for the material, then you have to define the certificate type.
    Also, you have to maintain the material/vendor quality info record at plant level.
    Thanks,
    JM

  • List of Manual Setup required for iSetup to work

    Hi All,
    This is Mugunthan from iSetup development. Based on my interaction with customers and Oracle functional experts, I have documented the list of manual setups that are required for smooth loading of selection sets. I am sharing the same. Please let me know if anyone else had to enter some manual setup while using iSetup.
    Understanding iSetup
    iSetup is a tool to migrate and report on your configuration data. Various engineering teams from Oracle develop the APIs/programs which migrate the data across EBS instances, so all your data is validated for all business cases and data consistency is guaranteed. It requires a good amount of functional setup knowledge and a bit of technical knowledge to use this tool.
    Prerequisite setup for Instance Mapping to work
    ·     The ATG patch set level should be the same across all EBS instances.
    ·     Copy the DBC files of the other EBS instances participating in migration under the $FND_SECURE directory (refer to the note below for details).
    ·     Edit sqlnet.ora to allow connections between DB instances (tcp.invited_nodes=(<source>,<central>)).
    ·     Make sure that the same user name with the iSetup responsibility exists in all EBS instances participating in migration.
    Note: The iSetup tool is capable of connecting to multiple EBS instances. To do so, it uses the DBC file information available under the $FND_SECURE directory. Let us consider three instances A, B & C, where A is the central instance, B is the source instance and C is the target instance. After copying the DBC files on all nodes, the $FND_SECURE directory would look like this on each machine:
    A => A.dbc, B.dbc, C.dbc
    B => A.dbc, B.dbc
    C => A.dbc, C.dbc
    Prerequisite for registering Interface and creating Custom Selection Set
    iSetup super role is mandatory to register and create custom selection set. It is not sufficient if you register API on central/source instance alone. You must register the API on all instances participating in migration/reporting.
    Understanding how to access/share extracts across instances
    Sharing iSetup artifacts
    ·     Only the exact same user can access extracts, transforms, or reports across different instances.
    ·     The “Download” capability offers a way to share extracts, transforms, and loads.
    Implications for Extract/Load Management
    ·     Option 1: Same owner across all instances
    ·     Option 2: Same owner in Dev, Test, UAT, etc – but not Production
    o     Extract/Load operations in non-Production instances
    o     Once thoroughly tested and ready to load into Production, download to desktop and upload into Production
    ·     Option 3: Download and upload into each instance
    Security Considerations
    ·     iSetup does not use SSH to connect between instances. It uses the Concurrent Manager framework to launch concurrent programs on the source and target instances.
    ·     iSetup does not write password to any files or tables.
    ·     It uses JDBC connectivity obtained through standard AOL security layer
    Common Incorrect Setups
    ·     Failure to complete/verify all of the steps in “Mapping instances”
    ·     DBC file should be copied again if EBS instance has been refreshed or autoconfig is run.
    ·     Custom interfaces should be registered in all EBS instances. Registering it on Central/Source is not sufficient.
    ·     The standard Concurrent Manager should be up for picking up iSetup concurrent requests.
    ·     iSetup financial and SCM modules are supported from 12.0.4 onwards.
    ·     iSetup is not certified on RAC. However, you may still work with iSetup if you could copy the DBC file on all nodes with the same name as it had been registered through Instance Mapping screen.
    Installed Languages
    iSetup has limitations where it cannot Load or Report if the number and type of installed languages and DB Charset are different between Central, Source and Target instances. If your case is so, there is a workaround. Download the extract zip file to desktop and unzip it. Edit AZ_Prevalidator_1.xml to match your target instance language and DB Charset. Zip it back and upload to iSetup repository. Now, you would be able to load to target instance. You must ensure that this would not corrupt data in DB. This is considered as customization and any data issue coming out this modification is not supported.
    Custom Applications
    Application data is the prerequisite for the most of the Application Object Library setups such as Menus, Responsibility, and Concurrent programs. iSetup does not migrate Custom Applications as of now. So, if you have created any custom application on source instance, please manually create them on the target instance before moving Application Object Library (AOL) data.
    General Foundation Selection Set
    Setup objects in General foundation selection set supports filtering i.e. ability to extract specific setups. Since most of the AOL setup data such as Menus, Responsibilities and Request Groups are shipped by Oracle itself, it does not make sense to migrate all of them to target instance since they would be available on target instance. Hence, it is strongly recommended to extract only those setup objects, which are edited/added, by you to target instance. This improves the performance. iSetup uses FNDLOAD (seed data loader) to migrate most of the AOL Setups. The default behavior of FNDLOAD is given below.
    Case 1 – Shipped by Oracle (Seed Data)
    FNDLOAD checks the last_update_date and last_updated_by columns to decide whether to update a record. If the record is shipped by Oracle, the default owner of the record would be Oracle and FNDLOAD would skip those records that are identical. So, it won't change the last_updated_by or last_update_date columns.
    Case 2 – Shipped by Oracle and customized by you
    If a record was customized in the source instance, then it would update the record based on the last_update_date column. If the last_update_date in the target were more recent, then FNDLOAD would not update the record, so it won't change the last_updated_by column. Otherwise, it would update the record with the user who customized it in the source instance.
    Case 3 – Created and maintained by customers
    If a record was newly added/edited in the source instance by you, then it would update the record based on the last_update_date column. If the last_update_date of the record in the target were more recent, then FNDLOAD would not update the record, so it won't change the last_updated_by column. Otherwise, it would update the record with the user who customized it in the source instance.
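    To make the rule in Cases 2 and 3 concrete, here is a minimal Java sketch of the decision logic as described above. This is only an illustration of the stated rule, not FNDLOAD's actual implementation; the record type, field names, and owner value are invented for the example.

    import java.time.Instant;

    public class SeedDataMergeRule {
        // Hypothetical record holding the two audit columns the rule above compares.
        record SeedRecord(String lastUpdatedBy, Instant lastUpdateDate) {}

        /** Returns the record that should end up in the target, per the rule described above. */
        static SeedRecord merge(SeedRecord source, SeedRecord target) {
            // Case 1: identical Oracle-seeded records are skipped (audit columns untouched).
            if ("ORACLE".equals(source.lastUpdatedBy()) && source.equals(target)) {
                return target;
            }
            // Cases 2 and 3: update only if the source row is newer than the target row;
            // otherwise the target row is left alone and keeps its own last_updated_by.
            if (source.lastUpdateDate().isAfter(target.lastUpdateDate())) {
                return source;
            }
            return target;
        }

        public static void main(String[] args) {
            SeedRecord source = new SeedRecord("CUSTOM_USER", Instant.parse("2009-04-01T00:00:00Z"));
            SeedRecord target = new SeedRecord("ORACLE", Instant.parse("2009-05-01T00:00:00Z"));
            // The target row is more recent, so it is kept unchanged.
            System.out.println(merge(source, target));
        }
    }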
    Profiles
    HR: Business Group => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    HR: Security Profile => Set the name of the Business Group for which you would like to extract data from source instance. After loading Business Group onto the target instance, make sure that this profile option is set appropriately.
    MO: Operating Unit => Set the Operating Unit name for which you would like to extract data from source instance. After loading Operating Unit onto the target instance, make sure that this profile option is set if required.
    Navigation path to do the above setup:
    System Administrator -> Profile -> System.
    Query for the above profiles and set the values accordingly.
    Descriptive & Key Flex Fields
    You must compile and freeze the flexfield values before extracting using iSetup.
    Otherwise, it would result in a partial migration of data. Please verify that all the data has been extracted by reporting on your extract before loading, to ensure data consistency.
    You can load the KFF/DFF data to the target instance even if the structures in the source and target instances are different, but only in the cases below.
    Case 1:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1, Loc2, Loc3 (Mandate), Loc4, Loc5 and Loc6
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3, Loc4, then locations will be loaded to target instance without any issue. If you do not provide value for Loc3, then API will fail, as Loc3 is a mandatory field.
    Case 2:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1 (Mandate), Loc2
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4 and load data to target instance, API will fail as Loc3 and Loc4 are not there in target instance.
    It is always recommended that KFF/DFF structure should be same for both source as well as target instances.
    Concurrent Programs and Request Groups
    The Concurrent Program API migrates the program definition (definition + parameters + executable) only. It does not migrate the physical executable files under APPL_TOP; please use a custom solution to migrate executable files. Load Concurrent Programs prior to loading Request Groups. Otherwise, the associated concurrent program metadata will not be moved even though the Request Group extract contains the associated Concurrent Program definition.
    Locations - Geographies
    If you have any custom Geographies, iSetup does not have any API to migrate this setup. Enter them manually before loading Locations API.
    Currency Types
    iSetup does not have an API to migrate currency types. Enter them manually on the target instance after loading the Currency API.
    GL Fiscal Superuser -> Setup -> Currencies -> Rates -> Types
    Associating Employee Details with a User
    The extract process does not capture employee details associated with users. So, after loading the employee data successfully on the target instance, you have to configure them again on target instance.
    Accounting Setup
    Make sure that all Accounting Setups that you wish to migrate are in status “Complete”. In progress or not-completed Accounting Setups would not be migrated successfully.
    Note: Currently iSetup does not migrate Sub-Ledger Accounting methods (SLA). Oracle supports some default SLA methods such as Standard Accrual and Standard Cash; you may make use of these two. If you want to use your own SLA method, then you need to create it manually on the target instances because iSetup does not have an API to migrate SLA. If a Primary Ledger is associated with Secondary Ledgers using a different Chart of Accounts, then mapping rules should be defined in the target instance manually. The mapping rule name should match the XML tag “SlCoaMappingName”. After that you would be able to load the Accounting Setup to the target instance.
    Organization API - Product Foundation Selection Set
    All Organizations which are defined in HR module will be extracted by this API. This API will not extract Inventory Organization, Business Group. To migrate Inventory Organization, you have to use Inventory Organization API under Discrete Mfg. and Distribution Selection Set. To extract Business Group, you should use Business Group API.
    Inventory Organization API - Discrete Mfg & Distribution Selection Set
    Inventory Organization API will extract Inventory Organization information only. You should use Inventory Parameters API to move parameters such as Accounting Information. Inventory Organization API Supports Update which means that you can update existing header level attributes of Inventory Organization on the target instance. Inventory Parameters API does not support update. To update Inventory Parameters, use Inventory Parameters Update API.
    We have a known issue where the Inventory Organization API migrates non-process-enabled organizations only. If your inventory organization is process enabled, then you can migrate it with a simple workaround. Download the extract zip file to the desktop and unzip it. Navigate to the Organization XML and edit the XML tag <ProcessEnabledFlag>Y</ProcessEnabledFlag> to <ProcessEnabledFlag>N</ProcessEnabledFlag>. Zip the extract back up and upload it to the target instance. You can load the extract now. After successful completion of the load, you can manually enable the flag through the Form UI. We are working on this issue and will update you once a patch is released to Metalink.
    Freight Carriers API - Product Foundation Selection Set
    The Freight Carriers API in the Product Foundation selection set requires Inventory Organization and Organization Parameters as prerequisite setups. These two APIs are available under the Discrete Mfg. and Distribution Selection Set. Also, the Freight Carriers API is available under the Discrete Mfg. and Distribution Selection Set with the name Carriers, Methods, Carrier-ModeServ, Carrier-Org. So, use the Discrete Mfg. selection set to load Freight Carriers. In the next rollup release the Freight Carriers API will be removed from the Product Foundation Selection Set.
    Organization Structure Selection Set
    It is highly recommended to set filter and extract and load data related to one Business Group at a time. For example, setup objects such as Locations, Legal Entities,Operating Units,Organizations and Organization Structure Versions support filter by Business Group. So, set the filter for a specific Business Group and then extract and load the data to target instance.
    List of mandatory iSetup Fwk patches
    8352532:R12.AZ.A - 1OFF:12.0.6: Ignore invalid Java identifier or Unicode identifier characters from the extracted data
    8424285:R12.AZ.A - 1OFF:12.0.6:Framework Support to validate records from details to master during load
    7608712:R12.AZ.A - 1OFF:12.0.4:ISETUP DOES NOT MIGRATE SYSTEM PROFILE VALUES
    List of mandatory API/functional patches
    8441573:R12.FND.A - 1OFF:12.0.4: FNDLOAD DOWNLOAD COMMAND IS INSERTING EXTRA SPACE AFTER A NEWLINE CHARACTER
    7413966:R12.PER.A - MIGRATION ISSUES
    8445446:R12.GL.A - Consolidated Patch for iSetup Fixes
    7502698:R12.GL.A - Not able to Load Accounting Setup API Data to target instance.
    Appendix_
    How to read logs
    ·     Logs are very important to diagnose and troubleshoot iSetup issues. Logs contain both functional and technical errors.
    ·     To find the log, navigate to View Detail screens of Extracts/ Transforms/Loads/Standard/Comparison Reports and click on View Log button to view the log.
    ·     Generic Loader (FNDLOAD or Seed data loader) logs are not printed as a part of main log. To view actual log, you have to take the request_id specified in the concurrent log and search for the same in Forms Request Search Window in the instance where the request was launched.
    ·     Functional errors are mainly due to:
    o     Missing prerequisite data – you did not load one or more prerequisite APIs before loading the current API. For example, trying to load “Accounting Setup” without loading “Chart of Accounts” would result in this kind of error.
    o     Business validation failure – the setup is incorrect as per a business rule. For example, the start date cannot be greater than the end date.
    o     API does not support updating records – if there is a matching record in the target instance and the API does not support update, then you would get this kind of error.
    o     You unselected Update Records while launching the load – if there is a matching record in the target instance and you do not select Update Records, then you would get this kind of error.
    Example – business validation failure
    o     VONAME = Branches PLSQL; KEY = BANKNAME = 'AIBC'
    o     BRANCHNAME = 'AIBC'
    o     EXCEPTION = Please provide a unique combination of bank number, bank branch number, and country combination. The 020, 26042, KA combination already exists.
    Example – business validation failure
    o     Tokens: VONAME = Banks PLSQL
    o     BANKNAME = 'OLD_ROYAL BANK OF MY INDIA'
    o     EXCEPTION = End date cannot be earlier than the start date
    Example – missing prerequisite data.
    o     VONAME = Operating Unit; KEY = Name = 'CAN OU'
    o     Group Name = 'Setup Business Group'
    o     ; EXCEPTION = Message not found. Application: PER, Message Name: HR_ORG_SOB_NOT_FOUND (Set of books not found for ‘Setup Business Group’)
    Example – technical or fwk error
    o     OAException: System Error: Procedure at Step 40
    o     Cause: The procedure has created an error at Step 40.
    o     Action: Contact your system administrator quoting the procedure and Step 40.
    Example – technical or fwk error
    o     Number of installed languages on source and target does not match.
    Edited by: Mugunthan on Apr 24, 2009 2:45 PM
    Edited by: Mugunthan on Apr 29, 2009 10:31 AM
    Edited by: Mugunthan on Apr 30, 2009 10:15 AM
    Edited by: Mugunthan on Apr 30, 2009 1:22 PM
    Edited by: Mugunthan on Apr 30, 2009 1:28 PM
    Edited by: Mugunthan on May 13, 2009 1:01 PM

    Mugunthan
    Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    please assist.
    regards
    girish

  • Questions on SETSPN syntax and what is required for MANUAL AD auth

    I'll preface this by stating that I don't need to do all the extra stuff for Vintela SSO, SSO to database, etc.  I just need to know precisely what is necessary to get AD authentication working.  I managed to get it working in XIr2 previously, but it's been so long, and I'm not 100% sure that everything I wound up doing was absolutely necessary, that I wanted to sort it out for good as we look at going to XI 3.1 SP3.
    In the XI 3.1 SP3 admin guide, page 503, the SETSPN command which is
    used as part of the setup process to establish a service account to
    enable AD authentication is outlined as follows:
    SETSPN.exe -A <ServiceClass>/<DomainName> <Serviceaccount>
    The guide suggests that the <ServiceClass> can be anything you want to
    arbitrarily assign. If I choose something other than the
    suggested "BOBJCentralMS" value, is there anywhere else I have to
    specify this value to allow the service account to function properly?
    The guide suggests that the <DomainName> should be the domain name on
    which the service account exists however I've seen many posts online which seem to
    indicate this <DomainName> should actually be the FQDN of the server
    running the CMS service instead of the general domain name.
    Clarification there would be very helpful if anyone has some insight.

    The CMS account can have an SPN of spaghetti/meatballs; there are no requirements (except 2 characters on each side of the /, I believe). The SPN created should be the value entered in the CMC > Authentication > Windows AD.
    The account must run the SIA and therefore must have AD permissions. Now, if you are using IIS or the client tools, you don't even need an SPN. The SPN is for Kerberos only, which is required for Java app servers.
    The Vintela SSO white paper in this forum's sticky post explains the roles of a service account.
    Regards,
    Tim

  • Is CAL required for SharePoint Foundation 2010?

    Is extra CAL required for uploading/downloading docs using SharePoint Foundation 2010? We already have Licensed Windows Server 2008 R2. Ours is an intranet application which will be accessed by 400 intranet users.
    If a CAL is required, then can I use only 1 CAL for my app server to upload/download docs on SP Foundation, provided all 400 client requests go via the app server to the SP server?

    If I want to use SharePoint Foundation 2013, I need to have a Windows Server 2008 R2 / 2012 license and a SQL Server license (optional). I'm confused about what kind of Windows Server license is required and whether I also need some other license to use SharePoint Foundation.
    How can I provide external user access to my SharePoint environment? Is any other license required as well?
