Bulk load of user security details

Hi,
Any tips on how to go about a bulk load of user details in BPC? We have a list of users and their required authorisations. Maybe using scripts or something else? I know using DTS is one way, but I am trying to find out whether there is any workaround.

My advice to customers and partners is to always build a security matrix in Excel to determine all the assignments. The matrix helps you determine whether you have captured all the correct teams, task assignments, and access. Try to set up the users without any access first, then set up the member access profiles and task profiles. Bring this all together via the team assignment, since a team may have only one task profile but multiple member access profiles. While building the security may take time, there are methods to minimize the current and future maintenance after the initial setup. Plus, once you set up the process via the admin console, you can see in the table structures just how complex the assignments are for each of the components. Even once the tables are set, I still believe (and I may be wrong) that an admin needs to process at a minimum the teams from the admin console, to establish the connections required for users to get access.
Hope this helps.

Similar Messages

  • Bulk load of users for Portal

    hi
    We have an Oracle database on which we use a third-party payroll application. This database contains employee info and looks to be the perfect source for getting the details to create portal sign-ons. We are running Portal 9.0.2.3 on AIX 5.1 and I would like to know: how do I do a bulk load into OID so that all our employees can have a sign-on to our company Portal? I have seen the ldif file, but I cannot see how I can create one from raw data in a table.
    Has anyone overcome this before??
    thanks for any info!

    You should look at the Oracle Internet Directory (OID) Administrator's manual - which has a whole Part dedicated to discussing the Oracle Directory Integration Platform.
    What you are looking for is very similar to the connector for Oracle HR, which is discussed in Chapter 33.
    However, it will not be the same, so some slight modifications will be necessary. You will basically devise your own connector. Please read this section and it should provide the info you need to accomplish it.
    Oracle does this internally as well. There is a synchronization service that runs against a database-based "Profile" system that is used for Oracle Technology Network (OTN) accounts, and automatically provisions user accounts on my.oracle.com, using this Oracle Directory Integration Platform.
    If you need specific help after reading up on the referenced information, you may want to post your questions to the OID forum.
    Thanks
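    If the full Directory Integration Platform connector feels like more than you need, one common workaround for the "LDIF from a table" part is simply to generate the LDIF with a SQL*Plus query against the employee table and spool it to a file. The sketch below makes assumptions: the EMPLOYEES table, its columns, the inetOrgPerson object class, and the cn=Users,dc=mycompany,dc=com base DN are all placeholders to replace with your real schema and OID realm (OID user entries typically need additional Oracle-specific object classes as well).

    -- Hypothetical example: spool an LDIF file from an EMPLOYEES table.
    SET HEADING OFF
    SET FEEDBACK OFF
    SET PAGESIZE 0
    SET LINESIZE 500
    SET TRIMSPOOL ON
    SPOOL employees.ldif
    SELECT 'dn: cn=' || emp_login || ',cn=Users,dc=mycompany,dc=com' || CHR(10) ||
           'changetype: add'                                         || CHR(10) ||
           'objectclass: inetOrgPerson'                              || CHR(10) ||
           'cn: '   || emp_login                                     || CHR(10) ||
           'sn: '   || last_name                                     || CHR(10) ||
           'mail: ' || email_address                                 || CHR(10)
      FROM employees;
    SPOOL OFF

    The resulting employees.ldif can then be loaded with ldapadd/ldapmodify or with OID's bulk loading tools.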

  • Notifications are not being sent when Bulk Load is done

    Hi All,
    I have an OIM 11g setup on my machine. I use the bulk load utility for loading the user data. In my OIM setup, notifications are being sent for events such as password reset, new account creation, and others. However, when I bulk load the users, notifications are not sent to their mail IDs. I am running the scheduled job "Bulk Load Post Process", which is necessary so that the users are synced to the LDAP repository. I have the LDAP Sync option checked and also the Notifications option set to Yes in this scheduled job. Although the users are loaded successfully and are synced properly, the notifications are not sent. Can someone please guide me as to what could be the problem here?
    Thanks,
    $id

    The code is probably only called in the Event method of the event handler that sends the notification. You can check the mds files and find the notification you are looking for and then use a code decompiler to find the class that is called. You can then use this code as a sample, or write your own notification code and create an event handler that runs in the BulkEvent.
    And on another note there is also this System Configuration Variable: Recon.SEND_NOTIFICATION which is set to FALSE by default.
    -Kevin

  • Can someone reply to this - pre-populating (bulk load) the OID - URGENT

    gurus,
    i'm using following -
    Database --> Oracle 9i
    Portal --> Oracle Portal 9iAS Release 2
    there are about 10,000 portal users. I would like to pre-populate the OID from the existing employee repository (the employee repository is a custom Oracle database).
    Question - is there a white paper that gives you all the APIs required to do so? I have to accomplish the following tasks:
    1. create users
    2. give them privileges
    3. assign them to groups
    4. assign a default group to users
    I need to achieve the above as part of pre-populating the OID.
    Ideas, anyone?
    Thanks a bunch.

    Hero,
    I just went through an exercise where I did a bulk load of users and did exactly the four steps you're asking for. I also applied the users to designated groups.
    I'm on HP-UX, but the solution can apply to any O/S.
    How do I get this to you?
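    For anyone else landing on this thread, the directory-entry part of task 1 can also be done programmatically from the database with the DBMS_LDAP package. The sketch below is only illustrative: the host, port, bind credentials, object class, and entry DN are placeholders, and the Portal-specific pieces (privileges, group membership, default group) still have to be handled through the Portal security APIs or an LDIF load.

    DECLARE
      l_session DBMS_LDAP.session;
      l_attrs   DBMS_LDAP.mod_array;
      l_vals    DBMS_LDAP.string_collection;
      l_rc      PLS_INTEGER;
    BEGIN
      DBMS_LDAP.use_exception := TRUE;

      -- Connect and bind to OID (placeholder host and credentials)
      l_session := DBMS_LDAP.init('oid.mycompany.com', 389);
      l_rc := DBMS_LDAP.simple_bind_s(l_session, 'cn=orcladmin', 'password');

      -- Build the attribute list for one user entry
      l_attrs := DBMS_LDAP.create_mod_array(3);

      l_vals(1) := 'inetOrgPerson';
      DBMS_LDAP.populate_mod_array(l_attrs, DBMS_LDAP.MOD_ADD, 'objectclass', l_vals);

      l_vals.DELETE;
      l_vals(1) := 'jsmith';
      DBMS_LDAP.populate_mod_array(l_attrs, DBMS_LDAP.MOD_ADD, 'cn', l_vals);

      l_vals.DELETE;
      l_vals(1) := 'Smith';
      DBMS_LDAP.populate_mod_array(l_attrs, DBMS_LDAP.MOD_ADD, 'sn', l_vals);

      -- Add the entry; for a bulk load, loop over the rows of the employee table
      l_rc := DBMS_LDAP.add_s(l_session,
                              'cn=jsmith,cn=Users,dc=mycompany,dc=com',
                              l_attrs);

      DBMS_LDAP.free_mod_array(l_attrs);
      l_rc := DBMS_LDAP.unbind_s(l_session);
    END;
    /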

  • How to prevent Evaluate User Policies to run for Bulk loaded users?

    Hi,
    I have an OIM 11g R2 environment where I did a bulk load of about 200,000+ users, and all the users' accounts were created using target recon.
    How do I prevent the evaluate user policies scheduler from running for these users?
    Any ideas are welcome.
    Thanks,
    Aravind Suresh

    Hi,
    I do have roles and access policies.
    But I do not want them to be applied to these users at this stage, as they already got everything through target recon.
    I want the Evaluate User Policies job to run only for new users, or for these users when they are updated.
    Otherwise, running Evaluate User Policies for this many users could be a very time- and resource-consuming task.
    Thanks,
    Aravind Suresh

  • OIM Bulk Load always asking for OIM database user

    Hi,
    While launching the OIM bulk load utility, all the steps are executed correctly, but when the Java driver is called, I always get the same message:
    Enter password for OIM database user again :
    And I never get to menu #2.
    Thank you for your help.
    Is there any way to set the password as a parameter to the Java program?

    Basically, you have to provide the correct DB password twice. Are you using ojdbc5.jar in the lib folder?
    You can set the password in the oim_blkld_db_input.sh/bat script (just update oimpwd=<Actual password> and remove the other lines under the getDbPswd() method).

  • Bulk load Supporting details

    Hi All,
    I was looking at options for how we can bulk load and extract supporting detail into/from Hyperion Planning. The version is 4.0.2.2.
    Smartview??
    Thanks

    You will need to use SQL to bulk load and/or extract supporting detail.
    The relevant tables in your specific Planning application database are listed below:
    Table name:      HSP_COLUMN_DETAIL
    Description:     One row per line item detail will exist in this table. This table stores unique numbers which relate to the dimension names and also the detail_id which is a value to find the specific cell detail.
    Table name:      HSP_OBJECT
    Description:     This table is used to translate the DIM1–DIM22 IDs into member names. Object_name is the member name. If the member has an alias, the HSP_ALIAS table must be queried.
    Table name:      HSP_COLUMN_DETAIL_ITEM
    Description:     One row per line item detail will exist in this table. This table stores unique numbers which relate to the dimension names and also the detail_id which is a value to find the specific cell detail.
    Regards,
    -John
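    To make that a bit more concrete, a query along the following lines can extract supporting detail from the Planning application schema. Treat it strictly as a sketch: it only joins two of the dimension columns, and the HSP_COLUMN_DETAIL_ITEM column names used here (LABEL, OPERATOR, POSITION, VALUE) are assumptions that you should verify against your own 4.0.2.2 schema before relying on the output.

    -- Hypothetical supporting-detail extract; verify column names in your schema.
    SELECT o1.object_name AS dim1_member,
           o2.object_name AS dim2_member,
           i.label        AS detail_label,
           i.operator     AS detail_operator,
           i.position     AS detail_position,
           i.value        AS detail_value
      FROM hsp_column_detail      d
      JOIN hsp_column_detail_item i  ON i.detail_id  = d.detail_id
      JOIN hsp_object             o1 ON o1.object_id = d.dim1
      JOIN hsp_object             o2 ON o2.object_id = d.dim2
     ORDER BY d.detail_id, i.position;

    Bulk loading supporting detail is the reverse direction: insert into the same tables, again after confirming the exact column list and key relationships for your release.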

  • User Interface for bulk loading images using interMedia

    I would like to create an interface where users could bulk load images to a database. Has anyone created a web (or other) interface that would perhaps call a PL/SQL procedure or SQL*Loader?
    Is there a way for users to upload images from their own computers in bulk? Would they need to use SQL*Plus?
    While I have seen the examples and plan to create a web interface for uploading images one at a time, I have been asked to find a way for the users to upload images in bulk themselves (instead of them asking us technical people to do it).
    Thanks for any suggestions.
    Judy

    Originally posted by Simon Oxbury:
    Hi,
    There's a sample on OTN that discusses loading multimedia data in bulk into the interMedia types using both SQL*Plus (with PL/SQL) and SQL*Loader. Check out the following URL: http://otn.oracle.com/sample_code/products/intermedia/htdocs/avi_bulk_loading.html
    One major difference to consider between SQL*Loader and SQL*Plus (with PL/SQL) is that SQL*Loader can load data from files on the machine running SQL*Loader, which may be a different machine than the database, although it still needs an Oracle installation. SQL*Plus with PL/SQL, on the other hand, can load data only from directories that are accessible to the database server and that have been defined in the server using the CREATE DIRECTORY command, which requires privileges. Also note there are restrictions and issues specific to both NT and Unix when it comes to accessing network directories from the server.
    If SQL*Loader looks like a possibility, you might want to think about a simple Java program, Perl script, or some such, to create the SQL*Loader scripts. On the other hand, if you get into Java, then you could use Java to do the upload and, at the same time, provide some level of application-specific user interaction and/or error reporting, etc. It's easy to get a list of file names in a directory using the File.list or File.listFiles methods in Java. Then again, if we are talking about LOTS of files, SQL*Loader may turn out to be more efficient.
    In order to better understand the variety of ways in which our customers are using interMedia, we'd be very interested in knowing a little more about your application, the interMedia functionality that you are using, and how you are developing and deploying your application. If you are able to help us, please send a short email message with some information about your application, together with any comments you may have, to the Oracle interMedia Product Manager, Joe Mauro, at [email protected]. Thank you!
    Regards,
    Simon
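    To make the SQL*Plus/PL/SQL option above a little more concrete, here is a minimal sketch that loads one file from a server-side directory object into a BLOB column. It deliberately uses a plain BLOB table instead of the interMedia ORDImage type to keep the example short; the directory object, table, and file names are hypothetical, and in a real bulk load you would loop over a list of file names.

    -- One-time setup (requires privileges); the path must be visible to the DB server:
    --   CREATE DIRECTORY image_dir AS '/u01/app/images';
    --   CREATE TABLE photo_store (file_name VARCHAR2(200), image BLOB);
    DECLARE
      l_bfile BFILE := BFILENAME('IMAGE_DIR', 'photo1.jpg');
      l_blob  BLOB;
    BEGIN
      -- Create the row with an empty LOB locator, then fill it from the file
      INSERT INTO photo_store (file_name, image)
      VALUES ('photo1.jpg', EMPTY_BLOB())
      RETURNING image INTO l_blob;

      DBMS_LOB.fileopen(l_bfile, DBMS_LOB.file_readonly);
      DBMS_LOB.loadfromfile(l_blob, l_bfile, DBMS_LOB.getlength(l_bfile));
      DBMS_LOB.fileclose(l_bfile);
      COMMIT;
    END;
    /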

  • Bulk load users into Directory and Messaging at the same time

    Can I bulk load users into Directory and Messaging at the same time?

    Yes, but you are not really loading users into the Messaging Server. The Directory Server contains all kinds of information about users, including information about their email. So if you want to load user information into the Directory Server, including the users' messaging information, you must install the Directory Server first. Configure the Directory Server, then install the Messaging Server. Then you can load users into both by loading an LDIF file with the user information.

    Hi,
    You can check the documentation: Multiple Copies of RMAN Backups ;-)
    When backing up datafiles, archived redo log files, server parameter files and control files into backup pieces, RMAN can duplex the backup set, producing up to four identical copies of each backup piece in the backup set on different backup destinations with one BACKUP command. (Note that duplexing is not supported for backup operations that produce image copies.)
    There are three ways to specify duplexing of backup sets when using the BACKUP command:
    Specify a default level of duplexing with CONFIGURE ... BACKUP COPIES. All backup commands that back up data into backup sets will be affected if you use this option, unless you specify different duplexing options for a command using SET BACKUP COPIES or provide a COPIES option for the BACKUP command.
    Use SET BACKUP COPIES in a RUN block. All commands in the RUN block will be affected, overriding any CONFIGURE ... BACKUP COPIES setting, except those where you provide a COPIES option as part of the BACKUP command.
    Provide a COPIES option to the BACKUP command. For this specific BACKUP command, files will be duplexed to produce the number of copies you specify.
    Cheers
    Legatti

  • SSRS 2005 report: Cannot bulk load Operating system error code 5(Access is denied.)

    I built a SSRS 2005 report, which calls a stored proc on SQL Server 2005. The proc contains following code:
    CREATE TABLE #promo (promo VARCHAR(1000));
    -- Bulk load the promo names from the UNC path into the temp table
    BULK INSERT #promo
    FROM '\\aseposretail\c$\nz\promo_names.txt'
    WITH
    (
        --FIELDTERMINATOR = '',
        ROWTERMINATOR = '\n'
    );
    SELECT * FROM #promo;
    It's ok when I manually execute the proc in SSMS.
    When I try to run the report from BIDS I got following error:
    Cannot bulk load because the file "\\aseposretail\c$\nz\promo_names.txt" could not be opened. Operating system error code 5 (Access is denied.).
    Note: I have googled a bit and seen many questions on this, but they are not relevant because I CAN run the code with no problem in SSMS. It's SSRS that is having the issue. I know little about the security of SSRS.

    I'm having the same type of issue.  I can bulk load the same file into the same table on the same server using the same login on one workstation, but not on another.  I get this error:
    Msg 4861, Level 16, State 1, Line 1
    Cannot bulk load because the file "\\xxx\abc.txt" could not be opened. Operating system error code 5(Access is denied.).
    I've checked SQL client versions and they are the same, and I've also set the client connection to TCP/IP only in the SQL Server Configuration Manager. Still, this one workstation is getting the error. Since the same login is being used on both workstations and it works on one but not the other, the issue is not a permissions issue. I can also have another user log in to the bad workstation and have the bulk load fail, but when they log in to their regular workstation it works fine. Any ideas on what the client configuration issue is? These are the version numbers for Management Studio:
    Microsoft SQL Server Management Studio 9.00.3042.00
    Microsoft Analysis Services Client Tools 2005.090.3042.00
    Microsoft Data Access Components (MDAC) 2000.085.1132.00 (xpsp.080413-0852)
    Microsoft MSXML 2.6 3.0 5.0 6.0
    Microsoft Internet Explorer 6.0.2900.5512
    Microsoft .NET Framework 2.0.50727.1433
    Operating System 5.1.2600
    Thanks,
    MWise

  • Windows cannot load the user's profile but has logged you on with the default profile for the system.

    My Windows 7  crashed a couple days ago after a windows update, I got this message.
    Windows cannot find the local profile and is logging you on with a temporary profile. Changes you make to this profile will be lost when you log off.
    I restarted the machine and got this message
    Windows was unable to load the registry. This problem is often caused by insufficient memory or insufficient security rights.
    DETAIL - The process cannot access the file because it is being used by another process. for C:\Users\TEMP\ntuser.dat
    I checked the event Log I found these .
    Windows cannot load the user's profile but has logged you on with the default profile for the system.
    DETAIL - Only part of a ReadProcessMemory or WriteProcessMemory request was completed.
    Windows has backed up this user profile. Windows will automatically try to use the backup profile the next time this user logs on.
    Windows cannot load the locally stored profile. Possible causes of this error include insufficient security rights or a corrupt local profile.
     DETAIL - The process cannot access the file because it is being used by another process.
    This is the first error in the event viewer after a successful logon
    The description for Event ID 34 from source ccSvcHst cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
     If the event originated on another computer, the display information had to be saved with the event.
    ccSetMgr
    Windows cannot load the user's profile but has logged you on with the default profile for the system.
    DETAIL - Access is denied.
    Looking at the logs, all I can tell is that after the Desktop Window Manager started, it caused this error.
    The winlogon notification subscriber <SessionEnv> was unavailable to handle a notification event.
    then this one
    The Desktop Window Manager has exited with code (0x40010004)
    Then this before it shut down.
    The User Profile Service has stopped.
    I started up the PC and the first message I got was:
    The EventSystem sub system is suppressing duplicate event log entries for a duration of 86400 seconds. The suppression timeout can be controlled by a REG_DWORD value named SuppressDuplicateDuration under the following registry key: HKLM\Software\Microsoft\EventSystem\EventLog.
    How can I get access to my user profile? Do I need to create a new Administrator account? Please help.

    Hi, do the following:
    1. In the Search programs and files box (Windows 7), type regedit and press Enter.
    2. If prompted, click Yes.
    3. Expand the following key: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList
    4. Click the SID that relates to your admin profile (if you are not sure, click each SID in turn and look at the right-hand side of Registry Editor; it shows who each SID relates to - one of the registry values should have a description like localhost\admin or something similar).
    5. Right-click the SID and press Delete.
    6. Restart your machine and log back on with the admin account; this will rebuild the admin profile. Don't worry when it loads and none of your personal settings, files or folders are there - go to C:\Users.
    In here you will see two folders for the admin account: one will be just admin and the other most likely admin.localhost.
    I can't remember which one is which, but just check both; one will still have all your files and folders in it.
    I suggest making a backup of your data before doing this in case something does go wrong, but I've had this happen many times in a domain environment and it has worked for me every time.

  • Using API  to run Catalog Bulk Load - Items & Price Lists concurrent prog

    Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with the item loading concurrent program. For one thing, the program is stuck in phase code P (Pending) status.
    When I run the same concurrent program using the iProcurement Administration page it runs ok.
    Has anyone been able to run this program through the backend? If so, any help is appreciated.
    Thanks

    Hello S.P,
    Basically this is what I am trying to achieve.
    1. Create a staging table. The columns available for it are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency.
    So basically the user can load item details into the database from an Excel sheet.
    2. Using the UTL_FILE API, create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items in iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
    3. Use the fnd_request.submit_request API to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists' (a sketch of this call follows this reply). This is where I am stuck. The process simply stays pending, or comes up with an error saying:
    oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
    I'm wondering if anyone has used my approach to load items before and, if so, whether they have been successful.
    Thank you
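    For reference, the step 3 submission is roughly what the sketch below shows when done from PL/SQL. It is only a sketch: the user/responsibility IDs, the application short name 'ICX', the program short name and the argument are placeholders (look up the real short name of 'Catalog Bulk Load - Items & Price Lists' in its concurrent program definition), and note that the concurrent managers only see the request after the submitting session commits.

    DECLARE
      l_request_id NUMBER;
    BEGIN
      -- Placeholder user/responsibility context; use one allowed to run the program
      fnd_global.apps_initialize(user_id      => 1234,
                                 resp_id      => 20420,
                                 resp_appl_id => 177);

      -- Application and program short names below are hypothetical placeholders
      l_request_id := fnd_request.submit_request(
                        application => 'ICX',
                        program     => 'CATALOG_BULK_LOAD',
                        description => NULL,
                        start_time  => NULL,
                        sub_request => FALSE,
                        argument1   => '/var/tmp/iprocurement/item_load.xml');

      -- The request only becomes visible to the concurrent managers after a commit
      COMMIT;
      DBMS_OUTPUT.put_line('Submitted request: ' || l_request_id);
    END;
    /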

  • Error when doing an ATGOrder bulk load

    Hi
    Getting the below error when trying to do a bulk load of ATGOrder in CSC.
    Machine details: Linux 64-bit machine
    ATG version: 10.1
    17:44:07,487 INFO [OrderOutputConfig] Starting bulk load
    17:44:11,482 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:11,488 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:11,495 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:17,651 WARN [LiveIndexingService] Current hosts for environment ATGOrderBulk cannot support requested engine count
    17:44:17,652 WARN [LiveIndexingService] Allocate more hosts or increase the maximum number of search engines for one of its hosts
    17:44:17,656 ERROR [LiveIndexingService] Unable to release lock: __routingLiveIndexingLock:ATGOrder
    atg.service.lockmanager.LockManagerException: Attempt to release a write lock when not the owner: key=__routingLiveIndexingLock:ATGOrder Owner=Thread[http-0.0.0.0-8580-1:ipaddr=172.21.21.49;path=/dyn/admin/nucleus/atg/commerce/search/OrderOutputConfig/;sessionid=B0DC1551B81ACFD6B7C987E59116D825,5,jboss]
    at atg.service.lockmanager.ClientLockEntry.releaseWriteLock(ClientLockEntry.java:713)
    at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1386)
    at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1415)
    at atg.search.routing.LiveIndexingService.releaseLock(LiveIndexingService.java:1843)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1455)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    17:44:17,658 ERROR [BulkLoader]
    atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    ... 49 more
    Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
    ... 51 more
    17:44:17,675 ERROR [OrderOutputConfig]
    atg.repository.search.indexing.IndexingException: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:1040)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    ... 48 more
    Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    ... 49 more
    Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
    ... 51 more

    In my /atg/search/routing/LiveIndexingService/ component I have the following values.
    ATGProfile      running      yes      yes      8000001      null      1      1      1      start stop cycle delete
    backup restore disable
    ATGProfileBulk      stopped      NO      yes      null      null      1      0      0      start stop cycle delete
    backup restore disable
    ATGOrder      running      yes      yes      8000002      null      1      4      4      start stop cycle delete
    backup restore disable
    ATGOrderBulk      stopped      NO      yes      null      null      1      0      0      start stop cycle delete
    backup restore disable
    Why are there 4 engines configured for ATGOrder? I think this is what is causing the problem, but I am unable to find where these 4 engines are being created.

  • Get error while Integrating with Oracle's Enterprise User Security

    Hi,
    I am trying to set up an Oracle Enterprise User integration with OVD and MS Active Directory.
    I am following all the steps in Integrating with Oracle's Enterprise User Security.
    In the documentation section: "Configuring Oracle Virtual Directory for the Integration"
    I have applied the steps successfully until:
    Update and load the entries into the Local Store Adapters by performing the following steps:
    I have successfully extended the Oracle Virtual Directory schema with the loadOVD.ldif
    However I am getting errors in the next step: Update realmRoot.ldif to use your namespaces
    The next step states the following:
    Update realmRoot.ldif to use your namespaces, including the dn, dc, o, orclsubscriberfullname,
    and memberurl attributes in the file. If you have a DN mapping between Active Directory and
    Oracle Virtual Directory, use the DN that you see from Oracle Virtual Directory.
    The realmRoot.ldif file is located in ORACLE_VIRTUAL_DIRECTORY_HOME/eus,
    where ORACLE_VIRTUAL_DIRECTORY_HOME represents the location where Oracle Virtual Directory is installed.
    The realmRoot.ldif file contains core entries in the directory namespace that Enterprise User Security queries. The realmRoot.ldif file also contains the dynamic group that contains the registered Enterprise User Security databases to allow secured access to sensitive Enterprise User Security related attributes, like the user's Enterprise User Security hashed password attribute.
    Load your domain root information in the realmRoot.ldif file into Oracle Virtual Directory using the following command:
    ldapmodify -h Oracle_Virtual_Directory_Host –p OVD_Port -D cn=admin -w Admin_Password -v -a –f realmRoot.ldif
    When I run the ldapmodify command I get the following error:
    add dc:
    testldap
    add objectclass:
    top
    domain
    domainDNS
    adding new entry DC=testldap,DC=local
    ldap_add: Operations error
    ldap_add: additional info: LDAP Error 1 : null
    The actual realmRoot.ldif looks like this:
    # Please uncomment the following one line if you are importing this
    # LDIF file via OVD Manager or OVD Server's ldapmodify tool.
    #version: 1
    #dn: dc=com
    #dc: com
    #objectclass: domain
    dn: DC=testldap,DC=local
    changetype: add
    dc: testldap
    #o: subarashii
    objectclass: top
    objectclass: domain
    objectclass: domainDNS
    #objectclass: orclSubscriber
    #orclsubscriberfullname: subarashii
    #orclVersion: 90400
    # If your domain structure has more layers than dc=subarashii,dc=com,
    # for example, it's dc=us,dc=subarashii,dc=com, you will need to load
    # the following ldif entry/entries too.
    # Uncomment out the following, if required.
    #dn: dc=us,dc=subarashii,dc=com
    #orclversion: 90400
    #orclsubscriberfullname: us
    #objectclass: domain
    #objectclass: top
    #objectclass: orclSubscriber
    #dc: us
    # Adding EUSDBGroup entry
    # Modify the memberurl attribute and replace it with your own domain name
    #dn: cn=EUSDBGROUP,dc=subarashii,dc=com
    #cn: EUSDBGROUP
    #memberurl:ldap:///dc=subarashii,dc=com??sub?(&(objectclass=orclService)(objectclass=orclDBServer))
    #objectclass:groupofuniquenames
    #objectclass:groupofurls
    #objectclass:top

    Did you ever get your questions answered about the realmRoot.ldif file? Did you manage to configure a successful integration of OVD with EUS? I am battling to get Oracle Virtual Directory integrated with Enterprise User Security, but every step I take in Chapter 7 of the OVD manual fails in some way, and the instructions are often vague. I am not sure how to modify the realmRoot.ldif file. Is there any improved documentation on this? I have logged a Service Request, but I am not getting any help. Any resources or documentation you know of that provide better guidance would be much appreciated. I am way behind my schedule now and this is a very frustrating exercise.
    Thanks.

  • Bulk Load option doesn't work

    Hi Experts,
    I am trying to load data to HFM using the bulk load option, but it doesn't work. When I change the option to SQL insert, the loading is successful. The logs say that the temp file is missing, but when I go to the specified location, I see the control file and the tmp file. What am I missing to get bulk load working? Here's the log entry.
    2009-08-19-18:48:29
    User ID...........     kannan
    Location..........     KTEST
    Source File.......     \\Hyuisprd\Applications\FDM\CRHDATALD1\Inbox\OMG\HFM July2009.txt
    Processing Codes:
    BLANK............. Line is blank or empty.
    ESD............... Excluded String Detected, SKIP Field value was found.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    RFM............... Required Field Missing.
    TC................ Type Conversion, Amount field could be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    Create Output File Start: [2009-08-19-18:48:29]
    [TC] - [Amount=NN]     Batch Month File Created: 07/2009
    [TC] - [Amount=NN]     Date File Created: 8/6/2009
    [TC] - [Amount=NN]     Time File Created: 08:19:06
    [Blank] -      
    Excluded Record Count.............. 3
    Blank Record Count................. 1
    Total Records Bypassed............. 4
    Valid Records...................... 106093
    Total Records Processed............ 106097
    Begin Oracle (SQL-Loader) Process (106093): [2009-08-19-18:48:41]
    [RDMS Bulk Load Error Begin]
         Message:      (53) - File not found
         See Bulk Load File:      C:\DOCUME~1\fdmuser\LOCALS~1\Temp\tWkannan30327607466.tmp
    [RDMS Bulk Load Error End]
    Thanks
    Kannan.

    Hi Experts,
    I am facing a data import error while importing data from a .csv file to an FDM-HFM application.
    2011-08-29 16:19:56
    User ID...........     admin
    Location..........     ALBA
    Source File.......     C:\u10\epm\DEV\epm_home\EPMSystem11R1\products\FinancialDataQuality\FDMApplication\BMHCFDMHFM\Inbox\ALBA\BMHC_Alba_Dec_2011.csv
    Processing Codes:
    BLANK............. Line is blank or empty.
    ESD............... Excluded String Detected, SKIP Field value was found.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    RFM............... Required Field Missing.
    TC................ Type Conversion, Amount field could be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    Create Output File Start: [2011-08-29 16:19:56]
    [ESD] ( ) Inter Co,Cash and bank balances,A113000,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],1
    [ESD] ( ) Inter Co,"Trade receivable, prepayments and other assets",HFM128101,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],35
    [ESD] ( ) Inter Co,Inventories ,HFM170003,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],69
    [ESD] ( ) Inter Co,Financial assets carried at fair value through P&L,HFM241001,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],103
    [Blank] -      
    Excluded Record Count..............4
    Blank Record Count.................1
    Total Records Bypassed.............5
    Valid Records......................0
    Total Records Processed............5
    Begin SQL Insert Load Process (0): [2011-08-29 16:19:56]
    Processing Complete... [2011-08-29 16:19:56]
    Please help me solve the issue.
    Regards,
    Sudhir Sinha
