Catalog Bulk Load Process
Hi
Where can I find the spreadsheet templates (ItemPrice.txt, Item.txt, Price.txt) to bulkload catalog items in iProcurement?
Thanks!
Hi,
Before loading catalogs you have to define the category mapping hierarchy (browsing categories and item categories). The lowest level should be an item category. For example:
PEN (Level 1 -- browsing category)
-- GEL PENS (Level 2 -- item category)
-- BALL POINT PENS (Level 2 -- item category)
-- INK PENS (Level 2 -- item category)
This is the navigation:
Internet Procurement Catalog Admin responsibility --> eContent Manager --> Manage Category Hierarchy. Use this to declare the browsing category (PEN, Level 1), and use Manage Item Categories to declare the item categories (GEL PENS, BALL POINT PENS, INK PENS).
Then go to Manage Category Hierarchy and assign the item categories (BALL POINT PENS, INK PENS) under the browsing category.
Once you have declared the structure, map the categories to purchasing categories using Map Oracle Purchasing Categories.
Then you have to rebuild the index using the command below.
Navigation: N > Setup > E-Catalog Admin > Loader Values
Description: This report is run to extract all purchasing categories that are web enabled from core PO to iProcurement. In the Loader Values screen just click 'Extract Classifications' for the extract to be done.
Notes: View the output of this report to check that everything worked OK and there were no errors. It may be necessary to first drop and then rebuild the interMedia index on the database before this process works for the first time. Script below (run in SQL*Plus):
exec icx_por_intermedia_index.drop_index
exec icx_por_intermedia_index.create_index
(or exec icx_por_intermedia_index.rebuild_index to rebuild in place)
Once it is done, you can load the catalog items using the Upload Items button.
Go to Purchasing --> View Requests and check the concurrent program status.
Let me know if you have any other concerns, or drop me a mail at [email protected]
Thanks
Subbu
Similar Messages
-
Using API to run Catalog Bulk Load - Items & Price Lists concurrent prog
Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with the item loading concurrent program. For one thing, the program is stuck in phase code P (Pending) status.
When I run the same concurrent program using the iProcurement Administration page it runs ok.
Has anyone been able to run this program through the backend? If so, any help is appreciated.
Thanks
Hello S.P,
Basically this is what I am trying to achieve.
1. Create a staging table. The columns available are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency, so the user can load item details into the database from an Excel sheet.
2. Using the UTL_FILE API, create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items in iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
3. Use the fnd_request.submit_request API to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists'. This is where I am stuck. The process simply says Pending, or comes up with an error saying:
oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
I'm wondering if anyone has used my approach to load items before and if so, have they been successful?
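My submission code looks roughly like the following (the apps context ids and the concurrent program short name are placeholders, not the real values):

```sql
declare
  l_request_id number;
begin
  -- initialize the applications context first; a request submitted
  -- without it can fail or misbehave (the ids below are placeholders)
  fnd_global.apps_initialize(user_id      => 1234,
                             resp_id      => 5678,
                             resp_appl_id => 9999);

  l_request_id := fnd_request.submit_request(
      application => 'ICX',                 -- iProcurement application short name
      program     => 'PROGRAM_SHORT_NAME',  -- placeholder for the bulk load program
      description => null,
      start_time  => null,
      sub_request => false,
      argument1   => '/var/tmp/iprocurement/item_load.xml');  -- placeholder arguments

  -- the concurrent manager only sees the request after a commit;
  -- a missing commit is one classic cause of a stuck Pending status
  commit;
end;
```

Note the FileAccessException above suggests the file path also has to be accessible from the node running the concurrent manager, not just from the database node.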
Thank you -
Hello,
I have one question regarding bulk loading. I have done a lot of bulk loading.
But my requirement is to call a function that does some DML and returns a reference key, so that I can insert it into a fact table.
I can't call a DML function in a SELECT statement (that raises an error). The other way is an autonomous transaction, which I tried and it works, but performance is very slow.
How do I call this function inside the bulk loading process?
Help!!
xx_f is the function which uses an autonomous transaction.
See my sample code:
declare
cursor c1 is select a, b, c from xx;
type l_a is table of xx.a%type;
type l_b is table of xx.b%type;
type l_c is table of xx.c%type;
v_a l_a;
v_b l_b;
v_c l_c;
begin
open c1;
loop
fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
forall i in 1..v_a.count
insert into xxyy
(a, b, c) values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
commit;
exit when c1%notfound;
end loop;
close c1;
end;
I just want to call the xx_f function without an autonomous transaction,
but with bulk loading. Please let me know if you need more details.
Thanks
yreddyr
Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
Depending on what it does, an alternative could be something like:
DECLARE
CURSOR c1 IS
SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
TYPE l_a IS TABLE OF whatever xx_f returns;
TYPE l_b IS TABLE OF whatever xx_f returns;
TYPE l_c IS TABLE OF whatever xx_f returns;
v_a l_a;
v_b l_b;
v_c l_c;
BEGIN
OPEN c1;
LOOP
FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
BEGIN
FORALL i IN 1..v_a.COUNT
INSERT INTO xxyy (a, b, c)
VALUES (v_a(i), v_b(i), v_c(i));
END;
EXIT WHEN c1%NOTFOUND;
END LOOP;
CLOSE c1;
END;
John -
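If xx_f really must perform DML, another option along the same lines (a sketch, assuming xx_f accepts and returns the same types as the columns) is to call it from a plain PL/SQL loop instead of from the SQL statement itself -- the restriction on DML only applies when the function is invoked from within SQL, so the autonomous transaction would not be needed:

```sql
declare
  cursor c1 is select a, b, c from xx;
  type l_a is table of xx.a%type;
  type l_b is table of xx.b%type;
  type l_c is table of xx.c%type;
  v_a l_a;
  v_b l_b;
  v_c l_c;
begin
  open c1;
  loop
    fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
    -- call xx_f from PL/SQL, not from SQL: DML inside the function
    -- is legal here, so no autonomous transaction is needed
    for i in 1 .. v_a.count loop
      v_a(i) := xx_f(v_a(i));
      v_b(i) := xx_f(v_b(i));
      v_c(i) := xx_f(v_c(i));
    end loop;
    forall i in 1 .. v_a.count
      insert into xxyy (a, b, c) values (v_a(i), v_b(i), v_c(i));
    exit when c1%notfound;
  end loop;
  close c1;
  commit;  -- one commit at the end, rather than per batch
end;
```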
OIM 11g - Issue with Bulk Load Utility for Account Data
Hi,
We are trying to load the account data for users in OIM 11g using bulk load utility.
We are trying to load the account data for the resource "iPlanet". For testing purposes, we made one account entry in the csv file and ran the bulk load utility. After the bulk load process completed, we noticed that the resource was provisioned to the user multiple times and multiple entries were created in the process form table.
We have tried to run the utility multiple times with a different user record each time.
The output of the below SQL query:
SELECT MSG FROM OIM_BLKLD_LOG
WHERE MODULE = 'ACCOUNT' AND LOG_LEVEL = 'PROGRESS_MSG'
ORDER BY MSG_SEQ_NO;
is coming as follows:
MSG
Number of Records Loaded: 126
Number of Records Loaded: 252
Number of Records Loaded: 504
Number of Records Loaded: 1008
Number of Records Loaded: 2016
Number of Records Loaded: 4032
We have noticed that each time, the number of records loaded doubles relative to the previous run, even though the csv file contains only one record.
Provided below are the parent and child csv file entries.
Parent file:
UD_IPNT_USR_USERID,UD_IPNT_USR_FIRST_NAME,UD_IPNT_USR_LAST_NAME,UD_IPNT_USR_COMMON_NAME,UD_IPNT_USR_NSUNIQUEID
KPETER,Peter,Kevin,Peter Kevin,
Child file 1:
UD_IPNT_USR_USERID,UD_IPNT_GRP_GROUP_NAME
KPETER,group1
Child file 2:
UD_IPNT_USR_USERID,UD_IPNT_ROL_ROLE_NAME
KPETER,role1
Can you please throw some insight on what could be the potential cause for this issue and how it could be resolved?
Thanks
Deepa
Edited by: user10955790 on Jun 25, 2012 6:45 AM
Hi Deepa,
I know from a 'user load' perspective that it is required to restart Oracle Identity Manager when you need to reload data that was not loaded during the first run.
So my suggestion is to restart it before reloading.
Reference: http://docs.oracle.com/cd/E21764_01/doc.1111/e14309/bulkload.htm#CHDEICEH
I hope this helps,
Thiago Leoncio. -
Issue with Bulk Load Post Process Scheduled Task
Hello,
I successfully loaded users in OIM using the bulk load utility. I also have LDAP sync ON. The documentation says to run the Bulk Load Post Process scheduled task to push the loaded users in OIM into LDAP.
This works if we run the Bulk Load Post Process Scheduled Task right away after the run the bulk load.
If some time has passed and we go back to run the Bulk Load Post Process scheduled task, some of the users loaded through the bulk load utility are not created in our LDAP system. This creates an out-of-sync situation between OIM and our LDAP.
I tried to use the usr_key as a parameter to the Bulk Load Post Process Scheduled Task without success.
Is there a way to force the re-evaluation of these users so they would get created in LDAP?
Thanks
Khanh
The scheduled task carries out post-processing activities on the users imported through the bulk load utility.
-
Issue with Bulk Load Post Process
Hi,
I ran the bulk load command line utility to create users in OIM. I had 5 records in my csv file. Of those, 2 users were successfully created in OIM; for the rest I got an exception because the users already existed. After that, if I run Bulk Load Post Process for LDAP sync, password generation and notification, it does not work even for the successfully created users. Ideally it should sync the successfully created users. However, if there is no exception during the bulk load command line utility, the LDAP sync works fine through Bulk Load Post Process. Any idea how to resolve this issue and sync in OID the users which were successfully created? Urgent help would be appreciated.
The scheduled task carries out post-processing activities on the users imported through the bulk load utility.
-
Retry "Bulk Load Post Process" batch
Hi,
First question, what is the actual use of the scheduled task "Bulk Load Post Process"? If I am not sending out email notification, nor LDAP syncing nor generating the password do I still need to run this task after performing a bulk load through the utility?
Also, I ran this task, now there are some batches which are in the "READY FOR PROCESSING" state. How do I re-run these batches?
Thanks,
Vishal
The scheduled task carries out post-processing activities on the users imported through the bulk load utility.
-
Facing issue in Bulk Load Post Process in case of blank fields in csv file
Hi,
I ran the command line bulk utility to create users in OIM through a CSV file. I left a few non-mandatory columns blank. After executing the bulk utility, the users were successfully created in OIM, but when I then ran "Bulk Load Post Process" for LDAP sync, it changed the users' organization to the default organization "Xellerate User", which is a major issue. It happens only when I leave some columns blank. Any idea why this is happening and how to resolve it? Need some urgent help!!
Hi,
Thanks for your reply. The issue was that we could not use DOD=Y due to a dependency in the standard version, where another program fetches data from the same XML structure.
Anyway, we resolved this issue by redoing the post-installation steps below, and used a new schema with the updated DTD, pointing it at the same logical schema.
1) Created a new schema for XML external database
2) Change the DTD file, keep it in the same location
3) Change properties file with new schema name
4) Test with the new schema for XML connection
5) Reverse-engineer the base model ITEMBRANCH only (where we added additional XML tags)
6) Bounce back ODI agent and Client
Thanks,
Pc -
Bulk Loader in Tuxedo/Jolt 8.0
Hello,
I have installed Tuxedo 8.0 under D:\tuxedo on a Win2000 test server, using the "Full install" option. In production, running Tuxedo 6.5/Jolt 1.1 on NT 4.0, we have a directory D:\tuxedo\udataobj\jolt\client.
This directory is no longer present in version 8.0 -- is this correct?
If yes: How do I run the bulk loader, whose class file "jbld.class" lies under
D:\tuxedo\udataobj\jolt\client\classes\bea\joltadm in version 1.1? I cannot find
this class file anywhere in the 8.0 install tree!
regards,
Simen R.
This was added to Tuxedo 8.0 as a performance enhancement.
This ubb option (or equivalent MIB and environment attributes) sets the cache size for the number of service or interface entries that you expect to need to maintain locally.
Without this cache, all the service/interface information must be fetched from the bulletin board process. With this cache, the information is immediately available.
The cache is on by default, and the default cache size is 500 entries. The cache is updated every time the bulletin board is changed, so the usefulness of the cache decreases when the bulletin board changes frequently.
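In UBBCONFIG this is set per server in the SERVERS section, for example (a sketch; the server and group names are placeholders):

```
*SERVERS
simpserv SRVGRP=GROUP1 SRVID=1
         SICACHEENTRIESMAX="200"
```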
As with any performance tuning, you should test with your application to see what size cache (if any) works best. Hope this helps.
Bob Finan
"고경학" wrote:
Hi.
I wonder about SICACHEENTRIESMAX(Service Cache Entry), in Tuxedo Version 8.0
ubbconfig's SERVERS section.
Please tell me about its function.
Thanks in advance.
Best Regards. -
Hi Experts,
I am trying to load data to HFM using the Bulk Load option, but it doesn't work. When I change the option to SQL Insert, the loading is successful. The logs say that the temp file is missing, but when I go to the specified location, I see the control file and the tmp file. What am I missing to have bulk load working?
Here's the log entry:
2009-08-19-18:48:29
User ID........... kannan
Location.......... KTEST
Source File....... \\Hyuisprd\Applications\FDM\CRHDATALD1\Inbox\OMG\HFM July2009.txt
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2009-08-19-18:48:29]
[TC] - [Amount=NN] Batch Month File Created: 07/2009
[TC] - [Amount=NN] Date File Created: 8/6/2009
[TC] - [Amount=NN] Time File Created: 08:19:06
[Blank] -
Excluded Record Count.............. 3
Blank Record Count................. 1
Total Records Bypassed............. 4
Valid Records...................... 106093
Total Records Processed............ 106097
Begin Oracle (SQL-Loader) Process (106093): [2009-08-19-18:48:41]
[RDMS Bulk Load Error Begin]
Message: (53) - File not found
See Bulk Load File: C:\DOCUME~1\fdmuser\LOCALS~1\Temp\tWkannan30327607466.tmp
[RDMS Bulk Load Error End]
Thanks
Kannan.
Hi Experts,
I am facing a data import error while importing data from a .csv file to an FDM-HFM application.
2011-08-29 16:19:56
User ID........... admin
Location.......... ALBA
Source File....... C:\u10\epm\DEV\epm_home\EPMSystem11R1\products\FinancialDataQuality\FDMApplication\BMHCFDMHFM\Inbox\ALBA\BMHC_Alba_Dec_2011.csv
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2011-08-29 16:19:56]
[ESD] ( ) Inter Co,Cash and bank balances,A113000,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],1
[ESD] ( ) Inter Co,"Trade receivable, prepayments and other assets",HFM128101,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],35
[ESD] ( ) Inter Co,Inventories ,HFM170003,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],69
[ESD] ( ) Inter Co,Financial assets carried at fair value through P&L,HFM241001,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],103
[Blank] -
Excluded Record Count..............4
Blank Record Count.................1
Total Records Bypassed.............5
Valid Records......................0
Total Records Processed............5
Begin SQL Insert Load Process (0): [2011-08-29 16:19:56]
Processing Complete... [2011-08-29 16:19:56]
Please help me solve the issue.
Regards,
Sudhir Sinha -
Notifications are not being sent when Bulk Load is done
Hi All,
I have an OIM 11g setup on my machine. I use the bulk load utility for loading the user data. In my OIM setup, notifications are being sent for everything like password reset and new account creation. However, when I bulk-load users, notifications are not sent to their mail ids. I am running the scheduled job "Bulk load Post Process", which is necessary so that the users are synced to the LDAP repository. I have the LDAP Sync option checked and the Notifications option set to Yes in this scheduled job. Though the users are loaded successfully and synced properly, the notifications are not sent. Can someone please guide me as to what the problem could be here?
Thanks,
$id
The code is probably only called in the Event method of the event handler that sends the notification. You can check the MDS files, find the notification you are looking for, and then use a code decompiler to find the class that is called. You can then use this code as a sample, or write your own notification code and create an event handler that runs in the BulkEvent.
And on another note there is also this System Configuration Variable: Recon.SEND_NOTIFICATION which is set to FALSE by default.
-Kevin -
How to improve performance for Azure Table Storage bulk loads
Hello all,
Would appreciate your help as we are facing a challenge.
We are trying to bulk-load Azure Table Storage. We have a file that contains nearly 2 million rows.
We need to reach a point where we can bulk-load 100,000-150,000 entries per minute. Currently, it takes more than 10 hours to process the file.
We have tried Parallel.ForEach but it doesn't help. Today I discovered partitioning in PLINQ. Would that be the way to go?
Any ideas? I have spent nearly two days trying to optimize it using PLINQ, but I am still not sure what the best thing to do is.
Kindly, note that we shouldn't be using SQL/Azure SQL for this.
I would really appreciate your help.
Thanks
I'd think you're just pooling the parallel connections to Azure if you do it on one system. You'd also have a bottleneck of round-trip time from you, through the internet to Azure and back again.
You could speed it up by moving the data file to the cloud and process it with a Cloud worker role. That way you'd be in the datacenter (which is a much faster, more optimized network.)
Or, if that's not fast enough - if you can split the data so multiple WorkerRoles could each process part of the file, you can use the VM's scale to put enough machines to it that it gets done quickly.
Darin R. -
API for bulk loading of pages into UCM
For Oracle Universal Content Management –
Is there an API for use in bulk loading pages?
Where is the documentation for this?
If there is no API, what is the best way to bulk load tens of thousands of pages into UCM?
Thanks in advance,
RamTo easily bulk load files in UCM, you can use the BatchLoader utility described in chapter 7 (in release 10g or chapter 3 in release 11g) of the 'Managing System Settings and Processes' admin guide.
We used it on our project to load some 60000 documents in UCM with associated metadata in just over 1 hour. Worked fine.
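For illustration, a BatchLoader batch file is a plain-text list of records, one per document, terminated by an end-of-data marker; a sketch of a single insert record (all metadata values here are placeholders):

```
Action=insert
dDocName=Sample001
dDocTitle=Batch loaded document
dDocType=Document
dDocAuthor=sysadmin
dSecurityGroup=Public
primaryFile=c:/docs/sample001.doc
<<EOD>>
```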
Brgrds,
Bob Marien
(PS: if you want to use an API instead, you can of course invoke the UCM web services) -
Error in endeca search module-bulk load error
I have added product-catalog-output-config.xml and category-dim-output-config.xml to my project in the respective paths as in DCS. The XMLs are combining when I see them in dyn/admin.
But while running the baseline index, pre-indexing succeeds but RepositoryTypeDimensionExporter fails, and the errors shown in the command prompt are these:
11:14:48,710 INFO [ProductCatalogOutputConfig] Starting bulk load
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() yet implemented.
11:14:48,710 INFO [CategoryToDimensionOutputConfig] Failed to cancel incremental load of /atg/endeca/index/commerce/CategoryToDimensionOutputConfig, probably because no bulk load was running.
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() yet implemented.
11:14:48,710 INFO [ProductCatalogOutputConfig] Failed to cancel incremental load of /atg/commerce/search/ProductCatalogOutputConfig, probably because no bulk load was running.
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() yet implemented.
11:14:55,222 INFO [CategoryToDimensionOutputConfig] Bulk load completed with "false" result in 10,081 milliseconds.
11:14:58,724 INFO [ProductCatalogOutputConfig] Bulk load completed with "false" result in 10,014 milliseconds.
11:14:58,726 INFO [SchemaDocumentSubmitter] Rolling back session for record store Discover_en_schema with transactionId 59
11:14:58,827 INFO [LoggingOutInterceptor] Outbound Message
It is also saying:
"WARN [IndexingPeriodicService] No configuration registered for "/atg/endeca/index/commerce/CategoryToDimensionOutputConfig" so skipping check for incremental update." prior to indexing.
Please help me out.
Thanks
I am also getting this error:
Exception in thread "index-/atg/endeca/index/commerce/ProductCatalogSimpleIndexingAdmin" java.lang.NullPointerException: Property value cannot be null (dimval.display_name)
when I run the baseline index...
Please suggest a solution. -
Error when doing a ATGOrder Bulk load
Hi
Getting the below error when trying to do a bulk load of ATGOrder in CSC.
Machine details: Linux 64-bit machine
ATG version: 10.1
17:44:07,487 INFO [OrderOutputConfig] Starting bulk load
17:44:11,482 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
17:44:11,488 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
17:44:11,495 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
17:44:17,651 WARN [LiveIndexingService] Current hosts for environment ATGOrderBulk cannot support requested engine count
17:44:17,652 WARN [LiveIndexingService] Allocate more hosts or increase the maximum number of search engines for one of its hosts
17:44:17,656 ERROR [LiveIndexingService] Unable to release lock: __routingLiveIndexingLock:ATGOrder
atg.service.lockmanager.LockManagerException: Attempt to release a write lock when not the owner: key=__routingLiveIndexingLock:ATGOrder Owner=Thread[http-0.0.0.0-8580-1:ipaddr=172.21.21.49;path=/dyn/admin/nucleus/atg/commerce/search/OrderOutputConfig/;sessionid=B0DC1551B81ACFD6B7C987E59116D825,5,jboss]
at atg.service.lockmanager.ClientLockEntry.releaseWriteLock(ClientLockEntry.java:713)
at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1386)
at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1415)
at atg.search.routing.LiveIndexingService.releaseLock(LiveIndexingService.java:1843)
at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1455)
at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at atg.nucleus.Nucleus.service(Nucleus.java:2967)
at atg.nucleus.Nucleus.service(Nucleus.java:2867)
at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
at java.lang.Thread.run(Thread.java:662)
17:44:17,658 ERROR [BulkLoader]
atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at atg.nucleus.Nucleus.service(Nucleus.java:2967)
at atg.nucleus.Nucleus.service(Nucleus.java:2867)
at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
at java.lang.Thread.run(Thread.java:662)
Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
... 49 more
Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
... 51 more
17:44:17,675 ERROR [OrderOutputConfig]
atg.repository.search.indexing.IndexingException: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:1040)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at atg.nucleus.Nucleus.service(Nucleus.java:2967)
at atg.nucleus.Nucleus.service(Nucleus.java:2867)
at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
at java.lang.Thread.run(Thread.java:662)
Caused by: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
... 48 more
Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
... 49 more
Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
	... 51 more

In my /atg/search/routing/LiveIndexingService/ component I have the following values.
ATGProfile      running  yes  yes  8000001  null  1  1  1
ATGProfileBulk  stopped  NO   yes  null     null  1  0  0
ATGOrder        running  yes  yes  8000002  null  1  4  4
ATGOrderBulk    stopped  NO   yes  null     null  1  0  0
Why are there 4 engines running for ATGOrder? I think this is what is causing the problem, but I am unable to find where these 4 engines are being created.
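The root cause in the trace ("Current supported by hosts engine count is less than required count of engines") is a simple guard: bulk indexing refuses to start when the search hosts collectively support fewer engines than the index environment requires. The invariant can be illustrated with a small standalone check; the class and method names here are purely illustrative, not ATG API:

```java
import java.util.Map;

public class EngineCountCheck {
    // Mirrors the guard that LiveIndexingService appears to apply:
    // bulk indexing cannot be prepared when the hosts collectively
    // support fewer engines than the environment requires.
    static boolean canPrepareBulkIndexing(int supportedByHosts, int required) {
        return supportedByHosts >= required;
    }

    public static void main(String[] args) {
        // Values taken from the component dump above: ATGOrder is
        // configured to require 4 engines, ATGProfile only 1.
        Map<String, Integer> required = Map.of("ATGProfile", 1, "ATGOrder", 4);
        int supportedByHosts = 1; // hypothetical: hosts only provide one engine

        required.forEach((env, need) -> System.out.println(
                env + ": " + (canPrepareBulkIndexing(supportedByHosts, need)
                        ? "ok"
                        : "supported engine count is less than required")));
    }
}
```

If this reading is right, the fix is on the configuration side: either reduce the engine count ATGOrder requires, or define enough engines across the search hosts so that at least 4 are available when the ATGOrder environment prepares for live indexing.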
Maybe you are looking for
-
Hello Community! I am working with SharePoint 2013 and I built a farm inside the firewall. Then a decision was made to move the two WFE's to the DMZ. Since that time, whenever I try to access the site collections, I get the error below. Other info
-
Hello! I defined measures for the investment support with the transaction ANVEST, and I put a maximum percentage rate for the investment support. When using the transaction ABIF to post an amount for the investment support, the system does not check
-
Modifying an swf layer with After Effects
Greetings, I'm very new to After Effects and have a question regarding an existing project that has a swf layer. We want to increase the play time from 6 seconds to 48 seconds. No problem with that. However, within the animation there is a swf that
-
Trouble emailing from PDF document
I am trying to email straight from a pdf document using pdf reader and am getting operation failed. This is only happening with one of my users.
-
Syncing iPhone 4 to a new MacBook Pro
I just got a new MacBook Pro. My old MB all of a sudden stopped syncing to my phone months ago and now none of my contacts, apps, or anything in my iCal are even on my old MB anymore. (Long story short, my hard drive crashed last year so I put anothe