Retry "Bulk Load Post Process" batch
Hi,
First question: what does the scheduled task "Bulk Load Post Process" actually do? If I am not sending out email notifications, syncing to LDAP, or generating passwords, do I still need to run this task after performing a bulk load through the utility?
Also, I ran this task and now there are some batches in the "READY FOR PROCESSING" state. How do I re-run these batches?
Thanks,
Vishal
The scheduled task carries out post-processing activities (such as password generation, notifications, and LDAP sync) on the users imported through the bulk load utility.
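For illustration only - the sketch below models how a post-process run could select the batches that are still pending. The Batch class and status strings are assumptions based on the states mentioned in the thread, not the real OIM bulk-load schema; re-running normally just means running the scheduled task again while batches sit in the "READY FOR PROCESSING" state.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: models a post-process run selecting the
// batches that are still pending. The Batch class and status strings
// are assumptions, not the real OIM bulk-load schema.
public class BatchRerunSketch {
    static class Batch {
        final long id;
        final String status;
        Batch(long id, String status) { this.id = id; this.status = status; }
    }

    // A re-run would pick up exactly the batches still in this state.
    static List<Long> readyBatchIds(List<Batch> batches) {
        List<Long> ids = new ArrayList<>();
        for (Batch b : batches) {
            if (b.status.equals("READY FOR PROCESSING")) {
                ids.add(b.id);
            }
        }
        return ids;
    }

    public static void main(String[] args) {
        List<Batch> batches = List.of(
                new Batch(1, "PROCESSED"),
                new Batch(2, "READY FOR PROCESSING"),
                new Batch(3, "READY FOR PROCESSING"));
        System.out.println(readyBatchIds(batches)); // batches 2 and 3 would be picked up again
    }
}
```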
Similar Messages
-
Issue with Bulk Load Post Process Scheduled Task
Hello,
I successfully loaded users into OIM using the bulk load utility, and I have LDAP sync ON. The documentation says to run the Bulk Load Post Process scheduled task to push the users loaded into OIM to LDAP.
This works if we run the Bulk Load Post Process scheduled task right after running the bulk load.
If some time has passed and we go back to run the Bulk Load Post Process scheduled task, some of the users loaded through the bulk load utility are not created in our LDAP system. This creates an out-of-sync situation between OIM and our LDAP.
I tried to use the usr_key as a parameter to the Bulk Load Post Process Scheduled Task without success.
Is there a way to force the re-evaluation of these users so they would get created in LDAP?
Thanks
Khanh
-
Issue with Bulk Load Post Process
Hi,
I ran the bulk load command line utility to create users in OIM. I had 5 records in my CSV file, of which 2 users were created successfully; for the rest I got exceptions because the users already existed. After that, when I run Bulk Load Post Process for LDAP sync, password generation, and notifications, it does not work even for the successfully created users. Ideally it should sync the successfully created users. However, if there are no exceptions during the bulk load command line utility, then LDAP sync works fine through Bulk Load Post Process. Any idea how to resolve this issue and sync into OID the users that were successfully created? Urgent help would be appreciated.
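A general note, not OIM-specific: the behaviour described above - one failing row blocking post-processing for the rows that did load - is the classic argument for per-record error isolation. A minimal sketch under that assumption (no real OIM APIs; the "SYNCED"/"FAILED" statuses are invented for illustration):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative sketch: process each record independently so that a
// failure (e.g. "user already exists") does not block the records
// that loaded successfully. Not OIM code.
public class PartialFailureSketch {
    static Map<String, String> processAll(List<String> users, Set<String> existing) {
        Map<String, String> results = new LinkedHashMap<>();
        for (String u : users) {
            try {
                if (existing.contains(u)) {
                    throw new IllegalStateException("user already exists: " + u);
                }
                results.put(u, "SYNCED");   // post-process this record
            } catch (RuntimeException e) {
                results.put(u, "FAILED");   // record the failure, keep going
            }
        }
        return results;
    }

    public static void main(String[] args) {
        // "bob" already exists; "amy" and "carol" should still sync.
        System.out.println(processAll(List.of("amy", "bob", "carol"), Set.of("bob")));
    }
}
```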
-
Facing issue in Bulk Load Post Process in case of blank fields in csv file
Hi,
I ran the command line bulk utility to create users in OIM through a CSV file. I left a few non-mandatory columns blank. After executing the bulk utility the users were created successfully in OIM, but when I then ran "Bulk Load Post Process" for LDAP sync, it changed the users' organization to the default organization "Xellerate Users", which is a major issue. It happens only when I leave some columns blank. Any idea why this is happening and how to resolve it? Need some urgent help!!
Hi,
Thanks for your reply. The issue was that we cannot use DOD=Y because of a dependency in the standard version, where another program fetches data from the same XML structure.
Anyway, we resolved the issue by redoing the post-installation steps below, using a new schema with an updated DTD and pointing the same logical schema at it.
1) Created a new schema for XML external database
2) Change the DTD file, keep it in the same location
3) Change properties file with new schema name
4) Test with the new schema for XML connection
5) Reverse-engineer the BaseModel ITEMBRANCH only (where we added the additional XML tags)
6) Bounce the ODI agent and client
Thanks,
Pc -
ECC 5.0 database instance install error in database load (post processing)
Hi all!
The installation environment:
VM SERVER v1.03
ECC 5.0 / WINDOWS SERVER 2003
SQL SERVER 2000 SP3
The DB instance install fails at the database load (post processing) step:
sapinst.log:
INFO 2008-10-16 14:20:54
Creating file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB\keydb.5.xml.
ERROR 2008-10-16 14:22:29
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSSEXC.log
ERROR 2008-10-16 14:24:59
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPAPPL2.log
ERROR 2008-10-16 14:24:59
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSDIC.log
ERROR 2008-10-16 14:26:30
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPPOOL.log
SAPSSEXC.log:
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: START OF LOG: 20081016142100
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#27 $ SAP
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: version R6.40/V1.4
Compiled Aug 18 2008 23:28:15
D:\usr\sap\IDS\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSSEXC.cmd -l C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSSEXC.log -loadprocedure fast
(DB) INFO: connected to DB
(DB) INFO: D010INC deleted/truncated
Interface access functions from dynamic library dbmssslib.dll loaded.
(IMP) INFO: EndFastLoad failed with <2: Bulk-copy commit unsuccessful:3624 Location: rowset.cpp:5001
Expression: pBind->FCheckForNull ()
SPID: 54
Process ID: 1120
3624 >
(IMP) ERROR: EndFastload: rc = 2
(DB) INFO: D010TAB deleted/truncated
SAPAPPL2.log
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: START OF LOG: 20081016142130
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#27 $ SAP
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: version R6.40/V1.4
Compiled Aug 18 2008 23:28:15
D:\usr\sap\IDS\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPAPPL2.cmd -l C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPAPPL2.log -loadprocedure fast
(DB) INFO: connected to DB
(DB) INFO: AGR_1250 deleted/truncated
Interface access functions from dynamic library dbmssslib.dll loaded.
(IMP) INFO: import of AGR_1250 completed (139232 rows) #20081016142200
(DB) INFO: AGR_1251 deleted/truncated
(IMP) INFO: EndFastLoad failed with <2: Bulk-copy commit unsuccessful:2627 Violation of PRIMARY KEY constraint 'AGR_1251~0'. Cannot insert duplicate key in object 'AGR_1251'.
3621 The statement has been terminated.>
(IMP) ERROR: EndFastload: rc = 2
(DB) INFO: AQLTS deleted/truncated
(IMP) INFO: import of AQLTS completed (17526 rows) #20081016142456
(DB) INFO: disconnected from DB
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: job finished with 1 error(s)
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: END OF LOG: 20081016142456
SAPSDIC.log:
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: START OF LOG: 20081016142230
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#27 $ SAP
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: version R6.40/V1.4
Compiled Aug 18 2008 23:28:15
D:\usr\sap\IDS\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSDIC.cmd -l C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSDIC.log -loadprocedure fast
(DB) INFO: connected to DB
(DB) INFO: DD08L deleted/truncated
Interface access functions from dynamic library dbmssslib.dll loaded.
(IMP) INFO: ExeFastLoad failed with <2: BCP Commit failed:3624 Location: p:\sql\ntdbms\storeng\drs\include\record.inl:1447
Expression: m_SizeRec > 0 && m_SizeRec <= MAXDATAROW
SPID: 54
Process ID: 1120
3624 >
(IMP) ERROR: ExeFastload: rc = 2
(DB) INFO: disconnected from DB
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: job finished with 1 error(s)
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: END OF LOG: 20081016142437
SAPPOOL.log:
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: START OF LOG: 20081016142530
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#27 $ SAP
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: version R6.40/V1.4
Compiled Aug 18 2008 23:28:15
D:\usr\sap\IDS\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPPOOL.cmd -l C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPPOOL.log -loadprocedure fast
(DB) INFO: connected to DB
(DB) INFO: ATAB deleted/truncated
Interface access functions from dynamic library dbmssslib.dll loaded.
(IMP) INFO: ExeFastLoad failed with <2: BCP Commit failed:3624 Location: p:\sql\ntdbms\storeng\drs\include\record.inl:1447
Expression: m_SizeRec > 0 && m_SizeRec <= MAXDATAROW
SPID: 54
Process ID: 1120
3624 >
(IMP) ERROR: ExeFastload: rc = 2
(DB) INFO: disconnected from DB
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: job finished with 1 error(s)
D:\usr\sap\IDS\SYS\exe\run/R3load.exe: END OF LOG: 20081016142621
please help! thanks!
VM SERVER v1.03
Can you please tell me what you mean by that? VMware? Is that an ESX server?
(DB) INFO: connected to DB
(DB) INFO: ATAB deleted/truncated
Interface access functions from dynamic library dbmssslib.dll loaded.
(IMP) INFO: ExeFastLoad failed with <2: BCP Commit failed:3624 Location: p:\sql\ntdbms\storeng\drs\include\record.inl:1447
Expression: m_SizeRec > 0 && m_SizeRec <= MAXDATAROW
SPID: 54
Process ID: 1120
3624 >
(IMP) ERROR: ExeFastload: rc = 2
I've seen those load problems under VMware and on boxes hosted as Xen VMs.
Markus -
Error in Database Load (post processing)
Hi, I have a problem when installing the database instance, in the step "Database Load (post processing)". Error:
ERROR 2007-01-16 18:33:23
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:33:53
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:34:23
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:34:53
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:35:23
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:35:53
MSC-01015 Process finished with error(s), check log file D:\users\bpmadmin\install/SAPVIEW.log
ERROR 2007-01-16 18:35:54
FJS-00012 Error when executing script.
I want to open the SAPVIEW.log file to check the error, but the file is 300 MB.
Help me, please.
Joná
300 MB? Are you sure? That's way too much, IMHO. As I can see, you're installing on SQL Server (= Windows). You can find a tail-like tool for Windows at http://tailforwin32.sourceforge.net ... but somehow I doubt you will manage to load the whole file...
-> Which SAP release are you installing?
GreetZ, AH -
Database load (post processing)
Hi sap gurus,
I am installing the ECC 5.0 database instance. For the past 15 hours it has been stuck at database load (post processing). Can you please tell me how long the whole installation will take?
thanks
vijay
The time it takes to do the database load is highly dependent on your system's CPU, RAM, and disk I/O.
On a real production-type server it could be several hours; on a desktop-type machine, 15 to 20 hours would not be uncommon. -
Database instance install error in database load (post processing)
Hi,
I'm facing these errors during the ECC installation. The operating system is Windows Server 2003 and the database is Oracle 9.2.
The error occurred in the database load (post processing) step:
ERROR 2009-12-19 01:23:50
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPCLUST.log
ERROR 2009-12-19 01:49:20
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSDIC.log
ERROR 2009-12-19 04:08:20
MSC-01015 Process finished with error(s), check log file C:\Program Files\sapinst_instdir\ECC_50_ABAP_NUC\DB/SAPSSEXC.log
Hi,
(RFF) ERROR: invalid checksum in data file "E:/DVD2/EXPORT2/DATA/SAPCLUST.001";
current table was "EDI40"
Your install software is corrupt...
Did you copy the software to disk? If so, did you copy it across in binary mode?
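Mark's binary-mode point can be made concrete: a text-mode copy on Windows typically rewrites LF line endings as CRLF, which changes the file's bytes and therefore any checksum computed over them. A small sketch using the JDK's CRC32 (the actual checksum algorithm R3load uses is not specified here; this is only to show the mechanism):

```java
import java.util.zip.CRC32;

// Shows why a text-mode copy breaks checksummed install media:
// translating "\n" to "\r\n" changes the bytes, so the checksum of
// the copy no longer matches the original. Pure illustration.
public class ChecksumSketch {
    static long crc(byte[] data) {
        CRC32 c = new CRC32();
        c.update(data);
        return c.getValue();
    }

    public static void main(String[] args) {
        byte[] original = "DATA\nROW\n".getBytes();
        byte[] textModeCopy = "DATA\r\nROW\r\n".getBytes(); // LF rewritten as CRLF
        System.out.println(crc(original) == crc(textModeCopy)); // the checksums no longer match
    }
}
```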
Mark -
OIM Bulk Load utility(Process Form Updates)
Hi All,
I am wondering whether the OIM bulk load utility is capable of updating accounts (target recon) in OIM (updates to the process form). I was checking the Oracle documentation and it does not say anything about updating records in OIM; it only talks about creating accounts. Please guide me.
Thanks in advance
Saravani
Edited by: 918661 on Mar 4, 2012 7:47 PM
Don't worry. The utility supports both features:
1. Trusted recon - reconcile from a trusted source (CSV, database, etc.) to the OIM user profile.
2. Target recon - reconcile target accounts to the OIM process form. For this, the corresponding users must already exist in OIM.
For further details, go through the doc below:
http://docs.oracle.com/cd/E14899_01/doc.9102/e14763/bulkload.htm#CHDCGGDA
Section "15.5 Loading Account Data" will clarify your doubt.
--nayan -
Notifications are not being sent when Bulk Load is done
Hi All,
I have an OIM 11g setup on my machine and use the bulk load utility for loading user data. In my OIM setup, notifications are sent for everything else, such as password resets and new account creation. However, when I bulk load users, notifications are not sent to their email addresses. I am running the scheduled job "Bulk Load Post Process", which is necessary so that the users are synced to the LDAP repository. I have the LDAP Sync option checked and the Notifications option set to Yes in this scheduled job. Though the users are loaded successfully and synced properly, the notifications are not sent. Can someone please guide me as to what the problem could be?
Thanks,
$id
The code is probably only called in the execute method of the event handler that sends the notification. You can check the MDS files and find the notification you are looking for, then use a code decompiler to find the class that is called. You can then use that code as a sample, or write your own notification code and create an event handler that runs in the bulk event.
On another note, there is also the system configuration property Recon.SEND_NOTIFICATION, which is set to FALSE by default.
-Kevin -
Hi Experts,
I am trying to load data to HFM using the Bulk Load option, but it doesn't work. When I change the option to SQL Insert, the load is successful. The logs say that the temp file is missing, but when I go to the specified location I see both the control file and the .tmp file. What am I missing to get bulk load working? Here's the log entry:
2009-08-19-18:48:29
User ID........... kannan
Location.......... KTEST
Source File....... \\Hyuisprd\Applications\FDM\CRHDATALD1\Inbox\OMG\HFM July2009.txt
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could not be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2009-08-19-18:48:29]
[TC] - [Amount=NN] Batch Month File Created: 07/2009
[TC] - [Amount=NN] Date File Created: 8/6/2009
[TC] - [Amount=NN] Time File Created: 08:19:06
[Blank] -
Excluded Record Count.............. 3
Blank Record Count................. 1
Total Records Bypassed............. 4
Valid Records...................... 106093
Total Records Processed............ 106097
Begin Oracle (SQL-Loader) Process (106093): [2009-08-19-18:48:41]
[RDMS Bulk Load Error Begin]
Message: (53) - File not found
See Bulk Load File: C:\DOCUME~1\fdmuser\LOCALS~1\Temp\tWkannan30327607466.tmp
[RDMS Bulk Load Error End]
Thanks
Kannan.
Hi Experts,
I am facing the data import error while importing data from .csv file to FDM-HFM application.
2011-08-29 16:19:56
User ID........... admin
Location.......... ALBA
Source File....... C:\u10\epm\DEV\epm_home\EPMSystem11R1\products\FinancialDataQuality\FDMApplication\BMHCFDMHFM\Inbox\ALBA\BMHC_Alba_Dec_2011.csv
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could not be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2011-08-29 16:19:56]
[ESD] ( ) Inter Co,Cash and bank balances,A113000,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],1
[ESD] ( ) Inter Co,"Trade receivable, prepayments and other assets",HFM128101,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],35
[ESD] ( ) Inter Co,Inventories ,HFM170003,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],69
[ESD] ( ) Inter Co,Financial assets carried at fair value through P&L,HFM241001,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],103
[Blank] -
Excluded Record Count..............4
Blank Record Count.................1
Total Records Bypassed.............5
Valid Records......................0
Total Records Processed............5
Begin SQL Insert Load Process (0): [2011-08-29 16:19:56]
Processing Complete... [2011-08-29 16:19:56]
Please help me solve the issue.
Regards,
Sudhir Sinha -
Formatting / Post Processing of Exported Excel Sheet
Hi ,
Issue:
The columns in the Excel sheet are not fully visible when we export a report from the Crystal Reports viewer to Excel. So we planned post-processing of the Excel sheet using VBA or .NET, etc., on the server side, meaning the end user never sees the formatting process, only the final report. Is it possible to achieve this?
NOTE:
1) VBA or .NET cannot run on the Linux server, but we could take advantage of the Windows server in the presentation layer.
Architecture (presentation layer):
Linux Server {Crystal RAS Server} <--> Windows .NET Server {GUI CR Viewer Application} <--> End User
Our Proposed Solution:
1) Create a GUI application that has the CR viewer in it.
2) Create a separate Excel export button on the web page.
3) When the user wants to export the report, he clicks the button.
4) The application saves the exported Excel sheet from the RAS server to the Windows server.
5) The .NET or VBA code is then applied to process/format the Excel sheet.
6) When complete, the end user is prompted to save the report to a local disk.
My questions:
1) Can this be achieved ?
2) what would be the best way to handle this ?
3) what should be the process flow ?
4) what are the things to be considered while planning for such a design ?
Regards,
Ramkumar Govindasamy
So you're looking at using the RAS .NET SDK on the application layer to export a Crystal Report to Excel, save it to a temp file on the application-layer machine, process that temp file, and stream it back out to the client web browser.
Considerations:
1. You need to create your own custom UI button to trigger the process, since the .NET Web Forms Crystal Reports viewer won't have the hooks to customize the Excel export.
2. Running Excel VBA from your web app may be problematic - you'd have to be particularly careful if the system is under load, since Excel VBA over COM-Interop isn't designed for high throughput. Under high load you may get file locking, or the COM-Interop layer may just refuse to process. It's pretty common to catch exceptions and retry when you encounter this.
You'd likely not find anyone here familiar with 2 above, but 1 is fairly common.
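The catch-and-retry approach mentioned in point 2 can be sketched generically; the Callable below is a stand-in for the real export call, not a Crystal Reports or Excel API:

```java
import java.util.concurrent.Callable;

// Generic retry wrapper for a flaky step such as an Excel COM-Interop
// export under load. The Callable stands in for the real export call.
public class RetrySketch {
    static <T> T withRetries(Callable<T> step, int maxAttempts) {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return step.call();
            } catch (Exception e) {
                last = e;   // transient failure: try again
            }
        }
        throw new RuntimeException("all " + maxAttempts + " attempts failed", last);
    }

    public static void main(String[] args) {
        int[] calls = {0};
        // Fails twice (simulating file locking), succeeds on the third try.
        String result = withRetries(() -> {
            if (++calls[0] < 3) throw new IllegalStateException("file locked");
            return "exported";
        }, 5);
        System.out.println(result + " after " + calls[0] + " attempts"); // exported after 3 attempts
    }
}
```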
Sincerely,
Ted Ueda -
OIM 11g - ldap sync - Post Process event handler 'CREATE' faillling
Hi Gurus,
We have ldap sync set up between OIM 11.1.1.5 and ODSEE 11g,
A post-process event handler on user creation sets an attribute to a random 16-character value. The event handler is triggered and sets the attribute in OIM, but in the logs I can see the error "Modification failed because user 45118 is not synchronized to the LDAP directory." and the attribute is not updated in ODSEE.
This behaviour occurs only for trusted recon, not for users created through the UI.
Not sure what exactly is happening..
Is it expected behavior??
Gurus, help me out on this.
If it fails because the event handler is unable to produce the random number, then verify the following:
Is the event handler code being executed during trusted recon? Verify this in the log.
There are two methods in an event handler: execute and bulk execute. execute is called from the UI, and bulk execute is called for trusted recon.
Either put your code in bulk execute, or update the batch recon size system property to 1 so that it behaves like the UI. The default batch size is 500.
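The execute vs. bulk execute split described above can be sketched as a dispatch rule; the Handler interface and method names here are illustrative, not the actual oracle.iam event-handler API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the dispatch described in the thread: batch size 1 behaves
// like the UI path (execute per user); larger batches go through
// bulkExecute. Hypothetical interface, not the real OIM API.
public class EventHandlerDispatchSketch {
    interface Handler {
        String execute(String user);
        List<String> bulkExecute(List<String> users);
    }

    static List<String> reconcile(List<String> users, int batchSize, Handler h) {
        if (batchSize <= 1) {
            List<String> out = new ArrayList<>();
            for (String u : users) out.add(h.execute(u));  // UI-like path
            return out;
        }
        return h.bulkExecute(users);                        // bulk recon path
    }

    public static void main(String[] args) {
        Handler h = new Handler() {
            public String execute(String u) { return u + ":single"; }
            public List<String> bulkExecute(List<String> us) {
                List<String> out = new ArrayList<>();
                for (String u : us) out.add(u + ":bulk");
                return out;
            }
        };
        // With the default batch size, only bulkExecute runs;
        // with batch size 1, execute runs per user.
        System.out.println(reconcile(List.of("u1", "u2"), 500, h));
        System.out.println(reconcile(List.of("u1", "u2"), 1, h));
    }
}
```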
--nayan -
Hi,
I'm using Oracle Endeca 2.3.
I encountered a problem in Data Integrator: some batches of records were missing in the front end, yet when I checked the status of the graph it showed "Graph executed successfully".
So I connected the bulk loader to a "Universal data writer" to see the data domain status of the bulk load.
I've listed the results below; however, I'm not able to interpret the information in the status, and I've looked at the documentation but found nothing useful.
0|10000|0|In progress
0|11556|0|In progress
0|20000|0|In progress
0|30000|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
40009|-9|0|In progress
40009|9991|0|In progress
40009|19991|0|In progress
40009|20846|0|In progress
Could anyone enlighten me about this status?
Also, since these messages are part of the "post load", I'm wondering why it is still showing "In progress".
Cheers,
Khurshid
I assume there was nothing of note in the dgraph.log?
The other option is to see what happens when you either:
A) filter your data down to the records that are missing prior to the load and see what happens, or
B) use the regular data ingest API rather than the bulk loader.
Option B will definitely perform much worse on 2.3, so it may not be feasible.
The other thing to check is that your record spec is truly unique. The only time I can remember seeing an issue like this was when a record was loaded and then a different record with the same spec value was loaded. The first record would get in and then be overwritten by the second, making it seem like the first record was dropped. Figured it would be worth checking.
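The duplicate-spec scenario is easy to reproduce in miniature: any store keyed on the record spec silently overwrites, so the earlier record seems to have been dropped even though both loads "succeeded". A sketch:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Miniature of the duplicate record-spec effect: keyed storage keeps
// only the last record per spec value, so an earlier record with the
// same spec appears to have been dropped.
public class RecordSpecSketch {
    static Map<String, String> load(List<String[]> records) {
        Map<String, String> store = new LinkedHashMap<>();
        for (String[] r : records) {
            store.put(r[0], r[1]);   // r[0] = spec value, r[1] = payload
        }
        return store;
    }

    public static void main(String[] args) {
        Map<String, String> store = load(List.of(
                new String[]{"rec-1", "first version"},
                new String[]{"rec-2", "other record"},
                new String[]{"rec-1", "second version"})); // same spec as rec-1
        System.out.println(store.size());        // 2, not 3
        System.out.println(store.get("rec-1"));  // second version
    }
}
```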
Patrick Rafferty
Branchbird -
I am using v11.2 with the new Jena adapter.
I am trying to upload data from a bunch of N-Triples files to the triple store via the bulk load interface in the Jena adapter, a.k.a. bulk append. The code does something like this:
while (more files exist) {
    read files into memory;
    bulk-load to database using the options "MBV_JOIN_HINT=USE_HASH PARALLEL=4";
}
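Fleshed out slightly (with a counting stand-in for the loader - the real call would be the Jena adapter's bulk append with the MBV_JOIN_HINT/PARALLEL options), the loop amounts to: read a chunk of files into memory, bulk-load that chunk, repeat:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Stand-in for the chunked bulk-append loop in the question. The
// "loader" here just counts triples; the real call would be the Jena
// adapter's bulk append (e.g. addInBulk) with the hint options.
public class ChunkedLoadSketch {
    static int loadAll(List<List<String>> files, int filesPerChunk) {
        int loaded = 0;
        Deque<List<String>> remaining = new ArrayDeque<>(files);
        while (!remaining.isEmpty()) {                     // while more files exist
            List<String> triples = new ArrayList<>();
            for (int i = 0; i < filesPerChunk && !remaining.isEmpty(); i++) {
                triples.addAll(remaining.poll());          // read files into memory
            }
            loaded += triples.size();                      // "bulk-load" this chunk
        }
        return loaded;
    }

    public static void main(String[] args) {
        List<List<String>> files = List.of(
                List.of("t1", "t2"), List.of("t3"), List.of("t4", "t5"));
        System.out.println(loadAll(files, 2)); // 5 triples loaded in two chunks
    }
}
```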
Loading the first set of triples goes well. But when I try to load the second set of triples, I get the exception below.
Some thoughts:
1) I don't think this is a data problem, because I uploaded all the data during an earlier test, and when I upload the same data on an empty database it works fine.
2) I saw some earlier posts with a similar error, but none of them seem to be using the Jena adapter.
3) The model also has OWL Prime entailment in incremental mode.
4) I am not sure if this is relevant, but before I ran the current test I mistakenly launched multiple Java processes that bulk loaded the data. Of course I killed all the processes and dropped the sem_models and the backing RDF tables they were uploading to.
EXCEPTION
java.sql.SQLException: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at "MDSYS.SDO_RDF_INTERNAL", line 3164
ORA-06512: at "MDSYS.SDO_RDF_INTERNAL", line 4244
ORA-06512: at "MDSYS.SDO_RDF", line 276
ORA-06512: at "MDSYS.RDF_APIS", line 693
ORA-06512: at line 1
at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:131)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:204)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1034)
at oracle.jdbc.driver.T4CCallableStatement.doOall8(T4CCallableStatement.java:191)
at oracle.jdbc.driver.T4CCallableStatement.executeForRows(T4CCallableStatement.java:950)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1222)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3387)
at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3488)
at oracle.jdbc.driver.OracleCallableStatement.execute(OracleCallableStatement.java:3840)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1086)
at oracle.spatial.rdf.client.jena.Oracle.executeCall(Oracle.java:689)
at oracle.spatial.rdf.client.jena.OracleBulkUpdateHandler.addInBulk(OracleBulkUpdateHandler.java:740)
at oracle.spatial.rdf.client.jena.OracleBulkUpdateHandler.addInBulk(OracleBulkUpdateHandler.java:463)
at oracleuploadtest.OracleUploader.loadModelToDatabase(OracleUploader.java:84)
at oracleuploadtest.RunOracleUploadTest.main(RunOracleUploadTest.java:81)
thanks!
Ram.
The addInBulk method needs to be called twice to trigger the bug. Here is a test case that passes only while the bug is present! (It is there to remind me to remove the workaround code when the fix reaches my code.)
@Test
public void testThatOracleBulkBugIsNotYetFixed() throws SQLException {
    char nm[] = new char[22 - TestDataUtils.getUserID().length() - TestOracleHelper.ORACLE_USER.length()];
    Arrays.fill(nm, 'A');
    TestOracleHelper helper = new TestOracleHelper(new String(nm)); // actual name is TestDataUtils.getUserID() + "_" + nm
    GraphOracleSem og = helper.createGraph();
    Node n = RDF.value.asNode();
    Triple triples[] = new Triple[] { new Triple(n, n, n) };
    try {
        og.getBulkUpdateHandler().addInBulk(triples, null);
        // Oracle bug hits on the second call:
        og.getBulkUpdateHandler().addInBulk(triples, null);
    } catch (SQLException e) {
        if (e.getErrorCode() == 6502) {
            return; // we have a work-around for this expected error
        }
        throw e; // some other problem
    }
    Assert.fail("It seems that an Oracle update (has the ora jar been updated?) resolves a silly bug - please modify BulkLoaderExportMode");
}
Jeremy