Help with bulk load
I'm new to BULK COLLECT/FORALL. I can't get the FORALL section to work. Can someone help me out, please? Thanks.
create or replace procedure nrows_at_a_time(p_array_size in number)
as
cursor cursor1 is select * from dbo1.mars_voice_usage_bill;
type cursor1_type is table of cursor1%rowtype;
mars_voice_usage_bill_rec cursor1_type;
begin
open cursor1;
loop
fetch cursor1 bulk collect into mars_voice_usage_bill_rec limit p_array_size;
forall i in 1 .. mars_voice_usage_bill_rec.last
update dbo1.mars_voice_usage_bill
set air_chg = mars_voice_usage_bill_rec.toll_chg(i)
where air_chg = 0;
exit when cursor1%notfound;
end loop;
end;
Errors for PROCEDURE NROWS_AT_A_TIME:
LINE/COL ERROR
11/8 PL/SQL: SQL Statement ignored
12/24 PL/SQL: ORA-00904: "MARS_VOICE_USAGE_BILL_REC"."TOLL_CHG":
invalid identifier
12/50 PLS-00302: component 'TOLL_CHG' must be declared
12/50 PLS-00302: component 'TOLL_CHG' must be declared
Thanks for the reply. The SQL below is just testing the concept. I'm trying to use a simple example where UPDATE, BULK COLLECT and FORALL are used together. There will be more logic added later. Still getting errors.
"afiedt.buf" 21 lines, 537 characters
1 CREATE OR REPLACE PROCEDURE nrows_at_a_time(p_array_size IN NUMBER)
2 AS
3 CURSOR cursor1
4 IS
5 SELECT *
6 FROM dbo1.mars_voice_usage_bill;
7 TYPE cursor1_type IS TABLE OF cursor1%ROWTYPE;
8 mars_voice_usage_bill_rec cursor1_type;
9 BEGIN
10 OPEN cursor1;
11 LOOP
12 FETCH cursor1
13 BULK COLLECT INTO mars_voice_usage_bill_rec LIMIT p_array_size;
14 FORALL i IN 1 .. mars_voice_usage_bill_rec.LAST
15 UPDATE dbo1.mars_voice_usage_bill
16 SET air_chg = mars_voice_usage_bill_rec(i).toll_chg
17 WHERE air_chg = 0;
18 EXIT WHEN cursor1%NOTFOUND;
19 END LOOP;
20* END;
osreuat>/
Warning: Procedure created with compilation errors.
osreuat>show errors
Errors for PROCEDURE NROWS_AT_A_TIME:
LINE/COL ERROR
15/4 PL/SQL: SQL Statement ignored
16/19 PL/SQL: ORA-22806: not an object or REF
16/19 PLS-00382: expression is of wrong type
16/19 PLS-00436: implementation restriction: cannot reference fields of
BULK In-BIND table of records
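PLS-00436 is an implementation restriction in some Oracle releases: inside FORALL you cannot reference individual fields of a table of records, such as mars_voice_usage_bill_rec(i).toll_chg. A common workaround, sketched below, is to fetch the needed columns into separate scalar collections. The ROWID predicate is an added assumption that each fetched row should update itself; the original WHERE air_chg = 0 would re-update every matching row on each iteration.

```sql
CREATE OR REPLACE PROCEDURE nrows_at_a_time(p_array_size IN NUMBER)
AS
  -- Fetch only what the UPDATE needs: the row address and toll_chg
  CURSOR cursor1 IS
    SELECT rowid, toll_chg FROM dbo1.mars_voice_usage_bill;
  TYPE rowid_tab IS TABLE OF ROWID;
  TYPE chg_tab   IS TABLE OF dbo1.mars_voice_usage_bill.toll_chg%TYPE;
  v_rowids rowid_tab;
  v_tolls  chg_tab;
BEGIN
  OPEN cursor1;
  LOOP
    FETCH cursor1 BULK COLLECT INTO v_rowids, v_tolls LIMIT p_array_size;
    FORALL i IN 1 .. v_rowids.COUNT   -- COUNT, not LAST: safe when empty
      UPDATE dbo1.mars_voice_usage_bill
         SET air_chg = v_tolls(i)
       WHERE rowid = v_rowids(i)
         AND air_chg = 0;
    EXIT WHEN cursor1%NOTFOUND;       -- exit after processing the last batch
  END LOOP;
  CLOSE cursor1;
END;
/
```

Separate collections avoid the records-in-FORALL restriction in all versions.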
Similar Messages
-
Does setting the primary key deferred help in bulk loading?
Hi,
Does anyone know whether setting the primary key deferred helps bulk loading in terms of performance? I do not want to disable the index, because when users query the existing records in the table, that would affect the search query.
Thank You...
In the Oracle 8.0 documentation, when deferred constraints were introduced, Oracle stated that deferring testing of the PK constraint until commit time was more efficient than testing each constraint at the time of insert.
I have never tested this assertion.
In order to create a deferred PK constraint the index used to support the PK must be created as non-unique.
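As a sketch of the point above (table, index, and constraint names are hypothetical), a deferrable PK is created over a non-unique index and then deferred in the load session:

```sql
CREATE TABLE demo_t (id NUMBER, val VARCHAR2(30));
CREATE INDEX demo_t_pk_ix ON demo_t (id);          -- non-unique index

ALTER TABLE demo_t
  ADD CONSTRAINT demo_t_pk PRIMARY KEY (id)
  USING INDEX demo_t_pk_ix
  DEFERRABLE INITIALLY IMMEDIATE;

-- In the bulk-load session, postpone constraint checking until COMMIT:
SET CONSTRAINT demo_t_pk DEFERRED;
```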
HTH -- Mark D Powell -- -
Please HELP! Issue with bulk load in FDM 11.1.2.1
Please assist with a solution to the following error!
See log below
** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:39] **
ERROR:
Code............................................. -2147217900
Description...................................... You do not have permission to use the bulk load statement.
BULK INSERT POLFDM..tWkalnic158050364335 FROM N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic158050364335.tmp' WITH (FORMATFILE = N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic158050364335.fmt',DATAFILETYPE = N'widechar',ROWS_PER_BATCH=221593,TABLOCK)
Procedure........................................ clsDataManipulation.fExecuteDML
Component........................................ upsWDataWindowDM
Version.......................................... 1112
Thread........................................... 5036
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:40] **
ERROR:
Code............................................. -2147217900
Description...................................... Data access error.
Procedure........................................ clsImpDataPump.fImportTextFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 5036
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:40] **
ERROR:
Code............................................. -2147217900
Description...................................... Data access error.
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 5036
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:38] **
ERROR:
Code............................................. -2147217900
Description...................................... You do not have permission to use the bulk load statement.
BULK INSERT POLFDM..tWkalnic46564644597 FROM N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic46564644597.tmp' WITH (FORMATFILE = N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic46564644597.fmt',DATAFILETYPE = N'widechar',ROWS_PER_BATCH=221593,TABLOCK)
Procedure........................................ clsDataManipulation.fExecuteDML
Component........................................ upsWDataWindowDM
Version.......................................... 1112
Thread........................................... 4644
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:38] **
ERROR:
Code............................................. -2147217900
Description...................................... Data access error.
Procedure........................................ clsImpDataPump.fImportTextFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 4644
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:39] **
ERROR:
Code............................................. -2147217900
Description...................................... Data access error.
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 4644
IDENTIFICATION:
User............................................. kalnickim
Computer Name.................................... POCHFM04
App Name......................................... POLFDM
Client App....................................... WebClient
CONNECTION:
Provider......................................... SQLOLEDB
Data Server...................................... pochfmsql01\hfm
Database Name.................................... POLFDM
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... BW
Location ID...................................... 751
Location Seg..................................... 5
Category......................................... ActSeg
Category ID...................................... 38
Period........................................... Sep - 2011
Period ID........................................ 9/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
Have you read the installation documentation? It appears that you did not take the time to do a basic level of troubleshooting. A simple Google search of the error message provides the root problem as well as the solution.
The forums are intended to be used when you have exhausted other options. Please be mindful of this and of contributors' time when posting further questions.
I have attached a google search result for the error "You do not have permission to use the bulk load statement."
http://www.google.com/#sclient=psy-ab&hl=en&safe=off&site=&source=hp&q=+You+do+not+have+permission+to+use+the+bulk+load+statement.&pbx=1&oq=+You+do+not+have+permission+to+use+the+bulk+load+statement.&aq=f&aqi=g4&aql=&gs_sm=e&gs_upl=1556l1556l0l2633l1l1l0l0l0l0l184l184l0.1l1l0&bav=on.2,or.r_gc.r_pw.r_cp.,cf.osb&fp=ebaa3ff8b466872e&biw=1920&bih=955 -
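For reference, the error "You do not have permission to use the bulk load statement" generally means the SQL Server login used by FDM lacks the server-level bulk-load right. A sketch of the usual grant (the login name below is hypothetical):

```sql
USE master;
-- Server-level right needed by BULK INSERT; on older SQL Server releases
-- the equivalent is: EXEC sp_addsrvrolemember 'DOMAIN\fdm_svc', 'bulkadmin';
GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\fdm_svc];
```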
Hi Experts,
I am trying to load data to HFM using the Bulk Load option but it doesn't work. When I change the option to SQL Insert, the loading is successful. The logs say that the temp file is missing, but when I go to the specified location, I see the control file and the tmp file. What am I missing to have bulk load working? Here's the log entry.
2009-08-19-18:48:29
User ID........... kannan
Location.......... KTEST
Source File....... \\Hyuisprd\Applications\FDM\CRHDATALD1\Inbox\OMG\HFM July2009.txt
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2009-08-19-18:48:29]
[TC] - [Amount=NN] Batch Month File Created: 07/2009
[TC] - [Amount=NN] Date File Created: 8/6/2009
[TC] - [Amount=NN] Time File Created: 08:19:06
[Blank] -
Excluded Record Count.............. 3
Blank Record Count................. 1
Total Records Bypassed............. 4
Valid Records...................... 106093
Total Records Processed............ 106097
Begin Oracle (SQL-Loader) Process (106093): [2009-08-19-18:48:41]
[RDMS Bulk Load Error Begin]
Message: (53) - File not found
See Bulk Load File: C:\DOCUME~1\fdmuser\LOCALS~1\Temp\tWkannan30327607466.tmp
[RDMS Bulk Load Error End]
Thanks
Kannan.
-
Hi Experts,
I am facing a data import error while importing data from a .csv file into an FDM-HFM application.
2011-08-29 16:19:56
User ID........... admin
Location.......... ALBA
Source File....... C:\u10\epm\DEV\epm_home\EPMSystem11R1\products\FinancialDataQuality\FDMApplication\BMHCFDMHFM\Inbox\ALBA\BMHC_Alba_Dec_2011.csv
Processing Codes:
BLANK............. Line is blank or empty.
ESD............... Excluded String Detected, SKIP Field value was found.
NN................ Non-Numeric, Amount field contains non numeric characters.
RFM............... Required Field Missing.
TC................ Type Conversion, Amount field could be converted to a number.
ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
Create Output File Start: [2011-08-29 16:19:56]
[ESD] ( ) Inter Co,Cash and bank balances,A113000,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],1
[ESD] ( ) Inter Co,"Trade receivable, prepayments and other assets",HFM128101,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],35
[ESD] ( ) Inter Co,Inventories ,HFM170003,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],69
[ESD] ( ) Inter Co,Financial assets carried at fair value through P&L,HFM241001,Actual,Alba,Dec,2011,MOF,MOF,,YTD,Input_Default,[NONE],[NONE],[NONE],103
[Blank] -
Excluded Record Count..............4
Blank Record Count.................1
Total Records Bypassed.............5
Valid Records......................0
Total Records Processed............5
Begin SQL Insert Load Process (0): [2011-08-29 16:19:56]
Processing Complete... [2011-08-29 16:19:56]
Please help me solve the issue.
Regards,
Sudhir Sinha -
Issue with Bulk Load Post Process
Hi,
I ran the bulk load command line utility to create users in OIM. I had 5 records in my csv file, out of which 2 users were successfully created in OIM; for the rest I got exceptions because the users already existed. After that, if I run the bulk load post process to do the LDAP sync, generate the passwords and send notifications, it does not work even for the successfully created users. Ideally it should sync the successfully created users. However, if there are no exceptions during the bulk load command line utility, the LDAP sync works fine through the bulk load post process. Any idea how to resolve this issue and sync the users in OID which were successfully created? Urgent help would be appreciated.
The scheduled task carries out post-processing activities on the users imported through the bulk load utility.
-
Bulk loading BLOBs using PL/SQL - is it possible?
Hi -
Does anyone have a good reference article or example of how I can bulk load BLOBs (videos, images, audio, office docs/pdf) into the database using PL/SQL?
Every example I've ever seen in PL/SQL for loading BLOBs does a COMMIT after each file loaded, which doesn't seem very scalable.
Can we pass in an array of BLOBs from the application, into PL/SQL and loop through that array and then issue a commit after the loop terminates?
Any advice or help is appreciated. Thanks
LJ
It is easy enough to modify the example to commit every N files. If you are loading large amounts of media, I think that you will find that the time to load the media is far greater than the time spent in SQL statements doing inserts or retrieves. Thus, I would not expect to see any significant benefit from changing the example to use PL/SQL collection types in order to do bulk row operations.
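A minimal sketch of the commit-every-N-files idea using DBMS_LOB.LOADFROMFILE; the table, directory object, and file list are hypothetical, and each file must be visible from the database server through the Oracle directory object:

```sql
DECLARE
  v_bfile BFILE;
  v_blob  BLOB;
  v_files DBMS_SQL.VARCHAR2_TABLE;      -- file names; populate for real use
  c_batch CONSTANT PLS_INTEGER := 100;  -- commit every N files
BEGIN
  v_files(1) := 'img001.jpg';           -- placeholder entry
  FOR i IN 1 .. v_files.COUNT LOOP
    INSERT INTO media_tab (id, content)
    VALUES (i, EMPTY_BLOB())
    RETURNING content INTO v_blob;
    v_bfile := BFILENAME('MEDIA_DIR', v_files(i));
    DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
    DBMS_LOB.LOADFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile));
    DBMS_LOB.CLOSE(v_bfile);
    IF MOD(i, c_batch) = 0 THEN
      COMMIT;                           -- commit every N files, not each one
    END IF;
  END LOOP;
  COMMIT;                               -- pick up the final partial batch
END;
/
```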
If your goal is high performance bulk load of binary content then I would suggest that you look to use Sqlldr. A PL/SQL program loading from BFILEs is limited to loading files that are accessible from the database server file system. Sqlldr can do this but it can also load data from a remote client. Sqlldr has parameters to control batching of operations.
See section 7.3 of the Oracle Multimedia DICOM Developer's Guide for the example Loading DICOM Content Using the SQL*Loader Utility. You will need to adapt this example to the other Multimedia objects (ORDImage, ORDAudio .. etc) but the basic concepts are the same.
Once the binary content is loaded into the database, you will need to write a program to loop over the new content and initialize the Multimedia objects (extract attributes). The example in 7.3 contains a sample program that does this for the ORDDicom object. -
Hello,
I have one question regarding bulk loading. I have done a lot of bulk loading.
But my requirement is to call a function which will do some DML operations and return a ref key so that I can insert into the fact table.
I can't call a function that does DML in a SELECT statement (that will give an error). The other way is using an autonomous transaction, which I tried; it works, but performance is very slow.
How do I call this function inside the bulk loading process?
Help !!
xx_f is a function which uses an autonomous transaction.
See my sample code
declare
cursor c1 is select a,b,c from xx;
type l_a is table of xx.a%type;
type l_b is table of xx.b%type;
type l_c is table of xx.c%type;
v_a l_a;
v_b l_b;
v_c l_c;
begin
open c1;
loop
fetch c1 bulk collect into v_a,v_b,v_c limit 1000;
exit when c1%notfound;
begin
forall i in 1..v_a.count
insert into xxyy
(a,b,c) values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
commit;
end;
end loop;
close c1;
end;
I just want to call the xx_f function without an autonomous transaction, but with bulk loading. Please let me know if you need more details.
Thanks
yreddyr
Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
Depending on what it does, an alternative could be something like:
DECLARE
CURSOR c1 IS
SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
TYPE l_a IS TABLE OF whatever xx_f returns;
TYPE l_b IS TABLE OF whatever xx_f returns;
TYPE l_c IS TABLE OF whatever xx_f returns;
v_a l_a;
v_b l_b;
v_c l_c;
BEGIN
OPEN c1;
LOOP
FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
BEGIN
FORALL i IN 1..v_a.COUNT
INSERT INTO xxyy (a, b, c)
VALUES (v_a(i), v_b(i), v_c(i));
END;
EXIT WHEN c1%NOTFOUND;
END LOOP;
CLOSE c1;
END;
John -
Using API to run Catalog Bulk Load - Items & Price Lists concurrent prog
Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with the item loading concurrent program. For one thing, the program is stuck in phase code P (Pending) status.
When I run the same concurrent program using the iProcurement Administration page it runs ok.
Has anyone been able to run this program through the backend? If so, any help is appreciated.
Thanks
Hello S.P,
Basically this is what I am trying to achieve.
1. Create a staging table. The columns available for it are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency.
So basically the user can load item details into the database from an excel sheet.
2. Use the utl_file API to create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items in iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
3. Use the API fnd_request.submit_request to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists'. This is where I am stuck. The process simply says pending or comes up with an error saying:
oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
I'm wondering if anyone has used my approach to load items before and if so, have they been successful?
Thank you -
How to improve performance for Azure Table Storage bulk loads
Hello all,
Would appreciate your help as we are facing a challenge.
We are trying to bulk load Azure Table Storage. We have a file that contains nearly 2 million rows.
We would need to reach a point where we could bulk load 100,000-150,000 entries per minute. Currently, it takes more than 10 hours to process the file.
We have tried Parallel.ForEach but it doesn't help. Today I discovered partitioning in PLINQ. Would that be the way to go?
Any ideas? I have spent nearly two days trying to optimize it using PLINQ, but I am still not sure what the best thing to do is.
Kindly, note that we shouldn't be using SQL/Azure SQL for this.
I would really appreciate your help.
Thanks
I'd think you're just pooling the parallel connections to Azure, if you do it on one system. You'd also have a bottleneck of round trip time from you, through the internet to Azure and back again.
You could speed it up by moving the data file to the cloud and process it with a Cloud worker role. That way you'd be in the datacenter (which is a much faster, more optimized network.)
Or, if that's not fast enough - if you can split the data so multiple WorkerRoles could each process part of the file, you can use the VM's scale to put enough machines to it that it gets done quickly.
Darin R. -
Hi,
I have a file where fields are wrapped with ".
=========== file sample
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
==========
I have a .NET method to remove the wrap characters and write out a file without them.
======================
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
======================
The .NET code is here.
========================================
public static string RemoveCharacter(string sFileName, char cRemoveChar)
{
    object objLock = new object();
    // Build the output name once; the original code generated a second,
    // different Guid in the return statement, so it returned a name that
    // was never written.
    string sOutName = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
    FileStream objInputFile = null, objOutFile = null;
    lock (objLock)
    {
        try
        {
            objInputFile = new FileStream(sFileName, FileMode.Open);
            objOutFile = new FileStream(sOutName, FileMode.Create);
            int nByteRead;
            // Copy every byte except the character being removed
            while ((nByteRead = objInputFile.ReadByte()) != -1)
            {
                if (nByteRead != (int)cRemoveChar)
                    objOutFile.WriteByte((byte)nByteRead);
            }
        }
        finally
        {
            if (objInputFile != null) objInputFile.Close();
            if (objOutFile != null) objOutFile.Close();
        }
    }
    return sOutName;
}
==================================
However, when I run the bulk load utility I get the error
=======================================
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
==========================================
the bulk insert statement is as follows
=========================================
BULK INSERT Temp
FROM '<file name>' WITH
FIELDTERMINATOR = ','
, KEEPNULLS
==========================================
Does anybody know what is happening and what needs to be done?
PLEASE HELP
Thanks in advance
Vikram
To load that file with BULK INSERT, use this format file:
9.0
4
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 Latin1_General_CI_AS
3 SQLCHAR 0 0 "\",\"" 2 col2 Latin1_General_CI_AS
4 SQLCHAR 0 0 "\"\r\n" 3 col3 Latin1_General_CI_AS
Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
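Applying that format file might look like this (the file paths are hypothetical, with the format file above saved as quotes.fmt):

```sql
BULK INSERT Temp
FROM 'C:\data\input.txt'
WITH (FORMATFILE = 'C:\data\quotes.fmt', KEEPNULLS);
```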
Or, since you already have a .NET program, use a stored procedure with table-valued parameter instead. I have an example of how to do this here:
http://www.sommarskog.se/arrays-in-sql-2008.html
Erland Sommarskog, SQL Server MVP, [email protected] -
Error when Bulk load hierarchy data
Hi,
While loading P6 Reporting databases, the following error message appears at the step in charge of bulk loading hierarchy data into ODS.
<04.29.2011 14:03:59> load [INFO] (Message) - === Bulk load hierarchy data into ODS (ETL_LOADWBSHierarchy.ldr)
<04.29.2011 14:04:26> load [INFO] (Message) - Load completed - logical record count 384102.
<04.29.2011 14:04:26> load [ERROR] (Message) - SqlLoaderSQL LOADER ACTION FAILED. [control=D:\oracle\app\product\11.1.0\db_1\p6rdb\scripts\DATA_WBSHierarchy.csv.ldr] [file=D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv]
<04.29.2011 14:04:26> load [INFO] (Progress) - Step 3/9 Part 5/6 - FAILED (-1) (0 hours, 0 minutes, 28 seconds, 16 milliseconds)
Checking the corresponding error log file (see below), I see that some records are indeed rejected. The question is: how can I identify the source of the problem and fix it?
SQL*Loader: Release 11.1.0.6.0 - Production on Mon May 2 09:03:22 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: DATA_WBSHierarchy.csv.ldr
Character Set UTF16 specified for all input.
Using character length semantics.
Byteorder little endian specified.
Data File: D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv
Bad File: DATA_WBSHierarchy.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table WBSHIERARCHY, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
PARENTOBJECTID FIRST * WHT CHARACTER
PARENTPROJECTID NEXT * WHT CHARACTER
PARENTSEQUENCENUMBER NEXT * WHT CHARACTER
PARENTNAME NEXT * WHT CHARACTER
PARENTID NEXT * WHT CHARACTER
CHILDOBJECTID NEXT * WHT CHARACTER
CHILDPROJECTID NEXT * WHT CHARACTER
CHILDSEQUENCENUMBER NEXT * WHT CHARACTER
CHILDNAME NEXT * WHT CHARACTER
CHILDID NEXT * WHT CHARACTER
PARENTLEVELSBELOWROOT NEXT * WHT CHARACTER
CHILDLEVELSBELOWROOT NEXT * WHT CHARACTER
LEVELSBETWEEN NEXT * WHT CHARACTER
CHILDHASCHILDREN NEXT * WHT CHARACTER
FULLPATHNAME NEXT 8000 WHT CHARACTER
SKEY SEQUENCE (MAX, 1)
value used for ROWS parameter changed from 64 to 21
Record 14359: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 14360: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 14361: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27457: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 27458: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27459: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38775: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 38776: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38777: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52411: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 52412: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52413: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 114619: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 114620: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 127921: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 127922: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 164588: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 164589: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 171322: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 171323: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 186779: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 186780: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 208687: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 208688: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 221167: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 221168: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 246951: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 246952: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Table WBSHIERARCHY:
  384074 Rows successfully loaded.
  28 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.
Space allocated for bind array: 244377 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 384102
Total logical records rejected: 28
Total logical records discarded: 0
Run began on Mon May 02 09:03:22 2011
Run ended on Mon May 02 09:04:07 2011
Elapsed time was: 00:00:44.99

Hi Mandeep,
Thanks for the information.
But it still does not seem to work.
Actually, I have Group ID and Group Name as display fields in the hierarchy table.
Group ID I have mapped directly to Group ID.
I have created a split hierarchy of Group Name and mapped it.
I have also made all the option configurations as per your suggestions, but it still does not work.
Can you please help?
Thanks,
Priya. -
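The ORA-01722 and ORA-01400 rejections in the SQL*Loader log above can usually be caught before the load by scanning the data file for non-numeric or empty values. A minimal sketch in Python; the column positions and the two-column row shape are assumptions, so adjust them to match the actual control file:

```python
# Pre-load check for the two rejection causes seen in the log:
# ORA-01722 (non-numeric PARENTOBJECTID) and ORA-01400 (NULL
# PARENTLEVELSBELOWROOT). Column positions are assumptions; adjust
# them to match the actual control file.
PARENTOBJECTID_COL = 0
PARENTLEVELSBELOWROOT_COL = 1

def find_bad_rows(rows):
    """Return (line_number, reason) pairs for rows SQL*Loader would reject."""
    bad = []
    for lineno, row in enumerate(rows, start=1):
        oid = row[PARENTOBJECTID_COL].strip()
        levels = row[PARENTLEVELSBELOWROOT_COL].strip()
        # A non-empty value that is not an integer would raise ORA-01722.
        if oid and not oid.lstrip("-").isdigit():
            bad.append((lineno, "non-numeric PARENTOBJECTID: " + oid))
        # An empty value would be loaded as NULL and raise ORA-01400.
        if not levels:
            bad.append((lineno, "NULL PARENTLEVELSBELOWROOT"))
    return bad

# Example with in-memory rows; in practice, read the file with csv.reader.
for lineno, reason in find_bad_rows([["123", "2"], ["abc", "1"], ["456", ""]]):
    print("line %d: %s" % (lineno, reason))
```

Running this against the real data file before the load points you at the exact input lines behind each rejected record.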
Please let me know about SAP APIs that would enable a bulk loading mechanism instead of the serial method of loading records one by one.
Hi,
You can migrate all the data with either of the two methods I mentioned:
1. Function module CRM_ORDER_MAINTAIN, or SAP CRM BOL programming.
2. Middleware (BDoc).
This FM is like a BAPI for maintaining data in R/3.
I think the FM is good for those who don't know BOL programming.
Check the mapping fields available in the FM.
Hope it helps.
Thanks and Regards
Alok -
OIM Bulk Load: Insufficient privileges
Hi All,
I'm trying to use the OIM Bulk Load Utility and I keep getting this error message:
Exception in thread "main" java.sql.SQLException: ORA-01031: insufficient privileges
ORA-06512: at "OIMUSER.OIM_BLKLD_SP_CREATE_LOG", line 39
ORA-06512: at "OIMUSER.OIM_BLKLD_PKG_USR", line 281
I've followed the instructions and gone over everything a few times. The utility tests the connection to the database OK.
I don't know much about Oracle databases, so I'm not sure how to do even basic troubleshooting. Could I just give my OIMUSER full permissions? Shouldn't it have full permissions as it is?
I did have to create a tablespace for this utility; maybe OIMUSER needs to be given access to it? I have no idea...
Any help would be greatly appreciated!
Alex

Even I got the same error; at that time the DB OIM user had the following permissions:
CREATE TABLE
CREATE VIEW
QUERY REWRITE
UNLIMITED TABLESPACE
EXECUTE ON SYS.DBMS_SHARED_POOL
EXECUTE ON SYS.DBMS_SYSTEM
SELECT ON SYS.DBA_2PC_PENDING
SELECT ON SYS.DBA_PENDING_TRANSACTIONS
SELECT ON SYS.PENDING_TRANS$
SELECT ON SYS.V$XATRANS$
CONNECT
RESOURCE
Later the DBA granted the following additional privileges and it worked like a charm:
CREATE ANY INDEX
CREATE ANY SYNONYM
CREATE ANY TRIGGER
CREATE ANY TYPE
CREATE DATABASE LINK
CREATE JOB
CREATE LIBRARY
CREATE MATERIALIZED VIEW
CREATE PROCEDURE
CREATE SEQUENCE
CREATE TABLE
CREATE TRIGGER
CREATE VIEW -
Error in endeca search module-bulk load error
I have added product-catalog-output-config.xml and category-dim-output-config.xml to my project in the respective paths as in DCS. The XMLs are combining when I view them in dyn/admin.
But while running the baseline index, pre-indexing succeeds but RepositoryTypeDimensionExporter fails, and the errors shown in the command prompt are these:
11:14:48,710 INFO [ProductCatalogOutputConfig] Starting bulk load
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() not yet implemented.
11:14:48,710 INFO [CategoryToDimensionOutputConfig] Failed to cancel incremental load of /atg/endeca/index/commerce/CategoryToDimensionOutputConfig, probably because no bulk load was running.
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() not yet implemented.
11:14:48,710 INFO [ProductCatalogOutputConfig] Failed to cancel incremental load of /atg/commerce/search/ProductCatalogOutputConfig, probably because no bulk load was running.
11:14:48,710 WARN [ProductCatalogSimpleIndexingAdmin] signalShouldCancel() not yet implemented.
11:14:55,222 INFO [CategoryToDimensionOutputConfig] Bulk load completed with "false" result in 10,081 milliseconds.
11:14:58,724 INFO [ProductCatalogOutputConfig] Bulk load completed with "false" result in 10,014 milliseconds.
11:14:58,726 INFO [SchemaDocumentSubmitter] Rolling back session for record store Discover_en_schema with transactionId 59
11:14:58,827 INFO [LoggingOutInterceptor] Outbound Message
It is also saying, prior to indexing:
"WARN [IndexingPeriodicService] No configuration registered for "/atg/endeca/index/commerce/CategoryToDimensionOutputConfig" so skipping check for incremental update."
Please help me out.
Thanks

I am also getting this error:
Exception in thread "index-/atg/endeca/index/commerce/ProductCatalogSimpleIndexingAdmin" java.lang.NullPointerException: Property value cannot be null (dimval.display_name)
when I run the baseline index.
Please suggest a solution. -
Error in Add/Replace Bulk Load component - illegal character in XML
Has anyone ever seen the bulk load component complain about an illegal character in XML? I see this error and I'm not sure what exactly the problem is:
ERROR [SocketReader] - Received error message from server: Character is not legal in XML 1.0
It's a very simple graph: reading data from a Clover data file and ingesting it straight into Endeca using the out-of-the-box bulk load component.
Thanks for your help!
Edited by: 935345 on May 18, 2012 11:48 AM

Assuming you are on EID 2.3, this transformation will apply the fix to all your string fields and print to your console the fields that had non-compliant XML 1.0 characters.
//#CTL2
string[] fields;

// Transforms input record into output record.
function integer transform() {
	$out.0.* = $in.0.*;
	for(integer i = $in.0.length() - 1; i >= 0; i--) {
		if (getFieldType($in.0.*, i) == "string" && getFieldType($out.0.*, i) == "string") {
			if (!isNull($in.0.*, i)) {
				string originalValue = getStringValue($in.0.*, i);
				string newValue = originalValue.replace("([^\\u0009\\u000a\\u000d\\u0020-\\uD7FF\\uE000-\\uFFFD]|[\\u0092\\u007F]+)","");
				if (originalValue != newValue) {
					// Remember which field was cleaned, then write back the cleaned value.
					fields[i] = getFieldName($in.0, i);
					setStringValue($out.0.*, i, newValue);
				}
			}
		}
	}
	return OK;
}

// Called during component initialization.
// function boolean init() {}

// Called during each graph run before the transform is executed. May be used to allocate and initialize resources
// required by the transform. All resources allocated within this method should be released
// by the postExecute() method.
// function void preExecute() {}

// Called only if transform() throws an exception.
// function integer transformOnError(string errorMessage, string stackTrace) {}

// Called during each graph run after the entire transform was executed. Should be used to free any resources
// allocated within the preExecute() method.
function void postExecute() {
	printErr("Fields with non-compliant XML 1.0 characters");
	for(integer i = 0; i < fields.length(); i++) {
		if (fields[i] != null) {
			printErr(fields[i]);
		}
	}
}

// Called to return a user-defined error message when an error occurs.
// function string getMessage() {}
-- Alex
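If you would rather pre-clean the source data outside the graph, the same XML 1.0 character filter can be sketched in Python. This is illustrative only; the character class mirrors the CTL2 regex above (tab, LF, CR, #x20-#xD7FF, #xE000-#xFFFD are legal, and supplementary-plane characters are omitted here for simplicity):

```python
import re

# Anything NOT in the XML 1.0 legal ranges gets removed.
_ILLEGAL_XML10 = re.compile("[^\u0009\u000A\u000D\u0020-\uD7FF\uE000-\uFFFD]")

def strip_illegal_xml10(value):
    """Remove characters that are not legal in an XML 1.0 document."""
    return _ILLEGAL_XML10.sub("", value)

# Control characters such as NUL and SUB are stripped; tab and newline survive.
print(strip_illegal_xml10("ok\x00\x1abad"))   # -> "okbad"
```

Running the extract through a filter like this before the Add/Replace Bulk Load component avoids the "Character is not legal in XML 1.0" server error.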