SQL Load Queries - Trailer and Header record
I have a data file which comes with a header and a trailer record.
We are using SQL*Loader to load it.
Is there a way to ignore the header and the trailer records?
Any help around this is highly appreciated.
user532091 wrote:
Is there a way to ignore the header and the trailer records?
Assuming the number of records in the header is known, use the SQL*Loader parameter SKIP n, where n is the number of records in your header. The trailer is a different story. There is nothing built in to handle trailers, so the best solution is to remove the trailer at OS level. Other than that, you can try the SQL*Loader WHEN clause. For example, if the trailer row is something like:
TOTAL SIZE nnnnnn
you can use:
WHEN (1:10) != 'TOTAL SIZE'
obviously assuming non-trailer records can't start with TOTAL SIZE.
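Putting the two together, a minimal control-file sketch might look like this (the table name, file name, column list and the 'TOTAL SIZE' trailer literal are illustrative assumptions, not the poster's actual layout):

```sql
-- Sketch only: adjust names, positions and the trailer literal to the real file.
OPTIONS (SKIP=1)                      -- skip the single header record
LOAD DATA
INFILE 'data_with_trailer.dat'
APPEND
INTO TABLE target_table
WHEN (1:10) != 'TOTAL SIZE'           -- discard the trailer record
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(col1, col2, col3)
```

Records rejected by the WHEN clause go to the discard file rather than the bad file, so a discarded trailer does not count as a load error.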
SY.
Similar Messages
-
Hi,
running SQL*Loader (Release 8.1.7.2.1) causes the error "SQL*Loader-704: Internal error: Maximum record length must be <= [10000000]". This error occurs when SQL*Loader is trying to load several thousand records into a database table. Each record is less than 250 bytes in length.
Any idea what could cause the problem?
Thanks in advance!
Ingo
And here's an extract from the log file generated by SQL*Loader:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 1360 rows, maximum of 10485760 bytes
Continuation: none specified
Path used: Conventional
Table "SYSTEM"."BASICPROFILE$1", loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
UUID FIRST * O(X07) CHARACTER
DOMAINID NEXT * O(X07) CHARACTER
LASTMODIFIED NEXT * O(X07) DATE DD/MM/YYYY HH24:MI:SS
ANNIVERSARY NEXT * O(X07) CHARACTER
BIRTHDAY NEXT * O(X07) CHARACTER
COMPANYNAME NEXT * O(X07) CHARACTER
DESCRIPTION NEXT * O(X07) CHARACTER
FIRSTNAME NEXT * O(X07) CHARACTER
COMPANYNAMETRANSCRIPTION NEXT * O(X07) CHARACTER
FIRSTNAMETRANSCRIPTION NEXT * O(X07) CHARACTER
GENDER NEXT * O(X07) CHARACTER
HOBBIES NEXT * O(X07) CHARACTER
HONORIFIC NEXT * O(X07) CHARACTER
JOBTITLE NEXT * O(X07) CHARACTER
KEYWORDS NEXT * O(X07) CHARACTER
LASTNAME NEXT * O(X07) CHARACTER
LASTNAMETRANSCRIPTION NEXT * O(X07) CHARACTER
NICKNAME NEXT * O(X07) CHARACTER
PREFERREDLOCALE NEXT * O(X07) CHARACTER
PREFERREDCURRENCY NEXT * O(X07) CHARACTER
PROFESSION NEXT * O(X07) CHARACTER
SECONDLASTNAME NEXT * O(X07) CHARACTER
SECONDNAME NEXT * O(X07) CHARACTER
SUFFIX NEXT * O(X07) CHARACTER
TITLE NEXT * O(X07) CHARACTER
CONFIRMATION NEXT * O(X07) CHARACTER
DEFAULTADDRESSID NEXT * O(X07) CHARACTER
BUSINESSPARTNERNO NEXT * O(X07) CHARACTER
TYPECODE NEXT * O(X07) CHARACTER
OCA NEXT * O(X07) CHARACTER
SQL*Loader-704: Internal error: Maximum record length must be <= [10000000]
As a second guess, the terminator changes or goes missing at some point in the data file. If you are running on *NIX, try wc -l data_file_name. This will give a count of the number of lines (delimited by CHR(10)) in the file. If this is not close to the number you expected, then that is your problem.
You could also try gradually working through the data file loading 100 records, then 200, then 300 etc. to see where it starts to fail.
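A quick way to sketch that diagnostic on *NIX (file name and expected count are placeholders; a temporary three-line file stands in for the real data file here):

```shell
# Create a stand-in data file; in practice use the real data file name.
datafile=$(mktemp)
printf 'rec1\nrec2\nrec3\n' > "$datafile"

# Count newline-delimited records and compare with the expected count.
lines=$(wc -l < "$datafile")
echo "records: $lines"

# Extract the first N records for a trial load (100, 200, 300, ... in practice).
head -n 2 "$datafile" > "${datafile}.trial"
echo "trial records: $(wc -l < "${datafile}.trial")"

rm -f "$datafile" "${datafile}.trial"
```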
HTH
John -
How to create a partner and header record using CRM_ORDER_MAINTAIN?
Hi, does anyone know how to create a partner and a header record using the function module CRM_ORDER_MAINTAIN?
I tried to create a record, but I only managed to create the header record; the partner record is not reflected in the transaction. Why is that so? Is there any indicator that I need to include?
Thanks..
Jen
Hi Jen!
I use this FM and it works perfectly.
Use this to create a partner:
gs_partner-ref_handle = '0000000001'.
gs_partner-ref_kind = 'A'.
gs_partner-ref_partner_handle = '0001'.
gs_partner-partner_fct = '00000001'.
gs_partner-partner_no = NO_PARTNER. "number of the partner, bu_partner
gs_partner-display_type = 'BP'.
gs_partner-no_type = 'BP'.
gs_partner-kind_of_entry = 'C'.
* ls_partner_l-ref_handle = '1'.
gs_partner-ref_guid = '00000000000000000000000000000000'.
APPEND gs_partner TO gT_partner .
ls_input_field-ref_kind = 'A'.
ls_input_field-logical_key = '0001'.
ls_input_field-objectname = 'PARTNER'.
ls_input_field-ref_handle = '0000000001'.
ls_input_field_names-fieldname = 'DISPLAY_TYPE'.
INSERT ls_input_field_names INTO TABLE ls_input_field-field_names.
ls_input_field_names-fieldname = 'KIND_OF_ENTRY'.
INSERT ls_input_field_names INTO TABLE ls_input_field-field_names.
ls_input_field_names-fieldname = 'NO_TYPE'.
INSERT ls_input_field_names INTO TABLE ls_input_field-field_names.
ls_input_field_names-fieldname = 'PARTNER_FCT'.
INSERT ls_input_field_names INTO TABLE ls_input_field-field_names.
ls_input_field_names-fieldname = 'PARTNER_NO'.
INSERT ls_input_field_names INTO TABLE ls_input_field-field_names.
INSERT ls_input_field INTO TABLE gt_input_fields.
clear ls_input_field-field_names[].
CALL FUNCTION 'CRM_ORDER_MAINTAIN'
  EXPORTING
*   it_schedlin_i   = gt_schedlin_i_com
    it_partner      = gt_partner
*   it_sales        = gt_sales
*   it_orgman       = gt_orgman
*   it_appointment  = gt_appointment
*   it_ordprp_i     = gt_ordprp_i
*   it_product_i    = gt_product_i
*   it_activity_i   = gt_activity_i
*   it_pridoc       = gt_pridoc_com
  CHANGING
    ct_orderadm_h   = gt_orderadm_h
*   ct_orderadm_i   = gt_orderadm_i
    ct_input_fields = gt_input_fields.
*   ct_doc_flow     = gt_doc_flow
*   cv_log_handle   = gv_log_handle.
Hope it helps,
Regards,
Mon. -
SQL LOADER , EXTERNAL TABLE and ODBS DATA SOURCE
hello
Can anybody help with loading data from a dBase file (.dbt) into an Oracle 10g table?
I tried yesterday with SQL*Loader, an external table, and an ODBC data source.
Why do all of these utilities fail to solve my problem?
Is there an efficient way to reach this goal?
Thanks in advance
Export the dBase data file to a text file; then you have the choice of using either SQL*Loader or the external table option.
regards -
Sql loader does not skip header rows from the second infile
Hi
I am new to SQL*Loader and trying to figure out a way to load data into a table from 2 files which have different data in the same format. The files get successfully loaded into the database with just one glitch:
It skips the first 2 rows of the first file; however, it loads the first 2 rows of the 2nd file, which I do not want. Can anyone help me with this issue? How can I stop the loader from picking up the 2 header rows from the second file as well?
given below is the content of the control file
OPTIONS ( SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Italy_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts.bad'
DISCARDFILE 'C:\loader\contacts\contacts.dsc'
infile 'C:\loader\contacts\Spain_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts1.bad'
DISCARDFILE 'C:\loader\contacts\contacts1.dsc'
Truncate
into table V_contacts_dump
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(ASSISTANT_EMAIL_TX,
ASSISTANT_NM,
ASSISTANT_PHONE_TX,
BUSINESS_AREA_CD,
BUSINESS_EMAIL_TX,
BUSINESS_FAX_TX,
BUYER_ROLE_CD,
COMMENTS_TX,
COUNTRY_CD,
COUNTY_STATE_PROVINCE_CD,
DATE_OF_BIRTH_DT,
DO_NOT_CALL_IN,
DO_NOT_EMAIL_IN,
DO_NOT_MAIL_IN,
DOMESTIC_PARTNERS_NM,
FIRST_NM,
FULL_NM,
GENDER_CD,
INTERESTS_CD,
LAST_NM,
MIDDLE_NM,
MOBILE_TX,
OFFICE_PHONE_TX,
OWNER_PARTY_EMAIL,
PREFERRED_CONTACT_LANG_CD,
PREFERRED_CONTACT_CD,
PRIMARY_CONTACT_IN,
REFERRED_BY_TX,
SALUTATION_CD,
STAC_SRC_ID_TX,
STAC_STDS_SRC_ID,
STCN_SRC_ID_TX,
STREET_TX,
STREET2_TX,
STREET3_TX,
SUFFIX_TX,
TITLE_CD,
TOWN_CITY_TX,
ZIP_POSTAL_CD_TX )
It would be possible to call the loader twice, with only one input file in each run.
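Sketched out, that workaround becomes two control files and two sqlldr runs (column list abbreviated; note the second run must use APPEND rather than TRUNCATE, or it would wipe out the rows loaded by the first run):

```sql
-- italy.ctl: first run, skips 2 header rows, truncates the table
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Italy_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts.bad'
DISCARDFILE 'C:\loader\contacts\contacts.dsc'
TRUNCATE
INTO TABLE V_contacts_dump
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ASSISTANT_EMAIL_TX,
 ASSISTANT_NM
 -- ... remaining columns as in the original control file
)

-- spain.ctl: second run, skips 2 header rows, APPENDs to the same table
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Spain_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts1.bad'
DISCARDFILE 'C:\loader\contacts\contacts1.dsc'
APPEND
INTO TABLE V_contacts_dump
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ASSISTANT_EMAIL_TX,
 ASSISTANT_NM
 -- ... remaining columns as in the original control file
)
```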
-
SQL Loader Direct Path and Nulls
I'm using Sql Loader with Direct Path and I cannot find a way to remove trailing spaces in varchar2 fields.
I'm trying to avoid updating the tables after loading.
Any ideas? -
Sql loader using position and functions
Hi all, I need help loading some data into my table using SQL*Loader. Consider the following:
CREATE TABLE er (
a1 NUMBER,
a2 NUMBER,
a3 VARCHAR2(100),
a4 VARCHAR2(100),
a5 VARCHAR2(100),
a6 VARCHAR2(100),
a7 VARCHAR2(100),
a8 VARCHAR2(100)
);
OPTIONS (BINDSIZE=20548000, READSIZE=20548000, STREAMSIZE=20548000, DATE_CACHE=25000, SKIP=0)
LOAD DATA
INFILE *
APPEND
INTO TABLE er
TRAILING NULLCOLS
(
a1 POSITION(0001:0021) ,
a2 POSITION(0022:0042) "DECODE(SUBSTR(:a2,1,3),'***',NULL,:a2)" ,
a3 POSITION(0043:0053) ,
a4 POSITION(0054:0064) ,
a5 POSITION(0065:0075) ,
a6 POSITION(0076:0086) ,
a7 POSITION(0087:0093) "DECODE(SUBSTR(:a7,1,3),'***',NULL,:a7)"
)
BEGINDATA
0.00 ******************** X X X *X ****
If you look at the data, some fields have a lot of * and some have a few, such as ****. I want to load this data into a table and, when a field contains all * as its value, set it to NULL. If a field contains a * together with alphanumeric characters, that value should be loaded as is.
In the example above, ******************** should be set to NULL, and **** should also be set to NULL. Notice that there is a field with X; since this field contains alphanumeric characters, it should be loaded into the table as is. The only time a field should be set to NULL is when its value consists entirely of *.
Somebody in this forum suggested using DECODE, but it looks like it is not working; I get an error when it reads the second field and tries to insert into the a2 NUMBER column.
Is there any way to use a regular expression to find out whether a field contains all *? I also want to trim each field, since they might contain leading spaces.
Can someone help with this using the SQL*Loader control file and data above?
You can include regular expressions in your SQL*Loader control file.
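For instance, on Oracle 10g and later the field expressions could use REGEXP_LIKE instead of DECODE (a sketch against the posted positions; the pattern and the TRIM are assumptions to verify against the real data):

```sql
-- Sketch: NULL out any field whose trimmed value is nothing but asterisks.
(
a1 POSITION(0001:0021),
a2 POSITION(0022:0042)
   "CASE WHEN REGEXP_LIKE(TRIM(:a2), '^\*+$') THEN NULL ELSE TRIM(:a2) END",
a3 POSITION(0043:0053)
   "CASE WHEN REGEXP_LIKE(TRIM(:a3), '^\*+$') THEN NULL ELSE TRIM(:a3) END"
-- repeat the same CASE expression for a4 through a7
)
```

The TRIM also removes the leading spaces mentioned in the question, and a2 then converts cleanly to NUMBER because the all-asterisk values arrive as NULL.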
An example can be found here:
http://www.morganslibrary.org/reference/sqlloader.html
Demos 7 and 8 use the UPPER and DECODE functions to illustrate how to do it. -
SQL Loader, direct path and indexes on partition tables
Hi,
I have a big partitioned table and I have two indexes on it.
When I use SQL*Loader to upload data to my table with OPTIONS (DIRECT=TRUE) UNRECOVERABLE, it takes almost half an hour to load just one record!!
When I remove OPTIONS (DIRECT=TRUE), it takes just two seconds to upload the same one record.
Am I missing anything? Can I use direct path load on indexed partitioned tables and have a reasonable load time?
A scheduled external job loads almost 100,000 records into this table every hour, and I am trying to make the SQL*Loader run as fast as possible.
Any help would be appreciated,
Alan
Hi Alan,
How big is this table and what sort of indexes are they?
An index update using SQL*Loader unrecoverable direct path load is achieved by an isolated sort followed by a nologging merge of the old index and the new mini-index into a new index segment (this according to seminar notes by Jonathan Lewis). This will conceivably take a long time for a large table / large index.
Performance improvements? Are you loading all records into a new partition?
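If the hourly file always targets one partition, a hedged sketch of a faster direct load might look like this (table, partition and file names are made up; SKIP_INDEX_MAINTENANCE is a real direct-path-only option, but it leaves the touched index partitions UNUSABLE, so they must be rebuilt afterwards):

```sql
-- Command line (illustrative):
--   sqlldr userid=uid/pwd control=hourly.ctl direct=true skip_index_maintenance=true
LOAD DATA
INFILE 'hourly.dat'
APPEND
INTO TABLE big_partitioned_table PARTITION (p_current)
FIELDS TERMINATED BY ','
(col1, col2, col3)
```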
Cheers,
Colin -
How to use SQL loader with DBF fixed format record
Hi everybody!
My situation is this: I want to use SQL*Loader with the FoxPro DBF format. It is similar to case study 2 (fixed format records), but a DBF file has a header. How can I tell SQL*Loader to skip the header?
Thank you in advance
Another option is to apply SQL operators to fields:
LOAD DATA
INFILE *
APPEND
INTO TABLE emp
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' (
empno,
ename,
job,
mgr,
hiredate DATE(20) "DD-Month-YYYY",
sal,
comm,
deptno CHAR TERMINATED BY ':',
projno,
loadseq "my_seq.nextval")
This is a modified control file from Case Study 3, which originally demonstrated the use of the SEQUENCE parameter. -
Line item and header records in the same infopackage
Gurus,
I wanted to check how I can make sure that I get all the line-item documents in the same package as the header document record, in the same InfoPackage. Is there some setting for that? If I am writing a custom extractor, how can I make sure of this in ABAP?
Thanks
AK
Dear AKBW,
This is not very clear why you want to use same infopackage for line-item as well as header data. Normally there are 2 different datasources for line item and for header data.
Say you are working with Sales Order Data in ECC.
So Line-item datasource for sales order data - 2LIS_11_VAITM
and Header datasource for sales order data - 2LIS_11_VAHDR.
Now as they are 2 different datasource, so you must need 2 different infopackages.
Now if you have created a custom datasource in ECC, just check whether it extracts line-item level or header level data. Normally when we create a custom datasource, we try to make it line-item level, to have all the item-wise detail, though this is not mandatory.
So if you have 2 different datasources (SAP or Custom), you definitely need 2 different infopackages.
Please let me know if you still have any doubts. You can also give me the other details of the custom datasource you are creating (type, fields, what it is supposed to extract, etc.). -
SQL Loader, nested tables and default values
Is there a way to specify a default value for a nested table entry when SQL*Loader encounters a 'null' value?
I want to avoid this:
Record 5: Rejected - Error on table LEVEL_DESC, column LEVELS.
NULL nested table element is not allowed
Use the NULLIF parameter in your control file for the nested table objects.
e.g.
LOAD DATA
INFILE 'level_data.dat'
INTO TABLE LEVEL
(LEVEL_ID POSITION (01:05) CHAR,
LEVEL_NAME POSITION (07:20),
LEVEL_DESC COLUMN OBJECT
(LEVELS POSITION (22:25) CHAR NULLIF LEVEL_DESC.LEVELS=BLANKS,
... ))
Executing sqlldr (sql loader) from java and returning the error code
I'm wondering whether sqlldr returns any error code when it hits an error while being run from Java.
For example, if I run the command below at a command prompt,
C:\> sqlldr uid/pwd data=abc.dat control=abc.txt
It might give me some indicator that error occurs such as
SQL*Loader-601: For INSERT option, table must be empty. Error on table CURRENCY
or
SQL*Loader-500: Unable to open file (abc.txt)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
But when I run it from Java using the code below,
Runtime rt = Runtime.getRuntime();
Process proc = rt.exec("sqlldr uid/pwd data=abc.dat control=abc.txt");
int exitVal = proc.waitFor();
System.out.println("Process exitValue: " + exitVal);
it will only give me an exitValue of 1 (which I presume means an error) or 0 (which I presume means no error), instead of the details of the error.
How can I get the exact error code/message if I execute it from Java?
Any solution?
mg,
I don't think user576271 wants the exit code, I think [s]he wants the error message.
But wouldn't error messages from SQL*Loader be sent to the stderr stream, and not the stdout stream?
In which case user576271 would need method "getErrorStream()" of class java.lang.Process, no?
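A sketch of that (the command here is a stand-in - "java -version", which writes to stderr on most JDKs - since the actual sqlldr command line is environment-specific; for SQL*Loader you would swap in its command and arguments):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Illustrative sketch: capture a child process's stderr (where SQL*Loader
// writes its error messages) instead of relying only on the exit code.
public class CaptureStderr {
    static String runAndCaptureStderr(String... cmd) throws Exception {
        Process proc = new ProcessBuilder(cmd).start();
        StringBuilder err = new StringBuilder();
        // Drain the stderr stream before waiting for the process to finish.
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(proc.getErrorStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                err.append(line).append('\n');
            }
        }
        int exit = proc.waitFor();           // the exit code is still available
        System.out.println("exit=" + exit);
        return err.toString();
    }

    public static void main(String[] args) throws Exception {
        // For SQL*Loader this would be something like:
        // runAndCaptureStderr("sqlldr", "uid/pwd", "data=abc.dat", "control=abc.txt");
        String err = runAndCaptureStderr("java", "-version");
        System.out.print(err);
    }
}
```

Draining stderr in a separate thread (or calling ProcessBuilder.redirectErrorStream(true) and reading stdout) is safer for chatty processes, since a full pipe buffer can make waitFor() block.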
Good Luck,
Avi. -
SQL LOADER export ? And Help
1) Can SQL*Loader export a table to a text file?
2) Is there any Oracle tool to export a table to a text file? (The table includes a BLOB field which stores an image.)
Thanks very much.
1) Nope.
2) SQL*Plus. Or you can try SQL Developer. Just bring up the table data in a grid, right-click on it, then export it in the format of your choice.
Oops, I forgot about your BLOB column. Not sure about that one. But since you asked for a text file, you would just have to eliminate that column, since a text file can't contain an image.
Message was edited by:
Eric H -
Upload records - SQL loader.
Dear All,
We are in the process of getting a big-volume data upload project. The data is to be uploaded using SQLLDR into an Oracle DB. However, the number of records is huge, amounting to many GB.
The maximum volume looks to be for one table: 153 million records x 197 bytes (the maximum size of all data fields in one record) of Unicode characters. If I assume 1 byte per Unicode character, it comes to 153,639,503 x 197 = 30,266,982,091 bytes, which is about 26.3 GB.
Kindly help to answer the queries below:
a. Can you suggest approximately how much time the SQLLDR process will take to upload the 26.3 GB of data?
b. If it is 4 bytes per Unicode char, it will be 105.2 GB. Is there a way to handle file sizes of that volume?
c. How can we enhance the data upload process for large-volume data (performance and tuning)?
Thanks in advance.
deb882134 wrote:
We are in the process of getting a big volume data upload project. The data is to be uploaded using SQLLDR into an Oracle DB. However, the number of records is huge, amounting to many GB.
Relatively small volumes. I am loading over 35 files every 60 seconds. Each CSV file is 21MB in size; 35 files can contain around 4,645,417 rows in total.
a. Can you suggest approximately how much time the SQLLDR process will take to upload the 26.3 GB of data?
How long is a piece of string?
b. If it is 4 bytes per Unicode char, it will be 105.2 GB. Is there a way to handle file sizes of that volume?
Not relevant. It is not the size of the data - it is WHAT needs to be done with that data.
c. How can we enhance the data upload process for large-volume data (performance and tuning)?
SQL*Loader supports direct and parallel loads. The overhead of the data structure being loaded into can be reduced by eliminating table constraints, triggers and indexes. Etc.
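As a sketch, the direct and parallel options combine like this (user, file and table names are illustrative; parallel direct loads require APPEND):

```sql
-- Shell (illustrative): several loaders running against the same table
--   sqlldr userid=uid/pwd control=part1.ctl direct=true parallel=true
--   sqlldr userid=uid/pwd control=part2.ctl direct=true parallel=true
LOAD DATA
INFILE 'part1.dat'
APPEND
INTO TABLE big_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```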
Oracle is very capable of dealing with large volumes of data in a performant and scalable fashion. How well it performs and scales depends on how you use Oracle. -
How to skip footer record in SQL*Loader input file
I am using SQL*Loader in a batch import process.
The input files to SQL*Loader have a header record and footer record - always the 1st and last records in the file.
I need SQL*Loader to ignore these two records in every file when performing the import. I can easily ignore the header by using the SKIP parameter.
Does anybody know how to ignore the last record (footer) in an input file??
I do not want to physically pre-strip the footer, since the business wants all data files to have the header and footer records.
Thanks - how do I use the WHEN clause to specify the last line of the input file?
I am presuming it requires a unique identifier at a given position on that last line. If I don't have such an identifier, can I still use your solution?
Cheers
Why not put an identifier at the end of your input file: echo "This_is_the_End" >> input_file ?
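If adding an identifier is not an option, the trailer can still be stripped at OS level in one pass before the load (a sketch; the file content below is a stand-in for the real data file):

```shell
# Stand-in file with one header and one trailer record.
datafile=$(mktemp)
printf 'HEADER\nrow1\nrow2\nTRAILER\n' > "$datafile"

# sed '1d;$d' deletes line 1 (header) and the last line (trailer),
# leaving only the data rows for SQL*Loader.
sed '1d;$d' "$datafile" > "${datafile}.clean"
cat "${datafile}.clean"

rm -f "$datafile" "${datafile}.clean"
```

Here cat prints only row1 and row2; the cleaned file is what SQL*Loader would be pointed at.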