Loading Objects with SQL*Loader
When loading a column object with SQL*Loader, is it possible to specify a column specific 'TERMINATED BY' clause in the control file?
I've successfully defined column-level termination characters for regular columns and nested tables, but can't seem to find any way of achieving the same result with column objects.
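For what it's worth, here is a sketch of what the syntax allows (the table, column object, and attribute names are hypothetical). As far as I can tell, SQL*Loader does not accept a TERMINATED BY clause on the COLUMN OBJECT itself, but each attribute inside the object takes its own delimiter clause, just like an ordinary field:

```
LOAD DATA
INFILE *
INTO TABLE departments    -- hypothetical table with a column object dept_mgr
FIELDS TERMINATED BY ','
(
  dept_no   CHAR(5),
  dept_mgr  COLUMN OBJECT
  (
    name    CHAR(30) TERMINATED BY ';',   -- per-attribute termination clause
    age     INTEGER EXTERNAL TERMINATED BY ';'
  )
)
BEGINDATA
101,Mathias;32;
```

So the per-column control you get for regular columns is recovered one level down, on the object's attributes rather than on the object as a whole.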
Similar Messages
-
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records. The other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data I did it from the command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: tns:could not resolve the connect identifier specified
I've searched for postings on these error messages, and they all seem to say that SQL*Loader can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
However SQL Developer will not let me load a file this big
I have also tried to load the file within Apex (SQL Workshop/ Utilities) but again, the file is too big.
So it seems like SQL Loader is the only option
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work..
Not sure what else to try or where to look
thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure will be mentioned in ed's link that you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This will tell Oracle to use the config files it finds there and no others.
then try sqlldr user/pass@db (in the same dos window)
see if that connects and let us know.
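The steps above can be put in one command-prompt session (the directory, connect string, and file names here are examples — substitute your own):

```
set TNS_ADMIN=c:\oracle\network\admin
tnsping mydb
sqlldr scott/tiger@mydb control=load.ctl log=load.log
```

If tnsping resolves the alias but sqlldr still raises ORA-12154, the two tools are almost certainly reading different tnsnames.ora files.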
Cheers,
Harry
http://dbaharrison.blogspot.com -
Problem with loading file with SQL loader
I am getting a problem with loading a file with SQL*Loader. The loading is getting terminated after around 2000 rows, whereas there are around 2,700,000 rows in the file.
The file is like
919879086475,11/17/2004,11/20/2004
919879698625,11/17/2004,11/17/2004
919879698628,11/17/2004,11/17/2004
the control file, i am using is like:-
load data
infile 'c:\ran\temp\pps_fc.txt'
into table bm_05oct06
fields terminated by ","
(mobile_no, fcal, frdate )
I hope my question is clear. Please help in solving the doubt.
regards.
So which thread is telling the truth?
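A hedged guess about the early termination: SQL*Loader aborts a load once it exceeds its error limit, which defaults to 50 rejected rows. Raising the limit (and checking the .bad and .log files) will show whether bad records around row 2000 are the real cause:

```
OPTIONS (ERRORS=999999)
load data
infile 'c:\ran\temp\pps_fc.txt'
into table bm_05oct06
fields terminated by ","
(mobile_no, fcal, frdate)
```

The log file reports exactly how many rows were read, loaded, and rejected, which settles the question of what is in the file.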
Doubt with SQL loader file with spaces
Are the fields delimited with spaces or with commas?
Perhaps they are a mixture of delimiters and that is where the error is coming in? -
Hi:
Does somebody know if it is possible to upload data using an API with SQL*Loader? Or is SQL*Loader limited only to INSERT statements?
I need to bulk-load a lot of employees and their usernames and responsibilities. There are almost 200 employees. What I would like to do is create a csv file with all this information (first_name, last_name, hire_date, sex, email, whether they are a purchaser, username, password, responsibility...), and through a request load all of this into Oracle EBS. I want to use hr_employee_api.create_employee, fnd_user_pkg.create_user and fnd_wf_engine.propagate_user_role. I don't want to enter data using inserts because the hr_employee_api.create_employee API changes more than one table.
I hope you can help me
Thanks in advance
Why don't you
* load the data file into a temp table
* write a procedure that reads the temp table and calls the 3 APIs with the data from each row
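The two-step suggestion above can be sketched as follows. The staging table and column names are hypothetical, and the API parameter lists are deliberately elided as comments, because each of those APIs takes many arguments — fill them in from the published API specifications rather than from this sketch:

```sql
-- Hypothetical staging table, populated by SQL*Loader from the csv file
CREATE TABLE emp_stage (
  first_name VARCHAR2(150),
  last_name  VARCHAR2(150),
  hire_date  DATE,
  sex        VARCHAR2(1),
  email      VARCHAR2(240),
  username   VARCHAR2(100)
);

-- Driver procedure: walk the staging rows and call the three APIs per row
CREATE OR REPLACE PROCEDURE load_employees IS
BEGIN
  FOR r IN (SELECT * FROM emp_stage) LOOP
    -- hr_employee_api.create_employee( ... r.first_name, r.last_name ... );
    -- fnd_user_pkg.create_user( ... r.username ... );
    -- fnd_wf_engine.propagate_user_role( ... );
    NULL;  -- parameter lists elided; see the API specifications
  END LOOP;
  COMMIT;
END;
/
```

This keeps the multi-table side effects inside the supported APIs while SQL*Loader only ever touches the throwaway staging table.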
Sandeep Gandhi -
Hi Experts,
I have a file with the following format. I have to insert the data of those files in a table. I can use SQL Loader to load those files.
My question is: I need to schedule the upload of those files. Can I incorporate SQL*Loader in a procedure?
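sqlldr is a client executable, so it cannot run inside a PL/SQL procedure directly. One hedged way to schedule it is to wrap the sqlldr call in a shell script and have DBMS_SCHEDULER run that script as an external job (the job name, script path, and schedule below are examples only); an external table over the file would be a pure-SQL alternative:

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'LOAD_AGENT_FILES',                 -- hypothetical name
    job_type        => 'EXECUTABLE',
    job_action      => '/u01/app/scripts/load_agents.sh',  -- wrapper that calls sqlldr
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',              -- example: daily at 02:00
    enabled         => TRUE);
END;
/
```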
Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||
Edited by: Kevin CK on 02-Feb-2010 03:28
Yes, sorry about that mishap.
Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||1|-2145916800|1|0|1|0||0|0|0|0|0|0|0|0|0|0|-3287|0|0|0|1|1|2|0|0|0|1|0|999|0||5|0|0|||||
This is my file format, which is a .txt file -
URGENT: Problems Loading files with SQL Loader into a BLOB column
Hi friends,
I read a lot about how to load files into blob columns, but I found errors that I can't solve.
I've read several notes in these forums, one of them:
sql loader: loading external file into blob
and tried the solutions but without good results.
Here are some of my tests:
With this .ctl:
LOAD DATA
INFILE *
INTO TABLE mytable
REPLACE
FIELDS TERMINATED BY ','
(
number1 INTEGER EXTERNAL,
cad1 CHAR(250),
image1 LOBFILE(cad1) TERMINATED BY EOF
)
BEGINDATA
1153,/opt/oracle/appl/myapp/1.0.0/img/1153.JPG,
the error when I execute sqlldr is:
SQL*Loader-350: Syntax error at line 9.
Expecting "," or ")", found "LOBFILE".
image1 LOBFILE(cad1) TERMINATED BY EOF
^
What is the problem with LOBFILE?
(mytable, of course, has number1 as a NUMBER, cad1 as VARCHAR2(250), and image1 as a BLOB.)
I tried too with :
LOAD DATA
INFILE sample.dat
INTO TABLE mytable
FIELDS TERMINATED BY ','
(cad1 CHAR(3),
cad2 FILLER CHAR(30),
image1 BFILE(CONSTANT "/opt/oracle/appl/myapp/1.0.0/img/", cad2))
sample.dat is:
1153,1153.JPEG,
and error is:
SQL*Loader-350: Syntax error at line 6.
Expecting "," or ")", found "FILLER".
cad2 FILLER CHAR(30),
^
I tried too with a procedure, but without results...
Any idea about this error messages?
Thanks a lot.
Jose L.
> So you think that if one person puts "urgent" in the subject, it is screwing the problems of other people?
Absolutely. You are telling them "My posting is more important than yours and deserves faster attention and resolution than yours!"
So what could a typical response be? Someone telling you that his posting is more important by using the phrase "VERY URGENT!". And the next poster may decide that, no, his problem is even more important, and use "EXTREMELY URGENT!!" as the subject. And the next one then raises the stakes by claiming his problem is "CODE RED! CRITICAL. DEFCON 4. URGENT!!!!".
Stupid, isn't it? As stupid as your insistence that there is nothing wrong with your pitiful clamoring for attention to your problem by saying it is urgent.
What do the RFCs say about a meaningful title/subject in a public forum? I trust that you know what an RFC is? After all, you claim to have used public forums on the Internet for some years now.
The RFC on "public forums" is called The Usenet Article Format. This is what it has to say about the SUBJECT of a public posting:
=
The "Subject" line (formerly "Title") tells what the message is about. It should be suggestive enough of the contents of the message to enable a reader to make a decision whether to read the message based on the subject alone. If the message is submitted in response to another message (e.g., is a follow-up) the default subject should begin with the four characters "Re: ", and the "References" line is required. For follow-ups, the use of the "Summary" line is encouraged.
=
([url http://www.cs.tut.fi/~jkorpela/rfc/1036.html]RFC 1036, the Usenet article format)
Or how about [url http://www.cs.tut.fi/~jkorpela/usenet/dont.html]The seven don'ts of Usenet?
Point 7 of the Don'ts:
Don't try to catch attention by typing something foolish like "PLEASE HELP ME!!!! URGENT!!! I NEED YOUR HELP!!!" into the Subject line. Instead, type something informative (using normal mixed case!) that describes the subject matter.
Please tell me that you are not too thick to understand the basic principles of netiquette, or to argue with the RFCs that governs the very fabric of the Internet.
As for when I have an "urgent" problem? In my "real" work? I take it up with Oracle Support on Metalink by filing an iTAR/SR. As any non-idiot should do with a real-life Oracle crisis problem.
I do not barge into a public forum like you do, jump up and down, and demand quick attention by claiming that my problem is more important, more urgent, and more deserving of attention than other people's problems in the very same forum. -
hi.,
With SQL*Loader I am able to transfer data to an Oracle database from a flat-file system.
But now I want to transfer the data from flat files to Oracle Apps/VB/D2K etc.,
with the help of SQL*Loader.
Is it possible to do?
Thanks in Advance,
With Regards.,
N.GowriShankar.
For the Applications you can use file-handling built-ins such as TEXT_IO, UTL_FILE, etc. You can write a batch program for sqlldr and invoke it from the front-end Applications.
-
Load data with SQL Loader link field between CSV file and Control File
Hi all,
in a SQL*Loader control file, how do you link a field in the CSV file to a column in the control file?
E.g. I want to import records into table TEST (col1, col2, col3) with data in a csv file BUT in different positions. How to do this?
FILE CSV (with variable position):
test1;prova;pippo;Ferrari;
xx;yy;hello;by;
In the table TEST i want that col1 = 'prova' (xx),
col2 = 'Ferrari' (yy)
col3 = default N
the others data in CSV file are ignored.
so:
load data
infile 'TEST.CSV'
into table TEST
fields terminated by ';'
(
col1 ?????,
col2 ?????,
col3 CONSTANT "N"
)
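A hedged way to fill in the '?????' placeholders above: declare a throwaway FILLER field for each CSV position you want to skip. FILLER fields are parsed but never loaded, so the remaining names line up with the table columns (the filler names f1 and f3 are arbitrary):

```
load data
infile 'TEST.CSV'
into table TEST
fields terminated by ';'
(
f1    FILLER,    -- "test1" / "xx"    -- skipped
col1  CHAR,      -- "prova" / "yy"
f3    FILLER,    -- "pippo" / "hello" -- skipped
col2  CHAR,      -- "Ferrari" / "by"
col3  CONSTANT "N"
)
```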
Thanks,
Attilio
With the '?' marks I mean "How can I link this COL1 with a column in the csv file?"
Attilio -
Problem with loading sdo_geometry with sql loader
Hi All,
I'm trying to load some geometries, and I'm using SQL*Loader for that.
example entry of my data file looks like this:
1,01030000000100000011000000af8916eafaf7284014c8917307f54540645ddc4603f82840d7b7dd150bf5454084ab4dad08f828401cfc0e8f0ef54540c987eaf70ef828404469143713f54540c987eaf70ef828404469143713f54540d13131a715f82840d44c52f41bf54540e148fb7a19f82840e2e1e24d23f54540e148fb7a19f82840e2e1e24d23f545403abcd6941af828400b13a16c25f5454091c71d801ef82840b8efac3830f54540c58cf0f620f8284039cbd1883ef54540c58cf0f620f8284039cbd1883ef545406659e6632df82840eb80351834f545403bf9991f24f828404bf37d271cf54540e3c281902cf8284012633ec516f54540b66a323e27f82840181d35cb0af54540af8916eafaf7284014c8917307f54540
Geometry is stored in WKB format. To be able to add SRID information to the geometry, I created a function in the database which looks like this:
create or replace function sdo_geom_form_wkb_text(wkb_text IN VARCHAR2) RETURN
sdo_geometry as
SRID_VALUE NUMBER := 8307;
BEGIN
return sdo_geometry(to_blob(HEXTORAW(wkb_text)), SRID_VALUE);
end sdo_geom_form_wkb_text;
and I would like to invoke this function during the load. To do this I looked in some forums where there were examples using a from_wkt function, and I made my own ctl file:
OPTIONS (SKIP=0,BINDSIZE=20000000,ROWS=10000,ERRORS=500,DIRECT=true)
LOAD DATA
INFILE 'c:\geometry\face2.dat'
BADFILE 'c:\geometry\face2.bad'
TRUNCATE
CONTINUEIF NEXT(1:1) = '#'
INTO TABLE TEST_TABLE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS (
ID INTEGER EXTERNAL,
string_geom BOUNDFILLER,
GEOMETRY EXPRESSION "sdo_geom_form_wkb_text(:string_geom)"
)
When I invoke sqlldr I'm getting:
SQL*Loader-951: Error calling once/load initialization
ORA-26052: Unsupported type 121 for SQL expression on column GEOMETRY.
Could anyone tell me what I'm doing wrong?
Thanks,
864742
Forget the function and use the SDO_GEOMETRY constructor.
-- "NOTE: here I'm assuming your SRID is 8307"
ID INTEGER EXTERNAL,
string_geom BOUNDFILLER,
GEOMETRY EXPRESSION "SDO_GEOMETRY(:string_geom, 8307)"
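One more hedged observation on the error in the question: ORA-26052 is raised by direct-path loads, which do not support SQL expressions on this column type, so dropping DIRECT=true from the OPTIONS clause (i.e. using conventional path) is usually needed alongside the constructor change:

```
OPTIONS (SKIP=0,BINDSIZE=20000000,ROWS=10000,ERRORS=500)  -- DIRECT=true removed
```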
Regards,
Noel -
Special character loading issue with SQL Loader
Hi,
I am getting special characters as part of my input file to SQL*Loader. The problem is that, because the length of the input string (containing the special characters) is more than 1000 characters, the converted byte length becomes more than 4000; the SUBSTRB() function defined in the control file is not working for it, and the target ADDR_LINE column in table TEST_TAB is defined as VARCHAR2(1024 CHAR).
Following is a sample ctl file and data i am using for it.
LOAD DATA
CHARACTERSET UTF8
INFILE 'updated.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY "|~"
TRAILING NULLCOLS
(
INDX_WORD,
ADDR_LINE "SUBSTRB(:ADDR_LINE, 1, 1000)",
CITY
)
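A hedged variant of the ADDR_LINE field above: SQL*Loader's default input buffer for a delimited character field is only 255 bytes, so long values can be rejected before SUBSTRB ever runs. Declaring an explicit length raises the buffer (4000 here is an assumption about the longest expected input value):

```
ADDR_LINE CHAR(4000) "SUBSTRB(:ADDR_LINE, 1, 1000)",
```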
Following is the actual data which I am receiving as part of the input file to SQL*Loader for the ADDR_LINE column:
'RUA PEDROSO ALVARENGA, 1284 AƒAƒA‚AƒAƒA‚A‚AƒAƒAƒA‚A‚AƒA‚A‚AƒAƒAƒA‚AƒAƒA‚A‚A‚AƒAƒA‚A‚AƒA‚A‚AƒAƒAƒA‚AƒAƒA‚A‚AƒAƒAƒA‚A‚AƒA‚A‚A‚AƒAƒA‚AƒAƒA‚A‚A‚AƒAƒA‚A‚AƒA‚A‚A‚AƒAƒA‚AƒAƒA‚A‚AƒAƒAƒA‚A‚AƒA‚A‚AƒAƒAƒA‚AƒAƒA‚A‚A‚AƒAƒA‚A‚AƒA‚A‚A‚AƒAƒA‚AƒAƒA‚A‚AƒAƒAƒA‚A‚AƒA‚A‚A‚AƒAƒA‚AƒAƒA‚A‚A‚AƒAƒA‚A‚AƒA‚A‚A– 10 ANDAR'
My database is having following settings.
NLS_CALENDAR GREGORIAN
NLS_CHARACTERSET AL32UTF8
NLS_COMP BINARY
NLS_CURRENCY $
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_DUAL_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_LANGUAGE AMERICAN
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_NCHAR_CONV_EXCP FALSE
NLS_NUMERIC_CHARACTERS .,
NLS_SORT BINARY
NLS_TERRITORY AMERICA
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
Any help in this regard will be much appreciated.
Thanks in advance.
Is the data file created directly on the Unix server? If not, how does it get to the Unix server? And where does the file come from? Is the UTF8 locale installed on the Unix server (check with the Unix sysadmin)?
HTH
Srini -
Tutorials on loading data with SQL LOADER
Dear reader
Please, I need tutorials on how to use SQL*Loader, as well as software for loading data into Oracle.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
Chapters 6-14 are filled with examples and tutorials -
Goodmorning,
I have this XML file:
<?xml version="1.0"?>
<Header>
<DocName>NOMRES</DocName>
<DocVersion>3.2</DocVersion>
<Sender>TIGF</Sender>
<Receiver>TIENIG</Receiver>
<DocNumber>120731</DocNumber>
<DocDate>2008-12-14T16:36:43.9288481+01:00</DocDate>
<DocType>J</DocType>
<Contract>TIGF-TIENIG</Contract>
</Header>
<ListOfGasDays>
<GasDay>
<Day>2008-12-15</Day>
<BusinessRuleFlag>Processed by adjacent TSO</BusinessRuleFlag>
<ListOfLibri>
<Libro>
<Logid>62</Logid>
<Isbn>88-251-7194-3</Isbn>
<Autore>Elisa Bertino</Autore>
<Titolo>Sistemi di basi di dati - Concetti e architetture</Titolo>
<Anno>1997</Anno>
<Collocazione>Dentro</Collocazione>
<Genere>Informatica</Genere>
<Lingua>Italiano</Lingua>
</Libro>
<Libro>
<Logid>63</Logid>
<Isbn>978-88-04-56981-7</Isbn>
<Autore>Dan Brown</Autore>
<Titolo>Crypto</Titolo>
<Anno>1998</Anno>
<Collocazione>Dentro</Collocazione>
<Genere>Thriller</Genere>
<Lingua>Italiano</Lingua>
</Libro>
</ListOfLibri>
</GasDay>
<GasDay>
<Day>2008-12-15</Day>
<BusinessRuleFlag>Confirmed</BusinessRuleFlag>
<ListOfLibri>
<Libro>
<Logid>64</Logid>
<Isbn>978-88-6061-131-4</Isbn>
<Autore>Stephen King</Autore>
<Titolo>Cell</Titolo>
<Anno>2006</Anno>
<Collocazione>Dentro</Collocazione>
<Genere>Horror</Genere>
<Lingua>Italiano</Lingua>
</Libro>
<Libro>
<Logid>65</Logid>
<Isbn>1-56592-697-8</Isbn>
<Autore>David C. Kreines</Autore>
<Titolo>Oracle SQL - The Essential Reference</Titolo>
<Anno>2000</Anno>
<Collocazione>Dentro</Collocazione>
<Genere>Informatica</Genere>
<Lingua>Inglese</Lingua>
</Libro>
<Libro>
<Logid>66</Logid>
<Isbn>978-88-6061-131-4</Isbn>
<Autore>Stephen King</Autore>
<Titolo>Cell</Titolo>
<Anno>2006</Anno>
<Collocazione>Dentro</Collocazione>
<Genere>Horror</Genere>
<Lingua>Italiano</Lingua>
</Libro>
</ListOfLibri>
</GasDay>
</ListOfGasDays>
<ListOfGeneralNotes>
<GeneralNote>
<Code>100</Code>
<Message>Rien a signaler</Message>
</GeneralNote>
</ListOfGeneralNotes>
and use this control file:
load data
infile "Esempio.XML" "str '</Libro>'"
BADFILE "libri.bad"
DISCARDFILE "libri.dis"
DISCARDMAX 10000
truncate
into table LIBRI
TRAILING NULLCOLS
(
dummy filler terminated by '<Libro>',
Logid enclosed by "<Logid>" and "</Logid>",
Isbn enclosed by "<Isbn>" and "</Isbn>",
Autore enclosed by "<Autore>" and "</Autore>",
Titolo enclosed by "<Titolo>" and "</Titolo>",
Anno enclosed by "<Anno>" and "</Anno>",
Collocazione enclosed by "<Collocazione>" and "</Collocazione>",
Genere enclosed by "<Genere>" and "</Genere>",
Lingua enclosed by "<Lingua>" and "</Lingua>"
)
It gets loaded, but I always get an error on the first record.
Can someone tell me why? Should I set up the control file differently?
thanks
I have the following XML data file and had the same loading issue.
<?xml version="1.0"?>
<Settlement_Info>
<file_header>
<agency_id>129</agency_id>
<agency_form_number/>
<omb_form_number/>
<treasury_account_symbol/>
<percent_of_amount>100.00</percent_of_amount>
</file_header>
<body_item>
<item_header>
<deposit_ticket_number>001296</deposit_ticket_number>
<total_amount_of_sf215>1,318,542,280.16</total_amount_of_sf215>
<number_of_collections>3,929</number_of_collections>
<total_of_all_collections>1,318,542,280.16</total_of_all_collections>
</item_header>
<item_detail_record>
<paygov_tx_id>FMG4</paygov_tx_id>
<agency_tx_id>0000015901</agency_tx_id>
<collection_amount>8,688.70</collection_amount>
<collection_method>ACH</collection_method>
<deposit_ticket_number>96</deposit_ticket_number>
<settlement_date>12/15/2009</settlement_date>
<collection_status>SETTLED</collection_status>
<submitter_name>MORRIS</submitter_name>
</item_detail_record>
<item_detail_record>
<paygov_tx_id>FMG5</paygov_tx_id>
<agency_tx_id>0000015902</agency_tx_id>
<collection_amount>42,198.66</collection_amount>
<collection_method>ACH</collection_method>
<deposit_ticket_number>001296</deposit_ticket_number>
<settlement_date>12/15/2009</settlement_date>
<collection_status>SETTLED</collection_status>
<submitter_name>CASTLE</submitter_name>
</item_detail_record>
<item_detail_record>
<paygov_tx_id>4FMG6</paygov_tx_id>
<agency_tx_id>0000015903</agency_tx_id>
<collection_amount>57,278.25</collection_amount>
<collection_method>ACH</collection_method>
<deposit_ticket_number>001296</deposit_ticket_number>
<settlement_date>12/15/2009</settlement_date>
<collection_status>SETTLED</collection_status>
<submitter_name>FRANKLIN</submitter_name>
</item_detail_record>
</body_item>
<file_footer>
<file_name>ACHActivityFile_12152009.xml</file_name>
<file_creation_date>12/15/2009 10:08:31 AM</file_creation_date>
</file_footer>
</Settlement_Info>
Control file
load data
infile 'C:\sample1.xml' "str '</item_detail_record>'"
truncate
into table xml_test2
TRAILING NULLCOLS
(
dummy filler terminated by "<item_detail_record>",
paygov_tx_id enclosed by "<paygov_tx_id>" and "</paygov_tx_id>",
agency_tx_id enclosed by "<agency_tx_id>" and "</agency_tx_id>",
collection_amount enclosed by "<collection_amount>" and "</collection_amount>",
collection_method enclosed by "<collection_method>" and "</collection_method>",
deposit_ticket_number enclosed by "<deposit_ticket_number>" and "</deposit_ticket_number>",
settlement_date enclosed by "<settlement_date>" and "</settlement_date>",
collection_status enclosed by "<collection_status>" and "</collection_status>",
submitter_name enclosed by "<submitter_name>" and "</submitter_name>"
)
table structure
CREATE TABLE XML_TEST2
(
PAYGOV_TX_ID VARCHAR2(30 BYTE),
AGENCY_TX_ID VARCHAR2(30 BYTE),
COLLECTION_AMOUNT VARCHAR2(30 BYTE),
COLLECTION_METHOD VARCHAR2(30 BYTE),
DEPOSIT_TICKET_NUMBER VARCHAR2(30 BYTE),
SETTLEMENT_DATE VARCHAR2(30 BYTE),
COLLECTION_STATUS VARCHAR2(30 BYTE),
SUBMITTER_NAME VARCHAR2(60 BYTE)
);
If I remove the <file_header> and <item_header> blocks, the control file works perfectly; otherwise it skips the first record.
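A hedged guess at why only the first record fails: the dummy FILLER field defaults to CHAR(255), and the <file_header>/<item_header> text sitting in front of the first <item_detail_record> is longer than 255 bytes, so the first logical record overflows the filler and is rejected. Giving the filler an explicit length (4000 is an assumption, sized to cover the header block) may fix it without removing the headers:

```
dummy filler char(4000) terminated by "<item_detail_record>",
```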
thanks
Reji -
Use Oracle directory object in SQL*loader?
Hi All,
We have a bunch of flatfiles that need to be read on a daily basis. We are using SQL*loader to read these files into Oracle.
The files arrive into a different directory every day (/filesDDMMYY/). We now manually copy these files into the static directory pointed to in our ctl file. I was wondering if it's possible to use an Oracle directory object to point to these data files, instead of the physical directory we use now.
Now we use: INFILE './sources/mydata.txt', but I would like to make this a dynamic reference to a directory with a different name.
I searched the documentation and the internet quite extensively, but cannot find an answer on whether it's possible to use directory objects in conjunction with SQL*Loader.
Any help or suggestions would be appreciated.
Greetz,
Toin.
Message was edited by:
Toin ~ corrected typo
You can remove the INFILE parameter from the CTL files, and instead specify it on the command line (DATA=./sources...).
Obviously this would still require changing every ctl file, but you would only need to do it once, not every time you change a directory.
Of course, the shell script which runs sqlldr would need to change. However, you could make the shell script more robust by having it connect to sqlplus to look up the actual directory path from ALL_DIRECTORIES, and then use that when calling sqlldr.
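The look-up-then-call idea can be sketched like this (the credentials, directory object name LOAD_DIR, and file names are examples; the directory object itself would be repointed daily with ALTER DIRECTORY or CREATE OR REPLACE DIRECTORY):

```shell
#!/bin/sh
# Hypothetical wrapper: ask the database for the directory path, then load.
DIR=`sqlplus -s scott/tiger <<EOF
set heading off feedback off
select directory_path from all_directories where directory_name = 'LOAD_DIR';
EOF`
sqlldr scott/tiger control=load.ctl data="$DIR/mydata.txt" log=load.log
```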
All,
I have two tables, HEADER_TABLE and LINE_TABLE. Each header record can have multiple line records. I have to load data from a flat file into these tables. The flat file can have two types of records: H (header) and L (line). It looks as follows; each H record can have multiple corresponding L records.
H..........
L.......
L......
L......
H.........
L.......
L......
L......
I have HEADER_ID column in HEADER_TABLE and HEADER_ID, LINE_ID columns in the LINE_TABLE.
While loading data using SQL Loader, I need to generate HEADER_ID and LINE_ID values as follows and load them.
H..........<HEADER_ID = 1>
L....... <HEADER_ID = 1><LINE_ID = 1>
L...... <HEADER_ID = 1><LINE_ID = 2>
L...... <HEADER_ID = 1><LINE_ID = 3>
H......... <HEADER_ID = 2>
L....... <HEADER_ID = 2><LINE_ID = 4>
L...... <HEADER_ID = 2><LINE_ID = 5>
L...... <HEADER_ID = 2><LINE_ID = 6>
Is it possible to do this with SQL*Loader?
I tried to do this with sequences. But it loaded the tables as follows.
H..........<HEADER_ID = 1>
L....... <HEADER_ID = 1><LINE_ID = 1>
L...... <HEADER_ID = 1><LINE_ID = 2>
L...... <HEADER_ID = 1><LINE_ID = 3>
H......... <HEADER_ID = 2>
L....... <HEADER_ID = 1><LINE_ID = 4>
L...... <HEADER_ID = 1><LINE_ID = 5>
L...... <HEADER_ID = 1><LINE_ID = 6>
Thanks
Ketha
Morgan,
Examples given in the link are quite generic and I have tried them. But my requirement is focused on generating header_id and line_id values as I have described. It seems that SQLLDR scans all records for a particular WHEN clause and inserts them into the specified table. I think that if SQLLDR were made to read the records in the data file sequentially, this could be done.
Any idea of how to make SQLLDR read the records from the file sequentially?
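One hedged approach that is sometimes used for this master/detail pattern: a small PL/SQL package that remembers the most recent header id, so the L records can reuse it. The package name is hypothetical, and the whole idea rests on the assumption that a single-threaded, conventional-path load evaluates the SQL expressions row by row in file order — worth verifying on your version before trusting it:

```sql
CREATE OR REPLACE PACKAGE hdr_pkg AS
  FUNCTION next_header RETURN NUMBER;  -- called for each H record
  FUNCTION curr_header RETURN NUMBER;  -- called for each L record
END hdr_pkg;
/
CREATE OR REPLACE PACKAGE BODY hdr_pkg AS
  g_id NUMBER := 0;  -- session-level state: the current header id

  FUNCTION next_header RETURN NUMBER IS
  BEGIN
    g_id := g_id + 1;
    RETURN g_id;
  END next_header;

  FUNCTION curr_header RETURN NUMBER IS
  BEGIN
    RETURN g_id;
  END curr_header;
END hdr_pkg;
/
```

In the control file, the INTO TABLE HEADER_TABLE clause (WHEN the record type is 'H') would set HEADER_ID with the expression "hdr_pkg.next_header", while the INTO TABLE LINE_TABLE clause (WHEN 'L') would use "hdr_pkg.curr_header" for HEADER_ID and an ordinary sequence for LINE_ID.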
Thanks
Ketha -
How can I load data into table with SQL*LOADER
how can I load data into a table with SQL*Loader
when the column data length is more than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
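In short, the linked section describes SQL*Loader's default limit: a delimited character field is read into a 255-byte buffer unless you declare a length. Declaring column E with an explicit CHAR length matching its VARCHAR2(2000) definition avoids the truncation:

```
load data
append into table A
fields terminated by X'09'
(A, B, C,
 E CHAR(2000))  -- explicit length; the default buffer for a delimited field is 255 bytes
```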