Sqlldr bindsize parameter
Using sqlldr to import data:
sqlldr ${user_id} control=${ctl_file} rows=10000 bindsize=8192 readsize=8192 errors=999999 log=${log_file}/${today_DATE}.log
When executing:
Commit point reached - logical record count 2
Commit point reached - logical record count 4
Commit point reached - logical record count 6
Commit point reached - logical record count 8
Commit point reached - logical record count 10
Commit point reached - logical record count 12
In the data file, every row is only 80 characters (that is, 80 bytes). However, the log file shows:
Space allocated for bind array: 6714 bytes(2 rows)
Why 6714 bytes? Apart from the 160 bytes for the two rows of data, what accounts for the rest?
Thanks a lot for your kind reply.
Table has 14 fields:
create table TBL (
A VARCHAR2(3) not null,
B VARCHAR2(10) not null,
C VARCHAR2(5) not null,
D VARCHAR2(1) not null,
E VARCHAR2(4) not null,
F VARCHAR2(4) not null,
G VARCHAR2(6) not null,
H VARCHAR2(1) not null,
I VARCHAR2(1) not null,
J VARCHAR2(5) not null,
K VARCHAR2(6) not null,
L VARCHAR2(1) not null,
M TIMESTAMP(6) not null,
N VARCHAR2(2) not null
);
The declared size of all columns adds up to about 55 bytes. BINDSIZE is set to 8192, yet only two rows are loaded per commit, so the per-row allocation is obviously not 55 bytes. The control file looks like:
TRUNCATE
INTO TABLE TBL
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(A "upper(trim(:A))"
,B "upper(trim(:B))"
,C "upper(trim(:C))"
,D "upper(trim(:D))"
,E "upper(trim(:E))"
,F "upper(trim(:F))"
,G "upper(trim(:G))"
,H "upper(trim(:H))"
,I "upper(trim(:I))"
,J "upper(trim(:J))"
,K "upper(trim(:K))"
,L "upper(trim(:L))"
,M "upper(trim(:M))"
,N "upper(trim(:N))"
)
It is not "14*255" as you said.
Could you give me some advice? Thanks a lot.
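A likely explanation, hedged: in a conventional-path load with delimited fields, SQL*Loader reserves each character field's maximum possible length in the bind array, which defaults to 255 bytes when no length is declared, plus a small per-field length indicator, regardless of the 80 bytes actually present per line. Using only the figure from your own log (6714 bytes / 2 rows = 3357 bytes per row), the rows-per-commit arithmetic can be sketched as:

```python
# Sketch of SQL*Loader's conventional-path bind array sizing.
# The per-row figure (3357 bytes) is taken from the log above
# (6714 bytes / 2 rows); the exact per-field overhead varies by
# version and datatype, so this is illustrative, not authoritative.

BINDSIZE = 8192              # from the command line
ROWS_REQUESTED = 10000       # from the command line
row_bind_length = 6714 // 2  # 3357 bytes per row, per the log

# SQL*Loader silently lowers ROWS so the bind array fits in BINDSIZE
rows_per_commit = min(ROWS_REQUESTED, BINDSIZE // row_bind_length)
print(rows_per_commit)       # 2 -> hence "Commit point reached" every 2 records
```

Declaring explicit lengths in the control file (e.g. A CHAR(3)) so fields stop defaulting to 255 bytes, or raising BINDSIZE, are both ways to get more rows per commit.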
Similar Messages
-
Sql loader not able to load more than 4.2 billion rows
Hi,
I am facing a problem with Sql loader utility.
The SQL*Loader process stops after loading 342 GB of data, i.e. it seems to load 4.294 billion of the 6.9 billion rows, and the loader process completes without throwing any error.
Is there any limit on the volume of data SQL*Loader can load?
Also, how can I identify that the SQL*Loader process has not loaded all the data from the file, given that it does not throw any error?
Thanks in advance.
It may be a problem with the DB configuration, not with SQL*Loader. Check all the parameters. (4.294 billion is suspiciously close to 2^32 = 4,294,967,296, so a 32-bit counter limit somewhere in the chain may be worth checking.)
Maximizing SQL*Loader Performance
SQL*Loader is flexible and offers many options that should be considered to maximize the speed of data loads. These include:
1. Use Direct Path Loads - The conventional path loader essentially loads the data by using standard insert statements. The direct path loader (direct=true) loads directly into the Oracle data files and creates blocks in Oracle database block format. Because no SQL is being issued, the entire process is much less taxing on the database. There are certain cases, however, in which direct path loads cannot be used (e.g. clustered tables). To prepare the database for direct path loads, the script $ORACLE_HOME/rdbms/admin/catldr.sql must be executed.
2. Disable Indexes and Constraints. For conventional data loads only, the disabling of indexes and constraints can greatly enhance the performance of SQL*Loader.
3. Use a Larger Bind Array. For conventional data loads only, larger bind arrays limit the number of calls to the database and increase performance. The size of the bind array is specified using the bindsize parameter. The bind array's size is equivalent to the number of rows it contains (rows=) times the maximum length of each row.
4. Use ROWS=n to Commit Less Frequently. For conventional data loads only, the rows parameter specifies the number of rows per commit. Issuing fewer commits will enhance performance.
5. Use Parallel Loads. Available with direct path data loads only, this option allows multiple SQL*Loader jobs to execute concurrently.
$ sqlldr control=first.ctl parallel=true direct=true
$ sqlldr control=second.ctl parallel=true direct=true
6. Use Fixed Width Data. Fixed width data format saves Oracle some processing when parsing the data. The savings can be tremendous, depending on the type of data and number of rows.
7. Disable Archiving During Load. While this may not be feasible in certain environments, disabling database archiving can increase performance considerably.
8. Use unrecoverable. The unrecoverable option (unrecoverable load data) disables the writing of the data to the redo logs. This option is available for direct path loads only.
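Several of the points above can be combined. As a hedged sketch (the table, file, and column names here are hypothetical, not taken from any thread above), a direct, parallel, unrecoverable load might look like:

```
-- hypothetical control file: orders.ctl
UNRECOVERABLE
LOAD DATA
INFILE 'orders1.dat'
APPEND
INTO TABLE orders
FIELDS TERMINATED BY ','
(order_id, customer_id, amount)
```

invoked as, e.g.:
$ sqlldr control=orders.ctl parallel=true direct=true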
Edited by: 879090 on 18-Aug-2011 00:23 -
Sql loader issue(How to specify a file that exists on remote server)
In the sqlldr INFILE parameter I'd like to specify a data file that exists on a remote server. How can I do that? Please help me with the syntax.
Any help would be greatly appreciated.
Edited by: 792353 on Sep 24, 2010 7:22 AM
sqlldr can accept any path that is valid, and it can be any type of share as long as the OS supports the share.
so INFILE can be anything you want as sqlldr will simply attempt to access it via the OS.
If you are on a Linux box going to another Linux box, with an NFS mount point on the box running sqlldr, simply reference:
INFILE '/mountpoint/fname'
Now, I have never tried a UNC path before, but I would guess that if you are on a Windows box going to another Windows box, and the box running sqlldr was logged in with the right permissions, it would simply be:
INFILE '\\server\directory\file'
I doubt that it will accept a URL as in:
INFILE '//servername.com/directory/file'
I don't think that sqlldr does anonymous FTP or HTTP file transfer, but I could be wrong.
NOTE: I have found that it is best to ALWAYS surround your INFILE parameter with single quotes. -
How to change NLS_NUMERIC_CHARACTERS parameter for OWB SQLLDR mapping
Hi,
How do I change the NLS_NUMERIC_CHARACTERS database parameter for my SQLLDR mapping?
I have an input flat file whose numeric data uses ',' as the decimal separator, i.e. NLS_NUMERIC_CHARACTERS = ',.'.
However, in my target Oracle schema the decimal separator is '.', i.e. NLS_NUMERIC_CHARACTERS = '.,'.
My OWB version is 10.2.
When I checked the configuration parameters of the SQL*Loader mapping and the flat file operator, there is a facility to change the language, but not the NLS_NUMERIC_CHARACTERS setting.
I do not want to change the NLS_NUMERIC_CHARACTERS setting in my database as there are many other projects which will get impacted.
We have a workaround, described below, using an external table and a premap procedure. But as I have many mappings already developed, it is not feasible to apply this workaround everywhere:
- I can use premapping procedure with external tables to populate.
- NLS_NUMERIC_CHARACTERS setting can be changed using procedure for that particular session.
Is there a way to change NLS_NUMERIC_CHARACTERS setting only for that particular mapping/mapping session?
Thanks,
SriGP
At this moment, this is not possible. You can see Metalink note ID 268906.1.
It says:
Currently, external tables always use the setting of NLS_NUMERIC_CHARACTERS at the database level.
Cheers
Marisol -
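For a plain sqlldr conventional load (outside the OWB external-table limitation quoted above), a common per-field workaround is to do the conversion in the control file's SQL expression, so nothing changes at database level. A hedged sketch, with a hypothetical field name amount and format mask:

```
-- hypothetical control-file field clause: treat ',' as the decimal
-- separator for this field only, via TO_NUMBER's NLS argument
amount "TO_NUMBER(:amount, '999999999D99', 'NLS_NUMERIC_CHARACTERS='',.''')"
```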
Problem with sqlldr and commit
Hi,
i have a problem with sqlldr and commit.
I have a simple table with one column [ col_id number(6) not null ]. The column col_id is the primary key of the table. I have one file with 100,000 records (the numbers from 0 to 99,999).
I want to load the file into the table with sqlldr (SQL*Loader), but I want to commit only if all records load; if even one record is discarded, I want the whole file rolled back.
The problem is that in conventional path a commit happens every 64 rows by default, so all-or-nothing isn't possible that way, and in direct path sqlldr disables the primary key :(
There are a solutions?
Thanks
Sorry for the bad English. This is my table:
DROP TABLE TEST_SQLLOADER;
CREATE TABLE TEST_SQLLOADER
( COL_ID NUMBER NOT NULL,
CONSTRAINT TEST_SQLLOADER_PK PRIMARY KEY (COL_ID)
);
This is my ctlfile ( test_sql_loader.ctl )
OPTIONS (
DIRECT=false
,DISCARDMAX=1
,ERRORS=0
,ROWS=100000
)
load data
infile './test_sql_loader.csv'
append
into table TEST_SQLLOADER
fields terminated by "," optionally enclosed by '"'
( col_id )
test_sql_loader.csv
0
1
2
3
99999
I run SQL*Loader:
sqlldr xxx/yyy@orcl control=test_sql_loader.ctl log=test_sql_loader.log
output on the screen
Commit point reached - logical record count 92256
Commit point reached - logical record count 93248
Commit point reached - logical record count 94240
Commit point reached - logical record count 95232
Commit point reached - logical record count 96224
Commit point reached - logical record count 97216
Commit point reached - logical record count 98208
Commit point reached - logical record count 99200
Commit point reached - logical record count 100000
Logfile
SQL*Loader: Release 11.2.0.1.0 - Production on Sat Oct 3 14:50:17 2009
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Control File: test_sql_loader.ctl
Data File: ./test_sql_loader.csv
Bad File: test_sql_loader.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 100000 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_SQLLOADER, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
COL_ID FIRST * , O(") CHARACTER
value used for ROWS parameter changed from 100000 to 992
Table TEST_SQLLOADER:
100000 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255936 bytes(992 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 100000
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Sat Oct 03 14:50:17 2009
Run ended on Sat Oct 03 14:50:18 2009
Elapsed time was: 00:00:01.09
CPU time was: 00:00:00.06
The commit is every 992 rows;
if I have an error on record 993, the first 992 rows are already committed :(
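The log lines above explain the behaviour: with the default BINDSIZE cap of 256000 bytes, sqlldr silently cut ROWS from 100000 down to what fits. A sketch of the arithmetic, taking the per-row bind length from the log itself (255936 / 992 = 258 bytes):

```python
# Why ROWS was cut to 992, and roughly what BINDSIZE a single commit needs.
rows_wanted = 100_000
bindsize_default = 256_000      # "maximum of 256000 bytes" in the log
row_len = 255_936 // 992        # 258 bytes per row, per the log

# Matches "value used for ROWS parameter changed from 100000 to 992"
assert bindsize_default // row_len == 992

needed = rows_wanted * row_len  # bindsize for one bind array of 100000 rows
print(needed)                   # 25800000
```

So raising BINDSIZE (and READSIZE) to roughly 26 MB, e.g. bindsize=26000000, should give a single commit for the whole file, though an external table loaded with a single INSERT ... SELECT inside one transaction is a cleaner way to get true all-or-nothing behaviour.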
Edited by: inter1908 on 3-ott-2009 15.00 -
Using prepareStatement with parameter in existsNode() function
Hello,
I have an xml file like the following:
<MyDocRoot id="myId" xmlns="..." xmlns:xsi="..." xsi:schemaLocation="...">
</MyDocRoot>
I am retrieving my documents from an XMLType table with aquery like:
SELECT OBJECT_VALUE FROM MY_TABLE
WHERE existsNode(OBJECT_VALUE, '/MyDocRoot[@id="myId"]') = 1;
When I run it using oracle client everything works fine and great.
Now I would like to execute that query from a Java application, using prepared statement, so with a query now like:
SELECT OBJECT_VALUE FROM MY_TABLE
WHERE existsNode(OBJECT_VALUE, '/MyDocRoot[@id="?"]') = 1;
And passing the id value as the parameter at execution time.
The problem I am facing is that it seems that the prepare statement does not recognize the parameter at that place. I get an error specifying that one parameter was provided but none was expected.
I get the same problem when I try to update a document in the database. In this case I have the query is:
UPDATE MY_TABLE SET OBJECT_VALUE = XMLType(?)
WHERE existsNode(OBJECT_VALUE, '/MyDocRoot[@id="?"]') = 1
The error I receive specify that 2 parameters were provided but only one was expected (the parameter in the XMLType() is recognized but not the one in the XPATH expression)
Does anybody have an example of how to use a prepared statement with parameters in an XPath expression like the above?
Try this:
package com.oracle.st.xmldb.pm.examples;
import com.oracle.st.xmldb.pm.common.baseApp.BaseApplication;
import oracle.jdbc.OraclePreparedStatement;
import oracle.jdbc.OracleResultSet;
import oracle.xdb.XMLType;
import oracle.xml.parser.v2.XMLDocument;
public class GetXMLType extends BaseApplication {
  public void doSomething(String[] Args) throws Exception {
    // Concatenate the bind variable into the XPath string so the driver
    // sees a real placeholder instead of a literal '?' inside quotes.
    OraclePreparedStatement statement = null;
    String statementText;
    statementText = "select object_value from PURCHASEORDER where existsNode(object_value,'/PurchaseOrder[Reference=\"' || :1 || '\"]') = 1";
    OracleResultSet resultSet = null;
    XMLDocument doc = null;
    XMLType xml;
    statement = (OraclePreparedStatement) getConnection().prepareStatement(statementText);
    statement.setString(1, "AHUNOLD-20040817185414366GMT");
    resultSet = (OracleResultSet) statement.executeQuery();
    while (resultSet.next()) {
      xml = (XMLType) resultSet.getObject(1);
      doc = (XMLDocument) xml.getDocument();
      doc.print(System.out);
    }
    resultSet.close();
    statement.close();
    getConnection().close();
  }
  public static void main(String[] args) {
    try {
      GetXMLType example = new GetXMLType();
      example.initializeConnection();
      example.doSomething(args);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}
Gives:
C:\TEMP>
C:\oracle\product\11.1.0\db_1\jdk\bin\javaw.exe -client -classpath C:\xdb\JDeveloper\Classes;C:\oracle\product\11.1.0\db_1\jdbc\lib\ojdbc5.jar;C:\oracle\product\11.1.0\db_1\LIB\xmlparserv2.jar;C:\oracle\product\11.1.0\db_1\RDBMS\jlib\xdb.jar;C:\oracle\JDeveloper\j2ee\home\oc4j.jar;C:\oracle\JDeveloper\j2ee\home\lib\servlet.jar -Dcom.oracle.st.xmldb.pm.ConnectionParameters=C:\\xdb\\jdeveloper\\SimpleExamples\\LocalConnection.xml -Dhttp.proxyHost=www-proxy.us.oracle.com -Dhttp.proxyPort=80 -Dhttp.nonProxyHosts=localhost|us.oracle.com|*.oracle.com -Dhttps.proxyHost=www-proxy.us.oracle.com -Dhttps.proxyPort=80 -Dhttps.nonProxyHosts=localhost|us.oracle.com|*.oracle.com com.oracle.st.xmldb.pm.examples.GetXMLType -mx2048M
Using connection Parameters from : C:\\xdb\\jdeveloper\\SimpleExamples\\LocalConnection.xml
ConnectionProvider.establishConnection(): Connecting as SQLLDR/SQLLDR@jdbc:oracle:oci8:@(description=(address=(host=localhost)(protocol=tcp)(port=1521))(connect_data=(service_name=ORA11GR1.xp.mark.drake.oracle.com)(server=DEDICATED)))
ConnectionProvider.establishConnection(): Database Connection Established
<PurchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://localhost:8080/home/SCOTT/poSource/xsd/purchaseOrder.xsd">
<Reference>AHUNOLD-20040817185414366GMT</Reference>
<Actions>
<Action>
<User>AHUNOLD</User>
</Action>
</Actions>
<Reject/>
<Requestor>Alexander Hunold</Requestor>
<User>AHUNOLD</User>
<CostCenter>A60</CostCenter>
<ShippingInstructions>
<name>Elizabeth Bates</name>
<address>Magdalen Centre, The Oxford Science Park,
Oxford,
Oxford OX9 9ZB
United Kingdom</address>
<telephone>980-985-4081</telephone>
</ShippingInstructions>
<SpecialInstructions>Next Day Air</SpecialInstructions>
<LineItems>
<LineItem ItemNumber="1">
<Description>Farewell, My Concubine</Description>
<Part Id="717951002723" UnitPrice="19.99" Quantity="3"/>
</LineItem>
<LineItem ItemNumber="2">
<Description>Willy Wonka and the Chocolate Factory</Description>
<Part Id="85392229123" UnitPrice="19.99" Quantity="4"/>
</LineItem>
<LineItem ItemNumber="3">
<Description>Best In Show</Description>
<Part Id="85391895121" UnitPrice="19.99" Quantity="3"/>
</LineItem>
<LineItem ItemNumber="4">
<Description>The Reggae Movie</Description>
<Part Id="13023004597" UnitPrice="19.99" Quantity="1"/>
</LineItem>
</LineItems>
</PurchaseOrder>
Process exited with exit code 0. -
Sqlldr- ORA-00060: deadlock detected while waiting for resource
Hi Team,
My database version is 11.1.0.7.0. I am loading data using sqlldr and I am getting the error ORA-00060: deadlock detected while waiting for resource. Once a deadlock is detected, will the data be rejected after the commit point is reached (ROWS=100000)? For information, only sqlldr is running against the database, and it completes within 5-10 minutes. Please help me determine whether some other lock is colliding with sqlldr.
sqlldr userid=orcl/orcle control=".$controlfile." log=".$logfile.".log data=".$datafile." bad=".$badfile." discard=".$discardfile." Bindsize=19000000 Rows=100000 Readsize=20000000 Errors=1000000";
Thanks in advance.
Additional clues will exist within the alert_<SID>.log file and the subsequent trace files.
(Urgent) How to run the sqlldr script in an OWB process flow?
dear all:
In my Oracle warehouse I have to load many *.dat files
into the database with sqlldr in an OWB process flow. In the OWB process flow, I use the external process operator to run the sqlldr script with the following configuration:
1:======external process==========
command : /app/ftpfile/sqlldr2.sh
parameter list:
success_threshold:0
script:
================================
2: create a file location in the FILE LOCATION node:
=============
ODS_LOCAL_LOC
=============
3: in the runtime repository i register the location
============
user name: oracle (since sqlldr should run as the oracle user)
password : oracle
host name: localhost
root path: /app/ftpfile/
============
4:configure the process flow
============
path settings
working locations:ods_local_loc
============
after deploy them success in runtime repository,
I run it, and it shows me the following error:
==========
SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
ORA-12545: Connect failed because target host or object does not exist
===========
please help me!
with best regards!
Hello,
our developers were getting this error code just the other day. They are using the "sqlplus_exec_template" script to initiate these things. In our case, I had to do two things:
1) Modify their "initiator" script (the one that connects to runtime access user, and then calls "template") - it has to use tns connectivity "user/passwd@service_name"
2) Create TNS entry (server side) for the "service_name" above.
Now these SQL*LOADER mappings run successfully.
Alex. -
Error loading xml file with sqlldr
Hi there,
I am having trouble loading an xml file via sqlldr into oracle.
The version i am running is Oracle Database 10g Release 10.2.0.1.0 - 64bit Production and the file size is 464 MB.
It ran for about 10 hours trying to load the file and then threw up the error:
ORA-22813: operand value exceeds system limits.
I have loaded a file of 170MB using the same process succesfully.
Any Ideas?
Cheers,
Dan.
I looked a bit into the issue (ORA-22813), and although it can be caused by a lot of things varying across database versions, you could have a go at sizing up your PGA database parameter. See Oracle Support Doc ID 837220.1 for more info.
The following might help
CREATE OR REPLACE PROCEDURE show_pga_memory (context_in IN VARCHAR2 DEFAULT NULL)
-- SELECT privileges are required on SYS.v_$session, SYS.v_$sesstat
-- and SYS.v_$statname. Here are the statements you should run first:
--   GRANT SELECT ON SYS.v_$session TO <schema>;
--   GRANT SELECT ON SYS.v_$sesstat TO <schema>;
--   GRANT SELECT ON SYS.v_$statname TO <schema>;
IS
l_memory NUMBER;
BEGIN
SELECT st.VALUE
INTO l_memory
FROM SYS.v_$session se, SYS.v_$sesstat st, SYS.v_$statname nm
WHERE se.audsid = USERENV ('SESSIONID')
AND st.statistic# = nm.statistic#
AND se.SID = st.SID
AND nm.NAME = 'session pga memory';
DBMS_OUTPUT.put_line (CASE WHEN context_in IS NULL
THEN NULL
ELSE context_in || ' - '
END
|| 'PGA memory used in session = ' || TO_CHAR (l_memory));
END show_pga_memory;
/ -
Sqlldr does not understand unicode characters in file names
Hello,
I am trying to call sqlldr from a .NET application on Windows to bulk load some data. The parameter, control, data, and log files used by sqlldr are all located in the C:\Configuración directory (note the Unicode character in the directory name).
Here is my parfile:
control='C:\Configuración\SystemResource.ctl'
direct=true
errors=0
log='C:\Configuración\SystemResource.log'
userid=scott/tiger@orasrv
When I make a call as
sqlldr -parfile='C:\Configuración\SystemResource.par'
I am getting:
SQL*Loader-100: Syntax error on command-line
If I run it as
sqlldr -parfile='C:\Config~1\SystemResource.par'
I am getting:
SQL*Loader-522: lfiopn failed for file (C:\Configuraci├│n\SystemResource.log)
If I remove the log= parameter from the parameter file, I am getting
SQL*Loader-500: Unable to open file (C:\Configuraci├│n\SystemResource.ctl)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
Can anyone suggest a way to handle unicode/extended ASCII characters in file names?
Thanks,
Alex.
Werner, thank you for replying to my post.
In my real application, I actually store the files in %TEMP%, which on Spanish and Portuguese Windows has "special" characters (e.g. '...\Administrador\Configuración local\Temp\'). In addition, you can have a user with the "special" characters in the name which will become part of %TEMP%.
Another problem is that 8.3 name creation may be disabled on NTFS partitions.
Problem #3 is that the short file names that have "special" characters are not converted correctly by GetShortPathName windows API, e.g. "Configuración" will be converted to "Config~1", but for "C:\ración.txt" the api will return the same "C:\ración.txt", even though dir /x displays "RACIN~1.TXT". Since I am creating the parameter and control files programmatically from a .net application, I have to PInvoke GetShortPathName.
Any other ideas?
Thanks,
Alex. -
Insert statement with timezone asking for parameter
I have an insert statement in a file.
When the whole file is executed, it inserts around 300 rows into a table.
One of the insert statements is failing. This insert statement inserts a query into a column of a table.
It is as follows:
Insert into hr.TABLE_PATH
(PATH, TABLE_OWNER_NAME, TABLE_NAME, PATH_NAME, TABLE_ACCESS_PATH_DESC, TABLE_PATH_SQL_TXT, PROCESS_CODE, DELETE_IND, ACCESS_ID)
Values
(290, HR, EMPLOYEES, 'Reconc', 'XYZ', 'SELECT COUNT(*) FROM (SELECT hr.product, TO_TIMESTAMP (TO_CHAR (FROM_TZ (sm.product_tmstp, 'GMT') AT TIME ZONE 'US/Central', 'YYYY-MM-DD HH24:MI:SS.FF6'), 'YYYY-MM-DD HH24:MI:SS.FF6') AS product_tmstp FROM ( SELECT DISTINCT(product) FROM hr.updated_table) upr, RH.product AS OF SCN :1 spr, RH.milestone AS OF SCN :1 sm WHERE spr.product = upr.product AND sm.product = spr.product MINUS SELECT prf.product, mi.product_tmstp FROM ( SELECT MAX(execute_id) exeute_id FROM hr.slave_execute ) cmr, hr.prod prf, hr.hr_info mi WHERE prf.execute_id = cmr.execute_id AND mi.milestone_group_id = prf.milestone_group_id )', 'N ', 'N', 2);
The problem is with this piece of insert:
TO_TIMESTAMP (TO_CHAR (FROM_TZ (sm.product_tmstp, 'GMT') AT TIME ZONE 'US/Central', 'YYYY-MM-DD HH24:MI:SS.FF6'), 'YYYY-MM-DD HH24:MI:SS.FF6') AS product_tmstp
: --> is being considered as a parameter,
whereas the whole SQL text needs to be inserted into the table.
If you are inserting a string, then any quotes inside the string should be replaced with two single quotes.
Try this:
INSERT INTO hr.TABLE_PATH (PATH,
TABLE_OWNER_NAME,
TABLE_NAME,
PATH_NAME,
TABLE_ACCESS_PATH_DESC,
TABLE_PATH_SQL_TXT,
PROCESS_CODE,
DELETE_IND,
ACCESS_ID)
VALUES (
290,
HR,
EMPLOYEES,
'Reconc',
'XYZ',
'SELECT COUNT(*) FROM (SELECT hr.product, TO_TIMESTAMP (TO_CHAR (FROM_TZ (sm.product_tmstp, ''GMT'') AT TIME ZONE ''US/Central'', ''YYYY-MM-DD HH24:MI:SS.FF6''), ''YYYY-MM-DD HH24:MI:SS.FF6'') AS product_tmstp FROM ( SELECT DISTINCT(product) FROM hr.updated_table) upr, RH.product AS OF SCN :1 spr, RH.milestone AS OF SCN :1 sm WHERE spr.product = upr.product AND sm.product = spr.product MINUS SELECT prf.product, mi.product_tmstp FROM ( SELECT MAX(execute_id) exeute_id FROM hr.slave_execute ) cmr, hr.prod prf, hr.hr_info mi WHERE prf.execute_id = cmr.execute_id AND mi.milestone_group_id = prf.milestone_group_id )',
'N ',
'N',
2
);
Note the
TO_TIMESTAMP (TO_CHAR (FROM_TZ (sm.product_tmstp, ''GMT'') AT TIME ZONE ''US/Central'', ''YYYY-MM-DD HH24:MI:SS.FF6''), ''YYYY-MM-DD HH24:MI:SS.FF6'') AS product_tmstp
First off, how are you inserting the data? Using sqlldr or UTL_FILE?
If you are inserting using a script from SQL*Plus, use
SET DEFINE OFF
just to make sure nothing is treated as a parameter.
G.
Edited by: Ganesh Srivatsav on Apr 6, 2011 5:20 PM -
SQL*Loader sqlldr removes zeros from character field
Hello,
I am using SQL*Loader to load an Oracle table, and am having a problem. One of the fields is defined as VARCHAR2 and contains comments entered by a user. There may be numbers or dollar amounts included in this text. When I execute the sqlldr script below, all of the zeros in the text field disappear. There is a TRANSLATE function invoked for this field (bolded statement) in an attempt to remove embedded newlines from the text. Wherever there was a zero in the original text, it ends up being removed after I run this script. Can anyone suggest why this is occurring, and how to prevent it? Could it be related to the TRANSLATE function?
Thanks for your help!
OPTIONS (READSIZE=20971520, BINDSIZE=20971520, ROWS=20000)
LOAD DATA
INFILE 'R24.REGION.ERL.N1E104' "str X'5E5E220A'"
BADFILE 'LOGS/N1E104_BUT_RS_ASSGN_TXT_BADDATA.TXT'
DISCARDFILE 'LOGS/N1E104_BUT_RS_ASSGN_TXT_DISCARDDATA.TXT'
REPLACE
INTO TABLE TESTM8.CONV_BUT_RS_ASSGN_TXT
FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
RST_RS_EXT_TXT_OID DECIMAL EXTERNAL,
RST_RS_ASSGN_OID DECIMAL EXTERNAL NULLIF RST_RS_ASSGN_OID = 'NULL',
RST_TXT_SEQ_NBR INTEGER EXTERNAL,
RST_RS_COMM_OID DECIMAL EXTERNAL,
RST_DIF_ASSGN_OID DECIMAL EXTERNAL NULLIF RST_DIF_ASSGN_OID = 'NULL',
RST_EXTENDED_TXT "SUBSTR(TRANSLATE(:RST_EXTENDED_TXT, '#0x0A', '#'), 1, 248)"
)
Never mind, found my mistake. In the TRANSLATE function, I had assumed that the 0x0A would be interpreted as a single hex value. Instead, it is interpreted literally as the character '0', the character 'x', the character 'A', etc. The result is that the transformed text had no '0', 'x', or 'A' characters, which is exactly what I inadvertently told it to do. I changed it to the following, which works better ;-)
RST_EXTENDED_TXT "SUBSTR(TRANSLATE(:RST_EXTENDED_TXT, '#'||CHR(10), '#'), 1, 250)" -
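Oracle's TRANSLATE maps characters one-for-one, and deletes any from-character that has no matching to-character, which is exactly why '#0x0A' stripped the zeros. A small Python sketch of those semantics (the sample strings are made up for illustration):

```python
# Mimic Oracle TRANSLATE(txt, from_chars, to_chars): a char-for-char map;
# from-characters with no corresponding to-character are deleted.
def oracle_translate(txt, from_chars, to_chars):
    table = {}
    for i, ch in enumerate(from_chars):
        if ch not in table:  # first occurrence of a character wins
            table[ch] = to_chars[i] if i < len(to_chars) else None
    return txt.translate(str.maketrans(table))

# The original call: '#0x0A' is five characters, not '#' plus a hex code,
# so '0', 'x' and 'A' all get deleted from the comment text.
print(oracle_translate("Paid $100 via tx A9", "#0x0A", "#"))  # Paid $1 via t 9

# The corrected call deletes only a real newline, CHR(10):
print(oracle_translate("line one\nline two", "#\n", "#"))     # line oneline two
```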
How to dynamically create sqlldr control file using stored procedure
I am trying to dynamically create the control file (.ctl) and execute it using a stored procedure. I would be passing the file name as a parameter to this procedure. How do I go about doing this?
The control file has the following structure. The file name (mktg) varies and is passed as an input to the stored procedure.
SPOOL mktg.ctl
LOAD DATA
INFILE 'mktg.csv'
INTO TABLE staging
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(COMPANY_NAME,
ADDRESS,
CITY,
STATE,
ZIP)
SPOOL OFF ;
sqlldr scott/tiger CONTROL=mktg.ctl LOG=mktg.log BAD=mktg.bad
We are using Oracle 9i Release 2.
I have not had much success with the creation of log and bad files using external tables when they are being used within a dynamic sql.
Plz check this:
Re: problems related to data loads from excel, CSV files into an oracle 9i db -
Reg..csv file as input parameter in sqlloader
Hi,
I have a .ctl file. Every time, I receive the file with a different name.
Rather than hardcoding the file name, I want to take the .csv file name as an input parameter. Please help me with this.
Here is the code:
OPTIONS (SKIP = 1, BINDSIZE=100000)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE '/WOAU1/bkp/pgp_masterkey.csv'
BADFILE '/WOAU1/bkp/pgp_masterkey.bad'
DISCARDFILE '/WOAU1/bkp/pgp_masterkey.dsc'
Thanks
Atul
A better alternative would be to avoid using SQL*Loader and instead use external tables, for which you can use an ALTER TABLE statement to change the LOCATION of the table (which specifies the file name). (This is a valid reason for using EXECUTE IMMEDIATE in PL/SQL.) That way you keep all your control inside the database, and you're not messing about with OS scripts to pass different parameters to an external program.
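A hedged sketch of that approach, reusing the staging columns from the question above (the directory object load_dir and the table name staging_ext are hypothetical; column sizes are illustrative):

```sql
-- Hypothetical external table over the staging directory; the file name
-- is swapped per run with ALTER TABLE ... LOCATION instead of editing a .ctl.
CREATE TABLE staging_ext (
  company_name VARCHAR2(100),
  address      VARCHAR2(200),
  city         VARCHAR2(50),
  state        VARCHAR2(30),
  zip          VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('mktg.csv')
);

-- Inside the stored procedure, per file name passed in:
-- EXECUTE IMMEDIATE 'ALTER TABLE staging_ext LOCATION (''' || p_file || ''')';
-- INSERT INTO staging SELECT * FROM staging_ext;
```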
-
Sqlldr & SKIP_INDEX_MAINTENANCE
Hi
I am using Informatica to invoke sqlldr to load multiple files into a partitioned table A, with
DIRECT=TRUE &
PARALLEL=TRUE
Table A has indexes on it which have been made UNUSABLE through a script. (I don't want to drop and recreate them.)
Through informatica I cannot set the parameter
SKIP_INDEX_MAINTENANCE=TRUE
Is there any method through which I can set it as an environment variable on Unix and pass it to sqlldr?
OS: HP UX
DB: 10g R2
Any help is appreciated.
Thread to be continued at --> Re: sqlldr exit code with SKIP_INDEX_MAINTENANCE
~ Madrid