SUBSTR function in CTL file of SQL*Loader
Hi gurus,
I am using SQL*Loader to load data from a text file into an Oracle table.
Two columns in the input data are larger than their corresponding destination table columns.
I am using the SUBSTR() function in the control file, but it fails for some records in two scenarios, detailed below.
First scenario:
The destination table column is 250 characters.
SUBSTR() works fine when the source data is less than 255 characters, but when it is more than 255 characters the record is rejected as a bad record.
E.g. if the source data is 254 characters, only 250 characters are inserted into the table; but if it is 256 characters, the record is not inserted at all.
Here is the syntax I am using:
ENQUIRY "SUBSTR(:ENQUIRY, 0, 250)",
Second scenario:
The destination table column is 2000 characters.
SUBSTR() works fine when the source data is up to 2000 characters, but when it is more than 2000 characters the record is rejected as a bad record.
Here is the syntax I am using:
ANSWER CHAR(2000) "SUBSTR(:ANSWER, 0, 2000)",
Please suggest.
Thanks,
odie_63 wrote:
Hi,
Try specifying a larger size limit for both columns:
ENQUIRY CHAR(4000) "SUBSTR(:ENQUIRY, 0, 250)",
ANSWER CHAR(4000) "SUBSTR(:ANSWER, 0, 2000)"
Will SUBSTR work with a starting offset of 0? Doesn't SUBSTR use 1 as the first character position? Or is it different with SQL*Loader?
OK, I checked: it works just as if the offset were given as 1. Perfect.
SQL> select substr('ABCDEF',0,3) from dual;
SUB
---
ABC
Thanks odie_63. :)
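For anyone hitting this later: the fix works because SQL*Loader's default datatype for a field is CHAR with a maximum length of 255 bytes. A source field longer than that is rejected ("Field in data file exceeds maximum length") before SUBSTR is ever applied. Declaring an explicit CHAR(n) large enough for the longest source value (4000 here is just a generous guess, not a magic number) enlarges the buffer, and SUBSTR then truncates to the column size:

```
ENQUIRY CHAR(4000) "SUBSTR(:ENQUIRY, 1, 250)",
ANSWER  CHAR(4000) "SUBSTR(:ANSWER, 1, 2000)"
```

Note the start position of 1; as checked above, 0 happens to behave the same, but 1 is the documented first position.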
Edited by: zahid79 on Jul 23, 2010 5:20 PM
Edited by: zahid79 on Jul 23, 2010 5:22 PM
Similar Messages
-
Invalid length parameter passed to the LEFT or SUBSTRING function, error on INSERT
I have a stored procedure that does a BULK INSERT of a CSV file into myCSVTable, then INSERTs records from
myCSVTable into myTable. The INSERT statement is giving me this error:
Invalid length parameter passed to the LEFT or SUBSTRING function.
CSV File:
3,020000007,OR,051,97205,02020005,41075,19470721,2,0,,0,0,0,0,0,1,0,0,,,,,,,,,0,2,,334418,334418,334418,13,,0,,0,0,0,0,0,1,0,0,1,20100217,,,,,20100304,20121030,1,1,,,0,0,0,0,0,,,,,0,0,0,0,9,0,0,0,1,1,0,0,0,3,00,20100304,20100426,20100922,20101011,20100304,20100922,20101011,,20101011,20100819,,20100326,20100326,0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0,0,,,1,1,,9,9,1,1,1,1,0,0,111150,111150,111150,999999,5586.25,4714.01,5884.28,7055.89,11097.89,10839.39,0.00,2,,,,,20120617,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,The
Workforce Investment Board for the City of xxxxxxxx and xxxxxxxxx and W,020005,,mstr: 1000000000000000017 js: 372651 epi: 1,Jane Doe
4,020000007,OR,051,97205,02020005,41075,19470721,2,0,,0,0,0,0,0,1,0,0,,,,,,,,,0,2,,334418,334418,334418,13,,0,,0,0,0,0,0,1,0,0,1,20100217,,,,,20100304,20121030,1,1,,,0,0,0,0,0,,,,,0,0,0,0,9,0,0,0,1,1,0,0,0,3,00,20100304,20100426,20100922,20101011,20100304,20100922,20101011,,20101011,20100819,,20100326,20100326,0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0,0,,,1,1,,9,9,1,1,1,1,0,0,111150,111150,111150,999999,5586.25,4714.01,5884.28,7055.89,11097.89,10839.39,0.00,2,,,,,20120617,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,The
Workforce Investment Board for the City of xxxxxxxx and xxxxxxxxx and W,020005,,mstr: 1000000000000000017 id: 020000007 epi: 1,Jane Doe
There is no LEFT or SUBSTRING function used in the procedure and there are no triggers on either table.
The procedure had been working fine until today, when I altered one of the CSV fields. The code which triggers the error is this:
INSERT [myTable]
SELECT * FROM [myCSVTable]
The weird thing is, the 1st record containing the changed field (mstr: 1000000000000000017 js: 372651 epi: 1) triggers the error, but the 2nd record containing
the old field value (mstr: 1000000000000000017 id: 020000007 epi: 1), does not.
There are only blank spaces in the 2 strings, no non-printable characters or anything like that. I'm confused.
A view of myTable was causing the problem.
Hi LoriCazares,
Do you mean that you have solved this issue? If so, please close this thread.
Regards,
Elvis Long
TechNet Community Support -
Error in scheduling a mapping with sqlloader ctl file
Hi everyone,
I have been trying to schedule a single mapping which generates a SQL*Loader CTL file, but I get the error:
ORA-20001: Begin. initialize complete. workspace set. l_job_audit_execution_id= 20545. ORA-20001: Please check execution object is deployed correctly. ORA-01403: no data found ORA-06512: at "USER7.PMAP_TLOG_JOB", line 180 ORA-20001: Please check execution object is deployed correctly. ORA-01403: no data found
But when I attach this mapping to a process flow, it works fine; there is no error.
So my question: in OWB, must a mapping that generates a SQL*Loader CTL file be attached to a process flow in order to be scheduled, or can such a mapping be scheduled on its own? If the latter, what is the procedure?
can anyone please help?
Thanks & Regards
Subhasree
Hi Nawneet,
Any suggestions?
Can anybody else help me with this error?
Regards
Subhasree -
Way to generate sql*loader ctl file from a table?
I'm an Oracle newbie (Oracle 8i, HP-UX). Is there any way to take an existing table
description and generate a SQL*Loader control file from it? If anyone has already written a procedure, or knows where one can be found, I'd really appreciate it. We're doing a mass conversion to Oracle and would like an easy way to rewrite all our loads.
Thanks! Eileen from Polaroid.
Hi,
I have a shell program which gets the column names from the system catalog, creates a temporary control file, and calls the sqlldr executable to load the data automatically.
You can customise this file, according to your needs.
Shell Program
if [ $# -ne 2 ]
then
echo " Usage : $0 table_name flat file name"
exit
fi
#assigning the positional parameters to unix variables
table_name=$1
flat_file=$2
#creating the control file
echo "LOAD DATA" > $table_name.ctl
echo "INFILE '$flat_file'" >> $table_name.ctl
echo "into table $table_name " >> $table_name.ctl
echo "fields terminated by '\t'" >> $table_name.ctl
#calling functions for making column name part
#describing the table and spooling into file
sqlplus -s $CUST_ORA_USER << sql_block
spool $table_name.lst
desc $table_name
spool off
sql_block
# creating a suitable file and adding the fields to the control file
# cutting the heading lines
tail -n +3 $table_name.lst > temp
rm -f $table_name.lst
k=`wc -l < temp`
k1=`expr $k - 1`
#cutting the last line
head -$k1 temp > tempx
record_count=`wc -l < tempx`
counter=1
echo "(" > wxyz.ctl
# reading file line by line
cat tempx | while read -r in_line
do
#cutting the first field
field_name=`echo $in_line | cut -f1 -d' '`
#counting the number of words in the line
word_cnt=`echo $in_line | wc -w`
#2 words means "NAME TYPE"; 4 words means "NAME NOT NULL TYPE"
if [ $word_cnt = 2 ]
then
data_type=`echo $in_line | cut -f2 -d' ' | cut -f1 -d'('`
if [ "$data_type" = "DATE" ]
then
data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
elif [ "$data_type" = "CHAR" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
elif [ "$data_type" = "VARCHAR2" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
else
data_fmt="NVL(:$field_name,0) "
fi
else
data_type=`echo $in_line | cut -f4 -d' ' | cut -f1 -d'('`
if [ "$data_type" = "DATE" ]
then
data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
elif [ "$data_type" = "CHAR" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
elif [ "$data_type" = "VARCHAR2" ]
then
data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
else
data_fmt="NVL(:$field_name,0) "
fi
fi
#if last line put );
#else ,
if test $record_count -eq $counter
then
echo $field_name \"$data_fmt\"");" >> wxyz.ctl
else
echo $field_name \"$data_fmt\""," >> wxyz.ctl
fi
#incrementing the counter for each record
counter=`expr $counter + 1`
done
#removing the temp files
rm -f temp tempx
cat wxyz.ctl >> $table_name.ctl
rm -f wxyz.ctl
#calling SQL*Loader
sqlldr $CUST_ORA_USER CONTROL=$table_name.ctl ERRORS=99999999
#removing the control file
rm -f $table_name.ctl -
Need help in writing the control file for SQLLOADER
Is it possible to make SQL*Loader error out when a row in the data file has more fields than the control file specifies?
I.e., my data file is something like:
aaa,bbb,cc
dd,eee
And my CTL file has just 2 columns in it. Is it possible to write a control file which will cause SQL*Loader to error out?
Thanks...
Nisha,
Again, I posted a test example in your other post, but here is how you can do it:
CREATE TABLE mytest111 (
col1 NUMBER,
col2 NUMBER,
col3 NUMBER
);
LOAD DATA
TRUNCATE INTO TABLE MYTEST111
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
col1 integer external,
col2 integer external
)
#mytest.dat
1,2,3
1,2
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Apr 10 11:40:39 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: mytest.ctl
Data File: mytest.dat
Bad File: mytest.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table USIUSER.MYTEST111, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
COL1 FIRST * , O(") CHARACTER
COL2 NEXT * , O(") CHARACTER
Table MYTEST111:
2 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 33024 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 2
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Fri Apr 10 11:40:39 2009
Run ended on Fri Apr 10 11:40:40 2009
Elapsed time was: 00:00:00.99
CPU time was: 00:00:00.06
Regards -
SQL LOADER / INFILE filename as variable in .ctl file
I stumbled over several threads in the OTN forums regarding this problem, but it was never finally solved, nor did I find a FAQ that answered my question. Soooo:
We get several data files from several sources, process them via SQL*Loader, and store 'em in the DB.
This is done via a cron job and a Perl script, for all data files in a specific directory.
We need the information of which file on which date generated the data INSIDE the DB as well.
So I want to store the filename || SYSDATE combination too.
I know I could parse the .ctl file and replace a key string with the actual filename, and so have it in the input.
But this seems a bit dirty to me. Isn't there some keyword or variable for the infile filename within SQL*Loader that I can access in the .ctl file? Something like:
INTO TABLE processed_files
FIELDS TERMINATED BY ';'
WHEN LOWER(LTRIM(RTRIM(hdr_ftr))) = 'ftr' -- FOOTER??
(hdr_ftr VARCHAR2(100),
source INFILE||' on '||TO_CHAR(SYSDATE, 'MM/DD/YYYY'),
realm VARCHAR2(100),
version VARCHAR2(20)
)
I would be grateful if you'd share your wisdom with me. ;-))
Oliver
I passed this question on to 'Ask Tom' and got the advice to put the .ctl's content as a string variable into a shell script.
This shell script (which had to be written anyway to loop over the data files and subsequently call sqlldr) should then replace the INFILE parameter and the CONSTANTs for the filenames, and generate a 'temporary' .ctl before calling sqlldr!
That's it, no better and safer way!
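As a sketch of that advice (all names here, template.ctl, the @INFILE@/@FILETAG@ placeholders, and the directory, are invented for illustration, not from the thread):

```shell
#!/bin/sh
# Substitute the data file name (and a "filename on date" tag) into a
# control-file template, then run sqlldr on the result.
gen_ctl() {   # $1 = data file, $2 = template ctl, $3 = output ctl
  tag="`basename "$1"` on `date '+%m/%d/%Y'`"
  sed -e "s|@INFILE@|$1|g" -e "s|@FILETAG@|$tag|g" "$2" > "$3"
}

# loop over the incoming data files and load each one
for datafile in /data/incoming/*.dat
do
  [ -f "$datafile" ] || continue        # skip if the glob matched nothing
  gen_ctl "$datafile" template.ctl /tmp/current.ctl
  sqlldr userid=scott/tiger control=/tmp/current.ctl
done
```

The template would then contain lines such as INFILE '@INFILE@' and source CONSTANT '@FILETAG@'.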
How to use substring in ITS(HTML file)
Hi,
I am modifying the d_searchhelp.html file in the AGate. I want to use a substring function in this file, so I used the code below:
newclassname=~searchhelpcontrolname;
oldclassname=newclassname.substring(0,8);
This is the error I get when I use this HTML file in the search help:
@ ...\templates\system\dm\msiexplorer\d_searchhelp.html (263,42): error : syntax error : '('
@ ...\templates\system\dm\msiexplorer\d_searchhelp.html (263,44): error : syntax error : ','
@ ...\templates\system\dm\msiexplorer\d_searchhelp.html (263,46): error : syntax error : ')'
Please let me know how to use substring.
Thanks & Regards,
H.K.Hayath Basha
Hello H.K.Hayath Basha,
please see the HTML Business documentation on <http://help.sap.com/saphelp_nw04/helpdata/en/5f/1fb5174aee11d189740000e8322d00/frameset.htm>:
string strSub (in string string, in int position, in int length)
With regards,
TJ -
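So the attempt above would become something like this (a sketch based only on the documented signature above, and assuming the position argument is 1-based):

```
newclassname = ~searchhelpcontrolname;
oldclassname = strSub(newclassname, 1, 8);
```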
How can we restrict records in the CTL file on the basis of another table?
Hello all,
How can we restrict records in the CTL file on the basis of another table?
E.g.:
I have the following control file to load records into the table through SQL*Loader:
LOAD DATA
INTO TABLE THIST APPEND
FIELDS TERMINATED BY "|" TRAILING NULLCOLS
LNUM POSITION(1) Char "substr(:LOAN_NUM, 4, 13)",
TSRNUM Char "rtrim(:TRAN_SR_NUM)" ,
TPROCDT Char "to_char(to_date(rtrim(:TRAN_PROC_DT), 'MMDDYYYY'), 'YYYYMMDD')"
I have another table called TFILE which also has LNUM. I want to import from the input text file, using the control file and SQL*Loader, only those records whose LNUM exists in TFILE.
So how can I restrict this in the control file?
Thanks
Kamlesh Gujarathi
[email protected]
Hello Satyaki De.
Thank you very much for your suggestion, but my private information is entirely separate from this question, and I have already changed every piece of identifying information in it.
Thanks
Kamlesh Gujarathi
[email protected] -
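For the record, the thread never answers the original question. A control file's WHEN clause can only compare a field to a literal or to another field of the same record; it cannot consult another table during the load. The usual workaround (the staging table name below is invented for illustration) is to load everything and filter afterwards in SQL:

```sql
-- load into a staging table with the normal CTL, then:
INSERT INTO THIST
SELECT s.*
  FROM THIST_STAGE s
 WHERE EXISTS (SELECT 1 FROM TFILE t WHERE t.LNUM = s.LNUM);
```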
I have a mapping with file as source. I want to generate a CTL file that contains actual physical directory of my location and not the location name. I have specified my own path where the control file for this mapping should get generated.
(1) When I generate this mapping, the <map.ctl> file has the location_name of the file module, not the physical path, in the ctl script.
(2) When I deploy this mapping, the ctl file is not generated in the specified directory.
(3) Only when I execute this mapping through OWB is a CTL file containing the physical location of the data file generated in the OS directory.
My questions are :
(1) How do I generate a CTL file containing physical location - without executing the mapping thru OWB. I want to do that so that I can run from sqlldr directly.
(2) What is the function of Deploy in case of file to table mappings.
(3) What is the OWB recommended way of deploying and executing such mappings involving sqlldr in a prod environment?
Vibha,
Sorry for the late response.
Sorry for the late response.
(1) How do I generate a CTL file containing physical location - without executing the mapping thru OWB. I want to do that so that I can run from sqlldr directly.
There should be no need to run sqlldr directly. Warehouse Builder manages the execution, in which case you get all the error handling and logging included in the Runtime Audit Browser.
(2) What is the function of Deploy in case of file to table mappings.
Deploy stores the control file in the runtime repository so that it can be picked up at execution time and temporarily stored on the file system; OWB will then execute SQL*Loader natively.
(3) What is the OWB recommended way of deploying and executing such mappings involving sqlldr in prod environment.
Deploy to the production environment and execute it through OEM or command-line scheduling.
Mark. -
Substring function giving out empty fields
I am using OAS 10.1.3. I have a problem with the substring function in the BPEL functions. In 10.1.2, all the XPaths with substring functions returned data.
But in 10.1.3, all those nodes come out empty. Is there any particular reason for this?
Is this a problem with any particular jar file? I am using Xalan 2.7.0 along with this, for custom functions.
Also, in which class inside the jar file is the substring function defined?
Try wrapping the substring function in an @member function. You need the return of a member name, not a string. Look at the example of @member in the technical reference for more details.
-
How to create a typed CTL files automatically?
I make extensive use of VI references and have created many strict type defs
of VI types. The problem is that every time I change a VI which
has a corresponding typedef CTL, I need to update that CTL file manually. I would like to get the CTL file updated automatically.
I already found a way to automatically create a control (with a
function from the "_NewProbeWizard.llb" library) in a VI, but this
doesn't work for CTL files (see attachment).
I would be thankful for help.
Attachments:
Creating a Control on a VI.jpg 20 KB
You may have to go to the LAVA forum to get a good answer to this Q.
Without getting into scripting you may be able to do this under the right conditions.
1) First save the type specifier control as a type def.
2) After ensuring the VI that uses the typedef (created in step #1) is not running, use the code you already have to create the type specifier control on the FP of a .ctl file.
3) save the new ctl as the name used in step #1
4) Open the VI that uses the type def and it should re-link to the new type.
I have used techniques like this to dynamically create enums for functional globals. It may work for you.
Ben
Ben Rayner
I am currently active on.. MainStream Preppers
Rayner's Ridge is under construction -
[sqlldr] upload number datatype from varchar2 datatype in ctl file
Dear friends,
here are two lines from my DAT file.
In the last field the data is |3180.8| or |605.|:
00000000|0|0601|2000082|6|20131023414|3378|1|MATUTE|40|C|MATUTE|CUADRA 17 DE ISABEL LA CATOLICA|11|03|150115|3180.8|
00000000|0|0601|2000082|1|20131023414|3378|1|MATUTE|40|C|MATUTE|CUADRA 17 DE ISABEL LA CATOLICA|11|03|150115|605.|
In the destination table the field is NUMBER(12,2), and I want to load it as this datatype via my control file.
I have read about portable and non-portable datatypes, but I don't understand them. Is that topic related to my problem?
Either way, what should the instruction be in order to load this datatype?
Thanks a lot.
cesar
P.S. Please excuse my English.
A lot of thanks, my friend...
I saw in your example that it is a function... but please, would you describe my specific error in my files?
An example of my dat file:
00000000|0|0601|2000082|6|20131023414|3378|1|MATUTE|40|C|MATUTE|CUADRA 17 DE ISABEL LA CATOLICA|11|03|150115|3180.8|
00000000|0|0601|2000082|1|20131023414|3378|1|MATUTE|40|C|MATUTE|CUADRA 17 DE ISABEL LA CATOLICA|11|03|150115|605.|
My CTL file is:
Load DATA
APPEND
INTO TABLE PLANELEC.PEMVX_DATOSCOMPDOM
fields terminated by "|" TRAILING NULLCOLS
(
V_NUMPAQTE,
V_NUMLOTE,
V_NUMFORM,
N_NUMORDEN,
V_CODDOCIDE,
V_NUMDOCIDE,
N_NUMCORAPO,
N_ESTASGHAB,
V_NOMBREVIA,
V_NUMEROVIA,
V_INTERIOR,
V_NOMZONA,
V_REFERNCIA,
V_CODTVIA,
V_CODZONA,
V_CODUBIGEO,
TEST_INTEGER CONSTANT 0,
TEST_NUMBER
)
(TEST_NUMBER is the field where I put the last field of the dat file.)
This is my .sh file:
#!/bin/bash
export ORACLE_SID=plelpp
export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
export PATH=$PATH:$ORACLE_HOME/bin
sqlldr USERID=system, CONTROL=T818COMP.ctl, LOG=T818COMP.log, data=10pemvx_datoscompdom_20080304165740.dat, bad=T818COMP.bad, discard=T818COMP.dsc
And finally, this is my *.log file that show errors when i execute my sh file:
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Mar 6 07:50:46 2008
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: T818COMP.ctl
Data File: 10pemvx_datoscompdom_20080304165740.dat
Bad File: T818COMP.bad
Discard File: T818COMP.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table PLANELEC.PEMVX_DATOSCOMPDOM, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
V_NUMPAQTE FIRST * | CHARACTER
V_NUMLOTE NEXT * | CHARACTER
V_NUMFORM NEXT * | CHARACTER
N_NUMORDEN NEXT * | CHARACTER
V_CODDOCIDE NEXT * | CHARACTER
V_NUMDOCIDE NEXT * | CHARACTER
N_NUMCORAPO NEXT * | CHARACTER
N_ESTASGHAB NEXT * | CHARACTER
V_NOMBREVIA NEXT * | CHARACTER
V_NUMEROVIA NEXT * | CHARACTER
V_INTERIOR NEXT * | CHARACTER
V_NOMZONA NEXT * | CHARACTER
V_REFERNCIA NEXT * | CHARACTER
V_CODTVIA NEXT * | CHARACTER
V_CODZONA NEXT * | CHARACTER
V_CODUBIGEO NEXT * | CHARACTER
TEST_INTEGER CONSTANT
Value is '0'
TEST_NUMBER NEXT * | CHARACTER
value used for ROWS parameter changed from 64 to 58
Record 1: Rejected - Error on table PLANELEC.PEMVX_DATOSCOMPDOM, column TEST_NUMBER.
ORA-01722: numero no valido
Record 2: Rejected - Error on table PLANELEC.PEMVX_DATOSCOMPDOM, column TEST_NUMBER.
ORA-01722: numero no valido
Record 3: Rejected - Error on table PLANELEC.PEMVX_DATOSCOMPDOM, column TEST_NUMBER.
ORA-01722: numero no valido
Record 4: Rejected - Error on table PLANELEC.PEMVX_DATOSCOMPDOM, column TEST_NUMBER.
ORA-01722: numero no valido
Record 5: Rejected - Error on table PLANELEC.PEMVX_DATOSCOMPDOM, column TEST_NUMBER.
ORA-01722: numero no valido
Table PLANELEC.PEMVX_DATOSCOMPDOM:
0 Rows successfully loaded.
5 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 254504 bytes(58 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 5
Total logical records rejected: 5
Total logical records discarded: 0
Run began on Thu Mar 06 07:50:46 2008
Run ended on Thu Mar 06 07:50:49 2008
Elapsed time was: 00:00:02.95
CPU time was: 00:00:00.02
Please,
would you help me? It is very important.
A lot of thanks...
cesar -
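A guess at the root cause, since the thread stops here: the Spanish ORA-01722 messages ("numero no valido" means "invalid number") together with values like 3180.8 and 605. point at a decimal-separator mismatch. If the session's NLS settings expect a comma as the decimal character, '3180.8' is not a valid number, and the trailing dot in '605.' can also fail conversion. One hedged fix is to convert explicitly in the control file (the format mask and the rtrim of the trailing dot are assumptions, sized for NUMBER(12,2)):

```
TEST_NUMBER "to_number(rtrim(:TEST_NUMBER, '.'), '9999999999.99')"
```

Alternatively, set NLS_NUMERIC_CHARACTERS (e.g. via NLS_LANG in the shell script) so that the period is the decimal character before running sqlldr.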
Is there an option in SQL Developer to have this (nonlogged) option set in the generated sqlldr .ctl files? Thx
Hi,
I can confirm there is no option to use DIRECT mode,
for the simple reason that you have restrictions in DIRECT mode:
- Cannot load clustered tables.
- There can be no active transactions in the loaded tables.
- There can be no SQL functions used in the control file.
- If the target table is indexed, then no SELECT statements may be issued against the table during the load.
So if you have a look at the .CTL files, most of the time functions are used to translate datatypes correctly (e.g. DECODE, HEXTORAW, ...).
regards,
Mireille -
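If a given .ctl file happens to satisfy those restrictions (in particular, no SQL functions), direct path can still be requested by hand when invoking sqlldr; there is just no SQL Developer option for it. A trivial sketch (credentials and file names are placeholders):

```shell
# Build the sqlldr invocation requesting direct path loading.
# DIRECT=TRUE bypasses SQL processing, which is exactly why control
# files that rely on SQL functions (DECODE, HEXTORAW, ...) cannot use it.
build_sqlldr_cmd() {   # $1 = userid, $2 = control file
  echo "sqlldr USERID=$1 CONTROL=$2 DIRECT=TRUE"
}

build_sqlldr_cmd scott/tiger mytable.ctl
# -> sqlldr USERID=scott/tiger CONTROL=mytable.ctl DIRECT=TRUE
```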
Informix- Oracle migration: OMWB misapplies SUBSTR function !!!
A new problem occurred:
OMWB translates Informix's substrings wrongly.
Examples of translations:
Examples of translations:
INFORMIX ORACLE
wvs[1] SUBSTR(wvs,1)
wvs[2,10] SUBSTR(wvs,2,10)
wvs[10] SUBSTR(wvs,10)
It seems that OMWB simply copies the index numbers in the square brackets as parameters of Oracle's SUBSTR function. But SUBSTR works differently from Informix's []-style string indexes!
wvs[1] means the 1st character of wvs; wvs[3,5] means the substring from the 3rd to the 5th character of wvs (length 3); but SUBSTR(wvs,1) means the substring from the 1st character to the end of the string, and SUBSTR(wvs,3,5) means the 5-character substring from the 3rd to the 7th character, etc.
Vladimir Kubanka alias Bare Foot
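The correct mechanical translation is SUBSTR(col, a, b - a + 1) for col[a,b] and SUBSTR(col, a, 1) for col[a]. A small helper showing the arithmetic (the column name is taken from the examples above):

```shell
# Translate Informix bracket substrings to Oracle SUBSTR:
#   col[a]   -> SUBSTR(col,a,1)        (a single character)
#   col[a,b] -> SUBSTR(col,a,b-a+1)    (length, not end position)
ifx_substr() {   # $1 = column, $2 = start, $3 = optional end
  if [ -z "$3" ]; then
    echo "SUBSTR($1,$2,1)"
  else
    echo "SUBSTR($1,$2,$(($3 - $2 + 1)))"
  fi
}

ifx_substr wvs 3 5   # -> SUBSTR(wvs,3,3)
ifx_substr wvs 1     # -> SUBSTR(wvs,1,1)
```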
Well, I obtained <process_id> from V$SESSION (...WHERE audsid=to_number(sessid), ..., sessid := sys_context('USERENV','SESSIONID')).
The problem: this process_id is not the same as that of the process creating the standard trace file "ora_<process_id>_orcl.trc".
When I ran the procedure from SQL*Plus, the process_id was, for example, 3258, and the trace file was named "ora_3259_orcl.trc". It seems the trace file was created by some child process called from sqlplus (which had UNIX process_id 3258). But that means the child's process_id could differ by more than +1 (when more users are working at the same time). And when I ran the procedure from a Windows client on my PC, the <process_id> (obtained from V$SESSION as written above) had some nonsense value (e.g. 54823716), and I found the trace file under the name "ora_3271_orcl.trc" - it seems the process creating the trace file had a normal UNIX-like process_id of 3271. From these issues, two questions:
1. Is there some way to obtain in procedure process_id, which creates that standard tracefile?
OR:
2. Is there some way to obtain (in procedure) name of that tracefile ? (written into directory defined in USER_DUMP_DEST parameter)
Shp2sdo generates weird ctl file
The shp2sdo tool from Oracle generates a CTL file with these values for the SDO_COORDINATES:
#1.#INF00|1.#INF00|1.#INF00|1.#INF00|1.#INF00|1.#INF00|1.#INF00|1.#INF00|
I think something went wrong; does somebody have a clue?
This is the command line:
D:\data\shp2sdo_nt>shp2sdo.exe wijk_2008_gen_reprojected buurtgrenzen -g geom -d -s 28992 -t 0.05 -v
The SQLLoader gave me 50 of these errors:
value used for ROWS parameter changed from 64 to 61
Record 1: Rejected - Error on table BUURTGRENZEN, column GEOM.SDO_ORDINATES.X.
error converting data
ORA-01722: invalid number
Hopefully someone knows more than I do... ;)
Kodde,
do you get ORA-01722: invalid number?
In that case it might be an NLS issue. Check out:
Re: Problem loading data generated with SHP2SDO
Luc