Remove space in external table
Dear Gurus,
I generated an external table in Toad, but found that after it was generated, the data contains spaces.
My Oracle database version is 11g
OS:Linux redhat 5
I created a directory on the server, then created file.txt and the external table in the database. I have checked file.txt and there are no spaces in it.
Where did the spaces come from?
Please help
Thank you
JOE
This seems to be a TOAD issue - pl ask in the TOAD forums - http://toadworld.com/Discussions/tabid/824/Default.aspx
HTH
Srini
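If the spaces turn out to be real characters stored in the columns (for example padding carried over from fixed-width source data), a workaround is to trim them either at query time or during the load. This is only a sketch against a hypothetical table MY_EXT with column COL1; the thread does not confirm the cause:

```sql
-- Option 1: trim at query time (works against any existing external table)
SELECT TRIM(col1) AS col1 FROM my_ext;

-- Option 2: trim during the load, inside the ACCESS PARAMETERS clause
-- FIELDS TERMINATED BY ',' LRTRIM   -- strips leading and trailing blanks
```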
Similar Messages
-
How to remove space below the table in smartforms
maintain a loop counter for the item, say COUNT, and in the text editor write:
if count > 6.
new-page.
endif. -
PO Stylesheet How to remove space between two tables
Hi
I am building an RTF template for the PO output for communication report in R12. There is a terms and conditions part and a total amount part which come at the bottom of every page of the report. I have created that as a template and I call it in the footer.
The problem is there is always a gap between the lines table and the terms & conditions and total table. Any ideas on how to resolve this issue will be very helpful.
Please let me know if any other input is required from my side.
Thanks in Advance
Are you trying to have the table where the lines are always the same size, no matter how many lines are present?
If so, you have some choices:
1) If your lines are always the same height, you might get some ideas on what to do from this article:
http://oracle.anilpassi.com/xml-publisher-developing-reports-printed-on-pre-printed-stationary.html
2) If your lines vary in height, then you should look into a PDF template. This will give you a pre-printed stationery look in which the drawing of boxes and lines is not affected by the data content.
3) Other less attractive options are a) making the report in XSL-FO, and b) making a bitmapped RDF report
There is currently no solution that I am aware of for a pre-printed stationery look using an RTF template in which the data lines are of variable height. -
Remove Enter Sign in External Table
Dear Gurus,
I have an external table which shows an 'enter sign'.
I thought it was a blank space, but it is not.
How do I remove it, since the TRIM function can't remove it?
Records are delimited by newline; is there any syntax to prevent the enter sign from appearing?
Thank you
Regards
JOE
This seems to be a TOAD issue - pl ask in the TOAD forums - http://toadworld.com/Discussions/tabid/824/Default.aspx
HTH
Srini -
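A likely explanation, though the thread never confirms it: the 'enter sign' is a carriage return (CHR(13)) left at the end of each record because the file was created on Windows, and TRIM by default only removes blanks. A hedged sketch, with MY_EXT and COL1 as placeholder names:

```sql
-- strip the CR at query time:
SELECT REPLACE(col1, CHR(13)) FROM my_ext;

-- or tell the access driver that records end in CR+LF:
-- RECORDS DELIMITED BY '\r\n'
```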
Showing Text file as external table without trimming spaces
Hi all:
I am trying to display a text file from the db server with its original spacing.
Well it's not entirely related to APEX.
This is how external table is created:
create table u_x
(rowline varchar2(255))
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY ext_tab   -- this is a dir I have created
ACCESS PARAMETERS
(
records delimited by newline
fields notrim
)
LOCATION ('aaa.txt')
)
REJECT LIMIT UNLIMITED;
on the first line of aaa.txt (it's a report), it shows something like:
Sat Dec 29                                                  Page 1
but when I do select * from u_x, the result shows:
Sat Dec 29 Page 1
all above is done in sql workshop /sql command in APEX 4.2.1 (db 10.2.0.5)
Is there any way to preserve the spaces between "29" and "Page"?
(the result compressed all spaces in-between to 1 space).
I think the ideal way is to invoke the client's Notepad to open the server's text file.
Is there any way to generate the link?
Thanks
John
Well, I have tried to download the file using the procedure below.
create or replace PROCEDURE download_bfile (
directory_in IN VARCHAR2,
file_in IN VARCHAR2
)
AS
lob_loc BFILE;
v_mime VARCHAR2 (48);
v_length NUMBER (16);
BEGIN
lob_loc := BFILENAME (UPPER (directory_in), file_in);
v_length := DBMS_LOB.getlength (lob_loc);
-- set up HTTP header
-- use an NVL around the mime type and
-- if it is null set it to application/octet
-- application/octet may launch a download window from windows
OWA_UTIL.mime_header (NVL (v_mime, 'application/octet'), FALSE);
-- set the size so the browser knows how much to download
HTP.p ('Content-length: ' || v_length);
-- the filename will be used by the browser if the user does a save as
HTP.p ('Content-Disposition: attachment; filename="'
|| SUBSTR (file_in, INSTR (file_in, '/') + 1)
|| '";');
-- close the headers
OWA_UTIL.http_header_close;
-- download the BLOB
WPG_DOCLOAD.download_file (lob_loc);
apex_application.stop_apex_engine;
END download_bfile;
I ran it inside SQL Commands in APEX, but it does not show a pop-up window to save the file;
it displays the whole text content of the file in the result window.
Any idea why the Save As window does not pop up?
thanks,
John. -
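On the original collapsed-spaces symptom: one likely explanation (an assumption, not confirmed in the thread) is that the spaces do survive in the table, and it is the HTML rendering in SQL Commands that collapses runs of whitespace to a single space. A hedged workaround is to escape the text and wrap it in a pre element when it will be rendered as HTML:

```sql
-- escape the text, then preserve its spacing with <pre> when rendered in a browser
select '<pre>' || htf.escape_sc(rowline) || '</pre>' as rowline
from   u_x;
```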
How to remove carriage returns from the field while loading an external table
I am facing an issue of not getting rid of carriage returns present in the fields of the source .csv file while loading the records into an external table.
I had tried using LRTRIM, but it does not help.
The error I am getting is:
KUP-04021: field formatting error for field POPULATION_DESCRIPTION
KUP-04037: terminator not found
I am pasting one record out of the .csv file which is causing this error as below:
"Business Card Accounts
",123,7 BizCard - Gamers,Control,"Business Card Accounts
",75270,75271
You can see the carriage return in the 1st field as well as the 5th field. Fields are separated by commas and enclosed by double quotes.
Could anybody help on this, please?
Can you copy the file to an external-table-only version, and then dos2unix it?
Alternatively, you can use the ACCESS PARAMETERS area in your external table definition to set that your RECORDS are DELIMITED BY carriage returns, instead of the default.
Check out the example here http://www.psoug.org/reference/externaltab.html -
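As a sketch of the dos2unix suggestion above, here is the same cleanup done with tr (which behaves the same way for this purpose and is more widely available), run against a throwaway sample file; the /tmp paths are illustrative:

```shell
# create a sample file with DOS line endings (CR+LF)
printf 'a,b\r\nc,d\r\n' > /tmp/input.csv

# strip every carriage return, equivalent to dos2unix
tr -d '\r' < /tmp/input.csv > /tmp/clean.csv

cat /tmp/clean.csv
```

Note this removes embedded carriage returns inside quoted fields as well as the ones at end-of-line, which is exactly what the KUP-04037 case above needs.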
How can an external table handle data with line feed between delimiters?
I have defined an external table as below. My data is pipe delimited and comes from a DOS system.
I already remove any carriage returns before putting the file into the DATA_DIR for reading. But
I have found that some of my VARCHAR fields have embedded line feeds.
Is it possible to have a definition that would remove any line feed characters between the delimiters?
Below I also threw together a sample data set there ID #2 has that extra character. Yes, I could
write an awk script to pre-process all my data files. But I am hoping there is a way for Oracle
to also do this.
I understand the LDRTRIM removes any leading and trailing spaces in the delimited field. Is there a
REPLACE or TRANSLATE option? I did a bit of searching, but I must be asking the wrong things.
Thanks for any help
Eric
CREATE TABLE table_ext
(
id NUMBER,
desc1 VARCHAR2(64 CHAR),
desc2 VARCHAR2(255 CHAR),
add_date DATE
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY data_dir
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
CHARACTERSET WE8ISO8859P1
BADFILE log_dir:'table_ext.bad'
DISCARDFILE log_dir:'table_ext.dis'
LOGFILE log_dir:'table_ext.log'
FIELDS TERMINATED BY '|' LDRTRIM
MISSING FIELD VALUES ARE NULL
(
id INTEGER EXTERNAL(38),
desc1 CHAR(64),
desc2 CHAR(255),
add_date CHAR DATE_FORMAT DATE MASK "yyyy-mm-dd hh24:mi"
)
)
LOCATION ('data.txt')
)
PARALLEL
REJECT LIMIT UNLIMITED;
1|short desc|long desc|2001-01-01 00:00
2|short desc| long
desc |1999-03-03 23:23
3|short desc| long desc | 2011-02-02 02:02
Thanks for looking. But that relates to the record delimiter; in my case the field delimiter is the pipe character '|', and in my various data sets this is consistent. I expect each record to be one per line, but between two delimiters some data has a line feed. So I'm looking for a method that will "clean up" the field data as it gets imported.
I was hoping there was an option that would ignore any embedded line feeds (\n) characters. I.e., those not at the end of the line.
Eric -
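Since the access driver has no REPLACE or TRANSLATE hook for field data, one option is the pre-processing Eric mentions, done as a small script rather than awk. This is only a sketch of the idea, not Oracle functionality: any line with fewer than the expected number of field separators is treated as the continuation of a record that was broken by an embedded line feed, and is glued back onto the previous line.

```python
def join_broken_records(lines, n_fields, sep="|"):
    """Merge lines that were split by an embedded line feed.

    A complete record must contain n_fields - 1 separators; any line
    arriving while the previous record is still short is treated as a
    continuation of that record.
    """
    records = []
    for line in lines:
        line = line.rstrip("\n")
        if records and records[-1].count(sep) < n_fields - 1:
            # previous record is incomplete: this line continues it
            records[-1] += " " + line
        else:
            records.append(line)
    return records

# the sample data set from the post, with ID #2 broken across two lines
raw = [
    "1|short desc|long desc|2001-01-01 00:00\n",
    "2|short desc| long\n",          # record broken by an embedded LF
    "desc |1999-03-03 23:23\n",
    "3|short desc| long desc | 2011-02-02 02:02\n",
]
print(join_broken_records(raw, n_fields=4))
```

The embedded line feed is replaced by a single space here; dropping it entirely is a one-character change to the join.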
IGNORE COMMENTS IN EXTERNAL TABLE FILE
I currently have a partitioned table, split by month, going back some 7 years.
The business has now agreed to archive off some of the data but to have it
available if required. I therefore intend to select, month by month, the data
to a flat file and make use of external tables to provide the data for the
business if required, then to remove the month's partition, thus releasing
space back to the database. Each month's data is only about 100 bytes long, but
there could be about 15 million rows.
I can successfully create the external table and read the data.
However, I would like to place a header in the spooled file, but the only method
I have found to ignore the header is to use the "SKIP" parameter. I would
rather not use this, as it hard-codes the number of lines I reserve for a
header.
Is there another method to add a comment to an external table's file and have
selects on the file ignore the comments?
Could I use "LOAD WHEN ( column1 != '--' )", as each comment line begins with a
double dash?
Also, is there a limit to the size of an external table's file?
When I added the lines to attempt to exclude the header lines, i.e.:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
LOAD WHEN ( cvt_seq_num LIKE '--%' )
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( cvt_seq_num INTEGER EXTERNAL(12)
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "equal,notequal"
KUP-01008: the bad identifier was: LIKE
KUP-01007: at line 6 column 33
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
When I moved the line I got:
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
SKIP 0
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
LOAD WHEN ( cvt_seq_num LIKE '--%' )
I got:
select * from old_cvt_external
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "load": expecting one of: "exit, ("
KUP-01007: at line 11 column 11
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
So it appears that the "LOAD WHEN" does not work.
Can you please suggest anything else? -
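The two KUP-01005 errors above are consistent with the documented grammar: the LOAD WHEN condition supports only =, !=, AND/OR and NULL/BLANKS tests against a field name or a byte range, not LIKE, and it must appear before the FIELDS clause. A hedged, untested sketch of the byte-range form, which effectively gives a "starts with --" test:

```sql
RECORDS DELIMITED BY newline
NOBADFILE
NODISCARDFILE
NOLOGFILE
-- reject any record whose first two bytes are the comment marker;
-- note LOAD WHEN precedes the FIELDS clause
LOAD WHEN ((1:2) != '--')
FIELDS
TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
( cvt_seq_num INTEGER EXTERNAL(12) )
```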
Dear Experts,
I was trying to explore the "preprocessor" command in external tables, but I was not able to succeed.
I just followed the steps mentioned in the latest Oracle Magazine:
http://www.oracle.com/technetwork/issue-archive/2012/12-nov/o62asktom-1867739.html
But finally when I do the select from table df, I am getting the following error.
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-29400: data cartridge error
KUP-04095: preprocessor command /export/csrmgr/ripple_dms/run_df.sh encountered error "error during exec: errno is 2
PS: The output of v$version;
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
PL/SQL Release 11.2.0.2.0 - Production
"CORE 11.2.0.2.0 Production"
TNS for IBM/AIX RISC System/6000: Version 11.2.0.2.0 - Production
NLSRTL Version 11.2.0.2.0 - Production
Could anyone help me to resolve this issue.
I'm not able to test Tom's example at the minute.
However, that new feature was discussed in the March/April 2011 edition of Oracle Magazine too.
Oracle Magazine March/April 2011 article on new feature in 11gR2 called the external table preprocessor.
by Arup Nanda
CHANGE IN PROCESS
Now suppose that with your external table and its text file in place, the input file for the external table is compressed to reduce the volume of data transmitted across the network. Although compression helps the network bandwidth utilization, it creates a challenge for the ETL process. The file must be uncompressed before its contents can be accessed by the external table.
Rather than uncompress the file in a separate process, you can use the preprocessor feature of Oracle Database 11g Release 2 with external tables to uncompress the file inline. And you will not need to change the ETL process.
To use the preprocessor feature, first you need to create a preprocessor program. The external table expects input in a text format but not necessarily in a file. The external table does not need to read a file; rather, it expects to get the file contents “fed” to it. So the preprocessor program must stream the input data directly to the external table—and not
create another input file. The input to the pre-processor will be a compressed file, and the output will be the uncompressed contents.
The following is the code for your new preprocessor program, named preprocess.bat:
@echo off
C:\oracle\product\11.2.0\dbhome_1\BIN\unzip.exe -qc %1
The first line, @echo off, suppresses the output of the command in a Windows environment. The remaining code calls the unzip.exe utility located in the Oracle home. The utility needs an input file, which is the first (and only) parameter passed to it, shown as %1. The options q and c tell the utility to uncompress quietly (q), without producing extraneous output such as "Inflating" or "%age inflated", and to write the uncompressed data to standard output (c) for the external table to read.
Next you need to create the directory object where this preprocessor program is located. Logging in as SYS, issue
create directory execdir as 'c:\tools';
And now grant EXECUTE permissions on the directory to the ETLADMIN user:
grant execute on directory execdir to etladmin;
Finally, create the new external table:
create table indata1
(
cust_id number,
cust_name varchar2(20),
credit_limit number(10)
)
organization external
(
type oracle_loader
default directory etl_dir
access parameters
(
records delimited by newline
preprocessor execdir:'preprocess.bat'
fields terminated by ","
)
location ('indata1.txt')
)
/
It calls the preprocess.bat executable in the directory specified by EXECDIR before the external table accesses the indata1.txt file in the location specified by the ETL_DIR directory. Remember, indata1.txt is now a compressed file. So, in effect, the external table reads not the actual specified input file but rather the output of preprocess.bat, which is the uncompressed data from the indata1.txt file.
If you select from the external table now, the output will be similar to that of the earlier select * from indata1; query. The preprocessor passed the uncompressed contents of the indata1.txt (compressed) file on to the external table. There was no need to uncompress the file first—saving significant time and the intermediate space required and making it unnecessary to change the ETL process.
This inline preprocessing unzip example uses a script, but that is not always necessary. An executable can be used instead. For example, in Linux you can use /bin/gunzip. However, the utility can’t accept any parameters. So if you pass parameters (as in this article’s example), you must use a script.
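On Linux the wrapper the article alludes to can be a one-line script, since gunzip needs its -c flag (a parameter) to stream to standard output. A hedged sketch with illustrative /tmp paths, plus a quick demo of what the external table would receive:

```shell
# create a wrapper script; the external table's PREPROCESSOR clause
# would point at this file (path and name are illustrative)
cat > /tmp/preprocess.sh <<'EOF'
#!/bin/sh
# stream the uncompressed contents of the input file to stdout
exec gunzip -c "$1"
EOF
chmod +x /tmp/preprocess.sh

# demo: compress a sample data file and run the wrapper on it
printf 'cust_id,cust_name\n1,Acme\n' | gzip > /tmp/indata1.txt.gz
/tmp/preprocess.sh /tmp/indata1.txt.gz
```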
Security Concerns
=================
The EXECUTE privilege on a directory is a new feature introduced in Oracle Database 11g Release 2. It enables the DBA to grant EXECUTE permissions only for certain directories and only to certain users. Without WRITE privileges, users will not be able to
update the executables inside a directory to insert malicious code, but users will be able to execute the “approved” code accessible in a single location. The DBA can put all the necessary preprocessor tools into a single directory and grant EXECUTE privileges there to the users who may need them. And, of course, the executables and data should be in different directories.
Preprocessing also requires some special precautions on the part of the DBA. Because the executables called by preprocessing programs will be executed under the privileges of the Oracle software owner and malicious executable code can cause a lot of damage, the DBA should be extremely careful in monitoring executables for potentially harmful code.
The directory containing the preprocessor executables needs to be accessible to the Oracle software owner for EXECUTE operations only, not for WRITE activity. Therefore, as an added precaution, the system administrator can remove WRITE access to that directory from all users, including the Oracle software owner. This significantly reduces the chance of damage by malicious code.
Other Uses
==========
Compression is not the only use for inline preprocessing, although it certainly is the most widely used. You can, for example, use this preprocessing technique to show the output of a program as an external table. Consider, for instance, the dir command in Windows for listing the contents of a directory. How would you like to get the output as a table so that you can apply predicates?
Getting and using this output is quite simple with the preprocessor functionality. Remember, the preprocessor does not actually need a file but, rather, requires the output of the preprocessor program. You can write a preprocessor program to send the
output of a dir command. The new preprocessor program, named preproc_dir.bat, has only the following two lines:
@echo off
dir
You will also need a file for the external table. The contents of the file are irrelevant, so you can use any file that the Oracle software owner can read in a directory to which that owner has read access. For this example, the file is dirfile.txt, and although the contents of the file are immaterial, the file must exist, because the external table will access it. Listing 1 shows how to create the table.
Because the dir command displays output in a prespecified manner, the external table easily parses it by reading the fields located in specific positions. For example, positions 1 through 10 display the date, 11 through 20 display the time, and so on. The dir command produces some heading and preliminary information that the external table has to ignore, so there is a skip 5 clause in Listing 1 that skips the first five lines of the output. The last few lines of the output show how many files and directories are present and how much free space remains. This output must be skipped as well, so the external table displays records only when the date column has a value.
Listing 1 also shows the result of a query against the external table. Because the MOD_DT column is of the date datatype, you can also apply a WHERE condition to select a specified set of records.
Code Listing 1:
===============
create table dir_tab
(
mod_dt date,
mod_time char(10),
file_type char(10),
file_size char(10),
file_name char(40)
)
organization external
(
type oracle_loader
default directory etl_dir
access parameters
(
records delimited by newline
preprocessor execdir:'preproc_dir.bat'
skip 5
load when (mod_dt != blanks)
fields
(
mod_dt position (01:10) DATE mask "mm/dd/yyyy",
mod_time position (11:20),
file_type position (21:29),
file_size position (30:38),
file_name position (39:80)
)
)
location ('dirfile.txt')
)
reject limit unlimited;
-- select from this table
SQL> select * from dir_tab;
MOD_DT MOD_TIME FILE_TYPE FILE_SIZE FILE_NAME
16-DEC-10 10:12 AM <DIR> .
16-DEC-10 10:12 AM <DIR> ..
22-MAY-10 09:57 PM <DIR> archive
22-MAY-10 10:27 PM 2,048 hc_alap112.dat
05-DEC-10 07:07 PM 36 indata1.txt
22-DEC-05 04:07 AM 31,744 oradba.exe
16-DEC-10 09:58 AM 1,123 oradim.log
28-SEP-10 12:41 PM 1,536 PWDALAP112.ora
16-DEC-10 09:58 AM 2,560 SPFILEALAP112.ORA
9 rows selected.
-- select a file not updated in last 1 year
SQL> select * from dir_tab where mod_dt < sysdate - 365;
MOD_DT MOD_TIME FILE_TYPE FILE_SIZE FILE_NAME
22-DEC-05 04:07 AM 31,744 oradba.exe -
[Forum FAQ] Removable media and external HDD
Scenario
There are two conceptions about which we would have confusion in some particular scenarios: removable media and external HDD. Here are two examples:
If you need to make a USB storage device perform AutoRun, the device must not be marked as a removable media device and the device must contain an Autorun.inf file and a startup application.
Answer File Search on removable read/write media.
Definition and Research
The definition of the removable media and external HDD is shown below:
The removable media device setting is a flag contained within the SCSI Inquiry Data response to the SCSI Inquiry command. Bit 7 of byte 1 (indexed from 0) is the Removable Media Bit (RMB). An RMB set to zero indicates that the device is not a removable
media device. An RMB of one indicates that the device is a removable media device. Drivers obtain this information by using the StorageDeviceProperty request.
External HDDs typically connect via USB; variants using USB interface generally have slower data transfer rates when compared to internally mounted hard drives connected through SATA.
That is, removable media has a device property attached to it. From the results on Linux and Windows, we can see the difference between removable media and a
generic external disk:
Here are the outputs of dmesg log on Linux OS:
sd 10:0:0:0: [sdb] Attached SCSI disk
(generic external disk)
sd 2:0:0:0: [sdb] Attached SCSI removable disk
(USB flash drive)
Here is the output on Windows OS:
We use following command to get the media type of physical drive:
Get-WmiObject -Class Win32_DiskDrive | Format-Table Name,Model, MediaType
Conclusion
From the device's property, we cannot say that an external device is always equal to removable media. Removable media is associated with the RMB flag, and the RMB is embedded in the device itself.
Now, back to our common issue: when you supply an answer file on a USB external HDD, Windows will not recognize your device as removable media, because StorageDeviceProperty gets an RMB value of 0, indicating this is not a removable media device. In this situation,
supplying the answer file on a USB flash drive (UFD) or floppy disk would fix this issue.
Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.
I've had it on my iBook before and then moved it to the HDD because of space shortage, and then just added as I've acquired more music. I don't really remember, but I think it worked before without any hiccups. Oh well, otherwise I can just delete those files because they're only single files here and there. But it's quite irritating because they don't seem to have any kind of corruption.
-
Hi,
Could anybody tell me if it is possible to export a huge table (900 million records) to an external file and then create an external table based on this external file?
Does anybody know the best way to export/import a huge schema to save time and disk space?
And all these in Oracle 9i.
Thanks,
Paul
Hi Paul,
export a huge table (900 mil records) to an external file and then create an external table based on this external file?
Yes! You can use the SQL*Plus "spool" command, and extract the data in a comma-delimited format (CSV). Then, you can define an external table to point to the flat file. I have some notes here:
http://www.dba-oracle.com/art_ext_tabs_spreadsheet.htm -
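A minimal SQL*Plus spool sketch of that approach (the table, columns, and output path are hypothetical); for 900 million rows you would normally also raise the fetch array size and suppress terminal output:

```sql
set heading off feedback off pagesize 0 linesize 32767 trimspool on
set arraysize 1000 termout off
spool /export/big_table.csv
select col1 || ',' || col2 || ',' || to_char(created, 'YYYY-MM-DD')
from   big_table;
spool off
```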
Selecting data from external table
Hi there
I was wondering if somebody could assist me. When I try to select data from an external table, no data is displayed, and in my log file I receive the following error:
KUP-04026: field too long for datatype. Please find attached my external table script.
CREATE TABLE DEMO_FILE_EXT
(
MACODE NUMBER(7),
MANO NUMBER(7),
DEPNO VARCHAR2(2 BYTE),
DEPTYPE NUMBER(5),
STARTDATE NUMBER(8),
ENDDATE NUMBER(8),
OPTIONSTART NUMBER(8),
BENEFITSTART NUMBER(8),
STARTSUSPEND NUMBER(8),
ENDSUSPEND NUMBER(8),
INITIALS VARCHAR2(5 BYTE),
FIRSTNAME VARCHAR2(20 BYTE),
SURNAME VARCHAR2(25 BYTE),
STR1 VARCHAR2(30 BYTE),
STR2 VARCHAR2(30 BYTE),
STR3 VARCHAR2(30 BYTE),
STR4 VARCHAR2(30 BYTE),
SCODE VARCHAR2(6 BYTE),
POS1 VARCHAR2(30 BYTE),
POS2 VARCHAR2(30 BYTE),
POS3 VARCHAR2(30 BYTE),
POS4 VARCHAR2(30 BYTE),
PCODE VARCHAR2(6 BYTE),
TELH VARCHAR2(10 BYTE),
TELW VARCHAR2(10 BYTE),
TELC VARCHAR2(10 BYTE),
IDNUMBER VARCHAR2(13 BYTE),
DOB NUMBER(8),
GENDER VARCHAR2(1 BYTE),
EMPLOYER_CODE VARCHAR2(10 BYTE),
EMPLOYER_NAME VARCHAR2(900 BYTE)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY DEMO_FILES
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
BADFILE 'Tinusb.txt'
DISCARDFILE 'Tinusd.txt'
LOGFILE 'Tinusl.txt'
SKIP 1
FIELDS TERMINATED BY '|'
MISSING FIELD VALUES ARE NULL
(
MACODE,
MANO,
DEPNO,
DEPTYPE,
STARTDATE,
ENDDATE,
OPTIONSTART,
BENEFITSTART,
STARTSUSPEND,
ENDSUSPEND,
INITIALS,
FIRSTNAME,
SURNAME,
STR1,
STR2,
STR3,
STR4,
SCODE,
POS1,
POS2,
POS3,
POS4,
PCODE,
TELH,
TELW,
TELC,
IDNUMBER,
DOB,
GENDER,
EMPLOYER_CODE,
EMPLOYER_NAME
)
)
LOCATION (DEMO_FILES:'Test1.txt')
)
REJECT LIMIT UNLIMITED
LOGGING
NOCACHE
NOPARALLEL;
I have the correct privileges on the directory, but the error seems to be on the EMPLOYER_NAME field. The file I try to upload is in pipe-delimited format. The last field in the file does not have a pipe delimiter at the end. Can this be the problem? Must I go and look for any trailing spaces? Can I specify in the external table script how many characters I need for the employer_name field? We receive this file from an external company.
Thank you very much for the help
Ferdie
Common mistake: you gave the field sizes in the column listing of the table, but not in the file definition. Oracle does not apply one to the other. In the file definition section, give explicit field sizes.
Hi shoblock
Sorry for only coming back to you now, thank you for your help, I had to give the explicit field size for the last column (employer name).
Thank you once again!!
Ferdie -
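For reference, the fix shoblock describes, sketched on the last two fields only: by default the access driver assumes CHAR(255) for a delimited field, so any field that can exceed 255 bytes (here EMPLOYER_NAME, up to 900 bytes) needs an explicit size in the field list inside ACCESS PARAMETERS:

```sql
-- fragment of the field list, not the full definition:
-- ...
EMPLOYER_CODE CHAR(10),
EMPLOYER_NAME CHAR(900)  -- explicit size; the default CHAR(255) overflows -> KUP-04026
-- ...
```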
External Table LKM: error during task interpretation
I've had this problem more than once now, using "LKM File to Oracle (EXTERNAL TABLE)". Before the interface can even run, I get an error, as if the script can't be output. I've tried removing the LKM and reimporting. No use. I am using this LKM successfully in other projects... not sure why it would have issues here. When I had this problem previously, Oracle Support couldn't reproduce it (they certainly tried). I managed to work around it then, but now... this is getting problematic. Insight welcome.
Here's the start of the error:
com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Error during task interpretation
Task:6
java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
BSF info: Create external table at line: 0 column: columnNo
at com.sunopsis.dwg.codeinterpretor.a.a(a.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.h.y(h.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
Hi,
I have this exact problem, did you get a solution?
Thanks -
A special character in the last column in my external table
Hello,
I've made an external table that looks like this:
CREATE TABLE "REC_XLS"
(
"FINANCE_DATE" VARCHAR2(50 BYTE),
"CREATION_DATE" VARCHAR2(50 BYTE),
"SENT_DATE" VARCHAR2(50 BYTE),
"REMARKS" VARCHAR2(200 BYTE),
"REMAINING_QUANTITY" VARCHAR2(30 BYTE),
"CODE" VARCHAR2(30 BYTE)
)
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY "LOADER"
ACCESS PARAMETERS
(
RECORDS DELIMITED BY newline
LOAD WHEN CODE != BLANKS
SKIP 1
FIELDS TERMINATED BY 0X'09'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
Finance_date,
creation_date,
sent_date,
Remarks,
remaining_quantity,
code
)
)
LOCATION ('Finance.tsv')
);
I exported the result to an xls file and I saw a special character in the last column. The result of a copy and paste is presented here: "ABE
You should notice that the last quote is on the upper line instead of at the end of ABE. This means, I think, that there is a carriage return at the end. And the 0X'09' is the tab separator. How can I get rid of this character?
My guess is that you run the load on a unix server, which uses the default LF end-of-line character, while the file has DOS CRLF end-of-line. The easiest option is to change the records delimiter in the DDL and replace "RECORDS DELIMITED BY newline" with the code below:
RECORDS DELIMITED BY '\r\n'
In general, the end-of-line can be really any character; the sample above shows CRLF, but it can be TAB, space, | or any other character or set of characters. I have samples of special characters on my blog: http://jiri.wordpress.com/2009/01/29/oracle-external-tables-by-examples-part-1/
hope this helps
jiri
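The mechanics of jiri's diagnosis can be shown in a few lines: when a CRLF-terminated line is split on LF only, the stray CR lands at the end of the last field. The sample line below is invented for illustration:

```python
# a CRLF-terminated, tab-separated line, as produced on Windows
line = "2013-01-01\t2013-01-02\t2013-01-03\tok\t5\tABE\r\n"

# splitting on LF only (what RECORDS DELIMITED BY newline does on unix)
fields = line.rstrip("\n").split("\t")
print(repr(fields[-1]))  # the invisible "special character" is a trailing \r

# treating CR+LF as the record delimiter leaves the field clean
fields = line.rstrip("\r\n").split("\t")
print(repr(fields[-1]))
```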