How to read a CSV File
Hi All,
I am new to the forum, so I don't know if I can post this message in this section or not, but I did not find any related section in the menu, so I am posting it here.
I want to read a .CSV (comma-separated) file in a particular location and check whether the format is correct. If the format is OK, then FTP it to another directory or another location.
Thanks in advance.
I want to read a .CSV (comma-separated) file in a particular location and check whether the format is correct. If the format is OK, then FTP it to another directory or another location.
You don't specify how you want to handle the file. You can do it in one of three ways:
1) Just read the file as is, including the delimiters, then process it.
2) Remove the delimiters from the file, then read the file and process it.
3) Parse the file using the delimiters to separate the file into tokens, then read the tokens and process them.
I'll assume that you want to do #3 above. The following code will handle CSV data that is delimited by a comma, and is either quoted or unquoted.
public class Xb {
    public static void main(String[] args) {
        // the csv includes both quoted and unquoted data
        String csv = "\"a\",\"b\",\"c\",d,e,f";
        System.out.println(csv); // so you can see the data
        String[] stArray = csv.replaceAll("\"", "").split(",");
        for (int j = 0; j < stArray.length; j++) {
            System.out.println(stArray[j]);
        }
    }
}
This is the printed output:
"a","b","c",d,e,f
a
b
c
d
e
f
Note: If you want #2 above, delete .split(",") from this line:
String[] stArray = csv.replaceAll("\"", "").split(",");
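One caveat: the replaceAll/split approach breaks down if a quoted field itself contains a comma (e.g. "a,b"). If you need to handle that case, here is a minimal quote-aware splitter; it is a sketch only and does not handle escaped "" quotes:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvSplit {
    // Splits one CSV line, honoring double quotes so embedded commas survive.
    static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (c == '"') {
                inQuotes = !inQuotes;       // toggle quoted state, drop the quote
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString()); // field boundary
                cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());         // last field has no trailing comma
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(split("\"a,b\",c,d"));
    }
}
```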
Similar Messages
-
How to read a .csv file(excel format) using Java.
Hi Everybody,
I need to read a .csv file (Excel) and store all the columns and rows in 2D arrays. Then I can do the rest of the coding myself. I would appreciate it if somebody could post their code to read .csv files here. The .csv file can have a different number of columns and a different number of rows each time it is run. The .csv file is in Excel format, so I don't know if that affects the code or not. I would also appreciate it if the imported classes were posted too. I would also like to know if there is a way I can recognize how many rows and columns the .csv file has. I need this urgently, so I would be very grateful to anybody who has the solution. Thanks.
Sincerely, Taufiq.
I used this:
BufferedReader in = new BufferedReader (new FileReader ("test.csv"));
// and
StringTokenizer parser = new StringTokenizer (str, ", ");
while (parser.hasMoreTokens())
{ // process each token here }
Works like a charm! -
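One caution with StringTokenizer: it silently merges consecutive delimiters, so empty fields in a line like a,,c are lost and the column count shifts. String.split with a negative limit keeps them. A small sketch contrasting the two:

```java
import java.util.StringTokenizer;

public class SplitVsTokenizer {
    // split(",", -1) preserves empty fields, including trailing ones
    static String[] fields(String line) {
        return line.split(",", -1);
    }

    public static void main(String[] args) {
        String line = "a,,c,";
        System.out.println(fields(line).length);                          // 4 fields
        System.out.println(new StringTokenizer(line, ",").countTokens()); // only 2 tokens
    }
}
```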
How to read a .CSV file using UTL_FILE
HI,
How do i read a .csv file line by line using UTL_FILE?
Thanks in advance
Regards,
Gayatri
---- open-the-file logic goes here
declare
---- Let's say this file is delimited by ','
---- Sample data (a 4-column file):
----   Joe,Joan,People,Animal
----   Teddy,Bear,Beans,Toys
v_startPos number; -- starting position of field
v_Pos number; -- position of delimiter
v_lenString number; -- length of field
v_first_field varchar2(30);
v_second_field varchar2(30);
v_third_field varchar2(30);
v_fourth_field varchar2(30);
input_String varchar2(1000); -- buffer for each line of file
delChar varchar2(1) := ','; -- the delimiter
input_file utl_file.file_type;
begin
---- do the open-the-file logic here, e.g. input_file := utl_file.fopen(...);
loop
begin
utl_file.get_line(input_file, input_String); -- get each line
exception
when no_data_found then
fnd_file.put_line(FND_FILE.LOG, 'Last line so exit');
exit;
end;
---- this will get the first field; the last number in instr is the occurrence
v_Pos := instr(input_String,delChar,1,1);
v_lenString := v_Pos - 1;
v_first_field := substr(input_String,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will get the second field
v_Pos := instr(input_String,delChar,1,2);
v_lenString := v_Pos - v_startPos;
v_second_field := substr(input_String,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- 3rd field
v_Pos := instr(input_String,delChar,1,3);
v_lenString := v_Pos - v_startPos;
v_third_field := substr(input_String,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- last field -- there is no delimiter for the last field
v_Pos := length(input_String) + 1;
v_lenString := v_Pos - v_startPos;
v_fourth_field := substr(input_String,v_startPos,v_lenString);
end loop;
end; -
How to read the CSV Files into Database Table
Hi
Friends i have a Table called tblstudent this has the following fields Student ID, StudentName ,Class,Father_Name, Mother_Name.
Now i have a CSV File with 1500 records with all These Fields. Now in my Program there is a need for me to read all these Records into this Table tblstudent.
Note: I have got already 2000 records in the Table tblstudent now i would like to read all these CSV File records into the Table tblstudent.
Please give me some examples to do this
Thank your for your service
Cheers
Jofin
1) Read the CSV file line by line using BufferedReader.
2) Convert each line (record) to a List and add it to a parent List. If you know the columns before, you might use a List of DTO's.
3) Finally save the two-dimensional List or the List of DTO's into the datatable using plain JDBC or any kind of ORM (Hibernate and so on).
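A rough sketch of steps 1-3 in plain JDBC. The table and column names are taken from the question above; the JDBC URL and the file name are placeholder assumptions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.List;

public class StudentLoader {
    // Step 2: each CSV line becomes one row (String[]) in a parent list.
    static List<String[]> parse(List<String> lines) {
        List<String[]> rows = new ArrayList<>();
        for (String line : lines) {
            rows.add(line.split(",", -1)); // -1 keeps empty trailing fields
        }
        return rows;
    }

    // Step 3: batch-insert the rows with plain JDBC; the existing 2000
    // records in tblstudent are untouched, these rows are appended.
    static void insert(Connection con, List<String[]> rows) throws Exception {
        String sql = "INSERT INTO tblstudent (StudentID, StudentName, Class, "
                   + "Father_Name, Mother_Name) VALUES (?, ?, ?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (String[] r : rows) {
                for (int i = 0; i < 5; i++) ps.setString(i + 1, r[i]);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }

    public static void main(String[] args) throws Exception {
        if (args.length > 0) { // pass a JDBC URL to actually run the insert
            try (Connection con = DriverManager.getConnection(args[0])) {
                insert(con, parse(java.nio.file.Files.readAllLines(
                        java.nio.file.Paths.get("students.csv"))));
            }
        }
    }
}
```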
This article contains some useful code snippets to parse a CSV: http://balusc.xs4all.nl/srv/dev-jep-csv.html -
How to read/write .CSV file into CLOB column in a table of Oracle 10g
I have a requirement which is nothing but a table has two column
create table emp_data (empid number, report clob)
Here REPORT column is CLOB data type which used to load the data from the .csv file.
The requirement here is
1) How to load data from .CSV file into CLOB column along with empid using DBMS_lob utility
2) How to read report columns which should return all the columns present in the .CSV file (dynamically because every csv file may have different number of columns) along with the primariy key empid).
eg: empid report_field1 report_field2
1 x y
Any help would be appreciated.If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
declare
lt_report_clob CLOB;
l_max_line_length integer := 1024; -- set as high as the longest line in your file
l_infile UTL_FILE.file_type;
l_buffer varchar2(1024);
l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
l_filename varchar2(200) := 'my_file_name.csv'; -- get this from somewhere
begin
-- open the file; we assume an Oracle directory has already been created
l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
-- initialise the empty clob
dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
loop
begin
utl_file.get_line(l_infile, l_buffer);
dbms_lob.append(lt_report_clob, l_buffer);
exception
when no_data_found then
exit;
end;
end loop;
insert into report_table (emp_id, report)
values (l_emp_id, lt_report_clob);
-- free the temporary lob
dbms_lob.freetemporary(lt_report_clob);
-- close the file
UTL_FILE.fclose(l_infile);
end;
This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However it can be rather slow if there are many records in the csv file - the dbms_lob.append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
There is at least one other possibility:
- you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
HTH
Regards Nigel
Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file... -
How to read a CSV file into the portal
hi all,
I want to read the content of CSV file that is avaliable in my desktop.
plz help me with suitable code
Regards
Savitha
Edited by: Savitha S R on Jun 1, 2009 8:25 AM
Please use this code for that:
REPORT znkp_upload_csv line-size 400.
DATA : v_filename TYPE string.
PARAMETER : p_file LIKE rlgrap-filename DEFAULT 'C:\Documents and Settings\Administrator\Desktop\JV.csv' .
*Types for Reading CSV file
TYPES : BEGIN OF x_csv,
text(400) TYPE c,
END OF x_csv.
*Global internal table for reading CSV file
DATA : lt_csv TYPE TABLE OF x_csv.
*Global work area for reading CSV file
DATA : wa_csv LIKE LINE OF lt_csv.
v_filename = p_file.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = v_filename
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = ' '
HEADER_LENGTH = 0
READ_BY_LINE = 'X'
DAT_MODE = ' '
CODEPAGE = ' '
IGNORE_CERR = ABAP_TRUE
REPLACEMENT = '#'
CHECK_BOM = ' '
* VIRUS_SCAN_PROFILE =
NO_AUTH_CHECK = ' '
* IMPORTING
* FILELENGTH =
* HEADER =
TABLES
data_tab = lt_csv
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
DATA : BEGIN OF item OCCURS 0 ,
t1(20) TYPE c,
t2(20) TYPE c,
t3(20) TYPE c,
t4(20) TYPE c,
t5(20) TYPE c,
t6(20) TYPE c,
t7(20) TYPE c,
t8(20) TYPE c,
END OF item.
DATA : txt(1) TYPE c. " 1-Header 2-Item
LOOP AT lt_csv into wa_csv.
split wa_csv-text at ',' into item-t1
item-t2
item-t3
item-t4
item-t5
item-t6
item-t7
item-t8.
append item.
clear item.
ENDLOOP.
Check the ITEM table for the parsed result.
Regards
Naresh -
How to Read a CSV file using pure PL/SQL
Hi friends - Is there a possibility to read a .TXT/CSV file from a local machine and display the contents in the form fields, one record at a time?
The logic to flow through the records can be built, but the problem is where to begin to start reading the file using PL/SQL. I don't want to use any third-party component/API as such. Is there any way in Forms, using PL/SQL, to read an external CSV file and interpret the contents?
I have read about SQL*Loader on some sites but cannot find any way to invoke it on the Windows platform. There seem to be UNIX commands to invoke the .CTL file that is used by SQL*Loader. Any help is much appreciated.
Rgds
Hi, thanks for your replies. TEXT_IO seems to be a solution, but it is not very comprehensive, as it exposes only limited functions for complex operations while reading and writing files.
I think SQL*Loader is a valid solution, but as I stated in my original post I'm not able to invoke it from the command prompt. The command that is shown on the suggested URL (http://www.orafaq.com/faqloadr.htm) is not found on my machine. I have Windows 2K installed. Is there a separate patch or a download available from where I can get that .EXE to invoke the utility?
Thanks.. -
How to read a CSV file and Insert data into an Oracle Table
Hi All,
I have a Clob file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
Please let me some suggestions on this.
Thanks,
Chandra R
jeneesh wrote:
And, please don't "hijack" a 5-year-old thread. Better to start a new one.
I've just split it off to a thread of its own. ;)
@OP,
I have a Clob file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
You don't have a "Clob file", as there's no such thing. CLOB is a datatype for storing large character-based objects. A file is something on the operating system's filesystem.
So, why have you stored comma-separated data in a CLOB?
Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
If you really do have to parse a CLOB of data to pull out the comma-separated values, then you're going to have to write something yourself to do that, reading "lines" by looking for the newline character(s), and then breaking up the "lines" into the component data by looking for commas within it, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
Like I say... it's much easier with SQL*Loader or External Tables, as these are designed to parse such CSV-type data. -
Read a CSV file and dynamically generate the insert
I have a requirement where there are multiple CSVs which need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, when the insert statement is passed as a parameter to $cmd.CommandText, it does not evaluate the values.
How do I evaluate the string in PowerShell?
Import-Csv -Path $FileName.FullName | % {
# Insert statement.
$insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
$valCols='';
$DataCols='';
$lists = $ReqColumns.split(",");
foreach($l in $lists) {
$valCols= $valCols + '$($_.'+$l+')'','''
}
#Generate the values statement
$DataCols=($DataCols+$valCols+')').replace(",')","");
$insertStr =@("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
#The above statement generate the following insert statement
#INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
$cmd.CommandText = $insertStr #does not evaluate the values
#If the same statement is passed as below then it execute successfully
#$cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
#Execute Query
$cmd.ExecuteNonQuery() | Out-Null
jyeragi
Hi Jyeragi,
To convert the data to the SQL table format, please try this function out-sql:
out-sql Powershell function - export pipeline contents to a new SQL Server table
If I have any misunderstanding, please let me know.
If you have any feedback on our support, please click here.
Best Regards,
Anna
TechNet Community Support -
Data formatting and reading a CSV file without using Sqlloader
I am reading a csv file to an Oracle table called sps_dataload. The table is structured based on the record type of the data at the beginning of
each record in the csv file. But the first two lines of the file are not going to be loaded to the table due to the format.
Question # 1:
How can I skip reading the first two lines from my csv file?
Question # 2:
There are more fields in the csv file than there are columns in my table. I know I can add FILLER as an option, but there are about 150-odd comma-separated fields in the file and my table has only 8 columns to load from the file. So, do I really have to use FILLER 140-odd times in my script, or is there a better way to do this?
Question # 3:
This is more of an extension of my question above. The csv file has fields with embedded quotes - I know this could be handled in SQL*Loader by specifying OPTIONALLY ENCLOSED BY '"'.
But can this be doable in the insert as created in the below code?
I am trying to find the "wrap code" button in my post, but do not see it.
Heres my file layout -
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
This is my table structure -
desc sps_dataload;
File_Name Varchar2 (50) Not Null,
Record_Layer Varchar2 (20) Not Null,
Level_Id Varchar2 (20),
Desc1 Varchar2 (50),
Desc2 Varchar2 (50),
Desc3 Varchar2 (50),
Desc4 Varchar2 (50)
Heres my code to do this -
create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
p_filename IN varchar2,
p_Totalinserted IN OUT number) as
v_filename varchar2(30) := p_filename;
v_filehandle UTL_FILE.FILE_TYPE;
v_startPos number; --starting position of a field
v_Pos number; --position of string
v_lenstring number; --length of string
v_record_layer varchar2(20);
v_level_id varchar2(20) := 0;
v_desc1 varchar2(50);
v_desc2 varchar2(50);
v_desc3 varchar2(50);
v_desc4 varchar2(50);
v_input_buffer varchar2(1200);
v_delChar varchar2(1) := ',';
v_str varchar2(255);
BEGIN
v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
p_Totalinserted := 0;
LOOP
BEGIN
UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
-- this will read the 1st field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,1);
v_lenString := v_Pos - 1;
v_record_layer := substr(v_input_buffer,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 2nd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,2);
v_lenString := v_Pos - v_startPos;
v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 3rd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,3);
v_lenString := v_Pos - v_startPos;
v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 4th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,4);
v_lenString := v_Pos - v_startPos;
v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 5th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,5);
v_lenString := v_Pos - v_startPos;
v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
v_str := 'insert into sps_dataload values ('''||v_filename||''','''||v_record_layer||''','''||v_level_id||''','''||v_desc1||''','''||v_desc2||''','''||v_desc3||''','''||v_desc4||''')';
Execute immediate v_str;
p_Totalinserted := p_Totalinserted + 1;
commit;
END LOOP;
UTL_FILE.FCLOSE(v_filehandle);
EXCEPTION
WHEN UTL_FILE.INVALID_OPERATION THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
WHEN UTL_FILE.READ_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
WHEN UTL_FILE.INVALID_PATH THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
WHEN UTL_FILE.INVALID_MODE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
WHEN UTL_FILE.INTERNAL_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
WHEN VALUE_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
WHEN OTHERS THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE;
END insert_spsdataloader;
/
Justin, thanks. I did change my PL/SQL procedure to use utl_file.get_line, modifying the instr calls based on the position of ',' in the file, but my procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
As I was reading more about external tables as an efficient approach, I came to believe I can perhaps build an external table with my varying selection from the file. But I am still unclear whether I can construct my external table by choosing different fields in a record based on a record-identifier string value (which is the first field of any record). I guess I can, but I am looking for the construct: how am I going to use the instr function for selecting the field from the file while creating the table?
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
For example, if I want to create an external table like this -
CREATE TABLE extern_sps_dataload
( record_layer VARCHAR2(20),
attr1 VARCHAR2(20),
attr2 VARCHAR2(20),
attr3 VARCHAR2(20),
attr4 VARCHAR2(20)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY dataload
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE dataload:'sps_dataload.bad'
LOGFILE dataload:'sps_dataload.log'
DISCARDFILE dataload:'sps_dataload.dis'
SKIP 2
VARIABLE 2 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' LRTRIM
MISSING FIELD VALUES ARE NULL
+LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3,FIELD7,FIELD9)+
+LOAD WHEN RECORD_LAYER= 'PRODUCT' (FIELD3,FIELD4,FIELD8,FIELD9)+
+LOAD WHEN RECORD_LAYER= 'SEGMENT' (FIELD1,FIELD2,FIELD4,FIELD5)+
)
LOCATION ('sps_dataload.csv')
)
REJECT LIMIT UNLIMITED;
While I was reading the external table documentation, I thought I could achieve similar things by using the position_spec option, but I cannot get my head around its parameters. I have highlighted in italics in the code above (from LOAD WHEN ... FIELDS ...) the part I think I am going to use, but I am not sure of its construct.
Thank you for your help!! Appreciate your thoughts on this..
Sanders. -
Hi all,
I'm new to flex and have been asked to provide a small widget
that will display the contents of a csv in a list. Can I use the
HTTPService to open a csv file and then create an actionscript 3
function to parse it? If so can anyone point me to a tutorial on
how to parse a csv file please.
Thanks in advance
Hello Sir,
I am new to Flex and want to read a CSV file using the code below, but it does not seem to be working. Can you please help?
var file:File = evt.currentTarget as File;
file = file.resolvePath(file.nativePath);
var fileStream:FileStream = new FileStream();
fileStream.open(file, FileMode.READ);
var fileData:String = fileStream.readUTFBytes(fileStream.bytesAvailable);
var endings:Array = [File.lineEnding, "\n", "\r"];
But for some reason it returns the funny value "ÐÏ ࡱá". Any idea why I don't get the correct data from the file?
Below is the CSV file I am trying to open.
Title
Given_Name
Surname
Gong
Salutation
Position
Organisation
Address_Line_1
Address_Line_2
Address_Line_3
Suburb
State
Postcode
Country
Home_Phone
Fax
Other_Phone
User_Field_1
User_Field_2
User_Field_3
User_Field_4
Mobile_Phone
Second_Address
Second_Address_Line_1
Second_Address_Line_2
Second_Address_Line_3
Second_Suburb
Second_State
Second_Country
Second_Postcode
Langcode
Website
Mr.
Jeff
Alexander
Retention Marketing
Monday, April 13th, 2009
Mr.
Anthony
Demaso
Retention Marketing
Monday, April 13th, 2009
Sally
Swinamer
Yield
Monday, April 13th, 2009
Chris
Torbay
Yield
Monday, April 13th, 2009
Annette
Warring
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Mark
Khoury
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Andy
Thorndyke
Thorsons
Monday, April 13th, 2009
Shannon
Rutherford
Central Reproductions
Monday, April 13th, 2009
Mr.
Rob
Greenwood
Central Reproductions
Monday, April 13th, 2009
Lisa
Marchese
Des Rosiers
Monday, April 13th, 2009
Mr.
Michael
Whitcombe
McMillan LLP
Monday, April 13th, 2009
Thanks,
Gill -
Read a csv file in a PL/SQL routine with the path as a parameter
hello oracle community,
System Windows 2008, Oracle 11.1 Database
I have created a directory in the database and there is no problem reading a csv file from there. Since the files can be at different places, my Java developer is asking me if he can give me the path of the file as a parameter. As far as I know, I will need a directory object to open the file with the UTL_FILE package, but I cannot create directories for folders that will only be created in the future. Is there any way to solve that problem, so I can use the path as a parameter in my routine?
Ikrischer
BluShadow wrote:
Ikrischer wrote:
as I said before, they are dynamic. As usual with most other companies, you get new partners and you lose partners, so new folders will be added and old ones will be deleted. If you use different directories for different folders, you have more work to handle them.
I wouldn't call that dynamic, I would call that business maintenance. Dynamic would mean that the same company is putting their files into different folders each time or on a regular basis. If it's one folder per company then that's not dynamic.
The folder structure is changing; I call that dynamic, it does not look static to me. But that's just a point of view; it is not important what we call it. Let's just agree the folder structure will change in the future, no matter how many folders one partner will have.
BluShadow wrote:
So you're receiving files and you don't even know the format of them? Some of them are not "correct"? Specify that the company must provide the files in the correct format or they will be rejected.
Computers work on known logic in most cases, not some random guessing game.
We know the correct format, but we also know that the partner won't always send the correct format, period. Because of that, we want to deliver an error log: records 11, 15 and 20 are wrong because of... That is a very useful service.
BluShadow wrote:
The folders should be known.
Impossible; we do not know the future.
BluShadow wrote:
The file format should be known.
See above: it has a fixed format, but that's no guarantee you get the correct format. For me that's a normal situation; it happens every day: you have fixed rules and they will be broken.
BluShadow wrote:
The moment you allow external forces to start specifying things how they want, you're shooting yourself in the foot. That's not good design and not good business practice.
Maybe we are talking about different things. We do not allow data with a wrong format to be loaded into our system, but we do allow partners to send us a wrong format, and even wrong content; we still don't load it into our system. Format and content must both be correct to be loaded into the real system, and we check both. And with external tables you cannot check a wrong format with a detailed error log.
Ikrischer -
Read a csv file, fill a dynamic table and insert into a standard table
Hi everybody,
I have a problem here and I need your help:
I have to read a csv file and insert the data of it into a standard table.
1 - On the parameter screen I have to indicate the standard table and the csv file.
2 - I need to create a dynamic table. The same type of the one I choose at parameter screen.
3 - Then I need to read the csv and put the data into this dynamic table.
4 - Later I need to insert the data from the dynamic table into the standard table (the one on the parameter screen).
How do I do this job? Do you have an example? Thanks.
Here is an example program which shows how to upload a CSV file from the frontend into a dynamic internal table. You can of course modify this to update your database table.
report zrich_0002.
type-pools: slis.
field-symbols: <dyn_table> type standard table,
<dyn_wa>,
<dyn_field>.
data: it_fldcat type lvc_t_fcat,
wa_it_fldcat type lvc_s_fcat.
type-pools : abap.
data: new_table type ref to data,
new_line type ref to data.
data: xcel type table of alsmex_tabline with header line.
selection-screen begin of block b1 with frame title text .
parameters: p_file type rlgrap-filename default 'c:\Test.csv'.
parameters: p_flds type i.
selection-screen end of block b1.
start-of-selection.
* Add X number of fields to the dynamic itab cataelog
do p_flds times.
clear wa_it_fldcat.
wa_it_fldcat-fieldname = sy-index.
wa_it_fldcat-datatype = 'C'.
wa_it_fldcat-inttype = 'C'.
wa_it_fldcat-intlen = 10.
append wa_it_fldcat to it_fldcat .
enddo.
* Create dynamic internal table and assign to FS
call method cl_alv_table_create=>create_dynamic_table
exporting
it_fieldcatalog = it_fldcat
importing
ep_table = new_table.
assign new_table->* to <dyn_table>.
* Create dynamic work area and assign to FS
create data new_line like line of <dyn_table>.
assign new_line->* to <dyn_wa>.
* Upload the excel
call function 'ALSM_EXCEL_TO_INTERNAL_TABLE'
exporting
filename = p_file
i_begin_col = '1'
i_begin_row = '1'
i_end_col = '200'
i_end_row = '5000'
tables
intern = xcel
exceptions
inconsistent_parameters = 1
upload_ole = 2
others = 3.
* Reformt to dynamic internal table
loop at xcel.
assign component xcel-col of structure <dyn_wa> to <dyn_field>.
if sy-subrc = 0.
<dyn_field> = xcel-value.
endif.
at end of row.
append <dyn_wa> to <dyn_table>.
clear <dyn_wa>.
endat.
endloop.
* Write out data from table.
loop at <dyn_table> into <dyn_wa>.
do.
assign component sy-index of structure <dyn_wa> to <dyn_field>.
if sy-subrc <> 0.
exit.
endif.
if sy-index = 1.
write:/ <dyn_field>.
else.
write: <dyn_field>.
endif.
enddo.
endloop.
Regards,
Rich Heilman
Read a csv file and read the fiscal yr in the 4th pos?
Hello ABAP Experts,
How do I write code to read a CSV file and read the fiscal year in the 4th position?
any suggestions or code highly appreciated.
Thanks,
BWer
Hi BWer,
Declare table itab with the required fields...
Use GUI_UPLOAD to get the contents of the file (say abc.csv) if the file is on the presentation server:
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = 'c:\abc.csv'
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = ' '
tables
data_tab = itab
EXCEPTIONS
FILE_OPEN_ERROR = 1
FILE_READ_ERROR = 2
NO_BATCH = 3
OTHERS = 22.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
Use OPEN DATASET if the file is on the application server.
After that, use the SPLIT command at the comma to get the contents of the 4th field...
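A minimal Java sketch of that last step, splitting a line at commas and taking the 4th field, assuming a made-up line layout in which the fiscal year sits in position 4:

```java
// Hypothetical sketch: extract the fiscal year from the 4th comma-separated
// field of one CSV line, the Java analogue of SPLIT at ',' in ABAP.
// The sample line layout is an assumption, not from the original question.
public class FiscalYearField {
    static String fiscalYear(String csvLine) {
        // limit -1 keeps trailing empty fields so short lines fail safely
        String[] fields = csvLine.split(",", -1);
        return fields.length >= 4 ? fields[3].trim() : "";
    }

    public static void main(String[] args) {
        String line = "1000,MATNR1,PLANT1,2023,100.00";
        System.out.println(fiscalYear(line)); // 2023
    }
}
```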
Regards,
Tanveer.
<b>Please mark helpful answers</b> -
Reading a CSV file and binding it to a data grid
Hi. I'm doing a school project and have been searching. The application reads a CSV file from c:\stocklist.csv via a button called btnLoadData, and the data needs to go into a data grid control called DmgDisplayData. Do I put the code in the button or in the data grid? I've been searching but cannot seem to find anything, so where do I put the code, and does anyone have example code for reading into the data fields? I also need one field (the order field) to be editable while the other three or four fields are read-only. We never covered this in the subject; I did file streams a few years ago in VB, but I'm using C# now, and the help in Visual Studio is not that useful for a blind person using the JAWS screen reader from http://www.freedomscientific.com. I'm on Visual Studio 2013 Community Edition. I've been trawling 15 to 20 pages so far and tried a couple of sites without luck. After loading the grid, the app should show a message box saying the file loaded successfully. Do I need the navigator buttons (next, back, previous, etc.), and how do I code those? I don't want it done for me, I want to learn, but some sample code would help; I did navigator controls years ago for a VB project and need the C# equivalent. There is also a Save Data button that saves the CSV file from the data grid (I can google that), and a toolbar with a Sort Items button: clicking it shows a drop-down list of three items and a Sort button that sorts the array in the data grid. Do I need another form, or can I make the combo box an invisible control and just reference it from the toolbar? I need to use the File class and an array; I've learned about single and multi-dimensional arrays. Any ideas? Thanks.
http://startrekcafe.stevesdomain.net http://groups.yahoo.com/groups/JawsOz
Hi Marvin,
--> Where do I put this, in the data grid click event or in the form load event? Thanks.
You could use this code after you initialize the DataGridView. You could put it in the form load event.
--> What about getting the toolbar, the combo box, and another button, then sorting the column array from the combo box and setting focus to the first read-only column of the data grid? How do I do that, close the parent form, and have another form on the toolbar?
Since this is a separate issue, I would recommend posting it as another thread. We will focus on that thread to help you. Thanks for your understanding.
By the way, before asking questions, I suggest trying to build it yourself first. You could begin to learn WinForms from the MSDN articles:
https://msdn.microsoft.com/en-us/library/dd30h2yb%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396. Or search online; you will get many answers, and you will learn more from working through them.
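For what it's worth, the closest analogue I can offer in this thread's original language is a Java/Swing table model in which only the order column reports itself editable, the same idea as a read-only DataGridView column in C#. The column layout and names here are assumptions for illustration:

```java
import javax.swing.table.DefaultTableModel;

// Hypothetical sketch: load CSV rows into a Swing table model where only
// the second column ("Order") is editable and all others are read-only.
// A JTable built on this model enforces the editability automatically.
public class CsvTableModelDemo {
    static DefaultTableModel build(String csv) {
        String[] lines = csv.split("\n");
        String[] header = lines[0].split(",");
        DefaultTableModel model = new DefaultTableModel(header, 0) {
            @Override
            public boolean isCellEditable(int row, int col) {
                return col == 1; // only the "Order" column (assumption)
            }
        };
        for (int i = 1; i < lines.length; i++) {
            model.addRow(lines[i].split(","));
        }
        return model;
    }

    public static void main(String[] args) {
        DefaultTableModel m = build("Code,Order,Price\nA1,5,9.99\nB2,3,4.50");
        System.out.println(m.getRowCount());        // 2
        System.out.println(m.isCellEditable(0, 1)); // true
        System.out.println(m.isCellEditable(0, 0)); // false
    }
}
```

In C# the same effect comes from setting `ReadOnly` on the individual DataGridView columns after binding.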
Best regards,
Youjun Tang