Reading a CSV file in LabVIEW
I need to read a 2D CSV file and pull the data out by rows for use in an I/O write. I am working with a Lakeshore 331 temperature controller and need to write parameters from a CSV file to the temperature controller. There are three columns of data: set temp, stabilization time, and tolerance. I need to be able to switch between using the tolerance data and a time switch to change a boolean to true. That true boolean needs to light a "Temp Stabilized Flag" and run a loop to the next row of data points, where the previous row of written data points is replaced by the following row. Basically, when one row of desired parameters is achieved, the program reads the next row of parameters and uses them as the standard until those parameter points are achieved. This process continues until the entire list is exhausted. I am unable to find a way to read the CSV by rows. My initial thought was to use the "Read Text" function and convert the string to the three separate data points, but I was unable to make that work. I have attached an example CSV file, a rough draft of what I need in order to write the data to the 331 (entirely unfinished, but it gives a basic idea of where I am coming from), and my full project in case it helps. Please help me find a way to pull each row of data points individually in the cycle described above.
Thank You,
Matt
Attachments:
csv rough draft.vi 25 KB
Lakeshore 331 Project_Test.vi 52 KB
Test331.csv 1 KB
I suggest you look at using the Spreadsheet String to Array function from the String palette to convert the csv file to an array of parameters. Then you can use a For Loop to index through the data one row at a time. You may need an Index Array primitive or an inner loop to process the individual data elements.
If you check out the examples through the "Example Finder" you will most likely discover a good starting point for your problem.
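For readers more comfortable in text languages, the row-stepping logic being described (parse the CSV once, then advance one row of set temp/stabilization time/tolerance whenever the stability condition fires) can be sketched in Python. This is purely an illustration of the dataflow, not LabVIEW code; the column meanings come from the question and the stability callback is a stand-in for the real Lakeshore 331 polling.

```python
import csv
import io

def rows_from_csv(text):
    """Parse CSV text into (set_temp, stab_time, tolerance) tuples."""
    reader = csv.reader(io.StringIO(text))
    return [(float(t), float(s), float(tol)) for t, s, tol in reader]

def run_profile(rows, is_stable):
    """Step to the next row only once the current setpoint is 'stable'."""
    reached = []
    for set_temp, stab_time, tolerance in rows:
        # In the real program this would write the setpoint to the 331 and
        # poll until the tolerance or time condition flips the boolean flag.
        if is_stable(set_temp, stab_time, tolerance):
            reached.append(set_temp)
    return reached

data = "10.0,60,0.5\n20.0,60,0.5\n"
print(run_profile(rows_from_csv(data), lambda t, s, tol: True))  # [10.0, 20.0]
```

In LabVIEW terms, `rows_from_csv` is Spreadsheet String to Array and the `for` loop is the For Loop auto-indexing one row per iteration.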
Happy coding,
Wire Warrior
Behold the power of LabVIEW as my army of Roomba minions streaks across the floor!
Similar Messages
-
Read from csv file and plot particular columns
Hello,
I'm a new user of LabVIEW and here it comes... my first major problem.
Maybe this has been discussed before. I searched for a solution first, but I couldn't find anything helpful, so I've decided to post a new message.
So here is my problem:
I'm working in a small semiconductor lab where different types of nitrides are grown using a proprietary reactor. The goal is to read the collected csv files from each growth in LabVIEW and plot the acquired data in appropriate graphs.
I have a bunch of csv files and I have to make a LabVIEW program to read them.
For the first part of my project I've decided to display the csv file (the growth log file) in LabVIEW, which I think works fine.
The second part is to plot particular columns from the recipe in graphs in LabVIEW (that one actually gives me a lot of trouble):
1. Timestamp vs Temperature /columns B and D/
2. Timestamp vs Gas flow /columns L to S/
3. Timestamp vs Pressure /columns E,K,T,U,V/
I've got one more problem. How can I convert the timestamp shown in the csv file to a human-readable date in LabVIEW? This actually is a big problem, because the timestamp is my x axis: I want to know at what time a particular process took place, and I also want to see the converted timestamp when displaying the csv file in the first place. I've read a lot about timestamps in Excel and in LabVIEW, but I'm still confused about how to convert it in my case.
I don't have problems displaying the csv file in LabVIEW. My problems are with the timestamp and the graphs.
Sorry for my awful English. I hope you can understand my problems, since English is not my mother language.
Please find the attached files.
If you have any ideas or suggestions I'll be more than happy to discuss them.
Thank you in advance.
Have a nice day!
Attachments:
growth log.csv 298 KB
Read from growth log.vi 33 KB
Hello again,
I'm having problems converting the first column in the file Growth Log.csv attached above.
I have code converting the Excel timestamp to time, and I am using Index Array to try to grab a particular column out of it, but the attached file is read in as strings, so I guess I have to redo it as an array, and I don't know how. Would you help me with this one?
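For reference on the conversion itself: Excel on Windows stores a timestamp as a serial count of days since 1899-12-30, with the fractional part as the time of day. A Python sketch of the conversion being attempted in Xl Timestamp to Time.vi (the VI's internals are not shown, so this only illustrates the arithmetic):

```python
from datetime import datetime, timedelta

EXCEL_EPOCH = datetime(1899, 12, 30)  # day zero of Excel's Windows date system

def excel_serial_to_datetime(serial):
    """Convert an Excel serial date (integer part = days, fraction = time)."""
    return EXCEL_EPOCH + timedelta(days=float(serial))

print(excel_serial_to_datetime(39448.5))  # 2008-01-01 12:00:00
```

In LabVIEW the same offset arithmetic applies, except LabVIEW's native epoch is 1904-01-01, so a constant offset between the two epochs has to be added when converting to a LabVIEW timestamp.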
Attachments:
Xl Timestamp to Time.vi 21 KB -
Read a CSV file and dynamically generate the insert
I have a requirement where multiple csv's need to be exported to a SQL table. So far I am able to read the csv file and generate the insert statement dynamically for the selected columns; however, when the insert statement is passed as a parameter to $cmd.CommandText, the values are not evaluated.
How do I evaluate the string in PowerShell?
Import-Csv -Path $FileName.FullName | % {
    # Insert statement.
    $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
    $valCols = ''
    $DataCols = ''
    $lists = $ReqColumns.split(",")
    foreach ($l in $lists) {
        $valCols = $valCols + '$($_.' + $l + ')'',''' 
    }
    # Generate the values statement
    $DataCols = ($DataCols + $valCols + ')').replace(",')","")
    $insertStr = @("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
    # The above statement generates the following insert statement:
    # INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
    $cmd.CommandText = $insertStr  # does not evaluate the values
    # If the same statement is passed literally, it executes successfully:
    # $cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
    # Execute query
    $cmd.ExecuteNonQuery() | Out-Null
}
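The root cause is that the `$($_.PRODUCT_ID)` pieces were built inside single-quoted strings, so the literal placeholder text (not the row values) reaches the server. One robust alternative is to skip string splicing entirely and bind each CSV value as a query parameter. Since the actual SQL Server connection and `$cmd` object are not in the post, here is the idea as a Python/sqlite3 sketch (table and column names taken from the post, the in-memory database is an assumption):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the real SQL Server target
conn.execute(
    "CREATE TABLE TMP_APPLE_EXPORT (PRODUCT_ID TEXT, QTY_SOLD INT, QTY_AVAILABLE INT)"
)

csv_text = "PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE\nA1,3,7\nB2,5,9\n"

# Placeholders (?) let the driver supply the values; no string splicing needed.
for row in csv.DictReader(io.StringIO(csv_text)):
    conn.execute(
        "INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID, QTY_SOLD, QTY_AVAILABLE) VALUES (?, ?, ?)",
        (row["PRODUCT_ID"], row["QTY_SOLD"], row["QTY_AVAILABLE"]),
    )

print(conn.execute("SELECT COUNT(*) FROM TMP_APPLE_EXPORT").fetchone()[0])  # 2
```

The PowerShell equivalent is `$cmd.Parameters.AddWithValue(...)` per column inside the `Import-Csv` pipeline, which also protects against quoting problems in the data.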
jyeragi
Hi Jyeragi,
To convert the data to the SQL table format, please try this function out-sql:
out-sql Powershell function - export pipeline contents to a new SQL Server table
If I have any misunderstanding, please let me know.
If you have any feedback on our support, please click here.
Best Regards,
Anna
TechNet Community Support -
Reading a CSV file from server
Hi All,
I am reading a CSV file from the server, and my internal table has only one field with length 200. In the input CSV file there is more than one column, and after splitting the file my internal table should have the same number of rows as there are columns in the input record.
But when I do that, the last field in the internal table is appended with #.
Can somebody tell me the solution for this?
You can see my code below.
data: begin of itab_infile occurs 0,
        input(3000),
      end of itab_infile.
data: begin of itab_rec occurs 0,
        record(200),
      end of itab_rec.
data: c_comma(1) value ','.
open dataset f_name1 for input in text mode encoding default.
if sy-subrc <> 0.
  write: /, 'FILE NOT FOUND'.
  exit.
endif.
do.
  read dataset f_name1 into itab_infile-input.
  if sy-subrc <> 0.
    exit.
  endif.
  split itab_infile-input at c_comma into table itab_rec.
enddo.
Thanks in advance.
Sunil
Sunil,
You do not mention the platform on which the CSV file was created or the platform on which it is read.
A common problem with CSV files created on MS/Windows and read on Unix is the end-of-record (EOR) characters.
MS/Windows uses <CR><LF> as the EOR.
Unix uses <LF> (classic Mac OS used <CR>).
If on unix open the file using vi in a telnet session to confirm the EOR type.
The fix options.
1) Before opening the file in your ABAP program, run the unix command dos2unix.
2) Transfer the file from the MS/Windows platform to unix via FTP in ascii mode, not bin. This does the dos2unix conversion on the fly.
3) Install SAMBA and share the load directory to the windows platforms. SAMBA also handles the dos2unix and unix2dos conversions on the fly.
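A fourth option is to strip the <CR> in code before splitting; that trailing # in the last field is almost always the carriage return. A minimal Python illustration of the idea (in ABAP the equivalent is removing `cl_abap_char_utilities=>cr_lf` characters before the SPLIT):

```python
def split_csv_line(line):
    """Strip a Windows CR/LF tail before splitting, so the last field stays clean."""
    return line.rstrip("\r\n").split(",")

print(split_csv_line("a,b,c\r\n"))  # ['a', 'b', 'c'] - no stray CR on 'c'
```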
Hope this helps
David Cooper -
Data formatting and reading a CSV file without using Sqlloader
I am reading a csv file to an Oracle table called sps_dataload. The table is structured based on the record type of the data at the beginning of
each record in the csv file. But the first two lines of the file are not going to be loaded to the table due to the format.
Question # 1:
How can I skip reading the first two lines from my csv file?
Question # 2:
There are more fields in the csv file than there are columns in my table. I know I can add FILLER as an option, but there are about 150-odd comma-separated fields in the file and my table has 8 columns to load from it. So do I really have to use FILLER 140 times in my script, or is there a better way to do this?
Question # 3:
This is more of an extension of my question above. The csv file has fields with block quotes - I know this could be achieved in SQL*Loader with OPTIONALLY ENCLOSED BY '"'.
But can this be doable in the insert as created in the below code?
I am trying to find the "wrap code" button in my post, but do not see it.
Here's my file layout -
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
This is my table structure -
desc sps_dataload;
File_Name Varchar2 (50) Not Null,
Record_Layer Varchar2 (20) Not Null,
Level_Id Varchar2 (20),
Desc1 Varchar2 (50),
Desc2 Varchar2 (50),
Desc3 Varchar2 (50),
Desc4 Varchar2 (50)
Here's my code to do this -
create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
p_filename IN varchar2,
p_Totalinserted IN OUT number) as
v_filename varchar2(30) := p_filename;
v_filehandle UTL_FILE.FILE_TYPE;
v_startPos number; --starting position of a field
v_Pos number; --position of string
v_lenstring number; --length of string
v_record_layer varchar2(20);
v_level_id varchar2(20) := 0;
v_desc1 varchar2(50);
v_desc2 varchar2(50);
v_desc3 varchar2(50);
v_desc4 varchar2(50);
v_input_buffer varchar2(1200);
v_delChar varchar2(1) := ',';
v_str varchar2(255);
BEGIN
v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
p_Totalinserted := 0;
LOOP
BEGIN
UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
-- this will read the 1st field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,1);
v_lenString := v_Pos - 1;
v_record_layer := substr(v_input_buffer,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 2nd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,2);
v_lenString := v_Pos - v_startPos;
v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 3rd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,3);
v_lenString := v_Pos - v_startPos;
v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 4th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,4);
v_lenString := v_Pos - v_startPos;
v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 5th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,5);
v_lenString := v_Pos - v_startPos;
v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
v_str := 'insert into sps_dataload values ('''||v_filename||''','''||v_record_layer||''','''||v_level_id||''','''||v_desc1||''','''||v_desc2||''','''||v_desc3||''','''||v_desc4||''')';
Execute immediate v_str;
p_Totalinserted := p_Totalinserted + 1;
commit;
END LOOP;
UTL_FILE.FCLOSE(v_filehandle);
EXCEPTION
WHEN UTL_FILE.INVALID_OPERATION THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
WHEN UTL_FILE.READ_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
WHEN UTL_FILE.INVALID_PATH THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
WHEN UTL_FILE.INVALID_MODE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
WHEN UTL_FILE.INTERNAL_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
WHEN VALUE_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
WHEN OTHERS THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE;
END insert_spsdataloader;
/
Justin, thanks. I did happen to change my PL/SQL procedure using utl_file.get_line and modifying the instr function based on the position of ',' in the file, but my procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
As I was reading more about creating an external table as an efficient way and thus believe I can perhaps build an extern table with my varying selection from the file. But I am still unclear if I can construct my external table by choosing different fields in a record based on a record identifier string value (which is the first field of any record). I guess I can, but I am looking for the construct as to how am I going to use the instr function for selecting the field from the file while creating the table.
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
For example, if I want to create an external table like this -
CREATE TABLE extern_sps_dataload
( record_layer VARCHAR2(20),
  attr1 VARCHAR2(20),
  attr2 VARCHAR2(20),
  attr3 VARCHAR2(20),
  attr4 VARCHAR2(20)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dataload
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    BADFILE dataload:'sps_dataload.bad'
    LOGFILE dataload:'sps_dataload.log'
    DISCARDFILE dataload:'sps_dataload.dis'
    SKIP 2
    VARIABLE 2 FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"' LRTRIM
    MISSING FIELD VALUES ARE NULL
    LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3, FIELD7, FIELD9)
    LOAD WHEN RECORD_LAYER = 'PRODUCT' (FIELD3, FIELD4, FIELD8, FIELD9)
    LOAD WHEN RECORD_LAYER = 'SEGMENT' (FIELD1, FIELD2, FIELD4, FIELD5)
  )
  LOCATION ('sps_dataload.csv')
)
REJECT LIMIT UNLIMITED;
While I was reading the external table documentation, I thought I could achieve similar things using the position_spec option, but I am not getting behind its parameters. The part I highlighted in the code above (from LOAD WHEN ... FIELDS ...) is what I think I am going to use, but I am not sure of its construct.
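Whether or not that LOAD WHEN syntax pans out, the per-record-type field selection can be prototyped outside the database first. Here is a Python sketch of the dispatch; the field positions per record type are copied loosely from the draft above and are illustrative only:

```python
import csv

# Which 1-based fields to keep for each record type (illustrative mapping,
# mirroring the LOAD WHEN clauses in the draft DDL)
FIELDS_BY_TYPE = {
    "Project": [2, 3, 7, 9],
    "Product": [3, 4, 8, 9],
    "Segment": [1, 2, 4, 5],
}

def load(text, skip=2):
    """Skip the header lines, then pick fields per record type."""
    rows = []
    for rec in csv.reader(text.splitlines()[skip:]):  # SKIP 2 header lines
        picks = FIELDS_BY_TYPE.get(rec[0])
        if picks is None:
            continue  # discard unrecognised record types
        rows.append((rec[0], [rec[i - 1] for i in picks]))
    return rows

text = "HEADER\n; Version\nProduct,00093,75556,NAME,x,27,7,27,8421504\n"
print(load(text))
```

Note that `csv.reader` also handles the `"22""X22"" ..."` doubled-quote fields in the sample data, which is exactly what OPTIONALLY ENCLOSED BY '"' does on the database side.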
Thank you for your help!! Appreciate your thoughts on this..
Sanders. -
How to read a .csv file(excel format) using Java.
Hi Everybody,
I need to read a .csv file (Excel) and store all the columns and rows in 2D arrays. Then I can do the rest of the coding myself. I would appreciate it if somebody could post their code to read .csv files here. The .csv file can have a different number of columns and a different number of rows every time it is run. The .csv file is in Excel format, so I don't know if that affects the code or not. I would also appreciate it if the imported classes were posted too, and I would like to know if there is a way to detect how many rows and columns the .csv file has. I need this urgently, so I would be very grateful to anybody who has the solution. Thanks.
Sincerely, Taufiq.
I used this
BufferedReader in = new BufferedReader(new FileReader("test.csv"));
// and
StringTokenizer parser = new StringTokenizer(str, ", ");
while (parser.hasMoreTokens()) {
    // process each token
}
works like a charm! -
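One caveat with the StringTokenizer approach above: it silently drops empty fields, so a row with blank cells shifts its columns left. A quick Python illustration of the difference between naive tokenizing and a real CSV parse:

```python
import csv
import io

line = "a,,c"

# Tokenizer-style splitting that discards empties vs. a proper CSV parse:
tokens = [t for t in line.split(",") if t]        # empties lost
fields = next(csv.reader(io.StringIO(line)))      # empties preserved

print(tokens)  # ['a', 'c']
print(fields)  # ['a', '', 'c']
```

In Java the same fix is `String.split(",", -1)` or a CSV library instead of StringTokenizer.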
Hi all,
I'm new to flex and have been asked to provide a small widget
that will display the contents of a csv in a list. Can I use the
HTTPService to open a csv file and then create an actionscript 3
function to parse it? If so can anyone point me to a tutorial on
how to parse a csv file please.
Thanks in advance
Hello Sir,
I am new to Flex and want to read a csv file using the code below, but it does not seem to be working. Can you please help?
var file:File = evt.currentTarget as File;
file = file.resolvePath(file.nativePath);
var fileStream:FileStream = new FileStream();
fileStream.open(file, FileMode.READ);
var fileData:String = fileStream.readUTFBytes(fileStream.bytesAvailable);
var endings:Array = [File.lineEnding, "\n", "\r"];
But for some reason it returns the funny value "ÐÏ ࡱá". Any idea why I don't get the correct data from the file?
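For what it's worth, "ÐÏ ࡱá" is how the first bytes of an OLE compound document (the binary container used by .xls workbooks) render as text, which suggests the file is actually a real Excel workbook saved under a .csv name rather than plain comma-separated text. A Python sketch of checking for that signature before attempting a CSV parse (the magic constant is the standard OLE2 header):

```python
OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # standard OLE compound file header

def looks_like_xls(first_bytes):
    """True if the data starts with the OLE2 signature (binary .xls, not CSV)."""
    return first_bytes.startswith(OLE2_MAGIC)

print(looks_like_xls(OLE2_MAGIC + b"rest of workbook"))  # True
print(looks_like_xls(b"Title,Given_Name,Surname"))       # False
```

If that is the case here, re-exporting the file from Excel as genuine CSV (File > Save As > CSV) should make the Flex code read sensible text.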
Below is the csv file I am trying to open.
Title
Given_Name
Surname
Gong
Salutation
Position
Organisation
Address_Line_1
Address_Line_2
Address_Line_3
Suburb
State
Postcode
Country
Home_Phone
Fax
Other_Phone
User_Field_1
User_Field_2
User_Field_3
User_Field_4
Mobile_Phone
Second_Address
Second_Address_Line_1
Second_Address_Line_2
Second_Address_Line_3
Second_Suburb
Second_State
Second_Country
Second_Postcode
Langcode
Website
Mr.
Jeff
Alexander
Retention Marketing
Monday, April 13th, 2009
Mr.
Anthony
Demaso
Retention Marketing
Monday, April 13th, 2009
Sally
Swinamer
Yield
Monday, April 13th, 2009
Chris
Torbay
Yield
Monday, April 13th, 2009
Annette
Warring
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Mark
Khoury
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Andy
Thorndyke
Thorsons
Monday, April 13th, 2009
Shannon
Rutherford
Central Reproductions
Monday, April 13th, 2009
Mr.
Rob
Greenwood
Central Reproductions
Monday, April 13th, 2009
Lisa
Marchese
Des Rosiers
Monday, April 13th, 2009
Mr.
Michael
Whitcombe
McMillan LLP
Monday, April 13th, 2009
Thanks,
Gill -
Read a csv file in a PL/SQL routine with the path as a parameter
hello oracle community,
System Windows 2008, Oracle 11.1 Database
I have created a directory in the database and there is no problem reading a csv file from there. Since the files can be in different places, my Java developer is asking me if he can pass the path of the file as a parameter. As far as I know I need a directory object to open the file with the UTL_FILE package, but I cannot create directories for folders that will only be created in the future. Is there any way to solve that problem, so I can use the path as a parameter in my routine?
Ikrischer
BluShadow wrote:
Ikrischer wrote:
as I said before, they are dynamic. As usual with most companies, you get new partners and you lose partners, so new folders will be added and old ones will be deleted. If you use different directories for different folders, you have more work to handle them.
I wouldn't call that dynamic, I would call that business maintenance. Dynamic would mean that the same company is putting their files into different folders each time or on a regular basis. If it's one folder per company then that's not dynamic.
The folder structure is changing; I call that dynamic. It does not look static to me, but that's just a point of view, and it is not important what we call it. Let's just agree the folder structure will change in the future, no matter how many folders one partner has.
BluShadow wrote:
So you're receiving files and you don't even know the format of them? Some of them are not "correct"? Specify that the company must provide the files in the correct format or they will be rejected.
Computers work on known logic in most cases, not some random guessing game.
We know the correct format, but we also know that the partner won't always send the correct format, period. Because of that, we want to deliver an error log: records 11, 15 and 20 are wrong because of... That is a very useful service.
BluShadow wrote:
The folders should be known.
Impossible; we do not know the future.
BluShadow wrote:
The file format should be known.
See above: it has a fixed format, but that is no guarantee you get the correct format. For me that is a normal situation; it happens every day that you have fixed rules and they get broken.
BluShadow wrote:
The moment you allow external forces to start specifying things how they want it then you're shooting yourself in the foot. That's not good design and not good business practice.
Maybe we are talking about different things. We do not allow data in a wrong format to be loaded into our system, but we do allow partners to send us a wrong format, and even wrong content, and still we don't load it into our system. Format and content must both be correct before loading into the real system, and we check both; with external tables you cannot check a wrong format with a detailed error log.
Ikrischer -
Read a csv file, fill a dynamic table and insert into a standard table
Hi everybody,
I have a problem here and I need your help:
I have to read a csv file and insert the data of it into a standard table.
1 - On the parameter screen I have to indicate the standard table and the csv file.
2 - I need to create a dynamic table. The same type of the one I choose at parameter screen.
3 - Then I need to read the csv and put the data into this dynamic table.
4 - Later I need to insert the data from the dynamic table into the standard table (the one on the parameter screen).
How do I do this job? Do you have an example? Thanks.
Here is an example program which shows how to upload a csv file from the frontend to a dynamic internal table. You can of course modify this to update your database table.
report zrich_0002.
type-pools: slis.
field-symbols: <dyn_table> type standard table,
<dyn_wa>,
<dyn_field>.
data: it_fldcat type lvc_t_fcat,
wa_it_fldcat type lvc_s_fcat.
type-pools : abap.
data: new_table type ref to data,
new_line type ref to data.
data: xcel type table of alsmex_tabline with header line.
selection-screen begin of block b1 with frame title text-001.
parameters: p_file type rlgrap-filename default 'c:\Test.csv'.
parameters: p_flds type i.
selection-screen end of block b1.
start-of-selection.
* Add X number of fields to the dynamic itab cataelog
do p_flds times.
clear wa_it_fldcat.
wa_it_fldcat-fieldname = sy-index.
wa_it_fldcat-datatype = 'C'.
wa_it_fldcat-inttype = 'C'.
wa_it_fldcat-intlen = 10.
append wa_it_fldcat to it_fldcat .
enddo.
* Create dynamic internal table and assign to FS
call method cl_alv_table_create=>create_dynamic_table
exporting
it_fieldcatalog = it_fldcat
importing
ep_table = new_table.
assign new_table->* to <dyn_table>.
* Create dynamic work area and assign to FS
create data new_line like line of <dyn_table>.
assign new_line->* to <dyn_wa>.
* Upload the excel
call function 'ALSM_EXCEL_TO_INTERNAL_TABLE'
exporting
filename = p_file
i_begin_col = '1'
i_begin_row = '1'
i_end_col = '200'
i_end_row = '5000'
tables
intern = xcel
exceptions
inconsistent_parameters = 1
upload_ole = 2
others = 3.
* Reformat into the dynamic internal table
loop at xcel.
assign component xcel-col of structure <dyn_wa> to <dyn_field>.
if sy-subrc = 0.
<dyn_field> = xcel-value.
endif.
at end of row.
append <dyn_wa> to <dyn_table>.
clear <dyn_wa>.
endat.
endloop.
* Write out data from table.
loop at <dyn_table> into <dyn_wa>.
do.
assign component sy-index of structure <dyn_wa> to <dyn_field>.
if sy-subrc <> 0.
exit.
endif.
if sy-index = 1.
write:/ <dyn_field>.
else.
write: <dyn_field>.
endif.
enddo.
endloop.
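The pattern in this program (build a catalog of N generic columns at run time, then fill rows into the dynamically created structure) translates directly to other languages. As a compact Python analogue, with column names generated from the index just as `sy-index` supplies the field names above:

```python
import csv
import io

def load_dynamic(text, n_fields):
    """Read CSV text into rows of dicts keyed by generated column names."""
    names = [str(i + 1) for i in range(n_fields)]  # like fieldname = sy-index
    table = []
    for rec in csv.reader(io.StringIO(text)):
        # Pad or truncate to the declared width, as the dynamic itab would.
        rec = (rec + [""] * n_fields)[:n_fields]
        table.append(dict(zip(names, rec)))
    return table

print(load_dynamic("a,b\nc,d\n", 3))
# [{'1': 'a', '2': 'b', '3': ''}, {'1': 'c', '2': 'd', '3': ''}]
```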
Regards,
Rich Heilman -
Read a csv file and read the fiscal yr in the 4th pos?
Hello ABAP Experts,
How do I write code to read a csv file and read the fiscal year in the 4th position?
any suggestions or code highly appreciated.
Thanks,
BWer
Hi Bwer,
Declare table itab with the required fields...
Use GUI UPLOAD to get the contents of the file (say abc.csv) in case if the file is on the presentation server...
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename        = 'c:\abc.csv'
    filetype        = 'ASC'
  TABLES
    data_tab        = itab
  EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    OTHERS          = 17.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
Use OPEN DATASET in case if the file is on the application server..
After that USE SPLIT command at comma to get the contents of the 4th field...
Regards,
Tanveer.
Please mark helpful answers.
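Tanveer's last step (SPLIT at the comma, then take the 4th field) is the whole algorithm. In Python terms, with a made-up sample line since the actual file layout isn't shown in the thread:

```python
def fiscal_year(line):
    """Return the 4th comma-separated field of a CSV line, or None if absent."""
    fields = line.split(",")
    return fields[3] if len(fields) > 3 else None

print(fiscal_year("1000,D,2024/01,2023,USD"))  # 2023
```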
Created a java program to read a .csv file but receiving an error when I try to run it. Here is program:
import java.io.*;

class TestRead {
    public static void main(File file) throws FileNotFoundException {
        try {
            FileReader fr = new FileReader(file);
            StreamTokenizer st = new StreamTokenizer(fr);
            BufferedReader br = new BufferedReader(new FileReader("C:/upload/DS121002.csv"));
            String line = br.readLine();
            System.out.println(line);
            while (line != null) {
                line = br.readLine();
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
Here is the error I am receiving:
C:\jakarta-tomcat-4.0.3\webapps\webdav\WEB-INF\classes>java TestRead
Exception in thread "main" java.lang.NoSuchMethodError: main
C:\jakarta-tomcat-4.0.3\webapps\webdav\WEB-INF\classes>
What am I missing?
public static void main(File file) throws FileNotFoundException
A "main" must be:
public static void main(String[] args)... -
Reading a csv file and bind to a data grid
Hi. I am doing a school project and have been searching. The application reads a csv file from c:\stocklist.csv in a button called btnLoadData, and I now need to show it in a data control called DmgDisplayData. Do I put the code in the button or in the data grid? I have been searching but cannot seem to find anything: where do I put the code, and does anyone have example code for how to read into the data fields? I also need one field to be editable and the other three or four fields read only. We never covered this in the subject, but I did do file streams a few years ago in VB; now I am using C#, and the help in Visual Studio is not that helpful for a blind person using a screen reader (JAWS for Windows from http://www.freedomscientific.com) with Visual Studio 2013 Community Edition. I have been trawling about 15 to 20 pages so far and tried a couple of sites but could not find any help. The application should read a csv file from a button, load it into a data grid, then show a message box saying the file load was successful, with one field (order) editable and the other fields read only. Do I need the navigator buttons for next, back, previous, etc., and how do I code those as well? I don't want it done for me, I want to learn, but maybe some sample code; I did navigator controls years ago for a VB project but need a C# example. Then there is a button, Save Data, that saves the csv file in the data grid (I can google for that). Then a toolbar with a button saying Sort Items, a tool strip, and when you click on that button you get a drop-down list of three items and a sort button which sorts the array in the data grid. So do I need another form, or can I make the combo box an invisible control and just reference that in the toolbar? I need to use the File class and an array; I have learnt about single and multi arrays. Any ideas? Thanks.
http://startrekcafe.stevesdomain.net http://groups.yahoo.com/groups/JawsOz
Hi Marvin,
--> Where do I put this, in the data grid click event or in the form load event? Thanks.
You could use this code after you initialize the DataGridView; you could put it in the form load event.
--> What about getting the tool bar and the combo box and another button, then sorting from the combo box on the array for the columns, and setting focus to the first read column of the data grid? How do I do that, close the parent form, and have another form on the toolbar?
Since this is another issue of this thread, I would recommend you posting it with
another thread. We will focus on that thread to help you. Thanks for your
understanding.
BTW, before asking questions, I suggest you try to build it yourself first. You could begin to learn WinForms from the MSDN articles:
https://msdn.microsoft.com/en-us/library/dd30h2yb%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396. Or Google it and you will find many answers. That way you will learn more from the questions.
Best regards,
Youjun Tang
How to read a .CSV file using UTL_FILE
HI,
How do i read a .csv file line by line using UTL_FILE?
Thanks in advance
Regards,
Gayatri
---- do the open-the-file logic
---- Let's say this file is delimited by ','
declare
  v_startPos     number;         -- starting position of field
  v_Pos          number;         -- position of string
  v_lenString    number;         -- length
  v_first_field  varchar2(30);
  v_second_field varchar2(30);
  v_third_field  varchar2(30);
  v_fourth_field varchar2(30);
  input_String   varchar2(1000); -- buffer for each line of file
  ---- Say you have a 4-column file delimited by ',' e.g.
  ---- Joe,Joan,People,Animal
  ---- Teddy,Bear,Beans,Toys
  delChar varchar2(1) := ',';
begin
  loop
    begin
      utl_file.get_line(input_file, input_String); -- get each line
    exception
      when no_data_found then
        fnd_file.put_line(FND_FILE.LOG, 'Last line so exit');
        exit;
    end;
    ---- this will get the first field as specified by the last number
    v_Pos := instr(input_String, delChar, 1, 1);
    v_lenString := v_Pos - 1;
    v_first_field := substr(input_String, 1, v_lenString);
    v_startPos := v_Pos + 1;
    -- this will get the second field
    v_Pos := instr(input_String, delChar, 1, 2);
    v_lenString := v_Pos - v_startPos;
    v_second_field := substr(input_String, v_startPos, v_lenString);
    v_startPos := v_Pos + 1;
    -- 3rd field
    v_Pos := instr(input_String, delChar, 1, 3);
    v_lenString := v_Pos - v_startPos;
    v_third_field := substr(input_String, v_startPos, v_lenString);
    v_startPos := v_Pos + 1;
    -- last field -- there is no delimiter after the last field
    v_Pos := length(input_String) + 1;
    v_lenString := v_Pos - v_startPos;
    v_fourth_field := substr(input_String, v_startPos, v_lenString);
  end loop;
end; -
I have written a binary file with a specific header format in LabVIEW 8.6 and tried to read the same data file using LabVIEW 7.1. Here I ran into some difficulty. Is there any way to read the data file (written in LabVIEW 8.6) using LabVIEW 7.1?
I can think of two possible stumbling blocks:
What are your 8.6 options for "byte order" and "prepend array or string size"?
Overall, many file I/O functions changed with LabVIEW 8.0, so there might not be an exact 1:1 code conversion; you might need to make some modifications. For example, in 7.1 you should use "Write File", since the "binary file VIs" there are special-purpose (I16 or SGL). What is your data type?
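To make the "prepend array or string size" option concrete: when it is enabled, LabVIEW writes a length (an I32, big-endian by default) in front of the string or array data, and a reader on the other side must consume that length prefix first. Here is a round-trip sketch of that on-disk layout in Java; the class and method names are made up for illustration, and this only models the layout, not LabVIEW itself:

```java
import java.io.*;

// Hypothetical demo: round-trip a string stored the way LabVIEW writes one
// when "prepend array or string size" is enabled: a big-endian 4-byte I32
// length, followed by the raw bytes of the string.
public class PrependSizeDemo {

    static byte[] writeWithSize(String s) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf); // big-endian by definition
        byte[] payload = s.getBytes("US-ASCII");
        out.writeInt(payload.length);  // the prepended I32 size
        out.write(payload);            // the string data itself
        out.flush();
        return buf.toByteArray();
    }

    static String readWithSize(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        int len = in.readInt();        // read the I32 size first
        byte[] data = new byte[len];
        in.readFully(data);            // then exactly that many payload bytes
        return new String(data, "US-ASCII");
    }

    public static void main(String[] args) throws IOException {
        byte[] onDisk = writeWithSize("hello");
        System.out.println(onDisk.length);        // 4-byte size prefix + 5 bytes
        System.out.println(readWithSize(onDisk));
    }
}
```

If the 8.6 writer used a different byte order or disabled the size prefix, the 7.1 reader has to match those choices exactly, which is why checking those two options is the first step.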
LabVIEW Champion. Do more with less code and in less time. -
Read two CSV files and remove the duplicate values within them.
Hi,
I want to read two CSV files (each with more than 100 rows and 100 columns), remove the duplicate values across the two files, merge all the unique values, and display the result as a single file.
Can anyone help me out?
Thanks in advance.

kirthi wrote:
Can you help me....

Yeah, I've just finished... Here's a skeleton of my solution.
The first thing I think you should do is write a line-parser which splits your input data up into fields, and test it.
Then fill out the parse method below, and test it with the debugPrint method.
Then go to work on the print method.
I can help a bit along the way, but if you want to do this then you have to do it yourself. I'm not going to do it for you.
Cheers. Keith.
package forums.kirthi;

import java.util.*;
import java.io.PrintStream;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import krc.utilz.io.ParseException;
import krc.utilz.io.Filez.LineParser;
import krc.utilz.io.Filez.CsvLineParser;

public class DistinctColumnValuesFromCsvFiles
{
  public static void main(String[] args) {
    if (args.length==0) args = new String[] {"input1.csv", "input2.csv"};
    try {
      // data is a Map of ColumnNames to Sets-Of-Values
      Map<String,Set<String>> data = new HashMap<String,Set<String>>();
      // add the contents of each file to the data
      for ( String filename : args ) {
        data.putAll(parse(filename));
      }
      // print the data to output.csv
      print(data);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }

  private static Map<String,Set<String>> parse(String filename) throws IOException, ParseException {
    BufferedReader reader = null;
    try {
      reader = new BufferedReader(new FileReader(filename));
      CsvLineParser.squeeze = true; // field.trim().replaceAll("\\s+"," ")
      LineParser<String[]> parser = new CsvLineParser();
      int lineNumber = 1;
      // 1. read the column names (first line of file) into a List
      // 2. read the column values (subsequent lines of file) into a List of Set's of String's
      // 3. build a Map of columnName --> columnValues and return it
      return null; // TODO: replace with the Map built in step 3
    } finally {
      if(reader!=null)reader.close();
    }
  }

  private static void debugPrint(Map<String,Set<String>> data) {
    for ( Map.Entry<String,Set<String>> entry : data.entrySet() ) {
      System.out.println("DEBUG: "+entry.getKey()+" "+Arrays.toString(entry.getValue().toArray(new String[0])));
    }
  }

  private static void print(Map<String,Set<String>> data) {
    // 1. get the column names from the table.
    // 2. create a List of List's of String's called matrix; logically [COL][ROW]
    // 3. print the column names and add the List<String> for this col to the matrix
    // 4. print the matrix by iterating columns and then rows
  }
}
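As a starting point for the line-parser step mentioned above, here is a minimal sketch in plain Java. It does not use the krc.utilz classes from the skeleton, the class name SimpleCsvLineParser is made up for illustration, and it does not handle quoted fields or embedded commas, so treat it as a test scaffold rather than a finished parser:

```java
import java.util.Arrays;

public class SimpleCsvLineParser {
  // Split one CSV line into trimmed fields. Quoted fields and embedded
  // commas are NOT handled; this is only a starting point for testing.
  public static String[] parseLine(String line) {
    String[] fields = line.split(",", -1); // -1 keeps trailing empty fields
    for (int i = 0; i < fields.length; i++) {
      // "squeeze": trim and collapse internal runs of whitespace
      fields[i] = fields[i].trim().replaceAll("\\s+", " ");
    }
    return fields;
  }

  public static void main(String[] args) {
    System.out.println(Arrays.toString(parseLine("a, b ,c,,d")));
    // prints: [a, b, c, , d]
  }
}
```

Once something like this passes your own tests, wiring it into the parse method above is straightforward: parse the first line for column names, then every later line for values.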