Question about reading csv file into internal table
Someone in this forum (thanks, those nice guys!) suggested using FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a CSV file into an internal table. However, it can only read a local file.
I would like to ask how I can read a CSV file into an internal table from a file on the application server?
I can't simply use SPLIT, as there may be commas in the content, e.g.
"abc","aaa,ab",10,"bbc"
My expected output:
abc
aaa,ab
10
bbc
Thanks again for your help.
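For reference, the quoting rule this question needs (a comma inside double quotes is data, not a separator) is exactly what a standard CSV reader implements. A minimal illustration in Python (not ABAP, purely to show the expected tokenization of the sample record):

```python
import csv
import io

# The sample record from the question: the quoted "aaa,ab" must stay one field.
line = '"abc","aaa,ab",10,"bbc"'

# csv.reader treats commas inside double quotes as part of the field
# and strips the surrounding quotes from each token.
fields = next(csv.reader(io.StringIO(line)))
print(fields)  # ['abc', 'aaa,ab', '10', 'bbc']
```

In ABAP this behaviour has to be built by hand; the Python output just shows the target field list.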
Hi Gundam,
Try this code. I have made a custom parser to read the details in the record and split them accordingly. I have also tested it with your provided test cases and it works fine.
OPEN DATASET dsn FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET dsn INTO record.
  IF sy-subrc <> 0. " Stop at end of file
    EXIT.
  ENDIF.
  PERFORM parser USING record.
ENDDO.
CLOSE DATASET dsn.
*DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
*DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
*DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
*DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
*DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
FORM parser USING str.
  DATA field(12).
  DATA field1(12).
  DATA field2(12).
  DATA field3(12).
  DATA field4(12).
  DATA cnt TYPE i.
  DATA len TYPE i.
  DATA temp TYPE i.
  DATA start TYPE i.
  DATA quote TYPE i.
  DATA rec_cnt TYPE i.

  len = strlen( str ).
  cnt = 0.
  temp = 0.
  rec_cnt = 0.
  DO.
*   Start at the beginning
    IF start EQ 0.
      " String just ended; start a new one.
      start = 1.
      quote = 0.
      CLEAR field.
    ENDIF.
    IF str+cnt(1) EQ '"'. " Check for quotes
      " Check if the quote flag is already set
      IF quote = 1.
        " Quote flag already set: this is the closing quote.
        " Start a new field.
        start = 0.
        quote = 0.
        CONCATENATE field '"' INTO field.
        IF field IS NOT INITIAL.
          rec_cnt = rec_cnt + 1.
          CONDENSE field.
          IF rec_cnt EQ 1.
            field1 = field.
          ELSEIF rec_cnt EQ 2.
            field2 = field.
          ELSEIF rec_cnt EQ 3.
            field3 = field.
          ELSEIF rec_cnt EQ 4.
            field4 = field.
          ENDIF.
        ENDIF.
*       WRITE field.
      ELSE.
        " This is the opening quote.
        quote = 1.
      ENDIF.
    ENDIF.
    IF str+cnt(1) EQ ','. " Check for end of field
      IF quote EQ 0. " This comma is not inside quotes: end of field.
        start = 0.
        quote = 0.
        CONDENSE field.
*       WRITE field.
        IF field IS NOT INITIAL.
          rec_cnt = rec_cnt + 1.
          IF rec_cnt EQ 1.
            field1 = field.
          ELSEIF rec_cnt EQ 2.
            field2 = field.
          ELSEIF rec_cnt EQ 3.
            field3 = field.
          ELSEIF rec_cnt EQ 4.
            field4 = field.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDIF.
    CONCATENATE field str+cnt(1) INTO field.
    cnt = cnt + 1.
    IF cnt GE len.
      EXIT.
    ENDIF.
  ENDDO.
  WRITE: field1, field2, field3, field4.
ENDFORM.
Regards,
Wenceslaus.
Similar Messages
-
Uploading CSV file into internal table
Hi,
I want to upload a CSV file into an internal table. The flat file has values as below:
'AAAAA','2003-10-11 07:52:37','167','Argentina',NULL,NULL,NULL,NULL,NULL,'MX1',NULL,NULL,'AAAA BBBB',NULL,NULL,NULL,'1',NULL,NULL,'AR ',NULL,NULL,NULL,'ARGENT','M1V','MX1',NULL,NULL,'F','F','F','F','F',NULL,'1',NULL,'MX','MMI ',NULL
'jklhg','2004-06-25 08:01:57','456','hjllajsdk','MANAGUA ',NULL,NULL,'265-5139','266-5136 al 38','MX1',NULL,NULL,'hjgkid GRÖBER','sdfsdf dfs asdfsdf 380 ad ased,','200 as ads, sfd sfd abajao y 50 m al sdf',NULL,'1',NULL,NULL,'NI ',NULL,NULL,NULL,'sdfdfg','M1V','dds',NULL,NULL,
Here I cannot even split at ',' because some of the values are NULL and some of the values contain commas themselves.
The delimiter is a quote and the separator is a comma here.
Can anyone help on this?
Thanks.
Edited by: Ginger on Jun 29, 2009 9:08 AM
As long as there can be a comma in a text literal, you are right that the split command doesn't help. However, there is one possibility for how to attack this, under one assumption:
- A comma outside a text delimiter is always considered a separator
- A comma inside a text delimiter is always considered a comma as part of the text
You have to read your file line by line, then travel along the line string character by character, setting a flag or counter for the text delimiters:
e.g.
"Text","Text1, Text2",NULL,NULL,"Text"
String Index 1: EQ " => lv_delimiter = 'X'
String Index 2: EQ T => text literal (because lv_delimiter = 'X')
String Index 3: EQ e => text literal (because lv_delimiter = 'X')
String Index 4: EQ x => text literal (because lv_delimiter = 'X')
String Index 5: EQ t => text literal (because lv_delimiter = 'X')
String Index 6: EQ " => lv_delimiter = ' ' (because it was 'X' before)
String Index 7: EQ , => This is a separator because lv_delimiter = ' '
String Index 8: EQ " => lv_delimiter = 'X' (because it was ' ' before)
String Index 9: EQ T => text literal (because lv_delimiter = 'X')
String Index 10: EQ e => text literal (because lv_delimiter = 'X')
String Index 11: EQ x => text literal (because lv_delimiter = 'X')
String Index 12: EQ t => text literal (because lv_delimiter = 'X')
String Index 13: EQ 1 => text literal (because lv_delimiter = 'X')
String Index 14: EQ , => text literal (because lv_delimiter = 'X')
String Index 15: EQ T => text literal (because lv_delimiter = 'X')
Whenever you hit a 'real' separator (lv_delimiter = ' '), you pass the portion of the string between the previous separator and this one into the next structure field.
This is not an easy way to do it, but if you might have commas in your text literals and NULL values, I guess it is probably the only way to go.
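The delimiter-flag walk described above can be sketched in a few lines. This is an illustrative Python transcription of the same state machine (a field buffer plus an in-quotes flag), not production code:

```python
def split_quoted(line, sep=',', quote='"'):
    """Walk the line character by character, toggling an in-quotes flag.
    A separator only ends a field while the flag is off."""
    fields, buf, in_quotes = [], [], False
    for ch in line:
        if ch == quote:
            in_quotes = not in_quotes      # lv_delimiter = 'X' / ' '
        elif ch == sep and not in_quotes:
            fields.append(''.join(buf))    # 'real' separator: close the field
            buf = []
        else:
            buf.append(ch)                 # text literal (or a comma inside quotes)
    fields.append(''.join(buf))            # last field has no trailing separator
    return fields

print(split_quoted('"Text","Text1, Text2",NULL,NULL,"Text"'))
# ['Text', 'Text1, Text2', 'NULL', 'NULL', 'Text']
```

The ABAP version is the same loop over `str+cnt(1)` with a quote counter instead of a boolean.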
Hope that helps,
Michael -
Getting Issue while uploading CSV file into internal table
Hi,
The CSV file data format is as below:
a b c d e f
2.01E14 29-Sep-08 13:44:19 2.01E14 SELL T+1
The actual value of column A is 201000000000000
and column D is 201000000035690.
I am uploading the above CSV file into an internal table using
the code below:
TYPES: BEGIN OF TY_INTERN.
INCLUDE STRUCTURE KCDE_CELLS.
TYPES: END OF TY_INTERN.
CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
EXPORTING
I_FILENAME = P_FILE
I_SEPARATOR = ','
TABLES
E_INTERN = T_INTERN
EXCEPTIONS
UPLOAD_CSV = 1
UPLOAD_FILETYPE = 2
OTHERS = 3.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
I am getting all the columns' data into the internal table;
the problem is with columns A & D: for both I get the value 2.01E+14 in the internal table. How can I get the actual values without modifying the CSV file format?
waiting for your reply...
thanks & regards,
abhi
Hi Saurabh,
Thanks for your reply.
I can't even double-click on those columns,
because the program needs to be executed in the background; there can be a lot of CSV files in one folder, and no manual interaction on those CSV files is possible.
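One thing worth noting about the 2.01E+14 values: if the CSV file itself already stores the rounded scientific notation (which is what Excel writes for long numbers), the trailing digits of 201000000035690 are gone, and no read routine can bring them back. All that can still be done in code is to expand the notation to a plain integer. An illustrative sketch in Python (not ABAP):

```python
from decimal import Decimal

raw = '2.01E14'  # what the CSV cell actually contains

# Expanding the notation recovers the rounded value 201000000000000:
expanded = int(Decimal(raw))
print(expanded)  # 201000000000000

# The original 201000000035690 differs in the digits Excel discarded,
# so it cannot be reconstructed from this cell.
```

To keep the full values, the file would have to be produced with those columns formatted as text rather than the default scientific display.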
regards,
abhi -
Error converting CSV file into internal table
Hi,
I have to convert a large CSV file (>20,000 entries) into an internal table. I used FM GUI_UPLOAD to get a raw data table, then converted this table using FM TEXT_CONVERT_CSV_TO_SAP.
But this does not seem to work properly: after 16,000 entries or so, the FM seems stuck, as if in an endless loop.
Note that if I split the CSV file into several parts, the conversion runs successfully.
Is there any memory limit with this FM?
Thanks,
Florian
Florian Labrouche,
Instead of using two function modules, you can use the 'TEXT_CONVERT_XLS_TO_SAP' function module on its own, specifying the file name in the function module itself. It does not take much time.
Check the sample program.
report zvenkat_upload_xl no standard page heading.
"Declarations.
"types
types:
begin of t_bank_det,
pernr(8) type c,
bnksa(4) type c,
zlsch(1) type c,
bkplz(10) type c,
bkort(25) type c,
bankn(18) type c,
end of t_bank_det.
"work areas
data:
w_bank_det type t_bank_det.
"internal tables
data:
i_bank_det type table of t_bank_det.
" selection-screen
selection-screen begin of block b1 with frame title text_001.
parameters p_file type localfile.
selection-screen end of block b1.
"At selection-screen on value-request for p_file.
at selection-screen on value-request for p_file.
perform f4_help.
"Start-of-selection.
start-of-selection.
perform upload_data.
"End-of-selection.
end-of-selection.
perform display_data.
"Form f4_help
form f4_help .
data:
l_file_name like ibipparms-path .
call function 'F4_FILENAME'
exporting
program_name = syst-cprog
dynpro_number = syst-dynnr
field_name = 'P_FILE'
importing
file_name = l_file_name.
p_file = l_file_name.
endform. " f4_help
"Form upload_data
form upload_data .
type-pools:truxs.
data:li_tab_raw_data type truxs_t_text_data.
data:l_filename like rlgrap-filename.
l_filename = p_file.
call function 'TEXT_CONVERT_XLS_TO_SAP'
exporting
i_tab_raw_data = li_tab_raw_data
i_filename = l_filename
tables
i_tab_converted_data = i_bank_det
exceptions
conversion_failed = 1
others = 2.
if sy-subrc <> 0.
message id sy-msgid type sy-msgty number sy-msgno
with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
endif.
endform. " upload_data
" Form display_data
form display_data .
data: char100 type char100.
loop at i_bank_det into w_bank_det .
if sy-tabix = 1.
write w_bank_det.
write / '------------------------------------------------------------'.
else.
write / w_bank_det.
endif.
endloop.
endform. " display_data
Regards,
Venkat.O -
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from .csv file into existing /new table. Also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is whether, before the CSV data is loaded into the timesheet table, there is any way of validating the project key (which is the primary key of the projects table) against the projects table. I need to perform similar validations on other columns, like customer_id from the customers table. Basically, the loading should be done after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
you can create hidden items in the page to validate previous records before insert data.
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
How to get data of tabulated text file into internal table
hi all,
I want to get data from a tabulated (tab-delimited) text file into an internal table. I searched in SCN and found a lot of posts about converting Excel files into internal tables, but I didn't find posts about text files.
thanks
Sachin
Try:
DATA: BEGIN OF tabulator,
x(1) TYPE x VALUE '09',
END OF tabulator.
READ DATASET file INTO wa.
split wa at tabulator into table itab.
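The same tab-split idea in a language-neutral form: a hypothetical Python sketch (illustrative only) that splits tab-delimited lines into rows of fields, analogous to SPLIT AT the tab character above:

```python
import csv
import io

# Hypothetical tab-delimited content, like lines read from the dataset above.
data = "PERNR\tNAME\n00000001\tSmith\n00000002\tJones\n"

# delimiter='\t' makes the reader split each line at tabs into fields.
rows = list(csv.reader(io.StringIO(data), delimiter='\t'))
print(rows[1])  # ['00000001', 'Smith']
```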
A. -
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from a .csv file into an Oracle table.
Is there any mechanism I can follow to do this?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post... already there :)
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
How to convert xml file into internal table in ABAP Mapping.
Hi All,
I am trying ABAP mapping. I have one scenario in which I'm using the below XML file as a sender from my FTP server.
<?xml version="1.0" encoding="UTF-8" ?>
<ns0:MTO_ABAP_MAPPING xmlns:ns0="http://Capgemini/Mumbai/sarsingh">
  <BookingCode>2KY34R</BookingCode>
  <Passenger>
    <Name>SARVESH</Name>
    <Address>THANE</Address>
  </Passenger>
  <Passenger>
    <Name>RAJESH</Name>
    <Address>POWAI</Address>
  </Passenger>
  <Passenger>
    <Name>CARRON</Name>
    <Address>JUHU</Address>
  </Passenger>
  <Flight>
    <Date>03/03/07</Date>
    <AirlineID>UA</AirlineID>
    <FlightNumber>125</FlightNumber>
    <From>LAS</From>
    <To>SFO</To>
  </Flight>
</ns0:MTO_ABAP_MAPPING>
At the receiver side I want to concatenate the NAME & ADDRESS.
I tried Robert Eijpe's weblog (/people/r.eijpe/blog/2005/11/21/xml-dom-processing-in-abap-part-ii--convert-an-xml-file-into-an-abap-table-using-sap-dom-approach)
but couldn't succeed in converting the XML file into an internal table perfectly.
Can anybody help on this.
Thanks in advance!!
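To make the target of the mapping concrete: for each Passenger node, Name and Address are to be concatenated. A quick illustration with a generic DOM-style parser (Python here, purely to show the intended output; the ABAP mapping itself would use the iXML/DOM approach from the linked blog):

```python
import xml.etree.ElementTree as ET

xml = """<ns0:MTO_ABAP_MAPPING xmlns:ns0="http://Capgemini/Mumbai/sarsingh">
  <BookingCode>2KY34R</BookingCode>
  <Passenger><Name>SARVESH</Name><Address>THANE</Address></Passenger>
  <Passenger><Name>RAJESH</Name><Address>POWAI</Address></Passenger>
</ns0:MTO_ABAP_MAPPING>"""

root = ET.fromstring(xml)
# The Passenger elements carry no namespace; only the root uses ns0.
for p in root.findall('Passenger'):
    print(p.findtext('Name') + ' ' + p.findtext('Address'))
# SARVESH THANE
# RAJESH POWAI
```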
Sarvesh
Hi Sarvesh,
The pdf has details of ABAP mapping. The example given almost matches the xml file you want to be converted.
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/xi/3.0/how to use abap-mapping in xi 3.0.pdf
Just in case you have not seen this
regards
Vijaya -
Uploading excel file into internal table
Hi,
Is there any function module to upload an Excel file into an internal table in a CRM 7.0 system?
Thanks.
Hi Ginger,
If you have access to ECC R/3, make use of the source code of FM 'TEXT_CONVERT_XLS_TO_SAP', which exists to convert Excel to an internal table. It will work in CRM also.
As of now I don't have access to an SAP system; meanwhile you can try as said above.
Regards,
Lokesh B -
Upload data from flat file into internal table
Hi friends,
I want to upload the data from a flat file into an internal table, but the problem is that all the columns in that flat file are separated by the "|" character instead of tabs.
Please help me out.
Hello,
Do like this.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
FILENAME = LV_FILENAME
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = 'X' " Check here
* HEADER_LENGTH = '1'
* READ_BY_LINE = 'X'
* DAT_MODE = ' '
* CODEPAGE = ' '
* IGNORE_CERR = ABAP_TRUE
* REPLACEMENT = '#'
* CHECK_BOM = ' '
* IMPORTING
* FILELENGTH =
* HEADER =
TABLES
DATA_TAB = IT_COJRNL
EXCEPTIONS
FILE_OPEN_ERROR = 1
FILE_READ_ERROR = 2
NO_BATCH = 3
GUI_REFUSE_FILETRANSFER = 4
INVALID_TYPE = 5
NO_AUTHORITY = 6
UNKNOWN_ERROR = 7
BAD_DATA_FORMAT = 8
HEADER_NOT_ALLOWED = 9
SEPARATOR_NOT_ALLOWED = 10
HEADER_TOO_LONG = 11
UNKNOWN_DP_ERROR = 12
ACCESS_DENIED = 13
DP_OUT_OF_MEMORY = 14
DISK_FULL = 15
DP_TIMEOUT = 16
OTHERS = 17.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
Vasanth
Program to upload csv file to internal table and insert into database table
Hi, I'm writing a program where I need to upload a CSV file into an internal table using GUI_UPLOAD, but I also need this program to insert the data into my custom database table using the SPLIT command. Anybody have any samples to help? It's urgent!
Hi,
Check this program; maybe it will give you a hint...
REPORT z_table_upload LINE-SIZE 255.
* Data
DATA: it_dd03p TYPE TABLE OF dd03p,
is_dd03p TYPE dd03p.
DATA: it_rdata TYPE TABLE OF text1024,
is_rdata TYPE text1024.
DATA: it_fields TYPE TABLE OF fieldname.
DATA: it_file TYPE REF TO data,
is_file TYPE REF TO data.
DATA: w_error TYPE text132.
* Macros
DEFINE write_error.
concatenate 'Error: table'
p_table
&1
&2
into w_error
separated by space.
condense w_error.
write: / w_error.
stop.
END-OF-DEFINITION.
* Field symbols
FIELD-SYMBOLS: <table> TYPE STANDARD TABLE,
<data> TYPE ANY,
<fs> TYPE ANY.
* Selection screen
SELECTION-SCREEN: BEGIN OF BLOCK b01 WITH FRAME TITLE text-b01.
PARAMETERS: p_file TYPE localfile DEFAULT 'C:\temp\' OBLIGATORY,
p_separ TYPE c DEFAULT ';' OBLIGATORY.
SELECTION-SCREEN: END OF BLOCK b01.
SELECTION-SCREEN: BEGIN OF BLOCK b02 WITH FRAME TITLE text-b02.
PARAMETERS: p_table TYPE tabname OBLIGATORY
MEMORY ID dtb
MATCHCODE OBJECT dd_dbtb_16.
SELECTION-SCREEN: END OF BLOCK b02.
SELECTION-SCREEN: BEGIN OF BLOCK b03 WITH FRAME TITLE text-b03.
PARAMETERS: p_create TYPE c AS CHECKBOX.
SELECTION-SCREEN: END OF BLOCK b03,
SKIP.
SELECTION-SCREEN: BEGIN OF BLOCK b04 WITH FRAME TITLE text-b04.
PARAMETERS: p_nodb RADIOBUTTON GROUP g1 DEFAULT 'X'
USER-COMMAND rg1,
p_save RADIOBUTTON GROUP g1,
p_dele RADIOBUTTON GROUP g1.
SELECTION-SCREEN: SKIP.
PARAMETERS: p_test TYPE c AS CHECKBOX,
p_list TYPE c AS CHECKBOX DEFAULT 'X'.
SELECTION-SCREEN: END OF BLOCK b04.
* At selection screen
AT SELECTION-SCREEN.
IF sy-ucomm = 'RG1'.
IF p_nodb IS INITIAL.
p_test = 'X'.
ENDIF.
ENDIF.
* At selection screen
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
field_name = 'P_FILE'
IMPORTING
file_name = p_file.
* Start of selection
START-OF-SELECTION.
PERFORM f_table_definition USING p_table.
PERFORM f_upload_data USING p_file.
PERFORM f_prepare_table USING p_table.
PERFORM f_process_data.
IF p_nodb IS INITIAL.
PERFORM f_modify_table.
ENDIF.
IF p_list = 'X'.
PERFORM f_list_records.
ENDIF.
* End of selection
END-OF-SELECTION.
* FORM f_table_definition *
* --> VALUE(IN_TABLE) *
FORM f_table_definition USING value(in_table).
DATA: l_tname TYPE tabname,
l_state TYPE ddgotstate,
l_dd02v TYPE dd02v.
l_tname = in_table.
CALL FUNCTION 'DDIF_TABL_GET'
EXPORTING
name = l_tname
IMPORTING
gotstate = l_state
dd02v_wa = l_dd02v
TABLES
dd03p_tab = it_dd03p
EXCEPTIONS
illegal_input = 1
OTHERS = 2.
IF l_state NE 'A'.
write_error 'does not exist or is not active' space.
ENDIF.
IF l_dd02v-tabclass NE 'TRANSP' AND
l_dd02v-tabclass NE 'CLUSTER'.
write_error 'is type' l_dd02v-tabclass.
ENDIF.
ENDFORM.
* FORM f_prepare_table *
* --> VALUE(IN_TABLE) *
FORM f_prepare_table USING value(in_table).
DATA: l_tname TYPE tabname,
lt_ftab TYPE lvc_t_fcat.
l_tname = in_table.
CALL FUNCTION 'LVC_FIELDCATALOG_MERGE'
EXPORTING
i_structure_name = l_tname
CHANGING
ct_fieldcat = lt_ftab
EXCEPTIONS
OTHERS = 1.
IF sy-subrc NE 0.
WRITE: / 'Error while building field catalog'.
STOP.
ENDIF.
CALL METHOD cl_alv_table_create=>create_dynamic_table
EXPORTING
it_fieldcatalog = lt_ftab
IMPORTING
ep_table = it_file.
ASSIGN it_file->* TO <table>.
CREATE DATA is_file LIKE LINE OF <table>.
ASSIGN is_file->* TO <data>.
ENDFORM.
* FORM f_upload_data *
* --> VALUE(IN_FILE) *
FORM f_upload_data USING value(in_file).
DATA: l_file TYPE string,
l_ltext TYPE string.
DATA: l_lengt TYPE i,
l_field TYPE fieldname.
DATA: l_missk TYPE c.
l_file = in_file.
l_lengt = strlen( in_file ).
FORMAT INTENSIFIED ON.
WRITE: / 'Reading file', in_file(l_lengt).
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = l_file
filetype = 'ASC'
TABLES
data_tab = it_rdata
EXCEPTIONS
OTHERS = 1.
IF sy-subrc <> 0.
WRITE: /3 'Error uploading', l_file.
STOP.
ENDIF.
* File not empty
DESCRIBE TABLE it_rdata LINES sy-tmaxl.
IF sy-tmaxl = 0.
WRITE: /3 'File', l_file, 'is empty'.
STOP.
ELSE.
WRITE: '-', sy-tmaxl, 'rows read'.
ENDIF.
* File header on first row
READ TABLE it_rdata INTO is_rdata INDEX 1.
l_ltext = is_rdata.
WHILE l_ltext CS p_separ.
SPLIT l_ltext AT p_separ INTO l_field l_ltext.
APPEND l_field TO it_fields.
ENDWHILE.
IF sy-subrc = 0.
l_field = l_ltext.
APPEND l_field TO it_fields.
ENDIF.
* Check all key fields are present
SKIP.
FORMAT RESET.
FORMAT COLOR COL_HEADING.
WRITE: /3 'Key fields'.
FORMAT RESET.
LOOP AT it_dd03p INTO is_dd03p WHERE NOT keyflag IS initial.
WRITE: /3 is_dd03p-fieldname.
READ TABLE it_fields WITH KEY table_line = is_dd03p-fieldname
TRANSPORTING NO FIELDS.
IF sy-subrc = 0.
FORMAT COLOR COL_POSITIVE.
WRITE: 'ok'.
FORMAT RESET.
ELSEIF is_dd03p-datatype NE 'CLNT'.
FORMAT COLOR COL_NEGATIVE.
WRITE: 'error'.
FORMAT RESET.
l_missk = 'X'.
ENDIF.
ENDLOOP.
* Log other fields
SKIP.
FORMAT COLOR COL_HEADING.
WRITE: /3 'Other fields'.
FORMAT RESET.
LOOP AT it_dd03p INTO is_dd03p WHERE keyflag IS initial.
WRITE: /3 is_dd03p-fieldname.
READ TABLE it_fields WITH KEY table_line = is_dd03p-fieldname
TRANSPORTING NO FIELDS.
IF sy-subrc = 0.
WRITE: 'X'.
ENDIF.
ENDLOOP.
* Missing key field
IF l_missk = 'X'.
SKIP.
WRITE: /3 'Missing key fields - no further processing'.
STOP.
ENDIF.
ENDFORM.
* FORM f_process_data *
FORM f_process_data.
DATA: l_ltext TYPE string,
l_stext TYPE text40,
l_field TYPE fieldname,
l_datat TYPE c.
LOOP AT it_rdata INTO is_rdata FROM 2.
l_ltext = is_rdata.
LOOP AT it_fields INTO l_field.
ASSIGN COMPONENT l_field OF STRUCTURE <data> TO <fs>.
IF sy-subrc = 0.
* Field value comes from file, determine conversion
DESCRIBE FIELD <fs> TYPE l_datat.
CASE l_datat.
WHEN 'N'.
SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
WRITE l_stext TO <fs> RIGHT-JUSTIFIED.
OVERLAY <fs> WITH '0000000000000000'. "max 16
WHEN 'P'.
SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
TRANSLATE l_stext USING ',.'.
<fs> = l_stext.
WHEN 'F'.
SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
TRANSLATE l_stext USING ',.'.
<fs> = l_stext.
WHEN 'D'.
SPLIT l_ltext AT p_separ INTO l_stext l_ltext.
TRANSLATE l_stext USING '/.-.'.
CALL FUNCTION 'CONVERT_DATE_TO_INTERNAL'
EXPORTING
date_external = l_stext
IMPORTING
date_internal = <fs>
EXCEPTIONS
OTHERS = 1.
WHEN 'T'.
SPLIT l_ltext AT p_separ INTO l_stext l_ltext. " split off the field before converting
CALL FUNCTION 'CONVERT_TIME_INPUT'
EXPORTING
input = l_stext
IMPORTING
output = <fs>
EXCEPTIONS
OTHERS = 1.
WHEN OTHERS.
SPLIT l_ltext AT p_separ INTO <fs> l_ltext.
ENDCASE.
ELSE.
SHIFT l_ltext UP TO p_separ.
SHIFT l_ltext.
ENDIF.
ENDLOOP.
IF NOT <data> IS INITIAL.
LOOP AT it_dd03p INTO is_dd03p WHERE datatype = 'CLNT'.
* This field is the client (MANDT)
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-mandt.
ENDLOOP.
IF p_create = 'X'.
IF is_dd03p-rollname = 'ERDAT'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-datum.
ENDIF.
IF is_dd03p-rollname = 'ERZET'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-uzeit.
ENDIF.
IF is_dd03p-rollname = 'ERNAM'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-uname.
ENDIF.
ENDIF.
IF is_dd03p-rollname = 'AEDAT'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-datum.
ENDIF.
IF is_dd03p-rollname = 'AETIM'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-uzeit.
ENDIF.
IF is_dd03p-rollname = 'AENAM'.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data>
TO <fs>.
<fs> = sy-uname.
ENDIF.
APPEND <data> TO <table>.
ENDIF.
ENDLOOP.
ENDFORM.
* FORM f_modify_table *
FORM f_modify_table.
SKIP.
IF p_save = 'X'.
MODIFY (p_table) FROM TABLE <table>.
ELSEIF p_dele = 'X'.
DELETE (p_table) FROM TABLE <table>.
ELSE.
EXIT.
ENDIF.
IF sy-subrc EQ 0.
FORMAT COLOR COL_POSITIVE.
IF p_save = 'X'.
WRITE: /3 'Modify table OK'.
ELSE.
WRITE: /3 'Delete table OK'.
ENDIF.
FORMAT RESET.
IF p_test IS INITIAL.
COMMIT WORK.
ELSE.
ROLLBACK WORK.
WRITE: '- test only, no update'.
ENDIF.
ELSE.
FORMAT COLOR COL_NEGATIVE.
WRITE: /3 'Error while modifying table'.
FORMAT RESET.
ENDIF.
ENDFORM.
* FORM f_list_records *
FORM f_list_records.
DATA: l_tleng TYPE i,
l_lasti TYPE i,
l_offst TYPE i.
* Output width
l_tleng = 1.
LOOP AT it_dd03p INTO is_dd03p.
l_tleng = l_tleng + is_dd03p-outputlen.
IF l_tleng LT sy-linsz.
l_lasti = sy-tabix.
l_tleng = l_tleng + 1.
ELSE.
l_tleng = l_tleng - is_dd03p-outputlen.
EXIT.
ENDIF.
ENDLOOP.
* Output header
SKIP.
FORMAT COLOR COL_HEADING.
WRITE: /3 'Contents'.
FORMAT RESET.
ULINE AT /3(l_tleng).
* Output records
LOOP AT <table> ASSIGNING <data>.
LOOP AT it_dd03p INTO is_dd03p FROM 1 TO l_lasti.
IF is_dd03p-position = 1.
WRITE: /3 sy-vline.
l_offst = 3.
ENDIF.
ASSIGN COMPONENT is_dd03p-fieldname OF STRUCTURE <data> TO <fs>.
l_offst = l_offst + 1.
IF is_dd03p-decimals LE 2.
WRITE: AT l_offst <fs>.
ELSE.
WRITE: AT l_offst <fs> DECIMALS 3.
ENDIF.
l_offst = l_offst + is_dd03p-outputlen.
WRITE: AT l_offst sy-vline.
ENDLOOP.
ENDLOOP.
* Output end
ULINE AT /3(l_tleng).
ENDFORM.
Regards,
Joy. -
Help reading .csv file into an arraylist
I need to read a CSV file into an ArrayList and then print the ArrayList to the screen,
using:
try {
// Setup our scanner to read the file at the path c:\test.txt
Scanner myscanner = new Scanner(new File("\\carsDB.csv"));
ArrayList<CarsClass> carsClass = new ArrayList<CarsClass>(" write all fields");
while (myscanner.hasNextLine()) {
myscanner.add(" add car fields to the carClass"); // Write your code here.
// loop to read them all to the screen
The csv file is like this:
Manufacturer,carline name,displ,cyl,fuel,(miles),Class
CHEVROLET,CAVALIER (natural gas),2.2,4,CNG,120,SUBCOMPACT CARS
HONDA,CIVIC GX (natural gas),1.6,4,CNG,190,SUBCOMPACT CARS
FORD,CONTOUR (natural gas),2,4,CNG,70,COMPACT
FORD,CROWN VICTORIA (natural gas),4.6,8,CNG,140/210*,LARGE CARS
FORD,F150 PICKUP (natural gas) - 2WD,5.4,8,CNG,130,STANDARD PICKUP TRUCKS 2WD
FORD,F250 PICKUP (natural gas) - 2WD,5.4,8,CNG,160,STANDARD PICKUP TRUCKS 2WD
FORD,F250 PICKUP (natural gas) - 2WD,5.4,8,CNG,150/210*,STANDARD PICKUP TRUCKS 2WD
FORD,F150 PICKUP (natural gas) - 4WD,5.4,8,CNG,130,STANDARD PICKUP TRUCKS 4WD
FORD,F250 PICKUP (natural gas) - 4WD,5.4,8,CNG,160,STANDARD PICKUP TRUCKS 4WD
FORD,E250 ECONOLINE (natural gas) - 2WD,5.4,8,CNG,170,"VANS, CARGO TYPE"
FORD,E250 ECONOLINE (natural gas) - 2WD,5.4,8,CNG,80,"VANS, CARGO TYPE"
FORD,F150 LPG - 2WD,5.4,8,LPG,290/370*,STANDARD PICKUP TRUCKS 2WD
FORD,F250 LPG - 2WD,5.4,8,LPG,260/290/370**,STANDARD PICKUP TRUCKS 2WD
OK, so do I need to write a file CarsClass.java, or is the line:
ArrayList<CarsClass> carsClass = new ArrayList<CarsClass>(" write all fields");
going to define my CarsClass objects? How do I write my fields for each of the 7 categories?
I believe I can easily add to and print the ArrayList, but I'm confused about how to go about creating this ArrayList in the first place. Do I just create CarsClass.java and save them all as Strings? I guess my question is mainly: how should I structure this? Any suggestions/help is appreciated.
Edited by: scottc on Nov 15, 2007 5:55 PM
String.split uses regular expressions.
Ahh yeah, umm (slaps forehead), I'd forgotten that... so maybe StringTokenizer is more accessible for noobs. Sorry.
Anyway... if all you want/need to do is store a bunch of String field values, then how about using an array of Strings instead of individual fields... maybe something like:
forums\Car.java
package forums;

import krc.utilz.stringz.Arrayz;

/**
 * Car Data Transfer Object. All fields are final, making objects of this class
 * thread safe(r).
 * @author keith
 */
public class Car
{
    // class attributes // variables // isn't actually wrong, it's just not quite right.
    private final String[] fields; // final means values are write once, read many times, which is thread safe(r).

    /**
     * Initialises the new Car's fields to the given fields array.
     * @param fields - an array of any (reasonable) length
     */
    public Car(String[] fields) { // no need to comment a constructor as a constructor;
        this.fields = fields;     // much better to provide javadoc comments explaining
    }                             // what the method does and how to use it.

    /**
     * Returns this Car's fields as one long string.
     */
    public String toString() {                 // It's actually GOOD to be a lazy programmer.
        return Arrayz.join(", ", this.fields); // I have reused my Arrayz.join in several projects.
        // I think you can use Arrays.toString (new in 1.6) instead.
    }
}
krc\utilz\stringz\Arrayz.java
package krc.utilz.stringz;

import java.util.List;
import java.util.ArrayList;

public class Arrayz
{
    /**
     * Returns true if the given value is in the args list, else false.
     * @param value - the value to seek
     * @param args - variable number of String arguments to look in
     */
    public static boolean in(String value, String... args) {
        for(String a : args) if(value.equals(a)) return true;
        return false;
    }

    /**
     * Appends the elements of each array into one string, separated by FS.
     * @param a - an array of strings to join together
     * @param FS - Field Separator string, optional default=" "
     * @example
     *   String[] array = {"Bob","The","Builder"};
     *   System.out.println(join(array));
     *   --> Bob The Builder
     *   System.out.println("String[] array = {\""+join(array, "\",\"")+"\"};");
     *   --> String[] array = {"Bob","The","Builder"};
     */
    public static String join(String FS, String[]... arrays) {
        StringBuffer sb = new StringBuffer();
        for (String[] array : arrays)
            sb.append(join(array, FS));
        return(sb.toString());
    }

    public static String join(String[] array) {
        return(join(array, " "));
    }

    public static String join(String[] a, String FS) {
        if (a==null) return null;
        if (a.length==0) return "";
        StringBuffer sb = new StringBuffer(a[0]);
        for (int i=1; i<a.length; i++) {
            sb.append(FS + a[i]); // append each remaining element, not the whole array
        }
        return sb.toString();
    }

    /**
     * Appends all the elements of the given arrays into one big array.
     * @param arrays - to be concatenated
     * @return String[] - one big array
     */
    public static String[] concatenate(String[]... arrays) {
        List<String> list = new ArrayList<String>();
        for(String[] array : arrays) {
            for(String item : array) {
                list.add(item);
            }
        }
        return(list.toArray(new String[0]));
    }
}
I guess that Arrayz class might be a bit beyond you at the moment so don't worry too much if you can't understand the code... there are no nasty surprises in it... just cut and paste the code, change the package name, and use it (for now). -
Read from file to Internal Table With Extra Record
Hi,
I'm trying to read a file from the application server into an internal table, then loop through the internal table and display it.
My text file only has 2 rows of records. However, when I display the internal table, it shows the 2 records plus an extra line with 0. May I know where I went wrong?
PARAMETERS: p_infile LIKE rlgrap-filename OBLIGATORY DEFAULT '/usr/sap/'.
DATA: ld_file LIKE rlgrap-filename.
* Internal table to store upload data
TYPES: BEGIN OF t_record,
name1 like pa0002-VORNA,
name2 like pa0002-name2,
age type i,
END OF t_record.
DATA: it_record TYPE STANDARD TABLE OF t_record INITIAL SIZE 0,
wa_record TYPE t_record.
*Text version of data table
TYPES: begin of t_uploadtxt,
name1(10) type c,
name2(15) type c,
age(5) type c,
end of t_uploadtxt.
DATA: wa_uploadtxt TYPE t_uploadtxt,
wa_upload TYPE t_uploadtxt.
* String value to read data into initially.
DATA: wa_string(255) type c.
START-OF-SELECTION.
ld_file = p_infile.
OPEN DATASET ld_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc NE 0.
ELSE.
DO.
CLEAR: wa_string, wa_uploadtxt.
READ DATASET ld_file INTO wa_string.
IF sy-subrc NE 0.
EXIT.
ELSE.
SPLIT wa_string AT SPACE INTO wa_uploadtxt-name1 wa_uploadtxt-name2 wa_uploadtxt-age.
MOVE-CORRESPONDING wa_uploadtxt TO wa_upload.
APPEND wa_upload to it_record.
ENDIF.
ENDDO.
CLOSE DATASET ld_file.
ENDIF.
END-OF-SELECTION.
loop at it_record INTO wa_record.
write / wa_record-name1.
write / wa_record-name2.
write / wa_record-age.
Endloop.
Hi,
I am attaching the file I used. There's no empty line after the second record.
After executing, it shows:
Joe Adams 20
0
John Smith 40
0
May I know why there's a 0, and where did I go wrong?
Thank you. -
Read CSV file into a 1-D array
Hi
I would like to read a csv file into a cluster of 4 elements which would then be read into a 1-D array.
My cluster contains a typedef, a double, a boolean, and another typedef.
Basically it could be seen as:
Bob Runs, 4, T, Bob
Mary sits, 5, F, Mary
Bob Sits, 2, F, Bob
Mary Runs, 9, T, Mary
(keeps growing)
Are there any good examples for what I am trying to put together that I could leverage, or is it better to use a different input file than CSV? I am trying to make my program more flexible and easier to adjust even after the executable is created. My line items seem to be growing exponentially and are getting difficult to manage in the LabVIEW window.
Thanks
Solved!
Go to Solution.
Unless your CSV file is huge, I'd use "Read from Spreadsheet File" with the delimiter set to "," and the type set to string. This will give you a 2D array of strings. You could then separate out each column of the array, convert it to the appropriate data type, and use Index & Bundle Cluster Array to build your array of clusters. Something like this (except I'm using a string constant in place of reading from the file).
-
Read DMS document into internal table
Hi,
I have a requirement to read a DMS document in binary format into an internal table,
and I want to print the data in the internal table through the SAP spool.
Please let me know how to handle this requirement.
Thanks
Srini
Hi,
Check the SCMS package, function group SCMS_CONV, for conversions.
Regards
Surjit