Upload text delimited file
Could someone please provide sample code showing how to upload a pipe ("|") delimited text file into an internal table? I used GUI_UPLOAD with FIELD_SEPARATOR, but it didn't work. Many thanks.
Hi Voo,
I think this is the FM you are looking for.
See the sample below:
DATA: BEGIN OF itab OCCURS 0,
        vbeln LIKE vbak-vbeln,
        ernam LIKE vbak-ernam,
      END OF itab.

DATA: itab2 TYPE TABLE OF kcde_cells WITH HEADER LINE.

CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
  EXPORTING
    i_filename  = 'D:\data\upl.txt'
    i_separator = '|'
  TABLES
    e_intern    = itab2
  EXCEPTIONS
    upload_csv      = 1
    upload_filetype = 2
    OTHERS          = 3.
IF sy-subrc <> 0.
* MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*         WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

LOOP AT itab2.
  SPLIT itab2-value AT '|' INTO itab-vbeln itab-ernam.
  APPEND itab.
  CLEAR itab.
ENDLOOP.
And the file: upl.txt
12345|text
12346|text2
12347|text3
Regards,
Satesh
Similar Messages
-
Upload tab-delimited file from the application server to an internal table
Hello SAPients.
I'm using OPEN DATASET..., READ DATASET..., CLOSE DATASET to upload a file from the application server (SunOS). I'm working with SAP 4.6C. I'm trying to upload a tab-delimited file to an internal table, but when I try to load it the fields are not correctly separated; in fact, they are all misplaced and the table shows '#' where supposedly there was a tab.
I tried to SPLIT the line using as separator a variable with reference to CL_ABAP_CHAR_UTILITIES=>HORIZONTAL_TAB but for some reason that class doesn't exist in my system.
Do you know what I'm doing wrong? Or do you know a better method to upload a tab-delimited file into an internal table?
Thank you in advance for your help.
Try:
REPORT ztest MESSAGE-ID 00.
PARAMETER: p_file LIKE rlgrap-filename OBLIGATORY.
DATA: BEGIN OF data_tab OCCURS 0,
data(4096),
END OF data_tab.
DATA: BEGIN OF vendor_file_x OCCURS 0.
* LFA1 Data
DATA: mandt LIKE bgr00-mandt,
lifnr LIKE blf00-lifnr,
anred LIKE blfa1-anred,
bahns LIKE blfa1-bahns,
bbbnr LIKE blfa1-bbbnr,
bbsnr LIKE blfa1-bbsnr,
begru LIKE blfa1-begru,
brsch LIKE blfa1-brsch,
bubkz LIKE blfa1-bubkz,
datlt LIKE blfa1-datlt,
dtams LIKE blfa1-dtams,
dtaws LIKE blfa1-dtaws,
erdat LIKE lfa1-erdat,
ernam LIKE lfa1-ernam,
esrnr LIKE blfa1-esrnr,
konzs LIKE blfa1-konzs,
ktokk LIKE lfa1-ktokk,
kunnr LIKE blfa1-kunnr,
land1 LIKE blfa1-land1,
lnrza LIKE blfa1-lnrza,
loevm LIKE blfa1-loevm,
name1 LIKE blfa1-name1,
name2 LIKE blfa1-name2,
name3 LIKE blfa1-name3,
name4 LIKE blfa1-name4,
ort01 LIKE blfa1-ort01,
ort02 LIKE blfa1-ort02,
pfach LIKE blfa1-pfach,
pstl2 LIKE blfa1-pstl2,
pstlz LIKE blfa1-pstlz,
regio LIKE blfa1-regio,
sortl LIKE blfa1-sortl,
sperr LIKE blfa1-sperr,
sperm LIKE blfa1-sperm,
spras LIKE blfa1-spras,
stcd1 LIKE blfa1-stcd1,
stcd2 LIKE blfa1-stcd2,
stkza LIKE blfa1-stkza,
stkzu LIKE blfa1-stkzu,
stras LIKE blfa1-stras,
telbx LIKE blfa1-telbx,
telf1 LIKE blfa1-telf1,
telf2 LIKE blfa1-telf2,
telfx LIKE blfa1-telfx,
teltx LIKE blfa1-teltx,
telx1 LIKE blfa1-telx1,
xcpdk LIKE lfa1-xcpdk,
xzemp LIKE blfa1-xzemp,
vbund LIKE blfa1-vbund,
fiskn LIKE blfa1-fiskn,
stceg LIKE blfa1-stceg,
stkzn LIKE blfa1-stkzn,
sperq LIKE blfa1-sperq,
adrnr LIKE lfa1-adrnr,
mcod1 LIKE lfa1-mcod1,
mcod2 LIKE lfa1-mcod2,
mcod3 LIKE lfa1-mcod3,
gbort LIKE blfa1-gbort,
gbdat LIKE blfa1-gbdat,
sexkz LIKE blfa1-sexkz,
kraus LIKE blfa1-kraus,
revdb LIKE blfa1-revdb,
qssys LIKE blfa1-qssys,
ktock LIKE blfa1-ktock,
pfort LIKE blfa1-pfort,
werks LIKE blfa1-werks,
ltsna LIKE blfa1-ltsna,
werkr LIKE blfa1-werkr,
plkal LIKE lfa1-plkal,
duefl LIKE lfa1-duefl,
txjcd LIKE blfa1-txjcd,
sperz LIKE lfa1-sperz,
scacd LIKE blfa1-scacd,
sfrgr LIKE blfa1-sfrgr,
lzone LIKE blfa1-lzone,
xlfza LIKE lfa1-xlfza,
dlgrp LIKE blfa1-dlgrp,
fityp LIKE blfa1-fityp,
stcdt LIKE blfa1-stcdt,
regss LIKE blfa1-regss,
actss LIKE blfa1-actss,
stcd3 LIKE blfa1-stcd3,
stcd4 LIKE blfa1-stcd4,
ipisp LIKE blfa1-ipisp,
taxbs LIKE blfa1-taxbs,
profs LIKE blfa1-profs,
stgdl LIKE blfa1-stgdl,
emnfr LIKE blfa1-emnfr,
lfurl LIKE blfa1-lfurl,
j_1kfrepre LIKE blfa1-j_1kfrepre,
j_1kftbus LIKE blfa1-j_1kftbus,
j_1kftind LIKE blfa1-j_1kftind,
confs LIKE lfa1-confs,
updat LIKE lfa1-updat,
uptim LIKE lfa1-uptim,
nodel LIKE blfa1-nodel.
DATA: END OF vendor_file_x.
FIELD-SYMBOLS: <field>,
<field_1>.
DATA: delim TYPE x VALUE '09'.
DATA: fld_chk(4096),
last_char,
quote_1 TYPE i,
quote_2 TYPE i,
fld_lth TYPE i,
columns TYPE i,
field_end TYPE i,
outp_rec TYPE i,
extras(3) TYPE c VALUE '.,"',
mixed_no(14) TYPE c VALUE '1234567890-.,"'.
OPEN DATASET p_file FOR INPUT.
DO.
  READ DATASET p_file INTO data_tab-data.
  IF sy-subrc = 0.
    APPEND data_tab.
  ELSE.
    EXIT.
  ENDIF.
ENDDO.
CLOSE DATASET p_file.

* Count columns in output structure
DO.
  ASSIGN COMPONENT sy-index OF STRUCTURE vendor_file_x TO <field>.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  columns = sy-index.
ENDDO.
* Assign elements of input file to internal table
CLEAR vendor_file_x.
IF columns > 0.
  LOOP AT data_tab.
    DO columns TIMES.
      ASSIGN space TO <field>.
      ASSIGN space TO <field_1>.
      ASSIGN COMPONENT sy-index OF STRUCTURE vendor_file_x TO <field>.
      SEARCH data_tab-data FOR delim.
      IF sy-fdpos > 0.
        field_end = sy-fdpos + 1.
        ASSIGN data_tab-data(sy-fdpos) TO <field_1>.
*       Check that numeric fields don't contain any embedded " or ,
        IF <field_1> CO mixed_no AND
           <field_1> CA extras.
          TRANSLATE <field_1> USING '" , '.
          CONDENSE <field_1> NO-GAPS.
        ENDIF.
*       If first and last characters are '"', remove both.
        fld_chk = <field_1>.
        IF NOT fld_chk IS INITIAL.
          fld_lth = strlen( fld_chk ) - 1.
          MOVE fld_chk+fld_lth(1) TO last_char.
          IF fld_chk(1) = '"' AND
             last_char = '"'.
            MOVE space TO fld_chk+fld_lth(1).
            SHIFT fld_chk.
            MOVE fld_chk TO <field_1>.
          ENDIF.
        ENDIF.
*       Replace "" with "
        DO.
          IF fld_chk CS '""'.
            quote_1 = sy-fdpos.
            quote_2 = sy-fdpos + 1.
            MOVE fld_chk+quote_2 TO fld_chk+quote_1.
          ELSE.
            MOVE fld_chk TO <field_1>.
            EXIT.
          ENDIF.
        ENDDO.
        <field> = <field_1>.
      ELSE.
        field_end = 1.
      ENDIF.
      SHIFT data_tab-data LEFT BY field_end PLACES.
    ENDDO.
    APPEND vendor_file_x.
    CLEAR vendor_file_x.
  ENDLOOP.
ENDIF.

CLEAR data_tab.
REFRESH data_tab.
FREE data_tab.
Rob -
UPLOADING tab delimited file onto FTP server
Hello all
Can I upload a tab-delimited file onto the FTP server? If yes, then how?
Points guaranteed if answered!
Hi,
Yes, you can do this one. Have a look at the standard program 'RSEPSFTP'.
REPORT ZFTPSAP LINE-SIZE 132.

DATA: BEGIN OF MTAB_DATA OCCURS 0,
        LINE(132) TYPE C,
      END OF MTAB_DATA.

DATA: MC_PASSWORD(20) TYPE C,
      MI_KEY TYPE I VALUE 26101957,
      MI_PWD_LEN TYPE I,
      MI_HANDLE TYPE I.

START-OF-SELECTION.

*-- Your SAP-UNIX FTP password (case sensitive)
  MC_PASSWORD = 'password'.
  DESCRIBE FIELD MC_PASSWORD LENGTH MI_PWD_LEN.

*-- FTP_CONNECT requires an encrypted password to work
  CALL 'AB_RFC_X_SCRAMBLE_STRING'
    ID 'SOURCE'      FIELD MC_PASSWORD
    ID 'KEY'         FIELD MI_KEY
    ID 'SCR'         FIELD 'X'
    ID 'DESTINATION' FIELD MC_PASSWORD
    ID 'DSTLEN'      FIELD MI_PWD_LEN.

  CALL FUNCTION 'FTP_CONNECT'
    EXPORTING
*-- Your SAP-UNIX FTP user name (case sensitive)
      USER            = 'userid'
      PASSWORD        = MC_PASSWORD
*-- Your SAP-UNIX server host name (case sensitive)
      HOST            = 'unix-host'
      RFC_DESTINATION = 'SAPFTP'
    IMPORTING
      HANDLE          = MI_HANDLE
    EXCEPTIONS
      NOT_CONNECTED   = 1
      OTHERS          = 2.
  CHECK SY-SUBRC = 0.

  CALL FUNCTION 'FTP_COMMAND'
    EXPORTING
      HANDLE        = MI_HANDLE
      COMMAND       = 'dir'
    TABLES
      DATA          = MTAB_DATA
    EXCEPTIONS
      TCPIP_ERROR   = 1
      COMMAND_ERROR = 2
      DATA_ERROR    = 3
      OTHERS        = 4.

  IF SY-SUBRC = 0.
    LOOP AT MTAB_DATA.
      WRITE: / MTAB_DATA.
    ENDLOOP.
  ELSE.
*   do some error checking.
    WRITE: / 'Error in FTP Command'.
  ENDIF.

  CALL FUNCTION 'FTP_DISCONNECT'
    EXPORTING
      HANDLE = MI_HANDLE
    EXCEPTIONS
      OTHERS = 1.
Regards
Sudheer -
Detecting line-breaks within a column of an uploaded tab-delimited file.
Suppose you upload a tab-delimited file from your laptop and split each row of the file into some structure that you append to an itab.
Is there a way inside ABAP to detect that a field of the uploaded file has a CR or CRLF in it? And if so, where it is?
Thanks in advance ...
You can use any of the following for those characters:

DATA: head_crnl(2)  TYPE c VALUE cl_abap_char_utilities=>cr_lf,
      top_crnl(2)   TYPE c VALUE cl_abap_char_utilities=>cr_lf,
      end_crnl(2)   TYPE c VALUE cl_abap_char_utilities=>cr_lf,
      blank_crnl(2) TYPE c VALUE cl_abap_char_utilities=>cr_lf,
      final_crnl(2) TYPE c VALUE cl_abap_char_utilities=>cr_lf,
      first_pgbr(1) TYPE c VALUE cl_abap_char_utilities=>form_feed.

Note that CR_LF is two characters long, so the receiving fields must be at least length 2. Declare the above variables and check whether they occur in the field (for example with the CS comparison operator or SEARCH); SY-FDPOS then gives you the offset where the match was found. Hope this helps. -
How to read a text delimited file using 2 dimentional array in java ??
Hi,
I am new to Java programming. I have a task where I have to read a tab-delimited text file into an array. For example, if the file is as follows:
Name place Value
adi goa 20
shri mumbai 30
riya bangalr 45
I want it to be read in Java so as to get an array[row][columns].
This is what I am currently up to, but I can't get any further.
import java.io.BufferedReader;
import java.io.FileReader;

public class generateGML {
    public static void main(String[] argv) throws Exception {
        BufferedReader fh = new BufferedReader(new FileReader("filename.txt"));
        String s;
        while ((s = fh.readLine()) != null) {
            String[] columns = s.split("\t");
            String name = columns[0];
            String place = columns[1];
            String value = columns[2];
        }
        fh.close();
    }
}

It reads columns, but I want it two-dimensional, as in something like matrix[row_num][column_num].
Can anyone please suggest?
You could do the following:

String[][] array = new String[rows][];
int row_num = 0;
while ((s = fh.readLine()) != null) {
    array[row_num++] = s.split("\t");
}

However, you need to know ahead of time how many rows to allocate. If you allocate more than needed, you'll need to copy to a new array, or you'll need to keep track of how much is actually populated. If you allocate fewer than needed, you'll get an ArrayIndexOutOfBoundsException.
Another (likely better) approach: do you really need it as a 2-dimensional array? Can you make a List of objects that have a name, place, and value? Then you don't need to know ahead of time how big a list to allocate, assuming you use a list that grows itself (like ArrayList or LinkedList). Your code would be much easier to read if you could say:

String name = list.get(10).getName();

instead of

String name = array[10][0]; -
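A minimal sketch of that list-based approach. The `TabFileDemo` and `Row` class names and the sample lines are illustrative, not from the original post:

```java
import java.util.ArrayList;
import java.util.List;

public class TabFileDemo {
    // Simple holder object; name/place/value mirror the columns of the sample file.
    static class Row {
        final String name;
        final String place;
        final String value;
        Row(String name, String place, String value) {
            this.name = name;
            this.place = place;
            this.value = value;
        }
        String getName()  { return name; }
        String getPlace() { return place; }
        String getValue() { return value; }
    }

    // Parse tab-delimited lines into a growable list; no row count needed up front.
    static List<Row> parse(List<String> lines) {
        List<Row> rows = new ArrayList<>();
        for (String line : lines) {
            String[] cols = line.split("\t");
            rows.add(new Row(cols[0], cols[1], cols[2]));
        }
        return rows;
    }

    public static void main(String[] args) {
        List<Row> rows = parse(List.of("adi\tgoa\t20", "shri\tmumbai\t30"));
        System.out.println(rows.get(0).getName()); // prints "adi"
    }
}
```

In real code the `List<String>` would come from reading the file line by line with a BufferedReader, as in the snippets above.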
How do I upload a file that is tab delimited? If the file is comma separated, we split items at ','. How do I split tab-separated items?
Hi Willard,
As you know, handling of files differs between the presentation server and the application server.
Presentation Server: we can use FM GUI_UPLOAD for this.
Application Server:
1. Declare a long text variable, e.g.: data: l_data(1024) type c.
2. Declare a variable which holds the tab character: data: l_tab(1) type c value cl_abap_char_utilities=>horizontal_tab.
3. While reading the dataset, read into l_data.
4. Split l_data at l_tab into the work area fields.
Eg:
data: l_text(1024) type c,
      l_tab(1) type c value cl_abap_char_utilities=>horizontal_tab.

open dataset fname for input in text mode encoding default.
if sy-subrc ne 0.
  write: / 'Error Opening File'.
else.
  do.
    read dataset fname into l_text.
    if sy-subrc ne 0.
      exit.
    else.
      split l_text at l_tab into wa-fld1 wa-fld2 ....
      append wa to itab.
    endif.
  enddo.
  close dataset fname.
endif.
Hope the above info gives you some idea.
Kind Regards
Eswar -
Uploading a delimited file via sqlldr
I am using Oracle 10g on Unix. I have been receiving a file, previously fixed length, and uploading it into a table using sqlldr. This time the file is comma-delimited and the text data is enclosed with double quotes. I feel like a total idiot (and probably sound like one) but I have no idea how to make this work. Any assistance is greatly appreciated.
Thanks.
Go to tahiti.oracle.com
Drill down to your product and version
There you will find the complete doc set
Click on the Books tab
Find and open the Utilities Manual
Find the section on sqlldr
Find the discussion on "variable record format"
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#sthref476 -
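As a pointer, a minimal SQL*Loader control file for a comma-delimited file with double-quoted text fields might look like this (the file, table, and column names here are made up for illustration; see the Utilities manual section above for the full syntax):

```
LOAD DATA
INFILE 'input.dat'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3)
```

You would then invoke it with something like: sqlldr userid=scott control=my_load.ctl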
Parse tab delimited file after upload
I could use some advice. I have a requirement to parse and insert an uploaded, tab delimited file using APEX. Directly off of the file system, I would do this with an external table, but here I am uploading the file into a CLOB field. Does PL/SQL have any native functions to parse delimited files?
The user will not have access to the data load part of APEX, although I have seen numerous requests on this forum for that functionality to be added to the user GUI. Thoughts?
j,
I wrote this a while ago...
http://www.danielmcghan.us/2009/02/easy-csv-uploads-yes-we-can.html
I've since improved the code a little bit. If you like the solution, I could do a new post with the latest code.
Regards,
Dan
Blog: http://DanielMcGhan.us/
Work: http://SkillBuilders.com/ -
How to upload a Flat file into sap database if the file is in Appl'n Server
Hello SAP experts, can you tell me how to upload a flat file into the SAP database if the file is on the application server? What is the path for that?
Please tell me, it's urgent.
Thanks to all.
Hi,
ABAP code for uploading a TAB delimited file into an internal table. See code below for structures.
*& Report ZUPLOADTAB *
*& Example of Uploading tab delimited file *
REPORT zuploadtab .
PARAMETERS: p_infile LIKE rlgrap-filename OBLIGATORY DEFAULT '/usr/sap/'.
DATA: ld_file LIKE rlgrap-filename.
*Internal tabe to store upload data
TYPES: BEGIN OF t_record,
         name1 LIKE pa0002-vorna,
         name2 LIKE pa0002-nachn,  " PA0002 has no NAME2 field; NACHN (last name) assumed here
         age   TYPE i,
       END OF t_record.
DATA: it_record TYPE STANDARD TABLE OF t_record INITIAL SIZE 0,
wa_record TYPE t_record.
*Text version of data table
TYPES: begin of t_uploadtxt,
name1(10) type c,
name2(15) type c,
age(5) type c,
end of t_uploadtxt.
DATA: wa_uploadtxt TYPE t_uploadtxt.
*String value to data in initially.
DATA: wa_string(255) type c.
constants: con_tab TYPE x VALUE '09'.
*If you have Unicode check active in program attributes then you will
*need to declare constants as follows:
*class cl_abap_char_utilities definition load.
*constants:
* con_tab type c value cl_abap_char_utilities=>HORIZONTAL_TAB.
*START-OF-SELECTION
START-OF-SELECTION.
ld_file = p_infile.
OPEN DATASET ld_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc NE 0.
ELSE.
  DO.
    CLEAR: wa_string, wa_uploadtxt.
    READ DATASET ld_file INTO wa_string.
    IF sy-subrc NE 0.
      EXIT.
    ELSE.
      SPLIT wa_string AT con_tab INTO wa_uploadtxt-name1
                                      wa_uploadtxt-name2
                                      wa_uploadtxt-age.
      MOVE-CORRESPONDING wa_uploadtxt TO wa_record.
      APPEND wa_record TO it_record.
    ENDIF.
  ENDDO.
  CLOSE DATASET ld_file.
ENDIF.
*END-OF-SELECTION
END-OF-SELECTION.
*!! Text data is now contained within the internal table IT_RECORD
* Display report data for illustration purposes
LOOP AT it_record INTO wa_record.
  WRITE: / sy-vline,
         (10) wa_record-name1, sy-vline,
         (10) wa_record-name2, sy-vline,
         (10) wa_record-age, sy-vline.
ENDLOOP.
FM to upload TAB DELIMITED TEXT file into Internal table.
Hello Friends,
The FM 'ALSM_EXCEL_TO_INTERNAL_TABLE' is used to upload EXCEL file into a Internal table.
Is there any FM which performs the simillar operation on TAB DELIMITED TEXT FILE.
Thanks in advance!
Ashish
Hi,
To upload a tab-delimited text file you can use FM GUI_UPLOAD.
In this function, pass 'X' in the field HAS_FIELD_SEPARATOR.
Regards,
Sujit -
Uploading comma delimited text file
Dear Experts,
I want to upload a comma-delimited text file into SAP and get the values into an internal table.
Can anyone provide sample code?
Thanks in advance.
Hi Suchitra,
There is FM GUI_UPLOAD which will help you with this.
It has a parameter called HAS_FIELD_SEPARATOR. Note that this is a flag, not the separator character itself: any non-blank value (conventionally 'X') tells GUI_UPLOAD that the fields are tab-separated; it does not split at arbitrary characters such as a comma. For a comma-delimited file, upload the lines into a table of character lines and SPLIT each line AT ',' yourself.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename                = wf_file
    filetype                = 'ASC'
*   Note: has_field_separator is only a flag. Any non-blank value
*   (conventionally 'X') means "fields are tab-separated"; GUI_UPLOAD
*   does not split at arbitrary characters such as a comma.
    has_field_separator     = 'X'
*   header_length           = 0
*   read_by_line            = 'X'
*   dat_mode                = ' '
*   codepage                = ' '
*   ignore_cerr             = abap_true
*   replacement             = '#'
*   check_bom               = ' '
* IMPORTING
*   filelength              =
*   header                  =
  TABLES
    data_tab                = it_data
  EXCEPTIONS
    file_open_error         = 1
    file_read_error         = 2
    no_batch                = 3
    gui_refuse_filetransfer = 4
    invalid_type            = 5
    no_authority            = 6
    unknown_error           = 7
    bad_data_format         = 8
    header_not_allowed      = 9
    separator_not_allowed   = 10
    header_too_long         = 11
    unknown_dp_error        = 12
    access_denied           = 13
    dp_out_of_memory        = 14
    disk_full               = 15
    dp_timeout              = 16
    OTHERS                  = 17.
Hope this will help you.
Regards,
Nikhil -
Text Tab delimited file-Strcuture-validation
Hi Folks,
1. Is there a way to check programmatically whether a file is tab delimited?
2. Can we check whether the data in the tab-delimited file matches the structure of the internal table into which it is going to be uploaded?
Thanks,
K. Kiran.
Hi, try this:

DO.
  READ DATASET p_ufile INTO in_file.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  APPEND in_file.
  CLEAR in_file.
ENDDO.
CLOSE DATASET p_ufile.

LOOP AT in_file.
  SPLIT in_file AT c_tab INTO
    wa_citm_b-type
    wa_citm_b-vbeln
    wa_citm_b-posnr
    wa_citm_b-uepos
    wa_citm_b-matnr
    lv_menge
    wa_citm_b-arktx
    wa_citm_b-vbegdat
    wa_citm_b-venddat
    wa_citm_b-prctr
    wa_citm_b-zterm
    wa_citm_b-faksp
    wa_citm_b-taxm1
    wa_citm_b-vlaufz
    wa_citm_b-vlauez
    wa_citm_b-vlaufk
    wa_citm_b-vkuegru
    wa_citm_b-bstkd
    wa_citm_b-bstdk
    wa_citm_b-posex
    wa_citm_b-bstkd_e
    wa_citm_b-bstdk_e
    wa_citm_b-period.
  IF NOT wa_citm_b-posnr CA sy-abcde.
    APPEND wa_citm_b TO lt_citm_b.
  ENDIF.
ENDLOOP.
Functions to upload UNIX tab-delimited file
Please tell me the list of functions to upload a UNIX tab-delimited file into a database table.
Hi,

DATA: itab TYPE STANDARD TABLE OF zcbu WITH HEADER LINE.

ld_file = p_infile.
OPEN DATASET ld_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc NE 0.
ELSE.
  DO.
    CLEAR: wa_string, wa_uploadtxt.
    READ DATASET ld_file INTO wa_string.
    IF sy-subrc NE 0.
      EXIT.
    ELSE.
      SPLIT wa_string AT con_tab INTO wa_uploadtxt-name1
                                      wa_uploadtxt-name2
                                      wa_uploadtxt-age.
      MOVE-CORRESPONDING wa_uploadtxt TO wa_upload.
      APPEND wa_upload TO it_record.
    ENDIF.
  ENDDO.
  CLOSE DATASET ld_file.
ENDIF.

LOOP AT it_record.
  itab-field1 = it_record-field1.
  itab-field2 = it_record-field2.
  APPEND itab.
ENDLOOP.

*-- Now update the table
MODIFY zcbu FROM TABLE itab.
Upload text file to oracle table with checking and aggregation
Hi Friends,
I am new to ODI. I have encountered a problem which is specific to ODI 11G (11.1.1.6.3) to upload text file to oracle table with checking and aggregation. Would you please teach me how to implement the following requirement in ODI 11G?
Input text file a:
staffCode, staffCat, status, data
input text file b:
staffCodeStart, staffCodeEnd, staffCat
temp output oracle table c:
staffCat, data
output oracle table d:
staffCat, data
order:
a.staffCode, a.staffCat, a.status
filter:
a.status = ‘active’
join:
a left outerjoin b on a.staffCode between b.staffCodeStart and b.staffCodeEnd
insert temp table c:
c.staffCat = if b.staffCat is not null then b.staffCat else a.staffCat
c.data = a.data
insert table d:
if c.staffCat between 99 and 1000 then d.staffCat = c.staffCat, d.data = sum(c.data)
else d.staffCat = c.staffCat, d.data = LAST(c.data)
Any help on fixing this is highly appreciated. Thanks!!
Thanks,
Chris
Dear Santy,
Many thanks for your prompt reply. May I have more information about the LAST or SUM step?
I was successful to create and run the following interfaces p and q
1. Drag text file a to a newly created interface panel p
2. Filter text file a : a.status = ‘active’
3. Lookup text file a to text file b : a.staffCode between b.staffCodeStart and b.staffCodeEnd
4. Drag oracle temp table c to interface panel p
5. Set c.staffCat : CASE WHEN b.staffCat IS NULL THEN a.staffCat ELSE b.staffCat END
6. Set c.data : a.data
7. Drag oracle temp table c to a newly created interface panel q
8. Drag oracle table d to interface panel q
9. Set UK to d.staffCat
10. Set Distinct Rows to table d
11. Set d.staffCat = c.staffCat
12. Set d.data = SUM(c.data)
However, the interface q needs to do more than that:
If c.staffCat is between 99 and 1000, then d.data = the last record's c.data; else d.data = sum(c.data).
Would you please teach me how to do the LAST or SUM steps? Moreover, can interface p and interface q be combined into one interface without using the temp table c? Many thanks!
Regards,
Chris -
Upload text files with non-english characters
I use an Apex page to upload text files. Then i retrieve the contents of files from wwv_flow_files.blob_content and convert them to varchar2 with utl_raw.cast_to_varchar2, but characters like ò, à, ù become garbage.
What could be the problem? Are characters lost when files are stored in wwv_flow_files or when i do the conversion?
Some other info:
* I see wwv_flow_files.DAD_CHARSET is set to "ascii", wwv_flow_files.FILE_CHARSET is null.
* Trying utl_raw.cast_to_varchar2( utl_raw.cast_to_raw('àòèù') ) returns 'àòèù' correctly;
* NLS_CHARACTERSET parameter is AL32UTF8 (not just English ASCII).
Hi,
Have a look at the thread "csv upload -- suggestion needed with non-English character in csv file"; it might help you.
Thanks,
Manish