Gui_download Comma delimited
Is there a way to create a comma delimited file using GUI_DOWNLOAD? It keeps coming out tab delimited.
Thanks
Sure, just declare a flat structure in your program.
data: itab type table of string.
data: xtab type string.
Then fill this internal table from your other internal table, concatenating the fields:
loop at ianother.
  concatenate ianother-fld1
              ianother-fld2
              ianother-fld3
              ianother-fld4
         into xtab separated by ','.
  append xtab to itab.
endloop.
Now just pass ITAB to the GUI_DOWNLOAD function module.
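For readers outside ABAP, the same idea can be sketched in Python (an illustration only, not part of the original answer): join each record's fields with commas to build one line per record, then write the lines out.

```python
# Build one comma-joined line per record, mirroring the CONCATENATE /
# APPEND loop above; writing the lines to a file is the GUI_DOWNLOAD step.
records = [("0001", "WIDGET", "10", "EA"),
           ("0002", "GADGET", "5", "EA")]

lines = [",".join(fields) for fields in records]
print(lines[0])  # 0001,WIDGET,10,EA
```

Note that a plain join does not escape fields that themselves contain commas; a CSV library is needed for that case.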
Welcome to SDN.
Regards,
Rich Heilman
Similar Messages
-
Convert xls to comma delimited
hi all...
i have a xls file which needs to be converted to comma delimited dat file.
can somebody help me on this.
thnks,
i need it programmatically pls!!
Check the program below:
Upload the xls file and you can see the .txt file (comma delimited).
Input ( XLS file )
aaa 1245 2344 233 qwww
233 2222 qwww www www
Output (.txt file, comma delimited)
aaa,1245,2344,233,qwww
233,2222,qwww,www,www
REPORT ZFII_MISSING_FILE_UPLOAD no standard page heading.
data : begin of i_text occurs 0,
text(1024) type c,
end of i_text.
* Internal table for file data
data : begin of i_data occurs 0,
field1(10) type c,
field2(10) type c,
field3(10) type c,
field4(10) type c,
field5(10) type c,
end of i_data.
data : begin of i_download occurs 0,
text(1024) type c,
end of i_download.
data : v_lines type sy-index.
data : g_repid like sy-repid.
data v_file type string.
data: itab like alsmex_tabline occurs 0 with header line.
data : g_line like sy-index,
g_line1 like sy-index,
$v_start_col type i value '1',
$v_start_row type i value '1',
$v_end_col type i value '256',
$v_end_row type i value '65536',
gd_currentrow type i.
selection-screen : begin of block blk with frame title text.
parameters : p_file like rlgrap-filename obligatory.
selection-screen : end of block blk.
initialization.
g_repid = sy-repid.
at selection-screen on value-request for p_file.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
PROGRAM_NAME = g_repid
IMPORTING
FILE_NAME = p_file.
start-of-selection.
* Upload the data into an internal table
perform upload_data.
* Download the data into a comma-delimited file
perform download_data.
*&---------------------------------------------------------------------*
*&      Form  upload_data
*&---------------------------------------------------------------------*
FORM upload_data.
CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
EXPORTING
FILENAME = p_file
I_BEGIN_COL = $v_start_col
I_BEGIN_ROW = $v_start_row
I_END_COL = $v_end_col
I_END_ROW = $v_end_row
TABLES
INTERN = itab
EXCEPTIONS
INCONSISTENT_PARAMETERS = 1
UPLOAD_OLE = 2
OTHERS = 3.
IF SY-SUBRC <> 0.
write:/10 'File '.
ENDIF.
if sy-subrc eq 0.
read table itab index 1.
gd_currentrow = itab-row.
loop at itab.
if itab-row ne gd_currentrow.
append i_data.
clear i_data.
gd_currentrow = itab-row.
endif.
case itab-col.
  when '0001'.
*   first field
    i_data-field1 = itab-value.
  when '0002'.
*   second field
    i_data-field2 = itab-value.
  when '0003'.
*   third field
    i_data-field3 = itab-value.
  when '0004'.
*   fourth field
    i_data-field4 = itab-value.
  when '0005'.
*   fifth field
    i_data-field5 = itab-value.
endcase.
endloop.
endif.
append i_data.
ENDFORM. " upload_data
*&---------------------------------------------------------------------*
*&      Form  download_data
*&---------------------------------------------------------------------*
FORM download_data.
loop at i_data.
concatenate i_data-field1 ',' i_data-field2 ',' i_data-field3 ','
i_data-field4 ',' i_data-field5 into i_download-text.
append i_download.
clear : i_download,
i_data.
endloop.
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
*   BIN_FILESIZE            =
    FILENAME                =
      'C:\Documents and Settings\smaramreddy\Desktop\fff.txt'
    FILETYPE                = 'ASC'
    APPEND                  = ' '
    WRITE_FIELD_SEPARATOR   = ' '
    HEADER                  = '00'
    TRUNC_TRAILING_BLANKS   = ' '
    WRITE_LF                = 'X'
    COL_SELECT              = ' '
    COL_SELECT_MASK         = ' '
    DAT_MODE                = ' '
* IMPORTING
*   FILELENGTH              =
  TABLES
    DATA_TAB                = i_download
  EXCEPTIONS
    FILE_WRITE_ERROR        = 1
    NO_BATCH                = 2
    GUI_REFUSE_FILETRANSFER = 3
    INVALID_TYPE            = 4
    NO_AUTHORITY            = 5
    UNKNOWN_ERROR           = 6
    HEADER_NOT_ALLOWED      = 7
    SEPARATOR_NOT_ALLOWED   = 8
    FILESIZE_NOT_ALLOWED    = 9
    HEADER_TOO_LONG         = 10
    DP_ERROR_CREATE         = 11
    DP_ERROR_SEND           = 12
    DP_ERROR_WRITE          = 13
    UNKNOWN_DP_ERROR        = 14
    ACCESS_DENIED           = 15
    DP_OUT_OF_MEMORY        = 16
    DISK_FULL               = 17
    DP_TIMEOUT              = 18
    FILE_NOT_FOUND          = 19
    DATAPROVIDER_EXCEPTION  = 20
    CONTROL_FLUSH_ERROR     = 21
    OTHERS                  = 22.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
ENDFORM. " download_data
Thanks
Seshu -
Comma delimited in Sql query decode function errors out
Hi All,
DB: 11.2.0.3.0
I am using the below query to generate the comma delimited output in a spool file but it errors out with the message below:
SQL> set lines 100 pages 50
SQL> col "USER_CONCURRENT_QUEUE_NAME" format a40;
SQL> set head off
SQL> spool /home/xyz/cmrequests.csv
SQL> SELECT
2 a.USER_CONCURRENT_QUEUE_NAME || ','
3 || a.MAX_PROCESSES || ','
4 || sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
5 ||sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'I',1,0),0)) Pending_Normal ||','
6 ||sum(decode(b.PHASE_CODE,'R',decode(b.STATUS_CODE,'R',1,0),0)) Running_Normal
7 from FND_CONCURRENT_QUEUES_VL a, FND_CONCURRENT_WORKER_REQUESTS b
8 where a.concurrent_queue_id = b.concurrent_queue_id AND b.Requested_Start_Date <= SYSDATE
9 GROUP BY a.USER_CONCURRENT_QUEUE_NAME,a.MAX_PROCESSES;
|| sum(decode(b.PHASE_CODE,'P',decode(b.STATUS_CODE,'Q',1,0),0)) Pending_Standby ||','
ERROR at line 4:
ORA-00923: FROM keyword not found where expected
SQL> spool off;
SQL>
Expected output in the spool /home/xyz/cmrequests.csv
Standard Manager,10,0,1,0
Thanks for your time!
Regards,
Get to work immediately on marking your previous questions ANSWERED if they have been!
> I am using the below query to generate the comma delimited output in a spool file but it errors out with the message below: [query quoted in full above]
Well if you want to spool query results to a file the first thing you need to do is write a query that actually works.
Why do you think a query like this is valid?
SELECT 'this, is, my, giant, string, of, columns, with, commas, in, between, each, word'
GROUP BY this, is, my, giant, string
You only have one column in the result set, but you are trying to group by five columns, and none of them is even in the result set.
What's up with that?
You can only group by columns that are actually IN the result set. -
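For what it's worth, the ORA-00923 in the original post most likely comes from the column aliases (Pending_Standby, Pending_Normal) sitting in the middle of the concatenation: an alias ends a select item, so the || that follows it is a syntax error. A minimal sketch of the working pattern, using SQLite in Python with made-up table and column names, aliasing the whole concatenated expression once at the end:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE queues (queue_name TEXT, max_processes INTEGER);
CREATE TABLE requests (queue_name TEXT, phase_code TEXT, status_code TEXT);
INSERT INTO queues VALUES ('Standard Manager', 10);
INSERT INTO requests VALUES ('Standard Manager', 'P', 'I');
""")

# The alias appears exactly once, after the complete expression.
# Continuing the concatenation *after* an alias (as in the original
# query) is what raises ORA-00923 in Oracle.
row = conn.execute("""
    SELECT q.queue_name || ',' || q.max_processes || ',' ||
           SUM(CASE WHEN r.phase_code = 'P' AND r.status_code = 'I'
                    THEN 1 ELSE 0 END) AS csv_line
    FROM queues q JOIN requests r ON r.queue_name = q.queue_name
    GROUP BY q.queue_name, q.max_processes
""").fetchone()
print(row[0])  # Standard Manager,10,1
```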
SQL to match a single value in a field with comma-delimited text
I have a column that can contain none, one or many recordIDs
(from another table) stored as comma-delimited text strings (i.e.,
a list). I need to retrieve all records that match a single value
within that list.
For example, if I want to match all values that equal
recordID 3, and I use ... WHERE MyColumn IN ('3') ... , I will get
all records that have EXACTLY 3 as the value of MyColumn, but not
any MyColumn records whose values include 3, if they are instances
such as "3,17" or the like.
Also using the LIKE operator -- as WHERE MyColumn LIKE '%3%'
-- will get me unwanted records with values such as 35 or 13 ...
Can I use some sort of intervening ColdFusion list processing
to output only the desired records?
Normalize your database so that your data becomes accessible. -
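Normalizing is the right long-term fix. If the comma-delimited column must stay, the usual workaround is to pad both the column and the search value with the delimiter, so a value can only match as a whole list element. A sketch in Python/SQLite with hypothetical table and data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (my_column TEXT)")
conn.executemany("INSERT INTO t VALUES (?)",
                 [("3",), ("3,17",), ("35",), ("13",), ("17,3",)])

# Wrap both the column and the search value in commas, so '3' matches
# only as a complete list element -- never inside '35' or '13'.
rows = conn.execute(
    "SELECT my_column FROM t "
    "WHERE ',' || my_column || ',' LIKE '%,' || ? || ',%'",
    ("3",)).fetchall()
print(sorted(r[0] for r in rows))  # ['17,3', '3', '3,17']
```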
Import Format for Separte Debit and Credit Columns (comma delimited file)
I have a file that is comma delimited with a separate debit column and a separate credit column:
Sample:
Ent,Acct,description,Dr,Cr
,1000,test,100,
,110010,another test,,100
My import format is this:
Comma delimited
Field     Number   Number of Fields   Expression
Entity    1        5                  SGB_ABC
Account   2        5
Amount    5        5                  Script=DRCRSplit_Comma_Del.uss
I've tried writing the following script to pull the amount from the debit column into the credit column but it just skips the lines with no amount in field 5.
'Declare Variables
Dim Field4
Dim Field5
'Retrieve Data
Field4 = DW.Utilities.fParseString(strRecord, 4, 4, ",")
Field5 = DW.Utilities.fParseString(strRecord, 5, 5, ",")
'If Field4 has a value then fld 5 is to equal fld 4.
If IsNumeric(Field5) Then
Field5 = Field5
Else
Field5 = Field4
End If
'Return Result into FDM
SQFLD5 = Field5
End Function
Can anyone tell me what I am doing wrong?
Thanks!
I got it to work using this script:
Function DRCR_Comma_Del(strField, strRecord)
'Hyperion FDM DataPump Import Script:
'Created By: testuser
'Date Created: 7/22/2009 9:31:15 AM
'Purpose: If Amount is in the DR column move it to the CR amount column.
Dim DR
Dim CR
DR = DW.Utilities.fParseString(strRecord, 5, 4, ",")
CR = DW.Utilities.fParseString(strRecord, 5, 5, ",")
If IsNumeric(DR) Then
strField = DR
Else
strField = "-" & CR
End If
DRCR_Comma_Del = strField
End Function -
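The core of that working script is: take the debit if it parses as a number, otherwise negate the credit. The same logic in Python (a language-neutral sketch; field positions assumed from the sample file above):

```python
def split_dr_cr(record, sep=","):
    """Return the signed amount from a record with separate Dr/Cr columns.

    Assumed field layout: Entity, Account, Description, Dr, Cr.
    A debit is returned as-is; a credit is returned negated, mirroring
    the FDM script above.
    """
    fields = record.split(sep)
    dr, cr = fields[3].strip(), fields[4].strip()
    try:
        return float(dr)      # the amount sits in the debit column
    except ValueError:
        return -float(cr)     # otherwise negate the credit column

print(split_dr_cr(",1000,test,100,"))            # 100.0
print(split_dr_cr(",110010,another test,,100"))  # -100.0
```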
I've created an SSIS package to import a comma delimited file (csv) with double quotes for a text qualifier ("). Some of the fields contain the delimiter inside the qualified text. An example row is:
15,"Doe, John",IS2,Alabama
In SSIS I've specified the text qualifier as ". The sample output in the connection manager looks great. The package runs perfectly from VS and when manually executed on the SSIS server itself. The problem comes when I schedule the package to run via SQL
job. At this point the package ignores the text qualifier, and in doing so pushes half of a field into the next available column. But instead of having too many columns, it concatenates the last 2 columns ignoring the delimiter. For example (pipes are fields):
15|"Doe| John"|IS2,Alabama
So the failure happens when the last half of a field is too large to fit into the next available field. In the case above _John" is 6 characters where the IS2 field is char(3). This would cause a truncation failure, which is the error I receive from the
job history.
To further test this I created a failure flow in my data flow to capture the records failing to be pulled from the source csv. Out of ~5000 records, ~1200 are failing, and every one of the failures has a comma delimiter inside the quoted text with a 'split'
length greater than the next ordinal field. The ones without the comma were inserted as normal and records where the split fit into the next ordinal column where also imported, with the last field being concatenated w/delimiter. Example records when selected
from table:
25|"Allan Percone Trucking"|GI6|California --Imported as intended
36|"Renolds| J."|UI6,Colorado --Field position offset by 1 to right - Last field overloaded
To further ensure this is the problem, I changed the csv file and flat file connection manager to pipe delimited, and sure enough every record makes it in perfectly from the SQL job execution.
I've tried comma delimited on the following set ups. Each set up failed.
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 100
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 100
Since a lot of our data comes in via comma delimited flat files, I really need a fix for this. If not I'll have to rebuild all my files when I import them to use pipe delimiters instaed of commas. I'd like to avoid the extra processing overhead if possible.
Also, is this a known bug? If so can someone point me to the posting of said bug?
Edit: I can't ask the vendor to change the format to pipe delimited because it comes from a federal government site.
Just wanted to share my experience of this for anyone else, since I wasted a morning on it today.
I encountered the same problem where I could run the package fine on my machine, but when I deployed to a server and ran the package via dtexec, the " delimiter was being replaced with _x0022_ and the columns were all slurped up together, overflowing columns/truncating data etc.
Since I didn't want to manually hack the DTS XML and can't install updates, the solution I used was to set an expression on the textdelimiter field of the flat file connection with the value "\"" (a double quote). That way, even if the one stored in XML
gets bodged somewhere along the way, it is overridden with the correct value at run time. The package works fine everywhere now. -
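As a sanity check of what correct qualifier handling should produce, here is the sample row run through Python's csv module, which treats the double quote as a text qualifier the same way the flat file connection manager is supposed to:

```python
import csv
import io

# The csv reader honors the double-quote text qualifier, so the comma
# inside "Doe, John" stays within one field instead of splitting it.
sample = ('15,"Doe, John",IS2,Alabama\r\n'
          '25,"Allan Percone Trucking",GI6,California\r\n')
rows = list(csv.reader(io.StringIO(sample)))
print(rows[0])  # ['15', 'Doe, John', 'IS2', 'Alabama']
print(len(rows[0]), len(rows[1]))  # 4 4
```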
Comma delimited file in application server.
How to extract a comma-delimited file from an internal table to the SAP application server?
Hi Arun,
Can you be a bit more clearer? Are you uploading the dat to the appl server or downloading?
1) Comma separated info from itab to file server.
loop at itab.
concatenate itab-field1
itab-field2
itab-field3
into itab_new-data separated by ','.
append itab_new.
clear itab_new.
endloop.
open dataset dsn for output in text mode.
if sy-subrc = 0.
loop at itab_new.
transfer itab_new to dsn.
endloop.
endif.
close dataset.
2) Comma separated info from file to itab.
open dataset dsn for input in text mode.
if sy-subrc = 0.
do.
read dataset dsn into itab-data.
if sy-subrc <> 0.
exit.
else.
append itab.
clear itab.
endif.
enddo.
endif.
close dataset.
loop at itab.
split itab-data at ',' into itab_new-field1
itab_new-field2
itab_new-field3.
append itab_new.
clear itab_new.
endloop.
Regards,
Ravi -
Ssrs 2012 export to comma delimited (csv) file problem
In an ssrs 2012 report, I want to be able to export all the data to a csv (comma delimited) file that only contains the detailed row information. I do not want to export any rows that contain header data information.
Right now the export contains header row information and detail row information. The header row information exists for the PDF exports.
To have the header row not exist on CSV exports, I have tried the following options on the tablix with the header row information with no success:
1. = iif(Mid(Globals!RenderFormat.Name,1,5)<>"EXCEL",false,true),
2. = iif(Mid(Globals!RenderFormat.Name,1,3)="PDF",false,true),
3. The textboxes that contain the column headers, the dataelelementoutput=auto and
the text boxes that contain the actual data, I set the dataelelementoutput=output.
Basically I need the tablix that contains the header information not to appear when the export is for a csv file.
However when the export is for the detail lines for the csv export, I need that tablix to show up.
Thus can you tell me and/or show me code on how to solve this problem?
Hi Wendy,
Based on my research, the expression posted by Ione used to hide items only work well when render format is RPL or MHTML. Because only the two render format’s RenderFormat.IsInteractive is true.
In your scenario, you want to hide tablix header only when export the report to csv format. While CSV file uses the textbox name in detail section as the column name, not the table header when you view the table. So if we want to hide the header row of the
tablix, please refer to the steps below to enable the “NoHeader” setting in the RSReportserver.config file:
Please navigate to RSReportserver.config file: <drive:>\Program Files\Microsoft SQL Server\MSRS1111.MSSQLSERVER\Reporting Services\ReportServer \RSReportserver.config.
Backup the RSReportserver.config file before we modify it, open the RSReportserver.config file with Notepad format.
In the <Render> section, add the new code for the CSV extension like this:
<Extension Name="CSV" Type="Microsoft.ReportingServices.Rendering.DataRenderer.CsvReport,Microsoft.ReportingServices.DataRendering">
<Configuration>
<DeviceInfo>
<NoHeader>True</NoHeader>
</DeviceInfo>
</Configuration>
</Extension>
Save the RSReportserver.config file.
For more information about CSV Device Information Settings, please see:
http://msdn.microsoft.com/en-us/library/ms155365(v=sql.110)
If you have any other questions, please feel free to ask.
Thanks,
Katherine xiong
If you have any feedback on our support, please click here.
Katherine Xiong
TechNet Community Support -
Using a comma-delimited string in Dynamic SQL
Hi --
If I receive a comma-delimited string as an in parameter, can I simply use that (in string format) when building my dynamic sql?
Thanks,
Christine
The problem is that you cannot use bind variables here, only literals. This causes eventual performance problems.
And to avoid the inevitable database performance problems Dmytro mentions, you can use a function to convert the string to a collection and select from that. This also avoids having to use dynamic SQL.
First you create a collection type and a conversion function.
SQL> create or replace type tabstr_t as table of varchar2(255)
2 /
Type created.
SQL> create or replace function tabstr (
2 p_str in varchar2,
3 p_sep in varchar2 default ','
4 )
5 return tabstr_t
6 as
7 l_str long default p_str || p_sep;
8 l_tabstr tabstr_t := tabstr_t();
9 begin
10 while l_str is not null loop
11 l_tabstr.extend(1);
12 l_tabstr(l_tabstr.count) := rtrim(substr(
13 l_str,1,instr(l_str,p_sep)),p_sep);
14 l_str := substr(l_str,instr(l_str,p_sep)+1);
15 end loop;
16 return l_tabstr;
17 end;
18 /
Function created.
Then you can use these in regular SQL.
SQL> var s varchar2(100)
SQL> exec :s := 'Smith,Scott,Miller'
PL/SQL procedure successfully completed.
SQL>
SQL> select * from emp where ename in
2 (select upper(column_value) from table(tabstr(:s)));
EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800 20
7788 SCOTT ANALYST 7566 09-DEC-82 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10
Or in PL/SQL.
SQL> var c refcursor
SQL> begin
2 open :c for
3 select * from emp where ename in
4 (select upper(column_value) from table(tabstr(:s)));
5 end;
6 /
PL/SQL procedure successfully completed.
SQL> print c
EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800 20
7788 SCOTT ANALYST 7566 09-DEC-82 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10 -
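The same split-then-bind idea works in any client language. A sketch in Python with SQLite (hypothetical emp data): split the comma-delimited string client-side and generate one bind placeholder per element, which keeps the statement fully parameterized just like the TABLE() approach above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?)",
                 [(7369, "SMITH"), (7788, "SCOTT"),
                  (7934, "MILLER"), (7839, "KING")])

names = "Smith,Scott,Miller"
# One placeholder per list element: the query stays bind-variable
# friendly and is immune to SQL injection via the input string.
elems = [e.upper() for e in names.split(",")]
placeholders = ",".join("?" * len(elems))
rows = conn.execute(
    f"SELECT empno, ename FROM emp "
    f"WHERE ename IN ({placeholders}) ORDER BY empno",
    elems).fetchall()
print(rows)  # [(7369, 'SMITH'), (7788, 'SCOTT'), (7934, 'MILLER')]
```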
Passing data to different internal tables with different columns from a comma delimited file
Hi,
I have a program wherein we upload a comma delimited file and based on the region( we have drop down in the selection screen to pick the region). Based on the region, the data from the file is passed to internal table. For region A, we have 10 columns and for region B we have 9 columns.
There is a split statement (split at comma) used to break the data into different columns.
I need to add hard error messages if the no. of columns in the uploaded file is incorrect. For example, if the uploaded file is of type region A, then the uploaded file should be split into 10 columns. If the file contains fewer or more columns, then an error message should be added. Similar is the case with region B.
I do not want to remove the existing split statement(existing code). Is there a way I can exactly pass the data into the internal table accurately? I have gone through some posts where in they have made use of the method cl_alv_table_create=>create_dynamic_table by passing the field catalog. But I cannot use this as I have two different internal tables to be populated based on the region. Appreciate help on this.
Thanks,
Pavan
Hi Abhishek,
I have no issues with the rows. I have a file with format like a1,b1,c1,d1,e1, the file should be uploaded and split at comma. So far its fine. After this, if the file is related to region A say Asia, then it should have 5 fields( as an example). So, all the 5 values a1,b1..e1 will be passed to 5 fields of itab1.
I also have region B( say Europe) whose file will have only 4 fields. So, file is of the form a2,b2,c2,d2. Again data is split at comma and passed to itab2.
If some one loads file related to Asia and the file has only 4 fields then the data would be incorrect. Similar is the case when someone tries to load Europe file with 5 fields related data. To avoid this, I want to validate the data uploaded. For this, I want to count the no. of fields (seperated by comma). If no. of fields is 5 then the file is related to Asia or if no. of fields is 4 then it is Europe file.
Well, the no. of commas is nothing but the no. of fields - 1. If the file is of the form a1,b1..e1, then I can say: if the no. of commas = 4, it is the Asia file. But I am not sure how to write the code for this. Please advise.
Thanks,
Pavan -
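The counting rule Pavan describes (commas = fields - 1) is straightforward to express. A sketch in Python with assumed region names and field counts (substitute the real regions A and B and their column counts):

```python
def validate_region_line(line, region):
    """Check the field count of one comma-delimited line.

    Assumed counts: 'ASIA' expects 5 fields, 'EUROPE' expects 4 --
    hypothetical values standing in for regions A and B.
    """
    expected = {"ASIA": 5, "EUROPE": 4}[region]
    n_fields = line.count(",") + 1   # no. of commas = no. of fields - 1
    if n_fields != expected:
        raise ValueError(
            f"{region}: expected {expected} fields, got {n_fields}")
    return line.split(",")

print(validate_region_line("a1,b1,c1,d1,e1", "ASIA"))
# ['a1', 'b1', 'c1', 'd1', 'e1']
```

Validating the count before the existing SPLIT statement leaves the current code path untouched.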
Wrapping A SqlBulkCopy Class To Export A Proper Comma Delimited Table
Hello!
If anyone can point me in the right direction it would be greatly appreciated!
In my search, it seems SQL Server cannot export tables to a proper comma delimited format. I have not found a out of the box solution for SQL Server to properly escape commas and double quotes within fields on a record.
Which brings me here. I'm searching for a way to leverage the SqlBulkCopy class (or an alternative) to quickly export a record's values into an array or list so I can feed it into my custom DelimitedStringBuilder class.
So far it seems the out of the box SqlBulkCopy class only supports import into other tables and not a text file.
One thought (I have no idea how I would implement it) would be to attach a DataReader class to the SqlBulkCopy class and then pass the values from the DataReader to my DelimitedStringBuilder class.
Has anyone been able to write something like this? Or is there a simpler solution?
Have you tried just using SQL and ExecuteNonQuery? Below is an example:
Dim TextConnection As New System.Data.OleDb.OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;" & _
"Data Source=" & "C:\Documents and Settings\...\My Documents\My Database\Text" & ";" & _
"Extended Properties=""Text;HDR=NO;""")
TextConnection.Open()
'New table
Dim TextCommand As New System.Data.OleDb.OleDbCommand("SELECT * INTO [Orders#csv] FROM [Orders] IN '' [ODBC;Driver={SQL Server Native Client 11.0};Server=.\SQLExpress;Database=Northwind;Trusted_Connection=yes];", TextConnection)
TextCommand.ExecuteNonQuery()
TextConnection.Close()
Paul ~~~~ Microsoft MVP (Visual Basic) -
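For the escaping itself, most CSV libraries already implement the standard quoting rules: wrap any field containing the delimiter or the qualifier in double quotes, and double any embedded quotes. A sketch of the expected output format using Python's csv module, with made-up rows:

```python
import csv
import io

rows = [
    (15, "Doe, John", "IS2"),
    (16, 'Quote "Q" Corp', "GI6"),
]
buf = io.StringIO()
# QUOTE_MINIMAL quotes only the fields that need it; embedded double
# quotes are escaped by doubling them, per the usual CSV convention.
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
writer.writerows(rows)
print(buf.getvalue())
# 15,"Doe, John",IS2
# 16,"Quote ""Q"" Corp",GI6
```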
Hi,
What are the FMs that I can use to read comma delimited files. Also, will it be able to run in the background?
Thanks,
RT
Hi Rob,
As far as I know, we cannot upload data from the
presentation server in the background. For that, the file
needs to be placed on the application server; then use the OPEN DATASET command.
Below is just an example to help you have a feel of the
same.
eg:
types: begin of t_data,
         vbeln like vbak-vbeln,
         posnr like vbap-posnr,
         matnr like vbap-matnr,
         menge like vbap-menge,
       end of t_data.
data: it_data type standard table of t_data.
data: wa_data type t_data.
data: l_content(100) type c.
open dataset p_file for input in text mode.
if sy-subrc ne 0.
*** error reading file.
else.
do.
read dataset p_file into l_content.
if sy-subrc ne 0.
close dataset p_file.
exit.
else.
split l_content at ',' into wa_data-vbeln
wa_data-posnr wa_data-matnr wa_data-menge.
append wa_data to it_data.
endif.
enddo.
endif.
Hope this helps you.
Kind Regards
Eswar -
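Eswar's OPEN DATASET / TRANSFER / SPLIT pattern, round-tripped in Python for comparison (hypothetical file name, with the field names borrowed from the snippet above):

```python
import csv
import os
import tempfile

# Write a comma-delimited file, then read it back and split each line:
# the analogue of OPEN DATASET ... TRANSFER / READ DATASET + SPLIT AT ','.
path = os.path.join(tempfile.gettempdir(), "orders.csv")  # hypothetical file
with open(path, "w", newline="") as f:
    csv.writer(f).writerows([("0001", "10", "MAT-A", "5"),
                             ("0002", "20", "MAT-B", "7")])

records = []
with open(path, newline="") as f:
    for vbeln, posnr, matnr, menge in csv.reader(f):
        records.append({"vbeln": vbeln, "posnr": posnr,
                        "matnr": matnr, "menge": int(menge)})
print(records[0])
```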
How to read and parse a comma delimited file? Help
Hi, does anyone know to read and parse a comma delimited file?
Should I use StreamTokenizer, StringTokenizer, or the oro RegEx packages?
What is the best?
I have a file that has several lines of data that is double-quoted and comma delimited like:
"asdfadsf", "asdfasdfasfd", "asdfasdfasdf", "asdfasdf"
"asdfadsf", "asdfasdfasfd", "asdfasdfasdf", "asdfasdf"
Any help would be greatly appreciated.
thanks,
Spack
import java.util.*;
import java.io.*;

public class ResourcePortalParser {

    public Vector tokenize() throws IOException {
        File reportFile = new File("C:\\Together5.5\\myprojects\\untitled2\\accessFile.txt");
        Vector tokenVector = new Vector();
        StreamTokenizer tokenized = new StreamTokenizer(new FileReader(reportFile));
        tokenized.eolIsSignificant(true);
        while (tokenized.nextToken() != StreamTokenizer.TT_EOF) {
            switch (tokenized.ttype) {
                case StreamTokenizer.TT_WORD :
                    System.out.println("Adding token - " + tokenized.sval);
                    tokenVector.addElement(tokenized.sval);
                    break;
            }
        }
        return tokenVector;
    }
}
Suppliers is a field containing a comma delimited list of
Supplier ID's.
When a supplier logs in they should be able to view all the
auctions that they have been registered for
i.e if their supplierID is in the suppliers field.
have tried this and get an error:
<CFQUERY NAME="GetAuctions"
DATASOURCE="#Application.Datasource#">
SELECT * FROM Auctions
WHERE '#Session.SupplierID#' IN 'Auctions.Suppliers'
</CFQUERY>
have tried this and recordcount is 0 when it should be 3:
<CFQUERY NAME="GetAuctions"
DATASOURCE="#Application.Datasource#">
SELECT * FROM Auctions
WHERE '#Session.SupplierID#' LIKE 'Auctions.Suppliers'
</CFQUERY>
You should avoid having a list value in a field and normalise
your table. But if you want to stick with your style (which is not
advisable), maybe you can do this. I believe your supplier id is a
string, so the code below may cause slowness in your system:
<CFQUERY NAME="GetAuctions1"
DATASOURCE="#Application.Datasource#">
SELECT Suppliers FROM Auctions
</CFQUERY>
<cfoutput query="GetAuctions1">
<CFQUERY NAME="GetAuctions2"
DATASOURCE="#Application.Datasource#">
SELECT * FROM Auctions
WHERE '#Session.SupplierID#' IN(<cfqueryparam
value="#Suppliers#" cfsqltype="CF_SQL_VARCHAR" list="Yes">)
</CFQUERY>
</cfoutput>
But if your supplier id is a numeric value. then you can do
this:
<CFQUERY NAME="GetAuctions"
DATASOURCE="#Application.Datasource#">
SELECT A1.* FROM Auctions A1
WHERE #Session.SupplierID# IN(SELECT A2.Suppliers FROM
Auctions A2 WHERE A2.your_primary_key_for_table_Auctions =
A1.your_primary_key_for_table_Auctions)
</CFQUERY> -
Creating a comma delimited flat file
Hi,
I am trying to create comma-delimited flat files from large Oracle 8.0.6 tables. What is the best method? It seems like using the spool command is very inefficient. Can sqlldr be used?
Todd,
This looks very helpful -- thanks! I need to unload very large tables and this should do the trick.
Jeremiah