Comma delimited file
Could someone help me with how to configure a file adapter? I have 5 fields in the incoming file and they are separated by commas.
Thanks.
Thanks Michal. Another question related to the file.
Let us say I have 5 fields and the lengths of the fields are given. In this case, how do we configure the sender file adapter?
Field     Length    Start position    End position
Field 1   10        1                 10
Field 2   5         11                15
Field 3   2         16                17
Field 4   3         18                20
Field 5   15        21                35
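As far as I recall, a fixed-length layout like this is handled in the sender file adapter via File Content Conversion, with the field lengths listed in a fieldFixedLengths parameter (here 10,5,2,3,15) and the names in fieldNames; treat those parameter names as from memory. The slicing the adapter performs can be sketched in a few lines, e.g. to sanity-check a test file:

```python
# Sketch: slicing one fixed-length record using the offsets from the table above.
# Start/end positions in the table are 1-based inclusive, so convert to 0-based slices.
FIELDS = [
    ("field1", 1, 10),   # length 10
    ("field2", 11, 15),  # length 5
    ("field3", 16, 17),  # length 2
    ("field4", 18, 20),  # length 3
    ("field5", 21, 35),  # length 15
]

def parse_fixed(record):
    """Return field name -> stripped value for one 35-character record."""
    return {name: record[start - 1:end].strip() for name, start, end in FIELDS}
```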
Similar Messages
-
Comma delimited file in application server.
How to extract a comma delimited file from an internal table into the SAP application server.
Hi Arun,
Can you be a bit clearer? Are you uploading the data to the application server or downloading?
1) Comma separated info from itab to file server.
loop at itab.
  concatenate itab-field1
              itab-field2
              itab-field3
         into itab_new-data separated by ','.
  append itab_new.
  clear itab_new.
endloop.

open dataset dsn for output in text mode.
if sy-subrc = 0.
  loop at itab_new.
    transfer itab_new to dsn.
  endloop.
endif.
close dataset dsn.
2) Comma separated info from file to itab.
open dataset dsn for input in text mode.
if sy-subrc = 0.
  do.
    read dataset dsn into itab-data.
    if sy-subrc <> 0.
      exit.
    else.
      append itab.
      clear itab.
    endif.
  enddo.
endif.
close dataset dsn.

loop at itab.
  split itab-data at ',' into itab_new-field1
                              itab_new-field2
                              itab_new-field3.
  append itab_new.
  clear itab_new.
endloop.
Regards,
Ravi -
Passing data to different internal tables with different columns from a comma delimited file
Hi,
I have a program wherein we upload a comma delimited file and, based on the region (we have a drop-down on the selection screen to pick the region), pass the data from the file to an internal table. For region A we have 10 columns and for region B we have 9 columns.
There is a split statement (split at comma) used to break the data into different columns.
I need to add hard error messages if the number of columns in the uploaded file is incorrect. For example, if the uploaded file is of type region A, then it should split into 10 columns; if the file contains fewer or more columns, an error message should be raised. Similarly for region B.
I do not want to remove the existing split statement (existing code). Is there a way I can pass the data into the internal table accurately? I have gone through some posts where they made use of the method cl_alv_table_create=>create_dynamic_table by passing the field catalog, but I cannot use this as I have two different internal tables to be populated based on the region. Appreciate help on this.
Thanks,
Pavan
Hi Abhishek,
I have no issues with the rows. I have a file with a format like a1,b1,c1,d1,e1; the file should be uploaded and split at the commas. So far it's fine. After this, if the file is related to region A, say Asia, then it should have 5 fields (as an example), so all 5 values a1,b1..e1 will be passed to the 5 fields of itab1.
I also have region B (say Europe) whose file will have only 4 fields, so the file is of the form a2,b2,c2,d2. Again the data is split at the commas and passed to itab2.
If someone loads a file related to Asia and the file has only 4 fields, the data would be incorrect. Similarly when someone tries to load a Europe file with 5 fields of data. To avoid this, I want to validate the uploaded data by counting the number of fields (separated by commas). If the number of fields is 5 then the file is related to Asia; if it is 4 then it is a Europe file.
Well, the number of commas is nothing but the number of fields - 1. If the file is of the form a1,b1..e1 then I can say: if the number of commas = 4 then it is an Asia file. But I am not sure how to write the code for this. Please advise.
Thanks,
Pavan -
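In case it helps a later reader: the comma-count check described above (number of fields = number of commas + 1) is a one-liner. The region names and expected counts here are just the ones from Pavan's example:

```python
# Expected field counts per region, taken from the example in the thread.
EXPECTED_FIELDS = {"Asia": 5, "Europe": 4}

def validate_line(region, line):
    """True if the line's comma-separated field count matches the region."""
    # number of commas = number of fields - 1
    return line.count(",") + 1 == EXPECTED_FIELDS[region]
```

In ABAP the same count can be obtained with FIND ALL OCCURRENCES OF ',' IN the line with MATCH COUNT, then compared against the expected value before the existing SPLIT runs.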
Hi,
What are the FMs that I can use to read comma delimited files? Also, will it be able to run in the background?
Thanks,
RT
Hi Rob,
As far as I know, we cannot upload data from the presentation server in background. For that, the file needs to be placed on the application server and read with the OPEN DATASET command.
Below is just an example to give you a feel for it.
eg:
types: begin of t_data,
         vbeln like vbak-vbeln,
         posnr like vbap-posnr,
         matnr like vbap-matnr,
         menge like vbap-menge,
       end of t_data.

data: it_data type standard table of t_data.
data: wa_data type t_data.
data: l_content(100) type c.

open dataset p_file for input in text mode.
if sy-subrc ne 0.
*** error reading file.
else.
  do.
    read dataset p_file into l_content.
    if sy-subrc ne 0.
      close dataset p_file.
      exit.
    else.
      split l_content at ',' into wa_data-vbeln
                                  wa_data-posnr wa_data-matnr wa_data-menge.
      append wa_data to it_data.
    endif.
  enddo.
endif.
Hope this helps you.
Kind Regards
Eswar -
How to read and parse a comma delimited file? Help
Hi, does anyone know how to read and parse a comma delimited file?
Should I use StreamTokenizer, StringTokenizer, or the oro RegEx packages?
What is the best?
I have a file that has several lines of data that is double-quoted and comma delimited like:
"asdfadsf", "asdfasdfasfd", "asdfasdfasdf", "asdfasdf"
"asdfadsf", "asdfasdfasfd", "asdfasdfasdf", "asdfasdf"
Any help would be greatly appreciated.
thanks,
Spack

import java.util.*;
import java.io.*;

public class ResourcePortalParser {

    public Vector tokenize() throws IOException {
        File reportFile = new File("C:\\Together5.5\\myprojects\\untitled2\\accessFile.txt");
        Vector tokenVector = new Vector();
        StreamTokenizer tokenized = new StreamTokenizer(new FileReader(reportFile));
        tokenized.eolIsSignificant(true);
        while (tokenized.nextToken() != StreamTokenizer.TT_EOF) {
            switch (tokenized.ttype) {
                case '"':                       // quoted strings arrive with ttype == '"', not TT_WORD
                case StreamTokenizer.TT_WORD:
                    System.out.println("Adding token - " + tokenized.sval);
                    tokenVector.addElement(tokenized.sval);
                    break;
            }
        }
        return tokenVector;
    }
}
-
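If the goal is just the double-quoted, comma delimited lines shown in the question, a quote-aware CSV reader is simpler than a tokenizer. A small sketch (Python here, purely to illustrate the parsing behavior; Java CSV libraries do the same):

```python
import csv
import io

def parse_quoted_line(line):
    """Split one double-quoted, comma delimited line into its fields."""
    # skipinitialspace drops the blank after each comma in the sample data
    reader = csv.reader(io.StringIO(line), skipinitialspace=True)
    return next(reader)
```

For whole files, pass the open file object to the CSV reader directly instead of feeding it one line at a time.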
Is it possible to download a comma delimited file from ABAP to a Portal
directory or Web Dev URL on the Portal?
We currently have a comma delimited file being emailed nightly using standard SAP FMs to a distribution list.
We want to move these files to the Portal instead.
Speaking with our Portal guys, they say the easiest way for them to receive the file is if I can download directly to a Web Dev URL.
Does SAP provide a way to communicate from our app server to the portal?
thanks alot,
rp.
Is it possible to verify a signed jar file from a program (using some API), the way jarsigner does?
Hi,
You would have to open the jarfile, read each jar entry and for each of them do a getCertificates() and then in turn verify each certificate with the public key of the enclosed certificates in the jar file.
An easier solution would be to use the verify flag of the JarFile or JarInputStream.
Hope it helps..
Cheers,
Vijay -
Is it possible to download a comma delimited file to..
a Web Dev URL on the SAP Portal?
We currently have a comma delimited file being emailed nightly using standard SAP FMs to a distribution list.
We want to move these files to the Portal instead.
Speaking with our Portal guys, they say the easiest way for them to receive the file is if I can download directly to a Web Dev URL.
Does SAP provide a way to communicate from our app server to the portal?
thanks alot,
rp.
Are you trying to convert a list of your forms or a list of the responses to your forms?
Andrew -
My Firefox v6.02 browser always opens *.csv (comma delimited) files in another tab rather than downloading them. I do not have a right-click option to save the file. My OS is Windows XP SP2.
Hi mha007,
Go to this site "http://www.nseindia.com/content/indices/ind_histvalues.htm"
Select any index type and from the dates select just one day or any number of days. Click the "Get Details" button. This will open tabular data on the loaded page (with a CSV file link at the bottom).
When I click this link, it opens the CSV file in a new tab (in the Firefox browser) and does not open the download window (unlike other file formats such as *.zip, *.rar or *.xls).
Right-clicking the link does not give a "save as" or "save file" option.
Firefox is my preferred and default browser.
========
Surprisingly, when I do the same thing in the IE6 browser, the data-table page shows 2 options below the table: (1) download file, (2) open file. -
Import Format for Separate Debit and Credit Columns (comma delimited file)
I have a file that is comma delimited with a sparate debit column and a separate credit column:
Sample:
Ent,Acct,description,Dr,Cr
,1000,test,100,
,110010,another test,,100
My import format is this:
Comma delimited
Field      Field Number    Number of Fields    Expression
Entity     1               5                   SGB_ABC
Account    2               5
Amount     5               5                   Script=DRCRSplit_Comma_Del.uss
I've tried writing the following script to pull the amount from the debit column into the credit column but it just skips the lines with no amount in field 5.
'Declare Variables
Dim Field4
Dim Field5
'Retrieve Data
Field4 = DW.Utilities.fParseString(strRecord, 4, 4, ",")
Field5 = DW.Utilities.fParseString(strRecord, 5, 5, ",")
'If Field4 has a value then fld 5 is to equal fld 4.
If IsNumeric(Field5) Then
Field5 = Field5
Else
Field5 = Field4
End If
'Return Result into FDM
SQFLD5 = Field5
End Function
Can anyone tell me what I am doing wrong?
Thanks!
I got it to work using this script:
Function DRCR_Comma_Del(strField, strRecord)
    'Hyperion FDM DataPump Import Script:
    'Created By: testuser
    'Date Created: 7/22/2009 9:31:15 AM
    'Purpose: If the amount is in the DR column, move it to the CR amount column.
    Dim DR
    Dim CR
    DR = DW.Utilities.fParseString(strRecord, 5, 4, ",")
    CR = DW.Utilities.fParseString(strRecord, 5, 5, ",")
    If IsNumeric(DR) Then
        strField = DR
    Else
        strField = "-" & CR
    End If
    DRCR_Comma_Del = strField
End Function
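For what it's worth, the logic of the working script above boils down to: take the debit if it is numeric, otherwise return the credit negated. A language-neutral sketch (Python, using the sample rows from the question; the numeric check is simplified compared to VBScript's IsNumeric):

```python
def drcr_split(record):
    """Signed amount from an 'Ent,Acct,description,Dr,Cr' record.

    Mirrors DRCR_Comma_Del above: a numeric debit (field 4) wins;
    otherwise the credit (field 5) is returned with a leading minus.
    """
    fields = [f.strip() for f in record.split(",")]
    dr, cr = fields[3], fields[4]
    # simplified numeric test: digits with at most one decimal point
    if dr != "" and dr.replace(".", "", 1).isdigit():
        return dr
    return "-" + cr
```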
Regular Expression to Search Comma Delimited File for any of 3 Values
Hi,
I'd like to parse a table column that contains a comma delimited string for any of 3 values, 1200, 1400, 1600 just to see if they're present using Regexp_instr. If someone has an expression available please pass it along.
Thanks,
Victor
Or you could do it like this too...
SQL> ed
Wrote file afiedt.buf
1 with t as (select 1 as id, '1000,2000,3000' as txt from dual union all
2 select 2, '1200,1300,1400' from dual union all
3 select 3, '1000,1300,1600' from dual)
4 -- end of test data
5 select *
6 from t
7* where regexp_like(txt,'(^|,)(1200|1400|1600)(,|$)')
SQL> /
ID TXT
2 1200,1300,1400
3 1000,1300,1600 -
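The same anchored pattern carries over outside SQL; the (^|,) and (,|$) groups are what stop 1200 from matching inside a longer token such as 11200. A quick sketch:

```python
import re

# Same pattern as in the REGEXP_LIKE above: the target must be a whole
# comma-delimited token, not a substring of a longer value.
PATTERN = re.compile(r"(^|,)(1200|1400|1600)(,|$)")

def contains_target(txt):
    """True if the comma delimited string contains 1200, 1400 or 1600."""
    return PATTERN.search(txt) is not None
```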
I've just received my iPad, still struggling to get to grips with it, trying to import my contacts from my PC into iCloud.... does anybody have any suggestions on how to bulk-import?
Thanks
Thanks - that's exactly what I'm trying to do!
It's step 2 that I'm having trouble with.
I could replicate the job of something like (for example) JDOM by receiving all events under the root and creating JDOM elements from the contents, but I was hoping to be able to use some library that did this for me (for example: jdom's SAXBuilder class).
The other side effect of this method is that you drive things from a SAX parser and let the document handler do the application work. Unfortunately, re-engineering the application to work from within a DocumentHandler is probably not an option.
But thank you anyway :) -
SSRS 2012 export to comma delimited (csv) file problem
In an ssrs 2012 report, I want to be able to export all the data to a csv (comma delimited) file that only contains the detailed row information. I do not want to export any rows that contain header data information.
Right now the export contains header row information and detail row information. The header row information exists for the PDF exports.
To have the header row not exist on CSV exports, I have tried the following options on the tablix with the header row information with no success:
1. = iif(Mid(Globals!RenderFormat.Name,1,5)<>"EXCEL",false,true),
2. = iif(Mid(Globals!RenderFormat.Name,1,3)="PDF",false,true),
3. On the textboxes that contain the column headers I set DataElementOutput=Auto, and
on the textboxes that contain the actual data I set DataElementOutput=Output.
Basically I need the tablix that contains the header information not to appear when the export is for a csv file.
However when the export is for the detail lines for the csv export, I need that tablix to show up.
Thus can you tell me and/or show me code on how to solve this problem?
Hi Wendy,
Based on my research, the expression posted by Ione to hide items only works well when the render format is RPL or MHTML, because only those two render formats have RenderFormat.IsInteractive set to true.
In your scenario, you want to hide the tablix header only when exporting the report to CSV format. A CSV file uses the textbox names in the detail section as the column names, not the table header you see when viewing the table. So if we want to hide the header row of the tablix, please refer to the steps below to enable the "NoHeader" setting in the RSReportserver.config file:
Please navigate to the RSReportserver.config file: <drive:>\Program Files\Microsoft SQL Server\MSRS1111.MSSQLSERVER\Reporting Services\ReportServer\RSReportserver.config.
Back up the RSReportserver.config file before modifying it, then open it with Notepad.
In the <Render> section, add the new code for the CSV extension like this:
<Extension Name="CSV" Type="Microsoft.ReportingServices.Rendering.DataRenderer.CsvReport,Microsoft.ReportingServices.DataRendering">
    <Configuration>
        <DeviceInfo>
            <NoHeader>True</NoHeader>
        </DeviceInfo>
    </Configuration>
</Extension>
Save the RSReportserver.config file.
For more information about CSV Device Information Settings, please see:
http://msdn.microsoft.com/en-us/library/ms155365(v=sql.110)
If you have any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support -
The selected file does not appear to be a valid comma separated values (csv) file or a valid tab delimited file. Choose a different file.
I guess your question is, "what's wrong with the file?"
You're going to have to figure that out yourself, as we cannot see the file.
Importing into Address Book requires either a tab-delimited or a comma-delimited file. You can export out of most spreadsheets into a csv file. However, you need to make sure you clean up the file first. If a field contains commas, each comma will start a new field, so some lines will have more fields than others, causing issues like the error you saw. -
I've created an SSIS package to import a comma delimited file (csv) with double quotes for a text qualifier ("). Some of the fields contain the delimiter inside the qualified text. An example row is:
15,"Doe, John",IS2,Alabama
In SSIS I've specified the text qualifier as ". The sample output in the connection manager looks great. The package runs perfectly from VS and when manually executed on the SSIS server itself. The problem comes when I schedule the package to run via SQL
job. At this point the package ignores the text qualifier, and in doing so pushes half of a field into the next available column. But instead of having too many columns, it concatenates the last 2 columns ignoring the delimiter. For example (pipes are fields):
15|"Doe| John"|IS2,Alabama
So the failure happens when the last half of a field is too large to fit into the next available field. In the case above _John" is 6 characters where the IS2 field is char(3). This would cause a truncation failure, which is the error I receive from the
job history.
To further test this I created a failure flow in my data flow to capture the records failing to be pulled from the source csv. Out of ~5000 records, ~1200 are failing, and every one of the failures has a comma delimiter inside the quoted text with a 'split' length greater than the next ordinal field. The ones without the comma were inserted as normal, and records where the split fit into the next ordinal column were also imported, with the last field being concatenated with the delimiter. Example records when selected from the table:
25|"Allan Percone Trucking"|GI6|California --Imported as intended
36|"Renolds| J."|UI6,Colorado --Field position offset by 1 to right - Last field overloaded
To further ensure this is the problem, I changed the csv file and flat file connection manager to pipe delimited, and sure enough every record makes it in perfectly from the SQL job execution.
I've tried comma delimited on the following set ups. Each set up failed.
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 80
SSIS Server 2008 R2 RTM
DB Server 2008 RTM
DB Compat 100
SSIS Server 2008 R2 RTM
DB Server 2008 R2 RTM
DB Compat 100
Since a lot of our data comes in via comma delimited flat files, I really need a fix for this. If not, I'll have to rebuild all my files when I import them to use pipe delimiters instead of commas. I'd like to avoid the extra processing overhead if possible.
Also, is this a known bug? If so can someone point me to the posting of said bug?
Edit: I can't ask the vendor to change the format to pipe delimited because it comes from a federal government site.
Just wanted to share my experience of this for anyone else, since I wasted a morning on it today.
I encountered the same problem where I could run the package fine on my machine, but when I deployed to a server and ran the package via dtexec, the " delimiter was being replaced with _x0022_ and the columns were all slurped up together, overflowing columns and truncating data etc.
Since I didn't want to manually hack the DTS XML and can't install updates, the solution I used was to set an expression on the textdelimiter field of the flat file connection with the value "\"" (a double quote). That way, even if the one stored in XML
gets bodged somewhere along the way, it is overridden with the correct value at run time. The package works fine everywhere now. -
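To make the failure mode concrete for anyone hitting this thread: ignoring the text qualifier is equivalent to a naive split on every comma, which is exactly what produces the shifted columns described above. A sketch (Python, just to illustrate; nothing SSIS-specific):

```python
import csv
import io

def split_naive(line):
    """What the failing job effectively did: split on every comma."""
    return line.split(",")

def split_qualified(line):
    """What the package should do: honor the double-quote text qualifier."""
    return next(csv.reader(io.StringIO(line)))
```

With the sample row 15,"Doe, John",IS2,Alabama the naive split yields five pieces and shifts every later column one place to the right; the qualified split returns the intended four fields.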
Can't get my head round delimited files
Hey guys, I have been trying to sort this out for ages but I just can't get it to display correctly. Just wondering if anyone could help.
Create a comma delimited file for example:
Smith, Jon
Bloggs, Fred
Jones, Anne
Then simply read each item of data and display it to look like this:
Jon Smith
Fred Bloggs
Anne Jones
I have made a .txt file with the names in, but I can't get them to display like it asks me to. I can get rid of the comma and move the text around, but I can't get it in that order. I have displayed them so that they display on the same line; I'm just having trouble swapping them around.
import java.io.File;
import java.util.Scanner;

public class URLScan {
    public static void main(String[] args) throws Exception {
        Scanner filescan = new Scanner(new File("url.txt"));
        while (filescan.hasNext()) {
            String line = filescan.nextLine();
            // split "Smith, Jon" into surname and first name, then swap
            Scanner urlscan = new Scanner(line);
            urlscan.useDelimiter(",");
            String surname = urlscan.next().trim();
            String first = urlscan.next().trim();
            System.out.println(first + " " + surname);
        }
    }
}