CSV File - Japanese encoding
I am trying to create a CSV (comma-separated values) file from a Java program that fetches Japanese-language data from a database. The characters display as garbage. I tried using Shift_JIS as the encoding, but it didn't work.
Any help is highly appreciated.
Thanks
Ketan Malekar
[email protected]
Are you writing the CSV in UTF-8 encoding? If so, Excel may fail to detect that a CSV file is UTF-8. Try opening the generated CSV with Notepad and let me know the result, so that I can suggest a solution.
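If the garbage comes from writing with the platform default charset (for example via `FileWriter`), the usual fix is to choose the charset explicitly with an `OutputStreamWriter`. A minimal sketch of what the original poster attempted, writing Shift_JIS; the file name and sample row are made up for illustration:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.Charset;

public class SjisCsvWriter {
    public static void main(String[] args) throws IOException {
        // FileWriter uses the platform default charset; an OutputStreamWriter
        // lets us pick the encoding the consumer (e.g. Japanese Excel) expects.
        Charset sjis = Charset.forName("Shift_JIS");
        try (Writer w = new OutputStreamWriter(new FileOutputStream("out.csv"), sjis)) {
            w.write("id,name\r\n");
            w.write("1,日本語\r\n"); // sample Japanese text
        }
    }
}
```

If the consumer is Excel, writing UTF-8 with a leading '\uFEFF' byte order mark instead is often easier for it to detect, as the later threads below discuss.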
Similar Messages
-
BCP queryout -w generating wrong CSV File
I'm trying to generate a CSV file with BCP. My problem is that I have some NVARCHAR columns, so I must use the -w parameter of the bcp utility. The generated CSV file then opens in a single column in Excel. If I create a new text file, copy the content
of the generated CSV into it, and change its extension to .csv, it works and the content opens spread across different columns. Has anyone seen this before? Thanks
SET @File = 'MyQuery.csv'
set @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -w -t "," -T -S ' + convert(varchar,@@ServerName)
exec master..xp_cmdshell @SQL
Hi,
I agree with you. It seems the text was given some text attribute when generated by BCP. When I copy the content to Notepad, I notice each row is enclosed in double quotes. Image:
The double quotes make Excel treat each row as a single unit. If I remove the double quotes and import the data into Excel, it copies the text into separate columns.
1. On the Data tab, in the Get External Data group, click From Text. And follow the instructions in the
Text Import Wizard.
2. Under Original data type, choose Delimited, click Next.
3. Select Tab. I can review the data in the bottom of the dialog box. Then, Next.
Another option is to query the results to a file and save it as a CSV file. Use the Text Import Wizard to import the CSV file into Excel, and select the Space option in step 3.
Thanks.
Tracy Cai
TechNet Community Support
Hi Tracy,
I've found a solution for my problem:
I used the -CACP option to generate the CSV file ANSI-encoded, and it works!
Now my command looks like:
SET @File = 'MyQuery.csv'
set @SQL = 'bcp "SELECT FirstName, LastName, DwellingName FROM Table" queryout "' + @File + '" -c -CACP -t "," -T -S ' + convert(varchar,@@ServerName)
exec master..xp_cmdshell @SQL -
Are there any versions of Excel (Chinese, Japanese, Russian... 2003, 2007, 2010...) that can save CSV files in Unicode (either UTF-8 or UTF-16)?
If not, is the only solution to go with tab-delimited files (the save-as-Unicode-text option)?
Hi Mark,
I have the same problem, trying to save my CSV file in UTF-8 encoding. After several hours of searching, and trying this in my VSTO add-in as well, I got nothing. Saving the file with Excel's Unicode option creates a TAB-separated file. Because I'd like to save the
file from my add-in application, the best approach (for my problem) is to save the file as Unicode tab-delimited and then automatically replace all tabs with commas in the file.
I don't think there is a direct way to save CSV as Unicode in Excel. And I don't understand why. -
CSV file encoded as UTF-8 loses characters when displayed with Excel 2010
Hello everybody,
I have adapted a customer report to be able to send certain data via mail as a CSV attachment.
For that purpose I am using class cl_bcs.
Everything goes fine, but since the mail attachment contains certain German characters such as Ü, those characters appear corrupted when the file is displayed with Excel.
It seems the problem is with Excel, because when opening the same file with Notepad, the Ü is there. If I import the file into Excel with the import wizard, it is correct too.
Anyway, is there any solution to this problem?
I have tried concatenating byte_order_mark_utf8 at the beginning of the file, but Excel still does not recognize it.
Thanks in advance,
Pablo.
Edited by: katathema on Jan 31, 2012 2:05 PM
- Does MS Excel actually support UTF-8?
Yes. I believe we installed some international add-on which is not in the default installation. Anyway, other UTF-8 or UTF-16 files can be opened and viewed by Excel without any problem.
- Have you verified that the file is viewable as a UTF-8-encoded file?
I think so. If I open it in Notepad and choose "Save As", the file type shown is UTF-8.
- Try opening the file in a program you are confident
supports UTF-8 - e.g. Mozilla...
I will try that.
- Check that your UTF-8-encoded file has a UTF-8 signature (the byte order mark U+FEFF, which UTF-8 encodes as EF BB BF)
as the first character
The Unicode-16 (LE or BE) files I got from the internet always have two bytes at the front (0xFEFF or 0xFFFE). My UTF-8 file generated by Java doesn't have that. But should a UTF-8 file also have this kind of special bytes at the front? If I manually add these bytes to the front of my file using UltraEdit and open it in Excel 2000, it doesn't help.
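For reference: in UTF-8 the signature is not the two UTF-16 bytes, but the three bytes EF BB BF (U+FEFF encoded in UTF-8), which is likely why manually prepending the two-byte mark did not help. A small Java sketch (the file name is made up) showing that writing the character U+FEFF through a UTF-8 writer produces exactly those three bytes:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Utf8Bom {
    public static void main(String[] args) throws IOException {
        try (Writer w = new OutputStreamWriter(new FileOutputStream("bom.csv"), StandardCharsets.UTF_8)) {
            // Writing U+FEFF through a UTF-8 writer emits the 3-byte signature EF BB BF.
            w.write('\uFEFF');
            w.write("col1,col2\r\n");
        }
        byte[] b = Files.readAllBytes(Paths.get("bom.csv"));
        // Mask to unsigned before formatting so the hex prints cleanly.
        System.out.printf("%02X %02X %02X%n", b[0] & 0xFF, b[1] & 0xFF, b[2] & 0xFF);
    }
}
```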
- Try using another spreadsheet program that supports UTF-8
Do you know any other spreadsheet program that supports CSV files and UTF-8? -
SQL bulk copy from csv file - Encoding
Hi Experts
This is the first time I am creating a PowerShell script, and it is almost working. I just have some problems with the encoding of the actual bulk import to SQL from the text file, since it replaces
special characters with a question mark. I have set the encoding when creating the CSV file, but that does not seem to carry over to the actual bulk import. I have tried different scenarios with the encoding part, but I cannot find the proper solution.
To shortly outline what the script does:
Connect to Active Directory, fetching all users but excluding users in specific OUs
Export all users to a csv in unicode encoding
Strip double quote text identifiers (if there is another way of handling that it will be much appreciated)
Clear all records temporary SQL table
Import records from csv file to temporary SQL table (this is where the encoding is wrong)
Update existing records in another table based on the records in the temporary table, and insert new records if not found.
The script looks as the following (any suggestions for optimizing the script are very welcome):
# CSV file variables
$path = Split-Path -parent "C:\Temp\ExportADUsers\*.*"
$filename = "AD_Users.csv"
$csvfile = $path + "\" + $filename
$csvdelimiter = ";"
$firstRowColumns = $true
# Active Directory variables
$searchbase = "OU=Users,DC=fabrikam,DC=com"
$ADServer = 'DC01'
# Database variables
$sqlserver = "DB02"
$database = "My Database"
$table = "tblADimport"
$tableEmployee = "tblEmployees"
# Initialize
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
# GET DATA FROM ACTIVE DIRECTORY
# Import the ActiveDirectory Module
Import-Module ActiveDirectory
# Get all AD users not in specified OU's
Write-Host "Retrieving users from Active Directory..."
$AllADUsers = Get-ADUser -server $ADServer `
-searchbase $searchbase -Filter * -Properties * |
?{$_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' `
-and $_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'}
Write-Host "Users retrieved in $($elapsed.Elapsed.ToString())."
# Define labels and get specific user fields
Write-Host "Generating CSV file..."
$AllADUsers |
Select-Object @{Label = "UNID";Expression = {$_.objectGuid}},
@{Label = "FirstName";Expression = {$_.GivenName}},
@{Label = "LastName";Expression = {$_.sn}},
@{Label = "EmployeeNo";Expression = {$_.EmployeeID}} |
# Export CSV file and remove text qualifiers
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
Write-Host "Removing text qualifiers..."
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Write-Host "CSV file created in $($elapsed.Elapsed.ToString())."
# DATABASE IMPORT
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
$batchsize = 50000
# Delete all records in AD import table
Write-Host "Clearing records in AD import table..."
Invoke-Sqlcmd -Query "DELETE FROM $table" -Database $database -ServerInstance $sqlserver
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable and autogenerate the columns
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($firstRowColumns -eq $true) { $null = $reader.readLine()}
Write-Host "Importing to database..."
foreach ($column in $columns) {
$null = $datatable.Columns.Add()
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
$i++; if (($i % $batchsize) -eq 0) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
Write-Host "CSV file imported in $($elapsed.Elapsed.ToString())."
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
# Update tblEmployee with imported data
Write-Host "Updating employee data..."
$queryUpdateUsers = "UPDATE $($tableEmployee)
SET $($tableEmployee).EmployeeNumber = $($table).EmployeeNo,
$($tableEmployee).FirstName = $($table).FirstName,
$($tableEmployee).LastName = $($table).LastName
FROM $($tableEmployee) INNER JOIN $($table) ON $($tableEmployee).UniqueNumber = $($table).UNID
IF @@ROWCOUNT=0
INSERT INTO $($tableEmployee) (EmployeeNumber, FirstName, LastName, UniqueNumber)
SELECT EmployeeNo, FirstName, LastName, UNID
FROM $($table)"
try {
Invoke-Sqlcmd -ServerInstance $sqlserver -Database $database -Query $queryUpdateUsers
Write-Host "Table $($tableEmployee) updated in $($elapsed.Elapsed.ToString())."
}
catch {
Write-Host "An error occurred when updating $($tableEmployee) $($elapsed.Elapsed.ToString())."
}
Write-Host "Script completed in $($elapsed.Elapsed.ToString())."
I can see that Export-Csv exports as ANSI even though the encoding has been set to Unicode. Thanks for leading me in the right direction.
No - it exports as Unicode if set to.
Your export was wrong and is exporting nothing. Look closely at your code:
This line exports nothing in Unicode:
Export-Csv -NoTypeInformation $csvfile -Encoding Unicode -Delimiter $csvdelimiter
There is no input object.
This line converts any file to ANSI:
(Get-Content $csvfile) | foreach {$_ -replace '"'} | Set-Content $csvfile
Set-Content defaults to ANSI, so the output file is converted.
Since you are just dumping into a table by manually building a recordset, why not go direct? You do not need a CSV. Just dump the results of the query to a datatable.
https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
This script dumps to a datatable object which can now be used directly in a bulkcopy.
Here is an example of how easy this is using your script:
$AllADUsers = Get-ADUser -Server $ADServer -SearchBase $searchbase -Filter * -Properties GivenName,SN,EmployeeID,objectGUID |
Where-Object {
$_.DistinguishedName -notmatch 'OU=MeetingRooms,OU=Users,DC=fabrikam,DC=com' -and
$_.DistinguishedName -notmatch 'OU=FunctionalMailbox,OU=Users,DC=fabrikam,DC=com'
} |
Select-Object @{N='UNID';E={$_.objectGuid}},
@{N='FirstName';E={$_.GivenName}},
@{N='LastName';E={$_.sn}},
@{N='EmployeeNo';E={$_.EmployeeID}} |
Out-DataTable
$AllADUsers is now a datatable. You can just upload it.
¯\_(ツ)_/¯ -
Issues while opening a UTF16 encoded .csv file in excel using c#
When I open a UTF-8 encoded .csv file in Excel through my C# code, doing something like this:
m_excel.Workbooks.OpenText(newPath, Comma: true);
that works fine. Can anyone tell me what to do if I have to open a UTF-16 encoded .CSV file similarly? All the options I tried either fail or print incorrect Unicode characters.
So somehow I have to specify the encoding format of the file to be opened (the .CSV) while opening it through the Excel functions, which I am unable to figure out.
Please help me. I am badly stuck here.
Thanks in advance.
Hi Jb9952,
There is an Origin parameter in the Workbooks.OpenText method; you need to specify it. You could try xlWindows (Windows ANSI).
To get the specific file-origin value, you could capture the code through the Record Macro feature in Excel.
Regards
Starain
-
Hi all,
I want some clarification about CSV files. We use a CSV file to upload data from our front-end site to the database (Oracle). First we read data from the CSV file via a Java CSV reader, and the problem starts here. The CSV file may contain some foreign characters (i.e. Chinese, Spanish, Japanese, Hindi, Arabic), so we need to create the CSV file in UTF-8 format. So,
Step 1: we create an Excel file and save it as CSV format
Step 2: we open the same file in EditPlus and change the encoding to UTF-8
Then the file can be read by the Java CSV reader.
Otherwise, suppose I change the second step: if I open the file in Notepad and convert it to UTF-8, the same file is not recognized by the Java CSV reader.
Please give me some idea how to create a CSV file in the UTF-8 charset.
If your input file is in CSV format, try importing it directly into the database using SQL Developer. If you have a table created already, right-click the table in the Connections navigator and select Import Data. On the first page of the wizard, select the correct encoding from the Encoding list; you should see the characters in the file displayed correctly at the bottom of the page. Select the other options like format, delimiters, and line terminators. When these options are specified correctly, you should see the file displayed as rows and columns at the bottom of the screen. Continue with the import using the wizard. If the table is not already created, you can right-click the Tables folder in the Connections navigator. The second page of the wizard will allow you to enter a new table name; the wizard will automatically create a column in the table for each column in the file, and you can refine the column definitions in step 4, the Column Definition panel.
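On the Java side, one plausible reason the Notepad-converted file is "not recognized" is the byte order mark Notepad prepends when saving as UTF-8. A minimal sketch (the file name and sample rows are made up) that reads UTF-8 and strips a leading BOM:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class Utf8CsvReader {
    /** Reads a UTF-8 file line by line, dropping a leading BOM if present. */
    static List<String> readCsv(Path path) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader r = Files.newBufferedReader(path, StandardCharsets.UTF_8)) {
            String line;
            boolean first = true;
            while ((line = r.readLine()) != null) {
                if (first && !line.isEmpty() && line.charAt(0) == '\uFEFF') {
                    line = line.substring(1); // drop the BOM Notepad adds
                }
                first = false;
                lines.add(line);
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a file saved by Notepad as UTF-8 (with BOM).
        Path p = Paths.get("data.csv");
        Files.write(p, "\uFEFFname,city\n山田,東京\n".getBytes(StandardCharsets.UTF_8));
        for (String line : readCsv(p)) {
            System.out.println(line);
        }
    }
}
```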
Joyce Scapicchio
SQLDeveloper Team -
How to extract data from an arbitrary xml file and export in a nice csv file?
Hello,
I'm facing big problems in the use of XML files. I have an application which generates XML files with clusters containing arrays and scalars, as in the example pasted below. My task is to read it and export the data into a human-friendly CSV document.
Since I don't know the actual content of the cluster, I need some kind of intelligent VI which goes through the XML file looking for arrays and other data structures in order to export them properly in CSV format (columns with headers).
Thank you
<Cluster>
<Name></Name>
<NumElts>3</NumElts>
<Array>
<Name></Name>
<Dimsize>6</Dimsize>
<I32>
<Name></Name>
<Val>0</Val>
</I32>
<I32>
<Name></Name>
<Val>1</Val>
</I32>
<I32>
<Name></Name>
<Val>2</Val>
</I32>
<I32>
<Name></Name>
<Val>3</Val>
</I32>
<I32>
<Name></Name>
<Val>4</Val>
</I32>
<I32>
<Name></Name>
<Val>5</Val>
</I32>
</Array>
<DBL>
<Name></Name>
<Val>3.14159265358979</Val>
</DBL>
<String>
<Name></Name>
<Val>ciao</Val>
</String>
</Cluster>
Solved!
Go to Solution.
Thank you again,
I'm forwarding my VI draft with many comments and an XML file sample.
Data in the cluster is stored according to the LabVIEW schema; in fact it is generated by LabVIEW.
What I'm trying to do is to access the elements of the cluster and read their content using the Invoke Node and Property Node functions. Could you give it a look? There may be something wrong; I'm not able to access the cluster's children.
Which functions should I use? Could you give me an example? You may use the draft I enclosed...
Then writing these data to a CSV file should be the easier part.
<?xml version="1.0" encoding="iso-8859-1" ?>
<Contents type="Data" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="XMLSection.xsd">
<section name="beta" date="7/31/2009" time="3:43:03 PM" version="1.0">
<Cluster>
<Name />
<NumElts>1</NumElts>
<Array>
<Name />
<Dimsize>4</Dimsize>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.93317638164326</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.79233924020314</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.39199947274518</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.74817197429441</Val>
</DBL>
</Array>
</Cluster>
</section>
</Contents>
Attachments:
read_array.vi 12 KB -
Reading a CSV file from server
Hi All,
I am reading a CSV file from the server, and my internal table has only one field, with length 200. The input CSV file has more than one column, and after splitting the file my internal table should have the same number of rows as the input record has columns.
But when I do that, the last field in the internal table is appended with #.
Can somebody tell me the solution for this?
You can see my code below.
data: begin of itab_infile occurs 0,
input(3000),
end of itab_infile.
data: begin of itab_rec occurs 0,
record(200),
end of itab_rec.
data: c_sep(1) type c value ','.
open dataset f_name1 for input in text mode encoding default.
if sy-subrc <> 0.
write: /, 'FILE NOT FOUND'.
exit.
endif.
do.
read dataset f_name1 into itab_infile-input.
if sy-subrc <> 0.
exit.
endif.
split itab_infile-input at c_sep into table itab_rec.
enddo.
Thanks in advance.
Sunil
Sunil,
You do not mention the platform on which the CSV file was created or the platform on which it is read.
A common problem with CSV files created on MS/Windows platforms and read on unix is the end-of-record (EOR) characters.
MS/Windows uses <CR><LF> as the EOR
Unix uses <LF>
If on unix open the file using vi in a telnet session to confirm the EOR type.
The fix options.
1) Before opening the file in your ABAP program, run the unix command dos2unix.
2) Transfer the file from the MS/Windows platform to unix using FTP in ascii mode, not bin. This does the dos2unix conversion on the fly.
3) Install SAMBA and share the load directory to the windows platforms. SAMBA also handles the dos2unix and unix2dos conversions on the fly.
Hope this helps
David Cooper -
How to generate graphs from csv file and show on remote clients?
Hi,
I have a set of CSV files. Each file has 104 parameters. From these parameters, different graphs have to be generated and displayed to remote clients through Tomcat.
Can anyone tell me how to do that?
cheers,
its reeju
It's very easy to load the CSV into Java objects. Once you have done that, why not use the Java2D API to draw your graphs and then use the Sun JPEG encoder tools to write a JPEG stream back to the browser (you will need to set the content type for JPEG)?
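A rough sketch of that suggestion, with made-up sample values standing in for a parsed CSV row; in a servlet you would write to the response's output stream (after setting the image/jpeg content type) instead of a file:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class CsvChart {
    public static void main(String[] args) throws IOException {
        double[] values = {3.0, 7.5, 2.2, 9.1, 5.4}; // would come from a parsed CSV row
        int w = 320, h = 200, pad = 20;
        BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, w, h);        // white background
        g.setColor(Color.BLUE);
        double max = 10.0;             // fixed scale for this sketch
        for (int i = 1; i < values.length; i++) {
            // Map point index to x and value to y (origin is top-left).
            int x1 = pad + (w - 2 * pad) * (i - 1) / (values.length - 1);
            int x2 = pad + (w - 2 * pad) * i / (values.length - 1);
            int y1 = h - pad - (int) ((h - 2 * pad) * values[i - 1] / max);
            int y2 = h - pad - (int) ((h - 2 * pad) * values[i] / max);
            g.drawLine(x1, y1, x2, y2);
        }
        g.dispose();
        // In a servlet, replace the File with response.getOutputStream()
        // after response.setContentType("image/jpeg").
        ImageIO.write(img, "jpg", new File("chart.jpg"));
    }
}
```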
-
Hello, I am very new to Flex and don't have a programming background. I am trying to create an AIR app with Flex that watches a folder on the user's desktop where CSV files will be dropped by the user. In the AIR app the user will be able to browse for a specific CSV file in a list container; once it is selected, the information from that file should be displayed in a datagrid below. Finally, I will be using AlivePDF to create a PDF from the information in this datagrid, laid out in an invoice format. Below is the source code for my app as a visual reference; it only has the containers, with no working code. I have also attached a sample CSV file so you can see what I am working with. Can this be done? How do I do this? Please help.
<?xml version="1.0" encoding="utf-8"?>
<mx:WindowedApplication xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute" width="794" height="666">
<mx:Label x="280" y="19" text="1. Select Purchase Order"/>
<mx:List y="45" width="232" horizontalCenter="0"></mx:List>
<mx:Label x="158" y="242" text="2. Verify Information"/>
<mx:DataGrid y="268" height="297" horizontalCenter="0" width="476">
<mx:columns>
<mx:DataGridColumn headerText="Column 1" dataField="col1"/>
<mx:DataGridColumn headerText="Column 2" dataField="col2"/>
<mx:DataGridColumn headerText="Column 3" dataField="col3"/>
</mx:columns>
</mx:DataGrid>
<mx:Label x="355" y="606" text="3. Generated PDF"/>
<mx:Button label="Click Here" horizontalCenter="0" verticalCenter="311"/>
</mx:WindowedApplication>
Open the file, parse it, populate an ArrayCollection or XMLListCollection, and make the collection the DataGrid dataProvider:
http://livedocs.adobe.com/flex/3/html/help.html?content=Filesystem_08.html
http://livedocs.adobe.com/flex/3/html/help.html?content=12_Using_Regular_Expressions_01.html
http://livedocs.adobe.com/flex/3/html/help.html?content=dpcontrols_6.html
http://livedocs.adobe.com/flex/3/langref/mx/collections/ArrayCollection.html
http://livedocs.adobe.com/flex/3/langref/mx/collections/XMLListCollection.html
-
How to upload .CSV file from Application Server
Hi Experts,
How can I upload a .CSV file, separated by ',', from the application server to an internal table?
Invoice No,Cust No,Item Type,Invoice Date,days,Discount Amount,Gross Amount,Sales Amount,Customer Order No.,Group,Pay Terms
546162,3233,1,9/4/2007,11,26.79,5358.75,5358.75,11264,HRS,11
546163,2645,1,9/4/2007,11,3.07,305.25,305.25,10781,C,11
Actually, I read some already-answered posts, but I still have some doubts.
Can anybody please send me the code?
Thanks in Advance.Hi Priya,
Check this code.
The logic used here is as follows:
Get all the data into an internal table in a simple format, i.e. a row with one field containing an entire line.
After getting the data, we split each line of the table on every occurrence of the delimiter (comma in your case).
Here I have named the fields field01, field02, etc.; you could use your own names according to your requirement.
parameters: p_file(512).
DATA : BEGIN OF ITAB OCCURS 0,
COL1(1024) TYPE C,
END OF ITAB,
WA_ITAB LIKE LINE OF ITAB.
DATA: BEGIN OF ITAB_2 OCCURS 0,
FIELD01(256),
FIELD02(256),
FIELD03(256),
FIELD04(256),
FIELD05(256),
FIELD06(256),
FIELD07(256),
FIELD08(256),
FIELD09(256),
FIELD10(256),
FIELD11(256),
FIELD12(256),
FIELD13(256),
FIELD14(256),
FIELD15(256),
FIELD16(256),
END OF ITAB_2.
DATA: WA_2 LIKE LINE OF ITAB_2.
OPEN DATASET p_FILE FOR INPUT IN TEXT MODE ENCODING NON-UNICODE.
IF SY-SUBRC <> 0.
WRITE:/ 'File', p_FILE, 'cannot be opened'.
EXIT.
ENDIF.
DO.
READ DATASET p_FILE INTO WA_ITAB.
IF SY-SUBRC <> 0.
EXIT.
ENDIF.
APPEND WA_ITAB TO ITAB.
ENDDO.
CLOSE DATASET p_FILE.
LOOP AT ITAB INTO WA_ITAB.
SPLIT WA_ITAB-COL1 AT ',' " where comma is ur demiliter
INTO WA_2-FIELD01 WA_2-FIELD02 WA_2-FIELD03 WA_2-FIELD04
WA_2-FIELD05 WA_2-FIELD06 WA_2-FIELD07 WA_2-FIELD08 WA_2-FIELD09
WA_2-FIELD10 WA_2-FIELD11 WA_2-FIELD12 WA_2-FIELD13 WA_2-FIELD14
WA_2-FIELD15 WA_2-FIELD16.
APPEND WA_2 TO ITAB_2.
CLEAR WA_2.
ENDLOOP.
Message was edited by:
Kris Donald -
File transfer Open dataset CSV file Problem
Hi Experts,
I have an issue transferring Korean characters to a .CSV file using OPEN DATASET.
data : c_file(200) TYPE c value 'INTERFACES\In\test8.CSV'.
I have tried
open dataset c_file for output LEGACY TEXT MODE CODE PAGE '4103'.
open dataset c_file for output in TEXT MODE ENCODING NON-UNICODE.
open dataset c_file for output in TEXT MODE ENCODING Default.
Nothing is working.
But the code below works for downloading to the presentation server. How can the same be achieved when writing the file to the application server?
CALL METHOD cl_gui_frontend_services=>gui_download
EXPORTING
filename = 'D:/test123.xls'
filetype = 'ASC'
write_field_separator = 'X'
dat_mode = 'X'
codepage = '4103'
write_bom = 'X'
CHANGING
data_tab = t_tab
EXCEPTIONS
file_write_error = 1
no_batch = 2
gui_refuse_filetransfer = 3
invalid_type = 4
no_authority = 5
unknown_error = 6
header_not_allowed = 7
separator_not_allowed = 8
filesize_not_allowed = 9
header_too_long = 10
dp_error_create = 11
dp_error_send = 12
dp_error_write = 13
unknown_dp_error = 14
access_denied = 15
dp_out_of_memory = 16
disk_full = 17
dp_timeout = 18
file_not_found = 19
dataprovider_exception = 20
control_flush_error = 21
not_supported_by_gui = 22
error_no_gui = 23
OTHERS = 24.Hi,
I would recommend to use OPEN DATASET ... ENCODING UTF-8 ...
If your excel version is unable to open this format, you can convert from 4110 to 4103 with report RSCP_CONVERT_FILE.
Please also have a look at
File upload: Special character
Best regards,
Nils Buerckel -
Question about reading csv file into internal table
Someone in this forum (thanks, those nice guys!) suggested using FM KCD_CSV_FILE_TO_INTERN_CONVERT to read a CSV file into an internal table. However, it can only read a local file.
I would like to ask how I can read a CSV file into an internal table from files on the application server.
I can't simply use SPLIT, as there may be commas inside the field content, e.g.
"abc","aaa,ab",10,"bbc"
My expected output:
abc
aaa,ab
10
bbc
Thanks again for your help.
Hi Gundam,
Try this code. I have made a custom parser to read the details in the record and split them accordingly. I have also tested it with your provided test cases, and it works fine.
OPEN DATASET dsn FOR input IN TEXT MODE ENCODING DEFAULT.
DO.
READ DATASET dsn INTO record.
PERFORM parser USING record.
ENDDO.
*DATA str(32) VALUE '"abc",10,"aaa,ab","bbc"'.
*DATA str(32) VALUE '"abc","aaa,ab",10,"bbc"'.
*DATA str(32) VALUE '"a,bc","aaaab",10,"bbc"'.
*DATA str(32) VALUE '"abc","aaa,ab",10,"b,bc"'.
*DATA str(32) VALUE '"abc","aaaab",10,"bbc"'.
FORM parser USING str.
DATA field(12).
DATA field1(12).
DATA field2(12).
DATA field3(12).
DATA field4(12).
DATA cnt TYPE i.
DATA len TYPE i.
DATA temp TYPE i.
DATA start TYPE i.
DATA quote TYPE i.
DATA rec_cnt TYPE i.
len = strlen( str ).
cnt = 0.
temp = 0.
rec_cnt = 0.
DO.
* Start at the beginning
IF start EQ 0.
"string just ENDED start new one.
start = 1.
quote = 0.
CLEAR field.
ENDIF.
IF str+cnt(1) EQ '"'. "Check for qoutes
"CHECK IF quotes is already set
IF quote = 1.
"Already quotes set
"Start new field
start = 0.
quote = 0.
CONCATENATE field '"' INTO field.
IF field IS NOT INITIAL.
rec_cnt = rec_cnt + 1.
CONDENSE field.
IF rec_cnt EQ 1.
field1 = field.
ELSEIF rec_cnt EQ 2.
field2 = field.
ELSEIF rec_cnt EQ 3.
field3 = field.
ELSEIF rec_cnt EQ 4.
field4 = field.
ENDIF.
ENDIF.
* WRITE field.
ELSE.
"This is the start of quotes
quote = 1.
ENDIF.
ENDIF.
IF str+cnt(1) EQ ','. "Check end of field
IF quote EQ 0. "This is not inside quote end of field
start = 0.
quote = 0.
CONDENSE field.
* WRITE field.
IF field IS NOT INITIAL.
rec_cnt = rec_cnt + 1.
IF rec_cnt EQ 1.
field1 = field.
ELSEIF rec_cnt EQ 2.
field2 = field.
ELSEIF rec_cnt EQ 3.
field3 = field.
ELSEIF rec_cnt EQ 4.
field4 = field.
ENDIF.
ENDIF.
ENDIF.
ENDIF.
CONCATENATE field str+cnt(1) INTO field.
cnt = cnt + 1.
IF cnt GE len.
EXIT.
ENDIF.
ENDDO.
WRITE: field1, field2, field3, field4.
ENDFORM.
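For readers who need the same quoted-comma handling in Java rather than ABAP, here is a compact sketch of the parser idea above (like the original, it does not handle escaped quotes inside fields); the input string is the example from the question:

```java
import java.util.ArrayList;
import java.util.List;

public class QuotedCsvSplit {
    /** Splits one CSV record, honoring double-quoted fields that may contain commas. */
    static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (char c : line.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;       // toggle quoted state, drop the quote itself
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString()); // field boundary outside quotes
                cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());         // last field has no trailing comma
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(split("\"abc\",\"aaa,ab\",10,\"bbc\""));
        // prints [abc, aaa,ab, 10, bbc]
    }
}
```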
Regards,
Wenceslaus. -
Uploading csv files and reading them from server
I want to read a CSV file. From Flex I am able to select the file, but when I pass it to the server using the Struts FileUploadInterceptor, I am not able to pass the file to the server. FileUploadInterceptor in Struts 2 processes the request only if it is an instance of MultiPartRequestWrapper. Is there any way in Flex to pass the request as an instance of this? Is there any other way in which I can read the file from the server after uploading it through Flex? The code is as follows:
1)MXML File :
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute">
<mx:Script>
<![CDATA[
import ImportData;
import flash.net.FileReference;
import flash.net.FileFilter;
import flash.events.IOErrorEvent;
[Bindable] var fileRef:FileReference = new FileReference();
private function openFileDialog():void {
fileRef.addEventListener(Event.SELECT, selectHandler);
fileRef.addEventListener(Event.COMPLETE, completeHandler);
fileRef.addEventListener(DataEvent.UPLOAD_COMPLETE_DATA, uploadCompleteHandler);
fileRef.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
try {
var textTypes:FileFilter = new FileFilter("Text Files (*.txt,*.csv)", "*.txt;*.csv");
var allTypes:Array = new Array(textTypes);
var success:Boolean = fileRef.browse(allTypes);
} catch (error:Error) {
trace("Unable to browse for files.");
}
}
private function onIOError(event:IOErrorEvent):void {
trace("In here " + event.text);
trace("In here " + event.toString());
}
// When a file is selected, upload it to the upload script on the server.
private function selectHandler(event:Event):void {
var request:URLRequest = new URLRequest("http://localhost:8080/pack1/importAction.action");
try {
fileRef.upload(request);
} catch (error:Error) {
trace("Unable to upload file.");
}
}
private function completeHandler(event:Event):void {
trace("uploaded");
}
// Dispatched when the file has been uploaded to the server script and a
// response is returned; event.data contains the response from the server script.
public function uploadCompleteHandler(event:DataEvent):void {
trace("uploaded... response from server: \n" + String(event.data));
}
]]>
</mx:Script>
<mx:Button label="Import" id="importBtn"
click="openFileDialog()" height="20" width="90"
styleName="buttonsOnSearchBar"/>
<mx:ComboBox x="23" y="44" borderColor="#ff0000"
themeColor="#ff0000"></mx:ComboBox>
</mx:Application>
2)struts.xml file
<struts>
<package name="pack1"
extends="struts-default,json-default">
<global-results>
<result name="error" type="json"></result>
</global-results>
<global-exception-mappings>
<exception-mapping result="error"
exception="java.lang.Throwable"/>
</global-exception-mappings>
<action name="importAction"
class="routing.ImportAction">
<interceptor-ref name="fileUpload"/>
<interceptor-ref name="basicStack"/>
<result name="success" type="json"></result>
</action>
</package>
</struts>
3)Action Class
package com.om.dh.orderrouting.action;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.StringTokenizer;
import org.apache.log4j.Logger;
import com.opensymphony.xwork2.ActionSupport;
public class ImportAction extends ActionSupport{
private String contentType;
private File upload;
private String fileName;
private String caption;
private static final Logger logger =
Logger.getLogger(ImportAction.class);
@Override
public String execute() throws Exception {
// Read the file line by line. If a line has more than one
// word separated by a comma, return an error.
ArrayList<String> symbolList = new ArrayList<String>();
try {
BufferedReader reader = new BufferedReader(new FileReader(upload));
String line = null;
String symbol = null;
while ((line = reader.readLine()) != null) {
StringTokenizer tokenizer = new StringTokenizer(line, "\t");
symbol = tokenizer.nextToken();
if (symbol != null) symbol = symbol.trim();
if (symbol.length() > 0)
symbolList.add(symbol);
}
reader.close();
} catch (FileNotFoundException fne) {
if (logger.isDebugEnabled())
logger.debug("File NotFound ", fne);
}
for (String symbol1 : symbolList)
System.out.print(symbol1 + " ");
return SUCCESS;
}
public String getUploadFileName() {
return fileName;
}
public void setUploadFileName(String fileName) {
this.fileName = fileName;
}
public String getUploadContentType() {
return contentType;
}
public void setUploadContentType(String contentType) {
this.contentType = contentType;
}
public File getUpload() {
return upload;
}
public void setUpload(File upload) {
this.upload = upload;
}
public String getCaption() {
return caption;
}
public void setCaption(String caption) {
this.caption = caption;
}
public String input() throws Exception {
return SUCCESS;
}
public String upload() throws Exception {
return SUCCESS;
}
}
quote:
Originally posted by:
ived
I tried this, but it does not work...
var request:URLRequest = new URLRequest("
http://localhost:8080/pack1/importAction.action");
request.contentType="multipart/form-data";
In the interceptor, it expects the request to be an instance of MultiPartRequestWrapper...
Further, the documentation says that the FileReference.upload() and FileReference.download() methods do not support the URLRequest.contentType and URLRequest.requestHeaders parameters.
Any help ??