Using different codepages in a CSV file
Hi guys,
I'm working on a project where we load data from around 50 different Axapta systems into a DWH. The Axapta systems export the data, based on an exact schema, to a CSV file and send us this file. After that, we use a loop to run the same dataflow for each file (around 200 files).
Some of these Axapta systems are not able to generate correct UTF-8 files. For most of the systems this is no problem, but we have a problem with some languages (Chinese, Thai, ...).
The file format definition is set to UTF-8 and we cannot change it on the fly (the option is grayed out in the DataFlow). In that case we would have to create a DataFlow for each file we are sourcing, but I'm not happy with this.
Does anybody have an idea how we could solve this? Is it possible to change the codepage of the file format on the fly (in the loop)? Is it possible to use a variable to define the codepage of a DataFlow (it's not in the list)?
Thanks for helping,
Christoph
I would run a script at the operating system level to convert all the files to the code page you need before each file is uploaded into BW.
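For example, such an OS-level pre-conversion could look like the following Python sketch (the per-system source codepage is an assumption you would have to maintain yourself, e.g. cp874 for Thai or cp936 for simplified Chinese):

```python
# Convert one exported CSV file from a known legacy codepage to UTF-8
# before the load job picks it up.
def convert_to_utf8(src_path, dst_path, src_encoding):
    # Decode with the sender's codepage, re-encode as UTF-8.
    with open(src_path, "r", encoding=src_encoding) as src:
        data = src.read()
    with open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(data)

# Hypothetical usage for a file exported by a Thai system:
# convert_to_utf8("export_th.csv", "export_th_utf8.csv", "cp874")
```

For very large exports you would read and convert in chunks rather than with a single read() call.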
Similar Messages
-
Can someone assist me? I am looking for a workflow and/or script that would allow me to rename Finder items using a .txt or .csv file as a reference for the new naming convention.
THANK YOU!!!!
I have taken your recommendation and enabled the script menu bar item. I also tweaked the script as you recommended; I will play with both options to see which I prefer. I thank you again for the quick turnaround on this.
If I am to understand your comment correctly about placing the .csv file in the same folder as the files to be renamed I should modify the script as such.
REMOVE:
set csvFile to choose file with prompt "Select the CSV file." of type "public.text"
set workingFolder to (choose folder "Select the folder to be processed") as text
set fileList to (POSIX path of (files of folder workingFolder whose visible is true and name begins with "untitled"))
REPLACE WITH:
set csvList to paragraphs of (read "/path/to/csvfile.csv")
set fileList to (POSIX path of (files of folder "/path/to/folder" whose visible is true and name begins with "untitled"))
After reading this and attempting to modify it I am missing one piece of markup or the arrangement of the markup.
Please find the modified script with options:
-- change this to specify the correct csv delimiter
set splitDelimiter to ","
-- change this to specify the file name join character
set recombineDelimiter to "_"
-- choose the csv file
set csvFile to choose file with prompt "Select the CSV file." of type "public.text"
-- choose the folder
set workingFolder to (choose folder "Select the folder to be processed") as text
-- csv contents as list
set csvList to paragraphs of (read csvFile)
--set csvList to paragraphs of (read "/path/to/csvfile.csv")
tell application "System Events"
-- list of files in alphabetical order
set fileList to (POSIX path of (files of folder workingFolder whose visible is true and name begins with "untitled"))
-- set fileList to (POSIX path of (files of folder "/path/to/folder" whose visible is true and name begins with "untitled"))
repeat with i from 1 to count of fileList
-- get a file from the file list and note its extension
set currentFile to file (item i of fileList)
set fileExtension to name extension of currentFile
-- get a line from the csv list, split it at the delimiter, and recombine it
set {oldTID, my text item delimiters} to {my text item delimiters, splitDelimiter}
set newFileNamePieces to text items of (item i of csvList)
set my text item delimiters to recombineDelimiter
set newFileName to newFileNamePieces as text
set my text item delimiters to oldTID
-- write the new name back out to the file
set name of currentFile to (newFileName & "." & fileExtension)
end repeat
end tell
-
Why is my amount of space used different from my Creative Cloud Files Folder size?
Why is my amount of 'space used' different from my Creative Cloud Files Folder size? CC is telling me I'm using 93% (18.6GB of 20GB) but my CC File Folder only contains 13.42GB?
Have you checked the "Archive" section on the Creative.adobe.com website ?
When you delete the files then they are moved to the "Archive" from where you can delete the content permanently.
Clear the Archive section and then check. -
How can I use Automator to import a .csv file into an Excel file
I would still like all the import settings that I get if I am using Excel to do the process, i.e. going in with the text format instead of the general format. That way my dates don't change.
You'll need to use Applescript in Automator.
Note: File extension cannot be CSV (Excel will assume that the file is comma separated), so add an action to copy or rename with extension TXT.
(You can rename back at the end of the workflow if needed.)
The Applescript Action is:
on run {input, parameters}
repeat with f in input
tell application "Microsoft Excel"
open text file filename (f as string) data type delimited field info {{1, text format}} other char "|" with use other
end tell
end repeat
return input
end run
NOTE: This is formatting the 1st column to text.
If you need to format the second, then you need to:
open text file filename (f as string) data type delimited field info {{1, general format}, {2, text format}} other char "|" with use other
(You'll need to list the 1st col, 2nd col, etc until you get to the col to format as text). Formats are (from the Applescript Excel Library):
[field info list] : A list containing parse information for the individual columns of data.
Formats are general format, text format, MDY format, DMY format, YMD format, MYD format, DYM format, YDM format, skip column.
Or you can skip using Automator and just use an Applescript App (this is a droplet, just drop the files on the icon, or click to choose files):
on run
set fs to choose file with prompt "Select one or more files to open:" default location alias (the path to desktop folder as text) with multiple selections allowed
proceed(fs)
end run
on open fs
proceed(fs)
end open
on proceed(fs)
repeat with f in fs
tell application "Microsoft Excel"
open text file filename (f as string) data type delimited field info {{1, text format}} other char "|" with use other
end tell
end repeat
end proceed
-
File Adapter: trailing space in field using XSD:Decimal in a CSV file
Hi Folks,
I have a problem which I am unable to understand fully.
We have a SAP-to-file via XI scenario where a mail adapter is used. We are producing a CSV file as a mail attachment.
The issue is all the decimal fields have an extra space before the next delimiter i.e. comma.
The data type used in mapping is XSD:Decimal with 'fraction' set as 2.
I have checked the source XML and there are no trailing spaces there. Initially I thought this might be due to the conversion using transformation beans in the mail adapter; to rule this out I checked other files produced using the FILE adapter, and they also appear to have the same issue.
I can't get my head around it; I could not find any parameter I need to pass in content conversion, either in the MAIL adapter or the FILE adapter, to suppress the trailing space before the delimiter.
I suppose this must have occurred for others as well.
Any directions would be greatly appreciated.
Btw, we are on PI 7.0 and ECC 6
-Praveen
Edited by: External Consultants Mouchel on Sep 15, 2009 5:34 PM
Hi Mouchel,
I personally did not encounter this kind of issue with the file adapter at any point. I would suggest you check the message mapping before and after the payload in sxmb_moni. You may not spot it in the mapping view, so view the source in Notepad and check there.
Regards,
---Satish -
Using a lookup from one CSV file to another CSV file errors with a name clash
Hi
I have a master CSV file created by querying the AD. I need to populate the "Team" attribute by doing a lookup using Name against the Name column of any of 6 CSVs someone has prepared for me. The problem is the data is inconsistent
in the 6 CSV files, so Name can hold either the Name or the Username, i.e. SamAccountName or LoginName.
Now my PS script works when the data is good, but I have around 30 exception cases where there is a mismatch. To explain, the problem is in this line (see the big code block below):
$userObject = $nameAndTeamCsvData | Where-Object {$_.Name -eq $masterUserName}
The $_.Name is using the top-level masterCSVData Name, but I was hoping that it was using the
$nameAndTeamCSVData.Name, as in only within the scope of the Where-Object {}. I guess I have misunderstood the syntax of what I had written.
A quick and dirty fix would be to rename this column in all of the 4 spreadsheets; the alternative is trying to fix the code.
# now update the empty team value for user object
$csvMasterData | ForEach-Object {
    $masterName = $_.Name
    $masterUserName = $_.Username # to deal with exception cases
    # force scope so the properties in the outer block with the same name are out of scope
    # lookup the Name to see if we can extract the user's Team
    $userObject = $nameAndTeamCsvData | Where-Object {$_.Name -eq $masterName}
    # deal with the situation where the name in the business spreadsheets is actually the Username (login name) in the master csv
    if ( $userObject -eq $null ) {
        # lookup the username (login name) to see if we can extract the user's Team # !!!! error occurs here with the $_.Name !!!!
        $userObject = $nameAndTeamCsvData | Where-Object {$_.Name -eq $masterUserName}
        if ( $userObject -eq $null ) {
            $_.Team = "UNKNOWN"
        }
        else {
            # replace the mastercsv.Team with the one we have looked up
            $_.Team = $userObject.Team
        }
    }
    else {
        # Name matches so replace the mastercsv.Team with the one we have looked up
        $_.Team = $userObject.Team
    }
}
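As an aside, the per-row Where-Object scan above gets slow for large files; building a keyed lookup once avoids it. A rough Python sketch of the same join idea (the column names Name, SName, SamAccountName and Mobile come from the snippets in this thread; everything else is illustrative):

```python
def join_on_name(master_rows, team_rows):
    # Index the master data once by SName; each team row is then
    # matched in constant time instead of a full scan per row.
    by_name = {row["SName"]: row for row in master_rows}
    for row in team_rows:
        match = by_name.get(row["Name"])
        if match is not None:
            row["Username"] = match["SamAccountName"]
            row["Mobile"] = match["Mobile"]
    return team_rows
```

In PowerShell the same idea is a hashtable keyed on SName, filled once before the ForEach-Object loop.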
Daniel
I see your challenge. How about collecting the column names in an array and looking for the potential names? Following my example before, try this:
$dataColumns = $I | Get-Member -MemberType NoteProperty | Select-Object -ExpandProperty Name
Now, you can loop through the names and collect the appropriate data.
if ($dataColumns.Contains('Column1')) {
    # add ForEach-Object { $_.Column1... }
}
HTH
~fr3dd
Hi, I had to wait until Friday as I have someone (my boss) checking the business & team CSV files against valid usernames and names.
# now update the empty team value for user object
$nameAndTeamCsvData | ForEach-Object {
    $nt_name = $_.Name
    # $masterUserName = $_.Username # to deal with exception cases
    # lookup the Name to see if we can extract the username and mobile from the masterCSV
    # Replaces my original code: $csvMasterData | Where-Object {$_.SName -eq $nt_name} # with the foreach code below
    $userObject = $null
    foreach ($uo in $csvMasterData) {
        if ($uo.SName -eq $nt_name) {
            $userObject = $uo
            # ouch - can't break out of this even though I found it, so have to move on to the next
        }
    }
    if ( $userObject -ne $null ) {
        # Name matches so replace the few properties in mastercsv with the ones we have looked up
        $_.Username = $userObject.SamAccountName
        $_.Mobile = $userObject.Mobile
    }
}
This does work but is clearly not very efficient, since I don't think I can break or exit out of the foreach loop. I am thinking I could quickly modify and add your code - my import is delayed as I am having a new property added to the user class... Let me know what you think. -
How to use GUI_UPLOAD for uploading a CSV file from Microsoft Excel
Hi Guys,
Can anybody tell me how to upload a CSV-format file from a Microsoft Excel sheet?
Thanks,
Gopi.
Hi Gopi,
You can use GUI_UPLOAD or TEXT_CONVERT_XLS_TO_SAP.
Please check these code samples.
Uploading data from CSV file format into internal table using GUI_UPLOAD
REPORT zupload MESSAGE-ID bd.
DATA: w_tab TYPE ZTEST.
DATA: i_tab TYPE STANDARD TABLE OF ZTEST.
DATA: v_subrc(2),
v_recswritten(6).
PARAMETERS: p_file(80)
DEFAULT 'C:\Temp\ZTEST.TXT'.
DATA: filename TYPE string,
w_ans(1) TYPE c.
filename = p_file.
CALL FUNCTION 'POPUP_TO_CONFIRM'
EXPORTING
titlebar = 'Upload Confirmation'
* DIAGNOSE_OBJECT = ' '
text_question = p_file
text_button_1 = 'Yes'(001)
* ICON_BUTTON_1 = ' '
text_button_2 = 'No'(002)
* ICON_BUTTON_2 = ' '
default_button = '2'
* DISPLAY_CANCEL_BUTTON = 'X'
* USERDEFINED_F1_HELP = ' '
* START_COLUMN = 25
* START_ROW = 6
* POPUP_TYPE =
* IV_QUICKINFO_BUTTON_1 = ' '
* IV_QUICKINFO_BUTTON_2 = ' '
IMPORTING
answer = w_ans
* TABLES
* PARAMETER =
* EXCEPTIONS
* TEXT_NOT_FOUND = 1
* OTHERS = 2
    .
IF sy-subrc <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CHECK w_ans = 1.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = filename
* FILETYPE = 'ASC
has_field_separator = 'X'
* HEADER_LENGTH = 0
* READ_BY_LINE = 'X'
* IMPORTING
* FILELENGTH =
* HEADER =
TABLES
data_tab = i_tab
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
* SYST FIELDS ARE NOT SET BY THIS FUNCTION SO DISPLAY THE ERROR CODE *
IF sy-subrc <> 0.
v_subrc = sy-subrc.
MESSAGE e899 WITH 'File Open Error' v_subrc.
ENDIF.
INSERT ZTEST FROM TABLE i_tab.
COMMIT WORK AND WAIT.
MESSAGE i899 WITH sy-dbcnt 'Records Written to ZTEST'.
Uploading data from Excel file format into internal table using TEXT_CONVERT_XLS_TO_SAP
REPORT zupload_excel_to_itab.
TYPE-POOLS: truxs.
PARAMETERS: p_file TYPE rlgrap-filename.
TYPES: BEGIN OF t_datatab,
col1(30) TYPE c,
col2(30) TYPE c,
col3(30) TYPE c,
END OF t_datatab.
DATA: it_datatab type standard table of t_datatab,
wa_datatab type t_datatab.
DATA: it_raw TYPE truxs_t_text_data.
* At selection screen
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
field_name = 'P_FILE'
IMPORTING
file_name = p_file.
*START-OF-SELECTION.
START-OF-SELECTION.
CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
EXPORTING
* I_FIELD_SEPERATOR =
i_line_header = 'X'
i_tab_raw_data = it_raw " WORK TABLE
i_filename = p_file
TABLES
i_tab_converted_data = it_datatab[] "ACTUAL DATA
EXCEPTIONS
conversion_failed = 1
OTHERS = 2.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
* END-OF-SELECTION.
END-OF-SELECTION.
LOOP AT it_datatab INTO wa_datatab.
WRITE:/ wa_datatab-col1,
wa_datatab-col2,
wa_datatab-col3.
ENDLOOP.
reward if helpful
raam -
Writing in different sheets of a .csv file
Hi,
Is it possible to write values to different sheets of the same workbook in a .csv / .xls file? When I use the following to write to the file, and I have many such sets of data to be written, is it possible to write them to different sheets of the file (e.g. sheet1 as text1, sheet2 as text2, etc.)?
out.write(key + "," + hm.get(key));
Please do advise.
PK
Google for Jakarta POI. It has features for writing different sheets in XLS.
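Worth noting: CSV is a plain-text format with no notion of sheets, so with CSV the usual workaround is one file per "sheet"; real sheets need XLS (e.g. via Jakarta POI). A small Python sketch of the one-file-per-sheet idea (the file names are made up):

```python
import csv

def write_sheets(sheets):
    # CSV itself has no sheets, so emulate them with one file per
    # data set: {"text1": rows, "text2": rows} -> text1.csv, text2.csv
    for name, rows in sheets.items():
        with open(name + ".csv", "w", newline="") as f:
            csv.writer(f).writerows(rows)

# Hypothetical usage:
# write_sheets({"text1": [["key", "value"]], "text2": [["a", "b"]]})
```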
-
Hi,
I am running a data-driven test on different machines with different input values in a .CSV file in batch mode. We are facing the following problem:
The test does not pick up modified values in the .CSV file until we recompile the test.
Is there any way to avoid this recompilation after updating the .CSV file?
Regards,
Nagasree.
Assuming the CSV is part of the Visual Studio solution: open the properties panel for the CSV file from Solution Explorer and set "Copy to output directory" to "Copy if newer" or "Copy always". Some documents recommend
"Copy if newer", but I prefer "Copy always", as occasionally a file was not copied as I expected. The difference between the two copy methods is a little disk space and a little time, but disks are normally big and the time to copy is normally
small. Any savings are, in my opinion, far outweighed by being sure that the file will be copied correctly.
See also
http://stackoverflow.com/questions/23469100/how-to-run-a-test-many-times-with-data-read-from-csv-file-data-driving/25742114#25742114
Regards
Adrian -
Can I use Lookout to retrieve data in a CSV file stored on a network PC
The machines I am trying to interface are PC-based machines with network capability. All the information is stored in CSV files. Is it possible to use Lookout to read these CSV files periodically?
Thanks
Lookout does not have built-in functions to read CSV files. However, there are some ODBC drivers that are capable of reading/writing text files, and you can use Lookout's SQLExec object to communicate with those ODBC drivers. For example, the Microsoft Text Driver in your ODBC control panel will let you do this.
Regards,
Greg Caesar
National Instruments,
Applications Engineer -
Export Public Folder contacts to CSV File using Exchange management Shell
Hi All
Is there a command i can run in the exchange management shell to export all public folder contacts (which are in many sub folders) to a CSV file?
Any help on this appreciated.
Ben Weinberg
Prime-Networks
www.prime-networks.co.uk
Please post the resolution to your issue so that everyone can benefit
Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
Hello Ben,
Per my knowledge, you cannot use Exchange Management Shell to export the contacts in public folder to csv file.
If you want to export contacts in public folder to a csv file, you can follow this way to do that:
Open Outlook and create a Temp Contact Folder
Move all the contacts to the Temp Contact Folder
Use Outlook to export the contacts to a csv file.
Thanks,
Evan Liu
TechNet Subscriber Support
in forum
If you have any feedback on our support, please contact
[email protected] -
CSV file parsing using Nintex 2010 stuck at "in progress"
I have created a workflow using Nintex 2010 for parsing a CSV file and writing to a SharePoint list. My CSV file contains 60000+ records and it gets stuck at "in progress" after parsing 5000 records properly.
Is it an issue with the number of records being parsed?
Hi Shilabhadra,
As Margriet suggested, since the issue is related to a third-party product, we do not have sufficient resources here; it would be better to contact their support engineers.
In addition, I found an article about how to bulk upload and synchronize data into SharePoint using the Excel Add-in and SharePoint Designer Workflows for your reference:
http://rstagg.com/2010/04/13/how-to-bulk-upload-and-synchronize-data-into-sharepoint-using-the-excel-add-in-and-sharepoint-designer-workflows/
Regards,
Rebecca Tu
TechNet Community Support
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
[email protected] -
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from .csv file into existing /new table. Also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I needed to know is: before the CSV data is loaded into the timesheet table, is there any way of validating the project key (which is the primary key of the projects table) against the projects table? I need to perform similar validations with other columns, like customer_id from the customers table. Basically the loading should be done after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same thing?
Does Vikas's application do what the utility does (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time)? Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
You can also create hidden items on the page to validate previous records before inserting the data.
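Outside APEX, the validate-then-load idea can be sketched with Python and sqlite3 (the table and column names follow the post; how rejected rows are handled is an assumption):

```python
import csv
import sqlite3

def load_validated(conn, csv_path):
    # Only insert timesheet rows whose project_key exists in the
    # parent projects table; collect the rest as rejects.
    cur = conn.cursor()
    valid = {k for (k,) in cur.execute("SELECT project_key FROM projects")}
    rejects = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["project_key"] in valid:
                cur.execute(
                    "INSERT INTO timesheet (timesheet_date, project_key) VALUES (?, ?)",
                    (row["timesheet_date"], row["project_key"]),
                )
            else:
                rejects.append(row)
    conn.commit()
    return rejects
```

The rejects list can then be reported back to the user instead of failing the whole load.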
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Hello everyone,
To move data from roughly 50 CSV files in a loop to SQL tables (one table per CSV file), I've used a Bulk Insert task in a For Each Loop container.
Please note, for all the different columns of all CSV files, the field length was specified as varchar(255).
It worked well for the first 6 files, but on the 7th file it found data in one of the columns of the CSV file which exceeds the limit of 255 characters. I would like to truncate the data and insert the remaining data for this column; in other words, I would like to insert the first 255 characters into the table for this field and let the package execute further.
Also, if I were to use SqlBulkCopy in a Script task, would it be able to resolve this problem? I believe I would face a similar problem there too, although I would like to get confirmation of this from the experts.
Can you please advise how to get rid of this truncation error?
Any help would be greatly appreciated.
Thanks, Ankit Shah
Inkey Solutions, India. Microsoft Certified Business Management Solutions Professional
http://ankit.inkeysolutions.com
Hello! I suggest you add a Derived Column transformation between the source and the destination, use string functions to extract the first 255 characters from the incoming field, and send this output to the destination field; that way you can get rid of these sorts of issues.
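The derived-column truncation can be illustrated outside SSIS with a small Python sketch (a pre-processing pass that caps every field at 255 characters; in SSIS itself you would use SUBSTRING in the Derived Column expression):

```python
import csv

def truncate_fields(src_path, dst_path, limit=255):
    # Copy a CSV, keeping only the first `limit` characters of every
    # field so varchar(255) target columns never overflow.
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            writer.writerow(cell[:limit] for cell in row)
```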
Useful Links:
http://sqlage.blogspot.in/2013/07/ssis-how-to-use-derived-column.html
Good Luck!
Please Mark This As Answer if it solved your issue.
Please Vote This As Helpful if it helps to solve your issue -
Problem in handling complex CSV file
Hi,
I am facing a strange problem in handling a complex CSV file.
The content of the file is as follows. It is getting executed through an XSD and the target is a table.
The first time the interface is executed (e.g. session id 19001) it completes successfully.
Now when making changes in the CSV file, suppose adding (e.g.
DEPT,D05,Books,ABC Store
CUST,C01,Sayantan,[email protected]
CUST,C02,Shubham,[email protected])
or deleting (e.g.
CUST,C02,Sarbajit,[email protected])
and running the interface, I am not getting the added D05 in the target table, nor is my C02 data getting removed, i.e. the updated data in the CSV file is not getting fetched and I am getting the same records as when I ran the interface with session id 19001.
I do not understand why this is happening.
The CSV file used in session id -19001 is:
DEPT,D01,Retail,World Mart
CUST,C01,Anindya,[email protected]
CUST,C02,Rashni,[email protected]
DEPT,D02,Food,Food Bazar
CUST,C01,Abhijit,[email protected]
CUST,C02,Anirban,[email protected]
CUST,C03,Sharmistha,[email protected]
DEPT,D03,Water,SW
CUST,C01,Nirmalya,[email protected]
DEPT,D04,Clothes,City style
CUST,C02,Sarbajit,[email protected]
CUST,C03,Abhishek,[email protected]
Here's what you can do to handle CSV files using HSQL.
Say the CSV file contains order data. Each order record contains data about a particular order, like order number, product code, quantity, ordered by, date, etc. Now let's take a hypothetical requirement which says the user needs to know how many orders were placed per product.
Option 1] At the basic level, read the order file, parse each line and write the logic to get count of orders per product.
Option 2] Load this CSV file into a database like MySQL and write database queries to get the orders per product.
So where does this leave us? Have we run out of options? I am sure we would have tried and used the above two options, but I wanted a different approach. The following questions were lingering in my mind:
1] Why can't I write SQL queries against the CSV file itself? After all, it's like any RDBMS table.
2] Why should I load the CSV file into some database before I query it?
3] Why can't I create a database table and attach the CSV file to it?
Continue Reading Here: Handling CSV Files
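The article's "query the CSV like a table" idea uses HSQLDB text tables on the Java side; the same orders-per-product query can be sketched with Python's built-in sqlite3 (the order-file column names here are assumptions):

```python
import csv
import sqlite3

def orders_per_product(csv_path):
    # Load the order CSV into an in-memory table, then answer
    # "how many orders per product" with plain SQL.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_no TEXT, product_code TEXT, quantity INTEGER)")
    with open(csv_path, newline="") as f:
        rows = [(r["order_no"], r["product_code"], r["quantity"])
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return dict(conn.execute(
        "SELECT product_code, COUNT(*) FROM orders GROUP BY product_code"))
```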