Flat File Load Issue - Cannot convert character sets for one or more characters
We recently upgraded our production BW system to SPS 17 (BW SP19) and now have issues loading flat files (containing Chinese, Japanese and Korean characters) that worked fine before the upgrade. The Asian-language characters appear garbled (invalid) in the PSA. The Character Set Settings option was previously set to Default Setting and worked fine until the upgrade. We referred to note 1130965, and with code page 1100 the load went through (without the "Cannot convert character sets for one or more characters" error), but the Asian-language characters still appear as invalid characters. We tried all the code pages suggested in the note (e.g. 4102, 4103) on the InfoPackages, but it did not work. Note that the flat files are encoded in UTF-8.
I checked the lowercase option for all InfoObjects.
When I checked the failed PSA log, the number of records processed is "0 of 0". My question is: without processing a single record, why is the system throwing this error message?
When I load the same file from the local workstation, there is no error message.
I suspect that when I FTP the file from the AS/400 to the BI application server (Linux), some invalid characters are being added, but how can we track down those invalid characters?
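One way to track down the invalid characters: since the files are supposed to be pure UTF-8, scan the file as it sits on the application server and report every line that fails to decode. This is a rough sketch (the helper name is my own, not an SAP tool):

```python
def find_invalid_utf8(path):
    """Return (line_no, byte_offset) for each line that fails UTF-8 decoding."""
    bad = []
    with open(path, "rb") as f:
        for line_no, raw in enumerate(f, start=1):
            try:
                raw.decode("utf-8")
            except UnicodeDecodeError as err:
                # err.start is the offset of the first undecodable byte
                bad.append((line_no, err.start))
    return bad
```

Run it against the post-FTP copy of the file; any hits pinpoint exactly where the transfer (for example an ASCII-mode FTP doing code-page translation) mangled the bytes.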
Gurus, please share your thoughts on this; I will assign full points.
Thanks,
Similar Messages
-
Cannot convert character sets for one or more characters Error message
Hello,
I am getting the error message "Cannot convert character sets for one or more characters" when I try to transfer the data from an application server file (CSV) to the PSA.
Using transaction AL11, under the SAP directory, I can open my file and see its contents.
But I cannot see all of my data; is that why I am getting this error message?
I am loading Master data attribute
Is there any limitation on record length?
Please share your thoughts.
Thanks.
The length should not exceed 60.
When I load the same file from the local workstation, I can load it with no error message.
Why do I get the error when I load the same file from the application server?
Thanks -
Cannot convert character sets for one or more characters
Hello,
Issue:
Source File: Test.csv
Source file dropped on application server as csv file via FTP
Total records in csv file 102396
I can load only 38,000 records into the PSA; after that I get the error message "convert character sets for one or more characters".
If I load the same CSV file from the local workstation, I get no error message and can load all 102,396 records.
I am guessing some invalid codes are being added while FTPing the file to the application server?
But I FTP 7 files, and out of the 7 I am unable to load only 1 file.
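To confirm whether the FTP step is altering the one problematic file, compare a checksum of the file before and after transfer (and transfer in binary mode so no code-page translation occurs). A minimal sketch:

```python
import hashlib

def file_md5(path, chunk_size=65536):
    """MD5 over the raw bytes of a file. Differing digests for the source
    copy and the application-server copy mean the transfer changed bytes."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()
```

Matching digests rule out FTP corruption and point the investigation back at the file content itself.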
Has anybody faced this kind of problem? Please share with me; I will assign full points.
Thanks. I checked the lowercase option for all InfoObjects.
When I checked the failed PSA log, the number of records processed is "0 of 0". My question is: without processing a single record, why is the system throwing this error message?
When I load the same file from the local workstation, there is no error message.
I suspect that when I FTP the file from the AS/400 to the BI application server (Linux), some invalid characters are being added, but how can we track down those invalid characters?
Gurus, please share your thoughts on this; I will assign full points.
Thanks, -
Convert character sets for one or more characters
Hello,
Issue:
Source File: Test.csv
Source file dropped on application server as csv file via FTP
Total records in csv file 102396
While loading into the PSA I can load only 38,000 records; after that I get the error message "convert character sets for one or more characters".
If I load the same CSV file from the local workstation, I get no error message and can load all 102,396 records.
Has anybody faced this kind of problem? Please share with me; I will assign full points.
Thanks.
Hi,
check
Flatfile Issue
Regards,
Arek -
Hello
I'm trying to load a master data infoobject with a flat file and getting errors.
The infoobject ZCOLLECT_DATE has two fields - ZQUARTER and ZCOLLECTDATE
ZCOLLECT_DATE is of Data type NUMC, Length 5, conversion routine PERI5 and Output length 6
I created a flat file in Excel the following way:
Left the first row blank
Second column - 1/2008 (for ZQUARTER)
Third column - 01/11/2008 (for ZCOLLECTDATE)
I saved it as a CSV (save as-> csv).
In the infopackage, I've selected this file in the External data tab, File type CSV, Data separator ; , Escape sign "
When I try to load the infoobject, I get the following error:
"Error in conversion exit CONVERSION_EXIT_PERI5_INPUT"
I thought this was because the length in my InfoObject is 5, whereas ZQUARTER is of length 6 in my flat file (e.g. 1/2008) and ZCOLLECTDATE is in format mm/dd/yyyy.
Any thoughts on what the issue is?
Thanks
Alex
Hi Hemant,
I changed it to 12008, 22008, etc., but still get the error "Error in conversion exit CONVERSION_EXIT_PERI5_INPUT".
Also, when I save the xls file as CSV, I get two pop-ups saying some features are not compatible and only those features that are compatible will be saved. Are these normal warnings, or could they be causing the issue?
Also, in the infoobject, since it only consists of dates, do I need to specify conversion routine PERI5 in Infoobject properties/maintenance?
Thanks -
Hi All,
I am facing a problem while doing a data load through a flat file; it's a .csv file.
The problem arises when I have records like £1,910.00; the comma creates a separate column.
I have lots of such amount fields.
Is there any workaround for this issue? Your answer would be highly appreciated.
Regards,
Kironmoy Banerjee
Hi,
As Satish has mentioned, it is better to maintain two separate columns, as below:
1. 0CURRENCY: GBP
2. 0AMOUNT: 1910
(GBP indicates British Pound.)
If you maintain the amount as 1,910.00, it will spill into the next column, because in the InfoPackage Extraction tab the data separator is "," (comma).
Regards
Surendra
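The split Surendra describes can be done before the load. A rough sketch of normalizing a formatted amount into the two columns (the symbol-to-ISO mapping here is my own assumption, for illustration only):

```python
def split_amount(raw):
    """Turn a formatted amount such as '£1,910.00' into (currency, amount)
    so the thousands comma never collides with the CSV data separator."""
    symbol_to_iso = {"£": "GBP", "$": "USD", "€": "EUR"}  # illustrative mapping
    currency = symbol_to_iso.get(raw[0], "")
    amount = raw.lstrip("£$€").replace(",", "")
    return currency, amount
```

For example, `split_amount("£1,910.00")` yields `("GBP", "1910.00")`, which can be written into the 0CURRENCY and 0AMOUNT columns directly.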
-
Urgent: Flat file load issue
Hi Guru's,
We are loading into a data target (ODS) via flat file. The problem is that the flat file shows correct date-field values of 8 characters, but the preview shows 7 characters and the load does not go through.
Does anyone know where the problem is, i.e. why the preview screen shows 7 characters of the date when the flat file has 8?
Thanks
MK
Hi Bhanu,
How do I check whether a conversion is specified? Also, it is not just one field: we have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
In the PSA I checked the error log; it shows 2 error messages:
First error message:
The value '1# ' from field /BIC/ZACTDAYS is not convertible into the DDIC data type QUAN of the InfoObject in data record 7. The field content could not be transferred into the communication structure format.
System Response
The data to be loaded has a data error, or field /BIC/ZACTDAYS in the transfer structure is not mapped to a suitable InfoObject.
The conversion of the transfer structure to the communication structure was terminated. The processing of data records with errors was continued in accordance with the settings in error handling for the InfoPackage (tab page: Update Parameters).
Check the data conformity of your data source in the source system.
On the 'Assignment IOBJ - Field' tab page in the transfer rules, check the InfoObject-field assignment for transfer-structure field /BIC/ZACTDAYS.
If the data was temporarily saved in the PSA, you can also check the received data for consistency in the PSA.
If the data is available and is correct in the source system and in the PSA, you can activate debugging in the transfer rules by using update simulation on the Detail tab page in the monitor. This enables you to perform an error analysis for the transfer rules. You need experience of ABAP to do this.
2nd Error Message:
Diagnosis
The transfer program attempted to execute an invalid arithmetic operation.
System Response
Processing was terminated and flagged as incorrect. You can carry out the update again at any time.
Procedure
1. Check that the data is accurate.
- In particular, check that the data types and lengths of the transferred data agree with the definitions of the InfoSource, and that the fixed values and routines defined in the transfer rules are defined correctly.
- The error may have arisen because you didn't specify the existing headers.
2. If necessary, correct the data error and start processing again.
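Since the first error points at a single bad value ('1# ' in record 7), a pre-load scan of the quantity column can list every such cell before the InfoPackage runs. A sketch (the column index and delimiter are assumptions about your file layout):

```python
import csv

def bad_numeric_cells(path, col, delimiter=","):
    """Return (record_no, value) for cells in column `col` (zero-based)
    that will not convert to a number, e.g. '1# '."""
    problems = []
    with open(path, newline="") as f:
        for record_no, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            value = row[col].strip() if col < len(row) else None
            try:
                float(value)  # float(None) raises TypeError, caught below
            except (TypeError, ValueError):
                problems.append((record_no, value))
    return problems
```

Fixing or dropping the flagged records in the source file avoids the QUAN conversion termination entirely.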
Thanks
MK -
Flat File-to-RFC question, multiple RFC calls for one file.
Hi guys,
I'm quite new to XI / PI and I have a question regarding a File-to-RFC scenario we're trying in NW PI 7.1.
We have a flat file which has two lines with two fields each, let's take this as an example :
001,001
002,002
The files needs to be converted to XML and then transferred to the RFC program which will update a table with the values above.
In the ESR I've created 3 data types (z_row1,z_record1 and z_fileinput1), 1 message type (z_file2rfc_ob_mt), 1 message mapping (z_file2rfc_mm), 2 Service Interface (z_file2rfc_ob_si and z_file2rfc_ib_ztestpi) and 1 operation mapping (z_file2rfc_om).
In the Integration Builder (ID) I've created everything required (The sender and receiver communication channels, sender and receiver agreement, receiver determination and interface mapping).
We are also using content conversion to convert the flat file to XML; this seems to work, because I see all the lines/fields in the message when it reaches PI. The problem is that the RFC accepts only two fields at a time, so only the first line gets sent to the RFC and only the first line updates the table.
How can I make the RFC call happen for each and every line?
Thanks!!
Create the RFC with a table parameter that takes multiple line items as input, and update the table in a single call.
If you want a response back, call the RFC synchronously; otherwise use asynchronous mode.
Doing this in a single call will update the complete table.
Gaurav Jain
Reward Points if answer is helpful -
I am doing flat file in BW System. InfoPackage setting are as follows,
File is CSV file
Data Separator = ,
Escape Sign = "
In the transfer structure all are char fields.
1st field is of length 18
2nd field is of length 18
3rd field is of length 25
4th field is of length 10
e.g of data in flat file is as follows
"00q80000003AmvHAAS","00530000000jSm8AAE","US02CALVER;H21020101","2008-11-24u201D
This data will load successfully every time, but if data comes like this
"00q80000003AmvHAAS","00530000000jSm8AAE","US02BVERFU;H21020101;H21020102;H21020104;H21020201;H21020202;H21020203;H21020204","2008-11-24"
Then in the preview itself I can't see the date (the 4th field will be blank), or sometimes it shows partial characters like 200, 2008 or 2008-11.
Can you please tell me why this is so? How can I solve this issue?
I've come across a similar problem. Maybe another view will help someone find an answer.
The transfer structure looks like this:
MATERIAL 0MATERIAL Material CHAR 18
MAT_NAME ignore CHAR 1
REGION ignore CHAR 1
PLANT 0PLANT Plant CHAR 4
PLANT_NAME ignore CHAR 1
VENDOR 0VENDOR Vendor CHAR 10
VENDOR_NAME ignore CHAR 1
CURRENCY 0CURRENCY Currency CUKY 5
UNIT 0UNIT Unit of measure UNIT 3
/BIC/CGRPLEAD CGRPLEAD Group leader CHAR 3
/BIC/CMONTHFR CMONTHFR Month from CHAR 7
/BIC/CMONTHTO CMONTHTO Month to CHAR 7
/BIC/AREBATES AREBATES Rebate CHAR 17 2
The csv file looks like this:
10000001,Dry ring def,NA,1001,plant 001,100000,Main vendor in United Stat,USD,TO,CR,200701,200812,3.00
10000001,Dry ring defl,NA,1001,This is the name for pla,100000,Main vendor in United States of,USD,TO,CR,200701,200812,3.00
Line 1 will load correctly like this:
10000001|D|NA|1001|T|100000|M|USD|TO|CR|200701|200812|3.00
but line 2 will be cut off and loaded link this:
10000001|D|NA|1001|T|100000|M|USD|TO|CR|200701|20081|0.00
with the month to and amount incorrect.
There must be a length limit on the incoming text, but it doesn't seem to relate directly to the field lengths of the transfer structure. -
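Regarding the quoted-field truncation described above: using Python's csv module as a reference parser, a field wrapped in the escape sign keeps all its embedded semicolons and the date still arrives as the fourth field, which suggests the problem is a BW-side length limit rather than the quoting itself. A sketch for cross-checking a suspect line:

```python
import csv
import io

# A shortened version of the problematic record, quoted with the escape sign.
line = '"00q80000003AmvHAAS","00530000000jSm8AAE","US02BVERFU;H21020101;H21020102;H21020104","2008-11-24"'

# Parse with comma as separator and double-quote as the quote character,
# mirroring the InfoPackage settings (Data Separator = , Escape Sign = ").
row = next(csv.reader(io.StringIO(line), delimiter=",", quotechar='"'))
```

Inspecting `row` shows four fields, with the semicolons preserved inside field 3 and the date intact in field 4.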
Hi,
I'm a newbie to SSIS, but I was able to create an SSIS package which extracts XML data from one of the SQL Server columns using an "Execute SQL Task" and passes it to a Foreach Loop container, which contains an XML task applying a transform to each input XML and appending it to a flat file (additionally using a Script Task within the Foreach container).
All good so far, but now I want to apply conditional splitting to the "Execute SQL Task" above and additionally pipe it to an exactly similar additional process (a Foreach container doing an XML task as described above) for a different kind of flat file.
If I alter the design to use the data-flow approach and apply the out-of-the-box Conditional Split, I run into not knowing how to connect and execute more than one Foreach container with the embedded XML and Script tasks (data-flow to control-flow connection).
It is easy to put everything in a Sequence container and repeat the Execute SQL Task, but to me that is very inefficient.
Any creative ideas or pointers to some internet content which tells me how can this be done most efficiently.
Hope my question makes sense and let me know if you need more clarification.
Thanks in advance.
SM
As I understand it, what you're asking for is a way to create conditional branches to do a different type of processing for each subset of data. For this you can do as below:
1. Add a set of tasks for the additional processing of each type of flat file and link them to the Execute SQL Task.
2. Set the precedence constraint option to Expression And Constraint for each of them: make the constraint OnSuccess and the expression based on your condition. You may need to create SSIS variables to capture the field values used in the expression.
Visakh -
Transformation not generating-Flat file loading
Hello guys, I hope you can help me with a little confusion I have about BI 7 flat file loading.
I got a File (CSV) on my workstation. I am trying to load Master Data. Here is the example of my file and issues:
Lets say, I have CSV file named "CarModel.CSV" on my PC.
This file has 10 records and no attributes for this field.
So the records show like this, correctly, at the PSA level and DS level:
A
B
C
D
E
F
My goal is to load Flat file data to InfoObject (inserted in Infoprovider)
I created Source system, DS, all thats stuffs.
I am now on the Display DataSource screen, under the Proposal tab. I set it to show 10 records and hit "Load example data"; it works fine and shows all 10 records. However, in the bottom part of this screen, what should appear as a field? In my case it first showed the first record ("A"); I didn't think that was correct, but I proceeded anyway. The transformation could not be generated.
I also tried deleting the "1" from the field "No. of header rows to be ignored", with the same result: no transformation. I know it is very simple to load this data, but I am not sure what I am doing wrong. My questions are:
1) What should show under the Field/Proposal tab in my case? Am I supposed to create this field?
2) Should the system propose the field from my flat file header? In my case I don't have any header; should I include a header such as "car model" in my CSV file? I am confused; please give me some info. Thanks.
dk
Hi,
In the Field tab you have to enter your InfoObject names in order; press Enter and it will automatically fill in the description and the other attributes.
I guess you should have some header, e.g. "customer", "cust ID"; these are what you enter as fields in the Proposal tab. Try that.
Regards, -
Unit Code, Commercial Code - Flat File loading vs BPS Loading
Hi SDN Community,
Recently, we have moved our flat file loading to be performed by BPS interfaces.
During flat file loads, the Commercial code of all units of measures, are verified.
eg.DAY
But when loaded by BPS, the UNIT code is verified.
eg. TAG.
In the display of the BPS upload, it displays DAY which is the commercial code.
The only thing, is the customer is forced to use TAG in the BPS upload files.
They wish to create another record in the transaction
CUNI to be
Unit code = DAY
Commercial code = DAY.
However, i have found that we cannot allocate the same commercial code to another unit code.
Is this a design constraint, or a process error that i am doing.
Thank you.
Simon
04.01.2010 - 02:22:53 CET - Reply by SAP
Dear customer,
I do not fully understand how this works: the base table T006A has 2 entries, 1 for English and 1 for German. Should it not be that the English (EN) entry field is the one in effect, so that DAY should be used rather than the German (DE) TAG? Can you please confirm my understanding of this?
>>> If you check my attachment "SE16.xls", you can see that it's for language >>English<<, and the internal format is TAG while the external format is DAY.
- Are there any plans to modify the BPS functionality in newer SAP BW versions to allow the customer to use the same UOM as in flat file loads? Or is there an upgraded Support Package that allows this?
- If not, would it be possible to make any customer enhancements to
allow this to take place depending on customer requirements.
>>> There is no plan at this moment to change this BPS functionality
in newer SAP BW versions or support packages.
We really recommend you to use internal format TAG in the upload file
in BPS which I think should be acceptable and feasible to you. All
other ways trying to use external format is risky and we cannot assure
you that it will work well as it's not SAP standard function. (I think
it's not worth the risk as the standard function which requires
internal format should not be too unacceptable)
Thanks for your understanding.
I also don't think my BC-SRV-ASF-UOM colleague would be able to help
you a lot regarding this.
Best Regards,
Patricia Yang
Support Consultant - Netweaver BW
Global Support Center China
SAP Active Global Support -
Flat File loading Initialize with out Data transfer is disabled in BI 7.0
Hi experts,
When loading through a flat file in BI 7.0, at the InfoPackage level "Initialization Delta Process with data transfer" comes by default, but the option I want, "Initialization Delta Process without data transfer", is disabled. (In the creation of the DataSource (flat file), in the Extraction tab, the delta process is set to FIL1 Delta Data (Delta Images).)
Please provide me a solution.
regards
Subba Reddy.
Hi Shubha,
For flat file loads, please go through the following link:
http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
This will help.
Regards,
Mahesh -
What are the settings for datasource and infopackage for flat file loading
Hi,
I'm trying to load data from a flat file to a DSO. Can anyone tell me the settings for the DataSource and InfoPackage for flat file loading?
pls let me know
regards
kumar
Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system.
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Field tab you can give the technical names of the info objects in the template; you then do not have to map them during the transformation, because the system will map them automatically. If you do not map them in the Field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an info package by right-clicking the data source, and in the Schedule tab click Start to load data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to create ODS( Data store object ) or Cube.
Specify name fro the ODS or cube and click create
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right Click data targets and select manage and in contents tab select contents to view the loaded data. There are two tables in ODS new table and active table to load data from new table to active table you have to activate after selecting the loaded data . Alternatively monitor icon can be used.
Loading of master data in BI 7.0:
For Uploading of master data in BI 7.0
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Field tab you can give the technical names of the info objects in the template; you then do not have to map them during the transformation, because the system will map them automatically. If you do not map them in the Field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an info package by right-clicking the data source, and in the Schedule tab click Start to load data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets. -
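The guide above assumes a well-formed CSV as input. A minimal sketch of producing one (the file name and field names here are hypothetical, for illustration only):

```python
import csv
import os
import tempfile

# One header row (skipped via "header rows to be ignored = 1" in the
# DataSource's Extraction tab) followed by master-data attribute records.
path = os.path.join(tempfile.mkdtemp(), "employee_attr.csv")
rows = [
    ["EMPLOYEE_ID", "EMPLOYEE_NAME"],  # header row, ignored on load
    ["1001", "Smith"],
    ["1002", "Jones"],
]
with open(path, "w", newline="") as f:
    csv.writer(f, delimiter=",").writerows(rows)
```

Keeping the data separator consistent between the file and the Extraction tab setting (comma here) avoids the column-shift problems discussed in the other threads.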
Hi Friends,
I am struggling with a flat-file loading problem. I am trying to load a .csv file into a data target. I took all precautions while loading the data: I checked the preview and simulated the load, and everything was OK. But when I schedule the load, I find 0 records in the monitor. The following is the status message:
No data available
Diagnosis
The data request was a full update.
In this case, the corresponding table in the source system does not
contain any data.
System response
Info IDoc received with status 8.
Procedure
Check the data basis in the source system.
Can anybody help me what is the problem and procdure to resolve it?
Regards,
Mahesh
Hi Eugene,
Thanks for the quick reply. The following are the messages from the Details tab:
OVERALL STATUS: MISSING, WITH MESSAGES OR WARNINGS
REQUEST: MISSING MESSAGES
EXTRACTION: EVERYTHING IS OK; DATA REQUEST RECEIVED; NO DATA AVAILABLE, DATA SELECTION ENDED
PROCESSING: NO DATA
The above messages were shown in the Details tab. Please guide me in locating the problem.
Regards,
Mahesh