Chinese characters scrambled when loading from DS to BW

Hi, I've been pulling my hair out with this issue.
I have a flat file containing Chinese text. When I load this in BW using 'FLATFILE' as a source system, it works fine. BW shows the correct Chinese characters.
When I do the same load using BODI, I get funny characters.
When I use BODI to load from one flat file into another flat file, the Chinese characters remain correct.
What do I need to do to make sure I get the right Chinese characters in BW when loading from BODI?
BODI is installed on Unix on Oracle 10.
I run the jobs as batch processes.
The dsconfig.txt contains:
AL_Engine=<default>_<default>.<default>
There are no locale settings in al_env.sh
The BW target uses the UTF-8 codepage.
The file codepage is BIG5-HKSCS.
BODI is set up as a Unicode system in SAP BW.
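For reference, I assume an explicitly pinned engine locale would look something like the lines below, following the language_territory.codepage pattern visible in the messages further down. I have not tested this and am not sure utf-8 is accepted as a codepage name on this installation:
    AL_Engine=eng_gb.utf-8          (in dsconfig.txt)
    export LANG=en_US.UTF-8         (in al_env.sh)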
When loading flat file to flat file, I get a message:
DATAFLOW: The specified locale <eng_gb.iso-8859-1> has been coerced to <Unicode (UTF-16)> because the datastore <TWIN_FF_CUSTOMER_LOCAL> obtains data in <BIG5-HKSCS> codepage.
JOB: Initializing transcoder for datastore <TWIN_FF_CUSTOMER_LOCAL> to transcode between engine codepage <Unicode (UTF-16)> and datastore codepage <BIG5-HKSCS>.
When loading to BW the messages are almost the same, but the last step is UTF-16 to UTF-8.
I read the wiki post, which definitely helped me understand the rationale behind code pages, but now I have run out of ideas about what else to check ( http://wiki.sdn.sap.com/wiki/display/BOBJ/Multiple+Codepages ).
Any help would be greatly appreciated.
Jan.

Hi all. Thanks for the inputs. This is what I got when I clicked on the Details tab of the monitor:
Error when transferring data; communication error when analyzing
Diagnosis
Data packages or InfoPackages are missing in BI but there were no apparent processing errors in the source system. It is therefore probable that there was an error in the data transfer.
The analysis tried to read the ALE outbox of the source system. This led to an error.
It is possible that there is no connection to the source system.
Procedure
Check the TRFC overview in the source system.
Check the connection to the source system for errors and check the authorizations and profiles of the remote user in both the BI and source systems.
Check the ALE outbox of the source system for IDocs that have not been updated.

Similar Messages

  • Component Slider not working when loaded from another swf file

    I have a simple Flash application that uses a Slider component to increase or decrease the size of the text in a TextArea (ta). It works perfectly fine on its own; however, when I try to load the same swf file from another application, I get the following error...
    ReferenceError: Error #1069: Property fl.managers:IFocusManager::form not found
    on fl.managers.FocusManager and there is no default value.
    at fl.controls::Slider/thumbPressHandler()
    Code...
    import fl.events.*;
    import flash.text.TextFormat;

    ta.text = "Lorem ipsum dolor sit amet";

    // Text format applied to the TextArea; the slider drives the font size.
    var tf:TextFormat = new TextFormat();
    tf.color = 0xCCCCCC;
    tf.font = "Trebuchet MS";
    tf.size = 12;

    slider.addEventListener(SliderEvent.THUMB_DRAG, sliderChange);
    style();

    function style():void {
        ta.setStyle("textFormat", tf);
    }

    function sliderChange(e:SliderEvent):void {
        tf.size = slider.value;
        ta.setStyle("textFormat", tf);
    }
    Could the containing swf file that I'm loading the slider swf into affect the slider application in any way? I don't quite understand why it works on its own, but not when loaded from another app.

    Add a Slider component to the main (loading) swf's library.

  • Anyone Locked a generated SO by item/header when loading from ORDERS05?

    Has anyone managed to successfully lock a generated sales order when loading from an IDoc of type ORDERS05? I create a lock at the item level if necessary, but for some reason it zeroes the item quantity, almost as though no schedule segment was supplied, although there is one.
    Has anyone hit this same problem and overcome it?
    Blue

    This is in reply to the first post. I don't know what happened after.
    Caused by: java.security.AccessControlException: access denied (java.util.PropertyPermission sun.arch.data.model read)
         at java.security.AccessControlContext.checkPermission(Unknown Source)
         at java.security.AccessController.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkPropertyAccess(Unknown Source)
         at java.lang.System.getProperty(Unknown Source)
         at org.eclipse.swt.internal.Library.loadLibrary(Library.java:167)
         at org.eclipse.swt.internal.Library.loadLibrary(Library.java:151)
         at org.eclipse.swt.internal.C.<clinit>(C.java:21)
    If you read the above trace from bottom to top, it shows none of your classes, only classes from that Eclipse library, which seems to loadLibrary() a native DLL. In order to do this, it needs to call System.getProperty( "sun.arch.data.model" ). This call is not allowed from an unsigned applet. So I guess you need to sign the applet and this problem will go away. Many other problems may follow. Just read all the related documentation very carefully, which I did not.
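    If you only want to test locally without signing, the specific check above can also be cleared by granting the permission in a java.policy file. This is a sketch, assuming the default policy file is picked up; it only clears the property check, and the native SWT library load will then hit further permission problems, which is why signing is the usual answer:

        grant {
            // allow unsigned code to read the property queried by SWT's Library.loadLibrary()
            permission java.util.PropertyPermission "sun.arch.data.model", "read";
        };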

  • Error when loading from ODS to Cube

    Hello Friends,
    I am having trouble loading data from ODS to InfoCube in my 2004S system. When loading data I get these messages:
    07/03/2007     13:10:25     Data target 'ODSXYZ ' removed from list of loadable targets; not loadable.
    07/03/2007     13:28:42     Data target 'ODSXYZ ' is not active or is incorrect; no loading allowed     
    I checked for ODSXYZ in my data targets but there is nothing by that name. Even the InfoPackage doesn't have it. What needs to be done? Please help.
    Thanks.

    It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it will grey out all the data targets they were feeding before, and that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
    Having said this, it shouldn't impact your loads from ODS to cube, as those should be handled by your DTPs rather than your InfoPackages.
    A few questions:
    How are you loading your cube?
    Did the data get through fine to the PSA with the InfoPackage in question?
    How did you load your DSO (assuming the load was successful)?

  • BPC 7.5: Delta Load when loading from BI InfoProvider

    Hi,
    in BPC 7.5, running a package based on the process chain "CPMB/LOAD_INFOPROVIDER" loads data directly from an SAP BI InfoProvider into a BPC cube. In the package options you can choose "Merge Data Values" or "Replace & Clear DataValues".
    According to the description the first one "Imports all records, leaving all remaining records in the destination intact", the second one "Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records".
    I tried both, and both (!!) result in what you would expect of "Replace & Clear": both first create reversal (storno) records in the cube and then add the new values.
    Is this an error or intended behaviour? I didn't find any SAP Notes on this topic, but I doubt it's right.
    Is there any way to achieve a merge or, even better, a delta load?
    Thanks a lot for any input given.
    bate

    Hi Bate,
    It is indeed a bit confusing. I'll translate the BPC-speak for you:
    Replace & Clear:
    1. Look up all CATEGORY/ENTITY/TIME combinations in the incoming data
    2. Clear all records in application with the same CATEGORY/ENTITY/TIME combinations that exist in the incoming data records
    3. Load incoming data to the cube
    Merge:
    1. Load incoming data to the cube record-by-record, overwriting existing data.
    This means that "Replace & Clear" might clear out existing records in the cube that do not share the full key of incoming records but do share the CATEGORY/ENTITY/TIME dimension values. For example, a record with CATEGORY/ENTITY/TIME/ACCOUNT values of ACTUAL/1000/2010.JAN/ACCT1 would be deleted when loading an incoming record with values ACTUAL/1000/2010.JAN/ACCT2 if you use the "Replace & Clear" method, but would not be deleted if you use the "Merge" method.
    There is no option to load data additively, like it is loaded to a BW InfoCube.
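    To make the difference concrete, here is a toy sketch in Java (purely illustrative, not BPC or BW code; the class, field and method names are made up):

        import java.util.*;

        class Record {
            String category, entity, time, account;
            double value;
            Record(String c, String e, String t, String a, double v) {
                category = c; entity = e; time = t; account = a; value = v;
            }
            String partialKey() { return category + "/" + entity + "/" + time; }
            String fullKey()    { return partialKey() + "/" + account; }
        }

        class ReplaceAndClearDemo {
            // "Replace & Clear": drop every existing record whose CATEGORY/ENTITY/TIME
            // combination appears anywhere in the incoming data, then load the new records.
            static void replaceAndClear(Map<String, Record> cube, List<Record> incoming) {
                Set<String> keysToClear = new HashSet<>();
                for (Record r : incoming) keysToClear.add(r.partialKey());
                cube.values().removeIf(r -> keysToClear.contains(r.partialKey()));
                for (Record r : incoming) cube.put(r.fullKey(), r);
            }

            // "Merge": overwrite record by record; everything else in the cube stays intact.
            static void merge(Map<String, Record> cube, List<Record> incoming) {
                for (Record r : incoming) cube.put(r.fullKey(), r);
            }
        }

    The point is that replaceAndClear() keys the deletion only on CATEGORY/ENTITY/TIME, while merge() touches nothing beyond the exact records it loads.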
    Cheers,
    Ethan

  • Error when loading from Flat File

    I get the message below when trying to load from a flat file. The change that I made to the transfer structure was to move the items. How can I resolve this? Thanks
    Error 8 when compiling the upload program: row 227, message: Data type /BIC/CCABSMMIMH1 was found in a newer version

    hi Niten,
    check if this helps:
    "A newer version of the data type" error when loading

  • Chinese characters reading and printing from XML

    I have an xml file with one element
    <item name="Notification" value="some chinese characters like &#x5927;" />
    Now I want to read that from the xml file and print it, so what I do is:
    String value = config.getProperty(Config.PREFACE);
    String value1 = new String(value.getBytes("Big5"), "Big5");
    System.out.println("testing 26 " + value1);
    But at the end of the day I am still getting question marks, meaning that the output is still
    some chinese characters like ?
    Any ideas? This thing is driving me nuts..............

    Hi,
    I was working on Chinese and Arabic support but abandoned it due to time constraints.
    I think you need to install Chinese fonts on your system if you don't have them.
    I don't think they come with Java by default. You may also need to edit the font.properties file and specify which font to use for displaying that range of Unicode characters.
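    If the fonts are in place, another thing worth checking is that System.out converts text using the platform default charset; if that charset cannot represent Chinese you get '?' regardless of fonts. A minimal sketch, assuming the console actually expects UTF-8:

        import java.io.PrintStream;

        public class ChinesePrintTest {
            public static void main(String[] args) throws Exception {
                String value1 = "\u5927"; // the same character as in the XML example above
                // Write through a PrintStream with an explicit encoding instead of the
                // platform default, so the character is not lossily converted to '?'.
                PrintStream out = new PrintStream(System.out, true, "UTF-8");
                out.println("testing 26 " + value1);
            }
        }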

  • Firefox will not play .ogv video files when loaded from a server. Does anyone have any answers that will fix this?

    I've read the suggestions for fixing this, and none of them work. I've updated my .htaccess to map the MIME type for .ogv, and I've uploaded the files as both binary and ASCII; neither method works. The .ogv file plays fine when you load it from a local file, but it will not play when viewed from a remote server. I've confirmed with my hosting company that the .htaccess and all other requirements for HTML5 are up to date. Safari and Chrome play the HTML5 videos just fine, but Firefox will not. Does anyone have any concrete answers as to what the problem is? I'm also serving up the video files as .webm and .mp4, but neither of these formats works in Firefox either.

    That can happen if the server isn't configured properly to send the files with a MIME type that Firefox supports.
    *https://developer.mozilla.org/en/docs/Properly_Configuring_Server_MIME_Types
    Did you check via the Web Console which headers Firefox receives when requesting the .ogv file?
    If that is OK then it is possible that the file is using an unsupported coding method.
    * https://developer.mozilla.org/En/Media_formats_supported_by_the_audio_and_video_elements
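    For reference, the usual .htaccess entries for these formats look like this (a sketch; it assumes your host allows AddType overrides in .htaccess):

        AddType video/ogg  .ogv
        AddType video/webm .webm
        AddType video/mp4  .mp4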

  • Problem when loading from ODS to the CUBE

    Hi Experts,
    I am facing an unusual problem when loading data from ODS to the cube.
    I have a first level ODS where the delta postings are updated. I have checked the active table of the ODS and the data is accurate.
    I have deleted the entire data from the Cube and trying to do a full load from ODS to the CUBE.
    I am sure when I run a full load the data goes from Active table of the ODS.
    After the full load, the key figure values are 0. I have tried testing by loading a couple of sales documents, and the key figure values are still 0.
    I would expect a full load to pick up the data exactly as it is in the active table of the ODS.
    I also don't have any fancy routines in the update rules.
    Please help me in this regard.
    Regards
    Raghu

    Hi,
    Check whether you followed the procedure exactly. Just follow these step-by-step instructions and let me know about your issue:
    o     First prepare a flat file in Microsoft Excel for the master data and one for the transaction data, save them in .CSV format, and close them.
    o     Under InfoObjects, create an InfoArea, then an InfoObject catalog, then the characteristics and key figures: create an 'id' characteristic with name as an attribute and activate it, and under key figures create 'no' and activate it.
    o     Under InfoSources, create an application component, then create an InfoSource with direct update for the master data and flexible update for the transaction data. For the master data flat file, create an InfoPackage and execute it.
    o     For the transaction data, go to InfoProvider, right-click and choose Create ODS Object. Give the ODS a name, and in the next screen drag the characteristic to the key fields.
    o     Activate it. The update rules will give an error at first; go to the communication structure, add 0RECORDMODE and activate it, then go back to the update rules and activate them.
    o     Then go to the InfoSource, create an InfoPackage, maintain the External Data, Processing and Data Targets tabs, start the scheduling, and open the monitor.
    o     If you cannot see the records there, go back to the InfoProvider, right-click the ODS and choose Activate Data in ODS.
    o     Select the QM status, which will be yellow, double-click it, set it to green, click Continue, then back on the previous screen save and close.
    o     Go back to the InfoSource and the InfoPackage, right-click the package, schedule it without changing any of the tabs, and go straight to the Monitor tab. Refresh until the request turns green.
    o     Once it is green, click the data target and check that the file was loaded.
    o     Now go to the InfoProvider, right-click and choose Create InfoCube. Give it a name and activate it. Drag the key figures, time characteristics and characteristics to their respective sections.
    o     Create the dimension table, assign it and activate the cube. Then right-click the cube, choose Create Update Rules, select the ODS option, give the ODS name and activate.
    o     Go back to the main screen and refresh. When you see the eight generated objects you will know the data mart (export DataSource) has been created.
    o     In the main screen, right-click the ODS and choose Update ODS Data in Data Target. Another screen opens with two options, full update and initial update; select initial update.
    o     Another InfoPackage screen opens with external data, data targets and scheduler options; select Start Later in Background, then Immediate, click Save (it closes automatically), and click Start.
    o     Then open the monitor, check the contents, and verify the loaded data.
    regards
    ashwin

  • How to Prevent CacheStore from Getting Called when Loading from DB

    Hi,
    I have a cache with write-behind enabled. The issue is that when I'm initializing the cache from its persistence store (SQL Server 2005) I don't want it to call its CacheStore implementation. The Coherence book written by Alexander Seovic recommends using another cache to control writing to different caches, sort of like a global flag, but that will only work in a write-through scenario and not with write-behind. One of my theories is to use a MapTrigger: when I'm loading from the db I intercept the call and tell the object not to write to the DB, maybe by writing directly to the backing map, though I'm not sure whether writing to the backing map prevents the CacheStore from being called. Please let me know. Thanks.

    Hi user13402724,
    The documentation covers a scenario like this here: http://download.oracle.com/docs/cd/E14526_01/coh.350/e14509/appsampcachestore.htm#sthref512
    JK
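    The idea usually suggested for this is a cache store that can be switched off while the cache is being warmed from the database. Below is a generic sketch in plain Java, illustrative only; it is not the actual com.tangosol.net.cache.CacheStore interface and the method names are simplified:

        import java.util.concurrent.atomic.AtomicBoolean;

        // While warm-up is in progress, store() calls are ignored instead of
        // writing the freshly loaded data straight back to SQL Server.
        class ControllableStore {
            private final AtomicBoolean storeEnabled = new AtomicBoolean(true);

            void disableStore() { storeEnabled.set(false); } // call before warming the cache
            void enableStore()  { storeEnabled.set(true); }  // call once warm-up is done

            void store(Object key, Object value) {
                if (!storeEnabled.get()) {
                    return; // data came from the DB during warm-up, don't write it back
                }
                // ... normal write-behind persistence would go here ...
            }
        }

    In a real cluster the flag has to be visible on every storage-enabled node (which is where the "control cache" idea mentioned above comes in), because with write-behind the store() call runs on the node that owns the entry.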

  • Issue with Chinese characters while sending IDOCs from PI to SAP system

    Hi,
    We are working on a File to IDoc scenario in which some of the fields contain Chinese characters. In the SAP system, we noticed that the Chinese characters are getting replaced with #, whereas in the PI output payload we are able to see the correct Chinese characters. The systems involved are PI 7.4 and SAP ECC 6.0.

    Unlike in a dual stack, where we have an option to choose Unicode/non-Unicode, I don't see any such option in a single stack.
    I'm not sure whether it works, but you can give it a try.
    I assume you have created the RFC destination in NWA.
    Go to RFC destination -> Specific Data -> Advanced Settings; there is an option to specify the code page.
    You might want to set the code page for Chinese characters and see if it helps.
    Before doing this, do check Mark's suggestion.

  • MBP 2010 13'' completely freezing when loading from sleep after battery drain

    Hi all,
    Ever since upgrading to Mountain Lion, every time my MBP 2010 13'' goes to sleep after the battery drains, it freezes on the last "dash" of the wake-up process. Furthermore, I noticed that the background during wake-up is no longer my dimmed desktop as it used to be on Lion; it is the Apple logo on a white background (similar to the power-up screen). I experience none of the other freezes reported by the community and attributed to ML, iTunes or Safari, only this one when recovering from sleep. This last time I waited a couple of hours before powering down, which only resulted in abnormal heating. I have already tried a full charge, a full recharge, a reboot, cmd+opt+p+r, ... Does anyone else have this? Is it a known problem being somehow addressed by Apple? Thanks in advance!

    So can anyone tell me how I can open an issue on this with Apple? Thanks.

  • Error when loading from External Tables in OWB 11g

    Hi,
    I face a strange problem while loading data from a flat file into external tables.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
    In a total of 9771 records, nearly 70 records are rejected due to the above mentioned error. The column (EXPIRED) for which the error is reported never holds a value longer than 1 character, so I suspect it is a different problem.
    Example: One such record that got rejected is as follows:
    C|234|Littérature commentée|*N*|2354|123
    Highlighted in bold is the EXPIRED column.
    When I tried to insert this record into the external table using the UTL_FILE utility, it loaded successfully. But when I try with the file already existing in the file directory, it fails again with the above error. I would also like to mention that not all the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the External Table the Description Value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
    JL.

    Sorry, I couldn't see the highlighted text. Could you please enclose it in [code] tags?
    Also post the table definition with its attributes. By the way, what is your NLS_LANGUAGE set to?

  • Buffer error when loading from EAS??

    All,
    We're on Essbase 7.1.2 and having trouble loading an ASO cube through EAS. We keep getting error 1270040: Data load buffer [2] does not exist.
    I did some research on this, and it should only happen when running through MaxL, not in EAS. From what I read, EAS is supposed to handle this transparently.
    Does anyone have any ideas why this might be happening?
    Additional info: It only seems to happen when using an EAS client on a remote computer, not on the server itself.
    Thanks,

    This is one of those error messages that does not mean what you think it means. I have had this error come up when something was wrong in my load rule: a misspelled dimension or member name, a missing dimension, etc. Check your load rule carefully; sometimes the errors are hard to spot.

  • Why won't Chinese characters show after copying from Adobe Reader?

    Hi,
    I'm using Adobe Reader 9 to read a pdf of a Chinese grammar book. I tried copying the characters and pinyin from the pdf into  Mnemosyne to use for studying, however, it just comes up blank. I did it for another pdf and it comes up with weird characters. I downloaded the Chinese Simplified support and it does the same thing even after restarting my computer. Both pdfs allow for copying and were created with Adobe Distiller 7. I don't know what the problem is, but I would very much appreciate any help so I can get to studying.
    Thank you,
    hb

    If you don't get an answer here, you may want to post the question in the iMovie discussion here.
