How to deal with such Unicode source data in BI 7.0?

I encountered an error when activating DSO data. It turned out that the source data contains Unicode in its HTML representation. For example, the source character string is:
ABCDEFG& #65288;XYZ  (I added a space between & and # so that the browser won't render it as the Unicode character here on SDN)
After some analysis, I see it actually stands for the Unicode string
ABCDEFG（XYZ
Please notice the wide left parenthesis. It is the actual character behind the &#xxx notation above. To compare, here is the full-width Unicode parenthesis '（' and here is the ASCII one '('. You can see they are different.
My question is: since I have trouble loading the &#... string, I think I should translate it to the actual Unicode character ('（' in this case). But how can I achieve this?
Thanks!

I found this is called a "Numeric character reference", or NCR, in HTML terms. So the question is how to convert a string containing NCRs back to the actual Unicode characters. Thanks.
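For what it's worth, here is a minimal sketch of the conversion technique in Java (in BI 7.0 the same idea would typically go into a transfer/update or transformation routine in ABAP): find each decimal NCR with a regular expression, parse the code point, and replace it with the corresponding character. The class and method names are only illustrative.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NcrDecoder {

    // Decimal NCRs like &#65288; (hex references such as &#xFF08; would need a small extension)
    private static final Pattern DECIMAL_NCR = Pattern.compile("&#(\\d+);");

    public static String decode(String input) {
        Matcher m = DECIMAL_NCR.matcher(input);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            int codePoint = Integer.parseInt(m.group(1));
            // Character.toChars turns the code point into UTF-16 chars (a surrogate pair if needed)
            m.appendReplacement(sb, Matcher.quoteReplacement(new String(Character.toChars(codePoint))));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(decode("ABCDEFG&#65288;XYZ"));  // prints ABCDEFG（XYZ
    }
}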

Similar Messages

  • How to deal with such background noise?

    Hi.
    I have one audio clip, recorded in an airport, and there's a lot of background noise in it. The microphone was too far from the speaker, so there's not much I can do to take the noise away (except the hiss).
    But I've heard that something can be done using reverb. I tried different options and it seems promising, because my goal is to have the speaker's voice understandable, not high-quality output. But I still didn't get what I wanted.
    How is this done? Or maybe there's some other option?
    Here's the sample. http://www.adrive.com/public/90859f4687438404025bdb9bde6470031e05dd71a3640a8a435048faef761ed4.html
    Thanks for help!

    Hi Steve and Bob!
    I came to realize that I really don't know how to play around with the NR effect settings, except moving the noise reduction level here and there. Therefore I haven't really used it, because I was getting too many artifacts.
    Ideally you need a sample long enough to let you use the highest FFT number there is in NR
    OK, I found a place with enough noise and could capture a 24000-point FFT sample. Using that gives a better result before it gets too hollow, for sure, but at the same time I have no idea of the best way to tweak the other settings, such as precision factor, smoothing amount, etc. Also, there's this 'snapshots in profile' setting; is that useful to tweak as well?
    I will upload a sample with only noise, maybe you could try again and let me know what settings did you use. http://www.filedropper.com/noisesample
    Is there some nice tutorials about in depth noise removal? I have been looking for them and haven't found much.
    By the way, could you recommend a decent microphone and recorder for recording classes and discussions? I really need to get something high quality to avoid such problems in the future.

  • How to deal with large amount of data

    Hi folks,
    I have a web page that needs 3 maps to serve it. Each map will hold about 150,000 entries, and all of them together will use about 100MB. For some users with lots of data (e.g. 10,000,000 entries), it may use up to 1GB of memory. If a few of these high-volume users log in at the same time, it can bring the server down. The data comes from files; I cannot read it on demand because that would be too slow. Loading the data into maps gives me very good performance, but it does not scale. I am thinking of serializing the maps and deserializing them when I need them. Is that my only option?
    Thanks in advance!

    JoachimSauer wrote:
    I don't buy the "too slow" argument.
    I'd put the data into a database. They are built to handle that amount of data.
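    For reference, a minimal sketch of the serialize/deserialize option mentioned in the question, assuming the map keys and values implement Serializable (whether this really beats a database is another question):

    import java.io.*;
    import java.util.HashMap;
    import java.util.Map;

    public class MapCache {

        // Write the map to disk so it can be dropped from memory
        public static <K, V> void save(Map<K, V> map, File file) throws IOException {
            try (ObjectOutputStream out = new ObjectOutputStream(
                    new BufferedOutputStream(new FileOutputStream(file)))) {
                out.writeObject(new HashMap<>(map));   // HashMap itself is Serializable
            }
        }

        // Read the map back when it is needed again
        @SuppressWarnings("unchecked")
        public static <K, V> Map<K, V> load(File file) throws IOException, ClassNotFoundException {
            try (ObjectInputStream in = new ObjectInputStream(
                    new BufferedInputStream(new FileInputStream(file)))) {
                return (Map<K, V>) in.readObject();
            }
        }
    }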

  • How to deal with the change of data element?

    Hi, experts
    My trouble's background is:
    A CBO table (ZIEBTCIITM) stores invoice items.
    There are three fields relevant to quantity in it.
    They have corresponding data elements of the same type: QUAN, length 13, with 3 decimal places.
    Now, because the quantities can be very small (for example 0.00004), 3 decimal places are not sufficient.
    I want to change the CBO table to fit my business, so the quantity fields' decimals have to be expanded first, and there is already transaction data in the CBO table.
    My question is:
    1) Is it necessary to back up the CBO table before the data element change?
    2) Is expanding the data elements' decimals all I need to do?
    Can anyone give me a suggestion?

    Hello Mic
    To be on the safe side I would suggest exporting the table entries to Excel and creating a transport request containing the original table and its entries:
    R3TR TABL ZIEBTCIITM
    R3TR TABU ZIEBTCIITM *
    Next you should create your own data element (or search for a suitable standard data element), e.g. ZQUAN13_5 (13 digits, 5 decimals).
    Replace the data element of the DB fields with your new data element and activate the DB table.
    I expect that nothing will happen to the entries in the DB table except that your quantity fields should have 5 decimals now.
    Regards
      Uwe

  • How to deal with special character in source file

    Hi experts,
    I am doing a file-to-file scenario in which my source file contains many special characters. When I put this file into moni, it still carries the special characters. My source file is a fixed-length file, so in content conversion I have specified the field lengths, but because of these special characters the field lengths vary. Please guide me on how to deal with these special characters in the sender adapter.
    regards,
    Saurabh

    You could try using a Java Mapping to change the encoding manually. For that, set the encoding on the OutputFormat of the XML you'll serialize. Try the following code piece for the mapping (inside a try/catch block):
    DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
    DocumentBuilder documentBuilder = factory.newDocumentBuilder();
    Document input = documentBuilder.parse(in);                          // parse the source payload
    OutputFormat format = new OutputFormat(input, "ISO-8859-1", false);  // target encoding, no indentation
    XMLSerializer serializer = new XMLSerializer(out, format);
    serializer.serialize(input);                                         // write the re-encoded XML to the output stream
    For more details check this guide:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/502991a2-45d9-2910-d99f-8aba5d79fb42

  • How to get around a performance issue when dealing with a lot of data

    Hello All,
    This is an academic question really, I'm not sure what I'm going to do with my issue, but I have some options.  I was wondering if anyone would like to throw in their two cents on what they would do.
    I have a report; the users want to see all agreements and all conditions related to the updating of rebates and the affected invoices. From a technical perspective: ENT6038-KONV-KONP-KONA-KNA1. These are the tables I have to hit. The problem is that when they retroactively update rebate conditions they can hit thousands of invoices, which blossoms out to thousands of conditions... you see the problem. I simply have too much data to grab; it times out.
    I've tried everything around the code.  If you have a better way to get price conditions and agreement numbers off of thousands of invoices, please let me know what that is.
    I have a couple of options.
    1) Use shared memory to preload the data for the report. This would work, but I'm not going to know what data needs to be loaded until report runtime. They put in a date. I simply can't preload everything. I don't like this option much.
    2) Write a function module to do this work. When the user clicks on the button to get this particular data, it will launch the FM in background and e-mail them the results. As you know, the background job won't time out. So far this is my favored option.
    Any other ideas?
    Oh... nope, BI is not an option, we don't have it. I know, I'm not happy about it. We do have a data warehouse, but the prospect of working with that group makes me wince.

    My two cents - firstly I totally agree with Derick that it's probably a good idea to go back to the business and justify their requirement in regards to reporting and "whether any user can meaningfully process all those results in an aggregate". But having dealt with customers across industries over a long period of time, it would probably be a bit fanciful to expect them to change their requirements too much, as in my experience they neither understand the technology (too much) nor want to hear about technical limitations of a system, etc. They want what they want, if possible yesterday!
    So, about dealing with performance issues within ABAP, I'm sure you must already be using efficient programming techniques like hashed internal tables with unique keys, accessing rows of the table using field symbols and all that, but what I was going to suggest is to look at using [Extracts|http://help.sap.com/saphelp_nw04/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm]. I've had to deal with this a couple of times in the past when dealing with massive amounts of data, and I found it to be very efficient in terms of performance. A good point to remember when using Extracts is that, I quote from SAP Help, "The size of an extract dataset is, in principle, unlimited. Extracts larger than 500KB are stored in operating system files. The practical size of an extract is up to 2GB, as long as there is enough space in the filesystem."
    Hope this helps,
    Cheers,
    Sougata.

  • How to deal with QoS such as BestEffort, ExactlyOnce, ExactlyOnceInOrder

    When developing an SAP XI adapter, how do you deal with the QoS levels BestEffort, ExactlyOnce and ExactlyOnceInOrder?

    Take a look at the following link.
    http://help.sap.com/saphelp_nw04s/helpdata/en/8d/6ad6409ff68631e10000000a1550b0/content.htm
    regards
    Shravan

  • Dealing with large volumes of data

    Background:
    I recently "inherited" support for our company's "data mining" group, which amounts to a number of semi-technical people who have received introductory-level training in writing SQL queries and been turned loose with SQL Server Management Studio to develop and run queries to "mine" several databases that have been created for their use. The database design (if you can call it that) is absolutely horrible. All of the data, which we receive at defined intervals from our clients, is typically dumped into a single table consisting of 200+ varchar(x) fields. There are no indexes or primary keys on the tables in these databases, and the tables in each database contain several hundred million rows (for example, one table contains 650 million rows of data and takes up a little over 1 TB of disk space, and we receive weekly feeds from our client which add another 300,000 rows of data).
    Needless to say, query performance is terrible, since every query ends up being a table scan of 650 million rows of data. I have been asked to "fix" the problems.
    My experience is primarily in applications development. I know enough about SQL Server to perform some basic performance tuning and write reasonably efficient queries; however, I'm not accustomed to having to completely overhaul such a poor design with such a large volume of data. We have already tried to add an identity column and set it up as a primary key, but the server ran out of disk space while trying to implement the change.
    I'm looking for any recommendations on how best to implement changes to the table(s) housing such a large volume of data. In the short term, I'm going to need to be able to perform a certain amount of data analysis so I can determine the proper data types for the fields (and whether any existing data would cause a problem when trying to convert the data to the new data type), so I'll need to know what can be done to make it possible to perform such analysis without the process consuming entire days to analyze the data in one or two fields.
    I'm looking for reference materials / information on how to deal with these issues, particularly when a large volume of data is involved. I'm also looking for information on how to load large volumes of data into the database (current processing of a typical data file takes 10-12 hours to load 300,000 records). Any guidance that can be provided is appreciated. If more specific information is needed, I'll be happy to try to answer any questions you might have about my situation.

    I don't think you will find a single magic bullet to solve all the issues.  The main point is that there will be no shortcut for major schema and index changes.  You will need at least 120% free space to create a clustered index and facilitate
    major schema changes.
    I suggest an incremental approach to address your biggest pain points. You mention it takes 10-12 hours to load 300,000 rows, which suggests there may be queries involved in the process which require full scans of the 650 million row table. Perhaps some indexes targeted at improving that process would be a good first step.
    What SQL Server version and edition are you using?  You'll have more options with Enterprise (partitioning, row/page compression). 
    Regarding the data types, I would take a best guess at the proper types and run a query with TRY_CONVERT (assuming SQL 2012) to determine counts of rows that conform or not for each column. Then create a new table (using SELECT INTO) that has strongly typed columns for those columns that are not problematic, plus the others that cannot easily be converted, and then drop the old table and rename the new one. You can follow up later to address column data corrections and/or transformations.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • How to deal with deadlock on wwv_flow_data table when http server times out

    There are some threads about a deadlock on the wwv_flow_data table. None of them contain a real explanation for this behaviour. In my case I will try to explain what I think is happening. Maybe it helps somebody who is hitting the same matter.
    In my case, with APEX 3.2.1, I am navigating from one page to another. Doing this, APEX will lock the table wwv_flow_data. As soon as the other page is shown, the lock is released. But now this other page contains a badly performing query (standard report region). After 5 minutes the http server (mod_plsql) will time out and present the message "No response from the application server" on the screen. Meanwhile the query is still running on the database server and the lock stays on the wwv_flow_data table.
    Normal user behaviour is that the user will use the back button to return to the previous page and try again to navigate to the other page, or
    the user will try to refresh the page with the badly performing query.
    And voilà, now you have a deadlock on the wwv_flow_data table, since a second session is trying to do the same thing while the first hasn't finished yet.
    How to deal with it?
    First of all, have a good look at the badly performing query. Maybe you can improve it so that it finishes before the http server times out.
    In my case the 11gR1 optimizer couldn't handle a subquery factoring clause in the best way. After changing it back to a classical inline query the problem was solved.
    Secondly, you could increase the timeout parameter of the http server, although this is not the best way.
    Maybe it would be better if APEX, in a next version, released the lock on the wwv_flow_data table earlier or did a rollback just before the moment the http server times out.
    regards,
    Mathieu Meeuwissen

    Hello Shmoove,
    I saw your reply here and you probably understand the problems the HTTP 100 response may cause.
    I am trying to send an image that was taken by getSnapshot. The problem is that the server responds with this HTTP 100 message.
    I suspect that the reason my server doesn't recognize the file I'm sending from J2ME is that the "server to client" 100 response comes only after the second message (see what the TCP/IP viewer shows below):
    POST /up01/up02.aspx HTTP/1.1
    Content-Type: multipart/form-data; boundary=xxxxyyyyzzz
    Connection: Keep-Alive
    Content-length: 6294
    User-Agent: UNTRUSTED/1.0
    Host: szekely.dnsalias.com:80
    Transfer-Encoding: chunked
    400: Client to Server (126 bytes)
    78
    --xxxxyyyyzzz
    Content-Disposition: form-data; name="pic"; filename="david.jpg"
    Content-Type: application/octet-stream
    400: Connected to Server
    400: Server to Client (112 bytes)
    HTTP/1.1 100 Continue
    Server: Microsoft-IIS/5.1
    Date: Wed, 23 Mar 2005 00:47:02 GMT
    X-Powered-By: ASP.NET
    Any help will be appreciated,
    David

  • How to deal with credentials for external applications using a Java Client/

    Hi Guys,
    This is the case. I am integrating an external application with an ADF application. I have implemented some programmatic ViewObjects that are being filled by a REST Java client wrapper. Everything is working fine, but the issue is that the credentials the wrapper is using are hard-coded inside the Java class. I am thinking of asking for the credentials at the beginning of my task flow, storing them somewhere, and then using them to create my client wrapper (passing them in the constructor).
    However, I don't know if my approach is good, and I would like you to share your experiences or how you deal with this.
    Regards

    You can use the Credential Store Framework to store the credentials securely in the WebLogic server instead of hardcoding them in the Java class.
    The Credential Store Framework:
    - enables you to manage credentials securely
    - provides an API for storage, retrieval, and maintenance of credentials in different back-end repositories
    Check the documentation on CSF API -
    http://docs.oracle.com/cd/E29505_01/core.1111/e10043/devcsf.htm
    Major Steps -
    1. Create a credential map and key in em console to store the password (http://docs.oracle.com/cd/E25054_01/core.1111/e10043/csfadmin.htm)
    2. Use CSF API to retrieve the stored password
    3. In jazn-data.xml give permissions to access CSF key and map

  • How to deal with a file (for example .xml)? What format should the dir be?

    I'd like to operate on a file on disk, and I want to use a relative directory.
    How do I deal with the file directory? What format should the path be?
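    A minimal sketch of working with a relative path in Java: a relative path such as "data/students.xml" is resolved against the JVM's working directory (the "user.dir" system property), and forward slashes work on both Windows and Unix. The file name here is just an example.

    import java.io.File;

    public class RelativePathDemo {
        public static void main(String[] args) {
            // Resolved against the current working directory of the JVM
            File xml = new File("data/students.xml");

            System.out.println("Working directory: " + System.getProperty("user.dir"));
            System.out.println("Absolute path:     " + xml.getAbsolutePath());
            System.out.println("Exists:            " + xml.exists());
        }
    }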

    Hi Kamlesh,
    Thanks for your response.
    Actually, in the "Process External Bank Statement" window, I see that there are a few entries which are for the previous year and which have not been reconciled. I have never worked practically on BRS and hence I am scared to make any changes in the client's database without being confident about what I am doing. I need to reconcile one of their bank accounts for the month of April '08. I have a copy of the statements for the months ending 31st Mar 08 and 30th Apr 08. The closing balances are as below:
    31/03/08 - 2300000.00
    30/04/08 - 3100000.00
    Now my OB for Bank a/c for April '08 in SAP is 2300000.00 Dr.
    When i go to External Bank Reconciliation - Selection Criteria Screen (Manual Reconciliation), here are the detail that i enter:
    Last Balance: INR -7,000,000.00000 (Grayed out by the system)
    Ending Balance: INR -3,100,000.00000 (Entered by me)
    End Date: 30/04/08 (Entered by me)
    "Reconciliation Bank Statement" Screen opens up and shows the below balances in the screen:
    Cleared Book Balance: INR -7,000,000.00000
    Statement Ending Balance: INR -3,100,000.00000
    Difference: INR 3,800,000.00000
    As per the bank statement, I have found all the transactions listed out here for the month of Apr '08, but I also found that the open transactions from the months before April '08 are still lying in the "Process External Bank Statement" window.
    Could you please help me solve my issue as to what needs to be done, or could you also get me some links from where I can get a few documents on processing External Bank Reconciliations?
    That would be of great help to me. I need the steps as to what needs to be done first and then next, so that I can arrive at the correct closing balance for the month of April '08.
    Thanks in Advance....
    Regards,
    Kaushal

  • How to deal with the growing table?

    Tables keep growing in any application. In some applications, tables become larger and larger very quickly. How do you deal with this problem?
    I am developing an application system now. Should I add a lot of delete commands in the code for each table?

    junez wrote:
    Every tables are growing in any applications. In some applications, tables become larger and larger quickly. How to deal with this problem?
    I am developing an application system now. Should I add a lot of delete commands in the code for each table?
    Uh, well, yes: if you continually add rows to a table, the table will grow... sooner or later you will want to delete rows that are no longer needed. What did you expect? You have to decide what the business rules are to determine when a row can be deleted, and make sure your design allows for such rows to be identified. This is called..... analysis and design.
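    If the business rule turns out to be something simple like "rows older than N days can go", a periodic purge job is often enough. A rough JDBC sketch (the table and column names are made up for illustration):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Timestamp;
    import java.time.Instant;
    import java.time.temporal.ChronoUnit;

    public class AuditLogPurge {

        // Delete rows older than the retention window; run this from a scheduled job
        public static int purge(Connection conn, int retentionDays) throws SQLException {
            String sql = "DELETE FROM audit_log WHERE created_at < ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                Instant cutoff = Instant.now().minus(retentionDays, ChronoUnit.DAYS);
                ps.setTimestamp(1, Timestamp.from(cutoff));
                return ps.executeUpdate();   // number of rows removed
            }
        }
    }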

  • How to deal with images stored in oracle

    hi,
    can anyone help me to solve this issue please:
    In fact, I am developing a Swing-based standalone application built on a TCP/IP client-server connection, and the point is to display on my frame, for each student, his information and also his personal picture.
    Step 1: storing the personal picture into the Oracle database from a specific frame that allows specifying each NEW student's profile and his photo.
    Step 2: as needed, a specific frame retrieves all the information related to a student and his photo (into a JLabel or other Swing component).
    How do I deal with this storing and then the retrieving from the Oracle DB?
    any help please!

    If I understand your problem correctly, you need your client Java application to store and retrieve information from an Oracle DB.
    This can be done via JDBC.
    Here's the tutorial:
    http://java.sun.com/developer/onlineTraining/Database/JDBC20Intro/JDBC20.html
    Look at
    http://java.sun.com/developer/onlineTraining/Database/JDBC20Intro/JDBC20.html#JDBC2018
    for storing and retrieving binary data (like Java serialized objects or images, for example).
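    A minimal JDBC sketch of the store/retrieve part, assuming a table like STUDENT(ID NUMBER, PHOTO BLOB) and an existing Connection; the bytes read back can be turned into an ImageIcon for a JLabel. Table, column and class names are just for illustration.

    import java.io.File;
    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.swing.ImageIcon;
    import javax.swing.JLabel;

    public class StudentPhotoDao {

        // Store the picture file into the BLOB column
        public void savePhoto(Connection conn, int studentId, File imageFile) throws Exception {
            String sql = "UPDATE student SET photo = ? WHERE id = ?";
            try (FileInputStream in = new FileInputStream(imageFile);
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setBinaryStream(1, in, (int) imageFile.length());
                ps.setInt(2, studentId);
                ps.executeUpdate();
            }
        }

        // Read the picture back and put it into a Swing component
        public JLabel loadPhoto(Connection conn, int studentId) throws Exception {
            String sql = "SELECT photo FROM student WHERE id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, studentId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        byte[] bytes = rs.getBytes("photo");
                        return new JLabel(new ImageIcon(bytes));
                    }
                }
            }
            return new JLabel("no photo");
        }
    }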

  • How to deal with moving around like in gta2?

    Hi there,
    I'm programming a game, but I was wondering how to deal with moving around like in GTA2.
    My game is also top-down, with you (the player) constantly in the middle of the screen. I know how to deal with button presses etc., but how do I calculate how many pixels the map has to move along the x and y axes?
    It's difficult for me to explain... but I hope the picture is clear.
    greetings

    I was already afraid of this one... the picture is not clear :) Thanks for your response so far. I will try again, with the game GTA2 in mind :)
    see this screenie: http://www.vollversion.de/bilder/705_1_full.jpg
    My game looks the same as GTA2, top down, with the character in the middle of the screen (the one firing the flame thrower) (the player). If I press up he walks towards the police car; if I press left or right he only turns around (a combination of pressing up and left is also possible), so the character walks in the direction he is facing.
    Now I would like to know: how did they do this? How do you program such a movement? Because the character is constantly standing in the middle of the screen, the map has to move, like in almost every game... how do you calculate the x and y movement of the map?
    Please try again; hopefully the screenie will help.
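    A minimal sketch of the usual approach, assuming the player has a position in world (map) coordinates and a facing angle: pressing up moves the player along the facing direction, and the map is drawn shifted by the offset between the player's world position and the centre of the screen, so the player always stays centred. All names and constants here are illustrative.

    public class TopDownCamera {

        // Player state in world (map) coordinates
        double playerX = 1000, playerY = 1000;
        double facing = 0;                                     // radians, 0 = facing "up" the screen

        static final double TURN_SPEED = Math.toRadians(3);   // per frame
        static final double MOVE_SPEED = 4;                    // pixels per frame

        void update(boolean up, boolean left, boolean right) {
            if (left)  facing -= TURN_SPEED;
            if (right) facing += TURN_SPEED;
            if (up) {                                          // walk in the direction the player is facing
                playerX += Math.sin(facing) * MOVE_SPEED;
                playerY -= Math.cos(facing) * MOVE_SPEED;
            }
        }

        // How far to shift the map when drawing, so the player ends up at screen centre
        int mapOffsetX(int screenWidth)  { return (int) (playerX - screenWidth  / 2.0); }
        int mapOffsetY(int screenHeight) { return (int) (playerY - screenHeight / 2.0); }

        // In paint(): g.drawImage(map, -mapOffsetX(w), -mapOffsetY(h), null);
        //             g.drawImage(playerSprite, w / 2, h / 2, null);   // player stays centred
    }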

  • How to deal with 0...n or 1...n mappings?

    Hi all,
    I'm relatively new to BPM. I've already made several processes that are working fine. However, I'm now stuck because, for the life of me, I'm not able to understand how to deal with data mappings of nodes of a cardinality greater than 1...1.
    I'm used to dealing with 0...n inputs of Web Services in Webdynpro Java and CAF Application services, using java code... but I just don't get how to deal with these in a BPM data mapping scenario.
    Let's say you have a Web Service whose input is a node called "Employee", where you can add n Employee objects, like this:
       Employee (0...n)
          FirstName: String
          LastName: String
    How do you:
    1- map a 0...n context node from a Web Dynpro or Web Service output, already containing several employees, into this Employee 0...n WS input node?
    2- map a 0...n context node from a Web Dynpro or Web Service output, containing NO employees, into this Employee 0...n WS input node, without getting an error and the BPM crashing because it says the employee element is not found?
    Hopefully someone can help me with this, because I'm about to resort to calling the web service n times, one employee at a time, instead of one time with an Employee node containing n entries.
    Thanks!

    Hi Abhijeet,
    I think I should have mentioned this earlier. My Employee node is inside another node. So, the actual input structure of the WS is this:
    InputValues (1...1)
       Employees (0..n)
          FirstName: String
          LastName: String
    This scenario works OK in BPM when mapping a 0..n node using deep copy, as Jocelyn explained, but it still doesn't work if I want to pass an empty (not null) Employees array.
    If I go to the WS Navigator and run this WS with the following parameters, it runs ok (I get an output message saying no employee was selected, which is how it should work):
    InputValues -  "Is null" checkbox not activated.
       Employees - "Skip" checkbox not activated
          FirstName - "Skip" checkbox activated
          LastName - "Skip" checkbox activated
    However, if instead of activating the checkbox of, say, "FirstName", I enter a "" value, I get an error from the WS saying that's not a valid first name, which is also how it should work.
    In Java code, I would just pass an empty InputValues object to the WS, but I'm not sure how to do this in BPM without it being considered null, and without having to set one of its String child values to "".
    Do you know how to achieve this?

Maybe you are looking for

  • I keep getting ERRORs when I try to restore ??? Help!

    Whenever I try to restore I get an error, and I keep getting them. I have the latest iTunes. And my iPhone is stuck in recovery mode; I can't use it, please help.

  • Where can I download the Creative Suite 6 Design Standard Student Teacher Version install packages?

    I have a serial number and downloaded the normal install packages for CS6. Using my serial number failed. I wonder if this is because I only have a serial number for an educational version.

  • Could not see SH schema details in AWM 11g.

    Hi all, I have installed Oracle DB 11g (11.1.0.6) and installed AWM 11.1.0.7. I am trying to open the DB instance in AWM with the SH schema, but I could not see the dimensions which are already in place (as this is an Oracle-defined sample schema). What could be done

  • DVPFX980 DVD Player MP4 Bitrates

    I'm thinking about getting one, but was wondering what the limit for bitrates is. I do know that it supports MP4 simple profile from reading the manual, but what about other specs like the bitrate, fps, etc.? I'm thinking that keeping it under 1 mb/s

  • What happened to shortcuts in Numbers

    There used to be shortcut keys displayed when you went to Numbers - Table - Add Row Above.   I didn't memorize the shortcut keys, so where did they go?  They are still there for many of the other items, like Edit - Undo.   It appears that everything