BIG ASCII string to HEX string

Hello guys,
I know there are already a lot of posts about this issue, but I couldn't find exactly what I need and also couldn't manage to solve it myself...
I need to convert a string with ASCII characters to a string containing HEX representation of these characters.
What I did was: 
Apparently it works very well, but only for a limited number of characters, defined by the data representation of the 0 constant (U64). It allows me to convert only 8 ASCII characters, as you can see in the image below:
So, it converted from 0 to 7, but not the remaining characters (8 and 9).
Any ideas??
Thanks in advance!
 

What is the data type of the constant wired into the Type Cast? My guess is it is a U64. If so, it will only typecast the first 8 bytes of your string, and the hex conversion is then done on those 8 bytes only.
Use the String to Byte Array function. Or you could still use Type Cast, but with an array of U8 integers wired into it instead of a U64 constant.
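If it helps to see the same idea in text code, here is a minimal Java sketch (class and method names are just for illustration): take the byte array of the string and format every byte as two hex digits, which is essentially what String to Byte Array followed by a per-element hex conversion does.

    import java.nio.charset.StandardCharsets;

    public class AsciiToHex {
        // Format every byte of the input string as two hex digits.
        static String toHex(String ascii) {
            StringBuilder sb = new StringBuilder();
            for (byte b : ascii.getBytes(StandardCharsets.US_ASCII)) {
                sb.append(String.format("%02X", b & 0xFF));
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(toHex("0123456789")); // 30313233343536373839
        }
    }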

Similar Messages

  • How can I convert an ASCII string of variable length to a HEX number?

    Hello,
I read data from the serial port 5 times a second (in a while loop), and the number of bytes read varies every time.
The data comes in as an ASCII string, and I can see the HEX representation if I connect a hex-display string indicator, but I need to permanently convert the data to a HEX number.
How can that be done?

    Like This.
    Ton
    Free Code Capture Tool! Version 2.1.3 with comments, web-upload, back-save and snippets!
    Nederlandse LabVIEW user groep www.lvug.nl
    My LabVIEW Ideas
    LabVIEW, programming like it should be!
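    For readers who prefer text code, a rough Java equivalent of typecasting the raw bytes to a number might look like the sketch below (names are mine; it assumes the intent is simply to interpret the incoming bytes as one big-endian integer, which works for up to 8 bytes):

    import java.nio.charset.StandardCharsets;

    public class BytesToNumber {
        // Interpret the raw bytes of the received string as one big-endian integer
        // (the text-code counterpart of wiring the string into Type Cast).
        static long toNumber(String raw) {
            long value = 0;
            for (byte b : raw.getBytes(StandardCharsets.ISO_8859_1)) {
                value = (value << 8) | (b & 0xFF);
            }
            return value;
        }

        public static void main(String[] args) {
            System.out.printf("%X%n", toNumber("\u00FF\n")); // prints FF0A
        }
    }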

  • Non ASCII String to Hex

    Hi,
I am writing an algorithm that first encrypts a String and then converts that String to a hex String. I can successfully parse hex Strings into ASCII Strings and vice versa, but once I encrypt, I cannot do this anymore. Is there a way to convert a non-ASCII character to hexadecimal and vice versa?
    Thanks in advance

    Maybe so....
    The encrypted value when printed does not format on my screen, and appears as boxes and such.
The values that I am passing in come from a very simple encryption: an XOR against a key. Here is that algorithm.
    public byte[] encrypt(byte[] value) {
        byte[] encrypted = new byte[value.length];
        byte[] adjustedKey = getKey();
        // Repeat the key until it is at least as long as the input.
        while (adjustedKey.length < encrypted.length) {
            String temp = new String(adjustedKey) + new String(key);
            adjustedKey = temp.getBytes();
        }
        // XOR each input byte with the corresponding key byte.
        for (int i = 0; i < value.length; i++) {
            encrypted[i] = (byte) (adjustedKey[i] ^ value[i]);
        }
        return encrypted;
    }
Since this is an XOR, I can use the same method to decrypt as well. This is what produces my non-ASCII characters.
Is there a way for me to get the byte value of a 1-digit hex value then? That may help some.
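    One common approach (a sketch, not taken from this thread) is to hex-encode the encrypted byte[] directly instead of round-tripping it through a String, since arbitrary bytes do not survive charset conversion:

    public class HexCodec {
        // Encode arbitrary bytes (e.g. the XOR-encrypted data) as a hex string.
        static String encode(byte[] data) {
            StringBuilder sb = new StringBuilder();
            for (byte b : data) {
                sb.append(String.format("%02x", b & 0xFF));
            }
            return sb.toString();
        }

        // Decode a hex string back into the original bytes.
        static byte[] decode(String hex) {
            byte[] out = new byte[hex.length() / 2];
            for (int i = 0; i < out.length; i++) {
                out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
            }
            return out;
        }
    }

    For a single hex digit, Integer.parseInt(digit, 16) already gives you its value (e.g. "a" becomes 10).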

  • Concatenate hex string with ASCII string

    Hello,
I'm new to LabVIEW and I'm building an application that talks to a camera. I'm stuck on something that seems quite basic: I need to concatenate a constant string of hex values (0002 040D ...) with an ASCII string of hex characters (00D0A10F, for example) that comes out of a "Number to Hex String" block. The output should be a string of hex values (0002 040D 00D0A10F) that goes to the camera, but the concatenate block turns the ASCII string into hex display too, so every character is converted to two hex symbols...
    I found a very old post where someone claims they found a solution; unfortunately, it's not posted...
    Does someone have an idea what do to do?
    Thanks a lot!
    Michal
    Solved!
    Go to Solution.

    Hi,
Give this a try - it converts an ASCII string of hex characters into the raw characters those hex values represent.
    So, if you view the output as hex, it should concatenate properly with your other data.
Dan
    CLD
    Attachments:
    hex to ascii.vi ‏8 KB

  • What is the most efficient way to turn an array of 16 bit unsigned integers into an ASCII string such that...?

    What is the most efficient way to turn a one dimensional array of 16 bit unsigned integers into an ASCII string such that the low byte of the integer is first, then the high byte, then two bytes of hex "00" (that is to say, two null characters in a row)?
    My method seems somewhat ad hoc. I take the number, split it, then interleave it with 2 arrays of 4095 bytes. Easy enough, but it depends on all of these files being exactly 16380 bytes, which theoretically they should be.
    The size of the array is known. However, if it were not, what would be the best method?
    (And yes, I am trying to read in a file format from another program)

    My method:
    Attachments:
    word_array_to_weird_string.vi ‏18 KB
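    For reference, here is a length-independent sketch of that byte layout in Java (my own illustration, not the attached VI): each 16-bit value becomes low byte, high byte, then two null bytes, regardless of the array size.

    public class PackWords {
        // Pack each 16-bit value as low byte, high byte, then two 0x00 padding bytes.
        static byte[] pack(int[] words) {
            byte[] out = new byte[words.length * 4];
            for (int i = 0; i < words.length; i++) {
                out[4 * i]     = (byte) (words[i] & 0xFF);        // low byte first
                out[4 * i + 1] = (byte) ((words[i] >> 8) & 0xFF); // then high byte
                // out[4*i+2] and out[4*i+3] stay 0x00 (the two null characters)
            }
            return out;
        }
    }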

Convert ASCII string to hex string

    I need to send a hex string such as 5051525354A5A6A7A8A9 to a DSP across a serial port.  The string is read in from a file.
    When I send this string to the DSP, it does not recognize the correct hex values.
If I enter the hex value, 5051525354A5A6A7A8A9, into a string control that is set to hex display and then change the control to normal display, I get PQRST¥¦§¨©. When I send this string, PQRST¥¦§¨©, to the DSP, it recognizes the correct hex values.
How can I convert the hex string read from the text file into this kind of ASCII character string? I've used Type Cast and the other functions available in LabVIEW with no success.
    This is only one of many strings I need to send so I don't want to hard code PQRST¥¦§¨© or some such string for each message.

    Here's a slightly simpler way.
    Message Edited by altenbach on 08-23-2007 12:17 PM
    LabVIEW Champion . Do more with less code and in less time .
    Attachments:
    HexStringtoBinaryString.png ‏10 KB
    HexStringtoBinaryString.vi ‏11 KB
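    The same conversion in text code, as a rough Java sketch (names are mine): parse each pair of hex digits from the file text and emit the raw character, which is exactly what turns "5051525354A5A6A7A8A9" into PQRST¥¦§¨©.

    public class HexFileToRaw {
        // Turn the hex text read from the file into the raw characters the DSP expects.
        static String hexToRaw(String hexText) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < hexText.length(); i += 2) {
                sb.append((char) Integer.parseInt(hexText.substring(i, i + 2), 16));
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            // 50 51 52 53 54 are "PQRST"; A5..A9 are the ¥¦§¨© characters.
            System.out.println(hexToRaw("5051525354A5A6A7A8A9"));
        }
    }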

  • How can I write this string to a file as the ASCII representation in Hex format?

I need to convert a number (say 16000) to a hex string (say 803E = 16,000) and send it using VISA Serial with the string control in hex mode. I have no problem with the conversion (see attached). My full command in the hex display must read AA00 2380 3E...
    I can easily get the string together when in Normal mode to read AA0023803E... but how can I get this to hex mode without converting? (i.e. 4141 3030 3233 3830 3345 3030 3030 3031 )
    Attachments:
    volt to HEX.vi ‏32 KB

Sorry, the little endian option was probably introduced in 8.0 (?).
    In this special case it's simple, just reverse the string before concatenating with the rest.
    It should be in the string palette, probably under "additional string functions".
(Note that this only works in this special case of flattening a single number, as we do here. If the data structure is more complex (an array, a cluster, etc.), you would need to do a bit more work.)
    Actually, you might just use typecast as follows. Same difference.
    I only used the flatten operation because of the little endian option in my version.
    Message Edited by altenbach on 11-16-2007 11:53 AM
    LabVIEW Champion . Do more with less code and in less time .
    Attachments:
    littleendian71.png ‏4 KB
    littleEndiancast71.png ‏4 KB
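    As a text-code illustration of the same little-endian trick (a sketch with made-up names, not the attached VIs): flatten the number low byte first and prepend the fixed command bytes.

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class LittleEndianFlatten {
        public static void main(String[] args) {
            int value = 16000; // 0x3E80
            // Flatten as little-endian: the low byte comes first, giving 80 3E.
            byte[] flat = ByteBuffer.allocate(2)
                                    .order(ByteOrder.LITTLE_ENDIAN)
                                    .putShort((short) value)
                                    .array();
            // Prepend the fixed command bytes AA 00 23.
            byte[] command = new byte[] {(byte) 0xAA, 0x00, 0x23, flat[0], flat[1]};
            for (byte b : command) {
                System.out.printf("%02X ", b); // AA 00 23 80 3E
            }
        }
    }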

  • Removing TCP hex codes from ascii string

I have a TCP connection to my VI and I'm using PuTTY to communicate with it. It is a basic terminal with a few commands. On boot, my first command is always unrecognized because a few Telnet protocol negotiation sequences are sent immediately beforehand:
    FF FB 1F “window size”
    FF FB 20 “terminal speed”
    FF FB 18 “terminal type”
    FF FB 27 “Telnet Environment Option”
    FF FD 01 “echo”
    FF FB 03 “suppress go ahead”
    FF FD 03 “suppress go ahead”
Each command is always preceded by the command character "FF". I can remove the entire offending string without any problem; however, what I've been trying to do is remove any hex code following the command character. I have tried a few things with regular expressions without much luck. Any insight on how to solve this challenge would be appreciated!
    Thanks,
    Devin 

I'm not a regular expression power user, but I'd try something like the code below (right-click on "Search and Replace String" to set the "Regular Expression" option). This code looks for \FF followed by either \F0 through \FA, or by \FB through \FE plus any other character, which I think matches the rules in the Telnet protocol as listed here: http://tools.ietf.org/html/rfc854 It replaces any match found with an empty string.
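    If it helps, the same pattern expressed in Java regex syntax (where hex escapes are written \xFF instead of \FF) could look like this sketch; it assumes the TCP data was read into a String with a one-byte-per-character encoding such as ISO-8859-1:

    import java.util.regex.Pattern;

    public class StripTelnetNegotiation {
        // Remove Telnet IAC sequences: \xFF followed by \xF0-\xFA, or by
        // \xFB-\xFE plus one option byte (per RFC 854).
        static final Pattern IAC =
                Pattern.compile("\\xFF(?:[\\xF0-\\xFA]|[\\xFB-\\xFE].)", Pattern.DOTALL);

        static String strip(String received) {
            return IAC.matcher(received).replaceAll("");
        }
    }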

  • Scanning an ascii string for a start and end character

I would like to watch data lines on a COM port and pick off the part of a line that starts with a "$" and ends with a CR. I would also like the flexibility to select different start and end characters. Is this doable with LabVIEW? I have the COM port monitoring done; I need to separate the incoming strings. I can't count characters because the data lines vary in length each time a line comes in. Thanks in advance for the help.
    [email protected]

If you don't want to mess with deciding whether or not you need to prepend a "\" to the character you are feeding into the Match Pattern function, then you could just do it all with byte arrays, as in the attached example. The example also shows the various ways that ASCII controls and indicators can be configured to let you more easily handle non-visible characters (just right-click on the front panel control/indicator and select normal, '\' codes, hex, or password display).
As far as your question about the error code meaning: you can right-click on a front-panel error cluster and select the "Explain Error" option to see all the known causes of the error. You then have to interpret this in the context of what you were doing. When I drop an error cluster on a front panel, set the status to Error and the code to 85, I can right-click on it, select "Explain Error", and get this:
    "Error 85 occurred at an unidentified location
    Possible reason(s):
    LabVIEW:  Scan failed."
     ...and that sounds to me like (as a WAG) you might be trying to use the Scan From String function with input data that does not match the format string.
    Attachments:
    String Subset.vi ‏53 KB
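    In text code, the same "start character to end character" extraction might look like the Java sketch below (my own illustration); Pattern.quote plays the role of prepending "\" to characters such as "$" that would otherwise be regex metacharacters.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class ExtractFrame {
        // Pull out the text between a start character and an end character,
        // quoting both so characters like "$" are treated literally.
        static String extract(String line, char start, char end) {
            Pattern p = Pattern.compile(
                    Pattern.quote(String.valueOf(start)) + "(.*?)" +
                    Pattern.quote(String.valueOf(end)));
            Matcher m = p.matcher(line);
            return m.find() ? m.group(1) : null;
        }

        public static void main(String[] args) {
            System.out.println(extract("junk$GPGGA,123519,4807.038\r", '$', '\r'));
        }
    }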

  • ASCII string to Unicode

My problem: I have to parse a file format where special characters are replaced by an (ASCII) escape sequence, e.g. \344 or \u2200.
I get these char sequences through a FileReader and a StreamTokenizer into a String. Since I need to draw the String, I need to re-create the special chars, but the approach from the "internationalisation tutorial" doesn't work here. There it is said to use String.getBytes("ENCODING") and then recreate a String from this byte array. But there is no difference!...
    Any ideas on how to do it?

Translating the string "\u2200" into the corresponding Unicode character is fairly simple: drop off the "\u" portion and execute char uChar = (char) Integer.parseInt(theRest, 16); This is simplified by the fact that the Unicode encodings are always "\u" followed by 4 hexadecimal digits -- are they not? Now, what about things like "\344"? Are they always the same length? (If not, how do we know that is the escape sequence and that it is not "\34" followed by the character "4"?) And are they decimal, hexadecimal, or octal?
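    Assuming the \nnn form turns out to be octal (as in Java and C string literals -- which is exactly the assumption the reply is asking about), a decoding sketch could look like this:

    public class UnescapeSequences {
        // Decode "\u2200" style escapes (4 hex digits) and "\344" style
        // escapes (up to 3 octal digits, as in Java/C string literals).
        static char decode(String escape) {
            if (escape.startsWith("\\u")) {
                return (char) Integer.parseInt(escape.substring(2), 16);
            }
            return (char) Integer.parseInt(escape.substring(1), 8);
        }

        public static void main(String[] args) {
            System.out.println(decode("\\u2200")); // U+2200, the FOR ALL sign
            System.out.println(decode("\\344"));   // ä, if the sequence is octal
        }
    }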

  • Performance problem submitting big XML string parameter to the web service

We deployed a web service on OC4J via oracle.j2ee.ws.StatelessJavaRpcWebService. This web service takes one string as a parameter. The string contains XML. Everything works great UNLESS the input XML string reaches 5 MB in size. When that happens, OC4J does something with it for about 10 minutes (yes, minutes) before it calls the web service method. During that time java.exe consumes 100% of the CPU.
We tried to increase the JVM heap size, stack size, etc. - no effect.
    Please, help!
    Thank you in advance,
    Vlad.

    Hi Sheldon,
What I feel is that your web service does not handle the case where the parameter is null or "" (blank).
I would just make sure in the web service that if the parameter is null or "", you pass null to the stored proc.
    Regards
    Pavan

  • Send ASCII String as Command to Modbus Ethernet Device

    Hello Precious Developers,
    How do we send the following string command from a PC to a Modbus Ethernet device at starting address 411000 using LVDSC:
    "<ID 0><CLR><WIN 0 0 287 31><POS 0 0><LJ><BL N><CS 2><RED><T>H</T>""$0D""$0A"
    (Function code 16 - Write multiple Holding registers)
    Any pointers / suggestions / example code shall be deeply appreciated.
    Solved!
    Go to Solution.

    hi,
I implemented your solution, but when there is an odd number of characters, the last character is not included in the U16 array. Is my implementation correct?
    Regards,
    Cedric
    Attachments:
    string_transfert_MODBUS.vi ‏7 KB
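    A text-code sketch of the packing (the names are mine, and the byte order within each register is an assumption -- some devices expect the low byte first): pad odd-length strings so the last character is not dropped.

    import java.nio.charset.StandardCharsets;

    public class StringToRegisters {
        // Pack an ASCII command string into 16-bit holding-register values
        // (two characters per register), padding with 0x00 when the length is odd.
        static int[] toRegisters(String command) {
            byte[] bytes = command.getBytes(StandardCharsets.US_ASCII);
            int[] regs = new int[(bytes.length + 1) / 2];
            for (int i = 0; i < bytes.length; i++) {
                int shift = (i % 2 == 0) ? 8 : 0;          // first char in the high byte
                regs[i / 2] |= (bytes[i] & 0xFF) << shift;
            }
            return regs;
        }
    }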

  • Generate wifi password 13 ascii or 26 hex

Is there an easy way to generate 13 ASCII characters or 26 hex digits for the 128-bit WEP password for the Wi-Fi in the MacBook Pro? Is there an online tool for doing that? Or is there some freeware software I can download to generate this?
    Thanks

    I found a web site here:
    http://www.string-functions.com/string-hex.aspx
If people can recommend other web sites or free software (I have Lion), I'll be interested.
    Thanks.
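    If you would rather generate the key locally than trust a web site, a tiny Java sketch like the following would do it (128-bit WEP uses a 104-bit secret key, i.e. 13 bytes, which is 26 hex digits):

    import java.security.SecureRandom;

    public class WepKeyGenerator {
        public static void main(String[] args) {
            // 13 random bytes = 104 key bits, printed as 26 hex digits.
            byte[] key = new byte[13];
            new SecureRandom().nextBytes(key);
            StringBuilder hex = new StringBuilder();
            for (byte b : key) {
                hex.append(String.format("%02X", b & 0xFF));
            }
            System.out.println(hex); // e.g. a 26-digit hex WEP key
        }
    }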

  • Convert FF Ascii To FF Hex

I have a serial device that returns a hex number, but in ASCII format(!).
    For example:
    Instead of returning  FF0A (viewed in a string indicator set to Hex display), it returns ff0a (as viewed in a string indicator set to normal display). But the data actually represents FF 0A.
    My question is how can I convert ff0a (ascii) to FF 0A (hex)?
    Thanks in advance!
    Solved!
    Go to Solution.

    Hexadecimal String to Number.
    Remember:
    The LabVIEW Help is there to help you.  
The functions palette has a search capability.
    Message Edited by smercurio_fc on 06-15-2010 09:22 AM
    Attachments:
    Example_VI_BD.png ‏1 KB

  • 5508 wlan gui psk format shows ascii is actually hex

Version 7.0.116.0: the WLAN PSK format shows as ASCII in the GUI but is actually configured as hex.
It appears to be a cosmetic bug, as clients still work OK when entering the correct hex key.
    Thanks
Has anyone seen this on WLC controllers? My lab 2106 also has the same trouble with the 7.0.98 code.

    Hello Scott - thanks for the reply.
    You misunderstand the problem.
Try setting the PSK key to HEX format in the GUI. After you set up the hex key, it works OK for clients, but
the GUI screen still shows it as ASCII.
