String to Packed ASCII conversion

Hi,
Can anyone provide any examples or Algorithm to convert 'ASCII' to 'Packed ASCII'
Packed ASCII format is used in HART protocol
Thanks.

Hi,
Packed ASCII is nothing but a data compression technique.
HART makes limited use of data compression in the form of Packed ASCII.  Normally a full byte is used to represent a character, which allows 256 possible values.  Packed ASCII is a subset of full ASCII and uses only 64 characters: the upper-case letters, the digits 0 through 9, and a few punctuation marks, so each character fits in 6 bits.  Many HART parameters need only this limited ASCII set, which means that the data can be compressed to 3/4 of its normal size.  This improves transmission speed, especially if the textual parameter being communicated is a large one.
    Since only full bytes can be transmitted, the 3/4 compression is fully realized only when the number of uncompressed bytes is a multiple of 4.  Any fractional part requires a whole byte.  Thus, if U is the number of uncompressed bytes and T the number of transmitted bytes, compute T = (3*U)/4 and round any fractional part up to the next whole byte.  As examples, U = 3, 7, 8, and 9 result in T = 3, 6, 6, and 7.
    The rule for converting from ASCII to Packed ASCII is simply to remove bits 6 and 7 (the two most significant).  An example is the character "M": its full binary code is 0100 1101, and its packed code is the six bits 00 1101.  The rules for converting from Packed ASCII back to ASCII are (1) set bit 7 = 0 and (2) set bit 6 = complement of Packed ASCII bit 5.
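    To make these rules concrete, here is a minimal Java sketch of the packing and unpacking (illustrative only, not taken from the HART specification; it assumes the input is already padded with spaces to a multiple of 4 characters, and the byte layout shown, with the first character in the most significant bits, is one common convention - check the command specification for the exact layout your device expects):
    class PackedAscii {
        // Pack 4 six-bit characters into 3 bytes, first character in the high bits.
        static byte[] pack(String s) {
            byte[] out = new byte[(s.length() * 3) / 4];
            for (int i = 0, o = 0; i + 3 < s.length(); i += 4, o += 3) {
                int c0 = s.charAt(i)     & 0x3F;   // keep the low 6 bits (drop bits 6 and 7)
                int c1 = s.charAt(i + 1) & 0x3F;
                int c2 = s.charAt(i + 2) & 0x3F;
                int c3 = s.charAt(i + 3) & 0x3F;
                out[o]     = (byte) ((c0 << 2) | (c1 >> 4));
                out[o + 1] = (byte) (((c1 & 0x0F) << 4) | (c2 >> 2));
                out[o + 2] = (byte) (((c2 & 0x03) << 6) | c3);
            }
            return out;
        }
        // Unpack 3 bytes into 4 characters: bit 7 = 0, bit 6 = complement of bit 5.
        static String unpack(byte[] p) {
            StringBuffer sb = new StringBuffer();
            for (int i = 0; i + 2 < p.length; i += 3) {
                int b0 = p[i] & 0xFF, b1 = p[i + 1] & 0xFF, b2 = p[i + 2] & 0xFF;
                int[] c = { b0 >> 2,
                            ((b0 & 0x03) << 4) | (b1 >> 4),
                            ((b1 & 0x0F) << 2) | (b2 >> 6),
                            b2 & 0x3F };
                for (int j = 0; j < 4; j++) {
                    sb.append((char) (((c[j] & 0x20) == 0 ? 0x40 : 0x00) | c[j]));
                }
            }
            return sb.toString();
        }
    }
    With this layout, pack("TEMP") yields the three bytes 0x50, 0x53, 0x50, and unpack turns them back into "TEMP".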
    Note that, with some exceptions, HART Slaves don't need to do the compression or know anything about the compression.  They simply store and re-transmit the already compressed data.  Again, this is an instance where the more difficult software is placed in the device (Master) that is more capable of dealing with it.
Thanks and Regards
Himanshu Goyal | LabVIEW Engineer- Power System Automation

Similar Messages

  • Reg EBCDIC to ASCII conversion

    Hi,
    our project involves converting from EBCDIC to ASCII,
    and we need to make some changes to the programs shown in T-code SLIN. If anyone has a document on how to make the required corrections, please help me in this regard.
    Regards.
    venkat

    hi
    DATA: pack TYPE p DECIMALS 2.
    First pass the character value to pack, and then pass it from pack to a floating-point variable.
    Regards
    Neha

  • EBCDIC - ASCII Conversion for XI File Adapters

    All:
    Does the XI File Adapter have a built-in capability to convert EBCDIC to ASCII and vice versa?  We also need to handle packed decimals.  All mainframe file-based integration scenarios require this capability.
    If there is no built-in capability, are there any Java APIs available somewhere in the SAP XI bundle?
    Thanks and appreciate your feedback.

    Hi,
    Unfortunately, there is no built-in EBCDIC-ASCII conversion in XI.  There are examples you can find with a Google search.
    Such a Java function can be used in either a Java mapping or an adapter user module.
    Regards,
    Bill
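    If you end up writing that Java function yourself, the pure character-set part can be done with java.nio.charset; here is a minimal sketch, assuming the mainframe code page is IBM037 (Cp037) - substitute the code page your files actually use:
    import java.nio.ByteBuffer;
    import java.nio.charset.Charset;
    class EbcdicConverter {
        // Cp037 is assumed here; adjust to the actual mainframe code page.
        static String ebcdicToAscii(byte[] ebcdic) {
            return Charset.forName("Cp037").decode(ByteBuffer.wrap(ebcdic)).toString();
        }
        static byte[] asciiToEbcdic(String text) {
            ByteBuffer buf = Charset.forName("Cp037").encode(text);
            byte[] out = new byte[buf.remaining()];
            buf.get(out);
            return out;
        }
    }
    Note that this only covers character data; packed (COMP-3) decimal fields are binary and have to be unpacked before any character-set conversion, otherwise they will be corrupted.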

  • Binary to Ascii conversion

    Dear experts,
    For BINARY to ASCII conversion, please give me some example programs (using methods).
    Thanks a lot in advance.
    Regards,
    Matt

    Hi,
    I think this link will be useful for you. Please go through it:
    http://help.sap.com/saphelp_erp2004/helpdata/en/d6/0dba8f494511d182b70000e829fbfe/frameset.htm
    Hope it helps.

  • EBCDIC to ASCII conversion

    Hello experts,
    I'm performing an EBCDIC to ASCII conversion in our system, and the export (command R3SETUP dbextr.r3s) terminates with the error:
    (EXP) TABLE: "MCDYNTYPEN"
    (EXP) TABLE: "MCDYNUM"
    DbSl Trace: Error -904 in function exec_cached_fetch
    (EXP) ERROR: DbSlExeRead: rc = 99, table "MCHA"
         (SQL error -904)
    error message returned by DbSl:
    DbSl Trace: SQL error -904 with reason code 13
    Resource limit exceeded. Job=323987/DEVCPC/R3LOAD
    #STOP: 20100416175601
    Thank you,

    Once more, the installation stopped with an error...
    ERROR 2010-07-07 17:33:50 DBR3LOADEXEC_IND_DB4ASCII R3loadPrepare:0
    Child exited with error: rc = 2                                
    child_pid = 3                                                  
    See logfile SAPAPPL0.log for further information
    ERROR 2010-07-07 17:34:33 DBR3LOADEXEC_IND_DB4ASCII R3loadPrepare:0 
    Child exited with error: rc = 2                                 
    child_pid = 3                                                   
    See logfile SAPAPPL1_10.log for further information
    .ERROR 2010-07-07 17:36:01 DBR3LOADEXEC_IND_DB4ASCII R3loadPrepare:0
    Processes started: 46                                          
    Ended with error:  2                                           
    load tool ended with error.                                    
    See above SAP*.log errors.
    Error: Can not unlock the LogWriter. 
    Possible reasons:                                                                   
    No permissions to set the lock             
    Error: Can not lock the LogWriter.                                  
    Possible reasons:                                                   
    No permissions to set the lock                                      
    Error: Can not unlock the LogWriter.                                
    ERROR 2010-07-07 17:36:04 DBR3LOADEXEC_IND_DB4ASCII InstallationDo:0
    Phase failed.
    The message in "SAPAPPL0.log" is
    "(IMP) ERROR: SQL statement failed: DROP TABLE "MCHA"
    MCHA in *LIBL not table, view, or physical file.
    " and in "SAPAPPL1_10.log" is
    "#Trying to create primary key "S602+0"
    (IMP) ERROR: CREATE statement failed for object "S602"
         (ALTER TABLE R3DEVDATA/"S602" ADD CONSTRAINT "S602+0" PRIMARY KEY ( "MANDT", "SSOUR", "VRSIO", "SPMON", "SPTAG", "SPWOC", "SPBUP", "VKORG", "VTWEG", "ZZIDIWTIS", "PRODH", "MATNR", "WERKS"  ))
    DbSlExecute: rc = 99
    (SQL error -603)
    error message returned by DbSl:
    Unique index cannot be created because of duplicate keys. "

  • Time comparision of ASCII conversion VS Unicode conversion

    In general, how does the runtime of a Unicode conversion compare to that of the ASCII conversion?  For example, suppose you perform the Unicode conversion on the same hardware on which you completed the ASCII conversion, and the ASCII conversion took 20 hours.  Does the Unicode conversion take a similar time?
    thanx
    Mark

    Hi Mark,
    first an answer that comes only now because my user was somehow locked or whatever ...
    How long does it take compared to the ASCII CPC?
    As you are, I guess, a Latin-1 customer, you could make use of the InPlace version in both attempts.
    The InPlace Unicode CPC export will take 6-7 h, or the time of the ASCII export if that was longer. The reload will take only about 3 h.
    The major problem lies elsewhere:
    The combined upgrade only works well if you run all the scans, and this will take several days ... even though they are "pretty useless" in your case. Therefore, I would recommend an ECC 6 upgrade without Unicode and then the Unicode conversion directly afterwards, using a short special technique that I have already used several times.
    If it is "just" from 4.6C ASCII to ECC 6.0 Unicode, the week will be fully sufficient, including all backups, which can sometimes run in parallel (at least with my approach).
    I did something like that from 4.6B ASCII to ECC 5 Unicode in one weekend, from Friday evening until Sunday afternoon (depending, of course, on the system size).
    Regards
    Volker Gueldenpfennig, consolut.gmbh
    http://www.consolut.de - http://www.4soi.de - http://www.easymarketplace.de

  • Downtime for EBCDIC to ASCII conversion

    Hello,
    we have performed successfully an EBCDIC to ASCII conversion for a client's development system.
    The total downtime was about 24 hours.
    The customer, though, refuses to accept downtime of their production system, since it severely affects their operations.
    Is there a possibility of having no downtime at all for an EBCDIC to ASCII conversion?
    As an alternative, we were thinking of building a second productive system, doing the conversion there, and then, after finishing, applying journal receivers from the real production system. This would reduce their downtime to the backup time of the production system plus the time needed to apply the journal receivers.
    Has anyone performed such a task?
    Would this be feasible (we would have to apply journals from an EBCDIC system to an ASCII system)?
    Thank you very much
    Katerina Psalida

    Hi Katerina,
    applying journal changes from the EBCDIC system to the ASCII system will not work for several reasons, primarily because the journal keeps track of the journaled tables through an internal journal ID, which will not be the same after the EBCDIC to ASCII conversion. Technically it would not work because the data in the journal entries is kept very low-level, so a conversion from EBCDIC to ASCII during apply is not implemented. Also, the journal entries are based on the relative record numbers in the table, and after the conversion, the relative record numbers will not necessarily be the same.
    I am not aware of a zero-downtime conversion option. You can speed up the conversion if you use the "Inplace" conversion option. Did you use that when you measured the 24 hours downtime at the test system? If not, you should give the Inplace option a try. Depending on your data, it could reduce the downtime significantly.
    Kind regards,
    Christian Bartels.

  • String Literal for ASCII Encoding

    I need to know the string literal for the ASCII character encoding in Java (as ISO Latin-1 has "8859_1").
    Thanks in Advance!

    import java.nio.charset.Charset;
    import java.util.Set;
    import java.util.Iterator;
    class Test {
        void m(String name) {
            Charset s = Charset.forName(name);
            System.out.println("display name= " + s.displayName());
            Set aliases = s.aliases();
            Iterator it = aliases.iterator();
            while (it.hasNext()) {
                String x = (String) it.next();
                System.out.println("alias= " + x);
            }
        }
        public static void main(String[] args) {
            if (args.length > 0) new Test().m(args[0]);
        }
    }
    Sample output:
    display name= US-ASCII
    alias= us
    alias= ISO_646.irv:1991
    alias= ANSI_X3.4-1968
    alias= iso-ir-6
    alias= 646
    alias= ISO646-US
    alias= cp367
    alias= ANSI_X3.4-1986
    alias= csASCII
    alias= ASCII
    alias= iso_646.irv:1983
    alias= IBM367

  • String (binary) to boolean conversion problem.

    Hello,
    I'm facing problems converting a string (presumably in binary format) into LED states according to each bit's weight, e.g. for the binary input 1010 the first (MSB) and third LEDs would be turned on while the second and fourth are left off.
    What I have managed to obtain is a direct conversion of decimal to binary, but I don't have much idea how to achieve the above goal. The attached file shows two operations: the top part, which does the boolean-to-binary conversion, is fine; the bottom part is supposed to be the binary-to-boolean conversion.
    Attachments:
    boolean to string.vi ‏21 KB

    OK, let's back up a second here. I don't understand what your "boolean to string" VI is doing. Are you starting with a string, a number, or a bunch of Boolean controls? The top part is dealing with Boolean controls and creating a string of characters of "0" and "1". The bottom part you have a numeric control. By the way, it is pointless to take the output of Number to Boolean Array, converting it to an array of 0s and 1s, indexing out each element, and then using the Not Equal to Zero operator. Just take the Boolean array output from Number to Boolean Array directly into an Index Array!
    You seem to be saying that you have a string in the binary format. This is somewhat meaningless, so I'm assuming you mean you have a string that consists of a sequence of the ASCII characters "1" and "0" to indicate a numerical 1 or 0. You then want to convert this into something that is programmatically useful. What that is is not clear, so let's assume an array of Booleans. If that's the case, then you can simply take advantage of the fact that you're starting out with ASCII characters, and use the ASCII codes to find out what you have. The ASCII code for the character "0" is 30 (hex) or 48 (decimal). The ASCII code for the character "1" is 31 (hex) or 49 (decimal). Assuming this is what you have and what you want, then you can simply do this:
    Attachments:
    Example_VI.png ‏8 KB
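    For readers who just want the character-code idea from the answer above in text form (the original answer is a LabVIEW diagram), a rough Java equivalent might look like this, assuming the input is a string of '0' and '1' characters:
    class BinaryStringToBooleans {
        // '1' (ASCII 0x31) maps to true, '0' (ASCII 0x30) maps to false;
        // the first character of the string is treated as the MSB / first LED.
        static boolean[] toBooleans(String bits) {
            boolean[] out = new boolean[bits.length()];
            for (int i = 0; i < bits.length(); i++) {
                out[i] = bits.charAt(i) == '1';
            }
            return out;
        }
    }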

  • Unicode and ascii conversion help needed

    I am trying to read passwords from a FoxPro .dbf. The encryption of the password is crude: it takes the ASCII value of each character entered and adds an integer value to it, then stores the complete password in the table. So to decode, just subtract the same integer value from each character retrieved from the .dbf. Pretty simple.
    The problem is that Java chars and Strings are Unicode, so when my Java applet retrieves these ASCII values from the .dbf they are treated as Unicode chars, and if the ASCII value is over 127 I have problems.
    The question: how can I retrieve these ASCII values as ASCII values in Java?
    Should I use an InputStream like:
    InputStream is=rs.getAsciiStream("password");
    Is there a way to convert from unicode to extended ascii?
    Some examples would be helpful, Thanks in advance.

    version 1
    import java.nio.charset.Charset;
    import java.nio.ByteBuffer;
    import java.nio.CharBuffer;
    class Test {
        static char[] asciiToChar(byte[] b) {
            Charset cs = Charset.forName("ASCII");
            ByteBuffer bbuf = ByteBuffer.wrap(b);
            CharBuffer cbuf = cs.decode(bbuf);
            return cbuf.array();
        }
        static byte[] charToAscii(char[] c) {
            Charset cs = Charset.forName("ASCII");
            CharBuffer cbuf = CharBuffer.wrap(c);
            ByteBuffer bbuf = cs.encode(cbuf);
            return bbuf.array();
        }
    }
    version 2
    import java.io.*;
    import java.nio.charset.Charset;
    class Test {
        static char[] asciiToChar(byte[] b) throws IOException {
            Charset cs = Charset.forName("ASCII");
            ByteArrayInputStream bis = new ByteArrayInputStream(b);
            InputStreamReader isr = new InputStreamReader(bis, cs);
            char[] c = new char[b.length];
            isr.read(c, 0, c.length);
            return c;
        }
        static byte[] charToAscii(char[] c) throws IOException {
            Charset cs = Charset.forName("ASCII");
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            OutputStreamWriter osw = new OutputStreamWriter(bos, cs);
            osw.write(c, 0, c.length);
            osw.flush();
            byte[] b = bos.toByteArray();
            return b;
        }
    }
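    If the real trouble is the values over 127, one option (a sketch, assuming the .dbf stores one byte per character in an 8-bit "extended ASCII" code page such as ISO-8859-1, and that the offset is known) is to read the raw bytes, decode them as ISO-8859-1 so that every byte 0-255 maps to exactly one char, and only then subtract the offset:
    import java.nio.ByteBuffer;
    import java.nio.charset.Charset;
    class PasswordDecoder {
        static final int OFFSET = 5;   // hypothetical value; use the one your FoxPro app adds
        static String decode(byte[] stored) {
            // ISO-8859-1 maps bytes 0-255 one-to-one onto the first 256 Unicode code points,
            // so nothing is lost for values over 127.
            String s = Charset.forName("ISO-8859-1").decode(ByteBuffer.wrap(stored)).toString();
            StringBuffer sb = new StringBuffer(s.length());
            for (int i = 0; i < s.length(); i++) {
                sb.append((char) (s.charAt(i) - OFFSET));
            }
            return sb.toString();
        }
    }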

  • Boolean Array to Hex String - LabVIEW Supplementary Data Conversion Library in version 12 please

    Hello,
    I would like to use the Boolean Array to Hex String.vi in LabVIEW Supplementary Data Conversion Library at http://zone.ni.com/devzone/cda/epd/p/id/3727
    But it is version 4. Can someone give me the library in version 12? Attached herewith.
    Attachments:
    cnvrsion.zip ‏38 KB

    Mass compiled in 8.2.1, which you can open with 2012.
    Attachments:
    convrzun.llb ‏65 KB

  • Verify if string is valid ASCII

    I have a string and I want to verify that it contains only valid ASCII. For example, if the string contains characters like the right arrow, it should be rejected! Can anyone help? Thanks!

    What's in the db? A series of bytes from a different character set? (One which, when displayed on a terminal that may or may not understand the character set, displays a right arrow?)
    Perhaps the ultimately correct solution is to use java.io.InputStreamReader, constructing it with the correct character set.
    Or, to put data into the database using OutputStreamWriter, again specifying the correct character set.
    Then you catch CharConversionExceptions. That's how you detect bad input.
    Or, do the same kind of thing using String.getBytes and String constructors.
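    If the data is already in a Java String, one straightforward check (a sketch, not the only approach) is to ask the US-ASCII encoder whether it can encode every character; anything like an arrow symbol makes it return false:
    import java.nio.charset.Charset;
    import java.nio.charset.CharsetEncoder;
    class AsciiCheck {
        // true only if every character of s is representable in 7-bit US-ASCII
        static boolean isValidAscii(String s) {
            CharsetEncoder encoder = Charset.forName("US-ASCII").newEncoder();
            return encoder.canEncode(s);
        }
    }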

  • Hex Ascii conversion Problem

    Hi.
    I am working on a project where I send commands to the serial port. I am using the java.comm.* library and, for now, the provided sample program to send and receive commands.
    Before using Java, I used a program called terminal.exe to send commands such as:
    $71$00$00$00
    $71$40 (hex code)
    The second line works fine when I convert it to ASCII text to send, but the first line does not. I am having trouble with the 00. I have tried strings such as NULL, NUL, null, nul, and others, to no avail.
    Anyone have any ideas to what I can try next?
    Thanks for the help.

    Don't you just have to send the three characters Dollar Sign, Digit Zero, and Digit Zero? Or does that terminal.exe program convert the $nn things into a single byte?
    If it's the latter then just send 0. Not the character '0', the byte 0. You're sending bytes anyway, right?
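    In other words, parse each $nn token into a number and write it as a raw byte; a rough sketch (serialOut stands in for the OutputStream obtained from the comm port):
    import java.io.IOException;
    import java.io.OutputStream;
    class HexCommand {
        // Sends a command such as "$71$00$00$00" as raw bytes: $00 becomes the
        // single byte 0, not the text "NULL".
        static void send(OutputStream serialOut, String command) throws IOException {
            String[] parts = command.split("\\$");
            for (int i = 0; i < parts.length; i++) {
                if (parts[i].length() == 0) continue;   // skip the empty token before the first '$'
                serialOut.write(Integer.parseInt(parts[i], 16));
            }
            serialOut.flush();
        }
    }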

  • String - 7 bit Ascii

    Hi,
    I am creating a call logging system with a Java app that will collect the incoming/outgoing calls from a Meridian telephone system. My question is this:
    The data from the calls comes in through the COM port. If I collect it and write it to the screen, it's a jumbled mess. I know what the problem is, but I don't know how to implement the fix: I need to force the text to be converted to 7-bit ASCII, or something like that...
    Can anyone point me in the right direction?

    If you're sure that the data you're getting is ASCII characters, you can store those ASCII characters in an array of bytes, then create a String by passing that array to the constructor; the String will then contain the characters for those ASCII codes.
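    A small sketch of that idea, with one extra guess: serial feeds like this often arrive with a parity bit set in the high bit of each byte, so masking every byte down to 7 bits before building the String may be exactly what cleans up the jumbled output:
    import java.nio.ByteBuffer;
    import java.nio.charset.Charset;
    class ComPortText {
        // raw = bytes read from the COM port, length = how many of them are valid
        static String toAscii(byte[] raw, int length) {
            byte[] clean = new byte[length];
            for (int i = 0; i < length; i++) {
                clean[i] = (byte) (raw[i] & 0x7F);   // force each byte into the 7-bit ASCII range
            }
            return Charset.forName("US-ASCII").decode(ByteBuffer.wrap(clean)).toString();
        }
    }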

  • Check either String data is ASCII or Unicode

    Hi,
    How can we check whether String (array of characters) data is ASCII or Unicode using Java?
    Please reply immediately.
    Thanx in advance.

    Hi,
    > How can we check whether String (array of characters) data is ASCII or Unicode using Java?
    > Please reply immediately.
    That's not the correct way to ask :P
    > Thanx in advance.
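    Joking aside, if "ASCII" here means that every character fits in 7 bits, a simple per-character test is enough; a minimal sketch:
    class CharsetProbe {
        // true if every char of the String is in the 7-bit ASCII range;
        // false means the data needs a wider (Unicode) encoding.
        static boolean isPureAscii(String s) {
            for (int i = 0; i < s.length(); i++) {
                if (s.charAt(i) > 0x7F) {
                    return false;
                }
            }
            return true;
        }
    }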
