Unsigned char[] problem

hi,
I am using JNI to test a C++ method that generates a random 128-byte unsigned char[]. In my C++ code I copy the unsigned char[] into a char[] (since I thought it always gives a positive number) and return it to Java as a jstring.
However, I'm not sure how to recover the unsigned bytes in Java from the returned String. I tried using int[] and float[], but somehow I still get negative bytes.
Could anyone please give me some hints? Any help would be greatly appreciated.
JNIEXPORT jstring JNICALL Java_RNGInvoker_getRNG(JNIEnv *env, jobject obj) {
     RNG R;
     unsigned char Data[128];
     R.gen_num(Data);
     char Data2[129];                    // one extra slot for the terminator
     for (int i = 0; i < RNG::RNG_RANGE; i++) {
          Data2[i] = Data[i];
     }
     Data2[128] = '\0';                  // NewStringUTF expects a 0-terminated string
     return (*env).NewStringUTF(Data2);  // still fragile: raw random bytes are not valid (modified) UTF-8
}
In Java,
class RNGInvoker {
    public native String getRNG();
    static {
        System.loadLibrary("rng");
    }
    public static void main(String[] args) {
        String temp = new RNGInvoker().getRNG();
        byte[] aa = temp.getBytes();
        int[] bb = new int[aa.length];
        for (int i = 0; i < aa.length; i++) {
            bb[i] = aa[i];   // was Integer.parseInt(String.valueOf(aa)), which parses "[B@..." and throws
            System.out.println(aa[i] + ";" + bb[i]);
        }
    }
}

C/C++ uses 'char' for bytes; Java uses chars in Strings, not bytes. You are generating a byte array and then trying to return that byte array as a String. When Java builds a String from a byte array it decodes the bytes using the platform's default charset. This may or may not change the actual values.
If you want to generate chars, then figure out which chars are valid for the range and charset you are using.
If you want a byte array, then return a byte array, not a String.
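A runnable Java-side sketch of that point (the byte values here are invented for illustration): routing raw bytes through a String re-encodes them, while a plain byte[] keeps them intact.

```java
import java.nio.charset.StandardCharsets;

public class BytesVsString {
    public static void main(String[] args) {
        // Simulated output of a native generator: values above 0x7F are
        // exactly the ones that break a String round-trip.
        byte[] raw = { 0x41, (byte) 0x80, (byte) 0xFF, 0x00, 0x7F };

        // Round-tripping through a String re-encodes the bytes: 0x80 and
        // 0xFF are not valid UTF-8, so they are replaced, and the data
        // is no longer the same (5 bytes in, 9 bytes out here).
        byte[] viaString = new String(raw, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);
        System.out.println("original length:   " + raw.length);
        System.out.println("via String length: " + viaString.length);

        // A byte[] passed as-is (what a jbyteArray gives you) is untouched;
        // "negative" values are just the unsigned 128..255 range,
        // recoverable with & 0xFF.
        for (byte b : raw) {
            System.out.print((b & 0xFF) + " ");
        }
        System.out.println();
    }
}
```

This is why returning a jbyteArray from the native side, rather than a jstring, is the usual fix for binary data.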

Similar Messages

  • Calling a dll that has an unsigned char, unsigned char * in a function call

    I have a dll that has the following function that I am trying to call from LabView
    GetAvailableData(unsigned short *Num1, unsigned short *pList, unsigned short in, unsigned short *ip)
    In this, the second parameter is an output. When I use the Call Library Function node, I use the second arg to create an indicator. When I run this program, the return value is always zero.
     I use the following for arg2:
    Name: arg2
    Type: Numeric
    Data type: 16bit Integer
    Pass: Pointer to value
    Any help is appreciated. 

    dinks-
    Just some janitorial work first: when posting, try to keep blank lines to a minimum. To your question, I think some clarification is needed. Tell me if this is right: you have a DLL that you are calling from LabVIEW using the Call Library Function node, and you are trying to get this unsigned char out of it.
     Yes. That is correct. 
    It returns 0 but it should be giving something else (what should it be returning?). Are you getting any errors running this, or are you just not getting the expected value back? Thanks for the clarification!!
    Thanks for the response. I think there are a lot of nitty-gritty details I need to learn about calling a C++ DLL from LabVIEW. The unsigned char problem was solved when I changed the output to an Array in LabVIEW.
    I have issues using C++ functions that return a bool value. 
    The other problem is with C++ functions that return a HANDLE.
    In the first case, where the return value is a bool, I assigned the return value in LabVIEW to a signed 32-bit integer. When I call this function, the returned bool is always some large number, irrespective of the C++ bool value (true or false).
    For the second case (return type HANDLE), I assume I should specify the return value as a signed 32-bit integer in the configure option of the Call Library Function node. I tried that, but it returns the same value whether I enable or disable the hardware. The DLL returns non-zero if the hardware I am interfacing with is found; if it doesn't find the hardware, the returned HANDLE is 0.
    I am following the  Call DLL.vi example for using various data types in C++. 
    Really appreciate your help.

  • Passing Jbytearray to unsigned char

    Hello Group:
    I am new to JNI and I have a little problem:
    I have a native method with the follow signature:
    public native int enroll(byte[] template);
    and I have this data type in C:
    typedef unsigned char FPIMAGE; /* fp image data object */
    Implementing my native method, I need to call a library function to process this image (in this case the byte[]), but the signature of that function is:
    FPCODE *cA = bd_fpCode(FPIMAGE *A, int xh, int yw, FPCODE *buf, int qval);
    A = pointer to the fingerprint image pixel matrix
    xh = number of rows (height)
    yw = number of columns (width)
    buf = pointer to an external buffer with minimum size of FPC_MAXLEN bytes as destination for the created fingerprint code
    qval = fingerprint image quality value (in %)
    cA = pointer to fp-code (cA == buf) or NULL, if processing failed
    Here is my problem: I don't know how to pass or convert my byte[] template to an FPIMAGE object.
    Can you help me, please?

    Hi,
    bd_fpCode() function is a 3rd party library, right ?
    You can use a JNI wrapper to do that, so you do not have to write C code.
    Look at http://jnative.free.fr/SPIP-v1-8-3/article.php3?id_article=4 how to pass pointers to your functions and how to handle structures.
    Alternatively, you can use env->GetByteArrayElements() in your own JNI wrapper lib see (http://java.sun.com/developer/onlineTraining/Programming/JDCBook/jnistring.html).
    --Marc (http://jnative.sf.net)
    Message was edited by:
    mdenty

  • Convert from char* to unsigned char*

    Hi, I try compile example from: http://download-uk.oracle.com/docs/cd/B19306_01/appdev.102/b14294/relational.htm#i1000940
    in Visual C++ and I have problem with:
    const string userName = "SCOTT";
    because string is defined as unsigned char* and "SCOTT" is char*, is any function to convert from type char* to type unsigned char* ?

    I apologize for this stupid question; the answer is very easy:
    const string userName = (const unsigned char*) "SCOTT";

  • AddPreferencePanel - unsigned char*

    The AddPreferencePanel function takes an unsigned char* as the menu text for the preferences dialog. I've tried storing the text as a signed char array and casting it to an unsigned char* for the function, but this doesn't seem to work. I have the following code:
    char prefText[] = "Export PDF\0";
    AIPreferenceItemGroupHandle prefItemGroup = NULL;
    AIMenuItemHandle menuItem = NULL;
    //Add the panel
    sAIPreference->AddPreferencePanel(message->d.self, //Plugin Ref
      reinterpret_cast<unsigned char*>(prefText), //Menu Item string
      kADMNonModalDialogID, //UID of dialog in plugin resources
      0, //Options
      &prefItemGroup, //[out] dialog item group ref
      &menuItem); //[out] menu item ref
    This adds text like "xport PDF •¢£†˙gkznzºª© ag ad" to the File > Preferences menu.
    Inside the Preferences dialog, the text looks like "xport PDF".
    I can't figure out why the first character is getting lost. I also can't figure out why there's garbage being printed in the first instance.
    Is there some special way to construct an unsigned char array? Illustrator SDK calls the argument a "localizable string". Is there some special way to construct one of those?

    I am able to add a menu to the preferences panel.
    But the problem is
    1. If I call the sADMDialog->Modal method on click of the menu item, I lose the way the preferences panel should be shown; the dialog appears on its own instead. My intention is to display my dialog within the preferences panel, as is done for other preferences items.
    2. If I call sAIPreference->ShowPreferencePanel I get the proper preview, but I am not able to edit the controls inside the dialog.
    Can you help me achieve both?
    Below is the code I am using.
    if (message->menuItem == this->PreferencesAIMenu)
    {
        //MyDialogParameters** parms_handle;
        //AIErr result = sADMDialog->Modal(message->d.self, "MediaBin Preferences", IDD_PREFERENCES, kADMModalDialogStyle,
        //    &MediaBinPlugin::InitializePreferencesDialog, (ADMUserData)parms_handle, 0);
        error = sAIPreference->ShowPreferencePanel(PreferenceItemGroupHandle);
        //ADMDialogRef result = sADMDialog->Create(message->d.self, "MediaBin Preferences", IDD_PREFERENCES, kADMModalDialogStyle,
        //    &MediaBinPlugin::InitializePreferencesDialog, (ADMUserData)parms_handle, 0);
    }

  • How can I get an unsigned char string with nulls from a dll into LabVIEW 6i?

    The following ethernet packet data is contained in an unsigned char * string returned from my dll (it's formatted on printing):
    Received A 230 Packet On Adapter 0x008F0070
    Ethernet Dest: 01.E0.BB.00.00.15 Src: 00.E0.BB.00.DD.CC Type: 0x8868
    000000: 01 E0 BB 00 00 15 00 E0 : BB 00 DD CC 88 68 48 41 .............hHA
    000010: 00 E0 BB 00 DD CC 80 B3 : 00 00 FF FF 00 02 00 01 ................
    000020: 01 00 F0 18 FF 7F 7F FF : FF 7F 7F FF FF 7F 7F FF ................etc., etc.
    However, when I read this string into LabVIEW 6i, I only get the following:
    01E0 BB
    Which is the data before the first NULL or 00 information. I found a "Remove Unprintable Chars.vi" but it
    just sees more data before the above string, nothing after, as seen here: 5C30 31E0 BB.
    Anybody have any suggestions for how to get the rest of the string? Is there something I can do to further reformat my dll? The dll I'm using is already a wrapper around another dll so I have some flexibility, but the bottom line is, the data I want is in the format of an unsigned char *.

    Excellent advice, this mostly works so I have some further questions:
    I am just reading network traffic off my ethernet card right now, but here is what I get using my C program to test:
    000000: 01 E0 BB 00 00 15 00 E0 : BB 00 DD CC 88 68 48 41 .............hHA
    000010: 00 E0 BB 00 DD CC 80 B3 : 00 00 FF FF 00 02 00 01 ................
    000020: 01 00 38 3C FF 7F 7F 7F : 7F 7F 7F FF FF 7F 7F FF ..8<............
    000030: FF 7F 7F FF FF 7F 7F FF : 7F 7F 7F FF FF FF FF FE ................
    000040: FE FF FF FF FF 7F 7F 7F : 7F 7E 7E 7F 7F 7E 7E FF .........~~..~~.
    000050: 7F 7F 7F 7F FF 7F 7F 7F : 7F 7F 7F FF FF 7F 7E 7F ..............~.
    000060: 7F 7F 7E 7F 7F 7E 7F FF : FF 7F FF FF FE FF FF FE ..~..~..........
    000070: FF FF FF FF FF 7F 7F FF : FF 7F 7F FF FF FF FF FF ................
    000080: FF 7F 7F FF FF 7F 7F FF : FF 7F 7F FF FF 7F 7F FF ................
    000090: FF 7F 7F 7F FF 7F 7F 7F : 7F 7F 7F FF FF 7F 7F FF ................
    0000A0: FF 7F 7F 7F 7F 7E 7E 7F : 7F 7F FF FF FF FF FF FF .....~~.........
    0000B0: FF FF 7F FF FF 7F 7F FF : 7F 7F 7F FF FF 7E 7F FF .............~..
    0000C0: FF FF 7F FF FF 7F 7F FF : 7F 7F 7F FF FF 7F 7F FF ................
    0000D0: FF 7F 7F FF FF 7F 7F 7F : 7F 7F 7F FF FF FF FF FE ................
    0000E0: FE FF FF FF 00 01 : ................
    And here is what I get using LabVIEW to call the dll:
    0015 00E0 BB00 DDCC 8868 4841 00E0 BB00 DDCC 80B3 0000 FFFF 0002 0001 0100 9600 7F7F 7F7E 7F7F 7F7F 7F7F 7F7F 7F7F 7F00 B405 4300 3300 0000 0000 0000 01E0 BB00 0015 00E0 BB00 DDCC 8868 4841 00E0 BB00 DDCC 80B3 0000 FFFF 0002 0001 0100 9600 7F7F 7F7E 7F7F 7F7F 7F7F 7F7F 7F7F 7F00 F405 1B04 0C04 0000 0000 0000 8000 0000 0000 0000 0800 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000
    The first thing I notice is that the first 4 bytes are chopped off, and after about 50 bytes, the data is corrupted until the sequence starts to repeat, but this time it starts with the missing 4 bytes and still corrupts after about 55 bytes.
    I am expecting the data in LabVIEW to look very similar to the C data, because the network packets I am grabbing are pretty consistent; only a couple of bytes vary between them, not the number I am seeing in LabVIEW.
    Another side effect I'm seeing is that I can only run my LabVIEW code once; if I try running it again it crashes with failures such as:
    memory could not be "read"
    For reference, I am opening and closing the network adapter inside the read function of my DLL, but the pointer seems like it should be intact...
    Attachments:
    zListAdapters.vi ‏30 KB
    listAdapters.dll ‏201 KB
    Reading.dll ‏213 KB

  • No warning when assigning to an unsigned char from unsigned int

    Is this an error in VC++, or am I missing something? The following code should give two warnings for assigning to a smaller type, but only gives one (compiled as 32-bit):
    unsigned char c;
    unsigned int i = 24;   // 32 bit unsigned integer
    unsigned long L = 25;  // Also a 32 bit unsigned integer
    c = L; // Warning C4244
    c = i; // no warning
    This happens in Visual Studio 2005, 2010, 2012, 2013 and 2015, at least.

    Is this an error in VC++, or am I missing something? The following code should give two warnings for assigning to a smaller type, but only gives one (compiled as 32-bit):
    unsigned char c;
    unsigned int i = 24;   // 32 bit unsigned integer
    c = i; // no warning
    This happens in Visual Studio 2005, 2010, 2012, 2013 and 2015, at least.
    Have you tried it with Warning Level 4?
    In VC++ 2008 Express and W4 I get:
    Warning 2 warning C4244: '=' : conversion from 'unsigned int' to 'unsigned char', possible loss of data
    - Wayne

  • Write unsigned char and unsigned int into a binary file

    Hi, I need to write some data in a specific file format (.ov2) for other applications to read.
    Here is the layout of the data:
    First byte: an unsigned char value, always 2 ("simple POI entry")
    Next 4 bytes: an unsigned int value
    Next 4 bytes: a signed integer value
    Next 4 bytes: a signed integer value
    Then: a zero-terminated ASCII string
    Here is my code:
    String name = "name";
    byte type = 2;
    int size = name.length() + 13 + 1; // 1+4+4+4, plus a 0 at the end
    int a = 1;
    int b = 2;
    ds.writeByte(type);  // 1 byte, needs to be saved as an unsigned char
    ds.writeInt(size);   // 4 bytes, needs to be an unsigned int
    ds.writeInt(ilont);  // 4 bytes, signed int
    ds.writeInt(ilatt);  // 4 bytes, signed int
    // write the zero-terminated ascii string
    for (int n = 0; n < name.length(); n++) {
        ds.writeByte(name.charAt(n));
    }
    ds.writeByte(0);
    This code does not give the correct result, and I think I must do some signed-to-unsigned conversion, but I don't know how. Any help?
    Thanks for your attention.

    You don't have to do anything special at all. What's in an int or a byte is what you, as a programmer, say it is. Java treats ints as signed but, in fact, with 2's complement arithmetic most operations are exactly the same. Load and store, add and subtract: it makes no difference whether a variable is signed or unsigned. It's only when you convert the number, for example to a wider type or to a string representation, that you need to know the difference.
    When you read an unsigned byte and you want to widen it to an integer, you have to clip the top bytes of the integer, e.g.
    int len = input.get() & 0xff;
    This is because Java sees a byte as signed and will extend the sign bit into the top bytes of the int; "& 0xff" forces them to zero.
    Writing it, you don't have to do anything special at all; the value gets clipped automatically. The above is probably the only concession you'll ever need to make to unsigned binary. (You'd need to do the same thing from unsigned int to long, but that almost never comes up.)
    When I read ov2 files I used a mapped byte buffer, because it allows you to set either big-endian or little-endian and I wasn't sure which ov2 files used. (Sorry, I've forgotten).

  • "unsigned char** pImgData" How to call in LabVIEW

    Hi,
    I am calling a DLL in LabVIEW 8.5 using the "Call Library Function Node".
    I am able to access all the other functions and get exact results.
    But LabVIEW crashes on the function below.
    void SFCCU_GET_IMG_RAW_DATA(unsigned long *length,unsigned char** pImgData,unsigned int iCamObj)
    According to the documentation provided with the DLL, pImgData is a 1D array.
    I tried the same DLL in VC++ and am able to obtain the result of the above function properly.
    I attached the Call Library Function Node prototype with this mail; it seems to be correct.
    But I don't know why it is crashing. Please tell me whether or not I am calling it properly;
    if not, kindly suggest how to overcome it.
    Thanks in advance for the help,
    Please help me,
    Thanks,
    Sams.
    Attachments:
    Prototype in LabVIEW.JPG ‏42 KB

    Hi Sams,
    Can you clarify what you mean by crash? Can you describe what happens? Do you get an error message? Is there anything different in how you are calling the function in LabVIEW versus VC++?
    I look forward to your reply to help move this issue forward.
    Regards,
    Hillary E
    National Instruments

  • How to map C/C++ unsigned char[] to Java

    hi all,
    I'm using w2k OS.
    Given that C code:
    BYTE *fBuf;
    fBuf = new BYTE[256];
    Does anyone know how to pass/map the unsigned char to Java?
    regards
    elvis

    "why did you classify this as byte? how did you do that?"
    They probably guessed. It is probably a good guess.
    You can use the following to determine the size exactly.
    First determine what "BYTE" is exactly. You will have to find that in an include file somewhere. Your IDE might do this for you automatically.
    So, for example you might find the following...
    typedef unsigned char BYTE;
    So then you would know that the type is actually "unsigned char".
    Once you have this information you then look in limits.h (this is an ANSI C/C++ file so it will exist somewhere.) In it you find the "..._BIT" that corresponds to the type. For the type given above you would be looking for "CHAR_BIT" (because unsigned and signed chars are the same size.)
    On my system that would be...
    #define CHAR_BIT 8
    That tells you that there are 8 bits in the BYTE value. So now you need to find a java type that also has at least 8 bits. And the java "byte" value does.
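To illustrate the mapping described above, a small Java sketch (the sample value is arbitrary): Java's byte is fixed at 8 bits, so an 8-bit BYTE always fits; only the value range differs.

```java
public class ByteMapping {
    public static void main(String[] args) {
        // Java's byte is defined to be exactly 8 bits on every platform,
        // so an 8-bit C "BYTE" (unsigned char) always fits.
        System.out.println("bits in a Java byte: " + Byte.SIZE);

        // The only catch is the value range: unsigned char is 0..255,
        // Java byte is -128..127. The bit patterns are identical; mask
        // with & 0xFF when you need the unsigned value back.
        byte fromNative = (byte) 0xB4;  // unsigned char value 180
        System.out.println("signed view:   " + fromNative);
        System.out.println("unsigned view: " + (fromNative & 0xFF));
    }
}
```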

  • JNI equivalent of an unsigned char *

    Hi,
    I have a String in Java that needs to be passed to a predefined C library routine. The routine's signature is
    master(unsigned char *).
    How do I pass this string so that the routine can process it? I tried passing it as a jbyte array from Java; the compilation does not fail, but my routine returns an error because it is not able to read the data.
    Thanks in advance.

    I'm sorry , I'm not sure if I understand this correctly,
    if my method signature in my C file is
    JNIEXPORT jint JNICALL Java_reflexis_bac_validation_wrapper_Validate_MY_1FPI_1Master
    (JNIEnv * env, jobject methodObj, jlong instance, jbyteArray liveTemp, jbyteArray masterTemp, jobject xtraInfo){}
    where I have a jbyteArray type of variable. Now I need to pass this variable to a C method whose signature is
    int FPI_EXPORT FPI_Match( LPFPIINSTANCE lpInst,
                        USHORT wMatchingStrict,
                             UCHAR *lpLiveTemplate,
                             UCHAR *lpMaster,
                        LPFPI_MATCHINFO lpResults )
    So are you saying I have to convert the jbyteArray to an unsigned char string? How do I do that?

  • Assigning a jbyteArray to an unsigned char array

    I pass a jbyteArray in my method's parameter list. How do I assign its entire contents to an unsigned char array in the body of the method?

    unsigned char cbuf[...];
    int cbufSize=...;
    jbyteArray jbuf = ...;
    jsize len = env->GetArrayLength(jbuf);
    if( len>=cbufSize )
       len = cbufSize-1;
    if( len!=0 ) env->GetByteArrayRegion(jbuf,0,len,(jbyte*)cbuf);
    cbuf[len] = '\0';

  • [SOLVED] g++ errors saying byte array is unsigned char array

    Does anyone know why this code
    #include <iostream>
    #include <string>
    #include <iomanip>
    #include <cstdlib>
    #include "cryptopp/osrng.h"
    using namespace std;
    int main() {
        const unsigned int BLOCKSIZE = 16*8;
        byte * pcbScratch = (byte *)calloc(BLOCKSIZE, sizeof(byte)); // Same error happens with byte pcbScratch[BLOCKSIZE]
        CryptoPP::AutoSeededRandomPool rng;
        rng.GenerateBlock(pcbScratch, BLOCKSIZE);
        for(unsigned int i = 0; i < BLOCKSIZE; i++) {
            cout << *(pcbScratch+i);
        }
        return 0;
    }
    would cause this error?
    g++ TestClass.cpp
    /tmp/cc7wp4jU.o: In function `main':
    TestClass.cpp:(.text+0x202): undefined reference to `CryptoPP::RandomNumberGenerator::GenerateBlock(unsigned char*, unsigned int)'
    Last edited by rwdalpe (2011-06-15 03:22:30)

    Sure thing!
    I used to post on the forum under the username l33tunderground, but have since decided I would rather go with a less childish username.
    To be honest, I just started a project in Eclipse and attempted to include the Crypto++ libraries, which I installed through pacman. I don't know if there's anything special I need to do to properly include the library, as the Crypto++ documentation is painfully lacking in basics.
    Edit: Shameless plug to the Arch forums
    I've found in the past that Arch users are incredibly reliable and helpful, so I posted here rather than a traditional programming forum.
    Last edited by rwdalpe (2011-06-15 03:13:22)

  • Unsigned char vs. signed char

    When I attempt to assign my unsigned char array to a jbyteArray...the Set...Region method expects a signed char array??? Why, and what impact is there? My code does quite a bit of bit manipulation...

    "When I attempt to assign my unsigned char array to a jbyteArray, the Set...Region method expects a signed char array??? Why, and what impact is there?"
    Because that is what it expects.
    Same impact as if you were using a signed char array.
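To spell out the "same impact" point, a minimal Java sketch (the sample byte value is chosen for illustration): bitwise operations see the same 8 bits whether you call them signed or unsigned; only widening needs a mask.

```java
public class SignedBits {
    public static void main(String[] args) {
        // The same 8 bits, viewed as unsigned char (0xF0 = 240) in C and
        // as a signed Java byte (-16). Bitwise ops see bits, not signs.
        byte b = (byte) 0xF0;

        int low  = b & 0x0F;         // low nibble: identical either way
        int high = (b >> 4) & 0x0F;  // mask AFTER shifting: sign extension
                                     // fills the top bits, & 0x0F drops them
        System.out.println(low + " " + high);
    }
}
```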

  • Unsigned char* to jbytearray

    Could anyone please tell me how I can convert an unsigned char*, which is just some binary data, to a jbyteArray?
    The C++ program is being called through JNI, hence the jbyteArray.
    Or is there a cleaner way to capture whatever I get from the C++ code and send it as-is into my Java module, so I can send it out through a UDP socket?

    mlk, thank you. I looked at it earlier in the search results, but I am having second thoughts now. Does it have to be so complicated?
    How do I make sure that the raw data returned by my C++ code is maintained as-is by the Java module and pushed out of a UDP socket?
    What data types should I rely on? At the other end of the socket I have a client which can understand the raw data generated by the C++ code at my end.
    Any kind of suggestion will be a great help.
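A minimal, self-contained Java sketch of that idea (the payload bytes and loopback addressing are illustrative): a byte[] handed to a DatagramPacket goes out exactly as-is, with no charset conversion, so byte[] is the type to rely on end to end.

```java
import java.net.*;

public class RawUdp {
    public static void main(String[] args) throws Exception {
        // Whatever the native side produced: bytes above 0x7F included.
        byte[] payload = { (byte) 0xDE, (byte) 0xAD, 0x00,
                           (byte) 0xBE, (byte) 0xEF };

        try (DatagramSocket receiver = new DatagramSocket();
             DatagramSocket sender = new DatagramSocket()) {
            // The byte[] goes onto the wire verbatim; no charset is
            // involved, so the raw data the far-end client expects
            // is preserved, embedded NULs and all.
            sender.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getLoopbackAddress(),
                    receiver.getLocalPort()));

            byte[] buf = new byte[64];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            receiver.receive(in);
            System.out.println("received " + in.getLength() + " bytes");
        }
    }
}
```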
