JNI_CreateJavaVM fails in JDK 1.4

Dear all,
I made a simple test program that creates a Java VM from C++ code on Solaris 8 (SunOS 5.8) with JDK 1.4. The error is as below:
#./jnitest
There was an error trying to initialize the HPI library.
Please check your installation, HotSpot does not work correctly
when installed in the JDK 1.2 Solaris Production Release, or
with any JDK 1.1.x release.
Can't create Java VM
The code of the test program jnitest.cxx is as follows:
#include <jni.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <iostream.h>
int main() {
    JavaVMInitArgs vm_args;
    JavaVMOption options[4];
    JNIEnv *env;   // pointer to JNI environment
    JavaVM *vm;    // pointer to Java virtual machine
    jint res;
    options[0].optionString = "-Djava.compiler=NONE"; /* disable JIT */
    options[1].optionString = "-Djava.class.path=./"; /* user classes */
    vm_args.version = JNI_VERSION_1_4; // change it to JNI_VERSION_1_2 for JDK1.3
    vm_args.options = options;
    vm_args.nOptions = 2;
    vm_args.ignoreUnrecognized = JNI_TRUE;
    res = JNI_CreateJavaVM(&vm, (void **)&env, &vm_args);
    if (res < 0) {
        cerr << "Can't create Java VM\n";
        exit(1);
    } else {
        cerr << "Java VM initialized\n";
        vm->DestroyJavaVM();
        cerr << "Java VM destroyed\n";
    }
    return 0;
}
I compile it with the following command:
/opt/corp/inm_3rdparty/SUNWspro/bin/CC -I./ -I/opt/corp/inm_3rdparty/j2sdk_1.4.0/include -I/opt/corp/inm_3rdparty/j2sdk_1.4.0/include/solaris -L/opt/corp/inm_3rdparty/j2sdk_1.4.0/jre/lib/sparc -ljvm jnitest.cxx -o jnitest
That produced the executable jnitest. Then I set up LD_LIBRARY_PATH as below:
setenv LD_LIBRARY_PATH /opt/corp/inm_3rdparty/j2sdk_1.4.0/jre/lib/sparc
Running jnitest then produced the error message posted at the beginning.
If I switch the compile paths and LD_LIBRARY_PATH to JDK 1.3.1 and change the JNI version to JNI_VERSION_1_2 in jnitest.cxx, everything works: the VM is created properly. So is it a bug in JDK 1.4? How can I create a Java VM from C++ with JDK 1.4? I checked the JDK 1.4 JNI API and found no difference in JNI_CreateJavaVM between JDK 1.3 and JDK 1.4.
Thank you very much for the help,
Linda ---- looking forward to your valuable comments :-)

Where did you find that you are supposed to set the version to 1.4?
vm_args.version = JNI_VERSION_1_4; // change it to JNI_VERSION_1_2
The only thing I have found is to use 1.2:
vm_args.version = JNI_VERSION_1_2;
I am using JDK 1.4 and I am successfully creating a JVM, but I'm on Windows 2000. I think the highest version of JNI is 1.2... (maybe)...
good luck
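For what it's worth, the "error trying to initialize the HPI library" message usually means the VM could not load libhpi.so, which on Solaris JDKs of that era lives in a threads-model subdirectory. Assuming the standard layout holds for this install (an assumption, not verified against it), extending LD_LIBRARY_PATH may help:
setenv LD_LIBRARY_PATH /opt/corp/inm_3rdparty/j2sdk_1.4.0/jre/lib/sparc:/opt/corp/inm_3rdparty/j2sdk_1.4.0/jre/lib/sparc/native_threads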

Similar Messages

  • JNI_CreateJavaVM fails on Windows - Very urgent

    Folks,
    I am very much new to JNI on Windows 2000. For the last two days I have been trying to load the JVM into my application, but I could not succeed. Even the samples I found across the net failed. The problem is that "JNI_CreateJavaVM" returns -1.
    Using Microsoft Visual Studio 6.0 and VC++, I created a project with default settings and used the following sample code, and it fails left and right. Please let me know if I am doing something wrong.
    Sample
    JNIEnv *env = NULL;
    JavaVM *jvm = NULL;
    JavaVMInitArgs vm_args;
    JavaVMOption options[4];
    jint res = 0;
    char *classpath, *librarypath;
    classpath = (char *)calloc(1024, sizeof(char));
    librarypath = (char *)calloc(1024, sizeof(char));
    strcpy(classpath, "-Djava.class.path=e:\\jdk1.2.2;e:\\jdk1.2.2\\jre\\lib\\i18n.jar");
    strcpy(librarypath, "-Djava.library.path=.;e:\\jdk1.2.2\\jre\\bin;e:\\jdk1.2.2\\jre\\bin\\classic");
    options[0].optionString = classpath;
    options[1].optionString = librarypath;
    options[2].optionString = "-Djava.compiler=NONE";
    options[3].optionString = "-verbose:jni";
    vm_args.version = JNI_VERSION_1_2;
    vm_args.options = options;
    vm_args.nOptions = 4;
    vm_args.ignoreUnrecognized = JNI_TRUE;
    res = JNI_CreateJavaVM(&jvm, (void **)&env, &vm_args);
    I tried with other samples also, but the result is the same. JNI_CreateJavaVM returns -1.
    If possible, please respond to [email protected] also
    Thanks,
    Lokesh

    strcpy(librarypath, "-Djava.library.path=.;e:\\jdk1.2.2\\jre\\bin;e:\\jdk1.2.2\\jre\\bin\\classic");
    options[1].optionString = librarypath;
    This doesn't do what the author probably expected. This specifies the path(s) for Java to load native libraries from -- this is different from the path where you actually want to load the JVM from. That path is up to the system and where things happen to be, unless you specify it explicitly in your call to load jvm.dll in the case of Windows, and I assume you use the system call LoadLibrary() on Windows in order to do this. If you do not (and you don't use some other Win API call for this purpose) then you are depending entirely on your system's environment variables in order to create the JVM instance.
    So, it is not something that Sun has to do anything about.
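    To make that concrete, here is a minimal sketch of loading the VM explicitly on Windows (my own illustration, not from the original post; needs <windows.h> and <jni.h>, and the DLL path is just an example that would normally come from the registry or configuration):
    typedef jint (JNICALL *CreateJavaVM_t)(JavaVM **pvm, void **penv, void *args);
    HINSTANCE hJvm = LoadLibrary("e:\\jdk1.2.2\\jre\\bin\\classic\\jvm.dll"); // example path
    if (hJvm != NULL) {
        CreateJavaVM_t createJavaVM = (CreateJavaVM_t)GetProcAddress(hJvm, "JNI_CreateJavaVM");
        if (createJavaVM != NULL)
            res = createJavaVM(&jvm, (void **)&env, &vm_args); // same arguments as above
    }
    This way the choice of jvm.dll is explicit instead of being left to the system's DLL search path.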

  • JNI_CreateJavaVM failed with error code -3

    Hi All,
    I was trying the example from http://java.sun.com/docs/books/tutorial/native1.1/invoking/invo.html
    but it failed for me.
    I'm using http://java.sun.com/docs/books/tutorial/native1.1/invoking/example-1dot1/invoke.c
    I'm working on Red Hat Linux 9.0 with J2SE 1.4.2_06.
    I did:
    gcc -I/usr/java/j2sdk1.4.2_06/include \
    -I/usr/java/j2sdk1.4.2_06/include/linux \
    -L/usr/java/j2sdk1.4.2_06/jre/lib/i386 \
    -L/usr/java/j2sdk1.4.2_06/jre/lib/i386/client -ljava -ljvm -lverify invoke.c \
    -o invoke.out
    and
    export LD_LIBRARY_PATH=/usr/java/j2sdk1.4.2_06/jre/lib/i386:/usr/java/j2sdk1.4.2_06/jre/lib/i386/client
    After executing I got:
    ./invoke.out
    Can't create Java VM
    I printed the error code and it's -3.
    Please help
    Thanks in advance !
    ~Preetam

    Finally it worked!
    I wrote code specific to JVM 1.4.2:
    JavaVMInitArgs vm_args;
    JavaVMOption options[2];
    options[0].optionString = "-Djava.compiler=NONE"; /* disable JIT */
    options[1].optionString = "-Djava.class.path=."; /* user classes */
    vm_args.version = JNI_VERSION_1_2;
    vm_args.options = options;
    vm_args.nOptions = 2;
    vm_args.ignoreUnrecognized = JNI_TRUE;
    /* Note that in the Java 2 SDK, there is no longer any need to call
     * JNI_GetDefaultJavaVMInitArgs. */
    res = JNI_CreateJavaVM(&jvm, (void **)&env, &vm_args);
    Anyways, thanks!
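    For reference, the -3 matches JNI_EVERSION, which is consistent with the version field having been the culprit. The return codes defined in jni.h are:
    #define JNI_OK           0    /* success */
    #define JNI_ERR         (-1)  /* unknown error */
    #define JNI_EDETACHED   (-2)  /* thread detached from the VM */
    #define JNI_EVERSION    (-3)  /* JNI version error */
    #define JNI_ENOMEM      (-4)  /* not enough memory */
    #define JNI_EEXIST      (-5)  /* VM already created */
    #define JNI_EINVAL      (-6)  /* invalid arguments */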

  • JNI_CreateJavaVM fails in Multithreaded app.

    Hello,
    when I call JNI_CreateJavaVM in a multithreaded application, I get the error:
    "ERROR: Could not find the pthread library (2). Are you running a supported Linux
    distribution?"
    So, given the following program, everything works okay:
    ----code---
    #include "jni.h"
    #include <stdio.h>
    #include <stdlib.h>
    #include <pthread.h>
    void jnicall()
    {
        JNIEnv *env;
        JavaVM *jvm;
        jint res;
        jclass cls;
        jmethodID mid;
        jstring jstr;
        jclass stringClass;
        jobjectArray args;
        JavaVMInitArgs vm_args;
        JavaVMOption options[1];
        options[0].optionString = "-Djava.class.path=.";
        vm_args.version = 0x00010002;
        vm_args.options = options;
        vm_args.nOptions = 1;
        vm_args.ignoreUnrecognized = JNI_TRUE;
        res = JNI_CreateJavaVM(&jvm, (void**)&env, &vm_args);
        if (res < 0) {
            fprintf(stderr, "Can't create Java VM\n");
            exit(1);
        }
        cls = env->FindClass("Hello");
        if (cls == NULL) {
            goto destroy;
        }
        mid = env->GetStaticMethodID(cls, "main", "([Ljava/lang/String;)V");
        if (mid == NULL) {
            goto destroy;
        }
        jstr = env->NewStringUTF(" from C!");
        if (jstr == NULL) {
            goto destroy;
        }
        stringClass = env->FindClass("java/lang/String");
        args = env->NewObjectArray(1, stringClass, jstr);
        if (args == NULL) {
            goto destroy;
        }
        env->CallStaticVoidMethod(cls, mid, args);
    destroy:
        if (env->ExceptionOccurred()) {
            env->ExceptionDescribe();
        }
        jvm->DestroyJavaVM();
    }
    int main()
    {
        jnicall();
    }
    ---code---
    But when I replace main with something like
    ---code---
    int main()
    {
        pthread_t thread;
        if (0 != pthread_create(&thread, NULL, (void*(*)(void*))jnicall, NULL)) {
            fprintf(stderr, "Cannot create thread");
            exit(1);
        }
    }
    ---code---
    The really strange thing is that it does not matter how I use the pthread functions: even if
    I call the jnicall function directly from main but start a different thread using
    pthread_create, I get this error too:
    ---code---
    int main()
    {
        pthread_t thread;
        jnicall();
        if (0 != pthread_create(&thread, NULL, (void*(*)(void*))some_other_thread_without_jni, NULL)) {
            fprintf(stderr, "Cannot create thread");
            exit(1);
        }
    }
    ---code---
    My system is a SuSE Linux Enterprise Server 8.0 (with United Linux 1.0; kernel 2.4.19, glibc 2.2.5); the JRockit release is JRockit 8.1 SP2.
    Any ideas?
    TIA,
    Johannes

    Try checking that the environment variable LD_ASSUME_KERNEL=2.2.5 is not
    set.
    From the release notes at:
    http://edocs.bea.com/wljrockit/docs81/relnotes/relnotes.html#1041981
    Best Regards,
    Josefin Hallberg
    BEA WebLogic JRockit
    Customer Centric Engineering
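    A quick way to confirm this from the launching process itself, before calling JNI_CreateJavaVM (my own sketch, not from the release notes):
    #include <stdio.h>
    #include <stdlib.h>
    static void check_ld_assume_kernel(void)
    {
        /* JRockit's pthread detection is reported to be confused by this variable */
        const char *lak = getenv("LD_ASSUME_KERNEL");
        if (lak != NULL)
            fprintf(stderr, "warning: LD_ASSUME_KERNEL=%s is set\n", lak);
    }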
    "Johannes Hampel" <[email protected]> wrote in message
    news:[email protected]...
    >
    Hello,
    when i call JNI_CreateJavaVM in a multithreaded application, I ll getthe error:
    "ERROR: Could not find the pthread library (2). Are you running asupported Linux
    distribution?"
    So, given the following program, everyhing works okay:
    ----code---
    #include "jni.h"
    #include <stdio.h>
    #include <stdlib.h>
    #include <pthread.h>
    void jnicall()
    JNIEnv *env;
    JavaVM *jvm;
    jint res;
    jclass cls;
    jmethodID mid;
    jstring jstr;
    jclass stringClass;
    jobjectArray args;
    JavaVMInitArgs vm_args;
    JavaVMOption options[1];
    options[0].optionString = "-Djava.class.path=.";
    vm_args.version = 0x00010002;
    vm_args.options = options;
    vm_args.nOptions = 1;
    vm_args.ignoreUnrecognized = JNI_TRUE;
    res = JNI_CreateJavaVM(&jvm, (void**)&env, &vm_args);
    if (res < 0) {
    fprintf(stderr, "Can't create Java VM\n");
    exit(1);
    cls = env->FindClass("Hello");
    if (cls == NULL) {
    goto destroy;
    mid = env->GetStaticMethodID(cls, "main", "([Ljava/lang/String;)V");
    if (mid == NULL) {
    goto destroy;
    jstr = env->NewStringUTF(" from C!");
    if (jstr == NULL) {
    goto destroy;
    stringClass = env->FindClass("java/lang/String");
    args = env->NewObjectArray(1, stringClass, jstr);
    if (args == NULL) {
    goto destroy;
    env->CallStaticVoidMethod(cls, mid, args);
    destroy:
    if (env->ExceptionOccurred()) {
    env->ExceptionDescribe();
    jvm->DestroyJavaVM();
    int main()
    jnicall();
    ---code---
    But when I replace main with something like
    ---code---
    int main()
    pthread_t thread;
    if (0 != pthread_create(&thread, NULL, (void*(*)(void*))jnicall,NULL))
    fprintf(stderr, "Cannot create thread");
    exit(1);
    ---code---
    Really strange is that is does not matter, how i use pthread functions:even if
    i call the jnicall function directly from main but start a differentthread using
    pthread_create, i get this method too:
    ---code---
    int main()
    pthread_t thread;
    jnicall();
    if (0 != pthread_create(&thread, NULL,(void*(*)(void*))some_other_thread_without_jni,
    NULL))
    fprintf(stderr, "Cannot create thread");
    exit(1);
    ---code---
    My system is a Suse Linux Enterprise Server 8.0 (wth United Linux 1.0)(kernel
    2.4.19, glibc is 2.2.5); the Jrockit stuff is JRockit 8.1 SP 2
    Any ideas?
    TIA,
    Johannes

  • Oracle application server fails with jdk1.6

    I have installed Oracle Application Server 10.1.3.3 and I want to use JDK 1.6.0_xx. I configured it in the following way:
    Renamed the current JDK directory used by the Application Server installation (in our case, renamed jdk under “/product/10.1.3.1/OracleAS_1” to jdk.orig using mv jdk jdk.orig).
    Then went to /product/10.1.3.1/OracleAS_1 and executed “ln -s /usr/jdk/jdk1.6.0_13 jdk”.
    Checked the java version from /product/10.1.3.1/OracleAS_1/jdk/bin using ./java -version.
    When I start it, I get the following error in the log file:
    Unrecognized VM option 'AppendRatio=3'
    Could not create the Java virtual machine.

    To get it started, remove the AppendRatio option.
    --olaf

  • JNI_CreateJavaVM returns JNI_ENOMEM (-4) from different versions of the jre

    I have created a JNI dll, invoked from a VBA Excel add-in to execute Java code. This works fine until I introduce the -Xmx parameter as a vm option.
    Before creating the VM, I use a VirtualAlloc/Free loop to make sure the -Xmx parameter is not too large, decrementing it by 2MB each time until an allocatable amount is found (thanks to [jurberg's post|http://forums.sun.com/thread.jspa?forumID=37&threadID=5220601]; my version, slightly modified, is posted below). I then pass that value to the VM via the -Xmx option.
    This works great when I am using a JRE 1.6 installed in the "C:\Program Files\Java\jre1.6.0_xx" directory. When the same JRE version is installed in the "C:\Program Files\Java\jre6" directory, JNI_CreateJavaVM fails with JNI_ENOMEM. Calling GetLastError returns 0 ("operation completed successfully"). Using a 1.5 JRE, this also fails, returning JNI_ENOMEM, but GetLastError returns error code 6 ("the handle is invalid").
    A little about my platform:
         Windows XP Pro
         Building JNI dll using Microsoft Visual C++ 2008 and 1.5 JDK
    I have multiple JREs and JDKs installed on my system as this is a dev machine, but I have the same problem on a non-dev machine running XP Home.
    Here is a snippet of my code used to create the vm:
    // create the JNI structures
    const int numOptions = args.size();
    JavaVMInitArgs vm_args;
    JavaVMOption* options = new JavaVMOption[numOptions];
    log("Creating JVM with parameters:");
    int i = 0;
    char *nextArg;
    for (itr = args.begin(); itr != args.end(); itr++) {
        nextArg = new char[(*itr).length() + 1];
        strcpy(nextArg, (*itr).c_str());
        options[i].extraInfo = NULL;
        options[i].optionString = nextArg;
        log("\t" + string(nextArg));
        i++;
    }
    vm_args.version = CRUSH_JNI_VERSION;
    vm_args.options = options;
    vm_args.nOptions = numOptions;
    vm_args.ignoreUnrecognized = JNI_FALSE;
    // load and initialize the Java VM, and return a JNI interface pointer
    JNIEnv* env = NULL;
    err = (*createVM)(&jvm, (void**)&env, &vm_args);
    // err is -4 (JNI_ENOMEM) in the cases described above
    Does anyone have any suggestions on what is going on here and how I might make this code stable for all 1.5 and 1.6 JREs, regardless of where they are installed?
    Thanks in advance,
    Sarah
    Code to determine the max -Xmx value:
    static const DWORD NUM_BYTES_PER_MB = 1024 * 1024;
    bool canAllocate(DWORD bytes)
    {
        LPVOID lpvBase = VirtualAlloc(NULL, bytes, MEM_RESERVE, PAGE_READWRITE);
        if (lpvBase == NULL) return false;
        VirtualFree(lpvBase, 0, MEM_RELEASE);
        return true;
    }
    int getMaxHeapAvailable(int permGenMB, int maxHeapMB)
    {
        DWORD originalMaxHeapBytes = 0;
        DWORD maxHeapBytes = 0;
        int numMemChunks = 0;
        SYSTEM_INFO sSysInfo;
        DWORD maxPermBytes = permGenMB * NUM_BYTES_PER_MB; // Perm space is in addition to the heap size
        DWORD numBytesNeeded = 0;
        GetSystemInfo(&sSysInfo);
        // jvm aligns as follows:
        // quoted from size_t GenCollectorPolicy::compute_max_alignment() of jdk 7 hotspot code:
        //   The card marking array and the offset arrays for old generations are
        //   committed in os pages as well. Make sure they are entirely full (to
        //   avoid partial page problems), e.g. if 512 bytes heap corresponds to 1
        //   byte entry and the os page size is 4096, the maximum heap size should
        //   be 512*4096 = 2MB aligned.
        // card_size computation from CardTableModRefBS::SomePublicConstants of jdk 7 hotspot code
        int card_shift = 9;
        int card_size = 1 << card_shift;
        DWORD alignmentBytes = sSysInfo.dwPageSize * card_size;
        maxHeapBytes = maxHeapMB * NUM_BYTES_PER_MB;
        // make it fit in the alignment structure
        maxHeapBytes = maxHeapBytes + (maxHeapBytes % alignmentBytes);
        numMemChunks = maxHeapBytes / alignmentBytes;
        originalMaxHeapBytes = maxHeapBytes;
        // loop and decrement requested amount by one chunk
        // until the available amount is found
        numBytesNeeded = maxHeapBytes + maxPermBytes;
        while (!canAllocate(numBytesNeeded) && numMemChunks > 0) {
            numMemChunks--;
            maxHeapBytes = numMemChunks * alignmentBytes;
            numBytesNeeded = maxHeapBytes + maxPermBytes;
        }
        if (numMemChunks == 0) return 0;
        // we can allocate the requested size, return it now
        if (maxHeapBytes == originalMaxHeapBytes) return maxHeapMB;
        // calculate the new MaxHeapSize in megabytes
        return maxHeapBytes / NUM_BYTES_PER_MB;
    }
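    For context, the value returned by getMaxHeapAvailable() is what gets formatted into the -Xmx option string; a sketch, with the perm gen and requested sizes as placeholders:
    char heapOption[32];
    int heapMB = getMaxHeapAvailable(64, 512); /* 64MB perm gen, ask for up to 512MB heap */
    sprintf(heapOption, "-Xmx%dm", heapMB);
    /* heapOption is then assigned to one of the JavaVMOption slots */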

    I have a similar, but I think much simpler, problem. Namely, I get ENOMEMs when, as far as I can tell, there's plenty of memory available. It seems to have something to do with how Windows is configured, although I've never been able to determine what it could be.
    FWIW, in my case, I found that if I loaded my JNI dll into a console process, the max heap requested was always allocated. But when loading into Excel, the same amount would be too much. This was partly due to the fact that Excel has its own memory management, limiting the amount of memory workbooks can use. Also, it could be due to the VM not being able to reserve a contiguous chunk of memory for the max heap space.
    "Why (and how) separate the permanent generation space from the rest of the max heap? It seems you'll fail if you can't get that much space (which is the why), but how did you determine what it is?" The VM uses the perm gen space plus the requested max heap space when attempting the VirtualAlloc call to verify that it can allocate the specified amount. The default perm gen is 64MB, but that can be changed via the -XX:MaxPermSize vm parameter, so I allow for any requested value.
    "What's CRUSH_JNI_VERSION? It's not in any .h file I have." That's just my own constant, defined to be either JNI_VERSION_1_4 or JNI_VERSION_1_6.
    "Why are you messing with the bootclasspath? (I suspect you're adding something to it. Generally the VM can find its own damn classpath.)" Yep, I'm adding the 2.1 JAXB jar to the bootclasspath because earlier 1.6 distributions included JAXB 2.0 and I needed 2.1.
    -sarah

  • Hashtable - jdk1.2 vs jdk1.3

    hash is a Hashtable:
    for (Enumeration e = hash.keys(); e.hasMoreElements();) {
        System.out.println(e.nextElement());
    }
    This program prints the elements in a different order
    under jdk1.2 and jdk1.3. This means the hashtable
    implementation is different under these two versions.
    Am I correct?
    I know I am not supposed to rely on this order, but
    unfortunately a significantly large part of the code
    uses this order, and my transition to jdk1.3 fails.
    And jdk1.2 doesn't work on RH7.1 ;-(
    Is there something I can do or am I totally out of luck?
    thanks
    Santosh

    I believe I had a mental lapse when I wrote my last post. The problem has to do with the initial table capacity. In JDK 1.2 the default capacity is 101. In JDK 1.3 the default capacity is 11.
    The hashcodes for the strings "a", "b", and "c" are 97, 98, and 99 respectively. In JDK 1.2 these were simply placed in the table array at the index with the same value as the hashcode. Enumeration of the keys results in iterating over all non-null elements in the table array starting with the last array element. This results in an ordering of the elements by index which produces c b a.
    In JDK 1.3 the hashcodes for the strings "a", "b", and "c" are the same as in JDK 1.2 but they are placed at different indices in the table array. In this case "a" is at index 9, "b" is at index 10, and "c" is at index 0. Enumeration of the keys still results in iteration over all non-null elements in the table array starting with the last element. However due to the different element positions the output is b a c.
    This trivial case can be solved by simply creating a Hashtable in JDK 1.3 with an initial capacity of 101. Try running the code below using JDK 1.2 and JDK 1.3. You will find that the Hashtable with an initial capacity of 101 produces the same results in the two different JVMs.
    Hashtable htab1 = new Hashtable();
    htab1.put(new String("a"), new String("element1"));
    htab1.put(new String("b"), new String("element2"));
    htab1.put(new String("c"), new String("element3"));
    System.out.println("Hashtable 1 with default capacity");
    for (java.util.Enumeration e = htab1.keys(); e.hasMoreElements();) {
       String key = (String) e.nextElement();
       System.out.println("hash:" + key.hashCode() + " key:" + key);
    }
    Hashtable htab2 = new Hashtable(101);
    htab2.put(new String("a"), new String("element1"));
    htab2.put(new String("b"), new String("element2"));
    htab2.put(new String("c"), new String("element3"));
    System.out.println("Hashtable 2 with capacity of 101");
    for (java.util.Enumeration e = htab2.keys(); e.hasMoreElements();) {
       String key = (String) e.nextElement();
       System.out.println("hash:" + key.hashCode() + " key:" + key);
    }
    To start solving your problem you will need to determine if there are differences in the way elements are placed in the table array and in the way the table is rehashed.

  • A failure of verifying a DSA signature in JDK1.4.

    Hi,
    I have a problem of the interoperability among JDK1.3 and JDK1.4.
    If someone knows about this, please let me know whether JDK1.4 is right or JDK1.3 is right.
    I see some signatures that fail to verify in 1.4.2_01 but succeed in 1.3.1-b24.
    Those are all DSA signatures. For example,
    merlin-xmldsig-twenty-three/signature-x509-crt-crl.xml
    which you can get from
    http://www.w3.org/Signature/2001/04/05-xmldsig-interop.html
    I can verify this signature successfully in JDK1.3, but it fails in JDK1.4.
    The exception in JDK1.4 is as follows :
    java.security.SignatureException: invalid signature: out of range values
         at sun.security.provider.DSA.engineVerify(DSA.java:228)
         at sun.security.provider.DSA.engineVerify(DSA.java:182)
         at java.security.Signature.verify(Signature.java:464)
    Thanks.

    Hi,
    I'm seeing the same error with DSA signatures under 1.4. Did you find out what is going on?
    Thanks.
    java.security.SignatureException: invalid signature: out of range values
         at sun.security.provider.DSA.engineVerify(DSA.java:228)
         at sun.security.provider.DSA.engineVerify(DSA.java:182)
         at java.security.Signature.verify(Signature.java:464)
         at cryptix.openpgp.algorithm.PGPDSA.verifySignature(PGPDSA.java:555)

  • JNI Version Wrong? Error Code = JNI_EVERSION

    Hi people,
    I have a jar file and I want to call the jar-file methods from my C++ project. Because I have never used JNI, I would like to create a simple example to see how it runs:
    [JAVA]public class HelloFunc {
         static {
              System.mapLibraryName("HelloFunc");
         }
         public static void main(String[] args) {
              if (args.length > 0) System.out.println("Hello World " + args[0]);
              else System.out.println("Hello World /wo Args!");
         }
    }[JAVA]
    I exported the Java program above with Eclipse as an executable jar file, which is used in the project below. In this case I want to call the main method from my C++ project:
    [CPP]#include <jni.h>
    #include <stdio.h>
    #include "JNI_IF.h"
    // C:\Program Files\Java\jdk1.6.0_26\lib
    #pragma comment (lib,"C:\\Program Files\\Java\\jdk1.6.0_26\\lib\\jvm.lib")
    JNI_IF::JNI_IF(char* JVMOptionString)
    {
         JavaVMInitArgs vm_args;
         JavaVMOption options[1];
         options[0].optionString = JVMOptionString;
         vm_args.options = options;
         vm_args.nOptions = 1;
         vm_args.version = JNI_VERSION_1_6;
         vm_args.ignoreUnrecognized = JNI_FALSE;
    }
    void JNI_IF::setClassName(char* className)
    {
         result = JNI_CreateJavaVM( &jvm,(void **)&env, &vm_args);
         switch(result)
         {
              case JNI_ERR:
                   printf("Unknown Error invoking the JVM\n");
                   return;
              case JNI_EDETACHED:
                   printf("Thread detached from the JVM\n");
                   return;
              case JNI_EVERSION:
                   printf("JNI version error\n");
                   return;
              case JNI_ENOMEM:
                   printf("Not enough memory for the JVM\n");
                   return;
              case JNI_EEXIST:
                   printf("JVM already created\n");
                   return;
              case JNI_EINVAL:
                   printf("Invalid arguments\n");
                   return;
              case JNI_OK:
                   printf("JVM created --> Ready ...\n");
         }
         cls = env->FindClass(className);
         if( cls == NULL ) {
              printf("can't find class %s\n", className);
              return;
         }
         env->ExceptionClear();
    }
    void JNI_IF::setMethodName(char* methodName)
    {
         mid = env->GetStaticMethodID(cls, methodName, "([Ljava/lang/String;)V");
         if (mid != NULL){
              env->CallStaticVoidMethod(cls,mid,"70");
         }
    }
    int main()
    {
         JNI_IF JNIInterfaceObj("-Djava.class.path=M:\\Eigene Dateien\\Visual Studio 2008\\Projects\\JNI_IF\\JNI_IF;M:\\Eigene Dateien\\Visual Studio 2008\\Projects\\JNI_IF\\JNI_IF\\hello.jar");
         JNIInterfaceObj.setClassName("HelloFunc");
         JNIInterfaceObj.setMethodName("main");
         return 0;
    }[CPP]
    And my corresponding header:
    [HEADER]#ifndef JNI_IF_H
    #define JNI_IF_H
    #include <jni.h>
    class JNI_IF {
    public:
         JNI_IF(char* JVMOptionString);
         void setMethodName(char* methodName);
         void setClassName(char* className);
    private:
         JavaVMInitArgs vm_args;
         JavaVM *jvm;
         JNIEnv *env;
         //JNINativeInterface *env;
         long result;
         jmethodID mid;
         jfieldID fid;
         jobject jobj;
         jclass cls;
         int asize;
         char  JVMOptionString[20];
         char  className[20];
         char  methodName[20];
         JavaVMOption options[1];
    };
    #endif[HEADER]
    According to my debugger, I always get a JNI_EVERSION error --> so JNI_CreateJavaVM() fails. How can that be? I tried every version: JNI_VERSION_1_1, JNI_VERSION_1_2, JNI_VERSION_1_4 and JNI_VERSION_1_6. All give the same error.
    Windows XP Prof. SP 3, Visual Studio 2008, JDK 1.6.0_26. I linked the lib statically as you can see, and jvm.dll is in my project's debug folder.
    I can compile and start the program, but as I said, I get that error code.
    I appreciate any hints and solutions; thank you for your patience.
    Cheers
    Edited by: 872444 on 14.07.2011 01:03
    Edited by: EJP on 8/09/2011 18:38: code tags

    There are several major problems with this code.
    JNI_IF::JNI_IF(char* JVMOptionString)
    {
         JavaVMInitArgs vm_args;
         JavaVMOption options[1];
         options[0].optionString = JVMOptionString;
         vm_args.options = options;
         vm_args.nOptions = 1;
         vm_args.version = JNI_VERSION_1_6;
         vm_args.ignoreUnrecognized = JNI_FALSE;
    }
    So I assume vm_args is a class member. However you are setting vm_args.options to the address of a variable that will disappear as soon as this constructor exits. This is almost certainly the cause of your immediate problem, and if it isn't it will certainly cause problems further down the track. I would move most of the code from setClassName() into here to address that.
    void JNI_IF::setClassName(char* className)
    {
         result = JNI_CreateJavaVM( &jvm,(void **)&env, &vm_args);
         switch(result)
         {
              case JNI_ERR:
                   printf("Unknown Error invoking the JVM\n");
                   return;
              case JNI_EDETACHED:
                   printf("Thread detached from the JVM\n");
                   return;
              case JNI_EVERSION:
                   printf("JNI version error\n");
                   return;
              case JNI_ENOMEM:
                   printf("Not enough memory for the JVM\n");
                   return;
              case JNI_EEXIST:
                   printf("JVM already created\n");
                   return;
              case JNI_EINVAL:
                   printf("Invalid arguments\n");
                   return;
              case JNI_OK:
                   printf("JVM created --> Ready ...\n");
         }
    All that should be in the constructor.
         cls = env->FindClass(className);
    So it appears that 'cls' is also a member.
         if( cls == NULL ) {
              printf("can't find class %s\n", className);
              return;
         }
    If 'cls' was null there was an exception, and you should print it rather than making up your own message. You are losing information.
         env->ExceptionClear();
    Conversely, if 'cls' isn't null there was no exception, so there is nothing to clear here.
    void JNI_IF::setMethodName(char* methodName)
    There's not much point in making this a separate method. I would have one setEntryPoint() method that takes the class name and the method name. However, as the method name has to be 'main', there's not much point in making it a parameter.
         mid = env->GetStaticMethodID(cls, methodName, "([Ljava/lang/String;)V");
    And here, for example, you are using a methodName parameter but a hardwired argument list. So again there wasn't much point in the methodName parameter.
         if (mid != NULL){
              env->CallStaticVoidMethod(cls,mid,"70");
         }
    ... else you should print something ... an exception. Otherwise you are again losing information. The program does nothing and silently exits. Not very informative. And this proves that the method is poorly named. You aren't just setting a method name, you are +calling+ it. I would say the method should be called run(), with no arguments.
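    Putting those points together, a minimal corrected shape for the constructor might look like this (my own sketch, using the vm_args/options members already declared in JNI_IF.h so nothing dangles):
    JNI_IF::JNI_IF(char* JVMOptionString)
    {
         // use the members, not locals that vanish when the constructor returns
         options[0].optionString = JVMOptionString;
         vm_args.version = JNI_VERSION_1_6;
         vm_args.options = options;
         vm_args.nOptions = 1;
         vm_args.ignoreUnrecognized = JNI_FALSE;
         result = JNI_CreateJavaVM(&jvm, (void **)&env, &vm_args);
         if (result != JNI_OK)
              printf("JNI_CreateJavaVM failed with code %ld\n", result);
    }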

  • A question about compatiblilty of JDBC

    Hi, there,
    I have a question about JDBC 3.0 in JDK1.4.1.
    We build a .jar using JDK1.4.1 and this .jar file
    provides JDBC 3.0 interface.
    I then tried to call the interface under JDK1.3.1
    but it keeps asking me about the getParameterMetaData()
    in PreparedStatement.
    (The error message is like this:
    Exception in thread "main" java.lang.NoClassDefFoundError: java/sql/ParameterMetaData
         at dbmaker.sql.JdbcOdbcConnection.prepareStatement(JdbcOdbcConnection.java:257))
    I did implement the getParameterMetaData() method,
    but I didn't call this method in my testing program.
    It's quite strange, since when JDK1.2 was released,
    we implemented the getBlob() method in the
    ResultSet class and there was no problem when we
    used the JDK1.1 runtime to call that ResultSet object.
    Could anyone tell me why? Is this caused by a
    compatibility problem between JDK1.4 and the
    older versions of JDK?

    Hi, there,
    Let me outline the condition by examples so that maybe you can help me. PLEASE! Could somebody tell us what's wrong?
    - CASE 1:
    In JDK1.2, we implemented a new method getBlob which returns the type of Blob (this is done according to JDBC 2.0 specification.) in ResultSet class. We can use the .jar compiled by jdk1.2 in jdk1.1 environment.
    e.g. We make a xyz.jar by jdk1.2 and the xyz.jar includes new method getBlob in ReaultSet class. Then, one java application including ResultSet class CAN run under jdk1.1 with xyz.jar, if the method ResultSet.getBlob() is not called.
    - CASE 2:
    BUT, the following case, which is set up the same way as the one above, fails in a jdk1.3 environment.
    In JDK 1.4, we implemented a new method getParameterMetaData() which returns the type of ParameterMetaData. (BTW, this is done according to JDBC 3.0 specification.) in PreparedStatement class. We CANNOT use the .jar compiled by jdk1.4 in jdk1.3 environment.
    e.g. We make a ttt.jar by jdk1.4 and ttt.jar includes the new method getParameterMetaData() in PreparedStatement class. Then, one java application including PreparedStatement class CANNOT be run under jdk1.3 with ttt.jar, even when the method PreparedStatement.getParameterMetaData() is not called.

  • Invoking java methods from C/C++ on the machine with different JREs

    I implemented a Windows NT service instantiating a JVM and invoking several Java methods. Everything works, but I have an issue with running the service on a machine where multiple different versions of the JRE have been installed. The service calls Java methods that require JRE 1.3 or later, so I wrote code that sets the system PATH from within the service, based on configuration stored in an external file. The problem is that the service requires jvm.dll to be in the PATH prior to launching it, since this library is loaded through implicit linking. When I put jvm.dll in the same path as the service binary, I can launch the service, but JNI_CreateJavaVM fails and returns -1. This happens even if JRE 1.3 is in the system PATH prior to launching the service.
    Everything works if the system PATH contains references to JRE 1.3 and jvm.dll is removed from the service's directory.
    I am looking for advice on the proper way to invoke Java methods from a C/C++ executable in an environment with different versions of the JRE.
    Thanks, Kris.

    Here's a way I have done what you are asking about:
    What you want to do is make all of your linking happen at runtime, rather than at compile time. This way, you can edit the PATH variable before the jvm.dll gets loaded.
    Following is some code that I used to handle a dll of my own in this manner. You can decide if you want to write a "wrapper" dll, or if you find it simpler to approach the jvm.dll in this way.
    // Define pointer type for DLL entry point.
    typedef void (JREPDLL_API *EXECUTEREQUEST)(char*, Arguments&);
    // Set up path, load dll, and pass everything off to it.
    HINSTANCE javaServer = LoadLibrary("jrepdll.dll");
    if (javaServer != NULL) {
        EXECUTEREQUEST executeRequest = (EXECUTEREQUEST)GetProcAddress(javaServer, "ExecuteRequest");
        if (executeRequest != NULL) {
            if (argc == 1)
                // Execute the request.
                executeRequest("-run", args);
            else
                // Execute the request.
                executeRequest("-console", args);
        }
    }
    Here's some code for how to edit the PATH:
    // Edit the PATH environment variable so that we use our private java.
    char *appendPt;
    char *newPath;
    char *path;
    char tmp[_MAX_PATH];
    // Get the current PATH variable setting.
    path = getenv("PATH");
    // Allocate space for an edited path setting.
    if (path != NULL)
        newPath = (char*)malloc((_MAX_PATH * 2) + strlen(path));
    else
        newPath = (char*)malloc(_MAX_PATH * 2);
    // Get upper part of path to desired directories.
    strcpy(tmp, filepath);
    appendPt = strstr(tmp, "dbin\\jreplicator");
    appendPt[0] = '\0';
    sprintf(newPath, "PATH=%sjava\\jre1.2.2\\bin;%sjava\\jre1.2.2\\bin\\classic", tmp, tmp);
    // Append the value of the existing PATH environment variable, if any.
    if (path != NULL) {
        strcat(newPath, ";");
        strcat(newPath, path);
    }
    // Set new PATH value.
    _putenv(newPath);
    free(newPath);
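    The ordering is the whole trick: edit PATH first, then load the DLL, so the loader resolves jvm.dll against the new PATH. Restated as a two-line sketch using the names above:
    _putenv(newPath);                                  /* 1. point PATH at the private JRE */
    HINSTANCE javaServer = LoadLibrary("jrepdll.dll"); /* 2. only now pull in the VM       */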

  • Problem running sdk 1.5.0 on solaris 5.6 sparc

    After installing the SDK, I try to run java -version and it gives me this error:
    dl failure on line 704
    Error: failed /usr/jdk1.5.0/jre/lib/sparc/client/libjvm.so, because ld.so.1: ./java: fatal: relocation error: file /usr/jdk1.5.0/jre/lib/sparc/client/libjvm.so: symbol getloadavg: referenced symbol not found
    I have also downloaded and applied the recommended patches from http://sunsolve.sun.com/pub-cgi/show.pl?target=patches/J2SE but still no joy.
    Please help

    Yes, something happened between 1.4.2_08 and 1.4.2_09 that prevents it from running on Solaris 2.6. My guess is the VM added a call to getloadavg(), which is not in the 2.6 objects. This really stinks, since now I cannot remediate the security issues with Java past _08. I would REALLY like this call removed. It doesn't seem to be a real necessity.

  • Memory Allocation problem when using JNI

    For a project we need to interface LabWindows/CVI and TestStand with an application written in Java, so we are using JNI. The code uses JNI_CreateJavaVM to start a JVM that runs the Java interface code. The code did run for some time, but now (without any obvious change on either the CVI side or the Java side) JNI_CreateJavaVM fails with a -4 error code, which means that the start of the JVM failed due to a memory allocation failure. A first investigation showed that even when Windows Task Manager shows about 600M of free physical memory, CVI can allocate only about 250M as a single block at the time we call JNI_CreateJavaVM. That might be a little too little, as we need to pass -Xmx192m to the JVM to run our code. Unfortunately, just increasing the physical memory of the machine from 1.5G to 2G doesn't change anything: the free memory shown by Task Manager increases, but the allocatable memory block size does not. Are there any tricks to optimize CVI/TestStand for this use case? Or are there known problems with JNI?

    Hi,
    have you tried other functions to allocate memory?
    The -Xmx option only sets the maximum heap size. You can also try -Xms, which sets the initial Java heap size.
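    For example (a sketch; the sizes are placeholders to tune against the largest block CVI can actually allocate), both limits can be handed to JNI_CreateJavaVM as options:
    JavaVMOption options[2];
    options[0].optionString = "-Xms16m";  /* initial heap */
    options[1].optionString = "-Xmx128m"; /* maximum heap, kept below the allocatable block size */
    vm_args.options = options;
    vm_args.nOptions = 2;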

  • Portal Search Robot API

    I'm having some problems with the Robot API with JES 2005Q1 on Solaris 10. I'm not sure whether this happens under other JES versions or not.
    First, I tried to launch Java from a robot plugin using JNI, and it was failing due to missing libraries, so I renamed 'robot' to robot2 and created a shell script 'robot' that added the Java libs to the LD_LIBRARY_PATH. Then JNI_CreateJavaVM failed to create the VM.
    So I attempted to log the return code from the function call, and the API would not log it. So I created a 'hello world' example (below), and I verified that 'HelloWorld' gets put into the ReadACL, which means that the log statement should have executed; however, the log statement didn't go into the log file. The API documentation refers to a log level located in process.conf, and I am not able to find that file. So I bumped 'loglevel' to 99 in robot.conf, as it is the only log-level setting I could find outside of the Access Manager Console. Still nothing came out, but a lot of other information did.
    NSAPI_PUBLIC int someFx(libcs_pblock *pb, CSFilter *f, CSResource *r)
    {
        char *attribute = "ReadACL";
        char *data = "HelloWorld";
        int dataLen = strlen(data);
        SOIF_Insert(r->rd, attribute, data, dataLen);
        cslog_error(1, 1, ("fn=someFx: HELLO WORLD\n"));
        return REQ_PROCEED;
    }
    So then I attempted to use the fopen, fprintf, and fflush functions to implement my own logging; my custom log files were created, but fprintf didn't seem to write to them??
    Any ideas?
    My questions are,
    Is it possible to launch a JVM from a robot plugin? If so, what is the proper way to take care of the shared libraries and create the JVM?
    What's up with the logging?
    Thanks.

    Presumably you'd like those documents to be accessible by users as well, so they should be put on a web server, FTP, or NFS share. You can just add the URLs of those documents, or the directory they are in, to the robot system as starting points and let the robot run to collect them.

  • Call Java methods from vi

    I have existing software written in Java that I would like to make use of inside a vi. I have written a shared library in C++ that creates a JVM using JNI_CreateJavaVM and then calls a few methods from a specific class. When I use this DLL from a C++ application, things work fine. However, when I try calling this DLL from LabVIEW, JNI_CreateJavaVM fails with a -4 return code. When looking around for help I often see http://digital.ni.com/public.nsf/allkb/BEE812007BA2A9B486256BC80068A49A?OpenDocument referenced, and this has been the approach I have taken. I have never come upon any sample code that shows this behavior working, however. If someone could point me to some sample code or suggest a reason why JNI_CreateJavaVM isn't working I would be very grateful. Thanks.

    I ran into the same problem. I am trying to call some code written in Java. I am not a Java developer, but a colleague has written a C DLL which I call from LabVIEW to invoke JNI and execute the Java classes. We ran into the exact same issue. My colleague changed the amount of memory the virtual machine was requesting. Here is a sample of the code from my colleague:
    JavaVMOption options[4];
    jint res;
    options[0].optionString = "-Djava.class.path=C:\\Shared\\C-JNI-Bridge-Prototype\\bin"; // Path to the java class files.
    options[1].optionString = "-Xms16m"; // initial heap
    options[2].optionString = "-Xmx16m"; // maximum heap
    options[3].optionString = "-XX:+HeapDumpOnOutOfMemoryError"; // the dump directory would go in a separate -XX:HeapDumpPath=... option
    vm_args.version = JNI_VERSION_1_6; // JDK version. This indicates version 1.6
    vm_args.nOptions = 4;
    vm_args.options = options;
    I personally don't really understand this, but I believe this is using virtual machine options that are not the defaults.  The default options were probably requesting more memory than LabVIEW could allocate for the virtual machine.  Keep in mind that these limits will likely be different for your application.
