Using multiple DLL instances from Java

Hi All
I have an application which uses JSP, RMI, and JNI in combination. I have JSPs on a web server and my application DLL, developed in C++, on another machine (let's say the app server). So I run an RMI server on the app server and an RMI client on the web server. The RMI server tries to load the DLL for each client every time it receives a request. However, it is failing because there can't be multiple instances of a DLL in memory at the same time.
My question is how to resolve this situation.
I could use an executable program that wraps the DLL functionality, but I am not sure whether more than one instance of an .exe can be in memory at the same time.
Any suggestions?
Regards
Nikhil

Something is loading the DLL. Instead of loading it on each request, load it only once.
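A minimal sketch of that idea (class and library names are hypothetical): load the library from a static initializer, so the JVM loads the DLL exactly once no matter how many RMI requests arrive, and every request reuses the same loaded copy.

    // Sketch only - names are hypothetical.
    public class NativeBridge {
        static {
            // Runs once per JVM: the DLL is loaded a single time,
            // not once per client request.
            System.loadLibrary("myapp");   // expects myapp.dll on java.library.path
        }

        // Every RMI request calls into the already-loaded library.
        public native int process(String request);
    }

If the DLL keeps per-client state, the usual workaround is still one loaded copy, with a client or session handle passed through the native calls, rather than trying to load the DLL once per client.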

Similar Messages

  • DLL call from Java using JBuilder?

    Please - does anyone know how I make DLL calls from Java (JBuilder)?
    I have tried but did not manage to get it right.
    Thanks, Carlos

    Research JNI (Java Native Interface).

  • How Can I Use Multiple Weblogic Instances in a Single OS

    Hello Everyone,
    I have to install several different applications. A few of them need WebLogic 10.3.6 and others need 10.3.4. The OS I am using is Oracle Enterprise Linux 5.
    I am able to install two separate instances (one of 10.3.4 and one of 10.3.6) with two different users, in two different directories.
    I have installed WebLogic 10.3.6 with the user webadmin and installed Node Manager on port 5556. This is working fine.
    The main problem is this:
    I installed the second instance (10.3.4) with a different user and gave Node Manager port number 1600, but it is not starting. It throws errors, and after some errors in the terminal I can see that it reverts to port number 5556.
    What might be the issue?
    I have to install two different versions of WebLogic on a single server, but I am failing with Node Manager. What can I do to have multiple WebLogic instances with multiple versions on a single server?
    Can anyone suggest a resolution for this, please?
    Thanks in advance.

    Please do not spam these forums with multiple posts - How Can I Use Multiple Weblogic Instances in a Single OS
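    A side note on the technical issue itself (not from this thread): each WebLogic installation's Node Manager reads its ListenPort from its own nodemanager.properties, so if the second installation is still picking up a default or shared properties file it will fall back to 5556. A minimal thing to check in the second installation (the path is illustrative for a 10.3 home):

        # <second WL_HOME>/common/nodemanager/nodemanager.properties
        ListenAddress=localhost
        ListenPort=1600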

  • Calling C functions in existing DLLs from Java

    Hi guys,
    The tutorial on this site talks about creating your own DLLs and then calling them from Java. I need to call C functions in existing DLLs from Java. How do I go about doing this? Any help would be much appreciated.
    regards
    murali

    What you are interested in can be done with what's called "shared stubs", from the JNI book (http://java.sun.com/products/jdk/faq/jnifaq.html), although you don't need the book to do it (I didn't).
    The example code will call functions with any number and kind of parameters, but doing that requires some assembly language. They supply working examples for Win32 (Intel architecture) and Solaris (Sparc).
    If you can limit yourself to a single function signature (number and types of parameters), or at least a small set that you know at compile time, you can modify the example so that the assembly language part isn't needed - just straight C code.
    Then you'll have one C library that you compile plus a set of Java classes, and you can load arbitrary functions out of arbitrary dynamic libraries. In my case you don't even have to know what the libraries and functions are ahead of time; the user can set that up in a config file.
    You mentioned doing this with Delphi. One thing to watch out for is the C versus Pascal (Win32) function calling convention. A good rule of thumb: if it crashes hard, you probably picked the wrong one; try the other. :-)
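    For the restricted, single-signature variant described above, the Java half can stay very small. A sketch (all names are hypothetical), where the companion hand-written C stub resolves the requested function with LoadLibrary/GetProcAddress (or dlopen/dlsym) and forwards the argument:

        // Sketch only - class, library and method names are hypothetical.
        public class SharedStub {
            static {
                System.loadLibrary("sharedstub");   // the one hand-written C stub library
            }
            // One fixed signature; the stub looks up functionName inside libraryName
            // at run time and calls it with the given argument.
            public static native int call(String libraryName, String functionName, String argument);
        }

    With that in place, new native functions of that one signature can be reached purely from configuration, e.g. int rc = SharedStub.call("mylib", "doSomething", "hello");, without writing a fresh JNI wrapper for each one.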

  • Calling Windows DLLs from Java

    Is there any way to call a Windows DLL directly from Java, without having to create a bridge DLL in native code and call the bridge code through JNI?
    thanks,

    No.

  • Help! How to convert an instance from java.lang.Object to a particular class

    How do I convert an instance from the java.lang.Object class to a particular class which is known only at runtime?
    The Roster EJB component is made up of RosterHome, Roster and RosterBean.
    RosterHome is the home interface of the Roster EJB.
    Roster is the remote interface of the Roster EJB.
    RosterBean is the implementation of the Roster EJB.
    The following code segment invokes the Roster EJB component:
    String jndiName = "roster.RosterHome";
    javax.naming.Context initial = getInitialContext(); // getInitialContext() returns an instance of Context.
    Object objref = initial.lookup(jndiName);
    RosterHome home =
        (RosterHome) javax.rmi.PortableRemoteObject.narrow(objref, RosterHome.class);
    Roster myRoster = home.create();
    String team = "T1";
    String player = "Tom";
    myRoster.addPlayer(player, team);
    But now the home interface, the remote interface, and the JNDI name of the Roster EJB component are not known at compile time. They are only known at runtime, by reading them from an XML config file.
    Questions:
    1. How do I write code for this case? or
    2. How do I convert an instance from the Object class to a particular class which is known only at runtime?
    String jndiName = "roster.RosterHome";       // in fact, read from the XML file
    String homeClassName = "roster.RosterHome";  // in fact, read from the XML file
    String remoteClassName = "roster.Roster";    // in fact, read from the XML file
    javax.naming.Context initial = getInitialContext(); // getInitialContext() returns an instance of Context.
    Object objref = initial.lookup(jndiName);
    Object objHome = javax.rmi.PortableRemoteObject.narrow(objref, Class.forName(homeClassName));
    /* how to do next? */

    I am not sure what you are trying to do. But at some point you should know which methods to call on the remote interfaces. Maybe the method names are stored in the XML file as well or you have a set of standard method names (also consider parameters).
    However, this can be solved by reflection. Look at the java.lang.reflect package, especially java.lang.reflect.Method, and also at java.lang.Class.
    If you are doing this on the app server:
    I've seen posts where people say that reflection is not permitted in EJB, but I don't think so. Check the EJB spec.
    If you are doing this in an application: reflection is always permitted. Probably also in applets and JSP.
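    A minimal sketch of the reflective approach (the create/addPlayer calls are carried over from the example above; the config lookups are hypothetical, and exception handling is omitted):

        // Everything is resolved at run time from the XML config.
        String jndiName = config.get("jndiName");        // hypothetical config helper
        String homeClassName = config.get("homeClass");  // e.g. "roster.RosterHome"
        Class homeClass = Class.forName(homeClassName);

        javax.naming.Context initial = getInitialContext();
        Object objref = initial.lookup(jndiName);
        Object home = javax.rmi.PortableRemoteObject.narrow(objref, homeClass);

        // Call the no-argument create() on the home interface via reflection.
        java.lang.reflect.Method create = homeClass.getMethod("create", new Class[0]);
        Object remote = create.invoke(home, new Object[0]);

        // Call a business method whose name and parameter types also come from the config.
        java.lang.reflect.Method addPlayer =
            remote.getClass().getMethod("addPlayer", new Class[] { String.class, String.class });
        addPlayer.invoke(remote, new Object[] { "Tom", "T1" });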

  • Call DLL function from JAVA

    Hi!
    What's the best (and simplest) way to call a DLL function from Java?
    The DLL function (developed in C) has two input parameters (strings) and returns an integer.
    Could you help with an example?
    Thanks a lot.

    Do a google search for 'JNI tutorial' and it will turn up hundreds of examples.
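    For the exact shape described above (two string parameters in, an int back), the Java side of a JNI binding could look like the sketch below; the class, method and library names are made up for illustration. The DLL then has to export a function whose name follows the JNI naming convention for this class and method (Java_PriceLookup_lookup) and whose signature matches, much like the Arguments example further down this page.

        // Sketch only - all names are hypothetical.
        public class PriceLookup {
            static {
                System.loadLibrary("pricelookup");   // loads pricelookup.dll from java.library.path
            }
            // Two string inputs, integer result - matching the DLL function described above.
            public static native int lookup(String param1, String param2);
        }

    Calling it from Java is then just int result = PriceLookup.lookup("a", "b");.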

  • Accessing DLL files from Java

    I have a problem with Java: I'm trying to call a DLL that was written for C++ only, with no support for Java programs. Is what I'm aiming for achievable by using the rundll32.exe file?
    Note: I'm trying to use the skydll.dll file for controlling a SkyStar2 DVB card.

    You cannot directly call a DLL from Java unless the DLL exposes entry points that are compatible with the JNI calling conventions.
    For example, a Java class calling a native function like the one below:
        class Arguments {
            private native void setArgs(String[] javaArgs);

            public static void main(String[] args) {
                Arguments a = new Arguments();
                a.setArgs(args);
            }

            static {
                System.loadLibrary("MyArgs");
            }
        }
    needs a DLL entry point with the following signature:
        JNIEXPORT void JNICALL
        Java_Arguments_setArgs(JNIEnv *jenv, jobject job, jobjectArray oarr)
    Notice that the first parameter of the function is of type JNIEnv* and the second can be either a reference to the class or to the Java object instance.
    Jacques Gonzalez
    J4SOFT

  • Single SOA Suite Install with multiple oc4j instances and java processes

    We right now have 5 BPEL processes and 5 ESB processes all running under one java.exe process. We would like to separate some of them out into their own java.exe processes without having to install more %ORACLE_HOME% instances of SOA Suite. I can create an oc4j instance, but of course it doesn't have any SOA Suite components deployed to it. I tried to see what the installer would do with this new oc4j instance, but it wants to create a new %ORACLE_HOME% with an entire installation of SOA Suite.
    Is there some way to clone oc4j instances that have SOA Suite deployed to them, so that you don't need multiple %ORACLE_HOME% instances?
    ### How is this Issue Impacting Your Business ###
    We really don't want to have a lot of %ORACLE_HOME% instances to maintain. We are migrating projects over from our current integration server product and we'll have potentially dozens more BPEL and ESB projects. We definitely want to group and isolate projects so that outages of one project do not bring down others that are unrelated.
    We are currently experiencing periodic problems with one BPEL project that requires recycling, but all the other BPEL and ESB projects get recycled as well. If we could put this project into its own java process without creating another SOA Suite instance, it would be a big help.
    ANSWER
    =======
    You can create multiple domains in BPEL or create multiple systems/groups in ESB to group different projects.
    MY REPLY:
    =========
    We have been using systems/groups in ESB, but they all run under the same java.exe process. I would assume that having a separate domain in BPEL would also run in that same java.exe process.
    Right now, the one BPEL project we have a problem with will gobble up all the JDBC connections from time to time, and that requires a recycle of SOA Suite, which means all BPEL and ESB projects that run in that java.exe process get recycled as well. We're working that issue in a different ticket.
    It would be nice if the SOA Suite installation could install against a new oc4j instance and not assume it has to create a complete %ORACLE_HOME% instance. The components of SOA Suite seem to be J2EE-based components.
    Scenario: I already have an oc4j instance called oc4j_soa and a complete %ORACLE_HOME% installation of SOA Suite. I then create a new oc4j instance from Enterprise Manager. Then I would deploy the esb-dt, esb-rt, orabpel, etc. components of SOA Suite to that new oc4j instance and modify the necessary config files so that it can work with OHS and the SOA Suite databases. Is this possible?
    Does anyone have any experience with this, or do people typically install multiple complete installations of SOA Suite with multiple Oracle Homes?

    Hi,
    yes, on Metalink you get in touch with real experts....
    You have to install several application servers to get different ORACLE_HOMEs.
    For each one, you can install a BPEL PM.
    But: each BPEL PM needs its own database instance, or you have to configure them as a clustered BPEL installation... (but I do not know if this works with non-RAC DBs).

  • Multiple sql statements from java

    I am executing sql statements in MaxDB 7.6.04.12 from java using jdbc.
    I want to execute multiple statements at a time, but it seems that however I separate the statements, I get
    com.sap.dbtech.jdbc.exceptions.DatabaseException: [-3014] (at 433): Invalid end of SQL statement
    I have tried separating the statements with just a semicolon, with a semicolon and a newline, and with newline-//-newline (as works with SQL Studio), but whatever I try I get this error or another.
    Is it possible to do this, and if so, how?
    Thanks
    Chris

    Hi Lars,
    Here are the relevant bits of code.
    However, I'm not sure how helpful that will be - I'm using the Spring Framework for my JDBC calls, as it saves a lot of time and effort, and my calls are to Spring methods, which are wrappers around the base JDBC calls.
    I'll post it anyway, just in case you're familiar with Spring. I haven't looked at the Spring code, but my understanding is that it pretty much just passes the SQL through to standard JDBC calls.
    I guess my next step would be to trace through the Spring code as it executes and see if anything becomes apparent. However I'm under some time pressure and was hoping to avoid that.
    My other alternative is to create non-temporary tables, and drop them explicitly when I've finished with them.
        public Set<String> getPriceUpdatedProducts() {
            final Set<String> prods = new TreeSet<String>();
            String sql;
            int updates;
            // need single connection template so subsequent statements can access temp tables:
            JdbcTemplate jtSingle = getSingleConTemplate(jdbcTemplate);
            // inDCs is a list of dist channels for an sql 'in' statement:
            String inDCs = "";
            for (String siteId : siteConfig.getAllSiteIds()) {
                if (!"".equals(inDCs)) inDCs += ",";
                inDCs += "'" + siteConfig.getSiteIdProperty(siteId, "distChanId") + "'";
            }
            // Clear all the changed flags from the last run:
            updates = jtSingle.update("update zchangedartdc set changed = '' ");
            // Get the current data into a temp table:
            sql = "create table temp.pricechanges as " +
                  "select A304.matnr, A304.vtweg, konp.kbetr price " +
                  "  from A304 join konp " +
                  "    on A304.knumh = konp.knumh " +
                  "   and A304.mandt = konp.mandt " +
                  "   and A304.vkorg = '" + salesOrg + "' " +
                  "   and A304.vtweg in (" + inDCs + ")" +
                  "   and A304.kschl = '" + rrpCondType + "' " +
                  "   and A304.mandt = '" + sapClient + "' " +
                  "   and konp.kschl = '" + listCondType + "' " +
                  "   and char(date, internal) >= chr(A304.datab) " +
                  "   and char(date, internal) <= chr(A304.datbi) " +
                  "   and A304.kappl = 'V' ";
            jtSingle.execute(sql);
            // Get the changes into a second temp table:
            sql = "create table temp.changedarts as " +
                  "select temp.pricechanges.* " +
                  "  from temp.pricechanges, zchangedartdc " +
                  " where zchangedartdc.matnr = temp.pricechanges.matnr " +
                  "   and zchangedartdc.vtweg = temp.pricechanges.vtweg " +
                  "   and zchangedartdc.price != temp.pricechanges.price ";
            jtSingle.execute(sql);
            // save the changes, and flag them:
            sql = "update zchangedartdc " +
                  "   set (price, changed) = (select price, 'X'  " +
                  "                             from temp.changedarts " +
                  "                            where zchangedartdc.matnr = temp.changedarts.matnr " +
                  "                              and zchangedartdc.vtweg = temp.changedarts.vtweg) " +
                  " where matnr||vtweg in (select matnr||vtweg from temp.changedarts) ";
            updates = jtSingle.update(sql);
            // add the new items that weren't there last time, and flag them:
            sql = "insert into zchangedartdc " +
                  "select '" + sapClient + "', matnr, vtweg, 'X', price " +
                  "  from temp.pricechanges " +
                  " where matnr||vtweg not in (select matnr||vtweg from zchangedartdc) ";
            updates = jtSingle.update(sql);
            // now we've got all the changes flagged, we can get the list of changed products:
            jtSingle.query("select distinct matnr from zchangedartdc " +
                           " where changed = 'X' ",
                           new RowCallbackHandler() {
                               public void processRow(ResultSet rs) throws SQLException {
                                   prods.add(rs.getString("matnr"));
                               }
                           });
            // release the connection:
            destroySingleConnection(jtSingle);
            return prods;
        }

        /**
         * Return a JdbcTemplate which will always use the same connection. Parameter jt is just
         * a convenient way to get a DataSource from which to get a Connection.
         * When the calling procedure has finished, it MUST call
         * destroySingleConnection(jt).
         * @param jt
         * @return
         */
        private JdbcTemplate getSingleConTemplate(JdbcTemplate jt) {
            Connection con;
            try {
                con = jt.getDataSource().getConnection();
                con.setAutoCommit(true);
            } catch (SQLException e) {
                throw new DataAccessResourceFailureException("Failed to get Connection for SingleConnectionDataSource", e);
            }
            SingleConnectionDataSource singleDs = new SingleConnectionDataSource(con, true);
            return new JdbcTemplate(singleDs);
        }

        private void destroySingleConnection(JdbcTemplate singleConJt) {
            try {
                ((SingleConnectionDataSource) singleConJt.getDataSource()).destroy();
            } catch (SQLException e) {
                throw new DataAccessResourceFailureException("Failed to destroy SingleConnectionDataSource", e);
            }
        }

  • How to get application module instance from java bundle?

    Hi!
    I would like to build an application that gets all its translations from a database table.
    So I created an application module for translations that contains a view object which selects translations for a specific language from a database table. I exposed a method in the application module as a client interface which returns a HashMap<String, String> of all translations for a specific language. When I test my view and the client interface method call, they work fine.
    Then I created Java bundle classes to get translations for a specific language, and tried to override the public Object[][] getContents() method.
    I tried to get my translations application module like this:
    SharedTranslationsAppModuleImpl am = new SharedTranslationsAppModuleImpl();
    Map<String, String> translationsMap = am.getTranslations(this.getLocaleCode()); // Client interface method call
    In getTranslations(String localeCode) I try to get that view (which would select the translations from the database), but it returns null and I get an NPE.
    So what is the right way to get an application module from a Java bundle file? Right now, every time the application wants to get translations, it stops and displays an NPE message.
    Regards, Marko
    I use JDeveloper 11.1.2.1.0

    Marko,
    you can't just instantiate an application module with new. An application module has to be set up: DB connections and memory pools have to be initialized, and so on.
    Can you describe why and when you try to read the resource bundle from the db and where the resource bundle is used?
    This blog may be what you are looking for http://technology.amis.nl/2012/08/10/implement-resource-bundles-for-adf-applications-in-a-database-table/
    Timo
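    For completeness: if a root application module really is needed from plain Java code (for example in a standalone client), the usual route is to let the framework create and release it rather than calling new. A sketch, where the AM definition and configuration names are illustrative rather than taken from the post, localeCode is the caller's locale string, and exception handling is omitted:

        // Sketch only - definition/configuration names are illustrative.
        oracle.jbo.ApplicationModule am =
            oracle.jbo.client.Configuration.createRootApplicationModule(
                "model.SharedTranslationsAppModule",     // AM definition name
                "SharedTranslationsAppModuleLocal");     // configuration from bc4j.xcfg
        try {
            // Cast to the generated client interface and call the exposed method.
            Map<String, String> translations =
                ((SharedTranslationsAppModule) am).getTranslations(localeCode);
            // ... use the translations ...
        } finally {
            // Always hand the instance back so connections and pools are cleaned up.
            oracle.jbo.client.Configuration.releaseRootApplicationModule(am, true);
        }

    Inside a running ADF web application, though, the database-backed resource bundle approach from the blog Timo links is usually the better fit.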

  • Using Multiple Linear Regression from SAP Predictive Analysis within S&OP

    How could I go about using a model exported from SAP Predictive Analysis, say a multiple linear regression model, as a stored procedure in HANA from within S&OP?

    Hi Kevin,
    You cannot use a model exported from SAP Predictive Analysis in S&OP. S&OP uses the same PAL libraries from HANA for statistical forecasting; we enable a few statistical forecasting methods from PAL, like exponential smoothing, in S&OP.
    Thanks,
    Raghav

  • How to use the Worklist API from Java (classpath ??)

    Hi all,
    Sorry for a novice question, but I couldn't find the way to go about this (probably because it's such common knowledge...).
    I would like to try to use the Worklist API from my Java code in Eclipse, and according to the BPEL dev guide I need to add an import for oracle.tip.pc.api.worklist. Where do I find these classes?
    I guess I need to change my CLASSPATH, but I couldn't find a single word about this in the BPEL dev guide (chapter 17), the BPEL installation guide or elsewhere.
    thanks.

    Hi all,
    OK, it's working now.
    To summarize - I was trying the code from the BPEL developer guide, chapter 17, page 40, for using the Worklist local APIs.
    Only after adding the following JARs to the build path was I able to compile it:
    orabpel-common.jar
    orabpel.jar
    bpm-infra.jar
    bpm-services.jar
    So these 4 JARs are required for using the Worklist local APIs (there is no hint of this requirement in the dev guide itself, though...).
    Thank you very much for your help,
    assaf.
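    For anyone compiling outside an IDE, the command-line equivalent of that build path would be something like the line below (the jar locations and the client file name are illustrative; point them at wherever your BPEL installation keeps the jars, and use ':' instead of ';' as the separator on Unix):

        javac -classpath orabpel-common.jar;orabpel.jar;bpm-infra.jar;bpm-services.jar MyWorklistClient.java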

  • Calling Windows DLL functions from Java

    Hello Audience,
    Is there a way to call Win32/64 based Dynamic-Link-Library functions from Java?
    Thanks,
    GenKabuki

    Yes.
    In general, if you are trying to call functions in an existing DLL, you will have to write a "wrapper" DLL that meets the Java JNI interface requirements. There are tools around which purport to generate these for you. Do a Google search; among other tools, you should turn up JACE.

  • How to use multiple tape drives from a single client when I want to back up a single filesystem?

    Hello All.
    I want to back up one filesystem using 4 tape drives to reduce backup time.
    In the case of Symantec Veritas NetBackup, you can use "NEW_STREAM" for a multi-streaming backup.
    Could anybody let me know how to use multiple tape drives for a single client?
    1. Backup source server: Linux (1 client)
    2. Backup source: /data1 (500GB)
        -> The current directory structure doesn't have subdirectories under /data1, just files directly under /data1.
    If the directory structure has subdirectories like /data1/aaa and /data1/bbb, is it possible to use multiple tape drives?
    3. Tape Drive with OSB : LTO6 * 4 drives
    Thank you.

    You would have to create a different dataset for each sub-folder. If you only have files in the top-level folder, then even the NEW_STREAM option couldn't be used to split the job.
    Organise the data into folders and then create datasets in a client folder, such as:
    /usr/local/oracle/backup/admin/config/dataset/Linux/data1_aaa
    /usr/local/oracle/backup/admin/config/dataset/Linux/data1_bbb
    Then in the schedule you just specify the Linux folder. In each dataset you list the hostname and the folder name, such as:
    include host Linux
    include path /data1/aaa
    Now it will create a new job for each dataset and therefore each folder.
    Thanks
    Rich
