Cardinality in Java

Hi,
I have a list of plants in a Vector.
How can I find the number of plants in that Vector object, like a cardinality function?
Let me know if my question is not clear.
Thanks

pushsan wrote:
Thank you...it worked..

Phew. I can sleep easier now knowing I don't have to regression test List#size() myself. :^)
- Saish
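
For reference, a minimal sketch of what the answer implies (the class and variable names here are illustrative, assuming the plants are held in a java.util.Vector):

    import java.util.Vector;

    public class PlantCount {
        public static void main(String[] args) {
            Vector<String> plants = new Vector<String>();
            plants.add("Fern");
            plants.add("Rose");
            plants.add("Cactus");
            // size() returns the number of elements currently in the Vector,
            // i.e. the "cardinality" the original poster asked about.
            System.out.println("Number of plants: " + plants.size());
        }
    }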

Similar Messages

  • Beginner's question on the simulation of a Java Card application

    Hi,
    I am trying to run a basic Java Card application.
    To simulate the Java Card application, I created jcwde.app and ran C:\>jcwde -p 9025 jcwde.app
    I got an exception like:
    Exception in thread "main" java.lang.UnsatisfiedLinkError: markHeap
    at com.sun.javacard.impl.NativeMethods.markHeap(Native Method)
    at javacard.framework.Dispatcher.cardInit(Dispatcher.java:188)
    at javacard.framework.Dispatcher.main(Dispatcher.java:63)
    at javacard.framework.JCWDEDispatcher.main(JCWDEDispatcher.java:28)
    at com.sun.javacard.jcwde.Main.run(Main.java:85)
    at com.sun.javacard.jcwde.Main.main(Main.java:148)
    Can anyone help me to find out the reason for this exception?
    The content of jcwde.app is
    // applet AID
    com.sun.javacard.installer.InstallerApplet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0x8:0x1
    wallet.Wallet
    Thanks in advance.
    anju

    I see terms like ...
    Core Java: The main libraries.
    JDK: (J)ava (D)evelopment (K)it - AKA (S)oftware (D)evelopment (K)it, although the terms are not exactly synonymous.
    J2EE: (J)ava 2 (E)nterprise (E)dition (builds on J2SE with additional libraries for true business application building).
    J2SE: (J)ava 2 (S)tandard (E)dition (the core libraries for general Java program development, without the extra stuff that is in J2EE).
    Go to java.sun.com and check out the tutorials, the readme files that come with the downloads, the release notes, etc ... HTH

  • Error in conversion - applet class is not defined!

    Hello everybody. I have created my own applet and I am trying to test it in a simulated environment, using the Java Card Toolkit 2.1.1.
    I have declared AID of my applet in the jcwde.app file as following:
    ///The code in the jcwde.app file
    com.sun.javacard.samples.wallet.Wallet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x6:0x1
    //Declare my applet
    com.sun.javacard.samples.myapplet.myapplet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x7:0x1
    /////End of the code
    The first line of this file also contains the AID of the installer.
    As you can see from the above file, the only difference between the wallet and myapplet is the second byte from the end.
    Then I compiled myapplet (successfully) and converted it. When I tried to convert (I also created an .opt file with the above AID of myapplet), I received the message:
    error: com.sun.javacard.samples.wallet.myapplet class is not defined in package com.sun.javacard.samples.wallet.myapplet.
    conversion completed with 1 errors and 0 warnings.
    I will give the .opt file also:
    /////code of the opt file
    -out EXP JCA CAP
    -exportpath ..\api21
    -applet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x7:0x1 com.sun.javacard.samples.myapplet.myapplet
    com.sun.javacard.samples.myapplet
    0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x7 1.0
    /////End of opt file
    What do you think I am doing wrong? Is there any other file (like the jcwde.app file) in which I have to declare the AID of myapplet?
    I will appreciate any help.
    Thank you.

    The converter reports that it cannot find your applet in the specified package. Have you really specified a package in the java file?
    myapplet.java:
    package com.sun.javacard.samples.myapplet;
    // your applet code
    Try to specify the parameter -classdir in the .opt file:
    -classdir <the root directory of the class hierarchy>

    Hello, and thank you for replying.
    Yes, I defined the package as you wrote. I also tried the -classdir option, but I have a similar problem again. The DOS command line shows the following message (after executing the command jcwde -p 9025 jcwde.app):
    java.lang.ClassNotFoundException: com.sun.javacard.samples.IMSIapp.IMSIapplet
    jcwde terminating on receipt of SimulationException. See previous messages for cause.
    Exception in thread "main" com.sun.javacard.jcwde.SimulationException
    at com.sun.javacard.jcwde.MaskedApplets.getInstallInstance(MaskedApplets.java:233)
    at com.sun.javacard.jcwde.SimPrivAccess.getROMApplet(SimPrivAccess.java:146)
    at com.sun.javacard.impl.NativeMethods.getInstallMethod(NativeMethods.java:314)
    at com.sun.javacard.impl.PrivAccess.initialize(PrivAccess.java:281)
    at javacard.framework.Dispatcher.cardInit(Dispatcher.java:226)
    at javacard.framework.Dispatcher.main(Dispatcher.java:63)
    at javacard.framework.JCWDEDispatcher.main(JCWDEDispatcher.java:28)
    at com.sun.javacard.jcwde.Main.run(Main.java:85)
    at com.sun.javacard.jcwde.Main.main(Main.java:148)
    It did not recognize the class (IMSIapp is the directory containing my .class, .java, and .opt files, and IMSIapplet is the Java Card applet class).
    What do you think is wrong?
    I am waiting for your reply (or anyone's).
    Thank you.
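
    For reference, a minimal applet skeleton showing the package declaration and directory layout the converter and jcwde expect (the applet body below is illustrative, not code from this thread):

    // File: <the root directory of the class hierarchy>\com\sun\javacard\samples\myapplet\myapplet.java
    package com.sun.javacard.samples.myapplet;

    import javacard.framework.APDU;
    import javacard.framework.Applet;
    import javacard.framework.ISO7816;
    import javacard.framework.ISOException;

    public class myapplet extends Applet {

        private myapplet() {
            register();
        }

        public static void install(byte[] bArray, short bOffset, byte bLength) {
            new myapplet();
        }

        public void process(APDU apdu) {
            if (selectingApplet()) {
                return;
            }
            // Reject any instruction this skeleton does not handle.
            ISOException.throwIt(ISO7816.SW_INS_NOT_SUPPORTED);
        }
    }

    The fully qualified name used in jcwde.app and in the .opt file (here com.sun.javacard.samples.myapplet.myapplet) must match this package and class name exactly, and -classdir must point at the root of the compiled class hierarchy.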

  • Problem running JCWDE - ClassNotFoundException

    I ran the demo in Java Card Kit 2.1.1 successfully. However, when I created another application by copying the Wallet java file and renaming it to testWallet, in package TestWallet under the same directory as the wallet application, I got a ClassNotFoundException error when running JCWDE:
    Copyright (c) 2000 Sun Microsystems, Inc. All rights reserved.
    jcwde is listening for T=0 Apdu's on TCP/IP port 9,025.
    java.lang.ClassNotFoundException: com.sun.javacard.samples.TestWallet.testWallet
    jcwde terminating on receipt of SimulationException. See previous messages for cause.
    Exception in thread "main" com.sun.javacard.jcwde.SimulationException
    at com.sun.javacard.jcwde.MaskedApplets.getInstallInstance(MaskedApplets.java:233)
    at com.sun.javacard.jcwde.SimPrivAccess.getROMApplet(SimPrivAccess.java:146)
    at com.sun.javacard.impl.NativeMethods.getInstallMethod(NativeMethods.java:314)
    at com.sun.javacard.impl.PrivAccess.initialize(PrivAccess.java:281)
    at javacard.framework.Dispatcher.cardInit(Dispatcher.java:226)
    at javacard.framework.Dispatcher.main(Dispatcher.java:63)
    at javacard.framework.JCWDEDispatcher.main(JCWDEDispatcher.java:28)
    at com.sun.javacard.jcwde.Main.run(Main.java:85)
    at com.sun.javacard.jcwde.Main.main(Main.java:148)
    I added the last line in the jcwde.app configuration file:
    com.sun.javacard.installer.InstallerApplet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0x8:0x1
    com.sun.javacard.samples.JavaPurse.JavaPurse 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x2:0x1
    com.sun.javacard.samples.JavaLoyalty.JavaLoyalty 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x5:0x1
    com.sun.javacard.samples.wallet.Wallet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x6:0x1
    com.sun.javacard.samples.TestWallet.testWallet 0xa0:0x0:0x0:0x0:0x62:0x3:0x1:0xc:0x8:0x1
    Am I missing any setup to run the testWallet application, which is exactly the same as the Wallet application except for the name?
    Fai.

    Thanks for your explanation and direction. However, I used the same DOS window with the same environment setup to run both the original sample applications and the new application. I didn't see anywhere a classpath pointing to the original sample application. Where should I add the classpath information?
    Fai.

  • What is the difference between Java TV, OCAP, Xletview, Cardinal Studio, Espial...

    I am a newbie in the digital TV area, and one of my biggest doubts concerns the many acronyms and names that come up, so I would like some clarification.
    What is the difference between the many implementations that exist today, for example Java TV and OCAP?
    Is Xletview a freeware implementation of the DVB-MHP standard? What is its relation to Java TV?
    Does DVB-MHP have any software release, or is it only a consortium that defines rules for digital TV?
    What are the best simulators today for testing Digital TV applications? Currently I am using the implementation from devicetop.com, but it seems that it is not complete. Is that true? If it is not complete, what is it lacking?
    Sorry for so many questions.

    First of all, sorry for the delay in responding. I've been buried under a mountain of work for the last week,and so this is the first chance I've had to check in here for a while.
    Some more details on the book:
    - It'll be published sometime before Christmas, hopefully. The manuscript has been delivered to the publisher, so now we have to wait for them to complete the technical review, then we have to make any changes that they need (or that we want to make). Then it will actually get published.
    - As far as I know, it will be available worldwide, and you will certainly be able to order it via amazon.com
    - I don't know the price yet
    - It will cover OCAP, MHP and JavaTV, with some discussion of ACAP (but not much). We're looking at most aspects of these standards: building applications, building middleware, and how you can make money off them
    - It's based on my website, but with some MAJOR changes. It's probably going to be around 600 pages long, so there's a lot of new material there. I've tried to cover as much as possible about the questions that people have been posting here and on the MHP forum, as well as the questions I get in email, because these are the areas that people want answers to.
    - I can still make some minor additions, so if there's any questions that you'd really like to see answered, please let me know (email to [email protected]). I can't promise to answer them in the book, but I'll do my best and I can always cover them on the website.
    - I'm the author of the technical bits. My co-author for the more commercially-oriented bits is Anthony Smith-Chaigneau, VP of business development at Osmosys and ex-DVB. He knows the commercial side of MHP/OCAP as well as anyone out there. I've been doing MHP/JavaTV since the early days of the standardization process for those standards (and standardizing DAVIC before then), so between us we've got a pretty good grasp on the material and the commercial/political/technical background to the various standards.
    - There is no mailing list yet, but if people think it's useful then I can create one. Let me know what you think.
    To answer funniest's questions, your list covers most of the things that you need to do to implement an OCAP receiver. Some stuff like storing applications is missing, but this comes under the general case of 'application management'. Of course, there is all the other OpenCable certification that you need to do as well, but that's not specific to OCAP. I know that OCAP are discussing conformance testing, but there are no firm decisions yet as far as I know. Maybe they will follow the MHP approach, or maybe they'll extend their 'Wave' certification program. This is also one of the areas where OCAP and ACAP need to agree (because of the OCAP APIs that are re-used in ACAP), which complicates things.
    As for other OCAP activities, there are a few OCAP trials underway in various places in the US. I can't say too much about them, but I know that many cable companies are looking at OCAP, even if they're not running trials yet. The Sun OCAP content is the only activity that is generally open to the public at the moment, as far as I know.
    Steve.

  • How to use EVS with different data in each row, in a Java Web Dynpro table?

    Hi all,
    I am using an EVS in a column of a Java Web Dynpro table.
    Let's say the name, and context attribute, of this column is column1.
    It's filled dynamically using an RFC that takes as an input parameter the value of another column, and related context attribute, from the same table (let's call it column2) - obviously, from the same row. In other words: the values of the EVS in column1 of row1 depend on the value of column2 of row1, the values of the EVS in column1 of row2 depend on the value of column2 of row2, and so on... I hope I explained myself OK.
    The code I'm using works great for filling the EVS dynamically:
    IWDAttributeInfo attrInfo = wdContext.nodeDetail().getNodeInfo().getAttribute(nodeElement.COLUMN1);
    ISimpleTypeModifiable siType = attrInfo.getModifiableSimpleType();
    IModifiableSimpleValueSet<String> value = siType.getSVServices().getModifiableSimpleValueSet();
    value.clear();
    if (this.initRFC_Input(nodeElement.getColumn2())) {
         for (int i = 0; i < wdContext.nodeRFCresult().size(); i++) {
              value.put(wdContext.nodeRFCresult().getRFCresultElementAt(i).getLgort(),
                        wdContext.nodeRFCresult().getRFCresultElementAt(i).getLgobe());
         }
    }
    In this code, nodeElement is the context row of the table that is passed dynamically to the method when the value of colum2 is changed.
    HOWEVER, the problem I'm having is that after executing this code, EACH NEW ROW that is added to the table has by default the same values as the first row, in the column1 EVS. And, for example, if I refresh the values of the column1 EVS in row 2, all EVS values in the other rows are also refreshed with the same values as the ones of EVS in row 2.
    How can I make sure each row EVS has its own set of independent values, so they don't mess with each other?
    Hope you guys can help me. And please, let me know if I didn't explain myself correctly!
    Thanks!

    I just did as you said (I think), but it's still having the same behaviour as before (same data for all EVS in the table).
    Here's what I did:
    In node "Detail" (cardinality 0...n, singleton set to true), which is bound to the table, I created a child node named "Column1Values" with cardinality 1...1 and singleton set to false.
    "Column1Values" node has an attribute called "column1", of type String.
    I did the binding between the attribute "column1" and the column1 inputfield cell editor in the table.
    I created an event called Column2Changed and bound it to the column2 cell editor of the table. I added a parameter called nodeElement of type IPrivateCompView.IDetailElement to this event, and mapped it to the column2 editor in the table so that I can dynamically get the nodeElement that is being affected.
    I added the following code to the onActionColumn2Changed(wdEvent, nodeElement) method that gets created in the view:
    IWDAttributeInfo attrInfo = nodeElement.nodeColumn1Values().getNodeInfo().getAttribute("column1");
    ISimpleTypeModifiable siType = attrInfo.getModifiableSimpleType();
    IModifiableSimpleValueSet<String> value = siType.getSVServices().getModifiableSimpleValueSet();
    if (this.initRFC_Input(nodeElement.getColumn2())) {
         for (int i = 0; i < wdContext.nodeRFCresults().size(); i++) {
              value.put(wdContext.nodeRFCresults().getRFCresultsElementAt(i).getId(),
                        wdContext.nodeRFCresults().getRFCresultsElementAt(i).getDesc());
         }
    }
    And with this, I still get the original problem... when the EVS of one row is updated, ALL the other EVS in the table also get updated with the same values.
    What am I missing? Sorry Govardan, I bet I'm not seeing something really obvious... hopefully you can point me in the right direction.
    Thanks!

  • Returning an array type from a local method in Web Dynpro Java application

    Hi,
    In my project, we have a requirement to display 18 rolling months along with the year, starting from the current month.
    My approach is to get the system date, take the current month, and send the month and year values to a local method that will return the 18 rolling months along with the year.
    But when I tried to create a new method, there was no option to return an array type; it was greyed out.
    So, can we not return an array type from a method in a Web Dynpro for Java application?
    If so, what is the alternative, and how can I achieve it?
    I will appreciate your help!
    Regards
    Ram

    Hi
    You can create new methods in the //@@begin others section:
      //@@begin others
      private ArrayList MyMethod(){
           // ** Put your code here
           return new ArrayList();
      }
      //@@end
    Another option is to create a context node with cardinality 0...n and one or more attributes, and in your method create the needed records in this node. To read these values, you only need to read your context node.
    Best regards
    Edited by: Xavier Aranda on Dec 2, 2010 9:41 AM
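
    For what it's worth, the rolling-months computation itself is plain Java. Here is a minimal sketch (class and method names are illustrative) that builds the 18 month/year labels, which could then be copied into a context node as suggested above:

    import java.text.SimpleDateFormat;
    import java.util.ArrayList;
    import java.util.Calendar;
    import java.util.List;

    public class RollingMonths {
        // Returns 18 labels like "12.2010", starting from the current month.
        public static List<String> getRollingMonths() {
            List<String> months = new ArrayList<String>();
            SimpleDateFormat fmt = new SimpleDateFormat("MM.yyyy");
            Calendar cal = Calendar.getInstance();
            for (int i = 0; i < 18; i++) {
                months.add(fmt.format(cal.getTime()));
                cal.add(Calendar.MONTH, 1);
            }
            return months;
        }

        public static void main(String[] args) {
            System.out.println(getRollingMonths());
        }
    }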

  • Dmtreedemo.java error

    Hi,
    I got an error in the following Java program, named dmtreedemo.java. The code and the error are shown below.
    I have installed Oracle 10g R2 and I am using JDK 1.4.2. I set the classpath for jdm.jar and ojdm_api.jar (available in the Oracle 10g R2 software) and compiled successfully, but at the execution stage I got this error:
    F:\Mallari\DATA MINING demos\java\samples>java dmtreedemo localhost:1521:orcl scott tiger
    --- Build Model - using cost matrix ---
    javax.datamining.JDMException: Generic Error.
    at oracle.dmt.jdm.resource.OraExceptionHandler.createException(OraExceptionHandler.java:142)
    at oracle.dmt.jdm.resource.OraExceptionHandler.createException(OraExceptionHandler.java:91)
    at oracle.dmt.jdm.OraDMObject.createException(OraDMObject.java:111)
    at oracle.dmt.jdm.base.OraTask.saveObjectInDatabase(OraTask.java:204)
    at oracle.dmt.jdm.OraMiningObject.saveObjectInDatabase(OraMiningObject.java:164)
    at oracle.dmt.jdm.resource.OraPersistanceManagerImpl.saveObject(OraPersistanceManagerImpl.java:245)
    at oracle.dmt.jdm.resource.OraConnection.saveObject(OraConnection.java:383)
    at dmtreedemo.executeTask(dmtreedemo.java:622)
    at dmtreedemo.buildModel(dmtreedemo.java:304)
    at dmtreedemo.main(dmtreedemo.java:199)
    Caused by: java.sql.SQLException: Unsupported feature
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:269)
    at oracle.jdbc.dbaccess.DBError.throwUnsupportedFeatureSqlException(DBError.java:690)
    at oracle.jdbc.driver.OracleCallableStatement.setString(OracleCallableStatement.java:1337)
    at oracle.dmt.jdm.utils.OraSQLUtils.createCallableStatement(OraSQLUtils.java:126)
    at oracle.dmt.jdm.utils.OraSQLUtils.executeCallableStatement(OraSQLUtils.java:532)
    at oracle.dmt.jdm.scheduler.OraProgramJob.createJob(OraProgramJob.java:77)
    at oracle.dmt.jdm.scheduler.OraJob.saveJob(OraJob.java:107)
    at oracle.dmt.jdm.scheduler.OraProgramJob.saveJob(OraProgramJob.java:85)
    at oracle.dmt.jdm.scheduler.OraProgramJob.saveJob(OraProgramJob.java:290)
    at oracle.dmt.jdm.base.OraTask.saveObjectInDatabase(OraTask.java:199)
    ... 6 more
    So please help me out with this; I will be very thankful.
    ===========================================================
    the sample code is
    // Copyright (c) 2004, 2005, Oracle. All rights reserved.
    // File: dmtreedemo.java
    * This demo program describes how to use the Oracle Data Mining (ODM) Java API
    * to solve a classification problem using Decision Tree (DT) algorithm.
    * PROBLEM DEFINITION
    * How to predict whether a customer responds or not to the new affinity card
    * program using a classifier based on DT algorithm?
    * DATA DESCRIPTION
    * Data for this demo is composed from base tables in the Sales History (SH)
    * schema. The SH schema is an Oracle Database Sample Schema that has the customer
    * demographics, purchasing, and response details for the previous affinity card
    * programs. Data exploration and preparing the data is a common step before
    * doing data mining. Here in this demo, the following views are created in the user
    * schema using CUSTOMERS, COUNTRIES, and SUPPLIMENTARY_DEMOGRAPHICS tables.
    * MINING_DATA_BUILD_V:
    * This view collects the previous customers' demographics, purchasing, and affinity
    * card response details for building the model.
    * MINING_DATA_TEST_V:
    * This view collects the previous customers' demographics, purchasing, and affinity
    * card response details for testing the model.
    * MINING_DATA_APPLY_V:
    * This view collects the prospective customers' demographics and purchasing
    * details for predicting response for the new affinity card program.
    * DATA MINING PROCESS
    * Prepare Data:
    * 1. Missing Value treatment for predictors
    * See dmsvcdemo.java for a definition of missing values, and the steps to be
    * taken for missing value imputation. SVM interprets all NULL values for a
    * given attribute as "sparse". Sparse data is not suitable for decision
    * trees, but it will accept sparse data nevertheless. Decision Tree
    * implementation in ODM handles missing predictor values (by penalizing
    * predictors which have missing values) and missing target values (by simple
    * discarding records with missing target values). We skip missing values
    * treatment in this demo.
    * 2. Outlier/Clipping treatment for predictors
    * See dmsvcdemo.java for a discussion on outlier treatment. For decision
    * trees, outlier treatment is not really necessary. We skip outlier treatment
    * in this demo.
    * 3. Binning high cardinality data
    * No data preparation for the types we accept is necessary - even for high
    * cardinality predictors. Preprocessing to reduce the cardinality
    * (e.g., binning) can improve the performance of the build, but it could
    * penalize the accuracy of the resulting model.
    * The PrepareData() method in this demo program illustrates the preparation of the
    * build, test, and apply data. We skip PrepareData() since the decision tree
    * algorithm is very capable of handling data which has not been specially
    * prepared. For this demo, no data preparation will be performed.
    * Build Model:
    * Mining Model is the prime object in data mining. The buildModel() method
    * illustrates how to build a classification model using DT algorithm.
    * Test Model:
    * Classification model performance can be evaluated by computing test
    * metrics like accuracy, confusion matrix, lift and ROC. The testModel() or
    * computeTestMetrics() method illustrates how to perform a test operation to
    * compute various metrics.
    * Apply Model:
    * Predicting the target attribute values is the prime function of
    * classification models. The applyModel() method illustrates how to
    * predict the customer response for affinity card program.
    * EXECUTING DEMO PROGRAM
    * Refer to Oracle Data Mining Administrator's Guide
    * for guidelines for executing this demo program.
    // Generic Java api imports
    import java.math.BigDecimal;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.text.DecimalFormat;
    import java.text.MessageFormat;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.Stack;
    // Java Data Mining (JDM) standard api imports
    import javax.datamining.ExecutionHandle;
    import javax.datamining.ExecutionState;
    import javax.datamining.ExecutionStatus;
    import javax.datamining.JDMException;
    import javax.datamining.MiningAlgorithm;
    import javax.datamining.MiningFunction;
    import javax.datamining.NamedObject;
    import javax.datamining.SizeUnit;
    import javax.datamining.algorithm.tree.TreeHomogeneityMetric;
    import javax.datamining.algorithm.tree.TreeSettings;
    import javax.datamining.algorithm.tree.TreeSettingsFactory;
    import javax.datamining.base.AlgorithmSettings;
    import javax.datamining.base.Model;
    import javax.datamining.base.Task;
    import javax.datamining.data.AttributeDataType;
    import javax.datamining.data.CategoryProperty;
    import javax.datamining.data.CategorySet;
    import javax.datamining.data.CategorySetFactory;
    import javax.datamining.data.ModelSignature;
    import javax.datamining.data.PhysicalAttribute;
    import javax.datamining.data.PhysicalAttributeFactory;
    import javax.datamining.data.PhysicalAttributeRole;
    import javax.datamining.data.PhysicalDataSet;
    import javax.datamining.data.PhysicalDataSetFactory;
    import javax.datamining.data.SignatureAttribute;
    import javax.datamining.modeldetail.tree.TreeModelDetail;
    import javax.datamining.modeldetail.tree.TreeNode;
    import javax.datamining.resource.Connection;
    import javax.datamining.resource.ConnectionFactory;
    import javax.datamining.resource.ConnectionSpec;
    import javax.datamining.rule.Predicate;
    import javax.datamining.rule.Rule;
    import javax.datamining.supervised.classification.ClassificationApplySettings;
    import javax.datamining.supervised.classification.ClassificationApplySettingsFactory;
    import javax.datamining.supervised.classification.ClassificationModel;
    import javax.datamining.supervised.classification.ClassificationSettings;
    import javax.datamining.supervised.classification.ClassificationSettingsFactory;
    import javax.datamining.supervised.classification.ClassificationTestMetricOption;
    import javax.datamining.supervised.classification.ClassificationTestMetrics;
    import javax.datamining.supervised.classification.ClassificationTestMetricsTask;
    import javax.datamining.supervised.classification.ClassificationTestMetricsTaskFactory;
    import javax.datamining.supervised.classification.ClassificationTestTaskFactory;
    import javax.datamining.supervised.classification.ConfusionMatrix;
    import javax.datamining.supervised.classification.CostMatrix;
    import javax.datamining.supervised.classification.CostMatrixFactory;
    import javax.datamining.supervised.classification.Lift;
    import javax.datamining.supervised.classification.ReceiverOperatingCharacterics;
    import javax.datamining.task.BuildTask;
    import javax.datamining.task.BuildTaskFactory;
    import javax.datamining.task.apply.DataSetApplyTask;
    import javax.datamining.task.apply.DataSetApplyTaskFactory;
    // Oracle Java Data Mining (JDM) implemented api imports
    import oracle.dmt.jdm.algorithm.tree.OraTreeSettings;
    import oracle.dmt.jdm.resource.OraConnection;
    import oracle.dmt.jdm.resource.OraConnectionFactory;
    import oracle.dmt.jdm.modeldetail.tree.OraTreeModelDetail;
    public class dmtreedemo
    //Connection related data members
    private static Connection m_dmeConn;
    private static ConnectionFactory m_dmeConnFactory;
    //Object factories used in this demo program
    private static PhysicalDataSetFactory m_pdsFactory;
    private static PhysicalAttributeFactory m_paFactory;
    private static ClassificationSettingsFactory m_clasFactory;
    private static TreeSettingsFactory m_treeFactory;
    private static BuildTaskFactory m_buildFactory;
    private static DataSetApplyTaskFactory m_dsApplyFactory;
    private static ClassificationTestTaskFactory m_testFactory;
    private static ClassificationApplySettingsFactory m_applySettingsFactory;
    private static CostMatrixFactory m_costMatrixFactory;
    private static CategorySetFactory m_catSetFactory;
    private static ClassificationTestMetricsTaskFactory m_testMetricsTaskFactory;
    // Global constants
    private static DecimalFormat m_df = new DecimalFormat("##.####");
    private static String m_costMatrixName = null;
    public static void main( String args[] )
    try
    if ( args.length != 3 ) {
    System.out.println("Usage: java dmsvrdemo <Host name>:<Port>:<SID> <User Name> <Password>");
    return;
    String uri = args[0];
    String name = args[1];
    String password = args[2];
    // 1. Login to the Data Mining Engine
    m_dmeConnFactory = new OraConnectionFactory();
    ConnectionSpec connSpec = m_dmeConnFactory.getConnectionSpec();
    connSpec.setURI("jdbc:oracle:thin:@"+uri);
    connSpec.setName(name);
    connSpec.setPassword(password);
    m_dmeConn = m_dmeConnFactory.getConnection(connSpec);
    // 2. Clean up all previously created demo objects
    clean();
    // 3. Initialize factories for mining objects
    initFactories();
    m_costMatrixName = createCostMatrix();
    // 4. Build model with supplied cost matrix
    buildModel();
    // 5. Test model - To compute accuracy and confusion matrix, lift result
    // and ROC for the model from apply output data.
    // Please see dnnbdemo.java to see how to test the model
    // with a test input data and cost matrix.
    // Test the model with cost matrix
    computeTestMetrics("DT_TEST_APPLY_OUTPUT_COST_JDM",
    "dtTestMetricsWithCost_jdm", m_costMatrixName);
    // Test the model without cost matrix
    computeTestMetrics("DT_TEST_APPLY_OUTPUT_JDM",
    "dtTestMetrics_jdm", null);
    // 6. Apply the model
    applyModel();
    } catch(Exception anyExp) {
    anyExp.printStackTrace(System.out);
    } finally {
    try {
    //6. Logout from the Data Mining Engine
    m_dmeConn.close();
    } catch(Exception anyExp1) { }//Ignore
    * Initialize all object factories used in the demo program.
    * @exception JDMException if factory initialization failed
    public static void initFactories() throws JDMException
    m_pdsFactory = (PhysicalDataSetFactory)m_dmeConn.getFactory(
    "javax.datamining.data.PhysicalDataSet");
    m_paFactory = (PhysicalAttributeFactory)m_dmeConn.getFactory(
    "javax.datamining.data.PhysicalAttribute");
    m_clasFactory = (ClassificationSettingsFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationSettings");
    m_treeFactory = (TreeSettingsFactory) m_dmeConn.getFactory(
    "javax.datamining.algorithm.tree.TreeSettings");
    m_buildFactory = (BuildTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.task.BuildTask");
    m_dsApplyFactory = (DataSetApplyTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.task.apply.DataSetApplyTask");
    m_testFactory = (ClassificationTestTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationTestTask");
    m_applySettingsFactory = (ClassificationApplySettingsFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationApplySettings");
    m_costMatrixFactory = (CostMatrixFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.CostMatrix");
    m_catSetFactory = (CategorySetFactory)m_dmeConn.getFactory(
    "javax.datamining.data.CategorySet" );
    m_testMetricsTaskFactory = (ClassificationTestMetricsTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationTestMetricsTask");
    * This method illustrates how to build a mining model using the
    * MINING_DATA_BUILD_V dataset and classification settings with
    * DT algorithm.
    * @exception JDMException if model build failed
    public static void buildModel() throws JDMException
    System.out.println("---------------------------------------------------");
    System.out.println("--- Build Model - using cost matrix ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    PhysicalDataSet buildData =
    m_pdsFactory.create("MINING_DATA_BUILD_V", false);
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    buildData.addAttribute(pa);
    m_dmeConn.saveObject("treeBuildData_jdm", buildData, true);
    //2. Create & save Mining Function Settings
    // Create tree algorithm settings
    TreeSettings treeAlgo = m_treeFactory.create();
    // By default, tree algorithm will have the following settings:
    // treeAlgo.setBuildHomogeneityMetric(TreeHomogeneityMetric.gini);
    // treeAlgo.setMaxDepth(7);
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(0.1, SizeUnit.percentage);
    // treeAlgo.setMinNodeSize( 0.05, SizeUnit.percentage );
    // treeAlgo.setMinNodeSize( 10, SizeUnit.count );
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(20, SizeUnit.count);
    // Set cost matrix. A cost matrix is used to influence the weighting of
    // misclassification during model creation (and scoring).
    // See Oracle Data Mining Concepts Guide for more details.
    String costMatrixName = m_costMatrixName;
    // Create ClassificationSettings
    ClassificationSettings buildSettings = m_clasFactory.create();
    buildSettings.setAlgorithmSettings(treeAlgo);
    buildSettings.setCostMatrixName(costMatrixName);
    buildSettings.setTargetAttributeName("AFFINITY_CARD");
    m_dmeConn.saveObject("treeBuildSettings_jdm", buildSettings, true);
    // 3. Create, save & execute Build Task
    BuildTask buildTask = m_buildFactory.create(
    "treeBuildData_jdm", // Build data specification
    "treeBuildSettings_jdm", // Mining function settings name
    "treeModel_jdm" // Mining model name
    buildTask.setDescription("treeBuildTask_jdm");
    executeTask(buildTask, "treeBuildTask_jdm");
    //4. Restore the model from the DME and explore the details of the model
    ClassificationModel model =
    (ClassificationModel)m_dmeConn.retrieveObject(
    "treeModel_jdm", NamedObject.model);
    // Display model build settings
    ClassificationSettings retrievedBuildSettings =
    (ClassificationSettings)model.getBuildSettings();
    if(buildSettings == null)
    System.out.println("Failure to restore build settings.");
    else
    displayBuildSettings(retrievedBuildSettings, "treeBuildSettings_jdm");
    // Display model signature
    displayModelSignature((Model)model);
    // Display model detail
    TreeModelDetail treeModelDetails = (TreeModelDetail) model.getModelDetail();
    displayTreeModelDetailsExtensions(treeModelDetails);
    * Create and save cost matrix.
    * Consider an example where it costs $10 to mail a promotion to a
    * prospective customer and if the prospect becomes a customer, the
    * typical sale including the promotion, is worth $100. Then the cost
    * of missing a customer (i.e. missing a $100 sale) is 10x that of
    * incorrectly indicating that a person is good prospect (spending
    * $10 for the promo). In this case, all prediction errors made by
    * the model are NOT equal. To act on what the model determines to
    * be the most likely (probable) outcome may be a poor choice.
    * Suppose that the probability of a BUY response is 10% for a given
    * prospect. Then the expected revenue from the prospect is:
    * .10 * $100 - .90 * $10 = $1.
    * The optimal action, given the cost matrix, is to simply mail the
    * promotion to the customer, because the action is profitable ($1).
    * In contrast, without the cost matrix, all that can be said is
    * that the most likely response is NO BUY, so don't send the
    * promotion. This shows that cost matrices can be very important.
    * The caveat in all this is that the model predicted probabilities
    * may NOT be accurate. For binary targets, a systematic approach to
    * this issue exists. It is ROC, illustrated below.
    * With ROC computed on a test set, the user can see how various model
    * predicted probability thresholds affect the action of mailing a promotion.
    * Suppose I promote when the probability to BUY exceeds 5, 10, 15%, etc.
    * what return can I expect? Note that the answer to this question does
    * not rely on the predicted probabilities being accurate, only that
    * they are in approximately the correct rank order.
    * Assuming that the predicted probabilities are accurate, provide the
    * cost matrix table name as input to the RANK_APPLY procedure to get
    * appropriate costed scoring results to determine the most appropriate
    * action.
    * In this demo, we will create the following cost matrix
    * ActualTarget   PredictedTarget   Cost
    *      0                0            0
    *      0                1            1
    *      1                0            8
    *      1                1            0
    private static String createCostMatrix() throws JDMException
    String costMatrixName = "treeCostMatrix";
    // Create categorySet
    CategorySet catSet = m_catSetFactory.create(AttributeDataType.integerType);
    // Add category values
    catSet.addCategory(new Integer(0), CategoryProperty.valid);
    catSet.addCategory(new Integer(1), CategoryProperty.valid);
    // Create cost matrix
    CostMatrix costMatrix = m_costMatrixFactory.create(catSet);
    // ActualTarget PredictedTarget Cost
    costMatrix.setValue(new Integer(0), new Integer(0), 0);
    costMatrix.setValue(new Integer(0), new Integer(1), 1);
    costMatrix.setValue(new Integer(1), new Integer(0), 8);
    costMatrix.setValue(new Integer(1), new Integer(1), 0);
    //save cost matrix
    m_dmeConn.saveObject(costMatrixName, costMatrix, true);
    return costMatrixName;
    * This method illustrates how to compute test metrics using
    * an apply output table that has actual and predicted target values. Here the
    * apply operation is done on the MINING_DATA_TEST_V dataset. It creates
    * an apply output table with actual and predicted target values. Using
    * ClassificationTestMetricsTask test metrics are computed. This produces
    * the same test metrics results as ClassificationTestTask.
    * @param applyOutputName apply output table name
    * @param testResultName test result name
    * @param costMatrixName table name of the supplied cost matrix
    * @exception JDMException if model test failed
    public static void computeTestMetrics(String applyOutputName,
    String testResultName, String costMatrixName) throws JDMException
    if (costMatrixName != null) {
    System.out.println("---------------------------------------------------");
    System.out.println("--- Test Model - using apply output table ---");
    System.out.println("--- - using cost matrix table ---");
    System.out.println("---------------------------------------------------");
    else {
    System.out.println("---------------------------------------------------");
    System.out.println("--- Test Model - using apply output table ---");
    System.out.println("--- - using no cost matrix table ---");
    System.out.println("---------------------------------------------------");
    // 1. Do the apply on test data to create an apply output table
    // Create & save PhysicalDataSpecification
    PhysicalDataSet applyData =
    m_pdsFactory.create( "MINING_DATA_TEST_V", false );
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeTestApplyData_jdm", applyData, true );
    // 2 Create & save ClassificationApplySettings
    ClassificationApplySettings clasAS = m_applySettingsFactory.create();
    HashMap sourceAttrMap = new HashMap();
    sourceAttrMap.put( "AFFINITY_CARD", "AFFINITY_CARD" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    m_dmeConn.saveObject( "treeTestApplySettings_jdm", clasAS, true);
    // 3 Create, store & execute apply Task
    DataSetApplyTask applyTask = m_dsApplyFactory.create(
    "treeTestApplyData_jdm",
    "treeModel_jdm",
    "treeTestApplySettings_jdm",
    applyOutputName);
    if(executeTask(applyTask, "treeTestApplyTask_jdm"))
    // Compute test metrics on new created apply output table
    // 4. Create & save PhysicalDataSpecification
    PhysicalDataSet applyOutputData = m_pdsFactory.create(
    applyOutputName, false );
    applyOutputData.addAttribute( pa );
    m_dmeConn.saveObject( "treeTestApplyOutput_jdm", applyOutputData, true );
    // 5. Create a ClassificationTestMetricsTask
    ClassificationTestMetricsTask testMetricsTask =
    m_testMetricsTaskFactory.create( "treeTestApplyOutput_jdm", // apply output data used as input
    "AFFINITY_CARD", // actual target column
    "PREDICTION", // predicted target column
    testResultName // test metrics result name
    testMetricsTask.computeMetric( // enable confusion matrix computation
    ClassificationTestMetricOption.confusionMatrix, true );
    testMetricsTask.computeMetric( // enable lift computation
    ClassificationTestMetricOption.lift, true );
    testMetricsTask.computeMetric( // enable ROC computation
    ClassificationTestMetricOption.receiverOperatingCharacteristics, true );
    testMetricsTask.setPositiveTargetValue( new Integer(1) );
    testMetricsTask.setNumberOfLiftQuantiles( 10 );
    testMetricsTask.setPredictionRankingAttrName( "PROBABILITY" );
    if (costMatrixName != null) {
    testMetricsTask.setCostMatrixName(costMatrixName);
    displayTable(costMatrixName, "", "order by ACTUAL_TARGET_VALUE, PREDICTED_TARGET_VALUE");
    // Store & execute the task
    boolean isTaskSuccess = executeTask(testMetricsTask, "treeTestMetricsTask_jdm");
    if( isTaskSuccess ) {
    // Restore & display test metrics
    ClassificationTestMetrics testMetrics = (ClassificationTestMetrics)
    m_dmeConn.retrieveObject( testResultName, NamedObject.testMetrics );
    // Display classification test metrics
    displayTestMetricDetails(testMetrics);
    * This method illustrates how to apply the mining model on the
    * MINING_DATA_APPLY_V dataset to predict customer
    * response. After completion of the task apply output table with the
    * predicted results is created at the user specified location.
    * @exception JDMException if model apply failed
    public static void applyModel() throws JDMException
    System.out.println("---------------------------------------------------");
    System.out.println("--- Apply Model ---");
    System.out.println("---------------------------------------------------");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 1 ---");
    System.out.println("--- Find the 10 customers who live in Italy ---");
    System.out.println("--- that are least expensive to be convinced to ---");
    System.out.println("--- use an affinity card. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    PhysicalDataSet applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    ClassificationApplySettings clasAS = m_applySettingsFactory.create();
    // Add source attributes
    HashMap sourceAttrMap = new HashMap();
    sourceAttrMap.put( "COUNTRY_NAME", "COUNTRY_NAME" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    // Add cost matrix
    clasAS.setCostMatrixName( m_costMatrixName );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    DataSetApplyTask applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT1_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // Find the 10 customers who live in Italy that are least expensive to be
    // convinced to use an affinity card.
    displayTable("TREE_APPLY_OUTPUT1_JDM",
    "where COUNTRY_NAME='Italy' and ROWNUM < 11 ",
    "order by COST");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 2 ---");
    System.out.println("--- List ten customers (ordered by their id) ---");
    System.out.println("--- along with likelihood and cost to use or ---");
    System.out.println("--- reject the affinity card. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    clasAS = m_applySettingsFactory.create();
    // Add cost matrix
    clasAS.setCostMatrixName( m_costMatrixName );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT2_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // List ten customers (ordered by their id) along with likelihood and cost
    // to use or reject the affinity card (Note: while this example has a
    // binary target, such a query is useful in multi-class classification -
    // Low, Med, High for example).
    displayTable("TREE_APPLY_OUTPUT2_JDM",
    "where ROWNUM < 21",
    "order by CUST_ID, PREDICTION");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 3 ---");
    System.out.println("--- Find the customers who work in Tech support ---");
    System.out.println("--- and are under 25 who is going to response ---");
    System.out.println("--- to the new affinity card program. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    clasAS = m_applySettingsFactory.create();
    // Add source attributes
    sourceAttrMap = new HashMap();
    sourceAttrMap.put( "AGE", "AGE" );
    sourceAttrMap.put( "OCCUPATION", "OCCUPATION" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT3_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // Find the customers who work in Tech support and are under 25 who are
    // going to respond to the new affinity card program.
    displayTable("TREE_APPLY_OUTPUT3_JDM",
    "where OCCUPATION = 'TechSup' " +
    "and AGE < 25 " +
    "and PREDICTION = 1 ",
    "order by CUST_ID");
    * This method stores the given task with the specified name in the DMS
    * and submits the task for asynchronous execution in the DMS. After
    * completing the task successfully it returns true. If there is a task
    * failure, then it prints error description and returns false.
    * @param taskObj task object
    * @param taskName name of the task
    * @return boolean returns true when the task is successful
    * @exception JDMException if task execution failed
    public static boolean executeTask(Task taskObj, String taskName)
    throws JDMException
    boolean isTaskSuccess = false;
    m_dmeConn.saveObject(taskName, taskObj, true);
    ExecutionHandle execHandle = m_dmeConn.execute(taskName);
    System.out.print(taskName + " is started, please wait. ");
    //Wait for completion of the task
    ExecutionStatus status = execHandle.waitForCompletion(Integer.MAX_VALUE);
    //Check the status of the task after completion
    isTaskSuccess = status.getState().equals(ExecutionState.success);
    if( isTaskSuccess ) //Task completed successfully
    System.out.println(taskName + " is successful.");
    else //Task failed
    System.out.println(taskName + " failed.\nFailure Description: " +
    status.getDescription() );
    return isTaskSuccess;
    private static void displayBuildSettings(
    ClassificationSettings clasSettings, String buildSettingsName)
    System.out.println("BuildSettings Details from the "
    + buildSettingsName + " table:");
    displayTable(buildSettingsName, "", "order by SETTING_NAME");
    System.out.println("BuildSettings Details from the "
    + buildSettingsName + " model build settings object:");
    String objName = clasSettings.getName();
    if(objName != null)
    System.out.println("Name = " + objName);
    String objDescription = clasSettings.getDescription();
    if(objDescription != null)
    System.out.println("Description = " + objDescription);
    java.util.Date creationDate = clasSettings.getCreationDate();
    String creator = clasSettings.getCreatorInfo();
    String targetAttrName = clasSettings.getTargetAttributeName();
    System.out.println("Target attribute name = " + targetAttrName);
    AlgorithmSettings algoSettings = clasSettings.getAlgorithmSettings();
    if(algoSettings == null)
    System.out.println("Failure: clasSettings.getAlgorithmSettings() returns null");
    MiningAlgorithm algo = algoSettings.getMiningAlgorithm();
    if(algo == null) System.out.println("Failure: algoSettings.getMiningAlgorithm() returns null");
    System.out.println("Algorithm Name: " + algo.name());
    MiningFunction function = clasSettings.getMiningFunction();
    if(function == null) System.out.println("Failure: clasSettings.getMiningFunction() returns null");
    System.out.println("Function Name: " + function.name());
    try {
    String costMatrixName = clasSettings.getCostMatrixName();
    if(costMatrixName != null) {
    System.out.println("Cost Matrix Details from the " + costMatrixName
    + " table:");
    displayTable(costMatrixName, "", "order by ACTUAL_TARGET_VALUE, PREDICTED_TARGET_VALUE");
    } catch(Exception jdmExp)
    System.out.println("Failure: clasSettings.getCostMatrixName()throws exception");
    jdmExp.printStackTrace();
    // List of DT algorithm settings
    // treeAlgo.setBuildHomogeneityMetric(TreeHomogeneityMetric.gini);
    // treeAlgo.setMaxDepth(7);
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(0.1, SizeUnit.percentage);
    // treeAlgo.setMinNodeSize( 0.05, SizeUnit.percentage );
    // treeAlgo.setMinNodeSize( 10, SizeUnit.count );
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(20, SizeUnit.count);
    TreeHomogeneityMetric homogeneityMetric = ((OraTreeSettings)algoSettings).getBuildHomogeneityMetric();
    System.out.println("Homogeneity Metric: " + homogeneityMetric.name());
    int intValue = ((OraTreeSettings)algoSettings).getMaxDepth();
    System.out.println("Max Depth: " + intValue);
    double doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSizeForSplit(SizeUnit.percentage);
    System.out.println("MinNodeSizeForSplit (percentage): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSizeForSplit(SizeUnit.count);
    System.out.println("MinNodeSizeForSplit (count): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize();
    SizeUnit unit = ((OraTreeSettings)algoSettings).getMinNodeSizeUnit();
    System.out.println("Min Node Size (" + unit.name() +"): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize( SizeUnit.count );
    System.out.println("Min Node Size (" + SizeUnit.count.name() +"): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize( SizeUnit.percentage );
    System.out.println("Min Node Size (" + SizeUnit.percentage.name() +"): " + m_df.format(doubleValue));
    * This method displays the DT model signature.
    * @param model model object
    * @exception JDMException if failed to retrieve model signature
    public static void displayModelSignature(Model model) throws JDMException
    String modelName = model.getName();
    System.out.println("Model Name: " + modelName);
    ModelSignature modelSignature = model.getSignature();
    System.out.println("ModelSignature Deatils: ( Attribute Name, Attribute Type )");
    MessageFormat mfSign = new MessageFormat(" ( {0}, {1} )");
    String[] vals = new String[3];
    Collection sortedSet = modelSignature.getAttributes();
    Iterator attrIterator = sortedSet.iterator();
    while(attrIterator.hasNext())
    SignatureAttribute attr = (SignatureAttribute)attrIterator.next();
    vals[0] = attr.getName();
    vals[1] = attr.getDataType().name();
    System.out.println( mfSign.format(vals) );
    * This method displays the DT model details.
    * @param treeModelDetails tree model details object
    * @exception JDMException if failed to retrieve model details
    public static void displayTreeModelDetailsExtensions(TreeModelDetail treeModelDetails)
    throws JDMException
    System.out.println( "\nTreeModelDetail: Model name=" + "treeModel_jdm" );
    TreeNode root = treeModelDetails.getRootNode();
    System.out.println( "\nRoot node: " + root.getIdentifier() );
    // get the info for the tree model
    int treeDepth = ((OraTreeModelDetail) treeModelDetails).getTreeDepth();
    System.out.println( "Tree depth: " + treeDepth );
    int totalNodes = ((OraTreeModelDetail) treeModelDetails).getNumberOfNodes();
    System.out.println( "Total number of nodes: " + totalNodes );
    int totalLeaves = ((OraTreeModelDetail) treeModelDetails).getNumberOfLeafNodes();
    System.out.println( "Total number of leaf nodes: " + totalLeaves );
    Stack nodeStack = new Stack();
    nodeStack.push( root);
    while( !nodeStack.empty() )
    TreeNode node = (TreeNode) nodeStack.pop();
    // display this node
    int nodeId = node.getIdentifier();
    long caseCount = node.getCaseCount();
    Object prediction = node.getPrediction();
    int level = node.getLevel();
    int children = node.getNumberOfChildren();
    TreeNode parent = node.getParent();
    System.out.println( "\nNode id=" + nodeId + " at level " + level );
    if( parent != null )
    System.out.println( "parent: " + parent.getIdentifier() +
    ", children=" + children );
    System.out.println( "Case count: " + caseCount + ", prediction: " + prediction );
    Predicate predicate = node.getPredicate();
    System.out.println( "Predicate: " + predicate.toString() );
    Predicate[] surrogates = node.getSurrogates();
    if( surrogates != null )
    for( int i=0; i<surrogates.length; i++ )
    System.out.println( "Surrogate[" + i + "]: " + surrogates[i] );
    // add child nodes in the stack
    if( children > 0 )
    TreeNode[] childNodes = node.getChildren();
    for( int i=0; i<childNodes.length; i++ )
    nodeStack.push( childNodes[i] );
    TreeNode[] allNodes = treeModelDetails.getNodes();
    System.out.print( "\nNode identifiers by getNodes():" );
    for( int i=0; i<allNodes.length; i++ )
    System.out.print( " " + allNodes.getIdentifier() );
    System.out.println();
    // display the node identifiers
    int[] nodeIds = treeModelDetails.getNodeIdentifiers();
    System.out.print( "Node identifiers by getNodeIdentifiers():" );
    for( int i=0; i<nodeIds.length; i++ )
    System.out.print( " " + nodeIds[i] );
    System.out.println();
    TreeNode node = treeModelDetails.getNode(nodeIds.length-1);
    System.out.println( "Node identifier by getNode(" + (nodeIds.length-1) +
    "): " + node.getIdentifier() );
    Rule rule2 = treeModelDetails.getRule(nodeIds.length-1);
    System.out.println( "Rule identifier by getRule(" + (nodeIds.length-1) +
    "): " + rule2.getRuleIdentifier() );
    // get the rules and display them
    Collection ruleColl = treeModelDetails.getRules();
    Iterator ruleIterator = ruleColl.iterator();
    while( ruleIterator.hasNext() )
    Rule rule = (Rule) ruleIterator.next();
    int ruleId = rule.getRuleIdentifier();
    Predicate antecedent = (Predicate) rule.getAntecedent();
    Predicate consequent = (Predicate) rule.getConsequent();
    System.out.println( "\nRULE " + ruleId + ": support=" +
    rule.getSupport() + " (abs=" + rule.getAbsoluteSupport() +
    "), confidence=" + rule.getConfidence() );
    System.out.println( antecedent );
    System.out.println( "=======>" );
    System.out.println( consequent );
  /**
   * Display classification test metrics object
   * @param testMetrics classification test metrics object
   * @exception JDMException if failed to retrieve test metric details
   */
  public static void displayTestMetricDetails(
    ClassificationTestMetrics testMetrics) throws JDMException
  {
    // Retrieve Oracle model test metrics details extensions
    // Test Metrics Name
    System.out.println("Test Metrics Name = " + testMetrics.getName());
    // Model Name
    System.out.println("Model Name = " + testMetrics.getModelName());
    // Test Data Name
    System.out.println("Test Data Name = " + testMetrics.getTestDataName());
    // Accuracy
    System.out.println("Accuracy = " + m_df.format(testMetrics.getAccuracy().doubleValue()));
    // Confusion Matrix
    ConfusionMatrix confusionMatrix = testMetrics.getConfusionMatrix();
    Collection categories = confusionMatrix.getCategories();
    Iterator xIterator = categories.iterator();
    System.out.println("Confusion Matrix: Accuracy = " + m_df.format(confusionMatrix.getAccuracy()));
    System.out.println("Confusion Matrix: Error = " + m_df.format(confusionMatrix.getError()));
    System.out.println("Confusion Matrix: ( Actual, Prediction, Value )");
    MessageFormat mf = new MessageFormat(" ( {0}, {1}, {2} )");
    String[] vals = new String[3];
    while(xIterator.hasNext()) {
      Object actual = xIterator.next();
      vals[0] = actual.toString();
      Iterator yIterator = categories.iterator();
      while(yIterator.hasNext()) {
        Object predicted = yIterator.next();
        vals[1] = predicted.toString();
        long number = confusionMatrix.getNumberOfPredictions(actual, predicted);
        vals[2] = Long.toString(number);
        System.out.println(mf.format(vals));
      }
    }
    // Lift
    Lift lift = testMetrics.getLift();
    System.out.println("Lift Details:");
    System.out.println("Lift: Target Attribute Name = " + lift.getTargetAttributeName());
    System.out.println("Lift: Positive Target Value = " + lift.getPositiveTargetValue());
    System.out.println("Lift: Total Cases = " + lift.getTotalCases());
    System.out.println("Lift: Total Positive Cases = " + lift.getTotalPositiveCases());
    int numberOfQuantiles = lift.getNumberOfQuantiles();
    System.out.println("Lift: Number Of Quantiles = " + numberOfQuantiles);
    System.out.println("Lift: ( QUANTILE_NUMBER, QUANTILE_TOTAL_COUNT, QUANTILE_TARGET_COUNT, PERCENTAGE_RECORDS_CUMULATIVE, CUMULATIVE_LIFT, CUMULATIVE_TARGET_DENSITY, TARGETS_CUMULATIVE, NON_TARGETS_CUMULATIVE, LIFT_QUANTILE, TARGET_DENSITY )");
    MessageFormat mfLift = new MessageFormat(" ( {0}, {1}, {2}, {3}, {4}, {5}, {6}, {7}, {8}, {9} )");
    String[] liftVals = new String[10];
    for(int iQuantile=1; iQuantile <= numberOfQuantiles; iQuantile++) {
      liftVals[0] = Integer.toString(iQuantile);                                              // QUANTILE_NUMBER
      liftVals[1] = Long.toString(lift.getCases((iQuantile-1), iQuantile));                   // QUANTILE_TOTAL_COUNT
      liftVals[2] = Long.toString(lift.getNumberOfPositiveCases((iQuantile-1), iQuantile));   // QUANTILE_TARGET_COUNT
      liftVals[3] = m_df.format(lift.getCumulativePercentageSize(iQuantile).doubleValue());   // PERCENTAGE_RECORDS_CUMULATIVE
      liftVals[4] = m_df.format(lift.getCumulativeLift(iQuantile).doubleValue());             // CUMULATIVE_LIFT
      liftVals[5] = m_df.format(lift.getCumulativeTargetDensity(iQuantile).doubleValue());    // CUMULATIVE_TARGET_DENSITY
      liftVals[6] = Long.toString(lift.getCumulativePositiveCases(iQuantile));                // TARGETS_CUMULATIVE
      liftVals[7] = Long.toString(lift.getCumulativeNegativeCases(iQuantile));                // NON_TARGETS_CUMULATIVE
      liftVals[8] = m_df.format(lift.getLift(iQuantile, iQuantile).doubleValue());            // LIFT_QUANTILE
      liftVals[9] = m_df.format(lift.getTargetDensity(iQuantile, iQuantile).doubleValue());   // TARGET_DENSITY
      System.out.println(mfLift.format(liftVals));
    }
    // ROC
    ReceiverOperatingCharacterics roc = testMetrics.getROC();
    System.out.println("ROC Details:");
    System.out.println("ROC: Area Under Curve = " + m_df.format(roc.getAreaUnderCurve()));
    int nROCThresh = roc.getNumberOfThresholdCandidates();
    System.out.println("ROC: Number Of Threshold Candidates = " + nROCThresh);
    System.out.println("ROC: ( INDEX, PROBABILITY, TRUE_POSITIVES, FALSE_NEGATIVES, FALSE_POSITIVES, TRUE_NEGATIVES, TRUE_POSITIVE_FRACTION, FALSE_POSITIVE_FRACTION )");
    MessageFormat mfROC = new MessageFormat(" ( {0}, {1}, {2}, {3}, {4}, {5}, {6}, {7} )");
    String[] rocVals = new String[8];
    for(int iROC=1; iROC <= nROCThresh; iROC++) {
      rocVals[0] = Integer.toString(iROC);                          // INDEX
      rocVals[1] = m_df.format(roc.getProbabilityThreshold(iROC));  // PROBABILITY
      rocVals[2] = Long.toString(roc.getPositives(iROC, true));     // TRUE_POSITIVES
      rocVals[3] = Long.toString(roc.getNegatives(iROC, false));    // FALSE_NEGATIVES
      rocVals[4] = Long.toString(roc.getPositives(iROC, false));    // FALSE_POSITIVES
      rocVals[5] = Long.toString(roc.getNegatives(iROC, true));     // TRUE_NEGATIVES
      rocVals[6] = m_df.format(roc.getHitRate(iROC));               // TRUE_POSITIVE_FRACTION
      rocVals[7] = m_df.format(roc.getFalseAlarmRate(iROC));        // FALSE_POSITIVE_FRACTION
      System.out.println(mfROC.format(rocVals));
    }
  }
  private static void displayTable(String tableName, String whereCause, String orderByColumn)
  {
    StringBuffer emptyCol = new StringBuffer(" ");
    java.sql.Connection dbConn =
      ((OraConnection) m_dmeConn).getDatabaseConnection();
    PreparedStatement pStmt = null;
    ResultSet rs = null;
    try {
      pStmt = dbConn.prepareStatement("SELECT * FROM " + tableName + " " + whereCause + " " + orderByColumn);
      rs = pStmt.executeQuery();
      ResultSetMetaData rsMeta = rs.getMetaData();
      int colCount = rsMeta.getColumnCount();
      StringBuffer header = new StringBuffer();
      System.out.println("Table : " + tableName);
      // Build table header
      for(int iCol=1; iCol<=colCount; iCol++) {
        String colName = rsMeta.getColumnName(iCol);
        header.append(emptyCol.replace(0, colName.length(), colName));
        emptyCol = new StringBuffer(" ");
      }
      System.out.println(header.toString());
      // Write table data
      while(rs.next()) {
        StringBuffer rowContent = new StringBuffer();
        for(int iCol=1; iCol<=colCount; iCol++) {
          int sqlType = rsMeta.getColumnType(iCol);
          Object obj = rs.getObject(iCol);
          String colContent = null;
          if(obj instanceof java.lang.Number) {
            try {
              BigDecimal bd = (BigDecimal)obj;
              if(bd.scale() > 5) {
                colContent = m_df.format(obj);
              } else {
                colContent = bd.toString();
              }
            } catch(Exception anyExp) {
              colContent = m_df.format(obj);
            }
          } else {
            if(obj == null)
              colContent = "NULL";
            else
              colContent = obj.toString();
          }
          rowContent.append(" " + emptyCol.replace(0, colContent.length(), colContent));
          emptyCol = new StringBuffer(" ");
        }
        System.out.println(rowContent.toString());
      }
    } catch(Exception anySqlExp) {
      anySqlExp.printStackTrace();
    } //Ignore
  }
  private static void createTableForTestMetrics(String applyOutputTableName,
                                                String testDataName,
                                                String testMetricsInputTableName)
  {
    //0. need to execute the following in the schema
    String sqlCreate =
      "create table " + testMetricsInputTableName + " as " +
      "select a.id as id, prediction, probability, affinity_card " +
      "from " + testDataName + " a, " + applyOutputTableName + " b " +
      "where a.id = b.id";
    java.sql.Connection dbConn = ((OraConnection) m_dmeConn).getDatabaseConnection();
    Statement stmt = null;
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate( sqlCreate );
    }
    catch( Exception anySqlExp ) {
      System.out.println( anySqlExp.getMessage() );
      anySqlExp.printStackTrace();
    }
    finally {
      try {
        stmt.close();
      }
      catch( SQLException sqlExp ) {}
    }
  }
  private static void clean()
  {
    java.sql.Connection dbConn =
      ((OraConnection) m_dmeConn).getDatabaseConnection();
    Statement stmt = null;
    // Drop apply output tables
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT1_JDM");
    } catch(Exception anySqlExp) {} //Ignore
    finally {
      try {
        stmt.close();
      } catch( SQLException sqlExp ) {}
    }
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT2_JDM");
    } catch(Exception anySqlExp) {} //Ignore
    finally {
      try {
        stmt.close();
      } catch( SQLException sqlExp ) {}
    }
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT3_JDM");
    } catch(Exception anySqlExp) {} //Ignore
    finally {
      try {
        stmt.close();
      } catch( SQLException sqlExp ) {}
    }
    // Drop apply output tables created for the test metrics task
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate("DROP TABLE DT_TEST_APPLY_OUTPUT_COST_JDM");
    } catch(Exception anySqlExp) {} //Ignore
    finally {
      try {
        stmt.close();
      } catch( SQLException sqlExp ) {}
    }
    try {
      stmt = dbConn.createStatement();
      stmt.executeUpdate("DROP TABLE DT_TEST_APPLY_OUTPUT_JDM");
    } catch(Exception anySqlExp) {} //Ignore
    finally {
      try {
        stmt.close();
      } catch( SQLException sqlExp ) {}
    }
    // Drop the model
    try {
      m_dmeConn.removeObject( "treeModel_jdm", NamedObject.model );
    } catch(Exception jdmExp) {}
    // Drop test metrics results created by TestMetricsTask
    try {
      m_dmeConn.removeObject( "dtTestMetricsWithCost_jdm", NamedObject.testMetrics );
    } catch(Exception jdmExp) {}
    try {
      m_dmeConn.removeObject( "dtTestMetrics_jdm", NamedObject.testMetrics );
    } catch(Exception jdmExp) {}
  }

    Hi
    I am not sure whether this will help, but someone else was getting an error with a java.sql.SQLException: Unsupported feature. Here is a link to the fix: http://saloon.javaranch.com/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic&f=3&t=007947
    Best wishes
    Michael

  • Java.lang.NullPointerException + Web Clipping Portlet

    Hi,
    I have created one external application connection and one Oracle PDK producer connection referring to that external application. Then I dragged the web clipping portlet into a WebCenter application.
    After running the jspx file, it raises a 'Portlet Consume Error'.
    In JDeveloper the error log is as follows:
    <ServletLogger> <severe> ERROR: Unhandled exception in SOAP call
    java.lang.NullPointerException
         at oracle.portal.wcs.session.http.HttpClientTransportSessionContext.login(HttpClientTransportSessionContext.java:308)
         at oracle.portal.wcs.common.WcExternalPrincipal.login(WcExternalPrincipal.java:322)
         at oracle.portal.wcs.provider.ProviderUserTransportSessionContextManager.handleExternalPrincipal(ProviderUserTransportSessionContextManager.java:91)
         at oracle.portal.wcs.provider.WcProviderInstance.initSession(WcProviderInstance.java:183)
         at oracle.webdb.provider.v2.adapter.soapV1.ProviderAdapter.initSession(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at oracle.webdb.provider.v2.utils.soap.SOAPProcessor.doMethodCall(Unknown Source)
         at oracle.webdb.provider.v2.utils.soap.SOAPProcessor.handleRequest(Unknown Source)
         at oracle.webdb.provider.v2.adapter.SOAPServlet.doSOAPCall(Unknown Source)
         at oracle.webdb.provider.v2.adapter.SOAPServlet.service(Unknown Source)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
         at java.security.AccessController.doPrivileged(Native Method)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
         at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:176)
    Please help. It is very urgent!

    Hi
    Are you using NWDS (CE 7.1)? Then you have to use this sort of code to bind it:
    Model1 model1Model = new Model1();
    Bapi_Flight_Getlist_Input bapi_Flight_Getlist_Input = new Bapi_Flight_Getlist_Input(model1Model);
    Bapi_Flight_Getlist_Output output = new Bapi_Flight_Getlist_Output(model1Model);
    java.util.List<Bapisfldat> flight_List = new java.util.ArrayList<Bapisfldat>();
    output.setFlight_List(flight_List);
    wdContext.nodeBapi_Flight_Getlist_Input().bind(bapi_Flight_Getlist_Input);
    If you are using NWDS 7.0, the binding is different; use the sample code below:
    Bapi_Flight_Getlist_Input input = new Bapi_Flight_Getlist_Input();
    input.setDestination_From(new Bapisfldst());
    input.setDestination_To(new Bapisfldst());
    wdContext.nodeBapi_Flight_Getlist_Input().bind(input);
    PS: Also check the cardinality: it should be 1..n, not 0..n.
    Thanks

  • What is the right way to display a table in Java Web Dynpro using a node?

    Hi experts,
      I am trying to show a node of cardinality 0..n as a table in an Adobe form in Java Web Dynpro, but it is not showing properly. Can anybody please tell me the right way to display a table on an Adobe form using a node of cardinality 0..n or 1..n in Java Web Dynpro? In ABAP Web Dynpro, we can drag and drop a node of cardinality 0..n or 1..n to show it as a table and it works fine. Is the same possible in Java Web Dynpro also? Please help.
    Thanks and Regards.
    Vaibhav Tiwari.

    Please refer to my post, you will get the answer:
    Dynamic Table - same data repeating in all rows
    Special care should be taken in designing the context for the table attribute.
    The singleton property of the nodes also plays an important role. I had this doubt from the beginning when you reported this problem the first time, but you marked it as solved, so I thought there might be some other issue; when you reported it again I did some analysis.
    Now coming to the final solution:
    For designing a table in an Adobe interactive form you have to consider the following.
    You have to design the view context over four levels; these are the properties:
    PDFDataSource (Parent Level 1) - Cardinality 1..1 - Singleton true - this is assigned to the dataSource
    TableList (Parent Level 2) - Cardinality 1..1 - Singleton true
    TableWrapper (Parent Level 3) - Cardinality 0..n - Singleton true
    TableData (Parent Level 4) - Cardinality 0..1 - Singleton false (this is the main point)
    Then, under the TableData value node, you have to put all your table attributes.
    The value node names can be anything, but the hierarchy should be the same as I have mentioned above; a minimal code sketch follows below.
    Please try out these steps and get back to me if you have any doubt.
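    For reference, here is a minimal sketch of how such a nested context could be filled in the view controller. The view name placeholder and the attribute name Attribute1 are hypothetical, and the generated accessor names assume the four-level hierarchy above was declared exactly as described; treat it as an illustration of the pattern, not the exact generated API of your project.
    // assumes context: PDFDataSource(1..1) > TableList(1..1) > TableWrapper(0..n) > TableData(0..1)
    IPrivate<Put View Name>.ITableListElement listElm = wdContext.createTableListElement();
    wdContext.nodeTableList().bind(listElm);   // ensure the 1..1 list element exists
    String[] rows = { "row 1", "row 2", "row 3" };   // hypothetical row data
    for (int i = 0; i < rows.length; i++) {
      // one TableWrapper element per table row (singleton node, so reachable from wdContext)
      IPrivate<Put View Name>.ITableWrapperElement wrapElm = wdContext.createTableWrapperElement();
      wdContext.nodeTableWrapper().addElement(wrapElm);
      // the non-singleton TableData node is reached through its parent element
      IPrivate<Put View Name>.ITableDataElement dataElm = wdContext.createTableDataElement();
      dataElm.setAttribute1(rows[i]);          // hypothetical table attribute
      wrapElm.nodeTableData().addElement(dataElm);
    }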

  • Submit Button in Interactive form is not Triggering (Webdynpro Java)

    Hi,
    I have developed one application for an online interactive form. I have created one text box and one submit button in the interactive form. I created an action method Submit and bound the onActionSubmit() method to the onSubmit event of the interactive form.
    For testing, I put some value in the text box and click the Submit button. I have set the debugger in the onActionSubmit() method, but when I click the Submit button, onActionSubmit() is not triggered.
    1. I have used the Submit to SAP button (Palette > Library > Web Dynpro).
    2. I have the ACF (Active Control Framework) installation.
    3. I have WAS 6.4 and NWDS 04.
    4. I have SAPForms.api in the Program Files\Adobe\Acrobat 7.0\Reader\plug_ins path.
    One value node DataSource (cardinality 1..1) is bound to the dataSource property, and the pdfSource property is bound to one value attribute of binary type.
    I have done one application where I am getting data from the R/3 system and displaying it in the PDF; it is working fine.
    I got the error below in the log file for both applications. Getting the data from R/3 and displaying it in the PDF still works fine even though it gives the error below in the log:
    ClientJTSInterceptor.receive_reply
    [EXCEPTION]
    org.omg.CORBA.BAD_PARAM: Not found ServiceContext with id=0  vmcid: OMG  minor code: 1A completed: Maybe
         at com.sap.engine.services.iiop.internal.giop.ClientRequest.get_reply_service_context(ClientRequest.java:284)
         at com.sap.engine.services.ts.jts.ots.PortableInterceptor.ClientJTSInterceptor.receive_reply(ClientJTSInterceptor.java:91)
         at com.sap.engine.services.iiop.internal.giop.ClientRequest.dealReceiveReply(ClientRequest.java:133)
         at com.sap.engine.services.iiop.internal.giop.ClientRequest.dealReceiveReply(ClientRequest.java:125)
         at com.sap.engine.services.iiop.server.portable.Delegate.invoke(Delegate.java:282)
         at org.omg.CORBA.portable.ObjectImpl._invoke(ObjectImpl.java:486)
         at com.adobe.service._ControlAgentStub.done(_ControlAgentStub.java:83)
         at com.adobe.service.ProcessResource.onCommit(ProcessResource.java:609)
         at com.adobe.service.ResourcePeer.invokeCommit(ResourcePeer.java:130)
         at com.adobe.service.J2EEResourcePeerImpl.commit(J2EEResourcePeerImpl.java:124)
         at com.sap.engine.services.ts.jta.impl.ResourceList.commitTwoPhase(ResourceList.java:80)
         at com.sap.engine.services.ts.jta.impl.TransactionImpl.commit(TransactionImpl.java:355)
         at com.adobe.AdobeDocumentServicesLocalLocalObjectImpl0.rpData(AdobeDocumentServicesLocalLocalObjectImpl0.java:174)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:85)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:58)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:60)
         at java.lang.reflect.Method.invoke(Method.java:391)
         at com.sap.engine.services.webservices.runtime.EJBImplementationContainer.invokeMethod(EJBImplementationContainer.java:126)
         at com.sap.engine.services.webservices.runtime.RuntimeProcessor.process(RuntimeProcessor.java:157)
         at com.sap.engine.services.webservices.runtime.RuntimeProcessor.process(RuntimeProcessor.java:79)
         at com.sap.engine.services.webservices.runtime.servlet.ServletDispatcherImpl.doPost(ServletDispatcherImpl.java:92)
         at SoapServlet.doPost(SoapServlet.java:51)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:390)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:264)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:347)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:325)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:887)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:241)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(AccessController.java:214)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    Please Suggest.
    Thanks & Regards
    muna

    Hi Muna,
    Maybe ADS is not configured properly on the server.
    Check the link below. I hope it will help you.
    Re: no action got executed corr to SubmitToSap button of Interactive forms
    Thanks
    Siva Arikatla

  • Null Pointer Exception Errors in Web Dynpro Java

    Hi All,
    What types of Null Pointer Exception errors occur in Web Dynpro Java? Please provide the types of errors.
    Ex.: cardinality type not correct, etc.
    Thanks,
    Bye,
    Vijay Hari.

    Hi
      A Null Pointer Exception can occur in many instances, for example:
      1) when you create a value node with some attributes and a cardinality, and you have not initialized the node, it will throw a Null Pointer Exception when you access it (see the sketch below);
      2) when you integrate an RFC and the parameters you pass as input to the RFC are not set correctly, there can be a null pointer when you execute the RFC;
      3) when you do not bind the node when using a web service, there can also be a null pointer exception.
    And there can be many more occurrences of the exception.
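    To illustrate case 1, here is a minimal sketch. The node name Employee and its accessors are hypothetical generated names (the same pattern as in the JDBC example further down this page); the point is only that a node with cardinality 0..n (or 0..1) must have an element created and added before its current element is dereferenced.
    // NPE: the node has no elements yet, so currentEmployeeElement() returns null
    // String name = wdContext.currentEmployeeElement().getFirstName();
    // safe: create and add an element first, then access it
    IPrivate<Put View Name>.IEmployeeElement empElm = wdContext.createEmployeeElement();
    empElm.setFirstName("initial value");          // hypothetical attribute
    wdContext.nodeEmployee().addElement(empElm);
    String name = wdContext.currentEmployeeElement().getFirstName();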

  • Dynamic Table in PDF - only first row passed to the WD Java

    Hi Experts,
    I'm working with Web Dynpro for Java on WAS 2004s SP13, ADS for SP13 and LiveCycle Designer 7.1.
    I am facing a problem related to dynamic table generation in a PDF.
    I am creating a PDF form with a dynamic table: an empty row is added when the ADD button is clicked, and the row is deleted when the DELETE button is clicked. After submitting the form, only the first row of the table is passed to Web Dynpro. I've tried different dataSource context node structures without results. The structure described in the thread 'Dynamic Table - same data repeating in all rows' doesn't work for me. The same happens if I try to follow the advice from the wiki https://wiki.sdn.sap.com/wiki/display/WDJava/Creating%20Table%20in%20Interacting%20form%20using%20Web%20Dynpro.
    Besides this, my dropdown list in the table column is not populated. I know how to populate a dropdown list outside of the table; that works fine. But the dropdown in the table simply does not respond to the click (it does not open). I'm pretty sure this is the result of a context node structure/binding issue.
    Please suggest how I can implement the dynamic table and populate the data in the table's dropdown column.

    Hi Prabhakar,
    You describe exactly my situation. The node which is bound to the table row has cardinality 1..n. The exact context structure is:
    node dataSource (cardinality 1..1/ Singleton true) ======> dataSource of the Interactive Form
    subnode TableList (cardinality 1..1/ Singleton true) ======> bound to the table in the Interactive Form
    subnode TableWrapper (cardinality 1..n/ Singleton true) ======> bound to the table row in the Interactive Form
    subnode TableData (cardinality 0..1/ Singleton false) ======> table data
    attribute 1  ====>     context node attribute bound to the table row field
    attribute 2
    This structure is recommended in the post that I found on the forum (see the first hyperlink in my first post).
    Is this structure correct? Why is it not working?
    Your link is not working. Can you post the correct one?
    Thanks in advance.
    Regards
    Adnan

  • Binding and displaying data to table in webdynpro java

    hi all,
    I want to know how to bind output values to a table in Web Dynpro Java.
    I know how to bind values in the context to a table, but how do I display values from the database (back end) in a table present in the view?
    E.g.: I have a table in a view and want to get values from the back end (SQL Server) and display them in that table.
    Please help me with this.
    Thanks
    sirisha

    Hi Saisirisha,
    Try this.
    1> Take a value node (Employee) with cardinality 0..n.
    Employee structure:
    Employee
    |------Surname (Attribute)
    |------FirstName (Attribute)
    |------Category (Attribute)
    2> Bind this value node (Employee) to the dataSource of the table in the view.
    3> Try the code below inside the wdDoInit method.
    try {
         // Load the JDBC-ODBC bridge
         Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
         // specify the ODBC data source's URL
         String url = "jdbc:odbc:SSPer";
         // connect
         Connection con = DriverManager.getConnection(url, "North", "Ken");
         // create and execute a SELECT
         Statement stmt = con.createStatement();
         ResultSet rs = stmt.executeQuery
             ("SELECT Surname,FirstName,Category FROM Per");
         while (rs.next()) {
              // get current row values
              String Surname = rs.getString(1);
              String FirstName = rs.getString(2);
              int Category = rs.getInt(3);
              // create a table row element and add it to the node bound to the table
              IPrivate<Put View Name>.IEmployeeElement empElm = wdContext.createEmployeeElement();
              empElm.setSurname(Surname);
              empElm.setFirstName(FirstName);
              empElm.setCategory(Category);
              wdContext.nodeEmployee().addElement(empElm);
         }
         // close statement and connection
         stmt.close();
         con.close();
    } catch (java.lang.Exception ex) {
         wdComponentAPI.getMessageManager().reportException("Exception : " + ex.getMessage(), true);
    }
    http://www.developer.com/java/data/article.php/3417381
    Regards,
    Mithu

  • Unable to add value to model node with cardinality 0..n

    Hi All,
       I'm working with Web Dynpro Java and have an issue. I have an input field named customer and a dropdown box named location. When I give customer and location as inputs, a WSDL (named Equipment WSDL) is called; the result is a dropdown list containing equipment IDs.
    I took a custom value node location with cardinality 0..n and gave it static values as input. I need to set the location values into the model node location. When I give a single value to the custom node location, that static value is not accepted by the model node location (in the WSDL), whose cardinality is 0..n.
    Please suggest a solution for this issue.
    With Regards,
    Ushasri.

    Hi Ushasri,
    What do you mean by 'static value is not accepted by model node location'?
    How have you done the mapping?
    Send me the hierarchy of the RFC and your value node.
    With Regards
    Naidu
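    For reference, a minimal sketch of the usual way values are pushed into a model node of cardinality 0..n before the service is executed. The class Location, its setter, and the node name are hypothetical and depend on the imported WSDL model; only the pattern (build model objects, collect them, bind the collection to the model node) is the general one.
    // hypothetical model class generated from the Equipment WSDL
    java.util.List locList = new java.util.ArrayList();
    Location loc = new Location();
    loc.setLocationId("1000");   // hypothetical attribute name and static value
    locList.add(loc);
    // bind the collection of model objects to the 0..n model node before executing the model
    wdContext.nodeLocation().bind(locList);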
