ROC - Probability Threshold

Hi, gurus! :)
I'm having quite a hard time interpreting the test metrics. For ROC, how should I interpret the 'probability threshold'? Where does it come from, and how is it computed? I'm also confused by the term 'area under the curve': for any probability distribution, the area is always equal to 1. Kindly enlighten me on this.
Thank you very much.
Regards,
Manoy

Hi,
You should base your threshold selection on the original model's ROC result. Don't change the probability threshold and then run the ROC test again on that to re-adjust. If you want to explore using ROC or other options to tune the behavior of a model, try the "tune" option in Data Miner. It allows you to select a biasing technique, such as ROC, and push a cost matrix into an existing model. The tune option is displayed in the Property Inspector for the Classification Build node under the Models tab. Select the model you want to tune and click the "tune" button in the grid toolbar. It is also important to recognize that you have build-time options as well. See the Performance Setting for the model in the model build specifications (advanced dialog). The default is "balanced", but you also have "natural" and "custom".
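As an aside on what "push a cost matrix into an existing model" amounts to: for a binary target, choosing a probability threshold t is mathematically equivalent to minimum-expected-cost scoring with a false-negative/false-positive cost ratio of (1 - t)/t. The sketch below is a standalone illustration of that standard equivalence, not ODM API code; the class and method names are made up.

```java
// Standalone illustration (not the ODM API): the cost-matrix equivalent
// of a probability threshold for a binary target.
public class ThresholdToCost {
    // A threshold t on P(positive) is equivalent to minimum-expected-cost
    // scoring with falseNegativeCost/falsePositiveCost = (1 - t)/t.
    public static double falseNegativeCost(double threshold) {
        return (1.0 - threshold) / threshold; // false-positive cost fixed at 1
    }

    // Minimum-expected-cost decision rule: predict positive when the expected
    // cost of predicting negative (a possible false negative) is at least the
    // expected cost of predicting positive (a possible false positive).
    public static boolean predictPositive(double pPositive, double fnCost) {
        return pPositive * fnCost >= (1.0 - pPositive) * 1.0;
    }

    public static void main(String[] args) {
        double t = 0.25;                      // e.g. promote when P(BUY) >= 25%
        double fnCost = falseNegativeCost(t); // = 3.0
        System.out.println("FN cost for t=0.25: " + fnCost);
        System.out.println(predictPositive(0.30, fnCost)); // true:  0.30 >= t
        System.out.println(predictPositive(0.20, fnCost)); // false: 0.20 <  t
    }
}
```

This is why a tool can "use a cost matrix based on the selected threshold point rather than the threshold point directly": the two produce the same decisions.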
Thanks, Mark

Similar Messages

  • Oracle Data Miner ROC Chart

    Is there anyone who can explain some things about the ROC chart for me?
    How is what is shown in the ROC chart related to the confusion matrix next to it in Oracle Data Miner?
    How is this ROC chart constructed? How is it possible that it represents the decision tree model I made?
    I hope somebody can help me.

    Hi,
    This explanation comes from one of our algorithm engineers:
    "The ROC analysis applies to binary classification problems. One of the classes is selected as a "positive" one. The ROC chart plots the true positive rate as a function of the false positive rate. It is parametrized by the probability threshold values. The true positive rate represents the fraction of positive cases that were correctly classified by the model. The false positive rate represents the fraction of negative cases that were incorrectly classified as positive. Each point on the ROC plot represents a true_positive_rate/false_positive_rate pair corresponding to a particular probability threshold. Each point has a corresponding confusion matrix. The user can analyze the confusion matrices produced at different threshold levels and select a probability threshold to be used for scoring. The probability threshold choice is usually based on application requirements (i.e., acceptable level of false positives).
    The ROC does not represent a model. Instead it quantifies its discriminatory ability and assists the user in selecting an appropriate operating point for scoring."
    I would add to this that you can select a threshold point in the build activity to bias the apply process. Currently we generate a cost matrix based on the selected threshold point rather than use the threshold point directly.
    Thanks, Mark
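The per-threshold bookkeeping the engineer describes can be sketched in plain Java (a standalone illustration, not ODM API code; the class name and sample arrays are made up): each candidate threshold turns the scored cases into one confusion matrix, and its true-positive/false-positive rates give one point on the ROC curve.

```java
// Illustration of how one ROC point is derived from scores at a threshold.
public class RocSketch {
    // Returns {truePositiveRate, falsePositiveRate} for one threshold:
    // cases scored at or above the threshold are called positive.
    public static double[] rocPoint(double[] scores, int[] actual, double threshold) {
        int tp = 0, fp = 0, pos = 0, neg = 0;
        for (int i = 0; i < scores.length; i++) {
            boolean predictedPositive = scores[i] >= threshold;
            if (actual[i] == 1) { pos++; if (predictedPositive) tp++; }
            else                { neg++; if (predictedPositive) fp++; }
        }
        return new double[] { (double) tp / pos, (double) fp / neg };
    }

    public static void main(String[] args) {
        // Toy test set: model scores (P of positive class) and actual labels.
        double[] scores = {0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1};
        int[]    actual = {1,   1,   0,   1,   0,   1,   0,   0};
        for (double t : new double[] {0.25, 0.5, 0.75}) {
            double[] p = rocPoint(scores, actual, t);
            System.out.printf("threshold=%.2f  TPR=%.2f  FPR=%.2f%n", t, p[0], p[1]);
        }
    }
}
```

Sweeping the threshold from 1 down to 0 traces the curve from (0,0) to (1,1). The "area under the curve" is the area under this TPR-vs-FPR plot, a ranking-quality summary between 0 and 1, not the area under a probability density (which is indeed always 1).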

  • Dmtreedemo.java error

    Hi,
    I got an error running the following Java program, dmtreedemo.java; the code and the error are below.
    I have installed Oracle 10g R2 and am using JDK 1.4.2. I set the classpath for jdm.jar and ojdm_api.jar from the Oracle 10g R2 installation. The program compiles successfully, but at execution I get this error:
    F:\Mallari\DATA MINING demos\java\samples>java dmtreedemo localhost:1521:orcl scott tiger
    --- Build Model - using cost matrix ---
    javax.datamining.JDMException: Generic Error.
    at oracle.dmt.jdm.resource.OraExceptionHandler.createException(OraExceptionHandler.java:142)
    at oracle.dmt.jdm.resource.OraExceptionHandler.createException(OraExceptionHandler.java:91)
    at oracle.dmt.jdm.OraDMObject.createException(OraDMObject.java:111)
    at oracle.dmt.jdm.base.OraTask.saveObjectInDatabase(OraTask.java:204)
    at oracle.dmt.jdm.OraMiningObject.saveObjectInDatabase(OraMiningObject.java:164)
    at oracle.dmt.jdm.resource.OraPersistanceManagerImpl.saveObject(OraPersistanceManagerImpl.java:245)
    at oracle.dmt.jdm.resource.OraConnection.saveObject(OraConnection.java:383)
    at dmtreedemo.executeTask(dmtreedemo.java:622)
    at dmtreedemo.buildModel(dmtreedemo.java:304)
    at dmtreedemo.main(dmtreedemo.java:199)
    Caused by: java.sql.SQLException: Unsupported feature
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:269)
    at oracle.jdbc.dbaccess.DBError.throwUnsupportedFeatureSqlException(DBError.java:690)
    at oracle.jdbc.driver.OracleCallableStatement.setString(OracleCallableStatement.java:1337)
    at oracle.dmt.jdm.utils.OraSQLUtils.createCallableStatement(OraSQLUtils.java:126)
    at oracle.dmt.jdm.utils.OraSQLUtils.executeCallableStatement(OraSQLUtils.java:532)
    at oracle.dmt.jdm.scheduler.OraProgramJob.createJob(OraProgramJob.java:77)
    at oracle.dmt.jdm.scheduler.OraJob.saveJob(OraJob.java:107)
    at oracle.dmt.jdm.scheduler.OraProgramJob.saveJob(OraProgramJob.java:85)
    at oracle.dmt.jdm.scheduler.OraProgramJob.saveJob(OraProgramJob.java:290)
    at oracle.dmt.jdm.base.OraTask.saveObjectInDatabase(OraTask.java:199)
    ... 6 more
    Please help me out with this; I will be very thankful.
    ===========================================================
    the sample code is
    // Copyright (c) 2004, 2005, Oracle. All rights reserved.
    // File: dmtreedemo.java
    * This demo program describes how to use the Oracle Data Mining (ODM) Java API
    * to solve a classification problem using Decision Tree (DT) algorithm.
    * PROBLEM DEFINITION
    * How to predict whether a customer responds or not to the new affinity card
    * program using a classifier based on DT algorithm?
    * DATA DESCRIPTION
    * Data for this demo is composed from base tables in the Sales History (SH)
    * schema. The SH schema is an Oracle Database Sample Schema that has the customer
    * demographics, purchasing, and response details for the previous affinity card
    * programs. Data exploration and preparing the data is a common step before
    * doing data mining. Here in this demo, the following views are created in the user
    * schema using CUSTOMERS, COUNTRIES, and SUPPLEMENTARY_DEMOGRAPHICS tables.
    * MINING_DATA_BUILD_V:
    * This view collects the previous customers' demographics, purchasing, and affinity
    * card response details for building the model.
    * MINING_DATA_TEST_V:
    * This view collects the previous customers' demographics, purchasing, and affinity
    * card response details for testing the model.
    * MINING_DATA_APPLY_V:
    * This view collects the prospective customers' demographics and purchasing
    * details for predicting response for the new affinity card program.
    * DATA MINING PROCESS
    * Prepare Data:
    * 1. Missing Value treatment for predictors
    * See dmsvcdemo.java for a definition of missing values, and the steps to be
    * taken for missing value imputation. SVM interprets all NULL values for a
    * given attribute as "sparse". Sparse data is not ideal for decision
    * trees, but the algorithm will accept it nevertheless. The Decision Tree
    * implementation in ODM handles missing predictor values (by penalizing
    * predictors which have missing values) and missing target values (by simply
    * discarding records with missing target values). We skip missing value
    * treatment in this demo.
    * 2. Outlier/Clipping treatment for predictors
    * See dmsvcdemo.java for a discussion on outlier treatment. For decision
    * trees, outlier treatment is not really necessary. We skip outlier treatment
    * in this demo.
    * 3. Binning high cardinality data
    * No data preparation for the types we accept is necessary - even for high
    * cardinality predictors. Preprocessing to reduce the cardinality
    * (e.g., binning) can improve the performance of the build, but it could
    * penalize the accuracy of the resulting model.
    * The PrepareData() method in this demo program illustrates the preparation of the
    * build, test, and apply data. We skip PrepareData() since the decision tree
    * algorithm is very capable of handling data which has not been specially
    * prepared. For this demo, no data preparation will be performed.
    * Build Model:
    * Mining Model is the prime object in data mining. The buildModel() method
    * illustrates how to build a classification model using DT algorithm.
    * Test Model:
    * Classification model performance can be evaluated by computing test
    * metrics like accuracy, confusion matrix, lift and ROC. The testModel() or
    * computeTestMetrics() method illustrates how to perform a test operation to
    * compute various metrics.
    * Apply Model:
    * Predicting the target attribute values is the prime function of
    * classification models. The applyModel() method illustrates how to
    * predict the customer response for affinity card program.
    * EXECUTING DEMO PROGRAM
    * Refer to Oracle Data Mining Administrator's Guide
    * for guidelines for executing this demo program.
    // Generic Java api imports
    import java.math.BigDecimal;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.text.DecimalFormat;
    import java.text.MessageFormat;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.Stack;
    // Java Data Mining (JDM) standard api imports
    import javax.datamining.ExecutionHandle;
    import javax.datamining.ExecutionState;
    import javax.datamining.ExecutionStatus;
    import javax.datamining.JDMException;
    import javax.datamining.MiningAlgorithm;
    import javax.datamining.MiningFunction;
    import javax.datamining.NamedObject;
    import javax.datamining.SizeUnit;
    import javax.datamining.algorithm.tree.TreeHomogeneityMetric;
    import javax.datamining.algorithm.tree.TreeSettings;
    import javax.datamining.algorithm.tree.TreeSettingsFactory;
    import javax.datamining.base.AlgorithmSettings;
    import javax.datamining.base.Model;
    import javax.datamining.base.Task;
    import javax.datamining.data.AttributeDataType;
    import javax.datamining.data.CategoryProperty;
    import javax.datamining.data.CategorySet;
    import javax.datamining.data.CategorySetFactory;
    import javax.datamining.data.ModelSignature;
    import javax.datamining.data.PhysicalAttribute;
    import javax.datamining.data.PhysicalAttributeFactory;
    import javax.datamining.data.PhysicalAttributeRole;
    import javax.datamining.data.PhysicalDataSet;
    import javax.datamining.data.PhysicalDataSetFactory;
    import javax.datamining.data.SignatureAttribute;
    import javax.datamining.modeldetail.tree.TreeModelDetail;
    import javax.datamining.modeldetail.tree.TreeNode;
    import javax.datamining.resource.Connection;
    import javax.datamining.resource.ConnectionFactory;
    import javax.datamining.resource.ConnectionSpec;
    import javax.datamining.rule.Predicate;
    import javax.datamining.rule.Rule;
    import javax.datamining.supervised.classification.ClassificationApplySettings;
    import javax.datamining.supervised.classification.ClassificationApplySettingsFactory;
    import javax.datamining.supervised.classification.ClassificationModel;
    import javax.datamining.supervised.classification.ClassificationSettings;
    import javax.datamining.supervised.classification.ClassificationSettingsFactory;
    import javax.datamining.supervised.classification.ClassificationTestMetricOption;
    import javax.datamining.supervised.classification.ClassificationTestMetrics;
    import javax.datamining.supervised.classification.ClassificationTestMetricsTask;
    import javax.datamining.supervised.classification.ClassificationTestMetricsTaskFactory;
    import javax.datamining.supervised.classification.ClassificationTestTaskFactory;
    import javax.datamining.supervised.classification.ConfusionMatrix;
    import javax.datamining.supervised.classification.CostMatrix;
    import javax.datamining.supervised.classification.CostMatrixFactory;
    import javax.datamining.supervised.classification.Lift;
    import javax.datamining.supervised.classification.ReceiverOperatingCharacterics;
    import javax.datamining.task.BuildTask;
    import javax.datamining.task.BuildTaskFactory;
    import javax.datamining.task.apply.DataSetApplyTask;
    import javax.datamining.task.apply.DataSetApplyTaskFactory;
    // Oracle Java Data Mining (JDM) implemented api imports
    import oracle.dmt.jdm.algorithm.tree.OraTreeSettings;
    import oracle.dmt.jdm.resource.OraConnection;
    import oracle.dmt.jdm.resource.OraConnectionFactory;
    import oracle.dmt.jdm.modeldetail.tree.OraTreeModelDetail;
    public class dmtreedemo
    //Connection related data members
    private static Connection m_dmeConn;
    private static ConnectionFactory m_dmeConnFactory;
    //Object factories used in this demo program
    private static PhysicalDataSetFactory m_pdsFactory;
    private static PhysicalAttributeFactory m_paFactory;
    private static ClassificationSettingsFactory m_clasFactory;
    private static TreeSettingsFactory m_treeFactory;
    private static BuildTaskFactory m_buildFactory;
    private static DataSetApplyTaskFactory m_dsApplyFactory;
    private static ClassificationTestTaskFactory m_testFactory;
    private static ClassificationApplySettingsFactory m_applySettingsFactory;
    private static CostMatrixFactory m_costMatrixFactory;
    private static CategorySetFactory m_catSetFactory;
    private static ClassificationTestMetricsTaskFactory m_testMetricsTaskFactory;
    // Global constants
    private static DecimalFormat m_df = new DecimalFormat("##.####");
    private static String m_costMatrixName = null;
    public static void main( String args[] )
    try
    if ( args.length != 3 ) {
    System.out.println("Usage: java dmtreedemo <Host name>:<Port>:<SID> <User Name> <Password>");
    return;
    String uri = args[0];
    String name = args[1];
    String password = args[2];
    // 1. Login to the Data Mining Engine
    m_dmeConnFactory = new OraConnectionFactory();
    ConnectionSpec connSpec = m_dmeConnFactory.getConnectionSpec();
    connSpec.setURI("jdbc:oracle:thin:@"+uri);
    connSpec.setName(name);
    connSpec.setPassword(password);
    m_dmeConn = m_dmeConnFactory.getConnection(connSpec);
    // 2. Clean up all previously created demo objects
    clean();
    // 3. Initialize factories for mining objects
    initFactories();
    m_costMatrixName = createCostMatrix();
    // 4. Build model with supplied cost matrix
    buildModel();
    // 5. Test model - To compute accuracy and confusion matrix, lift result
    // and ROC for the model from apply output data.
    // Please see dnnbdemo.java to see how to test the model
    // with a test input data and cost matrix.
    // Test the model with cost matrix
    computeTestMetrics("DT_TEST_APPLY_OUTPUT_COST_JDM",
    "dtTestMetricsWithCost_jdm", m_costMatrixName);
    // Test the model without cost matrix
    computeTestMetrics("DT_TEST_APPLY_OUTPUT_JDM",
    "dtTestMetrics_jdm", null);
    // 6. Apply the model
    applyModel();
    } catch(Exception anyExp) {
    anyExp.printStackTrace(System.out);
    } finally {
    try {
    //6. Logout from the Data Mining Engine
    m_dmeConn.close();
    } catch(Exception anyExp1) { }//Ignore
    * Initialize all object factories used in the demo program.
    * @exception JDMException if factory initialization failed
    public static void initFactories() throws JDMException
    m_pdsFactory = (PhysicalDataSetFactory)m_dmeConn.getFactory(
    "javax.datamining.data.PhysicalDataSet");
    m_paFactory = (PhysicalAttributeFactory)m_dmeConn.getFactory(
    "javax.datamining.data.PhysicalAttribute");
    m_clasFactory = (ClassificationSettingsFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationSettings");
    m_treeFactory = (TreeSettingsFactory) m_dmeConn.getFactory(
    "javax.datamining.algorithm.tree.TreeSettings");
    m_buildFactory = (BuildTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.task.BuildTask");
    m_dsApplyFactory = (DataSetApplyTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.task.apply.DataSetApplyTask");
    m_testFactory = (ClassificationTestTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationTestTask");
    m_applySettingsFactory = (ClassificationApplySettingsFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationApplySettings");
    m_costMatrixFactory = (CostMatrixFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.CostMatrix");
    m_catSetFactory = (CategorySetFactory)m_dmeConn.getFactory(
    "javax.datamining.data.CategorySet" );
    m_testMetricsTaskFactory = (ClassificationTestMetricsTaskFactory)m_dmeConn.getFactory(
    "javax.datamining.supervised.classification.ClassificationTestMetricsTask");
    * This method illustrates how to build a mining model using the
    * MINING_DATA_BUILD_V dataset and classification settings with
    * DT algorithm.
    * @exception JDMException if model build failed
    public static void buildModel() throws JDMException
    System.out.println("---------------------------------------------------");
    System.out.println("--- Build Model - using cost matrix ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    PhysicalDataSet buildData =
    m_pdsFactory.create("MINING_DATA_BUILD_V", false);
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    buildData.addAttribute(pa);
    m_dmeConn.saveObject("treeBuildData_jdm", buildData, true);
    //2. Create & save Mining Function Settings
    // Create tree algorithm settings
    TreeSettings treeAlgo = m_treeFactory.create();
    // By default, tree algorithm will have the following settings:
    // treeAlgo.setBuildHomogeneityMetric(TreeHomogeneityMetric.gini);
    // treeAlgo.setMaxDepth(7);
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(0.1, SizeUnit.percentage);
    // treeAlgo.setMinNodeSize( 0.05, SizeUnit.percentage );
    // treeAlgo.setMinNodeSize( 10, SizeUnit.count );
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(20, SizeUnit.count);
    // Set cost matrix. A cost matrix is used to influence the weighting of
    // misclassification during model creation (and scoring).
    // See Oracle Data Mining Concepts Guide for more details.
    String costMatrixName = m_costMatrixName;
    // Create ClassificationSettings
    ClassificationSettings buildSettings = m_clasFactory.create();
    buildSettings.setAlgorithmSettings(treeAlgo);
    buildSettings.setCostMatrixName(costMatrixName);
    buildSettings.setTargetAttributeName("AFFINITY_CARD");
    m_dmeConn.saveObject("treeBuildSettings_jdm", buildSettings, true);
    // 3. Create, save & execute Build Task
    BuildTask buildTask = m_buildFactory.create(
    "treeBuildData_jdm", // Build data specification
    "treeBuildSettings_jdm", // Mining function settings name
    "treeModel_jdm" // Mining model name
    buildTask.setDescription("treeBuildTask_jdm");
    executeTask(buildTask, "treeBuildTask_jdm");
    //4. Restore the model from the DME and explore the details of the model
    ClassificationModel model =
    (ClassificationModel)m_dmeConn.retrieveObject(
    "treeModel_jdm", NamedObject.model);
    // Display model build settings
    ClassificationSettings retrievedBuildSettings =
    (ClassificationSettings)model.getBuildSettings();
    if(retrievedBuildSettings == null)
    System.out.println("Failure to restore build settings.");
    else
    displayBuildSettings(retrievedBuildSettings, "treeBuildSettings_jdm");
    // Display model signature
    displayModelSignature((Model)model);
    // Display model detail
    TreeModelDetail treeModelDetails = (TreeModelDetail) model.getModelDetail();
    displayTreeModelDetailsExtensions(treeModelDetails);
    * Create and save cost matrix.
    * Consider an example where it costs $10 to mail a promotion to a
    * prospective customer and if the prospect becomes a customer, the
    * typical sale, including the promotion, is worth $100. Then the cost
    * of missing a customer (i.e. missing a $100 sale) is 10x that of
    * incorrectly indicating that a person is good prospect (spending
    * $10 for the promo). In this case, all prediction errors made by
    * the model are NOT equal. To act on what the model determines to
    * be the most likely (probable) outcome may be a poor choice.
    * Suppose that the probability of a BUY response is 10% for a given
    * prospect. Then the expected revenue from the prospect is:
    * .10 * $100 - .90 * $10 = $1.
    * The optimal action, given the cost matrix, is to simply mail the
    * promotion to the customer, because the action is profitable ($1).
    * In contrast, without the cost matrix, all that can be said is
    * that the most likely response is NO BUY, so don't send the
    * promotion. This shows that cost matrices can be very important.
    * The caveat in all this is that the model predicted probabilities
    * may NOT be accurate. For binary targets, a systematic approach to
    * this issue exists. It is ROC, illustrated below.
    * With ROC computed on a test set, the user can see how various model
    * predicted probability thresholds affect the action of mailing a promotion.
    * Suppose I promote when the probability to BUY exceeds 5%, 10%, 15%, etc.
    * What return can I expect? Note that the answer to this question does
    * not rely on the predicted probabilities being accurate, only that
    * they are in approximately the correct rank order.
    * Assuming that the predicted probabilities are accurate, provide the
    * cost matrix table name as input to the RANK_APPLY procedure to get
    * appropriate costed scoring results to determine the most appropriate
    * action.
    * In this demo, we will create the following cost matrix
    * ActualTarget PredictedTarget Cost
    * 0 0 0
    * 0 1 1
    * 1 0 8
    * 1 1 0
    private static String createCostMatrix() throws JDMException
    String costMatrixName = "treeCostMatrix";
    // Create categorySet
    CategorySet catSet = m_catSetFactory.create(AttributeDataType.integerType);
    // Add category values
    catSet.addCategory(new Integer(0), CategoryProperty.valid);
    catSet.addCategory(new Integer(1), CategoryProperty.valid);
    // Create cost matrix
    CostMatrix costMatrix = m_costMatrixFactory.create(catSet);
    // ActualTarget PredictedTarget Cost
    costMatrix.setValue(new Integer(0), new Integer(0), 0);
    costMatrix.setValue(new Integer(0), new Integer(1), 1);
    costMatrix.setValue(new Integer(1), new Integer(0), 8);
    costMatrix.setValue(new Integer(1), new Integer(1), 0);
    //save cost matrix
    m_dmeConn.saveObject(costMatrixName, costMatrix, true);
    return costMatrixName;
    * This method illustrates how to compute test metrics using
    * an apply output table that has actual and predicted target values. Here the
    * apply operation is done on the MINING_DATA_TEST_V dataset. It creates
    * an apply output table with actual and predicted target values. Using
    * ClassificationTestMetricsTask test metrics are computed. This produces
    * the same test metrics results as ClassificationTestTask.
    * @param applyOutputName apply output table name
    * @param testResultName test result name
    * @param costMatrixName table name of the supplied cost matrix
    * @exception JDMException if model test failed
    public static void computeTestMetrics(String applyOutputName,
    String testResultName, String costMatrixName) throws JDMException
    if (costMatrixName != null) {
    System.out.println("---------------------------------------------------");
    System.out.println("--- Test Model - using apply output table ---");
    System.out.println("--- - using cost matrix table ---");
    System.out.println("---------------------------------------------------");
    else {
    System.out.println("---------------------------------------------------");
    System.out.println("--- Test Model - using apply output table ---");
    System.out.println("--- - using no cost matrix table ---");
    System.out.println("---------------------------------------------------");
    // 1. Do the apply on test data to create an apply output table
    // Create & save PhysicalDataSpecification
    PhysicalDataSet applyData =
    m_pdsFactory.create( "MINING_DATA_TEST_V", false );
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeTestApplyData_jdm", applyData, true );
    // 2 Create & save ClassificationApplySettings
    ClassificationApplySettings clasAS = m_applySettingsFactory.create();
    HashMap sourceAttrMap = new HashMap();
    sourceAttrMap.put( "AFFINITY_CARD", "AFFINITY_CARD" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    m_dmeConn.saveObject( "treeTestApplySettings_jdm", clasAS, true);
    // 3 Create, store & execute apply Task
    DataSetApplyTask applyTask = m_dsApplyFactory.create(
    "treeTestApplyData_jdm",
    "treeModel_jdm",
    "treeTestApplySettings_jdm",
    applyOutputName);
    if(executeTask(applyTask, "treeTestApplyTask_jdm"))
    // Compute test metrics on new created apply output table
    // 4. Create & save PhysicalDataSpecification
    PhysicalDataSet applyOutputData = m_pdsFactory.create(
    applyOutputName, false );
    applyOutputData.addAttribute( pa );
    m_dmeConn.saveObject( "treeTestApplyOutput_jdm", applyOutputData, true );
    // 5. Create a ClassificationTestMetricsTask
    ClassificationTestMetricsTask testMetricsTask =
    m_testMetricsTaskFactory.create( "treeTestApplyOutput_jdm", // apply output data used as input
    "AFFINITY_CARD", // actual target column
    "PREDICTION", // predicted target column
    testResultName // test metrics result name
    testMetricsTask.computeMetric( // enable confusion matrix computation
    ClassificationTestMetricOption.confusionMatrix, true );
    testMetricsTask.computeMetric( // enable lift computation
    ClassificationTestMetricOption.lift, true );
    testMetricsTask.computeMetric( // enable ROC computation
    ClassificationTestMetricOption.receiverOperatingCharacteristics, true );
    testMetricsTask.setPositiveTargetValue( new Integer(1) );
    testMetricsTask.setNumberOfLiftQuantiles( 10 );
    testMetricsTask.setPredictionRankingAttrName( "PROBABILITY" );
    if (costMatrixName != null) {
    testMetricsTask.setCostMatrixName(costMatrixName);
    displayTable(costMatrixName, "", "order by ACTUAL_TARGET_VALUE, PREDICTED_TARGET_VALUE");
    // Store & execute the task
    boolean isTaskSuccess = executeTask(testMetricsTask, "treeTestMetricsTask_jdm");
    if( isTaskSuccess ) {
    // Restore & display test metrics
    ClassificationTestMetrics testMetrics = (ClassificationTestMetrics)
    m_dmeConn.retrieveObject( testResultName, NamedObject.testMetrics );
    // Display classification test metrics
    displayTestMetricDetails(testMetrics);
    * This method illustrates how to apply the mining model on the
    * MINING_DATA_APPLY_V dataset to predict customer
    * response. After completion of the task apply output table with the
    * predicted results is created at the user specified location.
    * @exception JDMException if model apply failed
    public static void applyModel() throws JDMException
    System.out.println("---------------------------------------------------");
    System.out.println("--- Apply Model ---");
    System.out.println("---------------------------------------------------");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 1 ---");
    System.out.println("--- Find the 10 customers who live in Italy ---");
    System.out.println("--- that are least expensive to be convinced to ---");
    System.out.println("--- use an affinity card. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    PhysicalDataSet applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    PhysicalAttribute pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    ClassificationApplySettings clasAS = m_applySettingsFactory.create();
    // Add source attributes
    HashMap sourceAttrMap = new HashMap();
    sourceAttrMap.put( "COUNTRY_NAME", "COUNTRY_NAME" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    // Add cost matrix
    clasAS.setCostMatrixName( m_costMatrixName );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    DataSetApplyTask applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT1_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // Find the 10 customers who live in Italy that are least expensive to be
    // convinced to use an affinity card.
    displayTable("TREE_APPLY_OUTPUT1_JDM",
    "where COUNTRY_NAME='Italy' and ROWNUM < 11 ",
    "order by COST");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 2 ---");
    System.out.println("--- List ten customers (ordered by their id) ---");
    System.out.println("--- along with likelihood and cost to use or ---");
    System.out.println("--- reject the affinity card. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    clasAS = m_applySettingsFactory.create();
    // Add cost matrix
    clasAS.setCostMatrixName( m_costMatrixName );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT2_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // List ten customers (ordered by their id) along with likelihood and cost
    // to use or reject the affinity card (Note: while this example has a
    // binary target, such a query is useful in multi-class classification -
    // Low, Med, High for example).
    displayTable("TREE_APPLY_OUTPUT2_JDM",
    "where ROWNUM < 21",
    "order by CUST_ID, PREDICTION");
    System.out.println("---------------------------------------------------");
    System.out.println("--- Business case 3 ---");
    System.out.println("--- Find the customers who work in Tech support ---");
    System.out.println("--- and are under 25 who are likely to respond ---");
    System.out.println("--- to the new affinity card program. ---");
    System.out.println("---------------------------------------------------");
    // 1. Create & save PhysicalDataSpecification
    applyData =
    m_pdsFactory.create( "MINING_DATA_APPLY_V", false );
    pa = m_paFactory.create("CUST_ID",
    AttributeDataType.integerType, PhysicalAttributeRole.caseId );
    applyData.addAttribute( pa );
    m_dmeConn.saveObject( "treeApplyData_jdm", applyData, true );
    // 2. Create & save ClassificationApplySettings
    clasAS = m_applySettingsFactory.create();
    // Add source attributes
    sourceAttrMap = new HashMap();
    sourceAttrMap.put( "AGE", "AGE" );
    sourceAttrMap.put( "OCCUPATION", "OCCUPATION" );
    clasAS.setSourceDestinationMap( sourceAttrMap );
    m_dmeConn.saveObject( "treeApplySettings_jdm", clasAS, true);
    // 3. Create, store & execute apply Task
    applyTask = m_dsApplyFactory.create(
    "treeApplyData_jdm", "treeModel_jdm",
    "treeApplySettings_jdm", "TREE_APPLY_OUTPUT3_JDM");
    executeTask(applyTask, "treeApplyTask_jdm");
    // 4. Display apply result -- Note that APPLY results do not need to be
    // reverse transformed, as done in the case of model details. This is
    // because class values of a classification target were not (required to
    // be) binned or normalized.
    // Find the customers who work in Tech support and are under 25 who are
    // likely to respond to the new affinity card program.
    displayTable("TREE_APPLY_OUTPUT3_JDM",
    "where OCCUPATION = 'TechSup' " +
    "and AGE < 25 " +
    "and PREDICTION = 1 ",
    "order by CUST_ID");
    * This method stores the given task with the specified name in the DMS
    * and submits it for asynchronous execution. If the task completes
    * successfully, it returns true; on failure, it prints the error
    * description and returns false.
    * @param taskObj task object
    * @param taskName name of the task
    * @return boolean returns true when the task is successful
    * @exception JDMException if task execution failed
    public static boolean executeTask(Task taskObj, String taskName)
    throws JDMException
    boolean isTaskSuccess = false;
    m_dmeConn.saveObject(taskName, taskObj, true);
    ExecutionHandle execHandle = m_dmeConn.execute(taskName);
    System.out.print(taskName + " is started, please wait. ");
    //Wait for completion of the task
    ExecutionStatus status = execHandle.waitForCompletion(Integer.MAX_VALUE);
    //Check the status of the task after completion
    isTaskSuccess = status.getState().equals(ExecutionState.success);
    if( isTaskSuccess ) //Task completed successfully
    System.out.println(taskName + " is successful.");
    else //Task failed
    System.out.println(taskName + " failed.\nFailure Description: " +
    status.getDescription() );
    return isTaskSuccess;
    private static void displayBuildSettings(
    ClassificationSettings clasSettings, String buildSettingsName)
    System.out.println("BuildSettings Details from the "
    + buildSettingsName + " table:");
    displayTable(buildSettingsName, "", "order by SETTING_NAME");
    System.out.println("BuildSettings Details from the "
    + buildSettingsName + " model build settings object:");
    String objName = clasSettings.getName();
    if(objName != null)
    System.out.println("Name = " + objName);
    String objDescription = clasSettings.getDescription();
    if(objDescription != null)
    System.out.println("Description = " + objDescription);
    java.util.Date creationDate = clasSettings.getCreationDate();
    String creator = clasSettings.getCreatorInfo();
    String targetAttrName = clasSettings.getTargetAttributeName();
    System.out.println("Target attribute name = " + targetAttrName);
    AlgorithmSettings algoSettings = clasSettings.getAlgorithmSettings();
    if(algoSettings == null)
    System.out.println("Failure: clasSettings.getAlgorithmSettings() returns null");
    MiningAlgorithm algo = algoSettings.getMiningAlgorithm();
    if(algo == null) System.out.println("Failure: algoSettings.getMiningAlgorithm() returns null");
    System.out.println("Algorithm Name: " + algo.name());
    MiningFunction function = clasSettings.getMiningFunction();
    if(function == null) System.out.println("Failure: clasSettings.getMiningFunction() returns null");
    System.out.println("Function Name: " + function.name());
    try {
    String costMatrixName = clasSettings.getCostMatrixName();
    if(costMatrixName != null) {
    System.out.println("Cost Matrix Details from the " + costMatrixName
    + " table:");
    displayTable(costMatrixName, "", "order by ACTUAL_TARGET_VALUE, PREDICTED_TARGET_VALUE");
    } catch(Exception jdmExp)
    System.out.println("Failure: clasSettings.getCostMatrixName() throws exception");
    jdmExp.printStackTrace();
    // List of DT algorithm settings
    // treeAlgo.setBuildHomogeneityMetric(TreeHomogeneityMetric.gini);
    // treeAlgo.setMaxDepth(7);
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(0.1, SizeUnit.percentage);
    // treeAlgo.setMinNodeSize( 0.05, SizeUnit.percentage );
    // treeAlgo.setMinNodeSize( 10, SizeUnit.count );
    // ((OraTreeSettings)treeAlgo).setMinDecreaseInImpurity(20, SizeUnit.count);
    TreeHomogeneityMetric homogeneityMetric = ((OraTreeSettings)algoSettings).getBuildHomogeneityMetric();
    System.out.println("Homogeneity Metric: " + homogeneityMetric.name());
    int intValue = ((OraTreeSettings)algoSettings).getMaxDepth();
    System.out.println("Max Depth: " + intValue);
    double doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSizeForSplit(SizeUnit.percentage);
    System.out.println("MinNodeSizeForSplit (percentage): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSizeForSplit(SizeUnit.count);
    System.out.println("MinNodeSizeForSplit (count): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize();
    SizeUnit unit = ((OraTreeSettings)algoSettings).getMinNodeSizeUnit();
    System.out.println("Min Node Size (" + unit.name() +"): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize( SizeUnit.count );
    System.out.println("Min Node Size (" + SizeUnit.count.name() +"): " + m_df.format(doubleValue));
    doubleValue = ((OraTreeSettings)algoSettings).getMinNodeSize( SizeUnit.percentage );
    System.out.println("Min Node Size (" + SizeUnit.percentage.name() +"): " + m_df.format(doubleValue));
    * This method displays the DT model signature.
    * @param model model object
    * @exception JDMException if failed to retrieve model signature
    public static void displayModelSignature(Model model) throws JDMException
    String modelName = model.getName();
    System.out.println("Model Name: " + modelName);
    ModelSignature modelSignature = model.getSignature();
    System.out.println("ModelSignature Details: ( Attribute Name, Attribute Type )");
    MessageFormat mfSign = new MessageFormat(" ( {0}, {1} )");
    String[] vals = new String[3];
    Collection sortedSet = modelSignature.getAttributes();
    Iterator attrIterator = sortedSet.iterator();
    while(attrIterator.hasNext())
    SignatureAttribute attr = (SignatureAttribute)attrIterator.next();
    vals[0] = attr.getName();
    vals[1] = attr.getDataType().name();
    System.out.println( mfSign.format(vals) );
    * This method displays the DT model details.
    * @param treeModelDetails tree model details object
    * @exception JDMException if failed to retrieve model details
    public static void displayTreeModelDetailsExtensions(TreeModelDetail treeModelDetails)
    throws JDMException
    System.out.println( "\nTreeModelDetail: Model name=" + "treeModel_jdm" );
    TreeNode root = treeModelDetails.getRootNode();
    System.out.println( "\nRoot node: " + root.getIdentifier() );
    // get the info for the tree model
    int treeDepth = ((OraTreeModelDetail) treeModelDetails).getTreeDepth();
    System.out.println( "Tree depth: " + treeDepth );
    int totalNodes = ((OraTreeModelDetail) treeModelDetails).getNumberOfNodes();
    System.out.println( "Total number of nodes: " + totalNodes );
    int totalLeaves = ((OraTreeModelDetail) treeModelDetails).getNumberOfLeafNodes();
    System.out.println( "Total number of leaf nodes: " + totalLeaves );
    Stack nodeStack = new Stack();
    nodeStack.push( root);
    while( !nodeStack.empty() )
    TreeNode node = (TreeNode) nodeStack.pop();
    // display this node
    int nodeId = node.getIdentifier();
    long caseCount = node.getCaseCount();
    Object prediction = node.getPrediction();
    int level = node.getLevel();
    int children = node.getNumberOfChildren();
    TreeNode parent = node.getParent();
    System.out.println( "\nNode id=" + nodeId + " at level " + level );
    if( parent != null )
    System.out.println( "parent: " + parent.getIdentifier() +
    ", children=" + children );
    System.out.println( "Case count: " + caseCount + ", prediction: " + prediction );
    Predicate predicate = node.getPredicate();
    System.out.println( "Predicate: " + predicate.toString() );
    Predicate[] surrogates = node.getSurrogates();
    if( surrogates != null )
    for( int i=0; i<surrogates.length; i++ )
    System.out.println( "Surrogate[" + i + "]: " + surrogates[i] );
    // add child nodes in the stack
    if( children > 0 )
    TreeNode[] childNodes = node.getChildren();
    for( int i=0; i<childNodes.length; i++ )
    nodeStack.push( childNodes[i] );
    TreeNode[] allNodes = treeModelDetails.getNodes();
    System.out.print( "\nNode identifiers by getNodes():" );
    for( int i=0; i<allNodes.length; i++ )
    System.out.print( " " + allNodes[i].getIdentifier() );
    System.out.println();
    // display the node identifiers
    int[] nodeIds = treeModelDetails.getNodeIdentifiers();
    System.out.print( "Node identifiers by getNodeIdentifiers():" );
    for( int i=0; i<nodeIds.length; i++ )
    System.out.print( " " + nodeIds[i] );
    System.out.println();
    TreeNode node = treeModelDetails.getNode(nodeIds.length-1);
    System.out.println( "Node identifier by getNode(" + (nodeIds.length-1) +
    "): " + node.getIdentifier() );
    Rule rule2 = treeModelDetails.getRule(nodeIds.length-1);
    System.out.println( "Rule identifier by getRule(" + (nodeIds.length-1) +
    "): " + rule2.getRuleIdentifier() );
    // get the rules and display them
    Collection ruleColl = treeModelDetails.getRules();
    Iterator ruleIterator = ruleColl.iterator();
    while( ruleIterator.hasNext() )
    Rule rule = (Rule) ruleIterator.next();
    int ruleId = rule.getRuleIdentifier();
    Predicate antecedent = (Predicate) rule.getAntecedent();
    Predicate consequent = (Predicate) rule.getConsequent();
    System.out.println( "\nRULE " + ruleId + ": support=" +
    rule.getSupport() + " (abs=" + rule.getAbsoluteSupport() +
    "), confidence=" + rule.getConfidence() );
    System.out.println( antecedent );
    System.out.println( "=======>" );
    System.out.println( consequent );
    * Display classification test metrics object
    * @param testMetrics classification test metrics object
    * @exception JDMException if failed to retrieve test metric details
    public static void displayTestMetricDetails(
    ClassificationTestMetrics testMetrics) throws JDMException
    // Retrieve Oracle DT model test metrics details extensions
    // Test Metrics Name
    System.out.println("Test Metrics Name = " + testMetrics.getName());
    // Model Name
    System.out.println("Model Name = " + testMetrics.getModelName());
    // Test Data Name
    System.out.println("Test Data Name = " + testMetrics.getTestDataName());
    // Accuracy
    System.out.println("Accuracy = " + m_df.format(testMetrics.getAccuracy().doubleValue()));
    // Confusion Matrix
    ConfusionMatrix confusionMatrix = testMetrics.getConfusionMatrix();
    Collection categories = confusionMatrix.getCategories();
    Iterator xIterator = categories.iterator();
    System.out.println("Confusion Matrix: Accuracy = " + m_df.format(confusionMatrix.getAccuracy()));
    System.out.println("Confusion Matrix: Error = " + m_df.format(confusionMatrix.getError()));
    System.out.println("Confusion Matrix: ( Actual, Prediction, Value )");
    MessageFormat mf = new MessageFormat(" ( {0}, {1}, {2} )");
    String[] vals = new String[3];
    while(xIterator.hasNext())
    Object actual = xIterator.next();
    vals[0] = actual.toString();
    Iterator yIterator = categories.iterator();
    while(yIterator.hasNext())
    Object predicted = yIterator.next();
    vals[1] = predicted.toString();
    long number = confusionMatrix.getNumberOfPredictions(actual, predicted);
    vals[2] = Long.toString(number);
    System.out.println(mf.format(vals));
    // Lift
    Lift lift = testMetrics.getLift();
    System.out.println("Lift Details:");
    System.out.println("Lift: Target Attribute Name = " + lift.getTargetAttributeName());
    System.out.println("Lift: Positive Target Value = " + lift.getPositiveTargetValue());
    System.out.println("Lift: Total Cases = " + lift.getTotalCases());
    System.out.println("Lift: Total Positive Cases = " + lift.getTotalPositiveCases());
    int numberOfQuantiles = lift.getNumberOfQuantiles();
    System.out.println("Lift: Number Of Quantiles = " + numberOfQuantiles);
    System.out.println("Lift: ( QUANTILE_NUMBER, QUANTILE_TOTAL_COUNT, QUANTILE_TARGET_COUNT, PERCENTAGE_RECORDS_CUMULATIVE,CUMULATIVE_LIFT,CUMULATIVE_TARGET_DENSITY,TARGETS_CUMULATIVE, NON_TARGETS_CUMULATIVE, LIFT_QUANTILE, TARGET_DENSITY )");
    MessageFormat mfLift = new MessageFormat(" ( {0}, {1}, {2}, {3}, {4}, {5}, {6}, {7}, {8}, {9} )");
    String[] liftVals = new String[10];
    for(int iQuantile=1; iQuantile<= numberOfQuantiles; iQuantile++)
    liftVals[0] = Integer.toString(iQuantile); //QUANTILE_NUMBER
    liftVals[1] = Long.toString(lift.getCases((iQuantile-1), iQuantile));//QUANTILE_TOTAL_COUNT
    liftVals[2] = Long.toString(lift.getNumberOfPositiveCases((iQuantile-1), iQuantile));//QUANTILE_TARGET_COUNT
    liftVals[3] = m_df.format(lift.getCumulativePercentageSize(iQuantile).doubleValue());//PERCENTAGE_RECORDS_CUMULATIVE
    liftVals[4] = m_df.format(lift.getCumulativeLift(iQuantile).doubleValue());//CUMULATIVE_LIFT
    liftVals[5] = m_df.format(lift.getCumulativeTargetDensity(iQuantile).doubleValue());//CUMULATIVE_TARGET_DENSITY
    liftVals[6] = Long.toString(lift.getCumulativePositiveCases(iQuantile));//TARGETS_CUMULATIVE
    liftVals[7] = Long.toString(lift.getCumulativeNegativeCases(iQuantile));//NON_TARGETS_CUMULATIVE
    liftVals[8] = m_df.format(lift.getLift(iQuantile, iQuantile).doubleValue());//LIFT_QUANTILE
    liftVals[9] = m_df.format(lift.getTargetDensity(iQuantile, iQuantile).doubleValue());//TARGET_DENSITY
    System.out.println(mfLift.format(liftVals));
    // ROC
    ReceiverOperatingCharacterics roc = testMetrics.getROC();
    System.out.println("ROC Details:");
    System.out.println("ROC: Area Under Curve = " + m_df.format(roc.getAreaUnderCurve()));
    int nROCThresh = roc.getNumberOfThresholdCandidates();
    System.out.println("ROC: Number Of Threshold Candidates = " + nROCThresh);
    System.out.println("ROC: ( INDEX, PROBABILITY, TRUE_POSITIVES, FALSE_NEGATIVES, FALSE_POSITIVES, TRUE_NEGATIVES, TRUE_POSITIVE_FRACTION, FALSE_POSITIVE_FRACTION )");
    MessageFormat mfROC = new MessageFormat(" ( {0}, {1}, {2}, {3}, {4}, {5}, {6}, {7} )");
    String[] rocVals = new String[8];
    for(int iROC=1; iROC <= nROCThresh; iROC++)
    rocVals[0] = Integer.toString(iROC); //INDEX
    rocVals[1] = m_df.format(roc.getProbabilityThreshold(iROC));//PROBABILITY
    rocVals[2] = Long.toString(roc.getPositives(iROC, true));//TRUE_POSITIVES
    rocVals[3] = Long.toString(roc.getNegatives(iROC, false));//FALSE_NEGATIVES
    rocVals[4] = Long.toString(roc.getPositives(iROC, false));//FALSE_POSITIVES
    rocVals[5] = Long.toString(roc.getNegatives(iROC, true));//TRUE_NEGATIVES
    rocVals[6] = m_df.format(roc.getHitRate(iROC));//TRUE_POSITIVE_FRACTION
    rocVals[7] = m_df.format(roc.getFalseAlarmRate(iROC));//FALSE_POSITIVE_FRACTION
    System.out.println(mfROC.format(rocVals));
    private static void displayTable(String tableName, String whereCause, String orderByColumn)
    StringBuffer emptyCol = new StringBuffer(" ");
    java.sql.Connection dbConn =
    ((OraConnection)m_dmeConn).getDatabaseConnection();
    PreparedStatement pStmt = null;
    ResultSet rs = null;
    try
    pStmt = dbConn.prepareStatement("SELECT * FROM " + tableName + " " + whereCause + " " + orderByColumn);
    rs = pStmt.executeQuery();
    ResultSetMetaData rsMeta = rs.getMetaData();
    int colCount = rsMeta.getColumnCount();
    StringBuffer header = new StringBuffer();
    System.out.println("Table : " + tableName);
    //Build table header
    for(int iCol=1; iCol<=colCount; iCol++)
    String colName = rsMeta.getColumnName(iCol);
    header.append(emptyCol.replace(0, colName.length(), colName));
    emptyCol = new StringBuffer(" ");
    System.out.println(header.toString());
    //Write table data
    while(rs.next())
    StringBuffer rowContent = new StringBuffer();
    for(int iCol=1; iCol<=colCount; iCol++)
    int sqlType = rsMeta.getColumnType(iCol);
    Object obj = rs.getObject(iCol);
    String colContent = null;
    if(obj instanceof java.lang.Number)
    try
    BigDecimal bd = (BigDecimal)obj;
    if(bd.scale() > 5)
    colContent = m_df.format(obj);
    } else
    colContent = bd.toString();
    } catch(Exception anyExp) {
    colContent = m_df.format(obj);
    } else
    if(obj == null)
    colContent = "NULL";
    else
    colContent = obj.toString();
    rowContent.append(" "+emptyCol.replace(0, colContent.length(), colContent));
    emptyCol = new StringBuffer(" ");
    System.out.println(rowContent.toString());
    } catch(Exception anySqlExp) {
    anySqlExp.printStackTrace();
    }//Ignore
    private static void createTableForTestMetrics(String applyOutputTableName,
    String testDataName,
    String testMetricsInputTableName)
    //0. need to execute the following in the schema
    String sqlCreate =
    "create table " + testMetricsInputTableName + " as " +
    "select a.id as id, prediction, probability, affinity_card " +
    "from " + testDataName + " a, " + applyOutputTableName + " b " +
    "where a.id = b.id";
    java.sql.Connection dbConn = ((OraConnection) m_dmeConn).getDatabaseConnection();
    Statement stmt = null;
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate( sqlCreate );
    catch( Exception anySqlExp )
    System.out.println( anySqlExp.getMessage() );
    anySqlExp.printStackTrace();
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    private static void clean()
    java.sql.Connection dbConn =
    ((OraConnection) m_dmeConn).getDatabaseConnection();
    Statement stmt = null;
    // Drop apply output table
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT1_JDM");
    } catch(Exception anySqlExp) {}//Ignore
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT2_JDM");
    } catch(Exception anySqlExp) {}//Ignore
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate("DROP TABLE TREE_APPLY_OUTPUT3_JDM");
    } catch(Exception anySqlExp) {}//Ignore
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    // Drop apply output table created for test metrics task
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate("DROP TABLE DT_TEST_APPLY_OUTPUT_COST_JDM");
    } catch(Exception anySqlExp) {}//Ignore
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    try
    stmt = dbConn.createStatement();
    stmt.executeUpdate("DROP TABLE DT_TEST_APPLY_OUTPUT_JDM");
    } catch(Exception anySqlExp) {}//Ignore
    finally
    try
    stmt.close();
    catch( SQLException sqlExp ) {}
    //Drop the model
    try {
    m_dmeConn.removeObject( "treeModel_jdm", NamedObject.model );
    } catch(Exception jdmExp) {}
    // drop test metrics result: created by TestMetricsTask
    try {
    m_dmeConn.removeObject( "dtTestMetricsWithCost_jdm", NamedObject.testMetrics );
    } catch(Exception jdmExp) {}
    try {
    m_dmeConn.removeObject( "dtTestMetrics_jdm", NamedObject.testMetrics );
    } catch(Exception jdmExp) {}
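
    The ROC section above prints, for each candidate probability threshold, the confusion-matrix counts and the derived true/false positive rates. A minimal standalone sketch of that relationship (illustrative only, not the Oracle implementation; the class name and sample scores below are made up):

    ```java
    import java.util.List;

    public class RocSketch {
        // One scored case: model probability of the positive class, plus actual label.
        record Scored(double probability, boolean positive) {}

        // For a given threshold, count TP/FP/TN/FN: predict positive iff probability >= threshold.
        static long[] confusion(List<Scored> cases, double threshold) {
            long tp = 0, fp = 0, tn = 0, fn = 0;
            for (Scored s : cases) {
                boolean predicted = s.probability() >= threshold;
                if (predicted && s.positive()) tp++;
                else if (predicted) fp++;
                else if (s.positive()) fn++;
                else tn++;
            }
            return new long[] { tp, fp, tn, fn };
        }

        public static void main(String[] args) {
            List<Scored> cases = List.of(
                new Scored(0.9, true), new Scored(0.8, true), new Scored(0.7, false),
                new Scored(0.6, true), new Scored(0.4, false), new Scored(0.2, false));
            long p = cases.stream().filter(Scored::positive).count(); // total positives
            long n = cases.size() - p;                                // total negatives
            // Each distinct predicted probability is a candidate threshold; each
            // threshold yields one confusion matrix, hence one (FPR, TPR) ROC point.
            for (double t : new double[] { 0.9, 0.8, 0.7, 0.6, 0.4, 0.2 }) {
                long[] m = confusion(cases, t);
                double tpr = (double) m[0] / p;  // true positive rate  = TP / P
                double fpr = (double) m[1] / n;  // false positive rate = FP / N
                System.out.printf(java.util.Locale.ROOT,
                    "t=%.1f TPR=%.2f FPR=%.2f%n", t, tpr, fpr);
            }
        }
    }
    ```

    Lowering the threshold only adds predicted positives, so TPR and FPR both rise monotonically from (0, 0) toward (1, 1); the area under that curve is at most 1, with 0.5 meaning no better than chance. This is why the AUC differs from the area under a probability density, which always integrates to 1.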

    Hi
    I am not sure whether this will help, but someone else was getting an error with a java.sql.SQLException: Unsupported feature. Here is a link to the fix: http://saloon.javaranch.com/cgi-bin/ubb/ultimatebb.cgi?ubb=get_topic&f=3&t=007947
    Best wishes
    Michael

  • [SOLVED] Help: My fonts probably got messed up

    Hi,
    I'm on a fully updated Arch-X86_64/KDE system.
    After working flawlessly in the past, I can't use Efax-gtk anymore as I get errors similar to this one:
    Warning: Cannot convert string "-*-Helvetica-Medium-R-Normal--*-140-*-*-P-*-ISO8859-1"
    to type FontStruct
    Warning: Cannot convert string "-*-Helvetica-Medium-R-Normal--*-120-*-*-P-*-ISO8859-1"
    to type FontStruct
    Warning: Cannot convert string "-*-Helvetica-Medium-R-Normal--*-100-*-*-P-*-ISO8859-1"
    to type FontStruct
    Warning: Cannot convert string "-*-Helvetica-Bold-R-Normal--*-120-*-*-P-*-ISO8859-1"
    to type FontStruct
    Warning: Cannot convert string "-*-Helvetica
    Since the last time I successfully used Efax-gtk I've installed several fonts (mainly TTF) from AUR, and I guess something went wrong.
    Below please find several respective terminal outputs.
    Please advise.
    Thanks
    Regards,
    Michael  Badt
    $ fc-list : file
    /usr/share/fonts/TTF/DejaVuSerif-BoldItalic.ttf:
    /usr/share/fonts/Type1/a010015l.pfb:
    /usr/share/fonts/TTF/LiberationMono-BoldItalic.ttf:
    /usr/share/fonts/Type1/n019043l.pfb:
    /usr/share/fonts/misc/7x13-ISO8859-1.pcf.gz:
    /home/miki/.fonts/s/SimpleCLM_Medium.ttf:
    /usr/share/fonts/TTF/DejaVuSans-ExtraLight.ttf:
    /usr/share/fonts/misc/9x18-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/12x13ja.pcf.gz:
    /usr/share/fonts/misc/6x13-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/n019044l.pfb:
    /usr/share/fonts/TTF/DroidSansArmenian.ttf:
    /home/miki/.fonts/f/FrankRuehlCLM_MediumOblique.ttf:
    /usr/share/fonts/TTF/LiberationSerif-Regular.ttf:
    /usr/share/fonts/TTF/DroidSansEthiopic-Regular.ttf:
    /home/miki/.fonts/a/AharoniCLM_Bold.pfa:
    /usr/share/fonts/misc/6x12-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/DejaVuSerif-Bold.ttf:
    /home/miki/.fonts/m/MiriamMonoCLM_BookOblique.ttf:
    /usr/share/fonts/misc/10x20-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/8x13-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/7x13.pcf.gz:
    /home/miki/.fonts/a/AharoniCLM_Book.pfa:
    /usr/share/fonts/Type1/n022003l.pfb:
    /usr/share/fonts/misc/9x18B-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/n022004l.pfb:
    /usr/share/fonts/misc/7x14-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/DroidSansHebrew-Regular.ttf:
    /usr/share/fonts/misc/6x10-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/n021023l.pfb:
    /home/miki/.fonts/e/ElliniaCLM_Light.pfa:
    /usr/share/fonts/misc/9x18B.pcf.gz:
    /usr/share/fonts/misc/9x15-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/n021024l.pfb:
    /usr/share/fonts/Type1/n019023l.pfb:
    /usr/share/fonts/misc/9x15.pcf.gz:
    /usr/share/fonts/TTF/DroidSansFallbackLegacy.ttf:
    /usr/share/fonts/misc/5x7.pcf.gz:
    /usr/share/fonts/Type1/n022023l.pfb:
    /usr/share/fonts/Type1/n022024l.pfb:
    /home/miki/.fonts/k/KeterYG_Bold.ttf:
    /home/miki/.fonts/d/DrugulinCLM_BoldItalic.pfa:
    /usr/share/fonts/Type1/n019024l.pfb:
    /home/miki/.fonts/d/DavidCLM_Bold.ttf:
    /home/miki/.fonts/f/FrankRuehlCLM_BoldOblique.ttf:
    /usr/share/fonts/TTF/DejaVuSansCondensed.ttf:
    /usr/share/fonts/TTF/DroidKufi-Bold.ttf:
    /usr/share/fonts/TTF/LiberationSans-BoldItalic.ttf:
    /usr/share/fonts/misc/8x13.pcf.gz:
    /usr/share/fonts/misc/cu-pua12.pcf.gz:
    /usr/share/fonts/misc/8x13B-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/9x18.pcf.gz:
    /usr/share/fonts/TTF/FreeMonoBoldOblique.ttf:
    /usr/share/fonts/TTF/DejaVuSansMono-BoldOblique.ttf:
    /home/miki/.fonts/n/NachlieliCLM_Bold.pfa:
    /usr/share/fonts/TTF/DroidKufi-Regular.ttf:
    /usr/share/fonts/TTF/FreeSansOblique.ttf:
    /usr/share/fonts/Type1/n021003l.pfb:
    /home/miki/.fonts/h/HadasimCLM_Regular.ttf:
    /usr/share/fonts/misc/clR6x12-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/9x15B.pcf.gz:
    /usr/share/fonts/Type1/n021004l.pfb:
    /usr/share/fonts/TTF/FreeSerif.ttf:
    /usr/share/fonts/TTF/FreeMonoOblique.ttf:
    /usr/share/fonts/TTF/DroidSansHebrew-Bold.ttf:
    /usr/share/fonts/TTF/Ubuntu-B.ttf:
    /home/miki/.fonts/k/KeterYG_BoldOblique.ttf:
    /usr/share/fonts/TTF/Ubuntu-R.ttf:
    /usr/share/fonts/Type1/n019003l.pfb:
    /usr/share/fonts/TTF/Ubuntu-C.ttf:
    /home/miki/.fonts/y/YehudaCLM_Bold.pfa:
    /usr/share/fonts/misc/8x13B.pcf.gz:
    /home/miki/.fonts/d/DavidCLM_MediumItalic.ttf:
    /usr/share/fonts/kanjistrokeorders/KanjiStrokeOrders.ttf:
    /usr/share/fonts/misc/9x15B-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/LiberationSerif-Bold.ttf:
    /usr/share/fonts/misc/decsess.pcf.gz:
    /usr/share/fonts/misc/6x9-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/DejaVuSerifCondensed-BoldItalic.ttf:
    /home/miki/.fonts/n/NachlieliCLM_BoldOblique.pfa:
    /home/miki/.fonts/h/HadasimCLM_Bold.ttf:
    /usr/share/fonts/misc/micro.pcf.gz:
    /usr/share/fonts/TTF/DejaVuSerif-Italic.ttf:
    /usr/share/fonts/Type1/n019004l.pfb:
    /home/miki/.fonts/h/HadasimCLM_RegularOblique.ttf:
    /usr/share/fonts/TTF/DejaVuSansMono.ttf:
    /usr/share/fonts/TTF/DroidSansGeorgian.ttf:
    /usr/share/fonts/TTF/DroidSansEthiopic-Bold.ttf:
    /home/miki/.fonts/s/SimpleCLM_Bold.ttf:
    /usr/share/fonts/misc/6x13B-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/8x16.pcf.gz:
    /usr/share/fonts/misc/7x13B-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/Ubuntu-L.ttf:
    /usr/share/fonts/TTF/DroidSansJapanese.ttf:
    /usr/share/fonts/misc/6x9.pcf.gz:
    /home/miki/.fonts/s/StamSefaradCLM.ttf:
    /usr/share/fonts/TTF/LiberationMono-Bold.ttf:
    /home/miki/.fonts/s/SimpleCLM_BoldOblique.ttf:
    /usr/share/fonts/misc/cursor.pcf.gz:
    /home/miki/.fonts/e/ElliniaCLM_BoldItalic.pfa:
    /usr/share/fonts/TTF/DejaVuSans-BoldOblique.ttf:
    /usr/share/fonts/TTF/DejaVuSansCondensed-Oblique.ttf:
    /usr/share/fonts/TTF/DroidNaskh-Regular.ttf:
    /home/miki/.fonts/y/YehudaCLM_Light.pfa:
    /usr/share/fonts/TTF/DejaVuSans.ttf:
    /usr/share/fonts/TTF/FreeSerifBold.ttf:
    /usr/share/fonts/misc/clR6x12.pcf.gz:
    /home/miki/.fonts/e/ElliniaCLM_Bold.pfa:
    /usr/share/fonts/misc/8x13O-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/DejaVuSansCondensed-Bold.ttf:
    /usr/share/fonts/misc/8x13O.pcf.gz:
    /usr/share/fonts/misc/7x13O-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/deccurs.pcf.gz:
    /usr/share/fonts/misc/7x13B.pcf.gz:
    /home/miki/.fonts/n/NachlieliCLM_LightOblique.pfa:
    /usr/share/fonts/misc/6x13O-ISO8859-1.pcf.gz:
    /usr/share/fonts/TTF/UbuntuMono-BI.ttf:
    /usr/share/fonts/TTF/LiberationMono-Italic.ttf:
    /home/miki/.fonts/c/CaladingsCLM.pfa:
    /usr/share/fonts/TTF/DroidSerif-BoldItalic.ttf:
    /usr/share/fonts/misc/7x14.pcf.gz:
    /usr/share/fonts/Type1/c059036l.pfb:
    /home/miki/.fonts/h/HadasimCLM_BoldOblique.ttf:
    /home/miki/.fonts/s/StamAshkenazCLM.ttf:
    /usr/share/fonts/TTF/DroidSans-Bold.ttf:
    /usr/share/fonts/misc/7x14B-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/c059033l.pfb:
    /usr/share/fonts/TTF/Ubuntu-RI.ttf:
    /usr/share/fonts/TTF/DejaVuSansMono-Oblique.ttf:
    /usr/share/fonts/misc/10x20.pcf.gz:
    /home/miki/.fonts/m/MiriamMonoCLM_Bold.ttf:
    /usr/share/fonts/misc/4x6.pcf.gz:
    /usr/share/fonts/TTF/FreeSansBoldOblique.ttf:
    /usr/share/fonts/TTF/DroidSansTamil-Regular.ttf:
    /usr/share/fonts/misc/6x12.pcf.gz:
    /usr/share/fonts/TTF/DroidSerif-Regular.ttf:
    /home/miki/.fonts/m/MiriamMonoCLM_Book.ttf:
    /usr/share/fonts/TTF/FreeMono.ttf:
    /home/miki/.fonts/a/AharoniCLM_BookOblique.pfa:
    /home/miki/.fonts/m/MiriamMonoCLM_BoldOblique.ttf:
    /usr/share/fonts/TTF/FreeSans.ttf:
    /usr/share/fonts/TTF/DroidNaskh-Regular-SystemUI.ttf:
    /usr/share/fonts/Type1/d050000l.pfb:
    /home/miki/.fonts/a/AharoniCLM_BoldOblique.pfa:
    /usr/share/fonts/misc/18x18ko.pcf.gz:
    /usr/share/fonts/misc/6x13B.pcf.gz:
    /usr/share/fonts/TTF/DroidSans.ttf:
    /usr/share/fonts/TTF/UbuntuMono-RI.ttf:
    /usr/share/fonts/TTF/LiberationSans-Bold.ttf:
    /usr/share/fonts/Type1/c059016l.pfb:
    /usr/share/fonts/Type1/s050000l.pfb:
    /usr/share/fonts/misc/7x14B.pcf.gz:
    /usr/share/fonts/TTF/LiberationSerif-Italic.ttf:
    /usr/share/fonts/TTF/LiberationSerif-BoldItalic.ttf:
    /home/miki/.fonts/d/DavidCLM_BoldItalic.ttf:
    /usr/share/fonts/Type1/c059013l.pfb:
    /usr/share/fonts/TTF/Ubuntu-BI.ttf:
    /usr/share/fonts/misc/7x13O.pcf.gz:
    /usr/share/fonts/TTF/DejaVuSerifCondensed-Italic.ttf:
    /usr/share/fonts/TTF/FreeSansBold.ttf:
    /usr/share/fonts/TTF/Ubuntu-LI.ttf:
    /usr/share/fonts/TTF/FreeMonoBold.ttf:
    /usr/share/fonts/TTF/DejaVuSans-Bold.ttf:
    /home/miki/.fonts/k/KeterYG_MediumOblique.ttf:
    /usr/share/fonts/TTF/DejaVuSerif.ttf:
    /home/miki/.fonts/s/SimpleCLM_MediumOblique.ttf:
    /usr/share/fonts/TTF/FreeSerifItalic.ttf:
    /usr/share/fonts/TTF/DejaVuSansMono-Bold.ttf:
    /usr/share/fonts/TTF/DroidSansThai.ttf:
    /home/miki/.fonts/d/DavidCLM_Medium.ttf:
    /usr/share/fonts/TTF/DejaVuSerifCondensed.ttf:
    /usr/share/fonts/TTF/LiberationMono-Regular.ttf:
    /usr/share/fonts/Type1/b018032l.pfb:
    /usr/share/fonts/TTF/DroidNaskh-Bold.ttf:
    /usr/share/fonts/misc/5x8.pcf.gz:
    /usr/share/fonts/Type1/b018035l.pfb:
    /usr/share/fonts/TTF/UbuntuMono-B.ttf:
    /usr/share/fonts/misc/5x7-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/6x13O.pcf.gz:
    /usr/share/fonts/TTF/UbuntuMono-R.ttf:
    /home/miki/.fonts/e/ElliniaCLM_LightItalic.pfa:
    /usr/share/fonts/TTF/DroidSansArabic.ttf:
    /usr/share/fonts/misc/6x10.pcf.gz:
    /usr/share/fonts/TTF/DroidSerif-Bold.ttf:
    /usr/share/fonts/TTF/DroidSansDevanagari-Regular.ttf:
    /usr/share/fonts/TTF/DejaVuSansCondensed-BoldOblique.ttf:
    /usr/share/fonts/Type1/z003034l.pfb:
    /usr/share/fonts/misc/12x24.pcf.gz:
    /usr/share/fonts/TTF/DroidSansFallback.ttf:
    /usr/share/fonts/misc/18x18ja.pcf.gz:
    /home/miki/.fonts/f/FrankRuehlCLM_Bold.ttf:
    /usr/share/fonts/TTF/LiberationSans-Italic.ttf:
    /home/miki/.fonts/m/MiriamCLM_Book.ttf:
    /usr/share/fonts/Type1/b018012l.pfb:
    /usr/share/fonts/Type1/p052004l.pfb:
    /usr/share/fonts/misc/cu12.pcf.gz:
    /home/miki/.fonts/m/MiriamCLM_Bold.ttf:
    /usr/share/fonts/TTF/DroidSerif-Italic.ttf:
    /usr/share/fonts/Type1/b018015l.pfb:
    /usr/share/fonts/misc/4x6-ISO8859-1.pcf.gz:
    /usr/share/fonts/Type1/a010033l.pfb:
    /usr/share/fonts/TTF/DroidSansTamil-Bold.ttf:
    /usr/share/fonts/Type1/p052003l.pfb:
    /usr/share/fonts/Type1/a010035l.pfb:
    /usr/share/fonts/TTF/LiberationSans-Regular.ttf:
    /home/miki/.fonts/n/NachlieliCLM_Light.pfa:
    /usr/share/fonts/TTF/DejaVuSerifCondensed-Bold.ttf:
    /usr/share/fonts/Type1/n019063l.pfb:
    /usr/share/fonts/misc/5x8-ISO8859-1.pcf.gz:
    /usr/share/fonts/misc/6x13.pcf.gz:
    /home/miki/.fonts/d/DrugulinCLM_Bold.pfa:
    /usr/share/fonts/misc/arabic24.pcf.gz:
    /usr/share/fonts/Type1/n019064l.pfb:
    /usr/share/fonts/misc/cu-alt12.pcf.gz:
    /home/miki/.fonts/f/FrankRuehlCLM_Medium.ttf:
    /usr/share/fonts/TTF/DroidSansMono.ttf:
    /usr/share/fonts/Type1/p052024l.pfb:
    /usr/share/fonts/TTF/FreeSerifBoldItalic.ttf:
    /usr/share/fonts/TTF/DejaVuSans-Oblique.ttf:
    /usr/share/fonts/Type1/a010013l.pfb:
    /usr/share/fonts/TTF/DroidSansFallbackFull.ttf:
    /home/miki/.fonts/k/KeterYG_Medium.ttf:
    /usr/share/fonts/Type1/p052023l.pfb:
    And here is probably the problem (that's AFTER I've run mkfontdir, as root, in both directories, as suggested)
    miki ~ $ grep /fonts /var/log/Xorg.0.log
    [ 16.143] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
    [ 16.153] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
    [ 16.153] (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
    [ 16.154] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
    [ 16.154] (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/,
    /usr/share/fonts/Type1/
    $ xset q
    Keyboard Control:
    auto repeat: on key click percent: 0 LED mask: 00000000
    XKB indicators:
    00: Caps Lock: off 01: Num Lock: off 02: Scroll Lock: off
    03: Compose: off 04: Kana: off 05: Sleep: off
    06: Suspend: off 07: Mute: off 08: Misc: off
    09: Mail: off 10: Charging: off 11: Shift Lock: off
    12: Group 2: off 13: Mouse Keys: off
    auto repeat delay: 660 repeat rate: 25
    auto repeating keys: 00ffffffdffffbbf
    fadfffefffedffff
    9fffffffffffffff
    fff7ffffffffffff
    bell percent: 50 bell pitch: 400 bell duration: 100
    Pointer Control:
    acceleration: 20/10 threshold: 4
    Screen Saver:
    prefer blanking: yes allow exposures: yes
    timeout: 0 cycle: 600
    Colors:
    default colormap: 0x20 BlackPixel: 0x0 WhitePixel: 0xffffff
    Font Path:
    /usr/share/fonts/misc/,/usr/share/fonts/TTF/,/usr/share/fonts/Type1/,built-ins,/home/miki/.fonts
    DPMS (Energy Star):
    Standby: 600 Suspend: 900 Off: 1200
    DPMS is Enabled
    Monitor is On
    and my full fonts tree
    miki /usr/share/fonts $ tree
    β”œβ”€β”€ 100dpi
    β”‚   └── fonts.alias
    β”œβ”€β”€ 75dpi
    β”‚   └── fonts.alias
    β”œβ”€β”€ TTF
    β”‚   β”œβ”€β”€ DejaVuSans-Bold.ttf
    β”‚   β”œβ”€β”€ DejaVuSans-BoldOblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSans-ExtraLight.ttf
    β”‚   β”œβ”€β”€ DejaVuSans-Oblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSans.ttf
    β”‚   β”œβ”€β”€ DejaVuSansCondensed-Bold.ttf
    β”‚   β”œβ”€β”€ DejaVuSansCondensed-BoldOblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSansCondensed-Oblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSansCondensed.ttf
    β”‚   β”œβ”€β”€ DejaVuSansMono-Bold.ttf
    β”‚   β”œβ”€β”€ DejaVuSansMono-BoldOblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSansMono-Oblique.ttf
    β”‚   β”œβ”€β”€ DejaVuSansMono.ttf
    β”‚   β”œβ”€β”€ DejaVuSerif-Bold.ttf
    β”‚   β”œβ”€β”€ DejaVuSerif-BoldItalic.ttf
    β”‚   β”œβ”€β”€ DejaVuSerif-Italic.ttf
    β”‚   β”œβ”€β”€ DejaVuSerif.ttf
    β”‚   β”œβ”€β”€ DejaVuSerifCondensed-Bold.ttf
    β”‚   β”œβ”€β”€ DejaVuSerifCondensed-BoldItalic.ttf
    β”‚   β”œβ”€β”€ DejaVuSerifCondensed-Italic.ttf
    β”‚   β”œβ”€β”€ DejaVuSerifCondensed.ttf
    β”‚   β”œβ”€β”€ DroidKufi-Bold.ttf
    β”‚   β”œβ”€β”€ DroidKufi-Regular.ttf
    β”‚   β”œβ”€β”€ DroidNaskh-Bold.ttf
    β”‚   β”œβ”€β”€ DroidNaskh-Regular-SystemUI.ttf
    β”‚   β”œβ”€β”€ DroidNaskh-Regular.ttf
    β”‚   β”œβ”€β”€ DroidSans-Bold.ttf
    β”‚   β”œβ”€β”€ DroidSans.ttf
    β”‚   β”œβ”€β”€ DroidSansArabic.ttf
    β”‚   β”œβ”€β”€ DroidSansArmenian.ttf
    β”‚   β”œβ”€β”€ DroidSansDevanagari-Regular.ttf
    β”‚   β”œβ”€β”€ DroidSansEthiopic-Bold.ttf
    β”‚   β”œβ”€β”€ DroidSansEthiopic-Regular.ttf
    β”‚   β”œβ”€β”€ DroidSansFallback.ttf
    β”‚   β”œβ”€β”€ DroidSansFallbackFull.ttf
    β”‚   β”œβ”€β”€ DroidSansFallbackLegacy.ttf
    β”‚   β”œβ”€β”€ DroidSansGeorgian.ttf
    β”‚   β”œβ”€β”€ DroidSansHebrew-Bold.ttf
    β”‚   β”œβ”€β”€ DroidSansHebrew-Regular.ttf
    β”‚   β”œβ”€β”€ DroidSansJapanese.ttf
    β”‚   β”œβ”€β”€ DroidSansMono.ttf
    β”‚   β”œβ”€β”€ DroidSansTamil-Bold.ttf
    β”‚   β”œβ”€β”€ DroidSansTamil-Regular.ttf
    β”‚   β”œβ”€β”€ DroidSansThai.ttf
    β”‚   β”œβ”€β”€ DroidSerif-Bold.ttf
    β”‚   β”œβ”€β”€ DroidSerif-BoldItalic.ttf
    β”‚   β”œβ”€β”€ DroidSerif-Italic.ttf
    β”‚   β”œβ”€β”€ DroidSerif-Regular.ttf
    β”‚   β”œβ”€β”€ FreeMono.ttf
    β”‚   β”œβ”€β”€ FreeMonoBold.ttf
    β”‚   β”œβ”€β”€ FreeMonoBoldOblique.ttf
    β”‚   β”œβ”€β”€ FreeMonoOblique.ttf
    β”‚   β”œβ”€β”€ FreeSans.ttf
    β”‚   β”œβ”€β”€ FreeSansBold.ttf
    β”‚   β”œβ”€β”€ FreeSansBoldOblique.ttf
    β”‚   β”œβ”€β”€ FreeSansOblique.ttf
    β”‚   β”œβ”€β”€ FreeSerif.ttf
    β”‚   β”œβ”€β”€ FreeSerifBold.ttf
    β”‚   β”œβ”€β”€ FreeSerifBoldItalic.ttf
    β”‚   β”œβ”€β”€ FreeSerifItalic.ttf
    β”‚   β”œβ”€β”€ LiberationMono-Bold.ttf
    β”‚   β”œβ”€β”€ LiberationMono-BoldItalic.ttf
    β”‚   β”œβ”€β”€ LiberationMono-Italic.ttf
    β”‚   β”œβ”€β”€ LiberationMono-Regular.ttf
    β”‚   β”œβ”€β”€ LiberationSans-Bold.ttf
    β”‚   β”œβ”€β”€ LiberationSans-BoldItalic.ttf
    β”‚   β”œβ”€β”€ LiberationSans-Italic.ttf
    β”‚   β”œβ”€β”€ LiberationSans-Regular.ttf
    β”‚   β”œβ”€β”€ LiberationSerif-Bold.ttf
    β”‚   β”œβ”€β”€ LiberationSerif-BoldItalic.ttf
    β”‚   β”œβ”€β”€ LiberationSerif-Italic.ttf
    β”‚   β”œβ”€β”€ LiberationSerif-Regular.ttf
    β”‚   β”œβ”€β”€ Ubuntu-B.ttf
    β”‚   β”œβ”€β”€ Ubuntu-BI.ttf
    β”‚   β”œβ”€β”€ Ubuntu-C.ttf
    β”‚   β”œβ”€β”€ Ubuntu-L.ttf
    β”‚   β”œβ”€β”€ Ubuntu-LI.ttf
    β”‚   β”œβ”€β”€ Ubuntu-R.ttf
    β”‚   β”œβ”€β”€ Ubuntu-RI.ttf
    β”‚   β”œβ”€β”€ UbuntuMono-B.ttf
    β”‚   β”œβ”€β”€ UbuntuMono-BI.ttf
    β”‚   β”œβ”€β”€ UbuntuMono-R.ttf
    β”‚   β”œβ”€β”€ UbuntuMono-RI.ttf
    β”‚   β”œβ”€β”€ fonts.dir
    β”‚   └── fonts.scale
    β”œβ”€β”€ Type1
    β”‚   β”œβ”€β”€ a010013l.afm
    β”‚   β”œβ”€β”€ a010013l.pfb
    β”‚   β”œβ”€β”€ a010013l.pfm
    β”‚   β”œβ”€β”€ a010015l.afm
    β”‚   β”œβ”€β”€ a010015l.pfb
    β”‚   β”œβ”€β”€ a010015l.pfm
    β”‚   β”œβ”€β”€ a010033l.afm
    β”‚   β”œβ”€β”€ a010033l.pfb
    β”‚   β”œβ”€β”€ a010033l.pfm
    β”‚   β”œβ”€β”€ a010035l.afm
    β”‚   β”œβ”€β”€ a010035l.pfb
    β”‚   β”œβ”€β”€ a010035l.pfm
    β”‚   β”œβ”€β”€ b018012l.afm
    β”‚   β”œβ”€β”€ b018012l.pfb
    β”‚   β”œβ”€β”€ b018012l.pfm
    β”‚   β”œβ”€β”€ b018015l.afm
    β”‚   β”œβ”€β”€ b018015l.pfb
    β”‚   β”œβ”€β”€ b018015l.pfm
    β”‚   β”œβ”€β”€ b018032l.afm
    β”‚   β”œβ”€β”€ b018032l.pfb
    β”‚   β”œβ”€β”€ b018032l.pfm
    β”‚   β”œβ”€β”€ b018035l.afm
    β”‚   β”œβ”€β”€ b018035l.pfb
    β”‚   β”œβ”€β”€ b018035l.pfm
    β”‚   β”œβ”€β”€ c059013l.afm
    β”‚   β”œβ”€β”€ c059013l.pfb
    β”‚   β”œβ”€β”€ c059013l.pfm
    β”‚   β”œβ”€β”€ c059016l.afm
    β”‚   β”œβ”€β”€ c059016l.pfb
    β”‚   β”œβ”€β”€ c059016l.pfm
    β”‚   β”œβ”€β”€ c059033l.afm
    β”‚   β”œβ”€β”€ c059033l.pfb
    β”‚   β”œβ”€β”€ c059033l.pfm
    β”‚   β”œβ”€β”€ c059036l.afm
    β”‚   β”œβ”€β”€ c059036l.pfb
    β”‚   β”œβ”€β”€ c059036l.pfm
    β”‚   β”œβ”€β”€ d050000l.afm
    β”‚   β”œβ”€β”€ d050000l.pfb
    β”‚   β”œβ”€β”€ d050000l.pfm
    β”‚   β”œβ”€β”€ fonts.dir
    β”‚   β”œβ”€β”€ fonts.scale
    β”‚   β”œβ”€β”€ n019003l.afm
    β”‚   β”œβ”€β”€ n019003l.pfb
    β”‚   β”œβ”€β”€ n019003l.pfm
    β”‚   β”œβ”€β”€ n019004l.afm
    β”‚   β”œβ”€β”€ n019004l.pfb
    β”‚   β”œβ”€β”€ n019004l.pfm
    β”‚   β”œβ”€β”€ n019023l.afm
    β”‚   β”œβ”€β”€ n019023l.pfb
    β”‚   β”œβ”€β”€ n019023l.pfm
    β”‚   β”œβ”€β”€ n019024l.afm
    β”‚   β”œβ”€β”€ n019024l.pfb
    β”‚   β”œβ”€β”€ n019024l.pfm
    β”‚   β”œβ”€β”€ n019043l.afm
    β”‚   β”œβ”€β”€ n019043l.pfb
    β”‚   β”œβ”€β”€ n019043l.pfm
    β”‚   β”œβ”€β”€ n019044l.afm
    β”‚   β”œβ”€β”€ n019044l.pfb
    β”‚   β”œβ”€β”€ n019044l.pfm
    β”‚   β”œβ”€β”€ n019063l.afm
    β”‚   β”œβ”€β”€ n019063l.pfb
    β”‚   β”œβ”€β”€ n019063l.pfm
    β”‚   β”œβ”€β”€ n019064l.afm
    β”‚   β”œβ”€β”€ n019064l.pfb
    β”‚   β”œβ”€β”€ n019064l.pfm
    β”‚   β”œβ”€β”€ n021003l.afm
    β”‚   β”œβ”€β”€ n021003l.pfb
    β”‚   β”œβ”€β”€ n021003l.pfm
    β”‚   β”œβ”€β”€ n021004l.afm
    β”‚   β”œβ”€β”€ n021004l.pfb
    β”‚   β”œβ”€β”€ n021004l.pfm
    β”‚   β”œβ”€β”€ n021023l.afm
    β”‚   β”œβ”€β”€ n021023l.pfb
    β”‚   β”œβ”€β”€ n021023l.pfm
    β”‚   β”œβ”€β”€ n021024l.afm
    β”‚   β”œβ”€β”€ n021024l.pfb
    β”‚   β”œβ”€β”€ n021024l.pfm
    β”‚   β”œβ”€β”€ n022003l.afm
    β”‚   β”œβ”€β”€ n022003l.pfb
    β”‚   β”œβ”€β”€ n022003l.pfm
    β”‚   β”œβ”€β”€ n022004l.afm
    β”‚   β”œβ”€β”€ n022004l.pfb
    β”‚   β”œβ”€β”€ n022004l.pfm
    β”‚   β”œβ”€β”€ n022023l.afm
    β”‚   β”œβ”€β”€ n022023l.pfb
    β”‚   β”œβ”€β”€ n022023l.pfm
    β”‚   β”œβ”€β”€ n022024l.afm
    β”‚   β”œβ”€β”€ n022024l.pfb
    β”‚   β”œβ”€β”€ n022024l.pfm
    β”‚   β”œβ”€β”€ p052003l.afm
    β”‚   β”œβ”€β”€ p052003l.pfb
    β”‚   β”œβ”€β”€ p052003l.pfm
    β”‚   β”œβ”€β”€ p052004l.afm
    β”‚   β”œβ”€β”€ p052004l.pfb
    β”‚   β”œβ”€β”€ p052004l.pfm
    β”‚   β”œβ”€β”€ p052023l.afm
    β”‚   β”œβ”€β”€ p052023l.pfb
    β”‚   β”œβ”€β”€ p052023l.pfm
    β”‚   β”œβ”€β”€ p052024l.afm
    β”‚   β”œβ”€β”€ p052024l.pfb
    β”‚   β”œβ”€β”€ p052024l.pfm
    β”‚   β”œβ”€β”€ s050000l.afm
    β”‚   β”œβ”€β”€ s050000l.pfb
    β”‚   β”œβ”€β”€ s050000l.pfm
    β”‚   β”œβ”€β”€ z003034l.afm
    β”‚   β”œβ”€β”€ z003034l.pfb
    β”‚   └── z003034l.pfm
    β”œβ”€β”€ cyrillic
    β”‚   └── fonts.alias
    β”œβ”€β”€ encodings
    β”‚   β”œβ”€β”€ adobe-dingbats.enc.gz
    β”‚   β”œβ”€β”€ adobe-standard.enc.gz
    β”‚   β”œβ”€β”€ adobe-symbol.enc.gz
    β”‚   β”œβ”€β”€ armscii-8.enc.gz
    β”‚   β”œβ”€β”€ ascii-0.enc.gz
    β”‚   β”œβ”€β”€ dec-special.enc.gz
    β”‚   β”œβ”€β”€ encodings.dir
    β”‚   β”œβ”€β”€ ibm-cp437.enc.gz
    β”‚   β”œβ”€β”€ ibm-cp850.enc.gz
    β”‚   β”œβ”€β”€ ibm-cp852.enc.gz
    β”‚   β”œβ”€β”€ ibm-cp866.enc.gz
    β”‚   β”œβ”€β”€ iso8859-11.enc.gz
    β”‚   β”œβ”€β”€ iso8859-13.enc.gz
    β”‚   β”œβ”€β”€ iso8859-16.enc.gz
    β”‚   β”œβ”€β”€ iso8859-6.16.enc.gz
    β”‚   β”œβ”€β”€ iso8859-6.8x.enc.gz
    β”‚   β”œβ”€β”€ large
    β”‚   β”‚   β”œβ”€β”€ big5.eten-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ big5hkscs-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ cns11643-1.enc.gz
    β”‚   β”‚   β”œβ”€β”€ cns11643-2.enc.gz
    β”‚   β”‚   β”œβ”€β”€ cns11643-3.enc.gz
    β”‚   β”‚   β”œβ”€β”€ encodings.dir
    β”‚   β”‚   β”œβ”€β”€ gb18030-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ gb18030.2000-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ gb18030.2000-1.enc.gz
    β”‚   β”‚   β”œβ”€β”€ gb2312.1980-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ gbk-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ jisx0201.1976-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ jisx0208.1990-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ jisx0212.1990-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ ksc5601.1987-0.enc.gz
    β”‚   β”‚   β”œβ”€β”€ ksc5601.1992-3.enc.gz
    β”‚   β”‚   └── sun.unicode.india-0.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1250.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1251.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1252.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1253.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1254.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1255.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1256.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1257.enc.gz
    β”‚   β”œβ”€β”€ microsoft-cp1258.enc.gz
    β”‚   β”œβ”€β”€ microsoft-win3.1.enc.gz
    β”‚   β”œβ”€β”€ mulearabic-0.enc.gz
    β”‚   β”œβ”€β”€ mulearabic-1.enc.gz
    β”‚   β”œβ”€β”€ mulearabic-2.enc.gz
    β”‚   β”œβ”€β”€ mulelao-1.enc.gz
    β”‚   β”œβ”€β”€ suneu-greek.enc.gz
    β”‚   β”œβ”€β”€ tcvn-0.enc.gz
    β”‚   β”œβ”€β”€ tis620-2.enc.gz
    β”‚   └── viscii1.1-1.enc.gz
    β”œβ”€β”€ kanjistrokeorders
    β”‚   └── KanjiStrokeOrders.ttf
    β”œβ”€β”€ misc
    β”‚   β”œβ”€β”€ 10x20-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 10x20-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 10x20.pcf.gz
    β”‚   β”œβ”€β”€ 12x13ja.pcf.gz
    β”‚   β”œβ”€β”€ 12x24.pcf.gz
    β”‚   β”œβ”€β”€ 12x24rk.pcf.gz
    β”‚   β”œβ”€β”€ 18x18ja.pcf.gz
    β”‚   β”œβ”€β”€ 18x18ko.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 4x6-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 4x6.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 5x7-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 5x7.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 5x8-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 5x8.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x10-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 6x10.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x12-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 6x12.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x13-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 6x13.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x13B.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x13O.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 6x9-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 6x9.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 7x13-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 7x13.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 7x13B.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 7x13O.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-JISX0201.1976-0.pcf.gz
    β”‚   β”œβ”€β”€ 7x14-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 7x14.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 7x14B.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 8x13-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 8x13.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 8x13B.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 8x13O.pcf.gz
    β”‚   β”œβ”€β”€ 8x16.pcf.gz
    β”‚   β”œβ”€β”€ 8x16rk.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 9x15-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 9x15.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 9x15B.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-11.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 9x18-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ 9x18.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ 9x18B.pcf.gz
    β”‚   β”œβ”€β”€ arabic24.pcf.gz
    β”‚   β”œβ”€β”€ clB6x10.pcf.gz
    β”‚   β”œβ”€β”€ clB6x12.pcf.gz
    β”‚   β”œβ”€β”€ clB8x10.pcf.gz
    β”‚   β”œβ”€β”€ clB8x12.pcf.gz
    β”‚   β”œβ”€β”€ clB8x13.pcf.gz
    β”‚   β”œβ”€β”€ clB8x14.pcf.gz
    β”‚   β”œβ”€β”€ clB8x16.pcf.gz
    β”‚   β”œβ”€β”€ clB8x8.pcf.gz
    β”‚   β”œβ”€β”€ clB9x15.pcf.gz
    β”‚   β”œβ”€β”€ clI6x12.pcf.gz
    β”‚   β”œβ”€β”€ clI8x8.pcf.gz
    β”‚   β”œβ”€β”€ clR4x6.pcf.gz
    β”‚   β”œβ”€β”€ clR5x10.pcf.gz
    β”‚   β”œβ”€β”€ clR5x6.pcf.gz
    β”‚   β”œβ”€β”€ clR5x8.pcf.gz
    β”‚   β”œβ”€β”€ clR6x10.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-1.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-10.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-13.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-14.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-15.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-16.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-2.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-3.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-4.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-5.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-7.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-8.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-ISO8859-9.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12-KOI8-R.pcf.gz
    β”‚   β”œβ”€β”€ clR6x12.pcf.gz
    β”‚   β”œβ”€β”€ clR6x13.pcf.gz
    β”‚   β”œβ”€β”€ clR6x6.pcf.gz
    β”‚   β”œβ”€β”€ clR6x8.pcf.gz
    β”‚   β”œβ”€β”€ clR7x10.pcf.gz
    β”‚   β”œβ”€β”€ clR7x12.pcf.gz
    β”‚   β”œβ”€β”€ clR7x14.pcf.gz
    β”‚   β”œβ”€β”€ clR7x8.pcf.gz
    β”‚   β”œβ”€β”€ clR8x10.pcf.gz
    β”‚   β”œβ”€β”€ clR8x12.pcf.gz
    β”‚   β”œβ”€β”€ clR8x13.pcf.gz
    β”‚   β”œβ”€β”€ clR8x14.pcf.gz
    β”‚   β”œβ”€β”€ clR8x16.pcf.gz
    β”‚   β”œβ”€β”€ clR8x8.pcf.gz
    β”‚   β”œβ”€β”€ clR9x15.pcf.gz
    β”‚   β”œβ”€β”€ cu-alt12.pcf.gz
    β”‚   β”œβ”€β”€ cu-arabic12.pcf.gz
    β”‚   β”œβ”€β”€ cu-devnag12.pcf.gz
    β”‚   β”œβ”€β”€ cu-lig12.pcf.gz
    β”‚   β”œβ”€β”€ cu-pua12.pcf.gz
    β”‚   β”œβ”€β”€ cu12.pcf.gz
    β”‚   β”œβ”€β”€ cuarabic12.pcf.gz
    β”‚   β”œβ”€β”€ cudevnag12.pcf.gz
    β”‚   β”œβ”€β”€ cursor.pcf.gz
    β”‚   β”œβ”€β”€ deccurs.pcf.gz
    β”‚   β”œβ”€β”€ decsess.pcf.gz
    β”‚   β”œβ”€β”€ fonts.alias
    β”‚   β”œβ”€β”€ fonts.dir
    β”‚   β”œβ”€β”€ fonts.scale
    β”‚   β”œβ”€β”€ gb16fs.pcf.gz
    β”‚   β”œβ”€β”€ gb16st.pcf.gz
    β”‚   β”œβ”€β”€ gb24st.pcf.gz
    β”‚   β”œβ”€β”€ hanglg16.pcf.gz
    β”‚   β”œβ”€β”€ hanglm16.pcf.gz
    β”‚   β”œβ”€β”€ hanglm24.pcf.gz
    β”‚   β”œβ”€β”€ jiskan16.pcf.gz
    β”‚   β”œβ”€β”€ jiskan24.pcf.gz
    β”‚   β”œβ”€β”€ k14.pcf.gz
    β”‚   β”œβ”€β”€ micro.pcf.gz
    β”‚   β”œβ”€β”€ nil2.pcf.gz
    β”‚   β”œβ”€β”€ olcursor.pcf.gz
    β”‚   β”œβ”€β”€ olgl10.pcf.gz
    β”‚   β”œβ”€β”€ olgl12.pcf.gz
    β”‚   β”œβ”€β”€ olgl14.pcf.gz
    β”‚   └── olgl19.pcf.gz
    └── util
    β”œβ”€β”€ map-ISO8859-1
    β”œβ”€β”€ map-ISO8859-10
    β”œβ”€β”€ map-ISO8859-11
    β”œβ”€β”€ map-ISO8859-13
    β”œβ”€β”€ map-ISO8859-14
    β”œβ”€β”€ map-ISO8859-15
    β”œβ”€β”€ map-ISO8859-16
    β”œβ”€β”€ map-ISO8859-2
    β”œβ”€β”€ map-ISO8859-3
    β”œβ”€β”€ map-ISO8859-4
    β”œβ”€β”€ map-ISO8859-5
    β”œβ”€β”€ map-ISO8859-6
    β”œβ”€β”€ map-ISO8859-7
    β”œβ”€β”€ map-ISO8859-8
    β”œβ”€β”€ map-ISO8859-9
    β”œβ”€β”€ map-JISX0201.1976-0
    └── map-KOI8-R
    10 directories, 676 files
    Thanks again
    Last edited by mibadt (2014-06-01 09:10:42)

    Short answer: Try installing either or both 'extra/xorg-fonts-100dpi' and 'extra/xorg-fonts-75dpi'.
    Long answer:
    I don't have efax-gtk installed. The efax program is looking for a font using the old XLFD system of font names, so the fontconfig methods for checking fonts may not be useful to determine where the problem lies. The program xlsfonts, from 'extra/xorg-xlsfonts', is needed to search for a match to the font name.
    I have a Helvetica font installed; your results may differ, or you may get no results from xlsfonts.
    $ # Find the font names that match.
    $ xlsfonts -fn "-*-Helvetica-Medium-R-Normal--*-140-*-*-P-*-ISO8859-1"
    -adobe-helvetica-medium-r-normal--14-140-75-75-p-77-iso8859-1
    -adobe-helvetica-medium-r-normal--19-140-100-100-p-0-iso8859-1
    -adobe-helvetica-medium-r-normal--19-140-100-100-p-0-iso8859-1
    -adobe-helvetica-medium-r-normal--20-140-100-100-p-100-iso8859-1
    $ # Find the directories and one of the files for the fonts named above.
    $ grep -r -m1 '\-adobe-helvetica-medium-r-normal' /usr/share/fonts/**/fonts.dir
    /usr/share/fonts/100dpi/fonts.dir:helvR08-ISO8859-1.pcf.gz -adobe-helvetica-medium-r-normal--11-80-100-100-p-56-iso8859-1
    /usr/share/fonts/75dpi/fonts.dir:helvR08-ISO8859-1.pcf.gz -adobe-helvetica-medium-r-normal--8-80-75-75-p-46-iso8859-1
    $ # Find the packages containing the files.
    $ pkgfile helvR08-ISO8859-1.pcf.gz
    extra/xorg-fonts-100dpi
    extra/xorg-fonts-75dpi
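
    For anyone puzzling over the dashes in that pattern: an XLFD name encodes fourteen dash-separated fields (foundry, family, weight, slant, and so on), and the `140` is the point size in tenths of a point, which is why the matches above are all 14pt Helvetica faces. A minimal Python sketch of splitting one of the matched names into its fields (the field order follows the XLFD convention; this is just an illustration, not part of the fix):

    ```python
    # Split an XLFD font name (X Logical Font Description) into its
    # fourteen named fields.
    XLFD_FIELDS = [
        "foundry", "family", "weight", "slant", "setwidth", "addstyle",
        "pixel_size", "point_size", "res_x", "res_y", "spacing",
        "avg_width", "registry", "encoding",
    ]

    def parse_xlfd(name):
        # An XLFD name starts with '-'; drop it, then split on the rest.
        parts = name.lstrip("-").split("-")
        if len(parts) != len(XLFD_FIELDS):
            raise ValueError("not a full XLFD name: %r" % name)
        return dict(zip(XLFD_FIELDS, parts))

    info = parse_xlfd("-adobe-helvetica-medium-r-normal--14-140-75-75-p-77-iso8859-1")
    print(info["point_size"])          # 140 -> 14.0pt (point size is in decipoints)
    print(info["res_x"], info["res_y"])  # 75 75 -> this is the 75dpi variant
    ```

    Note the empty sixth field (the `--` in the name) is the "additional style" slot, which most bitmap fonts leave blank.
    
    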

  • Not able to create sub folder using SharePoint web service in large SharePoint document library (Item count view threshold limit)

    We are trying to create folders and subfolders in a SharePoint document library using the default SharePoint (dws) web service. The document library has unique permissions as well as item-level permissions. This was working as expected; once the item count crosses the list view threshold limit (5000), the CreateFolder web method completes with an error even though the folder is actually created in SharePoint.
    Request:
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:dws="http://schemas.microsoft.com/sharepoint/soap/dws/">
       <soapenv:Header/>
       <soapenv:Body>
          <dws:CreateFolder>
             <!--Optional:-->
             <dws:url>Shared Documents/VenTest02092015v1</dws:url>
          </dws:CreateFolder>
       </soapenv:Body>
    </soapenv:Envelope>
     Response:
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
       <soap:Body>
          <CreateFolderResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/dws/">
             <CreateFolderResult>&lt;Error ID="2">Failed&lt;/Error></CreateFolderResult>
          </CreateFolderResponse>
       </soap:Body>
    </soap:Envelope>
     While trying to create a subfolder under the folder created above, the service throws a FolderNotFound exception,
    even though we are able to create the subfolder successfully from the SharePoint UI.
    Request
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:dws="http://schemas.microsoft.com/sharepoint/soap/dws/">
       <soapenv:Header/>
       <soapenv:Body>
          <dws:CreateFolder>
             <!--Optional:-->
             <dws:url>Shared Documents/VenTest02092015v1/REQ-1</dws:url>
          </dws:CreateFolder>
       </soapenv:Body>
    </soapenv:Envelope>
    Response:
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
       <soap:Body>
          <CreateFolderResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/dws/">
             <CreateFolderResult>&lt;Error ID="10">FolderNotFound&lt;/Error></CreateFolderResult>
          </CreateFolderResponse>
       </soap:Body>
    </soap:Envelope>

    Yes, you're probably hitting the 5,000 list item threshold (
    http://sharepoint.stackexchange.com/questions/105937/overcoming-5000-file-document-library-limits ). I assume you can do it via the UI because you're probably logged in as an admin, in which case, if I remember correctly, the threshold is 20,000 items. You can extend
    this limit, but you probably shouldn't.
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • Purpose of GC Threshold

    What is the purpose of the GC threshold setting feature in WLS 7.X? JVM GC
    documentation and other GC related information resources suggest this is at
    best pointless and probably degrading to system performance. We are talking
    specifically of a threshold trigger versus a distributed synchronous GC.

  • T520: Slice battery charging thresholds stuck

    I bought a T520 (4240) a few months ago and, like most people around here, had the same issues with pulsing fans and, on applying BIOS 1.28, CPU throttling. I'm not sure if the latter is fixed properly yet, but I also have a problem with the battery charging thresholds that has me stumped.
    I have the 55++ and 27++ 9-cell main and "slice" batteries. For a while I let them charge up to 100% whenever connected to AC, but then I read that a 100% charge level is not recommended for day-to-day operation.
    I use Fedora 15 Linux 99.9% of the time on my T520. I wanted to set the battery charge thresholds to avoid the 100% charge level, but the "tp_smapi" Linux driver doesn't work for the T520 (yet). I read somewhere that if I were to set the charge thresholds with the Lenovo Power Manager in Windows 7, that they would work in Linux as long as the laptop was not completely powered down when switching from Windows to Linux (i.e., use a warm boot). Well, it worked. In fact it worked really well: I could even power down and remove the batteries and something, somewhere would still remember that I wanted to start charging the slice battery when it dropped below 35% and stop charging it at 80%, even when running Fedora 15.
    Having used the machine for a few weeks, I found that I used it on battery power a lot and liked to "top up" the batteries overnight. The 35/80 thresholds didn't really suit my working pattern. So, back in the Power Manager in Windows, I changed the battery thresholds from 35/80 to 50/88 for both batteries (55++ and 27++). On booting back to Linux, I noted that the main battery would respect the new levels, but the slice battery was still stuck on 35/80. Back in Windows, I tried changing the levels in different ways, but the slice battery refused to charge to more than 80%, even under Windows, no matter what I did. It seems that my first change from the default (is it 96/100?) to 35/80 has gotten stuck somewhere and I cannot change it again.
    I even notice in the Power Manager that the slice battery constantly toggles between charging and not charging when under AC power if I set the threshold to 85/90. Because the 80% level where the battery is stuck is lower than 85% (my new "start" threshold), charging kicks in, but it immediately stops again, as it won't go over 80%, then, of course, it starts again and stops again, and so on.
    How do I fully reset the charge thresholds on the slice battery so I can exceed 80% charge?

    I had kind of given up on this, but thank you KooMaster for your confirmation that I am not alone with this issue. I don't think this is a problem with the Lenovo or Windows software, as this happens to me when running Linux without either of those. My guess is that the problem is in the embedded controller (EC) firmware. It is responsible for charging the batteries and enforcing the charging thresholds (that's why they still work even when the machine is turned off).
    It probably makes sense to charge each battery to 80% first, as it likely only takes half of the overall charge time to get to that level (the last 20% taking as long again). That way, we get both batteries to a useful charge level as fast as possible. However, based on your observations, it seems that someone in Lenovo forgot to tell the EC to check that the main battery has reached its upper charge threshold, not that it has reached 100%, before topping the slice battery up to its upper charging threshold.
    Can someone in Lenovo please comment on this issue, as I raised it over seven months ago now and I haven't heard a single word from any of you about it? "KooMaster" has confirmed the problem on a different model with different batteries, so it seems highly likely that a quick check of the (shared?) EC firmware code and a patch will resolve this quite quickly.
    (I updated my BIOS and EC firmware today and it made no difference to my problem.)

  • Image Analysis Threshold won't work, program freezes

    I am a new user to LabVIEW and I probably have an error in my code, but the only problem is that there are no error messages: the code runs up to a point at which the program freezes and I can't click any buttons on the Front Panel. Basically everything works up to when I want to choose the threshold of my image; at this stage it creates an image from the values in the threshold buttons, but I can't adjust the values anymore to find the best ones.
    I should say that I have tried my Threshold code on its own and it seems to work fine so I am really confused as to what is getting muddled up!
    Any help on what might be wrong would be helpful. I have added the VI and an image that I am analyzing.
    The process currently goes as follows: you load the image, then choose a colour plane (usually green) that gets extracted. Then you draw a line on the border of the object and it rotates the image to make it straight; then you choose an ROI and it cuts the main object out. Finally, at the point when you should be choosing a threshold, it hangs.
    Thanks for any help!
    Solved!
    Go to Solution.
    Attachments:
    Image adjustment version 4a.vi ‏143 KB
    Sample 2.jpg ‏2359 KB

    Hi,
    The root of your problem is in how you have used multiple event structures. Typically, a program should just have one event structure, which handles all events during the program. The vision functions were all working correctly!
    From looking at what you are trying to do, I think it would be worth using a well-known architecture called the state machine with events. I'll attach a code framework to show you what it might look like; you can adapt it to suit your program. It doesn't have any code in it yet, but I think it gets the idea across, and if you combine it with the link below on state machines, it may help you to set out your code in a better architecture.
    A few links you may find helpful:
    Event programming: http://www.ni.com/white-paper/3331/en
    Caveats for using events: http://zone.ni.com/reference/en-XX/help/371361J-01/lvhowto/caveatsrecmndtnsevnts/
    State machine: http://www.ni.com/white-paper/2926/en
    let me know if you have any more questions.
    Ian S
    Applications Engineer CLD
    National Instruments UK&Ireland
    Attachments:
    Event state machine example.vi ‏14 KB
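    A LabVIEW diagram can't be shown in plain text, but the "state machine with events" pattern described above can be sketched in Python (the state names and queue-based event handling below are illustrative assumptions, not taken from the attached VI):

```python
from queue import Queue, Empty

# Hypothetical states mirroring the image-processing workflow in the question.
STATES = ("load_image", "extract_plane", "rotate", "select_roi", "threshold", "done")

def run_state_machine(events: Queue):
    """Single loop: dequeue UI events and advance through processing states.

    In LabVIEW this corresponds to one while loop containing one event
    structure plus a case structure keyed on the current state; having a
    single event structure avoids the front-panel lock-ups that multiple
    competing event structures can cause.
    """
    state = "load_image"
    history = []
    while state != "done":
        try:
            event = events.get(timeout=0.1)  # e.g. a button press
        except Empty:
            continue
        history.append((state, event))
        # Advance to the next state on a "next" event; stay in the current
        # state otherwise, so e.g. threshold sliders can keep firing
        # value-change events while the user experiments.
        if event == "next":
            state = STATES[STATES.index(state) + 1]
    return history
```

    The key point, whatever the language: one loop, one place where events are received, and a state selector deciding what each event means, so UI controls stay responsive in every state.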

  • Two Threshold Analog to Digital

    I was asked to develop some code that would take in an analog signal, convert it to digital, then perform frequency, duty cycle, and signal integrity testing on it. The built-in NI functions for performing these tasks were insufficient because we needed to be able to detect a single dropped cycle. With a real-world signal I realize there may be noise, and a single threshold for the analog-to-digital conversion may show transitions that aren't there, so I planned on developing some kind of debounce code.
    Instead someone mentioned using two thresholds, one for the low and one for the high, and to only consider the signal transitioning if it goes above the high after going below the low.
    Attached is my attempt at that method. This VI simulates a sine wave with a bunch of noise, then does a single threshold to show how imperfect it can be. Then, using that same signal, it does a two-level threshold, which works much better but has a slight shift in the time domain, and the beginning will contain unknown values because neither transition has occurred at the first sample.
    Any pointers or suggestions to improve my implementation are appreciated. Thanks.
    EDIT: This does use an OpenG function from the Array palette.
    Unofficial Forum Rules and Guidelines - Hooovahh - LabVIEW Overlord
    If 10 out of 10 experts in any field say something is bad, you should probably take their opinion seriously.
    Solved!
    Go to Solution.
    Attachments:
    Test AI to Digital With 2 Levels.vi ‏72 KB
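    The two-threshold idea described above is the software analogue of a hardware Schmitt trigger (hysteresis). A minimal sketch in Python, with made-up threshold and sample values; the attached VI is the actual implementation:

```python
def hysteresis_digitize(samples, low, high, initial=False):
    """Convert an analog sample stream to booleans using two thresholds.

    The output only goes high after the signal rises above `high`, and only
    goes low after it falls below `low`; values in between keep the previous
    state, which suppresses the noise-induced chatter a single threshold
    produces. Until the first crossing, the state is `initial` (the
    "unknown values" at the start mentioned in the post).
    """
    state = initial
    out = []
    for x in samples:
        if x >= high:
            state = True
        elif x <= low:
            state = False
        out.append(state)
    return out
```

    With low=0.2 and high=0.7, a brief dip from 0.8 down to 0.55 and back would not register as a transition, because the signal never drops below the low threshold.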

    Why so many loops when you just need one?
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
    Attachments:
    Test AI to Digital With 2 Levels.png ‏76 KB

  • Setting thresholds for cpu governor with TLP?

    I used to have a set of scripts with every power saving trick I ever collected that would run upon un/plugging my laptop. But today I decided to play with TLP because it's all the rage these days.
    It's really nice! It seems to hit more stuff than Laptop Mode Tools did, which is awesome. My powertop is reporting under 11 watts, which is about as good as I can ever get with this laptop.
    The one thing I'm missing, though, is I like to switch to conservative governor when on battery. I also change the up and down thresholds to 98 and 95, respectively as per this probably dated article: http://www.ibm.com/developerworks/libra … index.html .
    Anyway, I have TLP set to switch to performance when plugged in and conservative when on battery. Is there any way to get TLP to also set the thresholds when the computer is unplugged? Any kind of "run this script upon unplugging" option? I think Laptop Mode Tools had that, so that would be the one thing which I'd say is better about LMT if TLP doesn't also have that.

    Hello,
    I would assign 8 CPUs to affinity mask (CPU) and 8 CPUs to I/O affinity mask per instance.
    Please read about why you should not assign the both to the same schedulers on the following post.
    http://blogs.msdn.com/b/psssql/archive/2010/11/19/how-it-works-io-affinity-mask-should-i-use-it.aspx
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • How to change Self Monitoring Thresholds in SOLMAN 7.1 SP8?

    Hello Expert,
    I'm wondering if there is an option to change the Self Monitoring thresholds in SOLMAN 7.1 SP8? I'm getting a lot of SOLMAN Self Monitoring alert emails, so I would like to adjust the thresholds (if this is possible at all).
    Your help is always appreciated.
    Thanks & regards,
    MM

    Hi Roland,
    Thanks a lot for advice!
    Sorry, how are you getting to this screen where you can create a custom template?
    Somehow I can't find any option to create a custom template in step 3.2 of the Self-Mon in my system.
    When I press 'Configure and Activate' button it's just activating the Self-Mon, without any option to create a custom template.
    In Alerting Directory Browser I can see SAP Standard Template for Self-Mon,...
    ...but I'm struggling to find where I can create the custom template as a copy of this standard one?
    Probably I'm missing something.
    I'm on SP08.
    Your help is much appreciated.
    Best,
    MM

  • Assigning discount threshold to a position in org. model

    Hello,
    I need to assign discount thresholds (as a percentage value) to a position in the org. model to facilitate the discount approval process for quotations. Can this be done by using the attributes section of the position? Please provide your inputs.
    Thanks,
    Aditya Mishra

    Hi Aditya,
    not exactly what you are looking for, but probably you can make use of http://help.sap.com/saphelp_crm700_ehp01/helpdata/EN/45/7d36e37962694ee10000000a155369/content.htm or http://help.sap.com/saphelp_crm700_ehp01/helpdata/EN/45/bff319692f52f5e10000000a1553f7/content.htm.
    Best Regards,
    Michael

  • How do you write an image threshold to a file

    Hi, I use IMAQ Threshold to threshold an RGB image. How do I write this image to a file? Whenever I use the IMAQ Write JPG or Write PNG VI and try to view the file with the Windows image viewer, all I see is a black image. Does anyone know how to save a binary image?

    If you cast it to an eight-bit image and save it as a BMP or JPEG, does it show up correctly? If so, the problem probably lies in the fact that a 16-bit image in LabVIEW is signed, while third-party viewers usually assume that it is unsigned: instead of pixel values going from -32768 to 32767, they are read as going from 0 to 65535. There are example programs on ni.com to get around this. One is called "Mapping a 16-bit Image to an 8-bit Image," another is called "Bit Shift an IMAQ Vision 12-bit Image and Resave for External Viewer." If you search for these titles in the ni.com search bar, you should be able to find them.
    Kyle V
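    The signed-versus-unsigned point can be illustrated with a generic remap in Python (this is not the code from the NI examples named above, just the same idea):

```python
def i16_to_u8(pixels):
    """Map signed 16-bit pixel values (-32768..32767) linearly to 0..255.

    Offset into the unsigned range first (add 32768), then keep the top
    8 bits, so the full signed range compresses into one byte. A viewer
    reading the raw signed data as unsigned instead wraps negative values
    to the top of the range, which is why the saved image looks wrong.
    """
    return [((p + 32768) >> 8) & 0xFF for p in pixels]
```

    For example, i16_to_u8([-32768, 0, 32767]) gives [0, 128, 255]: the full signed range mapped into one byte.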

  • Repair vs replace option when MBR is probably bad?

    I could use some assistance getting the ThinkVantage & F11 buttons working again. The Fn and ThinkVantage buttons work correctly when booted into Windows, as does the R&R software, but the machine will not go into the recovery partition when booting.
    The blue ThinkVantage button will only show a "Startup Interrupt Menu" with three options: pressing ESC to resume startup normally, F1 to enter the BIOS, or F12 to load the boot menu.
    I ran the MBR repair floppy (the CD ISO I found reported errors on download & burn with 2 different programs), but it said that the MBR might not be from Lenovo and apparently did nothing. Should I try the replace option instead of repair? What will that do, and will it affect my disk partitions?
    I did repartition the C drive into multiple C, D, & E drives, but did not change or move the service partition, which I can see in disk management. This was all working until I booted the machine with an old drive in the SATA Ultrabay. Windows went wacko, getting drive letters confused and kind of merging the boot from both drives. I got that fixed, and the preboot environment is all that's left to resolve.
    Thanks a lot

    Everyone is different, and sometimes people have sentimental attachments to systems that override what makes economic sense to most people... Most techs I know use a 50% threshold. If repairing is going to cost more than 50% of a new system, it's probably not worth fixing.
    USUALLY in these cases systems are 2-3 years old or more, so people can at least take comfort in the fact that they're getting a better overall system. In your case, you don't have that, so the threshold for repair vs replace should maybe be more around 75%. But from some of the liquid damaged systems I've seen, and estimates I've written up... I don't think you'll have that big of an issue making a decision. Bare minimum they'll probably have to replace the logic board. There's also a good chance the display is damaged, and replacing the battery after liquid damage is generally a good idea, just to be safe. There's a good chance the repair costs would exceed the cost of a new system.
    And for the record, you did about all you could given the circumstances. The damage was done the moment the liquid came in contact with the circuitry. There's really nothing you can do with liquid spills except clean things up as best you can and hope for the best.

  • Thresholding an image in order to determine areas

    Hey everyone,
    I'm somewhat new to Labview and I've only recently gotten the Vision Development Module and the Vision Acquisition software. It's extremely new to me and I'm at a loss. I searched around and couldn't seem to find exactly what I'm looking for, or maybe I didn't understand it enough to know. I'm really just looking for something to get me started.
    I want to create a VI that thresholds an image that is input from a previous VI I have already created, in order to determine the area (number of pixels) the objects take up in the picture. The picture will have several metal objects on top of a colored background and will later be compared to another similar image.
    What would be the best way to threshold this? Based on the colored background (most likely blue or green)? What I picture is for this to turn the image into black and white and then count the white pixels and group them by each object.
    Attached are the image capturing VI which uses a webcam to take a picture and a test VI to try to figure out what thresholding is about.
    Thanks in advance for the help.
    Attachments:
    ImageCaptureVI.vi ‏61 KB
    TestThesholdingVI.vi ‏22 KB

    Hello,
    if your background is blue, for example, you could extract the opposite (complementary) color plane, i.e. red, to reinforce the contrast before thresholding:
    (BEFORE and AFTER screenshots of the red-plane extraction are not reproduced here.)
    You can then manually or auto-threshold. Manual thresholding needs to know the threshold values. The thresholding part of your code is (probably) not correct.
    Thresholding the image will convert it to binary no matter what values you specify, but the output depends greatly on them. The equation for the LabVIEW threshold function is (check the documentation to be sure):
    IF f(u,v) >= Tlow AND f(u,v) <= Thigh, then f(u,v) = 1, otherwise f(u,v) = 0
    where f(u,v) is the value of the pixel at location (u,v), Tlow is the lower threshold, and Thigh is the upper threshold.
    The range input specifies these two threshold values. After you threshold the image, you can use morphology operations to reduce noisy data, fill holes, etc., and then use, for example, particle analysis (on the binary image). If you use Count and Measure Objects, you do not need to threshold the image beforehand, since the function does this for you, again by specifying the threshold value(s).
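    Written out as code, the in-range rule quoted above (a generic sketch, not the IMAQ implementation) looks like this:

```python
def threshold(image, t_low, t_high):
    """Binarize an image: pixels inside [t_low, t_high] become 1, the rest 0.

    This follows the rule quoted in the post: f(u,v) = 1 iff
    t_low <= f(u,v) <= t_high (check the IMAQ documentation for the exact
    behaviour of the LabVIEW function). `image` is a list of pixel rows.
    """
    return [[1 if t_low <= p <= t_high else 0 for p in row] for row in image]
```

    For example, threshold([[10, 120, 200]], 80, 200) returns [[0, 1, 1]]: only the in-range pixels become 1.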
    Hope this helps.
    Best regards,
    K
    https://decibel.ni.com/content/blogs/kl3m3n
    "Kudos: Users may give one another Kudos on the forums for posts that they found particularly helpful or insightful."
