Normalizer transformation limitations

Hi, I have the below queries pertaining to limitations in the Normalizer transformation: 1. Is there a specific reason why the Normalizer transformation does not support the Date datatype? 2. Also, why are only a limited set of datatypes supported in the Normalizer transformation — only String, Nstring, and Number are available? 3. Why can't the Normalizer transformation be used within a mapplet? Kindly help out with these queries. Thanks & Regards, Nelrick


Similar Messages

  • Using a stored procedure transformation as an active transformation.

    hi,
    i have a requirment, i need to take start date and end date as an inpute and then the output must the number of days falling in each month between the given dates.the problem is that i want to use a stored procedure.but a SP transformation is a passive transformation.
    i.e for one row of inpute there may be n rows of output.
    plz tell me how to handel this in INFORMATICA SP transformation.

    I'm working on Oracle 9i release 9 and Informatica PowerCenter 8.
    The flow goes like this:
    A Router is used to divide the incoming rows according to the start_date and end_date data. If both dates fall in the same month, those rows follow one route; if the start date and end date fall in different months, the rows are sent to a Custom transformation.
    The Custom transformation calls a procedure, and the procedure divides the incoming data into different segments.
    E.g., if start_date = '24-jan-2008' and end_date = '12-apr-2008',
    then the procedure must divide the output as:
    month=jan number of days=7
    month=feb number of days=29
    month=march number of days=31
    month=april number of days=12
    Therefore, a single row of data coming into the transformation gets split into 4 rows.
    The output of this transformation is then merged with the first output of the Router and loaded into a table.
    The problem is that I don't have any idea how to go forward with this approach. Is this approach feasible? Can anyone suggest a better way to do this?
    The data volume is very large, so I can't use a Normalizer transformation.
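For reference, the month-by-month day counts described above (remaining days of the start month, full middle months, then days of the end month up to end_date) can be sketched outside the mapping. A minimal Python illustration of the splitting logic — the function name and structure are mine, not from the post — assuming the Router has already guaranteed that start and end fall in different months:

```python
from datetime import date
import calendar

def days_per_month(start: date, end: date):
    """Split a date range into ((year, month), day_count) rows, using
    the counting from the post: the first month contributes the days
    remaining after start_date, middle months contribute their full
    length, and the last month contributes the days up to end_date.
    Assumes start and end fall in different months (the same-month
    case takes the other Router route)."""
    rows = []
    y, m = start.year, start.month
    # first month: days remaining after start_date
    rows.append(((y, m), calendar.monthrange(y, m)[1] - start.day))
    while True:
        m += 1
        if m > 12:
            y, m = y + 1, 1
        if (y, m) == (end.year, end.month):
            break
        # full middle month
        rows.append(((y, m), calendar.monthrange(y, m)[1]))
    # last month: days up to and including end_date
    rows.append(((end.year, end.month), end.day))
    return rows
```

On the Oracle side, one common way to get this row-multiplying behavior is a pipelined table function that the mapping can then select from, which also sidesteps the passive-transformation restriction.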

  • Error in PL/SQL generated package

    Hello,
    With the help of ODM (version 10.2.0.3.1, Build 479) I created an SVM classification model, which works fine.
    After that, I generated the PL/SQL package, which returns an ORA-06512 in the call to DBMS_DATA_MINING.CREATE_MODEL.
    I tried to rebuild the model in ODM and everything worked well.
    I am kind of stuck here and don't know what to do next.
    Thanks,
    Igor

    Hi,
    hope you had a nice vacation.
    As for the code, I have a feeling I'm trying to fill a VARCHAR2 with text larger than its declared length. I created a standard PL/SQL package as defined in the tutorial and the error remained. I wonder how it is that the build process succeeds when using odminer, while the PL/SQL package returns errors?
    Thanks,
    Igor
    CREATE VIEW mining_data_build_v AS
    SELECT
    a.CUST_ID,
    a.CUST_GENDER,
    2003-a.CUST_YEAR_OF_BIRTH AGE,
    a.CUST_MARITAL_STATUS,
    c.COUNTRY_NAME,
    a.CUST_INCOME_LEVEL,
    b.EDUCATION,
    b.OCCUPATION,
    b.HOUSEHOLD_SIZE,
    b.YRS_RESIDENCE,
    b.AFFINITY_CARD,
    b.BULK_PACK_DISKETTES,
    b.FLAT_PANEL_MONITOR,
    b.HOME_THEATER_PACKAGE,
    b.BOOKKEEPING_APPLICATION,
    b.PRINTER_SUPPLIES,
    b.Y_BOX_GAMES,
    b.OS_DOC_SET_KANJI
    FROM
    sh.customers a,
    sh.supplementary_demographics b,
    sh.countries c
    WHERE
    a.CUST_ID = b.CUST_ID
    AND a.country_id = c.country_id
    AND a.cust_id between 101501 and 103000;
    Using the package:
    CREATE PACKAGE "DATAMININGACTIVITY9" AUTHID DEFINER AS
    PROCEDURE "MINING_DATA_BUI666498540_BA"(case_table IN VARCHAR2 DEFAULT '"DMUSER"."MINING_DATA_BUILD_V"',
    additional_table_1 IN VARCHAR2 DEFAULT NULL,
    model_name IN VARCHAR2 DEFAULT 'MINING_DATA_B80870_SV',
    confusion_matrix_name IN VARCHAR2 DEFAULT '"DM4J$T504449054041_M"',
    lift_result_name IN VARCHAR2 DEFAULT '"DM4J$T504449083327_L"',
    roc_result_name IN VARCHAR2 DEFAULT '"DM4J$T504449092305_R"',
    test_metric_name IN VARCHAR2 DEFAULT '"DM4J$MINING_D51278_TM"',
    drop_output IN BOOLEAN DEFAULT FALSE);
    END;
    CREATE PACKAGE BODY "DATAMININGACTIVITY9" AS
    c_long_sql_statement_length CONSTANT INTEGER := 32767;
    SUBTYPE SQL_STATEMENT_TYPE IS VARCHAR2(32767);
    SUBTYPE LONG_SQL_STATEMENT_TYPE IS DBMS_SQL.VARCHAR2A;
    TYPE TABLE_ARRAY is TABLE OF VARCHAR2(62);
    TYPE LSTMT_REC_TYPE IS RECORD (
    lstmt dbms_sql.VARCHAR2A,
    lb BINARY_INTEGER DEFAULT 1,
    ub BINARY_INTEGER DEFAULT 0);
    TYPE LSTMT_REC_TYPE_ARRAY is TABLE OF LSTMT_REC_TYPE;
    TYPE QUERY_ARRAY is TABLE OF SQL_STATEMENT_TYPE;
    TYPE TARGET_VALUES_LIST IS TABLE OF VARCHAR2(32);
    TYPE VALUE_COUNT_LIST IS TABLE OF NUMBER;
    PROCEDURE dump_varchar2a(vc2a dbms_sql.VARCHAR2A) IS
    v_str varchar2(32767);
    BEGIN
    DBMS_OUTPUT.PUT_LINE('dump_varchar2a:');
    FOR i IN 1..vc2a.COUNT LOOP
    v_str := vc2a(i);
    DBMS_OUTPUT.PUT_LINE(v_str);
    END LOOP;
    END;
    PROCEDURE ls_append(
    r_lstmt IN OUT NOCOPY LSTMT_REC_TYPE,
    p_txt VARCHAR2)
    IS
    BEGIN
    r_lstmt.ub := r_lstmt.ub + 1;
    r_lstmt.lstmt(r_lstmt.ub) := p_txt;
    END ls_append;
    PROCEDURE ls_append(
    r_lstmt IN OUT NOCOPY LSTMT_REC_TYPE,
    p_txt LSTMT_REC_TYPE) IS
    BEGIN
    FOR i IN p_txt.lb..p_txt.ub LOOP
    r_lstmt.ub := r_lstmt.ub + 1;
    r_lstmt.lstmt(r_lstmt.ub) := p_txt.lstmt(i);
    END LOOP;
    END ls_append;
    FUNCTION query_valid(
    p_query VARCHAR2) RETURN BOOLEAN
    IS
    v_is_valid BOOLEAN;
    BEGIN
    BEGIN
    EXECUTE IMMEDIATE p_query;
    v_is_valid := TRUE;
    EXCEPTION WHEN OTHERS THEN
    v_is_valid := FALSE;
    END;
    RETURN v_is_valid;
    END query_valid;
    FUNCTION table_exist(
    p_table_name VARCHAR2) RETURN BOOLEAN IS
    BEGIN
    RETURN query_valid('SELECT * FROM ' || dbms_assert.simple_sql_name(p_table_name));
    END table_exist;
    FUNCTION model_exist(
    p_model_name VARCHAR2) RETURN BOOLEAN
    IS
    v_model_cnt NUMBER;
    v_model_exists BOOLEAN := FALSE;
    BEGIN
    SELECT COUNT(*) INTO v_model_cnt FROM DM_USER_MODELS WHERE NAME = UPPER(p_model_name);
    IF v_model_cnt > 0 THEN
    v_model_exists := TRUE;
    END IF;
    --DBMS_OUTPUT.PUT_LINE('model exist: '||v_model_exists);
    RETURN v_model_exists;
    EXCEPTION WHEN OTHERS THEN
    RETURN FALSE;
    END model_exist;
    PROCEDURE drop_table(
    p_table_name VARCHAR2)
    IS
    v_stmt SQL_STATEMENT_TYPE;
    BEGIN
    v_stmt := 'DROP TABLE '||dbms_assert.simple_sql_name(p_table_name)||' PURGE';
    EXECUTE IMMEDIATE v_stmt;
    EXCEPTION WHEN OTHERS THEN
    NULL;
    --DBMS_OUTPUT.PUT_LINE('Failed drop_table: '||p_table_name);
    END drop_table;
    PROCEDURE drop_view(
    p_view_name VARCHAR2)
    IS
    v_stmt SQL_STATEMENT_TYPE;
    BEGIN
    v_stmt := 'DROP VIEW '||dbms_assert.simple_sql_name(p_view_name);
    EXECUTE IMMEDIATE v_stmt;
    EXCEPTION WHEN OTHERS THEN
    NULL;
    --DBMS_OUTPUT.PUT_LINE('Failed drop_view: '||p_view_name);
    END drop_view;
    PROCEDURE drop_model(
    p_model_name VARCHAR2)
    IS
    BEGIN
    DBMS_DATA_MINING.DROP_MODEL(p_model_name);
    EXCEPTION WHEN OTHERS THEN
    NULL;
    --DBMS_OUTPUT.PUT_LINE('Failed drop_model: '||p_model_name);
    END drop_model;
    FUNCTION create_new_temp_table_name(prefix IN VARCHAR2, len IN NUMBER)
    RETURN VARCHAR2 IS
    v_table_name VARCHAR2(30);
    v_seed NUMBER;
    BEGIN
    dbms_random.seed(SYS_GUID());
    v_table_name := 'DM$T' || SUBSTR(prefix, 0, 4) || dbms_random.string(NULL, len-8);
    --DBMS_OUTPUT.PUT_LINE('create_new_temp_table_name: '||v_table_name);
    RETURN v_table_name;
    END create_new_temp_table_name;
    FUNCTION create_new_temp_table_name(prefix IN VARCHAR2)
    RETURN VARCHAR2 IS
    BEGIN
    RETURN create_new_temp_table_name(prefix, 30);
    END create_new_temp_table_name;
    FUNCTION ADD_TEMP_TABLE(tempTables IN OUT NOCOPY TABLE_ARRAY, temp_table IN VARCHAR2) RETURN VARCHAR2 IS
    BEGIN
    tempTables.EXTEND;
    tempTables(tempTables.COUNT) := temp_table;
    return temp_table;
    END;
    PROCEDURE DROP_TEMP_TABLES(tempTables IN OUT NOCOPY TABLE_ARRAY) IS
    v_temp VARCHAR2(30);
    BEGIN
    FOR i IN 1..tempTables.COUNT LOOP
    v_temp := tempTables(i);
    drop_table(v_temp);
    drop_view(v_temp);
    tempTables.DELETE(i);
    END LOOP;
    END;
    PROCEDURE CHECK_RESULTS(drop_output IN BOOLEAN,
    result_name IN VARCHAR2) IS
    BEGIN
    -- drop all results if drop = true, otherwise make sure all results don't exist already (raise exception)
    IF result_name IS NOT NULL THEN
    IF drop_output THEN
    drop_table(result_name);
    drop_view(result_name);
    ELSIF (table_exist(result_name)) THEN
    RAISE_APPLICATION_ERROR(-20000, 'Result table exists: '||result_name);
    END IF;
    END IF;
    END;
    PROCEDURE CHECK_MODEL(drop_output IN BOOLEAN,
    model_name IN VARCHAR2) IS
    BEGIN
    -- drop all results if drop = true, otherwise make sure all results don't exist already (raise exception)
    IF model_name IS NOT NULL THEN
    IF drop_output THEN
    drop_model(model_name);
    ELSIF (model_exist(model_name)) THEN
    RAISE_APPLICATION_ERROR(-20001, 'Model exists: '||model_name);
    END IF;
    END IF;
    END;
    PROCEDURE create_table_from_query(query IN OUT NOCOPY LSTMT_REC_TYPE)
    IS
    v_cursor NUMBER;
    v_feedback INTEGER;
    BEGIN
    v_cursor := DBMS_SQL.OPEN_CURSOR;
    DBMS_SQL.PARSE(
    c => v_cursor,
    statement => query.lstmt,
    lb => query.lb,
    ub => query.ub,
    lfflg => FALSE,
    language_flag => dbms_sql.native);
    v_feedback := DBMS_SQL.EXECUTE(v_cursor);
    DBMS_SQL.CLOSE_CURSOR(v_cursor);
    EXCEPTION WHEN OTHERS THEN
    IF DBMS_SQL.IS_OPEN(v_cursor) THEN
    DBMS_SQL.CLOSE_CURSOR(v_cursor);
    END IF;
    RAISE;
    END;
    FUNCTION get_row_count(tableName IN VARCHAR2)
    RETURN INTEGER IS
    v_stmt VARCHAR(100);
    qcount INTEGER := 0;
    BEGIN
    v_stmt := 'SELECT COUNT(*) FROM '|| tableName;
    EXECUTE IMMEDIATE v_stmt INTO qcount;
    RETURN qcount;
    END get_row_count;
    PROCEDURE SET_EQUAL_DISTRIBUTION (
    counts IN OUT VALUE_COUNT_LIST )
    IS
    v_minvalue NUMBER := 0;
    BEGIN
    FOR i IN counts.FIRST..counts.LAST
    LOOP
    IF ( i = counts.FIRST )
    THEN
    v_minvalue := counts(i);
    ELSIF ( counts(i) > 0 AND v_minvalue > counts(i) )
    THEN
    v_minvalue := counts(i);
    END IF;
    END LOOP;
    FOR i IN counts.FIRST..counts.LAST
    LOOP
    counts(i) := v_minvalue;
    END LOOP;
    END SET_EQUAL_DISTRIBUTION;
    PROCEDURE GET_STRATIFIED_DISTRIBUTION (
    table_name VARCHAR2,
    attribute_name VARCHAR2,
    percentage NUMBER,
    attr_values IN OUT NOCOPY TARGET_VALUES_LIST,
    counts IN OUT NOCOPY VALUE_COUNT_LIST,
    counts_sampled IN OUT NOCOPY VALUE_COUNT_LIST )
    IS
    v_tmp_stmt VARCHAR2(4000);
    BEGIN
    v_tmp_stmt :=
    'SELECT /*+ noparallel(t)*/ ' || attribute_name ||
    ', count(*), ROUND ( ( count(*) * ' || percentage || ') / 100.0 ) FROM '|| table_name ||
    ' WHERE ' || attribute_name ||' IS NOT NULL GROUP BY ' || attribute_name;
    EXECUTE IMMEDIATE v_tmp_stmt
    BULK COLLECT INTO attr_values, counts, counts_sampled;
    END GET_STRATIFIED_DISTRIBUTION;
    FUNCTION GENERATE_STRATIFIED_SQL (
    v_2d_temp_view VARCHAR2,
    src_table_name VARCHAR2,
    attr_names TARGET_VALUES_LIST,
    attribute_name VARCHAR2,
    percentage NUMBER,
    op VARCHAR2,
    equal_distribution IN BOOLEAN DEFAULT FALSE) RETURN LSTMT_REC_TYPE
    IS
    v_tmp_lstmt LSTMT_REC_TYPE;
    attr_values_res TARGET_VALUES_LIST;
    counts_res VALUE_COUNT_LIST;
    counts_sampled_res VALUE_COUNT_LIST;
    tmp_str VARCHAR2(4000);
    sample_count PLS_INTEGER;
    BEGIN
    GET_STRATIFIED_DISTRIBUTION(src_table_name, attribute_name, percentage, attr_values_res, counts_res, counts_sampled_res);
    IF ( equal_distribution = TRUE )
    THEN
    SET_EQUAL_DISTRIBUTION(counts_sampled_res);
    END IF;
    v_tmp_lstmt.ub := 0; -- initialize
    ls_append(v_tmp_lstmt, 'CREATE TABLE ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, '( SELECT ');
    FOR i IN attr_names.FIRST..attr_names.LAST
    LOOP
    IF ( i != attr_names.FIRST )
    THEN
    ls_append(v_tmp_lstmt,',');
    END IF;
    ls_append(v_tmp_lstmt, attr_names(i));
    END LOOP;
    ls_append(v_tmp_lstmt, ' FROM (SELECT /*+ no_merge */ t.*, ROWNUM RNUM FROM ' || src_table_name || ' t) WHERE ' );
    FOR i IN attr_values_res.FIRST..attr_values_res.LAST
    LOOP
    IF ( i != attr_values_res.FIRST )
    THEN
    tmp_str := ' OR ';
    END IF;
    IF ( counts_res(i) <= 2 ) THEN
    sample_count := counts_res(i);
    ELSE
    sample_count := counts_sampled_res(i);
    END IF;
    tmp_str := tmp_str ||
    '( ' || attribute_name || ' = ''' || attr_values_res(i) || '''' ||
    ' AND ORA_HASH(RNUM,(' || counts_res(i) || ' -1),12345) ' || op || sample_count || ') ';
    ls_append(v_tmp_lstmt, tmp_str );
    END LOOP;
    ls_append(v_tmp_lstmt, ') ');
    return v_tmp_lstmt;
    END GENERATE_STRATIFIED_SQL;
    PROCEDURE "MINING_DATA_BUI666498540_BA"(case_table IN VARCHAR2 DEFAULT '"DMUSER"."MINING_DATA_BUILD_V"',
    additional_table_1 IN VARCHAR2 DEFAULT NULL,
    model_name IN VARCHAR2 DEFAULT 'MINING_DATA_B80870_SV',
    confusion_matrix_name IN VARCHAR2 DEFAULT '"DM4J$T504449054041_M"',
    lift_result_name IN VARCHAR2 DEFAULT '"DM4J$T504449083327_L"',
    roc_result_name IN VARCHAR2 DEFAULT '"DM4J$T504449092305_R"',
    test_metric_name IN VARCHAR2 DEFAULT '"DM4J$MINING_D51278_TM"',
    drop_output IN BOOLEAN DEFAULT FALSE)
    IS
    additional_data TABLE_ARRAY := TABLE_ARRAY(
    additional_table_1);
    v_tempTables TABLE_ARRAY := TABLE_ARRAY();
    v_2d_view VARCHAR2(30);
    v_2d_view_build VARCHAR2(30);
    v_2d_view_test VARCHAR2(30);
    v_2d_temp_view VARCHAR2(30);
    v_txn_views TABLE_ARRAY := TABLE_ARRAY();
    v_txn_views_build TABLE_ARRAY := TABLE_ARRAY();
    v_txn_views_test TABLE_ARRAY := TABLE_ARRAY();
    v_txn_temp_views TABLE_ARRAY := TABLE_ARRAY();
    v_case_data SQL_STATEMENT_TYPE := case_table;
    v_case_id VARCHAR2(30) := 'DMR$CASE_ID';
    v_tmp_lstmt LSTMT_REC_TYPE;
    v_target_value VARCHAR2(4000) := '1';
    v_num_quantiles NUMBER := 10;
    v_build_data VARCHAR2(30);
    v_test_data VARCHAR2(30);
    v_prior VARCHAR2(30);
    v_build_setting VARCHAR2(30);
    v_apply_result VARCHAR2(30);
    v_build_cm VARCHAR2(30);
    v_test_cm VARCHAR2(30);
    v_accuracy NUMBER;
    v_area_under_curve NUMBER;
    v_avg_accuracy NUMBER;
    v_predictive_confidence NUMBER;
    v_confusion_matrix VARCHAR2(30);
    v_gen_caseId BOOLEAN := FALSE;
    v_2d_txt_view VARCHAR2(30);
    v_content_index VARCHAR2(30);
    v_content_index_pref VARCHAR2(30);
    v_category_temp_table VARCHAR2(30);
    v_term_definitions VARCHAR2(30);
    v_term_final_table VARCHAR2(30);
    v_term_final_table_index VARCHAR2(30);
    v_term_final_table_test VARCHAR2(30);
    pragma autonomous_transaction;
    BEGIN
    CHECK_MODEL(drop_output, model_name);
    CHECK_RESULTS(drop_output, test_metric_name);
    CHECK_RESULTS(drop_output, confusion_matrix_name);
    CHECK_RESULTS(drop_output, lift_result_name);
    CHECK_RESULTS(drop_output, roc_result_name);
    IF (v_gen_caseId) THEN
    v_case_data := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    EXECUTE IMMEDIATE 'CREATE TABLE '||v_case_data||' as SELECT rownum as DMR$CASE_ID, t.* FROM ('||case_table||') t ';
    EXECUTE IMMEDIATE 'ALTER TABLE '||v_case_data||' add constraint '||create_new_temp_table_name('PK')||' primary key (DMR$CASE_ID)';
    END IF;
    ----- Start: Input Data Preparation -----
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt, 'SELECT "CASE_TABLE"."CUST_ID" as "DMR$CASE_ID", TO_CHAR( "CASE_TABLE"."AFFINITY_CARD") AS "AFFINITY_CARD",
    "CASE_TABLE"."AGE" AS "AGE",
    TO_CHAR( "CASE_TABLE"."BOOKKEEPING_APPLICATION") AS "BOOKKEEPING_APPLICATION",
    TO_CHAR( "CASE_TABLE"."BULK_PACK_DISKETTES") AS "BULK_PACK_DISKETTES",
    "CASE_TABLE"."COUNTRY_NAME" AS "COUNTRY_NAME",
    "CASE_TABLE"."CUST_GENDER" AS "CUST_GENDER",
    "CASE_TABLE"."CUST_INCOME_LEVEL" AS "CUST_INCOME_LEVEL",
    "CASE_TABLE"."CUST_MARITAL_STATUS" AS "CUST_MARITAL_STATUS",
    "CASE_TABLE"."EDUCATION" AS "EDUCATION",
    TO_CHAR( "CASE_TABLE"."FLAT_PANEL_MONITOR") AS "FLAT_PANEL_MONITOR",
    TO_CHAR( "CASE_TABLE"."HOME_THEATER_PACKAGE") AS "HOME_THEATER_PACKAGE",
    "CASE_TABLE"."HOUSEHOLD_SIZE" AS "HOUSEHOLD_SIZE",
    "CASE_TABLE"."OCCUPATION" AS "OCCUPATION",
    TO_CHAR( "CASE_TABLE"."OS_DOC_SET_KANJI") AS "OS_DOC_SET_KANJI",
    TO_CHAR( "CASE_TABLE"."Y_BOX_GAMES") AS "Y_BOX_GAMES",
    "CASE_TABLE"."YRS_RESIDENCE" AS "YRS_RESIDENCE" FROM (' || v_case_data || ') CASE_TABLE ');
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view := v_2d_temp_view;
    ----- End: Input Data Preparation -----
    ----- Start: Outlier Treatment Transformation -----
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt, 'SELECT
    "AFFINITY_CARD",
    ( CASE WHEN "AGE" < -1.3 THEN -1.3
    WHEN "AGE" >= -1.3 AND "AGE" <= 79.77 THEN "AGE"
    WHEN "AGE" > 79.77 THEN 79.77
    end) "AGE",
    "BOOKKEEPING_APPLICATION",
    "BULK_PACK_DISKETTES",
    "COUNTRY_NAME",
    "CUST_GENDER",
    "CUST_INCOME_LEVEL",
    "CUST_MARITAL_STATUS",
    "DMR$CASE_ID",
    "EDUCATION",
    "FLAT_PANEL_MONITOR",
    "HOME_THEATER_PACKAGE",
    "HOUSEHOLD_SIZE",
    "OCCUPATION",
    "OS_DOC_SET_KANJI",
    "Y_BOX_GAMES",
    ( CASE WHEN "YRS_RESIDENCE" < -1.7 THEN -1.7
    WHEN "YRS_RESIDENCE" >= -1.7 AND "YRS_RESIDENCE" <= 10 THEN "YRS_RESIDENCE"
    WHEN "YRS_RESIDENCE" > 10 THEN 10
    end) "YRS_RESIDENCE"
    FROM ');
    ls_append(v_tmp_lstmt, v_2d_view);
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view := v_2d_temp_view;
    ----- End: Outlier Treatment Transformation -----
    ----- Start: Missing Values Transformation -----
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt, 'SELECT
    DECODE ( "AGE" , NULL,
    39.23831 , "AGE" ) "AGE" ,
    DECODE ( "YRS_RESIDENCE" , NULL,
    4.128 , "YRS_RESIDENCE" ) "YRS_RESIDENCE" ,
    DECODE ( "BOOKKEEPING_APPLICATION" , NULL,
    ''1'' , "BOOKKEEPING_APPLICATION" ) "BOOKKEEPING_APPLICATION" ,
    DECODE ( "BULK_PACK_DISKETTES" , NULL,
    ''1'' , "BULK_PACK_DISKETTES" ) "BULK_PACK_DISKETTES" ,
    DECODE ( "COUNTRY_NAME" , NULL,
    ''United States of America'' , "COUNTRY_NAME" ) "COUNTRY_NAME" ,
    DECODE ( "CUST_GENDER" , NULL,
    ''M'' , "CUST_GENDER" ) "CUST_GENDER" ,
    DECODE ( "CUST_INCOME_LEVEL" , NULL,
    ''J: 190,000 - 249,999'' , "CUST_INCOME_LEVEL" ) "CUST_INCOME_LEVEL" ,
    DECODE ( "CUST_MARITAL_STATUS" , NULL,
    ''Married'' , "CUST_MARITAL_STATUS" ) "CUST_MARITAL_STATUS" ,
    DECODE ( "EDUCATION" , NULL,
    ''HS-grad'' , "EDUCATION" ) "EDUCATION" ,
    DECODE ( "FLAT_PANEL_MONITOR" , NULL,
    ''1'' , "FLAT_PANEL_MONITOR" ) "FLAT_PANEL_MONITOR" ,
    DECODE ( "HOME_THEATER_PACKAGE" , NULL,
    ''1'' , "HOME_THEATER_PACKAGE" ) "HOME_THEATER_PACKAGE" ,
    DECODE ( "HOUSEHOLD_SIZE" , NULL,
    ''3'' , "HOUSEHOLD_SIZE" ) "HOUSEHOLD_SIZE" ,
    DECODE ( "OCCUPATION" , NULL,
    ''Exec.'' , "OCCUPATION" ) "OCCUPATION" ,
    DECODE ( "OS_DOC_SET_KANJI" , NULL,
    ''0'' , "OS_DOC_SET_KANJI" ) "OS_DOC_SET_KANJI" ,
    DECODE ( "Y_BOX_GAMES" , NULL,
    ''0'' , "Y_BOX_GAMES" ) "Y_BOX_GAMES" ,
    "AFFINITY_CARD",
    "DMR$CASE_ID"
    FROM ');
    ls_append(v_tmp_lstmt, v_2d_view);
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view := v_2d_temp_view;
    ----- End: Missing Values Transformation -----
    ----- Start: Normalize Transformation -----
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt, 'SELECT
    "BOOKKEEPING_APPLICATION",
    "BULK_PACK_DISKETTES",
    "COUNTRY_NAME",
    "CUST_GENDER",
    "CUST_INCOME_LEVEL",
    "CUST_MARITAL_STATUS",
    "EDUCATION",
    "FLAT_PANEL_MONITOR",
    "HOME_THEATER_PACKAGE",
    "HOUSEHOLD_SIZE",
    "OCCUPATION",
    "OS_DOC_SET_KANJI",
    "Y_BOX_GAMES",
    "AFFINITY_CARD",
    "DMR$CASE_ID",
    LEAST(1, GREATEST(0, (ROUND(("AGE" - 17.0) / (79.77 - 17.0),15) * (1.0 - 0.0) + 0.0))) "AGE",
    LEAST(1, GREATEST(0, (ROUND(("YRS_RESIDENCE" - 0.0) / (10.0 - 0.0),15) * (1.0 - 0.0) + 0.0))) "YRS_RESIDENCE"
    FROM ');
    ls_append(v_tmp_lstmt, v_2d_view);
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view := v_2d_temp_view;
    ----- End: Normalize Transformation -----
    ----- Start: Stratified Split Transformation -----
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, GENERATE_STRATIFIED_SQL(v_2d_temp_view, v_2d_view, TARGET_VALUES_LIST('"BOOKKEEPING_APPLICATION"',
    '"BULK_PACK_DISKETTES"',
    '"COUNTRY_NAME"',
    '"CUST_GENDER"',
    '"CUST_INCOME_LEVEL"',
    '"CUST_MARITAL_STATUS"',
    '"EDUCATION"',
    '"FLAT_PANEL_MONITOR"',
    '"HOME_THEATER_PACKAGE"',
    '"HOUSEHOLD_SIZE"',
    '"OCCUPATION"',
    '"OS_DOC_SET_KANJI"',
    '"Y_BOX_GAMES"',
    '"AFFINITY_CARD"',
    '"DMR$CASE_ID"',
    '"AGE"',
    '"YRS_RESIDENCE"'), '"AFFINITY_CARD"', 60, ' < ' ));
    create_table_from_query(v_tmp_lstmt);
    v_2d_view_build := v_2d_temp_view;
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, GENERATE_STRATIFIED_SQL(v_2d_temp_view, v_2d_view, TARGET_VALUES_LIST('"BOOKKEEPING_APPLICATION"',
    '"BULK_PACK_DISKETTES"',
    '"COUNTRY_NAME"',
    '"CUST_GENDER"',
    '"CUST_INCOME_LEVEL"',
    '"CUST_MARITAL_STATUS"',
    '"EDUCATION"',
    '"FLAT_PANEL_MONITOR"',
    '"HOME_THEATER_PACKAGE"',
    '"HOUSEHOLD_SIZE"',
    '"OCCUPATION"',
    '"OS_DOC_SET_KANJI"',
    '"Y_BOX_GAMES"',
    '"AFFINITY_CARD"',
    '"DMR$CASE_ID"',
    '"AGE"',
    '"YRS_RESIDENCE"'), '"AFFINITY_CARD"', 60, ' >= ' ));
    create_table_from_query(v_tmp_lstmt);
    v_2d_view_test := v_2d_temp_view;
    ----- End: Stratified Split Transformation -----
    ----- Start: Mining Data Preparation -----
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt,
    'SELECT caseTable."AFFINITY_CARD"
    , caseTable."AGE"
    , caseTable."BOOKKEEPING_APPLICATION"
    , caseTable."BULK_PACK_DISKETTES"
    , caseTable."COUNTRY_NAME"
    , caseTable."CUST_GENDER"
    , caseTable."CUST_INCOME_LEVEL"
    , caseTable."CUST_MARITAL_STATUS"
    , caseTable."DMR$CASE_ID"
    , caseTable."EDUCATION"
    , caseTable."FLAT_PANEL_MONITOR"
    , caseTable."HOME_THEATER_PACKAGE"
    , caseTable."HOUSEHOLD_SIZE"
    , caseTable."OCCUPATION"
    , caseTable."OS_DOC_SET_KANJI"
    , caseTable."Y_BOX_GAMES"
    , caseTable."YRS_RESIDENCE"
    FROM ('); ls_append(v_tmp_lstmt, v_2d_view_build); ls_append(v_tmp_lstmt, ') caseTable ');
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view_build := v_2d_temp_view;
    v_tmp_lstmt.ub := 0; -- initialize
    v_2d_temp_view := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ls_append(v_tmp_lstmt, 'CREATE VIEW ');
    ls_append(v_tmp_lstmt, v_2d_temp_view);
    ls_append(v_tmp_lstmt, ' AS ');
    ls_append(v_tmp_lstmt, ' ( ');
    ls_append(v_tmp_lstmt,
    'SELECT caseTable."AFFINITY_CARD"
    , caseTable."AGE"
    , caseTable."BOOKKEEPING_APPLICATION"
    , caseTable."BULK_PACK_DISKETTES"
    , caseTable."COUNTRY_NAME"
    , caseTable."CUST_GENDER"
    , caseTable."CUST_INCOME_LEVEL"
    , caseTable."CUST_MARITAL_STATUS"
    , caseTable."DMR$CASE_ID"
    , caseTable."EDUCATION"
    , caseTable."FLAT_PANEL_MONITOR"
    , caseTable."HOME_THEATER_PACKAGE"
    , caseTable."HOUSEHOLD_SIZE"
    , caseTable."OCCUPATION"
    , caseTable."OS_DOC_SET_KANJI"
    , caseTable."Y_BOX_GAMES"
    , caseTable."YRS_RESIDENCE"
    FROM ('); ls_append(v_tmp_lstmt, v_2d_view_test); ls_append(v_tmp_lstmt, ') caseTable ');
    ls_append(v_tmp_lstmt, ' ) ');
    create_table_from_query(v_tmp_lstmt);
    v_2d_view_test := v_2d_temp_view;
    v_build_data := v_2d_view_build;
    v_test_data := v_2d_view_test;
    ----- End: Mining Data Preparation -----
    v_prior := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    EXECUTE IMMEDIATE 'CREATE TABLE ' || v_prior || ' (TARGET_VALUE VARCHAR2(4000), PRIOR_PROBABILITY NUMBER)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_prior || ' VALUES (''0'', 0.25333333333333335)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_prior || ' VALUES (''1'', 0.7466666666666666)';
    COMMIT;
    v_build_setting := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    EXECUTE IMMEDIATE 'CREATE TABLE ' || v_build_setting || ' (setting_name VARCHAR2(30), setting_value VARCHAR2(128))';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''JDMS_TARGET_NAME'', ''"AFFINITY_CARD"'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''SVMS_ACTIVE_LEARNING'', ''SVMS_AL_ENABLE'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''JDMS_FUNCTION_TYPE'', ''CLASSIFICATION'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''ALGO_NAME'', ''ALGO_SUPPORT_VECTOR_MACHINES'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''SVMS_CONV_TOLERANCE'', ''0.0010'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_build_setting || ' VALUES (''CLAS_PRIORS_TABLE_NAME'', :priorTable)' USING v_prior;
    COMMIT;
    -- BUILD MODEL
    DBMS_DATA_MINING.CREATE_MODEL(
    model_name => model_name,
    mining_function => dbms_data_mining.classification,
    data_table_name => v_build_data,
    case_id_column_name => v_case_id,
    target_column_name => 'AFFINITY_CARD',
    settings_table_name => v_build_setting);
    v_test_cm := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    EXECUTE IMMEDIATE 'CREATE TABLE ' || v_test_cm || ' (actual_target_value VARCHAR2(4000), predicted_target_value VARCHAR2(4000), cost NUMBER)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_test_cm || ' VALUES (''0'', ''0'', 0.0)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_test_cm || ' VALUES (''0'', ''1'', 1.0)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_test_cm || ' VALUES (''1'', ''0'', 1.0)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || v_test_cm || ' VALUES (''1'', ''1'', 0.0)';
    COMMIT;
    -- TEST MODEL
    IF (test_metric_name IS NOT NULL) THEN
    -- CREATE APPLY RESULT FOR TEST
    v_apply_result := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    DBMS_DATA_MINING.APPLY(
    model_name => model_name,
    data_table_name => v_test_data,
    case_id_column_name => v_case_id,
    result_table_name => v_apply_result);
    EXECUTE IMMEDIATE 'CREATE TABLE ' || test_metric_name || ' (METRIC_NAME VARCHAR2(30), METRIC_VARCHAR_VALUE VARCHAR2(31), METRIC_NUM_VALUE NUMBER)';
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''MODEL_NAME'', :model)' USING model_name;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''TEST_DATA_NAME'', :test_data)' USING v_test_data;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''MINING_FUNCTION'', ''CLASSIFICATION'')';
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''TARGET_ATTRIBUTE'', :target)' USING 'AFFINITY_CARD';
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''POSITIVE_TARGET_VALUE'', :target_value)' USING v_target_value;
    COMMIT;
    IF confusion_matrix_name IS NULL THEN
    v_confusion_matrix := ADD_TEMP_TABLE(v_tempTables, create_new_temp_table_name('DM$T'));
    ELSE
    v_confusion_matrix := confusion_matrix_name;
    END IF;
    DBMS_DATA_MINING.COMPUTE_CONFUSION_MATRIX (
    accuracy => v_accuracy,
    apply_result_table_name => v_apply_result,
    target_table_name => v_test_data,
    case_id_column_name => v_case_id,
    target_column_name => 'AFFINITY_CARD',
    confusion_matrix_table_name => v_confusion_matrix,
    score_column_name => 'PREDICTION',
    score_criterion_column_name => 'PROBABILITY',
    cost_matrix_table_name => v_test_cm);
    -- DBMS_OUTPUT.PUT_LINE('**** MODEL ACCURACY ****: ' || ROUND(v_accuracy, 4));
    IF (confusion_matrix_name IS NOT NULL) THEN
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_NUM_VALUE) VALUES (''ACCURACY'', :accuracy)' USING v_accuracy;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''CONFUSION_MATRIX_TABLE'', :confusion_matrix_name)' USING confusion_matrix_name;
    COMMIT;
    -- Average Accuracy
    EXECUTE IMMEDIATE '
    WITH
    a as
    (SELECT a.actual_target_value, sum(a.value) recall_total
    FROM ' || confusion_matrix_name || ' a
    group by a.actual_target_value),
    b as
    (SELECT count(distinct b.actual_target_value) num_recalls
    FROM ' || confusion_matrix_name || ' b),
    c as
    (SELECT c.actual_target_value, value
    FROM ' || confusion_matrix_name || ' c
    where actual_target_value = predicted_target_value),
    d as
    (SELECT sum(c.value/a.recall_total) tot_accuracy
    FROM a, c
    where a.actual_target_value = c.actual_target_value)
    SELECT d.tot_accuracy/b.num_recalls * 100 avg_accuracy
    FROM b, d' INTO v_avg_accuracy;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_NUM_VALUE) VALUES (''AVG_ACCURACY'', :avg_accuracy)' USING v_avg_accuracy;
    COMMIT;
    END IF;
    -- Predictive Confidence
    EXECUTE IMMEDIATE '
    WITH
    a as
    (SELECT a.actual_target_value, sum(a.value) recall_total
    FROM ' || v_confusion_matrix || ' a
    group by a.actual_target_value),
    b as
    (SELECT count(distinct b.actual_target_value) num_classes
    FROM ' || v_confusion_matrix || ' b),
    c as
    (SELECT c.actual_target_value, value
    FROM ' || v_confusion_matrix || ' c
    where actual_target_value = predicted_target_value),
    d as
    (SELECT sum(c.value/a.recall_total) tot_accuracy
    FROM a, c
    where a.actual_target_value = c.actual_target_value)
    SELECT (1 - (1 - d.tot_accuracy/b.num_classes) / GREATEST(0.0001, ((b.num_classes-1)/b.num_classes))) * 100
    FROM b, d' INTO v_predictive_confidence;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_NUM_VALUE) VALUES (''PREDICTIVE_CONFIDENCE'', :predictive_confidence)' USING v_predictive_confidence;
    COMMIT;
    IF lift_result_name IS NOT NULL AND v_target_value IS NOT NULL THEN
    DBMS_DATA_MINING.COMPUTE_LIFT (
    apply_result_table_name => v_apply_result,
    target_table_name => v_test_data,
    case_id_column_name => v_case_id,
    target_column_name => 'AFFINITY_CARD',
    lift_table_name => lift_result_name,
    positive_target_value => v_target_value,
    num_quantiles => v_num_quantiles,
    cost_matrix_table_name => v_test_cm);
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''LIFT_TABLE'', :lift_result_name)' USING lift_result_name;
    COMMIT;
    END IF;
    IF roc_result_name IS NOT NULL AND v_target_value IS NOT NULL THEN
    DBMS_DATA_MINING.COMPUTE_ROC (
    roc_area_under_curve => v_area_under_curve,
    apply_result_table_name => v_apply_result,
    target_table_name => v_test_data,
    case_id_column_name => v_case_id,
    target_column_name => 'AFFINITY_CARD',
    roc_table_name => roc_result_name,
    positive_target_value => v_target_value,
    score_column_name => 'PREDICTION',
    score_criterion_column_name => 'PROBABILITY');
    -- DBMS_OUTPUT.PUT_LINE('**** AREA UNDER ROC CURVE ****: ' || v_area_under_curve);
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_VARCHAR_VALUE) VALUES (''ROC_TABLE'', :roc_result_name)' USING roc_result_name;
    EXECUTE IMMEDIATE 'INSERT INTO ' || test_metric_name || ' (METRIC_NAME, METRIC_NUM_VALUE) VALUES (''AREA_UNDER_CURVE'', :v_area_under_curve)' USING v_area_under_curve;
    COMMIT;
    END IF;
    END IF;
    DROP_TEMP_TABLES(v_tempTables);
    EXCEPTION WHEN OTHERS THEN
    DROP_TEMP_TABLES(v_tempTables);
    RAISE;
    END;
    END;
    /
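For reference, the two dynamic queries above reduce to simple arithmetic over the confusion matrix: average accuracy is the mean of the per-class recalls, and predictive confidence rescales that mean against a random-guess baseline. A standalone sketch in Java of the same arithmetic (the class name and the counts are made up for illustration, not taken from the actual model):

```java
import java.util.*;

public class ConfusionMetrics {
    public static void main(String[] args) {
        // Hypothetical confusion-matrix rows: {actual, predicted, count}
        int[][] cm = { {0, 0, 900}, {0, 1, 100}, {1, 0, 120}, {1, 1, 380} };
        Map<Integer, Integer> rowTotals = new HashMap<>();  // cases per actual class
        Map<Integer, Integer> correct = new HashMap<>();    // correct predictions per class
        for (int[] r : cm) {
            rowTotals.merge(r[0], r[2], Integer::sum);
            if (r[0] == r[1]) correct.merge(r[0], r[2], Integer::sum);
        }
        int numClasses = rowTotals.size();
        double totAccuracy = 0;  // sum of per-class recalls, as in CTE "d"
        for (int cls : rowTotals.keySet())
            totAccuracy += (double) correct.getOrDefault(cls, 0) / rowTotals.get(cls);
        double avgAccuracy = totAccuracy / numClasses * 100;
        double predictiveConfidence =
            (1 - (1 - totAccuracy / numClasses)
                 / Math.max(0.0001, (numClasses - 1.0) / numClasses)) * 100;
        System.out.printf(Locale.US, "avg accuracy = %.2f%%, predictive confidence = %.2f%%%n",
                          avgAccuracy, predictiveConfidence);
    }
}
```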

  • Can I make a TexturePaint that is not AffineTransformed?

    I would like to paint shapes with a TexturePaint that remains the same independent of scale. I tried to create a texture with a simple BufferedImage, but the AffineTransform applied to the Graphics2D modifies the image, making the pixels bigger when zooming, which is not my intention.
    I also tried to override the createContext method of TexturePaint to return the same image every time; it works for some shapes but not for others (for complicated shapes). I really do not understand how TexturePaint works.
    Thanks in advance.

    I would prefer another solution.
    The better way is to normalize the transform before using the texture paint and restore the graphics' transform after.
    public void paint(Graphics g) {
      Graphics2D g2d = (Graphics2D) g;
      AffineTransform oldTransform = g2d.getTransform();
      g2d.setTransform(normalTransform);
      g2d.setPaint(yourTexturePaint);
      // paint something
      g2d.setTransform(oldTransform);
    }
    regards
    Stas
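A self-contained variant of this save/reset/restore approach, rendering off-screen so it can run anywhere (the class name FixedTexture and the 2x2 checker texture are made up for illustration):

```java
import java.awt.*;
import java.awt.geom.AffineTransform;
import java.awt.image.BufferedImage;

public class FixedTexture {
    public static void main(String[] args) {
        // 2x2 checker texture: red and blue pixels
        BufferedImage tex = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        tex.setRGB(0, 0, 0xFF0000); tex.setRGB(1, 1, 0xFF0000);
        tex.setRGB(1, 0, 0x0000FF); tex.setRGB(0, 1, 0x0000FF);
        TexturePaint paint = new TexturePaint(tex, new Rectangle(0, 0, 2, 2));

        BufferedImage canvas = new BufferedImage(8, 8, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2d = canvas.createGraphics();
        g2d.scale(4, 4);                          // simulate a zoomed-in view
        AffineTransform old = g2d.getTransform(); // save the zoom
        g2d.setTransform(new AffineTransform());  // paint in device space instead
        g2d.setPaint(paint);
        g2d.fillRect(0, 0, 8, 8);                 // texture pixels stay 1:1, not scaled 4x
        g2d.setTransform(old);                    // restore for subsequent painting
        g2d.dispose();

        // device pixel (1,0) is still the texture's blue pixel, unscaled
        System.out.printf("%06X%n", canvas.getRGB(1, 0) & 0xFFFFFF);
    }
}
```

The key point is that the fill happens while the transform is identity, so the paint's device-space tiling is unaffected by the zoom.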

  • How to escape chars for XML?

    Hi!
    I am generating an XML file.
    Would you please help with escaping the text nodes?
    The following code produces an invalid XML file because of the "?" in the text node.
    Thanks in advance!
    import javax.xml.parsers.*;
    import org.w3c.dom.*;
    import javax.xml.transform.*;
    import javax.xml.transform.dom.*;
    import javax.xml.transform.stream.*;
    import java.io.*;

    public void writeXML(OutputStream os) throws Exception {
        Document docXML;
        Element root;
        DocumentBuilderFactory docFactory = DocumentBuilderFactory.newInstance();
        docFactory.setValidating(true);
        DocumentBuilder docBuilder = docFactory.newDocumentBuilder();
        docXML = docBuilder.newDocument();
        root = docXML.createElement("ROOT");
        root.appendChild(docXML.createTextNode("Text?"));
        docXML.appendChild(root);
        docXML.normalize();
        Transformer transformer = TransformerFactory.newInstance().newTransformer();
        transformer.transform(new DOMSource(docXML), new StreamResult(new OutputStreamWriter(os)));
    }

    Thanks.
    I am sorry!
    The Java code was escaping characters correctly, but the encoding at the top of the generated XML file was wrong; it generates
    <?xml version="1.0" encoding="UTF-8" ?>
    and, since I need European characters, I require
    <?xml version="1.0" encoding="ISO-8859-1" ?>
    How should I change the encoding of the generated XML file?
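One way to do this, sketched under the same JAXP Transformer setup as above: the output encoding can be set through OutputKeys.ENCODING before calling transform() (the class name EncodingDemo is made up for illustration):

```java
import java.io.ByteArrayOutputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class EncodingDemo {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("ROOT");
        root.appendChild(doc.createTextNode("Text?"));
        doc.appendChild(root);

        Transformer t = TransformerFactory.newInstance().newTransformer();
        // Ask the serializer to emit ISO-8859-1 in the XML declaration
        t.setOutputProperty(OutputKeys.ENCODING, "ISO-8859-1");
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        t.transform(new DOMSource(doc), new StreamResult(os));
        System.out.println(os.toString("ISO-8859-1"));
    }
}
```

Note that passing a byte stream (rather than an OutputStreamWriter) to StreamResult lets the transformer control the byte encoding itself, consistent with the declaration it writes.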

  • JAXP DOM reading and writing issues

    import org.xml.sax.*;
    import org.w3c.dom.*;
    import javax.xml.parsers.*;
    import javax.xml.transform.*;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import java.io.*;

    public class Test
    {
        public static void main(String[] args) throws Exception
        {
            DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
            dbf.setIgnoringElementContentWhitespace(true);
            DocumentBuilder db = dbf.newDocumentBuilder();
            read(db, "xml_in.xml");
            create(db, "xml_out.xml");
        }

        public static void read(DocumentBuilder db, String fileName)
        {
            Document d = null;
            try
            {
                d = db.parse(new File(fileName));
            }
            catch (IOException ex)
            {
                ex.printStackTrace();
                return;
            }
            catch (SAXException ex)
            {
                ex.printStackTrace();
                return;
            }
            Node n = d.getDocumentElement();
            System.out.println("Name of the root element: " + n.getNodeName());
            NodeList nl = d.getElementsByTagName("*");
            System.out.println("Number of element:" + nl.getLength());
            System.out.println();
            nl = d.getElementsByTagName("user");
            System.out.println("Length:" + nl.getLength());
            for (int i = 0; i < nl.getLength(); i++)
            {
                System.out.println("id: " + nl.item(i).getAttributes().getNamedItem("id").getNodeValue());
                System.out.println("name: " + nl.item(i).getFirstChild().getTextContent());
            }
        }

        public static void create(DocumentBuilder db, String fileName) throws Exception
        {
            Document d = db.newDocument();
            d.appendChild(d.createComment("This is comment"));
            Element ele_root = d.createElement("root");
            d.appendChild(ele_root);
            Element ele_temp;
            ele_temp = d.createElement("sub");
            ele_temp.setAttribute("id", "1");
            ele_temp.appendChild(d.createTextNode("data"));
            ele_root.appendChild(ele_temp);
            ele_temp = d.createElement("sub");
            ele_temp.setAttribute("id", "2");
            ele_root.appendChild(ele_temp);
            // adding node
            NodeList nl = d.getElementsByTagName("sub");
            Element ele_parent = (Element) nl.item(0).getParentNode();
            ele_temp = d.createElement("sub");
            ele_temp.setAttribute("id", "3");
            ele_parent.appendChild(ele_temp);
            d.normalize();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            // t.setOutputProperty(OutputKeys.METHOD, "test");
            t.transform(new DOMSource(d), new StreamResult(new File(fileName)));
        }
    }

    xml_in.xml
    <?xml version = "1.0" ?>
    <user-detail>
         <user     id = "1"><name>user1</name><age>10</age></user>
         <user     id = "2">
              <name>user2</name>
              <age>20</age>
         </user>
         <user     id = "3">
              <name>user3</name>
              <age>30</age>
         </user>
    </user-detail>
    The result from read(db, "xml_in.xml"):
    >
    Name of the root element: user-detail
    Number of element:10
    Length:3
    id: 1
    name: user1
    id: 2
    name:
    id: 3
    name:
    >
    both names for id 2 and 3 are missing because of the whitespace; how do I remove the whitespace to avoid this problem?
    The result from create(db,"xml_out.xml"):
    <?xml version="1.0" encoding="UTF-8" standalone="no"?><!--This is comment--><root><sub id="1">data</sub><sub id="2"/><sub id="3"/></root>
    How do I format it nicely as below?
    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <!--This is comment-->
    <root>
         <sub id="1">data</sub>
         <sub id="2"/>
         <sub id="3"/>
    </root>
    thanks~

    Removing the whitespace is the wrong approach. Instead you should realize that the creators of that XML can put in whitespace wherever they feel like it, and more importantly, that the whitespace is also part of the document. In particular it forms text nodes which are children of the element in which they are located, just as element nodes are children.
    So your strategy of assuming that the "name" element will be the first child of the "user" element is incorrect. When there is whitespace before the "name" element, that whitespace will be the first child. So your strategy should be to get the "name" element which is a child of the "user" element.
    And by the way, calling the "normalize" method of the DOM won't do anything to affect that. As usual Ram manages to provide un-useful information.
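Following the advice above, a self-contained sketch (the class name ChildByName is made up for illustration) that looks up the "name" child element of each "user" instead of assuming it is the first child, so surrounding whitespace text nodes no longer matter:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class ChildByName {
    public static void main(String[] args) throws Exception {
        // "user" element with whitespace before "name", as in xml_in.xml
        String xml = "<user-detail><user id=\"2\">\n"
                   + "  <name>user2</name>\n  <age>20</age>\n</user></user-detail>";
        Document d = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        NodeList users = d.getElementsByTagName("user");
        for (int i = 0; i < users.getLength(); i++) {
            Element user = (Element) users.item(i);
            // look up the "name" child element rather than using getFirstChild()
            Node name = user.getElementsByTagName("name").item(0);
            System.out.println("name: " + name.getTextContent());
        }
    }
}
```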

  • Multitrack - Adobe Audition 2.0

    I am using Multitrack to make a montage of song clips (Bounce to New Track).
    Obviously each track varies in volume slightly. When I create a Mixdown, AA 2.0 seems to do some form of normalization or limiting. Looking at the Mixdown some of the tracks are louder than the originals. I am not using any gain or any of the effects.
    Why does it do this, and is there any way to turn this off? All I want is the Mixdown - I do not want AA 2.0 to do anything to the volume of the tracks when it creates the Mixdown.
    Thank you in advance!
    Message was edited by: DJPT-UK

    DJPT-UK wrote:
    Obviously each track varies in volume slightly. When I create a Mixdown, AA 2.0 seems to do some form of normalization or limiting. Looking at the Mixdown some of the tracks are louder than the originals. I am not using any gain or any of the effects.
    Why does it do this, and is there any way to turn this off? All I want is the Mixdown - I do not want AA 2.0 to do anything to the volume of the tracks when it creates the Mixdown.
    Audition doesn't do any sort of anything that you haven't told it to - whether you realised it at the time you were doing it or not. There is no form of normalization or limiting applied to any tracks at all unless it's specifically selected to be there, so whatever you are doing, it's categorically not Audition's fault - or we'd have noticed this a long time ago, believe me. But since you are using Bounce to New Track, rather than a mixdown, and that uses a straightforward dB addition process, it's inevitable that results will be a little strange - it's completely the wrong tool to use.
    The one thing that you will notice very directly when using bounce to new track is that there will be a -3dB level shift in the new track, because of the way that the panning control works. If you don't want that to happen, then go to Edit>Preferences>Multitrack and select Left/Right Logarithmic for your stereo panning mode. And if you make this change, you will have to restart Audition completely before it takes effect (bold because for some reason, people keep overlooking this step). This setting applies to the entire mixer, incidentally.
    Anyway, if you want your montage to sound correct, then the thing to do is to do a proper mixdown - having checked that it actually sounds okay in Multitrack View. Because then, you'll get what you want. So you go to File>Export>Audio Mixdown and select the options you want - in this case, from the master output, and then name a file to save it to. When you click okay, the file will be created and automatically open up in Edit View.

  • Normalize Block: How to Implement Transformation Function

    Under data transformation > scale and reduce is an option called normalize. It has a second output point called transformation function (of type ITransformDotNet). I assumed this was used in a similar way to the quantize block - that is, you apply it to the training data and then use the function to apply it to new test data. I am not finding any way to apply the normalization and don't see much reference to ITransformDotNet.
    Thanks,
    Steve

    This is now supported; see this announcement.

  • Limiting rotation of Transform to one axis at a time.

    I have a situation where I want to force a transform returned from a picking tool to only rotate around or translate along a single axis. To do this, I'm using the transformChanged() function shown below. It works fine for the translation changes, but has problems in the rotation case. If the rotations are about one axis only it's OK, but when rotations are combined (say 30 degrees on X, then 60 degrees on Y), the object flips around in strange ways.
    I understand that my approach to the rotation limit was too simplistic, but none of the other approaches I've tried has worked either. I'm very new to Java3D programming and have run out of ideas; can anyone else suggest something to try?
    // Callback from Picking tools.
    public void transformChanged(int type, TransformGroup tg)
    {
        if ((selObj != null) && (tg != null))
        {
            // Get the current Transform3D
            Transform3D current = new Transform3D();
            tg.getTransform(current);
            // Get the current transform vector
            Quat4f currQuat = new Quat4f();
            Vector3f currVect = new Vector3f();
            current.get(currQuat, currVect);
            double currScale = current.getScale();
            System.out.println("transformChanged - scale = " + current.getScale());
            // Set global current variables
            curPosition = currVect;
            curQuat4f = currQuat;
            // Only allow movement on the chosen axis
            if (moveX)
            {
                if (type == PickingCallback.TRANSLATE)
                {
                    currVect.setY(oldPosition.getY());
                    currVect.setZ(oldPosition.getZ());
                }
                else if (type == PickingCallback.ROTATE)
                {
                    currQuat.setY(oldQuat4f.getY());
                    currQuat.setZ(oldQuat4f.getZ());
                }
            }
            else if (moveY)
            {
                if (type == PickingCallback.TRANSLATE)
                {
                    currVect.setX(oldPosition.getX());
                    currVect.setZ(oldPosition.getZ());
                }
                else if (type == PickingCallback.ROTATE)
                {
                    currQuat.setX(oldQuat4f.getX());
                    currQuat.setZ(oldQuat4f.getZ());
                }
            }
            else if (moveZ)
            {
                if (type == PickingCallback.TRANSLATE)
                {
                    currVect.setX(oldPosition.getX());
                    currVect.setY(oldPosition.getY());
                }
                else if (type == PickingCallback.ROTATE)
                {
                    currQuat.setX(oldQuat4f.getX());
                    currQuat.setY(oldQuat4f.getY());
                }
            }
            // Set old global variables.
            oldPosition = currVect;
            oldQuat4f = currQuat;
            // Set the rotation and position back into the transform.
            current.set(currQuat, currVect, (float) currScale);
            // Set the Transform3D back into the TransformGroup
            try
            {
                tg.setTransform(current);
            }
            catch (BadTransformException btex)
            {
                System.out.println(btex);
                System.out.println("current transform = \n" + current);
            }
        }
    }

    Edited by: billphil on Mar 17, 2008 10:56 AM

    thanks!
    I have tried using a file as a shared resource, and I can now check whether an instance of my application is already running. However, I can't make the currently running application window display automatically after the check.

    A running application could listen on a port, so you could tell it to display itself using TCP/IP. This would require some additional code in your app, but that shouldn't be a real problem.
    If your shared resource is already a port, then you could use this port. And if it is a file, you could add some additional information to this file, like the port number. In this case you could avoid having to use a predefined port (which could already be used by a completely different app, even though it is not very likely) by using an anonymous ServerSocket (via "new ServerSocket(0)" and "getLocalPort()").
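A minimal sketch of the anonymous-port idea (the class name AnonymousPortDemo is hypothetical): bind a ServerSocket to port 0 so the OS picks a free port, then ask the socket which port it got so the number can be written to the shared lock file.

```java
import java.net.ServerSocket;

public class AnonymousPortDemo {
    public static void main(String[] args) throws Exception {
        // Port 0 asks the OS to assign any free port; getLocalPort()
        // reveals the assignment so a second instance can connect to it.
        try (ServerSocket ss = new ServerSocket(0)) {
            int port = ss.getLocalPort();
            System.out.println("listening on port " + port);
            // In a real app: write `port` into the shared lock file, then
            // accept() connections and raise the main window on contact.
        }
    }
}
```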

  • How to convert a limit test result from an HP8753E frequency analyzer into a text file with LabVIEW

    Hello,
    I am using LabVIEW to control an Agilent HP 8753E analyzer to run cable limit tests. The result is identified by the status register bytes B, so how do I convert the test result into a text file?

    Hello,
    The "Write to Spreadsheet File" VI saves the data to a TXT file.
    Regards,
    Mart G

  • Is there some place where I can submit a ticket? It seems FF limits DOM elements' width and height to 10,000,000px, since it transforms this number into scientific notation; e.g. 10,000,000px becomes 1e+7px, which obviously can't be interpreted. Cheers

    I've reproduced this issue on FF 3.6.13 on Windows XP and Mac OS X 10.6.8.
    On FF 3.6.13 the pixel limit seems lower, at exactly 8,388,607.

    The 12" PB internals are a bit more complex, for PBs. If you don't want to replace the hard drive yourself or pay someone to install it, you could always get an external FireWire hard drive, and use it to boot from and for general usage. It would have to be FireWire, since the PB won't boot from a USB device. One example of what you could get is a 160GB external hard drive: http://eshop.macsales.com/item/Other%20World%20Computing/MS4U5160GB8/ . All the choices with that case are listed here: http://eshop.macsales.com/shop/firewire/on-the-go
    Have you called any Apple Authorized Service Providers to see what they would charge to install a drive for you, whether you bought it or they supplied it? You can find a local one in the US at http://www.apple.com/buy/locator/service/

  • GA limitation

    Hi,
    We can check the GA limitations document, but I could not understand the following sentences. I would be very happy if you could explain the details; answers to any of them, not necessarily all, are welcome.
    Report Designer: Formatting of frames (color, line styles and sizes), definition of cell – frame margin
    what are the frames?
    General: List calculation: Normalization according to not visible values
    Web Application Designer: Pattern Wizard in WAD
    Modal Windows cannot be displayed in Firefox.
    what are modal windows?
    Wizard to customize button item
    Using the standard Excel Commentary function to create BI documents
    Search Objects in Portal in Open/Save Dialogs in Web Analyzer
    what is the Object in portal?
    Check referential integrity as part of transformation
    Kind regards,
    Masa

    Report Designer: Formatting of frames (color, line styles and sizes), definition of cell – frame margin
    what are the frames?
    - This is related to border color and width control.
    General: List calculation: Normalization according to not visible values
    - Normalization is a calculation option. If you want to include suppressed values that aren't displayed as part of your normalization, you can now do this.
    Web Application Designer: Pattern Wizard in WAD
    - This is new functionality. There will be a wizard to help create UI patterns.
    Modal Windows cannot be displayed in Firefox.
    what are modal windows?
    - Modal windows are popup dialogs...
    Wizard to customize button item
    - This gives you a step-by-step guided approach to call commands that get executed when you hit a button.
    Using the standard Excel Commentary function to create BI documents
    - Excel allows you to create comments on a cell. There will be tighter integration between these comments and KM documents.
    Search Objects in Portal in Open/Save Dialogs in Web Analyzer
    what is the Object in Portal?
    - The object is any BICS object (query, 3rd party data source, or InfoProvider).

  • Primitive ABAP editor in start/end routines in transformations

    When editing or viewing ABAP code in BI transformations, for example in a start routine, the editor that opens is very primitive compared to the normal SE38 editor. Some of the limitations include:
    The editor window doesn't cover the whole screen, with seemingly no way to increase its size.
    The syntax check doesn't show on which line syntax errors are located.
    There is no option to perform an extended program check.
    There is no way to insert breakpoints (other than with the ABAP keyword, of course).
    These limitations are present regardless of whether I choose the new front-end editor, the old front-end editor or the back-end editor. We're running SAP NetWeaver 2004s.
    It is of course possible to create a program in SE38 and copy-paste your start routine code to see the code in the "real" editor, but this is very tiresome and time consuming. Is there a way to make this editor look and behave like the normal editor? I have looked through the setting options and searched SDN without finding a way.

    Hi,
    You just need to change a setting to open the start, end, and characteristic routines using the old editor you are comfortable with. No need to go to SE38 and copy the program.
    Go to SE38 -> Utilities -> Settings -> ABAP Editor -> Editor tab -> select the old ABAP editor.
    To put a breakpoint specifically in transformations (start routine, end routine...), go to the transformation (RSA1) and display it.
    Then go to Extras (menu) -> Generated Program, search for START_ROUTINE (now a method) and put a breakpoint in the desired place.
    Then from the DTP enable all 4 breakpoints in the transformation (this option appears when you change it to debug mode simulation). Then you can debug the transformation.
    The new editor is a good, handy one, but it takes some time to get acquainted with it. After that you may start liking it :).
    Cheers,
    -J

  • Is there a way to transform multiple colors?

    I have a project where I've been asked to allow a user to choose between multiple color themes of a character. This object is complex in that it has multiple colors, and each color needs to change differently based on the theme's color scheme.
    Is there a way to take a movieclip with multiple colors and apply a process to it that will detect different colors and transform them accordingly, without taking each individual color, making a movie clip from it, then modifying the movieclip's colors? Each character has a pretty limited color palette, so this process would only have to find and then change maybe 3 or 4 colors. But the character in question is hand animated and has a LOT of frames, so I'm hoping there's a chance that I can handle this programmatically rather than doing it all by hand with a few hundred extra movieclips to make color changes to. Any help is greatly appreciated!

    Hi, and thanks for the help.
    This is the first time I've looked at the paletteMap method. I see in the documentation:
    paletteMap(sourceBitmapData:BitmapData, sourceRect:Rectangle, destPoint:Point, redArray:Array = null, greenArray:Array = null, blueArray:Array = null, alphaArray:Array = null):void
    Parameters:
    sourceBitmapData:BitmapData — The input bitmap image to use. The source image can be a different BitmapData object, or it can refer to the current BitmapData instance.
    sourceRect:Rectangle — A rectangle that defines the area of the source image to use as input.
    destPoint:Point — The point within the destination image (the current BitmapData object) that corresponds to the upper-left corner of the source rectangle.
    redArray:Array (default = null) — If redArray is not null, red = redArray[source red value] else red = source red value.
    greenArray:Array (default = null) — If greenArray is not null, green = greenArray[source green value] else green = source green value.
    blueArray:Array (default = null) — If blueArray is not null, blue = blueArray[source blue value] else blue = source blue value.
    alphaArray:Array (default = null) — If alphaArray is not null, alpha = alphaArray[source alpha value] else alpha = source alpha value.
    I understand the first 4 parameters, but the color and alpha arrays have me confused. How would I use these to, for instance, find all the pure blue pixels in a bitmap (0x000000FF) and change those pixels' colors to some other color, then find all the pure red pixels (0x00FF0000) and change them to yet another color? Fortunately I don't have to deal with alpha in this current dilemma . . .
    One other question before I look into this method as a solution: is it possible to programmatically convert vector-based artwork into a bitmap to use this paletteMap feature? All the artwork I need to change colors on is hand-drawn animation using the native Flash drawing tools. Thanks again for any additional help!
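For what it's worth, the lookup that the documentation describes is a per-channel table: each channel's 0-255 value indexes its array, and the looked-up entries (each a full 32-bit value) are combined to form the output pixel. A language-neutral sketch of that arithmetic in Java (the class and method names are made up; identity tables leave a pixel unchanged, and overriding one entry recolors every pixel whose channel has that value):

```java
public class PaletteMapSketch {
    // Mimics the per-channel lookup paletteMap describes: each channel
    // value (0-255) indexes its replacement array, and the four looked-up
    // 32-bit entries are summed to produce the output ARGB pixel.
    static int remap(int argb, int[] redArray, int[] greenArray, int[] blueArray) {
        int a = (argb >>> 24) & 0xFF;
        int r = (argb >>> 16) & 0xFF;
        int g = (argb >>> 8) & 0xFF;
        int b = argb & 0xFF;
        return (a << 24) + redArray[r] + greenArray[g] + blueArray[b];
    }

    public static void main(String[] args) {
        // Identity tables: each channel maps back to its own position.
        int[] red = new int[256], green = new int[256], blue = new int[256];
        for (int i = 0; i < 256; i++) {
            red[i] = i << 16;
            green[i] = i << 8;
            blue[i] = i;
        }
        // Redirect a blue channel value of 0xFF to pure green.
        blue[255] = 0x00FF00;
        int result = remap(0xFF0000FF, red, green, blue); // opaque pure blue
        System.out.printf("%08X%n", result);
    }
}
```

Note the caveat this exposes: the lookup keys on individual channel values, not whole colors, so redirecting blue[255] would also affect e.g. magenta pixels (which also have blue = 0xFF). With a small, known palette you typically design the three tables together so each target color is hit by a unique channel combination.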
