Table Creation on the fly

Hi,
I have to run 50 reports on the last day of every month. Each query does a lot of time-consuming dynamic calculations. The whole point is to create an Excel file for each report, but the application throws a request time-out error if a request runs for more than a few minutes. So we decided to generate the data for each of the 50 reports up front and store it in a temporary table, so that the application will not time out.
Details:
Oracle 10g.
.NET
# of records in each table: > 700k
In order to accomplish this we have to create temporary tables. The question is: for every report, do we have to create a table on the fly, OR create one base table with roughly 100 columns (the combined set of columns selected across all 50 queries), where every row carries a report id?
OR is there a better way to accomplish the above?
thanks in advance

Thank you. The only reason I wasn't going for a view is that I won't be able to have indexes if I want to add a filter while exporting to Excel;
otherwise we should be good with materialized views.
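Just to illustrate the materialized-view route: a materialized view is a real segment, so ordinary b-tree indexes can be created on it for the export-time filters. A rough sketch, with made-up report, table and column names (report_01_sales, region_id, etc. are placeholders, not objects from the actual system):
{code}
-- Hypothetical monthly report wrapped in a materialized view.
create materialized view mv_report_01
  build deferred
  refresh complete on demand
as
select region_id,
       product_id,
       sum(amount) as total_amount
from   report_01_sales
group  by region_id, product_id;

-- Ordinary index on the MV to support filtered exports to Excel.
create index mv_report_01_region_idx on mv_report_01 (region_id);

-- Refresh once, on the last day of the month, before the 50 exports run.
begin
  dbms_mview.refresh(list => 'MV_REPORT_01', method => 'C');
end;
/
{code}
One MV (or staging table) per report keeps the columns meaningful and lets each report be indexed for its own filters, which is usually easier to manage than a single 100-column table keyed by report id.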

Similar Messages

  • Dynamic Table Creation with the sortable collection model in the bean.

    Hi all,
    My requirement: I want to create the table dynamically with a set of rows. This set of rows is held in the bean in a sortable variable
    (private SortableModel viewDeleteCollectionModel)
    Issue: "_No Data to Display_" is coming in the Dynamic Table. Whereas when i checked with the collection model row count it is showing the correct no of rows.
    Code for Dynamic Table in JSFF:
    <af:table varStatus="rowStat" value="#{ViewHistoryDelete.viewDeleteCollectionModel}"
    rows="#{ViewHistoryDelete.viewDeleteCollectionModel.rowCount}"
    rowSelection="single" width="99%" var="row" id="t2" summary=" " >
    <af:forEach items="#{ViewHistoryDelete.viewDeleteColumnNames}" var="name">
    <af:column headerText="#{name}" sortable="true" sortProperty="#{name}" id="pt_c1">
    <af:inputText value="#{row[name]}" label="#{row[name]}" readOnly="true" id="it2"/>
    </af:column>
    </af:forEach>
    </af:table>
    Bean Code:
    Variable declaration: private SortableModel viewDeleteCollectionModel;
    Assigning rows to the collection model:
    viewDeleteCollectionModel = new SortableModel(new ArrayList<Map<String, Object>>());
    ((List<Map<String, Object>>) viewDeleteCollectionModel.getWrappedData()).addAll(viewDeletedData);
    viewHistoryDelete.setViewDeleteCollectionModel(viewDeleteCollectionModel);
    After this I checked the table row count with a System.out.println and it returns the actual number of rows, but the rows are not shown on the screen.
    Any solution?
    reg,
    bakkia.

    Did you find a solution to this problem?

  • Master Data on the fly

    Hello team,
    I want to create a tool for master data creation on the fly.
    I have found some documents about this, but they are all specific to a single dimension.
    I want to create a standard tool which can work for any dimension.
    I used EPM functions to capture the environment, model, dimension name and dimension property in the Excel sheet, but I am unable to find a way to send these values to the script of my Data Manager package.
    Can anyone help me send the data from the Excel sheet to the script of the Data Manager package?
    Thanks & regards
    Ronit Kumar

    In the How-To document "Master Data on the Fly" a custom process chain was used with REPLACEPARAMx_KEY and REPLACEPARAMx_VALUE.
    It's better to use the standard chain and pass just two variables:
    First variable: DIMNAME
    Second variable: a string with ID, description, properties, etc. (some syntax has to be defined)
    Both variables will be processed by the BAdI to create the new master data!
    Vadim

  • Inserting into a table which is created "on the fly" from a trigger

    Hello all,
    I am trying to insert into a table from a trigger in Oracle Forms. The table name, however, is entered by the user in a form item.
    here is what the insert looks like:
    insert into :table_name
    values (:value1, :value2);
    The problem is that Forms does not recognize :table_name. If I replace :table_name with an actual database table name, it works fine. However, I need to insert into a table whose name comes from the Oracle Forms item.
    By the way, the table is built on the fly using a procedure before I try to insert into it.
    Any suggestion on how I can do that? My code in the trigger is:
    declare
    dm_drop_tbl(:table_name, 'table'); -- a call to an external procedure to drop the table
    dm_create_tbl(:table_name,'att1','att2');
    insert into :table_name
    values (:value1, :value2);
    This gives me an error:
    encountered the symbol "" when expecting one of the following ...

    Hi,
    You should use the FORMS_DDL built-in procedure. Read the online Forms documentation for details.
    Simon
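    For reference, a rough, untested sketch of what the trigger code could look like with FORMS_DDL, reusing the item and procedure names from the original post (and assuming value1/value2 hold character data):
    {code}
    declare
      v_stmt varchar2(2000);
    begin
      -- Existing helper procedures from the original post.
      dm_drop_tbl(:table_name, 'table');
      dm_create_tbl(:table_name, 'att1', 'att2');
      -- FORMS_DDL executes a literal SQL string, so the table name and the
      -- values have to be concatenated in; bind references are not allowed
      -- inside the string itself.
      v_stmt := 'insert into ' || :table_name ||
                ' values (''' || :value1 || ''', ''' || :value2 || ''')';
      forms_ddl(v_stmt);
      if not form_success then
        message('Insert into ' || :table_name || ' failed');
      end if;
    end;
    {code}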

  • Bad file is not created during the external table creation.

    Hello Experts,
    I have created a script for an external table in an Oracle 10g DB. Everything is working fine except that it does not create the bad file, though it does create the log file. I can't figure out what the issue is; because of this my shell script is failing and the entire program fails. I am attaching the table creation script, the shell script where it is referenced, and the error. Kindly let me know if something is missing. Thanks in advance.
    Table Creation Script:
    create table RGIS_TCA_DATA_EXT
    (
    guid VARCHAR2(250),
    badge VARCHAR2(250),
    scheduled_store_id VARCHAR2(250),
    parent_event_id VARCHAR2(250),
    event_id VARCHAR2(250),
    organization_number VARCHAR2(250),
    customer_number VARCHAR2(250),
    store_number VARCHAR2(250),
    inventory_date VARCHAR2(250),
    full_name VARCHAR2(250),
    punch_type VARCHAR2(250),
    punch_start_date_time VARCHAR2(250),
    punch_end_date_time VARCHAR2(250),
    event_meet_site_id VARCHAR2(250),
    vehicle_number VARCHAR2(250),
    vehicle_description VARCHAR2(250),
    vehicle_type VARCHAR2(250),
    is_owner VARCHAR2(250),
    driver_passenger VARCHAR2(250),
    mileage VARCHAR2(250),
    adder_code VARCHAR2(250),
    bonus_qualifier_code VARCHAR2(250),
    store_accuracy VARCHAR2(250),
    store_length VARCHAR2(250),
    badge_input_type VARCHAR2(250),
    source VARCHAR2(250),
    created_by VARCHAR2(250),
    created_date_time VARCHAR2(250),
    updated_by VARCHAR2(250),
    updated_date_time VARCHAR2(250),
    approver_badge_id VARCHAR2(250),
    approver_name VARCHAR2(250),
    orig_guid VARCHAR2(250),
    edit_type VARCHAR2(250)
    )
    organization external
    (
    type ORACLE_LOADER
    default directory ETIME_LOAD_DIR
    access parameters
    (
    RECORDS DELIMITED BY NEWLINE
    BADFILE ETIME_LOAD_DIR:'tstlms.bad'
    LOGFILE ETIME_LOAD_DIR:'tstlms.log'
    READSIZE 1048576
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL(
    GUID
    ,BADGE
    ,SCHEDULED_STORE_ID
    ,PARENT_EVENT_ID
    ,EVENT_ID
    ,ORGANIZATION_NUMBER
    ,CUSTOMER_NUMBER
    ,STORE_NUMBER
    ,INVENTORY_DATE char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,FULL_NAME
    ,PUNCH_TYPE
    ,PUNCH_START_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,PUNCH_END_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,EVENT_MEET_SITE_ID
    ,VEHICLE_NUMBER
    ,VEHICLE_DESCRIPTION
    ,VEHICLE_TYPE
    ,IS_OWNER
    ,DRIVER_PASSENGER
    ,MILEAGE
    ,ADDER_CODE
    ,BONUS_QUALIFIER_CODE
    ,STORE_ACCURACY
    ,STORE_LENGTH
    ,BADGE_INPUT_TYPE
    ,SOURCE
    ,CREATED_BY
    ,CREATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,UPDATED_BY
    ,UPDATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,APPROVER_BADGE_ID
    ,APPROVER_NAME
    ,ORIG_GUID
    ,EDIT_TYPE
    )
    )
    location (ETIME_LOAD_DIR:'tstlms.dat')
    )
    reject limit UNLIMITED;
    Shell Script:
    version=1.0
    umask 000
    DATE=`date +%Y%m%d%H%M%S`
    TIME=`date +"%H%M%S"`
    SOURCE=`hostname`
    fcp_login=`echo $1|awk '{print $3}'|sed 's/"//g'|awk -F= '{print $2}'`
    fcp_reqid=`echo $1|awk '{print $2}'|sed 's/"//g'|awk -F= '{print $2}'`
    TXT1_PATH=/home/ac1/oracle/in/tsdata
    TXT2_PATH=/home/ac2/oracle/in/tsdata
    ARCH1_PATH=/home/ac1/oracle/in/tsdata
    ARCH2_PATH=/home/ac2/oracle/in/tsdata
    DEST_PATH=/home/custom/sched/in
    PROGLOG=/home/custom/sched/logs/rgis_tca_to_tlms_create.sh.log
    PROGNAME=`basename $0`
    PROGPATH=/home/custom/sched/scripts
    cd $TXT2_PATH
    FILELIST2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    > $DEST_PATH/tstlmsedits.dat
    for i in $FILELIST2
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES2 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES1="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST1="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    > $DEST_PATH/tstlms.dat
    for i in $FILELIST1
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES1 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    cd $TXT1_PATH
    FILELIST3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    > $DEST_PATH/tstlmsedits.dat
    for i in $FILELIST3
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES3 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES4="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST4="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    > $DEST_PATH/tstlms.dat
    for i in $FILELIST4
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES4 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    #connecting to oracle to generate bad files
    sqlplus -s $fcp_login<<EOF
    select count(*) from rgis_tca_data_ext;
    select count(*) from rgis_tca_data_history_ext;
    exit;
    EOF
    #counting the records in files
    tot_rec_in_tstlms=`wc -l $DEST_PATH/tstlms.dat | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits=`wc -l $DEST_PATH/tstlmsedits.dat | awk ' { print $1 } '`
    tot_rec_in_tstlms_bad=`wc -l $DEST_PATH/tstlms.bad | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits_bad=`wc -l $DEST_PATH/tstlmsedits.bad | awk ' { print $1 } '`
    #updating log table
    echo "pl/sql block started"
    sqlplus -s $fcp_login<<EOF
    define tot_rec_in_tstlms     = '$tot_rec_in_tstlms';
    define tot_rec_in_tstlmsedits     = '$tot_rec_in_tstlmsedits';
    define tot_rec_in_tstlms_bad     = '$tot_rec_in_tstlms_bad';
    define tot_rec_in_tstlmsedits_bad='$tot_rec_in_tstlmsedits_bad';
    define fcp_reqid ='$fcp_reqid';
    declare
    l_tstlms_file_id number := null;
    l_tstlmsedits_file_id number := null;
    l_tot_rec_in_tstlms number := 0;
    l_tot_rec_in_tstlmsedits number := 0;
    l_tot_rec_in_tstlms_bad number := 0;
    l_tot_rec_in_tstlmsedits_bad number := 0;
    l_request_id fnd_concurrent_requests.request_id%type;
    l_start_date fnd_concurrent_requests.actual_start_date%type;
    l_end_date fnd_concurrent_requests.actual_completion_date%type;
    l_conc_prog_name fnd_concurrent_programs.concurrent_program_name%type;
    l_requested_by fnd_concurrent_requests.requested_by%type;
    l_requested_date fnd_concurrent_requests.request_date%type;
    begin
    --getting concurrent request details
    begin
    SELECT fcp.concurrent_program_name,
    fcr.request_id,
    fcr.actual_start_date,
    fcr.actual_completion_date,
    fcr.requested_by,
    fcr.request_date
    INTO l_conc_prog_name,
    l_request_id,
    l_start_date,
    l_end_date,
    l_requested_by,
    l_requested_date
    FROM fnd_concurrent_requests fcr, fnd_concurrent_programs fcp
    WHERE fcp.concurrent_program_id = fcr.concurrent_program_id
    AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    exception
    when no_data_found then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log, 'No data found for request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when retrieving request_id request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlms.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlms_file_id,
                   'tstlms.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlms,
                   &tot_rec_in_tstlms_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlms file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlmsedits.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlmsedits_file_id,
                   'tstlmsedits.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlmsedits,
                   &tot_rec_in_tstlmsedits_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlmsedits file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    end;
    exit;
    EOF
    echo "rgis_tca_to_tlms_process.sql started"
    sqlplus -s $fcp_login @$SCHED_TOP/sql/rgis_tca_to_tlms_process.sql $fcp_reqid
    exit;
    echo "rgis_tca_to_tlms_process.sql ended"
    Error:
    RGIS Scheduling: Version : UNKNOWN
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    TCATLMS module: TCA To TLMS Import Process
    Current system time is 18-AUG-2011 06:13:27
    COUNT(*)
         16
    COUNT(*)
         25
    wc: cannot open /home/custom/sched/in/tstlms.bad
    wc: cannot open /home/custom/sched/in/tstlmsedits.bad
    pl/sql block started
    old 33:     AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    new 33:     AND fcr.request_id = 18661823; --fnd_global.conc_request_id();
    old 63:                &tot_rec_in_tstlms,
    new 63:                16,
    old 64:                &tot_rec_in_tstlms_bad,
    new 64:                ,
    old 97:                &tot_rec_in_tstlmsedits,
    new 97:                25,
    old 98:                &tot_rec_in_tstlmsedits_bad,
    new 98:                ,
    ERROR at line 64:
    ORA-06550: line 64, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql stddev sum variance
    execute forall merge time timestamp interval date
    <a string literal with character set specification>
    <a number> <a single-quoted SQL string> pipe
    <an alternatively-quoted string literal with character set specification>
    <an alternatively-q
    ORA-06550: line 98, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql st
    rgis_tca_to_tlms_process.sql started
    old 12: and concurrent_request_id = '&1';
    new 12: and concurrent_request_id = '18661823';
    old 18: and concurrent_request_id = '&1';
    new 18: and concurrent_request_id = '18661823';
    old 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,&1);
    new 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,18661823);
    old 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,&1);
    new 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,18661823);
    old 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',&1);
    new 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',18661823);
    declare
    ERROR at line 1:
    ORA-20001: Error occured when executing RGIS_TCA_TO_TLMS_PROCESS.sql ORA-01403:
    no data found
    ORA-06512: at line 59
    Executing request completion options...
    ------------- 1) PRINT   -------------
    Printing output file.
    Request ID : 18661823      
    Number of copies : 0      
    Printer : noprint
    Finished executing request completion options.
    Concurrent request completed successfully
    Current system time is 18-AUG-2011 06:13:29
    ---------------------------------------------------------------------------

    Hi,
    Check the status of the batch in transaction SM35.
    If the batch is locked by mistake or stopped by any other error, you can release it and process it again.
    To release: Shift+F4.
    You can also analyse the job status with the F2 button.
    Bye
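    Coming back to the original question: the ORACLE_LOADER driver only writes the BADFILE when at least one record is actually rejected. Both loads here were clean (16 and 25 rows), so tstlms.bad and tstlmsedits.bad were never created, wc failed, the shell variables ended up empty, and the unquoted &tot_rec_in_tstlms_bad / &tot_rec_in_tstlmsedits_bad substitutions collapsed into the bare commas that raise PLS-00103. One defensive option on the SQL side (a sketch, not tested against the original script) is to quote the substitution variables so that an empty value still parses:
    {code}
    declare
      -- Quoted substitutions stay syntactically valid even when the shell
      -- passes an empty string (i.e. when no .bad file was written);
      -- nvl then defaults the missing count to zero.
      l_tot_rec_in_tstlms_bad      number := to_number(nvl('&tot_rec_in_tstlms_bad', '0'));
      l_tot_rec_in_tstlmsedits_bad number := to_number(nvl('&tot_rec_in_tstlmsedits_bad', '0'));
    begin
      null;  -- rest of the block unchanged
    end;
    /
    {code}
    Alternatively, the shell script could default the two counts to 0 whenever the .bad files do not exist before calling sqlplus.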

  • Different database tables affected during the creation of a PR

    Can someone please explain which database tables are affected during the creation of a Purchase Requisition, other than EBAN?

    Hi Ajit,
    Thanks for the reply.
    Kindly let me know how these tables (STXH, STXL) are affected when a PR is created.
    Thanks,
    Lina

  • Slow table creation after upgrade from 10.2.0.3 to 11.2.0.1 using DBUA

    I've recently completed a database upgrade from 10.2.0.3 to 11.2.0.1 using the DBUA.
    I've since encountered a slowdown when running a script which drops and recreates a series of ~250 tables. The script normally runs in around 19 seconds. After the upgrade, the script requires ~2 minutes to run.
    By chance has anyone encountered something similar?
    The problem may be related to the behavior of an "after CREATE on schema" trigger which grants select privileges to a role through a dbms_job call; the behavior appears to differ between 10g and the database that was upgraded from 10g to 11g. I am currently researching this angle.
    I will be using the following table creation DDL for this abbreviated test case:
    {code}
    create table ALLIANCE  (
       ALLIANCEID           NUMBER(10)                      not null,
       NAME                 VARCHAR2(40)                    not null,
       CREATION_DATE        DATE,
       constraint PK_ALLIANCE primary key (ALLIANCEID)
               using index
           tablespace LIVE_INDEX
    tablespace LIVE_DATA;
    {code}
    When calling the above DDL, an "after CREATE on schema" trigger is fired which schedules a job to run immediately and grant select privilege to a role on the table which was just created:
    {code}
    create or replace
    trigger select_grant
    after CREATE on schema
    declare
        l_str varchar2(255);
        l_job number;
    begin
        if ( ora_dict_obj_type = 'TABLE' ) then
            l_str := 'execute immediate "grant select on ' ||
                                         ora_dict_obj_name ||
                                        ' to select_role";';
            dbms_job.submit( l_job, replace(l_str,'"','''') );
        end if;
    end;
    {code}
    Below I've included data on two separate test runs. The first is on the upgraded database and includes optimizer parameters and an abbreviated TKPROF. I've also included the offending sys-generated SQL, which is not issued when the same test is run on a 10g environment set up with a similar test case. The 10g test run's TKPROF is also included below.
    The version of the database is 11.2.0.1.
    These are the parameters relevant to the optimizer for the test run on the upgraded 11g SID:
    {code}
    SQL> show parameter optimizer
    NAME                                 TYPE        VALUE
    optimizer_capture_sql_plan_baselines boolean     FALSE
    optimizer_dynamic_sampling           integer     2
    optimizer_features_enable            string      11.2.0.1
    optimizer_index_caching              integer     0
    optimizer_index_cost_adj             integer     100
    optimizer_mode                       string      ALL_ROWS
    optimizer_secure_view_merging        boolean     TRUE
    optimizer_use_invisible_indexes      boolean     FALSE
    optimizer_use_pending_statistics     boolean     FALSE
    optimizer_use_sql_plan_baselines     boolean     TRUE
    SQL> show parameter db_file_multi
    NAME                                 TYPE        VALUE
    db_file_multiblock_read_count        integer     8
    SQL> show parameter db_block_size
    NAME                                 TYPE        VALUE
    db_block_size                        integer     8192
    SQL> show parameter cursor_sharing
    NAME                                 TYPE        VALUE
    cursor_sharing                       string      EXACT
    SQL> column sname format a20
    SQL> column pname format a20
    SQL> column pval2 format a20
    SQL> select sname, pname, pval1, pval2 from sys.aux_stats$;
    SNAME                PNAME                     PVAL1 PVAL2
    SYSSTATS_INFO        STATUS                          COMPLETED
    SYSSTATS_INFO        DSTART                          03-11-2010 16:33
    SYSSTATS_INFO        DSTOP                           03-11-2010 17:03
    SYSSTATS_INFO        FLAGS                         0
    SYSSTATS_MAIN        CPUSPEEDNW           713.978495
    SYSSTATS_MAIN        IOSEEKTIM                    10
    SYSSTATS_MAIN        IOTFRSPEED                 4096
    SYSSTATS_MAIN        SREADTIM               1565.746
    SYSSTATS_MAIN        MREADTIM
    SYSSTATS_MAIN        CPUSPEED                   2310
    SYSSTATS_MAIN        MBRC
    SYSSTATS_MAIN        MAXTHR
    SYSSTATS_MAIN        SLAVETHR
    13 rows selected.
    {code}
    Output from TKPROF on the 11g SID:
    {code}
    create table ALLIANCE  (
       ALLIANCEID           NUMBER(10)                      not null,
       NAME                 VARCHAR2(40)                    not null,
       CREATION_DATE        DATE,
       constraint PK_ALLIANCE primary key (ALLIANCEID)
               using index
           tablespace LIVE_INDEX
    tablespace LIVE_DATA
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          4           0
    Fetch        0      0.00       0.00          0          0          0           0
    total        2      0.00       0.00          0          0          4           0
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 324
    {code}
    ... large section omitted ...
    Here is the performance hit portion of the TKPROF on the 11g SID:
    {code}
    SQL ID: fsbqktj5vw6n9
    Plan Hash: 1443566277
    select next_run_date, obj#, run_job, sch_job
    from
    (select decode(bitand(a.flags, 16384), 0, a.next_run_date,
      a.last_enabled_time) next_run_date,       a.obj# obj#,
      decode(bitand(a.flags, 16384), 0, 0, 1) run_job, a.sch_job  sch_job  from
      (select p.obj# obj#, p.flags flags, p.next_run_date next_run_date,
      p.job_status job_status, p.class_oid class_oid,      p.last_enabled_time
      last_enabled_time, p.instance_id instance_id,      1 sch_job   from
      sys.scheduler$_job p   where bitand(p.job_status, 3) = 1    and
      ((bitand(p.flags, 134217728 + 268435456) = 0) or
      (bitand(p.job_status, 1024) <> 0))    and bitand(p.flags, 4096) = 0    and
      p.instance_id is NULL    and (p.class_oid is null      or (p.class_oid is
      not null      and p.class_oid in (select b.obj# from sys.scheduler$_class b
                               where b.affinity is null)))   UNION ALL   select
      q.obj#, q.flags, q.next_run_date, q.job_status, q.class_oid,
      q.last_enabled_time, q.instance_id, 1   from sys.scheduler$_lightweight_job
      q   where bitand(q.job_status, 3) = 1    and ((bitand(q.flags, 134217728 +
      268435456) = 0) or         (bitand(q.job_status, 1024) <> 0))    and
      bitand(q.flags, 4096) = 0    and q.instance_id is NULL    and (q.class_oid
      is null      or (q.class_oid is not null      and q.class_oid in (select
      c.obj# from sys.scheduler$_class c                          where
      c.affinity is null)))   UNION ALL   select j.job, 0,
      from_tz(cast(j.next_date as timestamp),      to_char(systimestamp,'TZH:TZM')
      ), 1, NULL,      from_tz(cast(j.next_date as timestamp),
      to_char(systimestamp,'TZH:TZM')),     NULL, 0   from sys.job$ j   where
      (j.field1 is null or j.field1 = 0)    and j.this_date is null) a   order by
      1)   where rownum = 1
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.00          0          0          0           0
    Fetch        1      0.47       0.47          0       9384          0           1
    total        3      0.48       0.48          0       9384          0           1
    Misses in library cache during parse: 1
    Optimizer mode: CHOOSE
    Parsing user id: SYS   (recursive depth: 1)
    Rows     Row Source Operation
          1  COUNT STOPKEY (cr=9384 pr=0 pw=0 time=0 us)
          1   VIEW  (cr=9384 pr=0 pw=0 time=0 us cost=5344 size=6615380 card=194570)
          1    SORT ORDER BY STOPKEY (cr=9384 pr=0 pw=0 time=0 us cost=5344 size=11479630 card=194570)
    194790     VIEW  (cr=9384 pr=0 pw=0 time=537269 us cost=2563 size=11479630 card=194570)
    194790      UNION-ALL  (cr=9384 pr=0 pw=0 time=439235 us)
        231       FILTER  (cr=68 pr=0 pw=0 time=920 us)
        231        TABLE ACCESS FULL SCHEDULER$_JOB (cr=66 pr=0 pw=0 time=690 us cost=19 size=13157 card=223)
          1        TABLE ACCESS BY INDEX ROWID SCHEDULER$_CLASS (cr=2 pr=0 pw=0 time=0 us cost=1 size=40 card=1)
          1         INDEX UNIQUE SCAN SCHEDULER$_CLASS_PK (cr=1 pr=0 pw=0 time=0 us cost=0 size=0 card=1)(object id 5056)
          0       FILTER  (cr=3 pr=0 pw=0 time=0 us)
          0        TABLE ACCESS FULL SCHEDULER$_LIGHTWEIGHT_JOB (cr=3 pr=0 pw=0 time=0 us cost=2 size=95 card=1)
          0        TABLE ACCESS BY INDEX ROWID SCHEDULER$_CLASS (cr=0 pr=0 pw=0 time=0 us cost=1 size=40 card=1)
          0         INDEX UNIQUE SCAN SCHEDULER$_CLASS_PK (cr=0 pr=0 pw=0 time=0 us cost=0 size=0 card=1)(object id 5056)
    194559       TABLE ACCESS FULL JOB$ (cr=9313 pr=0 pw=0 time=167294 us cost=2542 size=2529254 card=194558)
    {code}
    and the totals at the end of the TKPROF on the 11g SID:
    {code}
    OVERALL TOTALS FOR ALL NON-RECURSIVE STATEMENTS
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      2      0.00       0.00          0          0          4           0
    Fetch        0      0.00       0.00          0          0          0           0
    total        3      0.00       0.00          0          0          4           0
    Misses in library cache during parse: 1
    Misses in library cache during execute: 1
    OVERALL TOTALS FOR ALL RECURSIVE STATEMENTS
    call     count       cpu    elapsed       disk      query    current        rows
    Parse       70      0.00       0.00          0          0          0           0
    Execute     85      0.01       0.01          0         62        208          37
    Fetch       49      0.48       0.49          0       9490          0          35
    total      204      0.51       0.51          0       9552        208          72
    Misses in library cache during parse: 5
    Misses in library cache during execute: 3
       35  user  SQL statements in session.
       53  internal SQL statements in session.
       88  SQL statements in session.
    Trace file: 11gSID_ora_17721.trc
    Trace file compatibility: 11.1.0.7
    Sort options: default
           1  session in tracefile.
          35  user  SQL statements in trace file.
          53  internal SQL statements in trace file.
          88  SQL statements in trace file.
          51  unique SQL statements in trace file.
        1590  lines in trace file.
          18  elapsed seconds in trace file.
    {code}
    The version of the database is 10.2.0.3.0.
    These are the parameters relevant to the optimizer for the test run on the 10g SID:
    {code}
    SQL> show parameter optimizer
    NAME                                 TYPE        VALUE
    optimizer_dynamic_sampling           integer     2
    optimizer_features_enable            string      10.2.0.3
    optimizer_index_caching              integer     0
    optimizer_index_cost_adj             integer     100
    optimizer_mode                       string      ALL_ROWS
    optimizer_secure_view_merging        boolean     TRUE
    SQL> show parameter db_file_multi
    NAME                                 TYPE        VALUE
    db_file_multiblock_read_count        integer     8
    SQL> show parameter db_block_size
    NAME                                 TYPE        VALUE
    db_block_size                        integer     8192
    SQL> show parameter cursor_sharing
    NAME                                 TYPE        VALUE
    cursor_sharing                       string      EXACT
    SQL> column sname format a20
    SQL> column pname format a20
    SQL> column pval2 format a20
    SQL> select sname, pname, pval1, pval2 from sys.aux_stats$;
    SNAME                PNAME                     PVAL1 PVAL2
    SYSSTATS_INFO        STATUS                          COMPLETED
    SYSSTATS_INFO        DSTART                          09-24-2007 11:09
    SYSSTATS_INFO        DSTOP                           09-24-2007 11:09
    SYSSTATS_INFO        FLAGS                         1
    SYSSTATS_MAIN        CPUSPEEDNW           2110.16949
    SYSSTATS_MAIN        IOSEEKTIM                    10
    SYSSTATS_MAIN        IOTFRSPEED                 4096
    SYSSTATS_MAIN        SREADTIM
    SYSSTATS_MAIN        MREADTIM
    SYSSTATS_MAIN        CPUSPEED
    SYSSTATS_MAIN        MBRC
    SYSSTATS_MAIN        MAXTHR
    SYSSTATS_MAIN        SLAVETHR
    13 rows selected.
    {code}
    Now for the TKPROF of a mirrored test environment running on a 10G SID:
    {code}
    create table ALLIANCE  (
       ALLIANCEID           NUMBER(10)                      not null,
       NAME                 VARCHAR2(40)                    not null,
       CREATION_DATE        DATE,
       constraint PK_ALLIANCE primary key (ALLIANCEID)
               using index
           tablespace LIVE_INDEX
    tablespace LIVE_DATA
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      1      0.00       0.01          0          2         16           0
    Fetch        0      0.00       0.00          0          0          0           0
    total        2      0.01       0.01          0          2         16           0
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 113
    {code}
    ... large section omitted ...
    Totals for the TKPROF on the 10g SID:
    {code}
    OVERALL TOTALS FOR ALL NON-RECURSIVE STATEMENTS
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.02          0          0          0           0
    Execute      1      0.00       0.00          0          2         16           0
    Fetch        0      0.00       0.00          0          0          0           0
    total        2      0.00       0.02          0          2         16           0
    Misses in library cache during parse: 1
    OVERALL TOTALS FOR ALL RECURSIVE STATEMENTS
    call     count       cpu    elapsed       disk      query    current        rows
    Parse       65      0.01       0.01          0          1         32           0
    Execute     84      0.04       0.09         20         90        272          35
    Fetch       88      0.00       0.10         30        281          0          64
    total      237      0.07       0.21         50        372        304          99
    Misses in library cache during parse: 38
    Misses in library cache during execute: 32
       10  user  SQL statements in session.
       76  internal SQL statements in session.
       86  SQL statements in session.
    Trace file: 10gSID_ora_32003.trc
    Trace file compatibility: 10.01.00
    Sort options: default
           1  session in tracefile.
          10  user  SQL statements in trace file.
          76  internal SQL statements in trace file.
          86  SQL statements in trace file.
          43  unique SQL statements in trace file.
         949  lines in trace file.
           0  elapsed seconds in trace file.
    {code}

    So while this certainly isn't the most elegant of solutions, and most assuredly isn't supported by Oracle...
    I've used the DBMS_IJOB.DROP_USER_JOBS('username') procedure to remove the 194558 orphaned job entries from the JOB$ table. Don't ask, I've no clue how they all got there, but I've prepared some evil looks to unleash upon certain developers tomorrow morning.
    Not being able to reorganize the JOB$ table to free the now-wasted ~67MB of space, I've opted to create a new index on the JOB$ table to sidestep the full table scan:
    CREATE INDEX SYS.JOB_F1_THIS_NEXT ON SYS.JOB$ (FIELD1, THIS_DATE, NEXT_DATE) TABLESPACE SYSTEM;
    The next option would be to find a way to grant the SELECT privilege to the role without using the aforementioned "after CREATE on schema" trigger and DBMS_JOB call. That method was adopted to cover situations in which a developer manually added a table directly to the database rather than using the provided scripts to recreate their test environment.
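    A minimal sketch of one such alternative, not the poster's actual fix: instead of an AFTER CREATE trigger submitting one DBMS_JOB per table, a recurring DBMS_SCHEDULER job can sweep the schema and grant SELECT on anything the role is still missing. The schema name, role name and interval below are hypothetical.
    -- Hypothetical schema DEV_SCHEMA and role REPORT_ROLE; adjust to taste.
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'GRANT_NEW_TABLES',
        job_type        => 'PLSQL_BLOCK',
        job_action      => q'[
          BEGIN
            FOR t IN (SELECT table_name
                        FROM dba_tables
                       WHERE owner = 'DEV_SCHEMA'
                         AND table_name NOT IN (SELECT table_name
                                                  FROM dba_tab_privs
                                                 WHERE owner     = 'DEV_SCHEMA'
                                                   AND grantee   = 'REPORT_ROLE'
                                                   AND privilege = 'SELECT'))
            LOOP
              -- Grants run outside any trigger context here, so the implicit
              -- commit that DDL issues is not a problem.
              EXECUTE IMMEDIATE 'GRANT SELECT ON DEV_SCHEMA.' || t.table_name ||
                                ' TO REPORT_ROLE';
            END LOOP;
          END;]',
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=5',
        enabled         => TRUE);
    END;
    /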
    I assume that the following quote from the 11gR2 documentation is mistaken, and there is no such beast as "create or replace table" in 11g:
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10592/statements_9003.htm#i2061306
    "Dropping a table invalidates dependent objects and removes object privileges on the table. If you want to re-create the table, then you must regrant object privileges on the table, re-create the indexes, integrity constraints, and triggers for the table, and respecify its storage parameters. Truncating and replacing have none of these effects. Therefore, removing rows with the TRUNCATE statement or replacing the table with a *CREATE OR REPLACE TABLE* statement can be more efficient than dropping and re-creating a table."

  • Table creation - order of events

    I am trying to get some help on the order I should be carrying out table creation tasks.
    Say I create a simple table:
    create table title (
    title_id number(2) not null,
    title varchar2(10) not null,
    effective_from date not null,
    effective_to date not null,
    constraint pk_title primary key (title_id)
    );
    I believe I should populate the data, then create my index:
    create unique index title_title_id_idx on title (title_id asc);
    But I have read that Oracle will automatically create an index for my primary key if I do not do so myself.
    At what point does Oracle create the index on my behalf and how do I stop it?
    Should I only apply the primary key constraint after the data has been loaded as well?
    Even then, if I add the primary key constraint, won't Oracle immediately create an index for me, even though I am about to create a specific one matching my naming conventions?

    yeah but just handle it the way you would handle any other constraint violation - with the EXCEPTIONS INTO clause...
    SQL> select index_name, uniqueness from user_indexes
      2  where table_name = 'APC'
      3  /
    no rows selected
    SQL> insert into apc values (1)
      2  /
    1 row created.
    SQL> insert into apc values (2)
      2  /
    1 row created.
    SQL> alter table apc add constraint apc_pk primary key (col1)
      2  using index ( create unique index my_new_index on apc (col1))
      3  /
    Table altered.
    SQL> insert into apc values (2)
      2  /
    insert into apc values (2)
    ERROR at line 1:
    ORA-00001: unique constraint (APC.APC_PK) violated
    SQL> alter table apc drop constraint apc_pk
      2  /
    Table altered.
    SQL> insert into apc values (2)
      2  /
    1 row created.
    SQL> alter table apc add constraint apc_pk primary key (col1)
      2  using index ( create unique index my_new_index on apc (col1))
      3  /
    alter table apc add constraint apc_pk primary key (col1)
    ERROR at line 1:
    ORA-02437: cannot validate (APC.APC_PK) - primary key violated
    SQL> @%ORACLE_HOME%/rdbms/admin/utlexcpt.sql
    Table created.
    SQL> alter table apc add constraint apc_pk primary key (col1)
      2  using index ( create unique index my_new_index on apc (col1))
      3  exceptions into EXCEPTIONS
      4  /
    alter table apc add constraint apc_pk primary key (col1)
    ERROR at line 1:
    ORA-02437: cannot validate (APC.APC_PK) - primary key violated
    SQL> select * from apc where rowid in ( select row_id from exceptions)
      2  /
          COL1
             2
             2
    SQL>
    All this is in the documentation. Find out more.
    Cheers, APC
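    Pulling the thread together, a minimal sketch of the order being recommended, reusing the TITLE table from the question: create the table without the constraint, load the data, build the unique index yourself, then add the primary key and point it at that index with USING INDEX, so Oracle never creates a second index on your behalf.
    create table title (
      title_id       number(2)    not null,
      title          varchar2(10) not null,
      effective_from date         not null,
      effective_to   date         not null
    );
    -- ... bulk load the data here ...
    create unique index title_title_id_idx on title (title_id asc);
    alter table title
      add constraint pk_title primary key (title_id)
      using index title_title_id_idx;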

  • How to create monthly table creation?

    Hi Mates,
    The analytics database is unable to create the new table for the month and keeps loading data into the previous table continuously, as shown in the attached screenshot, even though the schema user has the creation privilege. We are using WebCenter Interaction 10gR4.
    How can the monthly table creation be made to work again, please?
    Thanks,
    Katherine

    Hi Trevor,
    Thanks for your help. We were able to create tables and load data up to April, as attached.
    However, the analytics user's privileges were modified in April due to a server operation.
    Since then, the analytics log has shown a message saying there is no permission to create tables;
    the privileges were granted again after we checked this message. As I suspected, the issue occurred after the analytics user's privileges were modified.
    Currently, the analytics user is granted all privileges.
    Any ideas, please?
    Thanks,
    Kathy
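    For reference, a hedged sketch of the kind of grants the analytics schema owner usually needs before the monthly tables can be created again; the user and tablespace names below are hypothetical and should be replaced with the ones in your install.
    -- Hypothetical analytics schema ANALYTICSDB and tablespace ANALYTICS_DATA.
    GRANT CREATE TABLE TO analyticsdb;
    ALTER USER analyticsdb QUOTA UNLIMITED ON analytics_data;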

  • How do I display on-the-fly generated XML on a web page using DW CS4 Spry regions?

    On a main web page I'm trying to display formatted data from an ASP page that generates XML on-the-fly from a query.
    When I run the ASP page from the browser, the XML formatting of the data works. But when I run the main web page, the data doesn't display.
    I'm using DW CS4 Spry regions to display the data on the main page from the XML data generated by the ASP page. Here's the main page code:
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
    <META HTTP-EQUIV="Pragma" CONTENT="no-cache" >
    <html>
    <head>
    <script type="text/javascript" src="SpryAssets/xpath.js"></script>
    <script type="text/javascript" src="SpryAssets/SpryData.js"></script>
    <script type="text/javascript" src="SpryAssets/SpryUtils.js"> </script>
    <script type="text/javascript">var A1D1xml = new Spry.Data.XMLDataSet("A1D1ACRs_testWxmlCode.asp", "tests/test");</script>
    <title>Test XML Main</title>  
    </head>
    <body>
          <div id="A1D1xml" spry:region="A1D1xml">
                                    <table id="A1D1">
                <tr>
                <th>ID</th>
                <th>Last Name</th>
                <th>Final Status</th>
                </tr>
                <tr spry:repeat="A1D1xml">
                       <td>{acr_id}</td>
                    <td>{acr_lastName}</td>
                    <td>{acr_final_status}</td>
                </tr>
                  </table>
             </div>
    </body>
    </html>
    Here's the code for the page that generates the XML: A1D1ACRs_testWxmlCode.asp
    <html>
    <%
    set objConn=server.CreateObject("ADODB.Connection")
    objConn.Open application("web_test")
    set rs = objConn.Execute( "SELECT acr_id, acr_lastName, acr_final_status from acr_records_grid_view where acr_changeOption = 'A1D1'")
    Response.ContentType = "text/xml"
    Response.AddHeader "Pragma", "public"
    Response.AddHeader "Cache-control", "private"
    Response.AddHeader "Expires", "-1"
    %>
    <?xml version="1.0" encoding="utf-8"?>
    <tests>
      <%While (NOT rs.EOF)%>
                    <test>
                                    <ID><%=(rs.Fields.Item("acr_id").Value)%></ID>
                                    <acr_lastName><![CDATA[<%=(rs.Fields.Item("acr_lastName").Value)%>]]></acr_lastName>
                                    <acr_final_status><![CDATA[<%=(rs.Fields.Item("acr_final_status").Value)%>]]></acr_final_status>
                    </test>
        <%
                    rs.MoveNext()
                    Wend
      %>
    </tests>
    <%
    rs.Close()
    Set rs = Nothing
    %>
    </html>
    Thanks.

    Thanks, but no; I'm using the correct case and folder.
    With this code on the main page, the region flashes the table column header names
    and the code as written for the Spry repeat, then instantly disappears from the screen.
    <html>
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
    <script type="text/javascript" src="SpryAssets/xpath.js"></script>
    <script type="text/javascript" src="SpryAssets/SpryData.js"></script
    <script type="text/javascript">var A1D1x = new Spry.Data.XMLDataSet("A1D1ACRs_testWxmlCode.xml", "tests/test");</script>
    <title>Test XML Main</title>  
    </head>
    <body>
          <div id="A1D1" spry:region="A1D1x">
                   <table id="A1D1a">
                <tr>
                <th>ID</th>
                <th>Last Name</th>
                <th>Final Status</th>
                </tr>
                <tr spry:repeat="A1D1x">
                    <td>{acr_id}</td>
                    <td>{acr_lastName}</td>
                    <td>{acr_final_status}</td>
                </tr>
                  </table>
             </div>
    </body>
    </html>

  • Reg :-maintain a table maintenance view for the z table.suggest me the code

    I have a question. I have created a Z table related to the PP module. The requirement is to maintain a table maintenance view for this Z table. How can this be done? Can anyone suggest the steps for this?

    Hi,
    We have lots of queries on table maintenance creation already posted and replied to in the forum. Please have a look at those first.
    In SE11 -> change mode of the Z table -> Goto -> Utilities -> Table Maintenance Generator -> click on it.
    It will give a new screen -> enter the required details such as the function group, the authorization group (use &NC& or leave it blank if you don't know it) and the screen numbers.
    "Two step" and "one step" refer to the number of screens displayed in the maintenance dialog:
    if one step is selected, you get a single screen that behaves like a table control for data entry;
    with two steps you get a table control screen plus a more detailed individual field display as the second screen.
    Click on the button 'Find screen numbers' so that the system automatically proposes the screen numbers,
    after which click on the 'Create' button and follow the instructions/messages.
    Once done, go to SM30 and enter the table name to check whether the maintenance dialog has been generated properly.
    Hope it helps. Please check and revert.
    Regards
    Byju

  • Generating JSP/HTML on the fly

    Hi all,
    The requirement is to be able to generate HTML files (for UI display) on the fly depending on the database table structure. (Consider an EMPLOYEE table, for example: an output HTML file needs to be generated based on the fields present in this table.)
    To store configuration data about this HTML file (field names, field types, attributes, etc.), the suggestion is to use an XML file. A Java class would read this configuration XML and produce a second XML file containing the actual data that will go into the HTML (for example, if the configuration file contains a SELECT statement for an LOV field, the Java class would fetch those values and put them in the final XML file), and so on. An XSLT stylesheet with the appropriate layout would then generate the final HTML.
    Could anyone suggest an alternative technique?
    Thanks and much appreciated.

    Read your XML file into an org.w3c.dom.Document structure (see the example: http://www.developerfusion.com/show/2064/), then use an XSLT stylesheet to render the Document in HTML format.
    import java.io.File;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    // doc is an instance of org.w3c.dom.Document holding the generated XML
    File stylesheet = new File("style.xsl");
    StreamSource stylesource = new StreamSource(stylesheet);
    File datafile = new File("results.html");
    try {
        // Use a Transformer to apply the stylesheet and write the HTML output
        TransformerFactory tFactory = TransformerFactory.newInstance();
        Transformer transformer = tFactory.newTransformer(stylesource);
        DOMSource source = new DOMSource(doc);
        StreamResult result = new StreamResult(datafile);
        transformer.transform(source, result);
    } catch (TransformerConfigurationException tce) {
        // Error generated by the parser
        System.out.println("\n** Transformer Factory error");
        System.out.println("   " + tce.getMessage());
        // Use the contained exception, if any
        Throwable x = tce;
        if (tce.getException() != null)
            x = tce.getException();
        x.printStackTrace();
    } catch (TransformerException te) {
        // Error generated during the transformation
        System.out.println("\n** Transformation error");
        System.out.println("   " + te.getMessage());
        // Use the contained exception, if any
        Throwable x = te;
        if (te.getException() != null)
            x = te.getException();
        x.printStackTrace();
    }

  • Convert Chinese on the fly

    Dear all,
    I have a DB which the NLS_CHARACTERSET is UTF8.
    In a table, we have a column which set as varchar2 data type.
    Users will put content into it (which some times are traditional chinese, and sometimes are simplified chinese).
    Based on the new user requirements, I need to:
    1. identify those records with traditional chinese / simplified chinese
    - I read from the web that I may use the dump function to do so. However, I did some tests and it seems incorrect. Any tips on this?
    2. Convert the characters on the fly
    - is it possible that I convert the column into all traditional chinese / simplified chinese when I select / export / view?
    - I heard that the CONVERT function may help, but I can't make it work. Any help on these two issues is very much appreciated. Many thanks.

    The data in the VARCHAR2 column should be encoded using the UTF-8 Unicode encoding. It should not be stored in either a simplified or a traditional Chinese character set.
    If the client application is accepting data using the traditional Chinese character set, that data should be converted to the database character set transparently by the SQL*Net layer when it is sent over the wire based on the client's NLS settings. Similarly, if the client application is accepting data using the simplified Chinese character set, the data should be converted to the database character set when it is sent over the wire based on that client's NLS settings. Similarly, when you select data, the client's NLS settings should indicate the character set that the client application wishes the data to be converted to and that should be done transparently by the Oracle SQL*Net layer.
    Justin
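    To follow up on the first question, a minimal sketch, assuming a table T with a VARCHAR2 column TXT (both names hypothetical): DUMP with the 1016 option shows the stored bytes in hex together with the character set name, which is the quickest way to confirm that the column really holds UTF8 data before worrying about traditional versus simplified forms.
    -- Hypothetical table T and column TXT; 1016 = hex dump plus character set name.
    SELECT txt,
           DUMP(txt, 1016) AS stored_bytes
      FROM t
     WHERE ROWNUM <= 10;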

  • User Exit for ME21 PO Creation at the time of saving--Urgent

    Hi,
    Can someone help me find the user exit for PO creation at the time of saving?
    The requirement is:
    I need to create a custom field in the EKKO table.
    After appending the structure with the field to the EKKO table, I need to create a PO.
    Then I need to update that custom field at the moment the SAVE button is pressed,
    i.e. I need to update the EKKO table itself.
    I could find some user exits, but none of them have the EKKO table either in the Changing or in the Tables parameters.
    Thanks,

    Hi,
              You may want to check these user exits.
    EXIT_SAPMM06E_012 - Check Customer-Specific Data Before Saving
    EXIT_SAPMM06E_013 - Update Customer-Specific Data in Purchasing Document
    Here is the list of available user exits for ME21.
    EXIT_SAPMM06E_001 - Other Number Range or Own Document Number
    EXIT_SAPMM06E_004 - User Exit for Cust.-Specific Control of Import Data Screens in Purchasing
    EXIT_SAPMM06E_005 - Field Selection Control: Vendor Address Screen
    EXIT_SAPMM06E_006 - Export Data to Customer Subscreen for Purchasing Document Header (PBO)
    EXIT_SAPMM06E_007 - Export Data to Customer Subscreen for Purchasing Document Header (PAI)
    EXIT_SAPMM06E_008 - Import Data from Customer Subscreen for Purchasing Document Header
    EXIT_SAPMM06E_009 - Reset Customer Data at Beginning of New Document (Without Dialog)
    EXIT_SAPMM06E_012 - Check Customer-Specific Data Before Saving
    EXIT_SAPMM06E_013 - Update Customer-Specific Data in Purchasing Document
    EXIT_SAPMM06E_014 - Read Customer-Specific Data when Importing Purchasing Document
    EXIT_SAPMM06E_016 - Export Data to Customer Subscreen for Purchasing Document Item (PBO)
    EXIT_SAPMM06E_017 - Export Data to Customer Subscreen for Purchasing Document Item (PAI)
    EXIT_SAPMM06E_018 - Import Data from Customer Subscreen for Purchasing Document Item
    EXIT_SAPMM06E_020 - User Exit: Change Document for Requisitions (Conversion into PO)
    EXIT_SAPMM06E_021 - Fulfillment of Target Value: Release Orders Against a Contract
    Reward points
    Regards

  • IDOC creation after the creation of a purchase order

    Hi everybody,
                I want some configuration details regarding IDoc creation. The requirement is that once I create a purchase order in one system, the details should automatically transfer to the other system through an IDoc. Could anyone suggest a document or a detailed procedure to accomplish the task? Thanks in advance,
             Santosh.

    hi santosh kumar,
    Creating an IDoc Segment WE31:
    The segment defines the structure of the records in an IDoc. They are defined with transaction WE31.
    We will define a structure to send a text from the text database.
    Transaction WE31 calls the IDoc segment editor. The editor defines the fields of a
    single segment structure. The thus defined IDoc segment is then created as a data
    dictionary structure. You can view the created structure with SE11 and use it in an
    ABAP as any TABLES declaration.
    To demonstrate the use of the IDoc segment editor we will set up an example, which
    allows you to send a single text from the text pool (tables STXH and STXL) as an
    IDoc. These are the texts that you can see with SO10 or edit from within many
    applications.
    We will show the steps to define an IDoc segment YAXX_THEAD with the DDic
    structure of THEAD.
    To facilitate our work, we will use the "copy-from-template-tool", which reads the
    definition of a DDIC structure and inserts the field and the matching definitions as
    rows in the IDoc editor. You could, of course, define the structure completely
    manually, but using the template makes it easier.
    The tool in release 4.0b lets you use both DDIC structures or another IDoc segment
    definition as a template.
    The thus created structure can be edited any time. When saving, it will create a data
    dictionary structure based on the definition in WE31. The DDIC structure will retain
    the same name. You can view the structure as a table definition with SE11 and use it
    in an ABAP the same way.
    Defining the Message Type (EDMSG)
    The message type defines the context under which an IDoc is transferred to its destination. It allows for using the same IDoc file format for several different applications.
    Imagine the situation of sending a purchase order to a supplier. When the IDoc with
    the purchase order reaches the supplier, it will be interpreted as a sales order
    received from a customer, namely you.
    Simultaneously you want to send the IDoc data to the supplier's warehouse to inform
    it that a purchase order has been issued and is on the way.
    Both IDoc receivers will receive the same IDoc format; however, the IDoc will be
    tagged with a different message type. While the IDoc to the supplier will be flagged
    as a purchase order (in SAP R/3 standard: message type = ORDERS), the same IDoc
    sent to the warehouse should be flagged differently, so that the warehouse can
    recognize the order as a mere informational copy and process it differently than a
    true purchase order.
    The message type together with the IDoc type determine the processing function.
    The message types are stored in table EDMSG.
    Defining the message type can be done from the transaction WEDI
    EDMSG: Defining the message type (1)
    The entry is only a base entry which tells the system that the message type is
    allowed. Other transactions will use that table as a check table to validate the entry.
    Requirement:
    Sales Orders are being created through inbound IDocs using FM 'EDI_DATA_INCOMING'. Now a Report is required to check the status of these Inbound IDocs along with Sales Orders generated against customer Purchase Orders.
    Processing:
    The report selects, 'ORDERS' IDoc numbers & status, generated between given time range, from table EDIDC. Further, it calls Function Module 'IDOC_READ_COMPLETELY' to get the IDoc details. Then required information is extracted by reading relevant field data of IDoc segments.
    Sample code:
    REPORT  Z_EDI_FILE_LOAD_STATUS_REPORT           .
    * Status Report for Inbound IDOCs ( Sales Orders )
    * Program        : Z_EDI_FILE_LOAD_STATUS_REPORT
    * Presented By   : www.rmtiwari.com
    TABLES : EDIDC.
    * ALV stuff
    TYPE-POOLS: SLIS.
    DATA: GT_FIELDCAT TYPE SLIS_T_FIELDCAT_ALV,
          GS_LAYOUT   TYPE SLIS_LAYOUT_ALV,
          GT_SORT     TYPE SLIS_T_SORTINFO_ALV,
          GT_LIST_TOP_OF_PAGE TYPE SLIS_T_LISTHEADER.
    DATA : BEGIN OF T_REPORT OCCURS 0,
             IDOC_NO       TYPE EDI_DOCNUM,
             IDOC_DATE     TYPE SY-DATUM,
             IDOC_TIME     TYPE SY-UZEIT,
             SORDER_NO     TYPE VBELN,
             STP_NO        TYPE KNA1-KUNNR,
             STP_NAME(35)  TYPE C,
             STP_PHONE(12) TYPE C,
             PO_NO(15)     TYPE C,
             STATUS        TYPE C,
             S_TEXT(70)    TYPE C,
             ERROR(70)     TYPE C,
           END OF T_REPORT.
    * --PARAMETER--
    selection-screen begin of block date with frame title TEXT-S01.
    select-options: UDATE for  EDIDC-UPDDAT
                          default SY-datum obligatory,    "Changed On
                    UTIME for  EDIDC-UPDTIM .             "Changed Time
    selection-screen end   of block date.
    INITIALIZATION.
    START-OF-SELECTION.
    PERFORM SHOW_STATUS_REPORT.
    *&      Form  alv_grid
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM ALV_GRID.
      IF GT_FIELDCAT[] IS INITIAL.
        PERFORM FIELDCAT_INIT.
        PERFORM LAYOUT_INIT.
        PERFORM SORT_INIT.
      ENDIF.
      PERFORM GRID_DISPLAY.
    ENDFORM.                    "alv_grid
    *&      Form  layout_init
    FORM LAYOUT_INIT.
      GS_LAYOUT-ZEBRA             = 'X'.
      GS_LAYOUT-CELL_MERGE        = 'X'.
      GS_LAYOUT-COLWIDTH_OPTIMIZE = 'X'.
      GS_LAYOUT-NO_VLINE          = ' '.
      GS_LAYOUT-TOTALS_BEFORE_ITEMS = ' '.
    ENDFORM.                    " layout_init
    *&      Form  fieldcat_init
    FORM FIELDCAT_INIT.
      DATA: LS_FIELDCAT TYPE SLIS_FIELDCAT_ALV.
       CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'IDOC_NO'.
      LS_FIELDCAT-KEY          = 'X'.
      LS_FIELDCAT-REPTEXT_DDIC = 'IDOC'.
      LS_FIELDCAT-OUTPUTLEN    = 10.
    * Fix for ALV print bug, which puts 'N/A' over last digit
    * Set inttype to 'N' to stop corruption of printed ALV cell.
      LS_FIELDCAT-INTTYPE = 'N'.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'IDOC_DATE'.
      LS_FIELDCAT-REPTEXT_DDIC = 'Creation Date'.
      LS_FIELDCAT-OUTPUTLEN    = 10.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'IDOC_TIME'.
      LS_FIELDCAT-REPTEXT_DDIC = 'Creation Time'.
      LS_FIELDCAT-OUTPUTLEN    = 8.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'STATUS'.
      LS_FIELDCAT-REPTEXT_DDIC = 'St'.
      LS_FIELDCAT-OUTPUTLEN    = 2.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'ERROR'.
      LS_FIELDCAT-REPTEXT_DDIC = 'Message'.
      LS_FIELDCAT-OUTPUTLEN    = 70.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'STP_NO'.
      LS_FIELDCAT-REPTEXT_DDIC = 'S.T.Party No'.
      LS_FIELDCAT-OUTPUTLEN    = 10.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'STP_NAME'.
      LS_FIELDCAT-REPTEXT_DDIC = 'Sold to Party Name'.
      LS_FIELDCAT-OUTPUTLEN    = 35.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'PO_NO'.
      LS_FIELDCAT-REPTEXT_DDIC = 'Purch Order'.
      LS_FIELDCAT-OUTPUTLEN    = 15.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
      CLEAR LS_FIELDCAT.
      LS_FIELDCAT-FIELDNAME    = 'STP_PHONE'.
      LS_FIELDCAT-REPTEXT_DDIC = 'S.T.Party Phone'.
      LS_FIELDCAT-OUTPUTLEN    = 15.
      APPEND LS_FIELDCAT TO GT_FIELDCAT.
    ENDFORM.                    "fieldcat_init
    *&      Form  sort_init
    FORM SORT_INIT.
      DATA: LS_SORT TYPE SLIS_SORTINFO_ALV.
      CLEAR LS_SORT.
      LS_SORT-FIELDNAME = 'IDOC_DATE'.
      LS_SORT-SPOS      = 1.
      LS_SORT-UP        = 'X'.
      APPEND LS_SORT TO GT_SORT.
      CLEAR LS_SORT.
      LS_SORT-FIELDNAME = 'IDOC_TIME'.
      LS_SORT-SPOS      = 2.
      LS_SORT-UP        = 'X'.
      APPEND LS_SORT TO GT_SORT.
      CLEAR LS_SORT.
      LS_SORT-FIELDNAME = 'STATUS'.
      LS_SORT-SPOS      = 3.
      LS_SORT-UP        = 'X'.
      APPEND LS_SORT TO GT_SORT.
      CLEAR LS_SORT.
      LS_SORT-FIELDNAME = 'IDOC_NO'.
      LS_SORT-SPOS      = 4.
      LS_SORT-UP        = 'X'.
      APPEND LS_SORT TO GT_SORT.
    ENDFORM.                    "sort_init
    *&      Form  grid_display
    FORM GRID_DISPLAY.
      CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
        EXPORTING
          IS_LAYOUT     = GS_LAYOUT
          IT_FIELDCAT   = GT_FIELDCAT
          IT_SORT       = GT_SORT
          i_callback_program      = SY-REPID
          I_CALLBACK_TOP_OF_PAGE = 'TOP_OF_PAGE'
          I_DEFAULT     = ' '
          I_SAVE        = 'X'
        TABLES
          T_OUTTAB      = T_REPORT
        EXCEPTIONS
          PROGRAM_ERROR = 1
          OTHERS        = 2.
    ENDFORM.                    "grid_display
    *&      Form  COMMENT_BUILD
    *       Processing of listheader
    FORM COMMENT_BUILD USING P_FK_LIST_TOP_OF_PAGE TYPE SLIS_T_LISTHEADER.
      DATA: LS_LINE TYPE SLIS_LISTHEADER.
      REFRESH P_FK_LIST_TOP_OF_PAGE.
    * List Heading : Typ H
      CLEAR LS_LINE.
      LS_LINE-TYP  = 'H'.
      LS_LINE-INFO  = 'Sales Order Interface: Z_EDI_FILE_LOAD'.
      APPEND LS_LINE TO P_FK_LIST_TOP_OF_PAGE.
    * List : Typ S
      clear LS_LINE.
      LS_LINE-typ  = 'S'.
      LS_LINE-key  = 'Date Range:'.
      LS_LINE-info  = UDATE-low.
      if not UDATE-high is initial.
        write ' To ' to  LS_LINE-info+30.
        LS_LINE-info+36 = UDATE-high.
      endif.
      APPEND LS_LINE TO P_FK_LIST_TOP_OF_PAGE.
    ENDFORM.                               " COMMENT_BUILD
    *       FORM TOP_OF_PAGE                                              *
    *       Event TOP_OF_PAGE                                             *
    *       event     TOP_OF_PAGE
    FORM TOP_OF_PAGE.
      PERFORM COMMENT_BUILD  USING gt_LIST_TOP_OF_PAGE[].
      CALL FUNCTION 'REUSE_ALV_COMMENTARY_WRITE'
        EXPORTING
          IT_LIST_COMMENTARY = GT_LIST_TOP_OF_PAGE.
    ENDFORM.                    "TOP_OF_PAGE
    *&      Form  show_status_report
    FORM SHOW_STATUS_REPORT .
    * Report to show status.
      DATA: BEGIN OF T_TEDS2 OCCURS 0.
              INCLUDE STRUCTURE TEDS2.
      DATA: END OF T_TEDS2.
      DATA: BEGIN OF T_IDOC_CONTROL_TMP OCCURS 0.
              INCLUDE STRUCTURE EDIDC.
      DATA: END OF T_IDOC_CONTROL_TMP.
      CONSTANTS: C_STATUS_IN_IDOC_POSTED       LIKE EDIDC-STATUS VALUE '53'.
      DATA : T_EDIDS TYPE STANDARD TABLE OF EDIDS WITH HEADER LINE.
      DATA : T_EDIDD TYPE STANDARD TABLE OF EDIDD WITH HEADER LINE.
      DATA : GV_PARTNER_SEG TYPE E1EDKA1,
             GV_PO_REF_SEG  TYPE E2EDK02.
    * Get text for status values
      SELECT * FROM TEDS2 INTO TABLE T_TEDS2 WHERE LANGUA = SY-LANGU.
    * Read the IDoc's status after processing
      SELECT * FROM EDIDC
        INTO TABLE T_IDOC_CONTROL_TMP
       WHERE UPDDAT IN UDATE
         AND UPDTIM IN UTIME
         AND MESTYP = 'ORDERS'.
      LOOP AT T_IDOC_CONTROL_TMP.
    *   IDoc has been processed, since control record changed.
        READ TABLE T_TEDS2 WITH KEY STATUS = T_IDOC_CONTROL_TMP-STATUS.
        T_REPORT-IDOC_NO = T_IDOC_CONTROL_TMP-DOCNUM.
        T_REPORT-IDOC_DATE = T_IDOC_CONTROL_TMP-CREDAT.
        T_REPORT-IDOC_TIME = T_IDOC_CONTROL_TMP-CRETIM.
        T_REPORT-S_TEXT = T_TEDS2-DESCRP.
        IF T_IDOC_CONTROL_TMP-STATUS = C_STATUS_IN_IDOC_POSTED.
    *   ok status
          T_REPORT-STATUS = 'S'.
        ELSE.
    *   error status
          T_REPORT-STATUS = 'E'.
        ENDIF.
    *   Get IDoc details.
        CALL FUNCTION 'IDOC_READ_COMPLETELY'
          EXPORTING
            DOCUMENT_NUMBER         = T_REPORT-IDOC_NO
          TABLES
            INT_EDIDS               = T_EDIDS
            INT_EDIDD               = T_EDIDD
          EXCEPTIONS
            DOCUMENT_NOT_EXIST      = 1
            DOCUMENT_NUMBER_INVALID = 2
            OTHERS                  = 3.
    * Get Error status
        READ TABLE T_EDIDS WITH KEY STATUS = T_IDOC_CONTROL_TMP-STATUS.
        IF SY-SUBRC EQ 0.
          REPLACE FIRST OCCURRENCE OF '&1' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA1.
          REPLACE FIRST OCCURRENCE OF '&2' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA2.
          REPLACE FIRST OCCURRENCE OF '&3' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA3.
          REPLACE FIRST OCCURRENCE OF '&4' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA4.
          REPLACE FIRST OCCURRENCE OF '&' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA1.
          REPLACE FIRST OCCURRENCE OF '&' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA2.
          REPLACE FIRST OCCURRENCE OF '&' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA3.
          REPLACE FIRST OCCURRENCE OF '&' IN T_EDIDS-STATXT
                                WITH T_EDIDS-STAPA4.
          T_REPORT-ERROR = T_EDIDS-STATXT.
        ENDIF.
        LOOP AT T_EDIDD.
          CASE T_EDIDD-SEGNAM.
            WHEN 'E1EDKA1'.
              GV_PARTNER_SEG = T_EDIDD-SDATA.
              CLEAR : T_REPORT-STP_NAME.
              CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
                EXPORTING
                  INPUT  = GV_PARTNER_SEG-PARTN
                IMPORTING
                  OUTPUT = T_REPORT-STP_NO.
              SELECT SINGLE NAME1 TELF1
                INTO (T_REPORT-STP_NAME,T_REPORT-STP_PHONE)
                FROM KNA1
               WHERE KUNNR = T_REPORT-STP_NO.
            WHEN 'E1EDK02'.
              GV_PO_REF_SEG = T_EDIDD-SDATA.
              T_REPORT-PO_NO = GV_PO_REF_SEG-BELNR.
          ENDCASE.
        ENDLOOP.
        APPEND T_REPORT.
      ENDLOOP .
      SORT T_REPORT BY STATUS IDOC_NO.
    * Show Report
      PERFORM ALV_GRID.
    ENDFORM.                    " show_status_report
    thanks
    karthik
    Reward me points if useful.
