File loaded successfully by SQL*Loader but external table failed

Hi,
When I tried to create an external table to load data from a file, it failed with the error
"KUP-04018: partial record at end of file".
When I query the external table for rows with rownum < 2500, I can see the records loaded properly, but when I try to fetch all the records it throws the error above.
Yet when I loaded the same file with SQL*Loader, it loaded all the records successfully.
Can you please let me know the reason for this?
Regards

Can you post your SQLLDR control file, your external table definition, a sample of your data (preferably including the last lines of data, which you believe are erroring), and also let us know your database version.
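
An editorial aside while waiting for those details: KUP-04018 means the access driver hit end-of-file in the middle of what it considers a record. Common causes include a truncated final line, fixed-length records whose file size is not an exact multiple of the record length, or a record-delimiter/character-set mismatch (for example, a Windows CRLF file read with Unix settings); SQL*Loader is often more forgiving of the same file. A minimal sketch with illustrative names that declares the delimiter explicitly:

create table demo_ext
(
  col1 varchar2(250),
  col2 varchar2(250)
)
organization external
(
  type oracle_loader
  default directory data_dir
  access parameters
  (
    records delimited by '\r\n'   -- assumption: the file has Windows line endings
    fields terminated by '|' missing field values are null
  )
  location ('demo.dat')
)
reject limit unlimited;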

Similar Messages

  • Bad file is not created during the external table creation.

    Hello Experts,
    I have created a script for an external table in an Oracle 10g DB. Everything is working fine except that it does not create the bad file, although it does create the log file. I can't figure out what the issue is, and because of it my shell script, and the entire program, is failing. I am attaching the table creation script, the shell script where it is referenced, and the error. Kindly let me know if something is missing. Thanks in advance.
    Table creation script:
    create table RGIS_TCA_DATA_EXT
    (
    guid VARCHAR2(250),
    badge VARCHAR2(250),
    scheduled_store_id VARCHAR2(250),
    parent_event_id VARCHAR2(250),
    event_id VARCHAR2(250),
    organization_number VARCHAR2(250),
    customer_number VARCHAR2(250),
    store_number VARCHAR2(250),
    inventory_date VARCHAR2(250),
    full_name VARCHAR2(250),
    punch_type VARCHAR2(250),
    punch_start_date_time VARCHAR2(250),
    punch_end_date_time VARCHAR2(250),
    event_meet_site_id VARCHAR2(250),
    vehicle_number VARCHAR2(250),
    vehicle_description VARCHAR2(250),
    vehicle_type VARCHAR2(250),
    is_owner VARCHAR2(250),
    driver_passenger VARCHAR2(250),
    mileage VARCHAR2(250),
    adder_code VARCHAR2(250),
    bonus_qualifier_code VARCHAR2(250),
    store_accuracy VARCHAR2(250),
    store_length VARCHAR2(250),
    badge_input_type VARCHAR2(250),
    source VARCHAR2(250),
    created_by VARCHAR2(250),
    created_date_time VARCHAR2(250),
    updated_by VARCHAR2(250),
    updated_date_time VARCHAR2(250),
    approver_badge_id VARCHAR2(250),
    approver_name VARCHAR2(250),
    orig_guid VARCHAR2(250),
    edit_type VARCHAR2(250)
    )
    organization external
    (
    type ORACLE_LOADER
    default directory ETIME_LOAD_DIR
    access parameters
    (
    RECORDS DELIMITED BY NEWLINE
    BADFILE ETIME_LOAD_DIR:'tstlms.bad'
    LOGFILE ETIME_LOAD_DIR:'tstlms.log'
    READSIZE 1048576
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    (
    GUID
    ,BADGE
    ,SCHEDULED_STORE_ID
    ,PARENT_EVENT_ID
    ,EVENT_ID
    ,ORGANIZATION_NUMBER
    ,CUSTOMER_NUMBER
    ,STORE_NUMBER
    ,INVENTORY_DATE char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,FULL_NAME
    ,PUNCH_TYPE
    ,PUNCH_START_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,PUNCH_END_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,EVENT_MEET_SITE_ID
    ,VEHICLE_NUMBER
    ,VEHICLE_DESCRIPTION
    ,VEHICLE_TYPE
    ,IS_OWNER
    ,DRIVER_PASSENGER
    ,MILEAGE
    ,ADDER_CODE
    ,BONUS_QUALIFIER_CODE
    ,STORE_ACCURACY
    ,STORE_LENGTH
    ,BADGE_INPUT_TYPE
    ,SOURCE
    ,CREATED_BY
    ,CREATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,UPDATED_BY
    ,UPDATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,APPROVER_BADGE_ID
    ,APPROVER_NAME
    ,ORIG_GUID
    ,EDIT_TYPE
    )
    )
    location (ETIME_LOAD_DIR:'tstlms.dat')
    )
    reject limit UNLIMITED;
    Shell script:
    version=1.0
    umask 000
    DATE=`date +%Y%m%d%H%M%S`
    TIME=`date +"%H%M%S"`
    SOURCE=`hostname`
    fcp_login=`echo $1|awk '{print $3}'|sed 's/"//g'|awk -F= '{print $2}'`
    fcp_reqid=`echo $1|awk '{print $2}'|sed 's/"//g'|awk -F= '{print $2}'`
    TXT1_PATH=/home/ac1/oracle/in/tsdata
    TXT2_PATH=/home/ac2/oracle/in/tsdata
    ARCH1_PATH=/home/ac1/oracle/in/tsdata
    ARCH2_PATH=/home/ac2/oracle/in/tsdata
    DEST_PATH=/home/custom/sched/in
    PROGLOG=/home/custom/sched/logs/rgis_tca_to_tlms_create.sh.log
    PROGNAME=`basename $0`
    PROGPATH=/home/custom/sched/scripts
    cd $TXT2_PATH
    FILELIST2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    >$DEST_PATH/tstlmsedits.dat
    for i in $FILELIST2
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES2 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES1="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST1="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    >$DEST_PATH/tstlms.dat
    for i in $FILELIST1
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES1 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    cd $TXT1_PATH
    FILELIST3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    >$DEST_PATH/tstlmsedits.dat
    for i in $FILELIST3
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES3 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES4="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST4="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    >$DEST_PATH/tstlms.dat
    for i in $FILELIST4
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES4 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    #connecting to oracle to generate bad files
    sqlplus -s $fcp_login<<EOF
    select count(*) from rgis_tca_data_ext;
    select count(*) from rgis_tca_data_history_ext;
    exit;
    EOF
    #counting the records in files
    tot_rec_in_tstlms=`wc -l $DEST_PATH/tstlms.dat | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits=`wc -l $DEST_PATH/tstlmsedits.dat | awk ' { print $1 } '`
    tot_rec_in_tstlms_bad=`wc -l $DEST_PATH/tstlms.bad | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits_bad=`wc -l $DEST_PATH/tstlmsedits.bad | awk ' { print $1 } '`
    #updating log table
    echo "pl/sql block started"
    sqlplus -s $fcp_login<<EOF
    define tot_rec_in_tstlms     = '$tot_rec_in_tstlms';
    define tot_rec_in_tstlmsedits     = '$tot_rec_in_tstlmsedits';
    define tot_rec_in_tstlms_bad     = '$tot_rec_in_tstlms_bad';
    define tot_rec_in_tstlmsedits_bad='$tot_rec_in_tstlmsedits_bad';
    define fcp_reqid ='$fcp_reqid';
    declare
    l_tstlms_file_id number := null;
    l_tstlmsedits_file_id number := null;
    l_tot_rec_in_tstlms number := 0;
    l_tot_rec_in_tstlmsedits number := 0;
    l_tot_rec_in_tstlms_bad number := 0;
    l_tot_rec_in_tstlmsedits_bad number := 0;
    l_request_id fnd_concurrent_requests.request_id%type;
    l_start_date fnd_concurrent_requests.actual_start_date%type;
    l_end_date fnd_concurrent_requests.actual_completion_date%type;
    l_conc_prog_name fnd_concurrent_programs.concurrent_program_name%type;
    l_requested_by fnd_concurrent_requests.requested_by%type;
    l_requested_date fnd_concurrent_requests.request_date%type;
    begin
    --getting concurrent request details
    begin
    SELECT fcp.concurrent_program_name,
    fcr.request_id,
    fcr.actual_start_date,
    fcr.actual_completion_date,
    fcr.requested_by,
    fcr.request_date
    INTO l_conc_prog_name,
    l_request_id,
    l_start_date,
    l_end_date,
    l_requested_by,
    l_requested_date
    FROM fnd_concurrent_requests fcr, fnd_concurrent_programs fcp
    WHERE fcp.concurrent_program_id = fcr.concurrent_program_id
    AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    exception
    when no_data_found then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log, 'No data found for request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when retrieving request_id request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlms.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlms_file_id,
                   'tstlms.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlms,
                   &tot_rec_in_tstlms_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlms file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlmsedits.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlmsedits_file_id,
                   'tstlmsedits.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlmsedits,
                   &tot_rec_in_tstlmsedits_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlmsedits file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    end;
    exit;
    EOF
    echo "rgis_tca_to_tlms_process.sql started"
    sqlplus -s $fcp_login @$SCHED_TOP/sql/rgis_tca_to_tlms_process.sql $fcp_reqid
    exit;
    echo "rgis_tca_to_tlms_process.sql ended"
    Error:
    RGIS Scheduling: Version : UNKNOWN
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    TCATLMS module: TCA To TLMS Import Process
    Current system time is 18-AUG-2011 06:13:27
    COUNT(*)
         16
    COUNT(*)
         25
    wc: cannot open /home/custom/sched/in/tstlms.bad
    wc: cannot open /home/custom/sched/in/tstlmsedits.bad
    pl/sql block started
    old 33:     AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    new 33:     AND fcr.request_id = 18661823; --fnd_global.conc_request_id();
    old 63:                &tot_rec_in_tstlms,
    new 63:                16,
    old 64:                &tot_rec_in_tstlms_bad,
    new 64:                ,
    old 97:                &tot_rec_in_tstlmsedits,
    new 97:                25,
    old 98:                &tot_rec_in_tstlmsedits_bad,
    new 98:                ,
    ERROR at line 64:
    ORA-06550: line 64, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql stddev sum variance
    execute forall merge time timestamp interval date
    <a string literal with character set specification>
    <a number> <a single-quoted SQL string> pipe
    <an alternatively-quoted string literal with character set specification>
    <an alternatively-q
    ORA-06550: line 98, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql st
    rgis_tca_to_tlms_process.sql started
    old 12: and concurrent_request_id = '&1';
    new 12: and concurrent_request_id = '18661823';
    old 18: and concurrent_request_id = '&1';
    new 18: and concurrent_request_id = '18661823';
    old 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,&1);
    new 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,18661823);
    old 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,&1);
    new 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,18661823);
    old 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',&1);
    new 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',18661823);
    declare
    ERROR at line 1:
    ORA-20001: Error occured when executing RGIS_TCA_TO_TLMS_PROCESS.sql ORA-01403:
    no data found
    ORA-06512: at line 59
    Executing request completion options...
    ------------- 1) PRINT   -------------
    Printing output file.
    Request ID : 18661823      
    Number of copies : 0      
    Printer : noprint
    Finished executing request completion options.
    Concurrent request completed successfully
    Current system time is 18-AUG-2011 06:13:29
    ---------------------------------------------------------------------------
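    An editorial aside on two failures visible in this log (inferred from the output above, not stated in the thread): the external table driver writes a BADFILE only when the table is actually read and at least one record is rejected, so a clean load legitimately leaves no tstlms.bad; the bare wc then fails and leaves the shell variable empty, which is what produces the dangling comma in the generated PL/SQL ("new 64: ,") and the PLS-00103 error. Quoting the substitution variables would make an empty value degrade to NULL instead of a syntax error; a minimal SQL*Plus sketch:
    set serveroutput on
    define tot_rec_in_tstlms_bad = ''
    declare
      l_bad number;
    begin
      l_bad := to_number('&tot_rec_in_tstlms_bad');  -- empty string becomes NULL, not a parse error
      dbms_output.put_line('bad records: ' || nvl(to_char(l_bad), 'none'));
    end;
    /
    Guarding the wc calls with a file-existence test (or pre-creating empty .bad files) would silence the wc errors as well.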

    Hi,
    Check the status of the batch in transaction SM35.
    If the batch is locked by mistake or stopped by any other error, you can release it and process it again.
    To release: Shift+F4.
    You can also analyse the job status with the F2 button.
    Bye

  • External table fails to view data after changing file name

    Hello,
    I have previously imported 122 external tables and was able to view their data without any problems.
    The names of these files have since changed. The names have a date in them.
    So I dropped the external tables from that database and recreated them with the new file names.
    I then went into OWB and dropped them from OWB.
    I reimported all of the tables without errors.
    I go to view the data and get an error that the file does not exist.
    The problem is that the file it says does not exist is the file I had been using last week, not the new file.
    However, when I go and right click the external table and do a configure, the correct name is showing up under datafiles.
    It's almost as though when I dropped the external tables, not everything was cleaned up. For whatever reason, OWB is still using the old names.
    Dan

    Hi there... Which OWB version are you using?
    If you've created these external tables using any other tool (i.e. SQL*Plus, SQLDeveloper, Toad, SQLNavigator, etc.), are you able to query these objects from one of those tools?
    If you've created these ETs using OWB, are you creating them as external tables or are you mapping them as source flat files?
    If you are using OWB to create them, there are a few configurations/properties you should change to have them running well.
    Please, post some more info.
    Thanks
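
    A hedged aside, separate from the OWB metadata issue: on the database side, a changed file name does not require dropping and recreating the external table at all; the location can be repointed in place (table and file names here are illustrative):

    alter table my_ext_table location ('new_file_20110818.dat');

    OWB would still need its own metadata refreshed, but the database object can stay put.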

  • Create external table fails

    I'm trying to map a trace file to an external table, but I'm getting a "missing keyword" error.
    set verify off
    create or replace directory TRACE_DIR as '/db_home/oracle/product/diag/rdbms/test/dtms_t/trace';
    column id new_value os_user;
    SELECT sys_context('USERENV', 'OS_USER') id FROM DUAL;
    BEGIN
          EXECUTE IMMEDIATE 'alter session set tracefile_identifier = &os_user' ;
    END;
    column tracefile_loc new_value tracefile
    select SUBSTR(tracefile_loc,INSTR(tracefile_loc,'/',-1)+1) tracefile_loc
    from
    ( select value ||'/'||(select instance_name from v$instance) ||'_ora_'||
             (select spid||case when traceid is not null then '_'||traceid else null end
                from v$process where addr = (select paddr from v$session
                                              where sid = (select sid from v$mystat
                                                            where rownum = 1)))
             || '.trc' tracefile_loc
        from v$parameter where name = 'user_dump_dest'
    );
    jbrock@ddtms_t> CREATE TABLE trace_ext
      2                     (line varchar2(80)
      3                     )
      4       ORGANIZATION EXTERNAL
      5       (
      6         TYPE ORACLE_LOADER
      7         DEFAULT DIRECTORY trace_dir
      8         ACCESS PARAMETERS
      9         (
    10           records delimited by newline
    11           missing field values are null
    12           ( line
    13           )
    14         )
    15         LOCATION (&&tracefile)
    16       )
    17       REJECT LIMIT UNLIMITED;
           LOCATION (ddtms_t_ora_9101_JBROCK.trc)
    ERROR at line 15:
    ORA-00905: missing keyword

    Hi,
    jimmyb wrote:
    ... jbrock@ddtms_t> CREATE TABLE trace_ext
    2                     (line varchar2(80)
    3                     )
    4       ORGANIZATION EXTERNAL
    5       (
    6         TYPE ORACLE_LOADER
    7         DEFAULT DIRECTORY trace_dir
    8         ACCESS PARAMETERS
    9         (
    10           records delimited by newline
    11           missing field values are null
    12           ( line
    13           )
    14         )
    15         LOCATION (&&tracefile)
    16       )
    17       REJECT LIMIT UNLIMITED;
    LOCATION (ddtms_t_ora_9101_JBROCK.trc)
    ERROR at line 15:
    ORA-00905: missing keyword
    Doesn't the location have to be in single quotes? For line 15, try
    LOCATION ('&&tracefile')

  • Validate External Table Fails

    Hi there,
    I am new to WB and I am trying to set up a simple project to read from a data file and populate a database table. I have created two locations, one for the database and the other a file location. I have created a module, created the external table, and imported the other database table. I already have an external table in my database that reads the data file, but for some reason I was unable to import it (when I did so, it didn't appear), so I created it again in WB. Having done so, if I perform validation on it by right-clicking, I get the following error:
    VLD-0187: The location is not configured for module MY_MODULE.
    Specify a location in the configuration for the database module. Locations must be specified as data locations in the Module Editor to appear in the configuration location selection list.
    Does anybody know what this means? The database module under which I have created my external table and imported my table uses the database location I have created, and my external table uses the file location I have created.
    Any help would be much appreciated.
    thanks, Anil

    OK, there are a few things that need to be set:
    1. MY_MODULE is your Schema (module) in OWB Project explorer --> Double clicking on it opens 2 tabs for locations
    -- 1. Metadata location --> this is your connection to DB
    -- 2. Data locations --> this is where data is/will be located
    2. In connection explorer you need to have 2 connections
    -- 1. DB connection --> this is your connection to DB
    -- 2. File locations --> this is directory where your files are
    3. Double click on your External table --> tab Location --> set file location
    Do you have all of these set?

  • Data loaded successfully in RSA3, but can not find data in BW PSA

    Dear all:
    We have a custom datasource enhanced through a BADI. For one column, in the source system's RSA3, I can see this field populated with data, but in BW, when I load the data into the PSA and check, this field is blank...
    I tried several things, like re-replicating the datasource, and deleting the datasource in BW and re-replicating, but nothing worked.
    Has anybody met the same situation before? How should I handle this issue?
    Any post would be appreciated and thank you all in advance!
    Tim

    Hi Pavan:
    Thanks for your reply.
    I don't have any selection field for this datasource, and the record count between ECC and BW matches. Actually the problem is only for one field: on the ECC side in RSA3, this field is populated with values, while in BW PSA the column is there, but all the content is blank.
    I call one function module to populate that field:
    CALL FUNCTION 'CLAF_CLASSIFICATION_OF_OBJECTS'
      EXPORTING
    *   CLASS                = ''
        CLASSTEXT            = 'X'
        CLASSTYPE            = '023'
    *   CLINT                = 0
        FEATURES             = 'X'
        LANGUAGE             = SY-LANGU
        OBJECT               = W_OBJECT
        OBJECTTABLE          = 'MCH1'
        KEY_DATE             = SY-DATUM
        INITIAL_CHARACT      = 'X'
    *   NO_VALUE_DESCRIPT    =
        CHANGE_SERVICE_CLF   = 'X'
    *   INHERITED_CHAR       = ' '
    *   CHANGE_NUMBER        = ' '
      TABLES
        T_CLASS              = ITAB_CLASS
        T_OBJECTDATA         = ITAB_OBJECTDATA
    *   I_SEL_CHARACTERISTIC =
    *   T_NO_AUTH_CHARACT    =
      EXCEPTIONS
        NO_CLASSIFICATION    = 1
        NO_CLASSTYPES        = 2
        INVALID_CLASS_TYPE   = 3
        OTHERS               = 4.

    IF ITAB_OBJECTDATA IS NOT INITIAL.
      READ TABLE ITAB_OBJECTDATA INTO wa_OBJECTDATA WITH KEY atnam = 'ZLOBM_PLNTMFG'.
      <LS_ESMI_BW_MD_ZBATC0001>-ZORIGPLNT = wa_OBJECTDATA-AUSP1.
    ENDIF.
    The code is very simple; it just calls the standard FM and fetches the value from the return table. Is there anything wrong with my code?
    Thanks!
    Tim

  • .mov file loads into lightroom ok but crashes when playing

    Lightroom 4.1
    QT player 7.7.2 is installed
    Window 7
    H.264 HD video from Canon S100 camera
    .mov files play fine in WMP.
    First frame of video displays in the LR library module. In LR library view, click play. The video advances about 1 frame, then hangs. I get the "Adobe Photoshop Lightroom 64-bit has stopped working" pop-up window. The program has to be closed. This happens every time.
    None of the following processes are running before starting LR:
    dynamiclinkmanager.exe
    dynamiclinkmediaserver.exe
    amecommand.exe
    Adobe QT32 Server.exe.
    The video also plays OK in QT Player and WMP.

    Problem solved. It was an authorization problem. The answer resides here: http://docs.info.apple.com/article.html?artnum=306424

  • Financial/Project Analytics Load - Some of the Tasks TRUNCATE Table Fails

    Oracle Business Applications 7.9.6.1 - Financial and Project Analytics
    DAC Load for Project Analytics fails with most of the tasks failing at Truncate Table Tasks.
    I am implementing OBIA Financial Analytics and Project Analytics. I am able to configure and run the out-of-the-box loads without any issues for Financial Analytics. After making sure everything was fine with Fin Analytics, I configured Project Analytics and ran the load only for the project - ORA12.
    Financial Analytics: load went through fine without any issues.
    Project Analytics: the majority of the tasks fail at 'TRUNCATE TABLE table_name' itself.
    No Informatica session log files are created.
    No workflow logs either, but taskname.log.bin files are created.
    I am assuming the loads are failing because it is unable to truncate the data loaded by the Financial Analytics load, due to foreign key constraints.
    I did the integration as per the document, by unchecking in the Configuration Tags.
    Do I need to run the loads together? I am unable to figure out what is happening due to the absence of the log files, i.e. session and workflow.
    DAC logs are not providing much info.
    Any inputs are appreciated.
    Thanks

    hi
    You need to create a new execution plan that contains both the Financial and Project subject areas. Do not forget to build the new plan.
    thx
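
    If the root cause is indeed foreign keys blocking TRUNCATE (that would surface as ORA-02266), a hedged sketch of the usual workaround, with made-up table and constraint names, is to disable the child constraints around the load:

    alter table w_proj_child_f disable constraint fk_proj_child;
    truncate table w_project_d;
    alter table w_proj_child_f enable constraint fk_proj_child;

    That said, the missing session/workflow logs suggest an Informatica/DAC configuration problem worth ruling out first.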

  • Can we cleanse and transform data at flat file or external table level?

    Hi,
    I have some data that I want to cleanse and transform. I don't want to cleanse it after I populate the external table; I want to get it done at the flat file level or while populating the external table. Can we cleanse and transform data at the flat file or external table level through Oracle or OWB 11.2? Is it possible to run a conditional load (i.e. having a where clause or if-else-then) for an external table? Can we call Oracle functions for an external table at the time of creation?
    Thanks in advance.
    Regards,
    Ann.

    Hi Oleg,
    Thanks a lot for the clarification. :)
    So is there a way that I can cleanse the data within the text file through Oracle or OWB? I have datatype mismatches in the data, and most of my data is getting rejected because of that. The only way I can think of to solve this problem is to create the external table with all fields as VARCHAR2 and then cleanse the data. But that doesn't seem very efficient, plus it will get very complicated because I have almost 80-90 fields.
    Any help?
    Thanks and regards,
    Ann.
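
    To the original questions: an external table can be queried like any read-only table, so a conditional load with a WHERE clause and Oracle function calls is possible at SELECT time, and the cleansing happens while the rows stream out of the file, with no intermediate copy. A minimal sketch with made-up names, following the all-VARCHAR2 approach discussed above:

    insert into target_tab (id, amount, load_date)
    select to_number(trim(src_id)),
           to_number(replace(src_amount, ',', '.')),
           to_date(src_date, 'YYYYMMDD')
      from src_all_varchar_ext                     -- external table, every field VARCHAR2
     where regexp_like(trim(src_id), '^[0-9]+$');  -- conditional load: skip non-numeric ids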

  • Network location for external table's file

    Hi,
    I am trying to import data from a CSV file into an Oracle 10g database. I can successfully load data from a CSV file located on the server itself using an external table. But when I redefine the directory object to point to a folder on a network location, it doesn't work. Is it a limitation of external tables that they cannot access data from a file on a network location?
    Any clarifications and suggestions would be highly appreciated.
    Thanks,
    Aniket

    Hi Nicolas,
    I created a directory object pointing to a shared folder on another system (not the Oracle 10g server) in the network. The folder is shared and can be accessed from the Oracle 10g server. When I create an external table with that shared folder as the default directory, it cannot read the CSV file from the folder.
    But when I redefined the directory object to a local folder on the Oracle 10g server, it could read the CSV file in the local folder using the external table.
    My understanding is that with external tables one can only read files that are on the local machine, i.e. the Oracle 10g server, and not on a different system.
    Please correct me if I am wrong. Any further suggestions would be highly appreciated.
    Regards,
    Aniket
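
    An editorial aside on the underlying rule: a directory object is resolved on the database server by the OS account that runs the Oracle processes, so a network folder works only if that account can reach it (on Windows, the service account typically cannot see mapped drives, so a UNC path plus share permissions are needed; on Unix, an NFS mount behaves like a local path). A hedged sketch with illustrative names:

    create or replace directory net_dir as '\\fileserver\share\data';
    grant read, write on directory net_dir to aniket;

    Whether this works then depends on the OS-level rights of the Oracle service account on that share, not on the database user.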

  • Need info on using external tables load/write data

    Hi All,
    We are planning to load conversion/interface data using the external tables feature available in the Oracle database.
    Also, for outbound interfaces, we are planning to use the same feature to write data to the file system.
    If you have done a similar exercise in any of your projects, please share sample code units. Also, let me know if there
    are any cons/limitations to this approach.
    Thanks,
    Balaji

    Please see old threads for similar discussion -- http://forums.oracle.com/forums/search.jspa?threadID=&q=external+AND+tables&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein
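
    As a starting point, a hedged sketch of the outbound direction, since inbound ORACLE_LOADER examples appear elsewhere in this thread collection: the ORACLE_DATAPUMP driver can write query results to a file at table-creation time (all names illustrative):

    -- unload a query result to a file via an external table
    create table emp_unload
    organization external
    (
      type oracle_datapump
      default directory out_dir
      location ('emp_unload.dmp')
    )
    as
    select empno, ename, sal from emp;

    One known limitation: the output is a binary dump readable only by another ORACLE_DATAPUMP external table or impdp, not plain text, so text-file outbound interfaces usually fall back to UTL_FILE.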

  • Error while fetching data from OWB Client using External Table.

    Dear All,
    I am using Oracle Warehouse Builder 11g & Oracle 10gR2 as repository database on Windows 2000 Server.
    I am facing an issue fetching data from a flat file using an external table from the OWB Client.
    I have performed all the steps without any error, but when I try to view the data, I get the following error.
    ======================================
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    java.sql.SQLException: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04040: file expense_categories.csv in SOURCE_LOCATION not found
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
         at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
         at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
         at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1030)
         at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:183)
         at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:774)
         at oracle.jdbc.driver.T4CStatement.executeMaybeDescribe(T4CStatement.java:849)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1186)
         at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1377)
         at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:386)
         at oracle.wh.ui.owbcommon.QueryResult.<init>(QueryResult.java:18)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleQueryResult.<init>(OracleDVTableModel.java:48)
         at oracle.wh.ui.owbcommon.dataviewer.relational.OracleDVTableModel.doFetch(OracleDVTableModel.java:20)
         at oracle.wh.ui.owbcommon.dataviewer.RDVTableModel.fetch(RDVTableModel.java:46)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel$1.actionPerformed(BaseDataViewerPanel.java:218)
         at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
         at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
         at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
         at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:282)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerPanel.executeQuery(BaseDataViewerPanel.java:493)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.init(BaseDataViewerEditor.java:116)
         at oracle.wh.ui.owbcommon.dataviewer.BaseDataViewerEditor.<init>(BaseDataViewerEditor.java:58)
         at oracle.wh.ui.owbcommon.dataviewer.relational.DataViewerEditor.<init>(DataViewerEditor.java:16)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
         at oracle.wh.ui.owbcommon.IdeUtils._tryLaunchEditorByClass(IdeUtils.java:1412)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1349)
         at oracle.wh.ui.owbcommon.IdeUtils._doLaunchEditor(IdeUtils.java:1367)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:869)
         at oracle.wh.ui.owbcommon.IdeUtils.showDataViewer(IdeUtils.java:856)
         at oracle.wh.ui.console.commands.DataViewerCmd.performAction(DataViewerCmd.java:19)
         at oracle.wh.ui.console.commands.TreeMenuHandler$1.run(TreeMenuHandler.java:188)
         at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:209)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:461)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)
    ===========================
    The error says that file expense_categories.csv was not found in SOURCE_LOCATION, but I am 100% sure that the file is very much there.
    Has anybody faced the same issue?
    Do we need to configure something before loading data from a flat file from the OWB Client?
    Any help would be highly appreciated.
    Regards,
    Manmohan Sharma

    Hi Detlef / Gowtham,
    Now I am able to fetch data from flat files from the OWB Server as well as the OWB Client.
    One way I achieved it, as suggested by you:
    1) Created the location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied all flat files to the OWB Server
    5) Updated the location which I created at the client.
    The other way:
    1) Created the location on the OWB Client
    2) Sampled the files at the client
    3) Created & configured the external table
    4) Copied the flat files onto the server into the same drive & directory, i.e. if all my flat files are in C:\data at the OWB Client, then I copied them to C:\data on the OWB Server. But this is not feasible for non-Windows servers.
    Hence my problem is solved.
    Thanks a lot.
    Regards,
    Manmohan
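
    A quick sanity check worth recording here (an editorial aside): KUP-04040 is raised on the database server, so the file must exist in the OS path that SOURCE_LOCATION maps to on the server host, readable by the oracle OS user; a file that is "very much there" on the client machine does not count. The mapping can be verified with:

    select directory_path
      from all_directories
     where directory_name = 'SOURCE_LOCATION';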

  • Load XML records in a normal table

    Good afternoon all,
    I have a very simple question:
    I get an XML file and want to store that data in my Oracle database in a normal table.
    I have seen so many answers everywhere, varying from LOBs to using XDB etc.
    What I don't understand is why it is so difficult.
    When I want to load a CSV file into a table, I make a very small control file (CTL) and run SQL*Loader from the command line.
    Control file:
    load data
    infile 'import.csv'
    into table emp
    fields terminated by "," optionally enclosed by '"'          
    ( empno, empname, sal, deptno )
    command:
    sqlldr user/password@SID control=loader_Control_File.ctl
    Next I connect to the database and run the SQL query:
    select * from emp;
    and I see my data as usual; I can build Crystal Reports on it, etc.
    I really don't understand why this can't be done with an XML file.
    Oracle knows the fields in the table EMP.
    The XML file has <EMPNO> and </EMPNO> around every field.
    It can't be easier than that, I would say.
    I can understand Oracle would like some kind of description of the XML data, so a reference to an XSD file would be understandable.
    But all the examples describe LOB things (whatever that is).
    Who can help me get XML data into a normal table?
    Thanks
    Frank

    Hi Frank,
    What I don't understand is why it is so difficult.
    Why do you think that?
    An SQL*Loader control file might appear very small and simple to you, but you don't actually see what happens inside the loader itself; I'd guess a lot of complex operations (parsing, datatype mapping, memory allocation etc.).
    XML, contrary to the CSV format, is a structured, well-standardized language and can handle far more complex documents than row-organized CSV files.
    I think it naturally requires a bit of extra work (for a developer) to describe what we want to get out of it.
    However, using an XML schema is not mandatory to load XML data into a relational table.
    It's useful if you're interested in high-performance loading and scalability, as it allows Oracle to fully understand the XML data model it has to deal with, and make the correct mapping with SQL types in the database.
    Furthermore, now with 11g BINARY XMLType, performance has been improved with or without schema.
    Here's a simple example, loading XML file "import.xml" into table MY_EMP.
    Do you find it difficult? ;)
    SQL> create or replace directory test_dir as 'D:\ORACLE\test';
    Directory created
    SQL> create table my_emp as
      2  select empno, ename, sal, deptno
      3  from scott.emp
      4  where 1 = 0
      5  ;
    Table created
    SQL> insert into my_emp (empno, ename, sal, deptno)
      2  select *
      3  from xmltable('/ROWSET/ROW'
      4         passing xmltype(bfilename('TEST_DIR', 'import.xml'), nls_charset_id('CHAR_CS'))
      5         columns empno  number(4)    path 'EMPNO',
      6                 ename  varchar2(10) path 'ENAME',
      7                 sal    number(7,2)  path 'SAL',
      8                 deptno number(2)    path 'DEPTNO'
      9       )
    10  ;
    14 rows inserted
    SQL> select * from my_emp;
    EMPNO ENAME            SAL DEPTNO
    7369 SMITH         800.00     20
    7499 ALLEN        1600.00     30
    7521 WARD         1250.00     30
    7566 JONES        2975.00     20
    7654 MARTIN       1250.00     30
    7698 BLAKE        2850.00     30
    7782 CLARK        2450.00     10
    7788 SCOTT        3000.00     20
    7839 KING         5000.00     10
    7844 TURNER       1500.00     30
    7876 ADAMS        1100.00     20
    7900 JAMES         950.00     30
    7902 FORD         3000.00     20
    7934 MILLER       1300.00     10
    14 rows selected
    import.xml :
    <?xml version="1.0"?>
    <ROWSET>
    <ROW>
      <EMPNO>7369</EMPNO>
      <ENAME>SMITH</ENAME>
      <SAL>800</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7499</EMPNO>
      <ENAME>ALLEN</ENAME>
      <SAL>1600</SAL>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <!-- more rows here -->
    <ROW>
      <EMPNO>7934</EMPNO>
      <ENAME>MILLER</ENAME>
      <SAL>1300</SAL>
      <DEPTNO>10</DEPTNO>
    </ROW>
    </ROWSET>
    Who can help me get XML data into a normal table?
    If you have a specific example, feel free to post it, including the following information:
    - structure of the target table
    - sample XML file
    - database version (select * from v$version)
    Hope that helps.
    Edited by: odie_63 on 9 March 2011 21:22

  • External Tables on Windows

    I have Oracle loaded on a Unix server. I am trying to execute stored procedures using external tables. My external table directories reside on a Windows OS. While I was successful in creating the external tables and the logical directories mapped to the OS directory, I was not able to execute the procedure; it came up with errors.
    I have the necessary grants on these directories.
    The procedure runs successfully if the directories are created on the Unix box.
    14:41:21 Start Executing PL/SQL block ...
    14:41:21 Starting execution of PL/SQL block...
    Error -29913: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: unable to open log file exttbl_1970284.log
    OS error No such file or directory
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    14:41:21 Execution failed: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    14:41:21 ORA-29400: data cartridge error
    14:41:21 KUP-04063: unable to open log file invpbl_1970284.log
    14:41:21 OS error No such file or directory
    14:41:21 ORA-06512: at line 7
    14:41:21 End Executing PL/SQL block
    I have checked the read/write permissions on the Windows folder, and they have full access.
    It is critical for me to make this work on a Windows platform.

    Hi
    Extract from a Metalink note:
    If you query the following views, you may find a mismatch between the directory paths.
    SQL> select * from dba_directories ;
    OWNER     DIRECTORY_NAME     DIRECTORY_PATH
    SYS     ADMIN_DAT_DIR     e:\external_tables\data
    SYS     ADMIN_DAT_DIR     e:\external_tables\bad
    SYS     ADMIN_DAT_DIR     e:\external_tables\log      
    SQL> select * from dba_external_locations where table_name='ADMIN_EXT_EMPLOYEES';
    OWNER TABLE_NAME      LOCATION DIRECTORY_OWNER      DIRECTORY_NAME
    SCOTT ADMIN_EXT_EMPLOYEES empxt1.dat SYS           ADMIN_DAT_DIR
    SCOTT ADMIN_EXT_EMPLOYEES empxt2.dat SYS           ADMIN_DAT_DIR
    Fix --- Check whether the OS directories "data", "bad" and "log" exist under e:\external_tables. If they do not exist, create them and rerun the query.
    Regards
    Hector Ulloa Ligarius

  • External Table Preprocessor

    Dear Experts,
    I was trying out the "preprocessor" command for external tables, but I was not able to get it to work.
    I just followed the steps mentioned in the latest Oracle Magazine:
    http://www.oracle.com/technetwork/issue-archive/2012/12-nov/o62asktom-1867739.html
    But finally, when I do the select from table df, I get the following error.
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04095: preprocessor command /export/csrmgr/ripple_dms/run_df.sh encountered error "error during exec: errno is 2
    PS: The output of v$version;
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    "CORE     11.2.0.2.0     Production"
    TNS for IBM/AIX RISC System/6000: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production
    Could anyone help me resolve this issue?

    I'm not able to test Tom's example at the minute.
    However, that new feature was also discussed in the March/April 2011 edition of Oracle Magazine, in an article by Arup Nanda on the external table preprocessor introduced in 11gR2.
    CHANGE IN PROCESS
    Now suppose that with your external table and its text file in place, the input file for the external table is compressed to reduce the volume of data transmitted across the network. Although compression helps the network bandwidth utilization, it creates a challenge for the ETL process. The file must be uncompressed before its contents can be accessed by the external table.
    Rather than uncompress the file in a separate process, you can use the preprocessor feature of Oracle Database 11g Release 2 with external tables to uncompress the file inline. And you will not need to change the ETL process.
    To use the preprocessor feature, first you need to create a preprocessor program. The external table expects input in a text format but not necessarily in a file. The external table does not need to read a file; rather, it expects to get the file contents "fed" to it. So the preprocessor program must stream the input data directly to the external table, and not create another input file. The input to the preprocessor will be a compressed file, and the output will be the uncompressed contents.
    The following is the code for your new preprocessor program, named preprocess.bat:
    @echo off
    C:\oracle\product\11.2.0\dbhome_1\BIN\unzip.exe -qc %1
    The first line, @echo off, suppresses the output of the command in a Windows environment. The remaining code calls the unzip.exe utility located in the Oracle home. The utility needs an input file, which is the first (and only) parameter passed to it, shown as %1. The options q and c tell the utility to uncompress quietly (q), without producing extraneous output such as "Inflating" or "%age inflated", and to match filenames case-insensitively (c), respectively.
    Next you need to create the directory object where this preprocessor program is located. Logging in as SYS, issue
    create directory execdir as 'c:\tools';
    And now grant EXECUTE permissions on the directory to the ETLADMIN user:
    grant execute on directory execdir to etladmin;
    Finally, create the new external table:
    create table indata1
    (
      cust_id number,
      cust_name varchar2(20),
      credit_limit number(10)
    )
    organization external
    (
      type oracle_loader
      default directory etl_dir
      access parameters
      (
        records delimited by newline
        preprocessor execdir:'preprocess.bat'
        fields terminated by ","
      )
      location ('indata1.txt')
    )
    /
    It calls the preprocess.bat executable in the directory specified by EXECDIR before the external table accesses the indata1.txt file in the location specified by the ETL_DIR directory. Remember, indata1.txt is now a compressed file. So, in effect, the external table reads not the actual specified input file but rather the output of preprocess.bat, which is the uncompressed data from the indata1.txt file.
    If you select from the external table now, the output will be similar to that of the earlier select * from indata1; query. The preprocessor passed the uncompressed contents of the indata1.txt (compressed) file on to the external table. There was no need to uncompress the file first—saving significant time and the intermediate space required and making it unnecessary to change the ETL process.
    This inline preprocessing unzip example uses a script, but that is not always necessary. An executable can be used instead. For example, in Linux you can use /bin/gunzip. However, the utility can’t accept any parameters. So if you pass parameters (as in this article’s example), you must use a script.
    Security Concerns
    =================
    The EXECUTE privilege on a directory is a new feature introduced in Oracle Database 11g Release 2. It enables the DBA to grant EXECUTE permissions only for certain directories and only to certain users. Without WRITE privileges, users will not be able to
    update the executables inside a directory to insert malicious code, but users will be able to execute the “approved” code accessible in a single location. The DBA can put all the necessary preprocessor tools into a single directory and grant EXECUTE privileges there to the users who may need them. And, of course, the executables and data should be in different directories.
    Preprocessing also requires some special precautions on the part of the DBA. Because the executables called by preprocessing programs will be executed under the privileges of the Oracle software owner and malicious executable code can cause a lot of damage, the DBA should be extremely careful in monitoring executables for potentially harmful code.
    The directory containing the preprocessor executables needs to be accessible to the Oracle software owner for EXECUTE operations only, not for WRITE activity. Therefore, as an added precaution, the system administrator can remove WRITE access to that directory from all users, including the Oracle software owner. This significantly reduces the chance of damage by malicious code.
    Other Uses
    ==========
    Compression is not the only use for inline preprocessing, although it certainly is the most widely used. You can, for example, use this preprocessing technique to show the output of a program as an external table. Consider, for instance, the dir command in Windows for listing the contents of a directory. How would you like to get the output as a table so that you can apply predicates?
    Getting and using this output is quite simple with the preprocessor functionality. Remember, the preprocessor does not actually need a file but, rather, requires the output of the preprocessor program. You can write a preprocessor program to send the output of a dir command. The new preprocessor program, named preproc_dir.bat, has only the following two lines:
    @echo off
    dir
    You will also need a file for the external table. The contents of the file are irrelevant, so you can use any file that the Oracle software owner can read in a directory to which that owner has read access. For this example, the file is dirfile.txt, and although the contents of the file are immaterial, the file must exist, because the external table will access it. Listing 1 shows how to create the table.
    Because the dir command displays output in a prespecified manner, the external table easily parses it by reading the fields located in specific positions. For example, positions 1 through 10 display the date, 11 through 20 display the time, and so on. The dir command produces some heading and preliminary information that the external table has to ignore, so there is a skip 5 clause in Listing 1 that skips the first five lines of the output. The last few lines of the output show how many files and directories are present and how much free space remains. This output must be skipped as well, so the external table displays records only when the date column has a value.
    Listing 1 also shows the result of a query against the external table. Because the MOD_DT column is of the date datatype, you can also apply a WHERE condition to select a specified set of records.
    Code Listing 1:
    ===============
    create table dir_tab
    (
      mod_dt date,
      mod_time char(10),
      file_type char(10),
      file_size char(10),
      file_name char(40)
    )
    organization external
    (
      type oracle_loader
      default directory etl_dir
      access parameters
      (
        records delimited by newline
        preprocessor execdir:'preproc_dir.bat'
        skip 5
        load when (mod_dt != blanks)
        fields
        (
          mod_dt position (01:10) DATE mask "mm/dd/yyyy",
          mod_time position (11:20),
          file_type position (21:29),
          file_size position (30:38),
          file_name position (39:80)
        )
      )
      location ('dirfile.txt')
    )
    reject limit unlimited;
    -- select from this table
    SQL> select * from dir_tab;
    MOD_DT        MOD_TIME       FILE_TYPE       FILE_SIZE      FILE_NAME
    16-DEC-10     10:12 AM       <DIR> .
    16-DEC-10     10:12 AM       <DIR> ..
    22-MAY-10     09:57 PM       <DIR> archive
    22-MAY-10     10:27 PM                                2,048 hc_alap112.dat
    05-DEC-10     07:07 PM                                   36 indata1.txt
    22-DEC-05     04:07 AM                               31,744 oradba.exe
    16-DEC-10     09:58 AM                                1,123 oradim.log
    28-SEP-10     12:41 PM                                1,536 PWDALAP112.ora
    16-DEC-10     09:58 AM                                2,560 SPFILEALAP112.ORA
    9 rows selected.
    -- select a file not updated in last 1 year
    SQL> select * from dir_tab where mod_dt < sysdate - 365;
    MOD_DT        MOD_TIME       FILE_TYPE       FILE_SIZE      FILE_NAME
    22-DEC-05     04:07 AM                               31,744 oradba.exe
