Variable filename external table

I have written a script to import data from a .csv into a table in Apex:
CREATE TABLE APEX_USER_002.INK_KASSA_LOAD
(
       KASS_DTM      VARCHAR2(255),
       KASS_ID       VARCHAR2(255),
       KASS_VKP_COD  VARCHAR2(255),
       KASS_VKP_OMS  VARCHAR2(255),
       KASS_COD      VARCHAR2(255),
       KASS_OMS      VARCHAR2(255),
       KASS_PRS_COD  VARCHAR2(255),
       KASS_PRS_OMS  VARCHAR2(255)
)
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY AFM_KASSA_IN
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8MSWIN1252
    LOGFILE AFM_KASSA_LOG: 'log.log'
    BADFILE AFM_KASSA_BAD: 'badfile.bad'
    NODISCARDFILE
    SKIP 1
    FIELDS TERMINATED BY ';'
    OPTIONALLY ENCLOSED BY '"'
    LRTRIM
    MISSING FIELD VALUES ARE NULL
    (
       KASS_DTM,
       KASS_ID,
       KASS_VKP_COD,
       KASS_VKP_OMS,
       KASS_COD,
       KASS_OMS,
       KASS_PRS_COD,
       KASS_PRS_OMS
    )
  )
  LOCATION ('variable_filename.csv')
);
It works fine when I hardcode the name of the .csv in the script, like this: LOCATION ('Kassa_20130406130014.csv')
I have written a function which returns the filename, but when I use the function in LOCATION (function()) I get an error: ORA-00905: missing keyword.
How can I call a function in LOCATION ()?

Hi,
I'm not sure, but I think you need to re-create the external table.
Create a procedure that drops and re-creates the external table dynamically with the new file name (or simply alters its LOCATION; see the sketch below).
Or use e.g. DBMS_SCHEDULER to call an OS script that always renames the file to the same name for the external table.
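A minimal sketch of the dynamic approach (the procedure name is illustrative, and DDL can't take bind variables, so validate the incoming filename before concatenating it; ALTER TABLE ... LOCATION avoids a full drop/re-create):
CREATE OR REPLACE PROCEDURE set_kassa_location (p_filename IN VARCHAR2) AS
BEGIN
  -- repoint the external table at the new file; the rest of the definition is unchanged
  EXECUTE IMMEDIATE
    'ALTER TABLE APEX_USER_002.INK_KASSA_LOAD LOCATION (''' || p_filename || ''')';
END;
/
-- then, before each load:
-- EXEC set_kassa_location('Kassa_20130406130014.csv');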
Regards,
Jari

Similar Messages

  • How to insert filename as column in external table.

    I have an external table that is made up of a number of different files. I would like to add a new column in the table to store the actual external filename. How can this be done?
    Thanks in advance

    That's going to be a problem, then. The ISO 8859-1 character set doesn't support that character, so you cannot store a ™ in a CHAR or VARCHAR2 column in your database.
    Potentially, you could use an NCHAR or NVARCHAR2 column, depending on your national character set. Using the national character set, though, can cause other problems, particularly for front-end applications that may not fully support those data types.
    You could also potentially change your existing character set to something that does support that character, but that may be a rather involved sort of change.
    Justin

  • External Table for Variable Length EBCDIC file with RDWs

    I am loading an EBCDIC file where the record length is stored in the first 4 bytes. I am able to read the 4 bytes using the db's native character set, i.e.:
    records variable 4
    characterset WE8MSWIN1252
    data is little endian
    But I then have to convert each string column individually on the select, i.e.:
    convert(my_col, 'WE8MSWIN1252', 'WE8EBCDIC37')
    If I change the character set to EBCDIC:
    records variable 4
    characterset WE8EBCDIC37
    data is little endian
    I get the following error reading the first 4 bytes:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04019: illegal length found for VAR record in file ...
    We cannot use the FTP conversion as the file contains packed decimals.
    There are other options for converting the file, but I am wondering if anyone was able to get an external table to read a native EBCDIC file without a pre-processing step.


  • Create external table on a CSV file with a variable number of delimiters

    Hi experts,
    I was wondering what the best approach for the following issue.
    I'm trying to create an external table on a file which has, in each record (on a new line, 'RECORDS DELIMITED BY NEWLINE'), a variable number of delimiters. For example, my delimiter is a comma: in the first record I have 100 commas, in the second only 60, in the next 80, etc.
    Is there a way to create an external table on this file?
    Thanks in advance.

    Altering the source is not an option, unfortunately. But it was suggested there would be a sensible way to handle it. To solve it I want to pivot it, but first I think I need to get the file into Oracle through an external table.
    Edited by: Jonathan Wisgerhof on Mar 31, 2009 12:21 PM
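    One way to get such a file into the database first is to read each whole line as a single column and split it afterwards in SQL. A sketch (the directory, file and table names are assumptions):
    create table ext_lines
    (
      line varchar2(4000)
    )
    organization external
    (
      type oracle_loader
      default directory ext_dir
      access parameters
      (
        records delimited by newline
        nobadfile nologfile
        fields notrim
        missing field values are null
        (
          line position(1:4000) char(4000)
        )
      )
      location ('data.csv')
    )
    reject limit unlimited;
    -- pick out the n-th comma-separated field of a line, e.g. the 3rd:
    select regexp_substr(line, '[^,]+', 1, 3) as field_3
    from ext_lines;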

  • External table filenames

    I am loading hundreds/thousands of files of delimited data via external tables. The record counts for verifying the data are provided on a file-by-file level by the mainframe system sending the data; for example, file prn00532.dat has 78,322 records.
    I can add all the files to the create external table DDL no problem, but I then cannot verify that each file that is loaded has the correct number of records, as the files all get loaded as a group.
    My question is: without having to create a table for each external file, is there a way to have a calculated column that Oracle adds, or a bind or something I can reference, to insert into another table/the same table that shows the external data file that Oracle is currently loading?
    Basically, so that when Oracle switches from file prn00532.dat to prn00533.dat I can capture that...

    Instead of disturbing parallel coordinators you could just submit n jobs, each doing a simple insert into big_table select * from external_table_n or something similar - all external_table_i having the same structure, just different location parameters.
    Regards
    Etbin
    Edited by: Etbin on 26.2.2011 12:25
    possibly grouping text data files according to partitions the data would end within
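    A sketch of submitting those jobs (hypothetical table and job names; each job runs in the background under DBMS_SCHEDULER):
    begin
      for i in 1 .. 4 loop
        dbms_scheduler.create_job(
          job_name   => 'LOAD_EXT_' || i,
          job_type   => 'PLSQL_BLOCK',
          job_action => 'begin insert into big_table select * from external_table_' || i ||
                        '; commit; end;',
          enabled    => true);
      end loop;
    end;
    /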

  • External table - fetch location ?

    Using Oracle 10.2.0.5
    An external table is a construct that gives me SQL access to a file.
    Is it possible to know the name of the file somehow inside the select? Like Add a column with the file name?
    pseudo example
    CREATE TABLE EXT_DUMMY
    (
        "RECORDTYPE" VARCHAR2(100 BYTE),
        "COL1" VARCHAR2(100 BYTE),
        "COL2" VARCHAR2(100 BYTE),
        "FILE" VARCHAR2(100 BYTE)
    )
    ORGANIZATION EXTERNAL
    (
        TYPE ORACLE_LOADER DEFAULT DIRECTORY "IMPORT_BAD_FILE"
        ACCESS PARAMETERS (
                 records delimited BY newline
                 FIELDS TERMINATED BY ';'
                 MISSING FIELD VALUES ARE NULL
                   ( RECORDTYPE CHAR
                   , COL1 CHAR
                   , COL2 CHAR
                   , FILE CHAR FILLER
                   )
        )
        LOCATION ( 'Testfile1.txt', 'Testfile2.txt' )
    )
    reject limit 10
    ;
    The result could look like this:
    RECORDTYPE   COL1     COL2     FILE
    SAMPLE       DUMMY    DUMMY    Testfile1.txt
    SAMPLE       DUMMY1   DUMMY    Testfile1.txt
    SAMPLE       DUMMY2   DUMMY    Testfile1.txt
    SAMPLE       DUMMY3   DUMMY    Testfile1.txt
    SAMPLE       DUMMY1   DUMMY1   Testfile2.txt
    SAMPLE       DUMMY1   DUMMY2   Testfile2.txt
    SAMPLE       DUMMY2   DUMMY1   Testfile2.txt
    I would like to know from which file a certain row is read. Maybe I missed an option in the documentation. In this example I have two different files as the source for the external table.
    Another use case could be this:
    If I enable a user to switch the external table to a different file with alter table EXT_DUMMY location ('Testfile3.txt'), how can we know which file is read during the select on the table? When userA does the select, maybe userB just altered the location before the select was started. Therefore userA would read a different file than expected.
    Edited by: Sven W. on May 26, 2011 4:48 PM
    Edited by: Sven W. on May 26, 2011 4:51 PM
    Edited by: Sven W. on May 26, 2011 5:11 PM

    Hi Sven,
    I'm not sure how much we can rely on this, but let's consider the following :
    create table test_xt (
      rec_id  number
    , message varchar2(100)
    )
    organization external (
      default directory test_dir
      access parameters (
        records delimited by newline
        fields terminated by ';'
      )
      location (
        'marc5.txt'
      , 'test1.csv'
      , 'test2.csv'
      , 'test3.csv'
      )
    );
    I always thought the ROWID doesn't hold much sense for an external table, but...
    SQL> select t.rowid
      2       , dump(t.rowid) as rowid_dump
      3       , regexp_substr(dump(t.rowid,10,9,1),'\d+$') as file#
      4       , t.*
      5  from test_xt t
      6  ;
    ROWID              ROWID_DUMP                                                FILE#      REC_ID MESSAGE
    (AADVyAAAAAAAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,0,0,0,0,0,0,0,0,0     0               1 this is a line from marc5.txt
    (AADVyAAAAAAAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,0,0,0,0,0,0,0,0,33    0               2 this is a line from marc5.txt
    (AADVyAAAAAAAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,0,0,0,0,0,0,0,0,66    0               3 this is a line from marc5.txt
    (AADVyAAAAAAAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,0,0,0,0,0,0,0,0,99    0               4 this is a line from marc5.txt
    (AADVyAAAAAEAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,1,0,0,0,0,0,0,0,0     1               1 this is a line from test1.csv
    (AADVyAAAAAEAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,1,0,0,0,0,0,0,0,33    1               2 this is a line from test1.csv
    (AADVyAAAAAEAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,1,0,0,0,0,0,0,0,66    1               3 this is a line from test1.csv
    (AADVyAAAAAEAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,1,0,0,0,0,0,0,0,99    1               4 this is a line from test1.csv
    (AADVyAAAAAIAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,2,0,0,0,0,0,0,0,0     2               1 this is a line from test2.csv
    (AADVyAAAAAIAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,2,0,0,0,0,0,0,0,33    2               2 this is a line from test2.csv
    (AADVyAAAAAIAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,2,0,0,0,0,0,0,0,66    2               3 this is a line from test2.csv
    (AADVyAAAAAMAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,3,0,0,0,0,0,0,0,0     3               1 this is a line from test3.csv
    (AADVyAAAAAMAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,3,0,0,0,0,0,0,0,33    3               2 this is a line from test3.csv
    (AADVyAAAAAMAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,3,0,0,0,0,0,0,0,66    3               3 this is a line from test3.csv
    (AADVyAAAAAMAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,3,0,0,0,0,0,0,0,99    3               4 this is a line from test3.csv
    (AADVyAAAAAMAAAAAA Typ=208 Len=17: 4,0,0,213,200,0,0,0,3,0,0,0,0,0,0,0,132   3               5 this is a line from test3.csv
    16 rows selected
    Then with a join to EXTERNAL_LOCATION$ :
    SQL> with ext_loc as (
      2    select position-1 as pos
      3         , name as filename
      4    from sys.external_location$
      5    where obj# = ( select object_id
      6                   from user_objects
      7                   where object_name = 'TEST_XT' )
      8  )
      9  select x.filename,
    10         t.*
    11  from test_xt t
    12       join ext_loc x on x.pos = to_number(regexp_substr(dump(t.rowid,10,9,1),'\d+$'))
    13  ;
    FILENAME       REC_ID MESSAGE
    marc5.txt           1 this is a line from marc5.txt
    marc5.txt           2 this is a line from marc5.txt
    marc5.txt           3 this is a line from marc5.txt
    marc5.txt           4 this is a line from marc5.txt
    test1.csv           1 this is a line from test1.csv
    test1.csv           2 this is a line from test1.csv
    test1.csv           3 this is a line from test1.csv
    test1.csv           4 this is a line from test1.csv
    test2.csv           1 this is a line from test2.csv
    test2.csv           2 this is a line from test2.csv
    test2.csv           3 this is a line from test2.csv
    test3.csv           1 this is a line from test3.csv
    test3.csv           2 this is a line from test3.csv
    test3.csv           3 this is a line from test3.csv
    test3.csv           4 this is a line from test3.csv
    test3.csv           5 this is a line from test3.csv
    Seems to work... assuming the files are always read in the order specified through the LOCATION parameter, and the generated ROWID actually means what I think it means.

  • External Table Authentication in OBIEE 11g

    Hi,
    I have a security table, which contains userid, displayname, and group. I have imported the security table in the Physical Layer. I'm creating session variables based on a condition.
    When I try to log in to Analytics I get an error: invalid username and password. I'm using version 11.1.1.6.0.
    How do I handle external table authentication in OBIEE 11g?
    Regards,
    Malli

    Hi fiaz,
    That link talks about the 10g version.
    Step 1: We have imported a security table in the Physical layer.
    Step 2: Creating a session variable by selecting an initialization block.
    Select user_name, group from security_table where user_id=':USER' and pwd=':password';
    Step 3: Created DISPLAYNAME, GROUP & USER variables in the edit target window.
    After these modifications I was trying to log in with the new user, which is there in the security table.
    I am getting an error that the user or password is invalid.
    Are there any other changes required here?
    Regards,
    Malli
    Edited by: user10675696 on Dec 26, 2012 9:39 PM

  • Error when loading from External Tables in OWB 11g

    Hi,
    I face a strange problem while loading data from flat file into the External Tables.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
    In a total of 9771 records nearly 70 records are rejected due to the above-mentioned error. The column (EXPIRED) where the error is reported never has a value longer than 1 character. I suspect it to be a different problem.
    Example: One such record that got rejected is as follows:
    C|234|Littérature commentée|*N*|2354|123
    Highlighted in bold is the EXPIRED column.
    When I tried to insert this record into the External Table using the UTL_FILE utility it got loaded successfully. But when I try with the file already existing in the file directory it again fails with the above error. I would also like to mention that not all the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the External Table the Description Value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
    JL.

    Sorry, couldn't see the highlighted text. Could you please enclose it in code tags?
    Also post the table definition with attributes. BTW, what's your NLS_LANGUAGE set to?

  • Data Mismatch while selecting from External Table

    Hi, I am not able to create an external table correctly. I am trying to create a test table; I am able to create it, but when I select the data it shows a data mismatch. I tried it with test data but it returned an error. I want to load from an Excel file saved as test.csv.
    CREATE TABLE Per_ext
    (
      CITY    VARCHAR2(30),
      STATE   VARCHAR2(20),
      ZIP     VARCHAR2(10),
      COUNTRY VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dataload
      ACCESS PARAMETERS
      (
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('test.csv')
    )
    REJECT LIMIT UNLIMITED;
    test.csv file contents
    city         state       zip        country
    Bombay       MH          34324      india
    london       London      1321       UK
    Pune         MH          3224       india
    Banglore     Karnataka   11313      india
    rgds
    soumya

    Hi Justin
    I am getting the following error when I try it from Toad:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "comma": expecting one of: "badfile, byteordermark, characterset, column, data, delimited, discardfile, exit, fields, fixed, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string, skip, territory, variable"
    KUP-01007: at line 1 column 29
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    rgds
    soumya
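    For what it's worth, the KUP-01005 message above lists the keywords the parser expected inside ACCESS PARAMETERS; a minimal clause that parses (a sketch, assuming comma-delimited data) would be:
    ACCESS PARAMETERS
    (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ','
      MISSING FIELD VALUES ARE NULL
    )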

  • Bad file is not created during the external table creation.

    Hello Experts,
    I have created a script for an external table in an Oracle 10g DB. Everything is working fine except that it does not create the bad file, but it does create the log file. I can't figure out what the issue is. Because of it my shell script is failing and the entire program is failing. I am attaching the table creation script, the shell script where it is referenced, and the error. Kindly let me know if something is missing. Thanks in advance.
    Table creation script:
    create table RGIS_TCA_DATA_EXT
    (
    guid VARCHAR2(250),
    badge VARCHAR2(250),
    scheduled_store_id VARCHAR2(250),
    parent_event_id VARCHAR2(250),
    event_id VARCHAR2(250),
    organization_number VARCHAR2(250),
    customer_number VARCHAR2(250),
    store_number VARCHAR2(250),
    inventory_date VARCHAR2(250),
    full_name VARCHAR2(250),
    punch_type VARCHAR2(250),
    punch_start_date_time VARCHAR2(250),
    punch_end_date_time VARCHAR2(250),
    event_meet_site_id VARCHAR2(250),
    vehicle_number VARCHAR2(250),
    vehicle_description VARCHAR2(250),
    vehicle_type VARCHAR2(250),
    is_owner VARCHAR2(250),
    driver_passenger VARCHAR2(250),
    mileage VARCHAR2(250),
    adder_code VARCHAR2(250),
    bonus_qualifier_code VARCHAR2(250),
    store_accuracy VARCHAR2(250),
    store_length VARCHAR2(250),
    badge_input_type VARCHAR2(250),
    source VARCHAR2(250),
    created_by VARCHAR2(250),
    created_date_time VARCHAR2(250),
    updated_by VARCHAR2(250),
    updated_date_time VARCHAR2(250),
    approver_badge_id VARCHAR2(250),
    approver_name VARCHAR2(250),
    orig_guid VARCHAR2(250),
    edit_type VARCHAR2(250)
    )
    organization external
    (
    type ORACLE_LOADER
    default directory ETIME_LOAD_DIR
    access parameters
    (
    RECORDS DELIMITED BY NEWLINE
    BADFILE ETIME_LOAD_DIR:'tstlms.bad'
    LOGFILE ETIME_LOAD_DIR:'tstlms.log'
    READSIZE 1048576
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL(
    GUID
    ,BADGE
    ,SCHEDULED_STORE_ID
    ,PARENT_EVENT_ID
    ,EVENT_ID
    ,ORGANIZATION_NUMBER
    ,CUSTOMER_NUMBER
    ,STORE_NUMBER
    ,INVENTORY_DATE char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,FULL_NAME
    ,PUNCH_TYPE
    ,PUNCH_START_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,PUNCH_END_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,EVENT_MEET_SITE_ID
    ,VEHICLE_NUMBER
    ,VEHICLE_DESCRIPTION
    ,VEHICLE_TYPE
    ,IS_OWNER
    ,DRIVER_PASSENGER
    ,MILEAGE
    ,ADDER_CODE
    ,BONUS_QUALIFIER_CODE
    ,STORE_ACCURACY
    ,STORE_LENGTH
    ,BADGE_INPUT_TYPE
    ,SOURCE
    ,CREATED_BY
    ,CREATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,UPDATED_BY
    ,UPDATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
    ,APPROVER_BADGE_ID
    ,APPROVER_NAME
    ,ORIG_GUID
    ,EDIT_TYPE
    )
    )
    location (ETIME_LOAD_DIR:'tstlms.dat')
    )
    reject limit UNLIMITED;
    Shell script:
    version=1.0
    umask 000
    DATE=`date +%Y%m%d%H%M%S`
    TIME=`date +"%H%M%S"`
    SOURCE=`hostname`
    fcp_login=`echo $1|awk '{print $3}'|sed 's/"//g'|awk -F= '{print $2}'`
    fcp_reqid=`echo $1|awk '{print $2}'|sed 's/"//g'|awk -F= '{print $2}'`
    TXT1_PATH=/home/ac1/oracle/in/tsdata
    TXT2_PATH=/home/ac2/oracle/in/tsdata
    ARCH1_PATH=/home/ac1/oracle/in/tsdata
    ARCH2_PATH=/home/ac2/oracle/in/tsdata
    DEST_PATH=/home/custom/sched/in
    PROGLOG=/home/custom/sched/logs/rgis_tca_to_tlms_create.sh.log
    PROGNAME=`basename $0`
    PROGPATH=/home/custom/sched/scripts
    cd $TXT2_PATH
    FILELIST2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    > $DEST_PATH/tstlmsedits.dat
    for i in $FILELIST2
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES2 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES1="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST1="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    > $DEST_PATH/tstlms.dat
    for i in $FILELIST1
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT2_PATH/test/.
    mv $i.$DATE $TXT2_PATH/test/.
    done
    if test $NO_OF_FILES1 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    cd $TXT1_PATH
    FILELIST3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
    NO_OF_FILES3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
    > $DEST_PATH/tstlmsedits.dat
    for i in $FILELIST3
    do
    cat $i >> $DEST_PATH/tstlmsedits.dat
    printf "\n" >> $DEST_PATH/tstlmsedits.dat
    mv $i $i.$DATE
    #mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES3 -eq 0
    then
    echo " no tstlmsedits.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
    echo "-------------------------------------------" >> $PROGLOG
    fi
    NO_OF_FILES4="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
    FILELIST4="`ls -lrt tstlms*.dat |awk '{print $9}'`"
    > $DEST_PATH/tstlms.dat
    for i in $FILELIST4
    do
    cat $i >> $DEST_PATH/tstlms.dat
    printf "\n" >> $DEST_PATH/tstlms.dat
    mv $i $i.$DATE
    # mv $i $TXT1_PATH/test/.
    mv $i.$DATE $TXT1_PATH/test/.
    done
    if test $NO_OF_FILES4 -eq 0
    then
    echo " no tstlms.dat file exists " >> $PROGLOG
    else
    echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
    fi
    #connecting to oracle to generate bad files
    sqlplus -s $fcp_login<<EOF
    select count(*) from rgis_tca_data_ext;
    select count(*) from rgis_tca_data_history_ext;
    exit;
    EOF
    #counting the records in files
    tot_rec_in_tstlms=`wc -l $DEST_PATH/tstlms.dat | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits=`wc -l $DEST_PATH/tstlmsedits.dat | awk ' { print $1 } '`
    tot_rec_in_tstlms_bad=`wc -l $DEST_PATH/tstlms.bad | awk ' { print $1 } '`
    tot_rec_in_tstlmsedits_bad=`wc -l $DEST_PATH/tstlmsedits.bad | awk ' { print $1 } '`
    #updating log table
    echo "pl/sql block started"
    sqlplus -s $fcp_login<<EOF
    define tot_rec_in_tstlms     = '$tot_rec_in_tstlms';
    define tot_rec_in_tstlmsedits     = '$tot_rec_in_tstlmsedits';
    define tot_rec_in_tstlms_bad     = '$tot_rec_in_tstlms_bad';
    define tot_rec_in_tstlmsedits_bad='$tot_rec_in_tstlmsedits_bad';
    define fcp_reqid ='$fcp_reqid';
    declare
    l_tstlms_file_id number := null;
    l_tstlmsedits_file_id number := null;
    l_tot_rec_in_tstlms number := 0;
    l_tot_rec_in_tstlmsedits number := 0;
    l_tot_rec_in_tstlms_bad number := 0;
    l_tot_rec_in_tstlmsedits_bad number := 0;
    l_request_id fnd_concurrent_requests.request_id%type;
    l_start_date fnd_concurrent_requests.actual_start_date%type;
    l_end_date fnd_concurrent_requests.actual_completion_date%type;
    l_conc_prog_name fnd_concurrent_programs.concurrent_program_name%type;
    l_requested_by fnd_concurrent_requests.requested_by%type;
    l_requested_date fnd_concurrent_requests.request_date%type;
    begin
    --getting concurrent request details
    begin
    SELECT fcp.concurrent_program_name,
    fcr.request_id,
    fcr.actual_start_date,
    fcr.actual_completion_date,
    fcr.requested_by,
    fcr.request_date
    INTO l_conc_prog_name,
    l_request_id,
    l_start_date,
    l_end_date,
    l_requested_by,
    l_requested_date
    FROM fnd_concurrent_requests fcr, fnd_concurrent_programs fcp
    WHERE fcp.concurrent_program_id = fcr.concurrent_program_id
    AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    exception
    when no_data_found then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log, 'No data found for request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when retrieving request_id request_id');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    raise_application_error(-20001,
    'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
    sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlms.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlms_file_id,
                   'tstlms.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlms,
                   &tot_rec_in_tstlms_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlms file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    --calling ins_or_upd_tca_process_log to update log table for tstlmsedits.dat file
    begin
    rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
                   (l_tstlmsedits_file_id,
                   'tstlmsedits.dat',
                   l_conc_prog_name,
                   l_request_id,
                   l_start_date,
                   l_end_date,
                   &tot_rec_in_tstlmsedits,
                   &tot_rec_in_tstlmsedits_bad,
                   null,
                   null,               
                   null,
                   null,
                   null,
                   null,
                   null,
                   l_requested_by,
                   l_requested_date,
                   null,
                   null,
                   null,
                   null,
                   null);
    exception
    when others then
    fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
    fnd_file.put_line(fnd_file.log,
    'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlmsedits file');
    fnd_file.put_line(fnd_file.log, sqlerrm);
    end;
    end;
    exit;
    EOF
    echo "rgis_tca_to_tlms_process.sql started"
    sqlplus -s $fcp_login @$SCHED_TOP/sql/rgis_tca_to_tlms_process.sql $fcp_reqid
    exit;
    echo "rgis_tca_to_tlms_process.sql ended"
    Error:
    RGIS Scheduling: Version : UNKNOWN
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    TCATLMS module: TCA To TLMS Import Process
    Current system time is 18-AUG-2011 06:13:27
    COUNT(*)
         16
    COUNT(*)
         25
    wc: cannot open /home/custom/sched/in/tstlms.bad
    wc: cannot open /home/custom/sched/in/tstlmsedits.bad
    pl/sql block started
    old 33:     AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
    new 33:     AND fcr.request_id = 18661823; --fnd_global.conc_request_id();
    old 63:                &tot_rec_in_tstlms,
    new 63:                16,
    old 64:                &tot_rec_in_tstlms_bad,
    new 64:                ,
    old 97:                &tot_rec_in_tstlmsedits,
    new 97:                25,
    old 98:                &tot_rec_in_tstlmsedits_bad,
    new 98:                ,
    ERROR at line 64:
    ORA-06550: line 64, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql stddev sum variance
    execute forall merge time timestamp interval date
    <a string literal with character set specification>
    <a number> <a single-quoted SQL string> pipe
    <an alternatively-quoted string literal with character set specification>
    <an alternatively-q
    ORA-06550: line 98, column 4:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + case mod new not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql st
    rgis_tca_to_tlms_process.sql started
    old 12: and concurrent_request_id = '&1';
    new 12: and concurrent_request_id = '18661823';
    old 18: and concurrent_request_id = '&1';
    new 18: and concurrent_request_id = '18661823';
    old 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,&1);
    new 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,18661823);
    old 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,&1);
    new 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,18661823);
    old 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',&1);
    new 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',18661823);
    declare
    ERROR at line 1:
    ORA-20001: Error occured when executing RGIS_TCA_TO_TLMS_PROCESS.sql ORA-01403:
    no data found
    ORA-06512: at line 59
    Executing request completion options...
    ------------- 1) PRINT   -------------
    Printing output file.
    Request ID : 18661823      
    Number of copies : 0      
    Printer : noprint
    Finished executing request completion options.
    Concurrent request completed successfully
    Current system time is 18-AUG-2011 06:13:29
    ---------------------------------------------------------------------------
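    A note on the failure mode visible in this log: Oracle creates the bad file only when at least one record is actually rejected, so after a clean load the wc commands cannot open the .bad files and the corresponding shell variables stay empty, which is exactly what produces the empty substitutions (old 64/new 64, old 98/new 98) and the PLS-00103 errors above. A defensive sketch for the counting step (paths as in the script):
    #counting the records in files, defaulting to 0 when a file was not created
    count_lines() {
      if [ -f "$1" ]; then
        wc -l "$1" | awk '{ print $1 }'
      else
        echo 0
      fi
    }
    tot_rec_in_tstlms=`count_lines $DEST_PATH/tstlms.dat`
    tot_rec_in_tstlmsedits=`count_lines $DEST_PATH/tstlmsedits.dat`
    tot_rec_in_tstlms_bad=`count_lines $DEST_PATH/tstlms.bad`
    tot_rec_in_tstlmsedits_bad=`count_lines $DEST_PATH/tstlmsedits.bad`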

    Hi,
    Check the status of the batch in the SM35 transaction.
    If the batch is locked by mistake or any other error, you can now release it and also process it again.
    To release: Shift+F4.
    You can also analyse the job status via the F2 button.
    Bye

  • Comparison of Data Loading techniques - Sql Loader & External Tables

    Below are two techniques for loading data from flat files into Oracle tables.
    1)     SQL*Loader:
    a.     Place the flat file (.txt or .csv) in the desired location.
    b.     Create a control file:
    LOAD DATA
    INFILE 'Mytextfile.txt'   -- file containing the table data; specify paths correctly, it could be .csv as well
    APPEND                    -- or TRUNCATE, based on requirement
    INTO TABLE oracle_tablename
    FIELDS TERMINATED BY ','  -- or the delimiter used in the input file
    OPTIONALLY ENCLOSED BY '"'
    (field1, field2, field3)
    c.     Now run the sqlldr utility of Oracle at the command prompt:
    sqlldr username/password control=filename.ctl
    d.     The data can be verified by selecting it from the table:
    Select * from oracle_table;
    2)     External Table:
    a.     Place the flat file (.txt or .csv) on the desired location.
    abc.csv
    1,one,first
    2,two,second
    3,three,third
    4,four,fourth
    b.     Create a directory
    create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
    c.     After granting appropriate permissions to the user, we can create the external table like below:
    create table ext_table_csv (
      i Number,
      n Varchar2(20),
      m Varchar2(20)
    )
    organization external (
      type oracle_loader
      default directory ext_dir
      access parameters (
        records delimited by newline
        fields terminated by ','
        missing field values are null
      )
      location ('abc.csv')
    )
    reject limit unlimited;
    d.     Verify data by selecting it from the external table now
    select * from ext_table_csv;
    The external tables feature is a complement to existing SQL*Loader functionality.
    It allows you to –
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely.
    •     Do a parallel direct path load, without manually splitting up the input file.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
    Using external tables (rather than SQL*Loader) you can also –
    •     Load the data from a stored procedure or trigger (the load is a plain insert, which sqlldr cannot do).
    •     Do multi-table inserts.
    •     Flow the data through a pipelined PL/SQL function for cleansing/transformation.
    Comparison for data loading
    To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
    So, when you created the external table, the database will divide the file to be read by four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
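    For instance, a sketch of the parallel load just described (ext_table_csv from the example above; target_table is hypothetical):
    alter table ext_table_csv parallel 4;
    insert /*+ append */ into target_table
    select * from ext_table_csv;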
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle Tables using DB links.

    Please let me know your views on this.

  • Unable to read chinese characters in a flat file to external table

    Hi All,
    We have a flat file containing data in Chinese. We are using an external table to read the data in the files. When I do select <column-name> from <table> it displays boxes for the Chinese characters. The column is of type VARCHAR2.
    The NLS_LANGUAGE is AMERICAN and NLS_CHARACTERSET is AL32UTF8.
    The character set is mentioned as UTF8 in the external table definition.
    Is the external table not reading these characters properly? Should a patch be installed, or should some settings be changed for SQL Developer to display Chinese characters?

    The NLS_LANG environment variable/registry string (you are just one of the 1000s of 'anonymous' users refusing to post platform and version), so NOT NLS_LANGUAGE, on the server hosting the file, should be set to anything ending in .AL32UTF8.
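    For example (an illustrative Unix shell setting; the AMERICAN_AMERICA prefix is just one valid language/territory pair):
    export NLS_LANG=AMERICAN_AMERICA.AL32UTF8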
    Sybrand Bakker
    Senior Oracle DBA

  • External Table Authorization Best practices

    Hi,
    I am working on OBIEE external table authorization. I was able to implement it successfully for one project (catalog). The fields of the authorization table (AuthTable) are:
    Windows_ID     Employeeid     Name     EmpEMail     GroupName     Process_ID     Process_Name     Portal_Path
    Here, as per the requirement, a user should see data only for a few processes. So I put in a column for Process_ID and subsequently created an INIT block in the repository with a query like:
    Select 'PROCESS_ID', AuthTable.Process_id
    From AuthTable
    WHERE upper(AuthTable.AD_ID) = upper(':USER')
    Then for the user groups I applied filters on all the tables, e.g. for every logical table I applied the filter:
    Dim_Process."Process ID" = VALUEOF(NQ_SESSION."PROCESS_ID")
    I checked the data and everything is correct. But my question is:
    We have many projects/catalogs for which the filter criteria will be different, so should we insert a new column for each criterion in the SAME AuthTable, or is there a better way to maintain it? If we maintain one table for all the projects/catalogs it will be very messy; I would prefer to keep different tables for different projects/catalogs as their data marts are different.
    But the problem is that while for all other session variables we may use different INIT blocks and hence different tables, for PORTALPATH there should be only one INIT block, so only for PORTALPATH's sake do we need to keep everything in the same table?
    Tell me if I am wrong somewhere in my understanding or if there is a better way to do it.
    Regards
    Saurabh

    Hi,
    Pls refer to this link. Kumar explained it very clearly
    http://obieeblog.wordpress.com/category/obiee/obiee-security/
    Pls award points, if helpful
    Regards,
    Sarat Nallapati

  • How to implement row level security using external tables

    Hi All Gurus/ Masters,
    I want to implement row-level security using external tables, but I'm not sure how to implement that; I'm only aware of doing it via RPD-level authentication.
    I can use a filter condition at my user level so that a user can access his own data only.
    But when I have 4 tables in external tables:
    users
    groups
    usergroups
    webgroups
    then in which table do I need to give the filter conditions?
    Please let me know.

    You pull the group into a repository variable using a session variable init block, then reference that variable in the data filters, either in the LTS directly or in security management as filters. You reference it with the syntax VALUEOF("NQ_SESSION.Variable Name")
    Hope this helps
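    For instance, a sketch of a row-wise initialization block query over those four tables (all column names are assumed for illustration):
    select 'GROUP', g.group_name
    from users u
    join usergroups ug on ug.user_id = u.user_id
    join groups g on g.group_id = ug.group_id
    where upper(u.login) = upper(':USER')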

  • Showing Text file as external table without trimming spaces

    Hi all:
    I am trying to display a text file from db sever with original spacing.
    Well it's not entirely related to APEX.
    This is how external table is created:
    create table u_x
      (rowline varchar2(255))
         ORGANIZATION EXTERNAL
         (
           TYPE ORACLE_LOADER
           DEFAULT DIRECTORY ext_tab         <-- this is a dir I have created
           access parameters
           (
             records delimited by newline
             fields notrim
           )
           LOCATION ('aaa.txt')
         )
    REJECT LIMIT UNLIMITED;
    On the first line of aaa.txt (it's a report), it shows something like:
    Sat Dec 29                                                                Page 1
    but when I do select * from u_x , the result shows:
    Sat Dec 29 Page 1
    All of the above is done in SQL Workshop / SQL Commands in APEX 4.2.1 (db 10.2.0.5).
    Is there any way to preserve the spaces between "29" and "Page"?
    (The result compresses all the spaces in-between to one space.)
    I think the ideal way would be to invoke the client's Notepad to open the server's text file.
    Is there any way to generate the link?
    Thanks
    John

    Well, I have tried to download the file using below procedure.
    create or replace PROCEDURE download_bfile (
          directory_in   IN   VARCHAR2,
      file_in        IN   VARCHAR2
      )
   AS
          lob_loc    BFILE;
          v_mime     VARCHAR2 (48);
          v_length   NUMBER(16);
       BEGIN
          lob_loc := BFILENAME (UPPER (directory_in), file_in);
          v_length := DBMS_LOB.getlength (lob_loc);
          -- set up HTTP header
          -- use an NVL around the mime type and
          -- if it is a null set it to application/octect
          -- application/octect may launch a download window from windows
          OWA_UTIL.mime_header (nvl(v_mime, 'application/octet'), FALSE);
          -- set the size so the browser knows how much to download
          HTP.p ('Content-length: ' || v_length);
          -- the filename will be used by the browser if the users does a save as
      HTP.p (   'Content-Disposition:attachment; filename="'
             || SUBSTR (file_in, INSTR (file_in, '/') + 1)
             || '";');
          -- close the headers
          OWA_UTIL.http_header_close;
          -- download the BLOB
          WPG_DOCLOAD.download_file (lob_loc);
          apex_application.stop_apex_engine;
       END download_bfile;
    I ran it inside SQL Commands in APEX, but it does not show a popup window to save the file;
    it displays the whole text content of the file in the result window.
    Any idea why the save-as window does not pop up?
    thanks,
    John.
