External table with empty file

Hi,
My DB version: Oracle 11g.
I have an empty CSV file, and I created an external table over it.
When I run:
select count(*) from externaltblname;
it returns 1, but it should return 0, right?
In the definition I specified "SKIP 1", but it still returns 1.
When I use this external table to load into a target table, it loads a single row with null values.
How can I fix this? Please advise.

What works for me is the following (t_ext points to an empty csv):
SQL> select count(field) from t_ext;
COUNT(FIELD)
           1
1 row selected.
SQL> select ascii(field) from t_ext;
ASCII(FIELD)
          13
1 row selected.
SQL> select count(replace(field, chr(13))) from t_ext;
COUNT(REPLACE(FIELD,CHR(13)))
                            0
1 row selected.
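The ascii(field) = 13 output above suggests the "empty" file actually contains a single carriage-return character, so the access driver sees one blank record rather than zero records. A minimal sketch, assuming a hypothetical target table t_target with a single column field, of how to keep that phantom row out of the load (alternatively, simply recreate the CSV as a truly zero-byte file):

-- Sketch only: t_target is a hypothetical target table; t_ext and field
-- are from the example above. Rows that contain nothing but a carriage
-- return (or other whitespace) are filtered out before the insert.
INSERT INTO t_target (field)
SELECT field
  FROM t_ext
 WHERE TRIM(REPLACE(field, CHR(13))) IS NOT NULL;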

Similar Messages

  • Can we bind a single external table with multiple files in OWB 11g?

    Hi,
    I wanted to ask if it is possible to bind an external table with multiple source files at same or different locations? Or an external table has to be bound to a single source file and a single location.
    Thanks in advance,
    Ann.
    Edited by: Ann on Oct 8, 2010 9:38 AM

    Hi Ann,
    "Can you please help me out by telling me the steps to accomplish this."
    Right-click on the external table in the project tree and choose Configure from the menu,
    then in the opened Configuration Properties dialog window right-click on the Data Files node and choose Create from the menu -
    you will get a new record for the file - specify the Data File Name property.
    There is also a link from the OWB user guide:
    http://download.oracle.com/docs/cd/B28359_01/owb.111/b31278/ref_def_flatfiles.htm#i1126304
    Regards,
    Oleg

  • Using an external drive with shared files (iPhoto, iTunes), attached to a Time Capsule, can the contents of the external drive be backed up to the internal Time Capsule drive? Perhaps a RAID1 mirror to a partition?

    Using an external drive with shared files (iPhoto, iTunes), attached to a Time Capsule, can the contents of the external drive be backed up to the internal Time Capsule drive? Perhaps a RAID1 mirror to a partitioned Time Capsule? I understand that Time Machine (the software) cannot back up a networked drive (the external) and that the Time Capsule (the router/wireless hard drive) does not have its own backup software... so it won't back up the connected drive. What I would like is an alternate solution for having an automated backup of a networked drive. A 2TB Time Capsule has more than enough space for Time Machine backups of my family of MacBooks, so I had hoped to mirror a 500GB external drive (with shared media files) to a portion of the Time Capsule internal hard drive. I assume this would require a partition of the Time Capsule drive. If not, would the sparse files from the various computers being backed up need to be copied to the external drive as part of the RAID1 setup? This seems like overkill for the Time Machine backup, but it would cover the media files.

    The complexity with this idea is that the software has to run from a Mac on your network, so you need a Mac turned on, probably most of the day.
    It isn't possible to partition the TC, although you can create an image area.
    The software would have to copy the material: every file to be copied goes from the USB drive back to the Mac, then back to the TC, and is written to the drive. In other words, you have just added network congestion, although proper incremental backup software will not actually use a lot of capacity.
    Finally, it will be slow: network drives are slower than internal drives, and a USB-connected drive is much slower than the same drive connected directly to the computer. If the drive is connected directly to a computer, it can be shared on the network.
    http://www.anandtech.com/show/4577/airport-extreme-5th-gen-and-time-capsule-4th-gen-review-faster-wifi-/4
    Read carefully the speed of the USB drive plugged into the TC.

  • Can we create single External Table for multiple files?

    Hi,
    Can we create an external table for multiple files? Could anyone please explain?
    Thanks and regards
    Gowtham Sen.

    "to merge 16 files having same structure"
    Well, if the files have the same structure, then as per the example in the following documentation, you can create one external table for all your files:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#i1007480
    Nicolas.
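    For reference, a minimal sketch of that pattern: one external table whose LOCATION clause lists several files of identical structure. The directory object ext_dir, the file names, and the columns below are assumptions, not from the thread.

    -- Sketch: one external table reading three CSV files of identical layout.
    CREATE TABLE sales_ext
    ( sale_id   NUMBER,
      sale_date VARCHAR2(10),
      amount    NUMBER
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('sales1.csv', 'sales2.csv', 'sales3.csv')  -- multiple files, one table
    )
    REJECT LIMIT UNLIMITED;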

  • Issues with external table from excel file

    Dear all,
    I have been trying to use the statement below to create an external table. The table references a CSV file exported from Excel.
    CREATE TABLE EMPFRAUD_TEST
    ( SERIAL_NUM VARCHAR2(10 BYTE),
      BRANCH_CODE VARCHAR2(10 BYTE),
      BUSINESS_ADD VARCHAR2(100 BYTE),
      REGIONS VARCHAR2(50 BYTE),
      TRANSACTION_DATE_TIME DATE,
      REPORT_DATE_TIME DATE,
      NO_OF_TRANS VARCHAR2(4 BYTE),
      AMOUNT NUMBER,
      FRAUD_TYPE VARCHAR2(25 BYTE),
      IMPACT_CATEGORY VARCHAR2(10 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY EXT_EMP_TEST
      ACCESS PARAMETERS
      ( records delimited by newline
        badfile 'empfraud%a.bad'
        logfile 'empfraud%a.log'
        fields terminated by ','
        optionally enclosed by '"' lrtrim
        missing field values are null
      )
      LOCATION ('fraud.csv')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    The problems are as follows:
    1) When I run the statement above, the table is created,
    but when I try to select from the table, an empty table is displayed.
    When I checked the error log file, the following messages were given
    (this is from an Oracle DB on a Unix server):
    "L_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 71 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field ACCOUNT_KEY
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 79 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field SERIAL_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 80 rejected in file /home/oracle/ext_folder_test/fraud.csv
    error processing column TRANSACTION_DATE_TIME in row 1 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 2 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 3 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 8 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 9 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 10 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 11 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 12 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 13 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 14 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 15 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month"
    Please, I need help to resolve this quickly.
    Thanks,
    regards,
    ajani abdulrahman olayide
    NB:
    After conversion to .csv format, below is the data I am trying to access from the Excel file:
    BUSINESS OFFICE,REGIONS,Transaction_Date_Time,Report_Date_Time,Account_Key (Account No),     Number_of_Transactions,Total_Amount (N)      
    1,162,9 ojo street,Lagos South,various ,17/01/10,16200388749987,1,5100000,CHEQUE
    2,0238,"10 cyril Road, Enugu",East,21/06/2006,23/12/10,020968765357 09867653920174,1,20000000
    3,0127,"261, obiageli Rd, Asaba",Mid-West,22/12/2010,23/12/10,'00160030006149,1,6000000
    4,0519,"just road, Onitsha
    ",East,12/03/2010,14/02/11,0896002416575,1,5000000
    5,0519,"just road, Onitsha
    ",East,03/12/2010,14/02/11,06437890134356,1,5000000
    6,149,olayide street,Lagos South,10/02/2010,17/02/11,NGN01492501036 ,1,6108950
    7,0066,wale,Mid - west,18/02/2011,18/02/10,'05590020002924,1,55157977.53
    8,66,john,Mid- west,11/03/2010,14/03/09,'00660680054177,1,6787500
    9,0273,waheed Biem,N/Central,Jan 09 to Dec 2010,01/04/11,Nil,1,14146040

    As others suggested, you have to do the debugging yourself. To avoid the date errors, you may need something like this:
    CREATE TABLE EMPFRAUD_TEST
      ( SERIAL_NUM   VARCHAR2(10),
        BRANCH_CODE  VARCHAR2(10),
        BUSINESS_ADD VARCHAR2(100),
        REGIONS      VARCHAR2(50),
        TRANSACTION_DATE_TIME DATE,
        REPORT_DATE_TIME DATE,
        NO_OF_TRANS     VARCHAR2(50),
        AMOUNT          NUMBER,
        FRAUD_TYPE      VARCHAR2(25),
        IMPACT_CATEGORY VARCHAR2(10)
      )
      ORGANIZATION EXTERNAL
      ( type oracle_loader default directory saubhik
        access parameters
        ( records delimited by newline
          badfile 'empfraud%a.bad'
          logfile 'empfraud%a.log'
          skip 1
          fields terminated by ','
          optionally enclosed by '"' ltrim
          missing field values are null
          ( serial_num,
            branch_code,
            business_add,
            regions,
            transaction_date_time date "dd/mm/rrrr",
            report_date_time date "dd/mm/rr",
            no_of_trans,
            amount,
            FRAUD_TYPE,
            IMPACT_CATEGORY
          )
        )
        LOCATION ('fraud.csv')
      )
      REJECT LIMIT UNLIMITED;

  • Creating external table - from a file with multiple field separators

    I need to create an external table from a flat file containing multiple field separators (",", ";" and "|").
    Is there any way to specify this in the CREATE TABLE (external) statement?
    FIELDS TERMINATED BY "," -- Somehow list more than just comma here?
    We receive the file from a vendor every week. I am trying to set up a process for some non-technical users, and I want to keep this process transparent to them and not require them to load the data into Oracle.
    I'd appreciate your help!

    scott@ORA92> CREATE OR REPLACE DIRECTORY my_dir AS 'c:\oracle'
      2  /
    Directory created.
    scott@ORA92> CREATE TABLE external_table
      2    (COL1 NUMBER,
      3       COL2 VARCHAR2(6),
      4       COL3 VARCHAR2(6),
      5       COL4 VARCHAR2(6),
      6       COL5 VARCHAR2(6))
      7  ORGANIZATION external
      8    (TYPE oracle_loader
      9       DEFAULT DIRECTORY my_dir
    10    ACCESS PARAMETERS
    11    (FIELDS
    12         (COL1 CHAR(255)
    13            TERMINATED BY "|",
    14          COL2 CHAR(255)
    15            TERMINATED BY ",",
    16          COL3 CHAR(255)
    17            TERMINATED BY ";",
    18          COL4 CHAR(255)
    19            TERMINATED BY ",",
    20          COL5 CHAR(255)
    21            TERMINATED BY ","))
    22    location ('flat_file.txt'))
    23  /
    Table created.
    scott@ORA92> select * from external_table
      2  /
          COL1 COL2   COL3   COL4   COL5
             1 Field1 Field2 Field3 Field4
             2 Field1 Field2 Field3 Field4
    scott@ORA92>
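    For clarity, each line of flat_file.txt behind this session presumably looks like the sample below, with every field ended by its own terminator; this is a reconstruction from the column definitions and the query output, not the poster's actual file:
    1|Field1,Field2;Field3,Field4
    2|Field1,Field2;Field3,Field4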

  • How to get the bound object of an external table with OMB

    Hello,
    I try to find it but without success. So may be one of you, know this secret information.
    I want to synchronize with the help of an OMB script my external tables.
    OMBSYNCHRONIZE FLAT_FILE '/MY_PROJECT/MY_FLAT_FILE_MODULE/MY_FLAT_FILE' RECORD 'MY_RECORD' TO EXTERNAL_TABLE 'MY_EXTERNAL_TABLE' \
    USE (RECONCILE_STRATEGY 'REPLACE', MATCHING_STRATEGY 'MATCH_BY_OBJECT_ID')
    For this purpose, I need to know the bound object. When you synchronize with the help of the GUI, you see it easily.
    Example:
    /MY_PROJECT/MY_FLAT_FILE_MODULE/MY_FLAT_FILE/MY_RECORD
    I searched in the properties of the external table and I don't see a BOUND OBJECT property.
    OMBDESCRIBE CLASS_DEFINITION 'EXTERNAL_TABLE' GET PROPERTY_DEFINITIONS
    BAD_FILE_LOCATION BAD_FILE_NAME BUSINESS_NAME DATA_FILES DEPLOYABLE DESCRIPTION
    DISCARD_FILE_LOCATION DISCARD_FILE_NAME ENDIAN GENERATE_ERROR_TABLE_ONLY
    GENERATION_COMMENTS LOAD_NULLS_WHEN_MISSING_VALUES LOG_FILE_LOCATION LOG_FILE_NAME
    NLS_CHARACTERSET NUMBER_OF_REJECTS_ALLOWED PARALLEL_ACCESS_DRIVERS PARALLEL_ACCESS_MODE
    REJECTS_ARE_UNLIMITED SHADOW_TABLESPACE SHADOW_TABLE_NAME STRING_SIZES_IN TRIM UOID
    Then I try the BOUND_OBJECT cached property of a mapping operator, but no luck.
    OMBRETRIEVE EXTERNAL_TABLE 'MY_EXTERNAL_TABLE' GET BOUND_OBJECT
    OMB00001: Encountered BOUND_OBJECT at line: 1, column: 56. Was expecting one of:
    "PROPERTIES" ...
        "REF" ...
        "REFERENCE" ...
        "COLUMN" ...
        "DEFAULT_LOCATION" ...
        "FLAT_FILE" ...
        "RECORD" ...
        "COLUMNS" ...
        "DATA_FILES" ...
        "DATA_RULE_USAGES" ...I have already the file and the record with the FLAT_FILE and RECORD properties
    OMBRETRIEVE EXTERNAL_TABLE  'MY_EXTERNAL_TABLE'  GET FLAT_FILE
    OMBRETRIEVE EXTERNAL_TABLE  'MY_EXTERNAL_TABLE'  GET RECORD
    But how can I get the flat file module:
    MY_FLAT_FILE_MODULE
    Does anybody know how OWB retrieves this information?
    Thanks in advance and good day
    Nico
    Edited by: gerardnico on Jan 13, 2010 12:08 PM
    Changed "get the location" to "get the flat_file_module"

    Yes, Oleg, that's what worried me.
    The BOUND_OBJECT property of a table operator is not in the API.
    OMBDESCRIBE CLASS_DEFINITION 'TABLE_OPERATOR' GET PROPERTY_DEFINITIONS
    ADVANCED_MATCH_BY_CONSTRAINT AUTOMATIC_HINTS_ENABLED BOUND_NAME BUSINESS_NAME
    CONFLICT_RESOLUTION DATABASE_FILE_NAME DATABASE_LINK DATA_COLLECTION_FREQUENCY
    DATA_RULES DB_LOCATION DEBUG_BOUND_NAME DEBUG_DB_LOCATION DESCRIPTION DIRECT
    ENABLE_CONSTRAINTS ERROR_SELECT_FILTER ERROR_SELECT_ROLL_UP ERROR_TABLE_NAME
    EVALUATE_CHECK_CONSTRAINTS EXCEPTIONS_TABLE_NAME EXTRACTION_HINT IS_TEMP_STAGE_TABLE
    JOINRANK KEYS_READONLY LOADING_HINT LOADING_TYPE MATCH_BY_CONSTRAINT OPTIMIZE_MERGE
    PARTITION_NAME PEL_ENABLED PRIMARY_SOURCE RECORDS_TO_SKIP REPLACE_DATA ROW_COUNT
    ROW_COUNT_ENABLED SCHEMA SINGLEROW SORTED_INDEXES_CLAUSE SUBPARTITION_NAME
    TARGET_FILTER_FOR_DELETE TARGET_FILTER_FOR_UPDATE TARGET_LOAD_ORDER
    TEMP_STAGE_TABLE_EXTRA_DDL_CLAUSES TEST_DATA_COLUMN_LIST TEST_DATA_WHERE_CLAUSE
    TRAILING_NULLCOLS TRUNCATE_ERROR_TABLE UOID USE_LCR_API
    Then I was wondering if anyone knew a similar hidden property to get the flat file source object of an external table.
    Cheers
    Nico

  • Dynamically creating oracle table with csv file as source

    Hi,
    We have a requirement to create a dynamic external table: whenever the data or the number of columns changes in the CSV file, the table should be replaced with the current data and the current number of columns. As we are not very experienced in Oracle, please give us a clear solution. We have already tried some code, but we are getting errors. The code is given below.
    Thank you.
    We have executed this code after changing only the schema name and table name; everything else is the same.
    Assume the following:
    - Oracle User and Schema name is ALLEXPERTS
    - Database name is EXPERTS
    - The directory object is file_dir
    - CSV file directory is /export/home/log
    - The csv file name is ALLEXPERTS_CSV.log
    - The table name is all_experts_tbl
    1. Create a directory object in Oracle. The directory will point to the directory where the file is located.
    conn sys/{password}@EXPERTS as sysdba;
    CREATE OR REPLACE DIRECTORY file_dir AS '/export/home/log';
    2. Grant the directory privilege to the user
    GRANT READ ON DIRECTORY file_dir TO ALLEXPERTS;
    3. Create the table
    Connect as ALLEXPERTS user
    create table ALLEXPERTS.all_experts_tbl
    (txt_line varchar2(512))
    organization external
    (type ORACLE_LOADER
     default directory file_dir
     access parameters (records delimited by newline
     fields
     (txt_line char(512)))
     location ('ALLEXPERTS_CSV.log')
    );
    This will create a table that links the data to a file. Now you can treat this file as a regular table where you can use SELECT statement to retrieve the data.
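    For example, once that external table exists (and the directory grant from step 2 is in place), a quick sanity check could look like this:

    -- Read a few raw lines straight from the CSV through the external table.
    SELECT txt_line
      FROM ALLEXPERTS.all_experts_tbl
     WHERE ROWNUM <= 5;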
    PL/SQL to create the data (PSEUDO code)
    CREATE OR REPLACE PROCEDURE new_proc IS
    -- Setup the cursor
    CURSOR c_main IS SELECT *
    FROM allexperts.all_experts_tbl;
    CURSOR c_first_row IS SELECT *
    FROM allexperts.all_experts_tbl
    WHERE ROWNUM = 1;
    -- Declare Variable
    l_delimiter_count NUMBER;
    l_temp_counter NUMBER:=1;
    l_current_row VARCHAR2(100);
    l_create_statements VARCHAR2(1000);
    BEGIN
    -- Get the first row
    -- Open the c_first_row and fetch the data into l_current_row
    -- Count the number of delimiter l_current_row and set the l_delimiter_count
    OPEN c_first_row;
    FETCH c_first_row INTO l_current_row;
    CLOSE c_first_row;
    l_delimiter_count := number of delimiter in l_current_row;
    -- Create the table with the right number of columns
    l_create_statements := 'CREATE TABLE csv_table ( ';
    WHILE l_temp_counter <= l_delimiter_count
    LOOP
    l_create_statement := l_create_statement || 'COL' || l_temp_counter || ' VARCHAR2(100)'
    l_temp_counter := l_temp_counter + 1;
    IF l_temp_counter <=l_delimiter_count THEN
    l_create_statement := l_create_statement || ',';
    END IF;
    END;
    l_create_statement := l_create_statement || ')';
    EXECUTE IMMEDIATE l_create_statement;
    -- Open the c_main to parse all the rows and insert into the table
    WHILE rec IN c_main
    LOOP
    -- Loop thru all the records and parse them
    -- Insert the data into the table created above
    END LOOP;

    The initial table is showing errors, and the procedure is created with compilation errors.
    After executing the CREATE TABLE, I am getting the following errors:
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "badfile,
    byteordermark, characterset, column, data, delimited, discardfile,
    disable_directory_link_check, exit, fields, fixed, load, logfile, language,
    nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string,
    skip, territory, varia"
    KUP-01008: the bad identifier was: deli
    KUP-01007: at line 1 column 9
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
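    Judging by KUP-01008 (the bad identifier was: deli), the access parameters the server actually parsed appear to have the keyword "delimited" broken across lines, so the parser stops right after "records". A hedged sketch of the same table with the access parameters kept intact and every parenthesis closed; this is a guess at the cause, not a confirmed fix:

    -- Hypothetical re-creation (drop the existing definition first): keep
    -- "records delimited by newline" on one line and close all parentheses.
    CREATE TABLE ALLEXPERTS.all_experts_tbl
    ( txt_line VARCHAR2(512) )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY file_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS ( txt_line CHAR(512) )
      )
      LOCATION ('ALLEXPERTS_CSV.log')
    )
    REJECT LIMIT UNLIMITED;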

  • External tables-Fixed length file

    Hi All,
    I have a fixed-length file that I load daily using an external table. Recently, the length of one of the fields, IP, was changed, and the customer wants to send both old records with an 8-byte length and new records with an 11-byte length in the same data file until the complete migration takes place.
    Will it be possible for external tables to handle this requirement? Or is there any other way to treat it?
    The old file contains 104 fields, with the IP field at positions 490 to 498.
    The new file contains 104 fields, with the IP field at positions 490 to 501.
    Thanks,
    Sri.

    If the two record types are mixed in the same file, then you will have problems loading them. I can see two possible solutions, in no particular order of preference (using your example data):
    1. Redefine the external table something like:
    Position (record_type (1:1)
              version     (2:5)
              data        (6:41))
    then parse the remaining fields based on the version number when you select from the external table.
    2. Create two external tables over the same file, one for version 1.00 and one for version 1.01 using the LOAD WHEN clause to determine which set of data to load when you select. Something like:
    CREATE TABLE version1 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.00)
    < definition for the old format >
    and
    CREATE TABLE version101 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.01)
    < definition for the new format >
    Then your processing would use something like:
    SELECT ip, last_name
    FROM version1
    UNION ALL
    SELECT ip, last_name
    FROM version101
    HTH
    John
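    To make option 2 concrete, here is a hedged sketch. The directory object ext_dir, the file name, and the version and last_name positions are assumptions; only the IP positions come from the post, and a real definition would list all 104 fields:

    -- Sketch of option 2: two external tables over the same fixed-length file,
    -- each loading only the records whose version field matches.
    CREATE TABLE version1_ext
    ( version   VARCHAR2(4),
      last_name VARCHAR2(30),
      ip        VARCHAR2(9)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        LOAD WHEN (version = '1.00')
        FIELDS
        ( version   POSITION(2:5)     CHAR(4),
          last_name POSITION(6:35)    CHAR(30),
          ip        POSITION(490:498) CHAR(9)
        )
      )
      LOCATION ('daily_feed.dat')
    )
    REJECT LIMIT UNLIMITED;

    -- Second table: same file, new longer IP field.
    CREATE TABLE version101_ext
    ( version   VARCHAR2(4),
      last_name VARCHAR2(30),
      ip        VARCHAR2(12)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        LOAD WHEN (version = '1.01')
        FIELDS
        ( version   POSITION(2:5)     CHAR(4),
          last_name POSITION(6:35)    CHAR(30),
          ip        POSITION(490:501) CHAR(12)
        )
      )
      LOCATION ('daily_feed.dat')
    )
    REJECT LIMIT UNLIMITED;

    The UNION ALL query in the reply above then presents both formats as a single source.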

  • External Tables to Unix File System 10G R2

    Can anyone help with setting up an external table that reads a flat file from a Unix file system?
    I have sampled a file OK and created an external table and deployed it to the database OK, but it cannot find the link through to the Unix file system to read the file.
    I created the location as an FTP type and referenced the path to the relevant directory, /oracle02/app/OWB_files, when creating the location. I have placed the relevant file in the directory, but when I try to look at the table in TOAD I get the following errors:
    ORA-29913 error in executing ODCIEXTTABLEOPEN callout
    ORA-29400 data cartridge error
    KUP-04040 files 121123_PENS.txt in MLCC_FILES_LOCATION_0 not found
    Does anyone have a step-by-step guide for creating these, and am I doing something wrong? Is choosing the FTP type for the location correct, and is the path specified correctly? I can't see much information on this in the manual!
    Your assistance would be appreciated

    Hi,
    Make sure that the path is a shared one. We can do this using a Samba server.
    Regards,
    Gowtham Sen.
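    At the database level, ORACLE_LOADER external tables read files on the database server through a directory object, not over FTP, so KUP-04040 usually means the directory object behind the location does not point at the folder that actually holds the file. A sketch of what could be checked, assuming OWB created a directory object named after the location and that you have the privileges (the path and grantee below are assumptions):

    -- Where does the directory object behind the OWB location point?
    SELECT directory_name, directory_path
      FROM all_directories
     WHERE directory_name = 'MLCC_FILES_LOCATION_0';

    -- If necessary, re-point it at the server-side folder holding the file
    -- and grant access to the schema that owns the external table.
    CREATE OR REPLACE DIRECTORY mlcc_files_location_0 AS '/oracle02/app/OWB_files';
    GRANT READ, WRITE ON DIRECTORY mlcc_files_location_0 TO owb_user;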

  • External table with Date Format

    I have an external table, and in the file the date format is YYYY-MM-DD.
    I want to include a date format mask in the CREATE TABLE command.
    The command below creates the table, but it gives an error when I use a SELECT command.
    CREATE TABLE XADV.XADV_test_EXT_TABLE
    ( BPO_START_DATE DATE
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY XCRM_DASHBOARD
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|^|'
        BPO_START_DATE date 'YYYY-MM-DD'
      )
      LOCATION (XCRM_DASHBOARD:'test.txt')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    The error I am getting from select * from XADV_test_EXT_TABLE is:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "minussign": expecting one of: "column, enclosed, exit, (, ltrim, lrtrim, ldrtrim, missing, notrim, optionally, rtrim, reject"
    KUP-01007: at line 3 column 3
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    regards
    Mani

    shoblock wrote:
    "use double quotes instead of single quotes"
    That has nothing to do with the quotes. The issue is that the field list must be enclosed in parentheses:
    SQL> CREATE TABLE XADV_test_EXT_TABLE
      2  (
      3  BPO_START_DATE Date
      4  )
      5  ORGANIZATION EXTERNAL
      6  (
      7   TYPE ORACLE_LOADER
      8   DEFAULT DIRECTORY TMP
      9   ACCESS PARAMETERS
    10   (
    11    RECORDS DELIMITED BY NEWLINE
    12    FIELDS TERMINATED BY ','
    13     BPO_START_DATE date "YYYY-MM-DD"
    14   )
    15  LOCATION (TMP:'test.txt')
    16  )
    17  REJECT LIMIT UNLIMITED
    18  NOPARALLEL
    19  NOMONITORING
    20  /
    Table created.
    SQL> select * from XADV_test_EXT_TABLE
      2  /
    select * from XADV_test_EXT_TABLE
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "column,
    enclosed, exit, (, ltrim, lrtrim, ldrtrim, missing, notrim, optionally, rtrim,
    reject"
    KUP-01008: the bad identifier was: BPO_START_DATE
    KUP-01007: at line 3 column 4
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    SQL> drop table XADV_test_EXT_TABLE;
    Table dropped.
    SQL> CREATE TABLE XADV_test_EXT_TABLE
      2  (
      3  BPO_START_DATE Date
      4  )
      5  ORGANIZATION EXTERNAL
      6  (
      7   TYPE ORACLE_LOADER
      8   DEFAULT DIRECTORY TMP
      9   ACCESS PARAMETERS
    10   (
    11    RECORDS DELIMITED BY NEWLINE
    12    FIELDS TERMINATED BY ','
    13    (
    14     BPO_START_DATE date 'YYYY-MM-DD'
    15    )
    16   )
    17  LOCATION (TMP:'test.txt')
    18  )
    19  REJECT LIMIT UNLIMITED
    20  NOPARALLEL
    21  NOMONITORING
    22  /
    Table created.
    SQL> select * from XADV_test_EXT_TABLE
      2  /
    BPO_START
    19-NOV-08
    SQL> SY.

  • Can SAP send e-mail to external mail with attached file?

    Hi ABAP gurus,
    Can SAP send e-mail to an external mail server (e.g. Hotmail, Yahoo, and so on)?
    And can the e-mail be sent with an attached file?
    And how do I do that?
    Please give me your advice.
    Thank you and best regards,
    Nattapash C.

    Thank you very much, Amit Gujargoud!
    And I'm sorry that my question was not clear for you and everyone.
    What I need to know is how to do it with ABAP code.
    I found that the function module 'SO_DOCUMENT_REPOSITORY_MANAGER' with method 'SEND' can be used for my case,
    but I don't know which field of the structure RECIPIENTS (a table parameter of this function module) is used to define the recipient's e-mail address.
    Does anyone know the field, or have any other method to solve my case?
    Please give me your advice.
    Thank you and best regards,
    Nattapash C.

  • How can I use external tables with directories not on Oracle host?

    Oracle 11.2.0.2
    We have developers using C# to populate global temporary tables (two tables, one header and one detail, with a 1-to-many relationship between them). The process of using C# was blowing up memory on Windows.
    As an alternative I asked them to create two CSV files, and I can use sqlldr to load the detail CSV file of over 1 million rows in under 15 seconds. It came to my mind that I could use external tables for this purpose while getting rid of the header and trailer from the CSV files.
    The issue I have is that I am not a DBA, so I do not have access to the OS host that the Oracle instance is running on. sqlldr is a client-side tool, whereas external tables are server-side. I was wondering if there is an alternative in 11g to use external tables without creating directories on the OS host that Oracle runs on? Are there other alternatives?
    Thanks

    905989 wrote:
    Oracle 11.2.0.2
    We have developers using C# to populate global temporary tables (two tables, one header and one detail, with a 1-to-many relationship between them). The process of using C# was blowing up memory on Windows.
    As an alternative I asked them to create two CSV files, and I can use sqlldr to load the detail CSV file of over 1 million rows in under 15 seconds. It came to my mind that I could use external tables for this purpose while getting rid of the header and trailer from the CSV files.
    The issue I have is that I am not a DBA, so I do not have access to the OS host that the Oracle instance is running on. sqlldr is a client-side tool, whereas external tables are server-side. I was wondering if there is an alternative in 11g to use external tables without creating directories on the OS host that Oracle runs on? Are there other alternatives?
    Thanks
    no other alternative

  • Proxy Client response table with empty lines

    Hi,
    I'm consuming an external web service with ABAP (no XI scenario) and the communication seems correct. When I execute the client proxy from SE80 directly, I can see the XML response with the table element and its respective contents.
    Unfortunately, when I call the proxy through a program, the table returned by the proxy client is filled with lines, but each field of each line is empty. Otherwise everything seems OK.
    I tried to regenerate the proxy client and I have the same problem.
    Does anybody know what the problem is? I tried to debug, but I don't know how to see where SAP converts the XML to ABAP objects.
    Thanks,
    Regards

    Well, you have to tell Outlook to trust the server's root certificate, however that is done, or use a certificate at the server which is trusted by Outlook, for which see the JDK Guide to Features -> Security -> JSSE.
    The empty strings will solve themselves once the client trusts the server, but you shouldn't be using readLine, you should be reading bytes and decoding according to the POP3 or IMAP protocol definitions.

  • Deploying external tables with ombplus

    Hi,
    we try to deploy our projects with ombplus. The script breaks at the first external table when we do:
    OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'SVR_ET'
    foreach ext_tab [OMBLIST EXTERNAL_TABLES] {
    set exttab_path [concat $svr_modul$ext_tab]
    OMBALTER DEPLOYMENT_ACTION_PLAN 'SVR_ET' \
    ADD ACTION '$ext_tab' SET PROPERTIES (OPERATION) VALUES ('CREATE') \
    SET REFERENCE EXTERNAL_TABLE '$exttab_path'}
    OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'SVR_ET'
    There is one external table that gives a "VLD-0917" error - the error details are:
    java.lang.NullPointerException.
    Beforehand, we created the connectors with a similar loop.
    Any ideas?
    Michael

    Hi Michael,
    Try to make it simple the first time, without a loop. When you have that fixed, try to do it with a loop.
    The example below, without a loop, works for me.
    /JZ
    --First script deploy to file
    set OMBLOG {c:\tmp_jz\omb_logfile_deploy_test_to_file.log}
    OMBCONNECT owb_o/owb_o@localhost:1521:owbdb USE REPOSITORY 'owbdb'
    OMBCC 'DATAWAREHOUSE'
    OMBCONNECT CONTROL_CENTER owb_o/owb_o@localhost:1521:dw \
    USE REPOSITORY 'owb_o'
    OMBCREATE TRANSIENT DEPLOYMENT_ACTION_PLAN 'DEPLOY_TEST' \
    ADD ACTION 'ST_ET_TABELL' \
    SET PROPERTIES(OPERATION) VALUES ('CREATE') \
    SET REFERENCE EXTERNAL_TABLE '/DATAWAREHOUSE/TEST/ET_TABLE'
    OMBDEPLOY DEPLOYMENT_ACTION_PLAN 'DEPLOY_TEST' \
    AS SPECIFICATION TO 'c:\\tmp_jz\\deploy_test_to_file.xml'
    OMBDISCONNECT CONTROL_CENTER
    OMBDISCONNECT
    exit
    --Second script deploy from file to control center
    set OMBLOG {omb_logfile_deploy_test_to_cc.log}
    OMBCONNECT CONTROL_CENTER owb_ow/owb_o@localhost:dw \
    USE REPOSITORY 'owb_owner'
    OMBDEPLOY SPECIFICATION FROM 'deploy_test.xml'
    OMBDISCONNECT CONTROL_CENTER
    exit
