External Tables to Unix File System 10G R2

Can anyone help with setting up an external table that reads a flat file from a Unix file system?
I have sampled a file OK and created an external table and deployed it to the database OK, but it cannot find the link through to the Unix file system to read the file.
I created the location as an FTP type and referenced the path to the relevant directory, /oracle02/app/OWB_files, when creating the location. I have placed the relevant named file in the directory, but when I try to look at the table in TOAD I get the following errors:
ORA-29913 error in executing ODCIEXTTABLEOPEN callout
ORA-29400 data cartridge error
KUP-04040 files 121123_PENS.txt in MLCC_FILES_LOCATION_0 not found
Does anyone have a step-by-step guide for creating these, and am I doing something wrong? Is choosing the FTP type in the location correct, and is the path specified correctly? I can't see much information on this in the manual!
Your assistance would be appreciated

Hi,
Make sure that the path is a shared one. You can do this using a Samba server.
Regards,
Gowtham Sen.
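For reference, a minimal sketch of the usual setup when the flat file sits on the database server itself (the directory, table, and column names below are hypothetical; the path and file name are the ones from the post). Note that ORACLE_LOADER external tables read the file directly from the database server's file system, so the file must be visible to the database host, either locally or via a mount such as the Samba share suggested above:

CREATE OR REPLACE DIRECTORY mlcc_files_dir AS '/oracle02/app/OWB_files';
GRANT READ, WRITE ON DIRECTORY mlcc_files_dir TO owb_user;  -- hypothetical schema

CREATE TABLE pens_ext
( pens_col1 VARCHAR2(50),
  pens_col2 VARCHAR2(50)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY mlcc_files_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('121123_PENS.txt')
)
REJECT LIMIT UNLIMITED;

-- Quick sanity check:
SELECT COUNT(*) FROM pens_ext;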

Similar Messages

  • Uploaded Files stored in Oracle 10G database or in Unix File system

    Hey All,
    I am trying to understand best practices for storing uploaded files. Should you store them within the database itself (this is the current method we are using, leveraging BLOB storage), or use a BFILE locator to use file system storage (we have our DBs on UNIX), or is there another method I should be entertaining? I have read arguments on both sides of this question. I wanted to see what answers forum readers could provide! I understand there are quite a few factors, but the situation I am in is as follows:
    1) Storing text and pdf documents.
    2) File sizes range from a few Kb to up to 15MB in size
    3) uploaded files can be deleted and updated / replaced quite frequently
    Right now we have an Oracle stored procedure that uploads the files' binary data into a BLOB column on our table. We have no real "performance" problems with this method, but we are entertaining the idea of using the UNIX file system for storage instead of the database.
    Thanks for the insight!!
    Anthony Roeder

    Anthony,
    First word you must learn here in this forum is RESPECT.
    If you require any further explanation, just say so.
    BLOB compared with BFILE
    Security:
    BFILEs are inherently insecure, as insecure as your operating system (OS).
    Features:
    BFILEs are not writable from typical database APIs whereas BLOBs are.
    One of the most important features is that BLOBs can participate in transactions and are recoverable. Not so for BFILEs.
    Performance:
    Roughly the same.
    Upping the size of your buffer cache can make a BIG improvement in BLOB performance.
    BLOBs can be configured to exist in Oracle's cache which should make repeated/multiple reads faster.
    Piecewise/non-sequential access of a BLOB is known to be faster than that of a BFILE.
    Manageability:
    Only the BFILE locator is stored in an Oracle BACKUP. One needs to do a separate backup to save the OS file that the BFILE locator points to. The BLOB data is backed up along with the rest of the database data.
    Storage:
    The amount of tablespace required to store file data in a BLOB will be larger than the file itself because of the LOB index, which is also the reason for the better BLOB performance for piecewise random access of the BLOB value.
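    As a small illustration of how the two are declared and populated (the table, directory object, and file names here are hypothetical, and DOC_DIR is assumed to already exist and point at the OS directory holding the files):
    CREATE TABLE uploaded_docs
    ( doc_id    NUMBER PRIMARY KEY,
      doc_blob  BLOB,   -- bytes stored inside the database; transactional and recoverable
      doc_bfile BFILE   -- only a locator is stored; the bytes stay in the OS file
    );
    -- The BLOB participates in the transaction; the BFILE merely points at a file
    INSERT INTO uploaded_docs (doc_id, doc_blob, doc_bfile)
    VALUES (1, EMPTY_BLOB(), BFILENAME('DOC_DIR', 'policy.pdf'));
    COMMIT;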

  • IPod formatted to UNIX File System (UFS)

    Hello all, I had an old 15G iPod (the generation with the 4 buttons above the wheel) and I decided to reformat it and use it as a FireWire drive; it's not used for music anymore. Anyway, I was about half way through the format when I noticed I had accidentally selected UNIX File System (UFS) instead of Mac OS Extended. Well, I let it finish and thought I'd just reformat it again to Mac OS Extended, only now the iPod's not even being recognized by OS X. I looked in Disk Utility and nothing is there either. iPod restore does nothing either, just says connect an iPod. There's an Apple on the screen and that's it. What gives? Will I be able to get this thing running again?

    Thanks but no thanks! I got it going using info at this link:
    http://ipodlinux.org/Installation
    Got it in Disk Mode!!!

  • Open API App (Windows based) fails to open FMB on Unix file system

    My Open API, Windows based app, can successfully open and 'get' properties of FMBs stored in the Windows file system. However, it fails to load the FMB when the FMB resides on a networked Unix server. The same FMBs on Unix can be opened by the Windows based FormBuilder (over the network). I can copy the FMB down to Windows and without re-compiling the FMB, my Open API app can 'load' the FMB and 'get' all the properties. What suggestions can you give for debugging / resolving this? I need to be able to 'Load' the FMBs (through the Open API), that reside in the Unix file system, from Windows.
    JJ

    Generally, this is why we will tell you that accessing net shares is not supported, and even in places where it might be supported, we would still suggest that it is not recommended. Accessing via net shares (especially through Windows) is often problematic. There are various performance and connectivity issues that, unfortunately, fool you into believing that the product you are using is flawed when the problem is really a connection issue with the share.
    In your case, because you are not exactly using an Oracle product (initially), Oracle can't offer much anyway, but I would recommend against using shares whenever possible. If you need to access a file, copy it locally first, perform whatever task on it, then return the updated file to its origin. This method protects you from things like net failure and instability as well as the performance issues associated with accessing files remotely.

  • Download PDF spool to unix file system

    Does anyone know how to programmatically download a PDF document spool to the Unix file system?
    I am trying to find a method to send PDF documents that have gone to spool to a unix file system. If anyone had any ideas on how to do this, please let me know.
    Thanks.

    Hi,
    For this, define a logical file using transaction FILE.
    In your code you can get the complete path of the file using function module FILE_GET_NAME.
    Then write your PDF using:
    OPEN DATASET ...
    TRANSFER ...
    Hope this helps.
    Best regards

  • Can we create single External Table for multiple files?

    Hi,
    Can we create an external table for multiple files? Could anyone please explain it.
    Thanks and regards
    Gowtham Sen.

    to merge 16 files having same structure
    Well, if the files have the same structure then, as per the example in the following documentation, you can create one external table for all your files:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#i1007480
    Nicolas.
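    A minimal sketch of what that looks like, assuming all the files share the same structure and sit under the same directory object (the table, directory, column, and file names here are hypothetical):
    CREATE TABLE merged_files_ext
    ( col1 NUMBER,
      col2 VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('file01.csv', 'file02.csv', 'file03.csv')  -- list all 16 files here
    )
    REJECT LIMIT UNLIMITED;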

  • UNIX File System?

    I was able in the past to format a disk with UNIX File System and read and write to it using Disk Utility in OS X Tiger. Now, my new Leopard machine does not show that option. What happened to that? How do I get it back?
    Thanks,
    John

    I'm pretty sure UFS support is totally gonzo in Leopard. You certainly [can not install Leopard|http://docs.info.apple.com/article.html?artnum=306516] on UFS-formatted drives.

  • Java script in HTMLDB to check if file exists in Unix file system

    How do I use JavaScript to check if a file exists in the Unix file system? I would like to display the columns only if the file exists.

    Hello,
    This is one of those features that the manuals do not cover.
    How to use and build AJAX features could be a whole book all by itself, and it's not really an HTML DB-specific feature, even though we have built some hooks in the application and JavaScript to make it easier.
    Take a look at this thread
    Netflix: Nice UI ideas
    and I've built some examples here
    http://htmldb.oracle.com/pls/otn/f?p=11933:11
    Or just search the forums for AJAX or XMLHTTP
    Carl
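    If a server-side check is acceptable instead of JavaScript, UTL_FILE.FGETATTR can report whether a file exists under a database directory object; a minimal sketch (the directory object and file name are hypothetical):
    DECLARE
      l_exists BOOLEAN;
      l_length NUMBER;
      l_block  BINARY_INTEGER;
    BEGIN
      -- EXT_DIR is assumed to be an existing directory object the schema can read
      UTL_FILE.FGETATTR('EXT_DIR', 'myfile.txt', l_exists, l_length, l_block);
      IF l_exists THEN
        DBMS_OUTPUT.PUT_LINE('File found, ' || l_length || ' bytes');
      ELSE
        DBMS_OUTPUT.PUT_LINE('File not found');
      END IF;
    END;
    /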

  • Unix file system & Oracle datafiles--urgent plz

    How can I check which Unix file system my Oracle database files are on, in an HP-UX environment?

    select * from dba_data_files
    AUTOEXTENSIBLE column gives you whether autoextend is on or not.
    Join with dba_free_space to get free space for each file.
    You can check the following link
    http://www.oracle.com/technology/oramag/code/tips2003/083103.html
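    A sketch of the kind of query described above; it lists each datafile's OS path (so you can see which Unix file system it lives on) together with its size, autoextend setting, and free space:
    SELECT d.file_name,                                    -- full OS path of the datafile
           d.tablespace_name,
           ROUND(d.bytes / 1024 / 1024)              AS size_mb,
           d.autoextensible,
           ROUND(NVL(SUM(f.bytes), 0) / 1024 / 1024) AS free_mb
    FROM   dba_data_files d
           LEFT JOIN dba_free_space f ON f.file_id = d.file_id
    GROUP  BY d.file_name, d.tablespace_name, d.bytes, d.autoextensible
    ORDER  BY d.file_name;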

  • Issues with external table from excel file

    Dear all,
    I have been trying to use the statement below to create an external table. This table references an Excel file.
    CREATE TABLE EMPFRAUD_TEST
    ( SERIAL_NUM VARCHAR2(10 BYTE),
      BRANCH_CODE VARCHAR2(10 BYTE),
      BUSINESS_ADD VARCHAR2(100 BYTE),
      REGIONS VARCHAR2(50 BYTE),
      TRANSACTION_DATE_TIME DATE,
      REPORT_DATE_TIME DATE,
      NO_OF_TRANS VARCHAR2(4 BYTE),
      AMOUNT NUMBER,
      FRAUD_TYPE VARCHAR2(25 BYTE),
      IMPACT_CATEGORY VARCHAR2(10 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY EXT_EMP_TEST
      ACCESS PARAMETERS
      ( records delimited by newline
        badfile 'empfraud%a.bad'
        logfile 'empfraud%a.log'
        fields terminated by ','
        optionally enclosed by '"' lrtrim
        missing field values are null
      )
      LOCATION ('fraud.csv')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    The problem is as follows:
    1) When I run the query above, the table is created, but when I try to select from the table, an empty table is displayed.
    When I checked the error log file, the following messages were given (this is from an Oracle database on a Unix server):
    "L_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 71 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field ACCOUNT_KEY
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 79 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field SERIAL_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 80 rejected in file /home/oracle/ext_folder_test/fraud.csv
    error processing column TRANSACTION_DATE_TIME in row 1 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 2 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 3 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 8 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 9 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 10 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 11 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 12 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 13 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 14 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 15 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month"
    Please, I need help to resolve this fast.
    Thanks and regards,
    ajani abdulrahman olayide
    NB: below is the data I am trying to access from the Excel file, after conversion to .csv format:
    BUSINESS OFFICE,REGIONS,Transaction_Date_Time,Report_Date_Time,Account_Key (Account No),     Number_of_Transactions,Total_Amount (N)      
    1,162,9 ojo street,Lagos South,various ,17/01/10,16200388749987,1,5100000,CHEQUE
    2,0238,"10 cyril Road, Enugu",East,21/06/2006,23/12/10,020968765357 09867653920174,1,20000000
    3,0127,"261, obiageli Rd, Asaba",Mid-West,22/12/2010,23/12/10,'00160030006149,1,6000000
    4,0519,"just road, Onitsha
    ",East,12/03/2010,14/02/11,0896002416575,1,5000000
    5,0519,"just road, Onitsha
    ",East,03/12/2010,14/02/11,06437890134356,1,5000000
    6,149,olayide street,Lagos South,10/02/2010,17/02/11,NGN01492501036 ,1,6108950
    7,0066,wale,Mid - west,18/02/2011,18/02/10,'05590020002924,1,55157977.53
    8,66,john,Mid- west,11/03/2010,14/03/09,'00660680054177,1,6787500
    9,0273,waheed Biem,N/Central,Jan 09 to Dec 2010,01/04/11,Nil,1,14146040

    As others suggested, you have to do the debugging yourself. To avoid the date errors you may need something like this:
    CREATE TABLE EMPFRAUD_TEST
    ( SERIAL_NUM   VARCHAR2(10),
      BRANCH_CODE  VARCHAR2(10),
      BUSINESS_ADD VARCHAR2(100),
      REGIONS      VARCHAR2(50),
      TRANSACTION_DATE_TIME DATE,
      REPORT_DATE_TIME DATE,
      NO_OF_TRANS     VARCHAR2(50),
      AMOUNT          NUMBER,
      FRAUD_TYPE      VARCHAR2(25),
      IMPACT_CATEGORY VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL
    ( type oracle_loader default directory saubhik
      access parameters
      ( records delimited by newline
        badfile 'empfraud%a.bad'
        logfile 'empfraud%a.log'
        skip 1
        fields terminated by ','
        optionally enclosed by '"' ltrim
        missing field values are null
        ( serial_num,
          branch_code,
          business_add,
          regions,
          transaction_date_time date "dd/mm/rrrr",
          report_date_time date "dd/mm/rr",
          no_of_trans,
          amount,
          FRAUD_TYPE,
          IMPACT_CATEGORY
        )
      )
      LOCATION ('fraud.csv')
    )
    REJECT LIMIT UNLIMITED;

  • Creating external table - from a file with multiple field separators

    I need to create an external table from a flat file containing multiple field separators (",", ";" and "|").
    Is there any way to specify this in the CREATE TABLE (external) statement?
    FIELDS TERMINATED BY "," -- Somehow list more than just comma here?
    We receive the file from a vendor every week. I am trying to set up a process for some non-technical users, and I want to keep this process transparent to them and not require them to load the data into Oracle.
    I'd appreciate your help!

    scott@ORA92> CREATE OR REPLACE DIRECTORY my_dir AS 'c:\oracle'
      2  /
    Directory created.
    scott@ORA92> CREATE TABLE external_table
      2    (COL1 NUMBER,
      3       COL2 VARCHAR2(6),
      4       COL3 VARCHAR2(6),
      5       COL4 VARCHAR2(6),
      6       COL5 VARCHAR2(6))
      7  ORGANIZATION external
      8    (TYPE oracle_loader
      9       DEFAULT DIRECTORY my_dir
    10    ACCESS PARAMETERS
    11    (FIELDS
    12         (COL1 CHAR(255)
    13            TERMINATED BY "|",
    14          COL2 CHAR(255)
    15            TERMINATED BY ",",
    16          COL3 CHAR(255)
    17            TERMINATED BY ";",
    18          COL4 CHAR(255)
    19            TERMINATED BY ",",
    20          COL5 CHAR(255)
    21            TERMINATED BY ","))
    22    location ('flat_file.txt'))
    23  /
    Table created.
    scott@ORA92> select * from external_table
      2  /
          COL1 COL2   COL3   COL4   COL5
             1 Field1 Field2 Field3 Field4
             2 Field1 Field2 Field3 Field4
    scott@ORA92>

  • External tables-Fixed length file

    Hi All,
    I have a fixed-length file that I load daily using an external table. Recently, the length of one of the fields, IP, was changed, and the customer wants to send both old records with the 8-byte length and new records with the 11-byte length in the same data file until the complete migration takes place.
    Will it be possible for external tables to handle this requirement? Or is there any other possibility to treat it?
    The old file contains 104 fields, with the IP field at positions 490 to 498.
    The new file contains 104 fields, with the IP field at positions 490 to 501.
    Thanks,
    Sri.

    If the two record types are mixed in the same file, then you will have problems loading them. I can see two possible solutions, in no particular order of preference (using your example data):
    1. Redefine the external table something like:
    Position (record_type (1:1)
              version     (2:5)
              data        (6:41))
    then parse the remaining fields based on the version number when you select from the external table.
    2. Create two external tables over the same file, one for version 1.00 and one for version 1.01 using the LOAD WHEN clause to determine which set of data to load when you select. Something like:
    CREATE TABLE version1 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.00)
    < definition for the old format >
    and
    CREATE TABLE version101 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.01)
    < definition for the new format >
    Then your processing would use something like:
    SELECT ip, last_name
    FROM version1
    UNION ALL
    SELECT ip, last_name
    FROM version101
    HTH
    John
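    For the second approach, a minimal sketch of what one of the two tables might look like; the directory, file name, and field positions are hypothetical, and the second table would be identical apart from the LOAD WHEN value and the wider IP field:
    CREATE TABLE version1_ext
    ( record_type VARCHAR2(1),
      version     VARCHAR2(4),
      last_name   VARCHAR2(30),
      ip          VARCHAR2(8)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        LOAD WHEN (version = '1.00')          -- only pick up old-format records
        FIELDS
        ( record_type POSITION(1:1)     CHAR(1),
          version     POSITION(2:5)     CHAR(4),
          last_name   POSITION(6:35)    CHAR(30),
          ip          POSITION(490:497) CHAR(8)
        )
      )
      LOCATION ('daily_feed.dat')
    )
    REJECT LIMIT UNLIMITED;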

  • Can we bind a single external table with multiple files in OWB 11g?

    Hi,
    I wanted to ask if it is possible to bind an external table to multiple source files at the same or different locations, or whether an external table has to be bound to a single source file and a single location.
    Thanks in advance,
    Ann.
    Edited by: Ann on Oct 8, 2010 9:38 AM

    Hi Ann,
    Can you please help me out by telling me the steps to accomplish this.
    Right-click on the external table in the project tree and choose Configure from the menu;
    then, in the Configuration Properties dialog window that opens, right-click on the Data Files node and choose Create from the menu -
    you will get a new record for the file - specify the Data File Name property.
    Also link from OWB user guide
    http://download.oracle.com/docs/cd/B28359_01/owb.111/b31278/ref_def_flatfiles.htm#i1126304
    Regards,
    Oleg

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one file directory structure, as it is now, then the Unix team is afraid that we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+) or those with planning apps where users can run large long-running rules I would recommend you separate the application on separate volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times -- although I've had mixed results in getting the server/disk SMEs to assist in these kinds of efforts.
    A more advanced thing to worry about is journaling filesystems that share a common cache for all disks within a VG.
    Regards,
    -John

  • External table with empty file

    Hi,
    My db version: Oracle 11g
    I have an empty csv file.
    I created an external table for the empty csv file.
    When I run:
    select count(*) from externaltblname;
    it returns 1.
    It should return 0, right?
    In the definition I specified "SKIP 1",
    but it still returns 1.
    When I use this external table to load into a target table, it loads a single row with null values.
    How do I fix this? Please advise.

    What works for me is the following (t_ext points to an empty csv):
    SQL> select count(field) from t_ext;
    COUNT(FIELD)
               1
    1 row selected.
    SQL> select ascii(field) from t_ext;
    ASCII(FIELD)
              13
    1 row selected.
    SQL> select count(replace(field, chr(13))) from t_ext;
    COUNT(REPLACE(FIELD,CHR(13)))
                                0
    1 row selected.
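    So the "empty" file actually contains a stray carriage return, which the access driver treats as a one-field record. One possible fix, assuming the file really comes from a Windows-style source with CR/LF line endings (this is only a sketch of the access parameters, not a tested definition), is to consume the CR as part of the record delimiter and reject records that end up completely empty:
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY '\r\n'         -- treat the CR/LF pair as the delimiter instead of leaving CHR(13) in the field
      FIELDS TERMINATED BY ','
      MISSING FIELD VALUES ARE NULL
      REJECT ROWS WITH ALL NULL FIELDS    -- drop records that contain no data at all
    )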
