SQL and External table question

Hello everyone,
I have a file to be read as an external table; part of it is below:
ISO-10303-21;
HEADER;
FILE_DESCRIPTION((''),'2;1');
FILE_NAME('BRACKET','2005-07-08T',('broilo'),(''),
'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400',
'PRO/ENGINEER BY PARAMETRIC TECHNOLOGY CORPORATION, 2004400','');
FILE_SCHEMA(('CONFIG_CONTROL_DESIGN'));
ENDSEC;
DATA;
#5=CARTESIAN_POINT('',(5.5E0,5.5E0,-5.1E1));
#6=DIRECTION('',(0.E0,0.E0,1.E0));
#7=DIRECTION('',(-1.E0,0.E0,0.E0));
#8=AXIS2_PLACEMENT_3D('',#5,#6,#7);
The first question is: how do I ignore the lines up to the DATA; line, or does SQL already do that for me?
The second question is: the fields of interest are separated by commas, and the first field does not interest me (it is handled with a VARCHAR2). How can I read the following fields as numbers, ignoring the (, # and ) characters, please?
Thanks for any help.
Sincerely yours,
André Luiz

The SKIP option can be used with SQL*Loader to skip a certain number of lines before starting to load. Offhand I cannot see any easy way to load the data in the format given; the format does not resemble a typical CSV format. You can look at the test cases provided for SQL*Loader in the Oracle® Database Utilities guide, or simply write PL/SQL code to load this data manually using UTL_FILE.
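If an external table is still wanted, one possible pattern (a sketch only, untested; the directory name step_dir and file name bracket.stp are placeholders) is to read every raw line into a single VARCHAR2 column, use LOAD WHEN to keep only the #nn= data records, and extract the numbers with REGEXP_SUBSTR:

```sql
-- Sketch: load each raw line as one column; LOAD WHEN skips the header
-- lines, since only the data records begin with '#'.
CREATE TABLE step_raw (raw_line VARCHAR2(4000))
ORGANIZATION EXTERNAL (
  TYPE oracle_loader
  DEFAULT DIRECTORY step_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    LOAD WHEN (1:1) = '#'
    FIELDS (raw_line CHAR(4000))
  )
  LOCATION ('bracket.stp')
)
REJECT LIMIT UNLIMITED;

-- The 1st numeric token is the entity number after '#'; tokens 2-4 are
-- the coordinates, so the ( , # and ) characters are simply not matched.
SELECT TO_NUMBER(REGEXP_SUBSTR(raw_line, '-?[0-9]+(\.[0-9]*)?(E-?[0-9]+)?', 1, 2)) AS x,
       TO_NUMBER(REGEXP_SUBSTR(raw_line, '-?[0-9]+(\.[0-9]*)?(E-?[0-9]+)?', 1, 3)) AS y,
       TO_NUMBER(REGEXP_SUBSTR(raw_line, '-?[0-9]+(\.[0-9]*)?(E-?[0-9]+)?', 1, 4)) AS z
  FROM step_raw
 WHERE raw_line LIKE '%CARTESIAN_POINT%';
```

This avoids any preprocessing step, at the cost of doing the parsing in the SELECT rather than in the access driver.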

Similar Messages

  • Fields terminated by (SQL loader, external table) question?

    Hello.
    I have a txt file which looks like:
    Columns:
    A..........B.........C...........D.........E..............F.............G...........H
    739.......P.........0002......05........25012006..25012006..5...........data group
    . = space
    There are different numbers of spaces between the columns.
    What must I use in FIELDS TERMINATED BY to import this?
    Thanks.

    So don't use FIELDS TERMINATED BY; instead, as Ino suggested, use fixed format, something like:
    LOAD DATA
    TRUNCATE INTO TABLE <table name>
    (a position(1:10),
    b position(11:20),
    c position(21:30),
    d position(31:40),
    e position(41:48) date "ddmmyyyy",
    f position(51:58) date "ddmmyyyy",
    g position(61:72),
    h position(73:92))

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and external tables?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanks

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin

  • SQL LOADER, EXTERNAL TABLE and ODBC DATA SOURCE

    hello
    Can anybody help with loading data from a dBase file .dbt into an Oracle 10g table?
    I tried yesterday with SQL*Loader, an external table, and an ODBC data source.
    Why are all of these utilities failing to solve my problem?
    Is there an efficient way to reach this goal?
    Thanks in advance

    Export the dBase data file to a text file;
    then you have the choice of using either SQL*Loader or the external table option.
    Regards

  • Comparison of Data Loading techniques - Sql Loader & External Tables

    Below are two techniques for loading data from flat files into Oracle tables.
    1)     SQL*Loader:
    a.     Place the flat file (.txt or .csv) in the desired location.
    b.     Create a control file:
    LOAD DATA
    INFILE "Mytextfile.txt" -- file containing table data; specify paths correctly, it could be .csv as well
    APPEND -- or TRUNCATE, based on requirement
    INTO TABLE oracle_tablename
    FIELDS TERMINATED BY ',' -- or whatever delimiter the input file uses
    OPTIONALLY ENCLOSED BY '"'
    (field1, field2, field3)
    c.     Now run Oracle's sqlldr utility at the command prompt:
    sqlldr username/password control=mycontrolfile.ctl
    d.     The data can be verified by selecting it from the table:
    Select * from oracle_table;
    2)     External Table:
    a.     Place the flat file (.txt or .csv) on the desired location.
    abc.csv
    1,one,first
    2,two,second
    3,three,third
    4,four,fourth
    b.     Create a directory
    create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
    c.     After granting appropriate permissions to the user, we can create external table like below.
    create table ext_table_csv (
    i Number,
    n Varchar2(20),
    m Varchar2(20))
    organization external (
    type oracle_loader
    default directory ext_dir
    access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null)
    location ('abc.csv'))
    reject limit unlimited;
    d.     Verify data by selecting it from the external table now
    select * from ext_table_csv;
    The external tables feature is a complement to existing SQL*Loader functionality.
    It allows you to:
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely.
    •     Do a parallel direct path load without splitting up the input file.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
    Using SQL*Loader you can:
    •     Load the data from a stored procedure or trigger (INSERT is not sqlldr).
    •     Do multi-table inserts.
    •     Flow the data through a pipelined PL/SQL function for cleansing/transformation.
    Comparison for data loading:
    To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
    So, when you create the external table with a degree of parallelism of 4, the database will divide the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
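A minimal sketch of such a parallel load (ext_table_csv and target_tab are placeholder names, not from the post above):

```sql
-- Hedged sketch: enable a parallel direct-path load from an external table.
ALTER TABLE ext_table_csv PARALLEL 4;

CREATE TABLE target_tab PARALLEL 4 NOLOGGING AS
SELECT /*+ PARALLEL(e, 4) */ * FROM ext_table_csv e;
```

The CTAS here also illustrates the workaround mentioned in the conclusion: copying external-table data into a regular Oracle table.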
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle Tables using DB links.

    Please let me know your views on this.

  • Failed to parse SQL Query - External Tables [Apex 4.0.2]

    Greetings experts -
    Has anyone encountered errors when creating a report in Apex that sources from an external table within the database? I'm using the 4.0.2 version that is packaged with the 11g XE edition on 64bit CentOS.
    For example, I might run:
    SELECT NULL LINK,
    COL1 LABEL,
    COL2 VALUE
    FROM MYTAB;
    Where MYTAB is an external table sitting on top of a comma-separated file. When I go to create the page, I'm prompted with the "Failed to parse SQL" dialogue.
    I noticed that if I did CTAS on the external table, and referenced the CTAS table, my page ran without problem!
    Any ideas? Is this a known "limitation" of Apex?
    Thanks,
    CJ

    Chiedu,
    Please try removing all declarative validations on this tabular form, and see if it works as expected then. There are some limitations on the type of views and joins that are supported by tabular forms when using the new declarative validations, i.e. you'll need a key preserved table in your view or join.
    Regards,
    Marc

  • SQL Loader/External Table multiple record delimiters

    Hi every one.
    I have a strange problem: I have an external CSV file which I wish to load (via external tables or SQL*Loader). This CSV is not at all organized in structure, and it contains records that are mixed together; some records are delimited by newline characters and others by a different character. In short, I want to know whether I will be able to load the data from this CSV separating records by the newline character and another character. Is it possible to have multiple record delimiters specified in the same ctl file?

    abohsin,
    I think using the stream record format would be helpful in your case. Please explore that.
    Using the stream record format option, instead of the default newline, you can specify a user-defined record delimiter.
    Check this link.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005509
    Here is what I did. Not the complete answer, but it might be helpful.
    Replace all delimiters with a standard delimiter (in Unix):
    sed 's/HEAD,/**DLM**/g' < test.dat > test2.dat
    sed 's/TAIL,/**DLM**/g' < test2.dat > test3.dat
    create table t(
      TEXT varchar2(100)
    );
    and use that delimiter as the record delimiter in the control file:
    load data
    infile "test3.dat" "str '**DLM**'"
    into table T
    TRUNCATE
    fields terminated by 'XXXXX' optionally enclosed by '"'
    (TEXT)
    sql> select * from t;
    TEXT
    1111,2222,
    4444,5555,
    4444
    1111,3333,
    8888,6666,
    5555
    You should also replace newline characters with '**DLM**'.
    Thanks,
    Rajesh.

  • External Table Question

    I created the external table ext_sftm (see create statement below) and it worked until I added MISSING FIELD VALUES ARE NULL.
    Now when I run SELECT * FROM EXT_SFTM I get the following errors:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "missing": expecting one of: "exit, ("
    KUP-01007: at line 10 column 5
    ORA-06512: at "SYS.ORACLE_LOADER", line 14
    ORA-06512: at line 1
    What am I doing wrong?
    CREATE TABLE ext_sftm
    (map_id NUMBER(38,0),
    source_id NUMBER(38,0),
    member_acis VARCHAR2(20),
    source_facility_code VARCHAR2(50),
    source_facility_name VARCHAR2(100),
    source_facility_address VARCHAR2(40),
    source_facility_city VARCHAR2(30),
    source_facility_state CHAR(2),
    source_facility_zip VARCHAR2(10),
    source_name VARCHAR2(100))
    ORGANIZATION EXTERNAL (
    DEFAULT DIRECTORY DCS_DIR
    ACCESS PARAMETERS(RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8ISO8859P1
    BADFILE 'DCS_DIR':'SFTM.bad'
    LOGFILE 'SFTM.log'
    READSIZE 1048576
    SKIP 1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL)
    LOCATION (DCS_DIR:'SFTM.csv'))
    REJECT LIMIT UNLIMITED
    /

    According to Metalink note:
    Subject:      Select from external table gives ORA-29913 ORA-29400 KUP-00554 KUP-01005
         Doc ID:      Note:302672.1      Type:      PROBLEM
         Last Revision Date:      22-APR-2005      Status:      MODERATED
    This is a syntax error due to a wrong order of field definitions and record format info.
    I suspect something wrong with this line:
    REJECT ROWS WITH ALL NULL FIELDS
    See the following syntax diagram:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/img_text/et_fields_clause.htm
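If that is the cause, the fix (a hedged sketch based on the syntax diagram; the clause order is the only change from the original post) would be to move MISSING FIELD VALUES ARE NULL ahead of REJECT ROWS WITH ALL NULL FIELDS:

```sql
-- Sketch only: clause order per the et_fields_clause syntax diagram.
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
```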
    Message was edited by:
    Pierre Forstmann

  • Oracle SQL Loader - External Table

    This is my code:-
    CREATE TABLE GLO_CUST_EXT
    (SOURCE_ID CHAR(3),
    BUS_DT CHAR(8),
    TRAN_CD CHAR(1),
    CUST_CD NUMBER(10),
    MNEMONIC CHAR(10),
    SHORT_NM CHAR(255))
    ORGANIZATION external
    (TYPE oracle_loader
    DEFAULT DIRECTORY Source_dir
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE Skip 1
    badfile bad_dir:'GLO_CUST%a_%p.bad'
    LOGFILE log_dir:'GLO_CUST%a_%p.log'
    FIELDS TERMINATED BY '|'
    (SOURCE_ID ,
    BUS_DT ,
    TRAN_Cd ,
    CUST_CD,
    MNEMONIC,
    SHORT_NM))
    LOCATION ('GBCUS.txt'))
    REJECT LIMIT 0;
    Question:
    How can I load only selected fields?
    Example:
    I want all of these fields (SOURCE_ID, BUS_DT, TRAN_CD,
    CUST_CD, SHORT_NM) except MNEMONIC. Thank you.

    Mohan Nair,
    This is my code:
    CREATE TABLE GLO_CUST_EXT5
    (SOURCE_ID CHAR(3),
    BUS_DT CHAR(8),
    TRAN_CD CHAR(1),
    CUST_CD NUMBER(10),
    MNEMONIC CHAR(10),
    SHORT_NM CHAR(255))
    ORGANIZATION external
    (TYPE oracle_loader
    DEFAULT DIRECTORY Source_dir
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE Skip 1
    badfile bad_dir:'GLO_CUST%a_%p.bad'
    LOGFILE log_dir:'GLO_CUST%a_%p.log'
    FIELDS TERMINATED BY '|'
    (SOURCE_ID ,
    BUS_DT ,
    TRAN_Cd ,
    CUST_CD,
    MNEMONIC FILLER,
    SHORT_NM))
    LOCATION ('GBCUS.txt'))
    REJECT LIMIT 0;
    I'm using Oracle 10g. I execute this code using TOAD version 8.5.0.50.
    I can create the external table, but when I run a simple select statement
    (select * from GLO_CUST_EXT5), I get this error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "binary_double, binary_float, comma, char, date, defaultif, decimal, double, float, integer, (, nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned, varrawc, varchar, varraw, varcharc, zoned"
    KUP-01008: the bad identifier was: FILLER
    KUP-01007: at line 9 column 10
    ORA-06512: at "S
    Can you help me solve this problem? Thank you.
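Since the access driver itself rejects FILLER (the KUP-01008 message above names it as the bad identifier), a common workaround, sketched here without testing, is to list every field in the access parameters but omit MNEMONIC from the table's column list; the oracle_loader driver matches fields to columns by name and simply discards fields that have no matching column:

```sql
-- Hedged sketch: MNEMONIC is described as a field in the data file
-- but omitted from the column list, so it is read and discarded.
CREATE TABLE GLO_CUST_EXT5
(SOURCE_ID CHAR(3),
BUS_DT CHAR(8),
TRAN_CD CHAR(1),
CUST_CD NUMBER(10),
SHORT_NM CHAR(255))
ORGANIZATION external
(TYPE oracle_loader
DEFAULT DIRECTORY Source_dir
ACCESS PARAMETERS
(RECORDS DELIMITED BY NEWLINE Skip 1
FIELDS TERMINATED BY '|'
(SOURCE_ID,
BUS_DT,
TRAN_CD,
CUST_CD,
MNEMONIC,
SHORT_NM))
LOCATION ('GBCUS.txt'))
REJECT LIMIT 0;
```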

  • Korean Characters and External Table

    Hi Experts.
    We have data exported from teradata through fexp utility.
    The file that we export has Korean characters in it. We are trying to load the file
    into an Oracle DB (10.2) through external tables, but we are not even able to run a select query against the external table.
    Is there any character set setting we have to make for the external table?
    The above scripts are executed from the DB server itself through Unix.

    What is the character set and national character set of the Oracle database?
    What character set is the data file encoded in?
    What character set is specified in the external table definition?
    What does "not able to do a select query" mean? Does that mean that you get an error? If so, what error?
    Justin
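One concrete thing to check while answering those questions: the external table's access parameters can declare the data file's encoding explicitly. A sketch (the character set name here is purely an assumption; it must match what the fexp export actually wrote):

```sql
-- Sketch: declare the data file's character set in the access parameters.
-- AL32UTF8 is an assumed value; substitute the file's real encoding.
RECORDS DELIMITED BY NEWLINE CHARACTERSET AL32UTF8
```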

  • Is the PXIe-PCIe8361 adequate for this system? And external clock questions...

    Hi all,
    I have spent some time piecing together a system and I'd like a sanity check before pulling the trigger on this purchase.  The system will contain the following hardware:
    1. Chassis: PXIe-1078
    2. Controller: PXIe-PCIe8361
    3. 3 x PXIe-6363 (16 analog inputs each card, 32 digital inputs each card, all internally clocked @ 10kHz)
    4. 2 x PXI-6224 (32 digital inputs on one, 8 digital inputs on the other, externally clocked in "bursts" of 62.5khz)
    5. Labview software
    The three PXI-6363 cards will be responsible for  a mix of analog and digital measurements made @ 10 kHz, timed continuously by the onboard clock.
    One PXI-6224 will be clocked externally @ 62.5 kHz and will be used to collect digital data on a 32-bit port.  These clock pulses will not be continuous, but will occur in bursts lasting for 2ms every 20ms.
    The other PXI-6224 will be clocked externally @ 62.5kHz as well and will be used to collect digital data on an 8-bit port. These clock pulses will not be continuous, but will occur in bursts lasting for 2 ms at random intervals.
    My questions are:
    1. Am I planning anything that looks unreasonable for this hardware?
    2. Should I expect issues with data transfer rates with the PXIe-PCIe8361? I will be operating well within the advertised 110 MB/s throughput of the device. I plan to stream using this method: NI Fast TDMS data streaming.
    3.  I have only ever used NI cards for continuous measurements made by an onboard clock.  When I set up a task to collect data that is externally-timed, will the DAQ be expecting a "continuous" clock pulse, or will the system wait patiently for clock pulses to arrive at any rate (any rate within the spec of the card, of course)?
    Thanks, any input is appreciated.

    Hello LucasH0011
    1. As long as you put the PXI-6224 and PXIe-6363 cards in the corresponding slots (the express PXIe-6363 cards in express slots, the PXI-6224 cards in hybrid slots), you should be fine.
    2. I think you would not have issues with the transfer rate.
    3. Your timing specifications sound reasonable to me; I think you will be fine.
    Here is a document that has useful concepts for the use of cards:
    http://www.ni.com/white-paper/3615/en/
    It is for the M-Series, but the concepts apply to the X-Series as well. 
    Regards 
    Ernesto

  • Aperture - library management and external HDD question

    Hi all. 
    I have just graduated from a point and shoot to a Panasonic GH2 (love it), and have now begun using Aperture 3 rather than iPhoto on my early 2008 MB Pro to manage my photos going forwards.  Of course, I’ve now discovered that Aperture is quite the resource hog and so it’s upgrade time (darn, “have” to buy a new ‘puter!).  I have a 2011 MB Pro (2.3Ghz i7 with 512GB SSD) on order and 8GB of DDR 3 arriving from Crucial.  Since I have this brand-new-computer opportunity I want to make sure I’m organising things properly before I start transferring things across, and so have a few Aperture-related questions.  (In case it’s relevant, I’m shooting in RAW+JPEG.  So far I’ve been using RAW as master, but have since learnt it might be a good idea to import JPG as master and switch to RAW only when I need to make corrections, so I’ll probably do that going forwards.)
    I understand that moving to referenced masters on an external drive might be a good idea and save me precious SSD-space.  To that end, questions are:
    1. Can anyone recommend a companion external HDD for Aperture and the 2011 MBPro?  I guess either FW800 or Thunderbolt are the way to go.  The Lacie Little Big Disk Thunderbolt might be an option but is this overkill for Aperture masters or would FW800 be sufficient.  I’ve also seen the G-Tech G-RAID mini, Lacie Rugged – thoughts welcome.
    Key requirements are a) as compact as possible, and b) bus powered.
    2. What kind of performance can I expect if I go down this route?  Is there going to be significant loading/processing delay whenever I switch to a new image?
    3. How will Aperture cope with (eg) syncing photos to iPad / iPhone if the drive containing the masters isn’t connected?  Put another way, are JPG renders saved in the Aperture library (i.e. on my MBP SSD) or with the masters?
    Thanks in advance to anyone who responds!
    Aljrob

    Aljrob_UK wrote:
     ...I have a 2011 MB Pro (2.3Ghz i7 with 512GB SSD) on order and 8GB of DDR 3 arriving
    ...I understand that moving to referenced masters on an external drive might be a good idea and save me precious SSD-space.
    1. Can anyone recommend a companion external HDD for Aperture and the 2011 MBPro?  I guess either FW800 or Thunderbolt are the way to go.
    ...Key requirements are a) as compact as possible, and b) bus powered.
    OWC (an excellent vendor) has the Elite Pro Mini hard drive that meets your specs:
    http://eshop.macsales.com/shop/firewire/EliteALmini/eSATA_FW800_FW400_USB
    Thunderbolt drives are not mainstream yet, but eSATA and FW800 both work well. The multiple connection methods of OWC drives allow very desirable flexibility when purposing/repurposing drives.
    Note too that the MBP optical drive can be replaced with up to a 1 TB hard drive, DIY, or OWC will do it for you. That is what I am doing with my 17" 2011 MBP.
    2. Is there going to be significant loading/processing delay whenever I switch to a new image?
    SSD latency is orders of magnitude less than that of hard drives. Even fast hard drives with fast connectivity add significant latency when switching to a new image. To avoid that, what I do is leave (referenced) masters on the SSD until all editing is complete (which may be a few weeks). Only then do I use Aperture to change the referenced masters' location from the SSD to a large external drive.
    What kind of performance can I expect if I go down this route?
    With Masters on the SSD and 8 GB RAM imports/exports are very fast and all Aperture editing is essentially instant. You will be pleased!
    Suggested workflow steps for Referenced Masters:
    • Use a FW card reader or MBP slot to copy to a file folder on the SSD (never directly into Aperture or any other images management app). With fast camera cards copy times are quick, but cheap slow cards can slow this step down a lot.
    • Eject and physically disconnect the card reader.
    • Back up that file folder on external drive(s).
    • Only after backup is complete, reformat the camera card in-camera.
    • Import images into Aperture from the file folder on the SSD.
    HTH
    -Allen Wicks

  • KM Search and External Link Question

    Q1: I have created a Meta Data Properties for a Folder, so any document Created in this folder will require these data to be entered, I want to put a Filter Option on the this folder Iview where user can filter data on the basis on Meta data Property or Search document based on Multiple Meta Data Property using TREX or any other search.
    Q2. If I create a External link to a document in KM folder, will TREX search in that document.
    Thanks in Advance
    Jagraj Dhillon

    Hi Jagraj,
    Q1: You have to define the Meta Property as Indexable than you can use TREX for searching for documents with the specific value of this property. Of cause you can as well filter the documents when displaying the content of the folder. In this case you have to implement a resource list filter.
    Q2: TREX is able to index links as well.
    The question is whether you really mean external links for referencing documents in KM folders, because normally you do this with an internal link; an external link in most cases is a reference outside the portal. Nevertheless, see http://help.sap.com/saphelp_nw70/helpdata/en/73/66c090acf611d5993700508b6b8b11/frameset.htm where you can see you can define a parameter indexContentOfExternalLink and a parameter IndexInternalLinks. In this case the index will contain the content of the links as well.
    Best Regards
    Frank

  • Macbook pro 13" and external monitor question

    Hi,
    I got a mbp 13" 2010
    I would like to buy an HP 2510i monitor: http://h20000.www2.hp.com/bizsupport/TechSupport/Document.jsp?objectID=c02076507&lang=en&cc=us&taskId=115&contentType=SupportFAQ&prodSeriesId=3757979&prodTypeId=382087
    Today I brought my MacBook to a computer store and connected it to a 27" Samsung monitor using a Mini DisplayPort to HDMI cable.
    When I mirrored the screens I got two black areas on the left and right; basically it wasn't fullscreen. (The MacBook's resolution is 1280x800, so I think I got the same resolution on the external monitor.)
    But when I disable the mirror option it's all good and supports 1080p resolution.
    How can I get fullscreen (1080p) even when I am mirroring the screen? Will this HP monitor be able to do this?
    thanx

    You can change your screen resolution by going to System Preferences > Displays; from there you can select the best resolution. However, it will never be perfect, as the aspect ratio of your HP screen differs from your Mac's. You could either go without mirroring (what I do with my Cinema Display), or plug it into your screen and then shut your Mac's lid so the dock moves onto the HP screen and it becomes your only display. You would, however, need an external mouse and keyboard to control this.

  • Loading Flat File and Temporary Table question

    I am new to ODI. Is there a method to load a flat file directly into the target table without having ODI create a temporary staging table? I am asking this to find out how efficient ODI is at bulk loading a very large set of data.

    You can always modify the KM to skip the loading of the temporary table and load the target table directly. Another option would be to use the SUNOPSIS MEMORY ENGINE as your staging database. But, this option has some drawbacks, as it relies on available memory for the JVM and is not suitable for large datasets (which you mention you have).
    http://odiexperts.com/11g-oracle-data-integrator-%E2%80%93-part-711g-%E2%80%93-sunopsis-memory-engine/
    Regards,
    Michael Rainey
    Edited by: Michael R on Oct 15, 2012 2:44 PM
