External table, fixed columns and variable record length

Hi all,
I'm trying to create an external table over a txt file. The records are delimited by newlines and the field lengths are fixed. But when one or more of the last columns are missing, the record is truncated.
Here's the problem: the access parameter MISSING FIELD VALUES ARE NULL doesn't seem to work here, and all incomplete lines are rejected!
Any ideas?

Well, it seems I was on a wild goose chase here.
The original version of the file was not in UTF8 but in ANSI; for some reason somebody thought it was necessary to convert the file to UTF8.
So, although my question isn't answered yet, my problem has been solved.
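For anyone who lands here with the same symptom: when a file's actual encoding and the character set the access driver assumes disagree, byte positions in fixed-width records no longer line up. One way to guard against that is to declare the file's character set explicitly. A minimal sketch, assuming a directory object ext_dir and illustrative file and column names (none of these come from the thread):
CREATE TABLE fixed_pos_ext (
  col_a VARCHAR2(10),
  col_b VARCHAR2(20)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    CHARACTERSET WE8MSWIN1252  -- declare the file's real encoding, not the database's
    FIELDS MISSING FIELD VALUES ARE NULL
    ( col_a POSITION(1:10)  CHAR(10),
      col_b POSITION(11:30) CHAR(20)
    )
  )
  LOCATION ('data.txt')
);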

Similar Messages

  • External Table - possible bug related to record size and total bytes in file

    I have an External Table defined with a fixed record size, using Oracle 10.2.0.2.0 on HP/UX. At 279-byte records (1 or more fields, doesn't seem to matter), it can read almost 5M bytes in the file (17,421 records to be exact). At 280-byte records it cannot; it blows up with "partial record at end of file" - which is nonsense. It can read up to 3,744 records, just below 1,048,320 bytes (1M bytes). One record over that, it blows up.
    Now, if I add READSIZE and set it to 1.5M, then it works. I found this extends further; for instance, a 280-byte recsize with READSIZE 1.5M will work for a while but blows up at 39M bytes in the file (I didn't bother figuring out exactly where it stops working in this case). Increasing READSIZE to 5M works again, up to 78M bytes in the file. But change the definition to 560-byte records and it blows up. Decrease the file size to 39M bytes and it still won't work with 560-byte records.
    Does anyone have an explanation for this behavior? The docs say READSIZE is the read buffer, and only mention that it matters for the largest record that can be processed - mine are only 280/560 bytes. My table definition is taken practically straight from the docs' example for fixed-length records (change the fields, sizes, and names and it is identical - all clauses the same).
    We are going to be using these external tables a lot and need them to be reliable, so increasing READSIZE to the largest value I can doesn't make me comfortable, since I can't be sure how large an input file may become in production.
    Should I report this as a bug to Oracle, or am I missing something?
    Thanks,
    Bob

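    For reference, READSIZE is specified alongside the record format in the access parameters. A minimal sketch, assuming 280-byte fixed records with a trailing newline (directory, file, and column names are illustrative, not from the thread):
    CREATE TABLE fixed_rec_ext (
      payload VARCHAR2(279)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS FIXED 280          -- the record length includes the newline byte
        READSIZE 5242880           -- 5 MB read buffer instead of the default
        FIELDS ( payload POSITION(1:279) CHAR(279) )
      )
      LOCATION ('big_file.dat')
    );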

  • Problems retrieving data from tables with 240 or more records

    Hi,
    I've been connecting to an Oracle 11g Server (not sure of the exact version) using the Oracle 10.1.0 Client and the O10 Oracle 10g driver. Everything was OK.
    I installed the Oracle 11.2.0 Client and started to have problems retrieving data from tables.
    First I used the same connection string, driver and so on (O10 Oracle 10g), then I tried ORA Oracle, but with no luck. The result is like this:
    I'm able to connect to the database. I'm able to retrieve data, but only from small tables (e.g. with 110 records it works perfectly using both the O10 and ORA drivers). When I try to retrieve data from tables with 240 or more records, the retrieval simply hangs (nothing happens at all - no error, no timeout). The application seems to hang forever.
    I'm using PowerBuilder to connect to the database (either PB10.5 using the O10 driver or PB12 using the ORA driver). I used DBTrace, and I can see that the query hangs on the first FETCH.
    So for the retrievals that hang I have something like:
    (3260008): BIND SELECT OUTPUT BUFFER (DataWindow):(DBI_SELBIND) (0.186 MS / 18978.709 MS)
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=1
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): EXECUTE:(DBI_DW_EXECUTE) (192.982 MS / 19171.691 MS)
    (3260008): FETCH NEXT:(DBI_FETCHNEXT)
    and this is the last line,
    while for retrievals that complete, each FETCH produces a time and data in the buffer, then moves on to the next FETCH until all data is retrieved.
    On a side note, I have no problems retrieving data with either SQL Developer or DbVisualizer.
    Problems started when I installed the 11.2.0 Client. Even if I want to use the 10.1.0 Client, the same problem occurs. So I guess something from 11.2.0 overrides the 10.1.0 settings.
    I will appreciate any comments/hints/help.
    Thank you very much.

    pgoel wrote:
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    Earlier (before installing the new stuff), did you ever try retrieving data from big tables (240 or more records)? If yes, was it working?
    Yes, with the Oracle 10g client (before installing 11g) I was able to retrieve any data, whether it was 10k+ records or 100 records. Installing the 11g client changed something, so that even using the old 10g client (which I still have installed) fails to work. The same problem occurs whether I'm using the 10g or 11g client now. PowerBuilder hangs on retrieving tables with more than about 240 records.
    Thanks.

  • What are FIXED SIZE and VARIABLE SIZE

    Hi, what do FIXED SIZE and VARIABLE SIZE stand for when starting an Oracle instance?
    thanks in advance

    read this
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:365088445659
    Aman....
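    In short: they are components of the SGA reported at instance startup. Fixed Size is the small internal bookkeeping portion of the SGA, while Variable Size covers areas such as the shared pool, large pool, and Java pool. You can see the same figures at any time with:
    SELECT * FROM v$sga;
    which returns one row each for Fixed Size, Variable Size, Database Buffers, and Redo Buffers.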

  • External tables - fixed-length file

    Hi All,
    I have a fixed-length file that I load daily using an external table. Recently one of the fields, IP, had its length changed, and the customer wants to send both old records with an 8-byte IP and new records with an 11-byte IP in the same data file until the migration is complete.
    Will it be possible for external tables to handle this requirement? Or is there any other way to treat it?
    The old file contains 104 fields, with the IP field at positions 490 to 498.
    The new file contains 104 fields, with the IP field at positions 490 to 501.
    Thanks,
    Sri.

    If the two record types are mixed in the same file, then you will have problems loading them. I can see two possible solutions, in no particular order of preference (using your example data):
    1. Redefine the external table something like:
    Position (record_type (1:1)
              version     (2:5)
              data        (6:41))
    then parse the remaining fields based on the version number when you select from the external table.
    2. Create two external tables over the same file, one for version 1.00 and one for version 1.01, using the LOAD WHEN clause to determine which set of data to load when you select. Something like:
    CREATE TABLE version1 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
      LOAD WHEN (version = '1.00')
    < definition for the old format >
    and
    CREATE TABLE version101 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
      LOAD WHEN (version = '1.01')
    < definition for the new format >
    Then your processing would use something like:
    SELECT ip, last_name
    FROM version1
    UNION ALL
    SELECT ip, last_name
    FROM version101
    HTH
    John
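    A slightly fuller sketch of option 1 for this particular file, hedged because the total record lengths are not given in the thread: read each line as one raw field, then pick the IP substring based on the record length (new-format records should be three bytes longer). All object names and the length threshold below are illustrative.
    CREATE TABLE daily_feed_ext (
      full_rec VARCHAR2(1000)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '~'        -- a character that never occurs in the file
        MISSING FIELD VALUES ARE NULL
        ( full_rec CHAR(1000) )
      )
      LOCATION ('daily.dat')
    );
    CREATE OR REPLACE VIEW daily_feed AS
    SELECT CASE
             WHEN LENGTH(full_rec) <= 998          -- hypothetical old-format total length
               THEN SUBSTR(full_rec, 490, 9)       -- old IP, positions 490-498
             ELSE SUBSTR(full_rec, 490, 12)        -- new IP, positions 490-501
           END AS ip,
           full_rec
    FROM daily_feed_ext;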

  • Tables for Fields Fixed Value and Variable Value

    Hi,
    Can anyone tell me in which table the actual fixed and variable costs are stored, and also the planned costs? I have checked, but it is showing RKPLN as a data structure. We are developing a report where, in the standard Cost Center Report, we need a breakup of fixed and variable costs. Please guide me as to exactly which tables these fields are stored in. It's very urgent.

    Hi,
    In regard to the tables for planned data please do a search of the forum as this question has already been asked a few times. E.g. here:
    CO PLANNING TABLES
    In regard to fixed/variable actual costs: these are not stored in separate tables. If a posting contains fixed and variable portions, for example postings resulting from splitting (KSS2), then the fixed portion is stored in the COEP-WKFBTR field (line items) and the COSS-WKF* fields (summary records). Please observe SAP note 192107 on this issue.
    Regards
    Karl
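    As a hedged illustration of the reply: WKFBTR is the fixed-portion field named above, while WKGBTR as the total-value column of COEP is my assumption, so verify the field names in your system. The variable portion of actual line items would then be derived rather than read:
    SELECT objnr,
           SUM(wkgbtr)          AS total_cost,     -- assumed total-value field
           SUM(wkfbtr)          AS fixed_cost,     -- fixed portion, per the reply
           SUM(wkgbtr - wkfbtr) AS variable_cost   -- variable portion as the difference
    FROM   coep
    GROUP  BY objnr;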

  • Comma after variable record length

    Hi guys,
    Can you tell me whether it is possible (and how my control file should look) to process a loader file that has a comma after the byte length (CSV)? I can't get it to work.
    e.g
    Example file record:-
    00047,1,MR SMITH,20 ANY STREET, ANY ADDRESS, ANY TOWN
    COLUMNS = "ID,NAME,ADDRESS1,ADDRESS2,ADDRESS3"
    So my infile reads
    infile 'example.dat' "var 5"
    but the only examples I have of using a record length are ones where there is no comma separating the record length from the first field, i.e. in this format:
    000471,MR SMITH,20 ANY STREET, ANY ADDRESS, ANY TOWN
    I can't change the input data format, so any help catering for this is greatly appreciated.
    Thanks

    Hello, I think you need to add a FILLER field, like this:
    LOAD DATA
    INFILE 'example.dat' "VAR 5"
    TRUNCATE INTO TABLE t1
    FIELDS TERMINATED BY ','
    (
       DUMMY     FILLER,          -- skips the empty field left by the comma after the length
       ID        INTEGER EXTERNAL,
       NAME      CHAR,
       ADDRESS1  CHAR,
       ADDRESS2  CHAR,
       ADDRESS3  CHAR
    )
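    For what it's worth, this works because "VAR 5" makes SQL*Loader consume the first five bytes of each record as the record length; what remains still begins with a comma, so the first, empty field is simply discarded by the FILLER column.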

  • External Table, Handling Delimited and Special Character in file

    Hi ,
    I have created an external table with these options:
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ***************************************
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        SKIP 0
        FIELDS TERMINATED BY '|'
        OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('test_feed.csv')
    )
    The problem is as follows. These come through as valid:
    anupam|anupam2
    anupam"test|anupam"test2
    "anupam|test3"|test3
    anupam""""test5|test5
    anupam"|test7
    but these do not come through as valid:
    "anupam"test4"|test4    --> the case where the field is enclosed in quotes but also contains quotes inside. I guess in this case we could send the field without the closing double quotes.
    "anupam|test6   --> when a field starts with a double quote, it fails.
    "anupam"test8|test8"|test8 --> when one field contains both a pipe (|) and double quotes, we send it enclosed in double quotes, but that fails the job.
    Can you suggest the best way to handle these scenarios? (One restriction, though: the file is also used by another system, Netezza, which can't take a delimiter longer than one character :'( )

    One approach is to define the external table as a ONE-column table (with a single field covering the whole line). That way each line comes in as one row in the external table. Of course, you then have to build "parsing logic" on top of that.
    DROP TABLE xtern_table;
    CREATE TABLE xtern_table (
      c1 VARCHAR2(4000)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY xtern_data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '~'   ---- <<<<<<<< use a field terminator that is not found in the file
        MISSING FIELD VALUES ARE NULL
        ( c1 CHAR(4000) )
      )
      LOCATION ('mycsv.csv')
    );
    > desc xtern_table
    Name  Null  Type
    C1          VARCHAR2(4000)
    > column c1 format A40
    > select * from xtern_table;
    C1                                    
    anupam|anupam2                          
    anupam"test|anupam"test2                
    "anupam|test3"|test3                    
    anupam""""test5|test5                   
    anupam"|test7                           
    "anupam"test4"|test4                    
    "anupam|test6                           
    "anupam"test8|test8"|test8              
    8 rows selected
    Ideally, it would be good to have an incoming source file with a predictable format.
    Hope this helps.
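    As a hedged sketch of that "parsing logic" for the sample rows above (valid only while the second field never contains a pipe, since it splits on the last '|' in the line):
    SELECT SUBSTR(c1, 1, INSTR(c1, '|', -1) - 1) AS field1,
           SUBSTR(c1, INSTR(c1, '|', -1) + 1)    AS field2
    FROM   xtern_table;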

  • How to create variable record length target file in SAP BODS

    Hi All
    I have a requirement to create a target file that will have various record layouts, meaning different record lengths (similar to a COBOL file format), but on the target side. Please let me know the best practice and how to solve this requirement.
    Thanks
    Ash

    Hi Shiva,
    Thanks for your feedback. My issue is that I have 10 different detail record types (each record type is fixed length).
    For each customer account, I have to write the header record to the file, then the detail records in exact order, then continue with the next account and so on, and finally write the trailer record. I have given a sample layout below. The highlighted text is the record identifier in this example, while the underlined parts are account numbers. Fields are fixed length, right-padded with spaces or zeros.
    220700000000SA00    Wednesday     2014-12-12  ASA00034 334 000   ---> (this is header)
    220700000010SA10 AAb   00000+000000+ Akab xxxx bb   0000000000943 3433 --> (detail rec)
    220700000010SA14 AAA  00034354 DDD 000000000+    --> (detail rec)
    220700000010SA15 888e a88 00000000+            --> (detail rec)
    . . . . . remaining detail records
    220700000012SA10 AAb   00000+000000+ Akab xxxx bb   0000000000943 3433 --> (detail rec)
    220700000012SA14 AAA  00034354 DDD 000000000+   --> (detail rec)
    220700000012SA15 888e a88 00000000+           -->  (detail rec)
    . . . . . remaining detail records
    220700000000SA99    Wednesday     2014-12-12  d334 000   --> (this is the trailer)

  • Export file - fixed columns and remove dimensions

    Hello Experts
    I want to use the standard export package and get the dimensions fixed in specific columns, and also remove some dimensions.
    The problem is that I always get the dimensions in random columns, and when I am able to remove dimensions, they are removed randomly. Please see the *MAPPING section and the result below. Does anyone know how to do this, or have an example? I have used the standard example files, but they have not helped.
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER=
    VALIDATERECORDS=NO
    ROUNDAMOUNT = 7
    OUTPUTHEADER=
    OUTPUTDELIMITER=
    SPECIFICMAPPING=YES
    *MAPPING
    ENTITY=*COL(1)
    TIME=*COL(2)
    ACCOUNT=*COL(3)
    RPTCURRENCY=*COL(4)
    AMOUNT=*COL(5)
    ACCOUNT,ENTITY,RPTCURRENCY,TIME,AMOUNT
    NON_FLOW,ADT5_E,ACTUAL,ANA_TONS,TOTALADJ
    NON_FLOW,568U_E,ACTUAL,ANA_TONS,TOTALADJ
    Best regards
    Jonas

    Given the nature of OLAP and FACT tables, I do not believe you can disassociate a dimension from the export process. So I don't think you can choose which dimensions to export; moreover, the order in which they are written to the file may simply be alphabetical. I would export the complete details and then manipulate them during an import process. The only other alternative I can think of is to write a custom SSIS SQL script to allow for FACT member aggregation if you choose to remove a dimension.
    But I would need to test further. Hope this helps.

  • Data Table Fixed rows and Scroll Bar

    Hi
    I have a data table which displays data dynamically, but when there is no data the table shrinks and moves the other elements of the page. Is there any way to keep a fixed number of rows irrespective of data presence?
    For example, I need to display only 10 rows at a time; if there are 7 rows in the DB, I have to display those 7 rows of data and the rest as blanks.
    And if my data grows beyond 10 rows, how can I enable a scroll bar automatically?
    Thanks in advance.

    Then add empty row objects to the collection or data model that is passed as the data table's value.

  • How to group by one column and concatenate records from another column

    hi,
    I want to fill a list item in Forms 6i with one cursor. I need a cursor / SQL function that fetches the data from the following table structure. It looks like this:
    col1 col2
    group1 01
    group1 02
    group2 03
    group3 04
    group2 05
    group1 06
    the resultset should look like this:
    label value
    group1 01.02.06
    group2 03.05
    group3 04
    any ideas ?
    thanks in advance.
    greets Kevin

    Try this (assuming table X2 with columns a and b):
    SELECT a
         , LTRIM(MAX(SYS_CONNECT_BY_PATH(b, '.'))
             KEEP (DENSE_RANK LAST ORDER BY curr), '.') AS concatenated
    FROM ( SELECT a
                , b
                , ROW_NUMBER() OVER (PARTITION BY a ORDER BY b)     AS curr
                , ROW_NUMBER() OVER (PARTITION BY a ORDER BY b) - 1 AS prev
           FROM x2 )
    GROUP BY a
    CONNECT BY prev = PRIOR curr AND PRIOR a = a
    START WITH curr = 1;
    Please confirm if it works.
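    As a side note, on Oracle 11g Release 2 or later, LISTAGG gives the same result more directly:
    SELECT a,
           LISTAGG(b, '.') WITHIN GROUP (ORDER BY b) AS concatenated
    FROM   x2
    GROUP  BY a;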

  • Accessing The External Table using UIAPI and DIAPI

    Hi  Every One,
    Can anyone tell me whether we can access the data of an external table (one created externally with an SQL query) using the UI API and DI API?
    Regards
    Srinivas

    Hi,
    Technically it is possible, but all inserts and updates have to go through queries.
    Note that when you create tables from the SDK, you can set more information about fields than in the SQL query manager (for example, the subtype) - so you may have problems binding the data to a matrix, for example. So if you have an external table you may work with it, but this isn't allowed by SAP (and I think this case may be dangerous if you access it from the SDK). If you need to access it on the SQL Server side (own numbering, some stored procedure, ...), it will work more safely, but it is not allowed either.
    Petr

  • I want to extend the length of column "lei_information3" in table "hr_location_extra_info_lei" from 16 to 30 - can I do this, and how?

    I am trying to extend the field length for column lei_information3 from 16 characters to 30 characters in the table hr_location_extra_info_lei, but cannot seem to do this. Can the length be changed, and if so, how?
    Many thanks
    Louise

    Hi Louise,
    All the "lei_informationX" columns in   hr_location_extra_info  table are of size VARCHAR2(150) (Oracle EBS R12).
    So I understand why you're saying it's limited to 16 characters.
    May it's only a question of your setup limiting it to 16 car., in which case you can easily change it to 30 by unfreezing the DFF and modifying it.
    Regards,
    Rajen
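    If you want to confirm the physical column size yourself, a quick dictionary check (assuming the table name from the reply) is:
    SELECT column_name, data_type, data_length
    FROM   all_tab_columns
    WHERE  table_name  = 'HR_LOCATION_EXTRA_INFO'
    AND    column_name LIKE 'LEI_INFORMATION%';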

  • Fixed Column and Scrollable JTable problems

    I've implemented a table that has the first column fixed, while the 2nd through last columns are scrollable.
    Following the fixed-model examples, I am able to get this working, but I'm not able to update it. The examples, though, subclass AbstractTableModel and use anonymous classes to define it; mine uses a class that subclasses DefaultTableModel, but DefaultTableModel subclasses AbstractTableModel, so I think it should be OK.
    I have 2 table models, one for the fixed part and one for the scrollable part; they use the same data/column arrays. Then I create 2 tables with these models.
    Initially I have no data (Object[][] data = new Object[0][0]) and a bunch of column names.
    It works fine, but I have trouble updating it with real data.
    I use fixedModel.setDataVector(...) and scrollModel.setDataVector(...) to update, but it doesn't work. I don't get any exceptions or anything - just nothing comes up. Revalidating the tables doesn't work either.
    If I choose not to use the fixed/scrollable tables and just use one normal table with setDataVector, it works fine.
    Any ideas?
    Here's the code:
    fixedModel = new SortableTableModel() {
        /* for sorting to work correctly */
        public Class getColumnClass(int col) {
            return getValueAt(0, col).getClass();
        }
        public boolean isCellEditable(int row, int col) {
            return false;
        }
        // new methods
        public int getColumnCount() {
            return 1;
        }
        public int getRowCount() {
            return data.length;
        }
        public String getColumnName(int col) {
            return (String) columnNames[col];
        }
        public Object getValueAt(int row, int col) {
            return data[row][0];  // the fixed table always shows column 0
        }
    };
    fixedModel.setDataVector(data, columnNames);
    scrollModel = new SortableTableModel() {
        public Class getColumnClass(int col) {
            return getValueAt(0, col).getClass();
        }
        public boolean isCellEditable(int row, int col) {  // was "CellEditable" in the post, which never overrides anything
            return false;
        }
        // new methods
        public int getColumnCount() {
            return columnNames.length - 1;
        }
        public int getRowCount() {
            return data.length;
        }
        public String getColumnName(int col) {
            return (String) columnNames[col + 1];  // shift past the fixed column
        }
        public Object getValueAt(int row, int col) {
            return data[row][col + 1];
        }
        public void setValueAt(Object obj, int row, int col) {
            data[row][col + 1] = obj;
        }
    };
    scrollModel.setDataVector(data, columnNames);
    ListSelectionModel lsm = fixedTable.getSelectionModel();
    lsm.addListSelectionListener(new SelectionListener(true));
    lsm = scrollTable.getSelectionModel();
    lsm.addListSelectionListener(new SelectionListener(false));
    fixedTable.setSelectionMode(ListSelectionModel.SINGLE_SELECTION);
    scrollTable.setSelectionMode(ListSelectionModel.SINGLE_SELECTION);
    JViewport viewport = new JViewport();
    viewport.setView(fixedTable);
    viewport.setPreferredSize(fixedTable.getPreferredSize());
    I really need to get this fixed - thanks for the help!

    I fixed the problem a couple of days ago.
    My anonymous classes for the fixed and scrollable table models define the data in terms of a 2D Object array, not a vector of vectors, which is what I was using when setting the data :P
    Oops...
    Thanks for the help, though.
