Using a file as a table

Hi,
I know it is possible to use a text file as a table in 9i, but I don't know how to do it.
Does anyone know how?
Thanks,
Ed.

Here's an example
##emp.txt##
1,JOE,MORGAN,55
2,LINDA,ROBBINS,19
3,MARTIN,DEMMS,43
create directory mydir as 'e:\oracle\admin\rman2\temp';
create table ext_emp
(empno number,
first varchar2(10),
last varchar2(10),
age number)
organization external
(type oracle_loader
default directory mydir
access parameters
(records delimited by newline
fields terminated by ',')
location ('emp.txt'))
reject limit unlimited
/
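Once created, the flat file can be queried like any ordinary table, for example:
select * from ext_emp;
select last, age from ext_emp where age > 40;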

Similar Messages

  • Importing data- using xml file into HANA Table

Hi,
We are using HDB Studio, revision 60.
Is it possible to import data from an XML file into HANA tables (master and fact tables), without using any intermediate adapters to convert the data?
Can anyone suggest an approach?
Thank you.

    Hi user450616
    I am a bit confused about what you are trying to achieve.
    Are you:
    1. importing a CSV file into APEX
    2. adding an extra column to the Oracle Table
    3. populating the extra column with the CSV filename?
    Let us know if this is what you are trying to do.
    Cheers,
    Patrick Cimolini

  • How can i store a picture file in a table using sql

Can anyone help me store a picture file in a table using SQL?

You can find an example at this link:
    http://www.orafaq.com/forum/t/38347/0/
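In outline, one way is to load the image into a BLOB column from a directory object using DBMS_LOB; a minimal sketch (the directory, file and table names here are assumptions for illustration):
-- assumed setup:
-- create directory img_dir as 'e:\images';
-- create table pics (id number, img blob);
DECLARE
  v_bfile BFILE := BFILENAME('IMG_DIR', 'photo.jpg');
  v_blob  BLOB;
BEGIN
  -- create the row with an empty LOB locator, then fill it from the file
  INSERT INTO pics (id, img) VALUES (1, EMPTY_BLOB())
  RETURNING img INTO v_blob;
  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile));
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END;
/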

  • Loading the data from a text file to a table using pl/sql

    Hi Experts,
I want to load the data from a text file (sample1.txt) into a table using PL/SQL.
I have used the below PL/SQL code:
    declare
    f utl_file.file_type;
    s varchar2(200);
    c number := 0;
    begin
    f := utl_file.fopen('TRY','sample1.txt','R');
    loop
    utl_file.get_line(f,s);
    insert into sampletable (a,b,c) values (s,s,s);
    c := c + 1;
    end loop;
    exception
    when NO_DATA_FOUND then
    utl_file.fclose(f);
    dbms_output.put_line('No. of rows inserted : ' || c);
    end;
My sample1.txt file looks like:
    1
    2
    3
The data is getting inserted in the following manner:
    select * from sampletable;
    A     B     C
    1     1     1
    2     2     2
    3     3     3
    I want the data to get inserted as
    A     B     C
    1     2     3
The text file has three lines, and each line's value should go into a separate column.
Please help.
    Thanks

    declare
    f utl_file.file_type;
    s1 varchar2(200);
    s2 varchar2(200);
    s3 varchar2(200);
    c number := 0;
    begin
    f := utl_file.fopen('TRY','sample1.txt','R');
    utl_file.get_line(f,s1);
    utl_file.get_line(f,s2);
    utl_file.get_line(f,s3);
    insert into sampletable (a,b,c) values (s1,s2,s3);
    c := c + 1;
    utl_file.fclose(f);
    exception
    when NO_DATA_FOUND then
if utl_file.is_open(f) then utl_file.fclose(f); end if;
    dbms_output.put_line('No. of rows inserted : ' || c);
end;
SY.
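If the file may contain more than one group of three lines, the same idea works inside a loop; a sketch using the same directory, file and table names as above:
declare
  f  utl_file.file_type;
  s1 varchar2(200);
  s2 varchar2(200);
  s3 varchar2(200);
  c  number := 0;
begin
  f := utl_file.fopen('TRY','sample1.txt','R');
  loop
    utl_file.get_line(f,s1); -- NO_DATA_FOUND here ends the loop
    utl_file.get_line(f,s2);
    utl_file.get_line(f,s3);
    insert into sampletable (a,b,c) values (s1,s2,s3);
    c := c + 1;
  end loop;
exception
  when NO_DATA_FOUND then
    if utl_file.is_open(f) then utl_file.fclose(f); end if;
    dbms_output.put_line('No. of rows inserted : ' || c);
end;
/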

  • Data not viewable in table using Control file  CharacterSet AL32UTF8

I have a flat file which contains Chinese and English characters.
I have to create a control file to insert the data in this flat file into a table.
The character set I am using in my control file is AL32UTF8.
When I use this character set the data gets loaded into the table, but I am not able to view the Chinese characters; I see upside-down question mark symbols instead.
Please help me understand how to view the Chinese characters in the table.
Is there another character set I have to set in the control file?

    NLS_LANG is an environment variable. I'm assuming you're on Windows, so it's probably something like
    Control Panel | System | Advanced | Environment Variables
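For example, for a Unicode client you might set something like the following (the exact value depends on your language and territory settings, so treat this one as an assumption):
NLS_LANG=AMERICAN_AMERICA.AL32UTF8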
Given that you're using Toad, though, there may be a place in Toad where you set the character set. There is a discussion on Eddie Awad's blog about how various tools can be made to display Unicode data:
    http://awads.net/wp/2006/07/06/sql-developer-and-utf8/
    Some of the comments discuss Toad, though that's not a tool I'm familiar with.
    If you happen to be able to use iSQL*Plus, since that's browser based, it supports unicode natively. That's often the easiest solution.
    Justin

  • Creation of External table by using XML files.

I am in the process of loading XML file data into a database table, and I want to use the external table feature for the load. The external table feature works with plain text files; is there any way to load XML file data into a database table using an external table?
I am using Oracle 9i.
I appreciate your responses.
    Regards
    Edited by: user652422 on Dec 16, 2008 11:00 PM

Hi,
The XML file which you posted is working fine, and that proves that an external table can be created using XML files.
Now my problem is that my XML files are not like book.xml; my XML file has a somewhat different format. Below is an extract of the file:
    <?xml version="1.0" encoding="UTF-8" ?>
<PM-History deviceIP="172.20.7.50">
<Error Reason="" />
<Interface IntfName="otu2-1-10B-3">
<TS Type="15-MIN">
<Error Reason="" />
<PM-counters TimeStamp="02/13/2008:12:15">
    <Item Name="BBE-S" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="BBE-SFE" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="ES-S" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="ES-SFE" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="SES-S" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="SES-SFE" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="CSES-S" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="CSES-SFE" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="UAS-S" Direction="Received" Validity="ADJ" Value="135" />
    <Item Name="UAS-SFE" Direction="Received" Validity="ADJ" Value="0" />
    <Item Name="SEF-S" Direction="Received" Validity="ADJ" Value="135" />
    </PM-counters>
    <PM-counters TimeStamp="03/26/2008:12:30">
    <Item Name="BBE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="BBE-FE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="ES" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="ES-FE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="SES" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="SES-FE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="CSES" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="CSES-FE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="UAS" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="UAS-FE" Direction="Received" Validity="OFF" Value="0" />
    <Item Name="PSC" Direction="Received" Validity="OFF" Value="0" />
    </PM-counters>
    </TS>
    </Interface>
    </PM-History>
My problem is that the Item Name and Direction values combined (e.g. PSCReceived or UASReceived) will be treated as the column names of the table, and '0' would be the value of that column. I am confused about how to create the external table for this.
    I would really appreciate your responses.
    Regards
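For reference, on 10g and later this attribute-centric layout can also be shredded with XMLTABLE instead of an external table; a sketch (the directory object, file name and character set are assumptions):
SELECT x.item_name || x.direction AS col_name,
       x.item_value
FROM XMLTABLE('/PM-History/Interface/TS/PM-counters/Item'
       PASSING XMLTYPE(BFILENAME('MYDIR','pm_history.xml'),
                       NLS_CHARSET_ID('AL32UTF8'))
       COLUMNS item_name  VARCHAR2(30) PATH '@Name',
               direction  VARCHAR2(30) PATH '@Direction',
               item_value NUMBER       PATH '@Value') x;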

  • Help with uploading an excel file to a table using an application

    Hello,
Can anyone please help me out with this issue? I have an APEX application where end users upload an Excel file into a table. For this I have followed the solution provided in this link:
    http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/
Using the above solution, I was able to upload the Excel data to a table "sample_tbl1" successfully, with fields id, acct_no, owner_name, process_dt. But I want to accommodate a particular condition while uploading the file data: to check whether the acct_no already exists in another table, say "sample_tbl2". If the acct_no already exists in sample_tbl2, then raise an error displaying the list of account numbers that already exist in the database. Below is the code I am using to upload the file data to a table.
    DECLARE
    v_blob_data       BLOB; 
    v_blob_len        NUMBER; 
    v_position        NUMBER; 
    v_raw_chunk       RAW(10000); 
    v_char            CHAR(1); 
    c_chunk_len       number       := 1; 
    v_line            VARCHAR2 (32767)        := NULL; 
    v_data_array      wwv_flow_global.vc_arr2; 
    v_rows            number; 
    v_sr_no           number         := 1; 
    l_cnt             BINARY_INTEGER := 0;
    l_stepid          NUMBER := 10;
    BEGIN
-- Read data from wwv_flow_files
    select blob_content into v_blob_data 
    from wwv_flow_files 
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER) 
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER); 
    v_blob_len := dbms_lob.getlength(v_blob_data); 
    v_position := 1; 
-- Evaluate and skip first line of data
    WHILE (v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char :=  chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Clear out 
    v_line := NULL;
    EXIT;
    END IF;
    END LOOP;
-- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP 
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position); 
    v_char :=  chr(hex_to_decimal(rawtohex(v_raw_chunk))); 
    v_line := v_line || v_char; 
    v_position := v_position + c_chunk_len; 
-- When a whole line is retrieved
IF v_char = CHR(10) THEN
-- Convert comma to : to use wwv_flow_utilities
v_line := REPLACE (v_line, ',', ':');
-- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line); 
    -- Insert data into target table
    EXECUTE IMMEDIATE 'insert into sample_tbl1(ID,ACCT_NO,OWNER_NAME,PROCESS_DT) 
    values (:1,:2,:3,:4)'
    USING 
    v_sr_no, 
    v_data_array(1), 
    v_data_array(2),
    to_date(v_data_array(3),'MM/DD/YYYY');
    -- Clear out 
    v_line := NULL; 
    v_sr_no := v_sr_no + 1; 
    l_cnt := l_cnt + SQL%ROWCOUNT;
    END IF; 
    END LOOP;
    delete from wwv_flow_files
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);
    l_stepid  := 20;
    IF l_cnt = 0 THEN
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold">Please select a file to upload.</span></p>' ;
    ELSE
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold;color:green">File uploaded and processed ' || l_cnt || ' record(s) successfully.</span></p>';
    END IF;
    l_stepid  := 30;
    EXCEPTION WHEN OTHERS THEN
    ROLLBACK;
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold;color:red">Failed to upload the file. '||REGEXP_REPLACE(SQLERRM,'[('')(<)(>)(,)(;)(:)(")('')]{1,}', '') ||'</span></p>';
    END;
    Can anyone please help me, how do i accomdate the condition within my existing code.
    thanks,
Orton

    Orton,
From your code it appears that the account number comes from the first field of each file line => v_data_array(1).
So you can put a conditional block around the EXECUTE IMMEDIATE code that inserts the records.
    For instance
  SELECT count(1) INTO ln_account_no_exists FROM sample_tbl2 WHERE acct_no = v_data_array(1);
  IF ( ln_account_no_exists > 0 ) THEN
    -- Account No: already exists
    <Do what you want to do here>
  ELSE
    EXECUTE IMMEDIATE ...
  END IF;
In order to handle the account numbers that already exist you can:
  • Raise an exception
  • Write the record to a table (or insert into a collection) and then use a report region in the page based on this table/collection to show the error records
  • Append the errored account numbers to the success message variable programmatically (this variable is used by the PL/SQL process success/error message), for example:
        IF ( record exists)
          apex_application.g_print_success_message := apex_application.g_print_success_message||','|| v_data_array(1) ; -- Comma separated list of errored account no:s
        ELSE ...
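Put together inside the existing WHILE loop, the conditional insert might look like this sketch (ln_account_no_exists must be declared as NUMBER in the DECLARE section; the sample_tbl2 column name acct_no is an assumption):
-- replaces the bare EXECUTE IMMEDIATE in the upload code
SELECT count(1) INTO ln_account_no_exists
FROM sample_tbl2
WHERE acct_no = v_data_array(1);
IF ln_account_no_exists > 0 THEN
  -- duplicate: append the account number to the message shown to the user
  apex_application.g_print_success_message :=
    apex_application.g_print_success_message || ',' || v_data_array(1);
ELSE
  EXECUTE IMMEDIATE 'insert into sample_tbl1(ID,ACCT_NO,OWNER_NAME,PROCESS_DT)
  values (:1,:2,:3,:4)'
  USING v_sr_no, v_data_array(1), v_data_array(2),
        to_date(v_data_array(3),'MM/DD/YYYY');
END IF;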
Hope it helps.

  • Exporting data from text file to a table using utl_file

    Dear all,
I have a text file as below, and I have a table with 12 columns. Now I need to insert this text file into the table story_books.
CREATE TABLE story_books
(
book_id NUMBER,
Category VARCHAR2(100 BYTE),
Book_type VARCHAR2(100 BYTE),
Name VARCHAR2(700 BYTE),
Location VARCHAR2(700 BYTE),
Ownership_code VARCHAR2(700 BYTE),
Author VARCHAR2(700 BYTE),
Less_Sel_fact VARCHAR2(700 BYTE),
Reason VARCHAR2(700 BYTE),
Buying VARCHAR2(700 BYTE),
Suspected_Book VARCHAR2(700 BYTE),
Conditions VARCHAR2(700 BYTE)
);
    -------------------------text file---------------
    Books Out Table: Books
    Book. Type          Name          Location               Ownership Code
    Story               SL          hyd               SS-HYD
    Known Author:     Unknown               
    Less Selling Factors: Thunderstorms     
    Reason:     Unknown               
    Buying (if applicable):
    Not Applicable
    Suspected Book:
    Unknown
    Conditions to increace sales:
    Advertisement in all areas
I was able to read the data and store it when everything is on the same line, but I don't know how to read the data below:
    Book. Type          Name          Location               Ownership Code
    Story               SL          hyd               SS-HYD
In this data I have to search for 'Book. Type' and then save the word 'Story' into the column Book_type.
Then I need to search for 'Name' and save 'SL' into the column Name.
Then I need to search for 'Location' and save 'hyd' into the column Location.
I was able to extract the data when it is in the below format, using utl_file.get_line:
    Known Author:     Unknown               
    Less Selling Factors: Thunderstorms     
    Reason:     Unknown     
Can anyone explain how to solve the above?
    Thanks in advance.

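One way to pick the fields out of the two-line header/value layout is positional tokenizing with REGEXP_SUBSTR; a sketch, assuming the values are separated by runs of whitespace:
DECLARE
  v_line      VARCHAR2(200) := 'Story               SL          hyd               SS-HYD';
  v_book_type VARCHAR2(100);
  v_name      VARCHAR2(100);
  v_location  VARCHAR2(100);
  v_owner     VARCHAR2(100);
BEGIN
  -- grab the Nth run of non-whitespace characters
  v_book_type := REGEXP_SUBSTR(v_line, '[^[:space:]]+', 1, 1);
  v_name      := REGEXP_SUBSTR(v_line, '[^[:space:]]+', 1, 2);
  v_location  := REGEXP_SUBSTR(v_line, '[^[:space:]]+', 1, 3);
  v_owner     := REGEXP_SUBSTR(v_line, '[^[:space:]]+', 1, 4);
  DBMS_OUTPUT.PUT_LINE(v_book_type || '/' || v_name || '/' || v_location || '/' || v_owner);
END;
/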

  • Help Required:How Upload Excel file Into Oracle Table Using PLSQL Procedure

Please help; urgent help needed.
The requirement is to upload an Excel file into an Oracle table using a PL/SQL procedure/package.
The cases are:
1. The Excel file is on the user's/client's PC.
2. The application is on a remote server (Oracle Forms D2K).
3. The user accesses the application through a Terminal Server login.
4. So if the user calls D2K's GET_FILE_NAME() function to get the Excel file, D2K will try to pick the file from that remote server (because the user logged in through the Terminal Server option).
5. We cannot use the UTL_FILE package or an Oracle directory to place that file on the server.
6. We are using Oracle 8.7.
So I need some PL/SQL package or function/procedure to upload an Excel file on the user's PC to an Oracle table.
Please guide me with some code, a PL/SQL package, a hint, or a link.
Just help to sort this issue.
    you can also write me on :
    [email protected], [email protected]

    TEXT_IO is a PL/SQL package available only in Forms (you'll want to post in the Forms forum for more information). It is not available in a stored procedure in the database (where the equivalent package is UTL_FILE).
    If the Terminal Server machine and the database machine do not have access to the file system on the client machine, no application running on either machine will have access to the file. Barring exceptional setups (like the FTP server on the client machine), your applications are not going to have more access to the client machine than the operating system does.
    If you map the client drives from the Terminal Server box, there is the potential for your Forms application to access those files. If you want the files to be accessible to a stored procedure in the database, you'll need to move the files somewhere the database can access them.
    Justin

  • Import data from text file to a table using t-sql

    Hi,
I am trying to import data from a text file to a table using the query below, but it returns an error.
    SELECT *, 2 FROM OPENROWSET(BULK 'W:\file.txt',
    FORMATFILE = 'W:\format_file.xml', FIRSTROW = 1, LASTROW = 1) as f
    The error is:
    Bulk load data conversion error (truncation) for row 1, column 1 (COPYRIGHT_DETAIL_CODE).

    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
     <RECORD>
      <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="2" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="3" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="4" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="5" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="6" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="1" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="7" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="8" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="10" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="9" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="10" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="11" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="12" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="5" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="13" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="8" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="14" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="2" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="15" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="3" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="16" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="4" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="17" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="43" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
     </RECORD>
     <ROW>
      <COLUMN SOURCE="1" NAME="Name1" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="2" NAME="Name2" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="3" NAME="Name3" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="4" NAME="Name4" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="5" NAME="Name5" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="6" NAME="Name6" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="7" NAME="Name7" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="8" NAME="Name8" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="9" NAME="Name9" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="10" NAME="Name10" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="11" NAME="Name11" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="12" NAME="Name12" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="13" NAME="Name13" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="14" NAME="Name14" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="15" NAME="Name15" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="16" NAME="Name16" xsi:type="SQLVARYCHAR"/>
      <COLUMN SOURCE="17" NAME="Name17" xsi:type="SQLVARYCHAR"/>
     </ROW>
    </BCPFORMAT>
The format file I am using is just like the one above.
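Bulk-load truncation errors like this usually mean a field in the data file is longer than the MAX_LENGTH declared for it in the format file, or a terminator does not match the actual delimiter, so a field runs long. One quick way to check is to look at the raw text of the file from T-SQL; a sketch:
-- read the whole file as a single value and inspect the first bytes
SELECT SUBSTRING(t.BulkColumn, 1, 200) AS first_200_chars
FROM OPENROWSET(BULK 'W:\file.txt', SINGLE_CLOB) AS t;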

  • Read text file insert into table using utl_file

Hi
I have a script that reads the file and inserts into a table, but I want error records loaded into an error table. My script is below; please help me fix it so errors are logged.
Script:
    DECLARE
    v_line VARCHAR2(2000);
    v_file utl_file.file_type;
    v_dir VARCHAR2(250);
    v_filename VARCHAR2(50);
    BEGIN
    v_dir :='MID5010_DOC1TP';
    v_filename := 'OPT_CM_BASE.txt';
    v_file := utl_file.fopen(v_dir, v_filename, 'r');
    LOOP
    BEGIN
    utl_file.get_line(v_file, v_line);
    EXCEPTION
    WHEN no_data_found THEN
    EXIT;
    END ;
    v_line := REPLACE(v_line,'|','|~');
    INSERT
    INTO optum_icd10cm_base VALUES
    ( REPLACE(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,1),'a~','a'),'.'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,2),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,3),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,4),'a~','a'),
    TRANSLATE(regexp_substr(v_line,'[^|~]+',1,5),'a~','a'),
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,6)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,6),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,6),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,7)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,7),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,7),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,8)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,8),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,8),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,9)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,9),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,9),'a~','a'),'mm-dd-yyyy')
    END,
    CASE
    WHEN LENGTH(regexp_substr(v_line,'[^|~]+',1,10)) < 10
    THEN to_date(ltrim(TRANSLATE(regexp_substr(v_line,'[^|~]+',1,10),'a~','a'),'0'),'mm-yyyy')
    ELSE to_date(TRANSLATE(regexp_substr(v_line,'[^|]+',1,10),'a~','a'),'mm-dd-yyyy')
    END,
TRANSLATE(regexp_substr(v_line,'[^|~]+',1,11),'a~','a')
);
--commit;
    END LOOP;
    utl_file.fclose(v_file);
    END;
    text file
    A50.0||Short|Long|Full|01-01-2009|01-2009||01-01-2013|09-18-2012|C|
    A50.1||Short|Long|Full|01-01-2009|01-01-2009||001-2013|09-18-2012|C|
    A50.2||Short|Long|Full|01-01-2009|01-01-2009|67|01-01-2013|09-18-2012|C|
    A50.3||Short|Long|Full|011-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.4||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|5|
    A50.5|R|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.6||Short|Long||01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A50.7||Short||Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    2345||Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A60.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A70.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    B222|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.8|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A4.9|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.2|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.3|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    D642|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.5|D|Short|Long|Full|01-01-2009|01-01-2009|01-10-2013|01-01-2013|09-18-2012|C|
    A5.6|D|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.7|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A001|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A009|C|Short Updated|Long Updated|Full Updated|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A5.10|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A0109|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.0|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.1|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.2|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.3|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.4|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.5|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.6|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    F10.7|N|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A30|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A316|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
    A317|C|Short|Long|Full|01-01-2009|01-01-2009||01-01-2013|09-18-2012|C|
-- In short: read the text file, insert into the table, and load error records into an error table.
Please help me.

Hi
I prepared a script using utl_file, but I get an error: ORA-01861 "literal does not match format string".
Script:
    DECLARE
    f utl_file.file_type;
    s VARCHAR2(32000);
    f1 VARCHAR2(100);
    f2 varchar2(100);
    F3 VARCHAR2(100);
    F4 VARCHAR2(100);
    F5 VARCHAR2(100);
    F6 DATE;
    F7 DATE;
    F8 DATE;
    F9 DATE;
    F10 DATE;
    f11 CHAR(1);
    BEGIN
    --DBMS_OUTPUT.ENABLE(100000);
    f := utl_file.fopen('MID5010_DOC1TP', 'OPT_CM_BASE.txt', 'R');
    LOOP
    BEGIN
    UTL_FILE.GET_LINE(f, s);
    f1 := REGEXP_SUBSTR (s,'[^|]+',1,1);
    f2 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,2);
    F3 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,3);
    F4 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,4);
    F5 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,5);
    F6 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,6),'mm-dd-yyyy');
    F8 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,8),'mm-dd-yyyy');
    F7 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,7),'mm-dd-yyyy');
    F9 := to_date(REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+',1,9),'mm-dd-yyyy');
    F10 :=to_date(REGEXP_SUBSTR (REPLACE(s,'||','||') ,'[^|]+',1,10),'mm-dd-yyyy');
    f11 := REGEXP_SUBSTR (REPLACE(s,'||','||'),'[^|]+', 1,11);
    INSERT
    INTO OPTUM_ICD10CM_BASE
    ( CODE,
    STATUS,
    SHORT_DESCRIPTION,
    LONG_DESCRIPTION,
    FULL_DESCRIPTION,
    CODE_EFFECTIVE_DATE,
    CHANGE_EFFECTIVE_DATE,
    TERMINATION_DATE,
    RELEASE_DATE,
    CREATION_DATE,
VALIDITY
)
VALUES
(
F1,
F2,
F3,
F4,
F5,
F6,
F7,
F8,
F9,
F10,
f11
);
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    EXIT;
    END;
    END LOOP;
    UTL_FILE.FCLOSE(F);
    END;
Please help me (in my org, utl_file is the only approach that meets our standards).
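A common pattern for the error-table requirement is a per-line inner block that catches the conversion error and writes the raw line away; a sketch (the error table optum_icd10cm_base_err and its columns are assumptions, and the parsing is abbreviated to one column):
-- assumed: create table optum_icd10cm_base_err
--          (raw_line varchar2(4000), err_msg varchar2(4000), load_ts date);
DECLARE
  v_file utl_file.file_type;
  v_line VARCHAR2(4000);
BEGIN
  v_file := utl_file.fopen('MID5010_DOC1TP', 'OPT_CM_BASE.txt', 'r');
  LOOP
    BEGIN
      utl_file.get_line(v_file, v_line);
    EXCEPTION
      WHEN no_data_found THEN EXIT;
    END;
    BEGIN
      -- parse and insert exactly as in the scripts above (shortened here)
      INSERT INTO optum_icd10cm_base (code)
      VALUES (regexp_substr(v_line, '[^|]+', 1, 1));
    EXCEPTION
      WHEN OTHERS THEN
        -- bad row: keep the raw line and the Oracle error for later review
        INSERT INTO optum_icd10cm_base_err (raw_line, err_msg, load_ts)
        VALUES (v_line, substr(sqlerrm, 1, 4000), sysdate);
    END;
  END LOOP;
  utl_file.fclose(v_file);
  COMMIT;
END;
/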

  • Sliding Window Table Partitioning Problems with RANGE RIGHT, SPLIT, MERGE using Multiple File Groups

There is misleading information in two system views (sys.data_spaces & sys.destination_data_spaces) about the physical location of data after a partitioning MERGE and before an INDEX REBUILD operation on a partitioned table. In SQL Server 2012 SP1 CU6, the script below (SQLCMD mode; set the DataDrive & LogDrive variables for the runtime environment) will create a test database with file groups and files to support a partitioned table. The partition function and scheme spread the test data across 4 file groups; an empty partition, file group and file are maintained at the start and end of the range. A problem occurs after the SWITCH and MERGE RANGE operations: the views sys.data_spaces & sys.destination_data_spaces show the logical, not the physical, location of data.
    --=================================================================================
    -- PartitionLabSetup_RangeRight.sql
    -- 001. Create test database
    -- 002. Add file groups and files
    -- 003. Create partition function and schema
    -- 004. Create and populate a test table
    --=================================================================================
    USE [master]
    GO
    -- 001 - Create Test Database
    :SETVAR DataDrive "D:\SQL\Data\"
    :SETVAR LogDrive "D:\SQL\Logs\"
    :SETVAR DatabaseName "workspace"
    :SETVAR TableName "TestTable"
    -- Drop if exists and create Database
    IF DATABASEPROPERTYEX(N'$(databasename)','Status') IS NOT NULL
    BEGIN
    ALTER DATABASE $(DatabaseName) SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    DROP DATABASE $(DatabaseName)
    END
    CREATE DATABASE $(DatabaseName)
    ON
    ( NAME = $(DatabaseName)_data,
    FILENAME = N'$(DataDrive)$(DatabaseName)_data.mdf',
    SIZE = 10,
    MAXSIZE = 500,
    FILEGROWTH = 5 )
    LOG ON
    ( NAME = $(DatabaseName)_log,
    FILENAME = N'$(LogDrive)$(DatabaseName).ldf',
    SIZE = 5MB,
    MAXSIZE = 5000MB,
    FILEGROWTH = 5MB ) ;
    GO
    -- 002. Add file groups and files
    --:SETVAR DatabaseName "workspace"
    --:SETVAR TableName "TestTable"
    --:SETVAR DataDrive "D:\SQL\Data\"
    --:SETVAR LogDrive "D:\SQL\Logs\"
    DECLARE @nSQL NVARCHAR(2000) ;
    DECLARE @x INT = 1;
    WHILE @x <= 6
    BEGIN
    SELECT @nSQL =
    'ALTER DATABASE $(DatabaseName)
    ADD FILEGROUP $(TableName)_fg' + RTRIM(CAST(@x AS CHAR(5))) + ';
    ALTER DATABASE $(DatabaseName)
    ADD FILE
    NAME= ''$(TableName)_f' + CAST(@x AS CHAR(5)) + ''',
    FILENAME = ''$(DataDrive)\$(TableName)_f' + RTRIM(CAST(@x AS CHAR(5))) + '.ndf''
    TO FILEGROUP $(TableName)_fg' + RTRIM(CAST(@x AS CHAR(5))) + ';'
    EXEC sp_executeSQL @nSQL;
    SET @x = @x + 1;
    END
    -- 003. Create partition function and schema
    --:SETVAR TableName "TestTable"
    --:SETVAR DatabaseName "workspace"
    USE $(DatabaseName);
    CREATE PARTITION FUNCTION $(TableName)_func (int)
AS RANGE RIGHT FOR VALUES
(
0,
15,
30,
45,
60
);
    CREATE PARTITION SCHEME $(TableName)_scheme
    AS
    PARTITION $(TableName)_func
TO
(
$(TableName)_fg1,
$(TableName)_fg2,
$(TableName)_fg3,
$(TableName)_fg4,
$(TableName)_fg5,
$(TableName)_fg6
);
    -- Create TestTable
    --:SETVAR TableName "TestTable"
    --:SETVAR BackupDrive "D:\SQL\Backups\"
    --:SETVAR DatabaseName "workspace"
    CREATE TABLE [dbo].$(TableName)(
    [Partition_PK] [int] NOT NULL,
    [GUID_PK] [uniqueidentifier] NOT NULL,
    [CreateDate] [datetime] NULL,
    [CreateServer] [nvarchar](50) NULL,
    [RandomNbr] [int] NULL,
CONSTRAINT [PK_$(TableName)] PRIMARY KEY CLUSTERED
(
[Partition_PK] ASC,
[GUID_PK] ASC
)
) ON $(TableName)_scheme(Partition_PK)
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_GUID_PK] DEFAULT (newid()) FOR [GUID_PK]
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_CreateDate] DEFAULT (getdate()) FOR [CreateDate]
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_CreateServer] DEFAULT (@@servername) FOR [CreateServer]
    -- 004. Create and populate a test table
-- Load TestTable Data - Seconds 0-59 are used as the Partitioning Key
    --:SETVAR TableName "TestTable"
    SET NOCOUNT ON;
    DECLARE @Now DATETIME = GETDATE()
    WHILE @Now > DATEADD(minute,-1,GETDATE())
    BEGIN
    INSERT INTO [dbo].$(TableName)
    ([Partition_PK]
    ,[RandomNbr])
VALUES
(
DATEPART(second,GETDATE())
,ROUND((RAND() * 100),0)
)
END
    -- Confirm table partitioning - http://lextonr.wordpress.com/tag/sys-destination_data_spaces/
    SELECT
    N'DatabaseName' = DB_NAME()
    , N'SchemaName' = s.name
    , N'TableName' = o.name
    , N'IndexName' = i.name
    , N'IndexType' = i.type_desc
    , N'PartitionScheme' = ps.name
    , N'DataSpaceName' = ds.name
    , N'DataSpaceType' = ds.type_desc
    , N'PartitionFunction' = pf.name
    , N'PartitionNumber' = dds.destination_id
    , N'BoundaryValue' = prv.value
    , N'RightBoundary' = pf.boundary_value_on_right
    , N'PartitionFileGroup' = ds2.name
    , N'RowsOfData' = p.[rows]
    FROM
    sys.objects AS o
    INNER JOIN sys.schemas AS s
    ON o.[schema_id] = s.[schema_id]
    INNER JOIN sys.partitions AS p
    ON o.[object_id] = p.[object_id]
    INNER JOIN sys.indexes AS i
    ON p.[object_id] = i.[object_id]
    AND p.index_id = i.index_id
    INNER JOIN sys.data_spaces AS ds
    ON i.data_space_id = ds.data_space_id
    INNER JOIN sys.partition_schemes AS ps
    ON ds.data_space_id = ps.data_space_id
    INNER JOIN sys.partition_functions AS pf
    ON ps.function_id = pf.function_id
    LEFT OUTER JOIN sys.partition_range_values AS prv
    ON pf.function_id = prv.function_id
    AND p.partition_number = prv.boundary_id
    LEFT OUTER JOIN sys.destination_data_spaces AS dds
    ON ps.data_space_id = dds.partition_scheme_id
    AND p.partition_number = dds.destination_id
    LEFT OUTER JOIN sys.data_spaces AS ds2
    ON dds.data_space_id = ds2.data_space_id
    ORDER BY
    DatabaseName
    ,SchemaName
    ,TableName
    ,IndexName
    ,PartitionNumber
    --=================================================================================
    -- SECTION 2 - SWITCH OUT
    -- 001 - Create TestTableOut
    -- 002 - Switch out partition in range 0-14
    -- 003 - Merge range 0 -29
    -- 001. TestTableOut
    :SETVAR TableName "TestTable"
    IF OBJECT_ID('dbo.$(TableName)Out') IS NOT NULL
    DROP TABLE [dbo].[$(TableName)Out]
    CREATE TABLE [dbo].[$(TableName)Out](
    [Partition_PK] [int] NOT NULL,
    [GUID_PK] [uniqueidentifier] NOT NULL,
    [CreateDate] [datetime] NULL,
    [CreateServer] [nvarchar](50) NULL,
    [RandomNbr] [int] NULL,
CONSTRAINT [PK_$(TableName)Out] PRIMARY KEY CLUSTERED
(
[Partition_PK] ASC,
[GUID_PK] ASC
)
) ON $(TableName)_fg2;
    GO
    -- 002 - Switch out partition in range 0-14
    --:SETVAR TableName "TestTable"
    ALTER TABLE dbo.$(TableName)
    SWITCH PARTITION 2 TO dbo.$(TableName)Out;
    -- 003 - Merge range 0 - 29
    --:SETVAR TableName "TestTable"
    ALTER PARTITION FUNCTION $(TableName)_func()
    MERGE RANGE (15);
    -- Confirm table partitioning
    -- Original source of this query - http://lextonr.wordpress.com/tag/sys-destination_data_spaces/
    SELECT
    N'DatabaseName' = DB_NAME()
    , N'SchemaName' = s.name
    , N'TableName' = o.name
    , N'IndexName' = i.name
    , N'IndexType' = i.type_desc
    , N'PartitionScheme' = ps.name
    , N'DataSpaceName' = ds.name
    , N'DataSpaceType' = ds.type_desc
    , N'PartitionFunction' = pf.name
    , N'PartitionNumber' = dds.destination_id
    , N'BoundaryValue' = prv.value
    , N'RightBoundary' = pf.boundary_value_on_right
    , N'PartitionFileGroup' = ds2.name
    , N'RowsOfData' = p.[rows]
    FROM
    sys.objects AS o
    INNER JOIN sys.schemas AS s
    ON o.[schema_id] = s.[schema_id]
    INNER JOIN sys.partitions AS p
    ON o.[object_id] = p.[object_id]
    INNER JOIN sys.indexes AS i
    ON p.[object_id] = i.[object_id]
    AND p.index_id = i.index_id
    INNER JOIN sys.data_spaces AS ds
    ON i.data_space_id = ds.data_space_id
    INNER JOIN sys.partition_schemes AS ps
    ON ds.data_space_id = ps.data_space_id
    INNER JOIN sys.partition_functions AS pf
    ON ps.function_id = pf.function_id
    LEFT OUTER JOIN sys.partition_range_values AS prv
    ON pf.function_id = prv.function_id
    AND p.partition_number = prv.boundary_id
    LEFT OUTER JOIN sys.destination_data_spaces AS dds
    ON ps.data_space_id = dds.partition_scheme_id
    AND p.partition_number = dds.destination_id
    LEFT OUTER JOIN sys.data_spaces AS ds2
    ON dds.data_space_id = ds2.data_space_id
    ORDER BY
    DatabaseName
    ,SchemaName
    ,TableName
    ,IndexName
    ,PartitionNumber  
    The table below shows the results of the ‘Confirm Table Partitioning’ query, before and after the MERGE.
    The T-SQL code below illustrates the problem.
    -- PartitionLab_RangeRight
    USE workspace;
    DROP TABLE dbo.TestTableOut;
    USE master;
    ALTER DATABASE workspace
    REMOVE FILE TestTable_f3 ;
    -- ERROR
    --Msg 5042, Level 16, State 1, Line 1
    --The file 'TestTable_f3 ' cannot be removed because it is not empty.
    ALTER DATABASE workspace
    REMOVE FILE TestTable_f2 ;
    -- Works surprisingly!!
    use workspace;
    ALTER INDEX [PK_TestTable] ON [dbo].[TestTable] REBUILD PARTITION = 2;
    --Msg 622, Level 16, State 3, Line 2
    --The filegroup "TestTable_fg2" has no files assigned to it. Tables, indexes, text columns, ntext columns, and image columns cannot be populated on this filegroup until a file is added.
    --The statement has been terminated.
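To see where the rows physically live, as opposed to what sys.destination_data_spaces reports, the allocation units can be queried directly; a sketch using standard catalog views (joining on hobt_id, which covers in-row and row-overflow allocation units):
SELECT o.name AS TableName
, p.partition_number
, au.type_desc
, ds.name AS PhysicalFileGroup
, au.total_pages
FROM sys.partitions AS p
INNER JOIN sys.objects AS o
ON p.[object_id] = o.[object_id]
INNER JOIN sys.allocation_units AS au
ON au.container_id = p.hobt_id
INNER JOIN sys.data_spaces AS ds
ON au.data_space_id = ds.data_space_id
WHERE o.name = 'TestTable'
ORDER BY p.partition_number;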
    If you run ALTER INDEX REBUILD before trying to remove files from File Group 3, it works. Rerun the database setup script then the code below.
    -- RANGE RIGHT
    -- Rerun PartitionLabSetup_RangeRight.sql before the code below
    USE workspace;
    DROP TABLE dbo.TestTableOut;
    ALTER INDEX [PK_TestTable] ON [dbo].[TestTable] REBUILD PARTITION = 2;
    USE master;
    ALTER DATABASE workspace
    REMOVE FILE TestTable_f3;
    -- Works as expected!!
The file in File Group 2 appears to contain data, but it can be dropped. Although the system views report the data as being in File Group 2, it still physically resides in File Group 3 and isn't moved until the index is rebuilt. With RANGE RIGHT, the left file group (File Group 2) is the one retained when ranges are merged.
RANGE LEFT would have retained the data in File Group 3, where it already resided, so no INDEX REBUILD is necessary to effectively complete the MERGE operation. The script below implements the same partitioning strategy (data distribution between partitions) on the test table but uses different boundary definitions and RANGE LEFT.
    --=================================================================================
    -- PartitionLabSetup_RangeLeft.sql
    -- 001. Create test database
    -- 002. Add file groups and files
    -- 003. Create partition function and schema
    -- 004. Create and populate a test table
    --=================================================================================
    USE [master]
    GO
    -- 001 - Create Test Database
    :SETVAR DataDrive "D:\SQL\Data\"
    :SETVAR LogDrive "D:\SQL\Logs\"
    :SETVAR DatabaseName "workspace"
    :SETVAR TableName "TestTable"
    -- Drop if exists and create Database
    IF DATABASEPROPERTYEX(N'$(databasename)','Status') IS NOT NULL
    BEGIN
    ALTER DATABASE $(DatabaseName) SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    DROP DATABASE $(DatabaseName)
    END
    CREATE DATABASE $(DatabaseName)
    ON
    ( NAME = $(DatabaseName)_data,
    FILENAME = N'$(DataDrive)$(DatabaseName)_data.mdf',
    SIZE = 10,
    MAXSIZE = 500,
    FILEGROWTH = 5 )
    LOG ON
    ( NAME = $(DatabaseName)_log,
    FILENAME = N'$(LogDrive)$(DatabaseName).ldf',
    SIZE = 5MB,
    MAXSIZE = 5000MB,
    FILEGROWTH = 5MB ) ;
    GO
    -- 002. Add file groups and files
    --:SETVAR DatabaseName "workspace"
    --:SETVAR TableName "TestTable"
    --:SETVAR DataDrive "D:\SQL\Data\"
    --:SETVAR LogDrive "D:\SQL\Logs\"
    DECLARE @nSQL NVARCHAR(2000) ;
    DECLARE @x INT = 1;
    WHILE @x <= 6
    BEGIN
    SELECT @nSQL =
    'ALTER DATABASE $(DatabaseName)
    ADD FILEGROUP $(TableName)_fg' + RTRIM(CAST(@x AS CHAR(5))) + ';
    ALTER DATABASE $(DatabaseName)
    ADD FILE
    NAME= ''$(TableName)_f' + CAST(@x AS CHAR(5)) + ''',
    FILENAME = ''$(DataDrive)\$(TableName)_f' + RTRIM(CAST(@x AS CHAR(5))) + '.ndf''
    TO FILEGROUP $(TableName)_fg' + RTRIM(CAST(@x AS CHAR(5))) + ';'
    EXEC sp_executeSQL @nSQL;
    SET @x = @x + 1;
    END
    -- 003. Create partition function and schema
    --:SETVAR TableName "TestTable"
    --:SETVAR DatabaseName "workspace"
    USE $(DatabaseName);
    CREATE PARTITION FUNCTION $(TableName)_func (int)
AS RANGE LEFT FOR VALUES
(
-1,
14,
29,
44,
59
);
    CREATE PARTITION SCHEME $(TableName)_scheme
    AS
    PARTITION $(TableName)_func
TO
(
$(TableName)_fg1,
$(TableName)_fg2,
$(TableName)_fg3,
$(TableName)_fg4,
$(TableName)_fg5,
$(TableName)_fg6
);
    -- Create TestTable
    --:SETVAR TableName "TestTable"
    --:SETVAR BackupDrive "D:\SQL\Backups\"
    --:SETVAR DatabaseName "workspace"
    CREATE TABLE [dbo].$(TableName)(
    [Partition_PK] [int] NOT NULL,
    [GUID_PK] [uniqueidentifier] NOT NULL,
    [CreateDate] [datetime] NULL,
    [CreateServer] [nvarchar](50) NULL,
    [RandomNbr] [int] NULL,
CONSTRAINT [PK_$(TableName)] PRIMARY KEY CLUSTERED
(
[Partition_PK] ASC,
[GUID_PK] ASC
)
) ON $(TableName)_scheme(Partition_PK)
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_GUID_PK] DEFAULT (newid()) FOR [GUID_PK]
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_CreateDate] DEFAULT (getdate()) FOR [CreateDate]
    ALTER TABLE [dbo].$(TableName) ADD CONSTRAINT [DF_$(TableName)_CreateServer] DEFAULT (@@servername) FOR [CreateServer]
    -- 004. Create and populate a test table
-- Load TestTable Data - Seconds 0-59 are used as the Partitioning Key
    --:SETVAR TableName "TestTable"
    SET NOCOUNT ON;
    DECLARE @Now DATETIME = GETDATE()
    WHILE @Now > DATEADD(minute,-1,GETDATE())
    BEGIN
    INSERT INTO [dbo].$(TableName)
    ([Partition_PK]
    ,[RandomNbr])
VALUES
(
DATEPART(second,GETDATE())
,ROUND((RAND() * 100),0)
)
END
    -- Confirm table partitioning - http://lextonr.wordpress.com/tag/sys-destination_data_spaces/
    SELECT
    N'DatabaseName' = DB_NAME()
    , N'SchemaName' = s.name
    , N'TableName' = o.name
    , N'IndexName' = i.name
    , N'IndexType' = i.type_desc
    , N'PartitionScheme' = ps.name
    , N'DataSpaceName' = ds.name
    , N'DataSpaceType' = ds.type_desc
    , N'PartitionFunction' = pf.name
    , N'PartitionNumber' = dds.destination_id
    , N'BoundaryValue' = prv.value
    , N'RightBoundary' = pf.boundary_value_on_right
    , N'PartitionFileGroup' = ds2.name
    , N'RowsOfData' = p.[rows]
    FROM
    sys.objects AS o
    INNER JOIN sys.schemas AS s
    ON o.[schema_id] = s.[schema_id]
    INNER JOIN sys.partitions AS p
    ON o.[object_id] = p.[object_id]
    INNER JOIN sys.indexes AS i
    ON p.[object_id] = i.[object_id]
    AND p.index_id = i.index_id
    INNER JOIN sys.data_spaces AS ds
    ON i.data_space_id = ds.data_space_id
    INNER JOIN sys.partition_schemes AS ps
    ON ds.data_space_id = ps.data_space_id
    INNER JOIN sys.partition_functions AS pf
    ON ps.function_id = pf.function_id
    LEFT OUTER JOIN sys.partition_range_values AS prv
    ON pf.function_id = prv.function_id
    AND p.partition_number = prv.boundary_id
    LEFT OUTER JOIN sys.destination_data_spaces AS dds
    ON ps.data_space_id = dds.partition_scheme_id
    AND p.partition_number = dds.destination_id
    LEFT OUTER JOIN sys.data_spaces AS ds2
    ON dds.data_space_id = ds2.data_space_id
    ORDER BY
    DatabaseName
    ,SchemaName
    ,TableName
    ,IndexName
    ,PartitionNumber
    --=================================================================================
    -- SECTION 2 - SWITCH OUT
    -- 001 - Create TestTableOut
    -- 002 - Switch out partition in range 0-14
    -- 003 - Merge range 0 -29
    -- 001. TestTableOut
    :SETVAR TableName "TestTable"
    IF OBJECT_ID('dbo.$(TableName)Out') IS NOT NULL
    DROP TABLE [dbo].[$(TableName)Out]
    CREATE TABLE [dbo].[$(TableName)Out](
    [Partition_PK] [int] NOT NULL,
    [GUID_PK] [uniqueidentifier] NOT NULL,
    [CreateDate] [datetime] NULL,
    [CreateServer] [nvarchar](50) NULL,
    [RandomNbr] [int] NULL,
CONSTRAINT [PK_$(TableName)Out] PRIMARY KEY CLUSTERED
(
[Partition_PK] ASC,
[GUID_PK] ASC
)
) ON $(TableName)_fg2;
    GO
    -- 002 - Switch out partition in range 0-14
    --:SETVAR TableName "TestTable"
    ALTER TABLE dbo.$(TableName)
    SWITCH PARTITION 2 TO dbo.$(TableName)Out;
    -- 003 - Merge range 0 - 29
    :SETVAR TableName "TestTable"
    ALTER PARTITION FUNCTION $(TableName)_func()
    MERGE RANGE (14);
    -- Confirm table partitioning
    -- Original source of this query - http://lextonr.wordpress.com/tag/sys-destination_data_spaces/
    SELECT
    N'DatabaseName' = DB_NAME()
    , N'SchemaName' = s.name
    , N'TableName' = o.name
    , N'IndexName' = i.name
    , N'IndexType' = i.type_desc
    , N'PartitionScheme' = ps.name
    , N'DataSpaceName' = ds.name
    , N'DataSpaceType' = ds.type_desc
    , N'PartitionFunction' = pf.name
    , N'PartitionNumber' = dds.destination_id
    , N'BoundaryValue' = prv.value
    , N'RightBoundary' = pf.boundary_value_on_right
    , N'PartitionFileGroup' = ds2.name
    , N'RowsOfData' = p.[rows]
    FROM
    sys.objects AS o
    INNER JOIN sys.schemas AS s
    ON o.[schema_id] = s.[schema_id]
    INNER JOIN sys.partitions AS p
    ON o.[object_id] = p.[object_id]
    INNER JOIN sys.indexes AS i
    ON p.[object_id] = i.[object_id]
    AND p.index_id = i.index_id
    INNER JOIN sys.data_spaces AS ds
    ON i.data_space_id = ds.data_space_id
    INNER JOIN sys.partition_schemes AS ps
    ON ds.data_space_id = ps.data_space_id
    INNER JOIN sys.partition_functions AS pf
    ON ps.function_id = pf.function_id
    LEFT OUTER JOIN sys.partition_range_values AS prv
    ON pf.function_id = prv.function_id
    AND p.partition_number = prv.boundary_id
    LEFT OUTER JOIN sys.destination_data_spaces AS dds
    ON ps.data_space_id = dds.partition_scheme_id
    AND p.partition_number = dds.destination_id
    LEFT OUTER JOIN sys.data_spaces AS ds2
    ON dds.data_space_id = ds2.data_space_id
    ORDER BY
    DatabaseName
    ,SchemaName
    ,TableName
    ,IndexName
    ,PartitionNumber
    The table below shows the results of the ‘Confirm Table Partitioning’ query, before and after the MERGE.
    The data in the File and File Group to be dropped (File Group 2) has already been switched out; File Group 3 contains the data so no index rebuild is needed to move data and complete the MERGE.
RANGE RIGHT would not be a problem in a 'Sliding Window' if the same file group were used for all partitions; with multiple file groups, creating and dropping partitions introduces a dependency on full index rebuilds. Partitioning is typically applied to larger tables, where a full index rebuild can be an expensive operation. I'm not sure how a RANGE RIGHT strategy with an ascending partitioning key could be implemented across multiple file groups without having to move data. Using a single file group (with multiple files) for all partitions in a table would avoid physically moving data between file groups: no index rebuild would be necessary to complete a MERGE, and the system views would accurately reflect the physical location of the data.
If a RANGE RIGHT partition function is used with a typical ascending partitioning key, the data is physically in the wrong file group after the MERGE, and the 'Data Spaces' system views might be misleading. Thanks to Manuj and Chris for a lot of help investigating this.
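For illustration, a minimal sketch of the single-file-group variant described above; the function and scheme names are hypothetical:
-- All partitions map to one file group, so a MERGE or SPLIT never has to
-- move rows between file groups and no full index rebuild is required.
CREATE PARTITION FUNCTION pf_demo_right (int)
AS RANGE RIGHT FOR VALUES (0, 15, 30, 45);
CREATE PARTITION SCHEME ps_demo_right
AS PARTITION pf_demo_right
ALL TO ([PRIMARY]);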
    NOTE 10/03/2014 - The solution
The solution is so easy it's embarrassing: I was using the wrong boundary points for the MERGE (both RANGE LEFT and RANGE RIGHT) to get rid of historic data.
    -- Wrong Boundary Point Range Right
    --ALTER PARTITION FUNCTION $(TableName)_func()
    --MERGE RANGE (15);
    -- Wrong Boundary Point Range Left
    --ALTER PARTITION FUNCTION $(TableName)_func()
    --MERGE RANGE (14);
-- Correct Boundary Points for MERGE
    ALTER PARTITION FUNCTION $(TableName)_func()
    MERGE RANGE (0); -- or -1 for RANGE LEFT
The empty, switched-out partition (on File Group 2) is then MERGED with the empty partition maintained at the start of the range, and no data movement is necessary. I retract the suggestion that a problem exists with RANGE RIGHT Sliding Windows using multiple file groups, and apologize :-)

    Hi Paul Brewer,
Thanks for your post, and glad to hear that the issue is resolved. It is kind of you to post a reply sharing your solution; other community members can benefit from it.
    Regards.
Sofiya Li
    TechNet Community Support

  • Issue in loading specific columns from a file to teradata table using IKM

    Hi,
Can anyone help resolve an issue with loading specific columns from a text file into a Teradata table?
I tried using the IKM File to Teradata, and the columns are getting displaced.
My requirement: suppose I have 5 columns in the file and have to load only 3 of them into the table using the IKM.
The same thing can be achieved using the LKM File to Teradata, but I want to use the IKM.
Please advise.
    Regards
    Vinod

    Hi,
I believe the problem is that you have a delimited file and want to pick the columns at positions 2, 3 and 5. In this case, ODI will pick the first 3 columns of a delimited file regardless of position.
For example, if you have a tab-delimited file with columns c1,c2,c3,c4,c5 and you map only c2,c3,c5 in an ODI interface, on execution you will actually pick up the data from c1,c2,c3, as these are the first three columns in the file (reading from left to right). You can ignore "columns" on the right-hand side of a file, but not on the left: in the same file, picking only c1,c2 will correctly give you the data for the first two columns.
Create a temporary table to load all the data from the file, and use that temp table to extract the data you require, as sketched below. Or have the file created with the columns you require as its first three columns.
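A minimal sketch of the staging-table approach; the table and column names are hypothetical:
-- Staging table holding all five file columns exactly as delivered
CREATE TABLE stg_source_file
( c1 VARCHAR(50)
, c2 VARCHAR(50)
, c3 VARCHAR(50)
, c4 VARCHAR(50)
, c5 VARCHAR(50)
);
-- Once the IKM has loaded the whole file into the staging table,
-- project just the columns the target needs
INSERT INTO target_table (col_a, col_b, col_c)
SELECT c2, c3, c5
FROM stg_source_file;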
    Cheers
    Bos
    Edited by: Bos on Jan 18, 2011 1:06 PM

  • Import tables using same file name for all tables

    hi,
We are using Oracle 11.2.0.3.0. We took an export of multiple tables in a schema, but unfortunately we mentioned the same file name in every exp command. This created one file whose size suggests all the tables are included in it; however, when we try to import, only the last table mentioned in the export commands is imported, and for the other tables the import hangs. Example commands are shown below.
Is there any way to import the remaining tables?
    exp system file =exp05042013.dmp tables=gr.table1
    exp system file =exp05042013.dmp tables=gr.table2
    exp system file =exp05042013.dmp tables=gr.table3

    Hi,
You can export multiple tables into a single file like this:
exp username@database tables=table1,table2 file=tabledata.dmp
Note that each of your original exp commands overwrote the previous dump file, so only the last table's export survives in exp05042013.dmp; the other tables will need to be exported again.
Do you have any RMAN full backup? If so, why not recover the database from it?
Mahir
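For completeness, a sketch of re-exporting with one dump file per table so that nothing is overwritten; the file names are hypothetical:
exp system file=table1_05042013.dmp tables=gr.table1
exp system file=table2_05042013.dmp tables=gr.table2
exp system file=table3_05042013.dmp tables=gr.table3
or, in a single run:
exp system file=exp05042013.dmp tables=gr.table1,gr.table2,gr.table3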

  • How to map single input value to Two columns of Database table using format file of Bulk Copy Process

    Hi All,
I am using OPENROWSET to load file data into a table. The problem is that I need to map the same input value to two different columns of the table; since the format file doesn't allow duplicate column numbers, I am unable to insert the same value into two columns. Please help me find a solution for this.
I can only use OPENROWSET, because I also need to insert some default values that are derived from the file. The only problem is how to map the same input value to two different columns of the table. Please give me your suggestions.
    Thanks,
    Sudhakar

    From what you say:
       INSERT tbl(col1, col2)
          SELECT col1, col1
          FROM   OPENROWSET(....)
    But I guess it is more difficult. You need to give more details. What sort of data source do you have? What does your query look like? The target table?
    Erland Sommarskog, SQL Server MVP, [email protected]
    Hi Erland,
    Thanks for your response
My source file is a text file with | as the separator, for example:
1002|eTab |V101|eTablet|V100|Logic|LT-7|Laptops|SCM
The database table has columns column1, column2, column3, etc. I need to insert the same value from the input file into two columns; for example, the eTab value from the text file has to be inserted into both column2 and column3 of the table.
We cannot change the format file.
For this situation, how can we insert eTab into both column2 and column3?
    Thanks,
    Sudhakar.
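A minimal sketch of Erland's approach applied to the pipe-delimited sample above; the file path, format-file path, table name, and the field names defined in the format file are all assumptions:
-- The format file (ProductFile.fmt) maps each of the nine pipe-delimited
-- fields once; the duplication happens in the SELECT list, not in the
-- format file, by selecting the same source field twice.
INSERT INTO dbo.Products (ProductId, Column2, Column3)
SELECT t.field1,   -- 1002
       t.field2,   -- eTab, into column2
       t.field2    -- eTab again, into column3
FROM OPENROWSET(
       BULK 'D:\Data\products.txt',
       FORMATFILE = 'D:\Data\ProductFile.fmt') AS t;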
