Request: PL/SQL, External Table and SQL Loader

Lately I have seen that questions asked in the SQL and PL/SQL forum regarding SQL*Loader and External Tables are being moved to {forum:id=732}.
Having been a PL/SQL developer for some time now, I feel that External Tables (and, if I may, SQL*Loader and DBMS_DATAPUMP) are very much an integral part of PL/SQL development, and that questions related to these topics are well suited to the SQL and PL/SQL forum. Even the SQL and PL/SQL FAQ has dedicated content that discusses these topics {message:id=9360007}.
So I would like to request that the moderators consider not moving such questions out of the SQL and PL/SQL forum.
Thanks,
Karthick.

Karthick_Arp wrote:
Lately I have seen that questions asked in the SQL and PL/SQL forum regarding SQL*Loader and External Tables are being moved to {forum:id=732}.
Having been a PL/SQL developer for some time now, I feel that External Tables (and, if I may, SQL*Loader and DBMS_DATAPUMP) are very much an integral part of PL/SQL development, and that questions related to these topics are well suited to the SQL and PL/SQL forum. Even the SQL and PL/SQL FAQ has dedicated content that discusses these topics {message:id=9360007}.
So I would like to request that the moderators consider not moving such questions out of the SQL and PL/SQL forum.
Thanks,
Karthick.

Not sure which moderators are moving them... cos it ain't me. I'm quite happy to leave those there.

Similar Messages

  • External table and data load order

    I'm using the Oracle 10g external table feature to load a text file into the database.
    1- What is the default order of loading the text file into the Oracle table?
    2- How do I ensure or change this default behaviour?

    1- What is the default order of loading the text file into the Oracle table?
    Top (row #1) to bottom.
    2- How do I ensure or change this default behaviour?
    Use an ORDER BY clause.
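    To make that concrete, a minimal sketch (the external table ext_lines, its columns and the target table are made up for illustration): rows read from an external table have no guaranteed order, so impose one explicitly when you load from it.
    INSERT INTO target_tbl (line_no, line_text)
    SELECT line_no, line_text
    FROM   ext_lines
    ORDER BY line_no;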

  • External Table vs SQL Loader.

    Hi,
    Please, can anybody tell me what the significant differences are between external tables and SQL*Loader?

    Both fall into the category of Oracle utilities.
    [SQL*Loader|http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/utility.htm#i10606] loads data into Oracle tables from operating system files, while an [external table|http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/utility.htm#i10611] provides
    functionality similar to SQL*Loader for accessing external data, but with different logic and rules: it lets you access data in external sources as if it were in a table in the database.
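    To illustrate the difference, a minimal sketch (directory, file and column names are made up): with SQL*Loader you run the sqlldr client against a control file, whereas with an external table you describe the file once and then simply query it with SQL.
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('emp.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- and then just:
    SELECT * FROM emp_ext;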

  • SQL LOADER , EXTERNAL  TABLE and ODBS DATA SOURCE

    hello
    Can anybody help with loading data from a dBase file (.dbt) into an Oracle 10g table?
    Yesterday I tried SQL*Loader, an external table, and an ODBC data source.
    Why do all of these utilities still fail to solve my problem?
    Is there an efficient way to reach this goal?
    Thanks in advance

    Export the dBase data file to a text file;
    then you have the choice of using either SQL*Loader or an external table to load it.
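    As a rough sketch of the second step (file, table and column names are made up), a minimal SQL*Loader control file for the exported text file could look like this:
    -- load_dbase.ctl, invoked e.g. with: sqlldr <user>/<password> control=load_dbase.ctl
    LOAD DATA
    INFILE 'export.csv'
    APPEND
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1, col2, col3)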
    regards

  • How to provide joins between oracle tables and sql server tables

    Hi,
    I have a requirement to generate a report from two different databases, i.e. Oracle and SQL Server.
    How do I provide joins between Oracle tables and SQL Server tables? Any help on this?
    Regards,
    Malli

    user10675696 wrote:
    I have a requirement to generate a report from two different databases, i.e. Oracle and SQL Server.
    Bad idea most times. Heterogeneous joins do not exactly scale, and performance can be severely degraded by network speed and bandwidth availability. And there is nothing you can do in the application and database layers to address a performance issue at the network level in this case - your code's performance is simply at the mercy of network performance. With a single glaring fact - network performance is continually degrading. All the time. Always. Until it is upgraded. When the performance degradation starts all over again.
    If the tables are not small (a few thousand rows each) with static row volumes, I would not consider doing a heterogeneous join. Instead I would rather go for a materialised view on the Oracle side, use a proper table and index structure, and do a local database join.
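    A minimal sketch of that suggestion (the database link sqlserver_link, assumed to be set up via Database Gateway/Heterogeneous Services, and the table names are made up):
    CREATE MATERIALIZED VIEW mv_sqlserver_orders
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT order_id, customer_id, order_date
      FROM   orders@sqlserver_link;
    -- the report then joins locally:
    SELECT o.order_id, c.customer_name
    FROM   mv_sqlserver_orders o
    JOIN   customers c ON c.customer_id = o.customer_id;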

  • PL/SQL 101 : Cursors and SQL Projection

    PL/SQL 101 : Cursors and SQL Projection
    This is not a question; it's a forum article, in response to the number of questions we get regarding a "dynamic number of columns" or "rows to columns".
    There are two integral parts to an SQL Select statement that relate to what data is selected. One is Projection and the other is Selection:-
    Selection is the one that we always recognise and use as it forms the WHERE clause of the select statement, and hence selects which rows of data are queried.
    The other, SQL Projection, is the less understood of the two, and the one that this article will help to explain.
    In short, SQL Projection is the collective name for the columns that are Selected and returned from a query.
    So what? Big deal eh? Why do we need to know this?
    The reason for knowing this is that many people are not aware of when SQL projection comes into play when you issue a select statement. So let's take a basic query...
    First create some test data...
    create table proj_test as
      select 1 as id, 1 as rn, 'Fred' as nm from dual union all
      select 1,2,'Bloggs' from dual union all
      select 2,1,'Scott' from dual union all
      select 2,2,'Smith' from dual union all
      select 3,1,'Jim' from dual union all
      select 3,2,'Jones' from dual
    ... and now query that data...
    SQL> select * from proj_test;
             ID         RN NM
             1          1 Fred
             1          2 Bloggs
             2          1 Scott
             2          2 Smith
             3          1 Jim
             3          2 Jones
    6 rows selected.
    OK, so what is that query actually doing?
    To know that we need to consider that all queries are cursors and all cursors are processed in a set manner, roughly speaking...
    1. The cursor is opened
    2. The query is parsed
    3. The query is described to know the projection (what columns are going to be returned, names, datatypes etc.)
    4. Bind variables are bound in
    5. The query is executed to apply the selection and identify the data to be retrieved
    6. A row of data is fetched
    7. The data values from the columns within that row are extracted into the known projection
    8. Steps 6 and 7 are repeated until there is no more data or another condition stops the fetching
    9. The cursor is closed
    The purpose of the projection being determined is so that the internal processing of the cursor can allocate memory etc. ready to fetch the data into. We won't get to see that memory allocation happening easily, but we can see the same query being executed in these steps if we do it programmatically using the dbms_sql package...
    CREATE OR REPLACE PROCEDURE process_cursor (p_query in varchar2) IS
      v_sql       varchar2(32767) := p_query;
      v_cursor    number;            -- A cursor is a handle (numeric identifier) to the query
      col_cnt     integer;
      v_n_val     number;            -- numeric type to fetch data into
      v_v_val     varchar2(20);      -- varchar type to fetch data into
      v_d_val     date;              -- date type to fetch data into
      rec_tab     dbms_sql.desc_tab; -- table structure to hold sql projection info
      dummy       number;
      v_ret       number;            -- number of rows returned
      v_finaltxt  varchar2(100);
      col_num     number;
    BEGIN
      -- 1. Open the cursor
      dbms_output.put_line('1 - Opening Cursor');
      v_cursor := dbms_sql.open_cursor;
      -- 2. Parse the cursor
      dbms_output.put_line('2 - Parsing the query');
      dbms_sql.parse(v_cursor, v_sql, dbms_sql.NATIVE);
      -- 3. Describe the query
      -- Note: The query has been described internally when it was parsed, but we can look at
      --       that description...
      -- Fetch the description into a structure we can read, returning the count of columns that has been projected
      dbms_output.put_line('3 - Describing the query');
      dbms_sql.describe_columns(v_cursor, col_cnt, rec_tab);
      -- Use that description to define local datatypes into which we want to fetch our values
      -- Note: This only defines the types, it doesn't fetch any data and whilst we can also
      --       determine the size of the columns we'll just use some fixed sizes for this example
      dbms_output.put_line(chr(10)||'3a - SQL Projection:-');
      for j in 1..col_cnt
      loop
        v_finaltxt := 'Column Name: '||rpad(upper(rec_tab(j).col_name),30,' ');
        case rec_tab(j).col_type
          -- if the type of column is varchar2, bind that to our varchar2 variable
          when 1 then
            dbms_sql.define_column(v_cursor,j,v_v_val,20);
            v_finaltxt := v_finaltxt||' Datatype: Varchar2';
          -- if the type of the column is number, bind that to our number variable
          when 2 then
            dbms_sql.define_column(v_cursor,j,v_n_val);
            v_finaltxt := v_finaltxt||' Datatype: Number';
          -- if the type of the column is date, bind that to our date variable
          when 12 then
            dbms_sql.define_column(v_cursor,j,v_d_val);
            v_finaltxt := v_finaltxt||' Datatype: Date';
          -- ...Other types can be added as necessary...
        else
          -- All other types we'll assume are varchar2 compatible (implicitly converted)
          dbms_sql.DEFINE_COLUMN(v_cursor,j,v_v_val,2000);
          v_finaltxt := v_finaltxt||' Datatype: Varchar2 (implicit)';
        end case;
        dbms_output.put_line(v_finaltxt);
      end loop;
      -- 4. Bind variables
      dbms_output.put_line(chr(10)||'4 - Binding in values');
      null; -- we have no values to bind in for our test
      -- 5. Execute the query to make it identify the data on the database (Selection)
      -- Note: This doesn't fetch any data, it just identifies what data is required.
      dbms_output.put_line('5 - Executing the query');
      dummy := dbms_sql.execute(v_cursor);
      -- 6.,7.,8. Fetch the rows of data...
      dbms_output.put_line(chr(10)||'6,7 and 8 Fetching Data:-');
      loop
        -- 6. Fetch next row of data
        v_ret := dbms_sql.fetch_rows(v_cursor);
        -- If the fetch returned no row then exit the loop
        exit when v_ret = 0;
        -- 7. Extract the values from the row
        v_finaltxt := null;
        -- loop through each of the Projected columns
        for j in 1..col_cnt
        loop
          case rec_tab(j).col_type
            -- if it's a varchar2 column
            when 1 then
              -- read the value into our varchar2 variable
              dbms_sql.column_value(v_cursor,j,v_v_val);
              v_finaltxt := ltrim(v_finaltxt||','||rpad(v_v_val,20,' '),',');
            -- if it's a number column
            when 2 then
              -- read the value into our number variable
              dbms_sql.column_value(v_cursor,j,v_n_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_n_val,'fm999999'),',');
            -- if it's a date column
            when 12 then
              -- read the value into our date variable
              dbms_sql.column_value(v_cursor,j,v_d_val);
              v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
          else
            -- read the value into our varchar2 variable (assumes it can be implicitly converted)
            dbms_sql.column_value(v_cursor,j,v_v_val);
            v_finaltxt := ltrim(v_finaltxt||',"'||rpad(v_v_val,20,' ')||'"',',');
          end case;
        end loop;
        dbms_output.put_line(v_finaltxt);
        -- 8. Loop to fetch next row
      end loop;
      -- 9. Close the cursor
      dbms_output.put_line(chr(10)||'9 - Closing the cursor');
      dbms_sql.close_cursor(v_cursor);
    END;
    SQL> exec process_cursor('select * from proj_test');
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: RN                             Datatype: Number
    Column Name: NM                             Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,1     ,Fred
    1     ,2     ,Bloggs
    2     ,1     ,Scott
    2     ,2     ,Smith
    3     ,1     ,Jim
    3     ,2     ,Jones
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    So, what's really the point in knowing when SQL Projection occurs in a query?
    Well, we get many questions asking "How do I convert rows to columns?" (otherwise known as a pivot) or questions like "How can I get the data back from a dynamic query with different columns?"
    Let's look at a regular pivot. We would normally do something like...
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    (or, in 11g, use the new PIVOT statement)
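    As a side note, a minimal sketch (not part of the examples above) of the same pivot written with the 11g PIVOT clause:
    select *
    from   (select id, rn, nm from proj_test)
    pivot  (max(nm) for rn in (1 as nm_1, 2 as nm_2));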
    but many of these questioners don't understand why, when their issue is that they have an unknown number of rows and don't know how many columns the output will need, they are told that you can't do that in a single SQL statement. e.g.
    SQL> insert into proj_test (id, rn, nm) values (1,3,'Freddy');
    1 row created.
    SQL> select id
      2        ,max(decode(rn,1,nm)) as nm_1
      3        ,max(decode(rn,2,nm)) as nm_2
      4  from proj_test
      5  group by id
      6  /
            ID NM_1   NM_2
             1 Fred   Bloggs
             2 Scott  Smith
             3 Jim    Jones
    ... it's not giving us this 3rd entry as a new column, and we can only get that by writing the expected columns into the query - but then what if more columns are needed after that, etc.?
    If we look back at the steps of a cursor we see again that the description and projection of what columns are returned by a query happens before any data is fetched back.
    Because of this, it's not possible to have the query return back a number of columns that are based on the data itself, as no data has been fetched at the point the projection is required.
    So, what is the answer to getting an unknown number of columns in the output?
    1) The most obvious answer is, don't use SQL to try and pivot your data. Pivoting of data is more of a reporting requirement and most reporting tools include the ability to pivot data either as part of the initial report generation or on-the-fly at the users request. The main point about using the reporting tools is that they query the data first and then the pivoting is simply a case of manipulating the display of those results, which can be dynamically determined by the reporting tool based on what data there is.
    2) The other answer is to write dynamic SQL. Because you're not going to know the number of columns, this isn't just a simple case of building up a SQL query as a string and passing it to the EXECUTE IMMEDIATE command within PL/SQL, because you won't have a suitable structure to read the results back into, as those structures must have a known number of variables for each of the columns at design time, before the data is known. As such, inside PL/SQL code, you would have to use the DBMS_SQL package, just like in the code above that showed the workings of a cursor, as the columns there are referenced by position rather than name, and you have to deal with each column separately. What you do with each column is up to you... store them in an array/collection, process them as you get them, or whatever. The key thing with doing this is that, just like the reporting tools, you would need to process the data first to determine what your SQL projection is, before you execute the query to fetch the data in the format you want e.g.
    create or replace procedure dyn_pivot is
      v_sql varchar2(32767);
      -- cursor to find out the maximum number of projected columns required
      -- by looking at the data
      cursor cur_proj_test is
        select distinct rn
        from   proj_test
        order by rn;
    begin
      v_sql := 'select id';
      for i in cur_proj_test
      loop
        -- dynamically add to the projection for the query
        v_sql := v_sql||',max(decode(rn,'||i.rn||',nm)) as nm_'||i.rn;
      end loop;
      v_sql := v_sql||' from proj_test group by id order by id';
      dbms_output.put_line('Dynamic SQL Statement:-'||chr(10)||v_sql||chr(10)||chr(10));
      -- call our DBMS_SQL procedure to process the query with its dynamic projection
      process_cursor(v_sql);
    end;
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy
    2     ,Scott               ,Smith               ,
    3     ,Jim                 ,Jones               ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    ... and if more data is added ...
    SQL> insert into proj_test (id, rn, nm) values (1,4,'Fud');
    1 row created.
    SQL> exec dyn_pivot;
    Dynamic SQL Statement:-
    select id,max(decode(rn,1,nm)) as nm_1,max(decode(rn,2,nm)) as nm_2,max(decode(rn,3,nm)) as nm_3,max(decode(rn,4,nm)) as nm_4 from proj_test group by id order by id
    1 - Opening Cursor
    2 - Parsing the query
    3 - Describing the query
    3a - SQL Projection:-
    Column Name: ID                             Datatype: Number
    Column Name: NM_1                           Datatype: Varchar2
    Column Name: NM_2                           Datatype: Varchar2
    Column Name: NM_3                           Datatype: Varchar2
    Column Name: NM_4                           Datatype: Varchar2
    4 - Binding in values
    5 - Executing the query
    6,7 and 8 Fetching Data:-
    1     ,Fred                ,Bloggs              ,Freddy              ,Fud
    2     ,Scott               ,Smith               ,                    ,
    3     ,Jim                 ,Jones               ,                    ,
    9 - Closing the cursor
    PL/SQL procedure successfully completed.
    Of course there are other methods, using dynamically generated scripts etc. (see Re: 4. How do I convert rows to columns?), but the above simply demonstrates that:-
    a) having a dynamic projection requires two passes of the data; one to dynamically generate the query and another to actually query the data,
    b) it is not a good idea in most cases as it requires code to handle the results dynamically rather than being able to simply query directly into a known structure or variables, and
    c) a simple SQL statement cannot have a dynamic projection.
    Most importantly, dynamic queries prevent validation of your queries at the time your code is compiled, so the compiler can't check that the column names or table names are correct, or that the actual syntax of the generated query is correct. This only happens at run time, and depending upon the complexity of your dynamic query, some problems may only be experienced under certain conditions. In effect you are writing queries that are harder to validate and could potentially have bugs in them that are not apparent until they get to a run-time environment. Dynamic queries can also introduce the possibility of SQL injection (a potential security risk), especially if a user is supplying a string value into the query from an interface.
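    As a brief illustrative sketch (the variable names here are made up, not from the article): when a user-supplied value has to go into a dynamic query, passing it as a bind variable avoids the injection risk for that value.
    -- risky: the user-supplied value p_nm is concatenated straight into the SQL text
    -- execute immediate 'select count(*) from proj_test where nm = '''||p_nm||'''' into v_cnt;
    -- safer: the value is supplied as a bind variable
    execute immediate 'select count(*) from proj_test where nm = :name' into v_cnt using p_nm;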
    To summarise:-
    The projection of an SQL statement must be known by the SQL engine before any data is fetched, so don't expect SQL to magically create columns on-the-fly based on the data it's retrieving back; and, if you find yourself thinking of using dynamic SQL to get around it, just take a step back and see if what you are trying to achieve may be better done elsewhere, such as in a reporting tool or the user interface.
    Other articles in the PL/SQL 101 series:-
    PL/SQL 101 : Understanding Ref Cursors
    PL/SQL 101 : Exception Handling

    Excellent article. However, there is one thing which is slightly erroneous: you don't need a type to be declared in the database to fetch the data, but you do need to declare a type.
    Here is one of my unit test scripts that does just that:
    DECLARE
      PN_CARDAPPL_ID NUMBER;
      v_Return       Cci_Standard.ref_cursor;
      -- local record type matching the ref cursor's projection
      type getcardapplattrval_recordtype is record
        (cardappl_id ci_cardapplattrvalue.cardappl_ID%TYPE,
         tag         ci_cardapplattrvalue.tag%TYPE,
         value       ci_cardapplattrvalue.value%TYPE);
      getcardapplattrvalue_record getcardapplattrval_recordtype;
    BEGIN
      PN_CARDAPPL_ID := 1; -- value must be supplied
      v_Return := CCI_GETCUSTCARD.GETCARDAPPLATTRVALUE(
                    PN_CARDAPPL_ID => PN_CARDAPPL_ID);
      loop
        fetch v_Return into getcardapplattrvalue_record;
        exit when v_Return%NOTFOUND;
        dbms_output.put_line('Cardappl_id=>'||getcardapplattrvalue_record.cardappl_id);
        dbms_output.put_line('Tag        =>'||getcardapplattrvalue_record.tag);
        dbms_output.put_line('Value      =>'||getcardapplattrvalue_record.value);
      end loop;
      close v_Return;
    END;

  • How to switch between SQL Server Express and SQL Server Evaluation

    I have both SQL Server Express and SQL Server Evaluation installed on my laptop. 
    I am wondering how to start SQL Server Evaluation; when I click All Programs > SQL Server Management Studio, I always get SQL Server Express (which I installed on my laptop a couple years ago). 
    I just downloaded SQL Server Evaluation (a couple days ago). 
    Whenever I try to access SQL Server Evaluation, I always seem to get SQL Server Express. 
    Sorry if this is a trivial thing; I just can’t seem to get it to work (and I Googled for a solution before posting this).
    Thanks everyone!!

    Ah!  It worked!!  This is great!!  It seems like I can connect to my version of SQL Server Evaluation, named 'EXCEL-PC\EXCEL'.
    Somehow, though, I seem to keep getting errors in my BIDS when I try to run an SSIS project.  See below:
    I am getting an error that VS is unable to load a document:
    BIDS has to be installed by one of these editions of SQL Server 2008: Standard, Enterprise, Developer, or Evaluation. 
    To install BIDS, run SQL Server Setup and select Business Intelligence Development Studio.
    #1)  I recently downloaded and installed SQL Server Evaluation (specifically for the opportunity to try using BIDS and learn about this technology).
    #2)  When I run BIDS > SSIS, I get the error I described above. 
    #3)  I have had SQL Server Express installed for a couple years on my laptop
    So, my question is this: is it possible that BIDS > SSIS is picking up my SQL Server Express, instead of SQL Server Evaluation? 
    If so, how can I get BIDS to point to SQL Server Evaluation? 
    You can see in the image above that I have my BIDS Data Connections pointing to EXCEL-PC\EXCEL, which is the name of my SQL Server Evaluation. 
    Is this correct?  Why doesn’t this work? 
    I’ve spent some considerable time on this.  I’m close to just uninstalling EVERYTHING pertaining to SQL Server, and just reinstalling SQL Server Evaluation, but this will take some time, and I don’t necessarily want
    to get rid of my SQL Server Express.  However, if that’s the only way to get this working, then that’s what I’ll do.
    I’d greatly appreciate any insight into this!!

  • SQL server service and SQL agent service

    Hi all. If I am going to use a Domain User to start SQl server service and SQL agent service,
    does the domain user need to  be SYSADMIN ????

    No, not at all required.
    Service account should have below permissions to start SQL service and Agent services.
    Log on as a service (SeServiceLogonRight)
    Replace a process-level token (SeAssignPrimaryTokenPrivilege)
    Bypass traverse checking (SeChangeNotifyPrivilege)
    Adjust memory quotas for a process (SeIncreaseQuotaPrivilege)
    Permission to start SQL Writer
    Permission to read the Event Log service
    Permission to read the Remote Procedure Call service
    Also please go though the below URL for more information:
    http://msdn.microsoft.com/en-us/library/ms143504.aspx#Windows

  • SQL server 2014 and Sql 2012 installation on VMWare machine

    Hi ,
    Please let me know: can we install SQL Server 2014 RTM and SQL Server 2012 SP1 on a VMware machine, version 9?
    Are they supported on VMware 9? If yes, please give me the links.
    Thanks a lot

    Hello,
    You can find that on the VMware web site for their high-end products only, as you can see in the following URL. For VMware
    9 you won't find it.
    http://www.vmware.com/resources/compatibility/sim/interop_matrix.php
    VMware 9 is a PC product that basically allows you to install any product that works on Windows 8 and Windows Server 2012 or earlier. VMware 9 introduces support for Windows 8 and Windows 2012.
    So, SQL Server 2014 and SQL Server 2012 should install on Windows 8/Windows 2012 RTM with no issues on a VMware 9 virtual machine. Earlier operating systems may require installation of service packs and updates, as explained in the following URLs.
    https://msdn.microsoft.com/en-us/library/ms143506(v=sql.120).aspx
    https://msdn.microsoft.com/en-us/library/ms143506(v=sql.110).aspx
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • How to add sql server (SQLEXPRESS) AND sql server Agent(SQLEXPRESS) in sql server Configuration Manager

    When I uninstalled SQL Server 2008 R2, I also uninstalled the instances (SQLEXPRESS and MSSQLSERVER), and then ran the SQL Server 2008 R2 setup again.
    Before uninstalling, Control Panel > Uninstall a program showed two setups:
    1) Microsoft SQL Server  2) Microsoft SQL Server R2; I uninstalled both.
    After installing SQL Server R2 again, it shows only one entry, i.e. "Microsoft SQL Server R2"; the other entry, "Microsoft SQL Server", is somehow missing, and
    the SQL Server (SQLEXPRESS) and SQL Server Agent (SQLEXPRESS) services are also missing from SQL Server Configuration Manager.
    How do I restore the missing instance and the above services?
    Please reply.

    Hi Tushar,
    I need to ask you a question here. 
    Why do you require the SQL Server Express edition on your machine? If there is no need, you can safely ignore it and continue using the SQL Server 2008 R2 edition.
    The Express edition can also get installed if you happen to have installed Visual Studio 2010, for instance.
    If you need it, then you must install the SQL Server Express edition; it is freeware and can be downloaded from the Microsoft website.
    BR, Shashikant

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    Found LOTS of Google hits when trying to data pump a .xlsx File into a SQL Server Table. And the whole discussion of the Microsoft ACE 64-Bit Driver or the Microsoft Jet 32-Bit Driver.
    Specifically receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
    Strangely enough, if I simply data pump ONE .xlsx File into a SQL Server Table utilizing my SSIS Package, it seems to work fine. If instead I am trying to be pro-active and allowing for multiple .xlsx Files by using a Foreach Loop Container and a variable
    @[User::FileName], it's erroring out...but not really because it is indeed storing the rows onto the SQL Server Table. I did check all my Delay
    Why does this have to be sooooooo difficult???
    Can anyone help me out here in trying to set up an SSIS Package in a rather constrictive environment to pump a .xlsx File into a SQL Server Table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error
    so that it stops erroring out?

    Hi ITBobbyP,
    According to your description, when you import data from an .xlsx file into a SQL Server database, you get the error message.
    The error can be caused by the following reasons:
    The Excel file is locked by other processes. Please resave the file under another file name to see if the issue is fixed.
    The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
    The Office version and the server bitness are not the same. To solve the problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • External Table and Direct path load

    Hi,
    I was just playing with the Oracle SQL*Loader and external table features. A few things I observed are that data loading through the direct path method of SQL*Loader is much faster and takes much less hard disk space than the external table method does. Here are the stats I found while loading the data:
    For Direct Path:
    # OF RECORDS    TIME           SOURCE FILE SIZE    DATAFILE SIZE (.dbf)
    478849          00:00:43.53    108,638 KB          142,088 KB
    957697          00:01:08.81    217,365 KB          258,568 KB
    1915393         00:02:54.43    434,729 KB          509,448 KB
    For External Table:
    # OF RECORDS    TIME           SOURCE FILE SIZE    DATAFILE SIZE (.dbf)
    478849          00:02:51.03    108,638 KB          966,408 KB
    957697          00:08:05.32    217,365 KB          1,930,248 KB
    1915393         00:17:16.31    434,729 KB          3,860,488 KB
    1915393         00:23:17.05    434,729 KB          3,927,048 KB    (with PARALLEL)
    I used the same files for testing and all other conditions were similar too. In my case the datafile is autoextendable, hence its size is automatically increased as required and free disk space is reduced accordingly. The issue is: is this expected behaviour, and why, in the case of external tables, is so much more disk space used when compared to the direct path method? Performance of the external table load is also very poor when compared to the direct path load.
    One more thing: while using the external table load with the PARALLEL option it should, ideally, take less time. But what I actually get is a longer time than without the PARALLEL option.
    In both cases I am loading data from the same file into the same table (once using direct path and once using an external table). Before every fresh load I truncate the internal table into which the data was loaded.
    any views??
    Deep
    Message was edited by:
    Deep

    Thanx to all for your suggestions.
    John, my scripts are as follows:
    for external table:
    CREATE TABLE LOG_TBL_LOAD
    (COL1 CHAR(20),     COL2 CHAR(2),     COL3 CHAR(20),     COL4 CHAR(400),
         COL5 CHAR(20),     COL6 CHAR(400),     COL7 CHAR(20),     COL8 CHAR(20),
         COL9 CHAR(400),     COL10 CHAR(400))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY EXT_TAB_DIR
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES ARE NULL)
    LOCATION ('LOGZ3.DAT'))
    REJECT LIMIT 10;
    for loading i did:
    INSERT INTO LOG_TBL (COL1, COL2, COL3, COL4, COL5, COL6,
    COL7, COL8, COL9, COL10)
    (SELECT COL1, COL2, COL3, COL4, COL5, COL6, COL7, COL8,
    COL9, COL10 FROM LOG_TBL_LOAD);
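    As an aside, a sketch (not part of the original test): a direct-path insert from the external table, comparable to SQL*Loader's direct path, can be requested with the APPEND hint, e.g.
    INSERT /*+ APPEND */ INTO LOG_TBL
    SELECT * FROM LOG_TBL_LOAD;
    COMMIT;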
    for direct path my control file is like this:
    OPTIONS (
    DIRECT = TRUE
    LOAD DATA
    INFILE 'F:\DATAFILES\LOGZ3.DAT' "str '\n'"
    INTO TABLE LOG_TBL
    APPEND
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    (COL1 CHAR(20),
    COL2 CHAR(2),
    COL3 CHAR(20),
    COL4 CHAR(400),
    COL5 CHAR(20),
    COL6 CHAR(400),
    COL7 CHAR(20),
    COL8 CHAR(20),
    COL9 CHAR(400),
    COL10 CHAR(400))
    And yes, I have used the same table in both situations. After each load I truncate my table, LOG_TBL. I used the same source file, LOGZ3.DAT.
    My tablespace USERS is locally managed.
    thanks

  • Oracle 11g - External Table Issue SQL - PL/SQL?

    Oracle 11g - External Table Issue?
    =====================
    I hope this is the right forum for this issue; if not, let me know where to go.
    We are using Oracle 11g (11.2.0.1.0) on (Platform : solaris[tm] oe (64-bit)), SQL Developer 3.0.04.
    We are trying to use an Oracle external table to load text files in .csv format. Here is what our data looks like.
    ======================
    Date1,date2,Political party,Name, ROLE
    20-Jan-66,22-Nov-69,Democratic,"John ", MMM
    22-Nov-70,20-Jan-71,Democratic,"John Jr.",MMM
    20-Jan-68,9-Aug-70,Republican,"Rick Ford Sr.", MMM
    9-Aug-72,20-Jan-75,Republican,Henry,MMM
    ALL NULL -- record
    20-Jan-80,20-Jan-89,Democratic,"Donald Smith",MMM
    ======================
    Our external table structure is as follows:
    CREATE TABLE P_LOAD
    (DATE1 VARCHAR2(10),
    DATE2 VARCHAR2(10),
    POL_PRTY VARCHAR2(30),
    P_NAME VARCHAR2(30),
    P_ROLE VARCHAR2(5))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY P_EXT_TAB_D
    ACCESS PARAMETERS (
    RECORDS DELIMITED by NEWLINE
    SKIP 1
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (DATE1 CHAR (10) Terminated by "," ,
    DATE2 CHAR (10) Terminated by "," ,
    POL_PRTY CHAR (30) Terminated by "," ,
    P_NAME CHAR (30) Terminated by "," OPTIONALLY ENCLOSED BY '"' ,
    P_ROLE CHAR (5) Terminated by ","))
    LOCATION ('Input.dat'))
    REJECT LIMIT UNLIMITED;
    It was created successfully using SQL Developer.
    Here is the issue.
    It is not loading the records where fields are enclosed in '"' (records 2, 3, 4, 7).
    It is loading the all-NULL record (record 6).
    *** If we remove the '"' from the input data, it loads all records, including the all-NULL records.
    Log file has
    KUP-04021: field formatting error for field P_NAME
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 2 rejected in file ....
    Our questions
    Why did "REJECT ROWS WITH ALL NULL FIELDS" not working?
    Why did Terminated by "," OPTIONALLY ENCLOSED BY '"' not working?
    Any idea?
    Thanks in helping.
    Edited by: qwe16235 on Jun 11, 2011 11:31 AM

    I'm not sure, but maybe you should get rid of the redundancy that you have in your CREATE TABLE statement.
    This line covers all fields:
    {code}
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    {code}
    So I would change the field list to:
    {code}
    DATE1 CHAR (10),
    DATE2 CHAR (10),
    POL_PRTY CHAR (30),
    P_NAME CHAR (30),
    P_ROLE CHAR (5)
    {code}
    It worked on my installation.
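    Putting that suggestion back into the original statement, it would look roughly like this (a sketch; directory and file names as in the original post):
    {code}
    CREATE TABLE P_LOAD
    (DATE1 VARCHAR2(10),
    DATE2 VARCHAR2(10),
    POL_PRTY VARCHAR2(30),
    P_NAME VARCHAR2(30),
    P_ROLE VARCHAR2(5))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY P_EXT_TAB_D
    ACCESS PARAMETERS (
    RECORDS DELIMITED by NEWLINE
    SKIP 1
    FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
    REJECT ROWS WITH ALL NULL FIELDS
    MISSING FIELD VALUES ARE NULL
    (DATE1 CHAR (10),
    DATE2 CHAR (10),
    POL_PRTY CHAR (30),
    P_NAME CHAR (30),
    P_ROLE CHAR (5)))
    LOCATION ('Input.dat'))
    REJECT LIMIT UNLIMITED;
    {code}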

  • External table.How to load numbers (decimal and scientific notation format)

    Hi all, I need to load into an external table records that contain 7 fields. The last field is called AMOUNT and it is represented in some records in decimal format and in other records in scientific notation format, as, for example, below:
    CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Jan;11200100;'60800;CYZ900;41380,77
    The External table's script is the following:
    CREATE TABLE HYP_DATA
    (COUNTRY VARCHAR2(50 BYTE),
    YEAR VARCHAR2(20 BYTE),
    PERIOD VARCHAR2(20 BYTE),
    ACCOUNT VARCHAR2(50 BYTE),
    DEPT VARCHAR2(20 BYTE),
    ACTIVITY_LOC VARCHAR2(20 BYTE),
    AMOUNT VARCHAR2(50 BYTE))
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY HYP_DATA_DIR
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'HYP_BAD_DIR':'HYP_LOAD.bad'
    DISCARDFILE 'HYP_DISCARD_DIR':'HYP_LOAD.dsc'
    LOGFILE 'HYP_LOG_DIR':'HYP_LOAD.log'
    SKIP 0
    FIELDS TERMINATED BY ";"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ("COUNTRY" Char,
    "YEAR" Char,
    "PERIOD" Char,
    "ACCOUNT" Char,
    "DEPT" Char,
    "ACTIVITY_LOC" Char,
    "AMOUNT" Char))
    LOCATION (HYP_DATA_DIR:'Total.txt'))
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    If, for the field AMOUNT, I use the datatype VARCHAR2 (as above), the table is loaded but I have some records rejected, and all these records contain the last field AMOUNT in scientific notation, such as:
    CY001_STATU;2009;Jan;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Feb;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Mar;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    CY001_STATU;2009;Dec;11220020GR;'03900;CYZ900;-9,99999999839929e-03
    All the others records with a decimal AMOUNT are loaded correctly.
    So, my problem is that I NEED to load all the records (with the decimal and the scientific notation format) together (without records rejected), but I don't know which datatype I have to use for the AMOUNT field....
    Anybody has any idea ???
    Any help would be appreciated
    Thanks in advance
    Alex

    @OP,
    What version of Oracle are you using?
    Just a cut'n'paste of your script and example worked FINE for me.
    However, my question is... an external table will LOAD all data or none at all. How are you validating/concluding that...
    I have some records rejected, and all these records contain the last field AMOUNT with the scientific notation
    select * from v$version where rownum <2;
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    select * from mydata;
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Feb     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    CY001_STATU     2009     Mar     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Dec     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11220020GR     '03900     CYZ900     -9,99999999839929e-03
    CY001_STATU     2009     Jan     11200100     '60800     CYZ900     41380,77
    MYDATA table script is...
    drop table mydata;
    CREATE TABLE mydata
    (COUNTRY VARCHAR2(50 BYTE),
    YEAR VARCHAR2(20 BYTE),
    PERIOD VARCHAR2(20 BYTE),
    ACCOUNT VARCHAR2(50 BYTE),
    DEPT VARCHAR2(20 BYTE),
    ACTIVITY_LOC VARCHAR2(20 BYTE),
    AMOUNT VARCHAR2(50 BYTE))
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY IN_DIR
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'IN_DIR':'HYP_LOAD.bad'
    DISCARDFILE 'IN_DIR':'HYP_LOAD.dsc'
    LOGFILE 'IN_DIR':'HYP_LOAD.log'
    SKIP 0
    FIELDS TERMINATED BY ";"
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ("COUNTRY" Char,
    "YEAR" Char,
    "PERIOD" Char,
    "ACCOUNT" Char,
    "DEPT" Char,
    "ACTIVITY_LOC" Char,
    "AMOUNT" Char))
    LOCATION (IN_DIR:'total.txt'))
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    vr,
    Sudhakar B.

  • External table and XML

    Hi.
    I have this XML-structure:
    <order>
    <orderhead>
    <orderno>123</orderno>
    <orderline>
    <name>name_of_what_i_order</name>
    </orderline>
    <orderline>
    <name>name_of_what_i_order</name>
    </orderline>
    </orderhead>
    </order>
    How can I get the <orderhead><orderno> when I'm making an external table for the <orderline> elements? I use 'records delimited by "</orderline>"' to get all the orderlines in one table row each. But how can I insert the orderno in each of these rows?

    Hi,
    You need to practice or test by yourself. Fortunately, I have a posting at http://www.oraclepoint.com/topic.php?filename=98&extra=page%3D1, which is a similar case I did a couple of years ago.
    What I did was:
    1. Format the XML
    2. Load it into an Oracle table using SQL*Loader
    3. Parse the elements one by one from the entire XML large object within the Oracle table
    4. Insert the parsed elements into an Oracle table
    Hope it helps. By the way, please register and then log in at oraclepoint.com to get the solution package, because it's only available to members.
    R.Wang
    http://www.oraclepoint.com
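    As an illustrative sketch of what steps 3 and 4 could look like (not from the reply above; it assumes the whole document has been made available as an XMLType, e.g. loaded into a CLOB/XMLType column first), XMLTABLE can return the parent <orderno> alongside each <orderline>:
    SELECT x.orderno, x.name
    FROM   xmltable('/order/orderhead/orderline'
             PASSING xmltype('<order><orderhead><orderno>123</orderno><orderline><name>name_of_what_i_order</name></orderline><orderline><name>name_of_what_i_order</name></orderline></orderhead></order>')
             COLUMNS orderno VARCHAR2(20)  PATH '../orderno',
                     name    VARCHAR2(100) PATH 'name') x;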
