SQL Loader and Oracle 11g

Hello,
I'm receiving an "sqlldr: not found" error. I'm getting ready to talk with our system admin about the situation, but before I do I wanted to make sure that SQL*Loader (sqlldr) is an available add-on for the Oracle 11g client. A co-worker mentioned that SQL*Loader may not be available as an add-on in the Oracle 11g client and that we would need to use import/export instead. Is this true?
Can someone please clarify these issues for me?
Thanks a bunch!

SQL*Loader is certainly available in either the 11.1 or 11.2 full client install. It may or may not be a component that is installed by default depending on the type of install you select during the installation process. But you can always go back and install that component.
If you are talking about the Instant Client, I'm not sure that either SQL*Loader or import and export work with the Instant Client.
And just to point it out, if you're using 11g, you would generally want to be using external tables rather than SQL*Loader.
Justin

Similar Messages

  • Is there any difference in Oracle 9i SQL Loader and Oracle 10g SQL Loader

    Hi
    Can anyone tell me whether there is any difference between the Oracle 9i SQL*Loader and the Oracle 10g SQL*Loader?
    I am upgrading the 9i DB to 10g and want to run the 9i SQL*Loader control files against the upgraded 10g DB. Please let me know whether there are any differences that would require modifications to the control files.
    Thank you in advance
    Adi

    answered

  • Help in calling sql loader and an oracle procedure in a script

    Hi Guru's
    Please help me write a Unix script that calls SQL*Loader and also an Oracle procedure.
    I wrote a script as follows:
    #!/bin/sh
    clear
    #export ORACLE_SID='HOBS2'
    sqlldr USERID=load/ps94mfo16 CONTROL=test_nica.ctl LOG=test_nica.log
    retcode=`echo $?`
    case "$retcode" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with EX_FAIL, see logfile" ;;
    2) echo "SQL*Loader execution exited with EX_WARN, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code";;
    esac
    sqlplus USERID=load/ps94mfo16 << EOF
    EXEC DO_TEST_SHELL_SCRIPT
    It is loading the data into the Oracle table,
    but the procedure is not executed.
    Any suggestion is highly appreciated.
    Cheers

    multiple duplicate threads:
    to call an oracle procedure and sql loader in an unix script
    Re: Can some one help he sql loader issue.

  • Primavera 6.2 SQL Database to Oracle 11g migration

    Hi,
    I used the migration tool to migrate a Primavera 6.2 SQL Server database to an Oracle 11g DB. The migration tool ran successfully and I am able to log in to the database using the P6.2 client with no problem; however, when attempting to update the license on the now-Oracle database using the privuser account I get the following error: "License could not be loaded because of error: List index out of bounds (247). Please make sure that you are using a privileged database login". The error comes up before I can browse to the location of the license.txt file. The same procedure on a "native" Oracle P6.2 database works perfectly with no problem. Has anyone run into this problem before? Any help will be appreciated.
    Best Regards,

    I have not seen that specifically, but it appears that not all of the grants were correctly applied.
    I would suggest running the schema validation tool, which can be found on the knowledge base as Solution ID prim75459, as this will tell you what is missing/invalid.
    Once you know that, you can pull the necessary grants from the create scripts located in the cd1\install\database\scripts folder.
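    For reference, a quick way to see which object grants the privileged account currently holds (assuming the account is literally named PRIVUSER), so you can compare against the create scripts:
    SELECT grantee, owner, table_name, privilege
    FROM dba_tab_privs
    WHERE grantee = 'PRIVUSER'
    ORDER BY owner, table_name, privilege;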

  • SQL *Loader and External Table

    Hi,
    Can anyone tell me the difference between SQL*Loader and external tables?
    Under what conditions should we use SQL*Loader versus an external table?
    Thanks

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin
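    As a rough illustration of that point (the directory path, table and file names below are made up), an external table definition embeds what is essentially SQL*Loader control-file syntax in its access parameters:
    CREATE DIRECTORY data_dir AS '/path/to/files';
    CREATE TABLE emp_ext (
    emp_id NUMBER,
    emp_name VARCHAR2(50)
    )
    ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('emp.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- Once created, the file contents can be queried like any other table:
    SELECT * FROM emp_ext;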

  • How do I migrate views from MS SQL 2008 to Oracle 11g through SQL Developer

    Is there any way to migrate the views from MS SQL 2008 to Oracle 11g through SQL Developer? Please give me some detailed steps. Thanks for your help.
    Kevin

    Hi Kevin,
    user13531850 wrote:
    Hi Turloch,
    When I use Migrate to Oracle I ran into a problem: the migration tool creates a new schema for me (AZTECA_KSMMS in my case) and migrates all the objects under that schema (AZTECA_KSMMS). However, my application needs all the Oracle data under the schema AZTECA instead of AZTECA_KSMMS. Is there any way to specify a particular schema (AZTECA) for the target Oracle database?
    Schema remapping is available:
    First Capture (separately), then during right-click Convert on the captured model there is a "Specify the conversion options" step with an Object Naming tab where the schema (and other) name changes are editable.
    I have not used this recently.
    Also, during the migration process, when I choose the repository there is a check box "truncate to reset repository to empty state". Do I need to check that truncate check box so the repository will be cleared from the last migration?
    The repository can hold multiple migration attempts. Check truncate to get rid of the information from previous attempts. This cleans up the repository - not the destination database.
    There are also online database and offline database options during the migration process; what is the difference between these two choices? After I migrated to Oracle, all my views have a red cross icon next to them. Does that mean the view migration failed? Please give me your comments. Thanks for your help.
    offline: for big (amount of data) databases with simple data types,
    uses bcp + files + scripts + sqlldr.
    online: for small (amount of data) databases (easier),
    uses (Java) JDBC.
    The view is likely to be broken - recompiling it may help.
    The Oracle schema is created using a .sql file - see under "generated" in the directory you gave originally in the wizard. There is a .out file that contains the result of running this script, including any errors. During conversion there are also likely to be warnings displayed on the UI.
    There may be a single issue that is causing multiple issues - if viewa depends on functionb, and functionb is broken, viewa will also fail.
    Turloch
    SQLDeveloper Team
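    As a rough sketch, the views left invalid by the migration can be listed and recompiled from SQL (the view name is illustrative):
    SELECT object_name
    FROM user_objects
    WHERE object_type = 'VIEW'
    AND status = 'INVALID';
    ALTER VIEW my_migrated_view COMPILE;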

  • Compare SQL Server and Oracle syntax

    Hello, I am very new to the Oracle DB as I have been working with MS SQL Server for many years. I have a SQL statement that works fine in MSSQL but seems to not work in Oracle. Is this a syntax problem or do I need to look elsewhere?
    DECLARE @as_master_key varchar(26)
    EXEC [dbo].[GET_UNIQUE_MASTER_KEY]
         @as_project_name = N'CMDEMO',
         @as_database_site = N'CMPL',
         @as_user_initials = N'SK',
         @as_table = N'SUBMITTAL_CATEGORY',
         @as_master_key = @as_master_key OUTPUT
    SELECT @as_master_key AS N'master_key'
    This is what I use in MSSQL and it works fine - but I am porting this application to an Oracle backend and it fails. Any suggestions?
    Thanks in advance to this newbie. :)

    SQL Server and Oracle are two completely different RDBMSs.
    You'd be best not to try to port SQL Server code to Oracle SQL or PL/SQL directly, as it will not make proper use of Oracle's features.
    I'm not sure what your code is supposed to be doing (I find SQL Server code awful to read, and never understand the need for "@" before variable names; that just seems to show a lazy parser, or a lazy coder who designed the parser).
    From what I can guess it's going to be something like...
    SQL code...
    create sequence my_master_key;
    PL/SQL code...
    DECLARE
      as_master_key number;
      as_project_name varchar2(20) := 'CMDEMO';
      as_database_site varchar2(20) := 'CMPL';
      as_user_initials varchar2(5) := 'SK';
      as_table varchar2(30) := 'SUBMITTAL_CATEGORY';
    BEGIN
      select my_master_key.nextval
      into as_master_key
      from dual;
      -- In 11g you could replace this select with just:
      -- as_master_key := my_master_key.nextval;
    END;
    This gets a unique sequence value into a variable and declares all the other variables. What you want to do with them after that I don't know.

  • Terminology differences between MSSQL 2005/2008 and Oracle 11g

    I'm looking for some articles/books/webinars/blogs which discuss the terminology differences between MSSQL 2005/2008 and Oracle 11g in terms of database concepts. I have found this article http://www.sqlservercentral.com/articles/SQL+Server/67007 so far.
    Thanks.

    Thanks sb92075. Those results are almost all about comparing features/functionality between MSSQL and Oracle, not about distinguishing the terminology differences between them.

  • Error mapping, OGG from SQL 2008 to Oracle 11g

    Hi Experts
    I hope you can help me. I am using OGG to replicate from SQL Server 2008 to Oracle 11g R2. I can insert rows and they replicate without problems, but when I update or delete on the source, the Replicat stops working.
    This is the message:
    2012-06-11 09:51:57 ERROR OGG-01296 Oracle GoldenGate Delivery for Oracle, msrep.prm: Error mapping from FACTURA.EMP to GOLDENGATE.EMP.
    2012-06-11 09:51:57 ERROR OGG-01668 Oracle GoldenGate Delivery for Oracle, msrep.prm: PROCESS ABENDING.
    --- Discard FIle
    Key column FIRST_NAME (1) is missing from delete on table GOLDENGATE.EMP
    Key column LAST_NAME (2) is missing from delete on table GOLDENGATE.EMP
    Missing 2 key columns in delete for table GOLDENGATE.EMP.
    Current time: 2012-06-11 09:51:57
    Discarded record from action ABEND on error 0
    Aborting transaction on /u01/app/goldenGate/dirdat/ms beginning at seqno 0 rba 13729
    error at seqno 0 rba 13729
    Problem replicating FACTURA.EMP to GOLDENGATE.EMP
    Mapping problem with delete record (target format)...
    ID = 121
    Process Abending : 2012-06-11 09:51:57
    ------ SOURCE TABLE
    create table [factura].[emp] (
    [id] [smallint] not null,
    [first_name] varchar(50) not null,
    [last_name] varchar(50) not null,
    constraint [emp_pk] primary key clustered (
    [id] asc
    ) with (pad_index = off, statistics_norecompute=off, ignore_dup_key=off, allow_row_locks=on, allow_page_locks=on) on [primary]
    ) on [primary]
    ------ TARGET TABLE
    CREATE TABLE GOLDENGATE.EMP (
    ID NUMBER,
    FIRST_NAME VARCHAR2(50 BYTE),
    LAST_NAME VARCHAR2(50 BYTE)
    )
    Thank you
    J.A.

    Hi
    I altered the table in Oracle and added the primary key:
    alter table emp add constraint
    pk_emp primary key (ID);
    Now GG replicates inserts, deletes and updates.
    J.A.
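    As a quick sanity check (a sketch, run as the GOLDENGATE user), this confirms the primary key now exists and is enabled on the target table:
    SELECT constraint_name, constraint_type, status
    FROM user_constraints
    WHERE table_name = 'EMP'
    AND constraint_type = 'P';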

  • HELP: SQL*LOADER AND Ref Column

    Hello,
    I have already posted about this; I really need help and am not getting any further with it.
    I have the following problem. I have two tables which I created the following way:
    CREATE TYPE gemark_schluessel_t AS OBJECT(
    gemark_id NUMBER(8),
    gemark_schl NUMBER(4),
    gemark_name VARCHAR2(45)
    );
    CREATE TABLE gemark_schluessel_tab OF gemark_schluessel_t(
    constraint pk_gemark PRIMARY KEY(gemark_id)
    );
    CREATE TYPE flurstueck_t AS OBJECT(
    flst_id NUMBER(8),
    flst_nr_zaehler NUMBER(4),
    flst_nr_nenner NUMBER(4),
    zusatz VARCHAR2(2),
    flur_nr NUMBER(2),
    gemark_schluessel REF gemark_schluessel_t,
    flaeche SDO_GEOMETRY
    );
    CREATE TABLE flurstuecke_tab OF flurstueck_t(
    constraint pk_flst PRIMARY KEY(flst_id),
    constraint uq_flst UNIQUE(flst_nr_zaehler,flst_nr_nenner,zusatz,flur_nr),
    flst_nr_zaehler NOT NULL,
    flur_nr NOT NULL,
    gemark_schluessel REFERENCES gemark_schluessel_tab
    );
    Now I have data in the gemark_schluessel_tab which looks like this (a sample):
    1 101 Borna
    2 102 Draisdorf
    Now I want to load data into my flurstuecke_tab with SQL*Loader, and there I have problems with my REF column gemark_schluessel.
    One data record looks like this in my file (it is without geometry):
    1|97|7||1|1|
    If I try to load this data record, it does not work. The reference (the system-generated OID) should be taken from gemark_schluessel_tab.
    LOAD DATA
    INFILE *
    TRUNCATE
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE FLURSTUECKE_TAB
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS (
    flst_id,
    flst_nr_zaehler,
    flst_nr_nenner,
    zusatz,
    flur_nr,
    gemark_schluessel REF(CONSTANT 'GEMARK_SCHLUESSEL_TAB',GEMARK_ID),
    gemark_id FILLER
    )
    BEGINDATA
    1|97|7||1|1|
    Is there an error in what I did?
    Thanks in advance
    Tig

    multiple duplicate threads:
    to call an oracle procedure and sql loader in an unix script
    Re: Can some one help he sql loader issue.

  • Using SQL*Loader and UTL_FILE to load and unload large files(i.e PDF,DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files from the Oracle table back to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    " sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt "
    Control file is written as :
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO table r_sqlldr
    Fields terminated by ','
    (
    id sequence (max,1),
    fname char(20),
    data LOBFILE(fname) terminated by EOF )
    It loads the files (PDF, image and more) that are listed in the file r_sqlldr.txt into the Oracle table r_sqlldr.
    The text file (used as the source) is written as:
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    After this load, I used UTL_FILE to unload the data and wrote a procedure like this:
    CREATE OR REPLACE PROCEDURE R_UTL AS
    l_file UTL_FILE.FILE_TYPE;
    l_buffer RAW(32767);
    l_amount BINARY_INTEGER ;
    l_pos INTEGER := 1;
    l_blob BLOB;
    l_blob_len INTEGER;
    BEGIN
    SELECT data
    INTO l_blob
    FROM r_sqlldr
    where id= 1;
    l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
    DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
    IF (l_blob_len < 32767) THEN
    l_amount :=l_blob_len;
    ELSE
    l_amount := 32767;
    END IF;
    DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
    l_file := UTL_FILE.FOPEN('DBDIR1','Kalam_out.pdf','w', 32767);
    DBMS_OUTPUT.PUT_LINE('File opened');
    WHILE l_pos < l_blob_len LOOP
    DBMS_LOB.READ (l_blob, l_amount, l_pos, l_buffer);
    DBMS_OUTPUT.PUT_LINE('Blob read');
    l_pos := l_pos + l_amount;
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    DBMS_OUTPUT.PUT_LINE('writing to file');
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.NEW_LINE(l_file);
    END LOOP;
    UTL_FILE.FFLUSH(l_file);
    UTL_FILE.FCLOSE(l_file);
    DBMS_OUTPUT.PUT_LINE('File closed');
    DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
    WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
    UTL_FILE.FCLOSE(l_file);
    END IF;
    DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads data from the r_sqlldr table (BLOBs) to files on the operating system.
    -> The same procedure, with minor changes, is used to unload other similar files such as images and text files.
    In the above example: Loading: the 3 files 1) Kalam.pdf 2) CTSlogo1.bmp 3) any1.txt are loaded into 3 rows of the Oracle table r_sqlldr respectively,
    file names into the fname column and the corresponding data into the data (BLOB) column.
    Unload: these files are then written back to the operating system in their previous format using the UTL_FILE feature of Oracle.
    So the PROBLEM IS: the files come back at roughly their original size, but with the quality degraded. The PDF file doesn't even display its content; the size is almost equal to the source file, but the data is lost when I open it.
    And for images: the images are loaded and unloaded, but with the colors changed.
    Oracle features like FFLUSH have also been used, but it never worked.
    ANY SUGGESTIONS OR ALTERNATE SOLUTIONS TO LOAD AND UNLOAD PDFs THROUGH ORACLE ARE REQUESTED.
    ------------------------------------------------------------------------------------------------------------------------

    Thanks Justin for the quick response.
    Well, I am loading data into a BLOB only, using SQL*Loader.
    I've never used dbms_lob.loadFromFile to do the loads.
    I opened a file on the network and then used dbms_lob.read and
    UTL_FILE.PUT_RAW to read and write data into the target file.
    Actually, my process is working fine with text files but not with PDFs and images.
    As for your question "Is the data the proper length after reading it in?", I'm not sure what you are asking, but regarding data length I don't think there is a problem, except that the source PDF is 90.4 KB and the target is 90.8 KB.
    That's it.
    So I ask you to add some more help, or should I provide some more details?
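    For what it's worth, a common cause of corrupted PDFs and images when unloading with UTL_FILE is opening the file in text mode ('w') and calling NEW_LINE, which inserts line terminators into binary data. A sketch of the unload loop using byte mode instead, reusing the table and directory names from the post above:
    DECLARE
    l_file UTL_FILE.FILE_TYPE;
    l_buffer RAW(32767);
    l_amount BINARY_INTEGER;
    l_pos INTEGER := 1;
    l_blob BLOB;
    l_blob_len INTEGER;
    BEGIN
    SELECT data INTO l_blob FROM r_sqlldr WHERE id = 1;
    l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
    -- 'wb' opens the file in byte mode, which is required for binary content
    l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767);
    WHILE l_pos <= l_blob_len LOOP
    l_amount := LEAST(32767, l_blob_len - l_pos + 1);
    DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
    -- no NEW_LINE calls: extra line terminators corrupt binary files
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
    END;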

  • Import and process larger data with SQL*Loader and Java resource

    Hello,
    I have a project to import data from a text file on a schedule - a large amount of data, nearly 20,000 records per hour.
    After that, we have to analyse the data and export the results into another database.
    I have researched SQL*Loader and a Java resource to do these tasks, but I have no experience with them.
    I'm afraid that with this much data Oracle could slow down, or the session in the Java resource application could time out.
    Please give me some advice about the solution.
    Thank you very much.

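    One possible sketch, with all object names invented here: expose the text file as an external table and let DBMS_SCHEDULER run the hourly load inside the database, so no long-running Java session is needed:
    BEGIN
    DBMS_SCHEDULER.CREATE_JOB(
    job_name => 'LOAD_FEED_HOURLY',
    job_type => 'PLSQL_BLOCK',
    job_action => 'BEGIN INSERT INTO staging_feed SELECT * FROM feed_ext; COMMIT; END;',
    start_date => SYSTIMESTAMP,
    repeat_interval => 'FREQ=HOURLY',
    enabled => TRUE);
    END;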

  • Entity Framework - Code First - Migration - How to access SQL Server and Oracle using the same context?

    Hello,
    I use Entity Framework code first approach.
    My project is working fine with SQL Server, but I want to access Oracle too and to switch between SQL Server and Oracle at run time.
    I am able to access Oracle using "Oracle.ManagedDataAccess.EntityFramework.dll" in a new project.
    But is it possible to access SQL Server and Oracle in the same project?
    Thanks,
    Murugan

    This should be possible with a Code-First workflow.  In Code-First the database mapping layer is generated at runtime.
    David
    David http://blogs.msdn.com/b/dbrowne/

  • Different output of same query in SQL Server and Oracle

    I have two tables table1 and table2
    --table1 has two columns, c1 int and c2 varchar. There are no constraints on it. It has the data given below:
    c1     c2
    6     d
    5     j
    102     g
    4     g
    103     f
    3     h
    501     j
    1     g
    601     n
    2     m
    --table2 has only one column, c1 int. There are no constraints on it. It has the data given below:
    c1
    6
    1
    4
    3
    2
    Now when I run the query given below in SQL Server and Oracle, it gives me different results:
    select *
    from table1
         inner join (SELECT ROW_NUMBER() OVER (order by c1 ASC) AS c1 from table2) table2 on table2.c1=table1.c1
    sql server output
    c1     c2     c1
    1     g     1
    2     m     2
    3     h     3
    4     g     4
    5     j     5
    oracle output
    C1 C2 C1
    5 j 5
    4 g 4
    3 h 3
    1 g 1
    2 m 2
    If you look at the first column in both outputs, it is sorted in SQL Server and not in Oracle.
    Why is it behaving differently in Oracle? Is there any way I can solve this in Oracle?
    Thanks,
    Jigs

    It is NOT behaving "differently" in Oracle; you just haven't specified the order that you expect your results to be in, so you're going to get output in whatever order the database fancies displaying it (i.e. no guaranteed order). This is an artifact of how the database chooses to put together the data, and different databases (or even datasets within the same database) can and most likely will behave differently.
    Even SQL Server won't guarantee to always return your data in an ordered fashion if you exclude the ORDER BY clause, even if you think it has always output the data in an ordered fashion.
    Your solution is to add an ORDER BY clause, in BOTH databases, to force the order of the output data.
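    For example, a sketch of the same query with an explicit ORDER BY, which makes the result order deterministic in both SQL Server and Oracle:
    select t1.c1, t1.c2, t2.c1
    from table1 t1
    inner join (SELECT ROW_NUMBER() OVER (order by c1 ASC) AS c1 from table2) t2 on t2.c1 = t1.c1
    order by t1.c1;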

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10,000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader), but I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) for 10,000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic. As I said, I'm not going to use external tables. This post is about the performance difference between SQL*Loader and a simple INSERT statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file as it doesn't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information obviously). It also eliminates the need to have a 'staging' table on the database to load the data into as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table automatically without the need to re-run any SQL*Loader process again.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.
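    To make that concrete, a sketch assuming an external table (say staging_ext; all names here are illustrative) has been defined over the flat file: the file can be queried or loaded into a staging table with plain SQL, no separate loader step required:
    -- query the flat file directly
    SELECT COUNT(*) FROM staging_ext;
    -- or load it into the existing staging table in one statement
    INSERT /*+ APPEND */ INTO staging_table
    SELECT * FROM staging_ext;
    COMMIT;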
