No DB link / no SQL*Loader

I don't have any utilities available here for loading data into my table. I only have SQL*Plus; I don't even have Unix access, and no DB link access either.
Is there any way, using only the SQL*Plus I have available, to load my data from one database into another?
I know I should use a DB link here, but there is no DBA available to create one for me. I thought of exporting the data and loading it into my table, but I don't have SQL*Loader.
Is there any suggestion you can give?

I've got this error using COPY:
DEV>copy from santov/santov@factprod -
to santov/santov@factdev1 -
create new_emp -
using select * from employees;
Array fetch/bind size is 15. (arraysize is 15)
Will commit when done. (copycommit is 0)
Maximum long size is 80. (long is 80)
ERROR:
ORA-00257: archiver error. Connect internal only, until freed.
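
For what it's worth, the COPY approach itself is sound; ORA-00257 means the archive log destination on the target database is full, and only someone with DBA access can free or extend it, so batching commits will not make the error go away on its own. Once that is cleared, these SQL*Plus settings are the ones commonly tuned for large COPY runs (a sketch reusing the connect strings above):
set long 2000000
set arraysize 1000
set copycommit 10
copy from santov/santov@factprod -
  to santov/santov@factdev1 -
  create new_emp -
  using select * from employees;
With arraysize 1000 and copycommit 10, COPY commits roughly every 10,000 rows instead of holding everything until the end.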

Similar Messages

  • What would be the maximum datafile size that SQL*Loader can support

    Hi,
    I would like to load a datafile (an xls file of nearly 5 GB) into an Oracle table using SQL*Loader. Could you please tell me the maximum datafile size that can be loaded with SQL*Loader?
    Thanks
    VAMSHI

    Hello,
    The size limit is mainly imposed by the OS, so check what your OS supports: SQL*Loader datafiles are unlimited on *64-bit* but limited to *2 GB* on *32-bit* operating systems:
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10839/appg_db_lmts.htm#UNXAR382
    Otherwise, you should be able to load this data into the table. You just have to check that there is enough space in the tablespace and/or on the disk (if the tablespace has to be extended).
    Please find below a link about SQL*Loader; scroll down to Limits / Defaults:
    http://www.ordba.net/Tutorials/OracleUtilities~SQLLoader.htm
    Hope this helps.
    Best regards,
    Jean-Valentin
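
    For the tablespace check mentioned above, a query such as the following can be used (a sketch, assuming you have access to DBA_FREE_SPACE; USER_FREE_SPACE works for your own tablespaces otherwise):
    select tablespace_name, round(sum(bytes)/1024/1024) as free_mb
    from   dba_free_space
    group  by tablespace_name;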

  • SQL*Loader datafile size

    Is there anyone who can tell me whether SQL*Loader has a file size restriction on the datafiles it loads?
    Thx!

    Hello,
    The size limit is mainly imposed by the OS, so check what your OS supports: SQL*Loader datafiles are unlimited on *64-bit* but limited to *2 GB* on *32-bit* operating systems:
    http://download.oracle.com/docs/cd/E11882_01/server.112/e10839/appg_db_lmts.htm#UNXAR382
    Otherwise, you should be able to load this data into the table. You just have to check that there is enough space in the tablespace and/or on the disk (if the tablespace has to be extended).
    Please find below a link about SQL*Loader; scroll down to Limits / Defaults:
    http://www.ordba.net/Tutorials/OracleUtilities~SQLLoader.htm
    Hope this helps.
    Best regards,
    Jean-Valentin

  • Load data with SQL*Loader: link fields between the CSV file and the control file

    Hi all,
    in a SQL*Loader control file, how do you map a table column to a specific field of the CSV file?
    E.g. I want to import records into table TEST (col1, col2, col3) from a CSV file where the data sits in different positions. How can this be done?
    CSV file (with the values in varying positions):
    test1;prova;pippo;Ferrari;
    xx;yy;hello;by;
    In table TEST I want col1 = 'prova' (xx),
    col2 = 'Ferrari' (yy),
    col3 = default 'N';
    the other data in the CSV file should be ignored.
    so:
    load data
    infile 'TEST.CSV'
    into table TEST
    fields terminated by ';'
    col1 ?????,
    col2 ?????,
    col3 CONSTANT "N"
    Thanks,
    Attilio

    By the '?' marks I mean: how can I link this COL1 with the right column in the CSV file?
    Attilio
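
    One common way to do this (a sketch only; the field names skip_1/skip_3 are arbitrary, and the file is assumed to always carry the wanted values in positions 2 and 4) is to declare the unwanted CSV fields as FILLER, so they are parsed but never loaded:
    load data
    infile 'TEST.CSV'
    into table TEST
    fields terminated by ';'
    trailing nullcols
    ( skip_1  FILLER CHAR,   -- first CSV field, read and discarded
      col1    CHAR,          -- second CSV field ('prova')
      skip_3  FILLER CHAR,   -- third CSV field, read and discarded
      col2    CHAR,          -- fourth CSV field ('Ferrari')
      col3    CONSTANT "N"
    )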

  • How to do it with SQL*Loader

    All,
    I have two tables, HEADER_TABLE and LINE_TABLE. Each header record can have multiple line records. I have to load data from a flat file into these tables. The flat file can have two types of records: H (header) and L (line). It looks as follows; each H record can have multiple corresponding L records:
    H..........
    L.......
    L......
    L......
    H.........
    L.......
    L......
    L......
    I have HEADER_ID column in HEADER_TABLE and HEADER_ID, LINE_ID columns in the LINE_TABLE.
    While loading data using SQL Loader, I need to generate HEADER_ID and LINE_ID values as follows and load them.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 2><LINE_ID = 4>
    L...... <HEADER_ID = 2><LINE_ID = 5>
    L...... <HEADER_ID = 2><LINE_ID = 6>
    Is it possible to do this with SQL*Loader?
    I tried to do this with sequences, but it loaded the tables as follows.
    H..........<HEADER_ID = 1>
    L....... <HEADER_ID = 1><LINE_ID = 1>
    L...... <HEADER_ID = 1><LINE_ID = 2>
    L...... <HEADER_ID = 1><LINE_ID = 3>
    H......... <HEADER_ID = 2>
    L....... <HEADER_ID = 1><LINE_ID = 4>
    L...... <HEADER_ID = 1><LINE_ID = 5>
    L...... <HEADER_ID = 1><LINE_ID = 6>
    Thanks
    Ketha

    Morgan,
    The examples given in the link are quite generic and I have tried them, but my requirement is focused on generating HEADER_ID and LINE_ID values as I have described. It seems that SQLLDR scans all the records for a particular WHEN clause and inserts them into the specified table. I think that if SQLLDR can be made to read the records in the data file sequentially, this can be done (see the sketch after this reply).
    Any idea how to make SQLLDR read the records from the file sequentially?
    Thanks
    Ketha
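
    As a hedged sketch (the table layouts and positions below are illustrative, not taken from the actual file), one way to make the file order recoverable is to load a RECNUM column into both tables; RECNUM stores the physical record number, so even though the WHEN clauses route records to different tables, each L row can afterwards be matched to the H row with the largest RECNUM smaller than its own:
    LOAD DATA
    INFILE 'header_line.dat'
    APPEND
    INTO TABLE header_table
      WHEN (1:1) = 'H'
      ( file_recnum  RECNUM,               -- physical record number in the data file
        rec_type     POSITION(1:1)   CHAR,
        header_data  POSITION(2:200) CHAR )
    INTO TABLE line_table
      WHEN (1:1) = 'L'
      ( file_recnum  RECNUM,
        rec_type     POSITION(1:1)   CHAR,
        line_data    POSITION(2:200) CHAR )
    HEADER_ID and LINE_ID can then be derived from file_recnum with a post-load UPDATE or MERGE; the later thread "SQL*Loader issue with WHEN command" reports the same RECNUM approach working.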

  • How can I use SQL*Loader to load a text file into a table

    Hi, I need to load a text file with tab-delimited records into a table. How can I use SQL*Loader to do this? I am using the Korn shell, and I am very new at this, so any helpful examples or documentation would be much appreciated. I would love to see some examples to help me understand, if possible. Thanks a lot!

    You should check out the documentation on SQL*Loader in the online Oracle document titled Utilities. Here's a link to the 9iR2 version of it: http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/server.920/a96652/part2.htm#436160
    Hope this helps.
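
    As a rough illustration (table and column names here are placeholders, not from your data), a control file for a tab-delimited file can give the delimiter as the hex tab character X'09':
    LOAD DATA
    INFILE 'mydata.txt'
    APPEND
    INTO TABLE my_table
    FIELDS TERMINATED BY X'09'   -- tab character
    TRAILING NULLCOLS
    ( col_a,
      col_b,
      col_c )
    From the Korn shell it would then be run with something like: sqlldr userid=scott/tiger control=mydata.ctl log=mydata.log (the credentials are placeholders).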

  • SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader

    I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
    The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
    The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
    The first time I loaded the data I did it from the command line with SQL*Loader.
    Now when I try to load the data I get this message:
    SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
    ORA-12154: TNS:could not resolve the connect identifier specified
    I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
    I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
    However, SQL Developer will not let me load a file this big.
    I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
    So it seems like SQL*Loader is the only option.
    I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
    Not sure what else to try or where to look.
    thanks

    Hi,
    You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in the link you were already pointed at) is the following (I assume you are on Windows?):
    open a command prompt
    set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
    This will tell Oracle to use the config files it finds there and no others.
    Then try sqlldr user/pass@db (in the same DOS window)
    and see if that connects, and let us know.
    Cheers,
    Harry
    http://dbaharrison.blogspot.com
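
    For reference, a minimal tnsnames.ora entry in that TNS_ADMIN directory would look roughly like this (the alias, host, port and service name below are placeholders):
    MYDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = mydb.example.com))
      )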

  • Comparison of Data Loading Techniques - SQL*Loader & External Tables

    Below are two techniques that can be used to load data from flat files into Oracle tables.
    1)     SQL*Loader:
    a.     Place the flat file (.txt or .csv) in the desired location.
    b.     Create a control file, for example:
    LOAD DATA
    INFILE "Mytextfile.txt"        -- file containing the table data; specify the path correctly, it could be .csv as well
    APPEND                         -- or TRUNCATE, based on requirement
    INTO TABLE oracle_table
    FIELDS TERMINATED BY ","       -- or whatever delimiter the input file uses
    OPTIONALLY ENCLOSED BY '"'
    (field1, field2, field3)
    c.     Now run Oracle's sqlldr utility from the command prompt:
    sqlldr username/password control=filename.ctl
    d.     The data can be verified by selecting it from the table:
    Select * from oracle_table;
    2)     External Table:
    a.     Place the flat file (.txt or .csv) on the desired location.
    abc.csv
    1,one,first
    2,two,second
    3,three,third
    4,four,fourth
    b.     Create a directory
    create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
    c.     After granting appropriate permissions to the user, we can create an external table like the one below.
    create table ext_table_csv (
      i Number,
      n Varchar2(20),
      m Varchar2(20)
    )
    organization external (
      type oracle_loader
      default directory ext_dir
      access parameters (
        records delimited by newline
        fields terminated by ','
        missing field values are null
      )
      location ('abc.csv')
    )
    reject limit unlimited;
    d.     Verify data by selecting it from the external table now
    select * from ext_table_csv;
    The external tables feature is a complement to the existing SQL*Loader functionality.
    It allows you to –
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely.
    •     Do a parallel direct path load without having to split up the input file yourself.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
    Using external tables (rather than SQL*Loader) you can also –
    •     Load the data from a stored procedure or trigger (a plain INSERT ... SELECT, which sqlldr cannot do).
    •     Do multi-table inserts.
    •     Flow the data through a pipelined PL/SQL function for cleansing/transformation.
    Comparison for data loading:
    To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
    With an external table, the database will then divide the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have to manually divide your input file into multiple smaller files.
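    As a hedged illustration of that parallel load (assuming the target table oracle_table from above has columns matching ext_table_csv):
    alter session enable parallel dml;
    insert /*+ append parallel(t, 4) */ into oracle_table t
    select /*+ parallel(e, 4) */ * from ext_table_csv e;
    commit;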
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables into regular Oracle tables (with a plain INSERT ... SELECT, even over a DB link to a remote database).

    Please let me know your views on this.

  • SQL*Loader and timestamp values

    I'm loading timestamp values (among other data) from a text file into Oracle using SQL*Loader.
    The data I load is formatted like the following:
    2008/11/13 23:55:21.366
    2008/11/13 23:55:22.782
    2008/11/13 23:55:25.879
    Hence my control file looks like this:
    TSTAMP TIMESTAMP "YYYY/MM/DD HH24:MI:SS.FF",
    The timestamp data in the input file are in UTC, however I load into a column that is of type "TIMESTAMP(3)", i.e. in server time. This is on purpose. I do not want the data to be in UTC when I look at them.
    Therefore after the load I have to manually do
    UPDATE mytable
    SET tstamp = tstamp + (tstamp - sys_extract_utc(tstamp));
    This update-after-load actually takes quite some time due to the number of records. I would like to avoid it and instead do the conversion as part of the load. Is this possible somehow?
    I'm on Oracle 10.2.
    Thanks.

    Hi
    What about setting a special Time Zone in your database?
    You have two options when setting which time zone the database belongs to. You can either qualify it as a displacement from GMT/UTC in the format 'hh:mm', or you can specify it as a name that has an entry in V$TIMEZONE_NAMES.
    select tzname,tzabbrev from V$TIMEZONE_NAMES;
    select DBTIMEZONE from dual;
    ALTER database SET TIME_ZONE = 'Europe/Copenhagen';
    select SESSIONTIMEZONE from dual;
    select CURRENT_TIMESTAMP from dual;
    See that link
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/create.htm#sthref319
    Edited by: Hub on Dec 7, 2008 3:58 PM
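
    As an alternative to touching the database time zone, the UTC-to-session-time conversion could arguably be folded into the control file itself by reading the field as CHAR and applying a SQL expression (a sketch, assuming a conventional path load and the TSTAMP column from the post):
    TSTAMP CHAR(23) "CAST(FROM_TZ(TO_TIMESTAMP(:TSTAMP, 'YYYY/MM/DD HH24:MI:SS.FF3'), 'UTC') AT LOCAL AS TIMESTAMP)",
    This would make the separate UPDATE unnecessary, at the cost of evaluating the expression per row during the load.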

  • SQL*Loader and Error ORA-01847/ORA-01839

    Hi,
    While using direct path loading in SQL*Loader, when we get ORA-01847/ORA-01839 all the other records get rejected as well. It works fine with conventional path loading.
    Should I use some parameter or anything else to make sure that the other records are not rejected when we hit an ORA-01847/ORA-01839 error during DIRECT path loading?
    Thanks
    Jibin

    On the internet I found this short note:
    “AL32UTF8 is a multi-byte character set, which means some characters are stored in more than 1 byte; that's true for these special characters.
    If you have same table definitions in both databases you likely face error ORA-12899.
    This metalink note discusses this problem, it's also applicable to sqlloader:
    Import reports "ORA-12899: Value too large for column" when using BYTE semantic
    Doc ID: Note:563893.1”
    On Metalink, I can see that the Note is linked to an Oracle internal bug for Oracle 11g.....
    I'm awaiting your suggestion... thanks very much in advance.
    Regards.
    Giovanni

  • Regarding SQL*Loader - can the control file do a check on the CSV?

    Hi,
    I normally get a csv having data as
    column1 ;columnb;columnc;
    13 ; 12 ; 13 ;
    11 ;13 ;33;
    The table where it needs to go is, say, table
    xys (a number, b number, c number),
    so the control file is fairly simple...
    But from now on I need to reject the load if the format of the CSV changes,
    say if it is like
    column2;column1;column3,
    12,13;12;
    11;13;14;
    or say the csv like
    column1;column2;column3;column4;
    11;13;14;15;
    111;134;14;12;
    In both cases SQL*Loader should not load anything and should report the reason in the log.
    How do I manage this in the control file?
    Any ideas?
    It is urgent, please help!
    regards
    SHUBH
    Message was edited by:
    SHUBH

    Try changing the following properties:
    autosubmit = true;
    immediate = true;
    Here is a link where the user uses a transient attribute for the rows in his table:
    Technology on my way...:): ADF 11g : CheckBox Demo (Select one checkbox in table, Select all/Deselect all)
    Hope that helps.
    Regards,
    Frederico.

  • SQL*Loader ContinueIf  with X'hex'

    Hi all,
    I am using SQL*Loader Release 9.2.0.6.0 to import a text file with this CTL file:
    LOAD DATA
    INFILE AZUPI00F.TXT
    REPLACE
    CONTINUEIF THIS (1:3) != X'0D0A1A'
    INTO TABLE SP_AZUPI00F
         (UPIUPI     POSITION (1:5)      INTEGER EXTERNAL,
          UPINOM     POSITION (6:40)     CHAR,
          UPIIND     POSITION (41:90)    CHAR,
          UPILOC     POSITION (91:125)   CHAR,
    ...but the log returns this:
    Errors allowed: 10001
    Bind array:     10000 rows, maximum of 256000 bytes
    Continuation:   1:3 != 0X0d0a1a(character ''), in current physical record
    Path used:      Conventional
    Silent options: FEEDBACK, ERRORS and DISCARDS
    ...and the import fails. What am I missing in the syntax?
    I am trying to exclude from the import the last line in my text file, which contains that hexadecimal sequence.
    Thanks to all for suggestions and solutions.

    Following link might be helpful ->
    [SQL Loader 1|http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10825/ldr_cases.htm#i1010200]
    [SQL Loader 2|http://www.psoug.org/reference/sqlloader.html]
    Regards.
    Satyaki De.

  • Problem with external tables & SQL*Loader

    Hi all,
    I want to transfer data from a flat file into an Oracle database using the external table option and the SQL*Loader utility.
    1. How do you set up the .dat file for an external table?
    2. How do you write the .ctl file for the SQL*Loader utility?
    If you know any good site, please point me to it.
    Thanks
    Mohammadi52

    Hi,
    Use this link to search any Oracle documentation:
    http://tahiti.oracle.com/
    Also please have a look on the below link:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14220/utility.htm
    Regards

  • SQL*Loader issue with WHEN command

    Environment: R12.1.2
    We have a file coming in from a bank that needs to be loaded into a custom table using SQL*Loader.
    The file has multiple record formats. Each record in the file starts with a "record type", which defines the format.
    For simplicity, let me say that there is a record type of "H" with the header format, and another record type "D" with a detail record format. An "H" record may be followed by multiple "D" records until the next "H" record is encountered. Unfortunately, there is no common key, like say "Vendor Number", in both the "H" and "D" records to establish a relationship. So the plan was to use an Oracle sequence or a SQL*Loader sequence to get a sequence value loaded into the table as the file is being loaded. Then if consecutive "H" records had sequence values of 100 and 112, we would know that the "D" records for the "H" 100 record are all the records with sequence values of 101 through 111.
    The issue occurs as we have to use the WHEN command in the control file to direct a certain record type to specific columns of the table. Based on the populated sequence values, with the WHEN command, it seems that all the "H" records get loaded first followed by the "D" records. The sequence becomes of no use and we cannot establish a link between the "H" and "D" records. The alternative is to not use WHEN with the sequence, but load the file into generic column names which provides for less understanding in the application.
    Is there a way (command feature) to ensure that SQL*Loader loads the records sequentially while using WHEN?
    Thanks
    Satish

    I used the RECNUM parameter instead of a sequence and it worked fine.
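
    For anyone landing here: RECNUM is a field-level keyword, so the working field list presumably carried a line along these lines (the column name is illustrative):
    rec_seq  RECNUM,   -- physical record number, preserves the file order across WHEN clauses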

  • NLS_DATE_FORMAT and sql*loader

    Dear all,
    Facts:
    - Oracle 10g R2 Standard Edition
    - CentOS 4.3
    I have a problem. I want to load data through SQL*Loader but there are date format errors.
    - The data in the date field of the file is in this format: MM/DD/YYYY
    - I want to load it in this format: DD/MM/YYYY
    My *.sh file has the following parameters:
    #!/bin/bash
    export ORACLE_SID=plelpp
    export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
    export PATH=$PATH:$ORACLE_HOME/bin
    export NLS_DATE_FORMAT=DD/MM/YYYY --> THAT IS THE WAY THAT I WANT
    sqlldr USERID=SYSTEM, CONTROL=datprinf.ctl, LOG=datprinf.log, data=data_generada_para_datprinf.dat, bad=datprinf.bad, discard=datprinf.dsc
    My *.ctl file has this for the date fields:
    FECALT DATE "DD/MM/YYYY",
    In order to load the rows I have also tried it this way:
    FECALT "TO_DATE(:FECALT,'DD/MM/YYYY')",
    but nothing works; it always shows me the same error:
    Record 1: Rejected - Error on table SUNAT.DATPRINF, column FECALT.
    ORA-01843: not a valid month (mes no valido)
    I had put export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1 in the sh file but nothing works...
    I saw the Metalink link about Note:257909.1, but still nothing.
    Please, would you help me?
    Thanks a lot !!!

    Internal storage of date fields is always the same, regardless of which format you use. In other words, you don't store 'DD/MM/YYYY' or anything like that. For SQL*Loader you have to specify the format actually used in the external file. Later on you may choose whatever output format you want.
    Werner
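
    In other words, the control file has to describe the format actually present in the file (MM/DD/YYYY); the DD/MM/YYYY preference only matters on output. A minimal illustration:
    FECALT DATE "MM/DD/YYYY",
    and later, for display: select to_char(fecalt, 'DD/MM/YYYY') from sunat.datprinf;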
