SQL Loader Foxpro Table

Hi,
I have a dbf (FoxPro) table that I need to import into an Oracle table. Is there a way to use SQL*Loader for this? If there is a better way, please recommend one.
Thanks,
Ed

If you do not have FoxPro, dBase IV or some other dBase clone to dump that dbf file, you could load it into MS Access and dump it to a csv file from there.
:p
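For what it's worth, here is a minimal sketch of the SQL*Loader side, assuming the dbf has already been dumped to a comma-separated file (for example with FoxPro's COPY TO ... TYPE DELIMITED, or via Access as above). The table and column names (ed_customers, cust_id, cust_name, created_dt) are hypothetical; adjust them to your real table and file.
-- customers.ctl (hypothetical names and date mask; adjust as needed)
LOAD DATA
INFILE 'customers.csv'
APPEND
INTO TABLE ed_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id,
cust_name,
created_dt DATE "YYYYMMDD")
Then run it with something like: sqlldr scott/tiger control=customers.ctl log=customers.log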

Similar Messages

  • How to load Foxpro table?

    Hi,
    I am Senthil Kumar.
I am connecting to a Foxpro database.
The connection is successful,
but I could not load the Foxpro table.
I am using the "com.hxtt.sql.dbf.DBFDriver" driver for Foxpro.
If anybody knows, please help me.
    Here is the error message:
    ``````````````````````````````````
java.sql.SQLException: java.io.IOException: Server returned HTTP response code: 403 for URL: http://www.greatautoparts.com/../../carsteering_com/data/complete_tables//model.DBF
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
at com.hxtt.concurrent.i.l(Unknown Source)
at com.hxtt.concurrent.i.a(Unknown Source)
at com.hxtt.concurrent.t.do(Unknown Source)
at com.hxtt.concurrent.t.if(Unknown Source)
at com.hxtt.concurrent.t.a(Unknown Source)
at com.hxtt.sql.dbf.i.b(Unknown Source)
at com.hxtt.sql.dbf.i.void(Unknown Source)
at com.hxtt.sql.dbf.d.a(Unknown Source)
at com.hxtt.sql.dbf.d.(Unknown Source)
at com.hxtt.sql.dbf.u.a(Unknown Source)
at com.hxtt.sql.bm.if(Unknown Source)
at com.hxtt.sql.dc.a(Unknown Source)
at com.hxtt.sql.dc.a(Unknown Source)
at com.hxtt.sql.bm.a(Unknown Source)
at com.hxtt.sql.bm.a(Unknown Source)
at com.hxtt.sql.ag.a(Unknown Source)
at com.hxtt.sql.ag.a(Unknown Source)
at com.hxtt.sql.ag.executeQuery(Unknown Source)
at org.apache.jsp.FoxproJDBCTest_jsp._jspService(org.apache.jsp.FoxproJDBCTest_jsp:63)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:99)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:325)
at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:295)
at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:245)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:237)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:214)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:825)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:731)
at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:524)
at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
at java.lang.Thread.run(Unknown Source)
    Thanks,
    K.Senthil Kumar

    Hi,
If you are having problems loading data from a file using SQL*Loader, even if it is from Foxpro, then it is probably better to ask about it in the SQL*Loader forum -
    Export/Import/SQL Loader & External Tables
    Regards,
    Mike

SQL LOADER, EXTERNAL TABLE and ODBC DATA SOURCE

Hello,
Can anybody help me load data from a dBase file (.dbt) into an Oracle 10g table?
I tried yesterday with SQL*Loader, an external table, and an ODBC data source.
Why are all of these utilities still failing to solve my problem?
Is there an efficient way to reach this goal?
Thanks in advance

Export the dBase data file to a text file;
then you can use either SQL*Loader or the external table option to load it.
regards

  • Comparison of Data Loading techniques - Sql Loader & External Tables

Below are two techniques for loading data from flat files into Oracle tables.
1)     SQL*Loader:
a.     Place the flat file (.txt or .csv) in the desired location.
b.     Create a control file, for example:
LOAD DATA
INFILE 'Mytextfile.txt'    -- file containing the table data; specify the path correctly, it could be .csv as well
APPEND                     -- or TRUNCATE, based on the requirement
INTO TABLE oracle_tablename
FIELDS TERMINATED BY ','   -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c.     Now run Oracle's sqlldr utility from the command prompt:
sqlldr username/password control=filename.ctl
d.     The data can be verified by selecting it from the table:
select * from oracle_tablename;
2)     External Table:
a.     Place the flat file (.txt or .csv) in the desired location.
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b.     Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c.     After granting appropriate permissions to the user, we can create the external table like below.
create table ext_table_csv (
i Number,
n Varchar2(20),
m Varchar2(20)
)
organization external (
type oracle_loader
default directory ext_dir
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
)
location ('abc.csv')
)
reject limit unlimited;
d.     Verify the data by selecting it from the external table:
select * from ext_table_csv;
    External tables feature is a complement to existing SQL*Loader functionality.
    It allows you to –
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely
•     Do a parallel direct path load without manually splitting up the input file.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
Using SQL*Loader you can –
    •     Load the data from a stored procedure or trigger (insert is not sqlldr)
    •     Do multi-table inserts
    •     Flow the data through a pipelined plsql function for cleansing/transformation
    Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set to any number, e.g., 4.
    So, when you created the external table, the database will divide the file to be read by four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
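As a rough illustration of that point (a sketch only; target_table is a hypothetical name and the degree of 4 is arbitrary), the external-table load could be parallelized like this:
alter table ext_table_csv parallel 4;
alter session enable parallel dml;
insert /*+ append parallel(t, 4) */ into target_table t
select * from ext_table_csv;
commit;
With SQL*Loader, achieving the same would mean splitting the input file and running several sqlldr sessions with DIRECT=TRUE and PARALLEL=TRUE.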
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle Tables using DB links.

    Please let me know your views on this.

  • SQL Loader, nested tables and default values

    Is there a way to specify a default value for a nested table entry when SQL*Loader encounters a 'null' value?
    I want to avoid this:
    Record 5: Rejected - Error on table LEVEL_DESC, column LEVELS.
    NULL nested table element is not allowed

    Use the NULLIF parameter in your control file for the nested table objects.
    e.g
    LOAD DATA
    INFILE 'level_data.dat'
    INTO TABLE LEVEL
(LEVEL_ID POSITION (01:05) CHAR,
LEVEL_NAME POSITION (07:20),
LEVEL_DESC COLUMN OBJECT
(LEVELS POSITION (22:25) CHAR NULLIF LEVEL_DESC.LEVELS=BLANKS,
... ))
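If an actual default value (rather than NULL) is needed, one option worth testing is applying a SQL expression to the field in the control file. A minimal sketch for a plain scalar column follows; the table, the LEVEL_CODE field and the 'NONE' literal are placeholders, and I have not verified whether a SQL expression like this is allowed for attributes inside a COLUMN OBJECT or nested table.
LOAD DATA
INFILE 'level_data.dat'
INTO TABLE some_table
(LEVEL_ID POSITION (01:05) CHAR,
LEVEL_CODE POSITION (22:25) CHAR "NVL(:LEVEL_CODE, 'NONE')")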

  • SQL Loader and table views

I am having a problem updating a large partitioned table through its corresponding view. Here is the loader file:
    LOAD DATA               
    INSERT INTO TABLE test.example               
         (      PROGRAM POSITION(1:4) CHAR,
              DWING_NUMBER     POSITION(6:32) CHAR,
              LAST_ISSUED_REVISION     POSITION(34:35) CHAR,
              TEMP_NUMBER     POSITION(37:43) CHAR,
              CONFIGURATION_ITEM     POSITION(45:45) CHAR,
          LOCATION_CODE     POSITION(47:47) CHAR)
    It generates this error:
    SQL*Loader-951: Error calling once/load initialization
    ORA-26018: Column PROGRAM in table test.example does not exist
There are a few considerations that go along with this, and I am developing a strong disdain for the two DBAs that are involved. First, this view accesses a large partitioned table, partitioned according to "PROGRAM". Secondly, PROGRAM is called PROG_ID in the main table. I didn't think this was an issue at first, but it seems like it might be part of the problem I'm having. We're running Oracle 9.2.0 here. Any helpful replies will be greatly appreciated. Thanks.

    This is what the view looks like:
    Column / Data Type / Null? / Updatable
    PROGRAM VARCHAR(10) N y
    DWING_NUMBER VARCHAR(28) N y
    LAST_ISSUED_REVISION VARCHAR(2) Y y
    TEMP_NUMBER VARCHAR(7) Y y
    CONFIGURATION_ITEM VARCHAR(1) Y y
    LOCATION_CODE VARCHAR(1) Y y
The main table that this view grabs data from is identical, except for the fact that PROGRAM is PROG_ID. I'm fairly new to doing database work, so if the answer is trivial please spare my self-esteem.
    formatting sucks
    vagrantgringo
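One workaround sketch, assuming you are allowed to load the partitioned base table directly instead of the view (base_table and example.dat are hypothetical names; use the real table and data file), is to map the field to the real column name PROG_ID in the control file:
LOAD DATA
INFILE 'example.dat'
INSERT INTO TABLE test.base_table
(PROG_ID POSITION(1:4) CHAR,
DWING_NUMBER POSITION(6:32) CHAR,
LAST_ISSUED_REVISION POSITION(34:35) CHAR,
TEMP_NUMBER POSITION(37:43) CHAR,
CONFIGURATION_ITEM POSITION(45:45) CHAR,
LOCATION_CODE POSITION(47:47) CHAR)
If the table already contains rows, APPEND would be needed instead of INSERT.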

  • SQL Loader/External Table multiple record delimiters

Hi everyone.
I have a strange problem. I have an external csv file which I wish to deal with (external tables or SQL*Loader). This csv is not organized in structure at all, and it contains records that are mixed together, meaning that some records are delimited by newline characters. So in short, I want to know whether I will be able to load the data in this csv, separating records by the newline character and another character. Is it possible to have multiple record delimiters specified in the same ctl file?

    abohsin,
I think using the stream record format would be helpful in your case. Please explore that.
Using the stream record option, instead of the default newline, you can specify a user-defined record delimiter.
    Check this link.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005509
Here is what I did. Not the complete answer, but it might be helpful.
Replace all delimiters with a standard delimiter (in Unix):
sed 's/HEAD,/**DLM**/g' < test.dat > test2.dat
sed 's/TAIL,/**DLM**/g' < test2.dat > test3.dat
create table t(
  TEXT varchar2(100)
);
Then use that delimiter as the record delimiter in the control file:
load data
infile "test3.dat" "str '**DLM**'"
into table T
TRUNCATE
fields terminated by 'XXXXX' optionally enclosed by '"'
(TEXT)
SQL> select * from t;
    TEXT
    1111,2222,
    4444,5555,
    4444
    1111,3333,
    8888,6666,
    5555
You should also replace newline characters with '**DLM**'.
    Thanks,
Rajesh.

  • SQL Loader Overwrites Table

    Does anybody know if it is possible to use SQL Loader in such a way that it does not overwrite any existing data in the destination table, but instead adds new rows?
    Many thanks.

Depends on the loading method described in the documentation for the SQL*Loader Control File Reference.
    C.
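For the record, here is a minimal sketch using the APPEND method, which adds rows without touching the existing data (my_table, the data file and the column names are placeholders):
LOAD DATA
INFILE 'new_rows.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
INSERT (the default method) requires an empty table, while REPLACE and TRUNCATE remove the existing rows first.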

  • Oracle SQL Loader - External Table

    This is my code:-
    CREATE TABLE GLO_CUST_EXT
    (SOURCE_ID CHAR(3),
    BUS_DT CHAR(8),
    TRAN_CD CHAR(1),
    CUST_CD NUMBER(10),
    MNEMONIC CHAR(10),
    SHORT_NM CHAR(255))
    ORGANIZATION external
    (TYPE oracle_loader
    DEFAULT DIRECTORY Source_dir
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE Skip 1
    badfile bad_dir:'GLO_CUST%a_%p.bad'
    LOGFILE log_dir:'GLO_CUST%a_%p.log'
    FIELDS TERMINATED BY '|'
    (SOURCE_ID ,
    BUS_DT ,
    TRAN_Cd ,
    CUST_CD,
    MNEMONIC,
    SHORT_NM))
    LOCATION ('GBCUS.txt'))
    REJECT LIMIT 0;
Question:
I would like to know how to load only selected fields.
Example:
I want to load all of these fields (SOURCE_ID, BUS_DT, TRAN_CD,
CUST_CD, SHORT_NM) except MNEMONIC. Thank you.

    Mohan Nair,
    This is my code:
    CREATE TABLE GLO_CUST_EXT5
    (SOURCE_ID CHAR(3),
    BUS_DT CHAR(8),
    TRAN_CD CHAR(1),
    CUST_CD NUMBER(10),
    MNEMONIC CHAR(10),
    SHORT_NM CHAR(255))
    ORGANIZATION external
    (TYPE oracle_loader
    DEFAULT DIRECTORY Source_dir
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY NEWLINE Skip 1
    badfile bad_dir:'GLO_CUST%a_%p.bad'
    LOGFILE log_dir:'GLO_CUST%a_%p.log'
    FIELDS TERMINATED BY '|'
    (SOURCE_ID ,
    BUS_DT ,
    TRAN_Cd ,
    CUST_CD,
    MNEMONIC FILLER,
    SHORT_NM))
    LOCATION ('GBCUS.txt'))
    REJECT LIMIT 0;
I'm using Oracle 10g. I execute this code using TOAD version 8.5.0.50.
I can create the external table, but when I use a simple select statement
(select * from GLO_CUST_EXT5), I get this error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "identifier": expecting one of: "binary_double, binary_float, comma, char, date, defaultif, decimal, double, float, integer, (, nullif, oracle_date, oracle_number, position, raw, recnum, ), unsigned, varrawc, varchar, varraw, varcharc, zoned"
    KUP-01008: the bad identifier was: FILLER
    KUP-01007: at line 9 column 10
    ORA-06512: at "S
    Can you help me solve this problem? Thank you.
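For illustration only (not tested on your exact release): the ORACLE_LOADER access driver does not accept the SQL*Loader FILLER keyword, which is what the KUP-01008 message is complaining about. A common workaround is to describe MNEMONIC in the access-parameter field list but leave it out of the external table's column list; a field in the field list with no matching column is simply not loaded. A sketch based on your DDL:
CREATE TABLE GLO_CUST_EXT5
(SOURCE_ID CHAR(3),
BUS_DT CHAR(8),
TRAN_CD CHAR(1),
CUST_CD NUMBER(10),
SHORT_NM CHAR(255))
ORGANIZATION external
(TYPE oracle_loader
DEFAULT DIRECTORY Source_dir
ACCESS PARAMETERS
(RECORDS DELIMITED BY NEWLINE Skip 1
badfile bad_dir:'GLO_CUST%a_%p.bad'
LOGFILE log_dir:'GLO_CUST%a_%p.log'
FIELDS TERMINATED BY '|'
(SOURCE_ID,
BUS_DT,
TRAN_CD,
CUST_CD,
MNEMONIC CHAR(10),
SHORT_NM))
LOCATION ('GBCUS.txt'))
REJECT LIMIT 0;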

  • Fields terminated by (SQL loader, external table) question?

    Hello.
    I have a txt file which looks like:
    Columns:
    A..........B.........C...........D.........E..............F.............G...........H
    739.......P.........0002......05........25012006..25012006..5...........data group
    . = space
There are different numbers of spaces between the columns.
What must I use in FIELDS TERMINATED BY to import this?
    Thanks.

    So, don't use FIELDS TERMINATED BY, but, as Ino suggested, fixed format, something like
    LOAD DATA
    TRUNCATE INTO TABLE <table name>
    (a position(1:10),
    b position(11:20),
    c position(21:30),
    d position(31:40),
    e position(41:48) date "ddmmyyyy",
    f position(51:58) date "ddmmyyyy",
    g position(61:72),
    h position(73:92))

  • How can I load data into table with SQL*LOADER

How can I load data into a table with SQL*Loader
when the column data length is more than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
    CREATE TABLE A (
    A VARCHAR2 ( 10 ) ,
    B VARCHAR2 ( 10 ) ,
    C VARCHAR2 ( 10 ) ,
    E VARCHAR2 ( 2000 ) );
    control file:
    load data
    append into table A
    fields terminated by X'09'
    (A , B , C , E )
    SQL*LOADER command:
    sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
    datafile:
column E is more than 255 bytes
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)

    Check this out.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
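The usual catch here (sketched below, not tested against your exact file) is that a delimited character field defaults to a maximum of 255 bytes, so the wide column needs an explicit length in the control file; the input should also be a plain tab-delimited text file rather than an .xls workbook:
load data
infile 'A.dat'
append into table A
fields terminated by X'09'
(A, B, C, E CHAR(2000))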

• SQL*LOADER (8i): Loading variable size fields into multiple tables (FILLER)

Product: ORACLE SERVER
Date written: 2004-10-29
==================================================================
SQL*LOADER (8i): LOADING VARIABLE SIZE FIELDS INTO MULTIPLE TABLES (FILLER)
==================================================================
PURPOSE
This note describes how to use SQL*Loader to load a data file with variable
length records and variable size fields into multiple tables
(using the FILLER clause, a new feature in 8i).
    Explanation
    SQL*LOADER SYNTAX
To load into multiple tables, set up the control file as follows:
    INTO TABLE emp
    INTO TABLE emp1
To load the same data from a data file with fixed length fields into multiple
tables, the control file looks like this:
    INTO TABLE emp
    (empno POSITION(1:4) INTEGER EXTERNAL,
    INTO TABLE emp1
    (empno POSITION(1:4) INTEGER EXTERNAL,
As above, positions 1 through 4 of the input data can be loaded into the empno
field of both tables. However, if the field lengths are variable, the POSITION
clause cannot be used for each field in this way.
    Example
Example 1>
    create table one (
    field_1 varchar2(20),
    field_2 varchar2(20),
    empno varchar(10) );
    create table two (
    field_3 varchar2(20),
    empno varchar(10) );
Assume the records to be loaded are separated by commas and have variable length.
<< data.txt >> - data file to load
    "this is field 1","this is field 2",12345678,"this is field 4"
    << test.ctl >> - control file
    load data infile 'data.txt'
    discardfile 'discard.txt'
    into table one
    replace
    fields terminated by ","
    optionally enclosed by '"' (
    field_1,
    field_2,
    empno )
    into table two
    replace
    fields terminated by ","
    optionally enclosed by '"' (
    field_3,
    dummy1 filler position(1),
    dummy2 filler,
    empno )
The dummy1 field is declared as a FILLER; a field declared as FILLER is not loaded into the table.
Table two has no field named dummy1; position(1) means that reading restarts at the beginning of the
current record, so the first field is loaded into the dummy1 filler item, and the second field is
loaded into the dummy2 filler item. The third field, the employee number that was loaded into
table one, is therefore loaded into table two as well.
<< Execution >>
    $sqlldr scott/tiger control=test.ctl data=data.txt log=test.log bindsize=300000
    $sqlplus scott/tiger
    SQL> select * from one;
    FIELD_1 FIELD_2 EMPNO
    this is field 1 this is field 2 12345678
    SQL> select * from two;
    FIELD_3 EMPNO
    this is field 4 12345678
Example 2>
create table testA (c1 number, c2 varchar2(10), c3 varchar2(10));
<< data1.txt >> - data file to load
    7782,SALES,CLARK
    7839,MKTG,MILLER
    7934,DEV,JONES
    << test1.ctl >>
    LOAD DATA
    INFILE 'data1.txt'
    INTO TABLE testA
    REPLACE
FIELDS TERMINATED BY ","
(c1 INTEGER EXTERNAL,
c2 FILLER CHAR,
c3 CHAR)
<< Execution >>
    $ sqlldr scott/tiger control=test1.ctl data=data1.txt log=test1.log
    $ sqlplus scott/tiger
    SQL> select * from testA;
    C1 C2 C3
    7782 CLARK
    7839 MILLER
    7934 JONES
    Reference Documents
    <Note:74719.1>

  • How to Load Multiple Files in Oracle Database using Sql Loader

    Hi All,
I want to import multiple files into my DB using SQL*Loader. Please tell me the syntax: how can I import multiple files using one control file?
    Thanks & Regards,
    Imran

    Hi,
You might get a good response to your post in the forum dedicated to data movement, including SQL*Loader. You can find it here: Export/Import/SQL Loader & External Tables
    Regards,
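As a supplement, SQL*Loader does accept several INFILE clauses in one control file; a minimal sketch with placeholder file, table and column names:
LOAD DATA
INFILE 'file1.dat'
INFILE 'file2.dat'
INFILE 'file3.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
Run it as usual: sqlldr userid=scott/tiger control=multi.ctl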

  • Using SQL*Loader to Load Russian and Chinese Characters

    We are testing our new 11.2.0.1 database using Oracle Linux 6. We created the database using the AL32UTF8 NLS Character set. We have tried using sqlldr to insert a few records that contain Russian and Chinese characters as a test. We can not seem to get them into the database in the correct format. For example, we can see the correct characters in the file we are trying to load on the Linux server, but once we load them into a table in the database, some of the characters are not displayed correctly (using SQL*Developer to select them out).
We can set the values within a column by inserting them into the table directly and then selecting them out, and they are correct, so it appears the problem is not in the database, but in the way sqlldr inserts them. We have tried several settings on the Linux server to set the NLS_LANG environment to AMERICAN_AMERICA.AL32UTF8, AMERICAN_AMERICA.UTF8, etc., without success.
    Can someone provide us with any guidance on this? Would really appreciate any advice as to what we are not getting here.
    Thanks!!

    The characterset of the database does not change the language used in your input data file. The character set of the datafile can be set up by using the NLS_LANG parameter or by specifying a SQL*Loader CHARACTERSET parameter. I suggest to move this question to the appropriate forum: Export/Import/SQL Loader & External Tables for closer topic alignment.
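As a hedged illustration of that suggestion (table, file and column names are placeholders), the character set of the data file can be declared directly in the control file so the load no longer depends on the session NLS_LANG:
LOAD DATA
CHARACTERSET AL32UTF8
INFILE 'unicode_data.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(id, name CHAR(1000))
The generous CHAR(1000) is there because field lengths default to bytes and multibyte characters expand; adjust it to your column sizes.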

  • SQL*Loader characterset

    Database: 10.2.0.1
    sql> select * from nls_database_parameters t where t.parameter like '%CHARACTERSET%';
    PARAMETER VALUE
    ==============================================
    NLS_CHARACTERSET AL32UTF8
    NLS_NCHAR_CHARACTERSET AL16UTF16
    OS: RHEL5
    $ export | grep LANG
    declare -x LANG="en_US.UTF-8"
    declare -x NLS_LANG="GERMAN_GERMANY.WE8ISO8859P1"
    I have file in latin1 codepage.
    When I create external table (organization external type oracle_loader) I have problems with loading the Ü symbols in this file to the database:
    a;b;c;Ü;d <- row inserted
    Ü;a;b;c;d <- row inserted
    a;b;c;d;Ü <- row failed to insert
    KUP-04021: field formatting error for 1 field
    KUP-04101: record 1 rejected in file latin1.csv
When I explicitly create the external table with the "characterset WE8ISO8859P1" clause, the latin1 file loads successfully.
When I load a UTF8 file using an external table created without the "characterset WE8ISO8859P1" clause, the file loads OK too.
I want to understand the process: why does the character set conversion from the latin1 file (WE8ISO8859P1) to the UTF8 database fail in some cases?
Is it possible to import latin1 files without specifying the codepage in the external table DDL (it seems that the NLS_LANG environment variable does not affect SQL*Loader external tables)?
    From documentation:
    Specifying the CHARACTERSET parameter tells SQL*Loader the character set of the input datafile. The default character set for all datafiles, if the CHARACTERSET parameter is not specified, is the session character set defined by the NLS_LANG parameter. Only character data (fields in the SQL*Loader datatypes CHAR,  VARCHAR, VARCHARC, numeric EXTERNAL, and the datetime and interval datatypes) is affected by the character set of the datafile.
    Edited by: lynx™ on 15.07.2010 7:46

    Hi Andre,
    this is how I teach my classes normally!
You need to be aware of some changes you make with NLS_TERRITORY, which is part of NLS_LANG:
--> if you set NLS_TERRITORY to another value, you implicitly change the settings for NLS_NUMERIC_CHARACTERS (although it happens to be the same for America and Australia; the first character here is the decimal separator and the second one is the group separator, and this can destroy all numeric values if it is set improperly),
NLS_DATE_FORMAT,
NLS_TIMESTAMP_FORMAT,
and NLS_CURRENCY.
SYS @10gR2 SQL> select * from v$nls_parameters;
    PARAMETER VALUE
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CURRENCY $
    NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
    NLS_CHARACTERSET WE8ISO8859P1
    NLS_SORT BINARY
    NLS_TIME_FORMAT HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY $
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
    19 rows selected.
    SYS @10gR2 SQL > alter session set NLS_TERRITORY=australia;
SYS @10gR2 SQL> select * from v$nls_parameters;
    PARAMETER VALUE
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AUSTRALIA
    NLS_CURRENCY $
    NLS_ISO_CURRENCY AUSTRALIA
    NLS_NUMERIC_CHARACTERS .,
    NLS_CALENDAR GREGORIAN
    NLS_DATE_FORMAT DD/MON/RR
    NLS_DATE_LANGUAGE AMERICAN
    NLS_CHARACTERSET WE8ISO8859P1
    NLS_SORT BINARY
    NLS_TIME_FORMAT HH12:MI:SSXFF AM
    NLS_TIMESTAMP_FORMAT DD/MON/RR HH12:MI:SSXFF AM
    NLS_TIME_TZ_FORMAT HH12:MI:SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT DD/MON/RR HH12:MI:SSXFF AM TZR
    NLS_DUAL_CURRENCY $
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
