Import error ORA-12899

Hi,
I am trying to import a 9i dump file into 10g and am facing a problem.
The 9i database uses a single-byte character set; the 10g database uses a multi-byte character set.
The 9i database holds the prod schema and runs on Windows; we have to import into the dev schema on 10g, which runs on HP-UX.
The 9i data is Arabic and contains special characters.
I have imported all the tables, but some rows in a few tables are rejected because of these special characters, with this error:
ORA-12899: value too large for column "I*******" (actual: 38, maximum: 20)
Does anyone have any idea? Can anyone help me resolve this error?
I have looked on Google for a solution, but I have 2000 tables.
Rgds
Geeta

Can you review Metalink Note 119119.1?
Because you are using a multi-byte character set with Arabic characters, you will need to increase the size of some of your columns.
The error message is telling you that you have a 20-byte column and the import data requires 38 bytes of space.
The Metalink note states:
"US7ASCII characters (A-Z, a-z, 0-9 and ./?,*# etc.) are 1 byte in UTF8, so for
most West European languages the impact is rather limited, as only "special"
characters like ç, ñ, é will use more bytes than in an 8-bit character set.
But if you convert a Cyrillic or Arabic system to AL32UTF8 then the data will
take considerably more bytes to store.
This also means that you have to make sure your columns are big enough to store
the additional bytes."
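For example, here is a minimal sketch (table and column names are hypothetical) of checking how wide the converted data really is and widening one offending column:

-- Widest value, in bytes, as stored in the multi-byte database
SELECT MAX(LENGTHB(customer_name)) FROM customers;

-- Either widen the column in bytes...
ALTER TABLE customers MODIFY (customer_name VARCHAR2(60));

-- ...or switch it to character semantics, so it always holds
-- 20 characters regardless of how many bytes each one needs
ALTER TABLE customers MODIFY (customer_name VARCHAR2(20 CHAR));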

Similar Messages

  • Error 12899 when importing a dmp (PLEASE SOMEONE HELP ME)

    Does anyone in this world know how to solve this problem? I have tried the ALTER SYSTEM NLS_LENGTH_SEMANTICS statement, but it did not change the database parameter. I don't believe that it is impossible to import a 9i dmp into a 10g XE database. I need to know how I can fix this problem. Please, can anyone help me?

    Hi there,
    There is nothing wrong with NLS_LENGTH_SEMANTICS. It is BYTE by default. Switching to CHAR will solve the problem with column sizes.
    I made the database structure on OracleXE Beta, which is BYTE. Then I changed NLS_LENGTH_SEMANTICS to CHAR and ran "impdp" (WE … to CL8MSWIN1251 using Univ.)
    1. If you have a problem with the character set, try to "impdp" into OracleXE Universal. I think it will accept almost any charset.
    2. There was an issue using NLS_CHARSET in an environment variable on Windows PCs.
    3. Go to the Installation Guide and check NLS_CHARSET …
    4. Check SQL*Loader.
    Konstantin

  • Import error on Oracle Database Express 10.2.0.1.0

    Hi,
    I am trying to import data from Oracle 10.1.0.2 running on SUSE 10 Linux into
    Oracle 10g (10.1.0.2.0) running on Windows Server 2003.
    I can import most of my data, but not all of it.
    The beginning of my log file is:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0
    - Production
    With the Partitioning, OLAP and Data Mining options
    Export file created by EXPORT:V10.01.00 via conventional path
    And after some time I receive:
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "SAFCI"."EMP"."NAME"
    (actual: 65, maximum: 64)
    Column 1 1000005025
    Column 2 ??????? ???????
    Column 3 19-SEP-2002:00:00:00
    Column 4 9089
    Column 5 ??. ?????
    Column 6 1.9828
    Column 7 377.77
    Column 8 75.55
    Column 9 ???????????? ???????? ? ??? ???? ? 32 ??.
    Column 10 19-SEP-2002:00:00:00
    Column 11 ?????? ?????, ?.?. 172747675, ???. 23.11.200?
    Column 12 ????????? ?????????
    Column 13 ? ????
    Column 14 T
    Column 15
    Column 16 F
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "SAFCI"."EMP"."NAME"
    (actual: 65, maximum: 64)
    Column 1 1000006408
    Column 2 ??????? ???????
    Column 3 05-NOV-2002:00:00:00
    Column 4 9089
    Column 5 ??. ?????
    Column 6 1.939
    Column 7 82
    Column 8 16.4
    Column 9 ?????????? ? ???? ???? ? 40 ??.
    Column 10 05-NOV-2002:00:00:00
    Column 11 ?????? ?????, ?.?. 172747675, ???. 23.11.200?
    Column 12 ????????? ?????????
    Column 13 ? ????
    Column 14 T
    Column 15
    Column 16 F
    36943 rows imported
    I cannot understand this problem, because I exported the whole user and
    am also trying to import the whole user into my new system.
    Please, can someone point me to a paper about this problem or help me
    solve it.
    Thanks
    configuration Oracle10g (EXP source)
    SQLWKS> select * from nls_database_parameters
    2>
    PARAMETER VALUE
    NLS_LANGUAGE BRAZILIAN PORTUGUESE
    NLS_TERRITORY BRAZIL
    NLS_CURRENCY R$
    NLS_ISO_CURRENCY BRAZIL
    NLS_NUMERIC_CHARACTERS ,.
    NLS_CHARACTERSET WE8ISO8859P1
    NLS_CALENDAR GREGORIAN
    NLS_DATE_FORMAT DD/MM/RR
    NLS_DATE_LANGUAGE BRAZILIAN PORTUGUESE
    NLS_SORT WEST_EUROPEAN
    NLS_TIME_FORMAT HH24:MI:SSXFF
    NLS_TIMESTAMP_FORMAT DD/MM/RR HH24:MI:SSXFF
    NLS_TIME_TZ_FORMAT HH24:MI:SSXFF TZR
    NLS_TIMESTAMP_TZ_FORMAT DD/MM/RR HH24:MI:SSXFF TZR
    NLS_DUAL_CURRENCY Cr$
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_RDBMS_VERSION 10.1.0.2.0
    configuration Oracle10g Express (IMP destination)
    SQL> select * from nls_database_parameters;
    PARAMETER VALUE
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CURRENCY $
    NLS_ISO_CURRENCY AMERICA
    NLS_NUMERIC_CHARACTERS .,
    NLS_CHARACTERSET AL32UTF8
    NLS_CALENDAR GREGORIAN
    NLS_DATE_FORMAT DD-MON-RR
    NLS_DATE_LANGUAGE AMERICAN
    NLS_SORT BINARY
    NLS_TIME_FORMAT HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY $
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_RDBMS_VERSION 10.2.0.1.0

    Hi
    Your import database uses a multi-byte character set; your export db uses a single-byte one.
    This means a character can need more than 1 byte.
    Try this before the import (and before creating the tables!):
    alter system set nls_length_semantics=char;
    Greetings
    Sven
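    A minimal sketch of that approach (the table definition and the imp command line are hypothetical, with names borrowed from the error above); ALTER SESSION is the safest variant, since it is guaranteed to affect tables created in the same session:

    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;

    -- Tables created from now on use character semantics:
    CREATE TABLE emp (name VARCHAR2(64));  -- effectively VARCHAR2(64 CHAR)

    -- Then run imp with IGNORE=Y so rows are loaded into the pre-created
    -- table instead of failing on "table already exists":
    imp safci/password file=export.dmp ignore=y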

  • Error while Export from 10g and import to 11g

    Hi,
    I get the following error on a few tables when I try to export from 10g and import into an 11g DB.
    import done in US7ASCII character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    . importing TBAADM's objects into TEST
    . . importing table "ACCT_AUTH_SIGN_TABLE"
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "TEST"."ACCT_AUTH_SIGN_TABLE"."MODE_OF_DESPATCH" (actual: 3, maximum: 1)
    How do I overcome this?
    Regards,
    jibu

    Jibu wrote:
    ORA-12899: value too large for column "TEST"."ACCT_AUTH_SIGN_TABLE"."MODE_OF_DESPATCH" (actual: 3, maximum: 1)
    How do I overcome this?
    [oracle@localhost sql]$ oerr ora 12899
    12899, 00000, "value too large for column %s (actual: %s, maximum: %s)"
    // *Cause: An attempt was made to insert or update a column with a value
    //         which is too wide for the width of the destination column.
    //         The name of the column is given, along with the actual width
    //         of the value, and the maximum allowed width of the column.
    //         Note that widths are reported in characters if character length
    //         semantics are in effect for the column, otherwise widths are
    //         reported in bytes.
    // *Action: Examine the SQL statement for correctness.  Check source
    //          and destination column data types.
    //          Either make the destination column wider, or use a subset
    //          of the source column (i.e. use substring).
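    The note's point about character vs byte semantics is easy to demonstrate. A small sketch (hypothetical tables) on an AL32UTF8 database:

    -- Byte semantics: a single 3-byte UTF-8 character does not fit
    CREATE TABLE t_byte (c VARCHAR2(1 BYTE));
    INSERT INTO t_byte VALUES ('€');  -- fails with ORA-12899 (actual: 3, maximum: 1)

    -- Character semantics: the same character fits
    CREATE TABLE t_char (c VARCHAR2(1 CHAR));
    INSERT INTO t_char VALUES ('€');  -- 1 row created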

  • Problem importing to Oracle XE 10.2.0.1.0

    Hello people...
    I have Personal Oracle 9i ver. 9.2.0.1.0 on one machine and Oracle XE ver. 10.2.0.1.0 on the other.
    I am using the exp utility of Oracle 9i with the command line exp userid=user/pass owner=user file=file1.dmp
    and get the following output...
    Connected to: Personal Oracle9i Release 9.2.0.1.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.1.0 - Production
    Export done in EL8ISO8859P7 character set and AL16UTF16 NCHAR character set
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user USER
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user USER
    About to export USER's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    . about to export USER's tables via Conventional Path ...
    . . exporting table TABLE1 104789 rows exported
    . . exporting table TABLE2 961 rows exported
    . . exporting table TABLE3 0 rows exported
    . . exporting table TABLE4 133 rows exported
    . . exporting table TABLE5 7 rows exported
    . . exporting table TABLE6 879 rows exported
    . . exporting table TABLE7 3 rows exported
    . . exporting table TABLE8 11 rows exported
    . . exporting table PS_TXN 5 rows exported
    . . exporting table TABLE9 1647 rows exported
    . . exporting table TABLE10 0 rows exported
    . . exporting table TABLE11 77 rows exported
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully without warnings.
    Everything seems OK, so I go to Oracle XE, run the catalog.sql script after installation to enable import, and run imp using the following syntax:
    imp user/pass full=y file=file1.dmp
    and i get the following output:
    Export file created by EXPORT:V09.02.00 via conventional path
    import done in EL8ISO8859P7 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    and when it starts to import, it brings up a message for every VARCHAR2 field I have in a table:
    IMP-00019: Row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column USER.TABLE.FIELD (actual: 6, maximum: 4)
    In Oracle 9i the field is VARCHAR2(4), but for some reason the IMP utility in Oracle XE reads it as VARCHAR2(6)... Is there any way I can fix that?
    PS. The user in Oracle XE has all the rights enabled...
    Thanks for your time and efforts,
    Nikos

    Nikolas_S wrote:
    Export file created by EXPORT:V09.02.00 via conventional path
    import done in EL8ISO8859P7 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    Hi Nik,
    the errors you received and the info quoted above suggest that you are going through a very common situation, which usually manifests this way when importing data that originated from a database with a single-byte character set into a database with a multi-byte character set.
    There are several general suggestions for resolving this; the one I can currently think of is to resize the columns to a higher value in the tables that are causing the import to fail.
    There are several notes about this on Metalink and some good-to-know information at [tahiti.oracle.com|http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/ch11charsetmig.htm#i1005993]
    Cheers and good luck
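    To find the columns that may need resizing before the import, here is a hedged sketch (run as the target schema owner) listing every VARCHAR2 column still declared with byte semantics:

    SELECT table_name, column_name, data_length AS max_bytes
    FROM   user_tab_columns
    WHERE  data_type = 'VARCHAR2'
    AND    char_used = 'B'
    ORDER  BY table_name, column_name;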

  • Database import errors

    Hello everyone!
    I have some problems with importing a database; I can't find the answer, so I turn to you people.
    I tried to import into an empty database (with the necessary tables, of course), but that gave more errors, so I created the important schemas and tables with the SQL from the original database.
    The original database is working and everything is in there, so I don't really understand the "not found" or "does not exist" errors, because I think Oracle is supposed to do the import in such a way that the connections between objects don't get lost, so the database is built up from the bottom.
    The machine I export the data from:
    CentOS 5.5
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    NLS_CHARACTERSET EE8MSWIN1250
    nls_length_semantics string BYTE
    The machine I try to import data:
    Red Hat Enterprise Linux Server release 5.6 (Tikanga)
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    NLS_CHARACTERSET AL32UTF8
    nls_length_semantics string CHAR
    (Before you ask: we are Platinum Partners, which is why we have Enterprise Edition; we can use it, but we don't have support.)
    The export command: expdp system/password full=y directory=dmpdir dumpfile=export.dmp logfile=export.log
    Job "SYSTEM"."SYS_EXPORT_FULL_01" completed successfully at: 14:46:34
    The import command: impdp system/password full=y directory=dmpdir table_exists_action=append dumpfile=export.dmp logfile=import_db.log
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 999 error(s) at 15:55:46
    The errors I don't understand:
    ORA-39083: Object type JOB failed to create with error:
    ORA-00001: unique constraint (SYS.I_JOB_JOB) violated
    ORA-31693: Table data object "SYSMAN"."MGMT_JOB_PURGE_POLICIES" failed to load/unload and is being skipped due to error:
    ORA-00001: unique constraint (SYSMAN.PK_MGMT_JOB_PURGE_POL) violated
    ORA-31693: Table data object "SYSMAN"."MGMT_JOB_SINGLE_TARGET_TYPES" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    ORA-31693: Table data object "SYSMAN"."MGMT_CREDENTIALS2" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-28239: no key provided
    ORA-31693: Table data object "SYSMAN"."MGMT_HC_VENDOR_SW_COMPONENTS" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-02291: integrity constraint (SYSMAN.VNC_VND_FK) violated - parent key not found
    KUP-11007: conversion error loading table "SCHAME_NAME"."TABLE_NAME"
    ORA-12899: value too large for column COLUMN_NAME (actual: 3767, maximum: 4000)
    KUP-11009: data for row: COLUMN_NAME: 0X'4146414245484E41507C4146414245484E41507CC166616265'
    And many objects are missing: triggers, sequences, indexes. Many more than the number of errors in import_db.log.
    I hope someone can help me.
    Thank you in advance,
    Adam

    Hi, and welcome to OTN!
    Firstly, try to create the new database with the same character set and nls_length_semantics as the source:
    The machine I export the data from:
    CentOS 5.5
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    NLS_CHARACTERSET EE8MSWIN1250
    nls_length_semantics string BYTE
    The machine I try to import data:
    Red Hat Enterprise Linux Server release 5.6 (Tikanga)
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    NLS_CHARACTERSET AL32UTF8
    nls_length_semantics string CHAR
    This could be responsible for the error:
    KUP-11007: conversion error loading table "SCHAME_NAME"."TABLE_NAME"
    ORA-12899: value too large for column COLUMN_NAME (actual: 3767, maximum: 4000)
    Secondly, I often find it better not to do a full export/import when moving between databases, especially when moving between different database versions. The importing database has an (almost) fully populated SYS and SYSMAN schema, and the full import tries to add stuff to those schemas, causing all kinds of violations.
    So, could you try to either export only the schemas that you're interested in (the user schemas), or just import those from the full export (see the sketch below), and post the results?
    HtH
    Johan
    BTW, here's more info on expdp/impdp: http://www.orafaq.com/wiki/Data_Pump
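    A hedged sketch of that schema-level approach (the schema name is hypothetical):

    expdp system/password schemas=myapp directory=dmpdir dumpfile=myapp.dmp logfile=exp_myapp.log

    -- or, reusing the existing full dump, restrict the import to one schema:
    impdp system/password schemas=myapp directory=dmpdir dumpfile=export.dmp logfile=imp_myapp.log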

  • Different NLS_LENGTH_SEMANTICS in two dbs...and ORA-12899

    Hi,
    I have created two db instances on the same server: one with database character set EL8MSWIN1253 (single-byte) and NLS_LENGTH_SEMANTICS=BYTE, the other with UTF8 (multi-byte) and NLS_LENGTH_SEMANTICS=CHAR.
    I'm trying to export a table from db1 and import it into db2.
    This table has the following definition:
    SQL> CREATE TABLE TEST(A NUMBER(1) , B VARCHAR2(10));
    Table created
    SQL> INSERT INTO TEST VALUES(1 , 'TEST_TEST');
    1 row inserted
    SQL> COMMIT;
    Commit complete
    SQL> INSERT INTO TEST VALUES(2 , 'ΤΕΣΤ_ΤΕΣΤ');     <------------ Greek chars
    1 row inserted
    SQL> COMMIT;
    Commit complete
    In order to accomplish the aim:
    1) I exported the table
    2) I imported it without the rows - only to 'precreate' the tables
    3) I imported it with rows ... but error ORA-12899 (current value exceeds the max length specified) occurs.
    Of course this type of error is no surprise, since after step 2, when I issue the command:
    SQL> DESC TEST;
    Name Type              Nullable Default Comments
    A    NUMBER(1)         Y
    B    VARCHAR2(10 BYTE) Y                <--------- the NLS_LENGTH_SEMANTICS of the exported file has been used
    Issuing the command
    SQL> alter table test modify b varchar2(10 char);
    Table altered
    before step 3, and then repeating step 3, solves the problem (ORA-12899).
    However, is there any other way to do this? (Imagine that there are hundreds or thousands of tables to be imported into such a db.)
    Note: I use Db10g Release 2.
    Thanks...
    Sim

    Actually, there is one: Metalink Note 313175.1.
    However, I tried to follow the scenarios described in:
    1) Metalink Note 144808.1 (point E1 - How to go to CHAR semantics? -> use exp/imp) and
    2) the steps described at:
    http://otn.oracle.com/oramag/oracle/03-mar/o23sql.html
    Has anybody accomplished it with exp/imp as described above on two dbs running different database character sets (one single-byte and the other UTF8)?
    Thanks....
    Sim
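    For the hundreds-of-tables case, one hedged approach (a sketch, assuming every affected column is a byte-semantics VARCHAR2) is to generate the ALTER statements from the dictionary after the rows-less import, run them, and only then import the rows:

    BEGIN
      FOR r IN (SELECT table_name, column_name, char_length
                FROM   user_tab_columns
                WHERE  data_type = 'VARCHAR2'
                AND    char_used = 'B')
      LOOP
        -- Keep the declared length, but count it in characters instead of bytes
        EXECUTE IMMEDIATE 'ALTER TABLE "' || r.table_name ||
                          '" MODIFY ("' || r.column_name ||
                          '" VARCHAR2(' || r.char_length || ' CHAR))';
      END LOOP;
    END;
    /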

  • Import/export in 10g express edition

    Hi all,
    My application was running on Windows 2000 Server with Oracle 8i. Now I want to move to XP with 10g Express Edition.
    I am getting an error (ORA-12899) while importing the 8i export file into 10g XE; data is being lost during the import.
    My data contains Arabic characters.
    Please help out.

    ORA-12899: value too large for column string (actual: string, maximum: string)
    This comes from the character set conversion from Arabic to Unicode: the converted data needs more bytes than it did in the Arabic character set.
    Can you create the tables on Oracle 10g XE before importing, using character semantics?
    Oracle8i    Oracle10g
    CHAR(N)     CHAR(N CHAR)
    VARCHAR2(N) VARCHAR2(N CHAR)
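    A minimal sketch of that mapping (table and column names are hypothetical): pre-create the table with CHAR semantics, then let imp load the rows into the existing table:

    CREATE TABLE customers
    ( code NUMBER,
      name VARCHAR2(50 CHAR)  -- was VARCHAR2(50), i.e. 50 bytes, in 8i
    );

    imp user/pass file=file1.dmp ignore=y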

  • Excel import fails in general

    Hi,
    I was unable to import data from an XLS file.
    This exact XLS was created beforehand via Table -> Export Data -> XLS export.
    Very strange.
    Here is the DDL (created with SQL Developer):
    REM START SHA AAA
    CREATE TABLE "SHA"."AAA"
    (     "TABLE_NAME" VARCHAR2(30 CHAR),
         "COLUMN_NAME" VARCHAR2(30 CHAR),
         "ANALYSE_RELEVANT_J_N" VARCHAR2(1 CHAR),
         "BEMERKUNG" VARCHAR2(500 CHAR),
         "TODO" VARCHAR2(500 CHAR)
    ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "SHA_DATA" ;
    REM END SHA AAA
    If I leave the column names in the first row, I get this:
    java.lang.NumberFormatException: For input string: "_name"
         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
         at java.lang.Integer.parseInt(Integer.java:447)
         at java.lang.Integer.<init>(Integer.java:620)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelTabPanel.refreshExcelColumnPositions(ExcelTabPanel.java:457)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportDialog.displayExcelDialog(ExcelImportDialog.java:213)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportUtil.insertExcelToTable(ExcelImportUtil.java:41)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportEditor.importExcelToTable(ExcelImportEditor.java:79)
         at oracle.dbtools.raptor.dialogs.importdata.ExcelImportEditor.importExcel(ExcelImportEditor.java:50)
         at oracle.dbtools.raptor.dialogs.actions.ExcelImport.launch(ExcelImport.java:11)
         at oracle.dbtools.raptor.controls.sqldialog.ObjectActionController.handleEvent(ObjectActionController.java:127)
         at oracle.ide.controller.IdeAction.performAction(IdeAction.java:551)
         at oracle.ide.controller.IdeAction$2.run(IdeAction.java:804)
         at oracle.ide.controller.IdeAction.actionPerformedImpl(IdeAction.java:823)
         at oracle.ide.controller.IdeAction.actionPerformed(IdeAction.java:521)
         at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1849)
         at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2169)
         at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:420)
         at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:258)
         at javax.swing.AbstractButton.doClick(AbstractButton.java:302)
         at javax.swing.plaf.basic.BasicMenuItemUI.doClick(BasicMenuItemUI.java:1000)
         at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(BasicMenuItemUI.java:1041)
         at java.awt.Component.processMouseEvent(Component.java:5488)
         at javax.swing.JComponent.processMouseEvent(JComponent.java:3126)
         at java.awt.Component.processEvent(Component.java:5253)
         at java.awt.Container.processEvent(Container.java:1966)
         at java.awt.Component.dispatchEventImpl(Component.java:3955)
         at java.awt.Container.dispatchEventImpl(Container.java:2024)
         at java.awt.Component.dispatchEvent(Component.java:3803)
         at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4212)
         at java.awt.LightweightDispatcher.processMouseEvent(Container.java:3892)
         at java.awt.LightweightDispatcher.dispatchEvent(Container.java:3822)
         at java.awt.Container.dispatchEventImpl(Container.java:2010)
         at java.awt.Window.dispatchEventImpl(Window.java:1774)
         at java.awt.Component.dispatchEvent(Component.java:3803)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:463)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:242)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:163)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:157)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:149)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:110)
    When I delete the first row (the column names) in the XLS file this does not happen, but it still doesn't work.
    Although I move all columns into the "Selected Columns" box, I get the message that no data has been loaded because no columns have been selected for insert.
    Andre

    When I reviewed the *.bad file in the directory C:\Documents and Settings\MyUsername\Application Data\SQL Developer, I noticed there were errors on the insert statements into my SQL work table.
    For example,
    REM SQL Error: ORA-12899: value too large for column "MySchema"."MyWorkTable"."cpt_code" (actual: 6, maximum: 5)
    I reviewed the data and made the necessary corrections. I retried importing my Excel data using the SQL Developer 1.5 import wizard, and it was successful.
    Oh no... wait a minute... This is still an issue! Today I tried to load a different CSV file into a different table, and the Import Wizard hangs again. The CSV file contains a header row, so I checked the Header checkbox in the Data Import Wizard. In the Data Preview, I noticed the columns of my data were not delimited as I expected.
    Here's an example of the layout of my data. Each column is basically a comma-separated field:
    COL_A COL_B COL_C
    12345, "Description of data, and more", 123.45
    Since COL_B (the description column) of my data includes a comma, it appears that the Data Import Wizard automatically splits the data at each comma.
    In Toad, I can specify that my data may be enclosed by a quote ("), followed by a comma (,), as the sample above shows. Is there a way to specify additional delimiters in the Data Import Wizard?

  • Imp/exp ORA-12899: value too large for column

    source :
    os: linux as 4 update4
    .bash_profile NLS_LANG=AMERICAN_AMERICA.US7ASCII
    I run: exp bill/admin001 file=bill0518.dmp bill rows=y
    oracle: 10.2.1
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CHARACTERSET US7ASCII
    target :
    os: linux as 4 update4
    .bash_profile NLS_LANG=AMERICAN_AMERICA.AL32UTF8
    I run:
    imp bill/admin001 file=bill0518.dmp
    oracle: 10.2.1
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CHARACTERSET AL32UTF8
    imp log
    Import: Release 10.2.0.1.0 - Production on Wed May 16 14:57:59 2007
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, Real Application Clusters, OLAP and Data Mining options
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in US7ASCII character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    export client uses AL32UTF8 character set (possible charset conversion)
    . importing BILL's objects into BILL
    . . importing table "MY_SESSION" 44 rows imported
    . . importing table "T1"
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "BILL"."T1"."NAME" (actual: 62, maximum: 5 0)
    Column 1 1
    Column 2 ÖйúÈË. 0 rows impo rted
    Import terminated successfully with warnings.

    Yes, it's probably due to the different character sets.
    A way around it is to change the setup of the new database to use CHAR as the default length semantics for VARCHAR2 columns, and then use Data Pump for your import/export: Data Pump creates tables containing VARCHAR2 columns using the target database's default semantics (normally BYTE), whereas exp/imp keeps the VARCHAR2 semantics of the original database.
    Best regards
    /Klaus
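    A hedged sketch of that suggestion (directory and file names are hypothetical; note that impdp cannot read a classic exp file, so the source must be re-exported with expdp first):

    -- on the target, before creating or importing anything
    -- (requires an spfile; new sessions pick it up, so reconnect first):
    ALTER SYSTEM SET NLS_LENGTH_SEMANTICS = CHAR SCOPE = BOTH;

    -- on the source:
    expdp bill/admin001 schemas=bill directory=dmpdir dumpfile=bill0518_dp.dmp

    -- on the target:
    impdp bill/admin001 directory=dmpdir dumpfile=bill0518_dp.dmp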

  • Import Datapump problems on Oracle 11g

    I am trying to import (using Data Pump) a DMP file from Oracle 10g (10.2.0.4).
    The 10g database server has nls_characterset of:
    SYSTEM@testers>
    PARAMETER VALUE
    NLS_CHARACTERSET WE8MSWIN1252
    1 row selected.
    The 11g server has:
    SQL> select * from v$nls_parameters
    2 where parameter = upper('nls_characterset');
    PARAMETER
    VALUE
    NLS_CHARACTERSET
    AL32UTF8
    Here are the errors:
    Import: Release 11.2.0.2.0 - Production on Tue Nov 8 16:40:11 2011
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/********@Testers11g DIRECTORY=impdp_dir DUMPFILE=expdpT1400.dmp REMAP_SCHEMA=t1400:t1500 LOGFILE=expdpt1500.log
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "T1500"."DPR_OPENING_BALANCE" 172.2 MB 4537640 rows
    . . imported "T1500"."IVC_OFFSITE_OPENING_BALANCE" 199.0 MB 5120520 rows
    . . imported "T1500"."IVC_OPENING_BALANCE" 187.3 MB 5171512 rows
    . . imported "T1500"."FIN_AR_STATEMENT_SAVE" 33.73 MB 93 rows
    . . imported "T1500"."AUDIT_CONTRACT" 17.74 MB 39855 rows
    . . imported "T1500"."S1_CONTRACT_FORMAT_TEMP" 6.078 KB 2 rows
    . . imported "T1500"."FO_OPENING_BALANCE" 12.42 MB 406974 rows
    . . imported "T1500"."S1_TICKET_CHARGE_SAVE" 11.62 MB 24 rows
    . . imported "T1500"."FO_TRANSACTION_SUMMARY" 11.76 MB 406974 rows
    . . imported "T1500"."FIN_GL_ACCOUNT_BALANCE" 10.89 MB 337260 rows
    . . imported "T1500"."FIN_GL_AUDIT_TRAIL_JOURNAL" 25.48 KB 30 rows
    . . imported "T1500"."FIN_GL_AUDIT_TRAIL" 1.024 MB 6648 rows
    . . imported "T1500"."DPR_TRANSACTION_DETAIL" 5.888 MB 74588 rows
    ORA-02374: conversion error loading table "T1500"."AUDIT_RELEASE"
    ORA-12899: value too large for column SHIP_TO_SHORT_NAME (actual: 11, maximum: 10)
    ORA-02372: data for row: SHIP_TO_SHORT_NAME : 0X'C8746F696C65204C656D'
    I also tried to Import a DMP (from Export) from Oracle10g server and got this result:
    Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    export client uses US7ASCII character set (possible charset conversion)
    . importing T1400's objects into T1500
    . . importing table "A1_ASYNCH_COMMUNICATION" 0 rows imported
    . . importing table "A1_BROADCAST_LOCK" 1 rows imported
    . . importing table "A1_BROADCAST_MESSAGE" 6 rows imported
    . . importing table "A1_BROADCAST_USER" 0 rows imported
    . . importing table "A1_BUILD_SCRIPT" 437 rows imported
    . . importing table "A1_DBC_APPLIED_SCRIPT" 0 rows imported
    . . importing table "A1_ERROR_HANDLER" 5880 rows imported
    . . importing table "A1_ERROR_HANDLER_COMMON" 2956 rows imported
    . . importing table "A1_ERROR_HANDLER_TITLE"
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "T1500"."A1_ERROR_HANDLER_TITLE"."TITLE_DESCRIPTION" (actual: 34, maximum: 32)
    Column 1 7
    Column 2 FR
    Column 3 Relâche des Fact. Récurr des CAP
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    The errors always occur when I try to import a column containing French accented characters.
    What am I missing?

    Multiple posts on the same topic are considered rude and bordering on spam - https://forums.oracle.com/forums/thread.jspa?threadID=2308355

  • Import export woes

    Hello,
    I am having problems exporting and re-importing with a different application id in the online environment.
    I am getting:
    ORA-20001: Errore GET_BLOCK. ORA-20001: Errore GET_STMT. ORA-06502: PL/SQL: numeric or value error: character string buffer too small ORA-06502: PL/SQL: numeric or value error: character string buffer too small.
    According to previous messages, this error should be linked to the character set, but I exported and imported the file with character set UTF-8 on the same machine, from the same client, as DOS files.
    When I ran the export script on my test box from SQL*Plus, it went in without any problems whatsoever, while I keep receiving the same error if I perform the import from the HTML DB import application page (on my box).
    So, just to make a test, I imported the application script as a normal script and I received a different error altogether:
    Error: ORA-20000: Unable to get the block of code: ORA-20000: Unable to get statement: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    ORA-12899: value too large for column "FLOWS_010500"."WWV_FLOW_SW_STATEMENTS"."TEXT" (actual: 412, maximum: 256)
    This leads me to ask for some improvement in the user interface, like:
    1) create an optional region where dbms_output messages can be read from.
    2) spawn dbms_application_info lines in the export script so that a user can check what was the last operation performed.
    3) support the "PROMPT" sqlplus command in some way inside the SQL command processor when executing scripts.
    Additional question:
    Is it possible to have a copy of the same application, working on top of the same schema, but with a different "skin", that is, a different UI template?
    This is what I was trying to test, but I was stopped by the import problem.
    Thanks and bye,
    Flavio

    Flavio,
    You will need to set the character set portion of your NLS_LANG environment variable to match the character set of the exported file. Note that this has to be set prior to connecting via SQL*Plus.
    The format of the NLS_LANG environment variable is language_territory.characterset. There is no locale-specific information in an application export file, so it is irrelevant what the language and territory settings are, as long as they are valid. But the characterset value is all important.
    For example, if I exported an application from HTML DB and the file character set is "Unicode UTF-8", I would want to set my NLS_LANG environment variable to something like american_america.al32utf8. Likewise, if my file character set is "Western European ISO-8859-1", I would want to set my NLS_LANG environment variable to something like american_america.we8iso8859p1.
    The character set portion of NLS_LANG must match your exported file character set, regardless of the character set of the database you are importing into.
    Joel
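    For example (a sketch; adjust the value to match your exported file's character set), on Linux/UNIX before starting SQL*Plus:

    export NLS_LANG=american_america.al32utf8

    and on Windows:

    set NLS_LANG=american_america.al32utf8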

  • Import problem due to Character set

    I did an export of an entire schema using the original export utility (not Data Pump) in 10g Release 1 (10.1.0.3.0). When I tried to import it, I got the error mentioned below. It has something to do with the character set. What settings do I have to give during import to bypass this character set problem? After taking the export dmp file I deleted the schema, so all I've got is the export dmp file.
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    . importing MOTORGM's objects into 'MOTORGM2'
    . . importing table "SKU_DETAILS" 1872 rows imported
    . . importing table "COMP_LLU_DETAILS" 1877 rows imported
    . . importing table "SHIP_ROUTING" 0 rows imported
    . . importing table "SHIP_ROUTING_CODE" 0 rows imported
    . . importing table "AI_DETAILS" 103 rows imported
    . . importing table "AI_DETAILS_BKP" 103 rows imported
    IMP-00019: row rejected due to ORACLE error 12899.
    IMP-00003: ORACLE error 12899 encountered.
    ORA-12899: value too large for column "MOTORGM2"."CLASS_REC"(actual: 1017, maximum: 255)
    Column1....
    Column2....

    -- db unicode charset
    SQL> create table z_test_unicode (c varchar2(5));
    Table created.
    SQL> insert into z_test_unicode values('éèîàö');
    insert into z_test_unicode values('éèîàö')
    ERROR at line 1:
    ORA-01401: inserted value too large for column
    SQL> insert into z_test_unicode values('é');
    1 row created.
    SQL> select vsize(c) from z_test_unicode;
    VSIZE(C)
    3
    SQL> insert into z_test_unicode values('éà');
    insert into z_test_unicode values('éà')
    ERROR at line 1:
    ORA-01401: inserted value too large for column
    SQL> drop table z_test_unicode
    2 /
    Table dropped.
    -- db without unicode charset
    Connected.
    SQL> create table z_test_unicode (c varchar2(5));
    Table created.
    SQL> insert into z_test_unicode values('éèîàö');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select vsize(c) from z_test_unicode;
    VSIZE(C)
    5
    One possible solution to investigate for this kind of issue:
    take a look at the init parameter nls_length_semantics (set to BYTE by default) --> to be changed to CHAR
    SQL> show parameter semanti
    NAME TYPE VALUE
    nls_length_semantics string BYTE
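    Completing that demo (a sketch): with CHAR semantics, the same five-character string fits on the Unicode database:

    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;
    CREATE TABLE z_test_unicode (c VARCHAR2(5));  -- now effectively VARCHAR2(5 CHAR)
    INSERT INTO z_test_unicode VALUES ('éèîàö');  -- fits: 5 characters
    SELECT vsize(c) FROM z_test_unicode;          -- still reports the byte count, not 5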

  • SQL Developer: import from Excel bug

    SQL Developer 1.1.2.25 Build MAIN 25.79
    This is a test for import from Excel.
    Test table:
    create table bla
    (x number null,
    y varchar2(1) null);
    Excel data:
    x y
    1 a
    2 b
    3
    SQL Developer generates:
    Error at line 3: insert into BLA (X,Y) VALUES(3,'NULL')
    IMPORT into table BLA complete.
    Inserted 2 rows.
    Failed to insert 1 row.
    SQL Error: ORA-12899: value too large for column "BCRCEK"."BLA"."Y" (actual: 4, maximum: 1)
    I think there is a bug, because NULL must be generated without apostrophes:
    insert into BLA (X,Y) VALUES(3,NULL);

    Hi Barry,
    That's GREAT NEWS!!!
    So, I just update the current release 1.1.2.25 to 1.1.2.25.79?
    Thanks for the great work.
    Zack
    Oops!!! I am already using 1.1.2.25.79, same as bcrcek above. So I need to wait for a patch release?
    Regards
    Zack

  • Sql*loader 11g - Error ORA-12899

    Hi All,
    I'm using SQL*Loader to insert some CSV files from DB2 into an 11g db.
    My server is Linux with Oracle 11.2.0.1.0 - 64bit.
    I receive the error ORA-12899: value too large for column on every file I try to load.
    NLS for my db is:
    SQL> select parameter, value from nls_database_parameters where parameter like '%CHARACTERSET%';
    PARAMETER VALUE
    NLS_CHARACTERSET AL32UTF8
    NLS_NCHAR_CHARACTERSET UTF8
    Have you any idea?
    Thanks very much for help
    Regards
    Giovanni

    On the internet I found this short message:
    "AL32UTF8 is a multi-byte character set; that means some characters are stored in more than 1 byte, and that's true for these special characters.
    If you have the same table definitions in both databases, you will likely face error ORA-12899.
    This Metalink note discusses the problem; it's also applicable to SQL*Loader:
    Import reports "ORA-12899: Value too large for column" when using BYTE semantic
    Doc ID: Note:563893.1"
    On Metalink, I can see the note is linked to an Oracle internal bug for Oracle 11g.
    I'm waiting for your suggestions... thanks very much in advance.
    Regards.
    Giovanni
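    One hedged workaround sketch for SQL*Loader (file, table, and column names are hypothetical, and CHARACTERSET should match the actual encoding of the CSV files): declare character length semantics in the control file, so field lengths are counted in characters rather than bytes:

    LOAD DATA
    CHARACTERSET UTF8 LENGTH SEMANTICS CHAR
    INFILE 'data.csv'
    APPEND INTO TABLE target_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    ( id,
      description CHAR(200)  -- 200 characters, not 200 bytes
    )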
