ORA-02374: conversion error loading table during import using IMPDP

Hi All,
We are trying to migrate data from one database to another.
The source database has the following character set:
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
US7ASCII
The destination database has the following character set:
SQL> select value from nls_database_parameters where parameter='NLS_CHARACTERSET';
VALUE
AL32UTF8
We took an export of the whole database using expdp, and when we try to import it into the destination database using impdp, we get the following error:
ORA-02374: conversion error loading table <TABLE_NAME>
ORA-12899: value too large for column <COLUMN NAME> (actual: 42, maximum: 40)
ORA-02372: data for row:<COLUMN NAME> : 0X'4944454E5449464943414349E44E204445204C4C414D414441'
Kindly let me know how to overcome this issue on the destination.
Thanks & Regards,
Vikas Krishna

Hi,
You can overcome this issue by increasing the column width in the target database to the maximum size required, so that all the data can be imported into the table successfully.
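For example, a minimal sketch of the two usual fixes, using placeholder schema/table/column names rather than the real ones from the dump: either widen the column outright, or switch it to character-length semantics so that multibyte expansion under AL32UTF8 no longer overflows a byte limit.
-- Option 1: widen the column (byte semantics); 60 is an illustrative size
ALTER TABLE app_owner.call_data MODIFY (call_label VARCHAR2(60));
-- Option 2: keep the logical length but count characters instead of bytes
ALTER TABLE app_owner.call_data MODIFY (call_label VARCHAR2(40 CHAR));
After altering the affected tables, those tables can be re-imported with TABLE_EXISTS_ACTION=APPEND or TRUNCATE.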
Regards

Similar Messages

  • Error while impdp: ORA-02374: conversion error loading table

    Hi,
I am trying to convert the character set from WE8ISO8859P1 to AL32UTF8 using expdp/impdp. To do this, I first converted WE8ISO8859P1 to WE8MSWIN1252 in the source DB to get rid of "lossy" data. I created a new (target) database with character set AL32UTF8 and nls_length_semantics = 'CHAR', and created all the tablespaces as in the source DB with autoextend on. I took a full export (expdp) of the source DB excluding TABLESPACE, STATISTICS, INDEX, CONSTRAINT, and REF_CONSTRAINT, and imported it into the target DB using impdp. I found the below error in the import log file:
    ORA-02374: conversion error loading table "SCTCVT"."SPRADDR_CVT"
    ORA-26093: input data column size (44) exceeds the maximum input size (40)
    ORA-02372: data for row: CONVERT_STREET_LINE1 : 0X'20202020202020202020202020202020202020202020202020'
I checked with a select query on both DBs, with the results below.
    source DB:
    04:58:42 SQL> select count(*) from "SCTCVT"."SPRADDR_CVT";
    COUNT(*)
    74553
    target DB:
    04:59:24 SQL> select count(*) from "SCTCVT"."SPRADDR_CVT";
    COUNT(*)
    74552
Please suggest a solution to this.
    Thanks and Regards.

Thanks for your update, Maher.
    09:15:53 SQL> desc "SCTCVT"."SPRADDR_CVT"
    Name Null? Type
    SPRADDR_PIDM NUMBER(8)
    CONVERT_PIDM VARCHAR2(9 CHAR)
    SPRADDR_ATYP_CODE VARCHAR2(2 CHAR)
    CONVERT_ATYP_CODE VARCHAR2(2 CHAR)
    SPRADDR_SEQNO NUMBER(2)
    CONVERT_SEQNO VARCHAR2(2 CHAR)
    SPRADDR_FROM_DATE DATE
    CONVERT_FROM_DATE VARCHAR2(8 CHAR)
    SPRADDR_TO_DATE DATE
    CONVERT_TO_DATE VARCHAR2(8 CHAR)
    SPRADDR_STREET_LINE1 VARCHAR2(30 CHAR)
    CONVERT_STREET_LINE1 VARCHAR2(40 CHAR)
    SPRADDR_STREET_LINE2 VARCHAR2(30 CHAR)
    CONVERT_STREET_LINE2 VARCHAR2(40 CHAR)
    SPRADDR_STREET_LINE3 VARCHAR2(30 CHAR)
    CONVERT_STREET_LINE3 VARCHAR2(40 CHAR)
    SPRADDR_CITY VARCHAR2(20 CHAR)
    CONVERT_CITY VARCHAR2(25 CHAR)
    SPRADDR_STAT_CODE VARCHAR2(3 CHAR)
    CONVERT_STAT_CODE VARCHAR2(25 CHAR)
    SPRADDR_ZIP VARCHAR2(10 CHAR)
    CONVERT_ZIP VARCHAR2(15 CHAR)
    SPRADDR_CNTY_CODE VARCHAR2(5 CHAR)
    CONVERT_CNTY_CODE VARCHAR2(5 CHAR)
    SPRADDR_NATN_CODE VARCHAR2(5 CHAR)
    CONVERT_NATN_CODE VARCHAR2(5 CHAR)
    SPRADDR_PHONE_AREA VARCHAR2(3 CHAR)
    CONVERT_PHONE_AREA VARCHAR2(3 CHAR)
    SPRADDR_PHONE_NUMBER VARCHAR2(7 CHAR)
    CONVERT_PHONE_NUMBER VARCHAR2(7 CHAR)
    SPRADDR_PHONE_EXT VARCHAR2(4 CHAR)
    CONVERT_PHONE_EXT VARCHAR2(4 CHAR)
    SPRADDR_STATUS_IND VARCHAR2(1 CHAR)
    CONVERT_STATUS_IND VARCHAR2(1 CHAR)
    SPRADDR_ACTIVITY_DATE DATE
    CONVERT_ACTIVITY_DATE VARCHAR2(8 CHAR)
    SPRADDR_USER VARCHAR2(30 CHAR)
    CONVERT_USER VARCHAR2(30 CHAR)
    SPRADDR_ASRC_CODE VARCHAR2(4 CHAR)
    CONVERT_ASRC_CODE VARCHAR2(4 CHAR)
    SPRADDR_DELIVERY_POINT NUMBER(2)
    CONVERT_DELIVERY_POINT VARCHAR2(2 CHAR)
    SPRADDR_CORRECTION_DIGIT NUMBER(1)
    CONVERT_CORRECTION_DIGIT VARCHAR2(1 CHAR)
    SPRADDR_CARRIER_ROUTE VARCHAR2(4 CHAR)
    CONVERT_CARRIER_ROUTE VARCHAR2(4 CHAR)
    SPRADDR_GST_TAX_ID VARCHAR2(15 CHAR)
    CONVERT_GST_TAX_ID VARCHAR2(15 CHAR)
    SPRADDR_REVIEWED_IND VARCHAR2(1 CHAR)
    CONVERT_REVIEWED_IND VARCHAR2(1 CHAR)
    SPRADDR_REVIEWED_USER VARCHAR2(30 CHAR)
    CONVERT_REVIEWED_USER VARCHAR2(30 CHAR)
    SPRADDR_DATA_ORIGIN VARCHAR2(30 CHAR)
    CONVERT_DATA_ORIGIN VARCHAR2(30 CHAR)
    SPRADDR_CVT_RECORD_ID NUMBER(8)
    SPRADDR_CVT_STATUS VARCHAR2(1 CHAR)
    SPRADDR_CVT_JOB_ID NUMBER(8)
So here we can see its size is 40 (CONVERT_STREET_LINE1 VARCHAR2(40 CHAR)).
Shall I go ahead and alter the column?
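    For reference, a sketch of the checks and the likely fix, assuming the rejected rows are simply longer than 40 characters after conversion (the object names come from the log above, and the one-row difference between the two counts matches the single rejected row):
    -- In the source DB: count the rows that will not fit in 40 characters
    SELECT COUNT(*) FROM "SCTCVT"."SPRADDR_CVT"
    WHERE LENGTH(CONVERT_STREET_LINE1) > 40;
    -- In the target DB: widen the column to at least the reported input size (44),
    -- then re-import just this table
    ALTER TABLE "SCTCVT"."SPRADDR_CVT" MODIFY (CONVERT_STREET_LINE1 VARCHAR2(44 CHAR));
    Since the rejected value shown is nothing but spaces (0x20), trimming trailing blanks in the source before the export may be an alternative to widening the column.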

  • KEEP GETTING ORA-04030: OUT OF PROCESS MEMORY During import using DataPump

    Hi,
I know I have had several issues with my Data Pump imports, but I am stuck again, people :(
We received an expdp dump from an external client and are trying to append the data to our existing DB. When we do this, we keep getting:
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-31693: Table data object "IWORKS"."TBLEDIFILE_DTL" failed to load/unload and is being
    skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-04030: out of process memory when trying to allocate 64528 bytes (sort subheap,sort key)
    and so on...for all the tables.
I used 2 different impdp commands to see if I would get different results. They are:
impdp system@iworksdb directory=DATA_PUMP_DIR dumpfile=expdpgemdev.dmp
    job_name=impgemdev041708_01 INCLUDE=TABLE TABLE_EXISTS_ACTION=APPEND
    SCHEMAS=GEMDEV LOGFILE=IMPIWORKS_BOON.log REMAP_SCHEMA=GEMDEV:IWORKS
    REMAP_TABLESPACE=IWORKS_INDEX:IWORKS_IDX REMAP_TABLESPACE=IWORKS_IOT:IWORKS_IDX
    REMAP_TABLESPACE=IWORKS_TABLES:IWORKS_TABLES EXCLUDE=GRANT exclude=statistics
    STREAMS_CONFIGURATION=N
    impdp system@iworksdb directory=DATA_PUMP_DIR dumpfile=expdpgemdev.dmp job_name=impgemdev041708_02 SCHEMAS=GEMDEV
    LOGFILE=IMPIWORKS_BOON.log REMAP_SCHEMA=GEMDEV:IWORKS
    REMAP_TABLESPACE=IWORKS_INDEX:IWORKS_IDX REMAP_TABLESPACE=IWORKS_IOT:IWORKS_IDX
    REMAP_TABLESPACE=IWORKS_TABLES:IWORKS_TABLES EXCLUDE=GRANT exclude=statistics
    STREAMS_CONFIGURATION=N
I have also enabled the 3GB switch on my Windows 2003 Server, which has a total of 4 GB of RAM and a 2.6 GHz dual core:
    Microsoft Windows [Version 5.2.3790]
    (C) Copyright 1985-2003 Microsoft Corp.
    C:\Documents and Settings\rdgadmin>cd ../..
    C:\>type boot.ini
    [boot loader]
    redirect=UseBiosSettings
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003, Enterprise" /noexecute=optout /fastdetect /3GB /redirect
Here is my parameter file as well, which shows how I set up my memory allocation.
    # Memory Allocations
    iworksdb.__db_cache_size=0
    iworksdb.__java_pool_size=0
    iworksdb.__large_pool_size=0
    iworksdb.__shared_pool_size=0
    iworksdb.__streams_pool_size=0
    *.db_16k_cache_size=673741824
    *.db_block_size=8192
    *.db_recovery_file_dest_size=1147483648
    *.pga_aggregate_target=1010612736
    *.sga_max_size=1521225472
    *.sga_target=1321225472
    # Instance Parameters
    *.control_files='C:\ORACLE\FILES\IWORKSDB\control01.ctl',
    'R:\ORACLE\FILES\IWORKSDB\control02.ctl',
    'C:\ORACLE\FILES\IWORKSDB\control03.ctl'
    *.db_domain=''
    *.db_name='iworksdb'
    *._kgl_large_heap_warning_threshold=0
    *.compatible='10.2.0.4.0'
    *.job_queue_processes=20
    *.open_cursors=20000
    *.session_cached_cursors=8000
    *.processes=300
    *.remote_login_passwordfile='EXCLUSIVE'
    *.undo_management='AUTO'
    *.undo_tablespace='UNDOTBS1'
    *.db_recovery_file_dest='c:\ORACLE\FILES\IWORKSDB'
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=iworksdbXDB)'
    *.statistics_level=ALL
    *.db_writer_processes=4
    # CBO Settings
    *.optimizer_mode='FIRST_ROWS'
    *.optimizer_index_cost_adj=20
    *.query_rewrite_enabled=TRUE
    *.STAR_TRANSFORMATION_ENABLED=TRUE
    *._NEWSORT_ENABLED=TRUE
    *.OPTIMIZER_DYNAMIC_SAMPLING=4
    *.optimizer_index_caching=75
    *.optimizer_index_cost_adj=15
    Continued on the next post....

    Continuation....
    Here is my log file from the impdp:
    Import: Release 10.2.0.4.0 - Production on Thursday, 17 April, 2008 14:35:31
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."IMPGEMDEV041708_01" successfully loaded/unloaded
    Starting "SYSTEM"."IMPGEMDEV041708_01": system/********@iworksdb directory=DATA_PUMP_DIR dumpfile=expdpgemdev.dmp job_name=impgemdev041708_01 INCLUDE=TABLE TABLE_EXISTS_ACTION=APPEND SCHEMAS=GEMDEV LOGFILE=IMPIWORKS_BOON.log REMAP_SCHEMA=GEMDEV:IWORKS REMAP_TABLESPACE=IWORKS_INDEX:IWORKS_IDX REMAP_TABLESPACE=IWORKS_IOT:IWORKS_IDX REMAP_TABLESPACE=IWORKS_TABLES:IWORKS_TABLES STREAMS_CONFIGURATION=N
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39152: Table "IWORKS"."SYS_TOKENTYPE" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-31693: Table data object "IWORKS"."TBLEDIFILE_DTL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-04030: out of process memory when trying to allocate 64528 bytes (sort subheap,sort key)
    ORA-31693: Table data object "IWORKS"."TBLSUBSCRIBERBENEFITS_DTL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-04030: out of process memory when trying to allocate 64528 bytes (sort subheap,sort key)
    ORA-31693: Table data object "IWORKS"."TBLROUTE_DTL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-04030: out of process memory when trying to allocate 64528 bytes (sort subheap,sort key)
    ;;; Import> kill_job
    Job "SYSTEM"."IMPGEMDEV041708_01" stopped due to fatal error at 14:42:54
So basically I have looked on Metalink, and (via an SR I opened with them) they are telling me that I should look at Note 233869.1, Diagnosing and Resolving ORA-4030 Errors.
I did this, and the only thing I can think of that applies to me is that maybe I have sized my SGA too big? What else can cause this issue?
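    One possible explanation, offered as an assumption rather than a diagnosis: on 32-bit Windows the whole instance runs as a single process, so the SGA, every session's PGA, and per-thread overhead must all fit inside the roughly 3 GB of address space the /3GB switch allows. The settings above ask for about 1.5 GB of SGA plus a 1 GB PGA target, leaving little room for the Data Pump workers' sort areas. A hedged sketch of one way to test this (the reduced values are illustrative only; if the instance runs on a pfile, edit the file directly instead):
    -- Shrink the memory targets, restart the instance, and retry the import
    ALTER SYSTEM SET sga_max_size = 800M SCOPE=SPFILE;
    ALTER SYSTEM SET sga_target = 800M SCOPE=SPFILE;
    ALTER SYSTEM SET pga_aggregate_target = 400M SCOPE=SPFILE;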

  • CRM SYSFAIL - Error when decompressing during Import

    Hi All,
We are getting the error "Error when decompressing during import" in SMQ2, in function module BAPI_CRM_SAVE, and the status is SYSFAIL. We are on CRM 4.0, patch level 7.
Please see the source code extract:
    ********************************************************************
    * define a little macro to add a table name to l_t_tables
      DEFINE lmacro_add_import_table.
        l_t_tables-name = 'T_&1&2&3&4'.
        l_t_tables-itabname = 'C_S_BLOCK-T_&1&2&3&4'.
        append l_t_tables.
      END-OF-DEFINITION.
    * define which tables have to be imported
      macro_execute: lmacro_add_import_table.
      lmacro_add_import_table:
        m h d r,
        p a r 1,
        p a r 2.
    * carry out the IMPORT statement to decompress data
    * (the ">" marker in the dump points at the IMPORT statement below)
      IMPORT             (l_t_tables)
        FROM DATABASE    baldat(al)
        ID               i_s_ldat-log_handle
        USING            baldat_import
        IGNORING CONVERSION ERRORS.
    ENDFORM.                    "log_block_decompress
    Thanks & Regards
    Sujith


  • Error is encountered during import ABAP phase

    Hi All,
While trying to install ECC6 on Windows 2003 Server, the error below is encountered during the import ABAP phase. Can anyone help me in this regard?
    ERROR 2009-07-08 08:39:33
    CJS-30022  Program 'Migration Monitor' exits with error code 103.
    For details see log file(s) import_monitor.java.log, import_monitor.log.
    Thanks
    Sandeep Singh DS

Hi,
Below are the logs from import_monitor.log and import_monitor.java.log.
import_monitor.log:
    TRACE: 2009-07-11 21:11:30 com.sap.inst.migmon.LoadTask run
    Loading of 'SAPSSEXC' import package is started.
    TRACE: 2009-07-11 21:11:30 com.sap.inst.migmon.LoadTask processPackage
    Loading of 'SAPSSEXC' import package into database:
    D:\usr\sap\EP2\SYS\exe\uc\NTI386\R3load.exe -i SAPSSEXC.cmd -dbcodepage 4103 -l SAPSSEXC.log -stop_on_error
    TRACE: 2009-07-11 21:11:30 com.sap.inst.migmon.LoadTask run
    Loading of 'SAPAPPL1' import package is started.
    TRACE: 2009-07-11 21:11:30 com.sap.inst.migmon.LoadTask processPackage
    Loading of 'SAPAPPL1' import package into database:
    D:\usr\sap\EP2\SYS\exe\uc\NTI386\R3load.exe -i SAPAPPL1.cmd -dbcodepage 4103 -l SAPAPPL1.log -stop_on_error
    ERROR: 2009-07-11 21:12:38 com.sap.inst.migmon.LoadTask run
    Loading of 'SAPAPPL1' import package is interrupted with R3load error.
    Process 'D:\usr\sap\EP2\SYS\exe\uc\NTI386\R3load.exe -i SAPAPPL1.cmd -dbcodepage 4103 -l SAPAPPL1.log -stop_on_error' exited with return code 2.
    For mode details see 'SAPAPPL1.log' file.
    Standard error output:
    sapparam: sapargv( argc, argv) has not been called.
    sapparam(1c): No Profile used.
    sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
    ERROR: 2009-07-11 21:15:10 com.sap.inst.migmon.LoadTask run
    Loading of 'SAPSSEXC' import package is interrupted with R3load error.
    Process 'D:\usr\sap\EP2\SYS\exe\uc\NTI386\R3load.exe -i SAPSSEXC.cmd -dbcodepage 4103 -l SAPSSEXC.log -stop_on_error' exited with return code 2.
    For mode details see 'SAPSSEXC.log' file.
    Standard error output:
    sapparam: sapargv( argc, argv) has not been called.
    sapparam(1c): No Profile used.
    sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
    WARNING: 2009-07-11 21:15:30
    Cannot start import of packages with views because not all import packages with tables are loaded successfully.
    WARNING: 2009-07-11 21:15:30
    2 error(s) during processing of packages.
import_monitor.java.log:
    Java HotSpot(TM) Client VM (build 1.4.2_12-b03, mixed mode)
    Import Monitor jobs: running 1, waiting 2, completed 16, failed 0, total 19.
    Import Monitor jobs: running 2, waiting 1, completed 16, failed 0, total 19.
    Loading of 'SAPAPPL1' import package: ERROR
    Import Monitor jobs: running 1, waiting 1, completed 16, failed 1, total 19.
    Loading of 'SAPSSEXC' import package: ERROR
    Import Monitor jobs: running 0, waiting 1, completed 16, failed 2, total 19.
    Thanks,
    Sandeep

  • DTW Sort Error - after mapping during import

    I got the "Sort Error - after mapping during import" message in DTW. (version 8.8)
    I would like to import warehouse info for items.
    My itemcodes are fix 15 character numbers, like this: 10204150020011
    The note nr. [1331130|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/oss_notes/sdn_oss_sbo_dtw/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31333331313330%7d] said that I can't use recordkey like this.
    Do you have any idea how can I import thees information with DTW?
    Thank you,
    Attila Sarkady

Dear All,
Please give correct and complete information.
I have tested every kind of combination:
- recreate the template with DTW for Items (I have UDFs)
- ADD the column RecordKey (unbelievable that we have to do this)
- enter the number 1 in it (even though we already had 850 items in the database)
(and of course Gordon's hints - columns as text, start with basic... - and use CSV format)
Now at last I can import the record line.
Greetings,
Philippe

  • Database full compile while doing schema import using impdp

Hi,
Oracle 10g.
The database does a full compile while doing a schema import using impdp.
What is happening here?
Regards,
Deepak

My scenario:
I need to import a particular schema from a full export dump, which was taken using expdp. While importing, I am using remap_schema for that single schema.
But it tries to import the full dump and compiles all the schema objects.
Regards,
Deepak
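    A hedged sketch of how the import can be restricted to a single schema from a full dump (directory, file, and schema names below are placeholders, not taken from the original post): combining SCHEMAS with REMAP_SCHEMA makes impdp load only the listed schema instead of the entire dump.
    impdp system directory=DATA_PUMP_DIR dumpfile=full_db.dmp logfile=imp_one_schema.log SCHEMAS=SCOTT REMAP_SCHEMA=SCOTT:SCOTT_COPY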

  • Exclude mlog$ table during import

    Hi Experts,
Could you please help me with how to exclude the materialized view log tables (MLOG$ tables) during a Data Pump import?
    Error:
    ORA-31693: Table data object "RM_DBA"."MLOG$_ACTUAL_INCOME" failed to load/unload and is being skipped due to error:
    Regards,
    Boris

I found the answer here:
    impdp job_name=schemaexp1 schemas=sthomas dumpfile=exp_schem.dmp logfile=imp_schema.log directory=exp_dir EXCLUDE=table:\"like \'MLOG\$\%\'\",MATERIALIZED_VIEW_LOG,MATERIALIZED_VIEW
    Original Post: http://www.acehints.com/2013/09/how-to-exclude-mlog-materialized-view.html
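    As a side note, the same EXCLUDE clauses are easier to maintain in a parameter file, which avoids the shell escaping entirely. A sketch, reusing the names from the command above (the parfile name itself is hypothetical):
    # imp_schema.par
    schemas=sthomas
    dumpfile=exp_schem.dmp
    logfile=imp_schema.log
    directory=exp_dir
    exclude=TABLE:"LIKE 'MLOG$%'"
    exclude=MATERIALIZED_VIEW_LOG
    exclude=MATERIALIZED_VIEW
    Then run: impdp job_name=schemaexp1 parfile=imp_schema.par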

  • Se14- conversion error, unlock table

    Hi.
I have added a new append to MCHA. I modified this append, and afterwards I went to SE14 in order to adjust and activate the table.
But I got a conversion error. Now the append structure is partially active and only the temporary table for MCHA exists. I cannot fix the conversion problem. The data in MCHA is not important; this is the development system and it contains little data. What happens if I select "Unlock Table" in SE14?
    Thanks

Hi all,
Regarding the data, I lost it, but I did not mind since this is a test development system and we are doing the first steps of implementing the MM module, so the data was not relevant for us.
Archana,
Our problem was caused by bad conversion data; in the end I unlocked the table. Afterwards, table MCHA existed in the dictionary but not in the database, and SE14 only showed the option "create database". We used this to create it again, and it worked. It created the table in inactive status; I fixed the append structure and activated the table. Then I went to DB02 to perform a consistency check, where I learned that the indexes and table views were lost and I had to create them again.
On the other hand, if you want to keep your data, I think you have several options:
  - contact the system team to restore it from a backup, or
  - before unlocking the table, save the temporary tables created for the adjustment (tables like QCMMCHA ...) that contain the data, or
  - correct the conversion problem and continue the adjustment (in my case I couldn't fix the problem; I tried to modify the append that caused it but could not activate it again).
Hope this helps you.

  • Conversion error in Table control for Negative values

Hello all,
I am using a table control with 6 columns whose values come from a Z table. One of the columns is an input/output field that can also hold negative values; its type is DEC.
Now when I run the report, a dump occurs with the error message "Conversion Error".
I think it is because of the negative values, even though in the attributes of that column I had selected the "With sign" checkbox.
When I run the report and a negative value is in the Z table, I want that negative value displayed in the table control as well.
Please suggest.
    Thanking you.
    Regards.

Hemant,
Refer to the following link:
Table Control : Conversion error
    Thanks
    Amol Lohade

  • Conversion error on table maintanance when i drag down the contents

Hi SAP,
I have an issue with a table, i.e., a table with a table maintenance generator. Say for instance the table has 50,000 records; when I try to scroll down to see the contents, as soon as I scroll down I get the error 'CONVERSION ERROR'.
What could be the problem?
Can I have some input please?
Thank you,
pasala.

Check these out; it has to be because of some negative value:
DYNPRO_FIELD_CONVERSION ERROR
SM30 Abends on Ztable rec change that contains a Negative QTY?

  • Creating record on second main table during import

    Hello all,
    I am importing data to a main table (materials), and I have a second main table linked to the materials main table to store supporting data.  Assuming I have a new record being imported that contains an entity that doesn't exist in that second main table, is it possible to create a record inside the second main table?  This functionality exists for lookup tables, if the lookup record doesn't exist you can configure the map to create the record in the lookup table.  Can the same thing be accomplished with multiple main tables?  I'm having trouble with this because I can't get any field aside from the primary key on the second main table to show up in the destination fields in the import manager.

    Hi,
As you said: assuming I have a new record being imported that contains an entity that doesn't exist in the second main table, is it possible to create a record inside the second main table?
This scenario is quite possible.
I have a workaround, and it should work, in my view:
In this case you have to create two maps, one for the main table (primary) import and another for the second main table (secondary).
Before the main table import, the source file should first be imported into the second main table by putting it on the ready inbound port of the second main table.
That way, record entities that do not exist in the secondary main table get created, and existing ones get updated.
Now, when the same source file is imported into the main table, the record entity will already exist in the secondary main table, and as such you will not face any issue while importing through the main table.
    Kindly let me know if you face any issue.
    Thanks and Regards,
    Mandeep Saini

  • Error with SAPDFACT during import. (BW System Export / Import)

    Hi,
    I'm performing an import into an Oracle DB using an export from an Oracle DB of a BW system.
I'm getting the following errors from SAPDFACT. It proceeds if I set the affected tables' indexes to "ign", but there are hundreds of indexes; must I do this for all of them?
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\NW04\COPY\IMPORT\SYSTEM\ABAP\ORA\NUC\DB/SAPDFACT.cmd -l C:\Program Files\sapinst_instdir\NW04\COPY\IMPORT\SYSTEM\ABAP\ORA\NUC\DB/SAPDFACT.log -stop_on_error -k 1Wegd5M50Dp01eqtd1M80Ae2
    DbSl Trace: ORA-1403 when accessing table SAPUSER
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (GSI) INFO: dbname   = "EAD20090719090758                                                                                "
    (GSI) INFO: vname    = "ORACLE                          "
    (GSI) INFO: hostname = "MYBWESSDEV02                                                    "
    (GSI) INFO: sysname  = "Windows NT"
    (GSI) INFO: nodename = "MYBWESSDEV02"
    (GSI) INFO: release  = "5.2"
    (GSI) INFO: version  = "3790 Service Pack 1"
    (GSI) INFO: machine  = "4x AMD64 Level 15 (Mod 33 Step 2)"
    DbSl Trace: Error 2158 in exec_immediate() from oci_execute_stmt(), orpc=0
    DbSl Trace: ORA-2158 occurred when executing SQL statement (parse error offset=244)
    (DB) ERROR: DDL statement failed
    (   CREATE            INDEX "/BI0/E0FIAA_C13~P" ON "/BI0/E0FIAA_C13"        ("KEY_0FIAA_C13P",         "KEY_0FIAA_C13T",         "KEY_0FIAA_C13U",         "KEY_0FIAA_C132")        PCTFREE 10        INITRANS 002        TABLESPACE         STORAGE (INITIAL     0000000016 K                 NEXT        0000000016 K                 MINEXTENTS  0000000001                 MAXEXTENTS  UNLIMITED                 PCTINCREASE 0000                 FREELISTS   001) )
    DbSlExecute: rc = 99
      (SQL error 2158)
      error message returned by DbSl:
    ORA-02158: invalid CREATE INDEX option
    (DB) INFO: disconnected from DB
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: END OF LOG: 20090720171142
    Any help is greatly appreciated.
    Thanks!
    - Ahmad M.

    Hi Eric,
    I updated the R3load binary but I'm still facing the same issue. I also tested OPS$ as per previous suggestions but I'm still facing problems with SAPDFACT.
Extract of the log:
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: sccsid @(#) $Id: //bas/640_REL/src/R3ld/R3load/R3ldmain.c#27 $ SAP
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: version R6.40/V1.4
    Compiled Jul  5 2009 21:29:44
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe -dbcodepage 1100 -i C:\Program Files\sapinst_instdir\NW04\COPY\IMPORT\SYSTEM\ABAP\ORA\NUC\DB/SAPDFACT.cmd -l C:\Program Files\sapinst_instdir\NW04\COPY\IMPORT\SYSTEM\ABAP\ORA\NUC\DB/SAPDFACT.log -stop_on_error -k 1Wegd5M50Dp01eqtd1M80Ae2
    (DB) INFO: connected to DB
    (DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): WE8DEC
    (GSI) INFO: dbname   = "EAD20090720060750                                                                                "
    (GSI) INFO: vname    = "ORACLE                          "
    (GSI) INFO: hostname = "MYBWESSDEV02                                                    "
    (GSI) INFO: sysname  = "Windows NT"
    (GSI) INFO: nodename = "MYBWESSDEV02"
    (GSI) INFO: release  = "5.2"
    (GSI) INFO: version  = "3790 Service Pack 1"
    (GSI) INFO: machine  = "4x AMD64 Level 15 (Mod 33 Step 2)"
    DbSl Trace: Error 2158 in exec_immediate() from oci_execute_stmt(), orpc=0
    DbSl Trace: ORA-2158 occurred when executing SQL statement (parse error offset=174)
    (DB) ERROR: DDL statement failed
    (   CREATE BITMAP            INDEX "/BI0/E0FIAA_C11~01" ON "/BI0/E0FIAA_C11"        ("KEY_0FIAA_C11P")        PCTFREE 10        INITRANS 002        TABLESPACE         STORAGE (INITIAL     0000000016 K                 NEXT        0000000016 K                 MINEXTENTS  0000000001                 MAXEXTENTS  UNLIMITED                 PCTINCREASE 0000                 FREELISTS   001) )
    DbSlExecute: rc = 99
      (SQL error 2158)
      error message returned by DbSl:
    ORA-02158: invalid CREATE INDEX option
    (DB) INFO: disconnected from DB
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    L:\usr\sap\EAD\SYS\exe\run/R3load.exe: END OF LOG: 20090721001150
I tried to search for SAP Notes related to this error, but no luck so far.
Can anyone advise whether I could set all the indexes to "ign" in the TSK file and then later use the BW index repair program to recreate the missing indexes?
    Any help is greatly appreciated.
    - Ahmad M.
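    An observation on the logs above, offered as an assumption rather than a confirmed root cause: in both failing statements the TABLESPACE keyword is followed directly by the STORAGE clause with no tablespace name in between, which on its own would produce ORA-02158, and the reported parse error offsets point into that part of the statement. A sketch of what the syntactically valid DDL would look like, with a placeholder tablespace name:
    CREATE BITMAP INDEX "/BI0/E0FIAA_C11~01" ON "/BI0/E0FIAA_C11" ("KEY_0FIAA_C11P")
      PCTFREE 10 INITRANS 2
      TABLESPACE PSAPDIMI  -- placeholder; the import should be supplying the real name
      STORAGE (INITIAL 16K NEXT 16K MINEXTENTS 1 MAXEXTENTS UNLIMITED PCTINCREASE 0 FREELISTS 1);
    If that is the case, the issue would lie in how the import derives the target tablespace for these indexes rather than in the index definitions themselves.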

  • Payroll Conversions to load tables T558B,T558C, T5U8C -- Please Suggest

    Hi All
We are currently trying to do data conversion at a client with 100,000+ employees. We are trying to load the tables
T558B - Payroll Periods
T558C - Payroll Account Transfer: Old Wage Types
T5U8C - Transfer external payroll results (USA)
with all the YTD taxes for the employees.
We have designed LSMWs using transaction code SM30 to load all the employees. Loading via SM30 takes a very long time. Can anyone suggest whether this is the best practice, or whether there is an alternate way to load this data into SAP without using SM30?
Please let me know.
Thank you,
Deepthi
Note: I have checked SAP Best Practice, and that document uses SM30 transaction recording.

Hi,
We are facing the same problem. Can you please tell us how you resolved it?
Venkat

  • Can R3load skip some tables during import?

We use the DB-independent method to export/import an SAP DB.
The export is done by R3load.
We want to exclude some tables from the import.
I know that R3load has an option "-o T" to skip tables.
However, we need to know:
1) the exact syntax to put into export_monitor_cmd.properties, and
2) whether the table structures still need to be imported even though no data is wanted.
Thanks!

