EXPORT/IMPORT Q & A
===================

Product : ORACLE SERVER
Date    : 2002-04-08

PURPOSE
The following are frequently asked questions and answers about EXPORT and IMPORT.

Explanation
[Question 1] How do EXPORT and IMPORT relate to the RDBMS (what is catexp.sql)?
During export and import, information about already-created objects is queried
from the data dictionary. This object information is spread across many tables
within the data dictionary, so catexp.sql creates a set of views that make the
required dictionary information convenient to use.
This script file is located in $ORACLE_HOME/rdbms/admin and is run as part of
installation.
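If these views are missing or mismatched, catexp.sql can be re-run by a privileged user. A minimal sketch (on very old releases `connect internal` was used instead of the SYSDBA syntax shown here):

```shell
$ sqlplus "sys as sysdba"
SQL> @?/rdbms/admin/catexp.sql    -- "?" expands to $ORACLE_HOME in SQL*Plus
```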
[Question 2] Is there a fixed order in which objects are exported?
Yes, objects are exported in a defined order. The order can change between
releases, for example when a version upgrade introduces new object types.
The export order is as follows:
1. Tablespaces
2. Profiles
3. Users
4. Roles
5. System Privilege Grants
6. Role Grants
7. Default Roles
8. Tablespace Quotas
9. Resource Costs
10. Rollback Segments
11. Database Links
12. Sequences (includes grants)
13. Snapshots (includes grants, auditing)
14. Snapshot Logs
15. Job Queues
16. Refresh Groups
17. Cluster Definitions
18. Tables (includes grants, column grants, comments, indexes,
    constraints, auditing)
19. Referential Integrity
20. POSTTABLES actions
21. Synonyms
22. Views
23. Stored Procedures
24. Triggers
25. Default and System Auditing
[Question 3] What are BUFFER and RECORDLENGTH in Export?
- BUFFER
During export, multiple rows of an object are fetched at once. Rows fetched
from disk pass through memory before being written to the dump file, and the
amount of memory allocated for this is the value of the BUFFER parameter.
- RECORDLENGTH
This parameter determines how much of the export data held in memory is
transferred to the file in a single write.
[Note] It is most efficient to make BUFFER and RECORDLENGTH multiples of the
O/S block size.
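As an illustrative sketch of setting both parameters (user, table, and file names are hypothetical; 4096 is assumed here to be a multiple of the O/S block size):

```shell
$ exp scott/tiger TABLES=emp FILE=emp.dmp \
      BUFFER=409600 RECORDLENGTH=4096 LOG=emp_exp.log
```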
[Question 4] When exporting or importing a large number of rows, is it
possible to tell how many rows have been processed?
Yes. Up to V7.1 there was no way to see progress while exporting or importing
a large number of rows, so it was impossible to tell whether the job was still
working or the system was hung. From V7.2 onward, progress can be checked
with the FEEDBACK option.
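For example (user, table, and file names are hypothetical; FEEDBACK=n prints a progress dot for every n rows processed):

```shell
$ exp scott/tiger TABLES=big_table FILE=big.dmp FEEDBACK=10000
$ imp scott/tiger FILE=big.dmp FULL=Y FEEDBACK=10000
```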
[Question 5] How many rows are fetched at a time during export?
The number of rows fetched at a time depends on the buffer size. The space one
row occupies during export can be estimated as
(sum of the column sizes) + 4 * (number of columns).
The number of rows per fetch is then buffer size / export size of one row.
Using this, the size of the export output file is roughly
export size of one row * number of rows.
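The arithmetic above can be sketched as follows (the table layout is hypothetical: 5 columns whose sizes sum to 100 bytes, a 4 MB buffer, and 1,000,000 rows):

```shell
COL_SIZE_SUM=100      # sum of the column sizes, in bytes (hypothetical)
NUM_COLS=5            # number of columns (hypothetical)
BUFFER=4096000        # value of the exp BUFFER parameter, in bytes
NUM_ROWS=1000000      # rows in the table (hypothetical)

ROW_SIZE=$(( COL_SIZE_SUM + 4 * NUM_COLS ))   # export size of one row
ROWS_PER_FETCH=$(( BUFFER / ROW_SIZE ))       # rows fetched at a time
DUMP_EST=$(( ROW_SIZE * NUM_ROWS ))           # rough output file size

echo "$ROW_SIZE $ROWS_PER_FETCH $DUMP_EST"    # 120 34133 120000000
```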
[Question 6] How compatible are Export and Import across versions?
Export/Import compatibility is tied directly to the Oracle version. It can be
described in four cases, illustrated with the following assumption:
suppose machine A runs Oracle V7.0 and machine B runs Oracle V7.1.
Call Oracle V7.0 "X" and Oracle V7.1 "Y".
- Base Compatibility : export an X DB with X's exp and import it into an X DB
with X's imp. Naturally, this is supported.
- Upward Compatibility : export an X DB with X's exp and import it into a Y DB
with Y's imp.
Oracle supports this as well.
- Downward Compatibility : export a Y DB with Y's exp and import it into an
X DB with X's imp.
This may or may not be supported.
- Cross Compatibility : export a Y DB with X's exp (over SQL*Net) and import
it into an X or Y DB (using whichever imp is appropriate).
This may or may not be supported.
[Question 7] In what cases does Downward Compatibility fail?
V7.2 introduced an option called hash cluster expressions. If a cluster is
created with this option, used, and then exported, a downward import into
V7.0 or V7.1 fails because the CREATE CLUSTER statement contains an option
those versions do not recognize.
[Question 8] What causes the EXP-37 error (export views not compatible with
database version)?
This error occurs with Cross Compatibility. The views that Export relies on
(created by catexp.sql) do not match the Oracle version in use. To resolve it,
install the views that the exp in use can work with.
[Question 9] Can only users with the DBA privilege perform a full export?
In Version 6, only users with the DBA privilege could perform a full export.
In V7, a full export is possible even for a non-DBA user, as long as the
EXP_FULL_DATABASE role has been granted.
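For example (user names and file name are hypothetical):

```shell
$ sqlplus system/manager
SQL> grant EXP_FULL_DATABASE to scott;
SQL> exit
$ exp scott/tiger FULL=Y FILE=full.dmp
```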
[Question 10] Why does an imported table sometimes land in a tablespace other
than the user's DEFAULT tablespace?
Suppose, for example, that user scott's DEFAULT TABLESPACE is users, yet after
an import the table ends up in the tools TABLESPACE. The reason is as follows:
the imported table originally lived in the tools TABLESPACE, and scott either
currently has a quota on tools or has been granted the UNLIMITED TABLESPACE
privilege (included in the RESOURCE role).
To make Import create the table in the DEFAULT TABLESPACE, set the quota on
every TABLESPACE other than the DEFAULT one to 0, revoke the UNLIMITED
TABLESPACE privilege, and only then run the import.
Leave only the quota on the DEFAULT TABLESPACE set to unlimited.
For example:
$ sqlplus system/manager
SQL> alter user scott
       quota 0 on system
       quota 0 on tools
       quota 0 on data
       quota unlimited on users;
SQL> revoke unlimited tablespace from scott;
After this, run the import. Of course, any TABLESPACE for which the user was
never given a quota needs no change, and if the UNLIMITED TABLESPACE privilege
(or the RESOURCE role) was never granted, the revoke command is unnecessary
as well.
[Question 11] Cases where a core dump / segmentation fault occurs during Import
Oracle databases have a character set. In Korea, US7ASCII and KO16KSC5601 are
the most common. If the character set of the site where the export was taken
differs from that of the site doing the import, the import may core dump or
abort with unexplained errors.
In this case there are two options: convert the export dump file to the
importing site's character set with a convert program and then import, or
change the character set of one of the two databases so they match and then
redo the export/import. Using the convert program is the simpler of the two;
on Unix the program is compiled with cc and then run.
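Before choosing either method, each database's character set can be checked; a minimal sketch (NLS_DATABASE_PARAMETERS is a standard dictionary view):

```shell
$ sqlplus system/manager
SQL> select parameter, value from nls_database_parameters
     where parameter = 'NLS_CHARACTERSET';
```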


Similar Messages

  • Memory Limitation on EXPORT & IMPORT Internal Tables?

    Hi All,
    I need to export and import internal tables to memory; I do not want to export them to any database tables. Is there a limitation on the amount of memory that can be used for EXPORT & IMPORT? I will free the memory once I import it. The maximum I expect would be 13,000,000 lines.
    Thanks,
    Alex (Arthur Samson)

    You don't have limitations, but try to keep your table as small as possible.
    Otherwise, if you are familiar with ABAP OO, try using Shared Objects instead of IMPORT/EXPORT:
    SAP Help on Shared Objects: http://help.sap.com/saphelp_erp2004/helpdata/en/13/dc853f11ed0617e10000000a114084/frameset.htm
    Hope this helps,
    Roby.

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use set/get or export/import in BSP?
    We need to set/export some variables from a BADI and get/import them in the BSP application.
    Code snippet will be of great help..
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT / IMPORT statements for your requirement.
    From the BADI, use EXPORT to send the variable data to a unique memory
    location with IDs, e.g.:
    * data declaration required for background processing
          DATA: WA_INDX TYPE INDX.
    * here CNAME is the variable you want to export
    EXPORT PNAME = CNAME TO DATABASE INDX(XY) FROM WA_INDX CLIENT
                SY-MANDT ID 'ZVAR1'.
    In the BSP application, use the IMPORT statement to fetch back the values
    set with the IDs above:
    IMPORT PNAME = LV_CNAME
      FROM DATABASE INDX(XY) TO WA_INDX CLIENT
      SY-MANDT ID 'ZVAR1'.
    Delete the data afterwards to avoid wasting memory:
      DELETE FROM DATABASE INDX(XY)
        CLIENT SY-MANDT
        ID 'ZVAR1'.
    Regards,
    Samson Rodrigues

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing strange issue during export/import from memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in an SAP workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how we can export/import between a module pool program and a workflow method?
    Regards,
    Sanjana

    Hi sanjana,
    Please check the link. Here you can find some examples also.
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    P.S
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    You just need to create a key and an ID to save data into the INDX table. Once it is saved, you can use the same key and ID to import.
    Again, don't forget to delete with the same key and ID once you have imported, or it may give an error.
    Regards,
    Sandeep Katoch

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2 and the OS is Solaris, on Oracle DBs.
    I am unfortunately not experienced with Discoverer and there is no one available to assist, for various reasons. So I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed this was a memory problem or slow network causing the delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server to work around the first problem. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev 11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the Prod one. I managed to get a Prod eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a DB dump of eul_us and importing it into my R12 dev environment. I managed this, but had to export the DB file using sys, which alleviated a privilege problem when logging in. I have reports that run and my user can see the reports, but reports that were not shared to sysadmin in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the sysadmin ones.
    I refreshed the BAs using a shell script I made up which uses the java cmd with parameters.
    After some re-reading I tried selecting all the options in the validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java cmd refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges; also, some of the imported tables appeared in the target environment under the apps schema (21 tables) even though the eul_us exp had been 48 tables.
    I also had a problem on the DB with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the eul_us DB privileges and was unable to resolve this initial privilege error even though the privileges were granted to eul_us.
    The DBA managed to exp as system and then import it with the full=y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made a list of the business areas, and made a sh script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in, select a business area and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. When I opened Desktop, I noticed some sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals: in the Disco Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the DB dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Project duplication:  Copy/Paste vs. Export/Import

    Anyone have any thoughts on project duplication?
    Presently I am copying a project and then pasting it into multiple new locations then renaming them. Recently I was in a training session where the instructor suggested exporting the project and then importing it claiming database size savings. Can anyone shed any insight?
    Thanks in advance.

    I can't imagine why an export/import would leave a smaller footprint in the database than a project copy/paste. Not to mention that for a relatively large project it can take a substantial amount of time to export and import. One of our schedules, about 18k activities, takes approximately 1.5 hours to export/import, whereas a project copy/paste only takes about 10 minutes. Note that we do not copy the baseline schedules when replicating projects.

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details."
    In the log, my issue is that I am trying to export an Assignment that includes an expression that uses look-ups.
    In my Dev system I removed the expression to confirm that this is the issue; once I no longer have expressions with look-ups, it allows me to export the schema. I then tried to import it into QA (since the expressions are not changing, I planned on excluding them from the import as a temporary workaround).
    However, I get the same error message when trying to import. It seems that I cannot export or import on a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    Assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and recreate them manually after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • Export/Import parameters disappear when using a user exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to avoid modifying BEGDA and ENDDA when I modify the record (IPSYST = 'MOD'). With this user exit, the parameters disappear from memory, so the dynamic action does not execute properly. What can I do to keep using the user exit? Anything to add?

    In the dynamic actions, when I create, I delimit records on the infotype with export/import parameters defined in the infotype module pool. When I delete, I avoid deleting the record if it is not the last one. With the user exit, modifying BEGDA/ENDDA on the infotype is not allowed. If I use the user exit, the dynamic actions which use export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because I don't have what I need in PSAVE.

  • Error in Client export/import

    Dear all
    I have a test system and would like to perform a client export/import on a single system; correct me if I've made a mistake.
    I exported the client, which created 3 requests, namely SIDKOxxxxxx, SIDKTxxxxxx, SIDKXxxxxx. I selected the target SID as the same system's SID because I have only one system.
    Then I logged in again, created a new client with SCC4, logged into that client with sap* and its password, and ran STMS_IMPORT to import all three transports into that client; as I said, my message server gets disconnected. Correct me if I am wrong.
    Or should I directly run SCC7, since the requests are already in /usr/sap/trans, so these errors won't occur? I have a backup of the database.
    Now I am getting an "oracle disconnected" error.
    Regards

    Dear Pavan
    I tried last night: as you said, I made a new client 300 in IDES and copied a user profile to it.
    When I tried to configure STMS by setting CTC = 1 and then configured the transport routes, I don't find any client number
    to provide as source and target client.
    Kindly send some screenshots if you have them; that would be a great support.
    1. Delete the existing TMS configuration.
    2. In client 000, using the user DDIC, create the TMS configuration using STMS.
    3. Click on the system; in the Transport Tool tab add a parameter CTC and set it to 1.
    4. Now configure the transport routes; there you can find the client number in the consolidation routes.
    5. Provide your source and destination client and create the routes.
    Regards

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application, and try to import it again. I get this error message
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I check the exported file and found this
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace with this
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or reviewing some of the line breaks within pl/sql code within your processes etc. Not sure what would work for you.

  • Oracle 9 spatial index export/import

    Hi,
    when exporting/importing a user via exp/imp I encounter a problem with the numeric characters encoding during creation of a spatial index.
    Imp tool produces script like this:
    "BEGIN "
    "execute immediate 'INSERT INTO USER_SDO_GEOM_METADATA values (''POINTS'',''GEOMETRY'',mdsys.SDO_dim_array(MDSYS.SDO_DIM_ELEMENT(''X'',50000000,160000000,,005),MDSYS.SDO_DIM_ELEMENT(''Y'',450000000,600000000,,005)),,005) ' ; "
    "COMMIT; END;"
    Problem is with wrong representation of the numeric value '0.005' as ,005.
    Originally, the index was created via
    INSERT INTO USER_SDO_GEOM_METADATA
    VALUES (
    'points',
    'geometry',
    MDSYS.SDO_DIM_ARRAY(
    MDSYS.SDO_DIM_ELEMENT('X',-125000000,-115000000, '0.005'),
    MDSYS.SDO_DIM_ELEMENT('Y', 30000000, 42100000, '0.005')
    ),
    NULL
    );
    Any hints on how to reimport the index correctly?
    Thanks,
    Bogart

    You might need to set the NLS_LANG environment variable to get character set conversion done on import (I don't know this unfortunately - it has never been a problem for me).
    There is a section on character set conversions in the utilities manual.

  • Offline Instantiation of a Materialized View Site Using Export/Import

    Has anyone had any success performing offline instantiation of a materialized view site using export/import in Oracle9?

    This is what I wanted to ask in the forum. I want to use Data Pump for the initial instantiation because I believe that indexes (the actual index segments, not just the definitions) are also replicated using Data Pump.
    So, is it possible to do the instantiation using Data Pump?

  • Upgrade from 8.1.6 to 9.2.0.7 via export/import

    Has anyone done this, or does anyone know if it's possible to simply build a 9.2.0.7 instance, then use export/import to perform the upgrade from 8.1.6 to 9.2.0.7, since a direct manual upgrade from 8.1.6 is not supported?
    TIA

    Yes. Building a database in the new release, exporting the existing database using its own version of the exp utility, performing a binary copy of the dump file to the new server, and then running the new version's imp utility on the transferred dmp file was the standard method of moving databases to new platforms/versions for many years. The same process can also be used on the same server between equal releases, or from a current to a new release.
    HTH -- Mark D Powell --
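    A minimal sketch of that sequence, assuming hypothetical user names and file names (the transfer step depends on your environment, but the dump file must be copied in binary mode):

    ```shell
    # On the 8.1.6 server, using the 8.1.6 exp:
    exp system/manager FULL=Y FILE=full816.dmp LOG=exp816.log

    # Copy full816.dmp to the new server in binary mode (e.g. scp, or ftp in "binary" mode).

    # On the 9.2.0.7 server, using the 9.2.0.7 imp:
    imp system/manager FULL=Y FILE=full816.dmp LOG=imp927.log
    ```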

  • System copy using SAPInst(Export Import database Independent prcoess failed

    Hello ,
    I am doing a system copy using the SAPinst export/import process.
    Source system: SAP NetWeaver '04 (BW 3.5, kernel: UC 640, patch level 196)
    The export fails at Phase 2 (Database Export) with R3load jobs: running 1, waiting 0. Below are the log details.
    SAPSDIC.log
    (EXP) INFO:  entry for BAPICONTEN                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICONTENT255                    in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20031127161249
    (EXP) INFO:  entry for BAPICONVRS                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20010131174038
    (EXP) INFO:  entry for BAPICREATORDATA                   in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICRMDH1                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMDH2                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMEXP                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175513 > 20031211120627
    (EXP) INFO:  entry for BAPICRMEXT                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120627
    (EXP) INFO:  entry for BAPICRMKEY                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMKEY_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMMSG                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMMSG_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMOBJ                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120628
    (EXP) INFO:  entry for BAPICRMPAREX_T                    in DDNTT is newer than in DDNTT_CONV_UC: 20051229175452 > 20031211120305
    (EXP) INFO: limit reached, 5000 tables in DDNTT are newer than in DDNTT_CONV_UC
    (EXP) INFO: NameTab check finished. Result=2  #20100113131216
    (EXP) INFO: check for inactive NameTab entries: Ok.
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 20100113131216
    ***SAPCLUST.log ****
    (NT)  Warn:  EDIDOC: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "EDIDOC"
    (NT)  Warn:  PCDCLS: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "PCDCLS"
    (NT)  Warn:  SFHOA: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHOA"
    (NT)  Warn:  SFHYT: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHYT"
    (NT)  Warn:  UMG_TEST_C: normal NameTab from 20040211095029 younger than alternate NameTab from 20031113150115!
    (EXP) TABLE: "UMG_TEST_C"
    myCluster (55.22.Exp): 712: error when retrieving alternate nametab description for physical table UMG_TEST_F.
    myCluster (55.22.Exp): 713: return code received from nametab is 32
    myCluster (55.22.Exp): 299: error when retrieving physical nametab for table UMG_TEST_F.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 2010011312563
    Please suggest.
    Thanks & Regards
    Ganesh

    Is your DB unicode?  If so, did you select the unicode flag in sapinst?
    This thread, "System Copy Error while exporting ABAP", might offer some help.
    -Zach

  • Regarding Distribution Monitor for export/import

    Hi,
    We are planning to migrate a 1.2 TB database from Oracle 10.2g to MaxDB 7.7, and are currently testing the migration on a test system. First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hours, but the import had been running for more than 88 hours, so we aborted it. Later we found that we could use Distribution Monitor to spread the export/import load across multiple systems so the import completes within a reasonable time. We used 2 application servers for the export/import; the export completed within 14 hours, but again the import ran for more than 80 hours, so we aborted it. We also did table splitting for the big tables, but no luck. 8 parallel processes were running on each server, i.e. one CI and 2 app servers. We followed the DistributionMonitorUserGuide document from SAP. I observed that on the central system CPU and memory utilization was above 94%, but on the 2 application servers we added, CPU and memory utilization was very low, i.e. 10%. The system configuration is as follows:
    Central Instance - 8CPU (550Mhz) 32GB RAM
    App Server1 - 8CPU (550Mhz) 16GB RAM
    App Server2 - 8CPU (550Mhz) 16GB RAM
    Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state; the other 7 R3load processes were sleeping. On the central instance, all 8 R3load processes were running. I think the fact that not all 8 R3load processes on the app servers were running at a time could be the reason for the very slow import.
    Can someone let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2g to MaxDB, it would be helpful to hear how it was done, and any specific document available for migration from Oracle to MaxDB would also be helpful.
    Thanks,
    Narendra

    > And also when i used top unix command on APP servers i was able to see only one R3load process to be in run state and all other 7 R3load  process was in sleep state. But on central instance all 8 R3load process was in run state. I think as on APP servers all the 8 R3load process was not running add a time that could be the reason for very slow import.
    > Please can someone let me know how to improve the import time.
    R3load connects directly to the database and loads the data. The question here is: how is your database configured (in the sense of caches and memory)?
    > And also if someone has done the database migration from Oracle 10.2g to MaxDB if they can tell how they had done the database migration will be helpful. And also if any specific document availble for database migration from Oracle to MaxDB will be helpful.
    There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are trained specially for certain databases and know tips and tricks for improving the migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus

  • Client Copy Export/Import on large system. 1.5 TB

    Hi,
    We have a 3 tier landscape. DEV, TST, PRD.
    Database size in PRD is 1.5 TB (1500 GB)
    Our developers need fresh master and trans data in TST and DEV.
    We are thinking about making a
    PRD --> TST  (1.5 TB) we do a system copy.
    TST --> DEV (1.5 TB) we can't do a system copy. (Company policy says not to touch repository objects, version management what so ever in DEV....)
    1. How long will a client export/import take from TST --> DEV, let's say with "good HW"? Will it work?
    2. I've read about TDMS, but it's too expensive. Is there maybe a 3rd-party tool that is good for this purpose? Transferring 1.5 TB of data...
    Br,
    Martin

    Hi,
    >1. How long will a client export / import take from TST --> DEV, lets say we have "good HW" ? Will it >work ?
    In my opinion, a 1.5 TB database is much too big for a client copy.
    It will work, but it may last a full week and would degrade production system performance.
    We have 1.3 TB databases and stopped using client copies when the system was 300 GB...
    We only use database restores and renaming for system refreshes, even for the dev system.
    There is a procedure to keep the versioning of sources.
    Regards,
    Olivier
