Export Import Process

Hi,
Is it possible to maintain details for the export and import of material in B1? Will it require any customization?
Swapnil

Hi Swapnil,
I am not sure whether I understood you correctly. Do you want to import/export data for B1, or do you want to integrate an import/export logistics process into B1?
For data import you should use the Data Transfer Workbench. It is normally installed into your SAP directory, where you will find a lot of examples and prepared sheets for importing data. For exporting data, the simplest way is a SQL query.
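As a minimal sketch of such an export query (assuming direct read access to the company database and the standard B1 item master table OITM; the column choice is illustrative), run from any SQL query tool:
$ SELECT ItemCode, ItemName, OnHand FROM OITM WHERE OnHand > 0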

Similar Messages

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1.
When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I check the exported file and found this
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace with this
    p_documentation_banner=> ' ',
    I can import the application without the error.
Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
ORA-02047: cannot join the distributed transaction in progress
begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
There are several suggestions, if you follow that thread, about character sets or about reviewing some of the line breaks within the PL/SQL code in your processes, etc. I am not sure what would work for you.
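If NLS settings are the suspect, a minimal check of what the importing session actually uses (standard NLS dictionary view) is:
SQL> SELECT value FROM nls_session_parameters
     WHERE parameter = 'NLS_NUMERIC_CHARACTERS';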

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are
exported using the export/import UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites. 
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
    to CAB and extracting it, the only files it contains are DAT files.
The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site of 114 MB that produces a CMP file with no XML files, while small
sites do not have this problem. If size were the problem, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb (see the sketch below), but the Manifest.XML does not contain translatable content. I have not found any parameters for Export-SPWeb that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
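A minimal sketch of such an export (the site URL and path are placeholders; -NoFileCompression is a real Export-SPWeb switch that writes the export as loose files, Manifest.xml included, rather than a single CMP, which at least makes it possible to inspect what the export contains):
PS> Export-SPWeb -Identity "http://server/sites/variation-source" -Path "C:\Exports\source" -IncludeVersions All -NoFileCompression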
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
    contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
    the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a
    while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you
    ever happen to find a solution to this problem?

Comparison of Oracle upgrade (9i to 10g) and export/import process

    Hi Friends,
I have a schema in an Oracle 9i database and I would like to have it in Oracle 10g.
I need to know the advantages and disadvantages of an Oracle upgrade (9i to 10g) compared with an export/import (9i export and 10g import) process.
    Please suggest.
    Regards

Please go to this link for the new features of 10g:
http://www.oracle.com/technology/pub/articles/10gdba/index.html
    For export and import go to this link:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/exp_imp.htm
    Regards
    Asif Kabir
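For reference, a minimal sketch of the export/import path being compared (passwords and file names are placeholders): take a full export with the 9i exp utility, then load the dump with the 10g imp utility.
$ exp system/<password> FULL=Y FILE=full9i.dmp LOG=exp9i.log
$ imp system/<password> FULL=Y FILE=full9i.dmp LOG=imp10g.log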

  • Export/Import Process Flow

    Hi,
I was trying to export a process flow from one repository and import it into another using metadata export and import, but it cannot bind the process activity parameters. My OWB client version is 9.0.4.8.21 and my OWB repository version is 9.0.4.0.27. Any suggestion on how to move a process flow from one repository to another is greatly appreciated.
    Regards,
    Mardi

    Hi Nikolai.
Yes, you are exactly right. The process flow works fine and I managed to export it to an .mdl file. When I try to import it into another repository, it gives me error messages. Below are some of them.
    Warning at line 40: MDL1307: Cannot bind process activity parameter <P_BUS_PILLAR> because process data for bound data name <PF_BUS_PILLAR> and UOID <C15913F1E3C76266E03400306E088FE2> not found. Process activity is <SAL_SA3_INT_SA3_FCT> in process <PROCESS_ADJS>.
    Warning at line 44: MDL1307: Cannot bind process activity parameter <P_STATE> because process data for bound data name <PF_STATE> and UOID <C15913F1E3C56266E03400306E088FE2> not found. Process activity is <SAL_SA3_INT_SA3_FCT> in process <PROCESS_ADJS>.
    Warning at line 129: MDL1307: Cannot bind process activity parameter <L_STATE> because process data for bound data name <PF_STATE> and UOID <C15913F1E3C56266E03400306E088FE2> not found. Process activity is <SAL_SAR_IGA_SAR_STG> in process <PROCESS_ADJS>.
    Regards,
    Mardi

System copy using SAPInst (export/import database-independent process) failed

Hello,
I am doing a system copy using the SAPInst export/import process.
Source system: SAP NetWeaver '04 (BW 3.5, kernel: UC 640, patch level 196)
The export fails at Phase 2 - Database Export, with R3load jobs: running 1, waiting 0. Below are the log details.
    SAPSDIC.log
    (EXP) INFO:  entry for BAPICONTEN                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICONTENT255                    in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20031127161249
    (EXP) INFO:  entry for BAPICONVRS                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20010131174038
    (EXP) INFO:  entry for BAPICREATORDATA                   in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICRMDH1                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMDH2                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMEXP                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175513 > 20031211120627
    (EXP) INFO:  entry for BAPICRMEXT                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120627
    (EXP) INFO:  entry for BAPICRMKEY                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMKEY_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMMSG                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMMSG_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMOBJ                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120628
    (EXP) INFO:  entry for BAPICRMPAREX_T                    in DDNTT is newer than in DDNTT_CONV_UC: 20051229175452 > 20031211120305
    (EXP) INFO: limit reached, 5000 tables in DDNTT are newer than in DDNTT_CONV_UC
    (EXP) INFO: NameTab check finished. Result=2  #20100113131216
    (EXP) INFO: check for inactive NameTab entries: Ok.
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 20100113131216
    ***SAPCLUST.log ****
    (NT)  Warn:  EDIDOC: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "EDIDOC"
    (NT)  Warn:  PCDCLS: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "PCDCLS"
    (NT)  Warn:  SFHOA: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHOA"
    (NT)  Warn:  SFHYT: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHYT"
    (NT)  Warn:  UMG_TEST_C: normal NameTab from 20040211095029 younger than alternate NameTab from 20031113150115!
    (EXP) TABLE: "UMG_TEST_C"
    myCluster (55.22.Exp): 712: error when retrieving alternate nametab description for physical table UMG_TEST_F.
    myCluster (55.22.Exp): 713: return code received from nametab is 32
    myCluster (55.22.Exp): 299: error when retrieving physical nametab for table UMG_TEST_F.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 2010011312563
    Please suggest.
    Thanks & Regards
    Ganesh

    Is your DB unicode?  If so, did you select the unicode flag in sapinst?
The thread "System Copy Error while exporting ABAP" might offer some help.
    -Zach

  • Export/Import Project still buggy in v2.1.3

    After seeing the update notes for v2.1.3 and noticing references to project importing, I started to get excited that maybe the bugs have been ironed out. Unfortunately my longstanding bug still remains.
    If anyone can think of any workarounds I'd be extremely grateful...
    I have a 2.16GHz Intel Core Duo MacBook Pro with 2GB RAM, running OS X 10.5.6 and now Aperture 2.1.3
I have a project with many images arranged into stacks, a pick in each stack with many adjustments, and a 62-page book created from these picks. There are also some smart albums and a light table.
    Now I export the project (i.e. in the Project Inspector I right-click the project and choose Export > Project) to a file on the desktop and rename the existing project in Aperture. I now import the saved .approject file and I find that all my Picks are no longer Picks and my book is therefore no longer made up of my desired images. (I've tried this many times now and ruled out the possibility of a corrupt export file).
    As a result, when I select the Book I get the "Non-Pick Images" dialog with the dreaded "One or more placed items are no longer the pick of a stack." message. "Use Current Pick" ruins the book - I have to go through and work out all the picks again (and also re-pan all the images within the Photo Boxes as this data has also been lost). "Create new Version" appears to preserve my book's original look, but my project now contains a couple of hundred more images than it used to and my other albums and light-table still have incorrect images shown as the Picks.
    Does anybody have any ideas of what I can do to ensure the stack-picks are preserved during the export/import process?
    (By the way, the reason I'm exporting and then re-importing is because I actually want to do the export from my laptop where the project is and then the import on my main work machine where the rest of my Aperture library lives, but that fails for the same reason, so I'm doing the export+import on my laptop for now to reduce the number of variables in the problem.)

I assume you now know how to create transport sets. After creating a transport set containing the objects you want to export, go to the Administer tab under the main page of Portal. There you will find the Export/Import portlet. Choose your transport set from the first LOV and then click EDIT to choose the "Security Option". Now export the set. This will export all the user page customizations for you.
    Thanks.

  • Is time to export-import location dependent

I have a doubt about whether the export/import process depends on the location from which it is performed. I will try to explain the scenario. An export/import has to be done on a server in Africa, as migration to a new server is required. My question is: does it make any difference to the time the export/import takes whether the SecureCRT session is logged on from here in Africa, or from a far-off location, say India, where some of our DBA team is based?
I hope my question is clear. Please help in resolving the doubt.
    regards

    You need to clarify from an Oracle perspective what the client machine is in this scenario (i.e. the machine that is connecting to the Oracle database). What machine is the export utility running on and what machine is the export utility writing to?
    If the export utility is running on the same machine and writing to a dump file on the same machine regardless of where the DBA is sitting, then where the DBA is sitting is irrelevant. If the export utility runs on a different machine or writes to a dump file on a different machine depending on where the DBA is sitting, then where the DBA is sitting is relevant.
    Justin
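A sketch of the distinction (host name, connect string, and paths are placeholders): when exp runs on the database server itself, the dump file is written there and the remote terminal session carries only keystrokes and screen output, so the DBA's location is irrelevant; when exp runs on a distant client over SQL*Net, the exported data crosses the network to that client.
$ ssh dba@africa-server
$ exp system/<password> FULL=Y FILE=/u01/dump/full.dmp            # runs and writes on the server
$ exp system/<password>@AFRICA_DB FULL=Y FILE=/home/dba/full.dmp  # runs on a remote client; data travels over the network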

  • Export - Import in Oracle Application Database

The export in an Oracle Applications database works fine,
but the import always shows some issue and sometimes never completes.
Even when it does complete, it completes with errors. How can I ensure an error-free export/import?
How should I handle moving a table that contains a LONG column?

"The errors are that it cannot import these tables, and mostly they are LONG tables." Any IMP-XXXXX or ORA-XXXXX messages?
"Can you explain how LONG objects are exported and imported?" What type of objects?
What is the database version?
Are you doing a full database import?
"Also, sometimes the import hangs for two or three days." Does it hang when importing a specific object?
    Did you follow the steps outlined in the following notes when you did the Export/Import?
    Note: 362205.1 - 10g Release 2 Export/Import Process for Oracle Applications Release 11i
    https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=362205.1
    Note: 230627.1 - 9i Export/Import Process for Oracle Applications Release 11i
    https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=230627.1

  • Export/import in 904 versus 10.1.2

We have two 10G AS 904 installs (dev and prod, same O/S too). From time to time we move our production portal content to the dev server. We sometimes get corruption and have to use the Schema Validation Utility scripts to fix the import on the dev server. Not a show stopper, but this adds another manual step to the process.
    I see the 10.1.2 version has improved import/export checks. Can anyone give feedback on the improvements in the export/import process?
    Thanks
    Kevin

    Kevin,
Careful with this approach. Passing from DEV to PROD is OK, but to go from PROD back to DEV, I suggest you make a clean start on DEV, i.e., clean the DEV machine and later do the same to PROD. Portal export/import works only one way, not both ways (this basically has to do with the checks we make and with possible conflicts between the objects on both sides). Also check the available documentation, which is all compiled in Metalink Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues.
As to the improvements in the process, please check the New Features papers (search for "export"; it is easier to pick out the references):
    10.1.2 -- http://www.oracle.com/technology/products/ias/pdf/1012_nf_paper.pdf
    10.1.4 -- http://www.oracle.com/technology/products/ias/portal/pdf/portal_1014_new_features.pdf
    I hope it helps...
    Cheers,
    Pedro.

  • Extents in Export/Import files

I work on Oracle 8.1.7 on HP-UX. I need to create a schema in QA with exactly the same tables as in prod (the data in QA is a subset of prod).
    Tables in prod have very big initial extents.
    I created a dump file by exporting the prod env.
I imported the dump file (from prod) with the INDEXFILE option and got the SQL file. However, each CREATE TABLE statement has the initial extent size included in it. I would have to manually change 600 tables so that the initial extent is smaller for the QA environment. I would like to use the default initial extent size for each table.
I tried using the COMPRESS=Y and COMPRESS=N options and could not spot any difference.
Is there a way to completely avoid the extent or storage parameters in the export/import process?
    Appreciate any help on this
    Thanks
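One possible workaround, sketched with placeholder file names: regenerate the DDL with imp's INDEXFILE option and strip the STORAGE clauses before running the script, so each table falls back to its tablespace defaults (the sed line assumes each STORAGE clause sits on a single line; otherwise edit the file in a text editor).
$ imp system/<password> FILE=prod.dmp FULL=Y INDEXFILE=create_tabs.sql
$ sed 's/STORAGE *([^)]*)//g' create_tabs.sql > create_tabs_nostorage.sql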

Hi Tushar,
1. How do I pass this internal table to the function module?
   I assume you are creating your own Y/Z function module. Pass it through the TABLES parameter.
2. When I am creating the function module in SE37, where do I define this internal table type?
   Define it in the TABLES interface. What type? The same type that was defined when it was passed in the user-exit function module. If you look at the function module of the user exit, you will see it.
3. Where do I define the error structure type (which is returned by the function module to the main program)? Is it in the EXPORT or TABLES parameters during function module creation?
   Define it in the TABLES interface (not in EXPORT or IMPORT), since what you are returning is an internal table. You can use BDCMSGCOLL, for example, or create your own Y/Z structure for the same purpose (or use the structure type T100).
I hope it helps.
Regards,
Amit M.

  • EXPORT/IMPORT Q & A

Product: ORACLE SERVER
Date written: 2002-04-08
EXPORT/IMPORT Q & A
===================
PURPOSE
The following are the main questions and answers about EXPORT and IMPORT.
Explanation
[Question 1] How are the RDBMS and export/import related (what is catexp.sql)?
During export and import, information about already-created objects is queried from the data dictionary, and this object information is spread across several data dictionary tables. To make the necessary dictionary information convenient to use, catexp.sql creates a number of views.
This script file is located in $ORACLE_HOME/rdbms/admin and is run at install time.
[Question 2] Is there a fixed order in which objects are exported?
Yes, there is an export order for objects, and it can change when new objects are introduced, for example by an Oracle version upgrade.
The export order of objects is as follows:
    1. Tablespaces
    2. Profiles
    3. Users
    4. Roles
    5. System Privilege Grants
    6. Role Grants
    7. Default Roles
    8. Tablespace Quotas
    9. Resource Costs
    10. Rollback Segments
    11. Database Links
    12. Sequences( includes Grants )
    13. Snapshots ( includes grants, auditing )
    14. Snapshot logs
    15. Job Queues
    16. Refresh Groups
    17. Cluster Definitions
18. Tables (includes grants, column grants, comments, indexes, constraints, auditing)
    19. Referential Integrity
    20. POSTTABLES actions
    21. Synonyms
    22. Views
    23. Stored Procedures
    24. Triggers
    25. Default and System Auditing
[Question 3] What are BUFFER and RECORDLENGTH in export?
- BUFFER
During export, several rows of an object are fetched at once. Information fetched from disk passes through memory before being written to the file, and the amount of memory allocated for this is the value of the BUFFER parameter.
- RECORDLENGTH
This parameter determines how much of the export data held in memory is transferred to the file in a single write.
[Note] It is efficient to make BUFFER and RECORDLENGTH multiples of the OS block size.
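As an illustration (the values are assumptions, chosen as multiples of a presumed 8 KB OS block size), both parameters are passed on the exp command line:
$ exp scott/tiger TABLES=emp FILE=emp.dmp BUFFER=1048576 RECORDLENGTH=32768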
[Question 4] When exporting or importing a large number of rows, can you tell how many rows have been processed?
Yes. Up to V7.1 there was no way to see progress when exporting or importing many rows, so you could not tell whether the job was still working or the system was hung. From V7.2 you can check progress using the FEEDBACK option.
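A minimal sketch (table and file names are placeholders): with FEEDBACK=10000, exp prints a dot for every 10,000 rows processed, so a stalled job is easy to spot.
$ exp scott/tiger TABLES=big_table FILE=big_table.dmp FEEDBACK=10000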
[Question 5] How many rows are fetched at a time during export?
The number of rows fetched at once is related to the buffer size. The space one row occupies during export is (sum of the column sizes) + 4 * (number of columns). The number of rows fetched at once is then the buffer size divided by the per-row export size. From this, the size of the export output file is roughly the per-row export size times the number of rows.
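A worked example with illustrative numbers: a table with 10 columns totalling 200 bytes occupies 200 + 4*10 = 240 bytes per exported row. With BUFFER=64000, each fetch retrieves 64000 / 240, or about 266 rows, and exporting 1,000,000 rows gives an output file of roughly 240 * 1,000,000 = 240,000,000 bytes (about 229 MB).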
[Question 6] How compatible are export and import across versions?
Export/import compatibility is directly tied to the Oracle version. It can be described in four categories, explained with the following assumption: suppose machine A runs Oracle V7.0 and machine B runs Oracle V7.1. Call Oracle V7.0 "X" and Oracle V7.1 "Y".
- Base compatibility: exporting an X database with X's exp and importing it into an X database with X's imp. Naturally, this is supported.
- Upward compatibility: exporting an X database with X's exp and importing it into a Y database with Y's imp. Oracle supports this as well.
- Downward compatibility: exporting a Y database with Y's exp and importing it into an X database with X's imp. This may or may not be supported.
- Cross compatibility: exporting a Y database with X's exp (via SQL*Net) and importing it into an X or Y database (using the appropriate imp). This may or may not be supported.
[Question 7] In which cases does downward compatibility fail?
V7.2 has an option called hash cluster expressions. If a cluster is created with this option, used, and then exported, a downward import into V7.0 or V7.1 fails because the option in the generated CREATE CLUSTER statement is not accepted.
[Question 8] What causes the EXP-37 error (export views not compatible with database version)?
This error occurs in cross-compatibility situations: the views that export uses (created by catexp.sql) do not match the Oracle version. To resolve it, install views that the exp being used can work with.
[Question 9] Can only users with the DBA privilege perform a full export?
In Version 6, only users with the DBA privilege could perform a full export. In V7, even a non-DBA user can perform a full export once granted the EXP_FULL_DATABASE role.
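For example (the user name is a placeholder), granting the role from a privileged account:
SQL> GRANT EXP_FULL_DATABASE TO scott;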
[Question 10] Why does a table sometimes end up in a tablespace other than the user's DEFAULT tablespace on import?
Suppose, for example, that scott's default tablespace is users, yet after an import the tables are created in the tools tablespace. The reason is as follows: the imported table originally lived in the tools tablespace, and scott either has a quota on tools or has been granted the UNLIMITED TABLESPACE privilege (included in the RESOURCE role).
To have the tables created in the DEFAULT tablespace on import, set the quotas on all tablespaces other than the default to 0, revoke the UNLIMITED TABLESPACE privilege, and then run the import. Keep only an unlimited quota on the default tablespace.
For example:
    $ sqlplus system/manager
    SQL> alter user scott
    quota 0 on system
    quota 0 on tools
    quota 0 on data
    quota unlimited on users;
    SQL>revoke unlimited tablespace from scott;
After doing this, run the import. Of course, tablespaces on which no quota was granted when the user was created are not affected, and if the UNLIMITED TABLESPACE privilege (or the RESOURCE role) was never granted, the REVOKE command is not needed either.
[Question 11] When a core dump or segmentation fault occurs during import
Oracle databases have a character set. In Korea, US7ASCII or KO16KSC5601 is mainly used, and if the character set where the export was taken differs from the character set where the import is done, the import may core dump or abort with unexplained errors.
In that case you can either convert the export dump file to the import side's character set with a convert program before importing, or change one database's character set so that the two match and then redo the export/import. Of these, using the convert program is simpler; on Unix the program can be compiled with cc and then used.
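A quick way to compare the two sides before importing (standard NLS dictionary view; run this on both the source and the target database):
SQL> SELECT value FROM nls_database_parameters
     WHERE parameter = 'NLS_CHARACTERSET';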

I'm talking about the wsusutil export/import process.
    Oh! That's NOT what you asked. What you asked is:
"my question is, do I need to export all files again?"
    As for the WSUSUTIL functionality, that is an ALL or NOTHING operation. You have no choice in the matter.
    Lawrence Garvin, M.S., MCSA, MCITP:EA, MCDBA
    SolarWinds Head Geek
    Microsoft MVP - Software Packaging, Deployment & Servicing (2005-2014)
    My MVP Profile: http://mvp.microsoft.com/en-us/mvp/Lawrence%20R%20Garvin-32101
    http://www.solarwinds.com/gotmicrosoft
    The views expressed on this post are mine and do not necessarily reflect the views of SolarWinds.

  • Export/import troubles

    Folks,
A customer has exported/imported more than one page with a batch job without any errors (wow!). But when calling the different pages, they are not the same. After dropping a page from the Portal UI and reimporting one single page, the content is okay; but when doing this for another page, he ran into the same trouble again. Has anybody seen similar behaviour when importing more than one page?

When we exported/imported a couple of pages, HTML portlets from one ended up in the other and vice versa. This completely screwed up the application and we had to restore from backup.
Also, one page with tabs dropped the content of one HTML portlet per tab during the export/import process. The odd thing was that the title was still there for the affected portlet; just the content was gone. This was a consistent, reproducible error.
And one more: any images added to Portal (custom sub-tabs, for instance) displayed as missing images and had to be re-added after import. An odd behavior here was that they only disappeared for certain users, while other users could still see them. After re-adding them, all could see them again.

  • Export-Import and Naming convention question

    All,
Newbie here. I have a question related to the naming convention for SAP R/3 systems and the XI manual export/import process. We are currently in the development environment, where our R/3 systems are named D55CLNT400, D56CLNT300, etc. (the first three characters are the system ID and the last three are the client number). This is per the XI best-practices convention.
The question I have is: if we name the technical systems as above and export the configuration objects from the Dev to the Test environment, where the R/3 systems are named T55CLNT400, T56CLNT300 (a similar naming structure), does it mean that we need to manually change almost all of the items on the configuration side in the Test environment (business system names, interface determinations, receiver determinations, etc.)?
Is this the correct way, or are we missing something? I would have preferred an approach where we only needed to update the communication channel parameters.
    Thanks.
    Message was edited by:
            thezone

    In the SLD, create three Business System Groups: DEV, QAS and PRD.
    In each of these groups, you must have the relevant application servers (in your case, R/3s) and one integration server (XI).
    Then, for each Business System in Group DEV, define a transport target in QAS group.
    In your case, the transport landscape should be like this:
    D55CLNT400 -> T55CLNT400
    D56CLNT300 -> T56CLNT300
    XI_DEV -> XI_QAS
    Do the same for the QAS group (defining transport targets in PRD group). Observe that you need to have the same number of Business Systems in each group for this to work properly.
    Now, when you transport your configuration objects from XI_DEV to XI_QAS, all the Business Systems in DEV landscape will be replaced for the equivalent ones in QAS landscape.
    More info: http://help.sap.com/saphelp_nw2004s/helpdata/en/ef/a21e3e0987760be10000000a114084/frameset.htm
    Regards,
    Henrique.

  • BI Export Import

    ERROR: 2011-06-12 01:50:01 com.sap.inst.migmon.LoadTask run
    Unloading of 'SAPDODS' export package is interrupted with R3load error.
    Process 'E:\bin\uc\R3load.exe -e SAPDODS.cmd -datacodepage 4103 -l SAPDODS.log -stop_on_error' exited with return code 2.
    For mode details see 'SAPDODS.log' file.
    Standard error output:
    sapparam: sapargv( argc, argv) has not been called.
    sapparam(1c): No Profile used.
    sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
    ==============SAPDODS.log FILE===============
    (EXP) TABLE: "/BIC/AZSD_O2040" #20110612013025
    (EXP) ERROR: DbSlPrepare/BegRead failed
      rc = 103, table "/BIC/B0000104000"
      (SQL error 208)
      error message returned by DbSl:
    Invalid object name '/BIC/B0000104000'.
    Statement(s) could not be prepared.
    (DB) INFO: disconnected from DB
    E:\bin\uc\R3load.exe: job finished with 1 error(s)
    E:\bin\uc\R3load.exe: END OF LOG: 20110612013026
    ===========================================
While doing the export/import process I am getting an error; can you please advise what could be the reason?
    -Addi

The indexes of all fact and E-fact tables were not correct. After confirmation from SAP, we ran the program RSDD_MSSQL_CUBEANALYZE to fix the issue.
After that, the issue was resolved.
