Script assistance to export/import LegacyExchangeDN

Hi Everyone,
I'm currently migrating our hosted Exchange customers from one site to another.
I'm looking for a way to create a CSV file containing the fields Alias, EmailAddress and LegacyExchangeDN, and then a way to use that CSV file to create X500 addresses for each account based on those details.
For an organization with 5-20 users the manual work is acceptable, but manually setting the X500 addresses for an organization with 50-100 users is a pain.
So any assistance here would be highly appreciated.

Hi,
The source and target sites are similar for the most part. Both are running Exchange 2010 and the DC structure is much the same.
We're using an automation product that creates the mailboxes in the new site, so the only things that get broken are autocomplete and calendar access, and those can be fixed if I copy each user's LegacyExchangeDN and create it as an X500 address.
The LegacyExchangeDN structure is also very similar at the two sites, for example:
New site: /o=HostedExchange/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=poli6c2
Old site: /o=HostedExchange/ou=Exchange Administrative Group (FYDIBOHF23SPDLT)/cn=Recipients/cn=polib9f
As you can see, the only part that changes during the migration is the user's CN.
Anyway, I was able to create the required CSV using this command:
Get-Mailbox -Filter {EmailAddresses -like "*domain.com*"} | Select-Object Name,DisplayName,PrimarySmtpAddress,LegacyExchangeDN | Export-Csv "C:\details.csv"
But I need some help importing the details into the new site.
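A minimal sketch of the import side (my assumptions: the primary SMTP addresses stay the same in the target site, and the target runs Exchange 2010 SP1 or later, where Set-Mailbox accepts the @{add=...} syntax for -EmailAddresses):

# Run in the target site's Exchange Management Shell.
# For each CSV row, add the old LegacyExchangeDN as an X500 proxy address,
# matching the mailbox by its (unchanged) primary SMTP address.
Import-Csv "C:\details.csv" | ForEach-Object {
    Set-Mailbox -Identity $_.PrimarySmtpAddress -EmailAddresses @{add = "X500:$($_.LegacyExchangeDN)"}
}

On RTM (pre-SP1) you would instead read the mailbox's EmailAddresses collection, append the X500 entry, and write the whole collection back.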

Similar Messages

  • Script request for exporting, importing, inserting entries into Oracle 9i LDAP

    Does anyone have scripts for exporting, importing, or inserting entries into an Oracle 9i LDAP server? Thanks for the help.

    You can use the ldapsearch utility to generate LDIF files. Perform a search on the node you want to export and specify LDIF output (-L is the ldapsearch option). Once you have the LDIF file, you can import it into any LDAPv3-compliant server such as OID. For OID, use the ldapadd/ldapmodify utilities to import the data.
    These utilities are present under IAS_HOME/bin or ORACLE_HOME/bin.
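    For example (hypothetical host, credentials and base DN; -L requests LDIF output):
    ldapsearch -h ldap.example.com -p 389 -D "cn=orcladmin" -w welcome1 -b "dc=example,dc=com" -s sub -L "objectclass=*" > export.ldif
    ldapadd -h ldap.example.com -p 389 -D "cn=orcladmin" -w welcome1 -f export.ldif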

  • Export/Import of Portal - Urgent - Attn Product Mgr Group

    I'm planning to move my current portal database to another box (Win2000 -> HP) while keeping Apache and the portal tech stack on the current box.
    Which of the following methods is safe and tested?
    1. Export the whole database (8.1.7.3) and import it as a full database on the target box.
    2. Or export it user by user: all users, especially portal%.
    Please advise how to clone or migrate from the deployment to the test server. Different documentation says different things. In the document named "Import export in portal" they say portal30 cannot be exported, while in the document "Migration of Portal Across Databases" they suggest exporting all schemas related to portal. In the portal forum some users mentioned that they migrated from deployment to test/development using a full database export, saying that Oracle suggests this, but there is no recovery method for it.
    My second objective is to separate Apache from the database: keeping the existing Apache on the NT box (current instance) and the data (portal schemas) on the HP box.
    Is there anything else to be done besides modifying the DAD on the current Apache?
    Thanks
    Sarvesh

    You can do the following to export your portal objects alone (I am talking wrt 30982). Please do it in this order:
    1. Go to the WWU directory, where you will find the scripts related to export/import.
    2. First run the SSOEXP utility to export all the users and groups. (The result is a dump, say sso.dmp.)
    3. Next, run the SECEXP utility to export all the privileges of all the users on different objects. (The result is sec.dmp.)
    4. Then run the APPEXP utility to export your applications; you could give the "-security" option here as well. (The result is app.dmp.)
    5. Run the PAGEEXP utility to export your pages. (This gives pages.dmp.)
    6. Lastly, run the CONTEXP utility to export your content areas. (This results in contarea.dmp.)
    These five dumps can be supplied to the corresponding import utilities for importing your objects.
    But if you are planning to export/import the entire database, then please check with the RDBMS migration experts.
    Hope this helps.

  • Export/Import scripts not working

    We're trying to move a portal from one server to another (different hardware). We haven't been able to complete this process successfully. Are there any updated scripts out there from Oracle that make this possible?

    Three questions:
    1. Have you tried using the Portal export/import utilities?
    2. Have you installed Portal and Login Server in the target database before attempting 1 above?
    3. Have you patched the Portal repository to the same patch level as the source database's Portal repository?

  • Where can I find Export/Import scripts?

    Hello,
    I have read about these scripts in the Oracle9iAS Portal Export/Import document:
    secexp.csh
    contexp.csh
    pageexp.csh
    but I can't find them. Can you help me? In which directory are they?
    Thanks in advance.

    The upgrade scripts are available at:
    http://technet.oracle.com/support/products/iportal/support_index.htm
    They say they are for Solaris, but they include the code for NT also. We have used these scripts to upgrade EA to prod on NT with no problems.
    Ken Atkins
    Computer Resource Team (www.crtinc.com)

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2 and the OS is Solaris, on Oracle DBs.
    I am unfortunately not experienced with Discoverer, and there is no one available to assist for various reasons, so I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen, and the log file just ended.
    I assumed a memory problem or slow network was causing this delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server itself, to try to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev R11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the prod one. I managed to get a prod .eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option: exporting a DB dump of EUL_US and importing it into my R12 dev environment. I managed this, but had to export the DB file as sys, which worked around a privilege problem when logging in. I have reports that run, and my user can see the reports, but reports that were not shared to sysadmin in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the sysadmin ones.
    I refreshed the BAs using a shell script I made up which uses the java command with parameters.
    After some rereading, I tried selecting all the options in the Validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the "substitute missing item" dialogue boxes.
    My question to the forum: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges, and also some of the imported tables appeared in the target environment under the apps schema (21 tables) even though the EUL_US export had contained 48 tables.
    I also had a problem on the DB with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the EUL_US DB privileges and was unable to resolve this initial privilege error, even though the privileges had been granted to EUL_US.
    The DBA managed to exp as system and then import with the full=y flag on the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made up a list of the business areas, and wrote a shell script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully, and I can log in, select a business area, and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. When I opened Desktop I noticed some reports sitting there prefixed with a hash and her version 11 user ID.
    So back to the manuals: in the Discoverer Admin help, the instructions are to first go to View > Validate, select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about missing items. I assume this is because the item identifiers brought across in the DB dump are the version 11 ones and thus are not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Oracle 9 spatial index export/import

    Hi,
    when exporting/importing a user via exp/imp, I encounter a problem with numeric character encoding during creation of a spatial index.
    The imp tool produces a script like this:
    "BEGIN "
    "execute immediate 'INSERT INTO USER_SDO_GEOM_METADATA values (''POINTS'',''GEOMETRY'',mdsys.SDO_dim_array(MDSYS.SDO_DIM_ELEMENT(''X'',50000000,160000000,,005),MDSYS.SDO_DIM_ELEMENT(''Y'',450000000,600000000,,005)),,005) ' ; "
    "COMMIT; END;"
    The problem is the wrong representation of the numeric value '0.005' as ,005.
    Originally, the metadata was inserted via:
    INSERT INTO USER_SDO_GEOM_METADATA
    VALUES (
    'points',
    'geometry',
    MDSYS.SDO_DIM_ARRAY(
    MDSYS.SDO_DIM_ELEMENT('X', -125000000, -115000000, '0.005'),
    MDSYS.SDO_DIM_ELEMENT('Y', 30000000, 42100000, '0.005')
    ),
    NULL
    );
    Any hints on how to reimport the index correctly?
    Thanks,
    Bogart

    You might need to set the NLS_LANG environment variable to get character set conversion done on import (I don't know this unfortunately - it has never been a problem for me).
    There is a section on character set conversions in the utilities manual.
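    If the decimal separator turns out to be the issue, a hedged workaround is to force US numeric formatting in the importing session before running imp; the territory component (AMERICA) is what controls the decimal separator, and the character set, credentials and dump file name below are only placeholders:
    export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
    imp scott/tiger file=points.dmp full=y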

  • Regarding Distribution Monitor for export/import

    Hi,
    We are planning to migrate a 1.2 TB database from Oracle 10.2 to MaxDB 7.7, and we are currently testing the migration on a test system with the 1.2 TB of data.
    First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hours, but the import ran for more than 88 hours, so we aborted the import process. Later we found that we could use Distribution Monitor to distribute the export/import load across multiple systems so that the import would complete within a reasonable time. We used 2 application servers for the export/import; the export completed within 14 hours, but here again the import ran for more than 80 hours, so we aborted it. We also did table splitting for the big tables, but no luck. 8 parallel processes were running on each server, i.e. one CI and 2 app servers. We followed the DistributionMonitorUserGuide document from SAP.
    I observed that on the central system, CPU and memory utilization was above 94%, but on the 2 application servers we added, CPU and memory utilization was very low, i.e. 10%. Please find the system configuration below:
    Central Instance - 8CPU (550Mhz) 32GB RAM
    App Server1 - 8CPU (550Mhz) 16GB RAM
    App Server2 - 8CPU (550Mhz) 16GB RAM
    Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state while all the other 7 R3load processes were sleeping, whereas on the central instance all 8 R3load processes were in the run state. I think the fact that not all 8 R3load processes were running at a time on the app servers could be the reason for the very slow import.
    Can someone please let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and any specific documents available for migrating from Oracle to MaxDB would also be helpful.
    Thanks,
    Narendra

    > Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state while all the other 7 R3load processes were sleeping... I think that could be the reason for the very slow import.
    > Can someone please let me know how to improve the import time?
    R3load connects directly to the database and loads the data. The question here is: how is your database configured (in the sense of caches and memory)?
    > Also, if someone has done a database migration from Oracle 10.2 to MaxDB, it would be helpful to hear how they did it, and any specific documents would also be helpful.
    There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do/assist the migration. Those consultants are trained specially for certain databases and know the tips and tricks for improving the migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the UI export/import process described here: http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files is size. For example, there is one site of 114 MB that produces a CMP file with no XML files. Small sites do not have this problem. If size is the problem, then I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting via the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML it produces does not contain translatable content, and I have not found any Export-SPWeb parameters that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
    As a next step we could try developing custom code using the Publishing Service, but before doing that, I would like to understand why the export/import process in the UI generates a CMP that contains no XML files.
    If no one can answer this question, I would appreciate some general help on understanding exactly what is happening with the export/import process, that is, the one that runs when you select the export or import option in the Site Manager drop-down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning, waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking, like you, that it must be a size issue. Not sure though. Did you ever happen to find a solution to this problem?
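    For comparison, a minimal Export-SPWeb invocation (the site URL and output path are placeholders) looks like the line below; as described above, it produces a generic content migration package rather than the translation package the variations UI builds:
    Export-SPWeb -Identity "http://portal/sites/en-us/subsite" -Path "C:\exports\subsite.cmp" -IncludeVersions All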

  • Export Import in Portal Rel 2

    Hi
    How do I export/import pages and other content in Portal Release 2? The pageexp/imp and appexp/imp scripts are not to be found. What has happened to these in Release 2?
    TIA
    Bijesh

    For the Portal 9.0.2 Export/Import FAQ, try this link instead. The FAQ was just updated today, so the URL was also changed:
    http://portalstudio.oracle.com/pls/ops/docs/FOLDER/COMMUNITY/OTN_CONTENT/MAINPAGE/DEPLOY_PERFORM/EXPORT_IMPORT_FAQ_1127.HTM
    Best Regards,
    Harry

  • Export/import in 904 versus 10.1.2

    We have two 10g AS 904 installs (dev and prod - same O/S too). From time to time we move our production portal content to the dev server. We sometimes get corruption and have to use the Schema Validation Utility scripts to fix the import on the dev server. Not a show-stopper, but this adds another manual step to the process.
    I see the 10.1.2 version has improved import/export checks. Can anyone give feedback on the improvements in the export/import process?
    Thanks
    Kevin

    Kevin,
    Careful with this approach. Going from DEV to PROD is OK, but then back to DEV again, what I suggest is that you make a clean start on DEV, i.e., clean the DEV machine and later do the same on PROD. Portal Export / Import works only one-way, not both ways (this basically has to do with the checks we make and with possible conflicts between the objects on the two sides). Also check the available documentation, which is all compiled in Metalink Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues.
    As to the improvement of the process, please check the New Features papers (do a find on "export" - it makes it easier to pick out the references):
    10.1.2 -- http://www.oracle.com/technology/products/ias/pdf/1012_nf_paper.pdf
    10.1.4 -- http://www.oracle.com/technology/products/ias/portal/pdf/portal_1014_new_features.pdf
    I hope it helps...
    Cheers,
    Pedro.

  • Export Import single table...

    Gurus...
    I am working on a single table which needs to be exported from prod and imported into test.
    As I understand it, I need to follow the steps below:
    1. Test - export table abc (dump as backup)
    2. Prod - export single table abc
    3. Test - drop table abc cascade constraints
    4. Test - import abc into test
    Export par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= expdp_abc.log
    content= all
    tables=user.abc
    exclude=statistics,object_grant, tablespace_quota
    flashback_time= systimestamp
    Import par file:
    Directory= dbpump
    dumpfile= expdp_abc.dmp
    logfile= impdp_abc.log
    content= all
    tables=user.abc
    table_exist_action=replace
    transform=segment_attributes:N
    My doubts:
    1. Is my process flow correct?
    2. Are the export & import par files correct, or are any parameters missing?
    3. What happens to all objects connected to the table - will those also be imported?
    4. Do I need to lock the user during this process?
    5. Is there any script to check whether all objects connected to the table exist in test after the import?

    Hi,
    Process for table export & import:
    1. Create a database directory in test as well as production:
    --> create or replace directory directory_name as 'physical_path';
    --> grant access on that directory to the user.
    2. Back up the table in the test environment (in case you need the old data in your test env):
    --> create table BKP_table_name as select * from table_name; (the table you want to import)
    3. Take an export backup in the production database:
    --> expdp dumpfile=file_name.dmp logfile=file_name.log directory=directory_name tables=owner.table_name
    4. On the test server, do the following activity:
    a. First check for dependencies on that table using DBA_DEPENDENCIES.
    b. Truncate the table you want to import.
    c. Import the table:
    impdp dumpfile=file_name.dmp tables=owner.table_name logfile=file_name_imp.log directory=directory_name table_exists_action=APPEND
    d. Check for dependencies on that table again using DBA_DEPENDENCIES.
    You can also use table_exists_action=REPLACE.
    This will also import all dependent objects of the table.
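    A quick way to check those dependencies (the owner and table name here are placeholders) could be:
    select owner, name, type from dba_dependencies where referenced_owner = 'SCOTT' and referenced_name = 'ABC';
    Run it in production before the export and in test after the import, then compare the two lists.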
    Regards,
    Edited by: XBOX on Dec 14, 2012 10:15 PM

  • Export/import utility for portals: opeasst ; where to launch from?

    Hi all,
    Does the script opeasst.csh have to be run from the infra home or the midtier home? I think that for exports of pages and portlets from 904, Chapter 10 on export/import requires us to launch the export from the mid-tier.
    However, if I set up the midtier as my home and intend to use opeasst.csh, I do not see the script sitting in the directory where it is supposed to reside:
    ORACLE_HOME\portal\admin\plsql\wwu
    Instead it is in the ORACLE infra home:
    ORACLE_HOME\portal\admin\plsql\wwu
    Please help. It is URGENT.
    I am using OracleAS 10g Portal version 9.0.4.

    I've been doing some runs and had typical DB exports at about 12 GB per hour (compressed to a ~7 GB file). The typical DB I/O bottleneck was loosened by pointing the export at a different file system/disk from the one where the DB data files reside (CPU at ~80% of a single CPU).
    Now the import is the bigger item.

  • Export/Import Error: The security token could not be authenticated

    We are currently working in PLM 6.1.1, and users are experiencing export/import issues; the error appears frequently for several users.
    Steps:
    1. A new token is generated from our QA environment
    2. The user logs into Dev and transfers the token
    3. In the export ADMIN area the user selects a section
    4. In the QA environment the user schedules the import
    5. The import is scheduled; however, the error is received after a few minutes
    Error Message:
    The security token could not be authenticated or authorized ---> The directory service is unavailable.
    at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
    at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
    at Xeno.Prodika.XenoDoc.Handlers.DRL.DrlService.GetAttachment(tIdentifier Identifier)
    at Xeno.Prodika.XenoDoc.Handlers.DRL.DrlWebServiceLifecycleHandler.Load(IXDocument xdoc, String pkid)
    at Xeno.Prodika.XenoDoc.BaseLibraryManager.LoadDocumentPhaseII(IXLibraryConfiguration libConfig, IXDocument xdoc, String pkid)
    at Xeno.Prodika.XenoDoc.BaseLibraryManager.LoadDocument(String pkid)
    at Xeno.Prodika.ExportImport.DataExchange.ImportRequestProcessor.ProcessRequest(IApplicationManager applicationManager, IImportRequestQueue request)
    This error can be difficult to reproduce but occurs periodically.

    This is likely a DRL issue. Verify that DRL is configured correctly and that a valid PLM4P user is set up in the Setup Assistant. In addition, make sure you added the new app in IIS for DRLService (this is a doc bug we are correcting; we failed to include it in the 6.1.1 guide). Verify that you can attach, and then open, an attachment on a material spec.

  • EXPORT/IMPORT Q & A

    Product: ORACLE SERVER
    Date written: 2002-04-08
    EXPORT/IMPORT Q & A
    ===================
    PURPOSE
    The following are the main questions and answers concerning EXPORT and IMPORT.
    Explanation
    [Question 1] How do EXPORT and IMPORT relate to the RDBMS (what is catexp.sql)?
    During export and import, information about already-created objects is queried from the data dictionary, and this object information is spread across several tables within the data dictionary.
    To make the necessary data dictionary information convenient to use, a number of views are scripted in catexp.sql.
    This script file resides in $ORACLE_HOME/rdbms/admin and is set up to be run at install time.
    [Question 2] Is there an order in which objects are backed up during export?
    There is an order in which objects are exported, and it can change when new objects are introduced, for example by an Oracle version upgrade.
    The export order of objects is as follows:
    1. Tablespaces
    2. Profiles
    3. Users
    4. Roles
    5. System Privilege Grants
    6. Role Grants
    7. Default Roles
    8. Tablespace Quotas
    9. Resource Costs
    10. Rollback Segments
    11. Database Links
    12. Sequences (includes grants)
    13. Snapshots (includes grants, auditing)
    14. Snapshot logs
    15. Job Queues
    16. Refresh Groups
    17. Cluster Definitions
    18. Tables (includes grants, column grants, comments, indexes, constraints, auditing)
    19. Referential Integrity
    20. POSTTABLES actions
    21. Synonyms
    22. Views
    23. Stored Procedures
    24. Triggers
    25. Default and System Auditing
    [Question 3] What are BUFFER and RECORDLENGTH in export?
    - BUFFER
    During export, several rows of an object are fetched at once. Information fetched from disk passes through memory before being written to the file, and the amount of memory allocated at that point is the value of the BUFFER parameter.
    - RECORDLENGTH
    This parameter determines how much of the export data held in memory is carried to the file in a single write.
    [Note] It is efficient to make the BUFFER and RECORDLENGTH values multiples of the O/S block size.
    [Question 4] When exporting or importing a large number of rows, is it possible to tell how many rows have been processed?
    Yes. Up to V7.1 there was no way to see how far a large export or import had progressed, so you could not tell whether it was still working or the system was hung; from V7.2 on, you can check using the FEEDBACK option.
    [Question 5] How many rows are fetched at a time during export?
    The number of rows fetched at once is related to the buffer size. The space one row occupies during export is (the sum of the column sizes) + 4 * (the number of columns). The number of rows fetched at once is then the buffer size divided by the export size of one row. From this, the size of the exported output file is roughly the export size of one row times the number of rows.
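    For instance (a made-up table, purely for illustration): with three columns of 10, 20 and 30 bytes, one row's export size is (10 + 20 + 30) + 4 * 3 = 72 bytes, so with BUFFER=65536 roughly 65536 / 72, i.e. about 910 rows, are fetched per call.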
    [Question 6] What is the compatibility story for export and import?
    Export/import compatibility is directly tied to the Oracle version. It can be described in four categories, explained using the following assumption: suppose Oracle V7.0 is installed and running on machine A, and Oracle V7.1 on machine B. Call Oracle V7.0 "X" and Oracle V7.1 "Y".
    - Base compatibility: exporting an X DB with X's exp and importing it into an X DB with X's imp. Naturally this is supported.
    - Upward compatibility: exporting an X DB with X's exp and importing it into a Y DB with Y's imp. Oracle supports this as well.
    - Downward compatibility: exporting a Y DB with Y's exp and importing it into an X DB with X's imp. This may or may not be supported.
    - Cross compatibility: exporting a Y DB with X's exp (over SQL*Net) and importing it into an X or Y DB (using the appropriate imp). This may or may not be supported.
    [Question 7] In what cases does downward compatibility fail?
    V7.2 has an option called hash cluster expressions. If a cluster is created and used with this option and then exported, taking it down to V7.0 or V7.1 fails because the options in the CREATE CLUSTER statement do not match.
    [Question 8] What causes the EXP-37 error (export views not compatible with database version)?
    This error occurs with cross compatibility: the views that export uses (created by catexp.sql) do not match the Oracle version. To resolve it, install views that the exp in question can use.
    [Question 9] Can only users with DBA privileges perform a full export?
    In Version 6 only users with the DBA privilege could perform a full export; in V7, a full export is possible even for a non-DBA once the EXP_FULL_DATABASE role has been granted.
    [Question 10] Why does an imported table sometimes land in a tablespace other than the DEFAULT tablespace?
    For example, suppose user scott's default tablespace is USERS, yet after an import the table was created in the TOOLS tablespace. The reason is as follows: the imported table originally lived in the TOOLS tablespace, and scott currently either holds a quota on the TOOLS tablespace or has been granted the UNLIMITED TABLESPACE privilege (included in the RESOURCE role).
    To have tables created in the DEFAULT tablespace on import, set the quotas on all tablespaces other than the DEFAULT tablespace to 0, revoke the UNLIMITED TABLESPACE privilege, and then run the import. Leave only the quota on the DEFAULT tablespace unlimited.
    For example:
    $ sqlplus system/manager
    SQL> alter user scott
    quota 0 on system
    quota 0 on tools
    quota 0 on data
    quota unlimited on users;
    SQL> revoke unlimited tablespace from scott;
    Then run the import. Of course, tablespaces on which the user was never given a quota do not matter, and if the UNLIMITED TABLESPACE privilege (or the RESOURCE role) was never granted, the REVOKE command is not needed either.
    [Question 11] Core dump/segmentation fault during import
    Oracle has character sets. In Korea, US7ASCII or KO16KSC5601 is mainly used; if the character set where the export was taken differs from the character set where the import is done, the import may core dump or abort with unexplained errors.
    In that case, one approach is to convert the export dump file to the importing side's character set with a convert program and then import; another is to change one database's character set so the two match and then export/import. Of the two, using the convert program is simpler; on Unix it can be compiled with cc and used.

    I'm talking about the wsusutil export/import process.
    Oh! That's NOT what you asked. What you asked is:
    "my question is, do I need to export all files again?"
    As for the WSUSUTIL functionality, that is an ALL or NOTHING operation. You have no choice in the matter.
    Lawrence Garvin, M.S., MCSA, MCITP:EA, MCDBA
    SolarWinds Head Geek
    Microsoft MVP - Software Packaging, Deployment & Servicing (2005-2014)
    My MVP Profile: http://mvp.microsoft.com/en-us/mvp/Lawrence%20R%20Garvin-32101
    http://www.solarwinds.com/gotmicrosoft
    The views expressed on this post are mine and do not necessarily reflect the views of SolarWinds.
