Export/import Oracle 9

Hi,
since I do not have a good knowledge of Oracle, I hope somebody can help me understand better how export/import works (Oracle 9).
A colleague has run an export from a production server; now he needs to run the import on a test server, but it fails.
Could someone be so kind as to let me know what I have to take into consideration before running both commands?
Let me explain in more detail:
I have noticed that the DB_BLOCK_SIZE is different on the two servers (the production server has DB_BLOCK_SIZE=4096, the test server has DB_BLOCK_SIZE=8192),
so the first thing I saw in the import log file is an error message saying that "tablespace block size 4096 does not match configured block sizes".
So my first question would be: is there any chance to run export/import on two servers with different DB_BLOCK_SIZE?
Then (due to my ignorance of the subject):
- is there any problem if the SID is different on the two servers, or is it not important?
- does it matter that the PATH of the "*.dbf" files is different on the two servers, or is this fact not taken into consideration at all (since only 'logical' data are considered)?
- do the tablespaces currently used on the production server have to exist on the test server in order to have a successful import?
Hopefully somebody has a few minutes to answer my questions.
Thanks a lot in advance.

Hi,
You are talking about the FULL export/import method, right? Well, Oracle 9i supports multiple block sizes in the same database, so it gives you the option of using several block sizes at once, which is especially useful when you are transporting tablespaces from another database with a different block size. In this case it will be necessary to set the initialization parameter DB_nK_CACHE_SIZE in the initialization parameter file. For example, in your case, where your standard block size is 8 KB, you will need to pre-create a tablespace with the 4 KB block size currently used on your production server, and to do that you must first set the DB_4K_CACHE_SIZE parameter on your test server. The DB_nK_CACHE_SIZE parameter is dynamic, so you can alter its value using the ALTER SYSTEM statement.
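To make this concrete, here is a minimal sketch for the test server (run as SYSDBA; the cache size, tablespace name, and datafile path below are example values only, so adapt them to your environment; for a full import the tablespace name should match the one used on production):
-- allocate a buffer cache for 4 KB blocks (dynamic parameter)
ALTER SYSTEM SET db_4k_cache_size = 16M;
-- pre-create the 4 KB tablespace the import expects
CREATE TABLESPACE tbs_prod_4k
  DATAFILE '/u01/oradata/test/tbs_prod_4k01.dbf' SIZE 100M
  BLOCKSIZE 4K;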
>>- is there any problem if the SID is different on the two servers, or is it not important?
No problem ...
>>- does it matter that the PATH of the "*.dbf" files is different on the two servers, or is this fact not taken into consideration at all (since only 'logical' data are considered)?
It is important that the tablespaces exist in the target database, but it does not matter where their datafiles are located. In this case only the logical data are taken into consideration.
>>- do the tablespaces currently used on the production server have to exist on the test server in order to have a successful import?
Yes. For a full import method this is very important, so you will need to pre-create the tablespaces in your target database, but in this case they need to be created with a 4 KB block size, as I said before. Anyway, you will need to run some tests.
Otherwise, if you just want to make a test, create a database on your test server using an 8 KB default block size and use the user-level export method instead.
Eg.:
If users A, B, and C use the tablespace TBS_1 in your prod database, then create the tablespace TBS_1 using the default block size (8 KB) in your test database and run the test below:
PROD DATABASE
exp a/pass file=a grants=n
exp b/pass file=b grants=n
exp c/pass file=c grants=n
TEST DATABASE
Create the users A, B, and C, then perform the tasks below (a sketch of the user creation follows after the imp commands):
imp a/pass file=a full=y
imp b/pass file=b full=y
imp c/pass file=c full=y
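For reference, pre-creating one of those users might look like this (the password and quota are placeholders, not values from the original post):
-- example only: create user A before importing its dump file
CREATE USER a IDENTIFIED BY pass
  DEFAULT TABLESPACE tbs_1
  QUOTA UNLIMITED ON tbs_1;
GRANT CONNECT, RESOURCE TO a;
Repeat the same for B and C before running the imp commands.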
In addition, please give some feedback ....
Cheers
Legatti

Similar Messages

  • Offline Instantiation of a Materialized View Site Using Export/Import

    Has anyone had any success performing offline instantiation of a materialized view site using export/import in Oracle9?

    This is what I wanted to ask in the forum. I want to use Data Pump for the initial instantiation because I believe that indexes (the actual index segments, not just the definitions) are also replicated with Data Pump.
    So, is it possible to do the instantiation using datapump?

  • Best practice metadata export/import for WebCenter portal

    Hi
    I'm using JDeveloper to build the WebCenter portal. When I, let's say, register a portlet producer with my portal, the information about it is stored in the metadata. In case I want to move my portal from the integrated WLS that JDeveloper provides to another WebLogic server with a WebCenter installation, the steps I need to do are:
    1) Export the metadata from JDeveloper
    2) Import metadata to my target Weblogic server
    3) deploy the application
    Am I missing something?
    To export/import metadata from JDeveloper to another WebLogic (equipped with WebCenter) server I should use the WLST script, right?
    Thanks

    A little correction: the iAS version is Oracle9i Application Server Release 1, version 1.0.2.2.2.

  • Export/Import utility

    With Oracle 10g, do we need to set NLS_LANG for exporting and importing data?

    The NLS_Lang FAQ says the following.
    "In Oracle9i the Export utility always exports user data, including Unicode data, in the character set of the database. The Import utility automatically converts the data to the character set of the target database."
    But it is not very clear from the Export/Import FAQ on Metalink.
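    If you want to verify whether any conversion will happen, a quick check on both databases is a sketch like this (assuming you have access to the data dictionary):
    -- show the database character set on source and target
    SELECT value
      FROM nls_database_parameters
     WHERE parameter = 'NLS_CHARACTERSET';
    If the two values match, no conversion takes place; if they differ, imp converts the data to the target database's character set, as the FAQ quoted above describes.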

  • Memory Limitation on EXPORT & IMPORT Internal Tables?

    Hi All,
    I have a need to export and import internal tables to memory. I do not want to export them to any database tables. Is there a limitation on the amount of memory that can be used for the EXPORT and IMPORT? I will free the memory once I have imported it. The maximum I expect would be 13,000,000 lines.
    Thanks,
    Alex (Arthur Samson)

    You don't have limitations, but try to keep your table as small as possible.
    Otherwise, if you are familiar with the ABAP OO context, try to use Shared Objects instead of IMPORT/EXPORT.
    SAP Help on Shared Objects: http://help.sap.com/saphelp_erp2004/helpdata/en/13/dc853f11ed0617e10000000a114084/frameset.htm
    Hope this helps,
    Roby.

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use set/get or export/import in BSP?
    We need to set/export some variables from a BADI and get/import them in the BSP application.
    Code snippet will be of great help..
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT / IMPORT statements for your requirement:
    from the BADI, use EXPORT to store the variable data in the INDX cluster database
    under a unique ID, e.g.:
    * Data declaration required for the INDX cluster record
    DATA: WA_INDX TYPE INDX.

    * Here CNAME is the variable you want to export
    EXPORT PNAME = CNAME TO DATABASE INDX(XY) FROM WA_INDX
      CLIENT SY-MANDT ID 'ZVAR1'.

    In the BSP application, use the IMPORT statement to fetch back the value stored under the same ID:

    IMPORT PNAME = LV_CNAME FROM DATABASE INDX(XY) TO WA_INDX
      CLIENT SY-MANDT ID 'ZVAR1'.

    * Delete the stored data once it is no longer needed, to avoid wasting memory
    DELETE FROM DATABASE INDX(XY) CLIENT SY-MANDT ID 'ZVAR1'.
    Regards,
    Samson Rodrigues

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing a strange issue with export/import from a memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in a SAP Workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how we can do the export/import from a module pool program to a Workflow method?
    Regards,
    Sanjana

    Hi Sanjana,
    the likely reason for sy-subrc = 4 is that EXPORT/IMPORT ... MEMORY ID only works within one internal session, and the Workflow method runs in a different session, so the ABAP memory is not shared between them. Use the INDX cluster database instead. Please check the link; you can find some examples there as well:
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    P.S.
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    You just need to create a key and an ID to save data into the INDX table. Once it is saved, you can use the same key and ID to import it.
    Again, don't forget to delete the data using the same key and ID once you have imported it, else it may give an error.
    Regards,
    Sandeep Katoch

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version in both environments is 10.1.2 and the OS is Solaris, on Oracle databases.
    I am unfortunately not experienced with Discoverer and there seems to be no one available to assist, for various reasons. So I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful; the system hung for many hours with a white screen and the log file just ended.
    I assumed this was a memory problem or slow network issues causing the delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server, to try to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev R11 system, but I dismissed this because the dev reports were not working and the only reliable EUL is the prod one. I managed to get a prod .eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a database dump of the EUL_US schema and importing it into my R12 dev environment. I managed this, but had to export using SYS, which alleviated a privilege problem when logging in. I have reports that run, and my user can see the reports, but reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the SYSADMIN ones.
    I refreshed the BAs using a shell script I made up which uses the java command with parameters.
    After some re-reading I tried selecting all the options in the validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the 'substitute missing item' dialogue boxes.
    My question to the forum is: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges, and also some of the imported tables appeared in the target environment under the APPS schema (21 tables) even though the EUL_US export had contained 48 tables.
    I also had a problem on the db with "EUL_US has insufficient privileges on tablespace DISCOVERER" type errors.
    I checked the EUL_US db privileges and was unable to resolve this initial privilege error even though the privileges were granted to EUL_US.
    The DBA managed to exp as SYSTEM and then import with the full=y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made up a list of the business areas, and made a shell script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in, select a business area, and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. I noticed some, if I opened Desktop, that were sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals, and in the Disco Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the db dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Project duplication:  Copy/Paste vs. Export/Import

    Anyone have any thoughts on project duplication?
    Presently I am copying a project, pasting it into multiple new locations, and then renaming the copies. Recently I was in a training session where the instructor suggested exporting the project and then importing it, claiming database size savings. Can anyone shed any insight?
    Thanks in advance.

    I can't imagine why an export/import would have a smaller footprint within the database than a project copy/paste. Not to mention that if you are talking about a relatively large project, it can take a substantial amount of time to export and import. One of our schedules, which has about 18k activities, takes approximately 1.5 hours to export/import, whereas a project copy/paste only takes about 10 minutes. Note that we do not copy the baseline schedules when replicating projects.

  • Error when exporting/importing

    I am using MDM 7.1.
    When I export the schema from my Dev system to import into QA, I get the following error:
    "This repository requires additional steps before transport. See the MDS log for details."
    In the log, my issue is that I am trying to export an assignment that includes an expression that uses look-ups.
    In my Dev system I removed the expression to confirm that this is the issue; once I no longer have expressions with look-ups, it allows me to export the schema. I then tried to import it into QA (since the expressions are not changing, I planned on excluding them from the import as a temporary workaround).
    However, I get the same error message when trying to import. It seems that I cannot export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and create them anew (manually) after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • Export/Import Parameters dissapear when using user-exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to avoid modifying BEGDA and ENDDA when I modify the record (IPSYST = 'MOD'). With this user exit, the parameters disappear from memory, so the dynamic action does not execute properly. What can I do to keep using the user exit? Anything to add?

    In the dynamic actions, when I create a record, I delimit records on the infotype with export/import parameters defined in the infotype module pool. When I delete, I prevent deleting the record if it is not the last one. With the user exit, the modification of BEGDA/ENDDA in the infotype is not allowed. But if I use the user exit, the dynamic actions which use the export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because I haven't got in PSAVE what I want.

  • Error in Client export/import

    Dear all
    I have a test system and I would like to perform a client export/import on a single system; correct me if I have made a mistake.
    I exported the client, which created 3 requests, namely SIDKOxxxxxx, SIDKTxxxxxx, and SIDKXxxxxx. I selected the same SID as the target system because I have only one system.
    Then I logged in again, created a new client with SCC4, logged into that client with SAP* and its password, and ran STMS_IMPORT to import all three transports into that client; as I said, my message server gets disconnected (correct me if I am wrong).
    Or should I run SCC7 directly, since the requests are already in /usr/sap/trans, so that these errors won't occur? I have a backup of the database.
    Now I am getting an 'Oracle disconnected' error.
    Regards

    Dear Pavan,
    I tried last night as you said: I made a new client 300 in IDES and copied a user profile to it.
    When I tried to configure STMS by setting CTC = 1, I did not find any client number when configuring the transport routes
    to provide the source and target client.
    Kindly send some screenshots if you have them; that would be a great support.
    1. Delete the existing TMS configuration.
    2. In client 000, using the user DDIC, create the TMS configuration using STMS.
    3. Click on the system; in the Transport Tool tab, add the parameter CTC and set it to 1.
    4. Now configure the transport routes; there you can find the client number in the consolidation routes.
    5. Provide your source and destination client and create the routes.
    Regards

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace it with this:
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this??
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or reviewing some of the line breaks within PL/SQL code in your processes etc. Not sure what would work for you.

  • Oracle 9 spatial index export/import

    Hi,
    when exporting/importing a user via exp/imp I encounter a problem with the encoding of numeric values during the creation of a spatial index.
    The imp tool produces a script like this:
    "BEGIN "
    "execute immediate 'INSERT INTO USER_SDO_GEOM_METADATA values (''POINTS'',''GEOMETRY'',mdsys.SDO_dim_array(MDSYS.SDO_DIM_ELEMENT(''X'',50000000,160000000,,005),MDSYS.SDO_DIM_ELEMENT(''Y'',450000000,600000000,,005)),,005) ' ; "
    "COMMIT; END;"
    The problem is the wrong representation of the numeric value '0.005' as ',005'.
    Originally, the metadata entry was created via:
    INSERT INTO USER_SDO_GEOM_METADATA
    VALUES (
    'points',
    'geometry',
    MDSYS.SDO_DIM_ARRAY(
    MDSYS.SDO_DIM_ELEMENT('X',-125000000,-115000000, '0.005'),
    MDSYS.SDO_DIM_ELEMENT('Y', 30000000, 42100000, '0.005')
    ),
    NULL
    );
    Any hints how to reimport the index correctly?
    Thanks,
    Bogart

    You might need to set the NLS_LANG environment variable to get the character set conversion done on import (I don't know this for sure, unfortunately; it has never been a problem for me).
    There is a section on character set conversions in the utilities manual.
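    If the decimal separator is indeed the culprit, one possible workaround (an assumption, not a verified fix) is to force the separator for your session and re-run the corrected metadata insert by hand before rebuilding the index:
    -- make '.' the decimal and ',' the group separator for this session
    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,';
    Setting the client's NLS_LANG to a locale that uses '.' as the decimal separator before running imp should have a similar effect on the generated script.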

  • Upgrade from 8.1.6 to 9.2.0.7 via export/import

    Has anyone done this, or does anyone know if it's possible to simply build a 9.2.0.7 instance and then use export/import to perform the upgrade from 8.1.6 to 9.2.0.7, since a direct manual upgrade from 8.1.6 is not supported?
    TIA

    Yes. Building a database in the new release, exporting the existing database using its own version of the exp utility, performing a binary copy of the dump file to the new server, and then running the new version's imp utility against the transferred dmp file was the standard method of moving databases to new platforms/versions for many years. The same process can also be used on the same server, between equal releases or from a current to a new release.
    HTH -- Mark D Powell --
