Export and re-import on same database

For one reason or another we want to do a full export and then re-import all the data back into the same tablespaces... Is there a command to tell imp to truncate and overwrite all the data before importing? Or do I have to manually drop all the tables (by "manual" I mean write a script to do it) and/or drop all the schemas?

Not sure why you want to export and import the same thing back into the same database, but anyway, to answer your question: yes, it is possible with Oracle 10g by using TABLE_EXISTS_ACTION={SKIP | APPEND | TRUNCATE | REPLACE}. With Oracle 9i, you may have to generate a dynamic script to truncate the tables... Make sure you exclude the SYS/SYSTEM tables from the truncate script... did I say except SYS/SYSTEM and so on :-)
HTH
Thanks
Chandra Pabba
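As a sketch of the two routes described above - Data Pump on 10g and a generated truncate script on 9i. The connect string, directory object, and dump file names below are placeholders, not taken from the thread:

```shell
# Build the Data Pump import command (10g+). TABLE_EXISTS_ACTION=TRUNCATE
# truncates each existing table before loading the exported rows back in.
build_impdp_cmd() {
  local dumpfile="$1" action="$2"
  printf 'impdp system FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=%s TABLE_EXISTS_ACTION=%s\n' \
    "$dumpfile" "$action"
}
build_impdp_cmd full_exp.dmp TRUNCATE

# For 9i (no Data Pump): generate a truncate script, excluding the
# dictionary schemas, and review it before running it in SQL*Plus.
cat > gen_truncate.sql <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
SPOOL truncate_all.sql
SELECT 'TRUNCATE TABLE ' || owner || '.' || table_name || ';'
  FROM dba_tables
 WHERE owner NOT IN ('SYS', 'SYSTEM');
SPOOL OFF
EOF
```

In practice you would extend the NOT IN list with the other Oracle-maintained schemas in your database before running the generated script.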

Similar Messages

  • Difference Between Export and Direct Import Process.

    Hi SD Gurus,
    Can anyone help me understand the difference between the Export and Direct Import process?

    Hi,
    export = out of the system or country
    import = into the system or country
    For example in SAP-ABAP
    On the EXPORT side
    DATA: ls_indx TYPE indx.
    ls_indx-aedat = sy-datum.
    ls_indx-usera = sy-uname.
    ls_indx-pgmid = 'ZNAME'.
    EXPORT tab = lt_orders
    TO DATABASE indx(v1)
    FROM ls_indx ID 'PERCENT'.
    On the IMPORT side
    DATA: ls_indx TYPE indx.
    ls_indx-aedat = sy-datum.
    ls_indx-usera = sy-uname.
    ls_indx-pgmid = 'ZNAME'.
    IMPORT tab = lt_orders
    FROM DATABASE indx(v1)
    TO ls_indx ID 'PERCENT'.
    Regards,
    Raj

  • When exporting and then importing Apple Mail from one mac to another I get "Some messages could not be imported. Partially imported mailbox will be labeled IMPORT" . That mailbox does not appear.

    Any help would be welcome

    This is a bug in Mail. The mailbox archiving function does not include attachments, which causes the error on import. To work around this, you can create a mailbox in Mail and copy into it all the emails you want to export. Then go in Finder to [user]/Library/Mail/Mailboxes and select the created mailbox. Copy this to the Public directory of your other Mac. Then start Mail on that other Mac and import the mailbox from the Public directory. It should work now.
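The copy step of that workaround can be scripted. A minimal sketch - the mailbox name and destination below are assumptions, adjust them to your machine:

```shell
# Copy an exported mailbox folder to a shared location so the other Mac
# can pick it up. Arguments: source mailbox folder, destination folder.
copy_mailbox() {
  local src="$1" dest="$2"
  mkdir -p "$dest"
  cp -R "$src" "$dest"/
}

# Example (hypothetical paths):
# copy_mailbox "$HOME/Library/Mail/Mailboxes/Export.mbox" "/Users/Shared"
```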

  • Updatable Materialized View and Master Table on same database

    Hi all,
    My first question - Is it possible to have an Updatable Materialized View and the associated Master Table located on the same database?
    This is the requirement scenario:
    One unique database D exists.
    A is a batch table. Only inserts are allowed on Table A.
    M is an updatable materialized view on Table A (Master). Only updates are allowed on M (no insert or delete).
    Requirement is to push updates/changes from M to A periodically and then get the new inserted records from A into M via a refresh.
    Is this possible? What other approaches are applicable here?

    John,
    My question is related to the implementation and setup of the environment as explained in the above example. How can I achieve this considering that I have created an updatable m-view?
    If possible, how do I push changes made to an updatable m-view back to its master table when/before I execute DBMS_MVIEW.REFRESH on the m-view? What is the procedure to do this if both the table and the m-view exist on the same database? Do I need to create master groups, materialized view refresh groups, etc.?
    One more thing.. Is there a way to retain changes to the m-view during refresh? In this case, only newly inserted/updated records in the associated table would get inserted into m-view. Whereas changes made to m-view records would stay as-is.
    Hope my question is directed well. Thanks for your help.
    - Ankit

  • Export and Re-import bulk PTR reverse lookup records

    Hi,
    If I were to export an existing zone, let's say 0.168.192, which has hundreds of PTR records, and I want to create new zones 1.168.192, 2.168.192, 3.168.192 and so on - what would be the best way to export this whole zone (0.168.192) and import certain records from it into one of the new zones?
    For example 0.168.192 around 200 PTR records to be exported and then these records to be imported into the new zone 1.168.192?
    Would a CSV be achievable? And if so, what columns and command line would I have to use?
    Many thanks.

    Hi,
    To export a DNS zone with PowerShell, we can use the Export-DnsServerZone cmdlet.
    For detailed information, please refer to the link below:
    Export-DnsServerZone
    http://technet.microsoft.com/en-us/library/jj649939.aspx
    Besides, here is a blog about how to copy and merge DNS zones with PowerShell:
    DNS Zone Copy and Merge with PowerShell
    http://blogs.technet.com/b/ashleymcglone/archive/2014/07/31/dns-zone-copy-and-merge-with-powershell.aspx
    If you have any further questions about PowerShell, to get better help, please post them on the forum below:
    https://social.technet.microsoft.com/Forums/windowsserver/en-US/home?forum=winserverpowershell
    Best Regards.
    Steven Lee
    TechNet Community Support
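If you end up post-processing an exported zone file by hand, the PTR records can be pulled out with a simple text filter before re-importing them into the new reverse zone. A minimal sketch, assuming a BIND-style export where each record line carries a PTR type token (file names are placeholders):

```shell
# Print only the PTR records from an exported zone file so they can be
# reviewed, split by subnet, or re-imported into the new reverse zone.
extract_ptrs() {
  awk '{ for (i = 1; i <= NF; i++) if ($i == "PTR") { print; next } }' "$1"
}

# Example (hypothetical file name):
# extract_ptrs 0.168.192.in-addr.arpa.dns > ptr_records.txt
```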

  • Export and re-import in to project keeping background transparent.

    Hi guys...
    I have had some joy and some not-so-joy... maybe I am doing something wrong.
    I have been using a smoke emitter and "flattened" this and re-imported it back into the project to save RAM, or to duplicate the layer (for more smoke) and again save RAM. But sometimes the new import has a transparent background layer and sometimes it is black. (I always set all the other layers to not viewable.) I have tried using keying to get rid of the black. But possibly there is a setting to keep any export/import back into the project transparent in the background, if it's in a group above the bottom one. I am using the QT Animation setting for exports. DVC PRO HD 1080p.
    Thanks.
    Cheers for all this great help... invaluable for a newbie.

    Set your background to transparent (press apple+J to open the window you need) as it may currently be set to "solid" and render the movie out using Animation, as this preset will support transparency. That should do it.

  • Speed up Client EXPORT and Client import

    Hi,
    how to speed up the client export/import process (from scc8/scc7)?
    There is no single profile if we want both user master records and the client-specific data, so can we overwrite the client twice - first with the user master data and then with the client-specific import?
    The main requirement is to export the Production client-specific data and users with authorizations via CLIENT EXPORT (finally we can import the requests generated by the client export), so that after importing to the DEV server it can be used as a Quality client.
    Please give your valuable suggestions on doing a client export and import to make a Quality client in the DEV server.
    DEV and PRD patch levels are different (remote client copy will fail in RFC system comparison)
    request you to give your valuable inputs.
    thanking you in anticipation,
    best regards,
    Raghav

    Hi Chowdary,
    This is not a big issue.
    Please find the list of transport requests (TRs) that have not been moved to PRD.
    Go to PRD -> STMS_IMPORT and find the TRs (those with a white symbol) not yet moved to PRD.
    Then ask the respective consultant about those TRs and re-import them into QAS; the old configuration will reappear in QAS. A TR that was already moved to QAS will still show in green only - just re-import it after getting confirmation from the respective consultant. Note that the user details will not be available.
    hope this is useful.
    Regards,
    Patan Thavaheer.

  • Export and then import those images in one step?

    I'm slightly annoyed that every time I want to use the export function to create jpeg versions of images, I then have to do the extra step of importing those jpegs into my catalog. Is there some way to do this in one step?
    If I need .tiff's or .psd's, I just use the edit-in-Photoshop function, because that has the option to automatically stack with the original, but there is no such option that I can find for other formats.
    Thanks

    Right, then until 2.0 comes out, I'll try the watched folder idea that John suggests. Yea, I was talking about raw files, but also psds or really any "digital neg".
    As for the workflow import-process-export, and the final product not needing to be re-imported because it theoretically needs no more processing: this would be fine if one used Lightroom to organize digital negs and Bridge to organize absolutely everything, including final versions for various media, but I'm glad that Adobe has realized that at least some of us are trying to use Lightroom without opening Bridge a lot of the time. Although I haven't even tried using Bridge that much, maybe there are advantages over Lightroom as a central repository.

  • Run Apex 4.2.6 and APEX 5 on same database parallel

    Hi,
    Is it possible to install Apex 4.2.6 and Apex 5.0 on the same database? If Apex Listener is used as Application Server, do I have to install two different Listener Instances as well?
    Thanks

    Hi,
    I do understand your argument that there should be no custom code in SYS. But in the case of the APEX engine it's a must-have to guarantee security. You may or may not know, but APEX uses the SYS.DBMS_SQL_SYS package (a more powerful version of the DBMS_SQL package) to execute application code/DML statements with the privileges of the parsing schema of the executed application. Obviously you don't want to grant execute on that package to any schema other than SYS, because of its power to run code in the name of another schema/user.
    I do not have an issue with public synonyms though - understandable from a scope-resolution perspective. For multiple versions, though, couldn't scope be dealt with using the Oracle logon schema (as configured on the web server side) and private synonyms?
    Unfortunately this would not be sufficient, because the Oracle logon schema (most of the time called APEX_PUBLIC_USER) specified for mod_plsql/ORDS is only used to call the entry points of the APEX engine, like the F procedure or WWV_FLOW.SHOW and WWV_FLOW.ACCEPT. This would work fine until custom PL/SQL code or SQL/DML statements of your application have to be executed. That is not done with the privileges/scope of APEX_PUBLIC_USER; instead the code is executed with the parsing schema specified for your application (that's where the SYS package described above comes into play). Otherwise APEX_PUBLIC_USER would have to be a highly privileged user with access to every schema, which isn't the case - instead it's a very low-privileged user.
    Let's continue our example. If the application's PL/SQL code or SQL/DML statement references one of our public APIs like V, APEX_APPLICATION, APEX_UTIL, ... or an APEX view, we are in the situation that the application schema has to resolve those references. But a schema can point to just one APEX version.
    How could that be solved?
    When copying the application to the new APEX version, the application parsing schema would have to be changed to a new 'proxy' application schema with access to the original application schema. The 'proxy' schema would point its private synonyms to the new APEX version. But even then it gets tricky if the original application schema has definer-rights packages which reference APEX APIs, because those would still point to the public synonyms. As I said, it's not so easy, and there are many traps customers could fall into, ending up in a situation where it's hard to diagnose that an application is trying to call different versions of APEX in the same runtime session.
    But as I said, we are totally aware of the situation that customers would like to do a slower one by one application upgrade of their APEX installations to avoid breaking apps.
    Regards
    Patrick
    Member of the APEX development team
    My Blog: http://www.inside-oracle-apex.com
    APEX Plug-Ins: http://apex.oracle.com/plugins
    Twitter: http://www.twitter.com/patrickwolf

  • Exported and later Imported .pst files retain the permissions from the original Mailbox.

    After migrating from Exchange 2003 to 2010 something strange happens when exporting folders to a .pst file and importing them in another mailbox.
    The access rights that are applied are those from the original mailbox.
    I have been searching the internet about this strange behaviour, but it seems our organization is the only one where it applies.
    Does anybody have any idea what the cause could be?

    Hi,
    For example, userA was granted permissions to the Inbox folder of userB's mailbox. You exported userB's Inbox folder to a PST file and then imported it into userC's mailbox. After that, userA had permissions on the Inbox folder in userC's mailbox as well. Is this what you encountered?
    Generally, permissions aren't preserved when you export folders to a PST file; only the content itself is exported. I recommend you remove this mailbox folder permission using the Remove-MailboxFolderPermission cmdlet and check the result.
    Best regards,
    Belinda
    Belinda Ma
    TechNet Community Support

  • Lightroom catalog export and re-import

    I am not new to Lr, and I do know how to export catalogs, collections, etc. However, what I am concerned about is exporting, let's say, a collection "as a catalog", taking it to another computer for edits, then bringing it back to the original computer or hard drive. My concern is how it comes back to the original computer. Is there a way to do this so that it goes right back exactly the way it was before I exported it? I don't want the edited collection to come back as a new or separate collection. Is this something that can be done?
    Thanks a lot.
    Richard

    Thanks for your response.  What I'm essentially wanting is for the new edits to go into the catalog and after that there being no indication there was ever an export/import... In other words my folder structure on the master computer to look the same.  Also, is it true that once the edits are imported back to original computer/catalog - the raw image files that were originally exported as part of the "export as catalog" can be deleted as they are now just dupes, or extraneous?
    Thanks again.
    Richard

  • Pre-export and post-import tasks

    Oracle 8i / Windows 2000 Server.
    I would like to export my database from my organisation and import it into my local machine at home.
    What precautions, pre-export tasks and post-import tasks do I have to take care of during the export and import?
    c.santhanakrishnan.

    Why are you exporting your organisations db to import at home? Are you aware of the privacy issues this raises? Do you have approval from your organisation?
    Exporting -
    o NLS_LANG is set correctly
    o Available space on disk
    o consistent = y
    o recordlength=65535
    o direct=y
    o log=<logfile_name>
    o do you require grants, indexes?
    o A feedback for progress,
    o But even with consistent=y, that is only table-level consistent, and there is a probability you may run into a problem with FKs (usually when exporting large amounts of data): tables at the beginning of the export can have child tables near the end of the export, and if those child tables have records inserted during the export, the children won't have a parent record on import and will be rejected.
    Importing -
    Some depend on the data volume. Do you want to -
    o NLS_LANG is set correctly
    o drop, re-create indexes to speed up import
    o put the database in NOARCHIVELOG (if possible)
    o need to also create public synonyms
    o check triggers and if they will fire?
    o need to create the schema/user to import (if it doesn't exist in the import database and is not a full export) ?
    o do the tablespaces that the export was taken from exist in the import database?
    o disable/enable FK constraints
    o enough UNDO space
    o enough archive log space (if in ARCHIVELOG mode)
    o set a buffer (size depends)
    o set a feedback to monitor progress (or check rows_processed in v$sql)
    o if data already exists in the tables, does it need to be truncated (which raises other issues - constraints etc.; take a backup prior to the truncate?)
    I probably missed stuff too.
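Several of the items above map directly onto classic exp/imp parameter files. A sketch of what those might look like - file names, sizes, and the feedback interval are illustrative, not from the thread (split these into two separate files in practice, and drop the comment lines if your version's parser objects):

```text
# export.par -- classic exp
FILE=full_exp.dmp
LOG=full_exp.log
FULL=Y
CONSISTENT=Y
DIRECT=Y
RECORDLENGTH=65535
FEEDBACK=10000

# import.par -- classic imp
FILE=full_exp.dmp
LOG=full_imp.log
FULL=Y
IGNORE=Y
BUFFER=10485760
FEEDBACK=10000
```

These would be invoked as `exp system PARFILE=export.par` and `imp system PARFILE=import.par`.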

  • I am importing a folder of pictures and they import the same picture?

    Hi,
    I am importing a folder of pictures and the picture information is correct, but the thumbnails are all the same picture. If I move the pictures onto the timeline, they are the same picture. I just upgraded from CS3 to CS5. I tried to update the program, but it said that Premiere Pro CS5 is up to date. I have tried different folders, but they import various pictures with incorrect thumbnails, and the pictures themselves are wrong too. Is this a program error?

    OK everyone, good news: I am not losing my mind.
    I resized them to 72 dpi at 640x480 (4:3 normal video, as specified by the new project). They still do not import correctly under any of the conditions I listed above. This time, I also dragged them from Adobe Bridge to the bin I was working out of. FYI, 300 dpi at 720x480 is EXACTLY the same file size as 72 dpi at 720x480. I also have been able to import onesie-twosie photos at full resolution, and they auto-fit in the video frame.
    I believe many of you were able to fix this problem with the re-sizing workaround; it did not work for me. I have a hard time believing resolution matters when the pixel dimensions are the same. This looks to me like a configuration issue or something wrong with the 64-bit-only version of CS5. CS4 works fine on the virtual machine I created to test (this is how I resolved the issue and got my project done).

  • Can CF and ASP Access the Same Database?

    Hello. I'm about to go into beta with a ColdFusion 8 site I just wrote, and I've just realized there could be a problem I haven't anticipated. There's another site, written in ASP, which will be accessing the same data at the same time. The data are in FoxPro for DOS, which (if I have the terminology right) is a 'file server' database. Both my site and the ASP site will be accessing it through ODBC. I will be accessing it on a read-only basis, and I believe the ASP site accesses it read-only as well. However, other employees in the company are constantly accessing it read/write.
    The ASP site has already been up and running for some time, and the office manager (it's a small company) finds that he is able to avoid conflicts by first stopping the ASP server each morning, then starting FoxPro, then re-starting the ASP server. (I do not mean the IIS server, but a little EXE written by the ASP programmer, which puts up its own little window with 'start' and 'stop' buttons on it.)
    However, I find that on my own PC, using the free Developer version of ColdFusion and accessing my site as localhost, if a DBF (database file) is opened in FoxPro (even if no one is currently writing to it), then my CF web site cannot access it, and vice versa. That is, if I start FoxPro first and open the DBF of interest, my web site can't access that DBF. And conversely, if I access it first via my web site, then FoxPro can't access it. In other words, the solution which works for the office manager and his ASP site does not work for me with CF on my local PC.
    Will matters be different in their system, where I will be installing my site tomorrow, along with the Standard version of CF 8? If it matters, the FoxPro data and the Web server will be on different boxes linked by a network. The operating system is Windows 2003, if I recall correctly.
    Should I be saying 'Oops!' right about now?
    I can probably come up with a workaround by having FoxPro write a record to an alternate DBF every time the DBF of interest to my site is updated, but will I have to do this?
    Thanks for your help.

    paross1 wrote:
    > there probably isn't much that can be done with a database that probably isn't designed for multiple simultaneous transactions.
    > Phil
    Not much can be done to make the database robust and handle simultaneous transactions. But you could wrap all your CFML that accesses this resource in named <cflock ...> tags so that access is single-threaded, thus restricting CF to one access at a time. Then wrap all of this up in <cftry><cfcatch ...> blocks to gracefully handle the cases when the database is locked up by some other user/process.
    You will still find many times when the CF application is not able to access the resource due to locking issues, but it will handle this gracefully and not just FAIL.
    Of course this will have serious performance, throughput and scalability consequences. But I suspect that if you are using something like FoxPro you are not building an application expected to handle heavy load.

  • Using more than one PU and PC for the same database

    I have a scenario described here: http://www.seamframework.org/Community/UsingTwoParallelNestedConversationsWithParentAtomicConversation
    which uses Seam, EJB3, JPA, Hibernate, Richfaces (modalPanel) and JSF.
    The question I have is the following:
    is there any negative consequence (memory consumption, performance hit, etc.) of using more than one persistence unit in the persistence.xml that points to the same EntityManagerFactory? I was thinking of having one PersistenceContext (Seam-managed PC - an extended PC which is conversation-scoped) which uses one PU and reserving the other PC for the modalPanel forms and backing beans (SFSBs).
    The reason I needed to use this solution/approach is so that when using Hibernate MANUAL flush with SMPC, I can achieve isolated synchronization of the PersistenceContext without updating values in the modalPanel forms and vice versa.
    Any tips on best practices or alternative solutions? thx.
    persistence.xml snippet:
       <persistence-unit name="boBETS">
          <provider>org.hibernate.ejb.HibernatePersistence</provider>
          <jta-data-source>java:/boBETSDatasource</jta-data-source>
          <properties>
             <property name="hibernate.dialect" value="org.hibernate.dialect.SQLServerDialect"/>
             <!-- <property name="hibernate.hbm2ddl.auto" value="validate"/>   -->
             <property name="hibernate.show_sql" value="true"/>
             <property name="hibernate.format_sql" value="true"/>
             <property name="hibernate.generate_statistics" value="true"/>
             <property name="jboss.entity.manager.factory.jndi.name" value="java:/boBETSEntityManagerFactory"/>
             <property name="hibernate.default_catalog" value="boBETS"/>
             <property name="hibernate.default_schema" value="dbo"/>
          </properties>
       </persistence-unit>
       <!-- using boBETS2 for ListValueParamAction for now! trying to isolate the em.flush() such that we can achieve atomic conversations for the
       base form as well as the popup form! -->   
       <persistence-unit name="boBETS2">
          <provider>org.hibernate.ejb.HibernatePersistence</provider>
          <jta-data-source>java:/boBETSDatasource</jta-data-source>
          <properties>
             <property name="hibernate.dialect" value="org.hibernate.dialect.SQLServerDialect"/>
             <!-- <property name="hibernate.hbm2ddl.auto" value="validate"/>   -->
             <property name="hibernate.show_sql" value="true"/>
             <property name="hibernate.format_sql" value="true"/>
             <property name="hibernate.generate_statistics" value="true"/>
             <property name="jboss.entity.manager.factory.jndi.name" value="java:/boBETS2EntityManagerFactory"/>
             <property name="hibernate.default_catalog" value="boBETS"/>
             <property name="hibernate.default_schema" value="dbo"/>
          </properties>
   </persistence-unit>

    What happens if I were to have 10 PUs and 10 PCs in the same app, whether they're all pointing to the same DB or not? What's the consequence of using "too many"?

