Options for moving data

I'm moving some rather large tables between two Oracle 10 instances.
In simple words I would like to copy a schema from one database to another. Simple, you say? Not so!
I cannot use Oracle Data Pump on the target host as I lack the required privileges (Oracle Data Pump is essentially a server-side tool, unlike the old 'imp' and 'exp' tools which are client-side tools).
The first option I try is good old-fashioned exp and imp. The export is pretty fast: only 1 hour. Then comes the import, which is painfully slow (we are talking an estimated 2-3 days). The reason is that Oracle is doing a commit per row because the tables contain TIMESTAMP columns. I'm wondering what is so special about TIMESTAMP columns that it forces Oracle not to use array inserts? Anyway, the restriction is fully documented by Oracle.
(Utilities Guide for Oracle 10.2 database, see description on import parameter BUFFER).
The tables I'm loading into have no indexes, and there are no indexes in the dump file. Furthermore, the target tables have been pre-allocated in size so they do not need to extend during the import. Lastly, logging is turned off on these tables. So everything should be in place for fast 'imp' performance. However, it is not, because of the TIMESTAMP restriction. Hmmm.
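For reference, a conventional-path import invocation along these lines would normally do array inserts of BUFFER bytes at a time (the file, schema names and sizes below are placeholders); with the TIMESTAMP columns the rows end up being inserted, and with COMMIT=Y committed, one at a time regardless:
imp user/password@target file=myschema.dmp log=myschema_imp.log fromuser=src_schema touser=tgt_schema buffer=10485760 commit=y indexes=n ignore=y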
Then I try other options. How about the COPY command in SQL*Plus? With this you can control the frequency of the commit. Unfortunately the COPY command only works on tables with columns of type CHAR, DATE, LONG, NUMBER and VARCHAR2. In other words: I cannot use this solution either.
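For the record, controlling the commit frequency with COPY would look roughly like the sketch below (connect strings and table names are placeholders); it is the TIMESTAMP columns that rule it out, not the mechanism itself:
SET ARRAYSIZE 5000
SET COPYCOMMIT 10
COPY FROM src_user/src_pwd@SRCDB TO tgt_user/tgt_pwd@TGTDB -
     INSERT target_table -
     USING SELECT * FROM source_table;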
What are my options? I simply want to move data from one Oracle 10 instance to another with reasonable performance. Altogether we are talking about 5-6 GB of data. Not a lot by modern standards. Why is it so difficult? Am I overlooking something obvious?
Thanks.

Thank you for responding.
I've considered dumping the data into a flat file and using SQL*Loader to import as you suggest, but this would require some scripting on a per-table basis. Again: all I want to do is copy the contents of a table from one database to another. I do not think I should have to resort to creating my own dump and load scripts in order to do that. However, I agree with you that this may end up being my final solution.
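For what it's worth, the per-table scripting would amount to an unload query plus a small SQL*Loader control file per table, something like the sketch below (table name, delimiter and timestamp mask are hypothetical), optionally run with direct=true on the sqlldr command line:
LOAD DATA
INFILE 'my_table.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
( id,
  description,
  created_ts  TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF"
)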
I've tried the db link solution. It was just as slow as the 'imp' solution for some reason. Don't know why. The tables are rather large (3 tables of a few GB each) and therefore require intermediate commits when loaded; otherwise the rollback segment will run out of space. So the 'db link solution' is really a PL/SQL script with a commit every x records.
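A minimal sketch of that kind of batched copy, assuming a database link called SRC_LINK and identically defined source and target tables (all names hypothetical):
DECLARE
  CURSOR c IS SELECT * FROM big_table@src_link;
  TYPE t_rows IS TABLE OF big_table%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c;
  LOOP
    -- fetch and insert 100,000 rows at a time, committing after each batch
    FETCH c BULK COLLECT INTO l_rows LIMIT 100000;
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO big_table VALUES l_rows(i);
    COMMIT;
  END LOOP;
  CLOSE c;
END;
/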
I think Oracle is making it a bit difficult for me to copy the contents of a table from one database to another and to do it efficiently. Perhaps I'm missing something here?

Similar Messages

  • Which CKM is used for moving data from Oracle to delimited file ?

    Hi All
    Please let me know which CKM is used for moving data from Oracle to a delimited file.
    Also, is there a need to define each column beforehand in the target datastore? Can't ODI take it from the Oracle table itself?

    Addy,
    A CKM is a Check KM, which is used to validate data and log errors. It is not going to assist you in data movement. You will need LKM SQL to File Append, as answered in another thread.
    Assuming that you have a one-to-one mapping, to make things simpler you can duplicate the Oracle-based model and create a file-based model. This will take all the column definitions from the Oracle-based model.
    Alternatively, you can also use the ODI tool odiSQLUnload to dump the data to a file.
    HTH

  • Delta Option for Generic Data Source

    Hi ,
    My requirement is a delta option for a generic DataSource. This DataSource is extracted via a function module.
    For this requirement I have followed the steps below:
    1) Added the field (on which delta gets enabled) to the extract structure by using an append structure.
    2) Enabled delta on this field.
    3) When I try to save the DataSource, I get a screen where I can see 4 columns:
    a) Selection
    b) Hide Field
    c) Inversion
    d) Field Only Known in User Exit
    By default, the Hide Field and Field Only Known in User Exit check boxes are checked.
    Would you please tell me what the Field Only Known in User Exit check box is used for?
    As my DataSource gets its data via the function module, shall I uncheck this check box?
    Thanks and Regards,
    K.Krishna Chaitanya.

    Hi Krishna,
    The "FIELD" should be checked if we write any enhancement in CMOD, so that system can identify that this particular field has enhancement written to it,
    But even in real time systems i usually find that even though we check the "FIELD" or not , there is no effect .
    So Checking "FIELD" or not will not impose any isses .
    Regards,
    Krishna

  • TS3899 I receive my gmails with a 15+ minute delay. This is because there is no option for "push" data, only "fetch", for a Gmail account. How can I fix that?

    I receive my gmails with a 15+ minute delay. This is because there is no option for "push" data, only "fetch", for a Gmail account. How can I fix that?

    Sign up for Google's paid Apps for Business service; you can then set things up using Exchange and you'll get push using Gmail. Otherwise, you can't get push using Gmail.

  • What are required Oracle products for moving data from IBM IMS/DB(mainframe) to Oracle environment?

    I am a z/OS system programmer; our company uses IMS as its main OLTP database. We are investigating moving data off the mainframe for data warehousing and online fraud detection. One option is using IBM InfoSphere CDC and DB2; another option is using IMS Connect and writing our own program. I am wondering what the Oracle solution is for this kind of requirement.
    I am not an Oracle technician, but I googled and found that Oracle has products like Oracle Legacy Adapters, the OracleAS CDC Adapter and Oracle Connect on z/OS; however, I didn't find them on the Oracle site (https://edelivery.oracle.com/), so I don't know whether these products are deprecated or not.
    I would very much appreciate any help or guidance you are able to give me.


  • What are my options for a data plan + upgrade?

    So my old phone (Samsung Intensity) broke today, and I'm looking to upgrade to a smartphone. I would prefer an iPhone, but really it's not that big of a deal if I get a Droid. I'm eligible for an upgrade currently. What are my options for a smartphone and a data plan? I'm looking for what's the cheapest option as far as this goes. People tell me an iPhone is free with an upgrade, and that the cost is $20 a month, but I just want to be sure.

    Unless you have an old calling plan, you should be able to add a tiered data plan for your new smartphone (i.e., $30 for 2 GB).  If you switched to the Share Everything plan, your monthly cost would be $40 line access + at least $50 for 1 GB data (includes unlimited minutes/texts and mobile hotspot, although the hotspot data usage comes out of your data plan), plus, of course, taxes/fees (and insurance, if applicable).
    You can check pricing for phones via your on-line MyVerizon account if you click on the Upgrade button.  If you are mainly interested in free phones (keep in mind that there's a $30 upgrade fee), there's the Nokia Lumia 822, Spectrum 2, Droid Razr M, iPhone 4, Galaxy Stellar, Marauder, Rhyme, and the Lucid.  Several of those are 3G phones.

  • Which options for moving a lightroom Catalog??

    Hi all,
    I am in the process of merging two catalogs and getting very lost while I do it. First I'll let you know what I've done.
    I exported the catalog I no longer want as a temporary catalog in my pictures library. Now, I'm trying to import that catalog into my main catalog.
    Both of these catalogs are on one main internal hard drive on my desktop.
    Here are my questions.
    1) Which option should I choose for New Photos - File Handling? A) Add new photos without moving, B) Copy new photos to a new location and import, or C) Don't import new photos.
    I was thinking of doing B and then choosing my other Lightroom folder as my copy destination, since I want to have all my files together (in one folder/catalog). Is this the correct way?
    2) Which option for Existing Photos (there is some overlap)? I can A) replace nothing, B) replace metadata and develop settings, or C) replace metadata, develop settings and negative files. I don't want any duplicates; however, many of the photos I am bringing over from the unwanted catalog have been edited, so I want those edits to come over. Which option should I choose?
    3) What is the virtual copy for and should I select it?

    Yep, that's exactly right Ray.  You can set which word label is represented by which colour.
    For example, here's 2 of the sets that are built in, and you can create your own sets by going to Metadata menu > Color Label Set > Edit.
    If the label 'words' don't match a color label in the current set, you get a white custom label instead.
    Look back on the PC and see what colors match which words, and repeat the same on the iMac.

  • Is Multimaster Timesten replication a good option for huge data volumes?

    Hi,
    There are 3 TimesTen nodes in our production setup. There will be around 5 million rows in each node initially, which will gradually increase to about 10 million. Once our application moves to production, there will be around 50-70 transactions per second on each node, which need to be replicated to the other nodes.
    Initially we thought of going with Active-Standby-Subscriber replication. However, in that case, if the active and standby nodes both go down, it becomes a site failure. So is an Active-Active (multi-master) configuration a good option? Will data collisions happen when replication happens in both directions?
    Thanks in advance.
    Nithya

    Multi-master replication is rarely a good idea. You will get data collisions unless you rigorously partition the workload. Conflict detection and resolution is not adequate to guarantee consistency over time. Recovery back to a consistent state after a failure is complex and error prone. I'd strongly advise against a multi-master setup, especially for a high volume system.
    You seem to be concerned that 2 out of the 3 systems may fail, resulting in a site outage. The likelihood of that is small if you have set things up with separate power etc. With the A/S pair based approach you would still have query capability if the two master systems failed. In reality, I would say the chances of all 3 systems failing are not that much less than the chances of just 2 failing (depending on the reason for the failure).
    Chris

  • Database Copy Doesn't find tables for moving data

    I'm running SQL Dev 3.1.7 on a Win7-64 PC against Oracle 11g, using the Database Copy function to move and update tables from one instance to another. The copy seems to work fine, creating all of the tables, but when the script tries to move the data, it fails for some tables as follows:
    Moving Data for object MY_TABLE_NAME
    Unable to perform batch insert.
    MY_TABLE_NAME ORA-00942: table or view does not exist
    Earlier in the script it dropped and then recreated the table in the target instance. The table does exist in the source instance. This occurred for 24 tables out of 105.
    Any ideas what is going on?

    According to the log, the table got created. Here are the log entries as they pertain to our APPOINTMENT table ...
    DROP TABLE "APPOINTMENT" cascade constraints;
    table "APPOINTMENT" dropped.
    DROP SEQUENCE "APPOINTMENT_SEQ";
    sequence "APPOINTMENT_SEQ" dropped.
    -- DDL for Sequence APPOINTMENT_SEQ
    CREATE SEQUENCE "APPOINTMENT_SEQ" MINVALUE 1 MAXVALUE 9999999999999999999999999999 INCREMENT BY 1 START WITH 17721 CACHE 20 NOORDER NOCYCLE ;
    sequence "APPOINTMENT_SEQ" created.
    -- DDL for Table APPOINTMENT
    CREATE TABLE "APPOINTMENT" ("APPOINTMENT_ID" NUMBER(19,0), "PRACTICE_PATIENT_ID" NUMBER(19,0), "PRACTICE_PET_OWNER_ID" NUMBER(19,0), "STAFF_MEMBER_ID" NUMBER(19,0), "APPT_STATUS_ENUM_NAME" VARCHAR2(32 BYTE), "EXT_PRACTICE_APPOINTMENT_ID" VARCHAR2(255 BYTE), "CONFIRMATION_KEY" VARCHAR2(255 BYTE), "REASON" VARCHAR2(255 BYTE), "EXT_REASON" VARCHAR2(255 BYTE), "EXT_SECONDARY_REASON" VARCHAR2(255 BYTE), "NOTE" BLOB, "APPOINTMENT_DATE" TIMESTAMP (6), "CREATED_DATE" TIMESTAMP (6), "CREATED_BY" VARCHAR2(255 BYTE), "LAST_MODIFIED_DATE" TIMESTAMP (6), "LAST_MODIFIED_BY" VARCHAR2(255 BYTE), "CONFIRMATION_TIME" TIMESTAMP (6)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_T" LOB ("NOTE") STORE AS BASICFILE ( TABLESPACE "VET2PET_T" ENABLE STORAGE IN ROW CHUNK 8192 RETENTION NOCACHE LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)) ;
    table "APPOINTMENT" created.
    It even truncates the table before attempting to load the data ...
    TRUNCATE TABLE "APPOINTMENT";
    table "APPOINTMENT" truncated.
    But when it goes to move the data ...
    --- START --------------------------------------------------------------------
    Moving Data for object APPOINTMENT
    Unable to perform batch insert.
    APPOINTMENT ORA-00942: table or view does not exist
    --- END --------------------------------------------------------------------
    There is a lookup table involved but at the time of moving the data none of the foreign keys have been enabled.
    All the indices for the table get created with no issue and here is the log entry for when the constraints are created and enabled ...
    -- Constraints for Table APPOINTMENT
    ALTER TABLE "APPOINTMENT" ADD CONSTRAINT "PK_APPOINTMENT" PRIMARY KEY ("APPOINTMENT_ID") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_I" ENABLE;
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_BY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_DATE" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CONFIRMATION_KEY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("EXT_PRACTICE_APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPT_STATUS_ENUM_NAME" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PET_OWNER_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PATIENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_DATE" NOT NULL ENABLE);
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    Any ideas?
    thanks.
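    One thing worth ruling out (this is only a guess; the object name below is taken from the log): ORA-00942 on the insert step can simply mean the connection doing the data move is pointing at a different user/schema than the one that ran the DDL, or lacks privileges on the new table. A quick check from the target connection:
    SELECT owner, table_name
      FROM all_tables
     WHERE table_name = 'APPOINTMENT';
    SELECT sys_context('USERENV', 'CURRENT_SCHEMA') AS current_schema,
           sys_context('USERENV', 'SESSION_USER')  AS session_user_name
      FROM dual;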

  • Options for auditing data changes

    Hi Friends,
    I thought I would get some input on the following implementation. The requirement is to audit some data changes within the system (Oracle 10.2 on RHEL 4.7).
    The audit must capture the before-images of the data and who changed it. I have looked at options like Oracle Auditing, FGA and so on, but these cannot give me the audit I need for the data changes: the before-images, when they were changed and by whom.
    The first thing that comes to mind is using triggers. Another option is using LogMiner. I have successfully tested both approaches. The setup is like this:
    1) For some critical tables for which audit is required, triggers were written (ours is an OLTP application).
    2) For some non-critical tables, LogMiner is used, called from a stored procedure that runs periodically.
    3) Audit data is stored in a different schema, with the same table names as in the base schema.
    I would like to know your thoughts on this.
    Thank You,
    SSN
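    For the trigger-based part (point 1), a minimal sketch of a before-image audit trigger might look like this; the table, columns and audit schema are hypothetical:
    CREATE OR REPLACE TRIGGER orders_aud_trg
      AFTER UPDATE OR DELETE ON orders
      FOR EACH ROW
    BEGIN
      -- store the before-image plus who/when/what
      INSERT INTO audit_schema.orders (order_id, status, amount,
                                       audit_action, audit_user, audit_time)
      VALUES (:OLD.order_id, :OLD.status, :OLD.amount,
              CASE WHEN UPDATING THEN 'UPDATE' ELSE 'DELETE' END,
              SYS_CONTEXT('USERENV', 'SESSION_USER'), SYSTIMESTAMP);
    END;
    /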

    The delay with LogMiner is acceptable for some less critical audit tables and, as you said, triggers can be implemented for the critical tables.
    One bottleneck with using LogMiner is that it depends on the availability of the archived redo logs: whatever backup mechanism is implemented must make sure that no archived logs are removed from the specified locations before the periodically running audit program (which uses LogMiner) has processed them.
    I'm wondering if there is any other recommended approach for this.
    Thanks
    SSN

  • Triggers for moving data

    Hi all.........
    I have defined several tiers (with their tablespaces), setting a retention time for each one, for example 1 month for a tier. It is supposed that once this month has passed, those datafiles (or partitions) will be moved into the next tier defined. But is there any trigger or condition to prevent moving data that is still active (still being used), even if it belongs to a tier whose retention time has expired?
    If yes, where can I set those conditions?

    The ILM Assistant (ILMA) is not designed to automatically move a partition from one tier to another. The ILMA will generate the scripts to move the partition from one tier to another, but it does not then execute those scripts, even though the calendar on the main page might make you think it was in fact building a schedule of when to move a partition. Once you have generated the SQL script from the ILMA, you can give it to your DBA for scheduling alongside all the other database operations, or you can schedule the execution of the script yourself.
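    As a rough illustration (the object and tablespace names are made up), the generated script boils down to statements of this kind, plus rebuilds of any local indexes on the moved partition:
    ALTER TABLE sales MOVE PARTITION sales_2008_q1 TABLESPACE tier2_data;
    ALTER INDEX sales_loc_idx REBUILD PARTITION sales_2008_q1 TABLESPACE tier2_idx;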
    If you could provide some more information about your specific use case it might be possible to provide a work around.
    Hope this helps
    Keith

  • Options for loading data from legacy mainframe to SAP CRM and ECC

    Hello All,
    I am new to SAP.
    As part of data conversion planning from a legacy mainframe to SAP CRM and ECC systems, I need to know the different tools available for loading the data, and a comparative analysis of the tools showing their features. Please also describe the process to evaluate the tools (prototyping, technical discussions etc.) and make recommendations.
    I would also appreciate any information on testing tools (like eCATT) and any third-party tools (like Informatica).
    I know I am asking for a lot of things here, but I really appreciate any information regarding this.
    Thanks,
    Balu

    Hi
    Data Migration(Conversions) mainly involves the following Major Steps
    Discovery->Extract->Cleanse->Transform->Load
    Discovery-> Involves identifying various source systems available where the actual data is residing.
    Extract->Extract the data from the source systems(legacy systems)
    Cleanse->Cleanse the data by enriching it, de-duplicating it, etc.
    Transform->Transform the data into an SAP-loadable format
    Load->Upload the data into SAP using the load programs (e.g. BDC, LSMW, Data Transfer Workbench, BAPIs etc.)
    I would also suggest visiting http://www.businessobjects.com/solutions/im/data_migration.asp
    Cheers,
    Hakim

  • Sample pgm for moving data from table control to internal table

    Hi Experts,
    I am a newbie to ABAP and I don't have good material on table controls. I'd appreciate it if you could direct me to a good source of knowledge on table controls.
    The problem at hand: I am trying to move info/data from a table control (in Screen Painter, input and output mode) to an internal table (ITAB) but couldn't. A sample program would help, if possible.
    MODIFY itab INDEX tc-current_line.
    The above statement is not inserting new lines into ITAB. Help me!
    Thanks for your time

    Hi,
    Do it like this:
    * Screen flow logic: PAI handling for table control 'TAB1'
    PROCESS AFTER INPUT.
    *&SPWIZARD: PAI FLOW LOGIC FOR TABLECONTROL 'TAB1'
      LOOP AT itab_det.
        CHAIN.
          FIELD itab_det-comp_code.
          FIELD itab_det-bill_no.
          FIELD itab_det-bill_date.
          FIELD itab_det-vend_cust_code.
          FIELD itab_det-bill_amt.
          MODULE tab1_modify ON CHAIN-REQUEST.
        ENDCHAIN.
        FIELD itab_det-mark
          MODULE tab1_mark ON REQUEST.
      ENDLOOP.
    * PAI module in the program: add the current screen row to the internal table
    MODULE tab1_modify INPUT.
      APPEND itab_det.
    ENDMODULE.                    "TAB1_MODIFY INPUT

  • Looking for best design approach for moving data from one db to another.

    We have a very simple requirement to keep 2 tables synched up that live in 2 different databases. There can be up to 20K rows of data we need to synch up (nightly).
    The current design:
    BPEL process queries Source DB, puts results into memory and inserts into Target DB. Out of memory exception occurs. (no surprise).
    I am proposing a design change to get the data in 1000 row chunks, something like this:
    1. Get next 1000 records from Source DB. (managed through query)
    2. Put into memory (OR save to file).
    3. Read from memory (OR from a file).
    4. Save into Target DB.
    Question is:
    1. Is this a good approach and, if so, does SOA have any built-in mechanisms to handle this? I would think so, since I believe this is a common problem - we don't want to reinvent the wheel.
    2. Is it better to put records into memory or to write them to a file before inserting into the Target DB?
    The implementation team told me this would have to be done with Java code, but I would think this would be out-of-the-box functionality. Is that correct?
    I am a SOA newbie, so please let me know if there is a better approach.
    Thank you very much for your valued input.
    wildeman

    Hi,
    After going through your question, the first thing that came to my mind is how big the 20K records would be.
    If they are going to be huge, then even the 1000-row logic might take significant time to do the transfer, and I think even writing to a file will not be efficient enough.
    If the size is not huge, then your solution might work. But I think you will need to decide on the chunk size based on how well your BPEL process performs. You could try different sizes and test the performance to arrive at an optimal value.
    But if the size is going to be huge, then you might want to consider an ETL implementation. Oracle ODI provides such features out of the box with high performance.
    On the other hand, implementing the logic using the DBAdapter should be more efficient than Java code.
    Hope this helps. Please do share your thoughts/suggestions.
    Thanks,
    Patrick
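    If the two databases can also reach each other directly (for example over a database link), a single set-based statement is another option worth weighing against the chunked BPEL flow; a minimal sketch with hypothetical link, table and column names:
    MERGE INTO target_tab t
    USING (SELECT id, col1, col2 FROM source_tab@src_link) s
       ON (t.id = s.id)
     WHEN MATCHED THEN UPDATE SET t.col1 = s.col1, t.col2 = s.col2
     WHEN NOT MATCHED THEN INSERT (id, col1, col2) VALUES (s.id, s.col1, s.col2);
    COMMIT;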

  • Options for sending data

    My epic app (DMX 2004) is in the final stages and I'm stuck again, I'm sorry to say.
    The app allows users to browse a catalogue of items and add them to a basket. Once the user is happy they can create their order, including their details, which I've loaded into a sprite using the HTML property, AND saved to a file on the user's local disk.
    Great. Now I need to get it to us. As part of the final ordering process I create a 'real' HTML file, which contains the order itself and whatever else I want, and use Director to open it using goToNetPage. What I need now is a way for the user to automatically send the order file to our email address. As I understand it, the mailto 'attachment' tag rarely works and is considered a security risk in any case.
    What are my options? I have a file on the user's hard drive that is a nicely formatted order, and I need to get it emailed to us in the simplest (for the user) way possible.
    Any ideas or suggestions gratefully received.
    Jon

    The DirectEmail xtra looks like a good fit for you. Here is a clip from their website:
    DirectEmail is a cross-platform, easy to use Scripting Xtra for Macromedia Director, Authorware and Shockwave that provides applications with the ability to reliably send e-mails, send faxes as well as validate e-mail addresses.
    Emails can be sent in either plain text or rich HTML format, in various languages, and can include file attachments.
    http://www.directxtras.com/demail_home.asp?UUID=1217348
