Import Process Slow .. Suggestions for Speedup?

Hi,
I have been importing data from a dump file for the last 15 hours. The database is 15 GB, on 10.2.0.3.
Most of that size is in one particular schema.
The requirement is to duplicate the schema into a new tablespace. I created the structure from the DDL in an indexfile and am now loading the data using:
imp system/password buffer=125000000 file=............. log=....._rows.log fromuser=scott touser=alice rows=y ignore=y constraints=n indexes=n
Somehow, for the last 12 or so hours it has been loading data into a table of about 6 million rows and still hasn't completed.
I can see that the UNDO tablespace has crossed 6 GB and the tablespace for the schema is around 3 GB.
The redo logs are three groups of 50 MB each, and there are constant "Checkpoint not complete" warnings as well.
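For reference, the redo log sizes and the log switch rate can be confirmed with queries along these lines (a minimal sketch against the standard v$log and v$log_history views; the column choice is just illustrative):
-- redo log group sizes
SELECT group#, bytes/1024/1024 AS size_mb, status FROM v$log;
-- log switches per hour during the load
SELECT TO_CHAR(first_time, 'YYYY-MM-DD HH24') AS hour, COUNT(*) AS switches
FROM v$log_history
GROUP BY TO_CHAR(first_time, 'YYYY-MM-DD HH24')
ORDER BY 1;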
Now this is a test machine, but the machine where this is supposed to happen tomorrow runs 9.2.0.3 on a Solaris box.
Is there any way to speed up this process, maybe by stopping all the logging? What other ways are there to import this data faster than the current, unpredictably slow process?
Thanks.

If you are copying data within the same database, why not use CTAS or INSERT with PARALLEL and APPEND?
e.g.
ALTER SYSTEM SET PARALLEL_MAX_SERVERS=8;
CREATE TABLE newschema.table_a AS SELECT * FROM oldschema.table_a where 1=2;
ALTER TABLE newschema.table_a NOLOGGING;
ALTER TABLE newschema.table_a PARALLEL 2;
ALTER TABLE oldschema.table_a PARALLEL 2;
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND  */ INTO newschema.table_a SELECT * FROM oldschema.table_a ;
COMMIT;
ALTER TABLE oldschema.table_a NOPARALLEL;
CREATE INDEX newschema.table_a_ndx_1 ON newschema.table_a(col1,col2) PARALLEL 4 NOLOGGING;
ALTER TABLE newschema.table_a NOPARALLEL;
ALTER INDEX newschema.table_a_ndx_1 NOPARALLEL;
and run multiple tables in parallel: table_a is handled in the block of code above, table_b in another block of code running concurrently, with each block using 4 Parallel Query operators.
Hemant K Chitale
http://hemantoracledba.blogspot.com

Similar Messages

  • Import process in CIN for the complete process

    Sir,
    Can someone explain the complete process of journal entries (from PO to utilization) in the capital import process?
    When I post excise in J1IEX_P it credits the GL account below, and I could not understand the actual process. When I do MIRO, all expenses are posted as a debit to the WIP asset with posting key 70.
    Unapplied customs-CVD

    Hi:
    It seems that you have not captured the excise invoice at the time of GR posting and have posted the excise invoice directly. Please capture the excise invoice first and then try again.
    Cheers
    Rahul

  • How to maintain Import Processing: Control Code for the country India

    Hi Gurus,
    I need to maintain the Import Processing: Control Code for the country India in the SPRO configuration.
    Please tell me the path to maintain it.
    Thanks,
    Thiyagu

    SPRO > Logistics - General > Material Master > Material IDs > Maintain PKWiUs
    Regards,
    Indranil

  • Slowness in import process

    Hi Experts,
    Following is my DB version.
    Oracle Database 10g Enterprise Edition Release 10.1.0.3.0 - 64bit
    PL/SQL Release 10.1.0.3.0 - Production
    CORE 10.1.0.3.0 Production
    TNS for Linux: Version 10.1.0.3.0 - Production
    NLSRTL Version 10.1.0.3.0 - Production
    During an import the database got really slow, and the import took more than 5 hours to complete fully; normally it takes only 2-3 hours.
    When I investigated, I found a blocking lock related to the archive process, and the top consumer at the same time was DBSNMP.
    Are this DBSNMP activity and the archive-process lock responsible for making the DB slow? If yes, what is the reason, and how do I resolve it?

    What I meant by "when I investigated" is that in Enterprise Manager, the Database Locks link showed blocking locks related to the CKPT and LGWR users.
    Also, when I checked the Top Consumers link, it showed DBSNMP as the top consumer.
    What I want to know is whether these indicators are related to each other and whether they had an impact on the import process and the slowness of the DB.
    "If you are generating archive logs faster than the ARCHn process can archive them, it is certainly possible that the database would have to pause while the archiver finishes its work"
    Is it possible for the above to happen while doing an import? If yes, how do I resolve it?
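    One way to confirm whether the archiver, rather than DBSNMP, is the bottleneck is to look at the blocked sessions and at the log-file-switch wait events; a minimal sketch using the standard v$ views (the columns shown are only the relevant ones):
    -- sessions that are currently blocked and what they are waiting on
    SELECT sid, blocking_session, event, seconds_in_wait
    FROM v$session
    WHERE blocking_session IS NOT NULL;
    -- cumulative waits for log switches; large 'log file switch (archiving needed)' numbers point at the archiver
    SELECT event, total_waits, time_waited
    FROM v$system_event
    WHERE event LIKE 'log file switch%';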

  • Speedup the import process using imp tool

    hi,
    In how many ways can I speed up the import process using the imp tool in Oracle 9.2.0.8?

    Hi,
    Please follow the guidelines below as well:
    IMPORT(imp):
    Create an indexfile so that you can create indexes AFTER you have imported data. Do this by setting INDEXFILE to a filename and then import. No data will be imported but a file containing index definitions will be created. You must edit this file afterwards and supply the passwords for the schemas on all CONNECT statements.
    Place the file to be imported on a separate physical disk from the oracle data files.
    Increase DB_CACHE_SIZE (DB_BLOCK_BUFFERS prior to 9i) considerably in the init$SID.ora file
    Set the LOG_BUFFER to a big value and restart oracle.
    Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;)
    Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback segment of course). The rollback segment must be as big as your biggest table (I think?)
    Use COMMIT=N in the import parameter file if you can afford it
    Use STATISTICS=NONE in the import parameter file to avoid the time-consuming import of statistics
    Remember to run the indexfile previously created.
    Note: Before following the above guidelines, check them against your own requirements first; a combined example is sketched below.
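    Putting several of these points together, a two-pass run might look roughly like this, using the same imp syntax shown earlier in the thread (the dump file name, schema name, log file names and buffer size are placeholders):
    imp system/password file=exp_scott.dmp fromuser=scott touser=scott rows=n indexfile=scott_indexes.sql log=imp_ddl.log
    imp system/password file=exp_scott.dmp fromuser=scott touser=scott rows=y ignore=y indexes=n constraints=n commit=n statistics=none buffer=10000000 log=imp_rows.log
    After the data load completes, edit scott_indexes.sql (supplying the passwords on the CONNECT statements) and run it to create the indexes.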
    Best regards,
    Rafi.
    http://rafioracledba.blogspot.com/

  • Is there an EASY way to submit podcasts to the iTunes store?

    Is there an EASY way to submit podcasts to the iTunes store? I've tried creating podcasts in GarageBand, then somewhere after WordPress, iTunes doesn't recognize the feed.
    My process has been (I am open to suggestions for other free and EASY services/methods):
    GarageBand: create & edit audio. Add a 1400x1400 image.
    Share to iTunes.
    Drag the file to the desktop.
    Upload the .m4a file to Google Drive.
    Create a link post in WordPress using a "podcast" tag & create a "podcast" category.
    Click on "Entries RSS" in WordPress, which takes me to the RSS subscribe page (which is basically just my WordPress address with "/feed" at the end).
    I copy this URL.
    Go to the iTunes store > then "Submit a Podcast".
    iTunes gives me the error message "we had difficulty downloading episodes from your feed."
    When I try to subscribe to my podcast in iTunes, it does subscribe, but shows no episodes available.
    I went back into WordPress and changed Settings > Reading from "summary" to "full text".
    Still the same error message.
    I added a Feedburner step after WordPress but got the same errors. I don't think I should have to add Feedburner.
    WordPress seems to be encapsulating the RSS; what am I doing wrong?
    This was so much easier when you could go directly from GarageBand to iWeb to MobileMe; I miss those Apple days (also iDisk).

    If anyone has a super EASY process, I would LOVE to know what it is. EASY, meaning no HTML and also free. There are many free online storage systems available, many of which I currently use. The above process was just me trying to figure it out; if you have an easier method, please share. Thank you so much!

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are
    exported using the export/import-via-the-UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites. 
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
    to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site of 114 MB that produces a CMP file with no XML files. Small
    sites do not have this problem. If size is the problem, then I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters for Export-SPWeb that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
    contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
    the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a
    while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you
    ever happen to find a solution to this problem?

  • How to map import process for items

    Dear All,
    Can anyone guide me on how to map the import process for items for our customer?
    What I understand is that the process in Business One is:
    Raise PO
    GRPO
    Landed Cost
    AP Invoice
    I want to know whether this is correct. Also, in this process there are clearing agents who perform the formalities at the customs office and then raise a bill to the client for the same.

    Hi,
    The process looks correct. For the costs related to the clearing agents who perform the formalities at the customs office, use landed cost.
    Thanks,
    Gordon

  • Import process for trading materials in INDIA

    Hi Gurus
    May I know the import process when we procure mobile phones, load software onto them, and sell them?
    I have no idea of the steps and configuration.
    Thanks & Regards
    Blue

    Hi,
    Both the domestic and the import process are the same; the difference is in the duties. Kindly follow the steps below.
    1. Create a new document type for import procurement, and a number range, for easy identification.
    2. Keep two different pricing procedures, for import and domestic, along with the schema.
    3. Keep condition types for CVD, customs duty, CESS and others.
    4. Do MIRO for the customs duty, and MIGO after the stock has reached you.
    5. Do the excise availment and do MIRO for the vendor.
    Hope it is clear...
    Regards,
    Karna J

  • Need suggestion for importing Word doc into Pages template....

    I have a rather lengthy Word file that I want to import into Pages using one of its templates (Travel Journal). Is there a way to import the Word file directly into this template?
    I know you can import a Word doc into Pages, but once you do that, you have to do the designing from scratch (or at least that's what it looks like you have to do).
    I like the Travel Journal template and wanted to use that for my Word doc. Any suggestions for getting my Word doc into this template, without having to cut and paste each section?
    I like the template because of the varied page layout options and I don't want to have to recreate from scratch!
    Help!

    Hello Debbie,
    As far as I know, there is no way to get what you want. I think you have to load the Word document into Pages and then copy and paste all the text and media objects into the Pages template.

  • HT1338 My MacBook Pro is running very slow and the rainbow ball is appearing all the time. Any suggestions for clean up?

    My MacBook Pro is running very slow and the rainbow ball is appearing all the time. Any suggestions for clean-up? I have the OS X Lion system.

    As well as what a brody asked:
    - do you need more RAM? (run Activity Monitor and see how much free RAM - green - is available)
    - do you Restart fairly regularly, e.g. at least once a week, to clear out any swap or temporary files?
    - do you need to do maintenance, e.g. clearing out caches and unused logs, etc.?

  • Any Suggestions for a CGI Script for a Web Slide Show?

    Do you have any suggestions for creating a slideshow for my web site, 6 same-size jpgs that rotate automatically?
    I think I prefer a cgi script. However, I've tried two that don't work on Firefox or Safari. One developer tells me that he can see the rotating photos.
    I'm also open to software to create this. However, I'd prefer to use html code.
    Of course, it must work cross platform.
    Any suggestions?
    G5 Quad; Mini Mac; PowerBook G3; iPods   Mac OS X (10.4.8)   Using Dreamweaver 8

    Oh, what a beautiful baby! Is he/she yours?
    She is mine. At least that's what the Mrs. tells me!
    When you say GIF, I am assuming that you mean the format really must be GIF and not JPG. Is that correct?
    Yes, the file is a GIF file. It's actually an animated GIF file, meaning that the images that you see are all frames of animation in a single GIF file. There are a lot of apps out there that can create animated GIFs. I happen to use Adobe ImageReady because that is what is on the hard drive.
    Can you give me a hint about the dithering issue? Do I dither or not dither, that is the question.
    This is the major downside of the GIF format. It is a really old format...from back in the CompuServe bulletin board days...maybe older. Anyway, it is my understanding that GIFs can only hold information for 256 colors....any 256 colors, but no more and no less. So now in the days of "millions of colors" JPEGs, you can imagine that a 256-color palette is limiting. But if the colors of your images are sufficiently close, it's possible that 256 colors makes for perfectly acceptable images across all images. This is the one variable of converting images to GIF that can make it or break it for you. If your image has a pretty broad tonal range...like fleshtones or any other gradient...then 256 colors is going to represent things poorly. Then you will get dithering artifacts, which are like averaging errors...colors are close...as close as possible...but not close enough to make a smooth gradation. Anyway, that's my layperson's understanding.
    But for some images, it looks just fine.

  • Query on long running client import process

    Hi Gurus,
    I have a few queries regarding the parameter PHYS_MEMSIZE. Let me briefly describe the SAP server configuration before I get into the actual problem.
    We are running ECC 6.0 on Windows 2003 SP2, 64-bit, and the DB is SQL Server 2005, with 16 GB RAM and a page file of 22 GB.
    As per Zero Administration Memory Management, the rule of thumb I have learnt is to use an SAP/DB split of 70/30 and set the parameter PHYS_MEMSIZE to approximately 70% of the installed main memory. Please advise whether I should change the parameter as described in the Zero Administration Memory guide. If so, what precautions should we take when changing memory parameters? Are there any major dependencies or known issues associated with this parameter?
    The PHYS_MEMSIZE parameter is currently set to 512 MB.
    A few days ago we had to perform a client copy using the EXPORT/IMPORT method. The export was normal and went well; however, the import took almost 15 hours to complete. Any clues as to the possible reasons for a long-running client copy in a SQL Server environment? I suspect the PHYS_MEMSIZE parameter being configured at 512 MB, which appears to be very low.
    Please share your ideas and suggestions on this in case anyone has ever experienced this sort of issue, because we are going to perform a client copy again in the next 10 days, so I really need your inputs on this.
    Thanks & Regards,
    Vinod

    Hi Nagendra,
    Thanks for your quick response.
    Our production environment runs active/active clustering, with one central instance and one dialog instance. The database size is 116 GB with one data file, and the log file is 4.5 GB; these are shared in the cluster.
    As you suggested, I would need to modify PHYS_MEMSIZE to 11 or 12 GB (70% of physical RAM). What precautions should I consider? I see there are many dependencies associated with this parameter, as per its documentation.
    The standard values of the following parameters are calculated according to PHYS_MEMSIZE:
    em/initial_size_MB = PHYS_MEMSIZE (extension by PHYS_MEMSIZE / 2)
    rdisp/ROLL_SHM
    rdisp/ROLL_MAXFS
    rdisp/PG_SHM
    rdisp/PG_MAXFS
    Should I make the changes to both the central instance and the dialog instance as well? Please clarify. Also, are there any other parameters I should adjust to speed up the client copy process?
    Many Thanks...
    Thanks & Regards,
    Vinod

  • LR4 import very slow...Can you direct import?

    I miss the way LR3 imported images without searching for the source. Is there a way to speed up the process? Can you stop LR4 from searching for every image on your drives?

    I have the same problem. The export of a 17 GB database took 4 hours; most of the time it just sits there with one process (the first worker). The regular export took 3 hours. The import took more than 18 hours on a 10.2.0.2 DB. I set PARALLEL to 5, but it doesn't help, since most of the time only one worker is running.
    I read one paper on poor performance, Note 376969.1; the suggestion is to set the correct parameters:
    spincount = 4000
    session_cached_cursors = 800
    cursor_space_for_time = true
    db_cache_advice = OFF
    But another Oracle paper mentions that there is no parallelism while the metadata worker is running. So if you have a large number of objects but a small data set, you don't gain performance; it only helps when you have a small number of objects and a large data set.
    I don't know if any Oracle people can answer that. My experience is that it is much slower than the old export/import from any perspective.
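    Assuming this is a Data Pump (expdp/impdp) job, which the mention of workers and PARALLEL suggests, the sessions a job actually has attached can be checked while it runs; a minimal sketch against the standard dictionary views:
    -- jobs currently defined or running, and the requested degree of parallelism
    SELECT owner_name, job_name, operation, state, degree FROM dba_datapump_jobs;
    -- the master and worker sessions attached to those jobs
    SELECT s.sid, s.serial#, s.username, s.program
    FROM v$session s, dba_datapump_sessions d
    WHERE s.saddr = d.saddr;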

  • Problem with EXPORT IMPORT PROCESS in ApEx 3.1

    Hi all:
    I'm having a problem with the EXPORT IMPORT PROCESS in ApEx 3.1
    When I export an application and try to import it again, I get this error message:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
    As a workaround, I checked the exported file and found this:
    wwv_flow_api.create_flow
    p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
    And when I replace it with this:
    p_documentation_banner=> ' ',
    I can import the application without the error.
    Does somebody know why I have to do this?
    Thank you all.
    Nicolas.

    Hi,
    This issue seems to have been around for a while:
    Re: Error importing file
    I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
    ORA-02047: cannot join the distributed transaction in progress
    begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
    There are several suggestions, if you follow that thread, about character sets or about reviewing some of the line breaks within the PL/SQL code in your processes, etc. Not sure what would work for you.
