Using expdp to do a schema export is taking an extremely long time

I am using expdp in schema mode to export a 300 gigabyte database.
The job status reported 99% complete after about 2 hours,
but the job has now been running for 30 hours and has not finished.
I can see that it is exporting the domain indexes and has been exporting
the last index for the last 5 hours. Something is not working, because I
looked at the table the index is built on and it has no data. So why is it taking
so long to export an index that has no data?
Can someone tell me if there is a way to bypass exporting indexes, and an easy way
to recreate the indexes if you do?
I am using Oracle 11g and the expdp utility.

I checked the log file and there are no errors in it;
there are no ORA-xxxx error messages.
The last line in the log file is as follows:
"Processing object type schema_export/table/index/domain_index/index"
I just checked the export job this morning and it is still on the same
index object "A685_IX1". This is a spatial index. According to the job status,
it has been sitting on this same object for at least 24 hours.
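To answer the last question: one common pattern is to skip indexes during the export and capture their DDL separately, so they can be recreated by script later. This is only a sketch; the credentials, directory object, and file names below are placeholders:

```shell
# Schema export that skips all index creation, including domain/spatial indexes
expdp scott/tiger SCHEMAS=scott DIRECTORY=dp_dir DUMPFILE=scott.dmp \
  LOGFILE=scott_exp.log EXCLUDE=INDEX

# Separately capture just the index definitions (metadata only, no row data)
expdp scott/tiger SCHEMAS=scott CONTENT=METADATA_ONLY INCLUDE=INDEX \
  DIRECTORY=dp_dir DUMPFILE=scott_idx.dmp LOGFILE=scott_idx_exp.log

# Turn that metadata dump into a runnable script of CREATE INDEX statements
# (nothing is imported; the DDL is just written to create_indexes.sql)
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott_idx.dmp SQLFILE=create_indexes.sql
```

Whether the metadata-only pass also stalls on the spatial index is worth testing first; if it does, scripting the indexes with DBMS_METADATA.GET_DDL before the export is another option.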

Similar Messages

  • Exporting taking extremely long time

    Hi,
    I have been working with FCPX for about a year and all of a sudden, my files are taking forever to export. I'm exporting the ProRes timeline files by using the Share option. I always export the file with current settings to the desktop. Once it's finished on the desktop, I convert the file to H.264 using a Compressor droplet. I've read that this process is supposed to be faster than exporting H.264 directly from FCP. The files I'm using are about 1-2 hours of 1080i ProRes. For the file I'm exporting now, the estimated export size is 140 GB, and in 24 hours of export time it has exported 45% of the file.
    I know these are big files, but it never used to take this long to export them.  What can I do to speed up this process?
    Here's what I'm working with:
    Computer:
    MacPro5,1
    OSX Version 10.6.8
    Processor - 2 x 2.4 GHz Quad-Core Intel Xeon
    16 GB 1066 DDR3
    FCP X - Version 10.0.8
    Compressor - 4.0.7
    This is my first time posting a question to this community but I have found answers to many questions here before, so thank you.
    J

    Suggestions:
    Check the amount of free space on your boot drive and media drives. If they have less than 20%, do some archiving. If there is plenty of free space, check the drive speeds by downloading one of the speed apps from AJA or Blackmagic. Also verify disks and repair permissions in Disk Utility.
    While exporting, open Activity Monitor to make sure that you don't have a memory leak. Check free memory and Page Outs.
    Russ

  • Using Table.Join formula takes extremely long time to get results. (Why doesn't query folding work?)

    Hi,
    I built a query with 4 tables inside (loaded from an Oracle DB; two of them are quite big, more than a million rows each). After filtering, I tried to build relationships between tables using the Table.Join formula. However, the process took an extremely long time to
    bring out results (I ended the process after 15 minutes of processing). There was a status bar that kept updating while the query was processing. I suppose
    this is because query folding didn't work, so PQ had to load all the data into local memory first and then do the operation, instead of doing all the work on the source system side. Am I right? If yes, is there any way to solve this issue?
    Thanks.
    Regards,
    Qilong 

    Hi Curt,
    Here's the query that I'm referring to,
    let
        Source = Oracle.Database("reporting"),
        AOLOT_HISTS = Source{[Schema="GEN",Item="MVIEW$_AOLOT_HISTS"]}[Data],
        WORK_WEEK = Source{[Schema="GEN",Item="WORK_WEEK"]}[Data],
        DEVICES = Source{[Schema="GEN",Item="MVIEW$_DEVICES"]}[Data],
        AO_LOTS = Source{[Schema="GEN",Item="MVIEW$_AO_LOTS"]}[Data],
        Filter_WorkWeek = Table.SelectRows(WORK_WEEK, each ([WRWK_YEAR] = 2015) and (([WORK_WEEK] = 1) or ([WORK_WEEK] = 2) or ([WORK_WEEK] = 3))), 
        Filter_AlotHists = Table.SelectRows(AOLOT_HISTS, each ([STEP_NAME] = "BAKE" or [STEP_NAME] = "COLD TEST-IFLEX" or [STEP_NAME] = "COLD TEST-MFLEX") and ([OUT_QUANTITY] <> 0)),
        #"Added Custom" = Table.AddColumn(Filter_AlotHists, "Custom", each Table.SelectRows(Filter_WorkWeek, (table2Row) => [PROCESS_END_TIME] >= table2Row[WRWK_START_DATE] and [PROCESS_END_TIME] <= table2Row[WRWK_END_DATE])),
        #"Expand Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"WRWK_YEAR", "WORK_WEEK", "WRWK_START_DATE", "WRWK_END_DATE"}, {"WRWK_YEAR", "WORK_WEEK", "WRWK_START_DATE", "WRWK_END_DATE"}),
        Filter_AolotHists_byWeek = Table.SelectRows(#"Expand Custom", each ([WORK_WEEK] <> null)),
        SelectColumns_AolotHists = Table.SelectColumns(Filter_AolotHists_byWeek,{"ALOT_NUMBER", "STEP_NAME", "PROCESS_START_TIME", "PROCESS_END_TIME", "START_QUANTITY", "OUT_QUANTITY", "REJECT_QUANTITY", "WRWK_FISCAL_YEAR", "WRWK_WORK_WEEK_NO"}),
        Filter_Devices= Table.SelectRows(DEVICES, each ([DEPARTMENT] = "TEST1")),
        SelectColumns_Devices = Table.SelectColumns(Filter_Devices,{"DEVC_NUMBER", "PCKG_CODE"}),
        Filter_AoLots = Table.SelectRows(AO_LOTS, each Text.Contains([DEVC_NUMBER], "MC09XS3400AFK") or Text.Contains([DEVC_NUMBER], "MC09XS3400AFKR2") or Text.Contains([DEVC_NUMBER], "MC10XS3412CHFK") or Text.Contains([DEVC_NUMBER], "MC10XS3412CHFKR2")),
        SelectColumns_AoLots = Table.SelectColumns(Filter_AoLots,{"ALOT_NUMBER", "DEVC_NUMBER", "TRACECODE", "WAFERLOTNUMBER"}),
        TableJoin = Table.Join(SelectColumns_AolotHists, "ALOT_NUMBER", Table.PrefixColumns(SelectColumns_AoLots, "AoLots"), "AoLots.ALOT_NUMBER"),
        TableJoin1 = Table.Join(TableJoin, "AoLots.DEVC_NUMBER", Table.PrefixColumns(SelectColumns_Devices, "Devices"), "Devices.DEVC_NUMBER")
    in
        TableJoin1
    Could you please give me some hints on why it takes so long to process?
    Thanks.

  • Report script taking a very long time to export in ASO

    Hi All,
    My report script is taking a very long time to execute, and finally a message appears saying it timed out.
    I'm working on ASO cubes, and there are 14 dimensions for which I need to export all data, for only one version.
    The data is very large and the member count in each dimension is also large, which makes it difficult to export the data.
    Any suggestions??
    Thanks

    Here is a link that addresses several ways to optimize your report script. I utilize report scripts for Level 0 exports in an ASO environment as well; however, the majority of our dimensions are attribute dimensions.
    These are the most effective solutions we have implemented to improve our exports via report scripts:
    1. Make sure your report script is written in the order in which the Report Extractor retrieves data.
    2. Suppress zero and missing data.
    3. Use the LINK command within reports for dimensions that are really big and pull at Level 0.
    4. Use symmetric reports.
    5. Break the export into multiple reports.
    However, you may also consider some additional solutions outlined in this link:
    1. The MDX optimizing commands
    2. Back end system settings
    http://download.oracle.com/docs/cd/E12825_01/epm.111/esb_dbag/drpoptim.htm
    I hope this helps. Maybe posting your report script would also help users to provide feedback.
    Thanks

  • Importing assets from ALER to export to OER is taking a long time

    Guys,
    I am importing Service Bus projects from ALSB and trying to export them to Oracle Enterprise Repository 11g. But when I try to export the configuration jar to Oracle Enterprise Repository using the Harvester tool, it takes a very long time.
    I have more than 1500 WSDLs and XSDs (with some circular references), and it has been running for 48 hours without finishing.
    What can I do to make Harvester and Oracle ER faster?
    Do we have some configuration to turn off validation or something like that?
    Thanks in advance.


  • Extremely long time when executing an export transaction data package link

    Hi,
    I am working with a package link to export transaction data to the application server. The package I am currently using is /CPMB/EXPORT_TD_TO_APPL. I use it to generate an output file which I later use in a different application to register modifications applied in BPC.
    It had been working correctly, without problems, for some time. The problem is that it suddenly stopped working, maybe because of some changes in the dimension library (is that feasible?). I defined and scheduled the package link again and it began working correctly again, but the time it takes to execute the process is really long: above 10 hours, which is extremely long for this process. In the beginning, the package link executed in an hour and a half. Could anybody give me an idea of what the reason for this problem could be?
    Any help will be much appreciated.

    >      Database error text........: "POS(1) System error: BD Index not accessible"                    
    >      Database error code........: "-602"                                                            
    >      Triggering SQL statement...: "INSERT INTO "/BIC/SZD_PROD" ( "/BIC/ZD_PROD",                    
    >       "SID", "CHCKFL", "DATAFL", "INCFL" ) VALUES ( ? , ? , ? , ? , ? )"                            
    Hi Hari,
    looks like you are hitting a BAD index.
    Check the [DB50|http://help.sap.com/saphelp_nw04s/helpdata/en/9c/ca5bb3d729034aaf6f4cea2627c2f2/frameset.htm] or the DBMGUI - there should be warnings about this.
    To fix this issue, either use the DBMGUI -> Recover -> [Indexes|http://maxdb.sap.com/doc/7_6/30/5ada38596211d4aa83006094b92fad/frameset.htm] function or logon to dbmcli, get an SQL session ([sql_connect|http://maxdb.sap.com/doc/7_6/11/8af4411cf5c417e10000000a155106/content.htm]) and use the [sql_recreateindex|http://maxdb.sap.com/doc/7_6/30/f7c7f25be311d4aa1500a0c9430730/content.htm] command.
    Regards,
    Lars

  • I tried using a trial version of Creative Cloud. I signed up and started the download. The download was taking a long time so I decided to cancel it and buy a year plan paying monthly. I started downloading Creative Cloud again but the download was inte

    I tried using a trial version of Creative Cloud. I signed up and started the download. The download was taking a long time so I decided to cancel it and buy a year plan paying monthly. I started downloading Creative Cloud again but the download was interrupted by my daughter inserting a USB device into the USB port. I could not find the download. I signed into Adobe and now it tells me that I have a 30 day trial. My account shows that I have paid my first monthly installment, but I can only use a trial version, which says it will expire in 30 days. What can I do to get the full yearly plan version?

    Jacobm51486856 please see Sign in, activation, or connection errors | CS5.5 and later for information on how to resolve the connection error preventing your membership from being authorized.

  • Taking a snapshot of Oracle tables to SQL Server using transactional replication is taking a long time

    Hi All,
    I am trying to replicate around 200 Oracle tables to SQL Server using transactional replication, and it is taking a long time: the initial snapshot is taking more than 24 hours and is still going on.
    Is there any way to replicate these tables faster?
    Kindly help me out.
    Thanks

    Hi,
    According to the description, I know the replication is working fine, but it is very slow.
    1. Check the CPU usage on the Oracle publisher and on SQL Server. This issue may be due to slow client processing (Oracle performance) or to network performance issues.
    2. Based on SQL Server 2008 Books Online, 'Performance Tuning for Oracle Publishers' (http://msdn.microsoft.com/en-us/library/ms151179(SQL.100).aspx), you can enable the transaction job set and follow the instructions in
    http://msdn.microsoft.com/en-us/library/ms147884(v=sql.100).aspx.
    3. You can enable replication agent logging to check the replication behavior. You may follow these steps to collect the logs:
    To enable Distribution Agent verbose logging, please follow these steps:
    a. Open SQL Server Agent on the distribution server.
    b. Under Jobs folder, find out the Distribution Agent.
    c. Right click the job and choose Properties.
    d. Select the Steps tab.
    e. Click the Run agent step, click the Edit button, and add the following switches at the end of the script in the command box:
            -Output C:\Temp\OUTPUTFILE.txt -Outputverboselevel 2
    f. Exit the dialogs
     For more information about the steps, please refer to:
    http://support.microsoft.com/kb/312292
    Hope the information helps.
    Tracy Cai
    TechNet Community Support

  • I have the current Mac Pro (the entry level with the default specification) and I see slow performance when applying After Effects to my videos using Final Cut Pro; rendering a video also takes a long time. What upgrades do you suggest?

    I have the current Mac Pro, the entry level with the default configuration, and I feel a lack of performance when applying After Effects to my videos using Final Cut Pro. Rendering a video also takes a long time. What upgrades do you suggest I could do on my Mac Pro?

    The 256GB SSD it shipped with will run low; that is one of the things to watch.
    The default memory is 12GB, also something to think about.
    D500 and FCP-X 10.1+
    http://macperformanceguide.com/index_topics.html#MacPro2013Performance
    Five models of 2013 Mac Pro running Resolve, FCPX, After Effects, Photoshop, and Aperture

  • Export of instance takes long time

    Hi,
    we're exporting an instance of a 10g database and it takes an incredibly long time. The DBA told me he sees wait I/O on db file sequential read.
    I've tried to break the issue down on the OS side (Solaris 10 U09 SPARC, Generic_147440-09). The process ora_dw01_testinstance is doing all the work reading the source.
    It uses 8k preads. Between 500 and 700 IOPS are sent to the storage and answered, with a transfer rate of 5-8 MB/sec. Disk latency is around 4-10 ms.
    Looking at the microstates of the process, it is mainly asleep (over 90%). I started 3 dd calls with different block sizes on different DB files in the same filesystem. Although the files were not in any cache, the storage delivered around 70-85 MB/sec, even with an 8k block size. The latency was much higher while the dd processes were running, but I'm still wondering why the export performs so poorly.
    Are there ways to improve the read phase of the export?
    thank you,
    solst_ice

    First off, which export utility are you using: expdp or exp? What parameters have been specified? Also, 10g is a marketing label and not a version of Oracle, so what is the exact version of Oracle in use? There are more than 60 reports on Oracle Support about Data Pump performance issues, but most are specific to an Oracle version and feature.
    If using datapump the following support document should be of interest:
    Checklist For Slow Performance Of DataPump Export (expdp) And Import (impdp) [ID 453895.1]
    If using exp then did you provide a buffer= parameter?
    HTH -- Mark D Powell --
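    To illustrate Mark's last point, a conventional-path exp run with a larger buffer, or a direct-path run, is often noticeably faster. This is only a sketch; the credentials, schema, and sizes below are placeholders:

    ```shell
    # Conventional path: BUFFER sets the fetch array size in bytes, so a larger
    # value means fewer round trips (BUFFER applies to conventional path only)
    exp system/manager OWNER=scott FILE=scott.dmp LOG=scott.log \
      BUFFER=10485760 STATISTICS=NONE

    # Direct path bypasses the SQL evaluation layer entirely;
    # RECORDLENGTH sizes the I/O buffer used instead of BUFFER
    exp system/manager OWNER=scott FILE=scott.dmp LOG=scott.log \
      DIRECT=Y RECORDLENGTH=65535
    ```

    With Data Pump (expdp), the checklist note cited above covers the equivalent knobs, for example excluding statistics and checking the streams pool size.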

  • Taking a long time to export

    Why is it taking so long to export a slideshow with 185 photos to iDVD? It's been over 5 minutes and it's only halfway done. Thanks

    When you use the Share to iDVD export mode, iPhoto has to create a QuickTime movie of the slideshow prior to exporting it into iDVD. If you've got lots of Ken Burns effects, fancy transitions, etc. along with those 185 photos, it's going to take a long time. The 99 slide limit refers to creating a slideshow in iDVD with imported stills.

  • Taking a long time to export to Quicktime

    Hello,
    I'm trying to export my Flash file to QuickTime. The issue I'm having is that it's taking a really long time to "record flash content". My file is pretty long (about 5-6 minutes), so maybe this is normal. Or is it?


  • Used Excel for an extremely long time, now using OpenOffice...

    I've been using Excel since about 1989, so it's hard to switch from it to something else. Recently I've become fed up with Microsoft's serial number scheme, and I now have legitimate DVDs/CDs that are practically worthless because I have used the serial number too many times. I also refuse to call India to have these validated. Because of all this, I've started using OpenOffice.
    OpenOffice is nice, but it's taking me a long time to get used to it. Everything about it is just "ok." There's nothing outstanding about it. I'm wondering if Numbers would be better. Most of what I do is for myself only.
    I've looked at the tutorials for Numbers, but I'm wondering if anyone as made one for people who are switching from Excel, and need to be shown "here's what you did in Excel, now here's how it's done in Numbers."
    Has anyone used all three? Would OpenOffice or Numbers be better to switch to?
    Thanks.

    Untmdsprt wrote:
    something as simple as typing in a series like Jan, Feb, and then have the spreadsheet fill in the rest requires me to look up how to do it in the manual. I think I will stick with OpenOffice for right now until I've read the manual for Numbers.
    Jerrold Green1 wrote: I'm not going to say anything that might discourage you from reading the User Guides, but you will find that most things are close to being the same in Numbers as in Excel and very few are truly different.
    Including the Jan, Feb... series of the example. OOo doesn't run on pre-intel Macs, so I used NeoOffice.
    Excel (Win): type Jan into a cell, grab the handle at the lower right corner of the cell and drag right (or down) to fill as many cells as desired with the months in succession.
    Numbers: type Jan into a cell, grab the handle at the lower right corner of the cell and drag right (or down) to fill as many cells as desired with the months in succession.
    Neo did not retain the selection border and handle when data was typed into the cell, so it was necessary to confirm the entry by pressing return, enter or tab, then reselect the cell before dragging the handle. Otherwise the process was the same as Excel and Numbers.
    For some series—1, 3, 5 for example—all three applications require that two entries (to establish the interval) be made and selected before dragging the handle. Numbers and Excel will fill a series of alternate months. Neo will not.
    BUT like Jerry, I would NOT discourage anyone from reading the Numbers '09 User Guide (and the iWork Formulas and functions User Guide as needed).
    Regards,
    Barry
    PS: I share some of your frustration, but in the opposite direction. Most things are similar enough in Excel to what I'm used to after years of using AppleWorks and months of using Numbers that I can usually figure out a way to do it in Excel. When I can't, the Help files are available.
    B

  • Using Site Feed web part on site -- opening the site takes too long

    Hello,
    We are using the site feed web part on our team sites for sharing pictures, messages, etc. When we use the site feed web part on the main page of a team site, it takes a long time to open the site. When we remove the site feed web part from the page, the page loads quickly.
    We have a separate web application for the team sites and for the personal sites (My Sites). I know a lot of information is loaded from the SharePoint user profile.
    Does anyone have experience with the site feed web part? There's no configuration in this web part for things like caching.
    Anything to do with distributed cache time-outs?
    I hope to hear something from you.

    Hi,
    For your issue, you can enable the Developer Dashboard for SharePoint 2013 with PowerShell.
    The Developer Dashboard makes performance statistics for a request available in the browser window. When the Developer Dashboard is enabled in On mode, its icon is always displayed at the top right corner. With the Developer Dashboard, you can find out where the time was spent on the server side.
    For enabling the Developer Dashboard in SharePoint 2013, you can refer to this blog:
    http://ricardoramirezblog.wordpress.com/2013/02/23/enable-developer-dashboard-in-sharepoint-2013-using-powershell/
    Here is a detailed blog about using the Developer Dashboard:
    http://www.sharepoint-journey.com/developer-dashboard-in-sharepoint-2013.html
    Thanks,
    Eric
    Forum Support

  • Dynamic streaming taking too long to switch video using NetStreamPlayOptions in AS3

    Hi,
         Can anyone tell me why dynamic streaming takes so much time to switch the video from a lower bit rate to a higher bit rate and vice versa?
    I am doing dynamic streaming in the following way -
    var param:NetStreamPlayOptions = new NetStreamPlayOptions();
    param.oldStreamName = oldStream();
    param.streamName = newStream();
    param.transition = NetStreamPlayTransitions.SWITCH;
    ns.play2(param);
    ns.addEventListener(NetStatusEvent.NET_STATUS, switchMode);
    I am using dual buffering: 3 seconds when the video starts and 10 seconds on "NetStream.Buffer.Full". The video takes approximately 30-50 seconds to switch when I call the above code.
    Thanks & regards
    Sunil Kumar

    Hi Sunil,
    Was my link useful to you? If you have not gone through the link I suggested, the lines below may help you:
    For a faster switch, tune the keyframe interval and the client-side buffer. When Flash Media Server (FMS) receives a "switch" command, the server waits for a keyframe to switch to the new stream. FMS looks for the keyframes in the new stream in chunks equal to the client's buffer size (NetStream.bufferTime), so having a client buffer larger than the keyframe interval of the stream helps achieve a fast switch response from the server; in other words, a shorter delay between a "switch" call to the server and the client receiving bits from the new stream in response.
    Following values are considered most optimal:
    Keyframe interval: 5 sec.
    Client-side buffer: 6–10 sec.
    So, to maintain an optimal keyframe interval, you can re-encode your videos, which will give you a chance to set the keyframe interval; if you don't want to do that, I would suggest increasing the client-side buffer.
    Regards,
    Amit

Maybe you are looking for

  • Best practice for ConcurrentHashMap use?

    Hi All, would the following be considered "best practice", or is there a better way of doing the same thing? The requirement is to have a single unique "Handler" object for each Key: public class HandlerManager {     private Object lock = new Object(

  • How do you enlarge the font on the mailbox sidebar?   never mind thx

    never mind about the mailbox font, thx

  • Some websites do not load. Pls help

    Some websites like www.speedtest.net, www.pingtest.net, www.indianrailways.gov.in do not load. How do I solve this problem? == URL of affected sites == http://www.pingtest.net,www.speedtest.net,www.indianrailways.gov.in

  • Custom Monitoring Objects in BPMon

    Dear Experts, As we know that there are some modules/submodules for which there are no SAP Std Monitoring Objects available in BPMon like FSCM, Asset Mgmt, Dispute Mgmt, HCM, PS, etc. so we need to go for custom monitoring objects. Would like to requ

  • Acro Pro 9 won't flatten overprint

    I've got a catalog PDF supplied from a printer that I need to prep for web distribution. The prepress for the project was a nightmare, particularly with a complex black vector object overprinting a spot color on every page. The overprint object is in