Exp in different optimizer_mode

The response time of exporting a dump with exp seems to differ when the optimizer_mode setting is changed.
optimizer_mode = "CHOOSE" -> 3 min
optimizer_mode = "FIRST_ROWS" -> 18 min
optimizer_mode = "ALL_ROWS" -> 20 minWhen I disable "contraints exporting" the response-times are the same.
What could explain this behaviour since CHOOSE and ALL_ROWS have almost the same result?

The default was CHOOSE and there was no problem. Then we changed to FIRST_ROWS and the problem occurred (20 min instead of 3 min). We then ran the following sequence:
- CHOOSE in 3 min
- ALL_ROWS in 18 min
- FIRST_ROWS in 20 min
I used the following command at the DOS prompt:
exp lims/lims@nautdev file="d:\backup\nautdev.dmp" owner=(lims) compress=Y buffer=100000
I changed the setting with the Enterprise Manager console, and the version is:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
PL/SQL Release 9.2.0.1.0 - Production
CORE     9.2.0.1.0     Production
TNS for 32-bit Windows: Version 9.2.0.1.0 - Production
NLSRTL Version 9.2.0.1.0 - Production
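For reference, a minimal sketch of making the same change from the command line instead of the Enterprise Manager console (the SYS credentials are placeholders, and SCOPE=BOTH assumes the instance was started from an spfile):
sqlplus "sys/<password>@nautdev as sysdba"
-- switch the optimizer mode for the whole instance, in memory and in the spfile
ALTER SYSTEM SET optimizer_mode = FIRST_ROWS SCOPE = BOTH;
-- check the value that is now in effect
SHOW PARAMETER optimizer_mode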

Similar Messages

  • Exp/Imp between different OS

    Hi,
    Could someone please help me clarify whether exp/imp between different OSes is possible (e.g. exp on Windows 2000 and then imp on Unix)?
    Thanks

    This reply may be a bit late but I just saw your Q:
    The answer is yes, of course you can imp/exp between different OSes; just make sure to transfer the dmp file (the export output) in binary mode between the systems.
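    As an illustration, a minimal sketch of such a transfer with a plain command-line ftp client (the host, user and file names are hypothetical); the important step is switching to binary mode before the put:
    ftp unixhost.example.com
    Name: oracle
    ftp> binary
    ftp> put exp_win2000.dmp /u01/exports/exp_win2000.dmp
    ftp> bye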

  • Import/Export between different Portal versions

    Hi, is there any reasonable argument for not allowing imp/exp between different versions? The thing is, I developed a Page Group on a 9i AS and now need to export it to a 10g AS. It is reasonable to think that an imp/exp mechanism should exist, at least for the latest version.

    Hi there,
    Yes, there is a reasonable argument for that, namely that the export code will be different from the import code. This means that alterations made to the code in order to be more flexible, or even to fix an issue, on the export side, for example, won't be in place on the import side. When you then try to import with the different code, this can certainly break your system (as the problem may already be inside the DMP file that you've exported).
    Although your request does make sense, the implementation part is not that easy to do.
    I hope it helps you to better understand the constraints...
    Cheers,
    Pedro.

  • EXP propagation

    Hi,
    Can anyone shed some light on the issue of EXP field propagation as labels are removed or added?
    Specifically....
    1. I understand that at the ingress of the MPLS cloud you would use a class-based policy to apply the initial EXP value based on some class-map matching, applying the service-policy either in the inbound direction on the inbound interface or in the outbound direction on the outbound interface (probably frequently used in MPLS DS-TE). However, my question is: how will this field travel along the LSP?
    2. Do you have to make sure you apply the proper EXP value at every point in the LSP where a label is stripped off or applied, or is it automatically copied between the labels?
    3. Also, at egress, do you usually match on the EXP value to apply the proper QoS, like LLQ? For example,
    class-map exp5
      match mpls experimental topmost 5
    policy-map premium
      class exp5
        priority 512
    interface FastEthernet0/0
      service-policy output premium
    4. Cisco documentation speaks of three modes: Uniform, Pipe and Semi-pipe. Are these simply implementation concepts, or are they really configuration parameters?
    Your help would be appreciated.
    Thanks,
    David

    Hi,
    1) The default for a PE is to copy the IP precedence to the MPLS EXP bits in all imposed labels. With a policer you can choose to set EXP to different values.
    2) The popped label's EXP bits are not copied to the next label automatically. Assuming, however, that you set all EXP bits to the same value during imposition, this should not pose a problem.
    3) This would not work, as the output traffic is (presumably) IP. So the trick is:
    class-map exp1
      match mpls exp topmost 1
    policy-map exp-mapping
      class exp1
        set qos-group 1
    class-map qos-group
      match qos-group 1
    policy-map customer
      class qos-group
        bandwidth 128
    interface FastEthernet0/1
      description from core
      service-policy input exp-mapping
    interface Serial1/0
      description to CE
      service-policy output customer
    4) The QoS models are concepts. Uniform mode means you can (and will) rewrite DSCP, as the settings throughout IP/MPLS use a uniform classification scheme. The Pipe and Short Pipe models let you transport DSCP unchanged through an MPLS domain by setting the EXP bits ONLY and not rewriting DSCP anywhere. This is used in case the customer has different settings than the SP and does not want them to be modified.
    Hope this helps
    Martin

  • Imp-Exp US7ASCII / UTF8

    Hi,
    perhaps someone here has an answer.
    My problem is that I have to imp and exp across different NLS_LANG settings.
    I have a database running with US7ASCII. This database is exported into UTF8.
    Everything seems to work fine. But:
    after some changes have been made in the new UTF8 database, I do another exp
    (e.g. a backup). If I want to imp this new backup again, I get the following warnings:
    IMP-00017: following statement failed because of ORACLE error 6550:
    "DECLARE SREC DBMS_STATS.STATREC; BEGIN SREC.MINVAL := 'C102'; SREC.MAXVAL "
    ":= 'C2020B'; SREC.EAVS := 0; SREC.CHVALS := NULL; SREC.NOVALS := DBMS_STATS"
    ".NUMARRAY(1,110); SREC.BKVALS := DBMS_STATS.NUMARRAY(0,1); SREC.EPC := 2; D"
    "BMS_STATS.SET_COLUMN_STATS(NULL,'"LINK_ENTRY"','"ENTRY_ID"', NULL ,NULL,NUL"
    "L,48,,0208333333333333,0,srec,3,0); END;"
    IMP-00003: ORACLE error 6550 encountered
    ORA-06550: line 1, column 306:
    PLS-00103: Encountered the symbol "," when expecting one of the following:
    ( - + mod not null others <an identifier>
    <a double-quoted delimited-identifier> <a bind variable> avg
    count current exists max min prior sql stddev sum variance
    execute forall time timestamp interval date
    <a string literal with character set specification>
    <a number> <a single-quoted SQL string>
    The symbol "null" was substituted for "," to continue.
    My installation is German, which is why the messages originally appeared in German.
    But the thing that makes me nervous is the last line.
    PS: If I don't change anything after the first UTF8 import, I can export and
    reimport it as often as I like without any warning.
    oliver

    There are several issues here.
    >
    We have Chinese words in a US7ASCII character set.
    >
    How did you manage to do that? US7ASCII is certainly not capable of storing any Chinese characters.
    >
    ... then change first 3 and 4 bytes to zht16mswin950 ...
    >
    Change what? Where? How?
    >
    ... I need to exp/imp from 10g us7ascii to 11g UTF8 ...
    >
    You can certainly do that, but it will not fix your Chinese characters.
    Please post this issue in the Globalization forum (Globalization Support), where Sergiusz Wolicki may be able to help.
    HTH
    Srini

  • Question for experts/ wrapping - data pump

    Hello All,
    I am using Oracle 10.2.0.4.0.
    I have an issue when exporting / importing wrapped procedures:
    I have some procedures / packages that are wrapped, and when I export their schema and then import it into the database, I get the error below:
    Compilation errors for PROCEDURE PROC_NAME
    Error: PLS-00753: malformed or corrupted wrapped unit
    Line: 1
    Text: CREATE OR REPLACE PROCEDURE "PROC_NAME" wrapped
    After doing many tests on this scenario I noted that:
    This error does not appear for all procedures; some are corrupted after the import and some are not.
    After doing the normal export/import, I found that the character set of imp/exp was different from that of the database, so I changed the database character set to match the one used by the import. I redid my tests and found that this solved the issue for the normal imp/exp: the recompilation of procedures no longer fails after the normal old-style import. But at this stage the imp/exp character set changed again.
    The character set of my DB was AR8MSWIN1256, and when I did exp/imp it used AR8ISO8859P6, so I changed my DB to AR8ISO8859P6. After changing the database character set to AR8ISO8859P6 the recompilation worked fine for the normal exp/imp, but now the character set for import/export has changed to AR8MSWIN1256!! Why?
    The Data Pump export/import problem was not solved even after changing the character set of the database; the procedures were imported with errors and were only fixed by recompiling them.
    My issue is that I have too many clients, and I cannot always change the character set of the database, because I may hit the error "ORA-12712: new character set must be a superset of old character set".
    Hope the above description is clear.
    Any suggestions or ideas to solve the issue of wrapped procedures with expdp/impdp? Can I force a certain character set for expdp/impdp? Note that with release 10.2.0.1.0, expdp/impdp does not work at all with wrapped procedures.

    Any ideas?
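    As a side note, a minimal sketch of how the character set used by classic exp/imp normally follows the client's NLS_LANG environment variable (the credentials, schema and paths below are placeholders; as far as I know this does not change what expdp/impdp write, since Data Pump dump files use the database character set):
    REM Windows client session: make exp run in the database character set
    set NLS_LANG=AMERICAN_AMERICA.AR8MSWIN1256
    exp myuser/mypassword@mydb owner=(myschema) file=c:\exp\myschema.dmp log=c:\exp\myschema.log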

  • Problem in data export using DisplayTag

    Hello Friends,
    I am getting the following exception when I try to export the data using DisplayTag's built-in export facility.
    [2008-02-26 16:54:27,472] WARN  http-7070-Processor22 (BaseNestableJspTagException.java:99  ) - Exception: [.LookupUtil] Error looking up property "mgrname" in object type "java.util.ArrayList". Cause: Unknown property 'mgrname'
    java.lang.NoSuchMethodException: Unknown property 'mgrname'
         at org.apache.commons.beanutils.PropertyUtilsBean.getSimpleProperty(PropertyUtilsBean.java:1122)
         at org.apache.commons.beanutils.PropertyUtils.getSimpleProperty(PropertyUtils.java:408)
         at org.displaytag.util.LookupUtil.getProperty(LookupUtil.java:271)
         at org.displaytag.util.LookupUtil.getBeanProperty(LookupUtil.java:129)
         at org.displaytag.model.Column.getValue(Column.java:124)
         at org.displaytag.export.BaseExportView.doExport(BaseExportView.java:265)
         at org.displaytag.tags.TableTag.writeExport(TableTag.java:1404)
         at org.displaytag.tags.TableTag.doExport(TableTag.java:1356)
         at org.displaytag.tags.TableTag.doEndTag(TableTag.java:1227)
         at org.apache.jsp.WEB_002dINF.jsps.common.tableViewTag_jsp._jspx_meth_displayTag_table_0(tableViewTag_jsp.java:195)
         at org.apache.jsp.WEB_002dINF.jsps.common.tableViewTag_jsp._jspService(tableViewTag_jsp.java:89)
         at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:97)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    Now, there is no problem displaying the data in table form on the page, but when I try to export it (CSV, Excel, XML) it throws the above exception. This is a bit surprising to me.
    The displaytag-related tags are in a JSP. This JSP is included in a Spring tag handler class via pageContext.include("xyz.jsp"). This tag (defined by the Spring tag handler class) is used in another JSP where the table is displayed. Paging works perfectly, but when I click on export, the exception occurs.
    I am using the following:
    JDK 1.5, DisplayTag 1.1 and Spring 1.2.7
    The actual flow is something like this:
    1) The controller forwards the request to a JSP page.
    2) This JSP page uses a custom tag.
    3) Now the control goes to the custom tag handler class, where I set all the data into the request:
    pageContext.getRequest().setAttribute("tableViewTag_data", data);
    4) Then I include the page like this:
    pageContext.include("/WEB-INF/jsps/common/xyz.jsp");
    5) This xyz.jsp contains the following code:
        <displayTag:table pagesize="10" requestURI="${cmd.metaClass}.htm" name="tableViewTag_data" class="displaytag" decorator="${tableViewTag_options['decorator']}" export="true">
             <displayTag:setProperty name="paging.banner.placement" value="top"/>
             <c:forEach var="property" varStatus="propertyStatus" items="${tableViewTag_columnProperties}">
                  <c:set var="propertyTitle"><fmt:message key="field.${cmd.metaClass}.${property}" /></c:set>
                  <displayTag:column property="${property}" title="${propertyTitle}" />
             </c:forEach>
        </displayTag:table>
    Here, I am able to retrieve all the data.
    6) So, in this way the page gets rendered.
    I have also included the export filter in the web.xml file.
    Hope I have provided all the information.
    I think I haven't made any silly mistake. :-)
    Looking forward to hearing from you.
    Thanks
    Vishal Pandya


  • Problem in data export

    Hi All,
    I have a problem regarding data export. When I use the EXP command in my Oracle Form to export data from the database, the dmp is larger than the dmp produced when I export data from the same database using DBMS_DATAPUMP. In the EXP case the dmp size is, say, 40,225 KB, and for the same database with DBMS_DATAPUMP it is, say, 36,125 KB. Why does this difference occur? Is this a problem? What would be the solution?
    Please Help ASAP.
    Thanks in advance.
    Regards
    Sanjit Kr. Mahato
    Edited by: userSanjit on Jul 23, 2009 6:19 PM
    Edited by: userSanjit on Jul 23, 2009 6:24 PM

    Hi,
    Expdp and exp are different Oracle export utilities, hence the output file sizes are not the same; that is where the difference comes from.
    No, this is not a problem.
    Since this is not a problem, there is no solution needed.
    Why do you see this as a problem?
    Cheers
    Anurag

  • Sound level variations without any command

    Hello.
    When using my iMac 24" 3.06 GHz, the sound level keeps varying without any command.
    It is very disturbing, mostly because it is associated with the sound-level display at the top of the screen.
    Can you help me?
    :-) Dominique

    A little more background. I've now found that I can get my volume back by clicking the Device setting off of CoreAudio and then selecting CoreAudio again. This lasts a while, then the sound drops out again. Also, I've tried different instruments in the EXPS 24 and different instrument/MIDI channels and still get the dropout problem. I even deleted the audio object in the environment and started over with a new one. All the other instruments in the track list are EXPS 24 and not affected.

  • Trying to imp an exp dump from different OS

    I'm having trouble importing an export dump provided to me. The source is Oracle 10g on AIX 64-bit and the destination is Oracle 10g on Red Hat Enterprise Linux, also 64-bit.
    When I try to import it I always get this error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    IMP-00010: not a valid export file, header failed verification
    IMP-00000: Import terminated unsuccessfully
    Is this source-to-target combination not allowed because of the difference in operating systems? Does endianness play a factor, or does it only apply to transportable tablespaces?
    Thanks.

    How did you copy the file from the source to the destination server?
    -- a front-end tool that does the exp and zips it.
    What is the command you used to perform the import?
    -- a predefined script that does the import using the imp command. I tried to run imp manually and got the same error.
    Are the database versions of the source and target different?
    -- not exactly sure about the last few numbers. I'm sure it's 10g on AIX 64-bit, and I tried to import it into 10g on Red Hat 64-bit and 11g on Fedora 32-bit.
    Do you think it is something related to different types of filesystems, like AIX to Red Hat being unable to read it properly?
    So it has nothing to do with endianness? Right...
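    One way to narrow this down, sketched below, is to compare a checksum of the unzipped dump on both machines; if the hashes differ, the file was corrupted in transit (for example by an ASCII-mode FTP transfer). The paths are placeholders, and md5sum is assumed to be available on both sides (AIX may only ship an equivalent such as csum):
    # on the source AIX box, after unzipping the original export
    md5sum /stage/full_exp.dmp
    # on the destination Linux box, after the transfer and unzip
    md5sum /stage/full_exp.dmp
    # run imp only if the two hashes match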

  • Misc transactions running total turns blank when using different exp type

    Hi,
    I would like to ask for some advice on how to fix this issue:
    When entering a miscellaneous transaction with two different expenditure types (Labor and Material), the running total is blank. But if we enter the same expenditure type (both Labor or both Material), the running total has values.
    Any idea how to fix this? Thanks!
    Application: 11.5.10.2
    Edited by: JulieAnn on Mar 27, 2012 11:21 PM

    This was fixed by changing the "Pixel Format" in the Catalyst Control Center to "RGB (Full RGB)". Somehow it got changed to "YCbCr".
    I removed the /etc/ati/amdpcsdb file that the fglrx driver likes to put settings in and did a fresh reinstall of catalyst-test, and everything looks good now. I really have no idea how the pixel profile got changed. I also noticed that my resolution was much lower than it should have been (which for some reason doesn't affect XBMC, which runs at native resolution no matter what?), but that was an easy fix of adding Mode "1920x1080" to the Display section of my xorg.conf.

  • Photo looks different when decreasing exposure with adj. brush & when decreasing exp. on whole photo

    I want to darken the overexposed sky in a photo. I reduced the exposure (-4 EV) of the whole photo to see how the sky would look. It looked great: deep blue sky with white clouds with light gray areas. Then I reset the exposure and painted the sky with the adjustment brush set to Exposure -4 EV. The sky looked terrible: clouds were dark grayish black in some areas. I increased the exposure to -3 EV and then -2.5 EV and it looked less terrible, but not good. Why is there such a difference in the way the sky looks when changing the exposure of the whole photo versus changing it with the adjustment brush? Thanks!

    It is a known issue with how the adjustment brush works in exposure
    compensation mode. Hopefully it will be fixed in the future.

  • Error while exporting a table - EXP-00091

    I am doing an export of a table. The table has 1000838 rows. After the export completed,
    I checked the log, and it said:
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
    With the Partitioning option
    JServer Release 9.2.0.4.0 - Production
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    About to export specified tables via Conventional Path ...
    . . exporting table FIDA_LABEL 1000838 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
    I looked in the Oracle error messages documentation and found this:
    EXP-00091 Exporting questionable statistics
    Cause: Export was able to export statistics, but the statistics may not be useable. The statistics are questionable because one or more of the following happened during export: a row error occurred, client character set or NCHARSET does not match with the server, a query clause was specified on export, only certain partitions or subpartitions were exported, or a fatal error occurred while processing a table.
    Action: To export non-questionable statistics, change the client character set or NCHARSET to match the server, export with no query clause, or export complete tables. If desired, import parameters can be supplied so that only non-questionable statistics will be imported, and all questionable statistics will be recalculated.
    And this is how my export command looks:
    exp vincent/passphr query=\"where state in \(\'MD\',\'CA\',\'WI\'\)\" file=$EXPDIR/fida_label_9i.dmp tables=vincent.fida_label
    log=$LOGDIR/fida_label_exp.log
    Of course, I am using the query clause because I really need to, and it has always worked when we were in the Oracle 8i environment. We recently moved to 9i, and this happens in the 9i version...
    And I certainly do not want to specify the import parameters to ignore the questionable statistics, as no changes are desired in that area... (my hands are tied).
    What could "a fatal error occurred while processing a table" mean? How can this be traced and troubleshot? How can I find out whether any row errors occurred? And if required, how do I check the character sets and the like? (I have no idea in this area.)
    Thanks. All I need is to get around this error. Your suggestions/responses would be highly appreciated.

    What version of Oracle 9i are you using? Do you have a standard NLS_LANG environment variable set on the client machines, or do you set it to different values on different machines?
    Here is one way you could get around it:
    Could you specify the export parameter STATISTICS=NONE while exporting the table data?
    Try this and see.
    If this is successful, you could use the import utility as usual. You can always compute or estimate statistics on the table after the import.
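    For illustration, a minimal sketch of the original command with that parameter added (credentials, query and paths as posted above; statistics=none simply skips exporting the optimizer statistics):
    exp vincent/passphr query=\"where state in \(\'MD\',\'CA\',\'WI\'\)\" \
        tables=vincent.fida_label statistics=none \
        file=$EXPDIR/fida_label_9i.dmp log=$LOGDIR/fida_label_exp.log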

  • Same number range for two different series groups?

    Dear all,
    There are two scenarios
    1. Normal export under bond: the series group is 20, a number range is maintained, and the running number is 300016.
    2. Another scenario, where an ARE1 document is generated for a deemed export customer (already customised): the series group is 30.
    But the client requirement is that in this second scenario the system should also pick up the running number range of series group 20 (the under-bond case), as per the excise legal requirement.
    I.e. the running number for series group 20 is 300016;
    for the above deemed export case (second scenario) it should pick up 300017,
    and when they again do an under-bond case (first scenario), it should pick up 300018, and so on.
    Is it possible to maintain the same number range for two different series groups (20 and 30)?
    Even if you maintain the same number range for 30 as the running number range of 20,
    will the system simultaneously update the same number range for the 20 and 30 series groups?
    Please suggest a way.

    With two different series groups, it is not possible to have the same number range. Even if you maintain it, they will be treated independently.
    Normally, you should not use different series groups if the same number range has to be used. In fact, the concept of series groups was developed precisely so that number ranges can be maintained separately.
    Regards,
    Aroop

  • How to calculate the size of database (a different one)

    Hello Friends,
    I have been told to move the data from one server to another machine (cloning is a very good option, I agree, but I expect a different answer regarding exports and imports). So how should I go about the task? The destination machine has unlimited space, so that is not a problem. My questions are:
    1) How should I start the task? (I generally start by studying the structures and their sizes. Is that OK?)
    2) If I am using a Unix machine and there is a limitation that my server will not support file sizes of more than 2 GB, what should I do?
    3) Should I go for a full database backup or fragment the task? If I fragment it, there are many schemas, so it will become tedious; but a full backup will exceed the OS size limitation. What should be done?
    4) Is there any way I can go through a dump file to find out which database objects are present inside it and note the related dependencies?
    Please respond.
    Regards,
    Ravi Duvvuri

    1) They are Unix machines. So will the size problem occur (if there is a limitation)? How do we overcome that?
    R.- If the OS versions are the same you will not have any problem. Regarding storage, you only need enough disk space to store the datafiles, redo logs and controlfiles, and nothing else.
    2) I am trying the export/import approach. Though there are other good methods, I just want to explore the full capabilities of exp/imp.
    R.- Recreating the controlfiles is more effective and faster if the OS versions are the same.
    3) And the Oracle version is 9i Release 2. (If I have to perform this on 8i (both source and destination), will the method differ?)
    R.- The method is the same for 8i/9i/10g.
    How should I go about doing this export/import?
    R.- To use this method you have to have the original database up and running.
    To recreate the controlfile without the datafile that you mentioned, you just leave it out of the CREATE CONTROLFILE statement, and that's it.
    For example: if your database has 8 datafiles and you have only 7, you include only 7 in the CREATE CONTROLFILE statement.
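    For illustration, a minimal sketch of such a CREATE CONTROLFILE statement with the missing datafile simply left out (the database name, file names and sizes are hypothetical; it is run from SQL*Plus with the instance started NOMOUNT):
    STARTUP NOMOUNT
    CREATE CONTROLFILE REUSE DATABASE "MYDB" NORESETLOGS NOARCHIVELOG
      LOGFILE
        GROUP 1 '/u01/oradata/mydb/redo01.log' SIZE 50M,
        GROUP 2 '/u01/oradata/mydb/redo02.log' SIZE 50M
      DATAFILE
        '/u01/oradata/mydb/system01.dbf',
        '/u01/oradata/mydb/users01.dbf'
      CHARACTER SET WE8ISO8859P1;
    -- the lost datafile is simply not listed in the DATAFILE clause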
    Joel Pérez
    http://otn.oracle.com/experts
