Can I use export (Data Pump) of 10.2.0.3 against 10.2.0.2?

Hi Gurus,
Can I use an export (Data Pump) taken from 10.2.0.3 and import it (Data Pump) into 10.2.0.2?

The answer is yes, with a restriction.
Data Pump Import cannot read dump file sets created by a database version that is newer than the current database version, unless those dump file sets were created with the VERSION parameter set to the version of the target database. Therefore, the best way to perform a downgrade is to perform your Data Pump export with the VERSION parameter set to the version of the target database.
Check the document "Moving Data Between Different Database Versions".
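For example, to export from 10.2.0.3 so that a 10.2.0.2 database can import it, a minimal sketch (schema name, file names, and directory object are placeholders):

expdp system/password schemas=SCOTT version=10.2.0.2 \
  directory=DATA_PUMP_DIR dumpfile=scott_for_10202.dmp logfile=scott_exp.log

The 10.2.0.2 impdp can then read the resulting dump file set, because its metadata was written in 10.2.0.2-compatible format.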

Similar Messages

  • Can I use 10g datapump utility expdp to export an 8i database?

    Hi,
    Can I use the 10g Data Pump utility "expdp" to export 8i database objects? The reason I'm asking is that I would like to use the 10g impdp utility to import the dumps into 10g.
    Thanks in advance,
    Raj

    No, you can't use the 10g Data Pump utility "expdp" to export 8i database objects.
    The legal values for the VERSION parameter are as follows:
    COMPATIBLE - This is the default value. The version of the metadata corresponds to the database compatibility level. Database compatibility must be set to 9.2 or higher.
    LATEST - The version of the metadata corresponds to the database version.
    version_string - A specific database version (for example, 10.0.0). In Oracle Database 10g, this value cannot be lower than 9.2.
    Database objects or attributes that are incompatible with the specified version will not be exported. For example, tables containing new datatypes that are not supported in the specified version will not be exported.

  • Can't use Export Functionality from form?

    Hi experts,
    I have a problem; can anyone help me?
    This is my system information:
    APPLICATION: 11.5.10.2
    DATABASE: 9.2.0.6.0
    And the problem is:
    - Open a form
    - File > Export
    - The progress bar updates to 100%
    - The browser opens
    And then it suddenly disappears.
    I've checked Metalink and found this note: 423261.1 - How to Troubleshoot Forms Hanging Using Export Functionality.
    I think it fails in phase 4:
    4) Finally generate the URL, and display the file in the browser window by calling the Forms built-in web.show_document
    But I don't know how to solve this problem. Can anyone show me?
    Thank you so much

    Hi everybody,
    I solved my problem. In my case, it was because tablespace CTXD did not have enough free extents. You can check this note:
    Information You Should Upload When Creating an SR for File Export and Attachments (Doc ID: 427964.1)
    Hope that helps,
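    As a quick sanity check for this kind of failure (the tablespace name is the one from this case; adjust as needed), free space in a tablespace can be queried like this:
    SELECT tablespace_name, COUNT(*) AS free_extents, SUM(bytes)/1024/1024 AS free_mb
    FROM dba_free_space
    WHERE tablespace_name = 'CTXD'
    GROUP BY tablespace_name;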

  • Can I use Oracle 7.3 drivers against an Oracle 8 database?

    When I execute a stored procedure with the 8i driver, I get an error saying "Cannot Save or Rollback in distributed transaction". But when I use the Oracle 7.3 drivers, I have no problem. My question is: can I use the Oracle 7.3 drivers with an Oracle 8 database?

    I tried disabling MTS; that didn't seem to work. Somehow the 8i drivers work in distributed transaction mode by default. I resolved it by configuring the 8i drivers not to use distributed mode, and it works fine for me.
    For others who hit this same kind of error, the solution is:
    Disable distributed transactions if you are managing transactions manually from your scripts. Example: ConnectString = "DSN=...;DistribTx=0;"
    Justin, thanks for your concern and reply.

  • Can not use exported data

    When I transport tables from mssql2000 to oracle9i using the export tool provided by mssql, I run into a problem:
    Firstly, I cannot select data from the exported tables:
    the tables are exported successfully, but I cannot select from them in Oracle! It just reports "table or view does not exist", which in fact is not true, because I can find them with:
    SELECT owner,table_name,tablespace_name FROM dba_tables where owner='TSKY';
    OWNER   TABLE_NAME   TABLESP
    TSKY    PFSBFG       TSKY
    TSKY    company      TSKY
    Among the tables listed, I can select from 'pfsbfg' (created locally), but I cannot select from 'company' (imported from mssql). The other fields of the two tables in dba_tables are all the same.
    Secondly, I can display data of the company table using Oracle Enterprise Manager, with all data exported displayed correctly.
    How can it be?! I am totally confused!
    The same also happened when I successfully transported data from mssql2000 to oracle9i using the transparent gateway. That time, the transported data could be selected using SELECT * FROM company, but I could not select any single column of the company table; I got the error "no such column"!
    Any suggestions?

    OK, so you're connecting to SQL Server with the transparent gateway via a database link? Are you copying tables across the database link (materialized views?) and selecting from the materialized views or are you selecting data from the remote SQL Server database? Are you doing
    select * from d_table@dblinkToSQLServer
    or
    select * from d_table
    If the latter, how did you create d_table locally?
    My guess is that the columns of the table may have been created case-sensitively too. You could try putting quotes around the column names as well. Can you post the SQL that was used to create d_table?
    Justin
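    For what it's worth, a small illustration of the case-sensitivity point (the lowercase column name is hypothetical): identifiers created with double quotes must be quoted with the same case in every query, while unquoted identifiers are folded to uppercase by Oracle.
    select * from "company";         -- works if the table was created as "company"
    select "name" from "company";    -- the same rule applies to each column
    select name from company;        -- fails: Oracle looks for COMPANY and NAME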

  • Can PQA use a single reference frame against a test video stream?

    For R&D testing of video "set-top" devices I want to initiate some internal processing on our UUT, then using a PXI-1491 analyze the digital (HDMI) video output of the UUT for some large number of seconds (180 seconds or more for example).
    The video that is being analyzed will have static image content. The amount of time I want to analyze the test stream after doing some stuff on the UUT is variable, but I always anticipate it being fairly long.
    It currently appears that I have to have a reference stream that contains exactly the same number of frames as the test stream.
    This makes the reference vbf files very large. My test requirements include a large number of resolutions that must be tested through the UUT. Having an extensive library of very large vbf files is logistically difficult; it would be much easier to maintain such a library made up of single "golden" frame reference files instead. Additionally, since my analysis time needs to be variable depending on test setup and UUT processing options, it would be better for me to have a single golden reference frame and validate a lot of test frames against it.
    Since the analyzed video will have static images, is it possible to run a test video stream against a single reference frame (reduced reference) instead of having a full reference stream frame by frame?

    Doing exactly what you are asking is not within the design of PQA.  Of course, the best option that we would recommend when working with video test is a large capacity hard drive, probably in a RAID configuration for more space and better performance, and then just taking a golden reference with 10k frames, and doing what you originally suggested.  
    One method of achieving what you are looking for without using a large reference file is through offline processing.  This would allow you to acquire 10,000 consecutive frames, which I understand is one of your concerns, and then after the fact run them back through the analyzer.  You would still be performing a looping type of action in TestStand or LabVIEW.  The process would be:
    1) Acquire your source from your 1491 provider with no processors, and saving the media stream to disk.
    Loop:
    2) Load PQA with the disk buffer provider instead of the 1491, the disk buffer file will be the acquisition from above
    3) Point the start frame to your current location.
    4) Perform your processor with the results.
    5) Repeat and iterate to a new start frame location
    This process is going to be much slower as you load and unload resources every time.  If you choose to go this way, a better implementation would be to maybe consider doing 100 frames at a time, or some number larger than 1, because the processing time of handling the extra frames is going to be less than loading/unloading PQA.
    Your second option is going to require more work outside of PQA to implement custom functionality through a Custom User Processor.  Effectively this allows you to come up with a way to process incoming data in ways that you'd like.  To do this you will need:
    1) LabVIEW 2011 - Custom User Processors for PQA can only be developed in LabVIEW 2011.  If you are under a SSP agreement with NI and only have LabVIEW 2012 currently, you still have access to older versions.
    2) Vision Development Module - To perform your image processing
    3) An unencrypted video source - Due to limitations of HDCP we can not expose the raw video feed to user processors.
    In this user processor you would:
    1) Load your static image/frame in directly, you wouldn't need to use the Media Ref input
    Loop
    2) Load the current frame from the incoming video array
    3) Perform your video measurement with the Vision function, likely PSNR or SSIM, since these are currently the only 2 referenced measurements in PQA.
    To learn more about Custom User Processors, in the PQA help check out: NI PQA Executive and the NI PQA Configuration Panel > NI PQA Tabs > Processors tab > Customizable Output Processors > User Processor, as well as: http://digital.ni.com/public.nsf/allkb/514058CC830D86EE86257881004CB45F
    Paul Davidson
    Sound and Vibration Software Staff Product Support Engineer
    National Instruments

  • Can I export my SAP-created master data to a non-SAP system?

    Hello,
    I am about to start a large integration solution with my SAP customer, and they are concerned about licensing. What they want to do is export their GL, cost centre, and WBS master data to an external system via integration (probably custom ABAP, or a home-built BAPI/RFC). The external system then uses the accounting data to determine where to post FI or MM documents back into SAP (via custom ABAP, BAPI, or maybe IDoc).
    Is this OK?  Are there any licensing issues with master data used externally and posted back in via custom ABAP or standard BAPIs/IDOCs?
    Thank you in advance,
    Geoff

    Hello Geoff,
    Yes, even indirect access to SAP transactions or data needs to be licensed properly with SAP. Your customer should contact their SAP account rep to discuss the exact use case. Indirect access via certain types of certified integrations may qualify for Platform User License pricing, but there is no generic answer, and it is best for the customer to discuss their specific case directly.

  • How can we take an incremental export using Data Pump at the schema level?

    Hi,
    How can we take an incremental export using Data Pump at the schema level? For example, today I took a full export of one schema.
    Seven days from now, how can I export only the data that has changed since my full export?
    Using the Data Pump export parameter FLASHBACK_TIME, can we specify a date range, for example sysdate to (sysdate - 7)?
    Please advice
    thanks
    Naveen.

    Think of the Data Pump Export/Import tools as taking a "picture" or "snapshot."
    When you use these utilities it exports the data as it appears (by default) while it's doing the export operation. There is no way to export a delta in comparison to a previous export.
    The FLASHBACK_TIME parameter allows you to get a consistent export as of a particular point in time.
    I recommend you go to http://tahiti.oracle.com. Click on your version and go to the Utilities Guide. This is where all the information on Data Pump is located and should answer a lot of your questions.
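    For example (connection details and names are placeholders), FLASHBACK_TIME pins the whole export to a single point in time; it does not take a range:
    expdp system/password schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp \
      flashback_time="TO_TIMESTAMP('2012-11-25 18:00:00','YYYY-MM-DD HH24:MI:SS')"
    Every table in the resulting dump is then consistent as of that one timestamp.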

  • Can RMAN backup and Data Pump export be executed at the same time?

    Hello,
    I have several databases that I back up using RMAN and Data Pump export every night, starting at 6 PM and ending at midnight. The backup maintenance window doesn't give me enough time to run each database at a different time. I am using crontab to schedule my backups. Since I have so many databases that need to be backed up between 6 PM and midnight, some of the export and RMAN backup scripts execute almost at the same time. My question is: can my Data Pump export and RMAN backup scripts run at the same time?
    Thank you in advance.
    John

    Needs must. If you don't run expdp in parallel, it doesn't use that much. If it were really killing the system, you could look into setting up a resource plan that throttles that user, but that is a big step.
    I would rather look into using RMAN incrementals, and block change tracking, to minimize your RMAN time.
    Regards
    If your shop needs to do both simultaneously then go for it.
    Chris.
    PS: One of my shops has maybe 20-30 RMAN and Data Pump jobs all kicking off, some simultaneously, some not, from 0000 to 0130. No complaints from users and no problems either. Go for it.
    Edited by: Chris Slattery on Nov 25, 2012 11:19 PM
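    As a minimal sketch of the incremental idea mentioned above (the tracking file path is a placeholder):
    SQL> ALTER DATABASE ENABLE BLOCK CHANGE TRACKING
           USING FILE '/u01/app/oracle/bct/change_tracking.f';
    RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE;   -- one-time baseline
    RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE;   -- nightly; reads only changed blocks
    With change tracking enabled, the level 1 backups scan only the blocks recorded as changed, which can shorten the nightly window considerably.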

  • How can I use the versioning tool to make a coherent solution to export

    How can I use the versioning tool to make a coherent solution for exporting all the objects used in my project to another DB schema, for testing as well as for implementation?
    I want to know the proper steps to import/export objects.
    When I export my master repository and work repository and then import them all again, there is no problem: all the physical/logical schema mappings and the mappings between interfaces are retained. But
    when I try to export and import just one or two packages for ease of testing, I get so many mismatches that I have to fix manually. Why is that?
    I would appreciate your suggestions.
    Palash Chatterjee

    kelly-bot wrote:
    Thanks for the tip, but like I said, I don't want to always have to use the arrow keys. Is it not possible to actually DRAG one pixel at a time? I recently upgraded to CC from CS3 and I keep running into things that feel like a downgrade. Super disappointed.
    Our posts overlapped there. Are you saying you could reliably drag one pixel with CS3?
    What other issues are you having? Things like the Crop tool can feel strange at first, but I promise you'll never want to use Classic mode once you get used to it.

  • Can I use Preview to export a multiple-page PDF as JPEGs?

    Using Preview, I export a multiple-page PDF as JPEG, and the result is only the first page as a JPEG. Can I use Preview to export every page of a multiple-page PDF as a JPEG?

    Automator is great for this if you need to export many pages to JPG, or just have to do this often.
    Two other quick-and-dirty ways to get it done:
    (1) Enable thumbnail view, click on the ONE PAGE you want to convert to jpg, then choose "export." If you choose more than one page, only the first will get exported to jpg.
    (2) Create a copy of the file, delete all the pages except the one you want to export to JPG, and export that page. You may be able to "undo" the delete, get all your pages back again, and re-delete all but the one you want to export, etc. I would not recommend doing this on the original version of the file because of Preview's autosave so-called "feature."

  • Can I use OLE2 to export to Excel without having Excel?

    Dear all,
    Can I use CLIENT_OLE2 to export from a form to an Excel sheet without having Microsoft Excel installed on the AS (application server)?
    Thanks
    $hOoO$h

    Dear $hOoO$h,
    Yes, you can run it on the client side, but the client machine must have Microsoft Office installed. CLIENT_OLE2 executes on the client, so Excel is not needed on the application server itself.
    Like this:
    PROCEDURE init_excel (application in out client_ole2.obj_type,
                          workbooks   in out client_ole2.obj_type,
                          workbook    in out client_ole2.obj_type,
                          worksheets  in out client_ole2.obj_type,
                          worksheet   in out client_ole2.obj_type) IS
      args client_ole2.list_type;
      cell client_ole2.obj_type;  -- declared here; passed to write_cell below
    BEGIN
      application := client_ole2.create_obj('EXCEL.APPLICATION');
      workbooks   := client_ole2.get_obj_property(application, 'workbooks');
      args := client_ole2.create_arglist;
      client_ole2.add_arg(args, 1);
      workbook := client_ole2.get_obj_property(workbooks, 'add', args);
      client_ole2.destroy_arglist(args);
      worksheets := client_ole2.get_obj_property(workbook, 'worksheets');
      worksheet  := client_ole2.get_obj_property(worksheets, 'add');
      write_cell(worksheet, cell, 1, 1, 'your data');
      write_cell(worksheet, cell, 1, 2, 'your data');
      release_excel(application, workbooks, workbook, worksheets, worksheet);  -- releases the OLE2 objects; definition not shown here
    END;
    PROCEDURE write_cell (worksheet client_ole2.obj_type,
                          cell in out client_ole2.obj_type,
                          x number, y number, text varchar2) IS
      args client_ole2.list_type;
    BEGIN
      args := client_ole2.create_arglist;
      client_ole2.add_arg(args, x);
      client_ole2.add_arg(args, y);
      cell := client_ole2.get_obj_property(worksheet, 'cells', args);
      client_ole2.destroy_arglist(args);
      client_ole2.set_property(cell, 'value', text);
      client_ole2.release_obj(cell);
    END;
    Hope it works.
    Regards
    Reda El Mitwally

  • Can I use Bridge to export image data into a .txt file?

    I have a folder of images and I would like to export the File Name, Resolution, Dimensions and Color Mode for each file into one text file. Can I use Bridge to export image data into a .txt file?

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder in which to start searching for *.map files, and will then create a CSV file named "out.csv" on the desktop, which you can import into Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF
    Hope this may help,
    H

  • Can I use Oracle Clusterware 11 against 10.2.0.4 databases?

    Implementing a 2-node RAC system on Linux RHEL 5, with version 10.2.0.4 databases. I read a fleeting post mentioning that you can use Oracle Clusterware version 11 against a 10gR2 database - is that the case? I've searched but can find no additional information regarding this, so if anyone can provide additional links or info, I would be most appreciative -
    -kpw

    Hi,
    > Implementing a 2-node RAC system on Linux RHEL 5, with version 10.2.0.4 databases. I read a fleeting post mentioning that you can use Oracle Clusterware version 11 against a 10gR2 database - is that the case?
    Yes. You can use the 11g Clusterware to manage 10g and 11g RAC databases.
    > I've searched but can find no additional information regarding this, so if anyone can provide additional links or info, I would be most appreciative -
    Take a look here [Understanding Compatibility in Oracle RAC Environments|http://download.oracle.com/docs/cd/E11882_01/rac.112/e10718/admin.htm#RACAD803], here [Oracle Clusterware Version Compatibility|http://download.oracle.com/docs/cd/E11882_01/rac.112/e10717/intro.htm#CWADD1118] and also in metalink note 337737.1 (Oracle Clusterware - ASM - Database Version Compatibility)
    Hope it helps,
    Cerreia
    Edited by: Cerreia on 27/10/2009 20:38

  • Can I use Data Pump for replication of deltas from one database to another?

    Good Morning Experts,
    I would like to know whether we can use Data Pump (expdp and impdp) for replication between two databases on a daily basis, only for the delta and not for the complete database, and if so, how?
    Regards,
    MS

    MS,
    I assume you are talking about both metadata and data. I don't know of a way to do this. I have heard of DBAs using Data Pump import to track differences in metadata, but not to replicate them.
    To track differences in metadata you do the following every day:
    expdp full=y ...
    impdp full=y sqlfile=full_todays_data.sql ...
    Then, from day to day, just diff the .sql files to see what changed. You will see whether tables added/dropped columns, new indexes, etc. Based on this, you may be able to come up with a set of expdp/impdp jobs to replicate the modified objects, but there is nothing built in.
    Hope this helps
    Dean
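    A minimal sketch of that daily diff approach (file names and credentials are placeholders; CONTENT=METADATA_ONLY keeps the dump small, since only DDL is compared):
    expdp system/password full=y content=metadata_only directory=DATA_PUMP_DIR dumpfile=full_today.dmp
    impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full_today.dmp sqlfile=full_today.sql
    diff full_yesterday.sql full_today.sql > metadata_changes.txt
    As noted above, this only tracks metadata; there is nothing built in for the data delta itself.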

Maybe you are looking for

  • Program associated with the creation of a virtual instrument in LabVIEW. Please help me

    I have a question about a program associated with the creation of a virtual instrument in LabVIEW. 1. Create a virtual instrument based on the NI 9401 module fitted to the NI CompactDAQ chassis (NI cDAQ-9172). Creating a virtual tool needed to implemen

  • AUDIO HORRIBLE in imovie to idvd

    HELP!!! I made this wonderful movie in imovie, then it said it had to be rendered so I did. Then I exported it into idvd and the audio is so horrible I can hardly tell it's a song. I never had this when I exported a slideshow/movie from iphoto to idv

  • Getting root's crontab to run with root's privs

    I am trying to schedule a cron job to run some serveradmin scripts. If I edit /etc/crontab, the script runs as scheduled, but while it appears to run as user root, it doesn't appear to run with root's privileges. Here is /etc/crontab and a simplified

  • Duplicated/blurred/overlapping fonted text in web view

    We are experiencing an annoying issue with fonted text in web view frames. See the screenshot below: Look closely at the headline and you'll notice that it seems to be duplicated on top of itself, as if there are two identical headlines on top of eac

  • Web Service – IDOC with acknowledgement interface

    Hi experts, for this scenario do I need to use ccBPM? Web Service – IDOC with acknowledgement interface. Because we need a response from the IDoc, and IDoc supports asynchronous processing.