Exporting and importing parameters

Hi,
Can anyone explain how exporting and importing parameters work in the case of a function call?
For example, in the following code:
CALL FUNCTION 'DOWNLOAD'
     EXPORTING
          FILENAME     = <default file name>
          FILETYPE     = <default file type>
          MODE          = <create new or extend>
     IMPORTING
          FILESIZE     = <size of file in bytes>
     TABLES
          DATA_TAB     = <internal table to transfer>
     EXCEPTIONS. . . .
Do we have to specify the importing parameters in the program as well, and if yes, what is the use of specifying them?
Thanks,
Dhiraj

You only need to fill the EXPORTING and TABLES parameters:
CALL FUNCTION 'DOWNLOAD'
     EXPORTING
          FILENAME = <default file name>          " specify the local file name
          FILETYPE = 'ASC'                        " e.g. 'ASC' for a plain-text file
*         MODE     = <create new or extend>       " optional - may stay commented out
*    IMPORTING                                    " optional - only if you need the result
*         FILESIZE = <size of file in bytes>
     TABLES
          DATA_TAB = <internal table to transfer> " the internal table to download
     EXCEPTIONS . . .                             " uncomment and handle all exceptions
This FM downloads the data from the internal table to a local file, e.g. a .txt or .xls file.
Thanks
Seshu
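To make the answer concrete: you list an IMPORTING parameter only when you want the value the function module returns; otherwise it can stay commented out. A minimal sketch (the file name and table are illustrative; on newer releases GUI_DOWNLOAD supersedes the obsolete DOWNLOAD):
DATA: lt_data TYPE STANDARD TABLE OF string, " table to be downloaded
      lv_size TYPE i.                        " receives FILESIZE on return

CALL FUNCTION 'DOWNLOAD'
  EXPORTING
    filename = 'C:\temp\result.txt'
    filetype = 'ASC'
  IMPORTING
    filesize = lv_size        " filled by the FM when the call returns
  TABLES
    data_tab = lt_data
  EXCEPTIONS
    OTHERS   = 1.
IF sy-subrc <> 0.
  WRITE: / 'Download failed'.
ENDIF.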

Similar Messages

  • How to differentiate which are exporting and importing parameters when calling dynamically using CL_ABAP_OBJECTDESCR

    Hello All,
    First of all, sorry for posting the question in General ABAP Queries; somehow I am not able to post in the ABAP Objects discussions.
    My requirement is
    Step 1: Get a table name as input and dynamically derive all the fields which are going to be passed as parameters to a method.
    I did this using the cl_abap_typedescr class.
    Step 2: Then pass the values into the relevant fields using the code below:
    ptab_line-name = 'DATA_TAB'.
    ptab_line-kind = cl_abap_objectdescr=>changing.
    GET REFERENCE OF text_tab INTO ptab_line-value.
    INSERT ptab_line INTO TABLE ptab.
    Step 3: Later I call the method dynamically like this:
        CALL METHOD (class)=>(meth)
          PARAMETER-TABLE
            ptab
          EXCEPTION-TABLE
            etab.
    I have read most of the posts on using cl_abap_typedescr=>describe_by_name to get a parameter, where I have to give the correct parameter type.
    But my issue is that I also need to know which fields act as importing, exporting and changing parameters.
    Is there any way to determine and pass the method's parameter kind as well?
    Thanks!
    Regards,
    Ramya

    Maybe I can use the view VSEOMEPARA.
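    Alternatively, CL_ABAP_OBJECTDESCR itself exposes every method's parameters together with their kind, so the database view is not needed. A minimal sketch, assuming a class ZCL_SOME_CLASS with a method SOME_METHOD (both names are placeholders); note that the kind column of the caller's parameter table is specified from the caller's side, so a parameter the method imports is typically bound with kind cl_abap_objectdescr=>exporting:
    DATA: lo_descr TYPE REF TO cl_abap_objectdescr,
          ls_meth  TYPE abap_methdescr,
          ls_parm  TYPE abap_parmdescr.

    " Cast the generic type descriptor to a class descriptor
    lo_descr ?= cl_abap_objectdescr=>describe_by_name( 'ZCL_SOME_CLASS' ).

    " Each entry of lo_descr->methods carries its parameters and their kind
    READ TABLE lo_descr->methods INTO ls_meth WITH KEY name = 'SOME_METHOD'.
    LOOP AT ls_meth-parameters INTO ls_parm.
      CASE ls_parm-parm_kind.
        WHEN cl_abap_objectdescr=>importing.
          WRITE: / ls_parm-name, 'IMPORTING'.
        WHEN cl_abap_objectdescr=>exporting.
          WRITE: / ls_parm-name, 'EXPORTING'.
        WHEN cl_abap_objectdescr=>changing.
          WRITE: / ls_parm-name, 'CHANGING'.
        WHEN OTHERS.
          WRITE: / ls_parm-name, 'RETURNING'.
      ENDCASE.
    ENDLOOP.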

  • SQL Developer 2.1: Problem exporting and importing unit tests

    Hi,
    I have created several unit tests on functions that are within packages. I wanted to export these from one unit test repository into another repository on a different database. The export and import work fine, but when running the tests on the imported version, there are lots of ORA-06550 errors. When debugging this, the function name is missing in the call, i.e. it is attempting <SCHEMA>.<PACKAGE> (parameters) instead of <SCHEMA>.<PACKAGE>.<FUNCTION> (parameters).
    Looking in the unit test repository itself, it appears that the OBJECT_CALL column in the UT_TEST table is null - if I populate this with the name of the function, then everything works fine. Therefore, this seems to be a bug with export and import, and it is not including this in the XML. The same problem happens whether I export a single unit test or a suite of tests. Can you please confirm whether this is a bug or whether I am doing something wrong?
    Thanks,
    Pierre.

    Hi Pierre,
    Thanks for pointing this out. Unfortunately, it is a bug on our side and you have found the (ugly) "work-around".
    Bug 9236694 - 2.1: OTN: UT_TEST.OBJECT_CALL COLUMN NOT EXPORTED/IMPORTED
    Brian Jeffries
    SQL Developer Team
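    Until the fix ships, here is a sketch of that workaround run directly against the repository schema (the UPDATE is hypothetical - adapt the function name, add a WHERE filter for your own tests, and verify the UT_TEST columns in your repository first):
    UPDATE ut_test
       SET object_call = 'MY_FUNCTION'  -- name of the packaged function under test
     WHERE object_call IS NULL;
    COMMIT;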

  • Exporting and importing just table definitions

    Hi,
    I have this production database that has a huge amount of data in it. I was asked to set up a test database based on the exact same schema as the live database. When I tried to do an export (from live) and import (to test), with the parameters rows=N and compress=y, the target (test database) data file will still grow enormously, presumably because of the huge number of extents already allocated to the table in the live database. My test database of course, has a limited hard-disk space.
    Is there a way to export and import the table definitions without having the target database experiencing a huge growth in the size of the tablespace?
    Thanks,
    Chris.

    If an export with compress=n still creates initial extents that are too large, you can still build the schema from the export file, but it takes a little work:
    run imp with indexfile=somefile.sql
    when imp is finished, edit somefile.sql to:
    1. remove all the REM statements.
    2. remove all the storage clauses (tables and indexes).
    Make sure your tablespaces have a small (say 1K) default initial extent.
    Run imp again with rows=n.
    All your tables and indexes will be created with the tablespace's default initial extent.
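    A sketch of that sequence with the classic imp utility (credentials and file names are placeholders; ignore=y is assumed here so the second run tolerates the objects you pre-created from the edited script):
    imp scott/tiger file=prod_exp.dmp full=y indexfile=somefile.sql
    (edit somefile.sql as described above, then run it to create the tables and indexes)
    sqlplus scott/tiger @somefile.sql
    imp scott/tiger file=prod_exp.dmp full=y rows=n ignore=y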

  • Internal table export and import in ECC 5.0 version

    Hi friends,
    I am trying to export and import an internal table from one program to another program.
    The export and import commands below are not working when I run the program in the background (using SUBMIT zxxxx VIA JOB name NUMBER number…).
    EXPORT ITAB TO MEMORY ID 'ZMATERIAL_CREATE'.
    IMPORT ITAB FROM MEMORY ID 'ZMATERIAL_CREATE'.
    Normally it should work. Since it's not working, I am trying another alternative,
    i.e. EXPORT (ptab) INTERNAL TABLE itab.
    My SAP version is ECC 5.0.
    For your information, I am forwarding the SAP help below. Please have a look and explain how to declare the ptab internal table.
    Extract from SAP help:
    In the dynamic case the parameter list is specified in an index table ptab with two columns. These columns can have any name and have to be of the type "character". In the first column of ptab, you have to specify the names of the parameters and in the second column the data objects. If the second column is initial, then the name of the parameter in the first column has to match the name of a data object. The data object is then stored under its name in the cluster. If the first column of ptab is initial, an uncatchable exception will be raised.
    Outside of classes you can also use a single-column internal table for parameter_list for the dynamic form. In doing so, all data objects are implicitly stored under their name in the data cluster.
    My internal table has around 45 columns.
    Please help me.
    Thanks in advance
    raghunath

    The export/import should work the way you are using it. Just make sure you are using the same memory ID and that it is unique, meaning you use it only for this itab and do not overwrite it with other values. Check that itab is not initial before you export it in program 1, then import it in program 2 with the same memory ID. Also check the case of the ID; I am not sure whether it is case-sensitive.
    Here is how you use the second variant.
    Two fields with two different identifications "P1" and "P2" are written to the ABAP memory using the dynamic variant of the cluster definition. After execution of the IMPORT statement, the contents of the fields text1 and text2 are interchanged.
    TYPES:
      BEGIN OF tab_type,
        para TYPE string,
        dobj TYPE string,
      END OF tab_type.
    DATA:
      id    TYPE c LENGTH 10 VALUE 'TEXTS',
      text1 TYPE string VALUE `IKE`,
      text2 TYPE string VALUE `TINA`,
      line  TYPE tab_type,
      itab  TYPE STANDARD TABLE OF tab_type.
    line-para = 'P1'.
    line-dobj = 'TEXT1'.
    APPEND line TO itab.
    line-para = 'P2'.
    line-dobj = 'TEXT2'.
    APPEND line TO itab.
    EXPORT (itab)     TO MEMORY ID id.   " writes text1 under 'P1' and text2 under 'P2'
    IMPORT p1 = text2
           p2 = text1 FROM MEMORY ID id. " reads them back crosswise, swapping the contents

  • Export and Import of mappings/process flows etc

    Hi,
    We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD, and in fact I suspect we have a corrupt repository as a result. I plan to split the repository into 3 separate databases so that we have a design repository for DEV, UAT and PROD. To control code migrations between these, I plan to use metadata export and subsequent import into UAT and then PROD once tested. I have used this successfully before on a project, but am worried about inherent bugs with metadata export/import (I have been bitten before with Oracle Portal). So can anyone advise what pitfalls there may be with this approach, and in particular whether anyone has experienced loss of metadata between export and import? We have a complex warehouse with hundreds of mappings, process flows, sqlldr flat-file loads etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
    Thanks for any comments,
    Brandon

    This should do the trick for you. It looks for "PARALLEL", so it only removes the APPEND PARALLEL hint and leaves other hints as is.
    #set current location
    set path "C:/TMP"
    # Project parameters
    set root "/MY_PROJECT"
    set one_Module "MY_MODULE"
    set object "MAPPINGS"
    set path "C:/TMP
    # OMBPLUS and tcl related parameters
    set action "remove_parallel"
    set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
    set timestamp [clock format [clock seconds] -format %Y%m%d-%H:%M:%S]
    set ext ".log"
    set sep "_"
    set ombplus "OMBplus"
    set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
    set OMBLOG $omblogname
    set logname $path/$one_Module$sep$object$sep$datetime$ext
    set log_file [open $logname w]
    set word "PARALLEL"
    set i 0
    #Connect to OWB Repository
    OMBCONNECT .... <your connect string>
    #Ignores errors that occur in any command that is part of a script and moves to the next command in the script.
    set OMBCONTINUE_ON_ERROR ON
    OMBCC "'$root/$one_Module'";      
    #Searching Mappings for Parallel in source View operators
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel";
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping";
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel";
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping";
    foreach mapName [OMBLIST MAPPINGS] {
        # Check every table, dimension, cube and view operator in the mapping
        foreach opKind {TABLE DIMENSION CUBE VIEW} {
            foreach opName [OMBRETRIEVE MAPPING '$mapName' GET $opKind OPERATORS] {
                # Inspect both the loading and the extraction hint of each operator
                foreach hint {LOADING_HINT EXTRACTION_HINT} {
                    if { [string equal $hint "LOADING_HINT"] } {
                        set label "Loading"
                    } else {
                        set label "Extraction"
                    }
                    foreach prop [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES ($hint)] {
                        if { [regexp $word $prop] == 1 } {
                            incr i
                            puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, $label Operator: $opName, Property: $prop"
                            puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, $label Operator: $opName, Property: $prop"
                            # Blank out the hint, which removes the APPEND PARALLEL hint
                            OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES ($hint) VALUES ('')
                            OMBCOMMIT
                        }
                    }
                }
            }
        }
    }
    if { $i == 0 } {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
    } else {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
    }
    close $log_file
    Enjoy!
    Michel

  • Calling Oracle Export and Import in java

    Hi,
    I am trying to call the Oracle export and import utilities from my Java application.
    Can anyone help me with how to do this? Please consider that this is part of my project.
    Regards
    Ashish

    You should just be able to call the "imp" and "exp" executables ($ORACLE_HOME/bin) using the Runtime class. You can set up the various parameters for export/import in a parameter file.
    http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html
    http://www.orafaq.com/faqiexp.htm
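    A minimal sketch of that approach, using ProcessBuilder (the modern successor to Runtime.exec); the credentials and parameter file name are placeholders, and exp is assumed to be on the PATH:
    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class ExportRunner {
        public static void main(String[] args) throws Exception {
            // Launch the classic export utility; all export options live in export.par
            ProcessBuilder pb = new ProcessBuilder("exp", "scott/tiger", "parfile=export.par");
            pb.redirectErrorStream(true); // merge stderr into stdout
            Process p = pb.start();
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(line); // echo exp's output so failures are visible
                }
            }
            System.out.println("exp exit code: " + p.waitFor());
        }
    }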

  • Export and import XMLType table

    Hi,
    I want to export one table which contains an XMLType column from Oracle 11.2.0.1.0 and import it into 11.2.0.2.0.
    I got the following error when I exported the table with the exp and imp utilities:
    EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
    Then I tried Data Pump export and import. The export works; the following is the log:
    Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ;;; Legacy Mode Active due to the following parameters:
    ;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
    ;;; Legacy Mode has set reuse_dumpfiles=true parameter.
    Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 13.23 GB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
    Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
    E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
    Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
    I got an error when I imported the dump using the following command:
    impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
    The error is:
    KUP-11007: conversion error loading table "CC_DBA"."XXXX"
    ORA-01403: no data found
    ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
    Please help me find a solution for this.

    CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
    The above index was created by us because we are storing files like:
    <?xml version="1.0" encoding="UTF-8"?>
    <xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
    <context id="Company_Current_ForPeriod"> ...
    I tried the Data Pump export both with and without DATA_OPTIONS=XML_CLOBS. Both times the export succeeded, but the import fails with the same KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL" error.
    I tried the import in different ways:
    1. Create the table first, then import the data only (CONTENT=DATA_ONLY)
    2. Import the whole table
    Both ways the table DDL is created successfully, but it fails on importing the data. The following is the log from the import:
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
    Starting "aaaaa"."SYS_IMPORT_TABLE_02":  aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
    bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
    ORA-01403: no data found
    ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-26062: Can not continue from previous errors.
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26

  • How to do fast export and import

    I have a Windows 2003 server with Oracle 10.2.0.3 installed on it.
    I want to ask how I can speed up my export and import using expdp.

    Hi User,
    user11798002 wrote:
    I have a Windows 2003 server with Oracle 10.2.0.3 installed on it. I want to ask how I can speed up my export and import using expdp.
    You can parallelize the export when using Data Pump; for the traditional exp/imp utilities, use the following.
    Database export reads the data by running a SELECT statement and generates the DDL needed to perform the import later.
    Fast Exports
    1) Use direct=y.
    2) As long as the disk is not fully utilized, try running exports in parallel.
    3) Keep the export file on a different disk than the datafiles.
    4) Run the export in two parts rather than one, i.e. the first with rows=n and the second with indexes=n rows=y constraints=n.
    Fast Imports
    1) Use the first of the two files created by the export; it will insert all data but will not create any indexes or constraints. Once the data insertion is done, run imp with the indexfile option to extract the index-creation script into a text file. Edit the file to include a PARALLEL clause, set db_file_multiblock_read_count to 128 and sort_area_size to a higher value, and set workarea_size_policy=AUTO.
    A R P I T S I N H A
    [oracledba.in]
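    Since the question was about expdp, here is a minimal Data Pump sketch of the parallel approach (the directory object, credentials and file names are placeholders):
    expdp scott/tiger directory=DP_DIR dumpfile=exp_%U.dmp logfile=exp.log full=y parallel=4
    impdp scott/tiger directory=DP_DIR dumpfile=exp_%U.dmp logfile=imp.log full=y parallel=4
    The %U in the dump file name lets each parallel worker write to its own file.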

  • Export and import problems

    I have 2 major problems importing my application from htmldb.oracle.com to my server at home.
    1.
    I exported my application and themes from htmldb.oracle.com.
    When I opened the SQL export file I saw that all Hebrew characters were gibberish.
    How can I export my application with Hebrew fonts?
    2. I tried to export and import XML data (my tables' data).
    The export XML file shows the Hebrew data, but when I import it into my tables I get ????????? instead of Hebrew.
    I think it's the NLS language, but I don't know which NLS settings I need and where to change them
    (database, environment, ...).
    Can someone help me with these problems?
    thanks
    P.S.
    my db is 10.1.0.3
    Red Hat Linux AS 4 on VMware 4.5
    my nls parameters:
    nls_language: american
    nls_territory: america
    nls_characterset: we8iso8859p1
    nls_nchar_characterset: al16utf16
    thanks

    Zvika,
    I have two options that may work for you. First, define the fields that hold Hebrew characters using the CHAR qualifier, for example varchar2(30 char). That allows you to store double-byte characters from other languages, and it overrides the nls_length_semantics parameter (byte is the default).
    Or, you could set nls_length_semantics to default to char. The char qualifier definitely uses more space, so it's probably best to override the byte default only when necessary.
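    A quick sketch of both options (the table and column names are illustrative):
    -- Option 1: per-column CHAR qualifier, so the length counts characters, not bytes
    CREATE TABLE customers (
      name VARCHAR2(30 CHAR)
    );
    -- Option 2: make CHAR semantics the session default before creating objects
    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;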
    Hope that helps!
    Also, I posted the 'Blogging Tool for HTMLDB' question with the username 'aufan89', which I had to change because my email changed.
    So my new username is now 'aufan1989'. I look forward to your email when you get your app fixed and posted onto HTMLDB Studio!
    Thanks!

  • 9i to 11g migration using export and import

    Hi,
    I got a requirement to migrate a 9i database to 11g using export and import.
    Please let me know which parameters are required in 11g so that the database will run with better performance.
    thanks,

    You can start with the current parameters, except for COMPATIBLE (which must be 10.0.0 or higher in the 11g environment). However, you will have to remove parameters deprecated in 11g.
    See http://download.oracle.com/docs/cd/E11882_01/server.112/e17222/changes.htm#BABHACIE
    Then you can test performance and identify what you need to change.
    Hemant K Chitale
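    A sketch of the basic flow with the classic utilities (credentials and file names are placeholders; run exp with the 9i client against the source and imp with the 11g client against the target):
    exp system/manager full=y file=db9i_full.dmp log=exp_db9i.log
    imp system/manager full=y file=db9i_full.dmp log=imp_db11g.log ignore=y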

  • URGENT...Help on User Level export and import

    User-level export and import -
    In my DB there are more than 20 users. I would like to export only 2 users, and they have constraints to other users' tables in the DB.
    What precautions do I need to take before exporting?
    1. Disabling all constraints, etc.
    2. Synonyms, procedures, triggers, etc.
    3. What parameter settings should I use for the export?
    For importing back into the DB (same users) - what sequence/steps should I follow?
    1. What parameter settings should I use for the import?
    2. Precautions like disabling constraints.
    3. Special actions for synonyms/procedures.
    Using Oracle 8.1.7 on Solaris. Total DB size is 13.3 GB.
    Any help will be appreciated.

    See to it that you have the DBA privileges to exp/imp.
    1. Grant export full database / import full database privileges to the user doing this.
    2. If this is the first time you are doing this, you have to run the scripts CATEXP.SQL and CATPROC.SQL as user SYS/INTERNAL to create the necessary views and data dictionary objects. These instructions are also mentioned in the comments at the beginning of the scripts.
    3. Also create the necessary tablespaces where you need to direct your import.
    4. Select the interactive option, where you can interact with the imp/exp process and make selections for users/tables/partitions and other preferences.
    5. Look out for the version compatibility of your .DMP files.
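    A minimal owner-level sketch with the classic utilities (usernames, passwords and file names are placeholders; note that owner-mode export does not pull in other schemas' tables that your two users reference, which is why the cross-schema constraints need attention):
    exp system/manager owner=(USER1,USER2) file=two_users.dmp log=exp_users.log
    imp system/manager fromuser=(USER1,USER2) touser=(USER1,USER2) file=two_users.dmp log=imp_users.log ignore=y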

  • Question regarding export and import of Hyperion Security during upgrade

    Hi Guys,
    We are upgrading Essbase and Integration Services from 7x to 9x, which utilize Hyperion Hub, and we are going to follow the method of uninstalling 7x and reinstalling the 9x components.
    Now my question is: what is the best way of transferring security from 7x to 9x? I heard that Advanced Security Manager can be used to export and import the security. Or is there any other way of doing it?
    Can someone please enlighten me on this.
    Thanks in advance
    K

    Ihatelightroom wrote:
    First let me say that any software that comes without a save button should be sold with a warning label.
    Why?
    Question 1: I am having an issue comprehending how to save a photo. In my case I select the photo, zoom in on the subject, and export it to my desktop. The picture on my desktop does not incorporate the change. Am I missing a step? What do I need to do to export it with this change? I actually watched a YouTube video on this and could not see what I was not doing.
    You must have selected the wrong option in the Export dialog box. Under "File Settings", you need to select JPG and not "Original". Of course, you probably need to do some additional viewing of videos (or some reading) to learn that most people's workflow does not automatically include a "Save" or "Export" after editing the photo. It's not a necessary part of Lightroom's workflow, unless you need the photo for some non-Lightroom activity.
    Question 2: I just installed Lightroom and am trying to import my 12k-strong photo collection. The Import button pulls in about 2k and then cannot find any more. The photos are stored in folders by date within a master folder. I am selecting the master folder. I can go in and import the subfolders individually; however, I do not want to do that 200 times. There is no apparent way to go into the subfolder level and select more than one folder.
    In the Import dialog box, on the left, under "Source", there is a checkbox that says "Include SubFolders". Make sure this is checked.
    Seriously, you need to spend some time reading introductory material about LR because Lightroom does not work like any other photographic software you might have used in the past. You are handling it as if it was no different than standard photo editing software, and you are going to be frustrated if that is your mindset. See the videos at adobe.tv and read this: http://www.flickr.com/groups/adobe_lightroom/discuss/72157603590978170/

  • I am using iPhoto '11 ver 9.4.3 on a Mac with OS X 10.8.5 and want to export calendar projects to an external hard drive. What is the easiest way to do this? I have tried export and import but it didn't seem to work.

    I am using iPhoto '11 ver 9.4.3 on a Mac with OS X 10.8.5 and want to export calendar projects to an external hard drive. My goal is to store them on an external hard drive so they don't use up space on the Mac hard drive. Is it possible to copy the specific projects without copying the entire library? What is the easiest way to do this? I have tried export and import but it didn't seem to work.

    What do you not understand?
    You can duplicate the iPhoto library (command - D), delete everything except the project and its photos from the copy, and move that.
    However, the calendar takes very little space - it is simply database entries - it is the photos in the calendar that take the space, and for most people you would want to keep those photos in your library.
    You can use a photo in 50 calendars and it is still only one photo in your library. As I explained, calendars do not exist as such - they are simply database entries telling iPhoto how to display the calendar - they take almost no space at all.
    LN

  • Require help in understanding exporting and importing statistics.

    Hi all,
    I am a bit new to statistics.
    Can anyone please explain these commands to me in detail?
    1) exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'MRP' , tabname => 'MRP_ATP_DETAILS_TEMP', estimate_percent => 100 ,cascade => TRUE);
    2) exec DBMS_STATS.CREATE_STAT_TABLE ( ownname => 'MRP', stattab => 'MRP_ATP_3');
    3) exec DBMS_STATS.EXPORT_TABLE_STATS ( ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP',statid => 'MRP27jan14');
    4) exec DBMS_STATS.IMPORT_TABLE_STATS ( ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP');
    I understand that these commands are used to export and import table statistics.
    But could anyone please help me understand them in detail?
    Thanks in advance.
    Regards,
    Shiva.

    Shiva,
    Please post the details of the application release, database version and OS.
    Please see (FAQ: Statistics Gathering Frequently Asked Questions (Doc ID 1501712.1) -- What is the difference between DBMS_STATS and FND_STATS).
    For exporting/importing statistics, a summary of the FND_STATS subprograms can be found in (Doc ID 122371.1):
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_details?c_name=FND_STATS&c_owner=APPS&c_type=PACKAGE&c_detail_type=source
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_details?c_name=FND_STATS&c_owner=APPS&c_type=PACKAGE%20BODY&c_detail_type=source
    Thanks,
    Hussein
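    As a brief annotated recap of the four calls from the question (object names as posted; the comments describe standard DBMS_STATS behavior):
    -- 1) Gather fresh optimizer statistics for MRP.MRP_ATP_DETAILS_TEMP,
    --    sampling 100% of the rows and cascading to the table's indexes
    exec DBMS_STATS.GATHER_TABLE_STATS (ownname => 'MRP', tabname => 'MRP_ATP_DETAILS_TEMP', estimate_percent => 100, cascade => TRUE);
    -- 2) Create a user table MRP.MRP_ATP_3 that can hold saved statistics
    exec DBMS_STATS.CREATE_STAT_TABLE (ownname => 'MRP', stattab => 'MRP_ATP_3');
    -- 3) Copy the table's current dictionary statistics into MRP_ATP_3,
    --    tagged with statid MRP27jan14 so several snapshots can coexist
    exec DBMS_STATS.EXPORT_TABLE_STATS (ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP', statid => 'MRP27jan14');
    -- 4) Restore the saved statistics from MRP_ATP_3 back into the dictionary,
    --    e.g. after a regather makes performance worse
    exec DBMS_STATS.IMPORT_TABLE_STATS (ownname => 'MRP', stattab => 'MRP_ATP_3', tabname => 'MRP_ATP_DETAILS_TEMP');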
