Export and Import of Layout Set / Readonly Error

Hi,
I tried to export a layout set.
I chose "Export" from the actions context menu
and selected "Include related content...".
I downloaded the file as a .configarchive.
In the target system, I chose
"Import" from the actions context menu.
But then I receive the error message
"Error while trying to process the import: Cannot run an import in the current configuration status "ReadOnly".
But I am an admin user.

Hi Holger,
if someone chooses "Export" from the actions menu, the system changes to "ReadOnly" mode so that no changes can be made while the export is being created.
Maybe export mode has also been started in your target system? You would need to go to Actions -> Export -> Cancel.
Hope this helps,
Robert
PS: If not, a restart will definitely do the trick.

Similar Messages

  • Exporting and Importing Large data set of approx 300,000 rows

    Hi,
    I have a table in db1 (approx. 10 columns) and want to copy all the rows (approx. 300,000) from this table, with a simple WHERE clause, to the same table in db2. Both databases are on Unix.
    I am executing this from a laptop with Windows XP, remotely connected to my office network.
    Could someone let me know the best way to do this?
    Thanks

    331991 wrote:
    Thanks for the detailed instructions. I however have some limitations:
    1. Both schemas (from and to) are on the same database server. I am the schema owner for both the schemas.
    A logical impossibility in Oracle. A schema is defined by its owner, therefore you cannot have two schemas owned by the same owner. SCHEMA1 is, by definition, owned by user SCHEMA1, and SCHEMA2 is, by definition, owned by user SCHEMA2.
    ...but I don't have rights to create directories on the server.
    Then there is no value in creating a directory object in the database. The database's directory object is nothing but an alias referring to an actual, existing directory on the host server.
    2. I don't want to copy all data from db1 to db2. I want to copy data where id > 5000 in the id column. Also, the table in db1 has 10 columns and the table in db2 has 15 columns: the 10 from db1 with the same data types, plus 5 more.
    How can I export the data to my laptop from db1 and then import it into db2 in this scenario?
    You don't. Create a database link and copy the data directly from the source to the target:
    INSERT INTO TARGET_TABLE
      SELECT COLA, COLB, COLC, NULL, NULL, NULL, SYSDATE
      FROM SOURCE_TABLE@SOURCE_LINK
      WHERE ID > 5000;
    Also, I am using Oracle 10g.
    Thanks
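    A minimal sketch of the database link the reply assumes (link name, credentials and TNS alias are placeholders; when the target has more columns than you select, list the target columns explicitly in the INSERT):
    CREATE DATABASE LINK source_link
      CONNECT TO source_user IDENTIFIED BY source_password
      USING 'DB1_TNS_ALIAS';
    -- quick sanity check that the link resolves before running the INSERT
    SELECT COUNT(*) FROM source_table@source_link WHERE id > 5000;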

  • Model Export and Import - Error 40144

    I'm trying to transport a model from one system to another and I get the following error when I try the Model export and import functionality in IBP 4.0
    "errorCode":"40144","errorMsg":"Repository: delivery unit not found; deliveryUnit: 'SOP_MODEL_EXPORT_DU', vendor:sap.com'"
    Steps:
    1. Model Export -> Create Export
    2. Once the Export is created, I clicked 'Download Deployment Unit' and that's when I get the above error.
    Am I doing anything wrong? If so, what is the procedure to execute the model export and import between systems?

    It turns out a transportation path has to be set up (one time) from the backend (by SAP) to enable this feature.

  • Exporting and Importing Destination Controls Error

    When I export the table within Destination Controls and import it to another IronPort, again within Destination Controls, I receive the error "Wrong format of the destination config file: ip_sort_pref is required for the global settings."
    We are updating AsyncOS on each of our de-clustered IronPorts, so when one is not in use we need the other to handle all the traffic, including domain controls.
    The output displays
    [ABCDE.com]
    max_host_concurrency = 500
    limit_apply = system
    limit_type = host
    max_messages_per_connection = 50 ......
    How can I import this without losing or altering the data within?
    Solution
    We upgraded one of our SMTP appliances to AsyncOS 8.0.1, which creates an additional line in the default section of the upgraded IronPort's exported destination control table that is named... wait for it... ip_sort_pref. Just add it.

  • Error when exporting and importing TS

    Hi, I am running SCCM 2012 SP1 and am trying to export a Task Sequence including the Windows 7 WIM, the boot WIM, some applications and drivers. When I exported the TS, I selected to export dependencies and content.
    I have set up a dev SCCM environment (same spec as production) and am trying to import the .zip that was created when I completed the export. When doing the import I am getting the error below on both the Windows 7 WIM and the boot WIM:
    Error Imported Operating System Image (1):
    Create new Windows 7 Enterprise x64
    Generic failure
    instance of SMS_ExtendedStatus
     Description = "Failed to get the image property from the source WIM file due
    to error 80070003".
     ErrorCode= 2147942403;
     File="e:\\nts_sccm_release\\sms\\siteserver\\sdk_provider\\smsprov\
    \sspimagepackage.cpp";
     Line = 483;
     Operation = "Putinstance";
     ParameterInfo = "";
     ProviderName = "ExtnProv";
     StatusCode = 2147749889;
    The applications and drivers have been imported into the Dev SCCM server; however, I cannot find the actual content, so I am not sure this has worked properly. I am wondering if I would be better off using the Migration wizard to take the TS and
    content across. I would appreciate any advice on either a) how to fix the issues with importing the TS, or b) whether I should just be using the migration wizard.
    Thanks
    G

    The applications and drivers have been imported into the Dev SCCM server however I cannot find the actual content so I am not sure this has worked properly. 
    When you exported, did you choose "Export all content for selected task sequence and dependencies"?
    80070003 = The system cannot find the path specified.
    See here for exporting and importing task sequences
    http://blogs.technet.com/b/inside_osd/archive/2012/01/19/configuration-manager-2012-task-sequence-export-and-import.aspx
    Gerry Hampson | Blog: www.gerryhampsoncm.blogspot.ie | LinkedIn: Gerry Hampson | Twitter: @gerryhampson
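    For reference, the decimal ErrorCode in the log above is the same value as the hex code quoted here; it is the Win32 "path not found" error wrapped in an HRESULT:
    2147942403 (decimal) = 0x80070003 (hex); the low 16 bits, 0x0003, are Win32 error 3 = ERROR_PATH_NOT_FOUND ("The system cannot find the path specified").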

  • Export and Import of Configuration Data

    Hi All,
    I have created a number of custom properties, structures, groups, renderers, layout sets etc and would like to move them from Dev to Test system.
    I have found the following in the help doco:
    "Export and Import of Configuration Data"
    http://help.sap.com/saphelp_nw04/helpdata/en/e1/029c414c79b25fe10000000a1550b0/content.htm
    However I am unable to see its functionality in our portal systems (namely the Actions->Export and Actions->Import functions).
    Our version of Portal is: 6.0.9.6.0
    KM: 6.0.9.3.0 (NW04 SPS09 Patch3)
    Therefore we should be able to see it, no?
    Are there any special settings to see these buttons?
    Cheers,
    Vic

    Thanks for your replies.
    OK, so in other words, it is not available in SP09 and only in SP12+.
    So, to "transport" this information I will have to manually reapply these settings in the Test system!
    Message was edited by: Victor Yeoh

  • Export and Import of mappings/process flows etc

    Hi,
    We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD, and in fact I suspect we have a corrupt repository as a result. I plan to split the repository into 3 separate databases so that we have a design repository each for DEV, UAT and PROD. To control code migrations between these, I plan to use metadata export and subsequent import into UAT, and then into PROD once tested. I have used this successfully before on a project, but am worried about inherent bugs with metadata export/import (I have been bitten before with Oracle Portal). So can anyone advise what pitfalls there may be with this approach, and in particular whether anyone has experienced loss of metadata between export and import? We have a complex warehouse with hundreds of mappings, process flows, SQL*Loader flat-file loads etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
    Thanks for any comments,
    Brandon

    This should do the trick for you. As it looks for "PARALLEL", it only removes the APPEND PARALLEL hint and leaves other hints as they are...
    # set current location
    set path "C:/TMP"
    # Project parameters
    set root "/MY_PROJECT"
    set one_Module "MY_MODULE"
    set object "MAPPINGS"
    # OMBPLUS and Tcl related parameters
    set action "remove_parallel"
    set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
    set timestamp [clock format [clock seconds] -format %Y%m%d-%H:%M:%S]
    set ext ".log"
    set sep "_"
    set ombplus "OMBplus"
    set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
    set OMBLOG $omblogname
    set logname $path/$one_Module$sep$object$sep$datetime$ext
    set log_file [open $logname w]
    set word "PARALLEL"
    set i 0
    # Connect to the OWB repository
    OMBCONNECT .... your connect string
    # Ignore errors that occur in any command that is part of the script and move on to the next command
    set OMBCONTINUE_ON_ERROR ON
    OMBCC "'$root/$one_Module'"
    # Search all mappings for PARALLEL in the loading/extraction hints of their operators
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
    puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Searching for Loading/Extraction Operators set at Parallel"
    puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
    foreach mapName [OMBLIST MAPPINGS] {
        # TABLE, DIMENSION, CUBE and VIEW operators all carry the two hint properties
        foreach opType {TABLE DIMENSION CUBE VIEW} {
            foreach opName [OMBRETRIEVE MAPPING '$mapName' GET $opType OPERATORS] {
                foreach propName {LOADING_HINT EXTRACTION_HINT} {
                    foreach prop [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES ($propName)] {
                        if { [regexp $word $prop] == 1 } {
                            incr i
                            puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Operator: $opName, $propName: $prop"
                            puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Mapping: $mapName, Operator: $opName, $propName: $prop"
                            # Clear the hint and commit the change
                            OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES ($propName) VALUES ('')
                            OMBCOMMIT
                        }
                    }
                }
            }
        }
    }
    if { $i == 0 } {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Not found any Loading/Extraction Operators set at Parallel"
    } else {
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Module: $one_Module, Object_type: Mapping"
        puts $log_file "[clock format [clock seconds] -format "%Y%m%d %H:%M:%S"] --> Fixed $i Loading/Extraction Operators set at Parallel"
    }
    close $log_file
    Enjoy!
    Michel

  • Export and Import of Access DB tables in Apex

    Hi,
    I need help.
    I want to export an MS Access DB table into an Oracle DB 10gR2,
    but every time I get an error because my ODBC driver isn't compatible with my TNS service name.
    What could I do?
    I only want to export a table, not the whole database.
    If I export and import a table using an XE version, there are no errors and I can see the table in APEX.
    ...thanks for the help :)

    Hi Timo,
    If you open your table in Access, and click the "select all" (the empty grey box at the left of the header row), you can copy the entire table and paste it into Apex as though you were copying from Excel - the structure of the copied data is the same. The only thing is that you will be limited to 32,000 characters.
    Obviously, though, getting hold of, and setting up, the correct ODBC drivers is the more proper solution but the above may solve your immediate problem.
    Regards
    Andy

  • Export and import XMLType table

    Hi ,
    I want to export one table which contains an XMLType column from Oracle 11.2.0.1.0 and import it into version 11.2.0.2.0.
    I got the following error when I exported the table with the exp utility:
    EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
    Then I tried Data Pump export and import. The export works; the following is the log:
    Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ;;; Legacy Mode Active due to the following parameters:
    ;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
    ;;; Legacy Mode has set reuse_dumpfiles=true parameter.
    Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 13.23 GB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
    Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
    E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
    Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
    I got an error when I ran the import with the following command:
    impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
    The error is:
    KUP-11007: conversion error loading table "CC_DBA"."XXXX"
    ORA-01403: no data found
    ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
    Please help me to get solution for this.

    CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
    The above index was created by us because we are storing files like:
    <?xml version="1.0" encoding="UTF-8"?>
    <xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
    <context id="Company_Current_ForPeriod"> ...
    I also tried the Data Pump export with and without DATA_OPTIONS=XML_CLOBS. Both times the export succeeded, but the import failed with the same KUP-11007 conversion error loading table "tab_owner"."bbbbb_XBRL".
    I tried the import in different ways:
    1. Create the table first, then import data only (CONTENT=DATA_ONLY)
    2. Import the whole table
    In both cases the table DDL is created successfully, but the import fails on the data. The following is the log from the import:
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
    Starting "aaaaa"."SYS_IMPORT_TABLE_02":  aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
    bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
    ORA-01403: no data found
    ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-26062: Can not continue from previous errors.
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26

  • Export and import problem

    Hi, I have a test database 9i (9.2.0.1) on Windows XP. I want to export a table from the test database to a development database which is 9i (9.2.0.6) on IBM AIX 5.2. Until now I was able to export and import, but suddenly I am getting a 'TABLESPACE <tablespace name> doesn't exist' error.
    The following is what I am getting:
    D:\>imp file=news.dmp fromuser='COUNTER' touser='IBT' tables='NEWS_MASTER'
    Import: Release 9.2.0.1.0 - Production on Sat Oct 6 17:06:00 2007
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Username: system@dev
    Password:
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
    With the Partitioning, Real Application Clusters, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.6.0 - Production
    Export file created by EXPORT:V09.02.00 via conventional path
    Warning: the objects were exported by COUNTER, not by you
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    import server uses US7ASCII character set (possible charset conversion)
    . importing COUNTER's objects into IBT
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE TABLE "NEWS_MASTER" ("NEWS_DATE" DATE NOT NULL ENABLE, "SUBJECT" VAR"
    "CHAR2(75) NOT NULL ENABLE, "DESCRIPTION" VARCHAR2(300) NOT NULL ENABLE, "PH"
    "OTO" BLOB, "REG_NO" NUMBER(5, 0), "SYSTEM_IP" VARCHAR2(15) NOT NULL ENABLE)"
    " PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREEL"
    "ISTS 1 FREELIST GROUPS 1) TABLESPACE "TESTTBS" LOGGING NOCOMPRESS LOB ("PHO"
    "TO") STORE AS (TABLESPACE "TESTTBS" ENABLE STORAGE IN ROW CHUNK 8192 PCTVE"
    "RSION 10 NOCACHE STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1))"
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'TESTTBS' does not exist
    Import terminated successfully with warnings.
    Please help me: I want to know what actually happened, and what the solution may be.
    Thanks in advance

    The tablespace TESTTBS doesn't exist in the database into which you are importing.
    Either create the tablespace in the target database, or
    use the show=y option while importing to get the table structure, create the table manually in the target database (in an existing tablespace), then import with ignore=y.
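    A minimal sketch of the first option, creating the missing tablespace in the target database (the datafile path and sizes are placeholders for your environment):
    CREATE TABLESPACE TESTTBS
      DATAFILE '/u01/oradata/dev/testtbs01.dbf' SIZE 200M
      AUTOEXTEND ON NEXT 50M MAXSIZE UNLIMITED;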

  • Client Copy- releases of the export and import systems have to be identical

    Hi All
    I created a new client using the remote client export/import method (on the same system). The reason is that I need a 2nd test client on the same system with a smaller data set for other testing purposes, as well as needing to re-import the same testing client after a future database restore from a different backup wipes it out, while still needing the same small data set for testing afterwards.
    Now that the 2nd client's data set has been used, I've been asked for a 3rd test client, the same as the 2nd originally was. (No problem, I figured.) So I created the 3rd test client (SCC4) and tried importing the same transport files that created the 2nd test client.
    But I'm receiving the error message "Releases of the export and import systems have to be identical".
    I am on the same system the original client export was made from, so the SAP release is the same.
    The only other thing I can think of is that we have imported some customizing transports from the Dev system into the original system since the export was taken.
    Can anyone help? Is there any way I can get this back?
    Ken

    Solution found: Note 1291394 - ERROR: TA194 when performing multiple imports in the same client.
    I've implemented the note and run the report as per the instructions, and I can happily say the import is now running, although it does seem to be taking a lot longer than it did for the original import.
    Ken

  • Portal Export and Import Steps

    Hi all,
    I need to export all page groups from a Production portal instance and import them into a brand new portal instance (target Portal). The source and target portal instances are as follows:
    Production Oracle Database Version: 10.1.0.5
    Target Oracle Database Version: 10.2.0.4
    Production Oracle Infrastructure Version : 10.1.2.0.2
    Target Oracle Infrastructure Version: 10.1.2.0.2
    Production Portal Version: 10.1.2.0.2
    Target Portal Version: 10.1.2.0.2
    What i have done till now is as follows
    1. Install Database server 10.2.0.1
    2. Upgrade Database Server to 10.2.0.4
    3. Install 10.2.0.1 products from Companion CD (ultra search Schemas)
    4. Create Metadata Repository using Repository Creation assistance
    5. Install IAS infrastructure (10.1.2.0.2)
    6. Install IAS Middle Tier (10.1.2.0.2)
    7. Create Transport sets in production instance
    Please tell me what I am supposed to do next.
    I am following the link below for reference but am not able to follow all of it:
    http://www.scott-tiger.dk/portalHelp2/ohw/state?navSetId=_&navId=0
    Regards
    Anindya

    For Portal Export Import (transport), the following are the foremost links on Metalink/ My Oracle Support.
    Note 306785.1 Overview of the Portal Export-Import Process
    Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues
    Note 333867.1 Portal Export and Import Utility Supportability Scenarios
    Regarding your question, please do go through all of these notes.
    - The Overview note will tell you how to prepare your instances/environments for portal transport; you will see how to ensure they are ready and which patches are needed.
    - The Master note will tell you which patches are needed, how to get them, which diagnostic tools can help you check and clean your portal repositories, and which common errors you may get and how to fix them.
    - The Supportability Scenarios note tells you what you can actually transport and what you cannot, i.e., what is supported and what is not. Do read it for your case.
    I have seen several Portal Admins who keep banging their heads with problems they create in their systems because they do not read this note and try to force a case which does not follow a supportable scenario.
    hope that helps!
    AMN

  • SQL Developer 2.1: Problem exporting and importing unit tests

    Hi,
    I have created several unit tests on functions that are within packages. I wanted to export these from one unit test repository into another repository on a different database. The export and import work fine, but when running the tests on the imported version, there are lots of ORA-06550 errors. When debugging this, the function name is missing in the call, i.e. it is attempting <SCHEMA>.<PACKAGE> (parameters) instead of <SCHEMA>.<PACKAGE>.<FUNCTION> (parameters).
    Looking in the unit test repository itself, it appears that the OBJECT_CALL column in the UT_TEST table is null - if I populate this with the name of the function, then everything works fine. Therefore, this seems to be a bug with export and import, and it is not including this in the XML. The same problem happens whether I export a single unit test or a suite of tests. Can you please confirm whether this is a bug or whether I am doing something wrong?
    Thanks,
    Pierre.

    Hi Pierre,
    Thanks for pointing this out. Unfortunately, it is a bug on our side and you have found the (ugly) "work-around".
    Bug 9236694 - 2.1: OTN: UT_TEST.OBJECT_CALL COLUMN NOT EXPORTED/IMPORTED
    Brian Jeffries
    SQL Developer Team
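    A minimal sketch of the manual work-around described above, run against the unit test repository schema (the test-name column and both values are assumptions; adapt them to your repository before running):
    -- Fill in the missing function name for an imported test
    UPDATE ut_test
       SET object_call = 'MY_FUNCTION'                -- hypothetical function name
     WHERE object_call IS NULL
       AND name = 'TEST_MY_PACKAGE_MY_FUNCTION';      -- hypothetical unit test name
    COMMIT;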

  • Language Problem while exporting and importing data

    hi,
    I have Oracle version 8.1.7.0.0 installed on one server and 9.2.0.1.0 installed on new server.
    I'm copying and pasting my version info from SQL*Plus:
    SQL*Plus: Release 8.1.7.0.0 - Production on Mon Aug 22 10:46:31 2005
    (c) Copyright 2000 Oracle Corporation. All rights reserved.
    Connected to:
    Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
    With the Partitioning option
    JServer Release 8.1.7.0.0 - Production
    SQL>
    SQL*Plus: Release 9.2.0.1.0 - Production on Mon Aug 22 12:30:06 2005
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Connected to:
    Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.1.0 - Production
    SQL>
    I created a new user on my new server from Enterprise Manager,
    exported the user from the old server, and imported it into the new server.
    i.e: from Oracle8i Enterprise Edition Release 8.1.7.0.0, I did
    c:\>exp system/manager file=abc.dmp owner=abc
    Then on the new server Release 9.2.0.1.0, I did
    c:\>imp system/manager file=abc.dmp fromuser=abc touser=abc
    I'm using the Arabic language on both servers. The NLS_LANG parameter on both servers is AMERICAN_AMERICA.WE8MSWIN1252.
    On both servers I'm able to insert and select data in Arabic.
    However, after I export the data from the old server and import it into the new server, the Arabic data comes up as question marks.
    If I create a new table and insert Arabic data in the new server's user abc, it displays fine. Only the data which I exported and imported is not showing Arabic.
    On both old and new servers operating system is Windows XP.
    I'm stuck with this problem. Anybody having any idea about how to solve this problem please help.
    Thank you all in advance.
    Regards

    Let me be clear here. Storing Arabic data in a WE8MSWIN1252 database is not supported by Oracle and will lead to problems. You are incorrectly using the NLS_LANG to prevent proper conversion, and your data only appears to be okay when you use utilities like SQL*Plus to view it. When you write applications that don't rely on the NLS_LANG, like the JDBC thin driver for instance, you will realize your data is in fact invalid. To learn more about the NLS_LANG you can take a look at this FAQ: http://www.oracle.com/technology/tech/globalization/htdocs/nls_lang%20faq.htm
    To migrate your database to a proper character set you can refer to this paper:
    http://www.oracle.com/technology/tech/globalization/pdf/mwp.pdf
    But please do not ask for help in supporting your current configuration in this forum.
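    A hedged diagnostic sketch of the checks the reply implies (the table and column names are placeholders): confirm the declared database character set, then look at the bytes actually stored.
    -- Declared character sets of the database
    SELECT parameter, value
    FROM   nls_database_parameters
    WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
    -- Hex dump of a few stored values; pass-through garbage will not match the declared character set
    SELECT some_column, DUMP(some_column, 1016) AS stored_bytes
    FROM   abc.some_table
    WHERE  ROWNUM <= 5;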

  • Exporting and Importing of Schema

    The Export and Import of Schema is not working consistently in SP4.
    When I create a new repository from the schema, all the subtables get created. The main table has only one field in it. It reads Category.
    The other fields in the main table are missing and are not imported during the Import Schema operation.
    Has anyone encountered the same issue?
    If so, please let me know how you overcame this erroneous behaviour.

    Hi Adhappan
    I have tried importing without success. The import is not consistent. The same schema creates different sets of tables and fields each time. Sometimes, if you re-import, some more tables which were not set up the first time get selected for import.
    I did a simple exercise to test the import / export functionality.
    1. I unarchived a repository from standard content provided by SAP
    2. I exported the schema
    3. I tried importing the schema without any modifications to schema
    4. The results were highly inconsistent.
    Arvind
    Pls reward useful answers
