Transporting GeoRaster Data Using Data Pump

Hi all,
I'm trying to use Data Pump to transport my schema from one database to another. However, I'm having a problem with my GeoRaster data. The relevant tables are:
MAP
Name                Null?    Type
ID                           NUMBER(38)
GEOR                         SDO_GEORASTER

MAP_RDT
Name                Null?    Type
RASTERID            NOT NULL NUMBER
PYRAMIDLEVEL        NOT NULL NUMBER
BANDBLOCKNUMBER     NOT NULL NUMBER
ROWBLOCKNUMBER      NOT NULL NUMBER
COLUMNBLOCKNUMBER   NOT NULL NUMBER
BLOCKMBR                     MDSYS.SDO_GEOMETRY
RASTERBLOCK                  BLOB

SCHOOL
Name                Null?    Type
CODE                NOT NULL NUMBER(38)
ENG_NAME                     VARCHAR2(200)
ADDRESS                      VARCHAR2(200)
POSTAL                       VARCHAR2(200)
TELEPHONE                    VARCHAR2(200)
FAX                          VARCHAR2(200)
EMAIL                        VARCHAR2(200)
STATUS                       VARCHAR2(200)
ZONE_ID                      NUMBER(38)
GEOM                         SDO_GEOMETRY
MG_ID                        NUMBER(38)
The other tables just contain ordinary data.
Export statement:
- expdp schools/** schemas=schools directory=exp_dir dumpfile=schools.dmp
Import statements:
- impdp schools/** schemas=schools directory=imp_dir dumpfile=schools.dmp parfile=exclude.par
exclude.par content:
exclude=trigger:"like 'GRDMLTR_%'"
exclude=table:"IN('MAP','MAP_RDT')"
- impdp schools/** schemas=schools directory=imp_dir dumpfile=schools.dmp parfile=include.par content=metadata_only
include.par content:
include=table:"IN('MAP','MAP_RDT')"
- impdp schools/** schemas=schools directory=imp_dir dumpfile=schools.dmp parfile=include.par content=data_only
After the import, I validated the GeoRaster object:
select sdo_geor.validategeoraster(geor) from map;
SDO_GEOR.VALIDATEGEORASTER(GEOR)
13442: RDT=MAP_RDT, RID=1
So there is a problem with the GeoRaster object, but I don't quite understand the information I found online.
Also, after the import the types of the spatial columns changed to:
MAP
Name                Null?    Type
ID                           NUMBER(38)
GEOR                         PUBLIC.SDO_GEORASTER

SCHOOL
Name                Null?    Type
CODE                NOT NULL NUMBER(38)
ENG_NAME                     VARCHAR2(200)
ADDRESS                      VARCHAR2(200)
POSTAL                       VARCHAR2(200)
TELEPHONE                    VARCHAR2(200)
FAX                          VARCHAR2(200)
EMAIL                        VARCHAR2(200)
STATUS                       VARCHAR2(200)
ZONE_ID                      NUMBER(38)
GEOM                         PUBLIC.SDO_GEOMETRY
MG_ID                        NUMBER(38)
I also have a problem running my application, which renders the GeoRaster together with a theme created on the SCHOOL table. The error messages are below:
In the mapviewer console
2008-07-29 16:46:26.773 ERROR cannot find entry in ALL_SDO_GEOM_METADATA table for theme: SCHOOLS_LOCATION
2008-07-29 16:46:26.773 ERROR using query:select SRID from ALL_SDO_GEOM_METADATA where TABLE_NAME=:1 and (COLUMN_NAME=:2 OR COLUMN_NAME=:3 or COLUMN_NAME=:4) and OWNER=:5
2008-07-29 16:46:26.774 ERROR Exhausted Resultset
2008-07-29 16:46:26.783 ERROR cannot find entry in USER_SDO_GEOM_METADATA table for theme: SCHOOLS_LOCATION
2008-07-29 16:46:26.784 ERROR Exhausted Resultset
I sincerely hope someone can provide me with some guidance. Thanks in advance =)

This is probably almost a year too late, but I had the same problem and was able to resolve it by registering the recently imported GeoRaster data:
SQL> select * from table(sdo_geor_admin.listGeoRasterObjects);
This will likely show that you have 'unregistered' data, and the following will register it, after which your validation calls should all return TRUE instead of 13442:
SQL> execute sdo_geor_admin.registerGeoRasterObjects;
SORRY -- I just saw that you'd already done the registerGeoRasterObjects bit. msm
Edited by: mmillman on May 27, 2009 5:11 PM
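For anyone hitting the same pair of symptoms, the two repairs discussed in this thread can be sketched together. This is only a sketch: the DIMINFO bounds, tolerance and SRID below are placeholders, not real values for this dataset, and must be replaced with the extents of your own layer (the export does not carry USER_SDO_GEOM_METADATA across automatically in this scenario).

```sql
-- 1. Re-register GeoRaster objects that arrived without their DML triggers,
--    which clears the 13442 validation error.
EXECUTE sdo_geor_admin.registerGeoRasterObjects;

-- 2. The MapViewer "cannot find entry in USER_SDO_GEOM_METADATA" errors mean
--    the spatial metadata row is missing; re-create it for SCHOOL.GEOM.
--    Bounds, tolerance and SRID here are hypothetical placeholders.
INSERT INTO user_sdo_geom_metadata (table_name, column_name, diminfo, srid)
VALUES ('SCHOOL', 'GEOM',
        sdo_dim_array(
          sdo_dim_element('X', 0, 100000, 0.005),
          sdo_dim_element('Y', 0, 100000, 0.005)),
        NULL);
COMMIT;
```

A matching row would also be needed for any other spatial column the themes reference.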

Similar Messages

  • Accidentally dropped data using Data Pump's replace option

    All,
    I exported 2 partitions of a table using the following data pump command:
    expdp user1/password1 directory=DPUMP_DIR tables=tab1:P27,tab1:P28 logfile=tab1.log dumpfile=tab1_27_28.dmp
    I moved the dump file to staging and tried importing using the following command:
    impdp user1/password1 directory=DPUMP_DIR tables=tab1:P27,tab1:P28 logfile=tab1.log dumpfile=tab1_27_28.dmp table_exists_action=APPEND
    when I ran the command above, I got the following error:
    ORA-31696: unable to export/import TABLE_DATA:"VOP"."R_REQUEST":"P28"."P28_RN" using client specified AUT
    I checked this on the Oracle Support website and got the following advice:
    =====================================================
    +*<Moderator edit - deleted MOS Doc content - please do not post contents of MOS Docs - this is a violation of your Support agreement>*+
    =================================================================================
    I then proceeded to use table_exists_action=replace (thinking it was only going to replace the 2 partitions specified).
    Using table_exists_action=replace deleted rows from all other partitions and loaded only the 2 partitions which I had exported.
    I have tried retrieving the lost data using the command below, but it didn't work:
    SQL> flashback table user1.tab1 to timestamp (systimestamp - interval '120' minute);
    I've enabled row movement and we've always had flashback enabled:
    db_flashback_retention_target integer 1440
    Does anyone know how else I could retrieve the lost data without having to export and import the entire table (which is almost 1 terabyte)?

    Is your database in archivelog mode? If yes, you can try doing point-in-time recovery (PITR).
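    If archivelog mode is on, one option is RMAN tablespace point-in-time recovery, which restores only the tablespace containing the damaged table rather than the whole database. A minimal sketch, assuming a hypothetical tablespace name, timestamp and auxiliary destination:

    ```
    # Run inside RMAN, not SQL*Plus. Recovers the USERS tablespace (hypothetical)
    # to just before the destructive import; RMAN creates the auxiliary instance
    # automatically under the given destination.
    RECOVER TABLESPACE users
      UNTIL TIME "TO_DATE('2010-08-31 13:00:00','YYYY-MM-DD HH24:MI:SS')"
      AUXILIARY DESTINATION '/u01/aux';
    ```

    Note that after TSPITR the tablespace must be backed up again, and any objects created in it after the target time are lost.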

  • Error importing data using Data Pump

    Hi all,
    I just installed Oracle 10gR2 on my local machine and tried to import data into a 9iR2 database, but it ended with errors. Here are the steps I followed.

    SQL> conn sys/ as sysdba
    Enter password:
    Connected.
    SQL> crete user amila identified by a;
    SP2-0734: unknown command beginning "crete user..." - rest of line ignored.
    SQL> create user amila identified by a;
    User created.
    SQL> grant connect,resource,create any directory to amila;
    Grant succeeded.
    SQL> create or replace directory dump_dir as 'E:\oracle\dump';
    Directory created.
    SQL> grant read,write on directory dump_dir to amila;
    Grant succeeded.
    -==========================================================
    C:\>impdp amila5/a@cd01 directory=dump_dir dumpfile=amila5.dmp logfile=amila5.log
    Import: Release 10.2.0.1.0 - Production on Friday, 19 October, 2007 13:27:17
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.1.0 - Production
    UDI-00008: operation generated ORACLE error 6550
    ORA-06550: line 1, column 52:
    PLS-00201: identifier 'SYS.DBMS_DATAPUMP' must be declared
    ORA-06550: line 1, column 52:
    PL/SQL: Statement ignored
    Please help with this issue.

  • ORA-39070 error when using Data Pump and writing to ASM storage

    I am able to export data using Data Pump when I write to a file system. However, when I try to write to ASM storage, I get the following errors.
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 536
    ORA-29283: invalid file operation
    Below are the steps I took.
    create or replace directory jp_dir2 as '+DATA/DEV01/exp_dir';
    grant read,write on directory jp_dir2 to jpark;
    expdp username/password schemas=test directory=jp_dir2 dumpfile=test.dmp logfile=test.log
    Edited by: user564785 on Aug 25, 2011 6:49 AM

    google: expdp ASM
    first hit:
    http://asanga-pradeep.blogspot.com/2010/08/expdp-and-impdp-with-asm.html
    "Log files created during expdp cannot be stored inside ASM, for log files a directory object that uses OS file system location must be given. If not following error will be thrown
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 536
    ORA-29283: invalid file operation
    "

  • Error importing table data using Oracle Data Pump

    I am trying to import table data using the ORACLE_DATAPUMP access driver:
    CREATE TABLE emp_xt (
      ID NUMBER,
      NAME VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY backup
      LOCATION ('a.dmp')
    );
    It returns the following error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04084: The ORACLE_DATAPUMP access driver does not support the ROWID column.
    ORA-06512: at "SYS.ORACLE_DATAPUMP", line 19
    Please help me.

    The .dmp file was generated by the exp utility, not by the ORACLE_DATAPUMP access driver, so the driver cannot read it.
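    For completeness, a dump file the ORACLE_DATAPUMP driver can read has to be produced by the driver itself (or by expdp). A minimal sketch, assuming a hypothetical source table emp in the same schema:

    ```sql
    -- Unload emp (hypothetical source table) through the ORACLE_DATAPUMP driver;
    -- the resulting a.dmp can then be read back by an external table like emp_xt.
    CREATE TABLE emp_unload
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY backup
      LOCATION ('a.dmp')
    )
    AS SELECT id, name FROM emp;
    ```

    Once a.dmp exists in the backup directory, the emp_xt definition from the question should work as written.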

  • Import Indexes only using datapump

    Hi Gurus,
    I have two schemas with identical table structures, but the row counts and data are different, so I want to import the indexes from one schema into the other. Can you please advise how I can do that using Data Pump?
    Thanks in advance.

    It doesn't matter what data is in the tables, as long as the columns the indexes are built on exist. The indexes get created and built using a normal
    create index ...
    statement. So, if the original table had 100 rows and the new table has 10, the index will be built using the 10 rows. All that gets stored in the dump file is the DDL (in XML format); no index data is stored in the dump file. (If the OP were using transportable tablespaces, that would be a different story.)
    Dean
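    One way to do this, sketched with hypothetical schema, directory and file names, is to pull only the index DDL out of the dump into a SQL script, which you can then review and run against the second schema:

    ```
    impdp user2/password directory=dp_dir dumpfile=schema1.dmp include=INDEX remap_schema=user1:user2 sqlfile=create_indexes.sql
    ```

    The sqlfile parameter makes impdp write the DDL to create_indexes.sql instead of executing it, so nothing in the target is touched until you run the script yourself.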

  • Error while taking a dump using Data Pump

    I am getting the following error:
    Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    2A68E610 14916 package body SYS.KUPW$WORKER
    2A68E610 6300 package body SYS.KUPW$WORKER
    2A68E610 9120 package body SYS.KUPW$WORKER
    2A68E610 1880 package body SYS.KUPW$WORKER
    2A68E610 6861 package body SYS.KUPW$WORKER
    2A68E610 1262 package body SYS.KUPW$WORKER
    255541A8 2 anonymous block
    Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
    The required action is to contact customer support. On Metalink I found a note stating this is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
    Some of the default schemas were purposely dropped from the database. The only default schemas available now are:
    DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
    DIP, OUTLN and TSMSYS were created again.
    Could this be the cause of the problem?
    Thanks in advance.

    Hi,
    Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be? Should it be DMSYS, since this account will be used by the system and not by me?
    CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
    GRANT ALTER SESSION TO "DMSYS"
    GRANT ALTER SYSTEM TO "DMSYS"
    GRANT CREATE JOB TO "DMSYS"
    GRANT CREATE LIBRARY TO "DMSYS"
    GRANT CREATE PROCEDURE TO "DMSYS"
    GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
    GRANT CREATE SEQUENCE TO "DMSYS"
    GRANT CREATE SESSION TO "DMSYS"
    GRANT CREATE SYNONYM TO "DMSYS"
    GRANT CREATE TABLE TO "DMSYS"
    GRANT CREATE TRIGGER TO "DMSYS"
    GRANT CREATE TYPE TO "DMSYS"
    GRANT CREATE VIEW TO "DMSYS"
    GRANT DROP PUBLIC SYNONYM TO "DMSYS"
    GRANT QUERY REWRITE TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
    GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
    The other database has DMSYS and its status is EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump. How is that possible?

  • Error while exporting using Data Pump

    Hi,
    When I try to export the database using Data Pump I get the following error.
    Details :
    DB version : 10.2.0.2
    OS : (HP-Unix)
    error :
    dbsrv:/u02/oradata2> 6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new <
    Export: Release 10.2.0.1.0 - 64bit Production on Sunday, 27 July, 2008 10:00:38
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "OCEANFNC"."JOB6": userid=oceanfnc/********@uatdb content=ALL full=Y job_name=job6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,1,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-04067: not executed, package body "DMSYS.DBMS_DM_MODEL_EXP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "DMSYS.DBMS_DM_MODEL_EXP"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    c0000002ae4f0a60 14916 package body SYS.KUPW$WORKER
    c0000002ae4f0a60 6300 package body SYS.KUPW$WORKER
    c0000002ae4f0a60 9120 package body SYS.KUPW$WORKER
    c0000002ae4f0a60 1880 package body SYS.KUPW$WORKER
    c0000002ae4f0a60 6861 package body SYS.KUPW$WORKER
    c0000002ae4f0a60 1262 package body SYS.KUPW$WORKER
    c000000264e48dc0 2 anonymous block
    Job "OCEANFNC"."JOB6" stopped due to fatal error at 10:00:41
    Kindly advise me regarding this!
    Thanks

    Hi,
    We found a Metalink note for this problem (Note ID 433022.1). Even after running the mentioned catalog.sql and catproc.sql in our DB, we are facing some problems. The error description:
    ORA-39127: unexpected error from call to export_string := EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(0,dynconnect,'10.02.00.01.00',newblock)
    ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5327
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    [the same ORA-39127/ORA-04067/ORA-06508/ORA-06512 block repeats for EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(1,dynconnect,'10.02.00.01.00',newblock)]
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    [the same four-line error block then repeats for EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp(...,0,1,'10.02.00.01.00',newblock) once for each schema: SYS, SYSTEM, OUTLN, TSMSYS, ANONYMOUS, OLAPSYS, SYSMAN, MDDATA, UATFNC, MGMT_VIEW, SCOTT, BRNSMS, FLEXML, FLEXBO, BRN000, FLEXRPT, BRN800, BRN900 and OCEANFNC, each ending at "SYS.DBMS_METADATA", line 5412]
    Kindly advise me regarding this!
    Thanks a lot for your reply!

  • Issue with importing data using data pump

    Hi Guys,
    Need your expertise here. I have just exported a table using the following Data Pump command. Please note that I used %U to split the export into chunks of 2G files.
    expdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110.log tables=(PT_CONTROL.pipeline_session) filesize=2G job_name=pt_ps_0831_1
    The above command produced the following files
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:04 PT_CONTROL_PS_083110_01.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:05 PT_CONTROL_PS_083110_02.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:06 PT_CONTROL_PS_083110_03.dmp
    -rw-r----- 1 oracle oinstall 394M Aug 31 15:06 PT_CONTROL_PS_083110_04.dmp
    -rw-r--r-- 1 oracle oinstall 2.1K Aug 31 15:06 PT_CONTROL_PS_083110.log
    So far things are good.
    Now, when I import the data using the command below, it truncates the table but does not import any data. The last line says: Job "SYS"."PT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57.
    impdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=PT_ps_imp_0831_1
    Import: Release 10.2.0.3.0 - Production on Tuesday, 31 August, 2010 15:14:53
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.2.0.3.0 - Production
    Master table "SYS"."AT_PS_IMP_0831_1" successfully loaded/unloaded
    Starting "SYS"."AT_PS_IMP_0831_1": '/******** AS SYSDBA' dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=AT_ps_imp_0831_1
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39153: Table "PT_CONTROL"."PIPELINE_SESSION" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."AT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57
    I suspect it has something to do with %U in the impdp command. Has anyone encountered this kind of situation before? What should my import command be? I just want to confirm I am using the right one.
    Thanks
    --MM
    Edited by: UserMM on Aug 31, 2010 3:11 PM

    I also looked in the alert log but didn't find anything about the error there. Any opinion?
    --MM

  • Total number of records of a partitioned table exported using Data Pump

    Hi All,
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    OS: RHEL
    I exported a partitioned table using Data Pump and I would like to verify the total number of records exported across all its partitions. The export has a query on it: WHERE ITEMDATE < TO_DATE('1-JAN-2010'). I need to compare the count with the exact number of records in the actual table to check that they match.
    Below is the log file of the exported table. It does not show the total number of rows exported, only the per-partition counts.
    Starting "SYS"."SYS_EXPORT_TABLE_05": '/******** AS SYSDBA' dumpfile=data_pump_dir:GSDBA_APPROVED_TL.dmp nologfile=y tables=GSDBA.APPROVED_TL query=GSDBA.APPROVED_TL:"
    WHERE ITEMDATE< TO_DATE(\'1-JAN-2010\'\)"
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 517.6 MB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q3" 35.02 MB 1361311 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q4" 33.23 MB 1292051 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q4" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q3" 30.53 MB 1186974 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q3" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q1" 30.44 MB 1183811 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q2" 30.29 MB 1177468 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q4" 30.09 MB 1170470 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q2" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q2" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q3" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q4" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q1" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q1" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_MAXVALUE" 0 KB 0 rows
    Master table "SYS"."SYS_EXPORT_TABLE_05" successfully loaded/unloaded
    Dump file set for SYS.SYS_EXPORT_TABLE_05 is:
    /u01/export/GSDBA_APPROVED_TL.dmp
    Job "SYS"."SYS_EXPORT_TABLE_05" successfully completed at 12:00:36
    Edited by: 831134 on Jan 25, 2012 6:42 AM
    Edited by: 831134 on Jan 25, 2012 6:43 AM
    Edited by: 831134 on Jan 25, 2012 6:43 AM

    I assume you want this so you can run a script to check the count? If not, and this is being done manually, then just add up the individual row counts. I'm not very good at writing scripts, but I would think that someone here could come up with a script that sums the row counts for the table partitions from your log file. This is not something that Data Pump writes to the log file.
    Dean
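    A one-liner along these lines can total the per-partition row counts straight from the export log (the log file name here is hypothetical; point it at your own log):

    ```shell
    # Sum the row counts from the ". . exported" lines of a Data Pump log.
    # $(NF-1) is the next-to-last field on each line, i.e. the number before "rows".
    awk '/\. \. exported/ { sum += $(NF-1) } END { print sum }' GSDBA_APPROVED_TL.log
    ```

    For the log shown above this totals the nine non-empty partitions; compare it with select count(*) from GSDBA.APPROVED_TL where ITEMDATE < TO_DATE('1-JAN-2010') on the source.
    
    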

  • Transportation plan date calculated in VA01

    Hi experts,
         The business wants to deliver all goods on a sales order together, even when the schedule lines have different goods issue dates, probably because the two sales order lines have different routes. As a consequence, the material availability, loading and transportation planning dates will also differ per schedule line. I understand the system will split this sales order into more than one delivery, because the route is taken into account by the delivery split functionality; this is OK.
         On the other hand, I would like the system to show all lines of that sales order in VL10H. This is possible if the transportation planning dates of the confirmed schedule lines are the same, because VL10H uses the transportation planning date to decide which sales orders are due for delivery.
         My question is: would it be possible to align the transportation planning date across both schedule lines via customizing, or do I have to use an EXIT? I have already tried "Define Scheduling by Sales Document Type", but without success.
    Thanks, Marcelo.
    See the example below:
    Line 10
    Req. Del. Date: 19.03.2008
    GI date: 05.03.2008
    Loading date: 05.03.2008
    Mat.AV.date: 05.03.2008
    Transp.plan.date: 03.03.2008
    Route: TNLN10
    Line 20
    Req. Del. Date: 19.03.2008
    GI date: 04.03.2008
    Loading date: 04.03.2008
    Mat.AV.date:04.03.2008
    Transp.plan.date: 03.03.2008
    Route: TNLN11

    Hi Marcelo,
    In the most general scenario, to get a complete delivery we use the complete-delivery flag at header level. This delivers the units for all line items together. There is also a complete-delivery flag at item level on the sales A/B screen; I have been researching this, and I think it can be used to ship some items of the order complete rather than all of them.
    But if you want the same transportation planning date with different routes, doing it through standard SAP is difficult and would involve major changes to the availability check. I think the wiser and easier option is a user exit at document save, which triggers the check at VA01/VA02 level.
    Also, please consider the effect of rescheduling runs when designing such changes.
    If this has helped some way to go closer to answer, please do reward me points.
    Kind Regards
    Sandeep

  • Transporting modified data target to production which has data?

    Hi,
    I have a cube to which I added two more fields, and I changed the data model accordingly.
    Now I have a question:
    In production this cube is loaded daily and currently holds some thousands of data records.
    What happens if I transport this data model without deleting the data in production?
    Although we are on BI 7.0, I am using the 3.x DataSources only.
    Raj

    Hi Raj,
    What fields did you add?
    Characteristics --> the cube must be empty before transport. You will have to reload the cube.
    Navigational attributes --> no problem. You can even use them in your queries; they will be "filled".
    Key figures --> no problem, but no historical data. These fields will only be filled from the moment you have transported the update rules.
    Success,
    Udo

  • Stats Not getting imported while importing partition using DATAPUMP API

    Hi,
    I am using the Data Pump API to export a partition of a table and then import that partition into a table on another DB.
    The data is imported, but I can see that the partition statistics are not. I am using TABLE_EXISTS_ACTION = APPEND.
    Any idea how to get the statistics imported into the partition as well?
    Thanks!

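
    One workaround, not mentioned in the thread, is to carry the partition statistics separately with DBMS_STATS: export them into a staging table on the source, move that table with the same Data Pump job, and import them on the target. A sketch in PL/SQL; the owner, table, partition, and staging-table names are hypothetical:

    -- On the source database: stage the partition's statistics
    BEGIN
      DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'STATS_STAGE');
      DBMS_STATS.EXPORT_TABLE_STATS(ownname  => 'SCOTT', tabname => 'SALES',
                                    partname => 'SALES_2012_Q4',
                                    stattab  => 'STATS_STAGE');
    END;
    /
    -- Include STATS_STAGE in the Data Pump job, then on the target:
    BEGIN
      DBMS_STATS.IMPORT_TABLE_STATS(ownname  => 'SCOTT', tabname => 'SALES',
                                    partname => 'SALES_2012_Q4',
                                    stattab  => 'STATS_STAGE');
    END;
    /

    Alternatively, simply regathering statistics on the target partition after the load avoids the staging step entirely.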

  • Moving Large amount of data using IMPDP with network link

    Hi Guru,
    Here we have a requirement to move 2 TB of data from production to non-production using the NETWORK_LINK parameter. What is the process to make it fast?
    We did this previously, but importing the data and indexes took 7 days.
    I have an idea; please guide me on whether it will make the import faster:
    Step 1) Import the metadata only.
    Step 2) Import the table data only, using table_exists_action=append or truncate. (The indexes are already created in step 1, so the data import will be fast, as per my plan.)
    Please suggest a better way if there is one.
    Thanks & Regards,
    Venkata Poorna Prasad.S

    You might want to check these as well:
    DataPump Import (IMPDP) Over NETWORK_LINK Is Sometimes Very Slow (Doc ID 1439691.1)
    DataPump Import Via NETWORK_LINK Is Slow With CURSOR_SHARING=FORCE (Doc ID 421441.1)
    Performance Problems When Transferring LOBs Using IMPDP With NETWORK_LINK (Doc ID 1488229.1)
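
    The two-step plan in the question could be sketched as impdp commands like these (the link, schema, and directory names are assumptions, and PARALLEL requires Enterprise Edition):

```shell
# Step 1: metadata only over the database link (names are examples)
impdp system/*** network_link=prod_link schemas=app_schema \
      directory=imp_dir logfile=meta.log content=metadata_only

# Step 2: data only; the tables already exist, so append (or truncate) into them
impdp system/*** network_link=prod_link schemas=app_schema \
      directory=imp_dir logfile=data.log content=data_only \
      table_exists_action=append parallel=4
```

    One caveat with this plan: loading data into tables whose indexes already exist means every index is maintained row by row during the load, which is usually slower than loading first and building the indexes afterwards (e.g. excluding indexes in step 1 and importing them in a final metadata pass).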

  • ORA-39152 : While importing data using dbms_datapump utility

    Hi,
    Could anyone help me on this.
    I am importing data using the dbms_datapump utility with TABLE_EXISTS_ACTION set to APPEND, and I get the error:
    "ORA-39152: Table exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append".

    w_dp_handle := dbms_datapump.open(operation   => 'IMPORT',
                                      job_mode    => 'TABLE',
                                      remote_link => NULL,
                                      job_name    => w_job_name);
    dbms_datapump.add_file(handle    => w_dp_handle,
                           filename  => w_filename,
                           directory => 'DATA_PUMP_DIR',
                           filetype  => dbms_datapump.ku$_file_type_dump_file);
    dbms_datapump.add_file(handle    => w_dp_handle,
                           filename  => w_filename || '.LOG',
                           directory => 'DATA_PUMP_DIR',
                           filetype  => dbms_datapump.ku$_file_type_log_file);
    dbms_datapump.set_parameter(handle => w_dp_handle,
                                NAME   => 'TABLE_EXISTS_ACTION',
                                VALUE  => 'APPEND');
    dbms_datapump.start_job(w_dp_handle);

    I know it is just a warning, but I have to show the generated log file, in which this ORA error appears, to the user.
    Please guide me on removing this error.
    Thanks in advance for your help.
    Thanks in advance for your help.

    Hi;
    What is your DB version? Please see:
    ORA-39152 Table Exists Data Will be Appended to Existing Table [ID 818535.1]
    Also see:
    Import DataPump: Import of Object Types Fails With ORA-39083 ORA-02304 or ORA-39117 ORA-39779 [ID 351519.1]
    How to Transfer the Data Between Tables With Different Structures? [ID 1078211.1]
    Regards
    Helios
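
    Since ORA-39152 is expected behaviour with TABLE_EXISTS_ACTION=APPEND, one pragmatic option, not from the thread, is to post-process the generated log before showing it to the user. A minimal sketch in Python (the sample log lines are hypothetical):

```python
def strip_warning(lines, code="ORA-39152"):
    """Return the log lines with the given (expected) warning removed."""
    return [line for line in lines if code not in line]

if __name__ == "__main__":
    log = [
        'Master table "SCOTT"."IMP_JOB" successfully loaded/unloaded',
        "ORA-39152: Table exists. Data will be appended to existing table ...",
        'Job "SCOTT"."IMP_JOB" successfully completed',
    ]
    for line in strip_warning(log):
        print(line)
```

    The same filtering could equally be done in PL/SQL with UTL_FILE if the log must stay on the database server.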
