Data pump error ORA-39065, status undefined after restart
Hi members,
The Data Pump full import job hung, CONTINUE_CLIENT hung as well, and then the window suddenly exited.
;;; Import> status
;;; Import> help
;;; Import> status
;;; Import> continue_client
ORA-39065: unexpected master process exception in RECEIVE
ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
I increased the shared_pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is UNDEFINED. It still says UNDEFINED now, and the last log message says the job has been reopened. That's the end of the log file; nothing else is being recorded. I am not sure what is happening. Any ideas would be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
Import> status
Job: SYS_IMPORT_FULL_01
Operation: IMPORT
Mode: FULL
State: IDLING
Bytes Processed: 3,139,231,552
Percent Done: 33
Current Parallelism: 8
Job Error Count: 0
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
Worker 1 Status:
State: UNDEFINED
Worker 2 Status:
State: UNDEFINED
Object Schema: trm
Object Name: EVENT_DOCUMENT
Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Completed Objects: 1
Completed Rows: 78,026
Completed Bytes: 4,752,331,264
Percent Done: 100
Worker Parallelism: 1
Worker 3 Status:
State: UNDEFINED
Worker 4 Status:
State: UNDEFINED
Worker 5 Status:
State: UNDEFINED
Worker 6 Status:
State: UNDEFINED
Worker 7 Status:
State: UNDEFINED
Worker 8 Status:
State: UNDEFINED
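For what it's worth, an IDLING job in this state can often be nudged back to work by issuing START_JOB through the DBMS_DATAPUMP API from a new session. A sketch, assuming the job name and owner shown in the log above:

```sql
-- Attach to the idling job and ask it to resume.
-- Job name and owner are taken from the log above; adjust if needed.
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.ATTACH(job_name  => 'SYS_IMPORT_FULL_01',
                            job_owner => 'SYSTEM');
  DBMS_DATAPUMP.START_JOB(h);   -- restart the worker processes
  DBMS_DATAPUMP.DETACH(h);      -- leave the job running in the background
END;
/
```

Equivalently, from the interactive client: re-attach with impdp ... attach=SYS_IMPORT_FULL_01, then issue START_JOB followed by CONTINUE_CLIENT at the Import> prompt.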
39065, 00000, "unexpected master process exception in %s"
// *Cause: An unhandled exception was detected internally within the master
// control process for the Data Pump job. This is an internal error.
// Subsequent messages will detail the problems.
// *Action: If problem persists, contact Oracle Customer Support.
Similar Messages
-
Hi All,
I am getting the following errors when I try to connect with Data Pump. My DB is 10g and the OS is Linux.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called
ORA-06512: at "SYS.DBMS_LOGSTDBY", line 24
ORA-06512: at "SYS.KUPV$FT", line 676
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called
When I tried to compile this package, I got the following error:
SQL> alter package DBMS_INTERNAL_LOGSTDBY compile body;
Warning: Package Body altered with compilation errors.
SQL> show error
Errors for PACKAGE BODY DBMS_INTERNAL_LOGSTDBY:
LINE/COL ERROR
1405/4 PL/SQL: SQL Statement ignored
1412/38 PL/SQL: ORA-00904: "SQLTEXT": invalid identifier
1486/4 PL/SQL: SQL Statement ignored
1564/7 PL/SQL: ORA-00904: "DBID": invalid identifier
1751/2 PL/SQL: SQL Statement ignored
1870/7 PL/SQL: ORA-00904: "DBID": invalid identifier
Can anyone suggest how to resolve this issue?
Thanks in advance.
SQL> SELECT OBJECT_TYPE,OBJECT_NAME FROM DBA_OBJECTS
2 WHERE OWNER='SYS' AND STATUS<>'VALID';
OBJECT_TYPE OBJECT_NAME
VIEW DBA_COMMON_AUDIT_TRAIL
PACKAGE BODY DBMS_INTERNAL_LOGSTDBY
PACKAGE BODY DBMS_REGISTRY_SYS
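A common first step for invalid SYS package bodies like these (often left over after a patch or upgrade) is to re-run the standard recompilation script and then re-check; a sketch:

```sql
-- Recompile all invalid objects in the database (run as SYS);
-- utlrp.sql ships in $ORACLE_HOME/rdbms/admin.
@?/rdbms/admin/utlrp.sql

-- Re-check for remaining invalid SYS objects
SELECT object_type, object_name
FROM   dba_objects
WHERE  owner = 'SYS' AND status <> 'VALID';
```

If DBMS_INTERNAL_LOGSTDBY still fails with ORA-00904 afterwards, the underlying dictionary objects are probably out of step with the package version, which points to an incomplete upgrade rather than a simple compilation problem.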
Thanks -
DATA-PUMP ERROR: ORA-39070 Database on Linux, Client on Win 2008
Hi,
I want to run a Data Pump export from a Windows 2008 client. I defined the directory object dpdir as 'C:\DPDIR'.
When running expdp
expdp login\pass@ora directory=dpdir dumpfile=dump.dmp logfile=log.log full=y
I get these errors:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "sys.utl_file", line 536
ORA-29283: invalid file operation
(messages translated from Polish)
I found out that Data Pump writes its files on the Linux server (where the database runs). When I define 'C:\DPDIR', it is not recognized because no such directory exists on Linux.
How can I save the Data Pump export dump file on Windows?
expdp can only create dump files on the database server itself. -
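To make the point concrete: the directory object must name a path on the database server's file system, and the dump file is retrieved afterwards. A sketch (the Linux path below is an assumption, not from the post):

```sql
-- Run as a privileged user; the path must exist on the LINUX SERVER,
-- not on the Windows client:
CREATE OR REPLACE DIRECTORY dpdir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY dpdir TO login;
```

After the export completes, copy dump.dmp from the server to Windows with whatever transfer method you have (scp, FTP, a Samba share). Of the two export tools, only the legacy exp utility writes to the client machine.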
Impdp triggers always give error ORA-04071:missing BEFORE, AFTER or INSTEAD
Hello,
I am using 11g impdp to create a test system.
I use this cmd to import
impdp 'SYS/xxxxx@PWRFUN AS SYSDBA' dumpfile=PWRPROD.DMP logfile=PWRFUNimportPROD.log SCHEMAS=pwrplant TABLE_EXISTS_ACTION=REPLACE
Then I get this error for all the triggers
ORA-39083: Object type TRIGGER failed to create with error:
ORA-04071: missing BEFORE, AFTER or INSTEAD OF keyword
Failing sql is:
CREATE TRIGGER "PWRPLANT"."APPROVAL_STEPS_HISTORY" PS_HISTORY
BEFORE UPDATE OR INSERT ON PWRPLANT.APPROVAL_STEPS_HISTORY
FOR EACH ROW
BEGIN :new.user_id := USER; :new.time_stamp := SYSDATE; END;
As you can see, the trigger does contain the word "before".
When I remove ' "PWRPLANT"."APPROVAL_STEPS_HISTORY" ' and run the trigger below, it compiles correctly. Why is this, and how do I make impdp run correctly?
CREATE TRIGGER PS_HISTORY
BEFORE UPDATE OR INSERT ON PWRPLANT.APPROVAL_STEPS_HISTORY
FOR EACH ROW
BEGIN :new.user_id := USER; :new.time_stamp := SYSDATE; END;
same error, thanks for your attempt
Error starting at line 1 in command:
CREATE TRIGGER "PWRPLANT"."APPROVAL_STEPS_HISTORY" PS_HISTORY
BEFORE UPDATE OR INSERT ON PWRPLANT.APPROVAL_STEPS_HISTORY
REFERENCING new AS new
FOR EACH ROW
BEGIN :new.user_id := USER; :new.time_stamp := SYSDATE; END;
Error report:
ORA-04071: missing BEFORE, AFTER or INSTEAD OF keyword
04071. 00000 - "missing BEFORE, AFTER or INSTEAD OF keyword"
*Cause: The trigger statement is missing the BEFORE/AFTER/INSTEAD OF clause.
*Action: Specify either BEFORE, AFTER or INSTEAD OF. -
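The parser error makes sense once you notice the failing DDL contains two names before BEFORE: the schema-qualified table name and then PS_HISTORY. Valid syntax takes a single (optionally schema-qualified) trigger name, so a corrected sketch would be:

```sql
CREATE OR REPLACE TRIGGER "PWRPLANT"."PS_HISTORY"
  BEFORE UPDATE OR INSERT ON pwrplant.approval_steps_history
  FOR EACH ROW
BEGIN
  :new.user_id    := USER;
  :new.time_stamp := SYSDATE;
END;
/
```

Since impdp generated the malformed DDL itself, this suggests a Data Pump defect or an export/import version mismatch rather than a problem with the original trigger.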
Hi everyone,
I installed Oracle R Enterprise in my Oracle 11.2.0.1 database on Windows 7, using R 2.13.2 and ORE 1.1. Part of the functionality works, for example:
library(ORE)
options(STERM='iESS', str.dendrogram.last="'", editor='emacsclient.exe', show.error.locations=TRUE)
> ore.connect(user = "RQUSER",password = "RQUSERpsw",conn_string = "", all = TRUE)
> ore.is.connected()
[1] TRUE
> ore.ls()
[1] "IRIS_TABLE"
> demo(package = "ORE")
Demos in package 'ORE':
aggregate Aggregation
analysis Basic analysis & data processing operations
basic Basic connectivity to database
binning Binning logic
columnfns Column functions
cor Correlation matrix
crosstab Frequency cross tabulations
derived Handling of derived columns
distributions Distribution, density, and quantile functions
do_eval Embedded R processing
freqanalysis Frequency cross tabulations
graphics Demonstrates visual analysis
group_apply Embedded R processing by group
hypothesis Hyphothesis testing functions
matrix Matrix related operations
nulls Handling of NULL in SQL vs. NA in R
push_pull RDBMS <-> R data transfer
rank Attributed-based ranking of observations
reg Ordinary least squares linear regression
row_apply Embedded R processing by row chunks
sql_like Mapping of R to SQL commands
stepwise Stepwise OLS linear regression
summary Summary functionality
table_apply Embedded R processing of entire table
> demo("aggregate",package = "ORE")
demo(aggregate)
---- ~~~~~~~~~
Type <Return> to start : Return
> #
> # O R A C L E R E N T E R P R I S E S A M P L E L I B R A R Y
> #
> # Name: aggregate.R
> # Description: Demonstrates aggregations
> # See also summary.R
> #
> #
> #
>
> ## Set page width
> options(width = 80)
> # List all accessible tables and views in the Oracle database
> ore.ls()
[1] "IRIS_TABLE"
> # Create a new table called IRIS_TABLE in the Oracle database
> # using the built-in iris data.frame
>
> # First remove previously created IRIS_TABLE objects from the
> # global environment and the database
> if (exists("IRIS_TABLE", globalenv(), inherits = FALSE))
+ rm("IRIS_TABLE", envir = globalenv())
> ore.drop(table = "IRIS_TABLE")
> # Create the table
> ore.create(iris, table = "IRIS_TABLE")
> # Show the updated list of accessible table and views
> ore.ls()
[1] "IRIS_TABLE"
> # Display the class of IRIS_TABLE and where it can be found in
> # the search path
> class(IRIS_TABLE)
[1] "ore.frame"
attr(,"package")
[1] "OREbase"
> search()
[1] ".GlobalEnv" "ore:RQUSER" "ESSR"
[4] "package:ORE" "package:ORExml" "package:OREeda"
[7] "package:OREgraphics" "package:OREstats" "package:MASS"
[10] "package:OREbase" "package:ROracle" "package:DBI"
[13] "package:stats" "package:graphics" "package:grDevices"
[16] "package:utils" "package:datasets" "package:methods"
[19] "Autoloads" "package:base"
> find("IRIS_TABLE")
[1] "ore:RQUSER"
> # Select count(Petal.Length) group by species
> x = aggregate(IRIS_TABLE$Petal.Length,
+ by = list(species = IRIS_TABLE$Species),
+ FUN = length)
> class(x)
[1] "ore.frame"
attr(,"package")
[1] "OREbase"
> x
species x
1 setosa 50
2 versicolor 50
3 virginica 50
> # Repeat FUN = summary, mean, min, max, sd, median, IQR
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = summary)
species Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
1 setosa 1.0 1.4 1.50 1.462 1.575 1.9 0
2 versicolor 3.0 4.0 4.35 4.260 4.600 5.1 0
3 virginica 4.5 5.1 5.55 5.552 5.875 6.9 0
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = mean)
species x
1 setosa 1.462
2 versicolor 4.260
3 virginica 5.552
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = min)
species x
1 setosa 1.0
2 versicolor 3.0
3 virginica 4.5
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = max)
species x
1 setosa 1.9
2 versicolor 5.1
3 virginica 6.9
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = sd)
species x
1 setosa 0.1736640
2 versicolor 0.4699110
3 virginica 0.5518947
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = median)
species x
1 setosa 1.50
2 versicolor 4.35
3 virginica 5.55
> aggregate(IRIS_TABLE$Petal.Length, by = list(species = IRIS_TABLE$Species),
+ FUN = IQR)
species x
1 setosa 0.175
2 versicolor 0.600
3 virginica 0.775
> # More than one grouping column
> x = aggregate(IRIS_TABLE$Petal.Length,
+ by = list(species = IRIS_TABLE$Species,
+ width = IRIS_TABLE$Petal.Width),
+ FUN = length)
> x
species width x
1 setosa 0.1 5
2 setosa 0.2 29
3 setosa 0.3 7
4 setosa 0.4 7
5 setosa 0.5 1
6 setosa 0.6 1
7 versicolor 1.0 7
8 versicolor 1.1 3
9 versicolor 1.2 5
10 versicolor 1.3 13
11 versicolor 1.4 7
12 virginica 1.4 1
13 versicolor 1.5 10
14 virginica 1.5 2
15 versicolor 1.6 3
16 virginica 1.6 1
17 versicolor 1.7 1
18 virginica 1.7 1
19 versicolor 1.8 1
20 virginica 1.8 11
21 virginica 1.9 5
22 virginica 2.0 6
23 virginica 2.1 6
24 virginica 2.2 3
25 virginica 2.3 8
26 virginica 2.4 3
27 virginica 2.5 3
> # Sort the result by ascending value of count
> ore.sort(data = x, by = "x")
species width x
1 virginica 1.4 1
2 virginica 1.7 1
3 versicolor 1.7 1
4 virginica 1.6 1
5 setosa 0.5 1
6 setosa 0.6 1
7 versicolor 1.8 1
8 virginica 1.5 2
9 versicolor 1.1 3
10 virginica 2.4 3
11 virginica 2.5 3
12 virginica 2.2 3
13 versicolor 1.6 3
14 setosa 0.1 5
15 virginica 1.9 5
16 versicolor 1.2 5
17 virginica 2.0 6
18 virginica 2.1 6
19 setosa 0.3 7
20 versicolor 1.4 7
21 setosa 0.4 7
22 versicolor 1.0 7
23 virginica 2.3 8
24 versicolor 1.5 10
25 virginica 1.8 11
26 versicolor 1.3 13
27 setosa 0.2 29
> # by descending value
> ore.sort(data = x, by = "x", reverse = TRUE)
species width x
1 setosa 0.2 29
2 versicolor 1.3 13
3 virginica 1.8 11
4 versicolor 1.5 10
5 virginica 2.3 8
6 setosa 0.4 7
7 setosa 0.3 7
8 versicolor 1.0 7
9 versicolor 1.4 7
10 virginica 2.1 6
11 virginica 2.0 6
12 virginica 1.9 5
13 versicolor 1.2 5
14 setosa 0.1 5
15 versicolor 1.6 3
16 versicolor 1.1 3
17 virginica 2.4 3
18 virginica 2.5 3
19 virginica 2.2 3
20 virginica 1.5 2
21 virginica 1.6 1
22 virginica 1.4 1
23 setosa 0.6 1
24 setosa 0.5 1
25 versicolor 1.8 1
26 virginica 1.7 1
27 versicolor 1.7 1
> # Preserve just 1 row for duplicate x's
> ore.sort(data = x, by = "x", unique.keys = TRUE)
species width x
1 setosa 0.5 1
2 virginica 1.5 2
3 versicolor 1.1 3
4 setosa 0.1 5
5 virginica 2.0 6
6 setosa 0.3 7
7 virginica 2.3 8
8 versicolor 1.5 10
9 virginica 1.8 11
10 versicolor 1.3 13
11 setosa 0.2 29
> ore.sort(data = x, by = "x", unique.keys = TRUE, unique.data = TRUE)
species width x
1 setosa 0.5 1
2 virginica 1.5 2
3 versicolor 1.1 3
4 setosa 0.1 5
5 virginica 2.0 6
6 setosa 0.3 7
7 virginica 2.3 8
8 versicolor 1.5 10
9 virginica 1.8 11
10 versicolor 1.3 13
11 setosa 0.2 29
But when I use the following ore.doEval command, I get these errors:
> ore.doEval(function() { 123 })
Error in .oci.GetQuery(conn, statement, ...) :
ORA-29400: data cartridge error
ORA-24323: ?????
ORA-06512: at "RQSYS.RQEVALIMPL", line 23
ORA-06512: at line 4
And when I try to run demo("row_apply", package = "ORE"), I get the same errors:
demo("row_apply",package = "ORE")
demo(row_apply)
---- ~~~~~~~~~
Type <Return> to start : Return
> #
> # O R A C L E R E N T E R P R I S E S A M P L E L I B R A R Y
> #
> # Name: row_apply.R
> # Description: Execute R code on each row
> #
> #
>
> ## Set page width
> options(width = 80)
> # List all accessible tables and views in the Oracle database
> ore.ls()
[1] "IRIS_TABLE"
> # Create a new table called IRIS_TABLE in the Oracle database
> # using the built-in iris data.frame
>
> # First remove previously created IRIS_TABLE objects from the
> # global environment and the database
> if (exists("IRIS_TABLE", globalenv(), inherits = FALSE))
+ rm("IRIS_TABLE", envir = globalenv())
> ore.drop(table = "IRIS_TABLE")
> # Create the table
> ore.create(iris, table = "IRIS_TABLE")
> # Show the updated list of accessible table and views
> ore.ls()
[1] "IRIS_TABLE"
> # Display the class of IRIS_TABLE and where it can be found in
> # the search path
> class(IRIS_TABLE)
[1] "ore.frame"
attr(,"package")
[1] "OREbase"
> search()
[1] ".GlobalEnv" "ore:RQUSER" "ESSR"
[4] "package:ORE" "package:ORExml" "package:OREeda"
[7] "package:OREgraphics" "package:OREstats" "package:MASS"
[10] "package:OREbase" "package:ROracle" "package:DBI"
[13] "package:stats" "package:graphics" "package:grDevices"
[16] "package:utils" "package:datasets" "package:methods"
[19] "Autoloads" "package:base"
> find("IRIS_TABLE")
[1] "ore:RQUSER"
> # The table should now appear in your R environment automatically
> # since you have access to the table now
> ore.ls()
[1] "IRIS_TABLE"
> # This is a database resident table with just metadata on the R side.
> # You will see this below
> class(IRIS_TABLE)
[1] "ore.frame"
attr(,"package")
[1] "OREbase"
> # Apply given R function to each row
> ore.rowApply(IRIS_TABLE,
+ function(dat) {
+ # Any R code goes here. Operates on one row of IRIS_TABLE at
+ # a time
+ cbind(dat, dat$Petal.Length)
+ })
Error in .oci.GetQuery(conn, statement, ...) :
ORA-29400: data cartridge error
ORA-24323: ?????
ORA-06512: at "RQSYS.RQROWEVALIMPL", line 26
ORA-06512: at line 4
>
Could it be that my Oracle version, 11.2.0.1, is missing the required RDBMS bug fix? Or is something else wrong? Thanks
Oracle R Enterprise 1.1 requires Oracle Database 11.2.0.3 or 11.2.0.4, on Linux and Windows. Oracle R Enterprise can also work with an 11.2.0.1 or 11.2.0.2 database if it is properly patched.
Embedded R execution will not work without a patched database. Follow this procedure to patch the database:
1. Go to My Oracle Support: http://support.oracle.com
2. Log in and supply your Customer Support ID (CSI).
3. Choose the Patches & Updates tab.
4. In the Patch Search box, type 11678127 and click Search.
5. Select the patch for your version of Oracle Database, 11.2.0.1.
6. Click Download to download the patch.
7. Install the patch using OPatch. Ensure that you are using the latest version of OPatch.
Sherry -
DECLARE
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
-- Create a (user-named) Data Pump job to do a schema export.
h1 := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'EXAMPLE1','LATEST');
-- Specify a single dump file for the job (using the handle just returned)
-- and a directory object, which must already be defined and accessible
-- to the user running this procedure.
--BACKUP DIRECTORY NAME
DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','BACKUP');
-- A metadata filter is used to specify the schema that will be exported.
--ORVETL USER NAME
DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''orvetl'')');
-- Start the job. An exception will be generated if something is not set up
-- properly.
DBMS_DATAPUMP.START_JOB(h1);
-- The export job should now be running. In the following loop, the job
-- is monitored until it completes. In the meantime, progress information is
-- displayed.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- If any work-in-progress (WIP) or error messages were received for the job,
-- display them.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(h1);
END;
Error:
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2926
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3162
ORA-06512: at line 20
Message was edited by: anutosh
I assume all the other dimensions are being specified via a load rule header (i.e. the rule is otherwise valid).
What is your data source? What does the number (data) format look like? Can you identify (and post) specific rows that are causing the error? -
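Since the error stack ends at line 20 of the anonymous block (near the ADD_FILE call), the first thing worth checking is whether the directory object named there actually exists and is visible to the calling user. A sketch, using the directory name from the example:

```sql
-- Does the directory object exist, and where does it point?
SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'BACKUP';

-- Does the calling user have read/write on it?
-- (Directory grants show up in *_TAB_PRIVS under the directory name.)
SELECT grantee, privilege
FROM   all_tab_privs
WHERE  table_name = 'BACKUP';
```

ORA-39001 (invalid argument value) from ADD_FILE is commonly a missing or inaccessible directory object rather than a bad file name.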
Change Data Capture error ORA-31428 at the subscription step?
I am following this cookbook: http://www.oracle.com/technology/products/bi/db/10g/pdf/twp_cdc_cookbook_0206.pdf. It was very helpful, but at the subscription step, when I supply the same list of columns that I gave to create_change_table in column_type_list, I receive this error:
ORA-31428 : No publication contains all of the specified columns. One or more of the specified columns cannot be found in a single publication. Consult the ALL_PUBLISHED_COLUMNS view to see the current publications and change the subscription request to select only the columns that are in the same publication.
When I check the ALL_PUBLISHED_COLUMNS view mentioned in the error, my columns are listed, which is strange behaviour. I searched for comments on forums.oracle.com, metalink.oracle.com, and even Google, but found nothing beyond the explanation above :(
If you have any comments it would be great, thank you again.
Best regards.
Hotlog Source : 9iR2 Solaris
Hotlog Target : 10gR2 Solaris
begin
dbms_cdc_publish.create_change_table(
owner => 'cdc_stg_pub',
change_table_name => 'udb_tcon_ct',
change_set_name => 'udb_tcon_set',
source_schema => 'udb',
source_table => 'tcon',
column_type_list => 'ncon number(12), ncst number(12), dwhencon date, twhomcon varchar2(50), cchancon number(3), cacticon number(5), tdatacon varchar2(1000)',
capture_values => 'both',
rs_id => 'y',
row_id => 'n',
user_id => 'n',
timestamp => 'y',
object_id => 'n',
source_colmap => 'n',
target_colmap => 'y',
options_string => null) ;
end ;
select x.change_set_name, x.column_name from ALL_PUBLISHED_COLUMNS x ;
begin
dbms_cdc_subscribe.create_subscription(
change_set_name => 'udb_tcon_set',
description => 'UDB TCON change subscription',
subscription_name => 'udb_tcon_sub1');
end;
begin
dbms_cdc_subscribe.subscribe(
subscription_name => 'udb_tcon_sub1',
source_schema => 'udb',
source_table => 'tcon',
column_list => 'ncon,ncst,dwhencon,twhomcon,cchancon,cacticon,tdatacon',
subscriber_view => 'udb_tcon_chg_view') ;
end ;
CHANGE_SET_NAME COLUMN_NAME
UDB_TCON_SET NCON
UDB_TCON_SET NCST
UDB_TCON_SET DWHENCON
UDB_TCON_SET TDATACON
UDB_TCON_SET CCHANCON
UDB_TCON_SET CACTICON
UDB_TCON_SET TWHOMCON
7 rows selected
PL/SQL procedure successfully completed
begin
dbms_cdc_subscribe.subscribe(
subscription_name => 'udb_tcon_sub1',
source_schema => 'udb',
source_table => 'tcon',
column_list => 'ncon,ncst,dwhencon,twhomcon,cchancon,cacticon,tdatacon',
subscriber_view => 'udb_tcon_chg_view') ;
end ;
ORA-31428: no publication contains all the specified columns
ORA-06512: at "SYS.DBMS_CDC_SUBSCRIBE", line 19
ORA-06512: at line 2
I added the OS and Oracle versions of source and target.
Message was edited by: TongucY
Nice catch; the error changed but it is still strange:
SQL> select upper('ncon,ncst,dwhencon,twhomcon,cchancon,cacticon,tdatacon') from dual ;
UPPER('NCON,NCST,DWHENCON,TWHO
NCON,NCST,DWHENCON,TWHOMCON,CCHANCON,CACTICON,TDATACON
SQL> begin
2 dbms_cdc_subscribe.subscribe(
3 subscription_name => 'udb_tcon_sub1',
4 source_schema => 'udb',
5 source_table => 'tcon',
6 column_list => 'NCON,NCST,DWHENCON,TWHOMCON,CCHANCON,CACTICON,TDATACON',
7 subscriber_view => 'udb_tcon_chg_view') ;
8 end ;
9 /
begin
dbms_cdc_subscribe.subscribe(
subscription_name => 'udb_tcon_sub1',
source_schema => 'udb',
source_table => 'tcon',
column_list => 'NCON,NCST,DWHENCON,TWHOMCON,CCHANCON,CACTICON,TDATACON',
subscriber_view => 'udb_tcon_chg_view') ;
end ;
ORA-31466: no publications found
ORA-06512: at "SYS.DBMS_CDC_SUBSCRIBE", line 19
ORA-06512: at line 2 -
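Given that uppercasing the column list changed the error, the same dictionary case sensitivity very likely applies to source_schema and source_table, which are stored in uppercase in the data dictionary. A hedged sketch of the next attempt:

```sql
BEGIN
  dbms_cdc_subscribe.subscribe(
    subscription_name => 'udb_tcon_sub1',
    source_schema     => 'UDB',    -- dictionary names are uppercase
    source_table      => 'TCON',
    column_list       => 'NCON,NCST,DWHENCON,TWHOMCON,CCHANCON,CACTICON,TDATACON',
    subscriber_view   => 'udb_tcon_chg_view');
END;
/
```

ORA-31466 ("no publications found") is consistent with the lowercase 'udb'/'tcon' values failing to match any published schema and table.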
I am trying to add a standby database to the broker configuration and am getting this error:
ORA-16796 : one or more properties could not be imported from the database
Are there any specific checks I can do to locate the error?
Thanks,
ramya
Hi..
What is the Oracle version?
Refer to metalink Doc ID: 194529.1
From metalink for 10g
>
Error: ORA-16796 (ORA-16796)
Text: One or more properties could not be imported from the database.
Cause: The broker was unable to import property values for the database
being added to the broker configuration. This error indicates:
- the net-service-name specified in DGMGRL's CREATE CONFIGURATION or
ADD DATABASE command is not one that provides access to the
database being added, or
- there are no instances running for the database being added.
Action: Remove the database from the configuration using the REMOVE
CONFIGURATION or REMOVE DATABASE command. Make sure that the
database to be added has at least one instance running and that
the net-service-name provides access to the running instance. Then
reissue the CREATE CONFIGURATION or ADD DATABASE command.
>
Anand
Edited by: Anand... on Mar 15, 2009 9:44 AM -
Hi
I am facing an error in Data Guard: when I take the standby DB down with Data Guard Manager, my production DB status becomes a warning (ORA-16608: "one or more sites have warning").
Thanks
*** Duplicate Post ***
Ignore this posting. -
Dear All,
Using datapump to export data for a schema;
#!/bin/sh
PS1='$PWD # '
ORACLE_BASE=/orabin/oracle
ORACLE_HOME=/orabin/oracle/product/10.1.0
ORACLE_SID=vimadb
PATH=$ORACLE_HOME/bin:$PATH:.
export PATH PS1 ORACLE_BASE ORACLE_HOME ORACLE_SID
/orabin/oracle/product/10.1.0/bin/expdp vproddta/vproddta@vimadb schemas=vproddta EXCLUDE=STATISTICS directory=datadir1 dumpfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.dmp logfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.log
The directory has already been created at the OS level:
/syslog/datapump/
SQL> create directory datadir1 as '/syslog/datapump/';
SQL> grant read, write on directory datadir1 to vproddta;
I am getting this error:
Export: Release 10.1.0.4.0 - 64bit Production on Wednesday, 10 August, 2011 16:52
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.1.0.4.0 - 64bit Production
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
1. Check the OS permissions on this directory.
2. Try to create the export using a simple file name, like "export.dmp"; if Data Pump succeeds, then check the generated file name. -
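Both suggestions can be checked from SQL before re-running the export; a sketch using the names from the post:

```sql
-- Confirm the directory object points where you think it does
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATADIR1';

-- Confirm the grant to the exporting user took effect
SELECT grantee, privilege
FROM   dba_tab_privs
WHERE  table_name = 'DATADIR1';
```

At the OS level, /syslog/datapump must be writable by the Oracle software owner, since the server processes write the files, not the client (e.g. owned by oracle:dba with mode 775).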
IDOC Syntax Error E0072 with status 26 after upgrade to ECC6.0
Hi
We have upgraded SAP R/3 from 4.6C to ECC 6.0. After the upgrade, when we create a PO and output it via EDI, we see IDoc syntax error E0072 for the mandatory segment E1EDK01. In 4.6C this worked fine with no issues; the problem started only after the upgrade. The error says E1EDK01, which is mandatory, is missing. We are using basic type ORDERS01 with no extensions. The WE30 and WE20 settings look fine.
The order of the segments got scrambled in the upgraded version. The first segment is E1EDP19001, which is supposed to sit under the E1EDP01 segment. Two E1EDP19001 segments were created, but there were no sub-segments under E1EDP01.
Any help in this regard is highly appreciated.
Thanks
Pandiri
Activating the user exit and fixing the issue in the user exit resolved this problem.
-
DB13 jobs errors (ORA-01031: insufficient privileges) after System Copy
Dear SAP gurus,
I performed an ECC60 System copy from Dev to a sandbox system (Linux-Oracle). When I try to access DB13 all jobs are cancelled:
Example of "check and update optimizer statistics" job log:
Job started
Step 001 started (program RSDBAJOB, variant &0000000000001, user ID TGEPOMA1)
Execute logical command BRCONNECT On host eccsbx01
Parameters: -u / -jid STATS20110720050000 -c -f stats -t ALL
BR0801I BRCONNECT 7.00 (46)
BR0805I Start of BRCONNECT processing: cegkmpzk.sta 2011-07-27 05.00.32
BR0484I BRCONNECT log file: /oracle/SBX/sapcheck/cegkmpzk.sta
BR0280I BRCONNECT time stamp: 2011-07-27 05.00.36
BR0301W SQL error -1031 at location brc_dblog_open-1, SQL statement:
'INSERT INTO SAP_SDBAH (BEG, FUNCT, SYSID, OBJ, RC, ENDE, ACTID, LINE) VALUES ('20110727050032', 'sta', 'SBX', 'ALL', '9999', '
ORA-01031: insufficient privileges
BR0324W Insertion of database log header failed
I read SAP note 400241 (Problems with ops$ or sapr3 connect to Oracle) and performed the general checks, such as verifying the SAPUSER owner, etc.
When I execute the sapdba_role.sql script, I get the following errors in sapdba_role.log:
old 1: grant ALL on &User..SDBAH to sapdba
new 1: grant ALL on SAPR3.SDBAH to sapdba
grant ALL on SAPR3.SDBAH to sapdba
ERROR at line 1:
ORA-00942: table or view does not exist
old 1: grant ALL on &User..SDBAD to sapdba
new 1: grant ALL on SAPR3.SDBAD to sapdba
grant ALL on SAPR3.SDBAD to sapdba
ERROR at line 1:
ORA-00942: table or view does not exist
old 1: grant ALL on &User..DBAML to sapdba
new 1: grant ALL on SAPR3.DBAML to sapdba
grant ALL on SAPR3.DBAML to sapdba
Should I create those tables in order to allow ops$ user to access Oracle DB in order to execute the DB job from DB13?
Please let me know if anybody can help or has faced this situation before.
Thanks in advance, Marc
Hi Markus,
I don't think it is an authorization issue, because it has the same permissions as Development:
eccsbx01:/sapmnt/SBX/exe # ls -ltr br*
-rwsr-srw- 1 orasbx dba 4121272 Jul 19 11:55 brarchive
-rwsr-srw- 1 orasbx dba 4227280 Jul 19 11:55 brbackup
-rwsrwxr-x 1 orasbx sapsys 5489731 Jul 19 11:55 brconnect
-rwxr-xr-x 1 sbxadm sapsys 4537880 Jul 19 11:55 brrecover
-rwxr-xr-x 1 sbxadm sapsys 1554379 Jul 19 11:55 brrestore
-rwxr-xr-x 1 sbxadm sapsys 5617510 Jul 19 11:55 brspace
-rwsrwxr-x 1 orasbx sapsys 2289337 Jul 19 11:55 brtools
Regards, Marc -
trying to execute the sample here:
http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_api.htm#i1006925
Example 5-1 Performing a Simple Schema Export
So I created a package and tried to modify it to do a table export instead of a schema export; I messed it up and it didn't work. I did this twice.
If I query USER_DATAPUMP_JOBS I have 2 entries ('EXAMPLE1' and 'EXAMPLE2'), so when I try to rerun the sample, I get an ORA-31634 job already exists.
hmm, it looks like I could then attach to the job to rerun my tests or to remove it, but when I try and attach, I get ORA-31626 job doesn't exist.
Am I missing something basic here? All other procedures require a handle, so it's either OPEN or ATTACH.
I had better read the docs again. It seems strange that tables are created in the schema with the name of the job.
I deleted the tables in that schema, and then a select * from user_datapump_jobs showed more rows: the jobs from before plus the names of the recyclebin objects. After purge recyclebin they were gone.
I'm not using EXPDP, I'm trying the API based approach. -
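Those leftover master tables are exactly what blocks re-running the sample. One way to clean up a stopped job through the same API, sketched with the first job name from the post:

```sql
DECLARE
  h NUMBER;
BEGIN
  -- Re-attach to the orphaned job by name...
  h := DBMS_DATAPUMP.ATTACH(job_name => 'EXAMPLE1', job_owner => USER);
  -- ...and stop it, dropping the master table (keep_master => 0)
  DBMS_DATAPUMP.STOP_JOB(h, immediate => 1, keep_master => 0);
END;
/
```

If ATTACH itself raises ORA-31626, the job is already gone and only the master table remains; dropping that table (DROP TABLE example1 PURGE, then PURGE RECYCLEBIN) clears the USER_DATAPUMP_JOBS entry, as observed above.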
Data Pump Export error - network mounted path
Hi,
Please have a look at the Data Pump error I am getting while doing an export. I am running version 11g. Please help with your feedback.
I am getting the error because the directory ORALOAD points to a network-mounted path; it works fine with a local path. I have given full permissions on the network path, and UTL_FILE is able to create files there, but Data Pump fails with the error messages below.
Oracle 11g
Solaris 10
Getting below error :
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3444
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3693
ORA-06512: at line 64
DECLARE
p_part_name VARCHAR2(30);
p_msg VARCHAR2(512);
v_ret_period NUMBER;
v_arch_location VARCHAR2(512);
v_arch_directory VARCHAR2(20);
v_rec_count NUMBER;
v_partition_dumpfile VARCHAR2(35);
v_partition_dumplog VARCHAR2(35);
v_part_date VARCHAR2(30);
p_partition_name VARCHAR2(30);
v_partition_arch_location VARCHAR2(512);
h1 NUMBER; -- Data Pump job handle
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
ind NUMBER; -- Loop index
percent_done NUMBER; -- Percentage of job complete
--check dump file exist on directory
l_file utl_file.file_type;
l_file_name varchar2(20);
l_exists boolean;
l_length number;
l_blksize number;
BEGIN
p_part_name:='P2010110800';
l_file_name := p_part_name || '.DMP'; -- l_file_name was never assigned; assuming the dump-file name checked below
p_partition_name := upper(p_part_name);
v_partition_dumpfile := chr(39)||p_partition_name||chr(39);
v_partition_dumplog := p_partition_name || '.LOG';
SELECT COUNT(*) INTO v_rec_count FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
IF v_rec_count != 0 THEN
SELECT
PARTITION_ARCHIVAL_PERIOD
,PARTITION_ARCHIVAL_LOCATION
,PARTITION_ARCHIVAL_DIRECTORY
INTO v_ret_period , v_arch_location , v_arch_directory
FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
END IF;
utl_file.fgetattr('ORALOAD', l_file_name, l_exists, l_length, l_blksize);
IF (l_exists) THEN
utl_file.FRENAME('ORALOAD', l_file_name, 'ORALOAD', p_partition_name ||'_'|| to_char(systimestamp,'YYYYMMDDHH24MISS') ||'.DMP', TRUE);
END IF;
v_part_date := replace(p_partition_name,'P');
DBMS_OUTPUT.PUT_LINE('inside');
h1 := dbms_datapump.open (operation => 'EXPORT',
job_mode => 'TABLE'); -- closing parenthesis was missing in the original post
dbms_datapump.add_file (handle => h1,
filename => p_partition_name ||'.DMP',
directory => v_arch_directory,
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
dbms_datapump.add_file (handle => h1,
filename => p_partition_name||'.LOG',
directory => v_arch_directory,
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
dbms_datapump.metadata_filter (handle => h1,
name => 'SCHEMA_EXPR',
value => 'IN (''HDB'')');
dbms_datapump.metadata_filter (handle => h1,
name => 'NAME_EXPR',
value => 'IN (''SUBSCRIBER_EVENT'')');
dbms_datapump.data_filter (handle => h1,
name => 'PARTITION_LIST',
value => v_partition_dumpfile,
table_name => 'SUBSCRIBER_EVENT',
schema_name => 'HDB');
dbms_datapump.set_parameter(handle => h1, name => 'COMPRESSION', value => 'ALL');
dbms_datapump.start_job (handle => h1);
dbms_datapump.detach (handle => h1);
END;
/
Hi,
I tried generating the dump with expdp instead of the API and got more specific errors. Note that the log file does get created on the same path.
expdp hdb/hdb DUMPFILE=P2010110800.dmp DIRECTORY=ORALOAD TABLES=(SUBSCRIBER_EVENT:P2010110800) logfile=P2010110800.log
Export: Release 11.2.0.1.0 - Production on Wed Nov 10 01:26:13 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/nfs_path/lims/backup/hdb/datapump/P2010110800.dmp"
ORA-27054: NFS file system where the file is created or resides is not mounted with correct options
Additional information: 3
Edited by: Sachin B on Nov 9, 2010 10:33 PM -
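ORA-27054 usually means the NFS file system is not mounted with the options Oracle requires for database files. A hedged sketch of a typical remount on Solaris 10 follows; the exact options depend on platform and NAS vendor (they are listed in My Oracle Support note 359515.1), and "nfs_server" and the export path are placeholders, not values from this thread:

```
# Remount the NFS share with the options Oracle commonly requires
# (hard mount, 32K transfer sizes, TCP, no attribute caching).
umount /nfs_path
mount -F nfs -o rw,hard,rsize=32768,wsize=32768,proto=tcp,vers=3,noac \
      nfs_server:/export/lims /nfs_path
```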
ORA-39080: failed to create queues "" and "" for Data Pump job
When I am running datapump expdp I receive the following error:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user CHESHIRE_POLICE_LOCAL
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "" and "" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
ORA-01403: no data found
SYS has the following two objects still invalid after running catproc.sql, utlrp.sql, and manual compilation:
OBJECT_NAME OBJECT_TYPE
AQ$_KUPC$DATAPUMP_QUETAB_E QUEUE
SCHEDULER$_JOBQ QUEUE
When I run catdpb.sql, the Data Pump queue table is not created:
BEGIN
  dbms_aqadm.create_queue_table(
    queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
    multiple_consumers => TRUE,
    queue_payload_type => 'SYS.KUPC$_MESSAGE',
    comment            => 'DataPump Queue Table',
    compatible         => '8.1.3');
EXCEPTION
  WHEN OTHERS THEN
    IF SQLCODE = -24001 THEN
      NULL;  -- queue table already exists
    ELSE
      RAISE;
    END IF;
END;
ERROR at line 1:
ORA-01403: no data found
ORA-06512: at line 7
Snehashish Ghosh wrote:
While I run catdpb.sql the datapump queue table does not create:
dbms_aqadm.create_queue_table(queue_table => 'SYS.KUPC$DATAPUMP_QUETAB', multiple_consumers => TRUE, queue_payload_type =>'SYS.KUPC$_MESSAGE', comment => 'DataPump Queue Table', compatible=>'8.1.3');
Does it work better when specifying an Oracle version that is from this century, i.e. newer than 8.1?
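A hedged sketch of that suggestion, recreating the queue table with a compatible setting closer to this 10.2 database (the '10.0' value is an assumption on my part, not something confirmed in the thread):

```
BEGIN
  dbms_aqadm.create_queue_table(
    queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
    multiple_consumers => TRUE,
    queue_payload_type => 'SYS.KUPC$_MESSAGE',
    comment            => 'DataPump Queue Table',
    compatible         => '10.0');  -- assumed value; newer than the original '8.1.3'
END;
/
```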