Schema-level refresh: help required with types
Hi,
I have the following environment:
OS: Windows 2003
Database version: 10.2.0.3
No-archive-log mode
I have four schemas: schema1, schema2, schema3 and schema4. I created schema5 like schema4 by copying the grants from DBA_ROLE_PRIVS, DBA_TAB_PRIVS and DBA_SYS_PRIVS, then exported schema4 and imported it into schema5.
Three tables were not imported. On exploring, I found that those three tables are built on types in schema4. The types are present in all the schemas except schema5, and when I try to create the types manually they fail with compilation errors.
Any idea how to handle types in schema-level refreshes? I am using exp/imp with the fromuser/touser clauses.
Thanks,
Raman.
My work log is copied below:
USERNAME PASSWORD DEFAULT_TABLESPACE TEMPORARY_TABLESPACE
PROFILE
schema4 2072A1370A380D8A IMPACT TEMP
DEFAULT
GRANTEE GRANTED_ROLE ADM DEF
schema4 CONNECT NO YES
schema4 RESOURCE NO YES
schema4 GRP_IMPACT NO YES
schema4 GRUPPE_IMPACTNET NO YES
schema4 GRUPPE_IMPACTLOGIN NO YES
GRANTEE PRIVILEGE ADM
schema4 CREATE TABLE NO
schema4 CREATE ANY TABLE NO
schema4 UNLIMITED TABLESPACE NO
schema4 EXECUTE ANY PROCEDURE NO
create user schema5 identified by ***** default tablespace impact temporary tablespace temp profile default;
grant connect,resource,GRP_IMPACT,GRUPPE_IMPACTNET,GRUPPE_IMPACTLOGIN to schema5;
grant CREATE TABLE,CREATE ANY TABLE,UNLIMITED TABLESPACE,EXECUTE ANY PROCEDURE to schema5;
set head off
spool grants.sql
SELECT 'GRANT '||PRIVILEGE||' ON '||OWNER||'.'||TABLE_NAME||' TO '||'schema5;' FROM DBA_TAB_PRIVS WHERE GRANTEE='schema4';
spool off
@grants.sql
exp sys/******@****.world file=Impact_1.dmp,Impact_2.dmp,Impact_3.dmp FILESIZE =1000M log=schema4_exp.log consistent=y OWNER=schema4 STATISTICS=none
imp 'sys/*****@****.world as sysdba' file=Impact_1.dmp,Impact_2.dmp,Impact_3.dmp log=schema5_imp.log ignore=y fromuser=schema4 touser=schema5
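One common way to handle type-dependent tables in a schema-level exp/imp refresh (a suggestion, not part of the work log above) is to pre-create the dependent types in the target schema, after granting it EXECUTE on any types they reference in other schemas, and then re-run the import with imp's TOID_NOVALIDATE parameter so that the pre-created types' object identifiers are not compared against those in the dump file. A sketch, with a placeholder type name:

```
-- in the target schema, pre-create the types the failed tables depend on
-- (my_type is a placeholder for the actual type names from the import log)
CREATE TYPE schema5.my_type AS OBJECT ( ... );
/

-- then re-import, telling imp not to compare type OIDs:
imp 'sys/*****@****.world as sysdba' file=Impact_1.dmp,Impact_2.dmp,Impact_3.dmp fromuser=schema4 touser=schema5 ignore=y TOID_NOVALIDATE=(my_type)
```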
SQL> select count(*),object_type,status,owner from dba_objects where owner like 'schema4' group by object_type,status,owner;
1 LOB VALID schema4
98 TYPE VALID schema4
407 VIEW VALID schema4
786 INDEX VALID schema4
1379 TABLE VALID schema4
44 PACKAGE VALID schema4
19 SYNONYM VALID schema4
50 TRIGGER VALID schema4
153 FUNCTION VALID schema4
22 SEQUENCE VALID schema4
460 PROCEDURE VALID schema4
3 TYPE BODY VALID schema4
42 PACKAGE BODY VALID schema4
3 DATABASE LINK VALID schema4
14 rows selected.
SQL> select count(*),object_type,status,owner from dba_objects where owner like 'schema5' group by object_type,status,owner;
1 LOB VALID schema5
59 TYPE VALID schema5
392 VIEW VALID schema5
15 VIEW INVALID schema5
780 INDEX VALID schema5
1376 TABLE VALID schema5
41 PACKAGE VALID schema5
3 PACKAGE INVALID schema5
19 SYNONYM VALID schema5
50 TRIGGER VALID schema5
89 FUNCTION VALID schema5
64 FUNCTION INVALID schema5
22 SEQUENCE VALID schema5
126 PROCEDURE VALID schema5
334 PROCEDURE INVALID schema5
24 PACKAGE BODY VALID schema5
18 PACKAGE BODY INVALID schema5
3 DATABASE LINK VALID schema5
SQL> select count(*) from dba_objects where owner like 'schema4';
3467
SQL> select count(*) from dba_objects where owner like 'schema5';
3416
=================
15 VIEW INVALID schema5
3 PACKAGE INVALID schema5
64 FUNCTION INVALID schema5
334 PROCEDURE INVALID schema5
18 PACKAGE BODY INVALID schema5
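As a follow-up step (not shown in the log): INVALID objects in these counts are normal right after an import, and are usually cleared by a mass recompile using the standard Oracle facilities:

```
-- recompile everything in the target schema...
EXEC DBMS_UTILITY.COMPILE_SCHEMA('SCHEMA5');
-- ...or recompile all invalid objects database-wide (as SYSDBA)
@?/rdbms/admin/utlrp.sql
```

Note this only clears the INVALID counts; the three type-dependent tables will still be missing until the type-creation problem itself is resolved.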
Similar Messages
-
What level of supplemental logging is required to set up Streams at schema level
Hi,
I am setting up Streams from a 10g to an 11g database at schema level. The session hangs on the statement "ALTER DATABASE ADD SUPPLEMENTAL LOG DATA" while running the following command, which was generated using DBMS_STREAMS_ADM.MAINTAIN_SCHEMAS.
Begin
dbms_streams_adm.add_schema_rules(
schema_name => '"DPX1"',
streams_type => 'CAPTURE',
streams_name => '"CAPTURE_DPX1"',
queue_name => '"STRMADMIN"."CAPTURE_QUEUE"',
include_dml => TRUE,
include_ddl => TRUE,
include_tagged_lcr => TRUE,
source_database => 'DPX1DB',
inclusion_rule => TRUE,
and_condition => get_compatible);
END;
The generated script also sets each table to table-level logging: 'ALTER TABLE "DPX1"."DEPT" ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY, FOREIGN KEY, UNIQUE INDEX) COLUMNS'.
So my question is: is database-level supplemental logging required to set up schema-level replication? If the answer is no, why does the generated script invoke the "ALTER DATABASE ADD SUPPLEMENTAL LOG DATA" command?
Thanks in advance.
Regards,
Sridhar

Hi Sridhar,
From what I found, "ALTER DATABASE ADD SUPPLEMENTAL LOG DATA" is required for the first capture you create in a database. Once it has been run, V$DATABASE shows the column SUPPLEMENTAL_LOG_DATA_MIN set to YES. It requires a strong level of locking; for example, you cannot run this ALTER DATABASE while an index rebuild is running (maybe even a rebuild online).
I know it is called implicitly by DBMS_STREAMS_ADM.ADD_TABLE_RULES for the first rule created.
So you can just run the statement once in a maintenance window and you'll be all set.
Minimal Supplemental Logging - http://www.oracle.com/pls/db102/to_URL?remark=ranked&urlname=http:%2F%2Fdownload.oracle.com%2Fdocs%2Fcd%2FB19306_01%2Fserver.102%2Fb14215%2Flogminer.htm%23sthref2006
NOT to be confused with database level supplemental log group.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14228/mon_rep.htm#BABHHCCC
Hope this helps,
Regards, -
PLS-00329: schema-level type has illegal reference to
I am trying to create a PL/SQL package which needs a table type defined at schema level. The type refers to a table in another schema.
I get the error below when I try to create the type. What is wrong?
My DBA granted me REFERENCES access, which I can see in DBA_TAB_PRIVS as shown below. Is there some other privilege that is still missing? Please help.
READ SCOTT ORDERLINES SCOTT REFERENCES YES NO
CREATE TYPE type_tab is TABLE of scott.orderlines%ROWTYPE;
Warning: Type created with compilation errors.
show errors;
Errors for TYPE TEST_TAB:
LINE/COL ERROR
0/0 PL/SQL: Compilation unit analysis terminated
1/32 PLS-00329: schema-level type has illegal reference to
BAAN.TTDSLS401100

Thanks Jens. The reason I was trying to do this was somebody's suggestion for correcting an invalid-datatype error I was getting in my package body.
Could you let me know, then, why I am getting the following error in my package body below? Help would be much appreciated.
57/22 PL/SQL: SQL Statement ignored
57/60 PL/SQL: ORA-00902: invalid datatype
The error is in the line open p_recordset for select * from TABLE(CAST(tbl_order as typ_tab));
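For context: TABLE(CAST(...)) in a SQL statement requires a SQL collection type created at schema level; a package-declared INDEX BY table such as typ_tab is a PL/SQL-only type that the SQL engine cannot see, hence the ORA-00902. A schema-level pair of types would look roughly like this (hypothetical names; %ROWTYPE cannot be used in a SQL type, so the columns have to be listed explicitly):

```
CREATE TYPE typ_order AS OBJECT (
  ord  VARCHAR2(20),  -- stand-ins for the T$orno/T$pono/T$cpva columns
  pono NUMBER,
  cpva NUMBER
);
/
CREATE TYPE typ_order_tab AS TABLE OF typ_order;
/
```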
CREATE OR REPLACE type myTableType as table of varchar2(20000);
CREATE OR REPLACE PACKAGE TEST_PROC_PKG
IS
TYPE cursor_type IS REF CURSOR;
TYPE typ_rec IS RECORD (
ord scott.orderlines.T$orno%TYPE,
pono scott.orderlines.T$pono%TYPE,
cpva scott.orderlines.T$cpva%TYPE
);
rec_ord typ_rec;
TYPE typ_tab is TABLE of typ_rec
INDEX BY BINARY_INTEGER;
tbl_order typ_tab;
FUNCTION in_list(p_string IN varchar2) return myTableType;
PROCEDURE TEST_PROC(p_orno IN VARCHAR2, p_recordset OUT cursor_type);
END;
CREATE OR REPLACE PACKAGE BODY TEST_PROC_PKG
IS
FUNCTION in_list(p_string IN varchar2) return myTableType
IS
l_string long default p_string || ',';
l_data myTableType := myTableType();
n number;
BEGIN
LOOP
EXIT WHEN l_string is null;
n := instr( l_string, ',' );
l_data.extend;
l_data(l_data.count) :=
ltrim( rtrim( substr( l_string, 1, n-1 ) ) );
l_string := substr( l_string, n+1 );
END LOOP;
return l_data;
END in_list;
PROCEDURE TEST_PROC(p_orno IN VARCHAR2, p_recordset OUT cursor_type)
IS
TYPE type_curvar IS REF CURSOR;
cur_order type_curvar;
i NUMBER := 1;
BEGIN
OPEN cur_order FOR select T$orno, T$pono, T$cpva
from scott.orderlines
where T$orno in ( select *
from THE ( select cast( in_list(p_orno)
as mytableType ) from dual ) );
LOOP
FETCH cur_order INTO rec_ord;
EXIT WHEN cur_order%NOTFOUND;
tbl_order(i).cpva := rec_ord.pono + rec_ord.cpva;
tbl_order(i).ord := rec_ord.ord;
tbl_order(i).pono := rec_ord.pono;
DBMS_OUTPUT.PUT_LINE(tbl_order(i).cpva);
i := i + 1;
END LOOP;
CLOSE cur_order;
open p_recordset for select * from TABLE(CAST(tbl_order as typ_tab));
EXCEPTION
WHEN OTHERS THEN
open p_recordset FOR select T$orno, T$pono, T$cpva
from scott.orderlines
where T$orno = ' ';
END TEST_PROC;
END; -
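As an aside, the comma-splitting loop in the in_list function above is easier to follow line by line. A Python rendering of the same logic (illustrative only; the original is PL/SQL):

```python
def in_list(p_string):
    """Split a comma-delimited string, trimming each element."""
    out = []
    s = p_string + ","             # mirrors: l_string := p_string || ','
    while s:                       # EXIT WHEN l_string IS NULL
        n = s.index(",")           # instr(l_string, ',')
        out.append(s[:n].strip())  # ltrim(rtrim(substr(l_string, 1, n-1)))
        s = s[n + 1:]              # substr(l_string, n+1)
    return out

print(in_list("a, b ,c"))  # ['a', 'b', 'c']
```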
JDeveloper, Can not build schema - Help Required
Hi everyone,
I am facing a peculiar problem with Jdeveloper. (10.1.3.4.4270)
I am developing an ESB project. When referring to one XSD from another XSD using a URL import, it gives me an error.
Here is the sample code:
<?xml version="1.0" encoding="utf-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
<xsd:import namespace="http://xmlns.mysite.com/q/types"
schemaLocation="http://192.168.8.10:7072/mySchemas/xmltypes.xsd" />
(http://192.168.8.10:7072/mySchemas/xmltypes.xsd is accessible from a browser directly on the machine where the project dir exists)
When opening the XSD in design mode, it shows red (error) for any element that uses types from xmltypes.xsd. When opening the XSL mapping which uses the current XSD, it gives the following error:
"Failed to open the target schema:
Can not build schema 'http://xmlns.mysite.com/q/types' located at 'file:/C:/....../xmltypes.xsd'"
(It seems JDeveloper is looking for xmltypes.xsd in the project dir, even though the import points to a URL.)
The same code works fine in Eclipse Ganymede, and elements in the current XSD are able to refer to types in xmltypes.xsd.
What could be the problem? I am not finding any solution for this.
Thanks in advance.

Thank you for your reply. But I had assigned the DBA role to the user; could you tell me what version can build the schema? I had used the RAM 6.3.4.0.0 Complete 6.334.
Thanks in advance
Chris -
Help required in mapping MM pricing
1. The PO should have conditions for freight, insurance and bank charges, all paid to the vendor who is supplying the goods. These costs will have to be loaded onto the materials at GR.
2. The PO should not have any values against the conditions mentioned above, as the supplying vendor will not be paid for them.
When the goods arrive, these charges are paid to one or more local clearing agents, who are paid based on the invoices they send. These costs should also be booked onto the material at GR.
The local vendors clear the goods at the port and charge the company for it, sending invoices. The company wants to track these expenses per vendor, and also wants a different GL account for each type of condition (heads of expense such as port charges, clearing charges, insurance charges, bank charges and freight, which these vendors charge in their invoices and which the original supplier does not charge to the company).
Importantly, all these costs also have to be loaded onto the material. How will this be done?
Consider also the case where the PO includes only conditions such as insurance and bank charges, and the rest of the charges mentioned above are borne by the local vendor for clearing the goods.
Now when I do the goods receipt, the value gets updated only on a quantity * rate basis in the material master. How can we load the values incurred for clearing the goods at the time of GR?
Example: PO qty 100 @ rate 10/ea.
GR: stock with MAP updated as 10 (no invoice yet from the original supplier, nor from the local clearing vendor).
Now these goods are transferred to another plant via, say, an STO or a plant-to-plant transfer. That material will also get updated with a MAP of 10.
A couple of days later, the local vendor's invoice comes in and he is actually paid the charges he claimed, and the invoice of the original goods supplier also arrives. When these invoices are posted, how will these costs be added to the material?
Can anyone please suggest the best possible solution ASAP, as I am left with no time here.
Regards,
Manoj

Hi,
Your requirement is very common across manufacturing companies.
1) Create separate condition types for freight, insurance and bank charges, with condition category "B - Delivery costs".
2) Flag the "accruals" check box.
3) Select the appropriate value for the "Vendor in GR" field as per your requirement.
4) Insert these condition types in the calculation schema as per the requirement.
5) Assign an accrual key. If the amounts are to be posted to different accounts, use different accrual keys.
6) Maintain the "Actual Value" line (a sort of total of all condition types used in the schema) so that it includes the values of the above condition types. Check that the value "S" is assigned in the "subtotals" field for "Actual Value" (if you are following the standard JRM000, just insert your condition types before the "Actual Value" level).
Now your schema is ready. Since the values of freight, insurance and bank charges are included in the "Actual Value", the material will always be loaded with this value at the time of GR.
Usually the vendor code is defaulted into the vendor field of the above condition types. You can change it at the time of PO creation by selecting the condition type and clicking on the details icon.
You can post the vendor invoice for the base value and taxes using the "goods/service items" variant, and post the other charges to a different vendor account using the "planned delivery costs" variant. If there are any differences between the planned value in the PO and the actual invoice receipt, they are adjusted automatically.
If you require any further clarification, please revert.
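The "loading costs on the material at GR" arithmetic discussed in this thread can be illustrated numerically (made-up figures; this is an arithmetic sketch, not SAP code):

```python
def map_after_gr(old_qty, old_map, gr_qty, gr_price, delivery_costs):
    """Moving average price after a goods receipt that loads
    delivery costs (freight/insurance/bank charges) onto the material."""
    old_value = old_qty * old_map
    gr_value = gr_qty * gr_price + delivery_costs  # accrual conditions load here
    return (old_value + gr_value) / (old_qty + gr_qty)

# PO 100 ea @ 10, plus 50 in planned delivery costs, starting from empty stock:
print(map_after_gr(0, 0.0, 100, 10.0, 50.0))  # 10.5 rather than the bare 10
```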
regards,
Mallik -
Hello,
I just downloaded and installed Vivado 2013.4 on my Xubuntu 12.04 machine. But when I try to add IP from the IP Catalog, as in UG937 Lab 1 Step 2, it fails with obscure error messages (see below).
Here's basically what I did:
- In the Flow Navigator, I select the IP Catalog button.
- In the search field of the IP Catalog, I type DDS.
- Then I double-click the DDS Compiler, and the errors below occur.
Please Help,
Jerome.
set_property target_language Verilog [current_project]
set_property constrs_type XDC [current_fileset -constrset]
update_compile_order -fileset sim_1
[Project 1-11] Changing the constrs_type of fileset 'constrs_1' to 'XDC'.
[IP_Flow 19-2313] Loaded Vivado IP repository '/opt/Xilinx/Vivado/2013.4/data/ip'.
[IP_Flow 19-1704] No user IP repositories specified
[IP_Flow 19-234] Refreshing IP repositories
[HDL 9-1061] Parsing VHDL file "/home/jmassol/Desktop/Vivado/vivado_debug/ug937/project_xsim/project_xsim.srcs/sources_1/imports/ug937/sources/sinegen_demo.vhd" into library work
[HDL 9-1061] Parsing VHDL file "/home/jmassol/Desktop/Vivado/vivado_debug/ug937/project_xsim/project_xsim.srcs/sources_1/imports/ug937/sources/sinegen.vhd" into library work
[HDL 9-1061] Parsing VHDL file "/home/jmassol/Desktop/Vivado/vivado_debug/ug937/project_xsim/project_xsim.srcs/sources_1/imports/ug937/sources/fsm.vhd" into library work
[HDL 9-1061] Parsing VHDL file "/home/jmassol/Desktop/Vivado/vivado_debug/ug937/project_xsim/project_xsim.srcs/sources_1/imports/ug937/sources/debounce.vhd" into library work
[HDL 9-1654] Analyzing Verilog file "/home/jmassol/Desktop/Vivado/vivado_debug/ug937/project_xsim/project_xsim.srcs/sources_1/imports/ug937/sim/testbench.v" into library work
The following four messages then repeat throughout the log:
[IP_Flow 19-395] Problem validating against XML schema: Invalid value format for this type spirit:order
[IP_Flow 19-193] Failed to save BOM file '/home/jmassol/.Xil/Vivado-2683-ubuntu/coregen/dds_compiler_0/dds_compiler_0.xml'.
[IP_Flow 19-194] Failed to save IP instance 'dds_compiler_0'.
[IP_Flow 19-3378] Failed to create IP instance 'dds_compiler_0'. Error saving IP file.
We had the same problem when switching to Ubuntu 14.04, and there actually is a solution for it: make sure your locales are set to English.
$> env | grep LC_*
should show only English (or C) locales; all others are known to cause parsing errors in some numbers, usually caused by wrong string-to-float conversions (e.g. 18,29 in German is 18.29 in English). You can change the locales in the file /etc/default/locale. This is not the first time we have had problems with locale settings; Xilinx does not seem to test their software with anything other than en_US, causing obscure bugs like this one.
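The failure mode described here, wrong string-to-float conversions under a non-English locale, can be reproduced in a few lines (illustrative only; Vivado itself is not Python, but the parsing problem is the same):

```python
def parse_c_locale(s):
    # float() always expects '.' as the decimal separator (C/English locale)
    return float(s)

print(parse_c_locale("18.29"))   # fine under the English convention
try:
    parse_c_locale("18,29")      # the same number, formatted under de_DE
except ValueError:
    print("parse error")         # what locale-naive tools run into
```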
HTH
Philipp -
Schema level and table level supplemental logging
Hello,
I'm setting up bi- directional DML replication between two oracle databases. I have enabled supplemental logging database level by running this command-
SQL>alter database add supplemental log data (primary key) columns;
Database altered.
SQL> select SUPPLEMENTAL_LOG_DATA_MIN, SUPPLEMENTAL_LOG_DATA_PK, SUPPLEMENTAL_LOG_DATA_UI from v$database;
SUPPLEME SUP SUP
IMPLICIT YES NO
My question is: should I also enable supplemental logging at table level (for DML replication only)? Should I also run the commands below?
GGSCI (db1) 1> DBLOGIN USERID ggs_admin, PASSWORD ggs_admin
Successfully logged into database.
GGSCI (db1) 2> ADD TRANDATA schema.<table-name>
What is the difference between schema-level and table-level supplemental logging?

For Oracle, ADD TRANDATA by default enables table-level supplemental logging. The supplemental log group includes one of the following sets of columns, in the listed order of priority, depending on what is defined on the table:
1. Primary key
2. First unique key alphanumerically with no virtual columns, no UDTs, no function-based columns, and no nullable columns
3. First unique key alphanumerically with no virtual columns, no UDTs, or no function-based columns, but which can include nullable columns
4. If none of the preceding key types exist (even though there might be other types of keys
defined on the table) Oracle GoldenGate constructs a pseudo key of all columns that
the database allows to be used in a unique key, excluding virtual columns, UDTs,
function-based columns, and any columns that are explicitly excluded from the Oracle
GoldenGate configuration.
The command issues an ALTER TABLE command with an ADD SUPPLEMENTAL LOG DATA clause that
is appropriate for the type of unique constraint (or lack of one) that is defined for the table.
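The four-level priority above can be modelled as a small selection routine. This is a hypothetical sketch of the rule, not GoldenGate code; each column carries flags for the virtual/UDT/function-based/nullable exclusions:

```python
def pick_log_group(primary_key, unique_keys, all_columns):
    """Choose supplemental-log columns per the priority list above."""
    if primary_key:                               # 1. primary key
        return primary_key
    def usable(col, allow_nullable):
        return (not (col["virtual"] or col["udt"] or col["fbi"])
                and (allow_nullable or not col["nullable"]))
    for allow_nullable in (False, True):          # 2. then 3.
        eligible = [k for k in unique_keys
                    if all(usable(c, allow_nullable) for c in k)]
        if eligible:                              # first key alphanumerically
            return min(eligible, key=lambda k: [c["name"] for c in k])
    # 4. pseudo key of all columns the database allows in a unique key
    return [c for c in all_columns if usable(c, True)]
```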
When to use ADD TRANDATA for an Oracle source database
Use ADD TRANDATA only if you are not using the Oracle GoldenGate DDL replication feature.
If you are using the Oracle GoldenGate DDL replication feature, use the ADD SCHEMATRANDATA command to log the required supplemental data. It is possible to use ADD
TRANDATA when DDL support is enabled, but only if you can guarantee one of the following:
● You can stop DML activity on any and all tables before users or applications perform DDL on them.
● You cannot stop DML activity before the DDL occurs, but you can guarantee that:
❍ There is no possibility that users or applications will issue DDL that adds new tables whose names satisfy an explicit or wildcarded specification in a TABLE or MAP
statement.
❍ There is no possibility that users or applications will issue DDL that changes the key definitions of any tables that are already in the Oracle GoldenGate configuration.
ADD SCHEMATRANDATA ensures replication continuity should DML ever occur on an object for which DDL has just been performed.
You can use ADD TRANDATA even when using ADD SCHEMATRANDATA if you need to use the COLS option to log any non-key columns, such as those needed for FILTER statements and KEYCOLS clauses in the TABLE and MAP parameters.
Additional requirements when using ADD TRANDATA
Besides table-level logging, minimal supplemental logging must be enabled at the database level in order for Oracle GoldenGate to process updates to primary keys and
chained rows. This must be done through the database interface, not through Oracle GoldenGate. You can enable minimal supplemental logging by issuing the following DDL
statement:
SQL> alter database add supplemental log data;
To verify that supplemental logging is enabled at the database level, issue the following statement:
SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE;
The output of the query must be YES or IMPLICIT. SUPPLEMENTAL_LOG_DATA_MIN must be explicitly set, because it is not enabled automatically when other LOG_DATA options are set.
If you require more details, refer to the Oracle® GoldenGate Windows and UNIX Reference Guide 11g Release 2 (11.2.1.0.0). -
Hi all,
Is it possible to write a schema-level trigger with DCL and DDL commands in it?
In my database there are multiple schemas (X and Y) with different privileges.
If I create a table in schema X, I have to create a synonym for that table for the other schema Y, and I want to grant the SELECT privilege on that table to Y. My version is 10.2.0.4.
create or replace
TRIGGER bcs_trigger
after create ON X.SCHEMA
declare
Cursor table_cur is
Select object_id, object_name, object_type, owner
from DBA_OBJECTS
where to_date(created,'DD/MM/YYYY')= to_date(SYSDATE,'DD/MM/YYYY');
type TABLE_collect is table of table_cur %rowtype;
TACOLL TABLE_collect;
v_msg varchar2(1000) := 'SYNONYM HAS BEEN CREATED';
V_error varchar2(1000) := 'ALREADY EXIST';
BEGIN
open table_cur;
loop
fetch table_cur bulk collect into TACOLL;
exit when table_cur %notfound;
end loop;
close table_cur;
for i in 1.. TACOLL.count
loop
IF
TACOLL(i). OBJECT_TYPE ='TABLE'
THEN
execute immediate 'create synonym '||TACOLL(i).OBJECT_NAME||' for '||TACOLL(i).OBJECT_NAME;
execute immediate 'grant select on '||TACOLL(i).OBJECT_NAME||' to Y ';
dbms_output.put_line ('v_msg');
end if;
end loop;
end;

Hi Suresh,
Welcome to OTN Forums!
I think this will help you:
CREATE OR REPLACE TRIGGER bcs_trigger AFTER
CREATE ON X.SCHEMA
DECLARE
V_MSG VARCHAR2(1000) := 'SYNONYM HAS BEEN CREATED';
BEGIN
FOR OBJ IN
(SELECT object_id,
object_name,
object_type,
owner
FROM DBA_OBJECTS
WHERE TO_DATE(CREATED,'DD/MM/YYYY')= TO_DATE(SYSDATE,'DD/MM/YYYY'))
LOOP
IF OBJ.OBJECT_TYPE = 'TABLE' THEN
EXECUTE IMMEDIATE 'create or replace synonym '||OBJ.OBJECT_NAME||' for '||OBJ.OBJECT_NAME;
EXECUTE IMMEDIATE 'grant select on '||OBJ.OBJECT_NAME||' to Y ';
dbms_output.put_line (V_MSG|| ' : ' ||OBJ.OBJECT_NAME);
END IF;
END LOOP;
END ; -
Schema level health checkup ?
What contents or parameters come under a schema-level health checkup?
Hi,
What is your requirement? Can you elaborate more?
AFAIK, a health checkup of the DB is concerned with whether the services are up and running or not.
- Pavan Kumar
Yes, that is my confusion: what is a schema-level health checkup?
A DB-level health checkup consists of any number of parameters, like memory settings, initialization parameters, CPU usage and many more.
What is a schema-level health checkup? Can anybody help me?
OBN call to get value - help required !!!
Hi Friends,
The scenario is: upon clicking the LinkToAction button, I am able to invoke the OBN operation service, which gives the list of values, using the API function below.
WDPortalNavigation.navigateToObjectWithSpecificOperation
Prior to this, the LinkToAction was coded to call the OBN service, the event was created, subscribed (at wdInit), unsubscribed (at wdExit), and the event action handled.
Code to call OBN service as below
String system="XXX_LPT", businessObjType="", objValue="absence", objValueName="absence", operation="show_orgunit_three",
businessParameters="mode=dialog&location=no&menubar=no&titlebar=no&toolbar=no&ShowHeader=false";
boolean forwardOBNMetaData=true, resolveForSourceRoleOnly=true;
WDPortalNavigation.navigateToObjectWithSpecificOperation(system, objValue, objValueName, operation, businessParameters);
The issue is that the OBN service call happens in a separate tab of the browser, and selecting an item just refreshes the same window.
Expected: upon clicking the LinkToAction button from the parent window, the OBN service should display in a separate window (not in a browser tab), and after selecting the value the window should close and the value should be available to the parent view, to assign to an input box.
Can anybody help me with this? I am really stuck.
Thanks in advance,
Kantha

Hello Vaibhav,
Make sure that the field defined in your node in the context of the Web Dynpro view (or at the component controller level, if defined there) has the search help defined. Usually, if the search help is tied to the database table's field name, an 'Input Help Mode' of type automatic should display the determined search help.
If that setting is there, then in the Adobe form create an input field of the type Web Dynpro Native -> "Value help drop-down list", and bind your database field to this input field.
When you do that, the object type of the field changes to text_field; change that back to a drop-down field from the pick list provided for the objects, then activate it and test it out.
Hope this helps.
Thanks and best regards,
Leena -
XSLT mapping Help Required.
Hi Experts,
I am New to XSLT Mapping. I am practising the below Example:
InputXML File:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="Persons111.xsl"?>
<ns0:MT_XSLT_Source xmlns:ns0="http://XYZ.com/gen">
<Person>
<FirstName>Anshul</FirstName>
<LastName>Chowdhary</LastName>
<Gender>Male</Gender>
<Address>
<Street>2nd Main</Street>
<Houseno>83/b</Houseno>
<City>Mysore</City>
</Address> </Person>
</ns0:MT_XSLT_Source>
XSL StyleSheet File:
<?xml version='1.0' encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ns0="http://XYZ.com/Gen"
xmlns:ns1="http://XYZ.com/Test">
<xsl:template match="/">
<ns1:MT_XSLT_Target>
<Title> <xsl:value-of select="ns0:MT_XSLT_Source/Person/Gender"/> </Title>
<Name> <xsl:value-of select="concat(concat(ns0:MT_XSLT_Source/Person/FirstName,' '), ns0:MT_XSLT_Source/Person/LastName)"/>
</Name>
<Street> <xsl:value-of select="concat(concat(ns0:Mt_XSLT_Source/Person/Address/Houseno,' '),
ns0:Mt_XSLT_Source/Person/Address/Street)"/> </Street>
<City> <xsl:value-of select="ns0:Mt_XSLT_Source/Person/Address/City"/> </City>
</ns1:MT_XSLT_Target>
</xsl:template>
</xsl:stylesheet>
The desired output should be:
<?xml version="1.0" encoding="UTF-8"?>
<ns1:MT_XSLT_Target xmlns:ns1="http://XYZ.com/Test">
<Title>Male</Title>
<Name>Anshul Chowdhary</Name>
<Street>83/b 2nd Main</Street>
<City>Mysore</City>
</ns1:MT_XSLT_Target>
I have referenced the XSL in the XML, and I am getting the below output in a single line, like this:
Anshul Chowdhary Male 2nd Main 83/b Mysore
I am unable to display it in the target XML format shown above. Please check and do the needful.
Regards,
GIRIDHAR
Hi,
I have used the following for testing.
Input xml:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="Persons111.xsl"?>
<ns0:MT_XSLT_Source xmlns:ns0="http://XYZ.com/gen">
<Person>
<FirstName>Anshul</FirstName>
<LastName>Chowdhary</LastName>
<Gender>Male</Gender>
<Address>
<Street>2nd Main</Street>
<Houseno>83/b</Houseno>
<City>Mysore</City>
</Address> </Person>
</ns0:MT_XSLT_Source>
xsl code:
<?xml version='1.0' encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ns0="http://XYZ.com/gen"
xmlns:ns1="http://XYZ.com/Test">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
<xsl:template match="/">
<ns1:MT_XSLT_Target>
<Title> <xsl:value-of select="ns0:MT_XSLT_Source/Person/Gender"/> </Title>
<Name> <xsl:value-of select="concat(concat(ns0:MT_XSLT_Source/Person/FirstName,' '), ns0:MT_XSLT_Source/Person/LastName)"/>
</Name>
<Street> <xsl:value-of select="concat(concat(/ns0:MT_XSLT_Source/Person/Address/Houseno,' '),
/ns0:MT_XSLT_Source/Person/Address/Street)"/> </Street>
<City> <xsl:value-of select="/ns0:MT_XSLT_Source/Person/Address/City"/> </City>
</ns1:MT_XSLT_Target>
</xsl:template>
</xsl:stylesheet>
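The root cause in the original stylesheet is worth spelling out: XML namespace URIs and element names are compared case-sensitively, so ns0 bound to "http://XYZ.com/Gen" never matches a document whose namespace is "http://XYZ.com/gen", and a path written as Mt_XSLT_Source never matches MT_XSLT_Source. A small Python sketch using only the standard library (not PI-specific, just to illustrate the case-sensitivity) shows this:

```python
import xml.etree.ElementTree as ET

# The source message from the thread (XML declaration omitted here).
xml_src = """<ns0:MT_XSLT_Source xmlns:ns0="http://XYZ.com/gen">
<Person>
<FirstName>Anshul</FirstName>
<LastName>Chowdhary</LastName>
<Gender>Male</Gender>
<Address>
<Street>2nd Main</Street>
<Houseno>83/b</Houseno>
<City>Mysore</City>
</Address> </Person>
</ns0:MT_XSLT_Source>"""

root = ET.fromstring(xml_src)

# Namespace URIs are compared character by character, case included:
print(root.tag == "{http://XYZ.com/gen}MT_XSLT_Source")   # True
print(root.tag == "{http://XYZ.com/Gen}MT_XSLT_Source")   # False: wrong case, no match

# The child elements carry no namespace, so plain paths reach them.
name = root.findtext("Person/FirstName") + " " + root.findtext("Person/LastName")
street = root.findtext("Person/Address/Houseno") + " " + root.findtext("Person/Address/Street")
city = root.findtext("Person/Address/City")
print(name, "|", street, "|", city)   # Anshul Chowdhary | 83/b 2nd Main | Mysore
```

The same rule applies inside the XSLT: every prefix used in a select path must be bound to the exact namespace URI of the source document.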
Note the differences from your stylesheet: the namespace URI must match the source exactly and is case-sensitive ("http://XYZ.com/gen", not "http://XYZ.com/Gen"), element names such as MT_XSLT_Source are case-sensitive too, and the xmlns:ns1 declaration must use plain ASCII quotes rather than the typographic quotes in your post. For testing in PI, change the extension from .txt to .xsl, zip it, and upload it into PI as an imported archive.
Regards
Venkat -
Help required when using function module F4IF_INT_TABLE_VALUE_REQUEST
Hi,
I wrote the logic as follows.
SELECT-OPTIONS: s_lgart FOR pa0015-lgart.
SELECT lgart FROM t512z INTO TABLE it_temp WHERE infty = '0015' AND lgart IN s_lgart.
IF NOT it_temp[] IS INITIAL.
SORT it_temp BY lgart.
DELETE ADJACENT DUPLICATES FROM it_temp COMPARING lgart.
LOOP AT it_temp.
SELECT SINGLE lgtxt FROM t512t INTO it_temp-description
WHERE lgart = it_temp-lgart AND
sprsl = 'EN'.
IF sy-subrc = 0.
MODIFY it_temp INDEX sy-tabix.
ENDIF.
ENDLOOP.
At present the internal table it_temp contains the following rows:
5100 Relolump sum
5111 SIP
My requirement is: when I press F4 on the wage type field in the selection screen, I should be able to see both fields (the wage type and its description). Example:
CALL FUNCTION 'F4IF_INT_TABLE_VALUE_REQUEST'
EXPORTING
RETFIELD = 'VBELN'
DYNPPROG = W_PROGNAME
DYNPNR = W_SCR_NUM
DYNPROFIELD = 'KUNNR'
VALUE_ORG = 'S'
TABLES
VALUE_TAB = INT_F4
RETURN_TAB = RETURN_VALUES
EXCEPTIONS
PARAMETER_ERROR = 1
NO_VALUES_FOUND = 2
OTHERS = 3. -
Urgent help required: Query regarding LC Variables
Hi All
Some time ago I was working on a performance issue raised by a customer: a shell script that was taking almost 8 to 9 hours to complete. During my research I found that some variables were not set; the unset LC variables were impacting the sort funnel operations, which is why the script was taking so long to execute.
I asked them to export the following variables, after which the program ran smoothly and finished in a couple of minutes:
export LC_COLLATE=en_US.ISO8859-1
export LC_MESSAGES=C
export LC_MONETARY=en_US.ISO8859-1
export HZ=100
export LC_CTYPE=en_US.ISO8859-1
export LANG=en_US.UTF-8
Later I discovered that setting LC_COLLATE to C does not help, and the program was again taking a long time. A few questions I want to ask:
1. Can someone please tell me what each of these variables means and how their values make a difference?
2. When I exported LC_COLLATE=en_US.ISO8859-1 it worked fine, but when I tried the default value LC_COLLATE=C, why didn't the program work?
As this issue is still ongoing, I would request everyone to provide their valuable inputs and tell me as much as possible.
Appreciate your help in this regard.
Thanks
Amit
Hi All
A new development in this regard: the customer has sent us a screenshot in which they were trying to export the locale variables using the commands I pasted above. I can see in the screenshot that while exporting LC_COLLATE and LC_CTYPE, they get the message "ksh: export: couldn't set locale correctly".
Request everyone to please give their inputs as it's a bit urgent.
Thanks for all the help in advance.
Thanks
Amit
Some help required please...
Edited by: amitsinhaengg on Jul 22, 2009 2:03 AM
Edited by: amitsinhaengg on Jul 22, 2009 2:06 AM
LC_CTYPE
Controls the behavior of character handling functions.
LC_TIME
Specifies date and time formats, including month names, days of the week, and common full and abbreviated representations.
LC_MONETARY
Specifies monetary formats, including the currency symbol for the locale, thousands separator, sign position, the number of fractional digits, and so forth.
LC_NUMERIC
Specifies the decimal delimiter (or radix character), the thousands separator, and the grouping.
LC_COLLATE
Specifies a collation order and regular expression definition for the locale.
LC_MESSAGES
Specifies the language in which the localized messages are written, and affirmative and negative responses of the locale (yes and no strings and expressions).
You can use the command
# locale -k LC_CTYPE
to see more detail about each type. -
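To see concretely how LC_COLLATE changes behavior, here is a small sketch (POSIX systems only; whether en_US.ISO8859-1 is installed on a given box is an assumption, so the code only runs the always-available C locale) that drives the system sort(1) under an explicit collation setting:

```python
import subprocess

def sort_lines(text, locale_name):
    """Run the system sort(1) with a given collation locale (POSIX only)."""
    env = {"LC_ALL": locale_name, "PATH": "/usr/bin:/bin"}
    res = subprocess.run(["sort"], input=text, capture_output=True,
                         text=True, env=env)
    return res.stdout

# In the C locale, sort compares raw byte values, so every uppercase
# letter ('B' = 0x42) sorts before every lowercase one ('a' = 0x61).
print(sort_lines("a\nB\n", "C"))   # 'B' comes first
# Under en_US.ISO8859-1 (if installed), dictionary collation is used
# instead and 'a' would sort before 'B'. This is why changing
# LC_COLLATE changes both the output order and the cost of the
# comparisons sort performs, which can matter for large sort funnels.
```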
Help required - Sales order item is partially delivered but the item grayed
I have a sales order 123 with, say, item 10 with qty 1, item 20 with qty 10, and item 30 with qty 12.
Item 10: confirmed qty 1, delivered qty 1.
Item 20: confirmed qty 10, delivered qty 10.
Item 30: confirmed qty 1, delivered qty 1.
Now item 30 still has an open quantity of 11 to be delivered, but the item is already GRAYED OUT.
Even if I run an ATP check, the quantity is not confirmed for the remaining 11 pieces.
Why is that? How do I get the item out of the grayed-out state?
How do I confirm the remaining qty of 11 for that item?
Help required as early as possible.
Appreciate your help, guys.
Radha
Hi Radha, how are you?
---partial deliveries must not have been maintained in the master data.
---the deliveries should be up to the target quantity.
---check the order type, item category, and schedule line category.
---check unrestricted stock availability.
thank you
regards
Khera. -
Help required network configuration - Gateway route settings get erased on reboot.
Oracle Linux 7
Linux myhostname 3.8.13-35.3.1.el7uek.x86_64 #2 SMP Wed Jun 25 15:27:43 PDT 2014 x86_64 x86_64 x86_64 GNU/Linux
#cat /etc/sysconfig/network-scripts/ifcfg-eno16780032
TYPE="Ethernet"
BOOTPROTO="none"
DEFROUTE="yes"
IPV4_FAILURE_FATAL="no"
IPV6INIT="yes"
IPV6_AUTOCONF="yes"
IPV6_DEFROUTE="yes"
IPV6_FAILURE_FATAL="no"
NAME="eno16780032"
UUID="2d1107e3-8bd9-49b1-b726-701c56dc368b"
ONBOOT="yes"
IPADDR0="34.36.140.86"
PREFIX0="22"
GATEWAY0="34.36.143.254"
DNS1="34.36.132.1"
DNS2="34.34.132.1"
DOMAIN="corp.halliburton.com"
HWADDR="00:50:56:AC:3F:F9"
IPV6_PEERDNS="yes"
IPV6_PEERROUTES="yes"
NM_CONTROLLED="no"
#route -n
Kernel IP routing table
Destination Gateway Genmask Flags Metric Ref Use Iface
0.0.0.0 34.36.143.254 0.0.0.0 UG 0 0 0 eno16780032
34.36.140.0 0.0.0.0 255.255.252.0 U 0 0 0 eno16780032
169.254.0.0 0.0.0.0 255.255.0.0 U 1002 0 0 eno16780032
When I reboot the machine, the first line in the routing table is gone. I then run:
#route add default gw 34.36.143.254
After that, the network works fine.
Help required. I don't want to use NetworkManager.
The following might be useful:
https://access.redhat.com/solutions/783533
"When transitioning from NetworkManager to using the network initscript, the default gateway parameter in the interface's ifcfg file will be depicted as 'GATEWAY0'. In order for the ifcfg file to be compatible with the network initscript, this parameter must be renamed to 'GATEWAY'. This limitation will be addressed in an upcoming release of RHEL7."
NetworkManager is now the default mechanism for RHEL 7. Personally I don't quite understand this, because as far as I can gather it is a program for systems to automatically detect and connect to known networks. I think such functionality can be useful when switching between wireless and wired networks, but for a server platform, I wonder.
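Based on the quoted Red Hat note, a minimal sketch of the fix (assuming you stay with the network initscript, i.e. NM_CONTROLLED="no") is simply to rename the gateway parameter in the ifcfg file and restart networking:

```shell
# /etc/sysconfig/network-scripts/ifcfg-eno16780032 (excerpt)
# The network initscript reads GATEWAY, not the NetworkManager-style
# GATEWAY0, so rename the parameter:
GATEWAY="34.36.143.254"

# Then restart networking so the default route is installed at boot:
# systemctl restart network
```

With that change, the default route should survive a reboot without a manual `route add default gw`.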