Table compare and derive alter script between 2 schemas

I am on Oracle 10g.
We need to synchronise table structures between two different databases and then execute the resulting alter script in the target database.
My idea is to find the tables whose definitions differ using all_tab_columns and a database link.
A sample of it is below:
prompt
prompt columns having same name but difference in datatype or length:
prompt -------------------------------------------------------------------------
select
a.column_name, a.data_type, a.data_length, a.data_scale, a.data_precision,
b.column_name, b.data_type, b.data_length, b.data_scale, b.data_precision
from
all_tab_columns a, all_tab_columns@link b
where
a.table_name in (select tablename from test) and
a.table_name = b.table_name and
a.column_name = b.column_name and
-- parentheses are needed here because AND binds tighter than OR,
-- and NULL scale/precision would otherwise never compare as different
(a.data_type <> b.data_type or
a.data_length <> b.data_length or
nvl(a.data_scale, -1) <> nvl(b.data_scale, -1) or
nvl(a.data_precision, -1) <> nvl(b.data_precision, -1));
prompt columns present in the local schema but not in the remote schema
prompt ----------------------------------------------
select
column_name --, data_type, data_length, data_scale, data_precision
from
all_tab_columns
where
table_name in (select tablename from test)
minus
select
column_name
from
all_tab_columns@link
where
table_name in (select tablename from test);
prompt columns present in the remote schema but not in the local schema
prompt ----------------------------------------------
select
column_name --, data_type, data_length, data_scale, data_precision
from
all_tab_columns@link
where
table_name in (select tablename from test)
minus
select
column_name
from
all_tab_columns
where
table_name in (select tablename from test);
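One rough direction, sketched below and untested: generate ALTER TABLE ... MODIFY statements straight from the same dictionary comparison, taking the local definition as the target state. The output is meant to be reviewed and then run on the target database; shrinking a column or changing its datatype can fail or lose data depending on the existing rows, and datatypes other than VARCHAR2/CHAR/NUMBER would need their own handling.
-- rough sketch: emit one ALTER per mismatched column,
-- using the local (a) side as the desired definition
select 'alter table ' || a.table_name || ' modify (' || a.column_name || ' '
       || a.data_type
       || case
            when a.data_type in ('VARCHAR2', 'CHAR')
              then '(' || a.data_length || ')'
            when a.data_type = 'NUMBER' and a.data_precision is not null
              then '(' || a.data_precision || ',' || nvl(a.data_scale, 0) || ')'
          end
       || ');' as alter_stmt
from all_tab_columns a, all_tab_columns@link b
where a.table_name in (select tablename from test)
  and a.table_name = b.table_name
  and a.column_name = b.column_name
  and (a.data_type <> b.data_type
       or a.data_length <> b.data_length
       or nvl(a.data_scale, -1) <> nvl(b.data_scale, -1)
       or nvl(a.data_precision, -1) <> nvl(b.data_precision, -1));
The columns found by the MINUS queries would similarly become ALTER TABLE ... ADD statements on the target, or be flagged for manual review.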
Beyond this, I am just looking for ideas on deriving the full alter scripts. Please share your thoughts.

You don't have to write lots of triggers. You only need to write one database-level DDL trigger, e.g.:
create table ddl_audit
(audit_date date,
username varchar2(30),
instance_number integer,
database_name varchar2(9),
object_type varchar2(19),
object_owner varchar2(30),
object_name varchar2(128),
sql_text varchar2(2000));
create or replace trigger BEFORE_DDL_TRG before ddl on database
declare
  l_sql_text ora_name_list_t;
  l_count    number;
  l_puser    varchar2(30) := null;
  l_sql      varchar2(2000);
begin
  -- KZVDVCU is an undocumented USERENV parameter; the documented
  -- alternative is SYS_CONTEXT('USERENV', 'SESSION_USER')
  l_puser := SYS_CONTEXT('USERENV', 'KZVDVCU');
  -- ora_sql_txt fills l_sql_text and returns the number of lines
  l_count := ora_sql_txt(l_sql_text);
  for i in 1..l_count
  loop
    l_sql := l_sql || l_sql_text(i);
  end loop;
  insert into ddl_audit (audit_date, username, instance_number, database_name,
                         object_type, object_owner, object_name, sql_text)
  values (sysdate, l_puser, ora_instance_num, ora_database_name,
          ora_dict_obj_type, ora_dict_obj_owner, ora_dict_obj_name, l_sql);
exception
  when others then
    null;  -- deliberately swallow errors so auditing never blocks DDL
end;
/
show errors
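If the trigger compiled cleanly, a quick smoke test (a sketch only; the table name below is made up) is to issue any DDL and then query the audit table:
create table trigger_smoke_test (x number);
select audit_date, username, object_type, object_name, sql_text
  from ddl_audit
 order by audit_date desc;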

Similar Messages

  • Table compare deleting rows which does not exist in target table

    Hi Gurus,
    I am struggling with an issue in Data Services.
    I have a job which uses Table Compare, then History Preserving and then a Key Generation transforms.
    There is every possibility that data would get deleted from the source table.
    Now, I want to delete them from the target table also.
    I tried Detect deleted rows but it is not working.
    Could some one please help me on this issue.
    Thanks,
    Raviteja.

    Doesn't History Preserving really only operate on "Update" rows? Wouldn't it only process the deletes if you turned "Preserve delete row(s) as update row(s)" on?
    I would think that if you turned on "Detect deleted row(s)" in the Table Comparison and did not turn this option on in History Preserving, it would retain those rows as delete rows and effectively remove them from the target.
    Preserve delete row(s) as update row(s): converts DELETE rows to UPDATE rows in the target warehouse and, if you previously set effective date values (Valid from and Valid to), sets the Valid To value to the execution date. Use this option to maintain slowly changing dimensions by feeding a complete data set first through the Table Comparison transform with its "Detect deleted row(s) from comparison table" option selected.

  • Script for comparing 2 schemas based upon table compare

    Hi,
    I want to write a script over two schemas, TEST and TEST1, which will compare the differences between the tables in both schemas and insert data into schema TEST from TEST1 based on the matched columns in the two schemas.
    Any help will be needful for me.

    "I want to write a script on 2 schemas TEST and TEST1 which will compare the difference in tables"
    Differences in METADATA, or differences in data?
    It would be helpful if you provided DDL for tables involved.
    It would be helpful if you provided DML for test data.
    It would be helpful if you provided expected/desired results & a detailed explanation how & why the test data gets transformed or organized.
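    If the goal is metadata-driven data copying, below is a minimal sketch (untested, and assuming the current user can see both schemas): it prints one INSERT ... SELECT per table that exists in both TEST and TEST1, restricted to the columns the two versions share. Review the generated statements before running them.
    set serveroutput on
    declare
      l_cols varchar2(4000);
    begin
      -- tables that exist in both schemas
      for t in (select table_name from all_tables where owner = 'TEST'
                intersect
                select table_name from all_tables where owner = 'TEST1')
      loop
        l_cols := null;
        -- columns common to both versions of the table
        for c in (select column_name from all_tab_columns
                   where owner = 'TEST' and table_name = t.table_name
                  intersect
                  select column_name from all_tab_columns
                   where owner = 'TEST1' and table_name = t.table_name)
        loop
          l_cols := l_cols || case when l_cols is not null then ',' end || c.column_name;
        end loop;
        dbms_output.put_line('insert into TEST.' || t.table_name ||
                             ' (' || l_cols || ') select ' || l_cols ||
                             ' from TEST1.' || t.table_name || ';');
      end loop;
    end;
    /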

  • Alter script to keep values between 1 and 99

    Hi guys, I currently have a table (for the sake of this, let's call it my_table, with a column my_column which is an integer). I want to restrict the values of this column to between 1 and 99, and was wondering how I would build this constraint using a script.
    Thanks for any help I may receive on this.

    Adding a check constraint should serve the purpose:
    create table t_test (col1 integer);
    alter table t_test add constraint check_col1_val check (col1 between 1 and 99);
    insert into t_test values (1);
    insert into t_test values (99);
    insert into t_test values (81);
    insert into t_test values (0);
    insert into t_test values (100);
    Running the last two insert statements will give you an error:
    insert into t_test values (0)
    Error report:
    SQL Error: ORA-02290: check constraint (HI_OWNER.CHECK_COL1_VAL) violated
    02290. 00000 -  "check constraint (%s.%s) violated"
    *Cause:    The values being inserted do not satisfy the named check constraint.
    *Action:   Do not insert values that violate the constraint.

  • What is the difference between a tablespace and a schema

    What is the difference between a tablespace and a schema?

    784633 wrote:
    so each user has its own space of tables - a schema?
    Yes, but let's clarify a bit ....
    The "schema" is the collection of all objects owned by a particular user. So if user SCOTT creates two tables, EMP and DEPT, and a view EMP_RPT, and a procedure GET_MY_EMP, those objects (tables, views, procedures) collectively make up the SCOTT schema.
    Those objects will be physically stored in a tablespace.
    A tablespace is a named collection of data files. So tablespace USERS will be made up of one or more data files. A specific datafile can belong to one and only one tablespace. If a tablespace has more than one data file, Oracle will manage those files as a collection invisible to the application - much like the OS or disk subsystem handles striping across multiple physical disks.
    A specific object in the SCOTT schema can exist in only one tablespace, but not all objects of the schema have to be in the same tablespace. Likewise a tablespace can contain objects from multiple schemas.
    and can one user access tables of other users?
    As others have said - FRED can access tables belonging to SCOTT as long as SCOTT has granted that access to FRED.
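    To make this concrete, a small sketch (the object and tablespace names here are invented for illustration):
    -- a schema object can be placed in an explicit tablespace:
    create table scott.emp_archive (empno number) tablespace users;
    -- and you can see where each of SCOTT's objects physically lives:
    select segment_name, segment_type, tablespace_name
      from dba_segments
     where owner = 'SCOTT';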

  • Table look up and derivation rule in COPA

    Hi,
    What is the difference between a table look-up and a derivation rule in COPA? How do we interrelate them in customizing?
    Please clarify.
    Thanks and Regards,
    Ram
    Moderator: Please, avoid asking basic questions

    Hi ram,
    Herewith I am giving you a link; I hope it will be helpful to you.
    deleted
    Method which is used in the derivation step.
    You can choose from the following derivation types:
    Derivation rule :
    Derivation rules are "if-then" rules, in which you specify which combinations of characteristic values will yield which target characteristic values.
    Table look-up :
    Table look-ups let you determine characteristic values by having the system read them from any table. The source fields must correspond to the key of the table where you want the system to find the target value.
    Thanks,
    Anil

  • How to make a copy of an application with its schema-tables,data and all

    Good day,
    I am looking for the best way to make a copy of an application from one computer to another, including the schema (tables, data and all), in APEX 3.2.
    I can only manage to make a copy of the application without the data using the export utility
    Please assist with this difficulty
    Kind Regards
    Thabo
    Edited by: Thabo K on Jun 1, 2009 1:13 AM

    Hello,
    To copy across the data you can use the traditional EXP/IMP or the Datapump utility.
    If you're used to using EXP/IMP I'd encourage you to look at Datapump, if you haven't used EXP/IMP before I'd encourage you to look at Datapump (datapump rocks) -
    http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
    There are a few major differences between Datapump and traditional EXP/IMP (EXP/IMP creates the export file on the client side, Datapump creates it on the server side etc).
    In my book "Pro Oracle Application Express" I have a section on cloning applications/data between instances, which you might find useful.
    Hope this helps,
    John.
    Blog: http://jes.blogs.shellprompt.net
    Work: http://www.apex-evangelists.com
    Author of Pro Application Express: http://tinyurl.com/3gu7cd
    REWARDS: Please remember to mark helpful or correct posts on the forum, not just for my answers but for everyone!

  • Table compression and alter table statement

    Friends
    I am trying to add columns to a table which is compressed. Since Oracle treats compressed tables as object tables, I cannot add columns directly, so I tried to uncompress the table first and then add columns. This doesn't seem to work.
    What could be the issue?
    Thanks
    Vishal V.
    Script to test is here and results are below.
    -- Test1 => add columns to uncompressed table -> Success
    DROP TABLE TAB_COMP;
    CREATE TABLE TAB_COMP(ID NUMBER) NOCOMPRESS;
    ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    -- Test2 => try adding columns to a compressed table, uncompress it and then try again -> Fails
    DROP TABLE TAB_COMP;
    CREATE TABLE TAB_COMP(ID NUMBER) COMPRESS;
    ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    ALTER TABLE TAB_COMP move NOCOMPRESS;
    ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    SQL> -- Test1 => add columns to uncompressed table -> Success
    SQL> DROP TABLE TAB_COMP;
    Table dropped.
    SQL> CREATE TABLE TAB_COMP(ID NUMBER) NOCOMPRESS;
    Table created.
    SQL> ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    Table altered.
    SQL>
    SQL> -- Test2 => try adding columns to a compressed table, uncompress it and then try again -> Fails
    SQL> DROP TABLE TAB_COMP;
    Table dropped.
    SQL> CREATE TABLE TAB_COMP(ID NUMBER) COMPRESS;
    Table created.
    SQL> ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10))
    ERROR at line 1:
    ORA-22856: cannot add columns to object tables
    SQL> ALTER TABLE TAB_COMP move NOCOMPRESS;
    Table altered.
    SQL> ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10));
    ALTER TABLE TAB_COMP ADD (NAME VARCHAR2(10))
    ERROR at line 1:
    ORA-22856: cannot add columns to object tables

    Which version of Oracle are you using?
    1* create table test1234(a number) compress
    SQL> /
    Table created.
    Elapsed: 00:00:00.02
    SQL> alter table test1234 add(b varchar2(200));
    Table altered.
    Elapsed: 00:00:00.02
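    For what it's worth, the data dictionary shows what state the table is actually in, which can help narrow down version-specific behaviour (a quick check only, nothing more):
    select table_name, compression
      from user_tables
     where table_name = 'TAB_COMP';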

  • Re: Difference between Schemas, PCRs, Functions and Operations

    Dear SAPExperts,
    Can anybody please tell me what the difference is between schemas, PCRs, functions and operations?
    Thanx in advance
    Regards
    Aniruddha

    Hi Aniruddha
    When the payroll driver is executed (for example, PC00_M40_CALC for India), the standard schema IN00 is executed; it calls certain functions (functions may use rules, and rules contain operations) and also subschemas.
    Functions could be of four types
    1. Performing some payroll computations (E.g. INEPF function calculates the PF amount of an employee during payroll run)
    2. Calling rules (E.g. P0045 function calls a rule INLN to compute the loan details of a personnel number).
    3. Getting data from Infotypes (E.g. P0581 will get the data from Infotype-581 for payroll processing).
    4. Making decisions (e.g. the IF & ENDIF functions control execution according to true/false decisions), etc.
    When the PY is executed, SAP uses a lot of internal tables to store data and also to provide data to other internal tables.
    Read and change access to internal tables is enabled using functions that are executed in a personnel calculation schema, and using operations that are executed in personnel calculation rules.
    The following is just an attempt to provide some info on how Functions are processed during the PY Run. For ex: (functionality of subschema XIN0 and INBD)
    1. INITIALIZATION OF PAYROLL: When the payroll is executed, the subschema XIN0 is called first. This subschema comprises the following main steps:
    I. Specify program type (payroll or evaluation).
    II. Set switch for database updates (YES/NO); all database updates are controlled via this switch (otherwise simulation).
    III. Only infotypes from the HR master record which apply to the selected personnel number are read.
    IV. All Time Management infotypes are imported.
    V. Specify check against control record PA03 (test or live).
    2. READ BASIC DATA: Once the initialization of payroll is successful, the subschema INBD is called. This subschema reads the basic data of the employees included in the payroll run. Basic data includes determining the employee name, reading Work Center/Basic Pay data, setting financial year dates and allowance grouping tables, reading previous employment tax details, reading housing (HRA/CLA/COA), reading exemptions, reading income from other sources, reading Section 80 deductions, reading investment details (Sec 88), reading provident fund contributions, reading other statutory deductions, reading car and conveyance, reading long-term reimbursements, and reading ID details. Function GON checks whether all the master data has been imported; no further processing of the schema will occur unless certain data is present.
    The Subschema INBD calls the following functions in order to read the Basic Data:
    ENAME :
    Function ENAME reads the last valid name of the employee in the payroll period. The formatting used for the country in question is used when displaying this name.
    WPBP
    If an employee receives a pay increase within a payroll period, the Basic Pay infotype (0008) is changed and delimited as of a specific date. Two data records exist for one payroll period.
    During payroll, the system writes the Basic Pay wage type to the results table with two different indicators. These split indicators are a link to the WPBP table that contains the relevant values. The system takes into consideration both data records for the corresponding partial periods when calculating remuneration.
    ININI
    P0580
    P0581
    P0582
    P0584
    P0585
    P0586
    P0587
    P0588
    P0583
    P0590
    P185I
    GON :
    Function GON checks whether all the master data has been imported. No further processing of the schema will occur unless certain data is present. Checking procedures vary from country to country. There must always, however, be a work center (P007).
    Definitions of functions, rules and operations (copied from Raju's answer):
    A schema is a collection of functions.
    A rule is a collection of operations.
    An operation is a very basic piece of logic that is used, mostly, to manipulate wage types.
    Best Regards
    Reddy

  • Copying tables between schema owners

    In TimesTen, can you copy tables across schemas/owners?
    i.e. OWNER_A.TABLE_Y to OWNER_B.TABLE_Y
    Where TABLE_Y has the same definition? Basically, I'd like to be able to backup one datastore and restore it in another datastore that has the same table definitions, but may have a different owner name.
    Thanks,
    Larry

    I'm not completely clear on exactly what you are looking to do. On one hand you ask about copying tables between schemas. This is easily done:
    CREATE TABLE OWNER_B.TABLE_Y AS SELECT * FROM OWNER_A.TABLE_Y;
    This only works for TimesTen tables that are not part of a cache group; specifically, the source table can be part of a cache group but the target table cannot. If the target table is part of a cache group then you need to:
    1. Create the cache group containing the target (cached) table.
    2. INSERT INTO OWNER_B.TABLE_Y SELECT * FROM OWNER_A.TABLE_Y;
    But then you mention backup and restore. Since TimesTen backup/restore (ttBackup/ttRestore) works at a physical level, you cannot rename/copy tables as part of that. You might be able to use ttMigrate with the -rename oldOwner:newOwner option, but there are some constraints around this (one being that PL/SQL cannot be enabled in the database).
    Chris

  • Difference between schema and DTD

    Difference between schema and DTD
    <author>
    <firstname></firstname>
    <lastname></lastname>
    </author>
    How will you write the DTD and schema for the above XML?

    DTD:
    <!ELEMENT author (firstname, lastname)>
    <!ELEMENT firstname (#PCDATA)>
    <!ELEMENT lastname (#PCDATA)>
    Schema:
    <xs:element name="author">
    <xs:complexType>
    <xs:sequence>
    <xs:element name="firstname" type="xs:string"/>
    <xs:element name="lastname" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
    </xs:element>

  • Relationship between Table BKPF and RBKP

    Hello Developers,
    I need to fetch the value of field BKTXT from table BKPF, and this value needs to be inserted into internal table i_tab in the situation below:
    select b~ebeln b~ebelp b~werks a~usnam a~lifnr a~belnr
           a~bldat a~budat
           c~aedat c~ekgrp c~bukrs c~ekorg
           d~banfn d~bnfpo d~netpr d~afnam
      into corresponding fields of table i_tab
      from rbkp as a join rseg as b on a~belnr = b~belnr
                     join ekko as c on b~ebeln = c~ebeln
                     join ekpo as d on c~ebeln = d~ebeln
                                   and b~ebelp = d~ebelp
      where a~budat in s_date
        and b~werks in s_werks
        and c~bstyp = c_f
        and c~ekorg in s_ekorg
        and a~bukrs in s_bukrs.
    Can anyone suggest how to relate table BKPF to other tables like RBKP?
    Thanks in advance.
    Regards
    Sundeep

    Hi,
    perform the following steps:-
    1.   Go to transaction SQVI
    2.   Create a View
    3.   Enter title
    4.   Choose the Data source as Table Join
    5.   Go to insert table and add table as per your requirement.
    and from there you can find the relation between any two tables.
    Rgds/Abhi

  • What are the tables PHO and LOC of OIM Schema?

    Hi,
    In the schema documentation of OIM tables you have the following description for the tables PHO and LOC:
    LOC - Holds information about locations
    PHO - Holds all communication addresses for this contact -- e.g., contact telephone numbers,fax numbers, e-mail, etc.
    When do those tables have records? I haven't yet seen any form or function where I can put information like that.
    I don't have any problem with them and it is just for curiosity. As I understand the OIM User Model, I thought those tables were part of OIM User Model, where:
    Organization (field ACT_Key of USR table)
    Location (I guess table LOC but it is always empty in my env). There is a field USR_LOCATION but it is not shown by default.
    User Group (all groups a user is member are in the USG table)
    User Defined Fields (all fields start with USR_UDF of USR table)
    Manager (USR_MANAGER, which is always empty, and USR_MANAGER_KEY, which has the key of the user's manager)
    Organization (ACT_KEY of USR table)
    Contact Information (I guess it is the table PHO)
    Thanks,
    Renato.

    They probably serve no purpose anymore but might have been used at some point during the life cycle of the product.
    -Kevin

  • Problems using different tables for base class and derived class

    I have a class named SuperProject and another class Project derived from
    it. If I let SchemaTool generate the tables without specifying a "table"
    extension, I get a single TABLE with all the columns from both classes and
    everything works fine. But if I specify a "table" for the derived class,
    SchemaTool generates the derived class with just one column (corresponds
    to the attribute in derived class). Also it causes problems in using the
    Project class in collection attributes.
    JDO file:
    <jdo>
    <package name="jdo">
    <class name="Project" identity-type="application"
    persistence-capable-superclass="SuperProject">
    <extension vendor-name="kodo" key="table" value="PROJECT"/>
    </class>
    <class name="SuperProject" identity-type="application"
    objectid-class="ProjectId">
    <field name="id" primary-key="true"/>
    </class>
    </package>
    </jdo>
    java classes:
    public class Project extends SuperProject {
        String projectSpecific;
    }
    public class SuperProject {
        BigDecimal id;
        String name;
    }
    tables generated by SchemaTool:
    TABLE SUPERPROJECTSX (IDX, JDOCLASSX, JDOLOCKX, NAMEX);
    TABLE PROJECT(PROJECTSPECIFICX)
    Thanks,
    Justine Thomas

    Justine,
    This will be resolved in 2.3.4, to be released later this evening.
    -Patrick
    Patrick Linskey [email protected]
    SolarMetric Inc. http://www.solarmetric.com

  • In which table can we find the relationship between Role ID and Task ID

    Hi Experts,
    In which table can we find the relationship between Role ID and Task ID in cProjects?
    Thanks
    Subhaskar

    Hi Subhaskar,
    Apart from DPR_ENTITY_LINK, you can also get it from table DPR_PART.
    Please go through the below link
    http://wiki.sdn.sap.com/wiki/display/PLM/cProjectstablesin+SAP
    I hope this will help you.
    Regards,
    Rahul
