DBMS_REDEFINITION package and the COMPRESS attribute of the target table

Hi experts,
we have an already partitioned and compressed table under Oracle 10g R2 which we want to move to another tablespace using online redefinition.
The table should keep its partitions and its compressed data after the move.
My question is: how much storage must we have in place for the move of the table?
Example:
tab (compressed size) : 1000 MB
tab (uncompressed size) : 4000 MB
It seems to depend on how redefinition handles the move of the compressed data.
So if redefinition uses INSERT /*+ APPEND */ ..., it should be roughly 1000 MB ("compress during write").
Is this assumption correct?
Can anybody shed some light on which kind of compression-conserving strategy redefinition uses?
bye
BB

From the 11.2 admin guide:
Create an empty interim table (in the same schema as the table to be redefined) with all of the desired logical and physical attributes. If columns are to be dropped, do not include them in the definition of the interim table. If a column is to be added, then add the column definition to the interim table. If a column is to be modified, create it in the interim table with the properties that you want. The table being redefined remains available for queries and DML during the entire process.
Execute the FINISH_REDEF_TABLE procedure to complete the redefinition of the table. During this procedure, the original table is locked in exclusive mode for a very short time, independent of the amount of data in the original table. However, FINISH_REDEF_TABLE will wait for all pending DML to commit before completing the redefinition.
If you did not want to create an interim table, then this approach is not going to work for you. There is no requirement for you to create anything other than the interim table; all dependent objects, including materialized views, can be created automatically. Where did you see that you have to create mview logs?
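For what it's worth, here is a minimal sketch of the whole flow (schema, table, partition, and tablespace names are hypothetical). Because the interim table carries the COMPRESS attribute, the rows should be compressed as they are copied, so the space needed should be close to the compressed figure (your ~1000 MB) plus index and sort overhead rather than the uncompressed 4000 MB; still, verify on a test copy first:

-- Interim table: same shape as the original, but in the new tablespace,
-- partitioned, and with basic compression enabled
CREATE TABLE scott.tab_interim
TABLESPACE new_ts
COMPRESS
PARTITION BY RANGE (sale_date) (
  PARTITION p2005 VALUES LESS THAN (DATE '2006-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
)
AS SELECT * FROM scott.tab WHERE 1 = 0;

DECLARE
  l_errors PLS_INTEGER;
BEGIN
  DBMS_REDEFINITION.CAN_REDEF_TABLE('SCOTT', 'TAB');
  DBMS_REDEFINITION.START_REDEF_TABLE('SCOTT', 'TAB', 'TAB_INTERIM');
  -- copies indexes, constraints, triggers, grants, ... onto the interim table
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('SCOTT', 'TAB', 'TAB_INTERIM',
                                          num_errors => l_errors);
  -- brief exclusive lock, then the two tables swap identities
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('SCOTT', 'TAB', 'TAB_INTERIM');
END;
/

Note that during the redefinition the original table and the interim copy exist side by side, so you need space for both until you drop the old (now interim) table.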

Similar Messages

  • Display entity-attributes and global-attributes in the same screen

    Hi,
    I have a question about entity.
    Is it possible to display in the same screen, entity-attributes and global-attributes?
    thanks
    Rosalia

    Hi Rosalia,
    You can only display attributes from the current entity and the entity or entities that it is contained by (the parent entities).
    So if you had the following data model
    Global
    ..the person
    ...the income
    the person is contained by global.
    the income is contained by the person (which is contained by global).
If you want to create a person-level question screen, you can display global attributes on that screen by using substitution like %global_field_1%. You cannot, however, input global attributes on that screen (ONLY attributes belonging to the person).
If you want to create an income-level question screen, you can display attributes from the person, such as "tell us about %person_name%'s income...", as well as global attributes. Again, you cannot input person or global attributes on that screen (ONLY attributes belonging to the income).
However, this only works for child-to-parent containment relationships! You cannot have a global question screen which loops through the person instances and displays attributes from the person, or from the income, out of the box.
    We have managed to implement this using custom controls.
    You can map individual attributes back up to the parent level, and then display them,
    such as
    the first person's name = InstanceValueIf(the people, the person's name, the person's id = 1)
    the first person's name is now a global attribute and can be displayed on global question screens.
    Hope this helps!
    Ben

  • PR and RFQ usage up to the target quantity

    Hi,
    How can we restrict use of a PR or RFQ as a reference beyond its target quantity? Once the target quantity has been referenced, the PR and RFQ should no longer be available.
    Thanks in advance
    Vikram

    Hi,
    Do you mean that when making the PO with reference to the PR, the quantity should not exceed the PR quantity? Then you have to set message 06076 to Error in SPRO-MM-Purchasing-Environment-System Messages.
    If this is not what you are looking for, please let me know.
    Reward if useful.
    Regards,
    Batchu.

  • I have a source table with 10 records and a target table with 15 records. My question: using the Table Comparison transform, how do I delete unmatched records from the target table?

    I have a source table with 10 records and a target table with 15 records. My question is about using the Table Comparison transform: I want to delete the unmatched records from the target table. How is this done?

    Hi Kishore,
    First, identify the deleted records by selecting the "Detect deleted rows from comparison table" feature in the Table Comparison transform.
    Then use a Map Operation with input row type "delete" and output row type "delete" to delete those records from the target table.
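    For reference, the net effect of that pair of transforms on the database is roughly this plain SQL (table and key names are made up; Data Services generates its own code):
    -- remove target rows that no longer have a matching source row
    DELETE FROM target_table t
    WHERE NOT EXISTS
          (SELECT 1 FROM source_table s WHERE s.key_col = t.key_col);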

  • How to sort columns in the target table

    I have a simple mapping which I am trying to design. There's only one table on the source side and one on the target side. There are no filter conditions; the only thing is that I want the target table to be sorted.
    Literally, say
    Src is source table has 3 columns x,y,z
    Trg is dest table and has 3 columns a,b,c
    x--->a
    y---->b
    z---->c
    The SQL should be
    select x,y,z from src order by x,y.
    I could do the mapping, but I could not get the ORDER BY.
    IKM used: IKM BIAPPS Oracle Incremental Update

    Why can't you use a simple UPDATE command in an Execute SQL Task, as below?
    -- rebuild the demo tables
    DROP TABLE SSN
    DROP TABLE Staging
    DROP TABLE Student
    CREATE TABLE SSN (pn_id VARCHAR(100), ssn BIGINT)
    INSERT INTO SSN VALUES('000616850', 288258466)
    INSERT INTO SSN VALUES('002160790', 176268917)
    CREATE TABLE Staging (ssn BIGINT, id INT, pn_id BIGINT, name VARCHAR(100), subject VARCHAR(100), grade VARCHAR(10), [academic year] INT, comments VARCHAR(100))
    INSERT INTO Staging VALUES(288258466, 1001, '770616858', 'Sally Johnson', 'English', 'A', 2005, 'great student')
    INSERT INTO Staging VALUES(176268917, 1002, '192160792', 'Will Smith', 'Math', 'C', 2014, 'no comments')
    INSERT INTO Staging VALUES(444718562, 1003, '260518681', 'Mike Lira', 'Math', 'B', 2013, 'no comments')
    CREATE TABLE Student (id INT, pn_id BIGINT, subject VARCHAR(100), [academic year] INT, grade VARCHAR(10), comments VARCHAR(100))
    INSERT INTO Student VALUES(1001, '000616850', NULL, NULL, NULL, NULL)
    INSERT INTO Student VALUES(1002, '002160790', NULL, NULL, NULL, NULL)
    -- update through the alias (B) so the joined rows drive the update
    UPDATE B SET subject = C.subject, [academic year] = C.[academic year], grade = C.grade, comments = C.comments
    FROM SSN A INNER JOIN Student B
    ON A.pn_id = B.pn_id INNER JOIN Staging C
    ON A.ssn = C.ssn
    SELECT * FROM Student
    Regards, RSingh

  • The size of the target table grows abnormally

    hi all,
    I am currently using OWB (version 9.2.0.4) to feed some tables.
    We have created a new 9.2.0.5 database for a new data warehouse.
    I have an issue that I really cannot explain about the growing size of the target tables.
    Take the example of a parameter table that contains 4 fields and only 12 rows.
    CREATE TABLE SSD_DIM_ACT_INS (
      ID_ACT_INS INTEGER,
      COD_ACT_INS VARCHAR2(10 BYTE),
      LIB_ACT_INS VARCHAR2(80 BYTE),
      CT_ACT_INS VARCHAR2(10 BYTE)
    )
    TABLESPACE IOW_OIN_DAT
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
      INITIAL 1M
      MINEXTENTS 1
      MAXEXTENTS 2147483645
      PCTINCREASE 0
      BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCACHE
    NOPARALLEL;
    This table is fed by a mapping, and I use the update/insert option, which generates a MERGE.
    At first the table is empty; I run the mapping and it adds 14 rows.
    The size of the table is now 5 MB!!
    Then I delete 2 rows via SQL with TOAD.
    I run the mapping again. It updates 12 rows and adds 2 rows.
    At this point, the size of the table has increased by 2 MB (1 MB per row!!).
    The size of the table is now 7 MB!!
    I do the same again and I get a 9 MB table.
    When I delete 2 rows with a SQL statement and recreate them manually, the size of the table does not change.
    When I create a copy of the table with an INSERT ... SELECT statement, the size is 1 MB, which is normal.
    Could someone explain to me how this is possible?
    Is it a problem with the database? With the configuration of OWB?
    What should I check?
    Thank you for your help.

    Hi all,
    We have found the reason for the growth.
    Each mapping has a hint which defaults to PARALLEL APPEND. As I understand it, OWB uses it to decide whether an insert allocates new space for the table when it runs.
    We changed each one to PARALLEL NOAPPEND and now it is correct.
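    To see the effect for yourself, here is a small sketch against the table from the post (standard dictionary views; nothing OWB-specific):
    -- a conventional insert reuses free space below the high-water mark
    INSERT INTO ssd_dim_act_ins VALUES (1, 'COD', 'LIB', 'CT');
    -- a direct-path insert (the APPEND hint) always writes above the
    -- high-water mark, so each run can allocate a fresh extent
    -- (here 1 MB, the INITIAL extent size of the table)
    INSERT /*+ APPEND */ INTO ssd_dim_act_ins
    SELECT * FROM ssd_dim_act_ins;
    COMMIT;
    -- watch the segment grow
    SELECT bytes / 1024 / 1024 AS mb
    FROM user_segments
    WHERE segment_name = 'SSD_DIM_ACT_INS';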

  • In OWB I need to update the target table with the same field for match/update

    In OWB I am trying to update the target table where the match and the update are on the same field; can this be done? I am getting a match/merge error saying you cannot update and match on the same field. But in SQL my statement is:
    UPDATE table
    SET irf = 0
    WHERE irf = 1
    AND process_id = 'TEST'
    How do I do this in OWB?

    table name is temp
    fields in the table
    field1 number
    field2 varchar2(10)
    field3 date
    Example values in the table:
    0,'TEST',05/29/2009
    9,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009
    In the above example I need to update the first row field1 to 1.
    Update temp
    set field1 = 1
    where field1 = 0
    and field2 = 'TEST'
    When I run this, I need just one row to be updated, and it should look like this:
    1,'TEST',05/29/2009
    9,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009
    But when I run my mapping, I get the rows below; the second row, with 9, is also getting updated to 1.
    1,'TEST',05/29/2009
    1,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009
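    In plain SQL, the behaviour being described looks like this (a sketch, not the code OWB generates):
    -- matching on field2 alone updates every 'TEST' row, including the one with 9:
    UPDATE temp SET field1 = 1 WHERE field2 = 'TEST';
    -- to touch only the row that currently holds 0, the match must also include
    -- field1; that is exactly the match-and-update combination OWB rejects:
    UPDATE temp SET field1 = 1 WHERE field1 = 0 AND field2 = 'TEST';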

  • Issue with INSERT INTO, throws primary key violation error even if the target table is empty

    Hi,
    I am running a simple
    INSERT INTO Table 1 (column 1, column 2, ....., column n)
    SELECT column 1, column 2, ....., column n FROM Table 2
    Table 1 and Table 2 have same definition(schema).
    Table 1 is empty and Table 2 has all the data. Column 1 is primary key and there is NO identity column.
    This statement still throws a primary key violation error. I am clueless about this.
    How can this happen when the target table is totally empty?
    Chintu

    Nope, that's not true.
    Either you're not inserting into the table you think you are, or in the background some other trigger code is getting fired and inserting into a table where it causes the PK violation.
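    One more thing worth ruling out: duplicate key values inside the source itself will also violate the target's primary key even though the target is empty. A quick check, using the placeholder names from the post (spaces removed so it parses):
    -- any rows returned here would break the primary key on Table1
    SELECT column1, COUNT(*)
    FROM Table2
    GROUP BY column1
    HAVING COUNT(*) > 1;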
    Visakh

  • How to gather stats on the target table

    Hi
    I am using OWB 10gR2.
    I have created a mapping with a single target table.
    I have checked the mapping configuration 'Analyze Table Statements'.
    I have set target table property 'Statistics Collection' to 'MONITORING'.
    My requirement is to gather stats on the target table after it is loaded/updated.
    According to Oracle's OWB 10gR2 User Guide (B28223-03, page 24-5):
    Analyze Table Statements
    If you select this option, Warehouse Builder generates code for analyzing the target
    table after the target is loaded, if the resulting target table is double or half its original
    size.
    My issue is that when my target table's size has not doubled or halved, the target table DOES NOT get analyzed.
    I am looking for a way or a setting in OWB 10gR2 to gather stats on my target table regardless of its size, after the target table is loaded/updated.
    Thanks for your help in advance...
    ~Salil

    Hi
    Unfortunately we had to disable automatic stats gathering on the 10g database.
    My requirement is to extract data from one database, load it into my TEMP tables, process it, and finally load it into my data warehouse tables.
    So I need to make sure to analyze my TEMP tables after they are truncated, loaded, and subsequently updated, before I can process the data and load it into my data warehouse tables.
    I also need to truncate all TEMP tables after the load is completed, to save space on my target database.
    If we keep the automatic stats job enabled on my target 10g database, it might gather stats on those TEMP tables at a time when they are empty.
    Any ideas to overcome this issue are appreciated.
    Thanks
    Salil
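    One way around both problems is to gather the statistics yourself from a post-mapping process after each load, instead of relying on the automatic job (a sketch; MY_TEMP_TABLE is a placeholder):
    BEGIN
      -- gather stats unconditionally, regardless of how much the table changed
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => USER,
        tabname => 'MY_TEMP_TABLE',
        cascade => TRUE);
    END;
    /
    DBMS_STATS.LOCK_TABLE_STATS can additionally stop the automatic job from re-gathering on the truncated tables, though your own call would then need the force parameter.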

  • Duplicates in the target table.

    Hi, I am working on ODI 10.
    In one of my interfaces, whenever I execute it, duplicate rows end up in the target table.
    The row count is around 5,000 in the source table, but the target ends up with around 120,000. Even after using distinct rows in the flow control, the duplicates still appear.
    Can you please help me solve this?
    Note: in the source table, one column contains a surrogate key.
    The KM I am using is IKM Oracle Control Append.

    Using the Control Append IKM will always add the data that is in the Source to the Target, unless you truncate or delete from the Target first. If you have data in the Source that has already been loaded to the Target, and you do not truncate the Target prior to the next load, you will have duplicates.
    Are you truncating the Target or is the Source data always "new" each time the Interface is run?
    Regards,
    Michael Rainey
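    To make the behaviour concrete, Control Append boils down to a plain insert, sketched here with made-up names:
    -- what IKM Oracle Control Append effectively runs on each execution:
    TRUNCATE TABLE target_table;  -- only happens if the truncate option is enabled!
    INSERT INTO target_table
    SELECT * FROM source_table;
    COMMIT;
    Without the truncate (or a delete), every execution appends the full source again, which would explain the row counts you are seeing.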

  • How to soft delete a row from the target table?

    Could someone help me on this requirement?
    How do I implement the logic below using only ODI? I am already able to implement it with the DELETE_FLAG set to "N".
    I want the latest record to carry the flag "N" and all previous records to carry the flag "D".
    Thanks a lot in advance.
    I have a source table "EMP".
    EMP
    EMPID FIRST_NAME
    1 A
    2 B
    The first name is changed from A to C, and then from C to D, etc. For each data change, I would add a target row and mark the latest row "N" and the rest "D". The target table would contain the following data:
    Target_EMP
    EMPID FIRST_NAME DELETE_FLAG
    1 A D
    1 C D
    1 D N
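    In plain SQL, the flag maintenance described above would look something like this sketch against the tables from the post (the bind variables stand in for the incoming row):
    -- demote every existing row for the employee to 'D' ...
    UPDATE Target_EMP SET delete_flag = 'D' WHERE empid = :empid;
    -- ... then insert the incoming version as the current row
    INSERT INTO Target_EMP (empid, first_name, delete_flag)
    VALUES (:empid, :first_name, 'N');
    In ODI itself this demote-then-insert pattern is what a slowly-changing-dimension IKM does natively; alternatively it can be added as an extra step in a customised KM.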

    The problem is that I can't delete the row because it requires me to fill the mandatory field first. This happens when the key field is ROWID; in other cases the delete is successful.

  • How do I make Merge operation into the target table case insensitive?

    Hi All,
    We have a target table that has a VARCHAR2 column called nat_key, and a map that copies data over from a source table into the target table.
    Based on whether the values in the nat_key column match between the source and the target, either an update or an insert has to be done into the target table.
    Let us say the target table T has the following row:
    nat_key
    EQUIPMENT
    Now, my source table has the same in a different case
    nat_key
    equipment
    I want these rows to be merged .
    In the OWB map, I have set the property of the nat_key column in the target table to 'Match while updating' = 'Yes'. Is there a built-in feature in OWB with which I can make this match case insensitive?
    Basically, I want OWB to generate my mapping code as:
    if UPPER(target.nat_key) = UPPER(source.nat_key) then update ... else insert.
    Note: there is a workaround with 'ALTER SESSION SET nls_sort = binary_ci nls_comp = linguistic', but this involves calling a pre-mapping operator to set these session parameters.
    Could anyone tell me if there is a simpler way?

    Hi,
    use an expression operator to convert nat_key to upper case, then use this value for the MERGE. nat_key will then only be stored in upper case in your target table.
    If you have historic data in the target table, you will have to update nat_key to upper case once; this is not necessary if you start with an empty target table.
    Regards,
    Carsten.
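    The statement OWB generates should then behave roughly like this sketch (table and column names are made up apart from nat_key):
    MERGE INTO target_tab t
    USING (SELECT UPPER(nat_key) AS nat_key, payload FROM source_tab) s
    ON (t.nat_key = s.nat_key)  -- safe once target rows are stored in upper case
    WHEN MATCHED THEN UPDATE SET t.payload = s.payload
    WHEN NOT MATCHED THEN INSERT (nat_key, payload)
                          VALUES (s.nat_key, s.payload);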

  • Referring to the target table records in the transferring query

    Hi all
    I am trying to load some records into the target table using my job in DI. The query I need is a bit tricky: while loading records into the target table, I have to check whether one of the column values has already been used by a record transferred earlier, because I want one column to hold a unique value across the whole table. DISTINCT only gives unique records; I need a unique value in one column across the whole table.
    I noticed it is not possible to refer to a target column in the Query object to see whether a value has already been used there. How can I address this requirement? Do you have any experience with this?
    Here is the SQL code I would use in the Query object in Data Integrator.
    In the target table, every city should appear in one and only one record:
    INSERT INTO target (
         Effective_From_Date,
         Effective_To_Date,
         Business_Unit_ID,
         Provider_ID
    )
    SELECT DISTINCT
         a.Effective_From_Date,
         b.Effective_To_Date,
         d.city_ID,
         d.provider_ID
    FROM
         table1 a
         INNER JOIN table2 b
              ON (a.typeID = b.typeID)
         INNER JOIN table3 c
              ON (a.professionID = c.professionID)
         INNER JOIN table4 d
              ON (c.city_ID = d.city_ID)
    WHERE NOT EXISTS
         (SELECT * FROM target e
          WHERE d.city_ID = e.Business_Unit_ID);
    Thanks.

    You can use the target table as a source table as well: just drag it into your dataflow again and select Source instead of Target this time. Then you can outer join the new source target table to your query (I might do this in a second query instead of trying to add it to the existing one).
    You could also use a lookup function to check the target table. In this case you'd also have to add a second query to check the result of your lookup.
    Worst case, you can just throw the whole SQL query you've already created into a SQL transform and then use that as your source.

  • Getting an error while fetching the data and binding it in the tree table

    Hi All,
    I am getting the error "A navigation paths parameter object has to be defined" while fetching the data and binding it in the tree table.
    Please find the code and screenshot below
    var oModel = new sap.ui.model.odata.ODataModel("../../../XXXX.xsodata/", true);
    var oTable = sap.ui.getCore().byId("table");
    oTable.setModel(oModel);
    oTable.bindRows({
        path: "/Parent",
        parameters: {expand: "Children"}
    });
    Can anyone please give me a suggestion to rectify this?
    Thanks in Advance,
    Aravindh

    Hi All,
    Please see the below code. It works fine for me.
    var oController = sap.ui.controller("member_assignment");
    var oModel = new sap.ui.model.odata.ODataModel("../../../services/XXXX.xsodata/", true);
    var Context = "/PARENT?$expand=ASSIGNEDCHILD&$select=NAME,ID,ASSIGNEDCHILD/NAME,ASSIGNEDCHILD/ID,ASSIGNEDCHILD/PARENT_ID";
    var oTable = sap.ui.getCore().byId("tblProviders");
    oModel.read(Context, null, null, true, onSuccess, onError);
    function onSuccess(oEventdata) {
        var outputJson = {};
        var p = 0;
        var r = {};
        try {
            if (oEventdata.results) {
                r = oEventdata.results;
            }
        } catch (e) {
            //alert('oEventdata.results failed');
        }
        // flatten the OData result into the nested structure the tree table expects
        $.each(r, function(i, j) {
            outputJson[p] = {};
            outputJson[p]["NAME"] = j.NAME;
            outputJson[p]["ID"] = j.ID;
            outputJson[p]["PARENT_ID"] = j.ID;
            outputJson[p]["DELETE"] = 0;
            var m = 0;
            // attach each assigned child as a numbered sub-node of its parent
            if (j.ASSIGNEDCHILD.results.length > 0) {
                $.each(j.ASSIGNEDCHILD.results, function(a, b) {
                    outputJson[p][m] = { NAME: b.NAME,
                                         ID: b.ID,
                                         PARENT_ID: b.PARENT_ID,
                                         DELETE: 1 };
                    m++;
                });
            }
            p++;
        });
        var oPM = new sap.ui.model.json.JSONModel();
        oPM.setData(outputJson);
        oTable.setModel(oPM);
        oTable.bindRows({
            path: "/"
        });
    }
    function onError(oEvent) {
        console.log("Error on Provider Members");
    }
    Regards
    Aravindh

  • Creation of a generic extractor and data source for the FAGLFLEXA table

    Hi All,
    I need to create a generic extractor and DataSource for the FAGLFLEXA table to support AR reporting. This table contains the profit center information necessary to perform LOB reporting against the AR data.
    Please advise on how to do this.
    Regards, Vishal

    Hi Vishal,
    This seems like a simple task.
    1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a transactional DS, master data DS, or text DS.
    2. Name it accordingly and create it.
    3. Give it a description and enter the table name FAGLFLEXA.
    4. Save it and activate. If you need it to be delta-enabled, click Delta and choose accordingly.
    If you still face a problem, do mail me at [email protected]
    Assign points if helpful
    Regards,
    Himanshu
