Table analysis

Hello,
We have a data warehouse where the tables are partitioned. We now want to move the old partitions, which will never be updated again, to a read-only tablespace. Will analyzing the tables still work the same way? What about analyzing the indexes, if we also move them to the read-only tablespace? In general, what problems could the move to a read-only tablespace cause with regard to analyzing? Please note that we will only move a partition to a read-only tablespace after it has received all the data it is ever going to get and an analysis has been performed on it.
Thank you.

Hi,
Thank you for your answers. The thing is, the table will change; it is only some of its partitions that will not. Those partitions will be in a read-only tablespace, but I will still need to analyze the table in the future, because new partitions with new data will need analyzing. Will this cause any conflicts with the old partitions in the read-only tablespace? For example, if Oracle rearranged data blocks or rebuilt indexes when analyzing a table, that would conflict with the partitions in the read-only tablespace.
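For reference: gathering optimizer statistics reads the table and index segments but writes the results into the data dictionary, not into the segments being analyzed, so read-only tablespaces are generally not a problem, and new partitions can be analyzed individually. A minimal sketch using DBMS_STATS, assuming a hypothetical range-partitioned table SALES with a newly loaded partition SALES_2024Q1:

-- Gather statistics only for the newly loaded partition. The statistics land
-- in the data dictionary, not in the tablespace holding the partition, so
-- partitions already sitting in a read-only tablespace are left untouched.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname     => USER,
    tabname     => 'SALES',
    partname    => 'SALES_2024Q1',
    granularity => 'PARTITION',   -- partition-level statistics only
    cascade     => TRUE);         -- also gather the local index partitions
END;
/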

Similar Messages

  • DVM Cockpit - table analysis

    Hello experts,
    I'm currently configuring the DVMC on an SAP Solution Manager 7.01 system.
    Within the step "DVM Cockpit: Table Analysis", I must indicate "age of records", "archiving potential" and "deletion potential".
    I only want to go for age of records. Is that correct?
    When saving, I get an error message saying:
    The parameter "Table Name" is mandatory.
    The parameter "Archiving Object" is mandatory.
    The parameter "Execution Mode" is mandatory.
    The parameter "Scenario Title" is mandatory.
    The parameter "Max.Age of TAANA" is mandatory.
    The parameter "Detail Analysis Mode" is mandatory.
    The parameter "Max.Age of DANA" is mandatory.
    Which table name should I enter?
    Thanks a lot.
    Dimitri

    Hi Raghavan,
    you should be able to find the guide under
    http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000724737&_OBJECT=011000358700000171052010E
    Regards
    Steffen

  • Question about selection table analysis

    Dear All,
    My last post was closed by the moderator, so I would like to ask a different question. My apologies if it is trivial.
    1. There is a selection table seltab with one field of type I, with the following contents:
    I     GT     3     0
    I     NE     8     0
    The literature, e.g. ABAP Objects, says that control statements like IF or CASE can be used with the logical expression IN.
    Now I am checking if the variable test fulfills the conditions defined by the selection table.
    DO 10 TIMES.
      IF test IN seltab.
        WRITE test.
      ENDIF.
      test = test + 1.
    ENDDO.
    As a result I get a full list of values.
    1 2 3 4 5 6 7 8 9 10
    2. Now, if the table contains
    I     GT     3     0
    I get
    4 5 6 7 8 10 as expected
    3. If the table seltab contains
    I     NE     3     0
    the result also seems to be ok
    1 2 4 5 6 7 8 9 10
    The final question is: is it true that the logical expression IN cannot be used with IF for more complex data comparisons?
    Thanks,
    Grzegorz

    Well,  the contents
    I GT 3 0
    I NE 8 0
    I read as: the variable fulfills the condition if it is greater than 3 and not equal to 8, i.e.
    test > 3 and
    test <> 8
    which, for the example above, should give the list
    4 5 6 7 9 10
    What is wrong with my analysis?
    Regards,
    Grzegorz

  • Table (analysis item) Width -  In WAD

    Hi All,
    Has anyone been able to change the width of the analysis item in WAD? Whatever I do, the table width stays the same.
    I particularly want to use the full_width parameter, so using the module com.sap.ip.bi.rig.ColumnWidth is not really a solution for me.
    Am I missing something, or is that a bug?
    Many thanks for any hints!!!
    Regards
    Pavel

    Hi Priya,
    thanks for the answer. The table seems OK, so everything is displayed correctly; see the pictures:
    [http://www.screencast.com/users/pavel.rais/folders/SAP/media/9da56465-4a54-4589-8674-ba05f1456cba]
    But when I change the width of the table or check the full_width parameter, it stays the same, and I need it to be wider.
    (I am using structures in my reports, so I'm not sure whether Paging could change anything; I tried changing all the parameters anyway, but without any result.)
    Any Ideas?
    Thanks!
    Pavel

  • How to do Indepth table analysis without using any tool

    Is there any way to do an in-depth analysis of the tables of a database without using any tool, i.e. by means of SQL and PL/SQL only?
    My database has around 800 main tables, which have several other related relational tables (objects relating two tables on the basis of OTO, OTM, MTO or MTM object relationships) and several dependent views (built from among the 800 base tables only).
    Currently the database is indexed and has joins and views, all in a working state, but this still does not guarantee consistent behaviour.
    My sole purpose is to analyse all the main tables (around 800 of them) in my database by running scripts that report errors, warnings and exceptions wherever a table needs indexing or a change of join (e.g. from cross to inner), or a check in order to avoid full table scans on the related relational tables and dependent views.
    My database is Oracle 10g.
    Please let me know if anything is unclear.

    There are no tools that can tell whether your table needs indexing or whether you need to change join methods just by looking at your database. At most, you can get an idea about missing indexes in the case of a parent-child relation. Everything else falls under application tuning, which involves SQL tracing, profiling, etc.
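    As an illustration of the parent-child case, a common script-only check is to look for foreign keys whose columns are not covered by an index on the child table. A rough sketch against the standard dictionary views (run as the schema owner; switch to the DBA_* views for a database-wide check):

    -- Flag foreign-key columns that do not appear at the matching position of
    -- any index on the child table (a simple heuristic for missing foreign-key
    -- indexes, a frequent cause of full scans and lock waits).
    SELECT c.table_name,
           cc.column_name,
           c.constraint_name
    FROM   user_constraints  c
    JOIN   user_cons_columns cc
           ON cc.constraint_name = c.constraint_name
    WHERE  c.constraint_type = 'R'
    AND    NOT EXISTS (
             SELECT 1
             FROM   user_ind_columns ic
             WHERE  ic.table_name      = c.table_name
             AND    ic.column_name     = cc.column_name
             AND    ic.column_position = cc.position)
    ORDER  BY c.table_name, c.constraint_name;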
    If you cannot trace individual sessions, then you are better off monitoring the database with Statspack/AWR (if licensed). Generate reports at frequent intervals, look at the resource-consuming SQL statements and have a discussion with the dev team to fix them whenever possible.
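    When Statspack/AWR is not an option, a quick (if less complete) look at the heavy statements currently cached in the shared pool can be taken straight from V$SQLAREA; a sketch for 10g:

    -- Top 20 cached statements by logical I/O. This only reflects what is
    -- still in the shared pool, so it is a snapshot, not a history.
    SELECT *
    FROM  (SELECT sql_id,
                  executions,
                  buffer_gets,
                  disk_reads,
                  SUBSTR(sql_text, 1, 80) AS sql_text_head
           FROM   v$sqlarea
           ORDER  BY buffer_gets DESC)
    WHERE  ROWNUM <= 20;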
    EM can also be used for SQL analysis.

  • CSS Sticky-table analysis

    We have a CSS 11503 at 7.4.2.02.
    It has a number of L3 Sticky rules.
    The users or rather their workstations are working 24 x 7.
    We would like to work out what is the best sticky-inact-timeout value to use so that we can gracefully close a server (weight = 0) and drain the sticky entries for that backend server.
    It would be good to have more information about the life of the sticky-entry in the table. For example when it was first loaded and perhaps the maximum elapsed time value.
    Are there any debug commands that can get more information on the sticky-table entries?
    Or has anyone got any other ideas on how to find out how long it would take to drain a server without actually setting the weight to zero and seeing what happens?
    My last thought is to change the sticky-inact-timeout value on the rule. As I understand it, this change, which appears to be dynamic, will only affect new sticky sessions. So a show sticky-table should show the new value for new entries once the elapsed time of the previous sessions has exceeded the old value. Measuring the time from the change to the rule until the majority of sessions have shifted to the new timeout value should give an indication of how long it would take to drain the majority of users off the server to be closed.
    If this is true, then the only problem is how to interrogate the sticky table, which can only be paged 100 entries at a time and does not seem to be filterable in the normal CLI. Hence the request for more info on debug mode.

    Gilles,
    thanks for the response.
    However, what I am trying to achieve is a little more than seeing the sticky entries as they are displayed by the standard show sticky-table command.
    For Layer three sticky entries even if you filter on IP address, you get a single entry in the standard 1 line format. I actually would like to see all entries with a given set of characteristics.
    Also for SSL sticky entries there is a Hash argument that allows the ability to see much more information for an individual entry. I cannot find an equivalent for Layer 3 sticky entries.
    The inability to search the whole table for certain characteristics, without devising a script with a loop on the page count, is giving us some interesting challenges. The absence of information about statistics/timers on each flow is also a bit of a barrier for diagnosis.
    Hence the request for more information about the sticky-table debug facilities.
    regards
    Andrew T

  • Error while creating new Portfolio Analysis (PS 2013)

    Dear Folks,
    I am trying to create a new Portfolio Analysis on my production server but I am getting a weird error message, even though I have configured everything correctly. (PS 2013 without SP1)
    6 Generic Resources with Proper Rates, with Position Roles
    6 Projects with basic information, with small plan, with resources
    Whenever I select "Time-phased Resource Planning" it gives me an unknown error, but in the ULS log this is the error message:
    Datasets:
    AnalysisDataSet
    Table Analysis
    Row: ANALYSIS_UID='dd5b4cc1-2d6b-e411-93ff-00155d009e11'
    Error PlannerHorizonStartDateDoesNotMatchTimeScale (28016) - column
    HORIZON_START_DATE
    Error PlannerHorizonEndDateDoesNotMatchTimeScale (28017) - column
    HORIZON_END_DATE
    General
    Error:
    QueueAnalysisCannotUpdateAnalysis (29682). Details: id='29682' name='QueueAnalysisCannotUpdateAnalysis' uid='53485703-2e6b-e411-93ff-00155d009e11' AnalysisUid='dd5b4cc1-2d6b-e411-93ff-00155d009e11' MessageType='Microsoft.Office.Project.Server.BusinessLayer.QueueMsg.AnalysisMessage'
    MessageID='1' Blocking='DontBlock'.
    Queue:
    GeneralQueueJobFailed (26000) - AnalysisUpdate.AnalysisMessage. Details: id='26000' name='GeneralQueueJobFailed' uid='54485703-2e6b-e411-93ff-00155d009e11' JobUID='e54a3803-2e6b-e411-93ff-00155d009e11' ComputerName='e3b7641f-b9bf-4a1f-a5b3-a641445bd3ea'
    GroupType='AnalysisUpdate' MessageType='AnalysisMessage' MessageId='1' Stage='' CorrelationUID='c6b3cb9c-f4ad-b0df-0000-02b15d6add06'. For more details, check the ULS logs on machine
    e3b7641f-b9bf-4a1f-a5b3-a641445bd3ea for entries with JobUID
    e54a3803-2e6b-e411-93ff-00155d009e11. 
    Could someone please help me to resolve this issue?
    Many thanks
    Mohsin Raza

    Hi Mohsin,
    The message indicates that your capacity analysis (resource based) is out of range. It might be due to your planning horizon and granularity settings, or to the project start and finish dates in the portfolio properties.
    Also, what are the start and finish dates of your projects? They cannot be outside the range 1984 - 2049.
    Hope this helps,
    Guillaume Rouyre, MBA, MVP, P-Seller

  • Detailed Source System Analysis

    Hi
    I am a newbie to this community and am trying to learn as much as I can, as quickly as possible. :)
    I am looking for any standard templates available for a detailed "source system analysis". We are trying to put that into practice on two systems we are targeting, and I want to make sure we are not missing any information that needs to be gathered from the source system from any angle. So if any of you can point me to the right links, resources or templates that would help in capturing all of this, please do.
    If my understanding is right, this is a task that is inevitable in a typical warehouse development cycle, so I am expecting such templates to exist already. Correct me if I am wrong.
    hoping to hear favourable replies asap.. :)
    Have a great day at your end..
    best regards
    Manesh

    Hello Manesh,
    If your question is non-technical, I would recommend the book "The Data Warehouse Lifecycle Toolkit" by R. Kimball et al. (ISBN: 0471255475) as a starting point for discovering client requirements. The book contains detailed questions for every phase of a DW project, including source system discovery; it even contains an Excel questionnaire / requirements sheet on the companion CD.
    If technical and OWB-related, the next release of OWB (10.2) will, as we're told, contain advanced source system table analysis. The present OWB only lets you discover database objects that can be imported into OWB, as well as discover/sample flat file structures.
    Good luck, Hans Henrik

  • SAP database tables - Column statistics not found for table in DB02 - During import, inconsistent tables were found - Some open conversion requests still exist in the ABAP dictionary.

    Hi Experts,
    I'm implementing SAP Note 1990492, which requires manual implementation. The implementation includes modifying standard tables (i.e. appending the structure FIEU_S_APP_H to table FIEUD_FIDOC_H). After this I adjusted the table in SE14 (Database Utility). I have run the checks in SE14 and it shows the table is consistent.
    But in DB02 -> Space folder -> Single table analysis -> (enter the table name) -> Indexes tab, upon clicking Statistics there is a warning "Column statistics not found for table".
    Our basis team is implementing an Add-On in the development system related to RWD Context Sensitive Help. They cannot proceed due to the following inconsistencies found in the table.
    Screenshot error from the activity:
    Screenshot from DB02:
    I'm an ABAP developer and have no other ideas on what to do. Thanks in advance.

    Hi All,
    We were able to fix the issue through the following:
    1. Call transaction SE14.
    2. Enter the name of the table and choose "Edit".
    3. Choose "Indexes".
    4. Select the index and choose "Choose (F2)".
    5. If you choose "Activate and adjust", the system creates the index again and it is consistent.
    6. Check the object log of this activation.
    7. If an error occurs, eliminate the cause and reactivate the index.

  • Archiving table CDCLS

    Hi gurus,
    I have been asked to provide a solution for archiving CDCLS table entries with a periodic process, as this table caused a problem while migrating to Unicode (it was a test scenario, but we noted that the export took a very long time because of the large number of entries).
    As I am not an expert on SAP archiving, this survey is based on ADK (as the data won't be archived on external storage), for which I have already found a lot of documentation on SDN and the SAP Community websites.
    Thus, what I require is a point of view and/or insight from archiving gurus (used to performing such a task) to reach my goal.
    I understand that I have to create an archiving object (in AOBJ), but I do not know whether prerequisites exist (though I suppose they might).
    Finally, the entries that are required when creating a new archiving object are not very easy to determine...
    Thanks to anyone who would be able to help me.
    Rgds,
    Zulain

    Hi,
    CDCLS (a cluster table) contains change documents, which are basically a log of changes to master records and documents.
    The archiving object CHANGDOCU should only be used to archive the change documents of master data; refer to SAP Note 140255. Change documents of transaction data records should still be archived along with the appropriate archiving object.
    To find the appropriate application object you will need to do a table analysis using transaction TAANA. SAP Note 689036 gives you more details on change document table analysis.
    Furthermore, you can delete change documents using report RSCDOK99. Check SAP Notes 180927 and 183558.
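    For illustration only: the kind of breakdown a TAANA analysis of the change document tables produces boils down to counting entries per application object class. A rough database-level sketch of the same idea, assuming the standard CDHDR header columns (OBJECTCLAS, UDATE):

    -- Count change document headers per object class (and the oldest entry)
    -- to see which application objects dominate and how far back the data goes.
    SELECT objectclas,
           COUNT(*)   AS headers,
           MIN(udate) AS oldest_change
    FROM   cdhdr
    GROUP  BY objectclas
    ORDER  BY headers DESC;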
    Hope this helps
    Cheers!
    Samanjay

  • Tables not archived with CO_ORDER

    Hi All,
    CO_ORDER is the archiving object for archiving orders in CO
    The following tables, although they are listed in transaction DB15, are not archived with CO_ORDER, but their entries are deleted when the delete program pertaining to CO_ORDER is run. These tables are, however, archived and deleted with the CO_ITEM archiving object with object type ORD.
    The tables that are deleted are COBK, COBK_INDX, COEJ, COEJL, COEJR, COEP, COEPB, COEPL and COEPR.
    If the CO_ORDER object alone is archived, these tables will not be archived but will be deleted directly.
    Is there any sequence in archiving CO_ITEM and CO_ORDER?
    Any pointers to the same will be helpful.

    Hi,
    Refer to the following note and its corresponding referenced notes for further information on this issue.
    Note: 110923  Archives cannot be read or accessed
    In addition, also refer to the following note that provides further clarity on the use of archiving and analysis (especially the information in relation to Programs RARCCOA1, RARCCOA2)
    Note: 138688   Archiving/Controlling: table analysis
    If the issue is that the archive files for CO_ORDER are no longer available (deleted), then please refer to the following information:
    If you run the analysis program RKAARCS1 and the selection for the archive files shows a 'lightning bolt', this means the files are not accessible.
    You can then check whether the archive information structures for CO_ORDER are built. If they are not, there will be a 'red light', which indicates that you should build the structure via transaction SARI and Status. This is not possible if the archive files are not available, though.
    You will either have to restore the archive files or create a new archive file. Care should be taken when you create archive files that they are stored in a secure location and backed up on a separate schedule from your normal backup.
    I hope this information helps you get the results you are seeking.
    Kind Regards,
    Brendan
    Edited by: Brendan O'Brien on Sep 2, 2008 2:21 PM

  • DVMC Saving Potential for table COEP

    Hello All,
    I am curious to know the answer to the interesting question below.
    We have implemented the Solution Manager DVMC tool for a few production systems. After we set up the query and BPM to analyze the saving potential calculation for a few of the tables, as per the DVM size growth, the question below applies.
    The saving potential for tables COEP and MSEG (for systems like 4.7) points to deletion potential. For example, the table's last analysis shows 146.149,000 and of that the deletion potential = 146.093,987.
    So, based on the saving potential, we want to build a strategy around the operational possibilities of the deletion potential. However, looking at the SAP standard best-practice documentation, we see that for tables like MSEG and COEP "deletion cannot be used", which quite contradicts the result for saving potential in the DVMC report.
    I believe there must be some reason the table analysis reports a deletion potential for these tables, and I am curious to know what it is.
    Can anybody also explain how we can use the DVMC report, either for saving potential or archiving potential, in operational activities? For example, after we get the result, which records should be deleted according to the deletion potential, etc.?
    Regards,
    Shashi
    Edited by: Shashi Samal on Feb 29, 2012 5:45 PM

    Hello Shashi,
    I wouldn't delete from any table even if the DVMC recommends it. I don't know your DB vendor, but most of them (like DB2) offer table compression for saving space.
    regards, Javier

  • Performance issue with COEP table in ECC 6

    Hi,,
    Any idea how to resolve a performance issue with the COEP table in ECC 6.0?
    We are not using the COEP table right now. This table occupies 100 GB of the 900 GB in the PRD system.
    Can I directly archive/delete the table?
    Regards
    Siva

    Hi Siva,
    You cannot archive COEP table alone. It should be archived along with the respective archive object. Just deleting the table is not at all a good idea.
    To find out the appropriate archive object contributing to the entries in COEP, you need to perform a CO table analysis using programs RARCCOA1 and RARCCOA2. For further information refer to SAP Note 138688.
    Hope this helps,
    Naveen

  • Weight factors in a many-to-many relationship with bridge table

    Hi, I have the same N:N relationship schema of this link:
    http://www.rittmanmead.com/2008/08/28/the-mystery-of-obiee-bridge-tables/
    In my bridge table I have a weight factor for every (admission, diagnosis) pair. If I aggregate and query these columns in Answers:
    DIAGNOSIS | ADMISSIONS_COSTS
    every single diagnosis gets the sum of the WHOLE admission cost it refers to, not its contribution to it (for example, 0.30 as the weight factor). The result is an ADMISSION_COSTS sum larger than the ADMISSION_COSTS sum at the lowest detail level, because the same cost is summed many times.
    How can I use my weight factor to calculate the correct contribution of each diagnosis to its admission? In the BI Administration Tool I tried to build a calculated logical column based on a physical column, but in the expression builder I can select only the ADMISSION_COST measure physical column; it doesn't let me pick the weight factor from the bridge table.
    Thanks in advance!
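    For context, the weighted result the weight factor is meant to produce corresponds to the following SQL; the table and column names (ADMISSION_FACT, DIAGNOSIS_BRIDGE, DIAGNOSIS_DIM) are hypothetical, sketched from the schema described above:

    -- Scale each admission's cost by the weight of the (admission, diagnosis)
    -- pair, so the per-diagnosis totals add back up to the fact-table total.
    SELECT d.diagnosis_name,
           SUM(f.admission_cost * b.weight_factor) AS weighted_admission_cost
    FROM   admission_fact   f
    JOIN   diagnosis_bridge b ON b.admission_key = f.admission_key
    JOIN   diagnosis_dim    d ON d.diagnosis_key = b.diagnosis_key
    GROUP  BY d.diagnosis_name;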

    I'm developing a CS degree project with 2 professors, Matteo Golfarelli and Stefano Rizzi, who have developed the Dimensional Fact Model for data warehouses and wrote many books about it.
    They followed the Kimball theory about N:N and used its bridge table concept, so when I told them that OBIEE has this definition they were very happy.
    But that happiness ended when I said that bridge tables only connect fact tables to dimension tables, and that to create an N:N between levels at a higher aggregation we should use logical joins, as you said in your blog. I need to extract metadata concepts from the UDML export language, and for N:N I can do that only through bridge table analysis; I can't extract and identify an N:N level relationship from a multiple-join schema as in your blog. That is the only limit of your solution for our project!
    PS: sorry for my English, I'm Italian!
    thanks for the replies!

  • Many to Many relationship - Bridge Table

    This post may be more appropriate for a data modeling discussion group, but I thought I would post it here because it will ultimately be modeled/used in OBIEE.
    Can someone help me understand what the point/use of a bridge table is when managing a many-to-many relationship between a fact table and a dimension? I have read a hundred different ways to handle this situation, with the bridge table method being the overwhelmingly approved approach, but I don't see what a bridge table specifically buys you (I'm sure I'm missing something, though).
    For example .. If I have:
    EVENT_FACT
    EFkey
    CRDimKey
    Famount
    CUSTOMER_ROLE_DIM
    CRDimKey
    Customer Name
    Role
    So a customer can hold multiple roles and therefore 1 event fact record could link to multiple CUSTOMER ROLE records and 1 customer role record will most likely link to multiple EVENT_FACT records.
    As I understand it, the bridge approach would put a bridge table CUSTOMER_ROLE_EVENT_BRIDGE in place as follows:
    CUSTOMER_ROLE_EVENT_BRIDGE
    EFkey
    CRDimkey
    WeightFactor
    With this approach you now have the following setup:
    EVENT_FACT one-to-many CUSTOMER_ROLE_EVENT_BRIDGE
    CUSTOMER_ROLE_DIM many-to-many CUSTOMER_ROLE_EVENT_BRIDGE
    Doesn't a many-to-many relationship still exist from the dimension to the bridge table? Since all we did was join the dimension to the fact table to create the bridge table, I don't see how the many-to-many from dimension to bridge goes away.
    It seems somewhat inefficient to join the dimension to the bridge ahead of time to create this table and place the weight factor on it. Why not just compute the weight factor of the dimension, place it as a field on the dimension itself, and use it when joined to the fact table?
    Thanks for the help and insight!!
    k
    Edited by: user_K on May 19, 2010 4:34 PM
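    For illustration, here is a rough DDL and query sketch of the bridge pattern described above (the names come from the post; the data types and the primary key are assumptions). The usual reason the weight sits on the bridge rather than on the dimension is that it is a property of the (event, customer role) pair: the same CRDimKey can carry a different weight for different events, depending on how many roles apply to that event.

    -- Hypothetical bridge table: one row per (event, customer role) pair,
    -- with the weights for each EFkey summing to 1.
    CREATE TABLE customer_role_event_bridge (
      efkey        NUMBER      NOT NULL,  -- FK to EVENT_FACT
      crdimkey     NUMBER      NOT NULL,  -- FK to CUSTOMER_ROLE_DIM
      weightfactor NUMBER(5,4) NOT NULL,  -- share of the event for this role
      CONSTRAINT pk_creb PRIMARY KEY (efkey, crdimkey)
    );

    -- Weighted query: per-role totals stay consistent with EVENT_FACT
    -- because the weights for each event sum to 1.
    SELECT d.role,
           SUM(f.famount * b.weightfactor) AS weighted_amount
    FROM   event_fact                 f
    JOIN   customer_role_event_bridge b ON b.efkey    = f.efkey
    JOIN   customer_role_dim          d ON d.crdimkey = b.crdimkey
    GROUP  BY d.role;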

