Measure Fragmentation

Hello All-
Is there a way of measuring fragmentation in an Essbase database?
Thanks!

Hi-
Thanks for the reply. I used the command GETDBSTATS and got the following:
Average Clustering Ratio : 1
Average Fragmentation Quotient : 52.61024
According to the DBAG, for Average Clustering Ratio:
The average clustering ratio database statistic indicates the fragmentation level of the data (.pag) files. The maximum value, 1, indicates no fragmentation.
According to the DBAG, the Average Fragmentation Quotient guideline for a large database (greater than 2 GB) is 30% or higher:
Any quotient above the high end of the range indicates that reducing fragmentation may help performance.
Now, if I look at my results and try to judge whether my database is fragmented: according to the Average Clustering Ratio it is not fragmented at all, but according to the Average Fragmentation Quotient it is fragmented (and I am sure that my database is fragmented right now). Do you know whether the Average Clustering Ratio is reset to the default value of 1 every time the application is restarted or the services are recycled?
Also, I tried to find the Average Fragmentation Quotient in EAS but could not see where it is shown; I was only able to see the Average Clustering Ratio there.
Thanks!

Similar Messages

  • How to find Fragmentation of tables and tablespaces in oracle 8i

    Hi,
    How can I measure the fragmentation of tables and tablespaces in Oracle 8i?
    Does anyone have a handy script to find this?
    What attributes should be checked to do a health check of a database?

    You need to know how many such empty blocks there are in the table, and of what size. Again, a Tom Kyte reference:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:7760471022221195640::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:231414051079
    For tablespaces, fragmentation is the regular term; the link from the post above can help.
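    As a rough sketch (MY_TABLE is a placeholder name), on 8i you can compare BLOCKS and EMPTY_BLOCKS after gathering statistics to see how much allocated space below and above the high-water mark has never held data:
    -- Sketch only: MY_TABLE is a hypothetical table name.
    ANALYZE TABLE my_table COMPUTE STATISTICS;
    -- BLOCKS       = blocks below the high-water mark
    -- EMPTY_BLOCKS = allocated blocks that have never held data
    SELECT table_name, blocks, empty_blocks
    FROM   user_tables
    WHERE  table_name = 'MY_TABLE';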
    Nicolas.

  • Time series functions are not working for fragmented logical table sources?

    If I remove the fragmented logical table sources, then it works fine.
    If anybody knows the reason, please let me know.
    thanks and regards,
    krishna

    Hi,
    It is because time series functions are not supported for fragmentation content; see this note from Oracle Support:
    The error occurs due to the fact that fragmented data sources are used in some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources.
    Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
    Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
    Regards,
    Gianluca

  • Shared versus dedicated connections and the fragmentation of shared pool memory

    Hi,
    I have an old Oracle 8.1.7 database server.
    I have a legacy application with no source code. This application doesn't use memory efficiently (no bind variables, etc.), i.e. memory becomes fragmented.
    I know that there are two ways to connect to the database (dedicated and shared).
    Based on this, I want to know which of the two options creates more fragmentation. I know that the recommendation is to use dedicated connections, but I'm not sure whether that recommendation applies in this particular environment.
    Thanks in advance.

    Whether you use shared or dedicated connections makes no difference for fragmentation in the shared pool. Whether you hard parse or do not hard parse does matter.
    Measures you can take:
    - make sure often-used packages like dbms_standard are pinned in the shared pool using a startup trigger
    - set session_cached_cursors to 50 or 100; this will reduce parsing
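    A minimal sketch of both measures (assuming DBMS_SHARED_POOL has been installed via dbmspool.sql and you have the privileges to create a database-level trigger; the pinned package list is just an example):
    -- Sketch only: pinned objects stay in the shared pool, so their memory is
    -- not repeatedly freed and re-allocated.
    CREATE OR REPLACE TRIGGER pin_common_packages
    AFTER STARTUP ON DATABASE
    BEGIN
      DBMS_SHARED_POOL.KEEP('SYS.STANDARD', 'P');
      DBMS_SHARED_POOL.KEEP('SYS.DBMS_STANDARD', 'P');
    END;
    /
    -- Cache frequently repeated cursors to cut down on parsing; in 8i this is
    -- normally set in the init.ora, or per session:
    ALTER SESSION SET session_cached_cursors = 100;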
    Sybrand Bakker
    Senior Oracle DBA

  • Fragmenting

    I'm sure this topic has been covered in depth on every forum sometime long ago, but I was hoping to get input on defragmenting my hard drive. As I still don't have an external drive, my scratch disk is my hard disk. Having done 4-5 pretty significant projects in FCE, my computer is definitely running slower than it did before. I know Apple says that defragmenting is unnecessary, but I've also heard that only applies to smaller files (not huge movie files). A FireWire or SCSI external drive is still out of my budget for now, so I'm going to continue doing projects on my computer's hard drive. The only Apple-specific utility I've heard of for this is iDefrag, and I've read mixed reviews about it. My computer has a 100GB hard drive and I have about 40GB of permanent files/software on it. Thoughts please! And I apologize in advance if this subject has been the topic of countless threads before me!

    As you probably know, the simplest and safest way to defragment a hard drive is to copy all your files (except the OS and applications etc.) to another HD and delete all the original ones. Then you copy all the files back onto your original HD and they should be beautifully contiguous.
    However, you say you don't have access to another HD. Is there any chance of borrowing one, or another computer?
    Failing that, you could get 10 DVDs and copy your 40GB to them.
    I know it would be a bit fiddly, but it would cost very little, and if you used DVD-RWs you could use them again.
    Ian.
    PS. A handy free tool to measure the state of fragmentation of your HD is ShowVolumeFragmentation:-
    http://people.freenet.de/amichalak/page1/page1.html
    MacJanitor is useful for minor spring-cleaning of your Mac:-
    http://personalpages.tds.net/~brian_hill/macjanitor.html
    If you want to measure the speed of your computer and compare it with other people's try XBench:-
    http://www.xbench.com/
    They are all FREE utilities.

  • Shall i create timeseries functions for the fragmented logical tablesources

    I am not able to display the results of Ago and ToDate logical columns if I use fragmented logical table sources.
    Can anybody please help me to achieve this?
    thanks,
    krishna
    Edited by: 835868 on Jun 29, 2011 8:30 AM

    Hi,
    It is because time series functions are not supported for fragmentation content; see this note from Oracle Support:
    The error occurs due to the fact that fragmented data sources are used in some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources.
    Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
    Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
    Regards,
    Gianluca

  • Fragmentation

    Hi,
    I am trying to implement fragmentation in our model. I've been through all the documentation I can get my hands on and tried replicating the steps as advised, but fragmentation still does not work.
    Problem: I have a fact at day level linked to a Date dimension plus 2 other dimensions. I have fragmented the fact table into 3 separate ones, namely Oct, Nov and Dec, imported them in the physical layer and joined them to the Date dimension on 'date_sk'. In the business layer I added the 3 sources to my fact and put in the fragmentation content: "Date".FullDate >= '2007-10-01 00:00:00' AND "Date".FullDate < '2007-10-31 00:00:00' for the Oct source, "Date".FullDate >= '2007-11-01 00:00:00' AND "Date".FullDate < '2007-11-30 00:00:00' for the Nov source, and "Date".FullDate >= '2007-12-01 00:00:00' AND "Date".FullDate < '2007-12-31 00:00:00' for the Dec source. I checked the box "This source should be combined..." and joined to the logical Date dimension as required.
    However, when I run a query with a specific date, say 22/10/2007 00:00:00, and one of the measures from the fact, it hits all 3 fragmented sources (UNION ALL) instead of hitting only the Oct source. Can you please advise which steps I have missed or done wrong?
    Many thanks,
    N.

    What is the query that is being fired on the 3 fragmented sources? In your case the 3 sources are mutually exclusive by the filters, but what if they were not mutually exclusive? In that case we would expect it to hit all 3 sources, correct? Also, is it getting data from all 3 sources, or is it just firing queries on all 3 sources?
    Thanks,
    Venkat
    http://oraclebizint.wordpress.com

  • Tablespace fragmentation

    Hi All,
    How can I identify whether a tablespace is fragmented or not?
    If it is fragmented, how can I resolve the issue while the production database is up and running?
    Thanks in advance...

    REM     This script measures the fragmentation of free space
    REM     in all of the tablespaces in the database and scores them
    REM     according to an arbitrary index for comparison
    REM     
    REM     FSFI = 100 * sqrt(largest extent / sum of all extents) / sqrt(sqrt(number of extents))
    REM     
    REM     FSFI - Free Space Fragmentation index
    REM     The largest possible FSFI (for an ideal single-file tablespace) is 100
    REM     As the number of extents increases, the FSFI rating drops slowly. As the size of
    REM     the largest extent drops, however, the FSFI rating drops rapidly.
    select   tablespace_name
    ,        file_id
    ,        sqrt(max(blocks)/sum(blocks)) *
             (100/sqrt(sqrt(count(blocks)))) fsfi
    from     dba_free_space
    group by tablespace_name, file_id
    order by 1, 2;
    To resolve tablespace fragmentation, you can move the objects to another tablespace and rebuild the associated indexes, then drop the old tablespace.
    Also you can go for export/import.
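    A hedged sketch of the move/rebuild route (MY_TABLE, MY_INDEX and NEW_TBS are placeholder names; indexes become unusable during the move, so test and plan for that):
    -- Sketch only: object and tablespace names are placeholders.
    -- Moving a table changes its rowids, so its indexes must be rebuilt afterwards.
    ALTER TABLE my_table MOVE TABLESPACE new_tbs;
    ALTER INDEX my_index REBUILD TABLESPACE new_tbs;
    -- Check that nothing was left unusable after the move.
    SELECT index_name, status
    FROM   user_indexes
    WHERE  status = 'UNUSABLE';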
    Regards
    Asif Kabir

  • Fragmentation in Undo TBS

    Hi all experts,
    I want to know: if fragmentation is found in the undo tablespace,
    is it recommended to treat it the same way as usual permanent, non-system fragmented tablespaces,
    i.e. defragment it?
    I wonder whether temp/undo tablespaces need to be defragmented at all.
    Looking for your kind and professional advice and references.
    Thanks

    Being Locally Managed, TEMP and UNDO are self-managed by Oracle.
    I also wonder if you are "de-fragmenting" other tablespaces. If they are Locally Managed and Allocation_Type is either UNIFORM or AUTO, you do not need to "de-fragment" them. If using UNIFORM, set the appropriate Extent Size when you create the Tablespace.
    If your Tablespace has been converted from Dictionary Managed to Locally Managed, the Extent Management may appear as "USER" -- in which case you have odd-sized extents brought forward from DMT. Again, "fragmentation" is not an issue once they are Locally Managed.
    You could consider rebuilding the tablespaces as LMT with UNIFORM or AUTO when you get an opportunity. (Obviously : TEST, TEST , TEST and measure the effort and downtime -- it may not be justifiable to even expend that effort).
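    For reference, a quick way to see how each tablespace is currently managed (standard dictionary view, should work as-is):
    -- EXTENT_MANAGEMENT: LOCAL or DICTIONARY
    -- ALLOCATION_TYPE  : UNIFORM, SYSTEM (autoallocate) or USER
    SELECT tablespace_name,
           contents,
           extent_management,
           allocation_type
    FROM   dba_tablespaces
    ORDER BY tablespace_name;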
    Hemant K Chitale

  • Aggregate navigate or Fragmentation in OBIEE

    What is the better approach in the OBIEE BMM layer, aggregate navigation or fragmentation, and how is each set up? What are the pros and cons of each of them?

    I still cannot understand the original question:  "aggregate navigate or Fragmentation.."
    Searching the documentation I can find:
    Setting Up Fragmentation Content for Aggregate Navigation
    When a logical table source does not contain the entire set of data at a given level, you need to specify the portion, or fragment, of the set that it does contain. You describe the content in terms of logical columns in the Fragmentation content box in the Content tab of the Logical Table Source dialog.
    and
    Setting Up Aggregate Navigation by Creating Sources for Aggregated Fact Data
    Aggregate tables store precomputed results from measures that have been aggregated over a set of dimensional attributes. Each aggregate.....

  • Which performance counter is JVM fragmentation derived from?

    In a JRA recording there is a measure of heap fragmentation. How is that value calculated?
    Or even better, is there a runtime MBean which can be used to monitor the heap fragmentation?
    Regards,
    Allan

    I remember there have been different ways of calculating the fragmentation over the years. In JFR, I believe the fragmentation is calculated as:
    total heap - free lists - actually used
    In other words, memory lost to fragmentation (until we do a compaction). Internally we refer to the unavailable memory as "dark matter". There is no counter for fragmentation/dark matter available through the MBeans today. We are thinking of adding one though. ;)
    Kind regards,
    Marcus

  • How to measure IPsec tunnel speed? Any tool or Windows command?

    We have connected our two branches using an IPsec tunnel; the connection speed is 2 Mbps, and we are accessing a server from the plant with the 2 Mbps link. After upgrading to 5 Mbps it is still very slow; the other end of the tunnel is 5 Mbps.
    Is there any default speed limit for a tunnel (site-to-site)?

    Generally, there's no logical bandwidth cap on an IPSec tunnel.  The tunnel might have a low default bandwidth setting, but that doesn't actually limit tunnel performance.
    That said, there's much that can impact an IPSec tunnel's performance.
    First, encryption can be very processor usage intensive, so performance will depend much on the supporting hardware.
    Second, encryption consumes additional bandwidth, for its overhead, which can be a fairly large percentage for small packets.
    Third, an IPSec tunnel can often lead to packet fragmentation, which consumes both additional bandwidth and CPU performance.
    There are many performance measurement tools, some free, that you might use.  Personally, on Windows hosts, I often use PCATTCP.  It's a very simple tool; I find the UDP bandwidth generator often good for testing end-to-end bandwidth capacity.  For example, I might set it to send 10 Mbps to the far side to confirm 5 Mbps gets there.

  • How to measure supplier performance when using scheduling agreements?

    Hello all,
    My client has an absolute need to measure the performance of its suppliers based on delivery dates and delivered quantities. That is to say, he needs to be able to compare the requested dates and quantities with what has actually been delivered.
    Most of the procurement processes used are based on scheduling agreements: schedule lines are generated by MRP and the forecast is sent to the supplier, while firm requirements are sent through JIT calls.
    It seems that when doing GR in MIGO, it is posted against the outline agreement number and not against the call. Therefore, we have no way to compare dates and quantities with what was expected (in the JIT call).
    Do you know whether SAP proposes a standard solution to this, and what could be a solution to this issue?
    Thanks for your help
    E. Vallez

    Hi,
    My client faced the same problem and we ended up developing our own analysis in LIS. Since the GR is not linked to a specific schedule line (SAP does some kind of apportioning, but it does not necessarily correlate to the correct match), one needs to make assumptions. Our assumption was the closest schedule line, i.e. each GR is related to the schedule line with the closest date. Then all GRs on the same day are totaled together before the quantity reliability is calculated, since the very same shipment can be reported through several GR transactions in SAP (one per pallet).
    If anybody has info about what SAP has to offer in this question (or is developing), please tell us!
    BR
    Raf

  • Possible to limit dimensions and measures when creating presentations?

    We are trying to use OLAP/BI Beans to add BI functionality to our next-generation data warehouse application. This application has its own security framework, with the ability to define permissions/privileges for objects. We need to integrate BI Beans/OLAP with this security framework.
    One of the things we need to do is control which OLAP objects (like dimensions and measures) are available to a given user in the Items tab when creating a presentation. For example, user A might see dimensions Alpha, Bravo and measure Charlie, while user B might see dimensions Delta, Echo and measure Foxtrot.
    We need to be able to apply these dimension/measure restrictions without using different Oracle users, with each having access only to their own OLAP objects. Our data warehousing application does not use Oracle and Oracle users to control security; it has its own internal frameworks for privileges/permissions. We therefore need to find a way to restrict access to OLAP objects in some programmatic way.
    Here's an example of how this might work:
    - I am a clinical analyst. I sign on to the data warehouse application. The data warehouse knows that as a clinical analyst, I have access to a certain list of objects and functionality across the application. One of the apps I have privilege to is the BI Bean Presentation Creation Application, so I click the menu to bring this up. I can now create BI presentations, but since I am a clinical analyst my list of available dimensions and measures do not contain any of the G/L, payroll or other financial OLAP objects.
    - If I signed onto the data warehousing application as a different user, one that has a financial analyst role, I might see a different set of OLAP objects when I run the presentation application.
    So what we need is some API way to specify which dimensions and measures are available to a given user when they launch the presentation wizard. I've been digging through the BI Beans help and javadoc and have found a few things, but they aren't what I need.
    Here's what I found:
    - setItemSearchPath: this allows you to specify which folders are to be displayed. We want control at the OLAP object level, not at the folder level, so this doesn't work for us
    - setVisibleDimensions: this controls which dimensions are available in the Dimensions tab, not which dimensions can be selected in the Items tab. Doesn't work for us
    - setDimensionContext/setMeasureContext: These might work for us but I haven't been able to get them to retrieve anything yet. It also seems to me that these might set which dimensions/members are initially selected in the Items tab, not the list of dims/measures that are available for selection.
    Any assistance on this matter would be greatly appreciated.
    s.l.

    Reply from one of our developers:
    The get/setMeasureContext and get/setDimensionContext methods are currently only used by the Thick CalcBuilder (in a few limited scenarios) and cannot be used "to scope the dimensions and measures listed in Query and Calc builder based on user access rights".
    The scoping of dimensions and measures based on user access rights should be performed at the MetadataManager/Database level.
    This may change going forward, as the real issue here is the static nature of the metadata and a general issue with the GRANT option within the database: from the database perspective it is not possible to grant select privileges on a single column of a table.
    The metadata issue is more complex, as the OLAP API reads the metadata only once on startup of a session. The list of available measures is based on the GRANT privilege, so for relational OLAP this limits the data scoping capabilities. In 10g, the metadata for AW OLAP becomes more dynamic and is contained in, and read directly from, the AW. Therefore, with an AW OLAP implementation on 10g it could be possible to scope both dimensions and measures quickly and easily.
    Hope this helps
    Business Intelligence Beans Product Management Team
    Oracle Corporation

  • Error while defining alt unit of measurement in material master

    All SAP gurus,
    I am trying to define an alternative unit of measure for a finished material. The base unit of measure is MT and the sales unit is NOS. The conversion is 53 nos = 1 kg, i.e. 1 MT = 53,000 nos. As the conversion can only be maintained against the base unit of measure, I am entering 53,000 nos = 1 MT, but the system is unable to calculate the conversion as it does not take more than 3 digits after the decimal point.
    e.g. if 53,000 nos = 1 MT
    then 1 no should be = 0.000018 MT,
    but the system does not accept this value as it contains 6 digits after the decimal point.
    Please suggest a way out, with detailed steps. Can I increase the number of decimal places, or do I have to define another intermediate alternative unit of measure (like KG)? And if so, how?
    Regards
    BKR

    Hi,
    With the help of the Basis team, check the decimal places setting for your user ID:
    in t-code SU01 >>> under the Defaults tab.
    Kapil
