Do UDAs affect the cube size?

Dear All,
I have created the UDAs.
Can anyone tell me whether UDAs affect the cube size?
And how many attributes can we have in Essbase?
Thanks in advance.

Hi,
UDAs are member labels that you create to extract data based on a particular characteristic.
UDAs have no effect on the cube size.
KosuruS

Similar Messages

  • To find the Cube Size

    Hi guys,
    How do I find the current cube size? Are there any transaction codes or function modules to figure it out?
    Thanks,
    Your help will be greatly appreciated

    Hi Vj,
    To find how much database space a cube is occupying:
    1) Go to SE11, enter the PSA table name /BIC/B********** and display the table. There is a Technical Settings tab at the top; click it and it will display the tablespace and the size. For the cube, do the same with /BIC/D********.
    2) If you are maintaining the BW statistics, you can check the reports on the statistics MultiProvider, or check the statistics cube directly.
    3) To see the disk space used by an InfoCube, use transaction DB02 and sum all the involved tables (Detailed Analysis -> Object Name "<IC_NAME>"), i.e. the E, F and DIM tables (see the SQL sketch after this list).
    4) You can use RSRV -> All Elementary Tests -> Database -> Database information about InfoProvider tables to get the rows in the fact tables and dimension tables.
    You can also use the program SAP_INFOCUBE_DESIGNS.
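    If your BW system happens to run on Oracle and you have database access, a rough SQL sketch of option 3 is to sum the segment sizes of the cube's E, F and dimension tables directly; the cube name ZSALES and the schema owner SAPSR3 below are placeholders, not values from this thread.
    -- Sketch only (Oracle back-end assumed): total space used by the E, F and
    -- dimension tables (and their indexes) of InfoCube ZSALES.
    SELECT ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
    FROM   dba_segments
    WHERE  owner = 'SAPSR3'                       -- adjust to your SAP schema owner
    AND    (   segment_name LIKE '/BIC/FZSALES%'  -- F fact table (and partitions/indexes)
            OR segment_name LIKE '/BIC/EZSALES%'  -- E (compressed) fact table
            OR segment_name LIKE '/BIC/DZSALES%'  -- dimension tables
           );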
    Hope this helps,
    Regards
    Karthik
    Assign points if helpful

  • Fact table size of the cube

    BW Gods!
    I need to selectively delete data from a cube which holds data from 2001-2006; say I want to delete data only for 2001. It was suggested to me by pizzaman in one of our earlier threads that BW checks whether the data to be deleted is more than 10% of the cube size; if it is, it copies the cube without the data to be deleted and then renames the copy. So he suggested that I should find out the fact table size and make sure that there is enough space in my fact tablespace for this operation to go through.
    My question is, how do I find out the fact table size of the cube?
    How do I find out if there is enough space in the fact tablespace for the above deletion operation to be executed?
    I am not too sure whether I have put across the question correctly. Let me know.
    Thank you all in advance
    Ashwin

    You want to make sure you include space for the F fact table and the E fact table, assuming you compress the InfoCube. If you use DB02 and use wildcards you might be able to get all the size info in one shot.
    Generally, the indexes for an InfoCube don't occupy a lot of space, but they do take space.
    The default InfoCube tablespace is PSAPFACTD and the indexes are in PSAPFACTI - not 100% sure that is universal across all the different DBs. Your shop could also have put them in different tablespaces altogether. It is probably worth a quick check with your DBAs - they might already know that they have a lot of space available, so you don't need to take the time to run down the space usage info (see the sketch below for one way to check it yourself).
    The table copy process will NOT delete the original fact tables unless/until it has successfully loaded to new copy.
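    A hedged sketch of what those checks could look like on an Oracle-based system follows; the cube name ZSALES is a placeholder, and the tablespace names are just the defaults mentioned above, so confirm the real names with your DBAs.
    -- Sketch only (Oracle assumed): current size of the E and F fact tables of
    -- a cube called ZSALES, plus the free space left in the fact tablespaces.
    SELECT segment_name, ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
    FROM   dba_segments
    WHERE  segment_name IN ('/BIC/FZSALES', '/BIC/EZSALES')
    GROUP BY segment_name;

    SELECT tablespace_name, ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
    FROM   dba_free_space
    WHERE  tablespace_name IN ('PSAPFACTD', 'PSAPFACTI')
    GROUP BY tablespace_name;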

  • How can I reduce the file size rendered by After Effects?

    When I render a relatively simple 5 second project in After Effects, the file size of the resultant .avi is 64MB.  If I change the properties to reduce the file size, the degradation makes the file unusable.  What am I doing wrong?

    Is AE's encoder really that inefficient? 
    The thing is, AVI doesn't mean much.
    It's pretty much an empty container box, which doesn't imply a quality level.
    So, AME could default to something completely different as a starting point to produce an AVI file.
    AE defaults to uncompressed video when you pick AVI as a format, so obviously this produces huge file sizes. There could be similar quality thresholds at smaller sizes if you pick other AVI codecs, but that's a different subject. In any case, when you're rendering a production quality master, file size is usually not your main concern. You typically use this high quality video file as a source for compressed flavors for distribution. So, pristine video files with huge sizes are a good thing - people then wonder why trailers at apple.com, for instance, look so good. And the thing is, the most compressed formats benefit enormously from having an uncompressed file as a source.
    Regarding encoding efficiency, yes, AE is less efficient than dedicated encoding solutions, above all because it doesn't support 2-pass encoding. Note that for some formats 2-pass makes a night and day difference, while for others the difference is nothing as drastic as most users seem to believe.
    All of this is a moot point for AVI, because the default AVI codecs don't offer these encoding options, which are more the realm of distribution formats like FLV, MPEG-4/H264 or WMV.
    There are distribution codecs which use AVI as a container out there, but those are a different case.

  • How to find out Cube Size (Step by step process)

    Hi all,
    Can anybody tell me how I can find out the cube size?
    Thanks in advance.
    Vaibhav A.

    Hi,
    try Tcode DB02
    and
    A simplified estimation of disk space for the BW can be obtained by using the following formula:
    For each cube:
    Size in bytes =
    [(n + 3) x 10 bytes + (m x 17 bytes)]
    x (rows of initial load + rows of periodic load x no. of periods)
    where
    n = number of dimensions
    m = number of key figures
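    As a purely illustrative example with made-up numbers: for a cube with n = 5 dimensions and m = 10 key figures, each row needs (5 + 3) x 10 + 10 x 17 = 250 bytes. With an initial load of 1,000,000 rows plus 100,000 rows per period over 12 periods (2,200,000 rows in total), the estimate is 250 x 2,200,000 = 550,000,000 bytes, i.e. roughly half a gigabyte for the fact data alone.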
    For more details please read the following:
    Estimating an InfoCube
    When estimating the size of an InfoCube, tables like fact and dimension tables are considered.
    However, the size of the fact table is the most important, since in most cases it will be 80-90% of the
    total storage requirement for the InfoCube.
    When estimating the fact table size consider the effect of compression depending on how many
    records with identical dimension keys will be loaded.
    The amount of data stored in the PSA and ODS has a significant impact on the disk space required.
    If data is stored in the PSA on more than a temporary basis, it is possible that more than 50% of the total disk space will be allocated for this purpose.
    Dimension Tables
    • Identify all dimension tables for this InfoCube.
    • The size and number of records need to be estimated for a dimension table record. The size of one record can be calculated by summing the number of characteristics in the dimension table at 10 bytes each. Also, add 10 bytes for the key of the dimension table.
    • Estimate the number of records in the dimension table.
    • Adjust the expected number of records in the dimension table by expected growth.
    • Multiply the adjusted record count by the expected size of the dimension table record to obtain the estimated size of the dimension table.
    Fact Tables
    • Count the number of key figures the table will contain, assuming a key figure requires 17 bytes.
    • Every dimension table requires a foreign key in the fact table, so add 6 bytes for each key. Don't forget the three standard dimensions.
    • Estimate the number of records in the fact table.
    • Adjust the expected number of records in the fact table by expected growth.
    • Multiply the adjusted record count by the expected size of the fact table record to obtain the estimated size of the fact table.
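    As another purely illustrative example: for a fact table with 10 key figures and 5 user-defined dimensions, one record needs 10 x 17 = 170 bytes for the key figures plus (5 + 3) x 6 = 48 bytes for the dimension keys, about 218 bytes in total. At an expected 15,000,000 records after growth, that gives roughly 15,000,000 x 218 = 3,270,000,000 bytes, or around 3 GB for the fact table.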
    Regards,
    Marasa.

  • ASO physical clear increases cube size?

    I am testing out performance and feasibility of a physical clear and have noticed that the cube size actually INCREASES after doing a physical delete. The cube size increased from 11.9 GB to 21.9 GB. Anyone know why?
    Our ASO cube is used as a forecasting application and models are being updated constantly. A logical delete won't work because a model can be updated multiple times, and this direct quote from the manual rules out the logical clear option:
    "Oracle does not recommend performing a second logical clear region operation on the same region, because the second operation does not clear the compensating cells created in the first operation and does not create new compensating cells."
    Here is the MaxL (with an MDX region specification) I used to do a physical clear of ~120 models from the cube:
    alter database Express.Express clear data in region
    '{PM10113,
    PM10123,
    PM10124,
    PM10140,
    PM6503,
    PM6507,
    PM6508,
    PM6509,
    PM6520,
    PM6528
    }' Physical;
    Any insight would be greatly appreciated.

    I am sorry, but I do not have my test system available, so I will have to do this from memory.
    I am surprised at this - it is what you would expect if you did a logical clear. When you look at the database statistics, does the number of cells reflect the original less the cleared region? And does it show no slices? If so, then you did do a physical clear, as your MaxL indicates.
    You might want to stop and start the application. Otherwise I will have to check some more.
    But given the size of the increase (almost doubled), I wonder why you would do a clear as opposed to a reset. For that matter, why are you doing a clear at all? Why not just send the data and let it create an incremental slice? That way only the changed cells would end up in the slice. More important, the slices would be quite small and would likely be automatically merged.
    Finally, regarding the DBAG quote (page 982 in the PDF) you included: again I would have to test, but I think it is only warning that the number of slices will start to build up, since "the second operation does not clear the compensating cells created in the first operation and does not create new compensating cells". The net result would still be correct.

  • ASO: Cube size

    Hi,
    We are currently working on designing a system with the cube size estimated to be in the region of 400 GB to 600 GB. I wanted to explore Essbase ASO as an option. Has anyone implemented a cube of that size using ASO, and if so, how is the performance with regard to queries and processing time? Are there any specific issues that we have to keep in mind? We would have about 12 dimensions.
    Thanks,
    Amol

    hi Amol,
    1. On what basis did you estimate your cube at around 400 GB to 600 GB?
    2. If ASO is an option, its huge advantage lies in space; it does not take as much space as BSO.
    3. I have seen cubes whose size was around 300-400 GB in BSO; when the same cube was rebuilt as ASO, it consumed 40-45 GB of space.
    Hope this helps
    Sandeep Reddy Enti
    HCC
    http://hyperionconsutlancy.com/

  • Check the database size

    Hi,
    Kindly let me know if there is a MaxL command available to check the cube size and application size for all applications on one server.
    Thanks,
    VIkram

    Hi,
    Visit this link to display some storage-related metrics. Refer to [Disk Volume|http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_techref/maxl/ddl/statements/dispdvol.htm] & tablespace in particular.
    - Natesh

  • What happens to the cube performance when you have too many UDAs?

    Hi,
    What's the impact on cube performance if you have too many UDAs associated with the same level of members? Let's say 15 UDAs for each member.
    If attributes are not an option, what's another way to help alleviate the data load?

    Hi,
    Adding UDAs to the outline doesn't impact the size of the Essbase database.
    I've seen cubes with over 70 UDAs, and I think UDAs are the best choice for driving load rules.
    But do consider the implications for dynamic calculations, as reporting on UDAs can be slow. Define UDAs on sparse dimensions only.

  • In Photoshop CC 2014, why has Adobe removed the "Print Size" button? And why have they made the brush preview window so small you can't see the effects to accurately adjust them?

    In Photoshop CC 2014, why has Adobe removed the "Print Size" button? And why have they made the brush preview window so small you can't see the effects to accurately adjust them?
    These two things need to be remedied in the upcoming updates, because it has become such a bother that I've had to revert to older versions of PS.

    Yeah, and it's a pain to get to. What was the rationale for removing the Print Size button? With the magnifying glass tool selected in older versions of Photoshop you would get a button that says "Print Size" right next to Actual Pixels, Fit Screen and Fill Screen. Now I have to go out of my way and dig for it if I want to see the print size. Of all the buttons you could have taken away, why not Fill Screen? I never use that.
    Beside the point: I just downloaded the October update for PS and, surprise, nothing has changed, and these are two big black eyes on newer versions of Photoshop. The brush preview window is unchanged in CC, but in CC 2014 it's basically worthless. If you apply a texture to a brush you can't properly see the scale of the texture, the spacing, the scatter, etc.

  • Adjustment Brush in Lightroom 5 has an effect area smaller than the brush size

    Hi,
    When using the Lightroom 5 Adjustment Brush, the effect is only in the inner half of the brush size circle. In other words, if you select a 1 inch brush to darken, only about the center 1/2 inch is being darkened. I'm 100 percent sure the brush has no feathering, and all the settings are fine. An expert told me to delete the preferences and try again, and it still does the same thing. I just updated to LR 5.7 hoping it would fix it, to no avail. The LR on my laptop behaves perfectly fine.
    Attached is a screen shot and a phone shot (excuse the moire, I had to use a phone to show the brush size properly) to show settings and the effect I'm talking about, so it's perfectly clear.
    I have been putting up with this for months, but it's driving me crazy. It also occurred to me that all the adjustments I have made might 'change' later once the problem is fixed, and that this might show up without my knowledge when I export the files later. That would waste hundreds of hours of work if something resets somehow.
    Does anyone have any suggestions on how to fix this?
    Thanks in advance for all replies!
    Stu

    This problem is caused by changing the cursor size in the Accessibility settings of your OS. Here is a screen shot from the Mac, but Windows has similar settings. I reported this in Lightroom 2 or 3 and it is STILL not fixed. (Adobe's bug-fixing rate, to me, is pretty dismal.) If you change your system cursor to anything except "Normal" you will get this behavior, as Lightroom doesn't account for the system preference. I would like to make my cursor bigger, but this prevents it.

  • How do you increase the Brush Size past 50 on the Stroke Effect?

    Hey Everyone,
    I created a photo in Photoshop and imported it into AE, and I've been trying to create a spray paint effect by using the Stroke effect with Reveal Original Image, to make it look as if I'm spray painting the picture onto a wall. All is going well until I get to the brush size, which only goes up to 50. Does anyone know how to increase it past 50? 50 is too small; it won't cover what I want it to cover for the stroke. I can't seem to figure out a way to get it higher than 50, and I don't know any other way to create the effect I'm going for. Does anyone know how to either get past this or achieve the spray-painting look in a different way? Any help at all would be great.
    Thanks in advance!
    -Raven

    The After Effects CC (12.2) update increases the maximum size of the brush for the Write-on effect and Stroke effect to 200 pixels (from the previous maximum of 50 pixels).

  • How to calculate the HFM cube size in SQL Server 2005

    Hi
    How do I calculate the HFM cube size in SQL Server 2005?
    The query below is what we use for Oracle. What is the equivalent query for SQL Server?
    SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'FINANCIAL_%' and owner='HFM';
    SUM(BYTES/1024/1024)
    SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'HSV FINANCIAL%' and owner='HFM';
    SUM(BYTES/1024/1024)
    Regards
    Smilee

    What is your objective? The subcube in HFM is a concept which applies to the application tier - not so much to the database tier. The size of the subcube is the number of unique data strips (data values for January - December inclusive, for example) for the given entity, currency triplet or Parent.Child node. You have to account for parent accounts and customs which don't exist in the database but are generated in RAM in the application tier.
    So, if your objective is to find the largest subcubes, you could do this by querying the database and counting the number of records per entity/value combination (DCE tables) or parent.child entity combination (DCN tables). I'm not versed in SQL, but I think the script you posted would just tell you the schema size and not the subcube sizes.
    Check out Accelatis.com for a third party software product that can do this for you. The feature is called the Subcube Analyzer and was written by the same team that wrote HFM, so they ought to know how this works :-)
    --chris
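    For what it is worth, a rough SQL Server 2005 equivalent of the Oracle query above could look like the sketch below; as chris notes, it only reports table/schema size, not subcube sizes, and the table-name patterns are simply carried over from the Oracle version.
    -- Sketch only: approximate size in MB of the HFM FINANCIAL_* / HSV FINANCIAL*
    -- tables, run inside the HFM database.
    SELECT SUM(ps.reserved_page_count) * 8 / 1024.0 AS size_mb
    FROM   sys.dm_db_partition_stats ps
           JOIN sys.tables t ON t.object_id = ps.object_id
    WHERE  t.name LIKE 'FINANCIAL[_]%'
       OR  t.name LIKE 'HSV FINANCIAL%';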

  • Identify the cubes where the dimension table size is exactly 100% of the fact table

    Hello,
    We want to identify the cubes where the dimension table size is exactly 100% of the fact table size.
    Is there any table or standard query which can give me this data?
    Regards,
    Shital

    Use report SAP_INFOCUBE_DESIGNS (run via SE38); it shows each cube's dimension table sizes as a percentage of its fact table (see the SQL sketch below for a direct check on the database).
    M.
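    If you also want to check this directly on the database (Oracle assumed here), a sketch comparing dimension-table row counts to the fact-table row count for a single cube might look like the following; the cube name ZSALES is a placeholder, and the optimizer statistics need to be reasonably current for num_rows to mean anything.
    -- Sketch only: dimension table rows as a percentage of the F fact table rows.
    SELECT d.table_name,
           ROUND(100 * d.num_rows / NULLIF(f.num_rows, 0), 1) AS pct_of_fact
    FROM   dba_tables d
           JOIN dba_tables f ON f.table_name = '/BIC/FZSALES'
    WHERE  d.table_name LIKE '/BIC/DZSALES%'
    ORDER BY pct_of_fact DESC;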

  • Cube / InfoSet changes affect the queries / reports / workbooks

    Hello Experts,
    I have added InfoObjects to the cube, ODS and InfoSet. These objects have been in production for a long time and there are queries / reports on them.
    When I transport the cube, ODS and InfoSet with the newly added objects, will this affect the queries / reports which the users are already using?
    Is there any caution or procedure to be followed?
    Thanks,
    BWer

    Hi BWer,
    In addition to what Arun said..
    If you are adding anything to the InfoSet, make sure you adjust the InfoSet in transaction RSISET.
    Ashish.
