How to compress a cube

Hi Experts,
I am a beginner in SAP BI. I executed an outbound job through a process chain in the background, but it got stuck at a step due to an SQL statement error. The DBA team advised compressing this cube; it has 17M+ rows in the F fact table. They may also be able to re-create the F fact table as partitioned and with the new index structure for the ~P index… meaning with the time dimension in the first column as opposed to the packet dimension.
The cube holds the last 3 years of data, and the compression needs to be done in slots. Can you tell me the procedure for doing it?

You mentioned 2 different solutions to improve performance.
For compression, you can do it at:
1) RSA1 -> Cube -> Manage
2) Find the compression process type and add it to your process chain.
Compression sets the request ID to '0' and aggregates records that share the same characteristic combination.
After you run it, the total number of records will decrease; by how much depends on the data in the cube itself.
For partitioning, go to RSA1 -> Cube and find it in the menu: Extras -> DB Performance -> Partitioning.
It divides the data into groups by the values of a particular characteristic.
You also have the option of partitioning at the physical database level, which is not supported by all databases. Check the relevant technical notes.
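
A quick way to see how a compression step is set up before adding one to your own chain is to look it up in the chain definition table. Below is a minimal ABAP sketch; the process-type ID 'COMPRESS' and the layout of table RSPCCHAIN are assumptions to verify in your system:

  * Sketch: list active process chains that already contain an
  * InfoCube compression step.
  REPORT z_find_compress_steps.

  DATA: lt_steps TYPE STANDARD TABLE OF rspcchain,
        ls_step  TYPE rspcchain.

  SELECT * FROM rspcchain INTO TABLE lt_steps
    WHERE type    = 'COMPRESS'   " compression process type (assumption)
      AND objvers = 'A'.         " active chain version

  LOOP AT lt_steps INTO ls_step.
    WRITE: / ls_step-chain_id, ls_step-variante.
  ENDLOOP.

For compressing 3 years of data in slots, you would repeat the Collapse step in the Manage screen several times, each time entering a request ID a bit further along, so that each run only condenses a manageable chunk.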

Similar Messages

  • Data mart status is not visible for compressed cube

    Hi Experts,
    We load data daily into the BI inventory cube 0IC_C03 with compression, but when I load the same data into the SEM server it does not get the init flag. I also checked the data mart status in BI but could not find it.
    Please help me understand why the data mart symbol is not shown in BI and why the init flag is missing for the same data in SEM.
    I think this might be related to the compressed cube... please give me a complete solution.
    Thanks
    Pinky....

    Hello,
    Try a couple of checks:
    Observe the values of the fields BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG, etc. in the PSA and see whether they are null.
    If they are not null, the process keys (0PROCESSKEY and 0BWAPPLNM) are activated; otherwise activate them. Take a look at note 353042.
    When loading to another target, do a full update, followed by an init without data transfer, followed by delta,
    OR
    a regular init followed by delta,
    OR
    first do an init without data transfer, and
    second do a repair full load, which loads all the data present at that instant in 0IC_C03;
    after that run the delta load, which will bring the deltas.
    Don't do any compression before the delta, because you will not get any delta records afterwards. The compression combines all of the non-compressed requests, so you will lose data in unloaded deltas.
    When compressing after loading 2LIS_03_BX, leave 'No Marker Update' NOT set, i.e. unticked/unchecked.
    Regards,
    Dhanya
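
    As a side note, the first check (initial process-key-dependent fields in the PSA) can be scripted rather than inspected row by row. A minimal sketch; the PSA table name is a hypothetical placeholder, since PSA tables have generated names you must look up first:

      * Sketch: count PSA records whose quantity/value fields are
      * initial, which would point to an inactive 0PROCESSKEY.
      REPORT z_check_psa_processkey.

      DATA: lv_tab   TYPE c LENGTH 30 VALUE '/BIC/B0000123000', " hypothetical PSA table
            lt_where TYPE STANDARD TABLE OF string,
            lv_rows  TYPE i.

      * Two of the fields named above; extend the condition as needed.
      APPEND `BWGEO = 0 AND BWMNG = 0` TO lt_where.

      SELECT COUNT(*) FROM (lv_tab) INTO lv_rows
        WHERE (lt_where).

      WRITE: / 'Records with initial BWGEO/BWMNG:', lv_rows.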

  • Rollup and Datamart concepts for compressed cube

    Hi all,
    I loaded the data into cube and i compressed it ( collapse ),
    Then I have 2 questions
    1. Can i use the  Rollup option now?
    2. Can I load the data into another cube by using the compressed cube?
    good answers are rewarded
    Regards
    Ram

    Hi Sri,
    It's not that we cannot do rollup before compression; you are right that we do rollup to fill the aggregates. When you compress, the request IDs are deleted and the data moves from the F table to the E table; there are no request IDs left at all.
    Rollup is based on requests only, so if there is no request you cannot roll it up. Note also that a request is not available for reporting until you roll it up into the aggregates.
    Regards
    Khaja

  • Different aggregation operators for the measures in a compressed cube

    I am using OWB 10gR2 to create a cube and its dimensions (deployed into a 10gR2 database). Since the cube has 11 dimensions, I set all dimensions to sparse and the cube to compressed. The cube has 4 measures; two of them have SUM as the aggregation operator for the TIME dimension, while the other two should have AVERAGE (or FIRST). I have SUM for all other dimensions.
    After loading data into the cube for the first time, I realized that the aggregation over the TIME dimension was not always (although sometimes) correct. It was really strange, because the aggregated values were either correct (for SUM and for AVERAGE) or "near" the correct result (like an average of 145.279 and 145.281 coming out as 145.282 instead of 145.280, or 122+44+16 coming out as 180 instead of 182). For all other dimensions the aggregation was OK.
    Now I have the following questions:
    1. Is it possible to have different aggregations for different measures in the same COMPRESSED cube?
    2. Is it possible to have the AVERAGE or FIRST aggregation operator for measures in a COMPRESSED cube?
    For a 10gR1 database the answer would be NO, but for a 10gR2 database I do not know. I could not find the answer, either in the Oracle documentation or anywhere else.

    What I found in an Oracle presentation is that the compressed cube enhancements in 10gR2 support all aggregation methods except weighted methods (first, last, minimum, maximum and so on). It is from September 2005, so maybe something has changed since then.
    Regarding your question about the results, I think it is caused by the fact that the calculations are done on doubles and then compressed, so maybe a little precision is lost :(. I am really curious whether it is due to numeric (precision loss) issues.

  • DTP from Compressed cube

    Hi Gurus,
    I have a partially compressed cube: some of the requests are compressed and the rest are not.
    I am using this cube as a data mart source for a similar cube (used for backing up data).
    I am using delta with the request-by-request option in the DTP.
    After the data load I am facing the problem that the data is not the same; the target cube has higher figures.
    Is this because of using request-by-request (in the DTP), since I have some compressed requests in the source cube?
    Thanks
    Dheeraj

    Hi Dheeraj,
    If you are moving data from one cube to another, all requests should be compressed first, and then you move the data to the other cube.
    I hope you then won't have any problem.
    The reason the data is not the same is the request-by-request option (in the DTP): if you set this indicator, a DTP request only gets data from one request in the source.
    When it completes processing, the DTP request checks whether the source contains any further new requests. If the source contains more requests, a new DTP request is automatically generated and processed.
    If you are not loading deltas, uncheck this indicator and load again.
    Regards
    SKBABU

  • AVG aggregation in compressed cube

    We are using OWB 10gR2 and have designed a compressed cube with eight dimensions, one of them the TIME dimension with a standard year and week hierarchy. The (single) measure is not summed over time; we have to calculate the average over time instead.
    What the cube calculates is sometimes correct, but not always. I tried a bunch of things (having the time dimension with only one hierarchy, partitioning the cube or not, ...) to get the problem solved, but without result. OLAP 10gR2 should be able to calculate an average in a compressed cube, so I assume that the code OWB generates is not correct.
    Has anybody ever designed a compressed cube with an aggregation different from SUM? Is this possible using OWB?

    Hi,
    Even if you compress the cube, you can still delete data from it by selective deletion, based on your dates/characteristics, etc. See my blog at
    https://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Using Selective Deletion in Process Chains
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/using%20selective%20deletion%20in%20process%20chains.pdf
    Thanks
    Reddy

  • Help needed for running Aggregation map for a compressed cube

    Hello everybody,
    I am having a problem running an aggregation map that uses a model on a compressed cube. Please find the model and aggregation map definitions below.
    ---Model
    DEFINE MYMODEL MODEL
    MODEL
    DIMENSION this_aw!ACCOUNT
    ACCOUNT('NIBT') = nafill(OPINC 0) + nafill(OTHINC 0)
    END
    -----Aggregation Map
    DEFINE MY_AGGMAP AGGMAP
    AGGMAP
    MODEL MYMODEL PRECOMPUTE(ALL)
    AGGINDEX OFF
    END
    When running the aggregation with this map on an uncompressed cube the model works fine, but when I try to aggregate a compressed cube it throws an error (String index out of range: -1). I would appreciate it if anyone could offer some thoughts on this problem.
    The cube has five dimensions apart from ACCOUNT, and it is partitioned by PERIOD. I am using Oracle 10g 10.2.0.4.0 with AWM.
    Thanks,
    Vishal
    Edited by: user7740133 on Sep 16, 2008 5:23 PM

    Vishal,
    I am not sure about using composites to run the model, but you can limit your dimension values to those that have data, then run the model on cube_prt_topvar and aggregate the cube using the default aggmap. You have to limit all dimensions to ALL before you run the aggmap.
    I just saw the account model you posted initially. In your scenario you can limit your ACCOUNT dimension to only the three values 'NIBT', 'OPINC' and 'OTHINC', and the other dimensions to ALL. When you run the model, values will not be aggregated for ACCOUNT, but for the others you will see aggregated values. If you want values aggregated for ACCOUNT as well, I suggest limiting all the dimensions to leaf level, running the model, and then aggregating the cube using the default aggmap.
    Hope this helps.
    Thanks
    Brijesh
    Edited by: BGaur on Oct 25, 2008 1:10 PM

  • Compressing cube requests is not possible

    Hi Experts,
    When I compress a cube request, the system reports that there is no valid request to compress, but there are actually more than 30 requests in the cube.
    Does anyone have this issue, and how do you solve it?
    By the way, I use BI 7.0 with DTP transfers, and different DataSources provide data to this cube.
    Thanks.

    Hi
    Thanks.
    There is no red request and everything is OK: I can use this cube's data to feed other cubes and BEx reports without any issue. But when I want to improve system performance by compressing requests, the system reports that there is no valid request.

  • Datamart from a compressed cube

    Hi All,
    Is it possible to create an export DataSource on a cube that has been compressed, and then load its data into another cube?
    I get the following error when I do the above:
    "No corresponding requests were found for request REQU_3YQUNVFWZQRR87ZPBRYDHE97C in the ODS/Cube"
    Please suggest if you have any ideas.
    Best Regards,
    Sanjay

    Hi Sanjay,
    Yes, it is possible to create an export DataSource on a cube that is compressed.
    Is it an error or just a warning?
    Bye
    Dinesh

  • Steps to compress Cube

    Hi Gurus,
    I need fully detailed steps for doing cube compression. Thanks, Dave

    Hi Dave,
    Basically, when you create a cube, 2 tables are created: the F and E tables.
    When you load data, it is loaded into the F table.
    Compression is the process of compressing the data in the fact table: all the request IDs are deleted and the data is moved into the E table.
    There may be records with the same characteristic combination but different request IDs; these are the records that get compressed into one.
    To do this:
    Right-click on the cube -> Manage -> go to the Collapse tab -> provide a request ID (by default it takes the latest successful request) -> click Release.
    Then all the data is moved to the E table. Note that you can no longer delete data request-wise;
    if you want to delete data, you need to do a reverse posting.
    I think this will be helpful.

  • How To Find Fully Compressed Cubes via Tables?

    Hello,
    I want to use ABAP to determine if a cube is fully compressed or not. I checked out these tables:
    RSCDSREQDELTAB
    RSICCOMP
    RSICCOMPPLAN
    They don't appear to have what I need...any suggestions?
    Thanks for looking,
    Gary
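
    One workable ABAP definition: a cube is fully compressed exactly when its F fact table is empty, since compression moves all rows to the E table. A minimal sketch along those lines, assuming the /BIC/F<cube> naming convention of custom cubes (content cubes use /BI0/F<cube>) and ignoring the fact tables of aggregates:

      * Sketch: treat "F fact table empty" as "fully compressed".
      REPORT z_check_cube_compressed.

      PARAMETERS p_cube TYPE c LENGTH 30. " cube name, e.g. a custom Z cube

      DATA: lv_ftab TYPE c LENGTH 30,
            lv_etab TYPE c LENGTH 30,
            lv_fcnt TYPE i,
            lv_ecnt TYPE i.

      CONCATENATE '/BIC/F' p_cube INTO lv_ftab. " naming convention: assumption
      CONCATENATE '/BIC/E' p_cube INTO lv_etab.

      SELECT COUNT(*) FROM (lv_ftab) INTO lv_fcnt.
      SELECT COUNT(*) FROM (lv_etab) INTO lv_ecnt.

      WRITE: / 'Rows in F table:', lv_fcnt,
             / 'Rows in E table:', lv_ecnt.

      IF lv_fcnt = 0.
        WRITE: / p_cube, 'is fully compressed.'.
      ELSE.
        WRITE: / p_cube, 'still has uncompressed requests in the F table.'.
      ENDIF.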

    Hi,
    In my inventory cube there are two requests: 2917 and 2988.
    2917 is the init of BF, which needs to be compressed with "No Marker Update" ticked.
    2988 is the delta of BF, which needs to be compressed with "No Marker Update" unticked.
    When I compress 2917, the log says:
    "Compression not necessary; no requests found for compressing".
    I have checked table RSMDATASTATE, which has these values:
    DMEXIST - 2854 (the last compressed request)
    DMALL - 2854
    COMPR - 2854
    COMPR_DUAL - 2854
    In some forum posts it is suggested to reset the above values in the table, but how can I do that?
    Please provide any input.
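
    For reference, these values can at least be read safely in ABAP before deciding anything; directly updating an SAP system table like RSMDATASTATE is risky and is deliberately not shown here. A read-only sketch (the key field name INFOCUBE is an assumption; check the table in SE11):

      * Sketch: display the compression-related status fields of a cube.
      REPORT z_show_rsmdatastate.

      PARAMETERS p_cube TYPE c LENGTH 30. " e.g. the inventory cube

      DATA ls_state TYPE rsmdatastate.

      SELECT SINGLE * FROM rsmdatastate INTO ls_state
        WHERE infocube = p_cube.          " key field name: assumption

      IF sy-subrc = 0.
        WRITE: / 'DMEXIST   :', ls_state-dmexist,
               / 'DMALL     :', ls_state-dmall,
               / 'COMPR     :', ls_state-compr,
               / 'COMPR_DUAL:', ls_state-compr_dual.
      ELSE.
        WRITE: / 'No RSMDATASTATE entry for', p_cube.
      ENDIF.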

  • Compress Cube

    Hi experts:
    In order to improve the performance of the cube, I want to compress it,
    but I don't know how to do this. Can anyone give me some advice? I am using BI 7.0.
    Thanks in advance!

    Background:
    When you load data into a cube, it gets a request ID. If you find an error, you can simply delete the whole batch of data by request ID and reload it later.
    Once you are sure about the data quality, the request ID is useless and takes extra space.
    Compression deletes the request ID for you; of course, you can then no longer delete the data by request ID.
    Compression also improves performance; find more detail in the online help.
    Reference: http://help.sap.com/saphelp_nw2004s/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/frameset.htm
    Search before you ask.

  • Deleting data from a compressed cube

    Hi Gurus,
    I have an info cube which is getting data fron three different DSO's (DSO1, DSO2, DSO3)
    and I have Aggregates to the cube and this whole process is automated in a Process chain
    now I have to delete the data in the cube which is getting loaded from DSO2 only
    how to achieve this
    Thanks
    MSS:-)

    Hi,
    You can do a selective deletion, but is there a field that differentiates between DSO1, DSO2 and DSO3?
    If so, base the selective deletion on that field.
    For dynamic selective deletion, go to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/4001f4b1-189f-2e10-bcb3-87bb0fb428ba
    Another option is to create an InfoSet/MultiProvider based on DSO1 & DSO3 only.
    Best Regards
    Obaid
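
    Before actually running the selective deletion on the cube, it can help to count what the restriction would hit. A hedged sketch: it counts matching rows in the active table of the DSO that fed the cube (standard DSO active tables follow the /BIC/A<dso>00 convention). The DSO name ZDSO2, the field /BIC/ZDOCTYPE and its value 'D2' are all hypothetical placeholders for whatever differentiates the DSO2 loads in your model:

      * Sketch: preview the scope of a planned selective deletion by
      * counting matching rows in the source DSO's active table.
      REPORT z_preview_sel_deletion.

      DATA: lv_tab   TYPE c LENGTH 30 VALUE '/BIC/AZDSO200', " hypothetical ZDSO2 active table
            lt_where TYPE STANDARD TABLE OF string,
            lv_rows  TYPE i.

      * Hypothetical differentiating field and value.
      APPEND `/BIC/ZDOCTYPE = 'D2'` TO lt_where.

      SELECT COUNT(*) FROM (lv_tab) INTO lv_rows
        WHERE (lt_where).

      WRITE: / 'Rows matching the planned selective deletion:', lv_rows.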

  • Compression to inventory cube

    Hi
    I am initializing data to the inventory cube (with cumulative key figures) from
    DataSources 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM.
    Could anybody explain how to compress the cube once the data is loaded? As with a non-cumulative cube, do I need to use marker update and no marker update?
    If yes, for which DataSources do I need to use the marker update?
    Regards.
    Sarath

    Load the opening balance with 2LIS_03_BX. Compress it with marker update (mandatory).
    But your BI still does not know how this stock was arrived at; it needs the historical movements in case you need to know a stock value prior to initialization.
    So you do a statistical setup for 2LIS_03_BF and bring this data to BW with an INIT. These movements are already accounted for; they are reflected in the current stock (marker), which means a MARKER UPDATE is NOT required. Hence, when compressing the historical data from 2LIS_03_BF, you check the "NO MARKER UPDATE" box. If you updated the marker, it would ruin your current stock figure.
    After this you open your OLTP system to the users and they post new movements. You load these to BI using the BF DataSource as deltas. When compressing these, the marker should be updated to reflect the new movements; hence you uncheck the "NO MARKER UPDATE" flag. For subsequent loads, too, this checkbox remains unchecked.
    2LIS_03_UM does not deal with quantities, only valuations; compress it in the same way as 2LIS_03_BF.
    I hope this answers your question.
    Edited by: SDX BI on Oct 10, 2008 3:49 PM

  • Cube compression and request IDs

    Can we decompress a compressed cube using the request IDs?
    What happens to the request IDs when the cube gets compressed?
    rgds

    Hi Nitin,
    When you load data into the InfoCube, entire requests can be inserted at the same time.
    Each of these requests has its own request ID, which is included in the fact table in the packet dimension. This makes it possible to pay particular attention to individual requests. One advantage of the request ID concept is that you can subsequently delete complete requests from the InfoCube.
    However, the request ID concept can also cause the same data record (all characteristics agree, with the exception of the request ID) to appear more than once in the fact table. This unnecessarily increases the volume of data and reduces performance in reporting, as the system has to aggregate over the request ID every time you execute a query.
    Using compression, you can eliminate these disadvantages and bring data from different requests together into one single request (request ID 0).
    This function is critical, as the compressed data can no longer be deleted from the InfoCube using its request IDs.
    Hope it's clearer now (and don't forget to assign some points by clicking on the star for the contributors that helped you!!!)
    Bye,
    Roberto
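
    Roberto's point about request ID 0 can be made concrete with a few lines of ABAP: COLLECT sums the numeric fields of rows whose character-type fields already match, which is in effect what compression does to fact rows once the request ID is zeroed out. All field names below are illustrative, not the real fact-table layout (real fact tables hold dimension keys, not characteristics):

      * Sketch: two fact rows differing only in request ID collapse
      * into one row with request ID 0 and summed key figures.
      REPORT z_compression_illustration.

      TYPES: BEGIN OF ty_fact,
               material TYPE c LENGTH 18,           " characteristic
               plant    TYPE c LENGTH 4,            " characteristic
               request  TYPE i,                     " packet dimension (request ID)
               quantity TYPE p LENGTH 8 DECIMALS 3, " key figure
             END OF ty_fact.

      DATA: lt_f TYPE STANDARD TABLE OF ty_fact, " plays the F table
            lt_e TYPE STANDARD TABLE OF ty_fact, " plays the E table
            ls_f TYPE ty_fact.

      * Same characteristic combination, loaded by two different requests.
      ls_f-material = 'MAT1'. ls_f-plant = '1000'.
      ls_f-request = 4711. ls_f-quantity = '10.000'. APPEND ls_f TO lt_f.
      ls_f-request = 4712. ls_f-quantity = '5.000'.  APPEND ls_f TO lt_f.

      * "Compression": zero the request ID, then let COLLECT merge rows
      * that agree on all character-type fields, summing the key figures.
      LOOP AT lt_f INTO ls_f.
        ls_f-request = 0.
        COLLECT ls_f INTO lt_e.
      ENDLOOP.

      * lt_e now contains a single row: MAT1 / 1000 / 0 / 15.000.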
