What is zero elimination in InfoCube compression?

Can anybody please explain in detail what zero elimination in InfoCube compression is? When and why do we need to switch it on? I appreciate your help. Thank you.

Hi Rafi,
   If you want to avoid the InfoCube containing entries whose key figures are all zero (after reverse postings, for example), you can run zero elimination at the same time as the compression. In this case, the entries where all key figures are equal to 0 are deleted from the fact table.
Zero elimination is permitted only for InfoCubes in which all key figures have the aggregation behavior 'SUM'. In particular, you are not permitted to run zero elimination on non-cumulative values.
More info at:
http://help.sap.com/saphelp_nw04/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/content.htm
Hope it Helps
Srini
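
For illustration only, a minimal SQL sketch of what zero elimination means during compression. The table name /BIC/EZSALES and the key figures AMOUNT and QUANTITY are invented; the actual deletion is done internally by the compression job on the cube's fact tables.

-- Conceptually, zero elimination removes rows in which ALL key figures are zero:
DELETE FROM "/BIC/EZSALES"
 WHERE "AMOUNT"   = 0
   AND "QUANTITY" = 0;
-- A row with AMOUNT = 0 and QUANTITY = 10 is kept, because not every key figure is zero.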

Similar Messages

  • Compress infocube with zero elimination

    Experts, is it true that you should always check "with zero elimination" when you compress InfoCubes? I was reading the well-known SAP document "How to Handle Inventory Management Scenarios in BW", and the screenshots for cube compression do not have that checkbox checked. Does anybody know why?
    Thanks!

    Hello,
    With zero elimination, the entries where all key figures are equal to zero are deleted from the fact table during compression. You don't want that in Inventory Management, because inventory cubes work with non-cumulative key figures, and zero elimination is not permitted with non-cumulative values.
    Regards,
    Jorge Diogo

  • Compression with zero elimination (reverse postings)

    I executed a compression with zero elimination on a custom InfoCube in order to avoid the InfoCube containing entries whose key figures are all zero after compression (from reverse postings, for example).
    However, not all of the entries in which all the key figures are zero have been deleted.
    This cube contains the following standard key figures:
    -0DEB_CRE_DC;
    -0DEB_CRE_LC;
    -0DSCT_DAYS1;
    There are also two other custom key figures, created as copies of 0DEB_CRE_LC.
    Do you have any suggestions?

    So you're saying you have rows in your E fact table that were created as part of this particular compression run where all KFs = 0? That certainly doesn't sound right.
    Zero elimination is a multi-step process (sketched in SQL below):
    first, rows are excluded from the F fact table where all KFs are 0; then any summarized rows where all KFs net to 0 are excluded; and finally, rows in the E fact table that were updated and now have all KFs = 0 are deleted. Were any compressions run on this cube previously without zero elimination specified?
    What DB?
    There have been some problems with the Oracle 9.2 MERGE function.
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=613701&_NLANG=E
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=639253&_NLANG=E
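    For illustration, a conceptual SQL sketch of the three steps above. The fact table names /BIC/FMYCUBE and /BIC/EMYCUBE and the columns KEY_1, KEY_2, KF1 and KF2 are invented; the real compression job does this internally (on Oracle typically via a MERGE into the E fact table).
    -- Steps 1 and 2: move F fact table data to the E fact table, skipping rows that are
    -- already all zero and groups whose key figures net to zero after summarization.
    INSERT INTO "/BIC/EMYCUBE" (KEY_1, KEY_2, KF1, KF2)
    SELECT KEY_1, KEY_2, SUM(KF1), SUM(KF2)
      FROM "/BIC/FMYCUBE"
     WHERE NOT (KF1 = 0 AND KF2 = 0)                  -- step 1: drop all-zero source rows
     GROUP BY KEY_1, KEY_2
    HAVING NOT (SUM(KF1) = 0 AND SUM(KF2) = 0);       -- step 2: drop groups that net to zero
    -- Step 3: E fact table rows that were updated by the merge and ended up with all
    -- key figures = 0 are deleted afterwards (the DB trigger quoted further down this
    -- page collects exactly these candidate rows).
    DELETE FROM "/BIC/EMYCUBE"
     WHERE KF1 = 0 AND KF2 = 0;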

  • Compression problem with zero elimination

    Hi,
    In a BW 3.1 SP22 system, we compress an InfoCube with the flag "with zero elimination" activated.
    But records whose key figures are all zero, and which were just compressed, are not deleted.
    A few properties of this cube:
    •     only cumulative key figures
    •     huge volume (approx. 120 million records / 22 million new, non-deleted records with all key figures equal to zero)
    •     huge aggregates (the biggest: 20 million records)
    •     partitioning by month (12 partitions)
    •     previously, a few requests were compressed without the "zero elimination" flag, but the records described above as non-deleted obviously do not come from these old requests
    On the same system, on other cubes with less data, we tried to reproduce the case without success. In all scenarios we tested, rows with all key figures equal to zero are correctly deleted.
    We don't understand where the problem could come from.
    If someone has an idea...
    Thanks in advance.

    So you're saying you have rows in your E fact table that were created as part of this particular compression run where all KFs = 0? That certainly doesn't sound right.
    Zero elimination is a multi-step process:
    first, rows are excluded from the F fact table where all KFs are 0; then any summarized rows where all KFs net to 0 are excluded; and finally, rows in the E fact table that were updated and now have all KFs = 0 are deleted. Were any compressions run on this cube previously without zero elimination specified?
    What DB?
    There have been some problems with the Oracle 9.2 MERGE function.
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=613701&_NLANG=E
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=639253&_NLANG=E

  • InfoCube compression with zero elimination

    Hi all,
    I have a cube which has been compressed daily for almost a year, but without zero elimination.
    Now I have a requirement to compress with zero elimination. I am planning to do this after the next load into the cube. Once I compress with the zero elimination checkbox ticked, will all the data records be compressed with zero elimination? Will this cause any problem?
    What is the best way to do it?
    I cannot delete the contents of the cube.
    Expecting a reply ASAP.
    Regards,
    Adarsh

    I hope nothing will happen to the data values; they will remain the same. You are just removing the zero records. Even if the zero records stayed, they would still have to be aggregated and would show the same values in the report.
    So you can go ahead with the zero elimination mode.
    Regards,
    Vikram

  • Compress the Request with Zero Elimination

    I delta-load the transaction data into the InfoCube via an InfoPackage and compressed the request ID manually without selecting 'with Zero Elimination'. Now I want to compress the request ID with zero elimination again. Could you give me some hints on that? Many thanks.

    hi,
    There're two programs that work for me.
    1. RSCDS_MERGE_DUPL_REF_POINTS - this actually compresses the records.
    2. RSCDS_NULLELIM - this removes the records with 0 values, i.e. null elimination.
    However, SAP doesn't recommend using them if your compression is working fine.
    You may want to try them in your non-production system first.
    -RMP

  • Compression with zero-elimination

    Hi experts,
    "If you want to avoid the InfoCube containing entries whose key figures are zero values (in reverse posting for example) you can run a zero-elimination at the same time as the compression. In this case, the entries, where all the key figures are equal to 0, are deleted from the fact table"
    I am looking for clarification regarding the note above.
    I understand that compressing the InfoCube eliminates duplicate records by converting all request IDs to 0 and rolling up the key figures for records that have common values for all characteristics.
    Say I have 1 record with 3 KFs, for example, and 2 of those KFs have the value 0.
    Does compression with zero elimination only set the request ID to 0, keeping the value 0 for those 2 KFs as before compression, or will we have blank (deleted) values instead?
    Is zero elimination only possible when all KFs in the same record are equal to 0?
    Thanks.

    Hi Blaiso ,
    Zero elimination means deleting a record from the cube during compression if and only if all the key figures of that record are zero. If there are two key figures A and B, with A = 0 and B = 10, then this record will not be deleted from the cube.
    So the answer to your question is: if a particular record has 3 KFs and at least 1 of them is non-zero, the record will not be deleted, but the request ID will still be removed (set to 0) by the compression; see the small illustration after this reply.
    Please check the link below for complete information on compression:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Hope it helps .
    Thanks
    Kamal Mehta
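    A small illustration with invented table and key figure names (/BIC/EMYCUBE, KF1-KF3), to make the rule above concrete:
    -- Hypothetical record with three key figures, before and after compression
    -- WITH zero elimination:
    --   before:  REQUEST = 4711, KF1 = 0, KF2 = 0, KF3 = 7
    --   after :  REQUEST = 0,    KF1 = 0, KF2 = 0, KF3 = 7
    -- The row is kept because KF3 <> 0, and the two zero key figures remain stored as 0
    -- (they are not blanked out). Only rows with every key figure equal to zero are
    -- removed; the survivors are exactly:
    SELECT * FROM "/BIC/EMYCUBE"
     WHERE NOT (KF1 = 0 AND KF2 = 0 AND KF3 = 0);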

  • Cube compression WITH zero elimination option

    We have tested turning on the switch to perform "zero elimination" when a cube is compressed. We have tested this with an older cube that already had lots of data in the E table, and also with a new cube on its first compression. In both cases, at the Oracle level we still found records where all of the key figures = zero. To us, this option did not seem to work. What are we missing? We are on Oracle 9.2.0.7.0 and BW 3.5 SP 17.
    Thanks, Peggy

    Haven't looked at zero elimination in detail in the latest releases to see if there have been changes, but here's my understanding based on the last time I dug into it:
    When you run compression with zero elimination, the process first excludes any individual F fact table rows with all KFs = 0. Then, if any of the summarized F fact table rows has all KFs = 0, that row is also excluded and not written to the E fact table (you could have two facts with amounts that net to 0 in the same request, or in different requests where all other DIM IDs are equal). Finally, if an E fact table row is updated as a result of a new F fact table row being merged in, the process checks whether the updated row has all KF values = 0, and if so, deletes that updated row from the E fact table.
    I don't believe the compression process has ever gone through and read all existing E fact table rows and deleted the ones where all KFs = 0.
    Hope that made sense. We use Oracle, and it is possible that SAP has done some things differently on other DBs. It's also possible that the fiddling SAP has done over the last few years trying to use Oracle's MERGE functionality at different SP levels comes into play.
    Suggestions:
    I'm assuming that the E fact table holds a significant percentage of rows where all KFs = 0 (a quick count like the one sketched below can confirm this). If it doesn't, it's not worth pursuing.
    Contact SAP; perhaps they have a standalone program that deletes E fact table rows where all KFs = 0. It could be a nice tool to have.
    If they don't have one, consider writing your own program that deletes the rows in question. You'll need to keep downstream impacts in mind, e.g. aggregates (which would need to be refilled - probably not a big deal) and InfoProviders that receive data from this cube.
    Another option would be to clone the cube and datamart the data to the new cube. Once the data is in the new cube, compress with zero elimination - this should get rid of all your all-zero KF rows. Then delete the contents of the original cube and datamart the cloned cube's data back into the original cube.
    You might be able to accomplish the same thing by datamarting the original cube's data to itself, which might save some hoop jumping. You would then have to run a selective deletion to get rid of the original data, or, if the datamarted data went through the PSA, you could just delete all the original data from the cube and then load the datamarted data from the PSA. Once the new request is loaded, compress with zero elimination.
    And if you happen to have built all your reporting on this cube against a MultiProvider on top of the cube rather than directly against the cube itself, you could just create a new cube, export the data to it, and then swap the old and new cubes in the MultiProvider. This is one of the benefits of always using a MultiProvider on top of a cube for reporting (an SAP- and consultant-recommended practice): you can literally swap underlying cubes with no impact on the user base.
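    A hedged sketch of the quick check mentioned above; the table name /BIC/EMYCUBE and the key figure columns KF1 and KF2 are placeholders for the real E fact table and its key figures.
    -- What share of the E fact table rows has all key figures = 0?
    SELECT COUNT(*)                                             AS total_rows,
           SUM(CASE WHEN KF1 = 0 AND KF2 = 0 THEN 1 ELSE 0 END) AS all_zero_rows
      FROM "/BIC/EMYCUBE";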

  • Zero elimination in the export DataSource

    Hi gurus!
    I have the following problem:
    1) InfoCube 1 has an export DataSource, and this provider receives data from a flat file;
    2) this flat file contains 19 records; one of them has all key figures with zero values;
    3) when I load this flat file, InfoCube 1 receives all records (a total of 19);
    4) InfoCube 2 receives data from InfoCube 1 (via the export DataSource);
    5) when I load data from InfoCube 1 into InfoCube 2, I lose the record with all KFs equal to zero; I receive only 18 records.
    Does anybody know what is happening?
    Note: this behaviour is like the zero elimination during compression.
    Thanks in advance,
    Silvio Messias

    Hi,
    Check whether your record exists in the source InfoCube.
    If yes (and you don't filter your extraction), then the extractor is not considering records that have all key figures = 0.
    If not, then check whether you have routines with RETURNCODE <> 0 in your transfer rules or update rules...
    Strange...
    Olivier.

  • Performance Impact with InfoCube Compression

    Hi,
    Is there any delivered content which gives a comparative analysis of performance before and after InfoCube compression? If not, what is the best way to get such statistics?
    Thank you,
    sam

    The BW Technical Content cubes/queries can tell you whether a query is performing better at different points in time. I always like to compare a volume of queries before and after, rather than look at a single execution. As mentioned, ST03 can provide information, as can RSRT.
    Three major components of compression can aid performance:
    Compression
    The compression itself - how many rows you end up with in the E fact table compared to what you had in the F fact table. This all depends on the data - some cubes compress quite a bit, others not at all. For example:
    Let's say you have a cube with a time grain of calendar month. You load transactions to it daily. A particular combination of characteristic values occurs on a transaction every day, so after a month you have 30 transactions spread across 30 requests in the F fact table. Now you run compression - these 30 rows compress to just 1 row. You have now reduced the volume of data in your cube to roughly 3% of what it used to be, and queries should run much faster. In real life you probably won't see a 30:1 reduction, but 2:1 or 3:1 is reasonable. It all depends on your data and your model.
    Zero elimination
    Some R/3 applications generate transactions where all the KFs are 0, or generate transactions that offset each other and net to 0. Specifying zero elimination during compression gets rid of those records.
    Partitioning
    The E fact table can be partitioned on 0FISCPER or 0CALMONTH. If you have queries that restrict on those characteristics, the DB can narrow in on just the partitions that hold the relevant data (usually referred to as partition pruning). If a query only goes after 1 month of data from a cube that holds 5 years of data, this can be a big benefit. (A small SQL sketch of the first and third points follows below.)
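    A hedged SQL sketch of the compression-ratio and partition-pruning points above; the table and column names (/BIC/FMYCUBE, /BIC/EMYCUBE, SID_0CALMONTH, KF1) are invented, and the E fact table is assumed to be range-partitioned on SID_0CALMONTH.
    -- Rough feel for the compression ratio: compare F and E fact table row counts.
    SELECT (SELECT COUNT(*) FROM "/BIC/FMYCUBE") AS f_rows,
           (SELECT COUNT(*) FROM "/BIC/EMYCUBE") AS e_rows
      FROM DUAL;
    -- Partition pruning: restricting on the partitioning characteristic lets the DB
    -- scan only the partition(s) for that month instead of the whole E fact table.
    SELECT SUM(KF1)
      FROM "/BIC/EMYCUBE"
     WHERE SID_0CALMONTH = 200612;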

  • Turning off zero elimination

    Hi,
    A quick and easy question: can I turn off zero elimination if I have compressed many requests with zero elimination on? And what are the impacts?
    thanks

    You've answered your own question, I think. From a technical standpoint there is no problem in turning it off. The reason you cite - the business need to see the '0' records - is exactly why you would NOT use zero elimination.
    The degree of space saving and performance varies from application to application and probably, to a degree, from customer to customer. Some applications, by their nature, frequently use offsetting transactions that zero out or generate zero-value transactions, and other applications seldom do. I have some applications where zero elimination reduced the E fact table by 50% and others where it had negligible impact.
    I wouldn't worry about turning it off at all if your business requirements dictate that the 0 records be there.
    Pizzaman

  • Zero-Elimination And Exception Aggregation

    Hi experts,
    I'd like to compress a cube with the zero elimination option. The online help says: "Zero-elimination is only allowed for InfoCubes that contain key figures with aggregation behavior 'SUM' exclusively." (http://help.sap.com/saphelp_nw70/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/frameset.htm)
    The key figures of the cube I'd like to compress all have aggregation = SUM, but some of the key figures have exception aggregation <> SUM. Will the data of the cube still be fine after that compression?
    Thank you for your ideas!
    Volker

    Dear Andrei Bushkevich and Swapnil Dharia,
    thank you for your response! In my question there was a double ** which originally stood for the "not equal" sign.
    So the sentence is:
    "The key figures of the cube I'd like to compress all have aggregation = SUM, but some of the key figures have exception aggregation NOT EQUAL to SUM."
    @Andrei Bushkevich
    Now that I've corrected the text - are you still sure that I can do the compression with zero elimination?
    Best Regards,
    Volker

  • Infocube compression error CX_SQL_EXCEPTION

    Hi,
    We are encountering an InfoCube compression error (CX_SQL_EXCEPTION: parameter is missing).
    We have applied two notes, 1028847 and 973969, but they did not fix the error. In the system log we have the following error:
    ORA-00054: resource busy and acquire with NOWAIT specified.
    Every time the compression fails, we repeat it and it completes successfully on the second attempt.
    Does anyone know what we can do to fix this?
    Thanks!
    JXA

    Hello Girija,
    Please check OSS note 973969.
    Regards,
    Praveen

  • Zero elimination failing

    Hi,
    We are compressing a cube with zero elimination.  All KFs are cumulatives with aggregation behavior of SUM.
    For some reason certain rows for which all KFs are zero are not being eliminated from the E fact table.  I have looked back over past requests and do not see any compression failures.
    Has anyone ever encountered this issue?  Any theories on why the zero elimination isn't working?  Better yet, any recommendations for how these records can be eliminated from the E table now that the data have already been compressed?
    Thanks,
    Bob

    All,
    It appears that the SID_0CALMONTH issue was a red herring. I checked the job log for a recent compression, and SID_0CALMONTH is not being taken into account when flagging records for zero elimination. Here's a snippet of the PL/SQL:
    CREATE OR REPLACE TRIGGER "/BI0/0500002603"
    AFTER UPDATE ON "/BIC/EBOF_C30"
    FOR EACH ROW
    WHEN (    NEW."CML_OR_QTY"     = 0
          AND NEW."DLV_QTY"        = 0
          AND NEW."GR_QTY"         = 0
          AND NEW."NET_PRICE"      = 0
          AND NEW."NET_VALUE"      = 0
          AND NEW."PCONF_QTY"      = 0
          AND NEW."/BIC/BACTSHIP"  = 0
          AND NEW."/BIC/BCOUNTER"  = 0
          AND NEW."/BIC/BDISCPQTY" = 0
          AND NEW."/BIC/BMOQ"      = 0
          AND NEW."/BIC/BMRA_QTY"  = 0
          AND NEW."/BIC/BSHIP_QTY" = 0
          AND NEW."/BIC/BTGTSHIP"  = 0
          AND NEW."/BIC/BTKILLQTY" = 0
          AND NEW."/BIC/BTORDQTY"  = 0 )
    BEGIN
      INSERT INTO "/BI0/0100000097" VALUES ( :NEW.ROWID, :NEW."KEY_BOF_C30P" );
    END;
    Also, I looked back at the notes mentioned previously, and it is possible that we were impacted in the past by the MERGE statement issue (Notes 613701 and 639253), although we are now at an SP level where this should no longer be an issue.
    Finally, the only recent examples of this problem I can find have been traced to cases where compression was run manually without zero elimination being specified, so those are operator error.
    Bottom line: while we may have suffered in the past from an issue SAP has since corrected, there is no evidence of any previously undetected problem with zero elimination.
    Bob

  • Does the InfoCube compression process lock the InfoCube?

    Hi all,
    First of all, thanks for your active support and co-operation.
    Does the compression process lock the cube? My doubt is: while the compression process is running on a cube, if I try to load data into the same cube, will it be allowed or not? Please reply as soon as you can.
    Many thanks in advance.
    Jagadeesh.

    hi,
    Compression is a process that deletes the request IDs, and this saves space.
    When and why do we use InfoCube compression in real life?
    InfoCube compression eliminates duplicate records by rolling the requests together into the E fact table. Compressed InfoCubes require less storage space and are faster for retrieving information. The catch is that once you compress, you can no longer delete data from the InfoCube by request. You are safe as long as you don't have any error in your modeling.
    Compression can be done through a process chain and also manually.
    Check these Links:
    http://www.sap-img.com/business/infocube-compression.htm
    compression is done to increase the performance of the cube...
    http://help.sap.com/saphelp_nw2004s/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/frameset.htm
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/b2/e91c3b85e6e939e10000000a11402f/frameset.htm
    InfoCube compression and aggregate compression are mostly independent.
    Usually, if you decide to keep the requests in the InfoCube, you can still compress the aggregates. If you need to delete a request, you just have to rebuild an aggregate if it is compressed. Therefore there is no problem in compressing aggregates, unless rebuilding the aggregates takes a lot of time.
    It does not make sense to compress the InfoCube without compressing the aggregates. The idea behind compressing is to speed up InfoCube access by adding up all the data of the different requests. As a result you get rid of the request number; all other attributes stay the same. If you then have more than one record per set of characteristics, the key figures are aggregated according to their aggregation behavior (SUM, MIN, MAX, etc.). This reduces the number of records in the cube (conceptually a GROUP BY over everything except the request, as sketched after the example below).
    Example:
    Request ID   Date       0MATERIAL   0AMOUNT
    12345        20061201   3333            125
    12346        20061201   3333           -125
    12346        20061201   3333            200
    will result in:
    Request ID   Date       0MATERIAL   0AMOUNT
    0            20061201   3333            200
    In this case 2 records are saved.
    But once the request ID is lost (due to compression), you cannot get it back.
    Therefore, once you have compressed the InfoCube, there is no point in keeping the aggregates uncompressed. But as long as your InfoCube is uncompressed, you can always compress the aggregates without any problem other than the rebuild time of the aggregates.
    Hope it helps.
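    A conceptual SQL sketch of what the compression does to the example above; the table and column names (/BIC/FMYCUBE, CALDAY, MATERIAL, AMOUNT) are invented to mirror the example.
    -- Aggregate over all characteristics except the request ID; the key figures are
    -- summed per remaining key combination.
    SELECT CALDAY, MATERIAL, SUM(AMOUNT) AS AMOUNT
      FROM "/BIC/FMYCUBE"
     GROUP BY CALDAY, MATERIAL;
    -- For the three example rows (125, -125 and 200 on the same date and material),
    -- this returns a single row with AMOUNT = 200, and the request ID is gone.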
