InfoCube compression with zero elimination

Hi all,
I have a cube which has been compressed daily for almost a year, but without zero elimination.
Now I have a requirement to compress with zero elimination. I am planning to do this after the next load into the cube. So once I compress with the zero elimination checkbox ticked, will all the data records be compressed with zero elimination? Will this cause any problems?
What is the best way to do it?
I cannot delete the contents of the cube.
Expecting a reply ASAP.
Regards,
Adarsh

I hope nothing will happen to the data values; they will remain the same. You are only removing records whose key figures are all zero, and even if those zero records stayed, they would still aggregate to the same values in the report.
So you can go ahead with the zero elimination mode.
Regards,
Vikram

Similar Messages

  • Compression with zero elimination (reverse postings)

    I executed a compression with zero elimination on a custom InfoCube in order to avoid entries that contain only zero values in the key figures (for example, reverse postings) remaining in the InfoCube after compression.
    However, not all the entries where all the key figures are zero have been deleted.
    This cube contains the following standard key figures:
    -0DEB_CRE_DC;
    -0DEB_CRE_LC;
    -0DSCT_DAYS1;
    There are also two other custom key figures, created as copies of 0DEB_CRE_LC.
    Do you have any suggestions?

    So you're saying you have rows in your E fact table that were created as part of this particular compression run where all KFs = 0? That certainly doesn't sound right.
    The zero elimination is a multi-step process -
    first, rows where all KFs are 0 are excluded from the F fact table; then any summarized rows where all KFs = 0 are excluded; and finally, rows in the E fact table that were updated and ended up with all KFs = 0 are deleted. Were any compressions run on this cube previously without zero elimination specified? (A SQL sketch for checking the E fact table at the database level follows at the end of this post.)
    What DB?
    There have been some problems with Oracle 9.2 merge function.
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=613701&_NLANG=E
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=639253&_NLANG=E
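    If it helps to verify this at the database level, a query along the following lines shows whether rows with all key figures = 0 are still present in the compressed (E) fact table. This is only a sketch: "/BIC/EZXX" stands for the E fact table of a hypothetical custom cube ZXX, and the key figure column names are placeholders to be replaced with those of your own cube.

        -- Sketch only: substitute your cube's E fact table and key figure columns
        SELECT COUNT(*) AS all_zero_rows
          FROM "/BIC/EZXX"
         WHERE DEB_CRE_DC = 0
           AND DEB_CRE_LC = 0
           AND DSCT_DAYS1 = 0;   -- add one condition per key figure in the cube
        -- A count greater than zero means all-zero rows survived this compression run.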

  • Compression with zero-elimination

    Hi experts,
    "If you want to avoid the InfoCube containing entries whose key figures are zero values (in reverse posting for example) you can run a zero-elimination at the same time as the compression. In this case, the entries, where all the key figures are equal to 0, are deleted from the fact table"
    I am looking for clarification regarding the note above.
    I understand that compressing the InfoCube eliminates duplicate records by converting all request IDs to 0 and rolling up the key figures for records that have common values for all characteristics.
    Say I have one record with 3 KFs, and 2 of those KFs have the value 0.
    Does compression with zero elimination only set the request ID to 0, keeping the value 0 for those 2 KFs as before compression, or will the record be deleted instead?
    Is zero elimination only applied when all KFs in the same record are equal to 0?
    Thanks.

    Hi Blaiso,
    Zero elimination means deleting a record from the cube during compression if and only if all the key figures of that record are zero. If there are two key figures A and B, with A = 0 and B = 10, then this record will not be deleted from the cube.
    So the answer to your question: if a particular record has 3 KFs and even one of them is non-zero, the record will not be deleted; only its request ID is set to 0 (see the conceptual sketch below).
    Please check the below link for complete information on compression :
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Hope it helps.
    Thanks
    Kamal Mehta
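    Just to make the rule concrete: conceptually, zero elimination only removes rows matching a predicate like the one below, where every key figure is zero at the same time. A row with A = 0 and B = 10 does not match the predicate and is kept; compression merely sets its request ID to 0. This is not the actual statement SAP generates; the table and column names are illustrative only.

        -- Conceptual sketch, not SAP's generated SQL; placeholder names
        DELETE FROM "/BIC/EZXX"
         WHERE KF_A = 0
           AND KF_B = 0
           AND KF_C = 0;   -- one condition per key figure; any non-zero KF keeps the row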

  • Cube compression WITH zero elimination option

    We have tested turning on the switch to perform "zero elimination" when a cube is compressed. We tested this with an older cube that already has lots of data in the E table, and also with a new cube on its first compression. In both cases, at the Oracle level we still found records where all of the key figures = zero. To us, this option did not seem to work. What are we missing? We are on Oracle 9.2.0.7.0 and BW 3.5 SP 17.
    Thanks, Peggy

    Haven't looked at zero elimination in detail in the latest releases to see if there have been changes, but here's my understanding based on the last time I dug into it -
    When you run compression with zero elimination, the process first excludes any individual F fact table rows with all KFs = 0. Then, if any of the summarized F fact table rows has all KFs = 0, that row is excluded and not written to the E fact table (you could have two facts with amounts that net to 0, in the same request or in different requests, where all other DIM IDs are equal). Finally, if an E fact table row is updated as a result of a new F fact table row being merged in, the process checks whether the updated row has all KF values = 0, and if so, deletes that updated row from the E fact table.
    I don't believe the compression process has ever gone through and read all existing E fact table rows and deleted the ones where all KFs = 0.
    Hope that made sense. We use Oracle, and it is possible that SAP has done some things differently on different DBs. It's also possible that the fiddling SAP has done over the last few years trying to use Oracle's MERGE functionality at different SP levels comes into play.
    Suggestions -
    I'm assuming that the E fact table holds a significant percentage of rows where all KFs = 0. If it doesn't, it's not worth pursuing (a quick check is sketched below).
    Contact SAP; perhaps they have a standalone program that deletes E fact table rows where all KFs = 0. It could be a nice tool to have.
    If they don't have one, consider writing your own program that deletes the rows in question. You'll need to keep downstream impacts in mind, e.g. aggregates (they would need to be refilled - probably not a big deal) and InfoProviders that receive data from this cube.
    Another option would be to clone the cube and datamart the data to the new cube. Once in the new cube, compress with zero elimination - this should get rid of all your 0 KF rows. Then delete the contents of the original cube and datamart the cloned cube's data back to the original cube.
    You might be able to accomplish the same thing by datamarting the original cube's data to itself, which might save some hoop jumping. You would then have to run a selective deletion to get rid of the original data, or, if the datamarted data went through the PSA, you could simply delete all the original data from the cube and then load the datamarted data from the PSA. Once the new request is loaded, compress with zero elimination.
    And if you happen to have built all your reporting on this cube from a MultiProvider on top of the cube rather than directly from the cube itself, you could just create a new cube, export the data to it, and then swap the old and new cubes in the MultiProvider. This is one of the benefits of always using a MultiProvider on top of a cube for reporting (an SAP and consultant recommended practice) - you can literally swap underlying cubes with no impact on the user base.
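    Before pursuing any of the options above, it may be worth quantifying how big the problem actually is. A rough check at the database level could look like this; the table and key figure column names are placeholders for your own cube, so treat it as a sketch only.

        -- Sketch only: what share of the E fact table consists of all-zero rows?
        SELECT SUM(CASE WHEN KF_A = 0 AND KF_B = 0 AND KF_C = 0 THEN 1 ELSE 0 END) AS zero_rows,
               COUNT(*) AS total_rows
          FROM "/BIC/EZXX";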

  • Compress infocube with zero elimination

    Experts, is it true that you should always check "with zero elimination" when you compress InfoCubes? I was reading the famous SAP doc "How to Handle Inventory Management Scenarios in BW", and the screenshots for cube compression do not have that checkbox checked. Does anybody know why?
    Thanks!

    Hello,
    With zero elimination, the entries where all key figures are equal to zero are deleted from the fact table during compression. You don't want that in Inventory Management; inventory cubes use non-cumulative key figures, and zero elimination is not permitted with non-cumulative values anyway.
    Regards,
    Jorge Diogo

  • What is a Zero elimination in infocube compression?

    Can anybody please explain to me in detail what zero elimination is in InfoCube compression? When and why do we need to switch it on? I appreciate your help. Thank you.

    Hi Rafi,
       If you want to avoid the InfoCube containing entries whose key figures are zero values (in reverse posting for example) you can run a zero-elimination at the same time as the compression. In this case, the entries where all key figures are equal to 0 are deleted from the fact table.
    Zero-elimination is permitted only for InfoCubes, where key figures with the aggregation behavior ‘SUM’ appear exclusively. In particular, you are not permitted to run zero-elimination with non-cumulative values.
    More info at:
    http://help.sap.com/saphelp_nw04/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/content.htm
    Hope it Helps
    Srini

  • Compress the Request with Zero Elimination

    I load delta transaction data into the InfoCube via an InfoPackage and have been compressing the request IDs manually without selecting 'with zero elimination'. Now I want to compress those request IDs with zero elimination after the fact. Could you give me some hints on that? Many thanks.

    hi,
    There are two programs that work for me:
    1. RSCDS_MERGE_DUPL_REF_POINTS - this actually compresses the records.
    2. RSCDS_NULLELIM - this removes the records where all key figures are 0, i.e. the zero (null) elimination.
    That said, SAP doesn't recommend using them if your compression is working fine.
    You may want to try them in your non-production system first.
    -RMP

  • Compression problem with zero elimination

    Hi,
    On BW 3.1 SP22, we compress an InfoCube with the "with zero elimination" flag activated.
    But records whose key figures are all zero, and which have just been compressed, are not deleted.
    A few properties of this cube:
    •     only cumulative KFs
    •     huge volume (approx. 120 million records, of which about 22 million are new, non-deleted records with all KFs zero)
    •     huge aggregates (biggest: 20 million)
    •     partitioning by month (12)
    •     previously a few requests were compressed without the "zero elimination" flag, but the records described above as non-deleted obviously do not come from these old requests
    On other cubes with less data in the same system, we tried to reproduce the case without success. In all the scenarios we tested, rows with all KFs zero are correctly deleted.
    We don't understand where the problem could come from.
    If someone has an idea ...
    Thanks in advance.

    So you're saying you have rows in your E fact table that were created as part of this particular compression run where all KFs = 0? That certainly doesn't sound right.
    The zero elimination is a multi-step process -
    first, rows where all KFs are 0 are excluded from the F fact table; then any summarized rows where all KFs = 0 are excluded; and finally, rows in the E fact table that were updated and ended up with all KFs = 0 are deleted. Were any compressions run on this cube previously without zero elimination specified?
    What DB?
    There have been some problems with Oracle 9.2 merge function.
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=613701&_NLANG=E
    https://websmp201.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=639253&_NLANG=E

  • Performance Impact with InfoCube Compression

    Hi,
    Is there any delivered content which gives a comparative analysis of performance before and after InfoCube compression? If not, what is the best way to get such statistics?
    Thank you,
    sam

    The BW Technical Content cubes/queries can tell you whether a query is performing better at different points in time. I always like to compare a volume of queries before and after, rather than look at a single execution. As mentioned, ST03 can provide info, as can RSRT.
    Three major components of compression can aid performance:
    Compression
    The compression itself - how many rows you end up with in the E fact table compared to what you had in the F fact table. This all depends on the data - some cubes compress quite a bit, others not at all. For example:
    Let's say you have a cube with a time grain of calendar month and you load transactions to it daily. A particular combination of characteristic values occurs every day, so after a month you have 30 transactions spread across 30 requests in the F fact table. Now you run compression - these 30 rows compress to just 1 row. You have now reduced the volume of data in your cube to about 3% of what it used to be, and queries should run much faster. In real life I doubt you would see a 30:1 reduction, but perhaps 2:1 or 3:1 is reasonable. It all depends on your data and your model. (A small SQL illustration of this step follows at the end of this reply.)
    Zero Elimination
    Some R/3 applications generate transactions where all the KFs are 0, or generate transactions that offset each other, netting to 0. Specifying zero elimination during compression gets rid of those records.
    Partitioning
    The E fact table can be partitioned on 0FISCPER or 0CALMONTH. If you have queries that restrict on those characteristics, the DB can narrow in on just the partitions that hold the relevant data (usually referred to as partition pruning). If a query only goes after 1 month of data from a cube that has 5 years of data, this can be a big benefit.
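    Purely to illustrate the compression step described above (this is not the statement SAP generates): logically, compression groups the F fact table rows by every dimension key except the package (request) dimension and sums the key figures, which is why 30 daily requests for the same characteristic combination collapse into a single E fact table row. All names below are placeholders for a hypothetical cube ZXX.

        -- Sketch only: what the E fact table content would look like after compression
        SELECT KEY_ZXXT, KEY_ZXXU, KEY_ZXX1,          -- all dimension keys except the package dimension
               SUM(AMOUNT)   AS AMOUNT,
               SUM(QUANTITY) AS QUANTITY
          FROM "/BIC/FZXX"
         GROUP BY KEY_ZXXT, KEY_ZXXU, KEY_ZXX1;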

  • Non cumulative infocube compression

    Hi All,
    Presently we have a problem with non-cumulative InfoCube compression.
    The situation: the non-cumulative InfoCube contains 2 years of data, we are extracting data from two DataSources (2LIS_03_BX and 2LIS_03_BF), and it has not been compressed for those 2 years.
    Now we are going to do the compression as follows:
    2LIS_03_BX init load: compress with marker update (checkbox not selected).
    2LIS_03_BF init load: compress without marker update (checkbox selected).
    2LIS_03_BF delta loads: compress with marker update (checkbox not selected).
    2LIS_03_BX delta upload: compress without marker update (checkbox selected).
    My doubt: in between the delta loads there are some full uploads from the 2LIS_03_BF DataSource - how should I compress these full-upload requests from 2LIS_03_BF?
    Please help me, it is quite urgent.
    Thanks,
    Niranjan.

    Hi Nirajan,
    First of all, as I understand it, 2LIS_03_BX is the initial upload of stocks, so there is no need for a delta load from this DataSource; it collects data from the MARC and MARD tables when running the stock setup in R/3, and you have to load it just once.
    If you are loading full updates between the delta loads of 2LIS_03_BF, you are duplicating material movement data. The idea of compression with marker update is that these movements affect the stock value in the query; that is why you compress the delta init without marker update - those movements are already contained in the opening stock loaded with 2LIS_03_BX, so you don't want them to affect the stock calculation.
    You can refer to the "How to Handle Inventory Management Scenarios in BW" paper for more detail on the topic.
    I hope this helps,
    Regards,
    Carlos.

  • Infocube Compression

    Hi All,
    Presently we have a problem with non-cumulative InfoCube compression.
    The situation: the non-cumulative InfoCube contains 2 years of data, we are extracting data from two DataSources (2LIS_03_BX and 2LIS_03_BF), and it has not been compressed for those 2 years.
    Now we are going to do the compression as follows:
    2LIS_03_BX init load: compress with marker update (checkbox not selected).
    2LIS_03_BF init load: compress without marker update (checkbox selected).
    2LIS_03_BF delta loads: compress with marker update (checkbox not selected).
    2LIS_03_BX delta upload: compress without marker update (checkbox selected).
    My doubt: in between the delta loads there are some full uploads from the 2LIS_03_BF DataSource - how should I compress these full-upload requests from 2LIS_03_BF?
    Please help me, it is quite urgent.
    Thanks,
    Niranjan.

    pamireddy,
    The best way to see if it is okay is to check the aggregates. Especially since you are using non-cumulative KFs, check that rollup to the aggregates is happening fine - the aggregates are compressed automatically; the only thing is that there is no delta rollup for aggregates with non-cumulative KFs.
    If there are no aggregates, create one basis aggregate, roll it up and test - if it goes through fine, the query hits the aggregate and behaves correctly, then you can go ahead with compression of the base data.
    Arun
    Assign points if useful

  • Infocube compression error CX_SQL_EXCEPTION

    Hi,
    We are encountering an InfoCube compression error (CX_SQL_EXCEPTION: parameter is missing).
    We have applied two notes, 1028847 and 973969, but they did not fix the error. In the system log we have the following error:
    ORA-00054: resource busy and acquire with NOWAIT specified.
    Every time the compression failed, we repeated it and it completed successfully on the second attempt.
    Does anyone know what we can do to fix this?
    Thanks!
    JXA

    Hello Girija,
    Please check OSS note 973969.
    Regards,
    Praveen

  • DNG with CS5 is always compressed with loss !

    This is not up for discussion, but only an FYI to everyone out there using DNG in Photoshop's raw interface:
    A DNG written by CS5's Camera Raw 6.x interface is compressed "with loss"; it's not lossless at all!
    To the Adobe team:
    Why don't you let us pros know about the fact that your Camera Raw 6.x interface will "compress" an original DNG file
    if a preview JPEG is updated (a small option in the top-right drop-down menu) within this Raw 6.x interface?
    This does huge damage to a previously perfect and intact (uncompressed) DNG file, because your subsequent compression and re-saving of the DNG is
    NOT lossless - on top of the fact that you do NOT inform us of a "compression" step here, and that we are NOT given an option in your Raw 6.x interface for this NOT to happen, as in: "do not compress" when updating the preview JPEG.
    I have now destroyed 1500 DNG images from a commercial client shoot by updating all the DNG preview JPEGs. All the DNG files
    now show noise and other artifacts, especially in the always-critical blue channel. This is a shoot where the budget was
    over $50,000!
    I also cannot open the DNG files in their proprietary software (Sinar CaptureShop) anymore, since your raw interface compressed all the DNG files - a step that was not asked for and that is NOT needed... EVER!
    Please everyone, spread this around!

    Ad Agency wrote:
    ...that when updating a preview JPEG of a DNG in an open CS5 Raw interface window, CS5 will compress your underlying raw data... something you seem not to be able to read... so I gather your English - which is, I am sure, the only language you speak - is definitely a language you need to learn how to read at this point, don't you think?
    Wrong...updating a DNG preview will have exactly ZERO impact on the raw data of your file. You are quite wrong if you think this...seriously...do the research (which you claim to be able to do).
    Ad Agency wrote:
    ...if you had surfed the internet you would have seen my work... and that's very serious stuff, serious GLOBAL stuff I do, compared to your work, which seems to focus on books for amateurs... LOL
    So I guess you don't really shoot for clients out there, right?
    I saw your web site (it kinda sucks on a mobile iPad BTW; might wanna look into HTML5 instead of Flash). I'm very familiar with the high anxiety and drama/trauma of fashion photographers (actually, some of them are friends). You might wanna look at my own website www.schewephoto.com. Been there, done that, have the tee shirt, ya know?
    Ad Agency wrote:
    ...so do yourself a favor here and stay out of this forum... you clearly don't belong here!
    And yes, the postings are coming from 2 places: my ad agency, and my shop as a fashion photographer, which I have run for a good 20 years between Paris, London and New York.
    And yes, imagine this: I actually do research, and do the globally very respected photography stuff only for fun, on the side of my real job: research!
    Well, am I talking to a photographer or an ad agency? In my experience, ad agencies really DON'T have a clue about digital workflow... and NYC shooters tend to rely on digital techs. Maybe you are different (so far, the way you are behaving here indicates it's par for the course).
    Ad Agency wrote:
    ...and let's see, what do you do again? Books... for wedding photographers?
    Stay out of here...
    Let the forum people here discover something very serious, without your tangential distractions about Sinar's financials, or my 12 language skills and my PhD in physics.
    Grow up!
    I'm pretty grown up - 25+ years of award-winning ad work. No, I don't have a PhD (personally, I've never met anybody with a PhD who could shoot their way out of a paper bag, but that's just my experience), but I do have a couple of degrees from RIT (highest honors, BTW, if that matters).
    You wanna do dick measuring? I'm your guy, bud... you chose the wrong guy to get down into the mud with.
    Ad Agency wrote:
    i will not respond to anyone anymore here ....period
    not even to the guys who actually have serious questions to ask...
    imagine this....because of people like you
    ...again!
    Well, too bad... because something somewhere is whacked. If you think that by updating the DNG preview you are changing your raw image data, then it's all on you, bud. The conversion from raw to DNG (except for a few formats that require linear DNG conversion) is completely lossless. And why do you care about updating the DNG preview anyway? The only need to update the DNG preview is if you are using a 3rd party application that can't read the raw data (like some 3rd party database management apps).
    Seriously dooode, you need to fix your attitude. If you come down off your high horse, you might be able to learn something. Otherwise I suspect you need to start hiring some digital techs that know a bit more about digital than you do (there are a lot of good ones in NYC; I would be happy to give you some names).
    :~)
    P.S. For the rest of the people reading this thread, let me apologize in advance. The OP is a high-strung fashion doode who is obviously flailing about and having some "issues" which are causing a degree of angst that is making the discussion a bit over the top.
    P.P.S. You are right...I'm getting rid of my P-65+ back and getting an IQ180 back shortly...60 MP ain't enough, I want 80 MP...

  • Drawbacks of Infocube compression

    Hi Experts,
    Are there any drawbacks to InfoCube compression?
    Thanks
    DV

    Hi DV
    During the upload of data, a full request is always inserted into the F fact table. Each request gets its own request ID and partition (DB-dependent), which is contained in the package dimension. This feature enables you, for example, to delete a request from the F fact table after the upload. However, it may result in several entries in the fact table with the same values for all characteristics except the request ID. This increases the size of the fact table and the number of partitions (DB-dependent) unnecessarily, and consequently decreases the performance of your queries. During compression, these records are summarized to one entry with the request ID '0'. (A rough sizing query for estimating this effect is sketched below.)
    Once the data has been compressed, some functions are no longer available for this data (for example, it is not possible to delete the data for a specific request ID).
    (The passage above is quoted from the SAP document "Best Practice: Periodic Jobs and Tasks in SAP BW".)
    Transactional InfoCubes in a BPS environment: you should compress your InfoCubes regularly, especially the transactional InfoCubes.
    Queries that hit aggregates are also affected, because every time you finish compressing, the aggregates are rebuilt as well.
    With non-cumulative InfoCubes, compression has an additional effect on query performance, because the marker for non-cumulatives is also updated. This means that, on the whole, less data is read for a non-cumulative query, and the response time is therefore reduced.
    "If you are using an Oracle database as your BW database, you can also carry out a report using the relevant InfoCube in reporting while the compression is running. With other manufacturers' databases, you will see a warning if you try to execute a query on an InfoCube while the compression is running. In this case you can execute the query once the compression has finished executing."
    Hope this may help you
    GTR
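    To get a rough feel for how much a given F fact table would shrink through compression, one can compare the total row count with the number of distinct dimension-key combinations when the package (request) dimension is left out. The names below are placeholders for a hypothetical cube ZXX; adapt them to your own cube before running anything.

        -- Sketch only: estimate the compression potential of the F fact table
        SELECT COUNT(*) AS rows_before_compression,
               COUNT(DISTINCT KEY_ZXXT || '/' || KEY_ZXXU || '/' || KEY_ZXX1) AS rows_after_compression
          FROM "/BIC/FZXX";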

  • Items with  zero stock quantity show negative stock value in Stock reports

    When running stock reports to check the stock value against the GL accounts, some items appear with zero stock quantity, but the report still shows a stock value (a negative value in my case).
    How can this happen, and how can I correct this situation?
    System parameters are: negative stock is not allowed, items with zero cost price are not allowed. At item level, the average cost price method is used.
    P.K.Johnsen

    Hi Johnsen,
    I believe you have checked "Manage Inventory by Warehouse". I have noticed this issue in SAP B1 2005B, but it is rectified in 2007B. The system behaves this way because it maintains an item cost for the item across all warehouses, so even if there is no stock in a particular warehouse, the system will still show a value for it. Hope this helps. Please search the forum; you'll find related threads.
    Thanks,
    Joseph
