Zeros in InfoCube

All,
We have an InfoCube that, per the requirements, can legitimately contain key figures with the value '0'. This InfoCube is fed by a DSO. When a deletion comes from the DSO, it creates a negative entry in the cube to zero out the key figures.
My issue is that if I run compression with zero elimination to remove the zeroed-out deleted records, it will also remove records that were not deleted but genuinely have a key figure of '0'.
I am looking for a way to differentiate deleted records whose key figures are now '0' from real records with a key figure of '0'. If there is a way, how can we remove only the deleted records with key figure '0' from the cube?
I hope I made it clear enough.
Thank you,

Serkan Tumer wrote:
> [original question quoted above]
If removing the zero records is acceptable, then write a simple end routine. I am not strong in ABAP, but the statement should look something like this: DELETE RESULT_PACKAGE WHERE <key figure> = 0 (in an end routine the internal table is RESULT_PACKAGE, and the WHERE clause names the component directly, without a table prefix).
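A minimal sketch of the statement inside such an end routine. /BIC/ZAMOUNT is an assumed key figure name - replace it with the real field of your transformation; RESULT_PACKAGE is the internal table an end routine works on:

    " Drop every record whose key figure is zero before it reaches the cube.
    " Caution: this also removes genuine zero-value records, so it only
    " helps if those do not need to be loaded at all.
    DELETE result_package WHERE /bic/zamount = 0.

In a 3.x update/start routine the equivalent line would be DELETE DATA_PACKAGE WHERE /bic/zamount = 0.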

Similar Messages

  • Removing leading zeros in infocube

    Hi all,
    I am loading a flat file into an InfoCube.
    I have two fields, say field1 and amount.
    field1 is CHAR 32.
    After loading the data into the InfoCube, I am seeing leading zeros in the data for field1, e.g.:
    source file
    field1            amt
    123               10000
    234               20000
    target data (since the field is length 32, it adds leading zeros)
    0000000000123     10000
    0000000000234     20000
    I don't want the system to add zeros to the data.
    Any help?
    Thanks
    Srinivas

    Hi,
    You can do this by changing the length of the key figure, or else by using routines; see the threads below.
    characteristic is not ALPHA -converted
    Re: alpha confirming in master data
    https://forums.sdn.sap.com/thread.jspa?threadID=1263704
    Thanks
    Reddy

  • Dummy Entries in report output

    Hi All,
    I have a report (age analysis) which is created on a MultiProvider.
    When I execute the report, it displays a selection screen like this:
    production year/month == Jan 2001 to May 2001
    class                 == A
    In my cube there is data only for Jan, Feb, April and May; there is no data for March. I want to show MAR/2001 in the report with all key figures as 0 (zero).
    Can anyone suggest a solution?
    Prod Y/M         Key figure1     Key figure2
    Jan-2001         23              232
    Feb-2001         23              232
    Mar-2001         0               0
    Apr-2001         23              2323
    May-2001         23              232
    by
    CCC
    Edited by: CCC on Jul 30, 2008 2:36 AM

    Thanks for your reply. My requirement is as follows:
    I don't have data in the MultiProvider for March, but I still want to show it in the report output with key figure values of zero.
    Ex.
    InfoCube          Master data
    Prod Y/M          Prod Y/M     Items
    JAN/2001          JAN/2001     1
    FEB/2001          FEB/2001     3
    APR/2001          MAR/2001     4
                      APR/2001     5
    Regards
    CCC

  • Compress infocube with zero elimination

    Experts, is it true that you should always check "with zero elimination" when you compress InfoCubes? I was reading the well-known SAP document "How to Handle Inventory Management Scenarios in BW", and the screenshots for cube compression do not have that checkbox checked. Does anybody know why?
    Thanks!

    Hello,
    With zero elimination, the entries where all key figures are equal to zero are deleted from the fact table during compression. You don't want that in Inventory Management, where the cubes carry non-cumulative key figures.
    Regards,
    Jorge Diogo

  • What is a Zero elimination in infocube compression?

    Can anybody please explain in detail what zero elimination is in InfoCube compression? When and why do we need to switch it on? I appreciate your help. Thank you.

    Hi Rafi,
       If you want to avoid the InfoCube containing entries whose key figures are zero values (in reverse posting for example) you can run a zero-elimination at the same time as the compression. In this case, the entries where all key figures are equal to 0 are deleted from the fact table.
    Zero-elimination is permitted only for InfoCubes, where key figures with the aggregation behavior ‘SUM’ appear exclusively. In particular, you are not permitted to run zero-elimination with non-cumulative values.
    More info at:
    http://help.sap.com/saphelp_nw04/helpdata/en/ca/aa6437e7a4080ee10000009b38f842/content.htm
    Hope it Helps
    Srini

  • Infocube showing zero records...

    Hello All,
    I am loading data from a flat file to an ODS and then to a cube. After loading, I can see the number of records transferred and added in the ODS, but not in the InfoCube.
    Please help...
    Regards,
    MC

    Hi MC,
    Please make sure that the data load to the ODS was successful and check the number of records added to the ODS.
    If everything is fine with the ODS, then:
    1. Check the Details tab in RSMO; here you can see the status of the records added.
    2. Check the update rule of the InfoCube.
    Then delete the request from the InfoCube by manually setting the total status to red.
    Then reset the data mart status of the ODS request and reschedule it.
    Hope it helps...
    Regards,
    Zakirahamed SM

  • Infocube compression with zero elimination

    Hi all,
    I have a cube which has been compressed daily for almost a year, but without zero elimination.
    Now I have a requirement to compress with zero elimination. I am planning to do this after the next load into the cube. Once I compress with the zero elimination checkbox ticked, will all the data records be compressed with zero elimination? Will this cause any problem?
    What is the best way to do it?
    I cannot delete the contents of the cube.
    Expecting a reply ASAP.
    Regards,
    Adarsh

    I hope nothing will happen to the data values; they should remain the same. It is just that you are removing the records whose key figures are all zero. Even if those zero records stayed, they would still aggregate to the same values in the report.
    So you can go ahead with the zero elimination mode.
    Regards,
    Vikram

  • Purchasing Cube -- 0PUR_C01.. "Added record" is showing zero in request

    When I try to load data into this cube, the transferred records show values while the added records show zero. What is the problem? Please advise!

    Hi,
    Check whether you have defined the industry sector in R/3. This has to be done before filling the setup tables; search the forums for note 353042.
    This is done with the help of Transaction MCB_ which you can find in the OLTP IMG for BW (Transaction SBIW) in your attached R/3 source system.
    Here you can choose your industry sector. 'Standard' and 'Consumer products' are for R/3 standard customers, whereas 'Retail' is intended for customers with R/3 Retail only.
    You can display the characteristics of the process key (R/3 field BWVORG, BW field 0PROCESSKEY) by using Transaction MCB0.
    If you have already set up historical data (for example for testing purposes) by using the setup transactions (Statistical Setup Programs) (for example: Purchasing: Tx OLI3BW, material movements: OLI1BW) into the provided setup tables (for example: MC02M_0SCLSETUP, MC03BF0SETUP), you unfortunately have to delete this data (Tx LBWG). After you have chosen the industry sector by using  MCB_, perform the setup again, so that the system fills a valid transaction key for each data record generated. Then load this data into your connected BW by using 'Full update' or 'Initialization of the delta process'. Check, whether the system updates data into the involved InfoCubes now.
    If all this is not successful, please see Note 315880, and set the application indicator 'BW' to active using Transaction 'BF11'.
    Regards,
    Anil Kumar Sharma .P

  • Delete data from infocube which is older than 7 days.

    Hi Gurus,
    I have a cube with the time characteristic 0CALDAY. This cube gets data in delta mode from a DSO which has 0CALDAY as one of its key fields. I need to delete from the InfoCube the data older than the previous week, i.e. where SY-DATUM - 0CALDAY >= 7.
    I tried creating a transformation in which both the data target and the source are the above-mentioned DSO, and I used the technical rule group to set the record mode to "D" for the older records; this works fine for the DSO.
    But the cube still has the data - only the key figures become zero. I want to completely delete the data, and I do not want to use "DELETE-FACTS" or the function module RSDRD_SEL_DELETION.
    Please help me get out of this problem.
    Thanks in advance,
    Rajesh.

    Hello Rajesh
    Not sure if I understood the problem correctly, but what I understand is that you need to keep only the last 7 days of data in the cube, and anything loaded earlier should be deleted.
    If that is correct, can you delete the whole cube content and load only the last 7 days of data from the DSO to the cube via a DTP filter (or a start routine - a minimal sketch appears at the end of this page)?
    In this case the first delta (or full) load will not read from the change log table; it will read from the active table of the DSO only.
    Regards
    Anindya

  • Load from ODS into InfoCube gives TIME-OUT runtime error after 10 minutes ?

    Hi all,
    We have a full load from an ODS into an InfoCube, and it was working fine until last week with up to 50,000 records. Now we have around 70,000+ records and it has started failing with a TIME_OUT runtime error.
       The following is from the Short Dump (ST22):
       The system profile "rdisp/max_wprun_time" contains the maximum runtime of a
    program. The current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
      The following are from ROIDOCPRMS table:
       MAXSIZE (in KB) : 20,000
       Frequency       :  10
       Max Processes : 3
    When I check the data packages under the 'Details' tab in the Monitor, there are four data packages and the first three have 24,450 records each. I right-click on each data package and select 'Manual Update' to load from the PSA. When this manual update takes more than 10 minutes, it fails with TIME_OUT again.
      How could I fix this problem, PLEASE ??
    Thanks,
    Venkat.

    Hello A.H.P,
    The following is the start routine (a buffered-lookup variant that avoids the per-record SELECTs is sketched at the end of this page):
    PROGRAM UPDATE_ROUTINE.
    * $$ begin of global - insert your declaration only below this line  -
    TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
    DATA: material(18), plant(4).
    DATA: role_assignment like /BIC/AZCPR_O0100-CPR_ROLE, resource like
    /BIC/AZCPR_O0200-CPR_BPARTN.
    * $$ end of global - insert your declaration only before this line   -
    * The follow definition is new in the BW3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    * $$ begin of routine - insert your code only below this line        -
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
    * to make monitor entries
       clear DATA_PACKAGE.
       loop at DATA_PACKAGE.
          select single /BIC/ZMATERIAL PLANT
             into (material, plant)
             from /BIC/AZCPR_O0400
             where CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
             and ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
           if sy-subrc = 0.
              DATA_PACKAGE-/BIC/ZMATERIAL = material.
              DATA_PACKAGE-plant = plant.
              modify DATA_PACKAGE.
              commit work.
           endif.
           select single CPR_ROLE into (role_assignment)
                         from /BIC/AZCPR_O0100
                         where CPR_GUID = DATA_PACKAGE-CPR_GUID.
            if sy-subrc = 0.
              select single CPR_BPARTN into (resource)
                         from /BIC/AZCPR_O0200
                         where CPR_ROLE = role_assignment
                         and CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
                   if sy-subrc = 0.
                      DATA_PACKAGE-CPR_ROLE = role_assignment.
                      DATA_PACKAGE-/BIC/ZRESOURCE = resource.
                      modify DATA_PACKAGE.
                      commit work.
                   endif.
              endif.
           clear DATA_PACKAGE.
           endloop.
    * if abort is not equal zero, the update process will be canceled
      ABORT = 0.
    * $$ end of routine - insert your code only before this line         -
    Thanks,
    Venkat.

  • How to use standard infocubes and creation of customized info cubes

    Hi Gurus,
    1. How do I find a standard InfoCube suitable for a given reporting requirement?
    2. How do I implement it in development? If the standard InfoCube is not exactly suitable, what should I do - can changes be made to a standard InfoCube (I assume that is not possible)?
    3. How do I create customized InfoCubes?
    4. Please send me links, ideally with screenshots.
    Regards,
    razz

    Hi Narasimha,
    1. How to find a standard InfoCube suitable for reporting:
    If you have the name of the InfoCube, go to transaction RSA1, select InfoProviders in the left pane, click Find and enter the cube name you want to search for.
    Standard cubes start with 0 (zero), whereas customized cubes start with Z.
    2. How to implement it in development if the standard InfoCube is not exactly suitable - can you make changes to a standard InfoCube?
    I don't think you can make changes to standard cubes. If you want to make changes, copy the standard cube (right-click on the cube and choose Copy) and make the changes in the customized copy.
    3. How to create customized InfoCubes:
    Go to RSA1 and select InfoProvider in the left pane.
    In the right pane, right-click and choose Create InfoCube. The InfoCube is created!
    Hope it helps!
    Regards,
    Pavan

  • Remove Leading zeros for Material in Transformation

    Hi Experts,
    I'm using DTPs for the first time and don't have much experience with DTPs and transformations.
    I'm creating an InfoCube with some objects, and I want to remove leading zeros for ZMATERIAL.
    In 3.x I had written the update routine as follows:
    data: zmat(18) type c.
    zmat = COMM_STRUCTURE-/BIC/ZMAT.
    shift zmat left deleting leading '0'.
    * result value of the routine
      RESULT = zmat.
    I'm confused about where in the transformation to write this routine.
    In the transformation I'm writing it as follows:
    data: zmat(18) type c.
    zmat = SOURCE_FIELDS-/BIC/ZMAT.
    shift zmat left deleting leading '0'.
    RESULT = zmat.
    But it is not removing the zeros.
    Can anybody advise on this?
    Siri

    Dear Sir,
    No confusion at all.
    Just double-click the target InfoObject, i.e. the material object, in the transformation; a wizard will pop up.
    There you will see an option called "Rule Type", whose default value is "Direct Assignment". Click the drop-down icon in that box and select "Routine".
    The moment you select the routine option, the ABAP editor opens, where you can write your routine and get the desired result.
    Hope it helps.

  • Leading zeros-master data

    Hi All,
    We have a characteristic 'Material'(custom defined master data infoObject) of length
    10,Conversion routine:ALPHA, datatype :CHAR,& output length:10.
    The requirement is that the material number should be displayed as a 10-digit number (characters) in the BEx report.
    However, some materials don't have 10 digits; for example, material 123 should be shown as 0000000123 in the query output.
    When I display the InfoCube contents or master data contents, I can see 123 as 0000000123, but with F4 it shows 123 as 123, and in the query output it is also '123'.
    1. How is the data actually stored in the InfoCube and master data tables (as 123 or as 0000000123)?
    2. One solution to display the material as 10 characters in the report is a transfer/update routine that adds leading zeros.
    Is there any other way to achieve this without using a routine? Kindly let me know.
    Correct me if my assumption above is wrong.
    thanks

    Hi Murali,
    the ALPHA conversion is used to convert data between an internal and an external view. The internal view is how the data is physically stored in the database; the external view is how the data is displayed, e.g. in reports.
    In some places the ALPHA conversion (if defined for the characteristic) is applied automatically - you cannot change that. In those cases you can only see the external format. BEx reporting is one such place, and the F4 help is another.
    In some places within the BW system (not the end-user view) you have the option to see the data in the internal format. One of them is the cube content display, which has a flag "Do not use any conversion". If this flag is activated, you will see the data in the internal format - but this is NOT the format a user will see in reporting.
    In your case the internal format is the value "0000000123" and the external format is "123".
    Hope that helps
    Regards
    Adios

  • Voyager hangs when trying to display data from an InfoCube

    Hi experts,
    I'm trying to use Voyager for the first time in a newly installed environment. From the CMC I've created a connection against a BI InfoCube. Then I create a new Voyager workspace based on this InfoCube. All of the InfoCube's dimensions and InfoObjects are displayed in the left pane. Then I try to drag one of the characteristics to the right pane, but Voyager seems to hang, showing a "processing" message. No data is displayed, and I can't drag any more characteristics or key figures.
    Is there any known issue that could explain this? The InfoCube has data, and I've already built Universes and Web Intelligence reports on it successfully, so the problem seems to lie in the Voyager layer, not in the data/OLAP layer.
    Thank you very much!

    Hi
    When Voyager creates the initial data view (i.e. when you drop the first dimension on the cross-tab), it has to fetch all the dimensions, all the hierarchies, and all the level-zero members of the default hierarchies. Although this isn't a large number of items, there are circumstances in which an API call to SAP - say, to get the hierarchies for a particular dimension - takes a very long time. The other tools won't necessarily experience this problem, as they are more selective about the initial metadata they load up; for example, they will only load the dimensions and hierarchies needed for the query.
    The first step in troubleshooting this is to profile the calls Voyager makes to the SAP OLAP BAPI and see whether any of them take a very long time. You can profile this either from the SAP end or by turning on logging for the Voyager data access component.
    If you identify a SAP API call which is taking a long time, you then need to see if there are any messages concerning fixes to such problems. I know there have been fixes to correct calls to get hierarchies in certain circumstances which have been taking too long.
    Reuben

  • Delete InfoCube Index Question

    Hi,
    Before my data loads I have always dropped the InfoCube indexes. I recently partitioned our largest InfoCube, and now dropping the indexes takes forever, so I can't really drop them during our daytime loads - it would take too long to delete and rebuild the indexes on this InfoCube.
    Does anyone know if this is normal behavior for a partitioned InfoCube to take so long to drop and rebuild indexes?
    Thanks for any ideas or thoughts!

    Hey Kenneth,
    Since you stated that this InfoCube is large and you recently partitioned it, make sure you compress your requests. Also consider compressing with zero elimination as much as possible - this will reduce the data volume to be indexed.
    How big is the DB server this is running on? Are the DBAs looking at DB settings that could impact index creation time, e.g. Oracle init parameters, temp space, index tablespaces, etc.?
    You might also need to delete unused indexes that still exist on your cube.
    Also, these index rebuilds should run with "no logging" specified, which is the default, but it might help to check and confirm.
    Other than this, OSS note 323090 might be worth checking for you.
    Hope this helps!
    Thanks,
    Sheen
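
For the "Delete data from infocube which is older than 7 days" thread above, here is a minimal sketch of the start-routine variant Anindya suggests: after deleting the cube content, load only the last seven days from the DSO by dropping older records in the DSO-to-cube transformation's start routine. The field name CALDAY is an assumption - use the source field that feeds 0CALDAY.

    " Keep only the records of the last 7 days in the data package (sketch).
    DATA: lv_cutoff TYPE sy-datum.
    lv_cutoff = sy-datum - 7.
    DELETE source_package WHERE calday < lv_cutoff.

The same condition could equally be placed in the DTP filter as a routine; the start-routine form is shown here only because it is the shorter illustration.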

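For the TIME_OUT thread above ("Load from ODS into InfoCube..."), the per-record SELECT SINGLE statements and the COMMIT WORK inside the loop of the posted start routine are the usual suspects for the long runtime. A hedged sketch of how the first lookup could be buffered instead (table and field names are taken from the posted routine; this is an illustration of the pattern, not a drop-in replacement):

    TYPES: BEGIN OF ty_mat,
             cpr_ext_id TYPE /bic/azcpr_o0400-cpr_ext_id,
             material   TYPE /bic/azcpr_o0400-/bic/zmaterial,
             plant      TYPE /bic/azcpr_o0400-plant,
           END OF ty_mat.
    DATA: lt_mat TYPE STANDARD TABLE OF ty_mat,
          ls_mat TYPE ty_mat.

    " One database round trip instead of one SELECT SINGLE per record.
    IF NOT data_package[] IS INITIAL.
      SELECT cpr_ext_id /bic/zmaterial plant
        FROM /bic/azcpr_o0400
        INTO TABLE lt_mat
        FOR ALL ENTRIES IN data_package
        WHERE cpr_ext_id = data_package-cpr_ext_id
          AND ( matl_type = 'ZKIT' OR matl_type = 'ZSVK' ).
      SORT lt_mat BY cpr_ext_id.
    ENDIF.

    LOOP AT data_package.
      READ TABLE lt_mat INTO ls_mat
           WITH KEY cpr_ext_id = data_package-cpr_ext_id
           BINARY SEARCH.
      IF sy-subrc = 0.
        data_package-/bic/zmaterial = ls_mat-material.
        data_package-plant          = ls_mat-plant.
        MODIFY data_package.
      ENDIF.
      " No COMMIT WORK here - it is not needed in a start routine and
      " slows the loop down considerably.
    ENDLOOP.

The role and resource lookups from /BIC/AZCPR_O0100 and /BIC/AZCPR_O0200 would be buffered the same way before the loop.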