Data in the cube is showing multiple entries when compared with ODS

Hello BW Gurus,
We have a waste report in production planning on a cube and an ODS separately. The same InfoPackage (and therefore the same InfoSource) loads both targets, but when we run a report on the cube, the records show multiple entries (i.e. the key figures do not match when compared to the ODS), whereas the ODS records are correct, exactly as in R/3. There are six key figures in total, of which four are pulled from R/3 and two are populated in BW.
An Example:
A waste report in PP was run for plant 1000 for 12/2005 and process order 123456. The operational scrap should be 2.46% and the component scrap should be 3.00% for material 10000000. The report shows 7.87% for planned operational waste % and 9.6% for planned component waste %. These values are not correct. The ODS values for order 123456 matched the data in R/3 for component and operational scrap.
There is a start routine on the ODS and also on the cube. I am not good at ABAP, so I am requesting your help.
Here is the ODS Code:
tables: /BI0/PPRODORDER.
  loop at data_package.
    select single COORD_TYPE
                  PRODVERS
      into (/BI0/PPRODORDER-COORD_TYPE,
            /BI0/PPRODORDER-PRODVERS)
      from /BI0/PPRODORDER
     where PRODORDER = data_package-PRODORDER
       and OBJVERS   = 'A'.
    if sy-subrc = 0.
      if /BI0/PPRODORDER-COORD_TYPE = 'XXXX'
      or /BI0/PPRODORDER-COORD_TYPE = 'YYYY'.
        data_package-PRODVERS = space.
      else.
        data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
      endif.
    endif.
    if data_package-calday = space
    or data_package-calday = '00000000'.
      if data_package-TGTCONSQTY NE 0.
        data_package-calday = data_package-ACTRELDATE.
      endif.
    endif.
    modify data_package.
  endloop.
Here is the Cube Code:
tables: /BI0/PPRODORDER,
        /BIC/ODS.
  TYPES:
    BEGIN OF ys_mat_unit,
      material     TYPE /bi0/oimaterial,
      mat_unit     TYPE /bi0/oimat_unit,
      numerator    TYPE /bi0/oinumerator,
      denomintr    TYPE /bi0/oidenomintr,
    END OF ys_mat_unit.
  DATA:
    l_s_mat_unit   TYPE ys_mat_unit,
    e_factor       type p decimals 5.
  loop at data_package.
    select single COORD_TYPE
                  PRODVERS
      into (/BI0/PPRODORDER-COORD_TYPE,
            /BI0/PPRODORDER-PRODVERS)
      from /BI0/PPRODORDER
     where PRODORDER = data_package-PRODORDER
       and OBJVERS   = 'A'.
    if sy-subrc = 0.
      if /BI0/PPRODORDER-COORD_TYPE = 'XXX'
      or /BI0/PPRODORDER-COORD_TYPE = 'YYY'.
        data_package-PRODVERS = space.
      else.
        data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
      endif.
    endif.
    if data_package-calday = space
    or data_package-calday = '00000000'.
      if data_package-TGTCONSQTY NE 0.
        data_package-calday = data_package-ACTRELDATE.
      endif.
    endif.
    data_package-agsu     = 'GSU'.
    data_package-agsu_qty = 0.
    select single gr_qty
                  base_uom
      into (/BIC/ODS-gr_qty,
            /BIC/ODS-base_uom)
      from /BIC/ODS
     where prodorder = data_package-prodorder
       and material  = data_package-material.
    if sy-subrc = 0.
      if /BIC/ODS-base_uom = 'GSU'.
        data_package-agsu_qty = /BIC/ODS-gr_qty.
      else.
        SELECT SINGLE * FROM /bi0/pmat_unit
          INTO CORRESPONDING FIELDS OF l_s_mat_unit
          WHERE material = data_package-material
            AND mat_unit = 'GSU'
            AND objvers  = 'A'.
        IF sy-subrc = 0.
          IF l_s_mat_unit-denomintr <> 0.
            e_factor = l_s_mat_unit-denomintr /
                       l_s_mat_unit-numerator.
            multiply /BIC/ODS-gr_qty by e_factor.
            data_package-agsu_qty = /BIC/ODS-gr_qty.
          ENDIF.
        else.
          CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
            EXPORTING
              INPUT                = /BIC/ODS-gr_qty
              NO_TYPE_CHECK        = 'X'
              ROUND_SIGN           = ' '
              UNIT_IN              = /BIC/ODS-base_uom
              UNIT_OUT             = 'GSU'
            IMPORTING
              OUTPUT               = DATA_PACKAGE-gsu_qty
            EXCEPTIONS
              CONVERSION_NOT_FOUND = 1
              DIVISION_BY_ZERO     = 2
              INPUT_INVALID        = 3
              OUTPUT_INVALID       = 4
              OVERFLOW             = 5
              TYPE_INVALID         = 6
              UNITS_MISSING        = 7
              UNIT_IN_NOT_FOUND    = 8
              UNIT_OUT_NOT_FOUND   = 9
              OTHERS               = 10.
        endif.
      endif.
    endif.
    modify data_package.
  endloop.
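For orientation only, here is a small standalone sketch of the factor logic in the conversion branch above, assuming /BI0/PMAT_UNIT follows the usual numerator/denominator (UMREZ/UMREN) convention; the values are invented, not taken from our system:

* Hypothetical example: 1 GSU corresponds to 12 base units for the material,
* so the P table would hold numerator = 12 and denominator = 1.
DATA: l_numerator TYPE i VALUE 12,
      l_denomintr TYPE i VALUE 1,
      l_gr_qty    TYPE p DECIMALS 3 VALUE '24.000',
      l_factor    TYPE p DECIMALS 5.

l_factor = l_denomintr / l_numerator.  " 1 / 12 = 0.08333
l_gr_qty = l_gr_qty * l_factor.        " 24 base units -> roughly 2.000 GSU
WRITE: / l_gr_qty.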
Somehow the AGSU quantity is not populating in the cube. When I debug the code, I can see a clean record in the internal table, but not in the cube.
Your suggestions and solutions would be highly appreciated.
thanks,
Swathi.

Hi Swathi
In an ODS we have the option of overwrite or addition, but in a cube we only have addition. That is why you are getting multiple entries.
If you are running a daily full load on the cube, then please delete the earlier requests,
so that at any point in time there is only one full-load request in the cube. Hope this will solve your problem.
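A minimal standalone sketch of that difference (illustrative only; the report name, table layout, and values are invented, and this is not the actual update rule logic): an additive target behaves like COLLECT, an overwrite ODS behaves like MODIFY, so the same full request loaded twice doubles the cube key figure but leaves the ODS key figure unchanged.

REPORT z_demo_additive_vs_overwrite.

TYPES: BEGIN OF ty_rec,
         prodorder(12) TYPE c,
         scrap_pct(8)  TYPE p DECIMALS 2,
       END OF ty_rec.

DATA: ls_rec  TYPE ty_rec,
      lt_cube TYPE STANDARD TABLE OF ty_rec,                               " additive, like the cube
      lt_ods  TYPE STANDARD TABLE OF ty_rec WITH NON-UNIQUE KEY prodorder. " overwrite, like the ODS

ls_rec-prodorder = '123456'.
ls_rec-scrap_pct = '2.46'.

DO 2 TIMES.                        " the same full request loaded twice
  COLLECT ls_rec INTO lt_cube.     " key figure is summed      -> 4.92
  MODIFY TABLE lt_ods FROM ls_rec. " key figure is overwritten -> 2.46
  IF sy-subrc <> 0.
    INSERT ls_rec INTO TABLE lt_ods.
  ENDIF.
ENDDO.

READ TABLE lt_cube INTO ls_rec INDEX 1.
WRITE: / 'Cube-like (additive): ', ls_rec-scrap_pct.
READ TABLE lt_ods INTO ls_rec INDEX 1.
WRITE: / 'ODS-like (overwrite):', ls_rec-scrap_pct.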
Regards,
Monika

Similar Messages

  • Data in the cube is showing wrong when compared with ODS

    Hi Swathi,
    Maybe you want to look into the way the % is being calculated in the cube. If the formula involves counting the number of records, then you will also be counting the negative records posted in the cube unless you have compressed the cube; that might give you wrong numbers.
    Doniv

  • Input ready query is not showing loaded data in the cube

    Dear Experts,
    In the input-ready query we have a problem: it does not show values that were not entered through that query. Is there any setting for the input-ready query that we can make so that it shows the data loaded into the cube as well as the data entered through the input-ready query itself?
    Thanks,
    Gopi R

    Hi,
    Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So you can check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
    In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MPRO/!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a multiprovider the second case is relevant. If the input-ready query is defined on a multiprovider containing aggregation levels again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
    Regards,
    Gregor

  • Show only logged in User Data from the cube

    Hi,
    Suppose I have a couple of customers' data in the cube, and these customers are also BW users. Now, I wish my query to display data only for the customer/user who has logged in. The query could be created by any user.
    Is this possible in a web report?
    regards,
    Sam

    Hi
    Refer this thread
    Re: query execution statistics
    The query execution statistics can be fetched from the relevant table that acts as source for your query
    Regards
    N Ganesh

  • Could not see the data in the cube

    Hi all,
    I am trying to load data from one cube to another. Everything is fine, but I am not able to see the data in the new cube.
    I created two cubes, cube A new and cube B new, which are copies of the existing cubes A and B. I need to load data from these new cubes (cube A new and cube B new) to another cube (cube C new), which is also a copy of the existing cube C. I created update rules and activated them. I tried to load the data into cube C new. I can see the load is green, but I cannot see the data in cube C new; it shows 0 records, while there are records in cube A new and cube B new.
    Can anyone please help me out.
    Thanks in advance.
    Thanks,
    Preethi.

    Hi Balu,
    I don't know what exactly the problem was, but even after the load was successful I could not see any records in the new cube. The source cubes had 60,446 and 7,595 records, but the target cube had 0 records.
    I ran the program SAP_FACTVIEWS_RECREATE as Bhanu suggested and deleted the data from the new cube (in spite of it having 0 records, I deleted them). I tried to load the data again, which was successful, and I could see the data in the new cube.
    Hope its clear.
    Thanks,
    Preethi.

  • How to reduce the run time of ABAP code (BADI) when it is reading huge data from BPC Cube and thus writing  back huge data to the cube.

    Hi All ,
    When the BAdI code reads a huge number of records from the BPC cube, performs calculations, and writes a huge amount of data back into the cube, it takes a lot of time. If there is any suggestion for reading the data from the cube or writing data into the cube using some parallel-processing method, please suggest it.
    Regards,
    SHUBHAM

    Hi Gersh ,
    If we have a specific server, say 10.10.10.10 (abc.co.in), on which we are working, then under RZ12 we make the following entry:
    LOGON GROUP          INSTANCE
    parallel_generators        abc.co.in_10         (let's assume the instance number is 10)
    Now in SM59, under ABAP Connections, I am giving the following technical settings:
    TARGET HOST          abc.co.in
    IP address                  10.10.10.10
    Instance number          10
    Now, if we have a scenario of load-balancing servers with the following server details (with all servers on different instance numbers):
    10.10.10.11   
    10.10.10.13
    10.1010.10
    10.10.10.15
    In this case, how can we make the RZ12 and SM59 settings such that we don't have to hardcode any IP address?
    If the request is redirected to 10.10.10.11 and not to 10.10.10.10, what should the settings be in that case?
    I have raised this question on the below thread :
    How to configure RZ12 and SM59 ABAP connection settings when we work with load-balancing servers rather than a specific server.
    Regards,
    SHUBHAM
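    For the parallel-processing part of this thread, one common pattern is asynchronous RFC with DESTINATION IN GROUP, which lets the dispatcher pick a free application server from the RZ12 logon group so that no host name or IP address is hardcoded. This is a rough sketch only: Z_BPC_READ_SLICE, the parameter name, the package table, and the group name are placeholders/assumptions, not a tested implementation.

    REPORT z_parallel_read_sketch.

    DATA: gt_packages TYPE STANDARD TABLE OF string, " placeholder: work split into slices
          gs_package  TYPE string,
          lv_task(8)  TYPE c,
          lv_indx(4)  TYPE n,
          gv_started  TYPE i,
          gv_done     TYPE i.

    LOOP AT gt_packages INTO gs_package.
      lv_indx = sy-tabix.
      CONCATENATE 'TSK' lv_indx INTO lv_task.
      CALL FUNCTION 'Z_BPC_READ_SLICE'              " hypothetical RFC-enabled function module
        STARTING NEW TASK lv_task
        DESTINATION IN GROUP 'parallel_generators'  " RZ12 logon group, no IP hardcoded
        PERFORMING handle_result ON END OF TASK
        EXPORTING
          iv_package            = gs_package
        EXCEPTIONS
          communication_failure = 1
          system_failure        = 2
          resource_failure      = 3.
      IF sy-subrc = 0.
        gv_started = gv_started + 1.
      ENDIF.
    ENDLOOP.

    " Callbacks only run while the program waits; collect everything, then write back once.
    WAIT UNTIL gv_done >= gv_started.

    FORM handle_result USING p_task TYPE clike.
      " RECEIVE RESULTS FROM FUNCTION 'Z_BPC_READ_SLICE' ... would pick up the slice output here.
      gv_done = gv_done + 1.
    ENDFORM.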

  • How to check data in the cube

    For integration purposes, how can you check the data in the cube and validate it against the query data? I know we can go to the corresponding InfoCube and right-click "Display data", but what is on the query is not available when I execute "Display data".
    Thanks

    You can always use the same set of restrictions as used in the RKF and get a value very close to what the RKF is displaying.
    Getting the same values for a CKF is a bit tougher; I can only suggest that you take one or two examples, do the calculations manually, and compare the query result with the InfoCube data.

  • HT1399 When I import a CD to itunes, it splits the album in to multiple entries. I have changed the info on some but others just will not merge. Is there an easy way to do this and can I stop them from splitting in future? Thanks

    When I import a CD to itunes, it splits the album in to multiple entries. I have changed the info on some but others just will not merge. Is there an easy way to do this and can I stop them from splitting in future? Thanks

    Okay, this is what fixed my problem. Some kind of error/glitch occurred halfway through importing that third CD and prevented further imports from any other CD. Once I completely removed the half-imported CD album from my library, I could import CDs again without any problems.

  • Error while loading data into the cube

    Hi,
    I loaded data to the PSA, and when I load the data to the cube through a data transfer process, I get an error (red).
    Through "Manage", I can see the request in red. How do I find the exact error? Also, what could be the possible reason for this?
    Also, can someone explain the data transfer process (not in a process chain)?
    Regards,
    Sam

    Hi Sam
    after you load the data through the DTP (after clicking the Execute button), just go to the monitor screen and press the Refresh button; you can find the logs there.
    Otherwise, on the request screen, beside the request number, you can see the log icon, which you can click on.
    DTP means data transfer process; it is used to transfer data from the PSA to a data target (an InfoProvider such as a DSO or a cube).
    Check this link for DTP:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    In your case, check the date formats and special characters; the problem may be there.
    REGARDS
    @JAY

  • Wish to keep always 2 years of data on the cube

    Hi Experts,
    we wish to always keep two years of data in the cube and archive the data after it is two years old.
    Any thought on this is appreciated.
    Thanks,
    Varun.

    hi Varun,
    you can use transaction SARA with 0CALYEAR as the selection to archive the cube to a flat file or to another cube.
    For more detail, please follow this "How to Archive in BW" document:
    https://websmp108.sap-ag.de/~sapdownload/011000358700000085572002E/HowToArchiveinBW.doc
    Hope this helps.

  • InfoSpoke not pulling data from the cube

    Hi,
    Though there is data in the cube, when I start the extraction the InfoSpoke pulls zero records. I see that a file is created, but I am not able to see any data in it; the monitor says 0 records were extracted.
    I am doing this for the first time in the newly upgraded system, where I changed the server name.
    Thanks

    Hi,
    Check the layout definition (data column) for the corresponding planning level, and also check whether the planning area value corresponds to your cube name.
    Also cross-check with LISTCUBE for the corresponding restriction you have made in the planning level.
    Regards,
    Siva.

  • Deleting Data from the cube

    Is there any function module which drops all the data from the cube?
    Or is there a function module which deletes all the requests from the cube?
    Thanks.

    Hello,
    Check one of the following:
    RSAPO_DELETE_CUBE
    RSAPOADM_CLEANUP_CUBE
    RSAR_DELETE_ICUBE
    RSAU_INFOCUBE_CONTENT_DELETE
    RSDG_CUBE_DELETE
    RSDPC_DROP_CUBE
    RSDPM_DB_CLEANUP_CUBE
    RSDPTEST_MOLAP_CUBE_CLEANUP
    RSDPW_INFOCUBE_DELETE_ALL_DATA
    RSDPW_INFOCUBE_DELETE_DATA
    RSDPW_INFOCUBE_DROP
    RSDP_MDMETA_ICUBE_DROP
    Regds,
    Shashank

  • Iphone 3GS- How do you get the pictures to show up larger when someone call

    iPhone 3GS: How do you get the pictures to show up larger when someone calls? I know this seems trivial, but I have difficulty seeing, so one of the reasons I actually purchased this phone was for this feature. When the photos do come up, they come up in the top right corner in small boxes/windows. I already googled this topic, and they suggested importing the photos to iTunes and then directly onto the iPhone; I did this and it didn't work. Additionally, I took a few test photos from the iPhone and tried setting them to contacts. This didn't work either. I looked in Settings and could not find anything that discussed this issue, although I do believe I have seen iPhones that ring with large pictures. Am I crazy?

    You aren't crazy.
    Adding the photo to contacts on the computer gets you the smaller pix when people call.
    Assign the photo on the phone itself and you get full-size pix.
    Additionally, I took a few test photos from the iPhone and tried setting them to contacts. This didn't work either.
    So, that certainly should work. Try it again? Try it with a brand new contact, maybe it is not overwriting a previously existing picture.
    I suppose there is a small possibility that Apple thought this differing behavior was a bug instead of a feature and "fixed" it with a recent update, but I doubt it.

  • When I connect my iphone and open itunes the iphone icon shows up but when I click it nothing happens and the icon disappears.  I have uninstalled and re-installed itunes.  Turned my computer off, and turned my iphone off.  Restarted everything.

    When I connect my iPhone and open iTunes, the iPhone icon shows up, but when I click it nothing happens and the icon disappears. I have uninstalled and re-installed iTunes, turned my computer off, turned my iPhone off, and restarted everything. I synced my iPhone and my iTunes and nothing seems to work. Any suggestions? I am using Windows.

    Stop clicking the eject button.

  • I cannot get the Output Module to work. I was able to get the button to show up, but when I click on it, nothing happens. i need to export a PDF asap

    I cannot get the Output Module to work. I was able to get the button to show up, but when I click on it, nothing happens. I need to export a PDF ASAP. I am trying to export 33 PSD files in Bridge to a PDF like I used to do on my other computer with Bridge.

    Let's start with the general things.
    When you have a problem with one particular site, a good "first thing to try" is clearing your Firefox cache and deleting your saved cookies for the site.
    (1) Bypass Firefox's Cache
    Use Ctrl+Shift+r to reload the page fresh from the server.
    Alternately, you also can clear Firefox's cache completely using:
    orange Firefox button (or Tools menu) > Options > Advanced
    On the Network mini-tab > Cached Web Content : "Clear Now"
    If you have a large hard drive, this might take a few minutes.
    (2) Remove the site's cookies (save any pending work first). While viewing a page on the site, try either:
    * right-click and choose View Page Info > Security > "View Cookies"
    * Alt+t (open the classic Tools menu) > Page Info > Security > "View Cookies"
    In the dialog that opens, you can remove the site's cookies individually.
    Then try reloading the page. Does that help?
