Cube data consistency

Hi,
        I am working with an ODS and a cube. The data is loaded from the PSA to the ODS and then to the cube. When I look into the PSA, the ODS and the cube separately and compare each record (by its primary keys), the data in the PSA is consistent with the ODS, but the data in the ODS is not consistent with the cube. The value of one object (an amount) has been set to zero for some of the records but not for others. I checked the mappings of the objects; they are all one to one, meaning the data in the ODS should arrive in the cube unchanged, at least for that object. So I believe the data is inconsistent. I ran the RSRV test and it says the data has been loaded into the cube without units, and that there is no repair for this; the only way out is to reload the data. But, as you know, reloading all the data in a production system is painful and not really recommended.
Can somebody please tell me how I can fix this issue?
Thanks a lot.
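
For what it is worth, the record-by-record comparison described above can also be done outside BW once both datasets have been exported to CSV (for example via LISTCUBE and an ODS data display). A minimal sketch in Python with pandas; the file names, the key columns DOC_NUMBER and ITEM, and the AMOUNT column are placeholders for the real extracts:

import pandas as pd

# Hypothetical exports of the same records from the ODS and the cube.
ods = pd.read_csv("ods_extract.csv")
cube = pd.read_csv("cube_extract.csv")

keys = ["DOC_NUMBER", "ITEM"]   # substitute the real primary-key characteristics

# Join the two extracts on the primary keys and compare the amount key figure.
merged = ods.merge(cube, on=keys, how="outer", suffixes=("_ods", "_cube"))
diff = (merged["AMOUNT_ods"].fillna(0) - merged["AMOUNT_cube"].fillna(0)).abs()
mismatched = merged[diff > 0.005]

print(f"{len(mismatched)} of {len(merged)} records differ between ODS and cube")
print(mismatched[keys + ["AMOUNT_ods", "AMOUNT_cube"]].head(20))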

Hi Visu,
Do you mean an inconsistency between the ODS and the cube?
Do you have any start routine in the update rules?
How did you determine the inconsistency of those few records? Please let me know; I am just trying to understand the problem.
Ashish

Similar Messages

  • Record count is double in the Fact tables when compared to Cube data!

    Hello BW Gurus,
    I have two questions.
    1. I have a cube containing three years of data. Because of some bad data in it, I dropped the cube data completely from the fact and dimension tables and ran an init load from the ODS. The load failed with "No SID found for value '2000000000000000000000010' of characteristic 0MAT_PLANT". I ran an RSRV test to check for inconsistencies in the SIDs of 0MAT_PLANT, but everything looks fine. With the time needed to finish the load in mind, I removed that record from the PSA and re-pushed the data from the PSA, which was successful; however, I wanted to know if anyone has come across this kind of error.
    2. The load into the cube finished successfully, but the cube's performance was very bad: my cube request shows only 3.5 million records, yet the fact table shows double that, i.e. 7 million records, because I deleted the failed init request and re-pushed the data from the PSA. Can anyone suggest how to overcome this, i.e. how to bring my fact table back to 3.5 million records?
    please advise me and thanks so much!
    Swathi.

    hi Swathi,
    1. For the missing SID for 0MAT_PLANT: besides RSRV, you can try RSD1 (InfoObject maintenance); there is a menu entry (roughly the third from the left), 'Fill SIDs ...'.
    For the InfoCube update, please make sure you use 'Delete data' with 'Fact and dimension tables'. Check that the fact table then contains 0 records before you update from the ODS with 'Initialize'. Or, if you go via the PSA, delete the data from both the InfoCube and the ODS (loading from the ODS to the cube should be sufficient).
    2. For performance, check InfoCube -> Manage -> Performance and make sure the index and statistics status is green. You can create the indexes there and also refresh the statistics.
    hope this helps.

  • Short dump while displaying cube data in production

    Hi Folks,
    I'm getting a short dump while displaying cube data in production. Please suggest how to analyse it.
    Thanks and Regards
    Santhosh

    Hi Santhosh,
    I'd suggest following SAP Note 568768 (Analysis of SQL Errors causing Shortdumps or Error messages). The note contains precise steps for analysing such an ABAP dump.
    cheers
    m./

  • Objects are not referenced in cube data source view. Large model.

    We are looking at a large model: 100+ measure groups, 200+ dimensions, one cube, deployed to multiple customers in production as part of custom-developed software.
    In Data Tools 2010 I am looking at the cube on SQL Server 2008 and SQL Server 2008 R2 (EE, DE).
    In the cube's data source diagram we observe 50+ objects that do not have any links to or from them.
    1) Some of these I was unable to find in the DMVs of the cube at all; it turns out they are present in the data source that sits in the Data Source Views section.
    How did these end up, or how could they end up, in the cube's data source model? They are not available in the Dimension Usage window at all.
    2) Others have no links from or to them in the cube's data source diagram, yet Dimension Usage clearly shows that these objects take part in relationships,
    both regular and many-to-many, depending on the object in question.
    We are trying to identify which objects we can 'trim' from the model. I am not sure what I am looking at in this case: perhaps Data Tools has trouble rendering a diagram with 200+ dimensions and
    100+ measure groups (it is a large existing model), and this artifact should not be used to decide which dimension content is obsolete and can be decommissioned.
    We are looking at a cube that has over 40 many-to-many relationships to a single fact, not counting regular relationships.
    Am I looking at a feature, or perhaps at an issue that has somehow developed in this particular cube's model?
    It is not something I have come across before.
    thanks

    Hi,
    According to your description, you are designing a large SQL Server Analysis Services cube, and the problem is that you cannot see the relationships in the data source
    view diagram, right?
    You said that you can see the relationships in the Dimension Usage window. In your scenario, can you see the relationships under the Tables pane when you expand
    a table's Relationships folder?
    In general, relationships in the data source view are created automatically if they were already defined in the cube's underlying data source, while the dimension relationships between cube dimensions
    and measure groups are defined manually in Dimension Usage. In your scenario, you can also define relationships manually in the data source view; please refer to the link below for details.
    http://technet.microsoft.com/en-us/library/ms178448.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • How to do reconciliation of ODS data and cube data

    Hi All,
    How do I reconcile ODS data with cube data? I know how to reconcile R/3 data with BW data.
    Regards.
    hari

    Hi,
    Create a MultiCube based on your ODS and cube, identify the common characteristics and set up the key figure selections; then create a query showing the information from both the cube and the ODS, perhaps with a formula showing the differences between the key figures (a rough sketch of the same idea outside BEx follows this reply).
    hope this helps...
    Olivier.
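
    If you want to cross-check the same reconciliation outside BEx, the idea of "common characteristics plus a difference formula on the key figures" can be sketched in a few lines of Python with pandas, assuming both the ODS and the cube contents have been exported to CSV; the file names, characteristics and key figure below are placeholders:

import pandas as pd

ods = pd.read_csv("ods_export.csv")
cube = pd.read_csv("cube_export.csv")

chars = ["MATERIAL", "PLANT"]   # common characteristics (placeholders)
key_figure = "AMOUNT"           # key figure to reconcile (placeholder)

# Aggregate each source by the common characteristics.
ods_sum = ods.groupby(chars, as_index=False)[key_figure].sum()
cube_sum = cube.groupby(chars, as_index=False)[key_figure].sum()

# Equivalent of a query formula "ODS amount - cube amount".
recon = ods_sum.merge(cube_sum, on=chars, how="outer",
                      suffixes=("_ods", "_cube")).fillna(0)
recon["difference"] = recon[key_figure + "_ods"] - recon[key_figure + "_cube"]

print(recon[recon["difference"] != 0])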

  • Logical Standby Data Consistency issues

    Hi all,
    We have been running a logical standby instance for about three weeks now. Both our primary and logical are 11g (11.1.0.7) databases running on Sun Solaris.
    We have off-loaded our Discoverer reporting to the logical standby.
    About three days ago, we started getting the following error message (initially for three tables, but since this morning for a whole lot more):
    ORA-26787: The row with key (<column>) = (<value>) does not exist in table <schema>.<table>
    This error implies that we have data consistency issues between our primary and logical standby databases, but we find that hard to believe,
    because the "data guard" status is set to "standby", implying that the schemas being replicated by Data Guard are not available for user modification.
    any assistance in this regard would be greatly appreciated.
    thanks
    Mel

    It is a bug: Bug 10302680. Apply the corresponding Patch 10302680 to your standby database.

  • Data Consistency when Copying/ Refreshing ECC 6.0 and SRM-SUS 5.0 Systems

    Hello,
    We are planning a refresh / system copy of an ECC 6.0 and an SRM-SUS 5.0 system.
    The refreshes will be performed from backups of the production systems, restored onto the QA landscape.
    I have referenced the following SDN thread, which gives some guidelines on how to refresh R/3 and SRM systems and maintain data consistency between them using BDLS and by changing the entries that correspond to the backend RFC destinations:
    Is there a process/program to update tables/data after System Refresh?
    That thread is fairly old and relates to earlier versions of R/3 (4.7) and SRM (3.0). We have heard that at higher system versions there may be technical reasons why a refresh can't be performed.
    Does anyone have experience of completing successful refreshes of landscapes that contain ECC and SRM systems at higher SAP versions (ideally ECC 6.0 and SRM-SUS 5.0)? Does anyone know whether it is technically possible?
    Are there any additional steps that we need to be aware of at these higher SAP versions in completing the copy to ensure that the data remains consistent between ECC and SRM?
    Thanks
    Frances

    I have seen this somewhere in the forum; see if it helps you:
    BDLS: conversion of the logical system names (SRM).
    Check the entry in table TWPURLSVR.
    Check the RFC connections (R/3 and SRM).
    In SPRO, check or set the following points:
    Set up the distribution model and distribute it
    Define backend system
    Define backend system per product category
    Settings for vendor synchronization
    Number ranges
    Define objects in backend system
    Define external services (catalogs)
    Check workflow customizing: SWU3, settings for tasks
    SICF: maintain the services BBPSTART and SAPCONNECT
    SE38:
    Run SIAC_PUBLISH_ALL_INTERNAL
    Run BBP_LOCATIONS_GET_ALL
    Update vendors with BBPUPDVD
    Check Middleware if used.
    Run BBP_GET_EXRATE.
    Schedule jobs (bbp_get_status2, clean_reqreq_up)
    Convert attributes with RHOM_ATTRIBUTE_REPLACE

  • How can I extract non-cumulative cube data to other cubes?

    Dear Expert,
             I copied the inventory cube 0IC_C03 to a new cube, ZIC_C03, and then loaded the data of 0IC_C03 into the new cube ZIC_C03.
    The load was successful, but the report results on ZIC_C03 were not correct. After looking into the problem, I found that
    the new cube correctly receives the data related to "Goods movement" and "Revaluation", yet the extraction of the "Opening Balance" does not seem to work.
             I have read through Note 375098, but I could not find the InfoObject 0RECORDTP in my start routine as mentioned in the solution,
    and the note is not marked as valid for the BW 3.5 release.
             How can I extract non-cumulative cube data to another cube correctly in BW 3.5 SP20?
       Thanks!

    Hi,
    Check the routines in the update rules to see whether the same code is written in the new cube.
    cheers,
    Satya

  • Error occurred while writing cube data

    Hi all,
    I was working in the Manage Ownership section. When I tried to change [PCON] or [POWN] values and save, I got an error message saying "Error occurred while writing cube data". After some googling I found out that this is because [PCON] and [POWN] are system accounts and hence we cannot submit data to them.
    Is there any way I can resolve this issue? I have 2-3 entities which have minority interests and do not have 100% ownership/consolidation. Please help me out with this.
    Regards,
    Ramith

    It should work; check if this helps: page 133 of http://docs.oracle.com/cd/E12825_01/epm.111/hfm_user.pdf

  • Error occurred while writing cube data - PCon

    Hi,
    Could you help me? I have created a new application, but when I try to enter and save PCon values, I get the message "Error occurred while writing cube data." What could be the reason? I was able to update other data forms.
    Thank you in advance

    It should work; check if this helps: page 133 of http://docs.oracle.com/cd/E12825_01/epm.111/hfm_user.pdf

  • Check cube data

    I have a web query that I want to run. It asks for certain variables, e.g. a date.
    How can I check the cube data to see which valid dates have been posted in the cube,
    so that I can enter one of them for the date variable?
    thanks

    Hello MaGic,
    How are you?
    Manage the cube, check 0CALDAY (date), and then enter the parameters accordingly.
    Best Regards....
    Sankar Kumar
    +91 98403 47141

  • Error in SP12 TemSe: Data Consistency Check

    Dear All,
    I am getting errors in the SP12 TemSe data consistency check. Please advise. These are only a few of them; there are many more, in the thousands. One more thing: we have recently upgraded from ECC 5 to ECC 6.
    Please suggest how to solve it.
    Consistency check of table TST01 / TemSe objects
    System PRD 12.04.2010 12:22:02
    Clt TemSe object name Part Comments
    700 BDCLG001048620722636 1 Unable to access the related file
    700 BDCLG001336417070836 1 Unable to access the related file
    700 BDCLG366544424095583 2 Unable to access the related file
    700 BDCLG366544424095583 3 Unable to access the related file
    700 JOBLGX23303906X11768 1 Object reset
    0 JOBLGX_ZOMBIE_X00304 1 Object reset Length = 0
    0 JOBLGX_ZOMBIE_X00773 1 Object reset Length = 0
    0 JOBLGX_ZOMBIE_X01080 1 Object reset Length = 0
    0 JOBLGX_ZOMBIE_X02571 1 Object reset Length = 0
    Regards,
    Kumar

    Hi,
    What was the solution for this post?
    Please tell me; I am also getting the same error.
    Should I delete all of these objects?
    regards
    krishna

  • How to make cube data available for reporting

    Hello,
    I have loaded data into an InfoCube from a flat file and created a query on it using BEx Analyzer. The query does not return results and gives the error message "No Applicable Data Found".
    Padmanabha Rao had the same problem. In his thread, the expert recommends checking whether the data is available for reporting by right-clicking the InfoCube -> Manage -> Requests tab; ideally, you will see an indicator (a query icon) if it is available for reporting.
    My cube data is not available for reporting. How can I make it available?
    Regards,
    Tejas.

    Normally, when data is loaded into a cube it is immediately available for reporting, unlike a DSO, where you need to activate the data first. There could be several reasons why the data is not available for reporting:
    1. There may be aggregates created on the cube into which the data has never been rolled up. If you see a summation sign next to the cube, aggregates exist and you need to fill them before the data becomes available for reporting in the cube. To do that, go to Manage -> Rollup and start the rollup; that will fill the aggregates.
    2. If there is a request in the cube that is still red, then no request after it will be available for reporting until you delete the red request.
    If the problem still persists, I would simply delete all the data from the cube and reload it.
    thanks.
    Wond

  • How to export a single cube's data from the SAP repository

    Hi ,
    I have a requirement to export the data of a single cube (there are many cubes in the SAP repository) as an XML file, a .csv file or a flat file.
    I am also looking for how to query a cube.
    Thanks in advance ,
    Ramakrishna Thota

    Hi RK,
    1. You can use the Open Hub service to export data into CSV or file format (a sketch for converting such a CSV to XML follows this reply).
    2. You can also use the RSCRM_REPORT transaction to export data into a file; for this you need to create a query first.
    3. You can also use an APD (Analysis Process Designer) to generate a file from your cube data.
    thanks
    Ramesh Babu
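
    Open Hub and APD typically produce CSV files or database tables rather than XML, so if XML is the required target the exported file has to be converted afterwards. A minimal sketch in Python, assuming a hypothetical CSV export with a header row (the file names and the record tag are placeholders):

import csv
import re
import xml.etree.ElementTree as ET

def tag(name):
    # Turn a CSV column header into a valid XML tag name
    # (BW field names such as /BIC/ZAMOUNT or 0CALDAY are not valid as-is).
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name.strip())
    return cleaned if cleaned[:1].isalpha() else "F_" + cleaned

root = ET.Element("cube_data")
with open("open_hub_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        rec = ET.SubElement(root, "record")
        for field, value in row.items():
            ET.SubElement(rec, tag(field)).text = value

ET.ElementTree(root).write("cube_data.xml", encoding="utf-8", xml_declaration=True)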

  • BPC 7.0 NW cube data

    Q1> What are the BPC consolidation-dependent settings required in FICO?
    Q2> BPC 7.0 NW needs to collect data from BW; the corresponding R/3 system is release 4.7. If the BW data needs to be supplied to BPC for both legal and management consolidation, which BW cube data would be ideal to use?

    Hi ManiIyer,
    Here are some suggestions that you may find useful.
    --> Use the standard BI extractors and InfoCubes, or create or copy BI InfoCubes or custom extractors.
    --> Design the BPC consolidation application based on your requirements. You need to set up the consolidation environment in BPC.
    --> Load data into the BI InfoCubes using SAP BI ETL.
    --> Move data from the BI InfoCubes to the BPC cubes using the standard BPC packages.
    --> You can also define custom packages based on your requirements.
    You need to do all the consolidation configuration in SAP BPC, following the SAP BPC consolidation procedures and requirements.
    It is quite different from SEM-BCS, but it has many features.
    -Sreekanth
