No data in cube

Hi,
I created a mapping for a cube. Debugging shows that everything works and the data loading process completes, but in the end there is no data in the cube. Could someone tell me what the problem might be? Thanks very much.
Also, does Warehouse Builder have something like DBMS_CUBE_LOG, so that I can check the rejected data?

I have the same problem. When I right-click on the cube, click 'Data' and then 'Execute', I don't see anything. It's a ROLAP cube.
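If the cube is deployed into an Oracle 11g analytic workspace, the DBMS_CUBE_LOG package mentioned above can create log tables that record how each build went and which source rows were rejected. A minimal sketch, assuming 11g and the default log table names; exact procedure parameters and log columns should be checked against your database version, and for a pure ROLAP deployment OWB just loads a relational fact table, so inspect that table directly:

-- Sketch only: assumes Oracle 11g OLAP and the default log table names.
-- One-time setup: create the build log and the rejected-records log tables
-- (depending on settings you may also need DBMS_CUBE_LOG.ENABLE; see the docs).
BEGIN
  DBMS_CUBE_LOG.TABLE_CREATE(DBMS_CUBE_LOG.TYPE_BUILD);
  DBMS_CUBE_LOG.TABLE_CREATE(DBMS_CUBE_LOG.TYPE_REJECTED_RECORDS);
END;
/

-- After the next load, check how the build finished ...
SELECT * FROM cube_build_log;

-- ... and which source rows were rejected. An empty result means nothing was
-- rejected, so the problem is more likely in the mapping or in how the cube
-- is queried than in the load itself.
SELECT * FROM cube_rejected_records;

-- For a ROLAP cube there is no AW build log: query the relational fact table
-- that the OWB mapping loads (hypothetical table name).
SELECT COUNT(*) FROM sales_fact;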

Similar Messages

  • Data in cube is different from psa in the production system

    Hi friends,
    This is very urgent. The data in the PSA is fine and the same as in R/3. For example, I have sales (billing) data for one article from 2LIS_13_VDITM, which picks the data up from R/3. When I look at the records in the PSA they are good, but when I try to find the same record in the cube it is not available there. Only a few records are filtered out between the PSA and the cube, which is leading to a lot of data inconsistency. There are no custom routines that could filter out the data, only the standard SAP routines that update the data to the cube. What could be the problem? Any help is appreciated and will be rewarded. Thanks in advance for your kind replies.

    Veda,
    In a cube the data gets added up for similar records.
    Do you have the same number of records in the PSA and in the cube?
    If yes, then maybe similar records exist and the key figure is getting summed up in the result.
    Also, how did you search for the same record in the cube? The characteristics go into the dimension tables, and all that is in the fact table is dimension IDs and key figures.
    Do one thing: run a report on the cube and check whether the data is getting summed up.
    Another workaround: put the PSA data into an Excel (flat) file and upload it into an ODS with the same records in production; you will then see whether multiple records exist and thereby find out what the problem is due to. (A bad workaround, but there is not much else you can do in production.)
    Arun
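    If you export the PSA content to a flat table as suggested above, the duplicate check can be done with one aggregation. A sketch, using a hypothetical helper table PSA_EXPORT and hypothetical column names; use whatever fields form the logical key of a 2LIS_13_VDITM billing item:

    -- Hypothetical table and column names: rows that share the same logical key
    -- are added up in the cube, which makes single records seem to 'disappear'.
    SELECT doc_number,
           item,
           COUNT(*)      AS rows_in_psa,   -- > 1 means the cube aggregates them
           SUM(quantity) AS total_quantity -- the value a cube report would show
    FROM   psa_export
    GROUP  BY doc_number, item
    HAVING COUNT(*) > 1;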

  • Data mart cube-to-cube copy: records are not matching in target cube

    Hi experts,
    I need help on the questions below for a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes: cube A1, sourced from the R/3 system (delta update), and cube B1, sourced from cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis did a copy of the back-end R/3 system from production to the quality server approximately two months back.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from cube A1) I am not getting the full volume of data; I get only a meagre amount, although the load shows a successful status in the monitor.
    We tried giving conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for that particular cube, we tried other cubes sourced through the Myself system, and those also get meagre data rather than the full set.
    For example: if 1,000 records are available for an employee, the system extracts only some 200 of them at random.
    Any quick reply will be very helpful. Thanks.

    Hi Venkat,
    Did you do any selective deletions in cube A1?
    First reconcile the data between cube A1 and cube B1:
    match the totals of cube A1 with cube B1.
    Thanks,
    Vijay.

  • Delete data from Cube which is uploaded to Open hub

    Hello friends,
    I need to delete the data from a GL cube, but it is connected to two open hub destinations, one of which has the table /BIC/OHZZIGL_C10 defined in the open hub destination. I deleted the data from this table in SE14, but when I try to delete the data from the cube it still throws the error below. Please help me: how do I delete the data from the cube now, and should I go into the open hub and delete all the uploaded requests first?
    Request 1.882 already updated in target 3.913 by DTP request DTPR_4CU6JHRJ7MY889A7XBUJ0GDVK(ZZIGL_C10)
    Thanks,
    Soniya

    Just check the data mart status for that request: go to the Manage screen of your InfoCube and check, via the data mart status, where that request has been updated.

  • Loading data from Cube to ODS

    Hi All,
    I have a cube which contains project data. I want to load the data from the cube into a write-optimized ODS. In the cube I have multiple records for a project, but while loading the data into the ODS I have to summarize it so that there is one record per project. For example, if the cube has the following data:
    Project    Cost
    abc        100
    abc        200
    abc        300
    Then in the ODS the record should be:
    Project    Cost
    abc        600
    How do I achieve this?
    Thanks,
    Satya

    Hi Satya,
    Generate an export DataSource from the cube (right-click on the cube -> Generate Export DataSource).
    Create update rules for the ODS; in the update rule for Cost you will find two options:
    1. Overwrite
    2. Addition
    Choose Addition and activate the update rules. This will work exactly the way you want.
    One more thing: put the Project InfoObject into the key fields of the ODS and all the other InfoObjects into the data fields.
    Hope that helps.
    Regards
    Mr Kapadia
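    In relational terms, the Addition option combined with Project as the only key field behaves like the aggregation below (illustrative SQL over a hypothetical staging table; the real write-optimized ODS tables are generated /BIC/A* tables):

    -- Hypothetical staging of the cube records from the question.
    SELECT project,
           SUM(cost) AS cost     -- 'Addition' update rule: 100 + 200 + 300 = 600
    FROM   cube_records
    GROUP  BY project;           -- Project is the only key field of the ODS
    -- Result: abc | 600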

  • Urgent: Error in uploading data in cube after adding new Key figure

    Hi Guys,
    I have added one key figure (DEC) to my existing cube and proceeded as below:
    1. Deleted all data from the cube
    2. Activated the cube
    3. Set constant 1 for this key figure
    4. Activated the transformation
    5. Created a new DTP
    But when I execute it, it gives me an error while updating to InfoCube ZSDBILL:
    Error while updating to target ZSDBILL
    Process terminated.
    All the other DTP steps (extraction, error handling and transformation) are successful, and there are no details in the long text of these errors.
    Can anyone suggest what the error stands for or what I am missing?

    Log out and log back in, then activate the cube and the transformation again and run the DTP.
    Hope it helps you.

  • Dead lock error while updating data into cube

    We have a scenario of a daily truncate and upload of data into a cube, with volumes of about 2 million records per day. We use the parallel processing setting (PSA and data targets in parallel) in the InfoPackage to speed up the data load, and the entire process runs through a process chain.
    We are facing a deadlock issue every day. How can we avoid this?
    In general, deadlocks occur because of degenerated indexes when the volumes are very high, so my question is: does deleting the cube's indexes every day, along with the 'Delete data target contents' process, help to avoid the deadlock?
    We also observed that updating values into one InfoObject is taking a long time, approximately 3 minutes per data packet. That InfoObject is placed in a dimension defined as a line-item dimension, because the volumes for that specific object are very high.
    So that is the overall scenario. Two things:
    1) Will deletion and recreation of the indexes help to avoid the deadlock?
    2) Any idea why the insertion into the InfoObject is taking so long? (A direct read on the SID table of that object was observed in the SQL statement.)
    Regards.

    Hello,
    1) Will deletion and recreation of the indexes help to avoid the deadlock?
    Answer:
    To avoid this problem, drop the indexes of the cube before uploading the data and rebuild them after the load.
    Also:
    In SM12, find out which process is holding the lock and delete it.
    In SM66, find the process that has been running for a very long time and stop it.
    Check transaction SM50 for the number of work processes available in the system. If they are not adequate, you have to increase them with the help of the Basis team.
    2) Any idea why the insertion into the InfoObject is taking so long?
    Answer:
    A line-item dimension is one of the ways to improve data load as well as query performance by eliminating the need for a dimension table, so while loading and reading there is one less table to deal with.
    Check in the transformation mapping of that characteristic whether any routine or formula is written; if so, it can lead to more processing time for that InfoObject.
    Storing mass data in InfoCubes at document level is generally not recommended, because when data is loaded a huge SID table is created for the document number line-item dimension.
    Check whether your InfoObject is similar to a document number.
    Regards,
    Dhanya
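    For point 1, the index deletion and index creation steps of the process chain (or the Manage -> Performance tab) are the supported way to do this; at database level they boil down to something like the statements below, shown for a hypothetical cube ZSALES on Oracle purely as an illustration:

    -- Illustration only: in practice use the process chain steps, not direct DDL.
    -- BW secondary indexes on the F fact table follow the <table>~0xx naming
    -- pattern, and the dimension key columns are named KEY_<cube><dim>.
    DROP INDEX "/BIC/FZSALES~010";

    -- ... daily truncate and load runs here, without index maintenance
    --     competing with the mass inserts ...

    CREATE BITMAP INDEX "/BIC/FZSALES~010"
        ON "/BIC/FZSALES" ("KEY_ZSALES1");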

  • Getting an error while trying to load data into cube

    Hello,
    I am trying to load data into a cube but it gives me the following error:
    fiscal year variant not expected.
    Can someone tell me what can be done about the fiscal year variant?
    Cheers
    Jim.

    Hi,
    Maybe you have not mapped anything to the fiscal year variant, or have not included it in the cube even though you use the fiscal period in it.
    You have to load the fiscal year variant if you want to use the fiscal period.
    Map the fiscal year variant to the constant K4, or from the source if it is present, and then check the result.
    Generally it is K4, except for FI-CO cubes.
    In the transfer rules, select the field 0FISCVARNT, click 'Tp', and enter K4 as the constant. Save and activate.
    I am sure this will fix the issue.
    Regards,
    Ray

  • While deleting the data from cube, getting the following errors

    While deleting the data from the cube, the error below occurred in development. Any help?
    I have 4 requests: 2 were deleted but had no records, and 2 are not deleted and have some data.
    Please give me your inputs.
    Performing check and potential update for status control table
    Message no. RSM1490
    Diagnosis
    If data is loaded into an InfoCube, or existing data is edited (aggregated/compressed/deleted/fetched by a DataMart), then there is a change in the potential reportability of the data, or in the possibility of deleting data by request, or of aggregating or compressing.
    This status of each one of these Cubes is stored in a status table, that is updated when there is any change to the status of a request in the Cube.
    The system now analyzes the requests in the Cube and compares the calculated status with the status table.
    If deviations from the status table arise then you are given the option of adjusting the status table.
    System Response / Procedure / Procedure for System Administration

    Hi,
    Did you try right-clicking on the cube and deleting the data? Are those requests compressed or rolled up?

  • While loading transaction data into a cube, what tables are generated

    Hi,
    While loading transaction data into a cube, what tables are normally generated?

    Hi,
    Normally the data is loaded into the F fact table (/BIC/F<cube name>).
    When you compress a request, its data is moved into the E fact table (/BIC/E<cube name>).
    Regards,
    Siva.
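    A quick way to see where the requests of a custom cube currently sit, say for a hypothetical cube ZSALES, is to look at the two generated fact tables directly (SE16 or SQL); the names below simply follow the standard /BIC/F* and /BIC/E* pattern:

    -- Uncompressed requests live in the F fact table ...
    SELECT COUNT(*) AS f_rows FROM "/BIC/FZSALES";

    -- ... and move to the E fact table once the requests are compressed.
    SELECT COUNT(*) AS e_rows FROM "/BIC/EZSALES";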

  • How to do reconciliation of ODS data and cube data

    Hi All,
    How do I reconcile ODS data with cube data? I know how to reconcile R/3 data with BW data.
    Regards,
    Hari

    Hi,
    Create a MultiCube based on your ODS and cube, identify the common characteristics and perform the key figure selections; then create a query showing the information from both the cube and the ODS, perhaps with a formula showing the differences between the key figures.
    Hope this helps,
    Olivier.
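    For a quick spot check the same reconciliation can also be sketched directly in SQL, assuming a hypothetical ODS ZSALES_O (active table /BIC/AZSALES_O00), a cube ZSALES and an amount key figure; the MultiCube query above remains the cleaner BW-side solution:

    -- Hypothetical object and field names: compare the totals of the two providers.
    SELECT 'ODS'  AS provider, SUM(amount) AS total FROM "/BIC/AZSALES_O00"
    UNION ALL
    SELECT 'CUBE' AS provider, SUM(amount) AS total FROM "/BIC/FZSALES";
    -- If the cube has been compressed, include the E fact table "/BIC/EZSALES" too.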

  • What is the difference between methods of deleting data from cube

    Hi,
    Are there any differences, pros and cons, between deleting data from a cube by:
    1) deleting a specific request loaded into the cube
    2) using selective deletion with specific characteristic values
    The assumption is that both options delete the same data set.
    Thanks and regards.

    Hi BW Beginner,
    Yes, of course, there are differences.
    Selective Deletion:
    PROS: You can use it to delete data that satisfies certain criteria (e.g. specific characteristic values).
    CONS: I think this locks the whole cube during the deletion.
    Deletion by request ID:
    PROS: You can easily delete all data that belongs to a certain request ID (i.e. you can easily back out a bad load).
    CONS: If you have already compressed the cube, you can no longer use deletion by request ID to delete a specific request.
    Hope this helps.

  • Getting  error in PSA while trying to load data in cube

    Dear Friends,
    I am trying to load data into a cube and I am getting an error in the PSA, i.e.
         Value 'Boroplus Anticream 19Gm Regular Plain ' (hex. '426F726F706C757320416E7469637265616D203139476D2052') of characteristic .
    please guide me.
    Thanks in Advance.
    Best Regards
    Rafeeq

    Value 'Boroplus Anticream 19Gm Regular Plain ' (hex. '426F726F706C757320416E7469637265616D203139476D2052') of characteristic .
    Your data is in lowercase. Either change the data to uppercase in the source,
    or delete the request in the cube and edit the PSA data from lowercase to uppercase if there are only a few records,
    or check the 'Lowercase letters' setting on the InfoObject.
    Regards
    Manga
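    A quick way to find the offending rows before reloading is a simple case comparison on the source side; a sketch with a hypothetical source table and columns:

    -- Hypothetical names: values that are not pure uppercase will be rejected
    -- unless the 'Lowercase letters' flag is set on the InfoObject.
    SELECT material, material_desc
    FROM   source_material_texts
    WHERE  material_desc <> UPPER(material_desc);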

  • Reg:loading data into cube getting error

    Hi,
    When I am loading data into the cube I am getting the following error:
    record 1: fiscal year variant not expected.
    error 4 in update.

    Recently, I experienced the following error:
    fiscal year variant K4 not expected.
    The following process fixed my issue:
    In the transfer rules, select the field 0FISCVARNT, click 'Tp', and enter K4 as the constant. Save and activate.
    Also check the following:
    In R/3 (SE16), make sure K4 exists in table T009.
    In BW, select the source system and choose 'Transfer global settings'.
  • Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)

    Hi all,
    I am getting the following error while loading the data into the cube:
    "Time conversion from 0CALDAY to 0FISCPER (fiscal year V3) failed with value 10081031"
    Amit Shetye

    Hi Amit,
    This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for the year 1008 (0CALDAY values are in YYYYMMDD format, so 10081031 is read as year 1008).
    Maintain the calendar for year 1008 and transfer the global settings from the source (R/3):
    RSA1 --> Source systems --> context menu --> Transfer global settings --> choose fiscal year variants and calendar --> Execute.
    Hope it Helps
    Srini

  • Frequently occurring errors in loading data to a cube

    Hi SAP gurus,
    Can you give me the frequently occurring errors in loading data to a cube?
    Giri

    Hi Giri,
    There are thousands of errors which can occur in any kind of environment. Some of the most common ones:
    1. SID missing
    2. No alpha-conforming values found
    3. Replicate DataSource error
    4. BCD_OVERFLOW error
    5. Update rule inactive
    6. Duplicate master data found
    7. RFC connection lost
    8. Invalid characters while loading
    9. ALEREMOTE user is locked
    10. Lowercase letters not allowed
    11. 'Record ...' message while loading: the field mentioned in the error message is not mapped to any InfoObject in the transfer rule
    12. Object locked
    13. "Non-updated IDocs found in source system"
    14. One of the data packages gets a red-light error message while loading master data
    15. Extraction job aborted in R/3
    16. Repeat of last delta not possible
    17. DataSource not replicated
    18. DataSource / transfer structure not active
    19. IDoc or tRFC error
    Please find these links to help yourself out:
    Re: BW production support
    Production support issues
    https://forums.sdn.sap.com/click.jspa?searchID=1844533&messageID=1842076
    https://forums.sdn.sap.com/click.jspa?searchID=678788&messageID=1842076
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
    /people/valery.silaev/blog/2006/10/09/loading-please-wait
    Help on "Remedy Tickets resolution"
    Re: What is caller 01?
    Check Siggi's weblog for common errors in data loading:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Regards,
    Sreedhar
