Data comparison between two InfoCubes in two different systems

Hello all,
We have two 3.0B BW systems, and each contains an InfoCube called IC1.
My requirement is to compare the number of records in both cubes across the two BW systems. In the first BW system, we compress/collapse the InfoCube every week.
If I compare the fact table record counts on the Manage --> Contents tab, is that an accurate way to compare the number of records? Or is there another standard way? Please let me know.
Many Thanks,
Ravi

Hi Ravi,
Try RSA1 -> InfoCube -> Manage -> Requests and look at the column 'Inserted Records'. Before that, set the from-date of the request display to a very early date.
regards
Siggi
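Siggi's check (summing the 'Inserted Records' column across all requests of each cube) can be sketched outside BW. A minimal Python illustration, assuming the request lists have been exported from both systems; the field names here are made up for illustration, not BW API names:

```python
# Sketch: compare total inserted records between two systems' request lists.
# Each request is represented as a dict with an "inserted" count
# (illustrative structure, not a real BW interface).

def total_inserted(requests):
    """Sum the 'Inserted Records' column across all requests of a cube."""
    return sum(r["inserted"] for r in requests)

def compare_cubes(requests_a, requests_b):
    """Return (total_a, total_b, difference) for the two systems."""
    a, b = total_inserted(requests_a), total_inserted(requests_b)
    return a, b, a - b

system_a = [{"request": "REQU_1", "inserted": 1000},
            {"request": "REQU_2", "inserted": 500}]
system_b = [{"request": "REQU_9", "inserted": 1500}]

print(compare_cubes(system_a, system_b))  # (1500, 1500, 0) -> cubes match
```

Comparing inserted records per request sidesteps the compression issue: compression changes the fact table row count without changing the loaded data, so raw fact table counts from the two systems are not comparable.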

Similar Messages

  • Data copy from infocube to infocube in two different BW systems

    Dear All,
If I have an InfoCube for sales in two different BW systems and I want to send the data from one InfoCube to the other, what strategy should we follow? Do I need to send it as a flat file, or is there another mechanism with which I can send the data to the other InfoCube?
Your answers would be rewarded.
    Regards
    Vijay

    hi Vijay,
No, you have no need to send it as a flat file if the two BW systems are connected.
You can use the data mart scenario, where one BW acts as a source system for the other. We have had this discussion several times; you can check the steps in
Loading data from cube1 to cube2
Loading data from one cube to another cube.
    hope this helps.

Short dump problem when loading data from ODS to InfoCube

Hi,
I'm trying to load data from an ODS to an InfoCube, but I get the following error:
    Short dump in the Warehouse
    Diagnosis
    The data update was not completed. A short dump has probably been logged in BW providing information about the error.
    <b>System response
    "Caller 70" is missing.
    Further analysis:
    Search in the BW short dump overview for the short dump belonging to the request. Pay attention to the correct time and date on the selection screen.
    You get a short dump list using the Wizard or via the menu path "Environment -> Short dump -> In the Data Warehouse".
    Error correction:
    Follow the instructions in the short dump.</b>
I looked at the short dump, but the system says there is no short dump for that particular date selection.
Please tell me what I have to do.
I'll assign the points.
Bye
    rizwan

    Hi Rizwan,
Why does this error occur?
• This error normally occurs whenever BW encounters errors it cannot classify. There can be multiple reasons:
o Whenever we load master data for the first time, it creates SIDs. If the system is unable to create SIDs for the records in the data packet, we can get this error message.
o If the indexes of the cube are not deleted, the system may give the Caller 70 error.
o Whenever we load transactional data that has master data as one of its characteristics and the value does not exist in the master data table, we get this error. The system can have difficulty creating SIDs for the master data while also loading the transactional data.
o If an ODS activation is running and another ODS activation is running in parallel, the system may classify the error as Caller 70, because no processes were free for that ODS activation.
o It also occurs whenever there is a read/write on the active data table of an ODS, for example when activation and data loading happen at the same time for the same ODS.
o It is a system error which can be seen under the "Status" tab in the job overview.
What happens when this error occurs?
• The exact error message is: System response "Caller 70" is missing.
• A short dump may also be logged in the system. It can be checked under "Environment -> Short dump -> In the Data Warehouse".
What are the possible corrective actions?
• If master data is being loaded for the first time, we can reduce the data package size and reload the InfoPackage, since processing sometimes depends on the size of the data package. We can also try to split the load into several separate loads.
• If the error occurs in a cube load, we can try deleting the indexes of the cube and then reloading the data.
• If we are loading transactional and master data together and this error occurs, we can reduce the data package size and try reloading, as the system may find it difficult to create SIDs and load data at the same time. Alternatively, we can load the master data first and then the transactional data.
• If the error happens during ODS activation because no processes are free or available for it, we can define processes in transaction RSCUSTA2.
• If the error is caused by a read/write conflict on the ODS, we need to change the scheduled time of the data loading.
• Once we are sure that the data has not been extracted completely, we can delete the red request from the Manage tab of the InfoProvider and re-trigger the InfoPackage.
• Monitor the load for successful completion, and complete any further loads in the process chain.
    (From Re: caller 70 missing).
    Also check links:
    Caller 70 is missing
    Re: Deadlock - error
    "Caller 70 Missing" Error
    Caller 70 missing.
    Bye
    Dinesh

  • How to make this function module as RFC to get data from different system?

    Hi
I am trying to use the following function module. It is used to copy data from one cube to another within the same system; however, I need this to happen across two different systems. How can I make it a remote call to a different system and do the same thing?
    Name of function module -
    RSDRT_INFOCUBE_DATA_COPY
    Any help would be really helpful
    AG

    HI,
Let us say you want to copy the data of a cube in system A to a cube in system B.
1) Create an RFC function module in system B (in this function module, call RSDRI_CUBE_WRITE_PACKAGE to update the data). The RFC function module should have the same parameters as RSDRI_CUBE_WRITE_PACKAGE.
2) Write a program in system A: read the data from the InfoCube using the function module RSDRI_INFOPROV_READ and call the RFC function module in system B that you have created.
For the details of the parameters to pass to these two function modules, look at RSDRT_INFOCUBE_DATA_COPY and take the required code from there.
    Thanks
    mahesh
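The two-step copy mahesh outlines (read the source cube package by package in system A, push each package through an RFC call to system B) can be sketched with plain Python stand-ins. The functions below are illustrative stubs for the flow only, not the real RSDRI calls:

```python
# Sketch of the packet-wise copy: system A reads the cube in packages
# (stand-in for RSDRI_INFOPROV_READ), system B receives each package
# (stand-in for the custom RFC wrapper around RSDRI_CUBE_WRITE_PACKAGE).
# All names here are illustrative stubs, not real RFC interfaces.

def read_infoprov_packages(rows, package_size=2):
    """Yield the source cube's rows one package at a time."""
    for i in range(0, len(rows), package_size):
        yield rows[i:i + package_size]

target_cube = []  # stand-in for the target cube in system B

def rfc_write_package(package):
    """Stand-in for the RFC call that writes one package into system B."""
    target_cube.extend(package)

source_rows = [{"material": "M1", "qty": 10},
               {"material": "M2", "qty": 20},
               {"material": "M3", "qty": 30}]

# The copy loop that the program in system A would implement:
for package in read_infoprov_packages(source_rows):
    rfc_write_package(package)

print(len(target_cube))  # 3 -> all rows copied
```

Reading and writing in packages rather than one big table keeps memory bounded on both sides, which is also how the real RSDRI interfaces are designed to be used.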

  • Infocube turned out differently from PSA

    Hi,
In BI 7.0, I have confirmed that the content of the PSA is what I expected, but the content in the fact table of the InfoCube turned out differently. How do I rectify the mistakes? It should be something to do with the transfer or update rules, but how do I check those?
    Thanks

    Hi Lip,
If you feel that the data in the cube differs from what is in the PSA, it could be that some routines are written in the transformations.
When you check for data reconciliation, you can proceed as follows:
Check the data in the source system --> RSA3.
Check the data in the PSA in BW (SAP BI).
Check the data in the InfoCube.
For test purposes, take a few records, check whether any transformations take place, and then do the reconciliation.
    Hope this helps.
    Regards,
    Tom.
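Tom's three-stage check can be sketched as a simple count reconciliation. A toy Python illustration; in practice the record sets would come from RSA3, the PSA, and the cube, and the stage names below are just labels:

```python
# Toy reconciliation: compare record counts at each stage of the data flow
# against the source system, and flag the stage where records go missing.

def reconcile(stages):
    """Return the stage names whose record count differs from the source."""
    baseline = len(stages["source (RSA3)"])
    return [name for name, rows in stages.items() if len(rows) != baseline]

stages = {
    "source (RSA3)": [1, 2, 3, 4],
    "PSA":           [1, 2, 3, 4],
    "InfoCube":      [1, 2, 3],      # one record lost in the transformation
}

print(reconcile(stages))  # ['InfoCube'] -> investigate the transformation
```

If the counts match at the PSA but not in the cube, the transformation (or update rules) between them is the place to look, which is exactly the situation Lip describes.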

  • Data load failed at infocube level

    Dear Experts,
I have data loads from the ECC source system for DataSource 2LIS_11_VAITM to 3 different data targets in the BI system. The load is successful up to the PSA; when loading to the data targets, it succeeds for 2 of them but fails for one, an InfoCube. I got the following error message:
    Error 18 in the update
    Diagnosis
        The update delivered the error code 18 .
    Procedure
        You can find further information on this error in the error message of
        the update.
Here I tried to activate the update rules once again by executing the program, and tried to reload using the reconstruction facility, but I get the same error message.
    Kindly, please help me to analyze the issue.
    Thanks&Regards,
    Mannu

    Hi,
Here I tried to trigger a repeat delta under the impression that the error would not repeat, but then I encountered these issues:
1. The data load status in RSMO is red, whereas in the data target the status shows green.
2. When I try to analyze the PSA from transaction RSMO, the PSA gives me a dump with the following.
The following analysis is from transaction ST22:
    Runtime Errors         GETWA_NOT_ASSIGNED
    Short text
         Field symbol has not yet been assigned.
    What happened?
         Error in the ABAP Application Program
         The current ABAP program "SAPLSLVC" had to be terminated because it has
         come across a statement that unfortunately cannot be executed.
    What can you do?
         Note down which actions and inputs caused the error.
         To process the problem further, contact you SAP system
         administrator.
         Using Transaction ST22 for ABAP Dump Analysis, you can look
         at and manage termination messages, and you can also
         keep them for a long time.
    Error analysis
        You attempted to access an unassigned field symbol
        (data segment 32821).
        This error may occur if
        - You address a typed field symbol before it has been set with
          ASSIGN
        - You address a field symbol that pointed to the line of an
          internal table that was deleted
        - You address a field symbol that was previously reset using
          UNASSIGN or that pointed to a local field that no
          longer exists
        - You address a global function interface, although the
          respective function module is not active - that is, is
          not in the list of active calls. The list of active calls
          can be taken from this short dump.
    How to correct the error
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "GETWA_NOT_ASSIGNED" " "
        "SAPLSLVC" or "LSLVCF36"
        "FILL_DATA_TABLE"
Here I have activated the include LSLVCF36, reactivated the transfer rules and update rules, and retriggered the data load.
But I am still getting the same error...
Could anyone please help me to resolve this issue?
    Thanks a lot,
    Mannu

  • Unable to load the data from PSA to INFOCUBE

    Hi BI Experts, good afternoon.
I am loading 3 years of data (full load) from R/3 to an InfoCube.
I loaded the data month-wise, so I created 36 InfoPackages.
Everything was fine, but I got an error for Jan 2005 and Mar 2005. It is the same error in both months: Caller 01 and Caller 02 errors (meaning there are invalid characteristics in the PSA data).
So I deleted both the PSA and data target requests, and loaded the data again only up to the PSA.
The data arrived in the PSA without failure.
Then I tried to load the data from the PSA to the InfoCube manually, but it's not happening.
One message came up:
SID 60,758 is smaller than the compress SID of cube ZIC_C03; no request booking.
Please give me a solution to this problem.
      Thanks & Regards
         Anjali

    Hi Teja,
Thanks for the good response.
How can I check whether it is already compressed or not?
Please give me a reply.
      Thanks
              Anjali

  • Data not received in InfoCube / PSA object / InfoObject

    Hi Experts,
I wrote code against a key figure to convert values into the same currency, and assigned the constant USD for the reference characteristic.
Now I tried to reload the data, but it throws the error 'Data not received in InfoCube / PSA object / InfoObject'.
I hope you all understand my problem. If so, please advise.
    Thanks,
    RK
    Edited by: RK on May 21, 2009 6:48 PM

Did you try looking into the short dump for this problem?
    try these links :
    Short Dump "Data not received in InfoCube / PSA object / InfoObject"
    Short Dump "Data not received in InfoCube / PSA object / InfoObject"

  • Data not received in InfoCube / PSA

    Hi Everybody,
Today all the loads failed with the error "Data not received in InfoCube / PSA object / InfoObject". Could you please help with how to solve the problem?
    Regards
    BW

    Hi
First, check that your ODS is active.
Then ensure it has activated data for the selection you want to extract.
What kind of load are you performing, a full or a delta?
In any case, when you fire your InfoPackage from the ODS to the cube, a job named BI_REQ* will be kicked off. Check this job in SM37. If you can open an InfoPackage with your ODS as DataSource, then the DataSource is fine and the transfer program is activated.
Monitor this job and see what happens.
If the job is cancelled for any reason, look for a dump (ST22), a system message in the log (SM21 on all application servers), and a tRFC transaction with an error in BW (SM58).
    Please let me know
    regards,
    Olivier.

  • Planning area data not updated in infocube in APO system

    Hi all,
A user is uploading a flat file with an add-on program to InfoCube 1; from this InfoCube, data for the sales forecast key figure is transferred to a planning area key figure using the standard program RTS_INPUT_CUBE. I can see the updated data in the data views of the planning book (which refers to the same planning area). In the planning book, the sales forecast data is also copied to a second key figure, 'Arrangement FC', and it is correct.
Now there is a second InfoCube (InfoCube 2) that gets data directly from this planning area (the InfoCube also contains both key figures). When I checked InfoCube 2, the updated data is available in the sales forecast key figure, but the arrangement forecast key figure still has the old data, and the user runs a query on the second key figure, so he is getting a wrong report.
Since there is no data flow for InfoCube 2 and it gets its data from the planning area, I believe it is a remote InfoCube.
I have also found the DataSource for this planning area, but I don't know how to proceed to find out why the data is not updating properly. Please help with this.
Two weeks ago the data was updated properly, but this time it is not.
The system version is SAP SCM 4.0.

    Hi Vivek
It is advisable to run the background jobs when the planning books are not being accessed by the users, to prevent such inconsistencies. Hence I would advise you to run the jobs during non-working hours; and if you have a global system, you may restrict the jobs to run at regional level.
In addition, it is good practice to run consistency jobs before and after you have completed the background jobs. I hope you are using process chains to execute the sequence of jobs; if yes, you can add consistency check jobs to the process chains.
Some of the consistency check reports are:
/SAPAPO/OM17 - liveCache Consistency Check
/SAPAPO/TSCONS - Consistency Check for Time Series Network
/SAPAPO/CONSCHK - Model Consistency Check
and so on.
You can find these consistency jobs under APO Administration --> Consistency Checks in APO.
    let me know if this helps.
    Rgds, Sandeep

  • Display data from a virtual InfoCube

    Hi experts,
When I tried to display data from the virtual InfoCube 0FIGL_V40, I got a dump.
    Please help me to solve this problem.

    Dear Akshay,
Is it possible that the problem comes from R/3? When I check the extractor 0FI_GL_40 with transaction RSA3, I get the message:
Errors occurred during the extraction --- Message no. RJ012
I think the problem has been solved by switching on the business function "Reporting Financials" in the SAP source system (transaction SFW5). After turning on Reporting Financials, I have no problem testing the extractor.
Problem solved.
    Many thanks
    Youness
    Edited by: Youness NAJI on Jan 13, 2010 4:39 PM

  • Issue in Data from DSO to DSO Target with different Key

    Hello All,
I am having an issue with data going from a source DSO to a target DSO with a different key.
The source DSO has Employee + Wage + Post number as its key, and the target has Employee + Wage Type as its key.
Does DSO semantic grouping work like a GROUP BY clause in SQL; is my understanding right?
Also, if someone could explain this with a small example, it would be great.
    Many Thanks
    Krishna

As explained earlier, your issue has nothing to do with semantic grouping.
Semantic grouping is only useful when you have written a routine in the transformation for calculations, and in error handling.
Please go through this blog, which explains the use of semantic grouping very clearly:
    http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
    Now coming to your above question
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          cc          300
If we have the semantic group as Employee, Employee as the key of the target DSO, and update type summation,
    then target DSO will have
Emp                Amount
100                 900
200                 200
In this case, the wage type will come from the last record arriving in the data package. If the record 100 cc 300 arrives last, the wage type will be cc.
    2) Case 2
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          aa          300
If we do semantic grouping with Employee and Wage Type, have Employee and Wage Type as the key of the target DSO, and update type summation,
    then target DSO will have
    Emp     Wage      Amount
    100          aa          500
    200          aa          200
    100          bb          400
    Hope this helps .
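The two cases above can be reproduced with a small Python sketch of the update behaviour: key figures are summed over the target key, and the last arriving record wins for non-key characteristics. An illustration only; the field names are made up:

```python
# Sketch of DSO update behaviour with summation: key figures are summed
# per target key; non-key characteristics keep the last record loaded.

def load_to_dso(rows, key_fields):
    """Aggregate (employee, wagetype, amount) rows keyed by key_fields."""
    target = {}
    for emp, wt, amount in rows:
        record = {"employee": emp, "wagetype": wt}
        key = tuple(record[f] for f in key_fields)
        if key in target:
            target[key]["amount"] += amount
            target[key]["wagetype"] = wt  # last arriving record wins
        else:
            target[key] = {"wagetype": wt, "amount": amount}
    return target

# Case 1: Employee alone is the target key
rows1 = [("100", "aa", 200), ("200", "aa", 200),
         ("100", "bb", 400), ("100", "cc", 300)]
case1 = load_to_dso(rows1, ["employee"])
print(case1[("100",)]["amount"])  # 200 + 400 + 300 = 900; wagetype is 'cc'

# Case 2: Employee + Wage Type as the target key
rows2 = [("100", "aa", 200), ("200", "aa", 200),
         ("100", "bb", 400), ("100", "aa", 300)]
case2 = load_to_dso(rows2, ["employee", "wagetype"])
print(case2[("100", "aa")]["amount"])  # 200 + 300 = 500
```

Note that this aggregation is a property of the target DSO's key and update type, not of the DTP's semantic grouping, which is the point of the answer above.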

  • How to move data connections with SOAP web service in different environments in InfoPath Forms 2010

    Hello,
I have an InfoPath form with around 10 SOAP web service data connections. They call a custom web service for a custom business process. The web service URL has a query string parameter which identifies whether it is the Test or the Production web service. The URL looks like this:
    http://server/webservice/wsdl?targetURI=testSPRead (for the Test environment)
    http://server/webservice/wsdl?targetURI=ProdSPRead (for the Production environment)
When I develop the form in the Dev environment, I use the Test web service URL and save the data connections as UDCX files in the data connection library. After completing development, when I deploy this form in Production, I update the URL in the UDCX files in the Production data connection library; but when I run the form in Production, it throws the error 'Error occurred in calling the web service'. After doing more research, I extracted the XSN file, opened the Manifest.xsf file in Notepad, and found references to the 'testSPRead' parameter.
So, in the UDCX file the web service URL ends in 'targetURI=ProdSPRead', but in the Manifest.xsf file there is a reference to the Test web service parameter 'testSPRead', and that is why it throws the error.
For testing purposes, I updated the Manifest.xsf file, replaced all occurrences of 'testSPRead' with 'ProdSPRead', also updated all the relevant data connection files (XML, XSF, etc.), saved the result as Form.xsn, and deployed it in Prod, and it worked.
The question is: is this the right way of doing it? There should be a simpler method for cases where the web service has a conditional parameter to distinguish the Test and Production services.
Does somebody know the right way of doing it? I also thought of adding 'double' data connections, one set for Test and another for Production, and calling them by identifying the current SharePointServerRootURL, but that is a lot of work. I have 10 web service data connections in my form; in that case I would have 20 data connections, and setting their parameters in different rules is too much work.
Please advise. It's very important for me!
    Please advise. It's very important for me!
    Thanks in advance.
    Ashish

    Thanks for your response Hemendra!
I hope Microsoft improves this in subsequent patches of InfoPath 2010 or InfoPath 2013, because I don't think this is a very special requirement. It defeats the purpose of having UDCX files for data connections. Why is the WSDL's parameter value written into the Manifest.xsf and the other XSF and XML files? InfoPath should always take the URL and parameters from the UDCX files.
    --Ashish
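Ashish's manual fix (replacing the environment parameter in every extracted form file) can at least be scripted so it is repeatable. A rough sketch in Python, assuming the extracted XSN contents sit in a local folder; the parameter names are taken from the thread, so treat them as examples:

```python
# Sketch: rewrite the environment parameter in extracted InfoPath form files.
# Walks a folder of extracted XSN contents and swaps the Test parameter
# for the Production one in every .xsf/.xml/.udcx file it finds.
from pathlib import Path

def switch_environment(folder, old="testSPRead", new="ProdSPRead"):
    """Replace old with new in every .xsf/.xml/.udcx file under folder."""
    changed = []
    for path in Path(folder).rglob("*"):
        if path.suffix.lower() in {".xsf", ".xml", ".udcx"}:
            text = path.read_text(encoding="utf-8")
            if old in text:
                path.write_text(text.replace(old, new), encoding="utf-8")
                changed.append(path.name)
    return sorted(changed)
```

This automates the find-and-replace step only; the files still have to be repackaged as Form.xsn and redeployed as described above.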

  • How to delete the data in the compressed  infocube

Hi BI gurus,
We are facing a problem with the Inventory Management InfoCube in BW production.
Every time the inventory cube gets compressed, the data moves from the F fact table to the E fact table.
Now the problem is that we have bad data in the latest five requests in this cube. As we all know, compressed data can't be deleted by deleting requests on the Requests tab; the only way is selective deletion, but I don't find any selection option in the cube. We have PSA data with the correct data for those five requests. Please help: how do we delete the bad data in the InfoCube and load the correct data that we have in the PSA?
    Thanks
    Joe

    Hi André
Thank you for your answer.
What I am saying is that there is an option for selective deletion for the inventory cube,
but I don't find any specific criterion to delete the data by, such as calendar day.
I hope you got my question.
Hi Saveen Kumar,
Thank you again.
We are using the 3.x flow. If we do request reverse posting for all 5 requests that updated incorrect data, do we also need to run compression afterwards?
And how do we reload the data from PSA to the InfoCube? If the request is still available in the InfoCube, it will not allow me to do that.
Can you please tell me in detail, step by step, how to proceed? This is the first time I am doing request reverse posting, and I have to do it in production.
    Thanks in adavance
    Thanks,joe

  • How to upload data from Infoset to Infocube

    Hi Gurus,
I am trying to load data from an InfoSet to an InfoCube, but the transformation will not activate, showing the error "Error while activating transformation", Message no. RSTRAN510.
The InfoSet contains 3 DataStore objects.
How does the data get loaded from the InfoSet to the InfoCube?
How does the delta mechanism work, since I have 3 DSOs?
    Thanks
    Shiva

    Hi,
But we do have the option: while creating a transformation for the InfoCube, we can choose the source to load from, and INFOSET is among the options.
I was able to create the transformation, but not to activate it.
Let me know. Maybe we cannot load from an InfoSet to an InfoCube, but since the option exists, I think there is some way of loading data from an InfoSet to an InfoCube.
