Data Request on Cube

Hi all,
I'm new to IP.
I've created an aggregation level on an InfoCube, built an input-ready query on it, and saved data back to the cube. It works fine, but the data request on the cube still stays yellow.
What settings are needed to turn the yellow request green?
Regards,
Charan.

For IP and BPS this is expected behavior: a real-time InfoCube always keeps one open request, and that request stays yellow. It is closed and set to green automatically once it holds roughly 50,000 records, or when the cube is switched from planning to loading mode.
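
If the open request needs to be closed on demand (for example before extracting from the cube), a commonly posted approach is the standard function module RSAPO_CLOSE_TRANS_REQUEST. A minimal sketch, assuming that function module exists under this name in your release; the cube name ZRTCUBE is hypothetical:

REPORT z_close_open_request.
" Close the open (yellow) request of a real-time InfoCube so that it
" turns green. FM name as commonly documented on SCN; verify it in
" your release before use.
CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
  EXPORTING
    i_infocube = 'ZRTCUBE'.   " hypothetical real-time cube name

Such a program can also be scheduled as a process-chain step after the planning sequence.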

Similar Messages

  • "get all new data request by request" after compressing source Cube

Hi,
I need to transfer data from one InfoCube to another using delta request by request.
I tried this while the data in the source InfoCube was not compressed, and it worked.
After some requests were compressed, the delta now transfers all the information to the target InfoCube in a single request.
Do you know if this is normal behavior?
Thanks in advance

Hi,
Compression deletes all the requests from the F table of your cube and moves the data into the E table; after compression the request-by-request breakdown no longer exists.
This is the reason you are getting all the data in a single request.
'Get data request by request' only works if you do not compress the data in your cube.
If you want to know more about compression, check this:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh.
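
    To see what compression does, you can compare the request-bearing F fact table with the compressed E fact table. A minimal sketch, assuming a custom cube named ZCUBE whose fact tables follow the standard naming /BIC/FZCUBE and /BIC/EZCUBE (both names are illustrative):

    REPORT z_f_vs_e_count.
    " Count records in the F table (uncompressed, kept per request) and
    " the E table (compressed, request information collapsed) of a cube.
    DATA: l_f_cnt TYPE i,
          l_e_cnt TYPE i.
    SELECT COUNT( * ) FROM /bic/fzcube INTO l_f_cnt.
    SELECT COUNT( * ) FROM /bic/ezcube INTO l_e_cnt.
    WRITE: / 'F table records:', l_f_cnt,
           / 'E table records:', l_e_cnt.

    After compression the records move from the F table to the E table, which is why a request-by-request delta no longer finds separate requests.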

  • 'Get All New Data Request by Request' option not working between DSO and Cube

    Hi BI experts,
    Could anyone please tell me why the options 'Get One Request Only' and 'Get All New Data Request by Request' are not working in a DTP between a standard DSO and an InfoCube?
    Scenario:
    I loaded the data year by year (FY 2000 to FY 2009) via InfoPackage into a write-optimized DSO (10 requests), then loaded it request by request into a standard DSO, activating each request. I selected the 'Get All New Data Request by Request' option in the DTPs; it works fine between the write-optimized DSO and the standard DSO, but not between the standard DSO and the cube: when the DTP executes, the 10 requests are combined into a single request from the SDSO to the cube.
    Regards,
    Sari.

    Hi,
    Which of the options below is set in your DTP? It should be 'Change Log', assuming you are not deleting change log data.
    Delta Init. Extraction from...
    - Active Table (with archive)
    - Active Table (without archive)
    - Archive (full extraction only)
    - Change Log
    Also, if you want deltas to work, do not delete the change log; that can cause problems with further updates from the DSO.
    Hope that helps.
    Regards
    Mr Kapadia
    *Assigning points is the way to say thanks*

  • Updating the last date request from ODS to CUBE

    Dear friends,
    Can someone please explain this to me?
    I always update requests from the ODS to the cube. Sometimes there are 3 or 4 requests in the ODS without the data mart status (tick), and when I update to the cube I can only see one request, with the current date. I don't know whether the previous days' requests have been updated to the cube, yet in the ODS the data mart tick then appears for all the requests.
    Also, if I have many requests in the ODS (current date, previous day, and so on), is there a way to update so that I can see all the dates in the cube instead of only one request with the current date?
    When I delete the request from the cube, remove the tick from the ODS, and refresh, the ticks for the current request and all the previous requests suddenly disappear.
    Thanks for your help.
    I will assign complete points.
    Thank you so much.

    Hi,
    If you know how many ODS requests are updated into the InfoCube as one request each day, you can check the added records in the InfoCube: the count should equal the sum of the records of those requests.
    The transferred records can be more than or equal to the added records; it also depends on the design of the update rules.
    Hope this helps you.
    Regards,
    Shikha

  • Data in the cube is showing multiple entries when compared with ODS

    Hello BW Gurus,
    We have a waste report in production planning on a cube and an ODS separately. The same InfoPackage loads both targets (i.e. the same InfoSource), but when we run the report on the cube, the records show multiple entries (the key figures do not match those in the ODS), whereas the ODS records are correct and match R/3. There are six key figures in total, of which four are pulled from R/3 and two are populated in BW.
    An Example:
    The waste report in PP is run for plant 1000, period 12/2005, process order 123456. The operational scrap should be 2.46% and the component scrap 3.00% for material 10000000, but the report shows 7.87% for planned operational waste % and 9.6% for planned component waste %. These values are not correct. The ODS values for order 123456 match the data in R/3 for component and operational scrap.
    There is a start routine on the ODS and also on the cube. I am not good at ABAP, so I am requesting your help.
    Here is the ODS Code:
    tables: /BI0/PPRODORDER.

    loop at data_package.
      " Derive the production version from the order master data;
      " clear it for two special coordination types.
      select single COORD_TYPE PRODVERS
        into (/BI0/PPRODORDER-COORD_TYPE, /BI0/PPRODORDER-PRODVERS)
        from /BI0/PPRODORDER
        where PRODORDER = data_package-PRODORDER
          and OBJVERS   = 'A'.
      if sy-subrc = 0.
        if /BI0/PPRODORDER-COORD_TYPE = 'XXXX'
        or /BI0/PPRODORDER-COORD_TYPE = 'YYYY'.
          data_package-PRODVERS = space.
        else.
          data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
        endif.
      endif.
      " Fall back to the actual release date when the calendar day is initial.
      if data_package-calday = space
      or data_package-calday = '00000000'.
        if data_package-TGTCONSQTY ne 0.
          data_package-calday = data_package-ACTRELDATE.
        endif.
      endif.
      modify data_package.
    endloop.
    Here is the cube code:
    tables: /BI0/PPRODORDER,
            /BIC/ODS.

    TYPES:
      BEGIN OF ys_mat_unit,
        material  TYPE /bi0/oimaterial,
        mat_unit  TYPE /bi0/oimat_unit,
        numerator TYPE /bi0/oinumerator,
        denomintr TYPE /bi0/oidenomintr,
      END OF ys_mat_unit.

    DATA:
      l_s_mat_unit TYPE ys_mat_unit,
      e_factor     TYPE p DECIMALS 5.

    loop at data_package.
      " Derive the production version, as in the ODS routine.
      " Note: the coordination types checked here are 'XXX'/'YYY',
      " while the ODS routine checks 'XXXX'/'YYYY'.
      select single COORD_TYPE PRODVERS
        into (/BI0/PPRODORDER-COORD_TYPE, /BI0/PPRODORDER-PRODVERS)
        from /BI0/PPRODORDER
        where PRODORDER = data_package-PRODORDER
          and OBJVERS   = 'A'.
      if sy-subrc = 0.
        if /BI0/PPRODORDER-COORD_TYPE = 'XXX'
        or /BI0/PPRODORDER-COORD_TYPE = 'YYY'.
          data_package-PRODVERS = space.
        else.
          data_package-PRODVERS = /BI0/PPRODORDER-PRODVERS.
        endif.
      endif.
      if data_package-calday = space
      or data_package-calday = '00000000'.
        if data_package-TGTCONSQTY ne 0.
          data_package-calday = data_package-ACTRELDATE.
        endif.
      endif.
      " Convert the goods-receipt quantity from the ODS to unit 'GSU'.
      data_package-agsu = 'GSU'.
      data_package-agsu_qty = 0.
      select single gr_qty base_uom
        into (/BIC/ODS-gr_qty, /BIC/ODS-base_uom)
        from /BIC/ODS
        where prodorder = data_package-prodorder
          and material  = data_package-material.
      if sy-subrc = 0.
        if /BIC/ODS-base_uom = 'GSU'.
          data_package-agsu_qty = /BIC/ODS-gr_qty.
        else.
          " Prefer the material-specific conversion factor.
          SELECT SINGLE * FROM /bi0/pmat_unit
            INTO CORRESPONDING FIELDS OF l_s_mat_unit
            WHERE material = data_package-material
              AND mat_unit = 'GSU'
              AND objvers  = 'A'.
          IF sy-subrc = 0.
            IF l_s_mat_unit-denomintr <> 0.
              e_factor = l_s_mat_unit-denomintr / l_s_mat_unit-numerator.
              multiply /BIC/ODS-gr_qty by e_factor.
              data_package-agsu_qty = /BIC/ODS-gr_qty.
            ENDIF.
          else.
            " Fall back to generic unit conversion.
            " NB: OUTPUT fills GSU_QTY here, while the rest of the
            " routine fills AGSU_QTY.
            CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
              EXPORTING
                INPUT         = /BIC/ODS-gr_qty
                NO_TYPE_CHECK = 'X'
                ROUND_SIGN    = ' '
                UNIT_IN       = /BIC/ODS-base_uom
                UNIT_OUT      = 'GSU'
              IMPORTING
                OUTPUT        = DATA_PACKAGE-gsu_qty
              EXCEPTIONS
                CONVERSION_NOT_FOUND = 1
                DIVISION_BY_ZERO     = 2
                INPUT_INVALID        = 3
                OUTPUT_INVALID       = 4
                OVERFLOW             = 5
                TYPE_INVALID         = 6
                UNITS_MISSING        = 7
                UNIT_IN_NOT_FOUND    = 8
                UNIT_OUT_NOT_FOUND   = 9
                OTHERS               = 10.
          endif.
        endif.
      endif.
      modify data_package.
    endloop.
    Somehow the AGSU quantity is not being populated in the cube; when I debug the code I can see a clean record in the internal table, but it does not arrive in the cube.
    Your suggestions and solutions would be highly appreciated.
    thanks,
    Swathi.

    Hi Swathi,
    In an ODS we have the option of overwriting as well as addition, but in a cube we only have addition. That is why you are getting multiple entries.
    If you are running a daily full load on the cube, please delete the earlier requests,
    so that at any point in time there is only one full-load request in the cube. Hope this solves your problem.
    Regards,
    Monika

  • Input ready query is not showing loaded data in the cube

    Dear Experts,
    Our input-ready query has the problem that it does not show values that were not entered through the query itself. Is there a setting for input-ready queries so that the query displays the data loaded into the cube as well as the data entered through the query?
    Thanks,
    Gopi R

    Hi,
    Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
    In addition, you can try to delete the OLAP cache for the plan buffer query; use transaction RSRCACHE to do this. The technical name of the plan buffer query is built as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MultiProvider/!!1MultiProvider, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input-ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a MultiProvider, the second case applies. If the input-ready query is defined on a MultiProvider containing aggregation levels, the first case again applies (find the real-time InfoCubes used in the aggregation level).
    Regards,
    Gregor

  • Moving data to archive cube

    Hi experts,
    We are extracting data from R/3 to BW. We keep two years of data in DSO1 and move it into Cube1 (R/3 -> DSO1 -> Cube1); the DSO and cube currently hold the 2007 and 2008 data. We extracted the 1999 to 2006 data from R/3 into a history DSO (DSO2) and loaded it into an InfoCube (Cube2). The flow is as follows:
    Current data (2007 and 2008): R/3 -> DSO1 -> Cube1 (deltas are running)
    History data (1996 to 2006): R/3 -> DSO2 -> Cube2
    Now I want to move the 2007 data into the history DSO and cube.
    I have two options to get this done:
    1. Move selective data from DSO1 to DSO2 and from DSO2 to Cube2.
    2. Move selective data from Cube1 to Cube2; but then I can't see item-level data, only aggregated data in the cube.
    Is there a better approach than these two options? If not, which of the two is best?
    Once I move the data into the history cube I need to delete it from the current DSO and cube using selective deletion. If I delete the data in the DSO and cube, is there any impact on the current delta load?
    I need to do this every year because we want to keep only two years of data in the current InfoCube. Could anyone throw some light on this?
    Thanks,
    Rani.

    Hi Rani,
    On your first question (the best way to move the 2007 data into the history DSO and cube):
    One thing to clarify: data in an InfoCube is not aggregated until you aggregate the cube, so you can still see item-level data in the cube if you don't aggregate it.
    In any case, I think the best option is to follow the flow: move selective data from DSO1 to DSO2 and from DSO2 to Cube2.
    On your second question (whether selective deletion from the current DSO and cube affects the delta loads):
    Selective deletion of data does not affect delta loads, so you can use it.
    A delta load is affected only if someone deletes a delta request without first setting its QM status to red; then the init flag is not set back and data is lost.
    Hope this helps.
    Regards,
    Debjnai

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have the following setup in our system:
    1. A planning BEx query with which the user makes entries and writes data back to the planning cube.
    2. A reporting cube for actuals which gets data from the planning cube above.
    Now we want to automate the data load from the planning cube to the reporting cube.
    This involves two things:
    1. Changing the 'Change real-time load behaviour' setting of the planning cube to Planning.
    2. Triggering the DTP that loads data from the planning cube to the reporting cube.
    We want to automate these two steps.
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's Manage screen, clicked 'Subsequent Processing' and provided the event details (not sure if this is the correct place for the event details).
    3. Wrote an ABAP program that changes the setting of the planning cube ('Change real-time load behaviour' to Loading).
    4. Created a process chain with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    I hoped this would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure it will start the process chain automatically. Any ideas, please?

    Hi,
    Try doing the transformation directly in the input cube by using a characteristic relationship (CR) of type exit. More details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    Hope it helps.

  • Data from planning cube to TEXT FILE IN directories of BW

    Hi all,
    I have two transactional planning cubes in BW. The data in these cubes needs to be uploaded monthly into a single text file so that the APO system can access it for some purpose.
    My question is how to get the data from the transactional cubes into the text file; please explain in some detail. Is it useful to construct an ODS or a basic cube on top of the transactional cubes?
    If I keep the file in a BW directory, can APO access it?
    Please help me out.
    Praveen.

    Hi,
    You could create a MultiCube over your two transactional cubes and extract the data with a query on the MultiCube.
    Make sure you restrict 0REQUID using the variable 0S_RQMRC ('most current data') to be sure you extract the most recently planned data.
    There is no need to convert the transactional cubes to basic cubes.
    To download the query results, I think you have to use the reporting agent or the broadcaster.
    Thanks for your points if you found this helpful or if this solved your problem.
    Best Regards,
    Filip

  • Issues with Writing back data to Planning Cube After adding a new characteristic

    Hello,
    We have two cubes and a MultiProvider on top of them (a reporting cube and a planning cube). An aggregation level has been built on the MultiProvider, and a planning BEx query on top of that.
    Before: things were working fine and we were able to write data back to the planning cube using the BEx query.
    After: we created a new dimension in the planning cube and added a new characteristic (let's call it ZX_KEY). This characteristic holds concatenated values of some of the other characteristics in its master data table (for example: Account_CustomerID_CountryCode_IndustryCode).
    The new characteristic is also present in the reporting cube and has been added and transferred to the MultiProvider.
    The new characteristic was added to the planning BEx query as one of the rows and we executed the query in the Analyzer. After entering the required values for all characteristics (including the newly added ZX_KEY) and the key figure values, we tried saving. At this point the Analyzer throws this error:
    Characteristic combination cannot be assigned to part provider
    Characteristic 'ZX_KEY'; characteristic value 'R123000000_12_US_TX'
    Entered values are incorrect: correct before navigation
    The value we are entering (R123000000_12_US_TX) is picked from the master data table of ZX_KEY by double-clicking and selecting it from the fetched values, so I am not sure why the above error is raised. Request your help on this, please.
    regards,
    Karthik

    Hi,
    Check that all characteristics are correctly assigned at the MultiProvider level.
    Hope it helps.

  • Move data from one cube to another cube

    Hi,
    I am on BW 3.5. I have moved the data from one cube to another and found that the number of records in the original cube does not match the new cube: for example, the original cube contains 8,549 records while the backup cube contains 7,379 records.
    Please help me understand what I need to look at and, if the records are being aggregated, how I can check the aggregated records.
    Regards,
    Tyson

    Dear Tyson,
    Check whether there are update rules involved in your transfer; if so, look into them.
    Below are methods for transferring data from one cube to another completely, without missing data.
    Update rules method:
    If the cube is updated from an ODS, you can create update rules for cube2 and update from the ODS.
    Or you can try the data mart scenario:
    1. Right-click cube1 and choose 'Generate Export DataSource'.
    2. Create update rules for cube2 and assign them to cube1.
    3. In RSA1 -> Source Systems -> 'BW myself', right-click and choose 'Replicate DataSources'.
    4. In RSA1 -> InfoSources, search for the InfoSource '8cube1' (8 plus the cube1 name). If you don't find it, right-click the root node 'InfoSources' and choose 'Insert Lost Node(s)'.
    5. From that InfoSource you will find the assigned DataSource; right-click, choose 'Create InfoPackage', then schedule and run.
    Copy from:
    While creating the new cube, give the source cube's name in the 'Copy from' section. This copies all the characteristics and key figures, and even the dimensions and navigational attributes.
    Another option:
    The steps for copying the contents of one cube to another:
    1. Go to Manage -> Reconstruct on the new cube.
    2. Click the 'selection' button (the red/yellow/blue diamond button).
    3. In the selection screen, give the technical name of the old cube, the request IDs you want to load, and the from/to dates.
    4. Execute, and the new cube will be loaded.
    It's that easy!
    Refer to this link:
    Copying the structure of an InfoCube
    Reward if helpful,
    Regards,
    Bala

  • How to delete selective data in a cube

    Hi all,
    Can anyone tell me how to delete selective data in a cube? For example, what if I want to delete the data for a particular material ID or a particular customer ID (say C400)? And how can data be deleted based on a particular request number?
    Sandy.

    Hi,
    RSA1 > InfoCube > Manage > Contents tab > Selective Deletion.
    Then define your selections for the deletion (material ID or a particular customer ID, say C400).
    Request-based deletions can be done directly from the Requests tab.
    Thanks,
    JituK
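
    If the selective deletion has to run in a program rather than through the UI, a pattern often posted on SCN uses the standard function module RSDRD_SEL_DELETION. A hedged sketch: the parameter and type names are as usually quoted, and the cube name ZCUBE and the C400 selection are illustrative, so verify everything in a sandbox first:

    REPORT z_selective_delete.
    " Sketch only: selectively delete customer C400 from cube ZCUBE.
    DATA: l_thx_sel TYPE rsdrd_thx_sel,  " selections per InfoObject
          l_sx_sel  TYPE rsdrd_sx_sel,
          l_s_range TYPE rsdrd_s_range,
          l_t_msg   TYPE rs_t_msg.       " messages returned by the FM

    l_s_range-sign   = 'I'.
    l_s_range-option = 'EQ'.
    l_s_range-low    = 'C400'.
    l_s_range-keyfl  = 'X'.              " value is in internal format
    APPEND l_s_range TO l_sx_sel-t_range.
    l_sx_sel-iobjnm = '0CUSTOMER'.
    INSERT l_sx_sel INTO TABLE l_thx_sel.

    CALL FUNCTION 'RSDRD_SEL_DELETION'
      EXPORTING
        i_datatarget      = 'ZCUBE'
        i_thx_sel         = l_thx_sel
        i_authority_check = ' '
        i_mode            = 'C'
      CHANGING
        c_t_msg           = l_t_msg.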

  • Error Caller 09 contains error message - Data Marts loading (cube to ODS)

    Dear all,
          Please help me with this problem; it is very urgent.
          I have a process chain that loads data within BW only, through data marts. In that chain, one process loads data from a cube (created by us) to an ODS (also created by us). The data is loaded with a full update for the period specified in the 'Calendar Day' field of the data selection.
          Previously I was able to load data for two months, but some days ago the extraction process suddenly got stuck in the background for a long time and showed the following error:
              Error message from the source system
              Diagnosis
             An error occurred in the source system.
              System Response
             Caller 09 contains an error message.
             Further analysis:
             The error occurred in Extractor . 
             Refer to the error message.
             Procedure
             How you remove the error depends on the error message.
             Note
             If the source system is a Client Workstation, then it is possible that the file that you wanted to                load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
          We then killed the process on the server; on another attempt it showed a calmonth/timestamp error. After reducing the data selection period, the load succeeded, and I was able to load data for 20 days. After some days the process got stuck again; I followed the same procedure, reduced the period to 15 days, and continued. Now I can't even load data for 5 days successfully in one attempt: I have to kill the process in the background and repeat it, and then sometimes it loads.
          Please suggest some solutions as soon as possible. I am waiting for your reply; points will be assigned.
             Thanks,
              Pankaj N. Kude
    Edited by: Pankaj Kude on Jul 23, 2008 8:33 AM

    Hi friends!
    I didn't find any short dump for this in ST22.
    What actually happens is that the request continues to run in the background indefinitely. At that time, the Status tab in the process monitor shows these messages:
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system,
    or the maximum wait time for this request has not yet been exceeded,
    or the background job has not yet finished in the source system.
    Current status: in the source system
    The Details tab shows the following messages:
    Overall status: Missing messages or warnings
    Requests (messages): Everything OK
    - Data request arranged
    - Confirmed with: OK
    Extraction (messages): Missing messages
    - Data request received
    - Data selection scheduled
    - Missing message: Number of sent records
    - Missing message: Selection completed
    Transfer (IDocs and tRFC): Everything OK
    - Info IDoc 1: Application document posted
    - Info IDoc 2: Application document posted
    Processing (data packet): No data
    The process runs indefinitely; I then have to kill it on the server, after which the Caller 09 error appears in the Status tab.
    Should I change the value of that memory parameter on the server or not? We are planning to try it today. Does it really relate to this problem, will it help, and what are the risks?
    Please give your suggestions as early as possible; I am waiting for your reply.
      Thanks,
    Pankaj N. Kude

  • Delete overlapping requests in Cube

    Dear all,
    I need to delete the requests uploaded on the previous day when transferring a new request into the cube. The system is BW 7.0, but I use a 3.5 DataSource and InfoSource.
    The issue is that when I

    Tianli,
    You can delete the overlapping request in the InfoCube.
    In the InfoPackage -> Data Targets -> 'Automatic loading of similar/identical requests',
    you can set the parameter for deleting the overlapping request.
    If you are using process chains, there is a process type available under
    'Load process and post-processing' -> 'Delete overlapping requests from InfoCube'.
    Can you post your question again? I see it is incomplete.
    Sasi

  • Delete overlapping requests from cube not working in process chain

    In a process chain, a 'Delete overlapping requests from InfoCube' step is used.
    Before this step, a DTP step runs a full update to load the cube. The process chain is scheduled every day.
    The issue: the process chain failed at the DTP step, and after correcting and repeating, the step executed.
    However, the next step after the DTP, 'Delete overlapping requests from the cube',
    executes but does not delete the previous day's request.
    In the step details there is a message 'No requests for deletion were found'.
    The next day, when the DTP step executes without any problem, the 'Delete overlapping requests from cube' step succeeds
    and the previous requests are deleted from the cube.
    The deletion selections in the 'Delete overlapping requests from InfoCube' step are:
    Delete existing requests
    Conditions:
    only delete requests from same DTP
    Selections:
    same or more comprehensive
    Because of this issue, on the day in question the presence of two days' requests means the data is aggregated and shown doubled in the reports.
    Please help.

    Hi Archana,
    When you delete the bad request from the target, before repeating your DTP in the process chain make sure the bad request is also deleted from table RSBKREQUEST.
    If you find the request in the table, delete it from the table first and then repeat the DTP in the process chain.
    The 'Delete overlapping' step should then work.
    As this is not a permanent solution, please also raise an OSS message with SAP.
    Regards,
    Venkatesh
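
    A quick way to make that check repeatable; a minimal sketch, assuming the standard DTP request table RSBKREQUEST with key field REQUID (verify the field names in your release):

    REPORT z_check_dtp_request.
    " Check whether a DTP request is still registered in RSBKREQUEST
    " before repeating the DTP in the process chain.
    PARAMETERS p_requid TYPE rsbkrequid.   " numeric DTP request id

    DATA l_requid TYPE rsbkrequid.

    SELECT SINGLE requid FROM rsbkrequest INTO l_requid
      WHERE requid = p_requid.
    IF sy-subrc = 0.
      WRITE / 'Request still registered in RSBKREQUEST; remove it first.'.
    ELSE.
      WRITE / 'Request not found; the DTP can be repeated.'.
    ENDIF.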
