Delta load to cubes from Datamart

Hi Gurus:
We load data from an ODS (datamart) to cubes. I did the init and a few delta loads were also done.
I found a bug in the update rule and it has been fixed. I want to delete the small amount of data already in the cube and then do a FULL load from the ODS to the cube. Can I do this, and will it impact the delta setup?
Please suggest. I will be happy to assign points.
Thanks.

Hi,
Yes, you can.
1. Delete the requests loaded from the ODS.
2. Delete the datamart status in the ODS.
3. Load the data from the ODS to the cube with the option Init with Data Transfer.
If you have a huge amount of data, first run a Full load and then an Init without Data Transfer.
Thanks,
-VIjay
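
If you want to double-check that the delta registration survived the cleanup, the init status of the datamart export DataSource can be looked up in the BW system itself. A minimal ABAP sketch follows, assuming the registration sits in table ROOSPRMSC with fields OLTPSOURCE and RLOGSYS; the DataSource name 8ZMY_ODS and logical system BWPCLNT100 are placeholders:

    " Sketch only: ROOSPRMSC / OLTPSOURCE / RLOGSYS are assumed to hold the
    " delta-init registration; 8ZMY_ODS and BWPCLNT100 are placeholders.
    DATA lv_inits TYPE i.

    SELECT COUNT(*) INTO lv_inits
      FROM roosprmsc
      WHERE oltpsource = '8ZMY_ODS'        " export DataSource of the ODS
        AND rlogsys    = 'BWPCLNT100'.     " the BW ("myself") source system

    IF lv_inits = 0.
      WRITE: / 'No active init found - rerun Init with Data Transfer.'.
    ELSE.
      WRITE: / 'Init still registered - subsequent deltas should work.'.
    ENDIF.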

Similar Messages

  • Delta Load to cube

    Hi All,
    I have an issue with the delta load to the cube. From Nov 11 onwards the delta load is not picking up any data; it shows 0 from 0 records in the monitor screen. Could anyone please help me with this?
    Thanks in advance,
    Drisya

    Hi Drisya,
    Activate the transfer rules and update rules once and then load the data again.
    You said it is showing 0 from 0 records. Is the status yellow or green?
    Are you loading the data from an ODS to the cube, or directly into the cube?
    If it is yellow, force the status to red, activate the transfer structure and update rules, and run the load again.
    If you are loading through an ODS, generate the export DataSource at the ODS level by right-clicking the ODS.
    Thanks,
    kiran

  • Cannot deactivate delta load of materials from R/3

    Hi,
    we have a problem deactivating the delta load of materials from R/3.
    I deactivated the adapter objects MATERIAL and PRODUCT_MAT in R3AC1.
    I even deleted the class MATERIAL in R3AC4.
    The BDocs are no longer created, but the queues are still being generated in SMQ2:
    R3AD_MATERIA***
    Am I wrong in assuming that these settings should prevent queue creation?
    Is this an error that anyone else has encountered, or are there other settings that I forgot to check?
    Thank you,
    Borut

    Hi,
    I have a similar problem. I have to deactivate the delta flow of materials from ONE site only (F6A) and keep it flowing from the other sites, N6A and A6A; our CRM system is connected to multiple back ends.
    For the required site I have removed all the conditions in the filter settings, but the delta queue keeps coming. Also, the queue is in SYSFAIL state, and we are not able to download materials from the other sites; presumably material from the other two sites gets queued up behind this stuck queue.
    Is there any way to stop the download from one particular site only?

  • Delta load of customer from R/3 (ECC5.0) to CRM5.0

    We have done an initial load of customers from ECC to CRM (R3AS). A change was then made to a customer in ECC. I expected the change to be reflected in CRM via a delta load, but I do not see it. What am I missing?
    All responses are appreciated and will be rewarded.

    Hi Mani,
    Here are some clues on how to proceed with processing of incomplete or failed BP deltas:
    First of all you have to note that during the initial download, no delta download can occur. In particular, no delta download of customers occurs if the initial download of materials occurs at the same time. This is because the material download depends on the customer download.
    During the delta download, the CRM system uses the inbound queues R3AD_CUSTOME<nnnnnnnnnn>, where <nnnnnnnnnn> corresponds to the customer number in the R/3 system. If an error occurs during the download, the inbound queue for this customer is stopped. You can see the error messages in the flow trace.
    For the delta download, you can use transaction CRMM_BUPA_MAP for the error analysis. If you enter the customer there and press Enter, you can display the business partner number for the customer and all open BDocs (that is, those which contain errors or are still being processed). To do this, you only have to click the 'Queues and BDocs during download' pushbutton. The system issues an overview of the queue and the open BDocs for this customer, and in particular you can 'look into' each BDoc. From there you return to the flow trace and can display the error messages which occurred, as described above.
    During the delta download, you also have the option to send the same BDoc again. This is useful if you can solve the problem simply by correcting Customizing in the CRM system. To do this, you simply click the corresponding icon in the flow trace (currently, this is the second icon from the left).
    If you have to correct the error in the R/3 system, you have to delete the existing BDoc since it cannot be processed further. As a result, the inbound queue in the CRM system is released again. In this case, you are recommended to delete all existing open BDocs and to make the changes in the R/3 system.
    If inconsistencies occur between the R/3 system and the CRM system, you can start a request for the customer. To do this, click the 'Get customer from the R/3 system' pushbutton. This starts a request which uses the inbound queue R3AR_CUSTOMER. Process possible errors as during an initial download.
    If you still have problems and don't know how to fix them, please post more specific information from the BDoc queue, such as the error message you are getting; that helps us identify the problem.
    Regards,
    Rahul
    PS. Please award points if it helps!
    PPS. For future reference: this info was found in note 362621 (Error during data exchange customer <-> business partner).

  • Loading the cube from 3 datasources and getting 3 records for each keyfield

    Hi All,
    I am loading an InfoCube from 3 separate DataSources. These 3 DataSources are UD DataSources and their source system is UD Connect.
    Each of the DataSources contains a unique key field, 'Incident Number' (the same as we would use in DataSources for a DSO).
    The problem is that when I load data from these 3 DataSources into the cube, there end up being 3 records for each 'Incident Number'.
    We have reports on this InfoCube, and the reports also display 3 records for each incident number.
    If I remove the Incident Number key field from 2 of the DataSources, the data from those DataSources does not reach the cube.
    For many of you this may be a minor problem (or may not be a problem at all!), but as a new joiner in the SAP field this has become a showstopper issue for me.
    Please suggest.
    Thanks in Advance.

    Hi Pravender,
    Thanks for your interest.
    The scenario is: I have 3 DataSources from the same source system. All 3 DataSources have different fields except 'Incident Number', so each field has only one value in the report. But because the data comes from 3 separate DataSources, the cube ends up with 3 records, each displaying the values of one DataSource.
    There is no field in the query output which has different values for the different source systems. Because of the 3 records in the cube, one record contains the value for a particular field and the other two records show a blank for that field.
    Regards.

  • Initial Load of Cubes from BW 3.1 to BW 3.5 system

    Hello,
    I am looking for some help with the following challenge. We have an existing BW 3.1 system (about 3 TB in size) with a cube that covers about half of the total amount of data (approx. 1.5 TB). We need to do an initial load of the cube data onto a new BW 3.5 system (which is in a different data center) and then establish regular deltas between the 2 systems. My question is: what is the best way to move that initial amount of data from BW 3.1 to the new BW 3.5 system? Do we have to send every record over the wire, or can we export the cube to tape (which will take a long time and is not easily repeatable), then import it into BW 3.5 and re-establish the connection between the 2 systems to allow deltas to flow? Any advice or explanation is highly appreciated.
    Thanks
    Hans deVries

    Hi Hans,
    Isn't it possible for you to do full loads for some time slices from your source cube? That way you could load the history up to a specific date via full loads and, once you are up to date, initialize your delta.
    Siggi

  • Data is not loading to cube from DSO

    Hi experts,
    I loaded data into a DSO. The problem is that when I try to load the same data into the cube using a DTP, there are no added records.
    All records from the DSO are transferred, but 0 are added. The mapping in the transformation is direct mapping; no routines are used.
    The load is a full load.
    Please suggest how to resolve this issue.
    regards,
    Rajesh.

    Hi,
    You won't find the concept of tRFCs while running DTPs.
    DTPs improve performance: data loading through a DTP is optimized by parallel processes, and no tRFCs/LUWs are generated at runtime, which minimizes the loading time.
    Delete the index, reload the data, and then recreate the index.
    Since the request is still active in SM37, kill the active job and delete the request from the InfoCube.
    Then delete the index, trigger the DTP, and create the index for the InfoCube again (a small sketch of the index handling is below).
    This should work.
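    If you prefer to script the drop-index / load / rebuild-index sequence instead of using the process chain steps, a rough sketch is below. The function module names RSDU_INFOCUBE_INDEXES_DROP / RSDU_INFOCUBE_INDEXES_CREATE and the parameter I_INFOCUBE are assumptions, and ZIC_SALES is a placeholder cube name:

        " Sketch: drop the cube's secondary indexes, load via DTP, then rebuild.
        " FM names and the I_INFOCUBE parameter are assumptions; ZIC_SALES is a placeholder.
        DATA lv_cube TYPE rsinfocube VALUE 'ZIC_SALES'.

        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
          EXPORTING
            i_infocube = lv_cube.

        " ... trigger the DTP load here (process chain or manual execution) ...

        CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_CREATE'
          EXPORTING
            i_infocube = lv_cube.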
    Regards
    KP

  • Data not loaded to cube from DSO

    I activated the standard RPM Financial Planning cube 0RPM_C05. When I run the data loads (not DTPs), data requests are sent to the DSOs. The request status is success, the data is active and it is available for reporting. However, these two DSOs are mapped to the InfoCube (standard content) and I don't see any data there.
    I don't see any errors in the monitor.
    Data sources - 3.x
    DSO - 0RPM_DS07 and 0RPM_DS08
    Cube - 0RPM_C05
    Thank you,
    Vasu

    Hi Raja,
    Thanks.
    The problem is that although the 'update automatically' option is set, the data is not getting loaded.
    I have now used the manual 'Update 3.x data to targets' option and it worked; I can see the data in the cube.
    But how do I set this up in a process chain? Is 'Update 3.x data to targets' the same as the 'Update DSO Data (Further Update)' process type?

  • Loading BW Cube from R/3 using ABAP

    I am new to BW, hence this question.
    I have an ABAP program called Z_ABC in SAP R/3. Can I create a BW cube using this program as an input?
    If not, what are the steps needed to create a BW cube if I know the tables that are used in the program?
    Thanks in Advance

    Hi Mason
    Below are the steps:
    1. Create a DataSource based on a view or database table (if a plain view is not enough, see the function-module sketch after these steps).
    Check these links; I think your case is simpler than what is explained in this document:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10a89c00-7dd7-2d10-6f83-cd24ee6d517c?QuickLink=index&overridelayout=true
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    2. Replicate the DataSource in the BW system and activate it.
    3. Create an InfoPackage by right-clicking on the DataSource.
    4. Run the InfoPackage to bring the data as far as the PSA.
    5. After successful execution of step 4, create a transformation between your cube and the DataSource and activate it.
    6. Create a DTP, activate it, and run it to load the data into the cube.
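
    If the logic of Z_ABC cannot be reduced to a single view or table, the generic DataSource can instead be based on an extraction function module. Below is a minimal sketch modelled on the SAP template RSAX_BIW_GET_DATA_SIMPLE; the interface shown is the assumed template interface, and ZBW_GET_DATA_ZABC, ZBW_S_ZABC and ZVIEW_ABC are hypothetical names you would replace with your own:

        FUNCTION zbw_get_data_zabc.
        " Assumed interface (as generated from template RSAX_BIW_GET_DATA_SIMPLE):
        "   IMPORTING  VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE
        "              VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG
        "              VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE
        "   TABLES     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT
        "              I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS
        "              E_T_DATA   STRUCTURE ZBW_S_ZABC (the extract structure)
        "   EXCEPTIONS NO_MORE_DATA, ERROR_PASSED_TO_MESS_HANDLER

          STATICS: s_cursor  TYPE cursor,
                   s_counter TYPE i.

          IF i_initflag = 'X'.
            " Initialization call: no data yet; selections could be stored here.
            CLEAR s_counter.
          ELSE.
            IF s_counter = 0.
              " First data call: open a cursor on the source view/table.
              OPEN CURSOR WITH HOLD s_cursor FOR
                SELECT * FROM zview_abc.          " placeholder extraction view
            ENDIF.
            " Return at most I_MAXSIZE records per call.
            FETCH NEXT CURSOR s_cursor
              INTO CORRESPONDING FIELDS OF TABLE e_t_data
              PACKAGE SIZE i_maxsize.
            IF sy-subrc <> 0.
              CLOSE CURSOR s_cursor.
              RAISE no_more_data.                 " tells BW the extraction is done
            ENDIF.
            s_counter = s_counter + 1.
          ENDIF.
        ENDFUNCTION.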

  • After upgrade to BI NW04s, issue in loading cube from ODS

    Hi,
    I have an SAP-delivered ODS (0CRM_OPPT_H) which I had been loading for some time. The cube 0CRM_C04 was loaded from this ODS a couple of times. I have now deleted all the data from the cube and want to reload it from the ODS. In between, our BW was upgraded from 3.5 to BI NW04s.
    I have done all the prerequisites: replicated the export DataSource, and deleted and reactivated the update rule and transfer rule. Then I tried to load the cube from the ODS, but every time I get the following error:
    DataSource 80CRM_OPPI does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 12/23/2005 10:09:36.
    The time stamp in the BW system is 11/09/2005 13:02:29.
    Is it due to the upgrade? I have done everything so that this doesn't happen, but still could not resolve it.
    Thanks

    Are you sure you've replicated the DataSource 80CRM_OPPI? Try replicating it individually again. If that doesn't help, have you tried reactivating the ODS Object itself?
    Regards, Klaus

  • Data not loading to cube

    I am loading a cube from another cube. There is data in the source cube that meets my InfoPackage selection criteria, but I get 0 records transferred and added to the target cube. There is no delete in the start routines.
    Just before loading, I selectively deleted all records from the target cube so as not to disturb the existing delta mechanism.
    The load I am running is a full repair.
    Can someone assist?
    Thanks

    Hi Akreddy,
    I am not sure about "repair full to cube", but here is my opinion on your issue:
    It seems there is some mapping mismatch. One quick check: in the DTP monitor, at which step do you get 0 records? Is it at the data package level, the transformation level, or the routine level?
    Identify the step where the transferred records drop to zero and dig into that step, or post the warning or message in this forum so it is easier to answer.
    Hope this clears it up a little.
    Thanks
    K M R
    "Impossible Means I 'M Possible"
    Winners Don't Do Different things,They Do things Differently...!.

  • Delta load issue in COPA

    Hi All
    We are using CO-PA DataSources, and the daily delta loads from the source R/3 system run without any issues.
    Yesterday the delta load from the source system failed due to a Basis problem (no available work processes). We deleted the bad request in the DSO and then repeated the failed InfoPackage in the process chain; the delta load was then successful, but 0 records were updated.
    So we are missing those delta records. I guess this is due to the time stamp kept in the source system.
    In which table is this time stamp stored, and what is the procedure to resolve this issue?
    How should I handle this type of issue?
    Please let us know.
    Regards
    Shankar

    Hi Shankar,
    I think you did not set the QM status to red. When a delta load fails, always set the QM status to red before repeating the load; setting it to red resets the status, and when you repeat the load the system will ask for a repeat delta, which picks up all the failed records along with any new records (if any).
    I think you should now do a re-init, i.e.:
    1) Delete the init flag.
    2) Fill the setup table with a selection. Do you know which values are missing, or any selection criteria such as a date range?
    3) Then do a full repair load.
    4) Then resume the deltas.
    If you don't know the selection, you will have to load data for a selective range. Check your DSO: if the update mode is overwrite, there is no problem; otherwise there may be duplicate records, and you will have to do a selective deletion.
    Hope this helps.
    Regards,
    Debjani..........

  • Error in delta loading from a planning cube

    hi all,
    I am using a delta load from one planning cube to another. When the changed records are on the order of 100 the load is successful, but when they are on the order of 100,000 it stays yellow with no data records sent and eventually fails. The packet size is 20,000.
    Any help is appreciated.

    Hello Ajay,
    Have you checked that your request is closed in the planning cube? The function module RSAPO_CLOSE_TRANS_REQUEST will do it; see the sketch below.
    It is a customizing setting to tell the InfoPackage to end "green" when it has no data:
    Transaction SPRO ==> SAP NetWeaver ==> SAP Business Information Warehouse ==> Automated Processes ==> Monitor Settings ==> Set Traffic Light Color. There you have to set the traffic light to green if no data is available in the system.
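    A minimal ABAP sketch of the request-closing step; the function module name comes from the post above, but the parameter name I_INFOCUBE is an assumption and ZPLAN_C01 is a placeholder cube name:

        " Close the open ("yellow") request of the transactional planning cube
        " so that its data can be extracted. I_INFOCUBE is an assumed parameter
        " name; ZPLAN_C01 is a placeholder.
        DATA lv_cube TYPE rsinfocube VALUE 'ZPLAN_C01'.

        CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
          EXPORTING
            i_infocube = lv_cube.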
    Hope that helps
    best regards
    Yannick

  • Delta loading unsuccessful from cube

    hi all,
    I am using a delta load from one planning cube to another. When the changed records are on the order of 100 the load is successful, but when they are on the order of 100,000 it stays yellow with no data records sent and eventually fails. The packet size is 20,000.
    Any help is appreciated.

    Hi,
    Do a re-initialization of the delta load, and also check the InfoPackage.
    Check this link:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e6769d90-0201-0010-848d-ae3c7bb18f8a
    Hope it helps.
    Reg
    VArun CN

  • Delta Load failure from ODS to Cube

    Experts,
    After a CRM upgrade, I had to re-initialize the delta process. I first reloaded the ODS with an init without data transfer, and then a delta load picked up the new data created in CRM. This worked fine so far. After this, when I tried to load the cube with an init without data transfer, there were no issues, but the delta load from the ODS to the cube does not work. Does anybody have any suggestions, please?
    Thanks,
    Nimesh
    The following error is observed in the status tab for the delta load from ODS to cube:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packets or info packets are missing in BW, but there were - as far as can be seen - no processing errors in the source system. It is
    therefore probable that an error arose in the data transfer.
    With the analysis, an attempt was made to read the ALE outbox of the source system, which led to an error.
    It is possible that no connection exists to the source system.
    Procedure
    Check the tRFC overview in the source system.
    Check the connection of the source system for errors and check the
    authorizations and profiles of the remote user in both the BW and
    source systems.
    Check the ALE outbox of the source system for IDocs that have not been

    Hi,
    As far as I understood, you have successful delta loads into the ODS and you want to update the entire ODS data to the cube, followed by daily deltas.
    If this is the case:
    Make sure that active update rules exist between the ODS and the cube.
    First go to your ODS, right-click it, and select the option 'Update ODS data in data target'.
    Select Init update; it will take you to the init InfoPackage, where you select init with data transfer (init with data transfer brings all the records from the ODS to the cube).
    Once the init is completed, you can schedule regular delta loads from the ODS to the cube by following the same steps.
    Hope this helps
    Praveen
