Initial Load of Cubes from BW 3.1 to BW 3.5 system

Hello,
I am looking for some help with the following challenge. We have an existing BW 3.1 system (about 3 TB in size) with a cube that holds roughly half of the total data volume (approx. 1.5 TB). We need to do an initial load of that cube's data into a new BW 3.5 system (which is in a different data center) and then establish regular deltas between the two systems. My question is: what is the best way to move that initial amount of data from BW 3.1 to the new BW 3.5 system? Do we have to send every record over the wire, or can we export the cube to tape (which would take a long time and is not easily repeatable), import it into BW 3.5, and then re-establish the connection between the two systems so that deltas can flow? Any advice or explanation is highly appreciated.
Thanks
Hans deVries

Hi Hans,
Isn't it possible for you to do full loads for time slices from your source cube? That way you could load the history up to a specific date via full loads and, once you are up to date, initialize your delta.
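Just to illustrate the slicing idea outside of SAP (a rough sketch only, with made-up dates; in BW each slice would become a full-load InfoPackage selection on 0CALDAY, followed by a single init without data transfer once the history is complete):

```python
from datetime import date, timedelta

def month_slices(first: date, last: date):
    """Yield (start, end) pairs, one per calendar month between first and last."""
    year, month = first.year, first.month
    while (year, month) <= (last.year, last.month):
        next_first = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        start = max(date(year, month, 1), first)
        end = min(next_first - timedelta(days=1), last)
        yield start, end
        year, month = next_first.year, next_first.month

# One full load per slice keeps each transfer small and repeatable;
# the delta init is only done after the history has been loaded.
for start, end in month_slices(date(2003, 1, 1), date(2004, 6, 30)):
    print(f"Full load selection on 0CALDAY: {start:%Y%m%d} - {end:%Y%m%d}")
```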
Siggi

Similar Messages

  • Error in Initial Load of Products from ECC to CRM

    Hi,
        I'm getting the errors below in the initial load of MATERIAL from ECC to CRM:
    Data cannot be maintained for set type COMM_PR_MAT
    Data cannot be maintained for set type COMM_PR_UNIT
    Data cannot be maintained for set type COMM_PR_SHTEXT
    My analysis:
      I found that these set types are already assigned to another base category, and around 100+ materials have been created using that base category.
      After the initial load of DNL_CUST_PROD1, I found that these set types are not assigned to any of the categories of hierarchy R3PRODSTYP. Since R3PRODSTYP does not have the above set types, my initial load is failing.
    Required solution:
      Is there any other way I can download materials from ECC to CRM without deleting the existing materials and the Z hierarchy created earlier in CRM?
    Regards,
    Prasad

    Hi there,
    Try to attach the set types
    COMM_PR_MAT, COMM_PR_UNIT, COMM_PR_SHTEXT, COMM_PR_BATCH, COMM_PR_LGTEXT, COMM_PR_LGTEXT1, COMM_PR_LGTEXT2
    to the category "MAT_" and then run the initial load again.
    It should work now unless you have some other issues.
    Best regards,
    Pankaj S.

  • Initial Load of contract from ISU to CRM

    Hi All,
    We are working on the replication of contracts from ISU to CRM.
    We have done all the necessary settings, like assigning a default product, running the ECRM_GENERATE_EVERH report, etc.
    When we run the initial load on SI_CONTRACT, only a single BDoc is generated; it is error free, but it contains no data.
    Since there is no error, we are not able to figure out what the problem is.
    Regards
    Nikhil

    Hello Nikhil,
    Could you resolve the problem? I have a similar error: the BDoc is empty. The table EVERH is filled, but the fields CONTRACTPOS and CONTRACTHEAD have the value '0000000000000000', which I think is the problem. And the report ECRM_CHECK_EVERH says that contracts are missing.
    Could you help me please?
    Thanks!

  • Loading the cube from 3 datasources and getting 3 records for each keyfield

    Hi All,
    I am loading an InfoCube from 3 separate DataSources. These 3 DataSources are UD DataSources and their common source system is UD Connect.
    Each of the DataSources contains a unique key field 'Incident Number' (the same as we have in DataSources for a DSO).
    The problem is that when I load data from these 3 DataSources into the cube, there are 3 records for each 'Incident Number'.
    We have reports on this InfoCube, and the reports also display 3 records for each incident number.
    If I remove the Incident Number key field from 2 of the DataSources, the data from those DataSources does not reach the cube.
    For many of you this may be a minor problem (or not a problem at all), but as a new joiner in the SAP field, it has become a showstopper issue for me.
    Please suggest.
    Thanks in Advance.

    Hi Pravender,
    Thanks for your interest.
    The scenario is: I have 3 DataSources from the same source system, and all 3 DataSources have different fields except 'Incident Number'. So each field has only one value in the report. But because of the 3 separate DataSources, 3 records are created, displaying the values of each DataSource in a separate record.
    There is no field in the query output that has different values for the different DataSources. Because of the 3 records in the cube, one record will contain the value for a particular field and the other two records will show a blank for that field.
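    To make this concrete (a rough illustration outside SAP, with invented field names): an InfoCube keeps every row delivered by each load, while an overwrite by the key 'Incident Number' (DSO-style behaviour) would consolidate the fields into a single record.

```python
# Rough illustration (not SAP code) of append-per-load vs. merge-by-key.
# The field names below are made up for the example.
loads = [
    {"incident": "INC001", "priority": "High"},       # row from DataSource 1
    {"incident": "INC001", "category": "Network"},    # row from DataSource 2
    {"incident": "INC001", "days_open": 3},           # row from DataSource 3
]

# Cube-like behaviour: all loaded rows are kept, so a report shows 3 lines,
# each blank for the fields the other DataSources did not deliver.
cube_rows = list(loads)
print(len(cube_rows))            # 3

# Key-based overwrite: rows with the same incident number are merged,
# so every field ends up in one consolidated record.
merged = {}
for row in loads:
    merged.setdefault(row["incident"], {}).update(row)
print(merged["INC001"])          # one consolidated record with all fields
```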
    Regards.

  • Delta load to cubes from Datamart

    Hi Gurus:
    We load data from an ODS (datamart) to cubes. I did the init, and a few delta loads were also done.
    I found a bug in the update rule, and it has been fixed. I want to delete the small amount of data in the cube and then do a full load from the ODS to the cube. Can I do this, and will it impact the delta setup?
    Please suggest. I will be happy to assign the points.
    Thanks.

    Hi,
    Yes, you can.
    Delete the requests loaded from the ODS.
    Delete the datamart status in the ODS.
    Then load the data from the ODS to the cube with the option 'Init with Data Transfer'.
    If you have a huge amount of data, first do a full load and then an 'Init without Data Transfer'.
    Thanks,
    -VIjay

  • Data is not loading to cube from DSO

    Hi experts,
    I loaded data into a DSO. The problem is that when I try to load the same data into the cube using a DTP, there are no added records.
    All records from the DSO are transferred, but 0 records are added. The mapping in the transformation is direct mapping; no routines are involved.
    The load is a full load.
    Please suggest how to resolve this issue.
    regards,
    Rajesh.

    Hi,
    You won't find the concept of tRFCs when running DTPs.
    DTPs increase performance:
    a DTP load is faster because it is optimized for parallel processing.
    Data loading through a DTP generates no tRFCs/LUWs at runtime, which minimizes the loading time.
    Delete the index, reload the data, and recreate the index.
    If the request is still active in SM37, kill the active job and delete the request from the InfoCube.
    Then delete the index, trigger the DTP, and recreate the index for the InfoCube.
    This should work.
    Regards
    KP

  • Abort Status in Monitoring Initial Load Object - SERNR_CONFIG From R/3

    Hi Experts,
    Currently we are replicating Equipment from R/3 to CRM. Now we want to replicate only the Serial Number, without the Equipment view. For this I activated the object SERNR_CONFIG and started the initial download. When I monitored the replication, it showed the status "ABORT".
    As we are already replicating Equipment from R/3 to CRM, when we replicate only serial numbers, how will they appear in CRM?
    When we replicate Equipment, it creates a Component and an IBase ID for it. What will be created if we replicate only the Serial Number?
    I am currently working on SAP CRM 6.0.
    Can you please guide me on how to replicate this?
    best regards,
    Sarangamath

    Hi Sarangamath,
    Normally, status Abort should only occur if the load was manually cancelled.
    However, the symptoms you describe have occurred in the past with equipment downloads.
    If you're using filters, notes 1230307 and 1224016 may help.
    Otherwise, please check the preconditions for "Replication of Equipment Between SAP CRM and SAP ECC" at the attached link:
    [http://help.sap.com/saphelp_crm700_ehp01/helpdata/EN/46/cc79505ec61525e10000000a114a6b/frameset.htm]
    Best regards,
    Brian.

  • Data not loaded to cube from DSO

    I activated the standard RPM Financial Planning cube 0RPM_C05. When I do the data loads (not DTP), data requests are sent to the DSOs. The request status is successful and the data is active and available for reporting. However, these two DSOs are mapped to the InfoCube (standard content) and I don't see any data there.
    I don't see any errors in the monitor.
    Data sources - 3.x
    DSO - 0RPM_DS07 and 0RPM_DS08
    Cube - 0RPM_C05
    Thank you,
    Vasu

    Hi Raja,
    Thanks.
    The problem is that although the 'update automatically' option is set, the data is not getting loaded.
    I used the manual 'Update 3.x data to targets' option now and it worked. I see the data in the cube.
    But how do I set this up in process chains? Is 'Update 3.x data to targets' the same as the 'Update DSO Data (further update)' process type?

  • Loading BW Cube from R/3 using ABAP

    I am new to BW, hence this question.
    I have an ABAP program called Z_ABC in SAP R/3. Can I create a BW cube using this program as an input?
    If not, what are the steps needed to create a BW cube if I know the tables that are used in the program?
    Thanks in Advance

    Hi Mason
    Below are the steps
    1. Create a DataSource based on the view/DB table.
    Check these links. I think your case is simpler than what is explained in this document.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10a89c00-7dd7-2d10-6f83-cd24ee6d517c?QuickLink=index&overridelayout=true
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    2. Replicate the DataSource in the BW system and activate it.
    3. Create an InfoPackage by right-clicking on the DataSource.
    4. Run the InfoPackage to bring the data into the PSA.
    5. After successful execution up to step 4, create a transformation between your cube and the DataSource and activate it.
    6. Create a DTP and activate it. Run the DTP to load the data into the cube.

  • Initial load of inventory level from csv - double datarows in query

    Hello everybody,
    A query result shown in a web browser seems strange to me, and I would be very glad if anyone could give me some advice on how to solve the problem. As I do not think it is related to the query itself, I posted it in this forum.
    The query refers to an InfoCube for inventory management with a single non-cumulative key figure and two other cumulative key figures for the increase and decrease of inventory. The time reference characteristic is 0CALDAY. The initial load was processed from a flat file (CSV); the structure looks like this:
    Product group     XXX
    Day               20040101
    Quantity          1000
    Increase          0
    Decrease          0
    Unit               ST
    The initial load runs fine, and the system writes all the records into the InfoCube. Unfortunately I do not know how to look at the records written into the cube, because only the cumulative key figures are shown in InfoCube -> Manage -> Contents.
    When executing the query, a really simple one, the result is strange: there are two rows for each product group with different dates, one for the 1st of January 2004 and the other for the 31st of December 2003, each containing 1000 units. The sum is 2000.
    It became more confusing when I loaded the data for increase and decrease: now the quantities and sums are correct, but the date of the initial load is a few days later than before, and the data table in the query does not contain the 1st of January.
    Does anybody know what I did wrong, or where there is information about how to perform an initial load of inventory from CSV in a better way?
    Kind regards
    Peter

    Peter,
    Inventory is not that straightforward to evaluate, as it is non-cumulative. Basically, one key figure is derived from one or two other key figures. You cannot see non-cumulative KFs in the InfoCube's Manage screen.
    Have you uploaded opening balances separately? If so, your data for the 31st of December is explained.
    In non-cumulative cubes there need not be a posting on a particular day for a value to be reported. For example, if you have stock of 10 units on the 1st, no postings on the 2nd and 3rd, and then an increase of 10 units on the 4th, the non-cumulative KF will still report 10 units for the 2nd and 3rd (the stock from the 1st rolled forward).
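    As a rough illustration of that roll-forward logic (plain pseudocode outside SAP, not how BW stores the data internally):

```python
# Days without a posting simply carry the last known stock forward,
# matching the 10-unit example above.
movements = {1: +10, 4: +10}         # day -> net movement (increase minus decrease)

stock = 0
for day in range(1, 6):
    stock += movements.get(day, 0)   # no posting on this day -> stock rolled forward
    print(f"day {day}: stock = {stock}")
# day 1: 10, day 2: 10, day 3: 10, day 4: 20, day 5: 20
```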
    There is a "How to... inventory management" document on the Service Marketplace that explains this quite nicely.
    Cheers
    Aneesh

  • After upgrade to BI NW04s, issue loading cube from ODS

    Hi,
      I have an SAP-delivered ODS (0CRM_OPPT_H) which I had been loading for some time. The cube 0CRM_C04 was loaded from the ODS a couple of times. I have now deleted all the data from the cube and want to reload the data from the ODS into the cube. In between, our BW was upgraded from 3.5 to BI NW04s.
      I have done all the prerequisites: replicated the 80* export DataSource, deleted and reactivated the update rules and transfer rules again, and then tried to load the cube from the ODS. However, every time I get the following error:
    DataSource 80CRM_OPPI does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 12/23/2005 10:09:36.
    The time stamp in the BW system is 11/09/2005 13:02:29.
    Is it due to the upgrade? I have done everything so that this doesn't happen, but still could not resolve it.
    Thanks

    Are you sure you've replicated the DataSource 80CRM_OPPI? Try replicating it individually again. If that doesn't help, have you tried reactivating the ODS Object itself?
    Regards, Klaus

  • Data not loading to cube

    I am loading a cube from a cube. There is data in the source cube that meets my InfoPackage selection criteria, but I get 0 records transferred and 0 added to the target cube. There is no deletion in the start routines.
    Just before loading, I selectively deleted all records from the target cube so as not to disturb the existing delta mechanism.
    The load I am running is a full repair.
    Can someone assist?
    Thanks

    Hi Akreddy
    I am not sure "Repair full to Cube"...!
    Still the following is my opninon abt your issue...
    It seems there is some  miss map ..Just one quick check in the DTP monitor in which step your getting 0 records . Is it Data package level or transformation level or routine level..
    Identify the step where the transfered records is nullifying then dig through the step..or post the waning or message in this forum so that its pretty easier to get answered..
    Hope its clear a little..!
    Thanks
    K M R
    "Impossible Means I 'M Possible"
    Winners Don't Do Different things,They Do things Differently...!.

  • Replicating data once again to CRM after initial load fails for few records

    My question (to put it simply):
    We performed an initial load for customers and some records error out in CRM due to invalid data in R/3. How do we get the data into CRM after fixing the errors in R/3?
    Detailed information:
    This is a follow up question to the one posted here.
    Can we turn off email validation during BP replication ?
    We are doing an initial load of customers from R/3 to CRM, and those customers with invalid email address in R/3 error out and show up in SMW01 as having an invalid email address.
    If we decide to fix the email address errors in R/3, these customers should then be replicated to CRM automatically, right (since the deltas for customers are already active)? The delta replication takes place, but then we get the error message "Business Partner with GUID 'XXXX...' does not exist".
    We ran the program ZREPAIR_CRMKUNNR provided by SAP to clear out any inconsistent data in the intermediate tables CRMKUNNR and CRM_BUT_CUSTNO, and then tried the delta load again. It still didn't seem to go through.
    Any ideas how to resolve this issue?
    Thanks in advance.
    Max

    Subramaniyan/Frederic,
    We already performed an initial load of customers from R/3 to CRM. We had 30,330 records in R/3, and 30,300 of them came over to CRM in the initial load. The remaining 30 show BDoc errors due to invalid email addresses.
    I checked the delta load (R3AC4) and it is active for customers. Any changes I make to customers already in CRM come through successfully. When I make changes to customers with an invalid email address, the delta gets triggered and the data comes through to CRM, but I get the BDoc error "BP with GUID XXX... does not exist".
    When I do a request load for that specific customer, it stays in the "Wait" state forever in "Monitor Requests".
    No, the DIMA did not help, Frederic. I followed the same steps you mentioned in the other thread, but it just doesn't seem to run. I am going to open an OSS message with SAP for it. I'll update the other thread.
    Thanks,
    Max

  • CRM business role - user mapping initial load

    Hi,
    I'm trying to do an initial load of data from the CRM systems into IDM. I'm able to get all the user data except the business role mapping (a parameter in CRM).
    In the user read pass of the ABAP initial load, the attribute for the parameter is mapped as follows:
    Target: sap%$rep.$NAME%Parameter1:Info:VARCHAR:255|
    Source: parameter1
    But no data is stored in the Parameter1 table, even though the user has parameters mapped in the CRM system.
    Can anyone please help me load these user-to-business-role mapping details?
    Thanks in Advance.
    Regards,
    Pricy

    Hello Pricy,
    can you just give me some hints on what you are trying to do exactly?
    My assumptions:
    - CRM business role is stored as user parameter in ABAP SU01 user data, right?
    - you want to read all the ABAP user data from your CRM system INCLUDING the user parameter data from ABAP SU01 user data, right?
    If that is the case, at least the loading part should work fine. I just tried this on my local system and had no issues; all user parameters of my existing ABAP users were loaded into one temporary table.
    -> Pass: ReadABAPUsers -> table "sap%$rep.$NAME%Parameter1:Info:VARCHAR:255|"
    Did you find this temp table created on your database correctly? My table is there and is called "sapT01_001Parameter1" (where my repository is named T01_001) and it contains all the existing user parameters.
    What exactly is your issue?
    Regards,
    René

  • Delta load of customer from R/3 (ECC5.0) to CRM5.0

    We have done an initial load of customers from ECC to CRM (R3AS). A change was then made to a customer in ECC. I expected the change to be reflected in CRM via a delta load, but I do not see it. What am I missing?
    All responses are appreciated and will be rewarded.

    Hi Mani,
    Here are some clues on how to proceed with processing of incomplete or failed BP deltas:
    First of all, note that during the initial download, no delta download can occur. In particular, no delta download of customers occurs if the initial download of materials is running at the same time, because the material download depends on the customer download.
    During the delta download, the CRM system uses the inbound queues R3AD_CUSTOME<nnnnnnnnnn>, where <nnnnnnnnnn> corresponds to the customer number in the R/3 system. If an error occurs during the download, the inbound queue for this customer is stopped. You can see the error messages in the flow trace.
    For the delta download, you can use transaction CRMM_BUPA_MAP for error analysis. If you enter the customer there and press Enter, you can display the business partner number for the customer and all open BDocs (that is, those which contain errors or are being processed). To do this, click the 'Queues and BDocs during download' pushbutton. The system shows an overview of the queue and the open BDocs for this customer, and you can 'look into' each BDoc. From there you return to the flow trace and can display the error messages which occurred, as described above. During the delta download, you also have the option of sending the same BDoc again. This is useful if you can solve the problem simply by correcting Customizing in the CRM system; to do so, click the corresponding icon in the flow trace (currently the second icon from the left).
    If you have to correct the error in the R/3 system, you have to delete the existing BDoc, since it cannot be processed further. As a result, the inbound queue in the CRM system is released again. In this case, it is recommended to delete all existing open BDocs and make the changes in the R/3 system.
    If inconsistencies occur between the R/3 system and the CRM system, you can start a request for the customer. To do this, click the 'Get customer from the R/3 system' pushbutton. This starts a request which uses the inbound queue R3AR_CUSTOMER. Process any errors as you would during an initial download.
    If you still have problems and don't know how to fix, please post more specific information from the BDoc queue, such as what error message you are getting. It helps us better identify the problem.
    Regards,
    Rahul
    PS. Please award points if it helps!
    PPS. For future reference: this info was found in note 362621: Error drng data exchange customer<->business partner
