HR data synchronization strategy to CRM

Dear all experts,
I hope this question has not been asked before. We are already synchronizing our HR master data (organizational structure, positions) from the HR client to the CRM client using ALE.
The synchronization has been running for 14 months.
I'd like to know: what is the best practice for running the synchronization of HR data to CRM?
Currently we synchronize our HR data to CRM daily, using reporting period ALL in transaction PFAL.
The consequence, however, is that a very large number of IDocs is created. We want to reduce that.
Why do we do it this way? Because our HR data can change dynamically every day, including back-dated updates and hires on any date. If we synchronized less often (for example, weekly), a salesperson might not yet exist in the CRM client when needed.
Why do we use reporting period ALL? Because if we used a restricted selection (today or this month), the links between objects could be broken or inconsistent, since only the links/objects within the restricted period would be sent.
Any suggestions or experiences from other customers regarding a synchronization strategy? Is our selection correct?
Don't worry about points.
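For comparison: SAP's standard alternative to a daily full distribution is change pointers (message type HRMD_A, evaluated with report RBDMIDOC), which distribute only changed objects. Purely to illustrate the idea, here is a delta-style sketch in Python; all names and data structures are hypothetical, not SAP code:

```python
# Hypothetical delta-sync sketch: distribute only HR objects that changed
# since the last run, plus both endpoints of any link they participate in,
# so relationships never arrive half-complete.

def compute_delta(previous, current):
    """Return IDs of objects that are new, modified, or deleted.
    `previous`/`current` map object ID -> a hashable state snapshot."""
    changed = {obj_id for obj_id, state in current.items()
               if previous.get(obj_id) != state}        # new or modified
    changed |= set(previous) - set(current)             # deleted
    return changed

def expand_with_links(changed, links):
    """Add both endpoints of every link touching a changed object."""
    expanded = set(changed)
    for a, b in links:
        if a in changed or b in changed:
            expanded.update((a, b))
    return expanded
```

A back-dated hire still appears in the delta because the object's current state differs from the last snapshot, so the daily run stays small without breaking links.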

Hi Kittu,
Please go through the link below:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84dd0c97-0901-0010-4ab2-9849fba57e31
Best Regards,
Pratik Patel
Reward with points!

Similar Messages

  • Offline data synchronization

    We are trying to use the offline data synchronization feature of DMS using data modeling.
    Below is the only example we found on the Adobe site, and it is working:
    http://help.adobe.com/en_US/LiveCycleDataServicesES/3.1/Developing/WS4ba8596dc6a25eff-56f1c8f4126dcd963c6-8000.html
    We have also read the “occasionally connected client” and model-driven applications documentation in the LCDS 3.1 user guide.
    Is there any other example demonstrating how to use offline data sync? We don't want to generate the Java code and use an assembler class for this.
    In our example we are implementing the SearchCustomerList functionality. Based on the search criteria, a list of customers is displayed to the user.
    Below are the configuration settings:
                            var cs:ChannelSet = new ChannelSet();
                            var customChannel:Channel = new RTMPChannel("my-rtmp",
                                        "rtmp://wpuag85393:2038");
                            cs.addChannel(customChannel);
                            this.serviceControl.channelSet=cs;
                            this.serviceControl.autoCommit = false;
                            this.serviceControl.autoConnect = false;
                            this.serviceControl.autoSaveCache = true;
                            this.serviceControl.offlineAdapter = new
                                        com.deere.itarch.adapter.MaintainCustomerBeanOfflineAdapter();
                            this.serviceControl.fallBackToLocalFill=true;
                            this.serviceControl.encryptLocalCache = true;
                            this.serviceControl.cacheID = "localCache";
    CASE I:
    Below is our understanding of offline data sync for our implementation:
    ·          The LCDS server is started and the application is connected to the server.
    ·          The user enters search criteria and clicks the Search button.
    ·          Data is fetched and displayed on the screen.
               (As autoSaveCache is set to true, it should automatically save the result in the local cache.)
    ·          Shut down the LCDS server.
    ·          Close the earlier search window.
    ·          Run the application and open the customer search page.
    ·          Enter the same search criteria and click Search.
    ·          Result: nothing is displayed on the screen (no data is fetched from the local cache).
    Many times we get a "cannot connect to server" error (about 50% of the time, even when the server is running).
    We also tried setting the reconnect strategy to "instance", but that did not work either.
    Also, could you please provide an end-to-end sample for data synchronization?

    Good to see you got a little further along with your application. I'm not sure why setting autoConnect to true helped.
    Regarding your search, I'm not sure how you implemented it, but the easiest way to do it with model-driven development is by using a criteria filter. It results in a new query in your offline adapter. You just add a filter element to an entity in your model, and in that filter you specify your like expression. I added one to the doc sample app as an example. When you generate code for the offline adapter, you'll be able to see the proper structure for the like clause too. I'm including my fml and offline adapter source below. I've also included the MXML so you can see how I called the new filter method from the client. After I saved to the local cache and went offline, I successfully performed the search in the client app. There were no issues with it.
    Here's my fml. The new filter is in bold text. I should have chosen a better filter name, since it generates a method called byProductName, which is very close to the existing getByProductName. But you'll get the idea. Once you add the filter, just remember to redeploy your model and regenerate your code.
    Regarding your question about associations, I'll look into that, but I think you would generate offline adapters for each entity involved in the association, and your relationships should behave correctly offline.
    <model xmlns="http://ns.adobe.com/Fiber/1.0">
        <annotation name="DMS">
            <item name="datasource">java:/comp/env/jdbc/ordersDB</item>
            <item name="hibernate.dialect">org.hibernate.dialect.HSQLDialect</item>
        </annotation>
        <entity name="Product" persistent="true">
            <annotation name="ServerProperties" ServerType="LCDS"/>
            <annotation name="DMS" Table="PRODUCT"/>
            <annotation name="VisualModeler" width="114" height="88" x="66" y="79"/>
            <annotation name="ActionScriptGeneration" GenerateOfflineAdapter="true" OfflineAdapterPackage="com.adobe.offline"/>
            <id name="productid" type="integer">
                <annotation name="DMS" ColumnName="PRODUCTID"/>
            </id>
            <property name="description" type="string" length="255">
                <annotation name="DMS" ColumnName="DESCRIPTION"/>
            </property>
            <property name="price" type="float">
                <annotation name="DMS" ColumnName="PRICE"/>
            </property>
            <property name="productname" type="string" length="255">
                <annotation name="DMS" ColumnName="PRODUCTNAME"/>
            </property>
            <filter name="byProductName" criteria="productname like"/>
        </entity>
    </model>
    Here's the new query for byProductName in my offline adapter, which contains a valid like clause. That section of the adapter is in bold text.
    /**
     * This is an auto-generated offline adapter for the Product entity.
     */
    package com.adobe.offline
    {
        import mx.core.mx_internal;
        import mx.data.SQLiteOfflineAdapter;
        import mx.utils.StringUtil;

        use namespace mx_internal;

        public class ProductOfflineAdapter extends SQLiteOfflineAdapter
        {
            /**
             * Return an appropriate SQL WHERE clause for a given set of fill parameters.
             * @param originalArgs fill parameters
             * @return String representing the WHERE clause for a SQLite SQL
             */
            override protected function getQueryCriteria(originalArgs:Array):String
            {
                var args:Array = originalArgs.concat();
                var filterName:String = args.shift();
                var names:Array = new Array();
                switch (filterName)
                {
                    case "byProductName":
                        // JPQL: select Product_alias from Product Product_alias where Product_alias.productname like :productname
                        // Preview: productname like :productname
                        names.push(getTargetColumnName(["productname"]));
                        return StringUtil.substitute("{0} like :productname", names);
                }
                return super.getQueryCriteria(originalArgs);
            }
        }
    }
    Here's my modified MXML. I'm still calling getAll(), but after that I use the new filter to filter the results/datagrid display to just the subset that matches the string I input in the search field. This results in a new call to productService.byProductName(), which is the client-side method generated from the filter element in my model.
    <?xml version="1.0" encoding="utf-8"?>
    <s:WindowedApplication xmlns:fx="http://ns.adobe.com/mxml/2009"
                           xmlns:s="library://ns.adobe.com/flex/spark"
                           xmlns:mx="library://ns.adobe.com/flex/mx" xmlns:OfflineAIRAPP="TestOfflineApp.*"
                           preinitialize="app_preinitializeHandler(event)"
                           creationComplete="windowedapplication1_creationCompleteHandler(event)">
        <fx:Script>
            <![CDATA[
                import com.adobe.offline.ProductOfflineAdapter;
                import mx.controls.Alert;
                import mx.events.FlexEvent;
                import mx.messaging.Channel;
                import mx.messaging.ChannelSet;
                import mx.messaging.channels.RTMPChannel;
                import mx.messaging.events.ChannelEvent;
                import mx.rpc.AsyncToken;
                import mx.rpc.events.FaultEvent;
                import mx.rpc.events.ResultEvent;
                public var myOfflineAdapter:ProductOfflineAdapter;

                public function channelConnectHandler(event:ChannelEvent):void
                {
                    productService.serviceControl.autoConnect=false;
                }

                protected function app_preinitializeHandler(event:FlexEvent):void
                {
                    var cs:ChannelSet = new ChannelSet();
                    var customChannel:Channel = new RTMPChannel("my-rtmp",
                        "rtmp://localhost:2037");
                    cs.addChannel(customChannel);
                    productService.serviceControl.channelSet=cs;
                    customChannel.addEventListener(ChannelEvent.CONNECT,
                        channelConnectHandler);
                }

                protected function dataGrid_creationCompleteHandler(event:FlexEvent):void
                {
                    getAllResult.token = productService.getAll();
                }

                protected function windowedapplication1_creationCompleteHandler(event:FlexEvent):void
                {
                    productService.serviceControl.autoCommit = false;
                    productService.serviceControl.autoConnect = true;
                    productService.serviceControl.autoSaveCache = true;
                    productService.serviceControl.fallBackToLocalFill=true;
                    productService.serviceControl.encryptLocalCache = true;
                    productService.serviceControl.cacheID = "myOfflineCache4";
                }

                protected function connectBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.connect();
                }

                protected function disconnectBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.disconnect();
                }

                protected function commitBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.commit();
                }

                protected function saveCacheBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.saveCache();
                }

                protected function clearCacheBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.clearCache();
                }

                protected function button_clickHandler(event:MouseEvent):void
                {
                    getAllResult.token = productService.byProductName("%"+key.text+"%");
                }
            ]]>
        </fx:Script>
        <fx:Declarations>
            <mx:TraceTarget />       
            <s:CallResponder id="getAllResult" />
            <OfflineAIRAPP:ProductService id="productService"
                                          fault="Alert.show(event.fault.faultString + '\n' +
                                          event.fault.faultDetail)"/>
            <s:CallResponder id="byProductNameResult"/>
    </fx:Declarations>
        <mx:DataGrid editable="true" x="141" y="10" id="dataGrid"
                     creationComplete="dataGrid_creationCompleteHandler(event)"
                     dataProvider="{getAllResult.lastResult}">
            <mx:columns>
                <mx:DataGridColumn headerText="productid" dataField="productid"/>
                <mx:DataGridColumn headerText="description" dataField="description"/>
                <mx:DataGridColumn headerText="price" dataField="price"/>
                <mx:DataGridColumn headerText="productname" dataField="productname"/>
            </mx:columns>
        </mx:DataGrid>
        <s:Button x="10" y="246" label="Connect" click="connectBtn_clickHandler(event)"
                  id="connectBtn" width="84" height="30"/>
        <s:Button x="112" y="204" label="Save to Local Cache" id="saveCacheBtn"
                  click="saveCacheBtn_clickHandler(event)" height="30"/>
        <s:Button x="110" y="246" label="Commit to Server" id="commitBtn"
                  click="commitBtn_clickHandler(event)" width="135" height="30"/>
        <s:Button x="10" y="204" label="Disconnect" id="DisconnectBtn"
                  click="disconnectBtn_clickHandler(event)" height="30"/>
        <s:Label x="270" y="204" text="{'Commit Required: ' +
                 productService.serviceControl.commitRequired}"/>
        <s:Label x="270" y="246" text="{'Connected: ' +
                 productService.serviceControl.connected}"/>
        <s:TextInput x="10" y="19" id="key"/>
        <s:Button x="10" y="49" label="Search" id="button" click="button_clickHandler(event)"/>   
    </s:WindowedApplication>

  • Upload data from excel to crm using BDOC possible or not

    Hi all,
    I need to upload data from Excel to CRM (Opportunity Management). Is it possible using BDocs or not? Please provide a list of methods to upload data from Excel, and also the best method for this scenario.

    BDocs are used to transfer data from one SAP system to another, for example from CRM to ECC or R/3.
    If you want to upload data from Excel to CRM, this can be done with the help of IDocs, not BDocs (method 1: using LSMW).
    Take the help of a CRM technical consultant and define LSMW projects. The project will take care of field mapping and conversion. Once that is done, four steps need to be performed:
    1. Read data
    2. Convert data
    3. Generate IDocs
    4. Post IDocs
    Once the IDocs post error-free, the data will be available in the CRM system.
    Another method is using transaction SECATT.
    Here you can define test scripts and record all the activities performed in a transaction. Then define your test configurations, which will contain the Excel sheet data, and upload the data.
    Reward with points if this helps.
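    Purely as an illustration of the "read data" and "convert data" steps above (not LSMW itself), the spreadsheet-side preprocessing can be sketched in Python; the column names and field mapping here are invented:

    ```python
    import csv

    # Hypothetical mapping from spreadsheet columns to target fields;
    # a real LSMW project defines this mapping inside the tool.
    FIELD_MAP = {"Opportunity Name": "DESCRIPTION",
                 "Expected Value": "EXPECTED_REVENUE",
                 "Close Date": "CLOSE_DATE"}

    def convert_rows(rows):
        """Read raw rows (e.g. from csv.DictReader over an exported
        sheet) and convert them to target-field dictionaries."""
        return [{FIELD_MAP[col]: value for col, value in row.items()
                 if col in FIELD_MAP}
                for row in rows]

    # Usage: convert_rows(csv.DictReader(open("opportunities.csv")))
    ```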

  • How would I implement something similar to SSRS data driven subscriptions in CRM 2015 Online?

    Requirement: Generate Report and send specific pricing information to a list of contacts based on field value.
    I'm new to CRM but well versed in SSRS, and I'm looking to understand:
    1. Is it possible to create something like a data-driven subscription in CRM?
    Everything I've looked at is from years ago, and I'm just looking for some high-level directions I can pursue in order to fulfill this requirement.
    Thanks!

    Did you notice the difference between your image and theirs? Their image has a coordinate grid. This coordinate grid can be used to determine size, location of pixels, etc.
    If you add a grid, then when you process the image: find the grid, then find the coordinate markings, then find the pixels in relation to these coordinate markings, creating an internal buffer of the image.
    i.e., have 0=white, 1=black, 2=grid, 3=coordinate marking, and make an internal representation of your image:
    011002300000
    001002201000
    011102300010
    323232232323
    222222222222
    001002310000
    000102200100
    010002301000
    Just an idea, but look into the coordinate grid; it may be the key to why they can process their image.
    DeltaCoder
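    The internal representation DeltaCoder describes can be sketched as follows; the RGB values chosen for the grid and the coordinate markings are invented for the example:

    ```python
    # Classify each pixel as 0=white, 1=black, 2=grid, 3=coordinate marking
    # and build the internal buffer described above.

    WHITE = (255, 255, 255)
    BLACK = (0, 0, 0)
    GRID = (200, 200, 200)    # assumed grid color
    MARK = (255, 0, 0)        # assumed coordinate-marking color

    CLASSES = {WHITE: 0, BLACK: 1, GRID: 2, MARK: 3}

    def build_buffer(pixels):
        """Map a 2D list of RGB tuples to the 0/1/2/3 representation;
        unknown colors fall back to white (0) in this sketch."""
        return [[CLASSES.get(px, 0) for px in row] for row in pixels]
    ```

    With the buffer in hand, the grid rows/columns give you the scale and origin needed to locate the remaining pixels.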

  • Data conversion strategy for new SOB

    Dear viewers,
    We are on 11.5.10.
    We are creating a new SOB with a change in currency from Feb-11, as this is the requirement.
    For the same, we need to do data conversion.
    I have some confusion regarding purchase orders and sales orders.
    Purchase orders:
    Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
    For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
    For the POs which are partially received, what should be done?
    If a PO is fully received and delivered it will not be converted to the new SOB, as it is not an open PO; but if an invoice comes after Feb-11, how will the matching be done?
    What if a return has to be made going forward in Feb-11 under the new SOB?
    Sales orders:
    Open sales orders will be converted, that is, the ones that have been entered but not yet booked.
    Users have been requested to clear off the sales order lines which are already pick-confirmed but not yet shipped; these will be shipped and interfaced to AR.
    For the sales orders that have been booked, the lines that are not yet processed further will also be converted.
    Now, what if a receipt comes after Feb-11? How do we handle this, given the sales order would not have been converted?
    Please advise on the data migration strategy for POs and SOs, and please add any points that I may have missed.
    Appreciate your help.
    Thanks,
    Emm

    Hi David,
    for master data conversion you can use LSMW and the RE-FX BAPIs (please refer to SAP note [782947|https://service.sap.com/sap/support/notes/782947]).
    Regards, Franz

  • Replicating data once again to CRM after initial load fails for few records

    My question (to put it simply):
    We performed an initial load for customers and some records error out in CRM due to invalid data in R/3. How do we get the data into CRM after fixing the errors in R/3?
    Detailed information:
    This is a follow up question to the one posted here.
    Can we turn off email validation during BP replication ?
    We are doing an initial load of customers from R/3 to CRM, and those customers with invalid email address in R/3 error out and show up in SMW01 as having an invalid email address.
    If we decide to fix the email address errors on R/3, these customers should then be replicated to CRM automatically, right? (since the deltas for customers are already active) The delta replication takes place, but, then we get this error message "Business Partner with GUID 'XXXX...' does not exist".
    We ran the program ZREPAIR_CRMKUNNR provided by SAP to clear out any inconsistent data in the intermediate tables CRMKUNNR and CRM_BUT_CUSTNO, and then tried the delta load again. It still didn't seem to go through.
    Any ideas how to resolve this issue?
    Thanks in advance.
    Max

    Subramaniyan/Frederic,
    We already performed an initial load of customers from R/3 to CRM. We had 30,330 records in R/3, and 30,300 of them came over to CRM in the initial load. The remaining 30 show BDoc errors due to invalid email addresses.
    I checked the delta load (R3AC4) and it is active for customers. Any changes I make to customers already in CRM come through successfully. When I make changes to customers with an invalid email address, the delta gets triggered, the data comes through to CRM, and I get the BDoc error "BP with GUID XXX... does not exist".
    When I do a request load for that specific customer, it stays in the "Wait" state forever in "Monitor Requests".
    No, the DIMA did not help Frederic. I did follow the same steps you had mentioned in the other thread, but it just doesn't seem to run. I am going to open an OSS message with SAP for it. I'll update the other thread.
    Thanks,
    Max

  • Urgent: SRM and BW user data synchronization problem

    Dear buddies,
    I'm a BW consultant on an SRM project. I have recently met a very strange problem in the user data synchronization configuration between the SUS and BW systems.
    The symptom is:
    I configured the user data synchronization parameters in the SUS system:
    SAP Reference IMG → Supplier Relationship Management → Supplier Self-Services → Master Data → Maintain Systems for Synchronization of User Data
    Here I maintained the BW logical system, filled the 'FM BPID' field with 'RS_BW_BCT_SRM_SUS_USER_BPID', and filled the 'Function Module for creating user' field with 'BAPI_USER_CREATE'.
    The purpose of the configuration above is:
    When a new user is created in the SAP SUS system, it will automatically be created in SAP BW, too.
    At the same time, an internal table (SRM_USER_SUPBPID) is filled automatically. The table contains the assignment between the automatically created SAP BW user and the corresponding business partner ID of the supplier company.
    Then I tested user creation in SUS on the web. I found that when the SUS user is created, the same user is created automatically in the BW system. That means 'BAPI_USER_CREATE' works.
    But the content of the user-BPID mapping table 'SRM_USER_SUPBPID' is still empty. That means the FM 'RS_BW_BCT_SRM_SUS_USER_BPID' does not work at all.
    Has anybody met a similar problem? If you have any suggestions, please kindly share your solutions. Thanks!!

    No solutions? I need your support, my friends.

  • Date determination in sap crm 7.0

    Hi experts,
    Date determination in SAP CRM 7.0: please explain the concept.
    How is it relevant to a business transaction?
    Thanks in advance,
    Prajith P

    Hi Prajith,
    A date profile can be assigned to a transaction type or item category. Depending on the type of dates required, you can configure dates specific to the customer's business requirements.
    Path: SAP Customizing Implementation Guide > Customer Relationship Management > Basic Functions > Date Management >
    a. Define Date Types, Duration Types and Date Rules
    b. Define Date Profile
    c. Assign Date Profile to Transaction Type
    d. Assign Date Profile to Item Category
    The steps are:
    1. Define Date Types, Duration Types and Date Rules: you can use the predefined date types, duration types and date rules; if you require extra ones, you can configure them here.
    2. Define Date Profile: here, take the reference object as system or user. Assign date rules, then the date types which you configured in step 1 above. In the date type, assign date rules.
    3. Assign Date Profile to Transaction Type or Item Category: here, assign the above date profile to a transaction type or item category.
    Regards,
    Rajendra Sonawane

  • Data load fron flat file through data synchronization

    Hi,
    Can anyone please help me out with a problem? I am doing a data load into my Planning application through a flat file, using data synchronization for it. How can I specify, during my data synchronization mapping, that values from the load file at the same intersection should be added together instead of overridden?
    For example, the load file has the following data:
    Entity Period Year Version Scenario Account Value
    HO_ATR Jan FY09 Current CurrentScenario PAT 1000
    HO_ATR Jan FY09 Current CurrentScenario PAT 2000
    The value at the intersection HO_ATR->Jan->FY09->Current->CurrentScenario->PAT should be 3000.
    Is there any possibility? I don't want to give users rights to the Admin Console for loading data.
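    One way to get additive behavior is to pre-aggregate the flat file before the synchronization runs, so each intersection appears only once. A stdlib-only Python sketch using the columns from the example above:

    ```python
    from collections import defaultdict

    KEY_COLUMNS = ("Entity", "Period", "Year", "Version", "Scenario", "Account")

    def aggregate_rows(rows):
        """Sum the Value column for rows sharing the same intersection."""
        totals = defaultdict(float)
        for row in rows:
            key = tuple(row[col] for col in KEY_COLUMNS)
            totals[key] += float(row["Value"])
        # Rebuild one row per intersection with the summed value.
        return [dict(zip(KEY_COLUMNS, key), Value=value)
                for key, value in totals.items()]
    ```

    Fed the two example rows, this yields a single row with Value 3000, which can then be loaded normally without any override.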

    Hi Manmit,
    First, let us know whether you are on BW 3.5 or 7.0.
    In either case, just try including the fields X, Y, Date, Qty, etc. in the DataSource with their respective length specifications.
    While loading the data using an InfoPackage, set the file format to 'Fixed length' in your InfoPackage.
    This will populate the values into the respective fields.
    Prathish
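    To illustrate the fixed-length file format Prathish mentions, here is a small parser sketch; the field offsets and lengths are invented and would come from the DataSource definition in practice:

    ```python
    # Hypothetical fixed-length record layout: (field name, start offset, length).
    FIELDS = (("X", 0, 4), ("Y", 4, 4), ("DATE", 8, 8), ("QTY", 16, 6))

    def parse_record(line):
        """Slice one fixed-length line into named, stripped fields."""
        return {name: line[start:start + length].strip()
                for name, start, length in FIELDS}
    ```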

  • Date Dimension in Oracle CRM On Demand Opportunity History Reporting

    Hi All,
    I have a small question concerning the date dimension in Oracle CRM On Demand's Opportunity History analysis.
    To which date field does this dimension/column relate? Created Date? Modified Date? etc.
    Thanks in advance

    Please post this in a Siebel-related forum.
    HTH
    Srini

  • How to improve performance for bulk data load in Dynamics CRM 2013 Online

    Hi all,
    We need to bulk update (or create) contacts in Dynamics CRM 2013 Online every night, due to data updates from an external data source. The data size is around 100,000 records and the data loading duration was around 6 hours.
    We are already using the ExecuteMultiple web service to handle the integration; however, the 6-hour integration duration is still not acceptable, and we are seeking advice for further improvement.
    Any help is highly appreciated.  Many thanks.
    Gary

    I think Andrii's referring to running multiple threads in parallel (see
    http://www.mscrmuk.blogspot.co.uk/2012/02/data-migration-performance-to-crm.html - it's a bit dated, but should still be relevant).
    Microsoft do have some throttling limits applied in CRM Online, and it is worth contacting them to see if you can get those raised.
    100,000 records per night seems a large number. Are all these records new or updated, or are some unchanged, in which case you could filter them out before uploading? Or are there useful ways to summarise the data before loading?
    Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk
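    The parallel-batching idea can be sketched as follows; `send_batch` stands in for one ExecuteMultiple call and is purely hypothetical:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def chunk(records, size):
        """Split records into batches of up to `size` items
        (ExecuteMultiple accepts up to 1000 requests per call)."""
        return [records[i:i + size] for i in range(0, len(records), size)]

    def upload_parallel(records, send_batch, batch_size=1000, threads=4):
        """Issue batches concurrently; each worker sends one batch
        per call via `send_batch`, and results keep batch order."""
        with ThreadPoolExecutor(max_workers=threads) as pool:
            return list(pool.map(send_batch, chunk(records, batch_size)))
    ```

    Whether more threads actually help depends on the Online throttling limits mentioned above, so the thread count is worth tuning empirically.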

  • Migration of mapping created in data synchronization

    Hi,
    I've created a mapping in the EPM data synchronization utility. I am about to migrate from dev to production. Is there any way to migrate/export the data synchronization along with the mappings created in dev, or do I have to recreate everything from scratch? It seems that there is no way I can export the mappings created. Appreciate your help.
    Thanks,
    ADB

    Hi Alexey,
    Could you elaborate on the requirement? It is still not clear to me what you want to achieve.
    What I do understand is that the users should be able to make adjustments to the mapping/lookup entries.
    If that is the case, what exactly is going to be maintained in the 'additional table' and how are you suggesting end users are going to maintain this?
    Ideally, your query transformation should not change when parameter values change, so you have to think about what logic you put where.
    My suggestion would be to use a file or a table which can be maintained by users. In your query transformation you can then use the lookup or lookup_ext function.
    Especially with lookup_ext you can make it as complicated as you want. The function is well documented but if you need help then just reply and explain in a bit more detail what you're trying to do.
    If you do think the 'hard-coded' option would suit you better, you can look into the 'decode' function. Again, it is well documented in the technical manual.
    Jan.

  • Any way to stop Data Synchronizer from syncing every folder?

    SLES 11 64 bit for VMware.
    Groupwise connector 1.0.3.512
    Mobility connector 1.0.3.1689
    I have "folders" unchecked every place you can uncheck them.
    Still the Groupwise connector insists on syncing every single folder in a user's cabinet and contacts (even the ones that are unselected in the Groupwise connector->user edit section).
    I do *not* mean that it force syncs these folders to the device, just that it syncs them into the "folder list" section that you can monitor from Mobility Connector->Monitor->Click on user name.
    Most of our users have dozens of folders, and all the scrolling makes it kind of a pain to monitor the folder (ONE) and address books (usually 1-2) that they are syncing. Also, it seems like adding a ton of unneeded work to the system, and it eats up a pretty good chunk of CPU.

    rhconstruction wrote:
    > So, do the "Folders" checkboxes currently do anything?
    If you are talking about the Folder checkboxes in the GroupWise connector, these
    do not currently pertain to Mobility. Remember that Mobility is part of a larger
    "Data Synchronizer" product, with connectors to other components like SugarCRM,
    Teaming, etc. So, some of the GroupWise connector settings show for all types
    of connectors, but do not always apply to each connector.
    Danita Zanr
    Novell Knowledge Partner
    Get up and running with Novell Mobility!
    http://www.caledonia.net/gw-mobility.html

  • Data warehouse monitor initial state data synchronization process failed to write state.

    Data Warehouse monitor initial state data synchronization process failed to write state to the Data Warehouse database. Failed to store synchronization process state information in the Data Warehouse database. The operation will be retried.
    Exception 'SqlException': Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
    One or more workflows were affected by this. 
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Synchronization.MonitorInitialState
    Instance name: Data Warehouse Synchronization Service
    Instance ID: {0FFB4A13-67B7-244A-4396-B1E6F3EB96E5}
    Management group: SCOM2012R2BIZ
    Could you please help me out of the issue?

    Hi,
    It seems that you are encountering event 31552; you may check the Operations Manager event logs for more information regarding this issue.
    There can be many causes of this 31552 event, such as:
    - A sudden flood (or excessive sustained amounts) of data to the warehouse that is causing aggregations to fail moving forward.
    - The Exchange 2010 MP is imported into an environment with lots of state changes happening.
    - Excessively large ManagedEntityProperty tables causing maintenance to fail because they cannot be parsed quickly enough in the time allotted.
    - Too much data in the warehouse staging tables which was not processed due to an issue and is now too much to be processed at one time.
    Please go through the links below to get more information about troubleshooting this issue:
    The 31552 event, or “why is my data warehouse server consuming so much CPU?”
    http://blogs.technet.com/b/kevinholman/archive/2010/08/30/the-31552-event-or-why-is-my-data-warehouse-server-consuming-so-much-cpu.aspx
    FIX: Failed to store data in the Data Warehouse due to a Exception ‘SqlException': Timeout expired.
    Regards,
    Yan Li

  • Transfer z-table data from ECC to CRM via Middleware

    Hi,
    I need to transfer some z-table data from ECC to CRM using middleware. Does someone have a tip or a reference link?
    André

    Hi,
    The following link shows the replication from CRM to ECC. The same can be followed for replication from ECC to CRM.
    Replication of Z table from CRM to R/3 - No mBDoc Created
    Regards,
    Susanta
