CRM & BW Data Load

Hi
We have activated 0CRM_SRV_PROCESS_H, and when we run RSA3 in the CRM box we get records (approx. 1,000). But when we execute the InfoPackage for an Initial or Full
load in the BW box, we get zero records.
Regards,
LR

Hi Lakshman,
Is the DataSource a Business Content object or a custom one? If it is a content object, I believe it has something to do with the update rules. There may be a start routine that drops the records in the process. Do you have any start routines in your update rules? If you do, then to confirm that the problem lies with the routines, comment out the whole routine and try extracting again. If you then get records, you can confirm the routines are at fault and start working on them. Keep us posted on your progress.
cheers,
Mav.

Similar Messages

  • What are the CRM transaction data DataSources and the data load procedure

    Hi BI Gurus,
    Could anybody provide the names of the CRM transaction data DataSources and the procedure for loading them into BI 7.0?
    I already know the master data load procedure from CRM to BI 7.0.
    Step-by-step documents would be even more helpful.
    Thanks in Advance,
    Venkat

    Hi Venkat
    To find the transaction DataSources you want, log in to the CRM system and use transaction RSA6. There you can expand all the subtrees by clicking on the first line and then clicking the Expand button. After that you can easily search for any DataSource you need.
    Hope that helps
    Rgds
    John

  • How to improve performance for bulk data load in Dynamics CRM 2013 Online

    Hi all,
    We need to bulk update (or create) contacts in Dynamics CRM 2013 Online every night, due to data updated from another external data source. The data size is around 100,000 records and the load currently takes around 6 hours.
    We are already using the ExecuteMultiple web service to handle the integration; however, the 6-hour duration is still not acceptable and we are seeking advice for further improvement.
    Any help is highly appreciated.  Many thanks.
    Gary

    I think Andrii's referring to running multiple threads in parallel (see
    http://www.mscrmuk.blogspot.co.uk/2012/02/data-migration-performance-to-crm.html - it's a bit dated, but should still be relevant).
    Microsoft do have some throttling limits applied in Crm Online, and it is worth contacting them to see if you can get those raised.
    100,000 records per night seems a large number. Are all of these records new or updated, or are some of them unchanged, in which case you could filter them out before uploading? Or are there useful ways to summarise the data before loading?
    Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk
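    The batching-plus-parallelism pattern suggested above can be sketched as follows. This is a minimal illustration only, not the Dynamics SDK: submit_batch() is a hypothetical stand-in for whatever issues one ExecuteMultiple request in your integration, and the batch size and worker count are assumptions to tune against CRM Online's throttling limits.

```python
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 1000  # assumed per-call cap for one ExecuteMultiple request

def make_batches(records, size=BATCH_SIZE):
    # Split the nightly record list into ExecuteMultiple-sized chunks.
    return [records[i:i + size] for i in range(0, len(records), size)]

def submit_batch(batch):
    # Hypothetical: in a real integration this would send one
    # ExecuteMultiple request containing an update/create per record.
    return len(batch)

def load_in_parallel(records, workers=4):
    # Keep several batches in flight at once; raising `workers` only
    # helps until the service's throttling limits kick in.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(submit_batch, make_batches(records)))
```

    With 100,000 records and 1,000-record batches this produces 100 requests; running even a few of them in parallel can cut the wall-clock time substantially compared with submitting them one after another.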

  • Need help with initial data loading from ISU into CRM

    One of our clients has the following requirement:
    All applicable ISU data (BP, BA, the appropriate technical data, contracts, products, product configuration and their corresponding price keys and price amounts) should be loaded into CRM as part of the initial load.
    There is ECRM_GENERATE_EVERH in ISU, but its documentation is not available.
    Is there any provision in SAP for this, such as a report, RFC, or function module?
    I would appreciate a quick reply with an appropriate solution.

    Got my answer. We can clear the data using MDX.

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some input on master data loading for prices and conditions in CRM?
    The transaction code is /SAPCND/GCM.
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: for this transaction, recording does not work.
    I am now trying to load through IDocs (LSMW), but that is not really working either.
    Do we require some custom development for this, or is some standard SAP functionality available?
    Any valuable input on this would be appreciated.

    Hi Tiest,
    Thanks for responding.
    You are right; our client is upgrading from 4.6 to ECC.
    As per the client's requirements, we are maintaining all the configuration for Services in CRM.
    The Services data that was in 4.6 is being pulled out into flat files, which need to be loaded into CRM, so the middleware cannot do this.
    What I am looking for is a standard upload program.
    LSMW recording does not work.
    With the IDoc CRMXIF_COND_REC_SLIM_SAVE_M I am able to load a single record, but I cannot find out how to make this work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a common field for which I need to define both a field name and a field value.
    So far I am only able to define one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    Has anyone else encountered this behaviour with Oracle Data Loader, where duplicate records are created (even with the option duplicatecheckoption=externalid)? When I check the request parameters of the job in the "Import Request Queue" view, they look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    Yet Data Loader has created new records where the External Unique ID already exists.
    Very strangely, when I create exactly the same import manually (using the Import Wizard), it works correctly: the duplicate check works and the record is updated.
    I know the Data Loader has two methods, one for update and one for import; however, I would not expect the import to create duplicates when the record already exists, rather than doing nothing.
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the Update method works fine.
    thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete
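    Given that Data Loader's insert path never checks for duplicates, a practical workaround is to filter the import file on the client side before submitting it. A minimal sketch, assuming a CSV import file and a hypothetical External_Unique_ID column name; the set of existing IDs would come from a prior export or web-service query:

```python
import csv

def filter_new_records(import_file, existing_ids):
    # Keep only rows whose External Unique ID is not already in
    # CRM On Demand; Data Loader inserts would otherwise duplicate them.
    new_rows = []
    with open(import_file, newline="") as f:
        for row in csv.DictReader(f):
            if row["External_Unique_ID"] not in existing_ids:
                new_rows.append(row)
    return new_rows
```

    The filtered rows can then be written back out as the file actually handed to Data Loader's insert operation.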

  • Best Practice - Data load

    Hi,
    What is the best practice to migrate master data to a standalone CRM?
    any advice will be highly appreciated.
    Cheers
    Guest

    Hi,
    Please read the following threads carefully; they should make the best method clear.
    Initial Load on standalone CRM
    Inital load on standalone CRM
    CRM Master Data
    Regards,
    Paul Kondaveeti

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning and performs an insert operation.
    Every morning a file with new insert data is made available in the same location (generated by someone else) and with the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records in it (there were 3). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I have since learned that this option works only for update operations.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to mark the field as unique in the UI so that inserting a duplicate record raises an error? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir
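    The cursor above deduplicates rows already in the database, but it does not guard against the original failure mode: the upstream job failing to produce a fresh file. A simple client-side guard, sketched here with an assumed 20-hour freshness window and a hypothetical ".processed" naming convention, is to refuse stale files and rename each file after a successful load:

```python
import os
import time

MAX_AGE_HOURS = 20  # assumed: anything older than one load cycle is stale

def file_is_fresh(path, max_age_hours=MAX_AGE_HOURS):
    # Refuse to process an import file left over from a previous run.
    age_seconds = time.time() - os.path.getmtime(path)
    return age_seconds < max_age_hours * 3600

def archive_after_load(path):
    # Rename the processed file so a failed upstream job the next night
    # cannot cause the same records to be inserted twice.
    os.rename(path, path + ".processed")
```

    The morning script would then call file_is_fresh() before handing the file to Data Loader, and archive_after_load() only after a successful run.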

  • Data loads running even after system is down?

    All,
    I have a data load from the CRM system to the BI system, currently loading to the PSA. While the data load was in process, the BI system went down for recycling, yet the loads kept running. Is this because the loads only go as far as the PSA? And to get my basics right: are PSA tables on the source system? My understanding was that they have the same structures as in the source system but reside on the BI system.
    Inputs appreciated.
    Thanks!

    Hi,
    Please check the job names in the R/3 system and see whether they are active. If they are active, the job is still running. If they are not active, or were cancelled, change the QM status for those loads and re-run the InfoPackage; it will work.
    Regards
    Srinivas

  • Sample SOAP request for Data Loader API

    Hi
    Can anyone please help me with a sample SOAP request for the Data Loader API? The goal is to import about 1K records from my system into my CRM instance.

    Log in to the application and then click on Training and Support; there is a Web Services Library of information within the application.

  • Oracle Data Loader

    Hi guys!
    I'm planning to import a file with about 400k records using Data Loader (insert function).
    I previously did this operation with web services and it took about 7 hours; with web services I import about 20k records at a time.
    Does anyone know whether the time will improve if I use Data Loader?
    Another question: do you know how Data Loader imports a file (does it split up the records, how many records at a time, parallel import, etc.)?
    Thanks in advance,
    Rafael Feldberg

    Rafael, I would recommend clicking on the Training and Support link in the upper right of your CRM On Demand application, then click on Browse Training, then click on Training Resources by Job Role, then click on Administrator and look for the following:
    Data Loader FAQ
    Data Loader Overview for R17
    Data Loader User Guide
    If you are successful using web services I would stick with that method.

  • CRM - BI Data Extraction Problem

    Hi Gurus,
    We have CRM 2007 and BI 7.0. We have activated all the DataSources, replicated them in BW, and activated all the necessary InfoCubes, transfer rules, etc.
    In CRM the data is available in the service order header table, but when I check RSA3 for the 0CRM_SRV_PROCESS_H DataSource, it displays 0 records.
    Hence I am unable to load the data into BI.
    When I check the master DataSources in RSA3, however, the data is available, but I am still unable to extract it to BW.
    It would be a great help if someone could tell me whether we have missed any necessary settings or configuration in the CRM or BI system.
    Thanks & Regards.

    Hi,
    There is one additional step you have to perform on the CRM side before replicating the DataSource: you have to enable the metadata; only then can you extract data from CRM.
    Follow these steps:
    Go to SBIW in CRM and choose Metadata Adapter in the tree. Clicking it populates the list of DataSources; find your DataSource, select the checkbox beside it, and save.
    Then replicate your DataSource again and activate the transfer rules using the program RS_TRNSTRU_ACTIVATE_ALL.
    Now try the load; it should work.
    Cheers,
    Upendra.

  • Memory dump in contract data load

    Hi,
    We are doing a CRM service contract data load from a flat file into CRM using
    BAPI_BUSPROCESSND_CREATEMULTI, and it is giving an ABAP memory dump. Please
    see the attached file for more details about the dump.
    We perform the following steps, calling the function modules below in a loop to create the service contracts:
    1. Call 'BAPI_BUSPROCESSND_CREATEMULTI':
       call function 'BAPI_BUSPROCESSND_CREATEMULTI'
         tables
           header           = lt_header
           item             = lt_item
           sales            = lt_sales
           partner          = lt_partner
           appointment      = lt_appointment
           status           = lt_status
           input_fields     = lt_input_fields
           created_process  = lt_created_process
           return           = lt_return
           scheduleline     = lt_scheduleline
           pricing          = lt_pricing
           billing          = lt_billing
           objects          = lt_objects
           condition_create = lt_condition_create
           billplan         = lt_billplan
           billplan_date    = lt_billplan_date
           cancel           = lt_cancel
           document_flow    = lt_docflow.
    2. Call 'CRM_ORDER_SAVE':
       call function 'CRM_ORDER_SAVE'
         exporting
           it_objects_to_save = lt_objects_to_save
           iv_no_bdoc_send    = 'X'
         exceptions
           document_not_saved = 1
           others             = 2.
    3. Call 'BAPI_TRANSACTION_COMMIT':
       call function 'BAPI_TRANSACTION_COMMIT'
         exporting
           wait = 'X'.
    4. Call 'CRM_ORDER_INITIALIZE':
       call function 'CRM_ORDER_INITIALIZE'
         exporting
           it_guids_to_init           = lt_guid_16
           iv_initialize_whole_buffer = 'X'
           iv_init_frame_log          = 'X'
         exceptions
           error_occurred             = 1
           others                     = 2.
    5. Close the loop.
    Although we call 'CRM_ORDER_INITIALIZE' every time, we still get the
    following dump. It looks like 'CRM_ORDER_INITIALIZE' is not actually
    clearing the memory.
    Dump details
    STORAGE_PARAMETERS_WRONG_SET
    &INCLUDE INCL_INSTALLATION_ERROR
    What happened?
    The current program had to be terminated because of an
    error when installing the R/3 System.
    The program had already requested 256701152 bytes from the operating
    system with 'malloc' when the operating system reported after a
    further memory request that there was no more memory space
    available.
    Could you please suggest how to clear the memory?
    -Kavitha

    Hi Kavitha,
    What is the approximate number of records you are trying to load? You can follow the sequence below for loading the data.
    1. CALL FUNCTION 'BAPI_BUSPROCESSND_CREATEMULTI'
    2. CALL FUNCTION 'BAPI_BUSPROCESSND_SAVE'
    Example:
    call function 'BAPI_BUSPROCESSND_SAVE'
      exporting
        update_task_local = false
        save_frame_log    = true
      importing
        log_handle        = lv_loghandle
      tables
        objects_to_save   = lt_objects_to_save
        saved_objects     = lt_saved_objects
        return            = lt_return.
    3. CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
    I think this will work without problems. Let me know whether you still get the ABAP dump.
    Regards
    Abinash

  • Reg "Allow Bulk Data Load"

    Hi all,
    Good morning.
    What exactly does the "Allow Bulk Data Load" option on the Company Profile page do? The documentation says it allows CRM On Demand consultants to load bulk data, but I am not clear on how they load it. Do they use any tools other than those an administrator uses for data uploading?
    A real implementation example using this option would be appreciated.
    Regards,
    Sreekanth.

    The Bulk Data Load utility is similar to the Import utility and can be used by On Demand Professional Services for imports. It is accessed from a separate URL; once a company has enabled "Allow Bulk Data Load", Professional Services can use the utility to import that company's data.
    The Bulk Data Load utility imports data in much the same way as the Import utility, the difference being that the number of records per import is higher and multiple import jobs can be queued.

  • Data Load Error  due to Master data deletion

    Hi,
    While doing the transactional data load, I am getting the following error:
    "Master data/text of characteristic ZFOCUSGRP already deleted" (message no. RSDMD138)
    ZFOCUSGRP is an InfoObject (with text). Last week we changed the source system from CRM to R/3, and at that time we deleted all the texts of ZFOCUSGRP manually from the table.
    This error does not happen every time; sometimes the load runs properly. I have run RSRV for the InfoObject ZFOCUSGRP and the InfoCube, but the error still occurs.
    Is there any way to fix this error?
    Thanks in advance.
    Thanks
    Vinod

    Check these threads:
    Re: Error while running InfoPackage
    Master data/text of characteristic 0MATERIAL already deleted
    Master data/text of characteristic ZXVY already deleted
    Hope it helps..
