DSD: Transferring Master Data to Handheld

Hi Folks,
I'm pretty new to this subject, and I'm wondering whether it's possible to synchronize only master data to the handheld for testing. We have configured everything (RFCs, logical systems, IDoc configuration, IDoc message types, etc.), the client is installed correctly, and synchronization with the MI Server is OK.
What I'm doing is using transaction /DSD/HH_DRIV to send driver information, for example, and transfer it to the DSD Connector. From here, what should I do next so that this information is transferred to our handheld?
We are using MDSD 3.0.
Thanks in advance for your help.
Regards,
Gilberto Li

Hi Gilberto,
I am also configuring DSD in IDES, and I am a bit stuck on the architecture front.
The configuration guide I am referring to lists the steps to follow.
I have configured the DSD backend and the DSD Connector as two logical systems.
The next step is to assign a client to these logical systems.
Now, do we assign a client to the DSD backend only,
or to both the DSD backend and the DSD Connector? Please advise.
For example, I am in client 100.
Do we need to create two clients, 110 and 120,
and assign client 110 to the DSD backend and client 120 to the DSD Connector?
Thanks for your help.
Thanks & Regards,
Vishwas

Similar Messages

  • About transferring master data to mobile client site on extract....

    Hi everyone,
    I want to flow the master data (i.e. country, region, title-specific data that is not user specific, etc.)
    from CRM to a mobile client for a particular site using SMOEAC.
    What happened is: the first time, when I did a whole ConnTrans, I got 8000 messages for a particular mobile client site.
    After 2 months, when I configured a new machine for the same client and did a complete extract using SMOEAC, the master data did not flow (there were only around 3000 messages).
    This means that after doing a complete extract on the mobile site, the complete data is not flowing.
    Please advise if you have come across something like this.
    regards,
    Dhanraj.

    Hi,
    you can find the load objects by searching for the tables they contain in tables SMOFMAPTAB and SMOFTABLES.
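    For illustration, a minimal sketch of that lookup (the OBJNAME/R3TABNAME fields come from the report below; the report name and the example table KNA1 are my own):

    REPORT zfind_load_object.
    * List the load object(s) that contain a given R/3 table.
    PARAMETERS: p_tab LIKE smoftables-r3tabname DEFAULT 'KNA1'.
    DATA: lt_tab TYPE STANDARD TABLE OF smoftables,
          ls_tab TYPE smoftables.
    SELECT * FROM smoftables INTO TABLE lt_tab
      WHERE r3tabname EQ p_tab.
    LOOP AT lt_tab INTO ls_tab.
      WRITE: / ls_tab-objname, ls_tab-r3tabname.
    ENDLOOP.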
    Another method is extracting all the info out of the CRM system. Try implementing the following program in your CRM system. It extracts all data of the objects into an XML file and creates an accompanying CSS file to view the data. Open the XML file in a browser and search for your tables in it.
    *& Report  ZMVGTEST12                                                  *
    *& Written by Michael Van Geet                                         *
    *& The report extracts all object data to an XML file                  *
    REPORT  zmvgtest12                              .
    TYPES: BEGIN OF t_objects,
             objname    LIKE smofobject-objname,
             objclass   LIKE smofobject-objclass,
             text       LIKE smofobjctt-text,
             r3tabname  LIKE smoftables-r3tabname,
             sfatabname LIKE smofmaptab-sfatabname,
           END OF t_objects,
           BEGIN OF t_systems,
             objname    LIKE smofobject-objname,
             sourcestid LIKE smofinicon-sourcestid,
             targetstid LIKE smofinicon-targetstid,
           END OF t_systems,
           BEGIN OF t_data,
             data(500),
           END OF t_data.
    DATA: it_objects TYPE STANDARD TABLE OF t_objects,
          is_objects TYPE t_objects,
          is2_objects TYPE t_objects,
          it_systems TYPE STANDARD TABLE OF t_systems,
          is_systems TYPE t_systems,
          it_data TYPE STANDARD TABLE OF t_data,
          is_data TYPE t_data,
          h_lines TYPE i,
          h_file  TYPE string.
    RANGES: r_objclass FOR smofobject-objclass.
    SELECTION-SCREEN BEGIN OF BLOCK blok1 WITH FRAME TITLE text-001.
    PARAMETERS: p_obj RADIOBUTTON GROUP grp1,
                p_cus RADIOBUTTON GROUP grp1,
                p_con RADIOBUTTON GROUP grp1.
    SELECTION-SCREEN END OF BLOCK blok1.
    SELECTION-SCREEN BEGIN OF BLOCK blok2 WITH FRAME.
    PARAMETERS: p_extr AS CHECKBOX DEFAULT 'X',
                p_file(50) DEFAULT 'c:\temp\'.
    SELECTION-SCREEN END OF BLOCK blok2.
    IF NOT p_obj IS INITIAL.
      REFRESH r_objclass.
      r_objclass-sign = 'I'.
      r_objclass-option = 'NE'.
      r_objclass-low = 'CUSTOMIZING'.
      APPEND r_objclass.
      r_objclass-low = 'CONDCUSTOMIZING'.
      APPEND r_objclass.
      r_objclass-low = 'CONDITIONS'.
      APPEND r_objclass.
    ELSEIF NOT p_cus IS INITIAL.
      REFRESH r_objclass.
      r_objclass-sign = 'I'.
      r_objclass-option = 'EQ'.
      r_objclass-low = 'CUSTOMIZING'.
      APPEND r_objclass.
    ELSE.
      REFRESH r_objclass.
      r_objclass-sign = 'I'.
      r_objclass-option = 'EQ'.
      r_objclass-low = 'CONDCUSTOMIZING'.
      APPEND r_objclass.
      r_objclass-low = 'CONDITIONS'.
      APPEND r_objclass.
    ENDIF.
    * read all objects with their texts, tables and table mappings
    SELECT ob~objname ob~objclass obt~text tab~r3tabname map~sfatabname
                FROM ( ( smofobject AS ob JOIN smofobjctt AS obt
                          ON obt~objname EQ ob~objname )
                     JOIN smoftables AS tab ON tab~objname EQ ob~objname )
                     JOIN smofmaptab AS map ON map~objname EQ ob~objname
                      AND map~r3tabname EQ tab~r3tabname
                INTO TABLE it_objects
                WHERE ob~objclass IN r_objclass
                  AND obt~spras   EQ 'E'.
    * read the source/target systems per object
    * (alias "ini" used instead of "in", which is a reserved word)
    SELECT ob~objname ini~sourcestid ini~targetstid
            INTO TABLE it_systems
            FROM smofobject AS ob JOIN smofinicon AS ini
                 ON ini~objname EQ ob~objname
            WHERE ob~objclass IN r_objclass.
    PERFORM xlmdoc  TABLES it_data
                    USING 'open'.
    LOOP AT it_objects INTO is_objects.
      CLEAR is2_objects.
      is2_objects = is_objects.
      AT NEW objname.
        WRITE:/ is2_objects-objname COLOR 1,
                is2_objects-objclass COLOR 1,
                is2_objects-text COLOR 1.
        PERFORM open_objects TABLES it_data
                             USING is2_objects.
        WRITE:/30 'SYSTEMS:'.
        PERFORM xlmdoc  TABLES it_data
                        USING '<SYSTEMMAPPING>'.
        LOOP AT it_systems INTO is_systems
             WHERE objname EQ is_objects-objname.
          WRITE:/40 is_systems-sourcestid,
                    is_systems-targetstid.
          PERFORM add_system TABLES it_data
                             USING is_systems.
        ENDLOOP.
        PERFORM xlmdoc  TABLES it_data
                        USING '</SYSTEMMAPPING>'.
        PERFORM xlmdoc  TABLES it_data
                        USING '<TABLEMAPPING>'.
        WRITE:/30 'TABLES:'.
      ENDAT.
      WRITE:/40 is2_objects-r3tabname COLOR 5,
                is2_objects-sfatabname COLOR 5.
      PERFORM add_table TABLES it_data
                        USING is2_objects.
      AT END OF objname.
        PERFORM xlmdoc  TABLES it_data
                        USING '</TABLEMAPPING>'.
        PERFORM xlmdoc  TABLES it_data
                        USING '</OBJECT>'.
      ENDAT.
    ENDLOOP.
    PERFORM xlmdoc  TABLES it_data
                    USING '</OBJECT_OVERVIEW>'.
    IF NOT p_extr IS INITIAL.
      CLEAR h_file.
      CONCATENATE p_file 'objects.xml' INTO h_file.
      CALL METHOD cl_gui_frontend_services=>gui_download
        EXPORTING
          filename = h_file
        CHANGING
          data_tab = it_data.
      PERFORM cssdoc  TABLES it_data.
      CLEAR h_file.
      CONCATENATE p_file 'xml.css' INTO h_file.
      CALL METHOD cl_gui_frontend_services=>gui_download
        EXPORTING
          filename = h_file
        CHANGING
          data_tab = it_data.
    ENDIF.
    *&      Form  open_objects
    *       text
    *      -->P_IS2_OBJECTS  text
    *      -->P_IT_DATA  text
    FORM open_objects  TABLES   p_data
                       USING    p_object STRUCTURE is_objects.
      CLEAR is_data.
      is_data-data = '<OBJECT>'.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<TITLE>' p_object-objname '</TITLE>' INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<DESCRIPTION>' p_object-text '</DESCRIPTION>' INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<CLASS>' p_object-objclass '</CLASS>' INTO is_data-data.
      APPEND is_data TO p_data.
    ENDFORM.                    " open_objects
    *&      Form  add_system
    *       text
    *      -->P_P_DATA  text
    *      -->P_IS_SYSTEMS  text
    FORM add_system  TABLES   p_data
                     USING    p_systems STRUCTURE is_systems.
      CLEAR is_data.
      is_data-data = '<SYSTEM>'.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<SOURCESYSTEM>' p_systems-sourcestid '</SOURCESYSTEM>'
                 INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<TARGETSYSTEM>' p_systems-targetstid '</TARGETSYSTEM>'
                  INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      is_data-data = '</SYSTEM>'.
      APPEND is_data TO p_data.
    ENDFORM.                    " add_system
    *&      Form  add_table
    FORM add_table  TABLES   p_data
                    USING    p_objects STRUCTURE is_objects.
      CLEAR is_data.
      is_data-data = '<TABLE>'.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<SOURCETABLE>' p_objects-r3tabname '</SOURCETABLE>'
                  INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      CONCATENATE '<TARGETTABLE>' p_objects-sfatabname '</TARGETTABLE>'
                  INTO is_data-data.
      APPEND is_data TO p_data.
      CLEAR is_data.
      is_data-data = '</TABLE>'.
      APPEND is_data TO p_data.
    ENDFORM.                    " add_table
    *&      Form  xlmdoc
    FORM xlmdoc  TABLES   p_data
                 USING    p_action.
      CASE p_action.
        WHEN 'open'.
          CLEAR is_data.
          is_data-data = '<?xml version="1.0"?>'.
          APPEND is_data TO p_data.
          CLEAR is_data.
          is_data-data = '<?xml-stylesheet href="xml.css" type="text/css" ?>'.
          APPEND is_data TO p_data.
          CLEAR is_data.
          is_data-data = '<OBJECT_OVERVIEW>'.
          APPEND is_data TO p_data.
        WHEN OTHERS.
          CLEAR is_data.
          is_data-data = p_action.
          APPEND is_data TO p_data.
      ENDCASE.
    ENDFORM.                    " xlmdoc
    *&      Form  cssdoc
    *       text
    *      -->P_IT_DATA  text
    FORM cssdoc  TABLES   p_data.
      REFRESH p_data.
      is_data-data = 'OBJECT'.
      APPEND is_data TO p_data.
      is_data-data = 'TITLE{text-decoration: underline;font: bold; width: 180px; float: left; }'.
      APPEND is_data TO p_data.
      is_data-data = 'DESCRIPTION{margin-left: 2px; width: 150px;float: left;}'.
      APPEND is_data TO p_data.
      is_data-data = 'CLASS'.
      APPEND is_data TO p_data.
      is_data-data = 'SYSTEMMAPPING{background-color: #BBB;margin-left: 2px; border: inset;  width:200px; float: left;padding: 2px;font-size:0.7em; }'.
      APPEND is_data TO p_data.
      is_data-data = 'SYSTEM'.
      APPEND is_data TO p_data.
      is_data-data = 'SOURCESYSTEM'.
      APPEND is_data TO p_data.
      is_data-data = 'TARGETSYSTEM'.
      APPEND is_data TO p_data.
      is_data-data = 'TABLEMAPPING{background-color: #AAA;margin-left: 2px; border: inset;width: 370px;float: left;padding: 2px;font-size:0.7em;  }'.
      APPEND is_data TO p_data.
      is_data-data = 'TABLE'.
      APPEND is_data TO p_data.
      is_data-data = 'SOURCETABLE'.
      APPEND is_data TO p_data.
      is_data-data = 'TARGETTABLE'.
      APPEND is_data TO p_data.
    ENDFORM.                    " cssdoc
    Michael.

  • What are the settings for master data and transaction data from R/3 to APO?

    Hi all,
    Can you advise me? I need to confirm the settings in APO when transferring master data and transaction data from R/3 to APO.
    From,
    Babu

    Hi
    The data gets transferred from R/3 to APO via the CIF, which is SAP standard (integration models are created with transaction CFM1 and activated with CFM2).
    Please find below a link that provides detailed information about this.
    http://help.sap.com/saphelp_scm41/helpdata/en/9b/954d3baf755b67e10000000a114084/frameset.htm
    Please let us know if it helps you, and also if you have any more specific questions.
    Thanks
    Amol

  • Master Data Distribution!

    Hi!
    I want to know the purpose of master data distribution for the following between the vendor and the customer:
    1. Material master
    2. Vendor master and customer master
    What's the purpose of linking our system with our vendor's or customer's system, etc., with regard to master data?
    Please explain in detail.
      Thanks
      Rahul.

    Hi Rahul,
    We don't do master data distribution with a customer's or vendor's system.
    Master data distribution is done between distributed systems of the same organization using ALE configuration. So we don't link to customer or vendor systems for transferring master data, but rather for transferring transactional data like purchase orders or sales orders.
    Master Data Distribution
    Rather than distributing the complete master data information, views of the master data can be distributed (for example, material sales data, material purchase data). Each view of the master data is stored in a separate message type.
    Users can specify which data elements in a master record are to be distributed.
    Various distribution strategies are supported:
    ·        Cross-system master data can be maintained centrally and then distributed. The final values are assigned locally.
    ·        A view of the master data can be maintained locally. In this case there is always one maintenance system for each view. After the master data has been maintained it is transferred to a central SAP system and distributed from there.
    Types of Distribution
    ·        Active distribution (PUSH)
    If the master data is changed (for example, new data, changes or deletions), a master data IDoc is created in the original system and is distributed by class as specified in the distribution model.
    ·        Requests (PULL)
    A request occurs when a client system needs information about master data held in the system. You can select specific information about the master data, for example, the plant data for a material.
    If you want to be notified of all subsequent changes to the master data, this has to be set up “manually” between the two systems. It is not yet possible for this to be done automatically in the distribution mechanism in the original system.
    Transferring the Master Data
    A distinction is made between transferring the entire master data and transferring only changes to the master data.
    If the entire master data is transferred, a master IDoc is created for the object to be distributed in response to a direct request from a system. Only the data that was actually requested is read and then sent. The customer specifies the size of the object to be distributed in a request report.
    If only changes are transferred, the master IDoc is created based on the logged changes.
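    As a hedged illustration of the changes-only case: logged change pointers are usually processed into master IDocs by the standard report RBDMIDOC (the same thing transaction BD21 does interactively); treat the parameter name MESTYP below as an assumption.

    * Create master IDocs from logged change pointers for the material
    * master message type (report/parameter names as assumed above).
    SUBMIT rbdmidoc
      WITH mestyp = 'MATMAS'
      AND RETURN.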
    Reward points for the useful answers,
    Aleem.

  • Error while performing master data transfer from SAP R/3 ECC 5.0 to SCM 4.1

    Hi all,
    I am getting an error while transferring master data from SAP R/3 (ECC 5.0) to SCM 4.1 through the Core Interface (CIF).
    After creating an integration model and activating it, I am getting the following error:
    " System: SCMCLNT800    User:  SMART03 06/17/2006 13:32:14
    Function: /SAPAPO/CIF_LOC_INBOUND
    Text:Error in address data for location DC2000/ECC800 "
    Could anyone tell me how to fix this?
    Regards
    Bharanidharan

    Hi Veera,
    Sorry for the delayed response. I am using XI as the integration engine for the transfer of transactional data from R/3 to SCM 4.1. I resolved this issue; it was caused by a master data problem. Thanks for your input.
    Regards
    Bharanidharan

  • Need to find out the number of master data records transferred to BW

    Hi,
    We need to find out the number of master data records (for example, 0MAT_PLANT_ATTR) to be transferred to the BW side. This is a delta extract. We are preparing test scripts to check the master data extract (full & delta) from ECC 6 to BI 7.0.
    Advance Thanks.

    Hi,
    Go to RSA3 and run that master data extractor in D mode if you want to know the number of records in the delta, and in F mode if you want to know the full volume. But make sure that you set the "data records/call" and "display extr. calls" numbers high enough, so that you get the total number of records.
    Another option is to go to the master data source table and look at the number of records in that table. That will give you an idea of the total number of records in the table.
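    As a rough sketch of that second option (assuming MARC, the plant data for material, is the source table behind 0MAT_PLANT_ATTR - verify this against your extractor first):

    * Count the records in the assumed source table of 0MAT_PLANT_ATTR.
    DATA: lv_count TYPE i.
    SELECT COUNT(*) FROM marc INTO lv_count.
    WRITE: / 'Records in MARC:', lv_count.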
    One more option is to go to RSA7, select the delta datasource and hit the display button. You'll get to know the number of entries in the delta queue that way as well.
    Cheers,
    Kedar

  • How to delete Master Data transferred from ECC to APO via CIF

    Hi, all
    I'm connecting SCM 5.1 with ECC 6.0 via CIF.
    When activating the integration model in ECC, the following error occurred:
       "Location 0001 of type 1001 (BSG 592BUSGP) already exists in APO"
    This is because Location 0001 was already transferred from different ECC client via CIF, and not cleared.
    I'm now trying to initialise master data transferred from ECC to APO via CIF, but in vain.
    I suppose I first have to delete all master data CIFed from ECC in APO, or execute an initialisation of the APO master data with ECC data in ECC.
    Please tell me how to do this.
    For details, please read the followings;
    I connected APO client with ECC client 590.
    I transferred Plant and Material Master from ECC to APO via CIF.
    After that, I wanted to change ECC client from 590 to the other.
    I created the other client (592) by client-copying from 590 with SAP_ALL copy profile.
    I deactivated Integration Model in ECC 590, and activated the same in ECC 592.
    Here, I faced the problem.
    The initial data transfer failed due to duplicate data.
    Now I'm wondering how I can initialise master data in APO and transfer initial data from ECC.
    For testing purposes, I have to change ECC clients from one to the other this time, but I cannot initialise master data in APO and cannot transfer initial data from a different ECC client.
    thanks in advance,
    Hozy

    Hi all,
    Thank you very much indeed for all of your replies.
    Thanks to all of your advice, I'm getting to understand what to do.
    Well, I'll say I have to create one more client for testing with the other ECC clients, or rebuild the client by client-copying from 001 and importing all transport requests, since setting deletion flags on each master data record and running the deletion program sounds like tough work, and rather odd.
    Now, I've got the next issue.
    In SCM-APO I use BI objects, and if I have to create several clients for testing, I have to set up clients where both BI objects and CIFed master data can be used.
    I suppose I have to change 'BWMANDT' in RSADMINA, make RSA1 accessible in the other SCM clients, and run the tests.
    Also, I have to register a BI system for the newly created SCM clients in RSA1 - Source Systems.
    Well, if you have any experience with this kind of setup, please give me any tips.
    In particular, I'd appreciate it very much if you could tell me which way you have chosen, creating several clients or rebuilding the one client in SCM-APO, when you use BI objects, CIFed master data and Demand Planning together.
    Thanks in advance.
    Hozy

  • Transferring business partner master data from R/3 to Business One

    hi all,
    does anyone have an idea how to transfer business partner master data from SAP R/3 to SAP Business One?
    Please tell me in detail.
    regards
    nirdesh panwar

    Hi Nirdesh,
    The easiest way is to ask SAP R/3 for an export file of some sort. This is usually an XML or text-type file. You then pick this file up where they have put it and import it. You can import it into SAP Business One with your own application (VB, C#, etc.) that you write using SAP Business One's DI API.
    Hope it helps,
    Adele

  • Transferring master and transaction data to active and non-active versions

    Hi Gurus,
    I am planning to transfer ECC master and transaction data to APO active and simulation versions. My questions are:
    1. Is it possible to release master and transaction data to active and non-active versions?
    2. How do I need to configure the system if I want to release master and transaction data into version 001?
    Please let me know .
    Thanks in Advance
    Regards,
    Raj

    Hi Raj,
    1) Once you create active integration models for master data and transactional data, the data will get updated in the active version 000.
    2) For creating a non-active version, use transaction /SAPAPO/MVM.
    3) For copying data from the active to a non-active version, use transaction /SAPAPO/VERCOP.
    Regards
    R. Senthil Mareeswaran.

  • Transferring material master data using LSMW

    I am using LSMW to transfer material master data from a text file using direct input program RMDATIND. The problem is that the field WRKST on the screen is 48 characters long, while the field WRKST in batch input structure BMMH1 is only 14 characters. How can we transfer the full 48 characters from the text file to the field BMMH1-WRKST in the direct input program? Can we write a translation or routine? If yes, how is it possible? I could do the same using a BDC, but our client is already using LSMW; only the new field WRKST for the material has to be inserted. It is picking up only 14 characters.
    Regards
    Debopriyo

    Hi
    Kindly check SAP Note 351557; it is applicable to your release.
    If you have any further questions, let me know.
    Regards
    Damu

  • Transferring Product Master Data Changes From APO to ECC/R/3

    Is it possible to transfer product master data changes in APO to ECC via the CIF?  If so, does it require custom coding?

    Hi James,
    In all the different clients where APO is used with ECC systems, the ECC system is always the main system (the system of record): it contains all the master data, and the data present in ECC is the reference for the APO system for planning purposes.
    Generally it is not recommended to transfer master data from APO to ECC, but as mentioned earlier by some experts, there may be ways to do it.
    I would be interested in the business scenario where one needs to transfer master data from APO to ECC.
    Thanks,
    Anupam

  • Performance: reading a huge amount of master data in an end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields of DSO Y. DSO Y contains about 2 million records, which are all transferred each day. The master data tables each contain between 2 and 4 million records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned this and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *all other MD reads
    ENDLOOP.
    So the above statement is repeated for all master data we need to read. This method is quite a bit faster (1.5 hrs), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change this. We have now tried a similar method, but now we load all master data into the internal tables without filtering on the data package, and we do this only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills all master data values, 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definitions of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.    "first package: start filling the tables
    * load all internal tables here
      lv_data_loaded = 'Y'.    "tables are filled
    ENDIF.
    WHILE lv_data_loaded NE 'Y'.
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
    * assign all master data attributes here
    ENDLOOP.
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    Well, this all seems to work: it now takes 10 minutes to load everything into DSO Y. But I'm wondering if I'm missing anything. The system seems to handle loading all these records into internal tables just fine, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are probably seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
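    For reference, a hedged sketch of setting that parameter (my assumptions: the standard RSADMIN maintenance report SAP_RSADMIN_MAINTAIN with OBJECT/VALUE parameters; the buffer size value is only an example):

    * Maintain the RSADMIN parameter via the standard report (assumed
    * parameter names; the value is an arbitrary example).
    SUBMIT sap_rsadmin_maintain
      WITH object = 'MD_LOOKUP_MAX_BUFFER_SIZE'
      WITH value  = '5000000'
      AND RETURN.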
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (1.5 hrs), though I think that it first loops through the source package and extracts the lists of required master data keys, which is probably faster than your statement "FOR ALL ENTRIES IN RESULT_PACKAGE" if RESULT_PACKAGE contains very many duplicate keys.
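    A sketch of that key-extraction idea against the tables from your post (table and field names are taken from your code; the key-table type and the reuse of lt_0ucpremise and <fs_rp> are my assumptions):

    * Collect the lookup keys first, de-duplicate, then select once.
    TYPES: BEGIN OF t_key,
             ucpremise TYPE /bi0/oiucpremise,
           END OF t_key.
    DATA: lt_keys TYPE STANDARD TABLE OF t_key,
          ls_key  TYPE t_key.
    LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
      ls_key-ucpremise = <fs_rp>-ucpremise.
      APPEND ls_key TO lt_keys.
    ENDLOOP.
    SORT lt_keys BY ucpremise.
    DELETE ADJACENT DUPLICATES FROM lt_keys COMPARING ucpremise.
    IF lt_keys IS NOT INITIAL.  "FOR ALL ENTRIES on an empty table reads everything
      SELECT ucpremise ucpremisty ucdele_ind
        FROM /bi0/pucpremise
        INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
        FOR ALL ENTRIES IN lt_keys
        WHERE ucpremise EQ lt_keys-ucpremise.
    ENDIF.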
    I'm guessing you'll get down to at least the 1.5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
    Zephania Wilder wrote:
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variable to persist between packages. Have you checked in the debugger that you are really only filling the table once per request, and not once per package, in your current setup? << This is incorrect - see the next posting for correction.
    Otherwise, the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size your master data tables can reach. On the other hand, if your master data tables grow regularly, then you will eventually run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.

  • Extraction of certain master data not working

    Hi, we are on BI 7 and some of the Business Content master data extractions are not working correctly. Some master data loads work fine, so communication between the systems is working, but some master data fields, e.g. 0CO_AREA, 0CHRT_ACCTS and more, are not working. We get this with both 3.x and new datasources.
    The data does get pulled through, but the upload just never goes green and keeps running. If one looks at the batch jobs (SM37) in the appropriate source system, the job keeps running until it is manually stopped.
    I am only experiencing this problem with data pulled from the production system - I am pulling it through to my testing and production boxes, and it fails in both, whereas all other ECC systems are fine.
    In the monitor there is a warning; the following are the details from the monitor:
    Overall status: Missing Messages or warnings
       Requests (messages):  Everything OK  (green)       Data request arranged  (green)
           Confirmed with:  OK (green)
       Extraction (messages): Missing messages (yellow)
           Data request received (green)
           Data selection scheduled (green)
           Missing message:  Number of sent records (yellow)
           Missing message: Selection completed (yellow)
       Transfer (IDocs and TRFC): Missing messages or warnings (yellow)
           Request IDoc: Application document posted (green)
           Info IDoc 1 : Application document posted (green)
           Data Package 1 : arrived in BW ; Processing : Selected number does not agree with transferred n (yellow)
           Info IDoc 2 : Application document posted (green)
        Processing (data packet) : Everything OK (green)
            Inbound Processing (32 records) : No errors (green)
           Update PSA (32 records posted) : No errors (green)
           Processing end : No errors (green)

    Thanks for the responses. The appropriate IDocs work fine with transaction BD87. I looked in SM58 earlier during the day and there was no error; I just checked again (8 PM) and found some entries with error texts for the BW background user - 6 errors within 10 seconds. Something to do with 'ERROR REQU... PG# IN 1' - I do not have authorization at this moment to check the specific requests, and I am not sure whether the master data extractions caused these.
    In the job log of the problematic jobs, only 1 IDoc gets created, unlike the other successful ones that have 3 IDocs (not sure if this means anything).

  • Master Data transfer from SAP R/3 to SAP GTS

    HI,
    Master data like the customer master is transferred from R/3 to GTS. Any customer master in R/3 has many fields, like company code, sales area, shipping conditions, etc.
    Shipping conditions and other fields are not defined in GTS. So how is it that the customer master copied into SAP GTS does not give any errors, even though certain fields like shipping conditions do not exist in GTS? I mean, there are certain validations in R/3 for a customer, so how are these validations handled in SAP GTS?
    regards

    The customer/vendor fields that are passed to GTS from ECC can be found under /SAPSLL/MENU_LEGAL - SAP Compliance Management (SPL Screening) - Logistics (Simulation of SPL - Check general address). Here you can see all the details captured for a Business Partner in GTS. You can also see the entire field range in transaction /nBP.
    When you transfer a customer master record to GTS, an internal BP number is created in GTS to map the stated fields, based on number range logic etc. as per the requirement. The BPs are created in GTS according to the partner role and can also be retrieved using the partner role. At document level, the validations happen with respect to the partner functions mapped and created as per the feeder system.

  • Modify Reconciliation Account of Customer Master Data

    Hi,
    I need to change the reconciliation account in the master data of some customers.
    I've already changed the field status group of the customer in Customizing, setting the reconciliation account as an optional entry in the 'Company code data' area.
    But that field in the customer master data remains non-editable.
    What am I missing?
    Any assistance would be greatly appreciated.
    Best Regards.
    Eric

    Hi,
    If you do not have postings on the customer account yet, you can change the reconciliation account directly.
    But if data has already been posted, this is the procedure for changing the reconciliation account:
    All document items that were created in the old account will still be posted to that old account when you have payment postings, compensations, etc.
    All documents created after the change will be posted to the new account, as will the related payment postings, compensations and others.
    The system separates the postings according to the moment at which the documents were created.
    You should run the balance sheet adjustment program after any reconciliation account change.
    The system performs any adjustments required due to the change of reconciliation accounts or G/L accounts. The items from the old reconciliation accounts are allocated to the new accounts. 
    Since you cannot post to the reconciliation accounts directly, the postings are made to temporary adjustment accounts. 
    These adjustment accounts should be displayed along with the relevant reconciliation account in the balance sheet. The postings are then reversed after the balance sheet has been created. 
    The program for sorting the payables and receivables makes the necessary adjustments automatically. This means that you have to define the adjustment account numbers and the posting keys for these postings in the system. 
    If you purchase and install the FI-LC Consolidation application and have bought up a previous customer or vendor (thus also taking on his/her payables and receivables), please refer to the note in the report documentation on changed reconciliation accounts. To define the account numbers, select the activity Define adjustment accounts for changed reconciliation accounts in the Accounts Receivable and Accounts Payable Implementation Guide.
    You should only run this program if your new reconciliation account is classified differently from the original in your financial statements, e.g. AR to intercompany accounts. It will just reclassify the existing balance; the line items will not be transferred. If that is not the case, there is no need to run the program at all.
    You can do a test in the development client before you make the change in production.
    Good Luck,
    Raghu
