B1i - R3PROD2B1ITEM: duration for initial load

Hello,
We want to connect R/3 4.6C without XI/NetWeaver, via VPN. B1i will be installed on the subsidiary side, because we are hosting Business One.
Now I would like to know what your experience is with the duration of the initial load. We have about 100,000 items in the R/3 and B1 systems. Maybe somebody can share their figures for 5,000 items, so we can extrapolate our time.
Thanks.
Bührig

Regarding your questions:
1) Has anyone already solved transferring the prices as well for new items (R3PROD2B1ITEM)? I have the impression that this is not covered so far.
I can see in the XSL file of the BIU that the child object of the Item object is accessed and that the StandardAveragePrice property is filled from the IDoc segment E1MBEWM/STPRS; therefore I would think that the price is carried forward.
2) Does the BIU have to be extended for user-defined fields (UDFs) in the item master?
Yes, this has to be done; otherwise the BIU will not know which IDoc segment has to be matched to which UDF. It is very easy, as tags with the names of the UDFs just have to be added to the XSL file (see the sketch below). For a sample of what it will look like, you can access the following document:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/0bf4978f-0e01-0010-2d89-c140ac1177da
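As a minimal sketch of what such an addition could look like: the UDF tag U_MyUDF and the IDoc field E1MARAM/ZZ_MYFIELD below are invented placeholders, while the StandardAveragePrice mapping reflects what the standard BIU presumably already does per point 1. Check the actual BIU XSL for the exact element structure.

<!-- standard mapping: item price taken from the IDoc valuation segment -->
<StandardAveragePrice>
  <xsl:value-of select="E1MBEWM/STPRS"/>
</StandardAveragePrice>

<!-- hypothetical UDF mapping: add one tag per UDF, named after the UDF -->
<U_MyUDF>
  <xsl:value-of select="E1MARAM/ZZ_MYFIELD"/>
</U_MyUDF>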
Best regards,
Felipe

Similar Messages

  • Disable change logs for initial load

    Hi all,
    I need to do an initial load of many service tickets. In order to improve the performance (and only for this initial load) I would like to disable the change logs (to avoid storing information in the tables CDPOS and CDHDR).
    Any idea how to do this?
    Thank you!
    Regards
    Martin

    For the document type, please select the 'No Change Documents' checkbox in customizing. Please try.

  • MDM-BOBJ integration for initial load....

    I have attended some demos of the MDM-BOBJ integration that dealt with the ongoing creation and maintenance of master data, but my question is how this is used for the initial load and enrichment of master data.
    When we used D&B for enrichment, data was enriched/cleansed before the initial load; then, after loading the data to MDM, it was de-duplicated based on the DUNS number.
    How are the initial load and de-duplication handled in the MDM-BOBJ integration? For example, if I have 100 thousand records, should I enrich/cleanse them before loading to MDM or after loading to MDM?
    Any help will be appreciated.

    Mike,
    Yes, BOBJ Data Services has been used extensively to handle the scenario you described here.
    There is a user guide explaining the BOBJ-MDM scenario under 'Integration of SAP Components with MDM':
    [BOBJ DS for Data Enrichment|https://websmp208.sap-ag.de/installMDM71]
    Hope this helps as a jump start.
    Thanks
    Alexander

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices, and the factors to consider, when performing an initial load.
    For example,
    1) requirement from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    check these links:
    Master Data loading
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best Practice for Initial Load

    Hello,
    What is the best way of doing the initial load? Is there a best practice somewhere that tells you what should be imported first?
    I want to understand the order, e.g.:
    1. Load lookups,
    2. Hierarchies,
    3. Taxonomy and attributes,
    last, the main table,
    etc...
    I don't understand the logic.
    Thanks in advance

    Hi Ario,
    You can follow the SAP standard business content for any of the MDM repositories, e.g. Material:
    https://websmp130.sap-ag.de/sap/support/notes/1355137
    In the SAP Note attachments, you will get MDM71_Material_Content.pdf
    You will see that the import of reference data (lookup table data) comes first (step 6), before the import of master data (step 7).
    During the import of reference data (lookup data), please follow the import sequence by using processing levels 0, 1, 2, etc.
    This takes care of filling the flat lookup tables first, then the hierarchy tables, and so on.
    After that, if you are maintaining a taxonomy, you need to fill the taxonomy table in Taxonomy mode of the Data Manager, in this sequence: categories, attributes, linkage between attributes and categories, and lastly attribute values.
    After this, i.e. after populating the reference data, you need to populate the main table records along with the tuple table data, since in MDM 7.1 the qualified table has been replaced by tuples for most of the masters. If you are still maintaining a qualified table, you can import the main table data together with the qualified table in a single step. Alternatively, you can populate the non-qualifiers of the qualified table first, before importing the main table, and then import the main table data along with the qualifier fields of the qualified table.
    The entire process above is for moving data from an SAP R/3 system to MDM. If you are importing data into MDM from a legacy (non-SAP) system, the approach remains the same: populate the lookup table data first and the main table data last.
    Regarding "I don't understand the logic":
    The logic is simple: in your main table you have fields that are lookups into reference tables (e.g. fields that look up into flat lookup tables like Countries and Currencies, or a field that looks up into the hierarchy/taxonomy table). If these values are not populated first, then during the main table import you will end up with incomplete data in all of those lookup fields, because the values they refer to did not exist in the lookup tables before the main table import.
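    To express the same ordering logic as a tiny generic sketch (this is not an MDM API; the table names and dependencies are invented purely to illustrate why referenced tables must load first):

    from graphlib import TopologicalSorter  # Python 3.9+

    # table -> tables it references (all names invented for illustration)
    deps = {
        "Countries": set(),
        "Currencies": set(),
        "Hierarchy": set(),
        "Taxonomy": {"Hierarchy"},
        "QualifiedPrices": {"Currencies"},
        "Main": {"Countries", "Currencies", "Taxonomy", "QualifiedPrices"},
    }

    # static_order() lists every table after the tables it depends on,
    # i.e. lookups and hierarchies first, the main table last.
    print(list(TopologicalSorter(deps).static_order()))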
    Kindly revert if you still have any doubts.
    Regards,
    Mandeep Saini

  • User ID filter for Initial Load

    Hi Experts,
    We are implementing filters for reading user IDs from backend systems. When we implemented the filter, we got the error "selection criteria Address userid is not supported". We tried "logonuid" and "userid", and both gave a similar error.
    Please confirm what we should maintain for the user ID.
    Thanks in advance for your help.
    Thanks and regards,
    Nits

    Hi Nits,
    Did you try to filter on BNAME?
    Let me know if this helps!
    Anu

  • Initial load of data to UCM for Customer Hub PIP

    Hi,
    What is the recommended approach to have XREF tables populated during an initial load to UCM (via EIM), when the Accounts already exist in Siebel CRM?
    Our approach as of now, using EIM, is as follows:
    1) Set required customer information in EIM_UCM_ORG
    2) Look up the customer's existing Row_ID in Siebel, and populate EIM_UCM_ORG.EXT_SYSTEM_NUM and EIM_UCM_ORG.UCM_EXT_ID accordingly
    3) Run the EIM job and UCM Batch Process to import the customer into UCM
    The account then appears in UCM with the correct reference to the Siebel row_id under the "external account IDs" tab. HOWEVER, it also contains a reference to a newly created duplicate record for that account in Siebel. Looking at the XREF tables, there is no reference to the existing Siebel row_id specified in the EIM batch load, and our hypothesis is that this is the reason the account cannot be found (and a duplicate is created). What we want to achieve here is to tell UCM that the accounts we are inserting do in fact already exist in Siebel CRM and can be identified by the row_id that we pass along.
    The relevant system versions are Customer Hub PIP 11g with AIA 3. Siebel CRM and Siebel UCM are on patch 8.1.1.7 (and pertinent ACRs have been incorporated in the two Siebel instances).
    Any hints or suggestions on how to approach this would be appreciated
    -M

    Do you really need to populate the XREF table / transaction history table for the initial load?

  • Initial Load for marketing Attributes & Attribute sets in Cloud for customer 1405 version

    Hi All,
    I am working on a Cloud for Customer implementation with integration using NetWeaver PI. SAP has just upgraded C4C to the 1405 version. In the latest integration guide and scenarios, there is provision to integrate marketing attributes and attribute sets as well.
    I am following the latest integration guide (1405 version) but could not find any provision or template report for the initial load of marketing attributes. Can anyone please assist with the initial load of marketing attributes?
    Cheers.

    Hello to all,
    As marketing attributes are now supported in release 1502, I would like to ask if there is now a way to perform an initial load of marketing attributes (attribute sets, attributes, and attribute assignments). This would be very helpful for all integration projects that include marketing attributes.
    Thanks for your help and best regards,
    Markus

  • Best practices for initial data loads to MDM

    Hi,
    We need to load more than 300,000 vendors from SAP into the MDM production repository. The import server might take days to load that much, even if no errors occur.
    Are there any best practices available for initial loads to MDM? What considerations must be made while doing the initial loads?
    Harsha

    Hello Harsha,
    With SP05 Patch 1 there is file aggregation functionality in the import port. It is supposed to optimize the import performance.
    By the way, give me your mail address and I will send you an IDoc packaging paper for MDM.
    Regards,
    Goekhan

  • Initial load gets slower and slower

    For a PoC I tried to use the internal GoldenGate mechanism for the initial load. The size of the table is about 500 MB in total, but the load rate decreases over time. Starting at nearly 1,000 rows per second, after one hour I was down to 50 rows per second, and it kept decreasing to no more than 10 rows per second. So the entire load took 15 hours!
    There is only a primary key on the target table and no other constraints.
    Any idea?

    The same thing happens performance-wise with imports: they start off pretty fast, then begin to slow down. Can you rebuild/enable the PK index after the load is done? That should be a safe operation, given that your source has a PK. Are you sure there aren't any other constraints (or triggers) on the target table?
    Plus (assuming you are a DBA), what does AWR (or Statspack, or tracing) show for wait events?
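    As a hedged illustration of the disable/enable approach (the table and constraint names are invented), on Oracle that could be:

    -- before the initial load: disable the PK, which drops its implicit index
    ALTER TABLE tgt.mytable DISABLE CONSTRAINT mytable_pk;
    -- after the load: re-enable it, which rebuilds the index in one pass
    ALTER TABLE tgt.mytable ENABLE CONSTRAINT mytable_pk;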

  • Initial load with LOBs

    Hi, I'm trying to do an initial load and I keep getting errors like these:
    ERROR OGG-01192 Oracle GoldenGate Capture for Oracle, ext1.prm: Trying to use RMTTASK on data types which may be written as LOB chunks (Table: 'TESTDB.BLOBTABLE').
    ERROR OGG-01668 Oracle GoldenGate Capture for Oracle, ext1.prm: PROCESS ABENDING.
    The table looks like this:
    COLUMN_NAME  DATA_TYPE            NULLABLE  COLUMN_ID
    UUID         VARCHAR2(32 BYTE)    No        1
    DESCRIPTION  VARCHAR2(2000 BYTE)  Yes       2
    CONTENT      BLOB                 Yes       3
    I've checked that the source database does contain data in the BLOB table, and both databases have the same tables, so now I have no idea what could be wrong. =/

    For initial loads with LOBs, use a RMTFILE and a normal Replicat. There are a number of things that are not supported with RMTTASK. A RMTFILE is basically the same format as a RMTTRAIL file, but is specifically for initial loads or other "captured" data that is not a continuous stream. And make sure you have a newer build of GG (either v11 or the latest 10.4 from the support site).
    The 'extract' would look something like this:
    ggsci> add extract e1aa, sourceIsTable
    ggsci> edit param e1aa
    extract e1aa
    userid ggs, password ggs
    -- either local or remote
    -- extFile dirdat/aa, maxFiles 999999, megabytes 100
    rmtFile dirdat/aa, maxFiles 999999, megabytes 100
    Table myschema1.*;
    Table myschema2.*;
    Then on the target, use a normal 'replicat' to read the "files".
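    As a minimal sketch of that target side (the group name r1aa and the identical source/target schema names are assumptions), it could look something like:
    ggsci> add replicat r1aa, extFile dirdat/aa
    ggsci> edit param r1aa
    replicat r1aa
    userid ggs, password ggs
    -- table structures match on both sides, so no source definitions file
    assumeTargetDefs
    map myschema1.*, target myschema1.*;
    map myschema2.*, target myschema2.*;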
    Note that if the source and target are both oracle, this is not the most efficient way to instantiate the target. Using export/import or backup/restore (or any other mechanism) would usually be preferable.

  • Initial Load ISU / CRM Issues

    Hi all
    I have a few questions.
    1. When you start an initial load for BUPA_MAIN in R3AS (from ISU to CRM), it seems only one dialog work process (SM50) is used. Is this normal?
    2. In R3AM1 you can monitor the objects. Everything seems to be loaded, yet the status is still 'Running'. Is this because of BDoc validation errors?
    Kind regards
    Lucas

    Hello Lucas,
    There's a parameter (CRM_MAX_QUEUE_NUMBER_INITIAL) you can maintain in the table CRMPAROLTP of the source system (maintain this in the ERP system if you're doing an initial load from ERP to CRM).
    Reference Notes:
    350176
    765236
    The PARVAL1 for parameter CRM_MAX_QUEUE_NUMBER_INITIAL should be an integer (let's say 5) instead of the character 'X'. In that case, 5 queues would be formed for the initial load.
    Here's an example:
    Parameter name CRM_MAX_QUEUE_NUMBER_INITIAL
    Param. Name 2   <Object name>    " for example CUSTOMER_MAIN
    Param. Name 3
    User            <User>           " for example CRM, if CRM is connected
    Param. Value    <no. of queues>  " for example 5
    Param. Value 2
    So, using the example above, 5 R3AI_CUSTOMER_MAIN queues would be formed on the ERP side and sent to CRM for processing.
    The number of dialog work processes (SM51) available to the qRFC scheduler in the CRM system should be able to handle the parallel processing of the queues, so consult your Basis administrators on this front.
    Regards,
    Rohit

  • Initial load conditions in OWB

    Hi
    We are building a data warehouse using Oracle Warehouse Builder and Express. I would like to know about the initial load for each mapping. Currently the mappings are built for incremental loads. If I have to run them for an initial load, how can that be done? Is it required to change each mapping individually to run under initial-load conditions, and then set the mappings back for incremental loads afterwards? Or can this be done another way in OWB? Please, can anyone help in this regard?
    Thanks
    Narasimha.

    You can create the mappings with input parameters and then pass the initial-load or incremental-load properties (record dates, for example) as parameters for each run.
    Regards:
    Igor

  • Golden Gate - Initial Load using parallel process group

    Dear all,
    I am new to GG and I was wondering whether GG can support an initial load with parallel process groups. I have managed to do an initial load using "Direct Bulk Load" and "File to Replicat", but I have several big tables and the Replicat is not catching up. I am aware that GG is not ideal for initial loads, but it is complicated to explain why I am using it.
    Is it possible to use the @RANGE function while performing the initial load, regardless of which method is used (File to Replicat, Direct Bulk Load, ...)?
    Thanks in advance

    You may use Data Pump for the initial load of large tables.
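    On the @RANGE part of the question: with the file-to-Replicat method you can run several Replicat groups against the same set of files, each applying only a hash-based slice of a big table. A minimal sketch (the group names, schema, and table are invented):

    replicat rload1
    userid ggs, password ggs
    assumeTargetDefs
    -- this group applies roughly one third of the rows, split by key hash
    map bigschema.bigtable, target bigschema.bigtable, filter (@RANGE (1, 3));

    rload2 and rload3 would be identical except for filter (@RANGE (2, 3)) and filter (@RANGE (3, 3)).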

  • ECC to MDM initial load config

    Hi,
    For the initial load of master data (using MDMGX for the lookup tables, and whatever is used for the main table), do we need to do any configuration on ECC to run these transactions?
    Jeff

    Hi Jeff,
    MDMGX is a generic extractor program used to extract check table data from ECC, which can be used to populate your lookup tables.
    For extracting the data for your main table, you need to use the MDM_CLNT_EXTR transaction.
    Kindly go through the following links.
    They will give you a clear idea of both extractions:
    Re: I need to load the reference/check table data in to MDM Server - help
    Re: initial load from ECC/R3 to MDM
    Hope it helped.
    Kindly reward points if found useful.
    Thanks & Regards
    Simona
