ASAP Data Migration / Data Conversion / Data Transfer Templates

Hi,
I'm looking for the ASAP Data Migration / Data Conversion / Data Transfer Templates that used to be available in ASAP.
Some time ago, I believe on this site (though I can no longer find it), I found a zip file containing the above .xls template for managing all the data conversions on a project. It also included the data mapping template (.xls).
Many thanks.


Similar Messages

  • What is PMS Data Migration Experience and Data Migration Scripts preparation

    Hi All,
    What are "PMS Data Migration Experience" and "Data Migration Scripts preparation",
    and why do we use these in HRMS (EBS)?

    Please provide a link to where you got these terms from.
    "PMS Data Migration Experience" is not an EBS term - most likely a vendor or third-party phrase/tool.
    "Data Migration" in HRMS (and other EBS modules) typically refers to moving data from a legacy system to EBS as part of the go-live process.
    HTH
    Srini

  • HELP ME ON DATA MIGRATION PLEASE

    Hi
    What is SAP data migration? What tools and techniques does it involve, and what are the roles and responsibilities when working on a project (related to ABAP)?

    Data Migration – Current Situation
    Data migration or data conversion is a persistent issue facing most organizations that have implemented SAP. Whether it is during the final phases of a new SAP implementation, during SAP upgrades and updates, during corporate restructurings, or during mergers and acquisitions, data migration remains a challenging problem for IT.
    Most data migration projects face a time and budget crunch, yet they usually require extensive support from programmers and other technical experts, so they tend to become expensive and time-consuming.
    The data transfer and data migration tools provided by SAP are very technical, have a steep learning curve, and require technical experts to build and implement the data migration scripts.
    Data Migration – Ideal Scenario
    An ideal data migration scenario would let the data migration project be implemented by the end user departments themselves. A few super-users within these end-user departments that supply the data should have the ability to transfer and migrate data themselves without relying on technical experts. Such a scenario would significantly cut the time and effort required in a data migration project.
    Reaching such an ideal scenario would require data migration tools that are easy to learn and require no programming. Furthermore, these tools should work across all the different SAP modules and the SAP products, including the different versions of SAP.
    Data Migration
    Data Migration is the process of moving required (and most often very large) volumes of data from our clients’ existing systems to new systems. Existing systems can be anything from custom-built IT infrastructures to spreadsheets and standalone databases.
    Data Migration encompasses all the steps necessary to cleanse, correct and move the data into the new system. Technological changes, changes of provider, software updates or data warehousing / data mining projects make such delicate and critical operations necessary.
    Data Migration can be done using BDC, LSMW, BAPIs and IDocs.
    What is the difference between batch input and call transaction in BDC?
    Session method:
    1) asynchronous processing (the session is processed later, e.g. via SM35)
    2) can transfer large amounts of data
    3) processing is slower
    4) an error log is created automatically for the session
    5) data is not updated in the database until the session is processed
    Call transaction:
    1) synchronous processing
    2) suited to small amounts of data
    3) processing is faster
    4) errors need to be handled explicitly in the program
    5) data is updated immediately when the transaction is processed
    Differences between the BDC session method and the call transaction method:
    The most important aspects of the batch input session interface are:
    - Asynchronous processing
    - Transfers data for multiple transactions
    - Synchronous database update: during processing, no transaction is started until the previous transaction has been written to the database
    - A batch input processing log is generated for each session
    - Sessions cannot be generated in parallel
    The most important aspects of the CALL TRANSACTION USING interface are:
    - Synchronous processing
    - Transfers data for a single transaction
    - Synchronous and asynchronous database updating are both possible; the program specifies which kind of updating is desired
    - Separate LUW for the transaction: the system performs a database commit immediately before and after the CALL TRANSACTION USING statement
    - No batch input processing log is generated
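    To make the difference concrete, here is a minimal ABAP sketch (not from the original post) that hands the same BDCDATA table once to CALL TRANSACTION and once to a batch input session via BDC_OPEN_GROUP / BDC_INSERT / BDC_CLOSE_GROUP. The transaction, screen and field values (XD02, SAPMF02D, RF02D-KUNNR, the customer number) are illustrative assumptions; record the real screen sequence with SHDB for your release.
    REPORT z_bdc_demo.

    DATA: lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
          ls_bdcdata TYPE bdcdata,
          lt_msgs    TYPE STANDARD TABLE OF bdcmsgcoll.

    " First screen of the transaction (values are illustrative - verify with SHDB)
    ls_bdcdata-program = 'SAPMF02D'. ls_bdcdata-dynpro = '0101'. ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata. CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'RF02D-KUNNR'. ls_bdcdata-fval = '0000100001'.
    APPEND ls_bdcdata TO lt_bdcdata. CLEAR ls_bdcdata.

    " Variant 1: CALL TRANSACTION - synchronous, errors handled by the program itself
    CALL TRANSACTION 'XD02' USING lt_bdcdata
         MODE 'N' UPDATE 'S'
         MESSAGES INTO lt_msgs.
    IF sy-subrc <> 0.
      " react to the messages collected in lt_msgs
    ENDIF.

    " Variant 2: batch input session - processed later in SM35, log written automatically
    CALL FUNCTION 'BDC_OPEN_GROUP'
      EXPORTING client = sy-mandt group = 'ZCUSTOMER' user = sy-uname keep = 'X'.
    CALL FUNCTION 'BDC_INSERT'
      EXPORTING tcode     = 'XD02'
      TABLES    dynprotab = lt_bdcdata.
    CALL FUNCTION 'BDC_CLOSE_GROUP'.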
    Refer to these links for BDC:
    http://myweb.dal.ca/hchinni/sap/bdc_home.htm
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/home/bdc&
    http://www.sap-img.com/abap/learning-bdc-programming.htm
    http://www.sapdevelopment.co.uk/bdc/bdchome.htm
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://help.sap.com/saphelp_47x200/helpdata/en/69/c250684ba111d189750000e8322d00/frameset.htm
    http://www.sapbrain.com/TUTORIALS/TECHNICAL/BDC_tutorial.html
    Check these links:
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://www.sap-img.com/abap/question-about-bdc-program.htm
    http://www.itcserver.com/blog/2006/06/30/batch-input-vs-call-transaction/
    http://www.planetsap.com/bdc_main_page.htm
    Call transaction or session method?
    http://www.sapbrain.com/FAQs/TECHNICAL/SAP_ABAP_DATADICTIONARY_FAQ.html
    http://www.****************/InterviewQ/interviewQ.htm
    http://help.sap.com/saphelp_46c/helpdata/en/35/2cd77bd7705394e10000009b387c12/frameset.htm
    ALE/ IDOC
    http://help.sap.com/saphelp_erp2004/helpdata/en/dc/6b835943d711d1893e0000e8323c4f/content.htm
    http://www.sapgenie.com/sapgenie/docs/ale_scenario_development_procedure.doc
    http://edocs.bea.com/elink/adapter/r3/userhtm/ale.htm#1008419
    http://www.netweaverguru.com/EDI/HTML/IDocBook.htm
    http://www.sapgenie.com/sapedi/index.htm
    http://www.sappoint.com/abap/ale.pdf
    http://www.sappoint.com/abap/ale2.pdf
    http://www.sapgenie.com/sapedi/idoc_abap.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/0b/2a60bb507d11d18ee90000e8366fc2/frameset.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/78/217da751ce11d189570000e829fbbd/frameset.htm
    http://www.allsaplinks.com/idoc_sample.html
    http://www.sappoint.com/abap.html
    http://help.sap.com/saphelp_erp2004/helpdata/en/dc/6b835943d711d1893e0000e8323c4f/content.htm
    http://www.sapgenie.com/sapgenie/docs/ale_scenario_development_procedure.doc
    http://edocs.bea.com/elink/adapter/r3/userhtm/ale.htm#1008419
    http://www.netweaverguru.com/EDI/HTML/IDocBook.htm
    http://www.sapgenie.com/sapedi/index.htm
    http://www.allsaplinks.com/idoc_sample.html
    Check these step-by-step links
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/ccab6730-0501-0010-ee84-de050a6cc287
    https://sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/8fd773b3-0301-0010-eabe-82149bcc292e
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/3c5d9ae3-0501-0010-0090-bdfb2d458985
    regards,
    srinivas

  • AR & AP Data migration from legacy

    Hi,
    1. What are the precautions to be taken for data migration from a legacy system?
    2. We intend to use a conversion account (Dr conversion account, Cr vendor). Is this the correct way of doing it?
    3. How do I ensure that all my invoices are uploaded correctly?
    4. What is the best standard program to get it done?
    5. Is LSMW the best way to do it, or do I have to go for a custom development?
    Any advice is appreciated.
    Thanks, Chitra

    Hi Chitra,
    The following links are helpful for understanding LSMW:
    Note 105244 - LSMW 1.0: known problems, tips and tricks
    Re: LSMW and error handling
    /community [original link is broken]
    For Data Migration:
    Note 400257 - Data Migration and Upload Report
    Note 739860 - Data migration from a legacy system to SAP
    Hope this helps.

  • Data migration methods

    Hi experts. I am working on a study for a data migration project where data from multiple legacy systems would be moved into SAP. Apart from LSMW and BDC, what are the other useful methods when the data migration involves huge volumes of upload data?

    Hi,
    Obviously the answer depends to a certain extent on the circumstances of your project (complexity of data, volume, etc) and on the object in question, but here are a few rules of thumb.
    If you are new to this I would stay away from IDocs and BAPIs, as the setup is often a little complicated and, when there are errors, the cause is not always easy to identify. IDoc processing is also not always as quick as people assume; for example, the customer master IDoc is processed in the background by a call transaction of XD01.
    For you I would suggest definitely using LSMW, and where possible within LSMW use the standard uploads. There will be situations where the standard upload does not cover everything (again using customers as an example, it will not include the extended address data or the long texts), but I would still use the standard upload to load as much as you can, and then write a couple of simple update BDC recordings (XD02, XK02, etc.) to add what's missing.
    Some examples of the standard upload programs are:
    Customers - RFBIDE00
    Vendors - RFBIKE00
    Materials - RMDATIND
    All finance transactions - RFBIBL00
    Fixed assets - RAALTD01
    The standard uploads are normally well documented; they often use BDC (not RMDATIND) so are quite transparent; and they are generally pretty flexible.
    There are a few standard uploads which are not so good, such as the sales order load, and to a lesser extent the purchase order load. In these cases you might need to look at using the BAPIs but get help from someone who has used them before.
    You might want to use recordings for very simple loads like bank master.
    If migrating SAP to SAP, where possible you would look to transfer master data via ALE using IDOCs. But again you need to know what you are doing and it sounds like you are moving from a non-SAP to a SAP system anyway.
    Good luck,
    Ben
    http://www.harlex-ltd.com
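    As a small illustration of the "load the rest afterwards" approach Ben describes: customer long texts, which the standard upload does not cover, are commonly written with the standard function module SAVE_TEXT. This is only a hedged sketch - the text object, text ID and customer number below are assumptions and should be verified in transaction SE75 for your system.
    DATA: ls_header TYPE thead,
          lt_lines  TYPE STANDARD TABLE OF tline,
          ls_line   TYPE tline.

    " Assumed text object/ID for a central customer text - verify in SE75
    ls_header-tdobject = 'KNA1'.
    ls_header-tdname   = '0000100001'.   " customer number, left-padded
    ls_header-tdid     = '0001'.
    ls_header-tdspras  = sy-langu.

    ls_line-tdformat = '*'.
    ls_line-tdline   = 'Migrated from legacy system at go-live'.
    APPEND ls_line TO lt_lines.

    CALL FUNCTION 'SAVE_TEXT'
      EXPORTING
        header   = ls_header
        insert   = 'X'
      TABLES
        lines    = lt_lines
      EXCEPTIONS
        id       = 1
        language = 2
        name     = 3
        object   = 4
        OTHERS   = 5.
    IF sy-subrc = 0.
      CALL FUNCTION 'COMMIT_TEXT'.   " make the text persistent
    ENDIF.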

  • Treasury Data Migration

    Friends,
    One of our clients has decided to use SAP Treasury and Risk Management, and I have been asked to lead data migration in this area. I would like to know the master and transaction data that is generally used and needed for SAP Treasury Management. Also, please let me know the various ways of migrating data into SAP TRM, e.g. BAPIs, the Transaction Manager, LSMW, etc.
    Regards,
    Amey

    Hello Amey,
    Please find below a list of steps which might be helpful for the data migration.
    Legacy Data Transfer
    1. Legacy data transfer always happens on a key date.
    2. The procedure is different for securities and OTC transactions because they deal differently with positions on a key date.
    3. For securities, we simply need to copy the positions with all values as of the key date.
    4. For OTC transactions, the entire transaction is copied, as there is no explicit position management in the Transaction Manager.
    Customizing the Legacy Data Transfer:
    You primarily need customizing for the update types that the flows of the legacy data transfer business transactions are to carry. You can find the settings under Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Flow Data -> Update Types.
    The update types are divided into valuation-area-independent and valuation-area-dependent ones; the independent ones relate only to securities. While the legacy data transfer is usually not intended to update the general ledger, you must nevertheless mark most of the update types as posting-relevant and maintain an account determination in customizing. This is necessary to ensure that the data in position management is complete and, in particular, that the account assignment reference transfers can be performed correctly.
    Legacy Data Transfer for OTC Transactions:
    Step 1: Existing OTC transactions are entered the same way as new transactions, either manually or via the existing BAPIs. For example, use BAPI_FTR_FD_CREATE and BAPI_FTR_FXT_CREATE to create fixed-term deposit and forex transactions respectively. There are similar BAPIs for each product; you can get the list in transaction BAPI under the node Financials -> Financial Supply Chain Management, and you can use transaction FTR_BAPI to test-run the Treasury BAPIs. You may define a separate transaction type for legacy transactions so that settlement or posting release is not necessary.
    Step 2: Now change the status of all the historical flows created in Step 1 to POSTED without actually generating accounting documents. To do this, use transaction TBB1_LC. It is similar to transaction TBB1, but unlike TBB1 it affects all valuation areas without giving the option of affecting only the operational area. You can see the flows posted by TBB1_LC in the posting journal (TPM20) without any reference to accounting documents.
    Step 3: The historical flows alone do not yet explain the book value of the OTC transactions, so you must also enter the historical valuation results. This is done using TPM63C, which you can find under Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Flow Data -> Enter Valuation Area Dependent Data for MM. After entering the valuation results, use TPM63 to transfer the values to internal position management.
    Legacy Data Transfer for Futures & Listed Options:
    Step 1: Similar to OTC.
    Step 2: Similar to OTC.
    Step 3: Use transaction TPM63D (valuation area dependent data for futures). You must establish the relationship between the lot and the transaction on which it is based via the transaction number.
    Legacy Data Transfer for Securities:
    For a securities position, it does not matter whether it arose in the past through a single purchase or a series of business transactions. The legacy data transfer therefore seeks to build up the positions through one business transaction on the transfer key date.
    Step 1: Before entering business transactions, first create the master data of the security positions, i.e. their position indicators. I) Use TPM61A, or the menu path Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Position Data -> Enter Position Information for Securities. II) Then create the position indicators using TPM61.
    Step 2: Now enter the business transactions.
    I) Valuation area independent data for securities -> for the external position.
    II) Valuation area dependent data for securities -> for the internal position. Note that you need to use the transaction "Set Effects of Update Types on Position" to change the positions. If you want to enter lot positions with the legacy data transfer transactions, use the item number, and ensure that the business transaction generating the position and the one carrying the valuation results have the same item number. If you have built up lots via the normal procedures, you must specify the corresponding transaction number in the transaction column when entering the valuation-area-dependent business transaction.
    Step 3: Now generate the business transactions using TPM63. You can initially build only the valuation area independent data (the external position) by checking the "only valuation area independent data" option in TPM63. The external position can be checked in the securities account cash flow, i.e. the external securities position, in TPM40.
    We also created an LSMW object for each product.
    Regards,
    Jain

  • Conversion date

    Hi,
    I created an InfoSet query using 6 tables: ANLA, ANLB, ANLC, ANLZ, ANEP and ANEK. The issue is that some assets show the conversion date, i.e. 05/31/2009, while other assets show the current date (if any retirement / addition / transfer was posted for the asset, that date is updated in the query), but the business is asking for the original date, i.e. the conversion date (05/31/2010), for converted / old assets.
    The goal of this table view and query set is to pull the entire asset base, including converted assets, new additions and new retirements, up to a certain point in time or period end. But we found that it was not pulling converted assets because of the linkage of data between the 6 different tables.
    There are two assets (130049 & 174068, both with sub-number 0). I've made notes for each line of data in column "A". 674068 seems to work perfectly, as there is a posting row for the asset from conversion, a new $50 addition, and then the retirement. 630049 is where we are having a problem, as the query is not pulling the original conversion row that I would expect to see for Rs. 109.84. What's interesting is that when I retired the total asset (including the new additions), it retired both the new additions and the converted amount. What is preventing us from pulling the conversion data row in our view or query set?
    I have data sets due for tax in the very near future (already put off quite a bit). If you could help us look at this issue at your earliest convenience, I would really appreciate it.
    Thanks in Advance,
    C Babu

    And both java.sql.Date and java.sql.Timestamp inherit from java.util.Date, so you can just cast to go back. Although you almost never need to; the only case would be to force a different method signature to be called:
    void foo(java.util.Date dateOrTimestampThingy) {}      // 1
    void foo(java.sql.Timestamp dateOrTimestampThingy) {}  // 2

    java.sql.Timestamp ts = new java.sql.Timestamp(System.currentTimeMillis());  // any timestamp value
    foo(ts);                      // calls #2
    foo((java.util.Date) ts);     // calls #1

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 - 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology with SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package, including:
    1. Offering Overview - Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Services fundamentals - Architecture, source and target metadata definition; the process of creating batch jobs, validating, tracing, debugging, and data assessment
    3. Installation and configuration of SBOP Data Services - Installation and deployment of Data Services and content from SAP Best Practices; configuration of your target SAP environment and deployment of the Migration Services application
    4. Customer Master example - Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application
    5. Overview of Data Quality within the Data Migration process - A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license
    Logistics & How to Register
    Nov. 3 - 5: SAP America, Downers Grove, IL
                     Wednesday 10 AM - 5 PM
                     Thursday 9 AM - 5 PM
                     Friday 8 AM - 3 PM
                     Address:
                     SAP America - Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    • Data migration consultants and IDoc experts involved in data migration and integration projects
    • Functional experts who perform mapping activities for data migration
    • ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil - SAP Business All-in-One Development
    Frank Densborn - SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 - 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package, including:
    1. Offering Overview - Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Integrator fundamentals - Architecture, source and target metadata definition; the process of creating batch jobs, validating, tracing, debugging, and data assessment
    3. Installation and configuration of the SBOP Data Integrator - Installation and deployment of the Data Integrator and content from SAP Best Practices; configuration of your target SAP environment and deployment of the Migration Services application
    4. Customer Master example - Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application
    Logistics & How to Register
    October 06 - 08: Berlin, Germany
                     Wednesday 10 AM - 5 PM
                     Thursday 9 AM - 5 PM
                     Friday 9 AM - 4 PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    • Data migration consultants and IDoc experts involved in data migration and integration projects
    • Functional experts who perform mapping activities for data migration
    • ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil - SAP Business All-in-One Development
    Frank Densborn - SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • HR Payroll Data Migration from 4.6C to ECC Upgrade

    Hi
    We are going to upgrade our 4.6C R/3 system, which includes HR, to ECC 6.0.
    After this upgrade we are going to start the data migration of all data, including HR payroll, from 4.6C to ECC.
    1) Can anybody advise on the options for loading payroll data from 4.6C to ECC? After the payroll data migration we must be able to see all the payroll results from the previous system.
    2) Are there any standard programs available for payroll data migration?
    Also, if anybody has any info/links on payroll/HR data migration, please advise.
    Thanks a lot for your info/time.

    Hi Somar,
    I could not find the mentioned post "HR Upgrade". Please let me know the "HR Upgrade" details which need to be kept in mind when upgrading from R/3 4.7 to ECC 6.0.
    Yes, we have got the list of "Modification Objects" using SPAU. Please confirm that during our upgrade we just need to work on those objects which were changed/repaired by us according to our business needs.
    Apart from that, please let me know whether I need to worry about any OSS notes specific to 4.7 which are not applied to our system and were released after ECC 6.0 Enhancement Package 3.
    I understand that if we have a system with the same patch and SP level as our production system and upgrade it to ECC 6.0, all the HR functionality will be taken care of automatically. Please confirm.
    Thanks in advance for your help.
    Regards,
    Vijay

  • Used Migration Assistant to successfully transfer PC files to Mini. However file creation or modified date does not transfer. Date of creation is listed as day of transfer.  I would gratefully appreciate any help in solving this,

    Used Migration Assistant to successfully transfer PC files to Mini. However file creation or modified date does not transfer. Date of creation is listed as day of transfer.  I would gratefully appreciate any help in solving this?

    I have the same problem; I used Migration Assistant to transfer all my pictures from PC to MacBook Pro, but the dates of the pictures were not transferred. The creation date is set to the date of transfer, so I cannot sort my pictures by date.
    Does anybody have an answer?

  • Data Migration Template for Singapore Payroll.

    Hi
    I need help with a data migration template for Singapore Payroll.
    I would also need guidance on what is required in case I want to migrate last year's payroll cluster data from the legacy system to ECC 6.0.
    I will appreciate it if you can send me the migration template to my mail id.
    Thanks and regards

    Hi,
    Check some of the fields mentioned below; they may be of some use to you:
    Client
    Personnel Number
    Sequential number for payroll period
    Payroll type
    Payroll Identifier
    Pay date for payroll result
    Period Parameters
    Payroll Year
    Payroll Period
    Start date of payroll period (FOR period)
    End of payroll period (for-period)
    Reason for Off-Cycle Payroll
    Sequence Number
    Client
    Personnel Number
    Sequence Number
    Country grouping
    Wage type
    Key date
    Rate
    Number
    Amount
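    To turn the field list above into a concrete upload layout, a flat record structure along these lines is sometimes defined for the template file. This is only a hedged sketch - the field names, lengths and types are assumptions, not from the post, and the actual payroll cluster fields should be verified in the target system before use.
    TYPES: BEGIN OF ty_payroll_legacy,          " illustrative template record
             pernr TYPE n LENGTH 8,             " Personnel number
             seqnr TYPE n LENGTH 5,             " Sequence number of payroll result
             payty TYPE c LENGTH 1,             " Payroll type
             payid TYPE c LENGTH 1,             " Payroll identifier
             paydt TYPE d,                      " Pay date for payroll result
             permo TYPE c LENGTH 2,             " Period parameters
             pabrj TYPE n LENGTH 4,             " Payroll year
             pabrp TYPE n LENGTH 2,             " Payroll period
             fpbeg TYPE d,                      " Start date of payroll period (for-period)
             fpend TYPE d,                      " End date of payroll period (for-period)
             ocrsn TYPE c LENGTH 4,             " Reason for off-cycle payroll
             molga TYPE c LENGTH 2,             " Country grouping
             lgart TYPE c LENGTH 4,             " Wage type
             betpe TYPE p LENGTH 9 DECIMALS 2,  " Rate
             anzhl TYPE p LENGTH 7 DECIMALS 2,  " Number
             betrg TYPE p LENGTH 9 DECIMALS 2,  " Amount
           END OF ty_payroll_legacy.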

  • Legacy Data Migration - question on conversion

    Hello All,
    We are trying to build a generic tool to help with the conversion step in data migration.
    For example, if the material number in the legacy system starts with AB (e.g. AB6790), it has to be referred to in SAP as SQ6790.
    So it means replacing the first few characters during the conversion process.
    Along these lines, I am expected to collect possible scenarios and develop a generic tool which can cover the common conversion conditions.
    Can you please help me with such scenarios which you may have come across in your projects?
    Thank you.
    Meenakshi
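    A minimal ABAP sketch of the kind of prefix-mapping rule described in the question above; the hard-coded 'AB' -> 'SQ' pair and the variable names are illustrative assumptions - in a generic tool the mapping pairs would come from a customizing table rather than being hard-coded.
    " Hypothetical prefix-mapping rule: legacy 'AB####' becomes SAP 'SQ####'
    DATA: lv_legacy TYPE matnr VALUE 'AB6790',
          lv_sap    TYPE matnr.

    IF lv_legacy(2) = 'AB'.
      CONCATENATE 'SQ' lv_legacy+2 INTO lv_sap.   " keep the rest, swap the prefix
    ELSE.
      lv_sap = lv_legacy.                         " no rule matches - pass through unchanged
    ENDIF.
    " lv_sap now holds 'SQ6790'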

    Rahul,
    I strongly suggest you do a search in the CRM General & Framework Forum on this topic. If you are looking for specific objects to migrate, there are several posts on the major data elements in CRM (transactions, business partners, products).
    In general, the technique is to use LSMW combined with the XIF IDoc adapter to convert data into the CRM system.
    My blog here will explain the general process:
    /people/stephen.johannes/blog/2005/08/18/external-data-loads-for-crm-40-using-xif-adapter
    Thank you,
    Stephen

  • CRM Masterdata and Data Migration Templates

    Could someone please provide me with the templates below if you have any:
    - CRM Master Data Mapping Template
    - CRM Data Migration Design Template
    - SAP CRM Data Design Template
    I really appreciate your help.
    Thanks in advance..
    Sr

    It is a duplicate one... so just marking it as answered because I could not find the option to delete.

  • Validation rules applied to data migration templates at import

    Hi everyone!
    First post here for me, so please bear with me if I missed something.
    My company has just started the initial implementation of ByDesign. We come from a set of disparate and partially home-grown systems that we outgrew a few years ago.
    As part of this initial phase, we are basically re-creating the data on customers, suppliers, etc. since none of our existing systems makes a good source, unfortunately. We will be using the XML templates provided by ByDesign itself to import the relevant data.
    It has become clear that ByDesign applies validation rules on fields like postal codes (zip codes), states (for some countries), and other fields.
    It would be really helpful if we could get access to the rules that are applied at import time, so that we can format the data correctly in advance, rather than having to play "trial and error" at import time. For example, if you import address data and it finds a postal code in the Netherlands formatted as "1234AB", it will tell you that there needs to be a space in the 5th position, because it expects the format "1234 AB". At that point, you stop the import, go back to the template to fix all the Dutch postal codes, and try the import again, only to run into the next validation issue.
    We work with a couple of very experienced German consultants to help us implement ByDesign, and I have put this question to them, but they are unaware of a documented set of validation rules for ByDesign. Which is why I ask the question here.
    So just to be very clear on what we are looking for: the data validation/formatting rules that ByDesign enforces at the time the XML data migration templates are imported.
    Any help would be appreciated!
    Best regards,
    Eelco

    Hello Eelco,
    welcome to the SAP ByDesign Community Network!
    The checks performed on postal codes are country specific, and represent pretty much the information that you would find in places like the "Postal codes" page on Wikipedia.
    I recommend starting with small files of 50-100 records assembled from a representative set of different records, so that you can efficiently collect the validation rules that your data triggers. Only once you have caught these generic data issues would I proceed to larger files.
    Personally, I prefer to capture such generic work items on a list, fix the small sample file immediately by editing it, and then re-simulate the entire file right away, so that I can drill deeper and collect more generic issues from my data sample. Only after a while, when I have harvested all the learnings from my sample file, would I apply them to my actual data and create a new file - still not too large, in order to use my time efficiently.
    Best regards
    Michael  
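    Since a documented rule set does not seem to be available, one practical option is to encode each rule you discover as a pre-check that runs over the template data before import. The sketch below shows this for the Dutch postal-code format ("1234 AB") mentioned in the question; it is purely illustrative ABAP, not a documented ByDesign rule, and the regular expressions are assumptions about the expected format.
    " Illustrative pre-check: Dutch postal codes are expected as '1234 AB'
    " (four digits, a space, two capital letters).
    DATA(lv_pcode) = CONV string( '1234AB' ).

    IF NOT matches( val = lv_pcode regex = '^\d{4} [A-Z]{2}$' ).
      " repair the common case where only the space is missing
      lv_pcode = replace( val   = lv_pcode
                          regex = '^(\d{4})([A-Z]{2})$'
                          with  = '$1 $2' ).
    ENDIF.
    " lv_pcode is now '1234 AB'; records that still fail the check would be
    " flagged for manual correction before the XML template is imported.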
