Data Migration Part - Reg.

Hi all,
During the implementation of the FICO part, what data do we need to upload through LSMW or BDC, such as Vendor Master and GL Master?
Can anyone suggest the full list, and can anybody share template source data?
Thanks & Regards,
M.Mohan.

Hello,
Broadly, I would divide the data upload process into two categories.
1. Master data (GL accounts, customers, vendors, assets, cost elements, cost centers, profit centers, etc.)
2. Transactional data (open items for customers and vendors, GL line items, cost center planning data, profit center plan data loads, etc.)
In the first case you can mostly use LSMW, if you are conversant with it. The following document will be useful to you.
http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
Note: No ready-made template will suit you, because your configuration is different from anyone else's, and the field status of various fields will differ as well. Therefore, follow the above document and start fresh with a recording. A recording is more or less sufficient for master data.
In the second case, you create five offsetting accounts plus one main offsetting account:
1. Vendor Offsetting Account
2. Customer Offsetting Account
3. Asset Offsetting Account
4. GL Offsetting Account
5. Stock Offsetting Account
At the end of the upload, the total balance across these accounts must be ZERO. Bring these accounts to zero and transfer any remaining balance to the MAIN offsetting account. Once you have done this, make sure you block all of these accounts.
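As a quick cross-check outside SAP that the offsetting accounts really net to zero, you can total the take-on postings per account from your upload files before blocking the accounts. Below is a minimal Python sketch; the account numbers and the file takeon_postings.csv are hypothetical and only for illustration.

import csv
from collections import defaultdict
from decimal import Decimal

# Hypothetical take-on extract: one posting per row, "account;amount",
# with credits as negative amounts. The account numbers are made up.
OFFSETTING_ACCOUNTS = ["199910", "199920", "199930", "199940", "199950"]

def check_offsetting_balances(path="takeon_postings.csv"):
    balances = defaultdict(Decimal)
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter=";"):
            balances[row["account"]] += Decimal(row["amount"])
    for account in OFFSETTING_ACCOUNTS:
        balance = balances.get(account, Decimal("0"))
        flag = "OK" if balance == 0 else "NOT ZERO - investigate before blocking"
        print(f"{account}: {balance} {flag}")

if __name__ == "__main__":
    check_offsetting_balances()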
Technically, in the second case you should get help from an ABAPer to write a BDC program, particularly if your entries have a varying number of line items. For example, if your first document has 5 line items and your second document has 10 line items, an LSMW recording will not work. You can instead map the LSMW to standard SAP programs such as RFBIBL00. If you are not sure how to do the mapping, refer to the following document posted by Naimesh.
/people/naimesh.patel/blog/2008/08/14/lsmw-with-rfbibl00
If you are still not able to map it, then there is no other way: you will need to depend on an ABAPer to upload the transaction data through a BDC. He or she will create a program and a custom transaction code for you and provide you with the file format. You may have to help the ABAPer by specifying exactly what you are looking to upload.
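Whichever route you take, you will usually have to reshape the legacy extract so that each document carries its header once plus any number of line items. Below is a minimal Python sketch of that grouping step; the column names and the file open_items.csv are hypothetical, and the real output layout must match whatever format the ABAPer or the RFBIBL00 mapping expects.

import csv
from itertools import groupby
from operator import itemgetter

# Hypothetical legacy extract with one row per line item:
# doc_no;posting_date;account;amount;cost_center
def group_into_documents(path="open_items.csv"):
    with open(path, newline="") as f:
        rows = sorted(csv.DictReader(f, delimiter=";"), key=itemgetter("doc_no"))
    documents = []
    for doc_no, items in groupby(rows, key=itemgetter("doc_no")):
        items = list(items)
        documents.append({
            "header": {"doc_no": doc_no, "posting_date": items[0]["posting_date"]},
            "items": items,  # one document may carry 5 items, the next 10
        })
    return documents

if __name__ == "__main__":
    for doc in group_into_documents():
        print(doc["header"]["doc_no"], "->", len(doc["items"]), "line items")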
Hope this is informative and let me know if you need any further information.
Regards,
Ravi

Similar Messages

  • Data Migration Reg.

    Dear Experts,
    What are the effective tools for data migration in SAP IS-U if the volume of data to be migrated is large, say 10 lakh (one million) records? How can this be done effectively from a legacy system?
    Thanx in Advance

    Hello,
    For data migration in SAP IS-U, use EMIGALL (transaction code EMIGALL): it is designed to upload mass data in very large volumes.
    The other tools SAP supports are LSMW and BDC.

  • Reg: Efficient Strategy of Data Migration

    Hello all, can anybody please tell me whether there is an efficient strategy for doing data migration?
    Looking forward to lots of replies.

    "Looking forward to lots of replies." We'll need a whole lot more information before we can offer a reply. What are you migrating from? What are you migrating to? What constraints are you working under? Is data cleansing involved? Is data transformation involved? And so on.
    You should have been around these forums long enough to know that these sorts of drive-by questions don't get sensible answers without further details and are, in fact, likely to provoke bristly responses. After all, if you can't be bothered to invest some time in typing the question properly, why should we give up our time trying to guess what it is you want to know?
    regards, APC

  • Data Migration of Excise Duty payables at initial upload

    Hi!
    Can anyone tell me how to upload the initial excise duty payable as part of the data migration procedure?
    For example, there are different excise duty payables such as ED, CESS, and SEC. CESS for normal goods and for capital goods.
    Thanks in advance.
    regs,
    ramesh b

    Hi Ramesh,
    From my exposure to Indian company codes, these are just additional GL accounts to be migrated. Since there are
    no dependencies, you can map the GL accounts from the legacy system to the new SAP system.
    Just my view
    regards
    pbb

  • SAP Best Practices for Data Migration :repositories only on MS SQL Server ?

    Hi,
    I'm implementing the "SAP Best Practices for Data Migration" (see https://websmp109.sap-ag.de/bp-datamigration).
    As part of the installation you have to install MS SQL Server Express Edition. The installation guide contains detailed steps to do this. All repositories for Data Services should be running on SQL Server, according to the installation guide.
    The customer I'm working for now does not want to use SQL Server, but DB2, as company standard.
    So I use DB2 for the local and profiler repositories.
    I notice, however, that the web application http://localhost:8080/MigrationServices does not support DB2. The only database type you can select in the configuration area is MS SQL Server.
    Is this a limitation, or is it by design?

    Hans,
    The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server. The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone. SQL Server Express was chosen to host the repositories because it is easy to set up and can be downloaded for free. Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
    The next release, including the web app, will support SQL Server and Oracle, but not DB2.
    Paul

  • Validation rules applied to data migration templates at import

    Hi everyone!
    First post here for me, so please bear with me if I missed something.
    My company has just started the initial implementation of ByDesign. We come from a set of disparate and partially home-grown systems that we outgrew a few years ago.
    As part of this initial phase, we are basically re-creating the data on customers, suppliers, etc. since none of our existing systems makes a good source, unfortunately. We will be using the XML templates provided by ByDesign itself to import the relevant data.
    It has become clear that ByDesign applies validation rules on fields like postal codes (zip codes), states (for some countries), and other fields.
    It would be really helpful if we could get access to the rules that are applied at import time, so that we can format the data correctly in advance rather than having to play "trial and error" at import time. For example, if you import address data, the first time it finds a postal code in the Netherlands formatted as "1234AB", it will tell you that there needs to be a space in the 5th position, because it expects the format to be "1234 AB". At that point, you stop the import, go back to the template to fix all the Dutch postal codes, and try the import again, only to run into the next validation issue.
    We work with a couple of very experienced German consultants to help us implement ByDesign, and I have put this question to them, but they are unaware of a documented set of validation rules for ByDesign. Which is why I ask the question here.
    So just to be very clear on what we are looking for: the data validation/formatting rules that ByDesign enforces at the time the XML data migration templates are imported.
    Any help would be appreciated!
    Best regards,
    Eelco

    Hello Eelco,
    welcome to the SAP ByDesign Community Network!
    The checks performed on postal codes are country specific, and represent pretty much the information that you would find in places like e.g. the "Postal Codes" page in Wikipedia.
    I recommend starting with small files of 50-100 records, assembled from a representative set of different records, in order to efficiently collect the validation rules that require reactions based on your data. Only once you have caught these generic data issues would I proceed to larger files.
    Personally, I prefer to capture such generic work items on my list, fix the small sample file immediately by editing, and re-simulate the entire file right away, so that I can drill deeper and collect more generic issues from my data sample. Only after a while, when I have harvested all the learnings from my sample file, do I apply them to my actual data and create a new file - still not too large, in order to use my time efficiently.
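    For the Dutch postal code case mentioned above, a small pre-formatting pass over the template data before import can remove one whole class of trial-and-error. Here is a minimal Python sketch; the file layout, the delimiter, and the "country" / "postal_code" column names are assumptions for illustration, not documented ByDesign requirements.

    import csv
    import re

    # Dutch postal codes should read "1234 AB": four digits, a space, two letters.
    NL_POSTCODE = re.compile(r"^(\d{4})\s?([A-Za-z]{2})$")

    def fix_nl_postcode(value):
        match = NL_POSTCODE.match(value.strip())
        return f"{match.group(1)} {match.group(2).upper()}" if match else value

    def preformat_addresses(path_in="addresses.csv", path_out="addresses_clean.csv"):
        with open(path_in, newline="") as f_in, open(path_out, "w", newline="") as f_out:
            reader = csv.DictReader(f_in, delimiter=";")
            writer = csv.DictWriter(f_out, fieldnames=reader.fieldnames, delimiter=";")
            writer.writeheader()
            for row in reader:
                if row.get("country") == "NL":
                    row["postal_code"] = fix_nl_postcode(row["postal_code"])
                writer.writerow(row)

    if __name__ == "__main__":
        preformat_addresses()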
    Best regards
    Michael  

  • GRC AC5.3 CUP Requests Transaction Data Migration to 10.1

    Hi,
    We are starting a new project to upgrade from AC 5.3 to AC 10.1. The migration document mentions that we can migrate CUP request transactional data to 10.1.
    Our management is asking us to maintain the CUP transaction data in the AC 10.1 application for future reference, to meet auditing requirements. After closing all old CUP requests, we need to migrate the CUP request data into 10.1.
    Is this possible? If so, please let me know about any challenges you encountered in upgrade projects with the same requirement.
    Thanks,
    Sathish.

    Hi Ram,
    Good Day,
    Is it possible to download old CUP requests in PDF format in AC 10.1 as part of the data migration?
    Regards
    Narayanan

  • Que on Material Master Data Migration

    Hi All
    We are trying to add a parallel currency to an existing company code, and since SAP recommends not adding a parallel currency to an existing company code, we have decided to go ahead with a new company code, thereby transferring all master and transaction data as part of the data migration.
    However, my question is: do we have to migrate the material master data as well? When I make the material-to-plant assignment in Customizing, the material should then officially belong to the new company code too, right?
    Please advise
    RS

    Hello Jürgen
    Thanks for your reply.
    I don't think you fully understood my scenario.
    We have an entity that wants to introduce a parallel currency into all their reporting. Adding a parallel currency to an active company code is not possible, and they are not keen on going with the material ledger.
    Therefore, we suggested a brand new company code as of 01.01.2012. The current company code will no longer be active after 31.12.2011. We are creating a brand new company code and performing data migration for all the master data and transaction data,
    meaning vendor, customer, and GL master data, AP & AR open items, GL line items, and asset master and transaction data.
    In Customizing, after creating the new company code, we will make the assignments to plant, purchasing organization, sales organization, etc.
    My question: do we also need to migrate the material master, since it belongs to a plant? When we assign the company code to the plant, will the material automatically be available in the company code when creating a PO, etc.?
    That is my question.
    thank you
    RS

  • LSMW data migration porcess from the beginning

    Hi gurus,
    I understand that there are 14 steps to be completed during an LSMW migration, but can anyone give me a basic idea of what needs to be done prior to those steps when starting a data migration project? I have only been involved in the 14-step LSMW process itself before.
    Thanks in advance, points will be awarded for all answers.
    Ganga

    Hello
    As an intro:
    Classical data migrations are typically limited to master data and tend to cause multiple negative impacts on workflow and on the operating business units.
    To minimize costs, risks, and effort during a data migration, and to ensure a smooth changeover of operational business process items and better reporting capabilities, SAP recently came out with a data migration solution called the SAP Accelerated Data Migration tool.
    This advanced tool, coupled with support from the SAP Services organization and SAP partners, lets you easily migrate and integrate application data from any legacy system into SAP applications – and reduce migration costs significantly.
    Data migration is done in the fourth phase, i.e. the Final Preparation stage, of the ASAP methodology.
    As far as LSMW is concerned, both BDC and LSMW are used for loading data into the SAP system. The difference is that LSMW is a loading tool from SAP in which you can use various methods to upload, massage, or cleanse your data while loading it into SAP from the legacy system; the method can be BDC, BAPI, IDocs, or custom ABAP code. With plain BDC, the user usually has to write an ABAP program or use another utility to load the legacy data.
    LSMW was developed mainly for functional people who have no knowledge of the technical aspects of reports or BDC in ABAP; it is very user-friendly and handy, so they can simply upload the master records using LSMW.
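    Before the 14 LSMW steps even start, a large share of the preparation is getting the legacy extract into a clean, fixed layout that matches the source structure you will define in LSMW. Below is a minimal Python sketch of that preparation step; the field names, the mapping, and the file names are all hypothetical and would have to match your own source structure.

    import csv

    # Hypothetical mapping from LSMW source-structure fields to legacy columns.
    FIELD_MAP = {
        "GL_ACCOUNT": "legacy_account",
        "SHORT_TEXT": "name",
        "LONG_TEXT": "description",
        "ACCOUNT_GROUP": "group",
    }

    def build_lsmw_source_file(path_in="legacy_gl.csv", path_out="gl_upload.txt"):
        with open(path_in, newline="") as f_in, open(path_out, "w", newline="") as f_out:
            writer = csv.writer(f_out, delimiter="\t")
            writer.writerow(FIELD_MAP.keys())  # header row for the source file
            for row in csv.DictReader(f_in, delimiter=";"):
                writer.writerow(row[src] for src in FIELD_MAP.values())

    if __name__ == "__main__":
        build_lsmw_source_file()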
    Reg
    assign points if useful

  • Offline data migration fails for BLOB field from MySQL 5.0 to 11g

    I tried to use the standalone Data Migration tool several years ago to move a database from MySQL to Oracle. At that time it was unable to migrate BLOB fields. I am trying again, hoping this issue might have been fixed in the meantime. That does not appear to be the case. The rows in question have a single BLOB field (a binary encoding of a serialized Java object, containing on the order of 1-2K bytes, a mixture of plain text and a small amount of non-ASCII data which is presumably part of the structure of the Java object). The mysqldump appears to store the data correctly, surrounded by the expected <EOFD> and <EORD> separators. The data as imported consists of a small number (roughly 100-200) of ASCII characters, apparently hex encoded, because if I do a hex dump of the mysqldump I can recognize some of the character pairs that appear in the BLOB field after import. However, they are apparently flipped within the word or otherwise displaced from each other (although both source and destination machines are x86 family), and the imported record stops long before all the data is encoded.
    For example, here is a portion of the record as imported:
    ACED0005737200136A6
    and here is a hex dump of the input
    0000000 3633 3838 3037 3c39 4f45 4446 303e 3131
    0000020 3036 3830 3836 453c 464f 3e44 312d 453c
    0000040 464f 3e44 6e49 7473 7469 7475 6f69 446e
    0000060 7461 3c61 4f45 4446 ac3e 00ed 7305 0072
    0000100 6a13 7661 2e61 7475 6c69 482e 7361 7468
    0000120 6261 656c bb13 250f 4a21 b8e4 0003 4602
    0000140 0a00 6f6c 6461 6146 7463 726f 0049 7409
    0000160 7268 7365 6f68 646c 7078 403f 0000 0000
    AC ED appears in the 5th and 6th word of the 4th line, 00 05 in the 6th and 7th words, etc.
    I see explicit references to using hex encoding for MS SQL and other source DBs, but not for MySQL.
    I suspect the encoder is hitting some character within the binary data that is aborting the encoding process, because so far the records I've looked at contain the same data (roughly 150 characters) for every record, and when I look at the binary input, it appears to be part of the Java object structure which may repeat for every record.
    Here is the ctl code:
    load data
    infile 'user_data_ext.txt' "str '<EORD>'"
    into table userinfo.user_data_ext
    fields terminated by '<EOFD>'
    trailing nullcols
    (
    internal_id NULLIF internal_id = 'NULL',
    rt_number "DECODE(:rt_number, 'NULL', NULL, NULL, ' ', :rt_number)",
    member_number "DECODE(:member_number, 'NULL', NULL, NULL, ' ', :member_number)",
    object_type "DECODE(:object_type, 'NULL', NULL, NULL, ' ', :object_type)",
    object_data CHAR(2000000) NULLIF object_data = 'NULL'
    )

    It looks like the data is actually being converted correctly. What threw me off was the fact that the mysql client displays the actual BLOB bytes, while sqlplus automatically converts them to hex for display, but only shows about two lines of the hex data. When I check the field lengths, they are correct.
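    If you want to double-check a record, you can decode the hex string that sqlplus displays and look for the Java serialization magic bytes at the start. A minimal Python sketch (the sample string is just a truncated, even-length prefix of the value quoted above):

    import binascii

    # sqlplus shows BLOB columns as hex, while the mysql client prints the raw bytes.
    # Decoding the hex should give back the original bytes; "AC ED 00 05" at the start
    # is the Java serialization stream magic and version, so its presence suggests the
    # BLOB content survived the migration.
    shown_by_sqlplus = "ACED00057372"          # truncated sample from above
    raw = binascii.unhexlify(shown_by_sqlplus)
    assert raw[:4] == bytes([0xAC, 0xED, 0x00, 0x05]), "not a Java serialization stream"
    print(raw.hex(" "))                        # ac ed 00 05 73 72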

  • E-Recruiting 6.0 SP11 - How to do data migration or batch creation

    Hi Experts,
    For R/3, we can use BDC programs to do data migration for personnel records.
    For E-Recruiting, how do we go about migration of candidate records or batch creation of candidate records (e.g., BDC, a standard function module, or direct append)?
    If it is via function module, can you share with me the name of the function module?
    Thanks,
    William

    Hello William,
    The Business Partner is an application / module that belongs to the base components of SAP. It is used, and partly extended, by various other applications / modules. Besides E-Recruiting it is used, for example, by CRM and the financial services solution (FS-CS, FS-PM, FS-RI). All of these modules can put their data for a person or an organization into the same tables. Depending on the installation / system environment, or even within one single module, the requirements for available fields, business checks, and authorization differ between kinds of business partners (e.g. in FS-CS, the commission solution for the financial services sector knows external agents and internal employees, which have to be treated differently). The business partner is the element that assigns the logical / business role in which a person is handled by the system.
    For E-Recruiting you have two kinds of business partners, too. On the one hand there are people who are candidates, and on the other hand there are branches of your company which hire people. The configuration allows you to separate them if you ever need to identify which business partner is a branch and which is a candidate. I have never tried whether this really works; as there is no real use for it, I have never set it up. The attributes and the business checks are the same anyway.
    I hope that helps a bit in understanding the context.
    Best Regards
    Roman Weise
    PS: please remember that you have to maintain the branches via administrator bsp application. Using the IMG entry won't work.

  • Data migration i Finance

    Can anyone give me a short description of how data migration works in Finance?

    Hello
    Data migration from legacy to SAP is the responsibility of the various module leads.
    Assuming you are responsible for FI, the data could be GL/AP/AR/AA master data and balances.
    While there are tools for uploading, the first thing is mapping the legacy data to SAP data:
    which fields are available in SAP and whether they meet the legacy requirements. After this exercise, the ABAPer would write a conversion program, which can be tested in any of the test clients.
    As far as the fields in a PO are concerned, open an existing PO, sit with an MM colleague, decide on the mandatory fields, and map them to the legacy data.
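    A simple way to keep track of the mapping exercise before the ABAPer starts coding is to compare the mandatory SAP fields against the columns actually present in the legacy extract. A minimal Python sketch; all the field and file names below are made up for illustration.

    import csv

    # Hypothetical list of SAP fields agreed as mandatory, and the mapping
    # (SAP field -> legacy column) produced during the mapping exercise.
    REQUIRED_SAP_FIELDS = {"COMPANY_CODE", "GL_ACCOUNT", "POSTING_DATE", "AMOUNT", "CURRENCY"}
    FIELD_MAP = {
        "COMPANY_CODE": "comp",
        "GL_ACCOUNT": "acct",
        "POSTING_DATE": "date",
        "AMOUNT": "value",
    }

    def report_mapping_gaps(path="legacy_extract.csv"):
        with open(path, newline="") as f:
            legacy_columns = set(csv.DictReader(f, delimiter=";").fieldnames or [])
        unmapped = REQUIRED_SAP_FIELDS - set(FIELD_MAP)
        missing = {sap for sap, legacy in FIELD_MAP.items() if legacy not in legacy_columns}
        print("Mandatory SAP fields with no mapping:", sorted(unmapped))
        print("Mapped fields missing from the legacy extract:", sorted(missing))

    if __name__ == "__main__":
        report_mapping_gaps()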
    Reg
    *Assign points if useful

  • Fixed Asset Data Migration Wizard

    Hi All,
    I tried to use the Fixed Asset Data Migration Wizard to migrate my fixed asset data from 8.82 to 9.0 on the development server. The server is accessible only by me.
    When I start the migration run, the system prompts me: "To continue with the fixed asset data migration, you must be the only user logged on to this company data."
    In fact, I am the only user logged on to the company data, and the system still does not allow me to execute the migration.
    What could cause the system to think there is another user in the system?
    Please advise.
    Thanks in advance.
    Best Regards,
    Foong Yee

    Hi,
    If you're using SQL Server, try detaching the database (closing existing connections), re-attaching it, logging in again, and then executing the wizard again.
    Regards,
    Donald

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What is the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open purchase orders are in the legacy system in order to decide between manual and automatic data migration. If manual, how should it be done? If automatic, how should it be done? All material, vendor, and plant numbers are different between the systems, so how do we keep track and match the new numbers with the old ones?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy system migration workbench) to load MM master data (material masters), Inventory (WIP, RM, FG, etc) Purchasing Master data (Vendors, Purchase Info Records, Source Lists, Quota Arrangements), and Purchasing transactional documents (POs, PurReqs, Scheduling Agreements, etc).  Depending on the complexity and volume of data, it  may be necessary to write custom programs to load the data.  You will find this out during your requirements gathering.
    It is uncommon, but possible, to load all of these data manually.  I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal pecking away at loading master data, so if the client intends to have his own users enter the data manually, the project manager should make provision for qualified, TRAINED client employees to be available for this data entry.  I did once help with a portion of a conversion manually, for sales credits, but there were only about 30 SD docs to load.  I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data.  For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system).  Most of us do not understand the workings of the myriad legacy systems.  The client is usually expected to produce one or more legacy system technical experts for you to liase with.  You normally negotiate with the technical expert about every facet of of the data migration.  In addition, you will liase with business users, who will help you and the implementation team to logically validate that the final solution (turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned tracking the mapping of master data between legacy and SAP. There are many ways to do this. I normally try to get the legacy person to do the conversion on his end; when he gives you the load file, the master data should ideally already have been translated and the SAP-relevant values inserted into the file. If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC. If your data package is small, you can probably get by with MS Excel or similar.
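    As an illustration of using such a master map, the exported map can be applied to the legacy load file and any unmapped numbers reported before the load. A minimal Python sketch; the file layouts and column names are hypothetical.

    import csv

    # Hypothetical master map exported from Excel/Access: "legacy_vendor;sap_vendor".
    def load_vendor_map(path="vendor_map.csv"):
        with open(path, newline="") as f:
            return {r["legacy_vendor"]: r["sap_vendor"] for r in csv.DictReader(f, delimiter=";")}

    def translate_open_pos(path_in="open_pos_legacy.csv", path_out="open_pos_sap.csv"):
        vendor_map = load_vendor_map()
        unmapped = set()
        with open(path_in, newline="") as f_in, open(path_out, "w", newline="") as f_out:
            reader = csv.DictReader(f_in, delimiter=";")
            writer = csv.DictWriter(f_out, fieldnames=reader.fieldnames, delimiter=";")
            writer.writeheader()
            for row in reader:
                sap_vendor = vendor_map.get(row["vendor"])
                if sap_vendor is None:
                    unmapped.add(row["vendor"])   # report instead of loading bad data
                    continue
                row["vendor"] = sap_vendor
                writer.writerow(row)
        if unmapped:
            print("Unmapped legacy vendors:", ", ".join(sorted(unmapped)))

    if __name__ == "__main__":
        translate_open_pos()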
    Good Luck,
    DB49

  • Data migration of open production order

    Hello SAP gurus,
    Please advise me on the steps that should be followed for data migration of open production orders from the legacy system to SAP.
    Regards,
    Anand

