Historical Data migration

Hello Experts!
Our client is switching from SAP 4.7 to SAP ECC 6.0. We're going to move all open items (customers, vendors), assets, master data, etc. via LSMW.
One question remains: what happens with the historical data? The client is not willing to keep the old system just for looking up old data (this would generate a yearly hosting fee). The client has IXOS as archiving tool.
My question is really whether it is possible to copy the whole set of transactional data across or not.
Thanks for the answers.
Regards
FX

Hi,
Your situation is a bit unclear... If you are going to upgrade your SAP system, why do you have to migrate the data via LSMW?
Regards,
Eli

Similar Messages

  • Historic data migration (forms 6i to forms 11g)

    Hello,
    We have done a migration from Forms 6i to Forms 11g and are facing a problem with the historic data for a file download/upload
    utility. In Forms 6i the upload/download was done using the OLE Container, which has become obsolete; the replacement technology is WebUtil.
    We converted the historic data from LONG RAW to BLOB (by export/import and by TO_LOB), but when opening the files we either get an error
    message or they fail to open at all. The issue exists for all document types (.doc, .docx, .html, .pdf); we cannot open the documents
    after downloading them to the local client machines.
    One option that works is to manually download the documents (PDF, DOC, etc.) from the old Forms 6i version (OLE) and
    upload them to Forms 11g (WebUtil). Is there any way this can be automated?
    Thanks
    Ram

    Are you colleagues?
    OLE Containers in Oracle Forms 6i

  • Data migration for historical consumption

    Hello,
    I would like to get information about data migration for historical consumption.
    To run the forecast, we have to load the historical consumption for each material (data in the Forecasting view / consumption values).
    Do you know if this is possible with batch/direct input, or should we develop a program for that?
    Thanks,
    PYR
    Edited by: Pierre-Yves Ryckwaert on Jul 17, 2008 11:21 AM

    I am certain that you can load historical consumption using the IDoc method with message type MATMAS03.
    However, I have never uploaded historical consumption as an extra step after the material master migration, but I guess this should work as well with this IDoc. Just make sure that you do not overwrite other data in the material master. (A small sketch of preparing the consumption records as a flat load file follows below.)
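    A small sketch (mine, not from the original reply) of preparing a tab-delimited load file with period-based consumption values per material, assuming the load is done via LSMW or a custom program rather than by hand. The column labels and the file name are invented placeholders and must match whatever structure your LSMW object or IDoc mapping actually expects.
    import csv

    # Hypothetical legacy extract: one row per material / plant / period.
    legacy_consumption = [
        {"material": "100-100", "plant": "1000", "period": "2008-04", "qty": "1250"},
        {"material": "100-100", "plant": "1000", "period": "2008-05", "qty": "980"},
    ]

    # Write a tab-delimited file that an LSMW object or IDoc mapping can read.
    with open("consumption_upload.txt", "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["MATNR", "WERKS", "PERIOD", "QUANTITY"])  # header labels are assumptions
        for row in legacy_consumption:
            writer.writerow([row["material"], row["plant"], row["period"], row["qty"]])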

  • Data migration

    Dear all my friends,
    Please share with me how to handle data migration before go-live of the FI system.
    I plan to migrate GL account balances, AP/AR open items and the last 3 months of historical data.
    Please show me: how should I do it?
    Thanks!

    1. Normal GL accounts: In this case line items are not required and only the totals are to be updated in SAP. This can be done using GL fast entry if the number of accounts is not too high.
    2. Open-item-managed GL accounts: All open line items have to be uploaded into SAP from the legacy system so that they can be cleared later. Hence, an LSMW / BDC program has to be developed to upload the line items. The usual BAPIs are:
    BAPI_ACC_DOCUMENT_POST - for GL/AP/AR
    BAPI_ACC_INVOICE_RECEIPT_POST - for AP
    BAPI_ACC_GL_POSTING_POST - for GL
    (A minimal call sketch for BAPI_ACC_DOCUMENT_POST follows at the end of this reply.)
    GL balances
    F-02 or RFBIBL00 or BAPI
    Dr GL a/c, Cr Data Migration Account or Clearing Account
    3. Customer & vendor open items: All open line items in customer and vendor accounts have to be brought into SAP. For this as well, an LSMW / BDC program needs to be developed to upload the line items.
    Vendor balances
    F-02 or RFBIBL00 or BAPI
    Cr Vendor a/c (individually), Dr Data Migration Account or Clearing Account
    Customer balances
    F-02 or RFBIBL00 or BAPI
    Dr Customer a/c (individually), Cr Data Migration Account or Clearing Account
    4. Asset Accounting: If Asset Accounting is implemented, the individual balances of each asset (gross value and accumulated depreciation) have to be uploaded using transaction AS91. For this you also need to develop an LSMW / BDC program. Further, you need to update the GL accounts related to the asset accounts with the totals for each asset class using transaction OASV.
    For uploading the assets:
    AS91 -> for uploading the assets in Asset Accounting
    OASV -> for uploading the assets in GL; entry: Dr Asset (individually), Cr Data Migration Account or Clearing Account
    You can use LSMW for uploading all of these.
    After uploading everything, your data migration account should come to zero.
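    As mentioned in point 2 above, here is a minimal sketch of posting a single migration document through BAPI_ACC_DOCUMENT_POST from outside SAP, using the pyrfc connector. The logon data, accounts and amounts are made-up placeholders and the field list is cut down to a bare minimum, so treat it as an illustration of the call-check-commit pattern rather than a finished load program.
    from decimal import Decimal
    from pyrfc import Connection

    # Placeholder logon data - replace with the real target system details.
    conn = Connection(ashost="sapserver", sysnr="00", client="100",
                      user="MIGRATION", passwd="secret")

    # One take-over document: Dr a GL account, Cr the data migration (clearing) account.
    header = {"USERNAME": "MIGRATION", "COMP_CODE": "1000",
              "DOC_DATE": "20081231", "PSTNG_DATE": "20081231",
              "DOC_TYPE": "SA", "HEADER_TXT": "Legacy balance take-over"}
    gl_items = [
        {"ITEMNO_ACC": "0000000001", "GL_ACCOUNT": "0000400000"},  # debit line (placeholder account)
        {"ITEMNO_ACC": "0000000002", "GL_ACCOUNT": "0000999999"},  # credit line: migration/clearing account
    ]
    amounts = [
        {"ITEMNO_ACC": "0000000001", "CURRENCY": "EUR", "AMT_DOCCUR": Decimal("1000.00")},
        {"ITEMNO_ACC": "0000000002", "CURRENCY": "EUR", "AMT_DOCCUR": Decimal("-1000.00")},
    ]

    result = conn.call("BAPI_ACC_DOCUMENT_POST", DOCUMENTHEADER=header,
                       ACCOUNTGL=gl_items, CURRENCYAMOUNT=amounts)

    # The BAPI does not commit on its own: check RETURN, then commit or roll back.
    errors = [m for m in result["RETURN"] if m["TYPE"] in ("E", "A")]
    if errors:
        conn.call("BAPI_TRANSACTION_ROLLBACK")
        print("Posting failed:", errors)
    else:
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
        print("Document posted.")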

  • Data Migration of FI-CA Cleared Items

    Hi,
    Currently I am working on a data migration project to migrate data from our FI-CA module on ERP 4.7 to ECC 6.0.
    There has been a request from the business to migrate historical data (e.g. cleared items).
    Is there an SAP-recommended approach or tool set to load this data into the target environment?
    All the documentation around SAP data migration talks about strategies for open item migration; I have seen nothing about migrating historical financial data.
    Is this because it is not recommended, or is it technically impossible?
    Regards
    Adam Gunn

    That BAPI is typically used for straight vanilla GL, AR and AP postings; however, you still have to create the other side of the entry and then clear it.
    I need to be able to migrate the full history, which means from an FI-CA viewpoint:
    1. Migrate the FI-CA posting of the liability against the BP/contract account.
    2. Migrate the associated payments.
    3. And then the clearing documents.
    Basically, the requirement is to represent the historical data in the new system as if it had been posted and matched off there.
    Is there a technical way to do this?
    OR,
    Do you migrate the FI-CA liabilities, then the associated payments, and then run clearing on the target system?
    I suspect this is an almost impossible data migration requirement, as development of the extraction and load process would be extremely complex and testing would take months to cover all posting scenarios in FI-CA. However, I would be interested to hear if anyone has attempted this before.
    Adam

  • BW Upgrade 3.0 to 7.0 (Data Migration for Prod)

    Hello Experts,
    We are doing a BW upgrade from 3.0 to 7.0 and I need your suggestions.
    We are going to upgrade the BW development, quality and production landscape to 7.0, and once the technical upgrade is done we will go ahead with the functional upgrade.
    The point is: once we are done with all developments in the production environment, how do we bring across the historical data?
    We are planning to connect the old BW system (3.x) as a source system to the new BW 7.0 and then pull the data from it.
    But as we need to move data from standard DataSources like LO, we need downtime for the production environment. Our challenge is to minimize this production downtime.
    Does anybody have an idea about this? I am aware of one IBM tool, TDMF, which migrates data from one database to another with hardly 2 hours of downtime, but licensing costs etc. are involved there.
    Please share your experiences and suggestions.
    Thanks in advance.
    Regards,
    Pankaj Naik

    You would not need to do a data migration if you are upgrading the existing 3.0 system in place together with its data.
    Edited by: Henry Janz on Jun 9, 2011 5:09 AM

  • SAP Accelerated Data Migration

    Hi All,
    Could someone kindly provide more info about SAP ADM? I am unable to get any information about it. I would like to understand how exactly the tool works. I have the 4-page PDF that is posted on the site, but it is not clear on how the actual tool works. Can someone kindly provide me with a document, screenshots or any more info about it? My mail id is [email protected]. Could someone kindly reply at the earliest?
    Thanks
    Prahlad

    Hi Prahlad,
    Go through this; I hope it helps you understand.
    With SAP Accelerated Data Migration, you can reduce migration costs by as much as 50% and avoid interruption of business processes. Moreover, shutting down the source system after migration reduces system administration costs and total cost of operations. In short, you realize the following benefits:
    • Significantly reduced cost and time to complete migration projects
    • Accurate, cost-effective data transfer applicable for any kind of source system
    • Better data quality because of preconfigured business objects that ensure data consistency
    • Improved end-user productivity and acceptance thanks to migration of historical data
    • Effective migration that avoids interruption of business processes
    • Full support services to avoid risks and ensure the optimum performance of your new business applications
    • Faster return on investment
    In short, a smoother, more cost-effective migration to a new technology solution ultimately positions your organization to lower your total cost of ownership, maintain competitive advantage, and pursue new business opportunities.
    Expertise in Action
    SAP Accelerated Data Migration applies a business-object-oriented, two-step approach that uses a neutral interface as a staging area and predefined migration content for the conversion and upload of data. The neutral interface enables the SAP tool to generate predefined migration content and prevents all potential legal issues regarding the intellectual property of any source-system vendor. The whole data migration process from the source to the target system consists of just two steps, as follows:
    1. Data is extracted from the source system into the standard interface as XML files.
    2. Data migrates from the interface into the mySAP Business Suite database. The migration is based on a new "migration workbench" engine developed by SAP on the SAP NetWeaver® platform. All requirements for mapping structures and fields and developing complex conversion rules are solved within this engine (see Figure 1).
    (A small extraction sketch illustrating step 1 follows at the end of this reply.)
    Once the migration is complete, business-unit end users have access to all the legacy data in the new applications as if it had originated there. They can continue to work on the existing business process items in the new applications and benefit from improved functionality.
    Lifting the Limitations
    Much of the cost and effort involved in classical data migrations is generated by migration content development, as follows:
    • Identifying the business objects to migrate in order to properly support the business
    • Defining the structure and field mapping for the relevant business objects
    • Developing conversion rules for all necessary value mapping
    Readily available migration content can simplify this effort. SAP Accelerated Data Migration provides preconfigured business content, helping you migrate to your new system more efficiently and rapidly. The tool allows the migration of all types of data, independent of its current state within a business process. This includes master and dynamic data, as well as partially processed and historical data, to minimize data loss. Business processes are uninterrupted and normal operating procedures can be retained.
    By providing a standard, neutral interface and reading data as an XML file, SAP Accelerated Data Migration is applicable for any kind of source system. Preconfigured data migration objects built specifically for SAP applications significantly simplify the conversion of non-SAP software data into SAP software data objects, yielding far-reaching benefits. Besides reducing related IT costs, you can be certain of consistency across business-object boundaries. Through a direct insert into the database, you avoid the performance limitations of classical data migration.
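    To make the two-step approach above more concrete, here is a small illustrative sketch (mine, not part of the ADM documentation) of step 1: extracting legacy records from a source database into a neutral XML staging file that a migration engine could then map and load. The database, table, column names and output file are invented placeholders.
    import sqlite3
    import xml.etree.ElementTree as ET

    # Step 1: dump legacy data into a neutral XML staging file.
    src = sqlite3.connect("legacy_system.db")          # placeholder source database
    rows = src.execute("SELECT customer_id, name, city, open_amount FROM customers")

    root = ET.Element("MigrationObjects", {"object": "CUSTOMER"})
    for customer_id, name, city, open_amount in rows:
        rec = ET.SubElement(root, "Record")
        ET.SubElement(rec, "CustomerID").text = str(customer_id)
        ET.SubElement(rec, "Name").text = name
        ET.SubElement(rec, "City").text = city
        ET.SubElement(rec, "OpenAmount").text = str(open_amount)

    # Step 2 (not shown) would map this staging file onto the target business objects
    # and post it into the SAP database via the migration engine.
    ET.ElementTree(root).write("customers_staging.xml", encoding="utf-8", xml_declaration=True)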
    Reward points if helpful.
    Thanks

  • Upload PR/PO Historical data

    Hi
    How can I upload historical SAP PR/PO data from one SAP system to another?
    I have downloads of all the relevant data tables from the old system. Now I need to upload this PR/PO historical data into the new SAP system.
    There is no link between the old SAP system and the new SAP system; they are owned by different organizations/owners.
    Immediate replies will be appreciated.
    Thanks & Regards
    Raju

    Hi Raju,
    Loading POs that already have transactions against them is problematic, since the follow-on documents can't be loaded. If a PO is completely closed and you load it into the new system, you should block it; otherwise it will still be taken into account in SAP.
    If you load a PO where GR > IR, the GRs can't be loaded, which means the invoice posted against this PO will leave it unbalanced. You can of course close the PO using MR11, but I think it is much easier just to skip this PO and register the incoming invoice directly in FI.
    If you load a PO where IR > GR, the IRs can't be loaded, which means that the incoming GR will leave the PO unbalanced. So here too I would suggest skipping this order and registering the incoming GR without reference to the PO (movement type 501).
    In all my projects we only migrated completely "naked" POs and, besides that, tried to minimise the number of such orders: let the buyers chase and complete the orders before the cut-over date; they can do it. In that case the number of open orders will normally be so low that it won't justify an automatic migration. On the other hand, entering the orders in the new system manually is good training for the buyers.
    If the reason for loading the closed POs is to have purchasing statistics in the new system, then this can be done very easily by updating the corresponding LIS tables directly (S011, S012). Another tip is to load the data both into version 000 (actual data) and into some other version in these tables, so that you can always copy it back to 000 whenever you need to reload LIS for some reason.
    BR
    Raf
    Edited by: Rafael Zaragatzky on Jun 3, 2011 8:28 PM

  • Report painter.....PCA reports & historical data issue

    I need to create a Report Painter PCA report for the P&L that will do the following:
    --> Consolidated P&L for 2 company codes
    --> Company code 1 - YTD figures to include full year
    --> Company code 2 - YTD numbers to include only 2nd half year as company 2 was acquired in middle of year
    The problem is that we are migrating the historical data for the last two years, and I do not want the report to pick this data up.
    Suggestions?

    anyone?
    Thanks

  • Data Migration from CRM 5.0 to CRM 7.0

    Hi Friends,
    We are into a re-implementation on CRM 7.0. The old system was on CRM 5.0.
    Since this is not an upgrade, the data will not be copied over from the old system into the new system.
    So, now I would like to know how to migrate the data (master/transaction) from CRM 5.0 into 7.0.
    I have read that we can make use of BAPI/LSMW/IDoc/XIF adapters to push data into CRM.
    But the customer wants all the historical data to be in the new system.
    I think the master data can be handled, but how do we migrate the transaction data? I mean the number ranges, document flow and other relationships: how can we maintain the same links in the new system?
    If it were a migration from a legacy system into SAP CRM, we could have gone ahead with new number ranges etc., but this migration is from SAP to SAP.
    Also, we will have ECC 6.0 connected to CRM 7.0. Are there any additional things to take care of when the connection is live in the DEV/QA/PRD instances?
    Any pointers on this?

    Hi Gary,
    As per my understanding, your data migration involves quite complex scenarios covering both the CRM and the ERP box. As the customer needs the old data to be present in the new implementation, that data has to be migrated from CRM 5.0 to CRM 7.0. Before the migration can happen, you need to set up the customizing (number ranges, condition data tables, etc.); only then can you proceed with the data migration. You also need a proper backup plan, as the whole exercise relies on a proper backup of the data.
    Once the setup is complete, you need to connect the ECC 6.0 system to the CRM 7.0 system. Here as well, there are a lot of settings to make, followed by the initial load of the adapter objects. But before you proceed with the entire operation, the complete system landscape must be properly defined with the appropriate settings.
    My recommendation is to get help from SAP consultants and data migration experts, as they are the best people to guide you on this. It would also be worth going through the SAP Best Practices documents, which are a good reference for this.
    Have a look at this link for an overview of CRM data migration:
    http://help.sap.com/saphelp_crm60/helpdata/en/1a/023d63b8387c4a8dfea6592f3a23a7/frameset.htm
    Hope this helps.
    Thanks,
    Samantak.

  • Data migration from SAP to Oracle

    Can anyone suggest the best possible method for migrating master and transactional data from SAP to Oracle?
    Please also suggest methods for getting the data out of SAP as text files.

    The best approach would be to use Electronic Data Interchange (EDI): outbound from SAP, inbound to Oracle. There are two widely used EDI message standards, ANSI X12 and EDIFACT; I'm pretty sure the Oracle system will be able to accept one of the two standards, or both.
    In my last message, when I asked whether Oracle is able to receive and process IDocs, I actually meant whether Oracle will be able to accept and process EDI messages. IDocs are containers for the EDI message, i.e. a data container; their format is SAP proprietary and will not make sense to your target system (Oracle). EDI has been around for many years and is the standard way of electronically exchanging business documents between the computer systems of business partners, using a standard format over a communication network.
    Regarding your question: I'm not really sure whether you want to send master/transactional data in real time from SAP to Oracle, or whether you want to transfer the historical data, i.e. download SAP tables (through SE16, which will export the field labels as well) into text files and then upload them into Oracle using a custom program that reads/maps the files and then posts the transactional data / updates the master data tables in Oracle as required by that system (a small sketch of such a file-mapping step follows below). So let me know as many details as possible.
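    A minimal illustrative sketch (mine, under assumed file layouts) of the "download from SE16, map, reload" path described above: it reads a tab-delimited SE16 export whose first line contains the field labels and writes a CSV that an Oracle-side loader (e.g. SQL*Loader or an external table) could pick up. The file names and the field mapping are placeholders.
    import csv

    # Map SAP field names (from the SE16 header line) to target Oracle column names.
    # This mapping is a made-up example - adjust it to the table you actually export.
    FIELD_MAP = {"KUNNR": "CUSTOMER_ID", "NAME1": "NAME", "ORT01": "CITY"}

    with open("KNA1_se16_export.txt", newline="", encoding="latin-1") as src, \
         open("customers_for_oracle.csv", "w", newline="") as dst:
        reader = csv.DictReader(src, delimiter="\t")
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for sap_row in reader:
            writer.writerow({oracle_col: sap_row.get(sap_field, "").strip()
                             for sap_field, oracle_col in FIELD_MAP.items()})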
    It would be nice to show appreciation by rewarding points.
    Cheers,
    Sougata.

  • Standards data migration process from Hyperion Enterprise to HFM 11.1.2.1

    The client currently has Hyperion Enterprise (HEM) and we are implementing EPM v11. I am wondering whether there is any standard process to migrate historical data from HEM 6.5 to HFM 11.1.2.1?

    First you need to migrate to HFM v9.3.1 (as a fresh implementation of HFM) and then upgrade from 9.3.1 to 11.1.2.1.

  • Historical data transfer in HRMS

    Hi,
    I need the best possible way of transferring the historical data into HRMS. I have looked at ADE and Web ADI; do any of you know how to use them for this?
    Regard,
    Shahzad

    Migrating historical data into Oracle HRMS is not the easiest task in the world. I have experimented with Web ADI because it makes use of the API calls.
    In my experience I would suggest only using Web ADI for small amounts of data in non-complex scenarios. If you are considering migrating a full set of historical data I would suggest a more technical approach:
    a combination of SQL*Loader to stage the data, followed by PL/SQL procedures that call the APIs to insert/update the data, etc.
    As cdunaway mentions, it may really depend on whether your HRMS environment is live. The PL/SQL procedure route gives you much more flexibility, as you can apply logic to the data load. Due to DateTrack this can get messy: Correction / Update / Insert / Insert Override. Ouch.

  • Treasury Data Migration

    Friends,
    One of our clients has decided to use SAP Treasury and Risk Management. I have been asked to lead the data migration in this area. For this, I would like to know the master and transaction data generally used and needed for SAP Treasury Management. Also, please let me know the various ways of migrating data into SAP TRM, e.g. BAPIs, the Transaction Manager, LSMW, etc.
    Regards,
    Amey

    Hello Amey,
    Please find below a list of steps which might be helpful for the data migration.
    Legacy Data Transfer
    1. Legacy data transfer always happens on a key date.
    2. The procedure is different for securities and for OTC transactions because they deal differently with positions on a key date.
    3. For securities, we simply need to copy the positions with all their values as of the key date.
    4. For OTC transactions, the entire transaction is copied, as there is no explicit position management in the Transaction Manager.
    Customizing the Legacy Data Transfer:
    You primarily need customizing for the update types that the flows of the legacy data transfer business transactions are to carry. You can find the settings under Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Flow Data -> Update Types.
    The update types are divided into valuation-area-independent and valuation-area-dependent ones; the independent ones only relate to securities. While the legacy data transfer is usually not intended to update the general ledger, you must nevertheless mark most of the update types as relevant to posting and maintain an account determination in customizing. This is necessary to ensure that the data in position management is complete and, in particular, that the account assignment reference transfers can be performed correctly.
    Legacy Data Transfer for OTC:
    Step 1: Existing OTC transactions are entered the same way as new transactions. Enter them either manually or use the existing BAPIs to import them. Use BAPI_FTR_FD_CREATE and BAPI_FTR_FXT_CREATE for creating fixed-term deposit and forex transactions respectively. Similarly, there are BAPIs for each product; you can get the list in transaction BAPI under the node Financials -> Financial Supply Chain Management. Use transaction FTR_BAPI to test-run the Treasury BAPIs. You may define a separate transaction type for legacy transactions so that settlement or posting release is not necessary.
    Step 2: Now we must change the status of all the historical flows created in step 1 to POSTED without actually generating accounting documents. To do this, use transaction TBB1_LC. It is similar to transaction TBB1, but unlike TBB1, TBB1_LC affects all valuation areas without giving the option to affect only the operational area. You can see the flows posted by TBB1_LC in the posting journal (TPM20) without any reference to accounting documents.
    Step 3: So far the historical flows do not yet explain the book value of the OTC transactions, hence we must first enter the historical valuation results. This is done using TPM63C, which you can find under Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Flow Data -> Enter Valuation Area Dependent Data for MM. After entering the valuation results, use TPM63 to transfer the values to internal position management.
    Legacy Data Transfer for Futures & Listed Options:
    Step 1: Similar to OTC.
    Step 2: Similar to OTC.
    Step 3: Use transaction TPM63D (valuation area dependent data for futures). We must establish the relationship between the lot and the transaction on which it is based via the transaction number.
    Legacy Data Transfer for Securities:
    For a securities position it does not matter whether it arose in the past through a single purchase or through a series of business transactions. The legacy data transfer therefore seeks to build up the positions through one business transaction on the transfer key date.
    Step 1: Before we enter business transactions, we must first create the master data of the security positions, i.e. their position indicators. I) Use TPM61A or the menu path Treasury and Risk Management -> Transaction Management -> General Settings -> Tools -> Legacy Data Transfer -> Position Data -> Enter Position Information for Securities. II) Then create the position indicators using TPM61.
    Step 2: Now we enter the business transactions. I) Valuation area independent data for securities -> for the external position. II) Valuation area dependent data for securities -> for the internal position. Nevertheless, we need to use the "Set Effects of Update Type on Position" setting to change the positions. If you want to enter lot positions with the legacy data transfer transactions, use the item number, and ensure that the business transaction generating the position and the one carrying the valuation results have the same item number. If you have built up lots via the normal procedures, you must specify the corresponding transaction number in the transaction column when entering the valuation area dependent business transaction.
    Step 3: Now generate the business transactions using TPM63. You can initially build only the valuation area independent data (the external position) by checking the "only valuation area independent data" button in TPM63. The external position can be checked in the securities account cash flow, i.e. the external securities position, in TPM40.
    We also created an LSMW for each product. (A small generic call-and-commit sketch for the Treasury BAPIs follows below.)
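    As referenced above, a small generic sketch (mine, not from the original reply) of the call-and-commit pattern for loading legacy deals through the Treasury transaction BAPIs via the pyrfc connector. The BAPI name is taken from the reply, but its import structures differ per product type, so the parameters dict is deliberately left as a placeholder to be filled from what transaction BAPI / FTR_BAPI shows for your release.
    from pyrfc import Connection

    def call_and_commit(conn, bapi_name, **params):
        """Call a BAPI, check its RETURN messages, then commit or roll back."""
        result = conn.call(bapi_name, **params)
        messages = result.get("RETURN", [])
        if isinstance(messages, dict):          # some BAPIs return a single structure
            messages = [messages]
        if any(m["TYPE"] in ("E", "A") for m in messages):
            conn.call("BAPI_TRANSACTION_ROLLBACK")
            raise RuntimeError(f"{bapi_name} failed: {messages}")
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
        return result

    conn = Connection(ashost="sapserver", sysnr="00", client="100",
                      user="MIGRATION", passwd="secret")   # placeholder logon data

    # The actual import structures for BAPI_FTR_FD_CREATE (deal data, company code,
    # business partner, etc.) must be looked up in transaction BAPI / FTR_BAPI;
    # the empty dict below is only a placeholder for that product-specific data.
    deal_parameters = {}
    call_and_commit(conn, "BAPI_FTR_FD_CREATE", **deal_parameters)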
    Regards,
    Jain

  • CRS Historical reports - Migration

    Hello,
    I have a customer with CRS 4.0(2) that has recently been upgraded to CRS 4.0(5) on a different server. Is it possible to move the historical reporting data to the new server somehow?
    I don't have the installation CDs for 4.0(2), so it is not possible to reinstall 4.0(2), apply the restore and then perform the upgrade to 4.0(5); another solution would be appreciated.
    Thanks in advance

    If the database schema is the same, then you can migrate the historical reporting data to the new server. Only the migration of the historical reporting data occurs during the upgrade process.
