Data Migration Infotype load sequence

Hi All,
I am migrating data from our legacy system to SAP HR infotypes and would like to know what the sequence and dependencies are. There are two of us doing the work, and I was hoping we could load some infotypes simultaneously, but I realise there are dependencies between some of them. I am doing the following infotypes:
Initial Conversion
Current Conversion
Infotype 0006
Infotype 0007
Infotype 0008 (0007 needs to be complete)
Infotype 0009
Infotype 0014
Infotype 0015
Infotype 0016
Infotype 0019
Infotype 0021
Infotype 0025
Infotype 0032
Infotype 0040
Infotype 0041
Infotype 0045
Infotype 0071
Infotype 0077
Infotype 0088
Infotype 0105
Infotype 0171
Infotype 0167 (0171 needs to be complete)
Infotype 0442
Infotype 2006
Infotype 2001 (2006 needs to be complete for holidays)
Infotype 9000
And a colleague is doing these:
Infotype 1000
Infotype 1001
Infotype 1005
Infotype 1007
Infotype 1008
Infotype 1013
Infotype 1050
I am told that my colleague's infotypes (1000 to 1050) all need to be migrated before I can start my infotypes (top list).
Is this true? 
Or can I start loading after she has completed 1000?
Is there a Best Practice sequence list somewhere on the net?
Thanks in advance for your help.
Regards,
Neal.

Neal,
   Have you tried looking for cookbooks on the SAP site? They may tell.
However, the basic principle is that you will be building your organisation structure (IT 1000) so that when you add the employees you can assign them to the org structure. So yes, after you have added your IT 1000 records (org units, positions, etc.) you can add the employees. If you aren't assigning your employees into the org structure at load time, then you would not need to wait. I have worked on a project where employees were loaded to a dummy org unit, as the structure was not defined by the time the employee data needed to be loaded.
Hope that helps,
Gary

Similar Messages

  • Data Migration- Infotypes BDC Recording

    Hi All,
    I am currently working on a data migration project.
    Before I joined, the upload programs here were developed using function modules. With those, we can see the records in the overview, but when we go to change mode nothing appears on the screen. We then tried a BDC recording for IT0105, and that works perfectly.
    So I need some help: does anyone have an idea of, or a list of, the infotypes which need a BDC recording?
    I need the list of infotypes so that I can make sure we use BDC for them.
    It's a priority, please share your inputs.
    Thanks,
    Anitha

    Hey Anitha,
    BDC and LSMW are both data loading techniques.
    In both cases you will need input files in a specific format so that they can be read by your BDC or LSMW program.
    The main difference is:
    Using BDC you can validate the data in the system. Suppose you want to check whether an employee is active before loading the data; then BDC is advisable, because you can check the employment status of the employee and then upload the data, or else issue a message that the employee is terminated and the record has been skipped.
    If it is a straightforward upload without any validation, then go for LSMW.
    Finally, it depends on your requirements.
    Cheers
    Ajay
    As stated above, IT 0008 has to be filled using a table control, hence BDC is helpful.
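    To illustrate Ajay's validation point, here is a minimal, hedged ABAP sketch (the employee number is invented; PA0000-STAT2 = '3' denotes an active employee in the standard delivery, but check your own customizing; the actual BDCDATA build for the infotype screens is omitted):

    REPORT z_bdc_validate_demo.

    DATA: lv_pernr TYPE pa0000-pernr VALUE '00001234', "illustrative employee number
          lv_stat2 TYPE pa0000-stat2.

    * Validate before loading: is the employee active today?
    SELECT SINGLE stat2 FROM pa0000
      INTO lv_stat2
      WHERE pernr = lv_pernr
        AND begda <= sy-datum
        AND endda >= sy-datum.

    IF sy-subrc <> 0 OR lv_stat2 <> '3'. "'3' = active in the standard delivery
      WRITE: / 'Employee', lv_pernr, 'is not active - record skipped.'.
    ELSE.
    * ...build BDCDATA for the infotype screens and CALL TRANSACTION 'PA30' here...
      WRITE: / 'Employee', lv_pernr, 'is active - proceeding with BDC upload.'.
    ENDIF.

    This kind of pre-check before the upload is exactly what a plain LSMW standard load does not give you out of the box.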

  • Logistic Data source and loading sequence

    Hi Gurus ,
    Can anyone help explain the use of 2LIS_02_CGR and 2LIS_02_SCN?
    I also need to know the data loading sequence for the following DataSources:
    2LIS_02_HDR,
    2LIS_02_ITM,
    2LIS_02_SCL,
    2LIS_02_SGR,
    2LIS_02_CGR,
    2LIS_02_SCN
    Thanks for your help

    Hi,
    i. Purchasing Data (Header Level)
    Technical name: 2LIS_02_HDR
    Type of DataSource: Transaction Data (Movement Data)
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.0B
    Available as of Plug-In Release: PI 2000.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: The DataSource is used to extract the basic data for analyses of purchasing documents consistently to a BW system.
    ii. Purchasing Data (Item Level)
    Technical name: 2LIS_02_ITM
    Type of DataSource: Transaction Data (Movement Data)
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.0B
    Available as of Plug-In Release: PI 2000.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: The DataSource is used to extract the basic data for analyses of purchasing documents consistently to a BW system.
    Delta Update: A delta update is supported. Delta process: ABR - complete delta update with deletion indicator using the delta queue (cube-compatible).
    iii. Purchasing Data (Schedule Line Level)
    Technical name: 2LIS_02_SCL
    Type of DataSource: Transaction data
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.0B
    Available as of Plug-In Release: PI 2000.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: This DataSource extracts consistent basic data for analyzing purchasing documents to a BW system.
    Delta Update: A delta update is supported. Delta process: ABR - complete delta with deletion indicators using the delta queue (cube-compatible).
    iv. Allocation - Schedule Line with Goods Receipt
    Technical name: 2LIS_02_SGR
    Type of DataSource: Transaction Data (Movement Data)
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.0B
    Available as of Plug-In Release: PI 2002.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: This DataSource is used to extract the schedule line quantities allocated with goods receipt quantities consistently to a BW system.
    Delta Update: A delta update is supported. Delta process: ABR - complete delta update with deletion indicator using the delta queue (cube-compatible).
    v. Allocation - Confirmation with Goods Receipt
    Technical name: 2LIS_02_CGR
    Type of DataSource: Transaction Data (Movement Data)
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.5B
    Available as of Plug-In Release: PI 2002.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: This DataSource is used to extract the confirmation quantities allocated with goods receipt quantities consistently to a BW system.
    Delta Update: A delta update is supported. Delta process: ABR - complete delta update with deletion indicator using the delta queue (cube-compatible).
    vi. Allocation - Confirmation with Schedule Line
    Technical name: 2LIS_02_SCN
    Type of DataSource: Transaction Data (Movement Data)
    Application Component: Materials Management (MM)
    Available as of OLTP Release: SAP R/3 4.5B
    Available as of Plug-In Release: PI 2002.1
    RemoteCube Compatibility: No
    Prerequisites: Activation of the DataSource.
    Use: The DataSource is used to extract the schedule line quantities allocated with confirmation quantities consistently to a BW system.
    Delta Update: A delta update is supported. Delta process: ABR - complete delta update with deletion indicator using the delta queue (cube-compatible).
    If it helps, assign points.
    Thanks,
    Akshay a.

  • Data Migration Tool - Loading C4C Accounts

    Hello gurus
    Our scenario includes our C4C tenant connected to CRM OP via PI/PO with integration built and tested for Accounts and other objects.
    When we manually create an account in C4C, it replicates correctly to CRM OP. And we assume that if we use the C4C DM tool to load account data into C4C, it would automatically trigger web services to send the corresponding data to CRM via PI/PO, the same way as it does successfully for manually created transactions.
    Is there a limitation that anyone is aware of that will not allow data created via the DM tool to be replicated to OP systems?
    Let me know your thoughts please.
    Regards
    Jai

    Hi Ginger
    Thanks for the reply back. I am of the same opinion: it should work. It should be no different from, say, an ERP-to-CRM (and reverse) connectivity scenario via CRM middleware, where we upload the data in one client and it then gets processed on the slave system automatically.
    However, in our C4C case, when we use the DM tool for accounts, the behavior is inconsistent, and it does not even generate the web service that we can view and monitor in the Admin facet. So nothing is getting passed to PO to be sent to CRM OP. I did see that BP relationships via the DM tool generate a message in the WS monitor. I am assuming the CAs that would be triggered for loading data via the DM tool will be no different than if we were to create the data manually in C4C.
    Is there any way we could run a trace on data loaded via the DM tool to see where the hang-up may be?
    Thanks Ginger.
    Jai

  • Want to use sequence object of oracle when loading data in sql loader

    Hi,
    I want to use a sequence when loading data with SQL*Loader, but the problem is I could not use an Oracle sequence object to load the data with SQL*Loader; I can only use SQL*Loader's own sequence.
    I want to use the Oracle sequence object because this sequence object will be used for later entries. If I use SQL*Loader's sequence, how can I use the Oracle sequence object?
    Is there any other option?

    I have a similar problem: I also want to use a sequence when loading data with SQL*Loader.
    My control file is:
    load data
    infile '0testdata.txt'
    into table robertl.tbltest
    fields terminated by X'09'
    trailing nullcols
    (redbrojunos,
    broj,
    dolazak,
    odlazak nullif odlazak=blanks,
    komentar nullif komentar=blanks)
    And the datafile is:
    robertl.brojilo.nextval     1368     17.06.2003 08:02:46     17.06.2003 16:17:18     
    robertl.brojilo.nextval     2363     17.06.2003 08:18:18     17.06.2003 16:21:52     
    robertl.brojilo.nextval     7821     17.06.2003 08:29:22     17.06.2003 16:21:59     
    robertl.brojilo.nextval     0408     17.06.2003 11:20:27     17.06.2003 18:33:00     ispit
    robertl.brojilo.nextval     1111     17.06.2003 11:30:58     17.06.2003 16:09:34     Odlazak na ispit
    robertl.brojilo.nextval     6129     17.06.2003 14:02:42     17.06.2003 16:23:23     seminar
    But all records were rejected by the Loader, for every record I get the error:
    Record 1: Rejected - Error on table ROBERTL.TBLTEST, column REDBROJUNOS.
    ORA-01722: invalid number
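    A possible way around this (a sketch, not from the original thread): SQL*Loader cannot evaluate "robertl.brojilo.nextval" as data - that is what produces the ORA-01722 above - but in a conventional-path load the control file can populate a column from a SQL expression. Assuming the sequence robertl.brojilo exists, one hedged approach is to skip the sequence text in the data file with FILLER and generate the key via an EXPRESSION clause (the field name seqtext is invented for the example):
    load data
    infile '0testdata.txt'
    into table robertl.tbltest
    fields terminated by X'09'
    trailing nullcols
    -- seqtext is a made-up name for the first tab-delimited field, which is skipped
    (seqtext FILLER,
    broj,
    dolazak,
    odlazak nullif odlazak=blanks,
    komentar nullif komentar=blanks,
    -- not read from the file: the value comes from the Oracle sequence at insert time
    redbrojunos EXPRESSION "robertl.brojilo.nextval")
    Note that SQL expressions like this are not available in direct-path loads.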

  • Data Migration: Sequence Check

    I have a database (target) that I am transferring data to from another database (source).
    I need to write a script to check whether the sequence on the target will be OK; in a sense, I need to check the max value of the target database sequence and see if it is OK to add the new data.
    I can't transfer data over if the sequence numbers overlap - like if I have a sequence # of 200 on both the source and the target, we run into problems. Is this logic correct?
    How do I write this SQL check?
    Any help would be wonderful. I'm a novice.
    Thanks.

    It may be easier to ensure up front that the sequences won't overlap, i.e.
    CREATE SEQUENCE system1_seq
      START WITH 1
      INCREMENT BY 2
      CACHE 100;
    CREATE SEQUENCE system2_seq
      START WITH 2
      INCREMENT BY 2
      CACHE 100;
    This way, one system will generate odd numbers, the other will generate even numbers, and you won't have to worry about overlaps.
    Alternatively, you can add a source id to the primary key on the target system, or you can renumber the rows (obviously fixing the relationships) when the data is migrated.
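    For the check you asked about, a minimal sketch (the names are assumptions for the example: a target table tgt with numeric key id, a sequence tgt_seq feeding it, and a database link source_db_link to the source):
    -- compare the sequence's high-water mark with the highest keys on both sides
    SELECT s.last_number,
           (SELECT MAX(id) FROM tgt)                AS max_target_id,
           (SELECT MAX(id) FROM src@source_db_link) AS max_source_id
    FROM   user_sequences s
    WHERE  s.sequence_name = 'TGT_SEQ';
    It is safe to migrate only if last_number is greater than both maxima (with caching, last_number may run ahead of the values actually used, which errs on the safe side). Otherwise advance the sequence first, for example by selecting tgt_seq.NEXTVAL repeatedly or by recreating it with a higher START WITH.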
    Justin

  • Data loading sequence for 0FIGL_014

    Hi experts,
    Can you explain or briefly describe the data loading sequence for 0FIGL_014?
    Regards.
    Prasad

    Hi,
    Following is my system configuration information
    Software Component   Release   Level   Highest Support Package   Short Description
    SAP_BASIS            620       0058    SAPKB62058                SAP Basis Component
    SAP_ABA              620       0058    SAPKA62058                Cross-Application Component
    SAP_APPL             470       0025    SAPKH47025                Logistics and Accounting
    SAP_HR               470       0030    SAPKE47030                Human Resources
    With the above configuration in R/3, I am not able to find the data source 0FI_GL_14.
    Can you please let me know what package to install and how to install?
    Regards.
    Prasad

  • Master data Load sequence

    We are preparing to load our master data, then our cubes.
    My question is this: is there a way to identify the master data objects that are dependent on the InfoObject to be loaded?
    Example: oprojects should be loaded before owbs_elemt, etc.
    Make sense? We want to make sure all higher-level master data objects are loaded first.
    Is there a way to see these dependencies?

    I think the order might be important in case you are loading from a source system other than R/3, where the checking is not as good as in R/3.
    If it's R/3, I think it should be fine.
    Robert, correct me if I am wrong: what kind of dependency and order do you think is important? Kindly let us know.
    Thank you
    Gokul

  • SAP HR Data Migration Strategy

    Hello Guys,
    I have worked on data migration in areas like SD, MM and FI, but I am new to HR. Can anybody please give me the HR data migration strategy - what data (e.g. master, transactional, etc.) we need to load, and the sequences and methods available for each type?
    Thanks
    Kal

    Hi Chakri,
    The documentation we use is a Data Conversion Blueprint, which is simply the customer's requirements: which infotypes are needed, what data should be migrated, and which org. elements will be imported.
    Another one is the Master Template, which is actually prepared by us (the technical team). We prepare an LSMW for each infotype mentioned in the former document, then we document it in this master template to give the customer the required flat-file format. We prepare it this way because sometimes some fields may be fixed, some added, and some are even read-only and so should be skipped. So this document differs a bit from the previous one in regard to the infotypes' used fields. Then we send this document to our local functional team, which is responsible for providing files in the given format.
    In the first document there are also some golden rules, which describe the exact steps for one entire cycle (starting with delivering the file, ending with validation of the loaded data for one infotype), which each involved party should follow.
    The last but not least important document is what we call the Loading Sequence. I already described it previously: it is just the order of loading the data into the infotypes, which you should follow. It also contains such information as:
    - number of records in the provided file
    - number of errors and successfully loaded records
    - files which we share with the functional team (error logs)
    - finally, files which we send as validation data (extracted from the system after a successful load of an infotype)
    Unfortunately I can't provide you any of them, as these are company documents which are not to be distributed. Sorry, I am not allowed to break this rule.
    Anyhow, I hope it gives you an overview of what kind of documentation you need. It is of course up to your team how you organize your work and documentation.
    All in all, the crucial thing is communication between teams, which from my experience sometimes fails.
    One more thing: always remember to give points to useful answers; that's how the forum works. Of course it is not obligatory!
    Remember, if you get something, you appreciate it by assigning points :)
    Cheers
    Marcin

  • Automating Data Migration from SQL Server 2005

    Hi All,
    I need to migrate data from SQL Server 2005 to an Oracle DB (data warehouse). The data migration involves updating dimension and fact tables. I have the painful job of switching the topology to point to over 100 different weekly SQL DBs (in sequence) and pulling the data into the DW. How can I automate my ODI process to switch from one to the next?
    Any Ideas?
    KS

    ODI variables are there to help you with this. Please go through this post at [http://blogs.oracle.com/dataintegration/2009/05/using_odi_variables_in_topolog.html].
    It will show you how to use a variable Oracle data server.
    Along similar lines you should be able to switch to any of your 100 weekly SQL Server DBs and loop in a sequence to load data from all of them.
    Hope that helps

  • Data Migration examples

    Can you give some examples of data migration? Thanks.

    Hi Gow.
    Difference Between Batch Input and Call Transaction in BDC
    What is the difference between batch input and call transaction in BDC?
    Session method:
    1) Asynchronous processing.
    2) Can transfer large amounts of data.
    3) Processing is slower.
    4) An error log is created.
    5) Data is not updated until the session is processed.
    Call transaction:
    1) Synchronous processing.
    2) Can transfer small amounts of data.
    3) Processing is faster.
    4) Errors need to be handled explicitly.
    5) Data is updated automatically.
    Batch Data Communication (BDC) is the oldest batch interfacing technique that SAP has provided since the early versions of R/3. BDC is not a typical integration tool in the sense that it can only be used for uploading data into R/3, so it is not bi-directional.
    BDC works on the principle of simulating user input for transactional screens via an ABAP program.
    Typically the input comes in the form of a flat file. The ABAP program reads this file and formats the input data screen by screen into an internal table (BDCDATA). The transaction is then started using this internal table as the input and executed in the background.
    With 'Call Transaction', the transactions are triggered at the time of processing itself, so the ABAP program must do the error handling. It can also be used for real-time interfaces and custom error handling and logging features. In Batch Input Sessions, by contrast, the ABAP program creates a session with all the transactional data, and this session can be viewed, scheduled and processed (using transaction SM35) at a later time. The latter technique has a built-in error processing mechanism too.
    Batch Input (BI) programs still use the classical BDC approach but don't require an ABAP program to be written to format the BDCDATA. The user has to format the data using predefined structures and store it in a flat file. The BI program then reads this and invokes the transaction mentioned in the header record of the file.
    Direct Input (DI) programs work in a similar way to BI programs. The only difference is that, instead of processing screens, they validate fields and load the data directly into tables using standard function modules. For this reason, DI programs are much faster (RMDATIND, the material master DI program, works at least 5 times faster) than their BDC counterparts and so are ideally suited for loading large-volume data. DI programs are not available for all application areas.
    Differences between bdc session method and call transaction method.
    The most important aspects of the batch session interface are:
    - Asynchronous processing
    - Transfers data for multiple transactions
    - Synchronous database update: during processing, no transaction is started until the previous transaction has been written to the database
    - A batch input processing log is generated for each session
    - Sessions cannot be generated in parallel
    The most important aspects of the CALL TRANSACTION USING interface are:
    - Synchronous processing
    - Transfers data for a single transaction
    - Synchronous and asynchronous database updating both possible: the program specifies which kind of updating is desired
    - Separate LUW for the transaction: the system performs a database commit immediately before and after the CALL TRANSACTION USING statement
    - No batch input processing log is generated
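    To make the CALL TRANSACTION technique concrete, here is a minimal, hedged ABAP sketch (the transaction code, module pool, screen number and field names are invented for the illustration; they are not from this thread):

    REPORT z_bdc_call_transaction_demo.

    DATA: lt_bdcdata  TYPE TABLE OF bdcdata,
          ls_bdcdata  TYPE bdcdata,
          lt_messages TYPE TABLE OF bdcmsgcoll,
          ls_message  TYPE bdcmsgcoll.

    * Mark the first screen of the transaction (program/screen are invented)
    CLEAR ls_bdcdata.
    ls_bdcdata-program  = 'SAPMZDEMO'.
    ls_bdcdata-dynpro   = '0100'.
    ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata.

    * Simulate user input for one field on that screen
    CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'ZDEMO-VALUE'.
    ls_bdcdata-fval = 'TEST'.
    APPEND ls_bdcdata TO lt_bdcdata.

    * Synchronous processing: control returns only after the transaction has run,
    * so the calling program must handle errors itself
    CALL TRANSACTION 'ZDEM' USING lt_bdcdata
         MODE   'N'                  "no screen display
         UPDATE 'S'                  "synchronous database update
         MESSAGES INTO lt_messages.

    IF sy-subrc <> 0.
    * Explicit error handling, as described in the list above
      LOOP AT lt_messages INTO ls_message WHERE msgtyp CA 'EA'.
        WRITE: / ls_message-msgid, ls_message-msgnr, ls_message-msgv1.
      ENDLOOP.
    ENDIF.

    With the session method, the same BDCDATA table would instead be handed to BDC_OPEN_GROUP / BDC_INSERT / BDC_CLOSE_GROUP and processed later via SM35.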
    Explain in detail with example what is batch input session?
    Batch Input Session:
    - It is a sequence of transactions which is generated when a user runs a particular program.
    - It contains the accounting documents that are to be created. The SAP system stores these transactions until you decide to process them online.
    - It does not update transaction figures until the session has been processed. Using this technique, you can transfer large amounts of data to the SAP system in a short time.
    Three processing modes for executing a batch input session:
    (1) Run visibly: you can correct faulty transactions online and work step by step through the transactions not yet executed.
    (2) Display errors only: you can correct faulty transactions online. Transactions not yet executed, but without errors, run in the background.
    (3) Run in background: recommended by SAP.
    Kindly reward points if you found the reply useful,
    Cheers,
    Chaitanya.

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What is the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open purchase orders there are in the legacy system, and then decide between manual and automatic data migration. If manual, how should it be done? If automatic, how? Because all the material, vendor and plant numbers are different, how do we track them? How do we match the new numbers to the old ones?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy System Migration Workbench) to load MM master data (material masters), inventory (WIP, RM, FG, etc.), purchasing master data (vendors, purchase info records, source lists, quota arrangements), and purchasing transactional documents (POs, purchase requisitions, scheduling agreements, etc.). Depending on the complexity and volume of data, it may be necessary to write custom programs to load the data. You will find this out during your requirements gathering.
    It is uncommon but possible to load all of these data manually. I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal pecking away loading master data, so if the client intends to have his own users enter the data manually, the project manager should make provision for qualified, TRAINED client employees to be available for this data entry. I did once help manually with a portion of a conversion, of sales credits, but there were only about 30 SD docs to load. I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data. For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system). Most of us do not understand the workings of the myriad legacy systems. The client is usually expected to produce one or more legacy-system technical experts for you to liaise with. You normally negotiate with the technical expert about every facet of the data migration. In addition, you will liaise with business users, who will help you and the implementation team to logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned tracking the mapping of master data between legacy and SAP. There are many ways to do this. I normally try to get the legacy person to do the conversion on his end, e.g. when he gives you the load file, the master data should already have been translated and the SAP-relevant values inserted into the file. If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC. If your data package is small, you can probably get by with MS Excel or similar.
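    To illustrate the master-map idea, a sketch only (the table and column names are invented; Oracle-style SQL, though the same shape works in MS Access):

    -- hypothetical mapping table: one row per legacy key
    CREATE TABLE vendor_map (
      legacy_vendor VARCHAR2(20) PRIMARY KEY,  -- vendor number in the legacy system
      sap_vendor    VARCHAR2(10) NOT NULL      -- vendor number (LIFNR) assigned in SAP
    );

    -- translate a legacy extract into an SAP-ready load file
    -- (legacy_po_extract and its columns are likewise invented for the example)
    SELECT m.sap_vendor, e.po_number, e.po_value
    FROM   legacy_po_extract e
    JOIN   vendor_map m
      ON   m.legacy_vendor = e.vendor_number;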
    Good Luck,
    DB49

  • Data Migration of FI-CA Cleared Items

    Hi,
    Currently I am working on a data migration project to migrate data from our FI-CA module on ERP 4.7 to ECC 6.0.
    There has been a request from the business to migrate historical data (e.g. cleared items).
    Is there an SAP-recommended approach or tools to load this data into the target environment?
    Currently all documentation around SAP data migrations talks about strategies for open-item data migration; I have seen nothing around migrating historical financial data.
    Is this because it is not recommended, or technically impossible?
    Regards
    Adam Gunn

    That BAPI is typically used for straight vanilla GL, AR and AP postings; however, you still have to create the other side of the entry and then clear it.
    I need to be able to migrate full history, which means, from an FI-CA viewpoint:
    1. Migrate FI-CA posting of liability against BP/contract account.
    2. Migrate associated payments.
    3. And then clearing documents.
    Basically the requirement is to represent the historical data in the new system as if it was posted and matched off.
    Is there a technical way to do this?
    OR,
    Do you migrate the FI-CA liabilities, then the associated payments, and then run clearing on the target system?
    I suspect this is an almost impossible data migration requirement, as development of the extraction and load process would be extremely complex, and testing would take months to cover all posting scenarios in FI-CA. However, I would be interested to hear whether anyone has attempted this before.
    Adam

  • Data Migration from SAP r/3 to SAP R/3

    Hi all,
    What is the best method to migrate data from one version of SAP to another, say from SAP 3.1 to SAP ECC 6.0?
    This is for all SAP modules, including SD, MM, PP, FI and CO master and transaction data. I know there are a number of technologies to load the data, such as LSMW, IDoc, ALE, DI, etc., but what is the best way to extract all the data feeds to be loaded back into SAP?
    Thanks in advance.

    Take a look at the following link; it may be useful for you:
    SAP NetWeaver - Data Migration (CA-DMI) [original link is broken]
    If helpful, reward points are appreciated.

  • Data Migration from Legacy System in IS-U

    Hi All,
    We are going to implement a new IS-U project, and I have a problem with LSMW (Legacy System Migration Workbench). I have some IS-U conversion objects and I need to know whether data migration is possible for them. Please tell me how to find these objects and how to know whether data migration is possible.
    The objects include:
    1. Accounts
    2. Actuate Reports
    3. Business Partner Relationships
    4. Active Campaigns/Campaign Content/Dispositions
    5. Connection Object
    6. Contacts
    7. Contracts
    8. Opportunities
    9. Payment Arrangement History
    10. Payments
    11. Premises
    12. Rate Changes
    13. Security Deposits
    These are a few, and there are some more.
    Thanks in Advance,
    Sai.

    Hi Ram,
    Use transaction code EMIGALL. It will ask for a company code; by default the company code is SAP. If you enter with SAP you will get all the objects. Then go to the menu IS-U Migration --> User Handbook. It will give you a detailed idea.
    Also check the following procedure.
    You can find detailed documentation on EMIGALL in SAP itself. Use transaction EQ81 to display it. It provides all the concepts and procedures for working with EMIGALL.
    Here are some points about EMIGALL :
    1. It migrates data business-object-wise.
    2. It uses the direct input technique.
    3. It has more than 100 IS-U objects.
    And the steps for implementation go like this:
    1) You have to create a user specifically for migration, with all the authorizations related to the migration workbench, BASIS and IS-U.
    2) You have to create your own company in EMIGALL. There is a default company called SAP.
    3) Company SAP contains all the business objects.
    4) You have to figure out which business objects you need and then copy those business objects to your company from the standard company SAP.
    5) Each object contains more than one structure, and each structure can contain more than one field. The relation goes like this:
    Object ---> Structure ---> Field
    6) You have to define field rules for each required field of the object. You have to mark "Not required" for fields you don't need.
    7) After the field rules for a given object are set, you have to generate the load report, i.e. the actual direct input program which will migrate the data. This program is generated on the basis of the field rules you set.
    8) After the load report is generated, you have to prepare an input file (import file) for the migration. The import file should follow the structure provided by SAP and must be in binary format. SAP provides the structure of the file according to your configuration. You have to write your own data conversion program (in any language) for this task.
    9) You take the import file as input and migrate the data using the generated load program.
    10) Finally, you can check the migration statistics and the error log.
    Thanks and Regards,
    Jyotishankar Dutta
