Legacy Data Extraction: iWay Adapters

Back in September at Tech Ed Vegas, I attended an XI presentation during which it was stated that it is not a good idea to use XI for first-time data load into SAP.
Does everyone agree with this statement?  Is it better to use the LSMW?
What about using XI for extraction of data from a legacy system (Lawson)?
Has anyone used the iWay adapter for Lawson (or any app such as PeopleSoft) in conjunction with XI to perform data extraction? Is that even possible?
Or should we just forget about it and use Informatica?

Hi,
"Does everyone agree with this statement? Is it better to use the LSMW?"
>>> It is generally advisable not to use XI for the initial data load. You can use a dedicated tool, LSMW, or standard programs instead.
The problem we faced during the initial upload was the huge volume of data.
For data extraction we can use XI with third-party adapters.
This blog will help you:
/people/yomeshp.sharma/blog/2006/06/01/integrating-jdedwards-system-with-xi-using-iway-adapter-part--i
Regards,
Moorthy

Similar Messages

  • Data Extraction template

    Hi,
    In my current project there is a requirement for data migration from a legacy system. Can anyone please help me by providing the data extraction template for vendor and customer open line items, G/L balances, and the bank directory?
       Your help in this regard is highly appreciated
    Thanks
    Rajesh. R

    Hi Rajesh,
    When you extract the data from the legacy system, the points you should keep in mind are:
    1. How the organisation structure in the legacy system is mapped to the SAP system, because the data uploaded into SAP must also support the reports expected from SAP.
    There is no standard layout used for extraction; you need to make sure that you extract all the information from the legacy system that needs to be uploaded into SAP.
    As an example, fields you may include in the extraction layout are (a sketch of such a layout follows this reply):
    Vendor account, Document date, Posting date, Document type, Company code, Amount in document currency and in local currency, etc.
    2. Please keep in mind that some G/L accounts are maintained in SAP on an open-item basis. For those, your extraction should happen transaction by transaction.
    3. Certain G/L accounts are maintained in foreign currency, like bank G/L accounts. In such cases you need to extract the balance in foreign currency as well.
    My suggestion is to think through the process first and then go ahead with the extraction.
    Hope this helps
    Regards
    Paul
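    For illustration only, here is a minimal ABAP sketch of the kind of extraction layout Paul describes for vendor open items. The structure and field lengths are assumptions for the example, not a standard SAP template:
    " Hypothetical flat-file layout for vendor open line items.
    " Field names follow the matching SAP fields; align lengths with your LSMW target.
    TYPES: BEGIN OF ty_vendor_open_item,
             lifnr TYPE c LENGTH 10,  " Vendor account
             bldat TYPE c LENGTH 8,   " Document date (YYYYMMDD)
             budat TYPE c LENGTH 8,   " Posting date (YYYYMMDD)
             blart TYPE c LENGTH 2,   " Document type
             bukrs TYPE c LENGTH 4,   " Company code
             wrbtr TYPE c LENGTH 16,  " Amount in document currency
             dmbtr TYPE c LENGTH 16,  " Amount in local currency
             waers TYPE c LENGTH 5,   " Document currency key
           END OF ty_vendor_open_item.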

  • Asset Tables for legacy data takeover

    Dear Friends
    We have to take asset data from one client to another and hence will be using legacy assets.
    Could you please let me know from which tables/reports I should extract the master and transaction data from the legacy system, which is also on SAP.
    ANLA - master data, taking only those assets that are not deactivated
    ANLB
    ANLC - accumulated asset APC and depreciation, plus current-year depreciation
    ANLZ - depreciation terms data
    ANEP - asset transaction data for current-year acquisitions
    I don't have the system in front of me; I hope the above tables are correct. (A selection sketch based on these tables appears at the end of this thread.)
    1. Please let me know if other tables are required.
    2. Should the depreciation area data be posted with remaining useful life and expired useful life in the legacy asset?
    Regards
    Kapil

    Hello Kapil,
    I invite you to carefully review the attached SAP notes, which should
    provide you with the necessary answers:
    68802   Legacy data trnsfr: diff.fields n.ready f.input
    550176  FAQ note legacy data transfer asset master records
    373894  Collective note: introdctn to prblm solutn in FI-AA
    I also attach the following six notes, which provide further
    details regarding legacy transfer, some of them specific
    to legacy transfer during the year:
    729164  AS91/AS92: Incorrect ready for input status for prop. values
    29706   AS91/92/94: Dep. area fields not ready for input
    4206    Net book value input at old data transfer
    50607   Depreciation during old assets data takeover
    26240   Reconciliation of posted depreciat. after takeover
    For further information, please review the information for asset data
    transfer in the R/3 Library:
    FI-AA - Assets -> Legacy Data Transfer -> Special Considerations for
    Asset Data Transfer -> Time of Transfer...
    thanks and regards
    Ray
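    For reference, here is a minimal ABAP sketch of the master-data selection Kapil describes, using the tables named in the question. The deactivation check on ANLA-DEACT and the placeholder company code are assumptions to illustrate the approach, not a complete takeover program:
    " Sketch: read active asset master records and their value fields.
    DATA: lt_anla TYPE STANDARD TABLE OF anla,
          lt_anlc TYPE STANDARD TABLE OF anlc.
    SELECT * FROM anla
      INTO TABLE lt_anla
      WHERE bukrs = '1000'           " placeholder company code
        AND deact = '00000000'.      " only assets not yet deactivated
    IF lt_anla IS NOT INITIAL.
      " Accumulated APC, accumulated depreciation and current-year values
      SELECT * FROM anlc
        INTO TABLE lt_anlc
        FOR ALL ENTRIES IN lt_anla
        WHERE bukrs = lt_anla-bukrs
          AND anln1 = lt_anla-anln1
          AND anln2 = lt_anla-anln2.
    ENDIF.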

  • Data extraction / Conversion / Mapping / Miration

    Hi All,
    Can anyone please explain the meaning of the terms below and how data is moved from a non-SAP system to an SAP system?
    "Data extraction / Conversion / Mapping / Migration"
    Kindly give me detailed step-by-step instructions on how to accomplish this, and please forward any documentation you have.
    I appreciate the help in advance.
    Raj

    Hi Raj,
    We can use LSMW, BDC, or eCATT to upload data from a legacy system to SAP.
    Data extraction - the data is extracted from the legacy system into a flat file, e.g. XLS.
    Conversion - the data is converted to the format SAP expects (a small sketch follows this reply).
    Mapping - each extracted field is mapped to the corresponding SAP field for the upload.
    Migration - the data is migrated from legacy to SAP using a tool such as LSMW, BDC, or CATT.
    Good Luck
    Om
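    As a small illustration of the conversion step Om mentions, here is a hedged ABAP sketch converting a hypothetical legacy date format (MM/DD/YYYY) into SAP's internal YYYYMMDD format; the variable names are invented for the example:
    " Sketch: convert a legacy date string 'MM/DD/YYYY' to SAP internal format.
    DATA: lv_legacy_date TYPE c LENGTH 10 VALUE '12/31/2005',
          lv_sap_date    TYPE c LENGTH 8.   " YYYYMMDD, assignable to a TYPE d field
    CONCATENATE lv_legacy_date+6(4)   " year
                lv_legacy_date+0(2)   " month
                lv_legacy_date+3(2)   " day
           INTO lv_sap_date.
    " lv_sap_date now holds '20051231'.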

  • Data Extraction or IDoc to flat file

    hi,
    I have a project to create a flat file from SAP for an external legacy system.
    What approach should I take: simple data extraction or IDoc to flat file?
    There are three requirements:
    1. on the first run, extract all data.
    2. on subsequent runs, extract only changed and new records.
    3. if a record is deleted in the SAP table, mark it as deleted in the flat file.
    What approach should I take if I use simple data extraction?
    Thanks.

    Reading your question, my first thought is to look at where the data is going.
    What are the data requirements of the legacy system? IDocs can speed up the development needed to push the data out of SAP: using ALE and change pointers you can automatically pass out the delta with a limited amount of development.
    However, the receiving system then needs to parse the IDoc data. Depending on the IDoc you are working with, this can be a challenge, especially if the legacy developer doesn't understand IDocs.
    Sometimes it's easier to collect and write the data from SAP using "simple data extraction": the data is more readily organized into the format the receiving system is expecting.
    You can also pass the IDoc to a middleware mapping application, if one is available, and do the SAP-to-legacy mapping there.
    Cheers
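    If the simple-extraction route is chosen, a common pattern for requirements 2 and 3 is to compare the current table snapshot against the previously extracted one. Below is a minimal ABAP sketch under that assumption; the record structure and the I/U/D flag convention are hypothetical, not an SAP standard:
    " Sketch: derive a delta by comparing current and previous snapshots.
    TYPES: BEGIN OF ty_rec,
             matnr TYPE c LENGTH 18,  " key field
             maktx TYPE c LENGTH 40,  " payload field
             flag  TYPE c LENGTH 1,   " I = new, U = changed, D = deleted
           END OF ty_rec.
    DATA: lt_current  TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY matnr,
          lt_previous TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY matnr,
          lt_delta    TYPE STANDARD TABLE OF ty_rec,
          ls_cur      TYPE ty_rec,
          ls_prev     TYPE ty_rec.
    " New or changed records
    LOOP AT lt_current INTO ls_cur.
      READ TABLE lt_previous INTO ls_prev WITH TABLE KEY matnr = ls_cur-matnr.
      IF sy-subrc <> 0.
        ls_cur-flag = 'I'.            " not in previous snapshot: new
        APPEND ls_cur TO lt_delta.
      ELSEIF ls_prev-maktx <> ls_cur-maktx.
        ls_cur-flag = 'U'.            " contents differ: changed
        APPEND ls_cur TO lt_delta.
      ENDIF.
    ENDLOOP.
    " Deleted records: present in the previous snapshot, gone now
    LOOP AT lt_previous INTO ls_prev.
      READ TABLE lt_current TRANSPORTING NO FIELDS
           WITH TABLE KEY matnr = ls_prev-matnr.
      IF sy-subrc <> 0.
        ls_prev-flag = 'D'.
        APPEND ls_prev TO lt_delta.
      ENDIF.
    ENDLOOP.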

  • Data extraction from Oracle database

    Hello all,
    I have to extract data from legacy database tables. I need to apply a lot of conditions to the extraction using SQL statements, to get only valid master data and transaction data, the SAP date format, etc. Unfortunately I don't have the luxury of accessing the legacy system's database tables to create table views and apply select statements.
    Is there any other way I can filter the data on the source system side without getting into the legacy system, i.e. from the BW DataSource side?
    I am supposed to test both UD Connect and DB Connect to see which works out better for data extraction, but my question above is the same for either interface.
    This is very urgent as we are in design phase.
    Points will be rewarded immediately.
    Thanks

    Well, I and everyone else know that it can be done in BI.
    I apologize that I did not mention that in my question.
    I am looking for a very specific answer: is there any trick we can apply on the source system side from BI, or anywhere we can insert SQL statements in the InfoPackage or DataSource?
    Thanks

  • Decision to build BW data model with legacy data

    Hi experts,
    I was assigned the task of deciding how to build a new data warehouse given the following scenarios:
    Option 1:
    1. SAP BW + legacy data (separate databases)
    Option 2:
    2. SAP BW + legacy data integrated into BW
    Please let me know the pros and cons of integrating the legacy data into BW versus reporting on two separate databases using BOBJ. The client is running SAP ECC 6.0; SAP data is around 60% and the remainder is legacy.
    My question is which option is better, and why.
    Regards
    prasad

    Hi Prasad,
    As mentioned, there is no need to combine legacy data and BW data in BOBJ; you can extract directly into BI from both R/3 and the legacy system. But please check the following points with the DB people.
    1. As per the requirement, how many tables are you using to extract the data?
    2. You have three source tables but no unique records. In this case you first need to identify the common key field.
    Note: that key field should also be the same as in R/3.
    Based on the three source tables, ask the DB team to maintain a separate staging table (which should collect only the common records from the three tables).
    3. Now you can load the data into BI. The above approach will resolve the issue.
    Issues and concerns:
    1. If you extract legacy and BI data to BOBJ, how are you going to combine the data in BOBJ to generate a global report?
    2. Is it possible for the DB team to maintain a separate table in the database?
    Check with the DB team before proceeding further.
    Regards
    Ram.

  • Removing the carriage return / Tab during data extraction

    Hi ,
    I have an issue removing carriage returns and tabs from data extracted to a flat file from the SAP EKPO table.
    1) When the legacy data was uploaded to the SAP tables long ago, carriage returns and tabs in the product descriptions were uploaded into the SAP tables as well.
    2) Now I am extracting data from BKPF, EKPO and other tables to a flat file.
    3) When I extract the data (PO number, product description, ...), the product description (EKPO-TXZ01) spans 3 to 5 lines depending on the carriage returns and tabs it contains.
    Ex : Mfg # DS104NA / NET######GEAR DS104 Hub######model
    I want this product description on one continuous line. That is possible only by removing the carriage returns and tabs from the product description before concatenating and TRANSFERring the file to the server.
    4) What I am getting: the line breaks at GEAR and at Hub due to the carriage returns/tabs:
    Mfg  D  NET
    GEAR DS104 Hub
    model
    5) What is required, on one line:
    Mfg  D  NETGEAR DS104 Hub model
    In the process I am extracting the product description from EKPO-TXZ01 into an internal table, and that is where I want to remove the carriage returns/tabs.
    How can these be removed?
    Thanks,
    Vasu

    Dear Vasu,
    Sometimes there is a problem that cannot be solved the normal way. This is one of them.
    The reason the usual methods do not work is that you actually need to remove two characters, CR and LF, in one stroke: you can identify one but then not the other. So approach it from the other side.
    Simply define what your correct characters are AND export only those.
    Below you will find a FORM routine that does just that (it passes only the defined valid characters and drops all others).
    FORM validate_field USING    VALUE(field_in)
                        CHANGING field_out.
      CONSTANTS:
*   All lower-case characters
        validlow(26)  TYPE c VALUE 'abcdefghijklmnopqrstuvwxyz',
*   All upper-case characters and all digits
        validhigh(36) TYPE c VALUE '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ',
*   All other valid characters (starting with space)
        validrest(32) TYPE c VALUE ' <,>.?/:;"{[}]|\~`!@#$%^&*()_-+=',
*   Single quote
        validquote           VALUE ''''.
      DATA:
        pos(1) TYPE c,         " Container to hold 1 character
        l      TYPE i,         " Length
        i      TYPE i.         " Indexer
*   Determine length
      l = STRLEN( field_in ).
*   Stripping loop
      WHILE i LT l.
*     Get one character
        pos = field_in+i(1).
*     Check its validity
        IF pos CO validlow
        OR pos CO validhigh
        OR pos CO validrest
        OR pos EQ validquote.
*       When valid, append it to the output
          CONCATENATE field_out
                      pos
                 INTO field_out.
        ENDIF.
*     Advance to the next character
        i = i + 1.
      ENDWHILE.
    ENDFORM.
    Hope this helps solve your problem.
    Regards,
    Rob.
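    As an aside: on Unicode systems a shorter alternative is to replace the control characters directly via the standard class cl_abap_char_utilities. A minimal sketch, assuming the description has been read into a work area ls_ekpo:
    " Sketch: strip CR/LF and tabs from a product description (Unicode systems).
    DATA: ls_ekpo TYPE ekpo,
          lv_text TYPE string.
    lv_text = ls_ekpo-txz01.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf          IN lv_text WITH ` `.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline        IN lv_text WITH ` `.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN lv_text WITH ` `.
    CONDENSE lv_text.   " collapse runs of blanks into single spaces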

  • BODS 3.1: SAP R/3 data extraction - what is the difference between the 2 dataflows?

    Hi.
    Can anyone advise on the difference between the two dataflow types for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP dataflow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the 'execute preloaded' option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2 >> SAP R/3 table (this has a red arrow, like in OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    It can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two dataflows.
    What we are not understanding is this:
    if we can pull data directly from the R/3 table through a query transformation into the target table,
    why use the flow that creates an R/3 dataflow,
    does a query transformation again,
    and then populates the target database?
    There might be practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For that we use an R/3 dataflow and the generated ABAP program, which we upload to the R/3 system
    so as to be able to use the 'execute preloaded' option when running the jobs.
    Since we have no control over our R/3 servers, nor anyone for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we created a separate dataflow.
    In some of the dataflows, where the SAP tables had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow.
    The entire workflow and data extraction still run fine.
    In fact I created a new sample dataflow and tested it
    using both direct download and 'execute preloaded'.
    I did not see any major difference in the time taken for data extraction,
    because either way we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate the data and expose it as a table for universe consumption.
    Actually, I was looking at options to avoid the ABAP generation and the R/3 dataflow because we are having problems in our dev and QA environments: delimiter errors. In production it works fine. Production is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones showing the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    Trying to resolve this problem, I ended up pulling the R/3 table directly, without the ABAP workflow, just by trial and error with each drag-and-drop option, because we had to urgently do a POC and deliver the data for the entire e-recruiting module of SAP.
    I don't know whether I can use this direct pulling of data for the new job I have created,
    which has 2 workflows with 15 dataflows in each workflow,
    and push this job into production.
    Nor do I know whether I can bypass the ABAP flow and pull R/3 data directly in all dataflows for ANY future SAP R/3 data extraction requirement. The technical difference between the two flows is not clear to us, and being new to the whole of ETL, I just wanted to understand the pros and cons of this particular kind of extraction.
    As advised I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu

  • Asset Legacy data transfer reconciliation with GL

    We have recently merged a deactivated company code into our operating company and transferred all finance processes, open items, master data, etc.
    We transferred the assets using legacy data transfer (AB91); we did not use intercompany asset transfer (ABT1N).
    We then transferred the trial balance.
    The asset balances reconcile against the G/L accounts for the transferred assets when executing S_ALR_87011964 and FS10N.
    However, there is a difference in ABST2, which equals the value of the migrated assets.
    I would expect there to be no difference between FI and AA after a legacy data transfer.
    I have also run ABST for all affected G/L accounts, and that report shows no difference.
    Note that the company code migration occurred in January 2008; there is no effect on 2007 year-end data.

    Hi
    You have to take over the G/L balances using transaction OASV.
    regards
    Sibichan

  • Open data extraction orders - applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While applying the support packages, I get a message in the CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system.
    Process these before the start of the object import, because changes to the ABAP Dictionary structures could lead to data extraction orders no longer being readable after the import and their processing terminating.
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the note, but this is something I am facing for the first time.
    Any suggestions?
    Rgds,
    NK

    The exact message is :
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • Bulk API V2.0 data extract support for additional objects (Campaign, Email, Form, FormData, LandingPage)?

    allison.moore
    Are there any plans to add the following objects to Bulk API V2.0 for data extraction from Eloqua: Campaign, Email, Form, FormData, LandingPage? Extracting the data for these objects using the REST API makes it complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=Complete poses a lot of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advice is required on the following:
    1. A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain then first clears all the data in the ODS and cube and then reloads, activates, etc.
    2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period, not all the data as before.
    3. Data must be appended in future.
    4. The current InfoPackage for the ODS is a full upload.
    5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. Users will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    -->> Try to load the data into the ODS with a full update in overwrite mode, as before (this adds new records and updates existing records with the latest values). Then push the delta from this ODS to the cube.
    If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historical data.
    Hope it Helps
    Srini

  • Legacy data upload for Training & Event Management

    As per the requirement, the legacy/history data for training events is to be uploaded into the system. To start the process, data templates were prepared for uploading the training objects D (Business Event Group), L (Business Event Type) and E (Business Event), and these were uploaded. The relationships A003 between D & L and A020 between E & D were also maintained. As part of the training history, the attendees were maintained (uploaded) as well, using the relationship A025 between the E & P objects. However, when we go to PSV2, the tree structure does not reflect the names of the persons who attended the events.
    Please suggest and advise.

    Hi Nitin,
    I have a query regarding the legacy data: how did you structure the historical BEG and BET?
    Did you create different BEGs for the current and the legacy data?
    In that case there could be BETs common to both the current and the legacy data, with different object IDs.
    For example, under the current BEG and the historical BEG I may have a BET 'Technical Programmes' with different IDs, say 60000000 and 60987655. This may create a problem when running reports.
    Please tell me the strategy you used for creating the catalog.
    Regards
    Yashika Kaka

  • FI data extraction help

    Hi All,
    I have gone through the SDN link for FI extraction and found it very useful.
    Still, I have some doubts.
    For line-item data extraction: do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into the ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain transformations between the ODS and all three DataSources?
    Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.

    >
    Jacob Jansen wrote:
    > Hi Raj.
    >
    > Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
    >
    > br
    > jacob
    Not necessarily - for systems above Plug-In 2002:
    As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
    Source:http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm
