LMS - Legacy training data load

We are going to implement SAP LMS soon and are currently in the planning phase. I have one question: what is the best way to load Courses (Business Events, object type E) into the system? We have a lot of training history in the legacy system. The L-D relationship is easy to load via DTT/LSMW. What is the best way to load object E for each Business Event Type (D), along with the persons enrolled in each respective E?
Thanks for your help
Sanghamitra
Edited by: Sanghamitra on Feb 19, 2009 1:21 PM

Hi,
You can use an LSMW for transaction PV10/PV11 to create the business events.
To book attendees, it is better and more convenient to create an LSMW for transaction PV08 rather than PV00 or PV07, because in those two you cannot specify the event (E) ID, which can cause confusion if there are two events with similar dates under that event type.
Hope this helps.
Shreyasi.
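The LSMW input for the booking load is typically a delimited flat file with one row per booking. A minimal sketch of generating such a file; note that the column names and layout here are hypothetical placeholders, not the actual fields of a PV08 recording:

```python
import csv
import io

def build_booking_file(bookings):
    """Render booking rows as a tab-delimited flat file for an LSMW load.

    Each booking is a dict; the columns below are illustrative stand-ins
    for whatever fields the PV08 recording actually expects.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(["EVENT_ID", "PERSON_ID", "BOOKING_DATE"])  # header row
    for b in bookings:
        writer.writerow([b["event_id"], b["person_id"], b["booking_date"]])
    return buf.getvalue()
```

In the real load you would write this string to a file and point the LSMW project's source structure at it.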

Similar Messages

  • Data load from Legacy system to BW Server through BAPI

    Requirements: We have different kinds of legacy systems and an SAP BW server. We want to load all legacy system data into the SAP BW server using BAPIs. Before loading, we have to validate all data. If there is bad or missing data, we have to let the legacy system user/operator know to fix the data in their system, with a detailed explanation. When it is fixed, we have to load the data again.
    Load scenario: We have two options to load data from the legacy systems to the BW server.
    1.     Load data directly from the legacy system to the BW server using a BAPI program.
    2.     Legacy system data would be on workstations or a flash drive as .txt (one line, comma-separated) or .csv files. We need to load from the .txt/.csv file to the BW server using a BAPI program.
    What do we want in the BAPI program code?
    It should read data from the text/csv file and put it into an internal table. The internal table structure would be based on the BAPI InfoObject structure.
    It should call the BAPI function module 'BAPI_IOBJ_CREATE' to create the InfoObject, include all necessary/default components, do the error check, load the data, and return the status.
    Could someone help me with sample code, please? I am new to ABAP/BAPI coding.
    Is there any other, better way to load data from the legacy system to the BW server? BTW, we are using BW 3.10. Is there a better option with BI 7.0 to resolve the issue? I appreciate your help.

    My answers:
    1. This is a scenario for a data push into SAP BW. You can only use SOAP-based transfer of data.
    http://help.sap.com/saphelp_nw04/helpdata/en/fd/8012403dbedd5fe10000000a155106/frameset.htm
    (here for BW 3.5, but you'll find similar for 7.0)
    In this scenario you'll have an RFC dynamically created for every InfoSource you need to transfer data for.
    2. You can make a process chain for each data load, and call the RFC "RSPC_API_CHAIN_START" to start the chain externally.
    The second solution is simpler and available on every release.
    Regards,
    Sergio
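    The validation step the question asks for (reject bad rows and tell the legacy operator exactly what to fix) can be sketched independently of the transfer mechanism. A minimal example, assuming a comma-separated extract; the required field names are hypothetical:

```python
import csv
import io

REQUIRED_FIELDS = ["material", "plant", "quantity"]  # illustrative field list

def validate_extract(text):
    """Split a CSV extract into good rows and per-row error messages.

    The error messages are meant to go back to the legacy operator,
    who fixes the source data and re-sends the file.
    """
    good, errors = [], []
    reader = csv.DictReader(io.StringIO(text))
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        missing = [f for f in REQUIRED_FIELDS if not (row.get(f) or "").strip()]
        if missing:
            errors.append(f"line {lineno}: missing value(s) for {', '.join(missing)}")
        else:
            good.append(row)
    return good, errors
```

Only the `good` rows would then be passed on to the BW load; the `errors` list becomes the detailed explanation for the operator.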

  • Training on CalcScripts, Reporting Scripts, MaxL and Data Loading

    Hi All
    I am new to this forum. I am looking for someone who can train me on topics like CalcScripts, Reporting Scripts, MaxL and Data Loading.
    I am willing to pay for your time. Please let me know.
    Thanks

    Hi Friend,
    As you seem to be new to Essbase, you should first learn what Essbase and OLAP are, and the difference between Dense and Sparse; then use the Essbase Tech Ref for further reference.
    After that, go through
    https://blogs.oracle.com/HyperionPlanning/ and start exploring CalcScripts, MaxL, etc.
    And all this is free for you...
    Thanks,
    Avneet

  • Steps to prepare and upload legacy master data excel files into SAP?

    Hi abap experts,
    We have a brand-new installed ECC system, somewhat configured but with no master or transaction data loaded; it is a new, empty system. We also have some legacy data in Excel files. We want to start loading some data into the SAP sandbox step by step and see how it works: test some transactions, see if the loaded data is good, and run other initial tests.
    Few questions here are raised:
    - Can someone tell me what the process of loading this data into the SAP system is?
    - Must this Excel file be reworked/prepared somehow (fields, columns, etc.) in order to be ready for upload to SAP?
    - Users asked me how to prepare their legacy Excel files so they are ready in SAP format for upload. Is this an ABAPer's job or a functional consultant's job?
    - Or should the Excel files be converted to .txt files and then imported into SAP? Does it really make a difference whether the files are in Excel or .txt format?
    - Should the ABAPer determine the structure of those Excel files (to be ready for upload), and if so, what are the technical rules here?
    - What tools should be used for these initial data loads? CATT, LSMW, batch input, or something else?
    - At which point should we test the data? I guess after the initial load?
    - What tools are used in all the steps before?
    - If someone could provide me with a step-by-step scenario or guide for loading some initial master data, from .xls file alignment to the actual upload, that would be great.
    You can email me an upload guide or some Excel/txt example files and screenshot documents to practice with.
    Your help is appreciated!
    Jon

    hi,
    excel sheet uploading:
    http://www.sap-img.com/abap/upload-direct-excel.htm
    http://www.sap-img.com/abap/excel_upload_alternative-kcd-excel-ole-to-int-convert.htm
    http://www.sapdevelopment.co.uk/file/file_upexcel.htm
    http://www.sapdevelopment.co.uk/ms/mshome.htm
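    In practice the users' Excel sheets are often saved as CSV first and then converted to the tab-delimited .txt layout most upload programs expect. A minimal conversion sketch, assuming a simple comma-separated export:

```python
import csv
import io

def csv_to_tab(csv_text):
    """Convert comma-separated text to tab-delimited text, row by row.

    Quote-aware: a field like "Smith, John" stays one field after conversion.
    """
    reader = csv.reader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in reader:
        writer.writerow(row)
    return out.getvalue()
```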

  • Transactional data loads PIR, IM stock, Open PO's documentation

    I have to document the process of transactional data loads:
    purchase info records, IM stock, and open POs.
    The transactional data is live in both SAP and the legacy system, so it has to be kept in sync in both systems at all stages of this process.
    How do I maintain that? That is the question.
    Please send me any details regarding this.
    thank you
    sridhar

    Check these things:
    /n/sapapo/CCR
    /n/sapapo/CQ
    Check which stock type is active in IM, and which stock type was posted after you created the GR.
    Still if you have any problem let us know.
    My

  • Data Loader: Import creating duplicate records?

    Hi all,
    has anyone else encountered the behavior with Oracle Data Loader where duplicate records are created (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Very strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know Data Loader has two methods, one for update and the other for import; however, I do not expect the import to create duplicates if the record already exists. Rather, it should do nothing!
    Is anyone else experiencing the same? I hope this is not expected behavior! By the way, the "Update" method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Sample SOAP request for Data Loader API

    Hi
    Can anyone please help me by giving a sample SOAP request for the Data Loader API? This is to import, say, 1K records from my system into the CRM instance I have.

    Log into the application and click on Training and Support; there is a WS Library of Information within the application.

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters, and to use the command line to reference the properties file, thereby creating a reusable library of files to import or overwrite numerous record types.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab> Popular Resources> What's New> Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (using the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow the queueing up of objects of the same type (you have to wait for "account" to finish before you load the next "account" file), you can monitor the .LOG file for the SBL 0363 error (which means you cannot submit another file yet, typically because one is already being processed).
    By monitoring for this error code in the log, you can sleep your process, then try again after a preset amount of time.
    We use this to allow an UPDATE followed by an INSERT on the account, and then a similar technique so "dependent" objects wait for the prime object to finish processing.
    P.S. Normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix.
    I hope that helps some.
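    The wait-and-retry loop described above can be sketched in a few lines. This is a generic sketch: the SBL 0363 code comes from the post, while the polling logic, function names, and parameters are illustrative:

```python
import time

BUSY_CODE = "SBL 0363"  # "cannot submit another file yet", per the post

def wait_until_free(read_log, attempts=10, delay=60, sleep=time.sleep):
    """Poll the loader log until the busy code disappears.

    read_log is a callable returning the current log text, so the function
    can be exercised without a real Data Loader installation.
    Returns True once the loader is free, False if we gave up.
    """
    for _ in range(attempts):
        if BUSY_CODE not in read_log():
            return True  # no busy error: safe to submit the next file
        sleep(delay)     # loader still busy: back off and re-check
    return False
```

In a real script, `read_log` would read the loader's .LOG file from disk, and a successful return would trigger submitting the next input file.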

  • Oracle Data Loader

    Hi guys!
    I'm planning to import a file with about 400k records with Data Loader (insert function).
    I did this operation with web services and it took about 7 hours; with web services I import about 20k records at a time.
    Does anyone know whether the time will improve if I use Data Loader?
    Another question: do you know how Data Loader imports a file (does it split the records, how many records at a time, parallel import, etc.)?
    Thanks in advance,
    Rafael Feldberg

    Rafael, I would recommend clicking on the Training and Support link in the upper right of your CRM On Demand application, then click on Browse Training, then click on Training Resources by Job Role, then click on Administrator and look for the following:
    Data Loader FAQ
    Data Loader Overview for R17
    Data Loader User Guide
    If you are successful using web services I would stick with that method.

  • Data loading on master data

    Hello Guys,
    I am wondering what the starting point would be for loading master data into SAP from the legacy system. I got the latest dump from the legacy system, which has plenty of info in it, and I also got the SAP data sheet from the migration team. I am wondering whether I should just send the SAP data sheet to the business to fill in, but then again they don't know anything about SAP fields. So I thought to let the users first identify the equipment that needs to be treated as a functional location (floc) in SAP and put it in a separate file, confirming the floc level as well. Is this the right way, or is there another standard procedure for data loading? I agree this varies from business to business, but if someone could describe the exact approach, that would be great.
    Mahee

    a simple method (but you can standardize it) would be the <a href="http://help.sap.com/erp2005_ehp_04/helpdata/En/70/93a417ecf411d296400000e82debf7/frameset.htm">excel upload</a>
    A.
    Edited by: Andreas Mann on Apr 9, 2010 10:08 AM

  • Data load into SAP ECC from Non SAP system

    Hi Experts,
    I am very new to BODS and I want to load historical data from a non-SAP source system into SAP R/3 tables like VBAK and VBAP using BODS. Can you please provide steps/documents or guidelines on how to achieve this?
    Regards,
    Monil

    Hi
    In order to load into SAP you have the following options:
    1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
    2. Use LSMW programs to load into the SAP target. These programs will require input files in specific layouts, generated using BODS.
    3. Direct input. The direct input method is to write ABAP programs targeting specific tables. This approach is very complex, and hence a lot of thought needs to be applied.
    The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.
    However, the data load into SAP needs to be object-specific. Targeting merely the sales tables will not help, as the sales document data held in the VBAK and VBAP tables you mentioned relates to articles; these tables hold sales document data for already-created articles. So if you want to specifically target these tables, you may need to prepare an LSMW program for the purpose.
    To answer your question on whether it is possible to load objects like materials, customers, vendors, etc. using BODS: yes, you can.
    Below is a standard list of IDocs that you can use for this purpose to load into SAP ECC system from a non SAP system.
    Customer Master - DEBMAS
    Article Master - ARTMAS
    Material Master - MATMAS
    Vendor Master - CREMAS
    Purchase Info Records (PIR) - INFREC
    The list is endless.
    In order to achieve this, you will need the functional design consultants to provide the ETL mapping from the legacy data to the IDoc target schema and fields (better to have the technical table names and fields too). You should then prepare the data by putting it through the standard check table validations for each object, along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat file output for load into SAP using LSMW programs, or generate IDoc messages to the target SAP system.
    If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs, and define the IDocs you need as inbound IDocs. There are a few more settings, like RFC connectivity, authorizations, etc., needed for BODS to successfully send IDocs into the SAP target.
    Do let me know if you need more info on any specific queries or issues you may encounter.
    kind regards
    Raghu
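    The mapping-plus-check-table-validation step described above can be sketched generically. The field names, IDoc segment fields, and allowed values below are hypothetical stand-ins for the real ETL mapping the functional consultants would supply:

```python
# Hypothetical legacy-field -> IDoc-segment-field mapping (not a real DEBMAS layout)
FIELD_MAP = {"cust_no": "KUNNR", "cust_name": "NAME1", "country": "LAND1"}
# Hypothetical check table: allowed country codes for the LAND1 field
CHECK_TABLES = {"LAND1": {"DE", "US", "GB"}}

def map_record(legacy_row):
    """Map one legacy row to IDoc field names and validate against check tables.

    Returns (mapped_dict, list_of_errors); a row with errors should not be sent.
    """
    mapped, errors = {}, []
    for src, target in FIELD_MAP.items():
        value = (legacy_row.get(src) or "").strip()
        allowed = CHECK_TABLES.get(target)
        if allowed is not None and value not in allowed:
            errors.append(f"{target}: value {value!r} not in check table")
        mapped[target] = value
    return mapped, errors
```

Clean rows would then be rendered either as an LSMW flat file or as IDoc messages, as the answer describes.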

  • Data Load Speed

    Hi all.
    We are starting the implementation of SAP at the company I work for, and I am designated to prepare the data load from the legacy systems. I have already asked our consultants about data load speed, but they didn't really answer what I need.
    Does anyone have statistics on data load speed (records per hour) using tools like LSMW, CATT, eCATT, etc.?
    I know that the speed depends on what data I'm loading and also on the CPU speed, but any information is helpful.
    Thank you and best regards.

    hi friedel,
    Again, here are the complete details regarding data transfer techniques.
    <b>Call Transaction:</b>
    1. Synchronous processing
    2. Synchronous and asynchronous database updates
    3. Transfers data for an individual transaction each time the CALL TRANSACTION statement is executed
    4. No batch input log is generated
    5. No automatic error handling
    <b>Session Method:</b>
    1. Asynchronous processing
    2. Synchronous database updates
    3. Transfers data for multiple transactions
    4. A batch input log is generated
    5. Automatic error handling
    6. SAP's standard approach
    <b>Direct Input Method:</b>
    1. Best suited for transferring large amounts of data
    2. No screens are processed
    3. The database is updated directly using standard function modules, e.g. see the program RFBIBL00
    <b>LSMW:</b>
    1. A code-free tool that helps you transfer data into SAP
    2. Suited for one-time transfers only
    <b>CALL DIALOG:</b>
    This approach is outdated; you should choose one of the above techniques.
    Also check the knowledge pool for more reference
    http://help.sap.com
    Cheers,
    Abdul Hakim
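    There is no universal records-per-hour figure, but once you have timed a small sample load with your chosen technique, you can extrapolate. A trivial arithmetic sketch (the function and parameter names are illustrative):

```python
def estimate_hours(total_records, sample_records, sample_seconds):
    """Extrapolate total load time from a timed sample load.

    Assumes throughput stays roughly constant, which is only an
    approximation: commit behavior and server load can change the rate.
    """
    if sample_records <= 0 or sample_seconds <= 0:
        raise ValueError("sample must contain records and take measurable time")
    rate_per_second = sample_records / sample_seconds
    return total_records / rate_per_second / 3600.0
```

For example, if a 1,000-record sample took 90 seconds, 500,000 records extrapolate to about 12.5 hours.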

  • What is the shortcut key for checking a check box in Data Loader?

    When I am transferring data from a legacy system to the Oracle Apps DB using the Data Loader tool, I am stuck with an issue: I need to enable a check box in a particular form using Data Loader. Is there a shortcut key for this, like Enter (ENT) or Save and Proceed (*SP)? Needed badly.
    Thank you ..

    You can enable or disable a check box by using the space bar: navigate to the check box item using Tab, then use the space bar (*SB).
    try this out
    Regards
    Ramesh Kumar S

  • Legacy FA/AUC Load

    Dear Experts,
    We have set our transfer date as 31.12.2010 and are uploading assets via AS91. This works fine. The issue is with the legacy load of AUC. Since PS is in use, all AUCs are created via settlement from WBS (CJ88). The cutover plan states loading cost onto the WBS via journal and then running a settlement to create the AUC master data/values. Since we are going live in Feb 2011, our take-on values are as follows:
    Fixed Assets
    Transfer date 31.12.2010
    *** ACQ value - As on 31.12.2010
    Accum Dep value -As on 31.12.2010
    Since reporting is done quarterly, depreciation would be run in March for Jan/Feb/Mar, so we are not posting any values for ordinary depreciation.
    Movements for the months of Jan/Feb:
    Additions: Since all assets are created via the WBS/AUC/FA route, we will capture additions for Jan/Feb via postings to the WBS (settlement run quarterly in March).
    Transfers/retirements: We have advised the business not to do any transfers/retirements for Jan/Feb.
    Now, if I want to load legacy AUC data, I would need to post accounting documents with date 01.01.2011 and carry out a settlement run in period 1 of FY 2011. (This will lead to more complications, as I would then need to identify which AUCs are legacy load and which are part of the additions for Jan/Feb.)
    Has anybody come across this situation? Please suggest the best way to handle it. I have seen a post where the transfer date was flipped to accommodate the AUC posting.
    Looking for some constructive help!
    Thanks
    Sanjeev

    This issue has now been resolved.
    The approach depends on whether you are converting your data at the end of the fiscal year or in the middle of it. If it is in the middle of the fiscal year, it can be done by loading cost onto the WBS via journal entry and then running the settlement with the relevant transaction types.
    If the transfer date is the last day of the fiscal year, the above-mentioned approach does not work. I have set my transfer date to 31.12.2010 and will load the balances on the AUC assets directly via LSMW (AS92). Further settlement to final assets is also possible if you are using PS. This way we also avoid any manual adjustments to eliminate double entries (contra entry for the load from GL balances).
    Thank you all for your valuable inputs.
    Sanjeev

  • Flat File Data Load Error

    I have a text file (with fields comma-separated) coming from the legacy system, which is to be loaded into BW.
    Values in the fields sometimes contain a comma, which disturbs the file structure and makes the data load fail. Example:
    ColA     ColB     ColC
    10     A     $100
    10     B, Inc     $50
    In the above case, record 2 comes in as:
    ColA     ColB     ColC     ColD
    10     B     Inc     $50
    I am having this problem with 3 fields. Any suggestions on how to fix this issue other than cleaning the file manually?
    Thank you,
    sam

    Option 1: Try to use a different separator, like ";".
    Option 2: For these three records, note where the extra comma is, delete the comma from the records, load the data into the PSA only, re-insert the comma into these records, and finally update the data targets from the PSA.
    Hope this helps.
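    A third option, if you can influence the legacy extract: export with proper quoting around fields that contain the separator, which standard CSV parsers handle. A sketch contrasting naive splitting (which breaks "B, Inc" apart) with quote-aware parsing:

```python
import csv
import io

def naive_split(line):
    """The broken approach: splitting on every comma splits 'B, Inc' in two."""
    return line.split(",")

def parse_quoted(text):
    """Quote-aware parsing: commas inside quoted fields stay in the field."""
    return list(csv.reader(io.StringIO(text)))
```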
